What I said is correct.
If you have a panel or a generator putting out, say, 28 volts at 100 W, you can use a DC-DC (buck) converter to step that down to 14 V for charging. The converter turns the higher-voltage input into a lower voltage at roughly double the amps (less losses) — the wattage stays about the same, around 100 W.
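The arithmetic behind that can be sketched in a few lines. This is a rough power-balance model, not a converter design; the 95% efficiency figure is an assumption for illustration.

```python
# Step-down (buck) DC-DC conversion: power in ~= power out,
# so halving the voltage roughly doubles the current (minus losses).

def buck_output_current(v_in, i_in, v_out, efficiency=0.95):
    """Estimate output current of a buck converter.

    efficiency is an assumed figure; real converters vary.
    """
    p_in = v_in * i_in            # input power in watts
    p_out = p_in * efficiency     # output power after losses
    return p_out / v_out          # I = P / V at the output

# 28 V panel at 100 W (about 3.57 A) stepped down to 14 V:
i_out = buck_output_current(28, 100 / 28, 14)
print(round(i_out, 2))  # ~6.79 A, nearly double the input current
```

Same watts in and out (minus losses); only the volts/amps split changes.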
Solar inverters work the same way. I was running the one in the vid at just under 500 V. The unit itself will take up to 600 V and output at a grid voltage of 200-260 V, whatever the line voltage it is connected to is running at.
The inverter takes that 400 V at, say, 10 A and converts it down to 240 V at, say, 16 A (depending on efficiency and losses) and feeds it back to the grid.
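Those figures check out as a power balance. The 96% efficiency below is just the number implied by 4000 W in and 240 V x 16 A out; it is an illustrative assumption, not a spec for any particular inverter.

```python
# Grid-tie inverter as a power converter: 400 V DC in, 240 V AC out.
p_in = 400 * 10              # 4000 W coming from the panels
efficiency = 0.96            # assumed figure, implied by the example numbers
i_out = p_in * efficiency / 240  # output current at grid voltage
print(round(i_out, 1))       # 16.0 A fed back into the grid
```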
HT power lines don't run at 240 V or even 440 V; they run at tens of thousands of volts or higher to reduce the current, and then that voltage is stepped down at the substations to lower volts and higher amps suitable for the end user.
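The reason for the high voltage is that resistive loss in the wires goes with the square of the current. A quick sketch, using made-up figures (1 MW delivered, 5 ohms of line resistance) purely for illustration:

```python
# Why HT lines run at tens of kV: for a fixed power, current falls as
# voltage rises, and resistive line loss falls with the square of current.

def line_loss_watts(power_w, volts, line_resistance_ohms):
    current = power_w / volts                    # I = P / V
    return current ** 2 * line_resistance_ohms   # P_loss = I^2 * R

p = 1_000_000  # 1 MW delivered (hypothetical)
r = 5          # ohms of line resistance (assumed)

print(line_loss_watts(p, 240, r))      # enormous loss at 240 V
print(line_loss_watts(p, 110_000, r))  # a few hundred watts at 110 kV
```

Raising the voltage by a factor of ~460 cuts the loss by that factor squared, which is the whole point of high-tension transmission.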
There are also buck and boost circuits published everywhere that let you take a lower voltage and convert it to a higher one, but at lower amps than the input.
It is not about voltage or amps, it's about the wattage: 1000 V @ 1 A can become 500 V @ 2 A or 2 V @ 500 A. Same thing. You can slide the scale and expand it any way you want, as long as your volts times amps does not exceed 1000 W, or whatever wattage you started with.
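The "slide the scale" idea is just one division. A minimal sketch reproducing the figures above:

```python
# Any volts/amps pair with the same product is the same power.
def amps_for(watts, volts):
    """Current needed to carry `watts` at a given voltage."""
    return watts / volts

for v in (1000, 500, 2):
    print(f"{v} V @ {amps_for(1000, v)} A = 1000 W")
# 1000 V @ 1.0 A, 500 V @ 2.0 A, 2 V @ 500.0 A -- all the same 1000 W
```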
I won't even try to overwhelm you with multiphase AC to single-phase DC increases, C2C conversions, or how they work. Best you come to grips with the simple stuff first.
If you are not aware of these very fundamental things, it is clearly you who needs to read a LOT about basic electricity principles before you make yourself look foolish.
wlb said:
My apologies.
Something very odd is going on here. Are you sure your forum account hasn't been hacked? Because if you've built all those things, and you understand how they work and you weren't just following a set of assembly instructions, then your previous post was written by somebody else.
Posted Saturday 29 Aug 2015 @ 2:20:19 am from IP