# Efficiency



## Ziggythewiz (May 16, 2010)

Higher voltage is more efficient because heat losses go up with the square of the current.

Higher voltage also means higher speed, so it's better.


----------



## major (Apr 4, 2008)

Dink said:


> This is probably a stupid question but here it goes. If you have two battery packs, one 24V 30Ah and one 36V 20Ah, they both have 720 watt-hours, yes? If so, what is more efficient, a 500W @ 36V or a 500W @ 24V motor with the same watt-hour pack?


They will be exactly the same efficiency assuming each motor is wound for its respective voltage.

In the example you give:

The 36V motor draws 18.3A which is 658.8W. A 500W output means 500W/658.8W = 75.9% efficient.

The 24V motor draws 27.4A which is 657.6W. 500/657.6 = 76.0%.

Within the margin of error, equal.
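The arithmetic above can be checked in a few lines (a quick sketch; the 500W figure is taken at face value as the mechanical output of both motors):

```python
# Both packs store the same energy: volts * amp-hours = watt-hours.
assert 36 * 20 == 24 * 30 == 720  # both 720 Wh

P_OUT = 500.0  # W, assumed mechanical output of both motors

# (voltage, measured current draw) for each motor
for volts, amps in [(36, 18.3), (24, 27.4)]:
    p_in = volts * amps        # electrical input power
    eff = P_OUT / p_in         # efficiency = output / input
    print(f"{volts}V motor: {p_in:.1f} W in -> {eff:.1%} efficient")
# 36V: 658.8 W in -> 75.9%; 24V: 657.6 W in -> 76.0%
```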


----------



## Red Neck (Feb 1, 2013)

Higher voltage means lower losses, as already written, and yes, more speed. Motor winding is obviously a factor, but for speed you need voltage.

Think of it this way: cost restrictions mean many are using low-voltage systems (let's say up to 144V), but ideally you want at least 190V, and 200 to 300V is great. You will also see some systems at 600V or so, because some use heavy, non-automotive industrial motors since they are cheaper. Industrial motors tend to be high voltage. They are also bulky and heavy.

You see less high voltage in DIY cars since high-voltage components are expensive, especially when talking about induction (AC) motors, which are pretty much ideal for cars. Making a controller for 200-300V costs about three times as much as one for 96V, thereabouts. Batteries then also cost a lot. But 400 amps at 300V do a whole lot more than 600 amps at 96V.
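That last comparison is just raw electrical power, P = V * I:

```python
# P = V * I: the higher-voltage example moves far more power.
print(300 * 400 / 1000, "kW")  # 300V at 400A -> 120.0 kW
print(96 * 600 / 1000, "kW")   # 96V at 600A  -> 57.6 kW
```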

For someone not too cost-constrained, it is best to go for high voltage (200V is OK), an induction motor, and a lithium battery. It is also the most cost-effective option in the long run. Since it is costly, many do not go that route initially, but most will end up there in the end.


----------



## Siwastaja (Aug 1, 2012)

For lower voltage, you use thicker wire. Therefore, in theory, the efficiency is the same.

However, in practice a higher-voltage system typically has a bit better efficiency. There is an optimal "band", but it is actually quite wide. Inside that band, the differences in efficiency are very small.

For example, a 30 kW system could be built anywhere between 100-500V quite well, but if you tried to do it at 20V, you would run into trouble and low efficiency.


----------



## major (Apr 4, 2008)

Red Neck said:


> Higher voltage means lower losses, as already written, and yes, more speed. Motor winding is obviously a factor, but for speed you need voltage.
> 
> Think of it this way: cost restrictions mean many are using low-voltage systems (let's say up to 144V), but ideally you want at least 190V, and 200 to 300V is great. You will also see some systems at 600V or so, because some use heavy, non-automotive industrial motors since they are cheaper. Industrial motors tend to be high voltage. They are also bulky and heavy.
> 
> ...


Dink inquired about a specific case involving two motors and the math indicates efficiency is equal between the two, contrary to the mistaken concepts expressed by Red Neck and Ziggythewiz. 



Ziggythewiz said:


> Higher voltage is more efficient because heat losses go up with the square of the current.
> 
> Higher voltage also means higher speed, so it's better.


This logic applies only if the motor were the same in both cases. But once the motor is allowed to be designed for the intended voltage, the argument no longer holds. It is not current alone making the heat; it is the interaction of current and resistance. The lower-voltage motor will have fewer turns in the coils, a larger wire size, and a shorter wire length, making its resistance lower than the higher-voltage motor's by the square of the voltage ratio, so the I²R losses end up the same, as does the RPM.

The best example of this is the common NEMA induction motor for industry. Most of those below 100hp come with dual-voltage windings: upon installation they can be configured for 230V or 460V. The 230V connection draws twice the phase current of the 460V connection, but both configurations yield equal torque, RPM, efficiency, heat, and ratings.

The same is true for batteries. Higher voltage battery packs are not inherently higher power or more efficient. 

The misconception that higher voltage means higher power and/or higher efficiency is commonplace. The constraints of the application and the pocketbook dictate the choice of system voltage more than the physics does.


----------



## major (Apr 4, 2008)

Siwastaja said:


> For example, a 30 kW system could be built between 100-500V quite well, but if you tried to do it at 20V, you would run to troubles and low efficiency.


By the same token, you may encounter difficulty at 100,000 volts. It's all relative. Making a 30kW motor at 20V is really pressing practicality; how about a 300kW 2-volt motor? Things get hairy because you can't wind the thing with 1/27th of a turn per coil, can you? But there are homopolar machines, superconductors and liquid brushes. We just live in our comfort zone.


----------



## Siwastaja (Aug 1, 2012)

To sum the math up:

Losses go as I^2 * R, so for the same R, losses go up 4x when halving the voltage and doubling the current.

But at the same time, when designing a motor, the number of parallel wires goes up linearly: halving the voltage doubles the number of wires, halving the resistance. Furthermore, the length of those wires goes down linearly as well, again halving the resistance. Combining the two, resistance goes down with the square, so when halving the voltage, resistance is 1/4.

(2I)^2 * (R/4) = 4I^2 * R/4 = back to I^2 * R. So nothing really changes when designing a motor for lower voltage.
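That cancellation can be sketched numerically; the baseline winding values below are made up purely for illustration:

```python
# Toy model: redesign a winding for half the voltage and check that
# the copper (I^2 * R) loss is unchanged. Values are illustrative.
V1, I1, R1 = 100.0, 50.0, 0.04   # hypothetical original winding

V2 = V1 / 2        # half the design voltage
I2 = I1 * 2        # current doubles for the same power
R2 = R1 / 4        # twice the parallel wires AND half the length:
                   # resistance falls with the square of the voltage ratio

loss1 = I1 ** 2 * R1    # 2500 * 0.04 = 100 W
loss2 = I2 ** 2 * R2    # 10000 * 0.01 = 100 W
assert abs(loss1 - loss2) < 1e-9
print(loss1, loss2)     # identical copper losses
```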

But, there are some copper savings in cables between battery pack, controller and motor.

In practice, there may be a small difference in favor of high voltage, or no difference at all. All components are individual, and it is quite possible that a certain lower-voltage system is more efficient than a poor high-voltage system. Of course, many high-voltage installations are modern and use the best technology available, and therefore have a bit better efficiency, but not because of the voltage.

It is indeed a common myth that a high-voltage system per se has better efficiency. Of course, running any motor outside its recommended parameters will almost always hurt efficiency, but that's a different matter.


----------



## Red Neck (Feb 1, 2013)

This all appears valid, and I am not technically versed enough to dispute it in any way, but you will not find a high-speed road-going vehicle with low voltage, and vehicle manufacturers all use higher voltages, even though low-voltage components are cheaper by large factors and manufacturers like cheaper if they can have it. Not only that: even for the 12V system in ICE vehicles, there is a push to raise it to 48V.

Why would this be so, then?


----------



## Ziggythewiz (May 16, 2010)

major said:


> Dink inquired about a specific case involving two motors and the math indicates efficiency is equal between the two, contrary to the mistaken concepts expressed by Red Neck and Ziggythewiz.


He didn't even say they were separate motors, and I'm pretty sure the 500W was more of a motor rating than a measured value that you could go throw in an equation.


----------



## Siwastaja (Aug 1, 2012)

Red Neck said:


> This all appears valid, and I am not technically versed enough to dispute it in any way, but you will not find a high-speed road-going vehicle with low voltage, and vehicle manufacturers all use higher voltages, even though low-voltage components are cheaper by large factors and manufacturers like cheaper if they can have it. Not only that: even for the 12V system in ICE vehicles, there is a push to raise it to 48V.
> 
> Why would this be so, then?


Higher voltage means lower current, and that means copper savings in wiring (not including motor windings), smaller contacts, etc. Throwing in more plastic in insulation is easier and cheaper than using thick copper and sturdy connections.
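The cable saving is easy to put numbers on (a sketch; the 30 kW load and 1 milliohm cable resistance are illustrative):

```python
# Same 30 kW load through the same 1-milliohm cable at different voltages.
R_CABLE = 0.001  # ohms, illustrative cable resistance

for volts in (96, 192, 384):
    amps = 30_000 / volts          # current needed for 30 kW
    loss = amps ** 2 * R_CABLE     # I^2R dissipated in the cable
    print(f"{volts}V: {amps:.1f} A -> {loss:.1f} W lost in the cable")
# Each doubling of voltage quarters the cable loss, i.e. the same
# loss budget is met with roughly a quarter of the copper.
```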

Also, for really high power (say over 50 kW), the "comfort zone" or optimal "band" I was talking about earlier eventually shifts above the 100-120V that is considered a "safe" DC voltage in usual legislation. If you have to go to, say, 150-200V, you can then go to 300-400V as well.

It is a misconception that cheap low-voltage components would be available at these power levels. Sure there are cheap low-voltage components, but they are for below 48V.

So yes, a higher-voltage system is usually somewhat better. But automatically more efficient? Not so, unless we go to extremes.

On the high-voltage side, going too high can again lower efficiency, because added insulation takes room from copper and slows down cooling in the motor. It also makes most things very nasty and difficult to design safely.

All of this leads to the fact that a typical commuter car with some performance reserve (maybe 50 kW peak) usually runs between 100V and 600V. It does not make much difference which voltage you choose, if the whole system has been correctly designed for the voltage selected. On the extreme side, it could be done at 50V or 1500V, but both would be difficult and expensive to do right, though in very different ways.


----------



## Red Neck (Feb 1, 2013)

High-voltage IGBTs etc. are more expensive, as far as I know? A controller for up to 96 volts can usually be had for 1200 USD, whereas the kind for 200-300V goes for some 3000 USD, if it is an actual functional new unit and not a liquidation piece.

As far as I understand, components are much more expensive for high voltage
inverters.


----------



## Siwastaja (Aug 1, 2012)

Red Neck said:


> High-voltage IGBTs etc. are more expensive, as far as I know? A controller for up to 96 volts can usually be had for 1200 USD, whereas the kind for 200-300V goes for some 3000 USD, if it is an actual functional new unit and not a liquidation piece.
> 
> ...


AFAIK, there is not much difference in price between lower-voltage and higher-voltage power MOSFET and IGBT devices at the same power. Of course, you need to compare a 300V 600A device to a 600V 300A or a 1200V 150A device.

Component cost is only a small part of the controller cost. Some higher-voltage controllers (and AC controllers) bear an extra "deluxe" price tag, for non-technical reasons.

The DIY market is still very small. A much more steady market situation is needed before the prices can really reflect the component and assembly cost directly.


----------



## Arlo (Dec 27, 2009)

You can find really good MOSFETs up to 200V; then 200-300V is kind of a dead range where MOSFETs have low current ratings and high Rds(on).
Above 300V, IGBTs are better than MOSFETs, but their loss comes from a roughly fixed voltage drop rather than a resistance like a MOSFET's, so you want to run the highest voltage and lowest amperage you can with them.

Your switching components do cost more as you look for higher power levels, but it's the little things that make the cost go up drastically. Try pricing out some good caps at 100V, then at 200V, then at 500V! There are other design requirements that need care as you get to high voltages. So far in my experiments I am trying to stay well below 200V.
I play a lot with PMAC (BLDC) motors and I'm really looking at messing with some induction motors.

One other note: as major pointed out, when you want high RPM with low voltage on a PMAC motor, you will have very few turns, and that causes very low motor inductance. Inductance on a PMAC goes up with the square of the turns, so this might be why the OEMs run the voltages they do. I will play with this in the induction-motor world soon, and I bet this is why we need a decent amount of voltage there as well.
The reason I mention this is that low inductance brings all kinds of control issues, including fast current rise, increased torque ripple, and more losses as a result.


----------



## Siwastaja (Aug 1, 2012)

Arlo said:


> You can find really good MOSFETs up to 200V; then 200-300V is kind of a dead range where MOSFETs have low current ratings and high Rds(on).
> Above 300V, IGBTs are better than MOSFETs, but their loss comes from a roughly fixed voltage drop rather than a resistance like a MOSFET's, so you want to run the highest voltage and lowest amperage you can with them.


This is probably why there are 100V controllers and then higher-voltage controllers that go to about 400V, but nothing in between. And this is why using a 100V battery pack with a controller capable of 400V, as many DIYers do, is not the best choice: there is a voltage drop of about 2V (DC) or 4V (AC) in IGBT controllers, so at 100V that would be 2-4% conduction losses, but at 400V only 0.5-1%.
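Those percentages follow directly from the fixed drop (a sketch using the round ~2 V DC figure quoted above):

```python
# A roughly constant IGBT on-state drop costs a fixed number of volts,
# so its relative cost shrinks as the bus voltage rises.
V_DROP = 2.0  # V, approximate DC on-state drop (round figure)

for bus in (100, 400):
    print(f"{bus}V pack: {V_DROP / bus:.1%} conduction loss")
# 100V pack: 2.0%; 400V pack: 0.5%
```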


----------

