# amperage vs voltage



## major (Apr 4, 2008)

bigdawg7299 said:


> how do amperage and voltage affect range and top speed?


I think it is explained here: http://www.diyelectriccar.com/forums/showthread.php?t=669&redir_from=668


----------



## Sunking (Aug 10, 2009)

You are really asking the wrong question with respect to range, or better said, run time. What is important with run time is the time element. Electrical energy is measured in watt hours. Notice that the term watt hours has a time element; volts and amps do not, they are instantaneous measurements.

OK, so a battery's time element is Amp Hours. Now voltage comes into the picture, because with it we can determine the watt hours. So Watt Hours = Voltage x Amp Hours.

So here are a couple of examples:

50 volts x 50 Amp Hours = 2500 WH
100 volts x 25 Amp Hours = 2500 WH

Which is better?

Well, in theory they are equal. However, the 100 volt pack is better because it uses less current, and less current means smaller, lighter wiring and less voltage drop.


In general, the higher the voltage the faster the vehicle, as speed in a DC motor is proportional to the voltage. Current in a DC motor is proportional to the torque.
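The watt-hour arithmetic above can be checked with a few lines of Python (an illustrative sketch; the function name is mine, the figures are from the examples):

```python
# Pack energy in watt hours: Wh = V x Ah
def watt_hours(volts, amp_hours):
    return volts * amp_hours

pack_a = watt_hours(50, 50)    # 50 V, 50 Ah pack
pack_b = watt_hours(100, 25)   # 100 V, 25 Ah pack

print(pack_a, pack_b)  # both 2500 Wh -- equal energy, but pack_b carries half the current
```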


----------



## vpoppv (Jul 27, 2009)

I like the water hose analogy more than the skier, but there is one thing that I just can't wrap my mind around: does a motor "care" whether it gets more amps or volts? Since, in the end, the same amount of power is required to move a car, whether it's 12v*100amps or 120v*10 amps. Is the amount of heat created the same for both? Given the same motor rpm (chosen with gears in the transmission) does a motor "benefit" from higher amps and lower voltage, lower amps and higher voltage, or doesn't matter? Does the answer change depending on the motor?


----------



## Sunking (Aug 10, 2009)

vpoppv said:


> I like the water hose analogy more than the skier, but there is one thing that I just can't wrap my mind around: does a motor "care" whether it gets more amps or volts? Since, in the end, the same amount of power is required to move a car, whether it's 12v*100amps or 120v*10 amps. Is the amount of heat created the same for both? Given the same motor rpm (chosen with gears in the transmission) does a motor "benefit" from higher amps and lower voltage, lower amps and higher voltage, or doesn't matter? Does the answer change depending on the motor?


Both voltage and current matter once you take power into consideration. If a motor is rated at, say, 50 kW, run the numbers. At 12 volts it would take 4,166 amps. No matter how large the wires are, at 12 volts you would have huge I^2R losses. If it were a 500 volt motor, it would draw 100 amps, and a 4/0 cable could be used with very little loss to speak of.
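Running those numbers in Python makes the I^2R point concrete (a sketch; the 1 milliohm cable resistance is a hypothetical figure I chose for illustration, not from the post):

```python
# Current needed to deliver a given power at a given voltage: I = P / V
def current_for_power(power_w, volts):
    return power_w / volts

# Resistive loss in the cabling: P_loss = I^2 * R
def i2r_loss(amps, resistance_ohms):
    return amps ** 2 * resistance_ohms

POWER = 50_000  # 50 kW motor

i_12v = current_for_power(POWER, 12)    # ~4,167 A
i_500v = current_for_power(POWER, 500)  # 100 A

R = 0.001  # assume the same 1 milliohm cable run in both cases (hypothetical)
print(i2r_loss(i_12v, R))   # ~17,000 W wasted -- over a third of the motor power
print(i2r_loss(i_500v, R))  # 10 W wasted
```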


----------



## major (Apr 4, 2008)

vpoppv said:


> I like the water hose analogy more than the skier, but there is one thing that I just can't wrap my mind around: does a motor "care" whether it gets more amps or volts? Since, in the end, the same amount of power is required to move a car, whether it's 12v*100amps or 120v*10 amps. Is the amount of heat created the same for both? Given the same motor rpm (chosen with gears in the transmission) does a motor "benefit" from higher amps and lower voltage, lower amps and higher voltage, or doesn't matter? Does the answer change depending on the motor?


Hi vp,

Let me try this one by example. Let's use 20 kW. The motor is designed for 20 kW at 3000 RPM, 100V, 200A. Let's say that the vehicle and driving conditions in 2nd gear require 3000 RPM and 20 kW for 50 mph. Your controller would provide 100V to the motor and it would draw 200A (motor current). So the motor will be operating as it was designed and be happy (within its rated temperatures).

Now shift the vehicle into 4th for those same conditions and you halve the gear ratio. So now the motor needs to turn at 1500 RPM to get the 50 mph. The vehicle still needs the 20 kW, so the controller now puts out 50V to the motor (to get half RPM) but the motor will draw 400A (motor current). The battery current will remain the same for both cases. That 20 kW motor is at 20 kW, but it is no longer at rated RPM or voltage and most importantly, not at rated current. It is at 400A, or twice design (or rated) current. And that motor is not going to be happy about it. It wouldn't take very long to overheat.

Same vehicle, same conditions, same speed (mph), same battery power, same battery current, only the wrong gear selected, and you fry the motor. That is why I like to see 2 ammeters on the dash: battery current for monitoring your discharge, and motor current so you don't end up overloading the motor.

Now you can select an incorrect gear in the opposite way and the result can be motor overspeed. So a tachometer is advisable.

Now I simplified some figures for example purpose. The actual ratios aren't so clean. But hopefully you get the drift. Remember, the motor current is almost always greater than battery current. Often much greater. And motor current is the real indication of motor load, and motor overload.
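The gear example above boils down to one division (a sketch of the arithmetic only; the real ratios aren't this clean, as noted):

```python
# Motor current for a given load power at a given motor voltage: I = P / V
def motor_current(power_w, motor_volts):
    return power_w / motor_volts

POWER = 20_000  # the vehicle needs 20 kW in both gears

# 2nd gear: controller puts rated 100 V on the motor
print(motor_current(POWER, 100))  # 200 A -- rated current, motor is happy

# 4th gear: controller drops to 50 V for half the RPM
print(motor_current(POWER, 50))   # 400 A -- twice rated current, motor overheats
```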

major


----------



## Sunking (Aug 10, 2009)

major said:


> Same vehicle, same conditions, same speed (mph), same battery power, same battery current, only the wrong gear selected, and you fry the motor. That is why I like to see 2 ammeters on the dash: battery current for monitoring your discharge, and motor current so you don't end up overloading the motor.


Major is that possible? Basically it is a series circuit right? How can the battery and motor current be that far out of balance?


----------



## major (Apr 4, 2008)

Sunking said:


> Major is that possible? Basically it is a series circuit right? How can the battery and motor current be that far out of balance?


The motor controller is basically a buck converter. There is a diode across the motor called a freewheeling diode. The motor itself provides the inductance found in the classic buck converter output leg. So the motor current is the sum of the switch current and the diode current. The motor current is always higher than battery current except at 0% duty cycle (off) or 100% duty cycle (full on). Then motor current equals battery current (diode current equals zero).
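The idealized current bookkeeping for that buck topology can be sketched like this (losses ignored; the function is my own illustration of the relationships described, not a controller model):

```python
# Idealized DC motor controller (buck converter), per the description above:
# motor voltage averages to Ubattery * D, battery supplies Imotor * D through
# the switch, and the freewheeling diode carries the remainder of motor current.
def controller_currents(battery_volts, duty_cycle, motor_amps):
    motor_volts = battery_volts * duty_cycle   # average voltage seen by the motor
    battery_amps = motor_amps * duty_cycle     # switch (battery-side) current
    diode_amps = motor_amps - battery_amps     # freewheel diode current
    return motor_volts, battery_amps, diode_amps

# 200 V pack at 50% duty cycle, motor drawing 400 A
print(controller_currents(200, 0.5, 400))  # (100.0, 200.0, 200.0)

# At 100% duty cycle the diode current is zero and battery current equals motor current
print(controller_currents(200, 1.0, 400))  # (200.0, 400.0, 0.0)
```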

Do a google on DC motor controller (chopper) and/or buck converter. I don't have any bookmarks on this computer, but there are plenty of sites explaining this with diagrams and equations.

major


----------



## Qer (May 7, 2008)

Sunking said:


> Major is that possible? Basically it is a series circuit right? How can the battery and motor current be that far out of balance?


Since (almost) all power is converted from the battery to the motor, it means that Pmotor and Pbattery have to be the same (well, minus some losses), which gives:

Pmotor = Umotor * Imotor = Ubattery * Ibattery = Pbattery

The controller uses PWM to chop up the voltage to make, for example, a 200 Volt pack provide 100 Volt to the motor. The duty cycle (D) of the PWM is the fraction of time the transistors are on, and D can vary from 0 (fully off) to 1 (fully on), or anything in between (like 0.5, which is a 50% duty cycle, or a symmetric square wave). That gives:

Umotor = Ubattery * D

So then we can rewrite the power for the motor to:

Pmotor = Ubattery * D * Imotor

But since Pmotor has to be (close to) Pbattery, it also means that:

Pmotor = Ubattery * D * Imotor = Ubattery * Ibattery = Pbattery

which means that:

Ibattery = Imotor * D

I.e., Ibattery is always less than or equal to Imotor, and thus you can easily provide 1000 Amp motor current from, for example, a 150 Ah pack that doesn't like discharging harder than 3C, as long as D is below 0.45.

This got a bit messy, but I hope you could follow my rambling...
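The 150 Ah / 3C example checks out numerically (a quick sketch of the result Ibattery = Imotor * D; the helper name is mine):

```python
# Battery-side current from motor current and PWM duty cycle: Ibattery = Imotor * D
def battery_current(motor_amps, duty_cycle):
    return motor_amps * duty_cycle

PACK_AH = 150
MAX_C_RATE = 3
max_battery_amps = PACK_AH * MAX_C_RATE  # 450 A is the 3C limit

# 1000 A of motor current is fine as long as D keeps battery current under 450 A
print(battery_current(1000, 0.45))  # 450.0 A -- right at the 3C limit
print(battery_current(1000, 0.30))  # 300.0 A -- comfortably within 3C
```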


----------



## bigdawg7299 (Jun 8, 2010)

Wow! Not the answer I was looking for, but a great explanation (guess I should have asked the question better!). What I was trying to ask is: how do voltage and amperage affect the range/speed of an EV? Is it better to go with higher voltage/lower amperage, or lower voltage and higher amperage, and why? I am assuming that higher voltage will result in more torque, whereas higher amperage will give longer distance travelled. Is this right? I realize that this is an oversimplification and there are probably other factors to consider, but I am trying to "dumb" it down so I start off with the right basic concepts.


----------



## Sunking (Aug 10, 2009)

major said:


> The motor controller is basically a buck converter.


OK thanks. I know what a buck and boost transformer is, so that makes sense now that you enlightened me. Never occurred to me to use AC transformer principles with a DC motor. AC motors and controls I have a pretty good grip on, DC series wound are still a bit new to me.

Thanks again.


----------



## major (Apr 4, 2008)

bigdawg7299 said:


> What I was trying to ask is how do voltage and amperage affect the range/speed of an ev?


Hi dawg,

Did you look over the link I provided in post #2? I think it answers your question.

But what the H. The top speed of a vehicle depends on power. Of course you have to gear it correctly to match that power to that speed. And the range of the vehicle depends on the amount of energy stored. 

So speed relates to power and range relates to energy.

Of course voltage times current equals power. And power times time equals energy.

So your question is vague. The voltage/current relationship is actually dependent on your particular drive system. That would be the motor (and controller) and battery. 

In other words, it makes no difference if you have a 100 volt, 200 amp, 100 amp hour system or a 200 volt, 100 amp, 50 amp hour system or a 400 volt, 50 amp, 25 amp hour system. Each will give you 20 kilowatts of power and 10 kilowatt hours of energy. So each is capable of the same speed and same range for a given vehicle. However, each system would require a different motor and battery.
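The equivalence of those three systems is easy to verify (a sketch using the figures from the paragraph above):

```python
# Power in kilowatts: P = V * I / 1000
def power_kw(volts, amps):
    return volts * amps / 1000

# Energy in kilowatt hours: E = V * Ah / 1000
def energy_kwh(volts, amp_hours):
    return volts * amp_hours / 1000

systems = [(100, 200, 100), (200, 100, 50), (400, 50, 25)]  # (V, A, Ah)

for volts, amps, ah in systems:
    print(power_kw(volts, amps), energy_kwh(volts, ah))  # 20.0 kW and 10.0 kWh each time
```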

Regards,

major


----------

