# Interpreting motor graphs



## Arniem (Apr 24, 2008)

I have a couple of questions on how to read and interpret the torque/RPM/Amp graphs that vendors of electric motors (e.g. ADC, NetGain) provide. They typically present their graphs with Torque in ft/lbs along the X-axis, and combined Y-axes for RPM, Amps, HP, and % Efficiency. Example: http://www.go-ev.com/images/003_15_WarP_9_Graph.jpg

The graphs are also based on a test scenario in which the applied motor voltage is usually 72v. But what if I apply a 144v battery pack instead: does the law of Watts = Volts x Amps apply, such that if I double the voltage the amperage drawn will halve?

Also, if I require a certain amount of torque (ft/lbs) to be delivered at an RPM rate well below the maximum shown on the graph, will I still be drawing the same amount of current as shown on the graph for that amount of torque? Or is the amperage drawn only in proportion to the RPMs required? Example: For 30 ft/lb of torque the graph shows 250 amps drawn at 3750 RPM, but if the motor will only be spinning at 2000 RPM will the amps drawn be 250 amps x (2000 rpm/3750 rpm) = 133 amps? Or is amperage drawn dependent on the torque required, irrespective of the RPMs needed, and up to the maximum RPMs for which that torque can be delivered? I somehow can't believe that a motor spinning at, say, 100 RPM for a certain amount of torque will be drawing the same current as the same motor for the same torque spinning at 3000 RPM...


----------



## Dennis (Feb 25, 2008)

> Also, if I require a certain amount of torque (ft/lbs) to be delivered at an RPM rate well below the maximum shown on the graph, will I still be drawing the same amount of current as shown on the graph for that amount of torque? Or is the amperage drawn only in proportion to the RPMs required? Example: For 30 ft/lb of torque the graph shows 250 amps drawn at 3750 RPM, but if the motor will only be spinning at 2000 RPM will the amps drawn be 250 amps x (2000 rpm/3750 rpm) = 133 amps? Or is amperage drawn dependent on the torque required, irrespective of the RPMs needed, and up to the maximum RPMs for which that torque can be delivered? I somehow can't believe that a motor spinning at, say, 100 RPM for a certain amount of torque will be drawing the same current as the same motor for the same torque spinning at 3000 RPM...


What the graph is telling you is that when a heavier load is applied to the motor's shaft, the torque and amperage will increase but the RPMs will decrease. Likewise, if the load becomes lighter, the RPMs will increase but the torque and amperage drawn will decrease. So keep in mind that it does NOT imply that if you "REV" up to X RPM these conditions will occur. You have to realize that this is not a gasoline engine you're dealing with here.

With that said, it is possible to make a series wound motor produce constant torque from zero RPM up to some cutoff RPM. This can be accomplished with a constant torque controller when you floor the accelerator pedal. What a constant torque controller does is try to maintain constant current into the motor by raising the motor voltage faster than the motor can counteract it with the back emf voltage it produces in opposition. The net difference of these two voltages results in a current flow through the motor's electrical resistance (I=E/R) that the controller will try to maintain.

Eventually, though, the back emf of the motor catches up as the controller runs out of voltage to feed the motor to keep the current flow the same. So the current will begin to decline, and the torque falls off with it. The point at which this occurs is called the clamp point or cutoff point. Increasing the constant torque range is achieved by increasing the battery pack voltage.
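The behaviour described above can be sketched numerically for an idealized series motor. All the values here (pack voltage, motor resistance, back-emf constant, current target) are made-up illustrative numbers, not data for any real motor or controller:

```python
# Sketch of the constant-torque controller behaviour described above,
# for an idealized series DC motor. All numbers are illustrative
# assumptions, not measured WarP 9 or Zilla data.

V_PACK = 144.0     # battery pack voltage available to the controller
R_MOTOR = 0.03     # motor resistance in ohms (assumed)
K_E = 0.03         # back emf constant, volts per RPM (assumed)
I_TARGET = 250.0   # current the controller tries to hold, in amps

def motor_current(rpm):
    """Current the controller can deliver at a given RPM.

    Below the clamp point the controller modulates the voltage to hold
    I_TARGET; past it, the pack voltage minus back emf across the
    motor resistance is all the current you can get (I = E / R).
    """
    back_emf = K_E * rpm
    i_max = (V_PACK - back_emf) / R_MOTOR   # current at full pack voltage
    return min(I_TARGET, max(i_max, 0.0))

# The clamp (cutoff) point is where the voltage needed to hold the
# current, back_emf + I_TARGET * R_MOTOR, first exceeds the pack voltage:
clamp_rpm = (V_PACK - I_TARGET * R_MOTOR) / K_E

for rpm in (0, 2000, 4000, int(clamp_rpm), 5000):
    print(f"{rpm:5d} RPM -> {motor_current(rpm):6.1f} A")
```

With these made-up constants the current holds at 250 A from 0 RPM up to a clamp point of 4550 RPM, then falls away; raising V_PACK pushes the clamp point higher, which is exactly the "increase the pack voltage to extend the constant torque range" point above.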

I believe the Zilla motor controllers support constant torque, from what I have seen of dyno graphs of electric cars with this controller. I am not sure whether the Curtis controllers support it.


----------



## major (Apr 4, 2008)

Hello Arniem,

You ask "such that if I double the voltage the amperage drawn will halve?"

No. The motor current/torque relationship is independent of voltage. So for a given load (torque), the motor current is the same regardless of the motor voltage (give or take a few percent due to efficiency).

You ask "if I require a certain amount of torque (ft/lbs) to be delivered at an RPM rate well below the maximum shown on the graph, will I still be drawing the same amount of current as shown on the graph for that amount of torque?"

Yes, the same motor current. To get RPM below the curve, you need to apply a lower motor voltage. In doing this (i.e. motor voltage lower than battery voltage), the controller modulates and causes the battery current to be lower than the motor current.
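The modulation point above can be put in rough numbers. Assuming an idealized PWM controller where power in roughly equals power out (the 144v pack and 250 A load are illustrative values, not from the WarP 9 graph):

```python
# Sketch of the point above: when the controller modulates (PWM) to
# apply less than pack voltage, battery current is the motor current
# scaled by the duty cycle. Values are illustrative assumptions.

V_BATT = 144.0    # battery pack voltage (assumed)
I_MOTOR = 250.0   # current the load (torque) demands from the motor

def battery_current(v_motor):
    """Average battery current when the controller applies v_motor.

    Duty cycle = v_motor / V_BATT; with losses ignored, power in
    equals power out, so I_batt = duty * I_MOTOR.
    """
    duty = v_motor / V_BATT
    return duty * I_MOTOR

# Same 250 A in the motor either way, but at half voltage the
# pack only supplies 125 A:
print(battery_current(144.0))  # full voltage -> 250.0 A from the pack
print(battery_current(72.0))   # half voltage -> 125.0 A from the pack
```

So the motor sees the same 250 A the torque demands, while the battery ammeter reads less whenever the controller is modulating below pack voltage.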

Hope that helps.

major


----------



## Arniem (Apr 24, 2008)

Thanks Dennis and major - these explanations have helped me rethink the way I work with these graphs. I'm also digesting another thread on the EV Performance forum ("Horsepower") for clues.

I'm concluding that for any motor, the relationship between the torque it delivers and the current it draws is fixed. The RPM it spins at will depend on the voltage being applied (the graphs I've seen use 72v, 108v, 120v, and 144v), and the follow-on is that if you, say, want the vehicle to travel slowly up a hill, you would apply a low voltage to the motor (via the controller) to get a low RPM, and the current drawn will be as shown in the graphs for that load (torque). If you want lower current at a given torque requirement at a particular RPM, look for another motor.

So, what advantage is there in running a motor which can operate at 144v, or even 192v, at those voltages if you get all the RPM you need from the same motor running at 72v? I doubt that a motor which can spin at, say, 5000 rpm for a certain load at 72v will spin at 10,000 rpm (or some multiple of 5000) under the same load at 144v...
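A rough back-of-the-envelope on that doubt, for an idealized brushed motor where RPM at a fixed load current is (V - I*R) / Ke (the resistance, back-emf constant, and 250 A load below are made-up numbers, not WarP 9 data, and real motors also have mechanical and commutation RPM limits the model ignores):

```python
# Idealized check on "will it spin twice as fast at twice the voltage?"
# At a fixed load, the current I is the same at either voltage, and
# steady-state RPM = (V - I*R) / Ke. All constants are assumptions.

R = 0.03    # motor resistance, ohms (assumed)
KE = 0.012  # back emf constant, volts per RPM (assumed)
I = 250.0   # current drawn at this load, same at either voltage

def rpm_at(v):
    """Steady-state RPM at applied voltage v for the fixed load."""
    return (v - I * R) / KE

rpm_72 = rpm_at(72.0)     # (72 - 7.5) / 0.012  = 5375 RPM
rpm_144 = rpm_at(144.0)   # (144 - 7.5) / 0.012 = 11375 RPM
print(rpm_144 / rpm_72)   # slightly more than 2x, because the I*R
                          # drop is a fixed subtraction at both voltages
```

So in the idealized model the RPM under the same load really does roughly double with the voltage; in practice the motor's mechanical RPM limit is what keeps you from actually running it there.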

I'm trying to understand this because I have done some calculations which tell me the amount of torque the motor will need to deliver into the transmission, in each gear, for the vehicle to travel at a certain speed (or to accelerate, or climb a hill). From this I can determine whether the demands are within the operating range of the motor, and whether the current drawn will be excessive enough to trigger the limiter within the controller.


----------



## 3dplane (Feb 27, 2008)

Hi Arniem!
You got some good answers to digest above.
You ask: what advantage is there in running a motor at 144 V when a 72 V motor can achieve the same RPM? (Kind of.)
Besides the obvious efficiency gain from needing less current to achieve the same power (less heating loss), the torque will be available over a greater range of RPMs. Barna.
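The heating-loss part of that answer works out neatly in numbers. For the same power drawn from the pack, doubling the voltage halves the battery-side current, and since resistive heating goes as I squared times R, the loss drops to a quarter (the 18 kW draw and 0.01 ohm wiring resistance below are illustrative assumptions):

```python
# Rough numbers for the heating-loss point above: same power from the
# pack, twice the voltage, half the current, a quarter of the I**2 * R
# loss in the wiring. Both constants are illustrative assumptions.

R_CABLE = 0.01   # battery-to-controller wiring resistance, ohms (assumed)
POWER = 18000.0  # 18 kW drawn from the pack, same in both cases

def cable_loss(v_pack):
    """Watts dissipated in the wiring at a given pack voltage."""
    i = POWER / v_pack          # battery current for the same power
    return i * i * R_CABLE

print(cable_loss(72.0))    # 250 A -> 625.0 W lost in the cables
print(cable_loss(144.0))   # 125 A -> 156.25 W, one quarter the loss
```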


----------

