Why Does a BLDC Motor Spin Slower with the Same Duty Cycle at Higher Frequencies?

Tags: brushless-dc-motor, motor, speed

I have an off-the-shelf (OTS) BLDC motor controller as well as one that I just made myself, and both are dramatically slower at higher PWM frequencies. The slower speed is consistent with the lower current being drawn, but otherwise the speed drop does not make sense to me.

I have two tests:

  1. 20% duty cycle, f_PWM = 3 kHz: draws about I = 0.8 A from the power supply; the motor spins faster
  2. 20% duty cycle, f_PWM = 8 kHz: draws about I = 0.25 A from the power supply; the motor spins slower

This is particularly bad because I want to drive the motor at f_PWM ≈ 30 kHz.

To be honest, these tests were done with my controller, which is limited so that it can never reach 100% duty cycle and therefore never has any overlap that could lead to shoot-through. That said, I do NOT think the losses are switching losses (the current is much lower at the higher frequency, not higher).

Any ideas or common reasons why?

Thanks in advance

ADDED:
By f_PWM I mean the frequency of the square wave I am sending to the MOSFET gates during each commutation step. I have not yet attempted speed control, but my plan was to adjust this frequency.

I guess I'm not entirely sure of the dead time. On a single commutation step I:
1. turn off all square-wave outputs,
2. re-map the appropriate outputs according to the Hall-effect sensor state, and
3. put the same square wave on the appropriate high-side and low-side transistors.
Additionally, outside a 1.5 µs margin before the square wave rises and after it falls, I fill the rest of the period with a square wave on the low-side counterpart of the high-side transistor to re-charge a charge-pump capacitor. This is necessary because I am using N-channel MOSFETs on the high side as well. One step looks roughly like the sketch below.
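In C, a simplified sketch of that sequence (the function names and table contents here are placeholders, not my exact code):

    #include <stdint.h>

    typedef struct { uint8_t high_fet; uint8_t low_fet; } phase_pair_t;

    /* Placeholder hardware hooks -- stand-ins for the real gate-driver code. */
    static void disable_all_outputs(void) { /* kill every gate output */ }
    static uint8_t read_hall_state(void) { return 1; /* read the 3 Hall inputs */ }
    static void apply_pwm(uint8_t hs, uint8_t ls, uint8_t duty_pct)
    { (void)hs; (void)ls; (void)duty_pct; /* start the gate square waves */ }

    /* Hall state (1..6) -> which high-side/low-side FET pair to energize. */
    static const phase_pair_t commutation_table[7] = { {0, 0} /* ... per wiring */ };

    /* One commutation step, e.g. called from the Hall-sensor change interrupt. */
    void commutate(uint8_t duty_pct)
    {
        disable_all_outputs();                        /* 1. all outputs off    */
        uint8_t hall = read_hall_state();             /* 2. re-map from Halls  */
        phase_pair_t pair = commutation_table[hall];
        /* 3. same square wave on the chosen high- and low-side FETs; the
         * low-side counterpart of the active high-side FET also gets pulsed
         * (outside a 1.5 us margin around each edge) to refresh the
         * charge-pump capacitor for the N-channel high-side gate drive. */
        apply_pwm(pair.high_fet, pair.low_fet, duty_pct);
    }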

I know that's a little sloppy of a description, but please let me know if there's anything I can clarify.

Best Answer

You are right in that they are not switching losses. If they were, the current would go up with frequency, not down.
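(As a rough hard-switching estimate, the loss per transistor is on the order of P_sw ≈ ½ · V · I · (t_rise + t_fall) · f_PWM, so switching loss, and the supply current with it, grows with f_PWM.)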

You said it yourself: your controller implements a dead time. That dead time is fixed in absolute terms, so at higher frequency it is a larger fraction of the PWM period, and the effective time the motor is driven is reduced.
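To put rough numbers on that, here is a back-of-the-envelope sketch, assuming a fixed 3 µs of on-time is lost per PWM period (e.g. 1.5 µs per edge as you describe; the real figure depends on your controller):

    #include <stdio.h>

    /* Effective duty after a fixed dead time is lost each PWM period:
     * D_eff = D - t_dead * f_pwm (clamped at zero).                   */
    static double effective_duty(double duty, double t_dead, double f_pwm)
    {
        double d = duty - t_dead * f_pwm;
        return d > 0.0 ? d : 0.0;
    }

    int main(void)
    {
        const double t_dead = 3e-6;                 /* assumed: 3 us lost per period */
        const double freqs[] = { 3e3, 8e3, 30e3 };  /* the frequencies in question   */
        for (int i = 0; i < 3; i++)
            printf("f_PWM = %5.0f Hz: 20%% commanded -> %4.1f%% effective\n",
                   freqs[i], 100.0 * effective_duty(0.20, t_dead, freqs[i]));
        return 0;
    }

With those assumed numbers, a commanded 20% becomes roughly 19%, 18%, and 11% effective at 3, 8, and 30 kHz, and your charge-pump refresh scheme may well eat more of the period than this simple model assumes.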

Just to avoid some confusion, I want to point out that what you are seeing has nothing to do with the inductance of the motor coils. The inductance of the coils only serves to smooth out the individual pulses. This decreases the ripple current in the coils and makes the current closer to the average. That's actually a good thing.

Depending on how exactly you are driving the motor, it is possible that too little coil inductance causes problems at low frequencies by letting the current swing back and forth even though it averages to the desired amount. The resistive losses in the coils are proportional to the square of the current, and with large ripple imposed on the slow desired signal the RMS current is higher than the average alone. Unless you have an unusual or badly designed setup (I have seen both), this is probably not the problem; see the rough numbers below. Again, the dead time explains your symptoms very nicely.
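To show why this effect runs opposite to your symptoms, here is a rough numeric sketch (all values invented for illustration; each energized phase pair is treated as a buck-like stage with triangular ripple):

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        /* Invented illustrative values -- not taken from the question. */
        const double V = 12.0;      /* bus voltage, V             */
        const double L = 100e-6;    /* phase inductance, H        */
        const double D = 0.20;      /* duty cycle                 */
        const double R = 0.5;       /* winding resistance, ohms   */
        const double I_avg = 1.0;   /* desired average current, A */

        const double freqs[] = { 3e3, 8e3, 30e3 };
        for (int i = 0; i < 3; i++) {
            double f = freqs[i];
            /* Peak-to-peak triangular ripple, buck-like approximation. */
            double dI = V * D * (1.0 - D) / (L * f);
            /* RMS of a DC level plus triangular ripple: I_avg^2 + dI^2/12. */
            double I_rms = sqrt(I_avg * I_avg + dI * dI / 12.0);
            printf("f_PWM = %5.0f Hz: ripple %5.2f A p-p, I^2*R loss %5.2f W (vs %4.2f W ripple-free)\n",
                   f, dI, I_rms * I_rms * R, I_avg * I_avg * R);
        }
        return 0;
    }

With these made-up numbers the ripple, and hence the extra I²R loss, shrinks as f_PWM rises: the opposite trend to what you observed, which again points at the dead time.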

Added:

I just realized that maybe by "PWM" you are talking about the motor phase drive frequency (how fast you are trying to rotate the magnetic field), not the PWM frequency used to modulate the effective drive voltage the motor sees. Your question is badly worded in that this major distinction is not clear.

If in fact you are wondering why the motor takes less current when it is run faster, that is because of the back EMF it generates. You can think of a motor as something that makes torque proportional to the voltage it sees, with a variable voltage source in series. This voltage source opposes the voltage you apply, in proportion to how fast the motor is spinning in the direction you are trying to drive it. The faster it goes, the less effective voltage is presented to the "driving" part of the motor, and the less current it takes.
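Here is a minimal numeric sketch of that series-source model (the constants are invented for illustration; k_e is the back-EMF constant):

    #include <stdio.h>

    int main(void)
    {
        /* Invented illustrative constants -- not from any particular motor. */
        const double V_applied = 12.0; /* effective applied voltage, V     */
        const double R = 1.0;          /* winding resistance, ohms         */
        const double k_e = 0.02;       /* back-EMF constant, V per (rad/s) */

        for (double w = 0.0; w <= 500.0; w += 100.0) {
            double v_bemf = k_e * w;                /* opposing back EMF   */
            double current = (V_applied - v_bemf) / R;
            printf("speed %4.0f rad/s: back EMF %5.2f V, current %5.2f A\n",
                   w, v_bemf, current);
        }
        return 0;
    }

At zero speed the full 12 V drives current; at 500 rad/s only 2 V of effective voltage remains, so the current (and with it the torque) drops accordingly.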