Based on a quick scan of the manual, it appears that the only means of remote control of the speed/frequency is by the analog input, either 0 - 10 volts or 4 - 20 mA.
I don't see any mention of RS-485 or PWM control (except that PWM may be used to generate the 0 - 10 volt/4 - 20 mA signal).
Please note that I have not used this device - I just quickly scanned the manual. I suggest that you carefully study the manual to see how, or if, you can control it remotely.
Motor/generators have a k1*V/f transfer function when coasting; when accelerating or braking, the force transfer function is k2*V/DCR.
Since motors are designed to do work they may be >90% efficient, but they carry a lot of stored energy from the inertial load, which can be far greater than the Joules stored in the motor itself.
So the duty cycle of dumping Watts, or plugging power with even more Watts, must be regulated against the winding thermal resistance Rwa [°C/W] in order to prevent overtemperature on the windings and armature, which causes accelerated aging.
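To get a feel for the numbers (every value below is a hypothetical assumption, not from the original post): the energy dumped per stop is ½·J·ω², and the sustainable average dissipation is bounded by ΔTmax/Rwa.

```python
import math

# Hypothetical numbers for the thermal-headroom check described above.
J_LOAD = 0.05        # kg*m^2, total inertia of motor + load (assumed)
RPM = 3000
RWA = 2.0            # degC/W, winding-to-ambient thermal resistance (assumed)
DT_MAX = 80.0        # degC, allowed winding temperature rise (assumed)

w = RPM * 2 * math.pi / 60
energy = 0.5 * J_LOAD * w**2           # Joules to dump per stop

# If braking dumps most of that energy into the winding DCR, the steady-state
# rise is dT = P_avg * Rwa, which caps the average power and stop repetition rate:
p_max = DT_MAX / RWA                   # W, sustainable average dissipation
stops_per_minute = p_max / energy * 60
print(f"{energy:.0f} J per stop, {p_max:.0f} W sustainable, "
      f"about {stops_per_minute:.1f} stops/min max")
```

With these made-up numbers a ~2.5 kJ stop can only be repeated about once a minute before the winding temperature rise becomes the limit.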
Since a constant acceleration and braking force is often ideal at some level, how does one control this effectively and efficiently?
There's no simple solution but let me try this idea.
If one knows the DCR of the motor and uses efficient MOSFET switches whose RdsOn is <2% of the DCR, then most of the heat I^2*DCR (neglecting other losses) will be dissipated in the motor windings.
We know that for PWM the effective series resistance (ESR) is the RdsOn (of the bridge pair) divided by the duty cycle. But the back EMF (V/f) falls with RPM, so the effective braking current decays in an uncontrolled way, and with it the braking g's.
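A quick numeric sketch of that uncontrolled decay, with an assumed back-EMF constant and resistances (all values hypothetical):

```python
K_E = 0.05      # V*s/rad, back-EMF constant (assumed)
DCR = 0.5       # ohm, winding resistance (assumed)
RDSON = 0.01    # ohm, bridge-pair on-resistance (assumed, ~2% of DCR)

def brake_current(w, duty):
    """Dynamic (shorted-rotor) braking current at angular speed w and PWM duty.
    The effective bridge resistance seen by the winding is RdsOn / duty."""
    esr = RDSON / duty
    return K_E * w / (DCR + esr)

# At full duty the current simply decays with speed -- nothing regulates it:
for w in (300, 200, 100, 50):
    print(f"w = {w:3d} rad/s -> I = {brake_current(w, 1.0):.1f} A")
```

The point of the loop is just to show the braking current (and hence the g level) falling linearly with speed when nothing closes the loop.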
Therefore, when you input PID loop parameters for some machine with certain transfer functions and mass, and choose setpoints for acceleration and velocity profiles, it is better to compare the error in each parameter separately to get a nice, stable 2nd-order response with critical damping. That means: choose a start or stop time according to current conditions (including winding temperature, start conditions and end conditions); compare acceleration feedback with current feedback and the rate of change of the rotary encoder; compare velocity feedback with encoder frequency; and then set a g level that can be maintained to complete the task in the desired time, as often as needed, without overheating.
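As a toy illustration of that cascaded idea (all plant values and gains below are hypothetical, and the motor model is a crude first-order one): an outer velocity loop generates a clamped current setpoint — the clamp is the g limit — and an inner loop regulates current.

```python
class PI:
    """PI regulator with output clamping (ki=0 gives a plain P controller)."""
    def __init__(self, kp, ki, lo, hi):
        self.kp, self.ki, self.lo, self.hi = kp, ki, lo, hi
        self.acc = 0.0
    def step(self, err, dt):
        self.acc += self.ki * err * dt
        return max(self.lo, min(self.hi, self.kp * err + self.acc))

# Hypothetical plant: inertia J, torque constant Kt, winding DCR, inductance L
J, KT, DCR, L, VBUS, DT = 0.02, 0.05, 0.5, 0.002, 24.0, 0.001

vel_loop = PI(kp=2.0, ki=0.0, lo=-20.0, hi=20.0)    # velocity err -> current cmd (clamp = g limit)
cur_loop = PI(kp=1.0, ki=200.0, lo=-VBUS, hi=VBUS)  # current err -> bridge voltage (PWM average)

w, i = 100.0, 0.0                       # commanded stop from 100 rad/s
for _ in range(3000):                   # 3 s simulation at 1 ms steps
    i_set = vel_loop.step(0.0 - w, DT)            # outer loop: velocity error
    v = cur_loop.step(i_set - i, DT)              # inner loop: current error
    i += (v - KT * w - DCR * i) / L * DT          # electrical dynamics
    w += (KT * i / J) * DT                        # mechanical dynamics
print(f"final speed {w:.3f} rad/s, final current {i:.3f} A")
```

The speed ramps down at the current-limited (constant-g) rate, then hands off to a smooth exponential tail — which is roughly the profile argued for above.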
Now there are a lot of variables to compute here, which I won't begin to define.
Here comes the Carl Jung moment (aha).
The way to set controlled braking profiles is now obvious (to some): use current sensing, with the average current compared to a target current profile in a servo loop, using PWM to generate the required negative plug voltage, with a 50 mV current shunt rated for the maximum short-circuit current. The PWM duty cycle can be varied, perhaps from 10% to 100%, to minimize the harmonics of the PWM rate, and a thermal sensor can reduce the duty cycle as needed if the motors are repetitively cycled up and down.
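A minimal sketch of that servo step (the shunt value, gain, 10% floor and thermal derate hook are all assumptions for illustration):

```python
def shunt_current(v_shunt, r_shunt=0.001):
    """Average braking current from the shunt drop (1 mOhm -> 50 mV at 50 A)."""
    return v_shunt / r_shunt

def update_duty(duty, i_meas, i_target, thermal_derate=1.0, gain=0.01):
    """One control step: servo the PWM duty toward the target current profile.
    Duty is clamped to 10-100%, with the ceiling pulled down by the thermal
    sensor via `thermal_derate` when repetitive cycling heats the winding."""
    duty += gain * (i_target - i_meas)
    return max(0.10, min(1.0 * thermal_derate, duty))

# Example: measured shunt drop 40 mV -> 40 A, target 30 A -> duty backs off
d = update_duty(0.5, shunt_current(0.040), 30.0)
print(f"new duty: {d:.2f}")
```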
Before locking the rotor with 0-ohm bridge shunts across all coils (no current), we need to modify the PID loop to go from constant-velocity mode to braking mode to locked-position mode, switching just before the velocity reaches 0 so that we don't start going backwards from the plugged negative voltage. But then, as the OP stated, doing this from a low velocity is a bit of overkill, with dynamic losses increased and the software guys not getting it right. By regulating the current-shunt drop, however, this servo control method ought to give a smooth transition, using PWM levels predicted between the -Vr plug voltage and 0 V from the desired braking rate and the expected current with the inertial load. Some adaptive braking cycles may need to be run periodically to check that the transfer functions are correct, comparing expected stop times with actual ones.
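The mode sequencing might be sketched as a simple selector (threshold value is a hypothetical assumption):

```python
def select_mode(v_cmd, w, w_eps=2.0):
    """Sequence the PID loop through its three modes. We leave BRAKE just
    before the measured speed reaches zero, so the plugged negative voltage
    never drives the rotor backwards."""
    if v_cmd != 0.0:
        return "VELOCITY"          # constant-velocity regulation
    if abs(w) > w_eps:
        return "BRAKE"             # current-regulated plug braking
    return "HOLD"                  # lock rotor: 0-ohm bridge shunt / position loop

print(select_mode(0.0, 50.0))      # still moving -> BRAKE
print(select_mode(0.0, 0.5))       # nearly stopped -> HOLD
```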
So what is the aha? Servo design with velocity, acceleration, inertial mass, low RdsOn/DCR ratios, RPM feedback and current-loop regulation for smooth stops (something really needed for bus drivers). Then compensate the loop gain for variable inertia and load current, using RPM feedback to track the user's foot-controlled brake g levels.
The tradeoff: you can't have the shortest stop time with a variable back EMF (V/RPM) or an uncontrolled resistance; you need a back-driving voltage that increases as speed reduces to keep the current constant. Or you must compromise on the shortest stop time with a fixed back-driving voltage and a controlled braking current.
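The first option can be made concrete: with braking current I = (Vplug + Ke*w)/DCR, holding I constant requires Vplug = I*DCR - Ke*w, which rises as the speed falls (the constants below are assumptions):

```python
K_E, DCR, I_BRAKE = 0.05, 0.5, 20.0   # hypothetical: V*s/rad, ohm, target amps

def plug_voltage(w):
    """Reverse ("plug") voltage magnitude needed to hold I_BRAKE constant.
    I = (V_plug + Ke*w) / DCR  =>  V_plug = I*DCR - Ke*w.
    A negative result means the back EMF alone exceeds the target current."""
    return I_BRAKE * DCR - K_E * w

for w in (300, 200, 100, 0):
    print(f"w = {w:3d} rad/s -> V_plug = {plug_voltage(w):+.1f} V")
```

At high speed the back EMF supplies the braking current by itself; the plug voltage must climb toward I*DCR as the rotor slows, which is exactly the "increasing back-driving voltage" requirement stated above.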
A sample-and-hold (S&H) can be used on the peak currents and compared with the average currents to get cycle-to-cycle PWM feedback on the duty cycle.
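One hedged sketch of that cycle-by-cycle scheme (the limit and gain values are assumptions): the held peak current acts as a hard per-cycle bound while the averaged current is servoed to the target.

```python
I_LIMIT = 60.0    # A, hard peak limit from the shunt's short-circuit rating (assumed)

def next_duty(duty, i_peak, i_avg, i_target, k=0.01):
    """Cycle-by-cycle duty update: the sampled-and-held peak current gives an
    immediate per-cycle cutback, while the average current is servoed to
    the target braking profile."""
    if i_peak > I_LIMIT:
        return max(0.10, duty * 0.5)              # halve duty on a peak fault
    return max(0.10, min(1.0, duty + k * (i_target - i_avg)))

print(next_duty(0.8, 70.0, 25.0, 30.0))   # peak fault: duty halves
print(next_duty(0.8, 40.0, 25.0, 30.0))   # average low: duty nudges up
```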
This is how we did it in the '70s with a 2 HP linear motor seeking to any track in 50 ms, with a large-mass head-arm assembly on 14" HDDs, with zero overshoot across 5 disks to within 0.1 thou position error, using embedded servo pulses. ...those were elephants.
Best Answer
It doesn't drive the motor. I assume you're talking about a brushed DC motor. Brushed DC motors have the commutation built into the mechanics of the motor - the brushes are part of that. You can just apply a voltage and it will go. But since voltage relates to speed and current relates to torque, you can start doing fancy things to control those variables to make the motor behave in a desired way. Brushed DC motor controllers are there simply to help with one or both of those parameters, but are not involved in commutation.
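Those two relationships can be sketched numerically (the constants are hypothetical; in SI units Ke in V·s/rad numerically equals Kt in N·m/A):

```python
K_E = 0.05   # V*s/rad, back-EMF / torque constant (assumed)
DCR = 0.5    # ohm, winding resistance (assumed)

def steady_state(v, t_load):
    """Brushed DC steady state: current sets torque, voltage (minus the I*R
    drop) sets speed.  T = Ke * I  and  w = (V - I*DCR) / Ke."""
    i = t_load / K_E
    w = (v - i * DCR) / K_E
    return i, w

i, w = steady_state(12.0, 0.25)     # 12 V supply, 0.25 N*m load
print(f"I = {i:.1f} A, w = {w:.0f} rad/s")
```

The controller's job is just to pick V (speed) or I (torque); the brushes do the commutating.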