How Much Should I Worry About Jerk?

I am not a controls engineer.
I have written a motion controller for a 2 axis commercial actuator.

The controller must respond to two unpredictable user inputs:
DriveOpenLoop(v1, v2)
DriveClosedLoop(x1, x2, v1, v2)

The user inputs come in via serial at 9600 baud, which is pretty slow.
The speed is not expected to be highly precise, and it is never measured with any precision.
The system is very heavily mechanically damped.
The motors are steppers.
The control output is a PWM output through this driver to the stepper motor.

The open loop controller does this:
  1. Load target speed
  2. Clamp acceleration
  3. Write set speed to PWM out
  4. Try again on every loop until you reach the target speed
  5. Deal with cyclic position and position limits
This is entirely open loop, and it works great.
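For concreteness, one tick of that ramp can be sketched like this. All the names (`open_loop_step`, `max_accel`, `dt`) are my own placeholders, not the actual firmware:

```c
/* Minimal sketch of steps 1-4 above -- placeholder names, not the real code. */
static double clampd(double x, double lo, double hi) {
    return x < lo ? lo : (x > hi ? hi : x);
}

/* One control-loop tick: move the commanded speed toward the target,
   limited to max_accel * dt of change per tick.  The caller writes the
   returned value to the PWM output (step 3). */
double open_loop_step(double set_speed, double target_speed,
                      double max_accel, double dt) {
    double dv = clampd(target_speed - set_speed,
                       -max_accel * dt, max_accel * dt);
    return set_speed + dv;
}
```

Calling this on every loop iteration gives the constant-acceleration ramp up to the target speed; the cyclic-position and limit handling (step 5) would sit outside this function.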

The closed loop controller does this:
  1. Load set point
  2. Deal with cyclic position
  3. Calculate error
  4. Multiply by proportional gain to get set speed
  5. Clamp velocity
  6. Clamp acceleration
  7. Write set speed to PWM out
  8. Deal with user position limits
This also works great. It produces a more or less trapezoidal or triangular velocity profile, depending on distance, with an up slope set by the acceleration limit and a down slope set by the lesser of the proportional gain or the acceleration limit. It goes right to the set point and never bounces around (the motors are synchronous and the system is heavily mechanically damped). It was incredibly easy to tune, and it is tested and proven.
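A sketch of that loop, minus the cyclic-position and user-limit handling (steps 2 and 8). Again, `kp`, `v_max`, `a_max` and the rest are my placeholder names, not the actual code:

```c
/* Sketch of steps 3-7: P gain on position error, then velocity and
   acceleration clamps.  Placeholder names, not the real firmware. */
static double clampd(double x, double lo, double hi) {
    return x < lo ? lo : (x > hi ? hi : x);
}

typedef struct {
    double kp;         /* proportional gain on position error */
    double v_max;      /* velocity clamp */
    double a_max;      /* acceleration clamp */
    double dt;         /* control-loop period */
    double set_speed;  /* last speed written to the PWM */
} ClosedLoop;

double closed_loop_step(ClosedLoop *c, double position, double set_point) {
    double v_cmd = c->kp * (set_point - position);       /* steps 3-4 */
    v_cmd = clampd(v_cmd, -c->v_max, c->v_max);          /* step 5 */
    double dv = clampd(v_cmd - c->set_speed,             /* step 6 */
                       -c->a_max * c->dt, c->a_max * c->dt);
    c->set_speed += dv;
    return c->set_speed;                                 /* step 7 */
}
```

The trapezoid falls out naturally: the up slope is `a_max`, the cruise is `v_max`, and the down slope is whichever of `kp` or `a_max` bites first as the error shrinks.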

So everything is great, but I'm going to be making a new system, and there is a chance to do better.

Question 1:
Should I be clamping jerk?
The trapezoids are theoretically sharp and have theoretically huge jerks. This matters because the units often have cameras mounted on them, which want smooth motion, and jerk can affect the likelihood of stepper slip. But it seems to me that simply discrete-time differentiating the acceleration is misleading, because the real-world, physical jerk is driven by the response time of the PWM output pin and the motor driver. In other words, the dt in da/dt is not really the controller time step, but the time it takes the PWM and driver system to move from one set frequency to the next.
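If jerk does turn out to be worth limiting, one low-complexity option is to slew-limit the acceleration exactly the way the acceleration limit already slew-limits the velocity. A rough sketch with my own names, deliberately simplified: it does not anticipate the deceleration point, so it can overshoot the target speed briefly before settling.

```c
static double clampd(double x, double lo, double hi) {
    return x < lo ? lo : (x > hi ? hi : x);
}

typedef struct {
    double a_max;  /* acceleration clamp */
    double j_max;  /* jerk clamp */
    double dt;     /* control-loop period */
    double v;      /* commanded speed state */
    double a;      /* commanded acceleration state */
} JerkLimiter;

/* Move the commanded speed toward v_target; acceleration is now a
   state variable whose rate of change is clamped to j_max. */
double jerk_limited_step(JerkLimiter *s, double v_target) {
    /* acceleration that would close the speed error in one tick */
    double a_want = clampd((v_target - s->v) / s->dt, -s->a_max, s->a_max);
    double da = clampd(a_want - s->a, -s->j_max * s->dt, s->j_max * s->dt);
    s->a += da;
    s->v += s->a * s->dt;
    return s->v;
}
```

Whether this is worth it is exactly the question: if the PWM/driver chain already low-pass filters the commanded frequency, the physical jerk is set by that response time, not by the commanded step.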

Question 2: Is there a much better way to do this without dramatically increased complexity?
I was pretty excited looking at motion planning using boxcar filters (Motion Profiling - Technical / Programming - Chief Delphi). The problem I had was that I couldn't figure out how to start one of these profiles from a non-zero speed. Question 2 is pretty big, but even a pointer to some literature or some Google keywords would be helpful.
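On the non-zero-speed start: one approach (my guess, not something from that thread) is to treat the filter's history buffer as state and seed it with the current commanded speed, so the filter output starts at that speed and ramps to the new target over one window length. A sketch with made-up names:

```c
#define BOXCAR_N 16  /* window length; ramp time = BOXCAR_N * dt */

typedef struct {
    double buf[BOXCAR_N];
    double sum;
    int idx;
} Boxcar;

/* Seed the whole window with the current commanded speed, so a profile
   started mid-motion begins at v0 instead of at zero. */
void boxcar_init(Boxcar *b, double v0) {
    for (int i = 0; i < BOXCAR_N; ++i) b->buf[i] = v0;
    b->sum = v0 * BOXCAR_N;
    b->idx = 0;
}

/* Push the raw target speed in, get the smoothed command out.  A step
   input becomes a linear ramp over BOXCAR_N samples; cascading two of
   these filters gives an S-curve (jerk-limited) ramp. */
double boxcar_step(Boxcar *b, double v_target) {
    b->sum += v_target - b->buf[b->idx];
    b->buf[b->idx] = v_target;
    b->idx = (b->idx + 1) % BOXCAR_N;
    return b->sum / BOXCAR_N;
}
```

For search keywords, "S-curve motion profile", "jerk-limited trajectory generation", and "FIR filter motion profiling" all turn up relevant literature.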
