Hi everyone,
First, I apologize if this (long!) question is a bit ignorant/dumb.
I am working on a project that uses a servo to move a mounted rod back and forth (I'm currently moving it from 0 to 90 degrees, but the angle will change every time). What I am interested in is finding a way to calculate the instantaneous torque on the rod (or at the servo, I guess). While the rod is moving, there are times when a force is applied against its motion, which is why the torque won't be the same from one experiment to the next (it's actually going to be a person pushing against it, each time with a different amount of force). I am using a powerful 200 oz-in servo with a 3:1 gear ratio, so the servo should push through most opposing forces while maintaining the same speed. I currently have two ideas for calculating the torque:
1) I expect that when a greater torque is required, the servo should draw more current. I know that power (P) = torque (τ) × angular velocity (ω). If I keep the voltage and velocity (relatively) constant, then the torque should be roughly proportional to the current, right? The problem is that the current sensor I'm using outputs a square wave (presumably because the motor current is chopped by the servo's drive electronics), and I'm not sure how to turn that into a single number.
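To make idea 1 concrete, here is a rough sketch of what I have in mind: sample the current sensor quickly, average the samples so the square-wave chopping smooths out, and scale by a calibration constant. All the numbers (idle current, oz-in per mA) are made up for illustration; the real constant would have to come from a one-time calibration, e.g. hanging known weights on the rod.

```python
# Sketch of idea 1: average the chopped current readings, then scale
# by a calibration constant. All constants here are made-up examples.

def mean_current(samples_mA):
    """Average many fast ADC samples so the square-wave chopping averages out."""
    return sum(samples_mA) / len(samples_mA)

def torque_from_current(i_mA, i_idle_mA=120.0, k_ozin_per_mA=0.4):
    """Estimate torque as proportional to current above the no-load draw.
    i_idle_mA and k_ozin_per_mA are hypothetical calibration values."""
    return max(0.0, (i_mA - i_idle_mA) * k_ozin_per_mA)

# Simulated burst of ADC samples (mA) while someone pushes on the rod:
samples = [450, 610, 380, 590, 520, 470, 640, 400]
i_avg = mean_current(samples)       # 507.5 mA for these samples
tau = torque_from_current(i_avg)    # 155.0 oz-in with these made-up constants
print(i_avg, tau)
```

The key assumption is the one stated above: voltage and speed held roughly constant, so torque tracks current above the no-load draw.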
2) I could use accelerometers or some other type of sensor. I also know that torque (τ) = moment of inertia (I) × angular acceleration (α). However, I have no idea how I would practically implement such a sensor, or whether it would even work. From my research it seems these sensors only measure linear acceleration, but the rod is rotating.
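For idea 2, one version I can imagine (assuming I can read the rod's angle over time, e.g. from the servo's feedback potentiometer or an added encoder) is to numerically differentiate the angle twice to get α, then apply τ = I·α with the rod modeled as a uniform bar pivoting at one end (I = mL²/3). The sample rate, mass, and length below are made up:

```python
# Sketch of idea 2: estimate angular acceleration from three consecutive
# angle samples (central difference), then torque = I * alpha.

def angular_accel(theta_prev, theta_now, theta_next, dt):
    """Central-difference second derivative of angle (rad) -> rad/s^2."""
    return (theta_next - 2.0 * theta_now + theta_prev) / (dt * dt)

def rod_inertia(mass_kg, length_m):
    """Uniform rod pivoting about one end: I = m * L^2 / 3 (kg*m^2)."""
    return mass_kg * length_m ** 2 / 3.0

dt = 0.01                      # 100 Hz sampling (made up)
thetas = [0.00, 0.02, 0.05]    # three consecutive angle samples, radians
alpha = angular_accel(*thetas, dt)   # 100 rad/s^2 for these samples
tau = rod_inertia(0.2, 0.3) * alpha  # I = 0.006 kg*m^2 -> tau = 0.6 N*m
print(tau)
```

One caveat I can already see: if the servo really does hold its speed while being pushed, α stays near zero, so this approach would only see the net torque from acceleration, not the person's push.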
I tried doing a ton of research online but surprisingly couldn't find anything. Some have said to use something called a "torque constant", but I am using a hobby servo and this information isn't provided. Does anyone have any ideas here, or am I completely misunderstanding something? Which idea is better, or easier to implement?