I have a serial interface between my PC and a servo motor controller. I'm using a VB GUI and reading the actual velocity at a 1 Hz rate. The problem is that the velocity reading jumps ±5 rpm. (The controller I have to use doesn't really provide velocity feedback; I just read the encoder position, wait x seconds, read the position a second time, and take the difference, so there is some inaccuracy.) I would like to reduce this fluctuation in the reading. I have tried an averaging equation, but it doesn't smooth the reading enough, and if I increase the number of samples in the average it takes too long to get a result.
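For reference, here is roughly what I'm doing now (the real code is VB; this is just a Python sketch, and the encoder resolution shown is a made-up value, not my actual hardware):

```python
from collections import deque

COUNTS_PER_REV = 4096  # placeholder encoder resolution, not my actual value


def velocity_rpm(pos_prev, pos_now, dt):
    """Estimate velocity in rpm from two encoder positions dt seconds apart."""
    return (pos_now - pos_prev) / COUNTS_PER_REV / dt * 60.0


class MovingAverage:
    """The simple averaging I tried: mean of the last n velocity samples."""

    def __init__(self, n):
        self.samples = deque(maxlen=n)

    def update(self, x):
        self.samples.append(x)
        return sum(self.samples) / len(self.samples)


# Once per second: estimate velocity from the two encoder reads, then average.
avg = MovingAverage(5)
# v = velocity_rpm(prev_pos, curr_pos, 1.0)
# smoothed = avg.update(v)
```

The trade-off I'm hitting is in that `maxlen=n`: a bigger n smooths more but the displayed value lags further behind the true speed.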
I know there is a standard software filter algorithm, but I can't remember what it is. Any help would be appreciated!
Thanks.