I have been discussing the topic of analog signal resolution with some co-workers and we have become stuck on a point. A true 16-bit analog input resolves a 4-20 mA signal into 2^16 = 65,536 discrete values (65,535 steps across the span). Is this kind of resolution (as distinct from accuracy) necessary in any of the applications found in process or factory automation? If not today, how about next week, next year, next decade? If so, what is the application?
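For context on the arithmetic, here is a quick sketch of what one count is worth at different bit depths, assuming (purely for illustration) that the converter's full range maps exactly onto 4-20 mA with no over-range headroom, which real input cards rarely do:

```python
# Per-step resolution of a 4-20 mA signal at various ADC bit depths.
# Illustrative arithmetic only; assumes the ADC range spans exactly 4-20 mA.

SPAN_MA = 20.0 - 4.0  # 16 mA of usable span

for bits in (12, 14, 16):
    levels = 2 ** bits           # number of discrete values
    steps = levels - 1           # intervals between adjacent values
    ua_per_step = SPAN_MA * 1000.0 / steps   # microamps per count
    pct_per_step = 100.0 / steps             # % of span per count
    print(f"{bits}-bit: {steps} steps, "
          f"{ua_per_step:.3f} uA/step, {pct_per_step:.5f} % of span/step")
```

At 16 bits each count is roughly 0.24 uA, or about 0.0015% of span, versus roughly 3.9 uA (about 0.024% of span) at 12 bits.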
Thanks for the input.