I recently came across a paper from a leading factory automation control software vendor claiming that most PLC and industrial automation sampling and determinism requirements are on the order of 2-5 ms at minimum (that is, 2-5 ms sampling, and AVERAGE interrupt latencies on the order of 2-4 ms as well), and that most users will never need anything faster in industrial automation.

While the sampling-rate assertion may be true for some industries, I find the interrupt-latency assertion incredibly hard to swallow for most of them. It implies, for example, that when an alarm signal is triggered, the controlling computer is allowed to respond, ON AVERAGE, 2-4 milliseconds AFTER the signal arrives. And note the "on average": the actual latency could easily be more -- orders of magnitude more. It also implies that when a rotating shaft triggers its key phasor once per revolution, we are willing to live with a large error in the calculated spin speed. Where safety is concerned, these numbers strike me as extremely large. In my experience, 2-4 milliseconds is an eternity for turbomachinery, magnetic suspension systems, and steam/fluid piping applications, especially where safety is at stake. For most of my own applications I have used sampling periods of no more than 1 ms. Then again, maybe I am too deep in a niche industry.

So, in an attempt to get a better grasp of others' realities, I would like to poll the group:

1. What industry are you in?
2. What sampling rates do you typically use in your applications?
3. How long is an acceptable latency for a control application? Or, alternatively, how long should a computer system take to respond to an emergency shutdown request?

Any other information that you think might be relevant to this discussion is more than welcome. Please respond either here or directly to [email protected].
Thank you for your time. - Edgar
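
P.S. To make the key-phasor point concrete, here is a quick back-of-the-envelope sketch (my own illustrative numbers, not from the vendor's paper): if each once-per-revolution pulse timestamp can be late by up to some jitter bound, two consecutive timestamps can together stretch or shrink the measured period by twice that bound, and the inferred RPM swings accordingly.

```python
# Worst-case tachometer error from interrupt-latency jitter.
# Illustrative sketch: 3600 RPM shaft and a 2 ms jitter bound are my
# assumed example numbers, not figures from the vendor's paper.

def rpm_from_period(period_s: float) -> float:
    """Convert one revolution period (seconds) to RPM."""
    return 60.0 / period_s

def worst_case_rpm_error(true_rpm: float, jitter_s: float) -> float:
    """Largest absolute RPM error when each of the two consecutive
    pulse timestamps may independently be late by up to jitter_s."""
    true_period = 60.0 / true_rpm
    # One timestamp on time, the other maximally late (or vice versa)
    # shifts the measured period by up to 2 * jitter_s either way.
    shortest = true_period - 2.0 * jitter_s
    longest = true_period + 2.0 * jitter_s
    high = rpm_from_period(shortest)   # period read short -> speed reads high
    low = rpm_from_period(longest)     # period read long  -> speed reads low
    return max(high - true_rpm, true_rpm - low)

if __name__ == "__main__":
    # 3600 RPM shaft (16.7 ms per revolution), 2 ms latency jitter
    err = worst_case_rpm_error(3600.0, 0.002)
    print(f"worst-case error: {err:.0f} RPM ({100.0 * err / 3600.0:.0f}%)")
```

With those numbers the worst-case reading is off by roughly a thousand RPM, around 30 percent of the true speed, which is exactly why a few milliseconds of latency jitter is hard to accept for overspeed protection.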