Bill Lydon
(Originally posted Wed 09/09/1998)
This whole thread brings up a good debate.
The risk of doing control with a diskless PC is significantly lower than with a disk-based system. Also, the smaller the operating system, the higher the reliability: as a rough rule of thumb, the more lines of code, the higher the probability of interactions that escape testing and eventually surface as bugs in the field.
Using diskless PC hardware with compact, proven operating systems (DOS and QNX, for example) is much less risky than using larger operating systems (NT, Windows 95, CE).
Having been in real-time control for over 20 years and seen the progression to distributed multiprocessing, I’m a bit miffed by the idea of running MMI, information, and real-time control in one PC under NT, even with “real-time” extensions. This is the kind of thing we used to do with minicomputers because the cost of a machine was so high. In the PC world it doesn’t make any sense. Run control on diskless PCs networked to PCs for information and MMI. This is a scalable, reliable architecture. We regularly run Ethernet as a backbone and get great speed.
All one needs to do is an MTBF failure analysis on a desktop versus a diskless PC to see the difference.
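To make the comparison concrete, here is a minimal sketch of that analysis. For components in series, failure rates add, so the system MTBF is the reciprocal of the sum of the component failure rates. The component MTBF figures below are purely illustrative assumptions for the sake of the arithmetic, not vendor data; the point is only that removing high-failure-rate mechanical parts (disk, fan) raises the system MTBF substantially.

```python
# Sketch: series-system MTBF for a desktop PC vs. a diskless PC.
# All component MTBF values (in hours) are illustrative assumptions.

def system_mtbf(component_mtbfs):
    """For components in series, failure rates (1/MTBF) add:
    MTBF_sys = 1 / sum(1/MTBF_i)."""
    return 1.0 / sum(1.0 / m for m in component_mtbfs)

# Hypothetical component figures, hours between failures:
desktop = {"motherboard": 250_000, "power_supply": 150_000,
           "hard_disk": 50_000, "fan": 60_000}
diskless = {"motherboard": 250_000, "power_supply": 150_000}  # no disk, no fan

print(round(system_mtbf(desktop.values())))   # -> 21127 hours
print(round(system_mtbf(diskless.values())))  # -> 93750 hours
```

With these assumed numbers the diskless system's MTBF is more than four times the desktop's, driven almost entirely by dropping the two mechanical components.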
PC technology is sound and will outperform any PLC. As for ruggedness, our products run single-board diskless 80486 and 586 CPUs and are certified 1E for nuclear use, a stringent set of tests for operation under high noise and vibration. I don’t know of any PLC product that meets this requirement “out of the box”.
The flexibility of the PC architecture has allowed us to provide redundant processors that run in lockstep for guaranteed bumpless redundancy at a low price. No special programming required.
There is a great deal of propaganda floating around about this subject, but PC-based controls are here to stay.
Bill Lydon
RTP Corp.
414-427-0789
http://www.rtpcorp.com