Reliability of PC Automation

(Originally posted Thurs. 1/22/98)
"Brock, Dan" <[email protected]> wrote:
> When was the last time Win95/NT ran more than a month without a
> problem?

[Gentry, Davis] If you get (write) good software, hardware compatible with its environment, and know what you are doing when you set it up, then an NT 4.0 production box that crashes once a month would surprise the hell out of me. I would be disappointed if it crashed once a *year*. If you miss *any* one of the three criteria above, then yes, it will crash on you. Frequently. One difference with the Wintel boxes is that everybody and his grandmother thinks they can program them. And they do. And it is not always stable. What a shock. How many amateur programmers are out there trying to write applications on VMS? Or HP-UX? And from the other end of that equation, how many of you have ever seen poorly written RLL causing havoc in a PLC?

> Johnson Lukose [SMTP:[email protected]] wrote:
>
> > The reality is the users have the money, and common sense says
> >the one with the money to spend is always RIGHT!! You will be up
> >against the wall in this matter. You are going to have a hell of a
> >time to convince them otherwise. The propaganda of PC + W95 / NT
> >has created a market perception of proportions even this list does
> >not realise.
> >It will make everyone a winner if you agree with the users and take
> >the contract. They get the systems they want and you get the project
> >you need.


[Gentry, Davis] I agree with Mr. Lukose, but he is missing one point. What do you do with your data? Many (most?) of the users today who want to look at manufacturing data (in any form) are using a Wintel PC on their desk. And they are running Microsoft Office on it. And if you generate your data on a platform which is compatible with MS (whatever our feelings may be about MS, its tactics, and its products), you will find it *much* easier to get the data to the customer in a timely, efficient, and inexpensive manner. When your customer's accountants are getting the data they want, and the IEs are getting the data they want, and the process engineers are getting the data they want, and the executives are getting the pretty pictures on their desktops that they want, then everyone is happy.

*That* is the true advantage of the PC. That is where your cost savings are, if that data is appropriately utilized. Be sure to help your customer *use* the data. They may not be used to having data quickly and easily available. Show them the advantages.

Come on guys. The PC is just another tool. It may or may not be applicable to your problem. Analyse your problem, decide the best tools for the job, decide the cheapest tools for the job, and present your customer with the options, the pros, and the cons of each option. And if you don't know enough about PCs to program them and/or set them up correctly for a factory setting, then sell something else. But you should probably learn, because their market share is not going *down* any time soon.

Davis Gentry
White Oak Semiconductor
Sandston, Virginia
 
(Originally posted Thursday 1/22/98)
Well, I guess I'll throw my two cents' worth in on this. It's more venting and opinion than anything else, but having been subjected to Windows NT for a couple of months, I'm totally underwhelmed!

Before I got disgusted and started logging the blue screens, there were at least half a dozen others. This machine is connected to a network.

Date Time Activity/Application

12/20/97 1105 VB application
12/23/97 0745 Installing Citect
1/5/98 0749 Switching Servers
1/5/98 1148 Screen saver
1/14/98 0820 Screen saver
1/19/98 0930 Microstation 95
1/20/98 0905 Netscape 3.01 @Thomas Register URL
1/21/98 1030 Microstation 95

Now am I planning on recommending WinNT for critical functions? Sure, and I also have some ocean front property for sale in Phoenix, Arizona.

Carl Ramer, Sr. Engineer
Controls & Protective Systems Design
EG&G Florida
Kennedy Space Center

p.s. If the weather holds, Space Shuttle Endeavour launches tonight.
 
(Originally posted Thurs. 1/22/98)
Dan,
I agree that the older operating systems are definitely more stable than the newer ones, but let's compare things on an equal basis. My question is: when was the last time one of the VMS machines was installed, tuned, operated, and maintained without a knowledgeable system administrator? If we want a fair comparison, we should try one of two things.

1. Set up the VMS machine so that the users can install and uninstall any applications or operating system patches they want. Let the users download every "cool" program they see on the internet and try it out for a few days before they delete it off the machine. Let the users change operating system parameters as they see fit. Give the users access to the on/off switch so they can turn the thing off every time they get tired of waiting and assume it must be locked up. If we do these things then we will have a reasonable approximation of the typical environment of a Win95 machine.


2. Have a knowledgeable system administrator set up a Win95/NT machine. The system administrator will then install all the applications, test each new application in a non-public account, and release it for all users after the conflicts have been resolved. Finally, the sysadmin will check on the machine on a periodic basis and make any adjustments needed to resolve user complaints. If we do these things then we will have a reasonable approximation of the typical environment of a VMS machine.

Does anybody have any experience with the relative reliability of VMS network servers vs. Win NT network servers? At least the environment and the use pattern of these would be the same.

Carl Lemp
 

Michael Whitwam

(Originally posted Thurs. 1/22/98)
I think that the sales of DEC Alpha speak for themselves. A decent modern PC is every bit as good as the DEC. If you want power, go multiprocessor.

At the risk of sounding like a stuck record, let me repeat: if correctly set up, NT is very robust. Did anyone ever ask a beginner to install VMS on a VAX? No, so why do it with NT?

Further, the end user software also plays a major factor. If a system is left alone, it will probably do just fine. I have one customer that has been running InTouch on W95, since W95 came out. The system is an operator interface, running 24 hours a day. So far we have not had a single failure. (and W95 s*cks at robustness)

*** The reason in my opinion, is that the client has no PC literate maintenance staff, so nobody hacks! ***

Michael Whitwam
http://www.wisetech.co.za
 

A. V. Pawlowski

(Originally posted Thurs. 1/22/98)
It should be sunny when I finish work and go out to my car in the evening too. Your note is just a little off the mark.

It is reasonable to expect products to be used as intended, but not all control situations need absolute reliability and it would be silly to
have to reinvent every wheel yourself for every situation. Many control situations can be, and are, satisfied through the use of PC products.

My comment was based on the fact that many people are pushing Windows as the latest and greatest and my personal experience so far indicates
otherwise. I wanted to see if I was the only one or just having bad luck.

----------
On Tuesday, January 20, 1998, Cindy Hollenbeck <[email protected]> wrote:

.........Good control software should not fail, nor be dependent on
some other company's O/S code. There are a number of control software
vendors who provide products based on this policy. The vendors who take
the easy way out and write to WinNT or Win95 are doing an injustice to
the PC control industry - IF they advertise that they have a
deterministic, real-time, reliable system that can be used in
virtually any control application.

If you're planting a garden, use hand tools - if you're plowing a
field, you need a tractor!.................
 
(Originally posted Thurs. 1/22/98)
Well, Carl and David (servoboy) are right. We seem to be
comparing apples and oranges. His point is well taken in that not too many operators know enough to cause "problems" on a VMS operating system. But if they did know enough to change configurations and mess with TPU on the wrong files...

The whole conversation should boil down to comparing systems of the same relative cost and complexity.

One problem we just recently discovered was another twist on the reliability issue. Is the Ethernet connecting all your process systems directly connected to your PC systems? When we switched to fast Ethernet between plants, the vendor (nameless on purpose) of this equipment did not tell us that some of their own Ethernet devices were incompatible with it. This crashed PC's for some unknown (to us) reason. We purchased new network interface cards for the PC's and the blue screens went away.
 

A. V. Pawlowski

(Originally posted Thurs. 1/22/98)
If you are putting together, or just have, a non-custom (configured, commercial-product-based) SCADA system running on Win95/NT, and it runs for a year with normal operator interaction, please name its makeup. Seriously, I am interested. If you have a setup that works well, I would like to know what it is.
 

Dan Hollenbeck

(Originally posted Thursday 1/22/98)
Hi Carl,

I am sorry to hear about your problems with Win NT. Sure glad I did not have to live through what you did.

Here is what I learned from your misfortune.

Win NT might work for control if:

12/20/97 ONLY run the control kernel on that box, nothing else.
12/23/97 Don't run or install SCADA on the control box.
01/05/98 Don't change network configuration.
01/05/98 Uninstall screen saver.
01/14/98 Really make sure the screen saver is removed.
01/19/98 Don't touch or look at the control box.
01/20/98 Don't treat the control box like it is a computer.
Unplug the monitor, keyboard, and mouse.
01/21/98 Lock the computer up in a control cabinet.

This sure sounds like a traditional PLC to me. However, if I need to do all this to get Win NT to work, what is the point of using it? ;-)

Regards, Dan
 

Hevelton Araujo Junior

(Originally posted Thurs. 1/22/98)
>I think that the sales of DEC Alpha speak for themselves. A decent modern
>PC is every bit as good as the DEC. If you want power, go multiprocessor.


Won't you raise the price to around the Alpha range once you start adding processors? (I'm not being sarcastic, I really don't know.)

>At the risk of sounding like a stuck record, let me repeat. If correctly
>setup, NT is very robust. Did anyone ever ask a beginner to install VMS on
>a VAX. No, so why do it with NT?

Couldn't agree with you more on that. The problem is that the "hardware compatibility list" for VMS is one line long and works. The NT list, when you use PC's, is the size of a book, and is not always right.


Hevelton Araujo Junior
IHM Engenharia e Sistemas de Automação LTDA
<[email protected]>
 
(Originally posted Fri. 1/23/98)
I have seen poorly written RLL (and STL too) crash a Siemens S5 PLC... it happens all the time if you don't keep your variable addressing straight (especially mixed-type variables of different length/structure) using Step5. Normally this happens after download, when the little run light on the PLC goes out!

The S5/Step5 system (which I believe is/was the world's #1 installed PLC!) has no type checking so it is easy to corrupt your memory if you are not careful.
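The kind of silent corruption described above is easy to reproduce outside of Step5. As a loose analogy (Python with `ctypes`; the structure and names here are invented for illustration, not taken from Siemens), two "variables" of different lengths can share the same addresses, so writing one quietly destroys the other:

```python
import ctypes

class DataBlock(ctypes.Union):
    """Two overlapping views of the same bytes, loosely analogous to
    mixed-length S5 operands mapped onto one address range: a single
    32-bit value and a pair of 16-bit words occupy the same memory."""
    _fields_ = [("dword", ctypes.c_uint32),
                ("words", ctypes.c_uint16 * 2)]

db = DataBlock()
db.dword = 0x11223344   # store a 32-bit "variable"
db.words[1] = 0         # write a 16-bit "variable" at an overlapping address
# db.dword is now silently corrupted; nothing flagged the overlap,
# just as Step5 performs no type checking on overlapping operands
```

No error or warning appears here either; the corruption only shows up when the 32-bit value is next used.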

PLC's are not idiot proof... just idiot resistant (and some more than others)

Randy Sweeney
Philip Morris R&D
 
(Originally posted Fri. 1/23/98)
Carl:

Was the questioning due to the fact that PLC's were prone to software crashes and hardware failures, or was it pure reluctance to use something new and different? Just curious, as I was not involved with the industry at the time.

Aside from one forum participant's response that software crashes are due mainly to amateur/inexperienced programmers or programmers too lazy to keep their skills up to date, it seems to me that those who are currently
reluctant to implement PC's in a control environment have a pretty solid foundation on which to base their opinions. I'm not inclined to believe that even most O/S crashes are the result of some unauthorized or inexperienced twiddling. Otherwise, why would PC manufacturers spend so much time and money on personnel for consumer helplines? PC's are NOT noted for working every time, even right out of the box. The ideal of Plug and Play falls so far short of real life that even a PC helpline technician
I once talked to sarcastically referred to it as Plug and Pray. There are even websites that deal with, get this - UNDOCUMENTED tips and tricks for
Win95/NT! If I couldn't expect anything better from PLC manufacturers, then I guess that I, too, would be greatly concerned about PLC reliability
and appropriateness. I wonder, sometimes, whether the competitive race among PC manufacturers for bigger/more bells and whistles has relegated system/hardware reliability to a lower priority than profitability. Remember the Intel Pentium chip? The one with the floating-point error that was discovered, but the marketing continued because end users like me who do not need massive number crunching would never be seriously affected by the bug? Thanks, but I think that I prefer PLC's.

Don Lavery
Lavery Controls
[email protected]
 
(Originally posted Fri. 1/23/98)
YES -
A Compaq 233 running CiTect software (although with only 500 points), a COMx driver on a Digiboard, Modbus protocol over MDS radios at 4800 baud, through a repeater to 34 remotes.

All items OUT OF THE BOX, WinNT Service Pack 1 installed, 3,700,000 analog reads and counting without the blues....

Purely SCADA in a water plant / well field system.

No Netscape, No PackMan, No Screen Savers, No monkeying with the I/O map while on-line.

The key seems to be that you don't release a system until it's right and all possible operator actions, alarm events, and process variable ranges are proven, you don't do anything unnecessary on that machine, and you don't let
your operators monkey with the kernel.

and there are surely others.

John Lindsey
Niles Radio Communications
 
(Originally posted Fri. 1/23/98)
I've been... casually... following this thread of "PC Reliability", which is driving me a little crazy. The reason being: to me this sounds like the old PC<->MAC, Win<->OS/2, BSD<->Linux conversations. However, because I have so much at stake here, I'd like to intervene and ask a question.

In what way are PC's supposedly unreliable? I.e. the hardware can obviously be unreliable due to two aspects:
1) Poor assembly
2) Poor engineering
Now, assuming someone is serious about getting their hardware, we can rule out #1. If the assembly is poor, don't use it.
As for #2, the PC concept has been around since the early 80's - the hardware is not perfect, but not very far from it.
The software can also have three main aspects which can be "bad":
1) The BIOS code is bad
2) The OS is bad
3) The PC-based programming is bad
Well, just like #2 for hardware, the BIOS is good if your hardware is good. There is no difference whether a device is a PLC or a PC: if its basic programming is wrong, it's bad. If not, it is good (in that respect).
2) The OS. What OS are we all yelling about here? Other than Windows, there is DOS (which has had MANY years of testing and proves to be VERY reliable, from what I see). There is Linux and similar systems, which are even far beyond DOS in reliability. And then there are the specially made micro OS's, which are GUARANTEED to be reliable by their manufacturers...
3) The PC-based programming is bad? Well, if the implementor is bad, what can you expect? (Since I _AM_ a programmer, this is the part I am not understanding.)

Now, maybe I'm misinterpreting something. Maybe people are talking about master stations here, and I'm just clueless and out of whack. But I'm
developing a device on the PC architecture right now - I haven't had a single problem yet (except that the current board I use has no co-processor, which isn't exactly a big problem). The hardware and software have been working nothing less than excellently. So, can someone PLEASE give me the spark if I am doing/assuming something wrong here? I don't want to invest huge sums of money only to find out that I missed something very basic.
 
(Originally posted Mon. 1/26/98)
Don Lavery wrote:

> Was the questioning due to the fact that PLC's were prone to software
> crashes and hardware failures, or was it pure reluctance to use something
> new and different?

The arguments at the plant I worked at were specifically about the reliability of the PLC. The engineer doing the questioning was used to working with a DCS and didn't trust a PLC in a process (as opposed to a machine) control
application. However, I think these arguments against new technologies usually start out with some basis in fact (Early PLC's were not nearly as reliable as the ones being sold now.) but the "pure reluctance to use something new and
different" lingers on long after the technical issues have been solved.

Don Lavery wrote:

> Thanks, but I think that I prefer PLC's.

I also prefer the PLC...for the time being. However, I applaud those with the courage to take the risks of installing and debugging bleeding edge technology. If it weren't for them, new technologies would never become reliable enough for the rest of us and I would be spending my time tracing wires in relay cabinets and tracing tubes in pneumatic control cabinets instead of tracing ladder logic on a laptop PC.
 
(Originally posted Mon. 1/26/98)
Carl Lemp wrote:

> 1. Set up the VMS machine so that the users can install and uninstall any
> applications or operating system patches they want. Let the users download
> every "cool" program they see on the internet and try it out for a few days
> before they delete it off the machine. Let the users change operating
> systems parameters as they see fit. Give the users access to the on/off
> switch so they can turn the thing off every time they get tired of waiting
> and ssume it must be locked up. If we do these things then we will have a
> reasonable approximation of the typical environment of a Win95 machine.

This ability for people to mess with their PC's is a serious flaw. You could potentially design a stable system using a typical desktop OS, and
the end user could turn it into garbage by installing some new whiz bang software or hardware.

This is one of the curses of open systems: the end user can buy who-knows-what and try to install it into his control system - sound cards, modems...

Now, any good OS should be able to protect the system against a rogue application; isn't that what protected mode and hardware memory management are supposed to do?

A new piece of hardware with a poorly written kernel-mode device driver is another matter. It is hard for the OS to protect against this. I believe QNX guards against this by running all device drivers as user-level processes. I have seen many reports that NT has decent soft real-time performance, at least on a Pentium II. But many of these reports caution that a poor device driver could disable interrupts for a long time and screw up its response times.
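That division - hardware memory protection catching a rogue user-level process, but offering no help against faulty kernel-mode code - can be demonstrated with a toy experiment. The sketch below is hypothetical Python for a Unix-like OS (the function names are invented): the deliberately faulty task is killed by the OS, and the parent process is untouched.

```python
import ctypes
import multiprocessing as mp

def rogue_task():
    """Deliberately dereference address 0. In user mode, the MMU traps
    the access and the OS terminates only this process."""
    ctypes.string_at(0)  # invalid read -> segmentation fault

def run_isolated(task):
    """Run a task in its own process and report how it ended."""
    p = mp.Process(target=task)
    p.start()
    p.join()
    # A negative exit code means the child was killed by a signal;
    # the parent keeps running regardless.
    return p.exitcode
```

Calling `run_isolated(rogue_task)` returns a nonzero exit code while the caller survives - exactly the safety net a kernel-mode driver bypasses.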


--
Bill Sturm
 

George Robertson

(Originally posted Mon. 1/26/98)
OK, You asked for it:

> I've been ... casually .. following this thread of "PC Reliability" which is
> driving me a little crazy. Reasons being - to me this sounds like the old
> PC<->MAC, Win<->OS/2, BSD<->Linux conversations.. However, because
> I have so much at stake here, I'd like to intervene and ask a question.
>
> In what way are PC's supposedly unreliable? I.e. the hardware can
> obviously be unreliable due to 2 aspects.
> 1) Poor assembly
Sometimes, though not so common.
> 2) Poor engineering

In some cases, particularly with regard to "true" PC compatibility, whatever that is.

> now, assuming someone is serious about getting their hardware, we can
> rule out #1. If the assembly is poor, don't use it.

How do you know whether it's poor?

> As for #2, the PC concept has been around since early 80's - the hardware
> is not perfect, but not very far from it.
> The software can also have 3 main aspects which can be "bad"
> 1) The BIOS code is bad

Bad, or just different. It is difficult to develop code that runs on everyone's BIOS. Unless you use the same BIOS that the developer
used, you will be "beta" testing. I know it shouldn't be so, but them's the facts.

> 2) The OS is bad

Definitely. If you have a bug free one, let me know.

> 3) The PC based programming is bad
> well, just like #2 for hardware, the BIOS is good if your hardware is good.
See above
> There is no difference if a device is a PLC or a PC; if its basic
> programming is wrong, it's bad. If not, it is good (in that respect)

Big difference. A PC OS is a non-deterministic, interrupt-driven collection of code for allocating PC resources (definition). A PLC OS is a deterministic, probably NOT interrupt-driven, very limited engine that does a very specific task and is probably completely testable.
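That difference can be sketched in a few lines. The toy scan cycle below uses Python purely for illustration (real PLC firmware is nothing of the sort); it shows why the PLC model is so testable: each scan is a fixed read-solve-write sequence whose outputs depend only on the input snapshot.

```python
def solve_logic(inputs):
    """Stand-in for the ladder program: a pure function of the inputs.
    Here, a motor runs when start is pressed and stop is not."""
    return {"motor": inputs["start"] and not inputs["stop"]}

def scan_once(read_inputs, write_outputs):
    """One deterministic scan: snapshot the inputs, solve logic against
    the snapshot, then update all outputs at once. No interrupts, no
    background tasks - the same inputs always give the same outputs."""
    image = read_inputs()
    outputs = solve_logic(image)
    write_outputs(outputs)
    return outputs
```

A controller just repeats `scan_once` forever; because every scan is a pure function of the input image, each input combination can be exercised exhaustively on the bench.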

> 2) The OS. What OS are we all yelling about here? Other than Windows,
> there is DOS (which has had MANY years of testing and proves to be
> VERY reliable, from that which I see) There is Linux and similar systems,
> which are even far beyond DOS in reliability. And then there are the
> specially made micro OS's , which are GUARANTEED to be reliable by
> their manufacturers...

What do the manufacturers do if they fail? What's the guarantee?

> 3) The PC based programming is bad? Well, if the implementor is bad,
> what can you expect? This (since I _AM_ a programmer here I am not
> understanding)

With most of the complex systems, you've hit the nail on the head. Modern programming is such a hodge-podge of DLLs and objects that it's hard to see who's to blame. If you want something really tight, you have to write it in assembler, and totally pre-empt the OS. Which is basically what's going on in a PLC.


> Now, maybe I'm misinterpreting something. Maybe people are talking about
> master stations here, and I'm just clueless and out of whack. But I'm
> developing a device on the PC architecture right now - I haven't had a single
> problem yet (except that the current board I use has no CO Processor, which
> isn't exactly a big problem). The hardware and software have been working
> nothing less than excellent.
> So, can someone PLEASE give me the spark if I am doing/assuming something
> wrong here? I don't want to invest huge sums of money only to find out that I
> missed something very basic.

Just test your package as completely as possible, and insist that your customers run your code on the same hardware, with the same OS, and don't run anything else on the same box (I'm not kidding here, if this is for process control), and you'll be golden.

-George Robertson
Saulsbury E & C
Getting grayer, and perhaps a bit jaded. (realistic?)

K.I.S.S. Hmmm, is a PLC OS simpler than NT?

 
(Originally posted Mon. 1/26/98)
Off the top of my head, my quick list of reasons PC control is preferable to PLC's would include:

1) Hardware cost (CPU/Display/Hard Drive/Network Cards)
2) Choice of development languages for Control and/or Data processing
3) Simpler support for custom and semicustom boards
4) Simpler software upgrades (i.e. Modem downloads)
5) Mainframe connectivity
6) Multiple hardware suppliers
7) Simulation of system without physical hardware
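Point 7 is worth a sketch. Because PC control code can talk to I/O through an interface, the physical card can be replaced by a software stand-in and the logic exercised with nothing wired up. A minimal, hypothetical Python example (all names invented):

```python
class SimulatedIO:
    """Software stand-in for a hardware I/O card. Control logic written
    against read()/write() runs unchanged with no hardware attached."""
    def __init__(self):
        self.inputs = {}    # simulated field inputs, set by the test
        self.outputs = {}   # outputs the logic would drive

    def read(self, point):
        return self.inputs.get(point, False)

    def write(self, point, value):
        self.outputs[point] = value

def pump_interlock(io):
    """Toy control routine: run the pump only while the level switch
    reads high."""
    io.write("pump", io.read("level_high"))
```

Swapping `SimulatedIO` for a real driver class with the same read/write methods lets the whole sequence be proven before the first wire is landed.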

Admittedly, my applications contain a lot of data processing into and out of the controlled system, and the need for extensive audit trails and report
generation, all of which is more PC-like in processing, along with the need to control and coordinate switches/cams/motors/indicators etc.

I still have small control needs best met by little PLC's.

And frankly, I would still be nervous putting a PC in a system where failure is truly hazardous.


Rufus V. Smith
[email protected]
 

Michael Whitwam

(Originally posted Mon. 1/26/98)
What you say is true. However, my experience is that PLCs of American origin tend to be far more idiot-proof than their European counterparts. Have you ever managed to crash a Modicon 984?

To be fair, of course, it has to be said that the European guys have more feature-rich software.
 
(Originally posted Mon. 1/26/98)
James Lang wrote:

> Years ago, computer control started with I/O coming into a central computer
> which ran the control algorithms, generated alarms, etc. Later distributed
> systems and PLCs relieved the central computer of this load for more
> efficient and reliable operation. Without going into a long history, it
> seems that all PC control has done is to go back to the old central
> computer type control.

I think that one of the reasons for the trend back to centralized control is that people are collecting and monitoring much more data than in the past. Many PLC's have very slow networking facilities. This makes the PLC-to-PC interface much more difficult. I just spent many hours trying to get a few hundred points between a SLC 5/03 and a PC-based MMI with a reasonable update speed. You have to regroup and shuffle memory inside the PLC to get contiguous blocks, and it takes several reads to get all of the different data types (at least with A-B).
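Some of that regrouping chore can also be pushed onto the PC side: gather the scattered point addresses and coalesce them into as few contiguous block reads as possible. A hypothetical Python sketch (register addresses as plain integers; the `max_gap` knob trades a few wasted registers in one read against the cost of an extra poll over a slow link):

```python
def coalesce(addresses, max_gap=0):
    """Group register addresses into contiguous (start, count) block
    reads. Runs separated by at most max_gap unused registers are
    merged, since over-reading a little is often cheaper than issuing
    another poll over a slow serial link."""
    blocks = []
    for addr in sorted(set(addresses)):
        if blocks:
            start, count = blocks[-1]
            last = start + count - 1          # last register in the block
            if addr - last <= max_gap + 1:    # adjacent or within the gap
                blocks[-1] = (start, addr - start + 1)
                continue
        blocks.append((addr, 1))              # start a new block
    return blocks
```

For example, points at 10-12, 20-21, and 40 become three block reads instead of six single reads, and a larger `max_gap` can collapse them further.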

One way to solve this problem is to do the control in the PC; this way you can have one tag database and very fast screen updates and data acquisition.

I am not saying that this is the best way, however. I would prefer to stay with a more distributed system with many small processors.
Some of the new PLC's are starting to have faster networking, such as Ethernet, that makes it easier and more economical to connect with a host computer. No more 19.2 kb multi-drop links or $1000.00 interface cards. I would like to see all PLC's come with Ethernet, or at least two comm ports capable of 115-kilobaud serial comms. PC's have had both of these luxuries for years. No wonder they are becoming more popular.

--
Bill Sturm
[email protected]
 
(Originally posted Monday 1/26/98)
I would rather refer to what people call 'PCs' as Intel x86-based controllers when I am talking about industrial control applications.
 