Reliability of PC Automation

Are preliminary conclusions available?
Concerning Soft-PLCs, there are some solutions using dedicated processor cards which keep working even if the PC OS is rebooted.
Concerning purely software-based solutions, I really wonder whether PC operating systems are reliable enough. In my experience, no MS-Windows product allows a long-term runtime (> several months of continuous duty) without crashes.
Some specialized OSes are better suited for industrial applications (e.g. QNX).
It would be interesting to hear any comments about Soft-PLC experiences.
 
From a hardware point of view, PCs can be quite reliable if environmental conditions are right (temperature, moisture, dust, vibration/shock, clean UPS power, ...), but the operating systems, especially MS-Windows, are not reliable.
The major problem is that in some domains customers demand more and more Windows-based process control systems, although other OSes (Unix-based, OS-9, OS/2, ...) are much more robust. One should admit that ANY Windows-based system will crash sooner or later and will also show apparently random instabilities leading to unpredictable behaviour. With fairly redundant client-server architectures used as HMIs it is (probably???) possible to guarantee acceptable system operability, but there is a price to pay for it.
Also, PCs can't replace PLCs: a Soft-PLC is NOT the equivalent of a PLC CPU, and the same goes for PC-bus "PLC CPU" boards. There are models which use a separate PSU, so the "PLC CPU" board keeps working even if the PC power is off, but in that case, why should one buy such a board instead of a regular, real PLC CPU?
 
I agree with most of what you are saying, but a lot depends on the application and how you define PCs. For example, an industrial SBC running embedded Linux in a DIN-rail case with appropriate memory, I/O and storage arguably _IS_ a PLC. Most PLCs are simply small, purpose-built computers running an executive built along the same lines as other embedded systems, except more generalized in nature to allow programming by the user. A SoC such as the MachZ from ZFLinux Systems has even more fail-safe features and embedded adaptations than what the PLC vendors are using. It would be fairly easy to build a drop-in replacement for many micro PLCs at least, and I fully intend to do just that if I can generate funding. At the same time, PLCs are moving more towards the PC as they address demands for non-logic functionality and reasonable communications capability. Between them is a gray area that they will both occupy in the near future.

In the meantime, PCs are useful where they are cost-competitive and conditions allow. In my case, the PC is a far more viable solution even considering the cost of environmental control and a UPS. It was a case where we needed a PC for machine vision, terminal emulation and communications anyway, so it made a lot of sense to run the logic for the cell on the same machine. The Linux box does logic a lot better than PLCs could do the other things needed, and the costs were drastically less. The reliability in this plant has been quite acceptable, with most of the Linux boxen I have doing machine vision running a year or so between scheduled downtimes for a vacuum job and a general check-up. These are running without the benefit of sealed enclosures or, in some cases, UPSes. I fully expect the cooled, sealed enclosure for this cell to at least double the maintenance interval.

For true relay-replacement jobs in hazardous environments, PLCs make sense. But as fanless, diskless, low-power, high-function embedded platforms become available as commodities, there will be fierce competition for everything else. As you move up in functionality, programming time and development costs will clearly favor more general, more powerful solutions.

Regards

cww
 

Ranjan Acharya

<clip>
One should admit that ANY Windows-based system will crash sooner or later and will also show apparently random instabilities leading to unpredictable behaviour.
</clip>

I do not agree with this statement -- it is true for many situations, but not all. We have had some Windows NT Server systems out there that run "forever". The only time they have been rebooted is when the server was moved to a new room. The application software has crashed a few times, but that had nothing to do with NT (granted, you cannot be 100% sure of that, but the same could be said for a system running on top of Linux too, unless you know _exactly_ what caused the crash) -- NT kept running with no sign of memory leaks, resource hogging, et cetera. The customer just restarted the application with no need for a service call.

The main problems with the system were the loss of a drive in the RAID array
(hot pluggable, no big deal) and the loss of a cooling fan (had to be shut
down for that too).

Before all the Linux lads get too upset, I am not implying in any way that NT is a stable, well-written OS (it is not). However, in a plain-vanilla set-up with good hardware (the key, for any OS), users can expect to see it behave quite well.
 

Peter Whalley

Hi all,

One of the disadvantages of NT, however, is the need to reboot whenever a significant change is made to the software environment. Every time you install a service pack or even a new version of any of the applications, NT needs to be rebooted. This is a major disadvantage for servers connected to the Internet, where updates need to be installed fairly frequently (for NT or Linux) for security reasons.

With Linux, for example, I can stop Sendmail, install an updated version and restart it without interrupting the operation of any of the other applications, and I can do it from a thousand miles away using ssh (a secure remote shell). This is generally not possible with an NT system.
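
For what it's worth, the whole in-place update fits in a few lines once you are logged in over ssh. The sketch below is only illustrative and assumes a SysV-style init script and an RPM-based update; the paths and the package name are hypothetical, not taken from any system described here.

#!/usr/bin/env python3
# Sketch only: update one service in place on a Linux box, over an ssh session,
# without touching anything else that is running. Paths and package name are
# hypothetical; a SysV-style init script and RPM packaging are assumed.
import subprocess

SERVICE = "/etc/init.d/sendmail"      # assumed init script location
PACKAGE = "sendmail-update.rpm"       # hypothetical updated package

def run(*cmd):
    # Raise immediately if a step fails, so we never restart a half-installed service.
    subprocess.run(cmd, check=True)

run(SERVICE, "stop")        # stop only this one daemon
run("rpm", "-U", PACKAGE)   # install the updated package
run(SERVICE, "start")       # bring it straight back up; no reboot needed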

In a closed environment it may be possible to just get the system running and then leave it be for many months or even years, but if security is an issue, systems need to be updated frequently to stay ahead of the hackers, and this is where Linux really shines.

Regards

Peter Whalley
 

Prashant V. Ingole

Dear sir,
I think that although these PCs provide programming flexibility and a general-purpose solution, reliability is a real concern for control people today.
When I work with my Pentium PC-based control workstation, problems such as OS instability and hardware failure become common as the usage period grows. Some of the time the workstation fails without any clue or reason, so I am doubtful about a complete replacement.
Thank you
Prashant Ingole
 

Stephen Fullerton

Look, it's real simple on NT if you're getting blue screens:
1> Either there is a component not functioning properly.
2> NT doesn't like one of the components in the machine (i.e. not on the NT Hardware Compatibility List) or one of the software apps.
3> Or you have a bad install. Try reinstalling service packs, or going back one or two versions (or forward, depending on your current version).
4> Also try checking your Event Viewer, both the System and Application logs, for what is failing or throwing errors.
5> And if all else fails, why don't you NASA guys crank the Commodore 64s back up... LOL

Sorry man, but NT isn't that hard and the information on it is EXTREMELY easy to look up. RTFM, jeez-o-Pete.
 
Hi.
Based on my personal experience with PCs for automation, reliability is not one of their strong features. On the other hand, the PC has many advantages that make it attractive over the other options. When extreme reliability is needed, then in the current era the PC is probably not an option.
PC reliability has improved over the years and has reached a kind of acceptance level. In general, PC reliability might be good enough for many applications. So when making the decision, PC or not PC, like anything else in life, the advantages and disadvantages must be carefully weighed.
 

Reginald Sheridan

I was surfing the net and found this site and feel I must add my 2 cents. I have a PC program installed on a Dell computer in a few dust-ridden locations in South Korea. I have read some of the replies stating that an industrial PC is better than a regular PC. This is not the case. Most industrial PCs do not use the most modern and up-to-date CPUs. This system has been running since 1998 using Taylor's Process Windows and Waltz control software. It also interfaces with a VB-developed program for recipe and production reporting. This software has since been taken over by GE FANUC. If anyone has any questions, feel free to e-mail me at [email protected]
 
I work for a very famous PC manufacturer which switched to PC-based controls 3-4 years ago.

Without naming vendors, we used software based on flowchart programming coupled with a CAN-based fieldbus, running on an NT platform.

Due to reliability problems with the servers (75% failed in 3 years), bus stability problems, application bugs, as well as good old NT problems, the company is backtracking to PLCs for all future lines.

To give it its due, the PC-based system proved easy to integrate into the enterprise using DCOM, OPC, etc. But with virtually all PLC suppliers now providing OPC servers, this advantage is lost.
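
To make the OPC point concrete: a client reads tags the same way whether the server fronts a soft PLC or a hardware PLC. The snippet below is only a sketch, assuming the third-party OpenOPC package on Windows and the free Matrikon simulation server; the server ProgID and tag name are illustrative, not anything from the plant described above.

# Sketch only: read one tag over OPC DA using the third-party OpenOPC package.
# Server ProgID and tag name are illustrative (Matrikon's simulation server);
# a vendor PLC's OPC server would be addressed the same way.
import OpenOPC

opc = OpenOPC.client()                      # uses the local OPC DA automation wrapper
opc.connect('Matrikon.OPC.Simulation')      # swap in any vendor's OPC server ProgID
value, quality, timestamp = opc.read('Random.Int4')
print(value, quality, timestamp)
opc.close()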

On the other hand, we have some turnkey equipment running on a UNIX platform which has not failed within the same 3 years; this is the only PC-based scenario I would recommend.
 

Bryan Hoffman

IMO, PC control is not reliable. I have had problems with general protection faults and the like.

Especially with viruses around, I would not want to take the chance.

The MTBF is not what I would expect; for PLCs it is on the order of 10 years.

PLCs are rock-solid technology; PCs, especially with custom programming, are not.

PLCs allow for easier troubleshooting: you don't have to get the original programmer to help.
 
IMO a PC should only replace a PLC if the PLC cannot meet the application demands. There are third-party packages which create a soft PLC that can be programmed in ladder, but frankly I find this ludicrous. I have programmed PCs on Win95 and put them in the field with no problems (fingers crossed) for the past 6 years. However, these systems are showing signs of hard-drive fatigue and Windows errors due to daily off/on routines without a proper shutdown.

If you must use a PC in a production environment, I would suggest using a UPS system that gracefully shuts down Windows upon 110 VAC power loss. This will protect Windows and extend the life of the PC.
 

Curt Wuollet

I agree on this point, although it's unlikely I'd field Win95. I wrote a script that shuts down the UPS after shutting down the OS, and disconnected the PC power switches. This way the system behaves properly on loss of power or if someone drops the main breaker. Some people have fallen into the habit of hitting the power button whenever they don't understand what's happening. Many UPSes can watch a serial connection and can be programmed to do the right thing.
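
This is not the poster's actual script, just a minimal sketch of the idea under stated assumptions: a UPS that asserts a serial status line (DCD here) while on battery, the pyserial package, and a Linux host where the script runs as root. In practice the UPS vendor's monitoring daemon (apcupsd, NUT, etc.) does the same job.

#!/usr/bin/env python3
# Sketch only: shut the OS down cleanly when the UPS reports loss of mains power.
# Assumptions (not from the original post): the UPS asserts DCD on a serial port
# while running on battery, pyserial is installed, and the script runs as root.
import subprocess
import time

import serial  # pyserial

PORT = "/dev/ttyS0"     # hypothetical serial port wired to the UPS status contact
GRACE_SECONDS = 60      # ride out short outages before committing to a shutdown

def on_battery(ups):
    return ups.cd       # the DCD line doubles as the "on battery" signal here

with serial.Serial(PORT) as ups:
    while True:
        if on_battery(ups):
            time.sleep(GRACE_SECONDS)
            if on_battery(ups):                      # still on battery: shut down cleanly
                subprocess.run(["shutdown", "-h", "now"])
                break
        time.sleep(5)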

Regards

cww
 
According to SEMI S2, the computer can be shut down separately if it is only used for data logging. From what I understand, if the PC is doing the controls, it has to be shut down immediately upon EMO (emergency off).

However, PLC manufacturers are coming out with more advanced versions of PLCs with more computing power and storage (e.g. CompactFlash). So I think the PLC is still the ideal choice.
 

Reuben Allott

Hi

PC-based automation has come a long way recently, mainly with improvements in flash-memory storage capacity and operating systems like Windows XP Embedded.

The Siemens Microbox is a bare-bones Windows XP machine with no moving parts (no hard drive). Other manufacturers have similar devices. These machines offer the same industrial resilience as a traditional PLC (the Microbox even mounts on a DIN rail like a PLC), but provide much faster CPU performance, with the added ability to link in DLLs and applications written in high-level languages like C and to make use of advanced functions available through the operating system (modems, remote monitoring, web servers, etc.).
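
Purely as an illustration of the DLL point (the library name and exported function below are hypothetical, not a Siemens or any vendor API), a script or HMI application on such a box can load native code compiled from C and call it directly:

# Sketch only: call a function exported by a native DLL from Python via ctypes.
# The DLL name and signature are hypothetical, not any vendor's actual API.
import ctypes

lib = ctypes.WinDLL("custom_filter.dll")          # hypothetical in-house DLL built from C

# Declare the C signature: double filter_sample(double raw_value);
lib.filter_sample.argtypes = [ctypes.c_double]
lib.filter_sample.restype = ctypes.c_double

print("filtered value:", lib.filter_sample(42.7))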

In my opinion "PC-based control" will eventually replace traditional PLCs - the capabilities will be the same as a desktop PC but the hardware will be industrial-strength.

Reuben Allott
 
I was an engineer in animal-feed production. Automation was based on Siemens PLCs, but we had two servers networked with those PLCs, running the database and the SCADA system. Production was not possible if either of the servers was down, because the database held the recipes and was essential to the process. So the conclusion is this: nothing would work without a PC!

I was also involved in a PC automation project. A friend and I wrote a custom program for a small production line, also for an animal-feed company. It runs on WinXP, and one PC is in charge of everything!

I'm for PC automation, even on Windows. It runs great and it is a LOT less expensive!
 
Lots of the facts and figures mentioned here are used in the design of a system to meet the customer's reliability requirements. However, the long-term reliability of any system is determined less by the design reliability figure itself than by the systems put in place to maintain that design reliability. For instance, during a FAT or PAT there should be sections that cover system support roles, the implementation of maintenance, spares, change control, obsolescence prevention, training, and eventual replacement at the end of the active service life. The availability of support services such as cooling and electricity supply should also be covered when considering the above list.
 

Timothy P. Niemczyk

Dave, I would suggest reading a very interesting paper, "Loss-Prevention and Risk-Mitigation in Equipment Protection Systems" by Phil Corso. You can contact Phil at [email protected] if you can't find his paper. He is very knowledgeable and helpful.

 