PC-based Control vs. PLC

(Originally posted Wed 09/09/1998)
This whole thread brings up a good debate.
The risk of doing control with a diskless PC is significantly lower than on a system with a disk. Also, the smaller the operating system, the higher the reliability. As a rough rule of thumb, the more lines of code, the higher the probability of interactions that are not detected in testing but eventually show up as bugs in the field.
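The compounding effect of code size on latent defects can be sketched with a toy model; the per-line defect probability below is invented purely for illustration:

```python
# Toy model: if each line independently has a small probability p of
# harbouring a latent defect, the chance an n-line system is defect-free
# is (1 - p)**n, which decays roughly like exp(-p * n) as the code grows.
p = 1e-5  # assumed per-line latent-defect probability (made up)
for n in (10_000, 100_000, 1_000_000):
    print(f"{n:>9} lines: P(defect-free) = {(1 - p) ** n:.4f}")
```

The absolute numbers mean nothing; the point is the exponential fall-off with line count.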
Using diskless PC hardware with compact and proven operating systems (DOS & QNX for example) is much less risky than larger operating systems (NT, WIN 95, CE).
Having been in real-time control for over 20 years and seen the progression to distributed multi-processing, I'm a bit miffed by the idea of running MMI, information, and real-time control in one PC under NT, even with "real-time" extensions. This is the kind of stuff we used to do with minicomputers because the cost of a machine was so high. In the PC world it doesn't make any sense. Run control on diskless PCs networked to PCs for information and MMI. This is a scalable, reliable architecture. We regularly run Ethernet as a backbone and have great speed.
All one needs to do is an MTBF failure analysis on a desktop vs. a diskless PC to see the difference.
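As a rough illustration of that comparison, here is a series-model MTBF calculation; the component MTBF figures below are invented for the sketch, not vendor data:

```python
# Series reliability model: every part must work, so failure rates add
# (lambda_total = sum of lambda_i) and MTBF_total = 1 / lambda_total.

def series_mtbf(mtbf_hours):
    """MTBF (hours) of a system whose components are all in series."""
    return 1.0 / sum(1.0 / m for m in mtbf_hours)

# Hypothetical per-component MTBFs in hours (illustrative only):
desktop  = series_mtbf([300_000, 500_000, 40_000, 80_000])  # board, PSU, hard disk, fan
diskless = series_mtbf([300_000, 500_000])                  # board, PSU

print(f"desktop : {desktop:9.0f} h")
print(f"diskless: {diskless:9.0f} h")
```

Whatever the actual figures, the disk and fan dominate the combined failure rate, which is the point being made above.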
The PC technology is sound and will outperform any PLC. As for ruggedness, our products run single-board diskless 80486 and 586 CPUs and are certified 1E for nuclear, which is a stringent set of tests for operating under high noise and vibration. I don't know of any PLC product that meets this requirement "out of the box".
The flexibility of the PC architecture has allowed us to provide redundant processors that run in lockstep for guaranteed bumpless redundancy at a low price. No special programming required.
There is a great deal of propaganda floating around about this subject, but PC based controls are here to stay.
Bill Lydon
RTP Corp.
414-427-0789
http://www.rtpcorp.com
 

Johnson Lukose

(Originally posted Fri 09/11/1998)
This is a good rule of thumb for microprocessor chips too: the more transistors you cram in, the higher the probability of gate combinations that are not detected in testing but eventually show up as bugs at run time. Or doesn't microprocessor reliability count in overall system stability??
thanks
 
(Originally posted Mon 09/14/1998)
Good comment that has merit. Software, however, has a much larger number of permutations and has the ability to modify itself (many times unintended), which hardware does not. The number of applications loaded complicates this. The number of lines of code in software is so much greater than in hardware, making this a more complex issue.
I do agree that, as a rule of thumb, a lower parts count means higher reliability.
Years ago I had a government study which showed, based on experience, that the more copies of a program in use, the sooner bugs will be found and fixed. Intuitively this makes sense.
Obviously reliability is a complex issue but I think using common engineering sense is the key.
 
(Originally posted Thu 10/01/1998)
I hope the embedded controllers will be supplied with a visible Reset button.

Zak Fuchek
 
> Are there any features or properties that a PLC has that PC based controllers do not?

> Specific reasons that are affecting your purchasing decisions are very appreciated.

> What could we do that would change your mind?

> Thanks.
Reply:

I have noticed a few points about why people are reluctant to move to a PC-based control system instead of a PLC:

First, the reliability of PCs
Crashing of the operating system
Operators / engineers may exit the system unnoticed....

These points may vary by area or even by country...

If anybody has a better clarification, it is welcome!!!

Regards
 

Will Brokenbourgh

At one time, I worked for a major speaker manufacturer that had two different types of production lines working side by side. One side was controlled completely by PLCs, the other by NT stations running AB SoftLogix. Guess which line was more productive?

The PLC-controlled line never had a control-failure-related issue, required a very short start-up time, was very responsive, and was more forgiving in terms of operator input and tech blunders.

The PC-controlled line would have mysterious crashes and freezes, individual stations would stop communicating with each other or would just stop functioning for no apparent reason, recovery from power outages was a nightmare, and boot-up time seemed like an eternity. All of this with name-brand equipment and top-of-the-line AB software and hardware.

I vote for PLCs.
 

Conrad Vinzens

(Originally posted Mon, 31 Aug 1998)

> >>There are reasons why the PLC is still used and popular:

> - The reboot time is minor compared to a PC (this can be important in some industries)<<

> Depends on the BIOS ... most good IPCs have a BIOS option for fast boot.

$We never had to load an operating system onto the many PLCs we have, but we had to reload and configure several of our NT-based IPCs due to system crashes!!!! What takes more time?$

> >>- The cycle time of the PLC can be made deterministic; you know the sequence in which the program is executed.<<

> Use an IEC1131-3 environment and a RTOS like QNX and your PC based solution will be more predictable than the PLC solution.

$ Pretty exotic OS stuff you have to use. What about NT? $

> >>- Reliability: a PLC that runs doesn't crash, you can't guarantee this with a PC<<

> Interesting ... a PLC that runs doesn't crash ?? Why ??

$Because it is made from one block, whereas an IPC is a mix of materials out of a toolbox$

> >>- The PLC hardware is robust + you don't have to remove a frame to exchange cards (in case of a defect). With some PLCs you don't even have to switch off the power.<<

>With our PC and fieldbus based solution you can exchange the IO modules at run time !

> (as long as a clean bus topology is used .. e.g. like PROFIBUS or CAN ... )

> >>- The technical maintenance personel is more familiar with the PLC; this counts as well for programming as for hardware problems.<<

> With IEC1131-3 and the usage of fieldbusses ... I don't see any differences ...

> >>- Security reasons<<

> Well chosen embedded PC hardware has the same quality as the PLC hardware.

> >>In general, I think that the PC still has to prove its reliability in many industries. As long the real-time processing is more important than the information processing, the PLC is in favor.<<

> A PC solution under QNX gives you better real-time behavior and the possibility to use the information processing and communication features ...

$Might be with QNX, but NT stuffed my last Christmas party, when the IPC thought the Profibus was running when in fact it wasn't. The vessel was loaded with an exothermic process. We were just lucky.$

Conrad Vinzens, [email protected]
 

Gabriel Suarez

My opinion: PLC technology is in its final countdown. The new embedded systems (PC-104, 5 1/4", and others), with their fabulous potential (Pentium IIIs, etc.), electronic flash disks (40 MB/s), etc., combined with hundreds of rugged peripherals, are redundant and most powerful. These systems are PLC killers. Programs in hundreds of languages extend the possibilities. And real-time OSes (RTLinux, QNX, etc.) destroy all barriers. GABRIEL SUAREZ. EGROJ. ARGENTINA
 
R
Just my 2c on this oft-discussed issue.

The 'ideal' criteria for scheduling on a general-purpose server (e.g. web server, database server, etc.) contrast with hard scheduling. Unless some specific hard scheduling mechanism has been installed, you cannot simply 'pretend' it exists just because some empirical testing reveals only n milliseconds of jitter; that is not reliable.

But if you do put in real hard-time scheduling you open another can of worms with event queues, process queues, etc. Modern OSes do not like applications doing things as and when they like.

But there are solutions, which all tend to work in a pretty similar way. Given that NT and Linux (to pick two widely contrasting systems) both have more than one RT package available, and that all work on the same principle, it would suggest to me that it is THE method.

The principle is that a specific amount of processor time is set aside for real-time processing; for example, in a 100 ms cycle, 20 ms is dedicated to RT tasks and 80 ms to the rest. During the RT time slot, normal OS operation is suspended and a hard scheduler is invoked.

Of course the real-time tasks have their own interfaces (drivers) to the underlying hardware, and to avoid chaos with all the queues and buffers in the host OS they do not interact directly with app space, but use a kernel-based interface.

When all is said and done it is as if you are running two systems on one physical hardware.
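The cycle split described above can be sketched in user space; this is not a real hard scheduler (that would live in the kernel, as the text explains), just a simulation of the 100 ms / 20 ms partition from the example:

```python
import time

CYCLE_S = 0.100    # 100 ms major cycle, as in the example above
RT_SLOT_S = 0.020  # 20 ms reserved for real-time tasks

def rt_task():
    # placeholder: read inputs, run control logic, write outputs
    pass

def background_work(until):
    # stand-in for normal OS / application activity filling the other 80 ms
    while time.monotonic() < until:
        time.sleep(0.001)

def run_cycles(n):
    for _ in range(n):
        start = time.monotonic()
        # RT slot: a real system would suspend normal OS scheduling here
        while time.monotonic() - start < RT_SLOT_S:
            rt_task()
        # remainder of the cycle belongs to the host OS
        background_work(start + CYCLE_S)

run_cycles(3)  # three full cycles, roughly 0.3 s of wall time
```

A genuine implementation enforces the slot boundary with interrupts rather than polling, but the time-partitioning idea is the same.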
 

Brian Woodcraft

I just had to add my experiences to this one (excuse me if the protocol is incorrect, but I am a virgin at these posting things).

I worked for many years as a control engineer with Relays, then PLCs and now with PCs. At every stage I questioned the wisdom of whether the change was really progress, and always the answer is the same...

PCs are the future (the present for forward-thinking companies) of our industry. They offer far higher performance and reliability if used properly.

Now don't get me wrong, Win NT is never going to be a solution for a real-time control system - I don't believe it was ever intended to be - but you have to consider that it is (currently) the ultimate front-end.

We use a QNX OS on our field control units and this is an excellent real-time solution offering connectivity to virtually any external interface as required.
The logic and control functions are scanned in a similar manner to a PLC, so offering a true scan cycle (typically less than 5 milliseconds for a plant with several hundred devices and a Pentium 350).
Because it is a PC, just add a KT card and it will talk to Allen Bradley I/O (or whatever), use the standard serial port for Modbus, or a network card to allow comms to a workstation, or 64. Want to share the information with half the world? No problem: connect it to the web and let TCP/IP do the rest.
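The PLC-style scan described above (read all inputs, solve the logic top to bottom, write all outputs, repeat) can be sketched like this; the I/O names and the motor seal-in rung are invented for illustration:

```python
# One pass of a PLC-style scan cycle: snapshot inputs, evaluate the logic
# against that snapshot, then write all outputs at once.

def read_inputs():
    # stand-in for a fieldbus or I/O-card read
    return {"start_pb": True, "stop_pb": False, "motor_aux": False}

def solve_logic(I):
    # ladder-style seal-in rung: motor = (start OR motor_aux) AND NOT stop
    return {"motor": (I["start_pb"] or I["motor_aux"]) and not I["stop_pb"]}

def write_outputs(O, image):
    image.update(O)  # stand-in for writing to the output modules

output_image = {}

def scan_once():
    inputs = read_inputs()
    write_outputs(solve_logic(inputs), output_image)

scan_once()
print(output_image)  # the start pushbutton is on and stop is off, so the motor runs
```

A real scan engine would loop this continuously, which is what gives the deterministic cycle time mentioned above.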

Attach some Windows workstations and now the control system can talk to anything: no bespoke protocols, no expensive hardware. Just good old-fashioned MS (read 'world') standards.

Now your control system can get the current price of baked beans off the web, it can e-mail the night supervisor to tell him to get some extra hands in, then it can make a load of extra tins of beans, it can use a workstation's sound card to shout instructions at the line workers, it can queue the beans up at the warehouse door ready for the trucks that it requested last night by calling the transport manager's mobile phone and using WAP to let him know that the price of beans was up.

OK so the analogy could use some work, but the point is that this is not a massive undertaking to achieve. All of this is pretty standard stuff.

I can't deny that this is all possible without the aid of an x86 processor, but it's much cheaper and likely to prove more reliable, as the standards are better tested than any custom code. It's easier to maintain (PC hardware is off MANY shelves, not just the specialist few) and far more expandable.
 

Brian Woodcraft

The trouble with RAID is that the controller is still (often) a single point of failure. If you are using NT, I suppose the disk is constantly getting thrashed, and this can't help its MTBF.
Alternatively, our QNX solution runs from RAM once booted and allows hot-redundant PCs.
Never been a problem - never will be.
 

Edgar Hilton

In particular, take a look at RTLinux's small sibling, miniRTL (it runs on a minimal PC-104 system). If you want a minimal system and controlling capabilities, along with secure shells for remote monitoring over a TCP/IP interface, then miniRTL is the way to go.
 
H
Respected sir, I am a third-year undergraduate engineering student studying in India. I am much interested in this topic and have been reading a lot on this WAP application of process control. I am also engaged in research work at our institute regarding the same topic. I would like to know more from eminent people from industry like you on this topic. Thanking you in anticipation, Harsh
 
G
I am replying to the mail sent by Don Baechtel. When two opposites compete, the one that merges will win. It's likely that PLC vendors will integrate PC technology in their products, so they will have the best of both worlds. Guy Truyens. eXitec.
 
S
For one thing, us electricians were thought about when PLCs were designed. They use ladder logic, which we are familiar with. Second, PLCs are thought to be "industrial" grade vs. computers (office grade). What can you offer to make it more user-friendly, cheaper, more reliable? Go ahead and sell me!!!
 

Koen Verschueren

1) You can easily find softPLC software that can be programmed in ladder.
2) There are also industrial PCs. You can find them with or without keyboard, with touchscreens, or with only a few function keys so operators can't use them to install other applications.
3) If you use a PLC and SCADA software, then PC control with remote I/O is cheaper. For small applications the PLC will be cheaper. I can think of some applications where a PLC would be better or safer, but in 80% of all applications you can replace the PLC with PC control.
4) If you use a good-quality (industrial) PC, a flash harddisk, and a real-time operating system, a PC can be as reliable as a PLC.
5) Keep a PC control dedicated; once you have installed all software and hardware correctly you will never have problems with a PC.
I have done a number of PCs controlling cryogenic plants on a standard Compaq PC with remote I/O from Advantech (RS485). I wrote a Visual Basic program to control these plants. In the 3 years they have been running (continuously) I have never had a problem with one of them.
 
Hi Harsh, I guess the idea of WAP made you ask for this information. WAP is a commercial failure, but it is a great idea that could be of immense use in the process control industry. In fact the process control industry is stuck years behind, so something of a similar concept will work. If you are interested in this, let me know what exactly your background is and what you want to do. Maybe I can help. Prem
 
Have you guys worked on a DDC software called "Action"? I got introduced to it in 1990 because the vendor supplied this as a datalogger device (remember those flashy old boxes), and to the best of my knowledge it is still functional. This was a simple DOS-based DCS software to which I/O racks had to be added. It never failed all these years. No, there was nothing called triple redundancy, client-server, etc. So I guess with the right price and support and some extra marketing effort, PC-based controls can have a legitimate place in the market. Prem
 
About 3 years ago I first began trying to justify PC-based control platforms (back then Intellution, Allen Bradley, Action, and a few other big names were all that was really available). Then the problem was no real-time control in any of the above engines. Now some of that's addressed, but most applications still don't offer a real-time control kernel replacement of the HAL layer in NT. This isn't all bad; in a recent test of Allen Bradley SoftLogix 5000 we had scan updates of 1 to 3 ms (worst case) with some 1000 I/O points being scanned, and we were watching a video on the same computer. I believe it was a 700 MHz Dell or something.

Let's also face the fact that someone has to support the system you put in after you're gone. It seems easy at first to dive into something neat like Linux RTSes etc., but what inevitably happens is you get a new job, training never happens as it should, and the plant is stuck with your pet project that someone eventually comes in and cleans out. Since I have cleaned some of these out, I can assure you, nobody talks well of you.

In the late 80's we began using VME-based processors running real-time kernels (like Wind River's), which also means we develop our entire control system in the C language and port it down. On top of that we have written tons, and I mean tons, of application-specific code to connect all these onto our plant Ethernet backbone and our SCADA system through to our MES system. Now it takes specifically trained people to develop new code for these systems; you do not find these people on the web at jobs.com or whatever, you train them yourselves. It takes years for someone to really get a hold on these types of systems.

So what am I saying? We are again testing PC-based systems and it looks like we are going to buy one this time. I won't say which one because I don't want to influence anyone, but we had 7 or 8 engineers involved for the last 3 months and are just now testing it, so we are not stepping in unprepared. This is also a motion application with 0.5 ms updates; if you've done your homework you'll respect this coming off a PC. Don't just pick the pretty package, or the same manufacturer as the PLC equipment you're used to buying, if you're thinking of jumping on board.

Take your time; they will all come to you with a demo, and make sure you understand how to troubleshoot it after it's in. The boot-up time is also a very useful fact, as someone alluded to earlier in one of these replies. I hated waiting for an old Unix-based RTS system to load (remember NextGen?); it took better than 5 minutes. I would kill myself if I had to do that again. Of course, I hated waiting for my old Allen Bradley 5/03's to download over RS232, but those are the old days. Or are they?
 