Welcome to Control.com, the global online
community of automation professionals.
SCADA Control room management
I am soliciting information on what department should own the Scada control room.

I have recently been hired by a large utility to "organize" the Com room, provide for documentation, and generally organize the place for a future manager.

I'm no Scada expert. I'm an IT guy.

There is wide disagreement in the organization about who is responsible for this function. Actually, this doesn't seem to be causing any trouble now--but it will.

I would appreciate hearing your experiences on where Scada control fits and who is responsible for it. Thanks.

As far as the SCADA control room building is concerned, its construction is handled by the civil department... But the location of the control room will be decided by the instrumentation or electrical people.

By Jasmin Ouellet on 30 November, 2006 - 12:24 am

I worked for an A-B distributor for years, and the same problem came quite often:

The IT guys claim it's theirs, because they handle the computers, network, security, and updates.

The instruments/electronics people claim it's theirs because of the nature of the functions performed.

What I noted most often is that the real choice depends on the knowledge of the instruments/electronics guys: if they are familiar enough with computers and networks, they should keep it. (You'll find that most departments have 2-3 "computer whiz kids.")

Keep in mind that most of the time, security is not an issue, as most industrial networks are limited to the PLCs, HMIs, and programming stations--none of which should be connected to the company's network--greatly reducing the chances of being hacked!

Updates: Installing an update as soon as it is available is often NOT something you want to do. (IT guys do!) Take Allen-Bradley, for example, which had a lot of problems with XP SP2... SP1 works fine? Keep it! I ran into a lot of places where they updated to SP2 as soon as it was out (on IT's recommendation) and ended up reformatting the computer so it worked "as before".

NETWORK: Most industrial networks are quite simple. A few basic switches, a few "192.168.1.xxx" addresses, and that's about it! If they need to connect to the company network--say, to bring data to the SCADA--use a computer with two NICs and get the IT guy to handle that side! You could also use an industrial bridge/gateway, such as A-B's ControlLogix architecture with two (or more) Ethernet cards...
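Jasmin's dual-NIC suggestion boils down to a one-way, whitelisted bridge between the two networks. Here is a minimal Python sketch of that idea--the subnets, tag names, and function names are all hypothetical, just to illustrate the two rules such a machine should enforce:

```python
import ipaddress

# Hypothetical subnets for illustration - adjust to your own addressing plan.
PLANT_NET = ipaddress.ip_network("192.168.1.0/24")   # NIC 1: isolated PLC/HMI network
OFFICE_NET = ipaddress.ip_network("10.0.0.0/8")      # NIC 2: company network

# Only these tags are ever copied from the plant side to the office side.
EXPORTED_TAGS = {"line1_count", "line1_rate"}

def accept_plant_connection(src_ip: str) -> bool:
    """The plant-side service should only talk to plant-side addresses."""
    return ipaddress.ip_address(src_ip) in PLANT_NET

def publish_to_office(plant_data: dict) -> dict:
    """One-way, whitelisted copy: office applications see the exported tags
    and nothing else, and nothing ever flows back toward the PLCs."""
    return {tag: plant_data[tag] for tag in EXPORTED_TAGS if tag in plant_data}
```

The point of the sketch is that the dual-homed box is a policy choke point, not a router: it never forwards packets between the NICs, it only republishes an agreed subset of data.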

Regards,

Jasmin Ouellet

By Michael Batchelor on 2 December, 2006 - 8:38 pm

/* Rant Mode On */

Quite honestly, I don't think *EITHER* of those answers is good.

/* Rant Off, engage brain I hope */

This struggle has been going on for a couple of decades now. I'll outline what I see quickly here and try to follow up with a more complete essay next week. Yeah, this will take a whole essay to get right.

A couple of hundred years ago mechanical guys were putting in steam and water power to replace muscles, and they were somehow "separated" from
the ordinary people who sewed, or chopped, or whatever the labor of the plant did to produce the product. As time went by, plants developed a
"maintenance staff" that had mechanics and an "engineering staff" that had guys who planned new things. (OK, I realize that in most companies this was all one guy. Think in the abstract here.)

Then, a hundred years ago these newfangled electric motors started showing up on the scene, and electrical specialists from outside came in
the plant and motorized things that had been run by the overhead power shaft. Again, as time went by the "electrical gurus" segregated into
electricians in a maintenance department and electrical engineers in a planning group.

Now, here's where we break down. When the computers first started showing up, they were giant things that were owned by the finance
department to keep the books. Unlike the steam engines and electric motors, these things came into the business and got comfortable without
ever being involved in the actual "production" of the plant. So the "computer guys" never became integrated into the planners and
maintainers groups. That isn't to say that the IT departments didn't have planners and maintainers, but unlike the production plant, those planners and maintainers worked in the same group and under the same boss usually.

In a regular manufacturing plant, the mechanical and electrical engineers work for an engineering boss, and the mechanics and electricians work for the maintenance supervisor. However the
informatics engineers work for the CIO or CFO, and informatics technicians work for the same CIO or CFO. Except that now it's not just the general ledger they're dealing with. It's the production equipment. The General Ledger requirements and the Production Floor requirements are *DIFFERENT* and sometimes at odds with one another.

The fact that they have competing needs isn't anyone's fault, nor is it particularly bad. The two environments are just different, and that's
just a fact.

What I predicted 20 years ago - 1987 actually - that has obviously failed to materialize was that the information systems discipline would
split along the same lines as the mechanical and electrical disciplines. A group of informatics planners would fall under the engineering
manager, and a group of informatics technicians would fall under the maintenance umbrella. So a new project might have p mechanical engineers, q electrical engineers, and r informatics engineers working on it. But after it's rolled out the maintenance department might send a
mechanic, an electrician or an informatics tech out depending on what's wrong with the thing when the operator writes a trouble ticket.

Instead, what seems to be happening in some plants is that trench warfare is breaking out over who owns the box that looks like a user's desktop computer but in reality runs the plant. At least it seems that in the plant Tony D. has been tasked to organize, someone is looking at it instead of fighting about it. I hope it turns out well.

I have yet to ever see any plant organized along the line I've suggested with the information systems design/planning guys working for the
engineering boss and the information systems technicians working for the maintenance boss. Not a single one. But I'll bet it would work better if it *WAS* organized that way.

MB
--
Michael R. Batchelor

www.ind-info.com/schedule.html

GUERRILLA MAINTENANCE [TM] PLC Training
5 Day Hands on PLC Boot Camp for Allen Bradley
PLC-5, SLC-500, and ControlLogix

If you aren't satisfied, don't pay for it. Guaranteed. Period.

training@ind-info.com

Industrial Informatics, Inc.
1013 Bankton Cir., Suite C
Charleston, SC 29406

843-329-0342 x111 Voice
843-412-2692 Cell
843-329-0343 FAX

I've read Michael's article before but can't remember where. I agree with his prophecy, and I'm pushing to see it through. The only difference is that I see two groups: one for office networks and one for production networks. The biggest issue I see is money. Upper management can't see spending money on IT in the Maint/Eng group when there is already a group to cover network maintenance.

The problem with upper management's theory is that comparing an office network to a production network is like comparing a four-door sedan to a Mack truck. Yes, they each drive the same roads, but each has a unique and completely different purpose. Most IT personnel don't understand this. They try to copy/paste the office network into the production area, and it WILL fail. As Nathan said, "'oops, I left the network down for the afternoon' holds a different level of significance when it comes to production."

Referring to Jasmin's comment "(you find that most departments have 2-3 "computer whiz kids")": I cut my teeth on a 10Base2 network in production because our IT group said, "Go play with your little toys and leave us alone." Looking back, it was the best thing he could have said. I played with my toys and I learned. It was trial and error; back then you couldn't find reliable experts in this area. Now our production network dwarfs the office network in our plant and is still under the control of Maint/Eng. Today I can't "play" anymore; production is too reliant on the network. This means that people hired for production networks have to be trained in PRODUCTION networks.

Now I look at IT with contempt at their ignorance of what it takes to make a production network reliable. Most of them think finance makes the money in a plant.

As for a SCADA system, I would use industrial equipment, not office equipment, and build a stone wall around the production network with only a pinhole for information to travel through--one that can be plugged at a moment's notice. In my plant, "Nimda" was the warning shot across the bow and "Sasser" was the justification for our diligence in a rock wall of security.

By Michael Batchelor on 8 December, 2006 - 7:09 pm

I really hope that you've seen this somewhere besides somewhere that I've posted it. I really have been advocating such a move since the late 80s, when we all used Usenet news, but I've never heard anyone except me singing this song. I don't even care if I don't get credit for the idea; I just want to see the problem solved. (Who was it--Reagan?--who said it's amazing what you can get done when you don't care who gets the credit.)

I disagree, however, that there should be two groups. In the same way that there are not two groups of electricians - one to handle the bathroom light switch and one to handle the motor change-out on the conveyor line - there shouldn't be two groups of data technicians. (Obviously gigantic plants have multiple groups of all disciplines with different areas of responsibility - let's talk about average-size plants here.) That isn't to say that the average plant maintenance staff doesn't have a couple of guys who are the ones you send to fix outlets and light switches and a couple of other guys you send to do the heavy lifting. Every good electrical supervisor has a grasp of his technicians' capabilities and a good handle on the differences in requirements between the bathroom light switch and the conveyor motor. Likewise for the mechanical guys. Different requirements for different areas, but one supervisor.

I also won't deny that the requirements for the "plant desktop data network" and the "central accounting system" and the "production network" are very different. But the Data Supervisor should have a handle on the differences just like the Electrical Supervisor and the Mechanical Supervisor, and he should report to the Maintenance Manager just like they do. The problem is that it would require *GIANT* organizational changes, with winners and losers. The problem is political, not technical, and the CIO has a corner glass office next to the CEO, while the Maintenance Managers and the Engineering Managers have cubes out on the floor. I *WILL* deny that any CIO wants to move his desk down to the dirty production environment. Witness that this discussion is going on in a "controls newsgroup" and not in an "IT newsgroup."

--
Michael R. Batchelor

www.ind-info.com

It is happening. Peter Zornio, when he was at Honeywell, predicted it to me years ago. He said that it was far easier to take a plant networks and information person and make them understand enterprise IT than to take an enterprise IT manager and get him or her to understand how a plant works, and why it is different.

At this past week's Invensys Customer Conference, Marty Edwards (who recently moved from Georgia Pacific to Idaho National Laboratory) showed a chart with the differences between IT upgrade and patching practice and what is allowable on the plant floor-- and there were few similarities. IT simply doesn't understand the fundamental difference.

The biggest difference is that in the Enterprise world, the fundamental security practice is to protect the server--- save the data! In the plant floor environment, the fundamental security practice is to protect the controller and the plant floor loops and controls. These are diametrically opposed objectives.

That's why, as Eric Byres said to me the other day, Cisco never came up with the Tofino device. If you haven't seen that yet, you will. It is a revolutionary edge peripheral firewall that was designed to protect controllers and field devices. MTL will be selling it in 2007, and they are demonstrating it periodically. I saw it at the Honeywell EMEA User Group meeting in Spain a couple of weeks ago. Byres was there, and had a demo set up with a networked PLC operating a pump and level controller loop to fill and drain a storage tank. There was an HMI so we could see what was going on.

Byres used some basic blackhat penetration software and was able to spoof the HMI so that everything continued to look fine, and then cause the pump to force on, and overflow the tank. He could do anything he wanted to the PLC and nobody would know.

Then he put the Tofino firewall in the line between the network and the PLC, and the PLC absolutely disappeared to the blackhat software.
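For readers who haven't seen a device like this, the behavior Byres demonstrated comes down to deny-by-default filtering with silent drops. Here is a hypothetical Python sketch of that logic--the rule format and addresses are illustrative, not the actual Tofino configuration:

```python
# Deny-by-default: only traffic explicitly listed here may reach the PLC.
# In this illustrative rule, only the HMI host may use Modbus/TCP (port 502).
ALLOW_RULES = [
    {"src": "192.168.1.10", "proto": "tcp", "dport": 502},  # HMI -> PLC
]

def filter_packet(src: str, proto: str, dport: int) -> str:
    """Anything not explicitly allowed is silently dropped - no TCP reset,
    no ICMP reply - so to a port scanner the PLC simply isn't there."""
    for rule in ALLOW_RULES:
        if (src, proto, dport) == (rule["src"], rule["proto"], rule["dport"]):
            return "forward"
    return "drop"
```

The silent drop is what makes the PLC "disappear": a blocked connection attempt gets no response at all, rather than a rejection that confirms something is listening.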

As we look at the coming proliferation of networked wireless devices in plants, this kind of device (Joanne Byres assures me the patents are good, strong and unbreakable) is going to be a necessity right at the field device (whether sensor, controller or final control).

Walt Boyes
Editor in Chief CONTROL magazine
Putman Media Inc.
555 W. Pierce Rd. Suite 301 Itasca, IL 60143 www.controlglobal.com
blog: Sound Off!! at controlglobal.com
630-467-1301 x 368 wboyes@putman.net

By Michael Batchelor on 10 December, 2006 - 5:46 pm

I'll still stand by my assertion below that this is going to take giant organizational changes. Walt is dead on the money in all of his assertions, but still, as I quote myself below, witness that this discussion is going on in a controls newsgroup, not an IT newsgroup. Until this starts getting exposure in the IT world, it's DOA.

Anyone got the guts to throw this thread on the desk of their CIO?

Michael

Of course I agree with you.

However, these changes are beginning to happen.
I am aware of more than a dozen people who head both plant automation and IT in their plants.

IT is being downsized because it isn't cost effective the way it is currently organized.

Walt Boyes
Editor in Chief
Control magazine
www.controlglobal.com
blog:Sound OFF!! http://waltboyes.livejournal.com
_________________

Putman Media Inc.
555 W. Pierce Rd. Suite 301
Itasca, IL 60143
630-467-1301 x368
wboyes@putman.net

By Nathan Boeger on 12 December, 2006 - 12:17 am

> It is happening. Peter Zornio, when he was at Honeywell, predicted it to me years ago. He said that it was far easier to take a plant networks and information person and make them understand enterprise IT than to take an enterprise IT manager and get him or her to understand how a plant works, and why it is different. <

In fundamental concept perhaps - but there is a distinction between "enterprise IT" and desktop support. Your typical plant network person doesn't know a thing about properly managing servers, networking beyond the fundamentals, etc. They have to start at square one in their IT curriculum, just as the IT manager needs to start at the beginning with controls technology. Both sides need to learn to look beyond the physical hardware they assume they're familiar with. This may not address your point directly, but far too many controls people bash IT for running desktop patches that break shoddy HMI programs. Just because that's a typical IT department responsibility doesn't mean it's the bulk of what they specialize in. IT needs to be informed about how to support specialized devices (PCs running controls apps). This is especially true since HMI vendors are so irresponsible about supporting the platform on which they run.

> IT simply doesn't understand the fundamental difference. <

Inform them - IT's job is to support and maintain computers and the network. If an IT department works for a manufacturing plant, they need to know what production priorities are.

> Then he put the Tofino firewall in the line between the network and the PLC, and the PLC absolutely disappeared to the blackhat software. <

That's what a firewall does. I think the Tofino device is a cool thing - it allows a PLC type who has no idea what he's doing with networking or security to provide some amount of protection to the plant floor - an innovative idea. However, a good IT department would do better. Worst case they could work with you on configuring that device, and actually understand what's going on.

I feel weird standing on the side of IT after doing project after project arguing these points from the other side. We all recognize the convergence of technologies. Now it's important to recognize each other's skill sets in the area. SCADA security tends to be pathetic. Most industrial organizations will wish they'd worked with their IT department if they ever get attacked.

----
Nathan Boeger
boeger AT inductive automation DOT com
Inductive Automation
"Design Simplicity Cures Engineered Complexity"

By Michael Batchelor on 12 December, 2006 - 4:20 pm

My point exactly about why there needs to be one group, not two separate groups. It is true that the physical appearance fools a lot of people. It looks like a computer, so it belongs to us. The PLC doesn't look like a computer, so you can play with the software on it willy-nilly without any procedures, despite the fact that some of those PLCs are now collecting data that's regulated by Sarbanes-Oxley.

This isn't a simple issue. There are real competing needs here, some of which are mutually exclusive, or at least require creative solutions. If one manager has responsibility, and understands both sets of requirements, then it's more likely to get solved.

Michael

I have to take exception to something Nathan said after he quoted me.

It is not true, in my experience, that vendors produce either shoddy programs or support them badly. Nor does Microsoft produce bad software.

Before everybody from the "Church of Kill Bill" leaps on me again, let me point out that CERT shows a large number of vulnerabilities in every distro of xNIX from OS X to Linux. Look at the size of the distribution of MS apps vs. the size of the xNIX apps: if there were as many Linux boxes as there are Windows boxes, we'd all be complaining about how shoddy Linux is, or how virus- and trojan-vulnerable OS X is...

The issue, as Joe Weiss from Kema and I were talking about on the phone last night, is not really one of coding. It is about policies and procedures, auditing and training.

I read somewhere that on the order of 40% of all control systems still have "password" as the root administrator password years after delivery, and the "guest" signon is still enabled in more than half.
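Statistics like these suggest a first audit step that is easy to automate: walk the account lists on each system and flag factory defaults. A hedged Python sketch--the credential list here is illustrative, not drawn from any vendor's actual defaults:

```python
# Hypothetical list of factory-default (username, password) pairs to flag.
DEFAULT_CREDENTIALS = {("root", "password"), ("admin", "admin"), ("guest", "")}

def audit_accounts(accounts):
    """accounts: iterable of (username, password, enabled) tuples pulled from
    a system's local user list. Returns the findings worth escalating."""
    findings = []
    for user, pw, enabled in accounts:
        if (user, pw) in DEFAULT_CREDENTIALS:
            findings.append(f"default credential still set: {user}")
        if user == "guest" and enabled:
            findings.append("guest signon still enabled")
    return findings
```

In practice the password check would be done against stored hashes rather than plaintext, but the policy point is the same: this is a procedures-and-auditing problem, not a coding problem.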

You can't legitimately thwack vendors for that.

Walt Boyes
Editor in Chief
Control magazine
www.controlglobal.com
blog:Sound OFF!! http://waltboyes.livejournal.com
_________________

Putman Media Inc.
555 W. Pierce Rd. Suite 301
Itasca, IL 60143
630-467-1301 x368
wboyes@putman.net

By Michael Batchelor on 13 December, 2006 - 11:50 pm

> The issue, as Joe Weiss from Kema and I were talking about on the phone last
> night, is not really one of coding. It is about policies and procedures,
> auditing and training. <

Exactly. And I'll assert again, there should be *ONE* set of policies and procedures, not two or a dozen. That's not to say that there won't be two paragraphs or sections dealing with differences between two types of systems. In truth, there are probably a dozen different "domains" of how data systems get used, and all of them need to be addressed within the policies. The mainframes that are still around, and don't think they aren't, need - and should have - a very different set of rules from the desktops, and from the automated equipment, and from the Time and Attendance clocks, and from the security cameras that now write to a HDD instead of a VHS tape.

Some people have told me that it's insane to lump all of them together, but I'll stick to my example that electricity has now moved from the plant floor into the office, and there's 480VAC turning lathes as well as an outlet in the restroom with a GFI and another outlet under the desk that has your monitor plugged into it. "Computers" will, and must necessarily, become as pervasive as electricity.

> I read somewhere that on the order of 40% of all control systems still have
> "password" as the root administrator password years after delivery, and the
> "guest" signon is still enabled in more than half. <

I can attest to this firsthand as well. I was called to a plant that I hadn't been in for at least eight years to see if I could get a daily production report that had stopped printing to work again. No one knew the password, so I tried the default distribution password I have used for years. It worked the first time.

Michael

--
Michael R. Batchelor

www.ind-info.com

By Michael Griffin on 14 December, 2006 - 9:01 pm

In reply to Walt Boyes - There are two points you brought up which I think need addressing.

The first is (to paraphrase it) the claim that "MS-Windows gets cracked much more often than Unix/BSD/Linux because more people use it". That theory has been thoroughly debunked many times already in the general computer press, so I won't address it here. The security problems with MS-Windows are due to deep-seated design problems and a reluctance by Microsoft to admit to problems they don't want to fix.

The relevance of this first point to automation applications is that anyone who is counting on "security by obscurity" for their automation application is kidding himself. The fact that MMI and SCADA systems are used less than e-mail and word processors does not inherently make them less vulnerable.

The second point that needs addressing is the claim that vendors cannot take responsibility for default passwords not being changed. There is a solution for that and it is to not have any default passwords or "guest" accounts. When you install the software, you should get no functionality until you create your own login and password.

The reason so many systems install with default passwords is that having an installed default password is less work than writing code to handle the special case of starting up the first time with no passwords configured. It is also seen as more "user friendly", because you can install and start up the software without having to create any logins (under the theory that someone will get around to changing the passwords "later"). The problem with that approach is that if the software looks as if it is working, people are inclined to just leave it alone rather than dig into it to find out how it works.
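The special case being avoided is genuinely small. A Python sketch of the first-run setup described above--refuse to run until the installer creates the first account, so no default password ever exists to be left in place. The file name, storage format, and function name are all hypothetical:

```python
import getpass
import hashlib
import json
import os

CRED_FILE = "credentials.json"  # illustrative storage location

def first_run_setup(cred_file=CRED_FILE, prompt=input, secret=getpass.getpass):
    """If no credentials exist yet, force the installer to create the first
    administrator account before the application will start."""
    if os.path.exists(cred_file):
        return False  # already configured; normal startup continues
    user = prompt("Create administrator username: ")
    salt = os.urandom(16)
    # Store a salted PBKDF2 hash, never the password itself.
    pw_hash = hashlib.pbkdf2_hmac("sha256", secret("Password: ").encode(), salt, 100_000)
    with open(cred_file, "w") as f:
        json.dump({"user": user, "salt": salt.hex(), "hash": pw_hash.hex()}, f)
    return True  # first-run setup completed
```

The `prompt` and `secret` parameters are injected only so the sketch can be exercised without a terminal; a real product would wire this into its installer.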

So yes, we can "legitimately thwack vendors for that".

How about thwacking yourselves? End-users have enormous influence with DCS vendors. If you don't think so, go sit in on a Honeywell User Group meeting, where the user group itself designs and commissions products, and has over 25 engineers working directly for them.

The same is true for Yokogawa, Invensys, Emerson, and so forth. End users have stroke, money, and feet to vote with. And they use them.

The fact is, Michael, it is easier to blame Microsoft or your least favorite DCS vendor for poor security practices on the part of end users and system integrators. Microsoft and the DCS vendors have their problems, it is true, but you're acting like there is no responsibility to be accountable on the part of the end user. That's not right.

Sure, it would be nice to have the feature you describe...have you asked a DCS vendor to supply it? If you haven't, then shame on you.

The debunking is coming unraveled. It was easy to thrash MS in the general computer press until OS X started exhibiting similar vulnerabilities, and xNIX distributions have always practiced your security by obscurity.

I completely agree with you that anybody who thinks security by obscurity is going to save them is foolish, and likely to do serious damage or even kill people from neglect of following some very simple rules.

We are all going to have to re-think what we do as far as security is concerned.

Walt Boyes
Editor in Chief
Control magazine
www.controlglobal.com
blog: Sound OFF!! http://waltboyes.livejournal.com
_________________

Putman Media Inc.
555 W. Pierce Rd. Suite 301
Itasca, IL 60143
630-467-1301 x368
wboyes@putman.net

By Nathan Boeger on 15 December, 2006 - 11:06 pm

First, I'd like to say that Michael Griffin's 14DEC 06 post is dead on.

Walt, I have to give it to you on the pizazz and elocutionary flair. Unfortunately on this last post I'll have to take exception to certain statements of yours.

"xNIX distributions have always practiced your security by obscurity" - Where did this come from? Please elaborate. Unix communities tend to provide source code for their security implementations, relying on algorithms that will stand up to an attacker who knows the system, the POLAR OPPOSITE of security by obscurity. Much details of the Windows kernel and security implementation are closely guarded trade secrets at Microsoft - providing a level of security by obfuscation. Maybe their code's solid, who knows? Ask someone that work for them.

Windows 9x and before were designed as single-process applications without security in mind. Windows NT was designed to be a secure system, with many provisions that weren't actually implemented. Over time, Microsoft has been doing better and better. Vista should be pretty solid at the core. My interpretation is that Michael and Mark's qualms relate to the way Microsoft deals with its problems - they choose to play the PR game and sweep it under the rug, or best case hotfix it, whereas xNIX communities tend to release the problem and the fix. Since the source code is available, the fix is open to public scrutiny.

> How about thwacking yourselves? End-users have enormous influence with DCS vendors. <

Should we blame ourselves? As integrators, engineers, and opinion leading writers, probably. As end users? Give me a break. End users understand computer security implementation details in even more limited scope than you. That's why they pay the big bucks to have the "experts" implement it. How can they be expected to correctly influence vendors in this regard?

> I completely agree with you that anybody who thinks security by obscurity is going to save them is foolish, and likely to do serious damage or even kill people from neglect of following some very simple rules.
>
> We are all going to have to re-think what we do as far as security is concerned. <

You're absolutely right here. SCADA security is currently almost non-existent from a real computer security standpoint. Obfuscation won't save a system from even a novice attacker. I think that the responsibility should fall on vendors, integrators, and finally, end users.

----
Nathan Boeger
Inductive Automation
"Design Simplicity Cures Engineered Complexity"

By Michael Batchelor on 15 December, 2006 - 11:33 pm

OK, here I am ranting again.

Touché on this point, Walt, but flavor of OS aside, how do we get the organizational structure of companies changed to get rid of the war between IT and Maintenance?

That point is a hundred times more important than what OS the stuff runs on.

MB

By Michael Griffin on 16 December, 2006 - 5:55 pm

In reply to Michael Batchelor - If the control network used RS-485, we wouldn't even consider the possibility that IT should take care of it. Just because it happens to use ethernet shouldn't change any conclusions we may have about who should look after it. The answer would seem to be to have separate IT and control room networks and let each department look after their own.

That doesn't mean that people in different departments shouldn't help each other out when they are in difficulty. It does mean however that when you buy some fancy equipment you should make sure that your own people can look after it. Moving the responsibility for a combined network from IT to the Maintenance department doesn't change the problem, it just changes who is complaining about it.

The real problem should come when you want to exchange data between the two systems. For things like having an MRP/ERP system get data from production equipment it's probably a good idea to have an interface "box" that intermediates between them rather than having the MRP/ERP system reach directly into individual machines. For things like allowing test equipment to back up test data to the IT file servers, it's a good idea to set up a special working relationship for those specific cases. In these types of applications though, you have limited and well defined interfaces that both sides can come to an agreement on how to deal with.
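The interface "box" idea above can be sketched as a small broker that the plant side feeds and the ERP side queries, so the ERP never reaches into a machine directly. The class and tag names here are hypothetical:

```python
class ProductionDataBroker:
    """Intermediary between the production network and the MRP/ERP system.
    The plant-side collector pushes readings in; the office side can only
    query tags that are part of the agreed, published interface."""

    def __init__(self, published_tags):
        self.published = set(published_tags)
        self.cache = {}

    def update_from_plant(self, readings: dict):
        """Called by the plant-side collector on its own schedule;
        anything not in the published set is silently ignored."""
        for tag, value in readings.items():
            if tag in self.published:
                self.cache[tag] = value

    def query(self, tag: str):
        """Called by the ERP side. Unpublished tags raise KeyError, so the
        agreed interface is all the office side can ever see."""
        if tag not in self.published:
            raise KeyError(f"tag not part of the agreed interface: {tag}")
        return self.cache.get(tag)
```

Because both sides talk only to the broker, the "limited and well defined interface" is a concrete artifact the two departments can review and agree on, rather than an informal understanding.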

By Michael Batchelor on 17 December, 2006 - 2:55 pm

In a different life I worked in IT, and I had far more RS-232 links to worry about than I've seen in most production facilities. OK, so RS-232 isn't RS-485, but they're truly cousins. And had one of the non-IT guys wanted to monkey with it, he would have been stopped.

I see your point, but I think it's deeper than that.

By Michael Griffin on 16 December, 2006 - 9:27 am

In reply to Walt Boyes - Not everyone can attend user group meetings. It also helps to discuss problems in public ("thwacking vendors") to build a consensus amongst users as to whether there really is a problem, and as to what the possible solutions may be. If software vendors are not reading this and other mailing lists, they are missing out on important information about their market. No doubt the user group meetings help in getting more detailed feedback, but these will always have the disadvantage of representing a narrow selection of opinion.

As to whether "the debunking is coming unraveled", I don't want to turn this into a debate on the security of different operating systems. This would simply be duplicating detailed discussions that have taken place elsewhere in the general computer press.

Please note though that what was being "debunked" was the explanation of *why* there are more security problems with MS-Windows than with other operating systems. I was responding to your comment on "if you look at the size of the distribution of MS apps vs the size of the xNIX apps, if there were as many Linux boxes as there are Windows boxes, we'd all be complaining about how shoddy Linux is, or how virus and trojan vulnerable OS X is". The premise of your own argument is that MS-Windows has more security problems, and your explanation for why this is so is an old argument which I said has been debunked. If you would like more details, the following article explains it rather well.

http://www.theregister.co.uk/security/security_report_windows_vs_linux/

I would also point out that Microsoft is not the only software vendor with a poor security record. Oracle has an equally poor reputation in this regard. This seems to be a common failing among very large software vendors who dominate their respective markets. Competition seems to be the most reliable means of getting companies to respond to problems, so companies that don't feel that competition very strongly tend to avoid taking difficult measures to correct those problems.

The relevance of the above to SCADA and MMI systems is that unless customers consider security a priority and make it a criterion in their purchasing decisions, the vendors have little incentive to devote resources to security instead of to developing new features. Many vendors have been counting on the fact that SCADA systems are relatively obscure, and feel that this affords a degree of protection ("security by obscurity").

We discussed SCADA security earlier this year under the topics

HMI: COMM: An interesting article on a SCADA security vulnerability.

HMI: SCADA Security Developments

These discussions were based on articles in the computer security press about vulnerabilities in SCADA systems and how new legislation in some countries is requiring plant owners to address this problem for "critical infrastructure".

I will repeat one of my own conclusions from that discussion, as I think it bears further consideration. The SCADA security studies pointed out that the security of the SCADA application itself is only one element in overall security, and in many cases, only a minor part of that security. The security of the operating system, database server software, and other third party applications are equally or even more critical.

An operating system which is in a configuration used for typical desktop applications will not meet the security requirements laid out in the studies. If it is possible to configure a system to meet these requirements it would require more knowledge than all but a very few SCADA integrators or plant operators possess.

I think that SCADA vendors have a role to fill here; and no, it doesn't involve standing around and getting "thwacked". While every SCADA application is unique, the software components and configurations are common to most. The problem is that for most SCADA systems the application integrator is expected to purchase various third party components separately and to put them together himself (hopefully doing it securely). The plant operator is then expected to maintain that system (while supposedly keeping track of all applicable security threats). The SCADA vendor doesn't take responsibility for anything that doesn't come on the CD they supply.

An alternative approach would be for the SCADA vendor to supply a customer with a CD that contains *all* the software they are likely to need for most SCADA applications, and to support *all* of that software (via a service contract). That would include having it install by default in a secure configuration, and supplying all updates from a single source. They would supply and take responsibility for not just the actual SCADA software, but also the operating system, the database, and any other software that a typical system would require (web servers, programming languages, etc.).

If that sounds impractical, it is worth pointing out that there are companies that do precisely that for critical business systems. They do not create most of the software, but they package it together and support it for paying customers. That is in fact the actual business of most Linux distributors. The operating system itself is only a small part of what they deliver and support.

I don't see why SCADA vendors couldn't pursue a similar strategy, whereby they package their proprietary SCADA software with all the third party components necessary and then take responsibility for the system as a whole. If some customers needed to go outside the supported configuration they would still be free to do so, but they would be on their own as far as supporting those changes is concerned.

This doesn't mean using a proprietary OS on proprietary hardware. The customer would use common PC hardware purchased from any qualified vendor, the OS would be a readily available one, as would the other required software. The SCADA vendor would strip out software components which are not needed for SCADA applications, and configure the remainder in a reliable and secure manner. They would then supply additional software to manage installation, configuration, and downloading updates. They would operate a software repository that would provide all updates from a single source and notify their customers of any problems.
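A single-source update repository like the one described could, in outline, verify each package against a vendor-published manifest before applying it. A minimal sketch of that check (the manifest format, package names, and digests here are assumptions for illustration; a real repository would also sign the manifest itself):

```python
import hashlib

# Hypothetical vendor manifest mapping package file names to expected
# SHA-256 digests. In practice this would be fetched from the vendor's
# repository and its signature verified before anything else.
MANIFEST = {
    "scada-core.pkg": hashlib.sha256(b"scada-core-contents").hexdigest(),
}

def verify(name, contents):
    """Accept a package only if its digest matches the manifest."""
    expected = MANIFEST.get(name)
    if expected is None:
        return False  # unknown package: reject outright
    return hashlib.sha256(contents).hexdigest() == expected

# A matching package is accepted; anything tampered or unknown is not.
assert verify("scada-core.pkg", b"scada-core-contents")
assert not verify("scada-core.pkg", b"tampered")
assert not verify("rogue.pkg", b"anything")
```

The point of the sketch is only that a customer pulling everything from one vendor-controlled source can mechanically reject anything the vendor has not vouched for.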

This would be a value added business for the SCADA vendors that would let integrators and plant operators concentrate on their application, while letting the SCADA software vendors take care of the complete software application platform. There are I believe some specialist SCADA vendors who already do this, but I don't know why this shouldn't be the rule for most SCADA software rather than the exception.

Notice I haven't just "thwacked" the SCADA vendors, nor have I thrown my hands up in the air or pretended that there is no problem. I think the above is a viable solution for both vendors and customers. Unless that is, nobody is really all that concerned about security. I think that is in fact the case today, but that could change in future. The SCADA vendors who have put themselves in a position where they could take advantage of that change when it comes may be the ones who survive while the rest go out of business.

By Michael Batchelor on 16 December, 2006 - 5:54 pm

OK, I'm still on an organizational rant here. I don't want to turn this into a debate about security differences between two, ten or fifty operating systems either.

I *DO* want to turn this into a discussion about how to effect the organizational changes necessary to solve the problem of the continuous struggle between the IT staff and the maintenance/production staff.

Even if I had a magic wand and could change every computer inside Organization "X" into a computer running OS distribution "Q" I'm still going to have the struggle between the competing needs of the Control Room HMI and the receptionist's email "terminal" and the Corporate mainframe in the glass house.

What can we do to address the organizational issue? I think this is far more important than the OS.

Michael Batchelor
www.ind-info.com

Michael

The answer is -- sit down and talk to the management and IT people.

You may be surprised where it will lead. They all have the same interest (the company) at heart.

Dennis

By Curt Wuollet on 18 December, 2006 - 10:19 am

They are inextricably related. The choice of MS for the secretaries currently more or less dictates the choice of tools for the automation dept. And right or wrong, management sees the IT gurus as professionals and everyone else as users. This explains, in part, why I must use WXP on the job. The other part is that automation and maintenance actively and decisively put themselves in this position by choosing products that include IT property in every system that uses OPC or depends on Windows.

This was _not_ a problem when PLCs were standalone and DCS was run on whatever was the most stable. And that is enforced by automation vendors supporting only one system, the one most attractive to IT meddling. The guru vs user bit is intrinsic in using MS for anything, so why should factory systems be handled any differently? And for the most part, since IT keeps up with crashware by necessity, and control folks presumably have other things to worry about, why would you _not_ organize it that way? After all, they are MCSE's.

If control systems were control systems and not simply another Windows box, we might win this, but the way things are, the MCSE is going to get the call any time you can't fix it in 10 minutes. It's the price you pay to use Microsoft, just like IT. You are just another Windows user.

Regards
cww

By Michael R. Batchelor on 19 December, 2006 - 8:47 pm

Curt,

Completely, absolutely true. Which is why I have been advocating for moving maintenance of the data devices into the hands of the maintenance department.

But, as I said several days ago, that would entail moving the CIO's responsibility into the engineering department. That's the change that's unlikely.

To Curt's point below, there's not a single electrical supervisor alive - at least not one who deserves the job - who sees the requirements for the 480VAC 3 phase motor that runs the conveyor chain in the same light as the bathroom outlets, but he's responsible for both. And a decent manager gets both right.

However, with the current organizational structure in most plants, the secretaries get correct service and the control room gets shafted. So I propose changing the organizational structure of the data services to mimic one that gets it right in another domain.

But there will be a fight.

Michael

www.ind-info.com

Michael

I work for a consulting engineering outfit which would never allow a supplier or integrator to define the security of a system. The specification defines the requirements and the contractor bids to complete the job per the specs - security is defined in the specs. We would be fools not to include security in those specs. Security runs the whole gamut of software, hazardous areas, employee safety, and public safety, all of which are in the realm of consulting engineers, and we try to protect these areas.

Dennis

By Michael Griffin on 17 December, 2006 - 2:57 pm

In reply to Dennis - There is a group called the "SCADA and Control Systems Procurement Project", made up of large users of SCADA systems. They are busy writing standard contract language intended to be inserted into SCADA purchasing contracts to cover security issues. I have mentioned them before, so before replying to your message I thought I would have a look at their current draft. I found the following contract language, which I had not noticed before:

"Post contract award, the vendor shall provide notification of a known vulnerability affecting vendor supplied or required OS, application, and third party software within a negotiated period of time after public disclosure. The vendor must apply, test, and validate the appropriate updates and/or workarounds on a baseline reference system before distribution. These steps could be a subset of FAT testing. Mitigation of these vulnerabilities should occur within a negotiated period of time." (section 2.6.2 - draft version 1.5).

Note they intend to have the *vendor* be responsible for the "supplied or required OS, application, and third party software". If a vendor's software requires a third party OS, this group (and anyone else who follows their recommendations) are going to require the SCADA vendor to take responsibility for the security of the OS (and other software) and that responsibility will continue after the sale and installation.

It's nice that you include software security in your specs, but do you take responsibility for the software for the life of the plant? Do you continuously analyse the security threats and test and provide the application, operating system, and other third party security patches? Most consulting engineers are on to the next project after the plant is up and running.

What I was referring to was a software vendor business model similar in intent to what the quoted text above is stating. That is, a business relationship that doesn't end after the initial sale.

For this to be feasible, I think the vendor should provide all the software. This doesn't mean the vendor has to *write* all the software (other than the SCADA package). These days operating systems and databases are commodities so it would be foolish for them to write their own. I think though it all needs to go through the vendor's hands so they have some control over what they are supporting and can strip out anything that isn't necessary for their customers.

It will be interesting to see how SCADA vendors respond to this requirement and how well their solutions work.

Michael,

I would be interested to hear more about this.

Contact me at phair1 @ rogers. com

Do you have any web sites?

Dennis

By Michael Griffin on 5 January, 2007 - 10:45 pm

In reply to denn: The web site for the "SCADA and Control Systems Procurement Project" is: http://www.msisac.org/scada/

I am not involved with them, and I am not making any recommendations regarding them. I brought them up to point out that there are SCADA users who intend to place more emphasis on security and who intend to make the supplier responsible for providing it.

I hope that answers your questions. I have already sent a copy of this reply directly to you, as you requested. If I can be of any further help, please let me know.

Michael,

I would love to see my life modified by this document, even though it is a draft (I don't think it will happen in my lifetime - too many preliminary legal battles to go through to set precedent). The constant reminder of inadequate site supervision (the Suncor lawsuit, for example) will keep my mind busy for years lest I fall into the same battle. My goal is not to follow the path of Suncor's engineers but to perform due diligence. This spec helps, but it is not a complete solution (everything is still in flux). Maybe it will be refined with time and experience.

Dennis

By Nathan Boeger on 17 December, 2006 - 2:50 pm

In response to Michael Griffin's post:
I appreciate the general insight that you provided on the above post (16DEC06 9:27AM). Your points about the relevance and issues relating to SCADA security are dead on.

I wanted to address your "Alternative approach" for SCADA vendors - supplying and supporting all the necessary software like Linux distributors. I think that's a good idea in terms of leaving the responsibility in the hands of a group that should be able to handle it (integrators tend to be too small and specialize in controls and end users tend to lack the expertise). Inductive Automation goes part of the way toward what you described. Because end users often have such different standards and considerations, we support many databases (MySQL, MS SQL Server, Postgres, Oracle, DB2, etc, etc). Similarly we'll work with pretty much any OPC server to talk to any kind of PLC. This makes securely locking up and bundling a single package impractical. However, we do support integrators on these installations (a free service for integrators) and provide direction in terms of implementation/security. We recommend that they work with their IT department to figure out their best implementation and secure it. We also support the sensible use of 3rd party applications or consultants.

----
Nathan Boeger
Microsoft Certified Systems Engineer
http://www.inductiveautomation.com
"Design Simplicity Cures Engineered Complexity"

By Nathan Boeger on 15 December, 2006 - 11:24 pm

Well put Michael Griffin!

----
Nathan Boeger
Microsoft Certified Systems Engineer
http://www.inductiveautomation.com
"Design Simplicity Cures Engineered Complexity"

By Blunier, Mark on 14 December, 2006 - 10:21 pm

Unfortunately, you are comparing apples to oranges. A Linux distribution ships several different versions of a given application - take FTP servers, for instance. If a couple of distributions each carry a couple of different FTP server packages with the same vulnerability, you're counting that as four bugs, even though only one FTP server is ever installed. In other cases, such as web servers, a vulnerability you'd count against Linux would not count against Windows, because the Windows OS does not include a web server. If I were to release MBWS (Mark's buggy web server) for both Linux and Windows, would it be fair to bash Windows for vulnerabilities in the Windows version of MBWS? That's what you're doing to Linux.
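Mark's counting objection is easy to make concrete. A toy sketch (the distribution, package, and flaw names are invented for illustration) showing how one underlying flaw inflates into four "bugs" when tallied per distribution and per package:

```python
# One underlying FTP flaw, shipped (but not necessarily installed)
# in two packages across two distributions. All names are invented.
advisories = [
    ("distro-A", "ftpd-classic", "FLAW-1"),
    ("distro-A", "ftpd-ng",      "FLAW-1"),
    ("distro-B", "ftpd-classic", "FLAW-1"),
    ("distro-B", "ftpd-ng",      "FLAW-1"),
]

# Counting every (distribution, package) advisory separately:
naive_count = len(advisories)

# But all four advisories trace back to a single underlying flaw,
# and any one machine installs at most one of the affected packages:
distinct_flaws = len({flaw for _, _, flaw in advisories})

print(naive_count, distinct_flaws)  # -> 4 1
```

The same exposure on any single box is one vulnerability, not four, which is the apples-to-oranges problem with cross-OS bug tallies.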

I don't see how you can even suggest that they are similar. When Linux gets a vulnerability identified, it gets fixed. When Windows gets a vulnerability, its firewalling and virus protection software gets updated, and then later, if we're lucky, the software that is actually vulnerable gets fixed.

As far as the original thread goes, it doesn't matter what the title of the person managing the systems is. The person should be competent in the technology required for the work. Unfortunately, both IT and maintenance have enough people who aren't competent even in their traditional areas that expecting them to be competent in both is unreasonable. But there are a few exceptional people who could do both competently.

Mark

By Nathan Boeger on 15 December, 2006 - 10:53 pm

Walt,

Microsoft does not currently produce bad software - they have a large enough economy of scale and development team to produce software of a quality where it's rare for a user to find significant bugs that affect productivity - the same is true of many large open source projects.

Industrial software, on the other hand, is written by much smaller companies for a much smaller industry with orders of magnitude less volume. Developers are then pressured to release the latest technologies AND support legacy systems - a tall order. Take any HMI system from 10 years ago and try to get it working on new machines - good luck. My point is that in this industry it's considered normal to tolerate a steady stream of version updates and hotfixes just to keep up, software that randomly crashes until patched, or tech support telling you to reinstall Windows to fix your problem - that can seldom be said of commercial software. For example, when SP2 came out for XP, most major vendors issued instructions NOT to install it until they figured out how to deal with it (in fairness, Microsoft screwed them over with the default Windows Firewall settings). Similar problems are commonplace with DCOM settings, with installing certain combinations of software (RSView and MS Word), etc. If you think that the current generation of industrial software is up to par with commercial software - and it should be more stable, given the criticality of its function - I would argue that your experience is out of touch with most of the industrial world. Talk to industrial integrators or manufacturing managers about the stability of their PC based control systems.

I think that you assumed a lot more than I meant because I used the term "shoddy". By that I meant that industrial software is seriously sub-par compared to where it should be, specifically with respect to normal computer workstation maintenance from IT - a heated topic that has caused many controls guys in this thread to question the competency of IT departments in general. My point was that IT still possesses the expertise to manage the system - they just need to know what pitfalls to avoid and be keenly aware that when dealing with control systems, a running process is #1; security and updates come a distant second.

My other point is that industrial systems should utilize developed (commercial or otherwise) software when applicable. For example, Microsoft SQL Server is a solid platform - great for commercial or industrial use. Why were most industrial software companies logging data to flat files or their own proprietary formats for so many years until relatively recently? Who knows, but the trend to move to a more stable platform makes too much sense. And IT would love to support that SQL database. Why do so many industrial software companies insist on writing their own pieces and plugins for their packages when they do a second rate job compared to existing technologies?
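The flat-files-versus-database point is simple to illustrate. A minimal sketch of logging tag history to SQL (using Python's built-in sqlite3 purely as a stand-in for SQL Server or MySQL; the table and tag names are invented for the example):

```python
import sqlite3
from datetime import datetime

# In-memory stand-in for a production database; a real historian would
# connect to SQL Server, MySQL, etc. through the appropriate driver.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE tag_history (
        tag   TEXT NOT NULL,
        value REAL NOT NULL,
        stamp TEXT NOT NULL
    )
""")

def log_tag(tag, value):
    # Each sample becomes a row, queryable with plain SQL instead of
    # being buried in a proprietary flat-file format.
    conn.execute("INSERT INTO tag_history VALUES (?, ?, ?)",
                 (tag, value, datetime.now().isoformat()))

log_tag("PUMP1_FLOW", 42.7)
log_tag("PUMP1_FLOW", 43.1)

row = conn.execute(
    "SELECT COUNT(*) FROM tag_history WHERE tag = 'PUMP1_FLOW'"
).fetchone()
print(row[0])  # -> 2
```

Once the data is in a standard database, any reporting tool can reach it, and IT can back it up and maintain it with the tools they already use.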

----
Nathan Boeger
Inductive Automation
"Design Simplicity Cures Engineered Complexity"

By Curt Wuollet on 18 December, 2006 - 10:12 am

Hi Nathan, I can answer that last question:

> Why do so many industrial software companies insist on writing their own pieces and plugins for their packages when they do a second rate job compared to existing technologies? <

Because in 5 or 10 years their own database systems will still be supportable rather than obsoleted by MS. And in the meantime, you won't have to change all the interface code that doesn't need changing three or four times. Often these decisions are based on past experience and pragmatism. Keeping up with Windows churn is a very expensive part of an ISV's development budget. Also, MS SQL is a resource hog and very much overkill for the job at hand. IT does like to own a new server for each function, though. If I were to standardize, it would be on something that fits and can't be obsoleted, perhaps with an API that would work with several other systems. Owning the source would provide even more assurance that you won't be hung out to dry.

Regards
cww

By Nathan Boeger on 19 December, 2006 - 8:36 pm

cww,

Good point about Microsoft products - they certainly have a history of short lifecycles, especially support and forward compatibility, compared to industrial hardware.

Have you used newer versions of SQL Server? It's a much better product than it used to be. I find myself more often recommending that integrators use MySQL or PostgreSQL, but as much as I'd rather not admit it, I think that MS SQL Server 2005 is as good, if not a better product. Sure, you're not going to run it on old hardware, but it does perform.

----
Nathan Boeger
http://www.inductiveautomation.com
"Design Simplicity Cures Engineered Complexity"

By Curt Wuollet on 18 December, 2006 - 10:07 am

That's pretty simplistic. And a quote from the party line. Considering that the source is available for most of the *nix variants, it's several orders of magnitude easier to find (and fix) vulnerabilities as there are many orders of magnitude more eyeballs on the code. Think for a moment how many more vulnerabilities would be found if the Windows source were available for all to see. And consider for a moment how many fewer vulnerabilities exist after all that scrutiny. And then there is the "exploitability" factor. Those that are left in say, Linux, are much less open to exploit than those in Windows. The relative number of boxes is a far smaller factor as is evidenced by the number of _successful_ exploits, economic damage, business disruption or practically any other metric. But, yes, I will easily find more typos in the book I have read, than the one that remains closed. No matter how many copies of each I have.

Regards
cww

By Davis Gentry on 18 December, 2006 - 2:23 pm

Come on, Curt. Everyone here knows that you are a *nix man, and that you don't care much for MS. Having said that, your points are often cogent, usually well stated, and backed by a good deal of real world experience. Walt's point here is also very much to the point. The *nix OSs are not inherently any more suited to our work than MS, and the converse is also true. In a well put together system the OS involved should be more or less immaterial to the customer. If we as coders and integrators do our homework then we can build a good, secure system be it on MS, Linux, or even BOSS. Most of the OSs available have their good points and their bad points, and we do a disservice to our customers when we blind ourselves to the facts.

Davis Gentry

By Curt Wuollet on 20 December, 2006 - 9:40 pm

Those are some pretty broad statements. Let's explore a few.

> Come on, Curt. Everyone here knows that you are a
> *nix man, and that you don't care much for MS. Having
> said that, your points are often cogent, usually well
> stated, and backed by a good deal of real world
> experience. Walt's point here is also very much to
> the point. The *nix OSs are not inherently any more
> suited to our work than MS, and tne converse is also
> true. <

No, it's not. For automation and control purposes, stability is key. Flexibility is important, and the ability to interface to most anything makes integration much easier. To say that a 50-million-line secret binary monolith with the world's shakiest security and stability record is as well suited as a modular, open, customizable OS is a fairly interesting argument. For actual control purposes, running the minimum amount of code necessary has always been a given, as it promotes reliability and speed. The ability to run headless with no user interaction is fundamental in most control applications. Owning the source code for your control system offers a great deal of confidence that you can support it long term.

> In a well put together system the OS involved
> should be more or less immaterial to the customer. If
> we as coders and integrators do our homework then we
> can build a good, secure system be it on MS, Linux, or
> even BOSS. <

On any particular application, I'd be happy to compare my uptime on Linux with yours on Windows.

> Most of the OSs available have their good
> points and their bad points, and we do a disservice to
> our customers when we blind ourselves to the facts. <

Well, on that much we can agree. It's just that most of Windows' good points relate to playing games, consumer floobydust, and nearly everything not related to the serious business of automation and control. They have institutionalized many features that are kewl but will forever be insecure. And their direction is towards more of the same, including such things as allowing themselves access to your data, breaking their own file formats with every version, and a myriad of other nightmares for maintainability. I find it quite incredible that one can favorably compare that with design for compatibility, commonality, and openness. Even from a consumer perspective, it's a stretch to claim that forced obsolescence, deliberate incompatibility, proprietary data schemas and formats, DRM, making all data part of a database, secrecy, and onerous licensing and use restrictions are beneficial. That would require a special type of blindness as to whom all the benefits of such arrangements accrue. That blindness is endemic at present, but more and more people are being cured every year. It should be stressed that Linux was created and is being developed specifically to address those faults and steer the benefits toward the user - and to make development and integration easier, and more secure, and practically every other good thing that can be done for computer users. Windows is being developed to make MS money, maximize their control over us, and sustain their monopoly. It's really hard to see them as equals and ignore the skunk on the table.

Regards

cww

By Chris Jennings on 27 December, 2006 - 8:55 am

In my experience the best way to ensure security is to have complete control of the data that enters and leaves the SCADA environment. We set up our networks with a router between the SCADA network and the business network and only opened the ports that were required for each application. By doing this you have seriously improved the security of the system. The biggest advantage I can see with Linux over Windows is having a different OS for the business applications than for the SCADA applications. This means that if a virus (the most likely cause of security problems) gets onto the business network, the only effect it could have on the SCADA side would be a potential DoS attack; actual infection is unlikely. Another good example would be using the Apache web server on one network and IIS on the other. Diversity can be a good thing sometimes.
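The "only open the ports you need" approach Chris describes is a default-deny policy at the router. A conceptual sketch of such a whitelist (real deployments would express this in the router's own rule language; the protocols and port numbers below are illustrative assumptions, not a recipe):

```python
# Default-deny firewall policy between the business network and the
# SCADA network: anything not explicitly whitelisted is dropped.
# The entries here are invented examples.
ALLOWED = {
    ("tcp", 1433),  # e.g. historian replication to a business SQL server
    ("tcp", 443),   # e.g. read-only HMI web view for managers
}

def permit(proto, dst_port):
    """Return True only for traffic explicitly whitelisted above."""
    return (proto, dst_port) in ALLOWED

assert permit("tcp", 443)        # whitelisted traffic passes
assert not permit("tcp", 135)    # DCOM/RPC stays blocked
assert not permit("udp", 161)    # SNMP stays blocked
```

The important property is the default: a new service on the business network cannot reach the SCADA side until someone deliberately adds a rule for it.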

> On any particular application, I'd be happy to compare my
> uptime on Linux with yours on Windows.

Just to put some numbers to the uptime argument, check out this website
http://en.uptime-project.net/page.php?page=toplist or maybe
http://uptime.netcraft.com/up/today/top.avg.html

Looks like SunOS or websites running IIS are the winners ;)

Chris Jennings

By Curt Wuollet on 28 December, 2006 - 2:29 pm

Hi Chris,

Chris Jennings wrote: <In my experience the best way to ensure security is by having complete control of data that enters and leaves the SCADA environment. We set up our networks with a router between the SCADA network and the business network and only opened the ports that were required for each application. By doing this you have seriously improved the security of the system. The biggest advantage I can see with Linux over Windows is to do with having a different OS for business applications to the SCADA applications. This means if a virus (most likely cause of security problems) gets on the business network the only effect it would have on the SCADA would be potential DOS attack but actual infection is unlikely. So another good example would be using Apache web server on one network and IIS on another. Diversity can be a good thing sometimes. >

cww: Diversity would solve most of the really horrible vulnerabilities. The current situation of all-MS plants is the _best_ possible case for virus propagation. If every other machine were running Linux or anything other than MS, most of these attacks would fail, or at least slow enough to be manageable.

But there are many, many less obvious advantages to being able to use "purpose built" systems for SCADA and control. One can make many systems "appliances" with little to no potential for malware or other abuse. You can even make ROMable systems that are totally immune. This is the approach the router folks use, cycle power and you are back to uncompromised purity. You can also go the other way and dramatically lower your box count with Linux rather than the function per box approach currently seen as necessary with Windows.

Chris Jennings wrote: <Just to put some numbers to the uptime argument, check out this website http://en.uptime-project.net/page.php?page=toplist or maybe http://uptime.netcraft.com/up/today/top.avg.html >

cww: That's a little different from our arena, with some special hardware and farming going on, but not totally irrelevant. I like the fact that until quite recently Microsoft had Linux firewalls in front of their IIS servers and were counted in the Linux camp :^) SunOS is not a bad system, and the BSDs regularly appear there as well. On commodity hardware with perhaps a simple UPS, I'll still put any stable Linux up against any shrinkwrap Windows. But I will admit they have gotten dramatically better as of late; where I work we only have a couple of incidents a week where something is unavailable. With 95 and 98 it was pretty hilarious in an installation of any size.

Chris Jennings wrote: <Looks like SunOS or websites running IIS are the winners ;) >

cww: Many of those are pretty high buck machines with failover, etc.

Regards
cww

By Brian E Boothe on 29 December, 2006 - 10:01 am

God, I love these conversations. We still install and operate 20-year-old technology and sell it as new - DH-485 and one PC to control an entire water plant and its stations. How's that for greatness?

By Davis Gentry on 29 December, 2006 - 5:21 pm

cww>>For automation and control purposes, stability is key.

Of course. And MS products from 2000 on have been quite stable in my experience if properly configured and administered.

>>Flexibility is important and the ability to interface to most anything makes integration much easier.

Agreed. And what OS has the most hardware drivers written for it? Yes, you can of course write your own drivers for Linux. Then your customers are really caught in a bad place five years from now, when a lightning strike takes down their PC and cards, they find out that they can no longer buy the same card from their vendor, and then they find that their driver no longer works with the firmware rev of the new card. Now, if they bought source code with the original install, they can go out and find a programmer capable enough to rewrite the driver (said programmer generally NOT found in most plants in the US, or on the nearest streetcorner, either). They also find out that the Linux kernel they were running is not compatible with the tools the developer is currently using, so they need to upgrade the OS as well. Maybe at this time they also have to rewrite, or at least recompile, their HMI.

CWW>>The ability to run headless with no user interaction is fundamental in most control applications.

Yeah - and Embedded NT/XP works great for that, as well as trivializing some of the other Windows concerns - smaller kernel, a very limited and fully controllable number of programs running on the machine, etc.

cww>>Owning the source code for your control system offers a great deal of confidence that you can support it long term.

See argument above about modifying old source code. While I personally greatly prefer to have source to any system on which I work, I have seen far too many end customers who own the source and are at best a menace if they try to do anything with it. Also a few integrators.

Michael Griffin>>An operating system which is in a configuration used for typical desktop applications will not meet the security requirements laid out in the studies. If it is possible to configure a system to meet these requirements it would require more knowledge than all but a very few SCADA integrators or plant operators possess.

This is perhaps a valid point - and if the SCADA integrators cannot set up a secure system with Windows, why would you think they can with Linux?

My original point stands. A competent integrator can set up a good (good = stable, secure, and supportable) system with many of the OSs available today. I am certain that Curt can set up something at least as capable in Linux as what I can set up with XP. And I know from over fifteen years of experience with automation systems under Windows (and a few *nix systems) that I can set up a good system under Windows.

Davis Gentry

By Nathan Boeger on 30 December, 2006 - 2:29 pm

Davis,
Your point about future supportability is exactly why I don't push custom software (and often Linux applications) in controls. End users need to know that there are viable support options for them a few years down the road.

I think you (and many others on this post) are missing cww's major security point about Linux. He's talking about trimming the machine down to an appliance, as has been done with routers and other specialty devices. This eliminates unneeded functionality and indisputably decreases possible security holes. This point doesn't attempt to address a secured normal Windows install vs. a secured Linux (with every optional package) install - you can trim down your Linux install to remove all the bloat. This isn't possible to the same extent in Windows. Could a Microsoft programmer with the source code do the same? Possibly. Could you as an integrator? Probably not. As long as MS continues along its current path with its software, versions like that won't be an option. That's a conscious business decision by Microsoft. They don't make appliances; they make a usable, highly backward compatible, general purpose operating system.

That said, I would support a high quality vendor's device that they've configured as a sweet Linux based appliance. Or it could be powered by a stripped, secured version of Windows. The bottom line is that the device better work like a VCR.

In my experience, end users and integrators don't have the expertise to set up workable Linux based systems, and certainly aren't able to maintain them. It's unfortunate because I'm a fan of the penguin. As it stands, I find myself recommending Windows systems most often. But Linux is getting more widely accepted and supported...

----
Nathan Boeger
Microsoft Certified Systems Engineer
http://www.inductiveautomation.com
"Design Simplicity Cures Engineered Complexity"

By Curt Wuollet on 31 December, 2006 - 12:51 pm

On Dec 30, 2006 2:29 pm, Nathan Boeger wrote:
> Davis,
> Your point about future supportability is exactly why I don't push custom software (and often Linux applications) in controls. End users need to know that there are viable support options for them a few years down the road. <clip>


Hi Nathan,
That's been a really rocky road for automation customers. Most, if not all, own unsupported systems of some type. Don't you? And there is room for other support options. Being wholly dependent on a single shrinkwrap vendor is a fairly large liability in itself. One can read from this forum that that isn't necessarily working out so great. With all the mergers, acquisitions and consolidation, there are large numbers of orphans under the current model. Add to that many packages that are simply obsoleted with no recourse, and having stuff that any qualified programmer can work on begins to appear much more attractive long term. What the commercial vendors see as a viable option often hinges upon the transfer of large amounts of money from your pocket to theirs and very few other choices. Single sourcing is risky business; why do business types see that for everything but software?

Nathan: > I think you (and many others on this post) are missing cww's major security point about Linux. He's talking about trimming the machine down to an appliance, as has been done with routers and other specialty devices. This eliminates unneeded functionality and indisputably decreases possible security holes. This point doesn't attempt to address a secured normal Windows install vs. a secured Linux (with every optional package) install - you can trim down your Linux install to remove all the bloat. This isn't possible to the same extent in Windows. Could a Microsoft programmer with the source code do the same? Possibly. Could you as an integrator? Probably not. As long as MS continues along its current path with its software, versions like that won't be an option. That's a conscious business decision by Microsoft. They don't make appliances; they make a usable, highly backward compatible, general purpose operating system. >

If they would simply make their systems usable without IE and Outlook, it would be a great leap forward for security. Instead it seems these can operate at administrator equivalent permission levels with full system access. Not good.

Nathan: >That said, I would support a high quality vendor's device that they've configured as a sweet Linux based appliance. Or it could be powered by a stripped, secured version of Windows. The bottom line is that the device better work like a VCR.>

Or a Tivo? :^)

Nathan: >In my experience, end users and integrators don't have the expertise to set up workable Linux based systems, and certainly aren't able to maintain them. It's unfortunate because I'm a fan of the penguin. As it stands, I find myself recommending Windows systems most often. But Linux is getting more widely accepted and supported...>

In my experience these same folks don't have the expertise to deal with Windows problems either. The ones that can are certainly technical enough to deal with a _documented_ system, especially with free community help available for the asking. If you think of it that way, it may well be more likely that you can fix a Linux system. Probably not instantly, but there are a lot more people who know the answers. If folks had the same exposure to Linux, I'm fairly sure they would do as well or better. After all, there simply aren't any Linux secrets.

Regards
cww

By Nathan Boeger on 31 December, 2006 - 8:40 pm

CWW,
Accidentally responded to Michael Griffin's response thinking it was yours. Oh well.

On Dec 31, 2006 12:51 pm, Curt Wuollet wrote:
> That's been a really rocky road for automation customers. Most, if not all, own unsupported systems of some type. Don't you? And there is room for other support options. Being wholly dependent on a single shrinkwrap vendor is a fairly large liability in itself. One can read from this forum that that isn't necessarily working out so great. With all the mergers, acquisitions and consolidation, there are large numbers of orphans under the current model. Add to that many packages that are simply obsoleted with no recourse, and having stuff that any qualified programmer can work on begins to appear much more attractive long term. What the commercial vendors see as a viable option often hinges upon the transfer of large amounts of money from your pocket to theirs and very few other choices. Single sourcing is risky business; why do business types see that for everything but software? >

Absolutely! I've integrated many a project with unsupported legacy systems. It often angers both me and the customer. In my experience, I've had (slightly) less trouble with old HMI applications than with old custom programs. I'm thinking of a certain distributed Delphi app that was ahead of its time, and a Linux C based program where the original author passed away a few years ago. Going open source, or even with the big guys, should tend to minimize your risk, but you're right - these software companies have been going crazy with buyouts recently! It's tough; I'm not sure what to tell end users. Our industry has a poor track record, and this madness ends up costing manufacturers lots of money. I work for a small industrial software company. We strive to be as open and standards based as possible for commercial software. We're also not interested in a buyout. But I'm sure many companies have been there before.

cww: > Or a Tivo? :^)

Lol, I was going to say Tivo. I never did figure out how to program my VCR ;)

On the Linux vs. Windows support question - I think a major problem is a lack of people giving Linux a legitimate shot. Usually when I mention Linux to manufacturers, their eyes roll as they're reminded of some ancient project gone awry. Even the industrial IT departments I've dealt with are often against Linux desktop support - even when they're supporting Linux servers. Any ideas why that might be? It's an uphill battle!

----
Nathan Boeger
http://www.inductiveautomation.com
"Design Simplicity Cures Engineered Complexity"

By Michael Batchelor on 4 January, 2007 - 12:36 am

Well, here I am back again, trying to change the subject.

I think we should all support Ght56dfN OS because it's going to be open by the time I get around to writing it, but in the meantime I want to get back to the problem of the "war" between IT and Production.

Operating system support aside, how do we effect the organizational change that alleviates the tension, and sometimes open warfare, that exists because IT's needs conflict with Production's needs?

I've thrown my proposed solution into the ring several times. What do others suggest, besides trying to hide under a rock and hoping the IT guys don't find us? That's not a solution.

Michael
--
Michael R. Batchelor
www.ind-info.com

GUERRILLA MAINTENANCE [TM] PLC Training
5 Day Hands on PLC Boot Camp for Allen Bradley
PLC-5, SLC-500, and ControlLogix

If you aren't satisfied, don't pay for it. Guaranteed. Period.

training@ind-info.com

By Michael Griffin on 5 January, 2007 - 10:25 pm

In reply to Michael Batchelor: The following is more or less a reiteration of my previous statement on this subject (or at least of the one that was connected to the putative subject).

I believe that the SCADA network is best controlled and maintained by the people responsible for the rest of the SCADA system. Some of the discussion we have had has been directed towards how we could make the SCADA systems easier to support to make this more feasible.

The problem you are trying to address is one of how to align the interests of different parts of the company so that they all work harmoniously in a common direction. If you really have a general solution for that, then I suggest that you get out of the automation business and become a management consultant. You might not have as much fun, but the money is certainly a lot better.

By Chris Jennings on 5 January, 2007 - 10:49 pm

Well, my background is that I have an Electronic/Computer Engineering degree and I started working for a manufacturing company in the IT department. It was great overseeing small projects that ranged from upgrading old thinwire Ethernet to UTP to deploying new versions of Microsoft Office. I got sick of this pretty quickly because it was pretty much the same thing every day. So I took a job working as an Automation Engineer on one of the paper machines. I had the computer experience, I had done control theory at uni, and there were a number of very experienced engineers around who could help me learn the ropes. From this experience I was able to apply my IT skills to the automation field, and I made a lot of changes that hopefully improved the reliability of the process control networks and computers. I could also see the frustrations that automation people had with the IT crowd. IT have a very narrow focus, and they also don't like making exceptions to their often very specific security rules. Examples:
-No admin rights for normal users
-Virus scanners MUST be installed
-Non-standard software not allowed

Just those three rules make an automation engineer's job almost impossible. It forces people to have a normal work PC (so they can read e-mail) and a "real" work PC which has all the DCS stuff on it. In some cases IT won't even let you connect to the business LAN, so you need to double up on network infrastructure.

So the real way to get people to understand your problems is to put the shoe on the other foot. Get some of those IT desk jockeys to come and spend a week in the plant and see things from your perspective. But the converse is also true. Spend a week on a helpdesk or managing the business infrastructure and you will soon see why they aren't keen for your wonderful new DCS to be hooked up to the business LAN, or why they don't like non-standard software installed on work computers.

Empathy is a wonderful thing.

Chris Jennings

By Brian E Boothe on 9 January, 2007 - 10:41 pm

I've been following this thread for a while now, and it seems that a lot of IT organizations have too many tightly wound, one-dimensional people running the show - and not just in the control industry. I can say I know for a fact they have no idea about control issues or automation software. I'm basically the opposite of the scenario you see in most IT organizations and offices. IT and Control should merge into one application. If the administrator wants to check his/her internal company email and check production on the floor, he/she can do it all on one machine in the office, through a spam filter gateway/router combination with logins, all on a local intranet with segmented IP addresses.

IT guys, stop being so anal about everything; by the book isn't always the way to go.

By Curt Wuollet on 6 January, 2007 - 12:54 am

But, Michael,

We are talking about IT vs. Maintenance. By placing themselves in the center of IT's domain, it is nearly inevitable that they will fall into the bailiwick of those entrusted with the Microsoft franchise. Management will see largely parallel efforts and most likely side with the professional reboot, reload, and upgrade crew taking care of it all. It only makes sense from their perspective. When you make MS a part of your production systems, you marry IT. The conflict is inherent and comes with the territory. It's all about territory. If you start doing house power, you get involved with electricians, hanging pipe you hang with plumbers, etc.

Regards

cww

By Michael Griffin on 4 January, 2007 - 12:56 am

In reply to Nathan Boeger: I am in the middle of a PC based automation project. The customer for the software was offered an OS choice of either MS-Windows or Linux. They chose MS-Windows and I'm not inclined to argue with them if that's what they want. The software was developed and tested on Linux and deployed on MS-Windows (it runs on either), so the project has given me a fairly direct side by side comparison of what it is like to manage MS-Windows versus Linux in the same industrial application. It was a lot easier to set up a computer with Linux for the application than it was using MS-Windows.

For most manufacturing people, though, asking them what OS they want is like asking them whether they want an Intel CPU or an AMD CPU. They don't know the difference and they don't want to know the difference. Most IT departments are going to oppose any change in their daily routine the same way they opposed introducing PCs in the days when "IT" meant mainframes. IT departments tend to prefer vendors who take their department heads to "workshops" in sunny climates, and they roll their eyes at people who think that saving other departments' money matters to them.

If you want to sell something different to people, sell them an advantage not a name. If you try to sell them a name they will stick with the name they already know.

If my project time line had allowed it, I would have asked the customer if they wanted to have a high reliability PC with no hard drives and no fans. If they said yes, I would have shown them the hardware and the software to do it. The software would have included using a Linux OS, because that's a low cost off the shelf solution for this sort of application. Reduced downtime is something that people can understand and it's the sort of question they want to be asked about.

By Curt Wuollet on 4 January, 2007 - 1:07 am

Hi Nathan

The point that almost everybody misses is that you don't have to really convince the users <dons flameproof union suit> - they are used to running whatever comes in the box. Seriously. Automation folks already run nearly everything with non-standard, single-sourced binaries that have no commonality to speak of. So most of automation could be running on Linux and no one - well, very few - would ever know. What runs on your SLC, your S7-400, your converter box, your intelligent sensor, etc.? These could all run Linux under the covers, and some of recent vintage probably do. After all, Wind River, whose products grace lots of embedded gear, works with Linux now. SixNet produces an RTU and friends that run on Linux. Your phone may well run Linux.

The only places where people would notice are in tools, SCADA, HMI, and the various comms and IPC schemes designed to ensure that every automation project includes a tithe to the church of Gates and a Windows box as its least reliable component.

None of those functions really depend on Windows; that is, there is nothing there that can't be done with any other graphical OS. One would need a replacement for OPC and a few other Windows toll booths, but I doubt anyone would shed any tears, since most are not supported by MS anymore. Replacements built for automation purposes would probably be simpler, easier to use, and more reliable. Vista is going to be a much larger PITA than switching to Linux would be, and may even dump some of these old favorites.

The big problem here, the "skunk on the table", is that this would be very doable in a cooperating world, but may well be impossible with the degree of consensus and cooperation our vendors exhibit. This skunk prevents anything from being standardized or even modernized to a large extent, but it would particularly thwart shedding the MS lock-in. MS is the only thing they have ever agreed upon, and only because they were forced to by monopoly omnipresence.

As it's New Year's Day, I will make a prediction that some company will make a 2007 move. Probably an EU company, where the legislators and regulators and everyone else aren't totally owned by Microsoft, and where the current anti-trust trials would discourage Microsoft from retaliating. Witness what happened when MA tried to mandate the Open Document Format.

Regards

cww

By Michael Griffin on 5 January, 2007 - 10:39 pm

In reply to Curt Wuollet - I don't know if it's possible to get any further off topic, but with regards to your statement about a replacement for OPC the following might interest you.

It occurred to me several years ago that the primary function of OPC is to allow one or more programs to access data by tag names without being directly linked to the driver. When looked at this way, a tag read operation resembles a database lookup. That is, the tag name is the "query key", with a value (or values) being returned similarly to a database result. The actual communications driver would be making selects and updates on one side, while the application(s) makes corresponding selects and updates on the other side. Dealing with databases is a very common and well established technology. There is very little, if anything, new that would need to be developed to make use of it.

While you could theoretically use a conventional database for this, I recently came across the following project for making a program's internal data structures accessible as if they were a database. It uses a subset of the Postgres protocol.

http://www.linuxappliancedesign.com/projects/rta/index.html

I think the applications for this are obvious when you realise that the service being described on the web site listed above could just as easily be a Modbus driver. Even better still would be a common server with "pluggable" drivers. I haven't tried this library, but I suspect it wouldn't take too much effort to have a working implementation up and running.
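The tag-read-as-database-lookup idea above can be sketched in a few lines of Python. The class and tag names here are hypothetical illustrations of the concept, not the actual API of the RTA library linked above:

```python
# A minimal sketch of "tags as database rows": the driver updates values
# under tag names, and applications select by tag name (the "query key")
# without ever linking to the driver. All names are hypothetical.

import threading

class TagStore:
    """In-memory tag table shared by a comms driver and applications."""

    def __init__(self):
        self._tags = {}
        self._lock = threading.Lock()

    def update(self, name, value):
        """Driver side: store a freshly polled value under its tag name."""
        with self._lock:
            self._tags[name] = value

    def select(self, name):
        """Application side: look up a value by tag name; None if absent."""
        with self._lock:
            return self._tags.get(name)

# A Modbus driver would call update() each poll cycle; the HMI calls
# select() through the common interface, knowing nothing about Modbus.
store = TagStore()
store.update("PUMP1_RUNNING", True)
store.update("TANK1_LEVEL", 73.2)

print(store.select("TANK1_LEVEL"))   # 73.2
```

A "pluggable" driver scheme would just mean several drivers calling `update()` on the same store.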

By Curt Wuollet on 6 January, 2007 - 3:38 pm

Exactly!
Any number of schemes could be used; the only trick is to get any two vendors to use one. I don't think OLE was chosen because of its merits, but because it was all that was provided.

Regards
cww

By Davis Gentry on 5 January, 2007 - 12:00 am

Why would anyone load Outlook on a production machine?

I don't think that I have ever seen it there. And you can load embedded Windows versions (I started with embedded NT, and have some installs running XPe) without IE. Haven't tried it under XP, but the Add/Remove Windows Components tool does offer the option - I'll try it on an old laptop after I get back home from this trip. One you missed is Windows Messenger - I think that is a pretty major vulnerability, and one I remove on all PCs. The latest versions of Media Player are also potentially problematic, and I remove that on production XP machines.

I do want to point out again that I am not knocking Linux - our latest and greatest motion control processor actually runs a real time Linux kernel, so I'm in the process of getting back into the swing of *nix for the first time in a decade.

For Curt and the other Linux guys out there - we are including a web server on the processor with a web app that allows a GUI into the processor. What kinds of vulnerabilities should I be looking out for here?

Davis Gentry

By Michael Griffin on 6 January, 2007 - 12:31 am

In reply to Davis Gentry (with regards to removing MS-Windows programs) - You don't usually have to "load" MS-Outlook on a computer. If you buy a typical desktop computer from a major vendor it comes with MS-Outlook (and a lot of other junk) already loaded. It's more a matter of whether anyone had the time to try to remove all the excess junk before putting it into service. Computers from major OEMs are used as marketing channels for extra software and services. The OEMs get paid to put a lot of the third party junk on there. Locally built clones have less of this, and computers that you build yourself of course have the least of all.

As for removing MS-IE, that's not really possible. The MS-IE libraries are part of the basic GUI. You can remove the stub that loads the front end, but you can't remove the DLLs that make up most of MS-IE (at least without losing a lot of functionality). That's why many third party programs will list a particular version of MS-IE as a dependency even if they have no obvious relationship to the internet. Many of the MS-IE vulnerabilities are in the DLLs, so the problems are still there if someone can find a way to trigger them.

In many (if not most) cases, when you "remove" a program that comes with MS-Windows, you're not really removing it. You're just at most removing the loader stub. While I believe that MS-Windows XP Embedded comes with software that allows more detailed removal (which is basically the difference between it and regular XP), some of the libraries we are talking about are fairly fundamental to MS-Windows.

For most people who are using MS-Windows in an automation project, this really isn't an option. They can't remove enough of it to matter without becoming experts as to what each and every DLL does and what program needs it. The whole point of using MS-Windows was supposed to be that they could just buy a computer with it already loaded and not have to worry about that sort of thing in the first place.

========

With regards to your question about security of a web application, that depends upon a number of questions (what web server are you using, is there any scripting available and what kind, what add-on modules are you using, is there a database present, etc.).

Some web site problems are due to holes in the web server itself or in add-on modules (for encryption, server side scripting, or other tasks). If you are using Apache (or a re-branded Apache) this isn't too common.

Many problems, however, are actually application vulnerabilities or poor configuration. Some common application vulnerabilities are:

- SQL injection in a database.

- Failing to set the proper access permissions on the directories that hold the web pages (allowing the user to access directories they shouldn't be able to get to).

- Counting on cookies or Javascript being turned on to prevent the user from otherwise doing something they shouldn't (e.g. counting on client side redirect).

- Trusting the contents of cookies, or the URL in a GET string. People can (and do) modify these.

- Security by obscurity. People can guess the name of a web page and ask for it directly. You can't count on them getting there by an "approved" route.

- Leaving passwords in an accessible file.

- Allowing people to upload files without properly controlling where the files end up. Someone could upload a file which replaces one of your own critical web pages, in which case they can script the web page to do something for them they don't otherwise have direct access to.

- DOS attacks. There's probably not much you can do about this except have lots of bandwidth.

- A lot of problems these days are not due to a single vulnerability, but rather to combinations of several.

Most of the above are really problems for publicly accessible web sites. If your web server is embedded in a device and is being used as an MMI on a network that isn't publicly accessible, then probably the most serious problems would be either some sort of script injection or allowing access to pages not intended for the user (e.g. the user can get at a system configuration page by typing in the URL directly).
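Two of the items above - SQL injection and direct-URL access outside the web root - can be sketched in standard library Python. The table, tag names, and web root path are hypothetical examples, not drawn from any particular product:

```python
# Sketch of two defenses from the list above: parameterized queries
# against SQL injection, and path containment against direct-URL access.
# All names and paths are hypothetical.

import os.path
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tags (name TEXT, value REAL)")
conn.execute("INSERT INTO tags VALUES ('TANK1_LEVEL', 73.2)")

def read_tag(name):
    # The user's input is bound as a parameter, never spliced into the
    # SQL text, so a name like "x' OR '1'='1" cannot alter the query.
    row = conn.execute("SELECT value FROM tags WHERE name = ?",
                       (name,)).fetchone()
    return row[0] if row else None

WEB_ROOT = "/srv/www"

def is_safe_path(requested):
    # Resolve the requested path and refuse anything that escapes the
    # web root - ".." sequences are the classic direct-access trick.
    full = os.path.normpath(os.path.join(WEB_ROOT, requested.lstrip("/")))
    return full.startswith(WEB_ROOT + os.sep)

print(read_tag("TANK1_LEVEL"))        # 73.2
print(read_tag("x' OR '1'='1"))       # None - injection attempt fails
print(is_safe_path("index.html"))     # True
print(is_safe_path("../etc/passwd"))  # False
```

Authorization checks for "pages not intended for the user" would sit on top of the same path check, mapping each resolved path to a required privilege level.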

By Curt Wuollet on 6 January, 2007 - 3:29 pm

In answer to the last question, on a private network there are few non-obvious vulnerabilities. Open that network to the World and while vulnerabilities can be limited, it will need management. Automation folks tend towards set it and forget it. But it is several orders of magnitude less likely to be compromised than simply plopping another Windows box on the net.

Regards
cww

By Michael Griffin on 31 December, 2006 - 1:07 pm

In reply to Nathan Boeger: A "SCADA appliance" could be an integrated set of packages that run on PC hardware which was purchased separately. You would then install over top of (erasing) whatever operating system came with the PC (if any). There are "network security appliances" that work like this.

The key to the "SCADA appliance" would be a good system management package that takes care of all the installation and administration tasks of *all* the software in the system. Something like this should be a lot easier to manage than current SCADA systems because you are dealing with all of the software through a single consistent interface rather than half a dozen (or more) different ones.

By Curt Wuollet on 31 December, 2006 - 8:35 pm

Hi Michael, Nathan, et al.
This would indeed be a major improvement, and it is doable. The very large number of Linux distributions available demonstrates that a company could "own" the Linux their package runs on and have far better control over the lifecycle of the product. Upgrades could be coordinated, and no one, not even a predatory monopoly, could simply declare all your hard work obsolete. This could be done with very limited resources, as you can start with a good distribution and simply maintain it as long as you want. All necessary upgrades would be provided by the community and could simply be reviewed for applicability. Many "old" releases are well supported because there are still large numbers of people using them. The work needed simply to ensure the product continues to run well would be far less than what MS churn requires. And if they write reasonably portable code, there would be minimal disruption when they want to rev the whole package, because unlike with MS, they could know exactly what changes are happening in the base distro as they happen, and plan as they go along.

The cost savings would have to be large and the number of crunch events would be close to zero with good management. This is in comparison with developing on Microsoft's rather erratic schedule with bumps and panics every so often even after the release and first rounds of fixes. Automation people have shown they greatly prefer not to mess with that which is not broken and this would make it practical and practicable to accommodate them.

As for not having Windows on their automation machines, I think people would deal far better with that than all the consequences of having Windows on their automation machines. The OS drops into the background (or should) and the operations become prime.

Regards
cww

By Michael Griffin on 4 January, 2007 - 12:48 am

In reply to Curt Wuollet: If someone wanted to use a Linux OS in their "SCADA appliance", probably the best strategy would be to base the OS part of it off a major well established "full featured" distribution. You could track that distribution, and just have a different standard package list (i.e. strip out the stuff that is not required, and add in your own).

It is important to remember that "stability" is not the same thing as "stasis". New releases are good if they fix bugs in the old ones or provide features that people are waiting for. What people are going to want to know is that the new release is going to continue to work on their hardware. The only way to really know that is to try it out and the more people trying it out the better.

The SCADA appliance market is likely to be small compared to the general computing market. If you track another much larger distribution, you can take advantage of their larger user base in testing the software you have in common with them (which would be most of it).

What would differentiate the SCADA appliance distribution from the one it is based off would be the SCADA software itself (of course), the different standard package list, and the system management package (which would be SCADA oriented).
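The "different standard package list" idea above amounts to set arithmetic over the base distribution's package manifest. A toy Python sketch, with entirely hypothetical package names:

```python
# Deriving an appliance package list from a tracked base distribution:
# strip what the appliance doesn't need, add the SCADA-specific packages.
# All package names here are made up for illustration.

base_distro = {"kernel", "libc", "xserver", "openoffice", "games", "apache"}
strip_list = {"openoffice", "games"}          # desktop bloat, not needed
scada_packages = {"scada-hmi", "modbus-driver", "appliance-mgmt"}

appliance = (base_distro - strip_list) | scada_packages

print(sorted(appliance))
```

On each upstream release, only `strip_list` and `scada_packages` need review; everything else rides on the larger distribution's testing.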

By Curt Wuollet on 6 January, 2007 - 1:07 am

> In reply to Curt Wuollet: If someone wanted to use a Linux OS in their "SCADA
> appliance", probably the best strategy would be to base the OS part of it off
> a major well established "full featured" distribution. You could track that
> distribution, and just have a different standard package list (i.e. strip out
> the stuff that is not required, and add in your own). <

Exactly.

> It is important to remember that "stability" is not the same thing
> as "stasis". New releases are good if they fix bugs in the old ones or
> provide features that people are waiting for. What people are going to want
> to know is that the new release is going to continue to work on their
> hardware. The only way to really know that is to try it out and the more
> people trying it out the better. <

But you can limit churn to truly relevant issues. And you have the benefit of letting the general interest audience for the parent distribution be the guinea pigs. No, you shouldn't lock things down for the duration, but you can ignore change for change's sake and keep change to a tolerable level, keeping in mind the end purpose for the system. If all you interact with is the application, much of the noise is irrelevant.

> The SCADA appliance market is likely to be small compared to the general
> computing market. If you track another much larger distribution, you can take
> advantage of their larger user base in testing the software you have in
> common with them (which would be most of it). <

Should be almost all of it for high level apps like SCADA. It would be akin to the "stable tree, development tree" model used in much of Linux development. Change the stable tree only when there is need, or at least a really good reason, and only with well tested code.

> What would differentiate the SCADA appliance distribution from the one it is
> based off would be the SCADA software itself (of course), the different
> standard package list, and the system management package (which would be
> SCADA oriented). <

Yes, a stable and known reliable working set relative to the application. Mostly a stable set of libraries and any utilities used so you don't have the Linux equivalent of dll hell.

In short, you could ensure that things always work as intended, which is what Linux is famous for: "It just works". That holds until the _customer_ wants to change, at which point you provide another stable package.

Regards

cww

By Nathan Boeger on 31 December, 2006 - 8:37 pm

Curt,
Thanks for the clarification. I agree completely with what it would take to be viable for the customer. Unfortunately, I've yet to see a SCADA vendor take this approach. To make matters worse, selling new ideas to end users seems really difficult when it comes to controls. Ironic for an industry that's been historically defined by innovation. Perhaps open source projects will pick up the pace in terms of viability. They could fill in some gaps here.

----
Nathan Boeger
http://www.inductiveautomation.com
"Design Simplicity Cures Engineered Complexity"

By Curt Wuollet on 31 December, 2006 - 12:44 pm

On Dec 29, 2006 5:21 pm, Davis Gentry wrote:
<clip>MS products from 2000 on have been quite stable in my experience if properly configured and administered. <clip> And what OS has the most hardware drivers written for it? Yes, you can of course write your own drivers for Linux. Then your customers are really caught in a bad place. Five year from now, after a lightning strike takes down their PC and cards, and they find out that they can no longer buy the same card from their vendor, and then they find that their driver no longer works with the firmware rev of the new card. Now if they bought source code with the original install they can go out and find a programmer who is capable enough to rewrite the driver (said programmer generally NOT found in most plants in the US, or on the nearest streetcorner, either).<clip>

First of all they should have received the source free with their Linux distribution
or at least had the option. And Linux drivers seldom depend on the manufacturer, they are typically written by the user base or sometimes Linux vendors.

Old hardware is far better supported under Linux, and continuing support is the rule rather than the exception because, arguably, many people run old hardware on Linux. There are projects like Comedi that support cards that haven't been sold for a decade. That's because the drivers are generally written by users, without a profit motive that demands that last year's equipment be forgotten. I know this is the case because I use old hardware, and it is very unusual for me to have any sort of compatibility problem when I load a new version of Linux. When I have, I contacted the maintainer, and so far they have been good about revving the driver for a new kernel.

Why don't you suggest that to your ABC card company? So, you don't need a kernel programmer in your plant and I surely hope they wouldn't
be on the street corner. But they are accessible real people and will usually
be glad to help if treated nicely.

And how does using Windows alleviate this problem? Drivers are single-sourced, and if the company isn't interested in supporting what it has obsoleted, you are out of luck. If they think you should buy the new card, you are out of luck. If they decide not to support your version of Windows, you are out of luck. The drivers are binaries and can't be revved or fixed except by the vendor, and only if there's money in it. My "old" PC won't even boot new Windows due to hardware support, but it will run the two current Linux CDs I've tried. To put it another way: pick any card supported by both, and I'll bet I can get a broken Linux driver fixed sooner than you can get a broken Windows driver fixed, if ever.

Davis Gentry: <clip> They also find out that the Linux kernal they were running is not compatible with the tools that the developer is currently using, so they need to upgrade OS as well. Maybe at this time they also have to rewrite, or at least recompile, their HMI.>

Most driver writers I know use a text editor and GCC. And unlike MS developers, they can simply use the tools and libraries you have, because those tools came with your version of Linux and are guaranteed to work with it; they were used to compile the kernel and drivers on the machine. Very few Linux drivers are dropped or unmaintained, because there is no profit incentive to make people upgrade. This will continue as long as binary drivers are discouraged by the community, because they are bad for the community.

Davis Gentry: <clip> While I personally greatly prefer to have source to any system on which I work, I have seen far too many end customers who own the source and are at best a menace if they try to do anything with it. Also a few integrators. >

Perhaps, but having recourse is much better than having no recourse. I don't know that Linux developers will always be more accessible for help than Windows developers, but I can email some pretty important Linux folks and get answers. Somehow this doesn't work at Rockwell.

Michael Griffin: >An operating system which is in a configuration used for typical desktop applications will not meet the security requirements laid out in the studies. If it is possible to configure a system to meet these requirements it would require more knowledge than all but a very few SCADA integrators or plant operators possess.
>
> This is perhaps a valid point - and if the SCADA integrators cannot set up a secure system with Windows, why would you think they can with Linux?>

A Linux system in a configuration used for common desktop use doesn't have several of the most frequently exploited holes left open. And an install option will lock things up fairly well. I can enable the NSA extensions and have pretty good security out of the box. And it doesn't break everything.

Davis Gentry: >My original point stands. A competent integrator can set up a good (good = stable, secure, and supportable) system with many of the OSs available today. I am certain that Curt can set up something at least as capable in Linux as what I can set up with XP. And I know from over fifteen years of experience with automation systems under Windows (and a few *nix systems) that I can set up a good system under Windows.>

But that does nothing to deny the many other benefits folks would have with a truly open and customizable system. Telco, automotive, military, infrastructure, machine builders, and even NASA are seeing this. Right now, you have to be able to automate without packaged PLCs or shrinkwrap software to be able to use Linux, but the reasons must be compelling to interest these people when they could take the easy way out. I think it's safe to say they all have experience with Microsoft.

Regards
cww

By Brian E Boothe on 5 January, 2007 - 10:33 pm

I've tried the Linux experience on several of my customers, and the biggest complaint is software installs and Flash/Java/Adobe configuration and installation.

Linux is a great OS for wire-heads and computer geeks like myself who have been in computers since '78, but plop Linux in the lap of a non-computer-experienced individual... sure, it's free, but you as the enabler will feel the burn after two weeks, and they'll be running to you every day with questions and driver issues. It's really not worth the headache in my opinion. I have three Linux machines running in my basement. I love Linux, but I also know how to write device drivers and shell scripts, and I'm also in the "keep Linux geeked, no noobs" camp.

By Curt Wuollet on 6 January, 2007 - 3:33 pm

Hi Brian. What you describe sounds a lot like general office use, and the problems there are most certainly different from setting up a particular application on a system whose purpose is to run that application - like a SCADA system. Once you solve the problems for that, they should stay fixed.

I have done office settings as well and they are full of headaches, with either OS. But switching to Linux has impediments like hardware that only works on Windows, people who will not accept any change in how they do things and people who want non-business related software on "their" machine. The best way to deal with these is to solve them before you begin and get management on board so you can say NO until the dust settles a bit. One of the biggest problems I had was a saboteur who worked hard to wreck things because MS Golf wouldn't run anymore. It will all work out if people want it to work.

Regards
cww

By Michael Griffin on 31 December, 2006 - 12:58 pm

In reply to Davis Gentry - You have replied to several messages in your posting. I will deal with the points directed towards me first.

> Davis Gentry: "This is perhaps a valid point - and if the SCADA integrators
> cannot set up a secure system with Windows, why would you think they
> can with Linux?"

You appear to have misread my statements. I didn't say that a typical integrator was more capable of setting up a secure SCADA system with Linux than they were with MS-Windows. I simply said that a typical integrator was not capable of setting up a SCADA system which met security requirements which are beginning to be expected in today's systems.

These requirements include those of the "SCADA and Control Systems Procurement Project", the members of which represent a number of large SCADA customers. These standards as presently drafted will require extensive modification to an operating system which is configured for "office desktop" applications. Furthermore, these new security standards (as listed in the current draft) require that the SCADA vendor take responsibility for maintaining the on-going security configuration of the complete system (not just the SCADA application itself), and that their responsibility for the system does not end with installation and acceptance but rather continues for years (presumably for the lifetime of the installation). While not every customer is likely to adopt these standards a number of important ones are, and others are likely to be influenced by them.

I also didn't say that any particular operating system was more suited than another to this application. I mentioned Linux only to point out the fact that this sort of full package support is exactly what the major Linux vendors do for a living (primarily for large critical enterprise applications), so this sort of requirement is not unreasonable or unprecedented. It is however new to most of the automation business. These major Linux vendors do not offer this same support for SCADA systems, but it is the business model rather than the particular companies or products that I am referring to.

For this sort of business model to be useful in SCADA applications would require the SCADA vendors to offer the sort of "single point of responsibility" that major Linux vendors do for enterprise type applications. As to what operating system a vendor may choose to base this sort of business model on is another issue altogether. Simply having an integrator install a different operating system (what ever that may be) while otherwise doing everything else the same as before doesn't really address the issue.

> Davis Gentry: "My original point stands. A competent integrator can set
> up a good (good = stable, secure, and supportable) system with many
> of the OSs available today. I am certain that Curt can set up something
> at least as capable in Linux as what I can set up with XP. And I know
> from over fifteen years of experience with automation systems under
> Windows (and a few *nix systems) that I can set up a good system
> under Windows.">

I won't argue as to whether you could initially set up a good (stable, secure, supportable) system under MS-Windows. No doubt you are much better than a typical integrator in that respect. It is however completely irrelevant for several reasons. The first reason is that most integrators are not capable of this. They are experts in manufacturing or process industry automation, not in operating systems, databases, and networking. Even for specialists in operating systems, databases, and networking, only a very small percentage of those people are security experts.

The second reason is that most integrators operate on a project by project basis. They get paid to complete specific projects and they don't maintain a security lab researching future problems. Presently the industry operates on an "ignorance is bliss" basis and an integrator would only get involved in a problem when a customer hires them to fix a specific existing problem. The new security requirement is that someone looks for security problems, finds solutions to them, and provides the customer with a solution. Note that this is not the customer looking for problems and hiring someone to fix them; this is someone doing this for the customer without prompting on a continuous basis.

Finally, many of today's security problems do not arise from problems with single software packages on their own. They result from combinations of packages when working together. That is, a flaw in one software package may not on its own pose a security problem, but it may result in a serious one when combined with a different flaw in a separate package. This means that relying on each independent package's author to be responsible for security when looking at their own package in isolation won't work. Some party has to take responsibility for how all packages work together. Today, nobody does this for typical SCADA or MMI systems.

I don't see this as an issue that integrators can properly address. Rather, I see this as an issue for SCADA and MMI package vendors, the solutions for which integrators then apply. The role of the integrator would change very little from today, with the exception that the customer could assume that the "default install" version of the base packages would meet all the required security standards. The residual security risks would then be whatever application level problems the integrator may have created in any code they may have written for the application, or in any unsupported software that may have been added to the system by the integrator. This is a much more practical scope of responsibility for the integrator and one that more closely corresponds to what most would believe they are presently assuming.

If you want a practical example, let's assume we have the following (rather simple) problem. Suppose as part of a SCADA application, we have a third party OPC server feeding data into a third party database via ODBC. All are running on the same MS-Windows operating system. Now suppose someone discovers a way to send data via that OPC server which causes the database to stop logging certain data.

Now whose responsibility is this? We have five parties involved. The SCADA vendor will say that while they are compatible with OPC and databases via ODBC running on MS-Windows, they didn't sell any of this third party software so they aren't responsible for it. The OPC vendor will say that their server is just passing through data, and it's not their responsibility as to what that data does to anything else. The database vendor will say that they've never heard of OPC, so they "don't support it". The operating system vendor will say "try reinstalling Windows". The integrator will most likely say that the software was all from the customer's approved list of vendors, so it's not his fault (if you would like to specify a different set of software though, he'd be happy to give you a quote on installing it).

Under the proposed "single point of responsibility" though, the SCADA vendor would discover the problem (or analyse it when it is reported to them), come up with a solution, and download a patch to their customers' sites in a timely fashion (before most customers even knew there was a problem). There would be no question about who is responsible; the SCADA vendor is responsible for everything in the base packages. They may not write everything, but they would supply and take responsibility for all of it.

The above addresses security and other long term support concerns. If you have any valid complaints about it, I would think they would more properly be the flexibility you would be giving up if you want to remain within the vendor's support terms. While the vendor would supply and support everything they require, that support would be limited to those packages only. This is a limitation however that many customers may be willing to live with if it solves their long term support problems.

Your reply to Curt Wuollet discussed a number of points which are essentially questions about long term support of software. I would like to point out that that is exactly the issue which I have said is not being addressed under the current model.

> Davis Gentry (in reply to Curt Wuollet): "Then your customers
> are really caught in a bad place. Five year from now, after a lightning
> strike takes down their PC and cards, and they find out that they can no
> longer buy the same card from their vendor, and then they find that their
> driver no longer works with the firmware rev of the new card. Now if they
> bought source code with the original install they can go out and find a
> programmer who is capable enough to rewrite the driver (said programmer
> generally NOT found in most plants in the US, or on the nearest
> streetcorner, either). They also find out that the Linux kernal they were
> running is not compatible with the tools that the developer is currently
> using, so they need to upgrade OS as well. Maybe at this time they also
> have to rewrite, or at least recompile, their HMI.>

This is an interesting example, because it more or less describes the problem I had with a production test system (not a SCADA system). The operating system used was an "MS product" (as you would put it) though, not Linux. The board was still working, but the old driver would not work on a new computer. Cascading software and hardware incompatibilities meant that buying a different board meant replacing the entire test system when all I really wanted to do was replace the computer hardware.

In my case the data acquisition card vendor refused to even discuss making any changes to fix their driver. Their point of view was that the card was no longer a current product, so they were not going to devote any developer hours at all to making what they had admitted was a very trivial change to their driver (basically changing a constant and recompiling). The board vendor was one of the largest and most reputable companies in the field, so it's not as if we were dealing with a "bad vendor".

In the end, I was able to get enough information from several sources to reverse engineer enough of the board to control it without using their driver. I didn't get complete functionality, but I did get the features we were actually using. This let us extend the life of the test system out a few years to match the life of the product it was testing.

The problem you are describing is actually one of proprietary drivers, not choice of operating systems. You complain that even if you have the source code, it is hard to fix. I can assure you from my own experience that it is a lot harder to fix with no source code at all. If the driver code had been publicly available, it is likely that someone else would have run into this problem before I did and already fixed it, so that all I would have had to do was download the new version.

Note that this isn't an "MS versus Linux" argument. It is definitely not Microsoft's fault that a data acquisition vendor doesn't support their own products any more. It was, however, one of the situations that convinced me that using whatever the current Microsoft product is won't guarantee that your software will continue to work over the long term.

There is a larger scale problem along these lines that is coming up in the future. I am expecting that a lot of people are going to get very seriously burned by this one. 64 bit PC hardware has been on the market for several years now, and can be considered to be mainstream. Microsoft is one of the last operating system vendors to begin supporting the hardware in 64 bit mode. (Most people are using the 64 bit hardware in 32 bit mode. Linux was ported to 64 bits years ago though, so most Linux drivers and application packages have already been ported to 64 bits and people are using them.)

There have recently been 64 bit versions of MS-Windows available, but there is virtually no software or drivers to support it. Over the next couple of years though, "mainstream" MS-Windows is expected to move to 64 bits for the standard versions.

The problem which is arising is that the 64 bit version of the new version of MS-Windows ("Vista") will require drivers to be cryptographically signed by Microsoft before the operating system will load them. To get the drivers signed by Microsoft will require that the driver author is listed as an "approved vendor" by Microsoft. Microsoft has stated that they will only approve a limited number of vendors, as they need to keep the vendor list small enough to be manageable.

The reason for this new requirement is to ensure that nobody writes a driver that can be used to bypass the DRM (copy protection) systems for music or movies. Microsoft sees entertainment as a big future market, and they are positioning themselves as a "secure platform" (from a copy protection standpoint) so they can make distribution deals with the major media outlets. They aren't going to leave any backwards compatibility "holes" open for very long, because then the music pirates would use these same holes which would defeat the whole point of signed drivers.

Where this leaves people in the automation business who need to use drivers from relatively obscure (as compared to major consumer goods) companies that aren't on the "approved vendor" list is a good question. I suspect that it will leave them nowhere at all and wondering what to do next.

By Davis Gentry on 4 January, 2007 - 11:43 pm

Michael -

Thanks for the clarification. I did misunderstand your point.

> As to what operating system a vendor may choose to base this sort of
> business model on is another issue altogether. Simply having an integrator
> install a different operating system (what ever that may be) while otherwise
> doing everything else the same as before doesn't really address the issue. <

I agree fully.

> this is not the customer looking for problems and hiring someone to fix
> them; this is someone doing this for the customer without prompting on a
> continuous basis. <

Very good idea.

> I don't see this as an issue that integrators can
> properly address. Rather, I
> see this as an issue for SCADA and MMI package
> vendors, the solutions for which integrators then apply. <

I'm not sure that I agree with this. I see security risks in two major areas - local access (i.e. operations, maintenance, engineering, and IT with direct access to the computer), and remote access, usually across a corporate network. Many local access problems can be limited by setup of the computer - password access levels being the major necessary step - if the operators have no root and/or registry privileges (regardless of OS). Having no keyboard and limiting file input capabilities also limits possible problems. Limiting or excluding internet access from the operator end also blocks a number of vulnerabilities. Another good tool is the use of a comprehensive security tool such as Norton, or Panda, or whatever your preferred package is. This is an area where coordination with local IT can be useful - they can work with production/engineering to keep the package up to date (and no - I do not advocate auto update).
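The privilege-level idea applies inside the application as well as at the OS level: operators get only what their job requires, and anything unknown is denied by default. A minimal sketch in Python, where the role and action names are purely illustrative (not taken from any particular SCADA or HMI package):

```python
# Hypothetical privilege map for an HMI application.
# Role and action names are made up for illustration.
ROLE_PERMISSIONS = {
    "operator": {"view", "acknowledge_alarm"},
    "maintenance": {"view", "acknowledge_alarm", "edit_setpoint"},
    "engineer": {"view", "acknowledge_alarm", "edit_setpoint", "edit_logic"},
}

def is_allowed(role, action):
    """Return True only if the role's permission set includes the action.

    Unknown roles get an empty permission set, so they are denied everything
    by default - the safe failure mode for a control system.
    """
    return action in ROLE_PERMISSIONS.get(role, set())
```

So an operator can acknowledge an alarm but cannot edit logic, and an unrecognized login can do nothing at all, mirroring the "no root/registry privileges for operators" rule above.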

The maintenance and engineering people have greater access, which also leads to greater vulnerability. I'm not sure what, if anything, can be done to completely remove all vulnerability. If you have the access you can cause damage. Anyone know of any way around this? If so I would love to get details on that.

Remote access vulnerabilities are easier to stop. Cut off all unnecessary services on the computer and greatly limit what anyone can do from outside. Pass production data through directories which are read only from outside. If you have to open a space for write access (recipe information for example) then limit the types of files it will accept, and keep a process running that auto deletes anything else as it is written.
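The "limit the types of files it will accept, and auto-delete anything else" idea could look something like this sketch in Python; the allowed extensions are hypothetical examples, not a recommendation for any specific format:

```python
import os

# Hypothetical allow-list of file types accepted in a recipe drop directory.
ALLOWED_EXTENSIONS = {".csv", ".rcp"}

def sweep_drop_directory(path):
    """Delete any regular file whose extension is not on the allow-list.

    Returns the names of the files that were deleted, so the sweep can be
    logged or alarmed on.
    """
    deleted = []
    for name in os.listdir(path):
        full = os.path.join(path, name)
        if not os.path.isfile(full):
            continue  # leave subdirectories alone
        ext = os.path.splitext(name)[1].lower()
        if ext not in ALLOWED_EXTENSIONS:
            os.remove(full)
            deleted.append(name)
    return deleted
```

In practice this would run on a timer or a filesystem-notification hook, so unexpected files disappear almost as soon as they are written.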

Again - security is an area where local IT can often help, and is OS dependent only insomuch as local support (engineering or IT) may be trained and experienced in one OS more so than on others. Am I missing anything here?

Thanks,

Davis Gentry

By Michael Griffin on 5 January, 2007 - 11:08 pm

In reply to Davis Gentry (on security configuration) - There are several sides to the security question.

1) Local access is a question of passwords and privilege levels. This sort of set up should be routine, and therefore for a "SCADA appliance" the complete system (including OS) configuration should be automatic (possibly with a choice of several usage profiles plus optional manual changes). This should be part of the SCADA system management software, and not require third party packages or low level tweaking for a typical installation.

2) Remote access is actually a more difficult problem, not a less difficult one. It's all very well to say "cut off all unnecessary services on the computer", but that can be more difficult in practice than it is in theory. Firstly, if you are using MS-Windows as the OS, it has a bad habit of turning on (and needing) all sorts of "services" (daemons) that listen to various ports even for purely local needs. There are a number of worms that have exploited this problem in the past. Furthermore, the services that *are* necessary can still be a security problem.
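Finding out which services are actually listening can at least be checked from the machine itself. A rough sketch in Python; a real audit would use netstat or a proper scanner, and the port list passed in is whatever the site cares about:

```python
import socket

def probe_local_ports(ports, timeout=0.2):
    """Return the subset of `ports` that accept TCP connections on localhost.

    Anything in the result is a service listening locally; on a dedicated
    SCADA machine each one should be accounted for or shut off.
    """
    open_ports = []
    for port in ports:
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.settimeout(timeout)
        try:
            if sock.connect_ex(("127.0.0.1", port)) == 0:
                open_ports.append(port)
        finally:
            sock.close()
    return open_ports
```

Running this periodically and comparing against an expected list gives a cheap alarm for daemons that an update has silently re-enabled.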

Secondly, software bugs can cause vulnerabilities that need to be patched regularly. There are also cases where the problem is an original misconfiguration that must be corrected. This is where the idea of the SCADA appliance vendor taking long term responsibility for the software comes into the picture. They would continuously address these problems.

However again, for configuration issues the "SCADA appliance" system management package should provide standard system configurations that address these needs for typical installations.

3) There is a problem which occurs with many MS-Windows updates or driver installations where some system settings get reset back to the defaults when unrelated changes take place (I had this happen when installing a network card for someone last week). This could happen at anytime after the original security configuration. Again, the SCADA appliance system management package could run regular audits and report on configuration changes.
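The audit idea in point 3 amounts to recording a known-good snapshot of the configuration at commissioning and periodically comparing against it. A minimal sketch in Python using content hashes; which files to cover is up to the installation:

```python
import hashlib

def _digest(path):
    """SHA-256 digest of a file's contents."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def snapshot(paths):
    """Record a digest for each configuration file at commissioning time."""
    return {path: _digest(path) for path in paths}

def audit(baseline):
    """Return the files whose current contents differ from the baseline."""
    changed = []
    for path, digest in baseline.items():
        try:
            current = _digest(path)
        except OSError:
            current = None  # a deleted or unreadable file counts as changed
        if current != digest:
            changed.append(path)
    return changed
```

The system management package would run `snapshot` once, run `audit` on a schedule, and report any non-empty result - catching exactly the sort of silent settings reset described above.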

Security is not something that you can do once and forget. It is an ongoing and evolving problem. For dedicated applications though (such as SCADA) it should be possible in most cases to use standard configurations. When you can use a standard configuration, it makes sense to automate the task of making (and verifying) the configuration. The manageability of the SCADA control room is improved by simplifying and standardising these factors.

By Davis Gentry on 6 January, 2007 - 3:46 pm

Since I first logged onto the internet in 1983 (through a Commodore 64 talking to Compuserve) I have had a virus or worm only a couple of times - and not once in the last decade on any PC for which I am/was responsible. My kids corrupted their PC with adware a year ago, and I took away some of the privileges for their login (write access to the registry mainly) and we haven't had that problem since then. The rest of the computers on my home network repulsed attempts from the adware to load. I've done this by consistently following a few simple rules which I have also followed on all installs for machines in the field, of which 80+% have been MS of some flavor:

1) Keep antivirus up to date - manually, as I don't like auto update.
2) Turn off unused services.
3) Even with the AV running, don't do something stupid like run the .exe or .gif with the latest porn/joke/whatever on it.

Keeping AV up to date can sometimes be simplified by bringing the IT guys on board. I am in dozens of plants each year, and I've seen everything from warfare between IT and Engineering to a nicely symbiotic relationship. If you can maintain a good relationship with IT your life can be much easier. Bring them in to look at your stuff, show them why they can't just delete files from a production machine to save space (yes, I've had that happen), basically try to make them part of your team. If they are at all reasonable (maybe yes, maybe no) and you can show that you have some expertise in their area but that you are not trying to exclude them from an area where they do have some responsibility (does your data go out onto their network? If the answer is yes then they do have responsibility) then you can often work together. If not, wait for them to screw up and go over their heads.

It is also an oversimplification to suggest that IT guys know only MS OSs. I would say that just as many IT guys know other PC OSs as do engineers, maybe more. After all, many of the IT types are running web servers which are often Apache.

I agree that MS has a bad habit of turning things on which should be left off. If you do updates manually (or if you coopt your IT guys into doing it for you) then you can handle this. Or run Embedded XP and load only those things that you want.

Michael Griffin has some great points with regard to having a SCADA/HMI package which by default sets things up to a secure status. It has been a while since I used any package other than Visual Studio or LabVIEW to build an HMI, but a good while back I used Wonderware and it defaulted to a secure setup, at least from the standpoint of password access, including across the network. What is the status of the many packages now available? Do ANY of them handle AV and internet security? It would certainly be nice if they did.

Davis Gentry

in reply to Michael Griffin's observations
> Note that this isn't an "MS versus Linux" argument.

I differ. Mr. Wuollet's pointed comment on tithing to MS reflects the religious struggle going on here. The High Priests of MS are making lists of who is clean and who is unclean.

> The problem which is arising is that the 64 bit version of the new version of MS-Windows ("Vista") will require drivers to be cryptographically signed by Microsoft before the operating system will load them. To get the drivers signed by Microsoft will require that the driver author is listed as an "approved vendor" by Microsoft. Microsoft has stated that they will only approve a limited number of vendors, as they need to keep the vendor list small enough to be manageable. >

It will be interesting how this situation develops, given that XP will load unsigned drivers after the requisite warning notification (Repent, the end of world is at hand!).

If, in fact, unsigned drivers will not load or function in Vista, then MS moves outside the orthodoxy of "open architecture" that IBM brought to the PC world 25+ years ago. If an O/S excommunicates unsigned hardware drivers, then hardware standards are only meaningful for the formally ordained, because hardware without a loadable driver is a non-entity, merely an unclean leper to be banished from communion with those holy (big & rich) enough to be confirmed into the faith.

Could Linux be walking the road to Jericho, the Samaritan ready to aid the beaten man, one of "these 3 who was neighbor to him who fell among the thieves"?

Daniel

> It is not true, in my experience, that vendors produce either shoddy programs or support them badly. Nor does Microsoft produce bad software. <

No, Microsoft produces fine software for their market. The problem is that market is not the industrial controls market. Windows is fine for the average home or office user, plenty of assists, everything nicely packaged, "easy" to run, lots of multimedia support, etc. But that is not what we need for controls. We need small, tight, tailorable systems that have only what is needed and nothing more. Not bloatware.

> Before everybody from the "Church of Kill Bill" leaps on me again, let me point out that CERT shows a large number of vulnerabilities in every distro of xNIX from OS X to Linux. If you look at the size of the distribution of MS apps vs the size of the xNIX apps, if there were as many Linux boxes as there are Windows boxes, we'd all be complaining about how shoddy Linux is, or how virus and trojan vulnerable OS X is... <

Yep, remember, the Linux kernel was written by one guy, a college student at the time. It is of questionable quality even now. The problem is an intractable one of testing coverage. Even Microsoft with all the talent and tools at their disposal can't get it right.

> The issue, as Joe Weiss from Kema and I were talking about on the phone last night, is not really one of coding. It is about policies and procedures, auditing and training. <

Policies, procedures, auditing and training attempt to make up for the fact that the software is not 100% tested. Limit how it is used and check up on everybody and maybe you can get away with sloppy code.

> I read somewhere that on the order of 40% of all control systems still have "password" as the root administrator password years after delivery, and the "guest" signon is still enabled in more than half.
>
> You can't legitimately thwack vendors for that. <

Yes, you can. Those items, plus many others, are configurable from within the product. The problem is that most vendors would rather trade on ignorance than make responsible design decisions, because it is cheaper in the short run. A good design would mandate that the user enter a "strong" password, enforce mandatory password changes, and not have a guest account at all. Systems were being designed and delivered back in the 1960s that were secure, because they were designed to be secure and to take into account the inherent insecurity of people. Most of this work by the U.S. government was categorized in the "Red Book" (I believe) that listed the various security levels and what it would take to be certified to a level. Granted, it took specially designed systems (hardware and software) to reach the upper levels, but it could be done and was done. The modern computer industry is too in love with flashy toys and simpleton marketing departments to focus on the meat-and-potatoes issues.
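The "good design" being argued for here is simple to build. As a minimal sketch (the function name, thresholds, and test strings are my own illustrations, not from any vendor's product), a mandatory first-login strength check might look like:

```python
import re

def is_strong(password: str, min_length: int = 12) -> bool:
    """Basic strength policy: minimum length plus at least one
    lower-case letter, upper-case letter, digit, and symbol."""
    if len(password) < min_length:
        return False
    required = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]"]
    return all(re.search(pattern, password) for pattern in required)

# A product that enforced this at first login would never run for years
# with "password" as the working administrator credential.
print(is_strong("password"))         # False - the shipped default fails
print(is_strong("N0t-Gu3ssable!!"))  # True - meets all four classes
```

A vendor could wire such a check into commissioning so the system simply refuses to go into service on the factory default.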

Mark my words, it's going to take a cyberattack by some foreign government that takes out a refinery or other large chemical facility with large loss of life in the surrounding population; then watch the government crack down. I can see where inherently dangerous facilities will be outright banned in this country because the populace won't want the risk of having them around. Heck, how long has it been since we've had a new refinery, not to mention a nuclear power plant? Better than 30 years now.

You and I are going to have to disagree about whether the end-users are part of the "Stop me before I kill again" school of security management.

Yes, vendors could mandate strong security. None of them do, AND THE END USERS AREN'T ASKING THEM TO.

I guarantee you that if a major end-user asked any of the two dozen or so people that make HMI or optimization software to include strong encryption and security out of the box, they'd do it, hands down, no questions asked.

But security costs money, just like whisky, and talk remains inexpensive.

If you missed my nightmare, I happen to think you're right. It is going to take a cyberattack that brings down a refinery or bulk chem plant to get people to put money where their mouths are.

And then, you have to ask yourself if the money is going to be well spent, or if it will just be another instance of "security kabuki" like airline passenger security is.

Walt Boyes
www.waltboyes.com
wboyes@ix.netcom.com
630-639-7090

By Michael Griffin on 27 February, 2007 - 10:34 pm

In reply to Walt Boyes: If you look back to some of the earlier posts on this precise topic, you will see there was mention of the "SCADA and Control Systems Procurement Project" (http://www.msisac.org/scada/). This is an organisation representing end users who *are* asking for better security, and are coming up with specifications to try to get vendors to provide it. The technical and business models of most vendors are, however, not able to provide it at present.

By Walt Boyes on 3 March, 2007 - 12:50 am

With the utmost of respect, Michael, what you're saying is not really true.

It doesn't take a village, or a committee, or a project to get the suppliers to provide what you say the end-users want.

It takes a grand total of five specific people to do it. These five people command enough purchasing power that all of the first and second tier process automation vendors would fall over their own feet to install whatever security those end users wanted.

They don't seem to be asking.

Walt

Walt Boyes
Editor in Chief
Control magazine
www.controlglobal.com
blog:Sound OFF!! http://waltboyes.livejournal.com
_________________

Putman Media Inc.
555 W. Pierce Rd. Suite 301
Itasca, IL 60143
630-467-1301 x368
wboyes@putman.net

By Michael Griffin on 4 March, 2007 - 1:51 pm

In reply to Walt Boyes: While it is quite possible that I am wrong about whether there is any customer demand for better SCADA security, I can point to the following information.

From the web site which I referenced (http://www.msisac.org/scada/): "The SCADA Procurement Project, established in March 2006, is a joint effort among public and private sectors focused on development of common procurement language that can be used by everyone. The goal is for federal, state and local asset owners and regulators to come together using these procurement requirements and to maximize the collective buying power to help ensure that security is integrated into SCADA systems."

The "leadership council" of this organisation is composed of the following members: NYS Public Service Commission, New York Power Authority, ConEdison, NY Independent Systems Operator, Central Hudson Gas & Electric, Municipal Electric Utilities Assoc. (MEUA), National Fuel Gas Dist. Corp, KeySpan Energy, Entergy, PSEG Fossil, LLC, NY Water Service Corp., National Grid, United Water NY, Rochester Gas & Electric, Dynegy, NYS Electric & Gas, Reliant Resources, Constellation Energy, Energy Association of NYS, Independent Power Producers of NY.

The particular organisation I have referenced above is in the US, but there are similar efforts in Canada and other countries who are cooperating with each other, as is evident by the list of members in the working groups.

The list of companies and organisations which are "Workgroup Coordinators" is much too long to be reproduced here in full (see the web site for more information), but I will list a few examples: Australia Department of Communications, Information Technology and the Arts, Bechtel, BP, Chevron, ConEdison Energy, Inc., DuPont, Electric Power Research Institute, Exxon Mobil, NYS Office of Cyber Security & Critical Infrastructure Coordination, Royal Canadian Mounted Police/Critical Infrastructure Intelligence Section, Shell, Siemens Power Transmission & Distribution, Inc., Suez Energy NA, Swedish Emergency Management Agency/Swedish Defense Research Agency, United Kingdom National Infrastructure Security Coordination Center, US Army, US Department of Homeland Security.

While I admit that I am not as familiar with the SCADA industry as you are, I would be interested in just what these people are all up to if nobody wants better security. When they say they wish to "maximise the collective buying power to help ensure that security is integrated into SCADA systems", I more or less took them at their word.

Hi Walt,

Most organisations seem to rely excessively on passwords as the front line of protection. This is about the only one that is available as a general rule. The problem with the "password" model of security is that it is extremely difficult for Joe (or Jo) User to comply with all the requirements:
- Make it non-obvious and don't use personal information
- Don't write it down
- Use a strange mixture of lower and upper-case letters, numbers and characters
- Don't use the same password for different systems
- Change it every month, etc.

If you are dealing with more than one or two isolated systems, you would need an exceptional memory to do this - and if you only have to log on to a system once a month or so, then forget it. A notebook PC used by the on-call tech to access a PLC has to have a password known to the whole maintenance team. I have seen a research paper (just had a quick look with Google but can't re-find it) that made the point that banks seem to do very well with 4-digit PINs - why can't software vendors do something similar?
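Part of why the bank analogy works is that ATMs pair a short PIN with a hard attempt limit, which is what makes 4 digits survivable. A minimal sketch of that idea (the `PinGuard` class, its method names, and the limits are illustrative assumptions, not any vendor's implementation):

```python
class PinGuard:
    """A short credential made workable by a hard retry limit - the way
    bank ATMs pair 4-digit PINs with card capture after a few failures."""

    def __init__(self, pin: str, max_attempts: int = 3):
        self._pin = pin
        self._max = max_attempts
        self._left = max_attempts

    def verify(self, attempt: str) -> bool:
        if self._left <= 0:
            raise PermissionError("locked out after too many failed attempts")
        if attempt == self._pin:
            self._left = self._max  # reset the counter on success
            return True
        self._left -= 1
        return False

guard = PinGuard("1234")
print(guard.verify("0000"))  # False - two attempts remain
print(guard.verify("1234"))  # True - counter resets
```

The security comes from the lockout, not the credential length, which is the trade-off banks exploit and most SCADA password schemes do not.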

The other issue is that software system vendors are trying to use internet access as a selling point, but this opens up a whole lot of cans of worms as regards security. This has always been a problem of remote monitoring systems but things have changed a bit since we used to phone up the telemetry number of the local waterworks to find out how much the reservoir level fell during the half-time break in the big football match. Surely the first point that needs to be checked in any attempt to deliberately open up a system for remote dial-up style access is its security?

The parts of the user's organisation that make these decisions need to take action - but the IT industry in general seems to blame the guys and gals at the consoles for not following the rules. As instrument people, one of our first concerns should be to avoid any complications for operators that might hinder their access to information, and the password system as currently implemented is one of those complications. In just about every other activity we are trying to reduce stresses on operators so they can focus on the job of running the plant. Cheers,

Bruce

By Nathan Boeger on 2 March, 2007 - 1:24 am

Walt,

Great points! SCADA security is certainly something that hasn't been taken seriously on a wide scale. And you're right, vendors don't do much about it, end users don't ask, integrators go with the flow, and the government doesn't seem to be doing anything effective. I certainly hope that this gets sorted out before rather than after we have to regret it.

To add to your point about end user complacency, I've experienced end users FIGHTING security and standard IT technologies because of the natural trade-off between security and usability. While strong security certainly doesn't have to get in the way of usability, many end users have experienced downtime due to security implementations - a big problem for them. I've spoken with many end users who would rather just have no security in place - for simplicity. It's a scary thought to me, but I certainly see where they're coming from.

----
Nathan Boeger
http://www.inductiveautomation.com
Total SCADA Freedom

Hi Walt,

I have to eat one or two of my words...
Check out http://www.easydeltav.com/pd/WP_DeltaVSystemSecurity.pdf
And http://www.easydeltav.com/pd/WP_BestPrac_CyberSec.pdf
So at least one major vendor recognises that there is a problem and is prepared to give some firm recommendations as to how to deal with it. And no, I don't work for Emerson. They also provide a two-factor security system with access card and PIN rather than simple password access control.

Cheers,
Bruce

By Michael Griffin on 9 March, 2007 - 11:34 pm

In reply to Bruce Durdle: I had a look at these SCADA security documents from Emerson, and there is not as much to them as may appear at first sight. Much of it is just general IT security recommendations for PCs in general, or for configuring MS-Windows. It doesn't however tell you how to actually do any of the recommended general actions. It also requires you to source, test, and install third party security products.

In short, if you know how to make use of their advice, you probably don't need their advice. I would consider these to be some nice short documents that tell you that you may have a problem in terms that relate to a SCADA system. They do not however actually provide you with a solution.

I would agree that "at least one major vendor recognises that there is a problem". They however still appear to feel that it is up to the end user to provide a solution.

By Michael Griffin on 27 February, 2007 - 11:56 pm

In reply to Rich Wargo: There is a good deal of academic research being put into "micro-kernel operating systems". Some of that is focused on producing an operating system kernel that is provably correct. That is, the design can be subjected to formal proofs which can determine if the design will meet the objectives. A micro-kernel is not synonymous with "provably correct", however; most micro-kernels are not written with this in mind.

Despite their academic popularity, micro-kernels have been much less popular in commercial applications. The only reasonably successful one which comes to mind is QNX (Mac OS X was based on the Mach kernel, but had a huge monolithic chunk grafted onto it, so it's not a micro-kernel). Micro-kernels have a number of practical disadvantages (particularly high system-call overhead) which have limited their use so far despite their theoretical appeal.

As far as more conventional operating systems are concerned, part of the problem with relying on in-house testing is that you only end up testing for the problems you expect to have. There is really no substitute for getting feedback from as many real users as possible. The more people you have using a piece of code before it is formally released, the more likely it is that any problems with it will turn up before it is used in serious applications.

Nonetheless, I agree with your point about "small, tight, tailorable systems" for industrial controls applications. What I think this requires is a modular (but not necessarily micro-kernel) operating system based on widely used components. Most modern operating systems would in fact be capable of being readily adapted to provide this. The exception may be Microsoft Windows (since everything in it is integrated with everything else), but as you said in your original message, they have a different vision of where their market lies.

Tom,

I have the problem where IT has *improved the security on the Intranet* in a way that drops any attachments (i.e. AutoCAD drawings) to protect the system, and as a result reduces our productivity. *Who knows where this will end?*

Maybe we should revert back to snail mail?

Dennis

By Daniel Chartier on 30 November, 2006 - 12:35 am

Hello Tony;

From my experience this should be done by the team that does the programming/commissioning of the plant control system. In my organisation this would be the Instrumentation and Controls team. They are the ones that know how to make the system "dance", once mechanical and electrical have ensured the equipment is functional, and once IT have provided adequate communication services to interface the PLC and the SCADA.

No offence to the IT department of which you are part, but this is our turf, our specialty, even up to the ergonomics of the control room, and the power of the computers installed.

Anyone who interferes with this philosophy in our organization will be held responsible for the adequate functioning of the plant.

For example, we had a run-in recently with our project manager who wanted to save a few bucks and reduced the size and number of work tables we had requested for the control room at a plant we were setting up this summer. A small thing, right? Who needs all that working area for 2 or 3 SCADA screens? Wasted space anyway.

Then the CCTV control screens arrived. One had to be set up on a "temporary shelf" (two cardboard boxes stacked on top of each other) with its screen controller/recorder; the other was kicked in by a worker after being left on the floor. The vendors removed the one that was left and refused to re-install it until the broken one was replaced (do you know how much a non-interlaced colour screen costs when the vendor is mad at you? not counting 8 weeks' delay in delivery...) and adequate work space was provided for their equipment. Then the computers for Building Services were brought in... The issue has only recently been resolved: a furious project manager had to disburse from a closed project budget (ouch!) to have our original furniture recommendations followed and finally installed, three months late. Only now will the client accept the plant as finished (not the only snag, but a major one). Just a small thing, really.

If you have to do this yourself (generally that happens if you have nothing to do the day the manager sees you and needs a body to get rid of a problem) then consider getting help and collaboration from the people who do know what is needed for this. And a couple of rounds of beer will go a long way to making friends at that level. Still my experience talking... ;)

Hope this helps,
Daniel Chartier

By Nathan Boeger on 2 December, 2006 - 8:22 pm

I wasn't going to touch this one, but I can't resist after hearing a few responses. Your innocent question ends up being a really sticky issue in a lot of organizations.

Typically, for really small operations, Jasmin is dead on. Integrators/PLC programmers can easily handle setting up a dedicated switch on a single non-routable class C subnet - it's a no-brainer. No IT department required! Most don't understand much about the setup or what's going on, especially if you add 'net access, but what difference does that make? Problems/poor implementations typically arise due to lack of IT experience. Let me give a few examples:
-Management wants to view data from the "business network", but the data lies on the "control network". Integrators often don't handle this very elegantly.
-Remote access: Integrators will often leave an insecure modem connection running PCAnywhere instead of considering a VPN.
-Lack of Hardware knowledge: Integrators will often leave a consumer switch in a production environment. While this works fine, it misses the benefit of what IT can provide. You'll sometimes see numerous separate switches where there could be VLANs, "random problems" with communications due to not dealing well with broadcast packets, RJ45 ends not done to spec, etc.
-Backup/maintenance issues: Typically IT has the infrastructure to provide consistent backups and updates whereas it's a tough issue for the plant guys.
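The "single non-routable class C subnet" setup described above can be sanity-checked with nothing more than the Python standard library. The subnet and host addresses below are hypothetical examples of a dedicated control network, not taken from any real site:

```python
import ipaddress

# A typical dedicated control network: one private /24 ("class C") block.
control_net = ipaddress.ip_network("192.168.10.0/24")

for host in ["192.168.10.21", "10.0.0.5", "8.8.8.8"]:
    addr = ipaddress.ip_address(host)
    on_net = addr in control_net
    print(f"{host}: private={addr.is_private}, on control net={on_net}")
```

Addresses in the RFC 1918 private ranges are never routed on the public internet, which is exactly why such a subnet, kept off the business network, "greatly reduces the chances of being hacked" in the first place.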

IT, on the other hand, from an integrator's perspective, can be a pain to deal with. They're not typically familiar with what it takes to communicate with field devices or support HMI/SCADA applications. They often forget that production is king and that their mission in a control room isn't to lock out the designers of the system. When left unchecked, IT will often overcomplicate requirements and policy, perhaps to feel like they're doing something.

The real key is to work together to plan what makes sense given everybody's strengths. Clearly outline what IT should implement/support to make everybody happy. For example, IT should provide physical network infrastructure. All parties involved can participate in determining how it makes sense to subnet the network. IT could then provide reliable connections where traffic and security can be monitored. They can securely provide remote access, etc. IT can also do backups and updates per the integrator's recommendations (i.e. be wary of updating service packs with some industrial software). IT will really be helpful as everyone moves to SQL databases and remote access. Security may not seem like a concern, but it's coming... In general, control guys need to respect that IT has a useful specialty that they don't, and IT needs to get the hint that it's unacceptable to mess with the control guys' access or any controls hardware/software that they don't understand - "oops, I left the network down for the afternoon" holds a different level of significance when it comes to production. I strongly support cross-training between the controls guys and IT. Petty bickering or a "one or the other does it all" approach is ridiculous.

I've had a lot of success doing projects with IT when I included them from the beginning. I've also been involved in heated battles with "them" when both sides tried to sneak around the other for convenience.

just my $.02...

----
Nathan Boeger
http://www.inductiveautomation.com
"Design Simplicity Cures Engineered Complexity"

Nathan,

Thank you for touching this one.

My original post complained about IT mucking with the Intranet for their comfort or pleasure and to our regret -- I've since got over that and realized that the thousands of people in the company have the same problem, which will result in an early resolution. I still do not believe they should go about installing patches until all of the ramifications are understood for IT and Production. I do believe cooperation would be a great help in both cases.

Dennis

By Nathan Boeger on 15 December, 2006 - 10:58 pm

Dennis,

I certainly agree with you on this one. I've seen it happen far too many times that an IT person innocently tries to patch an industrial production machine without having any idea of what could ensue. Like you said, cooperation's really the way to straighten out this sort of thing. Once IT understands the systems and priorities, they tend to be a knowledgeable resource in this arena.

----
Nathan Boeger
http://www.inductiveautomation.com
"Design Simplicity Cures Engineered Complexity"

By Dennis Patterson on 10 December, 2006 - 7:13 pm

The question is not difficult to answer. Just because I can drive doesn't make me a professional courier. Just because it's in a PC doesn't make it IT's responsibility. If you were to use that approach, then you could argue that the lady at the front desk using the word processor should look after the SCADA!

Dennis Patterson

By Nathan Boeger on 15 December, 2006 - 11:15 pm

Dennis,

That answer is convenient for small systems, but what happens when you grow to a large distributed system? Is the HMI guy going to learn how to program Cisco routers? What about when your customer gets into MES/ERP systems as all the major vendors are beginning to promote? Is the PLC programmer going to become an Oracle expert?

IT people should already be PC experts, more so than controls engineers. Of course they need to learn about the software they're supporting before you unleash them on your SCADA system! Why not work with them?

----
Nathan Boeger
http://www.inductiveautomation.com
"Design Simplicity Cures Engineered Complexity"

By Curt Wuollet on 18 December, 2006 - 10:15 am

The company whose salesman tells management that this will scale flawlessly up to enterprise size should be entirely and explicitly and legally responsible for this. And yes, they should know the equipment and resources to do it. Expecting the customer to secure and protect software they know almost nothing about is ridiculous. This is almost the best possible way to produce an unsupportable mess and it's the status quo because the sale matters, not the aftermath. Perhaps the BS level would be lowered considerably if the sales types were held responsible for the promises they make.

It's just as bad as "This database is so simple your secretary can administrate it". I've seen a lot of those databases. Everyone should have to clean up their own mess.

Regards
cww

By Dennis Patterson on 18 December, 2006 - 10:16 am

Nathan,
Nobody said anyone shouldn't work together. But a plumber doesn't wire a light bulb, or shouldn't anyway. If there is a Cisco router to be configured, then by all means it's IT's responsibility. But when they start playing with packets of data in the SCADA, because its traffic does not suit their network, thereby rendering the SCADA inoperable, then they have overstepped the line.

Dennis Patterson

By Nathan Boeger on 23 December, 2006 - 1:11 pm

Dennis,
The line needs to be clearly drawn, utilizing both skill sets. Clearly IT is at fault in your example of disrupting production by setting packet restrictions on a managed switch. Unfortunately, most integrators/controls engineers don't really know what a managed switch is.

You seem to be straddling the fence on this issue. On one hand you mention that "cooperation is good", and in two other posts you give extreme examples of keeping IT away from the system (the lady at the front desk using a word processor looking after the SCADA, and the plumber wiring the light bulb). An IT person's technical background is not significantly different from your own (although their priorities can be). SCADA implementations are quickly coming to include SQL databases, enterprise systems, and complex infrastructure - all areas where IT has experience. Do you really think that the role of a competent IT department should be limited to programming routers? Where does working with Oracle/SAP, or even administering a MySQL or Microsoft SQL Server database, come in? These are things that might have to be implemented in your SCADA application. Or we could stick to single-user HMI packages forever...

----
Nathan Boeger
http://www.inductiveautomation.com
"Design Simplicity Cures Engineered Complexity"

> I'm no Scada expert. I'm an IT guy.

You just answered your own question! Leave it for the SCADA Expert

Michael

I have taken to bringing a lawn chair and a magazine or two with me wherever I go. The emphasis on security has been detrimental to the automation business. The measure of a control system's value is correctness and availability, not security.

Security is important, but so is making widgets. An insecure control system has the potential of failing, lowering both its MTBF and availability. But an overly secure control system has the potential of being time-consuming to repair, raising its MTTR and lowering its availability sharply. (Just in case: availability is defined as the MTBF divided by the sum of the MTBF and the MTTR.)
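That availability definition can be turned into a two-line calculation. The MTBF/MTTR figures below are made-up illustrations of the point being argued: the failure rate is unchanged, but security red tape stretches the repair time and drags availability down:

```python
def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Availability = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Same failure rate, but security procedures stretch a 2-hour repair
# (parts on hand, tech on site) into a 24-hour one (meetings first):
print(round(availability(2000.0, 2.0), 4))   # 0.999
print(round(availability(2000.0, 24.0), 4))  # 0.9881
```

A tenfold-plus increase in MTTR costs far more availability than a modest improvement in MTBF buys back, which is the trade-off being described.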

The IT people I have dealt with, by and large, seem to be highly competent WRT MTBF, but are oblivious WRT control system MTTRs. A meeting is required for anything and everything, and there's always someone "who needs to be there" who is on vacation or too busy to attend. Thus, the lawn chair and magazines.

My recommendation, for now, is to regard enterprise IT departments as a valuable service provider only, much the same way we look at the phone company or an ISP.

By Nathan Boeger on 17 December, 2006 - 2:51 pm

Kirk,
I have had similar experiences with some IT departments. More recently I have been recommending that integrators approach IT prior to implementing a project and include them in the relevant details. You will find that the details can be established beforehand, preventing all the meetings where they may have been trying to stall you. I have heard of such experiences as "they just set up a computer for me, everything installed, patches, automatic backups and all, and it was ready to put the control software on".

I hate to be a nay-sayer, but SCADA security is often taken way too lightly. Think pre-9/11 airports. I remember a vendor at a gun show bragging that it was nothing to get a stun gun through airport metal detectors. How many incidents do you think it'll take before SCADA security will matter? Think ammonia systems, water treatment plants, power plants, etc. Protection from a knowledgeable assailant is practically non-existent.

This should be the responsibility of all - vendors, integrators, IT departments, and end users. Designing a secure system doesn't have to equate to a long mean time to recovery.

----
Nathan Boeger
http://www.inductiveautomation.com
"Design Simplicity Cures Engineered Complexity"

IT departments are concerned about security in a different way than you are. IT worries about financial information being compromised and employees wasting their time looking at celebrity crotch shots, not 9/11.

One of the things that has never failed to amaze me is that an organization may have numerous procedures and policies regarding the need to analyse any proposed changes to operating plant to the nth degree, but can totally ignore the impact of anything that does not involve cutting metal.

The need to make any modifications at all to SCADA-related hardware can have a major impact on production. Even in a well-structured system with all control done at the field end and the SCADA performing only supervisory actions, loss of a server will ultimately affect performance - if it doesn't, why have the SCADA in the first place? I'm sure many of us have been in situations where even adding the latest service pack upgrade has impacted operational software.

(Incidentally, about 20 years ago, I was responsible for the instrumentation of a small methanol plant - including the "Process Computer" which was the nearest we got to a SCADA system. There was also a "business" computer system which was independently managed, and was eventually PC-ised. When the organization got around to replacing typewriters with word-processing software on desktop PC's, there was a major bust-up between the head of the typing pool and the IT manager over control of these machines. But because the plant computer was a mainframe there were no issues at that end of the plant. The IT manager also managed to seriously p%#$ off the drawing office guys and most of the engineers by disestablishing their very functional and home-brewed drawing data base system and insisting on writing his own - 5 years later it was just about up to the same level of functionality as the original.)

This issue is one of organisational management but all involved need to be very aware that any change to anything connected to the plant can have major and unforeseen effects on production and safety. This needs to be appreciated by all involved, from the CEO to the HR group to purchasing to the cleaning service. ANY change must undergo a formal review and sign-off process involving any affected discipline.

It will come under 'Process Control Engineers' or 'Electrical Engineering' as a standard maintenance requirement, but you will find most Electrical Engineers will not have the ability to make adjustments to SCADA or PLC ladder.

By Nathan Boeger on 4 January, 2007 - 2:34 am

This thread is out of control - I love it! It was obvious that Tony was opening up a hornet's nest, but I had no idea that we'd get this much good information. End users and integrators deserve to know what software's available - especially the open source/free stuff. And "everybody" knows that this SCADA/IT convergence is happening.

This is a bit of a tangent, but it really pisses me off when vendors spread the idea that "industrial software is somehow, magically, different" - that it won't perform on any platform other than theirs. Or that "industrial data" is different from any other type of data. So THAT'S why I have to spend so much more money on a custom, obfuscated (enhanced, really...) version of MS SQL Server instead of running a free MySQL or PostgreSQL database! I wish I could find a link to an article I read that claimed open source software wasn't viable, secure, and, get this, stable enough to control real-world processes. I challenge you to ask any end user/integrator about their experience with stability in typical industrial software... especially from the era that article was written.
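The claim that "industrial data" is just data can be demonstrated with any ordinary SQL engine. Here is a sketch using SQLite as a stand-in for the free MySQL/PostgreSQL databases mentioned above (the `tag_history` table, tag names, and values are invented for illustration):

```python
import sqlite3

# "Industrial data" in a plain, free SQL database: a tag name, a
# timestamp, and a value - nothing magical about it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tag_history (tag TEXT, ts REAL, value REAL)")
conn.executemany(
    "INSERT INTO tag_history VALUES (?, ?, ?)",
    [("Tank1.Level", 0.0, 72.4),
     ("Tank1.Level", 60.0, 71.9)],
)
tag, avg_level = conn.execute(
    "SELECT tag, AVG(value) FROM tag_history GROUP BY tag").fetchone()
print(tag, round(avg_level, 2))  # Tank1.Level 72.15
```

The same schema and queries run unchanged against MySQL or PostgreSQL; nothing about process history requires a proprietary storage engine.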

The example that I do remember is this: Google "web based hmi". You will get one of the worst nay-saying articles that I've ever seen. Not only does it patently misinform, but it reinforces the pessimistic notion that progress won't meet needs or even be possible - the entrenched vendors clearly have all your problems solved. Granted, it was written in Sept '02, but it's the first search result in Google! Unacceptable! Users deserve far better data - like the information that posters to this thread have contributed.

----
Nathan Boeger
"Design Simplicity Cures Engineered Complexity"
Inductive Automation - total SCADA freedom

I guess the relationships between IT and Industrial Software Engineers will improve with time; when the roles are more clearly defined, this will help.

If IT management quit their 'holier than thou', 'I'm the king of the domain' attitude toward everything, that would also help improve this relationship. But I don't see that happening soon, because from what I've seen, that's a prerequisite for getting into IT in the first place.

Dennis

Hi Dennis,

Please read past the first line... ; )

I'm a technical recruiter with an award-winning firm that was recently assigned to look for information about SCADA and RTU positions. On my own, I have been seeking out any information I can find for some clients of mine. Any help or pointers would be greatly appreciated.

I've done some reading and the articles that this site and others have discussed seem very interesting. I've been an IT and Engineering recruiter for 10 years and have seen lines blurred in many roles regarding IT. Your posting sheds a little light on the mentality of the shops that I will be dealing with.

Please feel free to respond directly to my email if you can offer any help, msole @ sbcglobal. net.

-Marcus

By Jeff Roberts on 10 March, 2007 - 6:18 pm

Dennis,

I think you are right on the mark. But to take the issue a step further, I am convinced the heart of the turf war between Control Systems & IT is not technical, but personal. Control Systems professionals typically hold (traditional - accredited) engineering degrees related to their field. Often these degrees are advanced and highly specialized. IT professionals typically hold 2-year technology-based "degrees." This is not to say that IT professionals are less qualified for their positions or have inferior knowledge of their field, but there does seem to be a tendency toward jealousy of engineering & controls professionals. Envy & jealousy are the root causes of the strife between IT and any other technically oriented profession.

By Ranjeet M. Vaishnav on 14 March, 2007 - 12:02 am

I am a late entrant to this thread. This topic was also discussed on a separate mailing list way back in 2004, maybe someone wants to look at the Archive:

http://lists.iinet.net.au/pipermail/scada/2004-May/000158.html

Regards,
Ranjeet

By Michael Batchelor on 15 March, 2007 - 12:27 am

OK, I tried to keep quiet on this, but I must have a genetic defect that keeps me from holding my tongue. So here comes the "Michael Batchelor Organizational Change Rant" again. (Kinda has a nice ring to it!)

The thread Ranjeet referenced is good, but it just says exactly the same things that I and who knows how many other people have been saying for decades now. This argument goes back way before 2004.

And I think Jeff is partly right that the issues involve a tremendous amount of personal conflict. But I'm not sure that it's all at the level of guys down in the trenches.

I truly believe that the only solution to this whole dilemma is for a fundamental business reorganization to occur that merges IT with the rest of the engineering in an organization. Period. I do not think anything short of that can solve the problem. And this will involve a huge loss of stature for the IT group.

There! I've said it again! And I'll point out again that this is all moot. This whole discussion is going on in a controls group forum, not an IT group forum. And there's not a CIO on the planet who would give you 5 minutes to discuss this. Does anyone on this thread have enough horsepower to even *GET* an audience with a Fortune 500 corporation CIO? If you do, I'll ask this again, do you have the guts to print this whole thread and throw it on the CIO's desk?

Someone chided me earlier that I should become a high paid business consultant if I thought this was what was necessary. I've been working on my resume. Anyone know of a customer looking for a business consultant with dirty fingernails and a loop calibrator in the trunk of his car?

Michael
--
Michael R. Batchelor
www.ind-info.com

GUERRILLA MAINTENANCE [TM] PLC Training
5 Day Hands on PLC Boot Camp for Allen Bradley
PLC-5, SLC-500, and ControlLogix
www.ind-info.com/schedule.html
training@ind-info.com

Industrial Informatics, Inc.
1013 Bankton Cir., Suite C
Hanahan, SC 29406

843-329-0342 x111 Voice
843-412-2692 Cell
843-329-0343 FAX

By Michael Griffin on 16 March, 2007 - 11:01 pm

In reply to Michael Batchelor: Perhaps you could get the Finance department put under Engineering while you are at it; then we wouldn't have any trouble getting our purchase orders approved any more. Surely it would be a good fit. After all, creative financial engineering seems to be a "core competency" for most large companies these days.

By Michael Batchelor on 17 March, 2007 - 3:18 pm

Great idea! I'll put that in my management plan!! Look for it!!! Coming soon to a factory near you!!!! ;)

--
Michael R. Batchelor
www.ind-info.com


I have worked for both large and small outfits, from as small as 100 employees to as large as 70,000. I am an instrument guy, so I am hardly objective.

What I can tell you for sure is that whether the plant hires I.T. guys or instrument guys to look after it, you want to make sure they work for the plant and not corporate. As soon as corporate I.T. gets their fingers in the pie, you will be waiting 4 weeks for permission to update a driver. Way too cumbersome. Keep your team lean and mean, use "I.T." as a resource, don't give them control.

The best I have seen is a Process Control group made up of Instrument, Electrical and I.T. types working for plant operations, not maintenance. We handled the plant control system and field instrumentation and control. It seems once you become part of the maintenance group, you tend to lose touch with the plant and only see as far as your next work order. My experience, for what it is worth.

By Tony Brickner on 3 February, 2015 - 7:52 am

If I saw this thread 4 years ago, I probably would have run out of my last interview screaming.

I'm an IT Network guy working for an O&G company. I've been focused on networking since my days in the Air Force, but things like PLCs interest me, so when my interviewer told me about their network and what my role would be, I started glazing over. Towards the end of the interview he mentioned how IT was meant to be a resource to our SCADA network, and I snapped back to attention.

Our enterprise network is large, but nothing groundbreaking ever goes on there. The SCADA network in my area of the world is insane when you factor in the number of PLCs, HMIs, and different network components that are out there. Then spread that network across half of the state; my mind has exploded more than once as I stick to my "resource" role.

To be clear, I like 8-5; sure, there are days when I come back in late for some work that requires outages, but they are "mostly" planned. I've been called late at night when our commissioning team has connected new devices to the SCADA network and no one is able to understand the problem. The thing is, I'm still the "kid in the candy store" who stares in amazement at how these PLCs process data from an armload of sensors and decide that things are working the way they are supposed to.

Our problem is that the guys responsible for the SCADA network are process engineers, electricians, and automation technicians who learned different pieces of the networking world as the oil field grew. The danger is that vendors come in and offer their input, and the SCADA network guys decide it would benefit them. Most of the time this is true, and it is something they should have been doing from the start; but since they don't understand IT network concepts, the idea isn't implemented properly or things are way overdone.

I agree with the first quarter of this page (still reading the rest) that you need two groups and that IT should not be involved. I agree with that because I'm selfish and like my sleep. The problem is, I like my paycheck, and if that automation network is not running, I'm back on the job market.