# Windows Security (was: Virus warning)

#### Ralph Mackiewicz

> I'm not aware of any effort to qualitatively improve Windows security.

Of course there is a plan to qualitatively improve Windows security. It is there on MS's web site for all the world to see:

"www.microsoft.com/TechNet/columns/security/noarch.asp":http://www.microsoft.com/TechNet/columns/security/noarch.asp

All you MS bashers never give them the benefit of the doubt. The plan is: trust us...the experts will take care of it. You don't need to know all these details anyway. The gentleman who learned something useful from the Melissa virus was picking the forbidden fruit...shame on you! Now all the rest of us have to pay for your indiscretion.

Not Sincerely,

Regards,
Ralph Mackiewicz
SISCO, Inc.

The opinions given above should be attributed to me and not my company because my company did not write them...I did.

#### Jiri Baum

Jiri:
> > I'm not aware of any effort to qualitatively improve Windows security.

Ralph Mackiewicz ("Not Sincerely"):
> Of course there is a plan to qualitatively improve Windows security.

That's why I put the word 'qualitatively' in there: they may fix particular vulnerabilities, or even up the QA so that there are fewer; but that can result at best in a quantitative improvement.

> "www.microsoft.com/TechNet/columns/security/noarch.asp":http://www.microsoft.com/TechNet/columns/security/noarch.asp

Umm, yeah, sure. That'll work... NOT.

> All you MS bashers never give them the benefit of the doubt. The plan is:
> trust us...the experts will take care of it.

Actually, it's not even that - it's telling people to put their heads in sand, ostrich-like.

If it was 'trust us, the experts will take care of it', even that would be worlds better.

Jiri
--
Jiri Baum <[email protected]> http://www.csse.monash.edu.au/~jirib
MAT LinuxPLC project --- http://mat.sf.net --- Machine Automation Tools

#### Michael Griffin

Jiri Baum wrote:
>Patrick Cross:
>> It's a shame that people exploit the useful and powerful features of
>> Windows in this manner.
>
>Actually, the real shame is that these useful and powerful features were
>designed without regard to security, and that this was not redressed when
>it was pointed out or when the first (non-malicious) exploits appeared.
<clip>

Yes, it's too bad these features were designed without security in mind, but I have to question the assumption that these features were generally "useful" to any but perhaps a few people. I don't personally know of anyone who uses them. They could be eliminated from Microsoft's "Outlook" and "Word", and only a handful of people would even notice they were gone. Some people do find them useful in "Excel", but then we rarely hear of "Excel" viruses sweeping the internet.

If only the very few people who actually needed these features had them, there would not be a high enough concentration of susceptible systems for viruses to exploit. I would think that companies and consumers would stand to save a lot of money as well if they weren't paying for features they never use when they purchase software, let alone the savings they would see from not having to deal with the security consequences.

>I wonder what this says about use of MS Windows (or Linux) for control of
>potentially dangerous machinery and processes, especially when these are
>connected (indirectly) the public Internet; as far as I know, there's no
>general-purpose operating system that would actually have a security model
>(except perhaps VMS, I'm not familiar with VMS). The better systems can be
>locked down to around C2, but that's not actually all that much security.
<clip>

I see security problems getting worse in future, firewalls notwithstanding. The consumer and business market is so strongly directed towards transparent internet access for distributed applications that the pace of change will open new security holes as fast as old ones can be closed.

I don't foresee the typical control system designer being able to keep current with, or even to fully understand, the security implications of every new "multi-media internet game interface" which gets automatically installed with the operating system installation CD (or with the OS update patches).

Consider also the fact that computer system administrators are constantly installing patches to existing desktop installations (or even patches for the patches). In larger companies, this is a full-time job for one or more people. Who is going to do this for production equipment?

You can't use the existing methods and personnel who are taking care of the desktop installations. Part of this existing process involves having the patches tested for bugs which affect the desktop software (word processing, CAD, etc.). Who is going to do this compatibility testing for the machinery installed in the plant? Nobody, that I can see.

Security (and reliability) could probably be better addressed for automation purposes in an operating system which was stripped down to the essential, thoroughly tested features only. This would need to be done by someone who is an operating system and security expert. The question should not be whether a network security expert can configure a secure system - it is whether the typical control system designer can do so. The two fields are sufficiently broad and distinct that anyone who imagines himself an expert in both is deluding himself.

**********************
Michael Griffin
**********************

#### Jiri Baum

Patrick Cross:
> >> It's a shame that people exploit the useful and powerful features of
> >> Windows in this manner.

Jiri Baum:
> >Actually, the real shame is that these useful and powerful features were
> >designed without regard to security, and that this was not redressed

Michael Griffin:
> Yes, its too bad these features were designed without security in mind,
> but I have to question the assumption that these features were generally
> "useful" to any but perhaps a few people.

Scriptability is a useful feature in general - though obviously its usefulness can be compromised, for instance if it's obscure or difficult.

> I would think that companies and consumers would stand to save a lot of
> money as well if they weren't paying for features they never use when
> they purchase software,

Off-the-shelf software is not priced according to features but according to what the market will bear (both in competitive and monopoly markets).

> I see security problems getting worse in future, firewalls not
> withstanding.

Firewalls are the Maginot lines of the Internet. As was demonstrated to the French decades ago (twice, if memory serves), they're useless. The Germans just walk around them.

This is where proper security features at the OS level, such as the aforementioned mandatory access control, can really make a difference. Like I wrote, though - I'm not aware of any plan to put anything like that in Windows, and for Linux this piece is at the experimental demo stage and other pieces are missing altogether.

Jiri
--
Jiri Baum <[email protected]> http://www.csse.monash.edu.au/~jirib
MAT LinuxPLC project --- http://mat.sf.net --- Machine Automation Tools

#### Michael Griffin

At 17:34 13/12/01 +1100, Jiri Baum wrote:
<clip>
>Michael Griffin:
>> Yes, its too bad these features were designed without security in
>> mind, but I have to question the assumption that these features were
>> generally "useful" to any but perhaps a few people.
>
>Scriptability is a useful feature in general - though obviously its
>usefulness can be compromised, for instance if it's obscure or
>difficult.
<clip>

I was trying to draw an analogy between the scripting features in programs such as "Outlook" and "Word" (seldom used by anyone except virus creators) and unnecessary (for automation purposes) operating system features which could conceivably also be exploited by viruses. If a feature isn't present, it can't be used by either a person or a virus. If the person never uses the feature, then the only potential beneficiary of having it is a virus.

So the question is, are there (now, or in the works) a significant number of operating system features which are not necessary for most automation purposes? Do these features pose any conceivable security threat (either alone, or in combination with other features)? If so, how practical is it to remove them from an operating system, or rather, to not install them in the first place?

Ideally, we would still use a "standard" operating system, but someone would produce a version of it which comes already configured with automation applications in mind. It is conceivable that the installation program could offer several possible configurations which offer varying trade-offs between flexibility and security.

The automation market differs from the office market in that in typical automation applications we know in advance exactly what software will run on the system and what capabilities it needs to have. The office market needs to be more flexible and generic. This means that in automation applications the balance between security and flexibility (and compatibility) can be more on the side of security.

I'm not sure exactly how to do what I have outlined above. I suppose that with Linux, it would be possible for someone to create a distribution which addresses this, provided the demand for it is really there. I'm not sure how to address this for people who want to use Windows, since the potential market would likely be too small to interest Microsoft and third parties can't really tinker with Windows and distribute special versions.

>> I see security problems getting worse in future, firewalls not
>> withstanding.
>
>Firewalls are the Maginot lines of the Internet. As has been
>demonstrated to the French decades ago (twice, if memory serves),
>they're useless. The Germans just walk around them.
<clip>

- Or land gliders on the roof of your fortress and break in (nobody ever considered *that*). The biggest problem with firewalls is that they are designed to respond to known threats. As long as you keep responding to new threats as they develop and update your firewall accordingly, you can keep the infiltration of viruses to a tolerable level. What do you do though for a machine which you want to install and then be able to ignore?

**********************
Michael Griffin
**********************

#### Bob Peterson

> Security (and reliability) could probably be better addressed
> for automation purposes in an operating system which was stripped down
> to the essential, thoroughly tested features only. This would need to
> be done by someone who is an operating system and security expert. The
> question should not be whether a network security expert can configure
> a secure system - it is whether the typical control system designer
> can do so. The two fields are sufficiently broad and distinct that
> anyone who imagines himself an expert in both is deluding himself.

Personally, I think anyone who imagines himself an expert in either is deluding himself. You have to pick a few niches to be an expert in, and know enough about the other areas to get by. No one can be an expert C++ programmer, expert RLL programmer for 3 or 4 major PLC lines, and also be expert in IFix, Wonderware, RSView, and networking, while also being an expert in hardware, instrumentation, drafting, etc.

If you are lucky you can become an expert in maybe one brand of PLC programming, and one SCADA or HMI package, but be able to get by and learn enough when forced to, to be able to tackle something new, and deal with the other stuff you have to.

My definition of someone who is real smart is someone that recognizes when he is in over his head. The really dumb ones keep plugging along and make things worse. The smart ones call in (or call up) someone who really does know what he/she is talking about when they need to. The dumb ones never figure out they need help until it is too late.

And I think you are wrong about making a system secure. There are a lot of books out there whose guidelines you can follow to make a system VERY secure. The issue is that it typically is far less easy to use, and less convenient, once it has been secured. I think the trick is to balance security against ease of use and utility and come out with something useful. Making these decisions is what we call judgement.

Bob Peterson

#### Jiri Baum

Michael Griffin:
> > The question should not be whether a network security expert can
> > configure a secure system - it is whether the typical control system
> > designer can do so.

Bob Peterson:
> And I think you are wrong about making a system secure. There are a
> lot of books out there that you can follow their guidelines and make a
> system VERY secure.

I'm afraid Michael has it right - commonly-available systems can be configured to around C2 with a lot of effort, which is not "VERY secure" (and also unlikely to be achieved by following random books).

> The issue is that it typically is far less easy to use, and less
> convenient, once it has been secured.

Then it's done wrong.

In fact, one could argue that a more secure system is *easier* to use because the users don't have to worry about the security implications of their actions...

Similarly, a more secure system should be easier to configure right, as it provides better security tools. The Unix (or Windows) maze of twisty little permissions, all different, is not conducive to correct admin.

Jiri
--
Jiri Baum <[email protected]> http://www.csse.monash.edu.au/~jirib
MAT LinuxPLC project --- http://mat.sf.net --- Machine Automation Tools

#### Ken Irving

On Thu, Dec 13, 2001 at 04:56:19PM -0500, Michael Griffin wrote:
> At 17:34 13/12/01 +1100, Jiri Baum wrote:
> ...
>
> >> I see security problems getting worse in future, firewalls not
> >> withstanding.
> >
> >Firewalls are the Maginot lines of the Internet. As has been
> >demonstrated to the French decades ago (twice, if memory serves),
> >they're useless. The Germans just walk around them.
> <clip>
>
> - Or lands gliders on the roof of your fortress and breaks in
> (nobody ever considered *that*). The biggest problem with firewalls is
> that they are designed to respond to known threats. As long as you
> keep responding to new threats as they develop and update your
> firewall accordingly, you can keep the infiltration of viruses to a
> tolerable level. What do you do though for a machine which you want to
> install and then be able to ignore?

It seems to me that that machine either needs regular (and automated, to be ignored) updates with security improvements or needs to be behind a firewall which is so updated. You're still exposed to new and unanticipated methods of attack, however. The only hope is that your system isn't one of the first to be attacked, and you can get updates in place before that happens.

The Debian (a Linux distribution) model seems to be a good approach to me. Updates involve a couple of generic commands, typically invoked manually but potentially automated, bringing any and all changes posted by a trusted set of developers. For stable systems the updates only address security issues and bugs, and the developers (must) try very hard to not break anything in the process. The advantage of this over some other systems is that one doesn't need to research weaknesses and select specific updates, which could easily amount to a full time job.

You're still one step behind leading edge attackers, but how could any system be safe from completely unanticipated new threats?

Ken

--
Ken Irving <[email protected]>

#### Jiri Baum

Ken Irving:
> It seems to me that that machine either needs regular (and automated, to
> be ignored) updates with security improvements or needs to be behind a
> firewall which is so updated. You're still exposed to new and
> unanticipated methods of attack, however.

Most of the attacks on the Internet are not new or unanticipated; they're well-known and (in the abstract) well-understood.

Indeed, the major ones are of only two types - macro worms and buffer overflows.

For macro worms, proper security in the OS would stop them in their tracks. They'd simply not be cleared to run.

Buffer overflows are a bit more difficult (though eliminating the half a dozen worst offenders from the C library would be a good start). Even so, damage could often be contained and minimized.

Certainly new methods of attack can arise from time to time; but if computers were, in general, well-secured, the expertise required to come up with such new attacks would be substantial and success rare.
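
Jiri's mention of the C library's worst offenders refers to unbounded calls such as `gets()`, `strcpy()` and `sprintf()`, which write past a buffer's end whenever the input is longer than expected. A minimal sketch of the safer, length-checked pattern (the `copy_bounded` helper is illustrative, not a standard function):

```c
#include <stdio.h>
#include <string.h>

/* Unsafe pattern: strcpy(dst, src) copies until the NUL terminator,
 * overrunning dst whenever src is longer than the buffer -- the root
 * cause of the classic buffer-overflow exploit. */

/* Safer pattern: always bound the copy and guarantee termination. */
int copy_bounded(char *dst, size_t dstlen, const char *src) {
    if (dstlen == 0)
        return -1;
    /* snprintf never writes more than dstlen bytes and always
     * NUL-terminates; its return value is the length src would have
     * needed, which lets us detect truncation. */
    int needed = snprintf(dst, dstlen, "%s", src);
    return (needed >= 0 && (size_t)needed < dstlen) ? 0 : -1;
}
```

An attacker-supplied string that is too long is then truncated and reported, rather than silently overwriting adjacent memory.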

Jiri
--
Jiri Baum <[email protected]> http://www.csse.monash.edu.au/~jirib
MAT LinuxPLC project --- http://mat.sf.net --- Machine Automation Tools

#### Anand Iyer

In case you are worried about net safety with regards to automation systems in hazardous/toxic or other applications where safety is prime,
Well, we concluded a seminar on web based automation systems and the paper on security issues, protection techniques, takes care of just
Anand

#### Jiri Baum

> >Michael Griffin:
> >> Yes, its too bad these features were designed without security in
> >> mind, but I have to question the assumption that these features
> >> were generally "useful" to any but perhaps a few people.

Jiri Baum:
> >Scriptability is a useful feature in general - though obviously its

Michael Griffin:
> If the person never uses the feature, then the only potential
> beneficiary of having it is a virus.

One can imagine any number of quasi-workflow scripts; though you're right that for some reason it is rarely used.

> So the question is, are there (now, or in the works) a significant
> number of operating system features which are not necessary for most
> automation purposes?
... [and can we remove them] ...

Actually, that may not be so useful - if an office machine is compromised, and it belongs to someone authorized to access the factory floor machine... That's where a general trusted path comes in handy, but what OS has it?

> It is conceivable that the installation program could offer several
> possible configurations which offer varying trade-offs between
> flexibility and security.

As I said before, if this is a trade-off, you're doing it wrong.

Anyway, you don't want your office machines compromised, either (both to avoid them being used to attack the factory floor, and for their own sake).

Jiri
--
Jiri Baum <[email protected]> http://www.csse.monash.edu.au/~jirib
MAT LinuxPLC project --- http://mat.sf.net --- Machine Automation Tools

#### Michael Griffin

At 13:50 15/12/01 +1100, Jiri Baum wrote:
<clip>
>Actually, that may not be so useful - if an office machine is
>compromised, and it belongs to someone authorized to access the factory
>floor machine... That's where a general trusted path comes in handy,
>but what OS has it?
>
>> It is conceivable that the installation program could offer several
>> possible configurations which offer varying trade-offs between
>> flexibility and security.
>
>As I said before, if this is a trade-off, you're doing it wrong.
>
>Anyway, you don't want your office machines compromised, either (both
>to avoid them being used to attack the factory floor, and for their own
>sake).
<clip>

I'm not sure how this subject got renamed "Windows Security"; this is really a general operating system issue. However, I think you are really not addressing the point I was trying to make. You are talking about securing a plant or company, while I am talking about trying to secure an individual machine. The former is outside the realm of what most people on this list deal with; the latter isn't.

Let's start with the following assumptions. First, we assume that we are building a new piece of equipment - let's say some computerised test equipment. Second, it has some need to communicate via a network to computers in the rest of the plant. The network access can be restricted (if this is technically feasible) to such essential purposes as logging data and reporting its current status. Next, assume that we must follow standard networking protocols in order for the previous statement to be feasible. We must also assume the environment it is being installed in (e.g. the network) is fairly typical in that it cannot be guaranteed to be absolutely secure, and that we have no influence on how this network is configured or administered. Finally, assume that we have a completely free hand in selecting the software (operating system, application development software, etc.) for the equipment we are building.

Now the questions are, what can we do with the operating system to provide the greatest degree of security, and how likely is it that the typical person who builds this equipment will be able to understand and implement this configuration?

A) A typical *general purpose* operating system contains a great many features which are not required for the above mentioned purposes. These features may be such things as media players or internet game interfaces.
B) These features are, under the default installation settings, installed regardless of whether they are needed or not.
C) These features are continuously being added to by the creators of the operating system.
D) The basic operating system itself is subject to less frequent major change than the above mentioned unneeded features.
E) Some of these unneeded features are potential security problems.
F) The typical person designing and programming the type of equipment we are discussing has neither the knowledge nor the time to research and solve all the new security problems which come up on a daily basis.
G) The machine will be installed as a turn-key system in a plant where it will run unattended on a continuous basis, and will not be subject to the daily patches and modifications which we see in office applications.
H) There is no one who is going to apply regular patches to the operating system for "security" reasons. There is no point in creating wild conjectures based on this person being present. Such a person simply does not exist at the job site.

Possible solution - remove (or do not install) features which are not needed for the defined application and which present actual or potential security problems.

For example - you mentioned "macro viruses" as being a major security problem. Macro viruses run by using scripting features present on the host computer. If the machine application does not require scripting, then scripting can be removed from the installed machine without affecting any required functionality. If there are no scripting facilities installed, then the machine is completely immune to macro viruses. I'm sure there are other possible examples.

So, now the question becomes, how feasible is the above, and how easily can it be implemented? If this really is useful in the automation field, then how feasible is it for some third party to supply a version of a "standard" operating system which comes already configured this way? Alternatively the same result may possibly come from a utility which performs (and audits) an equivalent configuration with an operating system installed from the normal distribution CD. This would seem to be a more realistic approach than simply assuming that somehow "somebody" will "look after things".

**********************
Michael Griffin
**********************

#### Anand Iyer

I believe that what you ask for has been covered to required levels.

The paper looks at security from an I&C engineers perspective.

It assumes that you, as an I&C person, will be forced to have automation systems and networks which could have security holes in them - in other words, systems and networks that are weak. And you would connect them to the internet or an intranet where attacks could be very vigorous.

It does not look at a particular product but recommends the use of good products.

The paper also tries to provide measures to counter WAP/mobile-based attacks, and also schemes for protecting plants that have total remote control (currently, say, via satellite communication) and wish to connect via the internet.

It is quite exhaustive, but it gives guidelines and you have to apply them to your system.

Anand

#### Ranjan Acharya

<clip>Firewalls are the Maginot lines of the Internet.</clip>

<clip>Or lands gliders on the roof of your fortress and ...</clip>

Those poor military planners in France and Belgium, expecting WWI and getting WWII. I think there is a book called "Strange Defeat" or something like that, written by a French military strategist in the late 1930s just before the Blitzkrieg juggernaut was unleashed, that pointed out the fatal flaws in the allied military planning (fortifications, tanks designed for trench warfare, reliance on the Ardennes Forest [even though it had proven remarkably porous only a few years earlier], poor communications and control, misuse of air power and so on). The poor gentleman who wrote the book was executed in the 1940s for being a member of the Maquis. Also, the German planners actually observed early kinds of "manoeuvre warfare" with light tanks, mobile troop formations and co-ordinated air power in Great Britain in the early 1930s. The British never followed up their initial experiments - "While England Slept" by Jack Kennedy summarises that quite nicely.

A good thing to keep in mind when planning security - of any kind. The plan you have today is going to be useless tomorrow - all that money spent on Cold-War-era weapons, for example.

We must not forget that it is not just Windows with the security problems. Windows seems to be an insecure system with a few security safeguards. Linux appears to be a relatively secure system with a few security holes.

However, Linux is not immune to security flaws. Recently a problem was discovered in wu-ftpd, an FTP server that runs under all the various distributions of Linux (Caldera, Red Hat, SuSE and so on). Unfortunately, due to an administrative error, Red Hat released details of the security hole to the user community before all the patches (from all the various vendors) were ready. They were all supposed to release their patches on the same day so that alert users would be ready (unfortunately, the patch only covers people who bother to install patches). Instead, anyone with a Linux system with wu-ftpd active was left with a gaping security hole. Security administrators were busy patching their systems, but the systems had already been compromised and it was too late. This was real; some web sites had to go off line to be rebuilt. YOUR credit card perhaps?

Only connect to the outside world if you absolutely have to. Physically remove your system from the outside world if there is no good reason to be there.

#### Bob Peterson

In a message dated 12/19/01 10:00:05 AM Central Standard Time, [email protected] writes:

> A good thing to keep in mind when planning security - of any kind.
> The plan you have today is going to be useless tomorrow - all that
> money spent on Cold-War-era weapons, for example.

To continue this analogy a bit ...

From the free world's POV - those weapons are not useless even today, and saved the world from Soviet domination. They were worth every nickel spent on them, and still have lots of utility. They were designed against a threat that is lessened, and don't directly address all of the threats we face today. While cruise missiles, precision guided munitions, and stealth aircraft were designed to deal with the Soviet threat, they are also quite useful in today's environment (just ask those on the receiving end in Afghanistan if you do not believe that).

Today's security tools are helpful with tomorrow's security problems but are not the whole answer. The point is that what works today probably will not work as effectively tomorrow. I see no practical way to design a security system that will never have to be updated, notwithstanding the suggestions by some other members of the list that this is possible. I suspect you are forever going to have to update and patch your computer security systems.

And really, the worst security problems are not software or hardware but people.

- people who write passwords down on the bottom of their keyboard because they are forever being asked to enter them, and they cannot easily remember the half-dozen different passwords they need for various functions
- disgruntled employees who are authorized access and give that access to others
- people who don't log off when they go to lunch or the bathroom
- control room operators who don't lock the control room door because it's inconvenient
- people who give out passwords over the phone to an employee who "forgot" it
- people who have authorized access and use it in unauthorized ways

We all know horror stories about this kind of problem, yet we focus on security holes that have really not been that big a deal. They get a lot of press, but so far they have not amounted to much as far as real problems go. More of a nuisance than anything else. They need to be addressed, but the real problems, the people problems, are far worse, and we do little to address them - probably because it is outside of our area of responsibility.

My basic practice has been to leave the SCADA systems I supply almost totally insecure. On NT systems they automatically boot up into the administrator account and automatically start the SCADA software. I do not enable the SCADA software's internal security system. There are a couple of reasons for this.

1. I have no idea (usually) what is appropriate for the end user.

2. I have little clue what they want or will accept at the stage when these things are typically shipped.

3. I want the end user to understand that the security of the system is HIS problem, and not mine. He has to live with it for the next umpteen years.

4. I am not smart enough to figure out what the end user might need 6 months or a year from now. Maybe some people are psychic and can tell these things, but I cannot. To make a judgement call at a very early stage of the game makes no sense, when you have very little information with which to make that judgement.

I do make it very clear in the accompanying documentation that it may (or may not) be appropriate for the end user to adopt some security measures, but that what level of security is implemented is his responsibility and not mine. This makes it obvious that the end user has to take the initiative to determine what level of security he is comfortable with and to maintain or increase it as he feels the need. I will not be around to install the next O/S security patch. The end user has to take that responsibility on himself.

Bob Peterson

#### Ralph Mackiewicz

> > Firewalls are the Maginot lines of the Internet.
>
> Those poor military planners in France and Belgium, expecting WWI and
> getting WWII.

...snip...snip...

While the brief history of the Maginot line was very interesting, it does not address the essential flaw in claiming that modern firewalls are equivalent to the Maginot line. I must disagree with this assertion very strongly.

The Maginot Line's purpose was to prevent an enemy from achieving a successful invasion. It was an incompetent solution for that purpose, as was pointed out.

However, the firewall's purpose is not to eliminate all security risks related to the Internet. The firewall's purpose is to prevent specific kinds of communications from occurring. The vast majority of modern firewalls perform that purpose very admirably.

However, you can't rely on firewalls only. Firewalls won't stop denial of service attacks for instance. If your security depends on that then you must take other steps. Firewalls won't address internal security problems such as disgruntled ex-employees with old accounts that are never disabled. If your security depends on that then you must take other steps.

Firewalls are an essential and necessary part of any comprehensive communications security plan involving the public Internet. Anyone who thinks that they don't do the job they are designed for is going to incorrectly dismiss an important and cost-effective tool.

Regards,

Ralph Mackiewicz
SISCO, Inc.

M

#### Michael Griffin

>However, Linux is not immune to security flaws. Recently a problem was
>discovered in the wu-ftp client that runs under all the various
>distributions on Linux (Caldera, Red Hat, SuSe and so on).

I heard about this recently. This was one of the things that reinforced my thoughts about limiting unneeded features. The version of this story that I read, though, said that this particular ftp client was only used on some Linux distributions (there are others available which some people preferred), and only in some versions. Also, many of the people who had this
package didn't install it, and so weren't affected.
In other words, this wasn't something that affected every Linux system in existence, only some. What this tells me is that if I don't need (for example) ftp, then if I don't install it I won't have an ftp problem in the event that another problem turns up in future. This same principle could be applied to other features. I don't have a list of what these features may be
at this point, but it sounds like a promising approach to limit the amount of software maintenance needed.

This really isn't much different from other maintenance problems. For example, we have several lines where the builder installed ventilation fans on all the electrical enclosures. We found ourselves changing a lot of
filters until we looked at whether the enclosures actually needed fans. It turned out that most of them didn't. The reason they all had fans was that it was easier for the design engineers to just put fans in all of them than
to think about which ones needed fans and which ones didn't (I know this was the case because we asked them about it). We removed the fans from most of the enclosures, and reduced our routine maintenance work load. This meant we could now spend more of our available PM time on things which really needed it.
By the same token, I would like to avoid installing software patches to fix features we don't even need.

**********************
Michael Griffin
**********************

M

#### Michael Griffin

<clip>
>My basic practice has been to leave the SCADA systems I supply almost
>totally unsecure. On NT systems they automatically boot up into the
>not enable the scada software's internal security system. There are a
>couple reasons for this.
<clip>
>I do make it very clear in the accompanying documentation that it may (or
>may not) be appropriate for the end user to adopt some security measures,
>but that what level of security is implemented is his responsibility and
>not mine. This makes it obvious that the end user has to take the
>initiative to determine what level of security he is comfortable with and
>to maintain or increase it as he feels the need. I will not be around to
>install the next O/S security patch. The end user has to take that
>responsibility on himself.
<clip>

It would be difficult for you to apply your own security standards, as each customer will (hopefully) have their own ideas about how they want it done. The customer won't want a different security configuration on every system they get.
The best that can be expected is for the customer to write a clear security configuration specification that can be issued with all the other specs for the project. Some won't even want that, but rather they would wish to do their own security configuration - including changing the administrator passwords as soon as you are out the door.

This also, however, raises another point. What is appropriate for one type of application may not be appropriate for another. I have been
concentrating on what might be useful for computerised automated test systems. SCADA is another field altogether which has its own set of problems. This is particularly true as a SCADA system may have to do many more types of communications than a test system.
With a test system, I have been assuming I can usually restrict the communications needs to a very narrow subset of what is technically
possible. Once we get into the "nice to have" features (web servers, etc.) we could decide if the potential security problems and software maintenance (patches) are worth the perceived benefit.

**********************
Michael Griffin
**********************

J

#### Jiri Baum

Ranjan Acharya:
> We must not forget that it is not just Windows with the security problems.
> Windows seems to be an insecure system with a few security safeguards.
> Linux appears to be a relatively secure system with a few security holes.

Actually, the difference between MS Windows and Linux isn't that great. True, Linux seems to have had fewer security holes, and tends to respond to them better, but the basic security strategy is similar - a hodge-podge of
permission bits.
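To make that remark concrete, here is a small sketch of what "a hodge-podge of permission bits" means in practice on a Unix-like system: access control reduces to a handful of per-file flags, not a rich policy. (The example below is my illustration using Python's standard `stat` module, not anything from the posts above.)

```python
# Sketch: file access on a Unix-like system is governed by a few
# mode bits (read/write/execute for owner, group, and others).
import os
import stat
import tempfile

# Create a scratch file and restrict it to owner read/write (0o600).
fd, path = tempfile.mkstemp()
os.close(fd)
os.chmod(path, stat.S_IRUSR | stat.S_IWUSR)

# Inspect the bits: the entire security decision is flag tests.
mode = os.stat(path).st_mode
owner_can_read = bool(mode & stat.S_IRUSR)
others_can_read = bool(mode & stat.S_IROTH)

os.remove(path)
```

Whether a given user may touch the file comes down to which of those bits are set, which is what makes the model simple but coarse-grained.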

> Only connect to the outside world if you absolutely have to. Physically
> remove your system from the outside world if there is no good reason to
> be there.

The "air gap" system of security. Effective, but limiting.

Jiri
--
Jiri Baum <[email protected]> http://www.csse.monash.edu.au/~jirib
MAT LinuxPLC project --- http://mat.sf.net --- Machine Automation Tools