The "reinstall it" mindset, was Control standards and initiatives

Mark Bayern (Thread Starter)

While some of it might simply be intellectual laziness, there are at least two other reasons for the simple 'reinstall' answer on Windows products. Those reasons are overwritten DLLs
and the infamous registry. Once the install of an app has modified the registry and overwritten some DLLs with new non-working DLLs, or with older DLLs, even the most intellectually non-lazy(?) person can find it impossible to reconstruct the system.

Mark


At 01:42 PM 5/9/00 -0400, you wrote:
>------- Forwarded message follows -------
>From: "Robert Raesemann" <[email protected]>
>
>The reinstall it thing is mostly a result of intellectual laziness. I doubt
>that people who take that approach would be very successful with Linux or
>any other OS, no matter how open it is. For the most part, these people
>don't know what they are doing and don't bother to try to learn when they
>encounter a problem. It is much easier to reinstall and hope the problem
>goes away. The scary thing is that a lot of these folks think that they are
>hot stuff and have no idea the depth of their incompetence (as I often put
>it: They don't have a clue and they don't realize that they don't have a
>clue so they aren't even looking for one). Everyone that I have ever met
>who is really good at anything has a healthy respect for the subject and a
>good idea that there are still many things that they do not know. They are
>essentially lifetime students. .... <clip>
 

Phil Covington

Your comment above just adds support to Robert Raesemann's argument of "intellectual laziness" or at least is a testament to most people's
ignorance of available tools to deal with DLL versioning and dependencies.

There are many (free) utilities available that will help you deal with DLL versioning and dependency problems. For people who have MS Visual Studio installed, there is a dependency checker that is useful for troubleshooting
problems. Microsoft maintains a DLL Help Database at http://support.microsoft.com/servicedesks/fileversion/default.asp?vartarget=msdn
This site lists every Microsoft product that ever shipped with a given version of a DLL. It is also easy (for programmers) to programmatically
query the version and dependencies of DLLs through the API.
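
For anyone curious what that API looks like, here is a minimal C sketch (mine, not Phil's) that reads a DLL's version resource with the Win32 version-info functions. It assumes a plain Win32 C compiler and linking against version.lib; the DLL path is passed on the command line:

/* Minimal sketch: query a DLL's version resource via the Win32
   version-info API.  Build e.g. with:  cl dllver.c version.lib   */
#include <windows.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[])
{
    DWORD handle = 0, size;
    BYTE *data;
    VS_FIXEDFILEINFO *ffi = NULL;
    UINT len = 0;

    if (argc < 2) {
        printf("usage: dllver <path-to-dll>\n");
        return 1;
    }

    size = GetFileVersionInfoSizeA(argv[1], &handle);
    if (size == 0) {
        printf("%s has no version resource (error %lu)\n",
               argv[1], GetLastError());
        return 1;
    }

    data = (BYTE *)malloc(size);
    if (data == NULL || !GetFileVersionInfoA(argv[1], 0, size, data)) {
        printf("GetFileVersionInfo failed (error %lu)\n", GetLastError());
        free(data);
        return 1;
    }

    /* "\\" asks for the fixed-length VS_FIXEDFILEINFO block. */
    if (VerQueryValueA(data, "\\", (LPVOID *)&ffi, &len) && ffi != NULL)
        printf("%s is version %u.%u.%u.%u\n", argv[1],
               HIWORD(ffi->dwFileVersionMS), LOWORD(ffi->dwFileVersionMS),
               HIWORD(ffi->dwFileVersionLS), LOWORD(ffi->dwFileVersionLS));

    free(data);
    return 0;
}

Dependency information (the import table) can likewise be read programmatically, but the version query above is usually the first thing you need when chasing a mismatched DLL.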

There are books that describe the Windows registry in detail, so all one has to do is some reading to understand it. While we can argue all day about the advantages and disadvantages of the Windows registry, I can say that many of the configuration files scattered all around a *nix system are no less cryptic, and no more understandable, than the registry. There are also good tools available that help maintain and defragment the registry. Being diligent about periodically backing up the registry files goes a long way toward avoiding problems. It amazes me how many new *nix users will devote
endless hours to understanding and learning about the *nix operating system, but then complain that Windows is impossible to understand when the
information is readily available. (And please don't lecture me about the *nix way of doing things - I have an MS in computer science and have been exposed to the *nix operating system in depth since my college days <g>).
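
As a rough sketch of the periodic backup being described (my example, not part of the original post), the Win32 registry API can save a hive to a file with RegSaveKey. It assumes the account running it holds SeBackupPrivilege and that the target file, an arbitrary example path here, does not already exist; link against advapi32.lib:

/* Sketch only: save a registry hive to a file with RegSaveKey. */
#include <windows.h>
#include <stdio.h>

static BOOL enable_backup_privilege(void)
{
    HANDLE token;
    TOKEN_PRIVILEGES tp;

    if (!OpenProcessToken(GetCurrentProcess(),
                          TOKEN_ADJUST_PRIVILEGES | TOKEN_QUERY, &token))
        return FALSE;

    tp.PrivilegeCount = 1;
    tp.Privileges[0].Attributes = SE_PRIVILEGE_ENABLED;
    if (!LookupPrivilegeValueA(NULL, "SeBackupPrivilege",
                               &tp.Privileges[0].Luid)) {
        CloseHandle(token);
        return FALSE;
    }
    AdjustTokenPrivileges(token, FALSE, &tp, 0, NULL, NULL);
    CloseHandle(token);
    return GetLastError() == ERROR_SUCCESS;
}

int main(void)
{
    HKEY key;
    LONG rc;

    if (!enable_backup_privilege()) {
        printf("Could not enable the backup privilege.\n");
        return 1;
    }

    /* Back up HKLM\SOFTWARE to a file (the path is just an example). */
    rc = RegOpenKeyExA(HKEY_LOCAL_MACHINE, "SOFTWARE", 0, KEY_READ, &key);
    if (rc != ERROR_SUCCESS) {
        printf("RegOpenKeyEx failed: %ld\n", rc);
        return 1;
    }
    rc = RegSaveKeyA(key, "C:\\backup\\software.hiv", NULL);
    RegCloseKey(key);
    if (rc != ERROR_SUCCESS) {
        printf("RegSaveKey failed: %ld\n", rc);
        return 1;
    }
    printf("Hive saved.\n");
    return 0;
}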

I have administered both Windows and *nix systems. I have *yet* to have a problem where I had to reinstall *either* operating system ( excluding hard drive failure). But then I have taken the time to educate myself about both
Windows and *nix systems and administration tools. So Robert's argument of "intellectual laziness" is right on the mark in my opinion...

BTW, I am not an MCSE...

Regards,

Phil Covington
 
It must be nice to live in a corner of the world where the cost of doing things is ignored. In such a place there is no need for "intellectual
laziness". In the real world, however... it is decidedly much more cost-effective to delete the trashed Windows structure and re-install the whole
shebang than it is to spend literally hour after hour trying to reconstruct a registry and DLL mishmash.

Granted, there are times when a re-install is not possible or not desired. In such cases a manual repair is the only recourse (I have done these many times at my retail computer store, and the longest a repair has taken is 6-7 hours, consisting of registry editing and DLL hunting and replacing). When you are dealing with the public's equipment, you will find that the number of billable hours you spend on it is of much greater importance than the level of "intellectual laziness" involved in the repair.

When dealing with ideal equipment used only by technically capable users in a professional environment, where all software and revision information is available... your viewpoint makes sense. But outside of these constraints... a re-install is usually the most cost-effective fix.

Best Regards... Rick Kelly

Chief Technician
Natural Cuts
Cheese Operations
Kraft Canada
(613) 537-8069 V
(613) 537-8057 F
[email protected]
http://trondata.on.ca
 
One little caveat about these version tracking tools... They have to be installed BEFORE they can be effective. There are also a LOT of environments where installing a tool like Visual Studio is not practical. I would venture a guess that most people don't even know that these tools are there to be used, let alone how to use them.

It's not that I disagree with you, it's just that there are way too many uneducated computer users out in the world who simply don't care about version tracking or revision levels. Their only concern is "Why won't Word load my file
anymore?". The fact that they have been on a "downloading binge" for the past week, installing and uninstalling this and that, only complicates the matter. I have seen it happen all too many times.

Short of locking machines down so tightly as to make them completely unusable, I am afraid that the dreaded computer disease "versionitis" will be with us for some time to come.

Ron Gage - Saginaw, MI
([email protected])
 
On Tue, 09 May 2000, Robert Raesemann wrote:
> I see that a lot of
> the people who are slamming NT have put a lot of time and effort into
> learning Unix but have not made a similar investment in NT.

Do you?

How many people who have participated in this thread have actually developed UNIX-based systems? I do both - sorry, I do any system, I love learning something new. But NT is no great magic OS; it is one of many, and although
it is generally mediocre it does have a role. But not EVERYWHERE. I like doing something on NT when it is merited, but I hate being forced to dig a hole with a screwdriver just because MS spinmeisters are so good. That is just ridiculous.

I have put effort into NT, but you said it yourself when you noted that you are proud to get the same uptimes with NT, and that NT still has a long way to go, etc.

I keep asking: why bother? There are already solutions out there. Years ago I was really on the NT bandwagon, as it promised UNIX-like power at PC prices. Now it almost delivers, but it costs more and requires MORE learning to become expert in. And it is always moving. And MS have this "to hell with everybody, we do what we like" attitude, which was exactly the kind of attitude that made us shun IBM and co. and look to MS in the first place.

UNIX to NT is not an upgrade, it does not save money, and while the initial learning curve is shallower, it soon gets steeper when you start developing serious systems.

So why bother?

BTW, before somebody replies "because it is what everybody uses", I am referring to server-side and backend stuff here, where Windows does not
dominate and in any case is a far cry from MS OFFICE. And yes, the CEO accesses data directly from my application using Excel on a W9x laptop.
 

Robert Raesemann

We are probably arguing about the same thing here. The point that I was making is that administrative practices and attitudes are a major factor in system reliability. There are definitely times when you should just cut your losses and reinstall. If you find a system that someone else has hosed up and you are cleaning up the mess, fine. If they are a good paying customer then this is great for you and you should appreciate the steady stream of business that they will provide. If you find yourself
reinstalling systems that you have control over, then you should really be thinking about how you operate.

In response to your comment below I could say that it must be nice to work on systems that are unimportant enough to allow that kind of downtime. I can't imagine having to explain that the system was down for half a day while I reinstalled NT, the applications, and all of the data. That kind of thing would not be tolerated in most of the environments that I work in and
I would be looking for employment elsewhere. Downtime is usually a lot more expensive than my hourly rate, usually by orders of magnitude. My customers might have to discard product or incur regulatory fines for missing data. I catch serious heat for downtime. If something goes down I better have a very good reason for it and it better not be because I was careless. It's
always cheaper to take the time to do things properly the first time than it is to redo them. I have systems that have run for years without a
reinstall. These have been running since the SP2 days. I am very careful with my production machines. I test new software and updates and plan their installation in a sandbox environment before I touch my production machines. This leads to very predictable and stable results. I work on systems that tend to have many users relying on them 24/7. In my experience, with the proper discipline, education, and experience you should never have a production system in such a state as to require a reinstall.

I can contrast my behavior on my production servers to my personal workstation or development boxes. I probably nuke them and reinstall at
least once a year. Since they are not production equipment I tend to be more daring with them. I install stuff to play with it and yank it off,
install things without testing them first, etc. I simply don't do these things in my production environment. This goes for UNIX or NT systems.
I've seriously hosed up Linux boxes while fooling around and found it easier to reinstall than to try to put it back together.

I guess my point is that I try to treat my production servers like big iron. I look at the admin methodology of successful UNIX system admins and try to model their techniques and mindset. I used to think that I should use the
mainframe mindset but that would mean that I would still be running NT 3.51 waiting for them to work the kinks out of 4.0 <g>. Most of the folks that I see having serious problems with NT and complaining about poor reliability treat their boxes like a desktop PC. They usually just complain that NT is too unreliable and never consider their own practices which I have observed
are usually in great need of improvement.
 

Phil Covington

Yes, we seem to be discussing two different worlds...

I agree that when you are working on someone's home computer system, it may make more sense to just re-install Windows instead of spending (and charging for) the time to figure out the problem. When they
re-install all of the junk that they have downloaded off the internet, it is only a matter of time before they are calling you again with problems.

In a professional environment with development software, various PLC programming packages, SCADA/HMI, CAD software, etc., whose settings and
configuration need to be preserved, re-installing the OS is much less acceptable. It is worth the time spent to resolve these problems, and
generally, once you've done it a few times, the problem can be resolved in much less time than it takes to do an OS re-install. Of course, most of the time in these environments you (should) have much more control over what is being installed on the system, so these problems aren't as common anyhow.

Regards,

Phil Covington
 
Robert...

You are right, we are talking about the same thing. The machines in our production systems environment would be treated much differently than a machine from a customer off the street. I find that what is normal practice in the industrial control arena would be frowned upon in the retail world. The hoops we all jump through to remove all causes of downtime would never be accepted by the general public.

Best Regards... Rick Kelly

Chief Technician
Natural Cuts
Cheese Operations
Kraft Canada
(613) 537-8069 V
(613) 537-8057 F
[email protected]
http://trondata.on.ca
 

Curt Wuollet

DLL hell is payback for the whole poor practice of replacing libs on a whim. It's a bad idea that produces a bad outcome. In one of the rare moments of sanity I've seen in the Windows world, MS is finally moving to end, or at least discourage, the practice. The real problem is ISVs who still feel they own the whole machine and that the only thing that really matters is that _their_ application runs. What makes me curious is why this is so much less of a problem in other operating systems. It's very rare that anyone changes or replaces a shared lib under Linux. Peer review perhaps? Responsible programmers? Cooperation?

Curt Wuollet,
Linux Systems Engineer
Heartland Engineering
 
> It's very rare that
> anyone changes or replaces a shared lib under Linux. Peer review
> perhaps? Responsible programmers? Cooperation?

It's very simple, actually. Microsoft started at the bottom level, with one individual user on their machine, and is constantly trying to work its way up. *nix started at the top, with large systems with multiple users and a system administrator, and has been working its way down to systems where there really is only one user.

Microsoft's concentration has been, and still is, the "user experience". Linux (to pick the hot alternative OS) has always concentrated more on the technical aspects.
 

Phil Covington

Note that the dependency checking program that comes with Visual Studio can be downloaded free (about 350K) from MS's web site, so you don't have to purchase Visual Studio to take advantage of it. There are also much better tools available on the internet for free. A search of the newsgroups through deja.com will turn up many recommendations.

What I was trying to say is that people who administer Windows systems *should* know about these tools. Users, maybe not... But if you are technically savvy enough to load Linux on a system and take the time to learn where and how
to twiddle the system, you should also be able to educate yourself about the tools and techniques on Windows systems, IMO.

>It's not that I disagree with you, it's just that there are way too many
>uneducated computer users out in the world who simply don't care about version
>tracking or revision levels. Their only concern is "Why won't Word load my file
>anymore?". The fact that they have been on a "downloading binge" for the past
>week, installing and uninstalling this and that, only complicates the matter. I
>have seen it happen all too many times.

This is certainly true.

Regards,

Phil Covington
 

Gilles Allard

Phil Covington wrote:

> There are many (free) utilities available that will help you deal with
> DLL versioning and dependency problems. For people who have
> MS Visual Studio installed there is a dependency checker that is
> useful for trouble-shooting problems. Microsoft maintains a DLL
> Help Database at
> http://support.microsoft.com/servicedesks/fileversion/default.asp?vartarget=msdn
> This site lists every Microsoft product that ever shipped with a
> given version of a DLL. It is also easy (for programmers) to
> programmatically query the version and dependencies of DLLs through the API.

For many years, Micro$oft recommended that DLLs be placed in the "system" directory. That was a wrong decision. In Win2000 they now protect the DLLs in the system directory from being overwritten.
That protects the OS, but does it let the applications keep working? (I have no experience with W2000$.) If you're not on Win2000$, is the only solution to ask your users to check the
KnowledgeBase every time they want to install new software? Be realistic, they won't. The only other way is to block any installation of a new DLL into the system directory. That will make your Micro$oft system more robust, but your users will be angrier. We've paid many K$ (possibly M$) for Micro$oft's wrong recommendations.

Gilles
 

Robert Raesemann

Out of curiosity, what is the best way to deal with shared libraries? Are there any systems that do it exceptionally well? Statically linked libraries take up a great deal of space on the hard drive and in memory; that is what DLLs were designed to solve. What's the best way to share the code?

The Don Box book, "Essential COM", goes into great detail about the problems with shared libraries and how COM is supposed to correct them. The idea is to support an immutable
interface so that you can update your private code without screwing up all of the clients that were compiled against an older version. Each
class is given a globally unique identifier (GUID) and the DLL is registered in the registry. The thought process that led to the development of COM is pretty interesting, but when you get into the details it sure boogers up your C code. You can't help but think that there should be a much more elegant solution to the
problem.
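
To make that concrete, here is a hypothetical sketch of what COM activation looks like from plain C. CLSID_Example, IID_IExample and the IExample interface are made-up placeholders rather than a real component, but CoCreateInstance, the registry lookup and the vtable plumbing are the genuine mechanism (link with ole32.lib):

/* Hypothetical sketch only: COM activation from plain C.  CLSID_Example,
   IID_IExample and IExample are placeholders, not a real component.     */
#include <windows.h>
#include <objbase.h>

/* Placeholder GUIDs -- a real component publishes its own. */
static const CLSID CLSID_Example =
    { 0x12345678, 0x1234, 0x1234, { 0x12,0x34,0x12,0x34,0x56,0x78,0x9a,0xbc } };
static const IID IID_IExample =
    { 0x87654321, 0x4321, 0x4321, { 0x12,0x34,0x12,0x34,0x56,0x78,0x9a,0xbc } };

/* In C, a COM interface is just a struct holding a pointer to a table of
   function pointers (the vtable).  The first three entries are always the
   IUnknown methods.                                                       */
typedef struct IExample IExample;
typedef struct IExampleVtbl {
    HRESULT (__stdcall *QueryInterface)(IExample *self, REFIID iid, void **out);
    ULONG   (__stdcall *AddRef)(IExample *self);
    ULONG   (__stdcall *Release)(IExample *self);
    HRESULT (__stdcall *DoWork)(IExample *self, int value);
} IExampleVtbl;
struct IExample { const IExampleVtbl *lpVtbl; };

int main(void)
{
    IExample *obj = NULL;
    HRESULT hr;

    CoInitialize(NULL);
    /* CoCreateInstance looks the CLSID up in the registry to find the DLL
       that implements it, loads that DLL, and hands back the interface.   */
    hr = CoCreateInstance(&CLSID_Example, NULL, CLSCTX_INPROC_SERVER,
                          &IID_IExample, (void **)&obj);
    if (SUCCEEDED(hr)) {
        obj->lpVtbl->DoWork(obj, 42);   /* every call goes through the vtable */
        obj->lpVtbl->Release(obj);
    }
    CoUninitialize();
    return 0;
}

Every call goes through the vtable, and a published interface is never changed; new functionality gets a new interface with a new IID. That is the "immutable interface" idea, and also why it boogers up plain C.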

I've played with Java some but have only really scratched the surface. Do you run into problems with different versions of classes? What happens if I write a class that extends version 2.0 of somebody else's class and the system that I try to run it on only has version 1.0? I assume that it probably crashes, but then if I put v2.0 of the class on the computer, does it screw up programs that expect v1.0? Again, I assume that if the programmer used interfaces to the class and did not change that interface, it would all work fine together. This is very similar to the way that COM is supposed to work.

How does CORBA handle these issues?
 

Johan Bengtsson

Another interesting note is that some of these dependency checkers test which DLLs the application is linked against, that is, the DLLs that Windows loads when the application itself is loaded. Some dependencies take the form of
LoadLibrary() calls, i.e. calls made after the application has loaded to bring in more DLLs. These are usually NOT covered by ordinary dependency checkers; to catch them you have to run the program and have the dependency checker observe which DLLs the application actually loads.
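
A tiny C illustration of that point (my example): the import table of this program never mentions user32.dll, because the DLL is pulled in at run time, so a static dependency walker will not list it. MessageBoxA is just a convenient stand-in for any late-loaded API:

/* Run-time dependency demo: user32.dll never appears in the import table. */
#include <windows.h>
#include <stdio.h>

typedef int (WINAPI *MSGBOX_FN)(HWND, LPCSTR, LPCSTR, UINT);

int main(void)
{
    HMODULE mod = LoadLibraryA("user32.dll");   /* loaded only at run time */
    MSGBOX_FN msgbox;

    if (mod == NULL) {
        printf("user32.dll not found\n");
        return 1;
    }
    msgbox = (MSGBOX_FN)GetProcAddress(mod, "MessageBoxA");
    if (msgbox != NULL)
        msgbox(NULL, "Loaded at run time", "LoadLibrary demo", MB_OK);
    FreeLibrary(mod);
    return 0;
}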


/Johan Bengtsson

----------------------------------------
P&L, the Academy of Automation
Box 252, S-281 23 Hässleholm SWEDEN
Tel: +46 451 49 460, Fax: +46 451 89 833
E-mail: [email protected]
Internet: http://www.pol.se/
----------------------------------------
 

Robert Raesemann

I'm having trouble mapping this very general explanation to an actual technical process. The management of shared libraries doesn't really have much to do with the size of the system. How is Linux so much more effective at maintaining shared libraries than NT? I have been using Linux for quite some time but have never really
done any serious programming on the system. My limited experience tells me that most libraries are still statically linked to user programs. Am I wrong about this? Does Linux make as extensive use of shared libraries as NT does? I would imagine that UI code would make pretty extensive use of shared libraries. How do Unix systems handle the upgrading of libraries so that programs
compiled against an older version are guaranteed to work against a newer version of the library?
 
> > There are many (free) utilities available that will help
> you deal with DLL
> > versioning and dependency problems.

And just why is it that a program would ever install older versions of DLLs without notifying the person installing the program and asking if it should proceed? Why is it that new DLLs would ever get installed, if they are not backward compatible with the old DLLs, without notifying the installer and asking if it's OK to proceed?

> > For people who have MS
> Visual Studio
> > installed there is a dependency checker that is useful for
> > trouble-shooting

The price I've got on Visual Studio is $1469.16. Sounds a bit expensive to fix a problem that shouldn't exist in the first place.

Mark Blunier
Any opinions expressed in this message are not necessarily those of the
company.
 

Warren Postma

That dependency walker is also part of the NT Resource Kit, is it not? That can be had for
substantially less money.

Warren
 
-> And just why is it that a program would ever install older
-> versions of DLLs without notifying the person installing the
-> program, and ask if it should proceed? Why is it that new
-> DLLs would ever get installed if they are not backward
-> compatible with the
-> old DLLs without notifying the installer and asking if
-> its OK to proceed?
->

Code forks. The DLLs may be the same age, but differ because of different mods made by different vendors.

Plus, even if it asks, if you just paid $X for a software package, most people are going to install it, whatever the warnings.

(Please no flames. I know better, and most admins know better. Most is not all, however. I know of several individuals who are IT staff, but don't know anything beyond the GUI)

--Joe Jansen
 
Linux makes extensive use of shared libraries. I would say the difference is that application vendors and contributors regard the shlibs as standard and don't roll their own. In other words,
there are system shlibs and application shlibs, and you don't mess with the system shlibs. You can change your own libs, but that's not a problem. On the Windows platforms it seems
nothing is sacred and people "fix" everything. Now, I understand, MS is going to declare their libs sacred and replace any that are messed with. I believe there are more, smaller, more diverse shlibs on Windows, and gray areas that are not clearly system or application libs. MS's new policy should discourage tampering and help the problem. It's kind of ironic: in open source, where everyone has the source, it's not a problem; in proprietary land, where you have to buy the source, people feel free to hack it.

regards

Curt Wuollet, Linux Systems Engineer
Heartland Engineering Co.
 
List Management Account wrote:

>I'm having trouble mapping this very general explanation to an actual
> technical process. The management of shared libraries doesn't
> really have much to do with the size of the system.

Yes it does: efficient deployment of shared libraries means less code in memory.
Efficient deployment does not mean simply using DLLs (that only means the code **may** be shared); the applications must actually be linked
against the same libraries. So efficient deployment means having all your applications link against a few big common libraries (the Linux tendency), rather than linking against lots of little proprietary libs (the Windows tendency).


> How is Linux
> so much more effective at maintaining shared libraries than NT?

It is not, really, at the OS level; the difference tends to arise because so much code on a Linux system is in whole or in part based on OSS, whereas on NT proprietary (copyrighted and licensed) code tends to predominate.

Anybody who has developed on Windows will be well aware of this: a licence for this and a licence for that, and of course each development
environment has its own libraries. The end result is that we have many libraries doing pretty much the same thing, but each app requires
a different one.

In the OSS world the libraries are free, so development environments will only deviate from the standard ones for a particular reason. Also, with proprietary libraries, if a developer needs something more than or different from what the library offers, he will extend it or write his own
methods, and although he may put them in a DLL, they will generally not feed back into the main library. In OSS, by contrast, the developer may
(and often does) feed his efforts back to the library maintainer, and if the maintainers feel the code is of sufficient quality and useful to
others, it can become a standard feature of the library.

Put simply, if we have two programs with mp3 capabilities loaded on the system at the same time, under NT it is likely that we will have two
different DLLs loaded for handling them, whereas on Linux it is more likely to be a single shared library serving both apps.

> I have been using Linux for quite some time but have never really
> done any serious programming on the system. My limited
> experience tells me that most libraries are still statically linked to
> user programs.

Early systems (five years or more ago) tended to be statically linked, principally because in the early days the libraries were... dynamic (but
in the development sense ;-) ). Also, the introduction of the ELF format (around '95) made dynamic linking a simple linker flag option; prior to this, things were not so easy. In recent years all the dev tools I have used have done dynamic linking by default, and most of what is included in a typical distribution is dynamically linked against a pretty small set of DLLs. Binary apps for general distribution and downloading are, however, often statically linked, as this eliminates the sort of DLL conflicts with which we are all too familiar under Windows.
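
For those who have not seen it, build-time dynamic linking on an ELF system really is just a flag (e.g. gcc prog.c -lm links against the shared libm), and the run-time interface is equally small. Below is a minimal sketch, my own example, using dlopen/dlsym, the rough Unix counterpart of LoadLibrary/GetProcAddress; build with something like gcc -o dldemo dldemo.c -ldl:

/* Run-time dynamic linking on Linux with dlopen/dlsym. */
#include <stdio.h>
#include <dlfcn.h>

int main(void)
{
    void *handle;
    double (*cosine)(double);

    /* Ask for the versioned soname, not a bare "libm.so". */
    handle = dlopen("libm.so.6", RTLD_LAZY);
    if (handle == NULL) {
        fprintf(stderr, "dlopen: %s\n", dlerror());
        return 1;
    }
    cosine = (double (*)(double))dlsym(handle, "cos");
    if (cosine != NULL)
        printf("cos(0.0) = %f\n", cosine(0.0));
    dlclose(handle);
    return 0;
}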

So why don't Windows users do the same? The answer is, once again, predominantly an OSS issue rather than an OS one. Downloading statically linked binaries of OSS is a quick and easy plug-and-play option, but if you like the software and not the way it eats up resources, you can download the source and compile it into your system, which assures library compatibility. That a non-programmer can compile such applications (or indeed the kernel of the OS itself) is again testimony to the advantages of OSS. Although OSS developers may use CVS repositories full of complex patches for development, they also make stable source tarballs that can be built automatically by the standard tools and compilers supplied with the OS;
the user simply types 'make'. On Windows there are no such facilities in the system, so OSS for Windows generally requires specific development tools such as VC++, which must be purchased and installed - no joke if it is the Enterprise edition!

Of course, actually downloading and installing separate application packages is rare on OSS systems (unless you are into bleeding-edge stuff or
development), as 'distros' come with a full set of software, and the low cost means you can just upgrade the whole system every few months (major
distros do this automatically, upgrading the system **and** the installed apps in a single pass). They can be so brave (yes, brave - can you imagine trying this on a Windows box!) because they compile the whole distro from source, so everything is linked against the same versions of the libraries (which would have been among the first things they compiled). So you can install
and maintain the whole system using software compiled in the same environment, and thus with optimum shared-library usage. A typical Windows box, by
contrast, is built from software from many sources, so DLL usage tends to be more diffuse.


>Am I wrong about this? Does Linux make as
>extensive use of shared libraries as NT does?

Well, you have the answer: Linux tends to use fewer DLLs, and thus much more code is actually shared (as opposed to being merely shareable).

> How do Unix systems handle the upgrading of libraries so that programs
> compiled against an older version are guaranteed to work against a
> newer version of the library?

'make'. Like I said, it is the OSS philosophy, not the OS, that makes the difference.

BTW, look at a Linux system and you will find many applications written long before Linux, or dynamic linking, actually existed, including GUI
ones (xcalc, for example, is over 15 years old). Nonetheless you can dynamically link them to the standard libraries provided with the system simply
by sticking the appropriate flag in the makefile.

A much more appropriate question is 'how will I run my mission-critical app when the OS/DLLs on which it depends are no longer shipped?' Silly
question? Well, last week I was called in to help commission a new machine whose application software required NT 3.51, because the first machine of the type was developed 5 years ago. The original developers have long since moved on. The software works fine and nobody wants to change it, but getting 3.51 drivers for new hardware (let alone 3.51 itself) is no joke. The gleaming new PC was swapped for an old 3.51-based one. This is ridiculous in my eyes.
 