Microsoft .NET's impact on the Automation Industry

Jiri:
> > cut -f2 table.txt | sort | uniq -c | sort -n

> > Here, there are four processes in the pipeline, separated by vertical
> > bars. Data goes from left to right:

Ralph:
> Thank you for the detailed description of how to accomplish this on Linux
> using scripting. Take my word as a non-programmer: This will not be
> perceived as easy or simple to anyone who is not a programmer.

I guess the algorithm itself takes a bit of thought; but I contend that there's not much difference between the above syntax and a graphical one that might look something like this (you'll have to imagine the icons):


table.txt
\_________/
|
V
+---------+
| cut |
| field 2 |
+---------+
|
V
+---------+
| sort |
+---------+
|
V
+---------+
| uniq |
| count |
+---------+
|
V
+---------+
| sort |
| numeric |
+---------+
|
V

In fact, the main difference is probably that instead of clicking through property sheets to find the options for count and numeric, you'll be
flipping through man pages looking up the options for count and numeric...

> It is easier for my primitive brain to follow a 20 line VB program to do
> the same thing than to figure out how this simple script works. VB might
> not be efficient but it is much more intuitive.

Partly, it might be that it's a completely different way of programming: the above script is like a factory line, with a conveyor taking the data from one off-the-shelf station to the next. I think LabView is a bit like that, too.

Most other languages are more like a single robot cell, you tell it what to do, in the order you want it done. Even scripting on linux is mostly like that - it's just that some of the steps can be "run this wad of data through A, B, C and D and put the result in file X".
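To make the conveyor picture concrete, here's a rough sketch in Python, with each station written as a generator (the stations themselves are made up for illustration):

```python
# Each "station" consumes a stream and yields a new one,
# like one process in a shell pipeline.
def drop_blank(lines):
    for line in lines:
        if line.strip():
            yield line

def to_upper(lines):
    for line in lines:
        yield line.upper()

# The data rides the conveyor left to right:
data = ["hello", "", "world"]
result = list(to_upper(drop_blank(data)))
print(result)
```

Each station only knows about its own input and output, which is exactly what makes off-the-shelf stations composable.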

> This reminds me of a contest I saw in 1986/7 for the smallest program for
> sorting names alphabetically. The winner was an APL program that was a
> completely unintelligible collection of punctuation marks and letters
...

Heh, that would be the perl script:

while (<>) {
    /\s(\w+)/ or /()/;    # grab the second field ($1), or empty if none
    $count{$1}++;
}
for (sort keys %count) {
    print "$_: $count{$_}\n";
}

Actually, this sorts it in alphabetical order; if you wanted it sorted by counts, the best way would be something called a Schwartzian Transform,
which is about as bad as it sounds.

Which is why I wasn't recommending perl.
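If you did want it in a general-purpose language, Python (which has come up on this list before) does the count-then-sort-by-count without any Schwartzian gymnastics. A sketch, with made-up input:

```python
from collections import Counter

lines = ["id1 apple", "id2 banana", "id3 apple"]
# Count the second whitespace-separated field of each line...
counts = Counter(line.split()[1] for line in lines)
# ...then list them most-frequent-first; most_common() does the
# sort-by-count that needs a Schwartzian Transform in perl.
for word, n in counts.most_common():
    print(n, word)
```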

> > > The Linux effort just does not have any significant marketing efforts
> > > addressing the IA industry.

> > I guess not... apart from Curt and me posting to the A-list...

> That is not marketing...it is evangelizing.

Yup.

> Evangelizing can be an important part of marketing communications

Exactly. The other parts - well, none of us really has the resources to do a worldwide marketing blitz... We do what we can, and for the rest, well, the old saying that a good product sells itself is overstated, but it has a grain of truth to it.

> Given that Linux is essentially free, some other business model must be
> found that provides the incentive for making this investment in marketing
> before Linux will find its way into the corporate mainstream. IBM seems
> to be finding a business model based around selling high-end servers.
> Maybe that will be the coat tails the rest of the Linux community can
> ride on.

Perhaps... although so far the evangelizing seems to have been doing okay. If it only gets turned into marketing at the potential customer's site, when someone goes to his or her boss and says `I need this product to do my job', then so be it.

Jiri
--
Jiri Baum <[email protected]>
http://www.csse.monash.edu.au/~jirib
visit the MAT LinuxPLC project at http://mat.sf.net


 
No, you appear to have misread what I said.

I said Product Pretty had _more_ market share than Product Ugly. In some cases I am aware of, it is as much as 300% or 400% more market share.

I said that Product Pretty had 80% _of the features_ of Product Ugly.

I really meant what I implied: that how "good" a product is depends entirely on its "market share."

Walt Boyes
 

Joe Jansen/ENGR/HQ/KEMET/US

OK, I _do_ have to take offense at that comment:

At various points in my life, I have been proficient in PASCAL, KAREL, DK, Cimpler and Cimpler II, Assembler (including hand-compiling to machine code), Basic (Commodore and Apple II) and countless intelligent-drive programming languages. I am currently working on learning JAVA, Python (thanks Jiri!), and twx scripting. I like to think I have at least a _basic_ grasp of Ladder programming and HMI programming.

I use VB quite a bit. Not because I am a dim bulb, but because it is the easiest way to slap together something with a pretty window to display data. The client machines are all Windows-based. Fact of life that has to be dealt with. I do not have the authority to switch over every desktop in every facility of this corporation. Nothing beats VB for ease of use in this environment. If something better exists in this environment that doesn't involve my having to lay out window geometry and low-level event capture and handling, I will use that. I have just found that VB's appeal is the fact that it does all the window-centric stuff for me. Why should I have to re-write that? It is there for me, and I honestly don't care about handling window movement translations and capturing close events and inter-process communications enough to write it even once.

--Joe Jansen

 
[email protected]:
> Taking a look at the Linux PLC web site that has a link off
> www.control.com, may be a good example of how much energy the Linux PLC
> project has.

I think that'd be the old website. The new website is:
http://mat.sf.net

The manual has a lot of empty pages at the moment, but that should change; we know it's our greatest weakness ATM. The IL page is complete, and the classicladder page at least has a screenshot.

http://mat.sf.net/manual/logic/il.html
http://mat.sf.net/manual/logic/classicladder.html

Jiri
--
Jiri Baum <[email protected]>
http://www.csse.monash.edu.au/~jirib
visit the MAT LinuxPLC project at http://mat.sf.net
 
[email protected]:
> one of my friends use to say, "There are two laws to UNIX: 1) Everything
> is a file 2) Get the last word."
...
> the rich set of tools that you have for processing text streams. You can
> even build compilers with the tools.

These days there are tools for images and stuff, too... Somewhere I have a script that makes thumbnails for my web pages, and somewhere in the middle it says:
djpeg $file.jpg | pnmscale -xysize 100 100 | cjpeg > $file-thumb.jpg

That's: decode a jpeg, scale it to fit into 100x100, encode a jpeg.

> Curt and the other guys that are talking up the beauty of Linux and OSS
> have made the investment in learning UNIX and they are utilizing that
> expertise to develop good solutions for their customers.

Yup. (Though I myself ain't selling anything yet.)


Jiri
--
Jiri Baum <[email protected]>
http://www.csse.monash.edu.au/~jirib
visit the MAT LinuxPLC project at http://mat.sf.net


 
Johan Bengtsson:
> I get vague and "impossible" requests at work too; I argue, I get better
> requests and so on. In the end I know what needs to be done and do that,
> but I really have to have those answers before I can do it, otherwise I
> won't be solving the real problem, and the one making the request in the
> first place won't be happy anyway.

Yup. Doesn't make it any less frustrating, I suppose...

> A lot of times the people making the requests don't really know what
> they want; how could they possibly tell me then?

Not to mention that if they knew, they probably wouldn't need you...

Jiri
--
Jiri Baum <[email protected]>
http://www.csse.monash.edu.au/~jirib
visit the MAT LinuxPLC project at http://mat.sf.net

 
Walt Boyes:

> I was not using "good" in its moral sense. I was using "good" in its
> sense of "suitable to the service."

No, you're using "good" in the sense of "sells the most", which is a rather limited definition.

> This is as true as gravity: a product is "good" if a large percentage of
> the target population buys it and uses it for its intended service.

> There is no other definition of "good" that works in product development.

How about one or two of the dictionary meanings - "reliable, efficient", as in "good brakes" or "having the right or desired qualities; satisfactory, adequate"?

If you need an economic justification, selling mediocre products damages goodwill. Difficult to put figures on, but real.

> A pretty face plate on a product that is 80% as good as the one that is
> ugly but better: the pretty product will outsell the ugly one.

As has been pointed out, here you're using `good' in a different meaning, one which you claim doesn't exist.


Jiri
--
Jiri Baum <[email protected]>
http://www.csse.monash.edu.au/~jirib
visit the MAT LinuxPLC project at http://mat.sf.net


 

Johan Bengtsson

Hmmm, that ends up with U < 0, quite a weak market share if I might say so...

I agree with you. If you define good as proportional to market share, you cannot at the same time define good as something having any connection to technical performance, and not be very clear that you are using two different definitions.

Altogether this is a "good" example, since it definitely shows that there is more than one point to consider about most products. In most cases these will probably be in conflict with each other in one way or another.

BTW, I don't normally measure goodness in market share. Market share is nothing but market share and has nothing to do with goodness, unless the market share affects the goodness (such as, for example, a dating site on the Internet: a higher market share means more people to search among and thereby (probably) a higher success rate).


/Johan Bengtsson

----------------------------------------
P&L, Innovation in training
Box 252, S-281 23 Hässleholm SWEDEN
Tel: +46 451 49 460, Fax: +46 451 89 833
E-mail: [email protected]
Internet: http://www.pol.se/
----------------------------------------
 
Scott:
> the Dot Net Common Language Runtime (CLR) garbage collector (server
> version) can cause delays in an application of a few hundred
> milliseconds.

Garbage collection means that every now and then the program stops and looks around for any memory that's not used any more. That means you don't
have to have explicit steps in the program to de-allocate memory, eliminating some work and some kinds of bugs. It's not particularly worse as far as efficiency is concerned.
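For a feel of it, here's a sketch with Python's collector (not the CLR's; the mechanisms differ, but both stop and sweep periodically). Reference cycles are exactly the kind of garbage that needs such a pass to find:

```python
import gc

class Node:
    def __init__(self):
        self.other = None

# Build a reference cycle that simple reference counting can't reclaim.
a, b = Node(), Node()
a.other, b.other = b, a
del a, b

# The collector's "stop and look around" pass finds it; here we
# trigger one by hand and see how many objects were freed.
freed = gc.collect()
print(freed)
```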

> I would consider this to make the technology pretty limiting for
> automation, or any near real time application - even for middleware.

Yup, that's why the MAT LinuxPLC is using explicit de-allocation. (Of course, theoretically memory allocation is pretty bad anyway, but in
practice it rarely happens. We try not to allocate while running anyway.)

In practice, I suspect that figure of hundreds of milliseconds would be an uncommonly large figure, with normal runs much quicker. Someone might even write an incremental garbage collector that reduces this further.

Jiri
--
Jiri Baum <[email protected]>
http://www.csse.monash.edu.au/~jirib
visit the MAT LinuxPLC project at http://mat.sf.net
 
This happens all over; several times someone has attempted to hire me to install something they've bought that can't possibly run on their systems. Bad position to be in. And if you want to know how to get on the IS/IT excrement roster, go around them and sell to management. That runs the success ratio close to zero.

Regards

cww

 

Scott Cornwall

> Yup, that's why the MAT LinuxPLC is using explicit de-allocation. (Of
> course, theoretically memory allocation is pretty bad anyway, but in
> practice it rarely happens. We try not to allocate while running anyway.)

I hope you mean you absolutely never use dynamic memory allocation in the run time engine. Otherwise your PLC cannot guarantee it will always be able to run in a confined amount of memory, and that it won't suddenly fail after
running for a week/month/year.
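The usual discipline is to claim everything up front and only recycle it afterwards. A toy sketch of that in Python (purely illustrative; a real run-time engine would do this in C):

```python
# Toy fixed-size buffer pool: all storage is claimed at startup,
# so the engine's memory footprint is bounded for its whole life.
class Pool:
    def __init__(self, size):
        self.free = [bytearray(64) for _ in range(size)]

    def acquire(self):
        if not self.free:
            # Exhaustion shows up at a known bound, not as a mysterious
            # failure after running for a week/month/year.
            raise MemoryError("pool exhausted")
        return self.free.pop()

    def release(self, buf):
        self.free.append(buf)

pool = Pool(8)
buf = pool.acquire()  # reuse, not allocation
pool.release(buf)
```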


> In practice, I suspect that figure of hundreds of milliseconds would be
> an uncommonly large figure, with normal runs much quicker. Someone might
> even write an incremental garbage collector that reduces this further.

> Jiri

The reason I raised this is that it seemed an uncommonly large figure to me, and one quoted by a spokesperson for Microsoft. Even if normal runs are quicker, the occasional delay of several hundred milliseconds could still be a serious limitation. I would be extremely surprised if Microsoft was not using an incremental garbage collector.

Jiri, do you have some technical detail on this technology that you are able to share with the group? I mean on the performance of the .NET CLR garbage collector.

Scott
_________________
www.sentech.co.nz
 
Now that's strange; coming from my background the scripting would probably make a lot more sense than the VB, and I hope it doesn't have anything to do with my intelligence. It sounds like you have simply been conditioned to do things the Windows way and are familiar with it, where I have embraced the UNIX philosophy. While it's somewhat more difficult to make a mouseable menu with Unix, it's trivial to sort and reorder a million-record database dump. Each is powerful in its own way. My contention is that the things *nix is powerful at are more germane to programming and processing, and the things Windows is powerful at are mostly related to presentation. This makes sense, since this is what they are aimed at. Of course, most of the UNIX tools have been ported and Linux now wears a pretty face, but neither is as natural as it is on its home territory. I opt for the data tools, as I don't do anything that's graphical. The things I do are much easier in Linux. It is possible to do these things in the GUI, but it would drive me crazy in short order; I've tried it. The Windows way would be to have a button for it. Much simpler. But there are too many things I do that there isn't a button for.

Regards

cww
 

Michael Griffin

Data is not information. This is why I mentioned that "standard application software which can make use of the data is only now starting to appear".
A lot of projects have foundered on the assumption that getting data into a database was the object of the exercise, when in fact getting the data out of the database and analysed into useful information was the difficult part.


**********************
Michael Griffin
London, Ont. Canada
**********************
 

Michael Griffin

Perhaps my letter was a bit long and unclear. Company 'B' is proposing the exact same third party software system that Mr. 'X' had been
proposing all along. Not a similar or equivalent one, the same product from the same company (a small software company specialising in this market). Company 'B' (the big automation components vendor) has simply set up a division to act as an integrator for this (and other software).

This is the irony of the situation. Mr. 'X' proposes something - this is a *bad* idea. Company 'B' proposes the same thing using the same product - call a meeting!

The point of this was to show that these sorts of solutions need to be sold to people who don't have to justify their projects. These sorts of people don't read automation systems magazines. They read the business section in the Globe and Mail. So how do you reach them? Will they even understand what it is you have to offer? Will they be able to see beyond the
normal advertising gibberish to judge whether they can use the product?



**********************
Michael Griffin
London, Ont. Canada
**********************
 
I guess the thing that bothers me most is that almost everybody will get into .NET before anybody has a real sense of where it's going. Face it: if you do business with MS from this day forward, you will be involved with .NET, because it's built into WXP. And like all things MS, once you get in, it's nearly impossible to get out without major disruption and possibly the loss of your .NET data. This means that not paying your "subscription" will likely have very unpleasant consequences. This strikes me as a really bad deal and not very much like a competitive landscape. Sort of hyper lock-in. And if you do decide to opt out (fat chance), you will likely be excluded from many necessary functions, just as Linux users can't make good use of many sites now. This doesn't seem to bother anyone. This just seems really whacked out and one-sided in what's supposed to be a free-market economy. Abusing a monopoly? Smells a lot like it. Competing? Yeah, right. This sounds like the end of feasible competition. No protest? What choice do you have? Now, some. Later, none.

Regards

cww
 

Don Fitchett - Designer of Downtime Central

> > I think that Mr. Griffin has "hit the nail on the head" with his analysis.
> > The point of any plant-floor addition beyond basic machine control is often
> > extremely difficult to justify on a capital cost basis.

Due to the complexity of the manufacturing environment, cost justification has always been a great challenge. This is why I designed downtimecentral.com (http://downtimecentral.com), a free content-only source of information for data collection, standardization, cost analysis and cost justification.
 

Mathias Lindgren

Hi !

One thing that certainly will be affected by MS .NET is the thousands of ActiveX controls that are produced for various needs in the industry.

It's more or less common to have some ActiveX control in a SCADA system, and with VisualStudio.NET the support for ActiveX technology is taken away...

Of course it still exists if you use C++, but the standard will not be enhanced by Microsoft anymore.

And most of the tailor-made controls are made in VB, and in VB it's gone...

And COM/DCOM isn't a .NET technology either, so I think that most of the OPC applications need to be rewritten to suit .NET. But of that I'm not so sure...

I might be totally wrong on all of the above, or...

Have a nice day !

Mathias Lindgren From Sweden
 
I work in the shipping logistics industry designing shipping systems (like APSS aka Aristo, Clippership, KShip, etc.). Currently I'm writing a new product in .NET, and we did notice some issues with the GC initially. Then we did some research (5 minutes' worth) and discovered what we were doing wrong.

The key that you all seem to be missing is that you can call the GC on demand. If you call GC.Collect at the end of each routine (after destroying your objects), it will take care of everything constantly. You can even specify which generation of objects you want to drop. Not calling GC.Collect explicitly is like running on autopilot.
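In Python terms (an analogy only; the CLR's generations and API differ), that's the difference between leaving the collector on autopilot and picking your own collection points:

```python
import gc

gc.disable()  # no automatic passes; we pick the collection points

def end_of_routine():
    # Roughly what calling GC.Collect after each routine does in .NET:
    # collect at a moment we control, not mid-computation.
    return gc.collect()

freed = end_of_routine()
gc.enable()
print(freed)
```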

Here's the real trip... it's so easy to code in DotNet that you will be able to spend less time coding and more time testing and enhancing your app. The features in DotNet make debugging and maintenance much less of a hassle. The end result of DotNet is a *better* product.
 

Eric Lee Elliott

Remember when people refused to use m$ proprietary software in production systems?

Now we do use it and know m$ has full access to our data. We use m$ OS, while knowing they can upgrade (change, alter, modify) software & OS while operating.

We also wear network accessible cameras & microphones while we discuss all our business. Even the lowest cost cellular microphone has GPS & voice recognition capability today.

Did you uninstall your notebook camera, then notice WindizXP installed & enabled it again? And you do realize you have no hardware switch to disable microphone(s) or camera.

Even if we remember business principles & history, our managers are most concerned with end-of-week profits, not survival to the next decade.

So why do you think we will not let m$ have more access to all our business?
 