Automation Market - Real Numbers


Michael Griffin

At 10:46 23/04/01 -0400, Michel A. Levesque wrote:
<clip>
>Let's face it, having "big iron" is an advantage for MMI, data acquisition,
>advanced controls etc. But I would stay with the distributed model for
>control (many PLC's interlinked). The centralized model has benefits for
>ease of maintenance and troubleshooting. But, it also has a common point
>of failure.

Actually, in my own experience I have found that the centralised model (i.e. a large PLC controlling a large zone) is much more difficult to troubleshoot and maintain than a de-centralised one. With a de-centralised system, you know that your problem is contained within a small area, and you can concentrate on that.

Small, inexpensive, but capable PLCs have allowed a big improvement in machine design in the types of industries with which I am familiar. They make systems simpler, more maintainable, and more flexible. Any re-centralisation of control systems would be a big step backwards in my view.

<clip>
>I've been going around and around on purpose. I just wanted to make my point
>that fundamental changes like distributed computing and centralized
>computing have both been tried. We need another alternative (other
>than the traditional distributed or centralized) computing model
>in industry.
<clip>

With discrete parts manufacture, the old control systems were centralised into zones because PLCs (or minicomputers in some cases) were expensive. As the cost of PLCs has fallen to a small fraction of their former price, each machine can have its own PLC. Cost is no longer a factor limiting de-centralisation.

Why do you think we need an alternative to de-centralisation (for control), and what do you think this alternative should be? I do think we need better higher level networking, but only for status reporting and data sharing, and not for actual control.

**********************
Michael Griffin
London, Ont. Canada
[email protected]
**********************
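As a rough illustration of that last point, here is a minimal sketch (in Python, and purely hypothetical as to IP addresses, machine names, and register layout) of higher level networking used only for status reporting: a supervisory script reads a few holding registers from each machine's own PLC over Modbus/TCP and never writes anything back, so all actual control stays local to each machine.

import socket
import struct

def _recv_exact(sock, n):
    """Read exactly n bytes from the socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("connection closed before full reply")
        buf += chunk
    return buf

def read_status_words(ip, count=4, unit_id=1, port=502, timeout=2.0):
    """Read 'count' holding registers starting at address 0 from one small PLC."""
    # MBAP header: transaction id, protocol id (0), remaining length, unit id
    request = struct.pack(">HHHB", 1, 0, 6, unit_id)
    # PDU: function 3 (read holding registers), start address 0, quantity
    request += struct.pack(">BHH", 3, 0, count)
    with socket.create_connection((ip, port), timeout=timeout) as sock:
        sock.sendall(request)
        header = _recv_exact(sock, 9)   # MBAP (7 bytes) + function code + byte count
        if header[7] != 3:
            raise OSError("Modbus exception response: function 0x%02x" % header[7])
        data = _recv_exact(sock, header[8])
        return list(struct.unpack(">" + "H" * (header[8] // 2), data))

# A hypothetical cell of small machine PLCs, each owning its own control logic.
machines = {"press_1": "192.168.1.11", "weld_2": "192.168.1.12"}

for name, ip in machines.items():
    try:
        print(name, read_status_words(ip))
    except OSError as err:
        # A dead network link costs us one status report, not control of the machine.
        print(name, "unreachable:", err)

The try/except around each machine is the whole argument in miniature: a failed link at this level loses a status report, while the machine itself carries on under its own PLC.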
 

Michael Griffin

I believe that there were (if I remember correctly) also dedicated X terminal hardware devices. You just plugged them into your network and pointed them at the server. I believe the new thin clients are going to be similar to this idea, except they will use a built-in web browser (or something similar). I think I have seen these things already on the market. Some are intended for the shop floor, and are rugged touch screen devices. The difference between these and the original X-terminals (other than how they actually work) is that the market may be more ready for this idea. Systems are beginning to centre more around the network, and less on individual computers.

Since you are interested in the Linux point of view on things, it is worth noting the operating model for this system is based on the internet, and the internet to a great extent runs on Linux. I also suspect that ASPs (application service providers) that do business with industrial companies will make most of their money from MRP and MES systems, and other similar things. Routine office software such as word processing, spreadsheets, and e-mail will likely be offered for "free" as a loss leader. Open source office software would fit this niche quite nicely at a low cost.

I've had a thought about something which occasionally comes up on this list. Many people need occasional use of expensive SCADA or MMI development software to make minor changes to existing systems. However, they don't want to spend a lot of money to buy and maintain software they only need once in a while. The problem has been how to provide "metered" access to this development software for these people.

If the software ran on the software publisher's web site, and you could get metered access to it, this would be an ASP type of solution. Pricing would probably be worked out so that if you are a regular developer, it would be cheaper to buy the software (as is done now), so that they wouldn't undermine their revenues with this strategy.

I imagine that other software which has the similar characteristics of a specialised market and a high price could also be offered this way. Some examples may be various types of simulation and modelling software, other types of development software, specialised CAD systems, etc.

I think the key to making something like this successful would be to allow free access to someone who wants to learn how to use it (similar to a demo mode), but charge for accomplishing "useful" work. This could be billed on a time basis (i.e. by the second, minute, hour, etc.), on a subscription basis (weekly, monthly, yearly, etc.), or on the basis of the amount of work done (e.g. by configured tag, or screen, etc.).

I still think the stuff which you use all the time, though (e.g. PLC programming software), would best be done on a conventional sale basis.
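To make the billing comparison concrete, here is a small sketch. All of the rates, the outright purchase price, and the two usage profiles are invented purely for illustration; the point is that under pricing like this the occasional user comes out well ahead on any metered basis, while the regular developer is always better off buying the package, so conventional sales are not undermined.

# Hypothetical rates for metered access to a SCADA/MMI development package.
HOURLY_RATE = 40.00            # time basis (per hour of connected use)
MONTHLY_SUBSCRIPTION = 750.00  # subscription basis
PER_TAG_RATE = 2.50            # work-done basis (per configured tag)
PURCHASE_PRICE = 8000.00       # conventional sale, as is done now

def metered_cost(hours, months, tags_configured):
    """Return the yearly cost of one usage profile under each billing basis."""
    return {
        "time basis": hours * HOURLY_RATE,
        "subscription basis": months * MONTHLY_SUBSCRIPTION,
        "per-tag basis": tags_configured * PER_TAG_RATE,
    }

# Occasional user: a few minor changes a year to an existing system.
print("occasional:", metered_cost(hours=20, months=2, tags_configured=150))

# Regular developer: every metered basis should exceed the purchase price,
# so the publisher's conventional sales are not undercut by the metered offer.
print("regular:   ", metered_cost(hours=600, months=12, tags_configured=5000))
print("outright purchase:", PURCHASE_PRICE)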
 
Yes, there were and probably still are appliance-type X terminals. At first, they needed more horsepower, and then PC's became cost competitive. They were client/server (albeit reversed) before c/s became cool.

If standard internet protocols and methods are used, any number of low cost existing products could be used, such as the NIC @ $300.00. I wouldn't hold my breath though; given the history of this industry, some incredibly lame excuse will be made for using proprietary protocols and thin clients at 5 times that. Still, eventually people will catch on and reject proprietary approaches in favour of products that work with any browser: thin, thick, or embedded. I am pretty sure that that is what the whole Modicon/OPTO22 thing is really about. Without patent issues to kill it, this could be a very big genie to try to stuff back into the bottle. It is so obvious to capitalize on the Internet phenomenon that it is very unsettling to this last bastion of proprietary captivity. There is a serious effort under way to see to it that no Ethernet product makes it through without being perverted. I expect widespread Web interfaces will have to wait until they can be decommoditized or patented or even metered. The free and open Internet as we know it cannot be tolerated because there's no way to bill for it. I predict this effort to subvert a good thing will be self-defeating. People are smarter now.

I agree that there is potential in the ASP angle for high buck software. The problem is that it typically requires a substantial investment of time on the part of the user to be useful. I'm not sure if even free access is enough to get someone to make that investment for non-reusable knowledge. There is a better comfort zone if you get the software. Of course, you don't own it in any case with the proprietary model, so really, the differences are psychological. (Check your EULA.)

I have seen an interesting parallel with the Internet itself, with all kinds of would-be kings writhing, twisting, playing angles and desperately trying to exploit it somehow. It's amazing how very resistant the 'net is to being owned or exploited. I see this as the model for the future: proprietary around the edges, but with a large body of commonality commoditized with de facto public ownership. I predict this will eventually succeed by weight of the vast improvement in efficiency afforded by commonality and standardization, as soon as margins will not support 500 functionally identical proprietary solutions to the same problems.

In short, merging automation and the Internet will, thankfully, change automation rather than changing the Internet. Right now, there is a huge disconnect and diametrically opposed philosophies at work. While there will be desperate attempts to maintain the proprietary status quo with patents and such, success will belong to those who do the best job of adopting its open models and capitalizing on the infrastructure that's there. If openness doesn't bother you, the merger is trivial; it's the insistence on owning and controlling everything that has automation so very far behind the curve.

Regards

cww
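For what it's worth, here is about how little it takes to put a machine status page on standard internet protocols so that any browser can read it. Everything in this sketch (device name, port, status values) is made up, and a real product would serve live data rather than a fixed table, but the shape of the thing is the same: plain HTTP over Ethernet, no proprietary client anywhere.

# A minimal, hypothetical status page served with nothing but the Python
# standard library: any browser -- thin, thick, or embedded -- can read it.
from http.server import BaseHTTPRequestHandler, HTTPServer

STATUS = {"device": "demo-cell-3", "mode": "auto", "parts_count": 1248, "fault": "none"}

class StatusPage(BaseHTTPRequestHandler):
    def do_GET(self):
        # Render the (made up) status dictionary as a simple HTML table.
        rows = "".join(f"<tr><td>{k}</td><td>{v}</td></tr>" for k, v in STATUS.items())
        body = f"<html><body><h1>Machine status</h1><table border='1'>{rows}</table></body></html>"
        payload = body.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

if __name__ == "__main__":
    # Point any web browser at http://<host>:8080/ to see the table.
    HTTPServer(("", 8080), StatusPage).serve_forever()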
 