How important is multivendor interoperability to you?


Thread Starter

Thomas J. Burke

I've been living and breathing industry standards and pushing vendors to deliver products for multi-vendor, multi-platform, secure, reliable interoperability. I truly believe that end users' expectations are all about being able to buy products from multiple vendors that are plug-and-play rather than plug-and-pray. I'm currently evaluating a strategy that would mandate certification for OPC products. The industrial Ethernet world has set the bar, and for the majority of products on the various fieldbus networks certification has always been mandatory. I believe the expectations that engineers bring from consumer electronics, today and tomorrow, are really beginning to motivate and drive the expectations for industrial automation and other domains.

I need the end users' help to motivate the vendors to really deliver quality products that are truly multi-vendor interoperable. I highly encourage you to buy products only from vendors that actually take the time to architect, design, develop, and test their products, certifying their level of interoperability.

We've recently redesigned a lot of the OPC certification information on the website and I encourage you to check out this information and give me your feedback on the direction that I am taking.

There are many vendors that are OPC Foundation members that truly are dedicated to the high quality standards that OPC is setting.
What's wrong with simply having open protocols? I don't have anything against the OPC Foundation, but the problem it is trying to solve is a problem that shouldn't exist in the first place.

If you want a good analogy, consider how this message got to you. In the old days, we had Compuserve, MCImail, AOL, MSN (and no doubt others as well). Each of them had their own networks, protocols, and standards. You signed up with one of them, and if you were lucky you had a (very expensive) gateway that let you send messages to some of the others.

Then the Internet came along. Compuserve and MCImail disappeared without a trace. AOL survived by converting themselves into a dial-up ISP (and are now fading away altogether). MSN was a complete failure and was closed down (the name was recycled later as an Internet portal).

Now everybody can talk to everybody else, with no gatekeeper, no "certification", and with multiple competing suppliers. Anybody can write software which uses the Internet without having to get permission from anyone else. Communications have expanded by orders of magnitude over what we had before. Entire new industries have grown up to take advantage of it. Things are being done today which weren't even imagined 10 years ago. I don't think that anyone would argue that the old system was better than what we have now.

As far as industrial communications go, customers want a solution that is like the Internet, while most vendors want one that is like Compuserve. I'm sure that the OPC Foundation is doing a wonderful job at what they are doing. However, what they are doing is rather like the old Compuserve gateways.

I think there is a lot of potential for growth in industrial communications hardware, software, and services. The problem we have right now is that we are being held back by the current closed systems. I can think of a lot of ways that customers could improve the efficiency of their plants by making better use of information about their equipment, but none of them are practical without open communications.

curt wuollet

Hi Michael

Yes, exactly.

I define the problem as convincing big automation that growth will require cooperation. Very difficult to see from the cube farm. Certain big companies almost missed the boat on the Internet because they continued to insist that they could control the world for profit. Others did miss the boat and are on the scrapheap of history. Those who don't learn from history are doomed to repeat it. But what could be the irresistible open force that collapses the Tower of Babel? What obvious need can be met to make the glacier melt?


William Sturm

I am in complete agreement with Mr. Griffin on this one. I couldn't have said it better myself.

Bill Sturm
M Griffin's example is correct, of course, but the question is how does his analogy apply to the issue at hand?

Remember that Compuserve <i>et al</i> used closed standards in an attempt to protect their turf and exclude competitors. Bad idea. The Internet is not without standards, but they are open: W3C, XHTML, TCP/IP, etc. Encourage competition and lower the barriers to entry, but keep everybody playing by the same rules.

Which is the better analogy to OPC? Well, it's an open standard that encourages competition by ensuring that everybody's product works with everybody else's. Even a little guy with new ideas can jump in with a limited product line, but one that is compatible with all of the big players. And the customer can pick whatever he wants, whatever works for him. Sounds like the Internet model to me.

Now, the Internet is not without compliance and interoperability issues itself. We've all visited websites that didn't quite work with this or that browser. Go back ten years and just designing a website was a nightmare of "this command works only with Netscape Navigator" and "that command is ignored by Internet Explorer". It was not uncommon to have to design multiple versions of portions of your website to accommodate the different browsers. We're not fully past that yet, but are at least moving in that direction. Why? Because of a general movement toward not just setting but <i>complying with</i> these open standards.

Or consider Wi-Fi. Would you buy a Brand X Wi-Fi router that worked only with other Brand X equipment? Of course not. You buy equipment that is tested and proven to comply with the existing standard, so that you can be assured of interoperability.

Now, the OPC Foundation has set up an open standard, and is exploring ways to encourage suppliers to conform to it. You may have quibbles with the OPC standards themselves; I suppose nothing in this world is perfect. But the Foundation's basic idea of having a standard by which everyone can talk to everyone else, and attempting to ensure compliance with that standard is surely the right approach to opening the market to new ideas and new products. And for that reason, my company is a charter member of the OPC Foundation, and is proud of the fact that we were the first to offer products certified for compliance to the OPC standards.

Dan Muller
In reply to Dan Muller: You said "Remember that Compuserve et al used closed standards in an attempt to protect their turf and exclude competitors." Yes, and they had controlled gateways which permitted traffic to pass from one proprietary network to another. That sounds an awful lot like what we have in the automation industry today, with OPC as the gateway between the "walled gardens".

And it's funny that you mention Wi-Fi, because that whole area is a mess of conflicting standards. Every big vendor wants to push "their" patents into each of the multiple standards so they can collect a royalty on every unit sold. The result is that people have more problems getting their Wi-Fi working properly than they do with the rest of the computer put together, and they end up paying a lot more for the privilege than they should have to. That problem sounds rather familiar, doesn't it?

As I said before, I don't have anything against the OPC Foundation. I'm sure they are doing a wonderful job at meeting the very limited goals they have set themselves. However, the problem is the fact that it needs to exist at all.

OPC is not an automation protocol. It is a driver interface. It was created so that vendors could connect proprietary protocols without revealing how those protocols work. And if your software doesn't happen to work in the same way that OPC does, you are pretty much out of luck as far as OPC and the proprietary protocols it supports are concerned. That's why we didn't have things like Google or Twitter when Compuserve and their friends dictated the rules of the game either. Compuserve had a marketing plan, and things like Google weren't in it.

So, now you have people who say things like "Hmmm, I've got a great idea, but it won't work as a normal Apache module. I can make it work if I write my own software though". That's the "Internet" way of doing things.

The industrial automation way of doing things, though, is that your software has to be written around the OPC model. If it doesn't fit that model, then you either have to do some horrible hacks to make things work, or else give up and not do it. It might work OK if I'm writing a SCADA or an HMI that is just like every other SCADA or HMI out there. However, I don't find the answer "well, we didn't think of your application when we came up with OPC, so too bad" a very satisfactory one.

If OPC is such a wonderful concept, then how come we don't use it to send e-mails or browse the web? How does a different Internet-OPC driver for every domain you want to send an e-mail to, or every web site you want to look at, sound, at say $1000 apiece? Hey, you can even have them certified, and claim it's an "open standard", right?

So what is a genuine open protocol?

1) I don't have to belong to a foundation, vendor group, or trade association.

2) I don't have to sign a contract.

3) I don't have to sign a patent license.

4) I don't have to license a trademark.

5) I don't have to have a vendor ID number.

6) I don't have to get a "certification".

7) I don't even have to give them my name.

I should simply have to go to a web site that is open to everyone, download the spec at no charge, and then go ahead and do whatever I want without getting permission from anyone. That's how real innovation happens.

If people want trade group certifications, well there is nothing stopping that from happening without tying it into the right to use the spec itself.

Jeremy Pollard

Hi Tom,

As the former Managing Director of PLCopen (IEC-61131), I have fought for general standards and open concepts, which did not get any traction at all in the US and Canada. While Mr. Griffin has great points, I'm not sure that the industrial marketplace will support open concepts, since its markets are so small in comparison with the commercial marketplace. Having said that, the certification process is paramount. I think that end users want to know that vendor A works with vendor B.

Ken Crater tried this with an independent lab and there was no interest in that either.... I tried to get the PLCopen board to think about interoperability between software vendors (motion control blocks and which hardware they work with, etc.) but that didn't fly either :)

Multi-vendor support is important, but only with the vendors that 'I' use.. I won't care about vendors that I don't use...

Thus the end user will only care about his/her own backyard. Who cares about the neighborhood??

Cheers from : Jeremy Pollard, CET The Caring Canuckian!
Control Design www.controldesignmag.com
Manufacturing Automation www.automationmag.com
3 Red Pine Court, RR# 2 Shanty Bay, Ontario L0L 2L0
Jeremy mentioned...

>Ken Crater tried this with an independent lab and there was no interest in that either.... <

Yes, that was an interesting experience. I remember having an exhibit at an OMAC conference, and getting virtually no support from those one would think to be natural allies. Granted, we were still at an early stage of development, but at that point some encouragement would have at least been a positive indication of a future market for the concept of open controls with interoperable pieces from multiple vendors. I came away with the impression, rightly or wrongly, that OMAC was basically about large end-users putting price pressure on large suppliers.

On the other hand, OPC seems to fulfill a valid purpose. Granted, it's a bit less ambitious than defining TCP/IP [grin], but my experience at Control Technology Corp. indicates it is a useful tool for companies not having a formal alliance to still produce some products that play well together. In other words, a step in the right direction, at least.

In the bad old days, each HMI vendor would have their own suite of proprietary drivers for each controller (and etc.) vendor they interoperated with. A controller vendor such as CTC would then have to cajole (or pay, or both) each HMI vendor to develop a driver to provide compatibility. What a waste of development resources!

Ken Crater
Nerds in Control, LLC
In reply to Jeremy Pollard: I think there were several problems with IEC-61131. First, vendors didn't really want a common standard. They just wanted to be able to put the name of a standard on their advertising literature to satisfy their customers' demands for a standard. The result was compliance criteria that said that virtually anything met the standard. That is, the bar to meeting the standard was set as low as possible.

There is another factor here though. I was a big fan of the *idea* of a common PLC language standard and have said so many times. However, having written an actual soft logic system myself now, I've been exposed to a lot of design details in PLC languages that I hadn't considered as someone who simply programs PLCs day to day. I now think the IEC-61131-3 language design was far too complicated and tried to be all things to all people. I'm not sure that it *could* have been implemented in a compatible manner.

In my own case, I didn't even try. I just picked what I thought was the easiest existing PLC on the market and modelled my own design on that. A proper IEC-61131-3 system would probably have taken me years to implement. I may do it some day, but in the mean time I want to have some software that is actually usable. I wouldn't be surprised if some of the PLC vendors came to the same conclusion.

OPC has a new version of their driver interface system called "OPC Unified Architecture". What I've been hearing about it so far is everyone complaining that it's "too complicated".

There is an official 337-page book on it, and that book *doesn't* include the spec itself. The table of contents includes headings such as "Is OPC UA Complicated?", "Are OPC UA Services Difficult to Handle?", and "Transport Protocols and Encodings: Why So Many?". When a book promoting the concept has to devote sections of itself to addressing those questions, then I think the answer is that it *is* too complicated.

The really successful standards are usually the simple ones. A standard industrial Ethernet communications protocol that meets most normal needs could be thoroughly described in no more than two dozen pages. The actual code to implement it would take a lot less than that. So why don't we have that instead of OPC?

For that matter, if Siemens and AB each opened a subset of their Profinet and EIP protocols to allow basic read/write capability (Modbus/TCP is already open), most people would be happy with that.
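To give a sense of how small a Modbus-class protocol really is, here is a sketch (the helper function and its names are my own, but the frame layout comes straight from the freely published Modbus/TCP spec) of building a "Read Holding Registers" request by hand in Python:

```python
import struct

def build_read_holding_registers(transaction_id, unit_id, start_addr, count):
    """Build a Modbus/TCP 'Read Holding Registers' (function 0x03) request.

    The frame is a 7-byte MBAP header followed by a 5-byte PDU, all
    big-endian. The entire on-the-wire layout fits in a few lines.
    """
    pdu = struct.pack(">BHH", 0x03, start_addr, count)  # function, address, quantity
    mbap = struct.pack(">HHHB",
                       transaction_id,   # echoed back by the server
                       0,                # protocol ID: always 0 for Modbus
                       len(pdu) + 1,     # bytes remaining (unit ID + PDU)
                       unit_id)
    return mbap + pdu

# Ask unit 1 for ten registers starting at address 0: 12 bytes on the wire.
frame = build_read_holding_registers(1, 1, 0, 10)
print(frame.hex())
```

That really is the whole request side of one of the most widely deployed function codes, which is the point: a protocol this simple can be implemented, reviewed, and debugged by a working controls engineer without a 337-page companion book.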
In reply to Ken Crater:

The original OMAC documents from 10 years ago are still on their web site. I think their concept could probably be summed up as IEC-61131-3 or flow charts running on PC hardware using an MS Windows XP operating system. And that's about it.

They talked over and over again about vendor independence, but I never saw any sign that they understood what their actual dependencies were. They had a great fixation on the CPU hardware platform, which I think missed the point entirely. Their real dependency problems were the I/O interfaces and the logic program compatibility. They never seemed to come up with any answer about how they would solve that though. Quite frankly I'm rather puzzled as to exactly what they thought they were actually doing.

The people behind OMAC were from GM, Ford, and Chrysler. If those companies really wanted to, they could have developed their own shared soft logic platform and specified a common I/O bus and told their machinery suppliers to use them. They never did though, and the PLC vendors were never going to volunteer to cut their own throats.

Tallak Tveide

I agree with Mr Griffin on this topic.

OPC is by no means an open standard.

OPC has failed by choosing Windows DCOM as a platform, and has thus caused me and many others hours and hours of grief in debugging. It may also explain why many OPC drivers I have used are slow and buggy. Not to mention connectivity issues between domains.

The new OPC standard seems to improve on this, but I also agree that the standard seems very complex. One must remember that the users of this protocol are mainly not software engineers, but automation experts. The beauty of a simple open protocol like Modbus is that most people can read the standard in a few hours' time and fully understand how it works. For an analogy, look at what happened to the CORBA Collections API - I doubt it sees much use today.

I quote:

Design and process deficiencies

The creation of the CORBA standard is also often cited for its process of design by committee. There was no process to arbitrate between conflicting proposals or to decide on the hierarchy of problems to tackle. Thus the standard was created by taking a union of the features in all proposals with no regard to their coherence.[2] This made the specification very complex, prohibitively expensive to implement entirely and often ambiguous.

A design committee composed largely of vendors of the standard implementation, created a disincentive to make a comprehensive standard. This was because standards and interoperability increased competition and eased customers' movement between alternative implementations. This led to much political fighting within the committee, and frequent releases of revisions of the CORBA standard that were impossible to use without proprietary extensions.

As for vendors who are not in it for the right reasons - they should not really be in the process to begin with. Members of a technical committee should be in it with a wish to create a great system, not to act as politicians on behalf of their company.

My wish list for 'OPC' would be:

- A simple protocol (perhaps like Modbus TCP, but with a query mechanism for tag names)

- An open spec with no strings attached - I am thinking GPL- or MIT-licensed or similar

- Simple enough to implement in a PLC in a robust way, so that you don't need external drivers and can communicate PLC-to-PLC

- Based on open standards - TCP/IP I take for granted
Yes, it is a means for some semblance of interoperability. But it's still like the bad old days: you can only do what they let you do, see what they expose, it still requires big money to play, and there's only one choice of OS. And it doesn't help with mixing hardware. That's not much of a step. Sort of the Joseph Stalin school of Open.

And something eminently useful, like a block of memory mapped between processors, could be described in an email and implemented on any platform with Ethernet, and the code already exists. ANSI C data types would do nicely.
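Curt's shared-block idea really is email-sized. As a rough sketch (the layout and names here are invented for illustration, not any existing protocol; I'm using Python's struct module to stand in for the ANSI C types he mentions), the "spec" reduces to a single agreed format string:

```python
import struct

# Hypothetical agreed layout: four int32s then four float32s, big-endian.
# This one format string is effectively the entire protocol spec.
BLOCK_FORMAT = ">4i4f"
BLOCK_SIZE = struct.calcsize(BLOCK_FORMAT)  # 32 bytes on the wire

def pack_block(ints, floats):
    """Pack the shared memory block for transmission, e.g. over a TCP socket."""
    return struct.pack(BLOCK_FORMAT, *ints, *floats)

def unpack_block(data):
    """Unpack a received block back into the agreed fields."""
    values = struct.unpack(BLOCK_FORMAT, data)
    return list(values[:4]), list(values[4:])

wire = pack_block([1, 2, 3, 4], [0.5, 1.5, 2.5, 3.5])
ints, floats = unpack_block(wire)
```

Any platform with a C compiler and a socket library could implement the same thing with a packed struct and a send/recv loop, which is exactly the point.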


Adriel Michaud

Without any kind of certification, specs tend to be "enhanced" by specific vendors past the original scope. Modbus is actually a great example because there are so many variations. Do you stop at implementing a single edition of the spec, or do you add support for wide registers, flipped bits, floats, or any of the other weird and wonderful extensions that have been used over the years?

HTML is a free, open spec without mandatory certification, and it is something of a gong show to develop for. It's ridiculous that you have to be aware of, and code for, vendor-specific anomalies such as those present in various web browsers. It's also an incredible barrier to entry for any company to develop their own browser, because they effectively have to reverse-engineer Internet Explorer circa 2003 to be able to support most of the sites out there. These barriers are not present in current OPC. If you write a decent OPC server that passes the OPC Foundation's Analyzer tool, there's a good chance it will work with anyone's OPC client.

Coming back to the topic, _some_ form of standardized certification will ensure developers stick to known, defined boundaries. So long as the process and tools do not unnecessarily hinder adoption and the pricing is appropriate, mandatory certification will prevent non-compliant garbage from getting to market.
In reply to Tallak Tveide: I didn't intend for any of my arguments to be used as OPC bashing exercises. As I have said before, I don't have any problem with the idea behind OPC or with the OPC Foundation itself. As a way of getting various proprietary protocols to work with SCADA and HMI systems, it is much better than the situation in the 1990s where you had to pick your SCADA or HMI software according to what proprietary drivers it shipped with, rather than on the merits of the software itself.

In this case OPC is simply making the best of a bad situation. The problem was that there were 100 different proprietary drivers. OPC didn't solve that problem, but it mitigated it somewhat.

The problem I am talking about however is where OPC is being promoted as something that it isn't. That is where it is being promoted as a "solution" to proprietary protocols, or that we don't need open protocols because we have OPC. Those are the sort of statements that I take issue with.

On the technical side, I think there is no question that OPC made some design mistakes in basing their system on MS COM/DCOM. They have recognized that mistake and have come up with a new version that eliminates MS COM/DCOM, but I think that the new version has problems of its own.

As for your wish list, you've got to realize that the OPC Foundation doesn't come up with the actual protocols. They just figure out how to hook the drivers for them into other programs.

As for your ideas for a protocol, if you want it to be simple then it really needs to be a binary protocol with fixed field definitions and numerical address offsets. Follow that idea to its conclusion and you end up with something like Modbus.

As for a "query mechanism for tagnames", I'm not sure how that would work with a binary protocol or simple low level control. For PLC to HMI/SCADA communications though, a JSON based protocol would be simpler. I've worked on something like that with some other people for an HMI system and it is simple and works quite well.
In reply to Curt Wuollet:

There is a new version of OPC called "OPC UA" which dumps the MS COM/DCOM dependency. Unfortunately, it substitutes SOAP instead. They have adopted SOAP just in time to see the general computer industry decide that SOAP was also an overly complicated bad idea and move on to other things. I'll bet it looked great on a PowerPoint slide though.


Jeremy Pollard

Hi Mike:) and Ken (nice to see you came in from the field of dreams for a

We complained when stuff was proprietary, and people went ahead to try and make things open and interoperable, and they remain proprietary in practice.

The environment doesn't care about what works with what, just that "I need this to work with that!".. IEC-61131 is for sure a standard with the bar set low, meant to fool the end user into wanting something that conforms to a standard.. and a standard that most don't understand..

The OPC challenge is somewhat unique because of what it does, and what Ken mentioned makes some sense.. as OPC does...

But I agree that it shouldn't be that tough.. but seemingly it is and will be for a while!!

Long live DOS!!!

Cheers from : Jeremy Pollard, CET The Caring Canuckian!

705.739.7155 Cell # 705.725.3579
In reply to Adriel Michaud: I've got nothing against certification. What I am against is signing a contract that says a vendor can sue me if they don't happen to like my implementation.

Certification in itself isn't a problem if the certification is open, transparent, and voluntary. Interested parties could get together and come up with compliance tests. They could come up with logos and trademarks and license those logos and trademarks only to those who pass the tests. Customers could decide for themselves whether certification is important to them. That all works.

What doesn't work is what seems to be popular in the automation industry. One vendor comes up with a protocol. They set the rules as to who complies and who doesn't. The tests are kept secret and only insiders can verify them. They "open" the spec to allow a few minnows to implement it. You have to sign a contract to even see the specs, and that contract basically stacks the deck against you if there is ever any sort of dispute. I've looked at what would be involved in getting copies of some of these supposedly "open" specs, and it isn't anything that I can live with.

As I said, I don't have any problems with certification tests, so long as the tests aren't tied to the spec. However, if I was forking out a lot of money to buy an OPC server I would rely more on the vendor's reputation for a quality product than I would on how many certification stamps they have. There are a lot of things that can go wrong with software and testing can only catch some of them. Even the best software can have bugs. But a vendor's reputation for supporting their customers - *that* I would feel more comfortable relying on.

As for your analogy with HTML, if the automation industry had the degree of compatibility that we see with HTML we would have the following. Almost all PLCs would be able to talk to one another or to anything else without any major problems. The exception would be one vendor whose product line was obsolete, but for which there were work-arounds for anyone who was interested in dealing with it. And that vendor's market share would be dropping like a rock. I don't know about you, but I would call that situation a lot better than what we have today.

I don't know what sort of problems you deal with for HTML, but I'm working on a web page right now that does a lot of pretty esoteric stuff with HTML, Javascript, CSS, and SVG. I can open up 5 different web browsers, and it works in all of them without doing anything special.

The only exception is Microsoft's IE, and their problem is their technology is about 10 years behind the rest of the industry. They sit on the W3C committee as a major vendor though, and they play all the usual vendor games to sabotage the process. The other browser makers have to get together outside the official standards process to get anything done.

So I guess your solution would be to say that Microsoft shouldn't be allowed to ship MS IE anymore? Or do you think that everyone else has to sit and do nothing so long as Microsoft wants to play dog in the manger?

I assure you, however, that *nobody* wastes much time trying to reverse engineer Microsoft's browser anymore. The world has moved on since the 1990s. You develop according to the standards, test with "good" browsers, and then hack in the CSS to work around the IE problems later. Any other approach simply leaves you running in circles.

As for barriers to entry, a lot of things have changed in the world of software in the past 10 years. Apple based their Safari web browser on LGPL-licensed KHTML code from the KDE project instead of starting from scratch on their own. People are even writing very good web browsers in their spare time as a hobby. There are a lot of things happening in this field.

Ken Emmons Jr.

So I'm not the only one who thinks these things are overly complicated?

PowerPoint should be banned. (j/k) :eek:)

Hi Michael

All you need to do is check out where SOAP came from. It's the same source of bad ideas along with XML, another Trojan horse to get proprietary BS institutionalized as a standard. There is a pattern here. Using anything that MS had anything to do with always lands you in the same place. It's the Big Lie type of propaganda. If you repeat "It's Open" enough times, pretty soon, gullible people will insist it is. Meanwhile, they are busy corrupting ODF.