After Software, What's Next?

Charles Moeller

CWW:

> Do we care what's inside the box?

I think so. I would much rather have a directly-connected, parallel-concurrent system monitoring and controlling my physical process in real time. The alternative, in which hardware signals are handed off to a TM for data-processing and then handed back to hardware, is slow, unsafe, and complex, and it is what we have now. When you start considering safety-, time-, and mission-critical applications, there is even more reason to select a parallel-concurrent model.

> I will play the devil's advocate and ask: What problem
> are you trying to solve? Illuminate please:^)

The problems I am looking to solve are those of:
cost (to design and build, and to own), safety, complexity, maintenance, and understandability.

Regards,
CharlieM
 
Charles Moeller

Dick Caro:

> Look at Grafcet
> Look first, then please share your impressions. It's NOT what you expect.

I looked and found this about GRAFCET: "In 1988 it was adopted by the IEC as an international standard under the name of «Sequential Function Chart» with reference to the number «IEC 848». Translators have existed for many years, to implement GRAFCET on real time computers or programmable controllers."

Fuzzy logic, Petri nets, Neural nets, and Grafcets are all presently implemented on computers or clock-driven state-machines. This is one of my gripes: that advancements that should provide tremendous benefits are funneled through the constraints and impediments of TMs, thus picking up the restrictions and characteristics of the data-processed model.

Regards,
CharlieM
 
Charles Moeller

Armin Steinhoff:

> if a FPGA (or CPLD) is the processing unit, all processing is done in the
> hardware and there is no external memory for program code. The software defines
> only the individual configuration of this hardware.

> That means the result of the compilation of a piece of VHDL software is
> a special configuration of a piece of hardware (FPGA, CPLD ..)

Yes. That's my point. If a piece of hardware (CPLD, FPGA, etc.) is directly connected (I/O) to the process it controls, with no sample-and-store taking place, no instructions accessed or executed (i.e., no run-time software), and perhaps no clock, it wins my approval!

Regards,
CharlieM
 
Charles Moeller

Ken:

>> Why do we convert everything to software in the middle?

>> Instead of sampling and storing, then data-processing to determine the
>> response, why don't we stay in the hardware mode and create simple
>> stimulus-response mechanisms that react reliably and correctly. Control
>> systems would be faster, safer, and less costly.

>At some level of complexity, I think the distinction becomes a semantic one.
> An FPGA configuration begins to look like firmware, perhaps akin to microcode
> in a CPU. Does this make it more virtuous somehow, more robust? I've
> seen bugs arise in FPGAs, just as troublesome (and sometimes harder to
> locate) than those in software.

Simpler is better.

>It is true that FPGAs are becoming much more capable, but still, software tends
> to be more scalable. The scalability of FPGAs are bounded by their gate levels -
> they're great right up to the point where they're not, then they fall off
> the cliff. So, for fixed applications in the internals of products they tend
> to be a useful tool, whereas for user-programmable controllers intended
> for a wide variety of applications, the inconsequential cost of high performance
> CPUs (relative to the other costs of an automation application) means that
> software will continue to be a preferred approach for a long time.

I can agree with most of that. Higher levels of programmability need the CPUs and peripherals.

The applications I am thinking of are the toasters, home security, vehicle subsystems, factory automation, etc. These types of applications employ 98% of the microprocessors. Rather than use a $.25 device that can call up your aunt Jane, why not use a bare-bones hardware device that costs just a few cents?

>However, the specific language through which a system is programmed is quite
> another matter. The virtue of Relay Logic was that it closely replicated the
> control tools of the day -- electromechanical relays.

That kind of simplicity and straightforwardness can be available today.

Regards,
CharlieM
 
Hello Ken,

>> Why do we convert everything to software in the middle?

>> Instead of sampling and storing, then data-processing to determine the
>> response, why don't we stay in the hardware mode and create simple
>> stimulus-response mechanisms that react reliably and correctly. Control systems
>> would be faster, safer, and less costly.
> At some level of complexity, I think the distinction becomes a semantic one.
> An FPGA configuration begins to look like firmware, perhaps akin to microcode
> in a CPU. Does this make it more virtuous somehow, more robust?

Yes ... e.g. an FPGA can't change its configuration by itself.

> I've seen bugs arise in FPGAs, just as troublesome (and sometimes harder to locate)
> than those in software.

The roots of the troubles with FPGAs are software-based, and that is where the bugs must be fixed. It is always possible that the synthesizing process (the compiler) has a bug, but there are very good tools for the verification of FPGA code. Comparable tools are not available in the pure software world.

> It is true that FPGAs are becoming much more capable, but still, software tends to be
> more scalable. The scalability of FPGAs are bounded by their gate levels - they're
> great right up to the point where they're not, then they fall off the cliff.

There are a lot of SoCs realized with FPGAs. These SoCs are often based on a 32-bit microcontroller core (e.g. MicroBlaze), memory, and plain standard FPGA logic. The combination of a microcontroller core and FPGA logic seems very scalable to me.

> So, for fixed applications in the internals of products they tend to be a useful tool,
> whereas for user-programmable controllers intended for a wide variety of applications,
> the inconsequential cost of high performance CPUs (relative to the other costs of an
> automation application) means that software will continue to be a preferred approach
> for a long time.

Yes ... I share this view. But the usage of high-performance CPUs introduces a lot of trouble through their multicore design. Until now we don't have programming languages that describe real physical multiprocessing at the level of VHDL.

Best Regards
Armin Steinhoff
 
I'm with Bill, I'm not going to reply to this one...

For simple logical operations any modern day ARM or Atom CPU will rip the logic to pieces for [potentially] less than a few watts of power, in tens or hundreds of microseconds. So why are we having this discussion? For complex motion control I can see FPGAs or distributed computation, but this has already been done for decades.
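To put a rough number on that claim, here is a minimal timing sketch in plain C (POSIX clock_gettime; the "rungs" and the test pattern are invented for illustration, not taken from any real program):

/* Rough timing sketch: evaluate 1000 boolean "rungs" and measure how
   long one scan takes on a desktop-class CPU. */
#include <stdio.h>
#include <time.h>

#define INPUTS 64
#define RUNGS  1000

int main(void)
{
    unsigned char in[INPUTS], out[RUNGS];
    struct timespec t0, t1;

    for (int i = 0; i < INPUTS; i++)
        in[i] = (unsigned char)(i & 1);              /* arbitrary test pattern */

    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int r = 0; r < RUNGS; r++) {
        /* stand-in for one rung: AND/OR of a few contacts */
        out[r] = (unsigned char)((in[r % INPUTS] & in[(r + 7) % INPUTS]) |
                                 (in[(r + 13) % INPUTS] & !in[(r + 29) % INPUTS]));
    }
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double us = (t1.tv_sec - t0.tv_sec) * 1e6 + (t1.tv_nsec - t0.tv_nsec) / 1e3;
    printf("scan of %d rungs: %.1f us (last output %d)\n", RUNGS, us, out[RUNGS - 1]);
    return 0;
}

On any recent desktop- or phone-class processor a scan like this finishes comfortably within the microsecond range mentioned above.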

Oh well, so much for not replying.... :eek:)

KEJR
 
William Sturm

< Charles said: "My thesis is that the problems of software can be cured by smarter hardware" >

That is an interesting topic, it seems to me that the typical machine/assembly code has not made any great advances in decades.

An interesting idea is silicon that has a high-level language as its native machine code. Some excellent examples are the various Forth processor chips out there. They have never become mainstream, but they appear to have some very nice capabilities. They are a great example of hardware that is specifically designed for software.

If you dig around in these sites, you will find a lot of interesting ideas:
http://www.ultratechnology.com/
http://www.colorforth.com/
http://www.offete.com

Along the same lines, there are also some native Java bytecode processors out there:
http://en.wikipedia.org/wiki/Java_processor

Another interesting hardware item that comes to mind is the Parallax Propeller chip. It has eight 32-bit cores, each with its own timers, and a global shared memory space. The idea is to eliminate interrupt service routines: you can assign a single core to service a high-speed event. It enables programmers to write their own peripheral functions in software. Except for timers, there is no typical peripheral hardware on the chip.
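To make the "core per event" idea concrete, here is a hedged sketch in plain C, with a POSIX thread standing in for a dedicated core and a second thread faking the fast input; on the real Propeller you would use Spin, PASM, or its C toolchain rather than pthreads:

/* "Dedicate a core to the fast event" idea, simulated with threads.
   The watcher thread owns the input completely, so no ISR is needed;
   the stimulus thread fakes an external signal toggling 1000 times. */
#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

static volatile int  pin = 0;           /* fake input pin                   */
static volatile long edge_count = 0;    /* result shared with the main core */

static void *watcher(void *arg)         /* the dedicated "core"             */
{
    (void)arg;
    int prev = pin;
    for (;;) {                          /* tight polling loop, no interrupts */
        int cur = pin;
        if (cur && !prev)
            edge_count++;               /* rising edge seen immediately     */
        prev = cur;
    }
    return NULL;
}

static void *stimulus(void *arg)        /* fakes a fast external signal     */
{
    (void)arg;
    for (int i = 0; i < 1000; i++) {
        pin = 1; usleep(100);
        pin = 0; usleep(100);
    }
    return NULL;
}

int main(void)
{
    pthread_t w, s;
    pthread_create(&w, NULL, watcher, NULL);
    pthread_create(&s, NULL, stimulus, NULL);
    pthread_join(s, NULL);              /* wait for the stimulus to finish  */
    printf("edges seen: %ld of 1000\n", edge_count);
    return 0;
}

The point is only that the watcher loop owns the event completely, so there is no interrupt latency or ISR bookkeeping involved.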

You can read all about it here:
http://www.parallax.com/

I hope that you find these links to be helpful...
Bill Sturm
 
William Sturm

< Charlie said: "The problems I am looking to solve are those of:
cost (to design and build, and to own), safety, complexity, maintenance, and understandability" >

The number one problem, I think, is not technology; it is marketing. There have been many great ideas that never got off the ground.

Of course, the right idea at the right time can be a powerful thing!
 Bill Sturm
 
William Sturm

< Charlie said: "I see software as an unneeded and complicating factor. One needs both software and hardware experts, rather than just hardware ones">

I see little difference fundamentally between hardware and software.  Algorithms are algorithms, they can be designed into hardware or software, only the techniques are different.  Like relay logic and a PLC, they can both accomplish the same task.  Same with silicon vs. software.
 Bill Sturm
 
James Ingraham

CharlieM: "I see software as an unneeded and complicating factor."

Don't we all. :)

CharlieM: "Hardware can be exhaustively tested, whereas software testing may never end (and you still ship with bugs for the user to find)."

Disagree. (a) Hardware can still be a problem. Note the 737 rudder issue. That was hardware. So was the Tacoma Narrows Bridge. (b) Moving the software complexity into the hardware just moves the problem rather than solving it. Using PLCs as an example again, imagine 10 relays controlled by ladder logic. If there's a software bug, you yell at the programmer. But without the software those relays don't *DO* anything. They just sit there. If you replicate the logic in the PLC by wiring the relays in a massively complicated way you will just as likely have bugs. And they will be MUCH harder to find.
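For concreteness, here is roughly what "10 relays controlled by ladder logic" amounts to, sketched in plain C with invented tag names; wiring the same interlocks out of physical relays reproduces these expressions contact for contact, which is the point about where the complexity really lives.

/* Minimal scan-loop sketch: each assignment is one "rung".
   Tag names (start_pb, stop_pb, ...) are invented for illustration. */
#include <stdio.h>
#include <stdbool.h>

struct io {
    bool start_pb, stop_pb, guard_closed, overload;   /* inputs  */
    bool motor, run_lamp, fault_lamp;                 /* outputs */
};

static void scan(struct io *p)
{
    /* Rung 1: seal-in motor starter */
    p->motor = (p->start_pb || p->motor) &&
               !p->stop_pb && p->guard_closed && !p->overload;
    /* Rung 2: run indication */
    p->run_lamp = p->motor;
    /* Rung 3: fault indication */
    p->fault_lamp = p->overload || !p->guard_closed;
}

int main(void)
{
    struct io p = { .start_pb = true, .guard_closed = true };
    scan(&p);
    printf("motor=%d run=%d fault=%d\n", p.motor, p.run_lamp, p.fault_lamp);
    return 0;
}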

-James Ingraham
Sage Automation, Inc.
 
I presented a paper at the 14th International Programmable Controllers Conference in Detroit in 1985. William Keller of Joucomatic Controls presented a paper in the same session I was in titled "GRAFCET, A Functional Chart of Sequential Processes". That was my first introduction to GRAFCET and I thought it was a great concept back then, but was disappointed that it really didn't catch on in the US.

I doubt the Engineering Society of Detroit has the paper on line but you might search around for it. I didn't see any of the IPC conference papers listed when I checked Google.

Unfortunately, many of the early papers in the PLC world are not available - we risk the same thing that has happened all too often: there is no monetary incentive to make papers available, and as they are all covered by copyright, no one else can legally make them available. The only copy I know about is my marked hardcopy that is slowly growing old on my shelf.

Maybe someone on the list is a member of ESD and can see if the old conference proceedings are or can be made available. I'd be happy to contribute my copies to the cause if they could be put on line.

Russ
 
I second James's comment re wiring faults in relay systems being hard to find - and you can add in a whole slew of additional faults caused by minor differences in timing between so-called parallel relays, for example. But to me the major difficulty is not in whether the control system does what is wanted - it is in working out beforehand what we really want it to do. For some systems this can be relatively simple, but for even a moderately complex application the number of combinations and permutations can be appalling. For an automated start-up system, for instance, there is one right approach which will basically follow some pre-ordained design intent, but the number of wrong pathways that can be taken is very high - and you can guarantee that the one you decide is not an issue is the one that will bite you.

I have found the SFC approach very handy in both defining the decision paths through this type of problem - you can ask the mechanical or process engineers what should happen if the firing speed is not reached in 40 seconds - and in making sure that there is an operational cycle followed which ensures that any actions that have been done are undone at some point.
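As a hedged illustration of where that SFC thinking ends up once it is coded, here is a skeleton in C of a start-up sequence with the "firing speed not reached in 40 seconds" branch; the step names, the purge time, and the abort path are all invented for the example.

/* Skeleton of an SFC-style start-up sequence with a timeout transition. */
#include <stdio.h>
#include <stdbool.h>

enum step { PURGE, ACCELERATE, AT_SPEED, ABORT };

/* stand-ins for real process conditions */
static bool purge_complete(double t)       { return t > 10.0; }
static bool firing_speed_reached(double t) { (void)t; return false; } /* force the timeout */

int main(void)
{
    enum step s = PURGE;
    double t = 0.0, step_entered = 0.0;
    const double dt = 1.0;                        /* 1 s scan for the sketch */

    while (s != AT_SPEED && s != ABORT) {
        switch (s) {
        case PURGE:
            if (purge_complete(t)) { s = ACCELERATE; step_entered = t; }
            break;
        case ACCELERATE:
            if (firing_speed_reached(t))      s = AT_SPEED;
            else if (t - step_entered > 40.0) s = ABORT;   /* the 40 s question */
            break;
        default:
            break;
        }
        t += dt;
    }
    printf("finished at t = %.0f s in step %s\n", t,
           s == AT_SPEED ? "AT_SPEED" : "ABORT");
    return 0;
}

The value of the SFC is that the transition conditions and the undo path get asked about explicitly before anyone writes a line of this.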

And don't forget that any control system will also have a wetware component.

Bruce.
 
Charles Moeller

Bill:

>> Charles said: "My thesis is that the problems of software can be cured by smarter hardware"

> That is an interesting topic, it seems to me that the typical machine/assembly
> code has not made any great advances in decades.

Right! The whole idea of code is instructions to shuffle data from one location to another, while keeping track of from- and to-locations. Transforms, substitutions, and translations can be performed as well. The sources of data are the input sensors. Once the data are separated from the real-time process by sampling and storing, there is no other recourse but to data-process. My approach is to avoid this step as much as possible and treat (not data-process) the sensor information in its native space-time-immersed state and in real time, not after stale and sterile representative data is retrieved from memory in some subsequent step.
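To show what I mean by the contrast, here is a small sketch in C (the two-sensor plant is simulated with plain variables, and of course both routines are still software once written this way; the sketch only shows where the sample-and-store step sits):

#include <stdio.h>
#include <stdbool.h>

/* hypothetical plant: two sensors, one actuator, simulated here */
static bool sensor[2] = { true, false };
static bool read_sensor(int id)             { return sensor[id]; }
static void drive_actuator(int id, bool on) { printf("actuator %d = %d\n", id, on); }

/* Style 1: sample and store, then data-process the stored copy. */
static void sampled_controller(void)
{
    bool image[2];
    for (int i = 0; i < 2; i++)
        image[i] = read_sensor(i);            /* sample and store        */
    drive_actuator(0, image[0] && !image[1]); /* process the stored copy */
}

/* Style 2: stimulus-response, no stored intermediate. */
static void direct_controller(void)
{
    drive_actuator(0, read_sensor(0) && !read_sensor(1));
}

int main(void)
{
    sampled_controller();
    direct_controller();
    return 0;
}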

> An interesting idea is silicon that has a high level language as its native
> machine code. Some excellent examples are the various Forth processor chips
> out there. They have never become mainstream, but they appear to have some
> very nice capabilities. They are a great example of hardware that is specifically
> designed for software.

I have played with Forth. Interesting side-trip but still a data-processing method.

> If you dig around in these sites, you will find a lot of interesting ideas:

Thanks for the links.

The Parallax Propeller approaches a processor per sensor. That’s more like where I’m headed with “mostly hardware,” *without the processors*.

>> Charlie said: "I see software as an unneeded and complicating factor. One
>> needs both software and hardware experts, rather than just hardware
>> ones."

> I see little difference fundamentally between hardware and software.
> Algorithms are algorithms, they can be designed into hardware or software, only
> the techniques are different. Like relay logic and a PLC, they can both
> accomplish the same task. Same with silicon vs. software.

Bill, you and I have totally different realities on hardware & software:

Hardware is essential. It performs all the logic and arithmetic operations, and the reception, decoding, and storage of sensory information and data. Every effect created by a hardware-software combination is initiated in and executed by hardware, not software. Hardware even houses (control store) and paces (instruction counter) the software instructions. Hardware is indispensable. Controllers can’t work without it. It is not the case that hardware is dependent upon software for functionality. It is the other way around. Software depends upon hardware to code it, house it, access it, step through it, and implement it. That turns out to be a benefit, because hardware can be tested with finite resources, while software testing may never end. “Complete testing of a moderately complex software module is infeasible. Defect-free software product can not be assured.”[1] “Software essentially requires infinite testing, whereas hardware can usually be tested exhaustively.”[2]

1. “Software Reliability” by Jiantao Pan, Carnegie Mellon University, 1999
http://www.ece.cmu.edu/~koopman/des_s99/sw_reliability/
2. Overview of Software Reliability
http://swassurance.gsfc.nasa.gov/disciplines/reliability/index.php
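The point about the control store and instruction counter can be pictured with a toy fetch-decode-execute loop; the loop, the program counter, and the switch stand for the hardware, and the three-instruction program (invented here) is the software it houses and paces.

/* Toy fetch-decode-execute loop.  The while(), the program counter and
   the switch are the "hardware"; the prog[] table is the "software"
   it houses and paces.  The instruction set is invented. */
#include <stdio.h>

enum op { LOAD, ADD, HALT };
struct insn { enum op op; int arg; };

int main(void)
{
    const struct insn prog[] = {          /* the control store          */
        { LOAD, 5 }, { ADD, 7 }, { HALT, 0 }
    };
    int pc = 0, acc = 0, running = 1;     /* instruction counter, etc.  */

    while (running) {
        struct insn i = prog[pc++];       /* fetch, advance the counter */
        switch (i.op) {                   /* decode and execute         */
        case LOAD: acc  = i.arg; break;
        case ADD:  acc += i.arg; break;
        case HALT: running = 0;  break;
        }
    }
    printf("acc = %d\n", acc);            /* prints 12                  */
    return 0;
}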

Best regards,
CharlieM
 
>> CharlieM: "I see software as an unneeded and complicating factor."

> Don't we all. :)

>> CharlieM: "Hardware can be exhaustively tested, whereas software testing may never end (and you still ship with bugs for the user to find)."

> Disagree. (a) Hardware can still be a problem. Note the 737 rudder issue. That was
> hardware. So was the Tacoma Narrows Bridge. (b) Moving the software complexity
> into the hardware just moves the problem rather than solving it. Using PLCs as an
> example again, imagine 10 relays controlled by ladder logic. If there's a software bug,
> you yell at the programmer. But without the software those relays don't *DO*
> anything. They just sit there. If you replicate the logic in the PLC by wiring the relays
> in a massively complicated way you will just as likely have bugs. And they will be
> MUCH harder to find.

Yes ... and hardware gets older over the years, and this happens very fast in a harsh environment (heat, radiation, and so on).

The good thing is that software can't be affected by radiation and heat. Another concept would be to minimize the number of hardware components and to use raw CPU power to implement the concept of virtual devices talking to simple IO interfaces. Multicore CPUs support such a concept.

Best Regards
Armin Steinhoff
 
Charles Moeller

---- snip--- see http://www.control.com/thread/1327707041#1328018928
James Ingraham said:
> Disagree. (a) Hardware can still be a problem. Note the 737 rudder issue. That was
> hardware. So was the Tacoma Narrows Bridge. (b) Moving the software complexity
> into the hardware just moves the problem rather than solving it. Using PLCs as an
> example again, imagine 10 relays controlled by ladder logic. If there's a software bug,
> you yell at the programmer. But without the software those relays don't *DO*
> anything. They just sit there. If you replicate the logic in the PLC by wiring the relays
> in a massively complicated way you will just as likely have bugs. And they will be
> MUCH harder to find.

Before software, plant automation ran just fine.

The problems in software multiply with complexity, which climbs ever higher. Is software really necessary? If control systems relied more on hardware and less on software, some of our reliability problems would simply vanish and time- and safety-critical applications would improve. Wouldn’t it be a good thing to surely and systematically decrease the use of and dependence upon software, in favor of the more reliable and testable hardware?

What is software, anyway? We can find that software selects, sequences, and times the hardware functions. But software itself is sequenced and timed by hardware. Software, in the final analysis, can only tell the hardware what to do, in what order and, in some cases, how long to do the activity. Software provides direction to the hardware, but hardware actually performs all the functions. Software can only tell the hardware *what it is time to do:*

001010 (*now it is time to*) Do X,
001012 (*now it is time to*) Do Y,
001014 (*now it is time to*) Do Z,


Hardware is the necessary and more robust physical layer that constitutes—and connects—subsystems. Software consists of fragile over-layers of human-composed command that generate their own problems. If the hardware for control systems were smarter in matters *temporal*, it would not need as much software, or perhaps it would need none. If the hardware could be designed and configured to know *when* to do the functions and operations it already knows *how* to do, it would not need software to tell it when to do them. Shifting software functions to hardware would be key in reducing software dependency. Hardware is faster, surer, and more reliable than software. If the hardware did not need software, it could be more autonomous and simpler. Controller design efforts would be more focused and efficient as well.
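One everyday example of hardware that already "knows when" is a timer/PWM peripheral: software configures it once and then no instructions execute while it runs. A hedged sketch, with invented register names since every vendor spells them differently:

/* Sketch of "configure once, then the hardware knows when": a PWM-style
   peripheral is set up at boot and thereafter produces its waveform with
   no instructions executing at all.  The register names (TMR_PERIOD and
   so on) are invented and simulated here as plain variables. */
#include <stdint.h>
#include <stdio.h>

static uint32_t TMR_PERIOD, TMR_COMPARE, TMR_CTRL;
#define TMR_ENABLE 0x1u

static void pwm_setup(uint32_t period_ticks, uint32_t on_ticks)
{
    TMR_PERIOD  = period_ticks;   /* hardware counts to this and wraps    */
    TMR_COMPARE = on_ticks;       /* output is high below this count      */
    TMR_CTRL   |= TMR_ENABLE;     /* from here on, no software is involved */
}

int main(void)
{
    pwm_setup(1000, 250);         /* 25 % duty cycle, purely in hardware  */
    printf("period=%u compare=%u ctrl=%u\n", TMR_PERIOD, TMR_COMPARE, TMR_CTRL);
    return 0;
}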

Armin Steinhoff said:
> Yes ...and hardware gets older over the years and this happens very fast in a
> harsh environment (heat, radiation a.s.o)

Self-testing hot-swap hardware for that.

> The good thing is that software can't be affected by radiation and heat.
> Another concept would be to minimize the number of hardware components and to use
> raw CPU power to implement the concept of virtual devices talking to simple IO
> interfaces. Multicore CPUs support such a concept

Amidst further complexity.

Best regards,
CharlieM
 
James Ingraham

CharlesM: "Before software, plant automation ran just fine."

Selective memory. Yes, there were plenty of plants running before the invention of the microprocessor. They had their own share of problems. WAY back, a water wheel would turn outside, driving every single machine in the plant. If the belt broke, every machine went down. If you wanted to add a machine, every machine went down. If you wanted to fix a machine, every machine went down.

You didn't address my points about the 737 rudder malfunction or the Tacoma Narrows bridge. The 737 rudder problem took over a DECADE to find. If you specifically want a plant automation example, the Union Carbide Bhopal disaster had nothing to do with software.

CharlesM: "The problems in software multiply with complexity, which climbs ever higher."

Again, the problem is not software. The problem is the TASK. If you replicate my ladder logic with relays you will have EXACTLY the same complexity.

CharlesM: "Is software really necessary?"

Apparently. Most of the tasks I have programmed could be theoretically done with hardware alone, but it would be a royal pain, and you couldn't hit the cycle times. And what do you do when you change something? On my systems, if on Tuesday they run out of the correct box size and need to run a slightly bigger box, they just go change a couple values on the HMI and keep running. Do you really want to re-program an FPGA for that? Or spin a new ASIC?
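The software side of that kind of change is almost trivial, which is part of the point; a sketch with invented tag names and geometry:

/* Sketch of a recipe value changed from an HMI at run time.  In a real
   PLC this is just a register the HMI writes; here it is faked with a
   variable that "changes on Tuesday". */
#include <stdio.h>

static double hmi_box_length_mm = 300.0;   /* written by the HMI            */

static double flap_fold_position(double box_length_mm)
{
    return box_length_mm / 2.0 + 25.0;     /* invented machine geometry     */
}

int main(void)
{
    printf("fold at %.1f mm\n", flap_fold_position(hmi_box_length_mm));

    hmi_box_length_mm = 320.0;             /* operator keys in a bigger box */
    printf("fold at %.1f mm\n", flap_fold_position(hmi_box_length_mm));
    return 0;
}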

I don't have any customers telling me 'Please give us less information out of your machine. Make sure it's all bang-bang automation rather than servo drives. And don't give me one of those new-fangled color screens; I want 137 flat black push-buttons arrayed in no particular order.'

CharlesM: "If control systems relied more on hardware and less on software, some of our reliability problems would simply vanish..."

My counter-argument remains the same. Yes, a relay is more reliable than software. Incorporate the functionality of the software into the relay and the complexity is the same, and therefore it has the same reliability problems.

-James Ingraham
Sage Automation, Inc.
 
William Sturm

< Charlie said: "If the hardware did not need software, it could be more autonomous and simpler." >

While I totally agree that simpler systems are exponentially more reliable, I do not agree that hardware will be simpler without software. I think that hardware would need to be added to do the logic that was previously done in software. Then all you have done is change the programming techniques from software to hardware.

Hardware is much less likely to change during operation; that I could agree with. I also think that software is frequently too complex. Since it is "soft", extra features frequently get added. There is a Peter Principle for software, I suppose. Will that not happen with hardware design?

Bill Sturm
 
> Before software, plant automation ran just fine.
>
> The problems in software multiply with complexity, which climbs ever higher. Is software really necessary?

Yes ... it is. Software is just represented by a special state of a piece of configurable hardware ... it's called memory.
In the end, all of the operations are done in hardware.

IMHO ... the complexity of industrial applications is so high that it's impossible to do it in plain hardware logic.

Best Regards
Armin Steinhoff
 
Charles Moeller

>CharlesM: "Before software, plant
>automation ran just fine."
>
>Selective memory. Yes, there were
>plenty of plants running before the
>invention of the microprocessor. They
>had their own share of problems. WAY
>back, a water wheel would turn outside,
>driving every single machine in the
>plant. If the belt broke, every machine
>went down. If you wanted to add a
>machine every machine went down. If you
>wanted to fix a machine every machine
>went down.

Now, in addition to hardware faults, we have software faults.

> You didn't address my points about the 737 rudder malfunction or the Tacoma Narrows
> bridge. The 737 rudder problem took over a DECADE to find. If you specifically want a
> plant automation example, the Union Carbide Bhopal disaster had nothing to do with
> software.

Sorry for not addressing your points:

The Bhopal disaster had nothing to do with electronic/electrical hardware, but was attributed to shoddy maintenance.

I am not familiar with the 737 rudder problem. Was it electronic logic hardware that was the problem?

The Tacoma Narrows, to my knowledge, was a case of unpredicted harmonic oscillation caused by wind-shear strumming of the suspension cables. Hardly an electronic hardware problem.

Software disasters:

“Computers are increasingly being introduced into safety-critical systems and, as a consequence, have been involved in accidents. Some of the most widely cited software-related accidents in safety-critical systems involved a computerized radiation therapy machine called the Therac-25. Between June 1985 and January 1987, six known accidents involved massive overdoses by the Therac-25 -- with resultant deaths and serious injuries. They have been described as the worst series of radiation accidents in the 35-year history of medical accelerators.” [1]
http://ei.cs.vt.edu/~cs3604/lib/Therac_25/Therac_1.html

“On February 25, 1991, during the Gulf War, an American Patriot Missile battery in Dharan, Saudi Arabia, failed to intercept an incoming Iraqi Scud missile. The Scud struck an American Army barracks and killed 28 soldiers. A report of the General Accounting office, GAO/IMTEC-92-26, entitled Patriot Missile Defense: Software Problem Led to System Failure at Dhahran, Saudi Arabia reported on the cause of the failure. ... Ironically, the fact that the bad time calculation had been improved in some parts of the code, but not all, contributed to the problem, since it meant that the inaccuracies did not cancel.”
http://www.math.psu.edu/dna/455.f96/disasters.html
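The arithmetic behind that failure is easy to reproduce, since the root cause was the truncation of the 0.1-second clock tick in a 24-bit fixed-point register. A back-of-the-envelope sketch in C, using the roughly 100 hours of uptime and the approximately 1,676 m/s closing speed quoted in the account above:

/* Back-of-the-envelope reconstruction of the Patriot timing drift:
   0.1 s has no finite binary expansion, and truncating it to the
   23 fractional bits that fit the register loses roughly 9.5e-8 s
   per tick.  Figures (about 100 h uptime, ~1676 m/s closing speed)
   are the ones quoted in the account cited above. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    double tick_true    = 0.1;
    double tick_stored  = floor(0.1 * (1 << 23)) / (double)(1 << 23); /* truncated */
    double err_per_tick = tick_true - tick_stored;

    double uptime_s = 100.0 * 3600.0;            /* ~100 hours in service    */
    double ticks    = uptime_s / 0.1;
    double time_err = ticks * err_per_tick;      /* accumulated clock error  */

    double scud_speed = 1676.0;                  /* m/s, approximate         */
    printf("error per tick  : %.3g s\n", err_per_tick);
    printf("after 100 hours : %.2f s  -> range gate off by ~%.0f m\n",
           time_err, time_err * scud_speed);
    return 0;
}

The accumulated error comes out to about a third of a second, which at Scud speed is more than half a kilometer of tracking error.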

There have been many others.

>CharlesM: "The problems in software
>multiply with complexity, which climbs
>ever higher."
>
>Again, the problem is not software.
>The problem is the TASK. If you
>replicate my ladder logic with relays
>you will have EXACTLY the same
>complexity.

IMHO the input sensors and the output contactor, triac, or thyratron hardware have to be there anyway. The software is yet another complicating set of factors, adding fetch-and-retrieve and instruction-decode delay times on top of any hardware delays.

>CharlesM: "Is software really
>necessary?"
>
>Apparently. Most of the tasks I have
>programmed could be theoretically done
>with hardware alone, but it would be a
>royal pain, and you couldn't hit the
>cycle times. And what do you do when
>you change something? On my systems, if
>on Tuesday they run out of the correct
>box size and need to run a slightly
>bigger box, they just go change a couple
>values on the HMI and keep running. Do
>you really want to re-program an FPGA
>for that? Or spin a new ASIC?

Process flexibility can, in most cases, be designed in to be dialable or switch-selectable.

> I don't have any customers telling me 'Please give us less information out of your
> machine. Make sure it's all bang-bang automation rather than servo drives. And don't give
> me one of those new-fangled color screens; I want 137 flat black push-buttons arrayed in
> no particular order.'
>
> CharlesM: "If control systems relied more on hardware and less on software, some of our
> reliability problems would simply vanish..."
>
> My counter-argument remains the same. Yes, a relay is more reliable than software.
> Incorporate the functionality of the software into the relay and the complexity is the
> same, and therefore it has the same reliability problems.

Software requires a TM and the turning of all information into data, voluminous amounts of which must be minded and stored or thrown away. Retrieved data, even if correct, is subject to improper interpretation as well.

Simpler is better.

Best regards,
CharlieM
 