After Software, What's Next?


Thread Starter

Charles Moeller

During six-plus decades of adherence to the Turing paradigm, the computer field has reaped the benefits of ever-faster, ever-denser, and ever-more-reliable hardware. Over the same span of decades, the creation and maintenance of software hasn't gotten any easier and remains problematic, especially in matters of implementation, integration, and system safety.

Computation, a technique formulated to solve cryptographic problems via instruction-dominated symbol-swapping, may not be the most appropriate means of monitoring and controlling real-world physical processes, yet that's what 98% of the billions of microprocessors and their derivatives fabricated each year are made to do.

The tangled threads of linear-sequential operation tend to inhibit each other and may cause faulty operation. After these decades of experience, hasn't a better way been developed? Even the "massively parallel" solutions are processors slaved to operate in lock-step, and each is a linear-sequential system at its core: shared-resource hardware arranged to manage data at spatial memory addresses via step-by-step software instructions.

We could do better with an alternate technology, but who would be its champion?

[email protected]
 

Curt Wuollet

This is the wrong forum to ask about change. Most are strongly attached to doing, or at least modeling, control with relays :^) It seems automation today is about the easiest way to control things rather than the best.

And most decry the procedural-programming-language stepping stone between mind and hardware. Assuming we don't veer off into massively parallel nanocomputers or chemical computing, I think the next step would be for our thinking machines to replicate a system from a description, producing an ASIC with the logic built in, or something on that order. Who will champion that? IP houses like ARM, or the silicon giants.

That's my guess.

Regards
cww
 

Charles Moeller

Thanks for your thoughts, CWW.

Relays are good for concrete conceptual understanding: ladder diagrams and logic are straightforward.

Existing computer languages, both functional and procedural, are restricted to linear-sequential thought and operation. Life is parallel-concurrent.

Give a computer the task of invention and you may get something somewhat better, but generally more of the same old stuff. The best thinking machine for generating new ideas is the human mind.

Our logic has us thinking in flat, two-dimensional channels. Time is translated to data in space and all processes are done through combinations and sequences of AND, NOT, and STORE.

What is needed: a parallel-concurrent language and logic that describes physical processes in such a way that the same description is the exact specification of the controller for each process.

CharlieM
[email protected]
 

Charles Moeller

CWW, you hit the mark with your suggestion: "...replicate a system from a description to produce an ASIC with logic built in or something on that order."

Complexity is all the rage these days.
FPGAs would be an easy trial route ahead of ASICs.

My points:

1. Do we really need to program our toasters from our vacation cities, or is the vastly increased functionality built in just because the facilities are easily available?

2. Do we really need the likes of LabVIEW to create the simple functionality that most industrial applications and consumer electronics require?

CharlieM
[email protected]
 

Curt Wuollet

No and no.

The first is its own answer: people will do it because they can. The second can be done much more efficiently with a much smaller, simpler processor, but it's not as easy. When cost and volume dictate, they generally are done that way, hence embedded systems. But not just anybody can do it.

Do you need 40 billion lines of code to emulate a terminal? No, but people do it all the time. The largest single product of the software industry, bloat, has everything to do with easy and nothing to do with good. Each successive generation of products does the same task with 10 times the software. It's Boar's law.

Regards
cww
 
IMHO ... what's next is more and more software :) We will see more and more programmable hardware in the form of huge and faster FPGAs.

New hardware description languages, together with automated verification, will allow us to develop more reliable software and hardware.

This will bring real parallel processing to automation systems. For instance, the nodes of the VARAN fieldbus are based on reprogrammable FPGAs, and there are already code generators converting IEC 61499 code to VHDL code .... software is the future :)
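
To make the "real parallel processing" point concrete, here is a minimal VHDL sketch (the entity and signal names are invented for illustration, not taken from VARAN or IEC 61499). The two processes describe two pieces of hardware that run truly concurrently once the FPGA is configured; no instruction stream decides which one gets the CPU.

library ieee;
use ieee.std_logic_1164.all;

entity two_loops is
  port (
    clk        : in  std_logic;
    sensor_a   : in  std_logic;
    sensor_b   : in  std_logic;
    actuator_a : out std_logic;
    actuator_b : out std_logic
  );
end entity;

architecture rtl of two_loops is
begin
  -- loop A: registers sensor_a and drives actuator_a
  loop_a : process (clk)
  begin
    if rising_edge(clk) then
      actuator_a <= sensor_a;
    end if;
  end process;

  -- loop B: independent logic, evaluated in the very same clock cycle as loop A
  loop_b : process (clk)
  begin
    if rising_edge(clk) then
      actuator_b <= not sensor_b;
    end if;
  end process;
end architecture;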

Best Regards
Armin Steinhoff

PS: VARAN bus -> www.varan-bus.net
 
Right, the world is multi-threaded, with both parallel and sequential operations. That's why Sequential Function Chart programming was created. It is one of the IEC 61131-3 "languages" and is very much underused. It was created by Telemecanique under the name Grafcet. Learn more about this powerful concept.

Dick Caro
 
Great thread, Charlie!

Back in the 1990s, Dick Morley hosted a series of "Chaos Conferences" in Santa Fe which examined alternatives to traditional procedural or combinatorial control. One of the major underlying themes was to look for hidden order in seemingly stochastic processes (hence "Chaos") and gain insights from that order to develop better control strategies. The conferences followed the work of the Santa Fe Institute on so-called Chaos Theory, the study of processes which, although deterministic, were highly sensitive to differences in initial conditions -- weather is the typical example.

Along the way, we looked at alternative programming methodologies with which various companies were experimenting at the time, including:

- Neural networks - essentially trying to replicate the methods of the brain, these create outputs as a function of inputs, often through a "training" process instead of explicit programming.

- Fuzzy logic - acts on input variables based on degrees of truth (or, as we say these days, "truthiness"). So, a sensed temperature could be too cold, too hot, or fairly hot, kinda cold, etc. If this sounds a bit like a linguistic PID, well, maybe it is, but as a mode of expression it may be closer to how we think about things.
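
(For a very rough sketch of the fuzzy idea, consider the VHDL below; the thresholds and signal names are invented purely for illustration. It maps an 8-bit temperature to a degree of "too cold" membership and uses that degree directly as the heater drive: a one-rule linguistic controller, "the colder it is, the harder we heat.")

library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity fuzzy_heater is
  port (
    temperature : in  unsigned(7 downto 0);  -- sensed value, arbitrary units
    heat_drive  : out unsigned(7 downto 0)   -- 0 = off, 255 = full heat
  );
end entity;

architecture rtl of fuzzy_heater is
  constant COLD_EDGE : unsigned(7 downto 0) := to_unsigned(100, 8);  -- illustrative setpoint
begin
  process (temperature)
    variable diff : unsigned(7 downto 0);
  begin
    if temperature >= COLD_EDGE then
      heat_drive <= (others => '0');        -- not "too cold" at all
    elsif temperature <= COLD_EDGE - 64 then
      heat_drive <= (others => '1');        -- fully "too cold"
    else
      diff := COLD_EDGE - temperature;      -- 1 .. 63 along the ramp
      heat_drive <= shift_left(diff, 2);    -- membership degree scaled toward full drive
    end if;
  end process;
end architecture;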

I'm not sure whether either of these approaches became widely used in industry. I know that Omron was touting fuzzy logic based controllers for a while, but haven't heard anything about these lately. Neural networks were beginning to be used in research and analysis settings, but their opaque nature made people nervous about using them in control (so, what EXACTLY did the neural network "learn"??).

I have my own ideas about how best to model real-world applications in the structure of a programming language, to reduce the distance between how we think about our actual processes and the arcane contortions we have to go through to program them. I've been working on an article on this very topic to kick off a blog I'm starting, and will provide a link on Control.com when it's launched.

In the meantime, thanks for an interesting discussion!

Ken Crater
Founder, Control.com
[email protected]
 

Charles Moeller

Dick Caro:

Thanks for the tip on Grafcet. I'll look into it.

What I expect to see, however, is another one of the accepted "languages" that are characterized by, based upon, and require a Turing-type mechanism (TM) to do the data processing. That restriction limits any resulting parallel-concurrent operations to those performed in a linear-sequential manner.

No real change in the fundamentals.

What is needed is something really different: some new thinking about how to go about controls that doesn't constrain every control problem to those that can (or must) be performed by TMs.

[email protected]
 

Charles Moeller

CWW:

Why use a processor for simple tasks? The logic for most control tasks can be done in fewer than 100 gates, so why use even a 10,000-gate processor when a small FPGA will do the job?
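
For scale, here is a rough sketch of the kind of thing I mean (the signal names are invented): a classic start/stop seal-in with an emergency-stop interlock. Synthesized, it amounts to one flip-flop and a handful of gates.

library ieee;
use ieee.std_logic_1164.all;

entity motor_latch is
  port (
    clk     : in  std_logic;
    start   : in  std_logic;  -- momentary start button
    stop    : in  std_logic;  -- momentary stop button (active high)
    e_stop  : in  std_logic;  -- emergency stop (active high)
    running : out std_logic   -- motor contactor command
  );
end entity;

architecture rtl of motor_latch is
  signal run_q : std_logic := '0';
begin
  process (clk)
  begin
    if rising_edge(clk) then
      if e_stop = '1' or stop = '1' then
        run_q <= '0';          -- any stop condition drops the latch
      elsif start = '1' then
        run_q <= '1';          -- start seals the latch in
      end if;
    end if;
  end process;

  running <= run_q;
end architecture;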

[email protected]
 

Charles Moeller

Armin Steinhoff:

I can appreciate your viewpoint, as more and more software seems to be happening.

But consider that every action that is actually performed in the controlled environment is performed by hardware. Every condition and event sensed in the real environment is generated by hardware. Why do we convert everything to software in the middle?

Instead of sampling and storing, then data-processing to determine the response, why don't we stay in the hardware mode and create simple stimulus-response mechanisms that react reliably and correctly? Control systems would be faster, safer, and less costly.
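
To sketch what I mean by a stimulus-response mechanism (using a made-up tank-fill example; the signal names are hypothetical), the response below is a pure combinational function of the sensed conditions. Nothing is sampled, stored, or scheduled; the hardware simply reacts.

library ieee;
use ieee.std_logic_1164.all;

entity fill_interlock is
  port (
    level_low  : in  std_logic;  -- float switch: tank below low mark
    level_high : in  std_logic;  -- float switch: tank above high mark
    lid_closed : in  std_logic;  -- safety condition
    fill_valve : out std_logic;  -- open the fill valve
    alarm      : out std_logic   -- impossible sensor combination
  );
end entity;

architecture rtl of fill_interlock is
begin
  -- open the valve only while the tank is low and the lid is closed
  fill_valve <= level_low and not level_high and lid_closed;

  -- "below low" and "above high" at the same time means a stuck sensor
  alarm <= level_low and level_high;
end architecture;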

[email protected]
 
@Charles Moeller,

If an FPGA (or CPLD) is the processing unit, all processing is done in the hardware and there is no external memory for program code. The software only defines the individual configuration of this hardware.
That means the result of compiling a piece of VHDL software is a specific configuration of a piece of hardware (FPGA, CPLD, ...).

Best Regards
Armin Steinhoff
 
One has to consider, though, that we got here through parallel, if modest, means. Back to the future: a control system often had logic for each individual function. Relay logic was like that, and before microprocessors, digital control was like that. You had boards of gates that could combine signals where needed, but much was done asynchronously. It could be interesting to troubleshoot, to say the least. But if you look at it and squint a little, all a PLC did was replace the rack of cards with a box that implemented the same logic. That is to say, it was not a compute engine as such, just a cheaper, more reliable replacement.

So, to mimic your earlier questions :^) Do we care what's inside the box? Do we need a PLC with guts a zillion times faster or more esoterically correct? Actual computing on the devices is in its infancy, which is why I'm fond of using a "real" computer in the first place, given a non-trivial need. Your emphasis seems to agree with my approach that it's a lot easier to add PLC functions to a full-featured computer than to add those functions to a PLC. I think that's why there are now PACs. I will play devil's advocate and ask: what problem are you trying to solve? Illuminate, please :^)

Regards
cww
 
Why indeed? We used to do things with 40 gates. It probably has something to do with finding an electrician who is interested in hardware description languages. My interest in having a processor core is that I do things that are now becoming practical in an inexpensive SoC. For example, I can see a sensor that uses machine-vision techniques for the price of a dumb sensor. It's tremendous overkill to use an ARM core in a sensor, but who cares what's in the box? Power, ground, and an output. One can distribute intelligence all over the system for the price of one PLC. We've already started down the road of systems being collections of specialized functions: motor drives, etc., etc. At some point the software will, of course, still be there; we just won't see it.

Regards
cww
 
Look at Grafcet or the Sequential Function Chart language of IEC 61131-3. It is a top-down graphical/charting specification in which the objectives of each block are defined. Each block can then be written in its own SFC, or in any of the languages of 61131-3. This is a parallel process model, NOT a Turing machine model. Look first, then please share your impressions. It's NOT what you expect.

Dick Caro
 
> Why do we convert everything to software in the middle?

> Instead of sampling and storing, then data-processing to determine the
> response, why don't we stay in the hardware mode and create simple
> stimulus-response mechanisms that react reliably and correctly. Control systems
> would be faster, safer, and less costly.

At some level of complexity, I think the distinction becomes a semantic one. An FPGA configuration begins to look like firmware, perhaps akin to microcode in a CPU. Does this make it more virtuous somehow, more robust? I've seen bugs arise in FPGAs just as troublesome as (and sometimes harder to locate than) those in software.

It is true that FPGAs are becoming much more capable, but still, software tends to be more scalable. The scalability of FPGAs is bounded by their gate capacity - they're great right up to the point where they're not, and then they fall off a cliff. So, for fixed applications in the internals of products they tend to be a useful tool, whereas for user-programmable controllers intended for a wide variety of applications, the inconsequential cost of high-performance CPUs (relative to the other costs of an automation application) means that software will continue to be a preferred approach for a long time.

However, the specific language through which a system is programmed is quite another matter. The virtue of Relay Logic was that it closely replicated the control tools of the day -- electromechanical relays. Unfortunately, that day was back in the 1960s. Our industry moves s-l-o-w-l-y...

Ken Crater
Founder, Control.com
[email protected]
 
Correction to original message - http://www.control.com/thread/1327707041#1327864914


Alan Mathison Turing was born 23 June 1912 in Maida Vale, London, England, United Kingdom.

Alan Turing is considered the father of the modern stored-program digital computer. The abstract mechanism he described is called a Turing Machine.

Dick Caro

 

Charles Moeller

Ken,

The fundamental problems of TM-dominated systems persist, and will do so for as long as we insist upon solving all our problems via the Turing paradigm: shared-resource hardware, software, and linear-sequential operation. I've intimately experienced four decades of "progress," including millions-fold advances in hardware capabilities. These meaningful improvements have produced the Moore's Law effects that have made computing available to everyman at reasonable cost. But a software crisis exists; it was ongoing before E.W. Dijkstra named it in 1971 and has continued since. We know that software can't cure the ills of software, because it hasn't done so in the last 60-plus years, and not for lack of trying.

My thesis is that the problems of software can be cured by *smarter hardware*.
 

Charles Moeller

CWW:

I see software as an unneeded and complicating factor. With it, one needs both software and hardware experts rather than just hardware ones. Hardware can be exhaustively tested, whereas software testing may never end (and you still ship with bugs for the user to find).

Regards,
CharlieM
 