What the write-up does not include is just what sort of products the standard applies to. For our products it would seem that writing the control code so that any deviation from the correct sequence of actions stops everything may already be in compliance. That type of programming did not make any distinction between deviations that were unsafe and those that had no safety impact at all. Any deviation was a show-stopper.
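The pattern described above can be sketched in a few lines. This is a hypothetical illustration, not anyone's actual control code: the step names and the `SequenceGuard` class are invented for the example. The point it shows is that the guard trips on *any* out-of-order event, with no distinction between unsafe deviations and harmless ones.

```python
# Hypothetical sketch of the "any deviation is a show-stopper" pattern.
# The step names below are assumed for illustration only.
EXPECTED_SEQUENCE = ["clamp", "drill", "retract", "unclamp"]

class SequenceGuard:
    """Tracks the expected step sequence; any deviation latches a fault."""

    def __init__(self, sequence):
        self.sequence = sequence
        self.index = 0
        self.faulted = False

    def on_event(self, event):
        """Return True if the event may proceed, False if the line is stopped."""
        if self.faulted:
            return False  # fault is latched: everything stays stopped
        if event != self.sequence[self.index]:
            # No distinction between unsafe and harmless deviations --
            # every out-of-order event halts the machine.
            self.faulted = True
            return False
        self.index = (self.index + 1) % len(self.sequence)
        return True
```

For example, `on_event("clamp")` followed by `on_event("drill")` proceeds normally, but an out-of-order `on_event("unclamp")` at that point trips the fault and every subsequent event is refused until the fault is cleared by some external reset.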
But then I have also seen other safety rules, mostly hardware, that seem to be aimed at drunks bent on self-destruction. Perhaps those standards should be revisited now.
Rich, I sympathize with you as far as the kids go. It is tough.
As for safety, it is imperative in many of the systems we interact with these days. If safety is not built in, people will not trust them. Fortunately, we have standards like IEC 61508. In aerospace, there are similar standards, like DO-254. The safety record of those systems is very good.
Are they robots or androids? We're not exactly sure. Each talking, gesturing Geminoid looks exactly like a real individual, starting with their creator, Professor Hiroshi Ishiguro of Osaka University in Japan.
For industrial control applications, or even a simple assembly line, a machine can run almost 24/7 without a break. But what happens when the task is a little more complex? That's where the "smart" machine comes in. A smart machine is one with some simple (or, in some cases, complex) processing capability that lets it adapt to changing conditions. Such machines suit a host of applications, including automotive, aerospace, defense, medical, computers and electronics, telecommunications, consumer goods, and so on. This discussion will examine what's possible with smart machines, and what tradeoffs must be made to implement such a solution.