Once again we see the "method of selected data" at work, producing wildly different statistical results. And, of course, the bottom line rules decisions! In many industries it always does, sometimes with terrible results, other times with fairly expensive safety systems. Several of my designs could have been much less expensive if safety had not been a consideration, but OUR "bottom line decision" was that it was far cheaper to provide the safety system than it was to kill people. Our customers did appreciate the safety systems, even when they understood the cost.
For the oil companies, a working blowout preventer can be "cheap insurance" against an event that does not happen if nobody makes a mistake, and no other equipment fails.
My suggestion for a cure for the type of decisions that led to the BP blowout is to raise the fine to a point well above the cost of ensuring that the blowout preventer will function as needed. My understanding is that BP was quite aware that the device was not able to function correctly, but they chose to let that slide. Then, when they made a number of other cost-cutting bad choices, the well did blow. The worst part is that they had the information warning them that a blowout was likely, and they chose to go ahead anyway.
Perhaps a ten- or twenty-year ban on BP doing anything in the Gulf of Mexico would convince others to be a bit more careful. Yes, "that would certainly be a harsh lesson, but fools will not learn any other way." The quote is not original with me.
My sense of the Rogers Commission hearings (and I admit this is purely subjective, based on my limited experience attending the opening hearing and talking to some of the participants) is that without the presence of Feynman, who incidentally was not well at the time and would die two years later (not really relevant, but an interesting side point), the report would have been pretty much a whitewash or, more precisely, a useless exercise. Feynman's input gave it some meat.
@Michael Grieves: I'm pretty sure Professor Petroski got the story about the differing Space Shuttle reliability estimates by engineers and managers from Richard Feynman's appendix to the Rogers Commission report, which can be found here. Feynman concludes: "For a successful technology, reality must take precedence over public relations, for nature cannot be fooled."
Many of us believe that it is long past time to eliminate the Industrial & Government Exemption for Professional Engineering Licensing. Looking at the many failures and disasters caused by non-technical persons overriding an engineering decision, the problem has a solution. If a P.E. had to "sign off" on the MN bridge, the Challenger, the Gulf spill, and many others, without override, they would not have happened.
I have a letter from Ford Motor Company, concerning a Ford Explorer, which states: "We know all about your problem. It's an uncorrectable built-in design flaw, which you're going to have to live with." And they pinned the problem on Firestone. If an engineer had responsibility, not Marketing, the problem could have been eliminated.
Ann said: I do not choose to drive a car that's potentially fatal, or use a medical device or prescription medicine that could kill me. If I knew about those possible failures ahead of time, I might be able to make different choices, either a different car or medical device or none of the above.
Bad news - you have chosen to never get medical care in any hospital in the world. Any medical device has the capability to fail, and most in a hospital can lead to death. Endoscopes and surgical devices can become contaminated, monitoring systems can fail, cath labs can shut down in the middle of a procedure, patient lifts can drop patients (actually, all of that and much, much more has happened).
And unless you're a multi-millionaire, you don't even get to choose which medical device you get, let alone access to the information you'd need to make the decision. And that would assume useful information exists - it basically doesn't.
I think that's a major point - I never saw any mention of post-design failure consideration in engineering school. And that's a main reason I got my degree; before then, I dealt with the results of post-design equipment failure and saw how pathetic the designs were with regard to failure management and prevention.
Good points, David. There is risk in all areas of energy production. I would guess that coal mining comes with a higher risk of death by injury or illness than oil. I've always heard that the energy source with the lowest risk of harm is nuclear energy. Not sure if that has changed since the growth of wind, solar and geothermal.
I would agree with the professor about the issue of the blowout preventer. While the drilling industry culture may be a major factor, what the issue argues for is a "digital twin," that is, robust sensing of the blowout preventer so as to know its complete state by maintaining a virtual digital equivalent. I use the very example of the blowout preventer in my new book, "Virtually Perfect."
Where I have a problem is with drawing the equivalence to NASA based on geography. Hindsight is an amazing capability. Every launch has pessimistic engineers who would advise not to launch. There is little penalty for predicting disaster and being wrong.
There has been much written about the shuttle disasters and the causes that led up to them. All of it is hindsight. There may have been some misguided and even bad decisions, but there was never a sense of recklessness. Launching people into space is an inherently risky business. We hold it to a much higher standard than we do commercial air travel. When something happens, Congress is delighted to search for scapegoats.
I don't know where the professor got his statistics about the estimate of expected failure rate by engineers and managers before the start of the space program. How anyone could have taken seriously a probability computed to three decimal places at that stage is a triumph of statistics over common sense.
Drawing an equivalence between the oil industry, with a 45% failure rate, and NASA, with a 2% failure rate on a task of orders of magnitude greater difficulty and unknowns, is unfair. I think we should marvel at NASA's successes.
Professor, good job on highlighting the problem of the oil industry regarding blowout preventers. Bad job on comparing that to NASA.
No engineering endeavor has zero risk. For that matter, all of life has risk. All risks need to be "appropriately" balanced. Most Americans want cheap energy (many want oil and its downstream products such as gasoline) and low foreign dependence; however, everyone wants zero risk as well. It cannot be at both extremes. Clearly, nobody wants a disaster, but how do you stop one step before disaster or otherwise mitigate the risk enough to prevent it?
I believe the low reported blowout-device failure rates are based upon testing the device to work as designed. The 45% failure rate reflects the fact that many horrific blowouts will not be safely contained by a device that "meets specification." Some blowouts are no doubt horrific and practically impossible to safely control. This indicates that either A) the specification/testing/DFMEA needs to be reviewed, B) the higher reality of risk needs to be accepted, or C) we need to accept abandoning that high-risk activity. I would advocate a combination of A and B with some mitigation.
We daily accept the risk of driving our car. Some of us daily accept the higher risk of riding a motorcycle. These risks can be magnified or mitigated by the way we drive (i.e., drive like a daredevil or drive defensively), and mitigated by car safety improvements. All of life is a risk. If I only walk, run, or ride a bicycle to avoid driving a car, I may get hit by a car or have a heart attack, AND I will need to limit my travel, which could risk reducing my income and access to extended family . . . another type of risk trade-off.
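The gap between a low tested failure rate and the 45% field figure can be sketched with a toy calculation. All numbers below are hypothetical, purely for illustration: if a device is nearly always effective under the conditions its specification covers, but many real blowouts fall outside that envelope, the overall field failure rate can be far worse than the test results suggest.

```python
# Toy illustration (all numbers hypothetical): a blowout preventer that
# passes its spec tests can still fail often in the field if many real
# blowouts fall outside the tested specification envelope.

p_within_spec = 0.5    # assumed fraction of real blowouts within spec conditions
p_ok_in_spec = 0.99    # assumed success rate when conditions match the spec
p_ok_out_spec = 0.12   # assumed success rate in out-of-spec blowouts

# Total probability of success, weighted across the two scenarios
p_success = p_within_spec * p_ok_in_spec + (1 - p_within_spec) * p_ok_out_spec
p_failure = 1 - p_success

print(f"Tested (in-spec) failure rate: {1 - p_ok_in_spec:.0%}")   # 1%
print(f"Overall field failure rate:    {p_failure:.1%}")          # 44.5%
```

With these made-up inputs, a device that fails only 1% of the time under test conditions fails about 44.5% of the time in the field, which is roughly the discrepancy the comment above is pointing at.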