Chuck, this is an interesting and important question. We do not design things to be fail-safe. As you point out, that would cost too much. On the other hand, our whole attitude to risk and human safety is completely bizarre on a societal level. We get all upset about things like a school shooting, while we drive our cars in a very dangerous fashion. Go figure.
Automobiles, on the other hand, are MUCH safer today. The number you quote is far lower than it was decades ago, when the population was much smaller than it is now. There are a number of factors at work here, but the most important is the design of the vehicle.
Finally, I am reminded of the old Tank McNamara cartoon. When fans were asked how long they would watch cars go round and round a track (we're talking NASCAR), they answered "a few minutes." When told that there was a possibility someone would die, they answered "as long as it takes."
Every aspect of power generation has numerous failure mechanisms, and each of those has a statistical number of deaths associated with it. The total number of predicted deaths per million from a 9.0 vs. an 8.2 earthquake involves more calculations than there are engineers to make them. A trash-to-energy plant I'm familiar with requested a permit to build and was denied because the predicted number of deaths per million for one of the stack gases (out of 30 or 40 analyzed) was 4 per million (the calculation showed one death with a margin of error of +/- 3). The applicant hired a world-respected engineering firm to re-evaluate the formula and was able to reduce the margin of error from 3 to 2 (for several hundred thousand dollars), which reduced the prediction to 3 per million, which was considered acceptable. It is difficult to separate a statistical model from the individual human lives those models represent.
At 30,000 deaths per year, we're at about the same raw number of auto deaths as in the late 1960s. With a larger population, that shows progress. Even so, if the airline industry experienced one tenth of that number in a year, all planes would be grounded until a solution was found. I'd love to see some pressure on the auto industry to create safer cars.
Excellent, thought-provoking post. It's clear that safety is a critical issue for all of us, but how are our expectations set? Many times there are additional factors that come into play. Not a lot of easy answers.
It is foolish to apply any form of statistical reasoning to a problem set more akin to a black swan than a normal distribution.
Any system should be designed for robustness as well as net risk assessment, and to apply Probabilistic Risk Assessment is to foolishly assume you know the probability of any phenomenon occurring.
Fundamentally, the Japanese and GE took a guess at the largest earthquake and tsunami and never looked back.
Had even minor thought gone in, the plants would have been sited on mildly higher ground. Fukushima units 5 and 6 were 10 meters higher than units 1-4 and, as a result, rode out the earthquake with a write-off but no meltdown.
The failure modes of things matter also, if you ever read Petroski. If a failure mode leads to the destruction of large areas of land, or the poisoning of millions, it's not a benign or acceptable failure mode.
The Editors seem pleased that the winds pushed 80% of the nuclear material out to sea and towards California.
I do not think they would be so happy if the wind had pushed 80% of the material at Tokyo instead of the 20% it did.
This is a really interesting topic to tackle, Chuck, and very relevant. It's true that safety comes at a price and that the barometer for creating safe products, buildings etc. can't really cover every possible event. Still, when you look at some occurrences in retrospect, it seems like tragedy could be avoided in many cases. Lou has a point that there are a lot of disparities in our attitudes, but generally a lot of products are much safer today than they used to be. One thing to keep in mind is you can't control individuals--you may be able to make a car safer in general, but you can't control someone who is determined to drive recklessly.
patb2009, on the topic of fish, one of the biggest sources of contamination in fish is chemical runoff, mostly from lawn fertilizer.
What I find interesting in the study you linked to is that there is no mention of the prior level. This study is a single point in time; you really cannot draw inferences from that. In addition, I am not conversant on what the natural level of these compounds would be in that area, or what levels are considered safe.
Not too long ago there was an NIH study of arsenic, I believe, in rice. The level was higher in US-grown rice. All rice has some level of the compound, which it absorbs from the soil. As far as I can tell, there is also no standard for the allowable exposure level; that is being studied. I have not heard anything since the press published articles on the study, and I do not see any evidence of large-scale problems in rice-eating countries.
My message is, be careful of how you use statistics.
The airline industry has professionally trained and certified pilots operating their aircraft, a cast of thousands constantly watching every plane in the air, and years-long investigations of every crash. Automobiles rely on amateurs with less than an hour of testing and often no training at all.
Airlines pay tens of millions of dollars for every plane, several thousand times the cost of the average new car, and yet when a plane crashes, people quite often die. A car crash happens about once every five seconds in this country and yet only 30,000 people a year die.
I think it is clear that our cars are quite safe, that the problem is the concept of nearly 200,000,000 careless amateurs being allowed to operate these devices. When 1/3 of all traffic fatalities are still alcohol related, it's obvious that what we need to fix is the driver, not the vehicle.
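The crash-vs-fatality figures above imply a strikingly low fatality rate per crash. A quick sketch of the arithmetic, using the commenter's round numbers (one crash every five seconds and 30,000 deaths per year are assumptions from the comment, not audited statistics):

```python
# Fatality rate per crash implied by the comment's round numbers.
seconds_per_year = 365 * 24 * 3600       # 31,536,000 seconds
crashes_per_year = seconds_per_year / 5  # one crash every five seconds
deaths_per_year = 30_000

deaths_per_crash = deaths_per_year / crashes_per_year
print(int(crashes_per_year))             # 6307200 crashes per year
print(round(deaths_per_crash, 4))        # roughly 0.0048 deaths per crash
```

On those numbers, fewer than 1 in 200 crashes kills anyone, which is the sense in which the cars themselves are "quite safe."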
On a rough scale of probability, let's assume there are 50 reactors in Japan with an average 20-year operating time, so they have 1000 reactor-years of experience. Fukushima and the others were built to 100-year event standards. It was apparently well known that >9.0 earthquakes and 40-m tsunamis had occurred in the Fukushima area as recently as 800 years ago; the evidence is readily available. With 1000 reactor-years, statistically Japan could have expected 10 reactor events exceeding the 100-year limits. In reality they just experienced 6 of them in March 2011 (there were 6 reactors at Fukushima Dai-ichi), causing three meltdowns - fortunately three of the reactors were in cold shutdown at the time. Why are we only using 100-year limits?
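The expected-exceedance arithmetic above can be sketched directly (the 50 reactors, 20-year average life, and 1-in-100-year design basis are the commenter's round assumptions):

```python
# Expected number of design-basis exceedances, treating a "100-year event"
# as occurring with probability 1/100 in each reactor-year.
reactors = 50
years_each = 20
reactor_years = reactors * years_each   # 1000 reactor-years of experience
rate = 1 / 100                          # exceedances per reactor-year

expected_exceedances = reactor_years * rate
print(expected_exceedances)             # 10.0 expected across the fleet
```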
Now compare the earthquake and tsunami damage to 120,000 buildings vs. the damage to 1 reactor plant, and consider where the money to make them safer should have been placed. The article suggested that money (if available) to improve safety would have been better spent on the 120,000 buildings.
Let's assume that another $1 billion could have improved the reactor plant to withstand the events of March 2011. Failing to do so has probably cost $30-40 billion: loss of a plant worth $6 billion, plus $20-30 billion in property damage and displacement. Untold cost in lives, but not immediate.
$1 billion would have bought an $8,333 improvement for each structure destroyed - not enough to withstand much of anything. But for almost $0 cost they could have built their businesses and homes 20 miles inland. Look, however, at the history of low-lying areas (New Orleans, Hurricane Katrina): people will live close to the sea and their source of livelihood to save time and transportation, and will risk their lives against a 1-in-100-year event for that. So even $8,333 in savings is not worth being 20 miles from the sea to them.
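The dollar comparison above works out as follows (every figure is the commenter's rough estimate, not an audited number):

```python
# Rough cost comparison from the comment above; all inputs are the
# commenter's round estimates.
hardening_cost = 1_000_000_000       # hypothetical $1B plant upgrade
plant_value = 6_000_000_000          # written-off plant
consequential = 25_000_000_000       # midpoint of the $20-30B range
total_loss = plant_value + consequential

buildings_destroyed = 120_000
per_building = hardening_cost / buildings_destroyed
print(round(per_building))           # about $8,333 per destroyed structure
print(total_loss // hardening_cost)  # the realized loss is ~31x the upgrade
```

Spread across buildings the money is negligible; concentrated on the plant it is dwarfed by the loss it would have prevented.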
The difference between a Fukushima reactor and an ordinary building, dwelling, or piece of infrastructure is that the building loss is limited to the cost of the building, the property, and maybe the lives in it (people who assumed the coastal risk). With a Fukushima, the damage potential is much greater: tens of thousands of lives (people who did not get to make a choice) and many square miles of land and other property in consequential damages.
I think there's no question that the $1 billion would be better spent on protecting the plant, with its huge capacity to generate consequential damage in the event of an Event.