Archive for March, 2013

TEPCO’s Feb 2013 Release of Fukushima Daiichi photos taken March 2011

March 31, 2013

Enformable’s reproduction of the photos:


Unit 3

Unit 3

Unit 4

Unit 3 water spraying

Unit 3

Unit 3

Unit 3

Unit 2 and 3

Unit 3

Unit 3

5 April 2011 New York Times cites confidential NRC Fukushima Report

March 31, 2013

Original article in NYT.

Original blog post

New York Times
U.S. Sees Array of New Threats at Japan’s Nuclear Plant
Published: April 5, 2011
United States government engineers sent to help with the crisis in Japan are warning that the troubled nuclear plant there is facing a wide array of fresh threats that could persist indefinitely, and that in some cases are expected to increase as a result of the very measures being taken to keep the plant stable, according to a confidential assessment prepared by the Nuclear Regulatory Commission. … The document, which was obtained by The New York Times, provides a more detailed technical assessment than Japanese officials have provided of the conundrum facing the Japanese as they struggle to prevent more fuel from melting at the Fukushima Daiichi plant. But it appears to rely largely on data shared with American experts by the Japanese. … The document also suggests that fragments or particles of nuclear fuel from spent fuel pools above the reactors were blown “up to one mile from the units,” and that pieces of highly radioactive material fell between two units and had to be “bulldozed over,” presumably to protect workers at the site. The ejection of nuclear material, which may have occurred during one of the earlier hydrogen explosions, may indicate more extensive damage to the extremely radioactive pools than previously disclosed. … Even so, the engineers who prepared the document do not believe that a resumption of criticality is an immediate likelihood, Neil Wilmshurst, vice president of the nuclear sector at the Electric Power Research Institute, said when contacted about the document. “I have seen no data to suggest that there is criticality ongoing,” said Mr. Wilmshurst, who was involved in the assessment. … The N.R.C. report suggests that the fuel pool of the No. 4 reactor suffered a hydrogen explosion early in the Japanese crisis and could have shed much radioactive material into the environment, what it calls “a major source term release.”

Experts worry about the fuel pools because explosions have torn away their roofs and exposed their radioactive contents. By contrast, reactors have strong containment vessels that stand a better chance of bottling up radiation from a meltdown of the fuel in the reactor core.” End quote.

As at 31 March 2013 there is the view among many ordinary people that the reactors at Fukushima Daiichi continue to release radionuclide emissions into the air as well as the sea.

Officially, TEPCO, the IAEA and governments insist that emissions into the air have ceased and dismiss the possibility that the continued release of radionuclides into the sea has an environmental impact. The official view is that the reactors at Fukushima Daiichi are not a present and continuing hazard in any form. That nothing is noteworthy, that no facts about the current state of the reactors deserve the continuing attention of the people of the world.

The salient points of the New York Times report are:
1. There is a confidential report issued by the NRC regarding the state of the Fukushima Daiichi reactors.
2. They saw an array of “fresh threats” at the plant.
3. These previously undeclared threats could “persist indefinitely”.
4. As a result of attempts to achieve “stability”, these threats could “increase” in severity.
5. This confidential information “leaked” to the NYT newspaper contained more detailed information than Japanese officials had released to the world. BUT:
6. The information was derived from that SHARED by Japan with the USA NRC.
7. Fragments or particles of fuel rods were “blown up to one mile from the (reactor) units.”
8. The folly of multiple units located together was shown by highly radioactive material having to be “bulldozed over, presumably to protect workers at the site.”
9. The ejection of “nuclear material” “may” have occurred during the earlier “hydrogen” explosions.
10. The ejection of this material – fragmented fuel rods – “may indicate more extensive damage to the extremely radioactive (fuel) pools than previously disclosed.”
11. “Engineers who prepared the document do not believe that a resumption of criticality is an immediate likelihood.” The one answer which is the most likely to the issue raised by David Chanin is thus discounted by: “Neil Wilmshurst, vice president of the nuclear sector at the Electric Power Research Institute”.

And it is this denial which seems debunked by the science journal Nature, as quoted in the last post.

12. “The N.R.C. report suggests that the fuel pool of the No. 4 reactor suffered a hydrogen explosion early in the Japanese crisis and could have shed much radioactive material into the environment, what it calls “a major source term release.”” The events of 14-15 March 2011 at spent fuel pool number 4 almost certainly involved an explosion and a fire, and a major release of fresh used fuel rod contents. This seems responsible for a large escalation of the crisis and marked the phase of a major fudging of events. A review of Australian TV news footage for those two days shows this fudging. Rewatching those TV news programs now, listening for mention of a fuel pool event, yields no clear description of events.

13. “Experts worry about the fuel pools because explosions have torn away their roofs and exposed their radioactive contents.” Yet this official fear has been played down, and nuclear industry and its spokespeople have been extreme in their denial that this ever was or could be a factor affecting Japan.

In truth, the history of nuclear reactor approval since the late 1960s has revolved around this very vulnerability of spent fuel pools. The technical vulnerability of spent fuel pools and the inability of industry to solve the on-site storage of spent fuel are key vulnerabilities of every reactor on earth. And the fears have been actualized not by ideology but by the facts of the matter. The containment buildings are gone, blown away as a result of the use of Zircaloy as fuel rod cladding and its ability to liberate hydrogen from water at high temperatures. If this is the best design for fuel rods, the use of a cladding material which causes explosions sufficient to destroy containment buildings, then the people have been lied to for decades by nuclear industry.

Ongoing criticality might be due to fuel pool disturbance by hydrogen explosion enabled by the use of Zircaloy as cladding. It may also be caused by fuel melt within the reactors. The greatest store of nuclear material is in the fuel pools.

The storage of many tons of contaminated water is one example of a growing threat brought about by attempts to reach stability at the reactors.

The NRC report also raises questions about the stability and integrity of the reactor vessels themselves:

Source: New York Times, NRC leaked document.

One may speculate on the source of the leak. It probably was not any of the NRC commissioners who sought the expulsion of the former NRC Chairperson Gregory B. Jaczko, for these commissioners are famous for acting in the interests of the nuclear industry.

“Nature” Journal news blog: Did Fission continue at Fukushima after reactor shutdowns?

March 30, 2013

This is a report from April 2011. Please visit the Nature site for full information.

At the time, the Japanese government, and all other governments, nuclear experts, and the supposed expert organisation the IAEA were proclaiming that the afflicted reactors at Fukushima Daiichi had safely shut down, that there was no emission from them and that all was well. Nuclear critics were claimed to be confusing nuclear reactors with atomic bombs, when in fact the truth is that all nuclear technology can produce nuclear pollution. Nuclear fallout is the weapon effect which reactors share, whether at refuelling or in accident mode. The reactor accident mode releases nuclear pollutants at a vast scale in proportion to the mass of fuel rod involved. Bombs generate fission and fuel particle emissions for a brief time. As evidenced by Fukushima Daiichi, fallout production and nuclear pollutant emissions can and do continue for years post accident.

It is pointless for nuclear experts to pretend that nuclear pollutant emissions from Fukushima Daiichi ceased in March 2011. It is pointless pretending that any reference to fallout transport, deposition and biochemistry derived from the military documents of the nuclear era is irrelevant. They are relevant. A case in point is the US Army’s Special Weapon Project, which estimated in 1955 that plankton in seawater concentrates nuclear contaminant radiochemicals many thousands of times. It was in 2013 that the new record for cesium per kg of fish body weight was recorded in a fish caught by TEPCO. The mechanism by which this bioaccumulation has killed the afflicted fishing industry is not a mystery; it was fully and accurately described by the US military 58 years ago.
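The 1955 estimate can be expressed as a simple bioconcentration factor. The sketch below is illustrative only: the concentration factor of 2000 and the water activity are made-up round numbers chosen to show the arithmetic, not measured Fukushima data.

```python
# Illustrative bioconcentration-factor (CF) calculation.
# The CF and water activity below are hypothetical round numbers.
def tissue_concentration(water_bq_per_litre, concentration_factor):
    """Activity in the organism (Bq/kg) = water activity (Bq/L) * CF."""
    return water_bq_per_litre * concentration_factor

# Even modest water contamination implies high activity per kg of plankton
# when the organism concentrates a radiochemical thousands of times:
print(tissue_concentration(10.0, 2000))  # 10 Bq/L -> 20000.0 Bq/kg
```

This is the mechanism behind high readings in fish: each step up the food chain starts from prey that has already concentrated the contaminant.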

Nuclear industry is the most prolific nuclear polluter and has been since radium was first used for making luminous paint. The nuclear industry relies on the myth of progress and an impotent futurism to promote this old-fashioned, dirty and deceptive industry. The original female dial painters, as they died of radium-induced cancers, mainly of the face and jaw, were labelled by the industry in 1925 as sluts suffering from syphilis. Today the Japanese authorities claim people suffering fatigue and other symptoms are “weak minded” and that plutonium is safe to eat.

No progress there since 1925 is there?

“Nature” writes at the above link:

There is growing evidence that uranium and plutonium fuel at the Fukushima nuclear plant may have continued nuclear fission chain reactions long after the reactors were shut down almost three weeks ago. This worrying development may explain the continued release of some shorter-lived radioisotopes from the stricken site.

Tepco, the plant operator, said earlier this week that it had – on 13 occasions – detected beams of neutrons near the reactors. Neutrons are produced during fission of nuclear fuel, and are the key driver of the chain reaction that sustains continuous fission reactions in a reactor.

Japan Today reports that “the neutron beam was measured about 1.5 kilometers southwest of the plant’s No. 1 and 2 reactors over three days from March 13.”

The neutron beam didn’t pack much of a punch – if anyone got in its way, it would likely deliver a dose of just 0.01 to 0.02 microsieverts per hour. But the finding tallies with a recent analysis of other isotopes found at the plant, published in the Asia-Pacific Journal: Japan Focus

Ferenc Dalnoki-Veress, at the James Martin Center for Non-Proliferation Studies of the Monterey Institute of International Studies in California, hones in on the significance of a very short-lived radioisotope, chlorine-38, in the water in the turbine building of reactor 1.

In an introduction to the analysis, Arjun Makhijani, president of the Institute for Energy and Environmental Research, an energy and environment information-provider based in Takoma Park, Maryland, explains:

Chlorine-38, which has a half-life of only 37 minutes, is created when stable chlorine-37, which is about one-fourth of the chlorine in salt, absorbs a neutron. Since seawater has been used to cool [the reactors], there is now a large amount of salt – thousands of kilograms – in all three reactors. Now, if a reactor is truly shut down, there is only one source of neutrons – spontaneous fission of some heavy metals that are created when the reactor is working that are present in the reactor fuel. The most important ones are two isotopes of plutonium and two of curium.

But if accidental chain reactions are occurring, it means that the efforts to completely shut down the reactor by mixing boron with the seawater have not completely succeeded. Periodic criticalities, or even a single accidental one, would mean that highly radioactive fission and activation products are being (or have been) created at least in Unit 1 since it was shut down. It would also mean that one or more intense bursts of neutrons, which cause heavy radiation damage to people, have occurred and possibly could occur again, unless the mechanism is understood and measures taken to prevent it. Measures would also need to be taken to protect workers and to measure potential neutron and gamma radiation exposure.
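The force of Makhijani’s point is in the arithmetic: with a half-life of only about 37 minutes, any chlorine-38 detected must have been produced by neutron capture only hours, at most, beforehand. A minimal sketch, assuming simple exponential decay:

```python
# Fraction of Cl-38 remaining after a given time, assuming simple
# exponential decay with the published ~37-minute half-life.
CL38_HALF_LIFE_MIN = 37.0

def remaining_fraction(hours):
    return 0.5 ** (hours * 60.0 / CL38_HALF_LIFE_MIN)

# After half a day essentially none survives, so detected Cl-38 implies
# a neutron source was active very recently:
for h in (1, 3, 6, 12):
    print(h, remaining_fraction(h))
```

After six hours barely a tenth of one percent remains, which is why Cl-38 in the turbine-building water was read as evidence of a continuing neutron source rather than a leftover from reactor operation.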

There’s a great debate about the implications of all this going on over at Arms Control Wonk.

In this matter I recall an email exchange between Mr David Chanin, a New Mexico based nuclear decontamination expert (David Chanin Consulting), and myself. While the exchange was mutually frustrating, we did persist in understanding each other as well as we could. (The exchange eventually got too much for me, but that is no reflection upon Mr Chanin.)

Mr Chanin is, I figure, big enough, without doubt, to disseminate his point of view without my help, and ditto for me. Mr Chanin is an example of open-minded American inquiry and professional diligence within the sphere of independent nuclear expertise. As film maker David Bradbury stated, the issue is beyond black hats and white hats, and the nuclear lay person (i.e. the Downwinders of the planet, all of us) needs the nuclear expert. At a certain point, the ordinary person who disputes the official view relies, in the end, upon the coup de grace of the expert for validation in the law court. While no one has ever really won in court, for everything gets watered down and the Law Lords eventually do a Tommy by going deaf, dumb and blind, the reality of the Downwinders (forget about the mechanism, bombs or reactors; the fallout is essentially the same, though reactors are much longer-lived and more insidious sources of nuclear pollution than bombs) eventually does get through. Nuclear industry relies upon the conservatism of the ignorant to persist. If everyone was a Martha Bordoli Laird or an Aboriginal Australian who lived through the fifties, nuclear industry would shut down.

So at the time, I figured I should consider what Mr Chanin had to say even though we annoyed each other with our views of internal emitters.

Mr Chanin’s website is here:

I figure he is a person who figures things out for himself on the basis of his knowledge and on the basis of that which he believes. Just like everyone else. That said, in the final analysis, we didn’t get on. Here is one email he sent me:
What’s amazing to me is that I seem to be the only one who thinks that I-131 levels should be decreasing with 8-day halflife because its only parents in the “standard NRC 60-nuclide list for reactors” are Te-131 and Te-131m, both with shorter halflives, so they can’t be causing any I-131 buildup and certainly can’t cause the high levels of I-131 being reported in the flood of measurements that were published by TEPCO all on April 19, with measurements of seawater as far away as 15 km showing I:Cs ratios of over 2:1 and as high as 3:1, but sometimes they’re equal, with few to none where I-131 is measured at levels less than Cs-134 and Cs-137 on a Bq/gram-water basis with 1000-second counting time of 1-liter sample, which matches up with usage of a gamma spectrometry machine like the GAM-AN1 by Canberra.

Can you do me a favor and ask one of your nuclear engineer contacts how and why I-131 can be over double the reported levels of Cs-134 and Cs-137, after five halflives of I-131? I’m not a nuclear engineer who can try to run the Origen code for their reactors and the SNF pools to see what could be making the I-131. I’m the consequence analyst who developed the MACCS2 code and have used it and its predecessor MACCS since the 1980s for nuclear accident analysis.

All I know is that when people use the MACCS2 code, which is the NRC-approved code for reactor PRA consequence calculations, and is used worldwide for well over 500 nuclear facilities and operations since its release in 1997, the MACCS2 code shows ZERO consequences from I-131 from reactor accidents after 40 days of decay. It’s not just the direct exposure doses from groundshine and inhalation, it’s also the food doses calculated by the code with both of the “food models” that are available to the code user. Milk from cows grazing during a large release shows very low levels of I-131 after 40 days according to the MACCS2 calculations.

And it’s also my understanding that “normal levels” of I-131 in SNF pools should be practically zero, with the million-year, weak emitter, I-129 being the only iodine that should be detected to any significant degree in SNF water from an intact pool under normal operation. So, if my MACCS2 code is wrong about I-131, then all the safety analyses that use MACCS2 to calculate nuclear accident impacts are also wrong. That’s why this is an important question.

Even if criticalities are ongoing, it’s impossible for me to imagine that they could be creating so much I-131. I’ve used “standard decay tables” that all derive from ICRP 38 and were calculated by Keith Eckerman, at ORNL, who calculates the internal and external DCFs for US and international agencies which all rely on the ICRP 38 decay chains, where decay-chain calcs are necessary because of the decay and buildup of progeny after an intake, both on the ground for deposited material and in the human body from inhaled or ingested radionuclides.

I have not tried to use this database from KfK to solve the puzzle.

So my question, which you can forward around with all the above and below, is: Why are the I-131 levels of April 19 in “plant-water” and seawater so high after 5 halflives? The NRC says that the MACCS2 code is essentially error-free. I’m curious if that’s true because I learned way back in school that there is no such thing as bug-free large-scale software such as MACCS2, which has received little-to-no verification and validation for complex scenarios.

I have no qualms whatsoever being known as the source of this request. I’ve never pretended to know everything.

David Chanin

Re NYT article of April 5, 2011, “U.S. Sees Array of New Threats at Japan’s Nuclear Plant”:

“Even so, the engineers who prepared the document do not believe that a resumption of criticality is an immediate likelihood, Neil Wilmshurst, vice president of the nuclear sector at the Electric Power Research Institute, said when contacted about the document. “I have seen no data to suggest that there is criticality ongoing,” said Mr. Wilmshurst, who was involved in the assessment.”

The above email is dated Friday, 22 April, 2011, 10:40 PM
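The decay arithmetic behind Mr Chanin’s question can be sketched as follows. This is my illustration, not his code; the half-lives are the standard published values.

```python
# Sketch of the decay arithmetic in the email above (not Chanin's code).
I131_HALF_LIFE_D = 8.02            # I-131 half-life, days
CS137_HALF_LIFE_D = 30.1 * 365.25  # Cs-137 half-life (~30.1 years), in days

def decay_factor(elapsed_days, half_life_days):
    """Fraction of the original activity remaining after elapsed_days."""
    return 0.5 ** (elapsed_days / half_life_days)

elapsed = 40.0  # roughly five I-131 half-lives (mid-March to April 19)
i131 = decay_factor(elapsed, I131_HALF_LIFE_D)    # ~3% of the I-131 remains
cs137 = decay_factor(elapsed, CS137_HALF_LIFE_D)  # Cs-137 barely decays
# For I:Cs to still read 2:1 after 40 days, the ratio at shutdown would
# have needed to be roughly 2 / (i131 / cs137), i.e. over 60:1 -- unless
# fresh fission had been replenishing the I-131 in the meantime.
print(i131, cs137, 2.0 * cs137 / i131)
```

That gap between the implied shutdown-time ratio and anything the standard decay chains can supply is exactly why the April 19 measurements were so puzzling.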

I passed this email to a person I know, a friendly reactor worker at Oak Ridge. His response was to take it easy. And in my mind, the more extreme the anti-bullshit people appear to be, the better the nukers in charge like it. The average reactor worker wants a safe environment as much as everyone else. What is “safe” is the moot point.

I also passed the email to a contact in Japan. And the matter was commented upon by a British expert as a result.

Now, Mr Chanin and I tried to find (I think) a rational explanation for the surprising data. An explanation which excluded the obvious. I thought up reasons based upon isotope separation due to thermal effects in sea water and inadequate monitoring methods. None of these were satisfactory to Mr Chanin, who quoted the extract from the New York Times above. No one at the NRC, it seems, wanted to contemplate uncontrolled fission at Fukushima Daiichi, but it seems to me that this is in fact what occurred.

“Hmm. I’ve read all three of your email replies. None of your theories conform with the physics. David.”

If there was uncontrolled fission occurring at Fukushima Daiichi in April 2011, it is about time the IAEA updated itself. It’s about time nuclear industry updated itself.

All the way through the event we have heard from Japan how wonderful it is as a competent nuclear authority, and all the way through we have seen indications that the responses to Fukushima Daiichi and the events there are generic, not peculiarly Japanese. We can certainly expect the same bullshit from nuclear authorities around the world in similar circumstances.

Meanwhile, independent thinking leads the self-reliant thinker to conclude that “the inhalation hazard is underestimated in Japan”, “it’s an uncontrolled experiment upon the children of Japan”, and “we will never know the truth about the events in Japan”.

These are not merely the essence of my thoughts; they are the essence of what an independent decontamination expert in the US thought in April to July 2011.

I see no reason to argue with David Chanin in those respects.

The economic imperative seems to me to be more important to the Japanese nuclear authorities than the health physics ones. I make this statement with qualified support from a US decontamination expert, though I hope in reporting this I am not aiding the propaganda arm of US nuclear authorities, who, in the same reactor situation as suffered in Japan, would probably try to pull the same information control stunts as the Japanese authorities surely have.

Now, is anyone still measuring fresh iodine isotopes in either seawater or air around Fukushima Daiichi?

The self-serving statements issued by the Japanese authorities, including their broadcast view that the atomic test era was perfectly safe, undercut the just claims of nuclear veterans presented to the British courts and recently rejected by those courts.

And so I say, nuclear victims wishing to claim justice must realize the anarchy nuclear industry has created in this regard over the decades. There is an irrational resistance by the nuclear bodies.

People are expendable, and individuals are expected to bear the risks imposed for the benefit of the protected and featherbedded nuclear agents and agencies.

What is truth?

A blog is not the ideal medium with which to carry out a time-based study of events, proclamations of truth, and what the truth actually was and is. Past posts get lost down the timeline. I only react to that which has been published. I might be wrong or right. Mine is just opinion. On the basis of history I can say with certainty that the truth presented by nuclear victims includes the fact that nuclear authorities hide the truth and seek to sow argument within populations so as to divide them.

The fact is nuclear industry promotes itself as clean and green, and it is neither and never has been. Normal routine reactor emissions are not safe; neither are the slag heaps at the world’s uranium mines, nor the dump sites for nuclear waste which burden the world for millions of years.

Of course, the email exchange above raises questions which exist in relation to the confidential report by NRC, leaked to the New York Times and quoted by Mr Chanin.

Time to review that, I think.

Lessons from Fukushima : Dr Majia Holmer Nelson

March 28, 2013
“Lessons From Fukushima: Governments and the Media Will Deceive the Public and Withhold Vital Information, Leaving Citizens to Create Informal Information Sharing Networks. This presentation will demonstrate that the Japanese and U.S. Governments withheld vital information from their citizens about the direction and risks of Fukushima radiation plumes and the degree and consequences of radioactive fallout. Second, the presentation will demonstrate that the mainstream news media, including The Wall Street Journal and The New York Times, were complicit in hiding information about fallout levels, dispersion, and plant conditions. The U.S. media are commonly recognized as more independent from government than Japanese media. This disaster demonstrates that the U.S. mainstream news media censor information, even when public safety is at issue. Finally, this presentation examines the spontaneous creation of information sharing sites and the subsequent development of a robust network of citizen-supported information sites in Japan and the United States.” Abstract, Dr Majia Holmer Nelson

This PowerPoint presentation is a wonderful and enlightening work which explains in an expert and coherent manner the process of deception. A deception which is aimed by nuclear authorities at protecting the interests of those authorities.

In Australia, a study of the British Nuclear Tests reveals a similar process. For instance, a study of the suppression of the contrary view as proposed by Marston.

Globally, the Tobacco industry comes to mind as an entity that used a very similar process.

In fact I have been wracking my brains for months trying to remember the advertising specialist who worked both for big tobacco and nuclear polluters. Can’t recall it.

On tobacco, the role of polonium as a cancer initiator vectored to the body by cigarette smoke is an important topic. The study of cancer and cigarettes has relevance to nuclear pollution. In the modern world, chronic exposure is rarely a single-agent insult. Synergistic interactions likely lead to profound effects which are poorly predicted by current single-agent models, in my opinion.

As such, the decades of human disease caused by tobacco and polonium in cigarette smoke is in effect, a long term study into the effects of long term polonium uptake by humans. The medical database and acknowledged causal link between cancer and cigarette smoke needs only one thing: A study to determine the contribution to disease made by polonium. Given the hundreds of toxins in cigarette smoke, this may be difficult.

Nonetheless, the cocktail of insults which results from smoking is an analogue for our synthetic environment in the wake of Fukushima and other, local, nuclear and chemical emissions.

90 percent of smokers do not get lung cancers. They die of other tobacco related illnesses. The diseases caused by cigarette smoke are not limited to cancer.

The cigarette/polonium deception is no different in its essence to the fallout deceptions which have repeatedly been forced upon the world.

There was a time when “scientific” papers produced or funded by big tobacco were held in esteem by the learned community. That time has passed, and now the utterances of “experts” on behalf of big tobacco are greeted with nearly universal scorn by both experts and lay people.

So it will be, one day, that people will discuss the many decades of documented lies foisted upon the world by nuclear “authorities”. Those documents of lies will elicit scorn and condemnation rather than controversy in the decades ahead for the truth will be abundantly plain by then. The exclusive right to comment claimed by nuclear industry is being eroded with every passing day.

Collectively, nuclear industry is no different to Philip Morris P/L. It has the same axe to grind.

Majia’s analysis also contributes something extra which is completely lacking in the mass media’s narrative of Fukushima: a coherent, inter-related narrative of the time span of events, predictable from history. Whereas the industry would have it that each deceptively described crisis was discrete, unique, unforeseeable, and harmlessly resolved.

The truth is that the seeds of the Fukushima Daiichi event were predicted as early as the 1960s. The facility was built in the 1970s nevertheless, and, in a slap to the face of humanity, TEPCO decided, as an economy measure, to locate the emergency generators down in the basements, along with the electrics and control panels.

That is not dumb, it’s criminal. One of many such acts.

I believe that the secrecy provisions within the Atomic Energy Act 1954, as currently amended, and associated US legislation, effectively enforced censorship upon the world media. The US law states that “special nuclear material” produced during the production of civil nuclear energy is subject to the secrecy provisions. Other legislation controls related unclassified controlled nuclear information.

Further, I believe that the internal war which broke out within the NRC in September 2011 was a battle over the selection, use and public reporting of nuclear information. The battle claimed the NRC career of Chairman Gregory Jaczko. My view at the time was that Americans were big enough and ugly enough to look after themselves; that, if Americans pulled their heads out of the sand, they could change the law, particularly the Atomic Energy Act. However, that’s Australian arrogance on my part.

Effectively, Jaczko wanted, in my view, more open discussion. The NRC members wanted less.

The Atomic Energy Act should be amended to mandate more open discussion, not less, and raw data should be released to independent institutes and researchers at the same time as the NRC receives it.

If that were the case, the US media, and the sources which feed it, would be as open in regard to nuclear news as it claims to be on any other news topic.

The Atomic Energy Act and associated US Laws basically compel a media conspiracy as the law stands at the present time.

Woe betide anyone who points this simple fact out. One does not need to be a bush lawyer to understand the relevant paragraphs of the Atomic Energy Act to see this. If the USA is a democracy, and if the role of government in a democracy is to make and amend laws, then the offending paragraphs of the Atomic Energy Act are a worthy target of concerted action by voters who believe these provisions in law are against the interests of open government and the public interest.

The days of the concept of “Born Secret” should be as dead as the Cold War. But the concept is not.

Emissions from Fukushima Diiachi Present in the Tropopause and Jetstream

March 27, 2013

Quote: “The important feature of this accident was that the source position was evidently clear, however, time and vertical emission variations were unknown (in this case, it was known that the height of emission was not so high in altitude).”

Wrong. Radiosondes released from Fukushima University showed beta radiation levels off the chart at jet stream levels (30,000 feet or 10 km):

Every one of these atmospheric simulations that do not include high-altitude measurements of radiation is invalid.

From Bobby1.
End quote.

Thanks very much for this information and for the information and source on your site. It is very important, it helps me immensely and it knocks on the head the uncertainty I have had about the transport vectors and reach.

In my searches, the information I had found did not take away my uncertainty regarding the jet stream as a major vector. Your source does give certainty.

The issue which has vexed me is whether the particles released by the Fukushima event have reached the stratosphere. If quantities of fission and fuel particles have reached the stratosphere, global fallout (much more significant in the Northern Hemisphere than the Southern) will continue for years, even if significant emissions into air from Fukushima have ceased, and even if there are no future releases.
which is: Transport and fallout of stratospheric radioactive debris

I hold the view that as the presence of emissions from Fukushima Daiichi is confirmed in the tropopause and jet stream, they are probably present in the stratosphere. Therefore, the Fukushima event will probably continue for decades as fallout is slowly removed from the stratosphere by the demonstrated action of slow fallout.

The precedent for my belief that the effect of Fukushima nuclear fallout will last for decades, if it is present in the stratospheric sink, lies in the many decades’ worth of study of global stratospheric fallout from the era of nuclear weapons testing.

I now know, because of the source material presented at Bobby1’s link above, that the long term hazard posed by the stratospheric sink may be, and probably is, a hazard in the decades ahead.

Both immediate and long term fallout are important.

“By the time stratospheric fallout reaches the earth, its radioactivity is greatly reduced. For example, after one year, the time typically required for any sizable amount of fission products to move from the northern to the southern stratosphere, the rate of decay will be less than a hundred thousandth of what it was one hour after the blast. It is for this reason that stratospheric fallout does not have the potential to cause widespread and immediate sickness or death.”
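The "hundred thousandth" figure in the quote is consistent with the empirical Way-Wigner approximation, under which the gross activity of mixed fission products falls off roughly as t to the power of -1.2. This is a sketch only; the rule is itself an approximation, good to within roughly a factor of two over hours to months:

```python
# Way-Wigner approximation: gross activity of mixed fission products
# decays roughly as A(t) = A(1 h) * t**-1.2, with t in hours since fission.
def relative_activity(t_hours: float) -> float:
    """Activity relative to the activity one hour after the blast."""
    return t_hours ** -1.2

one_year_hours = 365 * 24  # 8760 h
ratio = relative_activity(one_year_hours)
print(f"After one year: {ratio:.2e} of the 1-hour activity")
```

The result is on the order of a hundred-thousandth of the one-hour activity, matching the quoted passage.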

Nonetheless, in terms of impacts, stratospheric fallout is a concern into the future. And this concerns not only people present on the earth now, but also those yet to be born.

The quote above is from:

Report on the Health Consequences to the American Population from Nuclear Weapons Tests Conducted by the United States and Other Nations
Chapter 2: Fallout from Nuclear Weapons
2.1 Fallout Production Mechanisms
Chapter 3: Estimation of Doses from Fallout
Chapter 4: Potential Health Consequences from Exposure of the United States Population to Radioactive Fallout

See also:
Fallout from Nuclear Weapons Tests and Cancer Risks
Exposures 50 years ago still have health implications
today that will continue into the future
Steven L. Simon, André Bouville and Charles E. Land

US National Cancer Institute.

It has been the conventional view that the size of a nuclear explosion is the sole factor determining the altitude reached by fallout products. Within this view, the explosions at Fukushima Daiichi in March 2011 were trivial, and the view was and remains that the resulting fallout depended on surface winds and other low-level factors alone.

However Stohl and other studies, including the Fukushima University radiosonde data above, show that the Fukushima Daiichi emissions have reached very high altitudes.

Therefore the conventional view, that the many tons of radioactive material resident at the site, and the portion of it liberated (or unsealed) by the events of March 2011, pose only a local hazard, is of no comfort.

For the data show that, despite the (in nuclear weapons terms) small size of the explosions at the Fukushima Daiichi plant, the emissions have indeed reached jet stream altitudes. The hazard posed by further releases of radioactive material from the broken site remains.

The Fukushima Daiichi site is one shared by multiple nuclear reactors. The concept of such a “reactor park” was studied and heavily criticized in the 1970s. Two such critics were Nader and Abbotts, who devoted a chapter to the subject in their book “The Menace of Atomic Energy”, Outback Press, Victoria, Australia, 1977. ISBN 0 86888 0515.

The combined thermal updraft created by the multiple units at the site may be responsible for the high altitudes at which the Fukushima emissions have been detected.

US CDC Dose Estimates from Nuclear Testing

(see cdc document at link given above.)

Executive Summary

“Radiation Dose Estimates. In this study, for the first time, doses for representative persons
in all counties of the contiguous United States have been estimated from exposure to the
most important radionuclides produced as a result of nuclear weapons testing from 1951
through 1962 by the United States and other nations. Any person living in the contiguous
United States since 1951 has been exposed to radioactive fallout, and all organs and tissues
of the body have received some radiation exposure. Doses were estimated separately for the
tests conducted at NTS and the tests conducted at other sites throughout the world (global fallout).
Lifetime dose estimates were calculated separately for external and internal irradiation.
External irradiation results from exposure to radiation emitted outside of the body, for
example, by radionuclides present on the ground. The corresponding doses are similar in
most body organs and vary little with the age of the person. On the other hand, internal
irradiation results from the decay of radionuclides incorporated in the body by inhalation or
ingestion, with levels of exposure varying according to age and to the distribution of
radionuclides in the organs and tissues of the body. For example, radioiodines concentrate
in the thyroid gland, whereas radiostrontium is found mainly in bone tissues.
Because the purpose of the project was only to determine the feasibility of doing a study,
there was no intention in the allowed time to develop new tools or to gather all data needed
to complete an extensive study of doses to Americans from nuclear weapons tests conducted
by the United States and other nations. Instead, crude dose estimates were made on the
basis of reviewing a limited number of reports in details and using available dose assessment
models. In some cases—particularly for the doses resulting from the intake of shorter-lived
radionuclides (e.g., iodine-131) in global fallout—the doses calculated may have
considerable error. Future work would improve the precision of these calculations.
The usefulness of the doses estimated in this project is limited to rudimentary evaluations of
the average impact on limited health outcomes for the population of the United States.
Because of the low precision of the dose estimates, they should not be used to estimate
health effects for specific individuals or for subpopulations. The goal of these calculations
was to determine only the feasibility of a study; therefore, the magnitude of uncertainty of
these doses has not been fully evaluated. However, though the computed county-specific
deposition densities and doses are uncertain, dose maps presented in this report are useful
for illustrating general spatial patterns of fallout exposure for average individuals across the
United States.

A summary of doses averaged over the contiguous United States is presented in Table 1 as
an example of the findings from this study. Because the thyroid and red bone marrow are
among the most radiosensitive organs and tissues of the body, their doses were selected as
examples for presentation (Table 1). Thyroid cancer, noncancer thyroid disease, and
leukemia, which arises from the red bone marrow, are the health effects that are discussed in
this report. ”

The United States does not possess a detailed record of the health impacts contributed to its population by nuclear testing. Nuclear fallout from civil nuclear engineering is an additional contribution, one which adds to the unknown.

March 15 2011 Fukushima Daiichi (again)

March 26, 2013

See also

Geophysical Research Abstracts
Vol. 15, EGU2013-3793-3, 2013
EGU General Assembly 2013
© Author(s) 2013. CC Attribution 3.0 License.
The time series analysis of the radionuclide emissions from Fukushima
Daiichi nuclear power plant by online global chemistry transport model
and inverse model

Takashi Maki, Taichu Tanaka, Mizuo Kajino, Tsuyoshi Sekiyama, Yasuhito Igarashi, and Masao Mikami
Meteorological Research Institute, Japan

The accident of the Fukushima Daiichi nuclear power plant that occurred in March 2011 emitted a large amount of radionuclide.

The important feature of this accident was that the source position was evidently clear, however, time
and vertical emission variations were unknown (in this case, it was known that the height of emission was not so high in altitude). In such a case, the technique of inverse model was a powerful tool to gain answers to questions; high resolution and more precise analysis by using prior emission information with relatively low computational cost are expected to be obtainable. Tagged simulation results by global aerosol model named MASINGAR (Tanaka et al., 2005) were used; the horizontal resolution was TL319 (about 60 km). Tagged tracers (Cs137) from lowest model layer (surface to 100m) were released every three hours with 1Tg/hr which accumulated daily mean. 50 sites’ daily observation data in the world (CTBTO, Ro5,Berkeley, Hoffmann and Taiwan) were collected.

The analysis period was 40 days, from 11 March to 19 April. We tested two prior emission information. The first information was JAEA posterior emission (Chino et al., 2011) and the second was NILU prior emission (not posterior) (Stohl et al.,2011) as our observation data were almost similar to their study.

Due to consideration for observation error and space representation error, the observation error was set as 20%. Several sensitivity tests were examined by changing prior emission flux uncertainties. As a result, Cs137 estimated the total emission amount from 11 March to 19 April as 18.5PBq with the uncertainty of 3.6PBq.

Moreover, the maximum radio nuclei emission occurred during 15 March, which was larger than prior information.

The precision of the analysis was highly dependent on observation data (quantity and quality) and precision of transport model. Possibility to obtain robust result by using multi-model ensemble results with inverse model was also considered. The results of this study are available for
modification of many processes of aerosol transport models. In the future, the combination of regional chemistry transport model and higher time resolution observation data in order to obtain robust emission time series of radionuclide is being planned.” end quote.


Fukushima Nuclear Accident Update (15 March 2011, 14:10 UTC)

The IAEA continues to seek details about the status of all workers, reactors and spent fuel at the Fukushima Daiichi plant.

An evacuation of the population from the 20-kilometre zone around Fukushima Daiichi is in effect. The Japanese have advised that people within a 30-km radius shall take shelter indoors. Iodine tablets have been distributed to evacuation centres but no decision has yet been taken on their administration.

A 30-kilometre no-fly zone has been established around the Daiichi plant. Normal civil aviation beyond this zone remains uninterrupted. The Japan Coast Guard established evacuation warnings within 10 kilometres of Fukushima Daiichi and 3 kilometres of Fukushima Daini.

Fukushima Nuclear Accident Update (15 March 2011, 22:30 UTC)

Japanese authorities have informed the IAEA that the evacuation of the population from the 20-kilometre zone around Fukushima Daiichi has been successfully completed.

The Japanese authorities have also advised that people within a 30-km radius take cover indoors. Iodine tablets have been distributed to evacuation centres but no decision has yet been taken on their administration.

The IAEA continues to liaise with the Japanese authorities and is monitoring the situation as it evolves.

Fukushima Nuclear Accident Update (16 March 2011, 22:00 UTC)

Temperature of Spent Fuel Pools at Fukushima Daiichi Nuclear Power Plant

Spent fuel that has been removed from a nuclear reactor generates intense heat and is typically stored in a water-filled spent fuel pool to cool it and provide protection from its radioactivity. Water in a spent fuel pool is continuously cooled to remove heat produced by spent fuel assemblies. According to IAEA experts, a typical spent fuel pool temperature is kept below 25 °C under normal operating conditions. The temperature of a spent fuel pool is maintained by constant cooling, which requires a constant power source.

Given the intense heat and radiation that spent fuel assemblies can generate, spent fuel pools must be constantly checked for water level and temperature. If fuel is no longer covered by water or temperatures reach a boiling point, fuel can become exposed and create a risk of radioactive release. The concern about the spent fuel pools at Fukushima Daiichi is that sources of power to cool the pools may have been compromised.
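As a rough illustration of why constant cooling matters, the heat-up rate of an uncooled pool follows from dT/dt = P / (m c). All the numbers below are assumptions for the sketch, not figures from the IAEA update:

```python
# Rough sketch: heat-up rate of a spent fuel pool after loss of cooling.
# Assumed values (illustrative only, not plant data):
decay_heat_w = 2.0e6        # ~2 MW of decay heat from the stored assemblies
water_mass_kg = 1.4e6       # ~1400 m^3 of pool water
c_water = 4186.0            # specific heat of water, J/(kg K)

# dT/dt = P / (m c), converted from K/s to K/h
rate_k_per_hour = decay_heat_w / (water_mass_kg * c_water) * 3600

# Time from the reported 84 degC up to boiling at 100 degC
hours_to_boil = (100.0 - 84.0) / rate_k_per_hour
print(f"{rate_k_per_hour:.2f} K/h; ~{hours_to_boil:.0f} h from 84 degC to boiling")
```

On these assumed numbers the pool climbs on the order of a degree per hour, and a pool already at 84 °C reaches boiling within a day or so; after that, evaporation begins uncovering the fuel.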

The IAEA can confirm the following information regarding the temperatures of the spent nuclear fuel pools at Units 4, 5 and 6 at Fukushima Daiichi nuclear power plant:
Unit 4
14 March, 10:08 UTC: 84 °C
15 March, 10:00 UTC: 84 °C
16 March, 05:00 UTC: no data
Unit 5
14 March, 10:08 UTC: 59.7 °C
15 March, 10:00 UTC: 60.4 °C
16 March, 05:00 UTC: 62.7 °C
Unit 6
14 March, 10:08 UTC: 58.0 °C
15 March, 10:00 UTC: 58.5 °C
16 March, 05:00 UTC: 60.0 °C

Fukushima Nuclear Accident Update (16 March 2011, 14:55 UTC)

Japanese authorities have reported concerns about the condition of the spent nuclear fuel pools at Fukushima Daiichi Unit 3 and Unit 4. Japanese Defense Minister Toshimi Kitazawa announced Wednesday that Self-Defence Forces helicopters planned to drop water onto Unit 3, and officials are also preparing to spray water into Unit 4 from ground positions, and possibly later into Unit 3. Some debris on the ground from the 14 March explosion at Unit 3 may need to be removed before the spraying can begin.
Fukushima Nuclear Accident Update (16 March 2011, 03:55 UTC)

Japanese authorities have informed the IAEA that a fire in the reactor building of Unit 4 of the Fukushima Daiichi nuclear power plant was visually observed at 20:45 UTC of 15 March. As of 21:15 UTC of the same day, the fire could no longer be observed.

Fire of 14 March

As previously reported, at 23:54 UTC of 14 March a fire had occurred at Unit 4. The fire lasted around two hours and was confirmed to be extinguished at 02:00 UTC of 15 March.

IAEA Briefing on Fukushima Nuclear Emergency (17 March 2011, 14:00 UTC)

At the IAEA headquarters in Vienna, Graham Andrew, Special Adviser to the IAEA Director General on Scientific and Technical Affairs, briefed both Member States and the media on the current status of nuclear safety in Japan.

Current Situation

The situation at the Fukushima Daiichi nuclear power plants remains very serious, but there has been no significant worsening since yesterday.

The current situation at Units 1, 2 and 3, whose cores have suffered damage, appears to be relatively stable. Sea water is being injected into all three Units using fire extinguishing hoses. Containment pressures are fluctuating.

Military helicopters carried out four water drops over Unit 3.

Unit 4 remains a major safety concern. No information is available on the level of water in the spent fuel pool. No water temperature indication from the Unit 4 spent fuel pool has been received since 14 March, when the temperature was 84 °C. No roof is in place.

The water levels in the reactor pressure vessels of Units 5 and 6 have been declining.

Strontium-90 and Cesium-137 in Milk and Certain other Materials Collected in Finland, 1972

March 26, 2013

Journal of Dairy Science,
Volume 55, Issue 5 , Pages 633-639, May 1972

Strontium-90 and Cesium-137 in Milk and Certain other Materials Collected in Finland

B.E. Baker

E.R. Samuels

Erkki Pulliainen

Department of Agricultural Chemistry, Macdonald College of McGill University, Macdonald College P.O., Province of Quebec, Canada

Radiation Protection Division, Department of National Health and Welfare, Ottawa, Canada

Department of Agricultural and Forest Zoology, University of Helsinki, Finland

Received 18 March 1971

Sixty samples of human milk, 18 of bovine milk, 15 of reindeer milk, 1 of reindeer meat, and different lichens, all collected in Finland, were analyzed for 90Sr, and 137Cs. Reindeer milk contained the most 90Sr, being 119 pCi per liter. Only 5 samples of human milk had detectable amounts of 90Sr and 4 came from north of Oulu. Human milk from north and south of Oulu contained 114 and 19.7 pCi per liter of 137Cs whereas bovine milk from the same regions contained 125 and 84 pCi per liter of 137Cs. Reindeer milk from Lapland contained 1,647 pCi per liter of 137Cs which is 13 to 14 times more than that of human and bovine milk from the same area. A specimen of dried reindeer meat contained per kilogram, 237 pCi of 90Sr and 76,124 pCi of 137Cs. A sample of lichen, Cladonia species, contained 1,361 pCi of 90Sr and 21,173 pCi of 137Cs per kilogram.

end quote.

The situation in Alaska and Canada has long been discussed in documents which predate Chernobyl.

One person’s immediate hazard is another person’s long term hazard. And in terms of added insult, the more nuclear activity there is, the worse the future looks. Both the immediate future, and the more distant future.

The Nuclear “Either/Or” Mythology goes back a long time.

March 26, 2013

Before I get into the actual topic of this post, I have to say something stinks about the events of 14 and 15 March 2011 and the disclosures regarding reactor number 4 and its fuel pool.

The above link is to the document “Report on Project Gabriel”, produced in 1954 by the US Atomic Energy Commission, Division of Biology and Medicine. A number of documents preceded it; the earliest relate to close-in plutonium hazards at nuclear targets. The 1954 report was not declassified until 1981.

Pdf page 2 of the document reads as follows: “The objective of Project GABRIEL is to evaluate the radiological hazard from the fallout of debris from nuclear weapons detonated in warfare. Depending upon the conditions under which such weapons are used, the major interest may lie either in local fallout or in the superimposed long range fallout from many weapons. ……… A theoretical analysis of the long range aspects of GABRIEL was made in 1949 by Dr. Nicholas M. Smith, Jr., Oak Ridge National Laboratory, at the request of the Atomic Energy Commission. Smith concluded that Sr–90 is by far the most hazardous isotope resulting from nuclear detonations, and that the distribution of this isotope over large areas of the earth’s surface constitutes the limiting factor in estimating the long-range hazard from the use of a large number of atomic bombs. In 1952 RAND Corporation was given a contract to make an independent study of GABRIEL, with some emphasis on the short-range aspects of fallout. Study of this phase, later called AUREOLE, has been carried as far as present information appears to permit, and a report has been prepared.”

Note that “short range” in the above context refers to both the distance and time frames.

In the event, by August 1953 it had already been decided that the long range hazard (in time and distance) posed by the build up of Strontium 90 in the entire earth’s biosphere would guide the direction of Sunshine, the project born out of Project Gabriel. For the public, this project would look only at Strontium 90 build up in the biosphere over years from many weapon detonations. In public, Sunshine only ever reported on Strontium 90 in human bone, food, water and the environment.

However, fission produces many more substances than strontium 90, which is one of the least radioactive of the 5 or 6 strontium isotopes produced by fission.

Meanwhile, from very early on, the AEC studied other fission products during Projects Gabriel and Sunshine but never discussed these in public. In fact the previously secret documents released by ACHRE and made available since 1994 show conclusively that Sunshine conducted a long study of Strontium 89 at the same time. Strontium 89 has a half life of about 52 days, so within 3 years the deposition from each bomb had ceased to be a consideration. So, even though the public was told that the only worry for the United States was the long term build up of strontium 90, the AEC secretly conducted immediate-effects studies. We can deduce this because the documents show the intense AEC interest in Sr89, an immediate but not a long term hazard. At the same time as the AEC was claiming that there was no immediate danger, it was studying one of the major immediate hazards. Its actions, in hindsight, speak louder than its words. Of course, both hazards are important. All hazards are important.
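The claim that Sr89 deposition from each shot ceased to be a consideration within about three years follows directly from its half-life. A minimal sketch, taking the half-life as roughly 50.5 days:

```python
# Fraction of Sr-89 remaining after t days, given its ~50.5-day half-life.
SR89_HALF_LIFE_DAYS = 50.5

def fraction_remaining(t_days: float, half_life: float = SR89_HALF_LIFE_DAYS) -> float:
    """Radioactive decay: N(t)/N(0) = 2 ** (-t / half_life)."""
    return 2.0 ** (-t_days / half_life)

three_years_days = 3 * 365
frac = fraction_remaining(three_years_days)
print(f"After 3 years: {frac:.1e} of the original Sr-89 remains")
```

Three years is roughly 21 half-lives, so well under a millionth of the deposited Sr89 survives: an intense immediate hazard, but not a long term one.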

The decision taken at the August 1953 meeting in Washington effectively defined Project Sunshine, in its public releases, as a study of long term effects only.

The limited number of early reports dealing with immediate hazards from fresh fallout were thus, as far as public utterances were concerned, relegated to precursor studies such as Project AUREOLE. Such documents remain thin on the ground. Even personnel studying fallout within the US military a few years later had trouble accessing the early studies; later I introduce one such military study.

(One participant of the August 1953 meeting, Kramish, expressed concern for those close in to fresh fallout even as the AEC determined that 26,000 megatons of nuclear weapon detonations would be the lethal limit, the point at which higher life forms would have difficulty surviving. Of course this ridiculous notion assumed an even spread. As Kramish realized, the spread would not be even, and those close in would die first; as indeed, the Downwinders tell us, they do.)

And so it was that the AEC set about secretly obtaining human remains from around the world for the study of Strontium 90 uptake in human bone. The project would not cease until the 1970s.

The Castle Bravo event of 1954, in which fresh fallout from a very large H bomb rained in a concentrated manner upon the Marshall Islanders, caused a world outcry. The press were there to cover the bomb and turned their cameras to the victims. Even though the AEC and US Navy (Cronkite, Pearson) reported to Congress in 1956 and 1959 that Strontium 89 in the Islanders’ urine samples was “near tolerance levels”, in 1954 the AEC’s head of Projects Gabriel and Sunshine maintained that there was no immediate hazard to Americans from the bomb tests. Cronkite’s 1954 reports contradict Libby and the official US government line of 1954, but these reports are now largely forgotten. The main substance which caused the famous skin burns to the Islanders was Strontium 89, not Strontium 90.

Sr90 is about 140 times as radioactive as radium 226, weight for weight. Sr89 is about 29,000 times as radioactive as radium 226, weight for weight.
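These weight-for-weight figures can be checked from half-lives alone, since specific activity scales as ln 2 divided by (half-life times atomic mass). A minimal sketch using textbook half-lives (Ra-226: 1600 y; Sr-90: 28.8 y; Sr-89: 50.5 d); the computed ratios come out near 140 and 29,000:

```python
import math

AVOGADRO = 6.022e23         # atoms per mole
SECONDS_PER_YEAR = 3.156e7

def specific_activity_bq_per_g(half_life_years: float, atomic_mass: float) -> float:
    """Specific activity (Bq/g) = decay constant * atoms per gram."""
    decay_const = math.log(2) / (half_life_years * SECONDS_PER_YEAR)
    atoms_per_gram = AVOGADRO / atomic_mass
    return decay_const * atoms_per_gram

ra226 = specific_activity_bq_per_g(1600.0, 226)        # ~3.7e10 Bq/g, i.e. ~1 Ci/g
sr90  = specific_activity_bq_per_g(28.8, 90)
sr89  = specific_activity_bq_per_g(50.5 / 365.25, 89)  # half-life in days -> years

print(f"Sr-90 / Ra-226: {sr90 / ra226:.0f}")   # roughly 140
print(f"Sr-89 / Ra-226: {sr89 / ra226:.0f}")   # roughly 29,000
```

The curie was originally defined against radium, so Ra-226 comes out near 1 Ci/g, which makes it a convenient yardstick for the strontium isotopes.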

A public brawl broke out between Libby and Ralph Lapp, also of the AEC. Lapp argued that Libby’s position was wrong, but Lapp never mentioned Strontium 89. Pauling, not of the AEC, pointed to the wider range of substances in fallout, but did not mention Strontium 89 either. However, it was known at the time that the fission process created much more Sr89 than Sr90, that Sr89 was much more radioactive than Sr90, and that its beta energy was also much higher. Pauling, for his trouble, was hauled before the House Un-American Activities Committee and had his passport suspended, though world embarrassment caused the US State Department to return it to him so he could collect his second Nobel Prize. Hamilton admitted in secret documents that Strontium 89 animal injection studies were proceeding in the 1950s. All the AEC and DOE released to the public were the results of the Strontium 90 animal experiments, and the Sr90 experiments, the AEC and DOE claim, did not produce any life shortening in the test animals. However the current General Electric data sheet for Strontium 89 confirms what was known prior to 1942 about Strontium 89: GE writes that of 40 rats injected with Sr89, 33 developed bone cancer within a 9 month window. Neither the AEC nor the DOE mentions this in relation to Project Sunshine and fallout. The GE data was only published after 1993.

Despite knowing, then, that both long term and immediate hazards were important, for 30 years the nuclear authorities in charge of testing encouraged the public to think only in terms of the long range hazard, denying, contrary to documents those same authorities had created, the profound immediate hazards in fresh fallout. Once a dose is delivered, it forms part of that person’s accumulating lifetime dose. No matter how short the half life, the immediate exposure contributes to the accumulating dose. And so this huge insult from fresh fallout is borne by survivors, and it shortens lives. There is no doubt in my mind, for example, that Australian servicemen died at the time of the nuclear tests in Australia. (And here I will give a personal story. I once worked as a casual employee at the Australian Department of Veterans Affairs. My supervisor at that time, many years ago, had been a nurse at the time of the atomic tests in South Australia. She worked in a hospital up north, close to the source of the fallout. She recounted the time, during the test period, that an Aboriginal boy was brought into the hospital. The doctors and nurses treating the boy were threatened and ordered to remain silent about the case. The boy died and the cause was put down to infectious disease. The National Secretary of the Atomic ExServicemen’s Association has recounted many such stories relating to military personnel: people who sickened and died at the time, and many more who sickened and then recovered after a number of years. But of course, the British Law Lords will not hear of this. They refuse to hear it.)

Still, the “close in or distant” and “immediate harm or long term hazard” framings are concepts of exclusion which guide the public into making a false choice. And so some public argument is about which choice to make. In both cases there is no valid choice: both close and distant fallout are important, and both immediate exposure and the long term build up of fallout components are important.

Even today, the same holds true of the opposed concepts of dispersed nuclear emissions versus a focused, directional plume.

A recent article in “Japan Focus” has brought these false choices to mind, casting my thoughts back to the previous examples gleaned from Projects AUREOLE, Gabriel and Sunshine (there was another precursor to Gabriel; I forget at the moment what its name is, though as I recall it was also named after an angel).

Here is the relevant piece of the Japan Focus article:

The Asia-Pacific Journal, Vol. 11, Issue 12, No. 1. March 25, 2013.
A Lasting Legacy of the Fukushima Rescue Mission: Cat and Mouse with a Nuclear Ghost

Roger Witherspoon

This is part two of a two part series.

For several days, the winds from the destroyed nuclear reactors at Fukushima Daiichi crashed head on into the myth of the radioactive plume.

It is the most enduring falsehood of commercial nuclear power, promoted heavily by both the industry and its watchdog, the Nuclear Regulatory Commission. It is a myth with two conflicting premises:

Radioactive gases spewing from a stricken reactor or spent fuel pool have an inherent property which holds them in a tight, thin stream which prevents widespread contamination.
At 10 miles the plume disperses like steam from a teapot, leaving traces that are either too small to measure or are so minute as to be “below regulatory concern.”

The contradiction between being tightly bound and widely dispersed is never challenged. It was most clearly enunciated at a public hearing April 8, 2002, in White Plains, New York, on the evacuation plans for the two Indian Point reactors, located about 30 miles north of Manhattan, owned by Entergy Corp. There was no dissent from NRC officials as Entergy’s Larry Gottlieb said, glibly, “the easiest way to avoid a radioactive plume is to cross the street.
“It’s kind of like someone pointing a gun at you and all you have to do is step to the left or right to get out of the pathway of the bullet. That’s all you have to do.”

During the frenetic first week after the March 11, 2011 earthquake and tsunami destroyed the infrastructure of Japan’s northeast coast, killed some 20,000 people, rendered hundreds of thousands homeless, and set four of the six Fukushima Daiichi reactors on an irrevocable path to meltdowns, officials from the U.S. Departments of Defense, State, and Energy, as well as the Nuclear Regulatory Commission clung to the notion that the situation was manageable as long as the “plume” held true to the myth and blew out to sea.

That was paramount to DoD, which had 63 military installations throughout the Japanese islands containing some 60,000 men and women and their families. It was a relief, therefore, when the aircraft carrier, the USS Ronald Reagan reported on March 13 that its sensors were picking up radioactive material on its flight deck, 130 miles off the coast.

According to the NRC Status Report, “The measureable radioactivity was consistent with the venting of the Fukushima Daiichi Unit 1 reactor. The Navy also collected air samples having activity above background from the ‘plume’. Analysis, the report states, would show the Reagan contaminated with “iodine, cesium and technetium, consistent with a release from a nuclear reactor.”

Projected Fukushima plume 3.11.11

Projected Fukushima Plume 3.12.11

Radiation spread from Fukushima, 3.11-3.24.11

The full article, which I will put up at a later date, can be read here:

In relation to the bomb test era, I long ago concluded that the Atomic Weapons Test Safety Committee was empowered to keep the bomb test program in Australia safe from the public, and that in relation to Fukushima the same dialectic allows the nuclear industry to present information remotely connected to the facts in the best possible light at the time. And over time they will happily change the mix to the same end, ie PR which presents nuclear power in the best positive light possible. Everything is an advert.

In regard to the plume argument I can quote an ARPANSA document which was commissioned as a result of a CTBTO detection of a radiation spike in Melbourne in 2009.

The document explains that wind direction is not constant with altitude: looked at in the vertical, the wind may well blow toward all cardinal directions, and all points between, at the same time. The factor which allows this is altitude; wind direction very commonly varies markedly with it, and in the example I cite this can be seen. Only at a single altitude can the wind (and indeed wind speed) be treated as a single vector or stream. So plumes can go in all directions. When only one altitude is considered, a distinct plume is seen; when all relevant altitudes are considered and plotted, one sees, in the case of contamination, a blanket with ragged, irregular edges rather than a plume.

Wind shear, then, is not an abnormal event; it is usual.

This use of wind shear is not the same as the air crash investigator’s “wind shear”. At airports, where aircraft ascend and descend rapidly through various altitudes, it is the differential in wind speed and direction over short distances that matters. That is a very different matter from the concerns raised by nuclear fallout and the paths fallout clouds take at varying altitudes.

For pollution, the directional considerations remain regardless of the wind speed differential at altitude. Here the term wind shear refers, in relation to the spread of contamination and the pattern and place of its deposition, simply to the fact that the wind usually blows in different directions at different altitudes most of the time. (The jet stream is of course a component of this three dimensional picture, blowing in the one direction all the time in the upper part of the vertical air column.)
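The point about simultaneous transport in several directions can be illustrated with a toy calculation. The wind profile below is invented for illustration (it is not the ARPANSA data): parcels released at the same moment, but at different altitudes, end up displaced toward different compass points.

```python
# Toy illustration: one release point, different winds at different altitudes.
# Wind vectors are (east, north) components in m/s -- hypothetical values.
winds_by_altitude_m = {
    500:   (-4.0,  3.0),   # low level: toward the north-west
    2000:  ( 5.0, -2.0),   # mid level: toward the east-south-east
    5000:  (-3.0, -6.0),   # upper level: toward the south-south-west
    10000: (25.0,  8.0),   # jet stream level: fast, toward the east-north-east
}

HOURS = 24
positions = {
    alt: (u * 3600 * HOURS / 1000, v * 3600 * HOURS / 1000)  # km after 24 h
    for alt, (u, v) in winds_by_altitude_m.items()
}

for alt, (x, y) in sorted(positions.items()):
    print(f"{alt:>6} m: {x:+7.0f} km east, {y:+7.0f} km north")
```

Summed over all altitudes, the deposition footprint from such a release is a ragged blanket around the source, not a single directional plume.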

(the link to my original use of the document is here:http://nucl

Source link for Abstract and Purchase ($31.50)

The article gives the Lucas Heights reactor in Sydney as the source of the emissions. The diagrams show that wind shear allowed the radioactive emissions in question to move toward South Australia, Victoria, New South Wales and Queensland at the same time, at different altitudes.

The resultant combined model map shows a blanket of emissions over the areas of concern, because at various altitudes the wind direction at the time of emission was toward SA, Vic, NSW and points north (ie Queensland): the plumes moved north, south, east and west at varying altitudes at the same time.

The article goes on to cover the movement of one plume (i.e. at one altitude) which traveled south to Melbourne, where it triggered CTBTO sensors, causing the query from the CTBTO to the Australian government.

Once the CTBTO was assured that Canberra had not detonated a bomb, and that the emissions were in fact from the nuclear medicine production reactor in Sydney, the CTBTO went back to sleep.

Similar emissions occur at the Sydney reactor once every 12 weeks, due to fuel rod removal and replacement, and they regularly reach SA, Vic and Queensland. The market for nuclear medicines is healthy in Australia, and the reactor in Sydney is very busy dispersing radioactive substances over four states without a prescription. There are better ways to produce nuclear medicines that do not risk producing more patients in the process.

The Paper:

Cost to access full paper: $31.50 paid to Science Direct.
Evaluation of radioxenon releases in Australia using atmospheric dispersion modelling tools

Rick Tinker a (corresponding author),
Blake Orr a,
Marcus Grzechnik a,
Emmy Hoffmann b,
Paul Saey c,
Stephen Solomon a

a Australian Radiation Protection and Nuclear Safety Agency (ARPANSA), 619 Lower Plenty Road, Yallambie, Victoria 3085, Australia
b Australian Nuclear Science and Technology Organisation (ANSTO), Environmental Monitoring, PMB1, Menai, NSW 2234, Australia
c Vienna University of Technology, Atomic Institute of the Austrian Universities, Stadionallee 2, 1020 Vienna, Austria

Received 15 September 2009. Revised 21 January 2010. Accepted 8 February 2010. Available online 26 March 2010.


The origin of a series of atmospheric radioxenon events detected at the Comprehensive Test Ban Treaty Organisation (CTBTO) International Monitoring System site in Melbourne, Australia, between November 2008 and February 2009 was investigated. Backward tracking analyses indicated that the events were consistent with releases associated with hot commission testing of the Australian Nuclear Science Technology Organisation (ANSTO) radiopharmaceutical production facility in Sydney, Australia. Forward dispersion analyses were used to estimate release magnitudes and transport times. The estimated 133Xe release magnitude of the largest event (between 0.2 and 34 TBq over a 2 d window), was in close agreement with the stack emission releases estimated by the facility for this time period (between 0.5 and 2 TBq). Modelling of irradiation conditions and theoretical radioxenon emission rates were undertaken and provided further evidence that the Melbourne detections originated from this radiopharmaceutical production facility. These findings do not have public health implications. This is the first comprehensive study of atmospheric radioxenon measurements and releases in Australia.

Environmental radioxenon;
Noble gas monitoring;
Comprehensive Nuclear-Test-Ban Treaty;
Radiopharmaceutical facilities

Published in: Journal of Environmental Radioactivity, Volume 101, Issue 5, May 2010, Pages 353–361
Illustration thumbnails from abstract page:

In all this it can be seen that if one looks at a single altitude, one sees a plume. If one looks at all altitudes, one sees a blanket or blob. Both exist together. All patterns are important.
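The plume-versus-blanket point can be sketched numerically. A minimal sketch follows, with an entirely invented wind profile; the altitudes, speeds and directions below are my assumptions for illustration, not data from the ARPANSA paper:

```python
# Minimal sketch of the "plume vs blanket" point made above: parcels
# released at different altitudes ride winds blowing in different
# directions, so a single-altitude view shows one plume while the
# all-altitudes view shows material spread in several directions at
# once. The wind profile is invented for illustration only.
import math

def displacement_km(speed_m_s, direction_deg_toward, hours):
    """Straight-line parcel displacement (km east, km north)."""
    rad = math.radians(direction_deg_toward)
    dist_km = speed_m_s * 3600 * hours / 1000
    return dist_km * math.sin(rad), dist_km * math.cos(rad)

# (altitude m, wind speed m/s, direction the wind blows TOWARD, degrees)
wind_profile = [
    (100, 4.0, 170),    # near-surface air heading roughly south
    (500, 7.0, 90),     # heading east
    (1500, 10.0, 10),   # heading roughly north
    (3000, 15.0, 280),  # heading west-northwest
]

for alt, speed, direction in wind_profile:
    dx, dy = displacement_km(speed, direction, 6.0)
    print(f"{alt:>5} m: parcel after 6 h at ({dx:+7.1f} km E, {dy:+7.1f} km N)")
```

Plot the four endpoints on a map and you get material north, south, east and west of the release point at the same time: a blanket. Plot any single altitude and you get a plume.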

Note the model for emission from the Brazilian reactor.

I have a strange feeling the average American has no idea of how far their many reactors cast their radio poisons in the course of normal routine operation and refueling.

Add Fukushima to that. This is not either/or. It is both/and.

It is because of the demonstrated casualty rate from Lucas Heights and past contamination of domestic house backyards in Sydney (Sutherland Shire) that we are acutely aware of the "miracle" of Lucas Heights down here. (A number of houses "downwind" (hmmm) of the reactor were found, in the 1970s, to have suffered contamination from fallout from the reactor. The soil in the gardens of these homes was unsafe. It was scraped up and for many years was stored in 44 gallon drums at the reactor site. Hence the saying "Not in their backyards", first uttered by nuclear safety people who found they had to scrape up backyards and take the radio-poisons back to the reactor site. The Howard government moved the drums to Woomera, SA, and they have since been stored in a bunker behind the instrumented range and rocket/missile launch pad at Woomera rocket range. Genius, John.)

I should point out, and this may interest Americans, that Xenon is not the only radioactive gas emitted during refuel or explosion.

Krypton is another such gas. And Krypton 89 is the precursor of the decay product Strontium 89, one of the most dangerous fission products known. On its own it is potent enough to cause skin burns, and it is a potent carcinogen: 33 out of 40 rats injected with Sr89 developed bone cancer within a 9 month window.

Krypton 89 > Rubidium 89 > Strontium 89 > Yttrium 89 (stable). The disappearing bullet.
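How fast does that chain run? A back-of-envelope sketch using the standard Bateman solution, taking commonly cited half-lives as assumptions (Kr-89 about 3.15 minutes, Rb-89 about 15.4 minutes; neither figure comes from the documents quoted above):

```python
# Sketch of the Kr-89 -> Rb-89 -> Sr-89 chain, using assumed half-lives
# of ~3.15 min (Kr-89) and ~15.4 min (Rb-89). Sr-89's ~50-day half-life
# is ignored over the few hours modelled, so it accumulates as the
# chain's effective endpoint here.
import math

LN2 = math.log(2)
t_half_kr89 = 3.15 * 60    # s, assumed half-life of Kr-89
t_half_rb89 = 15.4 * 60    # s, assumed half-life of Rb-89
lam1, lam2 = LN2 / t_half_kr89, LN2 / t_half_rb89

def chain_fractions(t, n0=1.0):
    """Bateman solution: fractions of the release present as each nuclide."""
    n1 = n0 * math.exp(-lam1 * t)
    n2 = n0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    n3 = n0 - n1 - n2          # Sr-89, treated as stable on this timescale
    return n1, n2, n3

for hours in (0.5, 1, 2, 4):
    kr, rb, sr = chain_fractions(hours * 3600)
    print(f"{hours:>4} h: Kr-89 {kr:.3f}  Rb-89 {rb:.3f}  Sr-89 {sr:.3f}")
```

Within a few hours essentially the entire release has decayed through to Sr-89, which then persists for months. Hence "the disappearing bullet": the gas vanishes, the bone-seeker remains.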

The ARPANSA model above only accounts for Xenon.


It is difficult to find Project AUREOLE documents. I have one mention of it in a document on a hard drive somewhere.

Here is a document which covers it briefly:

The History of Fallout Prediction – Defense Technical Information Centre (DTIC)
The abstract for the above publication states: "The development of the science of fallout prediction in this country from 1950 to 1979 is described. The chronological description emphasizes early developments and the relationships between some of the significant calculational models. The earliest work on fallout prediction discussed is that performed by RAND on Project Aureole in 1954, and the evolution is carried through to the derivatives of the DELFIC computer code. A section is devoted to the histories of four commonly used handbook prediction systems."

I wonder how the prediction systems went with Operation Fishbowl Bluegill, one of the high altitude nukes.

Well, a lot of it followed the earth’s magnetic flux lines to the arctic region.

"It's a plume!" "It's a blanket!" In this case the blanket resolved into a hemispheric stream which, by and large, with exceptions, tended to flow to the arctic region.

Gavan McCormack On Fukushima , from “Japan Focus”

March 25, 2013

Fukushima: An Assessment of the Quake, Tsunami and Nuclear Meltdown
Mar. 25, 2013
Gavan McCormack

3:11 – The What

It is just over two years since Japan's quake, tsunami, and nuclear meltdown. It was Japan's third nuclear catastrophe, at level 7, the highest on the scale and on a par with Chernobyl, although, unlike Hiroshima and Nagasaki, it was self-inflicted. The triple event left 20,000 dead, 315,000 refugees, and a devastated swathe of productive farm and fish country, and its towns and villages, that will take decades, at least, to recover.

Today, the Government of Japan tends to refer to the “Great East Japan Earthquake,” preferring to focus on the quake and tsunami rather than the meltdown, as if it were some inexplicable act of god. It talks of its policies for economic revival, reconstruction and crisis management, but little of the nuclear crisis.[1]

The triple catastrophe is often referred to as "soteigai" (unimaginable), but we now know that was not the case. The Diet committee that investigated the accident pointed out last year that the disaster was structural, man-made, brought about by the failings of the power company and of the national government. Even before Fukushima, the nuclear industry was known for data falsification and fabrication, the duping of safety inspectors, the belittling of risk and the failure to report criticality incidents and emergency shut-downs. Directly and indirectly, politicians, bureaucrats, industrialists, lawyers, media groups and academics also collaborated, constituting in sum the so-called "nuclear village." Japan's nuclear industry became, as one critic put it, "a black hole of criminal malfeasance, incompetence, and corruption".[2]

At Fukushima, where a hydrogen explosion blew the roof off reactor No. 4 days after the quake, 1,535 irradiated fuel rods remain stored on its 5th floor. They still cannot be removed, so water must continue to be poured, some of it inevitably finding its way into the surrounding soil and sea (Yomiuri Shimbun, 8 March). One fish caught in the nearby seas in late February was found to have 5,100 times the safe limit of caesium (Kyodo, 2 March). A 3,000-strong workforce struggles to stabilize and dismantle the plant. Its work will take at least 30 years.

3:11 – The Why

For over half a century (beginning just 10 years after Hiroshima and Nagasaki), Japan’s leaders pursued the goal of a nuclear future, what in recent years they described as “genshiryoku rikkoku” (building a nuclear power state). Persuaded by Eisenhower’s talk of “atoms for peace,” they believed that nuclear weapons and nuclear energy could be completely separate and they believed that nuclear energy could be safe in Japan despite the archipelago being poised on clashing terrestrial plates – accustomed to earthquakes (20 percent of the world’s total), volcanoes, typhoons and tidal waves (tsunami), and criss-crossed by the fault lines of various subterranean fissures. They believed in the chimera of eternal, almost limitless energy. Their hubris was sublime.

In the 70s and 80s they justified nuclear expansion on economic grounds as the alternative to oil and coal, and in the early 2000s as the key to counter global warming. The nuclear village gradually expanded from power generation into fuel enrichment, recycling, fast breeder reactors, MOX fuel, and nuclear waste treatment, the national policy (kokusaku) core of the Japanese economy.

Elsewhere, national referendums and parliamentary resolutions limited or prohibited nuclear energy, but in Japan the government-centered nuclear village ignored, suppressed, and bought off the resistance, steadily increasing the construction of nuclear plants, channelling trillions of yen into nuclear research and development.

So, Japan’s nuclear system was problematic long before the tsunami crashed into its Fukushima plant in March 2011.

3:11 – The Aftermath

(a) Government:

Although the government did allocate Y19trillion (ca $200 billion) for reconstruction, much of that was misappropriated – some actually to subsidize more nuclear research, and some (Y2.3 bn) to fund countermeasures for the country’s whaling ships to deploy against the Sea Shepherd in the southern ocean. Victims are now launching action for compensation in the courts against government and Tepco.

The DPJ government in September 2012, under huge social pressure, adopted the “zero nuclear option” as its policy. However, the nuclear village in Japan, and the governments of the US, Britain, and France, pressured it to the extent that the words—“zero nuclear power”—were deleted from the Cabinet resolution the following week. In due course, in December 2012, the LDP (the party that had led the country down the nuclear path), was restored to power.

Two weeks ago, Prime Minister Abe announced that those reactors that pass the new safety test would restart within a year. Areva (the French nuclear company that is a major supplier of power generating equipment) announced just days ago that Japan would restart 6 reactors by end of 2013, and two-thirds of all within several years. The Asahi reckons not one qualifies as of now, and that the estimated cost of meeting the new criteria would be ca Y1t (= ca $11bn) (AS, 27 February 2013).

Not only does the government today plan to switch back on the existing reactors, but it has no plans to liquidate the vast interlocking elements of the nuclear archipelago, including the world's most intensive concentration of civilian nuclear energy facilities (at Rokkasho). It appears to maintain the dream of completing the nuclear cycle – from fuel processing and enrichment (including MOX, or Pu + uranium), through power generation to waste reprocessing and storage – and shows no sign of abandoning the long and desperate struggle to master fast-breeder technology, something so prodigiously difficult and expensive that the rest of the world has set it aside as a pipe-dream. Nuclear plant export is identified as a major growth sector for the economy.

As for the so-called “back end,” Japan’s accumulated nuclear wastes include roughly one fifth of the world’s civil plutonium stocks (in excess of 50 tonnes or hundreds of nuclear weapons-worth) and approximately 17,000 tonnes of reactor waste (much of it spent fuel rods). Low-level wastes are held in 200-liter drums, both at nation-wide reactor sites and at Rokkasho (where it is to be covered with soil and closely guarded for at least 300 years). High level wastes, vitrified and in canisters, are stored initially for 30 to 50 years until the surface temperature declines from around 500 degrees centigrade to 200 degrees centigrade, at which point they are to be buried too, in 300 meter deep underground caverns (at some site yet to be identified) where their radiation will further dissipate over millennia. Over millennia.

So official Japan, two years on from Fukushima, maintains and gradually restores its identity as nuclear archipelago, as plutonium superstate.

(b) Civil Society

Faced with the March 11 catastrophe, many people concluded that Japan's energy and nuclear power policies had to be fundamentally changed. What ensued in 2011-12 was the greatest political mobilization by its citizenry seen in Japan in at least 50 years, but today the superficial impression is that the mobilization has slightly lost momentum. (I hope I am wrong and that others will correct me.)

(c) Japan and the World

Outside Japan, there are now about 100 reactors in Asia, and another 100 on drawing boards or under construction. But if the country whose scientific and engineering skills are the envy of the world can be guilty of the miscalculations, malpractice and incompetence that have marked the past half-century in Japan, can the rest of the world do better?

The challenge Japan faces is to scrap a core national policy of the past half century and to make the shift from nuclear promotion to a renewable energy system beyond carbon and uranium. If Japan were to go that way, the world would very likely follow. But it is a revolutionary agenda, and can only be possible under the pressure of a mobilized and determined national citizenry that wrests control over the levers of state power from the irresponsible bureaucratic and political forces that have driven it over the past 50 years. Much depends on the outcome.

Adapted from a presentation at the Canberra Public Forum, 12 March 2013.

Gavan McCormack is an emeritus professor at Australian National University, a coordinator of The Asia-Pacific Journal: Japan Focus, and a co-author, with Satoko Oka Norimatsu, of Resistant Islands – Okinawa Versus Japan and the United States (Rowman and Littlefield, 2012; Japanese edition now available from Horitsu Bunkasha). end quote.

The "modern" nuclear safety regime foisted by the ICRP upon the world is internally contradictory. In essence: "Oh, it's perfectly safe, but no, you can't have your land back until it's a bit safer." When in fact there are too many people affected, and that includes people living in unsafe conditions.

Watered Down Internal Emitters – The Union Carbide document, under contract from the AEC

March 25, 2013

Please download the whole thing.

As we have seen, the official view of the Maralinga cleanup in South Australia is based upon the position that in most areas the most significant pathway would result in an exposure from plutonium which is less than that considered unacceptable by the regulator (ARPANSA) for members of the public. I'll come back to the Beagle experiments in a minute.

But first let’s look at what Peter Burns has said about the plutonium hazard early in the piece, before vitrification of the plutonium had been abandoned.

“Plutonium is an alpha emitting nuclide that is very insoluble so that when you breathe it in, it lodges in the lung and stays there for a long period of time, close to living cells and emits a large amount of energy, radiation into those cells which can cause damage…… only need to breathe in a micro-gram, one millionth of a gram of plutonium to produce a significant dose of radiation….In 1986, a Technical Assessment Group was appointed by the Commonwealth Government. Its goal was to reduce contamination at Maralinga to an International Standard of 5 mSv – that is, equivalent to 5 lung X-rays per year. That clean-up is now underway.”
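Burns's "one millionth of a gram" figure can be checked with simple decay arithmetic. The Pu-239 half-life (about 24,100 years) is well established; the inhalation dose coefficient at the end is an assumed round value, of the order of magnitude the ICRP tabulates for insoluble plutonium, used purely for illustration:

```python
# Rough check of the "one microgram" claim quoted above.
import math

AVOGADRO = 6.022e23
half_life_s = 24100 * 365.25 * 86400     # Pu-239 half-life (~24,100 y) in seconds
mass_g = 1e-6                            # one microgram

atoms = mass_g / 239 * AVOGADRO          # number of Pu-239 atoms in 1 ug
activity_bq = math.log(2) / half_life_s * atoms
print(f"activity of 1 ug Pu-239: {activity_bq:.0f} Bq")

# Assumed order-of-magnitude inhalation dose coefficient (Sv per Bq),
# for illustration only, not an official ICRP value for any specific case.
dose_coeff_sv_per_bq = 1.6e-5
committed_dose_msv = activity_bq * dose_coeff_sv_per_bq * 1000
print(f"illustrative committed dose if inhaled: ~{committed_dose_msv:.0f} mSv")
```

With these figures a microgram comes out at roughly 2.3 kBq, and an illustrative committed dose in the tens of millisieverts, consistent with Burns calling it "a significant dose".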

OK. One minute he is saying plutonium can stay in the lung for a long time and that a really weenie amount causes a significant dose. This is no different from Pecher and Aebersold in the early 40s, and no different from Hamilton from 1942-1946. I have previously and repeatedly quoted these.

But then he says we aim to get the exposure down to 5mSv pa. Well, in one scenario you are taking it with you and in the other you are not. Or at least the assumption is that, if plutonium is retained in the lung, you will only retain really weenie bits, and only a really weenie dose, even if you inhale and retain more than one speck. How does anyone know how many specks one is going to inhale and retain?

And how does the modern world officially calculate the dose? Like this:
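By averaging. A hedged sketch of the arithmetic shape of the "average organ dose" idea (every figure below is an assumed round number for illustration; this is not the ICRP's actual dosimetric model):

```python
# Sketch: the same alpha energy divided over a whole lung vs over the
# small sphere of tissue actually within alpha range of a hot particle.
# All figures are assumed round values for illustration only.
import math

MEV_TO_J = 1.602e-13
alpha_energy_mev = 5.1        # assumed typical Pu-239 alpha energy
decays_per_second = 1.0       # assumed: one particle emitting 1 alpha/s
seconds_per_year = 3.156e7

energy_j_per_year = decays_per_second * seconds_per_year * alpha_energy_mev * MEV_TO_J

lung_mass_kg = 1.0            # assumed whole-lung mass
alpha_range_m = 45e-6         # assumed alpha range in soft tissue (~45 um)
local_mass_kg = (4 / 3) * math.pi * alpha_range_m ** 3 * 1000.0  # density ~ water

average_dose_gy = energy_j_per_year / lung_mass_kg   # dose averaged over organ
local_dose_gy = energy_j_per_year / local_mass_kg    # dose to irradiated sphere

print(f"average lung dose:        {average_dose_gy:.2e} Gy/yr")
print(f"local dose near particle: {local_dose_gy:.2e} Gy/yr")
print(f"ratio local/average:      {local_dose_gy / average_dose_gy:.1e}")
```

Averaged over a kilogram of lung, the number looks tiny; confined to the fraction of a microgram of tissue the alpha particles can actually reach, the same energy gives an enormous local dose. That gap is the entire hot particle argument.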

The AEC wanted to put paid to the ideas which were implied in Pecher and Aebersold, Hamilton 1946, and Tamplin and Gofman later. The AEC contracted Union Carbide to make the hot particle internal emitter problem go away.

Here’s part of the Union Carbide document as a quote:

The document front cover page carries the following notice:

By acceptance of this article, the publisher or recipient acknowledges the US Government’s right to retain a nonexclusive, royalty free license in and to any copyright covering the article.

– NOTICE – This report was prepared as an account of work sponsored by the United States Government. Neither the United States nor the United States Energy Research and Development Administration, nor any of their employees, nor any of their contractors, subcontractors, or their employees, makes any warranty, express or implied, or assumes any liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights.

C. R. Richmond Biomedical and Environmental Sciences
OAK RIDGE NATIONAL LABORATORY Oak Ridge, Tennessee 37830**

(*Portions of the information contained in this paper were part of testimony presented at the Joint Committee on Atomic Energy Subcommittee to Review the National Breeder Reactor Program, June 1975.)

(**Operated by Union Carbide Corporation for the U. S. Energy Research and Development Administration.)

The recent revival of interest in the ‘hot particle’ problem, especially as regards particulate plutonium and other actinide elements in the lung, has stimulated a great deal of thought on this subject during the past several years. Non-uniformity of dose distribution has been of interest to standards-setting bodies and other groups, such as the National Academy of Sciences, and to health protectionists for many years. In fact, interest in the subject as regards alpha-emitting radio-nuclides predates the discovery of plutonium in 1941.

The hot particle problem has recently been brought to the attention of several federal agencies by the Natural Resources Defense Council, Inc. [1]. The NRDC's original petition and supporting documentation was submitted to the U. S. Atomic Energy Commission (AEC) and the U. S. Environmental Protection Agency (EPA) on February 14, 1974. Because of the Energy Reorganization Act of 1974, which resulted in the formation of the U.S. Energy Research and Development Administration (ERDA) and Nuclear Regulatory Commission (NRC), the federal response to the NRDC petition is now the responsibility of the EPA and the NRC. Although many organizations have considered the hot particle problem for decades, there has been considerable reassessment of the problem since February 1974. I should point out that no final response to the NRDC's petition has been made to date by either the EPA or NRC. There have been, however, discussions and correspondence among the involved parties and the NRDC has presented testimony on hot particles at AEC, EPA and ERDA hearings since submitting the petition.

The NRDC petition states (page 4) that in its view the present radiation standards when applied to hot particles are too high by a factor of 115,000. In addition, the petition states that each of the NRDC’s individual members is a potential victim of exposure to hot particles. The document supporting the petition, prepared by Tamplin and Cochran, proposes that a single radioactive particle in the lung capable of delivering a local radiation dose of 1000 or more rem per year will produce local tissue damage. The local tissue damage in turn produces a risk of lung cancer of one in 2000 (5 x 10 to -4 power). Put another way, exposure to 2000 such hot particles would produce one lung cancer.
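The petition's arithmetic, as quoted, is straightforward to restate:

```python
# Restating the arithmetic of the NRDC petition as quoted above:
# a risk of one in 2000 (5e-4) per qualifying hot particle implies
# one expected lung cancer per 2000 such particles retained.
risk_per_particle = 1 / 2000

for n_particles in (1, 200, 2000, 20000):
    expected = n_particles * risk_per_particle
    print(f"{n_particles:>6} particles -> {expected:.3f} expected lung cancers")
```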

National Radiological Protection Board’s Radiological Protection Bulletin #8 (1974).
A short critical review of the NRDC petition was prepared by the United Kingdom's National Radiological Protection Board (NRPB) [2], which concludes, "It is noted that the basis of ICRP recommendations is the average radiation dose to an organ and not the number of radioactive particles in the organ. This dosimetric basis of radiological protection has been established for many years by observation of humans and experimental work with animals. A better evaluation than offered by Tamplin and Cochran would be needed for this system to be set aside in favor of the hot particle concept. Their estimate that there is a risk of cancer being generated in cells surrounding a hot particle of one in 2000 cannot be substantiated by our present knowledge."

Biophysical Society (STAIS) Report (1974)
In December 1974, the Science and Technology Advice and Information Service (STAIS) of the Biophysical Society conducted a study of the question of radiation standards for hot particles at the request of the Center for Science in the Public Interest [3]. The summary of the report written by the Committee coordinators is as follows:
“1. The problem raised by the Natural Resources Defense Council petition of what should be the maximum permissible lung burden (MPLB) of hot particles is a valid and serious one. However, the call for a decrease in MPLB by 10 to the 5th power is exaggerated. More animal and epidemiological data are needed for a truly adequate estimate of what should be the radiation protection standard. A crucial piece of missing information concerns the distribution of particle sizes involved in the Manhattan District accident. Twenty-five individuals followed for almost 30 years have no lung cancer from 3—10 nCi plutonium in the chest. Calculations in the Tamplin and Cochran report accompanying the petition indicated that the particles were too small to be effective. Other calculations resulted in the opposite conclusion. One of the reviewers suggested an experimental re-enactment of this accident (without humans present) for the purpose of measuring particle size.
"The lung burdens of 25 Rocky Flats workers exposed to Plutonium fires range from one to ten times the present MPLB. No lung cancer has been detected in any of these individuals after nine years. Since there is evidence that the latent period for cancer induction after a large exposure may be as short as this, these data again suggest that the factor of 10 to the 5th power is too large.
“2. The reviewers who looked into the quantitative aspects of the Tamplin-Cochran report all concluded that it contained exaggerations and lack of adequate reasoning (a,b,c below). This report includes interpretation of the data of others, sometimes at variance with the author’s own interpretation. Two of the reviewers used existing published animal data and several biological models to estimate the probability of cancer induction in the human lung from hot particles. They conclude that the existing MPLB should probably be decreased by some factor between 40 and 10 to the 4th power, but that this figure at present can only be tentative, because of the paucity of data. Another reviewer finds no reason to alter current standards.

“a) The single instance of a hand sarcoma following plutonium contamination is inadequate for a quantitative argument, especially since there was no evidence that the plutonium penetrated the skin.
“b) The single instance of supposed precancerous changes in the neighborhood of a puncture wound involving plutonium, later excised, is also not suitable for a quantitative argument, especially since there was another similar but un-excised case in which no cancer developed in 30 years.
"c) The use of the data of Albert et al on rat skin tumors induced by fast electrons to estimate the risk from hot particles seems unjustified on four grounds: (1) The rat data involved a single dose, whereas the lung irradiation being considered is chronic. (2) Tamplin and Cochran do not cite data showing that non-uniform irradiation by beta and alpha particles is less effective than uniform radiation. (3) Previous experiments cited by the Albert groups showed no tumor production by 0.3 MeV electrons, external alpha particles and protons. (4) The hair follicle seems to be the sensitive structure for radiation-induced cancer in the skin. No similar structure has been identified in the lung, nor is there any estimate of the probability of a hot particle being close to such a structure."

National Radiological Protection Board Report R-29 (1974)
In September 1974, the United Kingdom's National Radiological Protection Board published a report entitled "Radiological Problems in the Protection of Persons Exposed to Plutonium" [4]. The following quotations are taken from Section 9 of the report, entitled "Hot Particles". By way of introduction, the report states, "The radiological protection problems associated with insoluble particles of alpha-emitting radio-nuclides have been known and considered for a number of years (Dolphin, 1964) but recently public attention has been drawn to these problems by a petition submitted to the USAEC by Tamplin and Cochran (1974) which caused comment in the national press. The problems concern the biological effect of high localised doses."

After discussing the issue, the section concludes, "In summary, there is no biological evidence available at present which suggests that "hot spots" carry a higher risk of cancer induction. Hence there is no necessity to change from the present system of using average dose to organs or tissues. However, it would be prudent to continue research into the biological effects of non-uniform dose distributions within organs."

WASH-1320 (1974)

Another report which should be consulted by those interested in the hot particle issue is WASH-1320, "A Radiobiological Assessment of the Spatial Distribution of Radiation Dose from Inhaled Plutonium" [5], which was published in September 1974 by the USAEC. The summary and conclusions of the report are as follows:
The importance of spatial distribution of dose to radiation protection practices by national and international standards-setting organizations and the scientific community predates the discovery of plutonium. Continued examination of the radiobiological aspects of the spatial distribution of dose, especially as regards alpha-emitting particles, has not led to major changes in radiation protection standards. However, the problem is and should be continually reassessed.
Animal studies clearly indicate that inhaled radioactive particles move from the lung to other organs and can be excreted from the body by several mechanisms. The experimental data also show that truly uniform distributions of inhaled radio-nuclides in lung seldom, if ever, occur. Because of the mobility of plutonium within the lung, there is some biological justification for averaging the radiation dose to the total tissue.

Particles deposited in the lung are dynamic and mobile unless trapped, as for example, in scar tissue. Experiments have been designed to simulate the static plutonium particle and study the biological effects of truly “hot spots” of radioactivity in the lung. These and other comparative experiments of uniform and non-uniform distributions of absorbed energy from radioactive particles suggest a biological sparing effect for both acute and late responses to the non-uniform distribution. Available experimental data indicate that averaging the absorbed alpha radiation dose from plutonium particles in the lung is radio-biologically sound.

Dosimetric models used to predict lung tumor probability in animals and human beings are biologically deficient, largely because of the lack of the required biological information. Most models are based on studies of tumor induction in irradiated rat skin and on the assumed validity of extrapolating to lung tissue. This practice is questionable for several reasons including the fact that the results of studies with rats vary with rat strains, i.e., tumor type, and that the results of comparable studies of irradiated mouse skin have not yielded results identical to the rat experiments. Thus, use of these models can lead to erroneous predictions of tumor probabilities.
Consideration of radiation carcinogenesis mechanisms suggests that there has been no change in either the direction or strength of data which would compel departure from the concept that average lung dose for alpha particles provides a reasonable and conservative base for radiation protection.
Thirty years' experience with plutonium in laboratory and production facilities has provided no evidence that the mean-dose lung model on which occupational radiation protection standards for plutonium are based is grossly in error or leads to hazardous practices. Data currently available from occupationally exposed persons indicate that the non-homogeneous dose distribution from inhaled plutonium does not result in demonstrably greater risk than that assumed for a uniform dose distribution. Thus, empirical considerations lead to the conclusion that the non-uniform dose distribution of plutonium particles in the lung is not more hazardous and may be less hazardous than if the plutonium were uniformly distributed and that the mean-dose lung model is a radio-biologically sound basis for establishment of plutonium standards.
The report WASH-1320 [5] was not meant to be a critique of the NRDC petition on hot particles as it addressed the main generic issue of the problem, that is, the question of the biological importance of spatial distribution of radiation dose from inhaled plutonium.

Los Alamos Scientific Laboratory Report LA-5810-MS (1974)
In November 1974, a Los Alamos Scientific Laboratory report [6] was prepared by a group of biomedical researchers with relevant plutonium research experience. This report, entitled "A Review of the Natural Resources Defense Council Petition Concerning Limits for Insoluble Alpha Emitters," represents the most detailed and comprehensive analysis of the NRDC petition available to date. The report concludes, "The preceding review has indicated that the Tamplin-Cochran conclusions are based upon a hypothesis which requires considerable extrapolation of the data upon which it is based. Later evidence, of the same nature as was used in the derivation (i.e., rat skin data), does not support the assumptions of the original model. The Tamplin-Cochran interpretation of the model not only fails to take into account the later evidence, but appears to present the hypothesis as fact. The supporting evidence on human data which they present are based upon unsupported assumptions and distortions of the words of the authors they quote. Most importantly, they fail to use or acknowledge direct evidence on the effect of radioactive particles. Such evidence indicates that the basic damage model which they use overestimates badly the carcinogenic effects of radioactive particles. We conclude, therefore, that the application of the average organ dose to the establishment of limits is still appropriate, although experimentation to narrow existing uncertainties on the effects of non-uniform dose distribution should continue."

EPA Report ORP/CSD-75-1 (1975) and WASH-1359 (1974)
The U. S. Environmental Protection Agency held hearings in December 1974 (Washington, D. C.) and January 1975 (Denver, Colorado) on the subject of plutonium standards. The question of hot particles was addressed by several persons providing testimony. Proceedings from these hearings are available in a three-volume publication [7]. A compilation of the USAEC’s testimony presented at these hearings was made available earlier in WASH-1359, entitled “Plutonium and Other Transuranium Elements: Sources, Environmental Distribution and Biomedical Effects” [8]. These reports contain much information on the subject of the hot particle hypothesis. Also contained in WASH-1359 and ORP/CSD-75-1 is a letter from Dr. C. C. Lushbaugh concerning the incorrect interpretation by Tamplin and Cochran of his published data on a plutonium wound in the hand of a process worker. These reports also contain a report entitled “A Critique of the Tamplin-Cochran Proposal for Revision of the Current Plutonium Exposure Standards” by Dr. Roy Albert of the New York University Medical Center. The summary reads as follows:

“Largely on the basis of rat skin tumor experiments, Tamplin and Cochran propose that a single radioactive particle in the lung which delivers a local dose of more than 1000 rem per year will produce focal tissue damage and that this focal damage per se confers a risk of lung cancer of one in two thousand.

“A review of current knowledge about the relationship of tissue damage to the induction of cancer does not support the contention that tissue damage is a proximate cause of cancer; rather that tissue damage represents a parallel toxic action of carcinogens which, to some extent, may enhance the development of tumors produced by carcinogens. Since the Tamplin-Cochran proposal is based almost wholly on radiation tumor studies of the rat skin hair follicles, the decisive argument against this proposal is the evidence that focal alpha irradiation of localized regions on the hair follicle, in a pattern similar to that from a plutonium particle, is non-tumorigenic.”

National Council on Radiation Protection and Measurements Report No. 46
The National Council on Radiation Protection and Measurements (NCRP) recently released NCRP Report No. 46, entitled “Alpha-Emitting Particles in Lungs” [9]. The report was discussed in some detail by Drs. M. Eisenbud of New York University and J. N. Stannard of the University of Rochester at the May 28, 1975, session of the Energy Research and Development Administration’s public hearing concerning the Technology Research and Development Program for the Liquid Metal Fast Breeder Reactor and the Proposed Final Environmental Impact Statement on that program. The report was prepared by an ad hoc committee consisting of W. J. Bair (chairman), A. Kellerer, J. N. Stannard and R. C. Thompson and reviewed and approved by the entire NCRP Council, which is comprised of approximately 70 individuals.
The NCRP report concludes that:
• a substantial body of experimental animal data indicates that particulate plutonium in the lung is no greater hazard than the same amount of plutonium distributed more uniformly throughout the lung;
• the above observation from animal data is consistent with the theoretical analysis of the microscopic distribution of energy absorption in each case;
• the current NCRP practice of averaging absorbed dose over the lung is defensible when used in conjunction with appropriate dose limits;
• more precise consideration of the spatial distribution of absorbed dose cannot be profitably used to derive permissible exposure limits until we have more understanding of the relation between dose and effect.
The report is neither an endorsement of nor a commentary on the absolute numerical adequacy of present NCRP standards for plutonium or other alpha-emitting particles.
Medical Research Council’s Committee on Protection Against Ionizing Radiations-Report on the Toxicity of Plutonium (1975)
Earlier this year the United Kingdom’s Medical Research Council’s Committee on Protection Against Ionizing Radiations published a report entitled “The Toxicity of Plutonium” [10]. The following is a quotation from the section of the report dealing with recommendations relating specifically to plutonium: “For many years those professionally concerned with radiological protection have been aware of the need to establish general principles for assessing the relative risks of homogeneous and inhomogeneous irradiation. As discussed in the appendix, there is no evidence that irradiation by ‘hot particles’ in the lung is markedly more hazardous than the same activity uniformly distributed or that the currently recommended standards for inhalation of plutonium are seriously in error.” In the section of the report on hot particles, the authors state the following: “The conclusions of Tamplin and Cochran cannot be any better founded than the hypothesis on which they are based, and that is too tenuous to be worth further discussion here. Tamplin and Cochran also put themselves in the difficult situation that the risk is considered to be decreased by a factor of 115,000 if a particle containing 0.1 picocurie plutonium were to break into two equal halves.” The report continues, “The evidence most immediately relevant to the ‘hot particle’ problem is human experience of lung irradiation. ‘Hot particle’ irradiation seems unlikely to be more carcinogenic than uniform irradiation to the same dose as is received by the tissues adjacent to the particles. Indeed the risk for uniform irradiation on any hypothesis of carcinogenesis would be larger than for localised irradiation at the same dose in proportion to the ratios of the lung masses involved, unless the sites of deposition of the particles are also the sites where the lung cells specifically sensitive to cancer induction are to be found.
Current ideas suggest the opposite, that particles with long residence times are in the deep lung and the cells specially sensitive to cancer induction are in the linings of the airways. On the unlikely assumption of uniform distribution of particles and sensitive cells, if 1000 rem in a year to 64 µg of lung tissue resulted in a mean lung cancer incidence of 1/2000, the cancer risk for irradiation of 120 mg would be 100 percent, and after uniform irradiation of the whole lung of mass 1000 g to 1000 rem some 8000 separately induced lung cancers would be expected on average in each individual. There is no evidence that this happens. Parts of the lung are frequently irradiated to doses of this order in the course of radiotherapy.”
This section of the report on hot particles concludes that there is at present no evidence to suggest that irradiation of the lung by plutonium particles is likely to be markedly more carcinogenic than for the case when the same activity is uniformly distributed.
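The mass-scaling arithmetic in the MRC quotation above can be checked directly. A minimal sketch, using only the figures given in the quote (1/2000 risk per 64 µg of tissue at 1000 rem in a year, scaled linearly with irradiated mass):

```python
# Reductio ad absurdum from the MRC report: scale the NRDC per-particle
# risk (1/2000 per 64 µg of tissue at 1000 rem/y) linearly with the
# mass of lung tissue irradiated to the same dose.
RISK_PER_UNIT = 1 / 2000  # NRDC figure: lung cancer risk per irradiated 64 µg
MASS_UNIT_UG = 64         # tissue mass associated with one hot particle, in µg

def implied_risk(mass_ug):
    """Lung cancer risk (or expected cancer count) implied by linear scaling."""
    return (mass_ug / MASS_UNIT_UG) * RISK_PER_UNIT

# 120 mg irradiated to the same dose -> risk ~0.94, essentially 100 percent
print(implied_risk(120e3))  # 0.9375

# Whole lung (1000 g = 1e9 µg) -> ~8000 expected cancers per individual
print(implied_risk(1e9))    # 7812.5
```

The computed values (≈94 percent and ≈7800 cancers) match the report’s rounded figures of 100 percent and “some 8000,” which is the absurdity the MRC authors point to.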

The Importance of Non-Uniform Dose-Distribution to An Organ (1975)
In May 1974 a symposium entitled “Plutonium Health Implications for Man” was held at Los Alamos, New Mexico. At that meeting, a paper was presented on the subject of non-uniform dose distribution of plutonium, especially as regards the lung. The published report [11] reviews the animal-experiment data that are often cited both for and against the hot particle hypothesis. Also contained in the paper are the Los Alamos hamster experiments, which currently provide rather convincing evidence that the tumor probability per hot particle as postulated by the NRDC [1] is incorrect. Some consideration of biological mechanisms is contained in the paper, as evidenced by the following statement:
“For cases of non-uniform exposure, as occurs for particulate plutonium, there appears to be a biological sparing effect resulting from the fact that fewer cells are exposed to the alpha radiations, and much of the alpha energy is wasted as compared with a more uniform distribution. Also, the collective defenses of the body, both local and abscopal, such as inhibition of transformed cells by normal cells and immune surveillance, are more efficient in the case of non-uniform distribution. The key to the problem may well be the number of cells that interact with an alpha particle but are not killed. For the non-uniform distribution case, there are fewer of these cells which might have the potential to form a cancer, and they would be in an environment which would tend to inhibit their division and development to proceed to form a cancer.”

Suggested Reduction of Permissible Exposure to Plutonium and Other Transuranium Elements (1975)
A recent publication suggesting possible reductions in permissible exposure to plutonium considered the lung and the question of hot particles [12]. The report states that “No one knows the answer to this question at the present time. Certainly we would like to have more information. Tamplin and Cochran {reference} suggest that because of the very large dose (thousands of rem/y) in the vicinity of a micron-size particle of 239Pu lodged in lung tissue, the present q for lung (approx. 0.015 uCi) and the corresponding values of (MPC) for occupational exposure, as well as those for members of the public, should be lowered by a factor of 10^5. Perhaps they are right, but I believe they have not made a strong case for this factor simply because adequate biological data are not available and much of that which we have seems to give contradictory information.”

Morgan [12] also points out that he does not believe that we have “unequivocal proof that there is or isn’t a hot particle problem,” but that “it certainly is encouraging that there is no clear evidence at the present time that human occupational exposure to plutonium and other trans-uranium elements has resulted in any form of cancer.” (nuclear history’s note: I cannot let this stand. See earlier posts relating to Clinton-era disclosures of plutonium-affected workers as reported by ACHRE and the New York Times. In the 1990s the US Federal Government set up a compensation scheme specifically for these workers, who had been suppressed and isolated for decades. The Morgan claim is patently incorrect, based upon exclusion of the very workers Morgan refers to. The false statement that no harm came to plutonium workers is today still published in papers by Bobby Scott of Lovelace Institute. Pam Sykes at Flinders University can irradiate thousands of mice if she likes, but she won’t change the facts of history as discovered and documented by President Clinton’s panel. US DOE has thrown research funds at FU for this very purpose, in my opinion. Anyway, on with the article.)

On the basis of other work [13] involving plutonium studies in baboons, Morgan suggests that the value of q, when the total lung is the critical tissue, may be reduced by a factor of 4, but cautions that “this of course does not address the hot particle problem but rather shelves it until we have more data.” Morgan also adds that this shelving is what society has practiced for generations in the case of environmental pollutants from the burning of fossil fuels.

The abstract of Morgan’s paper [12] does state that “Until certain questions are answered about the particle problem, it will not be possible to set a satisfactory maximum permissible body burden for 239Pu based on lung as the critical organ…”

Lung Irradiation with Static Plutonium Microspheres
Experiments conducted at the Los Alamos Scientific Laboratory, in which hamster lungs are exposed to plutonium-containing microspheres, represent an important test of the hot particle hypothesis. Data from these experiments were discussed in WASH-1320 [5] and in several Los Alamos Scientific Laboratory progress reports [14-16].

The Los Alamos experiments were discussed in considerable detail in a report presented at a conference on Experimental Lung Cancer: Carcinogenesis and Bioassays [17]. The report states:
“Our results are in definite contradiction to all simplistic models (GEESAMAN, 1968; DEAN and LANGHAM, 1969; PEREZ and COLEMAN, 1969) that assume tumor induction can be calculated solely on the basis of cellular radiation exposure. The indication is that much more complicated mechanisms are involved and that the volume of tissue irradiated is an important factor. Of the experimental exposures, only the earliest ones have been completed in the sense that the animals have lived out their normal life spans. These involved comparatively small numbers of spheres irradiating only a few percent of the total lung mass. However, 1,142 hamsters were exposed to a total of some 5,700,000 spheres in these experiments, and only 2 lung tumors were observed, which already sets a very low limit on the probability of tumor induction per particle. The additional experiments begun through 1973 will raise the totals to 1,900 animals and 160,000,000 spheres and will greatly increase the fraction of lung irradiated.”
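The scale of the contradiction in the quoted passage can be made explicit with a short sketch, using only the figures from the quote and the NRDC’s postulated risk of one tumor per 2,000 hot particles:

```python
# Compare the completed Los Alamos hamster exposures against the NRDC
# "hot particle" prediction of one lung tumor per 2,000 particles.
# All figures are taken from the quoted report.
hamsters = 1_142
spheres = 5_700_000          # total plutonium microspheres administered
observed_tumors = 2          # lung tumors actually observed

risk_per_particle = 1 / 2000 # NRDC hypothesis
expected_tumors = spheres * risk_per_particle

print(expected_tumors)       # 2850.0 tumors predicted, versus 2 observed
```

On the NRDC hypothesis, roughly 2,850 tumors would have been expected in this cohort; 2 were seen, which is why the authors describe the result as setting “a very low limit” on per-particle tumor probability.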

Earlier this month, the results of the plutonium microsphere experiments being conducted at Los Alamos were summarized [18] at an International Atomic Energy Agency sponsored symposium on the Biological Effects of Low Level Radiation Pertinent to Protection of Man and His Environment. The report emphasizes that the studies do not add credence to the supposition that lung tumor induction and expression from plutonium particulates can be predicted solely on the number of cells at risk, and that discrete focal alpha radiation alone is not an efficient respiratory carcinogen in the hamster.

Additional Information Prepared on the Hot Particle Hypothesis
A status report on the hot particle problem was given at the summer 1975 meeting of the American Nuclear Society and appears as an abstract in the transactions of the meeting [19]. The report covers the original hypothesis and critiques of reviews of the hypothesis. The interested reader is directed to reference [19] for additional information.

National Academy of Sciences
An ad hoc subcommittee of the National Academy of Sciences’ Committee on the Biological Effects of Ionizing Radiation is reviewing the problem of non-uniformity of radiation dose as it relates to lung irradiation from plutonium and other actinide elements. A report will be published in the near future.
What can we conclude from the information now available on the hot particle question? I believe that the majority of responsible researchers and others who have reviewed the question of lung irradiation from particulate plutonium have rejected the hot particle hypothesis as put forth by Tamplin and Cochran as being unsupportable. This is, however, an important radiobiological question which continues to command attention of researchers and radiation protection groups.

The proposed risk estimate for plutonium-induced lung cancer of 5 × 10^-4 per particle cannot be substantiated on the basis of our current knowledge. In fact, careful consideration of the available data shows that particulate plutonium is not more hazardous than the same amount of plutonium distributed uniformly. Further, the data suggest that the potential hazard from plutonium increases as the dispersion throughout the lung becomes more uniform.

We have yet to learn of the official response to the petition [1] by the EPA and the NRC. Perhaps a response will be available from these agencies before the second anniversary of the petition (14 February 1976). It is unfortunate that questions of such importance require such long periods of time for response. Perhaps we could accelerate the examination process by immediately directing important issues to established organizations such as the NCRP or, on an international scale, the ICRP, to determine if the question or issue is of reasonable importance and priority to command immediate attention. There must also be judgements other than technical that enter the decision making process on a given issue and these must not be overlooked. It is difficult, however, for the lay public to have thrust upon them complex issues which have not first been evaluated by national or international organizations established to provide guidance on the issues being questioned or examined. I believe much time, money and frustration could be saved if we developed a better system for review. In some instances, questions might be identified as “non-issues” and set aside in deference to questions that may indeed require review and decision making by the technical and other components of society.


Essentially, the model assumes that an internal emitter deposits its energy (exposure) across all the cells of the organ, whereas the range of an alpha particle is far too short for that whole-organ dose actually to occur: alpha radiation cannot affect cells outside its limited but highly energised path. This US-based model allowed the AEC to maintain its established position and drove the US official position into conformity with the ICRP model. The ICRP has always held an erroneous position on internal emitters; it was not even originally chartered to examine alpha emitters.