Think Circadian. This link is very important in relation to the low dose debate and to the understanding of the mechanisms of adaptation to background radiation. More on this in two posts' time.
Apparent Diurnal Variation in Background Radiation
by John Walker
January 27, 1999
Since October 16th, 1998, I’ve run, pretty much continuously, an Aware Electronics RM-80 radiation monitor connected to a serial port on a 1992 vintage 486/50 MS-DOS PC. The RM-80 uses a 7313 pancake Geiger-Müller tube, which is halogen quenched and has a minimum dead time of 30 µs, and is equipped with a mica window to allow alpha particles to enter. The unit generates high voltage for the tube from the Data Terminal Ready and Request to Send pins of the serial port and toggles the state of the Ring Indicator line whenever a count is detected. The serial port can be programmed to interrupt on changes in the state of this signal, making it straightforward to implement radiation monitoring in software. Tube sensitivity, calibrated with Cesium-137 (137Cs), is 3.54 µR/hour per count per minute.
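Walker's calibration figure makes the conversion from raw counts to dose rate a single multiplication. A minimal sketch in Python (the 3.54 µR/hr per count-per-minute constant is from the text above; the function name is mine):

```python
# Convert RM-80 counts per minute to an estimated dose rate.
# Calibration constant from the text: 3.54 uR/hr per count per minute (137Cs).
CAL_UR_PER_CPM = 3.54

def cpm_to_urhr(counts_per_minute):
    """Estimated dose rate in micro-Roentgens per hour."""
    return counts_per_minute * CAL_UR_PER_CPM

# The article's mean background of 16.2 uR/hr works out to roughly
# 16.2 / 3.54, i.e. about 4.6 counts per minute.
print(round(cpm_to_urhr(4.58), 2))  # prints 16.21
```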
The second generation HotBits generator uses an RM-80 detector illuminated by a 5 microcurie 137Cs check source. I decided to attach the HotBits spare detector to a PC and let it run as a background radiation monitor, as much as anything to let the detector run for a while to guard against “infant mortality” in any of its components, should it have to take over for the in-service detector. Aware Electronics supplies the detector with a DOS driver program called AW-SRAD, which I used to log the number of counts per minute, logging the collected data to files in CSV format. After a few months of data collection, I decided to run some analyses of the data, and I’m not sure exactly what I found. Let’s take a look.
Radiation by Local Solar Time
The next thing I tried was binning the results hourly by local solar time. The following chart shows the results, plotted in terms of average background radiation flux in micro-Roentgens per hour. (The average background radiation of 16.2 µR/hr, or 142 mR per year, may seem high, but recall that Fourmilab is at an altitude of 800 metres above mean sea level in the Jura mountains. Both the soft and hard [primarily muon] components of secondary cosmic rays are absorbed by the atmosphere, so the greater the altitude the more intense the radiation [historically, this provided the first clue that the source must be in the upper atmosphere or space]. At sea level, cosmic rays contribute about 30 mR/year, but at the 10 km altitude at which jet airliners fly, cosmic radiation accounts for about 2000 mR/year, more than 60 times as intense as at sea level.) When I plotted the hourly local time averages, I was surprised to obtain the following result.
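The hourly binning described above amounts to grouping the minute-by-minute records by hour of day and averaging. A sketch, assuming each record has already been parsed into a (timestamp, counts-per-minute) pair; the actual AW-SRAD CSV layout is not specified here, so parsing is left out:

```python
from collections import defaultdict
from datetime import datetime

CAL_UR_PER_CPM = 3.54  # calibration constant from the text

def hourly_average(rows):
    """rows: iterable of (datetime, counts_per_minute) pairs.

    Returns a dict mapping hour of day (0-23) to the mean background
    flux for that hour, in micro-Roentgens per hour.
    """
    sums = defaultdict(float)
    n = defaultdict(int)
    for when, cpm in rows:
        sums[when.hour] += cpm * CAL_UR_PER_CPM
        n[when.hour] += 1
    return {h: sums[h] / n[h] for h in sums}
```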
I’ve read about variations in cosmic ray flux with latitude, in Easterly and Westerly incidence, modulation by the solar cycle, and due to changes in the geomagnetic field, but I’ve never heard mention of a diurnal cycle. Yet this plot appears to show a sinusoidal variation, with the difference between the highest three-hour period and the lowest amounting to almost 6% of the mean value, and with the trough in the curve apparently just about 12 hours from the peak.
To explore whether this might be nothing but an artifact or statistical fluctuation, I then re-binned the same data minute by minute, resulting in the following plot, in which the blue curve is the raw minute-binned data and the red curve is the same data filtered by an exponentially smoothed moving average with a smoothing factor of 0.9.
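One common reading of an "exponentially smoothed moving average with a smoothing factor of 0.9" is that each smoothed point keeps 90% of the previous smoothed value and takes 10% of the new sample; Walker does not spell out the convention, so this is an assumption. A sketch under that reading:

```python
def exp_smooth(values, factor=0.9):
    """Exponentially smoothed moving average of a sequence.

    `factor` is the weight given to the previous smoothed value
    (one reading of the article's "smoothing factor of 0.9"); the
    remainder, 1 - factor, is the weight of the incoming sample.
    """
    smoothed = []
    prev = None
    for x in values:
        prev = x if prev is None else factor * prev + (1.0 - factor) * x
        smoothed.append(prev)
    return smoothed
```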
Well, it still looks credibly sinusoidal, with the maximum and minimum at about the same point. As we all know, the human eye and brain are extraordinarily adept at seeing patterns in random data. So let’s try another test frequently applied as a reality check when apparently significant results appear in a data set. The chart at the left was created by randomly selecting 25% of the points appearing in the complete data set and plotting them hour by hour. We find that the selection has little effect on the shape of the curve or the location of its maximum and minimum.
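The 25% spot check is simple to reproduce: keep each point with probability 0.25, then re-bin the survivors and see whether the curve's shape holds up. A sketch (the seed parameter is my addition, for repeatability):

```python
import random

def subsample_check(rows, fraction=0.25, seed=None):
    """Randomly keep roughly `fraction` of the data points.

    Re-binning the surviving points should reproduce the original
    curve's shape and extrema if the pattern is real rather than an
    artifact of particular points.
    """
    rng = random.Random(seed)
    return [r for r in rows if rng.random() < fraction]
```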
Next, I decided to explore whether the apparent sinusoidal variation might disappear if I discarded outlying values, which might conceivably vary differently in time than those which make up the bulk of the database. I pruned the bell curve at about one standard deviation, then used the remaining data to prepare the plot at the left. As you can see, the case for a sinusoidal variation is eroded somewhat, but the general shape, magnitude, and location of extrema is conserved.
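Pruning "the bell curve at about one standard deviation" can be read as discarding every sample more than one standard deviation from the mean before re-binning. A sketch under that reading:

```python
def prune_outliers(values, n_sigma=1.0):
    """Discard values more than n_sigma standard deviations from the mean."""
    n = len(values)
    mean = sum(values) / n
    variance = sum((v - mean) ** 2 for v in values) / n
    sd = variance ** 0.5
    return [v for v in values if abs(v - mean) <= n_sigma * sd]
```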
Finally, I decided to plot the average radiation flux against local sidereal time. Sidereal time tracks the position of the distant stars as viewed from a given point on the Earth. At the same sidereal time, the same celestial objects (external to the solar system) will cross the meridian in the sky above a given place on the Earth. Since the viewpoint of the Earth shifts as it orbits the Sun, the sidereal day (the time between successive meridian crossings of a given star) is about 4 minutes shorter than the solar day (between solar meridian crossings). Correlation with the sidereal period is powerful evidence for a distant source for a given effect. For example, it was such correlation which provided early radio astronomers evidence the centre of the galaxy and Crab Nebula were celestial sources of the noise they were monitoring. Here’s a plot of average background radiation flux by sidereal time.
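Binning by sidereal time requires converting each record's civil timestamp to local sidereal time. A sketch using the standard low-precision Greenwich Mean Sidereal Time approximation (the constants are the usual J2000-epoch ones, not values from Walker's text; the observer's east longitude would be supplied by the caller):

```python
def local_sidereal_hours(jd_ut, east_longitude_deg):
    """Approximate local mean sidereal time, in hours (0-24).

    Standard low-precision formula: GMST in hours is
    18.697374558 + 24.06570982441908 * D, where D is the number of
    Julian days since the J2000.0 epoch (JD 2451545.0). The local
    value adds the east longitude at 15 degrees per hour.
    """
    d = jd_ut - 2451545.0
    gmst_hours = 18.697374558 + 24.06570982441908 * d
    return (gmst_hours + east_longitude_deg / 15.0) % 24.0
```

Because the sidereal day runs about 4 minutes fast on the solar day, a genuinely solar effect smears out when binned this way over months of data, which is what makes the sidereal plot a useful discriminator.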
What’s Going On Here?
Darned if I know! The floor is open to wild conjecture and unbridled speculation.
First of all, I think it’s reasonable to assume that any diurnal variation in background must be due to cosmic rays. The balance of background radiation is primarily due to thorium, radon, and daughter nuclides in the local environment. In the vicinity of Fourmilab, the terrain is almost entirely thin, rocky topsoil over a thick layer of limestone. Limestone has little or no direct radioactivity (as opposed to, for example, granite, which is rich in thorium), nor does it contain radon precursors. In such an environment, it’s hard to imagine a background radiation component other than cosmic rays which would vary on a daily basis. (This would not be the case, for example, in a house with a radon problem, where you would expect to see a decrease when doors and windows were opened during the day.)
If the effect is genuine, and the cause is cosmic ray flux, what are possible causes? The two which pop to mind are atmospheric density and the geomagnetic field. During the day, as the Sun heats the atmosphere, it expands. If you’re at sea level, the total absorption cross section remains the same, but the altitude at which primary cosmic rays interact with atoms of the atmosphere may increase. Further, an increase in atmospheric temperature may change the scale height of the atmosphere, which would perturb values measured at various altitudes above sea level. But if this were so, I’d expect the variation curve to be more or less in phase with the solar day, while what we seem to be seeing is skewed by about six hours.
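The scale-height argument can be made concrete with an isothermal exponential atmosphere: if the sea-level column is held fixed, changing the scale height changes the overburden only for stations at altitude. A toy illustration (the 1033 g/cm² sea-level column and the scale heights are textbook round numbers, not values from the article):

```python
import math

SEA_LEVEL_COLUMN = 1033.0  # g/cm^2, standard-atmosphere overburden at sea level

def overburden(height_km, scale_height_km):
    """Column mass above `height_km` for an isothermal exponential atmosphere.

    The sea-level column is held fixed, so changing the scale height
    (a warmer, "puffier" atmosphere) only shifts the overburden seen
    by a station at altitude, such as Fourmilab at 800 m.
    """
    return SEA_LEVEL_COLUMN * math.exp(-height_km / scale_height_km)
```

With these numbers, warming from an 8.0 km to an 8.5 km scale height adds a few g/cm² of shielding above 800 m while leaving sea level untouched, consistent with the point that the sea-level cross section stays constant.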
Let’s move on to the geomagnetic field. It’s well documented that the Earth’s magnetic field and its interaction with the Sun’s create measurable changes in cosmic ray incidence, since the proton and heavy ion component of primary particles is charged and follows magnetic field lines. As any radio amateur or listener to AM radio knows, the ionosphere changes dramatically at night, allowing “skip propagation” of medium- and high-frequency signals far beyond the horizon. Perhaps this effect also modifies the geomagnetic field, affecting the number of charged cosmic rays incident at a given location.
If there is a diurnal effect, why should it peak around 07:00 local time? Beats me. Nor do I know, if the apparent correlation with sidereal time is genuine (though I believe it illusory, due simply to the database covering only a few months), why it should peak at 12 hours and reach a trough 8 hours later, at 20 hours local sidereal time.
First off, the Jura Mountains: "The Jura Mountains are a small mountain range located north of the Alps, separating the Rhine and Rhone rivers and forming part of the watershed of each. The mountain range as defined by Johann Gottfried Ebel (* 1764, † 1830) is located in France, Switzerland, and Germany." (Wiki)
It is interesting to note that the US DOE funded team at Flinders University, South Australia, headed by Dr Pam Sykes, found a beneficial effect from LOW LET (linear energy transfer), very low dose radiation. It is very interesting to note that of all the forms of ionising radiation, the US DOE/FU team confined themselves to externally administered soft X-rays. They did not confirm their findings in regard to internalised alpha and beta emitters. (Not surprising really, given that the US DOE designed the experiment's parameters.) Not surprising because alpha radiation is high LET. Beta, comprised as it is of an energised electron (as opposed to alpha's energised helium nucleus), also has mass and charge and is therefore not low LET either.
It has long been known that to cause genetic damage by direct insult (impact, energy transfer), radiation has to possess energy of more than a certain value (Peter Alexander, "Atomic Radiation and Life", Pelican Books, UK, 1957). This is regardless of dose. To create genetic damage via Linear Energy Transfer (one of the two vectors for harm such radiation possesses, the other being ionisation causing the formation of harmful chemicals in the cell and cell nucleus), the radiation has to exceed an ENERGY threshold. This is NOT a dose threshold. If of sufficient energy, a single track of radiation through a cell can cause LET damage. Increase the dose, increase the number of tracks, increase the chances of an LET "collision" of damaging consequences.
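The energy-versus-dose distinction above can be restated with a simple Poisson track model (my illustration, not from Alexander): dose scales the expected number of tracks through a cell nucleus, while the energy per track, and hence whether a single track can do LET damage at all, is fixed by the radiation type.

```python
import math

def prob_at_least_one_track(mean_tracks_per_unit_dose, dose):
    """Poisson chance that a cell nucleus is traversed at least once.

    Dose buys more tracks, not more energy per track: if each track
    is energetic enough to cause LET damage, raising the dose simply
    raises the number of chances, as 1 - exp(-lambda * dose).
    """
    lam = mean_tracks_per_unit_dose * dose
    return 1.0 - math.exp(-lam)
```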
The type of radiation most usually encountered that is normally capable of possessing such energy is Alpha, followed by Beta (depending on the radionuclide which emits it; this normally determines the energy level).
The differing characteristics of Gamma and X compared to Alpha and Beta in this regard are obvious and form the basis for the mathematically derived unit of absorbed dose called the "Sievert". Regardless of whether or not the comparative mathematical values ascribed to the differing radiation types are adequate (that's a whole other question), the fact is internalised alpha radiation is far more efficient at producing internal damage to the body than Gamma or X, with high energy forms of Beta not far behind (the Beta emitted by Strontium-89, for example, can be considered to be on the threshold of "high LET"). The reason is simple really. Gamma and X possess no mass (inertia, momentum) and no charge. Alpha and Beta do. Alpha is comparatively massive in mass times velocity per track, Beta much smaller, but both possess mass, and both possess charge. So the likelihood of Alpha and Beta striking critical atoms in the genetic material is much greater than that of Gamma and X.
Given this, it is comforting to know Mother Nature may have provided a protective cycle in the form of the diurnal variation in background. A cycle in which charged particles are stripped from cosmic rays, during the hours when humans are active outdoors, by the action of dayside reconnection of the Earth's magnetosphere. The peak influx, according to Walker, occurs at around 07:00 local time. And it appears that this early morning exposure primes the body for the daytime assault of cosmic rays, protecting it, by activated repair, from the full daytime blast.
That it was Dr Sykes and her FU team who found this protective priming mechanism (of very low LET, low dose ionising radiation) is not comforting. For on the basis of the findings, the principle of Brucer's "Radiation Hormesis" has been extended by Flinders University and the US DOE (see earlier posts) not just to the form of radiation used in the Sykes DOE experiment (ie soft, low dose X-rays) but to all forms, including, paradoxically, Alpha and Beta, which are not, in the realms of real science, considered to be "low LET" at all.
And that, after all, is the basis for the Sievert:
"The equivalent dose to a tissue is found by multiplying the absorbed dose, in gray, by a dimensionless 'quality factor' Q, dependent upon radiation type, and by another dimensionless factor N, dependent on all other pertinent factors.
N depends upon the part of the body irradiated, the time and volume over which the dose was spread, even the species of the subject. Together, Q and N constitute the radiation weighting factor, WR. Q is the same thing as the Relative Biological Effectiveness (RBE). For an organism composed of multiple tissue types a weighted sum or integral is often used. (In 2002, the CIPM decided that the distinction between Q and N causes too much confusion and therefore deleted the factor N from the definition of absorbed dose in the SI brochure.)" – wiki.
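Under the pre-2002 definition just quoted, the arithmetic is simply a product of absorbed dose, Q and N. A sketch using the regulatory Q values quoted later in this post (photon/electron 1, neutron 10, alpha 20), with N defaulted to 1:

```python
# Equivalent dose per the (pre-2002) definition quoted above:
# H (sievert) = D (gray) * Q * N.
QUALITY_FACTOR = {"photon": 1.0, "electron": 1.0, "neutron": 10.0, "alpha": 20.0}

def equivalent_dose_sv(absorbed_dose_gy, radiation, n=1.0):
    """Equivalent dose in sieverts for a given absorbed dose in grays."""
    return absorbed_dose_gy * QUALITY_FACTOR[radiation] * n
```

So 1 mGy absorbed from an internal alpha emitter is booked as 20 mSv equivalent, against 1 mSv for the same absorbed dose of X-rays, which is exactly the weighting the post argues Sykes et al set aside.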
Even though basic health physics recognises the enhanced danger posed by internalised alpha and beta emitting particles (they are, basically, (1) trapped in the body for varying periods, and (2) high LET in the case of alpha, or more energetic in the case of some beta, compared to gamma and X; having mass and charge, they are more efficient at producing collisions (LET) and ionisation per centimetre of traversed tissue, which is why the Sievert weights them proportionately), Sykes et al and the US DOE ignore this fact.
On the basis of the FU experiment, Los Alamos National Labs (via Bobby Scott) has proposed that clean up of contaminated sites is a waste of money!! Even though such sites contain alpha emitting (nuclear fuel) and beta emitting (fission product) particles.
Sykes et al did not use internalised alpha and beta emitting particles. They used solely soft, low LET, low dose X-rays.
There is no basis therefore to conclude that “low dose” internalised alpha and beta particles have any benefit. In fact, to the contrary, they are highly dangerous. Everyone has a body burden. The lower that burden is, the better. There is no benefit derived from a contaminated landscape as Scott states, citing Sykes et al, FU.
There is a basis for concluding that a precursor dose of soft, very low LET, low dose X-rays primes the body for repair in anticipation of a higher dose of harder rays. This is all Sykes et al found. The variation in background radiation reported by Walker may provide a regular cycle in nature which, in evolutionary terms, resulted in a protective mechanism in life.
In this, though, it can be seen that the highest exposure to the varying cosmic background comes in the wee hours and early morning, the time when humans are naturally asleep. A time when, therefore, the oxygen tension in our cells is at its lowest.
The oxygen tension of cells, as the reader will remember from earlier posts, is one determinant of the damage radiation may cause. The higher the oxygen tension in the cell, the greater the damage per unit dose. (Alexander, 1957. Observed by Hiroshima doctors, 1945.)
Sykes et al, as applied, is simply a refinement of the Teller technique of confusing the voter by muddling the radiation types and the types of exposure (internal vs external). This is the basis of the historic Teller vs Pauling debates.
It is on again. Not that it ever ceased. The techniques remain the same.
“This variation in effect is attributed to the Linear Energy Transfer [LET] of the type of radiation, creating a different relative biological effectiveness for each type of radiation under consideration. Per most government regulations, the RBE [Q] for electron and photon radiation is 1, for neutron radiation it is 10, and for alpha radiation it is 20. There is some controversy that the Q or RBE for alpha radiation is underestimated due to mistaken assumptions in the original work in the 1950s* that developed those values. That original work neglected the component of the nucleus recoil radiation for alpha emitters.” Wikipedia, Definition of Sievert.
It’s not wise to ignore the internal emitters Dr Sykes. Dr Hamilton didn’t. Remember him? We don’t want a repeat.
* a whole other argument, like I said.