We tend to think of radioactivity as an artificial thing; some argue that the first nuclear explosions in 1945 should mark the start of a new human-dominated geological epoch called the Anthropocene. These man-made explosions have left distinctive radioactive traces that may well outlive us all. It turns out that natural radioactivity, even fission reactions, played an interesting role in Earth’s history long before we came along.
A little background
Our sun is a nuclear fusion reactor, squeezing the simplest atoms, hydrogen, together to create helium, plus energy. This normal activity, along with dramatic events in a star’s history such as supernovae, has created virtually all the atoms you see around you. Radioactive decay is where large unstable atoms break up, creating new smaller atoms plus various left-over bits, such as alpha and beta particles, gamma rays and neutrons. Sometimes these neutrons hit other unstable atoms and cause them, in turn, to break up. Put enough radioactive atoms of the right sort together and a self-sustaining nuclear fission reaction starts. When nuclear fission is used to generate electricity, the reaction is controlled. If used to kill people, a runaway chain reaction is created to generate as much energy as possible.
Too much radioactivity is dangerous, damaging cells and DNA whether the source is natural radon gas or a nuclear weapon. But it’s not all bad. Some people regard plate tectonics as a prerequisite for life on earth. It certainly makes things more interesting. Plates move because the mantle convects, and the mantle convects because heat from the deep interior must escape to the surface. This heat comes partly from radioactive decay within the earth; without it this planet would be a cold and dull lump by now.
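The size of that radiogenic contribution can be roughly estimated. The sketch below uses approximate textbook heat-production rates for uranium, thorium and potassium and rough bulk-silicate-earth concentrations; every constant here is an assumed round number, not a figure from this post.

```python
# Rough estimate of radiogenic heat production in the silicate earth.
# All constants are approximate textbook values (assumed).

# Heat production per kg of each *element* today, in W/kg
heat_per_kg = {"U": 9.8e-5, "Th": 2.6e-5, "K": 3.5e-9}

# Rough bulk-silicate-earth concentrations (kg of element per kg of rock)
concentration = {"U": 20e-9, "Th": 80e-9, "K": 240e-6}

silicate_earth_mass = 4.0e24  # kg (mantle plus crust, approximate)

total_watts = sum(
    heat_per_kg[e] * concentration[e] for e in heat_per_kg
) * silicate_earth_mass

print(f"Radiogenic heat: {total_watts / 1e12:.0f} TW")
```

With these assumed numbers the answer comes out around 20 TW. The earth’s total surface heat flow is often estimated at 45–47 TW, so decay plausibly supplies a large fraction of it, with the rest being primordial heat left over from the planet’s formation.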
Radioactive decay is massively useful to geologists as a dating tool. Rates of decay, usually expressed in terms of half-lives, are constant. If you can work out that a grain of zircon started out with twice as much Uranium-235 as it now has, then you know it formed 703.8 million years ago.
Let’s turn that round: 703.8 million years ago there was twice as much Uranium-235 around as there is now, and 1,407.6 million years ago (two half-lives) there was four times as much. This means that the earth used to be hotter (more radioactive decay), which is why Archean geology is so weird (odd komatiite lavas, crust that dripped back into the mantle). It also means that fission reactions were easier in the past.
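The arithmetic behind both directions is just the exponential decay law, N(t) = N₀ × (½)^(t/T½). A minimal sketch in Python, using the hypothetical zircon scenario from the paragraphs above:

```python
# Exponential decay: remaining fraction = (1/2) ** (elapsed / half_life)
from math import log2

U235_HALF_LIFE_MYR = 703.8

def age_from_fraction(fraction_remaining, half_life=U235_HALF_LIFE_MYR):
    """Age in Myr implied by the fraction of the parent isotope still present."""
    return half_life * log2(1 / fraction_remaining)

def abundance_multiplier(myr_ago, half_life=U235_HALF_LIFE_MYR):
    """How much more abundant the isotope was this many Myr ago."""
    return 2 ** (myr_ago / half_life)

print(age_from_fraction(0.5))        # 703.8 Myr: one half-life has passed
print(abundance_multiplier(1407.6))  # 4.0: two half-lives ago, four times as much
```

The same two functions work for any isotope once you swap in its half-life, which is why radiometric dating is such a general-purpose tool.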
Much of the hard work of a nuclear weapons program involves enriching Uranium. From the Manhattan Project through to the Iranians today the most laborious job is taking natural Uranium (a mixture of Uranium-235 and Uranium-238) and increasing the proportion of Uranium-235. This is important because U-238 is more stable, with a longer half-life and less interest in breaking up. Humans increase the proportion of U-235 using centrifuges, or lasers, but a time-machine would do the same job.
Around 2 billion years ago, a Uranium-rich deposit in modern day Gabon was the site of seventeen natural nuclear fission reactors. Self-sustaining nuclear reactions, moderated by groundwater, lasted for about a million years. There are two excellent blog posts that cover the site in more detail.
Such natural reactions are extremely unlikely now, since much more U-235 has decayed into lead over the intervening 2 billion years. But what about the 2 billion years of earth history before the Gabon reactors started up? Were fission reactions active in that time frame? Some argue that they were, with explosive consequences.
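Running the two uranium half-lives backwards shows why. Today natural uranium is about 0.72% U-235; rewinding 2 billion years (using the standard half-lives of 703.8 Myr for U-235 and 4,468 Myr for U-238, assumed here) gives an enrichment in the same range as fuel for modern light-water reactors:

```python
# Back-calculate the natural U-235 enrichment at some time in the past.
# Half-lives in Myr and present-day atom abundances are standard values (assumed).
HALF_LIFE = {"U235": 703.8, "U238": 4468.0}
TODAY = {"U235": 0.72, "U238": 99.28}  # atom %

def u235_percent(myr_ago):
    # Each isotope was more abundant in the past by a factor of 2 ** (t / half-life)
    n235 = TODAY["U235"] * 2 ** (myr_ago / HALF_LIFE["U235"])
    n238 = TODAY["U238"] * 2 ** (myr_ago / HALF_LIFE["U238"])
    return 100 * n235 / (n235 + n238)

print(f"{u235_percent(2000):.1f}%")  # roughly 3-4% two billion years ago
print(f"{u235_percent(0):.2f}%")     # 0.72% today
```

Light-water reactor fuel is typically enriched to roughly 3–5% U-235, so two billion years ago groundwater-moderated natural uranium could go critical on its own; today it cannot.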
Huge explosions and the moon
The deep Earth is a mysterious place. We know that the crust is relatively rich in radioactive elements but we don’t know much about their distribution in the mantle. One day neutrino detectors may help map out the modern-day distribution. How they were distributed earlier in the earth’s history is anyone’s guess.
Some people’s guesses (informed by computer modelling) suggest that heavy radioactive elements such as Uranium, Thorium and Plutonium sank to the bottom of the mantle, near the core-mantle boundary. Plutonium is now regarded as a man-made element, but it would have existed in the early earth, having had less time to decay since being created in a supernova. Geochemical models suggest that while substantially enriched, the average concentrations would still be too low to cause fission reactions.
Dutch scientists (R.J. de Meijer and W. van Westrenen) have suggested an amazing thing. Their theory is that concentrations of radioactive elements were higher in some areas than others (not unreasonable). They suggest that, just as human nuclear bombs are triggered by using conventional explosives to pressurise the radioactive material, a major impact on the earth would send shock waves into the inner earth and compress the material enough to initiate a nuclear reaction.
This reaction would take place in a large volume of rock and so would create a huge explosion. Big enough, their modelling suggests, to fragment the earth and send lots of material into space. In time, some of this material formed a large moon orbiting the earth – the one we see today.
The moon? Really?
I suspect you are feeling a little sceptical right now, which I think is the right reaction. But bear in mind that we don’t really know how the moon formed. The best available theory is based on the idea of a massive collision with another large body. This has big problems because of the many isotopic similarities between the earth and moon. Any other body coming in would be expected to have had a different composition, traces of which would be present in the moon today.
The giant impact model is still the best. A recent conference on the moon’s origins discussed many ways in which the similarities between earth and moon could be reconciled with the model. The impact could have thoroughly mixed the material, or maybe the impactor had the same composition. Perhaps the moon originally came from Venus. We don’t know anything about the composition of Venus – it may be very similar to earth.
As far as I can tell, nobody discussed the nuclear explosion model at this conference. This may be because there is no actual evidence for it, just inference from modelling. In their latest paper R.J. de Meijer and W. van Westrenen predict distinctive patterns in Xenon and Helium isotopes in lunar material. Measurements of these elements in the existing Apollo samples are contaminated by the solar wind, so samples of deeply buried lunar material would be needed to test the model fully.
We’ll have to wait then. Perhaps some future lunar rover will dig up the required samples. If it does, it will likely be powered by Plutonium, like the Chang’e rover currently on the moon. Useful stuff, radioactivity.