24 January 2013
The other day I got a message asking about where the earth gets its heat. It brings up a number of misconceptions that I thought would be worth spending a post discussing, so here goes:
Many people assume the earth to be millions if not billions of years old. Lava is molten, but the earth being only 8,000 miles in diameter has no internal heat source. It is almost like a thermos bottle that will lose heat over time. Many suppose that extreme pressure causes heat, but at the deepest depths of the ocean where the pressure is very high, it is also very cold.
This bit about pressure is partly correct: pressure alone does not generate heat if the thing you’re pressing on doesn’t change volume. Compressing a gas, however, does heat it up. This adiabatic compression is behind the “adiabatic lapse rate,” which explains why higher elevations are colder and lower elevations warmer. As air rises, it expands and cools, so mountaintops are chilly; as air sinks, it is compressed and warms, which is why descending into the Grand Canyon or Death Valley in summer is a bad idea. But this effect can’t explain why the interior of the earth is hot, so let’s press on.
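To put a number on that warming: for dry air, the adiabatic lapse rate works out to g divided by the specific heat of air at constant pressure. Here’s a quick back-of-envelope sketch (the 2,000 m descent is just an illustrative figure, roughly rim-to-river in parts of the Grand Canyon):

```python
# Dry adiabatic lapse rate = g / c_p
g = 9.81      # m/s^2, gravitational acceleration
c_p = 1005.0  # J/(kg K), specific heat of dry air at constant pressure

lapse_rate = g / c_p  # kelvin per meter of altitude change

# Air descending ~2000 m warms by roughly:
descent = 2000.0  # m, illustrative figure
warming = lapse_rate * descent

print(f"Lapse rate: {lapse_rate * 1000:.1f} K per km")
print(f"Warming over a {descent:.0f} m descent: about {warming:.0f} K")
```

That’s nearly 10 °C per kilometer of descent, which is why the canyon floor can be brutally hotter than the rim on the same afternoon.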
Others assume that radioactive decay causes the heat, but with the advent of nuclear plants and control rods, it would take some very precise levels to control the heating process. Also, with radioactive decay, there would naturally be some radiation that would be present in many different ways and in all probability, would be exposed with any volcanic eruption.
Here’s where we go wrong. About half of the earth’s internal heat does come from the radioactive decay of naturally occurring isotopes, in particular isotopes of potassium, uranium, and thorium. The confusion arises from the analogy to nuclear reactors. It’s true that a nuclear reactor must be carefully controlled to sustain its fission chain reaction without running away and melting down, but that has nothing to do with the radioactivity that heats the earth. Radioactive decay is not the same thing as a fission chain reaction. Radioactive isotopes of potassium, uranium, and thorium spontaneously decay into more stable daughter elements no matter what temperature they are at, and no control rods are required. You could have a single atom of uranium, floating alone in frigid deep space, and it would still eventually decay and release some energy.
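You can see how long-lasting this heat source is from the half-lives alone. Each half-life, half of the remaining atoms decay, so the fraction left after time t is simply (1/2)^(t / half-life). Here’s what that gives for the main heat-producing isotopes over the age of the earth:

```python
# Fraction of each heat-producing isotope remaining after the
# age of the earth, using well-established half-lives.
half_lives_gyr = {
    "K-40":   1.25,   # billion years
    "U-235":  0.704,
    "U-238":  4.47,
    "Th-232": 14.0,
}

earth_age_gyr = 4.54  # billion years

remaining = {}
for isotope, t_half in half_lives_gyr.items():
    # Radioactive decay law: N/N0 = (1/2)^(t / t_half)
    remaining[isotope] = 0.5 ** (earth_age_gyr / t_half)
    print(f"{isotope}: {remaining[isotope]:.1%} of the original amount remains")
```

About half the original uranium-238 and most of the thorium-232 are still around today, still quietly decaying, with billions of years of heating left in them. No fuel delivery and no controls needed.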
The second point, that naturally occurring radiation would be exposed by volcanic eruptions, is absolutely right. Any erupted rocks that contain significant amounts of potassium, thorium, and uranium will show measurable radioactivity. Take a Geiger counter into Grand Central Station or the Vatican if you don’t believe me: their granite walls contain above-average concentrations of radioactive elements. There are lots of other sources of radioactivity all around us. Bananas, for example, are rich in potassium, and so are slightly radioactive.
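Just how radioactive is a banana? A back-of-envelope estimate: take the number of potassium-40 atoms in a typical banana and multiply by the decay constant. The potassium content below is a typical figure, not a measurement of any particular banana:

```python
import math

# Assumed inputs (typical textbook figures):
potassium_g = 0.45           # grams of potassium in a typical banana
k40_abundance = 1.17e-4      # natural isotopic fraction of K-40
k_molar_mass = 39.1          # g/mol, natural potassium
avogadro = 6.022e23          # atoms per mole
half_life_s = 1.25e9 * 3.156e7  # K-40 half-life (1.25 Gyr) in seconds

# Number of K-40 atoms in the banana
n_k40 = potassium_g / k_molar_mass * avogadro * k40_abundance

# Activity = decay constant * number of atoms
decay_const = math.log(2) / half_life_s  # per second
activity_bq = n_k40 * decay_const        # decays per second (becquerels)

print(f"One banana: roughly {activity_bq:.0f} Bq of K-40 activity")
```

A dozen-odd decays per second: easily detectable, completely harmless, and a nice reminder that natural radioactivity is everywhere.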
Even with any explanation that would hold some degree of credibility, you would need a source of fuel to provide the necessary heat source so that it could operate and release the required energy. This heat source would require precision controls to avoid catastrophic results. Do you see my dilemma? Given the size of our planet, no internal fuel or heat source could possibly sustain itself for millions of years, let alone billions of years given the mass and quantity required.
Radioactivity is a long-lasting source of heat for our planet that doesn’t require any “fuel” other than the naturally occurring unstable isotopes. As for the need for precision controls on the Earth’s temperature: a planet is a complicated thing, especially the earth with its active plate tectonics, huge oceans, and vigorous water cycle. The Earth’s climate system is made of countless interacting feedback loops that moderate the planet’s temperature. It’s possible to perturb the system with dire consequences (see anthropogenic climate change), but all in all it’s a pretty stable system.
The question goes on to suggest that since there is no way the Earth could maintain its internal heat, it must be young, and therefore creationists are right. Well, not really. As I explained above, simple decay of radioactive isotopes provides about half of the Earth’s heat and will continue to do so for billions of years. The rest is left over from the planet’s formation. When a planet forms, the material falling in to build it gives up its gravitational potential energy, and much of that energy ends up as heat. Likewise, once you have a protoplanet, the iron sinking through the molten bulk to settle in the core again releases gravitational potential energy and heats the planet. The only way for a planet to shed this heat to space is thermal radiation, which is a fairly inefficient way to transfer energy, so planets can stay warm for a very long time even without radioactive heating. In fact, Jupiter and Saturn still give off significantly more heat than they receive from the sun: they are so massive that an enormous amount of gravitational energy went into their formation, and their interiors are still contracting and generating additional heat!
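How much formation energy are we talking about? An order-of-magnitude check: the gravitational binding energy of a uniform-density sphere is (3/5)GM²/R, and dividing that by the Earth’s present-day surface heat flow (roughly 47 TW) gives the timescale that energy budget could sustain. The heat-flow figure is an approximate published estimate, not something I’ve measured:

```python
# Order-of-magnitude check: how long could Earth's gravitational
# energy of formation sustain its present heat output?
G = 6.674e-11      # m^3 / (kg s^2), gravitational constant
M = 5.972e24       # kg, Earth mass
R = 6.371e6        # m, Earth radius
heat_flow = 47e12  # W, approximate total surface heat flow

# Binding energy of a uniform-density sphere: (3/5) G M^2 / R.
# The real Earth is centrally condensed, so this underestimates
# the energy actually released during formation.
binding_energy = 0.6 * G * M**2 / R  # joules

seconds_per_gyr = 3.156e16
timescale_gyr = binding_energy / heat_flow / seconds_per_gyr

print(f"Binding energy: {binding_energy:.2e} J")
print(f"Could sustain today's heat flow for ~{timescale_gyr:.0f} billion years")
```

The formation energy alone could power the Earth’s entire present heat output for far longer than the age of the universe, so there is no energy-budget problem whatsoever with a 4.5-billion-year-old planet, even before you count the radioactive contribution.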