29th August 2011 (14:00)
This is an informational piece about a debate that is growing in intensity: namely, how much of the observed climate change in recent years is due to man-made carbon dioxide emissions, and how much is due to variations in the Sun. (And indeed, how much prehistoric climate change is due to the solar system's movement through the Milky Way galaxy.)
I have written it because it seems that this website is getting a far larger (and international) audience than was ever expected...
I have not hitherto wanted to engage in a debate about climate change, because whether or not it is correct we still have a problem generating energy: fossil fuel has a limited future whatever the truth about climate change may be. In this case, however, I decided it was worth presenting the argument, because if this new science is correct it means two things. Firstly, we need not be so coy about burning such fossil fuels as are left, since the implications are not as terrifying as we once thought. Secondly, climate change is likely inevitable anyway - we may in fact be heading into a colder period after all - and we had better be prepared for it.
Climate change - the basic science.
Before I go further, it is necessary to understand a little of the science behind AGW (Anthropogenic Global Warming).
Of course, when the scientists who developed the model did so, they looked at the sun's output, and concluded that the variations in the brightness of the sun were not enough to account for global warming. So far, so good.
In order to make the observed changes in climate fit the emissions figures, they had to introduce what, at school, we called a 'fudge factor'. That is, by themselves, the variations in CO2 were not enough to account for the global warming either. So the proposition was that small changes in temperature were amplified by other factors - say, increased water vapour or methane in the air (both effective greenhouse gases in their own right).
In the absence of any other known mechanisms, applying this 'fudge factor' made the climate change fit the CO2 increase. The fudge factor is technically known as lambda.
(Click on the word to get a turgid discussion of what it's all about.)
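To give a rough feel for what lambda does, here is a back-of-the-envelope sketch - an illustrative textbook calculation, not anything from an actual climate model. It uses the widely quoted logarithmic formula for the radiative forcing of a CO2 change, and treats lambda as the sensitivity that converts that forcing into a temperature rise; the specific numbers (0.3 and 0.8) are indicative values only.

```python
import math

# Illustrative only: a zero-dimensional textbook sketch of climate
# sensitivity, not any real climate model.

def co2_forcing(c_new, c_old):
    """Approximate radiative forcing (W/m^2) from a change in CO2
    concentration, using the standard logarithmic formula."""
    return 5.35 * math.log(c_new / c_old)

def warming(forcing, lam):
    """Equilibrium temperature change (K) for a given sensitivity
    parameter lambda (K per W/m^2)."""
    return lam * forcing

f = co2_forcing(560, 280)   # a doubling of CO2: roughly 3.7 W/m^2

# With no amplifying feedbacks, lambda is roughly 0.3 K per W/m^2,
# giving only around a degree of warming for doubled CO2...
print(warming(f, 0.3))

# ...whereas a 'fudged' lambda of around 0.8 (representing assumed
# water-vapour and other feedbacks) gives the far larger figure.
print(warming(f, 0.8))
```

The whole argument in this piece, in other words, comes down to whether the second number or something closer to the first is the right one.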
The alternative explanation
However, other people elsewhere were investigating something else: namely, that the sun does not just affect how much direct radiation reaches the Earth as heat. Sunspot activity also affects the Earth's magnetic field, and in so doing affects how many cosmic rays (radiation from nuclear events - supernovae and the like - all over the galaxy) reach the upper atmosphere, where they act to form radioactive isotopes like carbon-14 and beryllium-7 and -10, and also do other things to the chemistry there. These radioactive isotopes decay too rapidly to have been formed when the Earth was, so the inference is that they are constantly being created by cosmic rays. (Indeed, this is how carbon dating works: once carbon-14 is absorbed into organic material like wood, it starts to decay radioactively, and how much is left tells you when the wood was last growing.)
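The carbon-dating principle just mentioned can be sketched in a few lines. Given the half-life of carbon-14 (about 5,730 years) and the fraction of the original carbon-14 still present in a sample, the age falls straight out of the decay law. This is a simplified illustration - real carbon dating also applies calibration corrections for historical variations in cosmic ray intensity, which is exactly the signal the isotope records discussed here rely on.

```python
import math

C14_HALF_LIFE = 5730.0  # years, approximate

def carbon_age(fraction_remaining):
    """Years since the organic material stopped taking in carbon,
    from the fraction of its original C-14 still present.
    Simplified: real dating also applies calibration curves."""
    return -C14_HALF_LIFE / math.log(2) * math.log(fraction_remaining)

# Wood retaining half its original C-14 is one half-life old:
print(round(carbon_age(0.5)))    # 5730

# Wood retaining a quarter is two half-lives old:
print(round(carbon_age(0.25)))   # 11460
```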
Some scientists then noted, on a historical scale, that climate changes were associated with changes in the amount of radioactive isotopes laid down at the time, and a suspicion grew that variation in sunspot activity, and hence in cosmic ray intensity, might be affecting climate. Not only did this provide an alternative explanation for modern climate change, it also seemed to map neatly onto both the little ice age of the 17th and 18th centuries and the mediaeval warm period, when grapes grew in Britain and Greenland was free enough of ice for the Vikings to settle there for a couple of hundred years - hence the name 'Greenland'. Attempts to explain these using the CO2 model were, at best, strained.
Needless to say, this research was not received well by those who had tied themselves to the AGW bandwagon. Because if it were true, it meant that the lambda factor was much smaller than it had been fudged to be to fit the facts, and that instead of a global crisis, the extra CO2 in the air was actually a very small effect, and potentially not threatening at all - maybe half a degree or a degree of warming at most.
The research continued with a long-term experiment at CERN, whose accelerators can produce radiation similar to cosmic rays. This experiment was called CLOUD, and it showed - or its proponents say it showed - how cloud formation would be accelerated by such radiation acting on the chemistry of the upper air. Their thesis, then, was that more cosmic rays meant more clouds, more clouds meant less solar energy reaching the Earth and hence global cooling, and vice versa.
The AGW camp have replied with the contention that the particles produced were not big enough to form proper clouds.
Now, normally an academic spat of this nature - 'PhDs at Dawn' - is of no interest to anyone outside a narrow community. But this one has implications far beyond it, because huge and very expensive political decisions have been taken - including the whole business of sticking windmills in our Parish - on the basis that the lambda factor was big enough to cause such serious climate change that it was worth gambling the entire economy of the Western world on it being correct.
Yet if the CLOUDists (I have decided to call them that, for brevity) are right, not only is the effect of CO2 almost insignificant, but with sunspot activity falling overall (although heading into a new period of higher activity, after a very weak period since 1998 - the hottest year on record), it won't be getting very much hotter at all; in fact it is likely to get a bit colder in the next decade. Furthermore, the last time we saw this sort of low sunspot activity, it presaged what is known as the Maunder Minimum, which the CLOUDists say was responsible for the little ice age of the 17th and 18th centuries - leading to such well-known effects as skating and frost fairs on the Thames, and possibly (via childhood memories) the Dickensian 'White Christmas'...
This article in the National Geographic sums up what is happening to the sun... and makes, er, chilling reading!
If the CLOUDists are right, the biggest threat to humanity may turn out to be the Intergovernmental Panel on Climate Change, and all the politicians who believed that a single number was bigger than it really was, and all the companies that went along for the ride - and the fat profits...
See below for the story of the CLOUDists' science. It knocks the BBC's science programmes into a cocked hat, whether or not it's correct.