Uncharted Territory

February 24, 2010

Why the AMO Overshoots

Filed under: AMO, Complex decisions, Global warming, Reflections, Science, Sea ice — Tim Joslin @ 8:00 pm

I’ve had a bit of off-line feedback on my previous post Spin Snow, not Sea Ice, the AMO is Real!, so I thought I’d try to correct any misconceptions arising from my clumsy presentation.

1. I am only attempting to explain general climate trends, not annual variation in the weather. In particular, I am assuming that the SST (e.g. as measured by satellite) correlates with the heat stored in ocean surface waters (to 100-200m depth, say).  Hence I don’t model “heat” and “temperature” separately.  Over periods of less than a decade, the SST may be determined more by atmospheric variability (including cloud cover) than heat loss from the ocean.   Additionally, there will be different patterns of Atlantic SST variability at different latitudes.  (Since the underlying cycle is of more and less heat lost at high latitudes over decadal timescales, the idealised model would be of an alternately steeper and shallower temperature gradient from (steadily warming) low latitudes to high latitudes – though we can’t rule out heat transfer between the hemispheres as well).

2. Although I have used the term “AMO” (Atlantic Multi-Decadal Oscillation), this (i.e. apparently cyclic variation in the Atlantic sea surface temperature (SST)) is just one measure (another affected is the NAO/NAM, see previous post).  Since the Arctic exchanges water with the Pacific via the Bering Strait as well as with the Atlantic via the Fram Strait and Barents Sea, the mechanism itself requires another name, so perhaps “AMO” should be read as the Arctic Multi-decadal Oscillation!  I only modelled the Arctic and the Atlantic, but the Pacific waters cooled by flow of surface currents to the Arctic would be affected much the same as the Atlantic, so I don’t think the extra complexity is required for a proof of principle.

3. Which brings me onto the final point: I’m only attempting a proof of principle, in particular in my graphics.  All I was setting out to do was represent what I perceive to be the logical consequence of coupling between the temperatures of the Arctic and the North Atlantic and Pacific.

In actual fact, I suspect the heat exported to the Arctic varies with a higher power of the Atlantic temperature and not linearly.  The point is that less and thinner Arctic sea ice at the start of winter allows more cold deep water formation which is accompanied by the dispersal of more heat because there’s more of it and also because the surface water was initially warmer.  Introducing a square function leads (as well as to a more chaotic system) to a shortening of the AMO cycle in a warming world.  E.g.:
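In lieu of the chart, here is a minimal sketch of the quadratic coupling (Python standing in for my Excel). The square is applied to the NA → Arctic heat-export term; since the exact form isn’t pinned down above, the signed square n*abs(n) — which preserves the sign of cold anomalies — and all the coefficients and starting values are illustrative assumptions:

```python
# Square-law variant of the two-box sketch: heat exported to the Arctic
# grows with the SQUARE of the NA temperature rather than linearly.
# All coefficients and starting anomalies are illustrative assumptions.
def export_heat(t_na, k=0.15):
    # signed square preserves the sign of a cold (negative) anomaly
    return k * t_na * abs(t_na)

def run_square(years=160, cooling=0.95, arctic_to_na=0.06,
               arctic0=0.1, na0=0.3):
    arctic, na = [arctic0], [na0]
    for _ in range(years):
        a_prev, n_prev = arctic[-1], na[-1]
        arctic.append(cooling * a_prev + export_heat(n_prev))
        na.append(cooling * n_prev - arctic_to_na * a_prev)
    return arctic, na

# Under a square law, a 50% warmer NA exports 2.25x the heat, not 1.5x:
print(export_heat(1.5) / export_heat(1.0))   # 2.25
arctic, na = run_square()
```

The disproportionate response is the point: the drain strengthens much faster than the warming that drives it, which is what can shorten the cycle (and make it less regular) in a warming world.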

Any fool can produce an oscillation in a spreadsheet, so why do I think the AMO mechanism is real and important?

1. We keep being told that the Arctic is warming faster than predicted by the climate models. This means it is dissipating more heat than predicted – by radiation into space, by evaporating water that falls as snow or rain and so on.  The climate involves net heat gain at low latitudes, heat transport in the atmosphere and oceans and heat loss at high latitudes.  If the Arctic is warmer than would be expected for steady global warming, then what we’re going to get is unsteady warming (as in the 1930s-40s, see previous post).

2. The criteria exist for an oscillating system – the temperature of the Arctic depends on that of the North Atlantic (and the North Pacific) and vice versa (i.e. there is a negative feedback loop) and there are delays in the system.  These arise because the rate of surface water flow to the Arctic (and deep water flow back) is variable and adjusts only slowly.  There needs to be a relatively large temperature difference between the North Atlantic (NA) (please read North Pacific too) and the Arctic to generate a sufficiently strong current to cool the NA. As every MBA student knows (e.g. from the Beer Game) any negative feedback loop with delays results in an oscillating system.

The system round Antarctica is somewhat different – the coldest area is land and water can flow freely from warmer areas to colder ones (i.e. those with seasonal sea-ice and hence deep cold water formation). This is not to say there aren’t oscillations down there, just that they’re not the same (or, probably, as extreme).

3. The AMO mechanism is that, as the NA warms, the Arctic warms too (because there’s always a current from the NA), reducing the amount of insulating sea ice (and multi-year ice is thicker and a better insulator than first year ice) and therefore increasing its capacity to drain heat (by creating new ice and cold deep water) from the NA.  The critical point – the delay in the system – is that warming and cooling takes some years, so the Arctic will continue to warm even as it starts to cool the NA, and will cool (forming more multi-year ice) even as the NA starts to warm.

4. It seems to me – and my incredibly simplified modelling supports this – that the Arctic will keep warming until it cools the NA, however warm the NA gets (of course, the NA can also lose heat in different ways) – until, that is, the capacity of the Arctic to dissipate heat is reached, and ultimately the system breaks: when the Arctic gets so warm it can no longer generate an overturning circulation.

And once the Arctic has warmed enough to cool the NA, it will overshoot (this could already have happened in the current cycle if the summer sea ice minimum has already been reached), because the NA will still be warm enough to warm the Arctic even while it (the NA) is cooling, albeit at a slower and slower rate until the process reverses.  For similar reasons, the Arctic will also overshoot in the reverse phase, i.e. it will continue to cool even after the NA has started warming again.

5. A rough calculation suggests a net oceanic transfer of heat to the Arctic of 60TW or ~2*10^21J/yr [1], which luckily is compatible with the figures I calculated in my previous post The Earth is a Fridge.  Now, the IPCC estimates that the oceans have gained on average ~14*10^22J between 1961 and 2003 (including ~8*10^22J from 1993-2003) because of global warming (the blue bars are 1961-2003, the burgundy bars 1993-2003):

Heat gain by global warming (IPCC Fig TS.15)

That is, the oceans have been gaining heat at a rate of around 3*10^21J/yr on average (and around 8*10^21J/yr from 1993-2003).  Let’s attribute 1 or 2*10^21J/yr to the NH which after all is mostly land.
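These rates (including the 60TW figure above) are easy to check; Python is used here purely as a calculator:

```python
# Quick checks of the heat-flow arithmetic quoted above.
SECONDS_PER_YEAR = 3.156e7

# 60 TW of oceanic heat transport to the Arctic, expressed in J/yr:
arctic_transport = 60e12 * SECONDS_PER_YEAR   # ~1.9e21, i.e. ~2*10^21 J/yr

# IPCC ocean heat gain: ~14*10^22 J over 1961-2003 (42 years)...
rate_1961_2003 = 14e22 / 42                   # ~3.3e21 J/yr
# ...and ~8*10^22 J over 1993-2003 (10 years):
rate_1993_2003 = 8e22 / 10                    # 8*10^21 J/yr

print(arctic_transport, rate_1961_2003, rate_1993_2003)
```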

It seems to me at least plausible that an overshooting strengthening of the AMO by more than 50% from its 2*10^21J/yr average – and remember it will be strongest when there is no sea-ice at all in summer, which is still some way from the case – could pump heat out of the northern oceans at a faster rate than they are gaining it by GW (this is all very approximate, proof of principle stuff, but note that a 50% volume increase in oceanic circulation in the positive phase of the AMO would be 50% more water containing more heat – conceivably 4*10^21J/yr, perhaps, rather than 2*10^21J/yr).  That is, the AMO could create some cooling for a period.  Of course, this would be followed by even faster warming, then an even stronger reaction, until the system reaches its capacity as I mentioned earlier, after which we’d just see steady warming.

I conclude with a final figure from the IPCC (panel (a) is mislabelled, the graph shows just the minimum sea-ice extent each year, not the anomalies in it):

Arctic and Antarctic sea-ice anomalies (IPCC Fig. TS.13)



[1] “Modelling Arctic Ocean heat transport and warming episodes in the 20th century caused by intruding Atlantic Water”, Wang Jia et al, Chinese Journal of Polar Science, Dec 2008.


February 23, 2010

Spin Snow, not Sea Ice: the AMO is Real!

How unfortunate. Back in 2000, yes, that’s not a typo, in 2000, the Independent wrote that:

“According to Dr David Viner, a senior research scientist at the climatic research unit (CRU) of the University of East Anglia, within a few years winter snowfall will become ‘a very rare and exciting event’.

‘Children just aren’t going to know what snow is,’ he said.”

In Viner’s defence, he did go on to say that rare snow events would cause chaos.

It’s No Joke

For a long time it’s seemed to me that one problem global warming (GW) is likely to throw up is that snow events, like other forms of precipitation, will become more extreme. That is, when it does snow, it’ll be heavier.

A commenter on one of my recent posts suggested I go and do some statistical analysis on temperature measurement data to see if trends are significant. In actual fact, about 5 years ago, I did exactly this with data on snowfall. If I recollect correctly, I found that there was a statistically significant trend in the number of heavy snow days (above a particular depth) in the middle of winter (i.e. not in months when, due to GW, some of what would otherwise have been snow might fall as rain) in the data I found on the net for a particular Rocky Mountain ski resort. If I come across my notes I’ll bring the analysis up to date.

Here’s the real concern. A few decades down the line, the planet will be a lot warmer and we’ll be seeing much heavier precipitation in some regions. Some of this will be snow. Furthermore, there’s always the chance of a cold snap, for example, when a volcano goes off (and we really should be worrying more about this climate risk, IMHO – more another time, maybe). Or after a geo-engineering accident (sorry, couldn’t resist). At the start of the cold event at least, the oceans will still be warm, because of stored cumulative GW heat, and they will therefore continue to pump moisture into the atmosphere. But the dust shroud will rapidly cool land areas, so that some places used to dealing with just heavier rain suddenly find themselves trying to cope with a foot or two of the white stuff.

It’s a shame climate scientists haven’t been warning people about the vulnerability of flat roofs to heavy snow.

Skating on Thin Ice

On the other hand, there’s been a worrying tendency over the last few years to treat the continually diminishing amount of Arctic sea ice each year (at the minimum extent in September) as a GW canary in the coal-mine, like glaciers.

It would have been better to stick to glaciers. Because changes in Arctic sea ice may well be part of a natural cycle. Of course, there’s an underlying warming trend tending to reduce the amount of Arctic sea ice. But if and when the natural cycle starts to dominate, sceptics will have another field day.

It’s worse than this. The cycle – which is called the Atlantic Multi-Decadal Oscillation, or AMO for short – could affect the temperature of the entire Northern Hemisphere (NH). [See previous post Musings of the Hemispheres – there may be similar processes in the SH, but I’m not going to discuss those just now].

Before I go on, there was a fuss a while back – serious stuff: letters to the Guardian editor, that kind of thing – when a Professor Latif was accused of explaining GW with AMO. His position, like mine, is that both GW and AMO affect the climate. I just want to make it clear that I’m with the Professor on this, even if simplistic sceptic brains find this position a logical contortion.

Evidence for the AMO (1): IPCC Data

Consider the following graph from the IPCC (AR4, the most recent report):

Global mean surface temperature relative to 1901-50, compared to climate models (IPCC Fig TS.23)

What gets me about the IPCC data is the anomaly around 1940. The average temperature was simply too high, and this is not adequately explained (if it was, I guess the models would be corrected).

We can drill down a little further:

Continental-scale breakdown of actual and modelled temperatures compared to 1901-50 (IPCC Fig TS.22)

Here we see that by and large the models represent land temperature fairly well, but that ocean temperatures were outside what are presumably intended to be some kind of confidence limits – for what looks like around an entire decade (just before mid-century).

This is not a very satisfactory state of affairs.

Note from Fig TS.22 above that the land temperature range over the past century has been around 1C, and that of the oceans perhaps 0.7C.

Consider what’s happened in the North Atlantic:

AMO from 1850-2005 (temperature relative to 1961-90) (IPCC Fig. 3.33)

The North Atlantic sea surface temperature (SST) (top graph) has increased by nearly 1C since its lowest point soon after the turn of the 20th century.

A 1C increase in ocean temperature is unsustainable. Land has a lower heat capacity (i.e. you have to put in less heat for a 1C temperature rise) than ocean, so must warm faster. The North Atlantic heat will have to dissipate.
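A back-of-envelope comparison per unit area makes the point; the layer depths and material properties below are rough textbook assumptions of mine, not figures from the post:

```python
# Rough heat needed to warm one square metre of surface by 1C:
# a ~100 m ocean mixed layer vs a ~2 m thermally active soil depth.
# All values are order-of-magnitude textbook assumptions.
ocean_per_degree = 1025 * 3990 * 100   # density (kg/m3) * specific heat (J/kg.K) * depth (m)
land_per_degree = 1600 * 800 * 2       # dry soil, similarly
ratio = ocean_per_degree / land_per_degree
print(ratio)   # order of 100: the ocean must absorb far more heat per degree
```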

Evidence for the AMO (2): The Historical Record

If I were a climate specialist about to make a song and dance over a particular piece of evidence for GW, I think I’d make pretty sure the phenomenon in question hadn’t happened before.

It just so happens that the area of Arctic sea ice has shrunk dramatically before, and not so long ago.

Yep, you’ve guessed it, the Arctic warmed from around 1920 to 1940. Here’s the Abstract of a paper The early twentieth century warm period in the European Arctic that looks kosher – it must be, it costs $42! A site, www.arctic-warming.com, seems to be devoted to the issue (particularly of warming around Spitsbergen in 1918-22) and cites some other papers discussing the 1920-40 episode, “one of the most spectacular climate events of the 20th century”. There’s even a book about the event.

None of these sites offer a clear explanation for the Arctic warming, so I’m going to have a bash.

Explaining the AMO

The point is that loss of Arctic sea ice – absence in summer and thinning year round – is not just a symptom of warming. It is part of a cyclic causal mechanism.

As I pointed out in a previous post, The Earth is a Fridge, the less sea ice there is at the start of winter (the Arctic ice extent is at a minimum around mid-September!), the more heat the Arctic waters can lose to the atmosphere and hence into space during the winter. Water covered by ice loses little heat, because ice is an insulator, whereas in open water the process of freezing is itself an important mechanism for losing heat.

Clearly the Arctic waters will lose most heat in winter when there is no summer ice. In a steadily warming world, you might expect the summer ice to disappear first, at which point the Arctic would have reached its maximum effectiveness in getting rid of heat (imported in currents from lower latitudes), and then the maximum extent of ice each year would gradually reduce.

But there is an oscillation in the system.

Modelling the AMO

At first I was going to simply draw a curve on a piece of paper and scan it in, but my better half is a bit of an Excel whizz and persuaded me to do something a bit more sophisticated.

It was astonishingly easy.  Here’s the result, first without taking account of global warming (GW):

I can’t emphasise enough how easy it was to produce this graph. It’s hugely simplified, including as it does just two ocean masses and nothing else and making no attempt to distinguish between heat and temperature, and between temperatures at different times of year.  But I don’t see why it isn’t qualitatively valid – it produces the asynchronous sinusoidal temperature curves I’d deduced anyway, but with the added theoretical basis of generating them by heat exchange between the Arctic and the NA.  And since I’ve tied the temperature curves very roughly to historic data, the timescale of future temperature changes could conceivably be roughly correct.  The fact that what I wanted to show drops so easily out of the spreadsheet suggests some underlying veracity – I claim no more than that – at least to me.  End of disclaimer.

All I’ve done is calculate the temperature of the Arctic (purple line) in a given year as its temperature the previous year (times a cooling factor) plus the North Atlantic (NA) temperature the previous year times a factor (15% in this instance).  All I’m assuming is that the warmer the NA is, the warmer the Arctic will be.  After all, we know surface water flows from the NA to the Arctic.

So far, so simple.  The next bit is the critical point.

I’ve calculated the temperature of the NA (green line) similarly, but included a negative feedback.  In the model, the NA temperature is equal to its temperature the previous year (times a cooling factor) minus the Arctic temperature the previous year times a factor (6% in this instance, less than the 15% for the reverse case because the NA is bigger than the Arctic).

The minus in this calculation says that the warmer the Arctic is, the more NA heat it can absorb and disperse ultimately into space.    Remember, my argument is that the thinner and less extensive the Arctic ice, i.e. the warmer it is on average over the year, and in particular at the start of winter, the more NA heat it can disperse over the year, but in particular in winter.  [A more complex model could try to model the Arctic temperature at different times of year].

Obviously I’ve adjusted the numbers and starting conditions to fit the graph roughly to the historical record.  (The anomaly on the vertical axis is arbitrary, 0 is intended to be the long-term equilibrium – if you start with 0 for both anomalies, the graph is flat).
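The whole recurrence fits in a few lines of code. Here is a Python stand-in for the spreadsheet; the 15% and 6% factors come from the description above, while the cooling factor (0.95) and the starting anomalies are guesses of mine:

```python
# Two-box sketch of the spreadsheet model described above:
#   Arctic(t) = cooling * Arctic(t-1) + 0.15 * NA(t-1)     (warm NA warms Arctic)
#   NA(t)     = cooling * NA(t-1)     - 0.06 * Arctic(t-1) (warm Arctic cools NA)
# The 15% and 6% factors are from the post; the cooling factor and the
# starting anomalies are illustrative guesses.
def run_model(years=160, cooling=0.95, na_to_arctic=0.15, arctic_to_na=0.06,
              arctic0=0.1, na0=0.3):
    arctic, na = [arctic0], [na0]
    for _ in range(years):
        a_prev, n_prev = arctic[-1], na[-1]
        arctic.append(cooling * a_prev + na_to_arctic * n_prev)
        na.append(cooling * n_prev - arctic_to_na * a_prev)
    return arctic, na

arctic, na = run_model()
# The delayed negative feedback oscillates: the NA anomaly changes sign
# several times over the run.
sign_changes = sum(1 for x, y in zip(na, na[1:]) if x * y < 0)
```

With these guessed values the oscillation has a period in the region of sixty years – at least in AMO territory – though it is damped unless the factors are tuned; and, as noted, starting both anomalies at zero leaves the lines flat.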

As well as the Arctic and NA temperatures I’ve included in my schematic an indication of the Northern Hemisphere (NH) temperature, produced by simply adding the NA and Arctic values (yellow line).  This shows a peak in 1940, which is what we’re trying to explain, as well as a peak around 2005 and, as predicted by Professor Latif, subsequent cooling for quite some time.

The good news is that we won’t have to wait too long to find out whether the AMO is real.  The bad news is, that, if it is, it’ll be like putting rocket fuel in the sceptic bandwagon.

I thought I’d go a little further and see if my model predicts anything else.  I’ve therefore included an “Arctic Oscillation” (AO) (blue line) which I’ve calculated by subtracting the NA temperature from the Arctic temperature.  The AO – represented by real-world indicators such as the North Atlantic Oscillation (NAO) and the Northern Annular Mode (NAM) – is an atmospheric phenomenon which correlates with the nature of NH winters.  My logic is that the higher the temperature of the Arctic compared to the NA, the lower the air pressure will be over the Arctic in comparison with the NA, which is in principle what the NAO and NAM measure.

Anyway, here, again from the IPCC, is the actual historical record of the NAO/NAM:

NAO/NAM indices (IPCC Fig 3.31)

Compare these real-world measurements with my model which (blue line) predicted a positive AO from 1900 to the 1930s and again from the 1960s to around 2000.  Could they possibly fit together?

Future temperatures, Global Dimming and Global Warming

I have to say I’m rather alarmed that, based on the timescales of the historic 20th century AMO cycle, my model shows temperatures falling for another 15 years.  I thought I’d better factor in a bit of global warming, so I played around in Excel a bit more:

This time I’ve allowed for GW by adding an arithmetically progressively larger term into the NA and Arctic temperatures each year.  As in the previous figure, the vertical anomaly scale is entirely arbitrary and not intended to map to real temperature deviations.

I’ve also extended the model to 2050 and calculated the NH temperature (yellow line) by adding the NA temperature (green line) to a halved, rather than the whole, Arctic temperature (purple line), since the NA is bigger than the Arctic.  Clearly the temperature cycles still exist, it’s just that the AMO is imposed on an underlying trend, so both peaks and troughs in the temperature curves are higher.
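A Python sketch of this variant, standing in for the spreadsheet: the 15%/6% factors and the halved Arctic contribution to the NH figure are as described in the post, while the cooling factor and the size of the arithmetically growing GW increment are illustrative guesses of mine:

```python
# GW variant of the two-box sketch: each year a term that grows
# arithmetically (0, d, 2d, 3d, ...) is added to both the NA and Arctic
# anomalies, and the NH anomaly is the NA anomaly plus HALF the Arctic
# anomaly, as the post specifies. The 15%/6% factors are from the post;
# the cooling factor and increment d are illustrative guesses.
def run_model_gw(years=200, cooling=0.95, na_to_arctic=0.15, arctic_to_na=0.06,
                 arctic0=0.1, na0=0.3, d=0.0005):
    arctic, na = [arctic0], [na0]
    for t in range(years):
        gw = d * t                       # arithmetically growing GW term
        a_prev, n_prev = arctic[-1], na[-1]
        arctic.append(cooling * a_prev + na_to_arctic * n_prev + gw)
        na.append(cooling * n_prev - arctic_to_na * a_prev + gw)
    nh = [n + 0.5 * a for a, n in zip(arctic, na)]
    return arctic, na, nh

arctic, na, nh = run_model_gw()
# The AMO cycle is now superimposed on an underlying upward trend.
```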

In this very rough calculation, we still see NH temperatures declining for a couple of decades.  Worrying.

I should add that the usual explanation for the cooling period from around 1940 to 1970 is “global dimming”, i.e. the blocking of sunlight by industrial pollution.   The AMO hypothesis suggests that at least some of this cooling was caused by a natural cycle.

Next Steps

A perfect computer model would accurately represent sea ice melting and freezing and the resultant exchanges of heat between the sea and the atmosphere and effect on oceanic circulation.  It would therefore predict long-term natural climate variability such as may – and I stress “may” – be caused by the AMO.

Current climate models do not correctly retrodict (i.e. predict known data) the warming up to 1940 and they have under-estimated the Arctic warming that has occurred over the last decade or so.

It seems to me that – prior to the IPCC’s next report on the science, AR5 – serious effort needs to be made to evaluate the evidence and theoretical basis for an AMO, and take account of it in projections of the future climate.

I used to be highly sceptical of long-term natural climate variability, but now I’ve realised there could be feedbacks between Arctic ice-melt and NA temperatures, I’m suddenly convinced.  I’d like to see some serious modelling of the AMO and similar decadal variability that logically should also occur in the SH.

Maybe the effect of GW will be to completely swamp the natural AMO.  But I’d like to see proof of that.

A failure to explain the AMO would lead to increased climate scepticism and a loss of political will to deal with GW.  We could be left totally unprepared for a steep rise in temperatures starting in a decade or two’s time.

February 17, 2010

The Telegraph’s Sensibly But Mysteriously Changed Climategate Story

Filed under: BBC, Global warming, Media, Science, Science and the media — Tim Joslin @ 6:51 pm

Now I am confused. Just by chance I noticed that a link in my post a couple of days ago is now broken.

I quoted the Telegraph as saying:

“In an interview for the BBC’s website, Professor Jones also conceded that global temperatures may have been higher during the medieval warm period [MWP] than they are now – suggesting that climate change may not be caused by human activity.

He admitted that there has been no ‘statistically significant’ global warming since 1995, but said this was a blip in a general trend of rising temperatures.” [My abbreviation]

in a story at:


Clicking this link now results in the dreaded 404 page not found.

A bit of Googling, though, does find a story at:


“He said he stood by the view that recent climate warming was most likely predominantly man-made.

But he agreed that two periods in recent times had experienced similar warming. He also said that the debate had not been settled over whether the Medieval Warm Period was warmer than the current period.

The statements are likely to be welcomed by people sceptical of man-made climate change who have felt insulted to be labelled by government ministers as flat-earthers and deniers.”

“Insulted” now, are we? Diddums.

The change to the Telegraph story, if that’s what it is, is welcome, I suppose. Trouble is, the second story claims it was published online at 9:15am on Sunday 14th Feb which, if true, implies an impossible timeline. Since I was blogging on Monday 15th, it’s possible that one story has been deleted – maybe retracted – leaving another covering much the same material. (Or maybe they kept the time of the original story. Who knows? Who knows anything? /sigh). Anyway, I do wonder exactly what hundreds of thousands read over their Valentine’s Day breakfast in the print edition. If the story I found originally did appear in print, I wonder if the Telegraph has published a retraction. Maybe I’ll try to find out!

Harrabin’s Hamfisted Interview

In a post earlier this week, I traced back from an Express headline to a BBC Q&A with Professor Phil Jones. In fact, I only changed my title at the last minute when I realised the interview, and not just Express misreporting, was a large part of the problem.

The UK sceptic-fuelled media storm is reverberating around the world, for example at Realclimate. My comment is #50, here, but after writing it I started to wonder where Harrabin’s questions had come from. At some point, I noticed Harry Hodge’s comment #53 on the Realclimate Whatevergate (Lol) piece:

“Roger Harrabin’s (BBC’s environment correspondent) reputation is undergoing a sea change. He has moved from someone perceived as being an unimpeachable source of expert analysis to someone running around trying to defend his reputation and restating the way he will report in the future (because of the power of the blogosphere). He is in contact with the sceptic blogs and, it would appear, putting their questions to Phil Jones.” [my stress]

Too right he is. Focussing on Harrabin’s interview rather than the subsequent misleading Express reporting, we notice that the introductory paragraph – which I previously blipped over – says that:

“The BBC’s environment analyst Roger Harrabin put questions to Professor Jones, including several gathered from climate sceptics.” [my stress again]

I think I’ve already covered adequately the ridiculous question about the definition of the word “unprecedented”.

“There is a debate over whether the Medieval Warm Period (MWP) was global or not. If it were to be conclusively shown that it was a global phenomenon, would you accept that this would undermine the premise that mean surface atmospheric temperatures during the latter part of the 20th Century were unprecedented?”


But what’s really nagging at me is the question:

“Do you agree that from 1995 to the present there has been no statistically-significant global warming[?]”

As Jones pointed out, the warming since 1995 is not quite statistically significant, because it’s such a short period.

So why ask about the warming since 1995?

Why not ask about the warming since 1994, or 1990 or any other earlier year, which would probably pass the 95% level conventionally used to indicate statistical significance?
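Why the start year matters is mechanical: for a given trend and scatter, the t-statistic of an OLS slope grows steeply with the length of the record. A sketch of the arithmetic – the 0.1C year-to-year residual scatter is an assumption of mine, and serial correlation in annual temperatures (which weakens significance further) is ignored:

```python
import math

def slope_t_stat(trend, resid_sd, n):
    """t-statistic for an OLS slope over x = 0..n-1:
    t = trend / SE, with SE = resid_sd / sqrt(Sxx) and Sxx = n*(n*n - 1)/12."""
    sxx = n * (n * n - 1) / 12
    return trend * math.sqrt(sxx) / resid_sd

# Same trend (0.12C/decade, the figure Jones quotes) and the same assumed
# year-to-year scatter of 0.1C; only the record length changes. The
# t-statistic grows roughly as n**1.5, so a few extra years make the
# difference between "not quite significant" and clearly significant.
t15 = slope_t_stat(0.012, 0.1, 15)   # 15 annual values, i.e. 1995-2009
t30 = slope_t_stat(0.012, 0.1, 30)   # a 30-year record
print(t15, t30)
```

With these assumed numbers the 15-year window lands right around the conventional threshold of about 2 – which chimes with Jones’s “positive, but not significant… only just”.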

I can think of no other reason than to draw an answer like: “No, there’s been no significant warming since 1995”.

It might also be pertinent to point out that 1995 was quite a warm year (see the graphs in my post when the 2009 data appeared a while back).

I presume that during the forthcoming General Election campaign politicians will be allowed to send questions for their opponents to the BBC: “When did you stop beating your wife?”; “Are you over your drink problem now?”.

I’m about to register a complaint with the BBC about Harrabin’s interview and ask precisely who provided each question asked. I’ll let you know what they say.

February 15, 2010

A Strangely Unhelpful Interview, Deniable Denial and the Daily Express

Filed under: Complex decisions, Global warming, Media, Reflections, Science, Science and the media — Tim Joslin @ 10:34 pm

I passed a news-stand this afternoon and couldn’t help noticing that today’s front-page lead in the Express is The Great Climate Change Retreat. Yes indeed:

“There has been no global warming for 15 years, a key scientist admitted yesterday in a major U-turn.”

They went on:

“Professor Phil Jones, who is at the centre of the ‘Climategate’ affair, conceded that there has been no ‘statistically significant’ rise in temperatures since 1995.”

which, of course, is not quite the same as “no global warming”.

The Express then went on to tell us how it’s all explained by the urban heat island effect, as I mentioned earlier. Breaking news: more on the urban heat island FUD has just appeared on the Guardian’s site.

The report in the Express makes absolutely no sense when you read, as I suggest you do, what Phil Jones actually said to the BBC:

“Do you agree that from 1995 to the present there has been no statistically-significant global warming[?]

Yes, but only just. I also calculated the trend for the period 1995 to 2009. This trend (0.12C per decade) is positive, but not significant at the 95% significance level. The positive trend is quite close to the significance level. Achieving statistical significance in scientific terms is much more likely for longer periods, and much less likely for shorter periods.”

which, of course, is not quite the same as “no ‘statistically significant’ rise in temperatures”.

Jones noted that the “positive trend is quite close to the [95%] significance level”, so I guess that means there’s a 90% probability that we’ve had a warming trend since 1995 rather than random fluctuations. 90% is “very likely” in IPCC-speak. As in it’s “very likely” that global warming is caused by human activity. Not really a very helpful question from the BBC. I mean, they could have asked “How likely is it that…” which would not have drawn the sort of answer justifying a front-page lead in the Express.

The story in the Telegraph [which had vanished or been significantly changed by 17th Feb – see later post] focuses on Professor Jones’ disorganised data, but ends with the comments:

“In an interview for the BBC’s website, Professor Jones also conceded that global temperatures may have been higher during the medieval warm period [MWP] than they are now – suggesting that climate change may not be caused by human activity.

He admitted that there has been no ‘statistically significant’ global warming since 1995, but said this was a blip in a general trend of rising temperatures.”

That’s all right then.

Actually it’s not.

The last sentence implies temperatures haven’t been rising since 1995, which, as we just saw, is not the case.

And the comment about the MWP is downright misleading. What Jones actually said was:

“There is a debate over whether the Medieval Warm Period (MWP) was global or not. If it were to be conclusively shown that it was a global phenomenon, would you accept that this would undermine the premise that mean surface atmospheric temperatures during the latter part of the 20th Century were unprecedented?

There is much debate over whether the Medieval Warm Period was global in extent or not. The MWP is most clearly expressed in parts of North America, the North Atlantic and Europe and parts of Asia. For it to be global in extent the MWP would need to be seen clearly in more records from the tropical regions and the Southern Hemisphere [SH]. There are very few palaeoclimatic records for these latter two regions.

Of course, if the MWP was shown to be global in extent and as warm or warmer than today (based on an equivalent coverage over the NH and SH) then obviously the late-20th century warmth would not be unprecedented. On the other hand, if the MWP was global, but was less warm than today, then current warmth would be unprecedented.

We know from the instrumental temperature record that the two hemispheres do not always follow one another. We cannot, therefore, make the assumption that temperatures in the global average will be similar to those in the northern hemisphere.”

In other words, he didn’t really say anything! This is a case of stupid question, stupid answer. All that’s being asked is: if x had happened before, would x happening again be unprecedented? Of course not, since all we are really discussing is what the word “unprecedented” means: “without precedent” or “hasn’t happened before”.

The BBC seems to think an interviewer’s job is to ask trappy questions, and the one about the MWP is just crass. It’s no surprise Jones’ answers have been picked up by the sceptic-inclined press. All in all a shocking public disservice from the Beeb.

The Times also has a piece today, on the reopening of the urban heat island “debate” rather than Jones’ interview, but I include it in this little round-up because it’s a classic of its kind. Titled “World may not be warming, say scientists”, the first 80% or so is devoted to the denialist campaign about the validity of the temperature data record. Only at the end does it give a couple of quotes to mainstream scientists. Deniable denial perhaps.

It seems to me that maybe the Telegraph and the Times are slightly uncomfortable coming out as sceptics, but are happy to let the denialists buddy-breathe their oxygen of publicity.

All this hot air surfaced before Copenhagen and is carrying on while the US, internally, and the world try to rescue something from the fiasco. But why is the UK press taking the lead in pushing sceptic FUD?

I just wonder if it’s got something to do with the forthcoming General Election. Cameron can’t be seen to deny the science, but I bet scepticism has grass-roots appeal on the right. Painting the whole thing as a dodgy dossier to raise fuel-prices and taxes might well suit the Tories. Who cares about the polar bear as long as we get our man into Downing Street, eh!

NOTE (18/2/10): This post seems to have “crossed” with Realclimate’s discussion of the same topic, but as covered by the Mail in an article titled Climategate U-turn as scientist at centre of row admits: There has been no global warming since 1995 (dated, or at least last updated, Sun 14th Feb, 5:12pm). I wonder how much of this media storm was accidental and how much choreographed? The timing (breaking on a Sunday), proliferation and similarity of the articles suggests a sophisticated media “PR” operation. Or did everyone else simply follow one paper? – the Telegraph appears to have got in first.

Fixing the IPCC (and the Motley CRU) Part 2

[Oops, this post initially appeared dated Feb 10th because that’s when I created the draft it’s based on. Deleting and reposting as it’s now Feb 15th and more to the point so that Part 2 appears after Part 1!]

As promised a few days ago, I’m now delving into what has gone wrong with climate science.

The “CRU hack” and the Guardian Investigation

It all started to go wrong with the email leak from the Climate Research Unit (CRU) at the University of East Anglia (UEA). The issues have recently been thoroughly analysed by Fred Pearce in the Guardian.

Hitting the ground running with a front-page lead, Fred reported in Part 1 of his investigation how Phil Jones, the currently suspended head of the CRU, had, back in 1990, published a paper – referenced in 2007 in the latest IPCC report, known as Assessment Report 4 or simply AR4 to its friends – which included data from some Chinese weather stations. The problem is that the precise location of these weather-stations is now uncertain. This is important as an increasing urban heat island effect could be mistaken for a warming climate. Independent climate scientists wanted the Chinese data and CRU couldn’t supply it. It’s no surprise that we’re now being asked to believe that the urban heat island effect is a general problem. In actual fact the issue has been debated for decades and the data corrected. That’s why the missing information about the Chinese data is such a problem in the first place.

The first problem is that data is not being made freely available. There is now general agreement that it should be, otherwise results of analyses are simply not reproducible.

Part 2 of the investigation showed how the emails revealed alleged attempts to prevent certain papers from being published in peer-reviewed journals and to exclude papers from certain journals from the IPCC process.

This second problem is entirely different to the first, and not so easily solved. In effect, scientists are being asked to endorse papers they don’t believe in – a dilemma for which there is no obvious resolution. One has to have some sympathy for Phil Jones, although anyone who puts “HIGHLY CONFIDENTIAL” in the subject line of an email should obviously be fired on the spot for gross stupidity.

The third part of Fred’s investigation looked at how Freedom of Information (FoI) requests were handled by the CRU. What’s really needed is the attention of a great satirist, but to summarise the madness, there were at least three aspects that caused friction, whether or not there was willingness to share data in the first place:
– the volume of FoI requests became onerous for the scientists involved – why apparently no-one thought of simply appointing an administrator to deal with them is anyone’s guess.
– the FoI requests asked for code used to analyse data as well as the data itself. Not only did the scientists involved regard this as their intellectual property (and the result of thousands of hours of work), it defeats the object for it to be released. It’s important for others to be able to access data and analyse it independently, but independently is the operative word. Using a slight variant of their software is not independent and could simply reproduce the same errors.
– the FoI requests asked for emails. The ones subsequently leaked. But the value of these is entirely contingent. They represent recorded private conversations that could almost as easily have been carried out verbally, with no record, or for that matter by coded text messages between untraceable mobile phones. Obviously it’s sensible to be careful what you write in emails because they may become public, but for the law to specifically allow them to be requested under FoI will, in the long-run, simply inconvenience those affected, who will be unable to have private conversations in the most convenient way. If their emails are to be treated as public property, surely the next logical step is for the spooks to follow climate scientists about and record their every word.

The third problem is a tricky one – because one response to the IT revolution has been to implement a raft of poorly drafted and generally over-specified laws relating to information, instead of the minimum necessary. Scientists can already publish as much of their reasoning as they wish, but beyond that the only aspect of scientific work that we should insist is made public is the raw data. (Though we must insist on seeing all the raw data – including when, where and by whom it was collected).

The final part of the Guardian investigation looked into how the emails were leaked. Interesting, but not relevant to a discussion of how to fix the process.

The latest twist is that Professor Jones claims to have “lost track” of data (though it’s not clear exactly what he’s referring to). I’d say his excuse seems reasonably plausible. The FoI Act amounts to retrospective legislation, after all.

Nevertheless, the sloppiness is inexcusable. If scientific results are not reproducible, they are worthless. Climate studies are based on large amounts of historic data, generally collected by third parties. We’re not talking about personal lab notebooks or electronic data collected by the researcher (though these should also be retained). To ensure reproducibility we need some separation of responsibility between archivists and researchers:
– data collectors report their measurements (light snow here in London just now!) to anyone that wants it.
– these are collated by archivists together with the vast existing database of historic data and published (open source if they’re to be useful).
– to be valid any papers must specify precisely what data they reference and how it has been analysed.

Nothing else is needed. Inspecting computer code is not necessary. Far better for several independent teams to analyse the same data. If source code is released – and open source development of climate models may well make sense – then we need to be cautious. We might end up with the same underlying code in all the models, resulting in the same errors. We’re not trying to make a computer operating system, which just has to be good enough. We need the right answer.

And private email correspondence between scientists has no bearing at all on whether their results are valid.

February 12, 2010

Fixing the IPCC (and the Motley CRU): Part 1

Last night’s BBC News at 10 report on the University of East Anglia (UEA) investigation of the affair of the leaked emails made it seem like the climate scientists were on trial.

What a travesty (as someone said recently).

Just as with the financial crisis, where it’s easier to blame the bankers than analyse what’s wrong with the wider system, we have to look beyond the individuals involved. Why did they behave as they did? After all, they weren’t in line for massive bonuses.

Climate change science is indeed in crisis. There are various problems – which I’ll examine in detail in subsequent posts – but by far the most significant is the way findings are being evaluated. Rather than indulge point-scoring by “sceptics” we should simply go back to basics and demand testable predictions. And the “sceptics” should go and make their predictions as well.

Let’s get everyone’s eyes back on the ball.

The first part of the fix is therefore to take the entirely reasonable step of excluding “deniers” from the process – including from IPCC. The model we should look to is that of “splits” in open source software development projects. But we should also draw lessons from the history of science.

Most scientists, let alone the general public, have only a rudimentary understanding of how science works. At best they know it’s not just a process of generalising from accumulated facts (known as inductive reasoning in the trade). Rather, hypotheses can be falsified. And, indeed, the idea that an experiment or observation can prove a general claim to be false is one thing that distinguishes science from other forms of knowledge.

But – my favourite word again – in the real world, theories are complex beasts. Sometimes – as in the example of starlight bending around the sun according to Einstein’s prediction and not Newton’s – one crucial test supports one theory over another. Usually, though, theories can be adjusted to explain unexpected data. And there’s no way of knowing, except with hindsight, whether the adjustment was valid or the whole theory should have been thrown out.

It’s therefore possible to have what the philosopher Imre Lakatos called “competing research programmes”. Thomas Kuhn expressed much the same idea when he discussed “paradigms” in his famous work, The Structure of Scientific Revolutions, but I mentioned Lakatos because he – correctly in my view – allows for more than one theory existing at the same time, whereas Kuhn only conceives of one “paradigm” succeeding another.

Kuhn, in particular, stresses the social side of science. His paradigms are characterised by specific methods, texts and so on (in fact the philosopher Margaret Masterman famously identified 21 distinct senses in which Kuhn used the word “paradigm”!).

In particular, it should be stressed that paradigms (or research programmes) are incommensurate, that is, they are based on concepts that have no meaning outside the paradigm.

At present what we are asking climate scientists to do is to work as a team with those who don’t simply disagree with them on some technical point, but also do not share their basic assumptions, their very culture.

The IPCC should give up on forcing mainstream climate scientists to field members of the opposing team in their line-up. Let them decide whose research is worthy of inclusion and whose isn’t. If an article appears in a journal that follows peer-review procedures that doesn’t in itself prove it has special value. It could still be misleading. Or outright garbage.

There are some similarities to open-source projects which occasionally split when one group has a different vision to another – leading to two versions of the browser, operating system or peer-to-peer file-sharing engine. Obviously, splitting is undesirable, because it spreads resources more thinly, but it is sometimes unavoidable.

If necessary, the UN should be prepared to fund more than one research programme. If a bunch of people want to go off and prove that the main way climate is controlled is by cosmic rays, then good luck to them. Let them publish a minority report.

If there are multiple research programmes, let’s see who puts their name to each of them. And let’s see which most impresses young research scientists entering the field.

If one research programme continually makes accurate predictions and the other doesn’t, well, one research programme will wither and die.

February 1, 2010

The Earth is a Fridge

Filed under: AMO, Global warming, Science, Sea ice — Tim Joslin @ 3:25 pm

No, I’m not a teapot. I’m serious. The way the climate system works is that, over a year, there is a net gain of heat in low latitudes and a net loss at high latitudes. Heat is transported from more tropical regions and radiated away at the poles.

Now, I’ve been mulling over the mystery of why Northern Hemisphere warming (as measured by the mean surface temperature) appears to have slowed over the past decade or so. I suggested a while back that, in view of the rapid industrialisation of China in particular, perhaps renewed global dimming has a role to play.

I recently felt some encouragement to persist from Sue Solomon’s comments in the Guardian that:

“…there are climate scientists round the world who are trying very hard to understand and to explain to people openly and honestly what has happened over the last decade.”

And so they should.

Realclimate was a little sniffy about the Guardian’s reporting of the science aspect, with a curious exchange at comment 47, but the (tentative) conclusion seems to be that Solomon’s findings relate to some kind of poorly understood feedback mechanism rather than a climate driver (i.e. an external effect on the climate system).

Back to the story. As I said at the start, the Earth is a giant fridge.

Now, it has suddenly occurred to me that the efficiency of the fridge could be different when the whole system is in a warmer (or cooler) state. If this effect is significant you’d therefore expect periods of more and less rapid warming as the Earth’s ability to radiate away heat changed.

Cutting to the chase, it seems to me that sea ice cover reduces the ability of the planet to radiate heat away; more to the point, loss of sea ice increases its ability to radiate heat away. Ice is a good insulator.

What’s been happening up in the Arctic is that “multiyear” ice has disappeared rapidly over recent years.

Now, if some relatively warm water ends up under some ice that’s already there, at best it can slowly cool to around -2C (when it is in equilibrium with the ice) – because of insulation the ice will not get much thicker. But if, come winter, the sea is not already covered in a layer of ice, the water can cool relatively further, and can lose a lot more energy by turning to ice. Simples. [Actually, it’s not: what may be critical is the amount of surface water that, as it cools, becomes denser and sinks, allowing heat to be lost from a greater volume of water than would be possible at a lower initial surface temperature. The amount of “ventilation” of the water column by wind may also be an important factor in determining how much heat can be lost before the insulating ice layer forms at the surface. Furthermore, Wikipedia notes the process of “brine rejection”, whereby water just under the freezing layer becomes denser (because ice doesn’t incorporate salt) and sinks; this may also be important – obviously the amount of brine rejection depends on how much freezing occurs each year.]
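To put rough numbers on the contrast between the two cases, here is a purely illustrative sketch. The values are assumptions, not measurements: a 1 m column of water under 1 m2 of surface, cooling from 5C to the -2C freezing point of seawater in the open-water case (with half the column then freezing), versus cooling from 0C to -2C under an ice lid. The specific heat (~4.2 J/g/C) and latent heat of fusion (334 J/g) are the standard figures used later in this post:

```python
# Rough comparison of the heat a 1 m^2 column of surface water can lose
# with and without an insulating ice lid (all scenario numbers hypothetical).

SPECIFIC_HEAT = 4.2e3    # J per kg per degree C, liquid water
LATENT_HEAT = 334e3      # J per kg, released on freezing
DENSITY = 1e3            # kg per m^3

depth = 1.0              # m of water participating (assumed)
mass = DENSITY * depth   # kg of water under each m^2 of surface

# Ice-covered case: water under the lid can only cool a little,
# say from 0C down to the ~-2C equilibrium with the ice above.
under_ice = mass * SPECIFIC_HEAT * 2.0

# Open-water case: cool the whole column from 5C to -2C,
# then freeze the top half metre of it.
cooling = mass * SPECIFIC_HEAT * 7.0        # sensible heat, 5C -> -2C
freezing = DENSITY * 0.5 * LATENT_HEAT      # latent heat, 0.5 m of ice
open_water = cooling + freezing

print(f"Under ice:  {under_ice / 1e6:.1f} MJ/m^2")   # ~8.4 MJ/m^2
print(f"Open water: {open_water / 1e6:.1f} MJ/m^2")  # ~196.4 MJ/m^2
```

Even with these made-up numbers, the point survives: an ice-free surface lets the column shed heat more than an order of magnitude faster before the insulating lid forms.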

What I’m suggesting is that the Earth’s refrigeration mechanism will be more efficient the less – in extent and thickness – sea ice there is at the start of winter. This doesn’t mean the planet will start cooling, of course, but it could slow the warming.

I thought I should do a rough calculation to see how much energy it takes to melt the Arctic sea ice each year. The interesting Stoat blog links to some data showing that very roughly 10 million km2 of ice freeze and melt each year.

I’ve seen the nature documentaries, so let’s guess that this ice is on average 1 metre thick.

To melt this ice alone takes 10^7 km2 (the area) *10^6 (converting km2 to m2; with the 1m thickness, this gives m3) *10^3 (litres per m3, and 1 litre ~= 1kg) *334*10^3 J/kg (latent heat of fusion of water) = ~3.34*10^21J.

I also happen to know that doubling CO2 will lead to a forcing of around 4W/m2 over the whole planet. 1W/m2 is therefore quite a significant number. How much is 1W/m2 over 1 hemisphere over a year?

The area of the Earth’s surface is ~500 million km2, so the northern hemisphere is ~250 million km2. 1W/m2 over the northern hemisphere is therefore, over 1 year, 250*10^6 (km2) *10^6 (converting to m2) *365*24*3600 (a year’s worth of seconds = ~30*10^6) = ~7.5*10^21J.

So, just freezing the Arctic sea ice every year, never mind cooling the water or ice down, implies that the Earth radiates away heat equivalent to a continuous forcing of around 0.4W/m2 over the entire surface of the northern hemisphere.

In fact, if we assume the water has to be cooled down as well, that 0.4W/m2 becomes a little bigger (the specific heat of water is around 4J/g/C – i.e. 4J heats 1g by 1C).
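The arithmetic above can be checked with a short script, using the same round numbers as the text (10 million km2 of seasonal ice, an assumed 1 m average thickness, 334 kJ/kg latent heat, and a hemisphere of ~250 million km2):

```python
# Back-of-envelope check of the seasonal sea-ice energy budget,
# using the same round figures as the post.

LATENT_HEAT_FUSION = 334e3      # J per kg of ice frozen (or melted)
ICE_AREA = 10e6 * 1e6           # 10 million km^2 of seasonal ice, in m^2
ICE_THICKNESS = 1.0             # assumed average thickness, m
WATER_DENSITY = 1e3             # kg per m^3 (1 litre ~= 1 kg)

# Energy released freezing (or absorbed melting) the seasonal ice each year
e_freeze = ICE_AREA * ICE_THICKNESS * WATER_DENSITY * LATENT_HEAT_FUSION
print(f"Energy to freeze seasonal ice: {e_freeze:.2e} J")    # ~3.34e21 J

# Energy delivered by 1 W/m^2 over one hemisphere for one year
HEMISPHERE_AREA = 250e6 * 1e6   # ~250 million km^2, in m^2
SECONDS_PER_YEAR = 365 * 24 * 3600
e_forcing = 1.0 * HEMISPHERE_AREA * SECONDS_PER_YEAR
print(f"1 W/m^2 over a hemisphere-year: {e_forcing:.2e} J")  # ~7.9e21 J

# Express the annual freeze as an equivalent continuous hemispheric forcing
print(f"Equivalent forcing: {e_freeze / e_forcing:.2f} W/m^2")  # ~0.42
```

Using the exact seconds-per-year figure rather than the rounded ~30*10^6 gives ~0.42 W/m2, consistent with the “around 0.4W/m2” quoted above.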

Of course, the extra heat loss in winter while the water is cooling and freezing when the ice extent is low needs to be weighed against the extra heat gain in summer from the albedo change due to the absent sea ice. Looking at it another way, when there’s no permanent sea-ice, the albedo-feedback-assisted summer melting and winter freezing exactly cancel out. Obviously. My point, though, is that there is a circulation and the Arctic cools water that ends up flowing back south as a cold deep current (so it’s the 4J/g/C released when water cools rather than the 334J/g when it freezes that’s important). This mechanism is cut off by the insulating effect of a layer of sea ice. A corollary is therefore that improved Arctic fridge efficiency should strengthen the thermohaline circulation. In total, over a year, once it’s warm enough for the sea-ice to disappear in summer, more cold water should sink and flow south than before, thereby allowing more warm surface water to drift north.

There could be an optimum Arctic cooling efficiency when it’s still cold enough for the ice to freeze by the end of the winter (to reduce heat uptake during the early summer) but warm enough to mostly thaw by the end of summer.

In conclusion, I present, in the hope of encouraging progress towards an explanation of the lack of 21st century warming in the northern hemisphere, and to supplement the Renewed Global Dimming Hypothesis, the possibly even more tentative Strengthened Earth Refrigeration Mechanism Hypothesis.

I should repeat what I may term the Warming Warning: if underlying warming is being masked, or postponed, by either of these mechanisms and/or others, we could be in for a real shock in later decades.
