Uncharted Territory

January 21, 2011

On Misplaced Certainty and Misunderstood Uncertainty

Filed under: Complex decisions, Global warming, Reflections, Science, Science and the media — Tim Joslin @ 9:42 pm

I know the climate scientists know they’re right, but a little care is called for. It’s important not to play fast and loose with the figures – especially when criticising someone else for playing fast and loose with the figures!

In a post entitled Getting things right, Realclimate yesterday addressed a piece of rogue science conducted apparently in-house by an NGO. Gavin Schmidt wrote:

The erroneous claim in the study was that the temperature anomaly in 2020 would be 2.4ºC above pre-industrial. This is obviously very different from the IPCC projections… which show trends of about 0.2ºC/decade, and temperatures at 2020 of around 1-1.4ºC above pre-industrial.

But the chance of “temperatures at 2020” being 1.4ºC above pre-industrial seems to me pretty remote – certainly less than 2.5%, if Gavin is quoting within 2 sigma confidence limits, as is customary.

You’d think in a blog post titled “Getting things right” that it was pretty important to get things right…

So I posted a comment and was pleased to see not one, but two replies:

Now I’m confused. I understand we are currently about 0.8ºC above pre-industrial. A mean global surface temperature 1.4ºC above by 2020 implies a 0.6ºC rise over the next decade.

[Response: The range is just eyeballing the IPCC figure for the year 2020 – so there is some component of internal variability in there as well. – gavin]

[Response: GISS temperature of 2010 (which happens to be right on the long-term trend) is 0.9 ºC above the mean 1880-1920 (and the latter is probably a bit higher than “preindustrial”). -stefan]

OK, let’s take 0.9ºC, though that’s not a figure you often hear.

The IPCC graphic Gavin is referring to when he says “projections” is one I’ve never really liked:

It’s all a bit too imprecise and pretty for my liking. For example, the yellow line (constant GHG levels from 2000) diverges from the other scenarios almost immediately, even though natural variation would initially overwhelm differences between emission trajectories.

It does rather look, though, as if at least one of the scenarios could, according to the models, lead to warming of 1.4ºC above the pre-industrial level. Could this be because emissions in the scenario are much higher than we’re actually experiencing? No, Gavin notes that:

* Current CO2 is 390 ppm
* Growth in CO2 is around 2 ppm/yr, and so by 2020 there will be ~410 ppm
So far so good. The different IPCC scenarios give a range of 412-420 ppm.

The difference between 420ppm and 410ppm would only lead to a 0.1ºC extra rise in temperature over the very long term and even then the climate sensitivity (the eventual temperature increase for every doubling of the atmospheric CO2 level) would have to be on the high side – around 4ºC.
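For anyone who wants to check that 0.1ºC, here’s a quick back-of-the-envelope sketch in Python, using the standard logarithmic relationship between CO2 concentration and eventual warming (the 4ºC per doubling is the deliberately high-end sensitivity mentioned above):

import math

def equilibrium_warming(c_final, c_initial, sensitivity_per_doubling):
    """Eventual warming from a CO2 change, using the standard
    logarithmic approximation: dT = S * log2(c_final / c_initial)."""
    return sensitivity_per_doubling * math.log2(c_final / c_initial)

# Gap between the highest and lowest 2020 scenario concentrations,
# assuming a high-end sensitivity of 4C per doubling of CO2:
print(round(equilibrium_warming(420, 410, 4.0), 2))  # ~0.14, i.e. about 0.1C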

No, the problem is that the temperature hasn’t risen fast enough to 2010 for the more extreme modelling predictions in the IPCC figure for 2020 to be sufficiently likely any more. The IPCC graphic is out of date, plain and simple.

It’s a bit puzzling to be honest why Gavin used the IPCC graphic, because another Realclimate post today has trend-lines suggesting a much more accurate estimate of the likely global mean surface temperature at 2020 – around 0.2ºC higher than at present or around 1.1ºC above the pre-industrial level (as Stefan noted, 2010 is roughly on trend).

But how confident are we in this estimate? What is the range Gavin should have quoted?

Well, here’s the point: you can’t just express uncertainty by running a few models with slightly different starting conditions (the “Monte Carlo” approach) and discarding 2.5% at each extreme of the resulting distribution.
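To make the objection concrete, this is roughly what the naive procedure looks like – a toy sketch, where the trend, noise level and starting point are illustrative numbers of mine, not anyone’s actual model:

import random

random.seed(42)

def toy_model_2020(start_anomaly, years=10, trend_per_year=0.019, noise_sd=0.1):
    """A toy 'climate model': linear trend plus one draw of
    internal variability for the target year."""
    return start_anomaly + trend_per_year * years + random.gauss(0, noise_sd)

# Perturb the starting conditions, run many times, then discard 2.5%
# at each extreme of the resulting distribution:
runs = sorted(toy_model_2020(random.gauss(0.9, 0.02)) for _ in range(10000))
print(f"naive 95% range: {runs[250]:.2f} to {runs[-251]:.2f} C above pre-industrial")

The output is a tidy-looking symmetric interval – which is exactly the problem: all the structure of the real uncertainty has been assumed away in that single gaussian noise term.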

No, we have to actually think about what we’re doing.

It rather seems to me there are different kinds of uncertainty that we might want to consider when trying to predict the temperature “at 2020”.

What are the types of uncertainty we might need to take into account?

Parameter uncertainty
These are our “known unknowns”. In this case, we don’t actually know that the trend is 0.18 or 0.19ºC per decade as discussed at Realclimate. It looks like it is, but this could change when we get a bit more data – maybe we’ll find over a longer timescale that the real figure is 0.16 or 0.21ºC per decade. This makes us less certain about temperatures further out – at 2030 or 2050, say – than at 2020.

But a relatively short time into the future, parameter uncertainty is dominated by:

Calculable statistical uncertainty
Measurements of mean surface temperature show some variability about the underlying trend, as can be seen from the graphs in the Realclimate post discussing the data for 2010.

But the most any year has varied above the trend-line is about 0.2ºC in the case of 1998, which remains one of the 3 warmest years on record (with 2005 and 2010) due to the super El Nino that year. Maybe Gavin is implicitly including the possibility that there will be another strong El Nino in 2020. But that would only get us to a 1.3ºC total temperature increase (1.1ºC for the trend plus 0.2ºC for the El Nino), not 1.4ºC.

Statistical distribution uncertainty
It’s just conceivable Gavin calculated the Standard Deviation (SD) of annual temperature deviations from the trend and found it to be 0.15ºC or more so that 2 SDs includes 1.4ºC, so even if the long-term increase in temperature around 2020 is our 1.1ºC, there may still be a greater than 2.5% chance that the temperature in that one particular year is 1.4ºC or above. The only trouble is, with a mean of 1.1ºC and SD of 0.15ºC there would be an equal probability of 2020 being much colder than usual, so Gavin would have had to give a range of 0.8-1.4ºC.
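The symmetry point is easy to check numerically (a sketch assuming, as above, a normal distribution with mean 1.1ºC and SD 0.15ºC):

from statistics import NormalDist

annual_2020 = NormalDist(mu=1.1, sigma=0.15)  # trend value plus assumed annual scatter
lo, hi = annual_2020.inv_cdf(0.025), annual_2020.inv_cdf(0.975)
print(f"central 95%: {lo:.1f} to {hi:.1f} C")  # 0.8 to 1.4 - symmetric about 1.1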

Ah, but maybe Gavin expects the distribution to be skewed, so that freakishly hot years are more likely than freakishly cold ones…

The point is we don’t actually know a priori what the distribution of probabilities (often called the Probability Density – or sometimes Distribution – Function, or PDF, if that isn’t too confusing!) for the annual mean temperature of a given year actually looks like. We need a theory to tell us that – and the PDF could be complex, not a nice normal, lognormal or power curve at all.

Damn, we already have three sources of uncertainty compounding our estimate of the 2020 temperature!

It can’t get trickier than this, can it?

Execution uncertainty
Yes it can.

Global temperatures are depressed following volcanic eruptions. It’s almost as if these are being ignored, as though global warming projections include the implicit qualifier: “unless there’s a major volcanic eruption”. But eruptions are frequent enough to belong in our “2 sigma” (central 95%) range: volcanoes in 1963 (Mount Agung), 1982 (El Chichon) and 1991 (Pinatubo) depressed global temperatures by up to 0.3ºC. Despite a long-term warming trend, the temperature “at 2020” could easily be knocked back to 2010 levels, that is, 0.9ºC above pre-industrial, or below.

I don’t want anyone coming back and saying I predicted 2020 to be warmer than 2010 and it wasn’t. Sure, I could say “the theory was right, there was just that damn eruption”. But really we need to include the possibility of volcanic activity if we’re going to make a serious forecast.

I’m beginning to think 1-1.4ºC above pre-industrial might not be that good a prediction for 2020. It seems a volcanic eruption could push us further below our central forecast of 1.1ºC than a strong El Nino could lift us above. I suspect 2 sigma confidence limits are more like 0.8-1.3ºC, with the proviso that a really serious volcanic event could leave us even cooler, without the possibility of a corresponding extreme warming event.
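Here’s a sketch of the sort of structured calculation I have in mind – every probability and magnitude below is my own rough guess, which is rather the point:

import random

random.seed(1)
TREND_2020 = 1.1  # central estimate, C above pre-industrial

def simulate_2020():
    t = TREND_2020 + random.gauss(0, 0.08)  # ordinary year-to-year scatter (assumed)
    if random.random() < 0.3:               # guessed chance of a strong El Nino
        t += random.uniform(0.0, 0.2)       # warm boost, capped near the 1998 excursion
    if random.random() < 0.1:               # guessed chance of a major eruption
        t -= random.uniform(0.1, 0.4)       # Agung/El Chichon/Pinatubo-scale cooling
    return t

samples = sorted(simulate_2020() for _ in range(100000))
print(f"2.5th percentile: {samples[2500]:.2f} C")
print(f"97.5th percentile: {samples[-2501]:.2f} C")

The asymmetry drops straight out: rare-but-large volcanic cooling pulls the lower limit down further than El Nino pushes the upper limit up.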

The point, of course, is that uncertainty in a complex system, such as the climate or the economy, isn’t likely to reduce to a simple mathematical relationship. We need to explore the theory itself. We need qualitative as well as quantitative understanding.

Unknown unknowns
So far our 2020 temperature predictions have assumed we’re certain about our theory.

But maybe we’re not as smart as we think we are.

This is where it gets really difficult. Nevertheless, we should really have a look at any developments that are bubbling up. For example, Realclimate itself has discussed modelling that suggests there could be natural cycles that affect the temperature over timescales of decades. Personally, I think there could be something in this.

Again, the risks, according to the researchers, are to the temperature downside over the next decade. How sure are we that the groups looking at these patterns of variability are wrong? Not more than 95%, surely?

Let’s make one final allowance. Let’s take account of this unknown unknown and predict that the mean global temperature at 2020 will in fact be in the range 0.7-1.3ºC above the pre-industrial level, with a central prediction of a 1.1ºC rise. That is, it will be from 0.2ºC cooler than 2010 to 0.4ºC warmer, with a median expectation in the PDF of a 0.2ºC rise, so a skewed distribution. Think of the 0.2ºC drop as maybe some cyclical cooling cancelling out some of the warming trend plus a bit of volcanic action; the 0.4ºC warming would perhaps arise with a continuation of the current trend plus a big El Nino.

This is the point I want to make: the PDF is in large part a judgement, based on understanding (so there’s plenty of people who could make a better stab at it than me). Number-crunching on its own will never do the job.

I agree with the guys at Realclimate, though: it’s important to get things right!

January 18, 2011

On Hulme on Science

This post is an addendum to my previous musings on Mike Hulme’s Why We Disagree About Climate Change. In particular I want to respond to Paul Hayne’s comment that:

“Mike Hulme’s argument is not relativist. He is arguing that there really is no argument that can leverage action, which seems pretty true.”

OK, I suppose – after re-reading Chapter 3 of Why We Disagree – my claim did go a bit far. However, I’m not sure I want to concede the point fully.

First, I’m not the only one who’s confused. What was uppermost in my mind, I think, were the comments about Why We Disagree made by Philip Kitcher in Science (pdf), to which Hulme refers on his website, as detailed in my original post.

I concur with Kitcher’s view that “Hulme’s book invites misreading” and his disquiet over Hulme’s infamous passage (p.80-2) discussing how science “must concede some ground to other ways of knowing.” There is, though, a way in which this makes sense, which Hulme doesn’t identify and which doesn’t in any way undermine science.

Second, any critique of science must always address the fundamental precept that science is about testing theories against reality. It either describes the world or it doesn’t. There’s no room for compromise with “other ways of knowing”.

There’s one little fly in the ointment, though, which is very apparent in the social sciences. Concepts are not always easy to define. What is “poverty”, for example? Before you can study “poverty” you have to get out there and translate what people mean by “poverty” into something or things that you can actually measure.

Hulme refers to “local tacit knowledge”, which he patronisingly suggests is “not conventionally classified as scientific knowledge”. He muddles strategies for coping with climate conditions with descriptions of “environmental change” and weather-forecasting, but certainly some of what he’s driving at very much is scientific knowledge – climate science relies on interpretations of subjective historic anecdotal evidence in diaries, ships’ logs and so on.

The issue is merely about communication between scientists and those affected. In the case of climate change, science may need to translate its scientific predictions – expressed in terms of directly measurable parameters – into language that relates to people’s day to day experiences. But those experiences are not “other ways of knowing”.

Let’s take the example of “severe winter weather” in the UK, since “here’s one I prepared earlier”! As I explored recently – there is no direct correlation between measurable parameters and the common perception of, in this case, what constitutes a “cold winter”. No-one writes books about, say, February 1986, which was exceptionally cold, whereas (slightly) milder conditions with more snow, such as the Winter of Discontent (1978-9) and perhaps December 2010, linger much longer in the collective memory.

Science could, in principle, develop a “severe winter” index which included temperature extremes, averages, snowfall, lying snow days and so on. Trouble is, different people would want to constitute the index differently. Hence we all have to refer to the same variables if we want to make comparisons. This is what science is. It doesn’t stop us all making our subjective judgements, though.
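To be concrete, such an index might look something like this – the variables, scalings and weights below are entirely my own invention, which is precisely the problem:

def severe_winter_index(mean_temp_c, min_temp_c, snowfall_cm, lying_snow_days):
    """A made-up 'severe winter' score: colder and snowier -> higher.
    The choice of variables and weights is a subjective judgement."""
    return (0.4 * -mean_temp_c            # monthly mean temperature
            + 0.2 * -min_temp_c           # extreme minimum
            + 0.2 * snowfall_cm / 10      # total snowfall
            + 0.2 * lying_snow_days / 7)  # days with snow lying

# Illustrative (invented) numbers: a very cold, dry month versus a
# slightly milder but much snowier one:
print(severe_winter_index(-1.1, -9.0, 5, 3))    # cold but dry: ~2.4
print(severe_winter_index(-0.7, -8.0, 30, 15))  # milder but snowy: ~2.9

Shift the weights towards the temperature variables and the ranking flips – hence the need to agree the framework before arguing about the answer.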

So, there’s an inescapable conclusion: we have to agree on a framework, on what we can measure in order to make objective comparisons.

And this is the real weakness of Hulme’s work. In terms of both the science and making decisions on emission trajectories, we need a quantitative framework. Or we simply can’t reach any sort of agreement. It’s all very well to note that people have different values, but we can’t conceivably ever agree what is an acceptable level of climate change based on religious and political views. It is irrelevant on one level that the media distort the debate as Hulme goes on to discuss in Chapter 7, The Communication of Risk. This doesn’t alter the consequences of different courses of action and therefore the optimum path by one iota.

It might also be worth noting, en passant, that it is in fact historically somewhat unusual for public opinion to greatly matter in decision-making. The reason the media has influence is a result of our current political system. At most other times in history a ruler, or elite would simply make the decision. The long-term interest of society as a whole was the responsibility of a small group and not something actively contested between different interests. Maybe, as a civilisation, we need ways of making a clearer distinction between the general interest and the individual and sectional interests that drive our political processes. Tricky stuff!

Nevertheless, just as we can only meaningfully discuss and quantify the physical phenomena of climate change within the agreed framework that we call science, we can only decide on a course of action in response to global warming by agreeing a framework that permits quantification.

And that framework is called economics.

January 6, 2011

Musing on “Why We Disagree About Climate Change”

I give in. Over the holiday season I’ve been improving my education by reading Mike Hulme’s thought-provoking book Why We Disagree About Climate Change: Understanding Controversy, Inaction and Opportunity. Maybe I’m committing a cardinal sin, but I’m going to try to capture my thoughts on Why We Disagree without a clear idea where they’re leading. Perhaps because I haven’t finished reading the book yet!

Whilst I would recommend Why We Disagree as an introduction to a set of issues that are too rarely and superficially discussed, I find myself alarmed at the way the debate is going. The movement that has grown up in response to the global warming threat is more deeply confused than ever. There is a lack of clarity of thought around far too many aspects of the problem of global warming and how to organise a coherent response.

But what prompted me to jump the gun was committing another cardinal sin. I idly followed a backlink to this blog. It was automatically generated, but turned out to be very relevant. It led to a post at Haunting the Library, a new GW sceptic blog, which laid into a truly cringeworthy Guardian CiF piece by a Polly Higgins.

Polly Higgins argues for a “law on ecocide”. She particularly wants to prosecute corporations. Presumably she wants to create a world government or at least rewrite the Treaty of Westphalia.

Preamble: Power and the Law

Putting to one side Polly Higgins’ unwise use of comparisons between ecocide and Nazi genocide – I kid you not, read her piece – she misunderstands the role of the law.  The law expresses power relations between society and the individual.  Once the law proscribes (or prescribes) behaviour of some kind, the norm – for example, “don’t sell drugs” – has been agreed.  OK, the legal system is a contested space, but probably best to fight only battles you have a chance of winning.

Enacting a law does more than express a societal norm.  It also criminalises the behaviour it proscribes (or in the case of civil law recognises plaintiffs’ rights to redress).  There are other options than criminalising undesirable behaviour.  Once you have the power to enforce an agreed norm, e.g. if you’re an elected government, you could tinker with incentives.  In the case of drink, for example, rather than try to enforce a law limiting consumption you can raise taxes on alcohol.  When it comes to drinking and driving, though, a strict legally enforceable limit is considered appropriate.

So, what Polly Higgins fails to appreciate – besides the social norm of avoiding casual comparisons with Nazi atrocities – is the need for a coherent two-step plan:

1. Establish the power to prohibit or limit specific behaviour.

2. Identify the appropriate levers to control said behaviour.

A problem with global warming is that it is global in its diverse causes and global in its diverse impacts.  (1) is therefore difficult to achieve.  This might not matter if we either had a world government with the power to deal with the issue, or if the states able to project their power, principally the US, were prepared to do so in support of the cause.

But even if (1) were achievable, (2) also needs to be addressed.  Criminalisation might not be the most effective strategy.  The absurd, prohibitionist “war on drugs”, for example, arguably creates more problems than it solves.   In general, I suggest, it’s not a good idea to try to criminalise behaviour in which many people are engaged.

Where the issue gets interesting of course is in the interaction between (1) and (2).

The more power you have the easier it is to criminalise behaviour.  If you happen to be in control of a totalitarian government you can outlaw whatever you like.

On the other hand you need very little power to change incentives.  Simply by buying something you change behaviour throughout the supply chain.  To make a real difference you need to do a bit more than that.  But people are prepared to accept taxes that increase the price of things they may enjoy, but know aren’t good for them, at least in excess.  Hence tobacco and alcohol duty are accepted and are effective to some extent, whereas prohibition would fail, even if the laws required could be passed in the first place.

In the case of carbon emissions, where on the power-spectrum is Polly Higgins, do you think?

Which brings me on to Mike Hulme’s book, Why We Disagree About Climate Change.

Hulme’s project is an important one, and well executed, but, at least as far as Chapter 6, I fear a critical theme has been omitted and that this undermines the whole discussion.

Society and the Individual

Society is not the sum of individual interests.  Let me repeat that in case anyone missed the point.  Society is not the sum of individual interests.

Consider how behaviour is regulated.

Society as a whole attempts to regulate the behaviour of individuals by legal or other means as seen in relation to Polly’s counterproductive proposal to criminalise “ecocide”.

Individuals’ behaviour takes into account sanctions and incentives society as a whole may put in place.

Society has an interest in maintaining itself indefinitely; individuals may or may not be concerned about the future.  Society and individuals are qualitatively different.  Much of the time we pursue our individual interests.  But we expect certain agents to act in the interests of society.

We have collective values.  We expect society to maintain itself even if it doesn’t directly affect us.

Let’s cut to the chase.

Comparing Markets and Values – a Category Error

Chapter 5 of Why We Disagree, The Things We Believe, is primarily about religion but suggests three categories of solutions: correcting markets, establishing justice and transforming society (section 5.4, p.161-9).

But we’re comparing apples and lemons.  Market interventions are a way of achieving agreed objectives within the whole of society, in this case globally.  Establishing justice and transforming society are motivations of individuals or groups for influencing the objectives.

It is alarming not just that Hulme writes:

“These sweeping ideas for commodifying carbon and globalising its market through either free or regulated trade sit uneasily with many of the beliefs expressed in religious and other spiritual traditions”.

but also that he notes WWF UK have:

“…suggested that there are inherent contradictions in ‘attempting to market less consumptive lifestyles using techniques developed for selling products and services.’ “

This is simply incoherent.  Especially to a WWF “Supporter” who yesterday participated in an online WWF “Membership” survey.  If market research isn’t a “technique developed for selling products and services” I don’t know what is.

If we actually want results, I suggest we have to leave it to the technocrats to determine how best to reduce carbon emissions.  All pressure groups can and should do is build support for objectives, such as to reduce carbon emissions, or perhaps more broadly avoid dangerous climate change.

The problem is all pressure groups have their own over-riding objective – to boost their own support.  They need to have a coherent ideology.  They need to sell the illusion that it is their brand that is effective.

Of course, WWF is easy to pick on, in part because they don’t in fact have Members, only Supporters.  They’re not alone in spinning in this particular way, but to my mind, strictly speaking members have – or at least can have – some kind of influence.  This is not the case in WWF.  Policy is not determined by votes of the members, for example.  WWF is selling you something.  And the ratio of brand to product is very high!

If we really want to stop global warming we need to recognise the constraints on coordinated global action.  Market-based solutions of some kind are pretty much the only game in town.

Society, Individuals and Value

Hulme reports in Chapter 4, The Endowment of Value, on the discounting of the costs of climate change as used, for example, in the Stern Report.  Every time I read about this I find myself entirely bemused and Hulme, I think, has unintentionally helped me put my finger on the problem.  What am I saying?  Why give Hulme the credit? – I knew this already, I was just hoping Hulme did too.

Stern’s economic analysis – or at least his use of discounting of future costs – looks at the problem from the perspective of individual actors (people or commercial enterprises, say).  We need to consider society as a whole.

The issue is around the discount rate – how much future costs of damage caused by climate change should be reduced by each year when compared with costs in the present.

As Hulme reports (p.126), Stern used two discount rates, a “time discount rate” of 0.1% and a “per capita growth rate” of 1.3% since “we” will be wealthier in the future.

The time discount rate always gets me.  It’s to allow – and this is serious – for the possibility of the human race going extinct before our global warming problems come home to roost.  This is ludicrous.  The logic implies that if we don’t “go extinct” we won’t have invested enough in preventing global warming.  Presumably this is why people don’t save enough for their pensions.  Obviously there may only be a 90% chance of needing the money – you could die or win the lottery.

Imagine if the scenario were just a little different.  Imagine extinction of the human race is certain if CO2 reaches say 500ppm.  We have to spend £10bn now to stop that happening in 100 years.  Ah, but we may be extinct anyway!  So let’s spend just £9bn (that’s £10bn discounted by 0.1% for each of the next 100 years – strictly the discounting should be compounded, which gives a shade over £9bn, but that’s not the point.  The point is…).  Damn, we haven’t spent quite enough – CO2 will hit 500ppm and wipe us out.  As I say, this is ludicrous.
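For the record, here’s the compounding (the £10bn and 100 years being just the numbers in my made-up scenario):

cost_now = 10e9        # GBP - the invented figure above
time_discount = 0.001  # Stern's 0.1% per year
years = 100
print(cost_now * (1 - time_discount) ** years)  # ~9.05e9, a shade over 9bn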

OK, so that’s the overall discount rate down from 1.4% to 1.3%.

What about this “per capita growth rate”?  The argument is that it’s not worth spending £1 today to save £1.0129 next year (compounded over time, so it adds up, don’t it), because we’ll be able to create £1.013 worth of goods and services (or capital) next year for the same effort it takes to create £1 this year.

Very persuasive.

Not.

The problem I have with this is that I can’t think of any effect of global warming that won’t rise in proportion with the per capita growth rate.  In fact, it’s very easy to make a case for many costs to rise faster than the per capita growth rate!

Consider the loss of capital assets caused by global warming.  For example, a city may be destroyed due to rising sea-levels or increased storminess or both. The value of that city in the future will be greater than it is today.  Its productive capacity will have increased, as a best estimate, by – you’ve guessed it – the per capita growth rate.

Or people may be killed due, say, to the effects of a heatwave.  The direct economic loss will be their own individual productive capacity, which will on average have increased from today’s by – you’ve guessed it – the per capita growth rate.

How do we put a value on quality of life?  Well, biodiversity, the preservation of cultural artefacts and ways of living (referred to by Hulme as “natural capital and the aesthetics of living”, p.115) will, at a best guess, be worth as much as a proportion of everything we consume in the future as they are today.  That is, the costs of damage to the environment will have increased by – you’ve guessed it – the per capita growth rate.
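The arithmetic behind this cancellation is trivial: if the damages grow at exactly the rate you discount them by, the present value never shrinks at all.  A sketch (1.3% being Stern’s growth figure; the £1bn damage is an arbitrary example of mine):

growth = 0.013    # per capita growth rate
discount = 0.013  # discounting future costs at the same rate
damage_in_todays_terms = 1e9  # GBP - arbitrary illustrative figure

for years in (10, 50, 100):
    future_damage = damage_in_todays_terms * (1 + growth) ** years
    present_value = future_damage / (1 + discount) ** years
    print(years, round(present_value / 1e9, 6))  # always 1.0 - growth and discount cancel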

Of course, it’s easy to argue that in the future we will value our own lives, the lives of others and the environment relatively more highly than we do now.  And Hulme even cites Ronald Inglehart who has made the persuasive argument that concern for the environment is a luxury good valued more highly the more wealthy we become.   (Damn, something else I can’t claim to have thought of first!).

I might even add that the only way per capita productivity growth can occur is if we become more interconnected and specialised.  The relative cost of damage to capital and loss of life will therefore be higher in the future.  The more we rely on each other the more costly disruption becomes.

I can see no justification whatsoever for using a discount rate of greater than zero.  Arguably it should be negative.

Stern correctly argued that we should have no pure time preference.  That is, we as individuals may prefer jam today over jam tomorrow (so don’t put enough in our pension funds), but society as a whole transcends time.  We’re all goners if we do anything other than weigh benefits to me now as at most equal to costs to someone else at some indeterminate time in the future.

Science and Society

Hulme has me really alarmed in his discussion of science (Chapter 3).  We disagree about climate change because we disagree about the science, apparently.  Yes, of course, but society as a whole needs some way of evaluating the risks.

Hulme seems to take a relativist position akin to that of Steve Fuller.  Take this summary from his website:

“In Philip Kitcher’s wide-ranging essay in Science on ‘The Climate Change Debates’ [pdf] I am struck by two things – which are not very new, but which are very important. First, is how the framing and public discourse around climate change differs between countries: as Kitcher puts it, where ‘societies … are inclined to see matters differently’. This is brute fact sociological reality, just as non-negotiable as the radiation physics of a CO2 molecule. Recognising this means that as soon as scientific knowledge enters public discourse – whether this knowledge is robust, imprecise or tentative – different things will happen to it and different social realities will be constructed around it. For me, this is the essence of the climate change phenomenon.

The second, related, thing to emphasise is how predictive claims about the climate future – and its impacts – are inextricably bound up with imaginations (e.g. scenarios) and value judgements (e.g. discount rates) about the future. One could argue that such considerations fall within the legitimate reach of ‘climate science’ and the elite scientific expertise Kitcher claims any genuine democracy needs. But for me it is these extra-scientific dimensions of climate change ‘knowledge’ which motivated me in my book ‘Why We Disagree About Climate Change’ to challenge a narrow appeal to science for engaging our publics around the idea of climate change. It really is not about ‘getting the science right’. It is just as much about engaging our imaginations, about facing up to the ways different peoples and cultures construct meaning for themselves, about the very different values we attach to the future. And because of this I don’t believe Cassandras such as Jim Hansen and Steve Schneider should have the last word.”

I’m baffled how he can refer to James Hansen as a “Cassandra” – Kitcher’s essay suggests only that Hansen and Schneider “play the role of Cassandra”, standing outside the debate rather than within it. I can imagine accusing the scientists of hubris, perhaps, thinking they know more than they do, but crying wolf [which is what I assume Hulme means – otherwise his comment would make no sense at all – since the point about Cassandra is she was right and nobody listened!]?  No.  Hansen certainly calls it as he sees it – everything I’ve ever seen of his has been rooted in what I would describe as fairly mainstream science.  (Though I’ve ordered “Storms” from Amazon anyway – it’s just come out in paperback).

Hulme urges us to face up to “brute-fact sociological reality”, as if this represents some new way of looking at the world, something that – perhaps with one exception – we’re all missing.  Global warming is a physical phenomenon – and by the way, sociological realities are nothing if not negotiable, unlike the  radiation properties of a CO2 molecule – but of course it’s a sociological problem.  The same way as anything from dying to the financial crisis is only a problem if we want it to be.  All “problems” are socially constructed, by definition.  CO2 molecules don’t have problems, only sentient beings do.

Society defines problems and has to construct ways of dealing with them.

The nature of the global warming problem depends on the physical phenomenon.  Sure, we can choose whether to say “oh, that’s interesting” or to define it as a problem.  But we can’t do that unless we understand the physical phenomenon.

So, in agreeing the extent of the problem, we have to use the best socially-constructed mechanisms we have.  All science is is a method of determining the most effective ways of predicting (and therefore controlling) physical phenomena.  Essentially by testing interventions against outcomes.  There’s some logic in there and quite a bit of institutional gubbins, but unless we collectively – and we’re talking about global society here – come up with a better “way of knowing”, it really is a matter of “getting the science right”.

Once we’ve done that we can argue about what to do about it. Or rather, we have to argue in parallel, since science – like society – never ends.

But we mustn’t muddle up things we can measure – temperature, say, or precipitation, or the ability of particular species to survive – with how we feel about those things or our desires as to the kind of society we’d like to see in the future.

And when we come to try to put into effect what we’ve decided to do, maybe it would be a good idea to use methods that are to some extent predictable, and perhaps those that past experience would suggest are most likely to be successful – which probably won’t include implicating much of the global population in a new heinous crime of “ecocide”.  As I said, we’ll have to give it to the technocrats.  Who will almost certainly use markets to do a lot of the heavy-lifting.

We can afford to let individuals be irrational, but society itself has to be rational and objective.

I suspect Hulme is going to go on to tell me in subsequent chapters that society is not rational and objective.  Quelle surprise!

The whole point is that when it comes to existential threats we can’t afford not to be rational and objective.

Maybe global warming isn’t in fact an existential threat.  Maybe we can use it as terrain we can contest with our various views of how we’d like the world to be.

But before we conclude that we can afford to disagree about climate change, it would be a good idea, perhaps, for us to remain rational and objective long enough to determine the nature of the physical phenomenon within somewhat tighter constraints than at present.

At the moment we disagree about climate change because, like naughty children, we think we can get away with doing so.

(to be continued…)

January 4, 2011

Subsidising Cambridge Commuters?

Filed under: Bus, Economics, Inequality, Rail, Tax, Transport, Tube — Tim Joslin @ 4:28 pm

Labour is choosing to attack today’s VAT rise as “the wrong tax at the wrong time”. I’m not so sure. It seems to me that stealth tax rises, such as on public transport, are far less fair.

Pre-empting arguments over the figure, Labour are cunningly pointing out that the Lib Dems claimed during the General Election that the VAT rise would cost “the average household” £7.50 a week.

Curious. £7.50 extra VAT a week at 2.5% implies £300 of spending that qualifies for the tax – that is, £300 of spending that doesn’t include mortgage or rent, food, children’s clothes, books or newspapers, lottery tickets, gas, water, electricity, public transport or Council Tax. Difficult to manage on an income of ~£30K, that is, a weekly spend of ~£600, I’d have thought.

On the other hand, multiplying £7.50 by 52 weeks and the ~20m households in the UK gives around £7.5bn, which does seem about right. I suspect VAT is in fact a progressive tax.  The wealthy spend proportionately more on the sort of things that qualify – restaurant meals, expensive booze, nibbles and confectionery, new cars, designer gear and other big-ticket items.  The poorest – getting by on Tesco bogoffs, saving up for the odd bus ticket, buying all their clothes from charity shops and so on – must have eff all VAT-qualifying expenditure.

I strongly suspect that this is a case where Mr & Mrs Average do not in fact actually exist.

Maybe Labour would gain more votes by instead pointing out what appears to have been another case of dissembling during the election campaign by those (allegedly) lying liar Liberals.

Or perhaps they could have focussed instead on the increases in public transport costs which are in many cases seriously regressive.

Take the Zone 1-4 Travelcard (and daily Oyster limit) which will affect those working in the centre of London.  It’s rising from £6.30 to £7.30, off-peak, that is, by nearly 16%, not the 11% the BBC calculates, bless. What’s more, if you happen to live near muggins here in zone 3, the peak Travelcard/Oyster limit has increased from £8.60 to £10 – that’s over 16% even if you’re the BBC – to match the unchanged rate for zone 4.

Curiously, bargain of the year for 2011 is the 7-day zone 1-3 Travelcard which, at £32.20 against £30.20 last year – a mere 6.6% increase – remains cheaper than the zone 1-4 version.  This could now pay for itself in 32.20/10.00 = 3.22 days, against 30.20/8.60 = 3.51 days last time out.  Even off-peak it’s worth considering at 32.20/7.30 = 4.41 against 30.20/6.30 = 4.79 days.  More realistically a mix of peak and off-peak travel into London over 4 days (2*£10.00 + 2*£7.30 = £34.60) would justify buying the Travelcard for £32.20, whereas last year you were much more likely to need to travel on 5 days (2*£8.60 + 2*£6.30 = only £29.80, still less than a £30.20 Travelcard).  Where’s the logic in that?
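The break-even arithmetic, as a sketch using the fares above, for anyone who wants to play with the numbers:

def breakeven_days(weekly_card, daily_cap):
    """Days of capped travel needed before a 7-day Travelcard pays off."""
    return weekly_card / daily_cap

print(round(breakeven_days(32.20, 10.00), 2))  # 2011, peak: 3.22 days
print(round(breakeven_days(30.20, 8.60), 2))   # 2010, peak: 3.51 days
print(round(breakeven_days(32.20, 7.30), 2))   # 2011, off-peak: 4.41 days
print(round(breakeven_days(30.20, 6.30), 2))   # 2010, off-peak: 4.79 days

# Mixed travel over four days in 2011 already beats the weekly card:
print(2 * 10.00 + 2 * 7.30)  # 34.60, vs the 32.20 card
print(2 * 8.60 + 2 * 6.30)   # 2010: 29.80 - still under the 30.20 card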

Having to decide in advance whether to invest in a weekly Travelcard is an unnecessary irritation, since the system could cap weekly expenditure in the same way as daily.  I understand TfL’s IT experts will get round to doing this by around 2013.

Hours of amusement, perhaps, though maybe deadly serious if, like me, you fall into the category of zone 3 residents who travel into London on an irregular basis.  A category that is being seriously screwed by the latest fare rises.

Who will this arbitrary unfairness affect the most?  The poorest of course.  Consider those who live in zone 3 and can’t afford the higher price of property near a tube station.  In 2010 two off-peak tube fares to the centre at £2.40 each, for example, brought you within striking distance – £1.50 – of the daily limit of £6.30.  You didn’t end up spending full whack on the bus each way to the tube station – the cost was capped at another £1.50.  In 2011, though, those two tube fares will set you back £2.50 each, but the daily limit has been disproportionately raised to £7.30, so the buses will cost you £2.30.  The tube fare – which is all Mr Rich who lives near the station has to worry about – might have gone up by only 4%, but the bus fare will have risen by £0.80/£1.50 = 53%!  The percentage is even greater if one of the tube fares happens to be at the afternoon peak rate (£2.70 in 2010, £2.90 in 2011, charged from 16:00 to 19:00) when the off-peak daily cap still applies. [In 2010, £2.40+£2.70 left £1.20 of the £6.30 daily limit for the bus; in 2011, £2.50+£2.90 leaves £1.90, so the cost of choosing the bus rather than walking has risen by more than 58%!]
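The perverse effect on the bus fares drops straight out of the cap arithmetic – a sketch using the fares quoted above:

def bus_cost_under_cap(daily_cap, tube_fares):
    """What the buses actually cost you once the daily cap kicks in:
    the gap between your tube spend and the cap."""
    return max(0.0, daily_cap - sum(tube_fares))

cost_2010 = bus_cost_under_cap(6.30, [2.40, 2.40])  # 1.50
cost_2011 = bus_cost_under_cap(7.30, [2.50, 2.50])  # 2.30
print(f"effective bus fare rise: {(cost_2011 - cost_2010) / cost_2010:.0%}")  # 53%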

I happen to fall into the category of those who live near enough to a tube station to be able to walk if I’m not feeling lazy.  I now have much more of an incentive to do so.  What TfL has done is make it much more expensive for zone 3 travellers to use a bus as well as the tube.  So more people will walk instead and TfL may not even realise the extra revenue they may expect from the daily cap increases.  Leaving everyone worse off.

Boris may want to take note that with another 8.3% increase (from £1.20 to £1.30) in the flat-rate bus fare, following the 20% increase at the start of 2010 (from £1) he’s making short hops in general more and more expensive.  The flat-rate fare makes a lot less sense in a purely fare-based system than in a subsidised one where the fares don’t recover the full cost.

Commuters who make one tube journey each day haven’t been hard hit, but it’s difficult to find categories of bus user who aren’t much worse off after these latest changes.  The daily bus limit has only increased by 2.6% – from £3.90 to £4 – this time (though it was £3.30 in 2009).  But this is good news only for occasional bus commuters to the centre, who most likely have to change buses – and it’s a disgrace that some people are paid so little that they can’t afford to use the tube, where you face no penalty for changing routes.  Regular bus commuters fare worse: the 7 day bus pass has increased by 7.2% from £16.60 to £17.80.  [And now represents 4.45 rather than 4.26 daily maximum fares.  Where’s the logic in that?]

All this has been rather a digression as what I really wanted to do was provide an update on the cost of mainline rail travel.  ‘Cos if you want to get about the UK within a finite time you need serious money.

A couple of years ago I introduced the Cambridge Day Travelcard (with Network card discount) fare index, which is admittedly not yet perhaps quite as famous as the Economist’s Big Mac Index.  Here’s the full series, brought up to date:

2003     £11.55

2004     £12.60     9.1% increase on previous year

2005     £13.85     9.9%

2006     £14.85     7.2%

2007     £15.20     2.4% (presumably lower because of the new afternoon restrictions – the return can no longer be used on trains departing King’s Cross between 16:30 and 19:00, which is inconvenient, to say the least)

2008     £15.85     4.3% (lulling us into a false sense of security)

2009     £17.50     10.4% (out of the blue – it’s a record!!)

2010     £17.50     0% (but still a real-terms increase! – according to the RPI, prices in July 2009 were 1.4% lower than a year earlier)

2011     £18.50     5.7% (close to the July RPI of 4.8% plus 1% which I understand was allowed for the average of each operating company’s fare increases)

So the cost of a day Travelcard from Cambridge to London – for a degraded service, remember – has risen a whopping 60.2% in the mere 8 years since 2003.

What about inflation?  Really we should compare the RPI for a month from December 2010 to December 2011 (reflecting general prices when we’re actually travelling) with the same month in 2002-3, but the latest data available is for November 2010 when the RPI index was 226.8.   It was 178.2 in November 2002, so prices in general over the same 8 years have risen only roughly 27.3%.

That is, in 8 years, the day Travelcard from Cambridge to London (with Network card discount), for a degraded service, has risen about 25.8% in real terms.
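The real-terms sum, spelt out as a sketch using the figures above:

fare_ratio = 18.50 / 11.55  # Travelcard, 2011 vs 2003: a 60.2% rise
rpi_ratio = 226.8 / 178.2   # RPI, Nov 2010 vs Nov 2002: a 27.3% rise
print(f"real-terms rise: {fare_ratio / rpi_ratio - 1:.1%}")  # just under 26%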

And the formula for the next few years is RPI+3%.

But what really got my goat, and prompted this post, was reading the comments of an RAC spokesman in Saturday’s Guardian:

“The RAC Foundation, a motoring thinktank, claims that the annual £5bn subsidy of the rail network disproportionately benefits Londoners and the well-off, with 40% of households earning more than £50,000 a year using the railways at least three times a week – double the figure for those on less than £25,000 per year.

Stephen Glaister, its director, said: ‘The rail subsidy comes from the Treasury and, in that sense, it is paid for by everybody. But the benefits are weighted towards the south-east and the relatively well-off. If government policy is intended to help redistribute wealth and help the less well-off, rail subsidies are a poor way of doing it. Spending the money on helping road users would be a better way of doing it.’ “

Well, of course only the wealthy can now afford to use the railways!  There’s not much point taking a £15K a year job in London if it’s going to cost you £5K of that just to get to work, is there?

But I rather dispute that the benefits are “weighted towards the south-east”, or at least towards commuters on busy routes, such as Cambridge to London.  What I suspect happens is that commuters subsidise those travelling off-peak; busy routes subsidise those at the periphery of the network; and busier regions, especially in the south-east, subsidise less-busy regions.

The Guardian could, for example, have taken a peek at the latest (2009-10) National Rail Trends (NRT) Handbook from the Office of the Rail Regulator (ORR).  On p.62 you’ll find table 6.2c which gives the 2008-9 passenger subsidies for each Train Operating Company (TOC).  I crudely show it here:

As can be seen at a glance, First Capital Connect (FCC), which operates the Cambridge-London route, is not directly subsidised, but in fact pays 3.4p per passenger kilometer for the privilege of running the trains.  Now, this is for the whole franchise, which must include peripheral routes that are less heavily used, as well as the most overcrowded trains in the country from Cambridge.  But those peripheral routes at least help to bring some passengers onto the network, so let’s take the figure of 3.4p to be realistic.  A round trip to London must be in excess of 100km, so travellers from Cambridge are on average paying in at least £3.40 every time they buy a return ticket.

But the franchise payments are not the main subsidy to the railways.  The taxpayer provides around £4bn a year in direct support to Network Rail (see Table 6.2a of the ORR’s NRT handbook – self-serving obfuscation in Network Rail’s financial statements reveals no more detail).  Table 6.2c shows a total of around 50bn passenger kilometers per year (note that some operators are outside the franchise system so the distance total in table 6.2c is not complete).  Making the heroic assumptions that Network Rail’s subsidy is evenly spread and not used to support vanity investment projects, rail passengers do indeed appear to be subsidised to the tune of around 8p per passenger km.

Combining the two subsidies suggests FCC passengers are on average subsidised by around 4.6p per km (8p – 3.4p) whereas those on, for example, Northern Rail receive around 12p/km (8p + 4p from Table 6.2c).
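Putting the two subsidy components together (the figures being the rough ones quoted above):

network_rail_support = 4e9  # GBP per year, direct taxpayer support to Network Rail
passenger_km = 50e9         # franchised passenger km per year, from Table 6.2c
network_p_per_km = 100 * network_rail_support / passenger_km  # 8p per passenger km

fcc_premium_p = 3.4       # pence per passenger km PAID by FCC for the franchise
northern_subsidy_p = 4.0  # pence per passenger km RECEIVED by Northern Rail

print(f"FCC net subsidy: {network_p_per_km - fcc_premium_p:.1f}p/km")            # 4.6p
print(f"Northern net subsidy: {network_p_per_km + northern_subsidy_p:.1f}p/km")  # 12.0p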

The Guardian notes that an annual season ticket from Cambridge to London costs around £4000.  If this is used 250 times, that works out at around £16 per day return, not bad at all compared to the £13.85 price for an off-peak day return with a Network card. It seems commuters in fact get a relatively good deal since their season ticket entitles them to unlimited travel to London at times when the day return fare would otherwise cost an absurd £34.

This isn’t quite what I expected – as always, it pays to delve into the numbers.  It seems a bit daft for an annual season ticket to represent no more than 120 daily trips (£4000/£34).  I don’t really see why anyone making fewer than that should be so severely penalised.  This discourages all kinds of business and other activity, part-time working, working from home and tourism, for example.

It remains conceivable that even commuters on the Cambridge to London route are still being subsidised, though the trains are so busy I’m confident that the Cambridge to London route in fact subsidises the rest of the FCC franchise.

The people really being fleeced are:

– those adults without a Student, Senior or other railcard – since anyone can buy a Network card for around £25, this means occasional users are penalised, which hardly helps to bring new passengers onto the railways;

– all non season-ticket holders forced to travel at peak times (which, since 2007, includes 16:30 to 19:00 from King’s Cross);

– purchasers of single or open return tickets. An Anytime (i.e. including peak-time trains) open return from Cambridge to London now costs the same as two peak singles, at £40, a ridiculous two and a half times the effective rate (£16) for a season ticket-holder occupying the same seat – or more likely standing on the same train.

The numbers suggest these categories of passenger from Cambridge to London are definitely not being subsidised.

If the strategy is for costs of rail travel to be attributed to those using the service, then it makes no sense for some categories of passenger to pay substantially more than the cost.  The open return ticket price should be reduced to that of the day return and single tickets should be half the return price.   For Cambridge to London, the non season-ticket peak fare is way out of line and should simply be reduced to say 1/150th of the season ticket price, that is, around £27 (from £34).

The TOCs effectively have monopoly pricing power.  Prices therefore reflect expediency rather than the cost of providing the service.  If there were a decent level of competition they’d soon find another operator could afford to undercut them on those fares that are out of line.

What’s more, allowing peak fares of effectively twice the off-peak rate gives no incentive to rail companies to increase passenger numbers, for example, by running more late-night and pre morning-peak trains.  Allowing an afternoon peak is insane – the rail company has a disincentive to ease over-crowding.

The whole rail franchise system is dysfunctional.  What’s effectively being sold is the right to charge monopoly prices.  This is absurd.

In an ideal world, there would be no need for peak and off-peak fares – sufficient trains would be run to meet demand at all times.  In the meantime, though, the need for demand management skews incentives for the TOCs.  It’s therefore necessary to divorce ticket-pricing from financial rewards to the TOCs.  The TOCs should be paid just for the service they provide – that is, the same rate per passenger regardless of when they travel and how much they’ve paid for their ticket.  And less per passenger on trains that are more than 70% full. The TOCs should have an incentive to increase use of the railways, not screw more money out of fewer passengers.
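To illustrate the sort of payment formula I mean (a sketch – the flat rate, the 70% threshold and the size of the crowding penalty are all invented for illustration):

def toc_payment(passengers, capacity):
    """Pay the operator a flat rate per passenger, regardless of ticket
    price or time of travel, but less per passenger on crowded trains.
    All the numbers are invented for illustration."""
    rate = 2.00                      # GBP per passenger carried (invented)
    if passengers / capacity > 0.7:  # the 70% crowding threshold above
        rate *= 0.5                  # invented penalty on crowded trains
    return passengers * rate

print(toc_payment(300, 500))  # 60% full: 600.0 - paid in full
print(toc_payment(450, 500))  # 90% full: 450.0 - crowding now costs the TOC

So running an extra train to relieve overcrowding would raise the operator’s income rather than cannibalise it.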
