Friday, December 28, 2018

CARBON FEE AND DIVIDEND. AN IDEA WHOSE TIME IS COMING.


One simple idea unites almost all environmental economists. It is the importance of pricing external costs – the social and environmental damage caused by greenhouse gas emissions – into markets, so that users and consumers are forced to recognise and pay for these costs at the point of use, rather than allowing them to fall on the public at large or on future generations.

Unfortunately a simple implementation of such an approach runs head first into some major political realities. The first is that the level of pricing on fossil fuels needed to reduce emissions quickly or substantially is high, and would be a major economic shock both to many industries and to consumers; the industries affected tend to be extremely vocal. The second is that any resulting price increases are likely to have a disproportionate impact on poorer consumers.

To illustrate the point, the European Union’s emissions trading scheme (the ETS) has over long periods produced carbon prices of significantly less than €15 per tonne, in a period when the price at which key technology transformations occur has appeared to be much higher, around €70 or more. Even more significantly, it is now widely recognised that the world is so far from reaching safe climate targets that it will have to contemplate large scale sequestration of CO2, i.e. extraction from the atmosphere, at much higher costs, probably at least €200 per tonne (best available current technology puts this figure at around €600 per tonne).

Failures of the ETS reflect many factors, including industry lobbying and the post-crisis recession, but they certainly include weak political will in relation to higher carbon prices.

Carbon Fee and Dividend

Citizens’ Climate Lobby[1] has proposed a carbon fee and dividend approach, in which the whole of the revenue from a “carbon fee” is returned directly to households as a monthly dividend. The idea is that the majority of households will receive more in dividend than they pay in increased energy costs. This feature protects family budgets and frees households to make independent choices about their energy usage, while the higher fuel prices spur innovation in low-carbon products.
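To make the arithmetic concrete, here is a minimal sketch of how a fee-and-dividend scheme works out for households with different carbon footprints. The fee level, the number of households and the emissions covered are illustrative assumptions of mine, not figures from the CCL proposal.

```python
# Illustrative fee-and-dividend arithmetic (all figures are assumptions).

CARBON_FEE = 40.0          # assumed fee, £ per tonne CO2
HOUSEHOLDS = 28e6          # approximate number of UK households (assumption)
TOTAL_EMISSIONS = 350e6    # assumed tonnes CO2 covered by the fee per year

revenue = CARBON_FEE * TOTAL_EMISSIONS   # total fee revenue, £ per year
dividend = revenue / HOUSEHOLDS          # equal dividend per household, £ per year

def net_benefit(household_footprint_t):
    """Dividend received minus the extra costs paid via the fee.

    A household whose (direct plus embedded) footprint is below the average
    comes out ahead; a household above the average pays in more than it gets back.
    """
    extra_cost = CARBON_FEE * household_footprint_t
    return dividend - extra_cost

average_footprint = TOTAL_EMISSIONS / HOUSEHOLDS
for footprint in (0.5 * average_footprint, average_footprint, 2 * average_footprint):
    print(f"footprint {footprint:5.1f} t: net benefit £{net_benefit(footprint):6.0f}/yr")
```

On these assumed numbers the average household breaks even by construction, lower-footprint households gain, and only the higher-footprint households pay in more than they receive.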

The scheme would include border adjustments[2] to protect UK businesses, levying fees on imports from countries without a price on carbon, along with rebates for UK industries exporting to those countries. This would discourage businesses from relocating to where they can emit more CO2, and incentivise other countries to adopt similar carbon pricing policies.

In fact this idea is part of a family of similar proposals aimed at shifting the burden of taxation towards “green taxes”, which seek to redress the market failures that leave environmental damage unpriced. The counterpart to these socially beneficial taxes is a reduction in other forms of taxation, and related concepts include “revenue neutral carbon taxation” and the notion of a carbon “wealth fund”, promoted in an earlier blog on this site by Adam Whitmore.

The case for action

I believe there will be numerous points of detail to get right, whatever new approach is adopted, and I have a few concerns on specifics.

·         Border adjustment taxes look fine in principle, but the administrative detail in their application looks formidable, and they would still have to comply with international rules on trade.

·         CCL propose a gradual increase in the tax on carbon, starting from quite low levels. As discussed on another page on this site, the stark message of the science is that the highest cost attaches to early emissions, since they are both cumulative and bring forward climate milestones.

·         Getting both the macroeconomics and the redistributive consequences right will be challenging, both technically and politically.

·         The idea of a wealth fund, as opposed to immediate revenue neutrality, may be the preferable route.

Despite these concerns I am more and more convinced[3] that more effective action on carbon pricing is essential, and that its combination with a redistributive agenda in domestic politics is probably the best available way to break the current logjams on climate policy.





[1] Citizens’ Climate Lobby (CCL) is a non-profit, nonpartisan, grassroots advocacy organisation, started in the United States. There are now CCL groups in 40 countries, including the UK network established in 2015.

[2] An idea also previously promoted by Dieter Helm and Cameron Hepburn.
[3] https://www.ft.com/content/9a6f8e90-0398-11e9-bf0f-53b8511afd73. As a final footnote, Arun Majumdar discusses the issue in the FT of 23 December.

Thursday, December 6, 2018

ENERGY STORAGE. FINDING THE SILVER BULLET


Energy storage is now widely recognised as a necessary component in most options for achieving a low carbon future. For most of our history, energy storage has taken the form of physical stocks of fossil fuel, to be drawn on as and when required. Matching supply and demand in real time has therefore been comparatively simple. Even in the complex world of large power systems, generation can be turned up or down with comparative ease.

That world is changing. Renewable resources (mostly) provide energy according to their own timetable and the dictates of weather and season, most obviously so for the best developed resources of solar and wind power. Nuclear power output can be used to follow load, at a cost, but is still relatively inflexible. Matching these outputs to highly variable consumer demand, for a variety of energy services from heat to transport, as well as appliances and processes of all kinds, is going to be more and more challenging. This implies the fundamental importance of energy storage.

The key technical and economic requirements will in different applications include minimising weight or volume (car batteries), round trip energy efficiency (minimising losses in conversion processes to and from storage), sufficient scale (for large power system applications), low capital cost per kW (unit of power output), and low capital cost per kWh (unit of energy stored). The application determines what is most feasible and economic in each case, and hence also both the choice of existing technologies and the priorities in looking for new approaches.

It is widely assumed by commentators that the great advances in battery technology mean that we are well on the way to solving all these problems. However that is far from being the case. Lithium-ion batteries are probably reaching their inherent technical limits, and although it may be possible to force down production costs further, they still represent a very high capital cost. As a result cost, as well as any scalability or resource limitations, is likely to inhibit their use other than in premium and high value applications, even though these extend to some high volume uses such as electric vehicles (EVs) and a few specific power system applications. Currently foreseen battery technology scores well on efficiency and reasonably well on cost per kW, but not on capital cost per kWh.

We need and are going to need a number of very different types of storage, and the requirements differ widely across the spectrum of energy services and other requirements. Here are a few of the key issues and questions.

In practice the really critical distinction is between situations amenable to storage solutions that operate on the basis of a daily cycle or similar, and those that have to meet annual or seasonal cycles.  For the former, high capital costs can be acceptable but conversion efficiencies will matter more. For the latter, the reverse is the case. Low capital costs are essential and conversion efficiency less important.

The Dinorwig pumped storage scheme in North Wales illustrates the issue of scale and cost very well. Dinorwig, the main storage facility accessed by National Grid in the UK, stores the equivalent of about 9 GWh of energy. Construction is estimated to have cost c. £500 million and involved shifting 10 million tonnes of rock. It provides a significant contribution to managing daily load fluctuations, but is still relatively small in relation to the overall fluctuations in the daily load curve. In practice Dinorwig is sometimes used for the provision of ancillary services such as frequency control, rather than in the more obvious role of storing energy against a daily peak demand. Operating on a daily cycle, the all-in cost of storage is of the order of only a few pence per kWh, allowing the facility to make a valuable contribution to the efficiency of the power system.

The amount of storage necessary to flatten the typical current January daily load curve is of the order of 80 GWh, or about 9 Dinorwigs, in principle still a credible level of investment. However, to cope with even the UK’s current annual seasonal variations in electricity consumption, the storage need would rise to about 17,000 GWh, or nearly 2,000 Dinorwigs. On conservative estimates, if UK space heating loads were to be met through electricity (even using heat pumps rather than resistive heating), the seasonal storage requirement could be up to three times higher, or some 6,000 Dinorwigs.
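The “Dinorwig equivalents” quoted above are simple ratios; a short sketch, using the same rounded figures as the text, makes the orders of magnitude explicit.

```python
# Back-of-envelope check on the "Dinorwig equivalents" quoted above.
# Figures are those used in the text; treat them as rough orders of magnitude.

DINORWIG_GWH = 9.0                         # energy stored by Dinorwig, GWh

daily_flattening_gwh = 80.0                # flatten a January daily load curve
seasonal_gwh = 17_000.0                    # cover the current annual seasonal swing
seasonal_with_heat_gwh = 3 * seasonal_gwh  # if space heating were electrified

for label, gwh in [("daily flattening", daily_flattening_gwh),
                   ("seasonal (today)", seasonal_gwh),
                   ("seasonal (electrified heat)", seasonal_with_heat_gwh)]:
    print(f"{label:28s}: {gwh:8.0f} GWh = {gwh / DINORWIG_GWH:5.0f} Dinorwigs")
```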

Recovery of the very high capital costs of storage, through revenue or compensating benefits, on the basis of a once-a-year store/draw-down cycle, is clearly impossible. The silver bullet of cheap seasonal storage has to come through a technology with extremely low capital costs per unit of storage capacity measured in kWh. The prima facie front runners for high volume seasonal storage are chemical or heat based solutions, including hydrogen/ammonia, synthetic fuels, and bulk heat or phase change methods.
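The point about cycle frequency can be made with a rough annuity calculation. The sketch below uses assumed figures (a pumped-storage-like capital cost per kWh of capacity, a 40-year life, a 5% discount rate and 75% round trip efficiency); it is illustrative only, but it shows why a store cycled daily can deliver energy for a few pence per kWh while the same asset cycled once a year cannot.

```python
# Why cycle frequency dominates storage economics: a sketch with assumed numbers.

def annuity_factor(rate, years):
    """Capital recovery factor: annual charge per £1 of capital."""
    return rate / (1 - (1 + rate) ** -years)

def storage_cost_per_kwh(capex_per_kwh, cycles_per_year,
                         rate=0.05, years=40, efficiency=0.75):
    """Approximate all-in cost per kWh delivered from storage."""
    annual_capital_charge = capex_per_kwh * annuity_factor(rate, years)
    return annual_capital_charge / (cycles_per_year * efficiency)

# A pumped-storage-like asset (assumed ~£55 of capital per kWh of capacity,
# i.e. roughly £500m / 9 GWh) cycled most days versus once a year:
for cycles in (300, 1):
    cost = storage_cost_per_kwh(capex_per_kwh=55.0, cycles_per_year=cycles)
    print(f"{cycles:3d} cycles/year -> ~£{cost:5.2f} per kWh delivered")
```

On these assumptions the daily-cycled store comes out at around a penny or two per kWh, while the once-a-year store costs several pounds per kWh, hence the need for radically cheaper capacity for seasonal storage.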

The nature of heat and chemical storage solutions means that seasonal and indeed all storage choices for the power sector can only be properly evaluated by reference to the totality of the energy system. This is most evident for the two very substantial sectors of heating and transport, whose decarbonisation is an essential part of energy policy, since

·         widespread use of heat networks provides an easier means to tap into low cost seasonal heat storage than attempting to recover the stored energy as electricity to power heat pumps.

·         choices in the transport sector, eg between hydrogen and electric vehicles, have profound consequences for the management of electricity supply.

·         electric vehicles are themselves a potentially substantial source of relatively short term storage, as protection against any intermittency in renewables supply.

But this also emphasises the need for a coordinated approach that treats the energy system as an entity, aims to minimise costs and finds compatible paths forward across the very different elements of power, heat for buildings and transport. Current market structures fall a long way short of that requirement.  




Friday, September 21, 2018

CLIMATE SCIENCE. RESIDUAL UNCERTAINTIES THAT DO NOT CHANGE THE UNDERLYING MESSAGE.


Understanding Time Lags is Important for Climate Policies.


“Cumulative carbon”, the fact that human emissions of CO2 are removed only slowly from the atmosphere through the natural carbon cycle, is the essence of the climate change problem. Given the thermal inertia of our planet, it may produce a substantial time lag between effective action to limit emissions and actual stabilisation of temperatures, equivalent to the operation of a domestic heating radiator on a one-way ratchet. Some climate scientists believe this effect may have been exaggerated, and will be largely offset by other elements in the natural cycle. Even so there is a consensus that we need to move to a world of net zero emissions and beyond.

So much of the predictive element of climate science has been borne out by observation that it is easy to forget there are still major gaps in our understanding and major uncertainties that are very relevant to understanding what climate policies are necessary to (ultimately) stabilise global temperatures.

The slow but seemingly relentless upward trend in global surface temperatures sits firmly in the middle of past model predictions, and denials that it is actually happening (the famous Lawson “hiatus”, based on cherry-picking outlier El Niño effects), or claims that it has nothing to do with human contributions to greenhouse gas concentrations, look increasingly threadbare and ridiculous. There is no doubt that human-induced climate change is with us, and that it is dangerous.

However, although the science makes it clear that urgent emission reductions are essential, it is much less clear how much leeway we have, and whether the 1.5°C or 2.0°C “targets” are attainable. The uncertainties manifest themselves in discussions over the time lags involved, for example between stabilisation of the atmospheric concentration of CO2 and stabilisation of global temperature. These questions relate in turn to the complexities of the natural carbon cycle and the thermal inertia of the oceans.

The issue of time lags is in some ways politically important. Thomas Stocker, then co-chairman of the IPCC Working Group I (assessing scientific aspects of the climate system and climate change), and addressing the Environmental Change Institute in Oxford in 2014, argued that “committed peak warming rises 3 to 8 times faster than observed warming”. The implication is that there are very substantial time lags. In this case temperatures could continue to rise, perhaps for several decades or even centuries, even if net human emissions were reduced to zero. Similar comments can be found from other climate scientists. It has even been suggested that the thermal inertia effect, considered on its own, could stretch the lag to about 200 years – the time for heat equilibrium adjustment to reach the deep oceans.

The intuitive physical explanation of long time lags is simply the phenomenon of thermal inertia. When we turn up the radiator at home it may take an hour or more for the room to reach a new equilibrium temperature. Global warming, in this analogy, is a radiator heating system on an upward ratchet. The issue for humanity is that by the time temperatures become really uncomfortable, we have already ratcheted up the future temperature to which we are committed. If this is a real danger it dramatically increases the risks associated with climate inaction or “business as usual” trends. It also creates an alarming image of climate change as a kind of doomsday machine in which humanity is trapped through its failure to anticipate and respond.
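For readers who like to see the analogy written down, a toy first-order lag model captures the idea: temperature relaxes towards its committed equilibrium value with some time constant. The time constant and the temperature figures below are assumptions for illustration only, not outputs of any climate model.

```python
# A toy first-order "radiator" model of thermal inertia: temperature relaxes
# towards its equilibrium value with time constant tau.  Purely illustrative --
# real climate lags involve several reservoirs (upper ocean, deep ocean)
# with very different time constants.

import math

def temperature_after(t_years, t_equilibrium, t_now=0.0, tau=30.0):
    """Warming t_years after forcing is held fixed, for an assumed tau in years."""
    return t_equilibrium + (t_now - t_equilibrium) * math.exp(-t_years / tau)

# Suppose observed warming is 1.0 C but the committed equilibrium warming for
# today's forcing is 1.8 C (assumed figures).  How does the gap close?
for years in (0, 10, 30, 100):
    print(f"after {years:3d} years: {temperature_after(years, 1.8, 1.0):4.2f} C")
```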

However rather more optimistic views have been presented by other scientists. In 2014 Ricke and Caldeira[1] argued that the lags had been seriously overstated, suggesting a time lag of only ten years as a more appropriate estimate. Oxford-based climate scientists, such as Myles Allen, have also suggested that stabilising atmospheric concentrations could lead to a comparatively early stabilisation of global temperature.

The main reason is the potential offsetting effect of the absorption of incremental carbon through the various elements of the natural carbon cycle, including ocean CO2 uptake and the behaviour of biosphere carbon sinks. However there is no natural law that requires such a balance of effects, and the reality seems to be that we are trying to compare two magnitudes, both of which are of major importance but are also difficult to measure with precision. If the effects broadly cancel out over particular timescales, this is essentially a numerical coincidence within the modelling effort. We can expect that future research, better measurement and associated modelling will gradually improve our understanding.

But melting ice caps are a separate story!

The above discussion does not however cover all the long time lags involved. One particular concern has to be the polar ice caps. If global temperatures reach the point where these start to melt, then the consequential effects in the form of rising sea levels will go on for centuries. Ice sheets, as opposed to sea ice, can, like the retreating glaciers, only be restored by precipitation. That will be slow and net annual ice gain will probably depend on conditions associated with a fall in global and polar temperatures to or beyond what used to be regarded as normal.

And the policy implications of these uncertainties?

In reality the consequences for policy of differing estimates can be exaggerated, since there is agreement on most fundamentals. Most modelling now recognises that stabilisation of temperature, even at a higher level, requires progress to net zero human emissions. Importantly, this is probably unachievable without substantial measures to remove CO2 (and other gases) from the atmosphere. In practice, processes for carbon sequestration are likely to be very expensive, and they represent a form of geo-engineering.









Monday, September 17, 2018

SMART METERS. PRIVACY ISSUES OR A VITAL TOOL FOR OUR FUTURE?




The Guardian two weeks ago featured an anguished reader’s letter concerned about the invasion of privacy involved in the installation of smart meters in UK households. It’s worth reflecting briefly on what the privacy and security issues might be, what the real social value of smart meters might be, and how we should balance these with an effective policy.

Smart meters, and I have just acquired one for my electricity supply, will tell you a number of things that you might previously have found difficult to work out. These include, for example, what your power use (in watts or kilowatts) is at any instant, and what it was over the last few hours, days or months. The privacy concern is that the utility, your supplier, can be assumed to have the capability to collect and keep this data, and could, for example, build up a picture of the minute by minute electricity consumption of every household with a smart meter.

There are many, not particularly sinister, reasons why utilities might want to do this. At a minimum, a better understanding of how consumers use electricity can help in planning future system needs, making sure that local networks have adequate capacity to cope with fluctuations in load, and so on.

On privacy and security issues we should perhaps be far more worried about the amount of sensitive information held on us by our banks and credit or store card issuers, not to mention Facebook, Google and telecoms suppliers, or, currently in the news, the airlines we use. Between them these hold vast amounts of information about our shopping habits, lifestyles, opinions, financial affairs, favourite websites, and so on, all of which, if privacy and security are breached, potentially gives rise to much more serious abuses than someone being able to work out what time a household has breakfast or runs the washing machine.

It’s also certainly true, as a number of readers are testifying, that the government has not fully thought through its policy objectives on smart meters, and that the programme is unlikely to deliver many of the promised benefits, at least in the short term. And of course there are as usual a lot of horror stories on installation failures. But none of this should blind us to the fact that there is a huge and essential future for devices which create a much closer connection between the way we use energy, and electricity in particular, and the factors that constrain when and how it is produced and delivered.

Let us take a simple future example. We expect a big future for electric vehicles. This has the potential to create big spikes in load, with everyone switching on together, in a period when we will depend increasingly on renewable energy sources which are much more variable than current supplies, and harder to match to varying consumer demand. One answer is for some EV owners to charge their vehicles overnight, but for the timing of that supply to be at the discretion of the supplier, who can match it to when production is available. In exchange for this surrender of direct control, consumers will get a much more favourable tariff for their EVs, and the practical issues of managing network overload, when all EV owners try to re-charge their vehicles immediately on return from work, can be avoided. This obviously implies an element of intrusion, in the sense that the utility “knows” the purpose of the load it is required to meet. It also transfers a choice (over timing) from the consumer to the utility. But this is a commercial transaction, willingly entered on both sides, providing benefits to both parties.
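A minimal sketch of what supplier-managed charging might look like is given below. The tariff structure, the prices and the scheduling rule are all illustrative assumptions of mine; a real system would also respect local network limits and whatever contractual parameters the consumer has agreed.

```python
# A sketch of supplier-managed overnight EV charging: the supplier schedules a
# required energy delivery into the cheapest half-hours before departure.
# All names and prices are illustrative assumptions, not any real tariff or API.

def schedule_charge(energy_needed_kwh, charger_kw, slot_prices_p_per_kwh,
                    slot_hours=0.5):
    """Pick the cheapest half-hour slots until the required energy is covered."""
    plan = []
    remaining = energy_needed_kwh
    # Cheapest slots first -- a real scheduler would also respect network limits.
    for slot, price in sorted(enumerate(slot_prices_p_per_kwh), key=lambda x: x[1]):
        if remaining <= 0:
            break
        delivered = min(charger_kw * slot_hours, remaining)
        plan.append((slot, delivered, price))
        remaining -= delivered
    return sorted(plan)   # return the plan in time order

# 50 kWh needed, 7 kW home charger, 16 overnight half-hour slots with varying prices.
prices = [14, 12, 9, 7, 5, 4, 4, 5, 6, 8, 10, 11, 12, 13, 14, 15]
for slot, kwh, price in schedule_charge(50, 7, prices):
    print(f"slot {slot:2d}: charge {kwh:4.1f} kWh at {price}p/kWh")
```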

Smart meters are just a starting point for more sophisticated and user friendly tariff and conditions-of-use arrangements which can redefine the ways we use energy. No doubt there will be privacy issues to be managed, but essentially these will be no different in character from the privacy issues we encounter in relation to almost all our transactions with businesses large and small. They will be substantially less than most of those that we face in relation to financial transactions, social media and our day-to-day use of IT and the internet.

Some of the benefits that can flow from more sophisticated metering and tariffs are highlighted in a report published last week by Energy Systems Catapult, which starts to explore the numerous tariff issues highlighted by progress towards a low carbon economy.

Thursday, August 23, 2018

INFRASTRUCTURE SPENDING. THE GENOA BRIDGE COLLAPSE AND PARALLELS WITH INACTION OVER CLIMATE ISSUES.


A low cost of capital but a high cost of inadequate investment.

Why, when interest rates are so low and there is a global glut of capital, are there not much higher levels of investment in infrastructure?

The Genoa bridge collapse is a reminder of the high human and economic cost of failure to provide the safe infrastructure on which a modern society depends. In the case of the failed bridge the blame game is now in full cry, and the causes to which its sudden and catastrophic collapse can be, will be, or are being attributed are numerous and not necessarily mutually exclusive. They include:

·        questions over the original 1960s design.

·        failure to heed expert professional advice that the bridge was at imminent risk.

·        reluctance on the part of populist politicians to countenance any form of infrastructure spending, combined with

·        a readiness to shout “Project Fear” in response to expert opinion.

·        constant pressures to reduce public spending

·        failures by those responsible for bridge upkeep in the adequacy of maintenance and the monitoring of its structural integrity.

Mostly these factors relate to unwillingness to spend money on infrastructure investment, and the search for excuses to avoid doing so. Tony Barber, writing in the FT (17 August), argues as follows:
“… the case for more government spending on infrastructure is unanswerable. Of course, … controversial questions of taxation, spending priorities and state borrowing. But unless these questions are addressed, the next catastrophic bridge collapse is only a matter of time.”

For “catastrophic bridge collapse” read “global climate crises”. The parallels with action on climate policy are both clear and frightening. Action on climate is as essential to our collective future as bridge maintenance is to road safety. The cost of the investment spend is small in relation to the costs of failure (an economic disaster for Genoa and for Italy). And most spending to deal with climate issues is essentially infrastructure in one form or another.

The parallels extend to the politics. As with the Genoa bridge, populist politicians (eg Redwood, Rees Mogg and Lawson) prefer to decry expert opinion, in this instance the science of climate, as “climate alarmism”. The ultimate costs of failure (to tackle climate issues) will, as with the bridge, be much higher both in absolute terms and relative to the necessary levels of investment. Infrastructure is commonly a public good, with collective benefits, and no doubt many people would prefer to enjoy the benefit without having to pay a share of the cost. For climate policy, reluctance to pay is reinforced by the knowledge that the problem is not even contained regionally or nationally, but is global. Unilateral national action is therefore of limited value. If the world is in reality doomed to endure climate catastrophe because of the indifference or inadequate response of other big players (eg the USA, China, the EU, India), then national investment, however well executed, provides little benefit to the country that undertakes it. It is a classic “free rider” problem.

We consistently overestimate the cost of the necessary investments.

Why are so many politicians able to convince themselves, and indeed much of the press and public, that the cost of essential infrastructure investments is unaffordable? Part of the answer comes down to how such capital projects are financed, and stems from fallacies that surround much of the conventional analysis of the real cost of that capital, and how that investment should be rewarded.

How is it that governments are currently able to borrow at rates of interest that are close to zero, or even negative in real inflation adjusted terms, while EDF’s Hinkley Point project is reported as providing a lifetime return of around 9%?  Needless to say such a rate of return means a high cost to the consumer.
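A simple capital recovery calculation shows how much this gap matters. The sketch below compares the annual charge needed to recover each pound of capital over an assumed 35-year contract life (roughly the length of the Hinkley CfD) at a government-like 2% and at the reported 9%; it ignores the construction period, taxation and operating costs.

```python
# How much the assumed cost of capital matters: annual capital charge per £1
# of investment over an assumed 35-year contract life.  A sketch only.

def capital_recovery_factor(rate, years):
    """Annual payment needed to recover £1 of capital over `years` at `rate`."""
    return rate / (1 - (1 + rate) ** -years)

for rate in (0.02, 0.09):
    crf = capital_recovery_factor(rate, 35)
    print(f"cost of capital {rate:4.0%}: annual charge £{crf:0.3f} per £1 invested")
```

On these assumptions the annual charge, and hence the capital element of the cost passed on to consumers, is more than twice as high at 9% as at 2%.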

The cost of capital reflects risk? But this is a very specific definition of risk.

Conventional capital market theory (the well known CAPM model beloved of finance theorists and the basis of much financial analysis) does indeed link the cost of capital to perceived market risk. But it is a very specific definition of risk, defined by the extent to which returns on an individual share move with returns on the stock market as a whole (eg the FTSE 100 index) – the “beta” value. It does not depend on the riskiness of an individual project, since it is assumed that investors will diversify away individual risks, even risks as big as that of a massive overrun on construction costs. Project specific risk, in this context, is irrelevant.
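For reference, the CAPM relationship itself is very simple; the sketch below uses illustrative numbers for the risk-free rate, the market return and a low, utility-style beta.

```python
# The CAPM relationship in one line: the required return depends only on beta,
# the co-movement of the asset with the market, not on project-specific risk.
# The figures below are illustrative assumptions.

def capm_return(risk_free, beta, market_return):
    """Expected return under CAPM: r = r_f + beta * (r_m - r_f)."""
    return risk_free + beta * (market_return - risk_free)

# A low-beta regulated utility versus the market average (beta = 1):
for beta in (0.3, 1.0):
    print(f"beta {beta:3.1f}: required return {capm_return(0.01, beta, 0.07):5.1%}")
```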

The same conventional theory suggests that the cost of capital, viewed as the return over the whole life of an asset, should equate, loosely at least, to the rate of return achieved by utility investments generally. Utility earnings tend to have a rather low correlation with the general economy and the various share indices, implying a low cost of capital. The costs of climate change, and hence the benefits of limiting those costs, are almost entirely independent of any short term market volatility.

And the same is true for almost any “essential” investment, a category in which we can place both bridge structural repair and investment in mitigating climate change. In commercial terms these should be seen as low risk investments, and financed on a basis that properly reflects that. Conventional finance theory, therefore, is entirely consistent with the use of very low discount rates for appraisal of essential infrastructure and “public good” investments.

We can manage risk and contain the cost of capital.

Recent UK history abounds in expensive mistakes in relation to major capital investment programmes, and many of these relate to a failure to get the right links between risk and reward (the cost of capital) or to manage risks in the most appropriate way.  These include notorious private finance initiative (PFI) projects in which the public finances (via the NHS) have been forced to bear the burden of high returns on capital, even though most of the real risks have remained with the public sector or been de facto underwritten by government. There are two major principles that should help avoid these mistakes in the future.

The first is to recognise project specific risks and separate them from calculation of appropriate returns over the full life of an asset. Many major capital projects, such as nuclear power stations or tidal barrages, are intrinsically subject to significant uncertainties over construction costs and the risk of cost overruns. These may be shared between the construction business and the utility company that operates the asset, and will be embodied in the cost attributed to the asset at the time of its commissioning. The contractor will need to be properly rewarded for the risk taken during construction. But thereafter, within a well regulated sector, the asset enjoys a safe “low beta” utility return. Investment should therefore be appraised on the basis of a very low cost of capital.

The second is to ensure that economic regulation of utilities (eg the power sector) is appropriate. This means providing a degree of security over the future regulatory framework to protect the investors (in the asset) from expropriation of what is for them a sunk cost, if necessary through revenue guarantees.   If the framework is inadequate in providing this kind of regulatory reassurance, the investment will inevitably carry a much higher “project specific” risk premium. The public sector will in effect be paying very heavily for risks that are within its power to avoid.

Lord Stern attracted a great deal of criticism following the Stern Review for arguing that very low discount rates should apply to the assessment of climate mitigation investment. Recent experience of ultra-low interest rates, and indeed some of the indications of the rates attainable for major projects with appropriate guarantees, suggests that low discount rates may be the most appropriate for evaluating and comparing low carbon investments.

Saturday, July 28, 2018

HEAT RECORDS FALLING, AND WILDFIRES ACROSS THE GLOBE. AT LAST A DEMONSTRATION THAT GLOBAL WARMING IS REAL.


No! We had that long ago. It’s just that we are now starting to see the impacts “play out in real time”.


We have had heatwaves and long hot summers before, even in the UK. But record-breaking weather on its own, especially when confined to particular locations, demonstrates very little. Even so, the coincidence of heatwaves across North America, Europe from the UK to Greece, and Japan, together with the even more dramatic and extraordinary temperatures observed in the Arctic, is striking.

Scientists, as opposed to much of the media, have always preferred to concentrate on careful and painstaking analysis of global average temperatures, rather than extreme events. The evidence there is clear: temperature measurements show a steady upward trend over many decades, a trend that aligns closely with the predicted effect of increasing atmospheric concentrations of greenhouse gases (GHG). The serious nature of human-induced warming has been well established for at least twenty years. More on the scientific debate can be found on other pages on this site.

But the science is now starting to go further. It has long predicted that global warming would significantly increase the number and intensity of heatwaves, but improved analysis is now getting much closer to assigning the causation of particular extreme weather events specifically to the signal of climate change, at least with increasingly strong probabilities.

The heatwave currently scorching northern Europe was made more than twice as likely by climate change, according to one initial assessment. Even more extreme conditions could be occurring every other year by the 2040s. “The logic that climate change will do this is inescapable – the world is becoming warmer, and so heatwaves like this are becoming more common,”  Friederike Otto, at the University of Oxford, is reported as saying.

“We found that for the weather station in the far north, in the Arctic Circle, the current heatwave is just extraordinary – unprecedented in the historical record,” said Geert Jan van Oldenborgh, at the Royal Netherlands Meteorological Institute.

The Guardian reports that previous attribution analyses have shown very strong connections between climate change and extreme weather events. The scorching summer in New South Wales, Australia, in 2016-17 was made at least 50 times more likely by global warming, meaning it can be “linked directly to climate change”. The “Lucifer” heatwave across Europe’s Mediterranean nations in the summer of 2017 was made at least 10 times more likely by climate change, while the unprecedented deluge delivered in the US by Hurricane Harvey, also in 2017, was made three times more likely by climate change. However, other events, such as storms Eleanor and Friederike, which hit western Europe in January, were not made more likely by climate change. Serious climate change is “unfolding before our eyes”, according to Professor Sutton, at the University of Reading.

But disputing the science, and the measurement of change, is an ideological obsession. The same team is trying to bring you Brexit.


What are the common factors linking the following?

·        The Institute for Economic Affairs.

·        Nigel Lawson. John Redwood. Jacob Rees Mogg.  (Conservatives – to name just three)

·        Graham Stringer (Labour).

·        UKIP and Nigel Farage.

·        Melanie Phillips. Christopher Booker. (Journalists)

·        Donald Trump

All have displayed strong or passionate opposition to the evidence and logic of climate science, or to any action to mitigate it.  All have also been passionate advocates of Brexit. Stringer was one of four Labour MPs voting with the government on recent crucial Brexit votes, and is also involved with Lawson’s Global Warming Policy Foundation - Lawson of course was a lead figure in the Leave campaign.

Phillips also has strong anti-science form, supporting the disgraced doctor Andrew Wakefield over MMR vaccination, an area where the provision of flawed and inaccurate information to the public has caused huge damage. So does Booker, on the subject of evolution. To be fair, on Brexit, he has partially recanted.

I could create a much longer list (see my earlier comments on Brexit and Brexit economists), but one interesting question is simply this. Why is there such a strong correlation?  Is it the absence of any ability to deal rationally with fact and logic, or to comprehend the sophisticated nature of scientific method, or is it pure ideology?

And when are we going to hear more from some of the above on the subject of global warming?

Saturday, July 21, 2018

TRUMP, TRADE WARS AND CLIMATE POLICIES




An opportunity to introduce a rational and positive element to retaliatory measures.

Trump’s USA seems to be hell-bent on destroying every element of the pre-Trump international order, the WTO, presumptions in favour of free trade, and US participation in the Paris accord on climate. But a recent piece in Nature magazine proposes a neat way of turning some of this to a climate policy advantage.

The recent tit-for-tat on punitive tariffs has been based first and foremost either on targeting industries that are the source of the grievance, notably steel in the case of Trump, or on products where the retaliatory tariff will cause pain in the USA, famously Harley-Davidson.

The novel suggestion however is that the prospect of a global trade war provides an opportunity for the introduction of tariffs that, unlike many forms of trade barrier, can be considered to be almost unequivocally beneficial in their effect on human welfare. The suggestion is a simple one. Place the retaliatory tax on the products with the highest carbon footprint, relative to the alternatives of domestic production. Indirectly this will also counter at least to a small degree the environmentally reckless policies Trump and the Republican party are promoting at home.
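Mechanically, such a levy could be as simple as the sketch below: price the excess embodied carbon of the import relative to the comparable domestic product. The carbon price and the emission intensities used are illustrative assumptions, not figures from the Nature article.

```python
# A sketch of how a border carbon adjustment might be computed for a product:
# levy = carbon price x (embodied emissions of the import - emissions of the
# comparable domestic product).  All numbers are illustrative assumptions.

CARBON_PRICE = 50.0   # assumed carbon price, € per tonne CO2

def border_adjustment(import_emissions_t_per_unit, domestic_emissions_t_per_unit):
    """Per-unit levy; zero (no rebate) if the import is no dirtier."""
    excess = import_emissions_t_per_unit - domestic_emissions_t_per_unit
    return max(0.0, CARBON_PRICE * excess)

# e.g. a tonne of steel produced with coal-based power versus a cleaner domestic route
print(border_adjustment(import_emissions_t_per_unit=2.1,
                        domestic_emissions_t_per_unit=1.2))   # -> 45.0 € per tonne
```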

Ideally of course this should be part of a wider set of agreements on effective carbon pricing, but “border carbon adjustments”, or BCA, clearly have a lot going for them. Nor is the idea new or without political support. “In 2017, French President Emmanuel Macron called them ‘indispensable’ for European climate leadership, and Canadian environment minister Catherine McKenna recommended closer scrutiny. Mexico included them in its Paris pledge” according to the Nature article. The US House of Representatives backed a similar approach in the Waxman–Markey Bill in 2009, but the bill failed to reach a vote in the Senate.[1]

Wholesale implementation, by the EU for example, of such an approach on a global basis would have a number of technical difficulties, but its examination in the context of retaliatory tariffs could provide some interesting outcomes. It would of course tend to hit those states in the USA that are most keen to protect their coal industries. As it happens that includes a number with a strong Republican base.

The other advantage is that what starts as a rather crude device, and as part of a trade war, could also evolve over time to become a significant component of mainstream decarbonisation policy.



[1] California, it is claimed, has introduced a version of BCAs, but this is confined to its energy market.


Monday, July 16, 2018

THE ROLE OF TARIFFS IN A LOW CARBON FUTURE




Redefining how we take our electricity supplies. The complexities of allocating fixed costs. 
And the need to recognise environmental costs through carbon pricing.

It is hard to overstate the importance of retail tariffs[1] for the efficient financing and operation of public utilities, especially in the power sector. Tariffs represent the pricing and charging structure through which most consumers are supplied. They influence how consumers use electricity, and tariff revenues underpin the returns necessary to pay for utility investment.

To meet objectives of both equity and economic efficiency, it is generally accepted that prices should accurately reflect costs.  This provides a means to coordinate consumer choices, on how much and how they use power, with utility decisions on how they manage, operate and invest in their networks and in their sources of generation. Purchase, as opposed to sales, tariffs set the terms on which small scale producers can sell into the network and are an important influence on the development of decentralised power production in particular.  However, interpretation of how best to reflect costs is, as we shall discover, quite complex and requires careful analysis and judgement.

In the new world of low carbon energy, three important trends will change the way in which electricity is produced and delivered, the shape of future tariffs, and the nature of the service relationship between utilities and households (and business). These trends are Decarbonisation, Decentralisation and Digitalisation (the three D’s), and each has implications for tariffs.

Decarbonisation of the energy sector is widely perceived as requiring much greater use of non-fossil electricity to substitute for current fuels in heating and transport. But it also changes the cost structure and operational characteristics of generation technology. It substantially reduces the importance of variable (fuel) costs, which are what largely underpin the design and operation of today’s markets, and raises the importance of capital costs. Low carbon technologies (both nuclear and renewables) are inherently less flexible in adjusting to fluctuations in consumer demand. This raises the importance of managing consumption patterns, and hence of tariffs and price signals, within a coordinating price mechanism for balancing supply and demand in real time.

Decentralisation is implied by the growing significance of small scale producers (sometimes known as prosumers), by the smaller scale of many renewable technologies, and by the increasing importance attaching to management of local network constraints, as electricity plays a large and increasing role in overall decarbonisation strategies. Tariffs, especially those under which small producers can sell to a public network, should be designed to make an efficient connection between small operators and larger local or national grids.

Finally, digital technologies permit much more complex and sophisticated information and control systems. These can help maintain stable and balanced power systems, enable more sophisticated tariffs and consumer choices, and permit more efficient management of consumer requirements. They are therefore part of the solution.

We need to review many questions on retail tariffs, and establish general principles for the development of retail tariffs and retail supply in a low carbon future. Such principles would provide a new paradigm within which sector policies can evolve, and a benchmark against which options can be judged.

The Long Run Marginal Cost (LRMC) Approach

It is widely understood that for both renewable energy and nuclear power the marginal cost of generation, defined as the cost of an additional unit of production from existing generation assets, is close to zero. Setting retail prices at this zero short run marginal cost (SRMC) is clearly not a viable basis for pricing: making the consumer’s marginal electricity consumption free at the point of use has the potential to create an unlimited demand that cannot be satisfied.

There is however a well-established cost reflective benchmark for electricity tariffs based on long run marginal cost (LRMC) principles. This is intended to ensure that consumers pay the full incremental costs (at least of generation), including capital costs, that they impose on the power system. “Marginal cost (as LRMC) is an engineering estimate of the effect upon the future time stream of outlays of a postulated change in the future time stream of output.”[2]

This implies that the real costs of meeting very different types of load can be very different. High load factor loads (eg continuous or baseload), or loads that are well matched to patterns of production, require less capacity per unit of energy supplied. Low load factor loads, or loads concentrated in winter when solar output is concentrated in summer, will be more expensive to serve. This makes the calculation of LRMC, and hence what different kinds of consumption might pay, subject to careful analysis and calculation. “There are as many marginal costs as there are conceivable postulated changes.”[3]
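A stylised calculation illustrates the point: the same annualised capacity cost is spread over very different volumes of energy depending on load factor. The capacity and energy cost figures below are assumptions for illustration only.

```python
# Why LRMC differs so much between load types: the annualised cost of capacity
# is spread over very different numbers of kWh depending on load factor.
# All cost figures are illustrative assumptions.

CAPACITY_COST = 70.0     # assumed annualised cost of capacity, £ per kW per year
ENERGY_COST = 0.01       # assumed marginal energy cost, £ per kWh

def lrmc_per_kwh(load_factor):
    """Approximate long run marginal cost per kWh for a load of given load factor."""
    kwh_per_kw_year = load_factor * 8760
    return CAPACITY_COST / kwh_per_kw_year + ENERGY_COST

for label, lf in [("baseload (90%)", 0.90), ("typical household (35%)", 0.35),
                  ("winter-evening peak (10%)", 0.10)]:
    print(f"{label:28s}: {100 * lrmc_per_kwh(lf):5.1f} p/kWh")
```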

The importance of digital technologies is that they allow us to think about this kind of cost reflectivity in a much more granular way, and reflect the very different costs implied by different kinds of consumption such as the traditional household applications, electricity for heat pumps, or the charging of electric vehicles. Each of these can have a very distinct load profile, with a very different servicing requirement for the consumer.

Reliability requirements, the differentiated nature of consumer needs, and supplier managed load

The importance of capacity costs also brings into sharp relief the fact that the standard of supply reliability is itself a very important driver of costs. A high standard of reliability, defined as a very low probability of failure to meet the maximum, unconstrained, instantaneous, aggregate demand of all consumers, implies a need for very substantial spare capacity margins. These may be needed to cater for daily and seasonal peak loads, for generator downtime (eg breakdowns) and for weather related fluctuations in renewables output.

However not all consumption requirements need the same level of instant access and reliability. We quite reasonably expect our power need for lighting, or for a television programme, to be met instantaneously. We are likely to take a very different approach to, for example, overnight charging of an electric vehicle battery with perhaps 50 kWh of energy, and be largely indifferent to when it is delivered, eg overnight or even over two or more days. It makes sense to permit the supplier to choose the timing of delivery, within clearly defined parameters, in order to match generation availability and any network constraints. Other loads, such as laundry or domestic water heating, will have their own requirements, and the consumer can choose how far to relax the requirement for instantaneous delivery of power for each of them.

What we expect to see in a low carbon future, therefore, is consumers being able to make a selection from a menu of tariffs, with different supply arrangements and prices in each case:

·        Some supplies, eg for lighting circuits, taken at a premium price, with the highest level of guaranteed reliability.

·        Some consumers choosing a lower reliability standard, at least for some of their needs, with a lower price.

·        Some large loads, such as vehicle battery charging or heating, provided on the basis that the supplier manages the timing of energy delivery.

Future systems will place a high premium on pro-active and effective management of the use of electricity for electric vehicle charging and domestic heating, based on innovative tariffs and a redefined approach to retail supply.

Allocation of fixed costs

Generation costs are however only a part of the story. A substantial proportion of total power sector costs reside in high voltage transmission, and even more in the local distribution networks. As with many networks (including road and rail) the marginal cost of accommodating extra throughput (the extra car or train) is, at least in uncongested networks, very low. But the fixed cost still needs to be recovered. How best to do it poses some very difficult questions in terms of reconciling considerations of equity and income distribution, on the one hand, and the efficient allocation of economic resources on the other.

Current UK practice for smaller retail consumers, for example, is simply to average most fixed costs over all units of energy sold. This seems fair, and prima facie results in those who consume most (and might broadly also be those with higher incomes) paying the most towards the fixed costs. However it distorts the economic message that the actual marginal cost is much lower. When policies for a low carbon economy include persuading consumers to use large amounts of extra electricity for heating (eg with heat pumps), this becomes a very serious obstacle. For a household consumer, a higher fixed charge in the tariff, and a lower unit energy charge, transforms the choice between the low carbon solution (electric heat pumps) and traditional fossil fuels (gas or oil).
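A stylised comparison shows how large the effect can be. The prices, the heat pump coefficient of performance and the split between marginal and fixed costs below are all illustrative assumptions of mine.

```python
# How the tariff structure changes the heat pump versus gas comparison.
# All prices, the COP and the fixed-cost loading are illustrative assumptions.

HEAT_DEMAND_KWH = 12_000      # assumed annual useful heat demand
COP = 3.0                     # assumed heat pump coefficient of performance
GAS_P_PER_KWH = 4.0           # assumed gas price, pence per kWh of heat

# Electricity: assume a "true" marginal cost of 8p/kWh plus 7p/kWh of fixed
# network and policy costs that can either be loaded onto the unit rate or
# recovered through a fixed annual charge.
MARGINAL_P_PER_KWH = 8.0
FIXED_LOADING_P_PER_KWH = 7.0

def heating_cost(unit_rate_p):
    """Annual running cost of heat pump heating at a given electricity unit rate, £."""
    return HEAT_DEMAND_KWH / COP * unit_rate_p / 100

cost_gas = HEAT_DEMAND_KWH * GAS_P_PER_KWH / 100
cost_hp_averaged = heating_cost(MARGINAL_P_PER_KWH + FIXED_LOADING_P_PER_KWH)
cost_hp_marginal = heating_cost(MARGINAL_P_PER_KWH)

print(f"gas boiler:                       £{cost_gas:6.0f}/yr")
print(f"heat pump, fixed costs in kWh:    £{cost_hp_averaged:6.0f}/yr")
print(f"heat pump, fixed costs as charge: £{cost_hp_marginal:6.0f}/yr")
```

On these assumptions the heat pump looks more expensive than gas when fixed costs are averaged into the unit rate, and cheaper when they are recovered through a fixed charge, which is exactly the distortion described above.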

Another problem arises with purchase tariffs. These provide an incentive to small scale producers that should, in an ideal world, result in consumers installing their own generation when this is “efficient” and results in a reduction in total societal costs. However, if the kWh rate in the purchase tariff is overstated by including an allocation of fixed cost, it will result in too much own generation. There will be no saving in fixed cost and, while the individual consumer with own generation may benefit, a larger share of fixed network costs will be picked up by others.

There are potential answers to this question that are not argued in detail here, since they take us deeper into complex policy, political and administrative questions than is appropriate for a short article. Possibilities include the recovery of fixed costs through property taxes, and approaches in which fixed costs are recovered with differentiation according to the use to which power is put, for example with a higher fixed cost levy on EV charging (a premium use of electricity) than on heating, which is in competition with gas.

Reflecting the substantial environmental and climate costs of CO2 emissions

In the transition to a low carbon economy the case for more realistic levels of carbon taxation, as an incentive to invest in low carbon generation assets, and to minimise the share of fossil fuel in both consumption and production, is overwhelming. However this is not current policy in many countries. The UK currently has a particularly perverse approach in that the burden of renewables innovation policy is loaded on to electricity but not on to other fuels, notably gas. A major plank of low carbon policy is to encourage the use of electricity for heating, through the medium of heat pumps, and to substitute for gas. But the impact of current policies imposes a discriminatory tax on electricity, raising prices and reducing any incentive for consumers to switch from gas. A well constructed carbon tax, by contrast, would increase the cost of gas, restore a level playing field, and tilt the balance of running cost comparisons towards the electric technology of heat pumps. Perversely, recovering the cost of innovation support through the power sector hampers progress towards a low carbon economy.





[1] What we usually mean by a tariff is a set of prices that are published in advance, are open to all buyers (or for purchase tariffs, sellers) complying with a given set of conditions. They contrast with bilateral trading arrangements, and with “market” structures involving multiple buyers and sellers. They provide the standard route through which most consumers, and certainly smaller consumers, obtain their supplies of energy, water, and many communications services. It is quite normal for a supplier to offer a number of alternative tariff structures, between which consumers can choose an option that most closely reflects their needs. They can follow either simple one-part or two-part formats, or have more complex structures.
[2] Ralph Turvey, one of the pioneers, with Boiteux, of LRMC theory in electricity. Turvey, R., What are marginal costs and how to estimate them? University of Bath, 2000.
[3] Turvey, again.

Wednesday, January 3, 2018

THE COMING MINI ICE AGE. NOT QUITE FAKE BUT NOT QUITE TRUE EITHER!


I picked this up with some interest over the Christmas break as family conversations turned to the nature of our cold weather, and the recollection of winters past when the Thames froze – actually something I recall from my childhood, although the river did not freeze as far down as London and its tidal sections. The debate was around the possibility of a temporary reversal of global warming, even a mini ice age, perhaps allowing time for us to find solutions to the greenhouse gas problem and to adapt to change. The idea seemed interesting and prima facie credible, but further investigation emphasised the need for more caution and slightly less optimism. We all agreed this was no reason to reduce concerns over climate.

……………………………………………..

A model of the Sun's magnetic activity suggests the River Thames may freeze over within two decades, experts say.

Global Warming Overridden by “mini ice age” that will plunge UK temperature in 2030, claim mathematicians.

According to research from universities in the UK and Russia, we could be skating on the Thames in just over a decade.

A flurry of headlines on climate science, including from Sky News, the Mirror and the Sun, illustrates the difficulties of reporting serious science and the temptations of sensational headlines. The fake news sits in the exaggerated headlines, but in this instance there is an important kernel of real fact which, if the solar activity projections are correct, may be relevant both to expectations about climate and potentially to climate policy.

The story is not actually new, but has resurfaced with some particular publications by scientists working in the field of astrophysics who believe they have identified some interesting and potentially important features in the solar cycle.  Solar activity, popularly known as “sunspots”, has often been suggested as a credible explanation of variations in climate over relatively short periods of a few years, with lower activity reducing the amount of energy reaching Earth from the sun. The current story discusses a projected downturn in solar activity over the next 20 – 30 years, starting as early as 2020.  One figure quoted has been the expectation of a 60% downturn in solar activity over this period, although this translates into a reduction in the energy we receive from the sun that is an order of magnitude smaller.

Press reports quickly transformed this story into the prospect of a mini ice age, with the Thames expected to freeze over by 2023, and the possibility that this would save the world from the devastation of climate change. I was perfectly prepared, perhaps being of a gullible or over-optimistic disposition, to accept the reporting at face value, and to treat the story as mildly encouraging. If it was correct, then at a minimum it implied more time to put in place measures to adapt to climate change, and possibly even to develop the technologies that might allow us to manage or even reverse dangerous concentrations of greenhouse gases (GHG).

A reality check followed. This is what emerged.

The sunspot story. There is serious research (Professor Zharkova et al) which forecasts a coincidence of cycles in the sun’s activity, resulting in reduced solar activity. This will result in less energy reaching Earth, though the reduction is quite small. Their work presents a model for the sun’s magnetic field and sunspots, which predicts a 60% fall in sunspot numbers when extrapolated to the 2030s. Crucially, the paper makes no mention of climate. A first failure of science communication was the Royal Astronomical Society press release of July 9, which stated that “solar activity will fall by 60 per cent during the 2030s” without clarifying that this “solar activity” refers to a fall in the number of sunspots, not a dramatic fall in the life-sustaining light emitted by the sun. A 60% fall in solar energy would most likely extinguish most life on the planet.

Comparison with earlier “mini ice ages”. This relates to a previous period of “prolonged sunspot minimum”, the so-called Maunder Minimum, between about 1645 and 1715, which coincided with unusually cold weather and is believed to have had a significant influence on climate. There is therefore some historical evidence, even after allowing for the more limited nature of observation and measurement in that period, of low solar activity being associated with significant cooling.

However even this assertion must be qualified. That mini ice age began before the Maunder Minimum and may have had multiple causes, including the incidence of volcanic eruptions. Moreover, the previous mini ice age will almost certainly have built up gradually, with ice cover an important part of the mechanism. Whether that mechanism can be relied on in the immediate future, with shrinking ice caps, is more debatable.

Will it reverse global warming?  The warming effect from more CO2 greatly outstrips the influence of changes in the Earth’s orbit or solar activity, even if solar output were to drop to Maunder Minimum levels. There is 40 per cent more carbon dioxide in the air now than during the 17th century. A new Maunder Minimum might slow climate change, but it is not enough to stop it. Some estimates, however, suggest an effect on global temperatures of about a 0.3°C reduction. If correct, this is a substantial and welcome effect.
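The orders of magnitude can be checked with a back-of-envelope forcing comparison. The sketch below assumes a Maunder-like dip of about 0.25% in total solar irradiance and a simple transient response; both are rough assumptions of mine, and reconstructions of the Maunder Minimum vary widely.

```python
# Rough comparison of the magnitudes involved, using heavily hedged assumptions:
# an assumed Maunder-like dip of ~0.25% in total solar irradiance, against the
# forcing from the ~40% rise in CO2 quoted above.

import math

TSI = 1361.0        # total solar irradiance, W/m^2
ALBEDO = 0.3        # planetary albedo
RESPONSE = 0.5      # assumed transient response, degrees C per (W/m^2)

def solar_forcing(fractional_tsi_drop):
    """Change in absorbed solar energy, averaged over the Earth's surface."""
    return fractional_tsi_drop * TSI * (1 - ALBEDO) / 4.0

def co2_forcing(concentration_ppm, reference_ppm=280.0):
    """Standard logarithmic approximation for CO2 radiative forcing."""
    return 5.35 * math.log(concentration_ppm / reference_ppm)

dip = solar_forcing(0.0025)       # ~0.6 W/m^2
co2 = co2_forcing(1.4 * 280.0)    # ~1.8 W/m^2 for 40% more CO2
print(f"solar dip : {dip:4.2f} W/m^2, roughly {RESPONSE * dip:4.2f} C of cooling")
print(f"CO2 so far: {co2:4.2f} W/m^2, and still rising")
```

On these assumptions the solar dip is worth a few tenths of a degree at most, broadly consistent with the 0.3°C estimate quoted, and well short of the warming already committed by CO2.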

Should we relax our efforts to reduce emissions?  The answer is clearly not. A Maunder Minimum may conceal some of the underlying warming trend for a period, but the solar cycle will of course revert at some point, with an acceleration of warming. Reacting to a short term movement in what is now a well established trend would be dangerous or even disastrous on a longer term perspective.

Conclusions. This is an interesting and important element in the science, although it is still imperfectly understood. We may well see a Maunder Minimum effect, although it seems unlikely to freeze the Thames in any of its tidal range. What is important is that we are able to interpret the effect as accurately as possible as we observe global climate in the years ahead.