Energy solutions but weak on climate – experts react to the Finkel Review

Hugh Saddler, Australian National University; Alan Pears, RMIT University, and David Karoly, University of Melbourne
The keenly anticipated Finkel Review, commissioned in the wake of last year’s South Australian blackout, has made a range of recommendations aimed at delivering a reliable, secure and sustainable National Electricity Market.
Among the proposals is a new Clean Energy Target to boost investment in low-carbon electricity generation, as well as moves to require high-emitting power stations to give three years’ notice before shutting down.
Below, our experts react to the measures.
“Security and reliability are first”
Hugh Saddler, Honorary Associate Professor, Australian National University
With so much focus on the design of a mechanism to support a shift towards lower-emissions generation, it is easy to forget that the primary purpose of the Review, commissioned following the “system black” event in South Australia on September 28, 2016, was “to develop a national reform blueprint to maintain energy security and reliability”. It is thus appropriate that security and reliability are the first topics to be addressed in the main body of the report.
System security is defined as the ability of the system to tolerate disturbances. Maintaining security requires the system to be able to prevent very high rates of change of frequency. At present the system has no explicit mechanism for doing this, but relies implicitly on the inertia provided, effectively as a free service, by existing large thermal generators.
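To see why this inertia matters, consider the aggregate swing equation, a standard simplification in power-systems analysis: the initial rate of change of frequency (RoCoF) after losing a generator is proportional to the size of the loss and inversely proportional to the kinetic energy stored in the remaining spinning machines. The sketch below illustrates the relationship; the generator size and inertia figures are illustrative assumptions, not NEM data.

```python
# Minimal sketch: how system inertia limits the rate of change of frequency
# (RoCoF) after a sudden loss of generation. Based on the standard aggregate
# swing equation; all numbers are illustrative assumptions, not NEM data.

def initial_rocof_hz_per_s(power_loss_mw, kinetic_energy_mws, nominal_freq_hz=50.0):
    """Initial RoCoF = f0 * dP / (2 * Ek), where Ek is the total kinetic
    energy (MW.s) stored in all synchronised rotating machines."""
    return nominal_freq_hz * power_loss_mw / (2.0 * kinetic_energy_mws)

power_loss_mw = 500.0  # a large unit trips (hypothetical)

scenarios = {
    "high inertia (many thermal units online)": 40_000.0,  # MW.s, assumed
    "low inertia (mostly wind/solar online)": 8_000.0,     # MW.s, assumed
}

for label, kinetic_energy_mws in scenarios.items():
    rocof = initial_rocof_hz_per_s(power_loss_mw, kinetic_energy_mws)
    print(f"{label}: RoCoF = {rocof:.2f} Hz/s")

# Output: 0.31 Hz/s versus 1.56 Hz/s. The same disturbance moves frequency
# five times faster in the low-inertia system, which is why security
# obligations are tied to maintaining inertia and fast frequency response.
```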
The report recommends a series of regulatory energy security obligations to provide this service by various additional means, falling on the transmission network service providers in each of the five NEM regions (states), and also on all new generators connecting to the system.
System reliability is defined as the ability of the system to meet consumer demand at all times. In the old system, this was achieved by “dispatchable” generators, meaning coal and gas generators that can vary their output as required to meet demand.
In the new system, with large amounts of variable wind and solar generation, other supply sources are needed to meet demand at times of low wind speed and/or lack of sun – that is, to act as complements to wind and solar. Existing hydro and open-cycle gas turbine generators are ideally suited to this task, but with the growth in wind and solar generation, this capacity will very soon be insufficient for the task across the NEM (and is already insufficient in SA).
The Report recommends what it calls a Generator Reliability Obligation, which would be triggered whenever the proportion of dispatchable generation (which could include batteries and other forms of storage) in a region falls towards a predetermined minimum acceptable level. The obligation would fall on all new renewable generators wishing to connect thereafter and, in the words of the Report, “would not need to be located on site, and could utilise economies of scale” through multiple renewable generation projects “pairing” with “one new large-scale battery or gas fired generation project for example”.
If implemented, this recommendation seems certain to greatly complicate and slow the construction of new renewable generation, and to add to its administrative overhead costs. It would mean assembling consortiums of parties with potentially differing objectives, who would otherwise be competing with one another in the wholesale electricity market.
A far better approach would be to recognise that dispatchable generation provides a distinct and more valuable product than non-dispatchable generation. There should be a separate market mechanism, possibly based on a contracting approach, to provide this service. If well designed, this would automatically ensure that economies of scale, as may be realised by pumped hydro storage, for example, would be captured. This approach would be far more economically efficient, and thus less costly to electricity consumers, than the messy processes required under the Report’s obligation approach.
“Energy efficiency is effectively handballed to governments”
Alan Pears, Senior Industry Fellow, RMIT University
The Review’s approach to the demand side is very focused. Demand response, the capacity to reduce demand at times of extreme pressure on the supply system, is addressed thoroughly. The past under-utilisation of this approach is acknowledged, and the actions of the Australian Energy Market Operator (AEMO) intended to capture some of its potential in time for next summer are outlined.
However, the deep cultural problems within the Australian Energy Market Commission (AEMC) regarding demand response are not tackled. Instead, the AEMC is asked (yet again) to develop facilitation mechanisms in the wholesale market by mid-2018.
Energy efficiency is effectively handballed to governments. After making some positive comments about its valuable roles, recommendation 6.10 states that governments “should accelerate the roll out of broader energy efficiency measures to complement the reforms recommended in this Review”.
This is a disappointing outcome, given the enormous untapped potential of energy markets to drive effective energy-efficiency improvement. But it makes clear to governments that they will have to drive energy-efficiency initiatives themselves unless they instruct energy market participants to act.
“It follows the wrong path on greenhouse emissions”
David Karoly, Professor of Atmospheric Science, University of Melbourne and Member, Climate Change Authority
The Finkel Review says many sensible things about ways to improve the security and reliability of Australia’s electricity sector. However, it follows entirely the wrong path on reducing greenhouse emissions from the electricity sector and on Australia’s commitments under the Paris Agreement. This is disappointing, as Alan Finkel is Australia’s Chief Scientist and a member of the Climate Change Authority.
All economy-wide modelling shows that the electricity sector must do a larger share of future emissions reductions than other sectors, because there are easier and cheaper solutions for reducing emissions in that sector. However, this review’s vision is for “emissions reduced by 28% below 2005 levels by 2030” – exactly the same as Australia’s target under the Paris Agreement. It should be much more.
Australia’s commitments under the Paris Agreement are “to undertake ambitious efforts” to limit global warming “to well below 2℃ above pre-industrial levels”. The Targets Report from the Climate Change Authority in 2015 showed that this means Australia and the electricity sector must aim for zero emissions before 2050, not in the second half of the century, as suggested in the Finkel Review.
Hugh Saddler, Honorary Associate Professor, Centre for Climate Economics and Policy, Australian National University; Alan Pears, Senior Industry Fellow, RMIT University, and David Karoly, Professor of Atmospheric Science, University of Melbourne
This article was originally published on The Conversation. Read the original article.
The Finkel Review: finally, a sensible and solid footing for the electricity sector
David Blowers, Grattan Institute
Chief Scientist Alan Finkel’s long-awaited review of the National Electricity Market, released today, will make a significant difference to Australia’s electricity system in three key areas: reliability (making sure the system generates enough power to meet demand), security (making sure the system doesn’t break), and governance (making sure the electricity market can run effectively).
Reliability
The review recommends a Clean Energy Target (CET), which will provide subsidies to new low-emissions generation. The actual choice of scheme is less important than its durability. If broad political agreement can be reached on this target, it can provide the policy certainty that industry crucially needs to build new generation capacity and meet electricity demand.
Finkel also proposes a Generator Reliability Obligation, which places a limit on further wind and solar power in regions that already have a high proportion of intermittent generation. New intermittent generators will have to provide backup for some of their supply, in the form of new storage or contracts with new dispatchable generators such as gas. The aim is to ensure that federal and state subsidies for renewables do not push too much intermittent generation into the market without adequate backup.
Large generators will also need to provide reasonable notice of closure – the review suggests a period of three years – before leaving the market. The aim here is to ensure the market has enough time to respond by installing new generation.
Finally, the review floats the possibility of further changes to ensure reliability, potentially a day-ahead market to lock in supply ahead of time, or a strategic reserve – a mechanism by which the market operator can sign contracts requiring generators to sit idle unless needed in an emergency.
The market operator (AEMO) can already do this, and the report is silent on how a strategic reserve would be different or whether it is definitely needed.
Security
To secure the electricity system, Finkel calls for existing standards to be tightened and new mechanisms to be introduced.
Transmission companies will be required to provide and maintain a prescribed level of inertia in the system – high levels of inertia can prevent rapid changes in frequency that harm the system. Fossil fuel generators may be required to change their settings to control the frequency in the system, whereas new generators, including renewables, will be required to provide fast frequency-response services to help avoid frequency fluctuations that can damage the grid.
While technical in nature, these measures will reduce the likelihood of instability in the system and provide extra tools to fix it if instability arises.
Finkel also makes recommendations to bolster the emergency management plan for the 2017-18 summer and to encourage consumers – both residential and business – to reduce their demand at peak times. The review strongly encourages the development of “demand response” schemes to give consumers incentives to switch off and help smooth the load at peak times.
Governance
The biggest change to how the market will be run is the proposed creation of an Energy Security Board (ESB). The ESB will comprise an independent chair and vice-chair, as well as the heads of the three governing bodies: the AEMC, AEMO and the market regulator (the AER). At a minimum, the ESB will be responsible for implementing many of the Finkel Review recommendations, although the panel leaves scope for it to do much more.
Finkel recommends a comprehensive review of the rules governing the electricity market. It also argues for increased accountability for market bodies and the COAG Energy Council, through enhanced performance indicators and a beefed-up process for determining and monitoring priorities for the energy sector.
What happens next?
The report makes a range of other recommendations designed to ensure better service for energy consumers, more transparency in gas markets, and improved planning and coordination of electricity networks.
The Finkel Review successfully addresses the main issues confronting the electricity sector today. At the very least, it is a step towards a more reliable and secure system.
The devil, as always, will be in the detail. Much will depend on how the recommendations are implemented. Australian households and businesses can only hope that the new Energy Security Board and the nation’s political leaders will see this through.
David Blowers, Energy Fellow, Grattan Institute
This article was originally published on The Conversation. Read the original article.
What is a pre-industrial climate and why does it matter?
Andrew King, University of Melbourne; Ben Henley, University of Melbourne, and Ed Hawkins, University of Reading
Over the past few days there has been a lot of talk about the Paris climate agreement, from which the United States is planning to withdraw. Although this is a setback, there is still near-complete consensus from the world’s governments that a strong effort to tackle climate change is needed.
The Paris Agreement aims to limit global warming relative to a pre-industrial baseline. Its precise commitment is:
Holding the increase in the global average temperature to well below 2℃ above pre-industrial levels and to pursue efforts to limit the temperature increase to 1.5℃ above pre-industrial levels, recognising that this would significantly reduce the risks and impacts of climate change.
But this raises the question: what are “pre-industrial levels”?
Clearly, if we’re aiming to limit global warming to 1.5℃ or 2℃ above a certain point, we need a common understanding of what we’re working from. But the Paris Agreement doesn’t provide a definition.
This matters because governments expect climate scientists to compare different plans for reaching the Paris targets on a consistent basis. It’s crucial to be clear on what researchers mean when we say “pre-industrial”, and what assumptions our projections are based on.
Of course, as the chart below shows, no matter which baseline we use it’s clear there’s been a drastic rise in global temperature over the last century.

[Chart: the rise in global temperature over the past century, shown against different baselines. Author provided]
Defining a pre-industrial baseline
The Industrial Revolution began in the late 1700s in Britain, and spread around the world. But this only marked the beginning of a gradual rise in our greenhouse gas emissions. Various studies have found climate change signals appearing on a global scale as early as the 1830s, or as recently as the 1930s.
Besides the evolving and increasing human influence on the climate, we also know that plenty of other natural factors can affect Earth’s temperature. This natural variability in the climate makes it harder to determine a single precise pre-industrial baseline.
Scientists separate these natural influences on the climate into two groups: internal and external forcings.
Internal forcings transfer heat between different parts of Earth’s climate system. The El Niño-Southern Oscillation, for example, moves heat between the atmosphere and the ocean, causing year-to-year variations in global average surface temperature of about 0.2℃. Similar exchanges happen on decadal timescales too, producing slower, longer-lived variations in Earth’s temperature.
External forcings come from outside Earth’s climate system to influence global temperature. One example of an external forcing is volcanic eruptions, which send particles into the upper atmosphere. This prevents energy from the Sun reaching Earth’s surface, and leads to a temporary cooling.
Another external influence on Earth’s climate is the variability in the amount of energy the Sun emits.
The Sun’s total energy output varies on multiple cycles and is related to the number of sunspots, with slightly higher temperatures when there are more sunspots, and vice versa.
Earth has experienced extended periods of cooling due to more frequent explosive volcanic eruptions and periods of few sunspots – such as during the “Little Ice Age”, which lasted roughly from 1300 to the 1800s.

[Figure adapted from Hawkins et al. (2017).]
All of these factors mean that Earth’s climate can vary quite substantially even without human interference.
It also means that if we choose a pre-industrial baseline when there was low solar activity, like the late 1600s, or in a period of high volcanic activity, like the 1810s or the 1880s, then we would have a lower reference point and we would pass through 1.5℃ or 2℃ sooner.
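The effect of the baseline choice is easy to quantify. Here is a minimal sketch comparing the warming measured against the 1720-1800 period (proposed in a study mentioned below) with a late-19th-century baseline; the temperature series is synthetic and purely illustrative, standing in for an observational dataset such as HadCRUT.

```python
# Sketch: how the choice of "pre-industrial" baseline shifts the reported
# warming. The temperature series is synthetic and purely illustrative; a
# real analysis would use observations such as HadCRUT.
import numpy as np

years = np.arange(1700, 2018)
rng = np.random.default_rng(0)

# Hypothetical global-mean anomalies (°C): a cool 18th century standing in
# for low solar activity and volcanic cooling, then a steady rise after 1900.
temps = (
    np.where(years < 1810, -0.15, 0.0)
    + np.where(years >= 1900, (years - 1900) * 0.009, 0.0)
    + rng.normal(0.0, 0.05, years.size)
)

def warming_in(year, baseline_start, baseline_end):
    """Warming in a given year relative to the mean of a baseline period."""
    in_baseline = (years >= baseline_start) & (years <= baseline_end)
    return float(temps[years == year][0] - temps[in_baseline].mean())

for start, end in [(1720, 1800), (1850, 1900)]:
    print(f"Baseline {start}-{end}: 2017 warming = {warming_in(2017, start, end):.2f} °C")

# The colder 1720-1800 baseline yields a larger warming figure for the same
# 2017 temperature, so a 1.5°C threshold would be judged crossed sooner.
```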
A challenge not only for scientists
At the moment there is a drive among the climate science community to better understand the impacts of 1.5℃ of global warming. The Intergovernmental Panel on Climate Change will deliver a special report on 1.5℃ next year.
But scientists are defining “pre-industrial” or “natural” climate in different ways. Some work from the beginning of global temperature records in the late 19th century, while others use climate model simulations that exclude human influences over a more recent period. One recent study suggested that the best baseline might be 1720-1800.
These different definitions make it harder to synthesise the results from individual studies, which is vital to informing decision-making.
This will have to be a consideration in the writing of the IPCC’s report, as policymakers will need to easily compare impacts at different levels of global warming.
There is no definitive way to determine the best “pre-industrial” reference point. An alternative might be to avoid the pre-industrial baseline altogether, and instead set targets from more recent periods, when we have a better grasp of what the global climate looked like.
Andrew King, Climate Extremes Research Fellow, University of Melbourne; Ben Henley, Research Fellow in Climate and Water Resources, University of Melbourne, and Ed Hawkins, Associate Professor of Climate Science, University of Reading
This article was originally published on The Conversation. Read the original article.