National cabinet just agreed to big changes to environment law. Here’s why the process shouldn’t be rushed



Journey Beyond/AAP

Megan C Evans, UNSW and Peter Burnett, Australian National University

Federal and state governments on Friday resolved to streamline environment approvals and fast-track 15 major projects to help stimulate Australia’s pandemic-stricken economy.

The move follows the release this week of Professor Graeme Samuel’s preliminary review of the law, the 20-year-old Environment Protection and Biodiversity Conservation (EPBC) Act. Samuel described the law as “ineffective” and “inefficient” and called for wholesale reform.

The centrepiece of Samuel’s recommendations is “national environmental standards”: consistent, legally enforceable rules for decision-making. Samuel provides a set of “prototype” standards as a starting point, and recommends replacing them with more refined standards over time.

The Morrison government wants Parliament to consider implementing the prototype standards by the end of August.

But rushing the new law through is a huge concern, and further threatens the future of Australia’s irreplaceable natural and cultural heritage. Here, we explain why.

Aerial view of a Tasmanian forest
Rushing through changes to environment laws may damage nature in the long run.
Rob Blakers/AAP

Semantics matter

Samuel’s review said legally enforceable national standards would help ensure development is sustainable over the long term, and reduce the time it takes to have development proposals assessed.

We’ve identified a number of problems with his prototype standards.

First, they introduce new terms that will require interpretation by decision-makers, which could lead the government into the courts. This occurred in Queensland’s Nathan dam case when conservation groups successfully argued the term environmental “impacts” should extend to “indirect effects” of development.




Read more:
Environment Minister Sussan Ley is in a tearing hurry to embrace nature law reform – and that’s a worry


Second, there’s a difference in wording between the prototype standards and the EPBC Act itself, which might lead to uncertainty and delay. Samuel suggested a “no net loss” national standard for vulnerable and endangered species habitat, and “net gain” for critically endangered species habitat. But this departs from current federal policy, under which environmental offsets must “improve or maintain” the environmental outcome compared to “what is likely to have occurred under the status quo”.

Third, the outcomes proposed under the prototype standards might themselves cause confusion. The standards say, overall, the environment should be “protected”, but rare wetlands protected under the Ramsar Convention should be “maintained”. The status of threatened species should “improve over time” and Commonwealth marine waters should be “maintained or enhanced”, but the Great Barrier Reef Marine Park needs to be “sustained for current and future generations”.

And fourth, the standards don’t rule out development in habitat critical to threatened species, but require that “no detrimental change” occurs. But in reality, can there be development in critical habitat without detrimental change?

The Great Barrier Reef
The Great Barrier Reef should be sustained for future generations.
Jurgen Freund/AP

Mind the gap

The escape clause in the prototype standards presents another problem. A small, yet critical recommendation in the appendix of Samuel’s report says:

These amendments should include a requirement that the Standards be applied unless the decision-maker can demonstrate that the public interest and the national interest is best served otherwise.

Which decision-maker is he referring to here – federal or state? If it’s the former, will there be a constant stream of requests to the federal environment minister for a “public interest” exemption on the basis of jobs and economic development? If it’s the latter, can a state decision-maker judge the “national interest”, especially for species found in several states, such as the koala?

Samuel says the “legally enforceable” nature of national standards is the foundation of effective regulation. But both he and Auditor-General Grant Hehir, in his recent report, found existing enforcement provisions are rarely applied, and penalties are low.

Federal Environment Minister Sussan Ley has already ruled out Samuel’s recommendation that an independent regulator take responsibility for enforcement. But the record to date does not give confidence that government officials will enforce the standards.

Temporary forever?

Both Ley and Samuel suggested the interim standards would be temporary and updated later. But history shows “draft” and “interim” policies have a tendency to become long-term, or permanent.

For example, federal authorities often allow a proponent to cause environmental damage, and compensate by improving the environment elsewhere – a process known as “offsetting”. A so-called “draft” offset policy drawn up in 2007 actually remained in place for five years until 2012, when it was finally replaced. And the federal environment department recently accepted offsets based on the 2007 “draft” rather than the current policy.

The best antidote is to ensure the first tranche of national standards is comprehensive, precise and strong. This will only happen if there is genuine consultation, legislation is not rushed, and the government commits to improving the “antiquated” data and information systems the standards rely on.

Adult and baby koala on a pile of felled trees.
Environmental offsets allow a proponent to damage the environment in one location and improve it in another.
WWF

Negotiation to the lowest bar

According to the Samuel report, the proposed standards “provide a clear pathway for greater devolution in decision-making” that will enable states and territories to conduct federal environmental assessments and approvals. This proposed change has been strongly and consistently criticised by scientists and environmental lawyers.

Ley also appears to be wildly underestimating the time and effort required to negotiate the standards with the states and territories.




Read more:
Let there be no doubt: blame for our failing environment laws lies squarely at the feet of government


Take the Gillard government’s attempts to overcome duplication between state and federal law by establishing a “one-stop-shop” approvals process. Prime Minister Julia Gillard pulled the plug on negotiations after a year, declaring the myriad agreements being sought by various states were the “regulatory equivalent of a Dalmatian dog”.

The Abbott government’s negotiations for a similar policy lasted twice as long but suffered a similar fate, lapsing with the dissolution of Parliament in 2016.

Samuel warned refining the standards should not involve “negotiated agreement with rules set at the lowest bar”. But vested interests will inevitably seek to influence the process.

Proceed with caution

We have identified significant problems with the prototype standards, and more may emerge.

Ley’s rush to amend the Act appears motivated more by wanting to cut so-called “green tape” than by evidence or environmental outcomes.

Prototypes are meant to be stress-tested. But if the defects are not corrected before hurrying into negotiations and legislative change, Australia might go another 20 years without effective environment laws.

Update: This article has been amended to reflect the national cabinet decision.




Read more:
Environment laws have failed to tackle the extinction emergency. Here’s the proof




Megan C Evans, Lecturer and ARC DECRA Fellow, UNSW and Peter Burnett, Honorary Associate Professor, ANU College of Law, Australian National University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


The climate won’t warm as much as we feared – but it will warm more than we hoped



Shutterstock

Steven Sherwood, UNSW; Eelco Rohling, Australian National University, and Katherine Marvel, NASA

We know the climate changes as greenhouse gas concentrations rise, but the exact amount of expected warming remains uncertain.

Scientists study this in terms of “equilibrium climate sensitivity” – the temperature rise for a sustained doubling of carbon dioxide concentrations. Equilibrium climate sensitivity has long been estimated within a likely range of 1.5–4.5℃.

Under our current emissions trajectories, carbon dioxide concentrations in the atmosphere will likely double between 2060 and 2080, relative to concentrations before the industrial revolution. Before that, they had changed little for millennia.
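To make the definition concrete: under the standard logarithmic approximation for carbon dioxide forcing, equilibrium warming scales with the base-2 logarithm of the concentration ratio, so a sustained doubling eventually produces warming equal to the sensitivity itself. The sketch below is a back-of-envelope illustration only, not the assessment’s method; the function name is ours, and the pre-industrial concentration of roughly 280 parts per million is an assumed round figure.

```python
import math

def equilibrium_warming(co2_ppm, ecs_per_doubling, co2_preindustrial_ppm=280.0):
    """Eventual (equilibrium) warming in deg C for a sustained CO2 concentration,
    assuming the standard logarithmic approximation: each doubling of CO2 adds
    roughly the same forcing, so warming scales with log2 of the ratio."""
    doublings = math.log2(co2_ppm / co2_preindustrial_ppm)
    return ecs_per_doubling * doublings

# A doubling (about 560 ppm vs ~280 ppm pre-industrial) warms by exactly the
# sensitivity value, so the old 1.5-4.5 deg C range maps directly onto the
# eventual warming from doubled CO2.
for ecs in (1.5, 2.6, 3.9, 4.5):
    print(f"ECS {ecs} C -> {equilibrium_warming(560, ecs):.1f} C at doubled CO2")
```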




Read more:
Explainer: what is climate sensitivity?


A major new assessment has now calculated a range of 2.6–3.9℃. This implies that alarmingly high estimates from some recent climate models are unlikely, but also that comfortingly low estimates from other studies are even less likely.

More warming, greater impacts

Current and future climate change impacts include heatwaves, changing rainfall and drought patterns, and rising seas. Their severity depends on how much warming takes place.

Human activities are the main determinant of future temperatures, so a world with aggressive emissions control looks very different from a world in which emissions continue to increase.

Even if we knew exactly how emissions would change in the future, the exact amount of warming that would result would still be uncertain.

Four wind turbines at the end of a road, flanked by pale green-brown grass.
Drastic measures are still needed to curb climate change.
Shutterstock

Our new equilibrium climate sensitivity analysis substantially reduces this uncertainty by combining modern understanding of atmospheric physics with modern, historical and prehistoric data, using robust statistical methods.

The results indicate that substantial warming is much more solidly assured than we thought.
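“Robust statistical methods” here boils down to a familiar idea: independent lines of evidence, each with its own uncertainty, multiply together and leave a narrower range than any single line alone. The toy sketch below illustrates that idea and nothing more; the Gaussian shapes, centres and spreads are invented for demonstration and are not the likelihoods used in the actual assessment.

```python
import numpy as np

# Toy illustration (not the actual assessment) of why combining independent
# lines of evidence narrows an uncertainty range. Each "line of evidence" is
# a made-up Gaussian likelihood over candidate sensitivity values.
ecs_grid = np.linspace(0.5, 8.0, 2000)   # candidate sensitivities (deg C)

def likelihood(centre, spread):
    return np.exp(-0.5 * ((ecs_grid - centre) / spread) ** 2)

lines_of_evidence = [
    likelihood(3.1, 1.2),   # "process understanding" (illustrative numbers)
    likelihood(2.9, 1.0),   # "historical warming"
    likelihood(3.3, 1.1),   # "palaeoclimate"
]

posterior = np.ones_like(ecs_grid)        # flat prior over the grid
for evidence in lines_of_evidence:
    posterior *= evidence                 # independent evidence multiplies
posterior /= posterior.sum()              # normalise to a probability distribution

cdf = np.cumsum(posterior)
low, high = np.interp([0.17, 0.83], cdf, ecs_grid)
print(f"Central 66% range: {low:.1f}-{high:.1f} deg C")  # narrower than any single line
```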

A matter of probabilities

In 1979, a farsighted report estimated for the first time that equilibrium climate sensitivity falls somewhere between 1.5℃ and 4.5℃. So if carbon dioxide concentrations doubled, global temperatures would eventually increase by somewhere in that range.




Read more:
There are no time-travelling climatologists: why we use climate models


The width of this range is a problem. If equilibrium climate sensitivity lies at the low end of the range, climate change might be manageable with relatively relaxed national policies.

In contrast, a value near the high end would be catastrophic unless drastic action is taken to reduce emissions and draw carbon dioxide from the atmosphere.

Consequently, narrowing the equilibrium climate sensitivity range has been a key focus of climate science. While recent estimates haven’t really changed, climate scientists have learned a lot about how likely each outcome is.

For example, the 2013 Intergovernmental Panel on Climate Change (IPCC) report estimated a minimum two-thirds chance that equilibrium climate sensitivity falls within the 1.5–4.5℃ range. This implies there’s a chance of up to one-third that equilibrium climate sensitivity is lower or, worryingly, much higher.

The silhouette of a coal-burning power plant.
Even under the lowest global emissions scenario, there’s a 17% chance warming will exceed 2℃.
Shutterstock

Recently, the potential for high climate sensitivities gained further attention after results from new climate models suggested values in excess of 5℃.

Our new assessment rules out low climate sensitivities, finding only a 5% chance that equilibrium climate sensitivity is below 2.3℃.

On the brighter side, we also find a low chance of it rising above 4.5℃. Constraining the precise probability of a high equilibrium climate sensitivity is difficult, and depends to some extent on how the evidence is interpreted. Still, the alarming predictions of the new models appear unlikely.

We also find the chances of the world exceeding the 2℃ Paris Accord target by late this century are 17% under the lowest-emission scenario considered by the IPCC, 92% under a scenario that approximates current efforts, and 100% under the highest-emission scenario.
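Each of those scenario figures is an exceedance probability: take the spread of possible end-of-century warming under a given emissions pathway and measure how much of it lies above 2℃. The snippet below sketches that calculation with an entirely hypothetical warming distribution; the mean and spread are invented for illustration and are not the study’s projections.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical end-of-century warming outcomes for one emissions pathway.
# The mean (1.6 C) and spread (0.4 C) are invented for illustration only.
projected_warming = rng.normal(loc=1.6, scale=0.4, size=100_000)

p_exceed = np.mean(projected_warming > 2.0)   # fraction of outcomes above the 2 C target
print(f"Chance of exceeding 2 C in this hypothetical scenario: {p_exceed:.0%}")
```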

Why our study is different

The new assessment uses several strands of evidence. One is the recent historical past since industrialisation, during which temperatures have increased by about 1.1℃.

We compared this with knowledge about the natural drivers of climate over this period (such as slight changes in solar output and a few major volcanic eruptions), human-caused increases in atmospheric carbon dioxide and other greenhouse gases, and changes to the land surface.

Second, the assessment uses data on temperature changes and the underpinning natural processes from ice ages and warm periods in prehistoric times.

And third, it uses physical laws and present-day observations to evaluate how the planet responds to change, for example by examining brief warming or cooling episodes.




Read more:
Why 2℃ of global warming is much worse for Australia than 1.5℃


One conclusion is especially consistent across all lines of evidence. Unless the equilibrium climate sensitivity is larger than 2℃, we cannot explain the warming we’ve already seen since industrialisation, the ice ages in Earth’s past, or certain aspects of how weather changes operate today.

This unequivocally demonstrates that relaxed efforts against carbon emissions will not avoid substantial warming.

This is not the final word

The new assessment is by no means the last word. It narrows the range, but we still don’t know exactly how hot it’s going to get.

Our assessment will also feed into the upcoming IPCC report, but the panel will of course make its own independent assessment. And further research may narrow the range more in the future.

While high sensitivities are unlikely, they cannot be completely excluded. But whether the temperature rise is moderate or high, the message is the same: drastic measures are needed to curb climate change.

Crucially, the new assessment clearly demonstrates that betting on low sensitivities and failing to implement drastic measures is risky to the point of irresponsibility.

Steven Sherwood, ARC Laureate Fellow, Climate Change Research Centre, UNSW; Eelco Rohling, Professor of Ocean and Climate Change, Australian National University, and Katherine Marvel, Associate Research Scientist, NASA

This article is republished from The Conversation under a Creative Commons license. Read the original article.

New research reveals how Australia and other nations play politics with World Heritage sites



Shutterstock

Tiffany Morrison, James Cook University; Katrina Brown, University of Exeter; Maria Lemos, University of Michigan, and Neil Adger, University of Exeter

Some places are considered so special they’re valuable to all humanity and must be preserved for future generations. These irreplaceable gems – such as Machu Picchu, Stonehenge, Yosemite National Park and the Great Barrier Reef – are known as World Heritage sites.

When these places are threatened, they can officially be placed on the “List of World Heritage in Danger”. This action brings global attention to the natural or human causes of the threats. It can encourage emergency conservation action and mobilise international assistance.

However, our research released today shows the process of In Danger listings is being manipulated for political gain. National governments and other groups try to keep sites off the list, with strategies such as lobbying, or partial efforts to protect a site. Australian government actions to keep the Great Barrier Reef off the list are a prime example.

These practices are a problem for many reasons – not least because they enable further damage to threatened ecosystems.

Yosemite National Park is on the World Heritage list.
AAP/Kathryn Bermingham

What is the In Danger list?

World Heritage sites represent outstanding socioeconomic, natural and cultural values. Nations vie to have their sites included on the World Heritage list, which can attract tourist dollars and international prestige. In return, the nations are responsible for protecting the sites.

World Heritage sites are protected by an international convention, overseen by the United Nations body UNESCO and its World Heritage Committee. The committee consists of representatives from 21 of the 193 nations signed up to the convention.




Read more:
We just spent two weeks surveying the Great Barrier Reef. What we saw was an utter tragedy


When a site comes under threat, the World Heritage Committee can list the site as in danger of losing its heritage status. In 2014 for example, the committee threatened to list the Great Barrier Reef as In Danger – in part due to a plan to dump dredged sediment from a port development near the reef, as well as poor water quality, climate change and other threats. This listing did not eventuate.

An In Danger listing can attract help to protect a site. For example, the Galápagos Islands were placed on the list in 2007. The World Heritage Fund provided the Ecuadorian government with technical and financial assistance to restore the site’s World Heritage status. The work is not yet complete, but the islands were removed from the In Danger list in 2010.

Ecuador’s Galapagos Islands were removed from the In Danger list in 2010.
EPA

Political games

Our study shows political manipulation appears to be compromising the process that determines if a site is listed as In Danger.

We examined interactions between UNESCO and 102 national governments, from 1972 until 2019. We interviewed experts from the World Heritage Committee, government agencies and elsewhere, and combined this with global site threat data, UNESCO and government records, and economic and governance data.

We found at least 41 World Heritage sites, including the Great Barrier Reef, were considered at least once by the World Heritage Committee for the In Danger list, but weren’t put on it. This is despite UNESCO reporting these sites as equally or more threatened than those already on the In Danger list. And 27 of the 41 sites were considered for an In Danger listing more than once.

The number of sites on the In Danger list declined by 31.6% between 2001 and 2008, and has plateaued since. By 2019, only 16 of 238 ecosystems were listed as In Danger. In contrast, the number of ecosystems on the World Heritage list has increased steadily over the past 20 years.




Read more:
Explainer: what is the List of World Heritage in Danger?


So why is this happening? Our analysis showed the threat of an In Danger listing drives a range of government responses.

This includes governments complying only partially with World Heritage Committee recommendations or making only symbolic commitments. Such “rhetorical” adoption of recommendations has been seen in relation to the Three Parallel Rivers in China’s Yunnan province, the Western Caucasus in Russia and Australia’s Great Barrier Reef (explored in more detail below).

In other cases, threats to a site are high but attract limited attention and effort from either the national government or UNESCO. These sites include Halong Bay in Vietnam and the remote Tubbataha Reefs in the Philippines.

A 2004 amendment to the way the World Heritage Committee assesses In Danger listings means sites can be “considered” for inclusion rather than just listed, retained or removed. This has allowed governments to use delay tactics, such as in the case of Cameroon’s Dja Faunal Reserve. It has been considered for the In Danger list five times since 2011, but never listed.

Threats to Vietnam’s Halong Bay receive little attention.
Richard Vogel/AAP

Case in point: The Great Barrier Reef

In 2014 and 2015, the Australian government spent more than A$400,000 on overseas lobbying trips to keep the Great Barrier Reef off the In Danger list. The environment minister and senior bureaucrats travelled to most of the 21 countries on the committee, plus other nations, to argue against the listing. The mining industry also contributed to the lobbying effort.

The World Heritage Committee had asked Australia to develop a long-term plan to protect the reef. The Australian and Queensland governments appeared to comply, by releasing the Reef 2050 Plan in 2015.

But in 2018, a national audit and Senate inquiry found a substantial portion of finance for the plan was delivered – in a non-competitive and hidden process – to the private Great Barrier Reef Foundation, which had limited capacity and expertise. This casts doubt over whether the aims of the reef plan can be achieved.

Real world damage

Our study makes no recommendation on which World Heritage sites should be listed as In Danger. But it uncovered political manipulation that has real-world consequences. Had the Great Barrier Reef been listed as In Danger, for example, developments potentially harmful to the reef, such as the Adani coal mine, may have struggled to get approval.

Last year, an outlook report gave the reef a “very poor” prognosis and last summer the reef suffered its third mass bleaching in five years. There are grave concerns for the ecosystem’s ability to recover before yet another bleaching event.

Political manipulation of the World Heritage process undermines the usefulness of the In Danger list as a policy tool. Given the global investment in World Heritage over the past 50 years, it is essential to address the hidden threats to good governance and to safeguard all ecosystems.




Read more:
Australia reprieved – now it must prove it can care for the Reef




Tiffany Morrison, Professorial Research Fellow, ARC Centre of Excellence for Coral Reef Studies, James Cook University; Katrina Brown, Professor of Social Sciences, University of Exeter; Maria Lemos, Professor of Environmental Justice, Environmental Policy and Planning, Climate + Energy, University of Michigan, and Neil Adger, Professor of Human Geography, University of Exeter

This article is republished from The Conversation under a Creative Commons license. Read the original article.