Silly Season Break


I wasn’t going to take a break from posting over the Christmas–New Year period, but I have now decided that I will. I’m just too tired not to have a break. So at some point I’m going to go bush, throw up the tent and read some books (modern-style). I could really use the break right now. Still, from time to time I may post something I come across. The break will run from the time I post this update through to the middle of January 2018. From that point I’ll get back to more regular posts.

So let me take this opportunity to wish you all a great Christmas and New Year, and enjoy the time with family and friends if you can. Now, something for a parting laugh.


Flying home for Christmas? Carbon offsets are important, but they won’t fix plane pollution





Susanne Becken, Griffith University and Brendan Mackey, Griffith University

Australia is an important player in the global tourism business. In 2016, 8.7 million visitors arrived in Australia and 8.8 million Australians went overseas. A further 33.5 million overnight trips were made domestically.

But all this travel comes at a cost. According to the Global Sustainable Tourism Dashboard, all Australian domestic trips and one-way international journeys (the other half is attributed to the end point of travel) amount to 15 million tonnes of carbon dioxide for 2016. That is 2.7% of global aviation emissions, despite a population of only 0.3% of the global total.
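As a rough back-of-the-envelope check (a sketch using only the figures quoted above, not the dashboard itself), those percentages imply global aviation emissions of roughly 550 million tonnes, with Australia emitting about nine times its population share:

```python
# Rough check of the aviation emissions figures quoted above.
aus_aviation_mt = 15.0        # Mt CO2 attributed to Australia, 2016
aus_share_of_global = 0.027   # 2.7% of global aviation emissions
aus_pop_share = 0.003         # 0.3% of global population

implied_global_mt = aus_aviation_mt / aus_share_of_global
overrepresentation = aus_share_of_global / aus_pop_share

print(f"Implied global aviation emissions: {implied_global_mt:.0f} Mt CO2")  # ~556
print(f"Australia's share is {overrepresentation:.0f}x its population share")
```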


Read more: Life in a post-flying Australia, and why it might actually be ok


The peak month for air travel in and out of Australia is December. Christmas is the time when people travel to see friends and family, or to go on holiday. More and more people are aware of the carbon implications of their travel and want to know, for example, whether they should purchase carbon offsets.

Our recent study in the Journal of Air Transport Management showed that about one third of airlines globally offer some form of carbon offsetting to their customers. However, the research also concluded that the information provided to customers is often insufficient, dated and possibly misleading. Whilst local airlines Qantas, Virgin Australia and Air New Zealand have relatively advanced and well-articulated carbon offset programs, others fail to offer scientifically robust explanations and accredited mechanisms that ensure that the money spent on an offset generates some real climate benefits.

The notion of carbon compensation is actually more difficult than people might think. To help explain why carbon offsetting does make an important climate contribution, but at the same time still adds to atmospheric carbon, we created an animated video clip.

Jack’s journey.

The video features Jack, a concerned business traveller who begins purchasing carbon credits. However, he comes to the realisation that the carbon emissions from his flights are still released into the atmosphere, despite the credit.

The concept of “carbon neutral” promoted by airline offsets means that an equal amount of emissions is avoided elsewhere, but it does not mean there is no carbon being emitted at all – just relatively less compared with the scenario of not offsetting (where someone else continues to emit, in addition to the flight).
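The accounting can be made concrete with a toy example (hypothetical tonnages, purely illustrative, not figures from the study):

```python
# Toy offset accounting - hypothetical numbers, purely illustrative.
flight_t = 2.0   # tonnes CO2 emitted by the flight
offset_t = 2.0   # tonnes CO2 avoided elsewhere by the offset

# No offset: the flight emits, and the avoided activity emits too.
added_without_offset = flight_t + offset_t   # 4.0 t enters the atmosphere

# With offset: only the flight's emissions are added.
added_with_offset = flight_t                 # 2.0 t enters the atmosphere

# "Carbon neutral" means added_with_offset < added_without_offset,
# not zero: the flight's CO2 is still in the atmosphere.
print(added_without_offset, added_with_offset)
```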

This means that, contrary to many promotional and educational materials (see here for instance), carbon offsetting will not reduce overall carbon emissions. Trading emissions means that we are merely maintaining the status quo.

A steep reduction, however, is what’s required of every sector if we are to reach the net-zero emissions goal by 2050 agreed on in the Paris Agreement.


Read more: It’s time to wake up to the devastating impact flying has on the environment


Carbon offsetting is already an important “polluter pays” mechanism for travellers who wish to contribute to climate mitigation. But it is also about to be institutionalised at large scale through the new UN-run Carbon Offsetting and Reduction Scheme for International Aviation (CORSIA).

CORSIA will come into force in 2021, when participating airlines will have to purchase carbon credits for emissions above 2020 levels on certain routes.

The availability of carbon credits and their integrity is of major concern, as well as how they align with national obligations and mechanisms agreed in the Paris Agreement. Of particular interest is Article 6, which allows countries to cooperate in meeting their climate commitments, including by “trading” emissions reductions to count towards a national target.

The recent COP23 in Bonn highlighted that CORSIA is widely seen as a potential source of billions of dollars for offset schemes, supporting important climate action. Air travel may provide an important intermediate source of funds, but ultimately the aviation sector, just like every other, will have to reduce its own emissions. This will mean major advances in technology – and most likely a contraction in the fast-expanding global aviation market.


Read more: Friday essay: smile and stay thin – life as a 60s air hostess


Travelling right this Christmas

In the meantime, and if you have booked your flights for Christmas travel, you can do the following:

  • pack light (every kilogram will cost additional fuel)

  • minimise carbon emissions whilst on holiday (for instance by biking or walking once you’re there), and

  • support a credible offsetting program.

And it’s worth thinking about what else you can do during the year to minimise emissions – this is your own “carbon budget”.

Susanne Becken, Professor of Sustainable Tourism and Director, Griffith Institute for Tourism, Griffith University and Brendan Mackey, Director of the Griffith Climate Change Response Program, Griffith University

This article was originally published on The Conversation. Read the original article.

Google’s artificial intelligence finds two new exoplanets missed by human eyes


Artist impression of Kepler-90i, the eighth planet discovered orbiting around Kepler-90.
NASA

Jake Clark, University of Southern Queensland

Two new exoplanets have been discovered thanks to NASA’s collaboration with Google’s artificial intelligence (AI). One of those in today’s announcement is an eighth planet – Kepler-90i – found orbiting the Sun-like star Kepler-90. This makes Kepler-90 the first star discovered to host as many planets as our own Solar system.

A mere road trip away, at 2,545 light-years from Earth, Kepler-90i orbits its host star every 14.4 Earth days, with a sizzling surface temperature similar to Venus of 426°C.
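For a Sun-like host, Kepler’s third law turns that 14.4-day period into an orbital distance (a quick sketch, assuming a stellar mass of one solar mass):

```python
# Estimate Kepler-90i's orbital distance from its period, assuming the
# host star has roughly one solar mass (a simplification).
period_years = 14.4 / 365.25

# Kepler's third law in solar units: a^3 = P^2 for a 1-solar-mass star.
semi_major_axis_au = period_years ** (2 / 3)

print(f"Semi-major axis: {semi_major_axis_au:.2f} AU")
# ~0.12 AU - less than a third of Mercury's distance from the Sun,
# which is why the surface sizzles at Venus-like temperatures.
```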

The new exoplanets are added to the growing list of known worlds found orbiting other stars.

The Kepler-90 planets have a similar configuration to our solar system with small planets found orbiting close to their star, and the larger planets found farther away.
NASA/Ames Research Center/Wendy Stenzel

This new Solar system rival provides evidence that a similar process occurred within Kepler-90 that formed our very own planetary neighbourhood: small terrestrial worlds close to the host star, and larger gassy planets further away. But to say the system is a twin of our own Solar system is a stretch.


Read more: Exoplanet discovery by an amateur astronomer shows the power of citizen science


The entire Kepler-90 system of eight planets would easily fit within Earth’s orbit of the Sun. All eight planets, bar Kepler-90h, would be too hostile for life, lying outside the so-called habitable zone.

Evidence also suggests that planets within the Kepler-90 system started out farther apart, much like our own Solar system. Some form of migration occurred, dragging this system inwards, producing the orbits we see in Kepler-90 today.

Kepler-90 is a Sun-like star, but all of its eight planets are scrunched into the equivalent distance of Earth to the Sun.
NASA/Ames Research Center/Wendy Stenzel

Google’s collaboration with NASA’s Kepler space telescope mission has now opened up new and exciting opportunities for AI to assist with scientific discoveries.

So how exactly did Google’s AI discover these planets? And what sort of future discoveries can this technology provide?

Training AI for exoplanet discoveries

Traditionally, software developers program computers to perform a particular task, from playing your favourite cat video to identifying exoplanetary signals from space-based telescopes such as NASA’s Kepler mission.

These programs are executed to serve a single purpose. Using code intended for cat videos to hunt exoplanets in light curves would lead to some very interesting, yet false, results.

Google’s AI is programmed rather differently, using machine learning. In machine learning, an AI is trained through artificial neural networks – which loosely replicate our brain’s biological neural networks – to perform tasks like reading this article. It then learns from its mistakes, becoming more efficient at its particular task.

Google’s DeepMind AI, AlphaGo, was trained previously to play Go, an extremely complex yet elegant Chinese board game. Last year, AlphaGo defeated Lee Sedol, the world’s best Go player, by four games to one. It simply trained itself by watching thousands of previously played games, then competing against itself.

In our exoplanetary case, the AI was trained to identify transiting exoplanets by sifting through 15,000 signals from the Kepler exoplanet catalogue. It learned what was and wasn’t a signal caused by an exoplanet eclipsing its host star. These 15,000 signals had been vetted by NASA scientists prior to the AI’s training, guiding it to 96% accuracy in detecting known exoplanets.
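The pipeline itself used a deep convolutional network on real Kepler light curves; as a flavour of the approach, here is a minimal sketch with scikit-learn on synthetic stand-in features (illustrative only, not the actual Google/NASA code):

```python
# Minimal sketch of supervised vetting of transit signals. The real work
# used a deep convolutional network on Kepler light curves; this toy
# version uses synthetic features, for illustration only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n = 15_000  # same order as the vetted Kepler signals mentioned above

# Toy features: (transit depth, duration, signal-to-noise).
labels = rng.integers(0, 2, n)            # 1 = planet, 0 = false positive
features = rng.normal(size=(n, 3))
features[labels == 1] += [1.0, 0.5, 1.5]  # planet signals look different

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print(f"Vetting accuracy: {clf.score(X_test, y_test):.1%}")
```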

Researchers then directed their AI network to search in multiplanetary systems for weaker signals. This research culminated in today’s announcement of both Kepler-90i and another Earth-sized exoplanet, Kepler-80g, in a separate planetary system.

Hunting for more exoplanets using AI

Google’s AI has analysed only 10% of the 150,000 stars NASA’s Kepler Mission has been eyeing off across the Milky Way galaxy.

How AI helps in the hunt for other exoplanets.

There’s potential, then, for sifting through Kepler’s entire catalogue and finding other exoplanetary worlds that have either been skimmed over by scientists or haven’t been checked yet, given how rich Kepler’s data set is. And that’s exactly what Google’s researchers are planning to do.

Machine learning neural networks have been assisting astronomers for a few years now. But the potential for AI to assist in exoplanetary discoveries will only increase within the next decade.

Beyond Kepler

The Kepler mission has been running since 2009, with observations slowly coming to an end. Within the next 12 months, all of its on-board fuel will be fully depleted, ending what has been one of the greatest scientific endeavours of modern times.

NASA’s new TESS mission will inundate astronomers with 20,000 exoplanetary candidates in the next two years.
Chet Beals/MIT Lincoln Lab

Kepler’s successor, the Transiting Exoplanet Survey Satellite (TESS), will be launching this coming March.


Read more: A machine astronomer could help us find the unknowns in the universe


TESS is predicted to find 20,000 exoplanet candidates during its two-year mission. To put that into perspective, in the past 25 years, we’ve managed to discover just over 3,500.

This unprecedented inundation of exoplanetary data will need to be confirmed, either by further transit observations or by other methods such as ground-based radial velocity measurements.

There just isn’t enough people-power to sift through all of this data. That’s why these machine learning networks are needed: they can aid astronomers in sifting through big data sets, ultimately assisting in more exoplanetary discoveries. Which raises the question: who exactly gets credit for such a discovery?

Jake Clark, PhD Student, University of Southern Queensland

This article was originally published on The Conversation. Read the original article.

Climate scientists and policymakers need to trust each other (but not too much)



Trust is everything.
oneinchpunch/Shutterstock.com

Rebecca Colvin, Australian National University; Christopher Cvitanovic, University of Tasmania; Justine Lacey, CSIRO, and Mark Howden, Australian National University

At a time when the effects of climate change are accelerating and published science overwhelmingly supports the view that humans are responsible for the rate of change, powerful groups remain in denial across politics, the media, and industry. Now more than ever, we need scientists and policymakers to work together to create and implement effective policy which is informed by the most recent and reliable evidence.

We know that trust between scientists and policymakers is important in developing policy that is informed by scientific evidence. But how do you build this trust, and how do you make sure that it genuinely leads to positive outcomes for society?


Read more: Nature v technology: climate ‘belief’ is politics, not science


In response to these questions, our recent Perspective in Nature Climate Change explores the dynamics of trust at the interface of climate science and policy.

We suggest that while trust is an important component of the science-policy dynamic, there can be such a thing as “too much” trust between scientists and policymakers.

Understanding this dynamic is crucial if we are to deliver positive outcomes for science, policy, and the society that depends on their cooperation.

What happens when there is ‘too much’ trust?

Trust between climate scientists (researchers in a range of disciplines, institutions, and organisational settings) and policymakers (civil servants in government departments or agencies who shape climate policy) is useful because it enhances the flow of information between them. In a trusting relationship, we can expect to see a scientist explaining a new finding directly to a policymaker, or a policymaker describing future information needs to a scientist.

Together, this arrangement ideally gives us science-led policy, and policy-relevant science.

But as scholars of trust have warned, there is a point beyond which these positive benefits of trust can turn sour.

Think about a hypothetical situation in which a scientist and policy-maker come to trust each other deeply. What happens if one of them starts to become loose with the facts, or fails to adhere to professional standards? Is their trusting counterpart more, or less, likely to identify the poor behaviour and respond appropriately?

Over time, a trusting relationship may evolve into a self-perpetuating belief of trustworthiness based on the history of the relationship. This is where scientists and policymakers may find themselves in a situation of “too much” trust.

We know that science advances by consensus, and that this consensus is shaped by rigorous research and review, and intense debate and scrutiny. But what if (as in the hypothetical example described above) a policy-maker’s trust in an individual scientist means they bypass the consensus and instead depend on that one scientist for new information? What happens if that scientist is – intentionally or unintentionally – wrong?

More trust is not always best. ‘Too much’ trust can cause perverse outcomes at the science-policy interface.
Adapted from Stevens et al. (2015)

When you have “too much” trust, the benefits of trust can instead manifest as perverse outcomes, such as “blind faith” commitments between parties. In a situation like this, a policymaker may trust an individual scientist so much that they do not look for signs of misconduct, such as the misrepresentation of findings.

Favouritism and “capture” may mean that some policymakers provide information about future research support only to selected scientists, denying these opportunities to others. At the same time, scientists may promote only their own stream of research instead of outlining the range of perspectives in the field to the policymakers, narrowing the scope of what science enters the policy area.

“Cognitive lock-in” might result, where a policymaker sticks to a failing policy because they feel committed to the scientist who first recommended the course of action. For example, state-of-the-art climate forecasting tools are available in the Pacific but are reportedly underused. This is partly because the legacy of trusting relationships between scientists and policymakers in the region has led them to continue relying on less sophisticated tools.

“Too much” trust can also lead to overly burdensome obligations between scientists and policymakers. A scientist may come to hold unrealistically high expectations of the level of information a policymaker can share, or a policymaker may desire the production of research by an unfeasible deadline.

What’s the right way to trust?

With this awareness of the potentially negative outcomes of “too much” trust, should we abandon trust at the climate science-policy interface altogether?

No. But we can – and should – develop, monitor, and manage trust with acknowledgement of how “too much” trust may lead to perverse outcomes for both scientists and policy-makers.

We should aim for a state of “optimal trust”, which enjoys the benefits of a trusting relationship while avoiding the pitfalls of taking too trusting an approach.

We propose five key strategies for managing trust at the climate science-policy interface.

  • Be explicit about expectations for trust in a climate science-policy relationship. Climate scientists and policy-makers should clarify protocols and expectations about behaviour through open discussion as early as possible within the relationship.

  • Transparency and accountability, especially when things go wrong, are critical to achieving and maintaining a state of optimal trust. When things do go wrong, trust repair can right the relationship.

  • Implement systems for monitoring trust, such as discussion groups within scientific and policy organisations and processes of peer review. Such approaches can help to identify the effects of “too much” trust – such as capture, cognitive lock-in, or unrealistically high expectations.

  • Manage staff churn in policy and scientific organisations. When scientists or policy-makers change role or institution, handing over the trusting relationships can help positive legacies and practices to carry on.

  • Use intermediaries such as knowledge brokers to facilitate the flow of information between science and policy. Such specialists can promote fairness and honesty at the science-policy interface, increasing the probability of maintaining ‘optimal trust’.


Read more: Is this the moment that climate politics and public opinion finally match up?


Embracing strategies such as these would be a positive step toward managing trust between scientists and policymakers, both in climate policy and beyond.

In this time of contested science and highly politicised policy agendas, all of us in science and policy have a responsibility to ensure we act ethically and appropriately to achieve positive outcomes for society.

Rebecca Colvin, Knowledge Exchange Specialist, Australian National University; Christopher Cvitanovic, Research Fellow, University of Tasmania; Justine Lacey, Senior Social Scientist, CSIRO, and Mark Howden, Director, Climate Change Institute, Australian National University

This article was originally published on The Conversation. Read the original article.

By slashing environment spending, the government is slashing opportunities



At a time of growing human impacts, spending on environmental protection is more important than ever.
Author provided

Don Driscoll, Deakin University

Australia’s native plants and animals are integral to the success of our society. We depend on wildlife to pollinate many of our crops. Most of our cities depend on effective water catchments to provide clean water. And medical scientists are making important breakthroughs in managing disease and health issues based on discoveries in nature.

The mental health benefits of a “dose of nature” are becoming more widely recognised, on top of our own experiences of having fun and enjoying the natural wonders of national parks. Our nature inspires us in all kinds of ways, and you can build major industries around that; the Great Barrier Reef is reportedly worth A$56 billion to the Australian economy.

It is therefore surprising, on one hand, to read in the Australian Conservation Foundation and WWF Australia budget submission that the Australian government has slashed environmental spending by one third since 2013.

On the other hand, I’m not especially surprised because we ecologists have been living through the ongoing attack on the environment every day. We see how cuts to environmental budgets play out.


Read more: Why a walk in the woods really does help your body and your soul


Our native species and ecosystems are under growing pressure. Australia’s annual population growth of 1.6% outstrips that of many other countries. This is compounded by rises in per-capita consumption and greenhouse emissions.

Escalating consumption translates into growing impacts on biodiversity as more land is released for housing, infrastructure and extractive industries such as mining, as recreational and industrial fishing expands, and as agriculture intensifies.

Climate change further interacts with land clearing associated with producing more for a growing and greedier population. Many species are expected to have to shift their range as the environmental conditions they live in move, and if they can’t move because there is no habitat to move through, extinctions will result.


Read more: Land clearing isn’t just about trees – it’s an animal welfare issue too


State of the Environment reports document the extent of the problem.

For example, between 2011 and 2015, there was a 66% increase in the number of critically endangered animals (from 38 in 2011 to 63 in 2015), and a 28% increase in critically endangered plants (112 in 2011; 143 in 2015). By critically endangered, we mean that extinction is a real possibility in the short term for these species. Immediate action is needed if we are to avoid terminating millions of years of independent evolution, as these biological lineages die out.
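Those percentages check out against the raw counts (a quick arithmetic verification):

```python
# Checking the State of the Environment increases quoted above.
animals_2011, animals_2015 = 38, 63
plants_2011, plants_2015 = 112, 143

print(f"Animals: +{(animals_2015 - animals_2011) / animals_2011:.0%}")  # +66%
print(f"Plants:  +{(plants_2015 - plants_2011) / plants_2011:.0%}")     # +28%
```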

Given the extraordinary value of biodiversity and the extreme and growing threats, it would make sense to maximise our spending on biodiversity conservation now, to protect our wildlife through this period of peak human.

Key areas for investment include creating an effective national reserve system, at least meeting the arbitrary international goals of 17% of the land and 10% of the sea area.

Funding is needed to manage the reserve system, containing threats and nurturing already threatened species. Meanwhile, outside of reserves, where most people live and interact with nature, biodiversity needs to be provided for and threats need to be managed. Biosecurity is a critical area for funding, particularly to more tightly regulate rogue industries like horticulture.

Horticulture was recently responsible for introducing myrtle rust, a disease that is devastating many gum-tree relatives, in the family Myrtaceae. Finally, climate change demands a strong response, both in mitigation and adaptation.

Science and environment work needs funding

I’ve never seen so many fantastic, skilled, enthusiastic young ecologists struggling to get a job. At a time when ecologists and conservation scientists are needed more than ever to help solve the problems created by the growth economy, funding for ecology is at a low.


Read more: Vale ‘Gump’, the last known Christmas Island Forest Skink


Of course, beyond the people, we see conservation programs in desperate need of support that just isn’t forthcoming. Christmas Island is a case in point.

The island’s reptiles have been devastated by invasive pests, most likely the wolf snake and perhaps the giant centipede. Two endemic species (species that only lived on Christmas Island) are presumed extinct; the last known forest skink died in 2014.

This Christmas Island Forest Skink was the last known member of her species.
Director of National Parks/Supplied

Two other endemic species are extinct in the wild, but small populations of around 1,000 animals are kept in captivity on the island and at Taronga Zoo.

While ideally a population of at least 5,000 would be maintained to minimise loss of genetic diversity, funding is not available to house that many animals. And it’s rock-bottom budget accommodation; Lister’s geckos are housed in tents because the budget doesn’t stretch to building something permanent.

We’ve also seen important long-term research programs defunded. Long-term data provides crucial insights into how our biodiversity responds to decadal changes in weather patterns, as well as longer-term changes caused by the greenhouse effect. It is unimaginable that the government has slashed the Terrestrial Ecosystem Research Network’s funding so far that well-established long-term data series are now being compromised.

Ultimately, the environmental funding shortfall needs to be fixed. Our livelihoods and well-being depend on it.


The original version of this article incorrectly reported that the budget submission was made by the Australian Conservation Foundation and The Wilderness Foundation. It was in fact made by the Australian Conservation Foundation and WWF Australia.

Don Driscoll, Professor in Terrestrial Ecology, Deakin University

This article was originally published on The Conversation. Read the original article.

Tide turned: surveys show the public has lost its appetite for shark culls


Christopher Neff, University of Sydney and Thomas Wynter, University of Sydney

A Senate Committee report on shark deterrent measures has, in the words of committee member Senator Peter Whish-Wilson, moved the “shark cull debate into the 21st century”.

The first recommendation of the inquiry is to “immediately replace lethal drum lines” with so-called SMART drum lines and to phase out shark nets.

Yet if the news media are to be believed, these conclusions go against the grain of public opinion, with Western Australia’s spate of shark incidents having spawned previous headlines such as “Calls grow louder for shark culling in WA”. More recently, a series of incidents in Ballina in northern New South Wales prompted our surfing former prime minister Tony Abbott to weigh in, calling on the state government to authorise culls and nets.


Read more: Sharks aren’t criminals, but our fear makes us talk as if they are


The question of how much the public really supports policies that kill sharks has been surprisingly difficult to answer. The Senate inquiry noted that while it had been suggested “that lethal measures such as nets are no longer supported … reliably ascertaining community views on matters such as this could be quite difficult”.

Difficult? Yes. But doable. We have surveyed public opinion in Western Australia and Ballina, following shark bite incidents in each place. In fact, over the past five years we have searched high and low for the type of widespread support for lethal policies that is suggested by the tabloid press. It simply is not there, as our findings in the peer-reviewed journals Conservation Letters and Marine Policy show.

Public opinion in Perth and Ballina

In fieldwork including phone polling in both Perth and Ballina, as well as face-to-face surveys of local residents, beachgoers, and business owners in Ballina, we consistently found levels of support for lethal policies in the 20-25% range.

This is particularly remarkable in the case of Ballina. As the shire’s mayor David Wright told the Senate committee, between 8 February 2015 and July 2016, surfers there “were involved in 9% of the world’s shark attacks and interactions”, with the media dubbing it the “shark attack capital of Australia”.

A large majority of people in both Perth and Ballina viewed shark bites as accidental rather than intentional. While fear of sharks is linked to higher support for lethal policies, fear alone does not cause people to support killing sharks.

Support for lethal policies arises when fear of sharks is combined with the misconceived idea that sharks bite people on purpose. In our surveys, respondents who viewed shark bites as intentional were more than 2.5 times as likely to support policies that kill sharks.
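That “2.5 times as likely” is a relative risk across the two groups of respondents. A sketch of the calculation with hypothetical counts (the published studies report the ratio, not these raw numbers):

```python
# Relative risk of supporting lethal policies, by belief about shark intent.
# Counts are hypothetical, chosen only to illustrate the calculation.
intentional = {"support_lethal": 50, "total": 100}  # believe bites intentional
accidental = {"support_lethal": 60, "total": 300}   # believe bites accidental

risk_intentional = intentional["support_lethal"] / intentional["total"]  # 0.50
risk_accidental = accidental["support_lethal"] / accidental["total"]     # 0.20

print(f"Relative risk: {risk_intentional / risk_accidental:.1f}x")  # 2.5x
```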



This is strongly related to the Senate inquiry’s finding that the belief that “killing ‘rogue’ sharks will solve the problem” remains widespread. This is despite a clear expert consensus that there is “no evidence for anything called a rogue shark”.

As the Department of the Environment and Energy says, “No shark is thought to target humans as prey”, and the vast majority of shark bite incidents “can be attributed to the shark confusing us with its normal prey”.

This view was apparent among the relatively few beachgoers in Ballina who reported supporting lethal policies, with several respondents suggesting that they would only support killing sharks that “had gotten a taste for human flesh”.

Many respondents were also unaware that shark nets are lethal to sharks. Indeed, this is their primary purpose, as the Senate inquiry noted: “It is not intended that the nets create an enclosed area: rather, they are a passive fishing device designed to cull sharks in the area.”

Understanding overcomes fear

Our second study looked specifically at the interaction between fear of sharks and the perception that they bite humans intentionally.

We carried out an experiment in the Sydney SEA LIFE Aquarium’s “shark tunnel” – a one-way, U-shaped exhibit that provides perfect conditions for our study. We divided participants into two groups and assigned one group to a treatment to “prime” their emotions at the beginning of the exhibit.

We also surveyed all participants about their feelings about and perceptions of sharks, after viewing the exhibit. This also allowed us to capture both a before and after measurement of fear, from which we could determine whether people’s fear had subsided after seeing sharks’ behaviour at first hand.

We tested two “priming” messages. One called attention to the low probability of being bitten by a shark – we call this our Probability Prime. A second priming message drew “attention to intentionality”. This was our Intentionality Prime and it prompted aquarium visitors to consider sharks’ behaviours.

The Probability Prime, which reflects standard marine education attempts to reduce fear of sharks, failed to do so, consistent with research showing humans overestimate low probability risks. Crucially, considering our findings in Ballina and Perth, the Intentionality Prime successfully reduced the public’s fear of sharks.

Shark fear.
Author provided

There are five take-home messages from our research results:

  1. There is little blame on the shark. The tide has turned and the public is sophisticated enough to understand that sharks are not intentionally hurting people.

  2. There is little blame on the government. Governments that feel they need to continue using shark nets or else face the wrath of the public following a shark bite should rework their political calculations.

  3. The public no longer supports policies that kill sharks. In WA, 75% supported non-lethal options, in Ballina the number was 83% and in the Sydney experiment it reached 85%.

  4. A Save the Sharks movement has begun, with the public we have polled consistently voicing greater support for conservation approaches above killing sharks.

  5. Survey respondents believe that governments choose lethal measures to ease public concern, not to make beaches safer. This is a problem for Australia’s democracy; the public believes that policies are being designed to protect governments, not people.

This last point is arguably the most serious flaw of all in these policies: the continued killing of sharks for political gain.

Christopher Neff, Lecturer in Public Policy, University of Sydney and Thomas Wynter, Postdoctoral Research Associate, Electoral Integrity Project, University of Sydney

This article was originally published on The Conversation. Read the original article.

It’s official: 2016’s Great Barrier Reef bleaching was unlike anything that went before


Sophie Lewis, Australian National University and Jennie Mallela, Australian National University

It is no longer news that the Great Barrier Reef has suffered extreme bleaching.

In early 2016, we heard that the reef had suffered the worst bleaching ever recorded. Surveys published in June that year estimated that 93% of coral on the vast northern section of the reef was bleached, and 22% had already been killed.

Further reports from this year show that bleaching again occurred. The back-to-back bleaching hit more than two-thirds of the Great Barrier Reef and may threaten its UNESCO World Heritage listing.

After recent years of damage, what does the future hold for our priceless reef?

Our new research, published in the Bulletin of the American Meteorological Society’s special report on climate extremes, shows the news isn’t good for the Great Barrier Reef’s future.


Read more: How to work out which coral reefs will bleach, and which might be spared


Coral reefs are complex ecosystems that are affected by many factors. Changes in sea surface temperatures, rainfall, cloudiness, agricultural runoff, or water quality can affect a reef’s health and resilience to stress.

Early analysis of the 2016 bleaching suggested that the Great Barrier Reef was suffering from thermal stress brought on by human-caused climate change.

Our study took a new and comprehensive approach to examine these multiple climatic and environmental influences.

We set out to answer the crucial question: could anything else have bleached the Great Barrier Reef, besides human-induced climate change?

Clear fingerprint

The results were clear. Using a suite of climate models, we found that the significant warming of the Coral Sea region was likely caused by greenhouse gases from human activities. This warming was the primary cause of the extreme 2016 bleaching episode.

But what about those other complex factors? The 2016 event coincided with an El Niño episode that was among the most severe ever observed. The El Niño-Southern Oscillation system, with its positive El Niño and negative La Niña phases, has been linked to bleaching of various coral reefs in the past.

Our study showed that although the 2016 El Niño probably also contributed to the bleaching, this was a secondary contributor to the corals’ thermal stress. The major factor was the increase in temperatures because of climate change.

We next analysed other environmental data. Previous research has found that corals at sites with better water quality (that is, lower concentrations of pollution particles) are more resilient and less prone to bleaching.

Pollution data used in our study show that water quality in 2016 may have been better than in previous bleaching years. This means that the Great Barrier Reef should have been at lower risk of bleaching compared to long-term average conditions, all else being equal. Instead, record bleaching hit the reef as a result of the warming temperature trend.

Previous events

The final part of our investigation involved comparing the conditions behind the record 2016 bleaching with those seen in previous mass bleaching episodes on the Great Barrier Reef, in 1997-98 and 2010-11.

When we analysed these previous events on the Reef, we found very different factors at play.

In 1997-98 the bleaching coincided with a very strong El Niño event. Although an El Niño event also occurred in 2016, the two were very different in terms of the distribution of unusually warm waters, particularly in the eastern equatorial Pacific. In 1997-98, the primary cause of the bleaching – which was less severe than in 2016 – was El Niño.

In 2010-11, the health of the Great Barrier Reef was impaired by runoff. That summer brought record high rainfall to eastern Australia, causing widespread flooding across Queensland. The resulting discharge of fresh water onto the reef lowered its salinity, and bleaching occurred.


Read more: Feeling helpless about the Great Barrier Reef? Here’s one way you can help


There have been many reports in recent years warning of trouble for the Great Barrier Reef. Sadly, our study is yet another warning about the reef’s future – perhaps the most comprehensive warning yet. It tells us that the 2016 bleaching differed from previous mass bleaching events because it was driven primarily by human-induced climate warming.

This puts the Great Barrier Reef in grave danger of future bleaching from further greenhouse warming. The local environmental factors that have previously helped to protect our reefs, such as good water quality, will become less and less able to safeguard corals as the oceans warm.

Now we need to take immediate action to reduce greenhouse gas emissions and limit further warming. Without these steps, there is simply no future for our Great Barrier Reef.

Sophie Lewis, Research fellow, Australian National University and Jennie Mallela, Research Fellow in Coral Reef Monitoring and Reef Health Appraisal, Australian National University

This article was originally published on The Conversation. Read the original article.

To fight the catastrophic fires of the future, we need to look beyond prescribed burning





James Furlaud, University of Tasmania and David Bowman, University of Tasmania

California is burning – a sentence we’ve heard far too often this year. Sydney is currently on bushfire alert, as firefighters battle a fire in the Hunter Valley region and temperatures are set to top 40℃.

A cocktail of factors, from climate change to centuries of ignoring indigenous burning practises, means that catastrophic fires are likely to become more common.


Read more: Dry winter primes Sydney Basin for early start of bushfire season


One of Australia’s favourite fire prevention measures is prescribed burning – using carefully controlled fires to clear out flammable materials. We’re almost obsessed with it. Indeed, it seems the outcome of every major inquiry is that we need to do more of it.

The Royal Commission inquiry that followed Victoria’s 2009 Black Saturday fires recommended that 5% of all public land in Victoria be treated per year – a doctrine that was subsequently dropped due to impracticality.

Yet our research, published today in the International Journal of Wildland Fire, modelled thousands of fires in Tasmania and found that nearly a third of the state would have to be burned to effectively lower the risk of bushfires.

The question of how much to burn, and where, is a puzzle we must solve, especially given the inherent risk, the problems caused by smoke, and the shrinking weather windows for safe burning due to climate change.

Why use computer simulations?

The major problem fire science faces is gathering data. Landscape-scale experiments involving extreme fire are rare, for obvious reasons of risk and cost. When a major bushfire happens, all the resources go into putting it out and protecting people. Nobody has the time to painstakingly collect data on how fast it is moving and what it is burning. We are therefore restricted to a few limited data sources to reconstruct the behaviour and impact of fire: we can analyse the scar on the landscape after a fire, look at case studies, or run simulations of computer models.

Most research on the effectiveness of prescribed burning has been at a local scale. We need to start thinking bigger: how can we mitigate the effect of multiple large fires in a region like Tasmania or Southeastern Australia? What is the cumulative effect of different prescribed burning strategies?

A large fuel reduction burn off on Hobart’s eastern shore.
Flickr/Mike Rowe, CC BY-NC

To answer these questions, we create models using mathematical equations to simulate the behaviour of fires across actual landscapes. These models include the effects of vegetation type, terrain and fuel loads, under specific weather conditions. If we simulate thousands of these fires we can get an idea of where fire risk is the highest, and how effective prescribed burning is at reducing that risk.
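In caricature, such a model repeatedly ignites fires on a gridded landscape and lets the chance of spread depend on each cell’s fuel load; prescribed burning lowers fuel in treated cells. A toy cellular-automaton sketch of the idea (illustrative only, not the model used in the study):

```python
# Toy cellular-automaton fire spread on a fuel grid - a caricature of the
# landscape-scale fire simulations described above, not the actual model.
import numpy as np

rng = np.random.default_rng(1)
SIZE, N_FIRES = 60, 300

def burned_cells(fuel, p_spread=0.5):
    """Ignite one random cell; fire spreads to each 4-neighbour with a
    probability scaled by that neighbour's remaining fuel."""
    frontier = [(int(rng.integers(SIZE)), int(rng.integers(SIZE)))]
    burnt = set(frontier)
    while frontier:
        x, y = frontier.pop()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < SIZE and 0 <= ny < SIZE and (nx, ny) not in burnt:
                if rng.random() < p_spread * fuel[nx, ny]:
                    burnt.add((nx, ny))
                    frontier.append((nx, ny))
    return len(burnt)

untreated = np.ones((SIZE, SIZE))              # full fuel load everywhere
treated = untreated.copy()
treated[rng.random((SIZE, SIZE)) < 0.3] = 0.3  # prescribe-burn ~30% of cells

for name, fuel in (("untreated", untreated), ("treated", treated)):
    sizes = [burned_cells(fuel) for _ in range(N_FIRES)]
    print(f"{name}: mean fire size = {np.mean(sizes):.0f} cells")
```

Because the treated grid carries less fuel, simulated fires spread less readily; note that even in this toy, a large fraction of the landscape has to be treated before the effect shows up.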

The island of Tasmania offers the perfect study system. Self-contained, with a wide array of vegetation types and fire regimes, it offers an ideal opportunity to see how fire behaves across a diverse landscape. Perhaps more interestingly, the island contains large areas of flammable landscape surrounding globally unique ecosystems and numerous towns and villages. Obviously, we cannot set fire to all of Tasmania in real life, but computer simulations make it possible!

So, encouraged by the Tasmanian Fire Service, who initiated our research, we simulated tens of thousands of fires across Tasmania under a range of prescribed burning scenarios.

Prescribed fire can be effective, in theory

The first scenario we looked at was the best-case scenario: what happens if we perform prescribed burning on all the vegetation that can handle it, given theoretically unlimited resources? It is possible this approximates the sustained and skilful burning by Tasmanian Aboriginal peoples.

Wildfire simulations following this scenario suggested that such an approach would be extremely effective. Importantly, we saw significant reductions in fire activity even in areas where prescribed burning is impossible (for example, due to the presence of people).

Unfortunately, this best-case approach, while interesting from a theoretical perspective, would require prescribed burning over more than 30% of Tasmania in one year.

We also analysed the effects of 12 more realistic scenarios. These realistic plans were less than half as efficient as the best-case scenario at reducing fire activity.

On average, 3 hectares of prescribed burning would reduce wildfire extent by roughly 1 hectare in grasslands and dry forests.

In other flammable Tasmanian vegetation types like buttongrass sedgelands and heathlands, the reduction in wildfire was even smaller. This is obviously better than no prescribed burning, but it highlights the fact that this is a relatively inefficient tool, and given the costs and potential drawbacks, should be used only where it is most needed.

This is a fundamental conundrum of prescribed burning: though it is quite effective in theory, the extent to which we would need to implement it to affect fire behaviour across the entire state is completely unachievable.

Therefore, it is imperative that we not just blindly burn a pre-ordained fraction of the landscape. Rather, we must carefully design localised prescribed burning interventions to reduce risk to communities.

We need a multi-tool approach

Our study has shown that while prescribed burning can be quite effective in certain scenarios, it has serious constraints. Additionally, while we analysed these scenarios under bad fire weather, we were not able to analyse the kind of catastrophic days in which the effect of prescribed burning is seriously reduced, with howling dry winds and stupefying heat.

Unfortunately, due to climate change, we are going to see a lot more catastrophic days in the future in Tasmania and indeed globally.

In Hobart this is of particular concern, as the city is surrounded by tall, wet eucalypt forests that have had fifty years to grow dense understoreys since the 1967 Black Tuesday fires. These have the potential to cause some of the most intense fires on the planet should conditions get dry enough. Prescribed burning is impossible in these forests.


Read more: Where to take refuge in your home during a bushfire


To combat fire risk we must take a multi-pronged approach that includes innovative strategies, such as designing new spatial patterns for prescribed burning, manually removing fuels from areas in which prescribed burning is not possible, improving the standards for buildings and defensible spaces, and most importantly, engaging the community in all of this.

Only by attacking this problem from multiple angles, and through close collaboration with the community and all levels of government, can we effectively face our fiery future.

James Furlaud, PhD Student in Fire Ecology, University of Tasmania and David Bowman, Professor, Environmental Change Biology, University of Tasmania

This article was originally published on The Conversation. Read the original article.