Tess Parker, Monash University and Ailie Gallant, Monash University

At the tail end of winter in 2015, the ground in the Wimmera in northwestern Victoria had been a little dry but conditions weren’t too bad for farmers. The crop season was going well.
The start of September looked promising. It was cool, and there were decent rains. One Wimmera lentil grower said, “As long as it doesn’t get too hot, we should actually be OK.”
A few weeks later, summer weather had arrived early. At the start of October, the soils were baked dry. Lentils and other pulse crops were devastated.
This kind of event, where drier-than-normal conditions transform into severe or extreme drought in the space of weeks, is called a “flash drought”. Flash droughts are still not well understood, but our research examines how they occur in Australia – which may help move us toward being able to warn of flash droughts in advance.
The different kinds of drought
Scientists typically talk about drought as a lack or deficit of available moisture to meet various needs, such as in agriculture or for water resources. We often classify different types of drought depending on where there is a lack of water, or what its effects are:
- meteorological drought is a deficit of rain or other precipitation
- agricultural drought is a deficit of moisture in the soil, and of water evaporating from the soil or transpiring from plants into the air
- hydrological drought is a deficit of water in runoff and surface storage such as dams
- socioeconomic drought is a lack of water that affects the supply and demand of economic goods and services.
Different types of drought can occur at the same time, or a drought may evolve from one type to another. Droughts can last from months to decades, and can cover areas from a local region to most of the continent.
Recently, a new characterisation of drought has been added to the drought spectrum: “flash” drought.
What causes flash droughts?
Flash droughts are droughts that begin suddenly and then rapidly become more intense. Droughts only occur when there is insufficient rainfall, but flash droughts intensify rapidly over timescales of weeks to months because of other factors such as high temperatures, low humidity, strong winds and clear skies.
These conditions make the air “thirsty”, which meteorologists call “increased evaporative demand”. This means more water evaporates from the surface and transpires from plants, and moisture in the soil is rapidly depleted.
Under these conditions, evaporation and transpiration increase for as long as moisture is available at the surface. When this moisture is depleted and there is no rain to replenish it, the lack of water limits evaporation and transpiration – and vegetation becomes stressed as drought emerges.
Why haven’t we heard about flash drought before?
Flash droughts have always existed, and were first described in 2002. However, some particularly devastating flash droughts over the past decade have led to a surge of interest among researchers.
One such drought happened in the US Midwest. In May 2012, 30% of the continental United States was experiencing abnormally dry conditions. By August, that had extended to more than 60%. Although other rapidly developing droughts had been seen before, the widespread impacts of this event caught the attention of the US public and government.
Flash droughts are also increasingly a focus of attention in China and Australia. One of the few studies of flash drought in Australia examined an event when conditions in the country’s east suddenly changed from wet in December 2017, to dry in January 2018.
Anecdotal reports from farmers in the northern Murray–Darling Basin indicated removal of livestock from properties, and sheep numbers at record lows. By June 2018, there were reports of trees dying and a desert-like landscape, with little grass cover.
What happened in the Wimmera?
Our recent study of flash drought in Australia used several different measurements to capture a range of conditions related to drought.
- precipitation describes the supply of moisture from the atmosphere to the surface
- evaporative demand is the atmospheric demand for moisture from the surface
- evaporative stress is the supply of moisture from the surface relative to the demand from the atmosphere
- soil moisture is the wetness or dryness of the land surface.
The index we used to determine the atmospheric demand shows that the speed of development and the intensity of flash drought are driven by high temperatures, low humidity, strong winds and clear skies. All of these increase the demand for moisture from the surface.
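The role of hot, dry air can be illustrated with a toy calculation. The sketch below (Python, purely illustrative; the study’s actual evaporative demand index is more sophisticated) uses the standard Tetens approximation to estimate vapour pressure deficit, a common measure of how “thirsty” the air is. The example temperatures and humidities are assumptions chosen to mimic a mild September day versus an October heatwave day.

```python
import math

def saturation_vapour_pressure(temp_c):
    """Saturation vapour pressure in kPa (Tetens approximation)."""
    return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def vapour_pressure_deficit(temp_c, rel_humidity_pct):
    """Vapour pressure deficit (kPa): the gap between how much water
    vapour the air could hold and how much it actually holds."""
    e_sat = saturation_vapour_pressure(temp_c)
    e_actual = e_sat * rel_humidity_pct / 100.0
    return e_sat - e_actual

# A cool, humid early-September day versus a hot, dry heatwave day
mild = vapour_pressure_deficit(18.0, 70.0)      # roughly 0.6 kPa
heatwave = vapour_pressure_deficit(38.0, 20.0)  # roughly 5.3 kPa
print(f"mild: {mild:.2f} kPa, heatwave: {heatwave:.2f} kPa")
```

The heatwave air is several times “thirstier” than the mild air. A full evaporative demand index also folds in wind and solar radiation, which further raise the demand for moisture from the surface.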
After a drier than normal winter, southeast Australia experienced a cool and wet start to September 2015, with some rain in the first week of the month. Humidity and surface air pressure were roughly average, and surface sunshine below average, suggesting normal evaporative demand.
A warm spell began in mid-September, and intensified into a severe heatwave by early October, with temperatures over 35℃ persisting for several days in some areas. Throughout this period the overlying air became very dry. A persistent high-pressure system brought clear skies and increased sunshine.
By the end of October, the Wimmera was in severe or extreme drought conditions, devastating pulse and grain crops. Analysts estimated wheat production fell by 23%, with a loss of A$500 million in potential yields.
Flash drought in Australia
Flash droughts in Australia occur in all seasons. In the Wimmera, flash droughts are most frequent in summer and autumn. They can end as rapidly as they start, but in some cases may last many months.
In several instances, flash droughts in the Wimmera have started in summer or autumn, and the region has remained in drought through the following winter, and sometimes into spring. In this way, flash drought can be the catalyst for the longer droughts, lasting six to 12 months, that are typical of southeast Australia.
But there is some potential good news. We have long known that seasonal-scale droughts in Australia are strongly related to the El Niño-Southern Oscillation (ENSO), which gives us some ability to predict them.
ENSO strongly affects rainfall, which means it can also be linked to flash droughts in winter and spring.
Further, sub-seasonal forecasting, which predicts the climatic conditions weeks to a month in advance, has improved considerably in recent years. Given flash droughts occur on these timescales, we can be optimistic that prediction of flash droughts may be possible.
Alison O’Donnell, The University of Western Australia; Edward Cook, Columbia University, and Pauline Grierson, The University of Western Australia

Drought over the last two decades has dealt a heavy blow to the wheatbelt of Western Australia, the country’s most productive grain-growing region. Since 2000, winter rainfall has plummeted by almost 20%, shifting grain-growing areas towards the coast.
Our recent research, however, found these dry conditions are nothing out of the ordinary for the region.
In fact, after analysing rings in centuries-old tree trunks, we found the region has seen far worse “megadroughts” over the last 700 years. Australia’s instrumental climate records only cover the last 120 or so years (at best), which means these historic droughts may not have previously been known to science.
Our research also found the 20th century was the wettest of the last seven centuries in the wheatbelt. This is important, because it means scientists have likely been underestimating the actual risk of drought – and this will be exacerbated by climate change.
What we can learn from ancient trees
We estimate the risk of extreme climate events, such as droughts, cyclones and floods, based on what we know from instrumental climate records from weather stations. Extending climate records by hundreds or even thousands of years means scientists would be able to get a much better understanding of climate variability and the risk of extreme events.
Thankfully we can do just that in many parts of the world using proxy records — things like tree rings, corals, stalagmites and ice cores in Antarctica. These record evidence of past climate conditions as they grow.
For example, trees typically create a new layer of growth (“growth ring”) around their trunks, just beneath the bark, each year. The amount of growth generally depends on how much rain falls in the year. The more it rains, the more growth and the wider the ring.
We used growth rings of native cypress trees (Callitris columellaris) near a large salt lake at the eastern edge of the wheatbelt region. These trees can live for up to 1,000 years, perhaps even longer.
We can examine the growth rings of living trees without cutting them down by carefully drilling a small hole into the trunk and extracting a column (“core”) of wood about the size of a drinking straw. By measuring the ring widths, we developed a timeline of tree growth and used this to work out how much rain fell in each year of a tree’s life.
This method allowed us to reconstruct the last 668 years of autumn-winter rainfall in the wheatbelt.
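The calibration step can be sketched in a few lines. The example below (Python; the numbers and the simple linear fit are illustrative assumptions — real dendroclimatology uses more careful statistics, replication across many trees and cross-validation) fits ring width against rainfall over a period where both are known, then applies that relationship to older rings to hindcast rainfall.

```python
def fit_linear(x, y):
    """Ordinary least-squares fit: y = slope * x + intercept."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

# Calibration period: years with both ring widths (mm) and measured
# rainfall (mm). These values are made up for illustration.
ring_width = [0.8, 1.2, 1.0, 1.5, 0.6, 1.1]
rainfall = [210, 320, 270, 390, 160, 300]
slope, intercept = fit_linear(ring_width, rainfall)

# Hindcast: apply the fitted relationship to pre-instrumental rings.
old_rings = [0.7, 1.3, 0.5]
reconstructed = [slope * w + intercept for w in old_rings]
```

Wider rings translate into wetter reconstructed years; chaining overlapping cores from living and dead trees is what extends the record across centuries.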
A history of megadroughts
One of the most pressing questions for the wheatbelt is whether the decline in autumn-winter rainfall observed in recent decades is unusual or extreme. Our extended record of rainfall lets us answer this question.
Yes, rainfall since 2000 has been below the 668-year average — but it has not been extremely low.
The last two decades may seem particularly bad because our expectations of rainfall in the wheatbelt are likely based on memories of higher rainfall. But that wetter period has actually been the anomaly. Our tree rings revealed the 20th century was wetter than any other in the last 700 years, with 12% more rain in the autumn-winter seasons on average than the 19th century.
Before the 20th century, the wheatbelt saw five droughts that were longer and more severe than any we’ve experienced in living memory, or have recorded in instrumental records. This includes two dry periods in the late 18th and 19th centuries that persisted for more than 30 years, making them “megadroughts”.
While the most recent dry period has persisted for almost two decades so far, rainfall during this period is at least 10% higher than it was in the two historical megadroughts.
This suggests prolonged droughts are a natural and relatively common feature of the wheatbelt’s climate.
So how does human-caused climate change play into this?
It’s likely both natural climate variability and human-caused climate change contributed to the wheatbelt’s recent decline in rainfall. Unfortunately, it’s also likely their combined influence will lead to even less rainfall in the near future.
What happens now?
Our findings have important implications for assessing the risk of drought. It’s now clear we need to look beyond these instrumental records to more accurately estimate the risk of droughts for the wheatbelt.
But currently, proxy climate records like tree rings aren’t generally used in drought risk models, as there aren’t many of them in the regions scientists want to research.
Improving risk estimates leads to better informed decisions around preparing for and managing the effects of droughts and future natural disasters.
Our findings are a confronting prospect for the future of farming in the wheatbelt.
Australian farmers have shown tremendous innovation in their ability to adapt in the face of drought, with many shifting from livestock to crops. This resilience will be critical as farmers face a drier, more difficult future.
Alison O’Donnell, Research Fellow in Dendroclimatology, The University of Western Australia; Edward Cook, Ewing Lamont Research Professor, Director Of Tree-Ring Lab, Columbia University, and Pauline Grierson, Director, West Australian Biogeochemistry Centre, The University of Western Australia
Erick Lundgren, University of Technology Sydney; Arian Wallach, University of Technology Sydney, and Daniel Ramp, University of Technology Sydney

In the heart of the world’s deserts – some of the most expansive wild places left on Earth – roam herds of feral donkeys and horses. These are the descendants of a once-essential but now-obsolete labour force.
These wild animals are generally considered a threat to the natural environment, and have been the target of mass eradication and lethal control programs in Australia. However, as we show in a new research paper in Science, these animals do something amazing that has long been overlooked: they dig wells — or “ass holes”.
In fact, we found that ass holes in North America — where feral donkeys and horses are widespread — dramatically increased water availability in desert streams, particularly during the height of summer when temperatures reached near 50℃. At some sites, the wells were the only sources of water.
The wells didn’t just provide water for the donkeys and horses, but were also used by more than 57 other species, including numerous birds, other herbivores such as mule deer, and even mountain lions. (The lions are also predators of feral donkeys and horses.)
Incredibly, once the wells dried up some became nurseries for the germination and establishment of wetland trees.
Ass holes in Australia
Our research didn’t evaluate the impact of donkey-dug wells in arid Australia. But Australia is home to most of the world’s feral donkeys, and it’s likely their wells support wildlife in similar ways.
Across the Kimberley in Western Australia, helicopter pilots regularly saw strings of wells in dry streambeds. However, these have all but disappeared, as mass shootings since the late 1970s have driven donkeys to near local extinction. Only on Kachana Station, where the last of the Kimberley’s feral donkeys are protected, are these wells still to be found.
In Queensland, brumbies (feral horses) have been observed digging wells deeper than their own height to reach groundwater.
Feral horses and donkeys are not alone in this ability to maintain water availability through well digging.
Other equids — including mountain zebras, Grevy’s zebras and the kulan — dig wells. African and Asian elephants dig wells, too. These wells provide resources for other animal species, including the near-threatened argali and the mysterious Gobi desert grizzly bear in Mongolia.
These animals, like most of the world’s remaining megafauna, are threatened by human hunting and habitat loss.
Digging wells has ancient origins
These declines are the modern continuation of an ancient pattern visible since humans left Africa during the late Pleistocene, beginning around 100,000 years ago. As our ancestors stepped foot on new lands, the largest animals disappeared, most likely from human hunting, with contributions from climate change.
If their modern relatives dig wells, we presume many of these extinct megafauna may have also dug wells. In Australia, for example, a pair of common wombats were recently documented digging a 4m-deep well, which was used by numerous species, such as wallabies, emus, goannas and various birds, during a severe drought. This means ancient giant wombats (Phascolonus gigas) may have dug wells across the arid interior, too.
Likewise, a diversity of equids and elephant-like proboscideans that once roamed other parts of the world may have dug wells like their surviving relatives.
Indeed, these animals have left riddles in the soils of the Earth, such as the preserved remnants of a 13,500-year-old, 2m-deep well in western North America, perhaps dug by a mammoth during an ancient drought, as a 2012 research paper proposes.
Acting like long-lost megafauna
Feral equids are resurrecting this ancient way of life. While donkeys and horses were introduced to places like Australia, it’s clear they hold some curious resemblances to some of its great lost beasts.
Our previous research published in PNAS showed introduced megafauna actually make Australia overall more functionally similar to the ancient past, prior to widespread human-caused extinctions.
For example, donkeys and feral horses have trait combinations (including diet, body mass, and digestive systems) that mirror those of the giant wombat. This suggests — in addition to potentially restoring well-digging capacities to arid Australia — they may also influence vegetation in similar ways.
Water is a limited resource, made even scarcer by farming, mining, climate change, and other human activities. With deserts predicted to spread, feral animals may provide unexpected gifts of life in drying lands.
Despite these ecological benefits in desert environments, feral animals have long been denied the care, curiosity and respect native species deservedly receive. Instead, these animals are targeted by culling programs for conservation and the meat industry.
However, there are signs of change. New fields such as compassionate conservation and multispecies justice are expanding conservation’s moral world, and challenging the idea that only native species matter.
Erick Lundgren, PhD Student, Centre for Compassionate Conservation, University of Technology Sydney; Arian Wallach, Lecturer, Centre for Compassionate Conservation, University of Technology Sydney, and Daniel Ramp, Associate Professor and Director, Centre for Compassionate Conservation, University of Technology Sydney
The extreme, recent drought has devastated many communities around the Murray-Darling Basin, but the processes driving drought are still not well understood.
Our new study helps to change this. We threw a weather model into reverse and ran it back for 35 years to study the natural processes leading to low rainfall during drought.
And we found the leading cause for drought in the Murray-Darling Basin was that moisture from oceans didn’t reach the basin as often as normal, and produced less rain when it did. In fact, when moisture from the ocean did reach the basin during drought, the parched land surface actually made it harder for the moisture to fall as rain, worsening the already dry conditions.
These findings can help resolve why climate models struggle to simulate drought well, and ultimately help improve our ability to predict drought. This is crucial for our communities, farmers and bushfire emergency services.
There’s still a lot to learn about rain
The most recent drought was relentless. It saw the lowest rainfall on record in the Murray-Darling Basin, reduced agricultural output, led to increased food prices, and created tinder dry conditions before the Black Summer fires.
Drought in the Murray-Darling Basin is associated with global climate phenomena that drive changes in ocean and atmospheric circulation. These climate drivers include the El Niño and La Niña cycle, the Indian Ocean Dipole and the Southern Annular Mode.
Each influences the probability of rainfall over Australia. But drivers like El Niño can only explain around 20% of the variability in Australian rainfall — they only tell part of the story.
To fully understand the physical processes causing droughts to begin, persist and end, we need to answer the question: where does Australia’s rainfall come from? It may seem basic, but the answer isn’t so simple.
Where does Australia’s rainfall come from?
Broadly, scientists know rainfall derives from evaporation from two main sources: the ocean and the land. But we don’t know exactly where the moisture supplying Australia’s rainfall originally evaporates from, how the moisture supply changes between the seasons nor how it might have changed in the past.
To find out, we used a sophisticated model of Australia’s climate that gave data on atmospheric pressure, temperature, humidity, winds, rainfall and evaporation.
We put this data into a “back-trajectory model”. This traced the path of water from where it fell as rain, backwards in time through the atmosphere, to uncover where the water originally evaporated from. We did this for every day it rained over Australia between 1979 and 2013.
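The idea of a back-trajectory can be shown with a minimal sketch. The code below (Python; the steady wind field, units and coordinates are illustrative assumptions — the study used full three-dimensional reanalysis winds and moisture accounting) steps an air parcel backwards through a wind field, from where rain fell towards where its moisture plausibly evaporated.

```python
def back_trajectory(start_lon, start_lat, hours, dt, wind):
    """Trace an air parcel backwards in time.

    wind(lon, lat, t) returns (u, v): eastward and northward motion
    in degrees per hour (toy units). Returns the list of (lon, lat)
    positions, starting at the rainfall location.
    """
    lon, lat, t = start_lon, start_lat, float(hours)
    path = [(lon, lat)]
    while t > 0:
        u, v = wind(lon, lat, t)
        # Moving backwards in time means subtracting the displacement
        lon -= u * dt
        lat -= v * dt
        t -= dt
        path.append((lon, lat))
    return path

# Toy example: a steady easterly wind carrying air westward at
# 0.5 degrees/hour, so the parcel must have COME from further east.
def steady_easterly(lon, lat, t):
    return (-0.5, 0.0)

path = back_trajectory(145.0, -30.0, hours=24, dt=1.0, wind=steady_easterly)
origin = path[-1]  # 12 degrees further east, over the ocean
```

In the real model, each trajectory is also tagged with where along the path the parcel picked up moisture through evaporation, which is what attributes rainfall to an ocean or land source.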
Not surprisingly, we found more than three-quarters of rain falling in Australia comes from evaporation from the surrounding oceans. So what does this mean for the Murray-Darling Basin?
Up to 18% of rain in the basin starts from the land
During the Millennium Drought and other big drought years (such as in 1982), the Murray-Darling Basin heavily relied on moisture transported from the Tasman and Coral seas for rain. Moisture evaporated off the east coast needs easterly winds to transport it over the Great Dividing Range and into the Murray-Darling Basin, where it can form rain.
This means low rainfall during these droughts was a result of anomalies in atmospheric circulation, which prevented the easterly flow of ocean moisture. The droughts broke when moisture could once again be transported into the basin.
The Murray-Darling Basin was also one of the regions in Australia where most “rainfall recycling” happens. This is when, following rainfall, moisture evaporating from soils and transpiring from plants returns to the atmosphere, sometimes leading to more rain – particularly in spring and summer.
This means if we change the way we use the land or the vegetation, there is a risk we could impact rainfall. For example, when a forest of tall trees is replaced with short grass or crops, humidity can go down and wind patterns change in the atmosphere above. Both of these affect the likelihood of rain.
In the northern part of the basin, less evaporation from the dry land surface exacerbated the low rainfall.
On the other hand, when the drought broke, more moisture evaporated from the damp land surface, adding to the already high levels of moisture coming from the ocean. This meant the region got a surplus of moisture, promoting even more rain.
This relationship was weaker in the southern part of the basin. But interestingly, rainfall there relied on moisture originating from evaporation in the northern basin, particularly during drought breaks. This is a result we need to explore further.
Summer rain not so good for farmers
Rainfall and moisture sources for Australia and the Murray-Darling Basin are changing. In the past 35 years, the southeast of the country has been receiving less moisture in winter, and more in summer.
This is likely due to increased easterly wind flows of moisture from the Tasman Sea in summer, and reduced westerly flows of moisture from the Southern Ocean in winter.
This has important implications, particularly for agriculture and water resource management.
For example, more rainfall in summer can be a problem for horticultural farms, as it can make crops more susceptible to fungal diseases, decrease the quality of wine grape crops and affect harvest scheduling.
Less winter rain also means less runoff into creeks and rivers — a vital process for mitigating drought risk. And this creates uncertainty for dam operators and water resource managers.
Understanding where our rainfall comes from matters, because it can improve weather forecasts, seasonal streamflow forecasts and long-term rainfall impacts of climate change. For a drought-prone country like Australia — set to worsen under a changing climate — this is more crucial than ever.
Chiara Holgate, Hydrologist & PhD Candidate, Australian National University; Albert Van Dijk, Professor, Water and Landscape Dynamics, Fenner School of Environment & Society, Australian National University, and Jason Evans, Professor, UNSW
In box gum grassy woodlands, widely spaced eucalypts tower over carpets of wildflowers, lush native grasses and groves of flowering wattles. It’s no wonder some early landscape paintings depicting Australian farm life are inspired by this ecosystem.
But box gum grassy woodlands are critically endangered. These woodlands grow on highly productive agricultural country, from southern Queensland, along inland slopes and tablelands, into Victoria.
Many are degraded or cleared for farming. As a result, less than 5% of the woodlands remain in good condition. What remains often grows on private land such as farms, and public lands such as cemeteries or travelling stock routes.
Very little is protected in public conservation reserves. And the recent drought and record-breaking heat caused these woodlands to stop growing and flowering.
But after Queensland’s recent drought-breaking rain earlier this year, we surveyed private farmland and found many dried-out woodlands in the northernmost areas transformed into flower-filled, park-like landscapes.
And landholders even came across rarely seen marsupials, such as the southern spotted-tail quoll.
Huge increase in plant diversity
These surveys were part of the Australian government’s Environmental Stewardship Program, a long-term cooperative conservation model with private landholders. It started in 2007 and will run for 19 years.
We found huge increases in previously declining native wildflowers and grasses on the private farmland. Many trees assumed to be dying began resprouting, such as McKie’s stringybark (Eucalyptus mckieana), which is listed as a vulnerable species.
This newfound plant diversity is the result of seeds and tubers (underground storage organs providing energy and nutrients for regrowth) lying dormant in the soil after wildflowers bloomed in earlier seasons. The dormant seeds and tubers were ready to spring into life with the right seasonal conditions.
For example, Queensland Herbarium surveys early last year, during the drought, looked at a 20 metre by 20 metre plot and found only six native grass and wildflower species on one property. After this year’s rain, we found 59 species in the same plot, including many species of perennial grass (up from three species to 20 after the rain), native bluebells and many species of native daisies.
On another property with only 11 recorded species, more than 60 species sprouted after the extensive rains.
In areas where grazing and farming continued as normal (the paired “control” sites), the plots had only around half the number of plant species as areas managed for conservation.
Spotting rare marsupials
Landowners also reported several unusual sightings of animals on their farms after the rains. Stewardship program surveyors later identified them as two species of rare and endangered native carnivorous marsupials: the southern spotted-tailed quoll (mainland Australia’s largest carnivorous marsupial) and the brush-tailed phascogale.
The population status of both these species in southern Queensland is unknown. The brush-tailed phascogale is elusive and rarely detected, while the southern spotted-tailed quolls are listed as endangered under federal legislation.
Until those sightings, there were no recent records of southern spotted-tailed quolls in the local area.
These unusual wildlife sightings are valuable for monitoring and evaluation. They tell us what’s thriving, declining or surviving, compared to the first surveys for the stewardship program ten years ago.
Sightings are also a promising signal for the improving condition of the property and its surrounding landscape.
Changing farm habits
More than 200 farmers signed up to the stewardship program for the conservation and management of nationally threatened ecological communities on private lands. Most have said they’re keen to continue the partnership.
The landholders are funded to manage their farms as part of the stewardship program in ways that will help the woodlands recover, and help reverse declines in biodiversity.
For example, by changing the number of livestock grazing at any one time, and shortening their grazing time, many of the grazing-sensitive wildflowers have a better chance to germinate, grow, flower and produce seeds in the right seasonal conditions.
They can also manage weeds, and not remove fallen timber or loose rocks (bushrock). Fallen timber and rocks protect grazing-sensitive plants and provide habitat for birds, reptiles and invertebrates foraging on the ground.
So can we be optimistic for the future of wildlife and wildflowers of the box gum grassy woodlands? Yes, cautiously so.
Landholders are learning more about how best to manage biodiversity on their farms, but ecological recovery can take time. In any case, we’ve discovered how resilient our flora and fauna can be in the face of severe drought when given the opportunity to grow and flourish.
Climate change is bringing more extreme weather events. Last year was the warmest on record and the nation has been gripped by severe, protracted drought. There’s only so much pressure our iconic wildlife and wildflowers can take before they cross ecological thresholds that are difficult to bounce back from.
More government programs like this, and greater understanding and collaboration between scientists and farmers, create a tremendous opportunity to keep changing that trajectory for the better.
Quentin Grafton, Crawford School of Public Policy, Australian National University; Matthew Colloff, Australian National University; Paul Wyrwoll, Australian National University, and Virginia Marshall, Australian National University
The last bushfire season showed Australians they can no longer pretend climate change will not affect them. But there’s another climate change influence we must also face up to: increasingly scarce water on our continent.
Under climate change, rainfall will become more unpredictable. Extreme weather events such as cyclones will be more intense. This will challenge water managers already struggling to respond to Australia’s natural boom and bust of droughts and floods.
Thirty years since Australia’s water reform project began, it’s clear our efforts have largely failed. Drought-stricken rural towns have literally run out of water. Despite the recent rains, the Murray-Darling river system is being run dry and struggles to support the communities that depend on it.
We must find another way. So let’s start the conversation.
How did we get here?
Sadly, inequitable water outcomes in Australia are not new.
The first water “reform” occurred when European settlers acquired water sources from First Peoples without consent or compensation. Overlaying this dispossession, British common law gave new settlers land access rights to freshwater. These later converted into state-owned rights, and are now allocated as privately held water entitlements.
Some 200 years later, the first steps towards long-term water reform arguably began in the 1990s. The process accelerated during the Millennium Drought and in 2004 led to the National Water Initiative, an intergovernmental water agreement. This was followed in 2007 by a federal Water Act, upending exclusive state jurisdiction over water.
Under the National Water Initiative, state and territory water plans were to be verified through water accounting to ensure “adequate measurement, monitoring and reporting systems” across the country.
This would have boosted public and investor confidence in the amount of water being traded, extracted and recovered – both for the environment and the public good.
This vision has not been realised. Instead, a narrow view now dominates in which water is valuable only when extracted, and water reform is about subsidising water infrastructure such as dams, to enable this extraction.
Why we should all care
In the current drought, rural towns have literally run out of fresh drinking water. These towns are not just dots on a map. They are communities whose very existence is now threatened.
In some small towns, drinking water can taste unpleasant or contain high levels of nitrate, threatening the health of babies. Drinking water in some remote Indigenous communities is not always treated, and the quality rarely checked.
In the Murray-Darling Basin, poor management and low rainfall have caused dry rivers, mass fish kills, and distress in Aboriginal communities. Key aspects of the basin plan have not been implemented. This, coupled with bushfire damage, has caused long-term ecological harm.
How do we fix the water emergency?
Rivers, lakes and wetlands must have enough water at the right time. Only then will the needs of humans and the environment be met equitably – including access to and use of water by First Peoples.
Water for the environment and water for irrigation is not a zero-sum trade-off. Without healthy rivers, irrigation farming and rural communities cannot survive.
A national conversation on water reform is needed. It should recognise and include First Peoples’ values and knowledge of land, water and fire.
Our water brief, Water Reform For All, proposes six principles to build a national water dialogue:
- establish shared visions and goals
- develop clarity of roles and responsibilities
- implement adaptation as a way to respond to an escalation of stresses, including climate change and governance failures
- invest in advanced technology to monitor, predict and understand changes in water availability
- integrate bottom-up and community-based adaptation, including from Indigenous communities, into improved water governance arrangements
- undertake policy experiments to test new ways of managing water for all
Ask the right questions
As researchers, we don’t have all the answers on how to create a sustainable, equitable water future. No-one does. But in any national conversation, we believe these fundamental questions must be asked:
- who is responsible for water governance? How do decisions and actions of one group affect access and availability of water for others?
- what volumes of water are extracted from surface and groundwater systems? Where, when, by whom and for what?
- what can we predict about a future climate and other long-term drivers of change?
- how can we better understand and measure the multiple values that water holds for communities and society?
- where do our visions for the future of water align? Where do they differ?
- what principles, protocols and processes will help deliver the water reform needed?
- how do existing rules and institutions constrain, or enable, efforts to achieve a shared vision of a sustainable water future?
- how do we integrate new knowledge, such as water availability under climate change, into our goals?
- what restitution is needed in relation to water and Country for First Peoples?
- what economic sectors and processes would be better suited to a water-scarce future, and how might we foster them?
Water reform for all
These questions, if part of a national conversation, would reinvigorate the water debate and help put Australia on track to a sustainable water future.
Now is the time to start the discussion. Long-accepted policy approaches in support of sustainable water futures are in question. In the Murray-Darling Basin, some states even question the value of catchment-wide management. The formula for water-sharing between states is under attack.
Even science that previously underpinned water reform is being questioned.
We must return to basics, reassess what’s sensible and feasible, and debate new ways forward.
We are not naive. All of us have been involved in water reform and some of us, like many others, suffer from reform fatigue.
But without a fresh debate, Australia’s water emergency will only get worse. Reform can – and must – happen, for the benefit of all Australians.
The following contributed to this piece and co-authored the report on which it was based: Daniel Connell, Katherine Daniell, Joseph Guillaume, Lorrae van Kerkoff, Aparna Lal, Ehsan Nabavi, Jamie Pittock, Katherine Taylor, Paul Tregoning, and John Williams
Quentin Grafton, Director of the Centre for Water Economics, Environment and Policy, Crawford School of Public Policy, Australian National University; Matthew Colloff, Honorary Senior Lecturer, Australian National University; Paul Wyrwoll, Research fellow, Australian National University, and Virginia Marshall, Inaugural Indigenous Postdoctoral Fellow, Australian National University
Water is a highly contested resource in this long, oppressive drought, and the coal industry is one of Australia’s biggest water users.
Research released today, funded by the Australian Conservation Foundation, has identified how much water coal mining and coal-fired power stations actually use in New South Wales and Queensland. The answer? About 383 billion litres of fresh water every year.
That’s as much water as 5.2 million people – more than the entire population of Greater Sydney – use in the same period. And it’s about 120 times the water used by wind and solar to generate the same amount of electricity.
Monitoring how much water is used by industry is vital for sustainable water management. But a lack of transparency about how much water Australia’s coal industry uses makes this very difficult.
Adani’s controversial Carmichael mine in central Queensland was granted a water licence that allows the company to take as much groundwater as it wants, despite fears it will damage aquifers and groundwater-dependent rivers.
Now more than ever, we must make sure water use by coal mines and power stations is better monitored and managed.
Why does coal need so much water?
Mines in NSW and Queensland account for 96% of Australia’s black coal production.
Almost all water used in coal mines is consumed and cannot be reused. Water is used for coal processing, handling and preparation, dust suppression, on-site facilities, irrigation, vehicle washing and more.
Coal mining’s water-use rate equates to a total consumption of almost 225 billion litres a year in NSW and Queensland, which can be extrapolated to 234 billion litres for black coal mining across Australia (brown coal is not included).
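This extrapolation can be checked with simple arithmetic: NSW and Queensland produce 96% of Australia’s black coal, so, assuming water use scales with production, scaling up their 225 billion litres gives the national figure. A minimal sketch of that calculation:

```python
# Annual water consumed by coal mines in NSW and Queensland
nsw_qld_consumption_bl = 225  # billion litres

# The two states account for 96% of Australia's black coal production;
# assume water use is proportional to production and scale up
share_of_production = 0.96
australia_consumption_bl = nsw_qld_consumption_bl / share_of_production

print(round(australia_consumption_bl))  # -> 234 (billion litres)
```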
About 80% of this water is freshwater from rainfall and runoff, extracted from rivers and water bodies, groundwater inflows or transferred from other mines. Mines are located in regions such as the Darling Downs, the Hunter River and the Namoi River in the Murray-Darling Basin.
The other 20% comes from water already contained in tailings (mine residue), recycled water or seepage from the mines.
The burning of coal to generate energy is also a large water user. Water use in coal-fired power stations is even harder to quantify, with a report from 2009 providing the only available data.
Water is used for cooling with power stations using either a once-through flow or recirculating water system.
The water consumed becomes toxic wastewater stored in ash ponds, or is evaporated during cooling processes. Water withdrawn is returned to rivers, where its increased temperature can damage aquatic life.
Data on total water use by coal mines is not publicly available. Despite the development of Australian and international water accounting frameworks, there is no reporting to these standards in coal mine reports.
This lack of consistent and available data means water use by the coal industry, and its negative effects, is not widely reported or understood. The problem is compounded by complex regulatory frameworks that allow gaps in water-use reporting.
A patchwork of government agencies in each state regulate water licences, quality and discharge, coal mine planning, annual reviews of mine operations and water and environmental impacts. This means that problems can fall through the gaps.
Digging for data
An analysis of annual reviews from 39 coal mines in NSW provided data on water licences and details of water used in different parts of each mine.
Although they are part of mandatory reporting, the method of reporting water use is not standardised. The reviews are required to report against surface water and groundwater licences, but aren’t required to show a comprehensive water balance account. Annual reviews for Queensland coal mines were not available.
Collated water use – both water consumption and water withdrawal – showed coal mining consumes approximately 653 litres for each tonne of coal produced.
This rate is more than 2.5 times a previous water-use rate of 250 litres per tonne, from research in 2010.
Using this rate, the total water consumed by coal mining is 40% more than the total amount of water reported for all types of mining in NSW and Queensland by the Australian Bureau of Statistics in the same year.
By the numbers
NSW and Queensland coal-fired power stations annually consume 158,300 megalitres of water. One megalitre is equivalent to one million litres.
A typical 1,000-megawatt coal-fired power station uses enough water in one year to meet the basic water needs of nearly 700,000 people. NSW and Queensland have 18,000 megawatts of capacity.
Coal-fired generation uses significantly more water than other types of energy.
In total, coal mining and coal-fired power stations in NSW and Queensland consume 383 billion litres of freshwater a year – about 4.3% of all freshwater available in those states.
The value of this water is between A$770 million and A$2.49 billion (using a range of low to high security water licence costs).
Together, coal mining and coal-fired power stations withdraw 2,353 billion litres of freshwater per year.
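The headline total can be reproduced from the component figures: 158,300 megalitres of power-station consumption equals 158.3 billion litres (one megalitre is one million litres), and adding the 225 billion litres consumed by mining gives the 383 billion litre total. A rough check, which also derives the implied daily use behind the 5.2-million-people comparison quoted earlier:

```python
# Coal mining consumption in NSW and Queensland
mining_bl = 225  # billion litres per year

# Coal-fired power stations: 158,300 ML per year;
# 1 ML = 1 million litres, so convert megalitres to billion litres
power_bl = 158_300 * 1e6 / 1e9  # = 158.3 billion litres

total_bl = mining_bl + power_bl
print(round(total_bl))  # -> 383 (billion litres per year)

# Implied per-person daily use for the 5.2 million people comparison
litres_per_person_per_day = total_bl * 1e9 / 5.2e6 / 365
print(round(litres_per_person_per_day))  # -> 202 (litres a day)
```

The implied figure of roughly 200 litres per person per day is the per-capita rate that makes the population comparison consistent with the total.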
The problem with large water use
Coal mining is concentrated in a few regions, such as the Hunter Valley and the Bowen Basin, which are also important for farming and agriculture.
In NSW and Queensland, the coal industry withdraws about 30% as much water as is withdrawn for agriculture, and this withdrawal is concentrated in a few regions.
Coal mining and power stations use water through licences to access surface water and groundwater, and from unlicensed capturing of rainfall and runoff.
This can reduce stream flow and groundwater levels, which can threaten ecosystem habitats if not managed in context of other water users. Cumulative effects of multiple mines in one region can increase the risk to other water users.
The need for a holistic approach
A lack of available data remains a significant challenge to understanding the true impact of coal mining and coal-fired power on Australia’s water resources.
To improve transparency and increase trust in the coal industry, accounting for water consumed, withdrawn and impacted by coal mining should be standardised to report on full water account balances.
The coal industry should also be subject to mandatory monthly reporting and a single, open-access point of water data must be created. Comprehensive water modelling must be updated yearly and audited.
Coal water use must be managed in a holistic manner with the elevation of water accounting to a single government agency or common database.
Australia has a scarce water supply, and our environment and economy depend on the sustainable and equitable sharing of this resource.
Weather-wise, 2019 was a crazy way to end a decade. Fires spread through much of southeast Australia, fuelled by dry vegetation from the ongoing drought and fanned by hot, windy fire weather.
On the other side of the Indian Ocean, torrential rainfall and flooding devastated parts of eastern Africa. Communities there now face a locust plague and food shortages.
These intense events can partly be blamed on the extreme positive Indian Ocean Dipole, a climate phenomenon that unfolded in the second half of 2019.
The Indian Ocean Dipole refers to the difference in sea surface temperature on either side of the Indian Ocean, which alters rainfall patterns in Australia and other nations in the region. The dipole is a lesser-known relative of the Pacific Ocean’s El Niño.
Climate drivers, such as the Indian Ocean Dipole, are an entirely natural phenomenon, but climate change is modifying the behaviour of these climate modes.
In research published today in Nature, we reconstructed Indian Ocean Dipole variability over the last millennium. We found “extreme positive” Indian Ocean Dipole events like last year’s are historically very rare, but becoming more common due to human-caused climate change. This is big news for a planet already struggling to contain global warming.
So what does this new side-effect of climate change mean for the future?
The Indian Ocean brings drought and flooding rain
First, let’s explore what a “positive” and “negative” Indian Ocean Dipole means.
During a “positive” Indian Ocean Dipole event, waters in the eastern Indian Ocean become cooler than normal, while waters in the western Indian Ocean become warmer than normal.
Warmer water causes warm, moist air to rise, bringing intense rainfall and flooding to east Africa. At the same time, atmospheric moisture is reduced over the cool waters of the eastern Indian Ocean. This turns off one of Australia’s important rainfall sources.
The Indian Ocean Dipole also has a negative phase, which is important to bring drought-breaking rain to Australia. But the positive phase is much stronger and has more intense climate impacts.
We’ve experienced extreme positive Indian Ocean Dipole events before. Reliable instrumental records of the phenomenon began in 1958, and since then a string of very strong positive Indian Ocean Dipoles have occurred in 1961, 1994, 1997 and now 2019.
But this instrumental record is very short, and it’s tainted by the external influence of climate change.
This means it’s impossible to tell from instrumental records alone how extreme Indian Ocean Dipoles can be, and whether human-caused climate change is influencing the phenomenon.
Diving into the past with corals
To uncover just how the Indian Ocean Dipole has changed, we looked back through the last millennium using natural records: “cores” taken from nine coral skeletons (one modern, eight fossilised).
These coral samples were collected just off Sumatra, Indonesia, so they’re perfectly located for us to reconstruct the distinct ocean cooling that characterises positive Indian Ocean Dipole events.
Corals grow a lot like trees. For every year they live they produce a growth band, and individual corals can live for more than 100 years. Measuring the oxygen in these growth bands gives us a detailed history of the water temperature the coral grew in, and the amount of rainfall over the reef.
In other words, the signature of extreme events like past positive Indian Ocean Dipoles is written in the coral skeleton.
Altogether, our coral-based reconstruction of the Indian Ocean Dipole covers 500 years, spread between 1240 and 2019. There are gaps in the timeline, but we have the best picture so far of exactly how the Indian Ocean Dipole has varied in the past.
How unusual was the 2019 Indian Ocean Dipole event?
Extreme events like the 2019 Indian Ocean Dipole have historically been very rare.
We found only ten extreme positive Indian Ocean Dipole events in the entire record. Four occurred in the past 60 years, but only six occurred in the remaining 440 years before then. This adds more weight to evidence that positive Indian Ocean Dipole events have been occurring more often in recent decades, and becoming more intense.
But another finding from the reconstruction surprised – and worried – us. Events like 2019 aren’t the worst of what the Indian Ocean Dipole can throw at us.
Of the extreme events we found in our reconstruction, one of them, in 1675, was much stronger than anything we’ve seen in observations from the last 60 years.
The 1675 event was around 30–40% stronger than what we saw in 1997 (around the same magnitude as 2019). Historical accounts from Asia show this event was disastrous, and the severe drought it caused led to crop failures, widespread famine and mortality, and incited war.
As far as we can tell, this event shows just how extreme Indian Ocean Dipole variability can be, even without any additional prompting from external forces like human-caused climate change.
Why should we care?
Indian Ocean Dipole variability will continue to episodically bring extreme climate conditions to our region.
But previous studies, as well as ours, have shown human-caused climate change has shortened the gaps between these episodes, and this trend will continue. This is because climate change is causing the western side of the Indian Ocean to warm faster than in the east, making it easier for positive Indian Ocean Dipole events to establish.
In other words, drought-causing positive Indian Ocean Dipole events will become more frequent as our climate continues to warm.
In fact, climate model projections indicate extreme positive Indian Ocean Dipole events will occur three times more often this century than last, if high greenhouse gas emissions continue.
This means events like last year will almost certainly unfold again soon, and we’re upping the odds of even worse events that, through the fossil coral data, we now know are possible.
Knowing we haven’t yet seen the worst of the Indian Ocean Dipole is important in planning for future climate risks. Future extremes from the Indian Ocean will act on top of long-term warming, giving a double-whammy effect to their impacts in Australia, like the record-breaking heat and drought of 2019.
But perhaps most importantly, rapidly cutting greenhouse gas emissions will limit how often positive Indian Ocean Dipole events occur in future.
Nicky Wright, Research Fellow, Australian National University; Bethany Ellis, PhD Candidate, Australian National University, and Nerilie Abram, Professor; ARC Future Fellow; Chief Investigator for the ARC Centre of Excellence for Climate Extremes, Australian National University
The drought in eastern Australia was a significant driver of this season’s unprecedented bushfires. But it also caused another, less well known environmental calamity this summer: entire hillsides of trees turned from green to brown.
We’ve observed extensive canopy dieback from southeast Queensland down to Canberra. Reports of more dead and dying trees from other regions across Australia are flowing in through the citizen science project, the Dead Tree Detective.
A few dead trees are not an unusual sight during a drought. But in some places, it is the first time in living memory so much canopy has died off.
Ecologists are now pondering the implications. There are warnings that some Australian tree species could disappear from large parts of their ranges as the climate changes. Could we be witnessing the start of ecosystem collapse?
Why are canopies dying now?
Much of eastern Australia has been in drought since the start of 2017. While this drought is not yet as long as the Millennium Drought, it appears to be more intense. Many areas have received the lowest rainfall on record, including long periods of time with no rainfall. This has been coupled with above-average temperatures and extreme heatwaves.
The higher the temperature, the greater the moisture loss from leaves. This is usually good for a tree because it cools the canopy. But if there is not enough water in the soil, the increased water loss can push trees over a threshold, causing extensive leaf “scorching”, or browning. The extensive canopy dieback we have observed this summer suggests that the soil had finally become too dry for many trees.
Are the trees dead?
Brown or bare trees are not necessarily dead. Many eucalypts can lose all their leaves but resprout after rain.
Many parts of eastern Australia are now flushed with green after rain. In these areas, it will be important to assess the extent of tree recovery. If trees are not showing signs of recovery after significant rainfall, they’re unlikely to survive. In some cases carbohydrate reserves – which trees need to resprout new leaves – may be too depleted for trees to recover.
The drought may also hinder post-fire recovery. Most eucalypt forests eventually recover from bushfires by resprouting new leaves. Some forests also recover when fire triggers seedlings to germinate.
But it’s likely that some forests now recovering from fire were already struggling with canopy dieback. So these two disturbances will test how resilient our forests are to back-to-back drought and bushfire.
Trees recovering from drought and/or fire may also enter the “dieback spiral”. The new flush of leaves following rain can make a particularly tasty meal for insects. Trees will then attempt to grow more foliage in response, but their ability to keep producing new leaves gradually declines as they deplete their carbohydrate reserves, and they can die.
The dieback spiral has led to extensive tree loss in the past, including in the New England area of NSW.
Should we be worried?
The capacity of eucalypts to resprout makes them naturally resilient to extended drought. There are some records of canopy dieback from severe droughts in the past, such as the Federation Drought. We assume (although we don’t know for sure) the forests recovered after these events. So they may bounce back after the current drought.
However, it’s hard not to be concerned. Climate change will bring increased drought, heatwaves and fires that could, over time, see extensive losses of trees across the landscape – as happened on the Monaro High Plain after the Millennium Drought.
Australian research in 2016 warned that due to climate change, the habitat of 90% of eucalypt species could decline and 16 species were expected to lose their home environments within 60 years.
Such a change would have huge consequences for how ecosystems function – reducing the capacity for ecosystem services such as carbon storage, altering catchment water resources and reducing habitat for native animals.
Where to from here?
Landholders can help bush on their property recover after drought, by protecting germinating seedlings from livestock and collecting local seed for later revegetation. Trees that appear dead should not be cut down as they may recover, and even if dead can provide valuable animal habitat.
Most importantly, however, we need to monitor trees carefully to see where they’ve died, and where they are recovering. A citizen science project, the Dead Tree Detective, is helping map the extent of tree die-off across Australia.
People send in photos of dead and dying trees – to date, over 267 records have been uploaded. These records can be used to target where to monitor forests during drought, including on-ground assessments of tree health and quantifying the physiological responses of trees to drought stress.
There is no ongoing forest health monitoring program in Australia, so this dataset is invaluable in helping us determine exactly how vulnerable Australia’s forests are to the double whammy of severe drought and bushfires.
Rachael Helene Nolan, Postdoctoral research fellow, Western Sydney University; Belinda Medlyn, Professor, Western Sydney University; Brendan Choat, Associate Professor, Western Sydney University, and Rhiannon Smith, Research Fellow, University of New England