For the first time, we can measure the human footprint on Antarctica



Casey Station is one of Australia’s three permanent research outposts in Antarctica.
Shaun Brooks, Author provided

Shaun Brooks, University of Tasmania and Julia Jabour, University of Tasmania

Most people picture Antarctica as a frozen continent of wilderness, but people have been living – and building – there for decades. Now, for the first time, we can reveal the human footprint across the entire continent.

Our research, published today in the journal Nature Sustainability, found that while buildings and disturbance cover a small portion of the whole continent, they have an outsized impact on Antarctica’s ecosystem.




Read more:
Explainer: what any country can and can’t do in Antarctica, in the name of science


Our data show 76% of buildings in Antarctica are within just 0.06% of the continent: the ice-free areas within 5km of the coast. This coastal fringe is particularly important as it provides access to the Southern Ocean for penguins and seals, as well as providing a typically wetter climate suitable for plant life.

A hard question to answer

How much land we collectively impact with infrastructure in Antarctica has been a question raised for decades, but until now has been difficult to answer. The good news is it’s a relatively small area. The bigger issue is where it is. Together with our colleagues Dana Bergstrom and John van den Hoff, we have made the first measurement of the “footprint” of buildings and disturbed ice-free ground across Antarctica.

This equates to more than 390,000 square metres of buildings on the icy continent, with a further 5,200,000m² of disturbance to ice-free land alone. To put it another way, there is more than 1,100m² of disturbed ground per person in Antarctica at its most populated in summer. This is caused primarily by the 30 nations with infrastructure in Antarctica, along with some presence from the tourism industry.
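As a rough consistency check of these figures, the per-person number can be recomputed from the totals. The peak summer population of about 5,000 people is an assumed round number for illustration, not a figure given in the article:

```python
# Rough consistency check of the per-person footprint figure.
# The ~5,000 peak summer population is an assumption for illustration.
building_m2 = 390_000        # measured building footprint
disturbed_m2 = 5_200_000     # disturbed ice-free ground
summer_population = 5_000    # assumed peak summer population

per_person = (building_m2 + disturbed_m2) / summer_population
print(round(per_person))  # about 1,118 m², consistent with "more than 1,100 m²" per person
```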

Building footprint density.
Nature Sustainability

It has taken until now to find the extent of our impact because of difficulty in gathering the data. Because so many countries are active in Antarctica, getting them to provide data on their infrastructure has been very slow. As two-thirds of research stations were built before the adoption of the Protocol on Environmental Protection to the Antarctic Treaty, they did not require environmental impact assessments or monitoring, so it is quite likely many of the operators do not have accessible data on their footprints. In addition, due to the inherent difficulty in accessing Antarctica, and the vast distances between each station, it is not possible to conduct field measurements on a continental scale.

To address these problems, our team took an established approach to measuring a single station’s footprint, and applied it to 158 locations across the continent using satellite imagery. The majority of images used were freely sourced from Google Earth, enabled by continual improvements in resolution and coverage.

This process took hours of painstaking “digitisation” – where the spatially accurate images of buildings and disturbed ground were manually mapped within a computer program to create the data.

Davis Station, one of Australia’s three permanent research outposts in Antarctica. Researchers used Google Earth images to map the footprint of human infrastructure across the continent.
Shaun Brooks, Author provided

Interestingly, one of the most difficult sites was the United States’ Amundsen-Scott Station. As this station is located on the geographic South Pole, very few satellites pass overhead. This problem was eventually solved by trawling through thousands of aerial images produced by NASA’s Operation IceBridge, where we found their aircraft had flown over the station in 2010. After capturing these data, we then compared our measurements against existing known building sizes and found our accuracy was within 2%.

Unlike buildings, we didn’t have measurements to compare for disturbed ground such as roadways, airstrips, quarries and the like. We believe we have produced a significant underestimate, due to factors including snow cover and insufficient image resolution obscuring smaller features such as walking tracks.




Read more:
Antarctic seas host a surprising mix of lifeforms – and now we can map them


Location, location, location

After mapping the footprint of buildings and ground disturbance, our data have yielded some interesting results. For practical reasons, most stations in Antarctica are located within the small ice-free areas spread across the continent, particularly around the coast. In addition to being attractive to us, these areas are essential for much of Antarctica’s biodiversity, providing nesting sites for seabirds and penguins, substrate for mosses, lichens, and two vascular plants, and habitat for the continent’s invertebrate species.

Adelie penguins need ice-free areas to access the ocean.
Shaun Brooks, Author provided

Another interesting finding from these data is what they tell us about wilderness on the continent. Although the current footprint covers a very small fraction of the more than 12 million square kilometres of Antarctica, we found disturbance is present in more than half of all large ice-free areas along the coast. Furthermore, by using the building data we captured, along with existing work by Rupert Summerson, we were also able to estimate the visual footprint, which amounts to an area similar in size to the total ice-free land across the whole continent.

The release of this research is timely, with significant increases in infrastructure proposed for Antarctica. Currently there are new stations proposed by several nations, major rebuilding projects of existing stations underway (including the US’s McMurdo and New Zealand’s Scott Base), and Italy is building a new runway in ice-free areas.

Australia has proposed Antarctica’s first concrete runway, which if built would be the continent’s largest.




Read more:
Why Antarctica’s sea ice cover is so low (and no, it’s not just about climate change)


Until now, decisions on expanding infrastructure have been made without the context of how much is already present. We hope informed decisions can now be made by the international community about how much building in Antarctica is appropriate, where it should occur, and how to manage the future of the last great wilderness.

Shaun Brooks, PhD Candidate, University of Tasmania and Julia Jabour, Leader, Ocean and Antarctic Governance Research Program, University of Tasmania

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Trails on trial: which human uses are OK for protected areas?



Mountain biking seems harmless but can damage soil and scare wildlife.
Pixabay

Bill Laurance, James Cook University and David Salt, Australian National University

There’s no question about it: parks and protected areas are the absolute cornerstone of our efforts to protect nature. In the long term, we can’t save wildlife and ecosystems without them.

But people want to use parks too, and in rapidly growing numbers. Around the world, parks are destinations for recreational activities like hiking, bird-watching and camping, as well as noisier affairs such as mountain-biking, snowmobiling and four-wheel-driving.

Where do we draw the line?

Road risks

Let’s start by looking at the roads that take us into and through parks. They can be a double-edged sword.

Roads are needed to allow tourists to access parks, but we have to be very careful where and how we build them.

Road for an industrial gold mine slicing through Panamanian rainforest.
Susan Laurance

In regions where law enforcement is weak, roads can rip apart a forest — sharply increasing illegal activities such as poaching, deforestation and mining.

According to my (Bill’s) research, new roads – often driven by foreign mining or timber investors from nations such as China – could damage up to a third of all the protected areas in sub-Saharan Africa.




Read more:
The global road-building explosion is shattering nature


In Nouabale Ndoke Park in the Congo Basin, poaching wasn’t a big problem until a new road was built along the edge of the park.

Suddenly the fatal rak-rak-rak of AK-47 rifles – often aimed at elephants by ivory poachers – was being heard all too often.

Bill Laurance examines a forest elephant slaughtered by poachers in the Congo. The elephant’s face had been hacked off to extract its valuable ivory tusks.
Mahmoud Mahmoud

Trails on trial

Roads are one thing, but what about a simple bike trail or walking track? They let in people too. But they are harmless, right?

Not always. A 2010 Canadian study found that mountain biking causes a range of environmental impacts, including tyres chewing up the soil and causing compaction and erosion. This is a significant problem for fragile alpine vegetation in mountainous areas where many bikers like to explore.

Rapidly moving cyclists can also scare wildlife. In North America and Europe, many wild species, such as bears, wolves, caribou and bobcats, have been shown to flee or avoid areas frequented by hikers or bikers.

In Indonesia, even trails used by ecotourists and birdwatchers scared away some sensitive wildlife species or caused them to shift to being active only at night.

The red panda, an endangered species. Some wildlife avoid areas with even limited human use.
Pixabay

Every type of human activity – be it hiking or biking or horse riding – has its own signature impact on nature. We simply don’t know the overall effect of human recreation on parks and protected areas globally.

However, a study earlier this year found that roughly one-third of all terrestrial protected areas worldwide – a staggering 6 million square kilometres, an area bigger than Kenya – is already under “intense” human pressure.




Read more:
One-third of the world’s nature reserves are under threat from humans


Roads, mines, industrial logging, farms, townships and cities all threaten these supposedly protected places. And on top of that are the impacts – probably lesser but still unquantified – of more benign human activities aimed at enjoying nature.

Keep people out?

Is the answer to stop people from visiting parks?

Not really. Visitors in many parts of the world help to fund the operation of national parks, and provide vital income for local people.

Exposure to nature is also one of the best ways to enhance human health, build support for environmental protection, and generate political momentum for the establishment of new protected areas.

A hiker in the Leuser Ecosystem, Indonesia.
William Laurance

What’s more, locking people out of land is a very unpopular thing to do. Governments that block people from accessing nature reserves often face an electoral backlash.

How to manage humanity

If we accept that people must be able to use parks, what’s the best way to limit their impacts on ecosystems and wildlife? One way is to encourage them to stay on designated trails and tourist routes.

A recent study (using geotagged data from photos) showed that half of all photos by park visitors were taken in less than 1% of each park.

In other words, most visitors use only a small, highly trafficked part of each park. That’s good news for nature.

If people tend to limit their activities to the vicinity of pretty waterfalls, spectacular vistas, and designated hiking areas, that leaves much of the park available for sensitive animals and ecosystems.

Forest elephants in Central Africa. In the past decade, two-thirds of all forest elephants have been wiped out by poachers and expanding roads.
Thomas Breuer/ Wikipedia, CC BY-SA

There are many opportunities for practical science and management. We want to help design protected areas in a way that lets people enjoy them – but which also focuses their activities in particular areas while retaining large intact areas where wildlife can roam free with little human disturbance.

And while we’re designing our parks, we want to use every opportunity, and every visit, to educate and empower tourists. We need people using parks to understand, appreciate, and stand up for nature, rather than thinking of parks as simply playgrounds.

Bill Laurance, Distinguished Research Professor and Australian Laureate, James Cook University and David Salt, Science writer and editor, Australian National University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Google’s artificial intelligence finds two new exoplanets missed by human eyes


Artist impression of Kepler-90i, the eighth planet discovered orbiting around Kepler-90.
NASA

Jake Clark, University of Southern Queensland

Two new exoplanets have been discovered thanks to NASA’s collaboration with Google’s artificial intelligence (AI). One of those in today’s announcement is an eighth planet – Kepler-90i – found orbiting the Sun-like star Kepler-90. This makes it the first system discovered with as many planets as our own Solar system.

A mere road trip away, at 2,545 light-years from Earth, Kepler-90i orbits its host star every 14.4 Earth days, with a sizzling surface temperature similar to Venus of 426°C.

The new exoplanets are added to the growing list of known worlds found orbiting other stars.

The Kepler-90 planets have a similar configuration to our solar system with small planets found orbiting close to their star, and the larger planets found farther away.
NASA/Ames Research Center/Wendy Stenzel

This new Solar system rival provides evidence that a similar process occurred within Kepler-90 that formed our very own planetary neighbourhood: small terrestrial worlds close to the host star, and larger gassy planets further away. But to say the system is a twin of our own Solar system is a stretch.


Read more: Exoplanet discovery by an amateur astronomer shows the power of citizen science


The entire Kepler-90 system of eight planets would easily fit within Earth’s orbit of the Sun. All eight planets, bar Kepler-90h, would be too hostile for life, lying outside the so-called habitable zone.

Evidence also suggests that planets within the Kepler-90 system started out farther apart, much like our own Solar system. Some form of migration occurred, dragging this system inwards, producing the orbits we see in Kepler-90 today.

Kepler-90 is a Sun-like star, but all of its eight planets are scrunched into the equivalent distance of Earth to the Sun.
NASA/Ames Research Center/Wendy Stenzel

Google’s collaboration with NASA’s space telescope Kepler mission has now opened up new and exciting opportunities into AI helping with scientific discoveries.

So how exactly did Google’s AI discover these planets? And what sort of future discoveries can this technology provide?

Training AI for exoplanet discoveries

Traditionally, software developers program computers to perform a particular task, from playing your favourite cat video to determining exoplanetary signals from space-based telescopes such as NASA’s Kepler Mission.

These programs are executed to serve a single purpose. Using code intended for cat videos to hunt exoplanets in light curves would lead to some very interesting, yet false, results.

Google’s AI is programmed rather differently, using machine learning. In machine learning, AI is trained through artificial neural networks – somewhat replicating our brain’s biological neural networks – to perform tasks like reading this article. It then learns from its mistakes, becoming more efficient at its particular task.

Google’s DeepMind AI, AlphaGo, was trained previously to play Go, an extremely complex yet elegant Chinese board game. Last year, AlphaGo defeated Lee Sedol, the world’s best Go player, by four games to one. It simply trained itself by watching thousands of previously played games, then competing against itself.

In our exoplanetary case, the AI was trained to identify transiting exoplanets, sifting through 15,000 signals from the Kepler exoplanet catalogue. It learned what was and wasn’t a signal caused by an exoplanet eclipsing its host star. These 15,000 signals had been vetted by NASA scientists prior to the AI’s training, guiding it to 96% accuracy in detecting known exoplanets.
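The training loop described above can be sketched in miniature. The following is emphatically not the Google/NASA code: it trains a single artificial neuron by gradient descent on an invented one-dimensional “transit depth” feature, purely to illustrate how a classifier learns to separate planet from non-planet signals from labelled examples. The real system was a deep neural network operating on full Kepler light curves.

```python
import math, random

# Toy illustration of supervised training on labelled signals.
# The synthetic "transit depth" feature and all numbers are invented.
random.seed(42)

def make_signal(is_planet):
    # Planet signals get a deeper average dip than false positives.
    depth = random.gauss(1.0 if is_planet else 0.2, 0.25)
    return depth, is_planet

data = [make_signal(i % 2 == 0) for i in range(2000)]
train, test = data[:1500], data[1500:]

# Single-neuron (logistic) classifier trained by per-sample gradient descent.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(200):
    for x, y in train:
        p = 1 / (1 + math.exp(-(w * x + b)))   # predicted probability of "planet"
        grad = p - (1.0 if y else 0.0)          # error signal: learn from mistakes
        w -= lr * grad * x
        b -= lr * grad

accuracy = sum((1 / (1 + math.exp(-(w * x + b))) > 0.5) == y
               for x, y in test) / len(test)
print(f"held-out accuracy: {accuracy:.2f}")
```

On this easy synthetic data the neuron reaches high accuracy quickly; the hard part in the real problem is that genuine transit signals are rare, weak and buried in stellar noise.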

Researchers then directed their AI network to search in multiplanetary systems for weaker signals. This research culminated in today’s announcement of both Kepler-90i and another Earth-sized exoplanet, Kepler-80g, in a separate planetary system.

Hunting for more exoplanets using AI

Google’s AI has analysed only 10% of the 150,000 stars NASA’s Kepler Mission has been eyeing off across the Milky Way galaxy.

How AI helps in the hunt for other exoplanets.

There’s potential, then, for sifting through Kepler’s entire rich data set and finding other exoplanetary worlds that have either been skimmed over by scientists or haven’t been checked yet. And that’s exactly what Google’s researchers are planning to do.

Machine learning neural networks have been assisting astronomers for a few years now. But the potential for AI to assist in exoplanetary discoveries will only increase within the next decade.

Beyond Kepler

The Kepler mission has been running since 2009, with observations slowly coming to an end. Within the next 12 months, all of its on-board fuel will be fully depleted, ending what has been one of the greatest scientific endeavours in modern times.

NASA’s new TESS mission will inundate astronomers with 20,000 exoplanetary candidates in the next two years.
Chet Beals/MIT Lincoln Lab

Kepler’s successor, the Transiting Exoplanet Survey Satellite (TESS) will be launching this coming March.


Read more: A machine astronomer could help us find the unknowns in the universe


TESS is predicted to find 20,000 exoplanet candidates during its two-year mission. To put that into perspective, in the past 25 years, we’ve managed to discover just over 3,500.

This unprecedented inundation of exoplanetary data will need to be confirmed either by other transit observations or by other methods such as ground-based radial velocity measurements.

There just isn’t enough people-power to sift through all of this data. That’s why these machine learning networks are needed, so they can aid astronomers in sifting through big data sets, ultimately assisting in more exoplanetary discoveries. Which raises the question: who exactly gets credit for such a discovery?

Jake Clark, PhD Student, University of Southern Queensland

This article was originally published on The Conversation. Read the original article.

Human noise pollution is disrupting parks and wild places



A red fox listening for prey under the snow in Yellowstone National Park. Noise can affect foxes and other animals that rely on their hearing when they hunt.
Neal Herbert/NPS

Rachel Buxton, Colorado State University

As transportation networks expand and urban areas grow, noise from sources such as vehicle engines is spreading into remote places. Human-caused noise has consequences for wildlife, entire ecosystems and people. It reduces our ability to hear natural sounds, which can mean the difference between life and death for many animals, and degrades the calming effect that we feel when we spend time in wild places.

Protected areas in the United States, such as national parks and wildlife refuges, provide places for respite and recreation, and are essential for natural resource conservation. To understand how noise may be affecting these places, we need to measure all sounds and determine what fraction come from human activities.

In a recent study, our team used millions of hours of acoustic recordings and sophisticated models to measure human-caused noise in protected areas. We found that noise pollution doubled sound energy in many U.S. protected areas, and that noise was encroaching into the furthest reaches of remote areas.

Pine siskin song as a car passes by, Rocky Mountain National Park.
Recorded by Jacob Job, research associate with Colorado State University and the National Park Service, Author provided

Our approach can help protected area managers enhance recreation opportunities for visitors to enjoy natural sounds and protect sensitive species. These acoustic resources are important for our physical and emotional well-being, and are beautiful. Like outstanding scenery, pristine soundscapes where people can escape the clamor of everyday life deserve protection.

What is noise pollution?

“Noise” is an unwanted or inappropriate sound. We focused on human sources of noise in natural environments, such as sounds from aircraft, highways or industrial sources. According to the Environmental Protection Agency, noise pollution is noise that interferes with normal activities, such as sleeping and conversation, and disrupts or diminishes our quality of life.

Human-caused noise in protected areas interferes with visitors’ experience and alters ecological communities. For example, noise may scare away carnivores, resulting in inflated numbers of prey species such as deer. To understand noise sources in parks and inform management, the National Park Service has been monitoring sounds at hundreds of sites for the past two decades.

Estimating human-generated noise

Noise is hard to quantify at large-landscape scales because it can’t be measured by satellite or other visual observations. Instead researchers have to collect acoustic recordings over a wide area. NPS scientists on our team used acoustic measurements taken from 492 sites around the continental United States to build a sound model that quantified the acoustic environment.

National Park Service staff set up an acoustic recording station as a car passes on Going-to- the-Sun Road in Glacier National Park, Montana.
National Park Service

They used algorithms to determine the relationship between sound measurements and dozens of geospatial features that can affect measured average sound levels. Examples include climate data, such as precipitation and wind speed; natural features, such as topography and vegetation cover; and human features, such as air traffic and proximity to roads.

Using these relationships, we predicted how much human-caused noise is added to natural sound levels across the continental United States.

To get an idea of the potential spatial extent of noise pollution effects, we summarized the amount of protected land experiencing human-produced noise three or 10 decibels above natural. These increments represent a doubling and a 10-fold increase, respectively, in sound energy, and a 50 to 90 percent reduction in the distance at which natural sounds can be heard. Based on a literature review, we found that these thresholds are known to impact human experience in parks and have a range of repercussions for wildlife.
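The decibel-to-energy conversion behind those two thresholds is the standard one: an increase of Δ dB multiplies sound energy by 10^(Δ/10), so 3 dB is very nearly a doubling and 10 dB is exactly a tenfold increase. A quick check:

```python
# Sound energy scales as 10**(dB/10), so the study's two thresholds
# correspond to roughly a doubling and exactly a tenfold increase.
def energy_ratio(delta_db):
    return 10 ** (delta_db / 10)

print(energy_ratio(3))   # ~1.995, effectively a doubling of sound energy
print(energy_ratio(10))  # 10.0, a tenfold increase
```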

Few escapes from noise

The good news is that in many cases, protected areas are quieter than surrounding lands. However, we found that human-caused noise doubled environmental sound in 63 percent of U.S. protected areas, and produced a tenfold or greater increase in 21 percent of protected areas.

Map of projected ambient sound levels for a typical summer day across the contiguous United States, where lighter yellow indicates louder conditions and darker blue indicates quieter conditions.
Rachel Buxton, Author provided

Noise depends on how a protected area is managed, where a site is located and what kinds of activities take place nearby. For example, we found that protected areas managed by local government had the most noise pollution, mainly because they were in or near large urban centers. The main noise sources were roads, aircraft, land-use conversion and resource extraction activities such as oil and gas production, mining and logging.

We were encouraged to find that wilderness areas – places that are preserved in their natural state, without roads or other development – were the quietest protected areas, with near-natural sound levels. However, we also found that 12 percent of wilderness areas experienced noise that doubled sound energy. Wilderness areas are managed to minimize human influence, so most noise sources come from outside their borders.

Finally, we found that many endangered species, particularly plants and invertebrates, experience high levels of noise pollution in their critical habitat – geographic areas that are essential for their survival. Examples include the Palos Verdes Blue butterfly, which is found only in Los Angeles County, California, and the Franciscan manzanita, a shrub that once was thought extinct, and is found only in the San Francisco Bay area.

Of course plants can’t hear, but many species with which they interact are affected by noise. For example, noise changes the distribution of birds, which are important pollinators and seed dispersers. This means that noise can reduce the recruitment of seedlings.

F-4 fighter jets pass through ‘Star Wars Canyon’ in Death Valley National Park, a spot popular with military pilots.

Turning down the volume

Noise pollution is pervasive in many protected areas, but there are ways to reduce it. We have identified noisy areas that will quickly benefit from noise mitigation efforts, especially in habitats that support endangered species.

Strategies to reduce noise include establishing quiet zones where visitors are encouraged to quietly enjoy protected area surroundings, and confining noise corridors by aligning airplane flight patterns over roads. Our work provides insights for restoring natural acoustic environments, so that visitors can still enjoy the sounds of birdsong and wind through the trees.

Rachel Buxton, Postdoctoral Research Fellow, Colorado State University

This article was originally published on The Conversation. Read the original article.

Yes, the Arctic’s freakishly warm winter is due to humans’ climate influence


Andrew King, University of Melbourne

For the Arctic, like the globe as a whole, 2016 has been exceptionally warm. For much of the year, Arctic temperatures have been much higher than normal, and sea ice concentrations have been at record low levels.

The Arctic’s seasonal cycle means that the lowest sea ice concentrations occur in September each year. But while September 2012 had less ice than September 2016, this year the ice coverage has not increased as expected as we moved into the northern winter. As a result, since late October, Arctic sea ice extent has been at record low levels for the time of year.

Late 2016 has produced new record lows for Arctic ice.
NSIDC, Author provided

These record low sea ice levels have been associated with exceptionally high temperatures for the Arctic region. November and December (so far) have seen record warm temperatures. At the same time Siberia, and very recently North America, have experienced conditions that are slightly cooler than normal.

Temperatures have been far above normal over vast areas of the Arctic this November and December.
Geert Jan van Oldenborgh/KNMI/ERA-Interim, Author provided

Extreme Arctic warmth and low ice coverage affect the migration patterns of marine mammals and have been linked with mass starvation and deaths among reindeer, as well as affecting polar bear habitats.

Given these severe ecological impacts and the potential influence of the Arctic on the climates of North America and Europe, it is important that we try to understand whether and how human-induced climate change has played a role in this event.

Arctic attribution

Our World Weather Attribution group, led by Climate Central and including researchers at the University of Melbourne, the University of Oxford and the Dutch Meteorological Service (KNMI), used three different methods to assess the role of the human climate influence on record Arctic warmth over November and December.

We used forecast temperatures and heat persistence models to predict what will happen for the rest of December. But even with 10 days still to go, it is clear that November-December 2016 will certainly be record-breakingly warm for the Arctic.

Next, I investigated whether human-caused climate change has altered the likelihood of extremely warm Arctic temperatures, using state-of-the-art climate models. By comparing climate model simulations that include human influences, such as increased greenhouse gas concentrations, with ones without these human effects, we can estimate the role of climate change in this event.

This technique is similar to that used in previous analyses of Australian record heat and the sea temperatures associated with the Great Barrier Reef coral bleaching event.

The November-December temperatures of 2016 are record-breaking but will be commonplace in a few decades’ time.
Andrew King, Author provided

To put it simply, the record November-December temperatures in the Arctic do not happen in the simulations that leave out human-driven climate factors. In fact, even with human effects included, the models suggest that this Arctic hot spell is a 1-in-200-year event. So this is a freak event even by the standards of today’s world, which humans have warmed by roughly 1℃ on average since pre-industrial times.

But in the future, as we continue to emit greenhouse gases and further warm the planet, events like this won’t be freaks any more. If we do not reduce our greenhouse gas emissions, we estimate that by the late 2040s this event will occur on average once every two years.
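Taken together, those two return periods imply a hundredfold increase in the annual likelihood of such an event, which is simple to verify:

```python
# Change in annual likelihood implied by the two return periods quoted
# above: a 1-in-200-year event today versus a 1-in-2-year event by the
# late 2040s under continued emissions.
p_today = 1 / 200    # annual probability in today's climate
p_2040s = 1 / 2      # projected annual probability by the late 2040s

print(p_2040s / p_today)  # 100.0: a hundredfold increase in likelihood
```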

Watching the trend

The group at KNMI used observational data (not a straightforward task in an area where very few observations are taken) to examine whether the probability of extreme warmth in the Arctic has changed over the past 100 years. To do this, temperatures slightly further south of the North Pole were incorporated into the analysis (to make up for the lack of data around the North Pole), and these indicated that the current Arctic heat is unprecedented in more than a century.

The observational analysis reached a similar conclusion to the model study: that a century ago this event would be extremely unlikely to occur, and now it is somewhat more likely (the observational analysis puts it at about a 1-in-50-year event).

The Oxford group used the very large ensemble of Weather@Home climate model simulations to compare Arctic heat like 2016 in the world of today with a year like 2016 without human influences. They also found a substantial human influence in this event.

Santa struggles with the heat. Climate change is warming the North Pole and increasing the chance of extreme warm events.
Climate Central

All of our analysis points the finger at human-induced climate change for this event. Without it, Arctic warmth like this is extremely unlikely to occur. And while it’s still an extreme event in today’s climate, in the future it won’t be that unusual, unless we drastically curtail our greenhouse gas emissions.

As we have already seen, the consequences of more frequent extreme warmth in the future could be devastating for the animals and other species that call the Arctic home.

Geert Jan van Oldenborgh, Marc Macias-Fauria, Peter Uhe, Sjoukje Philip, Sarah Kew, David Karoly, Friederike Otto, Myles Allen and Heidi Cullen all contributed to the research on which this article is based.

You can find more details on all the analysis techniques here. Each of the methods used has been peer-reviewed, although as with the Great Barrier Reef bleaching study, we will submit a research manuscript for peer review and publication in 2017.


Andrew King, Climate Extremes Research Fellow, University of Melbourne

This article was originally published on The Conversation. Read the original article.

Consensus confirmed: over 90% of climate scientists believe we’re causing global warming


John Cook, The University of Queensland

When we published a paper in 2013 finding 97% scientific consensus on human-caused global warming, what surprised me was how surprised everyone was.

Ours wasn’t the first study to find such a scientific consensus. Nor was it the second. Nor were we the last.

Nevertheless, no-one I spoke to was aware of the existing research into such a consensus. Rather, the public thought there was a 50:50 debate among scientists on the basic question of whether human activity was causing global warming.

This lack of awareness is reflected in a recent pronouncement by Senator Ted Cruz (currently competing with Donald Trump in the Republican primaries), who argued that:

The stat about the 97% of scientists is based on one discredited study.

Why is a US Senator running for President attacking University of Queensland research on scientific agreement? Cruz’s comments are the latest episode in a decades-long campaign to cast doubt on the scientific consensus on climate change.

Back in 2002, a Republican pollster advised conservatives to attack the consensus in order to win the public debate about climate policy. Conservatives complied. In conservative opinion pieces about climate change from 2007 to 2010, their number one argument was “there is no scientific consensus on climate change”.

Recent psychological research has shown that the persistent campaign to confuse the public about scientific agreement has significant societal consequences. Public perception of consensus has been shown to be a “gateway belief”, influencing a range of other climate attitudes and beliefs.

People’s awareness of the scientific consensus affects their acceptance of climate change, and their support for climate action.

The psychological importance of perceived consensus underscores why communicating the 97% consensus is important. Consensus messaging has been shown empirically to increase acceptance of climate change.

And, crucially, it’s most effective on those who are most likely to reject climate science: political conservatives.

In other words, consensus messaging has a neutralising effect, which is especially important given the highly polarised nature of the public debate about climate change.

Expert agreement

Consequently, social scientists have urged climate scientists to communicate the scientific consensus, countering the misconception that they are still divided about human-caused global warming.

But how do you counter the myth that the 97% consensus is based on a single study?

One way is to bring together the authors of the leading consensus papers to synthesise all the existing research: a meta-study of meta-studies. We did exactly that, with a new study published in Environmental Research Letters featuring authors from seven of the leading studies into the scientific consensus on climate change.

A video summary of the new paper into climate change consensus. (2016)

A recurring theme throughout the consensus research was that the level of scientific agreement varied depending on climate expertise. The higher the expertise in climate science, the higher the agreement that humans were causing global warming.

To no one's surprise, the highest agreement was found among climate scientists who had published peer-reviewed climate research. Interestingly, the group with the lowest agreement was economic geologists.

Expertise vs consensus.
Skeptical Science

Seven studies quantified the level of agreement among publishing climate scientists, or among peer-reviewed climate papers. Across these studies, there was between 90% and 100% agreement that humans were causing global warming.

A number of studies converged on the 97% consensus value. This is why the 97% figure is often invoked, having been mentioned by such public figures as President Barack Obama, Prime Minister David Cameron and US Senator Bernie Sanders.

Studies into consensus.
Skeptical Science

Manufacturing doubt about consensus

The relationship between scientific agreement and expertise turns out to be crucially important in understanding the consensus issue. Unfortunately, it provides an opportunity for those who reject human-caused global warming to manufacture doubt about the high level of scientific agreement.

They achieve this by using groups of scientists with lower expertise in climate science, to convey the impression that expert agreement on climate change is low. This technique is known as “fake experts”, one of the five characteristics of science denial.

For example, surveys of climate scientists may be “diluted” by including scientists who don’t possess expertise in climate science, thus obtaining a lower level of agreement compared to the consensus among climate scientists. This is partly what Senator Rick Santorum did when he argued that the scientific consensus was only 43%.

Another implementation of the “fake expert” strategy is the use of petitions containing many scientists who lack climate science credentials. The most famous example is the Oregon Petition Project, which lists over 31,000 people with a science degree who signed a statement that humans aren’t disrupting the climate. However, 99.9% of the signatories aren’t climate scientists.

The science of science communication tells us that communicating the science isn’t sufficient. Misinformation has been shown to cancel out the effect of accurate scientific information. We also need to explain the techniques of misinformation, such as the “fake expert” strategy.

This is why in communicating the results of our latest study, we not only communicated the overwhelming scientific agreement. We also explained the technique used to cast doubt on the consensus.


John Cook, Climate Communication Research Fellow, Global Change Institute, The University of Queensland

This article was originally published on The Conversation. Read the original article.

A year of records: the human role in 2014’s wild weather


Mitchell Black, University of Melbourne; Andrew King, University of Melbourne, and David Karoly, University of Melbourne

Australia has just had its hottest October, and we can already say that human-caused climate change made this new record at least ten times more likely than it would otherwise have been.

But if we turn our eyes to the past, what role did climate change play in the broken records of 2014? Last year was the hottest on record worldwide, and came with its fair share of extremes.

As part of the annual extreme weather issue of the Bulletin of the American Meteorological Society released today, five papers by Australian authors, including us, investigate the role of climate change in extreme weather in 2014.

Year of records

Australia was hit hard in 2014 (although perhaps not quite as hard as in 2013, Australia's hottest year on record).

The year started with a bang when the international spotlight fell on southeast Australia as Melbourne was baked by the infamous “Australian Open heatwave”. It led into 12 months that saw the country experience a 19-day heatwave in May, the hottest spring on record, and unusually hot weather in Brisbane during November, right in the middle of the G20 World Leaders Forum.

The Australian Open heatwave of 2014 saw several days above 40C in southern Australia.
Author provided

In August, a record high pressure system stalled to the south of Australia and brought some unusual winter weather, including severe frosts.

So did human-caused global warming play a role in this “weirding” of Australian weather?

Revealing the role of climate change

The annual extremes issue centres on one of the fastest developing areas in climate change research, the role of climate change in recent extreme weather events.

While the link between human activities and climate change has been firmly established for several decades, attributing a single event to human influence isn’t easy. This is because individual events may be the result of natural climate variation.

To get to the heart of how climate change is influencing these extreme events, scientists try to determine how much more likely individual extremes are as a result of climate change. Using climate models they compare the world of today with a parallel world without human greenhouse gas emissions.

These scenarios are often run on models thousands of times in an effort to recreate events that are of a similar scale. By comparing the results of modelled climates with and without human-produced greenhouse gases, researchers can determine how much more likely it is that an extreme weather event occurred as a result of human-caused global warming.

This approach is similar to the way epidemiologists investigate whether smoking increases the likelihood of lung cancer.
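The comparison between the two modelled worlds boils down to a ratio of probabilities. Below is a minimal sketch of that calculation; the Gaussian draws are stand-ins for real climate model output (actual studies such as Weather@home use thousands of physically based simulations), and all the numbers are illustrative.

```python
import random

random.seed(42)

def exceedance_prob(ensemble, threshold):
    """Fraction of ensemble members whose peak temperature exceeds the threshold."""
    return sum(1 for t in ensemble if t > threshold) / len(ensemble)

# Hypothetical peak-summer temperatures (degrees C) from two large ensembles:
# a "factual" world with human greenhouse gas emissions, and a slightly
# cooler "counterfactual" world without them.
factual = [random.gauss(41.0, 2.0) for _ in range(10000)]
counterfactual = [random.gauss(39.5, 2.0) for _ in range(10000)]

threshold = 44.0  # the observed extreme being attributed

p1 = exceedance_prob(factual, threshold)        # P(event | human influence)
p0 = exceedance_prob(counterfactual, threshold) # P(event | no human influence)

risk_ratio = p1 / p0  # how much more likely the event is with human influence
far = 1 - p0 / p1     # "fraction of attributable risk", as in epidemiology

print(f"Risk ratio = {risk_ratio:.1f}, FAR = {far:.2f}")
```

A risk ratio of 10 is what lies behind a statement like "climate change made this record at least ten times more likely"; the fraction of attributable risk is the same quantity epidemiologists use when linking smoking to lung cancer.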

Interestingly, there was a significant citizen science role in three of the Australian peer-reviewed studies reported in the extremes issue. Using a large number of climate simulations run on thousands of home computers as part of the Weather@home project, the scientists were able to examine local-scale extreme events such as the January heatwave in Melbourne.

Citizen computing power has helped crunch the numbers and simulate climate extremes.
Weather@home

What we found

The first study, led by Mitchell Black, focused on the prolonged heatwave in southeast Australia in January 2014. During this event Adelaide recorded five consecutive days above 42°C (13–17 January) while Melbourne recorded four consecutive days above 41°C (14–17 January) during the Australian Open tennis tournament.

This study found that human influence very likely increased the chance of prolonged heatwaves in Adelaide by at least 16%. Meanwhile, the influence for Melbourne was less clear.

The second study, led by Andrew King, examined an extreme temperature event caught in the spotlight of international media attention – the unseasonably hot weather in Brisbane during the G20 summit in mid-November. While the hot temperatures were not record-breaking in Brisbane at this time, they were well above average.

This study found that human influence increased the likelihood of hot (above 34°C) and very hot (above 38°C) November days in Brisbane by at least 25% and 44%, respectively.

The third study, led by Michael Grose at CSIRO, examined the exceptionally high surface pressure to the south of Australia during August 2014. This was associated with severe frosts in southeast Australia, lowland snowfalls in parts of Tasmania, and reduced rainfall in the southern parts of both Australia and New Zealand.

The findings suggested that the likelihood of these pressure anomalies had roughly doubled due to human-induced climate change.

The remaining two studies published today used independent sets of climate model simulations.

The fourth study, led by Sarah Perkins-Kirkpatrick from the University of New South Wales, investigated the late-autumn heatwave (8–26 May) that resulted in Australian-averaged maximum temperatures being 2.52°C above the monthly average. Although this heat event occurred during the cooler months, events of this nature are important because they can affect agricultural productivity through changing crop cycles.

The study found that this kind of cool-season heatwave was 23 times more likely as a result of increased greenhouse gases.

Pandora Hope from the Australian Bureau of Meteorology led the final study, which examined Australia’s hottest spring on record. The study concluded that the record heat would likely not have occurred without increases in atmospheric carbon dioxide over the last 50 years working in concert with anomalous atmospheric patterns.

Year on year, in the extremes issues and through various other investigations reported in the peer-reviewed literature, these attribution studies continue to show that climate change is not merely something that will occur in the future. Human-caused global warming is here, now, and it is already changing the extreme weather events we can see and feel.


Mitchell Black, PhD Candidate, University of Melbourne; Andrew King, Climate Extremes Research Fellow, University of Melbourne, and David Karoly, Professor of Atmospheric Science, University of Melbourne

This article was originally published on The Conversation. Read the original article.