These 3 energy storage technologies can help solve the challenge of moving to 100% renewable electricity


Energy storage can make facilities like this solar farm in Oxford, Maine, more profitable by letting them store power for cloudy days.
AP Photo/Robert F. Bukaty

Kerry Rippy, National Renewable Energy Laboratory

In recent decades the cost of wind and solar power generation has dropped dramatically. This is one reason that the U.S. Department of Energy projects that renewable energy will be the fastest-growing U.S. energy source through 2050.

However, it’s still relatively expensive to store energy. And since renewable energy generation isn’t available all the time – it happens when the wind blows or the sun shines – storage is essential.

As a researcher at the National Renewable Energy Laboratory, I work with the federal government and private industry to develop renewable energy storage technologies. In a recent report, researchers at NREL estimated that the potential exists to increase U.S. renewable energy storage capacity by as much as 3,000% by 2050.

Here are three emerging technologies that could help make this happen.

Longer charges

From alkaline batteries for small electronics to lithium-ion batteries for cars and laptops, most people already use batteries in many aspects of their daily lives. But there is still lots of room for growth.

For example, high-capacity batteries with long discharge times – up to 10 hours – could be valuable for storing solar power at night or increasing the range of electric vehicles. Right now there are very few such batteries in use. However, according to recent projections, upwards of 100 gigawatts’ worth of these batteries will likely be installed by 2050. For comparison, that’s 50 times the generating capacity of Hoover Dam. This could have a major impact on the viability of renewable energy.

Batteries work by creating a chemical reaction that produces a flow of electrical current.

One of the biggest obstacles is limited supplies of lithium and cobalt, which currently are essential for making lightweight, powerful batteries. According to some estimates, around 10% of the world’s lithium and nearly all of the world’s cobalt reserves will be depleted by 2050.

Furthermore, nearly 70% of the world’s cobalt is mined in the Congo, under conditions that have long been documented as inhumane.

Scientists are working to develop techniques for recycling lithium and cobalt batteries, and to design batteries based on other materials. Tesla plans to produce cobalt-free batteries within the next few years. Others aim to replace lithium with sodium, which has properties very similar to lithium’s but is much more abundant.

Safer batteries

Another priority is to make batteries safer. One area for improvement is electrolytes – the medium, often a liquid, that carries charged particles between the battery’s anode, or negative terminal, and the cathode, or positive terminal.

When a battery is in use, charged particles in the electrolyte move around to balance out the charge of the electricity flowing out of the battery. Electrolytes often contain flammable materials. If they leak, the battery can overheat and catch fire or melt.

Scientists are developing solid electrolytes, which would make batteries more robust. It is much harder for particles to move around through solids than through liquids, but encouraging lab-scale results suggest that these batteries could be ready for use in electric vehicles in the coming years, with target dates for commercialization as early as 2026.

While solid-state batteries would be well suited for consumer electronics and electric vehicles, for large-scale energy storage, scientists are pursuing all-liquid designs called flow batteries.

A typical flow battery consists of two tanks of liquids that are pumped past a membrane held between two electrodes.
Qi and Koenig, 2017, CC BY

In these devices both the electrolyte and the electrodes are liquids. This allows for very fast charging and makes it easy to build very large batteries. Currently these systems are very expensive, but research continues to bring down the price.

Storing sunlight as heat

Other renewable energy storage solutions cost less than batteries in some cases. For example, concentrated solar power plants use mirrors to concentrate sunlight, which heats up hundreds or thousands of tons of salt until it melts. This molten salt is then used to drive an electric generator, much as coal or nuclear power is used to heat steam and drive a generator in traditional plants.

These heated materials can also be stored to produce electricity when it is cloudy, or even at night. This approach allows concentrated solar power to work around the clock.

Checking a molten salt valve for corrosion at Sandia’s Molten Salt Test Loop.
Randy Montoya, Sandia Labs/Flickr, CC BY-NC-ND

This idea could be adapted for use with nonsolar power generation technologies. For example, electricity made with wind power could be used to heat salt for use later when it isn’t windy.

Concentrating solar power is still relatively expensive. To compete with other forms of energy generation and storage, it needs to become more efficient. One way to achieve this is to increase the temperature the salt is heated to, enabling more efficient electricity production. Unfortunately, the salts currently in use aren’t stable at high temperatures. Researchers are working to develop new salts or other materials that can withstand temperatures as high as 1,300 degrees Fahrenheit (about 705 degrees Celsius).

One leading idea for reaching higher temperatures involves heating sand instead of salt, since sand can withstand them. The sand would then be moved on conveyor belts from the heating point to storage. The Department of Energy recently announced funding for a pilot concentrated solar power plant based on this concept.

Advanced renewable fuels

Batteries are useful for short-term energy storage, and concentrated solar power plants could help stabilize the electric grid. However, utilities also need to store a lot of energy for indefinite amounts of time. This is a role for renewable fuels like hydrogen and ammonia. Utilities would store energy in these fuels by producing them with surplus power, when wind turbines and solar panels are generating more electricity than the utilities’ customers need.

Hydrogen and ammonia contain more energy per pound than batteries, so they work where batteries don’t. For example, they could be used for shipping heavy loads and running heavy equipment, and for rocket fuel.

Today these fuels are mostly made from natural gas or other nonrenewable fossil fuels via extremely inefficient reactions. Although hydrogen is often thought of as a green fuel, most hydrogen gas today is made from natural gas.

Scientists are looking for ways to produce hydrogen and other fuels using renewable electricity. For example, it is possible to make hydrogen fuel by splitting water molecules using electricity. The key challenge is optimizing the process to make it efficient and economical. The potential payoff is enormous: inexhaustible, completely renewable energy.
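The energy arithmetic helps explain why efficiency is the key challenge for water splitting. As a rough sketch (the 286 kJ/mol figure is the standard higher heating value of hydrogen, an assumption brought in for illustration rather than a number from this article), the theoretical minimum electricity needed per kilogram of hydrogen works out to roughly 39 kilowatt-hours:

```python
# Rough thermodynamic floor for water electrolysis, per kg of hydrogen produced.
ENTHALPY_PER_MOL_KJ = 286.0    # kJ per mol of H2 (higher heating value; assumed standard figure)
MOLAR_MASS_H2_G = 2.016        # grams per mol of H2
KJ_PER_KWH = 3600.0            # conversion factor

kj_per_kg = ENTHALPY_PER_MOL_KJ / MOLAR_MASS_H2_G * 1000.0   # kJ per kg of H2
kwh_per_kg = kj_per_kg / KJ_PER_KWH                          # kWh per kg of H2
print(round(kwh_per_kg, 1))  # 39.4
```

Real electrolysers need appreciably more electricity than this thermodynamic floor, which is why optimizing the process is what determines whether renewable hydrogen becomes economical.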


Kerry Rippy, Researcher, National Renewable Energy Laboratory

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Yes, a few climate models give unexpected predictions – but the technology remains a powerful tool


Shutterstock

Nerilie Abram, Australian National University; Andrew King, The University of Melbourne; Andy Pitman, UNSW; Christian Jakob, Monash University; Julie Arblaster, Monash University; Lisa Alexander, UNSW; Sarah Perkins-Kirkpatrick, UNSW; Shayne McGregor, Monash University, and Steven Sherwood, UNSW

The much-awaited new report from the Intergovernmental Panel on Climate Change (IPCC) is due later today. Ahead of the release, debate has erupted about the computer models at the very heart of global climate projections.

Climate models are one of many tools scientists use to understand how the climate changed in the past and what it will do in future.

A recent article in the eminent US magazine Science questioned how the IPCC will deal with some climate models which “run hot”. Some models, it said, have projected global warming rates “that most scientists, including the model makers themselves, believe are implausibly fast”.


Read more: Monday’s IPCC report is a really big deal for climate change. So what is it? And why should we trust it?


Some commentators, including in Australia, interpreted the article as proof climate modelling had failed.

So should we be using climate models? We are climate scientists from Australia’s Centre of Excellence for Climate Extremes, and we believe the answer is a firm yes.

Our research uses and improves climate models so we can help Australia cope with extreme events, now and in future. We know when climate models are running hot or cold. And identifying an error in some climate models doesn’t mean the science has failed – in fact, it means our understanding of the climate system has advanced.

So let’s look at what you should know about climate models ahead of the IPCC findings.

What are climate models?

Climate models comprise millions of lines of computer code representing the physics and chemistry of the processes that make up our climate system. The models run on powerful supercomputers and have simulated and predicted global warming with remarkable accuracy.

They unequivocally show that warming of the planet since the Industrial Revolution is due to human-caused emissions of greenhouse gases. This confirms our understanding of the greenhouse effect, known since the 1850s.

Models also show the intensity of many recent extreme weather events around the world would be essentially impossible without this human influence.


Scientists do not use climate models in isolation, or without considering their limitations.

For a few years now, scientists have known some new-generation climate models probably overestimate global warming, and others underestimate it.

This realisation is based on our understanding of Earth’s climate sensitivity – how much the climate will warm when carbon dioxide (CO₂) levels in the atmosphere double.

Before industrial times, CO₂ levels in the atmosphere were 280 parts per million. So a doubling of CO₂ will occur at 560 parts per million. (For context, we’re currently at around 415 parts per million).

The latest scientific evidence, using observed warming, paleoclimate data and our physical understanding of the climate system, suggests global average temperatures will very likely increase by between 2.2℃ and 4.9℃ if CO₂ levels double.

The large majority of climate models run within this climate sensitivity range. But some don’t – instead suggesting a temperature rise as low as 1.8℃ or as high as 5.6℃.
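The doubling arithmetic above can be made concrete with the common textbook approximation that equilibrium warming scales with the logarithm of the CO₂ concentration ratio. This simple formula is an illustrative sketch, not a climate model, and the preindustrial baseline and sensitivity range are the figures quoted above:

```python
import math

def equilibrium_warming(co2_ppm, sensitivity_per_doubling, co2_preindustrial=280.0):
    """Equilibrium warming (deg C), assuming warming scales with log2 of the CO2 ratio."""
    return sensitivity_per_doubling * math.log2(co2_ppm / co2_preindustrial)

# A full doubling (560 ppm) returns the sensitivity itself, by definition:
print(equilibrium_warming(560, 3.0))   # 3.0
# Today's ~415 ppm is about 0.57 of a doubling, so the 2.2-4.9 deg C
# sensitivity range translates to eventual warming of roughly:
print(round(equilibrium_warming(415, 2.2), 1))   # low end of the range
print(round(equilibrium_warming(415, 4.9), 1))   # high end of the range
```

This also shows why the outlier models matter: a sensitivity of 1.8℃ versus 5.6℃ changes the projected warming for any given emissions pathway by a factor of about three.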

It’s thought the biases in some models stem from the representations of clouds and their interactions with aerosol particles. Researchers are beginning to understand these biases, building our understanding of the climate system and how to further improve models in future.

With all this in mind, scientists use climate models cautiously, giving more weight to projections from climate models that are consistent with other scientific evidence.

The following graph shows how most models are within the expected climate sensitivity range – and having some running a bit hot or cold doesn’t change the overall picture of future warming. And when we compare model results with the warming we’ve already observed over Australia, there’s no indication the models are over-cooking things.

Rapid warming in Australia under a very high greenhouse gas emission future (red) compared with climate change stabilisation in a low emission future (blue). Author provided.

What does the future look like?

Future climate projections are produced by giving models different possibilities for greenhouse gas concentrations in our atmosphere.

The latest IPCC models use a set of possibilities called “Shared Socioeconomic Pathways” (SSPs). These pathways match expected population growth, and where and how people will live, with plausible levels of atmospheric greenhouse gases that would result from these socioeconomic choices.

The pathways range from low-emission scenarios that also require considerable atmospheric CO₂ removal – giving the world a reasonable chance of meeting the Paris Agreement targets – to high-emission scenarios where temperature goals are far exceeded.


Nerilie Abram, based on Riahi et al. 2017, CC BY-ND

Ahead of the IPCC report, some say the high-emission scenarios are too pessimistic. But likewise, it could be argued the lack of climate action over the past decade, and absence of technology to remove large volumes of CO₂ from the atmosphere, means low-emission scenarios are too optimistic.

If countries meet their existing emissions reduction commitments under the Paris Agreement, we can expect to land somewhere in the middle of the scenarios. But the future depends on our choices, and we shouldn’t dismiss any pathway as implausible.

There is considerable value in knowing both the future risks to avoid, and what’s possible under ambitious climate action.


Read more: The climate won’t warm as much as we feared – but it will warm more than we hoped


The future climate depends on our choices today. Unsplash

Where to from here?

We can expect the IPCC report to be deeply worrying. And unfortunately, 30 years of IPCC history tells us the findings are more likely to be too conservative than too alarmist.

An enormous global effort – both scientifically and in computing resources – is needed to ensure climate models can provide even better information.

Climate models are already phenomenal tools at large scales. But increasingly, we’ll need them to produce fine-scale projections to help answer questions such as: where to plant forests to sequester carbon? Where to build flood defences? Where might crops best be grown? Where would renewable energy resources be best located?

Climate models will continue to be an important tool for the IPCC, policymakers and society as we attempt to manage the unavoidable risks ahead.

Nerilie Abram, Chief Investigator for the ARC Centre of Excellence for Climate Extremes; Deputy Director for the Australian Centre for Excellence in Antarctic Science, Australian National University; Andrew King, ARC DECRA fellow, The University of Melbourne; Andy Pitman, Director of the ARC Centre of Excellence for Climate Extremes, UNSW; Christian Jakob, Professor in Atmospheric Science, Monash University; Julie Arblaster, Chief Investigator, ARC Centre of Excellence for Climate Extremes; Chief Investigator, ARC Securing Antarctica’s Environmental Future; Professor, Monash University; Lisa Alexander, Chief Investigator ARC Centre of Excellence for Climate Extremes and Professor Climate Change Research Centre, UNSW; Sarah Perkins-Kirkpatrick, ARC Future Fellow, UNSW; Shayne McGregor, Associate professor, Monash University, and Steven Sherwood, Professor of Atmospheric Sciences, Climate Change Research Centre, UNSW

This article is republished from The Conversation under a Creative Commons license. Read the original article.

The budget should have been a road to Australia’s low-emissions future. Instead, it’s a flight of fancy


Shutterstock

John Quiggin, The University of Queensland

Looking at other nations around the world, the path to cutting greenhouse gas emissions seems clear.

First, develop wind and solar energy and battery storage to replace coal- and gas-fired electricity. Then, replace petrol and diesel cars with electric vehicles running off carbon-free sources. Finally, replace traditionally made steel, cement and other industries with low-carbon alternatives.

In this global context, the climate policies announced in Tuesday’s federal budget are a long-odds bet on a radically different approach. In place of the approaches adopted elsewhere, the Morrison government is betting heavily on alternatives that have failed previous tests, such as carbon capture and storage. And it’s blatantly ignoring internationally proven technology, such as electric vehicles.

The government could have followed the lead of our international peers and backed Australia’s clean energy sector to create jobs and stimulate the post-pandemic economy. Instead, it’s sending the nation on a fool’s errand.

Prime Minister Scott Morrison, left, and Treasurer Josh Frydenberg should have used the budget to create jobs in the clean economy.
Mick Tsikas/AAP

Carbon-capture folly

The Morrison government is taking a “technology, not taxes” approach to emissions reduction. Rather than adopt a policy such as a carbon price – broadly considered the most effective and efficient way to cut emissions – the government has instead pinned its hopes on a low-emissions technology plan.

That means increased public spending on research and development, to accelerate the commercialisation of low emissions technologies. The problems with this approach are most obvious in relation to carbon capture and storage (CCS).

The budget contains A$263.7 million to fund new carbon capture and storage projects. This technology promises to capture some – but to date, not all – carbon dioxide at the point of emission, and then inject it underground. It would allow continued fossil fuel use with fewer emissions, but the process is complex and expensive.

In fact, recent research found that, of 39 carbon-capture projects examined in the United States, more than 80% ended in failure.




Read more:
The 1.5℃ global warming limit is not impossible – but without political action it soon will be


The government’s CCS funding is focused on capturing CO₂ from gas projects. This is despite the disappointing experience of Australia’s only CCS project so far, Chevron’s Gorgon gas field off Western Australia.

Some 80% of emissions from the operation were meant to be captured from 2016. But the process was delayed for three years, allowing millions of tonnes of CO₂ to enter the atmosphere. As of January this year, the project was still facing technical issues.

CCS from gas will be expensive even if it can be made to work. Santos, which has proposed a CCS project at its Moomba gas plant in South Australia, suggests a cost of A$30 per tonne of CO₂ captured.

This money would need to come from the government’s Climate Solutions Fund, currently allocated about A$2 billion over four years. If Moomba’s projected emissions reduction of 20 million tonnes a year were realised, this project alone would exhaust the fund.
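The arithmetic behind that claim can be checked directly. The figures are those quoted above; treating the fund's roughly A$2 billion as a four-year budget is a simplifying assumption for illustration:

```python
# Back-of-envelope check that Moomba alone could exhaust the Climate Solutions Fund.
cost_per_tonne = 30.0         # A$ per tonne of CO2 captured (Santos estimate)
tonnes_per_year = 20e6        # projected capture at Moomba, tonnes per year
fund_total = 2e9              # Climate Solutions Fund allocation, A$ over four years

annual_cost = cost_per_tonne * tonnes_per_year   # A$600 million per year
four_year_cost = 4 * annual_cost                 # A$2.4 billion over the fund's term
print(four_year_cost > fund_total)  # True
```

In other words, a single project at the quoted cost would outrun the entire fund before the end of its allocation period.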

Plans to capture carbon from Chevron’s Gorgon gas project have not gone to plan.
Chevron Australia

What about electric vehicles?

There is a striking contrast between the Morrison government’s enthusiasm for carbon capture, and its neglect of electric vehicles.

It ought to be obvious that if Australia is to achieve a target of net-zero emissions by 2050 – which Treasurer Josh Frydenberg this week reiterated was his government’s preference – the road transport sector must be decarbonised by then.

The average age of Australian cars is about 10 years, which implies, given fairly steady sales, an average lifespan of about 20 years. This in turn means that if road transport is to be decarbonised by 2050, most petrol or diesel vehicles sold after 2030 will have to be taken off the road before the end of their useful life.
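The reasoning can be written out explicitly. Under a deliberately simplified steady-fleet assumption, cars of every age are equally represented, so the average age is half the lifespan:

```python
# With steady annual sales and a fixed lifespan L, fleet ages are spread
# roughly uniformly between 0 and L, so the average age is L / 2.
average_age = 10              # years, as quoted for the Australian fleet
lifespan = 2 * average_age    # implied average lifespan: 20 years

# A petrol or diesel car sold in 2030 would then still be on the road in:
last_year_on_road = 2030 + lifespan
print(lifespan, last_year_on_road)  # 20 2050
```

That final year lands exactly on the net-zero deadline, which is why sales after 2030 imply early retirements.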

In any case, such vehicles will probably be very difficult to buy within 15 years. Manufacturers including General Motors and Volvo have announced plans to stop selling petrol and diesel vehicles by 2035 or earlier.

But the Morrison government has ruled out consumer incentives to encourage electric vehicle uptake – a policy at odds with many other nations, including the US.




Read more:
The US jumps on board the electric vehicle revolution, leaving Australia in the dust


Despite the “technology, not taxes” mantra, this week’s federal budget ignored electric vehicles. This includes an A$10 billion infrastructure spend that made no provision for charging stations as part of highway upgrades.

Unless the government takes action soon, Australian motorists will be faced with the choice between a limited range of second-rate petrol and diesel vehicles, or electric vehicles for which key infrastructure is missing.

It’s hard to work out why the government is so resistant to doing anything to help electric vehicles. Public support appears strong. There are no domestic carmakers left to protect.

The car retail industry is generally unenthusiastic about electric vehicles. Its business model is built on combining competitive sticker prices with a high-margin service and repair business, and electric vehicles don’t fit this model.

At the moment (although not for much longer), electric vehicles are more expensive than traditional cars to buy upfront. But they are cheaper to run and service.

There are fears of job losses in car maintenance as electric vehicle uptake increases. However, car dealers have adjusted to change in the past, and can do so in future.

The budget ignored electric vehicles.
Shutterstock

Wishful thinking

The Morrison government is still edging towards announcing a 2050 net-zero target in time for the United Nations Climate Change Conference in Glasgow this November. But as Prime Minister Scott Morrison himself has emphasised, there’s no point having a target without a strategy to get there.

Yet at this stage, the government’s emissions reduction strategy looks more like wishful thinking than a road map.




Read more:
Australia’s states are forging ahead with ambitious emissions reductions. Imagine if they worked together




John Quiggin, Professor, School of Economics, The University of Queensland

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Genome and satellite technology reveal recovery rates and impacts of climate change on southern right whales



University of Auckland tohorā research team, Department of Conservation permit DJI

Emma Carroll

After close to a decade of globe-spanning effort, the genome of the southern right whale has been released this week, giving us deeper insights into the histories and recovery of whale populations across the southern hemisphere.

Up to 150,000 southern right whales were killed between 1790 and 1980. This whaling drove the global population from perhaps 100,000 to as few as 500 whales in 1920. A century on, we estimate there are 12,000 southern right whales globally. It’s a remarkable conservation success story, but one facing new challenges.

A southern right whale calf breaches in the subantarctic Auckland Islands.
University of Auckland tohorā research team, Author provided

The genome represents a record of the different impacts a species has faced. With statistical models we can use genomic information to reconstruct historical population trajectories and patterns of how species interacted and diverged.

We can then link that information with historical habitat and climate patterns. This look back into the past provides insights into how species might respond to future changes. Work on penguins and polar bears has already shown this.

But we also have a new and surprising short-term perspective on the population of whales breeding in the subantarctic Auckland Islands group — Maungahuka, 450km south of New Zealand.

Spying on whales via satellite

Known as tohorā in New Zealand, southern right whales once wintered in the bays and inlets of the North and South Islands of Aotearoa, where they gave birth and socialised. Today, the main nursery ground for this population is Port Ross, in the subantarctic Auckland Islands.

Adult whales socialise at both the Auckland and Campbell Islands during the austral winter. Together these subantarctic islands are internationally recognised as an important marine mammal area.

In August 2020, I led a University of Auckland and Cawthron Institute expedition to the Auckland Islands. We collected small skin samples for genetic and chemical analysis and placed satellite tags on six tohorā. These tags allowed us to follow their migrations to offshore feeding grounds.

It matters where tohorā feed and how their populations recover from whaling because the species is recognised as a sentinel for climate change throughout the Southern Hemisphere. They are what we describe as “capital” breeders — they fast during the breeding season in wintering grounds like the Auckland Islands, living off fat reserves gained in offshore feeding grounds.

Females need a lot in the “bank” because their calves need a lot of energy. At 4-5m at birth, these calves can grow up to a metre a month. This investment costs the mother 25% of her size over the first few months of her calf’s life. It’s no surprise that calf growth depends on the mother being in good condition.




Read more:
I measure whales with drones to find out if they’re fat enough to breed


Females can only breed again once they’ve regained their fat capital. Studies in the South Atlantic show wintering grounds in Brazil and Argentina produce more calves when prey is more abundant, or environmental conditions suggest it should be.

The first step in understanding the relationship between recovery and prey in New Zealand is to identify where and on what tohorā feed. The potential feeding areas for our New Zealand population could cover roughly a third of the Southern Ocean. That’s why we turn to technologies like satellite tags to help us understand where the whales are going and how they get there.

Where tohorā go

So far, all tracked whales have migrated west, away from the historical whaling grounds to the east near the Chatham Islands. As they left the Auckland Islands, two whales visited other oceanic islands — skirting around Macquarie Island and visiting Campbell Island.

It also seems one whale (Bill or Wiremu, identified as male using genetic analysis of his skin sample) may have reached his feeding grounds, likely at the subtropical convergence. The clue is in the pattern of his tracks: rather than the continuous straight line of a whale migrating, it shows the doughnuts of a whale that has found a prey patch.

Migratory track of southern right whale Bill/Wiremu, where the convoluted track could indicate foraging behaviour.

The subtropical convergence is an area of the ocean where temperature and salinity can change rapidly, and this can aggregate whale prey. Two whales we tracked offshore from the Auckland Islands in 2009 visited the subtropical convergence, but hundreds of kilometres to the east of Bill’s current location.

As Bill and his compatriots migrate, we’ve begun analysing data that will tell us about the recovery of tohorā in the past decade. The most recent population size estimate we have is from 2009, when there were about 2,000 whales.




Read more:
Humans threaten the Antarctic Peninsula’s fragile ecosystem. A marine protected area is long overdue


I am using genomic markers to learn about the kin relationships and, in doing so, the population’s size and growth rate. Think of it like this. Everybody has two parents and if you have a small population, say a small town, you are more likely to find those parents than if you have a big population, say a city.

This nifty statistical trick is known as the “close kin” approach to estimating population size. It relies on detailed understanding of the kin relationships of the whales — something we have only really been able to do recently using new genomic sequencing technology.
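The small-town intuition above can be sketched as a toy calculation. Real close-kin mark-recapture models account for survival, sampling design and mating systems, so the simple form and the numbers below are hypothetical illustrations only:

```python
def close_kin_estimate(n_adults, n_calves, pairs_found):
    """Toy close-kin abundance estimate.

    Each calf has two parents, so among n_adults * n_calves adult-calf
    comparisons the expected number of parent-offspring pairs is about
    2 * n_adults * n_calves / N. Solving for N gives this estimator.
    """
    return 2 * n_adults * n_calves / pairs_found

# Hypothetical numbers: 100 sampled adults, 50 sampled calves, 5 genetic matches.
print(close_kin_estimate(100, 50, 5))  # 2000.0
```

The fewer parent-offspring pairs turn up in the comparisons, the larger the estimated population — exactly the city-versus-small-town logic.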

Global effort to understand climate change impacts

Globally, southern right whales in South Africa and Argentina have bred less often over the past decade, leading to a lower population growth rate in Argentina.

Concern over this slowdown in recovery has prompted researchers from around the world to work together to understand the relationship between climate change, foraging ecology and recovery of southern right whales as part of the International Whaling Commission Southern Ocean Research Partnership.

The genome helps by giving us that long view of how the whales responded to climate fluctuations in the past, while satellite tracking gives us the short view of how they are responding on a day-to-day basis. Both will help us understand the future of these amazing creatures.

Emma Carroll, Rutherford Discovery Fellow

This article is republished from The Conversation under a Creative Commons license. Read the original article.

‘A dose of reality’: Morrison government’s new $1.9 billion techno-fix for climate change is a small step



Dean Lewins/AAP

Frank Jotzo, Australian National University

The Morrison government today announced A$1.9 billion over ten years to develop clean technology in industry, agriculture and transport. In some ways it’s a step in the right direction, but a far cry from what’s needed to drive Australia’s shift to a low emissions economy.

The big change involves what the money is for. The new funding will enable the Australian Renewable Energy Agency (ARENA) to support technologies such as green steel production, industrial processes to reduce energy consumption and, somewhat controversially, carbon capture and storage and soil-carbon sequestration.

This is a big move away from ARENA’s current investment priorities. Importantly, it means ARENA will continue to operate, as it is currently running out of money.

However, technology development alone is not enough to cut Australia’s emissions deeply and quickly – which is what’s needed to address the climate threat. Other policies and more money will be needed.

Cutting emissions from industry will be a focus of the new spending.
Dean Lewins/AAP

New role for ARENA

ARENA will receive the lion’s share of the money: A$1.4 billion over ten years in guaranteed baseline funding. ARENA has spent A$1.6 billion since it was established in 2012. So the new funding is lower on an annual basis. It’s also far less than what’s needed to properly meet the challenge, in a country with a large industrial sector and huge opportunities for zero carbon production.

To date, ARENA’s investments have focused on renewable energy supply. Prime Minister Scott Morrison today said the renewables industry was enjoying a “world-leading boom” and no longer needs government subsidies. Critics may be dismayed to see ARENA steered away from its original purpose. But it is true solar parks and wind farms are now commercially viable, and technologies to integrate large amounts of renewables into the grid are available.

So it makes sense to spend new research and development (R&D) funding on the next generation of low-emissions technologies. But how to choose what to spend the money on?

A few simple principles should inform those choices. The spending should help develop new zero- or low-emissions technologies or make them cheaper. It should also enable the shift to a net-zero emissions future, rather than locking in structures that continue to emit. The investment choices should be made by independent bodies such as ARENA’s board, based on research and expert judgement, rather than politically determined priorities.

For the industrial sector, the case for supporting zero-emissions technologies is clear. A sizeable share of Australia’s total emissions stem from fossil fuel use in industry.




Read more: Government targets emerging technologies with $1.9 billion, saying renewables can stand on own feet

In some cases, government-supported R&D could help lay the foundation for zero-emissions industries of the future. But in others, what’s needed is a financial incentive for businesses to switch to clean energy or zero-emissions production methods, or regulation to require cleaner processes.

Green steel is a perfect example of the positive change that is possible. Steel can be made using clean hydrogen and renewable electricity, and the long-term possibility of a green steel industry in Australia is tantalising.

Steel being made
Steel could be made cleanly using hydrogen instead of coking coal.
Dean Lewins/AAP

A future for fossil fuels?

The government’s support for carbon capture and storage (CCS) will be highly contested, because it’s a way to continue using fossil fuels at reduced – though not zero – emissions. This is achieved by capturing carbon dioxide before it enters the atmosphere and storing it underground, a technically feasible but costly process.

CCS will not perpetuate fossil fuel use in the energy sector, because renewables combined with energy storage are now much cheaper. Rather, CCS can be an option in specific processes that do not have ready alternatives, such as the production of cement, chemicals and fertiliser.

One step further is so-called “carbon capture and use” (CCU), where carbon dioxide is not pumped underground but turned into products, such as building materials. One program announced is for pilot projects of that kind.




Read more: Yes, carbon emissions fell during COVID-19. But it’s the shift away from coal that really matters

A different proposition is the idea of hydrogen produced from coal or gas, in which some resulting emissions are captured. This method competes with “green” hydrogen produced using renewable electricity. It seems the government for now intends to support fossil fuel-derived hydrogen.

Reducing fossil fuel use, and using CCS/CCU where it makes sense, will not get the world to net-zero emissions. Emissions from other sources must be cut by as much as technically possible, at justifiable cost. Remaining emissions must then be negated by drawing carbon dioxide from the atmosphere. Such “negative emissions” can be achieved through technological means, and also by permanently increasing the amount of carbon stored in plants and soil.

The new funding includes support for increasing the amount of soil carbon. This method may hold promise in principle, but in practice its effectiveness is uncertain, and hard to measure. At the same time, the large emissions from agriculture are not yet addressed.

Gas flaring from an industrial plant
Reducing the burning of fossil fuels is not enough to get to net-zero emissions.
Matt Black Productions

A piecemeal effort

The spending amounts to A$140 million per year for ARENA, plus about A$500 million all up through other programs. A dose of reality is needed about what this money can achieve. It will create better understanding of options, some technological progress across the board and surely the occasional highlight. But a much greater effort is likely needed to achieve fundamental technological breakthroughs. And crucially, new technologies must be widely deployed.

For a sense of scale, consider that the Snowy 2.0 scheme is costed at around A$5 billion, and a single 1 gigawatt gas power plant, as mooted by the government for the Hunter Valley, would cost in the order of A$1.5 billion to build.

As well as additional spending, policies will be needed to drive the uptake of low-emissions technologies. The shift to renewables is now happening in the energy sector without government help, though some hurdles remain. But we cannot expect the same across the economy.

Governments will need to help drive uptake through policy. The most efficient way is usually to ensure producers of emissions pay for the environmental damage caused. In other words, putting a price on carbon.

The funding announced today is merely one piece of a national long-term strategy to deeply cut emissions – and not a particularly big piece.




Read more: Carbon pricing works: the largest-ever study puts it beyond doubt

The Conversation


Frank Jotzo, Director, Centre for Climate and Energy Policy, Australian National University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Government targets emerging technologies with $1.9 billion, saying renewables can stand on own feet


Michelle Grattan, University of Canberra

The government has unveiled a $1.9 billion package of investments in new and emerging energy and emission-reducing technologies, and reinforced its message that it is time to move on from assisting now commercially-viable renewables.

The package will be controversial, given its planned broadening of the remit of the government’s clean energy investment vehicles, currently focused on renewables, and the attention given to carbon capture and storage, which has many critics.

The latest announcement follows the “gas-fired recovery” energy plan earlier this week, which included the threat the government would build its own gas-fired power station if the electricity sector failed to fill the gap left by the scheduled closure of the coal-fired Liddell power plant in 2023.




Read more: Morrison government threatens to use Snowy Hydro to build gas generator, as it outlines ‘gas-fired recovery’ plan

Unveiling the latest policy, Scott Morrison said solar panels and wind farms were commercially viable “and have graduated from the need for government subsidies”.

The government was now looking to unlock new technologies “to help drive down costs, create jobs, improve reliability and reduce emissions. This will support our traditional industries – manufacturing, agriculture, transport – while positioning our economy for the future.”

An extra $1.62 billion will be provided for the Australian Renewable Energy Agency (ARENA) to invest.

The government will expand the focus of ARENA and the Clean Energy Finance Corporation (CEFC) to back new technologies that would reduce emissions in agriculture, manufacturing, industry and transport.

At present ARENA can only support renewable energy and the CEFC can only invest in clean energy technologies (although it can support some types of gas projects).

The changes to ARENA and the CEFC will need legislation.

The government says it will cut the time taken to develop new Emissions Reduction Fund (ERF) methods from two years or more to under a year, involving industry in a co-design process.

This follows a review of the fund, which is a centrepiece of the Coalition’s emissions reduction policy. The cost of the changes is put at $24.6 million. The fund has had trouble attracting proposals from some sectors because of its complex administrative requirements.

Other measures in the policy include a new $95.4 million Technology Co-Investment Fund to support businesses in the agriculture, manufacturing, industrial and transport sectors to take up technologies to boost productivity and reduce emissions.

A $50 million Carbon Capture Use and Storage Development Fund will pilot carbon capture projects. This technology buries carbon but has run into many problems over the years; its opponents argue it is expensive, risky, and encourages rather than discourages the use of fossil fuels.

Businesses and regional communities will be encouraged to use hydrogen, electric, and bio-fuelled vehicles, supported by a new $74.5 million Future Fuels Fund.

A hydrogen export hub will be set up, with $70.2 million. Chief Scientist Alan Finkel has been a strong advocate for the potential of hydrogen, saying Australia has competitive advantages as a future hydrogen exporter.

Some $67 million will back new microgrids in regional and remote communities to deliver affordable and reliable power.

There will be $52.2 million to increase the energy productivity of homes and businesses. This will include grants for hotel upgrades.

The government says $1.8 billion of the package is new money.

The Conversation

Michelle Grattan, Professorial Fellow, University of Canberra

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Andrew Forrest’s high-tech plan to extinguish bushfires within an hour is as challenging as it sounds



Warren Frey/AAP

James Jin Kang, Edith Cowan University

The philanthropic foundation of mining billionaire Andrew “Twiggy” Forrest has unveiled a plan to transform how Australia responds to bushfires.

The Fire Shield project aims to use emerging technologies to rapidly find and extinguish bushfires. The goal is to be able to put out any dangerous blaze within an hour by 2025.

Some of the proposed technology includes drones and aerial surveillance robots, autonomous fire-fighting vehicles and on-the-ground remote sensors. If successful, the plan could alleviate the devastating impact of bushfires Australians face each year.

But while bushfire behaviour is an extensively studied science, it’s not an exact one. Fires are subject to a wide range of variables including local weather conditions, atmospheric pressure and composition, and the geographical layout of an area.

There are also human factors, such as how quickly and effectively front-line workers can respond, as well as the issue of arson.




Read more: Humans light 85% of bushfires, and we do virtually nothing to stop it

A plan for rapid bushfire detection

The appeal of the Fire Shield plan is in its proposal to use emerging fields of computer science to fight bushfires, especially artificial intelligence (AI) and the Internet of Things (IoT).

While we don’t currently have details on how the Fire Shield plan will be carried out, the use of an IoT bushfire monitoring network seems like the most viable option.

An IoT network is made up of many wirelessly connected devices. Deploying IoT devices with sensors in remote areas could allow the monitoring of changes in soil temperature, air temperature, weather conditions, moisture and humidity, wind speed, wind direction and forest density.

The sensors could also help pinpoint a fire’s location, predict where it will spread and also where it most likely started. This insight would greatly help with the early evacuation of vulnerable communities.

Data collected could be quickly processed and analysed using machine learning. This branch of AI provides intelligent analysis far more quickly than traditional computing, or human judgement.
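
As a toy illustration of that pipeline, sensor readings could be scored by a small model on the server. Everything below (the field names, weights and thresholds) is invented for this sketch and is not drawn from the Fire Shield proposal; a real system would use a far richer model trained on historical fire and weather data:

```python
import math
from dataclasses import dataclass

@dataclass
class Reading:
    node_id: str
    air_temp_c: float      # air temperature, degrees Celsius
    humidity_pct: float    # relative humidity, per cent
    wind_speed_kmh: float  # wind speed, km/h

# Hypothetical weights, standing in for a model trained offline.
WEIGHTS = {"air_temp_c": 0.04, "humidity_pct": -0.05, "wind_speed_kmh": 0.02}
BIAS = -1.0

def risk_score(r: Reading) -> float:
    """Logistic score in [0, 1]; higher means more fire-like conditions."""
    z = (BIAS
         + WEIGHTS["air_temp_c"] * r.air_temp_c
         + WEIGHTS["humidity_pct"] * r.humidity_pct
         + WEIGHTS["wind_speed_kmh"] * r.wind_speed_kmh)
    return 1 / (1 + math.exp(-z))

hot_dry = Reading("node-17", 44.0, 10.0, 50.0)  # hot, dry and windy
mild = Reading("node-03", 18.0, 70.0, 10.0)     # cool and humid
print(risk_score(hot_dry) > risk_score(mild))   # True
```

The point of the sketch is only the shape of the computation: readings arrive as small records and are reduced to a risk score cheaply enough to run over thousands of nodes in near real time.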

Water bomber puts out a blaze from the sky.
Water bomber helicopters were used in NSW earlier this year as almost 150 bushfires burnt across the state at one point.
Bianca De Marchi/AAP

A more reliable network

A wireless low power wide area network (LPWAN) would be the best option for implementing the required infrastructure for the proposal. LPWAN uses sensor devices with batteries lasting up to 15 years.

And although an LPWAN only allows limited coverage (10-40km) in rural areas, a network with more coverage would need batteries that have to be replaced more often — making the entire system less reliable.

If sensors are destroyed by fire, neighbouring sensors can report this back to the server, which can build a sensor “availability and location map”. Tracking destroyed sensors on this map would also help track a bushfire’s movement.
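
One way to picture that “availability and location map” is as a server-side table of when each node last reported. This is a minimal sketch with invented node IDs, coordinates and timeout: nodes silent for too long are presumed destroyed, and their known locations trace the fire’s path:

```python
import time

class AvailabilityMap:
    """Tracks which sensor nodes have reported recently.

    Nodes silent for longer than `timeout_s` are presumed destroyed
    (or failed); their last known locations approximate the fire front.
    """

    def __init__(self, timeout_s: float = 600.0):
        self.timeout_s = timeout_s
        self.last_seen = {}   # node_id -> timestamp of last report
        self.location = {}    # node_id -> (lat, lon)

    def report(self, node_id, lat, lon, now=None):
        now = time.time() if now is None else now
        self.last_seen[node_id] = now
        self.location[node_id] = (lat, lon)

    def presumed_destroyed(self, now=None):
        now = time.time() if now is None else now
        return [(nid, self.location[nid])
                for nid, seen in self.last_seen.items()
                if now - seen > self.timeout_s]

amap = AvailabilityMap(timeout_s=600)
amap.report("node-01", -33.80, 151.10, now=0)
amap.report("node-02", -33.81, 151.12, now=0)
amap.report("node-02", -33.81, 151.12, now=900)  # node-02 still reporting
print(amap.presumed_destroyed(now=1000))         # node-01 silent > 600 s
```

Replaying `presumed_destroyed` at intervals gives the sequence in which nodes fell silent, which is what would let the server infer the direction and speed of the fire’s movement.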

Dealing with logistics

While it’s possible, the practicalities of deploying sensors for a remote bushfire monitoring network make the plan hugely challenging. The areas to cover would be vast, with varying terrain and environmental conditions.

Sensor devices could potentially be deployed by aircraft across a region. On-ground distribution by people would be another option, but a more expensive one.

However, the latter option would have to be used to distribute larger gateway devices. These act as the bridge between the other sensors on ground and the server in the cloud hosting the data.

Gateway devices have more hardware and need to be set up by a person when first installed. They play a key role in LPWAN networks and must be placed carefully. After being placed, IoT devices require regular monitoring and calibration to ensure the information being relayed to the server is accurate.

Weather and environmental factors (such as storms or floods) have the potential to destroy the sensors. There’s also the risk of human interference, as well as legal considerations around deploying sensors on privately owned land.




Read more: There’s only one way to make bushfires less powerful: take out the stuff that burns

Unpredictable interruptions

While statisticians can provide insight into the likelihood of a bushfire starting at a particular location, bushfires remain inherently hard to predict.

Any sensor network will be hampered by unpredictable environmental conditions and technological issues such as interrupted network signals. And such disruptions could lead to delays in important information reaching authorities.

Potential solutions for this include using satellite services in conjunction with an LPWAN network, or balloon networks (such as Google’s project Loon) which can provide better internet connectivity in remote areas.

But even once the sensors can be used to identify and track bushfires, putting a blaze out is another challenge entirely. The Fire Shield plan’s vision “to detect, monitor and extinguish dangerous blazes within an hour anywhere in Australia” will face challenges on several fronts.

It may be relatively simple to predict hurdles in getting the technology set up. But once a bushfire is detected, it’s less clear what course of action could possibly extinguish it within the hour. In some very remote areas, aerial firefighting (such as with water bombers) may be the only option.

That raises the next question: how can we have enough aircraft and controllers ready to be dispatched to a remote place at a moment’s notice? Considering the logistics, it won’t be easy.

James Jin Kang, Lecturer, Computing and Security, Edith Cowan University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

A pretty good start but room for improvement: 3 experts rate Australia’s emissions technology plan



James Gourley/AAP

Jake Whitehead, The University of Queensland; Chris Greig, and Simon Smart, The University of Queensland

Energy Minister Angus Taylor yesterday released his government’s emissions reduction technology plan, setting out priorities for meeting Australia’s climate targets while growing the economy.

The long-awaited Technology Investment Roadmap examined more than 140 technologies for potential investment between now and 2050. They include electric vehicles, biofuels, batteries, hydrogen, nuclear and carbon capture and storage.




Read more: Morrison government dangles new carrots for industry but fails to fix bigger climate policy problem

The discussion paper builds on the need for a post-pandemic recovery plan. It sets a positive tone, and highlights Australia’s enormous opportunities to support investment in low-emission technologies, while increasing prosperity.

But it’s not clear whether the government grasps the sheer scale of infrastructure and behaviour change required to meet our climate goals – nor the urgency of the task.

So let’s take a closer look at where the report hits the mark, and where there’s room for improvement.

The University of Queensland’s 78 megawatt solar farm at Warwick.
Author provided

Positive signs

The paper gives a reasonably comprehensive overview of new and emerging technologies, and builds on a significant body of prior work and investment. This includes the CSIRO’s Low Emissions Technology Roadmap and ARENA’s Commercial Readiness Index.

Crucially, the paper recognises the need for government funding to help share the financial risks of deploying technologies in their early stages. It also acknowledges the need for partnerships between government, industry and research institutions to drive innovation.

Encouragingly, the paper recognises Australia’s responsibility to support our neighbours across the Indo-Pacific, to help reduce international emissions.




Read more: Coronavirus is a ‘sliding doors’ moment. What we do now could change Earth’s trajectory

The paper is a “living” document, designed to be updated in response to future developments in technology, domestic demand, international markets and so on. Progress will be reported through annual “low emissions technology statements”, and the roadmap can be adjusted as certain technologies flourish and others fail.

This process recognises the considerable uncertainties around the performance and costs of future technologies. It will allow ongoing assessment of where future technologies should be deployed, and which can ultimately deliver the greatest emissions reduction benefit.

The paper considers the role of both coal and natural gas in Australia’s transition to net-zero emissions. We don’t object to the inclusion of these energy sources, as long as they’re decarbonised, for example using carbon capture and storage or verifiable carbon offsets.

Coal and gas should be decarbonised if they are part of our energy future.
Julian Smith/AAP

Room for improvement

The paper’s emphasis on technology and investment is clear. But what’s less clear is an appreciation of the sheer scale of change needed to support a low- or net-zero emissions future.

The roadmap would benefit from an assessment of the scale of investment and infrastructure needed to meet the long-term emissions goals of the Paris Agreement. This will require nations including Australia to reduce economy-wide emissions to net-zero.

We believe the lack of clarity around mid-century (and intermediate) emissions targets is a significant gap in the roadmap. It obscures the scale and pace of technological change required across all sectors, and has already prompted criticism.

The energy transition must start as soon as possible. It will involve unprecedented levels of behaviour change, infrastructure investment and technology deployment, which must be maintained over several decades.

The deployment of new technologies affects communities and natural landscapes. The paper touches on these issues, such as the use of water resources to produce renewable hydrogen.

But it does not sufficiently emphasise the need to consult a broad range of stakeholders, such as community, environment and business groups. This should happen before investment begins, and throughout the transition.

The paper also omits notable low-emission technologies already deployed in Australia. This includes zero-emission electric heavy vehicles such as buses, trackless trams and trucks. Future consultation on the paper will help fill these gaps.

The Brisbane Metro project involves electric buses.

Planning for an uncertain future

The roadmap process should explore the various technology pathways that could plausibly emerge between now and 2050, depending on how technologies progress and costs evolve, levels of public acceptance, and the nature of policies adopted.

The process should also seek to identify and deal with industrial, regulatory and social bottlenecks or constraints that might slow down technological efforts to decarbonise our economy, and those of our trading partners.




Read more: Wrong way, go back: a proposed new tax on electric vehicles is a bad idea

With Princeton University, we are co-leading such a project. Known as Rapid Switch, the international collaboration will determine the actions needed in various countries to reach net-zero emissions by 2050.

Our work highlights the need for most low-carbon technologies to be deployed at historically unprecedented rates. This wholesale transformation will have dramatic impacts on landscapes, natural resources, industries and current practices.

The road ahead

Overall, the Technology Investment Roadmap is a solid foundation for building a low-emissions future.

It should encourage the right technology investment, if supported by other policy mechanisms. These should include an expanded Renewable Energy Target and low-carbon fuel and material standards which, for example, would encourage the production of green hydrogen and steel.

But the divisive nature of Australia’s climate politics over the past decade shows that securing bipartisan support for this plan, and its implementation over the long term, is crucial.

The magnitude of the challenge of transitioning our economy must not be taken for granted. But with a few important changes, this roadmap could help get us there.

Jake Whitehead, Advance Queensland Industry Research Fellow & Tritum E-Mobility Fellow, The University of Queensland; Chris Greig, Professor, and Simon Smart, Associate professor, The University of Queensland

This article is republished from The Conversation under a Creative Commons license. Read the original article.

There are 10 catastrophic threats facing humans right now, and coronavirus is only one of them


Arnagretta Hunter, Australian National University and John Hewson, Crawford School of Public Policy, Australian National University

Four months in, this year has already been a remarkable showcase for existential and catastrophic risk. A severe drought, devastating bushfires, hazardous smoke, towns running dry – these events all demonstrate the consequences of human-induced climate change.

While the above may seem like isolated threats, they are parts of a larger puzzle of which the pieces are all interconnected. A report titled Surviving and Thriving in the 21st Century, published today by the Commission for the Human Future, has isolated ten potentially catastrophic threats to human survival.

Not prioritised over one another, these risks are:

  1. decline of natural resources, particularly water
  2. collapse of ecosystems and loss of biodiversity
  3. human population growth beyond Earth’s carrying capacity
  4. global warming and human-induced climate change
  5. chemical pollution of the Earth system, including the atmosphere and oceans
  6. rising food insecurity and failing nutritional quality
  7. nuclear weapons and other weapons of mass destruction
  8. pandemics of new and untreatable disease
  9. the advent of powerful, uncontrolled new technology
  10. national and global failure to understand and act preventatively on these risks.

The start of ongoing discussions

The Commission for the Human Future formed last year, following earlier discussions within emeritus faculty at the Australian National University about the major risks faced by humanity, how they should be approached and how they might be solved. We hosted our first round-table discussion last month, bringing together more than 40 academics, thinkers and policy leaders.

The commission’s report states our species’ ability to cause mass harm to itself has been accelerating since the mid-20th century. Global trends in demographics, information, politics, warfare, climate, environmental damage and technology have culminated in an entirely new level of risk.

The risks emerging now are varied, global and complex. Each one poses a “significant” risk to human civilisation, a “catastrophic risk”, or could actually extinguish the human species and is therefore an “existential risk”.

The risks are interconnected. They originate from the same basic causes and must be solved in ways that make no individual threat worse. This means many existing systems we take for granted, including our economic, food, energy, production and waste, community life and governance systems – along with our relationship with the Earth’s natural systems – must undergo searching examination and reform.

COVID-19: a lesson in interconnection

It’s tempting to examine these threats individually, and yet with the coronavirus crisis we see their interconnection.

The response to the coronavirus has had implications for climate change, through reduced carbon pollution; it has also increased discussion about artificial intelligence and the use of data (including facial recognition), and changed the landscape of global security, particularly in the face of massive economic transition.

It’s not possible to “solve” COVID-19 without affecting other risks in some way.

Shared future, shared approach

The commission’s report does not aim to solve each risk, but rather to outline current thinking and identify unifying themes. Understanding science, evidence and analysis will be key to adequately addressing the threats and finding solutions. An evidence-based approach to policy has been needed for many years. Under-appreciating science and evidence leads to unmitigated risks, as we have seen with climate change.

The human future involves us all. Shaping it requires a collaborative, inclusive and diverse discussion. We should heed advice from political and social scientists on how to engage all people in this conversation.




Read more: From the bushfires to coronavirus, our old ‘normal’ is gone forever. So what’s next?

Imagination, creativity and new narratives will be needed for challenges that test our civil society and humanity. The bushfire smoke over the summer was unprecedented, and COVID-19 is a new virus.

If our policymakers and government had spent more time using the available climate science to understand and then imagine the potential risks of the 2019-20 summer, we would have recognised the potential for a catastrophic season and would likely have been able to prepare better. Unprecedented events are not always unexpected.

Prepare for the long road

The short-termism of our political process needs to be circumvented. We must consider how our actions today will resonate for generations to come.

The commission’s report highlights the failure of governments to address these threats and particularly notes the short-term thinking that has increasingly dominated Australian and global politics. This has seriously undermined our potential to decrease risks such as climate change.




Read more: Listen to your people Scott Morrison: the bushfires demand a climate policy reboot

The shift from short-term to longer-term thinking can begin at home and in our daily lives. We should make decisions today that acknowledge the future, and practise this not only in our own lives but also demand it of our policymakers.

We’re living in unprecedented times. The catastrophic and existential risks for humanity are serious and multifaceted. And this conversation is the most important one we have today.

Arnagretta Hunter, ANU Human Futures Fellow 2020; Cardiologist and Physician, Australian National University and John Hewson, Professor and Chair, Tax and Transfer Policy Institute, Crawford School of Public Policy, Australian National University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Australia wants to install military technology in Antarctica – here’s why that’s allowed



Technology, such as satellite systems, can be used for both military and scientific purposes.
Shutterstock

Tony Press, University of Tasmania

This week, the ABC revealed that the Australian Defence Force wants to roll out military technology in Antarctica.

The article raises the issue of what is, or is not, a legitimate use of technology under the Antarctic Treaty. The answer has a lot to do with how the technology is used, and with the treaty’s provisions.

The Antarctic Treaty was negotiated in the late 1950s, during the Cold War. Its purpose was to keep Antarctica separate from any Cold War conflict, and any arguments over sovereignty claims.




Read more: As China flexes its muscles in Antarctica, science is the best diplomatic tool on the frozen continent

The words used in the treaty reflect the global politics and technologies back then, before there were satellites and GPS systems. But its provisions and prohibitions are still relevant today.

The opening provision of the Antarctic Treaty, which came into force in 1961, says:

Antarctica shall be used for peaceful purposes only. There shall be prohibited, [among other things], any measures of a military nature, such as the establishment of military bases and fortifications, the carrying out of military manoeuvres, as well as the testing of any type of weapons.

The treaty also prohibits “any nuclear explosions in Antarctica” and disposal of radioactive waste. What the treaty does not do, however, is prohibit countries from using military support in their peaceful Antarctic activities.

Many Antarctic treaty parties, including Australia, New Zealand, the United Kingdom, the US, Chile and Argentina, rely on military support for their research. This includes the use of ships, aircraft, personnel and specialised services like aircraft ground support.

In fact, the opening provision of the treaty is clarified by the words:

the present Treaty shall not prevent the use of military personnel or equipment for scientific research or for any other peaceful purpose.

It would be a breach of the treaty if “military exercises” were being conducted in Antarctica, or if military equipment was being used for belligerent purposes. But the treaty does not deal specifically with technology. It deals with acts or actions. The closest it gets to technology is the term “equipment” as used above.

Dual use technology

So-called “dual use” technology – which can be used for both peaceful and military purposes – is allowed in Antarctica in support of science.




Read more: For the first time, we can measure the human footprint on Antarctica

The term is often used to describe technology such as the widely-used GPS, which relies on satellites and a worldwide system of ground-based receiving stations. Norway’s “Trollsat”, China’s “Beidou”, and Russia’s “GLONASS” systems are similar, relying on satellites and ground stations for their accuracy.

What’s more, modern science heavily relies on satellite technology and the use of Antarctic ground stations for data gathering and transmission.

And scientific equipment like ice-penetrating radar, carried on aircraft, drones and autonomous airborne vehicles, is being used extensively to understand the Antarctic continent itself and how it’s changing.

Much, if not all, of this technology could have “dual use”. But its use is not contrary to the Antarctic Treaty.

In fact, the use of this equipment for “scientific research” or a “peaceful purpose” is not only legitimate, it’s also essential for Antarctic research, and global understanding of the health of our planet.




Read more: The benefits – and pitfalls – of working in isolation

The technologies Australia deploys in Antarctica all relate to its legitimate Antarctic operations and to science.

There are also facilities in Antarctica used to monitor potential military-related activities elsewhere in the world, such as the monitoring stations used under the Comprehensive Nuclear Test Ban Treaty.

The circumstances under which modern technology would, or could, be used against the provisions of the Antarctic Treaty have not been tested. But the activity would have to go beyond “dual use” and not be for science or peaceful purposes.

Science in Antarctica is open to scrutiny

Science in Antarctica is very diverse, from space sciences to ecosystem science, and 29 countries have active research programs there.

And since Antarctica plays a significant role in the global climate system, much modern Antarctic research focuses on climate science and climate change.

But there has been speculation about whether Antarctica is crucial to the development of alternatives to GPS (for example, by Russia and China) that could also be used in warfare as well as for peaceful purposes. It’s unclear whether using ground stations in Antarctica is essential for such a purpose.

For instance, Claire Young, a security analyst writing for the Australian Strategic Policy Institute, said the accuracy of China’s Beidou satellite has already been improved by international testing, so testing in Antarctica will make very little difference.




Read more: Remembering Antarctica’s nuclear past with ‘Nukey Poo’

This leads to another important provision of the Antarctic Treaty.

The treaty foreshadowed compliance problems in the remote and hostile continent by including an open-ended provision for any Antarctic Treaty Party to inspect any Antarctic facility.

In other words, any party has complete freedom to access all parts of Antarctica at any time to inspect ships, aircraft, equipment, or any other facility, and even use “aerial observations” for inspection. This means the activities of all parties, and all actions in Antarctica, are available for open scrutiny.

This inspection regime is important because inspections can be used to determine if modern technology on the continent is, in fact, being used for scientific or peaceful purposes, in line with the provisions of the treaty.

Tony Press, Adjunct Professor, Institute for Marine and Antarctic Studies, University of Tasmania

This article is republished from The Conversation under a Creative Commons license. Read the original article.