Two new exoplanets have been discovered thanks to NASA’s collaboration with Google’s artificial intelligence (AI). One of those in today’s announcement is an eighth planet – Kepler-90i – found orbiting the Sun-like star Kepler-90. This makes it the first system discovered with as many planets as our own Solar system.
A mere road trip away, at 2,545 light-years from Earth, Kepler-90i orbits its host star every 14.4 Earth days, with a sizzling surface temperature of around 426°C, similar to Venus.
The new exoplanets are added to the growing list of known worlds found orbiting other stars.
This new Solar system rival provides evidence that a process similar to the one that formed our own planetary neighbourhood also occurred within Kepler-90: small terrestrial worlds close to the host star, and larger gassy planets further away. But to say the system is a twin of our own Solar system is a stretch.
The entire Kepler-90 system of eight planets would easily fit within Earth’s orbit of the Sun. All eight planets, bar Kepler-90h, would be too hostile for life, lying outside the so-called habitable zone.
Evidence also suggests that planets within the Kepler-90 system started out farther apart, much like our own Solar system. Some form of migration occurred, dragging this system inwards, producing the orbits we see in Kepler-90 today.
Google’s collaboration with NASA’s Kepler space telescope mission has now opened up new and exciting opportunities for AI to assist in scientific discovery.
So how exactly did Google’s AI discover these planets? And what sort of future discoveries can this technology provide?
Training AI for exoplanet discoveries
Traditionally, software developers program computers to perform a particular task, from playing your favourite cat video, to determining exoplanetary signals from space-based telescopes such as NASA’s Kepler Mission.
These programs are executed to serve a single purpose. Using code intended for cat videos to hunt exoplanets in light curves would lead to some very interesting, yet false, results.
Google’s AI is programmed rather differently, using machine learning. In machine learning, AI is trained through artificial neural networks – somewhat replicating our brain’s biological neural networks – to perform tasks like reading this article. It then learns from its mistakes, becoming more efficient at its particular task.
Google’s DeepMind AI, AlphaGo, was trained previously to play Go, an extremely complex yet elegant Chinese board game. Last year, AlphaGo defeated Lee Sedol, the world’s best Go player, by four games to one. It simply trained itself by watching thousands of previously played games, then competing against itself.
In our exoplanetary case, the AI was trained to identify transiting exoplanets, sifting through 15,000 signals from the Kepler exoplanet catalogue. It learned what was and wasn’t a signal caused by an exoplanet eclipsing its host star. These 15,000 signals had been vetted by NASA scientists prior to the AI’s training, guiding it to 96% accuracy in detecting known exoplanets.
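The supervised training described above can be sketched, very loosely, as fitting a classifier to labelled signals. The toy below is purely illustrative: it uses synthetic data and a simple logistic regression, standing in for the deep neural network and real Kepler light curves used in the actual study.

```python
# Illustrative sketch only: a binary classifier learning to separate
# "planet transit" signals from false positives, using synthetic features.
# In the real study a deep neural network was trained on vetted Kepler data.
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Synthetic labels and features: transits tend to have deeper, longer dips
# than false positives (invented numbers, for illustration only).
labels = rng.integers(0, 2, n)                                 # 1 = planet
depth = np.where(labels == 1, 0.010, 0.002) + rng.normal(0, 0.001, n)
duration = np.where(labels == 1, 3.0, 1.0) + rng.normal(0, 0.5, n)
X = np.column_stack([depth, duration])

# Logistic regression trained by plain gradient descent.
w = np.zeros(2)
b = 0.0
for _ in range(5000):
    z = np.clip(X @ w + b, -30, 30)        # clip to avoid exp overflow
    p = 1.0 / (1.0 + np.exp(-z))           # predicted probability of "planet"
    w -= 0.3 * (X.T @ (p - labels) / n)    # gradient step on weights
    b -= 0.3 * float(np.mean(p - labels))  # gradient step on bias

p = 1.0 / (1.0 + np.exp(-np.clip(X @ w + b, -30, 30)))
accuracy = float(np.mean((p > 0.5) == labels))
print(f"training accuracy: {accuracy:.2f}")
```

The key idea carries over: the model is never told the physics of a transit, only shown labelled examples, and it learns a decision boundary from them.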
Researchers then directed their AI network to search in multiplanetary systems for weaker signals. This research culminated in today’s announcement of both Kepler-90i and another Earth-sized exoplanet, Kepler-80g, in a separate planetary system.
Hunting for more exoplanets using AI
Google’s AI has analysed only 10% of the 150,000 stars NASA’s Kepler Mission has been eyeing off across the Milky Way galaxy.
There’s potential, then, for sifting through Kepler’s entire catalogue and finding other exoplanetary worlds that have either been skimmed over by scientists or haven’t been checked yet, given Kepler’s rich data set. And that’s exactly what Google’s researchers are planning to do.
Machine learning neural networks have been assisting astronomers for a few years now. But the potential for AI to assist in exoplanetary discoveries will only increase within the next decade.
The Kepler mission has been running since 2009, with observations slowly coming to an end. Within the next 12 months, all of its on-board fuel will be fully depleted, ending what has been one of the greatest scientific endeavours in modern times.
Kepler’s successor, the Transiting Exoplanet Survey Satellite (TESS) will be launching this coming March.
TESS is predicted to find 20,000 exoplanet candidates during its two-year mission. To put that into perspective, in the past 25 years, we’ve managed to discover just over 3,500.
This unprecedented inundation of exoplanetary data will need to be confirmed either by further transit observations or by other methods, such as ground-based radial velocity measurements.
There just isn’t enough people-power to sift through all of this data. That’s why these machine learning networks are needed: they can aid astronomers in sifting through big data sets, ultimately assisting in more exoplanetary discoveries. Which raises the question: who exactly gets credit for such a discovery?
Joshua Soderholm, The University of Queensland; Alain Protat, Australian Bureau of Meteorology; Hamish McGowan, The University of Queensland; Harald Richter, Australian Bureau of Meteorology, and Matthew Mason, The University of Queensland
An Australian spring wouldn’t be complete without thunderstorms and a visit to the Australian Bureau of Meteorology’s weather radar website. But a new type of radar technology is aiming to make weather radar even more useful, by helping to identify those storms that are packing hailstones.
Most storms just bring rain, lightning and thunder. But others can produce hazards including destructive flash flooding, winds, large hail, and even the occasional tornado. For these potentially dangerous storms, the Bureau issues severe thunderstorm warnings.
For metropolitan regions, warnings identify severe storm cells and their likely path and hazards. They provide a predictive “nowcast” – forecasts up to three hours before impact – for suburbs in harm’s way.
When monitoring thunderstorms, weather radar is the primary tool for forecasters. Weather radar scans the atmosphere at multiple levels, building a 3D picture of thunderstorms, with a 2D version shown on the bureau’s website.
This is particularly important for hail, which forms several kilometres above ground in towering storms where temperatures are well below freezing.
Hailstorms have caused more insured losses than any other type of severe weather event in Australia. Brisbane’s November 2014 hailstorms cost an estimated A$1.41 billion, while Sydney’s April 1999 hailstorm, at A$4.3 billion, remains the nation’s most costly natural disaster.
Breaking the ice
Nonetheless, accurately detecting and estimating hail size from weather radar remains a challenge for scientists. This challenge stems from the diversity of hail. Hailstones can be large or small, densely or sparsely distributed, mixed with rain, or any combination of the above.
Conventional radars measure the scattering of the radar beams as they pass through precipitation. However, a few large hailstones can look the same as lots of small ones, making it hard to determine hailstones’ size.
A new type of radar technology called “dual-polarisation” or “dual-pol” can solve this problem. Rather than using a single radar beam, dual-pol uses two simultaneous beams aligned horizontally and vertically. When these beams scatter off precipitation, they provide relative measures of horizontal and vertical size.
Therefore, an observer can see the difference between flatter shapes of rain droplets and the rounder shapes of hailstones. Dual-pol can also more accurately measure the size and density of rain droplets, and whether it’s a mixture or just rain.
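One standard dual-pol quantity behind this capability is differential reflectivity (ZDR): the ratio, in decibels, of the horizontally to vertically polarised returned power. The sketch below uses textbook-style illustrative values, not the Bureau’s operational algorithms.

```python
# Illustrative sketch of differential reflectivity (ZDR), a standard
# dual-pol radar quantity. The reflectivity values below are invented
# for illustration; this is not the Bureau's operational processing.
import math

def zdr_db(z_horizontal: float, z_vertical: float) -> float:
    """Differential reflectivity in dB from linear reflectivity factors."""
    return 10.0 * math.log10(z_horizontal / z_vertical)

# Falling raindrops flatten as they fall, returning more horizontal power
# (ZDR clearly positive); tumbling hailstones look roughly spherical on
# average, so their ZDR sits near zero.
rain_zdr = zdr_db(2000.0, 1000.0)   # oblate rain drops
hail_zdr = zdr_db(1500.0, 1450.0)   # near-spherical hail

print(f"rain ZDR = {rain_zdr:.1f} dB, hail ZDR = {hail_zdr:.1f} dB")
```

It is this kind of shape signature, unavailable to single-beam radars, that lets forecasters tell a hail core from heavy rain.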
Together, these capabilities mean that dual-pol is a game-changer for hail detection, size estimation and nowcasting.
Into the eye of the storm
Dual-pol information is now streaming from the recently upgraded operational radars in Adelaide, Melbourne, Sydney and Brisbane. It allows forecasters to detect hail earlier and with more confidence.
However, more work is needed to accurately estimate hail size using dual-pol. The ideal place for such research is undoubtedly southeast Queensland, the hail capital of the east coast.
When it comes to thunderstorm hazards, nothing is closer to reality than scientific observations from within the storm. In the past, this approach was considered too costly, risky and demanding. Instead, researchers resorted to models or historical reports.
The Atmospheric Observations Research Group at the University of Queensland (UQ) has developed a unique capacity in Australia to deploy mobile weather instrumentation for severe weather research. In partnership with the UQ Wind Research Laboratory, Guy Carpenter and staff in the Bureau of Meteorology’s Brisbane office, the Storms Hazards Testbed has been established to advance the nowcasting of hail and wind hazards.
Over the next two to three years, the testbed will take a mobile weather radar, meteorological balloons, wind measurement towers and hail size sensors into and around severe thunderstorms. Data from these instruments provide high-resolution case studies and ground-truth verification data for hazards observed by the Bureau’s dual-pol radar.
Since the start of October, we have intercepted and sampled five hailstorms. If you see a convoy of UQ vehicles heading for ominous dark clouds, head in the opposite direction and follow us on Facebook instead.
Unfortunately, the UQ storm-chasing team can’t get to every severe thunderstorm, so we need your help! The project needs citizen scientists in southeast Queensland to report hail through #UQhail. Keep a ruler or object for scale (coins are great) handy and, when a hailstorm has safely passed, measure the largest hailstone.
Combining measurements, hail reports and the Bureau of Meteorology’s dual-pol weather radar data, we are working towards developing algorithms that will allow hail to be forecast more accurately. This will provide greater confidence in warnings and those vital extra few minutes when cars can be moved out of harm’s way, reducing the impact of storms.
Advanced techniques developed from storm-chasing and citizen science data will be applied across the Australian dual-pol radar network in Sydney, Melbourne and Adelaide.
Who knows, in the future if the Bureau’s weather radar shows a thunderstorm heading your way, your reports might even have helped to develop that forecast.
Joshua Soderholm, Research scientist, The University of Queensland; Alain Protat, Principal Research Scientist, Australian Bureau of Meteorology; Hamish McGowan, Professor, The University of Queensland; Harald Richter, Senior Research Scientist, Australian Bureau of Meteorology, and Matthew Mason, Lecturer in Civil Engineering, The University of Queensland
This follows apparently egregious behaviour by some irrigators and state government regulators in New South Wales. Yet the alleged theft of water in the Murray-Darling Basin is only the tip of the iceberg when we consider the institutional problems – namely the capture of state government agencies by powerful irrigation interests.
Take NSW as an example. In 1993 the then state Department of Water Resources’ North west rivers audit found the same theft, meter-tampering and questionable government oversight exposed again by the ABC’s Four Corners investigation in July.
Only half of the targeted volume of salt has been flushed out to sea and the water supply to Broken Hill and other communities has become unreliable. Moreover, floodplain forests and wetlands of international significance continue to decline, and native fish and water bird populations have flatlined.
In fact, many values are at risk in the river system that supplies water to more than 3 million people and covers a seventh of Australia’s landmass. It is not only a few (alleged) bad apples; it is the governance of water that is broken.
Problems with the existing plan
While bad behaviour in NSW is evident, of more concern is the way some state governments are frustrating implementation of the A$13 billion 2012-26 Basin Plan and associated programs to recover water for the river system.
If the Basin Plan is to improve the health of the river and its extensive floodplain forests along the lower River Murray, the water recovered for the environment needs to be released in pulses. That will be the best way to ensure it can rise out of the river channel and inundate wetlands.
Read more: Is the Murray-Darling Basin Plan broken?
In this context it is unhelpful for the Victorian Government to propose flows of around half the previously agreed size because of the objections of a small number of landowners along the Goulburn River in its Goulburn key focus area project.
Upstream, state governments have rules that allow water purchased by taxpayers for the river to be extracted by irrigators when it crosses state borders. However, they are failing to remove bottlenecks that prevent managed floods from travelling safely down rivers. They have even proposed to reduce the water available for the environment below minimum requirements.
Astonishingly, 30% of water extraction points in the Basin are still not metered and the information that is collected is not publicly available or audited so that theft can be penalised.
Sustainable management required
Sustainable management of the Murray-Darling Basin requires trust and cooperation among the responsible state, ACT and federal governments.
The alleged water theft in NSW breaks that trust, especially for SA as the downstream state that relies on the River Murray. But so too does the stalling of implementation of the Basin Plan agreement and manipulation of the rules that govern who gets what water and when they get it.
The foundation of trust is transparency. As a start, there are many opportunities for online recording of water allocations and use to increase trust. It is still possible to fix implementation of the Plan.
In a report released yesterday the Wentworth Group of Concerned Scientists has identified several solutions, including metering all water diversions, completing water recovery, and investing in regional development.
The good news is that there are signs of political leadership. The Council of Australian Governments promised in June to deliver the Basin Plan “in full and on time” for its planned commencement in 2019.
Recently, Prime Minister Malcolm Turnbull recommitted the federal government to Basin Plan implementation. He endorsed the far-reaching recommendations of the Murray-Darling Basin Authority’s Basin-wide Compliance Review to strengthen enforcement of water laws and the Basin Plan, and to recover the remaining environmental water.
The SA Royal Commission
Beginning in 2018, SA Premier Jay Weatherill’s newly announced Royal Commission will investigate breaches of the Murray-Darling Basin Agreement, and the Commissioner “will examine the adequacy of existing legislation and practices and make recommendations for any necessary changes.”
Most significantly, Weatherill has proposed going beyond water theft to “look into whether any legislative or policy changes since the agreement was signed in 2012 have been inconsistent with the purpose of the Basin Agreement and Basin Plan”.
The Royal Commission’s terms of reference are not yet available and the extent of cooperation of upstream governments is highly uncertain (NSW has already said it will not cooperate). Yet the Royal Commission could help identify ways to better meter and account for water, improve compliance and set rules to protect environmental water.
At the next Basin Ministerial Council meeting later this year, the governments need to map out measures to put the Plan back on track. If they can do so, these measures can be endorsed at the Council of Australian Governments in 2018. This is their opportunity to articulate precisely how they will fulfil their commitment to delivering the Basin Plan in full and on time.
The Murray-Darling Basin Plan is not perfect. Implementation has problems, but with the remaining A$5.1 billion in allocated funds and proper leadership it can be well implemented, to the benefit of both people and the environment.
Antibiotics are losing their ability to kill bacteria.
One of the main reasons for the rise in antibiotic resistance is the improper use of antibiotics, but our latest research shows that the ingredients in commonly used weed killers like Roundup and Kamba can also cause bacteria to become less susceptible to antibiotics.
Herbicides induce gene activity
Already, about 700,000 deaths are attributable each year to infections by drug-resistant bacteria. A recent report projected that by 2050, 10 million people a year will die from previously treatable bacterial infections, with a cumulative cost to the world economy of $US100 trillion.
The bacteria we study are potential human pathogens. Seventy years ago pathogens were uniformly susceptible to antibiotics used in medicine and agriculture. That has changed. Now some are resistant to all but one or two remaining antibiotics. Some strains are resistant to all.
When bacteria were exposed to commercial herbicide formulations based on 2,4-D, dicamba or glyphosate, the lethal concentration of various antibiotics changed. Often it took more antibiotic to kill them, but sometimes it took less.
We showed that one effect of the herbicides was to induce certain genes that they all carry, but don’t always use.
These genes are part of the so-called “adaptive response”. The main elements of this response are proteins that “pump” toxins out of the cell, keeping intracellular concentrations sublethal. We knew this because the addition of a chemical inhibitor of the pumps eliminated the protective effect of the herbicide.
In our latest work, we tested this by using gene “knockout” bacteria, which had been engineered to lose just one pump gene. We found that most of the effect of the herbicide was explained by these pumps.
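The pump mechanism described above can be captured in a toy model of our own construction (not taken from the study): toxin enters the cell at a constant rate and is pumped out in proportion to its intracellular concentration, so the steady-state level is influx divided by pump rate. Inducing more pumps lowers that steady state, potentially keeping it sublethal.

```python
# Toy model of the efflux-pump "adaptive response" (our illustration, not
# from the study). Intracellular toxin concentration C follows
#     dC/dt = influx - pump_rate * C
# which settles at the steady state C* = influx / pump_rate.
# All numbers are arbitrary units chosen for illustration.

def steady_state_concentration(influx: float, pump_rate: float) -> float:
    """Steady-state intracellular toxin concentration C* = influx / pump_rate."""
    return influx / pump_rate

INFLUX = 10.0            # toxin entering the cell per unit time
LETHAL_THRESHOLD = 5.0   # intracellular concentration that kills the cell

baseline = steady_state_concentration(INFLUX, pump_rate=1.0)  # pumps at rest
induced = steady_state_concentration(INFLUX, pump_rate=4.0)   # pumps induced

print(baseline > LETHAL_THRESHOLD)  # baseline pumping alone is not enough
print(induced < LETHAL_THRESHOLD)   # induced pumps keep toxin sublethal
```

This also illustrates why knocking out a single pump gene removes most of the protection: with the pump term gone, the intracellular concentration climbs back toward lethal levels.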
Reduced antibiotic use may not fix the problem
For decades we have put our faith in inventing new antibiotics above the wisdom of preserving the effectiveness of existing ones. We have applied the same invention incentives to the commercialisation of antibiotics as those used with mobile phones. Those incentives maximise the rate of product sales. They have saturated the market with phones, and they saturate the earth with antibiotic-resistant bacteria.
Improper use of antibiotics is a powerful driver of the widespread resistance.
Knowing this naturally leads to the hypothesis that proper and lower use will make the world right again. Unfortunately, the science is not fully on the side of that hypothesis.
Studies following rates of resistance do generally find a decrease in resistance to specific drugs when their use is banned or decreased. However, the effect is not a restoration of a pre-antibiotic susceptibility, characterised by multi-year effectiveness of the antibiotic. Instead, resistance returns rapidly when the drug is used again.
This tells us that once resistance has stabilised in populations of bacteria, suspended use may change the ratio of resistant to susceptible, but it does not eliminate resistant types. Very small numbers of resistant bacteria can undermine the antibiotic when it is used again.
Herbicides and other pollutants mimic antibiotics
What keeps these resistant minorities around? Recall that bacteria are very small, but there are lots of them; you carry 100 trillion of them. They are also found everywhere from deep underground to high in the atmosphere.
Because antibiotics are so powerful, they eliminate bacteria that are susceptible and leave the few resistant ones to repopulate. Having done so, we now have lots of bacteria, and lots of resistance genes, to get rid of, and that takes a lot of time.
As our work suggests, the story is even more complicated. We are inclined to think of antibiotics as medicine and agrichemicals, hand soaps, bug sprays and preservatives as different. Bacteria don’t do this. To them, they are all toxic.
Some are really toxic (antibiotics) and some not so much (herbicides). Bacteria are among the longest lived organisms on earth. Nearly four billion years of survival has taught them how to deal with toxins.
Pesticides as antibiotic vaccines
Our hypothesis is that herbicides immunise bacteria against more potent toxins like antibiotics. Since all bacteria carry these protections, exposure to widely used products is particularly problematic. So these products, among others, might keep bacteria ready for antibiotics whether or not we are using them.
We found that both the purified active ingredients and potential inert ingredients in weed killers caused a change in antibiotic response. Those inert ingredients are also found in processed foods and common household products. Resistance was induced at concentrations below those legally allowed in food.
What does this all mean? Well for starters we may have to think more carefully about how to regulate chemical commerce. With approximately eight million manufactured chemicals in commerce, 140,000 new since 1950, and limited knowledge of their combination effects and breakdown products, this won’t be easy.
But neither is it easy to watch someone die from an infection we lost the power to cure.
Jack Heinemann, Professor of Molecular Biology and Genetics
We have discovered a new species of orangutan – the third known species and the first new great ape to be described since the bonobo almost a century ago.
The new species, called the Tapanuli orangutan (Pongo tapanuliensis), has a smaller skull than the existing Bornean and Sumatran orangutans, but has larger canines.
As we and our colleagues report in the journal Current Biology, the new species is represented by an isolated population of fewer than 800 orangutans living at Batang Toru in northern Sumatra, Indonesia.
The existence of a group of orangutans in this region was first reported back in 1939. But the Batang Toru orangutans were not rediscovered until 1997, and then confirmed in 2003. We set about carrying out further research to see whether this isolated group of orangutans was truly a unique species.
On the basis of genetic evidence, we have concluded that they are indeed distinct from both the other two known species of orangutan: Pongo abelii from further north in Sumatra, and Pongo pygmaeus from Borneo.
The Batang Toru orangutans have a curious mix of features. Mature males have cheek flanges similar to those of Bornean orangutans, but their slender build is more akin to Sumatran orangutans.
The hair colour is more cinnamon than the Bornean species, and the Batang Toru population also makes longer calls than other orangutans.
To make completely sure, we needed more accurate comparisons of their body dimensions, or “morphology”. It was not until 2013 that the skeleton of an adult male became available, but since then one of us (Anton) has amassed some 500 skulls of the other two species, collected from 21 institutions, to allow for accurate comparisons.
Analyses of male orangutan skulls have to be conducted at a similar developmental stage, because the skulls continue growing even in adulthood. Anton found 33 skulls of wild males that were suitable for comparison. Of 39 different measurement characteristics for the Batang Toru skull, 24 fall outside the typical ranges of northern Sumatran and Bornean orangutans.
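The underlying comparison is simple in principle: for each measurement, ask whether the Batang Toru value falls inside the range spanned by the reference skulls. The sketch below uses invented measurement names and numbers purely to illustrate that logic; the study itself compared 39 characteristics across hundreds of specimens.

```python
# Illustrative sketch of a range-comparison check. Measurement names and
# values are hypothetical; the real study used 39 characteristics measured
# on skulls from the two previously known orangutan species.

def outside_reference_range(specimen: dict, reference_ranges: dict) -> list:
    """Return the names of measurements falling outside [min, max] reference ranges."""
    return [name for name, value in specimen.items()
            if not (reference_ranges[name][0] <= value <= reference_ranges[name][1])]

# Hypothetical reference ranges from the comparison skulls (min, max).
reference = {
    "skull_length_mm": (190.0, 230.0),
    "canine_height_mm": (20.0, 30.0),
}

# Hypothetical values for the new specimen: smaller skull, bigger canines.
batang_toru = {
    "skull_length_mm": 185.0,
    "canine_height_mm": 32.0,
}

flagged = outside_reference_range(batang_toru, reference)
print(flagged)  # both measurements fall outside the reference ranges
```

A specimen that falls outside the reference range on a large fraction of independent measurements, as the Batang Toru skull does, is strong morphological evidence of a distinct population.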
Overall, the Batang Toru male has a smaller skull, but bigger canines. Combining the genetic, vocal and morphological sources of evidence, we have confidently concluded that the Batang Toru orangutan population is a newly discovered species – and one whose future is already under threat.
Despite the heavy exploitation of the surrounding areas (hunting, habitat alteration and other illegal activities), the communities surrounding the habitat of the Tapanuli orangutan still give us the opportunity to see and census the surviving population. Unfortunately, we believe that the population is fewer than 800 individuals.
Of the habitat itself, no more than 10 square km remains. Future development has been planned for that area, and about 15% of the orangutans’ habitat has non-protected forest status.
The discovery of a third orangutan species in the 21st century shows that the great apes are more diverse than we knew, making it all the more important to conserve these various groups.
Without the strong support of, and participation from, the communities surrounding its habitat, the future of the Tapanuli orangutan will be uncertain. Government, researchers and conservation institutions must make a strong collaborative effort to make sure that this third orangutan will survive long after its discovery.