Thursday, June 17, 2021

New discovery shows Tibet as crossroads for giant rhino dispersal

CHINESE ACADEMY OF SCIENCES HEADQUARTERS

Research News

The giant rhino, Paraceratherium, is considered the largest land mammal that ever lived and was mainly found in Asia, especially China, Mongolia, Kazakhstan, and Pakistan. How this genus dispersed across Asia was long a mystery, however. A new discovery has now shed light on this process.

Prof. DENG Tao from the Institute of Vertebrate Paleontology and Paleoanthropology (IVPP) of the Chinese Academy of Sciences and his collaborators from China and the U.S.A. recently reported a new species Paraceratherium linxiaense sp. nov., which offers important clues to the dispersal of giant rhinos across Asia.

The study was published in Communications Biology on June 17.

The new species' fossils comprise a completely preserved skull and mandible with their associated atlas, as well as an axis and two thoracic vertebrae from another individual. The fossils were recovered from the Late Oligocene deposits of the Linxia Basin in Gansu Province, China, which is located on the northeastern border of the Tibetan Plateau.

Phylogenetic analysis yielded a single most parsimonious tree, which places P. linxiaense as a derived giant rhino within the monophyletic clade of the Oligocene Asian Paraceratherium. Within this clade, the analysis recovered a series of progressively more derived species, from P. grangeri through P. huangheense, P. asiaticum, and P. bugtiense, finally terminating in P. lepidum and P. linxiaense. P. linxiaense shows a high level of specialization, similar to P. lepidum, and both are derived from P. bugtiense.

Adaptation of the atlas and axis to the giant rhino's large body and long neck already characterized P. grangeri and P. bugtiense and was further developed in P. linxiaense, whose elongated atlas indicates a long neck, and whose higher axis has a nearly horizontal posterior articular face. These features are correlated with a more flexible neck.

The giant rhino of western Pakistan is known from Oligocene strata and represents a single species, Paraceratherium bugtiense. The rest of the genus Paraceratherium, on the other hand, which is distributed across the Mongolian Plateau, northwestern China, and the area north of the Tibetan Plateau to Kazakhstan, is highly diversified.

The researchers found that the six species of Paraceratherium form a monophyletic clade that is sister to Aralotherium, within which P. grangeri is the most primitive, succeeded by P. huangheense and P. asiaticum.

The researchers were thus able to determine that, in the Early Oligocene, P. asiaticum dispersed westward to Kazakhstan and its descendant lineage expanded to South Asia as P. bugtiense. In the Late Oligocene, Paraceratherium returned northward, crossing the Tibetan area to produce P. lepidum to the west in Kazakhstan and P. linxiaense to the east in the Linxia Basin.

The researchers noted the aridity of the Early Oligocene in Central Asia at a time when South Asia was relatively moist, with a mosaic of forested and open landscapes. "Late Oligocene tropical conditions allowed the giant rhino to return northward to Central Asia, implying that the Tibetan region was still not uplifted as a high-elevation plateau," said Prof. DENG.

During the Oligocene, the giant rhino evidently dispersed freely from the Mongolian Plateau to South Asia along the eastern coast of the Tethys Ocean, and perhaps through Tibet. The topographical possibility that the giant rhino crossed the Tibetan area to reach the Indian-Pakistani subcontinent in the Oligocene is also supported by other evidence.

Up to the Late Oligocene, the evolution and migration from P. bugtiense to P. linxiaense and P. lepidum show that the "Tibetan Plateau" was not yet a barrier to the movement of the largest land mammal.


CAPTION

Holotype of Paraceratherium linxiaense sp. nov. The skull and mandible share a scale bar; the anterior and nuchal views each have their own scale bar.

CREDIT

IVPP

This research was supported by the Chinese Academy of Sciences, the National Natural Science Foundation of China, and the Second Comprehensive Scientific Expedition on the Tibetan Plateau.


CAPTION

Distribution and migration of Paraceratherium in Oligocene Eurasia. Localities of Early Oligocene species are marked in yellow; Late Oligocene species are marked in red.

CREDIT

IVPP

1098-carat diamond, third-largest in the world, unearthed in Botswana

Jun 17 2021

DEBSWANA DIAMOND COMPANY/FACEBOOK
Mining company Debswana discovered the 1098-carat diamond in Botswana at the beginning of June.

A mining company in Botswana has unearthed a 1098-carat diamond, the third-largest in the world.

Lynette Armstrong, acting managing director of the Debswana Diamond Company, said this was the largest diamond recovered by the company in its 50-year history.

“From our preliminary analysis it could be the world’s third-largest gem-quality stone,” Armstrong said, according to The Guardian.

It was discovered on June 1 at Jwaneng Mine in the country’s south. The stone, which hadn’t yet been named, measured 73mm by 52mm by 27mm.

After a year in which coronavirus restrictions caused a slump in diamond exports, Botswana's president said on Thursday that the country needs to diversify its economy.


The Debswana team spotted the diamond and ensured it was protected as it moved through the recovery process. Before this, the biggest diamond found at the Jwaneng Mine, known as the Prince of Mines, was a 446-carat stone recovered in 1993.


Armstrong said the “rare and extraordinary” gem brought hope to the struggling nation.

The diamond was presented to Botswana President Dr Mokgweetsi Masisi on Wednesday (local time), who said proceeds from its sale would be used to advance national development in the country, as was the norm.

Armstrong said no decision had been made on how to sell the stone, which had yet to be valued.

They were deciding whether to sell it through the De Beers channel, which was predominantly owned by Anglo American, or through the state-owned Okavango Diamond Company.

Debswana was a joint venture between De Beers and the Botswana government.

The largest diamond ever found was the Cullinan Diamond discovered in South Africa in 1905 weighing in at 3107 carats, followed by the 1111-carat Lesedi La Rona diamond discovered in Botswana in 2015.



SPACE RACE 1.5
Chinese astronauts enter new space station for the first time

Sam McNeil
NZ STUFF
 Jun 18 2021

China's state television CCTV announced that the country's crewed spacecraft Shenzhou-12 docked with space station module Tianhe on Thursday.

Three Chinese astronauts have arrived at China's new space station, starting a three-month mission that marks another milestone in the country’s ambitious space programme.

Their Shenzhou-12 craft connected with the space station module about six hours after taking off from the Jiuquan launch centre on the edge of the Gobi Desert (overnight into Friday, NZ time).

About three hours later, commander Nie Haisheng, 56, followed by Liu Boming, 54, and space rookie Tang Hongbo, 45, opened the hatches and floated into the Tianhe-1 core living module. Pictures showed them busy at work unpacking equipment.

“This represents the first time Chinese have entered their own space station,” state broadcaster CCTV said on its nightly news broadcast.

Their spacecraft lifted off about 01:22 GMT on Thursday from northwest China.

The crew will carry out experiments, test equipment, conduct maintenance and prepare the station for receiving two laboratory modules next year. The mission brings to 14 the number of astronauts China has launched into space since 2003, when it became only the third country, after the former Soviet Union and the United States, to do so on its own.

All appears to have gone smoothly so far. China's leaders hope the mission will be a complete success as the ruling Communist Party prepares to celebrate the centenary of its founding next month.

The astronauts were seen off by space officials, other uniformed military personnel and a crowd of children waving flowers and flags and singing patriotic songs before blasting off at 9:22am (local time) atop a Long March-2F Y12 rocket.


NG HAN GUAN/AP
Chinese astronauts (from left) Tang Hongbo, Nie Haisheng, and Liu Boming wave as they prepare to board for liftoff.

The rocket dropped its boosters about two minutes into the flight followed by the cowling surrounding Shenzhou-12. After about 10 minutes it separated from the rocket's upper section, extended its solar panels and shortly afterward entered orbit.

About a half-dozen adjustments took place over the following six hours to line up the spaceship for docking with the Tianhe-1, or Heavenly Harmony, module.

The travel time is down from the two days it took to reach China's earlier experimental space stations, a result of a “great many breakthroughs and innovations”, the mission’s deputy chief designer, Gao Xu, told state broadcaster CCTV.

“So the astronauts can have a good rest in space, which should make them less tired,” Gao said.

A Long March-2F Y12 rocket carrying a crew of Chinese astronauts in a Shenzhou-12 spaceship lifts off at the Jiuquan Satellite Launch Centre in Jiuquan, northwestern China.

Other improvements include an increase in the number of automated and remote-controlled systems that should “significantly lessen the pressure on the astronauts”, Gao said.

Two astronauts on China’s earlier missions were women, and while this first station crew is all male, women are expected to be part of future station crews.

The mission is the third of 11 planned through next year to add the station’s remaining sections and send up crews and supplies. A fresh three-member crew and a cargo ship with supplies will be sent in three months.

China is not a participant in the International Space Station, largely as a result of US objections to the Chinese programme’s secrecy and close military ties. However, China has been stepping up co-operation with Russia and a host of other countries, and its station may continue operating beyond the International Space Station, which is reaching the end of its functional life.

China landed a probe on Mars last month that carried a rover, the Zhurong, and earlier landed a probe and rover on the moon's less explored far side and brought back the first lunar samples by any country’s space programme since the 1970s.

China and Russia this week also unveiled an ambitious plan for a joint International Lunar Research Station running through 2036.

NG HAN GUAN/AP
Spectators cheered as the astronauts boarded their ship for a historic mission.

That could compete, and possibly conflict, with the multinational Artemis Accords, a blueprint for space co-operation that supports Nasa’s plans to return humans to the moon by 2024 and to launch a historic human mission to Mars.

After the Tianhe-1 was launched in April, the rocket that carried it into space made an uncontrolled re-entry to Earth, though China dismissed criticism of the potential safety hazard. Usually, discarded rocket stages re-enter the atmosphere soon after liftoff, normally over water, and don’t go into orbit.

The rocket used Thursday is of a different type and the components that will re-enter are expected to burn up long before they could be a danger, said Ji Qiming, assistant director of the China Manned Space Agency.

US top court rejects Republican bid to rescind Obamacare
The Supreme Court justices have dismissed a challenge to the Obama-era healthcare law by a 7-2 vote, preserving insurance coverage for millions of Americans.

The US Supreme Court building is seen on the day SCOTUS rejected a Republican bid to invalidate the Obamacare healthcare law in Washington, US on June 17, 2021. (Reuters)

The US Supreme Court has dismissed a challenge to the Obama era health care law, preserving insurance coverage for millions of Americans.

The justices, by a 7-2 vote, left the entire law intact on Thursday in ruling that Texas, other Republican-led states and two individuals had no right to bring their lawsuit in federal court.

The law’s major provisions include protections for people with pre-existing health conditions, a range of no-cost preventive services and the expansion of the Medicaid programme that insures lower-income people, including those who work in jobs that don’t pay much or provide health insurance.

Also left in place is the law’s now-toothless requirement that people have health insurance or pay a penalty.

Congress rendered that provision irrelevant in 2017 when it reduced the penalty to zero.



Law under third attack

The elimination of the penalty had become the hook that Texas and other Republican-led states, as well as the Trump administration, used to attack the entire law.

They argued that without the mandate, a pillar of the law when it was passed in 2010, the rest of the law should fall, too.

And with a more conservative Supreme Court that includes three Trump appointees, opponents of “Obamacare” hoped a majority of the justices would finally kill off the law they have been fighting against for more than a decade.

But the third major attack on the law at the Supreme Court ended the way the first two did, with a majority of the court rebuffing efforts to gut the law or get rid of it altogether.

Trump’s three appointees to the Supreme Court — Justices Amy Coney Barrett, Neil Gorsuch and Brett Kavanaugh — split their votes. Kavanaugh and Barrett joined the majority.

Gorsuch was in dissent, signing on to an opinion from Justice Samuel Alito.


Republican efforts

Justice Stephen Breyer wrote for the court that the states and people who filed a federal lawsuit “have failed to show that they have standing to attack as unconstitutional the Act’s minimum essential coverage provision.”

In dissent, Alito wrote, “Today’s decision is the third instalment in our epic Affordable Care Act trilogy, and it follows the same pattern as instalments one and two. In all three episodes, with the Affordable Care Act facing a serious threat, the Court has pulled off an improbable rescue.”

Alito was a dissenter in the two earlier cases, as well.

Republicans pressed their argument to invalidate the whole law even though congressional efforts to rip out the entire law "root and branch," in Senate GOP Leader Mitch McConnell’s words, have failed.

The closest they came was in July 2017 when Arizona Sen. John McCain, who died the following year, delivered a dramatic thumbs-down vote to a repeal effort by fellow Republicans.

Chief Justice John Roberts said during arguments in November that it seemed the law’s foes were asking the court to do work best left to the political branches of government.


Flipped American ratings

The court’s decision preserves benefits that became part of the fabric of the nation’s health care system even as Republicans repeatedly tried to rip out Obamacare — in McConnell's words — “root and branch.”

Polls show that the 2010 health care law grew in popularity as it endured the heaviest assault.

In December 2016, just before Obama left office and Trump swept in calling the ACA a “disaster,” 46 percent of Americans had an unfavourable view of the law, while 43 percent approved, according to the Kaiser Family Foundation tracking poll.

Those ratings flipped and by February of this year 54 percent had a favourable view, while disapproval had fallen to 39 percent in the same ongoing poll.


Expansion of law

The health law is now undergoing an expansion under President Joe Biden, who sees it as the foundation for moving the US to coverage for all.

His giant Covid-19 relief bill significantly increased subsidies for private health plans offered through the ACA’s insurance markets, while also dangling higher federal payments before the dozen states that have declined the law’s Medicaid expansion.

About 1 million people have signed up with HealthCare.gov since Biden reopened enrolment amid high levels of Covid cases earlier this year.


Popular benefits

The administration says an estimated 31 million people are covered because of the law, most through its combination of Medicaid expansion and marketplace plans.

But its most popular benefit is protection for people with pre-existing medical conditions. They cannot be turned down for coverage on account of health problems, or charged a higher premium.

While those covered under employer plans already had such protections, “Obamacare” guaranteed them for people buying individual policies.

Another hugely popular benefit allowed young adults to remain on their parents’ health insurance until they turn 26.

Before the law, going without medical coverage was akin to a rite of passage for people in their 20s getting a start in the world.

Because of the ACA, most privately insured women receive birth control free of charge. It’s considered a preventive benefit covered at no additional cost to the patient. So are routine screenings for cancer and other conditions.

For Medicare recipients, “Obamacare” also improved preventive care, and more importantly, closed a prescription drug coverage gap of several thousand dollars that was known as the “doughnut hole.”


Source: TRTWorld and agencies
In Push to Find Methane Leaks, Satellites Gear Up for the Hunt


Stemming the methane leaks from landfills, oil fields, natural gas pipelines and more is one of the most powerful levers we have to quickly slow global warming. New satellites are bolstering this urgent mission by pinpointing emitters of this potent greenhouse gas from space.


A computer rendering of a methane-sensing satellite to be launched in 2023 by Carbon Mapper, a U.S. public-private partnership. CARBON MAPPER / PLANET


BY CHERYL KATZ • JUNE 15, 2021

The threat was invisible to the eye: tons of methane billowing skyward, blown out by natural gas pipelines snaking across Siberia. Those plumes of the potent greenhouse gas, released by Russian petroleum operations last year, might once have gone unnoticed. But armed with powerful new imaging technology, a methane-hunting satellite sniffed out the emissions and tracked them to their sources.

Thanks to rapidly advancing technology, a growing fleet of satellites is now aiming to help close the valve on methane by identifying such leaks from space. The mission is critical, with a series of recent reports sounding an increasingly urgent call to cut methane emissions.

While shorter-lived and less abundant than carbon dioxide, methane is much more powerful at trapping heat, making its global warming impact more than 80 times greater in the short term. Around 60 percent of the world’s methane emissions are produced by human activities — with the bulk coming from agriculture, waste disposal, and fossil fuel production. Human-caused methane is responsible for at least 25 percent of today’s global warming, the Environmental Defense Fund estimates. Stanching those emissions, a new Global Methane Assessment by the United Nations Environment Programme stresses, is the best hope for quickly putting the brakes on warming.

“It’s the most powerful lever we have for reducing warming and all the effects that come with climate change over the next 30 years,” said Drew Shindell, an earth sciences professor at Duke University and lead author on the UN report. Scientists stress that major reductions in both carbon dioxide and methane are critical for averting extreme climate change. “It is not a substitute for reducing CO2, but a complement,” Shindell said.

Atmospheric methane levels have surged over the last half-decade, and 2020 saw the biggest one-year jump on record.


Nearly half of the roughly 380 million metric tons of methane released by human activities annually can be cut this decade with available and largely cost-effective methods, according to the UN assessment. Doing so would stave off nearly 0.3 degrees C of warming by the 2040s — buying precious time to get other greenhouse gas emissions under control. The easiest gains can be made by fixing leaky pipelines, stopping deliberate releases such as venting unwanted gas from drilling rigs, and other actions in the oil and gas industry, the UN report says. Capturing fumes from rotting materials in landfills and squelching the gassy belches of ruminant livestock will also help.




For now, though, the trend is running in the opposite direction: The methane concentration in Earth’s atmosphere has been surging over the past half-decade, the NOAA Annual Greenhouse Gas Index shows. And despite the pandemic, 2020 saw the biggest one-year jump on record. The causes of the recent spike are unclear, but could include natural gas fracking, increased output from methane-producing microbes spurred by rising temperatures, or a combination of human-caused and natural forces.


All this, experts say, underscores the need to track down and plug any leaks or sources that can be controlled. Tracing emissions to their source is no easy task, however. Releases are often intermittent and easy to miss. Ground-based sensors can detect leaks in local areas, but their coverage is limited. Airplane and drone surveys are time-intensive and costly, and air access is restricted over much of the world.

That’s where a crop of recently launched and upcoming satellites with increasingly sophisticated tools comes in.


Satellite imagery shows a Russian gas pipeline (left) and highlights huge amounts of methane (right) being emitted from the pipeline on September 6, 2019. KAYRROS AND MODIFIED COPERNICUS DATA, 2019



A cluster of satellites launched by national space agencies and private companies over the last five years have greatly sharpened our view of what methane is being leaked from where. In the next couple of years, new satellite projects are headed for launch — including Carbon Mapper, a public-private partnership in California, and MethaneSAT, a subsidiary of the Environmental Defense Fund — that will help fill in the picture with unprecedented range and detail. These efforts, experts say, will be crucial not just for spotting leaks but also developing regulations and guiding enforcement — both of which are sorely lacking.

“You can’t mitigate what you can’t measure,” said Cassandra Ely, director of MethaneSAT.

Earlier satellites, such as the Japan Aerospace Exploration Agency’s GOSAT, launched in 2009, were able to detect methane, yet their resolution wasn’t good enough to identify specific sources.

But satellite technology is now advancing rapidly, boosting resolution, shrinking size, and gaining a host of cutting-edge capabilities. Powerful new eyes in space include the European Space Agency’s Sentinel 5P (launched in 2017), the Italian Space Agency’s PRISMA (launched 2019), and systems operated by the private Canadian company GHGSat (with satellites launched in 2016, 2020 and 2021). Companies like the French firm Kayrros are using artificial intelligence to enhance satellite imagery, pairing it with air and ground data to provide detailed methane reports.

At any given time, there are about 100 high-volume methane leaks around the world.


In the past year, methane-hunting satellites have made a number of concerning discoveries. Among them: Despite the pandemic, methane emissions from oil and gas operations in Russia rose 32 percent in 2020. Satellites also observed sizeable releases from gas pipelines in Turkmenistan, a landfill in Bangladesh, a natural gas field in Canada, and coal mines in the U.S. Appalachian Basin.


At any given time, according to Kayrros, there are about 100 high-volume methane leaks around the world, along with a mass of smaller ones that add significantly to the total. Targeting emitters on a global scale from space, the European Space Agency says, provides “an important new tool to combat climate change.”


Now, Carbon Mapper is developing what it promises will be the most sensitive and precise tool for spotting point sources yet. The project aims to launch two satellites in 2023, eventually growing to a constellation of up to 20 that will provide near-constant methane and CO2 monitoring around the globe. Partners include NASA’s Jet Propulsion Laboratory, the California Air Resources Board, private satellite company Planet, and universities and nonprofits, with funding from major private donors, including Bloomberg Philanthropies.

The impetus is the current global monitoring gap, said Riley Duren, a remote-sensing scientist at the University of Arizona and Carbon Mapper CEO. “There’s no single organization that has the necessary mandate and resources and institutional culture to deliver an operational monitoring system for greenhouse gases,” Duren said. “At least not in the time frame that we need.” Duren likens Carbon Mapper to the U.S. National Weather Service, as it will provide an “essential public service” with its routine, sustained monitoring of greenhouse gases.


Cows at a dairy farm in Merced, California. Gassy belches of ruminant livestock are a significant source of methane. MARMADUKE ST. JOHN / ALAMY


The project’s main focus is to find super emitters, Duren said. He and colleagues conducted a previous study via methane-sensing airplane surveys of oil and gas operations, landfills, wastewater treatment, and agriculture in California that found that nearly half of the state’s methane output came from less than 1 percent of its infrastructure. Landfills produced the biggest share of the state’s overall emissions in that survey, followed by agriculture and then oil and gas.


The survey pointed out the need to “scale that up and operationalize it globally by going into space,” Duren said. The orbiters will employ “hyperspectral” spectrometers designed by the Jet Propulsion Laboratory, which the project’s website says will provide “unparalleled sensitivity, resolution and versatility.” The minifridge-sized satellites will be able to target a release to within 30 meters, precise enough to identify the exact piece of equipment that’s leaking.


When emissions are detected, subscribers to a rapid alert service will be notified within 24 hours by Planet, a private, San Francisco-based satellite operator that will build and manage the Carbon Mapper satellites.

The satellites will enhance the California Air Resources Board’s monitoring with wider and more frequent coverage, said Jorn Herner, who heads the board’s emissions monitoring program. Monitoring now is done once a quarter, he said. When the full constellation of Carbon Mapper satellites is deployed, that will increase to near-daily. “You just have a much better handle on what’s going on [and] when,” Herner said, “and you’ll be able to address any leaks more quickly.”

The global policies needed to do something about methane emissions aren’t yet in place.


Also joining the orbiting hunters will be MethaneSAT, a satellite that will scan wider areas — up to 200 kilometers in a swath, albeit with lower, 100-meter, resolution. This program uses a special algorithm that generates flux rates from the satellite data. “So instead of just getting a picture or a snapshot, you actually get more of a movie,” said MethaneSAT director Ely. That’s a first for satellite-based sensing and a boon to tracing wind-blown plumes back to their source, she said.


MethaneSAT will focus on the global oil and gas industry and aims to be sensitive enough to reveal the multitude of small methane releases that can account for the majority of emissions, Ely says. The findings will be made available to industry operators, regulators, investors and the public in near-real time. The data, she said, will help “prioritize what makes the most sense in terms of emissions reductions and mitigation.”


Yet while the world’s ability to hunt down methane emitters is growing, the global policies needed to do something about it aren’t yet in place.

Much of the current approach to dialing back methane is dependent on voluntary actions by the oil and gas industry. Satellites can help with that, UN report lead author Shindell and others said, by identifying leaks that, if stopped, will save or make those companies money. “If you capture the methane instead of letting it escape to the atmosphere, you have something quite useful,” Shindell said. “So, there’s a nice financial incentive not to waste it.” But if gas prices aren’t high enough, operators can feel it’s not worth the expense and effort to find, stem, and utilize runaway emissions — making rules and fees a necessary part of the picture.

“Having stronger regulations is really key,” Shindell said.



The global monthly average concentration of methane in the atmosphere. NOAA


Regulations on methane emissions today are a patchwork of local and national measures, with few international agreements that set specific targets, the UN report points out. In the United States, state policies range from fairly strict controls in some states, such as California and Colorado, to little enforcement in Texas and others. The U.S. Senate recently moved to reinstate methane emissions rules for the oil and gas industry that the Trump administration had rescinded; Congress is expected to vote on that action later this month. A Senate bill proposed in March would levy a fee on the industry’s methane output. And the U.S. Environmental Protection Agency just issued a plan to cap methane and other air pollutants from landfills.


The European Union is currently working on new regulations for emissions from the energy sector. However, other big emitters, such as Russia, have almost no methane-restricting policies in effect, according to an International Energy Agency analysis. Ahead of the UN climate change conference in November, the International Energy Forum this month launched its Methane Measurement Methodology Project, giving member nations access to data from the Sentinel 5P satellite, along with analyses from Kayrros, to get a better handle on emissions from the energy industry.

Data from satellites could provide a useful political lever to compel countries to crack down on their emissions, scientists say. Precise measurements on Russian pipeline leaks, for example, could enable the EU, a major customer for Russian oil and gas, to impose border tariffs based on the emissions from production and transport. Better monitoring could also aid recent actions by shareholders and courts compelling major fossil fuel corporations to rein in their greenhouse gas emissions.



Whatever measures come into effect, policymakers and regulators will need eyes in space to keep tabs on whether those rules are working, and to pinpoint violators and incentivize change.

Said Carbon Mapper’s Duren: “There are many uses for just making the invisible visible.”




Cheryl Katz is a Northern California-based freelance science writer covering climate change, earth sciences, natural resources, and energy. Her articles have appeared in National Geographic, Scientific American, Eos and Hakai Magazine, among other publications.
OPINION
The Time Has Come to Rein In the Global Scourge of Palm Oil

An oil palm plantation in Sumatra, Indonesia, shrouded in haze from fires on burning peatland. ULET IFANSASTI / GREENPEACE



The cultivation of palm oil, found in roughly half of U.S. grocery products, has devastated tropical ecosystems, released vast amounts of CO2 into the atmosphere, and impoverished rural communities. But efforts are underway that could curb the abuses of this powerful industry.


BY JOCELYN C. ZUCKERMAN • MAY 27, 2021

A few weeks ago, the Sri Lankan president announced that his government would ban all imports of palm oil, with immediate effect, and ordered the country’s plantation companies to begin uprooting their oil-palm monocultures and replacing them with more environmentally friendly crops. Citing concerns about soil erosion, water scarcity, and threats to biodiversity and public health, President Gotabaya Rajapaksa explained that his aim was to “make the country free from oil palm plantations and palm oil consumption.”

That’s a pretty radical move, and, as someone who’s spent the past few years writing a book about the global palm oil industry, one I fully support. Worldwide, production of palm oil has skyrocketed in recent decades — oil-palm plantations now cover an area larger than New Zealand — but the boom has meant devastation for the planet. The oil palm plant, Elaeis guineensis, thrives at 10 degrees to the north and south of the equator, a swath that corresponds with our tropical rainforests. Though they cover just 10 percent of Earth’s land surface, these ecosystems support more than half of all biodiversity. In Indonesia, the world’s number-one producer of palm oil, habitat loss due largely to industrial agriculture has meant that such iconic species as the Sumatran elephant, orangutan, rhinoceros, and tiger — in addition to various species of hornbill — have been pushed to the brink of extinction. Indigenous peoples who for generations have sourced their food, building materials, and everything else from the archipelago’s forests and rivers have been reduced to eking out existences under donated plastic tarps and begging by the side of the road.

Tropical rainforests are also, of course, vital carbon sinks, and many of them sit upon great expanses of peatlands — soils formed over thousands of years through the accumulation of organic matter. Indonesia claims the planet’s largest concentration of tropical peatlands, and when its palm oil companies drain and burn that land as a precursor to planting, unimaginable quantities of carbon dioxide escape into the atmosphere. The country’s peatlands currently emit more carbon dioxide each year than does the state of California.

These days, palm oil accounts for one-third of total global vegetable oil consumption.


Native to West and Central Africa, where it has long been a pillar of local cuisine and culture, palm oil emerged as a global commodity in the 18th century, when Europeans began sourcing it as a fuel for lighting lamps. It eventually found its way into soaps, candles, and margarines, and served as a lubricant for the machines driving the Second Industrial Revolution. Around the turn of the 20th century, rubber planters in Malaya and the Dutch East Indies began introducing the crop in that part of the world, and the post-independence governments of Indonesia and Malaysia expanded oil-palm acreage in connection with poverty-alleviation schemes. Having eventually learned to refine, bleach, and deodorize the oil into something all but tasteless, odorless, and invisible, the industry proceeded to find ever-more uses for it. These days, palm oil accounts for one-third of total global vegetable oil consumption, and some derivative of the plant lurks in roughly half of all products in U.S. grocery stores, from shampoos and lipsticks to non-dairy creamers and doughnuts.

India, now the world’s number-one importer of the oil, went from buying 30,000 metric tons in 1992 to 8.4 million in 2020. China saw an increase from 800,000 metric tons to 6.8 million over the same period. Here in the United States, imports have risen steadily since the mid-2000s, in part as a result of the Food and Drug Administration’s warnings about trans fats. Semi-solid at room temperature, palm oil, which has no trans fats, proved an ideal replacement for the partially hydrogenated oils that processed-foods manufacturers had previously used to enhance the texture and extend the shelf life of their cookies and crackers. At around the same time, government biofuels mandates in the United States saw more domestic corn and soy oil being diverted to cars, leaving a vacuum increasingly filled by palm — and spurring producer countries to amp up the supply.

Trade liberalization and economic growth in middle-income countries over the last two decades have led to a surge of palm oil flowing across international borders, where it has enabled the production of ever-greater amounts of deep-fried snacks and ultra-processed foods. (Though we often look to sugar as the culprit for the world’s weight woes, refined vegetable oils have added far more calories to the global diet in the last half-century than any other food group.) Rates of obesity, diabetes, and heart disease are soaring in the poorer countries where the multinational companies that peddle such junk are focused on growing their markets.

Though many of the companies that produce, trade, and source palm oil have signed zero-deforestation commitments and otherwise pledged to protect the environment and human rights (palm oil production has been linked repeatedly to land-grabbing and labor abuses), oil palm fruit grown illegally on peatlands and other protected areas routinely makes its way into our kitchens and bathrooms. Nor has the Kuala Lumpur–based watchdog group known as the Roundtable on Sustainable Palm Oil, or RSPO, succeeded in reining in the industry.


An area cleared for an oil palm plantation in West Kalimantan, Indonesia. MUHAMMAD ADIMAJA / GREENPEACE


But that doesn’t mean that Westerners are off the hook: Last week, a Washington, DC–based think tank published a report finding that international markets for commodities like palm oil are by far the most important driver of global deforestation, the majority of which happens illegally. In Indonesia, the researchers found, at least 81 percent of the forested land cleared to produce palm oil was cleared in violation of the law. While consumers and activists aligned with such groups as the Rainforest Action Network and Greenpeace have done their part to force concessions from the industry, without genuine buy-in from Western governments and consumer-facing corporations, activist campaigns will only get so far.

Now there may be reason for hope. A few weeks ago, during President Biden’s climate summit, a group including the U.S., Britain, and Norway — along with such companies as Amazon, Airbnb, Unilever, and Nestlé — introduced an ambitious initiative called Lowering Emissions by Accelerating Forest finance, or LEAF, aimed at creating an international marketplace in which carbon credits can be sold in exchange for avoiding deforestation. The scheme, which kicks off with a pledge of $1 billion, is meant to improve upon the program known as REDD+ (Reducing Emissions from Deforestation and forest Degradation), the United Nations initiative introduced in 2008, by working with larger units of land, thereby avoiding deforestation simply being displaced to other forest patches. Its proponents believe that by offering a consistent, long-term source of demand for developing countries that effectively protects their tropical forests, the LEAF marketplace will make forests more valuable to those countries — and their often-corrupt leaders — than if they are cut down to grow agricultural commodities like palm oil.

In other good news, Senator Brian Schatz, Democrat from Hawaii, recently announced plans to introduce legislation that would put in place import requirements for agricultural commodities associated with illegal deforestation. “I don’t think the average consumer knows that half the stuff they buy at the supermarket contains palm oil,” Schatz said, “and most of palm oil is from illegally deforested land.”

The WHO compared the tactics used by the palm oil industry to those employed by the tobacco and alcohol lobbies.

Modeled on the 1900 Lacey Act, which banned trafficking in illegal wildlife (it was amended in 2008 to include plant and plant products such as timber and paper), the bill would oblige companies bringing commodities like palm oil into the U.S. to know where the goods originated and to ensure they were produced in compliance with the laws of the country in which they were grown. The bill would also make it possible for U.S. courts to prosecute companies laundering illegally sourced products and would provide aid to countries that commit to eliminating illegal deforestation. Britain and the European Union are in the process of developing similar regulatory measures to reduce the negative impacts of their trade in agricultural commodities.




Big Palm Oil will undoubtedly push back — in 2019, the World Health Organization compared the tactics used by the $65 billion industry to those employed by the tobacco and alcohol lobbies — but if there were ever a time for governments to stand their ground, now is that time. Last week, the International Energy Agency reported that to have any chance of meeting the temperature target set in the Paris accord, investment in new fossil fuel supply projects has to cease immediately. We also need to slam the brakes on tropical deforestation. Ripping out an entire nation’s oil-palm acreage, as Sri Lanka is doing, may not be the most practical way to solve our intertwined climate, biodiversity, and health crises, but it’s a step in the right direction.



Jocelyn C. Zuckerman is the author of Planet Palm, an account of how the soaring global use of palm oil in food and consumer products has had devastating impacts on tropical forests, biodiversity, and subsistence communities. A Brooklyn-based writer specializing in the environment, agriculture, and the Global South, Zuckerman was formerly deputy editor of Gourmet. Her work has appeared in The New York Times Magazine, Fast Company, and Audubon, among other places.

As Climate Warms, a Rearrangement of World’s Plant Life Looms

Ponderosa pines, now widely distributed in North America, were exceedingly rare during the last ice age. WOLFGANG KAEHLER / GETTY IMAGES


BY ZACH ST. GEORGE • JUNE 17, 2021

Previous periods of rapid warming millions of years ago drastically altered plants and forests on Earth. Now, scientists see the beginnings of a more sudden, disruptive rearrangement of the world’s flora — a trend that will intensify if greenhouse gas emissions are not reined in.

Some 56 million years ago, just after the Paleocene epoch gave way to the Eocene, the world suddenly warmed. Scientists continue to debate the ultimate cause of the warming, but they agree on its proximate cause: A huge burst of carbon dioxide entered the atmosphere, raising Earth’s average temperature by 7 to 14 degrees Fahrenheit. The Paleocene-Eocene Thermal Maximum (PETM), as this event is known, is “the best geologic analog” for modern anthropogenic climate change, said University of Wyoming paleobotanist Ellen Currano.

She studies how the PETM’s sudden warmth affected plants. Darwin famously compared the fossil record to a tattered book missing most of its pages and with all but a few lines obscured. The PETM, which lasted roughly 200,000 years, bears out the analogy. Wyoming’s Bighorn Basin is the only place on Earth where scientists have found plant macrofossils (visible to the naked eye, that is) that date to the PETM. The fossil leaves that Currano and her colleagues have found there paint a vivid portrait.

Before the PETM, she said, there lived a forest of cypress, sycamores, alders, dogwoods, walnuts and other species, all of them suggestive of a temperate climate — a bit swampy, perhaps not unlike that of the southeastern United States. Then, with the onset of the PETM, that forest disappeared, its trees vanishing from the fossil record. “During the climate event you have a nearly complete turnover of plants,” Currano said. A new forest appeared, this one consisting of palms, heat-tolerant members of the bean family, and other plants evocative of the semi-arid tropics.

It is a story repeated throughout the fossil record: When the climate changes, so does the arrangement of the world’s plants. Species move back and forth toward the poles, up and downslope. Some species grow more common, others rarer. Species arrange themselves together in new combinations. The fossil record reveals plants for what they are, as mobile beings. For plant species, migrating in response to climate change is often a matter of survival.

Warmth-loving plants are growing more common, from the middle of the Amazon to the middle of Nebraska.


As human-generated greenhouse gas emissions cause the world to rapidly warm, this movement is once again under way. Scientists have observed plants shifting toward the poles and upslope. They’ve noted old ecosystems suddenly replaced by new ones, often in the wake of fire, insect outbreaks, drought or other disturbances. They’ve observed an increase in the number of trees dying and watched as a growing number of the world’s biggest and oldest plants, including the baobabs of Africa and the cedars of Lebanon, have succumbed. Just this month, scientists announced that the Castle Fire, which burned through California’s Sierra Nevada last year, singlehandedly killed off more than 10 percent of the world’s mature giant sequoias.

So far, many of these changes are subtle, seemingly unrelated to one another, but they are all facets of the same global phenomenon — one that scientists say is likely to grow far more apparent in the decades to come.

The climate is currently warming at least 10 times faster than it did at the onset of the PETM. Under its worst-case scenario, the Intergovernmental Panel on Climate Change projects that, over the next 100 to 150 years, Earth’s average temperature could rise by roughly the same amount as it did during the PETM. Dramatic vegetational shifts could arrive not in a matter of centuries or millennia, but decades; a 2019 study, for example, projected that Alaska’s vast interior forests will shift from being dominated by conifers to being dominated by broadleaf trees as soon as the middle of this century.

Scientists debate what this floral rearrangement will look like. In some places, it may take place quietly and be easily ignored. In others, though, it could be one of the changing climate’s most consequential and disruptive effects. “There’s a whole lot more of this we can expect over the next decades,” said University of Wisconsin-Madison paleoecologist Jack Williams. “When people talk about wildfires out West, about species moving upslope — to me, this is just the beginning.”


These baobab trees, near Morondava, Madagascar, are up to 2,800 years old. Scientists attribute the sudden deaths of ancient baobabs in recent years to climate change. ATLANTIDE PHOTOTRAVEL / GETTY IMAGES


Williams is a senior co-author on a study published this month in Science that provides context on floral change in the present and recent geological past. Led by University of Bergen ecologists Ondřej Mottl and Suzette Flantua, the team of researchers used more than 1,000 fossil pollen records collected from around the world to compare rates of floral change over the last 18,000 years. It is the largest such study of its type, Williams said, representing many thousands of hours of combined scientific effort.

The researchers found that the rate of change peaked first as the world warmed at the end of the last ice age. Then, the rate of change began climbing even faster beginning between 2,000 and 4,000 years ago. This was a period when the global climate was relatively stable, so the changes were likely due to human activities. The study suggests that people, who have spent thousands of years rearranging the world’s plants for agriculture and other reasons, currently remain the strongest driver of change in the shifts of the world’s plants. But it also affirms how powerfully climate has driven change and suggests how it might again. “There’s likely a human legacy from quite some time ago,” Flantua said. “On top of that we’re adding a quite massive change in temperature. It is a dangerous combination.”



How will floral change look and feel to those living through it? While the fossil record offers a useful sense of the big picture, it is often fuzzy on the specifics, particularly at the scales of years and decades. Scientists trying to track the comings and goings of plant species in the present face a similar problem. Plants are constantly casting off seeds and spores, little genetic fingers that will grab hold wherever they are able. When physical or biological conditions change, so do the places where various plant species can find purchase; over time, the range and abundance of the whole species shifts. That’s how it works in theory, anyway. Catching it happening is another matter. To do so, scientists need long-term records for comparison. Such records are unevenly distributed around the world, and all are of either limited geographic or temporal scope; global satellite imagery, for instance, dates only to the 1970s.

Still, in the places where scientists do have long-term historical records, they’ve tended to find plants on the move in recent decades. Shrubs are popping up across the Arctic. New species of plants are colonizing mountain summits. In one of the most wide-ranging studies of floral range shifts, a group of researchers led by University of Miami ecologist Kenneth Feeley used herbarium data to track how plant communities across the Western Hemisphere had changed from 1970 to 2011. Comprising 20,000 species and 20 million individual observations, the data shows that warmth-loving plants were growing more common nearly everywhere the researchers looked, “from the middle of the Amazon to the middle of Nebraska,” Feeley said.

Some species can migrate remarkably fast, perhaps as much as a mile a year.

This type of floral change will likely often go unnoticed by people, said Yale School of the Environment geographer Jennifer Marlon, who studies the public’s perception of climate change. People, she said, are attuned to the wild variation between days and weeks and seasons, not the long-term shifts wrought by the changing climate. People also tend to have a short memory of their surroundings, a phenomenon known as the “shifting baseline.” “We just forget very quickly what the baseline was,” she said. “We tend to normalize change around us.”

The species whose migration we’ll likely notice first are those of agricultural, commercial or cultural importance. University of Maine paleoecologist Jacquelyn Gill points to sugar maple, whose range scientists project will shift far to the north in the coming decades. “As an ecologist, I’m happy that sugar maple is tracking the climate,” Gill said — it is a sign of resilience. On the other hand, she said, “As a person who lives in Maine and loves maple syrup, I am extremely concerned for the impact of sugar maple’s movements on a food I care about, on my neighbors’ livelihoods, and on the tourist industry.”

These shifts in species’ ranges also have serious implications for conservationists. Experts say the changing climate means that Sequoia National Park will eventually be left without its sequoias, Joshua Tree National Park without its Joshua trees. As with Gill’s sugar maples, this is distressing from a human perspective, though potentially of little importance from the plants’ perspective. The question is whether sequoias, Joshua trees, and countless other plants will be able to reach newly suitable habitats. For decades, scientists have debated whether plants would be able to track the rate of climate change, and whether people should intervene to help rare, isolated species reach more suitable habitat.

On the one hand, fossil evidence from the late Pleistocene and early Holocene suggests that some species can migrate quickly, perhaps more than a mile per year. On the other hand, studies in Europe and North America suggest that many tree species did not keep up with the climate as it warmed at the end of the Pleistocene.


A wildfire in Australia's Blue Mountains National Park in 2018. More and hotter fires are accelerating changes in flora globally. ANDREW MERRY / GETTY IMAGES


The fossil record is clear on one point, said Steve Jackson, an ecologist and director of the U.S. Geological Survey’s Southwest and South Central Climate Adaptation Science Centers: Each of the world’s 400,000-odd species of plants will respond differently to the changing climate. Changing conditions will tip the odds of competition toward some, away from others. Today, for example, the ponderosa pine is the most widely distributed pine tree in North America, growing from British Columbia to California’s Mexican border and as far east as Nebraska. It stands alongside sagebrush, creosote and saguaro as a floral symbol of the arid West.

But at the height of the last ice age, 20,000 years ago, the tree seems to have been exceedingly rare, apparently confined to a couple tiny patches of Arizona and New Mexico. “Climate change is not going to affect everyone equally,” Jackson said. “There are going to be winners and losers.”

Not all of the rearrangement of the world’s flora will happen slowly or subtly. As Gill pointed out, for the composition of an ecosystem to change, members of new species need to arrive, but members of old species also need to make way. “Death has to be part of that story,” she said. Mature plants, especially long-lived plants like trees, are often capable of surviving under physical conditions that no longer suit their seedlings. “Trees can hang out a really long time in unsuitable climates and just not reproduce,” she said. Like a rubber band, this disequilibrium between plants and environment stretches and stretches. When the mature plants die, the tension is suddenly released. New species flood in.

Scientists around the world have noted ecosystems transforming suddenly into new states — from dense forest to open woodland, for example, or woodland to brushland. Most often, these transformations come in the wake of a fire, insect outbreak or heat wave — all of which are expected to increase in intensity as the world grows warmer. Warmer temperatures stress plants and speed up the lifecycles of the insects that attack them. Currano and her colleagues have found that during the PETM, insects did far more damage to leaves than they did before or after.

The tendency of climate change to kill trees means that tree-planting campaigns alone won’t provide a panacea.

Fire is likely to be an especially visible catalyst of change, said Marlon, the Yale geographer. The disastrous wildfires in Australia, California and the Amazon in recent years are previews of what to expect in coming decades. “We have plenty more to burn,” she said.

Some scientists expect that, in many parts of the world, floral change will remain mostly unnoticeable. Threshold-type changes, driven by fire and mass forest die-back, they say, will be concentrated in places already experiencing those events. Others think sudden changes could soon become more widespread. “The problem is, it’s not about the average conditions. It’s about the extremes,” said University of Arizona ecologist David Breshears. He pointed to a month-long heat wave that struck Western Australia in 2011. It killed 20 percent of the trees in the affected area. Such heat waves are especially deadly when paired with droughts like the one currently gripping much of the American West.

As Breshears and climate scientist Jonathan Overpeck pointed out in a recent editorial, the tendency of climate change to kill trees means that, by themselves, tree-planting campaigns won’t provide a climate-mending panacea. “The first-order action has to be emissions reduction,” Breshears said.




Currano agreed. The high temperatures of the PETM lasted for about 180,000 years. The fossil beds of Wyoming show that, when it ended, most of the species that had existed before the PETM returned. It is a sign of the flora’s resilience to climate change, Currano said: “We don’t see a big extinction event.” But that hopeful message can only be applied to the modern day if humans stop pumping carbon dioxide into the atmosphere, she said: “Otherwise we are in unprecedented conditions, and the PETM is the best-case scenario.”

Zach St. George is a freelance reporter based in Baltimore. His first book, The Journeys of Trees, was published last summer.


Why the Approval of That Alzheimer’s Drug Is So Disturbing

Drugs that merely fiddle with the body’s physiology provide a false sense of control—at a cost.

BY TIM REQUARTH
JUNE 17, 2021

Last week, the Food and Drug Administration ignored the advice of its own expert advisory committee and approved the first new treatment for Alzheimer’s in 18 years. Called Aduhelm, it carries a substantial risk of painful brain swelling and bleeding, requires monthly infusions, and comes with an eye-popping list price of $56,000 per year. These caveats might be fine if the drug, which is manufactured by Biogen, miraculously restored the memories lost by the 6 million Americans with Alzheimer’s—or at least measurably improved the lives of patients in some meaningful way. But even according to the FDA’s own statisticians, the clinical data fail to show the new drug can slow Alzheimer’s devastating cognitive decline.

The FDA’s surprise approval has ignited a firestorm within the medical community. People are justifiably angry about the felonious cost for a risky drug that may offer little, if any, benefit. Aduhelm is, for now, a confusing and foregone conclusion; Biogen is slated to ship the drug starting next week. But a closer look at the FDA’s approval process reveals a deeper scientific issue at stake about what constitutes adequate evidence for desperately needed treatments. How the Aduhelm saga played out could have far-reaching implications not just for Alzheimer’s patients, but for anyone taking a drug approved by the FDA in the future.

To understand why the FDA approved the drug, and why this approval is so problematic beyond Aduhelm itself, it’s helpful to understand a bit about how clinical studies are designed. Some drugs are approved based on real-world outcomes, such as whether they prevent death. But others are approved based on so-called surrogate outcomes, such as whether they, say, suppress an abnormal heartbeat. If the surrogate outcome (abnormal heartbeat) is meaningfully associated with the real-world outcome (death), it follows that a drug suppressing the abnormal heartbeat should help prevent death. Aduhelm doesn’t seem to show much benefit to patients according to a clinical dementia rating scale (a real-world outcome, approximately—some researchers say it’s not quite as firm as an outcome like “death”). But the drug does do one thing well—it removes some of the amyloid plaques that build up in the brains of Alzheimer’s patients (a surrogate outcome). It’s seemingly on the basis of this surrogate outcome that the FDA saved Aduhelm from joining a long list of failed Alzheimer’s treatments.

Approving drugs based on the surrogate-outcome approach, in principle, offers huge advantages. It’s easier, cheaper, and faster to design a clinical study that measures a drug’s effect on abnormal heartbeats than to wait for enough people to suffer a heart attack and die. Drug approvals based on surrogate outcomes have indeed proved to be lifesaving in the past. In 1992, in the midst of the HIV crisis, the FDA created a special “Accelerated Approval” program to fast-track urgently needed treatments based on surrogate outcomes alone. (The only catch was companies had to perform post-approval confirmatory studies to assess real-world outcomes like sickness and death, or the FDA could withdraw the drug.) The first wave of treatments approved in this program were HIV drugs, brought to market based on improving surrogate outcomes such as levels of a type of white blood cell. These drugs were later proved to prevent sickness and death from AIDS, which wasn’t surprising because scientists had established a tight linkage between white blood cell count and disease progression. The surrogate-outcome approach was considered a success.

But HIV drugs are among the only unqualified successes for surrogate approvals. Joseph Ross, a professor of medicine and public health at Yale University, studies the FDA regulatory process and notes that apart from the HIV drugs, “in almost every other instance, surrogates have proven to be far more fallible.” Several diabetes drugs have been approved because they lower hemoglobin A1c, a measure of average blood sugar levels. That might seem clearly useful. However, these drugs have wildly different effects on diabetic complications and overall mortality. At least one drug is thought to increase heart attacks. Cancer drugs with astronomical price tags might reduce tumor size but fail to prolong life or improve its quality. The list goes on: Blockbuster drugs lower cholesterol but fail to prevent cardiovascular disease. Osteoporosis drugs improve bone density but fail to decrease fractures. And in some cases, these drugs have damaging side effects only discovered years later.

In fact, the writing has long been on the wall that the surrogate-outcome approach can be a bit dicey. Take the heartbeat example above—which is actually real. In the mid-’80s, two class IC antiarrhythmics were brought to market largely based on the evidence that they suppressed abnormal heartbeats. Yet no one had ever done a clinical study to confirm the drugs actually prevented death. When such a study was conducted years later, it had to be stopped midway through: It turned out these drugs actually increased the chance of death. Although the human toll is impossible to know, the surrogate-outcome approach in this case may have caused thousands of unnecessary deaths. Although the link between abnormal heartbeats and death was plausible in theory, it turned out quite differently in the real world. There’s a clear lesson here about using surrogate outcomes to evaluate a drug: Without strong evidence of a link between the surrogate outcome and the clinical outcome, the FDA should be wary of greenlighting treatments based on surrogate outcomes alone. For HIV drugs, this link was already established, and the drugs panned out. For the heart drugs, this link was less established, and the drugs did not pan out.

Aduhelm now joins this group of drugs that fiddle with the body’s physiology but potentially leave patients worse off and much poorer for it. Recall that with this drug, about 30–40 percent of study participants experienced either brain swelling or bleeding. Not to mention that the $56,000 annual price tag—much of which Medicare is required to cover—could be a fiscal catastrophe for American health care. No one disputes that amyloid plaques appear in the brains of Alzheimer’s patients, and no one doubts Aduhelm can clear amyloid plaques (by a respectable 30 percent), but how valid a surrogate outcome are amyloid plaques for the real-world outcomes Alzheimer’s patients care about? Given the pent-up demand, the worrisome side effects, and the steep price, you’d hope the amyloid hypothesis is a slam dunk. Unfortunately, it’s not even close.

What amyloid plaques mean—and in turn, what Aduhelm’s actual effect on people might be—is still a hotly debated scientific question. Do they cause Alzheimer’s? Do they worsen it? Are they incidental? Are they protective? Scientists have made reasonable cases one way or another, but it’s very far from settled that amyloid plaques are a valid surrogate outcome for Alzheimer’s. What’s worse, the amyloid hypothesis doesn’t have a great track record with regard to therapeutics. Several drugs targeting amyloid plaques have been developed, and every one of them fizzled in clinical trials; not a single one had any effect on dementia. And yet, when approving Aduhelm, the FDA relied almost entirely on the amyloid hypothesis.

What makes the FDA’s decision so baffling is that surrogates are used when researchers can’t, or don’t, collect data that better reflects the real-world outcome of interest—in this case, slowing or preventing cognitive decline due to Alzheimer’s. But that’s exactly the data that Biogen submitted. The two clinical studies—which looked at cognitive decline, something that patients and doctors definitely care about—showed almost no benefit. One study was a total dud, and the other raised a score on a cognitive scale by a minuscule amount. Although that improvement reached statistical significance, it is almost certainly not going to make a difference in the lives of patients. “The effect sizes are trivial from a clinical point of view,” said Chiadi Onyike, a professor of psychiatry and behavioral sciences at Johns Hopkins and a member of the 11-person committee of outside experts tapped by the FDA to review the data. “They are meaningless—no different from the ebb and flow a patient might show from week to week.” Not a single member of the FDA’s outside advisory committee recommended approval.

The fallout has already been widespread. So far, three members of the FDA’s advisory committee have resigned, one of them calling the process a “sham.” Doctors who helped run Biogen’s clinical studies are speaking out, and others are penning editorials saying they won’t be prescribing Aduhelm until they see evidence of effectiveness. But no one should hold their breath. When the FDA greenlit Aduhelm for use, it told Biogen it had nine years to run the confirmatory studies necessary to prove Aduhelm’s effectiveness. That means nine years of people taking a drug that existing data suggests might not do anything meaningful. With Aduhelm poised to become one of the biggest blockbuster drugs in history—analysts estimate annual revenues could peak at $10 billion—Biogen probably isn’t in a hurry. The company might not even have to collect that extra data at all (for its part, Biogen said in an email to Slate, “We are working diligently to initiate the confirmatory trial”). Ross, the Yale FDA regulatory expert, looked at FDA approvals from 2005–12 and found that post-market confirmatory studies—ones that truly verified the clinical value of a surrogate outcome—took place only about 10 percent of the time. Despite this dismal compliance rate, according to Ross, the FDA has never fined a company for failing to do a confirmatory study and rarely uses its power to withdraw a drug later shown to be clinically ineffective. In an email to Slate, the FDA did not say whether it would use its power to withdraw Aduhelm should the drug ultimately prove clinically ineffective, saying only that it “will carefully monitor trial progress and support efforts to complete this trial in the shortest possible timeline.”

What specifically caused the FDA to approve Aduhelm based on such a shaky outcome and over the protests of its own committee is anyone’s guess, even in the context of the FDA’s frequent reliance on surrogate outcomes—which underpin nearly 45 percent of all drug approvals, according to the agency’s own analysis. (In a press release acknowledging the contention around the decision, the FDA explained that it ultimately decided “the benefits of Aduhelm for patients with Alzheimer’s disease outweighed the risks of the therapy.”) Aduhelm may well help fuel a trend toward leaning too heavily on surrogate outcomes: Because the Aduhelm example is so egregious, it establishes a far-reaching precedent that some believe could undermine the regulatory process. “Presumably,” says Ross, “[companies] could look at what just happened with this product and say, ‘Hey, you have to treat me the same way.’ ” In other words, when a company fails to show a drug actually works, why not try for back-door approval based on an unproven idea of how the drug is supposed to work?

This might seem grim, and it is. But there’s a lesson you can take as a patient: just because a number goes up or down at the doctor’s office—whether it’s cholesterol, blood pressure, or even weight—it doesn’t necessarily mean you’re better or worse off for it. Because of the FDA’s widespread endorsement of surrogate metrics, it’s been all too easy for patients—and doctors—to believe these metrics for health are health itself. With a false sense of understanding comes a false sense of hope, a false promise of control. That’s the true tragedy of the Aduhelm approval.