
In Push to Find Methane Leaks, Satellites Gear Up for the Hunt


Stemming the methane leaks from landfills, oil fields, natural gas pipelines and more is one of the most powerful levers we have to quickly slow global warming. New satellites are bolstering this urgent mission by pinpointing emitters of this potent greenhouse gas from space.


A computer rendering of a methane-sensing satellite to be launched in 2023 by Carbon Mapper, a U.S. public-private partnership. CARBON MAPPER / PLANET


BY CHERYL KATZ • JUNE 15, 2021

The threat was invisible to the eye: tons of methane billowing skyward, blown out by natural gas pipelines snaking across Siberia. Those plumes of potent greenhouse gas, released by Russian petroleum operations last year, might once have gone unnoticed. But armed with powerful new imaging technology, a methane-hunting satellite sniffed out the emissions and tracked them to their sources.

Thanks to rapidly advancing technology, a growing fleet of satellites is now aiming to help close the valve on methane by identifying such leaks from space. The mission is critical, with a series of recent reports sounding an increasingly urgent call to cut methane emissions.

While shorter-lived and less abundant than carbon dioxide, methane is much more powerful at trapping heat, making its global warming impact more than 80 times greater in the short term. Around 60 percent of the world’s methane emissions are produced by human activities — with the bulk coming from agriculture, waste disposal, and fossil fuel production. Human-caused methane is responsible for at least 25 percent of today’s global warming, the Environmental Defense Fund estimates. Stanching those emissions, a new Global Methane Assessment by the United Nations Environmental Programme stresses, is the best hope for quickly putting the brakes on warming.

“It’s the most powerful lever we have for reducing warming and all the effects that come with climate change over the next 30 years,” said Drew Shindell, an earth sciences professor at Duke University and lead author on the UN report. Scientists stress that major reductions in both carbon dioxide and methane are critical for averting extreme climate change. “It is not a substitute for reducing CO2, but a complement,” Shindell said.

Atmospheric methane levels surged over the last half-decade. 2020 saw the biggest one-year jump on record.


Nearly half of the roughly 380 million metric tons of methane released by human activities annually can be cut this decade with available and largely cost-effective methods, according to the UN assessment. Doing so would stave off nearly 0.3 degrees C of warming by the 2040s — buying precious time to get other greenhouse gas emissions under control. The easiest gains can be made by fixing leaky pipelines, stopping deliberate releases such as venting unwanted gas from drilling rigs, and other actions in the oil and gas industry, the UN report says. Capturing fumes from rotting materials in landfills and squelching the gassy belches of ruminant livestock will also help.




For now, though, the trend is running in the opposite direction: The methane concentration in Earth’s atmosphere has been surging over the past half-decade, the NOAA Annual Greenhouse Gas Index shows. And despite the pandemic, 2020 saw the biggest one-year jump on record. The causes of the recent spike are unclear, but could include natural gas fracking, increased output from methane-producing microbes spurred by rising temperatures, or a combination of human-caused and natural forces.


All this, experts say, underscores the need to track down and plug any leaks or sources that can be controlled. Tracing emissions to their source is no easy task, however. Releases are often intermittent and easy to miss. Ground-based sensors can detect leaks in local areas, but their coverage is limited. Airplane and drone surveys are time-intensive and costly, and air access is restricted over much of the world.

That’s where a crop of recently launched and upcoming satellites with increasingly sophisticated tools comes in.


Satellite imagery shows a Russian gas pipeline (left) and highlights huge amounts of methane (right) being emitted from the pipeline on September 6, 2019. KAYRROS AND MODIFIED COPERNICUS DATA, 2019



A cluster of satellites launched by national space agencies and private companies over the last five years has greatly sharpened our view of how much methane is being leaked, and from where. In the next couple of years, new satellite projects are headed for launch — including Carbon Mapper, a public-private partnership in California, and MethaneSAT, a subsidiary of the Environmental Defense Fund — that will help fill in the picture with unprecedented range and detail. These efforts, experts say, will be crucial not just for spotting leaks but also for developing regulations and guiding enforcement — both of which are sorely lacking.

“You can’t mitigate what you can’t measure,” said Cassandra Ely, director of MethaneSAT.

Earlier satellites, such as the Japan Aerospace Exploration Agency's GOSAT, launched in 2009, were able to detect methane, but their resolution wasn't good enough to identify specific sources.

But satellite technology is now advancing rapidly, boosting resolution, shrinking size, and gaining a host of cutting-edge capabilities. Powerful new eyes in space include the European Space Agency’s Sentinel 5P (launched in 2017), the Italian Space Agency’s PRISMA (launched 2019), and systems operated by the private Canadian company GHGSat (with satellites launched in 2016, 2020 and 2021). Companies like the French firm Kayrros are using artificial intelligence to enhance satellite imaging, paired with air and ground data, to provide detailed methane reports.

At any given time, there are about 100 high-volume methane leaks around the world.


In the past year, methane-hunting satellites have made a number of concerning discoveries. Among them: Despite the pandemic, methane emissions from oil and gas operations in Russia rose 32 percent in 2020. Satellites also observed sizeable releases from gas pipelines in Turkmenistan, a landfill in Bangladesh, a natural gas field in Canada, and coal mines in the U.S. Appalachian Basin.


At any given time, according to Kayrros, there are about 100 high-volume methane leaks around the world, along with a mass of smaller ones that add significantly to the total. Targeting emitters on a global scale from space, the European Space Agency says, provides “an important new tool to combat climate change.”


Now, Carbon Mapper is developing what it promises will be the most sensitive and precise tool for spotting point sources yet. The project aims to launch two satellites in 2023, eventually growing to a constellation of up to 20 that will provide near-constant methane and CO2 monitoring around the globe. Partners include NASA’s Jet Propulsion Laboratory, the California Air Resources Board, private satellite company Planet, and universities and nonprofits, with funding from major private donors, including Bloomberg Philanthropies.

The impetus is the current global monitoring gap, said Riley Duren, a remote-sensing scientist at the University of Arizona and Carbon Mapper CEO. “There’s no single organization that has the necessary mandate and resources and institutional culture to deliver an operational monitoring system for greenhouse gases,” Duren said. “At least not in the time frame that we need.” Duren likens Carbon Mapper to the U.S. National Weather Service, as it will provide an “essential public service” with its routine, sustained monitoring of greenhouse gases.


Cows at a dairy farm in Merced, California. Gassy belches of ruminant livestock are a significant source of methane. MARMADUKE ST. JOHN / ALAMY


The project’s main focus is to find super emitters, Duren said. In a previous study, he and colleagues flew methane-sensing airplane surveys over oil and gas operations, landfills, wastewater treatment plants, and agriculture in California, and found that nearly half of the state’s methane output came from less than 1 percent of its infrastructure. Landfills produced the biggest share of the state’s overall emissions in that survey, followed by agriculture and then oil and gas.


The survey pointed out the need to “scale that up and operationalize it globally by going into space,” Duren said. The orbiters will employ “hyperspectral” spectrometers designed by the Jet Propulsion Laboratory, which the project’s website says will provide “unparalleled sensitivity, resolution and versatility.” The minifridge-sized satellites will be able to target a release to within 30 meters, precise enough to identify the exact piece of equipment that’s leaking.


When emissions are detected, subscribers to a rapid alert service will be notified within 24 hours by Planet, a private, San Francisco-based satellite operator that will build and manage the Carbon Mapper satellites.

The satellites will enhance the California Air Resources Board’s monitoring with wider and more frequent coverage, said Jorn Herner, who heads the board’s emissions monitoring program. Monitoring now is done once a quarter, he said. When the full constellation of Carbon Mapper satellites is deployed, that will increase to near-daily. “You just have a much better handle on what’s going on [and] when,” Herner said, “and you’ll be able to address any leaks more quickly.”

The global policies needed to do something about methane emissions aren’t yet in place.


Also joining the orbiting hunters will be MethaneSAT, a satellite that will scan wider areas — up to 200 kilometers in a swath, albeit at a coarser 100-meter resolution. This program uses a special algorithm that generates flux rates from the satellite data. “So instead of just getting a picture or a snapshot, you actually get more of a movie,” said MethaneSAT director Ely. That’s a first for satellite-based sensing and a boon to tracing wind-blown plumes back to their source, she said.


MethaneSAT will focus on the global oil and gas industry and aims to be sensitive enough to reveal the multitude of small methane releases that can account for the majority of emissions, Ely says. The findings will be made available to industry operators, regulators, investors and the public in near-real time. The data, she said, will help “prioritize what makes the most sense in terms of emissions reductions and mitigation.”


Yet while the world’s ability to hunt down methane emitters is growing, the global policies needed to do something about it aren’t yet in place.

Much of the current approach to dialing back methane is dependent on voluntary actions by the oil and gas industry. Satellites can help with that, UN report lead author Shindell and others said, by identifying leaks that, if stopped, will save or make those companies money. “If you capture the methane instead of letting it escape to the atmosphere, you have something quite useful,” Shindell said. “So, there’s a nice financial incentive not to waste it.” But if gas prices aren’t high enough, operators can feel it’s not worth the expense and effort to find, stem, and utilize runaway emissions — making rules and fees a necessary part of the picture.

“Having stronger regulations is really key,” Shindell said.



The global monthly average concentration of methane in the atmosphere. NOAA


Regulations on methane emissions today are a patchwork of local and national measures, with few international agreements that set specific targets, the UN report points out. In the United States, state policies range from fairly strict controls in some states, such as California and Colorado, to little enforcement in Texas and others. The U.S. Senate recently moved to reinstate methane emissions rules for the oil and gas industry that the Trump administration had rescinded; Congress is expected to vote on that action later this month. A Senate bill proposed in March would levy a fee on the industry’s methane output. And the U.S. Environmental Protection Agency just issued a plan to cap methane and other air pollutants from landfills.


The European Union is currently working on new regulations for emissions from the energy sector. However, other big emitters, such as Russia, have almost no methane-restricting policies in effect, according to an International Energy Agency analysis. Ahead of the UN climate change conference in November, the International Energy Forum this month launched its Methane Measurement Methodology Project, giving member nations access to data from the Sentinel 5P satellite along with analyses from Kayrros, to get a better handle on emissions from the energy industry.

Data from satellites could provide a useful political lever to compel countries to crack down on their emissions, scientists say. Precise measurements on Russian pipeline leaks, for example, could enable the EU, a major customer for Russian oil and gas, to impose border tariffs based on the emissions from production and transport. Better monitoring could also aid recent actions by shareholders and courts compelling major fossil fuel corporations to rein in their greenhouse gas emissions.



Whatever measures come into effect, policymakers and regulators will need eyes in space to keep tabs on whether those rules are working, and to pinpoint violators and incentivize change.

Said Carbon Mapper’s Duren: “There are many uses for just making the invisible visible.”




Cheryl Katz is a Northern California-based freelance science writer covering climate change, earth sciences, natural resources, and energy. Her articles have appeared in National Geographic, Scientific American, Eos and Hakai Magazine, among other publications.
OPINION
The Time Has Come to Rein In the Global Scourge of Palm Oil

An oil palm plantation in Sumatra, Indonesia, shrouded in haze from fires on burning peatland. ULET IFANSASTI / GREENPEACE



The cultivation of palm oil, found in roughly half of U.S. grocery products, has devastated tropical ecosystems, released vast amounts of CO2 into the atmosphere, and impoverished rural communities. But efforts are underway that could curb the abuses of this powerful industry.


BY JOCELYN C. ZUCKERMAN • MAY 27, 2021

A few weeks ago, the Sri Lankan president announced that his government would ban all imports of palm oil, with immediate effect, and ordered the country’s plantation companies to begin uprooting their oil-palm monocultures and replacing them with more environmentally friendly crops. Citing concerns about soil erosion, water scarcity, and threats to biodiversity and public health, President Gotabaya Rajapaksa explained that his aim was to “make the country free from oil palm plantations and palm oil consumption.”

That’s a pretty radical move, and, as someone who’s spent the past few years writing a book about the global palm oil industry, one I fully support. Worldwide, production of palm oil has skyrocketed in recent decades — oil-palm plantations now cover an area larger than New Zealand — but the boom has meant devastation for the planet. The oil palm plant, Elaeis guineensis, thrives within 10 degrees north and south of the equator, a swath that corresponds with our tropical rainforests. Though they cover just 10 percent of Earth’s land surface, these ecosystems support more than half of all biodiversity. In Indonesia, the world’s number-one producer of palm oil, habitat loss due largely to industrial agriculture has meant that such iconic species as the Sumatran elephant, orangutan, rhinoceros, and tiger — in addition to various species of hornbill — have been pushed to the brink of extinction. Indigenous peoples who for generations have sourced their food, building materials, and everything else from the archipelago’s forests and rivers have been reduced to eking out existences under donated plastic tarps and begging by the side of the road.

Tropical rainforests are also, of course, vital carbon sinks, and many of them sit upon great expanses of peatlands — soils formed over thousands of years through the accumulation of organic matter. Indonesia claims the planet’s largest concentration of tropical peatlands, and when its palm oil companies drain and burn that land as a precursor to planting, unimaginable quantities of carbon dioxide escape into the atmosphere. The country’s peatlands currently emit more carbon dioxide each year than does the state of California.

These days, palm oil accounts for one-third of total global vegetable oil consumption.


Native to West and Central Africa, where it has long been a pillar of local cuisine and culture, palm oil emerged as a global commodity in the 18th century, when Europeans began sourcing it as a fuel for lighting lamps. It eventually found its way into soaps, candles, and margarines, and served as a lubricant for the machines driving the Second Industrial Revolution. Around the turn of the 20th century, rubber planters in Malaya and the Dutch East Indies began introducing the crop in that part of the world, and the post-independence governments of Indonesia and Malaysia expanded oil-palm acreage in connection with poverty-alleviation schemes. Having eventually learned to refine, bleach, and deodorize the oil into something all but tasteless, odorless, and invisible, the industry proceeded to find ever-more uses for it. These days, palm oil accounts for one-third of total global vegetable oil consumption, and some derivative of the plant lurks in roughly half of all products in U.S. grocery stores, from shampoos and lipsticks to non-dairy creamers and doughnuts.

India, now the world’s number-one importer of the oil, went from buying 30,000 metric tons in 1992 to 8.4 million in 2020. China saw an increase from 800,000 metric tons to 6.8 million over the same period. Here in the United States, imports have risen steadily since the mid-2000s, in part as a result of the Food and Drug Administration’s warnings about trans fats. Semi-solid at room temperature, palm oil, which has no trans fats, proved an ideal replacement for the partially hydrogenated oils that processed-foods manufacturers had previously used to enhance the texture and extend the shelf life of their cookies and crackers. At around the same time, government biofuels mandates in the United States saw more domestic corn and soy oil being diverted to cars, leaving a vacuum increasingly filled by palm — and spurring producer countries to amp up the supply.

Trade liberalization and economic growth in middle-income countries over the last two decades have led to a surge of palm oil flowing across international borders, where it has enabled the production of ever-greater amounts of deep-fried snacks and ultra-processed foods. (Though we often look to sugar as the culprit for the world’s weight woes, refined vegetable oils have added far more calories to the global diet in the last half-century than any other food group.) Rates of obesity, diabetes, and heart disease are soaring in the poorer countries where the multinational companies that peddle such junk are focused on growing their markets.

Though many of the companies that produce, trade, and source palm oil have signed zero-deforestation commitments and otherwise pledged to protect the environment and human rights (palm oil production has been linked repeatedly to land-grabbing and labor abuses), oil palm fruit grown illegally on peatlands and other protected areas routinely makes its way into our kitchens and bathrooms. Nor has the Kuala Lumpur–based watchdog group known as the Roundtable on Sustainable Palm Oil, or RSPO, succeeded in reining in the industry.


An area cleared for an oil palm plantation in West Kalimantan, Indonesia. MUHAMMAD ADIMAJA / GREENPEACE


But that doesn’t mean that Westerners are off the hook: Last week, a Washington, DC–based think tank published a report finding that international markets for commodities like palm oil are by far the most important driver of global deforestation, the majority of which happens illegally. In Indonesia, the researchers found, at least 81 percent of the forested land cleared to produce palm oil was cleared in violation of the law. While consumers and activists aligned with such groups as the Rainforest Action Network and Greenpeace have done their part to force concessions from the industry, without genuine buy-in from Western governments and consumer-facing corporations, activist campaigns will only get so far.

Now there may be reason for hope. A few weeks ago, during President Biden’s climate summit, a group including the U.S., Britain, and Norway — along with such companies as Amazon, Airbnb, Unilever, and NestlĂ© — introduced an ambitious initiative called Lowering Emissions by Accelerating Forest finance, or LEAF, aimed at creating an international marketplace in which carbon credits can be sold in exchange for avoiding deforestation. The scheme, which kicks off with a pledge of $1 billion, is meant to improve upon the program known as REDD+ (Reducing Emissions from Deforestation and forest Degradation), the United Nations initiative introduced in 2008, by working with larger units of land, thereby avoiding deforestation simply being displaced to other forest patches. Its proponents believe that by offering a consistent, long-term source of demand for developing countries that effectively protects their tropical forests, the LEAF marketplace will make forests more valuable to those countries — and their often-corrupt leaders — than if they are cut down to grow agricultural commodities like palm oil.

In other good news, Senator Brian Schatz, Democrat from Hawaii, recently announced plans to introduce legislation that would put in place import requirements for agricultural commodities associated with illegal deforestation. “I don’t think the average consumer knows that half the stuff they buy at the supermarket contains palm oil,” Schatz said, “and most of palm oil is from illegally deforested land.”

The WHO compared the tactics used by the palm oil industry to those employed by the tobacco and alcohol lobbies.

Modeled on the 1900 Lacey Act, which banned trafficking in illegal wildlife (it was amended in 2008 to include plant and plant products such as timber and paper), the bill would oblige companies bringing commodities like palm oil into the U.S. to know where the goods originated and to ensure they were produced in compliance with the laws of the country in which they were grown. The bill would also make it possible for U.S. courts to prosecute companies laundering illegally sourced products and would provide aid to countries that commit to eliminating illegal deforestation. Britain and the European Union are in the process of developing similar regulatory measures to reduce the negative impacts of their trade in agricultural commodities.




Big Palm Oil will undoubtedly push back — in 2019, the World Health Organization compared the tactics used by the $65 billion industry to those employed by the tobacco and alcohol lobbies — but if there were ever a time for governments to stand their ground, now is that time. Last week, the International Energy Agency reported that to have any chance of meeting the temperature target set in the Paris accord, investment in fossil fuel supply projects has to cease immediately. We also need to slam the brakes on tropical deforestation. Ripping out an entire nation’s oil-palm acreage, as Sri Lanka is doing, may not be the most practical way to solve our intertwined climate, biodiversity, and health crises, but it’s a step in the right direction.



Jocelyn C. Zuckerman is the author of Planet Palm, an account of how the soaring global use of palm oil in food and consumer products has had devastating impacts on tropical forests, biodiversity, and subsistence communities. A Brooklyn-based writer specializing in the environment, agriculture, and the Global South, Zuckerman was formerly deputy editor of Gourmet. Her work has appeared in The New York Times Magazine, Fast Company, and Audubon, among other places.

As Climate Warms, a Rearrangement of World’s Plant Life Looms

Ponderosa pine, now widely distributed in North America, were exceedingly rare during the last ice age. WOLFGANG KAEHLER / GETTY IMAGES


BY ZACH ST. GEORGE • JUNE 17, 2021

Previous periods of rapid warming millions of years ago drastically altered plants and forests on Earth. Now, scientists see the beginnings of a more sudden, disruptive rearrangement of the world’s flora — a trend that will intensify if greenhouse gas emissions are not reined in.

Some 56 million years ago, just after the Paleocene epoch gave way to the Eocene, the world suddenly warmed. Scientists continue to debate the ultimate cause of the warming, but they agree on its proximate cause: A huge burst of carbon dioxide entered the atmosphere, raising Earth’s average temperature by 7 to 14 degrees Fahrenheit. The Paleocene-Eocene Thermal Maximum (PETM), as this event is known, is “the best geologic analog” for modern anthropogenic climate change, said University of Wyoming paleobotanist Ellen Currano.

She studies how the PETM’s sudden warmth affected plants. Darwin famously compared the fossil record to a tattered book missing most of its pages and with all but a few lines obscured. The PETM, which lasted roughly 200,000 years, bears out the analogy. Wyoming’s Bighorn Basin is the only place on Earth where scientists have found plant macrofossils (visible to the naked eye, that is) that date to the PETM. The fossil leaves that Currano and her colleagues have found there paint a vivid portrait.

Before the PETM, she said, there lived a forest of cypress, sycamores, alders, dogwoods, walnuts and other species, all of them suggestive of a temperate climate — a bit swampy, perhaps not unlike that of the southeastern United States. Then, with the onset of the PETM, that forest disappeared, its trees vanishing from the fossil record. “During the climate event you have a nearly complete turnover of plants,” Currano said. A new forest appeared, this one consisting of palms, heat-tolerant members of the bean family, and other plants evocative of the semi-arid tropics.

It is a story repeated throughout the fossil record: When the climate changes, so does the arrangement of the world’s plants. Species move back and forth toward the poles, up and downslope. Some species grow more common, others rarer. Species arrange themselves together in new combinations. The fossil record reveals plants for what they are, as mobile beings. For plant species, migrating in response to climate change is often a matter of survival.

Warmth-loving plants are growing more common, from the middle of the Amazon to the middle of Nebraska.


As human-generated greenhouse gas emissions cause the world to rapidly warm, this movement is once again under way. Scientists have observed plants shifting toward the poles and upslope. They’ve noted old ecosystems suddenly replaced by new ones, often in the wake of fire, insect outbreaks, drought or other disturbances. They’ve observed an increase in the number of trees dying and watched as a growing number of the world’s biggest and oldest plants, including the baobabs of Africa and the cedars of Lebanon, have succumbed. Just this month, scientists announced that the Castle Fire, which burned through California’s Sierra Nevada last year, singlehandedly killed off more than 10 percent of the world’s mature giant sequoias.

So far, many of these changes are subtle, seemingly unrelated to one another, but they are all facets of the same global phenomenon — one that scientists say is likely to grow far more apparent in the decades to come.

The climate is currently warming at least 10 times faster than it did at the onset of the PETM. Under its worst-case scenario, the Intergovernmental Panel on Climate Change projects that, over the next 100 to 150 years, Earth’s average temperature could rise by roughly the same amount as it did during the PETM. Dramatic vegetational shifts could arrive not in a matter of centuries or millennia, but decades; a 2019 study, for example, projected that Alaska’s vast interior forests will shift from being dominated by conifers to being dominated by broadleaf trees as soon as the middle of this century.

Scientists debate what this floral rearrangement will look like. In some places, it may take place quietly and be easily ignored. In others, though, it could be one of the changing climate’s most consequential and disruptive effects. “There’s a whole lot more of this we can expect over the next decades,” said University of Wisconsin-Madison paleoecologist Jack Williams. “When people talk about wildfires out West, about species moving upslope — to me, this is just the beginning.”


These baobab trees, near Morondava, Madagascar, are up to 2,800 years old. Scientists attribute the sudden deaths of ancient baobabs in recent years to climate change. ATLANTIDE PHOTOTRAVEL / GETTY IMAGES


Williams is a senior co-author on a study published this month in Science that provides context on floral change in the present and recent geological past. Led by University of Bergen ecologists Ondřej Mottl and Suzette Flantua, the team of researchers used more than 1,000 fossil pollen records collected from around the world to compare rates of floral change over the last 18,000 years. It is the largest such study of its type, Williams said, representing many thousands of hours of combined scientific effort.

The researchers found that the rate of change peaked first as the world warmed at the end of the last ice age. Then, the rate of change began climbing even faster beginning between 2,000 and 4,000 years ago. This was a period when the global climate was relatively stable, so the changes were likely due to human activities. The study suggests that people, who have spent thousands of years rearranging the world’s plants for agriculture and other reasons, currently remain the strongest driver of change in the shifts of the world’s plants. But it also affirms how powerfully climate has driven change and suggests how it might again. “There’s likely a human legacy from quite some time ago,” Flantua said. “On top of that we’re adding a quite massive change in temperature. It is a dangerous combination.”



How will floral change look and feel to those living through it? While the fossil record offers a useful sense of the big picture, it is often fuzzy on the specifics, particularly at the scales of years and decades. Scientists trying to track the comings and goings of plant species in the present face a similar problem. Plants are constantly casting off seeds and spores, little genetic fingers that will grab hold wherever they are able. When physical or biological conditions change, so do the places where various plant species can find purchase; over time, the range and abundance of the whole species shifts. That’s how it works in theory, anyway. Catching it happening is another matter. To do so, scientists need long-term records for comparison. Such records are unevenly distributed around the world, and all are of either limited geographic or temporal scope; global satellite imagery, for instance, dates only to the 1970s.

Still, in the places where scientists do have long-term historical records, they’ve tended to find plants on the move in recent decades. Shrubs are popping up across the Arctic. New species of plants are colonizing mountain summits. In one of the most wide-ranging studies of floral range shifts, a group of researchers led by University of Miami ecologist Kenneth Feeley used herbarium data to track how plant communities across the Western Hemisphere had changed from 1970 to 2011. Comprising 20,000 species and 20 million individual observations, the data shows that warmth-loving plants were growing more common nearly everywhere the researchers looked, “from the middle of the Amazon to the middle of Nebraska,” Feeley said.

Some species can migrate remarkably fast, perhaps as much as a mile a year.

This type of floral change will likely often go unnoticed by people, said Yale School of the Environment geographer Jennifer Marlon, who studies the public’s perception of climate change. People, she said, are attuned to the wild variation between days and weeks and seasons, not the long-term shifts wrought by the changing climate. People also tend to have a short memory of their surroundings, a phenomenon known as the “shifting baseline.” “We just forget very quickly what the baseline was,” she said. “We tend to normalize change around us.”

The species whose migration we’ll likely notice first are those of agricultural, commercial or cultural importance. University of Maine paleoecologist Jacquelyn Gill points to sugar maple, whose range scientists project will shift far to the north in the coming decades. “As an ecologist, I’m happy that sugar maple is tracking the climate,” Gill said — it is a sign of resilience. On the other hand, she said, “As a person who lives in Maine and loves maple syrup, I am extremely concerned for the impact of sugar maple’s movements on a food I care about, on my neighbors’ livelihoods, and on the tourist industry.”

These shifts in species’ ranges also have serious implications for conservationists. Experts say the changing climate means that Sequoia National Park will eventually be left without its sequoias, Joshua Tree National Park without its Joshua trees. As with Gill’s sugar maples, this is distressing from a human perspective, though potentially of little importance from the plants’ perspective. The question is whether sequoias, Joshua trees, and countless other plants will be able to reach newly suitable habitats. For decades, scientists have debated whether plants would be able to track the rate of climate change, and whether people should intervene to help rare, isolated species reach more suitable habitat.

On the one hand, fossil evidence from the late Pleistocene and early Holocene suggests that some species can migrate quickly, perhaps more than a mile per year. On the other hand, studies in Europe and North America suggest that many tree species did not keep up with the climate as it warmed at the end of the Pleistocene.


A wildfire in Australia's Blue Mountains National Park in 2018. More and hotter fires are accelerating changes in flora globally. ANDREW MERRY / GETTY IMAGES


The fossil record is clear on one point, said Steve Jackson, an ecologist and director of the U.S. Geological Survey’s Southwest and South Central Climate Adaptation Science Centers: Each of the world’s 400,000-odd species of plants will respond differently to the changing climate. Changing conditions will tip the odds of competition toward some, away from others. Today, for example, the ponderosa pine is the most widely distributed pine tree in North America, growing from British Columbia to California’s Mexican border and as far east as Nebraska. It stands alongside sagebrush, creosote and saguaro as a floral symbol of the arid West.

But at the height of the last ice age, 20,000 years ago, the tree seems to have been exceedingly rare, apparently confined to a couple tiny patches of Arizona and New Mexico. “Climate change is not going to affect everyone equally,” Jackson said. “There are going to be winners and losers.”

Not all of the rearrangement of the world’s flora will happen slowly or subtly. As Gill pointed out, for the composition of an ecosystem to change, members of new species need to arrive, but members of old species also need to make way. “Death has to be part of that story,” she said. Mature plants, especially long-lived plants like trees, are often capable of surviving under physical conditions that no longer suit their seedlings. “Trees can hang out a really long time in unsuitable climates and just not reproduce,” she said. Like a rubber band, this disequilibrium between plants and environment stretches and stretches. When the mature plants die, the tension is suddenly released. New species flood in.

Scientists around the world have noted ecosystems transforming suddenly into new states — from dense forest to open woodland, for example, or woodland to brushland. Most often, these transformations come in the wake of a fire, insect outbreak or heat wave — all of which are expected to increase in intensity as the world grows warmer. Warmer temperatures stress plants and speed up the lifecycles of the insects that attack them. Currano and her colleagues have found that during the PETM, insects did far more damage to leaves than they did before or after.

The tendency of climate change to kill trees means that tree-planting campaigns alone won’t provide a panacea.

Fire is likely to be an especially visible catalyst of change, said Marlon, the Yale geographer. The disastrous wildfires in Australia, California and the Amazon in recent years are previews of what to expect in coming decades. “We have plenty more to burn,” she said.

Some scientists expect that, in many parts of the world, floral change will remain mostly unnoticeable. Threshold-type changes, driven by fire and mass forest die-back, they say, will be concentrated in places already experiencing those events. Others think sudden changes could soon become more widespread. “The problem is, it’s not about the average conditions. It’s about the extremes,” said University of Arizona ecologist David Breshears. He pointed to a month-long heat wave that struck Western Australia in 2011. It killed 20 percent of the trees in the affected area. Such heat waves are especially deadly when paired with droughts like the one currently gripping much of the American West.

As Breshears and climate scientist Jonathan Overpeck pointed out in a recent editorial, the tendency of climate change to kill trees means that, by themselves, tree-planting campaigns won’t provide a climate-mending panacea. “The first-order action has to be emissions reduction,” Breshears said.




Currano agreed. The high temperatures of the PETM lasted for about 180,000 years. The fossil beds of Wyoming show that, when it ended, most of the species that had existed before the PETM returned. It is a sign of the flora’s resilience to climate change, Currano said: “We don’t see a big extinction event.” But that hopeful message can only be applied to the modern day if humans stop pumping carbon dioxide into the atmosphere, she said: “Otherwise we are in unprecedented conditions, and the PETM is the best-case scenario.”

Zach St. George is a freelance reporter based in Baltimore. His first book, The Journeys of Trees, was published last summer.


Why the Approval of That Alzheimer’s Drug Is So Disturbing

Drugs that merely fiddle with the body’s physiology provide a false sense of control—at a cost.

BY TIM REQUARTH
JUNE 17, 2021
digicomphoto/iStock/Getty Images Plus


Last week, the Food and Drug Administration ignored the advice of its own expert advisory committee and approved the first new treatment for Alzheimer’s in 18 years. Called Aduhelm, it carries a substantial risk of painful brain swelling and bleeding, requires monthly infusions, and comes with an eye-popping list price of $56,000 per year. These caveats might be fine if the drug, which is manufactured by Biogen, miraculously restored the memories lost by the 6 million Americans with Alzheimer’s—or at least measurably improved the lives of patients in some meaningful way. But according to even the FDA’s own statisticians, the clinical data fail to show the new drug can slow Alzheimer’s devastating cognitive decline.

The FDA’s surprise approval has ignited a firestorm within the medical community. People are justifiably angry about the felonious cost for a risky drug that may offer little, if any, benefit. Aduhelm is, for now, a confusing and foregone conclusion; Biogen is slated to ship the drug starting next week. But a closer look at the FDA’s approval process reveals a deeper scientific issue at stake about what constitutes adequate evidence for desperately needed treatments. How the Aduhelm saga played out could have far-reaching implications not just for Alzheimer’s patients, but for anyone taking a drug approved by the FDA in the future.

To understand why the FDA approved the drug, and why this approval is so problematic beyond Aduhelm itself, it’s helpful to understand a bit about how clinical studies are designed. Some drugs are approved based on real-world outcomes, such as whether they prevent death. But others are approved based on so-called surrogate outcomes, such as whether they, say, suppress an abnormal heartbeat. If the surrogate outcome (abnormal heartbeat) is meaningfully associated with the real-world outcome (death), it follows that a drug suppressing the abnormal heartbeat should help prevent life-threatening heart attacks. Aduhelm doesn’t seem to show much benefit to patients according to a clinical dementia rating scale (a real-world outcome, approximately—some researchers say it’s not quite as firm as an outcome like “death”). But the drug does do one thing well—it removes some of the amyloid plaques that build up in the brains of Alzheimer’s patients (a surrogate outcome). It’s seemingly on the basis of this surrogate outcome that the FDA saved Aduhelm from joining a long list of failed Alzheimer’s treatments.

Approving drugs based on the surrogate-outcome approach, in principle, offers huge advantages. It’s easier, cheaper, and faster to design a clinical study that measures a drug’s effect on abnormal heartbeats than to wait for enough people to suffer a heart attack and die. Drug approvals based on surrogate outcomes have indeed proved to be lifesaving in the past. In 1992, in the midst of the HIV crisis, the FDA created a special “Accelerated Approval” program to fast-track urgently needed treatments based on surrogate outcomes alone. (The only catch was companies had to perform post-approval confirmatory studies to assess real-world outcomes like sickness and death, or the FDA could withdraw the drug.) The first wave of treatments approved in this program were HIV drugs, brought to market based on improving surrogate outcomes such as levels of a type of white blood cell. These drugs were later proved to prevent sickness and death from AIDS, which wasn’t surprising because scientists had established a tight linkage between white blood cell count and disease progression. The surrogate-outcome approach was considered a success.
Aduhelm now joins this group of drugs that merely fiddle with the body’s physiology but potentially leave patients worse off and much poorer for it.

But HIV drugs are among the only unqualified successes for surrogate approvals. Joseph Ross, a professor of medicine and public health at Yale University, studies the FDA regulatory process and notes that apart from the HIV drugs, “in almost every other instance, surrogates have proven to be far more fallible.” Several diabetes drugs have been approved because they lower hemoglobin A1c, a measure of average blood sugar levels. That might seem clearly useful. However, these drugs have wildly different effects on diabetic complications and overall mortality. At least one drug is thought to increase heart attacks. Cancer drugs with astronomical price tags might reduce tumor size but fail to prolong life or improve its quality. The list goes on: Blockbuster drugs lower cholesterol but fail to prevent cardiovascular disease. Osteoporosis drugs improve bone density but fail to decrease fractures. And in some cases, these drugs have damaging side effects only discovered years later.

In fact, the writing has long been on the wall that the surrogate-outcome approach can be a bit dicey. Take the heartbeat example above—which is actually real. In the mid-’80s, two class IC antiarrhythmics were brought to market largely based on the evidence that they suppressed abnormal heartbeats. Yet no one had ever done a clinical study to confirm the drugs actually prevented death. When such a study was conducted years later, it had to be stopped midway through: It turned out these drugs actually increased the chance of death. Although the human toll is impossible to know, the surrogate-outcome approach in this case may have caused thousands of unnecessary deaths. Although the link between abnormal heartbeats and death was plausible in theory, it turned out quite differently in the real world. The lesson is clear: Without strong evidence of a link between the surrogate outcome and the clinical outcome, the FDA should be wary of greenlighting treatments based on surrogate outcomes alone. For HIV drugs, this link was already established, and the drugs panned out. For the heart drug, this link was less established, and the drug did not pan out.

Aduhelm now joins this group of drugs that fiddle with the body’s physiology but potentially leave patients worse off and much poorer for it. Recall that with this drug, about 30–40 percent of study participants experienced either brain swelling or bleeding. Not to mention that the $56,000 annual price tag—much of which Medicare is required to cover—could be a fiscal catastrophe for American health care. No one disputes that amyloid plaques appear in the brains of Alzheimer’s patients, and no one doubts Aduhelm can clear amyloid plaques (by a respectable 30 percent), but how valid a surrogate outcome are amyloid plaques for the real-world outcomes Alzheimer’s patients care about? Given the pent-up demand, the worrisome side effects, and the steep price, you’d hope the amyloid hypothesis is a slam-dunk. Unfortunately, it’s not even close.

What amyloid plaques mean—and in turn, what Aduhelm’s actual effect on people might be—is still a hotly debated scientific question. Do they cause Alzheimer’s? Do they worsen it? Are they incidental? Are they protective? Scientists have made reasonable cases one way or another, but it’s very far from settled that amyloid plaques are a valid surrogate outcome for Alzheimer’s. What’s worse, the amyloid hypothesis doesn’t have a great track record with regard to therapeutics. Several drugs targeting amyloid plaques have been developed and then fizzled during clinical trials because not a single one had any effect on dementia. And yet, when approving Aduhelm, the FDA relied almost entirely on the amyloid hypothesis.

What makes the FDA’s decision so baffling is that surrogates are used when researchers can’t, or don’t, collect data that better reflects the real-world outcome of interest—in this case, slowing or preventing cognitive decline due to Alzheimer’s. But that’s exactly the data that Biogen submitted. The two clinical studies—which looked at cognitive decline, something that patients and doctors definitely care about—showed almost no benefit. One study was a total dud, and the other raised a score on a cognitive scale by a minuscule amount. Although that improvement reached statistical significance, it is almost certainly not going to make a difference in the lives of patients. “The effect sizes are trivial from a clinical point of view,” said Chiadi Onyike, a professor of psychiatry and behavioral sciences at Johns Hopkins and a member of the 11-person committee of outside experts tapped by the FDA to review the data. “They are meaningless—no different from the ebb and flow a patient might show from week to week.” Not a single member of the FDA’s outside advisory committee recommended approval.
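The gap Onyike describes, between a "statistically significant" result and a clinically meaningful one, comes down to simple arithmetic: with enough patients, even a trivial average difference will clear the p < 0.05 bar. The sketch below uses entirely hypothetical numbers (sample size, effect, scale, and spread are illustrative choices, not Biogen's actual trial statistics) to show the mechanism:

```python
import math

# Hypothetical figures chosen only to illustrate the mechanism; these are
# NOT Biogen's actual trial statistics.
n = 1600       # participants per arm (hypothetical)
diff = 0.39    # mean improvement vs. placebo, in points on an 18-point rating scale (hypothetical)
sd = 4.0       # within-group standard deviation (hypothetical)

# Standard error of the difference between two independent group means,
# then the corresponding z-statistic.
se = math.sqrt(2 * sd**2 / n)
z = diff / se

print(f"z = {z:.2f} (z > 1.96 means p < 0.05)")
# With 1,600 patients per arm, a 0.39-point shift is "significant"
# (z comes out to about 2.76), yet a fraction of a point on an
# 18-point scale is the kind of week-to-week ebb and flow a patient
# might show anyway.
```

Shrink the hypothetical sample to a few hundred per arm and the same 0.39-point difference stops being significant, which is why statisticians insist on judging effect sizes, not just p-values.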

The fallout has already been widespread. So far, three members of the FDA’s advisory committee have resigned, one of them calling the process a “sham.” Doctors who helped run Biogen’s clinical studies are speaking out, and others are penning editorials saying they won’t be prescribing Aduhelm until they see evidence of effectiveness. But no one should hold their breath. When the FDA greenlit Aduhelm for use, it told Biogen it had nine years to run the confirmatory studies necessary to prove Aduhelm’s effectiveness. That’s nine years of people taking a drug that existing data suggests might not do anything meaningful. With Aduhelm poised to become one of the biggest blockbuster drugs in history—analysts estimate annual revenues could peak at $10 billion—Biogen probably isn’t in a hurry. It might not even have to collect that extra data at all (for its part, Biogen said in an email to Slate, “We are working diligently to initiate the confirmatory trial”). Ross, the Yale FDA regulatory expert, looked at FDA approvals from 2005–12 and found that post-market confirmatory studies—ones that truly verified the clinical value of a surrogate outcome—took place only about 10 percent of the time. Despite this dismal compliance rate, according to Ross, the FDA has never fined a company for failing to do a confirmatory study and rarely uses its power to withdraw a drug later shown to be clinically ineffective. In an email to Slate, the FDA did not say whether it would use its power to withdraw Aduhelm should the drug ultimately prove clinically ineffective, saying only that it “will carefully monitor trial progress and support efforts to complete this trial in the shortest possible timeline.”

What specifically caused the FDA to approve Aduhelm based on such a shaky outcome and over the protests of its own committee is anyone’s guess, even in the context of the FDA’s frequent reliance on surrogate outcomes—nearly 45 percent of all drugs, according to the agency’s own analysis. (In a press release acknowledging the contention around the decision, the FDA explained that it ultimately decided “the benefits of Aduhelm for patients with Alzheimer’s disease outweighed the risks of the therapy.”) Aduhelm may well help fuel a trend in leaning too heavily on surrogate outcomes: Because the Aduhelm example is so egregious, it establishes a far-reaching precedent that some believe could undermine the regulatory process. “Presumably,” says Ross, “[companies] could look at what just happened with this product and say, ‘Hey, you have to treat me the same way.’ ” In other words, when a company fails to show a drug actually works, why not try for back-door approval based on an unproven idea of how the drug is supposed to work?

This might seem grim, and it is. But there’s a lesson you can take as a patient: just because a number goes up or down at the doctor’s office—whether it’s cholesterol, blood pressure, or even weight—that doesn’t necessarily mean you’re better or worse off for it. Because of the FDA’s widespread endorsement of surrogate metrics, it’s been all too easy for patients—and doctors—to believe these metrics for health are health itself. With a false sense of understanding comes a false sense of hope, a false promise of control. That’s the true tragedy of the Aduhelm approval.

Fanuc to deliver 500 robots to Ford’s Cologne electric car factory

15 JUNE 2021 • In News
PES Media: Daily UK Manufacturing News and Insight


Fanuc has received a 500 robot order from Ford’s Cologne plant to assist in the manufacture of electric car bodies.

The Ford manufacturing facility in Cologne is currently undergoing transition into the Ford Cologne Electrification Center, a development and production site for electric vehicles that will serve the entire European market.

In 2023, the carmaker expects its first purely electric high-volume passenger model to roll off the plant’s production line. Ford has also announced that it will only offer battery-electric passenger cars in Europe from 2030.

“Fanuc has a lot of experience in robotics for e-mobility applications,” said Ralf Winkelmann, managing director of Fanuc Germany. “We are very pleased that we can accompany Ford in this forward-looking transformation.”

Headquartered in Japan, Fanuc operates in more than 100 countries worldwide, providing customers from a wide range of industries with exceptionally reliable customer service internationally. “In the USA, Fanuc has been working closely with Ford for many years,” added Mr Winkelmann. “We are now expanding this successful cooperation to Europe.”

Shinichi Tanzawa, CEO and president of Fanuc Europe added: “Our robots are known for their reliability and durability. Therefore, we are convinced that they will contribute to low downtime and maintenance costs at Ford’s Cologne plant.”

Fanuc
www.fanuc.eu/uk/en

Major obstacle to fusion energy cleared at Culham nuclear centre

17 JUNE 2021 • In News
PES Media: Daily UK Manufacturing News and Insight

UKAEA’s new MAST Upgrade experiment at Culham


One of the major hurdles in developing fusion energy has been cleared after successful testing at the UK Atomic Energy Authority (UKAEA).

Initial results from UKAEA’s new MAST Upgrade experiment at Culham, near Oxford, have demonstrated the effectiveness of an innovative exhaust system designed to make compact fusion power plants commercially viable.

With no greenhouse gas emissions and abundant fuels, fusion can be a safe and sustainable part of the world’s future energy supply.

Fusion energy is based on the same principle as stars creating heat and light. Using a machine called a tokamak, a fusion power station will heat a gas, or plasma, enabling types of hydrogen fuel to fuse together to release energy that can generate electricity.

A key challenge in getting tokamaks on the electricity grid is removing excess heat produced during fusion reactions.

Without an exhaust system that can handle this intense heat, materials will have to be regularly replaced – significantly affecting the amount of time a power plant could operate for.

The new system, known as a Super-X divertor, would allow components in future commercial tokamaks to last for much longer, greatly increasing the power plant’s availability, improving its economic viability and reducing the cost of fusion electricity. The concept for the Super-X divertor originally came from the Institute for Fusion Studies group at the University of Texas.

Tests at MAST Upgrade, which began operating in October 2020, have shown at least a ten-fold reduction in the heat on materials with the Super-X system.

This is a game-changer for achieving fusion power plants that can deliver affordable, efficient electricity.

UKAEA is planning to build a prototype fusion power plant – known as STEP – by the early 2040s, using a compact machine called the ‘spherical tokamak’. The success of the Super-X divertor is a huge boost for engineers designing the STEP device, as it is particularly suited to the spherical tokamak.

UKAEA’s lead scientist at MAST Upgrade, Dr Andrew Kirk, said: “These are fantastic results. They are the moment our team at UKAEA has been working towards for almost a decade.

“We built MAST Upgrade to solve the exhaust problem for compact fusion power plants, and the signs are that we’ve succeeded.

“Super-X reduces the heat on the exhaust system from a blowtorch level down to more like you’d find in a car engine. This could mean it would only have to be replaced once during the lifetime of a power plant.

“It’s a pivotal development for the UK’s plan to put a fusion power plant on the grid by the early 2040s – and for bringing low-carbon energy from fusion to the world.”

UKAEA
www.ccfe.ukaea.uk


Volvo to explore fossil-free steel with Swedish steelmaker SSAB
17 JUNE 2021 • In News
PES Media: Daily UK Manufacturing News and Insight

Car manufacturing underway at Volvo's Luqiao manufacturing plant

Volvo is teaming up with Swedish steelmaker SSAB to explore the development of fossil-free, high-quality steel for use in the automotive industry.

The collaboration makes Volvo the first car maker to work with SSAB and its HYBRIT initiative.

HYBRIT was started by SSAB, iron ore producer LKAB and energy firm Vattenfall. It aims to replace coking coal, traditionally needed for iron ore-based steelmaking, with fossil-free electricity and hydrogen. The result is expected to be the world’s first fossil-free steelmaking technology, with virtually no carbon footprint.

As part of the collaboration, Volvo will be the first car maker to secure SSAB steel made from hydrogen-reduced iron from HYBRIT’s pilot plant in LuleĂĄ, Sweden. This steel will be used for testing purposes and may be used in a concept car.

In 2026, SSAB aims to supply the market with fossil-free steel at a commercial scale. Volvo also aims to be the first car maker to use fossil-free steel in its production cars.

“As we continuously reduce our total carbon footprint, we know that steel is a major area for further progress,” said HĂĄkan Samuelsson, chief executive at Volvo. “The collaboration with SSAB on fossil-free steel development could give significant emission reductions in our supply chain.”

“We are building an entirely fossil-free value chain all the way to the end customer,” added Martin Lindqvist, president and CEO at SSAB. “Our breakthrough technology has virtually no carbon footprint and will help strengthen our customers’ competitiveness. Together with Volvo Cars, we aim to develop fossil-free steel products for the cars of the future.”

The global steel industry accounts for around 7% of global direct carbon emissions, because it is currently dominated by iron ore-based steelmaking using blast furnaces that depend on coking coal.

For Volvo, steel and iron production for the components going into its cars accounts for around 35% of the total CO2 emissions from materials and production in a traditionally powered car, and around 20% in a fully electric car.

Volvo
www.volvocars.com

‘Space Dream’: China astronauts blast off for new space station

FROM SOCIALISM IN ONE COUNTRY 
TO SOCIALISM IN ONE SPACE STATION

China has launched its first crewed space mission in five years, sending three astronauts to a new space station.

Astronauts Tang Hongbo, left, Nie Haisheng, centre, and Liu Boming wave during a departure ceremony before boarding the Shenzhou-12 spacecraft on a Long March-2F carrier rocket at the Jiuquan Satellite Launch Centre in the Gobi desert in northwest China. [Greg Baker/AFP]

17 Jun 2021

China launched a spacecraft on Thursday carrying three astronauts to part of a space station still under construction, in what will be the longest stay in low Earth orbit by any Chinese nationals.

A Long March 2F rocket transporting the Shenzhou-12, or “Divine Vessel”, bound for the space station module Tianhe blasted off at 9:22am Beijing time (01:22 GMT) from the Jiuquan Satellite Launch Centre in northwestern Gansu province.

Shenzhou-12 is the third of 11 missions – four of which will be crewed – needed to complete China’s first full-fledged space station. Construction began in April with the launch of Tianhe, the first and largest of three modules.

The astronauts Nie Haisheng, 56, Liu Boming, 54, and Tang Hongbo, 45, are to work and stay on Tianhe, the living quarters of the future space station, for three months.

Since 2003, China has launched six crewed missions and sent 11 astronauts into space, including Zhai Zhigang, who carried out China’s first spacewalk ever on the 2008 Shenzhou mission.

Chinese astronauts Nie Haisheng, Liu Boming, and Tang Hongbo meet members of the media before the Shenzhou-12 mission to build China's space station. [Carlos Garcia Rawlins/Reuters]


A worker holds an umbrella near the Shenzhou-12 spaceship covered up on its launch pad. [Ng Han Guan/AP Photo]

Chinese President Xi Jinping is seen on a billboard with the slogan "China Dream, Space Dream" at the Jiuquan Satellite Launch Center. [Ng Han Guan/AP Photo]

Spectators cheer as Chinese astronauts prepare to board for liftoff. [Ng Han Guan/AP Photo]
Since 2003, China has launched six crewed missions and sent 11 astronauts into space. [Ng Han Guan/AP Photo]

Chinese astronauts wave as they prepare to board for liftoff at the Jiuquan Satellite Launch Center. They will be the first crew members to live on China's new orbiting space station Tianhe, or Heavenly Harmony. [Ng Han Guan/AP Photo]


A spectator wearing a space-themed shirt and holding a Chinese flag at the Jiuquan Satellite Launch Center. [Ng Han Guan/AP Photo]



Chinese astronauts prepare to board for liftoff. Shenzhou-12 is the third of 11 missions - four of which will be crewed - needed to complete China's first full-fledged space station. [Ng Han Guan/AP Photo]


A Long March-2F Y12 rocket carrying a crew of Chinese astronauts in a Shenzhou-12 spaceship lifts off at the Jiuquan Satellite Launch Center. [Ng Han Guan/AP Photo]
Stella McCartney calls for end to fur trade with protest

Stella McCartney campaigns with Humane Society International to urge the fashion industry to stop using fur
Stella McCartney


Stella McCartney has called for an end to the fur trade.

17 June 2021

The 49-year-old fashion designer, animal rights activist and environmentalist has insisted it's her "life mission" to give her industry "a conscience" when it comes to the materials they choose to use and how they source them.

Sir Paul McCartney's daughter voiced her stance as part of the Humane Society International's campaign.

The organisation "confronts cruelty to animals in all of its forms".

Stella told the Daily Mirror newspaper: "Whether it's sold here in the UK or farmed globally, barbarism knows no borders, and this effort is key to my life's mission of bringing a conscience to the fashion industry. I am proud to partner with Humane Society International. Please join us in ending this horrendous practice."

It comes after the activist vowed to "drive change" in the fashion industry to create a "cruelty-free society" for generations to come, and was among those to sign Prince Charles' Terra Carta Transition Coalitions for a sustainable future at the G7 Summit in Cornwall last weekend.

She said: “My goal is to drive change, encourage investments, and create lasting difference through incentives supporting the next generation. I hope the G7 Summit will translate our message into policies bringing us closer to creating a cruelty-free society that is kinder to all creatures, Mother Earth, and each other.”

Stella was recently a guest at St. James Palace, where she urged global leaders to introduce new laws or legislation that will "put hard stops" on unsustainable practices in the industry.

She said: “I’m really here to ask all of these powerful people in the room to make a shift from convention to a new way of sourcing and new suppliers into the fashion industry. One of the biggest problems that we have in the fashion industry is we're not policed in any way. We have no laws or legislations that will put hard stops on our industry… We need to be incentivized, [and] we need to have taxations looked at to work in a better way.”

CLASS CONSCIOUSNESS
UK
Will Mellor: NHS staff should be country's top earners

'Coronation Street' star Will Mellor believes NHS staff should be the country's top earners, and he was left fuming at the government's proposed pay rise of just one per cent this year.



Will Mellor believes NHS staff should be "the highest paid people in the country".


Will Mellor in Corrie

17 June 2021

The 'Coronation Street' star was left fuming at the government's proposed NHS pay rise of just one per cent this year - which Prime Minister Boris Johnson, who caught coronavirus in 2020, said was "as much as we can" - as he thinks the health service's staff should receive a bigger boost after their courageous work amid the coronavirus pandemic.

He said: "All of us have needed the NHS at some point.

"Imagine this country without it. Where we would be now? I wasn’t happy at all with that one per cent rise they got. It p***** me off.

"They should be the highest paid people in the country.

"They saved his life. We spend billions and billions on track and trace and this and that, that don’t work, but we can only afford a one per cent pay rise for the NHS who saved lives and put their own lives at risk when this country was on its back."


As well as starring in 'Corrie', Will has recorded a new version of 1998 Fat Les classic 'Vindaloo', alongside stars such as Paddy McGuinness and Danny Dyer, to encourage the nation to get behind England during Euro 2020.

And the cobbles actor believes his father Bill - who died from cancer in April last year - would be "really proud" of him for 'Vindaloo Two'.

Will - who plays drug lord Harvey Gaskell in the ITV soap - added to the Daily Mirror newspaper: "He’d be really proud of me doing this, and getting behind it and getting his mates to sing it down the pub."
Dr Martens' profits fall by over half after the iconic bootmaker awards £49.1m in staff bonuses related to its IPO

Despite falling store sales, it managed to open 18 more shops during the year

A greater focus on digital sales helped the firm's online purchases jump by 73%

Dr Martens has now handed back £1.3m in furlough cash to the UK government

By HARRY WISE FOR THIS IS MONEY

PUBLISHED:  17 June 2021

Leather boot brand Dr Martens saw its profits fall last year despite rising online sales after it handed out tens of millions of pounds in staff bonuses following its public listing.

Shares in the group tumbled 11.5 per cent today as it blamed £80.5million of total costs arising from its January initial public offering (IPO) for profits more than halving to £35.7million in the year to March 31.

Without these exceptional costs, which included a £49.1million one-off 'IPO bonus' for employees, the company said its operating profit would have been 25 per cent higher at £193million.



Staff bonuses caused profits to fall by over half to £35.7million in the year to March 31

The Northamptonshire shoemaker was further hit by store closures across the world, sending its retail sales plummeting by around 40 per cent to £99.7million, which particularly impacted its performance in Japan.

Yet it still managed to open 18 more shops across the period, including six in the United States and its first store in Rome, and it hopes to open another 20 to 25 during the current financial year.

Three stores were shut down in the UK, though, leaving 34 overall. Due to 'resilient trading', Dr Martens has now returned £1.3million in furlough cash that it claimed from the UK government.

Overall sales grew by 15 per cent to £773million, as demand in China soared by 46 per cent, sales in Europe and the Americas grew 17 per cent, and Asia-Pacific sales tipped up 7 per cent.

Meanwhile, higher digital investment and an accelerating shift towards customers buying their shoes online helped the value of online purchases jump 73 per cent.

The share of online sales also grew by ten percentage points to 30 per cent, and Dr Martens aims for online to comprise 40 per cent of its revenues in the medium term, as part of a broader goal for direct-to-consumer channels to make up 60 per cent.


Retail weakness: Store closures sent Dr Martens's retail revenues falling 40 per cent last year

Chief executive Kenny Wilson said: 'People buy their first pair of Dr Martens in their late teens or early 20s. Those consumers have grown up in a world where digital is the norm, so our strategy has been for a long time to build the digital capabilities of the business.

'When the pandemic actually hit, we were ready, and we were able to drive more of our demand to online. The trend towards digital was something we've been working on for years, so we were agile enough to move quite quickly.'

Meanwhile, wholesale revenue rose by 18 per cent to £437.9million thanks to sturdy demand from the United States and the company's focus on having fewer relationships but with 'quality partners.'

Most of its shoe categories recorded decent growth levels as well, including its sandals collection and classic shoe brands like its 1460 boot, which celebrated its 60th anniversary in 2020.



Rock 'n' roll: Better known as a fashion staple among mods, punks, and grunge bands, bootmaker Dr Martens had its initial public offering in January this year

It also announced new sustainability targets today, such as having zero waste in its value chain going to landfill by 2028, becoming a net zero firm by the end of the decade and having all shoes made from 'sustainable materials' by 2040.

Russ Mould, an analyst at AJ Bell, said: 'While the aim is to increase web-based sales further in the future, some people will want to try their shoes and boots on first before buying, which could be an obstacle to these efforts.

'There may be some disappointment that, despite a robust sales performance, the outlook given by Dr Martens has remained unchanged. Newly listed firms often set the bar low on guidance so they can clear it early in their life as a public company.

'The company continues to push a strategy of increasing the amount of product it sells direct to consumers, something a lot of major brands are targeting as it gives them greater control over the way it engages with customers.'

From a German doctor to punks and rockers: How Dr Martens became THE iconic British boot

By Harriet Johnston for MailOnline

Klaus Märtens was a doctor in the German army during World War II. While on leave in 1945, he injured his ankle while skiing in the Bavarian Alps.

He found that his standard-issue army boots were too uncomfortable on his injured foot.

While recuperating, he designed improvements to the boots, with soft leather and air-padded soles made from tyre rubber.

When the war ended and some Germans recovered valuables from their own cities, Märtens took leather from a cobbler's shop and used it to make himself a pair of boots with air-cushioned soles.



Märtens did not have much success selling his shoes until he met up with an old university friend, Herbert Funck, a Luxembourger, in Munich in 1947.

Funck was intrigued by the new shoe design, and the two went into business that year in Seeshaupt, Germany, using discarded rubber from Luftwaffe airfields.

The comfortable soles were a big hit with housewives, with 80% of sales in the first decade to women over the age of 40.


By the late 1970s, they were popular among punks, musicians and members of other youth groups (pictured Ian Dury in the shoes)

Sales had grown so much by 1952 that they opened a factory in Munich. In 1959, the company had grown large enough that Märtens and Funck looked at marketing the footwear internationally.

Almost immediately, British shoe manufacturer R. Griggs Group Ltd. bought rights to manufacture the shoes in the United Kingdom.

Griggs anglicised the name to 'Dr Martens', slightly re-shaped the heel to make them fit better, added the trademark yellow stitching, and trademarked the soles as AirWair.

By the later 1960s, skinheads started to wear them, 'Docs' or 'DMs' being the usual naming, and by the late 1970s, they were popular among punks, musicians and members of other youth groups.

The boots and shoes became increasingly popular in the 1990s as grunge fashion arose.

In 2003 the Dr Martens company came close to bankruptcy. On 1 April that year, under pressure from declining sales, the company ceased making shoes in the United Kingdom, and moved all production to China and Thailand. Five factories and two shops were closed in the UK, and more than 1,000 of the firm's employees lost their jobs.

Following the closures, the R. Griggs company employed only 20 people in the UK, all in the firm's head office.

Five million pairs of Dr Martens were sold during 2003, which was half the 1990s level of sales.

In 2004 a new range of Dr Martens was launched in an attempt to appeal to a wider market, and especially young people.


The boots and shoes became increasingly popular in the 1990s as grunge fashion arose. (pictured, the band Madness wearing the boots)

The shoes and boots were intended to be more comfortable, and easier to break in, and included some new design elements.

Worldwide sales of Dr Martens shoes grew strongly in the early 2010s, and in 2012 it was the eighth-fastest-growing British company.

In 2018, ten million pairs of Dr Martens shoes were produced, only one per cent of them in the UK. In 2019, the company announced plans to double UK production of shoes and boots to 165,000 pairs annually in 2020.


The boots have a trademark yellow stitching which was added by the British shoe manufacturer R. Griggs Group
Cargill and Nestlé welcome US Supreme Court ruling rejecting Ivory Coast child slavery case



Posted: 17 June 2021

Cargill and Nestlé have welcomed a landmark US Supreme Court ruling in favour of the two major companies, rejecting a claim of child slavery within Ivory Coast cocoa farms, reports Neill Barston.

According to Reuters, the decision was an 8-1 result in favour of the businesses in a high-profile case brought on behalf of former child slaves from Mali, who reportedly first brought their claim in 2005.

The court found that their claim could not proceed under the US legal framework that has enabled non-citizens to bring legal action in American courts.

As previously covered by Confectionery Production, the strongly disputed case had been brought on the basis of allegations that the two corporations had failed in their duty of care to those working within the supply chain.

Speaking after the court decision, a Nestlé spokesperson strongly disputed the allegations and said that the right verdict had been reached.

He said: “Child labor is unacceptable. That is why we are working so hard to prevent it. Today, all nine Supreme Court Justices unanimously agreed there is no basis for this lawsuit to proceed against Nestlé. Nestlé never engaged in the egregious child labor alleged in this suit, and we remain unwavering in our dedication to ending child labor in the cocoa industry and to our ongoing work with partners in government, NGOs and industry to tackle this complex, global issue.

“Access to education, modernising farming methods, and improving livelihoods are crucial to combatting child labor in cocoa production. Addressing the root causes of child labor is part of the Nestlé Cocoa Plan and will continue to be the focus of our efforts in the future.”

Cargill was similarly firm in its belief that the court had reached the correct decision, but added that there remained significant work to be done on the issue.

“The Supreme Court’s ruling today affirms Cargill’s analysis of the law and confirms this suit has no basis to proceed. Regardless, Cargill’s work to keep child labor out of the cocoa supply chain is unwavering. We do not tolerate the use of child labor in our operations or supply chains and we are working every day to prevent it. We will continue to focus on the root causes, including poverty and lack of education access. Our mission is to drive long-lasting change in cocoa communities and to lift up the families that rely on cocoa for their income. This is part of Cargill’s Cocoa Promise, which guides all of our work and reinforces our commitment to a sustainable cocoa sector into the future.”

Developing countries pay steep economic and health costs from in-car air pollution

Some of the world's most vulnerable cities suffer disproportionate economic losses because of the health consequences of in-car air pollution, finds a new study.

UNIVERSITY OF SURREY

Research News

In an international study published by the journal Environment International, the University of Surrey led an international team of air pollution experts in monitoring pollution hotspots in 10 global cities: Dhaka (Bangladesh); São Paulo (Brazil); Guangzhou (China); Medellín (Colombia); Cairo (Egypt); Addis Ababa (Ethiopia); Chennai (India); Sulaymaniyah (Iraq); Blantyre (Malawi); and Dar-es-Salaam (Tanzania).

Surrey's Global Centre for Clean Air Research (GCARE) set out to investigate whether the amount of fine air pollution particles (PM2.5) drivers inhale is connected to the time they spend in pollution hotspots and to socio-economic indicators such as gross domestic product (GDP).

Across all the cities in the study, researchers found that drivers only needed to spend a short amount of time in high-pollution hotspots to inhale a significant amount of PM2.5. For example, drivers in Guangzhou and Addis Ababa spent 26 and 28 per cent of their commutes in hotspot areas, which contributed 54 and 56 per cent of the total air pollution they inhaled on their trips.
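Those figures imply how much dirtier hotspot air is than the rest of the route. As a rough sketch - assuming, as a simplification, that inhaled dose is proportional to concentration multiplied by time (the study's full dosimetry is in the paper, and the function name here is illustrative, not from the study) - the hotspot-to-background concentration ratio can be back-calculated:

```python
def hotspot_concentration_ratio(time_frac, dose_frac):
    """Implied ratio of hotspot to background PM2.5 concentration,
    assuming inhaled dose is proportional to concentration x time.

    time_frac: fraction of the commute spent in hotspots
    dose_frac: fraction of total inhaled PM2.5 received in hotspots
    """
    # dose_frac = r*t / (r*t + (1 - t))  ->  solve for the ratio r
    return (dose_frac * (1 - time_frac)) / ((1 - dose_frac) * time_frac)

# Guangzhou: 26% of commute time in hotspots -> 54% of inhaled PM2.5
print(round(hotspot_concentration_ratio(0.26, 0.54), 1))  # -> 3.3
```

On these numbers, hotspot air would carry roughly three times the PM2.5 concentration of the rest of the commute, under the stated assumption.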

The researchers found that the cities where drivers were exposed to the highest levels of PM2.5 pollution - Dar-es-Salaam, Blantyre and Dhaka - also experienced higher death rates per 100,000 commuting car population per year. The low PM2.5 levels in Medellín, São Paulo and Sulaymaniyah corresponded with very low death rates.

The international study assessed economic losses by measuring a city's death rate caused by in-car PM2.5 exposure against its GDP per capita. It found that, for most cities, lower GDP was directly linked to greater economic losses from in-car PM2.5 exposure - with Cairo and Dar-es-Salaam impacted the most (losses of 8.9 and 10.2 million US dollars per year, respectively).

The team also found that, except for Guangzhou, cities with higher GDP per capita have fewer hotspot areas along an average route, thus decreasing the risk to drivers.

Professor Prashant Kumar, Principal Investigator of CArE-Cities Project, Associate Dean (International) and Founding Director of GCARE at the University of Surrey, said: "Our global collaborative project has confirmed that air pollution disproportionately affects developing countries. Many countries are caught in a vicious cycle where their low GDP leads to higher pollution exposure rate for drivers, which leads to poorer health outcomes, which further damages the economy of those cities. This is discouraging news - but it should galvanise the international community to find and deploy measures that mitigate the health risks faced by the world's most vulnerable drivers."

Professor Shi-Jie Cao, a collaborative partner from the Southeast University, said: "If we are ever to make a world where clean air is available to all, it will take a truly global collaborative effort - such as CArE-Cities. We hope to continue to work closely with Surrey and other global partners, sharing knowledge and expertise that will make a cleaner future a reality."

Professor Adamson Muula, a collaborative partner from formerly University of Malawi and now Head of Public Health at the Kamuzu University of Health Sciences (KUHeS), said: "If developing countries are not to be left behind in the struggle against air pollution and climate change, it is important that we build the capacity and knowledge to gather on-the-ground data. This project is a small but significant step in the right direction for Malawians; a direction which will lead to better decisions and cleaner air for Malawi."

###

Note to editors

The study was part of the Clean Air Engineering for Cities (CArE-Cities) project and builds upon our previous work around in-car exposure. CArE-Cities is a seed funding project awarded by the University of Surrey under Research England's Global Challenge Research Fund. CArE-Cities involves 11 Development Assistance Committee (DAC)-listed countries and aspires to bring cleaner air to cities by building a knowledge-exchange platform. Its activities include joint workshops, researcher exchanges, and pilot studies to address urban development and health impact assessment agendas in ODA countries.

Reference

Kumar, P., Hama, S., Abbass, R. A., Nogueira, T., Brand, V. S., Abhijith, K. V., de Fatima Andrade, M., Asfaw, A., Aziz, K. H., Cao, S. J., El-Gendy, A., Khare, M., Muula, A. S., Nagendra, S. M. S., Ngowi, A. V., Omer, K., Olaya, Y., Salam, A., 2021. Potential health risks due to in-car aerosol exposure across ten global cities. Environment International 155, 106688. Online link: https://doi.org/10.1016/j.envint.2021.106688