Friday, April 15, 2022


The True Reason For So Much Hunger in The World Is Probably Not What You Think



(artur carvalho/Getty Images)


GISÈLE YASMEEN, THE CONVERSATION
13 APRIL 2022

Nearly one in three people in the world did not have access to enough food in 2020. That's an increase of almost 320 million people in one year and it's expected to get worse with rising food prices and the war trapping wheat, barley and corn in Ukraine and Russia.

Climate change-related floods, fires and extreme weather, combined with armed conflict and a worldwide pandemic, have magnified this crisis by affecting the right to food.

Many assume world hunger is due to "too many people, not enough food." This trope has persisted since the 18th century when economist Thomas Malthus postulated that the human population would eventually exceed the planet's carrying capacity. This belief moves us away from addressing the root causes of hunger and malnutrition.

In fact, inequity and armed conflict play a larger role. The world's hungry are disproportionately located in Africa and Asia, in conflict-ridden zones.

As a researcher who has been working on food systems since 1991, I believe that addressing root causes is the only way to tackle hunger and malnutrition. For this, we need more equitable distribution of land, water and income, as well as investments in sustainable diets and peace-building.
But how will we feed the world?

The world produces enough food to provide every man, woman and child with more than 2,300 kilocalories per day, which is more than sufficient. However, poverty and inequality – structured by class, gender, race and the impact of colonialism – have resulted in an unequal access to Earth's bounty.

Despite adequate food production globally, poverty and inequality restrict many people's access to healthy food. (FAO, The State of Food Security and Nutrition in the World 2020, CC-BY)
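As a rough sanity check on that claim, you can divide an assumed global food supply by population and days in a year. A minimal sketch in Python; the supply figure below is an illustrative assumption, not an FAO statistic:

```python
# Back-of-envelope check of the "enough food for everyone" claim.
# The supply figure is an illustrative assumption, not FAO data.

WORLD_POPULATION = 7.8e9          # people (approx. 2020)
DAYS_PER_YEAR = 365
ANNUAL_FOOD_SUPPLY_KCAL = 7.5e15  # assumed global edible food supply, kcal/year

per_capita_daily_kcal = ANNUAL_FOOD_SUPPLY_KCAL / (WORLD_POPULATION * DAYS_PER_YEAR)
print(f"Per-capita supply: {per_capita_daily_kcal:.0f} kcal/day")
# ~2,630 kcal/day under these assumptions, above the ~2,300 kcal/day benchmark.
```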

Half of global crop production consists of sugar cane, maize, wheat and rice. A great deal of this is used for sweeteners and other high-calorie, low-nutrient products, as feed for industrially produced meat, and for biofuels and vegetable oil.

The global food system is controlled by a handful of transnational corporations that produce highly processed foods containing sugar, salt, fat and artificial colors or preservatives. Overconsumption of these foods is killing people around the world and driving up healthcare costs.


Nutrition experts say that we should limit sugars, saturated and trans fats, oils and simple carbohydrates and eat an abundance of fruits and vegetables with only a quarter of our plates consisting of protein and dairy. The Intergovernmental Panel on Climate Change also recommends a move toward sustainable healthy diets.

A recent study showed that overconsumption of highly processed foods – soft drinks, snacks, breakfast cereals, packaged soups and confectionery items – can lead to negative environmental and health impacts, such as Type 2 diabetes and cardiovascular disorders.

Steering the world away from highly processed foods will also lessen their negative impacts on land and water and reduce energy consumption.
We live in a world of plenty

Since the 1960s, global agricultural production has outpaced population growth. Yet Malthusian thinking continues to focus on the risk of population increases outstripping the Earth's carrying capacity, even though global population growth is slowing.

Nobel Laureate Amartya Sen's study of the Great Bengal Famine of 1943 challenged Malthus by demonstrating that millions died of hunger because they didn't have the money to buy food, not due to food shortages.


In 1970, Danish economist Ester Boserup also questioned Malthus's assumptions. She argued that rising incomes, women's equality and urbanization would ultimately stem the tide of population growth, with the birthrate, even in poor countries, dropping to at or below replacement levels.

Food – like water – is an entitlement, and public policy should stem from this. Unfortunately, land and income remain highly unevenly distributed, resulting in food insecurity, even in wealthy countries. While land redistribution is notoriously difficult, some land reform initiatives – like the one in Madagascar – have been successful.
The role of war in hunger

Hunger is aggravated by armed conflict. The countries with the highest rates of food insecurity have been ravaged by war, such as Somalia. More than half of the people who are undernourished and almost 80 percent of children with stunted growth live in countries struggling with some form of conflict, violence or fragility.

UN Secretary General António Guterres has warned that the war in Ukraine puts 45 African and least developed countries at risk of a "hurricane of hunger," as they import at least a third of their wheat from Ukraine or Russia. According to the New York Times, the World Food Program has been forced to cut rations to nearly four million people due to higher food prices.


What works, ultimately, are adequate social protection floors (basic social security guarantees) and rights-based "food sovereignty" approaches that put communities in control of their own local food systems. For example, the Deccan Development Society in India assists rural women by providing access to nutritious food and other community supports.

To address food insecurity, we must invest in diplomacy by coordinating humanitarian, development and peacekeeping activities to avoid and curtail armed conflicts. Poverty reduction is part of peace building as rampant inequalities serve as tinderboxes for aggression.
Protecting our ability to produce food

Climate change and poor environmental management have put collective food production assets, including soil, water and pollinators, in peril.

Several studies over the past 30 years have warned that soil and water contamination from high concentrations of toxins such as pesticides, dwindling biodiversity and disappearing pollinators could further affect the quality and quantity of food production.

Livestock, crop production, agricultural expansion and food processing account for a quarter of all greenhouse gas emissions. In addition, one-third of all food produced is lost or goes to waste, so tackling this travesty is also paramount.

Reducing food loss and waste will help reduce environmental impacts of the food system, as will transitioning to healthier, sustainably produced diets.
Food, health and environmental sustainability

Food is an entitlement and should be viewed as such, not framed as an issue of population growth or inadequate food production. Poverty and systemic inequalities are the root causes of food insecurity, as is armed conflict. Keeping this idea central in discussions about feeding the world is essential.

We need policies that support healthy and sustainably produced, balanced diets to address chronic diet-related disease, environmental issues and climate change.

We need more initiatives that enable equitable distribution of land, water and income globally.

We need policies that address food insecurity through initiatives like rights-based food sovereignty systems.

In areas affected by conflict and war, we need policies that invest in diplomacy by coordinating humanitarian, development and peacekeeping activities.

These are the key pathways to recognize that "food is the single strongest lever to optimize human health and environmental sustainability on Earth."

Gisèle Yasmeen, Senior Fellow, School of Public Policy and Global Affairs, University of British Columbia.

This article is republished from The Conversation under a Creative Commons license. Read the original article.
How ancient, recurring climate changes may have shaped human evolution

Shifting habitats implicate a disputed ancestor in the rise of Homo sapiens and Neandertals


The climate change–induced travels of a disputed hominid species called Homo heidelbergensis, represented here by a roughly 600,000-year-old East African skull, led to the evolution of H. sapiens in southern Africa and Neandertals in Europe, a new study claims.

RYAN SOMMA/FLICKR (CC BY-SA 2.0)

By Bruce Bower
APRIL 13, 2022 

Recurring climate changes may have orchestrated where Homo species lived over the last 2 million years and how humankind evolved.

Ups and downs in temperature, rainfall and plant growth promoted ancient hominid migrations within and out of Africa that fostered an ability to survive in unfamiliar environments, say climate physicist and oceanographer Axel Timmermann and colleagues. Based on how the timing of ancient climate variations matched up with the comings and goings of different fossil Homo species, the researchers generated a novel — and controversial — outline of human evolution. Timmermann, of Pusan National University in Busan, South Korea, and his team present that scenario April 13 in Nature.

Here’s how these scientists tell the story of humankind, starting roughly 2 million years ago. By that time, Homo erectus had already begun to roam outside Africa, while an East African species called H. ergaster stuck close to its home region. H. ergaster probably evolved into a disputed East African species called H. heidelbergensis, which split into southern and northern branches between 850,000 and 600,000 years ago. These migrations coincided with warmer, survival-enhancing climate shifts that occur every 20,000 to 100,000 years due to variations in Earth’s orbit and tilt that modify how much sunlight reaches the planet.

Then, after traveling north to Eurasia, H. heidelbergensis possibly gave rise to Denisovans around 430,000 years ago, the researchers say. And in central Europe, harsh habitats created by recurring ice ages spurred the evolution of H. heidelbergensis into Neandertals between 400,000 and 300,000 years ago. Finally, in southern Africa between 310,000 and 200,000 years ago, increasingly harsh environmental conditions accompanied a transition from H. heidelbergensis to H. sapiens, who later moved out of Africa.

But some researchers contend that H. heidelbergensis, as defined by its advocates, contains too many hard-to-categorize fossils to qualify as a species.

An alternative view to the newly proposed scenario suggests that, during the time that H. heidelbergensis allegedly lived, closely related Homo populations periodically split up, reorganized and bred with outsiders, without necessarily operating as distinct biological species (SN: 12/13/21). In this view, mating among H. sapiens groups across Africa starting as early as 500,000 years ago eventually produced a physical makeup typical of people today. If so, that would undermine the validity of a neatly branching evolutionary tree of Homo species leading up to H. sapiens, as proposed by Timmermann’s group.

The new scenario derives from a computer simulation of the probable climate over the last 2 million years, in 1,000-year intervals, across Africa, Asia and Europe. The researchers then examined the relationship between simulated predictions of what ancient habitats were like in those regions and the dates of known hominid fossil and archaeological sites. Those sites range in age from around 2 million to 30,000 years old.
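A minimal sketch of that matching step, with invented array shapes and fossil sites standing in for the study's actual simulation output and fossil database:

```python
import numpy as np

# Minimal sketch of the habitat-fossil matching idea; all shapes, values,
# and site coordinates are invented for illustration.
# suitability[t, y, x]: simulated habitat suitability in [0, 1] at time
# slice t and grid cell (y, x).
rng = np.random.default_rng(0)
suitability = rng.random((200, 45, 90))  # 2 Myr in 10-kyr slices, coarse grid

# Hypothetical dated fossil sites as (time index, grid row, grid col).
sites = [(15, 20, 48), (90, 20, 49), (180, 28, 5)]

at_sites = np.mean([suitability[t, y, x] for t, y, x in sites])
# Null comparison: the same dates, averaged over all locations.
at_random = np.mean([suitability[t].mean() for t, _, _ in sites])
print(f"at fossil sites: {at_sites:.2f}  vs  everywhere: {at_random:.2f}")
```

If hominids tracked favorable habitats, the suitability sampled at real fossil sites should systematically exceed the random-location baseline.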

Previous fossil evidence indicates that H. erectus spread as far as East Asia and Java (SN: 12/18/19). Timmermann’s climate simulations suggest that H. erectus, as well as H. heidelbergensis and H. sapiens, adapted to increasingly diverse habitats during extended travels. Those migrations stimulated brain growth and cultural innovations that “may have made [all three species] the global wanderers that they were,” Timmermann says.

The new habitat simulations also indicate that H. sapiens was particularly good at adjusting to hot, dry regions, such as northeastern Africa and the Arabian Peninsula.

Climate, habitat and fossil data weren’t sufficient to include additional proposed Homo species in the new evolutionary model, including H. floresiensis in Indonesia (SN: 3/30/16) and H. naledi in South Africa (SN: 5/9/17).

It has proven difficult to show more definitively that ancient environmental changes caused transitions in hominid evolution. For instance, a previous proposal that abrupt climate shifts created rainy, resource-rich stretches of southern Africa's coast, where H. sapiens then evolved (SN: 3/31/21), still lacks sufficient climate, fossil and other archaeological evidence.

Paleoanthropologist Rick Potts of the Smithsonian Institution in Washington, D.C., has developed another influential theory about how climate fluctuations influenced human evolution that’s still open to debate. A series of climate-driven booms and busts in resource availability, starting around 400,000 years ago in East Africa, resulted in H. sapiens evolving as a species with a keen ability to survive in unpredictably shifting environments, Potts argues (SN: 10/21/20). But the new model indicates that ancient H. sapiens often migrated into novel but relatively stable environments, Timmermann says, undermining support for Potts’ hypothesis, known as variability selection.

The new findings need to be compared with long-term environmental records at several well-studied fossil sites in Africa and East Asia before rendering a verdict on variability selection, Potts says.

The new model “provides a great framework” to evaluate ideas such as variability selection, says paleoclimatologist Rachel Lupien of Lamont-Doherty Earth Observatory in Palisades, N.Y. That’s especially true, Lupien says, if researchers can specify whether climate and ecosystem changes that played out over tens or hundreds of years were closely linked to ancient Homo migrations.

For now, much remains obscured on the ancient landscape of human evolution.

Questions or comments on this article? E-mail us at feedback@sciencenews.org

CITATIONS

A. Timmermann et al. Climate effects on archaic human habitats and species successions. Nature. Published online April 13, 2022. doi: 10.1038/s41586-022-04600-9.

New Research Shows Estimates of the Carbon Cycle – Vital to Predicting Climate Change – Are Incorrect


The findings do not counter the established science of climate change but highlight how the accounting of the amount of carbon withdrawn by plants and returned by soil is not accurate.

Virginia Tech researchers, in collaboration with Pacific Northwest National Laboratory, have discovered that key parts of the global carbon cycle used to track movement of carbon dioxide in the environment are not correct, which could significantly alter conventional carbon cycle models.

The estimate of how much carbon dioxide plants pull from the atmosphere is critical to accurately monitor and predict the amount of climate-changing gasses in the atmosphere. This finding has the potential to change predictions for climate change, though it is unclear at this juncture if the mismatch will result in more or less carbon dioxide being accounted for in the environment.

“Either the amount of carbon coming out of the atmosphere from the plants is wrong or the amount coming out of the soil is wrong,” said Meredith Steele, an assistant professor in the School of Plant and Environmental Sciences in the College of Agriculture and Life Sciences, whose Ph.D. student at the time, Jinshi Jian, led the research team. The findings were published on April 1, 2022, in Nature Communications.

Credit: Photo by Logan Wallace for Virginia Tech.

“We are not challenging the well-established climate change science, but we should be able to account for all carbon in the ecosystem and currently cannot,” she said. “What we found is that the models of the ecosystem’s response to climate change need updating.”

Jian and Steele’s work focuses on carbon cycling and how plants and soil remove and return carbon dioxide in the atmosphere.

To understand how carbon affects ecosystems on Earth, it's important to know exactly where all the carbon is going. This process, called carbon accounting, tracks how much carbon is going where and how much sits in each of Earth's carbon pools: the oceans, the atmosphere, the land, and living things.

For decades, researchers have been trying to get an accurate accounting of where our carbon is and where it is going. Virginia Tech and Pacific Northwest National Laboratory researchers focused on the carbon dioxide that gets drawn out of the atmosphere by plants through photosynthesis.

When animals eat plants, the carbon moves into the terrestrial ecosystem, passing into the soil or into other animals. A large amount of carbon is also exhaled — or respired — back into the atmosphere.

The balance of this incoming and outgoing carbon dioxide determines how much carbon remains in the atmosphere, contributing to climate change, and how much is stored long-term.

However, the Virginia Tech researchers discovered that when the accepted numbers for soil respiration are used, the carbon cycling models no longer balance.

“Photosynthesis and respiration are the driving forces of the carbon cycle, however the total annual sum of each of these at the global scale has been elusive to measure,” said Lisa Welp, an associate professor of earth, atmospheric, and planetary sciences at Purdue University, who is familiar with the work but was not part of the research. “The authors’ attempts to reconcile these global estimates from different communities show us that they are not entirely self-consistent and there is more to learn about these fundamental processes on the planet.”

What Jian and Steele, along with the rest of the team, found is that using the accepted gross primary productivity figure of 120 petagrams of carbon dioxide (each petagram is a billion metric tons) implies that the amount of carbon coming out through soil respiration should be in the neighborhood of 65 petagrams.

By analyzing multiple fluxes, the amounts of carbon exchanged between Earth's carbon pools of the oceans, atmosphere, land, and living things, the researchers discovered that soil respiration actually returns about 95 petagrams of carbon, which implies a gross primary productivity of around 147 petagrams. For scale, the difference between the currently accepted 120 petagrams and this estimate is about three times global fossil fuel emissions each year.
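A back-of-the-envelope version of that bookkeeping, using the article's rounded figures plus an assumed ~10 petagrams of carbon per year for fossil fuel emissions:

```python
# Rounded numbers from the article; the fossil-fuel figure (~10 Pg C/yr)
# is my assumption for scale, not a value from the study.
gpp_accepted = 120    # Pg C/yr, accepted gross primary productivity
rs_implied = 65       # Pg C/yr, soil respiration implied by 120 Pg GPP
rs_observed = 95      # Pg C/yr, soil respiration from upscaled measurements
gpp_implied = 147     # Pg C/yr, GPP implied by 95 Pg soil respiration

gap = gpp_implied - gpp_accepted   # 27 Pg C/yr that the books can't reconcile
fossil_emissions = 10              # Pg C/yr, assumed for comparison
print(f"GPP gap: {gap} Pg C/yr, ~{gap / fossil_emissions:.1f}x fossil emissions")
```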

According to the researchers, there are two possibilities for this. The first is that the remote sensing approach may be underestimating gross primary production. The other is the upscaling of soil respiration measurements, which could be overestimating the amount of carbon returned to the atmosphere. Whether this misestimate is a positive or negative thing for the scientifically proven challenge of climate change is what needs to be examined next, Steele said.

The next step for the research is to determine which part of the global carbon cycling model is being under or overestimated.

By having accurate accounting of the carbon and where it is in the ecosystem, better predictions and models will be possible to accurately judge these ecosystems’ response to climate change, said Jian, who began this research as a Ph.D. student at Virginia Tech and is now at Northwest A&F University in China.

“If we think back to how the world was when we were young, the climate has changed,” Jian said. “We have more extreme weather events. This study should improve the models we used for carbon cycling and provide better predictions of what the climate will look like in the future.”

As Steele’s first Ph.D. student at Virginia Tech, a portion of Steele’s startup fund went to support Jian’s graduate research. Jian, fascinated with data science, databases, and soil respiration, was working on another part of his dissertation when he stumbled across something that didn’t quite add up.

Jian was researching how to scale up small, localized carbon measurements from across the globe. In the process, he discovered that the best estimates didn't match up when all the fluxes of global carbon accounting were put together.

Reference: “Historically inconsistent productivity and respiration fluxes in the global terrestrial carbon cycle” by Jinshi Jian, Vanessa Bailey, Kalyn Dorheim, Alexandra G. Konings, Dalei Hao, Alexey N. Shiklomanov, Abigail Snyder, Meredith Steele, Munemasa Teramoto, Rodrigo Vargas and Ben Bond-Lamberty, 1 April 2022, Nature Communications.
DOI: 10.1038/s41467-022-29391-5

The research was funded by Steele’s startup fund from the College of Agriculture and Life Sciences at Virginia Tech and further supported by the Pacific Northwest National Laboratory.


Complex Life May Have Started on Earth Much Earlier Than We Thought


Detailed view of an iron formation with wavy bands. (D. Papineau)

CONOR FEEHLY
14 APRIL 2022

All life on Earth likely emerged from one spark in Earth's early history. Some time later, it diversified, branching off into lineages that helped it survive.

Exactly when these moments occurred has been a point of contention in the scientific community, but new research suggests both steps may have taken place earlier than we previously thought.

The study, led by University College London researchers, builds on evidence of diverse microbial life inside a fist-sized piece of rock from Quebec, Canada, dated to between 3.75 billion and 4.28 billion years old.

In 2017, the researchers who discovered it speculated that structures in the rock – tiny filaments, knobs, and tubes – had been left by ancient bacteria.

But not everyone was convinced that these structures – which would push the date for the first signs of life on Earth back by at least 300 million years – were biological in origin.

The filaments seen here are the stem-like structures indicating the oldest known fossils. (D. Papineau)

However, after further extensive analysis of the rock, the team discovered an even larger and more complex structure than those which were previously identified. Within the rock was a stem-like structure with parallel branches on one side that are nearly a centimeter long, as well as hundreds of distorted spheres, or ellipsoids, alongside the tubes and filaments.

"This means life could have begun as little as 300 million years after Earth formed. In geological terms, this is quick – about one spin of the Sun around the galaxy," says lead author of the study, geochemist Dominic Papineau from UCL.

The key question for Papineau and his colleagues was whether it was possible for these structures to have formed through chemical reactions not related to living things.

According to the paper, some of the smaller structures could conceivably have been the product of abiotic reactions; however, the newly identified 'tree-like' stem is most likely biological in origin, as no structure like it, created through chemical reactions alone, has been found before.

In addition to the structures, researchers identified mineralized chemicals in the rock that could have been byproducts of different types of metabolic processes.



The chemicals are consistent with energy-extraction processes in the bacteria that would have involved iron and sulfur; depending on the interpretation of chemical signatures, there could even be hints of a version of photosynthesis.

This finding points to the possibility that the early Earth – only 300 million years after its formation – was inhabited by an array of microbial life.

The rock was analyzed through a combination of optical observations through Raman microscopes (which use light scattering to determine chemical structures), and digitally recreating sections of the rock with a supercomputer that processed thousands of images from two high-resolution imaging techniques.

The piece of rock in question was collected by Papineau in 2008 from Quebec's Nuvvuagittuq Supracrustal Belt (NSB), which was once a part of the seafloor. The NSB contains some of the oldest sedimentary rocks known on Earth. The fossil-laden rock was also analyzed for levels of rare-earth elements, with researchers finding it did indeed have the same levels as other ancient rock specimens, confirming it was as old as the surrounding volcanic rocks.

Bright red iron and silica-rich rock which contains tubular and filamentous microfossils. (D. Papineau)

Prior to this discovery, the earliest fossil evidence of life came from Western Australia and dates back 3.46 billion years. However, similar contention exists around whether those fossils were biological in origin.

Perhaps the most exciting implications from this discovery are what it means for the potential distribution of life in the Universe. If life was able to develop and evolve in the harsh conditions of the very early Earth, then it may be more common throughout the cosmos than we think.

"This discovery implies that only a few hundred million years are needed for life to evolve to an organized level on a primordial habitable planet," state the authors of the paper.

"We therefore conclude that such microbial ecosystems could exist on other planetary surfaces where liquid water interacted with volcanic rocks, and that these oldest microfossils and dubiofossils reported here from the NSB suggest that extraterrestrial life may be more widespread than previously thought."

The study was published in the journal Science Advances.


A Powerful 'Space Laser' Has Been Detected Beaming From Deep Space


The red crosshairs indicate the host galaxy of Nkalakatha. (ICRAR)

MICHELLE STARR
8 APRIL 2022

Powerful, radio-wavelength laser light has been detected emanating from the greatest distance across deep space yet.

It comes from a type of cosmic object called a megamaser, and its light has traveled a jaw-dropping 5 billion light-years to reach us here on Earth. The astronomers who discovered it using the MeerKAT radio telescope in South Africa have named it Nkalakatha – an isiZulu word meaning "big boss".

The discovery has been accepted for publication in The Astrophysical Journal Letters and is available on the preprint server arXiv.

"It's impressive that, with just a single night of observations, we've already found a record-breaking megamaser," said astronomer Marcin Glowacki of the Curtin University node of the International Centre for Radio Astronomy Research (ICRAR) in Australia.

"It shows just how good the telescope is."

A maser is the microwave equivalent of a laser (light amplification by stimulated emission of radiation). Rather than emitting visible light, a maser emits microwave and radio wavelengths that are stimulated and amplified. For an astrophysical maser, the processes that amplify the light are cosmic; planets, comets, clouds, and stars can all produce masers.

As you may have guessed, a megamaser is a maser with some serious oomph. Generally these emissions are produced by an object that is going absolutely ham in some way; for instance, active supermassive black holes can produce megamasers.

When the data came in from the first night of a survey planned for 3,000 hours, Glowacki and team found the signature of a very specific type of megamaser, bright in wavelengths amplified by stimulated hydroxyl molecules, consisting of one hydrogen atom and one oxygen atom.

Hydroxyl megamasers have a known production mechanism. They are emitted by galaxies that are in the process of, or have recently undergone, a collision with another galaxy, and are bursting with star formation as a result. The gravitational interactions of such a massive encounter compress star-forming gas, causing it to collapse down into baby stars at a tremendous rate.

The source of the megamaser detected by Glowacki and his colleagues is just that, a galaxy named WISEA J033046.26−275518.3 – now known as Nkalakatha.

"When two galaxies like the Milky Way and the Andromeda Galaxy collide, beams of light shoot out from the collision and can be seen at cosmological distances," said astrophysicist Jeremy Darling from the University of Colorado.

"The hydroxyl megamasers act like bright lights that say: here is a collision of galaxies that is making new stars and feeding massive black holes."

The MeerKAT survey was not designed to look for megamasers. It's called Looking at the Distant Universe with the Meerkat Array (LADUMA), and it's looking for a 21-centimeter wavelength emitted by neutral hydrogen in the early Universe, stretched (redshifted) by the expansion of the Universe.

The wavelengths of a hydroxyl megamaser are, however, 18 centimeters; when they're redshifted, they're even longer, and that redshifted signal was within the range detectable by the telescope array.
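The stretching follows the standard redshift relation: the observed wavelength is the rest wavelength multiplied by (1 + z). A minimal sketch, with the redshift treated as an assumed round value rather than the measured one:

```python
# Observed wavelength of a redshifted line: lambda_obs = lambda_rest * (1 + z).
# z here is an assumed round value for illustration; the article quotes a
# light-travel distance of about 5 billion light-years for Nkalakatha.

def observed_wavelength_cm(rest_cm: float, z: float) -> float:
    """Stretch a rest-frame wavelength by cosmological redshift z."""
    return rest_cm * (1.0 + z)

z = 0.5  # assumed redshift
print(f"OH megamaser line: {observed_wavelength_cm(18.0, z):.1f} cm")  # ~27 cm
print(f"Neutral hydrogen:  {observed_wavelength_cm(21.1, z):.1f} cm")  # ~32 cm
```

Both stretched lines land in the same band MeerKAT watches for redshifted hydrogen, which is why a hydrogen survey could stumble on a hydroxyl megamaser.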

Since the region of sky has been extensively observed in other wavelengths, tracking the signal down to a host galaxy was fairly straightforward. Nkalakatha is bright in infrared wavelengths, and has a long tail on one side that glows brightly in radio, likely as the result of being pulled by the gravitational interaction between the two now-merged galaxies.

The team has already planned follow-up observations of the fascinating object, and expect to find many more megamasers as the survey continues.

"MeerKAT will probably double the known number of these rare phenomena," Darling said. "Galaxies were thought to merge more often in the past, and the newly discovered hydroxyl megamasers will allow us to test this hypothesis."

The research has been accepted for publication in The Astrophysical Journal Letters and is available on arXiv.

NFT of first-ever tweet a dud in online auction

NFTs -- such as this one, of the painting 'Madonna del Cardellino' ('Madonna of the Goldfinch') by Raphael -- have been in vogue, but the dud auction of the first-ever tweet could signal the trend is slipping.

An NFT of the first tweet ever posted on Twitter was struggling to attract bidders on Thursday, with the highest offer so far just shy of $10,000—a year after it was bought for $2.9 million.

In what could signal waning interest in non-fungible tokens (NFTs) hawked by athletes, celebrities and tech stars, the famed inaugural Twitter post authored by co-founder Jack Dorsey seemed headed for an epic fail.

It has spent more than a week on the auction block at NFT marketplace OpenSea. The top offer was in the cryptocurrency Ether, its current value translating to just under $10,000, according to the website.

The historic first tweet from the account of @Jack reads "just setting up my twttr."

Malaysia-based entrepreneur Sina Estavi bought the tweet as an authenticated digital token known as an NFT for $2.9 million early last year, but put it up for auction last week hoping to get some $48 million for it and give a chunk of the money to charity.

Dorsey had originally sold his tweet as an NFT in Ether.

Blockchain technology used in making the tokens allows people to own virtual objects with provable authenticity.

Estavi is among the early players in the NFT market, and is chief executive of blockchain platform Bridge Oracle.

The NFT craze has seen a digital work by US artist Beeple sell for $69.3 million at a Christie's auction.


© 2022 AFP

Autonomous vehicles could prove to be future model for delivery services, study finds

Autonomous delivery vehicles are deployed on the streets of Beijing, China. 
Credit: Wikimedia Commons

The notion of self-driving vehicles is currently met with equal parts wonder and alarm. But a new study reveals how the pros may outweigh the cons as a business strategy.

An article titled "Impact of Autonomous Vehicle Assisted Last-Mile Delivery in Urban to Rural Settings" determines that this technology reduces the completion time of delivery tours and provides the most cost-effective business model. It appears in Transportation Science.

"The starting point of this paper involved the United States Postal Service announcing its idea to start using autonomous vehicles in rural routes," said Sara Reed, assistant professor of business analytics at the University of Kansas.

"What we find is autonomous vehicles are cost-effective across all customer geographies, but it's actually in urban environments that will see the most gains, particularly because parking is a challenge and customers are closer together. Serving customers on foot in urban environments becomes more advantageous than in rural environments where one may need to walk long distances between customers after parking."

FedEx and Volkswagen have already utilized autonomous vehicles in countries such as China and Germany. Domino's Pizza and 7-Eleven are now experimenting with bringing food and products to American customers. The USPS is aiming to implement this service by 2025, projecting a deployment on 28,000 rural routes. What once sounded like "Jetsons"-era futurism is now right around the corner.

Reed, who co-wrote the paper with Ann Melissa Campbell and Barrett Thomas of the University of Iowa, said this tech should not be confused with flying drones.

"Our autonomous vehicle is a ground vehicle that can drive itself," she said. "It can go from one place to another without the need for a driver."

In this particular study, the autonomous vehicle assists the delivery person in making deliveries.

"The vehicle drops off the delivery person at one location. They serve customers by delivering packages on foot, and then the vehicle picks them up at an alternate location. So what does that do for the delivery driver? It removes the need to find parking where you leave a vehicle by itself, as well as it removes the delivery person's walk back to the vehicle," Reed said.

The United Parcel Service has used a similar approach during the holiday season in Baltimore, for instance, by putting an extra delivery person onboard its trucks. The model proposed by Reed considers upgrading the technology of the vehicle instead. But which approach ultimately saves more money? On one hand, there's the extra cost of the additional driver; on the other hand, there's the extra cost of autonomous technology.

"Given the wage cost and increased productivity, we show that the autonomous vehicle model is more cost-effective than putting an additional person on a traditional vehicle," she said.

Currently, Reed explained, this technology is very expensive. Thus the "per hour" delivery model costs more.

"But these great reductions in the time of the delivery tour outweigh that increased expenditure," she said. "You also have to think about other advantages introduced by placing autonomous vehicles on the road. They may be driving around more, but they're also not parking, so they're freeing up spots for other people to park and get to businesses downtown."

Overall, her report reveals dramatic savings in both time and money. In urban settings, the prospective savings could eclipse 50%.

On the flip side of this equation is the potential toll on human employment.

"One fear is that  are going to take away the role of the delivery person," she said. "From the aspect of a fully autonomous vehicle—meaning we've eliminated that driver job—our paper shows it's actually more advantageous to have a delivery driver on board in both productivity and cost-effectiveness."

A native of Des Moines, Iowa, Reed is now completing her first year at KU. Her expertise is in transportation logistics. This autonomous vehicle study was part of her dissertation.

She said, "Most of my work concerns 'last-mile delivery'—which is the idea that if you order a package from Amazon, it might go a variety of places until it gets to you, and the last mile is the previous destination to your address. I'm focused on the challenges in that realm."

The "future shock" aspect of her autonomous subject is not lost on Reed.

"At the beginning, I thought this might represent a 'new society' and that type of thing," she said. "But since I first started working on this, some things have already been implemented—not necessarily full-scale on the autonomous vehicle side, but if you think more locally on college campuses, there are places where robots are delivering food to students. I imagine it's weird to see a robot going by, but then you probably get acclimated to it."Amazon applies for patent on secondary delivery vehicle to carry packages from truck to doorstep

More information: Sara Reed et al, Impact of Autonomous Vehicle Assisted Last-Mile Delivery in Urban to Rural Settings, Transportation Science (2022). DOI: 10.1287/trsc.2022.1142

Provided by University of Kansas 

How to compete with robots

Credit: CC0 Public Domain

When it comes to the future of intelligent robots, the first question people ask is often: How many jobs will they make disappear? Whatever the answer, the second question is likely to be: How can I make sure that my job is not among them?

In a study just published in Science Robotics, a team of roboticists from EPFL and economists from the University of Lausanne offers answers to both questions. By combining the scientific and technical literature on robotic abilities with employment and wage statistics, they have developed a method to calculate which currently existing jobs are more at risk of being performed by machines in the near future. Additionally, they have devised a method for suggesting career transitions to jobs that are less at risk and require the smallest retraining effort.

"There are several studies predicting how many jobs will be automated by robots, but they all focus on software robots, such as speech and image recognition, financial robo-advisers, chatbots, and so forth. Furthermore, those predictions wildly oscillate depending on how job requirements and software abilities are assessed. Here, we consider not only artificial intelligence software, but also real  that perform physical work and we developed a method for a systematic comparison of human and robotic abilities used in hundreds of jobs," says Prof. Dario Floreano, Director of EPFL's Laboratory of Intelligent System, who led the study at EPFL.

The key innovation of the study is a new mapping of robot capabilities onto job requirements. The team looked into the European H2020 Robotic Multi-Annual Roadmap (MAR), a strategy document by the European Commission that is periodically revised by robotics experts. The MAR describes dozens of abilities that are required of current robots or may be required of future ones, organized in categories such as manipulation, perception, sensing, and interaction with humans. The researchers went through research papers, patents, and descriptions of robotic products to assess the maturity level of robotic abilities, using a well-known scale for measuring the level of technology development, the "technology readiness level" (TRL).

For human abilities, they relied on the O*net database, a widely used resource on the US job market that classifies approximately 1,000 occupations and breaks down the skills and knowledge most crucial for each of them.

After selectively matching the human abilities from O*net list to robotic abilities from the MAR document, the team could calculate how likely each existing job occupation is to be performed by a robot. Say, for example, that a job requires a human to work at millimeter-level precision of movements. Robots are very good at that, and the TRL of the corresponding ability is thus the highest. If a job requires enough such skills, it will be more likely to be automated than one that requires abilities such as critical thinking or creativity.
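A minimal sketch of that scoring idea, with hypothetical skills, weights and TRL values standing in for the real O*net and MAR data:

```python
# Minimal sketch of the matching idea: score a job's automation risk as the
# importance-weighted average of the technology readiness level (TRL, 1-9)
# of the robotic abilities matched to its required skills. All skill names,
# weights, and TRL values below are hypothetical, not MAR or O*net data.

ROBOT_TRL = {"precise manipulation": 9, "object perception": 7,
             "critical thinking": 2, "social interaction": 3}

def automation_risk(job_skills: dict[str, float]) -> float:
    """job_skills maps each required skill to an importance weight (sum to 1)."""
    return sum(w * ROBOT_TRL[s] / 9 for s, w in job_skills.items())

packer = {"precise manipulation": 0.7, "object perception": 0.3}
physicist = {"critical thinking": 0.8, "social interaction": 0.2}
print(f"packer risk:    {automation_risk(packer):.2f}")     # high, ~0.93
print(f"physicist risk: {automation_risk(physicist):.2f}")  # low, ~0.24
```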

The result is a ranking of the 1,000 jobs, with "Physicists" having the lowest risk of being replaced by a machine, and "Slaughterers and Meat Packers" facing the highest risk. In general, jobs in food processing, building and maintenance, and construction and extraction appear to have the highest risk.

"The key challenge for society today is how to become resilient against " says Prof. Rafael Lalive. who co-led the study at the University of Lausanne. "Our work provides detailed career advice for workers who face high risks of automation, which allows them to take on more secure jobs while re-using many of the skills acquired on the old job. Through this advice, governments can support society in becoming more resilient against automation."

The authors then created a method to find, for any given job, alternative jobs that have a significantly lower automation risk and are reasonably close to the original one in terms of the abilities and knowledge they require—thus keeping the retraining effort minimal and making the career transition feasible. To test how that method would perform in real life, they used data from the US workforce and simulated thousands of career moves based on the algorithm's suggestions, finding that it would indeed allow workers in the occupations with the highest risk to shift towards medium-risk occupations, while undergoing a relatively low retraining effort.
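The career-transition step can be sketched in the same spirit: among jobs whose automation risk is lower by some margin, pick the one whose skill profile is closest to the worker's current job, as a proxy for retraining effort. All names, vectors and thresholds below are invented:

```python
import numpy as np

def suggest_transition(current_name, jobs, risks, margin=0.2):
    """jobs: {name: skill vector}; risks: {name: automation risk in [0, 1]}."""
    current = jobs[current_name]
    safer = [n for n in jobs
             if n != current_name and risks[n] <= risks[current_name] - margin]
    if not safer:
        return None
    # Smallest skill-vector distance stands in for least retraining effort.
    return min(safer, key=lambda n: np.linalg.norm(jobs[n] - current))

jobs = {"packer": np.array([0.9, 0.1, 0.0]),
        "machine operator": np.array([0.7, 0.2, 0.1]),
        "teacher": np.array([0.1, 0.3, 0.9])}
risks = {"packer": 0.9, "machine operator": 0.6, "teacher": 0.2}
print(suggest_transition("packer", jobs, risks))  # -> "machine operator"
```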

The method could be used by governments to measure how many workers could face automation risks and adjust retraining policies; by companies to assess the costs of increasing automation; by robotics manufacturers to better tailor their products to market needs; and by the public to identify the easiest route to reposition themselves on the job market.

Finally, the authors translated the new methods and data into an algorithm that predicts the risk of automation for hundreds of jobs and suggests resilient career transitions at minimal retraining effort, publicly accessible at https://lis2.epfl.ch/resiliencetorobots.

More information: Antonio Paolillo et al, How to Compete with Robots: Assessing Job Automation Risks and Resilient Alternatives, Science Robotics (2022). DOI: 10.1126/scirobotics.abg5561. www.science.org/doi/10.1126/scirobotics.abg5561

Journal information: Science Robotics 

Provided by Ecole Polytechnique Federale de Lausanne 

A new heat engine with no moving parts is as efficient as a steam turbine

A thermophotovoltaic (TPV) cell (size 1 cm x 1 cm) mounted on a heat sink 
designed to measure the TPV cell efficiency. To measure the efficiency, the cell 
is exposed to an emitter and simultaneous measurements of electric power and 
heat flow through the device are taken. Credit: Felice Frankel

Engineers at MIT and the National Renewable Energy Laboratory (NREL) have designed a heat engine with no moving parts. Their new demonstrations show that it converts heat to electricity with over 40 percent efficiency—a performance better than that of traditional steam turbines.

The heat engine is a thermophotovoltaic (TPV) cell, similar to a solar panel's photovoltaic cells, that passively captures high-energy photons from a white-hot heat source and converts them into electricity. The team's design can generate electricity from a heat source of between 1,900 and 2,400 degrees Celsius, or up to about 4,300 degrees Fahrenheit.

The researchers plan to incorporate the TPV cell into a grid-scale thermal battery. The system would absorb excess energy from renewable sources such as the sun and store that energy in heavily insulated banks of hot graphite. When the energy is needed, such as on overcast days, TPV cells would convert the heat into electricity and dispatch the energy to a power grid.

With the new TPV cell, the team has now successfully demonstrated the main parts of the system in separate, small-scale experiments. They are working to integrate the parts to demonstrate a fully operational system. From there, they hope to scale up the system to replace fossil-fuel-driven power plants and enable a fully decarbonized grid, supplied entirely by renewable energy.

"Thermophotovoltaic cells were the last key step toward demonstrating that thermal batteries are a viable concept," says Asegun Henry, the Robert N. Noyce Career Development Professor in MIT's Department of Mechanical Engineering. "This is an absolutely critical step on the path to proliferate renewable energy and get to a fully decarbonized grid."

Henry and his collaborators have published their results today in the journal Nature. Co-authors at MIT include Alina LaPotin, Kevin Schulte, Kyle Buznitsky, Colin Kelsall, Andrew Rohskopf, and Evelyn Wang, the Ford Professor of Engineering and head of the Department of Mechanical Engineering, along with collaborators at NREL in Golden, Colorado.

Jumping the gap

More than 90 percent of the world's electricity comes from sources of heat such as coal, natural gas, nuclear energy, and concentrated solar energy. For a century, steam turbines have been the industrial standard for converting such heat sources into electricity.

On average, steam turbines reliably convert about 35 percent of a heat source into electricity, with about 60 percent representing the highest efficiency of any heat engine to date. But the machinery depends on moving parts that are temperature-limited. Heat sources higher than 2,000 degrees Celsius, such as Henry's proposed thermal battery system, would be too hot for turbines.
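The appeal of higher source temperatures follows from the Carnot bound on any heat engine: efficiency can never exceed 1 - T_cold / T_hot. A quick check, assuming a ~300 K ambient sink and a typical steam-turbine source temperature of roughly 565 degrees Celsius (both values are my assumptions for illustration):

```python
# Carnot limit on any heat engine's efficiency, rising with the hot-side
# temperature. The ~300 K sink and 565 C steam-turbine source temperature
# are assumed values for illustration.

def carnot_efficiency(t_hot_c: float, t_cold_k: float = 300.0) -> float:
    t_hot_k = t_hot_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

for t in (565, 1900, 2400):  # assumed steam-turbine temp vs. TPV source temps
    print(f"{t:>5} C -> Carnot limit {carnot_efficiency(t):.0%}")
# ~64% at steam-turbine temperatures, ~86-89% at 1,900-2,400 C.
```

The thermodynamic ceiling is far higher at the temperatures a thermal battery would run at, if an engine can survive them, which is exactly where a device with no moving parts has the advantage.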

In recent years, scientists have looked into solid-state alternatives—heat engines with no moving parts that could potentially work efficiently at higher temperatures.

"One of the advantages of solid-state energy converters are that they can operate at higher temperatures with lower maintenance costs because they have no moving parts," Henry says. "They just sit there and reliably generate electricity."

Thermophotovoltaic cells offered one exploratory route toward solid-state heat engines. Much like solar cells, TPV cells could be made from semiconducting materials with a particular bandgap—the gap between a material's valence band and its conduction band. If a photon with a high enough energy is absorbed by the material, it can kick an electron across the bandgap, where the electron can then conduct, and thereby generate electricity—doing so without moving rotors or blades.

To date, most TPV cells have only reached efficiencies of around 20 percent, with the record at 32 percent, as they have been made of relatively low-bandgap materials that convert lower-temperature, low-energy photons, and therefore convert energy less efficiently.

Catching light

In their new TPV design, Henry and his colleagues looked to capture higher-energy photons from a higher-temperature heat source, thereby converting energy more efficiently. The team's new cell does so with higher-bandgap materials and multiple junctions, or material layers, compared with existing TPV designs.

The cell is fabricated from three main regions: a high-bandgap alloy, which sits over a slightly lower-bandgap alloy, underneath which is a mirror-like layer of gold. The first layer captures a heat source's highest-energy photons and converts them into electricity, while lower-energy photons that pass through the first layer are captured by the second and converted to add to the generated voltage. Any photons that pass through this second layer are then reflected by the mirror, back to the heat source, rather than being absorbed as wasted heat.

The team tested the cell's efficiency by placing it over a heat flux sensor—a device that directly measures the heat absorbed from the cell. They exposed the cell to a high-temperature lamp and concentrated the light onto the cell. They then varied the bulb's intensity, or temperature, and observed how the cell's power efficiency—the amount of power it produced, compared with the heat it absorbed—changed with temperature. Over a range of 1,900 to 2,400 degrees Celsius, the new TPV cell maintained an efficiency of around 40 percent.
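Put another way, the reported efficiency is the electric power produced divided by the total heat the cell absorbs, which is the electric power plus the heat that flows on through to the sensor. A minimal sketch with made-up sample values:

```python
# TPV efficiency as electrical power out over total heat absorbed
# (electrical power plus the flux registered by the sensor under the cell).
# The sample wattages are illustrative, not the paper's measurements.

def tpv_efficiency(p_electric_w: float, q_heat_w: float) -> float:
    """q_heat_w: heat passing through the cell into the flux sensor."""
    return p_electric_w / (p_electric_w + q_heat_w)

print(f"{tpv_efficiency(p_electric_w=4.0, q_heat_w=6.0):.0%}")  # 40%
```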

"We can get a high efficiency over a broad range of temperatures relevant for thermal batteries," Henry says.

The cell in the experiments is about a square centimeter. For a grid-scale thermal battery system, Henry envisions the TPV cells would have to scale up to about 10,000 square feet (about a quarter of a football field), and would operate in climate-controlled warehouses to draw power from huge banks of stored solar energy. He points out that an infrastructure exists for making large-scale photovoltaic cells, which could also be adapted to manufacture TPVs.

"There's definitely a huge net positive here in terms of sustainability," Henry says. "The technology is safe, environmentally benign in its life cycle, and can have a tremendous impact on abating carbon dioxide emissions from  production."


More information: Alina LaPotin et al, Thermophotovoltaic efficiency of 40%, Nature (2022). DOI: 10.1038/s41586-022-04473-y. www.nature.com/articles/s41586-022-04473-y

Journal information: Nature 

Provided by Massachusetts Institute of Technology

This story is republished courtesy of MIT News (web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation and teaching.

Researchers develop new AI form that can adapt to perform tasks in changeable environments

Robot TIAGo is ready to stack cubes. Credit: Maximilian Diehl

Can robots adapt their own working methods to solve complex tasks? Researchers at Chalmers University of Technology, Sweden, have developed a new form of AI, which, by observing human behavior, can adapt to perform its tasks in a changeable environment. The hope is that robots that can be flexible in this way will be able to work alongside humans to a much greater degree.

"Robots that work in human environments need to be adaptable to the fact that humans are unique, and that we might all solve the same task in a different way. An important area in  development, therefore, is to teach robots how to work alongside humans in dynamic environments," says Maximilian Diehl, Doctoral Student at the Department of Electrical Engineering at Chalmers University of Technology and main researcher behind the project.

When humans carry out a simple task, such as setting a table, we might approach the challenge in several different ways, depending on the conditions. If a chair unexpectedly stands in the way, we could choose to move it or walk around it. We alternate between using our right and left hands, we take pauses, and perform any number of unplanned actions.

But robots do not work in the same way. They need precise programming and instructions all the way to the goal. This approach makes them very efficient in environments where they constantly follow the same pattern, such as factory processing lines. But to successfully interact with people in areas such as healthcare or customer facing roles, robots need to develop much more flexible ways of working.

"In the future we foresee robots accomplish some basic household activities, such as setting and cleaning a table, placing kitchen utensils in the sink, or help organizing groceries," says Karinne Ramirez-Amaro, Assistant Professor at the Department of Electrical Engineering.

The Chalmers University researchers wanted to investigate whether it was possible to teach a robot more humanlike ways to approach solving tasks—to develop an "explainable AI" that extracts general instead of specific information during a demonstration, so that it can then plan a flexible and adaptable path towards a long-term goal. Explainable AI (XAI) is a term that refers to a type of AI where humans can understand how it arrived at a specific decision or result.

This video is part of the publication "Automated Generation of Robotic Planning Domains from Observations", Maximilian Diehl, Chris Paxton and Karinne Ramirez-Amaro. Credit: Reasoning Laboratory

Teaching a robot to stack objects under changing conditions

The researchers asked several people to perform the same task—stacking piles of small cubes—twelve times, in a VR environment. Each time the task was performed in a different way, and the movements the humans made were tracked through a set of laser sensors.

"When we humans have a task, we divide it into a chain of smaller sub-goals along the way, and every action we perform is aimed at fulfilling an intermediate goal. Instead of teaching the robot an exact imitation of human behavior, we focused on identifying what the goals were, looking at all the actions that the people in the study performed," says Karinne Ramirez-Amaro.

The researchers' unique method meant the AI focused on extracting the intent of the sub-goals and built libraries consisting of different actions for each one. Then, the AI created a planning tool that could be used by a TIAGo robot—a mobile service robot designed to work in indoor environments. With the help of the tool, the robot was able to automatically generate a plan for a given task of stacking cubes on top of one another, even when the surrounding conditions were changed.

In short: The robot was given the task of stacking the cubes, and then, depending on the circumstances, which changed slightly for each attempt, chose for itself a combination of several possible actions to form a sequence that would lead to completion of the task. The results were extremely successful.
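A toy flavor of that planning loop, in the classic STRIPS style: each action in the library carries preconditions and effects, and a breadth-first search chains actions until the goal holds. The cube-stacking domain below is invented and far simpler than the researchers' system:

```python
from collections import deque

# Toy planner in the spirit of the paper: search over a library of actions
# (each with preconditions, added facts, and removed facts) for a sequence
# that reaches the goal. The tiny domain here is invented for illustration.
ACTIONS = {
    "pick(A)": ({"clear(A)", "hand_empty"}, {"holding(A)"}, {"hand_empty"}),
    "stack(A,B)": ({"holding(A)", "clear(B)"}, {"on(A,B)", "hand_empty"},
                   {"holding(A)", "clear(B)"}),
}

def plan(state: frozenset, goal: set):
    """Breadth-first search over world states; returns a list of actions."""
    queue, seen = deque([(state, [])]), {state}
    while queue:
        s, p = queue.popleft()
        if goal <= s:
            return p
        for name, (pre, add, rem) in ACTIONS.items():
            if pre <= s:
                nxt = frozenset((s - rem) | add)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, p + [name]))
    return None  # no action sequence reaches the goal

start = frozenset({"clear(A)", "clear(B)", "hand_empty"})
print(plan(start, goal={"on(A,B)"}))  # ['pick(A)', 'stack(A,B)']
```

Because the plan is recomputed from whatever state the robot observes, a moved or missing cube simply changes the starting facts rather than breaking a fixed script, which is the flexibility the Chalmers team is after.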

"With our AI, the robot made plans with a 92% success rate after just a single human demonstration. When the information from all twelve demonstrations was used, the success rate reached up to 100%," says Maximilian Diehl.

The work was presented at the robot conference IROS 2021, one of the world's most prestigious conferences in robotics. In the next phase of the project, the researchers will investigate how robots can communicate to humans and explain what went wrong, and why, if they fail a task.

Industry and healthcare

The long-term goal is to use robots in the industry to help technicians with tasks that could cause long-term health problems, for example, tightening bolts/nuts on truck wheels. In healthcare, it could be tasks like bringing and collecting medicine or food.

"We want to make the job of healthcare professionals easier so that they can focus on tasks which need more attention," says Karinne-Ramirez Amaro.

"It might still take several years until we see genuinely autonomous and multi-purpose robots, mainly because many individual challenges still need to be addressed, like computer vision, control, and safe interaction with humans. However, we believe that our approach will contribute to speeding up the learning process of robots, allowing the robot to connect all of these aspects and apply them in new situations," says Maximilian Diehl.A new framework that could simplify imitation learning in robotics

More information: Read more at research.chalmers.se/project/9253

Provided by Chalmers University of Technology