Friday, April 15, 2022

New Research Shows Estimates of the Carbon Cycle – Vital to Predicting Climate Change – Are Incorrect

Lone Plant Growing

The findings do not counter the established science of climate change, but they highlight that the accounting of the carbon withdrawn by plants and returned by the soil is not accurate.

Virginia Tech researchers, in collaboration with Pacific Northwest National Laboratory, have discovered that key parts of the global carbon cycle used to track movement of carbon dioxide in the environment are not correct, which could significantly alter conventional carbon cycle models.

The estimate of how much carbon dioxide plants pull from the atmosphere is critical to accurately monitor and predict the amount of climate-changing gasses in the atmosphere. This finding has the potential to change predictions for climate change, though it is unclear at this juncture if the mismatch will result in more or less carbon dioxide being accounted for in the environment.

“Either the amount of carbon coming out of the atmosphere from the plants is wrong or the amount coming out of the soil is wrong,” said Meredith Steele, an assistant professor in the School of Plant and Environmental Sciences in the College of Agriculture and Life Sciences, whose Ph.D. student at the time, Jinshi Jian, led the research team. The findings were published on April 1, 2022, in Nature Communications.

Virginia Tech Student Planting

“We are not challenging the well-established climate change science, but we should be able to account for all carbon in the ecosystem and currently cannot. What we found is that the models of the ecosystem’s response to climate change need updating,” said Meredith Steele. Credit: Photo by Logan Wallace for Virginia Tech.

“We are not challenging the well-established climate change science, but we should be able to account for all carbon in the ecosystem and currently cannot,” she said. “What we found is that the models of the ecosystem’s response to climate change need updating.”

Jian and Steele’s work focuses on carbon cycling and how plants and soil remove and return carbon dioxide in the atmosphere.

To understand how carbon affects ecosystems on Earth, it's important to know exactly where all the carbon is going. This process, called carbon accounting, tracks how much carbon goes where and how much resides in each of Earth's carbon pools: the oceans, the atmosphere, the land, and living things.

For decades, researchers have been trying to get an accurate accounting of where our carbon is and where it is going. Virginia Tech and Pacific Northwest National Laboratory researchers focused on the carbon dioxide that gets drawn out of the atmosphere by plants through photosynthesis.

When animals eat plants, the carbon moves through the terrestrial ecosystem, passing into the soil or into other animals. A large amount of carbon is also exhaled — or respired — back into the atmosphere.

This carbon dioxide that’s coming in and going out is essential for balancing the amount of carbon in the atmosphere, which contributes to climate change and storing carbon long-term.

However, the Virginia Tech researchers discovered that when the accepted numbers for soil respiration are used, the carbon budget in the cycling models no longer balances.

“Photosynthesis and respiration are the driving forces of the carbon cycle, however the total annual sum of each of these at the global scale has been elusive to measure,” said Lisa Welp, an associate professor of earth, atmospheric, and planetary sciences at Purdue University, who is familiar with the work but was not part of the research. “The authors’ attempts to reconcile these global estimates from different communities show us that they are not entirely self-consistent and there is more to learn about these fundamental processes on the planet.”

What Jian and Steele, along with the rest of the team, found is that if the accepted figure for gross primary productivity — 120 petagrams, where each petagram is a billion metric tons — is used, the amount of carbon coming out through soil respiration should be in the neighborhood of 65 petagrams.

By analyzing multiple fluxes — the amounts of carbon exchanged between Earth's carbon pools of the oceans, atmosphere, land, and living things — the researchers discovered that the amount of carbon coming out of the soil through respiration is about 95 petagrams, which implies the gross primary productivity should be around 147 petagrams. For scale, the difference between this estimate and the currently accepted figure of 120 petagrams is about three times the global fossil fuel emissions each year.
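To see the size of the mismatch, here is a minimal back-of-the-envelope sketch in Python using the figures quoted above; the fossil-fuel emissions value is an assumed round number (roughly 9–10 petagrams of carbon per year), not a figure from the paper.

# Back-of-the-envelope check of the carbon-budget mismatch described above.
# All values in petagrams (Pg) of carbon per year; figures from the article.

GPP_ACCEPTED = 120      # accepted global gross primary productivity
RS_EXPECTED = 65        # soil respiration implied by GPP_ACCEPTED
RS_OBSERVED = 95        # soil respiration from upscaled field measurements
GPP_IMPLIED = 147       # GPP implied by RS_OBSERVED
FOSSIL_EMISSIONS = 9.5  # assumed annual fossil fuel emissions, Pg C (approximate)

gap = GPP_IMPLIED - GPP_ACCEPTED
print(f"GPP gap: {gap} Pg")                                       # 27 Pg
print(f"Gap vs fossil emissions: {gap / FOSSIL_EMISSIONS:.1f}x")  # ~2.8x, i.e. "about three times"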

According to the researchers, there are two possible explanations. The first is that the remote sensing approach may be underestimating gross primary production. The other is that the upscaling of soil respiration measurements could be overestimating the amount of carbon returned to the atmosphere. Whether this misestimate is good or bad news for the challenge of climate change is what needs to be examined next, Steele said.

The next step for the research is to determine which part of the global carbon cycling model is being underestimated or overestimated.

By having accurate accounting of the carbon and where it is in the ecosystem, better predictions and models will be possible to accurately judge these ecosystems’ response to climate change, said Jian, who began this research as a Ph.D. student at Virginia Tech and is now at Northwest A&F University in China.

“If we think back to how the world was when we were young, the climate has changed,” Jian said. “We have more extreme weather events. This study should improve the models we used for carbon cycling and provide better predictions of what the climate will look like in the future.”

As Steele’s first Ph.D. student at Virginia Tech, a portion of Steele’s startup fund went to support Jian’s graduate research. Jian, fascinated with data science, databases, and soil respiration, was working on another part of his dissertation when he stumbled across something that didn’t quite add up.

Jian was researching how to scale up small, localized carbon measurements from across the globe when he discovered that the best estimates didn't match up once all the fluxes of global carbon accounting were put together.

Reference: “Historically inconsistent productivity and respiration fluxes in the global terrestrial carbon cycle” by Jinshi Jian, Vanessa Bailey, Kalyn Dorheim, Alexandra G. Konings, Dalei Hao, Alexey N. Shiklomanov, Abigail Snyder, Meredith Steele, Munemasa Teramoto, Rodrigo Vargas and Ben Bond-Lamberty, 1 April 2022, Nature Communications.
DOI: 10.1038/s41467-022-29391-5

The research was funded by Steele’s startup fund from the College of Agriculture and Life Sciences at Virginia Tech and further supported by the Pacific Northwest National Laboratory.


Complex Life May Have Started on Earth Much Earlier Than We Thought


Detailed view of an iron formation with wavy bands. (D. Papineau)

CONOR FEEHLY
14 APRIL 2022

All life on Earth likely emerged from one spark in Earth's early history. Some time later, it diversified, branching off into lineages that helped it survive.

Exactly when these moments occurred has been a point of contention in the scientific community, but new research suggests both steps may have taken place earlier than we previously thought.

The study, led by University College London researchers, builds on evidence of diverse microbial life inside a fist-sized piece of rock from Quebec in Canada, dated to between 3.75 and 4.28 billion years old.

In 2017, the researchers who discovered it speculated that structures in the rock – tiny filaments, knobs, and tubes – had been left by ancient bacteria.

But not everyone was convinced that these structures – which would push the date for the first signs of life on Earth back by at least 300 million years – were biological in origin.

The filaments seen here are the stem-like structures indicating the oldest known fossils. (D. Papineau)

However, after further extensive analysis of the rock, the team discovered an even larger and more complex structure than those which were previously identified. Within the rock was a stem-like structure with parallel branches on one side that are nearly a centimeter long, as well as hundreds of distorted spheres, or ellipsoids, alongside the tubes and filaments.

"This means life could have begun as little as 300 million years after Earth formed. In geological terms, this is quick – about one spin of the Sun around the galaxy," says lead author of the study, geochemist Dominic Papineau from UCL.

The key question for Papineau and his colleagues was whether it was possible for these structures to have formed through chemical reactions not related to living things.

According to the paper, some of the smaller structures could conceivably have been the product of abiotic reactions; however, the newly identified 'tree-like' stem is most likely biological in origin, as no structure like it, created through chemical reactions alone, has been found before.

In addition to the structures, researchers identified mineralized chemicals in the rock that could have been byproducts of different types of metabolic processes.



The chemicals are consistent with energy-extraction processes in the bacteria that would have involved iron and sulfur; depending on the interpretation of chemical signatures, there could even be hints of a version of photosynthesis.

This finding points to the possibility that the early Earth – only 300 million years after its formation – was inhabited by an array of microbial life.

The rock was analyzed through a combination of optical observations through Raman microscopes (which use light scattering to determine chemical structures), and digitally recreating sections of the rock with a supercomputer that processed thousands of images from two high-resolution imaging techniques.

The piece of rock in question was collected by Papineau in 2008 from Quebec's Nuvvuagittuq Supracrustal Belt (NSB), which was once a part of the seafloor. The NSB contains some of the oldest sedimentary rocks known on Earth. The fossil-laden rock was also analyzed for levels of rare-earth elements, with researchers finding it did indeed have the same levels as other ancient rock specimens, confirming it was as old as the surrounding volcanic rocks.

Bright red iron and silica-rich rock which contains tubular and filamentous microfossils. (D. Papineau)

Prior to this discovery, the earliest fossil evidence of life came from Western Australia and dates back 3.46 billion years. However, similar contention exists around whether those fossils were biological in origin.

Perhaps the most exciting implications from this discovery are what it means for the potential distribution of life in the Universe. If life was able to develop and evolve in the harsh conditions of the very early Earth, then it may be more common throughout the cosmos than we think.

"This discovery implies that only a few hundred million years are needed for life to evolve to an organized level on a primordial habitable planet," state the authors of the paper.

"We therefore conclude that such microbial ecosystems could exist on other planetary surfaces where liquid water interacted with volcanic rocks, and that these oldest microfossils and dubiofossils reported here from the NSB suggest that extraterrestrial life may be more widespread than previously thought."

The study was published in the journal Science Advances.


A Powerful 'Space Laser' Has Been Detected Beaming From Deep Space


The red crosshairs indicate the host galaxy of Nkalakatha. (ICRAR)

MICHELLE STARR
8 APRIL 2022

Powerful, radio-wavelength laser light has been detected emanating from deeper in space than ever before.

Its source is a type of cosmic object called a megamaser, and its light has traveled for a jaw-dropping 5 billion light-years to reach us here on Earth. The astronomers who discovered it using the MeerKAT radio telescope in South Africa have named it Nkalakatha – an isiZulu word meaning "big boss".

The discovery has been accepted into The Astrophysical Journal Letters and is available on preprint server arXiv.

"It's impressive that, with just a single night of observations, we've already found a record-breaking megamaser," said astronomer Marcin Glowacki of the Curtin University node of the International Centre for Radio Astronomy Research (ICRAR) in Australia.

"It shows just how good the telescope is."

A maser is the microwave equivalent of a laser (light amplification by stimulated emission of radiation). Rather than emitting visible light, a maser emits microwave and radio wavelengths that are stimulated and amplified. For an astrophysical maser, the processes that amplify the light are cosmic; planets, comets, clouds, and stars can all produce masers.

As you may have guessed, a megamaser is therefore a maser with some serious oomph. Generally these emissions are produced by an object that is going absolutely ham in some way; for instance, active supermassive black holes can produce megamasers.

When the data came in from the first night of a survey planned for 3,000 hours, Glowacki and team found the signature of a very specific type of megamaser, bright in wavelengths amplified by stimulated hydroxyl molecules, consisting of one hydrogen atom and one oxygen atom.

Hydroxyl megamasers have a known production mechanism. They are emitted by galaxies that are in the process of, or have recently undergone, a collision with another galaxy, and are bursting with star formation as a result. The gravitational interactions of such a massive encounter compress star-forming gas, causing it to collapse down into baby stars at a tremendous rate.

The source of the megamaser detected by Glowacki and his colleagues is just such a galaxy: WISEA J033046.26−275518.3 – now known as Nkalakatha.

"When two galaxies like the Milky Way and the Andromeda Galaxy collide, beams of light shoot out from the collision and can be seen at cosmological distances," said astrophysicist Jeremy Darling from the University of Colorado.

"The hydroxyl megamasers act like bright lights that say: here is a collision of galaxies that is making new stars and feeding massive black holes."

The MeerKAT survey was not designed to look for megamasers. It's called Looking at the Distant Universe with the MeerKAT Array (LADUMA), and it's looking for a 21-centimeter wavelength emitted by neutral hydrogen in the early Universe, stretched (redshifted) by the expansion of the Universe.

The wavelengths of a hydroxyl megamaser are, however, 18 centimeters; when they're redshifted, they're even longer, and that redshifted signal was within the range detectable by the telescope array.
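To illustrate why the redshifted hydroxyl line lands in LADUMA's band, here is a small Python sketch of the standard redshift relation; the redshift value is an assumption chosen to be roughly consistent with 5 billion light-years of light travel, not a number taken from the paper.

# Observed wavelength of a redshifted emission line: lambda_obs = lambda_rest * (1 + z)
def observed_wavelength(rest_cm: float, z: float) -> float:
    return rest_cm * (1 + z)

Z = 0.5  # assumed redshift, roughly consistent with ~5 billion light-years of travel

print(observed_wavelength(18.0, Z))  # hydroxyl megamaser line: ~27 cm
print(observed_wavelength(21.0, Z))  # neutral-hydrogen line LADUMA targets: ~31.5 cm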

Since the region of sky has been extensively observed in other wavelengths, tracking the signal down to a host galaxy was fairly straightforward. Nkalakatha is bright in infrared wavelengths, and has a long tail on one side that glows brightly in radio, likely as the result of being pulled by the gravitational interaction between the two now-merged galaxies.

The team has already planned follow-up observations of the fascinating object, and expect to find many more megamasers as the survey continues.

"MeerKAT will probably double the known number of these rare phenomena," Darling said. "Galaxies were thought to merge more often in the past, and the newly discovered hydroxyl megamasers will allow us to test this hypothesis."

The research has been accepted into The Astrophysical Journal Letters and is available on arXiv.

NFT of first-ever tweet a dud in online auction

NFTs -- such as this one, of the painting 'Madonna del Cardellino' ('Madonna of the Goldfinch') by Raphael -- have been in vogue, but the dud auction of the first-ever tweet could signal the trend is slipping.

An NFT of the first tweet ever posted on Twitter was struggling to attract bidders on Thursday, with the highest offer so far just shy of $10,000—a year after it was bought for $2.9 million.

In what could signal waning interest in non-fungible tokens (NFTs) hawked by athletes, celebrities and tech stars, the famed inaugural Twitter post authored by co-founder Jack Dorsey seemed headed for an epic fail.

It has spent more than a week on the auction block at NFT marketplace OpenSea. The top offer was in the cryptocurrency Ether, its current value translating to just under $10,000, according to the website.

The historic first tweet from the account of @Jack reads "just setting up my twttr."

Malaysia-based entrepreneur Sina Estavi bought the tweet as an authenticated digital token known as an NFT for $2.9 million early last year, but put it up for auction last week hoping to get some $48 million for it and give a chunk of the money to charity.

Dorsey had originally sold his tweet as an NFT in Ether.

Blockchain technology used in making the tokens allows people to own virtual objects with provable authenticity.

Estavi is among the early players in the NFT market, and is chief executive of blockchain platform Bridge Oracle.

The NFT trend includes a digital work by US artist Beeple selling for $69.3 million at a Christie's auction.


© 2022 AFP

Autonomous vehicles could prove to be future model for delivery services, study finds

Autonomous delivery vehicles are deployed on the streets of Beijing, China. 
Credit: Wikimedia Commons

The notion of self-driving vehicles is currently met with equal parts wonder and alarm. But a new study reveals how the pros may outweigh the cons as a business strategy.

An article titled "Impact of Autonomous Vehicle Assisted Last-Mile Delivery in Urban to Rural Settings" determines that this technology reduces the completion time of delivery tours and provides the most cost-effective business model. It appears in Transportation Science.

"The starting point of this paper involved the United States Postal Service announcing its idea to start using autonomous vehicles in rural routes," said Sara Reed, assistant professor of business analytics at the University of Kansas.

"What we find is autonomous vehicles are cost-effective across all customer geographies, but it's actually in urban environments that will see the most gains, particularly because parking is a challenge and customers are closer together. Serving customers on foot in urban environments becomes more advantageous than in rural environments where one may need to walk long distances between customers after parking."

FedEx and Volkswagen have already utilized autonomous vehicles in countries such as China and Germany. Domino's Pizza and 7-Eleven are now experimenting with bringing food and products to American customers. The USPS is aiming to implement this service by 2025, projecting a deployment on 28,000 rural routes. What once sounded like "Jetsons"-era futurism is now right around the corner.

Reed, who co-wrote the paper with Ann Melissa Campbell and Barrett Thomas of the University of Iowa, said this tech should not be confused with flying drones.

"Our autonomous vehicle is a ground vehicle that can drive itself," she said. "It can go from one place to another without the need for a driver."

In this particular study, the autonomous vehicle assists the delivery person in making deliveries.

"The vehicle drops off the delivery person at one location. They serve customers by delivering packages on foot, and then the vehicle picks them up at an alternate location. So what does that do for the delivery driver? It removes the need to find parking where you leave a vehicle by itself, as well as it removes the delivery person's walk back to the vehicle," Reed said.

The United Parcel Service has used a similar approach during the holiday season in Baltimore, for instance, by putting an extra delivery person onboard its trucks. The model proposed by Reed considers upgrading the technology of the vehicle instead. But which approach ultimately saves more money? On one hand, there's the extra cost of the additional driver; on the other hand, there's the extra cost of autonomous technology.

"Given the wage cost and increased productivity, we show that the autonomous vehicle model is more cost-effective than putting an additional person on a traditional vehicle," she said.

Currently, Reed explained, this technology is very expensive. Thus the "per hour" delivery model costs more.

"But these great reductions in the time of the delivery tour outweigh that increased expenditure," she said. "You also have to think about other advantages introduced by placing autonomous vehicles on the road. They may be driving around more, but they're also not parking, so they're freeing up spots for other people to park and get to businesses downtown."

Overall, her report reveals dramatic savings in both time and money. In urban settings, the prospective savings could eclipse 50%.

On the flip side of this equation is the potential toll on human employment.

"One fear is that  are going to take away the role of the delivery person," she said. "From the aspect of a fully autonomous vehicle—meaning we've eliminated that driver job—our paper shows it's actually more advantageous to have a delivery driver on board in both productivity and cost-effectiveness."

A native of Des Moines, Iowa, Reed is now completing her first year at KU. Her expertise is in transportation logistics. This autonomous vehicle study was part of her dissertation.

She said, "Most of my work concerns 'last-mile delivery'—which is the idea that if you order a package from Amazon, it might go a variety of places until it gets to you, and the last mile is the previous destination to your address. I'm focused on the challenges in that realm."

The "future shock" aspect of her autonomous subject is not lost on Reed.

"At the beginning, I thought this might represent a 'new society' and that type of thing," she said. "But since I first started working on this, some things have already been implemented—not necessarily full-scale on the autonomous vehicle side, but if you think more locally on college campuses, there are places where robots are delivering food to students. I imagine it's weird to see a robot going by, but then you probably get acclimated to it."Amazon applies for patent on secondary delivery vehicle to carry packages from truck to doorstep

More information: Sara Reed et al, Impact of Autonomous Vehicle Assisted Last-Mile Delivery in Urban to Rural Settings, Transportation Science (2022). DOI: 10.1287/trsc.2022.1142

Provided by University of Kansas 

How to compete with robots

Automation
Credit: CC0 Public Domain

When it comes to the future of intelligent robots, the first question people ask is often: How many jobs will they make disappear? Whatever the answer, the second question is likely to be: How can I make sure that my job is not among them?

In a study just published in Science Robotics, a team of roboticists from EPFL and economists from the University of Lausanne offers answers to both questions. By combining the scientific and technical literature on robotic abilities with employment and wage statistics, they have developed a method to calculate which existing jobs are more at risk of being performed by machines in the near future. They have also devised a method for suggesting career transitions to jobs that are less at risk and require the smallest retraining effort.

"There are several studies predicting how many jobs will be automated by robots, but they all focus on software robots, such as speech and image recognition, financial robo-advisers, chatbots, and so forth. Furthermore, those predictions wildly oscillate depending on how job requirements and software abilities are assessed. Here, we consider not only artificial intelligence software, but also real  that perform physical work and we developed a method for a systematic comparison of human and robotic abilities used in hundreds of jobs," says Prof. Dario Floreano, Director of EPFL's Laboratory of Intelligent System, who led the study at EPFL.

The key innovation of the study is a new mapping of robot capabilities onto job requirements. The team looked into the European H2020 Robotic Multi-Annual Roadmap (MAR), a strategy document by the European Commission that is periodically revised by robotics experts. The MAR describes dozens of abilities that are required of current robots or may be required of future ones, organized in categories such as manipulation, perception, sensing, and interaction with humans. The researchers went through research papers, patents, and descriptions of robotic products to assess the maturity level of each robotic ability, using a well-known scale for measuring the level of technology development: the "technology readiness level" (TRL).

For human abilities, they relied on the O*net database, a widely used resource on the US job market, which classifies approximately 1,000 occupations and breaks down the skills and knowledge that are most crucial for each of them.

After selectively matching the human abilities from the O*net list to robotic abilities from the MAR document, the team could calculate how likely each existing job occupation is to be performed by a robot. Say, for example, that a job requires a human to work at millimeter-level precision of movements. Robots are very good at that, and the TRL of the corresponding ability is thus the highest. If a job requires enough such skills, it will be more likely to be automated than one that requires abilities such as critical thinking or creativity.
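A minimal sketch of that matching logic might look like the following; the ability names, TRL values, and importance weights are invented for illustration and are not taken from the MAR or O*net.

# Sketch of the study's idea: score a job's automation risk by matching the human
# abilities it requires to the technology readiness level (TRL, 1-9) of the
# corresponding robotic abilities. Ability names and numbers are illustrative.

ROBOT_TRL = {
    "fine_manipulation": 9,   # robots excel at millimeter-precision movement
    "object_recognition": 7,
    "critical_thinking": 2,
    "creativity": 1,
}

def automation_risk(required_abilities: dict[str, float]) -> float:
    """Weighted average of robotic TRLs over the abilities a job requires,
    normalized to [0, 1]. Weights encode how important each ability is."""
    total_weight = sum(required_abilities.values())
    score = sum(ROBOT_TRL[a] * w for a, w in required_abilities.items())
    return score / (9 * total_weight)

# A precision assembly job vs. a research job (importance weights sum to 1).
print(automation_risk({"fine_manipulation": 0.7, "object_recognition": 0.3}))  # ~0.93, high
print(automation_risk({"critical_thinking": 0.6, "creativity": 0.4}))          # ~0.18, low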

The result is a ranking of the 1,000 jobs, with "Physicists" having the lowest risk of being replaced by a machine and "Slaughterers and Meat Packers" facing the highest. In general, jobs in building and maintenance, construction, and extraction appear to have the highest risk.

"The key challenge for society today is how to become resilient against " says Prof. Rafael Lalive. who co-led the study at the University of Lausanne. "Our work provides detailed career advice for workers who face high risks of automation, which allows them to take on more secure jobs while re-using many of the skills acquired on the old job. Through this advice, governments can support society in becoming more resilient against automation."

The authors then created a method to find, for any given job, alternative jobs that have a significantly lower automation risk and are reasonably close to the original one in terms of the abilities and knowledge they require—thus keeping the retraining effort minimal and making the career transition feasible. To test how that method would perform in real life, they used data from the US workforce and simulated thousands of career moves based on the algorithm's suggestions, finding that it would indeed allow workers in the occupations with the highest risk to shift towards medium-risk occupations, while undergoing a relatively low retraining effort.
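In the same spirit, the career-transition search can be sketched as a nearest-neighbor query over ability profiles, keeping only jobs with meaningfully lower risk; the job profiles and risk scores below are again invented.

# Sketch: suggest a transition to the most similar job whose automation risk is
# significantly lower, so retraining effort stays small. Data is illustrative.

import math

def ability_distance(a: dict[str, float], b: dict[str, float]) -> float:
    """Euclidean distance between two jobs' ability-importance profiles;
    a proxy for retraining effort."""
    keys = set(a) | set(b)
    return math.sqrt(sum((a.get(k, 0) - b.get(k, 0)) ** 2 for k in keys))

def suggest_transition(current: str, jobs: dict, risk_margin: float = 0.2):
    profile, risk = jobs[current]
    candidates = [(ability_distance(profile, p), name)
                  for name, (p, r) in jobs.items()
                  if name != current and r <= risk - risk_margin]
    return min(candidates)[1] if candidates else None

jobs = {
    "meat_packer": ({"fine_manipulation": 0.8, "stamina": 0.2}, 0.9),
    "maintenance": ({"fine_manipulation": 0.5, "diagnosis": 0.5}, 0.6),
    "physicist":   ({"critical_thinking": 0.7, "creativity": 0.3}, 0.1),
}
print(suggest_transition("meat_packer", jobs))  # "maintenance": close skills, lower risk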

The method could be used by governments to measure how many workers face automation risks and to adjust retraining policies, by companies to assess the costs of increasing automation, by robotics manufacturers to better tailor their products to market needs, and by the public to identify the easiest route to reposition themselves on the job market.

Finally, the authors translated the new methods and data into an algorithm that predicts the risk of automation for hundreds of jobs and suggests resilient career transitions at minimal retraining effort, publicly accessible at https://lis2.epfl.ch/resiliencetorobots.

More information: Antonio Paolillo et al, How to Compete with Robots: Assessing Job Automation Risks and Resilient Alternatives, Science Robotics (2022). DOI: 10.1126/scirobotics.abg5561. www.science.org/doi/10.1126/scirobotics.abg5561


Provided by Ecole Polytechnique Federale de Lausanne 

A new heat engine with no moving parts is as efficient as a steam turbine

A thermophotovoltaic (TPV) cell (size 1 cm x 1 cm) mounted on a heat sink 
designed to measure the TPV cell efficiency. To measure the efficiency, the cell 
is exposed to an emitter and simultaneous measurements of electric power and 
heat flow through the device are taken. Credit: Felice Frankel

Engineers at MIT and the National Renewable Energy Laboratory (NREL) have designed a heat engine with no moving parts. Their new demonstrations show that it converts heat to electricity with over 40 percent efficiency—a performance better than that of traditional steam turbines.

The heat engine is a thermophotovoltaic (TPV) cell, similar to a solar panel's photovoltaic cells, that passively captures high-energy photons from a white-hot heat source and converts them into electricity. The team's design can generate electricity from a heat source of between 1,900 and 2,400 degrees Celsius, or up to about 4,300 degrees Fahrenheit.

The researchers plan to incorporate the TPV cell into a grid-scale thermal battery. The system would absorb energy from renewable sources such as the sun and store that energy in heavily insulated banks of hot graphite. When the energy is needed, such as on overcast days, TPV cells would convert the heat into electricity and dispatch the energy to a power grid.

With the new TPV cell, the team has now successfully demonstrated the main parts of the system in separate, small-scale experiments. They are working to integrate the parts to demonstrate a fully operational system. From there, they hope to scale up the system to replace fossil-fuel-driven power plants and enable a fully decarbonized power grid, supplied entirely by renewable energy.

"Thermophotovoltaic cells were the last key step toward demonstrating that thermal batteries are a viable concept," says Asegun Henry, the Robert N. Noyce Career Development Professor in MIT's Department of Mechanical Engineering. "This is an absolutely critical step on the path to proliferate renewable energy and get to a fully decarbonized grid."

Henry and his collaborators have published their results today in the journal Nature. Co-authors at MIT include Alina LaPotin, Kevin Schulte, Kyle Buznitsky, Colin Kelsall, Andrew Rohskopf, and Evelyn Wang, the Ford Professor of Engineering and head of the Department of Mechanical Engineering, along with collaborators at NREL in Golden, Colorado.

Jumping the gap

More than 90 percent of the world's electricity comes from sources of heat such as coal, natural gas, nuclear energy, and concentrated solar energy. For a century, steam turbines have been the industrial standard for converting such heat sources into electricity.

On average, steam turbines reliably convert about 35 percent of a heat source into electricity, with about 60 percent representing the highest efficiency of any heat engine to date. But the machinery depends on moving parts that are temperature-limited. Heat sources higher than 2,000 degrees Celsius, such as Henry's proposed thermal battery system, would be too hot for turbines.

In recent years, scientists have looked into solid-state alternatives—heat engines with no moving parts that could potentially work efficiently at higher temperatures.

"One of the advantages of solid-state energy converters are that they can operate at higher temperatures with lower maintenance costs because they have no moving parts," Henry says. "They just sit there and reliably generate electricity."

Thermophotovoltaic cells offered one exploratory route toward solid-state heat engines. Much like solar cells, TPV cells could be made from semiconducting materials with a particular bandgap—the gap between a material's valence band and its conduction band. If a photon with a high enough energy is absorbed by the material, it can kick an electron across the bandgap, where the electron can then conduct, and thereby generate electricity—doing so without moving rotors or blades.
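The absorption condition described here is just a photon-energy threshold, which is easy to check numerically; the bandgap below is an assumed example value, not a property of the actual cell's materials.

# A photon can excite an electron across the bandgap only if its energy
# E = h*c / lambda exceeds the gap. Constants in SI units.

H = 6.626e-34   # Planck constant, J*s
C = 3.0e8       # speed of light, m/s
EV = 1.602e-19  # joules per electronvolt

def photon_energy_ev(wavelength_nm: float) -> float:
    return H * C / (wavelength_nm * 1e-9) / EV

BANDGAP_EV = 1.2  # assumed example bandgap, not the actual cell's value

for wl in (700, 1000, 1500):  # wavelengths in nm
    e = photon_energy_ev(wl)
    print(f"{wl} nm -> {e:.2f} eV, absorbed: {e >= BANDGAP_EV}")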

To date, most TPV cells have only reached efficiencies of around 20 percent, with the record at 32 percent, as they have been made of relatively low-bandgap materials that convert lower-temperature, low-energy photons, and therefore convert energy less efficiently.

Catching light

In their new TPV design, Henry and his colleagues looked to capture higher-energy photons from a higher-temperature heat source, thereby converting energy more efficiently. The team's new cell does so with higher-bandgap materials and multiple junctions, or material layers, compared with existing TPV designs.

The cell is fabricated from three main regions: a high-bandgap alloy, which sits over a slightly lower-bandgap alloy, underneath which is a mirror-like layer of gold. The first layer captures a heat source's highest-energy photons and converts them into electricity, while lower-energy photons that pass through the first layer are captured by the second and converted to add to the generated voltage. Any photons that pass through this second layer are then reflected by the mirror, back to the heat source, rather than being absorbed as wasted heat.

The team tested the cell's efficiency by placing it over a heat flux sensor—a device that directly measures the heat absorbed from the cell. They exposed the cell to a high-temperature lamp and concentrated the light onto the cell. They then varied the bulb's intensity, or temperature, and observed how the cell's power efficiency—the amount of power it produced, compared with the heat it absorbed—changed with temperature. Over a range of 1,900 to 2,400 degrees Celsius, the new TPV cell maintained an efficiency of around 40 percent.
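The efficiency figure follows directly from the two measurements described: electrical power out, divided by the total heat the cell absorbs (the electric power plus the heat flowing on through to the sensor). A minimal sketch with invented readings:

# TPV efficiency as measured in the setup above: electrical power produced by
# the cell divided by the heat it absorbs (electric power + heat flux passing
# through to the heat sink). The readings below are invented for illustration.

def tpv_efficiency(electric_power_w: float, heat_flow_w: float) -> float:
    absorbed = electric_power_w + heat_flow_w  # absorbed energy ends up as one or the other
    return electric_power_w / absorbed

print(tpv_efficiency(electric_power_w=4.0, heat_flow_w=6.0))  # 0.40, i.e. 40%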

"We can get a high efficiency over a broad range of temperatures relevant for thermal batteries," Henry says.

The cell in the experiments is about a square centimeter. For a grid-scale thermal battery system, Henry envisions the TPV cells would have to scale up to about 10,000 square feet (about a quarter of a football field), and would operate in climate-controlled warehouses to draw power from huge banks of stored solar energy. He points out that an infrastructure exists for making large-scale photovoltaic cells, which could also be adapted to manufacture TPVs.

"There's definitely a huge net positive here in terms of sustainability," Henry says. "The technology is safe, environmentally benign in its life cycle, and can have a tremendous impact on abating carbon dioxide emissions from  production."


More information: Asegun Henry, Thermophotovoltaic efficiency of 40%, Nature (2022). DOI: 10.1038/s41586-022-04473-y. www.nature.com/articles/s41586-022-04473-y


Provided by Massachusetts Institute of Technology

This story is republished courtesy of MIT News (web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation and teaching.

Researchers develop new AI form that can adapt to perform tasks in changeable environments

Robot TIAGo is ready to stack cubes. Credit: Maximilian Diehl

Can robots adapt their own working methods to solve complex tasks? Researchers at Chalmers University of Technology, Sweden, have developed a new form of AI, which, by observing human behavior, can adapt to perform its tasks in a changeable environment. The hope is that robots that can be flexible in this way will be able to work alongside humans to a much greater degree.

"Robots that work in human environments need to be adaptable to the fact that humans are unique, and that we might all solve the same task in a different way. An important area in  development, therefore, is to teach robots how to work alongside humans in dynamic environments," says Maximilian Diehl, Doctoral Student at the Department of Electrical Engineering at Chalmers University of Technology and main researcher behind the project.

When humans carry out a simple task, such as setting a table, we might approach the challenge in several different ways, depending on the conditions. If a chair unexpectedly stands in the way, we could choose to move it or walk around it. We alternate between using our right and left hands, we take pauses, and perform any number of unplanned actions.

But robots do not work in the same way. They need precise programming and instructions all the way to the goal. This approach makes them very efficient in environments where they constantly follow the same pattern, such as factory processing lines. But to successfully interact with people in areas such as healthcare or customer facing roles, robots need to develop much more flexible ways of working.

"In the future we foresee robots accomplish some basic household activities, such as setting and cleaning a table, placing kitchen utensils in the sink, or help organizing groceries," says Karinne Ramirez-Amaro, Assistant Professor at the Department of Electrical Engineering.

The Chalmers University researchers wanted to investigate whether it was possible to teach a robot more humanlike ways to approach solving tasks—to develop an "explainable AI" that extracts general instead of specific information during a demonstration, so that it can then plan a flexible and adaptable path towards a long-term goal. Explainable AI (XAI) is a term that refers to a type of AI where humans can understand how it arrived at a specific decision or result.

This video is part of the publication "Automated Generation of Robotic Planning Domains from Observations", Maximilian Diehl, Chris Paxton and Karinne Ramirez-Amaro. Credit: Reasoning Laboratory

Teaching a robot to stack objects under changing conditions

The researchers asked several people to perform the same task—stacking piles of small cubes—twelve times, in a VR environment. Each time the task was performed in a different way, and the movements the humans made were tracked through a set of laser sensors.

"When we humans have a task, we divide it into a chain of smaller sub-goals along the way, and every action we perform is aimed at fulfilling an intermediate goal. Instead of teaching the robot an exact imitation of human behavior, we focused on identifying what the goals were, looking at all the actions that the people in the study performed," says Karinne Ramirez-Amaro.

The researchers' unique method meant the AI focused on extracting the intent of the sub-goals and built libraries consisting of different actions for each one. Then, the AI created a planning tool that could be used by a TIAGo robot—a mobile service robot designed to work in indoor environments. With the help of the tool, the robot was able to automatically generate a plan for a given task of stacking cubes on top of one another, even when the surrounding conditions were changed.

In short: The robot was given the task of stacking the cubes, and then, depending on the circumstances, which changed slightly for each attempt, chose for itself a combination of several possible actions to form a sequence that would lead to completion of the task. The results were extremely successful.
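A highly simplified sketch of that plan-generation idea follows, with invented states, actions, and sub-goals; the actual system learns these from the human demonstrations.

# Toy version of the planning idea: each sub-goal has a library of alternative
# actions; the planner picks any action whose preconditions hold in the current
# state. States, actions, and goals here are invented for illustration.

actions = {
    "grasp_cube": {"pre": {"hand_empty"},   "add": {"holding_cube"},
                   "del": {"hand_empty"}},
    "place_cube": {"pre": {"holding_cube"}, "add": {"cube_stacked", "hand_empty"},
                   "del": {"holding_cube"}},
}

def plan(state: set, goal: str, subgoals: list) -> list:
    sequence = []
    for alternatives in subgoals:
        for name in alternatives:  # try alternatives until one is applicable
            a = actions[name]
            if a["pre"] <= state:  # preconditions satisfied?
                state = (state - a["del"]) | a["add"]
                sequence.append(name)
                break
    return sequence if goal in state else []

print(plan({"hand_empty"}, "cube_stacked", [["grasp_cube"], ["place_cube"]]))
# -> ['grasp_cube', 'place_cube']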

"With our AI, the robot made plans with a 92% success rate after just a single human demonstration. When the information from all twelve demonstrations was used, the success rate reached up to 100%," says Maximilian Diehl.

The work was presented at IROS 2021, one of the world's most prestigious robotics conferences. In the next phase of the project, the researchers will investigate how robots can communicate with humans and explain what went wrong, and why, if they fail a task.

Industry and healthcare

The long-term goal is to use robots in the industry to help technicians with tasks that could cause long-term health problems, for example, tightening bolts/nuts on truck wheels. In healthcare, it could be tasks like bringing and collecting medicine or food.

"We want to make the job of healthcare professionals easier so that they can focus on tasks which need more attention," says Karinne-Ramirez Amaro.

"It might still take several years until we see genuinely autonomous and multi-purpose robots, mainly because many individual challenges still need to be addressed, like computer vision, control, and safe interaction with humans. However, we believe that our approach will contribute to speeding up the learning process of robots, allowing the robot to connect all of these aspects and apply them in new situations," says Maximilian Diehl.A new framework that could simplify imitation learning in robotics

More information: Read more at research.chalmers.se/project/9253

Provided by Chalmers University of Technology 

Biden's biofuel: Cheaper at the pump, but high environmental cost

A biofuel production plant in Iowa.

In an effort to ease Americans' pain at the gas pump, President Joe Biden has announced his administration will ease restrictions on the sale of E15—gasoline that includes 15 percent ethanol—and new investments in biofuels as a whole.

But the decision hasn't pleased scientists who study the environmental impact of ethanol.

What is ethanol?

Fuel ethanol is based on the same type of alcohol used in beverages, but with "denaturant" additives that make it unsuitable for drinking.

Blending ethanol with gasoline eases reliance on crude oil.

Most gas now sold in the United States is E10. American ethanol is generally produced by fermenting sugar from corn. Other countries such as Brazil rely on sugar from sugar cane.

In 2011, the US Environmental Protection Agency approved the use of E15 following research on its pollution impact.

But it is currently offered at just 2,300 gas stations in the country, according to officials.

What has Biden announced?

Speaking at a bioethanol production plant in the Midwestern state of Iowa, Biden said Tuesday that the EPA would lift a restriction prohibiting the sale of E15 between June 1 and September 15—a constraint that was imposed to limit air pollution.

That's because ethanol evaporates more easily and turns more readily into smog, which is particularly problematic in the heat and sunlight of summer.

In 2018, then-president Donald Trump also wanted to lift this restriction, as a concession to farmers in the midst of a trade war with China.

But a court eventually overturned Trump's decision.

According to the White House, at today's prices, E15 can save an average of 10 cents per gallon (3.8 liters) of gasoline.

Environmental consequences

Though biofuels have been touted for their ability to reduce greenhouse gas emissions, assessing the environmental impact of bioethanol requires including greenhouse gas emissions related to the crops needed for its production.

And "the carbon balance of ethanol relative to gasoline isn't as good as it was originally anticipated," Tyler Lark, a scientist at the University of Wisconsin-Madison told AFP.

In 2005, Congress passed a "Renewable Fuel Standard," which required transportation fuel to include a volume of biofuel that increased over time.

The law was further expanded in 2007. As a result, 2.8 million additional hectares of corn were grown between 2008 and 2016, according to a study published in February in the Proceedings of the National Academy of Sciences (PNAS).

Lark, the first author of the PNAS study, said the consequences of converting land to corn cultivation were underestimated at the time.

"When you do that, you plow up other types of land that may have been sequestering carbon and you apply extra nitrogen fertilizer to grow that corn," he said.

In addition, some of the fertilizer used to grow corn emits nitrous oxide (N2O), a very powerful greenhouse gas.

Thus, greenhouse gas emissions related to gasoline or corn ethanol are ultimately comparable, the study concludes.

There are other harmful consequences, too—including the leaching of fertilizers into waterways, and the destruction of wildlife habitats to make way for corn fields.

Health dangers

Once in the tank, bioethanol emits less CO2 per liter than traditional fuels, but there's less energy per volume and so more is needed.
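The "less energy per volume" point can be quantified with standard, approximate energy densities; the figures below are general reference values (lower heating values), not numbers from the article.

# Ethanol carries roughly a third less energy per liter than gasoline, so more
# fuel volume is burned for the same work. Approximate lower heating values.

GASOLINE_MJ_PER_L = 32.0  # approximate
ETHANOL_MJ_PER_L = 21.0   # approximate

def blend_energy(ethanol_fraction: float) -> float:
    """Energy per liter of a gasoline-ethanol blend."""
    return (1 - ethanol_fraction) * GASOLINE_MJ_PER_L + ethanol_fraction * ETHANOL_MJ_PER_L

for blend, frac in (("E0", 0.0), ("E10", 0.10), ("E15", 0.15)):
    print(f"{blend}: {blend_energy(frac):.1f} MJ/L")  # E15 holds ~5% less energy than E0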

In addition, "it produces acetaldehyde which is a carcinogen, formaldehyde, which is a carcinogen and both of those are two of the five most potent ozone producers in photochemical smog," explained Mark Jacobson, a professor of environmental engineering at Stanford University.

Ground-level ozone represents a major health hazard, causing numerous respiratory problems including asthma. For Jacobson, both gasoline and biofuels are "horrible."

"It's bad for both climate and air pollution, and spending money on it is taking money away from real solutions" such as electric vehicles, he concluded.

© 2022 AFP

Why algae can be our next secret weapon to combat plastic pollution

plastic bottles
Credit: CC0 Public Domain

Plastic pollution has become an alarming problem worldwide. A 2015 study published in Science Magazine projected that by 2025, around 100–250 million metric tons of plastic waste could enter our oceans every year.

The problem also triggered the United Nations (UN) to issue a global resolution to end plastic waste, adopted by representatives from 173 countries.

However, even if all drastic measures were put in place to stop plastic production tomorrow, we would still have around 5 billion tons of plastic waste in landfills and the environment.

Research has shown that plastic can disintegrate into microplastic—particles ranging from 1 nanometer (nm) to less than 5 millimeters (mm)—with various shapes, densities, and mechanical and chemical properties.

Due to their small size and high surface area relative to their volume, microplastics readily absorb pollutants, causing chronic toxicity when they are consumed and accumulate within organisms.
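That surface-area effect is a simple geometric scaling: for an idealized spherical particle, surface area per unit volume is 3/r, so shrinking particles exposes dramatically more surface. A quick sketch, treating particles as spheres (an idealization):

# For a sphere, surface area / volume = 3 / r, so smaller particles expose far
# more surface per unit volume -- which is why microplastics pick up pollutants
# so effectively. Spherical particles are an idealization.

import math

def surface_to_volume(radius_m: float) -> float:
    area = 4 * math.pi * radius_m ** 2
    volume = (4 / 3) * math.pi * radius_m ** 3
    return area / volume  # equals 3 / radius_m

for r in (2.5e-3, 2.5e-6, 2.5e-9):  # 5 mm, 5 um, 5 nm diameter particles
    print(f"radius {r:.1e} m -> SA/V = {surface_to_volume(r):.1e} per meter")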

For decades, scientists have been looking to nature for our fight against the plastic problem. Combined with global strategic action to slow down plastic production, we could prevent future plastic disasters.

Microalgae, for instance, are the most promising nature-based candidate capable of destroying microplastics. They are unicellular organisms that exist individually or in chains or groups and, depending on the species, range in size from a few micrometers to hundreds of micrometers.

Cultivating microalgae is simple: unlike other aquaculture crops, they do not require fertile land, large quantities of freshwater, or pesticides.

Microalgae are also capable of growing rapidly. Open pond cultivation has been one of the oldest and simplest ways to cultivate microalgae on a large scale. Some people also use photobioreactors—bioreactors used in an enclosed system to increase microalgae cultivation.

How microalgae works

Interaction between microalgae and plastic can significantly change the properties of the plastic, including its biodegradation, density, and sinking behavior. Moreover, microalgae can gather on and stick to microplastic surfaces, regardless of the particles' size.

There are four stages of plastic biodegradation. First is the attachment of microalgae to plastic surfaces. This starts the biodegradation process and alters surface properties.

The second is biodeterioration. Microalgae will secrete specific enzymes, which are pivotal for plastic biodegradation.

The third is the biofragmentation process. In this stage, the plastic material loses its mechanical stability and becomes fragile.

The last stage is the assimilation process, where microbial filaments and water start to penetrate plastics, which results in the decomposition and utilization of plastic by microorganisms.

Studies have reported success stories of algae-based plastic biodegradation, particularly for polyethylene (commonly used in fibers for clothing or in bottles), low-density polyethylene or LDPE (used in plastic bags), and bisphenol A or BPA (a chemical used to harden plastic). One of those studies measured a 58.9% decrease in the carbon composition of its LDPE sample.

More action is needed

As the world's biggest archipelagic country, Indonesia has a maritime area of over 6.4 million square kilometers and extensive freshwater lakes, which together offer immense potential for microalgae cultivation.

Microalgae could be a viable solution to tackle the plastic problem in Indonesia—the world's second-largest ocean plastic polluter, according to a 2015 study.

More research is vital to deepen analyses of the interactions between microalgae and microplastics and their effects, to support this initiative. So far, studies on microalgae in Indonesia have focused only on its potential as a green energy resource or as a substitute material for plastic.

To prevent plastic disasters, we also need improvements in plastic recycling and reuse strategies. Regulation and policies should be in line with the 2018–2025 National Action Plan on Marine Debris, which highlights Indonesia's waste management, reduction or substitution of plastic usage, redesign of plastic products and packaging, doubling of waste collection rates, and expansion of waste disposal facilities.


Provided by The Conversation 
This article is republished from The Conversation under a Creative Commons license. Read the original article.