Saturday, January 14, 2023

Computers that power self-driving cars could be a huge driver of global carbon emissions

Credit: Pixabay/CC0 Public Domain

In the future, the energy needed to run the powerful computers on board a global fleet of autonomous vehicles could generate as many greenhouse gas emissions as all the data centers in the world today.

That is one key finding of a new study from MIT researchers that explored the potential energy consumption and related carbon emissions if autonomous vehicles are widely adopted.

The data centers that house the physical computing infrastructure used for running applications are widely known for their large carbon footprint: They currently account for about 0.3 percent of global carbon emissions, or about as much carbon as the country of Argentina produces annually, according to the International Energy Agency. Realizing that less attention has been paid to the potential footprint of autonomous vehicles, the MIT researchers built a probabilistic model to study the problem. They determined that 1 billion autonomous vehicles, each driving for one hour per day with a computer consuming 840 watts, would consume enough energy to generate about the same amount of emissions as data centers currently do.

The researchers also found that in over 90 percent of modeled scenarios, to keep autonomous vehicle emissions from zooming past current data center emissions, each vehicle must use less than 1.2 kilowatts of power for computing, which would require more efficient hardware. In one scenario—where 95 percent of the global fleet of vehicles is autonomous in 2050, computational workloads double every three years, and the world continues to decarbonize at the current rate—they found that hardware efficiency would need to double faster than every 1.1 years to keep emissions under those levels.

"If we just keep the business-as-usual trends in decarbonization and the current rate of hardware efficiency improvements, it doesn't seem like it is going to be enough to constrain the emissions from computing onboard autonomous vehicles. This has the potential to become an enormous problem. But if we get ahead of it, we could design more efficient autonomous vehicles that have a smaller carbon footprint from the start," says first author Soumya Sudhakar, a graduate student in aeronautics and astronautics.

Sudhakar wrote the paper with her co-advisors Vivienne Sze, associate professor in the Department of Electrical Engineering and Computer Science (EECS) and a member of the Research Laboratory of Electronics (RLE); and Sertac Karaman, associate professor of aeronautics and astronautics and director of the Laboratory for Information and Decision Systems (LIDS). The research appears in the January-February issue of IEEE Micro.

Modeling emissions

The researchers built a framework to explore the operational emissions from computers on board a global fleet of electric vehicles that are fully autonomous, meaning they don't require a back-up human driver.

The model is a function of the number of vehicles in the global fleet, the power of each computer on each vehicle, the hours driven by each vehicle, and the carbon intensity of the electricity powering each computer.

"On its own, that looks like a deceptively simple equation. But each of those variables contains a lot of uncertainty because we are considering an emerging application that is not here yet," Sudhakar says.

For instance, some research suggests that the amount of time driven in autonomous vehicles might increase because people can multitask while driving and the young and the elderly could drive more. But other research suggests that time spent driving might decrease because algorithms could find optimal routes that get people to their destinations faster.

In addition to considering these uncertainties, the researchers also needed to model advanced computing hardware and software that doesn't exist yet.

To accomplish that, they modeled the workload of a popular algorithm for autonomous vehicles, known as a multitask deep neural network because it can perform many tasks at once. They explored how much energy this deep neural network would consume if it were processing many high-resolution inputs from many cameras with high frame rates, simultaneously.

When they used the probabilistic model to explore different scenarios, Sudhakar was surprised by how quickly the algorithms' workload added up.

For example, if an autonomous vehicle has 10 deep neural networks processing images from 10 cameras, and that vehicle drives for one hour a day, it will make 21.6 million inferences each day. One billion vehicles would make 21.6 quadrillion inferences. To put that into perspective, all of Facebook's data centers worldwide make a few trillion inferences each day (1 quadrillion is 1,000 trillion).
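
Those totals follow from straightforward counting. A back-of-envelope check (the 60-frames-per-second camera rate is an assumption chosen here because it reproduces the article's figures):

```python
# Inference arithmetic for one autonomous vehicle, assuming 10 deep neural
# networks each processing 10 camera streams at an assumed 60 frames/second.
networks, cameras, fps = 10, 10, 60
seconds_driven_per_day = 1 * 3600  # one hour of driving per day

per_vehicle = networks * cameras * fps * seconds_driven_per_day
print(per_vehicle)          # 21600000 -> 21.6 million inferences per day
print(per_vehicle * 10**9)  # 21.6 quadrillion/day across 1 billion vehicles
```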

"After seeing the results, this makes a lot of sense, but it is not something that is on a lot of people's radar. These vehicles could actually be using a ton of computer power. They have a 360-degree view of the world, so while we have two eyes, they may have 20 eyes, looking all over the place and trying to understand all the things that are happening at the same time," Karaman says.

Autonomous vehicles would be used for moving goods, as well as people, so there could be a massive amount of computing power distributed along global supply chains, he says. And their model only considers computing—it doesn't take into account the energy consumed by vehicle sensors or the emissions generated during manufacturing.

Keeping emissions in check

To keep emissions from spiraling out of control, the researchers found that each autonomous vehicle needs to use less than 1.2 kilowatts of power for computing. For that to be possible, computing hardware must become more efficient at a significantly faster pace, doubling in efficiency about every 1.1 years.

One way to boost that efficiency could be to use more specialized hardware, which is designed to run specific driving algorithms. Because researchers know the navigation and perception tasks required for autonomous driving, it could be easier to design specialized hardware for those tasks, Sudhakar says. But vehicles tend to have 10- or 20-year lifespans, so one challenge in developing specialized hardware would be to "future-proof" it so it can run new algorithms.

In the future, researchers could also make the algorithms more efficient, so they would need less computing power. However, this is also challenging because trading off some accuracy for more efficiency could hamper vehicle safety.

Now that they have demonstrated this framework, the researchers want to continue exploring hardware efficiency and algorithm improvements. In addition, they say their model can be enhanced by characterizing embodied carbon from autonomous vehicles—the emissions generated when a car is manufactured—and emissions from a vehicle's sensors.

While there are still many scenarios to explore, the researchers hope that this work sheds light on a potential problem people may not have considered.

"We are hoping that people will think of emissions and carbon efficiency as important metrics to consider in their designs. The energy consumption of an autonomous  is really critical, not just for extending the battery life, but also for sustainability," says Sze.

More information: Soumya Sudhakar et al, Data Centers on Wheels: Emissions From Computing Onboard Autonomous Vehicles, IEEE Micro (2022). DOI: 10.1109/MM.2022.3219803

This story is republished courtesy of MIT News (web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation and teaching.

New technique to turn abandoned mines into batteries

Underground Gravity Energy Storage system: a schematic of different system sections. Credit: Hunt et al.

A novel technique called Underground Gravity Energy Storage turns decommissioned mines into long-term energy storage solutions, thereby supporting the sustainable energy transition.

Renewable energy sources are central to the transition toward a more sustainable future. However, as sources like sunshine and wind are inherently variable and inconsistent, finding ways to store energy in an accessible and efficient way is crucial. While there are many effective solutions for daily energy storage, the most common being batteries, a cost-effective long-term solution is still lacking.

In a new International Institute for Applied Systems Analysis (IIASA)-led study, an international team of researchers has developed a novel way to store energy by transporting sand into abandoned mines. The new technique, called Underground Gravity Energy Storage (UGES), proposes an effective long-term energy storage solution while also making use of now-defunct mining sites, which likely number in the millions globally.

The work is published in the journal Energies.

UGES generates electricity when the price is high by lowering sand into an underground mine and converting the potential energy of the sand into electricity via regenerative braking, and then lifting the sand from the mine to an upper reservoir using electric motors to store energy when electricity is cheap. The main components of UGES are the shaft, motor/generator, upper and lower storage sites, and mining equipment. The deeper and broader the mineshaft, the more power can be extracted from the plant, and the larger the mine, the higher the plant's energy storage capacity.
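
The physics behind those scaling rules is ordinary gravitational potential energy, E = m·g·h: capacity grows with the mass of sand and the depth of the shaft. A toy estimate (the depth and mass below are illustrative assumptions, not figures from the study):

```python
# Toy UGES capacity estimate from E = m * g * h; numbers are assumptions.
g = 9.81                  # gravitational acceleration, m/s^2
depth_m = 1_000           # assumed usable shaft depth
sand_kg = 1_000_000_000   # assumed 1 million tonnes of sand cycled

energy_joules = sand_kg * g * depth_m
energy_gwh = energy_joules / 3.6e12   # joules -> gigawatt-hours
print(f"{energy_gwh:.1f} GWh")        # ~2.7 GWh for this assumed mine
```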

"When a mine closes, it lays off thousands of workers. This devastates communities that rely only on the mine for their economic output. UGES would create a few vacancies as the mine would provide energy storage services after it stops operations," says Julian Hunt, a researcher in the IIASA Energy, Climate, and Environment Program and the lead author of the study. "Mines already have the basic infrastructure and are connected to the , which significantly reduces the cost and facilitates the implementation of UGES plants."

Other energy storage methods, like batteries, lose energy via self-discharge over long periods. The energy storage medium of UGES is sand, meaning that no energy is lost to self-discharge, enabling ultra-long energy storage times ranging from weeks to several years.

The investment costs of UGES are about 1 to 10 USD/kWh and power capacity costs are about 2 USD/kW. The technology is estimated to have a global potential of 7 to 70 TWh, with most of this potential concentrated in China, India, Russia, and the U.S.

"To decarbonize the economy, we need to rethink the energy system based on innovative solutions using existing resources. Turning abandoned mines into energy storage is one example of many solutions that exist around us, and we only need to change the way we deploy them," concludes Behnam Zakeri, study co-author and a researcher in the IIASA Energy, Climate, and Environment Program.

More information: Julian David Hunt et al, Underground Gravity Energy Storage: A Solution for Long-Term Energy Storage, Energies (2023). DOI: 10.3390/en16020825



Using cosmic rays to generate and distribute random numbers and boost security for local devices and networks

Introducing COSMOCAT
When both sender and receiver have identical random numbers, they can share encrypted data without the need to share a key to decode it. This prevents so-called man-in-the-middle attacks. With COSMOCAT, muons (µ) arriving at the sender and receiver at the same time provide the source of the random number. Provided the devices are synchronized, the receiver can know which muon signal relates to which incoming message and can decode it accordingly. Credit: ©2022 Hiroyuki Tanaka

State-of-the-art methods of information security are likely to be compromised by emerging technologies such as quantum computers. One of the reasons they are vulnerable is that both encrypted messages and the keys to decrypt them must be sent from sender to receiver.

A new method—called COSMOCAT—is proposed and demonstrated, which removes the need to send a key, since cosmic rays transport it for us, meaning that even if messages are intercepted, they could not be read using any theorized approach. COSMOCAT could be useful in various localized applications, as there are limitations to the effective distance between sender and receiver.

In the field of information communication technology, there is a perpetual arms race to find ever more secure ways to transfer data, and ever more sophisticated ways to break them. Even the first modern computers were essentially code-breaking machines used by the U.S. and European Allies during World War II. And this arms race is about to enter a new regime with the advent of quantum computers, capable of breaking current forms of security with ease. Even security methods which use quantum computers themselves might be susceptible to other quantum attacks.

"Basically, the problem with our current security paradigm is that it relies on encrypted information and keys to decrypt it both being sent along a network from sender to receiver," said Professor Hiroyuki Tanaka from Muographix at the University of Tokyo.

"Regardless of the way messages are encrypted, in theory someone eavesdropping could use the keys to decode the secure messages eventually. Quantum computers just make this process faster. If we dispense with this idea of sharing keys and could instead find some way of using unpredictable random numbers to encrypt information, then it should lead to a system immune to interception. And I happen to work often with a source capable of generating truly random unpredictable numbers:  from outer space."

Some use cases for COSMOCAT. As the distance is limited due to the nature of the muon shower arriving at the ground, COSMOCAT is best suited for networks within small areas such as buildings. Offices, data centers and buildings that make use of smart devices, and even electric-car charging stations, are some possible application areas. Credit: ©2022 Hiroyuki Tanaka

Various random number generators have been tried over time, but the problem is how to share these random numbers while avoiding interception. Cosmic rays may hold the answer, as one of their byproducts, muons, are statistically random in their arrival times at the ground. Muons also travel close to the speed of light and penetrate solid matter easily.

This means that as long as we know the distance between the sender's detector and the receiver's detector, the time required for muons to travel from the sender to the receiver can be precisely calculated. And providing that a pair of devices are sufficiently synchronized, the muons' arrival time could serve as a secret key for both encoding and decoding a packet of data. But this key never has to leave the sender's device, as the receiving machine should automatically have it as well. This would plug the security hole presented by sending shared keys.
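
A schematic sketch of that timing logic (my illustration of the idea, not the published COSMOCAT protocol): both sides observe the same muon, correct for the known time of flight, and hash the shared, unpredictable timestamp into a key.

```python
# Sketch of timestamp-based key derivation; illustrative only.
import hashlib

C = 299_792_458  # m/s; muons travel at roughly the speed of light

def time_of_flight_ns(distance_m):
    """Expected muon travel time between the two detectors."""
    return distance_m / C * 1e9

def key_from_timestamp(arrival_time_ns):
    """Both devices hash the same detection time into a shared key."""
    return hashlib.sha256(str(arrival_time_ns).encode()).hexdigest()

print(f"{time_of_flight_ns(10.0):.1f} ns")    # ~33.4 ns across 10 meters
print(key_from_timestamp(1_673_650_000_123))  # identical on both ends
```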

"I call the system Cosmic Coding and Transfer, or COSMOCAT," said Tanaka. "It could be used alongside or in place of current wireless communications technologies such as Wi-Fi, Bluetooth, near-field communication (NFC), and more. And it can exceed speeds possible with current encrypted Bluetooth standards. However, the distance it can be used at is limited; hence, it's ideally kept to small local networks, for example, within a building. I believe COSMOCAT is ready to be adopted by commercial applications."

At present, the muon-detecting apparatus is relatively large and requires more power than other local wireless communication components. But as technology improves and the size of this apparatus can be reduced, it might soon be possible to install COSMOCAT in high-security offices, data centers and other local area networks.

The work is published in the journal iScience.


More information: Hiroyuki K.M. Tanaka, Cosmic Coding and Transfer (COSMOCAT) for Ultra High Security Near-Field Communications, iScience (2023). DOI: 10.1016/j.isci.2022.105897


Journal information: iScience 

Provided by University of Tokyo 


AI improves detail, estimate of urban air pollution

Credit: CC0 Public Domain

Using artificial intelligence, Cornell engineers have simplified and reinforced models that accurately calculate the fine particulate matter (PM2.5)—the soot, dust and exhaust emitted by trucks and cars that get into human lungs—contained in urban air pollution.

Now, city planners and government health officials can obtain a more precise accounting of the well-being of urban dwellers and the air they breathe, from new research published December 2022 in Transportation Research Part D.

"Infrastructure determines our living environment, our exposure," said senior author Oliver Gao, the Howard Simpson Professor of Civil and Environmental Engineering in the College of Engineering. "Air pollution impact due to transportation—put out as exhaust from the cars and trucks that drive on our streets—is very complicated. Our infrastructure, transportation and energy policies are going to impact air pollution and hence public health."

Previous methods to gauge air pollution were cumbersome and reliant on extraordinary amounts of data points. "Older models to calculate particulate matter were computationally and mechanically consuming and complex," said Gao, a faculty fellow at the Cornell Atkinson Center for Sustainability. "But if you develop an easily accessible data model, with the help of machine learning filling in some of the blanks, you can have an accurate model at a local scale."

Lead author Salil Desai '20, M.Eng. '21 and visiting scientist Mohammad Tayarani, together with Gao, published "Developing Machine Learning Models for Hyperlocal Traffic Related Particulate Matter Concentration Mapping," to offer a leaner, less data-intensive method for making accurate models.

Ambient air pollution is a leading cause of premature death around the world. Globally, more than 4.2 million annual fatalities—in the form of cardiovascular disease, stroke and other illnesses—were attributed to air pollution in 2015, according to a Lancet study cited in the Cornell research.

In this work, the group developed four machine learning models of traffic-related PM2.5 concentrations, trained on data gathered in New York City's five boroughs, which have a combined population of 8.2 million people and 55 million daily vehicle miles traveled.

The equations use a few inputs, such as traffic data, topology and meteorology, in an AI algorithm to learn simulations for a wide range of traffic-related, air-pollution concentration scenarios.

Their best performing model was the Convolutional Long Short-term Memory, or ConvLSTM, which trained the algorithm to predict many spatially correlated observations.
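
For readers curious what such a model looks like in code, here is a minimal ConvLSTM sketch in Keras; the sequence length, grid size, and input channels are placeholder assumptions, not the architecture or features from the paper:

```python
# Minimal ConvLSTM regression sketch (assumed shapes, illustrative only).
import tensorflow as tf

model = tf.keras.Sequential([
    # 24 hourly grids of 32x32 cells with 3 assumed input channels
    # (e.g., traffic volume, wind, temperature)
    tf.keras.layers.Input(shape=(24, 32, 32, 3)),
    tf.keras.layers.ConvLSTM2D(16, kernel_size=3, padding="same",
                               return_sequences=False),
    tf.keras.layers.Conv2D(1, kernel_size=1),  # PM2.5 estimate per grid cell
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```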

"Our data-driven approach—mainly based on vehicle emission data—requires considerably fewer modeling steps," Desai said. Instead of focusing on stationary locations, the method provides a high-resolution estimation of the city street pollution surface. Higher resolution can help transportation and epidemiology studies assess health, environmental justice and air quality impacts.

More information: Salil Desai et al, Developing Machine learning models for hyperlocal traffic related particulate matter concentration mapping, Transportation Research Part D: Transport and Environment (2022). DOI: 10.1016/j.trd.2022.103505


Researchers work toward harnessing ocean energy to power devices

New generator rolls into ocean energy
A new nanogenerator that harnesses the renewable energy of open ocean waves could power observation platforms and more in the middle of the ocean. Credit: Sara Levine | Pacific Northwest National Laboratory

Tsunamis, hurricanes, and maritime weather are monitored using sensors and other devices on platforms in the ocean to help keep coastal communities safe—until the batteries on these platforms run out of juice. Without power, ocean sensors can't collect critical wave and weather data, which results in safety concerns for coastal communities that rely on accurate maritime weather information. Replacing batteries at sea is also expensive. What if this could all be avoided by powering devices indefinitely from the energy in ocean waves?

Pacific Northwest National Laboratory (PNNL) researchers are working to make this a reality with the development of a new cylindrical triboelectric nanogenerator (TENG)—a small powerhouse that converts wave energy into electricity to power devices at sea. Larger versions of this generator could be used to power ocean observation and communications systems, including acoustic and satellite telemetry.

"TENGs are low cost, lightweight, and can efficiently convert slow, uniform or random waves into power—making them particularly well-suited to powering devices in the open ocean where monitoring and access are challenging and costly," explained Daniel Deng, a PNNL laboratory fellow and co-developer of the new TENG device.

Deng and his team took a novel approach to advance cylindrical TENGs for use on the open ocean. Their patent-pending frequency-multiplied cylindrical triboelectric nanogenerator (FMC-TENG) uses carefully placed magnets to convert energy more efficiently than other cylindrical TENGs and to better transform slow, uniform waves into electricity. So far, the prototype FMC-TENG has been able to produce enough electricity to power an acoustic transmitter—a type of sensor often included on ocean observing platforms that can be used for communications. This is about the same amount of electricity it takes to power an LED lightbulb.

"We're developing the FMC-TENG to power everything from ocean observing platforms with multiple sensors to satellite communications, all using the power of the ocean," said Deng.

A new nanogenerator, the FMC-TENG, harnesses the renewable energy of open ocean waves to generate power. Credit: Sara Levine / Pacific Northwest National Laboratory

Artificial fur, magnets, and waves for power

If you've ever been shocked by static electricity, then you've personally experienced the triboelectric effect—the same effect researchers leverage in the FMC-TENG to produce power. A cylindrical TENG is made up of two nested cylinders with the inner cylinder rotating freely. Between the two cylinders are strips of artificial fur, aluminum electrodes, and a material similar to Teflon called fluorinated ethylene propylene (FEP). As the TENG rolls along the surface of an ocean wave, the artificial fur and aluminum electrodes on one cylinder rub against the FEP material on the other cylinder, creating static electricity that can be converted into power.

The more a cylindrical TENG moves, the more energy it generates. That's why fast, frequent waves can generate more energy than the slower, more uniform waves of the open ocean. To come up with a TENG that could power electronics in the open ocean, Deng and his team set out to increase the amount of wave energy converted into electricity in the FMC-TENG. As it turned out, the key was to temporarily stop the FMC-TENG's inner cylinder from moving.

In the FMC-TENG, the team positioned magnets to stop the inner cylinder in the device from rotating until it reached the crest of a wave, allowing it to build up more and more potential energy. Nearing the crest of the wave, the magnets released and the internal cylinder started rolling down the wave very quickly. The faster movement produced electricity more efficiently, generating more energy from a slower wave.
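
A toy calculation shows why the latch-and-release trick raises power (the masses and times below are illustrative assumptions, not PNNL measurements): the same potential energy delivered over a shorter time means a higher conversion rate.

```python
# Same energy, shorter release window -> higher average power; illustrative.
def burst_power_watts(mass_kg, height_m, release_time_s, g=9.81):
    """Average power when E = m*g*h is released over release_time_s."""
    return mass_kg * g * height_m / release_time_s

print(burst_power_watts(2.0, 0.5, 4.0))  # ~2.5 W trickled out over a slow wave
print(burst_power_watts(2.0, 0.5, 0.4))  # ~24.5 W when released near the crest
```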

A wave energy converter for the open ocean

Currently, the FMC-TENG prototype can produce enough power to run small electronics, like temperature sensors and acoustic transmitters. As the team iterates on their design for commercial use, the FMC-TENG is expected to produce enough power to run an entire open ocean monitoring platform including multiple sensors and satellite communications. Plus, the FMC-TENG is lightweight and can be used in both free-floating devices and moored platforms.

"The FMC-TENG is unique because there are very few wave energy converters that are efficient and able to generate significant power from low-frequency ocean waves," said Deng. "This type of generator could potentially power integrated buoys with sensor arrays to track  water, wind, and climate data entirely using renewable  energy."

The study is published in the journal Nano Energy.

More information: Hyunjun Jung et al, Frequency-multiplied cylindrical triboelectric nanogenerator for harvesting low frequency wave energy to power ocean observation system, Nano Energy (2022). DOI: 10.1016/j.nanoen.2022.107365


Journal information: Nano Energy 


Provided by Pacific Northwest National Laboratory

Antarctic rover performs research in the snow

Team POLAR's rover Ice Cube in the Norwegian snow. Credit: Laurenz Edelmann, team POLAR

Team Polar, a student team at Eindhoven University of Technology (TU/e), took their first rover to perform research in the Norwegian snow in the first week of January. The team is dedicated to developing an independent rover that can perform Antarctic research. This is their first working prototype and the team is eager to set a benchmark for future developments. They will present their findings and the rover itself at the reveal event, January 20, 2023.

The Earth is facing its biggest problem in centuries: climate change. To fight climate change, we need to better understand the factors behind it. However, understanding it requires gathering information about our planet in places where nature is still pristine and more or less unaffected by climate change, as well as places where the consequences of global warming and climate change can be observed first-hand. Circumstances like that are usually found in extremely cold and remote environments like the Arctic, Antarctica, and the oldest glaciers. At the moment, research there is often carried out inefficiently and in very expensive and unsustainable ways.

Team Polar wants to create an alternative way to do extensive environmental research in the coldest places on our planet by developing an unmanned rover. This rover will eventually operate and perform research all by itself—similar to how the Martian rovers operate on the surface of Mars—and collect invaluable data about the effects of climate change. The team started its mission in 2018 and now presents its first working prototype: the rover "Ice Cube."

Research in the snow

Development of the rover took place at the Eindhoven University of Technology in the Netherlands, a country notoriously devoid of deep snow most winters these days. That's why the team took their rover to Trondheim in Norway, to see how their first rover "Ice Cube" performs in the snow. "We can test a lot of things in our lab, but not how the rover drives in the snow, how the components handle the snow and temperatures, and how cold it will get inside the rover and how quickly," explains team manager Laurenz Edelmann.

"We have built our first rover with off-the-shelf components. That may or may not present some challenges for us. And, we may find that certain components will need to be custom-made and designed for future rover models. That's why this rover will be a benchmark for us and for future teams." The rover will be remote controlled for now, allowing the team to hand-pick the locations they want to study on this trip. This allowed the team to gather essential insights that can help the team to ensure that their next rover will be fit for a mission to Antarctica.

Inspiring other teams

The team was in Norway to run their experiments from January 4 until January 8. There, they saw their rover drive through the snow naturally. They also determined which components worked fine, and discovered a handful of items that broke and needed replacing. "Overall, we were super impressed with the way Ice Cube tackled the snow and handled itself," says Edelmann. "We gathered valuable data, and great footage to present at our event." There was also time set aside on the return journey to the Netherlands to visit other universities. "We love to maintain good relations with other student teams at technical universities in Europe. That's why we made sure to present our rover at the universities of Trondheim (NTNU), Copenhagen (DTU), and Braunschweig (NFF)," explains Edelmann.

"We hope to inspire and inform local student teams how to tackle an ambitious goal, such as building an Antarctic research  successfully. And, of course, to forge new alliances and collaborations with our colleagues at other institutions."


Olives, the source of "liquid gold," offer more riches to unlock

Only about 25% of an olive is used for the prized oil. Credit: Lucio Patone on Unsplash

Olive oil is a multibillion-euro global business and Manuel Román is determined to create an even bigger market from the sacred ancient fruit.

Román is co-founder of ISANATUR, a Spanish company that has built a refinery able to turn every part of an olive into a commercial product of some kind.

Pulp and pits

Only about 25% of an olive is used for the prized oil, with the remaining pulp, pits and water ending up in landfills or being turned into fertiliser. The pits can also be used for fuel.

'What is needed is a market willing to use the products—the olive powder, olive water, olive seed,' said Román, former coordinator of an EU-funded project developing ways to transform the entire residue into commercial goods.

Global annual production of olive oil totals about 3 million tonnes, of which 2 million tonnes are in Europe. The worldwide olive oil market is worth nearly €13 billion, according to Fortune Business Insights.

While olive oil has been called 'liquid gold' for millennia because of its many health benefits, the residue is also packed full of goodness.

Companies in Europe are developing ways to tap into this rich source of ingredients so they can be used in health and beauty products, food supplements and animal feed.

'Thousands of tonnes of natural products are not being used and we are missing out on the opportunity to eat these healthy compounds,' said Román.

He was interim coordinator of UP4HEALTH, which began in mid-2020 and is due to continue until end-May 2024. Run from Spain, Europe's top producer of olive oil, the project brings together participants determined to end olive waste.

ISANATUR is already selling powder made from olive pulp—which is rich in iron, proteins and antioxidants—for use in snack bars and food supplements.

The overall potential is enormous. Other products include soluble fibre to boost digestive health, drinks made from olive water and fats that can be used as skin moisturisers.

Although UP4HEALTH includes several small and medium-sized food and nutraceutical businesses testing the products, it wants to attract large multinational companies that could create enough demand to scale.

One of the main obstacles to turning biowaste into new products is creating the demand and building the supply chains to meet it.

Leafy promises

Olive-tree leaves, which are normally left to rot in groves or burnt for energy, also have significant commercial potential.

A second EU-funded project coordinated from Spain, OLEAF4VALUE, has brought together scientists, multinational companies and olive farmers to develop uses for leaves in a range of products.

Olive leaves contain antioxidants, anti-inflammatories and antimicrobials, which—like the olive residue—could be used in food supplements, pharmaceuticals, beauty products and animal feed.

Currently, only about 0.2% of the world's residual olive leaves find commercial uses, according to OLEAF4VALUE.

The project wants to increase the level to 15% with the help of Oleícola El Tejar, a Spanish farm cooperative that already handles an equivalent percentage of the world's supply of olive pulp, pits and leaves.

The project is working with Mibelle Biochemistry Switzerland, a multinational that designs ingredients for the beauty industry.

But inroads are also being made into other markets including feed, said José Maria Pinilla, coordinator of OLEAF4VALUE.

Healthier fish

Pinilla is project manager at Natac Group, which makes natural ingredients for food supplements, feed and pharmaceuticals.

Natac works with Norway-based Mowi, the world's largest producer of farmed salmon.

The two companies have already successfully tested olive pulp ingredients in feed for fish to determine whether they protect salmon from bacterial infections. Trials so far—conducted at the Institute of Agrifood Research and Technology in Spain—have found they make salmon more resilient to such contagion.

Now, as part of OLEAF4VALUE, Natac and Mowi are testing ingredients from the leaves to gauge whether they can protect fish from viral infections.

Trials are under way at the Institute of Marine Research in Bergen, Norway and should be completed by the end of 2023, according to Pinilla.

To break into the market for animal-feed additives, economic efficiency is key.

'If you want to enter that market, you need to be very, very cost competitive,' said Pinilla.

Smarter sorting

To cut production costs in general, the project is developing a new way to extract the most value from the leaves.

With this technique, known as "dynamic processing," each batch of leaves is processed differently according to its chemical content. OLEAF4VALUE researchers are creating a sensor that analyses the chemical composition of each batch.

The chemical content of leaves depends on where they grew and how they were handled before processing. The content determines the most suitable products for leaves.

For example, olive leaves contain oleuropein, which is used as a food supplement. Its content in leaves varies from 0.2% to 10%.

Currently, leaves are all processed in the same way and for the same purposes.

'But if I want to produce a very high-purity oleuropein, it's obviously much easier to do it with a high-content leaf at the beginning,' said Pinilla.

With dynamic processing, only olive leaves with high oleuropein content are used to extract this compound. The rest are processed for different chemicals.
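
In software terms, dynamic processing amounts to content-based routing of batches. A minimal sketch (the 5% cut-off is an assumed threshold for illustration; the article gives only the 0.2% to 10% range):

```python
# Route each leaf batch by measured oleuropein content; threshold assumed.
def route_batch(oleuropein_pct):
    if oleuropein_pct >= 5.0:
        return "high-purity oleuropein extraction"
    return "other extracts (antioxidants, feed ingredients)"

for batch_pct in (0.3, 2.1, 7.8):
    print(f"{batch_pct}% oleuropein -> {route_batch(batch_pct)}")
```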

The project is also developing treatments to modify the chemical content of leaves so they contain higher concentrations of certain chemicals before processing begins.

Plenty of research is still needed for both the processes and the products. Then the challenge will be to find major buyers.

But Pinilla is certain about the merits of the whole initiative.

'In theory we could have nothing to throw away,' he said. 'We are trying for a zero-waste approach.'

Research in this article was funded by the EU. This article was originally published in Horizon, the EU Research and Innovation Magazine.


Greenhouse gas concentrations further increased in 2022, finds analysis of global satellite data

Time dependence of the concentration of carbon dioxide and methane since 2003. Credit: Institute for Environmental Physics at the University of Bremen

Preliminary analyses of global satellite data by environmental researchers at the University of Bremen show that atmospheric concentrations of the two important greenhouse gases carbon dioxide (CO2) and methane (CH4) continued to rise sharply in 2022. The increase in both gases is similar to that of previous years. However, the increase in methane does not reach the record levels of 2020 and 2021.

The Institute of Environmental Physics (IUP) at the University of Bremen is a world-leading institute in the field of evaluation and interpretation of global satellite measurements of the greenhouse gases carbon dioxide (CO2) and methane (CH4) and other atmospheric trace gases that are of great importance for climate and air quality.

The institute leads the GHG-CCI greenhouse gas project of the European Space Agency's (ESA) Climate Change Initiative and provides related data to the European Copernicus Climate Change Service C3S and the Copernicus Atmospheric Monitoring Service CAMS. The latest Copernicus communication on greenhouse gases is based on data and analysis provided by IUP.

"The methane increase remains very high in 2022 at about 0.6%, but below the record levels of the past two years. Our guess for this is that on the one hand there have been more emissions, but at the same time the atmospheric methane sink has decreased. At just over 0.5%, the CO2 increase is similar to that of previous years," says environmental physicist Dr. Michael Buchwitz, summarizing the initial results.

Greenhouse gas measurements since 2002

Time series of greenhouse gas measurements from space begin in 2002 with the SCIAMACHY instrument on the European environmental satellite ENVISAT, proposed and scientifically led by the University of Bremen. These measurements are currently being continued by Japanese (GOSAT and GOSAT-2) and American (OCO-2) satellites, among others.

The satellites measure the vertically averaged mixing ratio of CO2 and CH4. These measurements are referred to as XCO2 and XCH4, and they differ from the commonly reported measurements of near-ground concentrations. The data are reported in parts per million (ppm) for CO2 and parts per billion (ppb) for CH4. An XCO2 concentration of 400 ppm means the atmosphere contains 400 CO2 molecules per one million air molecules. "Methane increased by 11.8 ppb in 2022, CO2 by 2.1 ppm," Buchwitz said.
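
Those absolute increases are consistent with the percentage growth rates quoted earlier, as a quick check shows (the 2021 baseline concentrations are rough assumed values, not IUP figures):

```python
# Sanity check: absolute rise / assumed baseline = annual growth rate.
co2_rise_ppm, co2_baseline_ppm = 2.1, 417     # baseline assumed (~2021 XCO2)
ch4_rise_ppb, ch4_baseline_ppb = 11.8, 1900   # baseline assumed (~2021 XCH4)

print(f"CO2: +{co2_rise_ppm / co2_baseline_ppm:.2%}")  # ~0.50%, "just over 0.5%"
print(f"CH4: +{ch4_rise_ppb / ch4_baseline_ppb:.2%}")  # ~0.62%, "about 0.6%"
```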

CO2 increases almost uniformly—in contrast to methane. In the years 2000 to 2006, the methane concentration was stable on average. Since 2007, however, methane has been rising again, with particularly high rates of increase in recent years. The record levels in 2020 and 2021 are likely associated with a COVID-19-induced weakening of the atmospheric methane sink, but also with an increase in emissions.

"Unfortunately, there are still many gaps in our knowledge of the diverse natural and anthropogenic sources and sinks of  and other ," Buchwitz says. "It is therefore still necessary to make optimal use of and further improve the existing system for global monitoring of climate-relevant parameters."

Provided by Universität Bremen

Feds release bleak 2022 climate change data: Oceans warm, global temps among hottest on record

Credit: Pixabay/CC0 Public Domain

In one announcement after another this week, a grim accounting emerged of the world's extreme weather and climate disasters in 2022.

The science leaves "no doubt" about the impacts of the warming climate, Bill Nelson, administrator of the National Aeronautics and Space Administration, said during a briefing Thursday. "Sea levels are rising. Extreme weather patterns threaten our well-being across this planet."

The nation's two agencies charged with weather and climate observations reported that in 2022:

  • Ocean heat reached a new high
  • Arctic sea ice fell to its second lowest level ever recorded
  • Europe saw its second warmest year on record, but much of western Europe had its warmest ever

But when it comes to extreme weather and the impacts of climate change, there's no place like home. The U.S. led the world again last year in extreme weather events and disasters, the National Oceanic and Atmospheric Administration said on Tuesday.

Oceans get even warmer and saltier

The world's oceans—which absorb more than 90% of the world's excess heat—were again the hottest on record last year.

"Year after year we are breaking records for ocean heat content," Michael Mann, a climate scientist at the University of Pennsylvania, tweeted Wednesday. Mann was one of a team of 16 international researchers who published a paper Wednesday detailing last year's record  heat.

The hotter and saltier oceans are critical indicators of "profound alterations" taking place in energy and water cycles, the scientists wrote. "The inexorable climb in ocean heat content is the inevitable outcome of Earth's energy imbalance, primarily associated with increasing concentrations of greenhouse gases."

If not for the large storage capacity of the oceans, the world would have warmed a lot more already, said Russell Vose, chief of the analysis and synthesis branch for NOAA's National Centers for Environmental Information.

World temperatures again among warmest on record

NASA and NOAA agreed global average temperatures in 2022 were among the warmest on record, with their data and calculations coming to slightly different conclusions.

Temperatures would have been even higher last year without La Nina keeping things cooler in the Pacific, said Gavin Schmidt, director of NASA's Goddard Institute for Space Studies.

  • It's been 46 years since the Earth had a colder-than-average year.
  • NASA put the global average temperature at 1.6 degrees Fahrenheit above the baseline for 1951-1980, or fifth warmest, tied with 2015.
  • The European Commission's Copernicus website also ranked the year 5th warmest.
  • NOAA ranked 2022 the sixth warmest at 1.55 degrees Fahrenheit above a baseline set between 1901 and 2000. It does not yet include the Arctic in its calculations.
  • La Nina likely contributed a 0.06 degree Celsius cooling effect on global average temperatures, Schmidt said.

There's almost a 100% chance that 2023 will also be among the top 10 warmest years on record, Schmidt said. And with conditions in the central Pacific Ocean potentially flipping to an El Nino, he and Vose said 2024 could be a contender for warmest year on record.

If the warming continues, the average temperature in a single year could soon top the 1.5 degree Celsius level the world hoped to avoid with the Paris Agreement, Vose said. "There's actually a 50-50 chance that we have one year in the 2020s that maybe jumps above 1.5."

Schmidt guessed the first year with 1.5 degrees warming will be an El Nino year, probably in the early 2030s, but, he said, the world may still be two decades away from sustained warming above 1.5 degrees.

18 billion-dollar disasters in US

"In the U.S. we have consistently had the highest count and the largest diversity of different types of weather and climate extremes that lead to billion dollar disasters," said Sarah Kapnick, NOAA'S chief scientist said Thursday.

The 18 billion-dollar disasters last year were the third most on record, behind 2020 and 2021. They included Hurricane Ian, the mega-drought in the west and a massive snowstorm across much of the country in December.

With a total cost of $165 billion, the 18 disasters made it the third most costly year on record behind 2017 and 2005, the years when Hurricane Harvey and Hurricane Katrina made landfall in the U.S.

At least 474 deaths were reported last year as a result of the billion-dollar disasters.

Hurricane Ian was the costliest disaster of 2022, with estimated damages so far at $112.9 billion.

How did US weather in 2022 compare to previous years?

It was:

  • The 27th driest year on record overall.
  • The fourth driest year on record in Nebraska.
  • The ninth driest in California, thanks to wetter than average conditions during the past two months.
  • Alaska's 16th warmest year and fourth wettest year.
  • An above average year for tornadoes, with 1,331.

(c)2023 USA Today

Distributed by Tribune Content Agency, LLC.