Sunday, April 04, 2021

 

Consumers are searching online but not buying. Why?

Study reveals the keys to more efficient and profitable retargeting campaigns for online marketers

INSTITUTE FOR OPERATIONS RESEARCH AND THE MANAGEMENT SCIENCES

Research News

Key Takeaways:

  • Discounting or couponing is not the most effective way to tap the power of retargeting in online marketing.
  • Customized seller recommendations may be more powerful than discounting.
  • Seller auctions that allow marketers to self-select in the retargeting process improve cost efficiency.

CATONSVILLE, MD, April 1, 2021 - Online marketers have seen the pattern: 95% to 98% of online visitors search for something but never convert that search into a purchase, leaving the site without buying. For marketers, this gap invites speculation and assumptions that can lead to wasted time and investment in ineffective marketing programs.

One of the more common ways online marketers attempt to solve this problem is to "retarget," which tracks those consumers and reconnects with them at a later point by showing display ads when they browse other websites. You've probably noticed this when using Google Search to find something, such as a pair of shoes, and then later when you're reading a separate news site, you're exposed to a number of display ads centered on that very thing you were searching for earlier.

Once that marketer gets your attention, what can they do to increase the likelihood that you will make a purchase? That question is at the center of a new study that reveals what the best approach to increase conversion rates may be.

The research study to be published in the April issue of the INFORMS journal Marketing Science, "Consumer Search and Purchase: An Empirical Investigation of Retargeting Based on Consumer Online Behaviors," is authored by Zhenling Jiang of The Wharton School at the University of Pennsylvania; Tat Chan of Washington University in St. Louis; Hai Che of the University of California; and Youwei Wang of Fudan University in Shanghai.

To conduct their research, the authors analyzed consumer behaviors in response to two distinct marketing strategies. In one approach, they sent out coupons via those retargeted display ads to be redeemed upon purchasing. In the other approach, the authors used the same display ads to provide seller recommendations that centered on a specific product offering customized to the user, but with no coupon or discount.

"We found that while both strategies help increase the conversion rate, the seller recommendations were more effective than coupons," said Jiang. "This told us that providing consumers with the sellers' information that is most relevant to them may be a more effective way to tap the power of retargeting."

For their analysis, the authors tapped empirical data from Taobao.com, which is owned by Alibaba and is the largest online retail platform in China. Like other major e-commerce platforms, it collects consumer browsing history and can reach consumers through direct messaging on the platform, either through the website or its mobile app. The researchers built a consumer search model to establish the relationship between consumer preferences and search behaviors. They studied the behavior of 104,189 consumers who searched for a specific product among 20 sellers.

"We noticed some predictable patterns," said Jiang. "Consumers who had a higher search intensity for a specific product were more likely to actually make a purchase. Search intensity was measured in the volume of clicks tied to the same search or search term. What we found was that even where the consumer clicked on multiple possible products, it was the first link they clicked on that had the highest potential of generating a sale. In other words, after a more intense search, the consumer is more likely to go back to that initial seller once a decision to make a purchase is made."

In addition to the two basic retargeting strategies - discounting and customization - the authors proposed using an auction as the pricing mechanism for implementing these policies. The auction requires sellers to self-select: each seller specifies the criteria for its ideal customer for a specific product at a specific price point, and then bids on how much it will pay to reach that consumer.
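As a purely illustrative sketch (a generic second-price auction, not necessarily the mechanism analyzed in the paper), self-selection might look like this: each seller states the kind of consumer it wants and a bid, and only matching sellers compete for a given impression.

```python
# Toy illustration of seller self-selection in retargeting; criteria, bids and
# the second-price rule are invented for the example, not taken from the paper.
from dataclasses import dataclass

@dataclass
class Seller:
    name: str
    min_search_intensity: int   # the criterion this seller self-selects
    bid: float                  # willingness to pay to reach such a consumer

def run_retargeting_auction(consumer_intensity: int, sellers: list[Seller]):
    eligible = [s for s in sellers if consumer_intensity >= s.min_search_intensity]
    if not eligible:
        return None, 0.0
    ranked = sorted(eligible, key=lambda s: s.bid, reverse=True)
    winner = ranked[0]
    # Second-price rule used here only for illustration.
    price = ranked[1].bid if len(ranked) > 1 else winner.bid
    return winner.name, price

sellers = [Seller("A", 3, 0.50), Seller("B", 1, 0.40), Seller("C", 5, 0.90)]
print(run_retargeting_auction(consumer_intensity=4, sellers=sellers))  # ('A', 0.40)
```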

"Through our research, we were also able to show that a pricing mechanism, such as an auction, also tends to improve the effectiveness of a retargeting program," said Jiang. "When Taobao used a pricing mechanism such as an auction, the company was able to improve the efficiency of its retargeting campaigns."

###

About INFORMS and Marketing Science

Marketing Science is a premier peer-reviewed scholarly marketing journal focused on research using quantitative approaches to study all aspects of the interface between consumers and firms. It is published by INFORMS, the leading international association for operations research and analytics professionals. More information is available at http://www.informs.org or @informs.

Exploring the evolution of Earth's habitability regulated by oxygen cycle

SCIENCE CHINA PRESS

Research News

IMAGE: The status of the oxygen cycle in Earth system science (a) and its relationship with other biogeochemical cycles (b). CREDIT: Science China Press

As an essential material for the survival and reproduction of almost all aerobic organisms, oxygen is closely related to the formation and development of complex organisms. A recent review provides a systematic overview of the latest advances in the oxygen cycle at different spatial and temporal scales and the important role that oxygen plays in shaping our current habitable Earth.

Professor Jianping Huang from Lanzhou University is the corresponding author of the review, entitled "The oxygen cycle and a habitable Earth", which is the cover article of issue 64(4) of SCIENCE CHINA Earth Sciences in 2021.

Building on a summary of the latest research in the field, the authors propose a coupling model of the five spheres of the Earth system with the oxygen cycle at its core, and clarify the linking role the oxygen cycle plays within it. They comprehensively review changes in the oxygen cycle and their effects on Earth's habitability across multiple timescales, from the modern era back through geological time, and outline future directions for oxygen cycle research.

"We take O2 for granted because it is just there and we breathe it all the time, yet it took billions of years before there was enough of it to keep animals like us alive." Professor Jianping Huang of Lanzhou University, the corresponding author of the paper, points out, "These processes involve the interaction of various spheres of the Earth system, which are complex interdisciplinary issues with multiple temporal and spatial scales." In this paper, the authors illustrate how the key biochemical processes in the oxygen cycle tie together the various spheres of the Earth system through feedback and interaction. "A habitable Earth gradually formed during the long evolution of the oxygen cycle."

The effects of current human activities on the oxygen cycle and biodiversity are also discussed. "Four of the five large-scale species extinctions in Earth's history are related to a lack of oxygen," Professor Huang concluded. "At present, driven by human activities, our planet is experiencing a large-scale reduction in oxygen, with ocean deoxygenation as a prominent example. The oxygen cycle of the Earth system is gradually falling out of balance, which is deeply worrying."

Studies of the oxygen cycle span a wide range of timescales, from daily to geologic. Oxygen cycles operating on different timescales control atmospheric O2 over those corresponding timescales. However, a distinct boundary dividing the long-term and short-term oxygen cycles has yet to be established, and the complex interactions between short-term and long-term processes remain unclear. Because the Earth system is highly non-linear and strongly coupled, even a minor perturbation has the potential to cause a series of dramatic changes. "It is a top priority to connect the short-term and long-term oxygen cycles under a comparable timescale rather than separating them. Effective multidisciplinary cooperation among the subdisciplines of the Earth sciences (geology, oceanography, atmospheric sciences, paleobiology, etc.) and the social sciences should be promoted to reveal the hidden mechanisms that control the trajectory of the Earth system and how that trajectory may influence the future of human beings," said Prof. Huang.

Fortunately, efforts have been made to reverse the decline of atmospheric O2. In China, the Green Great Wall, which was designed to mitigate desertification and expand forests, has achieved overall success in recent decades. Reductions in carbon emissions, and the O2 consumption associated with them, have been achieved in some major cities around the world.
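As a rough illustration of why the timescale separation matters, consider a deliberately simplified single-box O2 budget (numbers are illustrative only and not taken from the review): because the atmospheric reservoir is enormous, even a sustained imbalance between sources and sinks changes atmospheric O2 appreciably only over geologic time.

```python
# Deliberately simplified single-box O2 budget; all numbers are illustrative
# order-of-magnitude values, not results from the review.
O2_RESERVOIR_MOL = 3.7e19         # approximate present atmospheric O2 inventory
GROSS_SOURCE_MOL_PER_YR = 1.0e16  # rough global gross photosynthetic O2 flux
IMBALANCE = 0.001                 # hypothetical 0.1% excess of sinks over sources

net_loss_per_yr = GROSS_SOURCE_MOL_PER_YR * IMBALANCE
for years in (100, 1_000_000):
    fractional_change = net_loss_per_yr * years / O2_RESERVOIR_MOL
    print(f"{years:>9,} years: {100 * fractional_change:.4f}% of the O2 reservoir")
# ~0.003% over a century but ~27% over a million years: small imbalances are
# invisible on short timescales yet decisive on geologic ones.
```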

This study has far-reaching scientific significance and important reference value for understanding the potential link between the oxygen cycle and the biodiversity in geological history and exploring the historical evolution and future of the Earth's habitability.

###

This work was supported by the National Natural Science Foundation of China (Grant No. 41888101, 41521004 & 41991231), and the China University Research Talents Recruitment Program (111 Projects, Grant No. B13045).

See the article: Huang J, Liu X, He Y, Shen S, Hou Z, Li S, Li C, Yao L, Huang J. 2021. The oxygen cycle and a habitable Earth. Science China Earth Sciences, 64(4): 511-529,
http://doi.org/10.1007/s11430-020-9747-1

 

Study finds airborne release of toxin from algal scum

Study finds a toxin released into the air from the scum that sits on ponds and lakes across the world

TAYLOR & FRANCIS GROUP

Research News

A dangerous toxin has been detected - for the first time - being released into the air from pond scum, according to research published today in the peer-reviewed journal Lake and Reservoir Management.

Not only is pond scum - otherwise known as algal bloom - an unsightly formation which can occur on still water across the world, it can also prove dangerous to wildlife and humans.

For the first time, scientists have now detected the presence of the algal toxin anatoxin-a (ATX), also known as 'Very Fast Death Factor', in the air near a Massachusetts pond with large algal blooms.

ATX can cause a range of symptoms at acute doses, including loss of coordination, muscular twitching and respiratory paralysis, and has been linked to the deaths of livestock, waterfowl and dogs from drinking contaminated water.

ATX is produced by single-celled organisms known as cyanobacteria, which can form harmful algal blooms when huge amounts of cyanobacteria grow in lake surface waters. Blooms are exacerbated by fertilizer run-off entering lakes or ponds from nearby fields and by improperly treated wastewater, both of which stimulate growth, as do high water temperatures. Cyanobacteria, also known as blue-green algae, are actually a type of bacteria that can photosynthesize.

Cyanobacterial blooms can also lead to low oxygen conditions, further degrading water quality. This is because when the algae in these large blooms die, they sink to the lake bottom and decompose, which can use up all the oxygen in the water, killing fish and other animals. The blooms also can release toxins into the water that can prove fatal for these animals.

"ATX is one of the more dangerous cyanotoxins produced by harmful algal blooms, which are becoming more predominant in lakes and ponds worldwide due to global warming and climate change," says lead author Dr James Sutherland, at the Nantucket Land Council.

ATX had never been detected in the atmosphere before, but Sutherland and his colleagues suspected that it might become airborne under certain environmental conditions.

To test this possibility, they collected samples of airborne particles from around the edge of Capaum Pond on Nantucket Island in Massachusetts, US, from July to October 2019, when it was regularly covered with algal blooms, by sucking air through a glass fiber filter. They then used an analytical technique called liquid chromatography-tandem mass spectrometry to search for ATX in these samples, as well as in samples of water from the pond.

Not only were they able to detect ATX in the pond water, at concentrations of up to 21 ng/mg, but on one occasion they also detected it in the air around the pond, at an average concentration of 0.87 ng/filter, which corresponds to a potential airborne exposure of 0.16 ng/m3. This detection occurred on a foggy day in September, after a windy night, when the ATX was likely blown from the surface of the water by the strong wind and then protected by the fog, allowing it to be detected.
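The release does not state how much air was drawn through the filter, but the two airborne figures imply it; a quick back-calculation (assuming the exposure figure is simply filter mass divided by sampled air volume):

```python
# Back-of-the-envelope check of the airborne numbers quoted above; the sampled
# air volume is not given in the release and is inferred here from those figures.
mass_on_filter_ng = 0.87      # ATX collected on the glass fiber filter
exposure_ng_per_m3 = 0.16     # reported potential airborne exposure

implied_air_volume_m3 = mass_on_filter_ng / exposure_ng_per_m3
print(f"Implied air volume drawn through the filter: {implied_air_volume_m3:.1f} m^3")
# -> roughly 5.4 m^3 of air sampled during the collection period
```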

The researchers are unsure exactly how ATX is released into the air from the pond, whether within small water droplets, attached to aerosol particles or even inside cyanobacteria blown into the air. It is also unclear what effects exposure to these trace concentrations of ATX might have on humans and wildlife, but the fact that there is exposure is clearly a cause for concern and requires further research.

While this toxin release was witnessed at one specific pond, the researchers urge people across the world to be cautious around still water carrying algal blooms, and say further research is needed to test algal blooms in ponds internationally.

"People often recreate around these lakes and ponds with algal blooms without any awareness of the potential problems," said Sutherland. " Direct contact or inhalation of these cyanotoxins can present health risks for individuals, and we have reported a potential human health exposure not previously examined."

###

 

Scientists turn to deep learning to improve air quality forecasts

PENN STATE

Research News

Air pollution from the burning of fossil fuels impacts human health but predicting pollution levels at a given time and place remains challenging, according to a team of scientists who are turning to deep learning to improve air quality estimates. Results of the team's study could be helpful for modelers examining how economic factors like industrial productivity and health factors like hospitalizations change with pollution levels.

"Air quality is one of the major issues within an urban area that affects people's lives," said Manzhu Yu, assistant professor of geography at Penn State. "Yet existing observations are not adequate to provide comprehensive information that may help vulnerable populations to plan ahead."

Satellite and ground-based observations each measure air pollution, but they are limited, the scientists said. Satellites, for instance, may pass a given location at the same time each day and miss how emissions vary at different hours. Ground-based weather stations continuously collect data but only in a limited number of locations.

To address this, the scientists used deep learning, a type of machine learning, to analyze the relationship between satellite and ground-based observations of nitrogen dioxide in the greater Los Angeles area. Nitrogen dioxide is largely associated with emissions from traffic and power plants, the scientists said.

"The problem right now is nitrogen dioxide varies a lot during the day," Yu said. "But we haven't had an hourly, sub-urban scale product available to track air pollution. By comparing surface level and satellite observations, we can actually produce estimates with higher spatial and temporal resolution."

The learned relationship allowed the researchers to take daily satellite observations and create hourly estimates of atmospheric nitrogen dioxide in roughly 3-mile grids, the scientists said. They recently reported their findings in the journal Science of the Total Environment.

"The challenge here is whether we can find a linkage between measurements from earth's surface and satellite observations of the troposphere, which are actually far away from each other. That's where deep learning comes in."

Deep learning algorithms operate much like the human brain and feature multiple layers of artificial neurons for processing data and creating patterns. The system learns and trains itself based on connections it finds within large amounts of data, the scientists said.

The scientists tested two deep-learning algorithms and found the one that compared the ground-based observations directly to the satellite observations more accurately predicted nitrogen dioxide levels. Adding information like meteorological data, elevation and the locations of the ground-based stations and major roads and power plants improved the prediction accuracy further.
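The release does not describe the network architectures the team tested. As a rough, generic sketch of the approach, mapping a satellite NO2 column plus auxiliary features such as meteorology, elevation and distance to roads or power plants onto a co-located ground-station reading, a minimal supervised regressor in PyTorch might look like this (shapes and synthetic data are stand-ins for the real matched observations):

```python
# A minimal sketch (PyTorch), not the authors' model: learn a mapping from a
# satellite NO2 column plus auxiliary features to the ground-station NO2 reading.
import torch
import torch.nn as nn

torch.manual_seed(0)
n_samples, n_features = 512, 6          # hypothetical feature count
X = torch.randn(n_samples, n_features)  # stand-in for real matched observations
y = X @ torch.rand(n_features, 1) + 0.1 * torch.randn(n_samples, 1)

model = nn.Sequential(
    nn.Linear(n_features, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

print(f"final training MSE: {loss.item():.4f}")
# A trained model of this kind could then be applied to satellite-derived grids
# to produce finer-resolution estimates of surface-level NO2.
```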

Yu said the study could be repeated for other greenhouse gases and applied to different cities or on regional and continental scales. In addition, the model could be updated when new, higher-resolution satellites are launched.

"With a high spatiotemporal resolution, our results will facilitate the study between air quality and health issues and improve the understanding of the dynamic evolution of airborne pollutants," Yu said.

###

Also contributing to this study was Qian Liu, a doctoral student at the National Science Foundation Spatiotemporal Innovation Center at George Mason University.

Serving size, satisfaction influence food waste on campus

Multicampus study identifies problem's causes and potential solutions

RICE UNIVERSITY

Research News

HOUSTON - (April 2, 2021) - Understanding what drives food choices can help high-volume food service operations like universities reduce waste, according to a new study.

Researchers have concluded that food waste in places like university cafeterias is driven by how much people put on their plates, how familiar they are with what's on the menu and how much they like - or don't like - what they're served.

Food waste has been studied often in households, but not so often in institutional settings like university dining commons. What drives food choices in these "all-you-care-to-eat" facilities is different because diners don't perceive personal financial penalty if they leave food on their plates.

Published in the journal Foods, "Food Choice and Waste in University Dining Commons -- A Menus of Change University Research Collaborative Study" was conducted by a team of experts from Rice University; the University of California, Davis; Stanford University; Lebanon Valley College; the University of California, Santa Barbara; and the University of California, Berkeley.

Co-author Eleanor Putnam-Farr, assistant marketing professor at Rice's Jones Graduate School of Business, is available to discuss the findings and potential impact with news media.

The researchers conducted student surveys during the 2019 spring and fall semesters to study food types, diner confidence and diner satisfaction. They used photos taken by diners themselves before and after eating to measure how much food was taken and how much of it went to waste. "Diners were intercepted at their dining halls and asked if they wanted to participate in a study about food choices and satisfaction, but the objective of investigating food waste behavior was not disclosed," the authors wrote.

The study found the amount of food wasted didn't significantly differ among types of food. Instead, researchers discovered waste was related to the amount of food diners put on their plates, how satisfied they were with their meals and how often they went to the dining commons. If students were satisfied with their food, they tended to waste less of it. And diners who visited the commons most often -- making them more familiar with the menus and more confident in their choices -- tended to waste less.

Mixed dishes, like sandwiches or stir-fry, took up a greater percentage of the surface area on surveyed plates than animal proteins or grains and starches. Those three types of food took up a greater area of the plates than fruits, vegetables or plant proteins. The amount of food wasted, however, did not significantly differ among the various food categories.

The mixed dishes and animal proteins that took up greater portions of the plate tended to be pre-plated by the commons staff or have a suggested serving size. The study's results showed that greater amounts of food taken by diners correlated with the item being pre-plated or served by others.
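As an illustration of the kind of tabulation the photo-based method implies (column names and numbers are invented, not the study's data), the wasted share of each plate can be estimated from the before and after food areas and then compared across satisfaction ratings and visit frequency:

```python
# Hypothetical sketch of a photo-based waste tabulation; the data and column
# names are invented for illustration and are not the study's dataset or code.
import pandas as pd

plates = pd.DataFrame({
    "diner_id":        [1, 2, 3, 4],
    "area_before_cm2": [520.0, 430.0, 610.0, 480.0],  # food area in the "before" photo
    "area_after_cm2":  [40.0, 110.0, 15.0, 230.0],    # food area in the "after" photo
    "satisfaction":    [5, 3, 5, 2],                   # 1-5 survey rating
    "visits_per_week": [6, 2, 7, 1],
})

plates["waste_fraction"] = plates["area_after_cm2"] / plates["area_before_cm2"]
print(plates.groupby("satisfaction")["waste_fraction"].mean())
print(plates[["waste_fraction", "visits_per_week"]].corr())
```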

The authors recommend that future research on the topic use their multicampus approach -- which enabled them to study food choice among a large and diverse group -- to better understand what causes food waste and find out whether it can be reduced by interventions such as posting signs that encourage healthier choices.

###

To schedule an interview with Putnam-Farr, or for more information, contact Avery Franklin, media relations specialist at Rice, at averyrf@rice.edu or 713-348-6327.

Related materials:

Paper: https://www.mdpi.com/2304-8158/10/3/577.

Putnam-Farr bio: https://business.rice.edu/person/eleanor-putnam-farr.

Follow Rice News and Media Relations via Twitter @RiceUNews.

This news release can be found online at news.rice.edu.


Landslides: New early warning systems reduce false alarms

CMCC FOUNDATION - EURO-MEDITERRANEAN CENTER ON CLIMATE CHANGE

Research News

Many slopes in the Campania region are covered with layers of volcanic soil, the result of repeated eruptions over the course of millennia. As the impacts of climate change worsen, including the occurrence of very intense and short rainfall in localized areas, there is a growing need, especially in this and other Italian regions that are vulnerable to landslides, to understand the dynamics that induce such events more precisely and develop models that can predict them. When employed through adequate early warning systems, these tools can support decision-makers in adopting effective and efficient measures to protect people and the areas themselves from landslides.

To this end, the study Exploring ERA5 reanalysis potentialities for supporting landslide investigations: a test case from Campania Region (Southern Italy) by the CMCC Foundation identifies the potential of the fifth generation of atmospheric models (known as reanalysis dataset ERA5) developed by the European Centre for Medium-Range Weather Forecasts for improving the performance of early warning systems used, for example, by the Civil Protection (Protezione Civile).

On a slope covered by pyroclastic soil, landslides are usually caused by a combination of two factors: the antecedent slope wetness conditions and a triggering condition in the form of an intense precipitation event. The early warning system currently used by the Campania region uses the latter as its only indicator, ignoring the antecedent condition of the soil. As a result, every time a precipitation event of a certain intensity is forecast, whatever the state of the ground at that moment, the system returns a state of alert, pre-alarm or alarm. This increases the likelihood of false alarms, and therefore of decisions - such as road closures - that interrupt services and that in some cases could be avoided.
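Schematically, the difference between the current rainfall-only trigger and a rule that also considers antecedent wetness can be sketched as follows (thresholds are invented for illustration and are not taken from the study):

```python
# Schematic illustration of why adding antecedent soil wetness to a
# rainfall-only trigger reduces false alarms; thresholds are invented.
RAIN_THRESHOLD_MM = 60.0      # hypothetical 24 h forecast rainfall trigger
WETNESS_THRESHOLD = 0.35      # hypothetical volumetric soil water content

def alarm_rain_only(forecast_rain_mm: float) -> bool:
    return forecast_rain_mm >= RAIN_THRESHOLD_MM

def alarm_two_factor(forecast_rain_mm: float, antecedent_soil_water: float) -> bool:
    return (forecast_rain_mm >= RAIN_THRESHOLD_MM
            and antecedent_soil_water >= WETNESS_THRESHOLD)

# Intense September storm falling on dry soil: the rainfall-only rule alarms,
# the two-factor rule does not.
print(alarm_rain_only(80.0))              # True
print(alarm_two_factor(80.0, 0.12))       # False
```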

"Reanalyses are atmospheric models, the same ones that are used for forecasting. But they are usually used in back-analysis to reconstruct past weather conditions," explains Guido Rianna, CMCC researcher and co-author of the study. "Because of this characteristic, the purpose of reanalyses is not to forecast, but to homogenize, from a spatial and temporal point of view, data from different in situ monitoring systems - such as weather stations - or from remote sensing, such as satellites."

The fifth generation of reanalysis (ERA5) has a much better spatial resolution than its predecessors, providing atmospheric variables on grid cells of about 30 km per side. In addition, daily updated data can be accessed free of charge through the Climate Data Store of the Copernicus Climate Change Service, covering the period from 1979 up to five days before the time of consultation. Such a short release delay allowed the authors of the study to assume - and then verify - good performance of the tool not only for back-analysis studies but also for operational purposes such as the development of early warning systems.
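For readers who want to retrieve the same kind of data, ERA5 single-level fields can be requested from the Climate Data Store with the cdsapi Python client. The sketch below uses dataset and variable names as they appear in the CDS catalogue; the date range and the rough Campania bounding box are chosen here purely for illustration:

```python
# Minimal ERA5 retrieval sketch using the cdsapi client for the Copernicus
# Climate Data Store (requires a free CDS account and a configured ~/.cdsapirc).
# The area box is only an approximate bounding box for Campania.
import cdsapi

c = cdsapi.Client()
c.retrieve(
    "reanalysis-era5-single-levels",
    {
        "product_type": "reanalysis",
        "variable": ["total_precipitation", "volumetric_soil_water_layer_1"],
        "year": "2005",
        "month": "03",
        "day": ["03", "04"],
        "time": [f"{h:02d}:00" for h in range(24)],
        "area": [41.5, 13.5, 40.0, 16.0],   # North, West, South, East
        "format": "netcdf",
    },
    "era5_campania_mar2005.nc",
)
```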

"We first verified the reliability of the ERA5 reanalysis in reproducing rainfall histories leading to landslide events that actually occurred in the studied area," says Alfredo Reder, CMCC researcher and first author of the publication. "Next, we analysed the specific landslide event that occurred in Nocera Inferiore (Campania) on 4 March 2005. We verified that, at that event, the soil moisture estimations offered by ERA5, although not free of constraints and limitations, would have been able to observe a very high value in terms of water content along the entire cover. Thus, we have been able to take the final step of evaluating the possibility of using these reanalyses operationally for an early warning system. In the last phase of our research, we verified that the ERA5 datasets, if used as a proxy to support a decision, could improve the reliability of the forecasting model currently used in Campania, because they can provide information on antecedent slope wetness conditions, which are a predisposing factor for a landslide event."

The results of the study suggest the potential of this tool especially for minimizing false alarms, while avoiding missed alarms.

"Any expert on landslides in pyroclastic soils would say that the occurrence of an intense precipitation event in September in Campania, happening on a soil in dry condition, can rarely trigger a landslide," concludes Rianna. "But to date there is a lack of quantitative observational support to affirm this. Today, a Civil Protection decision-maker, on the basis of existing studies in the literature, could not fail to sound the alarm in the event of a forecast of a precipitation event with certain characteristics: he could not support this choice with data on the state of the soil in the prior period (except for a few limited slopes). Our research shows that ERA5 could fill this gap and therefore minimize the number of false alarms."

The results of this research apply to the case of pyroclastic landslides, which are characteristic of the studied area. The next step will be to verify the suitability of ERA5 reanalyses for the same purpose but in the case of other soil types, such as clay soils leading to slow landslides, which are characteristic of various areas of Italy, such as the southern Apennines and some areas of Basilicata and Campania.

###

For more information:

Reder, A., Rianna, G. Exploring ERA5 reanalysis potentialities for supporting landslide investigations: a test case from Campania Region (Southern Italy). Landslides (2021). https://doi.org/10.1007/s10346-020-01610-4

 

Keeping it fresh: New AI-based strategy can assess the freshness of beef samples

Scientists combine spectroscopy and deep learning in an efficient technique for detecting spoiled meat

GIST (GWANGJU INSTITUTE OF SCIENCE AND TECHNOLOGY)

Research News

IMAGE: Consuming spoiled beef is dangerous, but there are currently no simple and efficient methods to assess beef freshness. CREDIT: Unsplash

Although beef is one of the most consumed foods around the world, eating it when it's past its prime is not only unsavory, but also poses some serious health risks. Unfortunately, available methods to check for beef freshness have various disadvantages that keep them from being useful to the public. For example, chemical analysis or microbial population evaluations take too much time and require the skills of a professional. On the other hand, non-destructive approaches based on near-infrared spectroscopy require expensive and sophisticated equipment. Could artificial intelligence be the key to a more cost-effective way to assess the freshness of beef?

At Gwangju Institute of Science and Technology (GIST), Korea, a team of scientists led by Associate Professors Kyoobin Lee and Jae Gwan Kim have developed a new strategy that combines deep learning with diffuse reflectance spectroscopy (DRS), a relatively inexpensive optical technique. "Unlike other types of spectroscopy, DRS does not require complex calibration; instead, it can be used to quantify part of the molecular composition of a sample using just an affordable and easily configurable spectrometer," explains Lee. The findings of their study are now published in Food Chemistry.

To determine the freshness of beef samples, they relied on DRS measurements to estimate the proportions of different forms of myoglobin in the meat. Myoglobin and its derivatives are the proteins mainly responsible for the color of meat and its changes during the decomposition process. However, manually converting DRS measurements into myoglobin concentrations to finally decide upon the freshness of a sample is not a very accurate strategy--and this is where deep learning comes into play.
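The release does not say how the DRS measurements are converted into myoglobin proportions. One generic way to do it (shown here with invented reference spectra; this is not necessarily the authors' method) is to unmix the measured spectrum as a non-negative combination of reference spectra for the main myoglobin forms; the deep learning step described next can then take such estimates alongside the raw spectra.

```python
# Generic spectral-unmixing illustration (not the authors' method): estimate the
# proportions of myoglobin forms by non-negative least squares, using invented
# reference spectra for deoxy-, oxy- and met-myoglobin.
import numpy as np
from scipy.optimize import nnls

wavelengths = np.linspace(500, 600, 50)                  # nm, illustrative range
references = np.stack([
    np.exp(-((wavelengths - 555) / 15) ** 2),
    np.exp(-((wavelengths - 542) / 10) ** 2) + np.exp(-((wavelengths - 580) / 10) ** 2),
    np.exp(-((wavelengths - 505) / 12) ** 2),
], axis=1)                                               # one column per form

true_fractions = np.array([0.2, 0.7, 0.1])
measured = references @ true_fractions + 0.01 * np.random.default_rng(0).normal(size=50)

fractions, _ = nnls(references, measured)
fractions /= fractions.sum()                             # normalize to proportions
print(np.round(fractions, 2))                            # close to [0.2, 0.7, 0.1]
```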

Convolutional neural networks (CNNs) are widely used artificial intelligence algorithms that can learn from a pre-classified dataset, referred to as a 'training set,' and find hidden patterns in the data to classify new inputs. To train the CNN, the researchers gathered data on 78 beef samples during their spoilage process by regularly measuring their pH (acidity) alongside their DRS profiles. After manually classifying the DRS data based on the pH values as 'fresh,' 'normal,' or 'spoiled,' they fed the algorithm the labelled DRS dataset and also fused this information with myoglobin estimations. "By providing both myoglobin and spectral information, our trained deep learning algorithm could correctly classify the freshness of beef samples in a matter of seconds in about 92% of cases," highlights Kim.
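The paper's exact architecture is not given in the release, but a minimal sketch of a 1D convolutional classifier of the kind described, taking a DRS spectrum plus fused myoglobin estimates and returning one of three freshness classes, might look like this (all shapes and layer sizes are illustrative):

```python
# Minimal PyTorch sketch of a 1D convolutional freshness classifier; shapes,
# layers and inputs are illustrative only, not the authors' architecture.
import torch
import torch.nn as nn

n_wavelengths = 128          # hypothetical number of spectral points per sample

class FreshnessCNN(nn.Module):
    def __init__(self, n_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
        )
        # 3 extra inputs stand in for the fused myoglobin-form estimates.
        self.classifier = nn.Linear(32 * (n_wavelengths // 4) + 3, n_classes)

    def forward(self, spectrum, myoglobin):
        x = self.features(spectrum.unsqueeze(1)).flatten(1)
        return self.classifier(torch.cat([x, myoglobin], dim=1))

model = FreshnessCNN()
spectra = torch.randn(8, n_wavelengths)      # a batch of 8 fake DRS spectra
myo = torch.rand(8, 3)                       # fake myoglobin-form proportions
logits = model(spectra, myo)                 # shape (8, 3): fresh / normal / spoiled
print(logits.argmax(dim=1))
```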

Besides its accuracy, the strengths of this novel strategy lie in its speed, low cost, and non-destructive nature. The team believes it may be possible to develop small, portable spectroscopic devices so that everyone can easily assess the freshness of their beef, even at home. Moreover, similar spectroscopy and CNN-based techniques could also be extended to other products, such as fish or pork. In the future, with any luck, it will be easier and more accessible to identify and avoid questionable meat.

###

About Gwangju Institute of Science and Technology (GIST)

Gwangju Institute of Science and Technology (GIST) is a research-oriented university situated in Gwangju, South Korea. One of the most prestigious schools in South Korea, it was founded in 1993. The university aims to create a strong research environment to spur advancements in science and technology and to promote collaboration between foreign and domestic research programs. With its motto, "A Proud Creator of Future Science and Technology," the university has consistently received one of the highest university rankings in Korea.
Website: http://www.gist.ac.kr/

About the authors

Kyoobin Lee is an Associate Professor and Director of the AI laboratory at GIST. His group is developing AI-based robot vision and deep learning-based bio-medical analysis methods. Before joining GIST, he obtained a PhD in Mechatronics from KAIST and completed a postdoctoral training program at Korea Institute of Science and Technology (KIST).

Jae Gwan Kim has been an Associate Professor in the Department of Biomedical Science and Engineering at GIST since 2011. His current research topics include brain stimulation by transcranial ultrasound, anesthesia depth monitoring, and screening the stage of Alzheimer's disease via brain functional connectivity measurements. Before joining GIST, he completed a postdoctoral training program at the Beckman Laser Institute and Medical Clinic at UC Irvine, USA. In 2005, he received a PhD in Biomedical Engineering from a joint program between the University of Texas at Arlington and the University of Texas Southwestern Medical Center at Dallas, USA.

NASA’s InSight Detects Two Sizable Quakes on Mars

April 2, 2021 
Doug Messier 
Parabolic Arc All Space All the Time

NASA’s InSight lander used a scoop on its robotic arm to begin trickling soil over the cable connecting its seismometer to the spacecraft on March 14, 2021, the 816th Martian day, or sol of the mission. Scientists hope insulating it from the wind will make it easier to detect marsquakes. (Credits: NASA/JPL-Caltech)

The magnitude 3.3 and 3.1 temblors originated in a region called Cerberus Fossae, further supporting the idea that this location is seismically active.


PASADENA, Calif. (NASA PR) — NASA’s InSight lander has detected two strong, clear quakes originating in a location of Mars called Cerberus Fossae – the same place where two strong quakes were seen earlier in the mission. The new quakes have magnitudes of 3.3 and 3.1; the previous quakes were magnitude 3.6 and 3.5. InSight has recorded over 500 quakes to date, but because of their clear signals, these are four of the best quake records for probing the interior of the planet.

Studying marsquakes is one way the InSight science team seeks to develop a better understanding of Mars’ mantle and core. The planet doesn’t have tectonic plates like Earth, but it does have volcanically active regions that can cause rumbles. The March 7 and March 18 quakes add weight to the idea that Cerberus Fossae is a center of seismic activity.

“Over the course of the mission, we’ve seen two different types of marsquakes: one that is more ‘Moon-like’ and the other, more ‘Earth-like,’” said Taichi Kawamura of France’s Institut de Physique du Globe de Paris, which helped provide InSight’s seismometer and distributes its data along with the Swiss research university ETH Zurich. Earthquake waves travel more directly through the planet, while those of moonquakes tend to be very scattered; marsquakes fall somewhere in between. “Interestingly,” Kawamura continued, “all four of these larger quakes, which come from Cerberus Fossae, are ‘Earth-like.’”

The new quakes have something else in common with InSight’s previous top seismic events, which occurred almost a full Martian year (two Earth years) ago: They occurred in the Martian northern summer. Scientists had predicted this would again be an ideal time to listen for quakes because winds would become calmer.

The seismometer, called the Seismic Experiment for Interior Structure (SEIS), is sensitive enough that, even while it is covered by a dome-shaped shield to block it from wind and keep it from getting too cold, wind still causes enough vibration to obscure some marsquakes. During the past northern winter season, InSight couldn’t detect any quakes at all.

“It’s wonderful to once again observe marsquakes after a long period of recording wind noise,” said John Clinton, a seismologist who leads InSight’s Marsquake Service at ETH Zurich. “One Martian year on, we are now much faster at characterizing seismic activity on the Red Planet.”

Better Detection


The winds may have quieted down, but scientists are still hoping to improve their “listening” capability even more. Temperatures near the InSight lander may swing from almost minus 148 degrees Fahrenheit (minus 100 degrees Celsius) at night to 32 degrees Fahrenheit (0 degrees Celsius) during the day. These extreme temperature variations may be causing the cable connecting the seismometer to the lander to expand and contract, resulting in popping sounds and spikes in the data.

So the mission team has begun trying to partially insulate the cable from the weather. They’ve started by using the scoop on the end of InSight’s robotic arm to drop soil on top of the domed Wind and Thermal Shield, allowing it to trickle down onto the cable. That allows the soil to get as close to the shield as possible without interfering with the shield’s seal with the ground. Burying the seismic tether is in fact one of the goals of the next phase of the mission, which NASA recently extended by two years, to December 2022.

Despite the winds that have been shaking the seismometer, InSight’s solar panels remain covered with dust, and power is running lower as Mars moves away from the Sun. Energy levels are expected to improve after July, when the planet begins to approach the Sun again. Until then, the mission will successively turn off the lander’s instruments so that InSight can hibernate, waking periodically to check its health and communicate with Earth. The team hopes to keep the seismometer on for another month or two before it has to be temporarily turned off.

More About the Mission


JPL manages InSight for NASA’s Science Mission Directorate. InSight is part of NASA’s Discovery Program, managed by the agency’s Marshall Space Flight Center in Huntsville, Alabama. Lockheed Martin Space in Denver built the InSight spacecraft, including its cruise stage and lander, and supports spacecraft operations for the mission.

A number of European partners, including France’s Centre National d’Études Spatiales (CNES) and the German Aerospace Center (DLR), are supporting the InSight mission. CNES provided the Seismic Experiment for Interior Structure (SEIS) instrument to NASA, with the principal investigator at IPGP (Institut de Physique du Globe de Paris). Significant contributions for SEIS came from IPGP; the Max Planck Institute for Solar System Research (MPS) in Germany; the Swiss Federal Institute of Technology (ETH Zurich) in Switzerland; Imperial College London and Oxford University in the United Kingdom; and JPL. InSight’s Marsquake Service is a collaborative ground service operation led by ETH Zurich that also includes on-duty seismologists from IPG Paris, University of Bristol and Imperial College London. SEIS and APSS Operations are led by CNES SISMOC, with support of CAB, and SEIS data are formatted and distributed by the IPG Paris Mars SEIS Data Service. DLR provided the Heat Flow and Physical Properties Package (HP3) instrument, with significant contributions from the Space Research Center (CBK) of the Polish Academy of Sciences and Astronika in Poland. Spain’s Centro de Astrobiología (CAB) supplied the temperature and wind sensors.

Thousands dead and a country in turmoil: Brazil’s countdown to Covid catastrophe

With more than 66,000 deaths in March alone, Brazil is in the midst of a health and political crisis. How did the country get to this point?

By Sarah Newey, Global Health Security Correspondent, 2 April 2021

A man mourns his mother in Manaus, a sprawling city in the Amazon that has been hit by two devastating waves of Covid-19 CREDIT: Simon Townsley

The calamity currently unfolding in Brazil is off the charts. In March alone, 66,570 people died of Covid-19, while daily fatalities in the vast country currently account for about a quarter of the global total.

A highly contagious variant, P1, is now rampant and there are few measures in place to contain its spread, pushing health systems to the brink of collapse.

Perhaps unsurprisingly, a political crisis is also brewing. The heads of all three branches of the military resigned this week, leaving President Jair Bolsonaro - dubbed the Trump of the Tropics - exposed.

There are growing calls for his impeachment and a Bidenesque overhaul of the country’s coronavirus response.

But for many, witnessing the pandemic unfold over the last year has felt akin to watching a slow motion car crash. Experts say the foundations for the current disaster were laid soon after the virus first reached Brazil, in late February 2020.

Here, we look back at the key moments in Brazil’s countdown to catastrophe.
March: Bolsonaro meets Trump

In January and February 2020, as it slowly dawned on the world that a “mystery pneumonia” detected in Wuhan, China was a growing threat, President Jair Bolsonaro’s public statements already framed the challenge as the economy versus the virus.

But his comments at this stage were “nuanced” and largely in line with other leaders, says Lorena Barberia, an associate professor of political science at the University of São Paulo.

And Mr Bolsonaro did not object to legislation, introduced before Covid-19 was first detected in Brazil on 26 February, that gave states a mandate to introduce restrictions and allowed for emergency quarantine measures to be adopted at a national level.

Former US President Donald Trump with Brazilian President Jair Bolsonaro during a dinner at Mar-a-Lago, Florida, on March 7 CREDIT: BRAZILIAN PRESIDENCY/AFP/ALAN SANTOS


“But something appears to have happened in March, there’s a shift in discourse,” says Prof Barberia. “There’s lots of speculation that it’s linked to President Bolsonaro’s trip to Florida [to meet Donald Trump].

“In Miami Bolsonaro makes a speech claiming the pandemic is being exaggerated, and when he comes back to Brazil he starts to say other flus have killed more people,” she adds.

Members of Mr Bolsonaro’s delegation not only returned with Covid-19 infections, but a new approach.

April: Popular health minister fired

In the weeks after the Miami trip, as state and municipal governments set up coronavirus committees and taskforces, the national government did little to mobilise a coordinated pandemic response.

“This disconnect set us off on a bad track,” says Prof Barberia, adding that the lack of a clear chain of communication throughout the pandemic has allowed misinformation to spread like wildfire.

Then, in mid April, internal tensions burst onto the public stage when Mr Bolsonaro fired his popular health minister, Luiz Henrique Mandetta, following clashes over the coronavirus response. The move sparked protests in cities across the country, as people banged on pots and pans from their windows to express frustration, fear and anger.

President of Brazil Jair Bolsonaro with the former Minister of Health, Luiz Henrique Mandetta CREDIT: HANDOUT/EPA-EFE/Shutterstock


“There was this struggle building and building,” says Ricardo Parolin Schnekenberg, a Brazilian researcher at Oxford University. “And then when Mandetta left, it became very clear that we would have no guidelines, no rules, no restrictions, or any sort of protective measure from the federal government.

“If you ask me for my key moments, they are all between February and April 2020,” he adds. “Everything else is just a consequence of decisions made - or not made - during these months.”

April: Mass burials in Manaus

Also in April Manaus, a sprawling city in the heart of the Amazon, gained notoriety as hospitals were overwhelmed and images of mass burials lapped the globe.

“Manaus was taken by surprise,” the city’s Archbishop, Leonardo Steiner, told the Telegraph last autumn. “There was this denial of the disease, [the government] just said ‘no it doesn’t exist’... Certainly we would have seen fewer deaths had the approach been different.”

Experts say the outbreak in Manaus should have been a wake-up call for the federal government, at a point when fear had already driven people to stay at home, regardless of official policy.

A bird perches on the cross that accompanies the grave of a person who died in January, in the Nossa Senhora Aparecida Cemetery, where victims of covid-19 are buried, in Manaus CREDIT: RAPHAEL ALVES/EPA-EFE/Shutterstock


But disjointed decision-making continued. “We’ve never had a consistent and clear message coming from a position of authority, saying: this is what we know, this is what we don't know, this is your official guidance,” says Mr Schnekenberg.

Instead, Mr Bolsonaro pushed unproven treatments including hydroxychloroquine, joined anti-lockdown protests and declared “war” on state government leaders who adopted disease containment measures. When quizzed after the country's death toll first surpassed that in China, he simply replied: “I don't do miracles.”

This is one in a long list of controversial comments about the “little flu” from the President. “There’s no use trying to escape it, escape reality - we have to quit being a country of sissies,” he later said.

One tracking website, Aos Fatos, found Mr Bolsonaro has made more than 1,200 false or distorted statements about the pandemic since last March.

Nurses transport a patient infected with coronavirus to the 28 de Agosto Hospital in Manaus, as the health system is pushed to the brink CREDIT: Raphael Alves/EPA-EFE/Shutterstock

In an editorial in the British Medical Journal, three professors from the University of São Paulo - Deisy Ventura, Fernando Aith and Rossana Reis - called this a “a barrage of propaganda against public health”.

A summer of missed opportunities

Missed opportunities continued throughout the summer, says Prof Barberia.

The federal government’s landmark scheme offering monthly payments of 600 reais (£77) to 68 million Brazilians was popular, but to get hold of the cash people had to queue in long lines outside Caixa bank - potentially spreading the virus.

President Bolsonaro promoted the handout as an economic stimulus, rather than a measure to allow people to stay at home like the UK’s furlough scheme. “So we saw this huge movement in cities and in local communities all over Brazil,” says Prof Barberia, as people were encouraged to shop and go back to work.

Then, as cases began to ease from June, state and municipal governments attempted to introduce tiered restrictions. While a good idea in theory, the measures became confusing and changed frequently, leading much of the public to ignore them altogether.

A health technician carries a thermal box with doses of the Chinese Coronavac vaccine in March 2021 CREDIT: Raphael Alves/EPA-EFE/Shutterstock

Yet the biggest missed opportunity was securing vaccines, says Mr Schnekenberg. Not only has Mr Bolsonaro repeatedly questioned the safety of jabs - “if you turn into a crocodile, it’s your problem,” he quipped - but he has had a lackadaisical approach to securing supplies.

The country finally agreed a deal with Pfizer for 100 million doses last month. It is hoped these will boost Brazil’s stuttering vaccination campaign, which has so far been heavily reliant on Chinese jabs secured by Sao Paulo state’s governor.

“For a country the size of Brazil, it’s just absurd that we have contracts with so few vaccine manufacturers,” says Mr Schnekenberg. “Even if we could now vaccinate at the rate seen in the US, it still would take months to be able to see the effects. So in the short term, we're in a very difficult situation.”
January: Contagious new variant hits

But policy alone isn’t to blame for the current crisis: the virus has had surprises of its own.

In January a highly contagious new variant, known as P1, was detected in Manaus, as the hard-hit city experienced a second devastating wave of infections. By February the city’s health system was once again on the brink, a surprise to many who believed the size of Manaus’ first outbreak would confer some protection against a second.

Aerial view of the Nossa Senhora Aparecida cemetery in Manaus, Amazonas state, Brazil. The site was cleared in early 2020 as coronavirus deaths first surged - it is now full CREDIT: MICHAEL DANTAS/AFP

Prof Barberia says Brazil failed to “learn from what happened in the UK”, where the Kent variant triggered a major surge from late 2020.

“P1 could have been an opportunity for a really concerted effort to call a national coordinated response,” she says. “Instead I think we've really lost momentum.

“Testing and surveillance remains limited, and we allowed P1 to spread throughout Brazil without reacting. In hindsight, we even helped its spread - at least 15 states received Covid patients from Manaus,” she adds.
What next?

Experts say Brazil’s biggest problem has been believing that prevention wasn’t necessary, that they could treat their way out of the pandemic - care for the sick and let everyone else carry on as normal.

That was a mistake, says Prof Barberia, but with President Bolsonaro’s continued emphasis on the economy - and his insistence that state governors who introduce restrictions are “tyrants” - the approach doesn’t look set to change.

A protester with crosses during a demonstration in honor of victims of coronavirus in Brasilia CREDIT: Andressa Anholete/Getty Images


With a highly volatile political situation and a slow vaccination drive - only around seven per cent of the population have had a jab so far - most experts agree the country is in for a rough few months.

“We don't have stability in terms of ministers, and it's not very clear where we are at this moment with the armed forces, so we’re at a very difficult moment,” says Prof Barberia. “I’m very worried.”

“Honestly, I don’t know, but I can’t see it ending well,” adds Mr Schnekenberg. “I think we’re still going to see very high levels of mortality for quite some time.”

‘Like Boris Johnson Telling Steve Jobs How To Design The iPhone’: Experts Urge Overhaul Of U.K. Energy Market

David Vetter
Senior Contributor, FORBES
Sustainability
Climate research, renewables and circularity




The U.K. must overhaul its entire electricity market if consumers, the economy and the climate are to benefit from new innovations in green energy such as smart homes, electric vehicles and heat pumps, a new report has concluded.


While the way Britain produces its electricity is unrecognizable from 10 years ago, with renewables such as wind and solar power helping the country to get halfway to its net zero emissions target, the outdated way the government awards contracts is preventing consumers from feeling the full benefits of a decarbonized grid, and holding back efforts to achieve grid flexibility, the report contends.

Instead, says Energy Systems Catapult, the U.K. non-profit energy and tech center which authored the report, the government should give innovative energy and tech companies the ability to drive the push to net-zero carbon emissions by decentralizing its process of contracting energy providers.


“The current government-directed approach to energy is like Boris Johnson telling Steve Jobs how to design the iPhone,” said Guy Newey, strategy and performance director at Energy Systems Catapult. “The progress on renewables over the past 10 years has been extraordinary, but if we are to finish the job of decarbonising the power sector—and create new businesses and jobs—we need to unleash the potential of our brilliant digital energy innovators to create a more flexible and greener system.”

The Catapult recommends six major changes to the U.K.’s electricity market, including a demand for more accurate, real time electricity pricing, and an obligation for sellers to source ever more low carbon energy. The report also calls for investment mechanisms such as the government’s Contracts for Difference—a form of subsidy—to be phased out, making the argument that such mechanisms were designed for a far less mature market, where the business case for renewable energy was not yet proven. The Catapult argues that these should be replaced with “outcome-based” policy mandates.

Opening the market up in this way, the authors say, will enable consumers to take full advantage of new technologies such as battery storage, heat pumps and electric vehicles, and help maximize the effectiveness of digitization, whereby home heating and other services can be controlled from the customer’s smartphone, offering greater comfort and potentially lower bills.

The Catapult says the reforms would also help facilitate “energy as a service” business models, which bundle together low-carbon energy supply with low-carbon technologies such as heat pumps or solar panels and batteries, much like a payment plan from a mobile phone network.

“What’s needed is an aggressive evolution of the energy market,” Newey told Forbes.com. “You need a set of market signals that reflect the physics of a system where supply varies.”


Until recently, Newey explained, the relationship between people and electricity providers was entirely one-way, with energy flowing from generators to consumers. But with digitization, he said, energy consumption could now be controlled and stored flexibly, improving efficiency and reducing costs.

“These new innovations will benefit from our reform proposals because electricity markets will more accurately put a value on products and services that can utilise and store energy when it is cheap and plentiful, delivering a cost saving to the energy system,” he added.
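As a toy illustration of the kind of value such price signals could unlock (all numbers invented), compare charging an electric vehicle immediately on plugging in with shifting the same charge into the cheapest hours of an hourly price series:

```python
# Toy illustration of flexibility under time-varying prices; the prices, energy
# need and charger rating are invented for the example.
prices_p_per_kwh = [28, 26, 24, 12, 8, 6, 7, 9, 15, 22, 30, 31]  # next 12 hours
energy_needed_kwh = 21.0
charger_kw = 7.0
hours_needed = int(round(energy_needed_kwh / charger_kw))        # 3 hours

naive_cost = sum(prices_p_per_kwh[:hours_needed]) * charger_kw / 100          # £
cheapest_hours = sorted(prices_p_per_kwh)[:hours_needed]
smart_cost = sum(cheapest_hours) * charger_kw / 100                           # £

print(f"charge immediately: £{naive_cost:.2f}, charge in cheapest hours: £{smart_cost:.2f}")
```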

Needless to say, firms in the energy sector have welcomed the proposals. Caroline Bragg, head of policy for the Association for Decentralised Energy, an industry body that represents a large number of energy services firms, said: “The center of gravity of the energy system is well and truly on the move—shifting from large generation and supply to energy users from industry, offices to our homes … This very timely report sets out how we can unleash the vast, hardly tapped potential of innovative offerings across zero carbon heat, flexibility and energy efficiency.”

“This is just the sort of market reform we need to drive down costs as we go renewable, to accelerate Britain’s green recovery and to make the U.K. the Silicon Valley of energy,” said Greg Jackson, CEO of Octopus Energy, a rapidly growing electricity supplier. “Adopting this approach we can make the green revolution faster and cheaper than anyone imagined. But we need to act now: neither the climate, nor citizens, should have to wait.”

For its part, the government in its recent Energy White Paper floated the possibility of energy market reform, while the U.K.’s influential Climate Change Committee in its Sixth Carbon Budget called for a “clear long-term strategy as soon as possible, and certainly before 2025, on market design for a fully decarbonised electricity system.”

In response to a Forbes.com request for comment on the report, a spokesperson for the government’s Department for Business, Energy & Industrial Strategy (BEIS) declined to say whether the Catapult’s recommendations would be taken into account in its reforms.

“As stated in our Energy White Paper, we are clear that we need to fully decarbonize electricity by 2050 as part of our commitment to building back better and greener from the coronavirus pandemic, and reaching our ambitious emissions targets,” the spokesperson said.

“It is the government’s job to create the right market conditions to deliver on that objective, encouraging competition to empower industry to find low cost routes to net zero. This is why we committed to a £1 billion net zero innovation portfolio in the Prime Minister’s Ten Point Plan—including £100 million for energy storage and flexibility—to help create the market competitors of tomorrow,” they added.


David Vetter

I spent much of the past 20 years as a journalist in Asia. Now based in Europe, my key interests are in decarbonization and the circular economy.