Sunday, April 04, 2021

New research on Alzheimer's Disease shows 'lifestyle origin at least in some degree'

AD-stricken brains show a genetic deficit in ability to use glucose

BRIGHAM YOUNG UNIVERSITY

Research News

For years, research to pin down the underlying cause of Alzheimer's Disease has focused on the plaque that builds up in the brains of AD patients. But treatments targeted at breaking down that buildup have been ineffective in restoring cognitive function, suggesting that the buildup may be a side effect of AD rather than the cause itself.

A new study led by a team of Brigham Young University researchers finds novel cellular-level support for an alternate theory that is growing in strength: Alzheimer's could actually be a result of metabolic dysfunction in the brain. In other words, there is growing evidence that diet and lifestyle are at the heart of Alzheimer's Disease.

"Alzheimer's Disease is increasingly being referred to as insulin resistance of the brain or Type 3 Diabetes," said senior study author Benjamin Bikman, a professor of physiology and developmental biology at BYU. "Our research shows there is likely a lifestyle origin to the disease, at least to some degree."

For the new study, published in the academic journal Alzheimer's & Dementia, the BYU research team examined RNA sequences in 240 post-mortem Alzheimer's Disease-impacted brains. They were looking specifically at the gene expression of nervous system support cells during two types of metabolism: glucose metabolism, in which carbohydrates are broken down to provide energy, and something called ketolytic metabolism.

Ketolytic metabolism involves the brain creating energy from ketones, molecules made in our body when the hormone insulin is low and we are burning relatively higher amounts of fat. The popular "Keto Diet" is named after the process, since that low-carb, high-fat diet lowers insulin levels and causes the body to burn fat instead of carbs and produce ketones.

The researchers found widespread glucose metabolism impairment in those nervous system support cells in the brains of former Alzheimer's Disease patients, but limited ketolytic metabolism impairment. The finding is significant because the brain is like a hybrid engine, able to get its fuel from either glucose or ketones; in the Alzheimer's brains studied, there appears to be a fundamental genetic deficit in the brain's ability to use glucose.
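As a rough illustration of the kind of cell-type gene-expression comparison described here (not the team's actual pipeline), the sketch below runs a per-gene test between AD and control expression values and summarizes the results for a glycolysis gene set versus a ketolysis gene set; the data, sample sizes and gene lists are toy stand-ins.

```python
# Minimal sketch: pathway-level differential expression between AD and
# control samples. Data and layout are hypothetical, not the study's pipeline.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)

# Toy expression matrix: rows = genes, columns = samples.
genes = ["HK1", "PFKM", "PKM", "OXCT1", "BDH1", "ACAT1"]
ad = pd.DataFrame(rng.normal(5, 1, (6, 20)), index=genes)
ctrl = pd.DataFrame(rng.normal(6, 1, (6, 20)), index=genes)

# Illustrative pathway membership: glycolysis vs. ketolysis genes.
pathways = {"glycolysis": ["HK1", "PFKM", "PKM"],
            "ketolysis": ["OXCT1", "BDH1", "ACAT1"]}

for name, members in pathways.items():
    # Per-gene Welch t-test (AD vs. control), then summarize per pathway.
    pvals = [stats.ttest_ind(ad.loc[g], ctrl.loc[g], equal_var=False).pvalue
             for g in members]
    frac_sig = np.mean(np.array(pvals) < 0.05)
    print(f"{name}: {frac_sig:.0%} of genes differentially expressed")
```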

"We've turned the hybrid engine of our brains into a mono-fuel system that just fails to thrive," Bikman said. "And so, the brain, which is progressively becoming deficient in its ability to use glucose, is now crying out for help; it's starving in the midst of plenty. The body is swimming in a sea of glucose, but the brain just can't use it.

"The inability to use glucose increases the value of ketones. However, because the average person is eating insulin-spiking foods so frequently, there's never any ketones available to the brain," Bikman added. "I look at these findings as a problem we've created and that we're making worse."

Previous research has observed that the brains of people with AD have a quantifiable reduction in the ability to take in and use glucose, but this paper is the first to show that it actually happens at the cellular level. It is a significant contribution to the growing paradigm shift in the scientific view of the causes of Alzheimer's.

And since ketolytic metabolism seems to keep working fine in people with AD, even when glucose metabolism gives out, the paper concludes that treatments involving ketones may be able to support brain metabolism and slow the cognitive decline associated with the disease.

###

The study authors, who include BYU professor Justin Miller and former BYU professor John Kauwe (now president of BYU-Hawaii), suggest that future research into metabolic dysfunction in Alzheimer's Disease brains should target oligodendrocytes, because genes involved in ketolysis and glycolysis are both differentially expressed in that cell type in AD brains.

Study co-authors also included Oscar Harari, Carlos Cruchaga and Kathie Mihindukulasuriya from the Washington University School of Medicine.

THE FUTURE AMAZON WORKER

A robot that senses hidden objects

System uses penetrative radio frequency to pinpoint items, even when they're hidden from view

The technology could aid fulfillment in e-commerce warehouses.

MASSACHUSETTS INSTITUTE OF TECHNOLOGY

Research News

IMAGE: MIT researchers developed a picking robot that combines vision with radio frequency (RF) sensing to find and grasp objects, even if they're hidden from view. The technology could aid fulfillment in e-commerce warehouses.

CREDIT: Image courtesy of Fadel Adib, Tara Boroushaki, Alberto Rodriguez

In recent years, robots have gained artificial vision, touch, and even smell. "Researchers have been giving robots human-like perception," says MIT Associate Professor Fadel Adib. In a new paper, Adib's team is pushing the technology a step further. "We're trying to give robots superhuman perception," he says.

The researchers have developed a robot that uses radio waves, which can pass through walls, to sense occluded objects. The robot, called RF Grasp, combines this powerful sensing with more traditional computer vision to locate and grasp items that might otherwise be blocked from view. The advance could one day streamline e-commerce fulfillment in warehouses or help a machine pluck a screwdriver from a jumbled toolkit.

The research will be presented in May at the IEEE International Conference on Robotics and Automation. The paper's lead author is Tara Boroushaki, a research assistant in the Signal Kinetics Group at the MIT Media Lab. Her MIT co-authors include Adib, who is the director of the Signal Kinetics Group; and Alberto Rodriguez, the Class of 1957 Associate Professor in the Department of Mechanical Engineering. Other co-authors include Junshan Leng, a research engineer at Harvard University, and Ian Clester, a PhD student at Georgia Tech.

As e-commerce continues to grow, warehouse work is still usually the domain of humans, not robots, despite sometimes-dangerous working conditions. That's in part because robots struggle to locate and grasp objects in such a crowded environment. "Perception and picking are two roadblocks in the industry today," says Rodriguez. Using optical vision alone, robots can't perceive the presence of an item packed away in a box or hidden behind another object on the shelf -- visible light waves, of course, don't pass through walls.

But radio waves can.

For decades, radio frequency (RF) identification has been used to track everything from library books to pets. RF identification systems have two main components: a reader and a tag. The tag is a tiny computer chip that gets attached to -- or, in the case of pets, implanted in -- the item to be tracked. The reader then emits an RF signal, which gets modulated by the tag and reflected back to the reader.

The reflected signal provides information about the location and identity of the tagged item. The technology has gained popularity in retail supply chains -- Japan aims to use RF tracking for nearly all retail purchases in a matter of years. The researchers realized this profusion of RF could be a boon for robots, giving them another mode of perception.
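The reader-tag exchange can be illustrated with a toy simulation: a constant carrier, a tag that switches its reflection on and off to encode its ID bits, and a reader that averages and thresholds each bit interval. Real systems use the EPC Gen2 protocol and far more robust modulation; everything below is a simplification.

```python
# Toy backscatter illustration: the reader emits a carrier, the tag
# modulates its reflection with its ID bits, and the reader demodulates.
import numpy as np

rng = np.random.default_rng(1)
tag_id = np.array([1, 0, 1, 1, 0, 0, 1, 0])   # 8-bit tag ID (made up)

samples_per_bit = 50
carrier = np.ones(len(tag_id) * samples_per_bit)  # constant reader carrier

# Tag switches its reflection coefficient per bit (on-off keying).
reflection = np.repeat(tag_id, samples_per_bit) * 0.5 + 0.25
backscatter = carrier * reflection + rng.normal(0, 0.05, carrier.size)

# Reader: average each bit interval and threshold to recover the ID.
levels = backscatter.reshape(len(tag_id), samples_per_bit).mean(axis=1)
decoded = (levels > levels.mean()).astype(int)
print("decoded ID:", decoded, "match:", np.array_equal(decoded, tag_id))
```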

"RF is such a different sensing modality than vision," says Rodriguez. "It would be a mistake not to explore what RF can do."

RF Grasp uses both a camera and an RF reader to find and grab tagged objects, even when they're fully blocked from the camera's view. It consists of a robotic arm attached to a grasping hand. The camera sits on the robot's wrist. The RF reader stands independent of the robot and relays tracking information to the robot's control algorithm. So, the robot is constantly collecting both RF tracking data and a visual picture of its surroundings. Integrating these two data streams into the robot's decision making was one of the biggest challenges the researchers faced.

"The robot has to decide, at each point in time, which of these streams is more important to think about," says Boroushaki. "It's not just eye-hand coordination, it's RF-eye-hand coordination. So, the problem gets very complicated."

The robot initiates the seek-and-pluck process by pinging the target object's RF tag for a sense of its whereabouts. "It starts by using RF to focus the attention of vision," says Adib. "Then you use vision to navigate fine maneuvers." The sequence is akin to hearing a siren from behind, then turning to look and get a clearer picture of the siren's source.

With its two complementary senses, RF Grasp zeroes in on the target object. As it gets closer and even starts manipulating the item, vision, which provides much finer detail than RF, dominates the robot's decision making.
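A minimal sketch of such an "RF-eye-hand" hand-off policy appears below: far from the target the controller trusts the coarse RF fix, and the weight shifts toward vision as the gripper closes in. The blending rule and the hand-off distance are illustrative assumptions, not the paper's actual controller.

```python
# Sketch of a distance-based RF/vision fusion rule (illustrative only).
import numpy as np

def fused_target_estimate(rf_pos, vision_pos, distance_to_target,
                          handoff_dist=0.5):
    """Blend RF and vision position estimates (meters).

    vision_pos may be None while the object is still occluded.
    """
    if vision_pos is None:
        return np.asarray(rf_pos)  # occluded: RF is the only cue
    # Weight shifts toward vision as the gripper closes in.
    w_vision = np.clip(1.0 - distance_to_target / handoff_dist, 0.0, 1.0)
    return w_vision * np.asarray(vision_pos) + (1 - w_vision) * np.asarray(rf_pos)

# Far away and occluded: act on RF alone.
print(fused_target_estimate([1.2, 0.4, 0.3], None, distance_to_target=1.0))
# Close in: vision dominates the fused estimate.
print(fused_target_estimate([1.2, 0.4, 0.3], [1.25, 0.38, 0.31],
                            distance_to_target=0.1))
```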

RF Grasp proved its efficiency in a battery of tests. Compared to a similar robot equipped with only a camera, RF Grasp was able to pinpoint and grab its target object with about half as much total movement. Plus, RF Grasp displayed the unique ability to "declutter" its environment -- removing packing materials and other obstacles in its way in order to access the target. Rodriguez says this demonstrates RF Grasp's "unfair advantage" over robots without penetrative RF sensing. "It has this guidance that other systems simply don't have."

RF Grasp could one day perform fulfillment in packed e-commerce warehouses. Its RF sensing could even instantly verify an item's identity without the need to manipulate the item, expose its barcode, then scan it. "RF has the potential to improve some of those limitations in industry, especially in perception and localization," says Rodriguez.

Adib also envisions potential home applications for the robot, like locating the right Allen wrench to assemble your Ikea chair. "Or you could imagine the robot finding lost items. It's like a super-Roomba that goes and retrieves my keys, wherever the heck I put them."

###

The research is sponsored by the National Science Foundation, NTT DATA, Toppan, Toppan Forms, and the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS).

How Fortnite and Zelda can up your surgical game (no joke!)

Scalpel? Check. Gaming console? Check. Study finds video games can be a new tool on the surgical tray for medical students

UNIVERSITY OF OTTAWA

Research News

Video games offer students obvious respite from the stresses of studies and, now, a study from a University of Ottawa medical student has found they could benefit surgical skills training.

Arnav Gupta carries a heavy course load as a third-year student in the Faculty of Medicine, so winding down with a game of Legend of Zelda always provides relief from the rigors of study. But Zelda may be helping improve his surgical education, too, as Gupta and a team of researchers from the University of Toronto found in a paper they recently published in the medical journal Surgery.

"Given the limited availability of simulators and the high accessibility of video games, medical students interested in surgical specialties should know that video games may be a valuable adjunct training for enhancing their medical education, especially in surgical specialties where it can be critical," says Gupta, whose findings were deciphered from a systematic review of 16 studies involving 575 participants.

"Particularly, in robotic surgery, being a video gamer was associated with improvements in time to completion, economy of motion, and overall performance. In laparoscopic surgery, video games-based training was associated with improvement in duration on certain tasks, economy of motion, accuracy, and overall performance," explains Gupta, who has been a gamer since age 8.

This study builds on past reviews and is the first to focus on a specific medical student population for which this style of training could feasibly be implemented. The timely study found that some of the most beneficial games for students of robotic surgery and laparoscopy were Super Monkey Ball, Half Life, Rocket League and Underground. Underground is purposely designed to assist medical students with their robotic surgery training via a video game console.

"While video games can never replace the value of first-hand experience, they do have merit as an adjunctive tool, especially when attempting to replicate important movements to surgery. For example, first-person shooting games require you to translate three dimensional motions onto a two-dimensional screen, which is like the concept of laparoscopic surgery," says Gupta, whose studies are focused on surgery in ophthalmology, which makes games like Resident Evil 4 or Trauma Center: New Blood fitted for his own ambitions.

"I'm not joking when I say that games such as Fortnite have the potential to enhance those necessary movements, providing stronger motivational components and in a low stakes environment."

Reports suggest 55 percent of university students are gamers and proficient with video game consoles. Yet many medical students don't admit to owning and using a gaming console.

"I think there definitely is some ambivalence towards video games in medicine," says Gupta, who is also a fan of Witcher 3. "Given how accessible games have become and how video game technology is advancing, video games definitely are an easy go-to for the students who do love them in some capacity. The hope is that maybe this study can inspire someone to take advantage of video games' unique capabilities, reduce the general ambivalence towards it, and develop some fun ways to let students engage with surgical education."


 

Consumers are searching online but not buying. Why?

Study reveals the keys to more efficient and profitable retargeting campaigns for online marketers

INSTITUTE FOR OPERATIONS RESEARCH AND THE MANAGEMENT SCIENCES

Research News

Key Takeaways:

  • Discounting or couponing is not the most effective way to tap the power of retargeting in online marketing.
  • Customized seller recommendations may be more powerful than discounting.
  • Seller auctions that allow marketers to self-select in the retargeting process improve cost efficiency.

CATONSVILLE, MD, April 1, 2021 - Online marketers have seen the pattern: 95%-98% of online visitors search for something, but the search never converts into a purchase and they leave the site without buying. For marketers, this results in speculation and assumptions that can lead to wasted time and investments in ineffective marketing programs.

One of the more common ways online marketers attempt to solve this problem is to "retarget," which tracks those consumers and reconnects with them at a later point by showing display ads when they browse other websites. You've probably noticed this when using Google Search to find something, such as a pair of shoes, and then later when you're reading a separate news site, you're exposed to a number of display ads centered on that very thing you were searching for earlier.

Once that marketer gets your attention, what can they do to increase the likelihood that you will make a purchase? That question is at the center of a new study that reveals what the best approach to increase conversion rates may be.

The research study to be published in the April issue of the INFORMS journal Marketing Science, "Consumer Search and Purchase: An Empirical Investigation of Retargeting Based on Consumer Online Behaviors," is authored by Zhenling Jiang of The Wharton School at the University of Pennsylvania; Tat Chan of Washington University in St. Louis; Hai Che of the University of California; and Youwei Wang of Fudan University in Shanghai.

To conduct their research, the authors analyzed consumer behaviors in response to two distinct marketing strategies. In one approach, they sent out coupons via those retargeted display ads to be redeemed upon purchasing. In the other approach, the authors used the same display ads to provide seller recommendations that centered on a specific product offering customized to the user, but with no coupon or discount.

"We found that while both strategies help increase the conversion rate, the seller recommendations were more effective than coupons," said Jiang. "This told us that providing consumers with the sellers' information that is most relevant to them may be a more effective way to tap the power of retargeting."

For their analysis, the authors tapped empirical data from Taobao.com, which is owned by Alibaba and is the largest online retail platform in China. Like other major e-commerce platforms, it collects consumer browsing history and can reach consumers through direct messaging on the platform, either through the website or its mobile app. The researchers built a consumer search model to establish the relationship between consumer preference and search behaviors. They studied the behavior of 104,189 consumers who searched for a specific product among 20 sellers.

"We noticed some predictable patterns," said Jiang. "Consumers who had a higher search intensity for a specific product were more likely to actually make a purchase. Search intensity was measured in the volume of clicks tied to the same search or search term. What we found was that even where the consumer clicked on multiple possible products, it was the first link they clicked on that had the highest potential of generating a sale. In other words, after a more intense search, the consumer is more likely to go back to that initial seller once a decision to make a purchase is made."

In addition to the two basic retargeting strategies - discounting and customization - the authors proposed using an auction as the pricing mechanism for implementing the policies. The auction mechanism requires sellers to self-select: a seller sets the criteria for its ideal customer for a specific product at a specific price point, and then bids on how much it will pay to reach that consumer.
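A minimal sketch of that self-selection auction is below: each seller sets a targeting criterion and a bid, only sellers whose criteria match the user compete, and the winner pays the runner-up's bid (a second-price rule, assumed here for illustration; the paper's exact mechanism may differ).

```python
# Toy self-selection auction for a single retargeting impression.
from dataclasses import dataclass

@dataclass
class Bid:
    seller: str
    bid: float                 # willingness to pay for this impression
    min_search_intensity: int  # seller's self-selected targeting criterion

def run_auction(bids, user_search_intensity):
    # Only sellers whose criteria match this user participate.
    eligible = [b for b in bids if user_search_intensity >= b.min_search_intensity]
    if not eligible:
        return None
    eligible.sort(key=lambda b: b.bid, reverse=True)
    winner = eligible[0]
    price = eligible[1].bid if len(eligible) > 1 else 0.0  # second-price rule
    return winner.seller, price

bids = [Bid("s1", 0.8, 2), Bid("s2", 0.5, 1), Bid("s3", 1.1, 5)]
print(run_auction(bids, user_search_intensity=3))  # ('s1', 0.5)
```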

"Through our research, we were also able to show that a pricing mechanism, such as an auction, also tends to improve the effectiveness of a retargeting program," said Jiang. "When Taobao used a pricing mechanism such as an auction, the company was able to improve the efficiency of its retargeting campaigns."

###

About INFORMS and Marketing Science

Marketing Science is a premier peer-reviewed scholarly marketing journal focused on research using quantitative approaches to study all aspects of the interface between consumers and firms. It is published by INFORMS, the leading international association for operations research and analytics professionals. More information is available at http://www.informs.org or @informs.

Exploring the evolution of Earth's habitability regulated by oxygen cycle

SCIENCE CHINA PRESS

Research News

IMAGE: The status of the oxygen cycle in Earth system science (a) and its relationship with other biogeochemical cycles (b).

CREDIT: Science China Press

Oxygen is essential to the survival and reproduction of almost all aerobic organisms and is closely tied to the formation and development of complex life. A recent review provides a systematic overview of the latest advances in the oxygen cycle at different spatial and temporal scales and the important role oxygen plays in shaping our habitable Earth.

Professor Jianping Huang from Lanzhou University is the corresponding author of the review, entitled "The oxygen cycle and a habitable Earth", which is the cover article of issue 64(4) of SCIENCE CHINA Earth Sciences in 2021.

Building on a synthesis of earlier research, the authors propose a model that couples the five spheres of the Earth system with the oxygen cycle at its core, and they clarify the cycle's role in linking those spheres. The paper comprehensively summarizes changes in the oxygen cycle and their effects on Earth's habitability across multiple timescales, from the modern era to geological time, and outlines future directions for oxygen cycle research.

"We take O2 for granted because it is just there and we breathe it all the time, yet it took billions of years before there was enough of it to keep animals like us alive." Professor Jianping Huang of Lanzhou University, the corresponding author of the paper, points out, "These processes involve the interaction of various spheres of the Earth system, which are complex interdisciplinary issues with multiple temporal and spatial scales." In this paper, the authors illustrate how the key biochemical processes in the oxygen cycle tie together the various spheres of the Earth system through feedback and interaction. "A habitable Earth gradually formed during the long evolution of the oxygen cycle."

The effects of current human activities on the oxygen cycle and biodiversity are also discussed. "Four of the five large-scale species extinctions in Earth's history are related to a lack of oxygen," Professor Huang concluded. "At present, driven by human activities, our planet is experiencing a large-scale oxygen reduction, with ocean deoxygenation as a representative example. The oxygen cycle of the Earth system is gradually falling out of balance, which is very worrying."

Studies of the oxygen cycle cover a wide span of timescales, from daily to geologic. The oxygen cycles of different timescales control atmospheric O2 over the corresponding periods. However, a distinct boundary dividing the long-term and short-term oxygen cycles has yet to be established, and the complex interactions between short-term and long-term processes remain unclear. Since the Earth system is a highly non-linear and strongly coupled system, a minor perturbation has the potential to cause a series of dramatic changes.

"It is a top priority to connect the short-term and long-term oxygen cycles under a comparable timescale rather than separating them. Effective multidisciplinary cooperation among the subdisciplines of Earth sciences (geology, oceanography, atmospheric sciences, paleobiology, etc.) and the social sciences should be promoted to reveal the hidden mechanisms that control the trajectory of the Earth system and how that trajectory may influence the future of human beings," said Prof. Huang.

Fortunately, efforts have been made to reverse the decline of atmospheric O2. In China, the Green Great Wall, which was designed to mitigate desertification and expand forests, has achieved overall success in past decades. Reductions in carbon emissions, and the O2 consumption associated with them, have been achieved in some major cities around the world.

The study has far-reaching scientific significance and serves as an important reference for understanding the potential link between the oxygen cycle and biodiversity in geological history and for exploring the historical evolution and future of Earth's habitability.

###

This work was supported by the National Natural Science Foundation of China (Grant No. 41888101, 41521004 & 41991231), and the China University Research Talents Recruitment Program (111 Projects, Grant No. B13045).

See the article: Huang J, Liu X, He Y, Shen S, Hou Z, Li S, Li C, Yao L, Huang J. 2021. The oxygen cycle and a habitable Earth. Science China Earth Sciences, 64(4): 511-529,
http://doi.org/10.1007/s11430-020-9747-1

 

Study finds airborne release of toxin from algal scum

Study finds an algal toxin released into the air from the scum that sits on ponds and lakes across the world

TAYLOR & FRANCIS GROUP

Research News

A dangerous toxin has been observed - for the first time - being released into the air from pond scum, research published today in the peer-reviewed journal Lake and Reservoir Management shows.

Not only is pond scum - otherwise known as algal bloom - an unsightly formation which can occur on still water across the world, it can also prove dangerous to wildlife and humans.

For the first time, scientists have now detected the presence of the algal toxin anatoxin-a (ATX), which is also known as 'Very Fast Death Factor', in the air near a Massachusetts pond with large algal blooms.

ATX can cause a range of symptoms at acute doses, including loss of coordination, muscular twitching and respiratory paralysis, and has been linked to the deaths of livestock, waterfowl and dogs from drinking contaminated water.

ATX is produced by single-celled organisms known as cyanobacteria, which can form harmful algal blooms when huge amounts of cyanobacteria grow in lake surface waters. Blooms are exacerbated by fertilizer run-off entering lakes or ponds from nearby fields or by improperly treated wastewater, both of which stimulate growth, as do high water temperatures. Cyanobacteria, also known as blue-green algae, are actually a type of bacteria that can photosynthesize.

Cyanobacterial blooms can also lead to low oxygen conditions, further degrading water quality. This is because when the algae in these large blooms die, they sink to the lake bottom and decompose, which can use up all the oxygen in the water, killing fish and other animals. The blooms also can release toxins into the water that can prove fatal for these animals.

"ATX is one of the more dangerous cyanotoxins produced by harmful algal blooms, which are becoming more predominant in lakes and ponds worldwide due to global warming and climate change," says lead author Dr James Sutherland, at the Nantucket Land Council.

ATX had never been detected in the atmosphere before, but Sutherland and his colleagues suspected that it might become airborne under certain environmental conditions.

To test this possibility, they collected samples of airborne particles from around the edge of Capaum Pond on Nantucket Island in Massachusetts, US, from July to October 2019, when it was regularly covered with algal blooms, by sucking air through a glass fiber filter. They then used an analytical technique called liquid chromatography-tandem mass spectrometry to search for ATX in these samples, as well as in samples of water from the pond.

Not only were they able to detect ATX in the pond water, at concentrations of up to 21 ng/mg, but on one occasion they also detected it in the air around the pond, at an average concentration of 0.87 ng/filter, which corresponds to a potential airborne exposure of 0.16 ng/m3. This detection occurred on a foggy day in September, after a windy night, when the ATX was likely blown from the surface of the water by the strong wind and then protected by the fog, allowing it to be detected.
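A quick back-of-envelope check connects the two airborne figures, assuming the exposure estimate is simply the mass collected on the filter divided by the volume of air drawn through it (that volume is inferred below, not reported here):

```python
# Back-of-envelope check on the airborne figures; the sampled air volume
# is inferred from the two reported numbers, not stated in the release.
mass_on_filter_ng = 0.87     # ATX collected per filter
exposure_ng_per_m3 = 0.16    # reported potential airborne exposure

implied_air_volume_m3 = mass_on_filter_ng / exposure_ng_per_m3
print(f"implied air sampled: {implied_air_volume_m3:.1f} m^3")  # ~5.4 m^3
```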

The researchers are unsure exactly how ATX is released into the air from the pond, whether within small water droplets, attached to aerosol particles or even inside cyanobacteria blown into the air. It is also unclear what effects exposure to these trace concentrations of ATX might have on humans and wildlife, but the fact that there is exposure is clearly a cause for concern and requires further research.

While this toxin was detected at one specific pond, the researchers caution people across the world to approach still water bearing algal blooms with care, and further research is needed to test algal blooms in ponds internationally.

"People often recreate around these lakes and ponds with algal blooms without any awareness of the potential problems," said Sutherland. " Direct contact or inhalation of these cyanotoxins can present health risks for individuals, and we have reported a potential human health exposure not previously examined."

###

 

Scientists turn to deep learning to improve air quality forecasts

PENN STATE

Research News

Air pollution from the burning of fossil fuels impacts human health but predicting pollution levels at a given time and place remains challenging, according to a team of scientists who are turning to deep learning to improve air quality estimates. Results of the team's study could be helpful for modelers examining how economic factors like industrial productivity and health factors like hospitalizations change with pollution levels.

"Air quality is one of the major issues within an urban area that affects people's lives," said Manzhu Yu, assistant professor of geography at Penn State. "Yet existing observations are not adequate to provide comprehensive information that may help vulnerable populations to plan ahead."

Satellite and ground-based observations each measure air pollution, but they are limited, the scientists said. Satellites, for instance, may pass a given location at the same time each day and miss how emissions vary at different hours. Ground-based weather stations continuously collect data but only in a limited number of locations.

To address this, the scientists used deep learning, a type of machine learning, to analyze the relationship between satellite and ground-based observations of nitrogen dioxide in the greater Los Angeles area. Nitrogen dioxide is largely associated with emissions from traffic and power plants, the scientists said.

"The problem right now is nitrogen dioxide varies a lot during the day," Yu said. "But we haven't had an hourly, sub-urban scale product available to track air pollution. By comparing surface level and satellite observations, we can actually produce estimates with higher spatial and temporal resolution."

The learned relationship allowed the researchers to take daily satellite observations and create hourly estimates of atmospheric nitrogen dioxide in roughly 3-mile grids, the scientists said. They recently reported their findings in the journal Science of the Total Environment.

"The challenge here is whether we can find a linkage between measurements from earth's surface and satellite observations of the troposphere, which are actually far away from each other. That's where deep learning comes in."

Deep learning algorithms, loosely modeled on the human brain, feature multiple layers of artificial neurons for processing data and finding patterns. The system learns and trains itself based on connections it finds within large amounts of data, the scientists said.

The scientists tested two deep-learning algorithms and found the one that compared the ground-based observations directly to the satellite observations more accurately predicted nitrogen dioxide levels. Adding information like meteorological data, elevation and the locations of the ground-based stations and major roads and power plants improved the prediction accuracy further.
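A minimal sketch of such a learned mapping is shown below: a small feed-forward network regressing surface NO2 on satellite NO2 plus auxiliary features. The architecture, feature list and training data are illustrative stand-ins, not the study's exact model.

```python
# Sketch: regress ground-station NO2 on satellite NO2 + auxiliary features.
import torch
import torch.nn as nn

n_features = 6  # e.g., satellite NO2, wind, temperature, elevation, ... (assumed)

model = nn.Sequential(
    nn.Linear(n_features, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),  # predicted surface NO2
)

# Toy training loop on random stand-in data.
X = torch.randn(256, n_features)
y = torch.randn(256, 1)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(100):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

# Once trained on real pairs, the model could be run on each satellite grid
# cell and hour to produce hourly, ~3-mile-resolution estimates.
```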

Yu said the study could be repeated for other greenhouse gases and applied to different cities or on regional and continental scales. In addition, the model could be updated when new, higher-resolution satellites are launched.

"With a high spatiotemporal resolution, our results will facilitate the study between air quality and health issues and improve the understanding of the dynamic evolution of airborne pollutants," Yu said.

###

Also contributing to this study was Qian Liu, a doctoral student at the National Science Foundation Spatiotemporal Innovation Center at George Mason University.

Serving size, satisfaction influence food waste on campus

Multicampus study identifies problem's causes and potential solutions

RICE UNIVERSITY

Research News

HOUSTON - (April 2, 2021) - Understanding what drives food choices can help high-volume food service operations like universities reduce waste, according to a new study.

Researchers have concluded that food waste in places like university cafeterias is driven by how much people put on their plates, how familiar they are with what's on the menu and how much they like - or don't like - what they're served.

Food waste has been studied often in households, but not so often in institutional settings like university dining commons. What drives food choices in these "all-you-care-to-eat" facilities is different because diners don't perceive personal financial penalty if they leave food on their plates.

Published in the journal Foods, "Food Choice and Waste in University Dining Commons -- A Menus of Change University Research Collaborative Study" was conducted by a team of experts from Rice University; the University of California, Davis; Stanford University; Lebanon Valley College; the University of California, Santa Barbara; and the University of California, Berkeley.

Co-author Eleanor Putnam-Farr, assistant marketing professor at Rice's Jones Graduate School of Business, is available to discuss the findings and potential impact with news media.

The researchers conducted student surveys during the 2019 spring and fall semesters to study foods types, diner confidence and diner satisfaction. They used photos taken by diners themselves before and after eating to measure how much food was taken and how much of it went to waste. "Diners were intercepted at their dining halls and asked if they wanted to participate in a study about food choices and satisfaction, but the objective of investigating food waste behavior was not disclosed," the authors wrote.

The study found the amount of food wasted didn't significantly differ among types of food. Instead, researchers discovered waste was related to the amount of food diners put on their plates, how satisfied they were with their meals and how often they went to the dining commons. If students were satisfied with their food, they tended to waste less of it. And diners who visited the commons most often -- making them more familiar with the menus and more confident in their choices -- tended to waste less.

Mixed dishes, like sandwiches or stir-fry, took up a greater percentage of the surface area on surveyed plates than animal proteins or grains and starches. Those three types of food took up a greater area of the plates than fruits, vegetables or plant proteins. The amount of food wasted, however, did not significantly differ among the various food categories.

The mixed dishes and animal proteins that took up greater portions of the plate tended to be pre-plated by the commons staff or have a suggested serving size. The study's results showed that greater amounts of food taken by diners correlated with the item being pre-plated or served by others.

The authors recommend that future research on the topic use their multicampus approach -- which enabled them to study food choice among a large and diverse group -- to better understand what causes food waste and to find out whether it can be reduced by interventions such as posting signs that encourage healthier choices.

###

To schedule an interview with Putnam-Farr, or for more information, contact Avery Franklin, media relations specialist at Rice, at averyrf@rice.edu or 713-348-6327.

Related materials:

Paper: https://www.mdpi.com/2304-8158/10/3/577

Putnam-Farr bio: https://business.rice.edu/person/eleanor-putnam-farr

Follow Rice News and Media Relations via Twitter @RiceUNews.

This news release can be found online at news.rice.edu.


Landslides: New early warning systems reduce false alarms

CMCC FOUNDATION - EURO-MEDITERRANEAN CENTER ON CLIMATE CHANGE

Research News

Many slopes in the Campania region are covered with layers of volcanic soil, the result of repeated eruptions over the course of millennia. As the impacts of climate change worsen, including the occurrence of very intense and short rainfall in localized areas, there is a growing need, especially in this and other Italian regions that are vulnerable to landslides, to understand the dynamics that induce such events more precisely and develop models that can predict them. When employed through adequate early warning systems, these tools can support decision-makers in adopting effective and efficient measures to protect people and the areas themselves from landslides.

To this end, the CMCC Foundation study Exploring ERA5 reanalysis potentialities for supporting landslide investigations: a test case from Campania Region (Southern Italy) identifies the potential of the fifth-generation atmospheric reanalysis (the ERA5 dataset), developed by the European Centre for Medium-Range Weather Forecasts, for improving the performance of early warning systems used, for example, by the Civil Protection (Protezione Civile).

On a slope covered by pyroclastic soil, landslides are usually caused by a combination of two factors: the antecedent slope wetness conditions and a triggering condition, namely an intense precipitation event. The early warning system currently used by the Campania region uses the latter as its only indicator, ignoring the antecedent condition of the soil. As a result, every time a precipitation event of a certain intensity is forecast, whatever the state of the ground at that specific moment, the system returns a state of alert, pre-alarm or alarm. This increases the likelihood of false alarms and therefore of decisions - such as road closures - which may cause interruptions to services that in some cases could be avoided.
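The difference between the two designs can be captured in a toy decision rule: the current approach alarms on forecast rain alone, while a two-factor rule also requires high antecedent soil wetness. The thresholds below are illustrative, not the operational values.

```python
# Toy comparison of a rain-only alert rule vs. a two-factor rule that also
# checks antecedent soil wetness (e.g., an ERA5-based proxy). Thresholds
# are illustrative assumptions.
def single_factor_alert(rain_mm):
    return rain_mm > 60.0  # current practice: rain intensity alone

def two_factor_alert(rain_mm, soil_wetness):
    # soil_wetness: 0 (dry) .. 1 (saturated)
    return rain_mm > 60.0 and soil_wetness > 0.7

# Intense storm on dry soil: no alarm under the two-factor rule.
print(single_factor_alert(80.0))                 # True  -> possible false alarm
print(two_factor_alert(80.0, soil_wetness=0.2))  # False
print(two_factor_alert(80.0, soil_wetness=0.9))  # True
```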

"Reanalyses are atmospheric models, the same ones that are used for forecasting. But they are usually used in back-analysis to reconstruct past weather conditions," explains Guido Rianna, CMCC researcher and co-author of the study. "Because of this characteristic, the purpose of reanalyses is not to forecast, but to homogenize, from a spatial and temporal point of view, data from different in situ monitoring systems - such as weather stations - or from remote sensing, such as satellites."

The fifth generation of reanalysis (ERA5) has a much better spatial resolution than its predecessors, providing atmospheric variables on grid cells about 30 km per side. In addition, daily updated data can be accessed free of charge through the Climate Data Store of the Copernicus Climate Change Service; the data range from 1979 up to 5 days before the date of consultation. Such a short release lag allowed the authors of the study to assume - and then verify - good performance of the tool not only for back-analysis studies but also for operational purposes such as the development of early warning systems.
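For readers who want to retrieve the relevant ERA5 fields themselves, a sketch using the Climate Data Store's cdsapi client is below (it assumes a free CDS account and credentials in ~/.cdsapirc; the area box and dates are illustrative, chosen around the Nocera Inferiore event discussed next).

```python
# Sketch: pull ERA5 precipitation and near-surface soil moisture from the
# Copernicus Climate Data Store (pip install cdsapi; needs ~/.cdsapirc).
import cdsapi

c = cdsapi.Client()
c.retrieve(
    "reanalysis-era5-single-levels",
    {
        "product_type": "reanalysis",
        "variable": [
            "total_precipitation",
            "volumetric_soil_water_layer_1",  # antecedent wetness proxy
        ],
        "year": "2005",
        "month": "03",
        "day": ["01", "02", "03", "04"],  # days leading up to the 4 March event
        "time": [f"{h:02d}:00" for h in range(24)],
        "area": [41.0, 14.0, 40.5, 14.8],  # N, W, S, E: rough Campania box
        "format": "netcdf",
    },
    "era5_campania.nc",
)
```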

"We first verified the reliability of the ERA5 reanalysis in reproducing rainfall histories leading to landslide events that actually occurred in the studied area," says Alfredo Reder, CMCC researcher and first author of the publication. "Next, we analysed the specific landslide event that occurred in Nocera Inferiore (Campania) on 4 March 2005. We verified that, at that event, the soil moisture estimations offered by ERA5, although not free of constraints and limitations, would have been able to observe a very high value in terms of water content along the entire cover. Thus, we have been able to take the final step of evaluating the possibility of using these reanalyses operationally for an early warning system. In the last phase of our research, we verified that the ERA5 datasets, if used as a proxy to support a decision, could improve the reliability of the forecasting model currently used in Campania, because they can provide information on antecedent slope wetness conditions, which are a predisposing factor for a landslide event."

The results of the study suggest the potential of this tool especially for minimizing false alarms, while avoiding missed alarms.

"Any expert on landslides in pyroclastic soils would say that the occurrence of an intense precipitation event in September in Campania, happening on a soil in dry condition, can rarely trigger a landslide," concludes Rianna. "But to date there is a lack of quantitative observational support to affirm this. Today, a Civil Protection decision-maker, on the basis of existing studies in the literature, could not fail to sound the alarm in the event of a forecast of a precipitation event with certain characteristics: he could not support this choice with data on the state of the soil in the prior period (except for a few limited slopes). Our research shows that ERA5 could fill this gap and therefore minimize the number of false alarms."

The results of this research apply to the case of pyroclastic landslides, which are characteristic of the studied area. The next step will be to verify the suitability of ERA5 reanalyses for the same purpose but in the case of other soil types, such as clay soils leading to slow landslides, which are characteristic of various areas of Italy, such as the southern Apennines and some areas of Basilicata and Campania.

###

For more information:

Reder, A., Rianna, G. Exploring ERA5 reanalysis potentialities for supporting landslide investigations: a test case from Campania Region (Southern Italy). Landslides (2021). https://doi.org/10.1007/s10346-020-01610-4

 

Keeping it fresh: New AI-based strategy can assess the freshness of beef samples

Scientists combine spectroscopy and deep learning in an efficient technique for detecting spoiled meat

GIST (GWANGJU INSTITUTE OF SCIENCE AND TECHNOLOGY)

Research News

IMAGE: Consuming spoiled beef is dangerous, but there are currently no simple and efficient methods to assess beef freshness.

CREDIT: Unsplash

Although beef is one of the most consumed foods around the world, eating it when it's past its prime is not only unsavory, but also poses some serious health risks. Unfortunately, available methods to check for beef freshness have various disadvantages that keep them from being useful to the public. For example, chemical analysis or microbial population evaluations take too much time and require the skills of a professional. On the other hand, non-destructive approaches based on near-infrared spectroscopy require expensive and sophisticated equipment. Could artificial intelligence be the key to a more cost-effective way to assess the freshness of beef?

At Gwangju Institute of Science and Technology (GIST), Korea, a team of scientists led by Associate Professors Kyoobin Lee and Jae Gwan Kim have developed a new strategy that combines deep learning with diffuse reflectance spectroscopy (DRS), a relatively inexpensive optical technique. "Unlike other types of spectroscopy, DRS does not require complex calibration; instead, it can be used to quantify part of the molecular composition of a sample using just an affordable and easily configurable spectrometer," explains Lee. The findings of their study are now published in Food Chemistry.

To determine the freshness of beef samples, they relied on DRS measurements to estimate the proportions of different forms of myoglobin in the meat. Myoglobin and its derivatives are the proteins mainly responsible for the color of meat and its changes during the decomposition process. However, manually converting DRS measurements into myoglobin concentrations to finally decide upon the freshness of a sample is not a very accurate strategy--and this is where deep learning comes into play.

Convolutional neural networks (CNNs) are widely used artificial intelligence algorithms that can learn from a pre-classified dataset, referred to as a 'training set,' and find hidden patterns in the data to classify new inputs. To train the CNN, the researchers gathered data on 78 beef samples during their spoilage process by regularly measuring their pH (acidity) alongside their DRS profiles. After manually classifying the DRS data as 'fresh,' 'normal,' or 'spoiled' based on the pH values, they fed the algorithm the labelled DRS dataset and also fused this information with myoglobin estimations. "By providing both myoglobin and spectral information, our trained deep learning algorithm could correctly classify the freshness of beef samples in a matter of seconds in about 92% of cases," highlights Kim.
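A minimal sketch of such a fused classifier is shown below: a small 1-D convolutional network over the DRS spectrum whose features are concatenated with myoglobin estimates before a three-way classification head. Layer sizes and input dimensions are illustrative assumptions, not the paper's architecture.

```python
# Sketch: 1-D CNN over a DRS spectrum, fused with myoglobin features,
# classifying samples as fresh / normal / spoiled. Dimensions are assumed.
import torch
import torch.nn as nn

class FreshnessNet(nn.Module):
    def __init__(self, spectrum_len=128, n_myoglobin=3, n_classes=3):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=5), nn.ReLU(),
            nn.AdaptiveAvgPool1d(16),  # -> (batch, 8, 16)
        )
        self.head = nn.Linear(8 * 16 + n_myoglobin, n_classes)

    def forward(self, spectrum, myoglobin):
        # spectrum: (batch, 1, spectrum_len); myoglobin: (batch, n_myoglobin)
        feats = self.conv(spectrum).flatten(1)
        return self.head(torch.cat([feats, myoglobin], dim=1))

model = FreshnessNet()
logits = model(torch.randn(4, 1, 128), torch.randn(4, 3))
print(logits.shape)  # torch.Size([4, 3]) -> class scores per sample
```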

Besides its accuracy, the strengths of this novel strategy lie in its speed, low cost, and non-destructive nature. The team believes it may be possible to develop small, portable spectroscopic devices so that everyone can easily assess the freshness of their beef, even at home. Moreover, similar spectroscopy and CNN-based techniques could also be extended to other products, such as fish or pork. In the future, with any luck, it will be easier and more accessible to identify and avoid questionable meat.

###

About Gwangju Institute of Science and Technology (GIST)

Gwangju Institute of Science and Technology (GIST) is a research-oriented university situated in Gwangju, South Korea. One of the most prestigious schools in South Korea, it was founded in 1993. The university aims to create a strong research environment to spur advancements in science and technology and to promote collaboration between foreign and domestic research programs. With its motto, "A Proud Creator of Future Science and Technology," the university has consistently received one of the highest university rankings in Korea.
Website: http://www.gist.ac.kr/

About the authors

Kyoobin Lee is an Associate Professor and Director of the AI laboratory at GIST. His group is developing AI-based robot vision and deep learning-based bio-medical analysis methods. Before joining GIST, he obtained a PhD in Mechatronics from KAIST and completed a postdoctoral training program at Korea Institute of Science and Technology (KIST).

Jae Gwan Kim has been an Associate Professor in the Department of Biomedical Science and Engineering at GIST since 2011. His current research topics include brain stimulation by transcranial ultrasound, anesthesia depth monitoring, and screening the stage of Alzheimer's disease via brain functional connectivity measurements. Before joining GIST, he completed a postdoctoral training program at the Beckman Laser Institute and Medical Clinic at UC Irvine, USA. In 2005, he received a PhD in Biomedical Engineering from a joint program between the University of Texas at Arlington and the University of Texas Southwestern Medical Center at Dallas, USA.