Thursday, December 19, 2024

 

Early warning tool will help control huge locust swarms



University of Cambridge
Image: Locust swarm in Somalia. Credit: Keith Cressman, FAO




Desert locusts typically lead solitary lives until something - like intense rainfall - triggers them to swarm in vast numbers, often with devastating consequences. 

This migratory pest can reach plague proportions, and a swarm covering one square kilometre can consume enough food in one day to feed 35,000 people. Such extensive crop destruction pushes up local food prices and can lead to riots and mass starvation.

Now a team led by the University of Cambridge has developed a way to predict when and where desert locusts will swarm, so they can be dealt with before the problem gets out of hand. 

It uses weather forecast data from the UK Met Office and state-of-the-art computational models of the insects’ movements in the air to predict where swarms will go as they search for new feeding and breeding grounds. The areas likely to be affected can then be sprayed with pesticides.
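
The release doesn’t describe the model’s internals, but the central idea, carrying a swarm along with forecast winds, can be sketched. The toy Python example below is an illustration only: wind_at is an invented stand-in for gridded Met Office data, and the daytime flight window is a crude proxy for locust behaviour; the Cambridge model also accounts for lifecycle stages and breeding-site selection.

```python
import math

def wind_at(lat: float, lon: float, t_hours: float) -> tuple:
    """Stand-in for a forecast wind field, returning (dlat, dlon) in
    degrees per hour. A real system would interpolate gridded Met
    Office forecast data here."""
    return 0.02 * math.cos(t_hours / 12.0), 0.015

def forecast_track(lat: float, lon: float, hours: int = 72,
                   flight_window: tuple = (9, 18)) -> list:
    """Drift a swarm with the wind in 1-hour steps, flying only during
    a daytime window, roughly mimicking desert locusts' daytime flight."""
    track = [(lat, lon)]
    for t in range(hours):
        if flight_window[0] <= t % 24 < flight_window[1]:
            dlat, dlon = wind_at(lat, lon, t)
            lat, lon = lat + dlat, lon + dlon
        track.append((lat, lon))
    return track

print(forecast_track(2.0, 45.0)[-1])  # predicted position after 3 days
```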

Until now, predicting and controlling locust swarms has been ‘hit and miss’, according to the researchers. Their new model, published today in the journal PLOS Computational Biology, will enable national agencies to respond quickly to a developing locust threat.

Desert locust control is a top priority for food security: it is the biggest migratory pest for smallholder farmers in many regions of Africa and Asia, and capable of long-distance travel across national boundaries.

Climate change is expected to drive more frequent desert locust swarms, by causing trigger events like cyclones and intense rainfall. These bring moisture to desert regions that allows plants to thrive, providing food for locusts that triggers their breeding.

“During a desert locust outbreak we can now predict where swarms will go several days in advance, so we can control them at particular sites. And if they’re not controlled at those sites, we can predict where they’ll go next so preparations can be made there,” said Dr Renata Retkute, a researcher in the University of Cambridge’s Department of Plant Sciences and first author of the paper.

“The important thing is to respond quickly if there’s likely to be a big locust upsurge, before it causes a major crop loss. Huge swarms can lead to really desperate situations where people could starve,” said Professor Chris Gilligan, also in the University of Cambridge’s Department of Plant Sciences and senior author of the paper.

He added: “Our model will allow us to hit the ground running in future, rather than starting from scratch as has historically been the case.”

The team noticed the need for a comprehensive model of desert locust behaviour during the response to a massive upsurge over 2019-2021, which extended from Kenya to India and put huge strain on wheat production in these regions. The infestations destroyed sugarcane, sorghum, maize and root crops. The researchers say the scientific response was hampered by the need to gather and integrate information from a range of disparate sources.

“The response to the last locust upsurge was very ad-hoc, and less efficient than it could have been. We’ve created a comprehensive model that can be used next time to control this devastating pest,” said Retkute. 

Although models like this have been attempted before, this is the first that can rapidly and reliably predict swarm behaviour. It takes into account the insects’ lifecycle and their selection of breeding sites, and can forecast locust swarm movements over both short and long timescales.

The new model has been rigorously tested using real surveillance and weather data from the last major locust upsurge. It will inform surveillance, early warning, and management of desert locust swarms by national governments, and international organisations like the Food and Agriculture Organisation of the United Nations (FAO).

The researchers say countries that haven’t experienced a locust upsurge in many years are often ill-prepared to respond, lacking the necessary surveillance teams, aircraft and pesticides. As climate change alters the movement and spread of major swarms, better planning is needed - making the new model a timely development.
 

Image: Locust swarm in Ethiopia.

 

Denali Fault tore apart ancient joining of two landmasses




University of Alaska Fairbanks





New research shows that three sites spread along an approximately 620-mile portion of today’s Denali Fault were once part of a single geologic feature marking the final joining of two landmasses. That feature was later torn apart by millions of years of tectonic activity.

The work, led by associate professor Sean Regan at the University of Alaska Fairbanks Geophysical Institute and UAF College of Natural Science and Mathematics, is featured on the cover of the December edition of Geology, the journal of The Geological Society of America.

Regan is the research paper’s lead author. UAF co-authors include doctoral student McKenzie Miller, recent master’s graduate Sean Marble and research assistant professor Florian Hofmann. Other co-authors are from St. Lawrence University, South Dakota School of Mines and Technology and the University of California, Santa Barbara.

“Our understanding of lithospheric growth, or plate growth, along the western margin of North America is becoming clearer, and a big part of that is related to reconstruction of strike-slip faults such as the Denali Fault,” Regan said. “We’re starting to recognize those primary features involved in the stitching, or the suturing, of once-distant land masses to the North American plate.”

The research focused on formations at three locations: the Clearwater Mountains of Southcentral Alaska, the Kluane Lake region of Canada’s southwestern Yukon, and the Coast Mountains near Juneau. Previous thinking among geologists was mixed, with some suggesting the three formations arose independently.

Regan’s historical reconstruction of 300 miles of horizontal movement on the Denali Fault over millions of years found that the three locations at one time formed a terminal suture zone. A terminal suture zone represents the final integration of tectonic plates or crustal fragments into a larger mass.

Regan’s work defines one of several places where the Wrangellia Composite Terrane, an oceanic plate that originated far from its current position, accreted to the western edge of North America between 72 million and 56 million years ago.

“When you think about geologists crawling around Earth’s surface trying to understand what the heck happened, it makes some sense that they might not link things that are so far apart,” Regan said of the three sites he studied. “With different geologists working in different areas, the dots don’t really get connected until you can reconstruct deformation on the Denali Fault.”

Regan’s reconstruction focused on the three sites’ inverted metamorphism, a geological phenomenon where rocks formed under higher temperatures and pressures are found overlying rocks formed under lower temperatures and pressures. This is the reverse of the typical sequence observed in regional metamorphism, where temperature and pressure generally increase with depth.

Inverted metamorphism is a key indicator of tectonic complexity and helps geologists reconstruct the processes of crustal deformation and mountain building.

“We showed that each of these three independent inverted metamorphic belts all formed at the same time under similar conditions,” Regan said. “And all occupy a very similar structural setting. Not only are they the same age, they all behaved in a similar fashion. They decrease in age, structurally, downward.”

Regan connected the three locations by analyzing their monazite, a phosphate mineral rich in the rare earth elements lanthanum, cerium and neodymium, and sometimes yttrium. He collected monazite from the two Alaska locations and used Kluane data published earlier in the year by another scientist.

“It is just the most special little mineral,” Regan said. “It can participate in a lot of reactions, so we can use it as a way to track the mineralogical evolution of a rock.”

Regan began his quest after reading a 1993 paper by researchers at the University of Alberta and the University of British Columbia, published in Geology. That paper noted similarities among the sites Regan later studied, but went only as far as labeling them a single metamorphic-plutonic belt.

A metamorphic-plutonic belt is a region characterized by the close association of metamorphic rocks and plutonic rocks that form as a result of intense tectonic activity, typically during mountain-building processes. These belts are commonly found in areas where tectonic plates converge.

“It was amazing to me that the 1993 paper hadn’t caught more attention back in the day,” Regan said. “I had this paper hung up on my wall for the last four years, because I thought it was really ahead of its time.”


CONTACTS:

• Sean Regan, University of Alaska Fairbanks Geophysical Institute, sregan5@alaska.edu

• Rod Boyce, University of Alaska Fairbanks Geophysical Institute, 907-474-7185, rcboyce@alaska.edu

 

New study highlights ethical challenges in conducting cannabis research in Canada



CAMH-led research reveals systemic barriers and ethical considerations in industry-funded cannabis research in Canada




Centre for Addiction and Mental Health




December 18, 2024 (Toronto, Canada) – In the first study of its kind, research led by the Centre for Addiction and Mental Health (CAMH) sheds light on the ethical complexities and systemic barriers facing scientists conducting cannabis research funded by the for-profit cannabis industry.

Recently published in the journal Social Science & Medicine, the qualitative study titled Canadian cannabis researcher perspectives on the conduct and sponsorship of scientific research by the for-profit cannabis industry reveals that while researchers are driven by a commitment to public health and high-quality research, systemic barriers often force them to rely on industry partnerships. Researchers adopt personal strategies—such as prioritizing transparency, aligning values, and maintaining independence—to navigate these relationships. However, the study emphasizes that these challenges reflect structural issues that require systemic, rather than individual, solutions.

“Cannabis researchers are deeply committed to conducting meaningful work that advances knowledge and improves lives,” said Dr. Daniel Buchman, CAMH Scientist. “However, we found that cannabis researchers perceive that the structural barriers and the regulatory context place them in a difficult position where they are concerned about scientific integrity, agenda bias, and conflicts of interest.”

The study involved 38 semi-structured interviews with a diverse group of academic researchers, peer researchers, and clinicians across all Canadian provinces. Participants were recruited through professional networks, email invitations, and snowball sampling, ensuring a wide range of perspectives across age, gender, geography, and racial and ethnic identities. Thematic analysis of the interviews provided a comprehensive understanding of the ethical and practical challenges posed by industry sponsorship in cannabis research.

Key recommendations in the study include increasing public funding for cannabis research to reduce reliance on industry sponsorship, enhancing transparency about industry relationships, and ensuring that research agendas are not unduly influenced by industry interests. The study also suggests implementing institutional policies that promote independence and integrity in research, such as clear guidelines for managing conflicts of interest and fostering collaborations that align with public health goals. These measures aim to safeguard the integrity of cannabis research and ensure it benefits public health.

This work builds on previous research by Buchman’s team investigating industry partnerships in cannabis studies. A 2023 paper, Cannabis companies and the sponsorship of scientific research: A cross-sectional Canadian case study, found that for-profit cannabis companies have been sponsoring, financially and non-financially, cannabis research since the legalization of recreational cannabis in Canada in 2018. The study highlighted concerns about the “funding effect,” where industry sponsorship could lead to biased, sponsor-favorable outcomes. It underscored the need for greater understanding of Canada’s regulatory environment, which often necessitates researcher-industry collaboration for certain cannabis studies.

Dr. Klara Vichnevetski, Director of the CAMH Technology Transfer and Industry Partnerships Office, who was not involved in this study, added: “As the cannabis research landscape evolves, this study underscores the importance of fostering transparent and ethical industry collaborations. By addressing potential biases in industry-sponsored research, we can build responsible partnerships that uphold high-quality, impactful research while maintaining academic integrity and public trust.”

CAMH is a Canadian leader in cannabis research, having conducted 60 cannabis-related studies over the past several years in a variety of areas, including cannabis-impaired driving, the association between cannabis and psychosis, and more.

CAMH maintains the highest ethical standards in procuring cannabis for research purposes through a well-regulated, arm’s-length process. Our researchers often obtain cannabis products directly from the Ontario Cannabis Store (OCS), a legally authorized organization that ensures all products meet federal Good Production Practices (GPP) requirements. By purchasing through the OCS, which serves as an intermediary between the cannabis industry and end users, CAMH maintains independence from industry players. This structured procurement approach helps safeguard research integrity by eliminating potential conflicts of interest and ensuring that all cannabis products used in research meet rigorous quality standards.
 
This research underscores the urgent need for systemic reforms to ensure ethical standards and public trust in the rapidly growing field of cannabis research.

As summarized by Dr. Sophie Soklaridis, Scientific Director and Research Chair of Recovery and Equity-Focused Mental Health Education Research at CAMH, and paper co-author: “Over one-third of Canadian adults use cannabis, though stigma likely means the true number is higher. Our research is clear: increased government-funded research and clearer regulations would enhance our understanding of cannabis use through independent, robust studies to better inform cannabis users in our country and around the world.”

-30-

About the Centre for Addiction and Mental Health (CAMH)

CAMH is Canada's largest mental health and addiction teaching hospital and a world-leading research centre in this field. CAMH combines clinical care, research, education, policy development and health promotion to help transform the lives of people affected by mental illness and addiction. CAMH is fully affiliated with the University of Toronto, and is a Pan American Health Organization/World Health Organization Collaborating Centre. For more information, please visit camh.ca or follow @CAMHnews on X, formerly Twitter.

 

Patience isn't a virtue; it's a coping mechanism



Studies sought to better define what constitutes patience and impatience, and the factors that determine them




University of California - Riverside




Patience — like its counterpart, impatience — has always been a sort of “I know it when I see it” concept. And that didn’t sit well with UC Riverside psychology researcher Kate Sweeny.

“Philosophers and religious scholars call patience a virtue, yet most people claim to be impatient,” Sweeny said. “That made me wonder if maybe patience is less about being a good person and more about how we deal with day-to-day frustrations.”

For the purposes of her research, Sweeny sought to better define what constitutes patience and impatience, and the factors that determine them.

Impatience, she concluded across three studies of 1,200 people, is the emotion people feel when they face a delay that seems unfair, unreasonable, or inappropriate—like a traffic jam outside of rush hour, or a meeting that should have ended 15 minutes ago. Patience, then, is how we cope with those feelings of impatience.

The studies’ findings were published recently in the journal Personality and Social Psychology Bulletin in the article “When Time is the Enemy: An Initial Test of the Process Model of Patience.”

Psychologists use the term “emotion regulation” to capture the many strategies people use to reduce (or sometimes increase) the intensity of their emotions. Patience, Sweeny asserts in a companion theoretical paper, is the subset of these strategies that particularly target feelings of impatience.

The Bulletin article reports the first studies to test that idea.

The studies asked participants to consider their responses to various frustrating situations one might encounter in everyday life. One depicted a traffic jam, another described a long, boring meeting, and others prompted them to imagine being stuck in a waiting room.

Participants indicated how impatient they would feel in response to each, then whether they would counter their impatience through strategies like distraction, deep breathing, or seeing the upsides of the situation.

The study results identified three conditions that create a “perfect storm” for impatience: when the stakes are relatively high (traffic on the way to a favorite band’s concert), when the state of waiting is unpleasant (no seats and no distractions at the DMV), and when someone is clearly to blame for the delay (the lab forgot to process your medical test). People also felt more impatient when a delay was longer than they anticipated—but surprisingly, not when the delay itself was relatively long or short.

Although nearly everyone in the studies said they would feel at least a bit impatient when facing those frustrating situations, some people were more patient than others. Participants who were more comfortable with open-ended situations and more emotionally stable (i.e., low in need for closure and neuroticism) said they wouldn’t feel too impatient in those scenarios; those who were more emotionally skilled and better at self-regulation said they would respond more patiently, even if they initially felt impatient. Being agreeable and high in empathy also predicted patience.

“Our initial findings support many of our ideas about patience and impatience,” Sweeny concludes. “We have a lot still to learn, but our approach is quite promising in terms of helping people to manage feelings of impatience and ultimately become more patient in their daily lives.”

Co-authors for “When Time is the Enemy” included graduate researchers Jason Hawes and Olivia T. Karaman. The companion theoretical paper, “On (Im)Patience: A New Approach to an Old Virtue,” was published in the journal Personality and Social Psychology Review.

Digital labels can help grocers waste less food



Consumers, stores, and the environment benefit when electronic tags allow speedy price changes



University of Texas at Austin




In 2022, U.S. grocers wasted 5 million tons of food, with 35% of it going to landfills, according to the food waste nonprofit ReFed. More than half of that waste — 2.7 million tons — was past the labels’ expiration dates.

But there’s a potential technological solution to the waste problem, according to new research from Texas McCombs. By moving from paper shelf labels to digital ones, supermarkets can easily lower prices and move older stock from their shelves to consumers’ homes.

Using technology to quickly change prices on labels, a process known as dynamic pricing, benefits more than just consumers, says Ioannis Stamatopoulos, associate professor of information, risk, and operations management.

“Everyone is better off when dynamic pricing is enabled,” he says. “There’s less food waste and less emissions from food ending up in landfills.”

Grocers save time and money, too. They can change digital labels — also called electronic shelf labels — with a few keystrokes on a tablet, compared with printing physical labels and sticking them onto shelves. When it’s easier and cheaper to update prices, Stamatopoulos says, grocers should do it more often.

How much is more often? To find out, Stamatopoulos — with Naveed Chehrazi of Washington University in St. Louis and Robert Sanders of the University of California, San Diego — analyzed two unnamed European grocery chains as they installed the tags.

One, in the United Kingdom, introduced digital labels for 940 perishable products. The labels displayed the base price and added discounts as products got closer to expiring. The researchers found the stores changed prices 54% more often.
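
The study doesn’t publish the chain’s exact discount schedule, but the mechanics are simple to sketch. Below is a minimal Python illustration of an expiry-based markdown rule; the tier thresholds and percentages are invented for the example.

```python
from datetime import date

# Hypothetical markdown tiers: (days-to-expiry threshold, discount fraction).
# The chains' actual schedules are not disclosed in the study.
MARKDOWN_TIERS = [(1, 0.50), (2, 0.30), (4, 0.15)]

def shelf_price(base_price: float, expiry: date, today: date) -> float:
    """Price an item by how close it is to its expiration date."""
    days_left = (expiry - today).days
    for threshold, discount in MARKDOWN_TIERS:
        if days_left <= threshold:
            return round(base_price * (1 - discount), 2)
    return base_price  # far from expiry: full price

today = date(2024, 12, 19)
print(shelf_price(3.00, date(2024, 12, 20), today))  # 1 day left  -> 1.5
print(shelf_price(3.00, date(2024, 12, 27), today))  # 8 days left -> 3.0
```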

Raising the Bar for Barcodes

The second supermarket chain, in the European Union, adopted electronic labels but added a second technology: expanded barcodes.

Unlike a standard barcode, an expanded one can hold inventory details such as packaging dates, lot numbers, and expiration dates. When inventory nears its “sell-by” date, a store can lower prices to stimulate buying.
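
As an illustration of what such a barcode encodes, here is a toy Python parser for a few fixed-length GS1 Application Identifiers; real GS1 parsing also handles variable-length fields such as (10) lot/batch numbers, which this sketch omits.

```python
from datetime import datetime

# A few fixed-length GS1 Application Identifiers relevant to perishables.
# Real parsers also handle variable-length AIs such as (10) lot/batch,
# which are terminated by FNC1 separators; this toy skips those.
FIXED_AIS = {
    "01": ("gtin", 14),           # Global Trade Item Number
    "13": ("packaging_date", 6),  # YYMMDD
    "17": ("expiration_date", 6), # YYMMDD
}

def parse_gs1(data: str) -> dict:
    """Parse a digit string of concatenated fixed-length GS1 element strings."""
    fields, i = {}, 0
    while i < len(data):
        name, width = FIXED_AIS[data[i:i + 2]]  # KeyError -> unsupported AI
        value = data[i + 2:i + 2 + width]
        if name.endswith("_date"):
            value = datetime.strptime(value, "%y%m%d").date()
        fields[name] = value
        i += 2 + width
    return fields

# GTIN + packaging date + expiration date
label = parse_gs1("0109506000134352" + "13241210" + "17241224")
print(label["expiration_date"])  # 2024-12-24
```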

After the EU stores installed the two technologies, the researchers found, they increased price change frequency 853%.

Rapid price changes aid shoppers, says Stamatopoulos. “If you’re a consumer who really, really cares about price, then you can buy the blueberries that will expire two days later and consume them today.”

By better managing inventories, the barcodes also boost stores’ bottom lines, he adds. “Because the grocery store can put things on discount when they’re about to expire, they can afford larger orders, so they take advantage of economies of scale in ordering.”

The Price of Dynamic Pricing

Although dynamic pricing has long-term benefits, it faces short-term obstacles. One is consumer fear that retailers will jack up prices when demand is high — as ride-hailing companies such as Uber do.

In February, the fast-food chain Wendy’s drew backlash when it announced it would roll out dynamic pricing. It quickly clarified that it would cut prices during slow periods but not raise them during busy ones.

But unlike restaurants, it’s hard for grocers to pinpoint hours of high demand for individual products, Stamatopoulos notes. “For retailers to estimate demand very finely and dynamically respond accordingly, so as to squeeze every dollar out of it, I think that’s kind of impossible,” he says.

Another barrier is cost. Grocers must invest in digital labels and tablets, while employees must update data daily for thousands of items.

Europe is ahead of the U.S. in adopting these technologies, but that could be changing. In June, Walmart announced it will transition to digital price labels in 2,300 stores by 2026. Amazon Fresh and the Midwestern chain Schnucks are also using them.

To speed the transition, Stamatopoulos suggests government subsidies, like those for solar panels and electric vehicles.

 “Somebody needs to break this equilibrium,” he says. “Then things will move to a new era where everybody’s using the additional information.”

“Inventory Information Frictions Explain Price Rigidity in Perishable Groceries” is forthcoming in Marketing Science.

Story by Suzi Morales


Large Hadron Collider regularly makes magic




University of Adelaide
Image: The magic of a mixed top-antitop final state in (a) the qq channel and (b) the gg channel. Credit: University of Adelaide/Queen Mary University of London



A brotherly research duo has discovered that when the Large Hadron Collider (LHC) produces top quarks – the heaviest known fundamental particles – it regularly creates a property known as magic.

This finding, published in Physical Review D, has implications for the progression of quantum computing: magic is a measure of how difficult a quantum system is for a non-quantum computer to simulate.

“The higher the magic, the more we need quantum computers to describe the behaviour,” explains Professor Martin White, from the University of Adelaide’s School of Physics, Chemistry and Earth Sciences, who co-led the study with his twin brother, Professor Chris White, a physicist from Queen Mary University of London.

“Studying the magic properties of quantum systems generates significant insights into the development and potential uses of quantum computers.”
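
To make the measure concrete: in the quantum-information literature, magic is commonly quantified by the stabilizer Rényi entropy, which is zero for “stabilizer” states that an ordinary computer can simulate efficiently. The numpy sketch below is a single-qubit toy, not the paper’s computation, which works with the measured top-antitop spin state.

```python
import numpy as np
from functools import reduce

# Single-qubit Pauli matrices
I = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)
PAULIS = [I, X, Y, Z]

def stabilizer_renyi_2(psi: np.ndarray) -> float:
    """Stabilizer 2-Renyi entropy of a pure n-qubit state:
    M2 = -log2( (1/d) * sum_P <psi|P|psi>^4 ) over all Pauli strings P.
    Zero for stabilizer states; positive values signal 'magic'."""
    n = int(np.log2(len(psi)))
    d = 2 ** n
    total = 0.0
    for idx in np.ndindex(*(4,) * n):
        P = reduce(np.kron, (PAULIS[i] for i in idx))
        total += np.real(psi.conj() @ P @ psi) ** 4
    return -np.log2(total / d)

zero = np.array([1, 0], dtype=complex)                        # stabilizer state
t_state = np.array([1, np.exp(1j * np.pi / 4)]) / np.sqrt(2)  # 'T' state
print(stabilizer_renyi_2(zero))     # ~0.0: no magic
print(stabilizer_renyi_2(t_state))  # ~0.415: magic
```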

The LHC is the world’s largest and most powerful particle accelerator, consisting of a 27-kilometre ring of superconducting magnets with a number of accelerating structures through which two high-energy particle beams travel at close to the speed of light before they are made to collide.

The amount of magic exhibited by top quarks depends on how fast they are moving and their direction of travel, both of which can be measured by the ATLAS and CMS detectors that observe the results of the LHC’s proton collisions.

“Quantum research has long focused on entanglement, which is where particles become linked; however, our work on magic explores how well-suited particles are for building powerful quantum computers,” says Professor White.

“The ATLAS experiment has already observed evidence of quantum entanglement. We have shown that the LHC can also observe more complex patterns of quantum behaviour at the highest energies yet attempted for these kinds of experiments.”

For decades, scientists have strived to build quantum computers that leverage the laws of quantum mechanics to achieve far greater processing power than traditional computers.

The potential benefits of quantum computers are vast, impacting fields like drug discovery and materials science. Harnessing this power requires robust and controllable quantum states, and magic plays a critical role in achieving that control.

"Our research paves the way for a deeper understanding of the connection between quantum information theory and high-energy physics,” says Professor White.

“This discovery is not just about the heaviest particles in the universe, it's about unlocking the potential of a revolutionary new computing paradigm.”