Thursday, December 19, 2024

 

Denali Fault tore apart ancient joining of two landmasses




University of Alaska Fairbanks





New research shows that three sites spread along an approximately 620-mile portion of today’s Denali Fault were once part of a single, smaller geologic feature marking the final joining of two landmasses. That feature was then torn apart by millions of years of tectonic activity.

The work, led by associate professor Sean Regan at the University of Alaska Fairbanks Geophysical Institute and UAF College of Natural Science and Mathematics, is featured on the cover of the December edition of Geology, the journal of The Geological Society of America.

Regan is the research paper’s lead author. UAF co-authors include doctoral student McKenzie Miller, recent master’s graduate Sean Marble and research assistant professor Florian Hofmann. Other co-authors are from St. Lawrence University, South Dakota School of Mines and Technology and the University of California, Santa Barbara.

“Our understanding of lithospheric growth, or plate growth, along the western margin in North America is becoming clearer, and a big part of that is related to reconstruction of strike-slip faults such as the Denali Fault,” Regan said. “We’re starting to recognize those primary features involved in the stitching, or the suturing, of once-distant land masses to the North American plate.”

The research focused on formations at three locations: the Clearwater Mountains of Southcentral Alaska, the Kluane Lake region of Canada’s southwestern Yukon, and the Coast Mountains near Juneau. Previous thinking among geologists was mixed, with some suggesting the three locations formed independently.

Regan’s historical reconstruction of 300 miles of horizontal movement on the Denali Fault over millions of years found that the three locations at one time formed a terminal suture zone. A terminal suture zone represents the final integration of tectonic plates or crustal fragments into a larger mass.

Regan’s work defines one of several places where the Wrangellia Composite Terrane, an oceanic plate that originated far from its current position, accreted to the western edge of North America between 72 million and 56 million years ago.

“When you think about geologists crawling around Earth’s surface trying to understand what the heck happened, it makes some sense that they might not link things that are so far apart,” Regan said of the three sites he studied. “With different geologists working in different areas, the dots don’t really get connected until you can reconstruct deformation on the Denali Fault.”

Regan’s reconstruction focused on the three sites’ inverted metamorphism, a geological phenomenon where rocks formed under higher temperatures and pressures are found overlying rocks formed under lower temperatures and pressures. This is the reverse of the typical sequence observed in regional metamorphism, where temperature and pressure generally increase with depth.

Inverted metamorphism is a key indicator of tectonic complexity and helps geologists reconstruct the processes of crustal deformation and mountain building.

“We showed that each of these three independent inverted metamorphic belts all formed at the same time under similar conditions,” Regan said. “And all occupy a very similar structural setting. Not only are they the same age, they all behaved in a similar fashion. They decrease in age, structurally, downward.”

Regan connected the three locations by analyzing their monazite, a phosphate mineral rich in the rare earth elements lanthanum, cerium and neodymium, and sometimes yttrium. He collected monazite from the two Alaska locations and used Kluane data published earlier in the year by another scientist.

“It is just the most special little mineral,” Regan said. “It can participate in a lot of reactions, so we can use it as a way to track the mineralogical evolution of a rock.”

Regan began his quest after reading a 1993 paper by researchers at the University of Alberta and the University of British Columbia, published in Geology. That paper noted similarities across the Denali Fault region later studied by Regan but went only as far as labeling the rocks a single metamorphic-plutonic belt.

A metamorphic-plutonic belt is a region characterized by the close association of metamorphic rocks and plutonic rocks that form as a result of intense tectonic activity, typically during mountain-building processes. These belts are commonly found in areas where tectonic plates converge.

“It was amazing to me that the 1993 paper hadn’t caught more attention back in the day,” Regan said. “I had this paper hung up on my wall for the last four years, because I thought it was really ahead of its time.”


CONTACTS:

• Sean Regan, University of Alaska Fairbanks Geophysical Institute, sregan5@alaska.edu

• Rod Boyce, University of Alaska Fairbanks Geophysical Institute, 907-474-7185, rcboyce@alaska.edu

 

New study highlights ethical challenges in conducting cannabis research in Canada



CAMH-led research reveals systemic barriers and ethical considerations in industry-funded cannabis research in Canada




Centre for Addiction and Mental Health




December 18, 2024 (Toronto, Canada) – In the first study of its kind, research led by the Centre for Addiction and Mental Health (CAMH) sheds light on the ethical complexities and systemic barriers facing scientists conducting cannabis research funded by the for-profit cannabis industry.

Recently published in the journal Social Science & Medicine, the qualitative study titled Canadian cannabis researcher perspectives on the conduct and sponsorship of scientific research by the for-profit cannabis industry reveals that while researchers are driven by a commitment to public health and high-quality research, systemic barriers often force them to rely on industry partnerships. Researchers adopt personal strategies—such as prioritizing transparency, aligning values, and maintaining independence—to navigate these relationships. However, the study emphasizes that these challenges reflect structural issues that require systemic, rather than individual, solutions.

“Cannabis researchers are deeply committed to conducting meaningful work that advances knowledge and improves lives,” said Dr. Daniel Buchman, CAMH Scientist. “However, we found that cannabis researchers perceive that the structural barriers and the regulatory context place them in a difficult position where they are concerned about scientific integrity, agenda bias, and conflicts of interest.”

The study involved 38 semi-structured interviews with a diverse group of academic researchers, peer researchers, and clinicians across all Canadian provinces. Participants were recruited through professional networks, email invitations, and snowball sampling, ensuring a wide range of perspectives across age, gender, geography, and racial and ethnic identities. Thematic analysis of the interviews provided a comprehensive understanding of the ethical and practical challenges posed by industry sponsorship in cannabis research.

Key recommendations in the study include increasing public funding for cannabis research to reduce reliance on industry sponsorship, enhancing transparency about industry relationships, and ensuring that research agendas are not unduly influenced by industry interests. The study also suggests implementing institutional policies that promote independence and integrity in research, such as clear guidelines for managing conflicts of interest and fostering collaborations that align with public health goals. These measures aim to safeguard the integrity of cannabis research and ensure it benefits public health.

This work builds on previous research by Buchman’s team investigating industry partnerships in cannabis studies. A 2023 paper, Cannabis companies and the sponsorship of scientific research: A cross-sectional Canadian case study, found that for-profit cannabis companies have been sponsoring, financially and non-financially, cannabis research since the legalization of recreational cannabis in Canada in 2018. The study highlighted concerns about the “funding effect,” where industry sponsorship could lead to biased, sponsor-favorable outcomes. It underscored the need for greater understanding of Canada’s regulatory environment, which often necessitates researcher-industry collaboration for certain cannabis studies.

Dr. Klara Vichnevetski, Director of the CAMH Technology Transfer and Industry Partnerships Office, who was not involved in this study, added: “As the cannabis research landscape evolves, this study underscores the importance of fostering transparent and ethical industry collaborations. By addressing potential biases in industry-sponsored research, we can build responsible partnerships that uphold high-quality, impactful research while maintaining academic integrity and public trust.”

CAMH is a Canadian leader in cannabis research, having conducted 60 cannabis-related studies over the past several years in a variety of areas, including cannabis’s effects on driver impairment, the association between cannabis and psychosis, and more.

CAMH maintains the highest ethical standards in procuring cannabis for research purposes through a well-regulated, arm’s-length process. Our researchers often obtain cannabis products directly from the Ontario Cannabis Store (OCS), a legally authorized organization that ensures all products meet federal Good Production Practices (GPP) requirements. By purchasing through the OCS, which serves as an intermediary between the cannabis industry and end users, CAMH maintains independence from industry players. This structured procurement approach helps safeguard research integrity by eliminating potential conflicts of interest and ensuring that all cannabis products used in research meet rigorous quality standards.
 
This research underscores the timely need for systemic reforms to ensure ethical standards and public trust in the rapidly growing field of cannabis research.

As summarized by Dr. Sophie Soklaridis, Scientific Director and Research Chair of Recovery and Equity-Focused Mental Health Education Research at CAMH, and paper co-author: “Over one-third of Canadian adults use cannabis, though stigma likely means the true number is higher. Our research is clear: increased government-funded research and clearer regulations would enhance our understanding of cannabis use through independent, robust studies to better inform cannabis users in our country and around the world.”

-30-

About the Centre for Addiction and Mental Health (CAMH)

CAMH is Canada's largest mental health and addiction teaching hospital and a world-leading research centre in this field. CAMH combines clinical care, research, education, policy development and health promotion to help transform the lives of people affected by mental illness and addiction. CAMH is fully affiliated with the University of Toronto, and is a Pan American Health Organization/World Health Organization Collaborating Centre. For more information, please visit camh.ca or follow @CAMHnews on X, formerly Twitter.

 

Patience isn't a virtue; it's a coping mechanism



Studies sought to better define what constitutes patience and impatience, and the factors that determine them




University of California - Riverside




Patience, like its counterpart impatience, has always been a sort of “I know it when I see it” concept. And that didn’t sit well with UC Riverside psychology researcher Kate Sweeny.

“Philosophers and religious scholars call patience a virtue, yet most people claim to be impatient,” Sweeny said. “That made me wonder if maybe patience is less about being a good person and more about how we deal with day-to-day frustrations.”

For purposes of her research, Sweeny sought to better define what constitutes patience and impatience, and the factors that determine them.

Impatience, she concluded across three studies of 1,200 people, is the emotion people feel when they face a delay that seems unfair, unreasonable, or inappropriate—like a traffic jam outside of rush hour, or a meeting that should have ended 15 minutes ago. Patience, then, is how we cope with those feelings of impatience.

The studies’ findings were published recently in the journal Personality and Social Psychology Bulletin in the article “When Time is the Enemy: An Initial Test of the Process Model of Patience.”

Psychologists use the term “emotion regulation” to capture the many strategies people use to reduce (or sometimes increase) the intensity of their emotions. Patience, Sweeny asserts in a companion theoretical paper, is the subset of these strategies that particularly target feelings of impatience.

The newly published studies are the first to test that idea.

The studies asked participants to consider their responses to various frustrating situations one might encounter in everyday life. One depicted a traffic jam, another described a long, boring meeting, and others prompted them to imagine being stuck in a waiting room.

Participants indicated how impatient they would feel in response to each, then whether they would counter their impatience through strategies like distraction, deep breathing, or seeing the upsides of the situation.

The study results identified three conditions that create a “perfect storm” for impatience: when the stakes are relatively high (traffic on the way to a favorite band’s concert), when the state of waiting is unpleasant (no seats and no distractions at the DMV), and when someone is clearly to blame for the delay (the lab forgot to process your medical test). People also felt more impatient when a delay was longer than they anticipated, but, surprisingly, not simply because the delay was objectively long or short.

Although nearly everyone in the studies said they would feel at least a bit impatient when facing those frustrating situations, some people were more patient than others. Participants who were more comfortable with open-ended situations and more emotionally stable (i.e., low in need for closure and neuroticism) said they wouldn’t feel too impatient in those scenarios; those who were more emotionally skilled and better at self-regulation said they would respond more patiently, even if they initially felt impatient. Being agreeable and high in empathy also predicted patience.

“Our initial findings support many of our ideas about patience and impatience,” Sweeny concludes. “We have a lot still to learn, but our approach is quite promising in terms of helping people to manage feelings of impatience and ultimately become more patient in their daily lives.”

Co-authors for “When Time is the Enemy” included graduate researchers Jason Hawes and Olivia T. Karaman. The companion theoretical paper, “On (Im)Patience: A New Approach to an Old Virtue,” was published in the journal Personality and Social Psychology Review.

Digital labels can help grocers waste less food



Consumers, stores, and the environment benefit when electronic tags allow speedy price changes



University of Texas at Austin




In 2022, U.S. grocers wasted 5 million tons of food, with 35% of it going to landfills, according to the food waste nonprofit ReFed. More than half of that waste, 2.7 million tons, was past the expiration dates on its labels.

But there’s a potential technological solution to the waste problem, according to new research from Texas McCombs. By moving from paper shelf labels to digital ones, supermarkets can easily lower prices and move older stock from their shelves to consumers’ homes.

Using technology to quickly change prices on labels, a process known as dynamic pricing, benefits more than just consumers, says Ioannis Stamatopoulos, associate professor of information, risk, and operations management.

“Everyone is better off when dynamic pricing is enabled,” he says. “There’s less food waste and less emissions from food ending up in landfills.”

Grocers save time and money, too. They can change digital labels — also called electronic shelf labels — with a few keystrokes on a tablet, compared with printing physical labels and sticking them onto shelves. When it’s easier and cheaper to update prices, Stamatopoulos says, grocers should do it more often.

How much is more often? To find out, Stamatopoulos — with Naveed Chehrazi of Washington University in St. Louis and Robert Sanders of the University of California, San Diego — analyzed two unnamed European grocery chains as they installed the tags.

One, in the United Kingdom, introduced digital labels for 940 perishable products. The labels displayed the base price and added discounts as products got closer to expiring. The researchers found the stores changed prices 54% more often.

Raising the Bar for Barcodes

The second supermarket chain, in the European Union, adopted electronic labels but added a second technology: expanded barcodes.

Unlike a standard barcode, an expanded one can hold inventory details such as packaging dates, lot numbers, and expiration dates. When inventory nears its “sell-by” date, a store can lower prices to stimulate buying.

After the EU stores installed the two technologies, the researchers found, they increased price-change frequency by 853%.

Rapid price changes aid shoppers, says Stamatopoulos. “If you’re a consumer who really, really cares about price, then you can buy the blueberries that will expire two days later and consume them today.”

By better managing inventories, the barcodes also boost stores’ bottom lines, he adds. “Because the grocery store can put things on discount when they’re about to expire, they can afford larger orders, so they take advantage of economies of scale in ordering.”

The Price of Dynamic Pricing

Although dynamic pricing has long-term benefits, it faces short-term obstacles. One is consumer fear that retailers will jack up prices when demand is high — as ride-hailing companies such as Uber do.

In February, when the fast-food chain Wendy’s announced it would roll out dynamic pricing, it drew backlash. It quickly clarified that it would cut prices during slow periods but not raise them during busy periods.

But unlike restaurants, it’s hard for grocers to pinpoint hours of high demand for individual products, Stamatopoulos notes. “For retailers to estimate demand very finely and dynamically respond accordingly, so as to squeeze every dollar out of it, I think that’s kind of impossible,” he says.

Another barrier is cost. Grocers must invest in digital labels and tablets, while employees must update data daily for thousands of items.

Europe is ahead of the U.S. in adopting these technologies, but that could be changing. In June, Walmart announced it will transition to digital price labels in 2,300 stores by 2026. Amazon Fresh and the Midwestern chain Schnucks are also using them.

To speed the transition, Stamatopoulos suggests government subsidies, like those for solar panels and electric vehicles.

 “Somebody needs to break this equilibrium,” he says. “Then things will move to a new era where everybody’s using the additional information.”

“Inventory Information Frictions Explain Price Rigidity in Perishable Groceries” is forthcoming in Marketing Science.

Story by Suzi Morales


Large Hadron Collider regularly makes magic




University of Adelaide
Image caption: The magic of a mixed top-antitop final state in (a) the qq channel and (b) the gg channel.

Credit: University of Adelaide/Queen Mary University of London



A brotherly research duo has discovered that when the Large Hadron Collider (LHC) produces top quarks – the heaviest known fundamental particles – it regularly creates a property known as magic.

This finding, published in Physical Review D, has implications for the progression of quantum computing, with magic being a measure that describes how difficult a quantum system is for a non-quantum computer to calculate.

“The higher the magic, the more we need quantum computers to describe the behaviour,” explains Professor Martin White, from the University of Adelaide’s School of Physics, Chemistry and Earth Sciences, who co-led the study with his twin brother, Professor Chris White, a physicist from Queen Mary University of London.

“Studying the magic properties of quantum systems generates significant insights into the development and potential uses of quantum computers.”

The LHC is the world’s largest and most powerful particle accelerator, consisting of a 27-kilometre ring of superconducting magnets with a number of accelerating structures through which two high-energy particle beams travel at close to the speed of light before they are made to collide.

The amount of magic exhibited by top quarks depends on how fast they are moving and their direction of travel, both of which can be measured by the ATLAS and CMS detectors that observe the results of the LHC proton collisions.

“Quantum research has long focused on entanglement, which is where particles become linked; however, our work on magic explores how well-suited particles are for building powerful quantum computers,” says Professor White.

“The ATLAS experiment has already observed evidence of quantum entanglement. We have shown that the LHC can also observe more complex patterns of quantum behaviour at the highest energies yet attempted for these kinds of experiments.”

For decades, scientists have strived to build quantum computers that leverage the laws of quantum mechanics to achieve far greater processing power than traditional computers.

The potential benefits of quantum computers are vast, impacting fields like drug discovery and materials science. Harnessing this power requires robust and controllable quantum states, and magic plays a critical role in achieving that control.

"Our research paves the way for a deeper understanding of the connection between quantum information theory and high-energy physics,” says Professor White.

“This discovery is not just about the heaviest particles in the universe, it's about unlocking the potential of a revolutionary new computing paradigm.”

More people living without running water in US cities since the global financial crisis of 2008-2009


King's College London
Image caption: Map of US households without access to running water in 2017–2021. Cities with notable concentrations are listed by name. Source data: US Census Bureau.

Credit: Meehan et al. 2024




More American cities – even those seen as affluent – are home to people living without running water as residents are ‘squeezed’ by unaffordable housing and the cost-of-living crisis, new research finds.

Published in Nature Cities, the study revealed the problem worsened following changes to the housing market triggered by the 2008 global crash. And since 2017 it has been “expanding in scope and severity” to affect a broader array of US cities including Portland (OR), Phoenix, Houston, Atlanta, Dallas-Fort Worth, and Philadelphia, as well as large urban areas such as Los Angeles, New York City and San Francisco.

The research also found that people of color were disproportionately affected by a lack of household water, a situation defined by the authors as ‘plumbing poverty’, in 12 of the 15 largest cities.

The researchers from King’s College London and the University of Arizona said the findings should “raise alarm bells” and warned it would take a “heroic” transformation of housing conditions and social infrastructures for the USA to meet the United Nations goal for everyone to have access to safe drinking water, sanitation and hygiene.

Lead researcher Professor Katie Meehan, Professor of Environmental Justice at King’s College London, UK, said: “It is alarming how many US cities, including those thought of as affluent and growing, are now home to more people living in situations of extreme poverty, namely without access to running water.

“Our research is the first effort to track these changes over time, starting in the 1970s and noting a dramatic urbanization of plumbing poverty in the 1990s and a sharp acceleration triggered by the 2008 crash and the current housing and cost-of-living crisis.

“The compound pressures of high housing costs and expenditures mean that more low-income, asset-limited people are living without running water in these expensive cities. Far too many people, especially those of color, are now in such extreme poverty they are being pushed into homes that do not meet the basic standard for human dignity and life.”

Meehan said people can find themselves living without running water because of a range of reasons and, in most cases, people are working but not earning enough to make ends meet. Some households might have been disconnected from water service after falling behind with bills or had to “downgrade” to housing without any water access because other expenses take priority. Others might be in homes which have been poorly maintained by their landlord but cannot afford to move out, some might be living in buildings such as sheds or warehouses not designed to be homes, while others could be experiencing homelessness.

Lucy Everitt, a PhD student at King’s College London who was part of the research team, said water service shut-offs are a hidden problem across US cities that may only be picked up indirectly by US census data.

“New York City tops the ‘worst offenders’ list for the total number of households in a US metro without running water. Despite this, the municipal Water Board issued more than 2,400 shutoff notices in March of this year alone to properties behind in their payments. Because our analysis tracks the status of running water in households, as measured by the US Census, we anticipate that we are capturing many thousands of households whose access is denied by their inability to pay.”

The study is the first to track the problem over a 51-year period in the 50 largest US cities. In the 1970s, according to census data, 3.5 million US households lacked running water. By 2021 the overall number had fallen, but 0.5 million households, or 1.1 million people, still lacked household access to running water; that is equivalent to one out of every 245 households living without running water. The team believes this is likely to be an underestimate of the true number because of limitations in US census data.

Other key findings from the study include:

  • From 1990, plumbing poverty shifted from being a mainly rural issue to an urban one, and the latest figures show 71 percent of those in plumbing poverty now live in cities.
  • In 2021, the New York City metro area led the nation in the number of people living in plumbing poverty – a staggering 56,900 people – followed by Los Angeles (45,900 people) and San Francisco (24,400 people).
  • People of color represent the majority of individuals without access to running water in 12 of the 15 largest US cities, including Los Angeles (82%), Miami (79%), San Francisco (74%), and Houston (71%) in 2021.

Dr Jason R. Jurjevich, Assistant Professor in the School of Geography, Development and Environment at the University of Arizona who was part of the research team, said: “Our results underscore that the success in reducing plumbing poverty in select US cities over the past twenty years has been uneven, with households of color often left behind. In Philadelphia, for example, people of color comprised 40% of the total population but represented 66% of people without access to running water in 2021.”

The authors said not enough attention is being given to how the housing crisis is shaping people’s access to running water. They recommend reforms and improvements to the US Census Bureau’s capacity to collect nationwide data about household water access and the extent of water service shut-offs, in order to monitor and meet the UN Sustainable Development Goals. They also said local water utilities and water boards must revisit and overhaul low-income assistance programs in light of rising cost-of-living and housing expenses, which are ‘squeezing’ people’s ability to pay for water services.

Ends