Tuesday, January 07, 2025

 

Cutting-edge simulations unveil clues to human evolution



University of Liverpool




An international team of scientists led by the University of Liverpool has taken a fresh look at the running capabilities of Australopithecus afarensis, the early human ancestor famously represented by the fossil ‘Lucy’.

Karl Bates, Professor of Musculoskeletal Biology, convened experts from institutions across the UK and the Netherlands. Together they used cutting-edge computer simulations to uncover how this ancient species ran, using a digital model of ‘Lucy’s’ skeleton.

Previous work on the fossilized footprints of Australopithecus by multiple research teams has suggested that Lucy probably walked relatively upright and much more like a human than a chimpanzee. These new findings demonstrate that Lucy’s overall body shape limited running speed relative to modern humans and therefore support the hypothesis that the human body evolved to improve running performance, with top speed being a more critical driver than previously thought.

Professor Bates said: “When Lucy was discovered 50 years ago, it was by far the most complete skeleton of an early human ancestor. Lucy is a fascinating fossil because it captures what you might call an intermediate stage in Homo sapiens’ evolution. Lucy bridges the gap between our more tree-dwelling ancestors and modern humans, who walk and run efficiently on two legs.

“By simulating running performance in Australopithecus and modern humans with computer models, we’ve been able to address questions about the evolution of running in our ancestors.

“For decades scientists have debated whether more economical walking ability or improved running performance was the primary factor that drove the evolution of many of our distinctly human characteristics, such as longer legs and shorter arms, stronger leg bones and our arched feet. By illustrating how Australopithecus walked and ran, we have started to answer these questions.”

The team used computer-based movement simulations to model the biomechanics and energetics of running in Australopithecus afarensis, alongside a model of a human. In both the Australopithecus and human models, the team ran multiple simulations in which various features thought to be important to modern human running, such as larger leg muscles and a long Achilles tendon, were added and removed, thereby digitally replaying evolutionary events to see how they affect running speed and energy use.

Muscles and other soft tissues are not preserved in fossils, so palaeontologists don’t know how large ‘Lucy’s’ leg muscles and other important parameters were. However, these new digital models varied the muscle properties from chimpanzee-like to human-like, producing a range of estimates for running speed and economy.
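
To illustrate the approach, here is a minimal, hypothetical sketch of that kind of parameter sweep: muscle properties are interpolated from chimpanzee-like to human-like and a speed estimate is collected at each step. The function and its lower bound are illustrative placeholders, not the authors' musculoskeletal model; only the 11 mph upper figure comes from the study.

def predict_top_speed(muscle_humanness):
    """Placeholder stand-in for a full musculoskeletal simulation:
    map a 0-1 'humanness' of muscle properties to a top-speed estimate
    in mph. The chimp-like lower bound is an assumed value."""
    chimp_like, human_like = 7.0, 11.0  # 11 mph matches the study's most human-like model
    return chimp_like + (human_like - chimp_like) * muscle_humanness

# Sweep the chimp-to-human range to produce a band of speed estimates.
for step in range(0, 11, 2):
    humanness = step / 10
    print(f"muscle humanness {humanness:.1f}: ~{predict_top_speed(humanness):.1f} mph")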

The simulations reveal that while Lucy was capable of running upright on both legs, her maximum speeds were significantly slower than those of modern humans. In fact, even the fastest speed the team predicted for Lucy (in a model with very human-like muscles) remained relatively modest at just 11mph (18kph). This is much slower than elite human sprinters, who reach peak speeds of more than 20mph (32kph). The models also show that the range of intermediate (‘jogging’) speeds that animals use to run longer distances (‘endurance running’) was very restricted, perhaps suggesting that Australopithecus didn’t engage in the kind of long-distance hunting activities thought to be important to the earliest humans.

Professor Bates continued: “Our results highlight the importance of muscle anatomy and body proportions in the development of running ability. Skeletal strength doesn’t seem to have been a limiting factor, but evolutionary changes to muscles and tendons played a major role in enhancing running speed and economy.

“As the 50th anniversary of Lucy’s discovery is celebrated, this study not only sheds new light on her capabilities but also underscores how far modern science has come in unravelling the story of human evolution.”

The study, ‘Running performance in Australopithecus afarensis’, was published in Current Biology (DOI: 10.1016/j.cub.2024.11.025).

 

Desert nectar: Agave genome study sheds light on drought tolerance




Peer-Reviewed Publication

Nanjing Agricultural University The Academy of Science

Image: Plant morphology, genome features, and synteny information. (Credit: Horticulture Research)




A recent study has illuminated the intricate genetic mechanisms behind crassulacean acid metabolism (CAM) photosynthesis in Agave hybrid NO.11648. This research is a landmark in understanding how plants adapt to extreme water scarcity, offering fresh insights into the genomic blueprint of CAM—a photosynthetic pathway critical for plant survival in arid climates. The findings could revolutionize agricultural practices in drought-prone regions, providing a scientific foundation for developing more resilient crops.

Crassulacean acid metabolism (CAM) photosynthesis, a unique metabolic strategy, enables plants to conserve water by capturing carbon dioxide during the night. This adaptation is a lifeline for species in arid environments, yet its genetic underpinnings remain enigmatic. Despite its ecological importance, gaps in understanding the molecular controls of CAM pose challenges to designing water-efficient crops for a warming world. Exploring the genomes of CAM plants, particularly the drought-resilient Agave genus, is essential to unlocking the genetic secrets of this extraordinary adaptation.

On December 19, 2023, researchers from the Zhanjiang Key Laboratory of Tropical Crop Genetic Improvement achieved a major milestone in CAM research. Published (DOI: 10.1093/hr/uhad269) in Horticulture Research, their study provides a chromosome-level genome assembly of Agave hybrid NO.11648, uncovering key genes and regulatory pathways that govern CAM photosynthesis.

The researchers employed cutting-edge techniques, including high-throughput chromosome conformation capture and next-generation sequencing, to achieve a highly detailed assembly of the Agave genome. The resulting genome spans 4.87 Gb, organized into 30 pseudo-chromosomes with an N50 of 186.42 Mb. This comprehensive analysis revealed a genome abundant in repetitive sequences, particularly I-type repeats, and identified 58,841 protein-coding genes. Among the findings was a lineage-specific whole-genome duplication event post-dating the divergence from the Asparagoideae subfamily. The study also highlighted a duplication within the phosphoenolpyruvate carboxylase kinase (PEPCK) gene family, identifying three PEPCK genes—PEPCK3, PEPCK5, and PEPCK12—as central to CAM regulation. Furthermore, the researchers identified transcription factors linked to circadian rhythms, MAPK signaling, and hormone signal transduction pathways that modulate PEPCK3 expression, shedding light on the complexity of CAM's genetic control.
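
For context, the N50 quoted above is a standard assembly-contiguity statistic: the length L such that sequences of length L or longer account for at least half of the total assembly. A minimal Python sketch of the computation (the lengths below are illustrative, not the Agave data):

def n50(lengths):
    """Return the length L such that sequences of length >= L
    cover at least half of the total assembly length."""
    total = sum(lengths)
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        if running >= total / 2:
            return length

# Toy scaffold lengths in Mb (illustrative only):
print(n50([190, 180, 170, 150, 120, 90, 60, 30]))  # -> 170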

Dr. Wenzhao Zhou, the corresponding author and a renowned authority in tropical crop genetics, emphasized the significance of this discovery: "Our chromosome-level genome assembly of Agave hybrid NO.11648 represents a monumental step in plant science. By decoding the genetic architecture of CAM photosynthesis, we not only enhance our understanding of plant resilience but also provide invaluable genomic resources for breeding crops that thrive under challenging environmental conditions. This work lays a solid foundation for sustainable agriculture in the face of climate change."

The implications of this research extend far beyond Agave. Understanding CAM photosynthesis at a genomic level opens the door to developing drought-resistant crops capable of optimizing water use. These insights could transform agricultural practices, enabling crops to thrive in water-scarce regions and contributing to global food security. As the world grapples with climate change and diminishing water resources, this study serves as a beacon for innovation in plant genomics and sustainable farming.

###

References

DOI: 10.1093/hr/uhad269

Original Source URL: https://doi.org/10.1093/hr/uhad269

Funding information

This study was sponsored by the Earmarked fund for the China Agriculture Research System (grant No. CARS-19), the National Natural Science Foundation of China (grant No. 31801679), Guangdong Provincial Team of Technical System Innovation for Sugarcane Sisal Hemp Industry (grant No. 2023KJ104-03), Guangdong Basic and Applied Basic Research Foundation (grant Nos 2021A1515012421 and 2022A1515011841), Hainan Provincial Natural Science Foundation of China (321QN300 and 323MS099), and Central Public-interest Scientific Institution Basal Research Fund for Chinese Academy of Tropical Agricultural Sciences (grant Nos. 1630062019016, 1630062020015, 1630062022002, and 1630062021015).

About Horticulture Research

Horticulture Research is an open access journal of Nanjing Agricultural University and ranked number one in the Horticulture category of the Journal Citation Reports™ from Clarivate (2022). The journal is committed to publishing original research articles, reviews, perspectives, comments, correspondence articles and letters to the editor related to all major horticultural plants and disciplines, including biotechnology, breeding, cellular and molecular biology, evolution, genetics, inter-species interactions, physiology, and the origination and domestication of crops.

Reducing irrigation for livestock feed crops is needed to save Great Salt Lake, study argues


 News Release 

Oregon State University

Image: Great Salt Lake, 2024. (Credit: Photo by Brian Richter, president, Sustainable Waters)




CORVALLIS, Ore. – The Great Salt Lake has lost more than 15 billion cubic yards of water over the past three decades and is getting shallower at a rate of 4 inches a year; an analysis of its water budget suggests that reducing irrigation is necessary to save it.

The study, published today in Environmental Challenges, shows that 62% of the river water bound for the lake is diverted for human uses, with agricultural activities responsible for nearly three-quarters of that total.

“The research highlights the alarming role of water consumption for feeding livestock in driving the lake’s rapid depletion,” said co-author William Ripple, distinguished professor of ecology at Oregon State University, who notes that 80% of agricultural water use is for irrigating alfalfa and hay crops.
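
Chaining those percentages gives a rough sense of how much lake-bound river water ends up irrigating livestock feed. The back-of-envelope Python below is an illustration derived from the quoted figures, not a number stated in the study.

diverted = 0.62     # share of lake-bound river water diverted for human uses
agriculture = 0.75  # "nearly three-quarters" of diversions (approximation)
feed_crops = 0.80   # share of agricultural water used for alfalfa and hay

share_for_feed = diverted * agriculture * feed_crops
print(f"~{share_for_feed:.0%} of lake-bound river water")  # -> ~37%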

To stabilize the lake and begin refilling it, the authors propose cutting human water consumption in the Great Salt Lake watershed by 35%, including a reduction in irrigated alfalfa production, a fallowing of much of the region’s irrigated grass hay fields and taxpayer-funded compensation for farmers and ranchers who lose income.

“The lake is of tremendous ecological, economic, cultural and spiritual significance in the region and beyond,” said Ripple, a member of OSU’s College of Forestry. “All of those values are in severe jeopardy because of the lake’s dramatic depletion over the last few decades.”

The authors used data from the Utah Division of Water Resources to build a detailed water budget for the Great Salt Lake basin for the years 1989 through 2022. On average, inputs to the lake – river inflows and precipitation – during the study period lagged behind consumption and evaporation at the rate of 500 million cubic yards per year.
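
The budget itself is a simple mass balance: river inflows plus precipitation, minus consumption and evaporation. In the sketch below, only the net deficit of 500 million cubic yards per year comes from the study; the individual components are hypothetical placeholders chosen to net out to that figure.

# Units: million cubic yards per year. Component values are hypothetical;
# only the net deficit (-500) is reported in the study.
river_inflow = 2600
precipitation = 1400
consumption = 2000
evaporation = 2500

net = (river_inflow + precipitation) - (consumption + evaporation)
print(f"net change: {net} million cubic yards/year")  # -> -500, a deficit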

The water budget has been in a deficit situation for much of the past 100 years and the numbers have worsened with climate change and drought, the authors say.

“Abnormally large snowmelt inflow during the 1980s and 1990s served to temporarily obscure the long-term decline in lake levels, and the lake actually reached its highest level in more than a century in 1987,” Ripple said. “But it has been dropping by roughly 4 inches per year on average since then.”

The Great Salt Lake, which has no outlet, is the largest saline lake in the Western Hemisphere and the eighth largest in the world. Its 21,000-square-mile drainage basin includes the Wasatch Mountains, whose snowfall accounts for much of the basin’s water replenishment.

A biodiversity hotspot, the lake sustains more than 10 million migratory birds and 350 bird species. Declining lake levels threaten critical habitats and could disrupt food webs, Ripple said.

The lake directly supports 9,000 jobs and annually fuels $2.5 billion in economic activity in the form of recreation, mining and brine shrimp harvesting, the paper points out. It’s the world’s largest supplier of brine shrimp eggs, a food source that underpins global aquaculture, but as the lake shrinks and salinity increases, the shrimp become physiologically stressed and don’t produce as well.

Also as the lake gets smaller, human health risk grows in the form of wind-carried dust from the exposed saline lakebed, or playa. Five percent of the Great Salt Lake playa is fine particulate matter that can enter the lungs and cause a range of pulmonary problems, and particularly troublesome, the scientists say, is the presence of toxic heavy metals, residues of the region’s history of mining, smelting and oil refining.

Depending on which conservation measures are deployed – including crop shifting, reducing municipal and industrial use, and leasing water rights from irrigators – the authors propose that farmers and ranchers who lose income from using less water could be compensated at a cost ranging from $29 to $124 per Utah resident per year. The state’s population is 3.4 million.
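
Multiplying the quoted per-resident costs by Utah's population of 3.4 million implies the statewide totals computed below; this is a derived illustration, not a figure given in the release.

population = 3.4e6   # Utah residents, per the release
low, high = 29, 124  # dollars per resident per year

print(f"${low * population / 1e6:.0f}M to ${high * population / 1e6:.0f}M per year")
# -> $99M to $422M per year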

“Revenues from growing both irrigated alfalfa and grass hay cattle feed in the Great Salt Lake basin account for less than 0.1% of Utah’s gross domestic product,” Ripple said. “But our potential solutions would mean lifestyle changes for as many as 20,000 farmers and ranchers in the basin.”

In that regard, he adds, the Great Salt Lake area exemplifies the socio-cultural changes facing many river basin communities in the West and around the world, where climate change is sending many water budgets into deficit status.

“The economic and cultural adjustments required are significant but not insurmountable,” said Ripple. “With the right policies and public support, we can secure a sustainable future for the Great Salt Lake and set a precedent for addressing water scarcity globally.”

Collaborating with Ripple on the paper was an interdisciplinary team of scientists from Northern Arizona University, Utah State University and Virginia Tech; the Utah Agricultural Experiment Station; and Sustainable Waters, a New Mexico-based nonprofit focusing on global water education.

The National Science Foundation and the Utah Agricultural Experiment Station provided funding.



 

Method can detect harmful salts forming in nuclear waste melters


Washington State University
Image: Optical microscopy of salt growth on a glass surface. Ribs at the top left of the image are a very thin film of salt forming on the surface. (Credit: WSU)




PULLMAN, Wash. –  A new way to identify salts in nuclear waste melters could help improve clean-up technology, including at the Hanford Site, one of the largest, most complex nuclear waste clean-up sites in the world.

Reporting in the journal Measurement, Washington State University researchers used two detectors to find thin layers of sulfate, chloride and fluoride salts during vitrification, a nuclear waste storage process that involves converting the waste into glass. The formation of salts can be problematic for waste processing and storage.

“We were able to demonstrate a technique to see when the salts are forming,” said John Bussey, a WSU undergraduate who is one of the paper’s lead authors. “By doing that, the melters could be monitored to know if we should change what is being put in the melt.”

Vitrification entails putting the nuclear waste into large melters that are then heated to high temperatures. The resulting glass is then poured into cylinders and solidified for long-term safe storage.

The U.S. Department of Energy is building a vitrification plant at the Hanford Site. Because Hanford was used to make plutonium for the very first nuclear bomb, the waste there is particularly complex, containing nearly all of the elements of the periodic table, said Bussey. A total of 55 million gallons of chemical and nuclear waste are stored in 177 tanks at the site.

In the processing of the nuclear waste, salts can form. They can be corrosive and ruin very expensive vitrification equipment. Furthermore, since they dissolve in water, salts in the final glass waste form could lead to leaks and contamination if the waste form becomes exposed to water during storage. The wide variety of waste components at Hanford makes salt formation more likely.

“Salt formation is very undesirable during the vitrification process,” said Bussey.

With a system developed at Pacific Northwest National Laboratory and the Massachusetts Institute of Technology, the researchers used optical and electrical components to examine light at wavelengths between the infrared and microwave bands that is naturally emitted during the melting process. They looked at samples of glass melts similar to those found at the Hanford site. Using two types of detectors, they were able to study the thermal emissions of the samples as well as how those emissions change over time.

“The brightness is really interesting for identifying all of the melting, solidification and salt formation,” said Ian Wells, co-lead author and a graduate student in the WSU School of Mechanical and Materials Engineering. “What is really unique about this is you don’t have to add any additional lighting or additional systems. Purely based on the heat that is coming off the melt, you are able to look at the brightness of one-pixel images, and you can tell what’s happening.”

The researchers were able to see when there is a large change in the melt: whether a salt is forming or the material is melting or solidifying, there is a sharp change in intensity. The researchers compared different melts and were able to identify behavior indicative of salts.
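
A minimal sketch of the kind of change detection a one-pixel brightness trace enables: flag time steps where the intensity jumps sharply relative to its typical step-to-step variation. This is a generic change-point heuristic for illustration, not the published analysis pipeline.

def sharp_changes(intensity, threshold=3.0):
    """Return indices where the step-to-step change in a brightness
    trace exceeds `threshold` times the median absolute step."""
    steps = [abs(b - a) for a, b in zip(intensity, intensity[1:])]
    typical = sorted(steps)[len(steps) // 2] or 1e-9  # median step size
    return [i + 1 for i, s in enumerate(steps) if s > threshold * typical]

# Toy trace: a steady melt, then a jump as a salt layer forms (made-up data).
trace = [1.00, 1.01, 0.99, 1.02, 1.00, 1.45, 1.46, 1.44]
print(sharp_changes(trace))  # -> [5]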

“We can clearly identify what is happening based on that behavior,” said Wells. “We were surprised by how sensitive a probe it was even with very small amounts of salt.”

The system can discriminate between salt types. The sensors can also sense the salts remotely, without having to be dipped in the radioactive molten glass, thus avoiding additional challenges.

“This work takes this monitoring technology a good step of the way closer to being able to be used inside the vitrification plant,” said Bussey. “This piece of equipment without too much modification could be put straight into the vitrification plant.”

The researchers think the work has other potential applications in molten salt nuclear reactors or in different types of manufacturing processes, such as glass, epoxies or carbon fiber processing, in which manufacturers want to better understand phase changes and the formation of different compounds during those phases. They hope to next move from lab-scale testing to larger scale melt tests. This research was funded through the United States Department of Energy Office of Environmental Management.

 

Major EU project to investigate societal benefits and risks of AI




UCD Research & Innovation
Image: Dr Elizabeth Farries, Director of UCD Centre for Digital Policy. (Credit: Jason Clarke Photography)




A new €3 million EU research project led by University College Dublin (UCD) Centre for Digital Policy will explore the benefits and risks of Artificial Intelligence (AI) from a societal perspective in order to enhance AI capabilities and EU regulatory frameworks. 

Commencing in February 2025, the FORSEE (Forging Successful AI Applications for European Economy and Society) project is funded through the “Beyond the horizon: A human-friendly deployment of artificial intelligence and related technologies” funding call under the Horizon Europe programme.

The project aims to broaden the concept of AI "success" beyond technological and economic efficiency and to provide insights that will structurally enhance capacities to anticipate, evaluate and manage the long-term opportunities and challenges associated with AI.

Led by Dr Elizabeth Farries, Director of UCD Centre for Digital Policy, the consortium includes eight partners from universities, research institutions and think tanks across six European countries. Dr Farries said, "FORSEE seeks to improve our understanding of what “successful AI” actually means in order to enhance regulatory perspectives and approaches. Focusing on sustainability, labour and economic efficiency, gender and engagement with civil society, our research group will offer broadened awareness of the risks and opportunities of AI, based on our grounded research.”

Professor Niamh Moore-Cherry, Principal of UCD College of Social Sciences and Law said, “I am delighted that this exciting consortium project led by Dr Elizabeth Farries has been funded. As the development of AI technology accelerates, it is crucial that we gain a better understanding of its economic, societal and ethical implications as well as its technological success. The FORSEE project, bringing international experts together to develop a critical building block for AI policy and regulatory frameworks in Europe, is part of a growing portfolio of research across a range of disciplines in our College focused on AI and data science and we are delighted to be hosting it.”

Engaging with institutions, civil society organisations and the broader public, the FORSEE team will discern the current criteria of AI success to highlight potential tensions between existing AI applications and EU priorities, and evaluate impacts on economy and society. The project will also examine the conditions that underpin or restrain success for small and medium enterprises within the EU, in order to equip stakeholders and policymakers with the tools to address future risks and opportunities.

Co-PI Prof Eugenia Siapera, Professor of Digital Technology, Policy and Society and Co-Director of UCD Centre for Digital Policy, said, "In a context of rapid technological developments and regulatory responses, FORSEE aims to develop a strong empirical basis for fair, equitable and sustainable AI governance in dialogue with institutional bodies and societal stakeholders.” 

Dr Alexandros Minotakis, a post-doctoral researcher at UCD Centre for Digital Policy and member of the project team, added, “This project will result in concrete recommendations on European policy, complementing the existing regulatory framework through its interventions.”

The consortium brings together a broad range of partners encompassing interdisciplinary expertise across legal and policy analysis, political economy, computational social science, information and communication, media and platforms studies, collaborating with academics from computer science. Participating UCD researchers include Prof Aphra Kerr, Dr Arjumand Younus, Dr James Steinhoff and Dr Pat Brodie.

Consortium partners include UCD (Ireland), Trinity College Dublin (Ireland), Tilburg University (The Netherlands), University Paul Sabatier Toulouse III (France), the WZB Berlin Social Science Centre (Germany), Demos Helsinki (Finland), TASC Europe Studies CLG (Ireland) and the European Digital SME Alliance (Belgium). 


Implanting false memories much harder than claimed in court



University College London




False memories are much harder to implant than previously claimed by memory researchers and expert witnesses in criminal trials, finds a new study led by researchers at UCL and Royal Holloway, University of London.

The 1995 Lost in the Mall study has often been cited in criminal trials, particularly those involving historical sexual abuse – including by Harvey Weinstein’s defence team – in order to cast doubt on the memory of accusers.

This famous study suggested that it is easy to implant false memories for a fake event that never happened – after 25% of the 24 participants wrongly recalled being lost in a supermarket at the age of five.

In 2023 the Lost in the Mall study was repeated by psychologists at University College Cork and University College Dublin, using the same methods. They used a larger sample of 123 people and claimed to find more false memories (35%) than the original study.

However, the new analysis of the 2023 data, published in Applied Cognitive Psychology, has raised serious questions about these findings. The article shows that none of the 35% judged to have a false memory in 2023 reported an entirely false memory and many did not even recall being lost.

According to the new analysis, half of those judged to have false memories had actually been lost before and were likely to be reporting on real events (albeit at a different time/place). Meanwhile others were so unsure about the suggested details in the fake story that their testimony would have been of little value in court.

Emeritus Professor Chris Brewin (UCL Psychology & Language Sciences) said: “The findings underscore the dangers of applying laboratory research findings to the real world of witnesses in court. People in these studies are cautious in what they claim to remember and seem to be much less likely than the investigators to agree they had a false memory. Experts need to be very careful in how they present research findings so as not to mislead the justice system.”

As part of their analysis, the researchers focused on six core details of the fake event, including: being lost; crying; being helped by an elderly woman; being reunited with their family; the location of the event; the time of the event.

They found that participants who were deemed to have a false memory on average recalled one and a half details with any confidence, and 30% recalled none at all.

This was consistent with previous reports that investigators’ false memory judgements were often not backed up by the views of participants themselves.

Lead author Emeritus Professor Bernice Andrews (Royal Holloway Department of Psychology) added: “This is the first time that the raw data from a false memory implantation study have been made publicly available and subjected to independent scrutiny.”

 

Weaponized disinformation poses serious threat to urban transit systems



AI model reveals critical vulnerabilities in mass transit networks



Stevens Institute of Technology





Hoboken, N.J., January 7, 2025 — As every straphanger knows, public transit systems are fantastic—except when they aren’t. A single station closure can ruin a commuter’s morning, and even minor delays or disruptions can snowball into major problems as thousands of passengers rush to find alternate routes, straining other parts of the transit network.  

In a study published this month in Reliability Engineering and System Safety, researchers from Stevens Institute of Technology and the University of Oklahoma warn that the complexity and fragility of urban transit networks could be exploited by terrorists or other bad actors. Using AI models, the team shows how low-cost attacks based on the spread of misinformation could significantly disrupt transit networks, causing widespread chaos and economic harm. 

“This study reveals serious vulnerabilities that could be targeted relatively easily to disrupt urban transit systems,” says Dr. Jose Ramirez-Marquez, a professor in Stevens' Department of Systems and Enterprises. “The purposeful spread of misinformation—such as bomb threats or other bogus reports—has the potential to cause serious problems for the world’s public transport networks.” 

To test transit systems’ resilience to disinformation, the team used AI models to study the Port Authority Trans-Hudson (PATH) system in New Jersey and New York, which is relied on by more than 200,000 daily riders. Based on social media alerts published by PATH, the team used AI and natural language processing (NLP) tools to categorize disruption scenarios, from service delays to security incidents, many of which could be triggered or exacerbated by the spread of misinformation.

Next, the team used computer modeling and real-world ridership data to test how the PATH network would respond to disinformation regarding various disruption scenarios. That meant quantifying the direct impact of any given disruption—such as a train being held on a platform, or a station being temporarily closed—but also modeling passengers’ likely responses, and the ways in which delays would subsequently cascade through the network as a whole.
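
A stylized Python illustration of the direct-impact accounting described here: given a closure duration and the riders arriving during it, tally the rerouting delays and alternate-transport costs. All numbers are hypothetical; the study coupled this kind of tally to a full network model with cascading passenger reassignment.

closure_minutes = 30
arrivals_per_minute = 40         # riders reaching the closed station
reroute_delay_minutes = 12       # extra travel time per rerouted rider
alternate_cost_per_rider = 8.50  # taxi or bus fare, in dollars

riders = closure_minutes * arrivals_per_minute
total_delay = riders * reroute_delay_minutes
total_cost = riders * alternate_cost_per_rider
print(f"{riders} riders, {total_delay:,} delay-minutes, ${total_cost:,.0f}")
# -> 1200 riders, 14,400 delay-minutes, $10,200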

“A disruption might be caused by a single false report of an unattended suitcase or a rowdy passenger on a train—but we’ve shown how that single bit of disinformation can ripple out to degrade performance across the PATH network,” Dr. Ramirez-Marquez explains. 

The results were striking: a single brief station closure was found to cause cumulative delays of up to 16,441 minutes for impacted passengers. It also imposed additional costs averaging up to $18.13 per passenger as riders were forced to pay for taxis, buses, and other alternate transportation. 

Even relatively low levels of disinformation could lead to major PATH hubs such as its Newark or World Trade Center stations experiencing forced closures. Higher levels of disinformation, perhaps timed to coincide with major sporting events for maximum disruption, could lead to stations being closed as much as 11% of the time, triggering delays and economic impacts across the New York and New Jersey region.

Modeling such impacts is valuable because it can help network administrators prepare for and respond to disinformation, Dr. Ramirez-Marquez says. “Something as simple as quickly checking whether a report is true—such as walking down a platform and looking for an abandoned suitcase, or confirming that a security incident is actually happening—can mitigate many of these impacts,” he explains. “But to implement that at scale, you need to have anticipated and planned for the scenarios in question.” 

The Stevens team now hopes to extend its work to model potential disruptions across entire urban areas, and show how social media chatter—including misinformation—can be used to anticipate and respond to disruptions in real time. “Right now, most cities are reactive: they send first responders out when something bad happens,” Dr. Ramirez-Marquez says. “With the right AI tools, cities could spot problems before they happen—reducing their vulnerability to weaponized misinformation, and minimizing disruption for everyone.” 

About Stevens Institute of Technology
Stevens Institute of Technology is a premier, private research university situated in Hoboken, New Jersey. Since our founding in 1870, technological innovation has been the hallmark of Stevens’ education and research. Within the university’s three schools and one college, more than 8,000 undergraduate and graduate students collaborate closely with faculty in an interdisciplinary, student-centric, entrepreneurial environment. Academic and research programs spanning business, computing, engineering, the arts and other disciplines actively advance the frontiers of science and leverage technology to confront our most pressing global challenges. The university continues to be consistently ranked among the nation’s leaders in career services, post-graduation salaries of alumni and return on tuition investment.

 

Researchers identify public policies that work to prevent suicide



Policies around economic security, alcohol use, and safety restrictions have strongest potential for reducing suicide deaths



New York University




An analysis led by New York University researchers determines which public policies effectively prevent suicide deaths in the United States. But it’s not just policies that limit firearms and expand access to health care—many economic and social policies that are not explicitly focused on mental health can also prevent suicide, according to their article published in the Annual Review of Public Health.

“Most of the policies that demonstrate evidence do not mention suicide and were not passed to prevent suicide. They are policies that are intended to address other issues—for instance, increasing minimum wage to promote economic security or reducing alcohol consumption—but they have spillover benefits in that they also prevent suicides,” said Jonathan Purtle, associate professor of public health policy and management at the NYU School of Global Public Health and the study’s lead author.

“This research highlights the importance of considering social determinants in suicide prevention,” said Michael A. Lindsey, Dean and Paulette Goddard Professor of Social Work at the NYU Silver School of Social Work and a study co-author. “An individual’s mental well-being is influenced not only by clinical factors, but also by their environment, circumstances, and experiences.”

Increasing policy activity to address a growing need

Suicide is a leading cause of death in the US, and rates have increased over the past two decades. Public policies, including laws passed by elected officials and regulations adopted by public agencies, play an important role in reducing suicide deaths. While suicide is addressed by some federal policies, most public health policy authority is at the state level. 

In their article in the Annual Review of Public Health, the researchers analyzed the number of state bills passed that mention suicide over the past two decades, as well as the volume of social media posts from state legislators on the topic—an indicator of policy priority. They found a dramatic increase in both, particularly beginning around 2017.

“Our analysis suggests that policymakers recognize that suicide is an issue of public health significance and are trying to address it, and there is bipartisan concern,” said Purtle. 

Policies that work

While many studies have looked at the impact of individual policies on suicide risk, until now, there was not an analysis that collectively examined the research to provide a deeper understanding of what policies are most effective.

To develop this analysis, Purtle, Lindsey, and their colleagues reviewed more than 100 studies and uncovered three categories of policies that research shows have the potential to prevent suicide:

  • Policies that limit access to lethal means (e.g., policies for safe firearm storage and waiting periods to purchase firearms, installing barriers on bridges)
  • Policies that increase access to mental health services (e.g., Medicaid expansion, laws requiring insurance to cover mental health care)
  • Policies that address underlying risk factors for suicide, including those that increase economic security (e.g., minimum wage laws, paid sick leave, unemployment benefits, supplemental nutrition program), prohibit discrimination (e.g., protecting sexual and gender identity in hate crime laws), and limit access to alcohol and tobacco

While policies in all three categories have some potential to reduce suicide deaths, the researchers found that policies to improve economic security, limit access to alcohol, and restrict access to lethal means have the strongest evidence.

“Access to alcohol and lethal means of harm, as well as poverty, are all known risk factors for suicide,” said Lindsey. “Our research suggests that a great starting place for saving lives is to fund and enact public policies that target these three areas.”

In addition, while some of the most effective policies focus on improving well-being over the long term, others—including those related to firearms and restricting other lethal means—aim to make it more difficult to make quick decisions that can have fatal consequences. 

“Suicide is often an impulsive act,” said Purtle. “Anything you can do to delay that impulsivity on average will be beneficial and will prevent suicide from a public health perspective.”

Firearms are the most common and deadly method of suicide, although research on gun violence was long hampered by a federal law blocking funding for this work. However, evidence has begun to build over the past decade about firearm policy and suicide risk, enabling the researchers to include it in their analysis.

The review identified studies that determined that having a firearm in the home dramatically increases the risk of suicide and that policies to limit firearm access can reduce this risk. While not all firearm policies were found to be equally effective, laws requiring a waiting period for gun purchases were moderately effective at preventing suicides. Moreover, laws setting more restrictive age limits for gun purchases and those that require safe gun storage in the home—with consequences for adults who do not safely secure their guns—reduced suicide deaths among young people. 

More data needed

The researchers outlined several areas of research that need attention moving forward, including the new 988 suicide and crisis lifeline. Purtle is leading NIH-funded research on the implementation and impact of policies on the lifeline, with recent studies describing the increase in call volume to the lifeline during its first two years, the experience of users, and how federal and state investments in 988 have enhanced the capacity of these systems.

They also call for more research on technology and youth mental health, including the impact of social media age restrictions, school cell phone bans, and policies that prevent exposure to harmful suicide-related content online. Early efforts to reduce harm online have largely allowed technology companies to self-regulate, but in recent years, there have been growing efforts to implement policies to protect young people from the potential harms of technology.

“The policy landscape has changed so quickly, but it will take time to study these changes, so we don't yet have strong evidence as to what works,” said Purtle.

Amanda Mauri of the NYU School of Global Public Health and Katherine Keyes of the Columbia Mailman School of Public Health were additional authors of the study. The research is supported in part by the National Institute of Mental Health (R01MH131649).