Sunday, August 18, 2024

 

Detecting machine-generated text: An arms race with the advancements of large language models



Penn engineers develop new tool to detect AI-generated text




University of Pennsylvania School of Engineering and Applied Science

Image: “Detecting AI Text Isn’t So Easy.” Conceptual image showing that detectors can identify AI-generated text when it contains no edits or “disguises,” but cannot reliably detect it once the text has been manipulated. (Credit: Chris Callison-Burch and Liam Dugan)




Machine-generated text has been fooling humans since the release of GPT-2 in 2019. Large language model (LLM) tools have grown progressively better at crafting stories, news articles, student essays and more, to the point that humans are often unable to recognize when they are reading text produced by an algorithm. While these LLMs save time and can even boost creativity in ideating and writing, their power can be misused, with harmful outcomes already showing up across the spaces where we consume information. The inability to detect machine-generated text only heightens the potential for harm.

One way both academics and companies are trying to improve this detection is by employing machines themselves. Machine learning models can identify subtle patterns of word choice and grammatical constructions to recognize LLM-generated text in a way that our human intuition cannot. 

Today, many commercial detectors are claiming to be highly successful at detecting machine-generated text, with up to 99% accuracy, but are these claims too good to be true? Chris Callison-Burch, Professor in Computer and Information Science, and Liam Dugan, a doctoral student in Callison-Burch’s group, aimed to find out in their recent paper published at the 62nd Annual Meeting of the Association for Computational Linguistics.

Liam Dugan presents RAID at the 62nd Annual Meeting of the Association for Computational Linguistics in Bangkok.

“As the technology to detect machine-generated text advances, so does the technology used to evade detectors,” says Callison-Burch. “It’s an arms race, and while the goal to develop robust detectors is one we should strive to achieve, there are many limitations and vulnerabilities in detectors that are available now.”   

To investigate those limitations and provide a path forward for developing robust detectors, the research team created the Robust AI Detector (RAID), a data set of more than 10 million documents spanning recipes, news articles, blog posts and more, including both AI-generated and human-written text. RAID serves as the first standardized benchmark for testing the detection ability of current and future detectors. The team also created a leaderboard that publicly and impartially ranks the performance of every detector evaluated on RAID.
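To make the leaderboard comparison concrete, here is a minimal, hypothetical sketch (in Python) of the kind of per-domain scoring a RAID-style benchmark performs. The toy detector, sample records and AUROC breakdown are invented for illustration and are not the paper's actual evaluation code:

    import numpy as np
    from sklearn.metrics import roc_auc_score

    def toy_detector(texts):
        """Stand-in detector: returns a 'machine-generated' score per text.
        A real detector would be a trained classifier."""
        return [min(1.0, len(t) / 100.0) for t in texts]  # placeholder heuristic

    # Hypothetical records: (text, domain, label), where label 1 = AI-generated
    records = [
        ("Whisk the eggs and fold in the flour gently.", "recipes", 0),
        ("Preheat the oven to a temperature of gentle warmth and patience.", "recipes", 1),
        ("The council voted on Tuesday to expand the program.", "news", 0),
        ("Officials confirmed the sweeping measure in a statement of notable length.", "news", 1),
    ]

    scores = np.array(toy_detector([r[0] for r in records]))
    labels = np.array([r[2] for r in records])

    # Slice results by domain, as the RAID leaderboard does across domains,
    # generators and adversarial attacks.
    for domain in sorted({r[1] for r in records}):
        idx = [i for i, r in enumerate(records) if r[1] == domain]
        if len(set(labels[idx])) < 2:
            continue  # AUROC needs both classes present
        print(f"{domain}: AUROC = {roc_auc_score(labels[idx], scores[idx]):.2f}")

A detector that scores well on one slice and poorly on another is exactly the failure mode such a benchmark is designed to expose.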

“The concept of a leaderboard has been key to success in many aspects of machine learning like computer vision,” says Dugan. “The RAID benchmark is the first leaderboard for robust detection of AI-generated text. We hope that our leaderboard will encourage transparency and high-quality research in this quickly evolving field.”

Dugan has already seen the influence this paper is having in companies that develop detectors. 

“Soon after our paper became available as a preprint and after we released the RAID data set, we started seeing the data set being downloaded many times, and we were contacted by Originality.ai, a prominent company that develops detectors for AI-generated text,” he says. “They shared our work in a blog post, ranked their detector in our leaderboard and are using RAID to identify previously hidden vulnerabilities and improve their detection tool. It’s inspiring to see that the community appreciates this work and also strives to raise the bar for AI-detection technology.”

So, do current detectors hold up to the task? RAID shows that few perform as well as they claim.

“Detectors trained on ChatGPT were mostly useless in detecting machine-generated text outputs from other LLMs such as Llama, and vice versa,” says Callison-Burch. “Detectors trained on news stories don’t hold up when reviewing machine-generated recipes or creative writing. What we found is that a myriad of detectors work well only in very specific use cases, when reviewing text similar to the text they were trained on.”


Faulty detectors are not only an issue because they don’t work well; they can be as dangerous as the AI tools used to produce the text in the first place.

“If universities or schools were relying on a narrowly trained detector to catch students’ use of ChatGPT to write assignments, they could be falsely accusing students of cheating when they are not,” says Callison-Burch. “They could also miss students who were cheating by using other LLMs to generate their homework.”   

It’s not just a detector’s training, or lack thereof, that limits its ability to detect machine-generated text. The team looked into how adversarial attacks such as replacing letters with look-alike symbols can easily derail a detector and allow machine-generated text to fly under the radar.

“It turns out, there are a variety of edits a user can make to evade detection by the detectors we evaluated in this study,” says Dugan. “Something as simple as inserting extra spaces, swapping letters for symbols, or using alternative spelling or synonyms for a few words can cause a detector to be rendered useless.”
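As a rough illustration, the snippet below implements toy versions of the evasions Dugan describes; the substitution tables are invented and are not the study's exact attack set:

    import random

    HOMOGLYPHS = {"a": "\u0430", "e": "\u0435", "o": "\u043e"}  # Cyrillic look-alikes
    ALT_SPELLINGS = {"color": "colour", "analyze": "analyse"}

    def insert_extra_spaces(text, rate=0.3, seed=0):
        """Randomly double some of the existing spaces."""
        rng = random.Random(seed)
        return "".join(c + " " if c == " " and rng.random() < rate else c
                       for c in text)

    def swap_homoglyphs(text):
        """Replace selected Latin letters with visually similar Cyrillic ones."""
        return "".join(HOMOGLYPHS.get(c, c) for c in text)

    def alternative_spellings(text):
        """Swap in alternative spellings for a few words."""
        for word, alt in ALT_SPELLINGS.items():
            text = text.replace(word, alt)
        return text

    sample = "We analyze the color of the generated text."
    for attack in (insert_extra_spaces, swap_homoglyphs, alternative_spellings):
        print(attack.__name__, "->", attack(sample))

To a human reader the edited strings look nearly identical to the original, but to a detector operating on exact character sequences they can register as entirely different inputs.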

Swapping certain letters for similar-looking symbols is one type of adversarial attack that derails current detectors.

The study concludes that, while current detectors are not yet robust enough to be of significant use in society, openly evaluating detectors on large, diverse, shared resources is critical to accelerating progress and building trust in detection, and that this transparency will lead to detectors that hold up across a variety of use cases.

“Evaluating robustness is particularly important for detection, and it only increases in importance as the scale of public deployment grows,” says Dugan. “We also need to remember that detection is just one tool for a larger, even more valuable motivation: preventing harm by the mass distribution of AI-generated text.” 

“My work is focused on reducing the harms that LLMs can inadvertently cause, and, at the very least, making people aware of the harms so that they can be better informed when interacting with information,” he continues. “In the realm of information distribution and consumption, it will become increasingly important to understand where and how text is generated, and this paper is just one way I am working towards bridging those gaps in both the scientific and public communities.”

Dugan and Callison-Burch worked with several other researchers on this study, including Penn graduate students Alyssa Hwang, Josh Magnus Ludan, Andrew Zhu and Hainiu Xu, as well as former Penn doctoral student Daphne Ippolito and Filip Trhlik, an undergraduate at University College London. They continue to work on projects that focus on advancing the reliability and safety of AI tools and how society integrates them into daily life.

This study was funded by the Intelligence Advanced Research Projects Activity (IARPA), part of the Office of the Director of National Intelligence, through the Human Interpretable Attribution of Text Using Underlying Structure (HIATUS) program.

 

Solutions to Nigeria’s newborn mortality rate might lie in existing innovations, finds review



Imperial College London





The review, led by Imperial College London’s Professor Hippolite Amadi, argues that Nigeria’s own discoveries and technological advancements of the past three decades have been “abandoned” by policymakers.

The authors argue that too many Nigerian newborns, clinically defined as infants in the first 28 days of life, die of causes that could have been prevented had policymakers adopted recent in-country scientific breakthroughs.  

Led by Professor Amadi of Imperial’s Department of Bioengineering, who received the Nigeria Prize for Science (NPS) in 2023, the researchers say the lack of adoption and scale-up of treatment breakthroughs beyond Nigeria’s major cities might partly explain the country’s consistently high infant mortality rate.

Figures from the World Health Organization show that globally, 6,500 newborns die every day, and that sub-Saharan Africa experiences the highest neonatal mortality rate in the world at 27 deaths per 1,000 live births.  

In addition, a child born in sub-Saharan Africa, a region that includes Nigeria, is 11 times more likely to die in their first month of life than one born in Australia and New Zealand, which is the lowest-mortality region.   

Lead author Professor Amadi, who received the NPS for his work on low-cost newborn care systems in the West African nation, said: “Nigeria has the power to reduce its own infant mortality rate. We already possess the necessary knowledge and technology, cultivated by decades of Nigerian research and innovation.

"We need to put the policies and leadership in place to make these improvements where they are needed most, so we can reduce the soaring numbers of infant deaths in the country.” 

The review is published in Frontiers in Pediatrics.

"Game-changing science"

Nigerian clinicians and researchers have developed several low-cost advances in neonatal care in recent years. These include adaptive care pathways for premature births, an innovative respiratory support mechanism for newborns with low birth weight, and solar-powered intensive phototherapy machines for treating neonatal jaundice.   

However, access to neonatal care exists mainly in major cities, where most hospitals with neonatal care units are located, and is more difficult in rural areas. To address this, the researchers argue that policymakers should scale up and adopt these strategies nationally.  

To carry out the analysis, the researchers examined 4,286 publications for evidence of potential strategies or interventions to reduce infant mortality.   

Nineteen of those publications covered potential strategies or interventions to reduce neonatal mortality. Fourteen of these strategies produced significant results during their trials and subsequent usage in hospitals. However, none of these applications were adopted nationally, which the researchers say has denied newborns proper access to these interventions.  

The researchers say Nigeria is a case study whose lessons could apply to many other low- and middle-income countries (LMICs) facing similarly high mortality and morbidity rates, including across West Africa. Professor Amadi added: “All low- and middle-income countries (LMICs) must look inward to strengthen and use what they already possess.

“The continuing failure of the Nigerian system to protect newborns seems to have become a norm, a huge source of nursing fatigue, and an unwelcome situation. Nigeria, and other LMICs like it, already possess the game-changing science and technology to prevent many of its newborn deaths. It’s now in policymakers’ hands to nationally scale up these innovations and accelerate infant survival.”  

Read the full review: “The case of the neonate vs. LMIC medical academia—a jury-style systematic review of 32 years of literature without significant mortality reduction” by Amadi et al., published 22 July 2024 in Frontiers in Pediatrics.

 

Emergency departments could help reduce youth suicide risk




Ann & Robert H. Lurie Children's Hospital of Chicago





A study of over 15,000 youth treated in Emergency Departments (EDs) for self-inflicted injury found that around 25 percent had another ED visit within 90 days before or 90 days after the injury encounter, pointing to an opportunity for ED-based interventions such as suicide risk screening, safety planning and linkage to services. Nearly half of the ED visits after the self-inflicted injury encounter were for mental health issues.
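The 90-day window logic can be sketched in a few lines of Python on hypothetical visit records; the column names and data below are invented and are not the study's actual dataset:

    import pandas as pd

    visits = pd.DataFrame({
        "patient_id": [1, 1, 2, 3, 3],
        "visit_date": pd.to_datetime(["2024-01-10", "2024-02-15", "2024-03-01",
                                      "2024-04-01", "2024-09-01"]),
        "self_inflicted": [True, False, True, True, False],
    })

    def near_visit(group):
        """Did any other ED visit fall within 90 days of a self-inflicted-injury visit?"""
        index_dates = group.loc[group.self_inflicted, "visit_date"]
        other_dates = group.loc[~group.self_inflicted, "visit_date"]
        return any(abs((d - o).days) <= 90 for d in index_dates for o in other_dates)

    flagged = visits.groupby("patient_id").apply(near_visit)
    print(f"{flagged.mean():.0%} of patients had another ED visit within 90 days")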

“Self-inflicted injury is an important predictor of suicide risk,” said Samaa Kemal, MD, MPH, emergency medicine physician at Ann & Robert H. Lurie Children’s Hospital of Chicago and lead author of the study published in JAMA Network Open. “Our study suggests that Emergency Departments could have a life-saving impact if they not only treat youth in the moment of crisis but also intervene to extend care into the future. It would be critical to screen for suicide risk, talk to families about removing from the home, or locking up, anything that could be lethal to their child, like guns, and connect patients to follow-up care.”

Dr. Kemal and colleagues also found that around 70 percent of children in the study received care in general EDs, as opposed to EDs at children’s hospitals.

“The interventions we propose are brief and could be implemented in any ED, even in hospitals without pediatric mental health resources,” said Dr. Kemal.

Limited access to pediatric mental healthcare most likely drives greater ED utilization among rural and publicly insured youth, which underscores a significant health inequity, added Dr. Kemal.

“In communities without easy access to mental health providers, EDs could refer children to pediatricians for follow-up,” she said. “Most importantly, in the midst of the current youth mental health crisis, the care these children receive in the ED should focus on their future safety.”

Co-authors from Lurie Children’s included Jennifer A. Hoffmann, MD, MS; Kenneth A. Michelson, MD; and Elizabeth R. Alpern, MD, MSCE.

Research at Ann & Robert H. Lurie Children’s Hospital of Chicago is conducted through Stanley Manne Children’s Research Institute, which is focused on improving child health, transforming pediatric medicine and ensuring healthier futures through the relentless pursuit of knowledge. Lurie Children’s is a nonprofit organization committed to providing access to exceptional care for every child. It is ranked as one of the nation’s top children’s hospitals by U.S. News & World Report. Lurie Children’s is the pediatric training ground for Northwestern University Feinberg School of Medicine. Emergency medicine-focused research at Lurie Children’s is conducted through the Grainger Research Program in Pediatric Emergency Medicine.

 

Computer-based model could mitigate cattle fever tick outbreaks


Federal grant supports collaboration of Texas A&M AgriLife, state and federal agencies


Texas A&M AgriLife Communications

by Helen White

Since the early 1900s, eradicating cattle fever ticks has challenged surveillance and quarantine programs designed to protect the U.S. and Texas cattle industry.

Over the decades, scientists and specialists in state and federal regulatory programs overseeing the U.S. Cattle Fever Tick Eradication Program have developed datasets that track a detailed history of detecting and eliminating cattle fever ticks.

Now, a team of Texas A&M AgriLife researchers is assimilating this information into an interactive, computer-based tool that identifies ever-changing risks and helps prevent or mitigate cattle fever tick infestations.

The three-year project, Agricultural Biosecurity: Harnessing Data Fusion to Meet Emerging Challenges to Cattle Fever Tick Eradication in a Changing World, has received a $600,000 grant from the U.S. Department of Agriculture’s National Institute of Food and Agriculture under its Agricultural Biosecurity Program, funded through the Agriculture and Food Research Initiative, the nation’s leading competitive grants program for the agricultural sciences.

Pete Teel, Ph.D., Regents Professor in the Texas A&M Department of Entomology, leads a team of Texas A&M AgriLife researchers developing a computer-based platform to assess the risk of cattle fever tick infestations in Texas. (Sam Craft/Texas A&M AgriLife)

The cattle fever tick team

Texas A&M AgriLife Research project investigators are Pete Teel, Ph.D., Regents Professor, and Taylor Donaldson, Ph.D., assistant research scientist, both in the Department of Entomology; and Rose Wang, Ph.D., senior research scientist, and William Grant, Ph.D., professor, both in the Department of Ecology and Conservation Biology.

“The cattle fever tick issue is a constant challenge for Texas,” Teel said.  “It has a considerable history related to the development, security and sustainability of the cattle industry, and not just in the U.S. because of our international boundary with Mexico.”

Other researchers on the team from Texas A&M are Doug Tolleson, Ph.D., professor, Department of Rangeland, Wildlife and Fisheries and director of the Sonora Research Station; and David Anderson, Ph.D., Texas A&M AgriLife Extension Service economist and professor, Department of Agricultural Economics. Research collaborators from the USDA Agricultural Research Service, ARS, are Kimberly Lohmeyer, Ph.D., director, Knipling-Bushland U.S. Livestock Insects Research Laboratory, Kerrville; Donald Thomas, Ph.D., research scientist, Cattle Fever Tick Research Laboratory, Edinburg; and Kennan Oyen, Ph.D., research scientist, Animal Disease Research Unit, Pullman, Washington.

The advisory group includes representatives from the USDA Animal and Plant Health Inspection Service, APHIS, Veterinary Services, and the Texas Animal Health Commission, the regulatory agencies in charge of the U.S. Cattle Fever Tick Eradication Program.

Cattle tick fever: a long history of challenges

Only two species of cattle fever ticks, Rhipicephalus annulatus and Rhipicephalus microplus, can transmit the pathogens that cause the highly fatal cattle disease, bovine babesiosis, or Texas cattle fever, Teel said.

“There are no drugs or vaccines to protect cattle from this disease, so we rely upon eliminating the vectors to prevent this problem,” Teel said. “The best disease control is to prevent the tick vectors from reestablishing in the U.S. from Mexico, where both the ticks and disease pathogens remain endemic. At risk are U.S. cattle that are immunologically susceptible to infection through the bite of cattle fever ticks.”

Two species of cattle fever ticks transmit the bovine babesiosis pathogen: left, Rhipicephalus annulatus and, right, Rhipicephalus microplus. (Sam Craft/Texas A&M AgriLife)

Teel said these ticks and the pathogens they transmit were once distributed throughout 13 southern states and southern California. In 1906, the U.S. Cattle Fever Tick Eradication Program was developed to eradicate them. By 1943, the USDA declared the ticks were eradicated in the U.S., except for a zone on the Texas-Mexico border. A permanent quarantine zone inside Texas along the Rio Grande was established to intercept infested animals and ticks that might come across from Mexico.

In Texas, USDA-APHIS operates the eradication program within the permanent quarantine zone, collaborating with the Texas Animal Health Commission and other state and federal agencies outside the permanent zone for inspection, quarantine and other eradication efforts. USDA-APHIS estimates the annual economic benefit of the eradication program to the U.S. cattle industry is more than $1 billion.

Harnessing data fusion to assess risk projections

Both tick species and pathogens are still endemic in Mexico. Teel said the problem remains and has become more complicated in Texas because of several challenges. There have been land use and population changes, and increased resistance to acaricides, the pesticides used to control ticks. Also, wildlife hosts such as white-tailed deer and nilgai antelope can spread ticks to a more extensive range because they are not confined within fence lines like cattle.

The research project uses these challenges as scenarios for risk analysis with data fusion, which integrates multiple data sources to produce information relevant to cattle fever tick eradication.

Teel said the research project’s goal is to combine disparate datasets from the U.S. Cattle Fever Tick Eradication Program to create a computer-based platform that better analyzes and identifies factors conducive for the spread of cattle fever ticks.
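A toy sketch of what such data fusion can look like in practice appears below; all tables, columns and weights are hypothetical stand-ins, and the project's actual platform is far more sophisticated:

    import pandas as pd

    # Hypothetical per-area data layers
    weather = pd.DataFrame({"area": ["A", "B"], "avg_humidity": [0.70, 0.45]})
    vegetation = pd.DataFrame({"area": ["A", "B"], "brush_cover": [0.60, 0.20]})
    history = pd.DataFrame({"area": ["A", "B"], "past_outbreaks": [3, 0]})

    # Fuse the layers on a shared key, then combine them into one risk score.
    fused = weather.merge(vegetation, on="area").merge(history, on="area")
    fused["risk"] = (0.4 * fused.avg_humidity
                     + 0.3 * fused.brush_cover
                     + 0.3 * fused.past_outbreaks / max(1, fused.past_outbreaks.max()))
    print(fused.sort_values("risk", ascending=False))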

Integrating datasets with data fusion can improve risk analysis and accelerate response to the spread of cattle fever ticks, helping to protect the U.S. cattle industry. (Michael Miller/Texas A&M AgriLife)

Some of these factors are changes in climate and weather patterns, vegetation, land use and fragmentation, and the risk of evolving strains of cattle fever ticks resistant to acaricides.

Some datasets feed analytical models going back 65 years; others include real-time weather data, GPS mapping and outbreak investigations.

Another project goal is to develop an interactive tool that regulatory agencies’ staff can use in the field on devices such as a tablet, phone or computer to access the new computer platform.

“Texas has developed different technologies and databases that track the history of these infestations and the interactions of how incidents occurred,” Teel said. “There’s a lot to be learned from the relationship of these datasets if they can be evaluated in conjunction with each other. Then we can develop risk assessments to be proactive about stopping tick incursions as quickly as possible.”


 


 

 

 

DOE announces $10 million to support climate resilience centers across America


University-led projects will share data, strengthen and build relationships between DOE and communities bearing the brunt of climate change


DOE/US Department of Energy




WASHINGTON, D.C. – To support vulnerable communities responding to continued and extreme climate effects, the Department of Energy (DOE) today announced $10 million in funding for innovative Climate Resilience Centers (CRCs) in 10 different states. University-led research teams will leverage the world-class modeling, data and research capabilities of DOE national laboratories, customized for their local regions, with a focus on predicting weather hazard risks to better prepare communities. The CRCs are part of the Biden-Harris Administration’s Justice40 Initiative and are designed to ensure that all Americans benefit from scientific research. 

“Every pocket of the country has experienced the impact of extreme weather events that are exacerbated by climate change, and disadvantaged communities often feel the brunt of that impact,” said U.S. Secretary of Energy Jennifer M. Granholm. “The projects announced today will leverage the world-class expertise and scientific research capacities of DOE’s national laboratories to develop the tools communities will need to inform future decisions for building resiliency.”

Each of the CRCs is led by a Minority Serving Institution or an Emerging Research Institution. Most are also collaborations with DOE national labs, other federal agencies, academic institutions, state and municipal agencies, or community organizations.

Projects were selected by a peer-review panel, and selections focused on a diversity of topics, regions, and institutions across the country. These projects also build on prior awards to CRCs made in 2023. 

The CRCs will help form a nucleus for a diverse group of young scientists, engineers, and technicians to further their scientific research and work on scientific teams. The CRCs will also foster capacity at the regional and local level by connecting with affected communities and stakeholders to enable them to translate basic research into actionable science to enhance climate resilience, as well as to identify potential future research opportunities. 

Across the 10 selectees, research projects include predicting coastal flooding and extreme storms and protecting communities from them; analyzing the impacts of drought on Tribal and agricultural communities; and improving water quality. 

Selected Project Descriptions: 

The 10 projects were selected under the DOE Funding Opportunity Announcement for Climate Resilience Centers DE-FOA-0003181. 

  1. The Advancing Development and Climate-Resilient Adaptation Practices via Community-Driven Urban Transformation project in St. Louis, Missouri, establishes a CRC at Saint Louis University with the goal of building regional resilience to heat islands. Heat islands are urban areas that experience extreme high temperatures due to the built environment. Researchers will connect with local communities to define the impacts of climate risk; increase awareness of climate change effects; and empower communities with data to support resilience projects and green infrastructure development. The goal is improved public health, increased economic stability, and greener infrastructure.  
  2. In the Climate Lighthouse project, The City College of New York will lead the effort, partnering with the DOE’s Brookhaven National Laboratory to help residents better cope with extreme heat. The focus will be on translating DOE climate data into usable tools to improve NYC residents’ resilience. Information will be communicated to the public through community partnerships in Manhattan (Harlem) and Brooklyn. The team will work closely with community partners to improve the tools so they can help build actionable knowledge among the public. 
  3. The goal of the CRC in Tribal communities along the Missouri River Basin is to build climate resilience capacity for Native American communities.  The team effort will be led by the South Dakota School of Mines and Technology, Tribal nations, US Geological Survey, and Pacific Northwest National Laboratory (PNNL). The team will develop user-friendly planning tools to translate existing climate projections into site-specific drought and flood risks, mitigation recommendations, associated costs, and uncertainties. Educational workshops organized by the consortium will demonstrate research results.  
  4. Communities in the Texas Coastal Bend along the Gulf of Mexico face multiple water-related threats, including floods and droughts. In the Coastal Bend Climate Resilience Project, a partnership between the University of Texas at Arlington and Texas A&M University-Corpus Christi, the focus is on improving predictions of those events. The Coastal Bend CRC will use data and modeling from the DOE for adapting and planning for climate extremes. A critical part of the program is building short- and long-term capacity in communities to ensure local and community leaders can leverage climate science to inform resilience-building efforts, particularly among vulnerable groups. 
  5. The Midwest Climate Resilience Center in Clark County, Ohio, will address high risk from extreme rain and flooding and the consequential effects on drinking water quality. Central State University will partner with Ohio State University and PNNL to assess the impact of climate stressors on soil system processes in watersheds with varied land uses. The group will develop scale-appropriate targeted climate solutions for local communities and train the next generation of climate scientists from underrepresented student communities.  
  6. The Climate Resilience Center for Alaska brings together researchers from the University of Alaska Fairbanks with Los Alamos National Laboratory. As Alaska experiences transformational change due to climate warming, this funding will enhance communication with Alaska communities about existing DOE science, develop meaningful collaborations between communities and the DOE, and incorporate DOE science into educational pathways and opportunities. This project will also conduct pilot research specifically focused on southwest Alaska to demonstrate the Center’s role.  Much of the research will involve graduate students supported by the Center, acting as a conduit to recruit the next generation of climate investigators, with an emphasis on rural, traditionally underserved communities.  
  7. Space Coast RESCUE (Resilience Solutions for Climate, Urbanization, and Environment) is an effort by researchers at the Florida Institute of Technology collaborating with the DOE’s Argonne National Laboratory.  The Florida Space Coast, which borders the Atlantic Ocean, faces climate resilience challenges and risks that are multifaceted and representative of the problems faced by many coastal communities throughout the nation. Hazards include heat stress, extreme precipitation, tropical cyclone-induced high winds and surge, flooding, and coastal erosion, as well as inland flooding due to extreme precipitation and stormwater runoff that negatively affects water quality and human health. 
  8. Planning for extreme weather is a cornerstone of the Building Predictive Capacity to Enhance Stormwater Infrastructure and Flood Resilience project led by Central Michigan University. The project aims to produce data and tools that will help communities plan for and become more resilient to climate change in collaboration with communities in three pilot watersheds in Michigan: the Chippewa River, Lower Grand River, and Rouge River. The project will use downscaling of climate model projections to develop local precipitation models that simulate future risks of floods and torrential rains (a minimal sketch of one common downscaling approach appears after this list).  
  9. Powering just and resilient cities is the goal of the Gateway Cities Climate Resilience Center led by the University of Massachusetts Lowell in partnership with PNNL. The objective is to work with the community to use DOE science and tools to provide local projections of extreme temperature events. The project will assess vulnerability of residential heating and cooling power demand and potential mitigation measures in terms of urban tree cover and green spaces. Graduate students will be trained in community-engaged climate and energy-modeling research. 
  10. This project, led by Lehigh University in partnership with PNNL, will examine the impact of regional climate action plans on the response to extreme water events like floods and droughts in Eastern Pennsylvania. The team plans to work closely with the three-city coalition of Allentown, Bethlehem and Easton to address multiple climate change impacts. Community workshops will be hosted with Community Action Lehigh Valley. 
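As referenced in the Central Michigan project above, downscaling turns coarse climate model output into locally usable projections. One common statistical approach is quantile mapping, sketched below in Python on synthetic data; the method choice and all numbers are illustrative assumptions, not details from the funded project:

    import numpy as np

    rng = np.random.default_rng(0)
    observed = rng.gamma(2.0, 3.0, size=1000)  # local station record (mm/day)
    modeled = rng.gamma(2.0, 4.0, size=1000)   # coarse model output, biased wet
    future = rng.gamma(2.0, 4.5, size=1000)    # model projection to correct

    def quantile_map(values, model_ref, obs_ref):
        """Map each value through the model's empirical CDF,
        then invert through the observed CDF."""
        ranks = np.searchsorted(np.sort(model_ref), values) / len(model_ref)
        return np.quantile(obs_ref, np.clip(ranks, 0.0, 1.0))

    corrected = quantile_map(future, modeled, observed)
    print(f"raw future mean:     {future.mean():.2f} mm/day")
    print(f"bias-corrected mean: {corrected.mean():.2f} mm/day")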

Total funding for the projects is $10 million in Fiscal Year 2024 dollars, with each project lasting three years. A list of all Biological and Environmental Research (BER) projects and funding, including the Climate Resilience Centers, can be found at science.osti.gov/ber. 

Selection for award negotiations is not a commitment by DOE to issue an award or provide funding. Before funding is issued, DOE and the applicants will undergo a negotiation process, and DOE may cancel negotiations and rescind the selection for any reason during that time. 

 

Research provides a roadmap for improving electrochemical performance



New findings expand understanding of how electrons move in complex fluids in batteries and similar devices



University of Delaware





Thomas Edison went through thousands of materials before he finally found the right tungsten filament to create a working lightbulb. This type of trial-and-error research continues today and is responsible for countless inventions that improve our world. Battery systems that help power our lives in many seen (and unseen) ways are one example.

However, improving these materials and devices requires more than experimentation. Modern engineers must also form a deeper understanding of the general principles that govern material performance, from which they can design better materials to achieve challenging product requirements. 

In a paper published Aug. 13 in the Proceedings of the National Academy of Sciences (PNAS), researchers from the University of Delaware, Northwestern University and industry report an expanded understanding of how electrons move through the conductive parts of complex fluids called slurries, which are found in batteries and other electrochemical energy storage devices.

It’s important work that can help overcome existing knowledge gaps about how electrons hop between conductive particles found in these materials, as engineers seek new ways to improve that activity. 

The paper is the result of collaborative research between UD’s Norman Wagner, Unidel Robert L. Pigford Chair in Chemical and Biomolecular Engineering, and researchers led by Jeffrey Richards, assistant professor of chemical and biological engineering at Northwestern University, and a former UD postdoctoral researcher. Lead authors on the paper include UD alumna Julie Hipp, who earned her doctoral degree in chemical and biomolecular engineering in 2020 and now is a senior scientist at Procter and Gamble, and Paolo Ramos, a former NU graduate student now at L’Oreal. NU doctoral candidate Qingsong Liu also contributed to this work.

According to Wagner, by combining carefully designed and conducted experiments with state-of-the-art theory and simulation, the research team found that enhancing performance requires more than formulation chemistry. It also requires understanding how the electrical conductivity behaves as the slurry materials are processed and manufactured.

“To control the device performance, it's not enough just to control the chemistry, we have to control the microstructure, too,” said Wagner. This is because the material’s final microstructure — meaning how all the components come together — regulates how the electrons can move, impacting the device’s power and efficiency. 

Performance depends on the details

Though many electrochemical devices exist, let’s stay with the battery example for a moment to break things down.

Batteries supply electricity when electrons move, via a chemical reaction, through a solution or “slurry” made of conductive materials and solvents. How well the battery system works depends on its materials, including both the chemistry and the manufacturing processes used in its creation.

Think of it like multiple racecars going around a racetrack. All the racecars have steering wheels, tires and engines, but the structure of each vehicle and how it’s assembled may differ from car to car. So, just because a car with an engine and a steering wheel is on the track doesn’t mean it gets the same performance as the other vehicles. The same is true for the critical components in batteries. The details matter in how you put them together. 

Conductive versions of carbon black (or soot) are commonly used in batteries as well as a vast number of electrochemical devices. They are nano-sized crystals of carbon made in such a way that they stick together and form aggregates, or clusters, that can be mixed with various liquids to form a slurry. This slurry is then used to cast, or make, parts of a battery or other devices. 

“In that mixture, electrons can move very fast within the carbon black, which is highly conductive like an electrical wire. But the electrons have to hop from one cluster of carbon-black particles to another because the carbon black is suspended in the slurry — the aggregate particles are not connected as a solid structure,” explained Wagner. 
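One simple way to reason about this hopping picture is percolation scaling, in which conductivity rises sharply once particle clusters form a connected network. The toy model below is an illustrative assumption, not the model used in the PNAS paper:

    # Illustrative percolation model: effective conductivity of a
    # conductive-particle suspension above a critical volume fraction.
    PHI_C = 0.03   # percolation threshold (illustrative value)
    T_EXP = 2.0    # scaling exponent, roughly 2 for 3D networks
    SIGMA0 = 1.0   # prefactor (arbitrary units)

    def effective_conductivity(phi):
        """sigma = sigma0 * (phi - phi_c)**t above the threshold, else 0."""
        return SIGMA0 * (phi - PHI_C) ** T_EXP if phi > PHI_C else 0.0

    for phi in (0.02, 0.04, 0.08, 0.16):
        print(f"volume fraction {phi:.2f} -> conductivity {effective_conductivity(phi):.4f}")

In this picture, processing matters because flow can break clusters apart or let them re-form, effectively moving the suspension back and forth across that threshold.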

The researchers had previously shown that the way the carbon black material flows, its rheology, plays a key role in the material’s performance, using neutron-scattering techniques at the National Institute of Standards and Technology’s Center for Neutron Research in Gaithersburg, Maryland, through UD’s Center for Neutron Science. In this new study, the research team extended that work to create a universal roadmap for understanding how the conductivity of the flowing slurry depends on the chemistry of the components of which it is composed and, importantly, how the slurry is processed.

Together, these pieces form a blueprint for how to process energy storage devices during manufacturing. The promise in this kind of roadmap is an enhanced ability to systematically design materials and predict the behavior for electrochemical devices on the front end.

“What we've studied allows us to begin to understand how the structure of this carbon-black slurry, this aggregated suspension, impacts the efficiency and performance of these devices,” said Wagner. “We're not solving anyone's specific battery problem. The hope is that others in practice can apply our foundational work to their own electrochemical systems and problems.”

The researchers expect this work will have an impact on the formulation and processing windows for emerging electrochemical energy storage methods and water deionization technologies.

Wagner gave the example of electrolyzer devices that use electricity to split water into its component parts of hydrogen and oxygen. One of the most challenging parts of this process is mixing and controlling the properties of the material solutions that enable the electrolyzer to do its work and free up hydrogen molecules so they can be used for other purposes, say, as an energy resource. According to Wagner, future improvements in such devices will depend on processing.

“You can get the chemistry right, but if you don't process it right, you don't end up with the performance that you want,” Wagner said.

 

Zebrafish use surprising strategy to regrow spinal cord



Detailed blueprint of nerve cells’ dramatic changes could help identify ways to heal spinal cord damage



Washington University School of Medicine

Image: The top image shows fluorescently labeled cells in the spinal cord of a zebrafish recovering one week after an injury; the bottom image shows recovery four weeks after an injury. Researchers at Washington University School of Medicine in St. Louis describe the dramatic changes within nerve cells that make regeneration possible; such findings could inspire the development of new therapies for spinal cord injuries in people. (Credit: Mokalled Lab)




Zebrafish are members of a rarefied group of vertebrates capable of fully healing a severed spinal cord. A clear understanding of how this regeneration takes place could provide clues toward strategies for healing spinal cord injuries in people. Such injuries can be devastating, causing permanent loss of sensation and movement.

A new study from Washington University School of Medicine in St. Louis maps out a detailed atlas of all the cells involved in regenerating the zebrafish spinal cord, and how they work together. In an unexpected finding, the researchers showed that the survival and adaptability of the severed neurons themselves are required for full spinal cord regeneration, while stem cells capable of forming new neurons, typically thought of as central to regeneration, play a complementary role but do not lead the process.
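Cell atlases of this kind are typically built by clustering single-cell RNA sequencing profiles. Below is a generic, minimal sketch of such a workflow in Python using the scanpy library on synthetic counts; it is not the study's actual pipeline, and the clustering step additionally requires the leidenalg package:

    import numpy as np
    import scanpy as sc
    import anndata as ad

    # Synthetic counts: 200 cells x 50 genes standing in for real data.
    rng = np.random.default_rng(0)
    adata = ad.AnnData(rng.poisson(1.0, size=(200, 50)).astype(np.float32))

    sc.pp.normalize_total(adata, target_sum=1e4)  # library-size normalization
    sc.pp.log1p(adata)                            # variance-stabilizing transform
    sc.pp.pca(adata, n_comps=20)                  # reduce dimensionality
    sc.pp.neighbors(adata, n_neighbors=15)        # k-nearest-neighbor graph over cells
    sc.tl.leiden(adata)                           # graph-based clustering
    print(adata.obs["leiden"].value_counts())     # putative cell groupings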

The study is published Thursday, Aug. 15, in the journal Nature Communications.

Unlike humans’ and other mammals’ spinal cord injuries, in which damaged neurons always die, the damaged neurons of zebrafish dramatically alter their cellular functions in response to injury, first to survive and then to take on new and central roles in orchestrating the precise events that govern healing, the researchers found. Scientists knew that zebrafish neurons survive spinal cord injury, and this new study reveals how they do it.

“We found that most, if not all, aspects of neural repair that we’re trying to achieve in people occur naturally in zebrafish,” said senior author Mayssa Mokalled, PhD, an associate professor of developmental biology. “The surprising observation we made is that there are strong neuronal protection and repair mechanisms happening right after injury. We think these protective mechanisms allow neurons to survive the injury and then adopt a kind of spontaneous plasticity — or flexibility in their functions — that gives the fish time to regenerate new neurons to achieve full recovery. Our study has identified genetic targets that will help us promote this type of plasticity in the cells of people and other mammals.”

By mapping out the evolving roles of various cell types involved in regeneration, Mokalled and her colleagues found that the flexibility of the surviving injured neurons and their capacity to immediately reprogram after injury lead the chain of events that are required for spinal cord regeneration. If these injury-surviving neurons are disabled, zebrafish do not regain their normal swim capacity, even though regenerative stem cells remain present.

When the long wiring of the spinal cord is crushed or severed in people and other mammals, it sets off a chain of toxicity events that kills the neurons and makes the spinal cord environment hostile against repair mechanisms. This neuronal toxicity could provide some explanation for the failure of attempts to harness stem cells to treat spinal cord injuries in people. Rather than focus on regeneration with stem cells, the new study suggests that any successful method to heal spinal cord injuries in people must start with saving the injured neurons from death.

“Neurons by themselves, without connections to other cells, do not survive,” Mokalled said. “In zebrafish, we think severed neurons can overcome the stress of injury because their flexibility helps them establish new local connections immediately after injury. Our research suggests this is a temporary mechanism that buys time, protecting neurons from death and allowing the system to preserve neuronal circuitry while building and regenerating the main spinal cord.”

There is some evidence that this capacity is present but dormant in mammalian neurons, so this may be a route to new therapies, according to the researchers.

“We are hopeful that identifying the genes that orchestrate this protective process in zebrafish — versions of which also are present in the human genome — will help us find ways to protect neurons in people from the waves of cell death that we see following spinal cord injuries,” she said.

While this study is focused on neurons, Mokalled said spinal cord regeneration is extremely complex, and future work for her team will delve into a new cell atlas to understand the contributions of other cell types to spinal cord regeneration, including non-neuronal cells, called glia, in the central nervous system as well as cells of the immune system and vasculature. They also have ongoing studies comparing the findings in zebrafish to what is happening in mammalian cells, including mouse and human nerve tissue.

Saraswathy VM, Zhou L, Mokalled MH. Single-cell analysis of innate spinal cord regeneration identifies intersecting modes of neuronal repair. Nature Communications. Aug. 15, 2024.

 

Warning signs: National data indicate that autistic birthing people are at increased risk for postpartum anxiety and depression



New research from Drexel University’s Policy and Analytics Center in the A.J. Drexel Autism Institute examined perinatal and postpartum outcomes among people with intellectual and developmental disabilities.



Drexel University





American women have the highest rate of maternal deaths among high-income countries, with outcomes worse for minoritized groups. In an effort to understand the maternal health of pregnant people with intellectual and developmental disabilities, including autism and intellectual disability, researchers from Drexel University’s Policy and Analytics Center in the A.J. Drexel Autism Institute examined Medicaid data to identify perinatal and postpartum outcomes among people with intellectual and developmental disabilities. The study was recently published in JAMA Network Open.

“While previous studies have reported an increased risk for challenges related to pregnancy and birth among people with intellectual and developmental disabilities, little research has been done using United States-based population-level data,” said Lindsay Shea, DrPH, director of the Policy and Analytics Center in the A.J. Drexel Autism Institute and lead author of the study. “Medicaid is a key system to study these risks and opportunities for policy and program improvements because it covers almost half of births in the U.S. and a disproportionate share of people with intellectual and developmental disabilities.”

The data showed people with intellectual and developmental disabilities were younger at the time of their first delivery and had higher risks for multiple medical and mental health conditions, including gestational diabetes, gestational hypertension and preeclampsia. Autistic pregnant people had significantly higher probability of postpartum anxiety and postpartum depression, compared to people with intellectual disabilities only and people without intellectual and developmental disabilities.

Researchers examined national Medicaid claims to compare perinatal and postpartum outcomes across groups of birthing people with intellectual and developmental disabilities (including intellectual disability and autism) and a random sample of birthing people without intellectual and developmental disabilities. The data included Medicaid claims from 2008-2019 for 55,440 birthing people with intellectual and developmental disabilities and a random sample of 438,557 birthing people without intellectual and developmental disabilities.

Perinatal outcomes, including medical conditions such as gestational diabetes, gestational hypertension and preeclampsia, and mental health conditions, such as anxiety disorders and depressive disorders, were compared across the groups. Researchers estimated the probability of postpartum anxiety and postpartum depression using Kaplan-Meier estimates and Cox proportional hazards regressions.
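For readers unfamiliar with those methods, here is a minimal sketch of the survival-analysis approach in Python on synthetic data, using the lifelines library; the columns, group indicator and effect sizes are invented for illustration:

    import numpy as np
    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter

    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "autistic": rng.integers(0, 2, n),                          # hypothetical group flag
        "weeks": (rng.exponential(scale=30, size=n) + 1).round(1),  # time to event or censoring
        "event": rng.integers(0, 2, n),                             # 1 = diagnosis observed
    })

    # Kaplan-Meier: estimated probability of remaining diagnosis-free over time
    kmf = KaplanMeierFitter()
    kmf.fit(df["weeks"], event_observed=df["event"])
    print(kmf.survival_function_.tail())

    # Cox proportional hazards: hazard ratio associated with the group flag
    cph = CoxPHFitter()
    cph.fit(df, duration_col="weeks", event_col="event")
    cph.print_summary()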

Co-author Molly Sadowsky, project director in the Policy and Analytics Center in the Autism Institute, explained how the findings suggest several opportunities for policymakers, providers and researchers. Reproductive health education, perinatal care and delivery services should be tailored to ensure comprehensive and targeted support for birthing people with intellectual and developmental disabilities. Policies should be designed and implemented to align with and be guided by the needs of people with intellectual and developmental disabilities to reduce maternal health disparities. Current clinical guidelines and procedures should be adapted to the specific needs and experiences of people with intellectual and developmental disabilities. And new Medicaid policies – like the postpartum coverage extension and doula service reimbursement – should be evaluated for impact on health outcomes of people with intellectual and developmental disabilities.

“Findings from this study underscore an urgent need for attention on Medicaid in supporting birthing people with intellectual and developmental disabilities throughout the perinatal period,” said Sadowsky. “It’s vital that differences in access to and coordination of postpartum care, as well as related differences in risk for postpartum depression and anxiety, continue to be examined.”

Shea and Sadowsky explained how this work will continue.

“We’ll advance this work in our next project by examining the impact of attitudinal and structural ableism on perinatal health and mental health outcomes, as well as neonatal and postnatal outcomes, morbidity, and mortality among children of women with and without intellectual and developmental disabilities,” said Shea.

Shea and her research team were recently awarded a five-year, $3 million National Institutes of Health Research Project Grant (R01) to further explore this area.

The future study will conduct a detailed examination of the impact of ableism on women with intellectual and developmental disabilities during pregnancy and the postpartum period, and will compare outcomes experienced by this group and their infants to those of peers without intellectual and developmental disabilities.

“Parenthood and reproductive health are important times in everyone’s life to be supported in getting the services and supports that work for each person and for each family,” said Shea. “We are excited about the future of our work on this topic to find ways that the health care system can do better and we can support people and celebrate their birthing experiences and roles in these tumultuous times in life.”
