Friday, March 12, 2021

Manure improves soil and microbe community

AMERICAN SOCIETY OF AGRONOMY

Research News

IMAGE: Some of the research pastures included warm-season Old World bluestem grass (left) mixed with legumes including alfalfa (purple blooms, center) and yellow sweetclover (yellow blooms, lower left). The legumes provide...

CREDIT: Lindsey Slaughter

In the dry air and soil of Texas' Southern High Plains, improving soil health can be tough. We usually think of healthy soil as moist and loose with lots of organic matter. But this can be hard to achieve in this arid area of Texas.

Lindsey Slaughter, a member of the Soil Science Society of America, set out with her fellow researchers to test a solution that kills two birds with one stone. They put excess cow manure on these soils to see if they could make them healthier.

The team recently published their research in the Soil Science Society of America Journal.

"We know that planting perennial grasslands for cattle production can help protect and restore soil in semi-arid lands that are likely to erode and degrade from intense farming," Slaughter says. "But producers need additional ways to increase soil carbon and nutrient stores."

What makes a healthy or unhealthy soil?

Slaughter describes soil health as the ability of a living soil ecosystem to perform a variety of important functions. These include cycling nutrients, storing and purifying water, helping plants and animals, and more.

This "living" part is made up of various microorganisms that help a soil be healthy. For example, they help break down materials like manure so that its nutrients become part of the soil.

"Improving the soil's ability to perform these roles and support plant and animal life is our target for soil health," Slaughter says. "Adding the manure can provide a boost of material that can be incorporated into soil organic matter. This helps provide a stronger foundation for more microbial activity and nutrient cycling."

To investigate, the researchers applied a single low dose of manure to two types of pastures: grass-only pastures that received occasional fertilizer, and grass-legume mixtures that were not fertilized.



CAPTION

The researchers used excess cattle manure from local producers. Here a donor deposits composted cattle manure prior to spreading it over the field sites.

CREDIT

Paul Green

Manure helps, but results take time

Overall, they did find that manure helped increase soil organic carbon and the number of microbes in the soil. These are two important characteristics of a healthy soil.

It took almost a year and a half to see these changes, although they say this is not totally surprising.

"This tells us that it can take a long time for even a little added compost to become incorporated into the soil organic matter of semi-arid grasslands, but it definitely helps," Slaughter explains.

"We think this is mostly due to the dry climate at our study site," says Slaughter. "We commonly get little rainfall per year. The microbial community was not able to work quickly or efficiently to decompose the manure without water."

Their results also showed that the pastures receiving fertilizer responded better to the manure. They believe this is because the nitrogen in the fertilizer helped the microbes decompose the manure better.

"Microbes help directly with releasing nutrients from organic material in a form that plants can use, as well as decomposing those residues to build soil organic matter," Slaughter says. "A lot of work has been done on how this can help improve cropping systems. However, we wanted to also test this on forage pastures."

Slaughter adds that the next steps in this work include testing whether more manure or multiple applications would produce faster results. The team also hopes to investigate whether irrigation or fertilizer would help incorporate the manure faster.

"We need more research along these lines to help us design strategies that quickly and effectively increase soil health and productivity in these grasslands," she says. "This helps farmers save money on nutrients and amendments while building soil organic matter and nutrient cycling capacity. This also saves them water and protects against soil degradation."

###

Lindsey Slaughter is an assistant professor at Texas Tech University. Funding for this work was provided by the Southern Sustainable Agriculture Research and Education and the USDA's National Institute of Food and Agriculture. Funding and student support was also provided by the Department of Plant and Soil Science in the College of Agricultural Sciences and Natural Resources at Texas Tech University.


CAPTION

Farm manager Paul Green of Texas Tech University anchors tarps in the pasture just before compost was spread onto the fields. This created treatment areas in each field where compost application was excluded.

CREDIT

Phil Brown

'Lost' ocean nanoplastic might be getting trapped on coasts

AMERICAN CHEMICAL SOCIETY

Research News

As plastic debris weathers in aquatic environments, it can shed tiny nanoplastics. Although scientists have a good understanding of how these particles form, they still don't have a good grasp of where all the fragments end up. Now, researchers reporting in ACS' Environmental Science & Technology have shown experimentally that most nanoplastics in estuarine waters can clump, forming larger clusters that either settle or stick to solid objects, instead of floating on into the ocean.

There is a huge discrepancy between the millions of tons of plastic waste entering rivers and streams and the amount researchers have found in the oceans. As large pieces of plastic break apart into successively smaller fragments on their transit to the sea, some eventually wear down to nano-sized particles. Previous studies have shown that these nanoplastics congregate in well-mixed, stagnant salty water. Yet these results do not apply when the particles encounter dynamic changes in salt content, such as in estuaries, where rivers carrying freshwater meet tidal saltwater. So, Hervé Tabuteau, Julien Gigault and colleagues wanted to perform laboratory experiments with micro-sized chambers mimicking the conditions measured in an estuary to show how nanoplastics interact and aggregate in this type of environment.

To determine how nanoplastics move in estuarine waters, the team developed a lab-on-a-chip device. They introduced crushed 400-nm-wide polystyrene beads and freshwater into one side of the device, while injecting saltwater through another inlet. At the opposite end of the 1.7-cm-long device, the researchers collected the output. The team tested different flow rates, replicating the salt gradient and water movement they measured in an estuary on the French Caribbean island of Guadeloupe. Nanoplastic aggregates up to 10-μm wide were detected within the zone of highest salt concentration in the flow chamber, regardless of how fast the water was moving. At the highest flow rate, only 12% of the nanoplastics were collected in the outlets; the remaining particles either clumped together and sank in the flow chamber or formed floating aggregates that stuck to the chamber's sides. The researchers say their results show estuaries and other coastal environments may filter out nanoplastics before they can enter the ocean.

###

The authors acknowledge funding from the French Agency for Research.

The abstract that accompanies this paper can be found here.


VEGAN LEATHER

The quest for sustainable leather alternatives

AMERICAN CHEMICAL SOCIETY

Throughout history, leather has been a popular material for clothes and many other goods. However, the tanning process and use of livestock mean that it has a large environmental footprint, leading consumers and manufacturers alike to seek out alternatives. An article in Chemical & Engineering News (C&EN), the weekly newsmagazine of the American Chemical Society, details how sustainable materials are giving traditional leather a run for its money.

Traditional leather goods are known for their durability, flexibility and attractive finish, with a global market worth billions, writes Associate Editor Craig Bettenhausen. Despite leather's popularity, the modern tanning process uses potentially harmful chemicals and creates a large amount of wastewater. In addition, most hides come from the meat and dairy industries, which have sustainability problems. Leather-like materials, often called vegan leather, are gaining traction among high-end manufacturers, defying the negative perceptions of synthetic "pleather." These leather alternatives are made from an array of base materials, including plants, mushrooms and even fish skin, each with a unique take on sustainable production.

Plant-based materials are currently the most advanced leather mimics because of their straightforward manufacturing process, which combines inexpensive natural fibers and polymers that are rolled into sheets. A company based in Mexico has created a leather made from prickly pear cactus, which is ground into a powder and combined with a biobased polyurethane. Mushroom leather mimics the texture of cowhide leather very well, but production needs to scale up substantially to make an impact. Although not a vegan alternative, fish skin is poised to replace exotic leathers such as snake and alligator skin. Cell-culture leather is also in early development, which could disrupt the traditional leather market even further. Experts are confident that these materials are viable alternatives, and manufacturers plan to scale up their efforts going forward.

###

The paper, "Sustainable Materials Make a Play for the Vegan Leather Market," is freely available here.

University of Minnesota scientists discover attacking fungi that show promise against emerald ash borer

UNIVERSITY OF MINNESOTA

Research News

IMAGE: Through DNA sequencing, scientists identified fungal isolates and revealed a diverse assemblage of fungi. This included entomopathogenic fungi that attack insects, as well as other fungi that cause cankers --...

Since its introduction, the emerald ash borer (EAB) has become the most devastating invasive forest insect in the United States, killing hundreds of millions of ash trees at a cost of hundreds of millions of dollars.

Now, new research from the University of Minnesota's Minnesota Invasive Terrestrial Plants and Pests Center (MITPPC) shows a possible path forward in controlling the invasive pest that threatens Minnesota's nearly one billion ash trees.

In a recent study published in Fungal Biology, MITPPC researchers identified various fungi living in EAB-infested trees -- a critical first step in finding fungi that may be harnessed to control the spread of EAB, and ultimately, prevent ash tree death.

"We discovered that several different species of fungi attack EAB and other insects, and they can now be further tested for their potential for biocontrol," said Robert Blanchette, the study's project leader and professor in the College of Food, Agricultural and Natural Resource Sciences. "This is a very important first step in the search for a biocontrol for emerald ash borer."

EAB larvae feed just beneath the bark, leaving behind tunnel galleries that can stretch up to 20 inches long. Beneath the surface, fungi -- some of which may be capable of parasitizing the EAB -- may be carried by the larvae as they develop, or may enter the tree through the tunnel galleries. Some of these fungi also seriously affect urban trees, causing rapid wood decay that results in hazardous trees.

From Rochester to Duluth, researchers gathered samples where ash is affected by EAB. Through DNA sequencing, scientists identified fungal isolates and revealed a diverse assemblage of fungi. This included entomopathogenic fungi that attack insects, as well as other fungi that cause cankers -- which help EAB kill trees -- and some that cause wood decay.

"Before now, we simply haven't been sure what fungi are associated with EAB infestations in Minnesota. This project identified those species and, in doing so, opened up new possibilities for managing one of our state's most devastating tree pests," said Ben Held, the study's lead author and researcher in the College of Food, Agricultural and Natural Resource Sciences.

As research continues, the scientists will build on the work from this study to determine if any of the fungi can be used to kill the emerald ash borer. Results will also be of value in helping control the insect in other parts of North America where EAB is found.

"Ash trees are vitally important to Minnesota," said Rob Venette, MITPPC director. "They reduce air pollution, storm water runoff, and cooling costs, all while increasing property values in local communities. It's critical we work to protect them from this invasive pest."

###

The research was funded through the Minnesota Environment and Natural Resources Trust Fund.

New IceCube detection proves 60-year-old theory

MICHIGAN STATE UNIVERSITY

Research News

On December 6, 2016, a high-energy particle called an electron antineutrino was hurtling through space at nearly the speed of light. Normally, the ghostly particle would zip right through the Earth as if it weren't even there.

But this particle just so happened to smash into an electron deep inside the South Pole's glacial ice. The collision created a new particle, known as the W- boson. That boson quickly decayed, creating a shower of secondary particles.

The whole thing played out in front of the watchful detectors of a massive telescope buried in the Antarctic ice, the IceCube Neutrino Observatory. This enabled IceCube to make the first ever detection of a Glashow resonance event, a phenomenon predicted 60 years ago by Nobel laureate physicist Sheldon Glashow.

This detection provides the latest confirmation of the Standard Model, the name of the particle physics theory explaining the universe's fundamental forces and particles.

"Finding it wasn't necessarily a surprise, but that doesn't mean I wasn't very happy to see it," said Claudio Kopper, an associate professor in Michigan State University's Department of Physics and Astronomy in the College of Natural Science. Kopper and his departmental colleague, assistant professor Nathan Whitehorn, lead IceCube's Diffuse and Atmospheric Flux Working Group behind the discovery.

The international IceCube Collaboration published this result online on March 11 in the journal Nature.

"Even three years ago, I didn't think IceCube would be able to make this measurement, or at least as well as we did," Whitehorn said.

IMAGE: A 3-D plot with columns of green, blue, yellow and orange spheres and other round shapes gives a visual representation of the Glashow resonance event detection.

This detection further demonstrates the ability of IceCube, which observes nearly massless particles called neutrinos using thousands of sensors embedded in the Antarctic ice, to do fundamental physics.

Although the Spartans lead the working group, they emphasized that this discovery was a team effort, powered by the paper's three lead analysts: Lu Lu, an assistant professor at University of Wisconsin-Madison; Tianlu Yuan, an assistant scientist at the Wisconsin IceCube Particle Astrophysics Center, or WIPAC; and Christian Haack, a postdoc at the Technical University of Munich.

"We lead weekly meetings, we talk about how the work is done, we ask hard questions," said Kopper. "But without the people doing the actual analysis, we wouldn't have anything."

"Our job is to be the doubters-in-chief," Whitehorn said. "The lead authors did a great job convincing everyone that this event was a Glashow resonance."

The particle physics community has been anticipating such a detection, but Glashow resonance events are extremely rare by nature and technologically challenging to detect.

"When Glashow was a postdoc at the Niels Bohr Institute, he could never have imagined that his unconventional proposal for producing the W- boson would be realized by an antineutrino from a faraway galaxy crashing into Antarctic ice," said Francis Halzen, professor of physics at the University of Wisconsin-Madison, the headquarters of IceCube maintenance and operations, and principal investigator of IceCube.

A Glashow resonance event requires an electron antineutrino with a cosmic amount of energy -- at least 6.3 peta-electronvolts, or PeV. For comparison, that's about 1,000 times more energy than that of the most energetic particles produced by the Earth's most powerful particle accelerators.
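The 6.3 PeV figure follows directly from the kinematics: the resonance occurs when the center-of-mass energy of the antineutrino and an electron at rest equals the W boson mass, giving E = m_W²/(2·m_e). A quick back-of-the-envelope check, using published particle mass values (this script is illustrative, not from the study):

```python
# Glashow resonance: an electron antineutrino striking an atomic electron at
# rest produces an on-shell W- boson when the center-of-mass energy squared,
# s = 2 * m_e * E_nu, equals m_W**2. Solving gives E_nu = m_W**2 / (2 * m_e).
M_W = 80.379e9  # W boson mass, in eV (PDG value)
M_E = 0.511e6   # electron mass, in eV

E_nu = M_W**2 / (2 * M_E)  # resonance energy, in eV
print(f"E_nu = {E_nu / 1e15:.1f} PeV")  # -> E_nu = 6.3 PeV
```

The same arithmetic supports the "about 1,000 times" comparison: 6.3 PeV is roughly a thousand times the 6.5 TeV proton energy of the most powerful accelerators.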

Since IceCube started fully operating in 2011, it has detected hundreds of high-energy neutrinos from space. Yet the neutrino in December 2016 was only the third with an energy higher than 5 PeV.

And simply having a high-energy neutrino is not sufficient to detect a Glashow resonance event. The neutrino then has to interact with matter, which is not a guarantee. But IceCube encompasses quite a bit of matter in the form of Antarctic ice.

IMAGE: The IceCube Laboratory, lighted red against the night sky, is small in this landscape photograph of the South Pole's white tundra, which also captures yellow stars and light green auroras.

The observatory's detector array has been built into the ice, spanning nearly 250 acres with sensors reaching up to about a mile deep. All told, IceCube boasts a cubic kilometer of coverage, watching over a billion metric tons of extremely clear ice.

That's what it takes to detect neutrinos, along with a team of scientists who have the skill and determination to spot rare events.

IceCube's more than 5,000 detectors take in a tremendous firehose of light, Whitehorn said. Detecting the Glashow resonance meant researchers had to pick out a handful of telltale photons, individual particles of light, from that firehose spray.

"This is some of the most impressive technical work I've ever seen," Whitehorn said, calling the team unstoppable over the years-long effort to confirm this was a Glashow resonance event.

Making the work even more impressive was the fact that the lead authors -- Lu, Yuan and Haack -- were in three countries on three different continents during the analysis. Lu was a postdoc at Chiba University in Japan, Yuan was at WIPAC in the U.S. and Haack was a doctoral student at Rheinisch-Westfälische Technische Hochschule Aachen University in Germany.

"It was amazing to me just seeing that that is possible," Kopper said.

But this is very much in keeping with the ethos of IceCube, an observatory built on international collaboration. IceCube is operated by a group of scientists, engineers and staff from 53 institutions in 12 countries, together known as the IceCube Collaboration. The project's headquarters is WIPAC, a research center of UW-Madison in the United States.

To confirm the detection and usher in a new chapter of neutrino astronomy, the IceCube Collaboration is working to detect more Glashow resonances. And they need IceCube-Gen2, a proposed expansion of the IceCube detector, to make it happen.

"We already know that the astrophysical spectrum does not end at 6 PeV," Lu said. "The key is to detect more Glashow resonance events and to identify the sources that accelerate those antineutrinos. IceCube-Gen2 will be key to making such measurements in a statistically significant way."

Glashow himself echoed that sentiment about validation. "To be absolutely sure, we should see another such event at the very same energy as the one that was seen," said Glashow, now an emeritus professor of physics at Boston University. "So far there's one, and someday there will be more."

###

The IceCube Neutrino Observatory is funded primarily by the National Science Foundation and is operated by a team headquartered at the University of Wisconsin-Madison. IceCube's research efforts, including critical contributions to the detector operation, are funded by agencies in Australia, Belgium, Canada, Denmark, Germany, Japan, New Zealand, Republic of Korea, Sweden, Switzerland, the United Kingdom, and the United States. IceCube construction was also funded with significant contributions from the National Fund for Scientific Research -- the FNRS and FWO -- in Belgium; the Federal Ministry of Education and Research and the German Research Foundation in Germany; the Knut and Alice Wallenberg Foundation, the Swedish Polar Research Secretariat, and the Swedish Research Council in Sweden; and the Department of Energy and the University of Wisconsin-Madison Research Fund in the U.S.

(Note for media: Please include a link to the original paper in online coverage: https://dx.doi.org/10.1038/s41586-021-03256-1)

Tracking cosmic ghosts

Frontera supercomputer, a community resource for IceCube Neutrino Observatory research, enables the detection of a new high-energy particle event

UNIVERSITY OF TEXAS AT AUSTIN, TEXAS ADVANCED COMPUTING CENTER

Research News

IMAGE: A visualization of the Glashow event recorded by the IceCube detector. Each colored circle shows an IceCube sensor that was triggered by the event; red circles indicate sensors triggered earlier...

CREDIT: IceCube Collaboration

The idea was so far-fetched it seemed like science fiction: create an observatory out of a one cubic kilometer block of ice in Antarctica to track ghostly particles called neutrinos that pass through the Earth. But speaking to Benedikt Riedel, global computing manager at the IceCube Neutrino Observatory, it makes perfect sense.

"Constructing a comparable observatory anywhere else would be astronomically expensive," Riedel explained. "Antarctic ice is a great optical material and allows us to sense neutrinos as nowhere else."

Neutrinos are neutral subatomic particles with a mass close to zero that can pass through solid materials at near the speed of light, rarely reacting with normal matter. They were first detected in the 1950s in experiments that operated near nuclear reactors, which also generate these particles. They were further found to be created by cosmic rays interacting with our atmosphere. But astrophysicists believed they were likely widespread and caused by a variety of cosmic events, if only they could be detected.

Importantly, scientists believed neutrinos could be critical clues to other phenomena. "20 percent of the potentially visible Universe is dark to us," Riedel explained. "That's mostly because of distances and the age of the Universe. High energy light is also hidden. It is absorbed or undergoes transformation that makes it hard to trace back to a source. IceCube reveals a slice of the Universe we haven't yet observed."

An Important New Tool in the Multi-Messenger Astronomy Toolbox

Multi-messenger astronomy describes an approach that combines observations of light, gravitational waves, and particles to understand some of the most extreme events in the Universe. Neutrinos play an important part in this type of research.

Prior to 1987, when Supernova 1987A exploded, all extra-solar astronomical observations were photon-based. Today, additional detection systems add to our view of the cosmos, including all-sky surveys and gravitational wave detectors. However, most observatories can only look at a small portion of the sky. IceCube, because of the nature of neutrinos, can observe these particles' flights from any direction, and therefore act as a full-sky sentinel.

The block of ice at the Amundsen-Scott South Pole Station in Antarctica -- up to a hundred thousand years old and extremely clear -- is instrumented with sensors between 1,450 and 2,450 meters below the surface. As neutrinos pass through the ice, they may interact with a proton or neutron, producing photons which then travel through the ice and can be detected by a sensor. The sensors transform these signals from neutrino interactions -- a handful an hour -- into digital data that is then analyzed to determine whether they represent a local source (Earth's atmosphere) or a distant one.

"Based on the analysis, researchers are also able to determine where in the sky the particle came from, its energy, and sometimes, what type of neutrino -- electron, muon or tau -- it was," said James Madsen, executive director at the Wisconsin IceCube Particle Astrophysics Center.

In 2017, IceCube detected a neutrino with an energy of 290 teraelectronvolts (TeV) and sent out an alert. The detection triggered an extensive campaign involving more than twenty space- and ground-based telescopes. They identified a blazar 3.5 billion light years away, marking the first identification of a high-energy cosmic ray source and launching a new era in multi-messenger detection, according to Riedel.

"We continuously search our dataset in near-real time for interesting neutrino events," he explained. "We found one and sent out an email alert to the community. They followed up with all these other electromagnetic observations, pinpointing a known gamma ray source. They also found, over the course of a month, an increased activity from the source."


CAPTION

The IceCube Neutrino Observatory is the first detector of its kind, designed to observe the cosmos from deep within the South Pole ice. An international group of scientists responsible for the scientific research makes up the IceCube Collaboration.

CREDIT

Yuya Makino, IceCube/NSF

IceCube Discovers Evidence of High-energy Electron Antineutrino

On March 10, 2021, IceCube announced the detection of a Glashow resonance event, a phenomenon predicted by Nobel laureate physicist Sheldon Glashow in 1960. The Glashow resonance describes the formation of a W- boson -- an elementary particle that mediates the weak force -- during the interaction of a high-energy electron antineutrino with an electron, peaking at an antineutrino energy of 6.3 petaelectronvolts (PeV). Its existence is a key prediction of the Standard Model of particle physics. The results further demonstrated the ability of IceCube to do fundamental physics. The result was published on March 10 in Nature.

While this energy scale is out of reach for current and future planned particle accelerators, natural astrophysical phenomena are expected to produce antineutrinos that reach beyond PeV energies. The news of the Glashow resonance discovery, "suggests the presence of electron antineutrinos in the astrophysical flux, while also providing further validation of the standard model of particle physics," the authors wrote. "Its unique signature indicates a method of distinguishing neutrinos from antineutrinos, thus providing a way to identify astronomical accelerators that produce neutrinos via hadronuclear or photohadronic interactions, with or without strong magnetic fields."

Neutrino detections require significant computing resources to model the detector behavior and differentiate extra-solar signals from background events created from cosmic ray interactions in the atmosphere. Riedel serves as the coordinator for a large community of researchers -- as many as 300 by his estimates -- who use the Frontera supercomputer at the Texas Advanced Computing Center (TACC), a National Science Foundation (NSF)-funded resource for the national community.

IceCube was awarded time on Frontera as part of the Large Scale Community Partnership track, which provides extended allocations of up to three years to support long-lived science experiments. IceCube -- which has collected data for 14 years and was recently awarded a grant from NSF to expand operations over the next few years -- is a premier example of such an experiment.

"Part of the resources from Frontera contributed to that discovery," Riedel said. "There's years of Monte Carlo simulations that went into figuring out that we could do this."

IceCube uses computing resources from a number of sources, including the Open Science Grid, the Extreme Science and Engineering Discovery Environment (XSEDE), their own local supercomputing cluster, and recently the Amazon Web Services cloud. Frontera is the largest system utilized, however, and can handle a large part of the computational needs of the neutrino community, reserving local or cloud resources for urgent analyses, Riedel says.

"A lot of the computing on Frontera may not be directly associated with discoveries, but it helps down the road, to discern signals better and develop new algorithms," he said.

Modeling Ice and Following Up on Promising Signals

The projects that IceCube scientists use Frontera for vary, but they typically either involve calculations to better understand the optical nature of the ice generally (so the trajectory and other characteristics of neutrino detections can be accurately determined); or computations to analyze specific events that are deemed significant.

The first type of computation uses primarily ray tracing to calculate the path of light in the ice from high-energy electrically charged particles produced when neutrinos interact. The rays can scatter or be absorbed by defects in the ice, complicating analysis. Using graphics processing units (GPUs), Riedel has found, can speed up simulations of light propagation in the ice by hundreds of times. The IceCube team is among the largest users of the Frontera GPU subsystem that includes NVIDIA RTX GPUs.
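As a rough illustration of what such a photon-propagation simulation involves, here is a toy Monte Carlo in the spirit of the description above. It is a sketch, not IceCube's production code: the 25 m scattering length and 100 m absorption length are hypothetical round numbers, and real glacial-ice optics vary strongly with depth.

```python
import random

def propagate_photon(scatter_len=25.0, absorb_len=100.0):
    """Toy Monte Carlo for a single photon in ice (lengths in meters,
    values illustrative only). The photon travels an exponentially
    distributed total path before absorption (Beer-Lambert law) and
    scatters at exponentially spaced intervals along the way."""
    absorb_at = random.expovariate(1.0 / absorb_len)  # total path before absorption
    traveled = 0.0
    scatters = 0
    while True:
        step = random.expovariate(1.0 / scatter_len)  # distance to next scatter
        if traveled + step >= absorb_at:
            return absorb_at, scatters  # absorbed before the next scatter
        traveled += step
        scatters += 1  # a full simulation would sample a new direction here

random.seed(1)
results = [propagate_photon() for _ in range(10_000)]
mean_path = sum(path for path, _ in results) / len(results)
mean_scatters = sum(n for _, n in results) / len(results)
print(f"mean path to absorption: {mean_path:.0f} m, mean scatters: {mean_scatters:.1f}")
```

Each photon here is independent, which is why the workload parallelizes so well onto GPUs: millions of such random walks can be traced simultaneously with no communication between them.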

The second type of computation occurs when scientists receive an alert that says they have received an interesting signal. "We kick off a calculation to analyze the event that can scale to one million CPUs," Riedel said. "We don't have those, so Frontera can give us a portion of that computational power to run a reconstruction or extraction algorithm. We get those types of events about once a month."

"Large scale simulations of the IceCube facility and the data it creates allow us to rapidly and accurately determine the properties of these neutrinos, which in turn exposes the physics of the most energetic events in the universe," said Niall Gaffney, TACC Director of Data Intensive Computing. "This is key to validating the fundamental quantum-mechanical physics in environments that cannot be practically replicated on earth."

Today's astronomers can observe the universe in many different ways, and computing is now central to almost all of them. "We've moved from the traditional view of a guy with a telescope looking up at the sky, to large scale instruments, to now particle physics and particle observatories," Riedel said. "With this new paradigm, we need large amounts of computing for short periods of time to do big, time-sensitive computing, and big scientific computing centers like TACC help us do our science."


CAPTION

Flags outside of IceCube represent the international collaboration of the project.

CREDIT

Yuya Makino, IceCube/NSF



Making decisions based on how we feel about memories, not accuracy

UNIVERSITY OF CALIFORNIA - DAVIS

Research News

When we recall a memory, we retrieve specific details about it: where, when, with whom. But we often also experience a vivid feeling of remembering the event, sometimes almost reliving it. Memory researchers call these processes objective and subjective memory, respectively. A new study from the Center for Mind and Brain at the University of California, Davis, shows that objective and subjective memory can function independently, involve different parts of the brain, and that people base their decisions on subjective memory -- how they feel about a memory -- more than on its accuracy.

"The study distinguishes between how well we remember and how well we think we remember, and shows that decision making depends primarily on the subjective evaluation of memory evidence," said co-author Simona Ghetti, professor at the UC Davis Department of Psychology and Center for Mind and Brain. The work is published March 9 in the journal eLife.

Postdoctoral researcher Yana Fandakova, now an investigator at the Max Planck Institute for Human Development in Berlin, graduate student Elliott Johnson and Ghetti tested objective and subjective memory. After showing volunteers a series of images of common objects, the researchers showed them pairs of images and asked them to determine which of the two they had seen before. The volunteers were asked to rate the memory as "recollected" if they experienced it as vivid and detailed, or as "familiar" if they felt that the memory lacked detail. In some of the tests, image pairs included a target image and a similar image of the same object. In others, the target was shown with an unrelated image from the same original set. For example, a chair might be shown with another chair shown from a different angle, or with an apple.

This experimental design allowed the researchers to score objective memory by how well the volunteers recalled previously seeing an image, and subjective memory by how they rated their own memory as vividly recollected or merely familiar. Finally, participants were asked to select which images to keep or discard, assigning them to a treasure chest or trash bin.
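The two scores described above can be sketched in a few lines. The trial records and field names below are hypothetical, invented for illustration; they are not the study's actual analysis code or data.

```python
# Hypothetical trial records: each trial notes whether the participant
# correctly chose the previously seen (target) image, and how they rated
# the memory ("recollected" = vivid and detailed, "familiar" = lacking detail).
trials = [
    {"chose_target": True,  "rating": "recollected"},
    {"chose_target": True,  "rating": "familiar"},
    {"chose_target": False, "rating": "recollected"},
    {"chose_target": True,  "rating": "recollected"},
]

def objective_score(trials):
    """Objective memory: proportion of trials where the target was correctly chosen."""
    return sum(t["chose_target"] for t in trials) / len(trials)

def subjective_score(trials):
    """Subjective memory: proportion of trials rated as vividly 'recollected'."""
    return sum(t["rating"] == "recollected" for t in trials) / len(trials)

print(objective_score(trials))   # 0.75
print(subjective_score(trials))  # 0.75
```

The point of the design is that these two proportions can diverge for the same participant, which is what lets the researchers treat accuracy and felt vividness as separate measures.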

The team also used functional MRI to measure brain activity during this task.

Scoring objective and subjective memory

The results showed higher levels of objective memory when participants were tested with pairs of similar images. But people were more likely to claim that they remembered vividly when looking at pairs of dissimilar images.

Participants were more likely to base their decision about whether to keep or trash an image on how they felt about a memory rather than its objective accuracy.

To give a real-world example, a person could have a vivid memory of going to an event with friends. Some of the actual details of that memory might be a bit off, but they may feel it is a vivid memory, so they might decide to go out with the same people again (after the pandemic).

On the other hand, if someone has learned to use similar power tools doing odd jobs around the house, their memories about those objects may be quite specific.

"But you might still feel that you are not recalling vividly because you might question whether you are remembering the right procedure about the right tool. So, you may end up asking for help instead of relying on your memory," Ghetti said.

The fMRI data showed that objective and subjective memory recruited distinct cortical regions in the parietal and prefrontal regions. The regions involved in subjective experiences were also involved in decision making, bolstering the connection between the two processes.

"By understanding how our brains give rise to vivid subjective memories and memory decisions, we are moving a step closer to understanding how we learn to evaluate memory evidence in order to make effective decisions in the future," Fandakova said.

###

The work was supported by the James S. McDonnell Foundation.

Study finds lower dose of ketamine equally effective in reducing pain

LOYOLA UNIVERSITY HEALTH SYSTEM

Research News

MAYWOOD, IL - A recent Loyola Medicine study found that reducing the standard dose of IV-administered ketamine by half is as effective as the larger, standard dose in reducing pain in adults.

Ketamine is known to provide pain relief comparable to opioid medications, which are highly addictive. In the recent study, appearing in the journal Academic Emergency Medicine, researchers studied 98 patients, ages 18 to 59, who presented to the emergency department with acute, moderate to severe pain. The patients were randomized prospectively to receive either 0.15 mg/kg of ketamine (low dose) or 0.30 mg/kg (high dose). Patients and providers were blinded to dose, with the primary outcome of pain measured on the 11-point numerical rating scale (NRS) at 30 minutes. At 15 minutes, the high dose group had a greater decrease in pain on the NRS but more adverse events. At 30 minutes, adverse events and pain were similar.
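Since both arms were dosed by body weight, the difference between them is simple arithmetic. The sketch below illustrates the two mg/kg doses from the study for a hypothetical 70 kg patient; it is an illustration only, not clinical guidance.

```python
def ketamine_dose_mg(weight_kg, mg_per_kg):
    """Weight-based IV dose in milligrams (illustration only, not clinical guidance)."""
    return weight_kg * mg_per_kg

LOW_DOSE = 0.15   # mg/kg, the study's low-dose arm
HIGH_DOSE = 0.30  # mg/kg, the study's standard (high-dose) arm

# For a hypothetical 70 kg patient:
print(ketamine_dose_mg(70, LOW_DOSE))   # 10.5 mg
print(ketamine_dose_mg(70, HIGH_DOSE))  # 21.0 mg
```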

Overall, patients generally reported that they would take ketamine again for pain - 75.6% in the low-dose group and 61.7% in the high-dose group.

"We challenged the conventional ketamine dose used to treat pain," said lead study author Shannon Lovett, MD, emergency medicine physician at Loyola University Medical Center (LUMC) and associate professor in the Department of Emergency Medicine at Loyola University Chicago Stritch School of Medicine. "Our study should help demonstrate that a lower dose is sufficient to treat pain."

The study did not find a significant reduction in side effects from the lower dose.

"As we continue with our research, we hope to find data that supports diminished side effects with the lower dose of ketamine with equal efficacy in treating pain," said senior study author Megan A. Rech, emergency medicine clinical pharmacist at LUMC and an adjunct assistant professor and research coordinator at Stritch.

###

Large computer language models carry environmental, social risks

UNIVERSITY OF WASHINGTON

Research News

Computer engineers at the world's largest companies and universities are using machines to scan through tomes of written material. The goal? Teach these machines the gift of language. Do that, some even claim, and computers will be able to mimic the human brain.

But this impressive compute capability comes with real costs, including perpetuating racism and causing significant environmental damage, according to a new paper, "On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? 🦜" The paper is being presented Wednesday, March 10 at the ACM Conference on Fairness, Accountability and Transparency (ACM FAccT).

This is the first exhaustive review of the literature surrounding the risks that come with rapid growth of language-learning technologies, said Emily M. Bender, a University of Washington professor of linguistics and a lead author of the paper along with Timnit Gebru, a well-known AI researcher.

"The question we're asking is what are the possible dangers of this approach and the answers that we're giving involve surveying literature across a broad range of fields and pulling them together," said Bender, who is the UW Howard and Frances Nostrand Endowed Professor.

The researchers found that there are real downsides to the ever-growing computing power put into natural language models. They discuss how the ever-increasing size of training data for language modeling exacerbates social and environmental issues. Alarmingly, such language models perpetuate hegemonic language and can deceive people into thinking they are having a "real" conversation with a person rather than a machine. The increased computational needs of these models further contribute to environmental degradation.

The authors were motivated to write the paper because of a trend within the field towards ever-larger language models and their growing spheres of influence.

The paper already has generated widespread attention due, in part, to the fact that two of the paper's co-authors say they were fired recently from Google for reasons that remain unsettled. Margaret Mitchell and Gebru, the two now-former Google researchers, said they stand by the paper's scholarship and point to its conclusions as a clarion call to industry to take heed.

"It's very clear that putting in the concerns has to happen right now, because it's already becoming too late," said Mitchell, a researcher in AI.

It takes an enormous amount of computing power to fuel these language model programs, Bender said. That takes up energy at tremendous scale, and that, the authors argue, causes environmental degradation. And those costs aren't borne by the computer engineers, but rather by marginalized people who cannot afford the environmental costs.

"It's not just that there's big energy impacts here, but also that the carbon impacts of that will bring costs first to people who are not benefiting from this technology," Bender said. "When we do the cost-benefit analysis, it's important to think of who's getting the benefit and who's paying the cost because they're not the same people."

The large scale of this compute power also can restrict access to only the most well-resourced companies and research groups, leaving out smaller developers outside of the U.S., Canada, Europe and China. That's because it takes huge machines to run the software necessary to make computers mimic human thought and speech.

Another risk comes from the training data itself, the authors say. Because the computers read language from the Web and from other sources, they can pick up and perpetuate racist, sexist, ableist, extremist and other harmful ideologies.

"One of the fallacies that people fall into is well, the internet is big, the internet is everything. If I just scrape the whole internet then clearly I've incorporated diverse viewpoints," Bender said. "But when we did a step-by-step review of the literature, it says that's not the case right now because not everybody's on the internet, and of the people who are on the internet, not everybody is socially comfortable participating in the same way."

And, people can confuse the language models for real human interaction, believing that they're actually talking with a person or reading something that a person has spoken or written, when, in fact, the language comes from a machine. Thus, the stochastic parrots.

"It produces this seemingly coherent text, but it has no communicative intent. It has no idea what it's saying. There's no there there," Bender said.

###

Angelina McMillan-Major, a doctoral student in linguistics at UW, also co-authored the paper.


Helpful behavior during pandemic tied to recognizing common humanity

UNIVERSITY OF WASHINGTON

Research News

IMAGE

IMAGE: THIS BAR CHART SHOWS THAT "IDENTIFICATION WITH ALL HUMANITY" HAD A LARGER EFFECT SIZE THAN ANY OTHER VARIABLE ON COOPERATIVE BEHAVIOR DURING THE PANDEMIC. view more 

CREDIT: BARRAGAN ET AL., 2021, PLOS ONE

During the COVID-19 pandemic, people who recognize the connections they share with others are more likely to wear a mask, follow health guidelines and help people, even at a potential cost to themselves, a new University of Washington study shows.

Indeed, an identification with all humanity, as opposed to identification with a geographic area like a country or town, predicts whether someone will engage in "prosocial" behaviors particular to the pandemic, such as donating their own masks to a hospital or coming to the aid of a sick person.

The study, published March 10 in PLOS ONE, is drawn from about 2,500 responses, from more than 80 countries, to an online, international study launched last April.

Researchers say the findings could have implications for public health messaging during the pandemic: Appealing to individuals' deep sense of connectedness to others could, for example, encourage some people to get vaccinated, wear masks or follow other public health guidelines.

"We want to understand to what extent people feel connected with and identify with all humanity, and how that can be used to explain the individual differences in how people respond during the COVID-19 pandemic," said author Rodolfo Cortes Barragan, a postdoctoral researcher at the UW Institute for Learning & Brain Sciences, or I-LABS, who co-led the study with postdoctoral researcher Nigini Oliveira at the Paul G. Allen School for Computer Science and Engineering.

In psychology, "identification with all humanity" is a belief that can be measured and utilized in predicting behavior or informing policy or decision-making. Last spring, as governments around the world were imposing pandemic restrictions, a multidisciplinary team of UW researchers came together to study the implications of how people would respond to pandemic-related ethical dilemmas, and how those responses might be associated with various psychological beliefs.

Researchers designed an online study, providing different scenarios based in social psychology and game theory, for participants to consider. The team then made the study available in English and five other languages on the virtual lab LabintheWild, which co-author Katharina Reinecke, an associate professor in the Allen School, created for conducting behavioral studies with people around the world.

The scenarios presented participants with situations that could arise during the pandemic and asked people to what extent they would:

  • Follow the list of World Health Organization health guidelines (which mostly focused on social distancing and hygiene when the study was run, between mid-April and mid-June)
  • Donate their family's masks to a hospital short on masks
  • Drive a person exhibiting obvious symptoms of COVID-19 to the hospital
  • Go to a grocery store to buy food for a neighboring family
  • Call an ambulance and wait with a sick person for it to arrive

In addition to demographic details and information about their local pandemic restrictions, such as stay-at-home orders, participants were asked questions to get at the psychology behind their responses: about their own felt identification with their local community, their nation and humanity, in general. For instance, participants were asked, "How much would you say you care (feel upset, want to help) when bad things happen to people all over the world?"

Researchers found that an identification with "all humanity" significantly predicted answers to the five scenarios, well above identifying with country or community, and after controlling for other variables such as gender, age or education level. Its effect was stronger than any other factor, said Barragan, and popped out as a highly significant predictor of people's tendency to want to help others.
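The comparison described above, asking which identification scale tracks prosocial responses most closely, can be illustrated with a simple Pearson correlation. The study itself used a fuller analysis that controlled for demographics; the scores below are invented for the sketch, not the study's data.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length sequences of scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical scores: 1-5 identification scales, 0-10 prosocial index.
# These numbers are illustrative only.
humanity  = [5, 4, 2, 5, 3, 1, 4, 2]
country   = [3, 4, 2, 3, 5, 2, 3, 4]
prosocial = [9, 8, 3, 10, 5, 2, 7, 4]

print(round(pearson_r(humanity, prosocial), 2))  # 0.98
print(round(pearson_r(country, prosocial), 2))   # 0.24
```

In this toy comparison, identification with all humanity tracks the prosocial index far more tightly than identification with country, mirroring the pattern the study reports.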

The authors noted that identifying with one's country, in fact, came in a distant third, behind identification with humanity in general and one's local community. Strong feelings toward one's nation (nationalism) can lead to behavior and policies that favor some groups of people over others.

"There is variability in how people respond to the social aspects of the pandemic. Our research reveals that a crucial aspect of one's world view - how much people feel connected to others they have never met - predicts people's cooperation with public health measures and the altruism they feel toward others during the pandemic," said co-author Andrew Meltzoff, who is co-director of I-LABS and holds the Job and Gertrud Tamaki Endowed Chair in psychology.

Since last spring, of course, much has changed. More than 2.5 million people worldwide have died of COVID-19, vaccines are being administered, and guidance from the U.S. Centers for Disease Control and Prevention, especially regarding masks, has evolved. If a new survey were launched today, Barragan said, the research group would like to include scenarios tuned to the current demands of the pandemic and the way it challenges us to care for others even while we maintain physical distancing.

While surveys, in general, can be prone to what's known as self-serving bias -- the participant answers in ways that they believe will make them "look good" -- researchers say that's not evident here. They point to the sizeable differences between responses that identify with all humanity, and those that identify with community or country, and note there would be little reason for participants to deliberately emphasize one and not the others.

The project is part of a larger multidisciplinary effort by this same UW research team to bring together computer scientists and psychologists interested in decision-making in different cultural contexts, which could inform our understanding of human and machine learning.

An eventual aim of the study is to use tools from artificial intelligence research and online interactions with humans around the world to understand how one's culture influences social and moral decision-making.

"This project is a wonderful example of how the tools of computer science can be combined with psychological science to understand human moral behaviors, revealing new information for the public good," said co-author Rajesh Rao, the Hwang Endowed Professor of Computer Science and Engineering at the UW.

For COVID-19 and future humanitarian crises, the ethical dilemmas presented in the study can offer insight into what propels people to help, which can, in turn, inform policy and outreach.

"While it is true that many people don't seem to be exhibiting helpful behaviors during this pandemic, what our study shows is that there are specific characteristics that predict who is especially likely to engage in such behavior," Barragan said. "Future work could help people to feel a stronger connection to others, and this could promote more helpful behavior during pandemics."

###

Additional co-authors were Koosha Khalvati, a doctoral student in the Allen School, and Rechele Brooks, a research scientist with I-LABS.

The study was funded by the UW, the Templeton World Charity Foundation and the National Science Foundation.