Friday, March 12, 2021

New technology could increase health inequities

Making sure that everyone gets the benefits of new approaches to managing and treating disease

NORWEGIAN UNIVERSITY OF SCIENCE AND TECHNOLOGY

Research News

People are different. New technology is good for patients and the healthcare system. But it could also expand the already significant health disparities in Norway and other countries.

"Women and men with higher education in Norway live five to six years longer than people with that only have lower secondary school education," says Emil Øversveen, a postdoctoral fellow at the Norwegian University of Science and Technology's (NTNU) Department of Sociology and Political Science.

He is affiliated with CHAIN, the Centre for Global Health Inequalities Research. The centre works to reduce social health inequalities worldwide.

Both Norwegian and international research shows that people with low occupational status, low income and less education have poorer health and live shorter lives than people higher up in the social hierarchy.

Øversveen has looked at whether new technology might be further increasing health disparities. His study compares differences between countries and between people within various countries.

"Vaccines are a health technology, too. The new knowledge can be used to understand why the COVID-19 vaccine is being unequally distributed around the world," says Øversveen.

We see that wealthy players like the USA, England and the EU countries are securing vaccines for themselves. These countries and people are buying their way to the front of the queue.

Norway has greater internal health inequities than many other European countries. These health discrepancies crop up in all age groups, including among children and young people.

The survey indicates that these differences have increased over time. The gap in men's life expectancy between districts in Oslo has now grown to eight years.

In general, new health technology offers more opportunities for better treatment of patients and a more efficient health care system.

"For example, our smartwatch can send reports about our health directly to our doctor, and we can follow statistics and get recommendations on medication dosages on a phone app," Øversveen says.

These options are great for the folks who can benefit from them. But in practice, not everyone in Norway has equal access to the latest technology.

Øversveen has investigated how patients with diabetes use medical technology. At the same time, he has looked at how doctors and nurses decide who has access to the technology.

"Based on my qualitative research, I see that this patient group generally experiences that diabetes technology is difficult to access in the Norwegian health care system," says Øversveen.

But access isn't equally difficult for everyone. Patients' social characteristics and status play an important role when healthcare professionals decide who gets to use the technology.

"The resourceful patients learn how they can 'buy their way in' as qualified, active and competent patients who 'deserve' the expensive technology," Øversveen says.

His work shows how the development, distribution and use of medical technology can contribute to creating and maintaining social health inequities in the public health care system.

"This is new and important knowledge about a large and global societal problem," says Øversveen.

Social inequalities in health care access exist in many countries, if not all. The study shows systematic links between health status and social position, leaving many people worse off.

"Social health inequities are unfair and represent a loss for individuals, families and society," says Øversveen.

CHAIN is an interdisciplinary centre at NTNU that studies global health inequalities.

###

As cases spread across the US last year, a pattern emerged suggesting a link between governors' party affiliation and COVID-19 case and death numbers

Analysis finds that, starting in early summer last year, states with Republican governors had higher case and death rates

JOHNS HOPKINS UNIVERSITY BLOOMBERG SCHOOL OF PUBLIC HEALTH

Research News

The per-capita rates of new COVID-19 cases and COVID-19 deaths were higher in states with Democrat governors in the first months of the pandemic last year, but became much higher in states with Republican governors by mid-summer and through 2020, possibly reflecting COVID-19 policy differences between GOP- and Democrat-led states, according to a study led by researchers at the Johns Hopkins Bloomberg School of Public Health and the Medical University of South Carolina.

For their study, the researchers analyzed data on SARS-CoV-2-positive nasal swab tests, COVID-19 diagnoses, and COVID-19 fatalities, for the 50 U.S. states and the District of Columbia. After adjusting for confounding factors such as state population density, they found that Republican-governed states began to have consistently higher rates of positive swab tests in May, of COVID-19 diagnoses in June, and of COVID-19 mortality in July.

The results, published online March 10 in the American Journal of Preventive Medicine, suggest that policy differences between Republican- and Democrat-governed states, including mitigation measures such as mask mandates and social distancing requirements, may have led to systematic differences in COVID-19's impact on public health, the researchers say.

"Governors' party affiliation may have contributed to a range of policy decisions that, together, influenced the spread of the virus," says study senior author Sara Benjamin-Neelon, PhD, professor in the Bloomberg School's Department of Health, Behavior and Society. "These findings underscore the need for state policy actions that are guided by public health considerations rather than by partisan politics."

The analysis covered March 15 to December 15, 2020, and included the number of SARS-CoV-2 tests, positive tests, COVID-19 case diagnoses, and COVID-19 fatalities. The researchers used a sophisticated statistical tool called a Bayesian negative binomial model to estimate, for each day in the nine-month study window, the relative risks or chances of getting tested, testing positive, getting COVID-19, or dying of COVID-19, for people in 26 GOP-governed vs. 25 Democrat-governed states. Washington, D.C. was treated as Democrat-governed.
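The paper's actual model is richer than this summary suggests (it includes time-varying effects and state-level confounders such as population density), but a minimal sketch of the core idea (a Bayesian negative binomial regression of daily case counts on governor party, with a log-population offset) might look like the following in Python with PyMC. The synthetic data and all variable names are illustrative assumptions, not the authors' code or data.

```python
# Minimal sketch (not the authors' code): Bayesian negative binomial regression
# of daily COVID-19 case counts on governor party, with a log-population offset.
import numpy as np
import pymc as pm

rng = np.random.default_rng(42)
n_states = 51
gop = rng.integers(0, 2, n_states)               # 1 = Republican-governed, 0 = Democrat-governed
population = rng.uniform(0.6e6, 39e6, n_states)  # synthetic state populations
true_rate = 30e-6 * np.exp(0.4 * gop)            # synthetic cases per person per day
cases = rng.negative_binomial(n=10, p=10 / (10 + true_rate * population))

with pm.Model() as model:
    beta0 = pm.Normal("beta0", mu=0.0, sigma=5.0)        # baseline log incidence
    beta_gop = pm.Normal("beta_gop", mu=0.0, sigma=1.0)   # log relative risk, GOP- vs Democrat-governed
    alpha = pm.Exponential("alpha", 1.0)                  # negative binomial dispersion
    mu = pm.math.exp(beta0 + beta_gop * gop + np.log(population))
    pm.NegativeBinomial("cases", mu=mu, alpha=alpha, observed=cases)
    idata = pm.sample(1000, tune=1000, progressbar=False)

# Posterior relative risk of a daily case in GOP- vs Democrat-governed states
rr = np.exp(idata.posterior["beta_gop"].values)
print(f"posterior mean relative risk: {rr.mean():.2f}")
```

In the study itself this comparison was made for each day of the March-December window, which is how the authors identified the dates when the relative risks crossed above 1 for Republican-governed states.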

The researchers were aware that many other factors, including the natural progression of the pandemic from early waves in urban areas, such as New York City and Seattle, to later waves in rural areas, might have contributed to differences between Republican- and Democrat-led states. However, they attempted to correct for these confounding factors in their analysis.

Their findings, even when factoring in these confounders, revealed a clear pattern in which Democrat-led states were hardest-hit early in the pandemic, but after a few months Republican-led states on average began to have more positive tests, more COVID-19 cases, and more COVID-19 deaths. The transition occurred for test positivity on May 30, for COVID-19 case diagnoses on June 3, and for COVID-19 deaths on July 4. The differences between the two groups of states peaked in the period from late June to early August -- for example, on August 5 the relative risk of dying of COVID-19 was 1.8 times higher in GOP-led states.

Testing rates were similar for the two sets of states until late September when Republican-led states began to have lower testing rates.

Other studies have found evidence that Republican governors in 2020 were broadly less strict than their Democrat counterparts in setting policies on mask-wearing, social distancing, and other pandemic-related measures. The researchers say that those studies, along with the links they found between Republican governorship and greater COVID-19 impact, are consistent with the idea that the political polarization of the COVID-19 response has contributed to less effective COVID-19 policies and worse case-related statistics in some states.

"Despite a more coordinated federal response this year, governors still play a key role in the pandemic response," says Benjamin-Neelon. "As we're seeing, several states have lifted mask requirements even though we have yet to make substantial progress in controlling the spread of the virus."

###

Brian Neelon, PhD, professor in the Department of Public Health Sciences at the Medical University of South Carolina, is the paper's lead author.

"Associations between governor political affiliation and COVID-19 cases, deaths, and testing in the United States" was written by Brian Neelon, Fedelis Mutiso, Noel Mueller, John Pearce, and Sara Benjamin-Neelon.

Manure improves soil and microbe community

AMERICAN SOCIETY OF AGRONOMY

Research News

IMAGE: SOME OF THE RESEARCH PASTURES INCLUDED WARM-SEASON OLD WORLD BLUESTEM GRASS (LEFT) MIXED WITH LEGUMES INCLUDING ALFALFA (PURPLE BLOOMS, CENTER) AND YELLOW SWEETCLOVER (YELLOW BLOOMS, LOWER LEFT). THE LEGUMES PROVIDE...

CREDIT: LINDSEY SLAUGHTER

In the dry air and soil of Texas' Southern High Plains, improving soil health can be tough. We usually think of healthy soil as moist and loose with lots of organic matter. But this can be hard to achieve in this arid area of Texas.

Lindsey Slaughter, a member of the Soil Science Society of America, set out with her fellow researchers to test a solution that kills two birds with one stone. They put excess cow manure on these soils to see if they could make them healthier.

The team recently published their research in the Soil Science Society of America Journal.

"We know that planting perennial grasslands for cattle production can help protect and restore soil in semi-arid lands that are likely to erode and degrade from intense farming," Slaughter says. "But producers need additional ways to increase soil carbon and nutrient stores."

What makes a healthy or unhealthy soil?

Slaughter describes soil health as the ability of a living soil ecosystem to perform a variety of important functions. These include cycling nutrients, storing and purifying water, helping plants and animals, and more.

This "living" part is made up of various microorganisms that help a soil be healthy. They, for example, help break down materials like manure so that it and its nutrients become part of the soil.

"Improving the soil's ability to perform these roles and support plant and animal life is our target for soil health," Slaughter says. "Adding the manure can provide a boost of material that can be incorporated into soil organic matter. This helps provide a stronger foundation for more microbial activity and nutrient cycling."

To look into this, the researchers applied a low, one-time amount of manure to two types of pastures: grass-only pastures that were fertilized occasionally, and unfertilized pastures planted with a mix of grass and legumes.



CAPTION

The researchers used excess cattle manure from local producers. Here a donor deposits composted cattle manure prior to spreading it over the field sites.

CREDIT

Paul Green

Manure helps, but results take time

Overall, they did find that manure helped increase soil organic carbon and the number of microbes in the soil. These are two important characteristics of a healthy soil.

It took almost a year and a half to see these changes, although they say this is not totally surprising.

"This tells us that it can take a long time for even a little added compost to become incorporated into the soil organic matter of semi-arid grasslands, but it definitely helps," Slaughter explains.

"We think this is mostly due to the dry climate at our study site," says Slaughter. "We commonly get little rainfall per year. The microbial community was not able to work quickly or efficiently to decompose the manure without water."

Their results also showed that the pastures receiving fertilizer responded better to the manure. They believe this is because the nitrogen in the fertilizer helped the microbes decompose the manure better.

"Microbes help directly with releasing nutrients from organic material in a form that plants can use, as well as decomposing those residues to build soil organic matter," Slaughter says. "A lot of work has been done on how this can help improve cropping systems. However, we wanted to also test this on forage pastures."

Slaughter adds that the next steps in this work include testing whether more manure or multiple applications would produce faster results. In addition, the team hopes to investigate whether irrigation or fertilizer would help incorporate the manure faster.

"We need more research along these lines to help us design strategies that quickly and effectively increase soil health and productivity in these grasslands," she says. "This helps farmers save money on nutrients and amendments while building soil organic matter and nutrient cycling capacity. This also saves them water and protects against soil degradation."

###

Lindsey Slaughter is an assistant professor at Texas Tech University. Funding for this work was provided by the Southern Sustainable Agriculture Research and Education and the USDA's National Institute of Food and Agriculture. Funding and student support was also provided by the Department of Plant and Soil Science in the College of Agricultural Sciences and Natural Resources at Texas Tech University.


CAPTION

Farm manager Paul Green of Texas Tech University anchors tarps in the pasture just before compost was spread onto the fields. This created treatment areas in each field where compost application was excluded.

CREDIT

Phil Brown

'Lost' ocean nanoplastic might be getting trapped on coasts

AMERICAN CHEMICAL SOCIETY

Research News

As plastic debris weathers in aquatic environments, it can shed tiny nanoplastics. Although scientists have a good understanding of how these particles form, they still don't have a good grasp of where all the fragments end up. Now, researchers reporting in ACS' Environmental Science & Technology have shown experimentally that most nanoplastics in estuarine waters can clump, forming larger clusters that either settle or stick to solid objects, instead of floating on into the ocean.

There is a huge discrepancy between the millions of tons of plastic waste entering rivers and streams and the amount researchers have found in the oceans. As large pieces of plastic break apart into successively smaller fragments on their transit to the sea, some eventually wear down to nano-sized particles. Previous studies have shown that these nanoplastics congregate in well-mixed, stagnant salty water. Yet these results do not apply when the particles encounter dynamic changes in salt content, such as in estuaries, where rivers carrying freshwater meet tidal saltwater. So, Hervé Tabuteau, Julien Gigault and colleagues wanted to perform laboratory experiments with micro-sized chambers mimicking the conditions measured in an estuary to show how nanoplastics interact and aggregate in this type of environment.

To determine how nanoplastics move in estuarine waters, the team developed a lab-on-a-chip device. They introduced crushed 400-nm-wide polystyrene beads and freshwater into one side of the device, while injecting saltwater through another inlet. At the opposite end of the 1.7-cm-long device, the researchers collected the output. The team tested different flow rates, replicating the salt gradient and water movement they measured in an estuary on the French Caribbean island of Guadeloupe. Nanoplastic aggregates up to 10-μm wide were detected within the zone of highest salt concentration in the flow chamber, regardless of how fast the water was moving. At the highest flow rate, only 12% of the nanoplastics were collected in the outlets; the remaining particles either clumped together and sank in the flow chamber or formed floating aggregates that stuck to the chamber's sides. The researchers say their results show estuaries and other coastal environments may filter out nanoplastics before they can enter the ocean.

###

The authors acknowledge funding from the French Agency for Research.


VEGAN LEATHER

The quest for sustainable leather alternatives

AMERICAN CHEMICAL SOCIETY

Throughout history, leather has been a popular material for clothes and many other goods. However, the tanning process and use of livestock mean that it has a large environmental footprint, leading consumers and manufacturers alike to seek out alternatives. An article in Chemical & Engineering News (C&EN), the weekly newsmagazine of the American Chemical Society, details how sustainable materials are giving traditional leather a run for its money.

Traditional leather goods are known for their durability, flexibility and attractive finish, with a global market worth billions, writes Associate Editor Craig Bettenhausen. Despite leather's popularity, the modern tanning process uses potentially harmful chemicals and creates a large amount of wastewater. In addition, most hides come from the meat and dairy industries, which have sustainability problems. Leather-like materials, often called vegan leather, are gaining traction among high-end manufacturers, defying the negative perceptions of synthetic "pleather." These leather alternatives are made from an array of base materials, including plants, mushrooms and even fish skin, each with a unique take on sustainable production.

Plant-based materials are currently the most advanced leather mimics because of their straightforward manufacturing process, which combines inexpensive natural fibers and polymers that are rolled into sheets. A company based in Mexico has created a leather made from prickly pear cactus, which is ground into a powder and combined with a biobased polyurethane. Mushroom leather mimics the texture of cowhide leather very well, but production needs to scale up substantially to make an impact. Although not a vegan alternative, fish skin is poised to replace exotic leathers such as snake and alligator skin. Cell-culture leather is also in early development, which could disrupt the traditional leather market even further. Experts are confident that these materials are viable alternatives, and manufacturers plan to scale up their efforts going forward.

###

The paper, "Sustainable Materials Make a Play for the Vegan Leather Market," is freely available here.

University of Minnesota scientists discover attacking fungi that show promise against emerald ash borer

UNIVERSITY OF MINNESOTA

Research News

IMAGE: THROUGH DNA SEQUENCING, SCIENTISTS IDENTIFIED FUNGAL ISOLATES AND REVEALED A DIVERSE ASSEMBLAGE OF FUNGI. THIS INCLUDED ENTOMOPATHOGENIC FUNGI THAT ATTACK INSECTS, AS WELL AS OTHER FUNGI THAT CAUSE CANKERS --...

CREDIT: N/A

Since its introduction, the emerald ash borer (EAB) has become the most devastating invasive forest insect in the United States, killing hundreds of millions of ash trees at a cost of hundreds of millions of dollars.

Now, new research from the University of Minnesota's Minnesota Invasive Terrestrial Plants and Pests Center (MITPPC) shows a possible path forward in controlling the invasive pest that threatens Minnesota's nearly one billion ash trees.

In a recent study published in Fungal Biology, MITPPC researchers identified various fungi living in EAB-infested trees -- a critical first step in finding fungi that may be harnessed to control the spread of EAB, and ultimately, prevent ash tree death.

"We discovered that several different species of fungi attack EAB and other insects, and they can now be further tested for their potential for biocontrol," said Robert Blanchette, the study's project leader and professor in the College of Food, Agricultural and Natural Resource Sciences. "This is a very important first step in the search for a biocontrol for emerald ash borer."

Larval EAB feed just beneath the bark, leaving behind tunnel galleries that can stretch up to 20 inches long. Beneath the surface, fungi -- some of which may be capable of parasitizing the EAB -- may be carried by the larvae as they develop, or may enter the tree through the tunnel galleries. Some of these fungi also seriously affect urban trees, causing rapid wood decay that results in hazardous trees.

From Rochester to Duluth, researchers gathered samples where ash is affected by EAB. Through DNA sequencing, scientists identified fungal isolates and revealed a diverse assemblage of fungi. This included entomopathogenic fungi that attack insects, as well as other fungi that cause cankers -- which help EAB kill trees -- and some that cause wood decay.

"Before now, we simply haven't been sure what fungi are associated with EAB infestations in Minnesota. This project identified those species and, in doing so, opened up new possibilities for managing one of our state's most devastating tree pests," said Ben Held, the study's lead author and researcher in the College of Food, Agricultural and Natural Resource Sciences.

As research continues, the scientists will build on the work from this study to determine if any of the fungi can be used to kill the emerald ash borer. Results will also be of value in helping control the insect in other parts of North America where EAB is found.

"Ash trees are vitally important to Minnesota," said Rob Venette, MITPPC director. "They reduce air pollution, storm water runoff, and cooling costs, all while increasing property values in local communities. It's critical we work to protect them from this invasive pest."

###

The research was funded through the Minnesota Environment and Natural Resources Trust Fund.

New IceCube detection proves 60-year-old theory

MICHIGAN STATE UNIVERSITY

Research News

On December 6, 2016, a high-energy particle called an electron antineutrino was hurtling through space at nearly the speed of light. Normally, the ghostly particle would zip right through the Earth as if it weren't even there.

But this particle just so happened to smash into an electron deep inside the South Pole's glacial ice. The collision created a new particle, known as the W- boson. That boson quickly decayed, creating a shower of secondary particles.

The whole thing played out in front of the watchful detectors of a massive telescope buried in the Antarctic ice, the IceCube Neutrino Observatory. This enabled IceCube to make the first ever detection of a Glashow resonance event, a phenomenon predicted 60 years ago by Nobel laureate physicist Sheldon Glashow.

This detection provides the latest confirmation of the Standard Model, the name of the particle physics theory explaining the universe's fundamental forces and particles.

"Finding it wasn't necessarily a surprise, but that doesn't mean I wasn't very happy to see it," said Claudio Kopper, an associate professor in Michigan State University's Department of Physics and Astronomy in the College of Natural Science. Kopper and his departmental colleague, assistant professor Nathan Whitehorn, lead IceCube's Diffuse and Atmospheric Flux Working Group behind the discovery.

The international IceCube Collaboration published this result online on March 11 in the journal Nature.

"Even three years ago, I didn't think IceCube would be able to make this measurement, or at least as well as we did," Whitehorn said.

IMAGE: A 3-D plot with columns of green, blue, yellow and orange spheres and other round shapes gives a visual representation of the Glashow resonance event detection.

This detection further demonstrates the ability of IceCube, which observes nearly massless particles called neutrinos using thousands of sensors embedded in the Antarctic ice, to do fundamental physics.

Although the Spartans lead the working group, they emphasized that this discovery was a team effort, powered by the paper's three lead analysts: Lu Lu, an assistant professor at University of Wisconsin-Madison; Tianlu Yuan, an assistant scientist at the Wisconsin IceCube Particle Astrophysics Center, or WIPAC; and Christian Haack, a postdoc at the Technical University of Munich.

"We lead weekly meetings, we talk about how the work is done, we ask hard questions," said Kopper. "But without the people doing the actual analysis, we wouldn't have anything."

"Our job is to be the doubters-in-chief," Whitehorn said. "The lead authors did a great job convincing everyone that this event was a Glashow resonance."

The particle physics community has been anticipating such a detection, but Glashow resonance events are extremely rare by nature and technologically challenging to detect.

"When Glashow was a postdoc at Niels Bohr, he could never have imagined that his unconventional proposal for producing the W- boson would be realized by an antineutrino from a faraway galaxy crashing into Antarctic ice," said Francis Halzen, professor of physics at the University of Wisconsin-Madison, the headquarters of IceCube maintenance and operations, and principal investigator of IceCube.

A Glashow resonance event requires an electron antineutrino with a cosmic amount of energy -- at least 6.3 peta-electronvolts, or PeV. For comparison, that's about 1,000 times more energy than that of the most energetic particles produced by the Earth's most powerful particle accelerators.
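The 6.3 PeV figure can be recovered with a short back-of-the-envelope calculation (the W boson and electron masses below are standard particle-physics values, not numbers quoted in the release): producing an on-shell W- boson when an antineutrino strikes an electron that is essentially at rest requires

\[
E_{\bar{\nu}} \approx \frac{m_W^2 c^4}{2\, m_e c^2} \approx \frac{(80.4\ \mathrm{GeV})^2}{2 \times 0.511\ \mathrm{MeV}} \approx 6.3 \times 10^{15}\ \mathrm{eV} = 6.3\ \mathrm{PeV}.
\]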

Since IceCube started fully operating in 2011, it has detected hundreds of high-energy neutrinos from space. Yet the neutrino in December 2016 was only the third with an energy higher than 5 PeV.

And simply having a high-energy neutrino is not sufficient to detect a Glashow resonance event. The neutrino then has to interact with matter, which is not a guarantee. But IceCube encompasses quite a bit of matter in the form of Antarctic ice.

IMAGE: The IceCube Laboratory, lighted red against the night sky, is small in this landscape photograph of the South Pole's white tundra, which also captures yellow stars and light green auroras.

The observatory's detector array has been built into the ice, spanning nearly 250 acres with sensors reaching up to about a mile deep. All told, IceCube boasts a cubic kilometer of coverage, watching over a billion metric tons of extremely clear ice.

That's what it takes to detect neutrinos, along with a team of scientists who have the skill and determination to spot rare events.

IceCube's more than 5,000 detectors take in a tremendous firehose of light, Whitehorn said. Detecting the Glashow resonance meant researchers had to pick out a handful of telltale photons, individual particles of light, from that firehose spray.

"This is some of the most impressive technical work I've ever seen," Whitehorn said, calling the team unstoppable over the years-long effort to confirm this was a Glashow resonance event.

Making the work even more impressive was the fact that the lead authors -- Lu, Yuan and Haack -- were in three countries on three different continents during the analysis. Lu was a postdoc at Chiba University in Japan, Yuan was at WIPAC in the U.S. and Haack was a doctoral student at Rheinisch-Westfälische Technische Hochschule Aachen University in Germany.

"It was amazing to me just seeing that that is possible," Kopper said.

But this is very much in keeping with the ethos of IceCube, an observatory built on international collaboration. IceCube is operated by a group of scientists, engineers and staff from 53 institutions in 12 countries, together known as the IceCube Collaboration. The project's headquarters is WIPAC, a research center of UW-Madison in the United States.

To confirm the detection and usher in a new chapter of neutrino astronomy, the IceCube Collaboration is working to detect more Glashow resonances. And they need IceCube-Gen2, a proposed expansion of the IceCube detector, to make it happen.

"We already know that the astrophysical spectrum does not end at 6 PeV," Lu said. "The key is to detect more Glashow resonance events and to identify the sources that accelerate those antineutrinos. IceCube-Gen2 will be key to making such measurements in a statistically significant way."

Glashow himself echoed that sentiment about validation. "To be absolutely sure, we should see another such event at the very same energy as the one that was seen," said Glashow, now an emeritus professor of physics at Boston University. "So far there's one, and someday there will be more."

###

The IceCube Neutrino Observatory is funded primarily by the National Science Foundation and is operated by a team headquartered at the University of Wisconsin-Madison. IceCube's research efforts, including critical contributions to the detector operation, are funded by agencies in Australia, Belgium, Canada, Denmark, Germany, Japan, New Zealand, Republic of Korea, Sweden, Switzerland, the United Kingdom, and the United States. IceCube construction was also funded with significant contributions from the National Fund for Scientific Research -- the FNRS and FWO -- in Belgium; the Federal Ministry of Education and Research and the German Research Foundation in Germany; the Knut and Alice Wallenberg Foundation, the Swedish Polar Research Secretariat, and the Swedish Research Council in Sweden; and the Department of Energy and the University of Wisconsin-Madison Research Fund in the U.S.

(Note for media: Please include a link to the original paper in online coverage: https://dx.doi.org/10.1038/s41586-021-03256-1)

Tracking cosmic ghosts

Frontera supercomputer, a community resource for IceCube Neutrino Observatory research, enables the discovery of a new high-energy particle event

UNIVERSITY OF TEXAS AT AUSTIN, TEXAS ADVANCED COMPUTING CENTER

Research News

IMAGE: A VISUALIZATION OF THE GLASHOW EVENT RECORDED BY THE ICECUBE DETECTOR. EACH COLORED CIRCLE SHOWS AN ICECUBE SENSOR THAT WAS TRIGGERED BY THE EVENT; RED CIRCLES INDICATE SENSORS TRIGGERED EARLIER...

CREDIT: ICECUBE COLLABORATION

The idea was so far-fetched it seemed like science fiction: create an observatory out of a one cubic kilometer block of ice in Antarctica to track ghostly particles called neutrinos that pass through the Earth. But to Benedikt Riedel, global computing manager at the IceCube Neutrino Observatory, it makes perfect sense.

"Constructing a comparable observatory anywhere else would be astronomically expensive," Riedel explained. "Antarctica ice is a great optical material and allows us to sense neutrinos as nowhere else."

Neutrinos are neutral subatomic particles with a mass close to zero that can pass through solid materials at near the speed of light, rarely reacting with normal matter. They were first detected in the 1950s in experiments that operated near nuclear reactors, which also generate these particles. They were further found to be created by cosmic rays interacting with our atmosphere. But astrophysicists believed they were likely widespread and caused by a variety of cosmic events, if only they could be detected.

Importantly, scientists believed they could be critical clues to other phenomena. "20 percent of the potentially visible Universe is dark to us," Riedel explained. "That's mostly because of distances and the age of the Universe. High energy light is also hidden. It is absorbed or undergoes transformation that makes it hard to trace back to a source. IceCube reveals a slice of the Universe we haven't yet observed."

An Important New Tool in the Multi-Messenger Astronomy Toolbox

Multi-messenger astronomy describes an approach that combines observations of light, gravitational waves, and particles to understand some of the most extreme events in the Universe. Neutrinos play an important part in this type of research.

Prior to 1987, when Supernova 1987A exploded, all extra-solar astronomical observations were photon-based. Today, additional detection systems add to our view of the cosmos, including all-sky surveys and gravitational wave detectors. However, most observatories can only look at a small portion of the sky. IceCube, because of the nature of neutrinos, can observe these particles arriving from any direction, and therefore acts as a full-sky sentinel.

The block of ice at the Amundsen-Scott South Pole Station in Antarctica -- up to a hundred thousand years old and extremely clear -- is instrumented with sensors between 1,450 and 2,450 meters below the surface. As neutrinos pass through the ice, they may interact with a proton or neutron, producing photons that travel through the ice and can be detected by a sensor. The sensors transform these signals from neutrino interactions -- a handful an hour -- into digital data that is then analyzed to determine whether they represent a local source (Earth's atmosphere) or a distant one.

"Based on the analysis, researchers are also able to determine where in the sky the particle came from, its energy, and sometimes, what type of neutrino -- electron, muon or tau -- it was," said James Madson, executive director at the Wisconsin IceCube Particle Astrophysics Center.

In 2017, IceCube detected a neutrino with an energy of 290 teraelectronvolts (TeV) and sent out an alert. The detection triggered an extensive campaign involving more than twenty space- and ground-based telescopes. They identified a blazar 3.5 billion light-years away, pinpointing a high-energy cosmic ray source for the first time and launching a new era in multi-messenger detection, according to Riedel.

"We continuously search our dataset in near-real time for interesting neutrino events," he explained. "We found one and sent out an email alert to the community. They followed up with all these other electromagnetic observations, pinpointing a known gamma ray source. They also found, over the course of a month, an increased activity from the source."


CAPTION

The IceCube Neutrino Observatory is the first detector of its kind, designed to observe the cosmos from deep within the South Pole ice. An international group of scientists responsible for the scientific research makes up the IceCube Collaboration.

CREDIT

Yuya Makino, IceCube/NSF

IceCube Discovers Evidence of High-energy Electron Antineutrino

On March 10, 2021, IceCube announced the detection of a Glashow resonance event, a phenomenon predicted by Nobel laureate physicist Sheldon Glashow in 1960. The Glashow resonance describes the formation of a W- boson -- an elementary particle that mediates the weak force -- during the interaction of a high-energy electron antineutrino with an electron, peaking at an antineutrino energy of 6.3 petaelectronvolts (PeV). Its existence is a key prediction of the Standard Model of particle physics. The result, published on March 10 in Nature, further demonstrates the ability of IceCube to do fundamental physics.

While this energy scale is out of reach for current and future planned particle accelerators, natural astrophysical phenomena are expected to produce antineutrinos that reach beyond PeV energies. The Glashow resonance discovery "suggests the presence of electron antineutrinos in the astrophysical flux, while also providing further validation of the standard model of particle physics," the authors wrote. "Its unique signature indicates a method of distinguishing neutrinos from antineutrinos, thus providing a way to identify astronomical accelerators that produce neutrinos via hadronuclear or photohadronic interactions, with or without strong magnetic fields."

Neutrino detections require significant computing resources to model the detector behavior and differentiate extra-solar signals from background events created from cosmic ray interactions in the atmosphere. Riedel serves as the coordinator for a large community of researchers -- as many as 300 by his estimates -- who use the Frontera supercomputer at the Texas Advanced Computing Center (TACC), a National Science Foundation (NSF)-funded resource for the national community.

IceCube was awarded time on Frontera as part of the Large Scale Community Partnership track, which provides extended allocations of up to three years to support long-lived science experiments. IceCube -- which has collected data for 14 years and was recently awarded a grant from NSF to expand operations over the next few years -- is a premier example of such an experiment.

"Part of the resources from Frontera contributed to that discovery," Riedl said. "There's years of Monte Carlo simulations that went into it to figuring out that we could do this."

IceCube uses computing resources from a number of sources, including the Open Science Grid, the Extreme Science and Engineering Discovery Environment (XSEDE), their own local supercomputing cluster, and recently the Amazon Web Services cloud. Frontera is the largest system utilized, however, and can handle a large part of the computational needs of the neutrino community, reserving local or cloud resources for urgent analyses, Riedel says.

"A lot of the computing on Frontera may not be directly associated with discoveries, but it helps down the road, to discern signals better and develop new algorithms," he said.

Modeling Ice and Following Up on Promising Signals

The projects that IceCube scientists use Frontera for vary, but they typically either involve calculations to better understand the optical nature of the ice generally (so the trajectory and other characteristics of neutrino detections can be accurately determined); or computations to analyze specific events that are deemed significant.

The first type of computation primarily uses ray tracing to calculate the path of light through the ice from the high-energy, electrically charged particles produced when neutrinos interact. The rays can scatter or be absorbed by defects in the ice, complicating the analysis. Using graphics processing units (GPUs), Riedel has found, can speed up the simulations of light propagation in the ice by hundreds of times. The IceCube team is among the largest users of the Frontera GPU subsystem, which includes NVIDIA RTX GPUs.
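To make the GPU point concrete, here is a toy Monte Carlo of photon propagation in ice. It is a sketch only, with assumed scattering and absorption lengths, not IceCube's production simulation. Because every photon is independent, the vectorized step loop below is exactly the kind of work that parallelizes naturally across thousands of GPU threads.

```python
# Toy Monte Carlo of photon propagation in ice (illustrative only). Each photon
# takes exponentially distributed steps between scatters and may be absorbed
# along the way; the loop body is vectorized over all photons at once.
import numpy as np

rng = np.random.default_rng(0)

N_PHOTONS = 100_000
SCATTER_LEN = 25.0    # metres, assumed effective scattering length
ABSORB_LEN = 100.0    # metres, assumed absorption length
N_STEPS = 20

pos = np.zeros((N_PHOTONS, 3))
direction = np.tile([0.0, 0.0, 1.0], (N_PHOTONS, 1))   # all photons start travelling in +z
alive = np.ones(N_PHOTONS, dtype=bool)

for _ in range(N_STEPS):
    step = rng.exponential(SCATTER_LEN, N_PHOTONS)
    pos += alive[:, None] * direction * step[:, None]
    # a photon survives the step with probability exp(-step / ABSORB_LEN)
    alive &= rng.random(N_PHOTONS) < np.exp(-step / ABSORB_LEN)
    # isotropic re-scatter (real ice models use a strongly forward-peaked phase function)
    phi = rng.uniform(0.0, 2.0 * np.pi, N_PHOTONS)
    cos_theta = rng.uniform(-1.0, 1.0, N_PHOTONS)
    sin_theta = np.sqrt(1.0 - cos_theta**2)
    direction = np.stack([sin_theta * np.cos(phi),
                          sin_theta * np.sin(phi),
                          cos_theta], axis=1)

print(f"{alive.mean():.1%} of photons survive {N_STEPS} scattering steps")
print(f"mean distance of survivors from origin: "
      f"{np.linalg.norm(pos[alive], axis=1).mean():.1f} m")
```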

The second type of computation occurs when scientists receive an alert that says they have received an interesting signal. "We kick off a calculation to analyze the event that can scale to one million CPUs," Riedel said. "We don't have those, so Frontera can give us a portion of that computational power to run a reconstruction or extraction algorithm. We get those types of events about once a month."

"Large scale simulations of the IceCube facility and the data it creates allow us to rapidly and accurately determine the properties of these neutrinos, which in turn exposes the physics of the most energetic events in the universe," said Niall Gaffney, TACC Director of Data Intensive Computing. "This is key to validating the fundamental quantum-mechanical physics in environments that cannot be practically replicated on earth."

Today's astronomers can observe the universe in many different ways, and computing is now central to almost all of them. "We've moved from the traditional view of a guy with a telescope looking up at the sky, to large scale instruments, to now particle physics and particle observatories," Riedel said. "With this new paradigm, we need large amounts of computing for short periods of time to do big time-sensitive computing, and big scientific computing centers like TACC help us do our science."


CAPTION

Flags outside of IceCube represent the international collaboration of the project.

CREDIT

Yuya Makino, IceCube/NSF



Making decisions based on how we feel about memories, not accuracy

UNIVERSITY OF CALIFORNIA - DAVIS

Research News

When we recall a memory, we retrieve specific details about it: where, when, with whom. But we often also experience a vivid feeling of remembering the event, sometimes almost reliving it. Memory researchers call these processes objective and subjective memory, respectively. A new study from the Center for Mind and Brain at the University of California, Davis, shows that objective and subjective memory can function independently, involve different parts of the brain, and that people base their decisions on subjective memory -- how they feel about a memory -- more than on its accuracy.

"The study distinguishes between how well we remember and how well we think we remember, and shows that decision making depends primarily on the subjective evaluation of memory evidence," said co-author Simona Ghetti, professor at the UC Davis Department of Psychology and Center for Mind and Brain. The work is published March 9 in the journal eLife.

Postdoctoral researcher Yana Fandakova, now an investigator at the Max Planck Institute for Human Development in Berlin, graduate student Elliott Johnson and Ghetti tested objective and subjective memory. After showing volunteers a series of images of common objects, the researchers showed them pairs of images and asked them to determine which of the two they had seen before. The volunteers were asked to rate the memory as "recollected," if they experienced it as vivid and detailed, or as "familiar" if they felt that the memory lacked detail. In some of the tests, image pairs included a target image and a similar image of the same object. In others, the target was shown with an unrelated image from the same original set. For example, a chair might be shown with another chair shown from a different angle, or with an apple.

This experimental design allowed the researchers to score objective memory by how well the volunteers recalled previously seeing an image, and subjective memory by how they rated their own memory as vividly recollected or merely familiar. Finally, participants were asked to select which images to keep or discard, assigning them to a treasure chest or trash bin.
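As a rough illustration of that scoring scheme (the trial records, field names and numbers below are hypothetical, not the study's data or analysis code), objective accuracy and the subjective recollection rate can be tallied separately for each lure type, and the keep/trash decisions can then be compared against the subjective rating:

```python
# Hypothetical trial records: each entry is
# (chose_correct_image, rating, lure_type, kept_in_treasure_chest)
trials = [
    (True,  "recollected", "dissimilar", True),
    (False, "recollected", "dissimilar", True),
    (True,  "recollected", "dissimilar", True),
    (True,  "familiar",    "similar",    False),
    (True,  "familiar",    "similar",    True),
    (False, "familiar",    "similar",    False),
]

def objective_accuracy(rows):
    """Fraction of trials where the previously seen image was chosen."""
    return sum(r[0] for r in rows) / len(rows)

def recollection_rate(rows):
    """Fraction of trials rated as vividly 'recollected' (subjective memory)."""
    return sum(r[1] == "recollected" for r in rows) / len(rows)

for lure in ("similar", "dissimilar"):
    subset = [r for r in trials if r[2] == lure]
    print(f"{lure:>10}: objective accuracy {objective_accuracy(subset):.2f}, "
          f"recollection rate {recollection_rate(subset):.2f}")

# Did the keep/trash decision track the subjective rating?
for rating in ("recollected", "familiar"):
    subset = [r for r in trials if r[1] == rating]
    print(f"kept when rated {rating}: {sum(r[3] for r in subset) / len(subset):.2f}")
```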

The team also used functional MRI to measure brain activity during this task.

Scoring objective and subjective memory

The results showed higher levels of objective memory when participants were tested with pairs of similar images. But people were more likely to claim that they remembered vividly when looking at pairs of dissimilar images.

Participants were more likely to base their decision about whether to keep or trash an image on how they felt about a memory rather than its objective accuracy.

To give a real-world example, a person could have a vivid memory of going to an event with friends. Some of the actual details of that memory might be a bit off, but they may feel it is a vivid memory, so they might decide to go out with the same people again (after the pandemic).

On the other hand, if someone has learned to use similar power tools doing odd jobs around the house, their memories about those objects may be quite specific.

"But you might still feel that you are not recalling vividly because you might question whether you are remembering the right procedure about the right tool. So, you may end up asking for help instead of relying on your memory," Ghetti said.

The fMRI data showed that objective and subjective memory recruited distinct cortical regions in the parietal and prefrontal regions. The regions involved in subjective experiences were also involved in decision making, bolstering the connection between the two processes.

"By understanding how our brains give rise to vivid subjective memories and memory decisions, we are moving a step closer to understanding how we learn to evaluate memory evidence in order to make effective decisions in the future," Fandakova said.

###

The work was supported by the James S. McDonnell Foundation.

Study finds lower dose of ketamine equally effective in reducing pain

LOYOLA UNIVERSITY HEALTH SYSTEM

Research News

MAYWOOD, IL - A recent Loyola Medicine study found that half the standard dose of IV-administered ketamine is as effective as the full standard dose in reducing pain in adults.

Ketamine is known to provide pain relief comparable to opioid medications, which are highly addictive. In the recent study, appearing in the journal Academic Emergency Medicine, researchers studied 98 patients, ages 18 to 59, who presented to the emergency department with acute, moderate to severe pain. The patients were randomized prospectively to receive either 0.15 mg/kg of ketamine (low dose) or 0.30 mg/kg (high dose). Patients and providers were blinded to dose, with the primary outcome of pain measured on the 11-point numerical rating scale (NRS) at 30 minutes. At 15 minutes, the high dose group had a greater decrease in pain on the NRS but more adverse events. At 30 minutes, adverse events and pain were similar.
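To put the weight-based doses in absolute terms, a quick calculation shows how small the total amounts are; the 70 kg body weight below is an illustrative assumption, and only the mg/kg figures come from the study.

```python
# Weight-based dosing arithmetic for the two study arms. The 0.15 and 0.30 mg/kg
# doses are from the study; the 70 kg patient weight is an illustrative assumption.
weight_kg = 70.0
for label, dose_mg_per_kg in (("low dose", 0.15), ("high dose", 0.30)):
    total_mg = dose_mg_per_kg * weight_kg
    print(f"{label}: {dose_mg_per_kg} mg/kg x {weight_kg:.0f} kg = {total_mg:.1f} mg IV ketamine")
```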

Patients generally reported that they would take ketamine again for pain: 75.6% in the low-dose group and 61.7% in the high-dose group.

"We challenged the conventional ketamine dose used to treat pain," said lead study author Shannon Lovett, MD, emergency medicine physician at Loyola University Medical Center (LUMC) and associate professor in the Department of Emergency Medicine at Loyola University Chicago Stritch School of Medicine. "Our study should help demonstrate that a lower dose is sufficient to treat pain."

The study did not find a significant reduction in side effects from the lower dose.

"As we continue with our research, we hope to find data that supports diminished side effects with the lower dose of ketamine with equal efficacy in treating pain," said senior study author Megan A. Rech, emergency medicine clinical pharmacist at LUMC and an adjunct assistant professor and research coordinator at Stritch.

###