Friday, May 29, 2020


Smart sponge could clean up oil spills

Highly porous sponge selectively soaks up oil, sparing water and wildlife
NORTHWESTERN UNIVERSITY NEWS RELEASE 

EVANSTON, Ill. -- A Northwestern University-led team has developed a highly porous smart sponge that selectively soaks up oil in water.
With an ability to absorb more than 30 times its weight in oil, the sponge could be used to clean up oil spills inexpensively and efficiently without harming marine life. After the oil is squeezed out, the sponge can be reused dozens of times without losing its effectiveness.
"Oil spills have devastating and immediate effects on the environment, human health and economy," said Northwestern's Vinayak Dravid, who led the research. "Although many spills are small and may not make the evening news, they are still profoundly invasive to the ecosystem and surrounding community. Our sponge can remediate these spills in a more economic, efficient and eco-friendly manner than any of the current state-of-the-art solutions."
The research was published May 27 in the journal Industrial & Engineering Chemistry Research.
Dravid is the Abraham Harris Professor of Materials Science and Engineering at Northwestern's McCormick School of Engineering. Vikas Nandwana, a senior research associate in Dravid's laboratory, is the paper's first author.
Oil spill clean-up is an expensive and complicated process that often harms marine life and further damages the environment. Currently used solutions include burning the oil, using chemical dispersants to break down oil into very small droplets, skimming oil floating on top of the water and/or absorbing it with expensive, unrecyclable sorbents.
"Each approach has its own drawbacks and none are sustainable solutions," Nandwana said. "Burning increases carbon emissions and dispersants are terribly harmful for marine wildlife. Skimmers don't work in rough waters or with thin layers of oil. And sorbents are not only expensive, but they generate a huge amount of physical waste -- similar to the diaper landfill issue."
The Northwestern solution bypasses these challenges by selectively absorbing oil and leaving clean water and unaffected marine life behind. The secret lies in a nanocomposite coating of magnetic nanostructures and a carbon-based substrate that is oleophilic (attracts oil), hydrophobic (resists water) and magnetic. The nanocomposite's nanoporous 3D structure selectively interacts with and binds to the oil molecules, capturing and storing the oil until it is squeezed out. The magnetic nanostructures give the smart sponge two additional functionalities: controlled movement in the presence of an external magnetic field, and remotely triggered release (desorption) of the adsorbed components, such as oil.
The OHM (oleophilic hydrophobic magnetic) nanocomposite slurry can be used to coat any cheap, commercially available sponge. The researchers applied a thin coating of the slurry to the sponge, squeezed out the excess and let it dry. The sponge is quickly and easily converted into a smart sponge (or "OHM sponge") with a selective affinity for oil.
Dravid and his team tested the OHM sponge with many different types of crude oils of varying density and viscosity. The OHM sponge consistently absorbed up to 30 times its weight in oil, leaving the water behind. To mimic natural waves, the researchers put the OHM sponge on a shaker submerged in water. Even after vigorous shaking, the sponge released less than 1% of its absorbed oil back into the water.
"Our sponge works effectively in diverse and extreme aquatic conditions that have different pH and salinity levels," Dravid said. "We believe we can address a giga-ton problem with a nanoscale solution."
"We are excited to introduce such smart sponges as an environmental remediation platform for selectively removing and recovering pollutants present in water, soil and air, such as excess nutrients, heavy metal contaminants, VOC/toxins and others," Nandwana said. "The nanostructure coating can be tailored to selectively adsorb (and later desorb) these pollutants."
The team also is working on another grade of OHM sponge that can selectively absorb (and later recover) excess dissolved nutrients, such as phosphates, from fertilizer runoff and agricultural pollution. Stephanie Ribet, a Ph.D. candidate in Dravid's lab and paper coauthor is pursuing this topic. The team plans to develop and commercialize OHM technology for environmental clean-up.

Smart sponge selectively absorbs oil (on the left) while resisting water (on the right).

###
The research, "OHM Sponge: A versatile, efficient and eco-friendly environmental remediation platform," was supported by the National Science Foundation. Stephanie Ribet, Roberto Reis, Yuyao Kuang and Yash More -- all from Northwestern -- coauthored the paper.
Editor's note: Dravid, Nandwana and Northwestern will have financial interests (equities, royalties) if the technology is commercialized.

Reintroduction of wolves tied to return of tall willows in Yellowstone National Park

OREGON STATE UNIVERSITY
IMAGE: IN YELLOWSTONE'S LAMAR VALLEY, BISON CONGREGATE IN THE SUMMER AND GRAZE NOT ONLY ON GRASS BUT ALSO ON WILLOW, COTTONWOOD, AND ASPEN. THESE BISON ARE EATING WILLOWS IN THE FOREGROUND...
CREDIT: LUKE PAINTER, OREGON STATE UNIVERSITY
CORVALLIS, Ore. - The reintroduction of wolves into Yellowstone National Park is tied to the recovery of tall willows in the park, according to a new Oregon State University-led study.
Wolves were reintroduced to the park in 1995. The new study shows their predation on elk is a major reason for an increase in the height of willows in northern Yellowstone, said Luke Painter, a wildlife ecologist at Oregon State University and lead author on the study.
There's been a debate among scientists over the degree to which willows may have recovered from decades of suppression by elk following the restoration of wolves and subsequent reductions in elk numbers, Painter said.
"Our results demonstrate that the reduction of elk browsing over the last two decades in northern Yellowstone has allowed willows to grow taller in many places, despite a warming and drying climate," Painter said, adding that willows aren't recovering in some areas due to continued browsing by increased numbers of bison.
Following wolf restoration in the 1990s, elk numbers decreased, and some researchers reported willows growing taller with reductions in elk browsing, evidence of a shift toward willow recovery.
The new study compared data from three time periods: 1988-93, when elk densities were high and most willows were very short; 2001-04, when willows may have begun to recover; and 2016-18.
The researchers confirmed that willows have indeed increased in height and cover in response to a reduction in browsing by elk.
The study is published in the journal Ecosphere.
Elk numbers in northern Yellowstone have declined from a high of nearly 20,000 in 1995 - the year wolves were restored to the park - to 4,149 counted over two days in March 2019 by biologists with Montana Fish, Wildlife and Parks; the U.S. Forest Service; and the U.S. Geological Survey.
Painter and co-author Michael Tercek of Walking Shadow Ecology in Montana found a strong contrast between sites along streams and sites in wet meadows. Willows in meadow sites did not increase in height, but willows in stream sites increased significantly, exceeding 200 centimeters, or about 6 feet - the height accessible to browsing elk - in the summers of 2001-04 and in the spring of 2016.
They also found a significant change in willow thickets at least 200 centimeters in height along streams, with thickets occupying about 80% of willow patches in some sites, but as little as 22% in others. Tall willow thickets are an important habitat feature and an indicator of willow recovery, Painter said.
Thus, passive restoration through the return of predators has begun to reverse the loss of willows, something active culling of elk in the past was unable to accomplish, he said.
"Wolves didn't do it all by themselves," Painter said. "Other predators and hunters also affected elk, but this would not have happened without the wolves.
"This does not mean a wider expanse of willow habitat has been restored as existed in the early days of the park when beavers created large wetland expanses. This may eventually happen as beavers return but could take a long time to develop."
This is the latest OSU study led by Painter that examines the effects of wolf reintroduction to Yellowstone on trees. In 2018, he published a study that showed that aspen is recovering in areas around the park, as well as inside the park boundary.
###
Painter teaches ecology and conservation in the OSU College of Agricultural Sciences and College of Forestry.


New Zealand blue whale distribution patterns tied to ocean conditions, prey availability

OREGON STATE UNIVERSITY


IMAGE: A BLUE WHALE MOTHER-CALF PAIR SURFACES IN NEW ZEALAND'S SOUTH TARANAKI BIGHT.
CREDIT: KRISTIN HODGE

NEWPORT, Ore. - Oregon State University researchers who recently discovered a population of blue whales in New Zealand are learning more about the links between the whales, their prey and ocean conditions that are changing as the planet warms.
Understanding how changes in climate affect the ability of blue whales to feed gives researchers more insight into the whales' overall health and provides critical information for conservation and management, said Leigh Torres, an assistant professor and director of the Geospatial Ecology of Marine Megafauna Laboratory at OSU's Marine Mammal Institute.
"These whales don't move around at random. We found that the same ocean patterns that determine where whales are also determine where their prey are, under both typical and warm ocean conditions," Torres said. "The more we learn about what drives these whales' movement, the more we can help protect them from whatever threats they face."
The researchers' findings were published today in the journal Marine Ecology Progress Series. The study's lead author is Dawn Barlow, a doctoral student in Torres' lab; additional co-authors are Kim Bernard of OSU's College of Earth, Ocean, and Atmospheric Sciences; Daniel Palacios of OSU's Marine Mammal Institute; and Pablo Escobar-Flores of the National Institute of Water and Atmospheric Research in New Zealand.
Torres, Barlow and colleagues recently documented this new population of New Zealand blue whales, which is genetically distinct from other blue whale populations and spends much of its time in the South Taranaki Bight between New Zealand's North and South Islands.
"The goal of our study is to understand the habitat use patterns of this population of blue whales - why they are where they are and how they respond to changing ocean conditions," Barlow said. "We know this area is important to this population of whales, and we want to understand what it is about this spot that is desirable to them."
The region is often rich in prey - blue whales feast on patches of krill - but the prey is patchy and shifts with changing ocean conditions, including warmer water temperatures. The South Taranaki Bight also sees frequent shipping traffic and activity from oil and gas exploration and production, Torres said.
Using data collected during typical summer conditions in 2014 and 2017 and warmer than average conditions in 2016, the researchers analyzed how changing ocean conditions affect the blue whales' distribution in the region's waters and the availability and location of their prey within the water column.
They found that during a regional marine heat wave in 2016, there were fewer aggregations of krill for the whales to dine on. With fewer options, the whales pursued the densest aggregations of krill they could find, Barlow said.


The researchers also found that during both warm and more typical ocean conditions the whales were more likely to feed in areas where the water was cooler. During the marine heat wave, when even the coolest water temperatures were higher than normal conditions, the whales still sought the coolest waters available for feeding.
In this region, cooler water indicates deeper, nutrient-rich water that has been pushed toward the surface in a process called upwelling, Torres said.
The nutrient-rich water supports aggregations of krill, which in turn provide sustenance for the blue whales. In their study, the researchers were able to bring all of the pieces of this trophic pathway together to describe the relationships between oceanography, krill and whales.
As warmer ocean conditions become more frequent, this new knowledge can be used to inform and adjust spatial management of human activities in the region in an effort to reduce impacts on New Zealand blue whales, Torres said.
"Documenting information like this can really help us understand how to reduce threats to these animals," Torres said. "We need continued monitoring to understand how these whales will respond to both the changing climate and human impacts."
###
Researchers find CBD improves arthritis symptoms in dogs

BAYLOR COLLEGE OF MEDICINE
NEWS RELEASE 28-MAY-2020

A team led by researchers at Baylor College of Medicine in collaboration with Medterra CBD conducted the first scientific studies to assess the potential therapeutic effects of cannabidiol (CBD) for arthritic pain in dogs, and the results could lead the way to studying its effect in humans. Researchers focused first on these animals because their condition closely mimics the characteristics of human arthritis, the leading cause of pain and disability in the U.S., for which there is no effective treatment.

Published in the journal PAIN, the study first showed both in laboratory tests and mouse models that CBD, a non-addictive product derived from hemp (cannabis), can significantly reduce the production of inflammatory molecules and immune cells associated with arthritis. Subsequently, the study showed that in dogs diagnosed with the condition, CBD treatment significantly improved quality of life as documented by both owner and veterinarian assessments. This work supports future scientific evaluation of CBD for human arthritis.

"CBD is rapidly increasing in popularity due to its anecdotal health benefits for a variety of conditions, from reducing anxiety to helping with movement disorders," said corresponding author Dr. Matthew Halpert, research faculty in the Department of Pathology and Immunology at Baylor. "In 2019, Medterra CBD approached Baylor to conduct independent scientific studies to determine the biological capabilities of several of its products."

In the current study, Halpert and his colleagues first measured the effect of CBD on immune responses associated with arthritis, both in human and murine cells grown in the lab and in mouse models. Using Medterra tinctures, they found that CBD treatment resulted in reduced production of both inflammatory molecules and immune cells linked to arthritis.

The researchers also determined that the effect was faster and stronger when CBD was delivered encapsulated in liposomes than when it was administered 'naked.' Liposomes are tiny, artificially formed spherical sacs that are used to deliver drugs and other substances into tissues at higher rates of absorption.

Halpert and colleagues next assessed the effect of naked and liposome-encapsulated CBD on the quality of life of dogs diagnosed with arthritis.

"We studied dogs because experimental evidence shows that spontaneous models of arthritis, particularly in domesticated canine models, are more appropriate for assessing human arthritis pain treatments than other animal models. The biological characteristics of arthritis in dogs closely resemble those of the human condition," Halpert said.

Arthritis is a common condition in dogs. According to the American Kennel Club, it affects one out of five dogs in the United States.

The 20 client-owned dogs enrolled in the study were seen at Sunset Animal Hospital in Houston. The dog owners were randomly provided with identical unidentified medication bottles that contained CBD, liposomal CBD, or a placebo. Neither the owners nor the veterinarian knew which treatment each dog received.
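To make that blinding scheme concrete, here is a minimal sketch of how a three-arm, masked assignment like the one described could be set up. The arm labels come from the article, but the dog IDs, bottle codes and function names are hypothetical, not details from the study.

```python
import random

ARMS = ["CBD", "liposomal CBD", "placebo"]

def assign_blinded(dog_ids, seed=42):
    """Assign each dog to an arm; only the study team keeps the key."""
    rng = random.Random(seed)
    # Repeat the arm list to cover every dog, then shuffle, which keeps
    # group sizes as balanced as the head count allows.
    arms = (ARMS * (len(dog_ids) // len(ARMS) + 1))[: len(dog_ids)]
    rng.shuffle(arms)
    codes = rng.sample(range(10**6), len(dog_ids))  # unique bottle codes
    key = {}      # bottle code -> true arm (never shown to owners or vet)
    blinded = {}  # dog id -> bottle code (all that anyone else sees)
    for dog, arm, code in zip(dog_ids, arms, codes):
        label = f"bottle-{code:06d}"
        key[label] = arm
        blinded[dog] = label
    return blinded, key

blinded, key = assign_blinded([f"dog-{i:02d}" for i in range(1, 21)])
```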

After four weeks of daily treatment, owners and veterinarians reported on the condition of the dogs, including whether they observed changes in the animals' level of pain, such as changes related to running or gait. The dogs' complete blood count and blood indicators of liver and kidney function also were evaluated before and after the four weeks of treatment.

"We found encouraging results," Halpert said. "Nine of the 10 dogs on CBD showed benefits, which remained for two weeks after the treatment stopped. We did not detect alterations in the blood markers we measured, suggesting that, under the conditions of our study, the treatment seems to be safe."

###

The findings support conducting studies to evaluate CBD for the treatment of human arthritis.

Other contributors to this work include Chris D. Verrico, Shonda Wesson, Vanaja Konduri, Colby Hofferek, Jonathan Vazquez-Perez, Emek Blair, Kenneth Dunner Jr., Pedram Salimpour and William K. Decker. The authors are associated with Baylor College of Medicine, Sunset Animal Hospital, Valimenta Labs, University of Texas MD Anderson Cancer Center and Boston University School of Medicine.

This study was funded in part by a sponsored research agreement between Medterra CBD Inc and Baylor College of Medicine. This project also was supported in part by the Cytometry and Cell Sorting Core at Baylor College of Medicine with funding from the NIH (grants AI036211, CA125123 and RR024574).

Decker, Halpert and Konduri declare their ownership stakes in Diakonos Research, Ltd, an unrelated immuno-oncology company. Additionally, Halpert is a paid scientific advisor for Medterra CBD.

In planet formation, it's location, location, location

NASA/GODDARD SPACE FLIGHT CENTER
IMAGE: THE BRILLIANT TAPESTRY OF YOUNG STARS FLARING TO LIFE RESEMBLES A GLITTERING FIREWORKS DISPLAY IN THIS HUBBLE SPACE TELESCOPE IMAGE. THE SPARKLING CENTERPIECE OF THIS FIREWORKS SHOW IS A GIANT...
CREDIT: NASA, ESA, THE HUBBLE HERITAGE TEAM (STSCI/AURA), A. NOTA (ESA/STSCI) AND THE WESTERLUND 2 SCIENCE TEAM
Astronomers using NASA's Hubble Space Telescope are finding that planets have a tough time forming in the rough-and-tumble central region of the massive, crowded star cluster Westerlund 2. Located 20,000 light-years away, Westerlund 2 is a unique laboratory to study stellar evolutionary processes because it's relatively nearby, quite young, and contains a large stellar population.
A three-year Hubble study of stars in Westerlund 2 revealed that the precursors to planet-forming disks encircling stars near the cluster's center are mysteriously devoid of large, dense clouds of dust that in a few million years could become planets.
However, the observations show that stars on the cluster's periphery do have the immense planet-forming dust clouds embedded in their disks. Researchers think our solar system followed this recipe when it formed 4.6 billion years ago.
So why do some stars in Westerlund 2 have a difficult time forming planets while others do not? It seems that planet formation depends on location, location, location. The most massive and brightest stars in the cluster congregate in the core, a pattern also seen in other star-forming regions. The cluster's center contains at least 30 extremely massive stars, some with up to 80 times the mass of the Sun. Their blistering ultraviolet radiation and hurricane-like stellar winds of charged particles blowtorch disks around neighboring lower-mass stars, dispersing the giant dust clouds.
"Basically, if you have monster stars, their energy is going to alter the properties of the disks around nearby, less massive stars," explained Elena Sabbi, of the Space Telescope Science Institute in Baltimore and lead researcher of the Hubble study. "You may still have a disk, but the stars change the composition of the dust in the disks, so it's harder to create stable structures that will eventually lead to planets. We think the dust either evaporates away in 1 million years, or it changes in composition and size so dramatically that planets don't have the building blocks to form."
The Hubble observations represent the first time that astronomers analyzed an extremely dense star cluster to study which environments are favorable to planet formation. Scientists, however, are still debating whether bulky stars are born in the center or whether they migrate there. Westerlund 2 already has massive stars in its core, even though it is a comparatively young, 2-million-year-old system.
Using Hubble's Wide Field Camera 3, the researchers found that of the nearly 5,000 stars in Westerlund 2 with masses between 0.1 and 5 times the Sun's mass, 1,500 show fluctuations in their light as the stars accrete material from their disks. Orbiting material clumped within the disk would temporarily block some of the starlight, causing brightness fluctuations.
However, Hubble detected the signature of such orbiting material only around stars outside the cluster's packed central region. The telescope witnessed large drops in brightness, lasting as long as 10 to 20 days, around 5% of those stars before they returned to normal brightness. The researchers did not detect these dips in stars residing within four light-years of the center. These fluctuations could be caused by large clumps of dust passing in front of the star; the clumps would sit in a disk tilted nearly edge-on to the view from Earth. "We think they are planetesimals or structures in formation," Sabbi explained. "These could be the seeds that eventually lead to planets in more evolved systems. These are the systems we don't see close to very massive stars. We see them only in systems outside the center."
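As a rough illustration of the kind of signal involved, a scan of a star's light curve for sustained dimming events might look like the sketch below. This is not the Hubble team's actual pipeline; the 20 percent depth threshold and the method itself are assumptions chosen for the example.

```python
from statistics import median

def find_dips(times, fluxes, depth=0.2, min_days=10, max_days=20):
    """Flag stretches where a star's brightness stays well below its
    median level for roughly 10 to 20 days, as described above.

    Illustrative only: the thresholds are assumptions, not values
    from the study.
    """
    threshold = (1 - depth) * median(fluxes)
    events, start = [], None
    for t, f in zip(times, fluxes):
        if f < threshold and start is None:
            start = t                        # dimming begins
        elif f >= threshold and start is not None:
            if min_days <= t - start <= max_days:
                events.append((start, t))    # a dip of plausible duration
            start = None
    return events
```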
Thanks to Hubble, astronomers can now see how stars are accreting in environments that are like the early universe, where clusters were dominated by monster stars. So far, the best known nearby stellar environment that contains massive stars is the starbirth region in the Orion Nebula. However, Westerlund 2 is a richer target because of its larger stellar population.
"Hubble's observations of Westerlund 2 give us a much better sense of how stars of different masses change over time, and how powerful winds and radiation from very massive stars affect nearby lower-mass stars and their disks," Sabbi said. "We see, for example, that lower-mass stars, like our Sun, that are near extremely massive stars in the cluster still have disks and still can accrete material as they grow. But the structure of their disks (and thus their planet-forming capability) seems to be very different from that of disks around stars forming in a calmer environment farther away from the cluster core. This information is important for building models of planet formation and stellar evolution."
This cluster will be an excellent laboratory for follow-up observations with NASA's upcoming James Webb Space Telescope, an infrared observatory. Hubble has helped astronomers identify the stars that have possible planetary structures. With Webb, researchers can study which disks around stars are not accreting material and which disks still have material that could build up into planets. This information on 1,500 stars will allow astronomers to map a path on how star systems grow and evolve. Webb also can study the chemistry of the disks in different evolutionary phases, watch how they change and help astronomers determine what role environment plays in their evolution.
NASA's Nancy Grace Roman Space Telescope, another planned infrared observatory, will be able to perform Sabbi's study on a much larger area. Westerlund 2 is just a small slice of an immense star-formation region. These vast regions contain clusters of stars with different ages and different densities. Astronomers could use Roman Space Telescope observations to start to build up statistics on how a star's characteristics, like its mass or outflows, affect its own evolution or the nature of stars that form nearby. The observations could also provide more information on how planets form in tough environments.
###
Sabbi's team's results appeared in The Astrophysical Journal.
The Hubble Space Telescope is a project of international cooperation between NASA and ESA (European Space Agency). NASA's Goddard Space Flight Center in Greenbelt, Maryland, manages the telescope. The Space Telescope Science Institute (STScI) in Baltimore conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy in Washington, D.C.
Claire Andreoli
NASA's Goddard Space Flight Center, Greenbelt, Md.
301-286-1940
claire.andreoli@nasa.gov
Donna Weaver / Ray Villard
Space Telescope Science Institute, Baltimore
410-338-4493 / 410-338-4514
dweaver@stsci.edu / villard@stsci.edu
Elena Sabbi
Space Telescope Science Institute, Baltimore
sabbi@stsci.edu

Study questions benefits of social networks to disaster response

Communication within groups not as helpful as anticipated
CARNEGIE MELLON UNIVERSITY
Hirokazu Shirado, an assistant professor in Carnegie Mellon University's Human-Computer Interaction Institute, said he had expected his experiments to show that social networks, such as neighbors, work groups and extended families, would improve decision-making by giving people actionable information.
"What we found is that social networks make things worse," said Shirado, who began the research while a member of the Human Nature Lab at Yale University. A paper on their work appeared this week in the Proceedings of the Royal Society A.
Gathering data about social networks in the midst of a crisis is difficult, so Shirado devised a game in which online participants had an economic stake in deciding whether to evacuate in the face of danger. He recruited 2,480 subjects and organized them into 108 groups, comparing the decision making of networked groups with that of isolated individuals.
Participants received $2 at the outset of the 75-second experiment. If nothing happened, they could keep the $2 at the end. But if there was an impending disaster, they could leave the game and retain $1. If they failed to evacuate and disaster struck, they lost everything. They also received 10 cents for every other player who made a correct decision on whether to leave the game.
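The payoff structure is simple enough to state exactly; here it is as a small sketch. The function name is ours, but the dollar amounts are those described above.

```python
def payoff(evacuated, disaster, n_correct_others):
    """Payoff rules of the evacuation game described above:
    keep $2 by staying through a false alarm, salvage $1 by
    evacuating, lose everything by staying through a disaster,
    plus $0.10 for every other player who decided correctly."""
    if evacuated:
        own = 1.00
    else:
        own = 0.00 if disaster else 2.00
    return own + 0.10 * n_correct_others

# A player who stayed put through a false alarm while 15 other
# players also decided correctly earns $3.50:
print(payoff(evacuated=False, disaster=False, n_correct_others=15))
```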
The participants thus had every incentive to choose correctly and were encouraged to communicate with each other. One member of each social network group also received the correct information about impending danger.
Compared with the isolated individuals, the networked players consistently tended to resist evacuation, regardless of whether the danger was real or not. Communication didn't improve decision-making so much as it delayed it, Shirado said. The networked players also generated misinformation, even though nobody had an incentive to do so.
One of the problems, he said, is that players didn't realize that they often used different strategies. A player who accepts "no news is good news," for instance, might think that all is safe simply because he hasn't heard anything. He might then send "safe" signals to other members of the group even though danger lurked. In other cases, players might be unable to learn the truth because the players adjacent to them all had bad information.
Shirado has used the same game as an educational tool in his CMU classes, including one instance just before the onset of the COVID-19 pandemic. He recalled that one student was skeptical, arguing that there was no reason why the players couldn't choose correctly. But about 70 percent of the students -- including the skeptic -- erred in their decisions.
"Inside the networks, people could not understand why this was happening," he added.
Social media -- one type of social network -- was not included in the study, but might actually improve performance, Shirado said. Though individuals tend to follow like-minded people on social media, it's also easy to connect with others who might fall outside normal social networks, providing a way around some of the barriers that form within networks.
Shirado said he hopes to find ways of improving the performance of social networks.
"We cannot live without social networks," he explained. "I'm interested in how social networks can provide a benefit to individuals."
He acknowledged that one of the shortcomings of his experiment is that it was too simple and involved people who were randomly assigned into networks. Future experiments will require players to play several times with the same network of individuals, so they might learn who to trust.
###
The Robert Wood Johnson Foundation, Tata Consultancy Services, the Nomis Foundation, the National Institute of Social Sciences and the National Institutes of Health provided support for this research.

Gold mining with mercury poses health threats for miles downstream

Assumption that distance lowers risk doesn't hold up
DUKE UNIVERSITY
IMAGE: DUKE RESEARCHER HELENA FRISCHTAK (RIGHT FRONT) ADMINISTERS PSYCHOLOGICAL ASSESSMENTS WITH A PAIR OF PERUVIAN CHILDREN DURING A STUDY OF MERCURY CONTAMINATION NEAR SMALL-SCALE GOLD MINING.
CREDIT: WILLIAM PAN, DUKE UNIVERSITY
DURHAM, N.C. - Small-scale gold mining in the Peruvian Amazon poses a health hazard not only to the miners and communities near where mercury is used to extract gold from ore, but also to downstream communities hundreds of kilometers away where people eat mercury-contaminated river fish as part of their diet.
In these downstream communities where fish is an important part of the diet, children under 12 with the highest levels of mercury in their blood and hair suffer a 4.68-point loss in IQ, researchers report. They are also more anemic, lacking adequate hemoglobin to carry oxygen in their blood.
Both findings come from a series of studies conducted by Duke University scientists in and around the Amarakaeri Communal Reserve in the Madre De Dios region of Peru. They appear in a pair of papers published May 20 in GeoHealth and May 28 in the Journal of Exposure Science & Environmental Epidemiology.
The studies show that common assumptions about mercury exposure should be reexamined, and that native people in the region are more vulnerable to harm, probably because of their greater reliance on river fish, but also perhaps because their healthcare and standard of living are not as high.
When compared to non-native individuals, children and adults living in native villages were found to have much higher mercury exposures. Among tested children, mercury levels were 2.5 times greater, on average, in native communities.
"Assumptions about exposure are not reliable," said co-author Caren Weinhouse, an assistant professor at Oregon Health & Science University. Many studies have looked at communities closest to the mines on the assumption that they would have the greatest mercury exposures. "If you look more closely, it turns out your first instinct may be wrong," she said. Often communities far from mining sites had higher exposures.
Artisanal and small-scale gold miners around the world use liquid elemental mercury to extract gold from soils and sediments. The mercury binds to the gold to form an amalgam, which is then extracted by burning, creating gaseous mercury that enters the atmosphere. The rest of the mercury ends up as waste on the landscape and runoff during mining-related erosion. Miners are also known to simply pour excess mercury directly into surface waters.
Elemental mercury is converted to methylmercury in waterways, where it is more readily taken up by animals and tends to 'bio-accumulate' or add up in tissues as bigger fish eat contaminated little fish.
Mercury is a known neurotoxin that can lead to muscle weakness and problems with coordination, anxiety, and memory, as well as trouble speaking, hearing or seeing. Nineteenth century hat makers who used mercury nitrate regularly for their trade were known for unpredictability, hallucinations and tremors, leading to the expression 'mad as a hatter.'
To gather data on hair and blood concentrations of mercury, both near mining operations and farther away, the researchers visited 1,221 Peruvian households in 23 communities in 2015 and returned to resample 900 of those households the following year.
Most of the children tested did not have blood and hair mercury levels currently considered high. But children with more mercury exposure demonstrated lower cognitive ability than their peers, even at fairly low levels of exposure. A third of the children were found to have mercury levels higher than the World Health Organization's safety standards.
"If you were going to have a high IQ anyway, you'll probably still have a higher IQ after exposure, but for kids at risk for impairment, a few points can make a difference," said Duke graduate student Aaron Reuben, who co-led the neurocognitive study on children.
Half of the children under age 12 -- and indeed half of the entire population sampled -- were also found to be anemic. The researchers found that the higher the mercury levels in a child's blood, the lower their hemoglobin levels.
"Our paper describes some plausible biological pathways for that to happen," said William Pan, the Elizabeth Brooks Reid and Whitelaw Reid associate professor of environmental sciences and policy at Duke. "But really, this is something we need to understand a bit more."
Though it seems to vary from household to household, the anemia from mercury contamination is "a major public health problem," Weinhouse said. "If you just supplement them with iron, that might not be enough if mercury is a major cause."
An earlier study by the Duke team found that children with higher mercury exposures were less responsive to vaccines, especially if they were also malnourished.
###
The research was supported by the Hunt Oil Company Peru, the Inter-American Institute for Global Change Research and Bass Connections at Duke University.
CITATIONS: "A Population-Based Mercury Exposure Assessment Near an Artisanal and Small-Scale Gold Mining Site in the Peruvian Amazon," Caren Weinhouse, John Gallis, Ernesto Ortiz, Axel Berky, Ana Maria Morales, Sarah Diringer, James Harringon, Paige Bullins, Laura Rogers, John Hare-Grogg, Heileen Hsu-Kim, William Pan. Journal of Exposure Science & Environmental Epidemiology, May 28, 2020. 10.1038/s41370-020-0234-2
"Elevated Hair Mercury Levels Are Ass

Algorithm quickly simulates a roll of loaded dice

Approach for generating numbers at random may help analyses of complex systems, from Earth's climate to financial markets
MASSACHUSETTS INSTITUTE OF TECHNOLOGY
The fast and efficient generation of random numbers has long been an important challenge. For centuries, games of chance have relied on the roll of a die, the flip of a coin, or the shuffling of cards to bring some randomness into the proceedings. In the second half of the 20th century, computers started taking over that role, for applications in cryptography, statistics, and artificial intelligence, as well as for various simulations -- climatic, epidemiological, financial, and so forth.
MIT researchers have now developed a computer algorithm that might, at least for some tasks, churn out random numbers with the best combination of speed, accuracy, and low memory requirements available today. The algorithm, called the Fast Loaded Dice Roller (FLDR), was created by MIT graduate student Feras Saad, Research Scientist Cameron Freer, Professor Martin Rinard, and Principal Research Scientist Vikash Mansinghka, and it will be presented next week at the 23rd International Conference on Artificial Intelligence and Statistics.
Simply put, FLDR is a computer program that simulates the roll of dice to produce random integers. The dice can have any number of sides, and they are "loaded," or weighted, to make some sides more likely to come up than others. A loaded die can still yield random numbers -- as one cannot predict in advance which side will turn up -- but the randomness is constrained to meet a preset probability distribution. One might, for instance, use loaded dice to simulate the outcome of a baseball game; while the superior team is more likely to win, on a given day either team could end up on top.
With FLDR, the dice are "perfectly" loaded, which means they exactly achieve the specified probabilities. With a four-sided die, for example, one could arrange things so that the numbers 1, 2, 3 and 4 turn up exactly 23 percent, 34 percent, 17 percent and 26 percent of the time, respectively.
To simulate the roll of loaded dice that have a large number of sides, the MIT team first had to draw on a simpler source of randomness -- that being a computerized (binary) version of a coin toss, yielding either a 0 or a 1, each with 50 percent probability. The efficiency of their method, a key design criterion, depends on the number of times they have to tap into this random source -- the number of "coin tosses," in other words -- to simulate each dice roll.
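To see how fair coin tosses can drive a loaded die at all, consider the simple rejection-style sketch below. It is not FLDR itself, just a baseline illustration of the problem FLDR solves more efficiently.

```python
import random

def coin():
    """The only source of randomness: a fair binary coin toss (0 or 1)."""
    return random.getrandbits(1)

def uniform_below(n):
    """Draw a uniform integer in [0, n) from coin tosses alone, by
    assembling k-bit candidates and rejecting any that land at or
    above n."""
    k = (n - 1).bit_length()
    while True:
        x = 0
        for _ in range(k):
            x = (x << 1) | coin()
        if x < n:
            return x

def roll_loaded_die(weights):
    """Return side i with probability weights[i] / sum(weights)."""
    x = uniform_below(sum(weights))
    for side, w in enumerate(weights):
        if x < w:
            return side
        x -= w

# The four-sided die from the article: sides 0-3 come up with
# probability 23%, 34%, 17% and 26%, respectively.
roll_loaded_die([23, 34, 17, 26])
```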

In a landmark 1976 paper, the computer scientists Donald Knuth and Andrew Yao devised an algorithm that could simulate the roll of loaded dice with the maximum efficiency theoretically attainable. "While their algorithm was optimally efficient with respect to time," Saad explains, meaning that literally nothing could be faster, "it is inefficient in terms of the space, or computer memory, needed to store that information." In fact, the amount of memory required grows exponentially, depending on the number of sides on the dice and other factors. That renders the Knuth-Yao method impractical, he says, except for special cases, despite its theoretical importance.
FLDR was designed for greater utility. "We are almost as time efficient," Saad says, "but orders of magnitude better in terms of memory efficiency." FLDR can use up to 10,000 times less memory storage space than the Knuth-Yao approach, while taking no more than 1.5 times longer per operation.
For now, FLDR's main competitor is the Alias method, which has been the field's dominant technology for decades. When analyzed theoretically, according to Freer, FLDR has one clear-cut advantage over Alias: It makes more efficient use of the random source -- the "coin tosses," to continue with that metaphor -- than Alias. In certain cases, moreover, FLDR is also faster than Alias in generating rolls of loaded dice.
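For reference, the Alias method mentioned here can be sketched in a few lines. This is the standard textbook construction, commonly attributed to Walker and Vose, not code from the FLDR paper.

```python
import random

def build_alias(probs):
    """Preprocess a distribution into 'prob' and 'alias' tables in
    O(n) time; each sample then costs O(1). Standard Vose variant."""
    n = len(probs)
    scaled = [p * n for p in probs]
    small = [i for i, s in enumerate(scaled) if s < 1.0]
    large = [i for i, s in enumerate(scaled) if s >= 1.0]
    prob, alias = [1.0] * n, list(range(n))
    while small and large:
        s, l = small.pop(), large.pop()
        prob[s], alias[s] = scaled[s], l   # column s: keep s or fall to l
        scaled[l] -= 1.0 - scaled[s]       # l donates its excess mass
        (small if scaled[l] < 1.0 else large).append(l)
    return prob, alias

def alias_sample(prob, alias):
    i = random.randrange(len(prob))        # pick a column uniformly
    return i if random.random() < prob[i] else alias[i]

prob, alias = build_alias([0.23, 0.34, 0.17, 0.26])
```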
FLDR, of course, is still brand new and has not yet seen widespread use. But its developers are already thinking of ways to improve its effectiveness through both software and hardware engineering. They also have specific applications in mind, apart from the general, ever-present need for random numbers. Where FLDR can help most, Mansinghka suggests, is by making so-called Monte Carlo simulations and Monte Carlo inference techniques more efficient. Just as FLDR uses coin flips to simulate the more complicated roll of weighted, many-sided dice, Monte Carlo simulations use a dice roll to generate more complex patterns of random numbers.
The United Nations, for instance, runs simulations of seismic activity that show when and where earthquakes, tremors, or nuclear tests are happening on the globe. The United Nations also carries out Monte Carlo inference: running random simulations that generate possible explanations for actual seismic data. This works by conducting a second series of Monte Carlo simulations, which randomly test out alternative parameters for an underlying seismic simulation to find the parameter values most likely to reproduce the observed data. These parameters contain information about when and where earthquakes and nuclear tests might actually have occurred.
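That two-layer scheme can be made concrete with a toy example. Everything below -- the model, the parameter, the error measure -- is a made-up stand-in for the seismic setting, not the United Nations' actual code.

```python
import random
from statistics import mean

def simulate(theta, n=100, rng=random):
    """Toy stand-in for an underlying simulation: n noisy
    observations scattered around a hidden parameter theta."""
    return [theta + rng.gauss(0, 1) for _ in range(n)]

def monte_carlo_inference(observed, n_trials=10_000):
    """Randomly propose parameter values, simulate under each, and
    keep the value whose simulated data best matches what was
    observed -- the 'second series of simulations' described above."""
    target = mean(observed)
    best_theta, best_err = None, float("inf")
    for _ in range(n_trials):
        theta = random.uniform(-10, 10)          # random proposal
        err = abs(mean(simulate(theta)) - target)
        if err < best_err:
            best_theta, best_err = theta, err
    return best_theta

# Recover a hidden parameter from data "observed" at theta = 3:
print(monte_carlo_inference(simulate(3.0)))
```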
"Monte Carlo inference can require hundreds of thousands of times more random numbers than Monte Carlo simulations," Mansinghka says. "That's one big bottleneck where FLDR could really help. Monte Carlo simulation and inference algorithms are also central to probabilistic programming, an emerging area of AI with broad applications."
Despite its seemingly bright future, FLDR almost did not come to light. Hints of it first emerged from a previous paper the same four MIT researchers published at a symposium in January, which introduced a separate algorithm. In that work, the authors showed that if a predetermined amount of memory were allocated for a computer program to simulate the roll of loaded dice, their algorithm could determine the minimum amount of "error" possible -- that is, how close one comes toward meeting the designated probabilities for each side of the dice.
If one doesn't limit the memory in advance, the error can be reduced to zero, but Saad noticed a variant with zero error that used substantially less memory and was nearly as fast. At first he thought the result might be too trivial to bother with. But he mentioned it to Freer, who assured Saad that this avenue was worth pursuing. FLDR, which is error-free in this same respect, arose from those humble origins and now has a chance of becoming a leading technology in the realm of random number generation. That's no trivial matter given that we live in a world that's governed, to a large extent, by random processes -- a principle that applies to the distribution of galaxies in the universe, as well as to the outcome of a spirited game of craps.
###

Study finds surge in hydroxychloroquine/chloroquine prescriptions during COVID-19

From Feb. 16 to April 25, almost half a million more prescriptions of hydroxychloroquine/chloroquine were dispensed nationally compared to the same period in 2019
BRIGHAM AND WOMEN'S HOSPITAL
A new study by investigators from Brigham and Women's Hospital examines changes in prescription patterns in the United States during the COVID-19 pandemic.
In an exploratory analysis of data from GoodRx used to generate national estimates, Brigham investigators found prescriptions of the anti-malarial drug chloroquine and its analogue hydroxychloroquine dramatically surged during the week of March 15, likely due to off-label prescriptions for COVID-19. Results of the study are published in JAMA.
"There have been indications that hydroxychloroquine prescribing had increased and shortages had been reported, but this study puts a spotlight on the extent to which excess hydroxychloroquine/chloroquine prescriptions were filled nationally," said corresponding author Haider Warraich, MD, an associate physician in the Division of Cardiovascular Medicine at the Brigham. "This analysis doesn't include patients who were prescribed HCQ in a hospital setting -- this means that patients could have been taking the drugs at home, without supervision or monitoring for side effects."
Chloroquine is an anti-malarial drug and its analogue, hydroxychloroquine, is used to treat autoimmune diseases such as lupus or rheumatoid arthritis. Both drugs are considered safe and effective for these indications. Laboratory testing has suggested that the drugs may also have antiviral effects, and, given their relatively low cost, there has been much interest in their potential effectiveness against COVID-19. However, a study published last week by Brigham researchers and collaborators found that, in an observational analysis, COVID-19 patients who were given either drug (with or without an antibiotic) did not show an improvement in survival rates and were at increased risk for ventricular arrhythmias.
For the current analysis, Warraich and colleagues looked at prescribing patterns for hydroxychloroquine/chloroquine as well as many other commonly prescribed drugs. These included angiotensin-converting enzyme inhibitors (ACEi) and angiotensin-receptor blockers (ARBs), both of which are prescribed for patients with hypertension or heart failure, as well as the antibiotic azithromycin and the top 10 drug prescriptions filled in 2019. The team compared the number of prescriptions filled for each drug in 2020 with the number filled in 2019 over the same 10-week period from Feb. 16 to April 25.
The team found that fills for all drugs, except the antibiotic amoxicillin and the pain reliever combination of hydrocodone/acetaminophen, peaked during the week of March 15 to March 21, 2020, followed by subsequent declines. During this week, hydroxychloroquine/chloroquine fills for 28 tablets increased from 2,208 prescriptions in 2019 to 45,858 prescriptions in 2020 (an increase of more than 2,000 percent). Over the full 10 weeks, there were close to half a million excess fills of hydroxychloroquine/chloroquine in 2020 compared to the year before.
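Those week-of-March-15 figures imply roughly a 20-fold jump; a quick check of the arithmetic:

```python
# Hydroxychloroquine/chloroquine fills for 28 tablets during the week
# of March 15 (figures from the article).
fills_2019, fills_2020 = 2_208, 45_858
fold = fills_2020 / fills_2019
print(f"2020 fills are {fold:.1f}x, i.e. {fold * 100:,.0f}%, of the 2019 level")
# -> 20.8x, i.e. 2,077%, of the 2019 level -- the "more than
#    2,000 percent" increase cited above
```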
In contrast, prescriptions for antibiotics such as amoxicillin and azithromycin and for hydrocodone/acetaminophen declined. Prescriptions for heart therapies remained stable or declined slightly. After the surge in prescriptions, the authors observed a reduction in longer-term prescription fills for hydroxychloroquine/chloroquine, which could indicate decreased availability of the drug for patients with systemic lupus erythematosus and rheumatoid arthritis. The United States Food and Drug Administration reported a drug shortage of hydroxychloroquine starting March 31. The surge in prescriptions occurred between March 15 and March 21, within days of the World Health Organization declaring a global coronavirus pandemic on March 11, the U.S. declaring a national emergency on March 13, the publishing of a pre-print about hydroxychloroquine on March 17, and President Trump's announced support of hydroxychloroquine on March 19.
"During this pandemic, there has been both good information and misinformation about benefits and potential harms of common medications like hydroxychloroquine, and there had been conjecture that proven medications for heart failure may be harmful in this patient population," said Warraich. "One positive finding is that we didn't see a stark reduction in prescription fills for routine, chronic care, but our findings for HCQ are concerning."
###
Paper cited: Vaduganathan M, et al. "Prescription Fill Patterns for Commonly Used Drugs During the COVID-19 Pandemic in the United States." JAMA. DOI: 10.1001/jama.2020.918