Friday, May 29, 2020

Algorithm quickly simulates a roll of loaded dice

Approach for generating numbers at random may help analyses of complex systems, from Earth's climate to financial markets
MASSACHUSETTS INSTITUTE OF TECHNOLOGY
The fast and efficient generation of random numbers has long been an important challenge. For centuries, games of chance have relied on the roll of a die, the flip of a coin, or the shuffling of cards to bring some randomness into the proceedings. In the second half of the 20th century, computers started taking over that role, for applications in cryptography, statistics, and artificial intelligence, as well as for various simulations -- climatic, epidemiological, financial, and so forth.
MIT researchers have now developed a computer algorithm that might, at least for some tasks, churn out random numbers with the best combination of speed, accuracy, and low memory requirements available today. The algorithm, called the Fast Loaded Dice Roller (FLDR), was created by MIT graduate student Feras Saad, Research Scientist Cameron Freer, Professor Martin Rinard, and Principal Research Scientist Vikash Mansinghka, and it will be presented next week at the 23rd International Conference on Artificial Intelligence and Statistics.
Simply put, FLDR is a computer program that simulates the roll of dice to produce random integers. The dice can have any number of sides, and they are "loaded," or weighted, to make some sides more likely to come up than others. A loaded die can still yield random numbers -- as one cannot predict in advance which side will turn up -- but the randomness is constrained to meet a preset probability distribution. One might, for instance, use loaded dice to simulate the outcome of a baseball game; while the superior team is more likely to win, on a given day either team could end up on top.
With FLDR, the dice are "perfectly" loaded, which means they exactly achieve the specified probabilities. With a four-sided die, for example, one could arrange things so that the numbers 1, 2, 3, and 4 turn up exactly 23 percent, 34 percent, 17 percent, and 26 percent of the time, respectively.
To simulate the roll of loaded dice that have a large number of sides, the MIT team first had to draw on a simpler source of randomness -- that being a computerized (binary) version of a coin toss, yielding either a 0 or a 1, each with 50 percent probability. The efficiency of their method, a key design criterion, depends on the number of times they have to tap into this random source -- the number of "coin tosses," in other words -- to simulate each dice roll.
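As a toy illustration of the general idea (this is not the FLDR algorithm itself, just a naive interval-refinement scheme built from the same fair-coin primitive; all function names here are invented for this sketch):

```python
import random

def coin():
    """One fair coin toss: the only source of randomness we allow."""
    return random.getrandbits(1)

def loaded_die(weights):
    """Sample index i with probability weights[i]/sum(weights), using only
    fair coin tosses. Each toss halves an interval in [0, 1); we stop once
    the interval falls entirely inside one side's probability bucket."""
    total = sum(weights)
    cum, acc = [], 0
    for w in weights:              # cumulative boundaries in [0, total)
        acc += w
        cum.append(acc)
    lo, hi = 0.0, 1.0
    while True:
        mid = (lo + hi) / 2
        if coin():
            lo = mid
        else:
            hi = mid
        x_lo, x_hi = lo * total, hi * total
        for i, c in enumerate(cum):
            start = c - weights[i]
            if start <= x_lo and x_hi <= c:
                return i
```

Note the efficiency metric in the article: each call consumes a variable number of coin tosses, and minimizing that expected count (while keeping memory small) is exactly the trade-off FLDR addresses.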

In a landmark 1976 paper, the computer scientists Donald Knuth and Andrew Yao devised an algorithm that could simulate the roll of loaded dice with the maximum efficiency theoretically attainable. "While their algorithm was optimally efficient with respect to time," Saad explains, meaning that literally nothing could be faster, "it is inefficient in terms of the space, or computer memory, needed to store that information." In fact, the amount of memory required grows exponentially, depending on the number of sides on the dice and other factors. That renders the Knuth-Yao method impractical, he says, except for special cases, despite its theoretical importance.
FLDR was designed for greater utility. "We are almost as time efficient," Saad says, "but orders of magnitude better in terms of memory efficiency." FLDR can use up to 10,000 times less memory storage space than the Knuth-Yao approach, while taking no more than 1.5 times longer per operation.
For now, FLDR's main competitor is the Alias method, which has been the field's dominant technology for decades. When analyzed theoretically, according to Freer, FLDR has one clear-cut advantage over Alias: It makes more efficient use of the random source -- the "coin tosses," to continue with that metaphor -- than Alias. In certain cases, moreover, FLDR is also faster than Alias in generating rolls of loaded dice.
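For context, the Alias method (in Vose's formulation) can be sketched as follows; this is a generic illustration of the competing technique, not code from the paper. Preprocessing is O(n), and each draw costs one uniform integer plus one uniform real:

```python
import random

def build_alias_table(probs):
    """Vose's alias method: O(n) setup for a loaded die with given probs."""
    n = len(probs)
    scaled = [p * n for p in probs]
    small = [i for i, s in enumerate(scaled) if s < 1.0]
    large = [i for i, s in enumerate(scaled) if s >= 1.0]
    prob, alias = [0.0] * n, [0] * n
    while small and large:
        s, l = small.pop(), large.pop()
        prob[s], alias[s] = scaled[s], l
        scaled[l] -= 1.0 - scaled[s]            # hand leftover mass to l
        (small if scaled[l] < 1.0 else large).append(l)
    for i in large + small:                     # remainders are full columns
        prob[i] = 1.0
    return prob, alias

def alias_draw(prob, alias):
    """One draw: pick a column uniformly, then a biased coin chooses
    between the column's own outcome and its alias."""
    i = random.randrange(len(prob))
    return i if random.random() < prob[i] else alias[i]
```

The draw step uses a full-precision uniform real, which is where FLDR's more frugal consumption of random bits gives it an edge in entropy usage.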
FLDR, of course, is still brand new and has not yet seen widespread use. But its developers are already thinking of ways to improve its effectiveness through both software and hardware engineering. They also have specific applications in mind, apart from the general, ever-present need for random numbers. Where FLDR can help most, Mansinghka suggests, is by making so-called Monte Carlo simulations and Monte Carlo inference techniques more efficient. Just as FLDR uses coin flips to simulate the more complicated roll of weighted, many-sided dice, Monte Carlo simulations use a dice roll to generate more complex patterns of random numbers.
The United Nations, for instance, runs simulations of seismic activity that show when and where earthquakes, tremors, or nuclear tests are happening on the globe. The United Nations also carries out Monte Carlo inference: running random simulations that generate possible explanations for actual seismic data. This works by conducting a second series of Monte Carlo simulations, which randomly test out alternative parameters for an underlying seismic simulation to find the parameter values most likely to reproduce the observed data. These parameters contain information about when and where earthquakes and nuclear tests might actually have occurred.
"Monte Carlo inference can require hundreds of thousands of times more random numbers than Monte Carlo simulations," Mansinghka says. "That's one big bottleneck where FLDR could really help. Monte Carlo simulation and inference algorithms are also central to probabilistic programming, an emerging area of AI with broad applications."
Despite its seemingly bright future, FLDR almost did not come to light. Hints of it first emerged from a previous paper the same four MIT researchers published at a symposium in January, which introduced a separate algorithm. In that work, the authors showed that if a predetermined amount of memory were allocated for a computer program to simulate the roll of loaded dice, their algorithm could determine the minimum amount of "error" possible -- that is, how close one comes toward meeting the designated probabilities for each side of the dice.
If one doesn't limit the memory in advance, the error can be reduced to zero, but Saad noticed a variant with zero error that used substantially less memory and was nearly as fast. At first he thought the result might be too trivial to bother with. But he mentioned it to Freer who assured Saad that this avenue was worth pursuing. FLDR, which is error-free in this same respect, arose from those humble origins and now has a chance of becoming a leading technology in the realm of random number generation. That's no trivial matter given that we live in a world that's governed, to a large extent, by random processes -- a principle that applies to the distribution of galaxies in the universe, as well as to the outcome of a spirited game of craps.
###

Study finds surge in hydroxychloroquine/chloroquine prescriptions during COVID-19

From Feb. 16 to April 25, almost half a million more prescriptions of hydroxychloroquine/chloroquine were dispensed nationally compared to the same period in 2019
BRIGHAM AND WOMEN'S HOSPITAL
A new study by investigators from Brigham and Women's Hospital examines changes in prescription patterns in the United States during the COVID-19 pandemic.
In an exploratory analysis of data from GoodRx used to generate national estimates, Brigham investigators found prescriptions of the anti-malarial drug chloroquine and its analogue hydroxychloroquine dramatically surged during the week of March 15, likely due to off-label prescriptions for COVID-19. Results of the study are published in JAMA.
"There have been indications that hydroxychloroquine prescribing had increased and shortages had been reported, but this study puts a spotlight on the extent to which excess hydroxychloroquine/chloroquine prescriptions were filled nationally," said corresponding author Haider Warraich, MD, an associate physician in the Division of Cardiovascular Medicine at the Brigham. "This analysis doesn't include patients who were prescribed HCQ in a hospital setting -- this means that patients could have been taking the drugs at home, without supervision or monitoring for side effects."
Chloroquine is an anti-malarial drug and its analogue, hydroxychloroquine, is used to treat autoimmune diseases such as lupus or rheumatoid arthritis. Both drugs are considered safe and effective for these indications. Laboratory testing has suggested that the drugs may also have antiviral effects, and, given their relatively low cost, there has been much interest in their potential effectiveness against COVID-19. However, a study published last week by Brigham researchers and collaborators found that, in an observational analysis, COVID-19 patients who were given either drug (with or without an antibiotic) did not show an improvement in survival rates and were at increased risk for ventricular arrhythmias.
For the current analysis, Warraich and colleagues looked at prescribing patterns for hydroxychloroquine/chloroquine as well as many other commonly prescribed drugs. These included angiotensin-converting-enzyme-inhibitors (ACEi) and angiotensin-receptor blockers (ARBs), both of which are prescribed for patients with hypertension or heart failure, as well as the antibiotic azithromycin, and the top 10 drug prescriptions filled in 2019. The team compared the number of filled prescriptions for each drug to the number of prescriptions filled last year over a 10-week period from Feb. 16 to April 25.
The team found that fills for all drugs, except the antibiotic amoxicillin and the pain reliever combination of hydrocodone/acetaminophen, peaked during the week of March 15 to March 21, 2020, followed by subsequent declines. During this week, hydroxychloroquine/chloroquine fills for 28 tablets increased from 2,208 prescriptions in 2019 to 45,858 prescriptions in 2020 (an increase of more than 2,000 percent). Over the full 10 weeks, there were close to half a million excess fills of hydroxychloroquine/chloroquine in 2020 compared to the year before.
In contrast, prescriptions for antibiotics such as amoxicillin and azithromycin and for hydrocodone/acetaminophen declined. Prescriptions for heart therapies remained stable or declined slightly. After the surge in prescriptions, the authors observed a reduction in longer-term prescription fills for hydroxychloroquine/chloroquine, which could indicate decreased availability of the drug for patients with systemic lupus erythematosus and rheumatoid arthritis. The United States Food and Drug Administration reported a drug shortage of hydroxychloroquine starting March 31. The surge in prescriptions occurred between March 15 and March 21, within days of the World Health Organization declaring a global coronavirus pandemic on March 11, the U.S. declaring a national emergency on March 13, the publishing of a pre-print about hydroxychloroquine on March 17, and President Trump's announced support of hydroxychloroquine on March 19.
"During this pandemic, there has been both good information and misinformation about benefits and potential harms of common medications like hydroxychloroquine, and there had been conjecture that proven medications for heart failure may be harmful in this patient population," said Warraich. "One positive finding is that we didn't see a stark reduction in prescription fills for routine, chronic care, but our findings for HCQ are concerning."
###
Paper cited: Vaduganathan, M., et al. "Prescription Fill Patterns for Commonly Used Drugs During the COVID-19 Pandemic in the United States," JAMA, DOI: 10.1001/jama.2020.918

Yes, your dog wants to rescue you

ASU Canine Science Collaboratory study shows that pet dogs will try to save their distressed human, as long as they know how
ARIZONA STATE UNIVERSITY
IMAGE: Clive Wynne, Arizona State University professor of psychology, holding a puppy. Credit: Deanna Dent/ASU
What to do. You're a dog. Your owner is trapped in a box and is crying out for help. Are you aware of his despair? If so, can you set him free? And what's more, do you really want to?
That's what Joshua Van Bourg and Clive Wynne wanted to know when they gave dogs the chance to rescue their owners.
Until recently, little research has been done on dogs' interest in rescuing humans, but that's what humans have come to expect from their canine companions -- a legend dating back to Lassie and updated by the popular Bolt.
"It's a pervasive legend," said Van Bourg, a graduate student in Arizona State University's Department of Psychology.
Simply observing dogs rescuing someone doesn't tell you much, Van Bourg said. "The difficult challenge is figuring out why they do it."
So, Van Bourg and Wynne, an ASU professor of psychology and director of the Canine Science Collaboratory at ASU, set up an experiment assessing 60 pet dogs' propensity to rescue their owners. None of the dogs had training in such an endeavor.
In the main test, each owner was confined to a large box equipped with a lightweight door, which the dog could move aside. The owners feigned distress by calling out "help" or "help me."
Beforehand, the researchers coached the owners so their cries for help sounded authentic. In addition, owners weren't allowed to call their dog's name, which would encourage the dog to act out of obedience, and not out of concern for her owner's welfare.
"About one-third of the dogs rescued their distressed owner, which doesn't sound too impressive on its own, but really is impressive when you take a closer look," Van Bourg said.
That's because two things are at stake here. One is the dogs' desire to help their owners, and the other is how well the dogs understood the nature of the help that was needed. Van Bourg and Wynne explored this factor in control tests -- tests that were lacking in previous studies.
In one control test, when the dog watched a researcher drop food into the box, only 19 of the 60 dogs opened the box to get the food. More dogs rescued their owners than retrieved food.
"The key here is that without controlling for each dog's understanding of how to open the box, the proportion of dogs who rescued their owners greatly underestimates the proportion of dogs who wanted to rescue their owners," Van Bourg said.
"The fact that two-thirds of the dogs didn't even open the box for food is a pretty strong indication that rescuing requires more than just motivation, there's something else involved, and that's the ability component," Van Bourg said. "If you look at only those 19 dogs that showed us they were able to open the door in the food test, 84% of them rescued their owners. So, most dogs want to rescue you, but they need to know how."
In another control test, Van Bourg and Wynne looked at what happened when the owner sat inside the box and calmly read aloud from a magazine. What they found was that four fewer dogs, 16 out of 60, opened the box in the reading test than in the distress test.
"A lot of the time it isn't necessarily about rescuing," Van Bourg said. "But that doesn't take anything away from how special dogs really are. Most dogs would run into a burning building just because they can't stand to be apart from their owners. How sweet is that? And if they know you're in distress, well, that just ups the ante."
The fact that dogs did open the box more often in the distress test than in the reading control test indicated that rescuing could not be explained solely by the dogs wanting to be near their owners.
The researchers also observed each dog's behavior during the three scenarios. They noted behaviors that can indicate stress, such as whining, walking, barking and yawning.
"During the distress test, the dogs were much more stressed," Van Bourg said. "When their owner was distressed, they barked more, and they whined more. In fact, there were eight dogs who whined, and they did so during the distress test. Only one other dog whined, and that was for food."
What's more, the second and third attempts to open the box during the distress test didn't make the dogs less stressed than they were during the first attempt. That was in contrast to the reading test, where dogs that had already been exposed to the scenario were less stressed across repeated tests.
"They became acclimated," Van Bourg said. "Something about the owner's distress counteracts this acclimation. There's something about the owner calling for help that makes the dogs not get calmer with repeated exposure."
In essence, these individual behaviors are more evidence of "emotional contagion," the transmission of stress from the owner to the dog, explains Van Bourg, or what humans would call empathy.
"What's fascinating about this study," Wynne said, "is that it shows that dogs really care about their people. Even without training, many dogs will try and rescue people who appear to be in distress -- and when they fail, we can still see how upset they are. The results from the control tests indicate that dogs who fail to rescue their people are unable to understand what to do -- it's not that they don't care about their people.
"Next, we want to explore whether the dogs that rescue do so to get close to their people, or whether they would still open the box even if that did not give them the opportunity to come together with their humans," Wynne added.
###
The study, "Pet dogs (Canis lupus familiaris) release their trapped and distressed owners: Individual variation and evidence of emotional contagion was published last month," was published online last month in the journal PLOS.

Smart windows that self-illuminate on rainy days

POHANG UNIVERSITY OF SCIENCE & TECHNOLOGY (POSTECH)
IMAGE: Humidity sensor combining variable filter and solar cells. Credit: Junsuk Rho (POSTECH)
Smart windows that automatically change colors depending on the intensity of sunlight are gaining attention as they can reduce energy bills by blocking off sun's visible rays during summer. But what about windows that change colors depending on the humidity outside during the monsoon season or on hot days of summer?
Recently, a Korean research team has developed the source technology for smart windows that change colors according to the amount of moisture, without needing electricity.
The joint research team comprised Professor Junsuk Rho of the departments of mechanical and chemical engineering and Jaehyuck Jang and Aizhan Ismukhanova of the chemical engineering department at POSTECH, along with Professor Inkyu Park of KAIST's department of mechanical engineering. Together, they successfully developed a variable color filter with a metal-hydrogel-metal resonator structure based on chitosan hydrogel, and combined it with solar cells to make a self-powering humidity sensor. These research findings were published as a cover story in the latest edition of Advanced Optical Materials, a journal specializing in nanoscience and optics.
Sensors using light are already widely used in our daily lives in measuring the ECG, air quality, or distance. The basic principle is to use light to detect changes in the surroundings and to convert them into digital signals.
Fabry-Pérot interference is one of the resonance phenomena that can be applied in optical sensors and can be realized in the form of multilayer thin films of metal-dielectric-metal. It is known that the resonance wavelength of transmitted light can be controlled according to the thickness and refractive index of the dielectric layer. However, the existing metal-dielectric-metal resonators had a big disadvantage in not being able to control the wavelengths of transmitted light once they are manufactured, making it difficult to use them in variable sensors.
The research team found that when the chitosan hydrogel is made into the metal-hydrogel-metal structure, the resonance wavelength of light transmitted changes in real time depending on the humidity of the environment. This is because the chitosan hydrogel repeats expansion and contraction as the humidity changes around it.
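The thickness dependence the team exploits follows from the standard Fabry-Pérot resonance condition. A minimal sketch (ideal cavity, neglecting the phase shifts at the metal mirrors; the numbers are illustrative, not from the paper):

```python
def fabry_perot_resonance(n_d, d_nm, m=1):
    """Resonance wavelength (nm) of an ideal metal-dielectric-metal cavity:
    2 * n * d = m * lambda, so lambda = 2 * n * d / m
    (mirror phase shifts neglected)."""
    return 2.0 * n_d * d_nm / m

# A hydrogel layer swelling from 200 nm to 220 nm red-shifts the resonance:
print(fabry_perot_resonance(1.5, 200.0))  # 600.0 nm
print(fabry_perot_resonance(1.5, 220.0))  # 660.0 nm
```

Swelling with humidity increases the cavity thickness d (and changes the effective index), which is why the transmitted color tracks the moisture in the air.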
Using this mechanism, the team developed a humidity sensor that can convert light's energy into electricity by combining a solar cell with a water-variable wavelength filter, made of a metal-hydrogel-metal structured metamaterial, that changes resonance wavelength depending on the external humidity.
The design principle is to overlap the filter's resonance wavelength with the wavelength where the absorption of the solar cells changes rapidly. This filter is designed to change the amount of light absorption of solar cells depending on the amount of moisture, and to lead to electric changes that ultimately detect the surrounding humidity.
Unlike the conventional optical humidity sensors, these newly developed ones work regardless of the type of light, whether it be natural, LED or indoor. Also, not only does it function without external power, but it can also predict humidity according to the filter's color.
Professor Junsuk Rho, who led the research, commented, "This technology is a sensing technology that can be used in places like nuclear power reactors where people and electricity cannot reach." He added, "It will create even greater synergy if combined with IoT technology, such as sensors that activate, or smart windows that change color, according to the level of external humidity."
###
The study was supported by the Samsung Research Funding & Incubation Center for Future Technology.

When COVID-19 meets flu season

Pulmonologist outlines factors that could determine severity of '20/'21 flu season
NORTHWESTERN UNIVERSITY
CHICAGO --- As if the COVID-19 pandemic isn't scary enough, the flu season is not far away. How severe will the flu season be as it converges with the COVID-19 outbreak? What can we do to prepare?
Dr. Benjamin Singer, a Northwestern Medicine pulmonologist who treats COVID-19 patients in the intensive care unit, outlines the best defense against influenza, which also may protect against coronavirus.
In an editorial that will be published May 29 in the journal Science Advances, Singer, an assistant professor of pulmonary and critical care and biochemistry and molecular genetics at Northwestern University Feinberg School of Medicine, examines the epidemiology and biology of SARS-CoV-2 and influenza to help inform preparation strategies for the upcoming flu season.
He outlines the following four factors that could determine the severity of the upcoming flu season:
    1. Transmission: Social distancing policies designed to limit the spread of COVID-19 are also effective against the flu. If COVID-19 cases begin to spike in the fall of 2020, re-tightening social distancing measures could help mitigate early spread of the flu to flatten the curves for both viruses.
    2. Vaccination: As we await vaccine trials for COVID-19, we should plan to increase rates of vaccination against the flu, particularly among older adults who are more susceptible to both the flu and COVID-19.
    3. Co-infection: We need widespread availability of rapid diagnostics for COVID-19 and other respiratory pathogens because co-infection with another respiratory pathogen, including the flu, occurred in more than 20% of COVID-19-positive patients who presented with a respiratory viral syndrome early in the pandemic.
    4. Disparities: The COVID-19 pandemic has highlighted unconscionable disparities among African Americans, Latinx and Native Americans so we must galvanize public health efforts aimed to limit viral spread, increase vaccination rates, deploy rapid diagnostics and expand other health care services for vulnerable populations, including communities of color, the poor and older adults.
The Centers for Disease Control and Prevention estimated that the 2019-2020 seasonal influenza epidemic resulted in tens of millions of cases and tens of thousands of deaths.
"Even in non-pandemic years, the flu and other causes of pneumonia represent the eighth-leading cause of death in the United States, and respiratory viruses are the most commonly identified pathogens among hospitalized patients with community-acquired pneumonia," Singer said.
###
Singer is available for interviews with the media. Reporters should contact Kristin Samuelson at ksamuelson@northwestern.edu to schedule an interview.

Next frontier in bacterial engineering

New technique promises end-run around major barrier in genetic engineering of bacteria, setting stage for advances in medicine and beyond
HARVARD MEDICAL SCHOOL
A decades-old bacterial engineering technique called recombineering (recombination-mediated genetic engineering) allows scientists to scarlessly swap pieces of DNA of their choosing for regions of the bacterial genome. But this valuable and versatile approach has remained woefully underused because it has been limited mainly to Escherichia coli--the lab rat of the bacterial world--and to a handful of other bacterial species.
Now a new genetic engineering method developed by investigators in the Blavatnik Institute at Harvard Medical School and the Biological Research Center in Szeged, Hungary, promises to super-charge recombineering and open the bacterial world at large to this underutilized approach.
A report detailing the team's technique is published May 28 in PNAS.
The investigators have developed a high-throughput screening method to look for the most efficient proteins that serve as the engines of recombineering. Such proteins, known as SSAPs, reside within phages--viruses that infect bacteria.
Applying the new method, which enables the screening of more than two hundred SSAPs, the researchers identified two proteins that appear to be particularly promising.
One of them doubled the efficiency of single-spot edits of the bacterial genome. It also improved tenfold the ability to perform multiplex editing--making multiple edits genome-wide at the same time. The other one enabled efficient recombineering in the human pathogen Pseudomonas aeruginosa, a frequent cause of life-threatening, hospital-acquired infections, for which there has long been a dearth of good genetic tools.
"Recombineering will be a very critical tool that will augment our DNA writing and editing capabilities in the future, and this is an important step in improving the efficiency and reach of the technology," said study first author Timothy Wannier, research associate in genetics in lab of George Church, the Robert Winthrop Professor of Genetics at HMS.
Previous genetic engineering methods, including CRISPR-Cas9-based gene editing, have been ill-suited to bacteria because these methods involve "cutting and pasting" DNA, the researchers said. Unlike multicellular organisms, bacteria lack the machinery to repair double-stranded DNA breaks efficiently and precisely, so DNA cutting can profoundly interfere with the stability of the bacterial genome, Wannier said. The advantage of recombineering is that it works without cutting DNA.
Instead, recombineering involves sneaking edits into the genome during bacterial reproduction. Bacteria reproduce by splitting in two. During that process, one strand of their double-stranded, circular DNA chromosomes goes to each daughter cell, along with a new second strand that grows during the early stages of fission. The raw materials for recombineering are short, approximately 90-base strands of DNA that are made to order. Each strand is identical to a sequence in the genome, except for edits in the strand's center. These short strands slip into place as the second strands of the daughter cells grow, efficiently incorporating the edits into their genomes.
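The oligo design just described can be sketched in a few lines (a toy illustration; the function and the single-base edit are invented for this example, and real recombineering oligos involve additional design constraints):

```python
def design_oligo(genome, center, new_base, flank=45):
    """Build a ~90-nt single-stranded DNA oligo that matches the genome
    around position `center`, except for the edited base in the middle."""
    left = genome[center - flank:center]           # upstream homology arm
    right = genome[center + 1:center + 1 + flank]  # downstream homology arm
    return left + new_base + right

# Toy genome: change the base at position 50 from 'A' to 'G'
genome = "A" * 100
oligo = design_oligo(genome, 50, "G")
print(len(oligo))   # 91
print(oligo[45])    # G
```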
Among many possible uses, edits might be designed to interfere with a gene in order to pinpoint its function or, alternatively, to improve production of a valuable bacterial product. SSAPs mediate attachment and proper placement of the short strand within the growing new half of the daughter chromosome.
Recombineering might enable the substitution of a naturally occurring bacterial amino acid--one of the building blocks of proteins--with an artificial one. Among other things, doing so could enable the use of bacteria that depend on these artificial amino acids to survive for environmental cleanup of oil spills or other contaminants, meaning that the modified bacteria could be easily annihilated once the work is done, avoiding the risks of releasing engineered microbes into the environment, Wannier said.
"The bacteria would require artificial amino acid supplements to survive, meaning that they are preprogrammed to perish without the artificial feed stock," Wannier added.
A version of recombineering, called multiplex automated genome engineering (MAGE), could greatly boost the benefits of the technique. The particular advantage of MAGE is its ability to make multiple edits throughout the genome in one fell swoop.
MAGE could lead to progress in projects requiring reengineering of entire metabolic pathways, said John Aach, lecturer in genetics at HMS. Case in point, Aach added, are large-scale attempts to engineer microbes to turn wood waste into liquid fuels.
"Many investigator-years' effort in that quest have made great progress, even if they have not yet produced market-competitive products," he said.
Such endeavors require testing many combinations of edits, Aach said.
"We have found that using MAGE with a library of DNA sequences is a very good way of finding the combinations that optimize pathways."
A more recent descendant of recombineering, named directed evolution with random genomic mutations (DIvERGE), promises benefits in the fight against infectious diseases and could open new avenues for tackling antibiotic resistance.
By introducing random mutations into the genome, DIvERGE can speed up natural bacterial evolution. This helps researchers quickly uncover changes that could arise naturally in harmful bacteria that would make them resistant to antibiotic treatment, explained Akos Nyerges, research fellow in genetics in Church's lab at HMS, previously at the Biological Research Center of the Hungarian Academy of Sciences.
"Improvements in recombineering will allow researchers to more quickly test how bacterial populations can gain resistance to new antibacterial drugs, helping researchers to identify less resistance-prone antibiotics," Nyerges said.
Recombineering will likely usher in a whole new world of applications that would be hard to foresee at this juncture, the researchers said.
"The new method greatly improves our ability to modify bacteria," Wannier said. "If we could modify a letter here and there in the past, the new approach is akin to editing words all over a book and doing so opens up the scientific imagination in a way that was not previously possible."
###
Church was a principal investigator on the study. Co-investigators on the research included Helene Kuchwara, Márton Czikkely, Dávid Balogh, Gabriel Filsinger, Nathaniel Borders, Christopher Gregg, Marc Lajoie, Xavier Rios, and Csaba Pál.
The work was funded by the U.S. Department of Energy (DE-FG02-02ER63445), the European Research Council (H2020-ERC-2014-CoG 648364), GINOP (MolMedEx TUMORDNS) GINOP-2.3.2-15-2016-00020, GINOP (EVOMER) GINOP-2.3.2-15-2016-00014, the Momentum Program of the Hungarian Academy of Sciences, an EMBO Long-Term Fellowship, the Szeged Scientists Academy under the sponsorship of the Hungarian Ministry of Human Capacities (EMMI: 13725-2/2018/INTFIN), the UNKP-18-2 New National Excellence Program of the Hungarian Ministry of Human Capacities, and the UNKP 10-2 New National Excellence Program of the Hungarian Ministry for Innovation and Technology.
Relevant disclosures:
Church has related financial interests in enEvolv, GRO Biosciences and 64-x and, with some coauthors, has submitted a patent application relating to pieces of this work (WO2017184227A2). Wannier, Church and another coauthor have submitted a patent application related to the improved SSAP variants referenced here. Nyerges is listed as an inventor on a patent application related to DIvERGE.

The most common organism in the oceans harbors a virus in its DNA

UNIVERSITY OF WASHINGTON
IMAGE: The viruses, colored orange, attached to a membrane vesicle from the SAR11 marine bacteria, colored gray, that was the subject of this study.
CREDIT: Morris et al./Nature Microbiology
The most common organism in the oceans, and possibly on the entire planet, is a family of single-celled marine bacteria called SAR11. These drifting organisms look like tiny jelly beans and have evolved to outcompete other bacteria for scarce resources in the oceans.
We now know that this group of organisms thrives despite -- or perhaps because of -- the ability to host viruses in their DNA. A study published in May in Nature Microbiology could lead to new understanding of viral survival strategies.
University of Washington oceanographers discovered that the bacteria that dominate seawater, known as Pelagibacter or SAR11, host a unique virus. The virus is of a type that spends most of its time dormant in the host's DNA but occasionally erupts to infect other cells, potentially carrying some of its host's genetic material along with it.
"Many bacteria have viruses that exist in their genomes. But people had not found them in the ocean's most abundant organisms," said co-lead author Robert Morris, a UW associate professor of oceanography. "We suspect it's probably common, or more common than we thought -- we just had never seen it."
This virus' two-pronged survival strategy differs from similar ones found in other organisms. The virus lurks in the host's DNA and gets copied as cells divide, but for reasons still poorly understood, it also replicates in and is released from some cells.
The new study shows that as many as 3% of the SAR11 cells can have the virus multiply and split, or lyse, the cell -- a much higher percentage than for most viruses that inhabit a host's genome. This produces a large number of free viruses and could be key to its survival.
"There are 10 times more viruses in the ocean than there are bacteria," Morris said. "Understanding how those large numbers are maintained is important. How does a virus survive? If you kill your host, how do you find another host before you degrade?"

Co-lead author Kelsy Cain fills a bottle with seawater off the coast of Oregon as an undergraduate aboard the RV Roger Revelle during a research cruise in July 2017. Cain diluted the water several times and then isolated a new strain of SAR11 bacteria that became the focus of the new paper.

The study could prompt basic research that helps clarify host-virus interactions in other settings.
"If you study a system in bacteria, that is easier to manipulate, then you can sort out the basic mechanisms," Morris said. "It's not too much of a stretch to say it could eventually help in biomedical applications."
The UW oceanography group had published a previous paper in 2019 looking at how marine phytoplankton, including SAR11, use sulfur. That allowed the researchers to cultivate two new strains of the ocean-dwelling organism and analyze one strain, NP1, with the latest genetic techniques.
Co-lead author Kelsy Cain collected samples off the coast of Oregon during a July 2017 research cruise. She diluted the seawater several times and then used a sulfur-containing substance to grow the samples in the lab -- a difficult process for organisms that prefer to exist in seawater.
The team then sequenced this strain's DNA at the UW PacBio sequencing center in Seattle.
"In the past we got a full genome, first try," Morris said. "This one didn't do that, and it was confusing because it's a very small genome."
The researchers found that a virus was complicating the task of sequencing the genome. Then they discovered a virus wasn't just in that single strain.
"When we went to grow the NP2 control culture, lo and behold, there was another virus. It was surprising how you couldn't get away from a virus," said Cain, who graduated in 2019 with a UW bachelor's in oceanography and now works in a UW research lab.
Cain's experiments showed that the virus' switch to replicating and bursting cells is more active when the cells are deprived of nutrients, lysing up to 30% of the host cells. The authors believe that bacterial genes that hitch a ride with the viruses could help other SAR11 maintain their competitive advantage in nutrient-poor conditions.
"We want to understand how that has contributed to the evolution and ecology of life in the oceans," Morris said.
Pelagibacter, or SAR11, is a single-celled bacterium that survives off dissolved carbon throughout the oceans. It makes up roughly one in four cells in the ocean.
###
Other co-authors are postdoctoral researcher Kelli Hvorecny and associate professor Justin Kollman in the UW Department of Biochemistry. The study was funded by the National Science Foundation and the National Institutes of Health's National Institute of Allergy and Infectious Disease.
For more information, contact Morris at morrisrm@uw.edu or 206-221-7228 and Cain at kcain97@uw.edu.

Towards a climate neutral Europe: The land sector is key

CMCC FOUNDATION - EURO-MEDITERRANEAN CENTER ON CLIMATE CHANGE
In 2014, EU leaders agreed that all sectors should contribute to the European 2030 emission reduction target, including the land use sector, which did not count towards the achievement of the previous climate change mitigation goals. In 2018, this agreement was implemented by the Regulation on the inclusion of greenhouse gas emissions and removals from land use, land use change and forestry (LULUCF) in the 2030 EU climate and energy framework. The Regulation lays down new rules for the accounting of the sector's emissions and removals, and for assessing EU Member States' compliance with these. For the first time, this allows the land sector to contribute, at least in part, to the achievement of the EU's climate change mitigation targets.
The paper "Making sense of the LULUCF Regulation: Much ado about nothing?", produced in collaboration with the CMCC Foundation, assesses the importance of the LULUCF Regulation and highlights its weaknesses and strengths in the context of current EU climate and sustainability policies.
The three authors - among them Maria Vincenza Chiriacò and Lucia Perugini, researchers in the CMCC division dedicated to the study of agriculture, forests and ecosystem services - explain that the land sector plays a crucial role in climate change mitigation due to a peculiarity: the sector can either release greenhouse gases into the atmosphere, acting as a source of emissions, or, conversely, store carbon, acting as a sink. Whereas some sectors can reduce or even eliminate their emissions by foregoing the use of fossil fuels (via a transition to renewable energy sources and increased energy efficiency interventions), other sectors - such as food production and waste - cannot. With its capacity to absorb CO2, the land sector can therefore compensate for part of these unavoidable emissions, making it an important player in the EU's mitigation target of reducing emissions by 40% before 2030.
"Given the potential for climate change mitigation embedded in the good management of the LULUCF sector, and underlined in the latest IPCC Special Report on 'Climate change and land', it is extremely important that emissions and removals of the land sector are accounted for, to incentivize virtuous forest and agricultural management in the EU. Thanks to this Regulation, the sector can finally contribute to the EU's mitigation targets. This was also necessary to align the EU with the Paris Agreement requirement for economy-wide mitigation targets. Although the new Regulation has much improved the accounting rules for the LULUCF, it is still constrained within certain limits. We can consider the LULUCF Regulation as a first step towards its full recognition", affirms Perugini, who is currently involved in the negotiating process under the UNFCCC (United Nations Framework Convention on Climate Change) as part of the Italian delegation dedicated to defining the role of the land sector.
Indeed, the Regulation demands that EU Member States ensure, between 2021 and 2030, that the LULUCF sector remain emission "neutral", and therefore generate neither credits nor debits. As of today, only a small part of credits generated by the LULUCF can be used to compensate emissions generated in other sectors towards the EU climate goals. Furthermore, the Regulation allows for possible debts arising from the land sector, under given conditions, to go unaccounted for by single member states.
The authors look forward to a further review of the 2030 EU climate framework, as envisioned by the EU Green Deal, as an opportunity to better tap into the sector's sizeable mitigation potential. "With the increased ambitions foreseen by the 'European Green Deal', which includes the specific objective of making the EU the first climate neutral continent, including the contributions of every economic sector in the EU targets is even more important, as it incentivizes all sectors to do their best in the fight against climate change", continues Chiriacò.
The roadmap designed by the EU Commission - with the final objective of having zero net emissions of greenhouse gases by 2050 - includes the target of reducing GHG emissions by at least 50%, and possibly towards 55%, by 2030 compared with 1990 levels, and therefore increasing current ambitions.
Achieving these climate goals will require a deep cut in emissions in all sectors.
"The subject matter of the LULUCF Regulation closely intersects with that of other EU law and policy instruments dealing with agriculture and forestry, most saliently the Common Agricultural Policy (CAP) and the Renewable Energy Directive (RED). The EU's ambitious targets ask for a strong coordination and integration among the various sustainability and climate policies linked to the land sector, where all debits and credits generated are accounted for, with no limitations. Only in this way will we have full accountability of emissions and removals from the agriculture and forestry sectors, which will be crucial to monitor progress and reward those that engage in virtuous behaviour, and penalize those who do not", concludes Perugini.
###
For more information: Savaresi, A., Perugini, L., Chiriacò, M. V., Making sense of the LULUCF Regulation: Much ado about nothing?, RECIEL - Review of European, Comparative and International Environmental Law, DOI: https://doi.org/10.1111/reel.12332

Growing evidence that minority ethnic groups in England may be at higher risk of COVID-19

BMC (BIOMED CENTRAL)
Previous pandemics have often disproportionately impacted ethnic minorities and socioeconomically disadvantaged populations. While early evidence suggests that the same may be occurring in the current SARS-CoV-2 pandemic, research into the subject remains limited.
A team of researchers at the University of Glasgow and Public Health Scotland in the UK analysed data on 392,116 participants in the UK Biobank study, a large long-term study investigating the contribution of genes and the environment to the development of disease. UK Biobank data, which include information on social and demographic factors such as ethnicity and socioeconomic position, health, and behavioural risk factors, were linked to results of COVID-19 tests conducted in England between 16 March 2020 and 3 May 2020. Of the participants whose data were analysed, 348,735 were White British, 7,323 were South Asian, and 6,395 were from black ethnic backgrounds. In total, 2,658 participants had been tested for SARS-CoV-2 and 948 had at least one positive test. Of those, 726 received a positive test in a hospital setting, suggesting more severe illness.
The authors found that, compared to people from white British backgrounds, the risks of testing positive were largest in black and South Asian minority groups, who were 3.4 and 2.4 times more likely to test positive, respectively, with people of Pakistani ethnicity at highest risk within the South Asian group (3.2 times more likely to test positive). Ethnic minorities were also more likely to receive their diagnosis in a hospital setting, which suggests more severe illness. The observed ethnic differences in infection risk did not appear to be fully explained by differences in pre-existing health, behavioural risk factors, country of birth, or socioeconomic position. The authors also found that living in a disadvantaged area was associated with a higher risk of testing positive, particularly for the most disadvantaged (2.2 times more likely to test positive compared to the least disadvantaged), as was having the lowest level of education (2.0 times more likely to test positive compared to the highest level of education).
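The figures above are risk ratios. As a minimal sketch of the underlying arithmetic - using made-up counts, not the study's data, and without the statistical adjustments the authors applied - an unadjusted risk ratio divides the proportion testing positive in one group by the proportion in a reference group:

```python
def relative_risk(cases_group, n_group, cases_ref, n_ref):
    """Unadjusted risk ratio: risk in a group divided by risk in a reference group."""
    risk_group = cases_group / n_group  # proportion testing positive in the group
    risk_ref = cases_ref / n_ref        # proportion testing positive in the reference group
    return risk_group / risk_ref

# Hypothetical counts chosen for illustration only:
# 32 positives among 6,395 participants vs. 515 among 348,735.
print(round(relative_risk(32, 6395, 515, 348735), 1))  # prints 3.4
```

A published estimate such as "3.4 times more likely" is this kind of ratio after adjustment for factors like age, health, and socioeconomic position, which a raw two-group calculation does not capture.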
The findings suggest that some ethnic minority groups, especially black and South Asian people, may be particularly vulnerable to the adverse consequences of COVID-19. An immediate policy response is needed to ensure that the health system is responsive to the needs of ethnic minority groups, according to the authors. This should include ensuring that health and care workers, who are often from minority ethnic populations, have access to the necessary personal protective equipment. Timely communication of guidelines to reduce the risk of virus exposure, in a range of languages, should also be considered.
The authors caution that test result data was only available for England. Those who were more advantaged were more likely to participate in the UK Biobank study and ethnic minorities may be less well represented. Further research is needed to investigate whether these findings are reflective of the broader UK population, alongside analysis of other datasets examining how SARS-CoV-2 infection affects different ethnic and socioeconomic groups, including in representative samples across different countries.
###

Fearful Great Danes provide new insights to genetic causes of fear

UNIVERSITY OF HELSINKI
IMAGE: Professor Lohi and Great Dane Reno.
CREDIT: University of Helsinki
The study identified a genomic region that includes several candidate genes associated with brain development and function, as well as anxiety; further analysis of these genes may reveal new neural mechanisms related to fear.
For the purposes of the study, carried out by Professor Hannes Lohi's research group and published in the Translational Psychiatry journal, data from a total of 120 Great Danes was collected. The Great Dane breed is among the largest dog breeds in the world.
The project was launched after a number of Great Dane owners approached the research group to tell them about their dogs' disturbing fearfulness towards unfamiliar human beings in particular.
"Fear in itself produces a natural and vital reaction, but excessive fear can be disturbing and results in behavioural disorders. Especially in the case of large dogs, strongly expressed fearfulness is often problematic, as it makes it more difficult to handle and control the dog," says Riika Sarviaho, PhD from the University of Helsinki.
In dogs, behavioural disorders associated with anxiety and fearfulness include generalised anxiety disorder and a range of phobias. Fear can be evidenced, for example, as the dog's attempt to flee from situations they experience as frightening. At its worst, fear can manifest as aggression, which may result in attacks against other dogs or humans.
"Previous studies have suggested that canine anxiety and fearfulness could correspond with anxiety disorders in humans. In fact, investigating fearfulness in dogs may also shed more light on human anxiety disorders and help understand their genetic background," says Professor Lohi, explaining the broader goal of the study.
A new genomic region underlying fearfulness
The study utilised a citizen science approach as the dog owners contributed by completing a behavioural survey concerning their dogs, in which the dogs received scores according to the intensity of fear. Through genetic research, a genomic region associated with fearfulness was identified in chromosome 11. The analysis was repeated by taking into consideration the socialisation carried out in puppyhood, or the familiarisation of the dogs with new people, dogs and situations. The re-analysis reinforced the original finding.
"In the case of behavioural studies, it's important to keep in mind that, in addition to genes, the environment has a significant impact on the occurrence of specific traits. For dogs, the socialisation of puppies has been found to be an important environmental factor that strongly impacts fearfulness. In this study, the aim was to exclude the effect of puppyhood socialisation and, thus, observe solely the genetic predisposition to fearfulness," says Sarviaho.
The genomic region was also studied in more detail with the help of whole genome sequencing, but so far the researchers have not succeeded in identifying a specific gene variant within it that predisposes to fearfulness.
"Although no actual risk variant was identified, the genomic region itself is interesting, as it contains a number of genes previously associated in various study models with neural development and function, as well as anxiety. For example, the MAPK9 gene has been linked with brain development and synaptic plasticity as well as anxiety, while RACK1 has been associated with neural development and N4BP3 with neurological diseases," says Professor Lohi.
Link between accelerated puppyhood growth and timidity?
A genomic region in humans corresponding with the one now associated with canine fearfulness is linked to a rare syndrome, which causes both neurological symptoms and, among other things, accelerated growth in childhood.
"Research on the topic is only at the early stages and findings have to be carefully interpreted, but it's interesting to note, when focusing on a particularly large dog breed, that the genomic region associated with fearfulness appears to have a neurological role as well as one related to growth," Sarviaho adds.
So far, gene discoveries in canine behavioural research have remained fairly rare, and the genomic region now identified has not previously been linked with fearfulness. Lohi's research group has previously described two genomic regions associated with canine generalised fear and sensitivity to sound. The genetic research findings support the hypothesis that fearfulness and anxiety are inherited traits. To be able to identify more detailed risk factors and confirm the relevance of the findings, the study should be repeated with a more extensive dataset.
###

Better prepared for future crises

Recommendations from risk researchers
INSTITUTE FOR ADVANCED SUSTAINABILITY STUDIES E.V. (IASS)

The article gives an overview of the spread of Covid-19 and outlines six causes of the crisis: the exponential infection rate, international integration, the insufficient capacity of health care systems in many countries, conflicts of competence and a lack of foresight on the part of many government agencies, the need to grapple with the economic impacts of the shutdown in parallel with the health crisis, and weaknesses in capital markets dating back to the financial crisis of 2008. The solutions proposed by the team of authors were developed using a framework created by the International Risk Governance Council, to which Ortwin Renn contributed.
According to the study, five aspects of risk governance described in the framework are particularly relevant to efforts to overcome the Corona Crisis. The authors highlight the importance of increasing global capacities for the scientific and technical appraisal of risks in order to provide reliable early warning systems. This research must be supplemented by an analysis of perceived risk - i.e. individual and public opinion, concerns, and wishes. Awareness and acknowledgement of these perceptions facilitate effective crisis communication and enable authorities to issue effective public health guidelines. This leads to a key task for decision-makers, risk evaluation: whether, and to what extent, are risk reduction measures necessary? What trade-offs arise during the development of measures and restrictions, and how can they be resolved on the basis of recognized ethical criteria in light of the considerable degree of uncertainty? This characterization and evaluation of the risk yields qualified options for risk management. The focus here is on the development of collectively binding decisions on measures to minimize the suffering of affected populations as a whole, as well as strategies to minimize undesirable side effects. Coordinated crisis and risk communication, underpinned by robust scientific and professional communications expertise, is crucial to the success of efforts to tackle the crisis.
The team of authors has distilled ten recommendations from its findings:
  1. Address risks at source: In the case of pandemics, this means reducing the possibility of viruses being transmitted from animals to humans.
  2. Respond to warnings: This includes the review of national and international risk assessments, and the development of better safeguards for risks with particularly serious impacts.
  3. Acknowledge trade-offs: Measures to reduce a particular risk will impact other risks. Undesirable side effects must be identified in risk assessments.
  4. Consider the role of technology: How can machine learning and other technologies be applied to support pandemic assessment, preparedness, and responses?
  5. Invest in resilience: Gains in organizational efficiency have made critical systems such as health care more vulnerable. Their resilience must be strengthened, for example by reducing dependencies on important products and services.
  6. Concentrate on the most important nodes in the system: The early imposition of restrictions on air travel has proved effective in combating a pandemic. A global emergency fund could be established to address the cost of such measures.
  7. Strengthen links between science and policymaking: Those countries in which scientific information and science-based policy advice are readily available to policymakers have had greater success in combating the coronavirus.
  8. Build state capacities: Tackling systemic risks should be viewed as an integral aspect of good governance that is performed on a continuing basis rather than as an emergency response.
  9. Improve communication: Communication around Covid-19 was slow or deficient in a number of countries. One solution would be the establishment of national and international risk information and communication units.
  10. Reflect on social disruption: The Corona Crisis is forcing people and organizations to experiment with new work and life patterns. Now is the time to consider which of these changes should be maintained over the longer term.
###