Monday, October 19, 2020

BROADBAND 

Internet connectivity is oxygen for research and development work

UNIVERSITY OF ILLINOIS COLLEGE OF AGRICULTURAL, CONSUMER AND ENVIRONMENTAL SCIENCES

Research News

IMAGE: EMMANUEL TOGO, IT ARCHITECT FOR THE UNIVERSITY OF GHANA, GAVE A TOUR OF THE UNIVERSITY'S CAMPUS NETWORK OPERATIONS CENTER DURING AN ICT HEALTH CHECKUP CONDUCTED BY PAUL HIXSON, UNIVERSITY OF...

CREDIT: COLLEGE OF ACES, UNIVERSITY OF ILLINOIS.

URBANA, Ill. - Fast and reliable internet access is fundamental for research and development activity around the world. Seamless connectivity is a privilege we often take for granted. But in developing nations, technological limitations can become stumbling blocks to efficient communication and cause significant disadvantages.

Pete Goldsmith, director of the Soybean Innovation Lab (SIL) at the University of Illinois, works closely with partner organizations in several African countries. He noticed that his African colleagues often dealt with technological problems that made communication challenging. For example, they sometimes had to rely on their cell phones because their institution's internet access was unreliable.

Goldsmith teamed up with two IT experts at U of I, former Chief Information Officer Paul Hixson and Director of Research IT and Innovation Tracy Smith, to investigate technological challenges facing institutions in developing countries.

"Connectivity is the oxygen organizations run on," Hixson says. "It's such a basic requirement that it's often not even recognized as an issue. But lack of connectivity severely hinders an organization's ability to perform simple functions, conduct research, and compete for grants."

Goldsmith, Hixson, and Smith conducted an in-depth case study of information and communications technology (ICT) infrastructure at the Savannah Agricultural Research Institute (SARI), a leading research station in Ghana and a close collaborator of SIL.

The case study included focus groups, interviews, and a technological analysis of SARI's equipment and connectivity. Based on this study, the research team developed the ICT Health Checkup, a procedure IT administrators can use to methodically assess the current state of their systems, identify gaps affecting performance, and document steps for remediation.

The ICT Health Checkup tool systematically evaluates four key elements of ICT infrastructure. The first step focuses on connectivity and bandwidth, identifying the required bandwidth to accommodate the institution's needs and whether the institution has an uninterrupted fiber-based connection to the global internet. The second step analyzes core physical infrastructure, including dependable electricity, local network design, and both wired and wireless connectivity capabilities.

The third step looks at available intranet service offerings for researchers such as local storage, data backup procedures, access control, security procedures, email service, and cloud access. Finally, the fourth step deals with the human resources and technical support requirements for planning and managing the institution's IT infrastructure.

"With this tool, institutions can go through a checklist, and at each point there is a 'stoplight'. If it's red, you know there is something that needs to be fixed, because there are conditions that will act as a block and you can't go on until they are fixed - until there's a green light. So turning things from red to green at each step is crucial; methodically going through each step at a time and making sure it's fixed before moving on to the next one," Hixson explains.

The researchers compare the ICT Health Checkup to a medical health exam: it captures current conditions and serves as a benchmark against which improvements can be measured.

Goldsmith says the tool can be used to empower organizations so they can be self-sufficient. "With proper connectivity you can manage and store research data, compete for grants, and manage awards," he notes. "It's the foundation that allows institutions to participate fully in a global context."

The research team is currently expanding the study, collecting data from nine institutions and five networking organizations operating in three countries, in order to create a more robust picture of internet connectivity challenges and potential solutions across Africa.

They are also collaborating with the National Research and Education Networks (NRENs) in each of the sub-Saharan African countries in which SIL operates. These African NRENs are comparable to Internet2, one of the leading NRENs in the U.S., which serves the country's research and higher-education communities and has been an instrumental partner in the expansion and adoption of advanced computing technologies at U of I.

"With the ICT health checkup, our partner African NRENs now have an actual assessment tool they can use with their member institutions. It's becoming a continent-wide approach as they are starting to adopt this new instrument created at the U of I to be their benchmark and measurement tool," Goldsmith says.

"The U of I is ideally positioned to provide this knowledge, because of the university's continued leadership in the computational and network administration space," he adds. "Now we are extending that to have real impact overseas."

###

The article, "The ICT Health Checkup Tool: Assessing Connectivity of the National Agriculture Research System (NARS)," is published in the African Journal of Food, Agriculture, Nutrition and Development.

Funding was provided by the Feed the Future Innovation Lab for Soybean Value Chain Research (Soybean Innovation Lab (SIL)) under the U.S. Government's global hunger and food security initiative, Feed the Future.

The Soybean Innovation Lab is in the Department of Agricultural and Consumer Economics, College of Agricultural, Consumer and Environmental Sciences, University of Illinois.

 

Those funky cheese smells allow microbes to 'talk' to and feed each other

Researchers discover that bacteria that ripen cheese respond to the volatile gases produced by cheese fungi

TUFTS UNIVERSITY

Research News

IMAGE: FUNGI AND BACTERIA KEY TO RIPENING CHEESE COMMUNICATE WITH AND FEED EACH OTHER USING VOLATILE COMPOUNDS

CREDIT: ADAM DETOUR

MEDFORD/SOMERVILLE, Mass. (October 16, 2020)-- Researchers at Tufts University have found that those distinctly funky smells from cheese are one way that fungi communicate with bacteria, and what they are saying has a lot to do with the delicious variety of flavors that cheese has to offer. The research team found that common bacteria essential to ripening cheese can sense and respond to compounds produced by fungi in the rind and released into the air, enhancing the growth of some species of bacteria over others. The composition of bacteria, yeast and fungi that make up the cheese microbiome is critical to flavor and quality of the cheese, so figuring out how that can be controlled or modified adds science to the art of cheese making.

The discovery, published in Environmental Microbiology, also provides a model for the understanding and modification of other economically and clinically important microbiomes, such as in soil or the gastrointestinal tract.

"Humans have appreciated the diverse aromas of cheeses for hundreds of years, but how these aromas impact the biology of the cheese microbiome had not been studied," said Benjamin Wolfe, professor of biology in the School of Arts and Science at Tufts University and corresponding author of the study. "Our latest findings show that cheese microbes can use these aromas to dramatically change their biology, and the findings' importance extends beyond cheese making to other fields as well."

Many microbes produce airborne chemical compounds called volatile organic compounds, or VOCs, as they interact with their environment. A widely recognized microbial VOC is geosmin, which is emitted by soil microbes and can often be smelled after a heavy rain in forests. As bacteria and fungi grow on ripening cheeses, they secrete enzymes that break down amino acids to produce acids, alcohols, aldehydes, amines, and various sulfur compounds, while other enzymes break down fatty acids to produce esters, methyl ketones, and secondary alcohols. All of those biological products contribute to the flavor and aroma of cheese and they are the reason why Camembert, Blue cheese and Limburger have their signature smells.

The Tufts researchers found that VOCs don't just contribute to the sensory experience of cheese, but also provide a way for fungi to communicate with and "feed" bacteria in the cheese microbiome. By pairing 16 different common cheese bacteria with 5 common cheese rind fungi, the researchers found that the fungi caused responses in the bacteria ranging from strong stimulation to strong inhibition. One bacterial species, Vibrio casei, responded by growing rapidly in the presence of VOCs emitted by all five of the fungi. Other bacteria, such as Psychrobacter, only grew in response to one of the fungi (Galactomyces), and two common cheese bacteria decreased significantly in number when exposed to VOCs produced by Galactomyces.

The researchers found that the VOCs altered the expression of many genes in the bacteria, including genes that affect the way they metabolize nutrients. One metabolic mechanism that was enhanced, called the glyoxylate shunt, allows the bacteria to use simpler compounds as "food" when more complex sources such as glucose are unavailable. In effect, these changes enabled the bacteria to better "eat" some of the VOCs and use them as sources for energy and growth.

"The bacteria are able to actually eat what we perceive as smells," said Casey Cosetta, post-doctoral scholar in the department of biology at Tufts University and first author of the study. "That's important because the cheese itself provides little in the way of easily metabolized sugars such as glucose. With VOCs, the fungi are really providing a useful assist to the bacteria to help them thrive."

There are direct implications of this research for cheese producers around the world. When you walk into a cheese cave there are many VOCs released into the air as the cheeses age. These VOCs may impact how neighboring cheeses develop by promoting or inhibiting the growth of specific microbes, or by changing how the bacteria produce other biological products that add to the flavor. A better understanding of this process could enable cheese producers to manipulate the VOC environment to improve the quality and variety of flavors.

The implications of the research can even extend much further. "Now that we know that airborne chemicals can control the composition of microbiomes, we can start to think about how to control the composition of other microbiomes, for example in agriculture to improve soil quality and crop production and in medicine to help manage diseases affected by the hundreds of species of bacteria in the body," said Wolfe.

###

Other authors of this study include Nicole Kfoury, former postdoctoral scholar at Tufts and currently applications scientist at Gerstel, Inc., and Albert Robbat Jr., associate professor of chemistry at Tufts.

This research was supported by a grant from the National Science Foundation (#1715553).

Cosetta, C.M., Kfoury, N., Robbat, A., and Wolfe, B.E. "Fungal volatiles mediate cheese rind microbiome assembly" Environmental Microbiology, 9 September 2020; DOI: 10.1111/1462-2920.15223

About Tufts University

Tufts University, located on campuses in Boston, Medford/Somerville and Grafton, Massachusetts, and in Talloires, France, is recognized among the premier research universities in the United States. Tufts enjoys a global reputation for academic excellence and for the preparation of students as leaders in a wide range of professions. A growing number of innovative teaching and research initiatives span all Tufts campuses, and collaboration among the faculty and students in the undergraduate, graduate and professional programs across the university's schools is widely encouraged.

 

New research comparing HIV medications set to change international recommendations

UNIVERSITY OF BRITISH COLUMBIA

Research News

IMAGE: THE STUDY'S LEAD AUTHOR, DR. STEVE KANTERS, WHO COMPLETED THE RESEARCH AS A PHD CANDIDATE IN UBC'S SCHOOL OF POPULATION AND PUBLIC HEALTH.

CREDIT: UNIVERSITY OF BRITISH COLUMBIA

A new study by UBC researchers is set to change international treatment recommendations for people who are newly diagnosed with HIV--an update that could affect nearly two million people per year worldwide.

The study, published today in EClinicalMedicine, a journal of The Lancet, was commissioned by the World Health Organization (WHO) as part of a planned update to its guidelines for HIV antiretroviral treatment (ART). The study found that dolutegravir is the optimal medication for first-line treatment for people newly diagnosed with HIV, a choice that has not been clear over the past several years.

"Research supporting the 2016 WHO guidelines suggested that dolutegravir was effective and well tolerated, but its efficacy and safety among key populations, such as pregnant women and people living with both HIV and tuberculosis (TB), remained unclear," said the study's lead author, Dr. Steve Kanters, who completed the research as a PhD candidate in UBC's School of Population and Public Health (SPPH). "In 2018, new research warned of a potentially serious increase in risk of neural tube defects in the children of women who became pregnant while taking this treatment."

The risk of adverse reaction meant that, although dolutegravir was found to be favourable compared to other options, it was only recommended as an alternative, with an antiretroviral called efavirenz recommended as the primary treatment.

The study team, which included Dr. Nick Bansback, associate professor at SPPH, Dr. Aslam Anis, professor at SPPH and director of the Centre for Health Evaluation and Outcome Sciences (CHÉOS), and Dr. Ehsan Karim, assistant professor at SPPH, completed a network meta-analysis of evidence from 68 available ART clinical trials.

They found dolutegravir was superior to efavirenz in most outcomes, including viral suppression, tolerability, and safety. According to Kanters, the increased odds of viral suppression with dolutegravir could have a significant impact on achieving international goals for HIV treatment.

"We found about a five per cent increase in the probability of viral suppression, which means that more people who start treatment will be able to successfully control their HIV," he said.

Another key attribute of dolutegravir is that it is effective in people who are resistant to NNRTI-class antiretrovirals, like efavirenz, a problem that is becoming increasingly common.

The analysis also showed that dolutegravir and efavirenz had similar rates of adverse events for pregnant women -- the increased risk of neural tube defects for dolutegravir was estimated to be less than 0.3 per cent.

"The new evidence on neural tube defects show that the risk with dolutegravir is much more tolerable than previously thought and should quell the initial worry about this drug," said Kanters.

"Dolutegravir appears to be here to stay as the preferred treatment for people newly diagnosed with HIV," he said. "However, it is important to recognize the good that efavirenz has done over the past two decades, as it helped lead the ART scale-up around the world."

Despite its many benefits, dolutegravir use was associated with increased weight gain, a side effect that could raise the risk of aging-associated comorbidities such as heart attack or stroke.

"In many places, well-treated HIV has become a chronic condition and we are now seeing people living long lives with HIV," said Kanters. "The research community will continue to monitor the effects dolutegravir may have on the healthy aging process."

While this study is specifically focused on the optimal treatment for people newly diagnosed with HIV, an upcoming publication will review the evidence in support of switching to dolutegravir for people whose first treatment choice has been unsuccessful in controlling their infection. This recommendation could mean improved treatment for the many people living with HIV around the world who are unable to achieve viral suppression despite being on treatment.

###

 

Deep sea coral time machines reveal ancient CO2 burps

UNIVERSITY OF BRISTOL

Research News

IMAGE: CORAL IMAGES: ANALYSIS OF THE FOSSIL REMAINS OF DEEP-SEA CORALS (PICTURED HERE) WERE USED TO EXAMINE THE HISTORY OF THE OCEANS AND CONNECTIONS TO GLOBAL CLIMATE.

CREDIT: DANN BLACKWOOD, USGS.

The fossilised remains of ancient deep-sea corals may act as time machines providing new insights into the effect the ocean has on rising CO2 levels, according to new research carried out by the Universities of Bristol, St Andrews and Nanjing and published today [16 October] in Science Advances.

Rising CO2 levels helped end the last ice age, but the cause of this CO2 rise has puzzled scientists for decades. Using geochemical fingerprinting of fossil corals, an international team of scientists has found new evidence that this CO2 rise was linked to extremely rapid changes in ocean circulation around Antarctica.

The team collected fossil remains of deep-sea corals that lived thousands of metres beneath the waves. By studying the radioactive decay of the tiny amounts of uranium found in these skeletons, they identified corals that grew at the end of the ice age around 15,000 years ago.

Further geochemical fingerprinting of these specimens - including measurements of radiocarbon - allowed the team to reconstruct changes in ocean circulation and compare them to changes in global climate at an unprecedented time resolution.
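For readers unfamiliar with how radiocarbon measurements yield ages, the snippet below shows the standard textbook relation between measured radiocarbon content and a conventional radiocarbon age. It is the generic formula only, using the Libby mean life; the study's own approach combined uranium-series calendar ages with radiocarbon to track circulation, and the measurement value here is hypothetical.

```python
# Standard textbook relation between measured radiocarbon content and a
# conventional radiocarbon age (Libby mean life of 8033 years). This is the
# generic formula only; the study combined uranium-series calendar ages with
# radiocarbon measurements, and the fraction-modern value here is hypothetical.
import math

def radiocarbon_age(fraction_modern):
    """Conventional 14C age in years before present for a measured F14C."""
    return -8033.0 * math.log(fraction_modern)

# Hypothetical coral retaining ~16% of the modern radiocarbon level
print(f"{radiocarbon_age(0.16):.0f} 14C years BP")
```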

Professor Laura Robinson, Professor of Geochemistry at Bristol's School of Earth Sciences who led the research team, said: "The data show that deep ocean circulation can change surprisingly rapidly, and that this can rapidly release CO2 to the atmosphere."

Dr James Rae of the University of St Andrews' School of Earth and Environmental Sciences added: "The corals act as a time machine, allowing us to see changes in ocean circulation that happened thousands of years ago.

"They show that the ocean round Antarctica can suddenly switch its circulation to deliver burps of CO2 to the atmosphere."

Scientists have suspected that the Southern Ocean played an important role in ending the last ice age and the team's findings add weight to this idea.

Dr Tao Li of Nanjing University, lead author of the new study, said: "There is no doubt that Southern Ocean processes must have played a critical role in these rapid climate shifts and the fossil corals provide the only possible way to examine Southern Ocean processes on these timescales."

In another study published in Nature Geoscience this week the same team ruled out recent speculation that the global increase in CO2 at the end of the ice age may have been related to release of geological carbon from deep sea sediments.

Andrea Burke of the University of St Andrews' School of Earth and Environmental Sciences added: "There have been some suggestions that reservoirs of carbon deep in marine mud might bubble up and add CO2 to the ocean and the atmosphere, but we found no evidence of this in our coral samples."

Dr Tianyu Chen of Nanjing University said: "Our robust reconstructions of radiocarbon at intermediate depths yield powerful constraints on mixing between the deep and upper ocean, which is important for modelling changes in circulation and the carbon cycle during the last ice age termination."

Dr James Rae added: "Although the rise in CO2 at the end of the ice age was dramatic in geological terms, the recent rise in CO2 due to human activity is much bigger and faster. What the climate system will do in response is pretty scary."

###

Paper

'Rapid shifts in circulation and biogeochemistry of the Southern Ocean during deglacial carbon cycle events' by L Robinson et al in Science Advances: https://advances.sciencemag.org/content/6/42/eabb3807

'Persistently well-ventilated intermediate-depth ocean through the last deglaciation' by L Robinson et al in Nature Geoscience

https://www.nature.com/articles/s41561-020-0638-6

 

Echo from the past makes rice paddies a good home for wetland plants

Geography before human land use impacts how well plant species are retained

TOKYO METROPOLITAN UNIVERSITY

Research News

IMAGE: RICE PADDIES WHICH WERE ORIGINALLY WETLAND ARE ABLE TO PROVIDE A NEW HOME FOR THE ORIGINAL WETLAND PLANT SPECIES THAT WERE PRESENT. CONVERSELY, PADDIES WHICH WERE NOT ORIGINALLY WETLAND ARE...

CREDIT: TOKYO METROPOLITAN UNIVERSITY

Tokyo, Japan - Researchers from Tokyo Metropolitan University studied the biodiversity of wetland plants over time in rice paddies in the Tone River basin, Japan. They found that paddies which were more likely to have been wetland before agricultural use retained more wetland plant species. On the other hand, land consolidation and agricultural abandonment were both found to negatively impact biodiversity. Their findings may one day inform conservation efforts and promote sustainable agriculture.

The Asian monsoon region is home to a vast number of rice paddies. Not only have they fed its billions of inhabitants for centuries, they are also an important part of the ecosystem, home to a vast array of wetland plant species. But as the population grows and more agricultural land is required, their sheer scale and complexity beg an important question: what is their environmental impact?

A team from Tokyo Metropolitan University led by Associate Professor Takeshi Osawa, together with their collaborators, has been studying how rice paddies affect local plant life. In their most recent work, they investigated the biodiversity of wetland plants in rice paddies around the Tone River basin, Japan. The Tone River is Japan's second-longest river and runs through the roughly 17,000 square kilometer expanse of the Kanto Plain. Previous studies have looked at how a particular species or group of species fares in different conditions. Instead, the team turned their attention to the range of species that make up the plant community, with a particular focus on the number of wetland and non-wetland species present. Changes were tracked over time using extensive survey data from 2002, 2007 and 2012.

They found that not all rice paddies are equal when it comes to how well they support original wetland species. In fact, there was a correlation between how likely it was that the land was wetland before it was put to agricultural use and the number of wetland species retained over time. The team measured this likelihood using flow accumulation values (FAVs) for different plots of land, a simple metric of how easily water can accumulate on the terrain (see the sketch below). Importantly, this kind of approach might help predict how amenable new rice paddies would be to the local wetland flora by calculating a simple number from the local terrain. However, the team also found that land consolidation and agricultural abandonment had a negative impact. The emerging story is that both current human usage and original geographical conditions play an important role in deciding how 'friendly' rice paddies can be for the original wetland ecosystem.
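As a rough illustration of the kind of terrain metric involved, the sketch below computes a simple D8 flow accumulation on a tiny elevation grid: each cell passes its accumulated "flow" to its steepest downslope neighbour, so low-lying cells where water can pool end up with high values. This is a common GIS formulation offered only for intuition; the FAVs used in the study may have been derived with different data, algorithms, and software.

```python
# Generic D8 flow-accumulation sketch on a small elevation grid, shown only to
# illustrate the idea of a flow accumulation value (FAV). The study's actual
# FAV computation (data source, algorithm, resolution) may differ.
import numpy as np

def flow_accumulation(dem):
    rows, cols = dem.shape
    acc = np.ones_like(dem, dtype=float)          # each cell contributes itself
    order = np.argsort(dem, axis=None)[::-1]      # visit cells from highest to lowest
    neighbours = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                  (0, 1), (1, -1), (1, 0), (1, 1)]
    for idx in order:
        r, c = divmod(idx, cols)
        best_drop, target = 0.0, None
        for dr, dc in neighbours:                 # find the steepest downslope neighbour (D8)
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                drop = (dem[r, c] - dem[nr, nc]) / np.hypot(dr, dc)
                if drop > best_drop:
                    best_drop, target = drop, (nr, nc)
        if target is not None:
            acc[target] += acc[r, c]              # pass accumulated flow downslope
    return acc

dem = np.array([[5.0, 4.0, 3.0],
                [4.0, 3.0, 2.0],
                [3.0, 2.0, 1.0]])
print(flow_accumulation(dem))                     # the lowest corner collects the most flow
```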

The team believe that the same approach could be applied to different locations such as plantation forests which were (or were not) originally woodland. After all, many nations are turning to large-scale tree planting to offset carbon emissions. The ability to systematically 'assign' how new land usage might impact local ecosystems is sure to greatly help restoration and conservation efforts.

###

This work was supported by the Environment Research and Technology Development Fund (4-1805, 4-1705) of the Ministry of the Environment, Japan, the Research Institute for Humanity and Nature (RIHN: a constituent member of NIHU) Core Project No. 14200075, and JSPS KAKENHI Grant Number 20K06096.

 

Glimpse deep into Earth's crust finds heat source that may stabilize continents

PENN STATE

Research News

IMAGE: ROCKS COLLECTED FROM KILBOURNE HOLE CRATER IN NEW MEXICO REVEALED THE THINNING LITHOSPHERE IN THE RIO GRANDE CONTINENTAL RIFT PROVIDES HEAT NECESSARY TO STABILIZE CONTINENTS OVER BILLIONS OF YEARS.

CREDIT: PENN STATE

Rocks from the Rio Grande continental rift have provided a rare snapshot of active geology deep inside Earth's crust, revealing new evidence for how continents remain stable over billions of years, according to a team of scientists.

"We tend to study rocks that are millions to billions of years old, but in this case we can show what's happening in the deep crust, nearly 19 miles below the surface of the Earth, in what geologically speaking is the modern day," said Jacob Cipar, a graduate student in geosciences at Penn State. "And we have linked what's preserved in these rocks with tectonic processes happening today that may represent an important step in the development of stable continents."

The team, led by Penn State scientists, found evidence that heat from the mantle is melting the lower crust at the rift, where tectonic forces are pulling apart and thinning the lithosphere, or the crust and upper mantle that make up the rigid outer layer of Earth.

Heating the continental crust is considered important to its development. But the process is often associated with crustal thickening, when continental plates collide and form mountains like the Himalayas, the scientists said.

"Our research suggests that these rocks that have been thought of as related to mountain building may have actually been cooked by a thinning lithosphere like what's happening in the modern-day Rio Grande rift," Cipar said. "And more broadly, thinning lithosphere may be more important than previously recognized for stabilizing continents and preventing them from sinking back into the mantle."

The researchers recently reported their findings in the journal Nature Geoscience.

Earth's continents feature a unique silicon-rich, buoyant crust that allows land to rise above sea level and host terrestrial life, the scientists said. The crust also contains heat-producing elements like uranium that could destabilize it over geological time.

Heating the crust creates molten rock that carries those elements toward the surface, resulting in a cooler and stronger lower crust that can protect continents from being absorbed into the mantle, the scientists said. But questions remain about the sources of that heat.

"We are suggesting that thinning of the lithosphere is really the removal of a barrier that keeps that heat away from the crust," said Andrew Smye, assistant professor of geosciences at Penn State and Cipar's adviser. "Removing or thinning that barrier at the Rio Grande rift appears to be what is generating the heat needed to initiate this process of stabilizing continental crust. And this has been overlooked in our understanding of how continents become so stable."

The scientists tapped into rocks brought to the surface 20,000 years ago by volcanoes in New Mexico. The rocks are considered geologically young and are significant because they retain the context of the lower crust, the scientists said.

"In contrast, what we see in the rock record around the world is that oftentimes what it takes to get them up to the surface has disrupted their original relationship with the lower crust," said Joshua Garber, a postdoctoral researcher at Penn State. "This makes it really challenging to use older rocks to try to understand tectonics, and it makes the Rio Grande probably the best place to do this research."

The scientists used analytical techniques to link the age of minerals in the rocks to the pressure and temperature they faced as they made their way through the crust.
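As a sense of scale for the "nearly 19 miles" mentioned above, the quick arithmetic below converts that depth into an approximate lithostatic pressure. The density and depth are round illustrative numbers; the study's actual pressure and temperature estimates come from mineral-based barometry and thermometry, not from this calculation.

```python
# Rough lithostatic-pressure arithmetic for a depth of about 19 miles.
# Density and depth are round illustrative values; the study's published
# pressure-temperature estimates come from mineral barometry, not this formula.
DENSITY = 2800.0            # kg/m^3, a typical average for continental crust
G = 9.81                    # m/s^2
depth_m = 19 * 1609.34      # 19 miles in metres

pressure_gpa = DENSITY * G * depth_m / 1e9
print(f"~{pressure_gpa:.1f} GPa at ~{depth_m / 1000:.0f} km depth")
```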

Similarities between the pressure and temperature path from the Rio Grande lower crust and rocks from other locations suggest that a thinning lithosphere is important for stabilizing Earth's continents, the scientists said.

"The snapshots of data we do have from other locations really nicely aligns with what we found in the Rio Grande rift," Garber said. "So that tells us this is not just happening now in the western United States. This shows the guts of continents have probably undergone this globally at least for the last billion years."

###

Andrew Kylander-Clark, a senior development engineer at the University of California, Santa Barbara, also contributed to this work.

The National Science Foundation and the Slingerland Early Career Fellowship at Penn State supported the research.

 

Why school bullying prevention programs that involve peers may be harmful to victims

SOCIETY FOR RESEARCH IN CHILD DEVELOPMENT

Research News

School bullying has been identified as harmful to students' mental health. Many studies have evaluated the effectiveness of bullying prevention programs, finding mixed results in general and no benefits overall for secondary school students. Looking at the specific components of bullying prevention programs helps to explain the complicated pattern: Unlike intensive programs that include parent training, firm disciplinary methods or improved playground supervision, interventions that involve work with peers tend to lead to increases in bullying. A new review explores why encouraging peers to defend victims may actually cause more harm than good.

The analysis was written by a researcher at the QIMR Berghofer Medical Research Institute and the University of Queensland, Brisbane. It appears in Child Development Perspectives, a journal of the Society for Research in Child Development.

"Many school bullying prevention programs encourage and train peer bystanders (helpers) to get actively involved in assisting with possible instances of bullying," said Karyn L. Healy, research officer from QIMR Berghofer Medical Research Institute, who authored the analysis. "Although this approach is very common and well-intentioned, there is no evidence that it helps victims. Encouraging peers to actively defend victims of bullying may actually produce adverse outcomes for victims."

Most research on the effectiveness of bullying prevention programs assumes that each program affects bullying and victimization in a simple and unified way. But many programs combine a range of different strategies and participants, which are likely to produce differential effects.

Healy identified several mechanisms through which bystander interventions that involve peer defense of the victim could increase victimization and distress of victims: 1) by disempowering victims, 2) by reinforcing or provoking bullying, or 3) by eroding broader support for victims by the peer group.

"Having lots of peers involved makes the situation more public, which can be damaging to the social reputation of victims," said Healy. "Having a trained bystander step in also prevents the victim from handling a situation themselves and may make them look weak in the eyes of the bully. Training students to intervene in bullying also has the potential of leading to overuse of peer defense strategies because of benefits to helpers, such as making helpers feel they have higher status or increasing helpers' feelings of belonging in school."

Recent evidence suggests that even when programs are successful in reducing bullying, they may still be harmful to the individual students who are victimized the most. "This could potentially be the case for any program that aims to reduce overall bullying without taking into account the impacts on victims," explains Healy.

To lessen the risk to vulnerable students, Healy suggests that schools be wary of bullying prevention programs that lack evidence of effectiveness for reducing bullying and victimization. Schools should avoid using strategies that boost peer visibility of victimization (e.g., identifying a victim in a class meeting). In addition, evaluations of bullying prevention programs that look at the school as a whole should be cautious of hidden negative outcomes for individual students who remain victimized.

###

Summarized from Child Development Perspectives, Hypotheses for Possible Iatrogenic Impacts of School Bullying Prevention Programs by Healy, KL (QIMR Berghofer Medical Research Institute and the University of Queensland, Brisbane). Copyright 2020 The Society for Research in Child Development, Inc. All rights reserved.

 

Cannabis reduces OCD symptoms by half in the short-term

WASHINGTON STATE UNIVERSITY

Research News

PULLMAN, Wash. - People with obsessive-compulsive disorder, or OCD, report that the severity of their symptoms was reduced by about half within four hours of smoking cannabis, according to a Washington State University study.

The researchers analyzed data entered into the Strainprint app by people who self-identified as having OCD, a condition characterized by intrusive, persistent thoughts and repetitive behaviors such as compulsively checking whether a door is locked. After smoking cannabis, users with OCD reported that their compulsions were reduced by 60%, their intrusions (unwanted thoughts) by 49%, and their anxiety by 52%.

The study, recently published in the Journal of Affective Disorders, also found that higher doses and cannabis with higher concentrations of CBD, or cannabidiol, were associated with larger reductions in compulsions.

"The results overall indicate that cannabis may have some beneficial short-term but not really long-term effects on obsessive-compulsive disorder," said Carrie Cuttler, the study's corresponding author and WSU assistant professor of psychology. "To me, the CBD findings are really promising because it is not intoxicating. This is an area of research that would really benefit from clinical trials looking at changes in compulsions, intrusions and anxiety with pure CBD."

The WSU study drew on data from more than 1,800 cannabis sessions that 87 individuals logged into the Strainprint app over 31 months. The long time period allowed the researchers to assess whether users developed tolerance to cannabis, but those effects were mixed. As people continued to use cannabis, the associated reductions in intrusions became slightly smaller, suggesting they were building tolerance, but the relationship between cannabis and reductions in compulsions and anxiety remained fairly constant.
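The percentage figures above come down to simple before-and-after comparisons of symptom ratings. The sketch below shows, with invented numbers, what that kind of calculation looks like; the column names and example ratings are hypothetical and do not reflect Strainprint's actual data schema or the study's statistical models.

```python
# Hypothetical sketch of a before/after symptom-reduction calculation.
# Column names and example ratings are invented; they do not reflect
# Strainprint's data schema or the statistics used in the WSU study.
import pandas as pd

sessions = pd.DataFrame({
    "symptom": ["compulsions", "compulsions", "anxiety", "anxiety"],
    "pre":     [8, 6, 7, 9],       # severity rating (0-10) before use
    "post":    [3, 3, 4, 4],       # rating within a few hours after use
    "session": [1, 120, 1, 120],   # session number, to look for tolerance over time
})

sessions["pct_reduction"] = 100 * (sessions["pre"] - sessions["post"]) / sessions["pre"]
print(sessions.groupby("symptom")["pct_reduction"].mean().round(1))

# A naive tolerance check: does the reduction shrink as the session count grows?
print(sessions["session"].corr(sessions["pct_reduction"]).round(2))
```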

Traditional treatments for obsessive-compulsive disorder include exposure and response prevention therapy where people's irrational thoughts around their behaviors are directly challenged, and prescribing antidepressants called serotonin reuptake inhibitors to reduce symptoms. While these treatments have positive effects for many patients, they do not cure the disorder nor do they work well for every person with OCD.

"We're trying to build knowledge about the relationship of cannabis use and OCD because it's an area that is really understudied," said Dakota Mauzay, a doctoral student in Cuttler's lab and first author on the paper.

Aside from their own research, the researchers found only one other human study on the topic: a small clinical trial with 12 participants that revealed that there were reductions in OCD symptoms after cannabis use, but these were not much larger than the reductions associated with the placebo.

The WSU researchers noted that one limitation of their study was the inability to use a placebo control, so an "expectancy effect" may play a role in the results: when people expect to feel better from something, they generally do. The data also came from a self-selected sample of cannabis users, and there was variability in the results, which means that not everyone experienced the same reductions in symptoms after using cannabis.

However, Cuttler said this analysis of user-provided information via the Strainprint app was especially valuable because it provides a large data set and the participants were using market cannabis in their home environment, as opposed to federally grown cannabis in a lab which may affect their responses. Strainprint's app is intended to help users determine which types of cannabis work the best for them, but the company provided the WSU researchers free access to users' anonymized data for research purposes.

Cuttler said this study points out that further research, particularly clinical trials on the cannabis constituent CBD, may reveal a therapeutic potential for people with OCD.

This is the fourth study Cuttler and her colleagues have conducted examining the effects of cannabis on various mental health conditions using the data provided by the app created by the Canadian company Strainprint. Others include studies on how cannabis impacts PTSD symptoms, reduces headache pain, and affects emotional well-being.

###

 

UMD-led study shows fear and anxiety share same bases in brain


Transformational findings could ultimately lead to better models of emotion and more effective interventions for anxiety and depression

UNIVERSITY OF MARYLAND

Research News

COLLEGE PARK, Md. - Anxiety, the most common family of mental illnesses in the U.S., has been pushed to epic new heights by the COVID-19 pandemic, with the Centers for Disease Control and Prevention estimating that nearly 1 in 3 U.S. adults and a staggering 41% of people ages 18-29 experienced clinically significant anxiety symptoms in late August. Now, the findings of a recent UMD-led study indicate that some long-accepted thinking about the basic neuroscience of anxiety is wrong.

The report by an international team of researchers led by Alexander Shackman, an associate professor of psychology at UMD, and Juyoen Hur, an assistant professor of psychology at Yonsei University in Seoul, South Korea, provides new evidence that fear and anxiety reflect overlapping brain circuits. The findings run counter to popular scientific accounts, highlighting the need for a major theoretical reckoning. The study was published last week in the Journal of Neuroscience.

"The conceptual distinction between 'fear' and 'anxiety' dates back to the time of Freud, if not the Greek philosophers of antiquity," said Shackman, a core faculty member of UMD's Neuroscience and Cognitive Science Program, and 2018 recipient of a seed grant award from UMD's Brain and Behavior Initiative, "In recent years, brain imagers and clinicians have extended this distinction, arguing that fear and anxiety are orchestrated by distinct neural networks.

However, Shackman says their new study adds to a rapidly growing body of new evidence suggesting that this old model is wrong. "If anything, fear and anxiety seem to be constructed in the brain using a massively overlapping set of neural building blocks," he said.

Prevailing scientific theory holds that fear and anxiety are distinct, with different triggers and strictly segregated brain circuits. Fear--a fleeting reaction to certain danger--is thought to be controlled by the amygdala, a small almond-shaped region buried beneath the wrinkled convolutions of the cerebral cortex. By contrast, anxiety--a persistent state of heightened apprehension and arousal elicited when threat is uncertain--is thought to be orchestrated by the neighboring bed nucleus of the stria terminalis (BNST). But new evidence from Shackman and his colleagues suggests that both of these brain regions are equally sensitive to certain and uncertain kinds of threats.

Leveraging cutting-edge neuroimaging techniques available at the Maryland Neuroimaging Center, their research team used fMRI to quantify neural activity while participants anticipated receiving a painful shock paired with an unpleasant image and sound--a new task that the researchers dubbed the "Maryland Threat Countdown".

The timing of this "threat" was signaled either by a conventional countdown timer--i.e. "3, 2, 1..."--or by a random string of numbers--e.g. "16, 21, 8." In both conditions, threat anticipation recruited a remarkably similar network of brain regions, including the amygdala and the BNST. Across a range of head-to-head comparisons, the two showed statistically indistinguishable responses.

The team examined the neural circuits engaged while waiting for certain and uncertain threat (i.e. "fear" and "anxiety"). Results demonstrated that both kinds of threat anticipation recruited a common network of core brain regions, including the amygdala and BNST.
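A minimal sketch of one such head-to-head comparison is shown below: simulated per-participant responses in a single region under certain versus uncertain threat, compared with a paired t-test. The numbers are invented for illustration; the study's actual fMRI analyses were far more extensive than this.

```python
# Minimal sketch of a single "head-to-head" comparison: simulated per-participant
# responses in one region (e.g., the amygdala) during certain vs. uncertain threat
# anticipation, compared with a paired t-test. Illustration only; the published
# analyses were considerably more involved.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 50
certain = rng.normal(0.5, 0.2, n)                 # simulated mean responses, certain threat
uncertain = certain + rng.normal(0.0, 0.1, n)     # same signal plus noise, no true difference

# With no true difference built in, the test should typically return a nonsignificant p.
t, p = stats.ttest_rel(certain, uncertain)
print(f"t({n - 1}) = {t:.2f}, p = {p:.3f}")
```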

These observations raise important questions about the Research Domain Criteria (RDoC) framework that currently guides the U.S. National Institute of Mental Health's quest to discover the brain circuitry underlying anxiety disorders, depression, and other common mental illnesses. "As it is currently written, RDoC embodies the idea that certain and uncertain threat are processed by circuits centered on the amygdala and BNST, respectively. It's very black-and-white thinking," Shackman noted, emphasizing that RDoC's "strict-segregation" model is based on data collected at the turn of the century.

"It's time to update the RDoC so that it reflects the actual state of the science. It's not just our study; in fact, a whole slew of mechanistic studies in rodents and monkeys, and new meta-analyses of the published human imaging literature are all coalescing around the same fundamental scientific lesson: certain and uncertain threat are processed by a shared network of brain regions, a common core," he said.

As the crown jewel of NIMH's strategic plan for psychiatric research in the U.S., the RDoC framework influences a wide range of biomedical stakeholders, from researchers and drug companies to private philanthropic foundations and foreign funding agencies. Shackman noted that the RDoC has an outsized impact on how fear and anxiety research is designed, interpreted, peer reviewed, and funded here in the U.S. and abroad.

"Anxiety disorders impose a substantial and growing burden on global public health and the economy," Shackman said, "While we have made tremendous scientific progress, existing treatments are far from curative for many patients. Our hope is that research like this study can help set the stage for better models of emotion and, ultimately, hasten the development of more effective intervention strategies for the many millions of children and adults around the world who struggle with debilitating anxiety and depression."

###

This work was supported by the National Institute of Mental Health and University of Maryland, College Park.

The research team also included Allegra S. Anderson, Vanderbilt University; Jinyi Kuang, University of Pennsylvania; Manuel Kuhn, Harvard Medical School; Andrew S. Fox, University of California, Davis; and Jason F. Smith, Rachael M. Tillman, Hyung Cho Kim, and Kathryn A. DeYoung, all from the University of Maryland, College Park.

 

Driver of the largest mass extinction in the history of the Earth identified

New study provides a comprehensive reconstruction of the Permian-Triassic boundary event

HELMHOLTZ CENTRE FOR OCEAN RESEARCH KIEL (GEOMAR)

Research News

Life on Earth has a long, but also extremely turbulent, history. On more than one occasion, the majority of all species became extinct and an already highly developed biodiversity shrank to a minimum again, changing the course of evolution each time. The most extensive mass extinction took place about 252 million years ago. It marked the end of the Permian Period and the beginning of the Triassic Period. About three quarters of all land life and about 95 percent of life in the ocean disappeared within only a few thousand years.

Gigantic volcanic activity in what is now Siberia and the release of large amounts of methane from the sea floor have long been debated as potential triggers of the Permian-Triassic extinction. But the exact cause and the sequence of events that led to the mass extinction remained highly controversial. Now, scientists from Germany, Italy and Canada, in the framework of the EU-funded project BASE-LiNE Earth led by Prof. Dr. Anton Eisenhauer from GEOMAR Helmholtz Centre for Ocean Research Kiel in cooperation with the Helmholtz Centre Potsdam GFZ German Research Centre for Geosciences, have for the first time been able to conclusively reconstruct the entire cascade of events at that time using cutting-edge analytical techniques and innovative geochemical modelling. The study has been published today in the international journal Nature Geoscience.

For their study, the BASE-LiNE Earth team used a previously often neglected environmental archive: the shells of fossil brachiopods. "These are clam-like organisms that have existed on Earth for more than 500 million years. We were able to use well-preserved brachiopod fossils from the Southern Alps for our analyses. These shells were deposited at the bottom of the shallow shelf seas of the Tethys Ocean 252 million years ago and recorded the environmental conditions shortly before and at the beginning of extinction", explains Dr. Hana Jurikova. She is first author of the study, which she conducted as part of the BASE-LiNE Earth project and her doctoral thesis at GEOMAR.

By measuring different isotopes of the element boron in the fossil shells, the team was able to trace the development of the pH values in the ocean 252 million years ago. Since seawater pH is tightly coupled to the CO2 concentration in the atmosphere, the reconstruction of the latter was also possible. For the analyses, the team used high-precision isotope analyses at GEOMAR as well as high-resolution microanalyses on the state-of-the-art large-geometry secondary ion mass spectrometer (SIMS) at GFZ.
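For context, the relation behind that pH reconstruction is the standard boron-isotope proxy equation, sketched below. The constants used (the apparent dissociation constant of boric acid, the isotopic fractionation factor, and the boron isotope composition of seawater) are typical modern-ocean values inserted purely for illustration; they are not the Permian parameters or corrections used in the study, and the measured shell value is hypothetical.

```python
# Textbook form of the boron-isotope pH proxy, for illustration only. The
# constants below (pK*B, fractionation factor alpha, seawater d11B) are typical
# modern-ocean values, not the Permian parameters used in the study, and the
# measured d11B value is hypothetical.
import math

def ph_from_d11b(d11b_borate, d11b_sw=39.61, alpha=1.0272, pkb=8.60):
    """Seawater pH implied by the boron isotope composition recorded in a shell."""
    eps = 1000.0 * (alpha - 1.0)                  # fractionation expressed in per mil
    ratio = (d11b_sw - d11b_borate) / (d11b_sw - alpha * d11b_borate - eps)
    return pkb - math.log10(-ratio)

print(f"pH ~ {ph_from_d11b(16.5):.2f}")           # hypothetical measured d11B of 16.5 per mil
```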

"With this technique, we can not only reconstruct the evolution of the atmospheric CO2 concentrations, but also clearly trace it back to volcanic activity. The dissolution of methane hydrates, which had been suggested as a potential further cause, is highly unlikely based on our data", explains Dr. Marcus Gutjahr from GEOMAR, co-author of the study.

As a next step, the team fed their data from the boron and additional carbon isotope-based investigations into a computer-based geochemical model that simulated the Earth's processes at that time. The results showed that the warming and ocean acidification associated with the immense volcanic injection of CO2 into the atmosphere were already fatal and led to the extinction of marine calcifying organisms right at its onset. The CO2 release also brought further consequences: with increased global temperatures caused by the greenhouse effect, chemical weathering on land also increased.

Over thousands of years, increasing amounts of nutrients reached the oceans via rivers and coasts, which then became over-fertilized. The result was a large-scale oxygen depletion and the alteration of entire elemental cycles. "This domino-like collapse of the inter-connected life-sustaining cycles and processes ultimately led to the observed catastrophic extent of mass extinction at the Permian-Triassic boundary," summarizes Dr. Jurikova.

The study was conducted within the framework of the EU-funded ITN project BASE-LiNE Earth, in which the use of brachiopods as an environmental archive was systematically studied for the first time, and relevant analytical methods were improved and newly developed. "Without these new techniques it would be difficult to reconstruct environmental processes more than 250 million years ago in the same level of detail as we have done now", emphasizes Prof. Dr. Anton Eisenhauer from GEOMAR, the former BASE-LiNE Earth project coordinator and co-author of the new study, "in addition, the new methods can be applied for other scientific applications".

###