Monday, October 19, 2020

Safe sex or risky romance? Young adults make the rational choice

ASSOCIATION FOR PSYCHOLOGICAL SCIENCE

Research News

A study published in the journal Psychological Science found that young adults--contrary to how they are sometimes portrayed in the media--tend to make highly rational decisions when it comes to selecting potential romantic partners.

This is not to say that young adults make risk-free choices, but they appear to consider both the risks and benefits of their sexual behavior in a highly consistent and thoughtful manner.

"There is a tendency to view sexual decision making in young adults as a highly variable and somewhat random process, more influenced by hormones or impulsivity than rational processes," said Laura Hatz, a doctoral candidate at the University of Missouri and lead author of the study. "Our study suggests, however, that young adults are highly consistent in their choices, balancing potential partners' level of attractiveness against the potential risk for sexually transmitted infection."

The research involved presenting 257 participants with hypothetical "sexual gambles" in which a photo of a potential partner's face was shown alongside an associated, though purely hypothetical, risk of contracting a sexually transmitted infection. Nearly all participants in the study made consistently rational choices, as defined by established models of psychological behavior. Prior research has shown that, in general, individuals tend to use what are known as heuristic decision strategies--cognitive shortcuts that may ignore some information--to make choices in life.

Hatz and her colleagues found that even individuals who could be identified as classic heuristic decision makers for monetary-based choices became rational decision makers when similar choices were framed as sexual choices.

###

See related content in the APS Research Topic on Love and Marriage.

Reference:

Hatz, L. E., Park, S., McCarty, K. N., McCarthy, D. M., & Davis-Stober, C. P. (2020). Young adults make rational sexual decisions. Psychological Science, 31(8), 944-956.
https://doi.org/10.1177/0956797620925036

 

Are climate scientists being too cautious when linking extreme weather to climate change?

UNIVERSITY OF WASHINGTON

Research News

IMAGE

IMAGE: THE PUBLIC EXPECTS TO RECEIVE ADVANCED WARNING OF HAZARDOUS WEATHER, SUCH AS TORNADOES AND WINTER STORMS. THIS PHOTO SHOWS A TORNADO IN PROSPECT VALLEY, COLORADO, ON JUNE 19, 2018.

CREDIT: ERIC MEOLA

In this year of extreme weather events -- from devastating West Coast wildfires to tropical Atlantic storms that have exhausted the alphabet -- scientists and members of the public are asking when these extreme events can be scientifically linked to climate change.

Dale Durran, a professor of atmospheric sciences at the University of Washington, argues that climate scientists need to approach this question in a way similar to how weather forecasters issue warnings for hazardous weather.

In a new paper, published in the October issue of the Bulletin of the American Meteorological Society, he draws on the weather forecasting community's experience in predicting extreme weather events such as tornadoes, flash floods, high winds and winter storms. If forecasters send out a mistaken alert too often, people will start to ignore them. If they don't alert for severe events, people will get hurt. How can the atmospheric sciences community find the right balance?

Most current approaches to attributing extreme weather events to global warming, he says, such as the conditions leading to the ongoing Western wildfires, focus on the likelihood of raising a false alarm. Scientists do this by using statistics to estimate the increase in the probability of that event that is attributable to climate change. Those statistical measures are closely related to the "false alarm ratio," an important metric used to assess the quality of hazardous weather warnings.

But there is a second key metric used to assess the performance of weather forecasters, he argues: the probability that a forecast will correctly warn of events that actually occur, known as the "probability of detection." The ideal probability of detection score is 100%, while the ideal false alarm ratio would be zero.

Probability of detection has mostly been ignored when it comes to linking extreme events to climate change, he says. Yet both weather forecasting and climate-change attribution face a tradeoff between the two metrics: calculations in the paper show that raising warning thresholds to reduce false alarms produces a much greater drop in the probability of detection.
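The two verification metrics Durran draws on are standard contingency-table quantities. A minimal sketch, with invented counts purely for illustration, shows how a more cautious warning threshold can trade a small cut in false alarms for a large loss in detection:

```python
# Sketch of the two standard warning-verification metrics.
# Counts below are invented for illustration, not from the paper.

def pod(hits, misses):
    """Probability of detection: fraction of observed events that were warned."""
    return hits / (hits + misses)

def far(hits, false_alarms):
    """False alarm ratio: fraction of issued warnings that did not verify."""
    return false_alarms / (hits + false_alarms)

# A forecaster with a moderate threshold: warns often, catches most events.
print(pod(hits=80, misses=20), far(hits=80, false_alarms=40))

# The same forecaster after raising the threshold to avoid false alarms:
# the false alarm ratio improves, but detection drops sharply.
print(pod(hits=40, misses=60), far(hits=40, false_alarms=5))
```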

Drawing on a hypothetical example of a tornado forecaster whose false alarm ratio is zero but whose probability of detection is low, he writes that such an "overly cautious tornado forecasting strategy might be argued by some to be smart politics in the context of attributing extreme events to global warming, but it is inconsistent with the way meteorologists warn for a wide range of hazardous weather, and arguably with the way society expects to be warned about threats to property and human life."

Why does this matter? The paper concludes by noting: "If a forecaster fails to warn for a tornado there may be serious consequences and loss of life, but missing the forecast does not make next year's tornadoes more severe. On the other hand, every failure to alert the public about those extreme events actually influenced by global warming facilitates the illusion that mankind has time to delay the actions required to address the source of that warming. Because the residence time of CO2 in the atmosphere is many hundreds to thousands of years the cumulative consequences of such errors can have a very long lifetime."

###

For more information, contact Durran at drdee@uw.edu.

 

Remember that fake news you read? It may help you remember even more

ASSOCIATION FOR PSYCHOLOGICAL SCIENCE

Research News

People who receive reminders of past misinformation may form new factual memories with greater fidelity, according to an article published in the journal Psychological Science.

Past research highlights one insidious side of fake news: The more you encounter the same misinformation--for instance, that world governments are covering up the existence of flying saucers--the more familiar and potentially believable that false information becomes.

New research, however, has found that reminders of past misinformation can help protect against remembering misinformation as true while improving recollection of real-world events and information.

"Reminding people of previous encounters with fake news can improve memory and beliefs for facts that correct misinformation," said Christopher Wahlheim, a lead author on the paper and assistant professor of psychology at the University of North Carolina, Greensboro. "This suggests that pointing out conflicting information could improve the comprehension of truth in some situations."

Wahlheim and colleagues conducted two experiments examining whether reminders of misinformation could improve memory for and beliefs in corrections. Study participants were shown corrections of news and information they may have encountered in the past. Reminders of past misinformation appeared before some corrections but not others. Study results showed that misinformation reminders increased the participants' recall of facts and belief accuracy. The researchers interpreted the results to indicate that misinformation reminders raise awareness of discrepancies and promote memory updating. These results may be pertinent to individuals who confront misinformation frequently.

"It suggests that there may be benefits to learning how someone was being misleading. This knowledge may inform strategies that people use to counteract high exposure to misinformation spread for political gain," Wahlheim said.

###

See related content in the APS Research Topic on Memory.

Reference:

Wahlheim, C. N., Alexander, T. R., & Peske, C. D. (2020). Reminders of everyday misinformation statements can enhance memory for and beliefs in corrections of those statements in the short term. Psychological Science, 31(10), 1325-1339.
https://doi.org/10.1177/0956797620952797




 

NASA supercomputing study breaks ground for tree mapping, carbon research

NASA/GODDARD SPACE FLIGHT CENTER

Research News

Scientists from NASA's Goddard Space Flight Center in Greenbelt, Maryland, and international collaborators demonstrated a new method for mapping the location and size of trees growing outside of forests, discovering billions of trees in arid and semi-arid regions and laying the groundwork for more accurate global measurement of carbon storage on land.

Using powerful supercomputers and machine learning algorithms, the team mapped the crown diameter - the width of a tree when viewed from above - of more than 1.8 billion trees across an area of more than 500,000 square miles, or 1,300,000 square kilometers. The team mapped how tree crown diameter, coverage, and density varied depending on rainfall and land use.
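The crown-diameter measurement described above can be illustrated with a toy example (not the study's actual pipeline): given a binary mask marking crown pixels, one can count distinct crowns and convert each crown's pixel area to an equivalent-circle diameter. The 0.5 m pixel size below is an assumed value for illustration:

```python
import numpy as np
from scipy import ndimage

# Toy "tree crown" mask: 1 = crown pixel, 0 = background.
mask = np.array([
    [1, 1, 0, 0, 0],
    [1, 1, 0, 1, 1],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 0, 0],
], dtype=int)

# Label connected components: each component is one tree crown.
labels, n_trees = ndimage.label(mask)

# Crown area in pixels, converted to an equivalent-circle diameter.
pixel_size_m = 0.5  # assumed ground resolution, for illustration only
areas_px = ndimage.sum(mask, labels, index=range(1, n_trees + 1))
areas_m2 = areas_px * pixel_size_m ** 2
diameters_m = 2 * np.sqrt(areas_m2 / np.pi)

print(n_trees, diameters_m)
```

At scale, the study applies a trained neural network rather than a hand-drawn mask, but the geometry of going from labeled crown pixels to crown diameter is the same.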

Mapping non-forest trees at this level of detail would take months or years with traditional analysis methods, the team said, compared to a few weeks for this study. The use of very high-resolution imagery and powerful artificial intelligence represents a technology breakthrough for mapping and measuring these trees. This study is intended to be the first in a series of papers whose goal is not only to map non-forest trees across a wide area, but also to calculate how much carbon they store - vital information for understanding the Earth's carbon cycle and how it is changing over time.

Measuring carbon in trees

Carbon is one of the primary building blocks for all life on Earth, and this element circulates among the land, atmosphere, and oceans via the carbon cycle. Some natural processes and human activities release carbon into the atmosphere, while other processes draw it out of the atmosphere and store it on land or in the ocean. Trees and other green vegetation are carbon "sinks," meaning they use carbon for growth and store it out of the atmosphere in their trunks, branches, leaves and roots. Human activities, like burning trees and fossil fuels or clearing forested land, release carbon into the atmosphere as carbon dioxide, and rising concentrations of atmospheric carbon dioxide are a main cause of climate change.

Conservation experts working to mitigate climate change and other environmental threats have targeted deforestation for years, but these efforts do not always include trees that grow outside forests, said Compton Tucker, senior biospheric scientist in the Earth Sciences Division at NASA Goddard. Not only could these trees be significant carbon sinks, but they also contribute to the ecosystems and economies of nearby human, animal and plant populations. However, many current methods for studying trees' carbon content only include forests, not trees that grow individually or in small clusters.

Tucker and his NASA colleagues, together with an international team, used commercial satellite images from DigitalGlobe, which were high-resolution enough to spot individual trees and measure their crown size. The images came from the commercial QuickBird-2, GeoEye-1, WorldView-2, and WorldView-3 satellites. The team focused on the dryland regions - areas that receive less precipitation than what evaporates from plants each year - including the arid south side of the Sahara Desert, which stretches through the semi-arid Sahel Zone and into the humid sub-tropics of West Africa. By studying a variety of landscapes from few trees to nearly forested conditions, the team trained their computing algorithms to recognize trees across diverse terrain types, from deserts in the north to tree savannas in the south.

Download related video in HD formats: https://svs.gsfc.nasa.gov/4865

Learning on the job

The team ran a powerful computing algorithm called a fully convolutional neural network ("deep learning") on the University of Illinois' Blue Waters, one of the world's fastest supercomputers. The team trained the model by manually marking nearly 90,000 individual trees across a variety of terrain, then allowing it to "learn" which shapes and shadows indicated the presence of trees.

The process of coding the training data took more than a year, said Martin Brandt, an assistant professor of geography at the University of Copenhagen and the study's lead author. Brandt marked all 89,899 trees by himself and helped supervise training and running the model. Ankit Kariryaa of the University of Bremen led the development of the deep learning computer processing.

"In one kilometer of terrain, say it's a desert, many times there are no trees, but the program wants to find a tree," Brandt said. "It will find a stone, and think it's a tree. Further south, it will find houses that look like trees. It sounds easy, you'd think - there's a tree, why shouldn't the model know it's a tree? But the challenges come with this level of detail. The more detail there is, the more challenges come."

Establishing an accurate count of trees in this area provides vital information for researchers, policymakers and conservationists. Additionally, measuring how tree size and density vary by rainfall - with wetter and more populated regions supporting more and larger trees - provides important data for on-the-ground conservation efforts.

"There are important ecological processes, not only inside, but outside forests too," said Jesse Meyer, a programmer at NASA Goddard who led the processing on Blue Waters. "For preservation, restoration, climate change, and other purposes, data like these are very important to establish a baseline. In a year or two or ten, the study could be repeated with new data and compared to data from today, to see if efforts to revitalize and reduce deforestation are effective or not. It has quite practical implications."

After gauging the program's accuracy by comparing it to both manually coded data and field data from the region, the team ran the program across the full study area. The neural network identified more than 1.8 billion trees - surprising numbers for a region often assumed to support little vegetation, said Meyer and Tucker.

"Future papers in the series will build on the foundation of counting trees, extend the areas studied, and look at ways to calculate their carbon content," said Tucker. NASA missions like the Global Ecosystem Dynamics Investigation mission, or GEDI, and ICESat-2, or the Ice, Cloud, and Land Elevation Satellite-2, are already collecting data that will be used to measure the height and biomass of forests. In the future, combining these data sources with the power of artificial intelligence could open up new research possibilities.

"Our objective is to see how much carbon is in isolated trees in the vast arid and semi-arid portions of the world," Tucker said. "Then we need to understand the mechanism which drives carbon storage in arid and semi-arid areas. Perhaps this information can be utilized to store more carbon in vegetation by taking more carbon dioxide out of the atmosphere."

"From a carbon cycle perspective, these dry areas are not well mapped, in terms of what density of trees and carbon is there," Brandt said. "It's a white area on maps. These dry areas are basically masked out. This is because normal satellites just don't see the trees - they see a forest, but if the tree is isolated, they can't see it. Now we're on the way to filling these white spots on the maps. And that's quite exciting."

###

USask scientists develop model to identify best lentils for climate change impacts

UNIVERSITY OF SASKATCHEWAN

Research News

IMAGE

IMAGE: USASK PLANT SCIENTIST KIRSTIN BETT.

CREDIT: DEBRA MARSHALL PHOTOGRAPHY

With demand for lentils growing globally and climate change driving temperatures higher, a University of Saskatchewan-led international research team has developed a model for predicting which varieties of the pulse crop are most likely to thrive in new production environments.

An inexpensive plant-based source of protein that can be cooked quickly, lentil is a globally important crop for combating food and nutritional insecurity.

But increased production to meet this global demand will have to come from either boosting yields in traditional growing areas or shifting production to new locations, said USask plant scientist Kirstin Bett.

"By understanding how different lentil lines will interact with the new environment, we can perhaps get a leg up in developing varieties likely to do well in new growing locations," said Bett.

Working with universities and organizations around the globe, the team planted 324 lentil varieties in nine lentil production hotspots, including two in Saskatchewan and one in the U.S., as well as sites in South Asia (Nepal, Bangladesh, and India) and the Mediterranean (Morocco, Spain, and Italy).

The findings, published in the journal Plants, People, Planet, will help producers and breeders identify existing varieties or develop new lines likely to flourish in new growing environments--valuable intelligence in the quest to feed the world's growing appetite for inexpensive plant-based protein.

The new mathematical model is based on a key predictor of crop yield--days to flowering (DTF)--which is determined by two factors: day length (hours of sunshine or "photoperiod") and the mean temperature of the growing environment. Using detailed information about each variety's interaction with temperature and photoperiod, the simple model can be used to predict the number of days it takes each variety to flower in a specific environment.
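Photothermal flowering models of this kind are often written as a linear effect of mean temperature and photoperiod on the flowering *rate* (1/DTF). A minimal sketch of that general form follows; the coefficients are illustrative placeholders, not the fitted per-variety values from the study:

```python
# Sketch of a photothermal days-to-flowering (DTF) model of the general form
#   1/DTF = a + b * T + c * P
# where T is mean temperature (deg C) and P is photoperiod (hours).
# Coefficients a, b, c are invented for illustration; in practice they are
# fitted separately for each lentil variety.

def days_to_flower(mean_temp_c, photoperiod_h, a=-0.01, b=0.0006, c=0.0015):
    rate = a + b * mean_temp_c + c * photoperiod_h  # flowering rate, 1/days
    return 1.0 / rate

# Warmer conditions at the same day length shorten time to flowering.
print(round(days_to_flower(18.0, 14.0), 1))
print(round(days_to_flower(22.0, 14.0), 1))
```

With fitted coefficients in hand, the same two environmental inputs predict DTF for any candidate variety in a new location, which is what makes the model useful for screening lines before field trials.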

"With this model, we can predict which lines they (producers) should be looking at that will do well in new regions, how they should work, and whether they'll work," Bett said.

For example, lentil producers in Nepal--which is already experiencing higher mean temperatures as a result of climate change--can use the model to identify which lines will produce high yields if they're grown at higher altitudes.

Closer to home in Western Canada, the model could be used to predict which varieties should do well in what are currently considered to be marginal production areas.

The project also involved USask plant researchers Sandesh Neupane, Derek Wright, Crystal Chan, and Bert Vandenberg.

The next step is putting the new model to work in lentil breeding programs to identify the genes that are controlling lentil lines' interactions with temperature and day length, said Bett.

Once breeders determine the genes involved, they can develop molecular markers that will enable breeders to pre-screen seeds. That way they'll know how crosses between different lentil varieties are likely to perform in different production locations.

###

This research project was part of the Application of Genomics to Innovation in the Lentil Economy (AGILE) project funded by Genome Canada and managed by Genome Prairie. Matching financial support was provided by partners that include the Saskatchewan Pulse Growers, Western Grains Research Foundation, and USask.

BROADBAND 

Internet connectivity is oxygen for research and development work

UNIVERSITY OF ILLINOIS COLLEGE OF AGRICULTURAL, CONSUMER AND ENVIRONMENTAL SCIENCES

Research News

IMAGE

IMAGE: EMMANUEL TOGO, IT ARCHITECT FOR THE UNIVERSITY OF GHANA, GAVE A TOUR OF THE UNIVERSITY'S CAMPUS NETWORK OPERATIONS CENTER DURING AN ICT HEALTH CHECKUP CONDUCTED BY PAUL HIXSON, UNIVERSITY OF...

CREDIT: COLLEGE OF ACES, UNIVERSITY OF ILLINOIS.

URBANA, Ill. - Fast and reliable internet access is fundamental for research and development activity around the world. Seamless connectivity is a privilege we often take for granted. But in developing nations, technological limitations can become stumbling blocks to efficient communication and cause significant disadvantages.

Pete Goldsmith, director of the Soybean Innovation Lab at University of Illinois, works closely with partner organizations in several African countries. He noticed that his African colleagues were often dealing with technological problems that made communication very challenging. For example, sometimes they had to rely on their cell phones because their institution's internet access was unreliable.

Goldsmith teamed up with two IT experts at U of I, former Chief Information Officer Paul Hixson and Director of Research IT and Innovation Tracy Smith, to investigate technological challenges facing institutions in developing countries.

"Connectivity is the oxygen organizations run on," Hixson says. "It's such a basic requirement that it's often not even recognized as an issue. But lack of connectivity severely hinders an organization's ability to perform simple functions, conduct research, and compete for grants."

Goldsmith, Hixson, and Smith conducted an in-depth case study of information communication technology (ICT) infrastructure at the Savannah Agricultural Research Institute (SARI), a leading research station in Ghana and a close collaborator of SIL.

The case study included focus groups, interviews, and a technological analysis of SARI's equipment and connectivity. Based on this study, the research team developed the ICT Health Checkup, an assessment procedure for IT administrators to methodically assess the current state of their system, identify gaps affecting performance, and document steps for remediation.

The ICT Health Checkup tool systematically evaluates four key elements of ICT infrastructure. The first step focuses on connectivity and bandwidth, identifying the required bandwidth to accommodate the institution's needs and whether the institution has an uninterrupted fiber-based connection to the global internet. The second step analyzes core physical infrastructure, including dependable electricity, local network design, and both wired and wireless connectivity capabilities.

The third step looks at available intranet service offerings for researchers such as local storage, data backup procedures, access control, security procedures, email service, and cloud access. Finally, the fourth step deals with the human resources and technical support requirements for planning and managing the institution's IT infrastructure.

"With this tool, institutions can go through a checklist, and at each point there is a 'stoplight'. If it's red, you know there is something that needs to be fixed, because there are conditions that will act as a block and you can't go on until they are fixed - until there's a green light. So turning things from red to green at each step is crucial; methodically going through each step at a time and making sure it's fixed before moving on to the next one," Hixson explains.
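The sequential red-to-green gating Hixson describes can be sketched in a few lines; the step names below paraphrase the four elements of the checkup, and the stoplight logic simply returns the first step that is not yet green:

```python
# Sketch of the ICT Health Checkup "stoplight" logic: each step must be
# green before the next one is assessed. Step names paraphrase the four
# elements described above.
STEPS = [
    "connectivity and bandwidth",
    "core physical infrastructure",
    "intranet services for researchers",
    "human resources and technical support",
]

def first_blocker(status):
    """Return the first red (unresolved) step, or None if all are green."""
    for step in STEPS:
        if not status.get(step, False):
            return step
    return None

# Connectivity is fixed, so the next blocker is physical infrastructure.
print(first_blocker({"connectivity and bandwidth": True}))
```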

The researchers compare the ICT Health Checkup to a medical health exam; it measures the current conditions and can be used as a benchmarking tool to measure improvements.

Goldsmith says the tool can be used to empower organizations so they can be self-sufficient. "With proper connectivity you can manage and store research data, compete for grants, and manage awards," he notes. "It's the foundation that allows institutions to participate fully in a global context."

The research team is currently expanding the study, collecting data from nine institutions and five networking organizations operating in three countries, in order to create a more robust picture of internet connectivity challenges and potential solutions across Africa.

They are also collaborating with the National Research and Education Networks (NRENs) in each of the sub-Saharan African countries that SIL operates in. These African NRENs are comparable to Internet2, which has been an instrumental partner in the expansion and adoption of advanced computing technologies at U of I and is one of the leading NRENs in the U.S., serving the country's research and higher-education communities.

"With the ICT health checkup, our partner African NRENs now have an actual assessment tool they can use with their member institutions. It's becoming a continent-wide approach as they are starting to adopt this new instrument created at the U of I to be their benchmark and measurement tool," Goldsmith says.

"The U of I is ideally positioned to provide this knowledge, because of the university's continued leadership in the computational and network administration space," he adds. "Now we are extending that to have real impact overseas."

###

The article, "The ICT Health Checkup Tool: Assessing Connectivity of the National Agriculture Research System (NARS)," is published in African Journal of Food, Agriculture, Nutrition and Development.

Funding was provided by the Feed the Future Innovation Lab for Soybean Value Chain Research (Soybean Innovation Lab (SIL)) under the U.S. Government's global hunger and food security initiative, Feed the Future.

The Soybean Innovation Lab is in the Department of Agricultural and Consumer Economics, College of Agricultural, Consumer and Environmental Sciences, University of Illinois.

 

Those funky cheese smells allow microbes to 'talk' to and feed each other

Researchers discover that bacteria that ripen cheese respond to the volatile gases produced by cheese fungi

TUFTS UNIVERSITY

Research News

IMAGE

IMAGE: FUNGI AND BACTERIA KEY TO RIPENING CHEESE COMMUNICATE WITH AND FEED EACH OTHER USING VOLATILE COMPOUNDS

CREDIT: ADAM DETOUR

MEDFORD/SOMERVILLE, Mass. (October 16, 2020)-- Researchers at Tufts University have found that those distinctly funky smells from cheese are one way that fungi communicate with bacteria, and what they are saying has a lot to do with the delicious variety of flavors that cheese has to offer. The research team found that common bacteria essential to ripening cheese can sense and respond to compounds produced by fungi in the rind and released into the air, enhancing the growth of some species of bacteria over others. The composition of bacteria, yeast and fungi that make up the cheese microbiome is critical to flavor and quality of the cheese, so figuring out how that can be controlled or modified adds science to the art of cheese making.

The discovery, published in Environmental Microbiology, also provides a model for the understanding and modification of other economically and clinically important microbiomes, such as in soil or the gastrointestinal tract.

"Humans have appreciated the diverse aromas of cheeses for hundreds of years, but how these aromas impact the biology of the cheese microbiome had not been studied," said Benjamin Wolfe, professor of biology in the School of Arts and Science at Tufts University and corresponding author of the study. "Our latest findings show that cheese microbes can use these aromas to dramatically change their biology, and the findings' importance extends beyond cheese making to other fields as well."

Many microbes produce airborne chemical compounds called volatile organic compounds, or VOCs, as they interact with their environment. A widely recognized microbial VOC is geosmin, which is emitted by soil microbes and can often be smelled after a heavy rain in forests. As bacteria and fungi grow on ripening cheeses, they secrete enzymes that break down amino acids to produce acids, alcohols, aldehydes, amines, and various sulfur compounds, while other enzymes break down fatty acids to produce esters, methyl ketones, and secondary alcohols. All of those biological products contribute to the flavor and aroma of cheese and they are the reason why Camembert, Blue cheese and Limburger have their signature smells.

The Tufts researchers found that VOCs don't just contribute to the sensory experience of cheese, but also provide a way for fungi to communicate with and "feed" bacteria in the cheese microbiome. By pairing 16 different common cheese bacteria with 5 common cheese rind fungi, the researchers found that the fungi caused responses in the bacteria ranging from strong stimulation to strong inhibition. One bacterial species, Vibrio casei, responded by growing rapidly in the presence of VOCs emitted by all five of the fungi. Other bacteria, such as Psychrobacter, only grew in response to one of the fungi (Galactomyces), and two common cheese bacteria decreased significantly in number when exposed to VOCs produced by Galactomyces.

The researchers found that the VOCs altered the expression of many genes in the bacteria, including genes that affect the way they metabolize nutrients. One metabolic mechanism that was enhanced, called the glyoxylate shunt, allows the bacteria to utilize more simple compounds as "food" when more complex sources such as glucose are unavailable. In effect, they enabled the bacteria to better "eat" some of the VOCs and use them as sources for energy and growth.

"The bacteria are able to actually eat what we perceive as smells," said Casey Cosetta, post-doctoral scholar in the department of biology at Tufts University and first author of the study. "That's important because the cheese itself provides little in the way of easily metabolized sugars such as glucose. With VOCs, the fungi are really providing a useful assist to the bacteria to help them thrive."

There are direct implications of this research for cheese producers around the world. When you walk into a cheese cave there are many VOCs released into the air as the cheeses age. These VOCs may impact how neighboring cheeses develop by promoting or inhibiting the growth of specific microbes, or by changing how the bacteria produce other biological products that add to the flavor. A better understanding of this process could enable cheese producers to manipulate the VOC environment to improve the quality and variety of flavors.

The implications of the research can even extend much further. "Now that we know that airborne chemicals can control the composition of microbiomes, we can start to think about how to control the composition of other microbiomes, for example in agriculture to improve soil quality and crop production and in medicine to help manage diseases affected by the hundreds of species of bacteria in the body," said Wolfe.

###

Other authors of this study include Nicole Kfoury, former postdoctoral scholar at Tufts and currently applications scientist at Gerstel, Inc., and Albert Robbat Jr., associate professor of chemistry at Tufts.

This research was supported by a grant from the National Science Foundation (#1715553).

Cosetta, C. M., Kfoury, N., Robbat, A., & Wolfe, B. E. (2020). Fungal volatiles mediate cheese rind microbiome assembly. Environmental Microbiology, 9 September 2020. https://doi.org/10.1111/1462-2920.15223

About Tufts University

Tufts University, located on campuses in Boston, Medford/Somerville and Grafton, Massachusetts, and in Talloires, France, is recognized among the premier research universities in the United States. Tufts enjoys a global reputation for academic excellence and for the preparation of students as leaders in a wide range of professions. A growing number of innovative teaching and research initiatives span all Tufts campuses, and collaboration among the faculty and students in the undergraduate, graduate and professional programs across the university's schools is widely encouraged.

 

New research comparing HIV medications set to change international recommendations

UNIVERSITY OF BRITISH COLUMBIA

Research News

IMAGE: The study's lead author, Dr. Steve Kanters, who completed the research as a PhD candidate in UBC's School of Population and Public Health. Credit: University of British Columbia

A new study by UBC researchers is set to change international treatment recommendations for people who are newly diagnosed with HIV--an update that could affect nearly two million people per year worldwide.

The study, published today in the Lancet journal EClinicalMedicine, was commissioned by the World Health Organization (WHO) as part of a planned update to its guidelines for HIV antiretroviral treatment (ART). The study found that dolutegravir is the optimal medication for first-line treatment of people newly diagnosed with HIV, a choice that has not been clear over the past several years.

"Research supporting the 2016 WHO guidelines suggested that dolutegravir was effective and well tolerated, but its efficacy and safety among key populations, such as pregnant women and people living with both HIV and tuberculosis (TB), remained unclear," said the study's lead author, Dr. Steve Kanters, who completed the research as a PhD candidate in UBC's School of Population and Public Health (SPPH). "In 2018, new research warned of a potentially serious increase in risk of neural tube defects in the children of women who became pregnant while taking this treatment."

The risk of adverse reaction meant that, although dolutegravir was found to be favourable compared to other options, it was only recommended as an alternative, with an antiretroviral called efavirenz recommended as the primary treatment.

The study team, which included Dr. Nick Bansback, associate professor at SPPH, Dr. Aslam Anis, professor at SPPH and director of the Centre for Health Evaluation and Outcome Sciences (CHÉOS), and Dr. Ehsan Karim, assistant professor at SPPH, completed a network meta-analysis of research stemming from 68 available antiretroviral therapy (ART) clinical trials.

They found dolutegravir was superior to efavirenz in most outcomes, including viral suppression, tolerability, and safety. According to Kanters, the increased odds of viral suppression with dolutegravir could have a significant impact on achieving international goals for HIV treatment.

"We found about a five per cent increase in the probability of viral suppression, which means that more people who start treatment will be able to successfully control their HIV," he said.

Another key attribute of dolutegravir is that it is effective in people who are resistant to non-nucleoside reverse transcriptase inhibitor (NNRTI)-class antiretrovirals, like efavirenz, a problem that is becoming increasingly common.

The analysis also showed that dolutegravir and efavirenz had similar rates of adverse events for pregnant women -- the increased risk of neural tube defects for dolutegravir was estimated to be less than 0.3 per cent.

"The new evidence on neural tube defects shows that the risk with dolutegravir is much more tolerable than previously thought and should quell the initial worry about this drug," said Kanters.

"Dolutegravir appears to be here to stay as the preferred treatment for people newly diagnosed with HIV," he said. "However, it is important to recognize the good that efavirenz has done over the past two decades, as it helped lead the ART scale-up around the world."

Despite its many benefits, dolutegravir use was associated with increased weight gain, a side effect that could raise the risk of aging-associated comorbidities, such as heart attack or stroke.

"In many places, well-treated HIV has become a chronic condition and we are now seeing people living long lives with HIV," said Kanters. "The research community will continue to monitor the effects dolutegravir may have on the healthy aging process."

While this study is specifically focused on the optimal treatment for people newly diagnosed with HIV, an upcoming publication will review the evidence in support of switching to dolutegravir for people whose first treatment choice has been unsuccessful in controlling their infection. This recommendation could mean improved treatment for the many people living with HIV around the world who are unable to achieve viral suppression despite being on treatment.

###