Friday, January 27, 2023

Modern arms technologies help autocratic rulers stay in power

Autocrats and dictators quickly acquire new arms technologies from abroad and often use them against their own citizens. Now a study covering all countries from 1820 to 2010 shows that the spread of military technologies inhibits democratic reform

Peer-Reviewed Publication

UNIVERSITY OF COPENHAGEN

Democratisation of regimes in the period 1820-2010 according to military-technological capability 

IMAGE: Democratisation of regimes in the period 1820-2010. The figure shows the cumulative number of democratised autocracies when they are divided into two equal-sized groups by military-technological capability. Significantly fewer democratisations have taken place among autocracies with access to more military technologies. The partitioning of countries takes into account that the countries’ different levels of economic development affect their access to military technologies.

CREDIT: UNIVERSITY OF COPENHAGEN

When autocratic rulers have access to modern arms that are both fast and accurate at long ranges, it allows them to suppress protests and riots more effectively and at a lower cost. Now a large study confirms that access to modern military technology substantially reduces the probability of democratisation of authoritarian regimes.

The study minutely details the spread of 29 ground-breaking military technologies in all independent states in the period 1820-2010 (see box) as well as the form of government in these states. Based on statistical analysis of the data, the study establishes connections between states’ access to specific weapons, their economy and form of government.

According to the researchers, it makes sense that modern weapons play a key role in suppressing democratic movements.

“In short, the more protesters a regime can kill using as few resources as possible, the stronger it will be. But this is the first scientific study to show that regimes’ access to weapons does have a systematic, measurable effect on democratisation,” says Associate Professor Asger Mose Wingender from the Department of Economics, UCPH, who conducted the study together with Professor Jacob Gerner Hariri from the Department of Political Science.

Less chance of democratisation

Incumbent rulers often use violence, or the mere threat thereof, to suppress popular uprisings. Although such uprisings contributed to two in three successful democratisations in the period 1820-2010, many more were nipped in the bud.

The study shows that the success of pro-democracy movements crucially depends on the incumbent rulers’ (in)ability to inflict violence on protesters, and that this ability depends on arms technology. Overall, the study concludes that the chance of a democratic transition today is about 1.3 percentage points lower per year in the autocracies with the most advanced arms compared with the autocracies having access to the least advanced weaponry.

A difference of 1.3 percentage points may seem small, but compounded over many years it becomes substantial. And because modern arms technologies have become far more effective, the means of repression available to present-day autocratic regimes are radically different from those of their predecessors. The poorest countries in the world have access to potent arms technologies that are merely a few decades old, even though, on some metrics, they are less economically developed than Western Europe was two centuries ago.
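To see how such a small annual gap compounds, consider a minimal back-of-the-envelope sketch. The 1.3 percentage-point figure comes from the study; the baseline annual transition probability below is a hypothetical assumption chosen purely for illustration.

```python
# Illustrative only: the 1.3 percentage-point gap is from the study, but the
# baseline annual transition probability is a hypothetical assumption.
baseline_annual_prob = 0.030                          # assumed annual chance of democratisation, lightly armed autocracy
reduced_annual_prob = baseline_annual_prob - 0.013    # heavily armed autocracy, per the study's estimate

def prob_democratised_within(years, annual_prob):
    """Chance of at least one democratic transition within `years`,
    treating each year as an independent draw with probability `annual_prob`."""
    return 1 - (1 - annual_prob) ** years

for years in (10, 30, 50):
    low_tech = prob_democratised_within(years, baseline_annual_prob)
    high_tech = prob_democratised_within(years, reduced_annual_prob)
    print(f"{years:2d} years: {low_tech:.0%} (few arms) vs {high_tech:.0%} (advanced arms)")
```

Even this simple compounding exercise shows how a seemingly small annual difference separates autocracies that are likely to democratise within a generation from those that are not.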

This is an entirely new situation, Asger Mose Wingender explains:

“Historically, the development in military technology has run parallel to economic and other technological developments. It propelled the democratisation of the Western world, because in order to wage war, the state collected taxes from its citizens, who in turn would often demand and be granted the right to vote,” he says.

“Today, there is less pressure on autocratic regimes. Weapons are more cost-efficient, and technologies have spread to poor countries, giving authoritarian rulers access to extremely strong means of repression. Consequently, an imbalance has emerged between military-technological development and economic development that inhibits democratisation.”

Democratisation does not occur automatically

This imbalance between prosperity and democratic reform may be the most thought-provoking result of the study. Many Western economists and political scientists have suggested that a country’s level of economic development is a deciding factor in democratic reform: When the wealth of the state and its citizens increases, many countries will move towards democratisation.

The new study confirms that economic modernisation is indeed a key factor in democratisation, but it refutes the idea that democratisation follows automatically, because authoritarian regimes’ increased access to highly effective arms counterbalances economic progress and prosperity.

“Our conclusion is in fact rather pessimistic,” says Asger Mose Wingender.

“We have this idea in the West that the economic development of countries such as China and Russia will lead to democracy when the growing middle classes begin to demand a say. And it is true that the economic development has made people in general want democracy, but at the same time, the states have access to better means of repression. This makes revolutionary waves like the ones we saw in Europe in e.g. 1848-1849 and after the fall of the Berlin Wall less likely to succeed today. Particularly in parts of the world that are less developed than Europe.”

This may also affect the way the Western world relates to autocratic regimes.

“Our study suggests that we in the Western world may have been naïve when it comes to modern dictatorships, and that we cannot simply apply Western European experiences with democratisation to the rest of the world,” Wingender says.

--

About the study

The study, ‘Jumping the gun: how dictators got ahead of their subjects’, was conducted by Associate Professor Asger Mose Wingender from the Department of Economics and Professor Jacob Gerner Hariri from the Department of Political Science, UCPH.

The study details the spread of weapons technology in all independent states in the period 1820-2010 with focus on 29 types of weapons divided into six categories: small arms, machine guns, artillery, tanks, attack aircraft and combat helicopters. All 29 weapon types are ground-breaking in military history and can potentially be used against a country’s own population.

Moreover, the study identifies the form of government in the countries throughout the period, including changes in form of government. Through statistical (econometric) analysis of the two datasets, and by including a number of other factors such as economic development and geographic location, the study estimates the impact of military technologies on processes of democratisation while controlling for potential confounders.
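For readers curious what such an analysis can look like in practice, the sketch below sets up a simple linear probability model on a country-year panel. It is only an illustration of the general approach described above, not the authors’ actual specification; the file name and column names are hypothetical.

```python
# A minimal sketch of the kind of country-year panel regression described above.
# NOT the authors' specification; the file name and column names are hypothetical.
import numpy as np               # used inside the formula for np.log
import pandas as pd
import statsmodels.formula.api as smf

# Assumed layout: one row per country-year with columns
#   democratised (0/1), military_tech (number of technologies available),
#   gdp_per_capita, region, country, year
panel = pd.read_csv("panel.csv")

# Linear probability model: is access to more military technologies associated with
# a lower chance of democratisation, controlling for economic development,
# region, and common time shocks?
model = smf.ols(
    "democratised ~ military_tech + np.log(gdp_per_capita) + C(region) + C(year)",
    data=panel,
).fit(cov_type="cluster", cov_kwds={"groups": panel["country"]})

print(model.summary())
```

Clustering the standard errors by country acknowledges that observations from the same country are correlated over time.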

Virologists call for rational discourse on gain of function research

Peer-Reviewed Publication

AMERICAN SOCIETY FOR MICROBIOLOGY

Washington, DC – January 26, 2023 – The study of viruses is under renewed scrutiny, say more than 150 experts in a new commentary published today in mSphere, mBio and the Journal of Virology, journals of the American Society for Microbiology.

The commentary’s authors call on policymakers to recognize the need for more rational discourse around the future of virology. They urge a more nuanced, evidence-based discussion of gain of function research and provide evidence of the benefits of this type of research for human health. Their concerns focus especially on enhanced potential pandemic pathogen (ePPP) research and dual use research of concern (DURC).

“To respond rapidly to emerging viral threats, we must be able to apply modern biology tools to viruses, which will ensure that we reduce the burden of future disease outbreaks,” said Felicia Goodrum, Ph.D., co-Editor-in-Chief of ASM’s Journal of Virology.

The current debate regarding the origin of the SARS-CoV-2 pandemic is partly due to a theory suggesting it may have been caused by an accidental or intentional lab leak. However, evidence strongly suggests that the virus originated through zoonotic transmission, the transfer of the virus from wild animals to humans.

Nevertheless, a narrative against this valuable research tool has developed, putting the field of virology at risk despite its critical role in preparing humanity to fight threats posed by viruses.

"Research on dangerous pathogens does require oversight, but we must be careful to not overly restrict the ability of scientists to generate the knowledge needed to protect ourselves from these pathogens, said Michael Imperiale, Ph.D., a professor with the Department of Microbiology and Immunology at the University of Michigan Medical School and Editor-in-Chief of ASM’s journal mSphere.  

As policymakers take a renewed look at policies surrounding gain of function research, the authors state, they should take into account the abundance of existing oversight of virology research and make a concerted effort to avoid redundant measures.

Gain of function research and the regulation of virus research are the subject of a meeting of the National Science Advisory Board for Biosecurity on January 27; the board has released draft findings and recommendations.

 

“It is critical that policy makers, virologists, and biosafety experts work together to ensure that research is conducted safely, with the common goal of reducing the burden of disease caused by viruses,” said Seema Lakdawala, Ph.D., an associate professor with the Department of Microbiology and Immunology at Emory University.  

###

The American Society for Microbiology is one of the largest single life science societies, composed of more than 30,000 scientists and health professionals. ASM's mission is to promote and advance the microbial sciences.   

ASM advances the microbial sciences through conferences, publications, certifications, educational opportunities and advocacy efforts. It enhances laboratory capacity around the globe through training and resources. It provides a network for scientists in academia, industry and clinical settings. Additionally, ASM promotes a deeper understanding of the microbial sciences to diverse audiences. 

New facility at KIT produces carbon out of air

The climate-friendly NECOC process produces carbon out of the CO2 from ambient air

Business Announcement

KARLSRUHER INSTITUT FÜR TECHNOLOGIE (KIT)


The new NECOC facility at KIT produces the high-tech resource carbon out of the climate-harming CO2 in the ambient air. (Photo: Markus Breig) 


CREDIT: MARKUS BREIG, KIT

Germany is progressing on its way to climate neutrality – and has to close carbon cycles in its industries as soon as possible to get there. To reach the 1.5-degree target, the Intergovernmental Panel on Climate Change (IPCC) suggests removing and permanently storing already emitted CO2. “We have to find completely new technological solutions if we want to keep up industrial production,” says Dr. Benjamin Dietrich of the KIT Institute of Thermal Process Engineering (TVT). “This includes the industrial carbon supply. Carbon is needed for the production of batteries, building materials and paints, and in the agricultural sector. So far, it comes largely from fossil sources.” In the research project NECOC (short for: NEgative CarbOn Dioxide to Carbon), coordinated by Dietrich, the associated partners KIT, INERATEC, and Climeworks are developing a process to convert CO2 from the atmosphere into carbon. “If this carbon remains permanently bound, we successfully combine negative emissions with a component of the post-fossil resource supply as part of a future carbon management strategy. This represents a double contribution to a sustainable future,” Dietrich explains. In the first project phase, the research team constructed a container-sized test facility, which has now gone into operation. This first-phase installation removes two kilograms of CO2 from the ambient air per day and turns it into 0.5 kilograms of solid carbon.

 

In Three Steps from Greenhouse Gas to Useful Resource

The NECOC process combines three steps. The first is an absorber that separates CO2 from the ambient air (direct air capture). In the second step, the CO2 is fed into a microstructured reactor, where it reacts with sustainably produced hydrogen from a connected electrolyzer: carbon and oxygen form new bonds, and the CO2 is converted into methane and water. While the water flows back to the electrolyzer, the methane, carrying the carbon, passes into a reactor filled with liquid tin. This is where the third process step takes place: in rising bubbles, a pyrolysis reaction splits the methane molecules, producing hydrogen, which is returned to the methanation step to convert more CO2. What remains is carbon, which floats on the tin as microgranules and can be skimmed off mechanically at regular intervals. Changing process parameters such as the temperature allows the production of different carbon modifications such as graphite, carbon black, or even graphene.
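A simple stoichiometric mass balance shows why roughly two kilograms of captured CO2 yield about half a kilogram of solid carbon. The sketch below assumes the idealised reactions described above (methanation, CO2 + 4 H2 → CH4 + 2 H2O, followed by methane pyrolysis, CH4 → C + 2 H2) and complete conversion; real yields will be lower.

```python
# Idealised NECOC mass balance (assumes complete conversion; illustration only).
M_CO2 = 44.01   # g/mol
M_C = 12.01     # g/mol
M_H2 = 2.016    # g/mol

co2_per_day_kg = 2.0                        # CO2 captured from air per day (first-phase facility)
mol_co2 = co2_per_day_kg * 1000 / M_CO2     # moles of CO2

# Methanation: CO2 + 4 H2 -> CH4 + 2 H2O   (one CH4 per CO2)
# Pyrolysis:   CH4 -> C + 2 H2             (one solid C per CH4, 2 H2 recycled)
carbon_kg = mol_co2 * M_C / 1000
h2_consumed_net_kg = mol_co2 * 2 * M_H2 / 1000   # 4 H2 in, 2 H2 recovered from pyrolysis

print(f"Theoretical solid carbon per day: {carbon_kg:.2f} kg")   # ~0.55 kg, close to the reported 0.5 kg
print(f"Net hydrogen demand per day: {h2_consumed_net_kg:.2f} kg")
```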

 

Optimize and Scale for Industrial Application

The start of the test installation is an important milestone for the NECOC project and marks the end of the first funding phase. In the second project phase, the NECOC process will be scaled up and optimized. “We are planning to make the procedure more energy-efficient by improving the energy recovery from the process heat,” states project director Dr. Leonid Stoppel from the Karlsruhe Liquid Metal Laboratory (KALLA). “We are also looking into integrating high-temperature heat storage and direct solar heating.” Additional research topics are the inclusion of CO2 point sources, novel approaches to extracting CO2 from the air, and the influence of trace components and impurities in the process network on carbon quality.

 

About NECOC

In the framework of the NECOC research project, a climate-friendly process with negative emissions is being developed for the production of the high-tech resource carbon from atmospheric CO2, as an element of a carbon management strategy. NECOC started at the end of 2019 with the construction of the components for the three underlying process steps: direct air capture, methanation, and pyrolysis. After successful testing of each individual component, the combined installation was realized in 2022 and put into operation at the beginning of December. Involved in the project are the Karlsruhe Liquid Metal Laboratory (KALLA), part of the Institute for Thermal Energy Technology and Safety (ITES), as well as the Institute of Thermal Process Engineering (TVT). Project partners are INERATEC GmbH, a KIT spinoff, and Climeworks Deutschland GmbH.

 

More Information (in German): https://www.tvt.kit.edu/21_3547.php

 

More about the KIT Energy Center

 

Being “The Research University in the Helmholtz Association”, KIT creates and imparts knowledge for society and the environment. Its objective is to make significant contributions to the global challenges in the fields of energy, mobility, and information. To this end, about 9,800 employees cooperate in a broad range of disciplines in natural sciences, engineering sciences, economics, and the humanities and social sciences. KIT prepares its 22,300 students for responsible tasks in society, industry, and science by offering research-based study programs. Innovation efforts at KIT build a bridge between important scientific findings and their application for the benefit of society, economic prosperity, and the preservation of our natural basis of life. KIT is one of the German universities of excellence.

Overview of the climate-friendly NECOC process to turn CO2 into carbon. (Graph: modus: medien + kommunikation gmbh)


Urgent need to address ‘regulatory void’ for online gambling across sub-Saharan Africa – new study

University of Bath Press Release

Peer-Reviewed Publication

UNIVERSITY OF BATH

Governments across sub-Saharan Africa are struggling to keep pace with the mass expansion of gambling, brought about through online technologies and smartphone apps, say the authors of a new study funded by the British Academy and Global Challenges Research Fund (GCRF).

An international team, coordinated by the universities of Ghana, Bath, and Glasgow, in partnership with the Malawi Epidemiology and Intervention Research Unit (MEIRU), comprehensively reviewed existing policies in place to regulate the gambling industry across 49 countries.

They found that gambling was legal in 41 of the 49 countries, but only two of these published annual reports about its impacts. In 36, no public reporting was accessible at all.

This lack of transparency and general lack of coherence about gambling regulation suggests that as the industry grows, governments are increasingly unaware of the scale of the problem; namely, the negative effects gambling is having on users’ finances and their mental health.

With the advent of digital technologies fuelling the expansion of the industry, the researchers argue that regulation is outdated and ill-equipped for a digital age. At present, many territories lack explicit regulation for online gambling, or protections against the deluge of advertising that has become commonplace in more established gambling markets, like the UK.

What’s more, smartphones are allowing more and more users to access sports gambling 24/7 via a range of apps for both domestic and international fixtures. Whilst there is growing concern about the industry’s expansion in the Global North, little attention has so far been paid to gambling growth across Africa.

There, the review argues, gambling has fast become ubiquitous in daily life, from billboards to TV advertising. Young people are particularly exposed to this. Two separate recent studies found that up to 10% of urban Ethiopian teenagers were compulsive gamblers and that, in Uganda, over 90% of urban young people had at least one gambling problem.

Overall, limited information was available from governments about the size of the gambling market, participation, harms, and prevention. Only in Malawi and South Africa was market size reported. Botswana’s regulatory authority was the only one to offer participation statistics. Rates of harmful gambling were only reported in Botswana and South Africa.

First author on the paper, Junious Sichali of the Malawi Epidemiology and Intervention Research Unit (MEIRU), explained: “We’re witnessing an exponential growth of gambling across Africa. In Malawi alone, exposure to gambling, either via street vendors and betting shops, or the deluge of adverts across billboards, TV, live sport and social media, is everywhere.

“What is most concerning is that weak regulation means that many people don’t understand the risks of harm; some even see gambling as a way to earn a living in hard times. Urgent action cannot come soon enough.”

Co-Investigator from the University of Ghana, Dr Joana Salifu Yendork added: “The findings will come as no surprise to many in Ghana, where existing regulation has not been effective in either curbing the rapid growth of gambling or protecting against its potentially harmful effects.

“What is most concerning is that for unemployed youth in particular, gambling is not just a leisure form but a potential source of income and wealth. Robust policymaking is essential to prevent further harms.”

Principal Investigator on the project, Dr Darragh McGee at the University of Bath said: “The findings point to a lack of regulation, transparency, and protection from gambling harms across Sub-Saharan Africa, despite evidence that the industry is rapidly expanding. The need to address this regulatory void and prioritise a public health approach is urgent.

“Decisive action by policymakers across Africa can safeguard against the kind of runaway excess and harms that have been an avoidable by-product of gambling in markets such as the UK.”

Professor Gerda Reith, Professor of Social Sciences at the University of Glasgow, added: “The rapid expansion of the gambling industry into Sub Saharan Africa is especially worrying when the regulatory frameworks that might control them are often weak or poorly enforced, as this research has found.

“This trend raises serious concerns about the potential impacts of harm on the health and wellbeing of vulnerable populations, many of whom live in conditions of poverty and unemployment. Urgent, joined up action by national and international policymakers is badly needed in response.”

The paper, ‘Regulation of gambling in Sub-Saharan Africa: findings from a comparative policy analysis’, is accessible via the Journal of Public Health: https://www.sciencedirect.com/science/article/pii/S0033350622002281 .

MSU expert: An ‘extreme lack of consistency’ hampers the potential of nanosized medicines

Peer-Reviewed Publication

MICHIGAN STATE UNIVERSITY

Morteza Mahmoudi 

IMAGE: Michigan State University Assistant Professor Morteza Mahmoudi studies the factors limiting the development of nanomedicines.

CREDIT: MICHIGAN STATE UNIVERSITY

EAST LANSING, Mich. – Michigan State University researcher Morteza Mahmoudi studies factors impeding the development of very promising and extremely tiny diagnostics and therapeutics known as nanomedicines.

One of those factors is a lack of standards when it comes to how these medicines are analyzed and characterized in the laboratory. Mahmoudi was part of a team that recently revealed a shocking level of disagreement between lab results that researchers rely on as they create and test new nanomedicines. That team included Ali Ashkarran at MSU and collaborators at the University of California, Berkeley and the Karolinska Institute in Sweden.

Mahmoudi, an assistant professor in MSU’s Department of Radiology, explains why addressing such disagreements with stronger standards will help ensure future nanomedicines are safe, effective and successful.

The following answers are excerpts and adaptations from an article originally published in The Conversation.

What are nanomedicines?

Nanomedicines took the spotlight during the COVID-19 pandemic. Researchers are using these very small and intricate materials to develop diagnostic tests and treatments. Nanomedicines are already in use against various diseases; examples include the COVID-19 vaccines and therapies for cardiovascular disease. The “nano” refers to the use of particles that are only a few hundred nanometers in size, far smaller than the width of a human hair.

If we’re already using nanomedicines, why is there a need to develop and implement new standards for how we study them?

Although there are success stories in the field, many scientists — including myself — believe there aren’t enough, especially considering the amount of effort and taxpayer money we’ve invested in nanomedicine development. 

To that end, researchers have been working to improve the safety and efficacy of nanomedicine through various approaches. These include modifying study protocols, methodologies and analytical techniques to standardize the field and improve the reliability of nanomedicine data.

Aligned with these efforts, my team and I have identified several critical but often overlooked factors that can influence the performance of a nanomedicine, such as a person’s sex, prior medical conditions and disease type.

Taking these factors into account when designing studies and interpreting results could enable researchers to produce more reliable and accurate data and lead to better nanomedicine treatments.

Can you give an example of how?

Nanomedicines, just like all medications, are surrounded by proteins from the body once they come into contact with the bloodstream. This protein coating, known as a protein corona, gives nanoparticles a biological identity. This biological identity determines how the body will recognize and interact with the particles, like how the immune system has specific reactions against certain pathogens and allergens.

Knowing the precise type, amount and configuration of the proteins and other biomolecules attached to the surface of nanomedicines is critical to determine safe and effective dosages for treatments.

However, one of the few available approaches to analyze the composition of protein coronas requires instruments that many nanomedicine laboratories lack. So these labs typically send their samples to separate proteomics facilities to do the analysis for them. Unfortunately, many facilities use different sample preparation methods and instruments, which can lead to differences in results.

Those are the types of differences you investigated in your team’s new study. What did you find?

We wanted to test how consistently these proteomics facilities analyzed protein corona samples. To do this, my colleagues and I sent biologically identical protein coronas to 17 different labs in the U.S. for analysis.

We had striking results: Less than 2% of the proteins the labs identified were the same.
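The headline figure is essentially an overlap statistic: of all proteins reported by at least one facility, how many were reported by every facility? The toy sketch below, using made-up protein lists rather than the study’s data, shows one simple way to compute such a consistency score.

```python
# Illustrative only: toy protein lists, not the study's data.  Each set holds the
# protein identifiers one proteomics facility reported for the same corona sample.
lab_reports = [
    {"ALB", "APOA1", "FGA", "C3", "TF"},
    {"ALB", "APOB", "FGA", "IGHG1"},
    {"ALB", "APOA1", "C3", "VTN", "FGA"},
]

shared = set.intersection(*lab_reports)   # proteins reported by every facility
union = set.union(*lab_reports)           # proteins reported by any facility
consistency = len(shared) / len(union)

print(f"Proteins identified by all labs: {sorted(shared)}")
print(f"Consistency: {consistency:.0%}")  # the study found under 2% agreement across 17 labs
```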

Our results reveal an extreme lack of consistency in the analyses researchers use to understand how nanomedicines work in the body. This may pose a significant challenge not only to ensuring the accuracy of diagnostics, but also the effectiveness and safety of treatments based on nanomedicines.

Read on MSUToday.


Nanoparticles, the white disks, can be used to deliver treatment to cells, shown in blue.

CREDIT

Brenda Melendez and Rita Serda/National Cancer Institute, National Institutes of Health, CC BY-NC

Michigan State University has been advancing the common good with uncommon will for more than 165 years. One of the world's leading research universities, MSU pushes the boundaries of discovery to make a better, safer, healthier world for all while providing life-changing opportunities to a diverse and inclusive academic community through more than 200 programs of study in 17 degree-granting colleges.

For MSU news on the Web, go to MSUToday. Follow MSU News on Twitter at twitter.com/MSUnews.

 

Quantum sensors see Weyl photocurrents flow

Boston College-led team develops new quantum sensor technique to image and understand the origin of photocurrent flow in Weyl semimetals

Peer-Reviewed Publication

BOSTON COLLEGE

Quantum sensors see Weyl photocurrents flow 

IMAGE: A team of Boston College researchers discovered that the photocurrent flows in (illustrated in blue) along one crystal axis of the Weyl semimetal and flows out (illustrated in yellow/orange) along the perpendicular axis, represented here as a result of a new technique the team developed using quantum magnetic field sensors to visualize the flow of electricity.

CREDIT: ZHOU LAB, BOSTON COLLEGE

Chestnut Hill, Mass. (1/26/2023) – Quantum sensors can be used to reveal a surprising new mechanism for converting light into electricity in Weyl semimetals, Boston College Assistant Professor of Physics Brian Zhou and colleagues report in the journal Nature Physics.

A number of modern technologies, such as cameras, fiber optic networks, and solar cells, rely on the conversion of light into electrical signals. But with most materials, shining a light onto their surface will not generate any electricity, because there is no preferred direction for the electricity to flow. The unique properties of electrons in Weyl semimetals have made them a focus of researchers trying to overcome those limits and develop novel optoelectronic devices.

“Most photoelectrical devices require two different materials to create an asymmetry in space,” said Zhou, who worked with eight BC colleagues and two researchers from the Nanyang Technological University in Singapore. “Here, we showed that the spatial asymmetry within a single material – in particular the asymmetry in its thermoelectric transport properties – can give rise to spontaneous photocurrents.”

The team studied the materials tungsten ditelluride and tantalum iridium tetratelluride, which both belong to the class of Weyl semimetals. Researchers have suspected that these materials would be good candidates for photocurrent generation because their crystal structure is inherently inversion asymmetric; that is to say, the crystal does not map onto itself by reversing directions about a point.

Zhou’s research group set out to understand why Weyl semimetals are efficient at converting light into electricity. Previous measurements could only determine the amount of electricity coming out of a device, like measuring how much water flows from a sink into a drainpipe. To better understand the origin of the photocurrents, Zhou’s team sought to visualize the flow of electricity within the device – similar to making a map of the swirling water currents in the sink.

“As part of the project, we developed a new technique using quantum magnetic field sensors called nitrogen-vacancy centers in diamond to image the local magnetic field produced by the photocurrents and reconstruct the full streamlines of the photocurrent flow,” graduate student Yu-Xuan Wang, lead author on the manuscript, said.
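The reconstruction step Wang describes, going from a measured magnetic-field map back to the current distribution that produced it, is commonly done in Fourier space under the assumption of a thin, divergence-free current sheet. The sketch below is a generic illustration of that standard inversion, not the team’s actual analysis code; the function name and parameters are invented for the example.

```python
import numpy as np

def currents_from_bz(bz, pixel_size, standoff, mu0=4e-7 * np.pi):
    """Reconstruct an in-plane sheet-current density (Jx, Jy) from a map of the
    out-of-plane field Bz, assuming a thin, divergence-free 2D current sheet.

    bz         : 2D array of Bz values (tesla) measured at height `standoff`
    pixel_size : map pixel size (m)
    standoff   : sensor-to-sample distance (m)
    Returns (jx, jy) in A/m.
    """
    ny, nx = bz.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=pixel_size)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=pixel_size)
    KX, KY = np.meshgrid(kx, ky)
    K = np.sqrt(KX**2 + KY**2)

    bz_k = np.fft.fft2(bz)

    # Stream-function picture: J = curl(g zhat).  For a thin sheet, the Biot-Savart
    # law gives Bz(k) = (mu0/2) * k * exp(-k*d) * g(k), so g(k) follows by division.
    # The exp(+k*d) factor amplifies noise, so real analyses add a low-pass filter.
    with np.errstate(divide="ignore", invalid="ignore"):
        g_k = 2.0 * bz_k * np.exp(K * standoff) / (mu0 * K)
    g_k[K == 0] = 0.0  # the zero-frequency component is undetermined

    # Jx = dg/dy, Jy = -dg/dx  ->  in Fourier space: Jx = i*ky*g, Jy = -i*kx*g
    jx = np.real(np.fft.ifft2(1j * KY * g_k))
    jy = np.real(np.fft.ifft2(-1j * KX * g_k))
    return jx, jy
```

In practice, the exponential amplification of high spatial frequencies means the inversion is paired with a spatial filter set by the sensor standoff and the measurement noise.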

The team found the electrical current flowed in a four-fold vortex pattern around where the light shined on the material. The team further visualized how the circulating flow pattern is modified by the edges of the material and revealed that the precise angle of the edge determines whether the total photocurrent flowing out of the device is positive, negative, or zero.

“These never-before-seen flow images allowed us to explain that the photocurrent generation mechanism is surprisingly due to an anisotropic photothermoelectric effect – that is to say, differences in how heat is converted to current along the different in-plane directions of the Weyl semimetal,” Zhou said.

Surprisingly, the appearance of anisotropic thermopower is not necessarily related to the inversion asymmetry displayed by Weyl semimetals, and hence, may be present in other classes of materials.

“Our findings open a new direction for searching for other highly photoresponsive materials,” Zhou said. “It showcases the disruptive impact of quantum-enabled sensors on open questions in materials science.”

Zhou said future projects will use the unique photocurrent flow microscope to understand the origins of photocurrents in other exotic materials and to push the limits in detection sensitivity and spatial resolution.

In addition to Zhou and Wang, co-authors of the report “Visualization of bulk and edge photocurrent flow in anisotropic Weyl semimetals” include Boston College Associate Professor of Physics Ying Ran, Professor of Physics David Broido, and Assistant Professor of Physics Fazel Tafti; graduate students Xin-Yue Zhang, Thomas Graham, and Xiaohan Yao; and post-doctoral researcher Chunhua Li; as well as Nanyang Technological University Professor Zheng Liu and post-doctoral researcher Ruihuan Duan.

UK STUDY

Sugary drinks tax may have prevented over 5,000 cases of obesity a year in year six girls alone

Peer-Reviewed Publication

UNIVERSITY OF CAMBRIDGE

The introduction of the soft drinks industry levy – the ‘sugary drinks tax’ – in England was followed by a drop in the number of cases of obesity among older primary school children, according to Cambridge researchers. Taking into account current trends in obesity, their estimates suggest that around 5,000 cases of obesity per year may have been prevented in year six girls alone.

The study, published today in PLOS Medicine, looked at the impact of the levy on reception age children and those in year six, but found no significant association between the levy and obesity levels in year six boys or younger children from reception class.

Obesity has become a global public health problem. In England, one in ten reception age children (four to five years old) is living with obesity and this figure doubles to one in five children in year six (10 to 11 years). Children who are obese are more likely to suffer from serious health problems including high blood pressure, type II diabetes and depression in childhood and in later life.

In the UK, young people consume significantly more added sugars than is recommended – by late adolescence, they typically consume 70g of added sugar per day, more than double the recommended amount (30g). A large source of this is sugar-sweetened drinks. Children from deprived households are more likely to be at risk of obesity and to be heavy consumers of sugar-sweetened drinks.

In April 2018, to protect children from excessive sugar consumption and tackle childhood obesity, the UK government introduced a two-tier sugar tax on soft drinks – the soft drinks industry levy. The tax was targeted at manufacturers of the drinks to incentivise them to reduce the sugar content of soft drinks.

Researchers from the Medical Research Council (MRC) Epidemiology Unit at the University of Cambridge tracked changes in the levels of obesity in children in England in reception year and year six between 2014 and 2020. Taking account of previous trends in obesity levels, they compared changes in levels of obesity 19 months after the sugar tax came into effect.

The team found that the introduction of the sugar tax was associated with an 8% relative reduction* in obesity levels in year six girls, equivalent to preventing 5,234 cases of obesity per year in this group alone. Reductions were greatest in girls whose schools were in deprived areas, where children are known to consume the largest amount of sugary drinks – those living in the most deprived areas saw a 9% reduction.

However, the team found no associations between the sugar tax coming into effect and changes in obesity levels in children from reception class. In year six boys, there was no overall change in obesity prevalence.

Dr Nina Rogers from the MRC Epidemiology Unit at Cambridge, the study’s first author, said: “We urgently need to find ways to tackle the increasing numbers of children living with obesity, otherwise we risk our children growing up to face significant health problems. That was one reason why the UK’s soft drinks industry levy was introduced, and the evidence so far is promising. We’ve shown for the first time that it is likely to have helped prevent thousands of children each year becoming obese.

“It isn’t a straightforward picture, though, as it was mainly older girls who benefited. But the fact that we saw the biggest difference among girls from areas of high deprivation is important and is a step towards reducing the health inequalities they face.”

Although the researchers found an association rather than a causal link, this study adds to previous findings that the levy was associated with a substantial reduction in the amount of sugar in soft drinks.

Senior author Professor Jean Adams from the MRC Epidemiology Unit said: “We know that consuming too many sugary drinks contributes to obesity and that the UK soft drinks levy led to a drop in the amount of sugar in soft drinks available in the UK, so it makes sense that we also see a drop in cases of obesity, although we only found this in girls. Children from more deprived backgrounds tend to consume the largest amount of sugary drinks, and it was among girls in this group that we saw the biggest change.”

There are several reasons why the sugar tax did not lead to changes in levels of obesity among the younger children, they say. Very young children consume fewer sugar-sweetened drinks than older children, so the soft drinks levy would have had a smaller effect. In addition, fruit juices are not included in the levy, yet they contribute similar amounts of sugar to young children’s diets as sugar-sweetened beverages do.

It’s unclear why the sugar tax might affect obesity prevalence in girls and boys differently, especially since boys are higher consumers of sugar-sweetened beverages. One explanation the researchers put forward is the possible impact of advertising – numerous studies have found that boys are often exposed to more food advertising content than girls, both through higher levels of TV viewing and in how adverts are framed. Physical activity is often used to promote junk food, and boys have been shown to be more likely than girls to believe that energy-dense junk foods depicted in adverts will boost physical performance, and so are more likely to choose energy-dense, nutrient-poor products following celebrity endorsements.

The study was a collaboration involving researchers from the University of Cambridge, London School of Hygiene and Tropical Medicine, University of Oxford, Great Ormond Street Institute of Child Health and University of Bath. It was supported by the National Institute of Health and Care Research, and the Medical Research Council.

*A relative reduction is the difference between the expected incidence of obesity had the sugar tax not been introduced and the actual incidence.
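As a concrete illustration of this definition, the sketch below plugs in hypothetical prevalence figures and an assumed cohort size; only the 8% relative reduction and the roughly 5,000 prevented cases echo numbers reported in the study.

```python
# Illustrative arithmetic only: hypothetical prevalence figures and cohort size.
expected_prevalence = 0.200   # assumed counterfactual obesity prevalence without the levy
observed_prevalence = 0.184   # assumed prevalence actually observed after the levy

relative_reduction = (expected_prevalence - observed_prevalence) / expected_prevalence
print(f"Relative reduction: {relative_reduction:.0%}")                     # -> 8%, matching the headline figure

girls_per_year_group = 320_000   # assumed size of the year six girl cohort, illustration only
cases_prevented = girls_per_year_group * (expected_prevalence - observed_prevalence)
print(f"Approximate cases prevented per year: {cases_prevented:,.0f}")     # on the order of 5,000
```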

Reference
Rogers, NT et al. Associations between trajectories of obesity prevalence in English primary school children and the UK soft drink industry levy: an interrupted time series analysis of surveillance data. PLOS Med; 26 Jan 2023; DOI: 10.1371/journal.pmed.1004160

CRISPR ALCHEMY

Secret recipe for limonoids opens door for bee-friendly crop protection

Peer-Reviewed Publication

JOHN INNES CENTRE

Limonoids 

IMAGE: The John Innes Centre research team used genomic tools to map the genome of Chinaberry (Melia azedarach), a mahogany species.

CREDIT: JOHN INNES CENTRE

Innovative research has uncovered the secret of how plants make limonoids, a family of valuable organic chemicals which include bee-friendly insecticides and have potential as anti-cancer drugs.

The research team, a collaboration between the John Innes Centre and Stanford University, used ground-breaking methods to reveal the biosynthetic pathway of these useful molecules, which are made by certain plant families, including mahogany and citrus.

In the study, which appears in Science, the John Innes Centre research team used genomic tools to map the genome of Chinaberry (Melia azedarach), a mahogany species, and combined this with molecular analysis to reveal the enzymes in the biosynthetic pathway.

“By finding the enzymes required to make limonoids, we have opened the door to an alternate production source of these valuable chemicals,” explained Dr Hannah Hodgson, co-first author of the paper and a postdoctoral scientist at the John Innes Centre.

Until now limonoids, a type of triterpene, could only be produced by extraction from plant material.

“Their structures are too complicated to efficiently make by chemical synthesis,” Dr Hodgson explained. “With the knowledge of the biosynthetic pathway, it is now possible to use a host organism to produce these compounds,” she added.

Armed with the complete biosynthetic pathway, researchers can now produce the chemicals in commonly used host plants such as Nicotiana benthamiana. This method can produce larger quantities of limonoids in a more sustainable way.

Increasing the supply of limonoids could enable the more widespread use of azadirachtin, the anti-insect limonoid obtained from the neem tree and used in commercial and traditional crop protection. Azadirachtin is an effective, fast degrading, bee-friendly option for crop protection but is not widely used due to limited supply.

The team made two relatively simple limonoids, azadirone from Chinaberry and kihadalactone A from citrus, and believe that the methods used here can now be applied as a template for making more complicated triterpenes.

Professor Anne Osbourn, group leader at the John Innes Centre and co-corresponding author of the study said: “Plants make a wide variety of specialised metabolites that can be useful to humans. We are only just starting to understand how plants make complex chemicals like limonoids. Prior to this project, their biosynthesis and the enzymes involved were completely unknown, now the door is open for future research to build on this knowledge, which could benefit people in many ways.”

Another example of a high-value limonoid which the team hopes to produce is the anti-cancer drug candidate nimbolide; this work could enable easier access to limonoids like nimbolide for further study. As well as producing known products like nimbolide, the research team say the door may open to understanding new activities of limonoids that have not yet been investigated.

The team at the John Innes Centre were funded by Syngenta and BBSRC via an industrial partnership award.

‘Complex scaffold remodeling in plant triterpene biosynthesis’ appears in Science. DOI: 10.1126/science.adf1017

Research Method in More Detail

The team at the John Innes Centre used genomic tools to assemble a chromosome-level genome for Chinaberry (Melia azedarach), within which they found the genes encoding 10 additional enzymes required to produce the azadirachtin precursor, azadirone. In parallel, the team working at Stanford found the 12 additional enzymes required to make kihadalactone A.

Expressing these enzymes in N. benthamiana enabled their characterisation, with the help of both liquid chromatography-mass spectrometry (LC-MS) and nuclear magnetic resonance (NMR) spectroscopy, technologies that allow molecular-level analysis of samples.