Monday, August 10, 2020

GI symptoms linked to behavioral problems in children, especially those with autism
by UC Davis

A new UC Davis Health study found that common gastrointestinal (GI) symptoms such as diarrhea, constipation and bloating are linked to troubling sleep problems, self-harm and physical complaints in preschool children. According to the study, published Aug. 6 in Autism Research, these GI symptoms are much more common and potentially disruptive in young kids with autism.

"Clinicians and parents need to be aware of the high occurrence of GI problems in kids with autism," said Bibiana Restrepo, assistant clinical professor of pediatrics and first author on the study. "This study highlights the link between GI symptoms and some problematic behaviors we see in preschool-aged children."

Children with autism experience more gastrointestinal symptoms

Gastrointestinal concerns are frequently reported by parents of children with autism spectrum disorder (ASD). Researchers from the UC Davis MIND Institute evaluated the presence of GI symptoms in preschool-aged children with and without autism.

The study included 255 children with ASD (184 males/71 females) between 2 and 3.5 years of age and 129 typically developing children (75 males/54 females) in the same age group. Pediatricians specializing in autism interviewed caregivers during the children's medical evaluation. They asked the parents how often their children experienced GI symptoms such as difficulty swallowing, abdominal pain, bloating, diarrhea, constipation, painful stooling, vomiting, blood in stool and blood in vomit.

The researchers grouped the children into two categories: those who experienced one or more GI symptoms and those who never or rarely had GI symptoms in the previous three months. They compared the two groups on measures of developmental, behavioral and adaptive functioning.

The study found that preschool-aged children with ASD were 2.7 times more likely to experience GI symptoms than their typically developing peers. In fact, almost 50% of children with ASD reported frequent GI symptoms—compared to 18% of children with typical development. Around 30% of the children with ASD experienced multiple GI symptoms.
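To see where a figure like "2.7 times more likely" can come from, here is a minimal Python sketch (illustrative only; the published study reports model-based, adjusted estimates) that computes a crude risk ratio from the two proportions quoted above:

# Minimal sketch: crude risk and odds ratios from the reported group proportions.
# Illustrative only; the published ~2.7x figure comes from the full analysis.

p_asd = 0.50  # ~50% of preschoolers with ASD had frequent GI symptoms (reported above)
p_td = 0.18   # ~18% of typically developing preschoolers (reported above)

risk_ratio = p_asd / p_td
odds_ratio = (p_asd / (1 - p_asd)) / (p_td / (1 - p_td))

print(f"risk ratio: {risk_ratio:.1f}")   # ~2.8, close to the reported ~2.7x
print(f"odds ratio: {odds_ratio:.1f}")   # ~4.6, a different (larger) measure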

Problem behaviors as an expression of GI discomfort in children
Multiple GI symptoms were associated with increased challenges with sleep and attention, as well as problem behaviors related to self-harm, aggression and restricted or repetitive behavior in both autistic and typically developing children. The severity of these problems was higher in children with autism.

"Problem behaviors may be an expression of GI discomfort in preschool-aged children," said Christine Wu Nordahl, associate professor at UC Davis MIND Institute and the department of psychiatry and behavioral sciences. "GI symptoms are often treatable, so it is important to recognize how common they are in children with autism. Treating their GI symptoms could potentially provide some relief to the kids and their parents."

The study found no link between GI symptoms and the children's cognitive development or gender. GI symptoms were equally common in male and female preschool children.




More information: Bibiana Restrepo et al, Developmental–behavioral profiles in children with autism spectrum disorder and co‐occurring gastrointestinal symptoms, Autism Research (2020). DOI: 10.1002/aur.2354


Provided by UC Davis

Confused by whole grain labels on food packaging? Study finds you're not alone

by Tufts University

Whole grain labels on cereal, bread, and crackers are confusing to consumers and could cause them to make fewer healthy choices, according to the results of a study that tested whether people are able to pick out the healthier, whole grain option based on food package labels.


The study, led by researchers at the Gerald J. and Dorothy R. Friedman School of Nutrition Science and Policy at Tufts University and NYU School of Global Public Health, is published today in Public Health Nutrition. The researchers say the findings could help lead to enhancements in food labeling.

A pool of 1,030 U.S. adults, representative of the population, responded to a survey with photos of both hypothetical and real products. The photos showed the products, with various whole grain labels on the front of the package, along with the nutrition facts label and ingredients list for each product. Participants were asked to identify the healthier option (for the hypothetical products) or assess the whole grain content (for the real products).
  • For the hypothetical products, 29-47% of respondents answered incorrectly (specifically, 31% incorrectly for cereal, 29-37% for crackers, 47% for bread).
  • For real products that were not mostly composed of whole grains, 43-51% of respondents overstated the whole grain content (specifically, 41% overstated for multigrain crackers, 43% for honey wheat bread, and 51% for 12-grain bread). Consumers more accurately stated the whole grain content for an oat cereal product that really was mostly composed of whole grain.

"Our study results show that many consumers cannot correctly identify the amount of whole grains or select a healthier whole grain product. Manufacturers have many ways to persuade you that a product has whole grain even if it doesn't. They can tell you it's multigrain or they can color it brown, but those signals do not really indicate the whole grain content," said first author Parke Wilde, a food economist and professor at the Friedman School.

The packages on the hypothetical products either had no front-of-package whole grain label or were marked with "multigrain," "made with whole grains," or a whole grain stamp. The packages on the real products displayed the actual product markings, including "multigrain," "honey wheat," and "12 grain."

The study goal was to assess whether consumer misunderstanding of the labels meets a legal standard for enhanced U.S. labeling requirements for whole grain products. The legal standard relates to deceptive advertising, and evidence that the labels are actually misleading—or likely to mislead—consumers can bolster support for regulations.


"With the results of this study, we have a strong legal argument that whole grain labels are misleading in fact. I would say when it comes to deceptive labels, 'whole grain' claims are among the worst. Even people with advanced degrees cannot figure out how much whole grain is in these products," said co-author Jennifer L. Pomeranz, assistant professor of public health policy and management at NYU School of Global Public Health.

Previous research has shown disparities in whole grain intake in the United States, including for example, lower intake for adolescents than for adults, and lower intake for participants in the Supplemental Nutrition Assistance Program (SNAP) than for higher-income non-participants. The authors of the new study found that consumers who were younger, had less education, were Black or African American, or reported having difficulty understanding food labels were more likely to answer incorrectly in the test involving hypothetical products.

The 2015-2020 Dietary Guidelines for Americans recommend that half of all grains consumed should be whole grains. Adequate intake of whole grains has been linked with reduced risk of heart disease, type 2 diabetes, and cancer.

"A large chunk of Americans' daily calories—42 percent—comes from low quality carbohydrates. Consuming more whole grains can help change that, but the policy challenge is to provide consumers with clear labels in order to make those healthier choices," said co-senior author Fang Fang Zhang, nutrition epidemiologist at the Friedman School.

Limitations of the study include the fact that respondents with higher education levels were moderately over-represented; since more educated respondents still answered incorrectly, the reported levels of confusion are, if anything, conservative. Also, a formal response rate cannot be calculated because participants were part of ongoing survey panels and volunteered to respond.




More information: Parke Wilde et al, Consumer confusion about wholegrain content and healthfulness in product labels: a discrete choice experiment and comprehension assessment, Public Health Nutrition (2020).


Provided by Tufts University

Analysis of renewable energy points toward more affordable carbon-free electricity

by California Institute of Technology

As more states in the U.S. push for increased reliance on variable renewable energy in the form of wind or solar power, long-term energy storage may play an important role in assuring reliability and reducing electricity costs, according to a new paper published by Caltech researchers.


Graduate student Jackie Dowling, who works in the lab of Nathan Lewis (BS '77), the George L. Argyros Professor and professor of chemistry, has collaborated with Ken Caldeira at the Carnegie Institution for Science and others to examine energy-storage options alongside multiple decades of data about wind and solar availability. Dowling and her collaborators determined that currently available battery technology is prohibitively expensive for long-term grid energy storage, and that alternative technologies that can store several weeks' to a month's worth of energy, and hold it for entire seasons or even multiple years, may be the key to building affordable, reliable renewable electricity systems.

Energy storage is needed with renewable energy because wind and solar energy are not as reliably available as fossil fuels. For example, wind power is often at its lowest during the summer in the United States, which is when the electrical grid is strained the most by the demand for air conditioning in homes and businesses.

"This research is motivated by the fact that laws in several states have mandated 100 percent carbon-free electricity systems by midcentury," says Dowling, lead author of a paper about the work. "Within these mandates, a lot of states include requirements for wind and solar power. Both wind and solar are variable from day to day, or even year to year, yet high reliability is mandatory for a viable electricity system. Energy storage can fill in for the gaps between supply and demand."

Dowling looked at short-duration storage systems, such as lithium-ion batteries, and long-duration storage methods, such as hydrogen storage, compressed-air storage, and pumped-storage hydroelectricity.

To see how to optimize the use of those storage technologies at the lowest energy cost, Dowling built a mathematical simulation of each and incorporated historical electricity-demand data and four decades of hourly resolved historical weather data across the contiguous U.S. The Macro Energy Model, as she calls it, reveals that adding long-duration storage to a wind-solar-battery system lowers energy costs. In contrast, using batteries alone for storage makes renewable energy more expensive.

Dowling says that the extra expense associated with batteries occurs because they cannot cost-effectively store enough energy for an entire season during which electricity is generated in lower amounts. That means an electrical grid would require many costlier solar panels or wind turbines to compensate and would result in wasteful idling of electricity-generation equipment for much of the year.
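To make that cost intuition concrete, here is a back-of-the-envelope Python sketch. Every number in it is an assumption chosen for illustration, not a figure from the Caltech study; the point is only that the cost of stored energy capacity dominates when weeks of demand must be covered:

# Toy illustration (not the Macro Energy Model): covering a month-long renewable
# shortfall with batteries vs. underground hydrogen storage. All numbers invented.

gap_days = 30           # hypothetical seasonal wind/solar shortfall
avg_load_gw = 10        # hypothetical average load to cover, in GW

energy_needed_kwh = gap_days * 24 * avg_load_gw * 1e6   # GW-hours -> kWh

# Assumed capital costs per unit of *energy* capacity (rough orders of magnitude):
battery_cost_per_kwh = 200.0    # $/kWh, lithium-ion scale
h2_cavern_cost_per_kwh = 2.0    # $/kWh, cavern hydrogen storage scale

print(f"battery energy capacity:  ${battery_cost_per_kwh * energy_needed_kwh / 1e9:,.0f}B")
print(f"hydrogen cavern capacity: ${h2_cavern_cost_per_kwh * energy_needed_kwh / 1e9:,.0f}B")

# Even allowing for hydrogen's lower round-trip efficiency (ignored here), the
# orders-of-magnitude gap in energy-capacity cost is why long-duration storage
# suits weeks-to-months of stored energy while batteries suit daily cycling.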

Currently available battery technology is not even close to being cost effective for seasonal storage, Dowling says.

"The huge dip in wind power in the summer in the U.S. is problematic, and batteries are not suitable for filling that gap. So, if you only have batteries, you have to overbuild wind or solar capacity," she says. "Long-duration storage helps avoid the need to overbuild power generation infrastructure and provides electricity when people need it rather than only when nature provides it. At current technology costs, storage in underground caverns of green hydrogen generated by water electrolysis would provide a cost-effective approach for long-duration grid storage."

Other researchers have built renewable energy models, but the team's data-driven approach is the first to incorporate four decades of historical wind and solar variability data, thus factoring in variability from year to year and periodic episodes of rare weather events that affect power generation, such as wind and solar droughts.

"The more years of data we use in our models, the more we find a compelling need for long-term storage to get the reliability that we expect from an electricity system," she says.

Dowling suggests her findings may be helpful to policy makers in states with 100 percent carbon-free electricity laws and high wind/solar mandates and to other U.S. states considering the adoption of similar laws. In the future, she plans to extend her research to take an in-depth look at the roles that specific types of energy storage, such as hydrogen or redox flow batteries, can play in renewable energy systems. For instance, some types of batteries might effectively serve as medium-duration energy storage, she says.

The paper, titled "Role of long-duration energy storage in variable renewable electricity systems," appears in the September issue of Joule.


More information: Jacqueline A. Dowling et al, Role of Long-Duration Energy Storage in Variable Renewable Electricity Systems, Joule (2020). DOI: 10.1016/j.joule.2020.07.007

Provided by California Institute of Technology

Machine-learning research may help find new tungsten deposits in England

by University of Exeter

Geologists have developed a machine learning technique that highlights the potential for further deposits of the critical metal tungsten in SW England.

Tungsten is an essential component of high-performance steels, but global production is strongly influenced by China, and Western countries are keen to develop alternative sources.


The work, published in the leading journal Geoscience Frontiers, has been led by Dr. Chris Yeomans, from the Camborne School of Mines, and involved geoscientists from the University of Nottingham, Geological Survey of Finland (GTK) and the British Geological Survey.

The research applies machine learning to multiple existing datasets to examine the geological factors that have resulted in known tungsten deposits in SW England.

These findings are then applied across the wider region to predict areas where tungsten mineralisation is more likely and might have previously been overlooked. The same methodology could be applied to help in the exploration for other metals around the world.
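As a rough illustration of what a prospectivity model does (a generic sketch, not the authors' knowledge-driven pipeline; the feature names and data below are invented), one could train a classifier on gridded geological layers in which cells hosting known tungsten occurrences are the positive examples:

# Generic prospectivity-mapping sketch (illustrative only; not the published method).
# Features and labels are synthetic stand-ins for real gridded geoscience datasets.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_cells = 5000  # grid cells covering a hypothetical study region

X = np.column_stack([
    rng.normal(size=n_cells),   # stand-in for, e.g., residual gravity anomaly
    rng.normal(size=n_cells),   # stand-in for, e.g., distance to granite contact
    rng.normal(size=n_cells),   # stand-in for, e.g., fault/lineament density
])
# Label a cell 1 if it "hosts" a known tungsten occurrence in this synthetic world.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n_cells) > 1.5).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Prospectivity = predicted probability that a cell hosts mineralisation; cells with
# high scores but no known deposit are the candidate exploration targets.
prospectivity = model.predict_proba(X)[:, 1]
print("highest-scoring cells:", np.argsort(prospectivity)[-5:])

In the published work the inputs are knowledge-driven features derived from regional geology and geophysics, and model confidence is assessed explicitly; the sketch only conveys the general mapping from data layers to a per-cell prospectivity score.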

Dr. Yeomans, a Postdoctoral Research Fellow at the Camborne School of Mines, based at the University of Exeter's Penryn Campus in Cornwall, said: "We're really pleased with the methodology developed and the results of this study.

"SW England is already the focus of UK mineral exploration for tungsten but we wanted to demonstrate that new machine learning approaches may provide additional insights and highlight areas that might otherwise be overlooked."

SW England hosts the fourth biggest tungsten deposit in the world (Hemerdon, near Plympton), which resulted in the UK being the sixth biggest global tungsten producer in 2017; the mine is currently being re-developed by Tungsten West Limited.

The Redmoor tin-tungsten project, being developed by Cornwall Resources Limited, has also been identified as being a potentially globally significant mineral deposit.

The new study suggests that there may be a wider potential for tungsten deposits and has attracted praise from those currently involved in the development of tungsten resources in SW England.

James McFarlane, from Tungsten West, said: "Tungsten has only been of economic interest in the last 100 years or so, during which exploration efforts for this critical metal have generally been short-lived.

"As such is very encouraging to see work that aims to holistically combine the available data to develop a tungsten prospectivity model in an area that has world-class potential."

Brett Grist, from Cornwall Resources added: "Our own work has shown that applying modern techniques can reveal world-class deposits in this historic and globally-significant mining district. Dr. Yeomans' assertion, that the likelihood of new discoveries of tungsten mineralisation may be enhanced by a high-resolution gravity survey, is something in which we see great potential. Indeed, such a program could stimulate the new discovery of economically significant deposits of a suite of critical metals, here in the southwest of the UK, for years to come."


More information: Christopher M. Yeomans et al. A machine learning approach to tungsten prospectivity modeling using knowledge-driven feature extraction and model confidence, Geoscience Frontiers (2020). DOI: 10.1016/j.gsf.2020.05.016
Provided by University of Exeter

'Achilles' flaw exposes a billion Android phones

by Peter Grad, Tech Xplore


One billion Android phones are at risk of attacks by hackers taking advantage of what a research firm says are 400 vulnerabilities detected in the smartphones' chips.

Collectively called "Achilles," the vulnerabilities were found in stretches of code within Qualcomm's Snapdragon chips, which are used in nearly half of all Android phones.

Addressing the DEF CON Safe Mode security conference Friday, researchers at Check Point security firm said phones could be turned into spying tools providing access to photos, videos, location data, and other sensitive user details. The hacker need only successfully persuade a user to install a seemingly benign app that requires no permissions to operate.

Hackers could spy on phone conversations, launch denial-of-service attacks, or surreptitiously plant malicious code.

"You can be spied on. You can lose all your data," said said Yaniv Balmas, head of cyber research at Check Point. "If such vulnerabilities are found and used by malicious actors, it will find millions of mobile phone users with almost no way to protect themselves for a very long time."

Check Point has distributed details of its findings to Qualcomm and affected phone vendors. It did not post the details publicly, so as not to give hackers any advantage.

Qualcomm said it is addressing the vulnerabilities, issuing a new compiler and a new software development kit. But it is up to phone vendors to distribute patches for each phone model carrying the affected processor.

"For vendors, it means they will need to recompile each and every DSP application they use, test them, and fix any issues [that] may occur," said Balmas. "Then they need to ship these fixes to all devices in the market."

Snapdragon chipsets have been a welcome component of smartphones, wearable devices, and automobile systems, embraced for their speed and performance benchmarks, power efficiency, 5G support, graphics handling, and embedded fingerprint-reading capacity.

Digital signal processors don't attract the same degree of scrutiny by researchers for possible flaws as other computer components because technical specs are usually closely guarded by manufacturers.

"While DSP chips provide a relatively economical solution that allows mobile phones to provide end users with more functionality and enable innovative features, they do come with a cost," researchers from Check Point state in a report posted online. "These chips introduce new attack surfaces and weak points to these mobile devices. DSP chips are much more vulnerable to risks as they are being managed as 'Black Boxes' since it can be very complex for anyone other than their manufacturer to review their design, functionality or code."

"Our research managed to break these limits and we were able to have a very close look at the chip's internal design and implementation in a relatively convenient way. Since such research is very rare, it can explain why we found so many vulnerable code sections," Balmas said.

Snapdragon system-on-a-chip products can be found on leading phone products by Google, Samsung, Xiaomi, LG, and OnePlus. Apple provides its own processors, so iPhones are not affected by Achilles.

Qualcomm said it has no evidence the vulnerabilities are "currently being exploited," but urged customers "to update their devices as patches become available and to only install applications from trusted locations, such as the Google Play Store."



Coronavirus transmission risk increases along wildlife supply chains


by Public Library of Science
Malayan porcupine (Hystrix brachyura) farm in Dong Nai province, November 2013. Credit: Huong et al, 2020 (PLOS ONE, CC BY)

Coronaviruses were detected in a high proportion of bats and rodents in Vietnam from 2013 to 2014, with an increasing proportion of positive samples found along the wildlife supply chain from traders to large markets to restaurants, according to a study published August 10 in the open-access journal PLOS ONE by Amanda Fine of the Wildlife Conservation Society and colleagues. As noted by the authors, the amplification of coronaviruses along the wildlife supply chain suggests maximal risk for end consumers and likely explains the coronavirus spillover to people.


Outbreaks of emerging coronaviruses in the past two decades and the current pandemic of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) highlight the importance of this viral family as a public health threat. Human-wildlife contact with a bat or an intermediate host species in China almost certainly triggered a coronavirus spillover event that may have involved wildlife markets and led to the pandemic spread of SARS-CoV-2, according to the latest scientific evidence. Beyond China, commercial wildlife farming in Vietnam is part of the expanded international wildlife trade that is thought to contribute to global epidemics, such as SARS and now coronavirus disease 2019 (COVID-19), which is caused by SARS-CoV-2.

To better understand the natural hosts of coronaviruses and the risk that these wildlife-human interfaces could facilitate spillover into humans, Fine and her collaborators investigated the presence and diversity of coronaviruses in wildlife at wildlife-human interfaces in Vietnam from 2013 to 2014 (years prior to the emergence of SARS-CoV-2).

They observed high proportions of positive samples of coronaviruses among field rats (34.0%, 239/702) destined for human consumption and bats in guano farms (74.8%, 234/313) adjacent to human dwellings. The odds of coronavirus detection increased along the supply chain, from field rats sold by traders (20.7%, 39/188), to field rats sold in large markets (32.0%, 116/363), and field rats served in restaurants (55.6%, 84/151). Coronaviruses were also detected in rodents on most wildlife farms sampled (60.7%, 17/28), affecting Malayan porcupines (6.0%, 20/331) and bamboo rats (6.3%, 6/96) raised for human consumption. To minimize the public health risks of viral disease emergence, the authors recommend improving coronavirus surveillance in wildlife and implementing targeted wildlife trade reform.
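The amplification "along the supply chain" can be read directly from the counts quoted above. The following Python check (a sketch using only the figures in this article, not the study's model-based analysis) recomputes the detection proportions and the corresponding odds at each stage:

# Quick re-computation from the field-rat counts quoted above (illustrative only;
# the study itself reports model-based trends with confidence intervals).
counts = {
    "sold by traders":       (39, 188),
    "sold in large markets": (116, 363),
    "served in restaurants": (84, 151),
}
for stage, (positive, n) in counts.items():
    p = positive / n
    odds = p / (1 - p)
    print(f"{stage:22s} {positive:>3}/{n:<3}  proportion={p:.1%}  odds={odds:.2f}")
# Proportions (~20.7%, ~32.0%, ~55.6%) and odds (~0.26, 0.47, 1.25) both rise from
# traders to markets to restaurants, matching the amplification described above.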

The authors add: "This study shows the wildlife supply chain generates a one-two punch when it comes to spillover risk. It is known to increase contact rates between wildlife and people and here we show how it greatly amplifies the number of infected animals along the way."


More information: Nguyen Quynh Huong et al, Coronavirus testing indicates transmission risk increases along wildlife supply chains for human consumption in Vietnam, 2013-2014, PLOS ONE (2020). DOI: 10.1371/journal.pone.0237129
Provided by Public Library of Science

Air pollution impacts the health of wild pollinators

by National Centre for Biological Sciences

AUGUST 10, 2020
Pseudocolorized, non-coated scanning electron micrographs of two foraging Giant Asian honeybee (Apis dorsata) wing regions, one (top) collected from a low-pollution site in Bangalore, India (average respirable suspended particulate matter, RSPM, particles < 10 μm, at 33.7 μg/m3) and one (bottom) collected from a highly polluted site 7.7 km away (average RSPM at 98.6 μg/m3). Note the presence of pollen (top) and RSPM (bottom) in the images. Credit: Micrograph obtained by Geetha Thimmegowda with a Zeiss Merlin Compact VP microscope at 1 kV EHT and 460X magnification.
According to the World Health Organization (WHO), nine of the world's 10 most polluted cities are in India. Yet, researchers have almost no idea how air pollution is affecting non-human organisms. In some of the first research to address the physiological and molecular impacts of air pollution on wild plants and animals, scientists from the Bangalore Life Science Cluster show that air pollution could be devastating for organisms humans rely on most for survival, like the honey bee.

Apis dorsata, or the giant Asian honey bee, is not only a common resident of Indian cities, but it is an important contributor to India's food security and ecosystems. This bee produces over 80% of the country's honey, and pollinates over 687 plants in Karnataka alone. Seventy-five percent of Indian crop species rely to some extent on animals, and mostly insects, for their production. India is the largest fruit producer and second-largest vegetable producer in the world. Without insect pollinators like honey bees, the yearly mango export would lose over Rs. 65,000 Lacs. The importance of bees and other pollinators to India's plant biodiversity and agroeconomy cannot be overstated.

Led by Shannon Olsson at the National Centre for Biological Sciences in Bangalore, Geetha Thimmegowda and colleagues embarked on a four-year study of over 1800 wild bees. Their study was published this week in the Proceedings of the National Academy of Sciences. Through a series of experiments, along with honeybee expert Dr. Axel Brockmann of NCBS and cardiovascular researcher Dr. Dandipany Perunderai of the Institute for Stem Cell Science and Regenerative Medicine (inStem) and the Knight Cardiovascular Institute, the scientists found that giant Asian honey bees from more polluted areas of the megacity of Bangalore exhibited lower flower visitation rates than bees from less polluted areas. Bees from more polluted areas likewise showed significant differences in heart rhythmicity, blood cell count, and the expression of genes coding for stress, immunity and metabolism. Repeating these experiments with lab-reared Drosophila revealed similar effects, suggesting that the impact of air pollution is not species-specific, nor likely the result of other environmental factors.
Giant Asian Honey bee (Apis dorsata) colonies in Bengaluru, India. Credit: Elephant Corridor Films (elephantcorridorfilms.com)

Dr. Hema Somanathan, who studies bee behavior and pollination ecology at the Behavioral and Evolutionary Ecology (BEE) Laboratory, Indian Institute of Science Education and Research, Thiruvananthapuram says, "The study was done with wild bees naturally visiting flowers in Bangalore and not in lab assays on reared honey bees kept in hive boxes that may already be stressed or immuno-compromised. Thus, in my opinion, this study provides us with hard evidence that all is not well with our wild bees. Given the scale of landscape alteration and urbanization in India, it is expected that these effects are widespread and likely to worsen with time."

Perhaps most strikingly, the researchers found over 80% of the bees collected from the moderate and highly polluted sites died within 24 hours. These RSPM levels were similar to the Interim Target II guidelines proposed by the WHO. To this end, Arunabha Ghosh, founder and CEO of the Council on Energy, Environment and Water says, "So far, much of the air quality studies in India have either considered sources of pollution or impact on human health, and to an extent on economic productivity. This study covers important new ground, by examining the impact of air pollution on pollinators, which would have serious implications for agricultural output in India. Such findings further underscore the need to raise India's ambient air quality standards."


Finally, Shloka Nath, executive director at the India Climate Collaborative and the Head of Sustainability and Special Projects at Tata Trusts, says, "Better application of research and evidence in development policy-making can save lives, reduce poverty, and improve quality of life. In the case of this study, the research speaks for itself: we now have concrete proof that by polluting our air, we are not only endangering our own health, we are also affecting the wild animals and plants who depend on it for sustenance. This has far-reaching implications for the complex ecosystems we are part of, as these changes affect the quality of habitat and food sources we all depend on."



More information: Geetha G. Thimmegowda et al., A field-based quantitative analysis of sublethal effects of air pollution on pollinators, PNAS (2020).


Provided by National Centre for Biological Sciences

Poverty alleviation efforts are shaping the success of environmental targets
by University of Sheffield 


AUGUST 10, 2020

Social protection programs can facilitate progress towards the Sustainable Development Goals (SDGs) but can also create trade-offs across divergent social and environmental goals that can undermine their effectiveness, say the authors of new research published in the journal PNAS. This is one of the largest studies on the sustainability implications of social protection, funded by the Grantham Centre for Sustainable Futures at the University of Sheffield.


Focusing on Brazil's flagship Zero Hunger (ZH) social protection scheme, designed to alleviate food insecurity and hunger through cash transfers and agricultural support, the study highlights the importance of considering the social and environmental outcomes of development policies. The authors used data spanning 13 years (2000-2013) and covering around 4,000 rural municipalities in Brazil. Their results draw out implications for Brazil's progress towards the SDGs, specifically: no poverty (SDG 1), zero hunger (SDG 2), good health and wellbeing (SDG 3) and life on land (SDG 15).

The ZH program was implemented in 2004, with small-scale family farmers as its primary target beneficiaries and the goal of lifting 44 million Brazilians out of poverty and food insecurity. This is a globally important group, with 12% of the world's agricultural land managed by some 475 million smallholders. The programme has been praised for playing a key role in enabling Brazil to meet its Millennium Development Goals in 2015.

The study found that successful elements of the ZH program include evidence of increased food production (SDG 2) and slightly reduced poverty (SDG 1). However, this contrasts with more variable outcomes across food security dimensions and regions, depending on whether cash transfers or agricultural support were used. In addition, there were widespread trade-offs with other Sustainable Development Goals, notably environmental protection (SDG 15).

Dr. Cecilie Dyngeland (who conducted the research as part of her Ph.D. at the University of Sheffield) said:

"Alleviating poverty is essential, but we rarely think about the unintended environmental consequences of poverty alleviation policies. A key strength of our analysis is that it allows us to understand how policies affect multiple social and environmental outcomes simultaneously."

Despite these evident shortcomings, the authors suggest there are ways to balance human development with environmental integrity.


Dr. Johan Oldekop (at the Global Development Institute, University of Manchester) said:

"We find that the same programme can lead to contrasting outcomes in different regions of Brazil. It is critical for us to understand what processes have enabled joint positive social and environmental outcomes, in order to learn from these synergies and develop incentives that avoid trade-offs."

The research team's analysis of the ZH programme provides insights on how to achieve multiple sustainability outcomes whilst being directly relevant to the design and implementation of social protection mechanisms around the world. This is particularly salient in Africa, where social protection programs based on ZH currently operate in several countries. The research compared two different types of protection programmes and found that cash transfers were less likely than agricultural support to generate synergies across development and environmental objectives.

As Dr. Karl Evans (from the Animal and Plant Science Department at the University of Sheffield) added,

"This research demonstrates that development policies can enhance or degrade the natural environments which are vital for the well-being and livelihoods of many vulnerable people. Development policies need to focus on strategies that enhance rather than degrade this capacity. Linking social protection to environmental conditionalities is one potential mechanism to achieve poverty alleviation without degrading the natural environment."

Governments, international donors and financial organisations are making large investments in social protection to mitigate the economic impacts of the COVID-19 pandemic. For social protection programmes to continue to contribute to progress on multiple development objectives, their trade-offs and synergies will need to be at the front and centre of the design and implementation of poverty alleviation strategies moving forward.

To ensure robust policy impact evaluation, the measurement of intended and unintended sustainable development outcomes of initiatives needs to become the norm.



More information: Cecilie Dyngeland et al., "Assessing multidimensional sustainability: Lessons from Brazil's social protection programs," PNAS (2020). www.pnas.org/cgi/doi/10.1073/pnas.1920998117

Indigenous property rights protect the Amazon rainforest

NOT PRIVATE PROPERTY RIGHTS 
(PROPERTY IS THEFT)
BUT THE RIGHT TO THE COMMONWEALTH
(PROPERTY IS FREEDOM)

by University of California - San Diego  

AUGUST 10, 2020

Indigenous territories and deforestation in the Brazilian Amazon for 1985, 1995, 2005 and 2015. Deforestation is significantly reduced in territories with full property rights. Credit: Kathryn Baragwanath, UC San Diego.
One way to cut back on deforestation in the Amazon rainforest—and help in the global fight against climate change—is to grant more of Brazil's indigenous communities full property rights to tribal lands. This policy focus is suggested by a new University of California San Diego study published in the Proceedings of the National Academy of Sciences.


Led by UC San Diego political science researcher Kathryn Baragwanath, the study uses an innovative method to combine satellite data of vegetation coverage in the Amazon rainforest, between 1982 and 2016, with Brazilian government records of indigenous property rights. The study found significantly reduced deforestation rates in territories that are owned fully and collectively by local tribes—when compared to territories that are owned only partially by the tribes or not at all. The average effect was a 66% reduction in deforestation.

The Amazon accounts for half of the Earth's remaining tropical forest, is an important source of the biodiversity on our planet and plays a major role in climate and water cycles around the world. Yet the Amazon basin is losing trees at an alarming rate, with particularly high levels in recent years, due to a combination of massive forest fires and illegal activities.

Who owns the Amazon, meanwhile, is hotly contested, with numerous actors vying for the privilege. Some private entities go ahead with illegal mining or logging, for example, to demonstrate "productive use of land" and thereby gain title to that land. At present, about 2 million hectares of indigenous land are still awaiting official designation as tribal territories.

Also debated is whether collective property rights are effective in curbing deforestation. These rights are granted to indigenous peoples in Brazil through a complex and lengthy constitutional process, and are distinct from the private property rights most of us are more familiar with.

UC San Diego's Baragwanath and co-author Ella Bayi, now at Columbia University, say "yes, collective property rights are effective"—if you focus your analysis on the final stage of the titling process in Brazil (which can take up to 25 years to complete), or the point at which tribes gain full property rights.

Full property rights give indigenous groups official territorial recognition, enabling them not only to demarcate their territories but also to access the support of monitoring and enforcement agencies, the researchers say.

"Our research shows that full property rights have significant implications for indigenous people's capacity to curb deforestation within their territories," said Baragwanath. "Not only do indigenous territories serve a human rights role, but they are a cost-effective way for governments to preserve their forested areas and attain climate goals. This is important since many indigenous territories have yet to receive their full property rights and it points to where policymakers and NGOs concerned about the situation in Brazil should now focus their efforts."


More information: Collective property rights reduce deforestation in the Brazilian Amazon, Proceedings of the National Academy of Sciences (2020).
www.pnas.org/cgi/doi/10.1073/pnas.1917874117

BECAUSE IT BEARS REPEATING 

Why Comparing Flu and COVID-19 Severity Is Not Equivalent

MAY 28, 2020 | RACHEL LUTZ




                                                                  
Comparing seasonal influenza (flu) mortality to the mortality rate of coronavirus disease 2019 (COVID-19) is a threat to public health and demonstrates a lack of understanding of how the data are collected for each infection by different agencies, according to a Viewpoint published in JAMA Internal Medicine.

Authors from Brigham and Women’s Hospital and Emory University School of Medicine outlined why the apparent equivalence of deaths from COVID-19 and seasonal influenza are not capturing the entirety of the situation.

They said that public officials continually draw comparisons between the 2 infections, “often in an attempt to minimize the effects of the unfolding pandemic.”

The number of deaths from COVID-19 was estimated in early May to be approximately 65,000, which the authors agreed appeared similar to the estimated number of seasonal influenza deaths reported every year by the US Centers for Disease Control and Prevention (CDC).

However, that represents a fundamental misunderstanding of the way the CDC reports seasonal influenza morbidity and mortality.

From 2013-14 to 2018-19, the CDC reported yearly estimates of influenza deaths ranging from 23,000 to 61,000. Over those same seasons, however, the counted influenza deaths ranged from 3,448 to 15,620 per year.

It would be more accurate to compare weekly counts of COVID-19 deaths to weekly counts of seasonal influenza deaths, the authors said, due to COVID-19 fatalities being counted and reported directly instead of estimated.

By the numbers, according to the paper:
  • There were 15,455 COVID-19 deaths reported in the US during the week ending April 21, 2020.
  • There were 14,478 COVID-19 deaths reported in the US during the week prior.
  • There were 351 flu deaths during the peak week (week 11 of 2016) of the 2015-16 flu season.
  • There were 1,626 flu deaths during the peak week (week 3 of 2018) of the 2017-18 flu season.
“These statistics on counted deaths suggest that the number of COVID-19 deaths for the week ending April 21 was 9.5-fold to 44.1-fold greater than the peak week of counted influenza deaths during the past 7 influenza seasons in the US, with a 20.5-fold mean increase,” the authors wrote.

The CDC also acknowledges that its COVID-19 death counts are continually revised due to delays in reporting. The authors believe the ratio of counted COVID-19 deaths to flu deaths will therefore rise. They added that their ratios are more clinically consistent with the experiences of health care workers on the front lines.

“We infer that either the CDC’s annual estimates substantially overstate the actual number of deaths caused by influenza or that the current number of COVID-19 counted deaths substantially understates the actual number of deaths caused by SARS-CoV-2, or both,” they wrote.

The authors allowed for several considerations, including that testing capacity is limited for COVID-19 and there could be false-negative results. They also noted that flu deaths, unlike COVID-19 deaths, are not reportable to public health authorities, which could lead to underreporting of flu deaths.

Drawing direct comparisons between 2 diseases, despite mortality statistics being collected by different methods, provides inaccurate information. The failure to consider these differences by experts “threatens public health,” the authors wrote, especially as they rely on the comparisons “to reopen the economy and deescalate mitigation strategies.”

“Although officials may say that SARS-CoV-2 is ‘just another flu,’ this is not true,” the authors concluded. “Our analysis suggests that comparisons between SARS-CoV-2 mortality and seasonal influenza mortality must be made using an apples-to-apples comparison, not an apples-to-oranges comparison. Doing so better demonstrates the true threat to public health from COVID-19.”