It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way (K. Marx, Letter to F. Engels on the Indian Mutiny)
Thursday, February 25, 2021
Diabetes patients' use of mobile health app found to improve health outcomes, lower medical costs
Emerging smart mobile health (or mHealth) technologies are changing the way patients track information related to diagnosed conditions. A new study examined the health and economic impacts of mHealth technologies on the outcomes of diabetes patients in Asia. The study concluded that compared to patients who did not use mHealth applications, patients who used the apps had better health outcomes and were able to regulate their health behavior more effectively. They also had fewer hospital visits and lower medical costs.
The study was conducted by researchers at Carnegie Mellon University (CMU) and New York University (NYU). It has been accepted for publication and is forthcoming in MIS Quarterly, a publication of the Management Information Systems Research Center.
"Given the importance of health behaviors to well-being, health outcomes, and disease processes, mHealth technologies offer significant potential to facilitate patients' lifestyle and behavior modification through patient education, improved autonomous self-regulation, and perceived competence," explains Beibei Li, professor of information systems and management at CMU's Heinz College, who coauthored the study.
The relatively new area of mHealth includes mobile computing, medical sensing, and communications technologies used for health care services (e.g., managing chronic diseases). mHealth applications can operate on smartphones, tablets, sensors, and cloud-based computing systems, all of which collect health data on individuals. The global mHealth market was estimated to have reached $49 billion by the end of 2020. Yet few studies have assessed the technology's effectiveness in changing patients' behaviors and outcomes.
In this study, researchers sought to determine how mHealth applications persuade individuals to modify their behavior to comply with recommended approaches to obtain certain health goals. The researchers measured compliance by looking at detailed patient activities (e.g., daily walking steps, exercise time, sleeping pattern, food intake) as measured by the app, as well as general health outcomes, hospital visits, and medical expenses.
The researchers partnered with a top mHealth firm that provides one of the largest mobile health platforms in Asia specializing in diabetes care. The study randomly assigned 1,070 adult patients to different groups for three months: Some patients used the mHealth app, some did not, and some used a web-based version of the app. Among the patients in the group that used the mHealth app, some received personalized text message reminders, while others received non-personalized text messages. Researchers interviewed all participants before the study began and five months after it ended. Among the questions asked were those about demographics, medication and medical history, blood glucose and hemoglobin levels, frequency of hospital visits, and medical costs.
The study found that patients who used the mHealth app reduced their blood glucose and hemoglobin levels, even after controlling for individual-level fixed effects. Patients who used the app also exercised more, slept more, and ate healthier food. And they had fewer hospital visits and lower medical expenses.
The authors suggest that patients' adoption and use of the mHealth app was associated with significant behavioral modifications toward a healthier diet and lifestyle. In this way, users became more autonomously self-regulated in their health behavior, and this increased intrinsic motivation helped them become more engaged, persistent, and stable in their behavior, which led to improved health outcomes. The mHealth platform also facilitated increased use of telemedicine, which in turn led to reduced hospital visits and medical expenses for the patients.
The study also found that the mHealth platform was more effective in improving patients' health outcomes than a web-based (PC) version of the same app. And non-personalized text messages tended to be more effective in changing patients' behavior than personalized messages, possibly because personalized messages can be viewed as intrusive, coercive, and annoying.
Among the study's limitations, the authors note that it focused mainly on participants with Type II diabetes, which, unlike Type I or gestational diabetes, is directly tied to dietary and lifestyle self-management. Hence, the findings do not necessarily apply to patients with other types of diabetes.
"Our findings provide important insights on the design of mHealth apps through a better understanding of patients' health behavior and interactions with the platform," suggests Anindya Ghose, professor of business at NYU's Stern School of Business, who coauthored the study. "Such knowledge can be very valuable for health care mobile platform designers as well as policymakers to improve the design of smart and connected health infrastructures through sustained usage of the emerging technologies."
###
'Micropopulism' may be turning education into a battlefield in the culture wars
A new analysis of education debates on both social media and in traditional media outlets suggests that the education sector is being increasingly influenced by populism and the wider social media 'culture wars'.
The study also suggests that the type of populism in question is not quite the same as that used to explain large-scale political events, such as the UK's 'Brexit' from the European Union, or Donald Trump's recent presidency in the United States.
Instead, the researchers - from the University of Cambridge, UK, and Queensland University of Technology, Australia - identify a phenomenon called 'micropopulism': a localised populism which spotlights an aspect of public services, such as the education sector. Micropopulism is populist, they argue, in the sense that it expresses a fervent division between a disregarded 'people' and an unjust elite.
The paper, by Dr Steve Watson and Dr Naomi Barnes, sketches out how think tanks, among other organisations, propagate such controversies using both new media and old. They highlight how 'wedge' issues are being used to prompt bitter disputes on social media between those with traditional views of education, and those who are more progressive.
'Traditional' teachers, in this context, argue that their authority in the classroom has been undermined by a largely university-based and ideologically-progressive 'elite' which, they claim, has used its institutional power to force them to use student-centred teaching methods which are not supported by scientific evidence. The polarised debate that ensues disguises the complexity of real classrooms, which in practice can be neither purely traditional, nor purely progressive.
The authors argue that 'the claim that educational micropopulism is abroad in England and Australia is almost self-evident' and offer a theoretical analysis of how and why it is happening. As potential examples, they cite increasingly vitriolic and adversarial online standoffs over issues such as teaching methods, discipline, or free speech on university campuses. Many of these appear to be linked to, or directly involve, think tanks or other groups with an interest in shaping policy. The paper calls for more evidence-gathering to understand the conditions which precipitate increasingly bitter debates within the education community, and warns that some vested interests may be using micropopulist tactics to influence policy.
Dr Steve Watson, a lecturer at the Faculty of Education, University of Cambridge, said: "We've reached the stage where there is enough evidence to indicate this issue requires more analysis and attention than it has received to date. There is clearly a relationship between education, policy-making, think tanks, media, and micropopulism - but its extent and consequences have yet to be fully determined."
Dr Naomi Barnes, from the Faculty of Education, Queensland University of Technology, said: "One concern is that at present, teachers and educators who are actively involved in these online discussions may not be aware of how controversy is being perpetuated and how bitter discussions go viral to help achieve policy-making objectives. There is a need to understand this more."
The authors argue that controversies in the media and on social channels enable would-be reformers to position progressives in education (often abbreviated to 'progs') as an out-of-touch elite. Most obviously, this idea seems to match Michael Gove's infamous demonisation of progressive 'bureaucrats, academics and teachers' unions' as 'The Blob'. One reformist government advisor has similarly praised social media 'trads' for instigating 'a reformation of the church of education'.
They also suggest that this reductive version of the debate now defines many of the most toxic arguments about education online. Watson, in particular, identifies Twitter - especially the popular #EduTwitter - as the site of unpleasant confrontations about matters such as the #BanTheBooth debate on discipline in schools, or the use of phonics in primary education.
In higher education, the researchers document a similar pattern in which university leaders are demonised as lazy, careless, distant and heavy-handed. In Australia, this seems to parallel a recent upswing in efforts by the right-wing Institute of Public Affairs (IPA) and Centre for Independent Studies (CIS) to actively publicise their policy arguments as research 'findings'.
The paper highlights 10 recent examples of this activity, which prompted national media headlines such as: 'Our universities have caved in to lazy groupthink'; and 'Don't bail out bloated unis'. Similarly provocative articles are increasingly appearing in the UK media, concerning issues such as free speech on campus, or claims about infiltration by foreign governments.
Watson's own experiences suggest that some of the online confrontations, if not deliberately instigated, certainly involve strange forms of behaviour. Last year, he published a paper highlighting possible evidence of micropopulist strategies on #EduTwitter. Within hours, this had provoked multiple angry responses on Twitter accusing him of fabricating a conspiracy theory - although many teachers and academics also posted messages of agreement.
As a result, the paper scored unusually well on Altmetric.com: a tool that tracks engagement with scholarly content online. Once this became apparent, the Twitter attacks not only ceased, but disappeared, with several critics deleting their posts as if attempting to stifle its popularity. "Extraordinarily, the paper may have gone some way to proving its own theory through the backlash it created," Watson said.
The authors believe that, at the very least, further research is needed to understand how today's education debates have become so schismatic. They warn that reasoned discussion about the future of education is being compromised. "We would recommend considering a digital citizenship initiative for education professionals to counter this," Barnes added.
###
The paper, Online educational populism and New Right 2.0 in Australia and England, is published in the academic journal, Globalisation, Societies and Education. DOI: 10.1080/14767724.2021.1882292
Flu vaccination this season likely to be highest ever
UGA research also uncovers ethnic disparities in vaccine acceptance
More U.S. adults reported receiving or planning to receive an influenza vaccination during the 2020-2021 flu season than ever before, according to findings from a national survey.
The survey of 1,027 adults, conducted by the University of Georgia, found that 43.5% of respondents reported having already received a flu vaccination with an additional 13.5% stating they "definitely will get one" and 9.3% stating they "probably will get one." Combined, 66.3% have received or intend to receive an influenza vaccination.
By comparison, 48.4% of adults 18 and older received the vaccine during the 2019-2020 flu season, according to the Centers for Disease Control and Prevention, an increase of 3.1 percentage points from 2018-2019.
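The survey's combined figure is simply the sum of the three response categories reported above; a minimal sketch of the arithmetic:

```python
# Combined share of U.S. adults who have received or intend to receive
# a flu vaccination, using the survey percentages reported above.
already_received = 43.5   # "already received a flu vaccination"
definitely_will = 13.5    # "definitely will get one"
probably_will = 9.3       # "probably will get one"

combined = already_received + definitely_will + probably_will
print(f"{combined:.1f}%")  # → 66.3%
```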
The survey was led by professor Glen Nowak, director of UGA's Center for Health and Risk Communication in the Grady College of Journalism and Mass Communication, and associate professor Michael Cacciatore, CHRC research director. The respondents came from the National Opinion Research Center's AmeriSpeak panel, which uses a prescreened, nationally representative pool of participants to obtain rapid and projectable survey findings.
"Our survey shows that most Americans have or planned to act on the advice to get a flu vaccination this season," said Nowak. "Further, these results strongly suggest the U.S. will be crossing an important threshold this flu season, which is over half of U.S. adults getting a flu vaccination."
The survey results indicate much of the increase in flu vaccine uptake is being driven by people 60 years old and older: 61.5% said they had already received the influenza vaccine by December, with another 12% stating they "would definitely get it" and 5.8% stating they "would probably get it."
Demographic differences
The survey results also indicated many demographic differences when it came to having received a flu vaccination. Forty-eight percent of white respondents reported having a flu vaccination by December, compared to 35.1% of Hispanic respondents and 30.1% of Black respondents. Having already received a flu vaccination was also much higher for respondents with a college degree or more education and those with annual household incomes of $75,000 or more.
Conversely, flu vaccination uptake and plans to get a flu vaccination were lowest for those 18-29 years old, those with some college or a high school education, and those with annual incomes less than $25,000. The survey found that 50.7% of those making more than $75,000 had already been vaccinated for the flu, while only 35% of those making less than $25,000 had been vaccinated.
"It was disappointing to see that significant differences by race, age, education and income persisted during a flu vaccination season that took place during a COVID pandemic," Cacciatore said. "It's important that we continue to learn more about why these disparities exist so we can take steps that will reduce them."
"Overall, it is good news to find that many people, particularly those at highest risk for serious flu or COVID-19 illness, followed the advice to get the flu vaccine. Hopefully, we can sustain that level of success in the years ahead," Nowak said. "It also remains worrisome to find much lower flu vaccination rates and intentions in so many groups. We continue to have much work to do among Hispanic and Black adults and those with lower income and years of formal education when it comes to flu vaccination."
Reducing salt in Parmigiano Reggiano cheese might not negatively affect its flavor
Aged cheeses pack a punch of nutty, sharp flavor. Before they're fully mature, aged cheeses are either waxed or placed in brine for weeks to create a natural rind. However, the high salt content in brined cheeses deters some consumers. Now, researchers reporting in ACS Food Science & Technology present a shortened brining time for Parmigiano Reggiano that results in a less salty product, while still potentially maintaining the cheese's distinctive texture and flavor compounds.
Parmigiano Reggiano is a lactose-free, crumbly and hard cheese. Manufactured in select provinces in Italy, its protected designation of origin status requires that certain production processes, such as a minimum 12-month ripening period, be performed. Ripening or maturing imparts the cheese's recognizable taste as milk solids are converted to flavor compounds. But before that, cheese wheels are placed in a saturated brine solution for weeks. The added salt plays a key role in the ripening process by modulating microbial growth, enzyme activity and the separation of solids from liquids, hardening the final product. One enzyme-mediated reaction is lipolysis, in which triglyceride fats in milk break down into their key components -- free fatty acids and diacylglycerides. Free fatty acids not only contribute to the taste of the cheese but are also precursors to other flavor molecules. So, Silvia Marzocchi and colleagues wanted to test the impact of brining time on the lipolysis reactions responsible for the free fatty acids involved in Parmigiano Reggiano's flavor profile and distinctive characteristics.
The researchers had five Parmigiano Reggiano dairies brine several cheese wheels by immersing them in a saturated salt solution for either 18 days or a shorter 12-day period. The wheels were then ripened for 15 months under conditions typical for this type of cheese. Salt content in the fully ripened cheese was 9% lower in the samples brined for the shorter time than in those given the longer treatment. Unexpectedly, the researchers found no difference in moisture level, cholesterol or total fat between the two sets of cheeses. The team also observed no major variations in compounds involved in the flavor profile, as most of the 32 free fatty acids had overlapping concentration ranges between the two groups. Yet overall, in the cheeses with the shorter salting time, the concentration ranges of total free fatty acids and total diacylglycerides were 260% and 100% higher, respectively, than in the traditionally brined version, suggesting that the lower salt-to-moisture ratio left more water available for lipolysis and accelerated the enzymatic breakdown of triglycerides. The researchers say a reduced brining time for Parmigiano Reggiano could yield a product appealing to salt-conscious consumers, but sensory tests are still needed to determine whether consumers can detect differences in the overall taste and texture.
The paper is freely available as an ACS AuthorChoice article.
Celebrating Black chemists and chemical engineers
AMERICAN CHEMICAL SOCIETY
Chemical & Engineering News (C&EN), the newsmagazine of the American Chemical Society (ACS), is celebrating Black chemists and chemical engineers with a special issue highlighting Black chemists who work across the fields of biotechnology, solar energy, pharmaceuticals and more. Guest edited by Massachusetts Institute of Technology (MIT) drug delivery pioneer Paula Hammond, Ph.D., this special issue showcases Black scientists, spotlighting their scientific passions and career accomplishments.
"In bringing into focus the unique lives of this set of accomplished Black scientists in chemistry and chemical engineering, it is my hope that we open the door to more frequent and constant recognition of our presence in the field," Hammond wrote in her introductory remarks. "We have always been present in the sciences -- but now more than ever, we must appreciate and acknowledge the presence of Black people and other people of color. We must find ways to continue to raise our voices and celebrate our work. As a nation, we all benefit from the huge talent gained when all are included in the science enterprise."
Among the chemists and chemical engineers featured in the 2021 Trailblazers issue are Karen Akinsanya, Ph.D., of Schrödinger, Inc., on artificial-intelligence-driven drug discovery; Oluwatoyin Asojo, Ph.D., of Hampton University on her calling to develop drugs for neglected diseases; Squire J. Booker, Ph.D., of Penn State University on the catalytic moments of his career; Cato T. Laurencin, M.D., Ph.D., of the University of Connecticut on his twin passions for surgery and biomedical engineering; and Kristala L. J. Prather, Ph.D., of MIT on harnessing the synthetic power of microbial systems. The print issue, which features original content and photography by Black creators, was released on February 22.
###
The paper, "C&EN's Trailblazers," is freely available here.
COVID-19 isolation linked to increased domestic violence, researchers suggest
While COVID-19-related lockdowns may have decreased the spread of a deadly virus, they appear to have created an ideal environment for increased domestic violence.
Data collected in surveys of nearly 400 adults over 10 weeks beginning in April 2020 suggest that more services and communication are needed so that front-line workers -- health care and food bank staff, for example, rather than only social workers, doctors and therapists -- can spot the signs and ask clients about potential intimate partner violence, and then help connect victims with resources, said Clare Cannon, assistant professor of social and environmental justice in the Department of Human Ecology and lead author of the study.
The paper, "COVID-19, intimate partner violence, and communication ecologies," was published this month in American Behavioral Scientist. Study co-authors include Regardt Ferreira and Frederick Buttell, both of Tulane University, and Jennifer First, of University of Tennessee-Knoxville.
"The pandemic, like other kinds of disasters, exacerbates the social and livelihood stresses and circumstances that we know lead to intimate partner violence," said Cannon. She explained that increased social isolation during COVID-19 has created an environment where victims and aggressors, or potential aggressors in a relationship, cannot easily separate themselves from each other. The extra stress also can cause mental health issues, increasing individuals' perceived stress and reactions to stress through violence and other means.
"Compounding these stressors, those fleeing abuse may not have a place to get away from abusive partners," Cannon said.
Intimate partner violence is defined as physical, emotional, psychological or economic abuse and stalking or sexual harm by a current or former partner or spouse, according to the Centers for Disease Control and Prevention. Crime statistics indicate that 16 percent of homicides are perpetrated by a partner. Further, the CDC says, 25 percent of women and 10 percent of men experience some form of intimate partner violence in their lifetime.
Research participants in the study completed an online survey asking about previous disaster experience, perceived stress, their current situation as it relates to COVID-19, if they experienced intimate partner violence, and what their personal and household demographics were. In all, 374 people completed the survey. Respondents, whose average age was 47, were asked about how COVID-19 had affected them financially and otherwise.
Of the respondents, 39 reported having experienced violence in their relationship, and 74 percent of those people were women.
Although only 10 percent of the sample reported experiencing intimate partner violence, those who had experienced it reported more stress than those who had not. Furthermore, the results show that participants who reported higher perceived stress were also more likely to report experiencing violence.
"Importantly," Cannon said, "these data do not suggest causality and there is no way to determine if intimate partner violence was present in those relationships prior to the pandemic. What the data do suggest, however, is that experiencing such violence is related to reporting more exposure to stress."
The researchers found that as people find themselves in a more tenuous financial situation due to COVID-19, "there are more things to worry about and subsequently argue about. In many instances, that type of situation leads to an occasion for intimate partner violence."
"In our sample's case, as people lost their jobs and suffered financial losses, they also likely increased their worry about eviction," Cannon said. Notably, similar findings linking financial and job loss stresses with increased intimate partner violence were reported in the 2008 recession, Cannon said.
Researchers said their findings show a need for more communication resources for families -- potentially coming from government and nongovernment sources of support and information. By increasing public awareness of resources available to the broader community, community members, trusted friends, neighbors, and family members may be better able to connect those affected by domestic violence with resources, such as shelters, treatment intervention programs and therapeutic professionals such as social workers, therapists and others, researchers said.
Apollo rock samples capture key moments in the Moon's early history, study finds
PROVIDENCE, R.I. [Brown University] -- Volcanic rock samples collected during NASA's Apollo missions bear the isotopic signature of key events in the early evolution of the Moon, a new analysis found. Those events include the formation of the Moon's iron core, as well as the crystallization of the lunar magma ocean -- the sea of molten rock thought to have covered the Moon for around 100 million years after it formed.
The analysis, published in the journal Science Advances, used a technique called secondary ion mass spectrometry (SIMS) to study volcanic glasses returned from the Apollo 15 and 17 missions, which are thought to represent some of the most primitive volcanic material on the Moon. The study looked specifically at sulfur isotope composition, which can reveal details about the chemical evolution of lavas from generation, transport and eruption.
"For many years it appeared as though the lunar basaltic rock samples analyzed had a very limited variation in sulfur isotope ratios," said Alberto Saal, a geology professor at Brown University and study co-author. "That would suggest that the interior of the Moon has a basically homogeneous sulfur isotopic composition. But using modern in situ analytical techniques, we show that the isotope ratios of the volcanic glasses actually have a fairly wide range, and those variations can be explained by events early in lunar history."
The sulfur signature of interest is the ratio of the "heavy" sulfur-34 isotope to the lighter sulfur-32. Initial studies of lunar volcanic samples found that they uniformly leaned toward the heavier sulfur-34. The nearly homogeneous sulfur isotope ratio was in contrast with large variations in other elements and isotopes detected in the lunar samples.
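Though not spelled out in the release, sulfur isotope ratios in geochemistry are conventionally reported in "delta" notation, as the per mil deviation of the sample's ratio from a reference standard (here, the Vienna Canyon Diablo Troilite, VCDT):

```latex
% Standard sulfur-isotope delta notation (per mil deviation from the
% Vienna Canyon Diablo Troilite reference standard):
\[
\delta^{34}\mathrm{S} =
\left(
  \frac{\left({}^{34}\mathrm{S}/{}^{32}\mathrm{S}\right)_{\mathrm{sample}}}
       {\left({}^{34}\mathrm{S}/{}^{32}\mathrm{S}\right)_{\mathrm{VCDT}}}
  - 1
\right) \times 1000
\]
```

Positive values indicate enrichment in the heavier sulfur-34 relative to the standard; negative values indicate enrichment in the lighter sulfur-32.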
This new study looked at 67 individual volcanic glass samples and their melt inclusions -- tiny blobs of molten lava trapped within crystals inside the glass. Melt inclusions capture the lava before sulfur and other volatile elements are released as gas during eruption -- a process called degassing. As such, they offer a pristine picture of what the original source lava was like. Using the SIMS at the Carnegie Institution for Science, Saal with his colleague, the late Carnegie scientist Eric Hauri, were able to measure the sulfur isotopes in these pristine melt inclusions and glasses, and use those results to calibrate a model of the degassing process for all the samples.
"Once we know the degassing, then we can estimate back the original sulfur isotope composition of the sources that produced these lavas," Saal said.
Those calculations revealed that the lavas had been derived from different reservoirs within the interior of the Moon with a wide range of sulfur isotope ratios. The researchers then showed that the range of values detected in the samples could be explained by events in the Moon's early history.
The lighter isotope ratio in some of the volcanic glasses, for example, is consistent with the segregation of the iron core from the early molten Moon. When an iron core separates from other material in a planetary body, it takes a bit of sulfur with it. The sulfur that's taken tends to be the heavier sulfur-34 isotope, leaving the remaining magma enriched in the lighter sulfur-32.
"The values we see in some of the volcanic glasses are fully consistent with models of the core segregation process," Saal said.
The heavier isotope values can be explained by the further cooling and crystallization of the early molten Moon. The crystallization process removes sulfur from the magma pool, producing solid reservoirs with heavier sulfur-34. That process is the likely source of the heavier isotope values found in some of the volcanic glasses and basaltic rocks returned from the Moon.
"Our results suggest that these samples record these critical events in lunar history," Saal said. "As we keep looking at these samples with newer and better techniques, we keep learning new things."
More work needs to be done -- and more samples need to be analyzed -- to fully understand the sulfur isotopic composition of the Moon, Saal says. But these new results help to clarify long-standing questions about the composition of the Moon's interior, and they bring scientists one step closer to understanding the formation and early history of the Moon.
###
The research was funded by NASA's Solar System Workings program (80NSSC20K0461).
The risks of communicating extreme climate forecasts
COLLEGE OF ENGINEERING, CARNEGIE MELLON UNIVERSITY
For decades, climate change researchers and activists have used dramatic forecasts to attempt to influence public perception of the problem and as a call to action on climate change. These forecasts have frequently been for events that might be called "apocalyptic," because they predict cataclysmic events resulting from climate change.
In a new paper published in the International Journal of Global Warming, Carnegie Mellon University's David Rode and Paul Fischbeck argue that making such forecasts can be counterproductive. "Truly apocalyptic forecasts can only ever be observed in their failure--that is, the world did not end as predicted," says Rode, adjunct research faculty with the Carnegie Mellon Electricity Industry Center, "and observing a string of repeated apocalyptic forecast failures can undermine the public's trust in the underlying science."
Rode and Fischbeck, professor of Social & Decision Sciences and Engineering & Public Policy, collected 79 predictions of climate-caused apocalypse going back to the first Earth Day in 1970. Many of these forecasts have since expired; the dates have come and gone uneventfully. In fact, 48 of them (61%) had expired by the end of 2020.
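As a quick check, the 61% figure follows directly from the two counts reported in the study:

```python
# Share of the collected climate-apocalypse forecasts that had
# expired by the end of 2020, per the counts reported in the study.
total_forecasts = 79
expired_by_end_2020 = 48

expired_share = expired_by_end_2020 / total_forecasts
print(f"{expired_share:.0%}")  # → 61%
```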
Fischbeck noted, "from a forecasting perspective, the 'problem' is not only that all of the expired forecasts were wrong, but also that so many of them never admitted to any uncertainty about the date. About 43% of the forecasts in our dataset made no mention of uncertainty."
In some cases, the forecasters were both explicit and certain. For example, Stanford University biologist Paul Ehrlich and British environmental activist Prince Charles are serial failed forecasters, repeatedly expressing high degrees of certainty about apocalyptic climate events.
Rode commented "Ehrlich has made predictions of environmental collapse going back to 1970 that he has described as having 'near certainty'. Prince Charles has similarly warned repeatedly of 'irretrievable ecosystem collapse' if actions were not taken, and when expired, repeated the prediction with a new definitive end date. Their predictions have repeatedly been apocalyptic and highly certain...and so far, they've also been wrong."
The researchers noted that the average time horizon before a climate apocalypse for the 11 predictions made prior to 2000 was 22 years, while for the 68 predictions made after 2000, the average time horizon was 21 years. Despite the passage of time, little has changed: across half a century of forecasts, the apocalypse is always about 20 years out.
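The figures above can be cross-checked from the aggregate counts quoted in this article alone. A quick sketch (using only the counts reported here, not the authors' raw dataset):

```python
# Quick check of the summary statistics quoted above, using only the
# aggregate counts given in the article (not the authors' raw dataset).
total = 79        # predictions collected since Earth Day 1970
expired = 48      # predictions whose dates had passed by the end of 2020

print(f"expired share: {expired / total:.0%}")  # about 61%, as reported

# The average time horizon barely moves between eras:
pre_2000 = (11, 22)   # (number of predictions, average horizon in years)
post_2000 = (68, 21)
overall = (pre_2000[0] * pre_2000[1] + post_2000[0] * post_2000[1]) / total
print(f"overall average horizon: {overall:.1f} years")  # about 21 years
```

The weighted overall average confirms the authors' point: pooling both eras still puts the typical apocalypse roughly two decades away.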
Fischbeck continued, "It's like the boy who repeatedly cried wolf. If I observe many successive forecast failures, I may be unwilling to take future forecasts seriously."
That's a problem for climate science, say Rode and Fischbeck.
"The underlying science of climate change has many solid results," says Fischbeck, "the problem is often the leap in connecting the prediction of climate events to the prediction of the consequences of those events." Human efforts at adaptation and mitigation, together with the complexity of socio-physical systems, mean that the prediction of sea level rise, for example, may not necessarily lead to apocalyptic flooding.
"By linking the climate event and the potential consequence for dramatic effect," noted Rode, "a failure to observe the consequence may unfairly call into question the legitimacy of the science behind the climate event."
With the new Biden administration making climate change policy a top priority, trust in scientific predictions about climate change is more crucial than ever; scientists, however, will have to be careful in qualifying their predictions. In measuring the proliferation of the forecasts through search results, the authors found that forecasts that did not mention uncertainty in their apocalyptic date tended to be more visible (i.e., to have more search results available). Making sensational predictions of the doom of humanity, while scientifically dubious, has still proven tempting for those wishing to grab headlines.
The trouble with this is that scientists, due to their training, tend to make more cautious statements and more often include references to uncertainty. Rode and Fischbeck found that while 81% of the forecasts made by scientists referenced uncertainty, less than half of the forecasts made by non-scientists did.
"This is not surprising," said Rode, "but it is troubling when you consider that forecasts that reference uncertainty are less visible on the web. This results in the most visible voices often being the least qualified."
Rode and Fischbeck argue that scientists must take extraordinary caution in communicating events of great consequence. When it comes to climate change, the authors advise "thinking small." That is, focusing on making predictions that are less grandiose and shorter in term. "If you want people to believe big predictions, you first need to convince them that you can make little predictions," says Rode.
Fischbeck added, "We need forecasts of a greater variety of climate variables, we need them made on a regular basis, and we need expert assessments of their uncertainties so people can better calibrate themselves to the accuracy of the forecaster."
Over 80% of Atlantic Rainforest remnants have been impacted by human activity
Researchers estimated biodiversity and biomass losses in the biome using data from 1,819 forest inventories. In terms of carbon storage, the losses correspond to the destruction of 70,000 km² of forest
FUNDAÇÃO DE AMPARO À PESQUISA DO ESTADO DE SÃO PAULO
A Brazilian study published (http://www.nature.com/articles/s41467-020-20217-w) in Nature Communications shows that human activities have directly or indirectly caused biodiversity and biomass losses in over 80% of the remaining Atlantic Rainforest fragments.
According to the authors, in terms of carbon storage, the biomass erosion corresponds to the destruction of 70,000 square kilometers (km²) of forest - almost 10 million soccer pitches - or USD 2.3 billion-USD 2.6 billion in carbon credits. "These figures have direct implications for mechanisms of climate change mitigation," they state in the article.
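The soccer-pitch comparison is easy to reproduce. The pitch dimensions below (105 m by 68 m, a common full-size pitch) are an assumption, since the article does not state which size the authors used:

```python
# Back-of-the-envelope check of the area comparison above.
# Pitch dimensions (105 m x 68 m) are assumed; the article does not specify them.
loss_km2 = 70_000                    # forest-equivalent biomass loss (from the study)
pitch_km2 = (105 * 68) / 1_000_000   # ~0.00714 km^2 per pitch
pitches = loss_km2 / pitch_km2
print(f"{pitches / 1e6:.1f} million pitches")  # close to the ~10 million quoted
```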
Atlantic Rainforest remnants in Brazil are strung along its long coastline. The biome once covered 15% of Brazil, totaling 1,315,460 km². Only 20% of the original area is now left. The fragments are of varying sizes and have different characteristics.
To estimate the impact of human activity on these remnants, the researchers used data from 1,819 forest inventories conducted by several research groups.
"These inventories are a sort of tree census. The researchers go into the field and choose an area to survey, typically 100 meters by 100 meters. All the trees found within this perimeter are identified, analyzed, and measured," said Renato de Lima (https://bv.fapesp.br/en/pesquisador/668300/renato-augusto-ferreira-de-lima), a researcher at the University of São Paulo's Institute of Biosciences (IB-USP) and leader of the study. "We compiled all the data available in the scientific literature and calculated the average loss of biodiversity and biomass in the fragments studied, which represent 1% of the biome. We then used statistical methods to extrapolate the results to the fragments not studied, assuming that the impact would be constant throughout the Atlantic Rainforest biome."
After identifying the tree species in a fragment, the researchers estimated the size of their seeds and also what they call the "ecological or successional group". These two factors indicate how healthy the forest is, according to Lima. "There are hardy plants that require very little in the way of local resources and can grow on wasteland, pasture, forest borders, etc. These are known as pioneer species. A Brazilian example is the Ambay pumpwood [Cecropia pachystachya]," he said.
Pioneer tree species tend to produce seeds of smaller size, but in large numbers, because each seed has such a small chance of germinating. At the opposite extreme are climax species that flourish only in favorable environments, such as Brazilwood (Paubrasilia echinata) or various species of the genus Ocotea. These trees produce larger seeds with a substantial store of nutrients.
"This kind of seed requires a heavier investment by the parent tree in terms of energy," Lima said. "Areas in which climax species are present typically support more diversified fauna, so they serve as a marker of overall forest quality. Areas in which pioneer species predominate have probably been disturbed in the recent past."
The IB-USP group set out to show how the loss of late-successional species correlated with overall biodiversity loss and also with biomass loss, which represents the reduction in the forest's capacity to store carbon and keep this greenhouse gas out of the atmosphere. They found the forest fragments studied to have 25%-32% less biomass, 23%-31% fewer tree species, and 33%-42% fewer individuals belonging to late-successional, large-seeded, and endemic species.
The analysis also showed that biodiversity and biomass erosion were lower in strictly protected conservation units, especially large ones. "The smaller the forest fragment and the larger the edge area, the easier it is for people to gain access and disturb the remnant," Lima said.
On the positive side, degraded forest areas can recoup their carbon storage capacity if they are restored. "Combating deforestation and restoring totally degraded open areas such as pasturelands have been a major focus. These two strategies are very important, but we shouldn't forget the fragments in the middle," Lima said.
According to Paulo Inácio Prado (https://bv.fapesp.br/en/pesquisador/3487/paulo-inacio-de-knegt-lopez-de-prado), a professor at IB-USP and last author of the study, restored forest remnants can attract billions of dollars in investment relating to carbon credits. "Degraded forests should no longer be considered a liability. They're an opportunity to attract investment, create jobs and conserve what still remains of the Atlantic Rainforest," he said.
Lima believes this could be an attractive strategy for landowners in protected areas of the biome. "There's no need to reduce the amount of available arable land. Instead, we should increase the biomass in forest fragments, recouping part of the cost of restoration in the form of carbon credits," he said. "There will be no future for the Atlantic Rainforest without the owners of private properties. Only 9% of the remaining forest fragments are on state-owned land."
Database
According to Lima, the study began during his postdoctoral research, which was supported by São Paulo Research Foundation - FAPESP (https://bv.fapesp.br/en/bolsas/145695) and supervised by Prado. The aim was to identify the key factors that determine biodiversity and biomass loss in remnants of Atlantic Rainforest. "We found human action to be a major factor," he said. "We considered activities such as logging, hunting, and invasion by exotic species, as well as the indirect effects of forest fragmentation."
The data obtained from the 1,819 forest inventories used in the research is stored in a repository called TreeCo (http://labtrop.ib.usp.br/doku.php?id=projetos:treeco:start), short for Neotropical Tree Communities. Lima developed the database during his postdoctoral fellowship and still runs it. Its contents are described in an article published in Biodiversity and Conservation (https://link.springer.com/article/10.1007/s10531-015-0953-1). It is open to other research groups interested in sharing data on Neotropical forests.
"The repository became a byproduct of my postdoctoral project, and more than ten PhD and master's candidates are using it in their research," Lima said.
###
POSTMODERN PARACELSUS ALCHEMY
'Miracle poison' for novel therapeutics
Researchers prove they can engineer botulinum toxin proteins to find new targets with high selectivity, a critical advance toward potential new treatments for everything from neuroregeneration to cytokine storm
When people hear botulinum toxin, they often think one of two things: a cosmetic that makes frown lines disappear or a deadly poison.
But the "miracle poison," as it's also known, has been approved by the FDA to treat a suite of maladies like chronic migraines, uncontrolled blinking, and certain muscle spasms. And now, a team of researchers from Harvard University and the Broad Institute has, for the first time, proved it could rapidly evolve the toxin in the laboratory to target a variety of different proteins, creating a suite of bespoke, super-selective proteins called proteases with the potential to aid in neuroregeneration, regulate growth hormones, calm rampant inflammation, or dampen the life-threatening immune response called cytokine storm.
"In theory, there is a really high ceiling for the number and type of conditions where you could intervene," said Travis Blum, a postdoctoral researcher in the Department of Chemistry and Chemical Biology and first author on the study published in Science. The study was the culmination of a collaboration with Min Dong, an associate professor at the Harvard Medical School, and David Liu, the Thomas Dudley Cabot Professor of the Natural Sciences, a Howard Hughes Medical Institute Investigator, and a core faculty member of the Broad Institute.
Together, the team achieved two firsts: They successfully reprogrammed proteases--enzymes that cut proteins to either activate or deactivate them--to cut entirely new protein targets, even some with little or no similarity to the native targets of the starting proteases, and to simultaneously avoid engaging their original targets. They also started to address what Blum called a "classical challenge in biology": designing treatments that can cross into a cell. Unlike most large proteins, botulinum toxin proteases can enter neurons in large numbers, giving them a wider reach that makes them all the more appealing as potential therapeutics.
Now, the team's technology can evolve custom proteases with tailor-made instructions for which protein to cut. "Such a capability could make 'editing the proteome' feasible," said Liu, "in ways that complement the recent development of technologies to edit the genome."
Current gene-editing technologies often target chronic diseases like sickle cell anemia, caused by an underlying genetic error. Correct the error, and the symptoms fade. But some acute illnesses, like neurological damage following a stroke, aren't caused by a genetic mistake. That's where protease-based therapies come in: The proteins can help boost the body's ability to heal something like nerve damage through a temporary or even one-time treatment.
Scientists have been eager to use proteases to treat disease for decades. Unlike antibodies, which can only attack specific alien substances in the body, proteases can find and attach to any number of proteins, and, once bound, can do more than just destroy their target. They could, for example, reactivate dormant proteins.
"Despite these important features, proteases have not been widely adopted as human therapeutics," said Liu, "primarily because of the lack of a technology to generate proteases that cleave protein targets of our choosing."
But Liu has a technological ace in his pocket: PACE (which stands for phage-assisted continuous evolution). A Liu lab invention, the platform rapidly evolves novel proteins with valuable features. PACE, Liu said, can evolve dozens of generations of proteins a day with minimal human intervention. Using PACE, the team first taught so-called "promiscuous" proteases--those that naturally target a wide swath of proteins--to stop cutting certain targets and become far more selective. When that worked, they moved on to the bigger challenge: Teaching a protease to only recognize an entirely new target, one outside its natural wheelhouse.
"At the outset," said Blum, "we didn't know if it was even feasible to take this unique class of proteases and evolve them or teach them to cleave something new because that had never been done before." ("It was a moonshot to begin with," said Michael Packer, a previous Liu lab member and an author on the paper). But the proteases outperformed the team's expectations. With PACE, they evolved four proteases from three families of botulinum toxin; all four had no detected activity on their original targets and cut their new targets with a high level of specificity (ranging from 218- to more than 11,000,000-fold). The proteases also retained their valuable ability to enter cells. "You end up with a powerful tool to do intracellular therapy," said Blum. "In theory."
"In theory" because, while this work provides a strong foundation for the rapid generation of many new proteases with new capabilities, far more work needs to be done before such proteases can be used to treat humans. There are other limitations, too: The proteins are not ideal as treatments for chronic diseases because, over time, the body's immune system will recognize them as alien substances and attack and defuse them. While botulinum toxin lasts longer than most proteins in cells (up to three months as opposed to the typical protein lifecycle of hours or days), the team's evolved proteins might end up with shorter lifetimes, which could diminish their effectiveness.
Still, since the immune system takes time to identify foreign substances, the proteases could be effective for temporary treatments. And, to side-step the immune response, the team is also looking to evolve other classes of mammalian proteases, since the human body is less likely to attack proteins that resemble its own. Because their work on botulinum toxin proteases proved so successful, the team plans to keep tinkering with those, too. That means continuing their fruitful collaboration with Min Dong, who not only has the required permission from the Centers for Disease Control (CDC) to work with botulinum toxin but also provides critical perspective on the potential medical applications and targets for the proteases.
"We're still trying to understand the system's limitations, but in an ideal world," said Blum, "we can think about using these toxins to theoretically cleave any protein of interest." They just have to choose which proteins to go after next.