Thursday, December 22, 2022

Metabolic hack makes ocean algae more resilient to 21st century climate change

Peer-Reviewed Publication

INSTITUTE FOR BASIC SCIENCE

Fig. 1 

IMAGE: PHOTOSYNTHESIZING ALGAE PLAY AN IMPORTANT ROLE IN MARINE LIFE. ACCORDING TO NEW COMPUTER MODEL SIMULATIONS, A METABOLIC HACK MAKES PHYTOPLANKTON MORE RESILIENT TO 21ST CENTURY CLIMATE CHANGE THAN PREVIOUSLY THOUGHT.

CREDIT: INSTITUTE FOR BASIC SCIENCE

A study published in Science Advances by an international team of scientists provides clear evidence that marine phytoplankton are much more resilient to future climate change than previously thought.

Combining data from the long-term Hawai'i Ocean Time-series program with new climate model simulations conducted on one of South Korea’s fastest supercomputers, the scientists revealed that a mechanism, known as nutrient uptake plasticity, allows marine algae to adapt and cope with nutrient-poor ocean conditions expected to occur over the next decades in response to global warming of the upper ocean.

Phytoplankton are tiny algae (Fig. 1) which drift at the ocean’s surface and form the basis of the marine food web. While photosynthesizing, these algae absorb nutrients (e.g., phosphate, nitrate), take up dissolved carbon dioxide and release oxygen, which accounts for about 50% of the oxygen that we breathe. Knowing how marine algae will respond to global warming and to the associated decline of nutrients in upper ocean waters is therefore crucial for understanding the long-term habitability of our planet.

How the annual phytoplankton production rate will change globally over the next 80 years remains highly uncertain. The latest report of the Intergovernmental Panel on Climate Change (IPCC) states an uncertainty of -20% to +20%, which implies an uncertainty as to whether phytoplankton will increase or decrease in future.

Global warming affects the upper layers of the ocean more than the deeper layers. Warmer water is lighter and hence the upper ocean will become more stratified in future, which reduces mixing of nutrients from the subsurface into the sun-lit layer, where phytoplankton reside. Earlier studies suggested that the expected future depletion of nutrients near the surface would lead to a substantial reduction of the ocean’s phytoplankton production, with widespread and potentially catastrophic effects on both marine ecosystems and climate.

But according to a new study in Science Advances, this may not happen. New analyses of upper ocean phytoplankton data from the Hawai'i Ocean Time-series program show that productivity can be sustained, even in very nutrient-depleted conditions. “Under such conditions, individual phytoplankton cells can substitute phosphorus with sulfur. On a community level, one might see further shifts towards taxa that require less phosphorus,” says David Karl, a coauthor of the study, Professor of Oceanography at the University of Hawai'i and co-founder of the Hawai'i Ocean Time-series program, to illustrate the concept of phytoplankton plasticity. Further supporting evidence for plasticity comes from the fact that in subtropical regions, where nutrient concentrations in the surface waters are low, algae take up less phosphorus per amount of carbon stored in their cells, as compared to the global mean.

To study how this unique metabolic “hack” will impact global ocean productivity over the next few decades, the team ran a series of climate model simulations with the Community Earth System model (version 2, CESM2) on their supercomputer Aleph. By turning off the phytoplankton plasticity in their model, the authors were able to qualitatively reproduce previous model results of a decline in global productivity by about 8%. However, when turning on the plasticity parameter in their model, in a way that captures the observations near Hawai'i for the past 3 decades, the computer simulation reveals an increase in global productivity of up to 5% until the end of this century. “Regionally, however, these future productivity differences can be much higher, reaching up to 200% in subtropical regions,” says Dr. Eun Young Kwon, first author of the study and a researcher at the IBS Center for Climate Physics at Pusan National University, South Korea. With this extra productivity boost, the ocean can also take up more carbon dioxide from the atmosphere and eventually sequester it below the ocean’s surface.

Inspired by the results of their sensitivity computer model simulations, the authors then looked at 10 other climate models, whose data were used in the recent 6th Assessment Report of the IPCC. The results confirmed the authors’ initial conclusions. “Models without plasticity tend to project overall declining primary production for the 21st century, whereas those that account for the capability of phytoplankton to adapt to low nutrient conditions show on average increasing global productivity,” says Dr. M.G. Sreeush, co-corresponding author of the study and a postdoctoral fellow at the IBS Center for Climate Physics.

“Even though our study demonstrates the importance of biological buffering of global-scale ecological changes, this does not imply that phytoplankton are immune to human-induced climate change. For instance, worsening ocean acidification will reduce the calcification rates of certain types of phytoplankton, which can lead to large-scale shifts in ecosystems,” warns Dr. Eun Young Kwon. These factors are neither well understood nor represented yet in climate models.

“Future Earth system models need to use improved observationally-based representations of how phytoplankton respond to multiple stressors, including warming and ocean acidification. This is necessary to predict the future of marine life on our planet” says Prof. Axel Timmermann, a coauthor of this study and director of the IBS Center for Climate Physics.


Lower risk of alcohol disorders in top footballers but only from 1960s onwards

Questions also persist over alcohol marketing to football fans

Peer-Reviewed Publication

BMJ

Elite male football players have a lower risk of alcohol and drug related disorders than men from the general population, but this protective effect was seen only for those who first played in the 1960s and later, not for players from earlier eras, finds a study in the Christmas issue of The BMJ.

Questions also persist over the health impacts of alcohol marketing to millions of football fans around the globe, say authors of a linked editorial.

Alcohol consumption has been deeply ingrained in football culture for both players and fans, and several well known players have experienced alcohol addiction during and after their playing careers.

Constant pressure to perform at a high level, public attention and fame have been suggested to increase the risk of alcohol related disorders among active and retired elite athletes, but large scale studies assessing such outcomes are scarce.

To explore this further, researchers tracked the health of 6,007 male football players who had played in the Swedish top division, Allsvenskan, from 1924 to 2019 and 56,168 men from the general population matched to players based on age and region of residence.

They identified any alcohol and drug related disorders recorded in death certificates, during hospital admissions and outpatient visits, or use of prescription drugs for alcohol addiction.

They also assessed whether any increased risk would vary according to year of first top tier playing season, age, career length, and goal scoring abilities.

Participants were followed for an average of 27 years, during which time 257 (4.3%) football players and 3,528 (6.3%) men from the general population received diagnoses of alcohol related disorders.

In analyses accounting for age, region of residence, and calendar time, risk of alcohol related disorders was about 30% lower among football players than among men from the general population. 
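That roughly 30% lower risk can be sanity-checked against the raw counts reported above. The following sketch is an illustrative, crude (unadjusted) calculation only; the study's own figure comes from an analysis that also accounts for age, region of residence, and calendar time.

```python
# Crude rate comparison from the counts reported in the article.
# This unadjusted ratio only approximates the study's adjusted ~30% figure.
players_cases, players_total = 257, 6_007
general_cases, general_total = 3_528, 56_168

players_rate = players_cases / players_total   # ~4.3%
general_rate = general_cases / general_total   # ~6.3%
risk_ratio = players_rate / general_rate       # ~0.68

print(f"players: {players_rate:.1%}, general population: {general_rate:.1%}")
print(f"crude risk ratio: {risk_ratio:.2f} (~{1 - risk_ratio:.0%} lower)")
```

The crude ratio (about 0.68, i.e. roughly a third lower) is consistent with the adjusted estimate of about 30%.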

This reduced risk was seen among football players who played their first season in the top tier from the early 1960s onwards, while football players from earlier eras had a similar risk as men from the general population.

The risk of alcohol related disorders was lowest at around age 35 years, and then increased with age. At around age 75, football players had a higher risk of alcohol related disorders than men from the general population.

No significant association was seen between the risk of alcohol related disorders and goal scoring, number of games played, or number of seasons in the top tier.

The risk of disorders related to other drug misuse was also significantly lower (by 78%) among football players than among men from the general population.

This is an observational study and the researchers acknowledge that individuals could have had alcohol related disorders without receiving a diagnosis, and that their findings may not apply to female elite players and to male and female amateur and youth players (who constitute most football players worldwide). 

But they conclude: “In this nationwide cohort study, male football players who had played in the Swedish top tier of competition had a significantly lower risk of alcohol related disorders than men from the general population.”

These findings are likely to reflect the economic changes in football, altering players’ drinking habits since the 1960s and mitigating alcohol related health harms, say researchers in a linked editorial.

In contrast, they point out that football clubs, competitions, and leagues continue to promote alcohol and other unhealthy commodities to football fans, which evidence indicates is directly linked with higher consumption, particularly among young people.

Further research might be able to compare the incidence of alcohol related disorders between the general population and football fans to ascertain the impact of football related marketing, they write. It could also look at how and when elite footballers object to alcohol sponsorship and whether elite footballers pushing back on alcohol can influence the consumption habits of fans.

“While fans could not buy alcohol at the matches at the Qatar World Cup, digital advertising boards alongside the pitch promoted beer to millions of global television viewers,” they note. “Playing football might be healthy but watching it could be the very opposite.”

 

Microplastics deposited on the seafloor triple in 20 years

Peer-Reviewed Publication

UNIVERSITAT AUTONOMA DE BARCELONA

Campaign

IMAGE: RESEARCHER LAURA SIMON-SÁNCHEZ DURING ONE OF THE SAMPLE COLLECTION CAMPAIGNS

CREDIT: LENA HEINS

The total amount of microplastics deposited on the bottom of oceans has tripled in the past two decades with a progression that corresponds to the type and volume of consumption of plastic products by society. This is the main conclusion of a study developed by the Institute of Environmental Science and Technology of the Universitat Autònoma de Barcelona (ICTA-UAB) and the Department of the Built Environment of Aalborg University (AAU-BUILD), which provides the first high-resolution reconstruction of microplastic pollution from sediments obtained in the northwestern Mediterranean Sea.

Although the seafloor is considered the final sink for microplastics floating on the sea surface, the historical evolution of this pollution in the sediment compartment, and particularly the sequestration and burial rate of smaller microplastics on the ocean floor, has remained unknown.

This new study, published in the journal Environmental Science and Technology (ES&T), shows that microplastics are retained unaltered in marine sediments, and that the microplastic mass sequestered in the seafloor mimics the global plastic production from 1965 to 2016. "Specifically, the results show that, since 2000, the amount of plastic particles deposited on the seafloor has tripled and that, far from decreasing, the accumulation has not stopped growing, mimicking the production and global use of these materials," explains ICTA-UAB researcher Laura Simon-Sánchez.

The researchers explain that the sediments analysed have remained unaltered on the seafloor since they were deposited decades ago. "This has allowed us to see how, since the 1980s, but especially in the past two decades, the accumulation of polyethylene and polypropylene particles from packaging, bottles and food films has increased, as well as polyester from synthetic fibres in clothing fabrics," explains Michael Grelaud, ICTA-UAB researcher. The amount of these three types of particles reaches 1.5 mg per kilogram of sediment collected, with polypropylene being the most abundant, followed by polyethylene and polyester. Despite awareness campaigns on the need to reduce single-use plastic, data from annual marine sediment records show that we are still far from achieving this. Policies at the global level in this regard could contribute to improving this serious problem.

Although smaller microplastics are very abundant in the environment, constraints in analytical methods have limited robust evidence on their levels in previous studies targeting marine sediment. In this study, the particles were characterised by applying state-of-the-art imaging to quantify particles down to 11 µm in size.

The degradation status of the buried particles was investigated, and it was found that, once trapped in the seafloor, they no longer degrade, either due to lack of erosion, oxygen, or light. "The process of fragmentation takes place mostly in the beach sediments, on the sea surface or in the water column. Once deposited, degradation is minimal, so plastics from the 1960s remain on the seabed, leaving the signature of human pollution there," says Patrizia Ziveri, ICREA professor at ICTA-UAB.

The investigated sediment core was collected in November 2019, on board the oceanographic vessel Sarmiento de Gamboa, in an expedition that went from Barcelona to the coast of the Ebro Delta, in Tarragona, Spain. The research group selected the western Mediterranean Sea as a study area, in particular the Ebro Delta, because rivers are recognized as hotspots for several pollutants, including microplastics. In addition, the influx of sediment from the Ebro River provides higher sedimentation rates than in the open ocean.

Should we tax robots?

Study suggests a robot levy — but only a modest one — could help combat the effects of automation on income inequality in the U.S.

Peer-Reviewed Publication

MASSACHUSETTS INSTITUTE OF TECHNOLOGY

What if the U.S. placed a tax on robots? The concept has been publicly discussed by policy analysts, scholars, and Bill Gates (who favors the notion). Because robots can replace jobs, the idea goes, a stiff tax on them would give firms incentive to help retain workers, while also compensating for a dropoff in payroll taxes when robots are used. Thus far, South Korea has reduced incentives for firms to deploy robots; European Union policymakers, on the other hand, considered a robot tax but did not enact it.  

Now a study by MIT economists scrutinizes the existing evidence and suggests the optimal policy in this situation would indeed include a tax on robots, but only a modest one. The same applies to taxes on foreign trade that would also reduce U.S. jobs, the research finds.   

“Our finding suggests that taxes on either robots or imported goods should be pretty small,” says Arnaud Costinot, an MIT economist and co-author of a published paper detailing the findings. “Although robots have an effect on income inequality … they still lead to optimal taxes that are modest.”

Specifically, the study finds that a tax on robots should range from 1 percent to 3.7 percent of their value, while trade taxes would be from 0.03 percent to 0.11 percent, given current U.S. income taxes.

“We came into this not knowing what would happen,” says Iván Werning, an MIT economist and the other co-author of the study. “We had all the potential ingredients for this to be a big tax, so that by stopping technology or trade you would have less inequality, but … for now, we find a tax in the one-digit range, and for trade, even smaller taxes.”

The paper, “Robots, Trade, and Luddism: A Sufficient Statistic Approach to Optimal Technology Regulation,” appears in advance online form in The Review of Economic Studies. Costinot is a professor of economics and associate head of the MIT Department of Economics; Werning is the department’s Robert M. Solow Professor of Economics.

A sufficient statistic: Wages

A key to the study is that the scholars did not start with an a priori idea about whether or not taxes on robots and trade were merited. Rather, they applied a “sufficient statistic” approach, examining empirical evidence on the subject.

For instance, one study by MIT economist Daron Acemoglu and Boston University economist Pascual Restrepo found that in the U.S. from 1990 to 2007, adding one robot per 1,000 workers reduced the employment-to-population ratio by about 0.2 percent; each robot added in manufacturing replaced about 3.3 workers, while the increase in workplace robots lowered wages about 0.4 percent.
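Taken at face value, those per-robot estimates can be scaled to gauge what a given robot density would imply. The sketch below is a hypothetical back-of-envelope extrapolation, not a calculation from the Acemoglu-Restrepo paper; in particular, the assumption of linear scaling is mine, and the example inputs (2 robots per 1,000 workers, a workforce of one million) are arbitrary.

```python
# Back-of-envelope scaling of the per-robot estimates quoted in the article.
# Linear extrapolation is a simplifying assumption, not the paper's model.
EMPLOYMENT_DROP_PER_ROBOT_PCT = 0.2   # fall in employment-to-population ratio (%)
WAGE_DROP_PER_ROBOT_PCT = 0.4         # fall in wages (%)
WORKERS_REPLACED_PER_ROBOT = 3.3      # manufacturing workers replaced per robot

def implied_effects(robots_per_1000_workers: float, workforce: int) -> dict:
    """Implied aggregate effects at a given robot density (illustrative only)."""
    robots = robots_per_1000_workers * workforce / 1000
    return {
        "employment_ratio_drop_pct": EMPLOYMENT_DROP_PER_ROBOT_PCT * robots_per_1000_workers,
        "wage_drop_pct": WAGE_DROP_PER_ROBOT_PCT * robots_per_1000_workers,
        "jobs_replaced": robots * WORKERS_REPLACED_PER_ROBOT,
    }

print(implied_effects(robots_per_1000_workers=2, workforce=1_000_000))
```

At that hypothetical density, doubling the robot stock simply doubles each effect in this linear sketch, which is precisely the simplification the study's later "marginal robot matters less" finding complicates.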

In conducting their policy analysis, Costinot and Werning drew upon that empirical study and others. They built a model to evaluate a few different scenarios, and included levers like income taxes as other means of addressing income inequality.

“We do have these other tools, though they’re not perfect, for dealing with inequality,” Werning says. “We think it’s incorrect to discuss these taxes on robots and trade as if they are our only tools for redistribution.”

Still more specifically, the scholars used wage distribution data across all five income quintiles in the U.S. — the top 20 percent, the next 20 percent, and so on — to evaluate the need for robot and trade taxes. Where empirical data indicates technology and trade have changed that wage distribution, the magnitude of that change helped produce the robot and trade tax estimates Costinot and Werning suggest. This has the benefit of simplicity; the overall wage numbers help the economists avoid making a model with too many assumptions about, say, the exact role automation might play in a workplace.

“I think where we are methodologically breaking ground, we’re able to make that connection between wages and taxes without making super-particular assumptions about technology and about the way production works,” Werning says. “It’s all encoded in that distributional effect. We’re asking a lot from that empirical work. But we’re not making assumptions we cannot test about the rest of the economy.”

Costinot adds: “If you are at peace with some high-level assumptions about the way markets operate, we can tell you that the only objects of interest driving the optimal policy on robots or Chinese goods should be these responses of wages across quantiles of the income distribution, which, luckily for us, people have tried to estimate.”

Beyond robots, an approach for climate and more

Apart from its bottom-line tax numbers, the study contains some additional conclusions about technology and income trends. Perhaps counterintuitively, the research concludes that after many more robots are added to the economy, the impact that each additional robot has on wages may actually decline. At a future point, robot taxes could then be reduced even further.   

“You could have a situation where we deeply care about redistribution, we have more robots, we have more trade, but taxes are actually going down,” Costinot says. If the economy is relatively saturated with robots, he adds, “That marginal robot you are getting in the economy matters less and less for inequality.”

The study’s approach could also be applied to subjects besides automation and trade. There is increasing empirical work on, for instance, the impact of climate change on income inequality, as well as similar studies about how migration, education, and other things affect wages. Given the increasing empirical data in those fields, the kind of modeling Costinot and Werning perform in this paper could be applied to determine, say, the right level for carbon taxes, if the goal is to sustain a reasonable income distribution.

“There are a lot of other applications,” Werning says. “There is a similar logic to those issues, where this methodology would carry through.” That suggests several other future avenues of research related to the current paper.

In the meantime, people who have envisioned a steep tax on robots are “qualitatively right, but quantitatively off,” Werning concludes.

###

Written by Peter Dizikes, MIT News

Additional background

Paper: “Robots, Trade, and Luddism: A Sufficient Statistic Approach to Optimal Technology Regulation”

https://academic.oup.com/restud/advance-article-abstract/doi/10.1093/restud/rdac076/6798670?redirectedFrom=fulltext&login=false

New multi-institutional project to examine the flow of the Earth’s mantle

Grant and Award Announcement

UNIVERSITY OF ILLINOIS SCHOOL OF INFORMATION SCIENCES

The work of a team led by Visiting Research Scientist Chris Havlin and Assistant Professor Matthew Turk of the School of Information Sciences at the University of Illinois Urbana-Champaign is at the center of a National Science Foundation (NSF) project to better understand the microphysical activities of rocks that affect the upper mantle of the Earth.

The mantle is the interior layer of Earth, between the crust at the surface and the core at the center. Although solid, the mantle can be caused to flow by the push or pull of physical phenomena. The primary driver of mantle flow is thermal convection, which controls the motion of the mantle and evolution of tectonic plates over millions of years. On top of the convective background, processes that occur on shorter timescales cause perturbations of this background state. These include seismic waves following an earthquake, melting of continental ice sheets and glaciers, the annual recharge and extraction of groundwater, and the drainage of large lakes. For this new project, scientists will focus on these shorter time scale responses and consider three locations on the planet with existing datasets, the western United States, Iceland, and Alaska.

Havlin and Turk have been awarded a five-year, $127,723 grant from the NSF to focus on computational modeling for their project, “inveStigating the Transient Rheology of the Upper Mantle (iSTRUM).” Joining the iSchool team for the total $1.6 million project will be scientists from the University of California-Berkeley, Institut de Physique du Globe de Paris, University of Minnesota-Twin Cities, Lamont-Doherty Earth Observatory at Columbia University, University of California-Santa Barbara, and Brown University. The work will integrate theory, experiments, and observations spanning seismic to convective timescales.

“It’s rare to have a big project like this,” Havlin said. “Our modeling sits at the intersection of all this work. It’s at the heart of the scientific process behind this project.”

The iSchool team will use a software program Havlin has developed, the Very Broadband Rheology Calculator (VBRc), to determine the rocks’ properties at various time and length scales. The tool will connect the microscopic description of the rock to macroscopic observations of the land masses as recorded from satellites in space, Global Positioning System networks, and seismic stations. Part of the iSchool team’s work will be conducting workshops on using VBRc.

“I help the team of scientists to use this tool, and we will adjust the code to make it specialized for this project,” Havlin said. “I’m hoping we can get a larger user base and community to modify the tool for their needs.”

The results of this study have a bearing on topics ranging from predicting how sea levels will rise due to melting ice sheets to understanding tidal deformation on the Jupiter moon Io.

Chronic dysentery unlikely killer of Edward the Black Prince as is commonly believed

Malaria and inflammatory bowel disease among possible causes that changed course of English history, suggests military expert


Peer-Reviewed Publication

BMJ

Whatever disease killed Edward the Black Prince—heir apparent to the English throne in the mid 1300s, and heralded as the greatest English soldier ever to have lived—is unlikely to have been chronic dysentery, as is commonly believed, writes a military expert in the journal BMJ Military Health.

But whether it was malaria; brucellosis, caused by eating unpasteurised dairy products and raw meat; inflammatory bowel disease; or complications arising from a single bout of dysentery—all possible causes—the disease changed the course of English history, says Dr James Robert Anderson of 21 Engineer Regiment.

And what happened to the Black Prince, who fought wars almost continuously and was exposed to violence from the age of 16, has been repeated endlessly throughout history, with disease, rather than battle injury, taking the heaviest toll on life during warfare, he says.

Edward of Woodstock, the Black Prince, was never seriously injured despite the number of military campaigns he led. But he had a chronic illness that waxed and waned for almost 9 years, to which he finally succumbed in 1376 at the age of 45. 

His early death changed the course of English history, because the crown passed directly to his 10 year-old son after the death of King Edward III. Young King Richard II was later deposed and murdered, sparking over a century of instability, including the Wars of the Roses and the rise of the Tudors, notes the author.

The Black Prince’s illness is thought to have started after his victory at the Battle of Nájera in Spain in 1367, writes the author. A chronicle suggested that up to 80% of his army may have died from “dysentery and other diseases.” 

And most later accounts of the Black Prince’s death suggest that he died from chronic dysentery, possibly the amoebic form, which was common in medieval Europe. 

Amoebic dysentery can cause long term complications, including internal scarring (amoeboma), intestinal inflammation and ulceration (colitis), and extreme inflammation and distension of the bowel (life threatening toxic megacolon), points out the author. 

But if he really did have amoebic dysentery, with its symptoms of chronic diarrhoea, would he really have been well enough to board, let alone be welcomed aboard, a ship with a cargo of soldiers heading for battle in France in 1372, asks the author?

Complications from surviving a single bout of dysentery are a possibility, particularly as historical records indicate that paratyphoid—similar to typhoid but caused by a different bug, and a recently discovered cause of dysentery—was in circulation in 1367.

Complications from this could have included long term health issues, such as anaemia, kidney damage, liver abscess and/or reactive arthritis, suggests the author.

Dehydration due to lack of water during the hot Spanish campaign is another possibility. This could have caused kidney stones which would fit with a fluctuating illness lasting several years, he says. 

Another candidate is inflammatory bowel disease, which might have accounted for relapsing-remitting symptoms and gradual deterioration, suggests the author.

Brucellosis was also common in medieval Europe, and its sources (dairy products and raw meat) were often kept aside for the nobility on military campaigns, says the author. It can produce chronic symptoms of fatigue, recurrent fever, and joint and heart inflammation.

Another common disease in medieval Europe was malaria, the symptoms of which include fever, headache, myalgia (muscle aches and pains), gut problems, fatigue, chronic anaemia and susceptibility to acute infections, such as pneumonia or gastroenteritis, leading to multiorgan failure and death, he adds.

“This would fit the fluctuating nature of his illness and the decline towards the end of his life. Any anaemia would not have been helped by the purging and venesection [blood letting] treatments of the time,” he suggests.

“There are several diverse infections or inflammatory conditions that may have led to [the Black Prince’s] demise…However, chronic dysentery is probably unlikely,” he writes. 

And he concludes: “Even in modern conflicts and war zones, disease has caused enormous morbidity and loss of life, something that has remained consistent for centuries. Efforts to protect and treat deployed forces are as important now as in the 1370s.”

 

Popular folk medicine remedy (‘The Secret’) doesn’t prevent bleeding after invasive heart procedures

But it may help to relieve stress among its believers, suggest the researchers

Peer-Reviewed Publication

BMJ

A popular folk medicine remedy for staunching blood, known as ‘The Secret’, doesn’t stop bleeding after invasive coronary procedures used to diagnose or treat cardiac problems, finds research published in the open access journal Open Heart.

But this remnant from medical practice in the Middle Ages may help to relieve stress among its believers, and may have ‘therapeutic’ placebo effect value, suggest the researchers.

‘The Secret’ has been used for several centuries in Switzerland, particularly in the French speaking part, to staunch blood during and after a procedure. It consists of a healing ‘formula’ or prayer that is intended to mobilise superior forces to help cure the patient.

The ‘formula’, which can be deployed on site or remotely by an initiated ‘Secret Maker’, is a widely practised and reputed complementary medicine in Switzerland, so much so that it is used in hospitals. But its clinical effectiveness has never been formally evaluated.

In a bid to plug this knowledge gap, the researchers compared bleeding outcomes in 200 people admitted to one tertiary care centre for planned invasive coronary procedures.

These were diagnostic coronary angiography (x-ray imaging of the heart vessels) and/or percutaneous coronary intervention (unclogging of blocked arteries to restore blood flow) between January and July 2022.

Half the patients were randomly assigned to standard care, and half were randomly assigned to standard care plus The Secret, which was administered by a randomly selected Secret Maker.

The average age of the patients was 68, and nearly three out of four were men. Most (76%) of the entire sample believed that The Secret would prevent bleeding, with believers more or less evenly distributed across both groups.

Risk factors for postoperative complications were similar between the two groups, as were the criteria for minor and major bleeding. 

Bleeding severity was defined according to Bleeding Academic Research Consortium (BARC) criteria, ranging from 1 (minor) to 5 (major).

Bleeding after a procedure occurred in 55 (27.5%) of the patients. Rates were similar in both groups: for minor (BARC 1) bleeding, 16% in The Secret group vs 14% in the standard care group; and for BARC 2 bleeding, 12% vs 13%. No patient had a major bleed (BARC 3 or above).
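The subgroup percentages are consistent with the overall 27.5% figure, given the even 100/100 randomisation described above. This is a simple consistency check on the reported numbers, not part of the study's analysis:

```python
# Consistency check on the reported bleeding rates (100 patients per arm,
# per the 200-patient, half-and-half randomisation described in the article).
n_per_arm = 100
secret_barc1, control_barc1 = 0.16, 0.14   # minor (BARC 1) bleeding rates
secret_barc2, control_barc2 = 0.12, 0.13   # BARC 2 bleeding rates

secret_bleeds = (secret_barc1 + secret_barc2) * n_per_arm    # 28 patients
control_bleeds = (control_barc1 + control_barc2) * n_per_arm # 27 patients
total = secret_bleeds + control_bleeds

print(f"total bleeds: {total:.0f} of {2 * n_per_arm} ({total / (2 * n_per_arm):.1%})")
```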

The researchers acknowledge that their study was relatively small and carried out in one hospital. And despite random group allocation, radial artery access for angiography was more often used for patients in the standard care group. This can lower the risk of serious vascular complications and major bleeding. 

That The Secret didn’t affect bleeding outcomes for better or worse wasn’t exactly a surprise, say the researchers, but a substantial proportion of patients nevertheless request The Secret.

“This apparent discrepancy between the measured effects on bleeding and patient demands touches on an aspect that was not addressed by this study, but which can be understood as stress management and wellbeing,” they point out. 

“The reduction of stress in the patient who has used a ‘Secret Maker’ has been considered after burns. As such, ‘The Secret’ might allow some neuropsychological conditioning and act as a placebo, as do other beliefs or biofeedback techniques,” they suggest.

The Secret is a remnant of the Middle Ages, when medicine was practised by monks or sorcerers, based on one of the miracles reported in the synoptic gospels (Matthew, Mark, Luke) as “Jesus healing the bleeding woman,” they explain. 

Despite medical and scientific advances, “recent enthusiasm for ‘alternative’ medicines and healers, which is particularly intense on social media since the last COVID-19 pandemic started, or the techno-optimism towards global warming, are proof of persistent magical thinking among the general public,” they add.

New study finds birds build hanging-nests to protect offspring from nest invaders


Peer-Reviewed Publication

DURHAM UNIVERSITY

-With pictures-

A new study has found that birds build hanging-nests, particularly those with extended entrance tunnels, to help protect offspring against nest invaders like snakes and parasitic cuckoos.

Researchers at Durham University, the British Trust for Ornithology and Princeton University examined the relationship between nest design and the length of time offspring spend in the nest before fledging across species of weaverbirds and icterids, two bird families renowned for their complex woven nests.

They found that species building the most elaborate nests, particularly those with long entrance tunnels, produce offspring with longer developmental periods.

Nests with longer entrance tunnels are more effective than those with shorter tunnels at hindering access by nest invaders, and thereby limit the exposure of developing offspring to such invaders.

Researchers suggest that the complex structural features in these nests do indeed play a role in protecting offspring from predators and brood parasites.

They find the consistency of these findings ‘striking’ given that highly elaborate nests have evolved independently in the weaverbirds and icterids.

Full analysis of the study has been published in the journal Proceedings of the Royal Society B.

Lead author of the study, Dr Sally Street of Durham University, said: “Ornithologists have long been fascinated by the beautifully woven nests of weaverbirds and icterids – these nests often dangle precariously from slim branches and some have extended entrance tunnels up to a metre long.

“It has been widely assumed that these nests prevent attacks by tree-climbing snakes, but until now this idea was largely based on anecdotes. We are excited to show that these ideas appear to be correct – species building the most elaborate nests, particularly those with long entrance tunnels, have more slowly developing offspring, which is exactly what we should expect if the nests protect chicks from predators and other nest invaders such as brood parasitic cuckoos.”

The researchers also suggest that by building protective structures such as these elaborate nests, birds and other species can exert greater control over their exposure to environmental hazards.

For this study, the scientists obtained data on nest design, life-history traits, body mass and latitude in weaverbird and icterid species from multiple secondary sources.

Their study findings reveal how animal architects such as nest-building birds and burrowing mammals can create protective environments that change how their offspring develop.

The researchers say this may even help to understand the role of shelter-building in human evolution.

ENDS

Media Information

Dr Sally Street is available for interview and can be contacted on sally.e.street@durham.ac.uk.   

Alternatively, please contact Durham University Communications Office for interview requests on communications.team@durham.ac.uk.

Source

“Convergent evolution of elaborate nests as structural defences in birds”, (2022), S. Street, R. Jaques and T. Silva, Proceedings of the Royal Society B.

An embargoed copy of the paper is available from Durham University Communications Office. Please email communications.team@durham.ac.uk.

The article will be available online after the embargo lifts: https://doi.org/10.1098/rspb.2022.1734

Graphics

Associated images are available via the following link: https://www.dropbox.com/scl/fo/cwuafjz1iie5b8ur0rtit/h?dl=0&rlkey=bzlamt5xwydb0eev61nty1i19

Useful Web Links  

Dr Sally Street staff profile: https://www.durham.ac.uk/staff/sally-e-street/

Department of Anthropology: https://www.durham.ac.uk/anthropology/

Durham Cultural Evolution Research Centre: https://www.durham.ac.uk/dcerc/

About Durham University

Durham University is a globally outstanding centre of teaching and research based in historic Durham City in the UK.

We are a collegiate university committed to inspiring our people to do outstanding things at Durham and in the world.

We conduct boundary-breaking research that improves lives globally and we are ranked as a world top 100 university with an international reputation in research and education (QS World University Rankings 2023).

We are a member of the Russell Group of leading research-intensive UK universities and we are consistently ranked as a top 10 university in national league tables (Times and Sunday Times Good University Guide, Guardian University Guide and The Complete University Guide).

For more information about Durham University visit: www.durham.ac.uk/about/

END OF MEDIA RELEASE – issued by Durham University