Tuesday, September 14, 2021

 

One water bucket to find them all: Detecting fish, mammals, and birds from a single sample


Revolutionary environmental DNA analysis holds great potential for the future of biodiversity monitoring, concludes a new study

Peer-Reviewed Publication

PENSOFT PUBLISHERS

CAPTION

Collection of water samples for eDNA metabarcoding bioassessment.

CREDIT

Till-Hendrik Macher

At a time of accelerating biodiversity loss, reliable data on species occurrence are essential so that prompt and adequate conservation actions can be initiated. This is especially true for freshwater ecosystems, which are particularly vulnerable to anthropogenic impacts. Their ecological status has already been highlighted as a top priority by multiple national and international directives, such as the European Water Framework Directive.

However, traditional monitoring methods, such as electrofishing, trapping, or observation-based assessments, which remain the status quo in fish monitoring, are often time-consuming and costly. As a result, over the last decade scientists have increasingly agreed that a more comprehensive and holistic method is needed to assess freshwater biodiversity.

Meanwhile, recent studies have repeatedly demonstrated that eDNA metabarcoding, in which DNA traces found in the water are used to identify the organisms living there, is an efficient way to capture aquatic biodiversity in a fast, reliable, non-invasive and relatively low-cost manner. In such metabarcoding studies, scientists collect and sequence DNA from water samples and compare the sequences against existing reference databases to identify the source organisms.
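To make the identification step concrete, here is a minimal Python sketch of the comparison against a reference database. It is purely illustrative: the species, barcode sequences, and the exact-match lookup are invented simplifications, whereas real metabarcoding pipelines use dedicated bioinformatics tools that tolerate sequencing errors.

```python
# Illustrative sketch only: real eDNA metabarcoding pipelines use dedicated
# tools (denoising, clustering, alignment against curated references), not
# exact dictionary lookups. Sequences and species below are made up.

reference_db = {
    "ACGTTGCAAT": "Salmo trutta (brown trout)",
    "TTGACCGTAA": "Castor fiber (Eurasian beaver)",
    "GGCATTACCA": "Alcedo atthis (common kingfisher)",
}

def identify(read: str) -> str:
    """Return the species whose reference barcode matches the read, if any."""
    return reference_db.get(read, "no match in reference database")

# Each sequencing read obtained from the water sample is looked up in turn.
sample_reads = ["ACGTTGCAAT", "GGCATTACCA", "AAAAAAAAAA"]
for read in sample_reads:
    print(read, "->", identify(read))
```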

Furthermore, because eDNA metabarcoding assessments use water samples, often taken from streams at the lowest point of a catchment, a single sample usually contains not only traces of animals that come into direct contact with the water, for example by swimming or drinking, but also traces of terrestrial species washed in indirectly by rainfall, snowmelt, groundwater and so on.

CAPTION

After water filtration the eDNA filter is preserved in ethanol until further processing in the lab.

CREDIT

Till-Hendrik Macher

In standard fish eDNA metabarcoding assessments, these ‘bycatch data’ are typically set aside. Yet, from the viewpoint of more holistic biodiversity monitoring, they hold immense potential for detecting the presence of terrestrial and semi-terrestrial species in the catchment.

In their new study, reported in the open-access scholarly journal Metabarcoding and Metagenomics, German researchers from the University of Duisburg-Essen and the German Environment Agency detected an astonishing share of the mammals and birds native to the state of Saxony-Anhalt by collecting just 18 litres of water across a two-kilometre stretch of the river Mulde.

In fact, it took the team, led by Till-Hendrik Macher, a PhD student in the German Federal Environmental Agency-funded GeDNA project, only one day to collect the samples. Using metabarcoding to analyse the DNA in the samples, the researchers identified 50% of the fish species, 22% of the mammal species, and 7.4% of the breeding bird species in the region.

However, the team also concluded that while it would normally take only 10 litres of water to assess the aquatic and semi-terrestrial fauna, terrestrial species required significantly more sampling.

Unlocking these bycatch data from the increasingly available fish eDNA metabarcoding datasets enables synergies between terrestrial and aquatic biodiversity monitoring programs, adding further important information on species diversity in space and time.

“We thus encourage to exploit fish eDNA metabarcoding biodiversity monitoring data to inform other conservation programs,” says lead author Till-Hendrik Macher.

“For that purpose, however, it is essential that eDNA data is jointly stored and accessible for different biodiversity monitoring and biodiversity assessment campaigns, either at state, federal, or international level,” concludes Florian Leese, who coordinates the project.

  

CAPTION

Overview of a freshwater-associated vertebrate community including some of the detected species.

CREDIT

Till-Hendrik Macher

Original source:

Macher T-H, Schütz R, Arle J, Beermann AJ, Koschorreck J, Leese F (2021) Beyond fish eDNA metabarcoding: Field replicates disproportionately improve the detection of stream associated vertebrate species. Metabarcoding and Metagenomics 5: e66557. https://doi.org/10.3897/mbmg.5.66557

 

Ambitious research to study fundamental earth and environmental science questions


Five innovative new research projects could push the boundaries of science and help us understand key questions of environmental and earth science.


Grant and Award Announcement

UK RESEARCH AND INNOVATION


The ambitious studies, led by some of the UK’s leading scientists, are each tackling fundamental questions about the earth and our environment, including how we interact with our planet. The studies will:

  • Establish whether the Earth's core has multiple layers by building computer models to explain seismic and magnetic field data. The research could change our understanding of the processes that generate our planet’s protective magnetic field.
  • Conduct the first comprehensive study of epigenetics, and whether environmental factors can cause changes in the way our genes are read, by following the lifecycles of a rare breed of sheep on the remote island of St Kilda in the Outer Hebrides.
  • Investigate whether deep-sea hydrothermal vents sustained or even created the first life on Earth. The research team will also investigate what that can tell us about the possibility of life on volcanically active planets, including Mars.
  • Assess the limits of stability of marine ecosystems in a changing environment by studying the roles of three triggers of widespread mortality: viruses, microplastic pollution, and early life cycles.
  • Build our understanding of how single-celled, plant-like marine organisms, called coccolithophores, use the sun’s energy to transform dissolved ions from the sea back into rocks.

The Natural Environment Research Council (NERC) has invested a total of £8 million in the research as part of a unique pilot scheme to fund high risk and innovative science. This is the first time NERC has launched a scheme of this kind.

The Pushing the Frontiers scheme aims to facilitate truly adventurous and ambitious science and exploit new technologies and approaches. The projects will be funded for between three and four years.

UK Science Minister Amanda Solloway said:

“If the UK is to lead the world in achieving scientific breakthroughs, it’s vital that we give our most pioneering scientists and researchers license to go where others haven’t before by driving forward high-risk, high-reward research.

“That’s why we are backing these five ambitious studies to the tune of £8m, to help solve unanswered questions about our Universe – from the origins of Earth to whether there is life on Mars - all while helping to secure the UK’s status as a global science superpower.” 

Robyn Thomas, Associate Director of Research and Skills at NERC, said:

“These highly innovative research projects could advance our understanding of fundamental questions in environmental and earth science, and lead to important scientific breakthroughs. The grants are the outcome of an exciting new pilot scheme to encourage and fund some of the UK’s most exceptional environmental scientists to lead more risky and transformational research.”  

Ends

Project summaries:

  1. Dr Eva Stueeken (University of St Andrews)

Project title: Did hydrothermal vents push the frontiers of habitability on the early Earth?

This research project will study the fundamental question of how early life was sustained. Several lines of evidence suggest that the nutrients supplied by deep-sea hydrothermal vents could have cradled the earliest life on Earth.

To test this hypothesis, Dr Stueeken will develop a new hydrothermal reaction chamber that will allow her to recreate the pressures and temperatures in the vents and to introduce different gases (N2, CO2, CH4), fluids (saline, fresh), phosphate phases and catalytic minerals (magnetite, sulphides).

This could help us understand how volcanic activity on other planetary bodies might in theory sustain life – including on Mars and on Europa, a moon of Jupiter.

  2. Professor Jon Slate (University of Sheffield)

Project title: The role of epigenetics in evolution

Professor Slate and his team will conduct the first full study of the role of epigenetics in evolution – how environmental factors can change the way genes are read. Researchers will track the population of Soay sheep on the island of Hirta, St Kilda, in the Outer Hebrides. This population is one of the best studied mammal populations in the world and has many features that make it ideal for exploring the role of epigenetics in evolution. Complete life histories are known for around 10,000 sheep born since 1985. Environmental conditions on the island are challenging, meaning natural selection is strong enough for evolutionary change to have been witnessed and measured in the lifetime of the study.

The project will investigate the commonest form of epigenetic effect, known as methylation.

  3. Professor Rosalind Emily Mayors Rickaby (University of Oxford)

Project title: PUCCA: Photosynthetic Underpinnings of Coccolithophore Calcification

Single-celled organisms called coccolithophores generate over 1 billion tonnes of chalk (calcium carbonate) in the ocean every year. These ocean bugs provide a foundation of the marine ecosystem and are major players in the ocean carbon cycle.

The generation of chalk in the surface ocean affects how much CO2 can be drawn from the atmosphere into those surface waters. And the production of the dense chalk mineral by these carbon-fixing cells helps them sink to depth and carry carbon dioxide from the atmosphere to the deep ocean, where it is stored. This is a fundamental part of the modern carbon cycle, but we do not know what limits how much chalk can be produced and therefore how this component will affect the amount of carbon dioxide that the ocean can absorb in the future. By working with interdisciplinary teams and using pioneering techniques, Professor Rickaby will:

  • Compare the physiology recorded in coccolithophore fossils with modern cultured cells to establish how they have adapted to environmental change over timescales of centuries to millions of years.
  • Explore the links between photosynthesis and calcification as evidenced by modern and past biomolecules, adaptation in genes, and the isotopic composition of the fossils.

  4. Professor Corinne Le Quéré (University of East Anglia)

Project title: Frontiers of instability in marine ecosystems and carbon export (Marine Frontiers)

This research project aims to establish the limits of stability of the Earth’s marine ecosystems, using modelling on a scale not undertaken before. Marine micro-organisms live on the ocean surface and once they die or are eaten, organic matter sinks to the bottom of the sea. This is a vital process in regulating our climate.
 
Understanding the causes of mortality of micro-organisms - and the roles of viruses, microplastics and early life cycles - is therefore critical. Organisms’ lifespans are very short and so changes in mortality can substantially impact marine ecosystems. Yet mortality processes have received little attention so far.
 
Using data from new technology, including imaging of ocean plankton, Professor Le Quéré will develop and use a global ecosystem model to better understand how marine ecosystems are impacted by multiple stressors, including climate change, ocean acidification, microplastic pollution, and pressure from fisheries, and test the limits of their stability under extreme conditions.
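The press release does not give the model's equations; purely to illustrate the kind of mortality terms being studied, the hypothetical Python sketch below couples simple logistic plankton growth with grazing and viral-lysis losses. All functional forms and parameter values are assumptions for illustration, not the Marine Frontiers model.

```python
# Toy illustration (not the actual Marine Frontiers model): one plankton
# population P with logistic growth and two hypothetical mortality terms,
# linear grazing and density-dependent viral lysis, integrated with Euler steps.

growth_rate = 0.6    # per day (assumed)
capacity = 10.0      # carrying capacity, arbitrary biomass units (assumed)
grazing_rate = 0.1   # per day (assumed)
viral_rate = 0.05    # per day per unit biomass (assumed)

def step(P, dt=0.1):
    growth = growth_rate * P * (1 - P / capacity)
    mortality = grazing_rate * P + viral_rate * P * P
    return P + dt * (growth - mortality)

P = 1.0
for _ in range(300):  # 300 steps of 0.1 day = 30 days
    P = step(P)
print(f"plankton biomass after 30 days: {P:.2f}")
```

Changing the mortality parameters in a sketch like this shifts where the population settles, which is the kind of sensitivity the project will probe at global scale with far richer ecosystem models.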
 

  5. Dr Christopher Davies (University of Leeds)

Project title: Earth's Core as a Layered System

Establishing the origin of Earth's magnetic field is crucial for understanding planetary habitability and evolution and is widely recognised as a fundamental goal in Earth Science. The field has shielded the surface environment from solar radiation for billions of years and now helps mitigate against space weather events, which can significantly impact telecommunications and power grids. Yet the field is generated in the iron core, an ocean of liquid metal 2800 km below the surface, and so the changes we experience at Earth’s surface reflect dynamics in the most remote region of our planet.

The standard model of Earth’s core cannot explain crucial observations from seismology and geomagnetism and therefore lacks essential physics. Dr Davies believes that the observations can be explained by viewing Earth’s core as a system of coupled layers, each with its own dynamics. To test this hypothesis, he will develop a new model of the core that requires major enhancements to existing computer codes and solutions of new and complex systems of equations.

Further information

About the Pushing the Frontiers Pilot scheme

The Pushing the Frontiers pilot aims to support our very best individual researchers to push the frontiers of knowledge with ground-breaking, risky, innovative scientific discovery. The proposals are funded at up to £2m per award over a period of 3–4 years, a timescale and funding level that can enable the ambitious and innovative research we seek. The application process was streamlined, with a focus on the proposed transformational research (5 pages) and the skills and track record of the individual (2 pages).

As part of the streamlined application process, the assessment focused on the individual applicant and their high-risk, high-reward scientific vision. Joint applications with other researchers were not permitted, although funds may be used flexibly to resource eligible collaborators. In recognition that the PI may lead a team, applicants had to demonstrate in their track record their capability to advance the careers of others.

This scheme structure encouraged applications from early career researchers (ECRs); within an institution’s demand management allocation, research organisations could submit up to two applications to the scheme, provided one was from an ECR. There were separate initial assessment panels for ECRs and established PIs, but a final panel jointly considered the best candidates from each.


 

The bridge between public health education and government workforce needs fixing


Rebuilding the U.S. public health system requires a new generation of highly trained, diverse public health professionals

Peer-Reviewed Publication

COLUMBIA UNIVERSITY'S MAILMAN SCHOOL OF PUBLIC HEALTH

September 13, 2021-- The COVID-19 pandemic has uncovered long-term underinvestment in the public health workforce, including staff losses and underfunding for public health education, according to a new paper in the American Journal of Public Health. For training of individuals in health departments to succeed, we must assess needs, increase access to education for future public health professionals, and invest in the existing public health workforce, according to Columbia Mailman School of Public Health authors Heather Krasna and Dean Linda P. Fried.

It has been estimated that for every dollar spent on public health, $14.30 is saved on health care and related costs.

“The public health workforce is a critical element of the public health system and infrastructure, but a reduction in the number of public health workers in the core government public health workforce is well documented,” says Heather Krasna, MS, assistant dean of career services at Columbia Mailman School. “And with 22 percent of government public health workers planning to retire by 2023, workforce losses are expected to worsen. Meanwhile, the for-profit sector is increasingly hiring public health graduates.”

“With only 14 percent of governmental public health professionals having a formal education in the field of public health, rebuilding and expanding the U.S. public health system will require a new generation of highly trained, diverse public health professionals to create a healthier America, and these professionals will need a public health education,” says Linda P. Fried, MD, MPH, Columbia Mailman School Dean and DeLamar Professor of Public Health Practice and Professor of Epidemiology.

Even with increased enrollments in public health degree programs, it is unlikely that enough public health graduates are entering government to address unmet needs. Although many students are motivated to work in government public health, many have concerns about career paths, wages, employee empowerment, and opportunities for innovation within government, according to the authors.

To ensure a highly trained, diverse public health workforce and replace retiring workers while adding to capacity to handle the COVID-19 pandemic and other public health challenges, authors Krasna and Fried offer the following recommendations:

  • Undertake new workforce research, including large scale surveys on the number and types of workers needed in specific public health occupations.
     
  • Make reforms to public service loan forgiveness and create student loan repayment programs specifically for public health graduates. Existing loan forgiveness and repayment for clinicians should also encourage work in public health.
     
  • Attract diverse candidates with recruitment campaigns and reforms. Programs that build the pipeline of hires from academia into governmental public health should be significantly expanded. This includes streamlining the hiring process in government agencies.
     
  • Sustain workforce investment. Expanded funding for public health workers must become permanent. This is essential for creating a unified public health workforce recruitment and training plan.

“We can no longer rely on ‘emergency’-based, short-term, earmarked funding that disappears when a crisis ends,” notes Krasna. “Without long-term investment in education for new public health professionals and programs easing entry into government careers, a recovery from COVID-19 and improvements in the public’s health will be impossible.”

Columbia University Mailman School of Public Health

Founded in 1922, the Columbia University Mailman School of Public Health pursues an agenda of research, education, and service to address the critical and complex public health issues affecting New Yorkers, the nation and the world. The Columbia Mailman School is the seventh largest recipient of NIH grants among schools of public health. Its nearly 300 multi-disciplinary faculty members work in more than 100 countries around the world, addressing such issues as preventing infectious and chronic diseases, environmental health, maternal and child health, health policy, climate change and health, and public health preparedness. It is a leader in public health education with more than 1,300 graduate students from 55 nations pursuing a variety of master’s and doctoral degree programs. The Columbia Mailman School is also home to numerous world-renowned research centers, including ICAP and the Center for Infection and Immunity. For more information, please visit www.mailman.columbia.edu

 

Researchers develop toolkit to test Apple security, find vulnerability


Peer-Reviewed Publication

NORTH CAROLINA STATE UNIVERSITY

Researchers from North Carolina State University have developed a software toolkit that allows users to test the hardware security of Apple devices. During their proof-of-concept demonstration, the research team identified a previously unknown vulnerability, which they call iTimed.

“This toolkit allows us to conduct a variety of fine-grained security experiments that have simply not been possible on Apple devices to this point,” says Aydin Aysu, co-author of a paper on the work and an assistant professor of electrical and computer engineering at NC State.

Apple is well known for creating integrated devices. The design of the devices effectively prevents people from seeing how the devices function internally.

“As a result, it has been difficult or impossible for independent researchers to verify that Apple devices perform the way that Apple says they perform when it comes to security and privacy,” says Gregor Haas, first author of the paper and a recent master’s graduate from NC State.

However, in 2019 a hardware vulnerability called checkm8 was uncovered. It affects several iPhone models and is essentially an unpatchable flaw.

“We were able to use checkm8 to get a foothold at the most fundamental level of the device – when the system begins booting up, we can control the very first code to run on the machine,” Haas says. “With checkm8 as a starting point, we developed a suite of software tools that allows us to observe what’s happening across the device, to remove or control security measures that Apple has installed, and so on.”

The researchers stress that there are practical reasons for wanting to have third parties assess Apple’s security claims.

“A lot of people interact with Apple’s tech on a daily basis,” Haas says. “And the way Apple wants to use its platforms is changing all the time. At some point, there’s value in having independent verification that Apple’s technology is doing what Apple says it is doing, and that its security measures are sound.”

“For example, we want to know the extent to which attacks that have worked against hardware flaws in other devices might work against Apple devices,” Aysu says.

It didn’t take the researchers long to demonstrate how useful their new toolkit is.

While conducting a proof-of-concept demonstration of the toolkit, the researchers reverse-engineered several key components of Apple’s hardware and identified a vulnerability to something they named an iTimed attack. It falls under the category of so-called “cache timing side channel attacks,” and effectively allows a program to gain access to cryptographic keys used by one or more programs on an Apple device. With the relevant keys, outside users would then be able to access whatever information the other affected program or programs on the device had access to.
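The specifics of the iTimed attack are in the paper itself; the short Python sketch below only illustrates the general principle behind timing side channels, using a deliberately non-constant-time string comparison rather than hardware cache behaviour. The secret, the artificial delay, and the function names are all hypothetical, and nothing here is Apple-specific.

```python
# General timing side-channel principle only: NOT the iTimed cache attack.
# The early-exit comparison takes slightly longer the more leading characters
# of a guess are correct, so timing alone leaks information about the secret.
import time

SECRET = "k3y-1234"  # hypothetical secret for the demo

def insecure_compare(guess: str, secret: str = SECRET) -> bool:
    # Early exit on the first mismatch makes running time data-dependent.
    for a, b in zip(guess, secret):
        if a != b:
            return False
        time.sleep(0.001)  # exaggerate the per-character cost for the demo
    return len(guess) == len(secret)

def time_guess(guess: str) -> float:
    start = time.perf_counter()
    insecure_compare(guess)
    return time.perf_counter() - start

# A wrong first character returns quickly; a correct prefix takes longer.
print(f"wrong prefix: {time_guess('x3y-0000'):.4f}s")
print(f"right prefix: {time_guess('k3y-0000'):.4f}s")
```

Cache timing attacks exploit the same idea at the hardware level: the time a memory access takes reveals whether the data was already in the cache, and that pattern can expose which secret-dependent memory locations another program touched.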

“We haven’t seen evidence of this attack in the wild yet, but we have notified Apple of the vulnerability,” Aysu says.

The NC State team is sharing much of the toolkit as an open-source resource for other security researchers.

“We also plan to use this suite of tools to explore other types of attacks so that we can assess how secure these devices are and identify things we can do to reduce or eliminate these vulnerabilities moving forward,” Aysu says.

The paper, “iTimed: Cache Attacks on the Apple A10 Fusion SoC,” is co-authored by Seetal Potluri, a postdoctoral researcher at NC State. The paper will be presented at the IEEE International Symposium on Hardware Oriented Security and Trust, which is being held Dec. 12-15 in Washington, D.C. The work was done primarily with support from the National Science Foundation under grant 1850373.

SHARE VACCINES WITH POOR COUNTRIES INSTEAD

The Lancet: Scientific evidence to date on COVID-19 vaccine efficacy does not support boosters for general population, expert review concludes


Peer-Reviewed Publication

THE LANCET

Peer reviewed / Review and opinion

An expert review by an international group of scientists, including some at the WHO and FDA, concludes that, even for the delta variant, vaccine efficacy against severe COVID is so high that booster doses for the general population are not appropriate at this stage in the pandemic.

The review, published in The Lancet, summarises the currently available evidence from randomised controlled trials and observational studies published in peer-reviewed journals and pre-print servers.

A consistent finding from the observational studies is that vaccines remain highly effective against severe disease, including that from all the main viral variants. Averaging the results reported from the observational studies, vaccination had 95% efficacy against severe disease both from the delta variant and from the alpha variant, and over 80% efficacy at protecting against any infection from these variants. Across all vaccine types and variants, vaccine efficacy is greater against severe disease than against mild disease (see figure, page 2).
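For readers unfamiliar with the term, vaccine efficacy is conventionally defined as the relative reduction in disease risk among vaccinated people compared with unvaccinated people. The short Python sketch below computes it from hypothetical attack rates; the case counts are illustrative inventions, not data from the review.

```python
# Vaccine efficacy = 1 - (risk in vaccinated / risk in unvaccinated).
# The case counts below are invented for illustration, not review data.

def vaccine_efficacy(cases_vax, n_vax, cases_unvax, n_unvax):
    risk_vax = cases_vax / n_vax
    risk_unvax = cases_unvax / n_unvax
    return 1 - risk_vax / risk_unvax

# Hypothetical cohorts: 5 severe cases per 100,000 vaccinated versus 100 per
# 100,000 unvaccinated corresponds to 95% efficacy against severe disease.
print(f"{vaccine_efficacy(5, 100_000, 100, 100_000):.0%}")
```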

Although vaccines are less effective against asymptomatic disease or against transmission than against severe disease, even in populations with high vaccination coverage the unvaccinated minority are still the major drivers of transmission, as well as being themselves at the highest risk of serious disease.

“Taken as a whole, the currently available studies do not provide credible evidence of substantially declining protection against severe disease, which is the primary goal of vaccination. The limited supply of these vaccines will save the most lives if made available to people who are at appreciable risk of serious disease and have not yet received any vaccine. Even if some gain can ultimately be obtained from boosting, it will not outweigh the benefits of providing initial protection to the unvaccinated. If vaccines are deployed where they would do the most good, they could hasten the end of the pandemic by inhibiting further evolution of variants,” says lead author Dr Ana-Maria Henao-Restrepo, WHO [1].

The authors note that even if levels of antibodies in vaccinated individuals wane over time, this does not necessarily predict reductions in the efficacy of vaccines against severe disease. This could be because protection against severe disease is mediated not only by antibody responses, which might be relatively short lived for some vaccines, but also by memory responses and cell-mediated immunity, which are generally longer-lived. If boosters are ultimately to be used, there will be a need to identify specific circumstances where the benefits outweigh the risks.

Even without any loss of vaccine efficacy, however, increasing success in delivering vaccines to large populations will inevitably lead to increasing numbers of vaccinated people, decreasing numbers of unvaccinated people, and hence an increasing proportion of all cases being breakthrough cases, especially if vaccination leads to behavioural changes in vaccinees. But, the ability of vaccines to elicit an antibody response against current variants indicates that these variants have not yet evolved to the point at which they are likely to escape the memory immune response induced by the vaccines.

Even if new variants that can escape the current vaccines are going to evolve, they are most likely to do so from strains that have already become widely prevalent. Therefore, the effectiveness of boosters developed specifically to match potential newer variants could be greater and longer lived than boosters using current vaccines. A similar strategy is used for influenza vaccines, for which each annual vaccine is based on the most current data about circulating strains, increasing the likelihood that the vaccine will remain effective even if there is further strain evolution.

“The vaccines that are currently available are safe, effective, and save lives. Although the idea of further reducing the number of COVID-19 cases by enhancing immunity in vaccinated people is appealing, any decision to do so should be evidence-based and consider the benefits and risks for individuals and society. These high-stakes decisions should be based on robust evidence and international scientific discussion,” adds co-author Dr Soumya Swaminathan, WHO Chief Scientist [1].

NOTES TO EDITORS

The review was conducted by authors at the Food and Drug Administration (USA), University of Washington (USA), University of Oxford (UK), University of Florida (USA), University of the West Indies (Jamaica), University of Bristol (UK), Universidad Nacional Autonoma de Mexico (Mexico), Wits Reproductive Health and HIV Institute (South Africa), Universite de Paris (France), the INCLEN Trust International (India) and the World Health Organisation (Switzerland).

WHO’s Strategic Advisory Group of Experts on Immunization (SAGE), which develops WHO’s immunisation policy, is actively reviewing all the evidence, including the data on this issue. This paper does not constitute a formal policy position for WHO.

[1] Quote direct from author and cannot be found in the text of the Viewpoint.

The labels have been added to this press release as part of a project run by the Academy of Medical Sciences seeking to improve the communication of evidence. For more information, please see: http://www.sciencemediacentre.org/wp-content/uploads/2018/01/AMS-press-release-labelling-system-GUIDANCE.pdf. If you have any questions or feedback, please contact The Lancet press office at pressoffice@lancet.com.

IF YOU WISH TO PROVIDE A LINK FOR YOUR READERS, PLEASE USE THE FOLLOWING, WHICH WILL GO LIVE AT THE TIME THE EMBARGO LIFTS: https://www.thelancet.com/pb-assets/Lancet/pdfs/S0140673621020468.pdf

THE BIOLOGICAL ORIGIN OF EPICUREANISM

Scientists discover that taste cells can control a whole animal’s foraging strategy

Peer-Reviewed Publication

UNIVERSITY OF LEEDS

Neuroscientists have developed a computer model to explain how a nematode worm searches for food, revealing that single brain cells can both sense the environment and control a whole animal’s foraging strategy.   

The study, involving a team of scientists from the University of Leeds and the Erasmus University Medical Centre in Rotterdam, involved the microscopic nematode species, Caenorhabditis elegans.   

In a paper published today in the journal Communications Biology, the researchers show that sensory cells in this animal are not only picking up signals from the environment, they are also processing that information to drive decision making that dictates the animal’s motion.   

Up to now, scientists had generally believed information from sensory cells was sent to other circuits in the animal’s brain for decision making and to control behaviour.   

Professor Netta Cohen, Computational Neuroscientist at the University of Leeds and co-lead author in the paper, said: “Our findings are startling – we found simple mechanisms by which salt tasting cells drive a rather sophisticated strategy to forage for food.”  

Foraging pattern  

The species C. elegans feeds off bacteria in patches of rotting vegetation in soils. These patches of food are likely to vary in size and be some distance apart, with the result that worm colonies undergo a “boom or bust” existence. For an individual animal, successfully foraging for food is a matter of life or death.   

To increase its chances of survival, the worm has evolved a foraging strategy where it will randomly criss-cross an area in search of food: if it finds no food, the animal will move away in search of other areas with possible food.  

The researchers performed experiments and developed a model which explains how the worm’s taste sensors process information from the environment to direct its foraging behaviour.   

They believe the worm uses its taste of salt in the soil as “navigation beacons”, moving towards them and then, if food is not found, away from them.   

Sensory cells attracted to salt  

The nervous system of C. elegans contains 302 cells, including two taste cells that are stimulated by the presence of salt. These two sensory cells respond differently: one is stimulated by increasing salt levels, the other by decreasing salt levels.   

The starting point for this study was the discovery by researchers led by Dr Gert Jansen in Rotterdam that when one of these cells is active, the other is “asleep”.   

Professor Cohen said: “When a nematode first senses a salty environment, the sensory cell that is sensitive to increasing salt concentrations is stimulated - and provides all the information the animal needs to steer into the salt patch.”    

But if the animal does not locate food after a few minutes, the sensory cell becomes de-sensitised. Meanwhile the other taste cell, stimulated by decreasing salt levels, becomes active, inducing sharp turns that help keep the animal on the salt. The result is that the animal preferentially explores larger patches of salt.    

Both the sensory cells work to keep the worm foraging on a salt patch. But what happens if it fails to find food? Dr Gert Jansen and his group discovered that two additional sensory cells are recruited to the salt sensing circuit when animals are exposed to salt.  

It was initially thought that these additional sensory cells alerted the worm to dangers in the environment, allowing it to abruptly change direction and get out of harm’s way. But the study has revealed that these harm-avoidance cells also toggle on and off as part of its navigation strategy, allowing it to sharply change direction to avoid the salt, thus extending its foraging range.  

Over time, all the sensors continue to cycle between their on and off states, in this way controlling a rich and dynamic foraging strategy.
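The published model is considerably more detailed; as a rough, hypothetical illustration of the toggle described above, the Python sketch below simulates two salt sensors, one driven by rising and one by falling salt concentration, with the ‘rising’ sensor desensitising after a fixed time so that the ‘falling’ sensor takes over and triggers sharp turns. The time constant, thresholds, and behaviour labels are invented for the example.

```python
# Hypothetical toy version of the toggle described above (not the published
# model): an "ON" sensor responds to rising salt and desensitises after a
# while; an "OFF" sensor responds to falling salt and triggers sharp turns.

DESENSITISE_AFTER = 120.0  # seconds until the ON sensor falls silent (assumed)

def behaviour(t, d_salt, found_food):
    """Return the worm's manoeuvre at time t given the recent change in salt."""
    if found_food:
        return "dwell on food"
    on_active = t < DESENSITISE_AFTER           # ON sensor not yet desensitised
    if on_active and d_salt > 0:
        return "steer up the salt gradient"     # ON sensor drives attraction
    if d_salt < 0:
        return "sharp turn back onto the salt"  # OFF sensor keeps worm on patch
    return "random criss-cross search"

for t, d_salt in [(10, +0.2), (60, -0.1), (200, +0.2), (200, -0.1)]:
    print(f"t={t:>3}s, salt change {d_salt:+.1f}: {behaviour(t, d_salt, False)}")
```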

Professor Cohen said: “We think this is a mechanism built into these sensory cells. Not only is it remarkably effective, but surprisingly, because it all takes place inside the sensors, it is very easy to implement with the basic toolkit that nearly all brain cells have at their disposal.   

“While C. elegans may use salt cues to forage for food, we suspect similar mechanisms may be used by other animals to selectively attend to other cues or features of the environment.”   

The study was funded by the Engineering and Physical Sciences Research Council and the Center for Biomedical Genetics, the Royal Netherlands Academy of Sciences.   

Earlier this year, Professor Cohen’s team at the University of Leeds revealed in a paper published in the journal Nature that they had mapped the physical organisation of the brain of C. elegans.

 END

Multimedia: a time-lapse graphic showing the foraging behaviour of C.elegans based on a computer-generated model of the way four sensory cells dictate the search for food can be downloaded from the following link: download animation (https://leeds365-my.sharepoint.com/:v:/g/personal/extdgl_leeds_ac_uk/EaBSoVulNmVHpHfz97IFIoYBvIbDgHy-_EhV7AFnHYusJg?e=iIFoxd) PLEASE CREDIT:  University of Leeds

 

Eating peanuts may lower risk of ischemic stroke, cardiovascular disease among Asians


Stroke Journal Report

Peer-Reviewed Publication

AMERICAN HEART ASSOCIATION

DALLAS, Sept. 9, 2021 — Asian men and women living in Japan who ate peanuts (on average 4-5 peanuts/day) had a lower risk of having an ischemic stroke or a cardiovascular disease event compared to those who did not eat peanuts, according to new research published today in Stroke, a journal of the American Stroke Association, a division of the American Heart Association.

While previous studies have linked peanut consumption with improved cardiovascular health among Americans, researchers in this study specifically examined the link between peanut consumption and the incidence of different types of stroke (ischemic and hemorrhagic) and cardiovascular disease events (such as stroke and ischemic heart disease) among Japanese men and women.

“We showed for the first time a reduced risk for ischemic stroke incidence associated with higher peanut consumption in an Asian population,” said lead study author Satoyo Ikehara, Ph.D., specially appointed associate professor of public health in the department of social medicine at the Osaka University Graduate School of Medicine in Suita, Japan. “Our results suggest that adding peanuts to your diet has a beneficial effect on the prevention of ischemic stroke.”

Peanuts are rich in heart-healthy nutrients, such as “monounsaturated fatty acids, polyunsaturated fatty acids, minerals, vitamins and dietary fiber that help lower risk of cardiovascular disease by reducing risk factors, including high blood pressure, high blood levels of ‘bad’ cholesterol and chronic inflammation,” Ikehara said.

Researchers examined the frequency of how often people reported eating peanuts in relation to stroke occurrence and cardiovascular disease. The analysis includes people who were recruited in two phases, in 1995 and 1998-1999, for a total of more than 74,000 Asian men and women, ages 45 to 74, from the Japan Public Health Center-based Prospective Study. Participants completed a comprehensive lifestyle survey, which included a questionnaire about the frequency of peanut consumption. They were followed for approximately 15 years – through 2009 or 2012, depending on when they were originally enrolled.

The incidence of stroke and ischemic heart disease was determined through record linkage with 78 participating hospitals in the areas included in the study.

Researchers adjusted for other health conditions, smoking, diet, alcohol consumption and physical activity, as detailed by participants in the questionnaires. According to medical records, researchers noted 3,599 strokes (2,223 ischemic and 1,376 hemorrhagic) and 849 cases of ischemic heart disease developed during the follow-up period.

The levels of peanut consumption were ranked in four quartiles, with 0 peanuts a day as the least intake compared to 4.3 unshelled peanuts a day (median) as the highest. Compared to a peanut-free diet, researchers found eating about 4-5 unshelled peanuts per day was associated with:

  • 20% lower risk of ischemic stroke;
  • 16% lower risk of total stroke; and
  • 13% lower risk of having cardiovascular disease (this included both stroke and ischemic heart disease).
  • A significant association was not found between peanut consumption and a lower risk of hemorrhagic stroke or ischemic heart disease.

The link between peanut consumption and lowered risk of stroke and cardiovascular disease was consistent in both men and women.

“The beneficial effect of peanut consumption on risk of stroke, especially ischemic stroke was found, despite the small quantity of peanuts eaten by study participants,” Ikehara said. “The habit of eating peanuts and tree nuts is still not common in Asian countries. However, adding even a small amount to one’s diet could be a simple yet effective approach to help reduce the risk of cardiovascular disease.”

The American Heart Association recommends eating about five servings of unsalted nuts per week; one serving is ½ ounce (2 tablespoons) of nuts. Besides peanuts, the Association also says other healthy nut options include unsalted cashews, walnuts, pecans, macadamia nuts and hazelnuts.

Several limitations were noted in the study, including the validity and reliability of peanut consumption measurements in the data collection and analysis. Bias caused by these measurements may lead to errors in the association. However, a measurement error correction analysis was performed, and the associations proved to be accurate.

Co-authors are Hiroyasu Iso, M.D., Ph.D.; Yoshihiro Kokubo, M.D., Ph.D., FAHA; Kazumasa Yamagishi, M.D., Ph.D.; Isao Saito, M.D., Ph.D.; Hiroshi Yatsuya, M.D., Ph.D.; Takashi Kimura, Ph.D.; Norie Sawada, M.D., Ph.D.; Motoki Iwasaki, M.D., Ph.D.; and Shoichiro Tsugane, M.D., Ph.D.

The study was funded by the National Cancer Center Research and Development Fund and a Grant-in-Aid for Cancer Research from the Ministry of Health, Labor and Welfare of Japan.

Additional Resources:

Statements and conclusions of studies published in the American Heart Association’s scientific journals are solely those of the study authors and do not necessarily reflect the Association’s policy or position. The Association makes no representation or guarantee as to their accuracy or reliability. The Association receives funding primarily from individuals; foundations and corporations (including pharmaceutical, device manufacturers and other companies) also make donations and fund specific Association programs and events. The Association has strict policies to prevent these relationships from influencing the science content. Revenues from pharmaceutical and biotech companies, device manufacturers and health insurance providers and the Association’s overall financial information are available here.

About the American Stroke Association

The American Stroke Association is a relentless force for a world with fewer strokes and longer, healthier lives. We team with millions of volunteers and donors to ensure equitable health and stroke care in all communities. We work to prevent, treat and beat stroke by funding innovative research, fighting for the public’s health, and providing lifesaving resources. The Dallas-based association was created in 1998 as a division of the American Heart Association. To learn more or to get involved, call 1-888-4STROKE or visit stroke.org. Follow us on Facebook and Twitter.

###

 

Between hope and reservations: Study on what students expect of remote learning


Peer-Reviewed Publication

UNIVERSITY OF COLOGNE

Among the hopes and fears students associated with the switch to remote teaching and learning at the beginning of the COVID-19 pandemic, negative expectations slightly outweighed positive ones. That is the result of a study conducted by Thomas Hoss, Amancay Ancina, and Professor Kai Kaspar from the University of Cologne’s Psychology Department during the first nationwide lockdown in Germany. The researchers focused on students’ expectations regarding the risks and opportunities associated with this challenging situation. The paper ‘Forced Remote Learning During the COVID-19 Pandemic in Germany: A Mixed-Methods Study on Students’ Positive and Negative Expectations’ has appeared in Frontiers in Psychology.

Beginning in the spring of 2020, universities around the world had to convert their lectures and seminars from classroom teaching to virtual formats within a very short period of time – often without sufficient preparation or educational concepts. Faculty and students were forced to adapt their familiar teaching and learning routines. The survey reveals that students had both positive and negative expectations of this shift to an e-learning semester during the first lockdown in April and May 2020. ‘Their assessment is so valuable because, looking back after three online semesters so far, it allows us to better understand which of the expected effects actually took hold and are still relevant’, said Kaspar. Revisiting the beginning of the pandemic is therefore not only interesting for determining the status quo; it also provides a knowledge base on which measures for the future can build. For the next teaching term, it is still uncertain whether, and if so to what degree, teaching can take place in the classroom again.

For the study, a group of 584 students in different degree programmes, from different universities, and in different phases of their courses provided more than 3,800 statements on their positive and negative expectations for the upcoming online semester. They also rated these expectations according to personal relevance. At 57.7 per cent, negative expectations outweighed positive ones. Moreover, on average the negative expectations were seen as more relevant than the positive ones. Individual study phases also made a difference: master’s degree students attributed higher relevance to their negative expectations than bachelor’s degree students. ‘This is probably because master’s students have already developed study routines, so they perceived the abrupt change as more challenging’, said Thomas Hoss. Kai Kaspar added: ‘We also found that those expectations that came to mind first were, on average, rated as more personally significant than any expectations mentioned thereafter. This was true for both negative and positive expectations. This cognitive effect shows that we should also pay attention to the order in which interviewees express things.’

Among the most frequently voiced negative effects were: the concern that the quality of teaching and learning could deteriorate, including performance assessment; an expected decrease in social interaction and communication with other students as well as decreased interaction with faculty and university staff; reduced feedback and support services. In addition, many students experienced uncertainty due to a lack of information, while also fearing an increase in the required time and workload along with a decrease in their study motivation. ‘Many students also reported negative expectations regarding their degree programme and feared an increase in the required study time. The closing of key university facilities and services, and thus more difficult access to resources, were also frequently mentioned negative aspects’, said Amancay Ancina.

In contrast, increased flexibility in time and work management, in receiving and processing course materials, and in choosing where to learn were frequently cited as positive expectations. ‘While more individual freedom was associated with online learning, students also frequently voiced the concern that they might not have sufficient self-regulation, self-discipline, and self-organization skills,’ Professor Kaspar said. Many students also reported the hope that forced remote learning would save time, e.g. because long commutes to the university were no longer necessary. Positive work–life balance effects were also frequently mentioned, although a smaller group of students feared that it would become difficult to draw a line between studies and private life. Through intensive use of digital technologies, many students also hoped that their media competency would improve in the long term, and that the digitalization of materials and procedures would progress at universities and in society.

In summary, the study results paint a very detailed and mixed picture of sentiment among students at the onset of the pandemic. These data are an important basis for the research team to critically evaluate the status quo after currently three almost completely virtual semesters at German universities: ‘The social isolation already feared by many at that time due to the lack of physical contact opportunities became a reality for students, faculty, and staff at universities for many months,’ Kaspar concluded. He and his team now want to clarify the following questions: To what extent has the creative use of digital communication tools replaced face-to-face interaction at least to some degree? Has the quality of teaching and learning actually deteriorated, as many students feared? Which basic skills do universities need to convey to provide students with the necessary self-regulation competency to successfully make use of the greater flexibility provided by online teaching and learning? Last but not least: What kinds of offers actually promote the hoped-for increase in media competency? On the basis of these questions, the scientists want to start a conversation on how to improve virtual teaching and learning at universities to make them fit for the future.

###
