Sunday, August 22, 2021

 

“Before the Big Bang” –Vestiges of a Prior Universe? (Weekend Feature)

 


 

“Eliminating the singularity or Big Bang– which has its origins in the late 1920s when US astronomer Edwin Hubble discovered that almost all galaxies are moving away from each other at ever-faster velocities– brings back the bouncing Universe on to the theoretical stage of cosmology. The absence of a singularity at the start of spacetime opens up the possibility that vestiges of a previous contraction phase may have withstood the phase change (between contraction and expansion) and may still be with us in the ongoing expansion of the Universe,” said Brazilian physicist Juliano Cesar Silva Neves.

A Universe Prior to the Big Bang

Although the Big Bang theory has been the best-known and most widely accepted explanation for the beginning and evolution of the Universe for five decades, it is hardly a consensus among scientists. Physicists are now considering the possibility of vestiges of a Universe that preceded the Big Bang.

The Big Bounce

“The idea of a ‘big bounce’ has existed in some form or another for decades,” writes Jonas Mureika, theoretical physicist and physics chair at Loyola Marymount University and the Kavli Institute for Theoretical Physics, UC Santa Barbara, in an extraordinary email to The Daily Galaxy. “The basic idea is to avoid the singularity at t=0 (time of the Big Bang) and r=0 (where all matter and energy were compressed into an infinitesimally small volume), since our mathematical description of spacetime (and thus our favorite theories) break down at that point. This is similar to problems in black hole physics, which itself has similar fixes to remove the singularity.”

“The Quantum Fix”

“The crux of the problem,” observes Mureika, “is that our description of this physics is classical, i.e. a prediction of General Relativity, and that’s why singularities arise. The theories just don’t work in that limit. It is most likely the case, however, that the physics governing the realm of classical singularities — extremely small and extremely high energy — is quantum in nature. So the rules change in some fascinating ways, and the new physics this introduces allows it all to make sense.”

“When classical physics breaks down, we look to replace the broken parts with a quantum fix. If the singularity is at r=0, then one of the ways we can avoid this is to not let the physics act at r=0. That is, we impose a minimal length (usually the Planck length, but not always) below which the universe can’t be ‘probed’. That removes the infinites that plague the singularity and allows our theories to behave well. In a ‘big bounce’ scenario, the quantum pressures at this minimum length basically stop the implosion of the universe and allow it to re-expand. Again, similar ideas exist with black holes, called Planck stars.”
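For reference, the Planck length Mureika mentions is the scale built from the fundamental constants (a standard definition, not specific to any one bounce model):

\[ \ell_P = \sqrt{\frac{\hbar G}{c^3}} \approx 1.6 \times 10^{-35}\ \text{m} \]

Below this scale, the argument goes, spacetime can no longer be meaningfully probed, so the divergences that define the classical singularity never arise.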

Roger Penrose’s Conformal Cyclic Cosmology (CCC)

“Another approach is to change our notion of the structure of spacetime itself,” explains Mureika, “and how it behaves in the small and large limits. This is embodied in Nobel Laureate Roger Penrose’s 2012 Conformal Cyclic Cosmology (CCC) framework, in which the very small limit of the universe (singularity) is identical to the very large limit (‘infinite’ accelerated expansion). This is done by a conformal transformation on the spacetime metric (the thing that defines straight lines and shortest distances), which is a fancy way of saying we stretch and bend spacetime while preserving certain geometric features (e.g. angles). We now know the universe is indeed going through a phase of accelerated expansion, so this adds weight to Penrose’s idea and kills previous ones (i.e. the universe doesn’t contract, so it can’t ‘bounce’ in the ways previously thought). This allows for the universe to be ‘reborn’ as it expands indefinitely, so there is a repeating cycle of big bangs. Penrose calls these cycles ‘aeons’.”
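The conformal transformation Mureika refers to has a compact textbook form: the metric is rescaled pointwise by a positive function (this is the standard general relativity definition, not Penrose’s full construction):

\[ \tilde{g}_{\mu\nu}(x) = \Omega^2(x)\, g_{\mu\nu}(x), \qquad \Omega(x) > 0 \]

Because all lengths at a point are stretched by the same factor \( \Omega(x) \), angles between directions are unchanged, which is what lets CCC identify the infinitely expanded end of one aeon with the big bang of the next.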

“CMB” Fossils

“Of course,” Mureika concludes, “a theory is only as good as its experimental verification, so the challenge is to detect tell-tale fingerprints of these models. The observational go-to for early universe cosmology is the Cosmic Microwave Background (CMB), which represents an imprint of the earliest time we can see. It’s believed that the CMB will contain information from earlier times, including the big bang (if it happened). These will manifest themselves as e.g. geometric signatures, patterns in temperature fluctuations, over/underdensities of clustering, etc. Detecting any such signature would be a monumental discovery, and will help quantum gravity and cosmology research shape their future paths.”

Brian Keating’s Deep Dive

“In contrast to inflation,” observes Brian Keating, Chancellor’s Distinguished Professor of Physics at UC San Diego, author of Losing the Nobel Prize, and host of the INTO THE IMPOSSIBLE Podcast, in an email to The Daily Galaxy, “there are several possible alternatives to the singularity featured in most versions of the Big Bang theory. Two of the most prominent alternatives to the singular Big Bang are the Bouncing model of Anna Ijjas and Paul Steinhardt and the Conformal Cyclic Cosmology (CCC) of Sir Roger Penrose. Both of these share the feature that they do not produce so-called ‘primordial B-mode polarization’ patterns, the result of relic gravitational waves produced in most models of cosmic Inflation, which also features a concomitant spacetime singularity. In that sense, both the Bouncing and CCC models are falsifiable, e.g. if current or future B-mode polarization experiments like BICEP Array or the Simons Observatory were to detect and confirm primordial B-modes, these alternatives would be disproven. Many cosmologists find the falsifiability of these models, in contrast to the inflationary Multiverse, strongly appealing.”

“Hawking Points”

“The CCC model also predicts the presence of so-called ‘Hawking Points’,” explains Keating, “regions of concentrated energy caused by coalescing black holes from preceding ‘aeons’ which would, according to Penrose and collaborators, be evidence supporting the cyclic model. Evidence for Hawking Points from the ESA’s Planck satellite has been claimed already, but those claims are disputed by members of the Planck team. Upcoming experiments like BICEP Array and the Simons Observatory will be able to rule out or confirm evidence for Hawking Points, which would be tantamount to evidence for the CCC model.”

 

Mirroring the “Bouncing Universe” model, Neves is part of a group of researchers who dare to imagine a different origin. In a study published in the journal General Relativity and Gravitation, he proposes eliminating a key aspect of the standard cosmological model, the need for the spacetime singularity known as the Big Bang, and argues that the current expansion phase was preceded by contraction.

Challenging the Idea that Time had a Beginning

In raising this possibility, Neves challenges the idea that time had a beginning and reintroduces the possibility that the current expansion was preceded by contraction. “I believe the Big Bang never happened,” said the physicist, a researcher at the University of Campinas’s Mathematics, Statistics & Scientific Computation Institute (IMECC-UNICAMP) in Sao Paulo State, Brazil.

For Neves, the fast spacetime expansion stage does not exclude the possibility of a prior contraction phase. Moreover, the switch from contraction to expansion may not have destroyed all traces of the preceding phase.

Introducing the “Scale Factor”

The article, which reflects the work developed under the Thematic Project “Physics and geometry of spacetime”, considers the solutions to the general relativity equations that describe the geometry of the cosmos and then proposes the introduction of a “scale factor” that makes the rate at which the Universe is expanding depend not only on time but also on cosmological scale.

“In order to measure the rate at which the Universe is expanding with the standard cosmology, the model in which there’s a Big Bang, a mathematical function is used that depends only on cosmological time,” said Neves, who developed the idea with Alberto Vazques Saa, a professor at IMECC-UNICAMP and also the supervisor for Neves’ postdoctoral project, funded by the Sao Paulo Research Foundation – FAPESP.

With the scale factor, the Big Bang itself, or cosmological singularity, ceases to be a necessary condition for the cosmos to begin universal expansion. Singularity, a concept from mathematics that expresses indefiniteness, was adopted by cosmologists to characterize the “primordial cosmological singularity” of 13.8 billion years ago, when all the matter and energy of the Universe were compressed into an initial state of infinite density and temperature, in which the traditional laws of physics no longer apply.
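As a toy illustration of what a non-singular scale factor can look like (an assumed pedagogical form, not Neves and Saa’s actual function), consider

\[ a(t) = a_b \sqrt{1 + \left(\frac{t}{\tau}\right)^2}, \qquad H(t) = \frac{\dot{a}}{a} = \frac{t}{\tau^2 + t^2} \]

Here \( a_b > 0 \) is the minimum size of the Universe at the bounce \( t = 0 \): the Hubble parameter \( H \) is negative before the bounce (contraction) and positive after it (expansion), and nothing diverges at \( t = 0 \).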

From the 1940s onward, scientists guided by Einstein’s theory of general relativity constructed a detailed model of the evolution of the Universe since the Big Bang. This model allows three possible outcomes: infinite expansion of the Universe at ever-higher velocities; permanent stagnation of the expansion; or a reversal of the process, a contraction driven by the gravitational attraction exerted by the mass of the Universe, known as the Big Crunch.

Neves notes that “bouncing cosmology” is rooted in the hypothesis that the Big Crunch would give way to an eternal succession of universes: the extreme conditions of density and temperature would trigger a new inversion of the process, giving way to expansion in another bounce.

Black-Hole Fossils

Black holes are the starting point of Neves’ investigations about the “Bouncing Universe”. “Who knows, there may be remains of black holes in the ongoing expansion that date from the prior contraction phase and passed intact through the bottleneck of the bounce,” he said.

Formed from the imploded core left behind when a giant star explodes, a black hole is a cosmic object whose core has contracted into a singularity, a point of infinite density with the strongest gravitational attraction known to exist. Nothing escapes it, not even light.

According to Neves, a black hole is not defined by singularity, but rather by an event horizon, a membrane that indicates the point of no return from which nothing escapes the inexorable destiny of being swallowed up and destroyed by the singularity.

The Illustris simulation, which visualizes the universe under the standard Big Bang model, is the most ambitious computer simulation of its kind yet performed. The calculation tracks the expansion of the universe, the gravitational pull of matter onto itself, the motion of cosmic gas, and the formation of stars and black holes.

 

 

“Outside the event horizon of a regular black hole, there are no major changes, but inside it, the changes are deep-seated. There’s a different spacetime that avoids the formation of a singularity,” Neves said.

The scale factor formulated by Neves and Saa was inspired by US physicist James Bardeen. In 1968, Bardeen used a mathematical trick to modify the solution to the general relativity equations that describe black holes.

The trick consisted of thinking of the mass of a black hole not as a constant, as had previously been the case, but as a function that depends on the distance to the center of the black hole. With this change, a different black hole, termed a regular black hole, emerged from the solution to the equations. “Regular black holes are permitted, since they don’t violate general relativity. The concept isn’t new and has frequently been revisited in recent decades,” said Neves.
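Bardeen’s construction can be stated compactly. In units where \( G = c = 1 \), the constant mass \( M \) of the Schwarzschild solution is replaced by a radius-dependent function with a length parameter \( g \):

\[ m(r) = \frac{M r^3}{\left(r^2 + g^2\right)^{3/2}}, \qquad f(r) = 1 - \frac{2\,m(r)}{r} = 1 - \frac{2 M r^2}{\left(r^2 + g^2\right)^{3/2}} \]

Far from the hole (\( r \gg g \)), \( m(r) \to M \) and the Schwarzschild behavior is recovered; near the center, \( f(r) \approx 1 - (2M/g^3)\, r^2 \), a regular de Sitter-like core rather than a singularity.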

Since the insertion of a mathematical trick into the general relativity equations could prevent the formation of singularities in regular black holes, Neves considered creating a similar artifice to eliminate the singularity in a regular bounce.

In science, as philosopher Karl Popper argued, a theory is worthless if it cannot be tested, however beautiful and inspiring it may be. How do you test the hypothesis of a Big Bang that did not start with a singularity?

“By looking for traces of the events in a contraction phase that may have remained in the ongoing expansion phase. What traces? The candidates include remnants of black holes from a previous phase of universal contraction that may have survived the bounce,” Neves said.

The Daily Galaxy, Maxwell Moe, astrophysicist, NASA Einstein Fellow, University of Arizona, via Brian Keating, Jonas Mureika, and the Sao Paulo Research Foundation (FAPESP)

Image credit: Shutterstock License

A HUNDRED & SEVENTY YEARS AFTER THE FACT

Evolution is now accepted by a majority of Americans

Credit: CC0 Public Domain

The level of public acceptance of evolution in the United States is now solidly above the halfway mark, according to a new study based on a series of national public opinion surveys conducted over the last 35 years.

"From 1985 to 2010, there was a statistical dead heat between acceptance and rejection of ," said lead researcher Jon D. Miller of the Institute for Social Research at the University of Michigan. "But acceptance then surged, becoming the majority position in 2016."

Examining data over 35 years, the study consistently identified aspects of education—civic science literacy, taking college courses in science and having a college degree—as the strongest factors leading to the acceptance of evolution.

"Almost twice as many Americans held a college degree in 2018 as in 1988," said co-author Mark Ackerman, a researcher at Michigan Engineering, the U-M School of Information and Michigan Medicine. "It's hard to earn a college degree without acquiring at least a little respect for the success of ."

The researchers analyzed a collection of biennial surveys from the National Science Board, several national surveys funded by units of the National Science Foundation, and a series focused on adult civic literacy funded by NASA. Beginning in 1985, these national samples of U.S. adults were asked to agree or disagree with this statement: "Human beings, as we know them today, developed from earlier species of animals."

The series of surveys showed that Americans were evenly divided on the question of evolution from 1985 to 2007. According to a 2005 study of the acceptance of evolution in 34 developed nations, led by Miller, only Turkey, at 27%, scored lower than the United States. But over the last decade, until 2019, the percentage of American adults who agreed with this statement increased from 40% to 54%.

The current study consistently identified religious fundamentalism as the strongest factor leading to the rejection of evolution. While their numbers declined slightly in the last decade, approximately 30% of Americans continue to be religious fundamentalists as defined in the study. But even those who scored highest on the scale of religious fundamentalism shifted toward acceptance of evolution, rising from 8% in 1988 to 32% in 2019.

Miller predicted that religious fundamentalism would continue to impede the public acceptance of evolution.

"Such beliefs are not only tenacious but also, increasingly, politicized," he said, citing a widening gap between Republican and Democratic acceptance of evolution. 

As of 2019, 34% of conservative Republicans accepted evolution compared to 83% of liberal Democrats.

The study is published in the journal Public Understanding of Science.

More information: Jon D. Miller et al, Public acceptance of evolution in the United States, 1985–2020, Public Understanding of Science (2021). DOI: 10.1177/09636625211035919
Journal information: Public Understanding of Science 
Provided by University of Michigan 

 

Scientists Detect Tens of Thousands of Different Molecules in Beer – 80% Not Yet Described in Chemical Databases


Study used modern high resolution analytics to reveal enormous metabolic complexity of beer.

The tradition of beer brewing dates back to at least 7000 BCE and maybe even to the invention of agriculture, considering that most cereals can spontaneously ferment if exposed to airborne yeasts. The code of the Babylonian king Hammurabi (ruled 1792 to 1750 BCE), whose laws 108 through 111 regulate beer sales, shows that people have been anxious to safeguard the quality of beer through legislation for millennia. For example, the Bavarian ‘Reinheitsgebot’ (‘Purity Law’) of 1516, often considered the world’s oldest food regulation still in force (with modifications), allows only barley, water, and hops as ingredients for brewing beer, with confiscation of the barrels as the penalty for transgression.

Now, in a recent study in Frontiers in Chemistry, the science of beer is taken to a new level. Scientists from Germany use state-of-the-art analytical methods to reveal the metabolic complexity – tens of thousands of different molecules – of commercial beers from around the world.

Enormous chemical complexity

“Beer is an example of enormous chemical complexity. And thanks to recent improvements in analytical chemistry, comparable in power to the ongoing revolution in the technology of video displays with ever-increasing resolution, we can reveal this complexity in unprecedented detail. Today it’s easy to trace tiny variations in chemistry throughout the food production process, to safeguard quality or to detect hidden adulterations,” said corresponding author Prof Philippe Schmitt-Kopplin, head of the Comprehensive Foodomics Platform at the Technical University of Munich and of the Analytical BioGeoChemistry research unit at the Helmholtz Center in Munich.

Schmitt-Kopplin and colleagues used two powerful methods – direct infusion Fourier transform ion cyclotron resonance mass spectrometry (DI-FTICR MS) and ultra-performance liquid chromatography quadrupole time-of-flight mass spectrometry (UPLC-ToF-MS) – to reveal the full range of metabolites in 467 types of beer brewed in the US, Latin America, Europe, Africa and East Asia. These included lagers, craft and abbey beers, top-fermented beers, and gueuzes, brewed either from barley as the only source of starch for fermentation or from barley plus wheat, rice, or corn (maize).

The methods have complementary strengths. DI-FTICR-MS directly revealed the chemical diversity across all beers and predicted chemical formulas for their metabolite ions. The authors then used UPLC-ToF-MS on a subset of 100 beers to resolve possible isomers: chromatography first separates ions with identical masses, and fragmenting those ions into daughter ions makes it possible to infer the exact molecular structure.
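To make the formula-assignment idea concrete, here is a minimal sketch, in Python, of the core step in non-targeted exact-mass annotation: comparing a measured mass against the theoretical monoisotopic masses of candidate formulas within a parts-per-million tolerance. The candidate list and tolerance are illustrative assumptions, not the authors’ actual pipeline.

```python
# Illustrative sketch: match a measured exact mass to candidate chemical
# formulas within a ppm tolerance -- the core idea behind formula
# assignment in non-targeted FTICR-MS metabolomics. NOT the authors'
# pipeline; the masses below are standard monoisotopic atomic masses.

MONOISOTOPIC = {"C": 12.0, "H": 1.0078250319, "N": 14.0030740052,
                "O": 15.9949146221, "S": 31.97207069}

def formula_mass(formula: dict) -> float:
    """Monoisotopic mass of a formula given as e.g. {'C': 6, 'H': 12, 'O': 6}."""
    return sum(MONOISOTOPIC[el] * n for el, n in formula.items())

def ppm_error(measured: float, theoretical: float) -> float:
    return (measured - theoretical) / theoretical * 1e6

def match_formulas(measured_mass: float, candidates: list, tol_ppm: float = 0.5):
    """Return candidate formulas whose theoretical mass lies within tol_ppm."""
    hits = []
    for formula in candidates:
        theo = formula_mass(formula)
        err = ppm_error(measured_mass, theo)
        if abs(err) <= tol_ppm:
            hits.append((formula, theo, err))
    return hits

# Example: a neutral mass near glucose (C6H12O6, ~180.0634 Da) matches
# the first candidate and rejects the second.
candidates = [{"C": 6, "H": 12, "O": 6}, {"C": 7, "H": 16, "O": 5}]
print(match_formulas(180.06339, candidates))
```

Mass accuracies well below 1 ppm, which FTICR instruments typically deliver, are what make such assignments feasible across thousands of ions at once.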

The authors then mapped these metabolites within ‘chemical space’, linking each to one or more others through a single reaction, for example the addition of a methoxy-, hydroxyl-, sulfate-, or sugar-group to the molecular backbone, or the saturation of an unsaturated bond. This yielded a reconstructed metabolite network of nearly a hundred steps leading to the final product, starting from molecules in the original cereals synthesized from the amino acid tryptophan. Derived from these are secondary metabolites, unique to each cereal.

Powerful method for quality control

“Our mass spectrometry method, which takes only 10 minutes per sample, should be very powerful for quality control in the food industry and set the basis for the novel molecular markers and non-targeted metabolite profiles needed in foodstuff inspection,” said Schmitt-Kopplin.

The authors found approximately 7700 ions with unique masses and formulas, including lipids, peptides, nucleotides, phenolics, organic acids, phosphates, and carbohydrates, of which around 80% aren’t yet described in chemical databases. Because each formula may in some cases cover up to 25 different molecular structures, this translates into tens of thousands of unique metabolites.

“Here we reveal an enormous chemical diversity across beers, with tens of thousands of unique molecules. We show that this diversity originates in the variety of raw materials, processing, and fermentation. The molecular complexity is then amplified by the so-called ‘Maillard reaction’ between amino acids and sugars which also gives bread, meat steaks, and toasted marshmallow their ‘roasty’ flavor. This complex reaction network is an exciting focus of our research, given its importance for food quality, flavor, and also the development of novel bioactive molecules of interest for health,” concluded first author Stefan Pieczonka, a PhD student at the Technical University of Munich.

Reference: “On the Trail of the German Purity Law: Distinguishing the Metabolic Signatures of Wheat, Corn and Rice in Beer” by Stefan A. Pieczonka, Sophia Paravicini, Michael Rychlik and Philippe Schmitt-Kopplin, 20 July 2021, Frontiers in Chemistry.
DOI: 10.3389/fchem.2021.715372

 

Historical Timeline Is Inaccurate: Advanced Radiocarbon Dating Reveals Machu Picchu Is Older Than Expected


Machu Picchu, Peru.

Machu Picchu, the famous 15th-century Inca site in southern Peru, is up to several decades older than previously thought, according to a new study led by Yale archaeologist Richard Burger.

Burger and researchers from several U.S. institutions used accelerator mass spectrometry (AMS) — an advanced form of radiocarbon dating — to date human remains recovered during the early 20th century at the monumental complex and onetime country estate of Inca Emperor Pachacuti located on the eastern face of the Andes Mountains.

Their findings, published in the journal Antiquity, reveal that Machu Picchu was in use from about A.D. 1420 to A.D. 1530 — ending around the time of the Spanish conquest — making the site at least 20 years older than the accepted historical record suggests and raising questions about our understanding of Inca chronology.

Historical sources dating from the Spanish invasion of the Inca Empire indicate that Pachacuti seized power in A.D. 1438 and subsequently conquered the lower Urubamba Valley where Machu Picchu is located. Based on those records, scholars have estimated that the site was built after A.D. 1440, and perhaps as late as A.D. 1450, depending on how long it took Pachacuti to subdue the region and construct the stone palace.

The AMS testing indicates that the historical timeline is inaccurate.


“Until now, estimates of Machu Picchu’s antiquity and the length of its occupation were based on contradictory historical accounts written by Spaniards in the period following the Spanish conquest,” said Burger, the Charles J. MacCurdy Professor of Anthropology in Yale’s Faculty of Arts and Sciences. “This is the first study based on scientific evidence to provide an estimate for the founding of Machu Picchu and the length of its occupation, giving us a clearer picture of the site’s origins and history.”

The finding suggests that Pachacuti, whose reign set the Inca on the path to becoming pre-Columbian America’s largest and most powerful empire, gained power and began his conquests decades earlier than textual sources indicate. As such, it has implications for people’s wider understanding of Inca history, Burger said.


Credit: Photo courtesy Yale University

“The results suggest that the discussion of the development of the Inca empire based primarily on colonial records needs revision,” he said. “Modern radiocarbon methods provide a better foundation than the historical records for understanding Inca chronology.”

The AMS technique can date bones and teeth that contain even small amounts of organic material, expanding the pool of remains suitable for scientific analysis. For this study, the researchers used it to analyze human samples from 26 individuals that were recovered from four cemeteries at Machu Picchu in 1912 during excavations led by Yale professor Hiram Bingham III, who had “rediscovered” the site the previous year.
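The age arithmetic behind radiocarbon dating is simple; AMS’s contribution is measuring the tiny 14C content of very small samples. A rough sketch follows (Python; the input value is illustrative, not data from this study):

```python
import math

LIBBY_MEAN_LIFE = 8033  # years; Libby half-life 5568 yr divided by ln 2

def conventional_c14_age(fraction_modern: float) -> float:
    """Conventional radiocarbon age (years BP, i.e. before 1950) from the
    measured 14C/12C ratio expressed as a fraction of the modern standard.

    Note: this is the uncalibrated age; converting to calendar years
    requires a calibration curve such as IntCal20."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# A sample retaining ~93.5% of modern 14C gives an uncalibrated age of
# roughly 540 radiocarbon years BP, i.e. broadly the 15th-century window
# discussed here (before calibration).
print(round(conventional_c14_age(0.935)))
```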

The bones and teeth used in the analysis likely belonged to retainers, or attendants, who were assigned to the royal estate, the study states. The remains show little evidence of involvement in heavy physical labor, such as construction, meaning that they likely were from the period when the site functioned as a country palace, not when it was being built, the researchers said.

On November 30, 2010, Yale University and the Peruvian government reached an accord for the return to Peru of the archaeological materials Bingham excavated at Machu Picchu. On February 11, 2011, Yale signed an agreement with the Universidad Nacional de San Antonio Abad del Cusco establishing the International Center for the Study of Machu Picchu and Inca Culture, which is dedicated to the display, conservation, and study of the archaeological collections from Bingham’s 1912 excavations. All human remains and other archaeological materials from Machu Picchu have subsequently been returned to Cusco, the former capital city of the Inca Empire, where they are conserved at the Museo Machu Picchu.

Reference: “New AMS dates for Machu Picchu: results and implications” by Richard L. Burger, Lucy C. Salazar, Jason Nesbitt, Eden Washburn and Lars Fehren-Schmitz, 4 August 2021, Antiquity.
DOI: 10.15184/aqy.2021.99



Dryden Flight Research Center (now Armstrong) hosted the Star Trek crew in 1976 for the rollout of space shuttle Enterprise. In front, from left: NASA Administrator James Fletcher, and the show’s stars DeForest Kelley, George Takei, Nichelle Nichols, Leonard Nimoy, show creator Gene Roddenberry, and Walter Koenig. Credit: NASA

Star Trek and NASA: Celebrating the Connection

Gene Roddenberry would have been 100 years old on August 19, 2021, and we at NASA celebrate his legacy. As creator of the legendary Star Trek saga, Roddenberry’s vision continues to resonate.

In the documentary “NASA on the Edge of Forever: Science in Space,” host NASA astronaut Victor Glover stated, “Science and Star Trek go hand-in-hand.” The film explores how for the past 55 years, Star Trek has influenced scientists, engineers, and even astronauts to reach beyond. While the International Space Station doesn’t speed through the galaxy like the Starship Enterprise, much of the research conducted aboard the orbiting facility is making the fiction of Star Trek come a little closer to reality.




Organic and conservation agriculture promote ecosystem multifunctionality



Science Advances, 20 Aug 2021: Vol. 7, No. 34, eabg6995
DOI: 10.1126/sciadv.abg6995

Abstract

Ecosystems provide multiple services to humans. However, agricultural systems are usually evaluated on their productivity and economic performance, and a systematic and quantitative assessment of the multifunctionality of agroecosystems including environmental services is missing. Using a long-term farming system experiment, we evaluated and compared the agronomic, economic, and ecological performance of the most widespread arable cropping systems in Europe: organic, conservation, and conventional agriculture. We analyzed 43 agroecosystem properties and determined overall agroecosystem multifunctionality. We show that organic and conservation agriculture promoted ecosystem multifunctionality, especially by enhancing regulating and supporting services, including biodiversity preservation, soil and water quality, and climate mitigation. In contrast, conventional cropping showed reduced multifunctionality but delivered highest yield. Organic production resulted in higher economic performance, thanks to higher product prices and additional support payments. Our results demonstrate that different cropping systems provide opposing services, enforcing the productivity–environmental protection dilemma for agroecosystem functioning.

INTRODUCTION

Global food production has more than doubled in the past 60 years. This has been achieved through land use change and the use of mineral fertilizers, pesticides, breeding of new crop varieties, and other technologies of the “Green Revolution” (1, 2). However, increased use of agrochemicals, land conversion, farm expansion, and farm specialization have a negative impact on the environment and have caused habitat and biodiversity loss, pollution and eutrophication of water bodies, increased greenhouse gas emissions, and reduced soil quality (1, 3, 4). Thus, one of the main challenges for the future of agriculture is to produce sufficient amounts of food with minimal environmental impact (1). However, to date, there is a lack of appropriate methods and tools to evaluate, design, and track the multifunctionality and sustainability of agricultural production.

For agronomists, the evaluation of agricultural systems centers on productivity, while ecologists and environmental researchers focus on the environmental impact of agriculture. Ideally, agricultural systems should provide the desired balance of provisioning services (e.g., food production), regulating services (e.g., soil, water, and climate protection), and supporting services (e.g., biodiversity and soil quality conservation) within viable socioeconomic boundaries (e.g., ensured income and suitable working conditions). However, systemic evaluations of the diverse services and trade-offs provided by different agricultural practices are scarce, and this has been viewed as a major research gap (5, 6).

In the past 15 years, there have been considerable efforts to conceptualize ecosystem services (ESs), define their contribution to human well-being, and bring them into policy and planning. Examples such as the Millennium Ecosystem Assessment (MEA) (7), The Economics of Ecosystems and Biodiversity (8), or the Intergovernmental Platform on Biodiversity and Ecosystem Services (9) are global initiatives that are integrated into national monitoring programs such as the U.K. National Ecosystem Assessment (UKNEA) framework (10). Even though these concepts and frameworks are increasingly recognized, they are rarely implemented in practice because of difficulties in appropriately measuring and valuing ESs and in institutionalizing outcomes (11).

One of the key approaches to measure and appropriately manage agroecosystems is to gain a solid understanding of how farming practices influence a wide range of ecosystem functions and services and to summarize these effects in a meaningful way (12, 13). The “ability of ecosystems to simultaneously provide multiple functions and services” can be assessed by calculating ecosystem multifunctionality (EMF), an approach widely used in ecology (14, 15). Here, we define ecosystem functions as the biotic and abiotic processes that make up or contribute to ESs either directly or indirectly.
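As a concrete illustration of the averaging approach, an EMF score can be computed by standardizing each measured function across systems and then averaging per system. The sketch below (Python) uses hypothetical numbers and a plain z-score average; it illustrates the general method from the ecology literature, not this study’s exact procedure.

```python
# Minimal sketch of an averaging-based ecosystem multifunctionality (EMF)
# index: standardize each measured function across systems (z-scores),
# then average per system. Illustrative only; the values are invented.
from statistics import mean, stdev

def z_scores(values):
    mu, sd = mean(values), stdev(values)
    return [(v - mu) / sd for v in values]

def multifunctionality(functions_by_system):
    """functions_by_system: {system: [f1, f2, ...]} with the same ordering
    of ecosystem functions for every system."""
    systems = list(functions_by_system)
    n_funcs = len(next(iter(functions_by_system.values())))
    standardized = {s: [0.0] * n_funcs for s in systems}
    # Standardize each function across systems, then average per system.
    for j in range(n_funcs):
        col = z_scores([functions_by_system[s][j] for s in systems])
        for s, z in zip(systems, col):
            standardized[s][j] = z
    return {s: mean(vals) for s, vals in standardized.items()}

# Hypothetical scores for three functions (yield, soil quality, biodiversity):
emf = multifunctionality({
    "conventional": [9.0, 4.0, 3.0],
    "organic":      [6.5, 7.0, 8.0],
    "conservation": [7.5, 7.5, 6.0],
})
print(emf)  # higher EMF = more functions provided at high levels overall
```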

A range of studies has assessed how different drivers including biodiversity and land use intensity affect individual functions and EMF (16–19). However, this approach is still poorly developed for agroecosystems (15), where anthropogenic management plays a key role in determining ecosystem functioning (i.e., specific crop management practices like tillage intensity and chemical and organic input sources and amounts). Moreover, the number of ecosystem functions used to assess EMF varies greatly among studies, and there is often little explanation of why certain variables are included (15). Thus, a next frontier is to investigate how major cropping systems (e.g., conventional, organic, and conservation agriculture) influence different ecosystem functions and EMF and embed such analyses in a broader conceptual ES framework supporting producer and policy decisions.

The main objective of this study is to assess the overall performance of important cropping systems within an adapted ES framework using the EMF methodology applied in ecology. To do this, we use a 6-year dataset from the long-term FArming System and Tillage (FAST) experiment (fig. S1) where we compare the agronomical, ecological, and economic impacts of four arable cropping systems [conventional intensive tillage (C-IT), conventional no tillage (C-NT), organic intensive tillage (O-IT), and organic reduced tillage (O-RT); see Materials and Methods and tables S1 to S3 for detailed management description]. We focus on these specific management strategies since conservation and organic agriculture are two main alternatives to conventional management and are often promoted as more environmentally friendly practices. Organic agriculture prohibits the use of synthetic inputs (e.g., pesticides and fertilizers), and a range of studies show that organic farming enhances biodiversity and reduces environmental impacts but results in lower productivity (4, 5, 20, 21). Conservation agriculture, in turn, is based on three main pillars: minimum mechanical soil disturbance, permanent soil cover, and species diversification, which are applicable in many different farming contexts (22). Several studies indicate that conservation agriculture has positive effects on soil quality and protection, water regulation, energy use, and production costs (23), but productivity increases are minimal or even negative (24) and often dependent on herbicide use (25). In our study, C-NT and O-RT systems are considered to reflect conservation agriculture as the three defined pillars of conservation agriculture are largely fulfilled (minimum tillage, 6-year crop rotation, and permanent soil cover with crop residues and cover crops).

We assessed 43 variables in each cropping system, from which 38 were classified into nine agroecosystem goods and four ecosystem categories using an adapted ES framework and five were used as agronomic co-variables. We based our classification on the MEA and UKNEA frameworks (7, 10) and grouped our variables into proxies for ecosystem functions representing ecosystem processes and services and we finally valued them as ecosystem goods. Ecosystem functions, services, and goods were attributed to supporting, regulating, and provisioning ES categories. In addition, we added socioeconomic proxies and an economic category to the classical ES framework (Fig. 1 and tables S4 and S5). We did this because agroecosystems also have a socioeconomic dimension for producers and policy makers. The following nine final agroecosystem goods were used for multifunctionality assessments: biodiversity conservation, soil health preservation, erosion control, water and air pollution control, food production, income, work efficiency, and financial autonomy.

READ ON: Organic and conservation agriculture promote ecosystem multifunctionality | Science Advances (sciencemag.org)
The Wuhan lab leak theory is more about politics than science

Whatever this week’s Biden review finds, the cause of the pandemic lies in the destruction of animal habitats

Wuhan’s virology institute has been at the centre of the investigation into the origins of Covid-19. Photograph: Roman Pilipey/EPA


Robin McKie Science editor
Sun 22 Aug 2021 09.30 BST

If Joe Biden’s security staff are up to the mark, a new report on the origins of the Covid-19 pandemic will be placed on the president’s desk this week. His team was given 90 days in May to review the virus’s origins after several US scientists indicated they were no longer certain about the source of Sars-CoV-2.

It will be intriguing to learn how Biden’s team answers the critically important questions that still surround the origins of Sars-CoV-2, the virus that causes Covid-19. Did it emerge because of natural viral spillovers from bats to another animal and then into humans? Or did it leak from the Wuhan Institute of Virology? And, if so, had it been enhanced to make it especially virulent?

These are important questions – to say the least. If we want to prevent another pandemic, it would be very useful to know how this one started. However, given the paucity of new information Biden’s team will have unearthed over the past three months – while the Chinese authorities have continued to provide little extra data – it is unlikely hard answers will be provided this week.

Although allegations of a leak from the Wuhan institute had been aired by Donald Trump, and rejected flatly by the Chinese, little credence was given to the claim until May, when 18 leading scientists sent a letter to the journal Science in which they claimed both spillover and leak theories were equally plausible. They also accused a recent World Health Organization investigation at Wuhan of not giving a balanced consideration to both scenarios.

News that the lab leak theory was being taken seriously by the new US administration triggered an onslaught from commentators, who have since alleged that the scientific establishment has been covering up for Chinese scientists’ errors. Among these accusations was one from Sir Richard Dearlove, former head of MI6, who claimed that “some scientific journals absolutely refused to publish anything that disagreed with the Chinese view”.

The main evidence to support a lab leak rests on the failure of scientists to pinpoint the intermediate animal that picked up the virus from bats and passed it to humans. In addition, the Wuhan institute is home to a laboratory that is headed by the virologist Shi Zhengli, who tracked down the bat origins of the last coronavirus Sars epidemic.

Her team specialises in collecting coronaviruses. Thus, one of the world’s coronavirus research centres was situated in the city where Covid-19 first materialised – a coincidence that some conspiracy advocates find too much to accept.

Shi has rejected claims she had been working on enhancing a new virus to make it more virulent or that she or her staff had been infected with a new coronavirus that they had collected, a view supported by a recent review by scientists in the journal Cell: “Despite extensive contact tracing of early cases during the Covid-19 pandemic, there have been no reported cases related to any laboratory staff at the WIV [Wuhan Institute of Virology] and all staff in the laboratory of Dr Shi Zhengli were said to be seronegative for Sars-CoV-2 when tested in March 2020,” it states.

The fact that Sars-CoV-2 is highly transmissible among humans has also raised suspicions that it had been genetically enhanced. This notion is dismissed by Professor David Robertson, of Glasgow University’s centre for virus research.

“Yes, the virus is spread by asymptomatic carriers and that is perfect for human transmission. So how does a natural virus like that come into existence? It is so good at infecting humans, after all. But it is not just a human virus. We find it in pangolins. It goes from humans to mink very easily and it has infected deer in the US. It isn’t a human-adapted virus. It is what we call a generalist or promiscuous virus.”

However, the prospect that Covid-19 emerged from a lab leak was taken very seriously by some senior scientists, including Sir Jeremy Farrar, head of the Wellcome Trust. As he makes clear in his recent book, Spike: The Virus v the People, his initial – horrified – reaction to the emergence of Covid-19 was that it could have escaped from a virus research centre. Only intense consultations with other researchers caused him to change his mind.

“As things currently stand, the evidence strongly suggests that Covid-19 arose after a natural spillover event, but nobody is yet in a position to rule out an alternative,” he said.

This point is backed by Professor James Wood, of Cambridge University. “I think there is very strong evidence for this being caused by natural spillovers but that argument simply does not suit some political groups. They promote the idea that Covid-19 was caused by a lab leak because such a claim deflects attention from increasing evidence that indicates biodiversity loss, deforestation and wildlife trade – which increase the dangers of natural spillovers – are the real dangers that we face from pandemics.”

In other words, fiddling with viruses in laboratories is not the dangerous activity. The real threat comes from the wildlife trade, bulldozing rainforests and clearing wildernesses to provide land for farms and to gain access to mines. As vegetation and wildlife are destroyed, countless species of viruses and the bacteria they host are set loose to seek new hosts, such as humans and domestic livestock. This has happened with HIV, Sars and very probably Covid-19.

And that, for many scientists, is the real lesson of Covid-19.

This article was amended on 22 August 2021 to refer to Shi Zhengli at second mention as “Shi”, that being her surname.

The Science of Recurring Dreams Is More Fascinating Than We Ever Imagined


(Mischa Keijser/Getty Images)
HUMANS
CLAUDIA PICARD-DELAND & TORE NIELSEN, THE CONVERSATION
20 AUGUST 2021

Having the same dream again and again is a well-known phenomenon — nearly two-thirds of the population report having recurring dreams. Being chased, finding yourself naked in a public place or in the middle of a natural disaster, losing your teeth or forgetting to go to class for an entire semester are typical recurring scenarios in these dreams.

But where does the phenomenon come from? The science of dreams shows that recurring dreams may reflect unresolved conflicts in the dreamer's life.

Recurring dreams often occur during times of stress, or over long periods of time, sometimes several years or even a lifetime. Not only do these dreams have the same themes, they can also repeat the same narrative night after night.

Although the exact content of recurring dreams is unique to every individual, there are common themes among individuals and even among cultures and in different periods. For example, being chased, falling, being unprepared for an exam, arriving late or trying to do something repeatedly are among the most prevalent scenarios.

The majority of recurring dreams have negative content involving emotions such as fear, sadness, anger and guilt. More than half of recurring dreams involve a situation where the dreamer is in danger. But some recurring themes can also be positive, even euphoric, such as dreams where we discover new rooms in our house, erotic dreams or where we fly.

In some cases, recurring dreams that begin in childhood can persist into adulthood. These dreams may disappear for a few years, reappear in the presence of a new source of stress and then disappear again when the situation is over.

Unresolved conflicts

Why does our brain play the same dreams over and over again? Studies suggest that dreams, in general, help us regulate our emotions and adapt to stressful events. Incorporating emotional material into dreams may allow the dreamer to process a painful or difficult event.

In the case of recurrent dreams, repetitive content could represent an unsuccessful attempt to integrate these difficult experiences. Many theories agree that recurring dreams are related to unresolved difficulties or conflicts in the dreamer's life.

The presence of recurrent dreams has also been associated with lower levels of psychological wellbeing and the presence of symptoms of anxiety and depression. These dreams tend to recur during stressful situations and cease when the person has resolved their personal conflict, which indicates improved wellbeing.

Recurrent dreams often metaphorically reflect the emotional concerns of the dreamers. For example, dreaming about a tsunami is common following trauma or abuse. This is a typical example of a metaphor that can represent emotions of helplessness, panic or fear experienced in waking life.

Similarly, being inappropriately dressed in one's dream, being naked or not being able to find a toilet can all represent scenarios of embarrassment or modesty.

These themes can be thought of as scripts or ready-to-dream scenarios that provide us with a space where we can digest our conflicting emotions. The same script can be reused in different situations where we experience similar emotions.

This is why some people, when faced with a stressful situation or a new challenge, may dream they're showing up unprepared for a math exam, even years after they last set foot in a school. Although the circumstances are different, a similar feeling of stress or desire to excel can trigger the same dream scenario again.

A continuum of repetition

William Domhoff, an American researcher and psychologist, proposes the concept of a continuum of repetition in dreams. At the extreme end, traumatic nightmares directly reproduce a lived trauma — one of the main symptoms of post-traumatic stress disorder.

Then there are recurring dreams where the same dream content is replayed in part or in its entirety. Unlike traumatic dreams, recurring dreams rarely replay an event or conflict directly but reflect it metaphorically through a central emotion.

Further along the continuum are the recurring themes in dreams. These dreams tend to replay a similar situation, such as being late, being chased or being lost, but the exact content of the dream differs from one time to the next, such as being late for a train rather than for an exam.

Finally, at the other end of the continuum, we find certain dream elements recurring in the dreams of one individual, such as characters, actions or objects. All these dreams would reflect, at different levels, an attempt to resolve certain emotional concerns.

Moving from an intense level to a lower level on the continuum of repetition is often a sign that a person's psychological state is improving. For example, progressive and positive changes are often observed in the content of traumatic nightmares as people who have experienced trauma gradually overcome their difficulties.

Physiological phenomena

Why do the themes tend to be the same from person to person? One possible explanation is that some of these scripts have been preserved in humans due to the evolutionary advantage they bring. By simulating a threatening situation, the dream of being chased, for example, provides a space for a person to practise perceiving and escaping predators in their sleep.

Some common themes may also be explained, in part, by physiological phenomena that take place during sleep. A 2018 study by a research team in Israel found that dreaming of losing one's teeth was not particularly linked to symptoms of anxiety but rather was associated with teeth clenching during sleep or dental discomfort on waking.

When we sleep, our brain is not completely cut off from the outside world. It continues to perceive external stimuli, such as sounds or smells, or internal body sensations. That means that other themes, such as not being able to find a toilet or being naked in a public space, could actually be spurred by the need to urinate during the night or by wearing loose pyjamas in bed.

Some physical phenomena specific to REM sleep, the stage of sleep when we dream the most, could also be at play. In REM sleep, our muscles are paralyzed, which could provoke dreams of having heavy legs or being paralyzed in bed.

Similarly, some authors have proposed that dreams of falling or flying are caused by our vestibular system, which contributes to balance and can reactivate spontaneously during REM sleep. Of course, these sensations are not sufficient to explain the recurrence of these dreams in some people and their sudden occurrence in times of stress, but they probably play a significant role in the construction of our most typical dreams.

Breaking the cycle

People who experience a recurring nightmare have in some ways become stuck in a particular way of responding to the dream scenario and anticipating it. Therapies have been developed to try to resolve this recurrence and break the vicious cycle of nightmares.

One technique is to visualize the nightmare while awake and then rewrite it, that is, to modify the narrative by changing one aspect, for example, the end of the dream to something more positive. Lucid dreaming may also be a solution.

In lucid dreams we become aware that we are dreaming and can sometimes influence the content of the dream. Becoming lucid in a recurring dream might allow us to think or react differently to the dream and thereby alter the repetitive nature of it.

However, not all recurring dreams are bad in themselves. They can even be helpful insofar as they inform us about our personal conflicts. Paying attention to the repetitive elements of dreams could be a way to better understand and resolve our greatest desires and torments.

Claudia Picard-Deland, PhD candidate in neuroscience, Université de Montréal, and Tore Nielsen, Professor of Psychiatry, Université de Montréal.

This article is republished from The Conversation under a Creative Commons license. Read the original article.