Sunday, August 22, 2021

#FOREVERCHEMICALS
Are we at risk from wearing clothing with detectable amounts of PFASs or phthalates?

PFASs might be present in clothing, but the presence of a chemical cannot be equated to the presence of risk!


Joe Schwarcz PhD | McGill University | 20 Aug 2021 | Health


Much ink has recently been spilled about our environment, and potentially our bodies, being contaminated by some of the estimated 60,000 chemicals being industrially produced today. That ink itself contains the likes of perfluoroalkyl substances (PFAS) and phthalates, chemicals of concern because of their hormone-disruptive properties and, in the case of PFAS, also because of their environmental persistence. That has earned the latter the nickname “forever chemicals.” Both these classes of substances are found in numerous consumer products. Phthalates are added to some plastics, including ones that are spun into fibres, to make them soft and pliable, whereas PFASs have both water- and oil-repellent properties. It therefore comes as no surprise that these chemicals can be detected when some fabrics are subjected to chemical analysis, mostly at the parts per million level or less.

The first take-away is that what the results of such an analysis really demonstrate is the astounding capability of modern instrumentation to detect vanishingly small amounts of substances. The second point is that the presence of a chemical cannot be equated to the presence of risk! Evaluation of risk is a very complex affair and often comes down to making an educated guess given that the relevant experiment cannot be ethically, logistically or economically performed.

For example, when it comes to perfluoroalkyl substances (PFAS), the definitive experiment would involve exposing a group of subjects to varying amounts of these substances and comparing the findings to those of a control group with no exposure. The experimental and control groups would have to be followed for decades, since we are talking about chronic rather than acute effects. To add to the complexity, it should be noted that there are thousands of different PFASs in use, and they can have totally different toxicological profiles. So which one, or which mixture, would be tested? Clearly such an interventional trial cannot be carried out. We are therefore left with human epidemiological studies and laboratory experiments using animals or cells in test tubes. But people are not rats and the body is not some giant test tube, so the relevance of laboratory data to humans is hard to determine.

As far as epidemiological data go, these are clearly relevant, but generally come from large-scale exposure. For example, detrimental effects of PFAS have been noted in people living around DuPont’s plant near Parkersburg, West Virginia, where significantly elevated amounts of PFAS have been found in the water due to improper disposal of manufacturing chemicals. However, this may not mean much for the general population elsewhere with exposure to trace amounts. As we well know, only the dose makes the poison. The same arguments can be made for phthalates, bisphenol A, PCBs, dioxins, furans or the hundreds of other chemicals with potential toxicity that may be encountered in the environment.

All of this is to say that we cannot come to a conclusion about the risk, if any, posed by the tiny amounts of PFASs, phthalates or inorganics detected in fabrics. My guess is that absorption of any of these into the bloodstream would be inconsequential. That, though, may not be the case for cosmetics that are formulated with PFASs such as lipstick, foundations, or mascara. But the biggest concern about exposure to PFAS and phthalates is through food and water. How do these chemicals end up there? Leaching out from discarded items, for one. While wearing fabrics with PFAS may not be an issue, when large numbers of these are discarded, some PFASs end up in the water supply and from there in food. Since PFAS are also used in food packaging, migration into the contents is possible. Then there is the possibility of inadvertent release during manufacture of these chemicals.

The bottom line is that while PFAS in fabrics may pose no risk to the wearer, banning their inclusion in such items means fewer of these chemicals will be produced, and population exposure will be reduced. However, making any recommendation to the public about favouring specific clothing items based on the trivial amounts of PFAS or phthalates found cannot be justified scientifically.


“Different From All Currently Known Life?” –Darwin’s Extraterrestrials (Weekend Feature)


 

“By now it has become a common futurist prediction and science fiction plot device that intelligent and sentient life forms can be created which are not biochemical in nature and are thus fundamentally different from all currently known life,” distinguished Princeton astrophysicist Edwin Turner wrote in an email to The Daily Galaxy.  “Whether or not this would actually be possible,” he explains, “depends on the nature and origin of consciousness, a topic about which we have little more than entertaining whistling-in-the-dark guesses at this point and no clear path toward obtaining any better understanding of this deep mystery.”

Aliens Shaped by Natural Selection

In a landmark 2017 study published in the International Journal of Astrobiology, scientists from the University of Oxford showed that aliens are potentially shaped by the same processes and mechanisms that shaped humans, such as natural selection, and that, like us, they evolve to become fitter and stronger over time.

Only One Known Sample in the Universe

“A fundamental task for astrobiologists is thinking about what extraterrestrial life might be like,” said Sam Levin, a researcher in Oxford’s Department of Zoology. “But making predictions about aliens is hard,” he noted. “We only have one example of life – life on Earth – to extrapolate from. Past approaches in the field of astrobiology have been largely mechanistic, taking what we see on Earth, and what we know about chemistry, geology, and physics to make predictions about aliens.”

Complexity’s Arrow


“In our paper,” said Levin, “we offer an alternative approach, which is to use evolutionary theory to make predictions that are independent of Earth’s details. This is a useful approach, because theoretical predictions will apply to aliens that are silicon based, do not have DNA, and breathe nitrogen, for example.”

 Alien Natural Selection

Using this idea of alien natural selection as a framework, the team addressed extra-terrestrial evolution, and how complexity will arise in space.

Species complexity has increased on the Earth as a result of a handful of events, known as major transitions. These transitions occur when a group of separate organisms evolve into a higher-level organism – when cells become multi-cellular organisms, for example. Both theory and empirical data suggest that extreme conditions are required for major transitions to occur.

The paper also makes specific predictions about the biological make-up of complex aliens, and offers a degree of insight as to what they might look like.

Mirror Sapiens?

“We still can’t say whether aliens will walk on two legs or have big green eyes,” said Levin. “But we believe evolutionary theory offers a unique additional tool for trying to understand what aliens will be like, and we have shown some examples of the kinds of strong predictions we can make with it.”

“By predicting that aliens have undergone major transitions – which is how complexity has arisen in species on Earth – we can say that there is a level of predictability to evolution that would cause them to look like us.”

“Like humans, we predict that they are made up of a hierarchy of entities, which all cooperate to produce an alien. At each level of the organism there will be mechanisms in place to eliminate conflict, maintain cooperation, and keep the organism functioning. We can even offer some examples of what these mechanisms will be.

“There are potentially hundreds of thousands of habitable planets in our galaxy alone. We can’t say whether or not we’re alone on Earth, but we have taken a small step forward in answering, if we’re not alone, what our neighbors are like.”

 Rare Evolutionary Transitions

In subsequent 2019 research, University of Oxford scientists created a statistical model showing that the chances of intelligent life existing elsewhere in the Universe are slim. “It’s still unknown,” observes mathematical biologist Michael Bonsall, “how abundant extraterrestrial life is, or whether such life might be intelligent. On Earth, numerous evolutionary transitions were needed for complex intelligent life to emerge, and this occurring relatively late in Earth’s lifetime is thought to be evidence for a handful of rare evolutionary transitions.”

“In addition to the evolutionary transition,” writes Bonsall, “the emergence of intelligent life also requires a set of cosmological factors to be in place. These include whether the planet is in the right place for water to be present, whether life emerges from the water and whether the planet itself is habitable. Most crucial in this is the lifetime of the star the planet is orbiting; if this lifetime is short in comparison to the necessary evolutionary transitions for life, then intelligent observers might never get a chance to emerge (often referred to as the Great Filter).”

The Daily Galaxy, Avi Shporer, Research Scientist, MIT Kavli Institute for Astrophysics and Space Research via Oxford University and Edwin Turner. Avi was formerly a NASA Sagan Fellow at the Jet Propulsion Laboratory (JPL).

 

“Before the Big Bang” –Vestiges of a Prior Universe? (Weekend Feature)

 


 

“Eliminating the singularity or Big Bang – which has its origins in the late 1920s, when US astronomer Edwin Hubble discovered that almost all galaxies are moving away from each other at ever-faster velocities – brings back the bouncing Universe on to the theoretical stage of cosmology. The absence of a singularity at the start of spacetime opens up the possibility that vestiges of a previous contraction phase may have withstood the phase change (from contraction to expansion) and may still be with us in the ongoing expansion of the Universe,” said Brazilian physicist Juliano Cesar Silva Neves.

A Universe Prior to the Big Bang

Although the Big Bang theory has been the best-known and most widely accepted explanation for the beginning and evolution of the Universe for five decades, it is hardly a consensus among scientists. Physicists are now considering the possibility of vestiges of a Universe that existed before the Big Bang.

The Big Bounce

“The idea of a “big bounce” has existed in some form or another for decades,” writes Jonas Mureika, theoretical physicist and physics chair at Loyola Marymount University and the Kavli Institute for Theoretical Physics, UC Santa Barbara, in an extraordinary email to The Daily Galaxy. “The basic idea is to avoid the singularity at t=0 (time of the Big Bang) and r=0 (where all matter and energy were compressed into an infinitesimally small volume), since our mathematical description of spacetime (and thus our favorite theories) break down at that point. This is similar to problems in black hole physics, which itself has similar fixes to remove the singularity.”

“The Quantum Fix”

“The crux of the problem,” observes Mureika, “is that our description of this physics is classical, i.e. a prediction of General Relativity, and that’s why singularities arise. The theories just don’t work in that limit. It is most likely the case, however, that the physics governing the realm of classical singularities — extremely small and extremely high energy — is quantum in nature. So the rules change in some fascinating ways, and introducing new physics allows this to make sense.”

“When classical physics breaks down, we look to replace the broken parts with a quantum fix. If the singularity is at r=0, then one of the ways we can avoid this is to not let the physics act at r=0. That is, we impose a minimal length (usually the Planck length, but not always) below which the universe can’t be ‘probed’. That removes the infinites that plague the singularity and allows our theories to behave well. In a ‘big bounce’ scenario, the quantum pressures at this minimum length basically stop the implosion of the universe and allow it to re-expand. Again, similar ideas exist with black holes, called Planck stars.”
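For reference, the Planck length that Mureika mentions is the standard combination of fundamental constants (textbook physics, not anything specific to these bounce models):

\ell_P = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6 \times 10^{-35}\ \text{m}

Below this scale, quantum-gravitational effects are generally expected to dominate, which is why it is the usual choice for the minimal length in such scenarios.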

Roger Penrose’s Conformal Cyclic Cosmology (CCC)

“Another approach is to change our notion of the structure of spacetime itself,” explains Mureika, “and how it behaves in the small and large limits. This is embodied in Nobel Laureate Roger Penrose’s 2012 Conformal Cyclic Cosmology (CCC) framework, in which the very small limit of the universe (singularity) is identical to the very large limit (‘infinite’ accelerated expansion). This is done by a conformal transformation on the spacetime metric (the thing that defines straight lines and shortest distances), which is a fancy way of saying we stretch and bend spacetime while preserving certain geometric features (e.g. angles). We now know the universe is indeed going through a phase of accelerated expansion, so this adds weight to Penrose’s idea and kills previous ones (i.e. the universe doesn’t contract, so it can’t ‘bounce’ in the ways previously thought). This allows for the universe to be ‘reborn’ as it expands indefinitely, so there is a repeating cycle of big bangs. Penrose calls these cycles ‘aeons’.”

“CMB” Fossils

“Of course,” Mureika concludes, “a theory is only as good as its experimental verification, so the challenge is to detect tell-tale fingerprints of these models. The observational go-to for early universe cosmology is the Cosmic Microwave Background (CMB), which represents an imprint of the earliest time we can see. It’s believed that the CMB will contain information from earlier times, including the big bang (if it happened). These will manifest themselves as e.g. geometric signatures, patterns in temperature fluctuations, over/underdensities of clustering, etc. Detecting any such signature would be a monumental discovery, and will help quantum gravity and cosmology research shape their future paths.”

Brian Keating’s Deep Dive

“In contrast to inflation,” observes Brian Keating, Chancellor’s Distinguished Professor of Physics at UC San Diego, author of Losing the Nobel Prize, and host of the INTO THE IMPOSSIBLE Podcast in an email to The Daily Galaxy,  “there are several other possible mechanisms to the singularity featured in most versions of the Big Bang theory. Two of the most prominent alternatives to the singular Big Bang are the Bouncing model of Anna Ijjas and Paul Steinhardt and the Conformal Cyclic Cosmology (CCC) of Sir Roger Penrose. Both of these share the feature that they do not produce so-called ‘primordial B-mode polarization’ patterns, the result of relic gravitational waves produced in most models of cosmic Inflation, which also features a concomitant spacetime singularity. In that sense, both the Bouncing and CCC models are falsifiable, e.g. if  current or  future B-mode polarization experiments like BICEP Array or the Simons Observatory were to detect and confirm primordial B-modes, these alternatives would be disproven. Many cosmologists find the falsifiability of these models, in contrast to the inflationary Multiverse, strongly appealing.”

“Hawking Points”

“The CCC model also predicts the presence of so-called ‘Hawking Points’,” explains Keating, “regions of concentrated energy caused by coalescing black holes from preceding ‘aeons’ which would, according to Penrose and collaborators, be evidence supporting the cyclic model. Evidence for Hawking Points from the ESA’s Planck satellite has already been claimed, but those claims are disputed by members of the Planck team. Upcoming experiments like BICEP Array and the Simons Observatory will be able to rule out or confirm evidence for Hawking Points, which would be tantamount to evidence for the CCC model.”

 

Mirroring the “Bouncing Universe” model, Neves, in an article published in General Relativity and Gravitation, proposes to eliminate the need for a cosmological spacetime singularity and argues that the current expansion phase was preceded by a contraction.

Neves is part of a group of researchers who dare to imagine a different origin. In a study published in the journal General Relativity and Gravitation, Neves suggests the elimination of a key aspect of the standard cosmological model: the need for a spacetime singularity known as the Big Bang.

Challenging the Idea that Time had a Beginning

In raising this possibility, Neves challenges the idea that time had a beginning and reintroduces the possibility that the current expansion was preceded by contraction. “I believe the Big Bang never happened,” said the physicist, who works as a researcher at the University of Campinas’s Mathematics, Statistics & Scientific Computation Institute (IMECC-UNICAMP) in Sao Paulo State, Brazil.

For Neves, the fast spacetime expansion stage does not exclude the possibility of a prior contraction phase. Moreover, the switch from contraction to expansion may not have destroyed all traces of the preceding phase.

Introducing the “Scale Factor”

The article, which reflects the work developed under the Thematic Project “Physics and geometry of spacetime”, considers the solutions to the general relativity equations that describe the geometry of the cosmos and then proposes the introduction of a “scale factor” that makes the rate at which the Universe is expanding depend not only on time but also on cosmological scale.

“In order to measure the rate at which the Universe is expanding with the standard cosmology, the model in which there’s a Big Bang, a mathematical function is used that depends only on cosmological time,” said Neves, who developed the idea with Alberto Vazques Saa, a professor at IMECC-UNICAMP and the supervisor of Neves’ postdoctoral project, funded by the Sao Paulo Research Foundation – FAPESP.
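For context, in standard (Friedmann–Lemaître–Robertson–Walker) cosmology that time-only function is the scale factor a(t), and the expansion rate is the Hubble parameter; this is the textbook definition, not the modified expression Neves and Saa propose:

H(t) = \frac{\dot{a}(t)}{a(t)}

Their modification makes the analogous quantity depend on cosmological scale as well as on time.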

With the scale factor, the Big Bang itself, or cosmological singularity, ceases to be a necessary condition for the cosmos to begin universal expansion. A concept from mathematics that expresses indefiniteness, singularity was used by cosmologists to characterize the “primordial cosmological singularity” that happened 13.8 billion years ago, when all the matter and energy from the Universe were compressed into an initial state of infinite density and temperature, where the traditional laws of physics no longer apply.

From the 1940s onward, scientists guided by Einstein’s theory of general relativity constructed a detailed model of the evolution of the Universe since the Big Bang. This model allows for three possible outcomes: infinite expansion of the Universe at ever-higher velocities; permanent stagnation of the Universe’s expansion; or an inverted process of contraction driven by the gravitational attraction exerted by the mass of the Universe, known as the Big Crunch.

Neves notes that “bouncing cosmology” is rooted in the hypothesis that the Big Crunch would give way to an eternal succession of universes, with the extreme conditions of density and temperature triggering a new inversion of the process, so that contraction gives way to expansion in another bounce.

Black-Hole Fossils

Black holes are the starting point of Neves’ investigations about the “Bouncing Universe”. “Who knows, there may be remains of black holes in the ongoing expansion that date from the prior contraction phase and passed intact through the bottleneck of the bounce,” he said.

Consisting of the imploded core remaining after a giant star explodes, black holes are a kind of cosmic object whose core contracted to form a singularity, a point with infinite density and the strongest gravitational attraction known to exist. Nothing escapes from it, not even light.

According to Neves, a black hole is not defined by singularity, but rather by an event horizon, a membrane that indicates the point of no return from which nothing escapes the inexorable destiny of being swallowed up and destroyed by the singularity.

The Illustris simulation, which visualizes the universe under the standard Big Bang model, is the most ambitious computer simulation yet performed. The calculation tracks the expansion of the universe, the gravitational pull of matter onto itself, the motion of cosmic gas, as well as the formation of stars and black holes.

 

 

“Outside the event horizon of a regular black hole, there are no major changes, but inside it, the changes are deep-seated. There’s a different spacetime that avoids the formation of a singularity,” said Neves.

The scale factor formulated by Neves and Saa was inspired by US physicist James Bardeen. In 1968, Bardeen used a mathematical trick to modify the solution to the general relativity equations that describe black holes.

The trick consisted of thinking of the mass of a black hole not as a constant, as had previously been the case, but as a function that depends on the distance to the center of the black hole. With this change, a different black hole, termed a regular black hole, emerged from the solution to the equations. “Regular black holes are permitted, since they don’t violate general relativity. The concept isn’t new and has frequently been revisited in recent decades,” said Neves.
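For illustration, the kind of replacement Bardeen made is usually written with the mass function below, where the parameter g sets the size of the regular core; this is the standard textbook form of the Bardeen black hole, and the analogous construction Neves and Saa apply to the cosmological scale factor may differ in detail:

m(r) = \frac{M\,r^{3}}{\left(r^{2} + g^{2}\right)^{3/2}}

At large r this reduces to the constant mass M of an ordinary black hole, while as r approaches zero it vanishes fast enough that no singularity forms.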

Since the insertion of a mathematical trick into the general relativity equations could prevent the formation of singularities in regular black holes, Neves considered creating a similar artifice to eliminate the singularity in a regular bounce.

In science, says philosopher Karl Popper, a theory is worthless if it cannot be verified, however beautiful and inspiring it may be. How do you test the hypothesis of a Big Bang that did not start with a singularity?

“By looking for traces of the events in a contraction phase that may have remained in the ongoing expansion phase. What traces? The candidates include remnants of black holes from a previous phase of universal contraction that may have survived the bounce,” Neves said.

The Daily Galaxy, Maxwell Moe, astrophysicist, NASA Einstein Fellow, University of Arizona, via Brian Keating, Jonas Mureika, and the Sao Paulo Research Foundation (FAPESP)

Image credit: Shutterstock License

A HUNDRED & SEVENTY YEARS AFTER THE FACT

Evolution is now accepted by a majority of Americans


The level of public acceptance of evolution in the United States is now solidly above the halfway mark, according to a new study based on a series of national public opinion surveys conducted over the last 35 years.

"From 1985 to 2010, there was a statistical dead heat between acceptance and rejection of ," said lead researcher Jon D. Miller of the Institute for Social Research at the University of Michigan. "But acceptance then surged, becoming the majority position in 2016."

Examining data over 35 years, the study consistently identified aspects of education—civic science literacy, taking college courses in science and having a college degree—as the strongest factors leading to the acceptance of evolution.

"Almost twice as many Americans held a college degree in 2018 as in 1988," said co-author Mark Ackerman, a researcher at Michigan Engineering, the U-M School of Information and Michigan Medicine. "It's hard to earn a college degree without acquiring at least a little respect for the success of ."

The researchers analyzed a collection of biennial surveys from the National Science Board, several national surveys funded by units of the National Science Foundation, and a series focused on adult civic literacy funded by NASA. Beginning in 1985, these national samples of U.S. adults were asked to agree or disagree with this statement: "Human beings, as we know them today, developed from earlier species of animals."

The series of surveys showed that Americans were evenly divided on the question of evolution from 1985 to 2007. According to a 2005 study of the acceptance of evolution in 34 developed nations, led by Miller, only Turkey, at 27%, scored lower than the United States. But over the last decade, until 2019, the percentage of American adults who agreed with this statement increased from 40% to 54%.

The current study consistently identified religious fundamentalism as the strongest factor leading to the rejection of evolution. While their numbers declined slightly in the last decade, approximately 30% of Americans continue to be religious fundamentalists as defined in the study. But even those who scored highest on the scale of religious fundamentalism shifted toward acceptance of evolution, rising from 8% in 1988 to 32% in 2019.

Miller predicted that religious fundamentalism would continue to impede the public acceptance of evolution.

"Such beliefs are not only tenacious but also, increasingly, politicized," he said, citing a widening gap between Republican and Democratic acceptance of evolution. 

As of 2019, 34% of conservative Republicans accepted evolution compared to 83% of liberal Democrats.

The study is published in the journal Public Understanding of Science.

More information: Jon D. Miller et al, Public acceptance of evolution in the United States, 1985–2020, Public Understanding of Science (2021). DOI: 10.1177/09636625211035919
Provided by University of Michigan 

 

Scientists Detect Tens of Thousands of Different Molecules in Beer – 80% Not Yet Described in Chemical Databases


Study used modern high-resolution analytics to reveal the enormous metabolic complexity of beer.

The tradition of beer brewing dates back to at least 7000 BCE and maybe even to the invention of agriculture, considering that most cereals can spontaneously ferment if exposed to airborne yeasts. The code of the Babylonian king Hammurabi (ruled 1792 to 1750 BCE), whose laws 108 through 111 regulate beer sales, shows that people have been anxious to safeguard the quality of beer through legislation for millennia. For example, the Bavarian ‘Reinheitsgebot’ (‘Purity Law’) of 1516, often considered the world’s oldest still-functional food regulation (with modifications), allows only barley, water, and hops as ingredients for brewing beer (with confiscation of the barrels as penalty for transgression).

Now, in a recent study in Frontiers in Chemistry, the science of beer is taken to a new level. Scientists from Germany use state-of-the-art analytical methods to reveal the metabolic complexity – tens of thousands of different molecules – of commercial beers from around the world.

Enormous chemical complexity

“Beer is an example of enormous chemical complexity. And thanks to recent improvements in analytical chemistry, comparable in power to the ongoing revolution in the technology of video displays with ever-increasing resolution, we can reveal this complexity in unprecedented detail. Today it’s easy to trace tiny variations in chemistry throughout the food production process, to safeguard quality or to detect hidden adulterations,” said corresponding author Prof Philippe Schmitt-Kopplin, head of the Comprehensive Foodomics Platform at the Technical University of Munich and of the Analytical BioGeoChemistry research unit at the Helmholtz Center in Munich.

Schmitt-Kopplin and colleagues used two powerful methods – direct infusion Fourier transform ion cyclotron resonance mass spectrometry (DI-FTICR MS) and ultra-performance liquid chromatography quadrupole time-of-flight mass spectrometry (UPLC-ToF-MS) – to reveal the full range of metabolites in 467 types of beer brewed in the US, Latin America, Europe, Africa and East Asia. These included lagers, craft and abbey beers, top-fermented beers, and gueuzes, brewed either from barley as the only source of starch for fermentation or from barley plus wheat, rice, or corn (maize).

The methods have complementary strengths. DI-FTICR-MS directly revealed the chemical diversity across all beers and predicted chemical formulas for the metabolite ions in them. The authors then used UPLC-ToF-MS on a subset of 100 beers to resolve the possible isomers behind those formulas: chromatography first separates molecules with identical masses, and fragmentation of the mass ions into daughter ions then makes it possible to predict the exact molecular structure.

The authors then mapped these metabolites into a ‘chemical space’, in which each is linked to one or more others through a single reaction, for example the addition of a methoxy-, hydroxyl-, sulfate-, or sugar-group to the molecular backbone, or the conversion of an unsaturated bond into a saturated one. This yielded a reconstructed metabolite network leading to the final product, consisting of nearly a hundred steps that start from molecules in the original cereals synthesized from the amino acid tryptophan. Derived from these are secondary metabolites unique to each cereal.
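As a rough sketch of how such a ‘chemical space’ network can be built (this is a generic mass-difference approach with made-up peak values, not the authors’ actual pipeline or data; the transformation masses are standard monoisotopic values), two measured neutral masses are linked whenever their difference matches a known chemical transformation within a small tolerance:

# Sketch of a mass-difference network: connect two measured (neutral, monoisotopic)
# masses when their difference matches a known transformation. The transformation
# masses are standard monoisotopic values; the peak list below is hypothetical.
TRANSFORMS = {
    "hydroxylation (+O)": 15.9949,
    "methylation (+CH2)": 14.0157,
    "saturation (+H2)": 2.0157,
    "sulfation (+SO3)": 79.9568,
    "hexose addition (+C6H10O5)": 162.0528,
}

def build_network(masses, tol=0.001):
    """Return edges (m1, m2, transformation) for every pair of masses whose
    difference matches a transformation within +/- tol Da."""
    edges = []
    for i, m1 in enumerate(masses):
        for m2 in masses[i + 1:]:
            diff = abs(m2 - m1)
            for name, delta in TRANSFORMS.items():
                if abs(diff - delta) <= tol:
                    edges.append((min(m1, m2), max(m1, m2), name))
    return edges

# Hypothetical peak list (Da): tryptophan (204.0899) and a few derived masses
peaks = [204.0899, 220.0848, 218.1055, 366.1427]
for edge in build_network(peaks):
    print(edge)

Repeating this pairing over thousands of measured masses is what yields the reaction-network picture described above.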

Powerful method for quality control

“Our mass spectrometry method, which takes only 10 minutes per sample, should be very powerful for quality control in food industry and set the basis of novel molecular markers and non-targeted metabolite profiles needed in foodstuff inspection,” said Schmitt-Kopplin.

The authors found approximately 7700 ions with unique masses and formulas, including lipids, peptides, nucleotides, phenolics, organic acids, phosphates, and carbohydrates, of which around 80% aren’t yet described in chemical databases. Because each formula may in some cases cover up to 25 different molecular structures, this translates into tens of thousands of unique metabolites.
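To see how the numbers scale (an illustrative back-of-the-envelope estimate, not figures from the paper):

7700 \times 3 \approx 2.3 \times 10^{4}, \qquad 7700 \times 25 \approx 1.9 \times 10^{5}

so even a few isomers per formula already puts the count in the tens of thousands, and the 25-isomer upper bound approaches two hundred thousand distinct structures.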

“Here we reveal an enormous chemical diversity across beers, with tens of thousands of unique molecules. We show that this diversity originates in the variety of raw materials, processing, and fermentation. The molecular complexity is then amplified by the so-called ‘Maillard reaction’ between amino acids and sugars which also gives bread, meat steaks, and toasted marshmallow their ‘roasty’ flavor. This complex reaction network is an exciting focus of our research, given its importance for food quality, flavor, and also the development of novel bioactive molecules of interest for health,” concluded first author Stefan Pieczonka, a PhD student at the Technical University of Munich.

Reference: “On the Trail of the German Purity Law: Distinguishing the Metabolic Signatures of Wheat, Corn and Rice in Beer” by Stefan A. Pieczonka, Sophia Paravicini, Michael Rychlik and Philippe Schmitt-Kopplin, 20 July 2021, Frontiers in Chemistry.
DOI: 10.3389/fchem.2021.715372

 

Historical Timeline Is Inaccurate: Advanced Radiocarbon Dating Reveals Machu Picchu Is Older Than Expected


Machu Picchu, the famous 15th-century Inca site in southern Peru, is up to several decades older than previously thought, according to a new study led by Yale archaeologist Richard Burger.

Burger and researchers from several U.S. institutions used accelerator mass spectrometry (AMS) — an advanced form of radiocarbon dating — to date human remains recovered during the early 20th century at the monumental complex and onetime country estate of Inca Emperor Pachacuti located on the eastern face of the Andes Mountains.

Their findings, published in the journal Antiquity, reveal that Machu Picchu was in use from about A.D. 1420 to A.D. 1530 — ending around the time of the Spanish conquest — making the site at least 20 years older than the accepted historical record suggests and raising questions about our understanding of Inca chronology.

Historical sources dating from the Spanish invasion of the Inca Empire indicate that Pachacuti seized power in A.D. 1438 and subsequently conquered the lower Urubamba Valley where Machu Picchu is located. Based on those records, scholars have estimated that the site was built after A.D. 1440, and perhaps as late as A.D. 1450, depending on how long it took Pachacuti to subdue the region and construct the stone palace.

The AMS testing indicates that the historical timeline is inaccurate.


“Until now, estimates of Machu Picchu’s antiquity and the length of its occupation were based on contradictory historical accounts written by Spaniards in the period following the Spanish conquest,” said Burger, the Charles J. MacCurdy Professor of Anthropology in Yale’s Faculty of Arts and Sciences. “This is the first study based on scientific evidence to provide an estimate for the founding of Machu Picchu and the length of its occupation, giving us a clearer picture of the site’s origins and history.”

The finding suggests that Pachacuti, whose reign set the Inca on the path to becoming pre-Columbian America’s largest and most powerful empire, gained power and began his conquests decades earlier than textual sources indicate. As such, it has implications for people’s wider understanding of Inca history, Burger said.


“The results suggest that the discussion of the development of the Inca empire based primarily on colonial records needs revision,” he said. “Modern radiocarbon methods provide a better foundation than the historical records for understanding Inca chronology.”

The AMS technique can date bones and teeth that contain even small amounts of organic material, expanding the pool of remains suitable for scientific analysis. For this study, the researchers used it to analyze human samples from 26 individuals that were recovered from four cemeteries at Machu Picchu in 1912 during excavations led by Yale professor Hiram Bingham III, who had “rediscovered” the site the previous year.
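For readers unfamiliar with the method, the age comes from the standard radiocarbon decay relation (the uncalibrated form; AMS counts the surviving carbon-14 atoms directly rather than waiting for them to decay):

t = \frac{t_{1/2}}{\ln 2}\,\ln\!\left(\frac{N_{0}}{N}\right), \qquad t_{1/2} \approx 5730\ \text{yr}

where N0/N is the ratio of the original to the measured carbon-14 content; the raw ages are then calibrated against tree-ring and similar records to yield calendar ranges such as A.D. 1420 to 1530.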

The bones and teeth used in the analysis likely belonged to retainers, or attendants, who were assigned to the royal estate, the study states. The remains show little evidence of involvement in heavy physical labor, such as construction, meaning that they likely were from the period when the site functioned as a country palace, not when it was being built, the researchers said.

On November 30, 2010, Yale University and the Peruvian government reached an accord for the return to Peru of the archaeological materials Bingham excavated at Machu Picchu. On February 11, 2011, Yale signed an agreement with the Universidad Nacional de San Antonio Abad del Cusco establishing the International Center for the Study of Machu Picchu and Inca Culture, which is dedicated to the display, conservation, and study of the archaeological collections from Bingham’s 1912 excavations. All human remains and other archaeological materials from Machu Picchu have subsequently been returned to Cusco, the former capital city of the Inca Empire, where they are conserved at the Museo Machu Picchu.

Reference: “New AMS dates for Machu Picchu: results and implications” by Richard L. Burger, Lucy C. Salazar, Jason Nesbitt, Eden Washburn and Lars Fehren-Schmitz, 4 August 2021, Antiquity.
DOI: 10.15184/aqy.2021.99



Dryden Flight Research Center (now Armstrong) hosted the Star Trek crew in 1976 for the rollout of space shuttle Enterprise. In front, from left: NASA Administrator James Fletcher, and the show’s stars DeForest Kelley, George Takei, Nichelle Nichols, Leonard Nimoy, show creator Gene Roddenberry, and Walter Koenig. Credit: NASA

Star Trek and NASA: Celebrating the Connection

Gene Roddenberry would have been 100 years old on August 19, 2021, and we at NASA celebrate his legacy. As creator of the legendary Star Trek saga, Roddenberry’s vision continues to resonate.

In the documentary “NASA on the Edge of Forever: Science in Space,” host NASA astronaut Victor Glover stated, “Science and Star Trek go hand-in-hand.” The film explores how for the past 55 years, Star Trek has influenced scientists, engineers, and even astronauts to reach beyond. While the International Space Station doesn’t speed through the galaxy like the Starship Enterprise, much of the research conducted aboard the orbiting facility is making the fiction of Star Trek come a little closer to reality.




Organic and conservation agriculture promote ecosystem multifunctionality



Science Advances, 20 Aug 2021: Vol. 7, No. 34, eabg6995. DOI: 10.1126/sciadv.abg6995


Abstract

Ecosystems provide multiple services to humans. However, agricultural systems are usually evaluated on their productivity and economic performance, and a systematic and quantitative assessment of the multifunctionality of agroecosystems including environmental services is missing. Using a long-term farming system experiment, we evaluated and compared the agronomic, economic, and ecological performance of the most widespread arable cropping systems in Europe: organic, conservation, and conventional agriculture. We analyzed 43 agroecosystem properties and determined overall agroecosystem multifunctionality. We show that organic and conservation agriculture promoted ecosystem multifunctionality, especially by enhancing regulating and supporting services, including biodiversity preservation, soil and water quality, and climate mitigation. In contrast, conventional cropping showed reduced multifunctionality but delivered the highest yield. Organic production resulted in higher economic performance, thanks to higher product prices and additional support payments. Our results demonstrate that different cropping systems provide opposing services, enforcing the productivity–environmental protection dilemma for agroecosystem functioning.

INTRODUCTION

Global food production has more than doubled in the past 60 years. This has been achieved through land use change and the use of mineral fertilizers, pesticides, breeding of new crop varieties, and other technologies of the “Green Revolution” (1, 2). However, increased use of agrochemicals, land conversion, farm expansion, and farm specialization have a negative impact on the environment and have caused habitat and biodiversity loss, pollution and eutrophication of water bodies, increased greenhouse gas emissions, and reduced soil quality (1, 3, 4). Thus, one of the main challenges for the future of agriculture is to produce sufficient amounts of food with minimal environmental impact (1). However, to date, there is a lack of appropriate methods and tools to evaluate, design, and track the multifunctionality and sustainability of agricultural production.

For agronomists, the evaluation of agricultural systems focuses on productivity, while ecologists and environmental researchers focus on the environmental impact of agriculture. Ideally, agricultural systems should provide the desired balance of provisioning services (e.g., food production), regulating services (e.g., soil, water, and climate protection), and supporting services (e.g., biodiversity and soil quality conservation) within viable socioeconomic boundaries (e.g., ensured income and suitable working conditions). However, systemic evaluations of the diverse services and trade-offs provided by different agricultural practices are scarce, and this has been viewed as a major research gap (5, 6).

In the past 15 years, there have been considerable efforts to conceptualize ecosystem services (ESs), define their contribution to human well-being, and bring them into policy and planning. Examples such as the Millennium Ecosystem Assessment (MEA) (7), The Economics of Ecosystems and Biodiversity (8), or the Intergovernmental Platform on Biodiversity and Ecosystem Services (9) are global initiatives that are integrated into national monitoring programs such as the U.K. National Ecosystem Assessment (UKNEA) framework (10). Even if these concepts and frameworks are increasingly recognized, there is a lack of implementation in practice due to difficulties in appropriately measuring and valuing ESs and in institutionalizing outcomes (11).

One of the key approaches to measure and appropriately manage agroecosystems is to gain a solid understanding of how farming practices influence a wide range of ecosystem functions and services and to summarize these effects in a meaningful way (12, 13). The “ability of ecosystems to simultaneously provide multiple functions and services” can be assessed by calculating ecosystem multifunctionality (EMF), an approach widely used in ecology (14, 15). Here, we define ecosystem functions as the biotic and abiotic processes that make up or contribute to ESs either directly or indirectly.

A range of studies has assessed how different drivers, including biodiversity and land use intensity, affect individual functions and EMF (16–19). However, this approach is still poorly developed for agroecosystems (15), where anthropogenic management plays a key role in determining ecosystem functioning (i.e., specific crop management practices like tillage intensity and chemical and organic input sources and amounts). Moreover, the number of ecosystem functions used to assess EMF varies greatly among studies, and there is often little explanation of why certain variables are included (15). Thus, a next frontier is to investigate how major cropping systems (e.g., conventional, organic, and conservation agriculture) influence different ecosystem functions and EMF and to embed such analyses in a broader conceptual ES framework supporting producer and policy decisions.

The main objective of this study is to assess the overall performance of important cropping systems within an adapted ES framework using the EMF methodology applied in ecology. To do this, we use a 6-year dataset from the long-term FArming System and Tillage (FAST) experiment (fig. S1) where we compare the agronomical, ecological, and economic impacts of four arable cropping systems [conventional intensive tillage (C-IT), conventional no tillage (C-NT), organic intensive tillage (O-IT), and organic reduced tillage (O-RT); see Materials and Methods and tables S1 to S3 for detailed management description]. We focus on these specific management strategies since conservation and organic agriculture are two main alternatives to conventional management and are often promoted as more environmentally friendly practices. Organic agriculture prohibits the use of synthetic inputs (e.g., pesticides and fertilizers), and a range of studies show that organic farming enhances biodiversity and reduces environmental impacts but results in lower productivity (4, 5, 20, 21). Conservation agriculture, in turn, is based on three main pillars: minimum mechanical soil disturbance, permanent soil cover, and species diversification, which are applicable in many different farming contexts (22). Several studies indicate that conservation agriculture has positive effects on soil quality and protection, water regulation, energy use, and production costs (23), but productivity increases are minimal or even negative (24) and often dependent on herbicide use (25). In our study, C-NT and O-RT systems are considered to reflect conservation agriculture as the three defined pillars of conservation agriculture are largely fulfilled (minimum tillage, 6-year crop rotation, and permanent soil cover with crop residues and cover crops).

We assessed 43 variables in each cropping system, from which 38 were classified into nine agroecosystem goods and four ecosystem categories using an adapted ES framework and five were used as agronomic co-variables. We based our classification on the MEA and UKNEA frameworks (7, 10) and grouped our variables into proxies for ecosystem functions representing ecosystem processes and services and we finally valued them as ecosystem goods. Ecosystem functions, services, and goods were attributed to supporting, regulating, and provisioning ES categories. In addition, we added socioeconomic proxies and an economic category to the classical ES framework (Fig. 1 and tables S4 and S5). We did this because agroecosystems also have a socioeconomic dimension for producers and policy makers. The following nine final agroecosystem goods were used for multifunctionality assessments: biodiversity conservation, soil health preservation, erosion control, water and air pollution control, food production, income, work efficiency, and financial autonomy.
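As a sketch of how an overall multifunctionality score of this kind is commonly computed in the ecology literature (the widely used averaging approach: standardize each measured function, then average), the snippet below uses made-up numbers and variable names; it is not the FAST study’s actual weighting or data.

import numpy as np

def multifunctionality(functions):
    """functions: 2-D array, rows = plots/cropping systems, columns = measured
    ecosystem functions (e.g., yield, soil carbon, earthworm abundance)."""
    # z-score standardize each function across systems, then average across functions
    z = (functions - functions.mean(axis=0)) / functions.std(axis=0)
    return z.mean(axis=1)  # one EMF score per plot/system

# Hypothetical example: 4 cropping systems x 3 functions
data = np.array([
    [6.1, 2.3, 40.0],   # conventional, intensive tillage
    [5.8, 2.6, 55.0],   # conventional, no tillage
    [4.7, 3.1, 70.0],   # organic, intensive tillage
    [4.5, 3.4, 85.0],   # organic, reduced tillage
])
print(multifunctionality(data))

Variants of this approach weight the ecosystem categories before averaging or count the number of functions exceeding a threshold; the paper’s exact aggregation is described in its Materials and Methods.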

READ ON: Organic and conservation agriculture promote ecosystem multifunctionality | Science Advances (sciencemag.org)