Wednesday, July 27, 2022

Contamination analysis of Arctic ice samples as planetary field analogs and implications for future life-detection missions to Europa and Enceladus

Abstract

Missions to detect extraterrestrial life are being designed to visit Europa and Enceladus in the coming decades. The contact between the mission payload and the habitable subsurface of these satellites involves a significant risk of forward contamination. Standardizing protocols to decontaminate ice cores from planetary field analogs of icy moons, and to monitor contamination in downstream analysis, has direct application to developing the clean approaches crucial to life-detection missions to these satellites. Here we developed a comprehensive protocol that can be used to monitor and minimize the contamination of Arctic ice cores during processing and downstream analysis. We physically removed the exterior layers of the ice cores to minimize the bioburden from sampling. To monitor contamination, we constructed artificial controls and applied culture-dependent and culture-independent techniques such as 16S rRNA amplicon sequencing. We identified 13 bacterial contaminants, including a radioresistant species. This protocol decreases the contamination risk, provides quantitative and qualitative information about contaminating agents, and allows validation of the results obtained. This study highlights the importance of decreasing and evaluating prokaryotic contamination in the processing of polar ice cores, including in their use as analogs of Europa and Enceladus.

Introduction

Europa and Enceladus are two icy moons in our solar system identified as ocean worlds due to the presence of a liquid ocean beneath their icy surfaces1,2. Europa also has secondary liquid water reservoirs, perched in the ice and closer to the active surface3. The water bodies present in these satellites are considered habitable environments4. Several concepts for future lander missions to these moons have been developed, e.g., Europa Lander, Enceladus Orbilander, and the Joint Europa Mission (JEM)5,6,7. These missions will need to drill and sample a layer of ice (its exact extent still undetermined) to eventually reach interface water.

The Committee on Space Research (COSPAR) recommends that the study of bioburden-reduction methods for these missions reflect the type of environments found on Europa or Enceladus, focusing on the Earth organisms most likely to survive on these moons, such as cold- and radiation-tolerant organisms8,9,10. Environments on Earth that exhibit extreme conditions similar to those of planets and moons in our solar system are called planetary field analogs11,12,13. Both the Arctic and Antarctic offer locations that mimic environments present on the icy moons of Jupiter and Saturn12. These locations are populated by extremophiles, organisms adapted to survive severe conditions such as extreme temperature and pH, dryness, oxidation, UV radiation, high pressure, and high salinity14,15. Microbes from these habitats remain viable after hundreds to thousands of years in terrestrial glaciers and cryo-permafrost environments16,17, increasing the plausibility of finding putative life forms on Europa and Enceladus. Polar extremophiles are also known to survive planetary protection sterilization procedures18 and exposure to space conditions aboard the International Space Station (ISS)19.

The challenges of sampling, processing, and analyzing Arctic and Antarctic ice samples are the closest terrestrial equivalent to those of future life-detection missions to these satellites20,21. The constraints include difficulties in sampling, the analysis of low-biomass samples, and the need to minimize and monitor potential sources of contamination22 from mesophilic environments on Earth, where microbes are ubiquitous and highly abundant.

When studying recent terrestrial ice, contamination is a critical concern, and its main sources are equipment such as ice corers, handling, and transportation20,23,24. In the context of planetary field analogs, additional contamination sources must be considered: snow, air, and soil microbes are part of the terrestrial atmospheric and soil microbiomes but are not expected on icy worlds with thin atmospheres and no regolith25. Studies from the last 20 years of ice core research report the use of sterile equipment in the field while drilling ice cores (Table S1—Supplementary material). Codes of conduct and clean protocols for sampling pristine subglacial lakes in Antarctica have also been created recently22,24, a challenging and laudable effort by the scientific community to conserve these unique microbial ecosystems.

However, the potential sources of contamination are not limited to the field. During manipulation of ice cores in the laboratory, microbial contaminants can be introduced from the laboratory air, equipment, materials, reagents, or even humans, during downstream analyses such as filtration, DNA extraction (the "kitome"), amplification, and sequencing, or during cultivation in a nutritive medium. These procedures will be robotized in lander space missions, but they still represent additional layers of contact between man-made equipment from Earth and extraterrestrial samples.

Sterilization methods used for equipment cannot be applied directly to ice samples21. Controlled heat, UV-C, and chemical disinfectants such as ethanol, benzethonium chloride, and sodium hypochlorite have been applied directly to the exterior of ice samples and are especially efficient at reducing active contaminants in cultivation work20,23,26. However, they are not adequate for life detection or for molecular methods on low-biomass ice samples, and they can compromise the integrity of other microbial analyses. For example, ethanol is an effective disinfectant for decreasing contamination in culture-dependent analysis, but it does not destroy DNA molecules20. Excising the external layers of ice cores has proved effective in removing most contaminating cells without compromising the interior native biota20,23,26,27, yet no known protocol can completely prevent contamination. This leaves a last line of defense for an ethical sampling and processing methodology: contamination monitoring through background controls, which have proved very effective27,28. Culture-dependent and culture-independent analyses9,27,29,30,31,32,33,34,35,36,37,38,39 have been used both in background controls and in the planetary protection context. In past ice core studies, culture-dependent investigations appear to have prompted more care in preventing contaminants than culture-independent techniques (Table S1—Supplementary material). This is likely because DNA contamination from laboratory air or sterile material is commonly considered insignificant, given its presumably low representation relative to the microbial load of the whole community in a sample, an assumption now overturned by the increased sensitivity of polymerase chain reaction (PCR) and DNA sequencing techniques.

The lack of a standardized methodology to decrease and monitor contamination in ice core analysis, similar to what already exists for sampling22,24, remains a limiting issue for the scientific integrity of the data acquired from icy planetary field analogs, as well as for the design of proven and robust protocols for future lander and sample-return missions to icy moons. As a result, the identity and function of the microbial contaminants expected from ice core analysis remain a challenge in the field of planetary protection.

In this study, we propose a multidimensional approach to restrict and monitor the contamination inherent in processing and analyzing ice samples, combining the most effective methods presented over the last 20 years of ice core studies (Table S1—Supplementary material). The decontamination methodology for the ice core surface was mechanical, decreasing contaminants while preserving the natural biota. We constructed background controls (Fig. 1): an artificial ice core made of sterile MilliQ water (processing control) and a DNA extraction control sample (DNA-extraction control). We used both culture-dependent and culture-independent techniques, such as 16S rRNA gene amplicon sequencing of metagenomic DNA samples, assessing several levels of visible and quantifiable prokaryotic contamination. We identified the contaminating microorganisms using an established ribosomal marker database in order to understand their astrobiological relevance. Such a decontamination protocol is suitable for the design of life-detection experiments on planetary field analogs of Europa and Enceladus, targeting ice cores that may serve as proxies for the habitats of putative extraterrestrial communities. Our protocol can also serve as a testbed for procedures to decontaminate samples from future landing/return missions to the icy moons.
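
To make the role of the background controls concrete, here is a minimal sketch, not the authors' actual pipeline, of how taxa detected in the processing and DNA-extraction controls could be flagged as contaminants and subtracted from a 16S amplicon count table. All sample names, taxa, and counts below are invented for illustration.

```python
import pandas as pd

# Hypothetical per-sample taxon count table; taxa and numbers are invented.
counts = pd.DataFrame(
    {
        "Ice 1": [120, 0, 15],
        "Ice 2": [98, 3, 22],
        "processing_control": [0, 40, 12],
        "DNA_extraction_control": [0, 35, 8],
    },
    index=["Polaromonas sp.", "Cutibacterium acnes", "Deinococcus sp."],
)

# Flag any taxon detected in either background control as a putative contaminant.
controls = ["processing_control", "DNA_extraction_control"]
contaminants = counts.index[(counts[controls] > 0).any(axis=1)]

# Remove flagged taxa from the environmental samples before interpretation.
cleaned = counts.drop(index=contaminants)

print("putative contaminants:", list(contaminants))
print(cleaned[["Ice 1", "Ice 2"]])
```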

READ ON: Contamination analysis of Arctic ice samples as planetary field analogs and implications for future life-detection missions to Europa and Enceladus | Scientific Reports (nature.com)

Figure 1

(A) Sampling location on the east coast of Hudson Bay, Quebec, Canada, latitude 55.39° N, longitude 77.61° W (Map data © Sentinel-2), and the salinity measured on-site (% by mass). (B) Sampling decontamination procedure and subsequent processing preceding culture-dependent and culture-independent analysis. (C) Description of environmental samples with their respective replicates: ice cores (duplicate—Ice 1, Ice 2) and interface water (triplicate—Water 1, Water 2, Water 3). (D) Control samples: an artificial sterile ice core made in the laboratory, referred to as the processing control, and a clean filter inside a clean microtube used as a control for downstream DNA analysis, referred to as the DNA extraction control.

 

Data from OSIRIS-REx reveals loose surface of asteroid Bennu and “early aging” of asteroids

 July 26, 2022

Recently analyzed data from an asteroid surface-sample collection performed by NASA’s OSIRIS-REx mission has revealed that asteroid 101955 Bennu’s surface regolith is much looser than previously thought. Had OSIRIS-REx not fired its thrusters to back away from Bennu after collecting its sample, the spacecraft likely would’ve sunk straight into Bennu’s surface.

What’s more, additional data from OSIRIS-REx’s sample collection revealed that surface regeneration occurs much quicker on asteroids than on Earth.

Bennu’s loose surface

At 22:13 UTC (6:13 PM EDT) on October 20, 2020, OSIRIS-REx, with its Touch-And-Go Sample Acquisition Mechanism (TAGSAM) sample collection arm extended, successfully touched down on the surface of asteroid 101955 Bennu. Following surface contact, OSIRIS-REx engaged its sample collection system, filling the TAGSAM head with regolith from the surface of Bennu. OSIRIS-REx then fired its thrusters and backed away from Bennu, completing the sample collection process in less than five seconds.

Since its sample collection, OSIRIS-REx has left Bennu and is currently en route to Earth. Just before it flies by Earth, it will release the sample capsule into the atmosphere for retrieval by NASA teams following a touchdown at the Utah Test and Training Range in the United States. Meanwhile, scientists have been continuously analyzing the data collected by OSIRIS-REx during the sample collection procedure.

In a new study led by OSIRIS-REx principal investigator Dante Lauretta, scientists have discovered that the outer particles of Bennu’s surface are extremely loose and lightly bound — so much so that if a person were to step onto Bennu, they would feel little to no resistance and sink in, much like stepping into a plastic ball pit on Earth.

“If Bennu was completely packed, that would imply nearly solid rock, but we found a lot of void space in the surface,” said OSIRIS-REx scientist Kevin Walsh of the Southwest Research Institute in San Antonio, Texas.

Furthermore — to the surprise of the OSIRIS-REx team — if the probe had not fired its thrusters to back away from Bennu after collecting the surface sample, it likely would have continued sinking into Bennu’s surface until it hit a compact object.

The effect of OSIRIS-REx’s sample collection on Bennu’s surface was also clearly seen in up-close imagery of Bennu’s surface taken by the spacecraft after collecting its sample.

“What we saw was a huge wall of debris radiating out from the sample site,” Lauretta said.

The wall of debris created by the spacecraft was larger than expected, with an unusually high abundance of pebbles and other surface regolith, especially since OSIRIS-REx barely tapped the surface. Additionally, the sample collection left a large, eight-meter-wide crater on Bennu’s surface. This puzzled the OSIRIS-REx team, as every test of the sampling system on Earth had barely disturbed the testing area.

“Every time we tested the sample pickup procedure in the lab, we barely made a divot,” Lauretta said.

“Our expectations about the asteroid’s surface were completely wrong. We were like, ‘Holy cow!’”

In order to gain a better understanding of the sample collection and what occurred on Bennu’s surface during the procedure, teams had OSIRIS-REx swing back around to the sample site to gather more imagery of the crater and surrounding areas.

Using images taken before and after the sample collection, scientists were able to determine how the volume of material at the sample site, named “Nightingale,” changed when OSIRIS-REx collected its surface sample. Scientists also analyzed acceleration data from the spacecraft during touchdown. The data showed that as OSIRIS-REx touched Bennu’s surface, it experienced very little resistance and was still sinking into the surface by the time the spacecraft fired its thrusters.

“By the time we fired our thrusters to leave the surface we were still plunging into the asteroid,” said mission scientist Ron Ballouz of the Johns Hopkins Applied Physics Laboratory in Laurel, Maryland.

Mission scientists continued to run hundreds of computer simulations based on OSIRIS-REx images and acceleration data to understand the density and cohesion of Bennu’s surface. Teams varied the surface cohesion in each simulation until they got a value that closely matched real-life data collected by OSIRIS-REx.
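
The fitting loop described above can be illustrated with a toy parameter sweep: simulate the deceleration for each candidate cohesion value and keep the one closest to the telemetry. The forward model and all numbers below are placeholder assumptions, not the mission's actual simulation code or data.

```python
import numpy as np

# Stand-in forward model: predicted peak deceleration (m/s^2) for a given
# surface cohesion (Pa). The real work used elaborate impact simulations;
# this toy only captures the trend "stiffer surface -> harder stop".
def simulated_deceleration(cohesion_pa):
    return 0.02 * np.sqrt(cohesion_pa)

observed = 0.09  # m/s^2, hypothetical stand-in for the touchdown telemetry

# Sweep candidate cohesion values and keep the one that best matches the data.
candidates = np.linspace(0.1, 100.0, 1000)
errors = np.abs(simulated_deceleration(candidates) - observed)
best_fit = candidates[np.argmin(errors)]
print(f"best-fit cohesion ~ {best_fit:.1f} Pa")
```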

Having precise information on the cohesion and nature of Bennu’s surface allows scientists to better interpret asteroid observations, both from Earth and spacecraft. Additionally, the Bennu data could prove useful for developing upcoming missions to asteroids and methods to protect Earth from possible asteroid collisions. Loose asteroids like Bennu could pose a different kind of threat when entering Earth’s atmosphere than asteroids that are more solid.

“I think we’re still at the beginning of understanding what these bodies are, because they behave in very counterintuitive ways,” said OSIRIS-REx scientist Patrick Michel, director of research at the Centre National de la Recherche Scientifique at Côte d’Azur Observatory in Nice, France.

Lauretta et al.’s research on Bennu’s surface was released on July 7 in the journals Science and Science Advances.

“Early aging” of asteroids

The Lauretta et al. study isn’t the only recent study to use high-resolution up-close imagery from OSIRIS-REx to uncover the many secrets of asteroid evolution. A new study led by Marco Delbo of Université Côte d’Azur, CNRS, Observatoire de la Côte d’Azur, Laboratoire Lagrange in Nice, France, has revealed that asteroids regenerate their surfaces much faster than Earth does — meaning asteroids “age” faster.

Using up-close imagery of the surface of Bennu, scientists were able to determine that the heat from the Sun can fracture rocks on Bennu’s surface in just 10,000 to 100,000 years — much quicker than scientists previously thought.

“We thought surface regeneration on asteroids took a few millions of years. We were surprised to learn that the aging and weathering process on asteroids happens so quickly, geologically speaking,” said Delbo.

Though jarring and enormous natural disasters — such as volcanic eruptions, landslides, and earthquakes — can rapidly change a landscape on Earth, widespread changes to Earth’s surface happen gradually, typically by wind, temperature changes, and water. These natural occurrences can change and shape Earth’s surface extensively over the course of a few million years as they erode surface layers gradually. A great example of this occurrence is the Grand Canyon in the United States, which was shaped, molded, and layered by flowing rivers and other bodies of water, meaning that its top layers are its youngest while its lowest layers are around 1.8 billion years old.

On Bennu, rapid temperature changes create internal stress that fractures its surface rocks; further stress then breaks the rocks down.

So just how quickly does the temperature change on Bennu?

Every 4.3 hours, the Sun rises on Bennu. The sunlit side of Bennu can reach temperatures of nearly 127° C, while the night side experiences temperatures as low as -23° C. This rapid temperature swing every 4.3 hours is what causes the internal stress and rock fractures on Bennu.
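
To see why such a swing matters, here is a back-of-the-envelope sketch using the standard constrained-expansion relation sigma = E × alpha × ΔT; the material properties are generic assumed rock values, not measured Bennu data.

```python
# Rough estimate of the thermal stress driven by Bennu's day-night swing.
E = 20e9               # Young's modulus, Pa (assumed generic rock value)
alpha = 8e-6           # coefficient of thermal expansion, 1/K (assumed)
delta_T = 127 - (-23)  # day-night temperature swing from the article, K

sigma = E * alpha * delta_T
print(f"thermal stress ~ {sigma / 1e6:.0f} MPa")  # ~24 MPa
# Tens of MPa is comparable to the tensile strength of many rocks, so
# repeating this cycle every 4.3 hours can grow cracks by thermal fatigue.
```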

When first investigating the OSIRIS-REx images, Delbo et al. immediately noticed the signature of temperature shock in the fractured rocks imaged by the spacecraft.

There was “a distinct signature that temperature shocks between the day and the night could be the cause” in the rock images, said Delbo.

Delbo et al. then measured the lengths and angles of more than 1,500 fractures in rocks seen in the OSIRIS-REx images. Interestingly, the size of the fractures varied immensely, with some as small as a tennis racket and others as large as a tennis court. The researchers found that the fractures mainly align in the northwest-southeast direction, a major indication that they were formed by the Sun’s heat and confirmation that the Sun is a primary force shaping the landscape of Bennu.
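
One standard way to quantify such an alignment is circular statistics on the fracture azimuths. The sketch below, with invented angles and not the authors' code, doubles the angles (so a fracture at 130° and one at 310° count as the same crack axis) and computes a mean axis and a concentration score.

```python
import numpy as np

# Hypothetical fracture azimuths in degrees (0-180); the real study used >1,500.
azimuths = np.array([130.0, 141.0, 128.0, 152.0, 47.0, 136.0, 139.0, 125.0])

# Fractures are axial data, so double the angles before circular statistics.
theta = np.deg2rad(2.0 * azimuths)
C, S = np.cos(theta).sum(), np.sin(theta).sum()

R = np.hypot(C, S) / len(theta)                 # 0 = random, 1 = perfectly aligned
mean_axis = (np.rad2deg(np.arctan2(S, C)) / 2.0) % 180.0

print(f"mean fracture axis ~ {mean_axis:.0f} deg, concentration R = {R:.2f}")
# R near 1 suggests a directional driver (like solar heating); R near 0
# suggests random processes such as impacts or landslides.
```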

“If landslides or impacts were moving boulders faster than the boulders were cracking, the fractures would point in random directions,” said Delbo.

In this image from OSIRIS-REx, deep, dark, and long fractures can be seen in rocks, with some major fractures being outlined in red. (Credit: NASA/Goddard/University of Arizona)

In addition to hand-measuring the fractures seen in the images, Delbo et al. developed computer simulations to model how the fractures formed over time. Using the computer simulations and the OSIRIS-REx images, they were able to calculate the 10,000 to 100,000-year timeframe needed for the thermal rock fractures to propagate and split the rocks.

“The thermal fractures on Bennu are quite similar to what we find on Earth and on Mars in terms of how they form. It is fascinating to see that they can exist and are similar in very ‘exotic’ physical conditions [low gravity, no atmosphere], even compared to Mars,” said co-author Christophe Matonti of Université Côte d’Azur, CNRS, Observatoire de la Côte d’Azur, Géoazur, Sophia-Antipolis, Valbonne, France.

Though new rocks and surface features form relatively quickly (geologically speaking), it’s important to remember just how old these asteroids are and why we study them.

“Keep in mind, the topography of Bennu is young, but the rocks on the asteroids are still billions of years old and hold valuable information about the beginning of the solar system,” said OSIRIS-REx mission scientist Jason Dworkin of NASA’s Goddard Space Flight Center in Greenbelt, Maryland.

Delbo et al.’s research was published in June in the journal Nature Geoscience.

(Lead image: Artist’s illustration of OSIRIS-REx performing its sample collection on Bennu. Credit: NASA)

Before we Develop Self-Replicating Machines to Explore the Universe, we Should Figure out how to Turn Them off Again



An early NASA concept of an interstellar space probe.
 Credit: NASA/Johns Hopkins University Applied Physics Laboratory


POSTED ON JULY 26, 2022 BY MATT WILLIAMS

In 1948/49, famed computer scientist, engineer, and physicist John von Neumann introduced the world to his revolutionary idea for a species of self-replicating robots (aka. “Universal Assemblers”). In time, researchers involved in the Search for Extraterrestrial Intelligence (SETI) adopted this idea, arguing that self-replicating probes would be an effective way to explore the cosmos and that an advanced species may be doing this already. Among SETI researchers, “Von Neumann probes” (as they’ve come to be known) are considered a viable indication of a technologically advanced species (a technosignature).

Given the rate of progress in robotics, it’s likely just a matter of time before humanity can deploy Von Neumann probes, and the range of applications is endless. But what about the safety implications? In a recent study, Carleton University Professor Alex Ellery explores the potential harm that Von Neumann probes could cause. In particular, Ellery considers the prospect of runaway population growth (aka. the “grey goo problem”) and how a series of biologically inspired controls that impose a cap on replication cycles would prevent it.

Professor Ellery is the Canada Research Chair in Space Robotics & Space Technology in the Mechanical & Aerospace Engineering Department at Carleton University, Ottawa. The paper that describes his findings, titled “Curbing the fruitfulness of self-replicating machines,” recently appeared in the International Journal of Astrobiology. For the sake of this study, Ellery investigated how interstellar Von Neumann probes could explore the Milky Way galaxy safely by imposing a limit on their ability to reproduce.

Universal Assemblers in Space


The topic of Von Neumann probes and their applications (and implications) for space exploration and SETI is one in which Ellery is well-versed. While Von Neumann was interested in self-replicating machines as a means of advancing the frontiers of robotics and manufacturing, the concept was quickly seized upon by researchers engaged in the Search for Extraterrestrial Intelligence (SETI). During the 1980s, astronomer Frank Tipler used the concept of these machines to advance the argument that intelligent life did not exist beyond Earth.

This argument remains central to the Fermi Paradox, which essentially states that the assumed likelihood of intelligent life in the Universe stands in contrast to the absence of evidence for it. According to Tipler, an advanced intelligence that preceded humanity would have likely created Von Neumann probes long ago to explore and colonize the galaxy and have had ample time to do it. As he stated in his first paper on the subject, titled “Extraterrestrial Intelligent Beings do not Exist” (released in 1979):

“I shall assume that such a species will eventually develop a self-replicating universal constructor with intelligence comparable to the human level – such a machine should be developed within a century, according to the experts – and such a machine combined with present-day rocket technology would make it possible to explore and/or colonize the Galaxy in less than 300 million years.”

Considering that humanity sees no evidence of self-replicating machines in our galaxy (and hasn’t been visited by any), we must assume that no intelligent civilizations exist, says Tipler. These conclusions prompted a spirited response from Carl Sagan and William Newman in a paper titled “The Solipsist Approach to Extraterrestrial Intelligence” (aka. “Sagan’s Response”), in which Sagan famously stated that “the absence of evidence is not the evidence of absence.” This debate and Tipler’s paper profoundly influenced Ellery, who was studying in the UK at the time. As he recounted to Universe Today via email:

“When I was at the University of Sussex doing my masters in astronomy, I came across the Von Neumann probe concept. And that completely captivated my imagination, and I decided then to shift to engineering and space robotics. Basically, I was trying to figure out how I could start to work on Von Neumann probes, eventually. After my PhD, I spent a couple of years working in a hospital as a medical physicist and in industry for a bit, and then came back to academia.”

Traditionally, Ellery’s work has been focused on space robotics, primarily with planetary rovers for Martian exploration and other aspects like servicing manipulators and space debris removal. Through this, Ellery has also maintained a close connection to the astrobiology community, which is dedicated to searching for extraterrestrial life on Mars and beyond. While the concept of self-replicating machines was always in the back of his mind, it was only a few years ago that research programs emerged that allowed him to work on it in an official capacity.

As Ellery explained, his recent study considers self-replicating robots as a means of building infrastructure on the Moon (which would facilitate human exploration) but has applications far beyond that:

“I started moving away from the rovers and more into the In-Situ Resource Utilization [ISRU] side of things. You know, mining the Moon and that sort of stuff. I was actually doing some work in 3D printing as a mechanism to leverage resources from the Moon. So one of the things we’ve done is we’ve 3D printed an electric motor. This is a major step towards realizing 3D-printed robotic machines on the Moon using lunar resources.

“The primary motivation in the back of my mind was I’m doing this to try and build a self-replicating machine. And so, in a way, it’s come full circle to my original interest in using self-replicating machines to explore the cosmos, and its implications for the SETI program, for the Fermi Paradox, and so on, which was what originally motivated me down this road.”

As with all things concerning technological advancement and humanity’s future in space, there are undeniable issues that need to be discussed beforehand. When it comes to self-replicating machines, there is the question of whether they might grow beyond our control someday. If, perchance, some malfunctioned and began consuming everything in their environment (multiplying exponentially in the process), the results would be disastrous. This speculative prospect is known as the “grey goo” scenario.

The Problem with “Grey Goo”


The term “grey goo” (or “gray goo”) was coined by famed engineer and nanotechnology pioneer K. Eric Drexler. In his 1986 book, Engines of Creation, he posed a thought experiment in which molecular self-replicating robots escape from a laboratory and multiply ad infinitum, potentially leading to an ecological catastrophe. As Drexler described it:

“[E]arly assembler-based replicators could beat the most advanced modern organisms. ‘Plants’ with ‘leaves’ no more efficient than today’s solar cells could out-compete real plants, crowding the biosphere with an inedible foliage. Tough, omnivorous ‘bacteria’ could out-compete real bacteria: they could spread like blowing pollen, replicate swiftly, and reduce the biosphere to dust in a matter of days. Dangerous replicators could easily be too tough, small, and rapidly spreading to stop – at least if we made no preparation. We have trouble enough controlling viruses and fruit flies.”

While Drexler dismissed the likelihood of doomsday scenarios and later stated that he wished he’d “never used the term ‘gray goo,'” he nevertheless felt the scenario needed to be taken seriously. As he stated in his book, this thought experiment made it clear that humanity “cannot afford certain kinds of accidents with replicating assemblers.” The theory that such machines could run amok and become hostile to life has been explored as a possible resolution to the Fermi Paradox (known as the “Berserker Hypothesis”).

Ellery echoes Drexler’s statement by explaining how the “grey goo” scenario is more about perception than fact. Nevertheless, he agrees that it is a prospect that warrants discussion and action to ensure that worst-case scenarios can be avoided. As he put it:

“It’s an idea more than anything else. I’m not convinced of how realistic it is. The problem is, it’s the commonest question: ‘What do you do about uncontrolled self-replication?’ Because people imagine a self-replicating machine to be like a virus, and it will spread. Which it could do, but only so far as there are resources available.

“[T]he thing that we have to be cognizant of is that self-replicating machines sound scary. We have to be aware that some people have knowledge and can temper that knowledge, and so can appreciate rational argument. Other people are more focused on the fear rather than the probability. But the idea behind being able to curb the self-replication process is to try and show that we are working on that problem – that we’re not just going straight into self-replicating machines without thinking about the potential consequences.”

Therefore, says Ellery, it is incumbent upon us to develop preventative measures to ensure that self-replicating Von Neumann probes will behave themselves before we begin experimenting. To this end, he recommends a solution inspired by cellular biology.

RepRap “Mendel” self-replicating robot. 
Credit: RepRap Project/Wikimedia

Telomeres for Robots?

The key to Ellery’s approach is the Hayflick Limit, a concept in biology stating that a normal human cell can only replicate and divide forty to sixty times before it breaks down due to programmed cell death (apoptosis). This limit is imposed by telomeres, the protective caps at the ends of the chromosomes’ DNA strands in the nuclei of animal and plant cells, which progressively shorten with each cellular replication. In recent years, telomere research has advanced considerably thanks to growing interest in anti-aging treatments.

In much the same way, Ellery’s proposal consists of machines with “memory modules” analogous to chromosomes, which carry genetic memory in the form of DNA sequences. These modules are composed of volumetric arrays of magnetic-core memory cells, each programmed with “genetic instructions” in the form of zeros and ones. During the replication process, the “parent machine” copies these instructions into every “offspring machine,” and the telomeres shorten with each replication.

In this case, the telomeres constitute a physical, linear tail of blank memory cells that feeds into the original to-be-read memory array. The number of these cells corresponds to the maximum number of replications, thereby imposing an artificial Hayflick Limit. As Ellery detailed the process:

“When you start to copy [an array], you start at a certain point, and that’s defined. And then the copier goes from one cell to the next and copies each one with a blank magnetic core. The point is that when you copy it, you position your copier at point one, and then you move on and start copying at the next point. You don’t copy the point you started at.

“So what happens is that as you copy, you lose that first position; in the next generation you are missing the first position, and you start copying at the second position. As long as you are not copying the first position – where you’ve placed your initial copier – it only copies from the next square onward. So you’re chopping out information.

“Now, in order to retain the information in that block, you can add a tail of units. The copier copies each unit along the tail but not the first one it lands on, so with every generation you lose one cell of memory. That first tail doesn’t actually code for anything; it’s just basically a telomere. So you just plan how many generations you need, [which] defines how long your tail is.”
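
A toy sketch may make this copying scheme clearer. Under a simplified, assumed model in which a machine's genome is a list of memory cells prefixed by a blank telomere tail, copying from the second cell onward loses one cell per generation, and replication halts once the tail is spent. All names and values below are illustrative, not Ellery's actual design.

```python
BLANK = None                  # a blank magnetic core in the telomere tail
PAYLOAD = [1, 0, 1, 1, 0, 1]  # stand-in for the machine's build instructions
TELOMERE = [BLANK] * 3        # 3 blank cells -> at most 3 replications

def replicate(genome):
    """Copy a parent genome into an offspring, skipping the first cell.

    Because copying starts one position past the parent's start point, each
    generation loses one cell. While the telomere tail lasts, only blanks
    are lost; once the tail is gone, the payload itself would be damaged,
    so the machine refuses to replicate -- an artificial Hayflick limit.
    """
    if genome[0] is not BLANK:  # tail exhausted: next copy would eat payload
        return None             # the robotic analog of apoptosis
    return genome[1:]           # offspring genome, one telomere cell shorter

genome = TELOMERE + PAYLOAD
generations = 0
while (offspring := replicate(genome)) is not None:
    generations += 1
    genome = offspring
print(f"lineage halted after {generations} replications")  # -> 3
```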

Artist’s impression of a Mars habitat in conjunction with other surface elements on Mars. Credit: NASA

Potential Applications

In his study, Ellery explores several applications for self-replicating robots, starting with the industrialization of the Moon. In each case, the robots rely on additive manufacturing (3D printing) and in-situ resource utilization (ISRU) to manufacture the infrastructure necessary for human habitation, as well as more copies of themselves. This would allow for a long-term human presence on the Moon, consistent with the goals NASA, the ESA, and the Chinese and Russian space agencies have set for the coming decades.

In particular, NASA has stressed that the main objective of the Artemis Program is to create a “sustained program of lunar exploration.” This presents one of the main advantages of self-replicating machines, which is their ability to develop Lunar and other celestial resources sustainably. Said Ellery:

“When we go to the Moon, we have two choices. At the moment, the interest is in going to the moon and mining water, and then separating the hydrogen and oxygen, then burning it as propellant. To me, that is wasteful. That’s not sustainable because you’re taking a scarce resource and then you’re burning it, much like we’ve done on our Earth. Why are we taking all our bad habits with us?

“The other approach, of course, is to look at resources and work within the limits of those resources. So, look at minerals: common rock-forming minerals are widespread on the Moon and asteroids. Basically, we would be sending these probes to the Moon and other locations in the Solar System, wherever we’re intending to [build bases]. They’re tectonically dead; they’re just hunks of rock. So I see nothing wrong with utilizing those resources.”

Beyond the Moon, Mars, and asteroid mining, there’s the potential for creating space habitats (such as O’Neill Cylinders) at the Earth-Sun Lagrange Points, and even terraforming operations. Last but not least is the potential for interstellar exploration and the possible settlement of exoplanets! Once again, Ellery stresses how this will have implications where the whole SETI vs. METI (Messaging Extraterrestrial Intelligence) debate is concerned:

“These Von Neumann probes potentially act as Scouts. They’re scouting vehicles to investigate target locations before humans go there. So, of course, it makes no sense to send out a World Ship from here to another stellar system that you’ve never been to. You have no idea what’s there. You want to send robotic machines out there first to give you information. So you understand what the implications are, what you need to take with you, what resources you need, and what resources you don’t need.

“Self-replicating probes are the mechanism for doing any kind of interstellar space exploration, whichever way you want to look at it. You will always use these to scout beforehand and to send back information, if only to determine which planets are Earth-like, which are of no use, and which might be of use (ones you could adapt to). And perhaps most important would be to find out if there is intelligent life and whether it presents a threat. You send a Von Neumann probe to a planet, and the civilization doesn’t know where it comes from. You send signals, they know exactly where the signal is coming from.”

Annotated illustration of the Solar System and Interstellar Medium (ISM). 
Credits: Charles Carter/Keck Institute for Space Studies

These and other considerations are paramount given that humanity finds itself on the verge of another “Space Age.” With multiple plans to “return to the Moon,” explore Mars, establish permanent bases and infrastructure, mine asteroids, and commercialize Low Earth Orbit (LEO), there are countless safety, legal, logistical, and ethical considerations that need to be worked out in advance. This is also essential when looking beyond the next few decades, when technological revolutions and the possibility of First Contact present certain existential risks.

In short, if we plan to deploy self-replicating probes to pave the way for human settlement, we need to ensure that they will not run amok and begin consuming everything in sight. This is especially true if these probes are to be used as scouts, exploring the galaxy and maybe acting as our ambassadors to extraterrestrial civilizations. Making “programmed cell death” a part of their design is an elegant solution that ensures our creations will be tempered by mortality. We can only hope that if an extraterrestrial civilization is already exploring the cosmos with Von Neumann probes, they took similar precautions!

Further Reading: Cambridge University
SPAIN

Climate change has Seville so hot it's started naming heat waves like hurricanes

Zoe arrived this week.


I. Bonifacic
@igorbonifacic
July 26, 2022 
Marcelo del Pozo / reuters


The city of Seville is trying something new to raise awareness of climate change and save lives. With oppressive heat waves becoming a fact of life in Europe and other parts of the world, the Spanish metropolis has begun naming them. The first one, Zoe, arrived this week, bringing with it expected daytime highs above 109 degrees Fahrenheit (or 43 degrees Celsius).

As Time points out, there’s no single scientific definition of a heat wave. Most countries use the term to describe periods of temperatures that are higher than the historical and seasonal norms for a particular area. Seville’s new system categorizes those events into three tiers, with names reserved for the most severe ones and an escalating municipal response tied to each level. The city will designate future heat waves in reverse alphabetical order, with Yago, Xenia, Wenceslao and Vega to follow.
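
As a rough illustration of how such a tiered naming scheme might work, here is a toy sketch. The temperature thresholds below are invented for illustration; Seville's actual criteria reportedly combine forecast weather with health indicators.

```python
# Names come from the article: issued in reverse alphabetical order,
# and reserved for the most severe (tier-3) heat waves only.
NAMES = ["Zoe", "Yago", "Xenia", "Wenceslao", "Vega"]
name_queue = list(NAMES)

def classify(peak_temp_c):
    """Return (tier, name); only tier-3 heat waves receive a name."""
    if peak_temp_c >= 42:  # threshold invented for illustration
        return 3, (name_queue.pop(0) if name_queue else None)
    if peak_temp_c >= 39:  # escalated municipal response, but unnamed
        return 2, None
    return 1, None         # baseline advisories

print(classify(43))  # -> (3, 'Zoe'), matching this week's event
```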

It’s a system akin to the ones that organizations like the US National Hurricane Center have used for decades to raise awareness of impending tropical storms, tornadoes and hurricanes. The idea is that people are more likely to take a threat seriously and act accordingly when it's given a name.

"This new method is intended to build awareness of this deadly impact of climate change and ultimately save lives," Kathy Baughman McLeod, director of the Adrienne Arsht-Rockefeller Foundation Resilience Center, the think tank that helped develop Seville’s system, told Euronews. Naming heat waves could also help some people realize that we're not dealing with occasional “freak” weather events anymore: they’re the byproduct of a warming planet.
Rugby

‘Conclusive evidence’ repetitive head impacts can cause brain disease

Chronic traumatic encephalopathy is 68 times more likely in contact-sport athletes


New research claims to have found 'conclusive evidence' that repetitive head impacts can potentially cause degenerative brain disease.
 Photograph: Getty

Wed Jul 27 2022

New research claims to have found “conclusive evidence” that repetitive head impacts can cause degenerative brain disease, with leading sports organisations urged to acknowledge the analysis by world-leading experts.

An international team of experts has issued a global call for further chronic traumatic encephalopathy (CTE) prevention and mitigation efforts, especially for children.

Researchers from Oxford Brookes University and 12 other academic institutions produced the study, with analysis provided by Concussion Legacy Foundation UK.

It found that the brain banks of the US Department of Defence, Boston University-US Department of Veterans Affairs and Mayo Clinic have published independent studies showing contact-sport athletes were at least 68 times more likely to develop CTE than those who did not play any contact sport.


In rugby, a number of players have shown signs of early-onset dementia, and on Monday the lobby group Progressive Rugby said it was finalising a comprehensive list of welfare requirements it would submit to governing bodies such as World Rugby and the English Rugby Football Union.

Dr Chris Nowinski, the lead author of the study and chief executive at the Concussion Legacy Foundation, said: “This innovative analysis gives us the highest scientific confidence that repeated head impacts cause CTE.

“Sport governing bodies should acknowledge that head impacts cause CTE and they should not mislead the public on CTE causation while athletes die, and families are destroyed, by this terrible disease.”

Dr Adam White, senior lecturer in sport and coaching sciences at Oxford Brookes University and executive director of the Concussion Legacy Foundation UK, said: “This analysis shows it is time to include repetitive head impacts and CTE among other child safety efforts like smoking, sunburns and alcohol.

“Repetitive head impacts and CTE deserve recognition in the global public health discussion of preventable disorders caused by childhood exposure in contact sports like football, rugby, ice hockey and others.”

The research paper, “Applying the Bradford Hill Criteria for Causation to Repetitive Head Impacts and Chronic Traumatic Encephalopathy”, has been published in Frontiers in Neurology and it is hoped it can force global sporting organisations such as Fifa and World Rugby into acknowledging there is a causal link between CTE and repetitive head impacts.

Asked about reports in this morning’s Irish Times that the IRFU is to face legal action over alleged issues arising from concussion among former players, Sports Minister Jack Chambers said the protocols around concussion had been strengthened in recent years through the international rugby authorities.

“I know the IRFU need to make sure that they match all international best practice and that’s something I know they’re cognisant of, and we all want to ensure that HIAs are taken very seriously and I support all efforts to protect players and protect people as they play the game.”

He said there were medical assessments of whether a player can continue to play or to play the week following a concussion which “should be independent and objective to protect player welfare”.

Asked about recent moves in the United Kingdom to ban headers in under-12 leagues, he said it was something the Football Association of Ireland “could consider”.

“You need to follow best medical practice when it comes to protecting players, and it is a concern across sport. Sport is about wellbeing and promoting health and we need to ensure whatever measures are taken by sports that they mitigate any impacts when it comes to young people or people as they advance through the professional or performance systems.”