It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way (K.Marx, Letter to F.Engels on the Indian Mutiny)
Tuesday, February 16, 2021
Capuchin monkey genome reveals clues to its long life and large brain
An international team of scientists has sequenced the genome of a capuchin monkey for the first time, uncovering new genetic clues about the evolution of their long lifespan and large brains.
Published in PNAS, the work was led by the University of Calgary in Canada and involved researchers at the University of Liverpool.
"Capuchins have the largest relative brain size of any monkey and can live past the age of 50, despite their small size, but their genetic underpinnings had remained unexplored until now," explains Professor Joao Pedro De Magalhaes, who researches ageing at the University of Liverpool.
The researchers developed and annotated a reference assembly for white-faced capuchin monkeys (Cebus imitator) to explore the evolution of these traits.
Through a comparative genomics approach spanning a wide diversity of mammals, they identified genes under evolutionary selection associated with longevity and brain development.
"We found signatures of positive selection on genes underlying both traits, which helps us to better understand how such traits evolve. In addition, we found evidence of genetic adaptation to drought and seasonal environments by looking at populations of capuchins from a rainforest and a seasonal dry forest," said senior author and Canada Research Chair Amanda Melin who has studied capuchin monkey behaviour and genetics for almost 20 years.
The researchers identified genes associated with DNA damage response, metabolism, cell cycle, and insulin signalling. Damage to the DNA is thought to be a major contributor to ageing and previous studies by Professor de Magalhaes and others have shown that genes involved in DNA damage responses exhibit longevity-specific selection patterns in mammals.
"Of course, because aging-related genes often play multiple roles it is impossible to be sure whether selection in these genes is related to ageing or to other life-history traits, like growth rates and developmental times, that in turn correlate with longevity," said Professor De Magalhaes.
"Although we should be cautious about the biological significance of our findings, it is tempting to speculate that, like in other species, changes to specific aging-related genes or pathways, could contribute to the longevity of capuchins," he added.
The team's insights were made possible thanks to the development of a new technique to isolate DNA more efficiently from primate faeces.
FecalFACS utilises an existing technique that was developed to separate cell types in body fluids - for example, to separate different cell types in blood for cancer research - and applies it to primate faecal samples.
"This is a major breakthrough because the typical way to extract DNA from faeces results in about 95-99% of the DNA coming from gut microbes and food items. A lot of money has been spent sequencing genomes from organisms other than the mammals we're actually trying to study. Because of this, when wildlife biologists have required entire genomes, they have had to rely on purer sources of DNA, like blood, saliva, or tissue - but as you can imagine, those are very hard to come by when studying endangered animals," explained the study's lead author, Dr Joseph Orkin, who completed work on this project as a postdoctoral scholar at the University of Calgary and who is now at Universitat Pompeu Fabra-CSIC in Barcelona.
"FecalFACS finally provides a way to sequence whole genomes from free-ranging mammals using readily available, non-invasive samples, which could really help future conservation efforts," he added.
###
In predicting shallow but dangerous landslides, size matters
Computer models to predict areas most likely to slide confront lack of subsurface data
The threat of landslides is again in the news as torrential winter storms in California threaten to undermine fire-scarred hillsides and bring deadly debris flows crashing into homes and inundating roads.
But it doesn't take wildfires to reveal the landslide danger, University of California, Berkeley, researchers say. Aerial surveys using airborne laser mapping -- LiDAR (light detection and ranging) -- can provide very detailed information on the topography and vegetation that allow scientists to identify which landslide-prone areas could give way during an expected rainstorm. This is especially important for predicting where shallow landslides -- those just involving the soil mantle -- may mobilize and transform as they travel downslope into destructive debris flows.
The catch, they say, is that such information cannot yet help predict how large and potentially hazardous the landslides will be, meaning that evacuations may target lots more people than are really endangered by big slides and debris flows.
In a new paper appearing this week in the journal Proceedings of the National Academy of Sciences, UC Berkeley geologist William Dietrich and project scientist Dino Bellugi report their latest attempt at tagging landslide-prone areas according to their likely size and hazard potential, in hopes of more precise predictions. Their model takes into account the physical aspects of hillsides -- steepness, root structures holding the slope in place and soil composition -- and the pathways water follows as it runs downslope and into the soil.
Yet, while the model is better at identifying areas prone to larger and potentially more dangerous landslides, the researchers discovered factors affecting landslide size that can't easily be determined from aerial data and must be assessed from the ground -- a daunting task, if one is concerned about the entire state of California.
The key unknowns are what the subsurface soil and underlying bedrock are like and the influence of past landslides on ground conditions.
"Our studies highlight the problem of overprediction: We have models that successfully predict the location of slides that did occur, but they end up predicting lots of places that didn't occur because of our ignorance about the subsurface," said Dietrich, UC Berkeley professor of earth and planetary science. "Our new findings point out specifically that the spatial structure of the hillslope material -- soil depth, root strength, permeability and variabilities across the slope -- play a role in the size and distribution and, therefore, the hazard itself. We are hitting a wall -- if we want to get further with landslide prediction that attempts to specify where, when and how big a landslide will be, we have to have knowledge that is really hard to get, but matters."
CAPTION
Aerial photograph of a hillslope after a rainstorm in February 2017 that generated 595 shallow landslides in a 16 square kilometer (6.4 square mile) area in the hills west of Williams, California. In the image, the landscape slopes downward from left to right. The darker brown upslope element of each scar is the landslide, while the lighter toned area downslope records the path the landslide took as it mobilized as mudflow, locally scouring and burying the grass in mud. The scale bar in lower left is 11 meters (36 feet) long.
CREDIT
Image provided by the National Center for Airborne Laser Mapping
Models key to targeted evacuations
Decades of studies by Dietrich and others have led to predictive models of where and under what rainfall conditions slopes will fail, and such models are used worldwide in conjunction with weather prediction models to pinpoint areas that could suffer slides in an oncoming storm and warn residents. But these models, triggered by so-called empirical rainfall thresholds, are conservative, and government agencies often end up issuing evacuation warnings for large areas to protect lives and property.
Dietrich, who directs the Eel River Critical Zone Observatory -- a decade-long project to analyze how water moves all the way from the tree canopy through the soil and bedrock and into streams -- is trying to improve landslide size prediction models based on the physics of slopes. Airborne laser imaging using LiDAR can provide submeter-scale detail, not only of vegetation, but also of the ground under the vegetation, allowing precise measurements of slopes and a good estimate of the types of vegetation on the slopes.
Slopes fail during rainstorms, he said, because the water pressure in the soil -- the pore pressure -- pushes soil particles apart, making them buoyant. The buoyancy reduces the friction holding the soil particles against gravity, and once the mass of the slide is enough to snap the roots holding the soil in place, the slope slumps. Shallow slides may involve only the top portion of the soil, or scour down to bedrock and push everything below it downslope, creating deadly debris flows that can travel several meters per second.
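The balance Dietrich describes -- pore pressure reducing the friction that holds soil against gravity -- is often written as an infinite-slope factor of safety, comparing resisting stress (cohesion plus friction on the pressure-reduced effective normal stress) against the gravitational driving stress. The sketch below is a generic textbook formulation, not the study's model, and every parameter value is a hypothetical placeholder:

```python
import math

def factor_of_safety(slope_deg, soil_depth, pore_pressure,
                     cohesion=2000.0, unit_weight=18000.0,
                     friction_angle_deg=35.0):
    """Infinite-slope factor of safety (illustrative sketch only).

    slope_deg           hillslope angle (degrees)
    soil_depth          vertical soil depth (m)
    pore_pressure       water pressure at the failure plane (Pa)
    cohesion            soil + root cohesion (Pa) -- hypothetical value
    unit_weight         soil unit weight (N/m^3)
    friction_angle_deg  internal friction angle (degrees)

    FS < 1 means driving stress exceeds resisting stress: failure.
    """
    theta = math.radians(slope_deg)
    phi = math.radians(friction_angle_deg)
    normal_stress = unit_weight * soil_depth * math.cos(theta) ** 2
    driving_stress = unit_weight * soil_depth * math.sin(theta) * math.cos(theta)
    # Pore pressure subtracts from the effective normal stress (the "buoyancy"
    # effect), cutting frictional resistance without changing the driving stress.
    resisting_stress = cohesion + (normal_stress - pore_pressure) * math.tan(phi)
    return resisting_stress / driving_stress

# The same slope, dry versus after a storm that raises pore pressure:
print(round(factor_of_safety(35, 1.0, pore_pressure=0.0), 2))     # 1.24: stable
print(round(factor_of_safety(35, 1.0, pore_pressure=8000.0), 2))  # 0.57: fails
```

The point of the sketch is the sensitivity: cohesion (roots), soil depth, and pore pressure all enter directly, which is why the subsurface variability discussed below controls where and how big slides become.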
Each wet year along the Pacific Coast, homes are swept away and lives are lost to large landslides, though the threat is worldwide. As illustrated by a landslide in Sausalito exactly two years ago, landslides can originate just a short distance upslope and mobilize as a debris flow traveling meters per second before striking a house. The size of the initial landslide will influence the depth and speed of the flow and the distance it can travel downslope into canyons, Dietrich said.
With earlier computer models, Dietrich and his colleagues were able to pinpoint more precisely the places on hillslopes that would suffer landslides. In 2015, for example, Bellugi and Dietrich used their computer model to predict shallow landslides on a well-studied hillslope in Coos Bay, Oregon, during a sequence of landslide-triggering rainstorms, based solely on these physical measures. Those models employed LiDAR data to calculate steepness and how water would flow downslope and affect pore pressure inside the slope; the seasonal history of rainfall in the area, which helps assess how much groundwater is present; and estimates of the soil and root strength.
In the new paper, Bellugi and David Milledge of Newcastle University in Newcastle upon Tyne in the United Kingdom tested the landslide prediction model on two very different landscapes: a very steep, deeply etched and forested hillside in Oregon, and a smooth, grassy, gently sloped glacial valley in England's storied Lake District.
Surprisingly, they found that the distribution of small and large shallow landslides was quite similar across both landscapes and could be predicted if they took into account one extra piece of information: the variability of hillslope strength across these hillsides. They discovered that small slides can turn into major slides if the conditions -- soil strength, root strength and pore pressure -- do not vary sufficiently over short distances. Essentially, small slides can propagate across the slope and become larger by connecting isolated slide-prone areas, even if they're separated by more solid slope.
"These areas that are susceptible to shallow landslides, even though you may be able to define them, may coalesce, if close enough to each other. Then you can have a big landslide that encompasses some of these little patches of low strength," Bellugi said. "These patches of low strength may be separated by areas that are strong -- they may be densely forested or less steep or drier -- but if they are not well separated, then those areas can coalesce and make a giant landslide."
"On hillsides, there are trees and topography, and we can see them and quantify them," Dietrich added. "But starting from the surface and going down into the ground, there is a lot that we need in models that we can't now quantify over large areas: the spatial variation in soil depth and root strength and the influence of groundwater flow, which can emerge from the underlying bedrock and influence soil pore pressure."
Getting such detailed information across an entire slope is a herculean effort, Dietrich said. On the Oregon and Lake District slopes, researchers walked or scanned the entire area to map vegetation, soil composition and depth, and past slides meter by meter, and then painstakingly estimated root strength, all of which is impractical for most slopes.
"What this says is that to predict the size of a landslide and a size distribution, we have a significant barrier that is going to be hard to cross -- but we need to -- which is to be able to characterize the subsurface material properties," Dietrich said. "Dino's paper says that the spatial structure of the subsurface matters."
The researchers' previous field studies found, for example, that fractured bedrock can allow localized subsurface water flow and undermine otherwise stable slopes, something not observable -- yet -- by aerial surveys.
They urge more intensive research on steep hillsides to be able to predict these subsurface features. This could include more drilling, installing hydrologic monitoring equipment and application of other geophysical tools, including cone penetrometers, which can be used to map soil susceptible to failure.
###
Other co-authors of the paper are Lauren Larsen and Kurt Cuffey, UC Berkeley professors of geography.
The work was supported by a Gordon and Betty Moore Foundation Data Driven Discovery Investigator Award to Bellugi and Larsen. Dietrich is supported by a National Science Foundation grant for the Eel River Critical Zone Observatory (EAR-1331940). Cuffey was supported by the Martin Family Foundation.
Commuters are inhaling unacceptably high levels of carcinogens
Twenty minutes or longer in the car also raises risk of birth defects
A new study finds that California's commuters are likely inhaling chemicals at levels that increase the risk for cancer and birth defects.
As with most chemicals, the poison is in the amount. Under a certain threshold of exposure, even known carcinogens are not likely to cause cancer. Once you cross that threshold, the risk for disease increases.
Governmental agencies tend to regulate that threshold in workplaces. However, private spaces such as the interior of our cars and living rooms are less studied and less regulated.
Benzene and formaldehyde -- both used in automobile manufacturing -- are known to cause cancer at or above certain levels of exposure and are Prop. 65-listed chemicals.
New UC Riverside research shows that the average commuter in California is exceeding the threshold for exposure, breathing in unacceptably high levels of both chemicals.
Both benzene and formaldehyde are carcinogens, and benzene carries the additional risk of reproductive and developmental toxicity.
"These chemicals are very volatile, moving easily from plastics and textiles to the air that you breathe," said David Volz, UCR professor of environmental toxicology.
The study, published in the journal Environment International, calculated the daily dose of benzene and formaldehyde being inhaled by drivers with commutes of at least 20 minutes per day.
It found that up to 90% of the population in Los Angeles, San Diego, Orange, Santa Clara, and Alameda counties have at least a 10% chance of exceeding the threshold for cancer risk from inhaling the chemicals, based on 30-minute average commute times.
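The underlying dose arithmetic is simple: inhaled mass scales with cabin concentration, breathing rate, and time in the car. The numbers below are hypothetical placeholders for illustration, not the study's measured values:

```python
def inhaled_dose_ug(concentration_ug_m3, minutes, inhalation_rate_m3_hr=0.66):
    """Inhaled mass (micrograms) for one commute leg -- illustrative only.

    concentration_ug_m3    in-cabin concentration (ug/m^3) -- hypothetical
    minutes                time spent in the car
    inhalation_rate_m3_hr  typical adult light-activity breathing rate
    """
    return concentration_ug_m3 * inhalation_rate_m3_hr * (minutes / 60.0)

# Doubling commute time from 20 to 40 minutes doubles the daily dose,
# which is why commute duration drives the risk of exceeding a threshold.
short_trip = inhaled_dose_ug(2.0, 20)  # e.g. 2 ug/m^3 benzene, 20-minute drive
long_trip = inhaled_dose_ug(2.0, 40)
print(short_trip, long_trip)
```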
"Of course, there is a range of exposure that depends on how long you're in the car, and how much of the compounds your car is emitting," said Aalekhya Reddam, a graduate student in the Volz laboratory, and lead author of the study.
Previously, Volz and Reddam studied commuter exposure to a flame retardant called TDCIPP or chlorinated tris, and found that longer commute times increased exposure to that carcinogen as well.
They set out on this study wanting to understand the risk of that compound relative to other chemicals introduced during car manufacturing.
Reddam advises commuters to keep the windows open during their rides if possible. "At least with some air flow, you'd be diluting the concentration of these chemicals inside your car," she said.
Benzene is used to produce synthetic fibers, and formaldehyde is a binder in plastics. "There should be alternatives to these chemicals to achieve the same goals during vehicle manufacturing," Volz said. "If so, these should be used."
###
Corn belt farmland has lost a third of its carbon-rich soil
UMass Amherst researchers used remote sensing to quantify the previously underestimated erosion
More than one-third of the Corn Belt in the Midwest - nearly 100 million acres - has completely lost its carbon-rich topsoil, according to University of Massachusetts Amherst research that indicates the U.S. Department of Agriculture has significantly underestimated the true magnitude of farmland erosion.
In a paper published in the Proceedings of the National Academy of Sciences, researchers led by UMass Amherst graduate student Evan Thaler, along with professors Isaac Larsen and Qian Yu in the department of geosciences, developed a method using satellite imagery to map areas in agricultural fields in the Corn Belt of the Midwestern U.S. that have no remaining A-horizon soil. The A-horizon is the upper portion of the soil that is rich in organic matter, which is critical for plant growth because of its water and nutrient retention properties. The researchers then used high-resolution elevation data to extrapolate the satellite measurements across the Corn Belt and estimate the true magnitude of erosion.
Productive agricultural soils are vital for producing food for a growing global population and for sustaining rural economies. However, degradation of soil quality by erosion reduces crop yields. Thaler and his colleagues estimate that erosion of the A-horizon has reduced corn and soybean yields by about 6%, leading to nearly $3 billion in annual economic losses for farmers across the Midwest.
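An estimate of this kind is composed from a few factors: eroded area, baseline yield, crop price, and the fractional yield reduction. The sketch below shows the structure of the calculation only; all per-unit figures are hypothetical placeholders, not the study's inputs:

```python
def annual_loss_usd(eroded_acres, baseline_yield_bu_per_acre,
                    price_usd_per_bu, yield_reduction=0.06):
    """Annual revenue lost to erosion-driven yield decline (illustrative).

    The 6% yield_reduction default echoes the estimate quoted above;
    all other defaults and inputs are hypothetical.
    """
    return (eroded_acres * baseline_yield_bu_per_acre
            * price_usd_per_bu * yield_reduction)

# e.g. 100 million affected acres, 150 bu/acre corn, $3.50/bu -- hypothetical
print(annual_loss_usd(100e6, 150, 3.50))  # on the order of $3 billion
```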
The A-horizon has primarily been lost on hilltops and ridgelines, which indicates that tillage erosion - downslope movement of soil by repeated plowing - is a major driver of soil loss in the Midwest. Notably, tillage erosion is not included in national assessments of soil loss and the research highlights the urgent need to include tillage erosion in the soil erosion models that are used in the U.S. and to incentivize adoption of no-till farming methods.
Further, their research suggests erosion has removed nearly 1.5 petagrams of carbon from hillslopes. Restoring organic carbon to the degraded soils by switching from intensive conventional agricultural practices to soil-regenerative practices has the potential to sequester carbon dioxide from the atmosphere while restoring soil productivity.
###
POSTMODERN ALCHEMY
Hydrogen peroxide, universal oxidizing agent, high-efficiency production by simple process
Computer simulation-based development of a catalyst for hydrogen peroxide production with a selectivity of 95%. The platinum-gold alloy catalyst facilitates direct synthesis of hydrogen peroxide from hydrogen and oxygen at room temperature and atmospheric pressure.
Hydrogen peroxide is used as a disinfectant, after dilution in water, to treat wounds. It is widely used across the industry as an eco-friendly oxidizing agent for impurity removal from semiconductors, waste treatment, etc. Currently, it is mainly produced by the sequential hydrogenation and oxidation of anthraquinone (AQ). However, this process is not only energy intensive and requires large-scale facilities, but AQ is also toxic.
As an alternative to the AQ process, direct synthesis of hydrogen peroxide from hydrogen (H2) and oxygen (O2) using a palladium (Pd) catalyst was proposed. However, commercialization of the technology has been challenging because more water (H2O) than hydrogen peroxide (H2O2) is formed during the process.*
*In the case of the Pd catalyst, at most 40% hydrogen peroxide and 60% water are produced.
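Selectivity here is simply the molar fraction of product that ends up as H2O2 rather than H2O. A minimal illustration of the figures in this article (40% for Pd, 95% for the new Pt-Au catalyst):

```python
def h2o2_selectivity(moles_h2o2, moles_h2o):
    """Fraction of reaction product that is hydrogen peroxide, not water."""
    return moles_h2o2 / (moles_h2o2 + moles_h2o)

# Pd catalyst: at best about 40% H2O2 versus 60% H2O
print(h2o2_selectivity(40, 60))   # 0.4
# Pt-Au alloy catalyst reported in the study: about 95% selectivity
print(h2o2_selectivity(95, 5))    # 0.95
```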
The Korea Institute of Science and Technology (KIST) announced that a joint research team of Dr. Sang Soo Han and Dr. Donghun Kim (Computational Science Research Center), Dr. Seung Yong Lee (Materials Architecture Research Center), and Professor Kwan-Young Lee of Korea University (President Jin Taek Chung) developed a platinum-gold alloy catalyst for hydrogen peroxide production based on computer simulation. Hydrogen peroxide selectivity can be increased to 95% using this catalyst, compared with only 30-40% for a palladium catalyst, which indicates that the developed Pt-Au catalyst produces mostly hydrogen peroxide, with only a small amount of water.
The joint research team between KIST and Korea University developed a new type of Pt-Au alloyed nanoparticle catalyst. Although it is difficult to homogeneously mix Pt and Au into an alloyed catalyst due to the intrinsic immiscibility of the two metals, the researchers successfully synthesized nanoparticles in the form of alloys by forcibly reducing **precursors of Pt and Au. Using this method, the content of each metal in the particles could also be controlled by adjusting the amounts of the Pt and Au precursors.
**Precursor: a substance from which a final specific substance is obtained by metabolism or chemical reactions
Hydrogen peroxide can be produced anywhere, without large equipment, by simply injecting hydrogen and oxygen gas into an aqueous solution using the catalyst developed by the researchers. Unlike the Pd catalyst, the catalyst developed by the joint researchers can produce hydrogen peroxide with up to 95% selectivity even at ambient temperature (10 °C) and atmospheric pressure (1 atm). In addition, the catalytic reaction can be maintained for longer than 8 h, owing to the structural stability of the catalyst.
The researchers clearly established the crystal structure of the Pt-Au alloy nanoparticles by performing additional computer simulations, something that is difficult to resolve using general material analysis techniques. Furthermore, they proposed the catalytic reaction mechanism at the atomic level via computer simulations, clarifying why the catalytic performance for hydrogen peroxide production increases with increasing Au content.
Sang Soo Han, Head of the Center at KIST, said, "It is important that the developed catalyst provides an eco-friendly hydrogen peroxide production option that can be applied without any limitation on manufacturing sites. Commercialization of direct hydrogen peroxide synthesis could therefore be greatly accelerated by overcoming the low selectivity that limits Pd catalysts," adding, "The time and cost of developing novel catalysts, mainly explored through trial and error, could be considerably reduced through computer simulations."
###
This study was conducted under the Creative Materials Discovery Program of the National Research Foundation of Korea with the support of the Ministry of Science and ICT (MSIT). The research results were published in the latest issue of the international materials science journal Acta Materialia.
Biotech fit for the Red Planet
New method for growing cyanobacteria under Mars-like conditions
NASA, in collaboration with other leading space agencies, aims to send its first human missions to Mars in the early 2030s, while companies like SpaceX may do so even earlier. Astronauts on Mars will need oxygen, water, food, and other consumables. These will need to be sourced from Mars, because importing them from Earth would be impractical in the long term. In Frontiers in Microbiology, scientists show for the first time that Anabaena cyanobacteria can be grown with only local gases, water, and other nutrients and at low pressure. This makes it much easier to develop sustainable biological life support systems.
"Here we show that cyanobacteria can use gases available in the Martian atmosphere, at a low total pressure, as their source of carbon and nitrogen. Under these conditions, cyanobacteria kept their ability to grow in water containing only Mars-like dust and could still be used for feeding other microbes. This could help make long-term missions to Mars sustainable," says lead author Dr Cyprien Verseux, an astrobiologist who heads the Laboratory of Applied Space Microbiology at the Center of Applied Space Technology and Microgravity (ZARM) of the University of Bremen, Germany.
Low-pressure atmosphere
Cyanobacteria have long been targeted as candidates to drive biological life support on space missions, as all species produce oxygen through photosynthesis while some can fix atmospheric nitrogen into nutrients. A difficulty is that they cannot grow directly in the Martian atmosphere, where the total pressure is less than 1% of Earth's - 6 to 11 hPa, too low for the presence of liquid water - while the partial pressure of nitrogen gas - 0.2 to 0.3 hPa - is too low for their metabolism. But recreating an Earth-like atmosphere would be expensive: gases would need to be imported, while the culture system would need to be robust - hence, heavy to freight - to resist the pressure differences: "Think of a pressure cooker," Verseux says. So the researchers looked for a middle ground: an atmosphere close to Mars's which allows the cyanobacteria to grow well.
To find suitable atmospheric conditions, Verseux et al. developed a bioreactor called Atmos (for "Atmosphere Tester for Mars-bound Organic Systems"), in which cyanobacteria can be grown in artificial atmospheres at low pressure. Any input must come from the Red Planet itself: apart from nitrogen and carbon dioxide, gases abundant in the Martian atmosphere, and water which could be mined from ice, nutrients should come from "regolith", the dust covering Earth-like planets and moons. Martian regolith has been shown to be rich in nutrients such as phosphorus, sulphur, and calcium.
Anabaena: versatile cyanobacteria grown on Mars-like dust
Atmos has nine 1 L vessels made of glass and steel, each of which is sterile, heated, pressure-controlled, and digitally monitored, while the cultures inside are continuously stirred. The authors chose a strain of nitrogen-fixing cyanobacteria called Anabaena sp. PCC 7938, because preliminary tests showed that it would be particularly good at using Martian resources and helping to grow other organisms. Closely related species have been shown to be edible, suitable for genetic engineering, and able to form specialized dormant cells to survive harsh conditions.
Verseux and his colleagues first grew Anabaena for 10 days under a mixture of 96% nitrogen and 4% carbon dioxide at a pressure of 100 hPa - ten times lower than on Earth. The cyanobacteria grew as well as under ambient air. Then they tested the combination of the modified atmosphere with regolith. Because no regolith has ever been brought from Mars, they used a substrate developed by the University of Central Florida (called "Mars Global Simulant") instead to create a growth medium. As controls, Anabaena were grown in standard medium, either at ambient air or under the same low-pressure artificial atmosphere.
The cyanobacteria grew well under all conditions, including in regolith under the nitrogen- and carbon dioxide-rich mixture at low pressure. As expected, they grew faster on standard medium optimized for cyanobacteria than on Mars Global Simulant, under either atmosphere. But this is still a major success: while standard medium would need to be imported from Earth, regolith is ubiquitous on Mars. "We want to use as nutrients resources available on Mars, and only those," says Verseux.
Dried Anabaena biomass was ground, suspended in sterile water, filtered, and successfully used as a substrate for growing E. coli bacteria, proving that sugars, amino acids, and other nutrients can be extracted from the cyanobacteria to feed other bacteria, which are less hardy but tried-and-tested tools for biotechnology. For example, E. coli could be engineered more easily than Anabaena to produce some food products and medicines on Mars that Anabaena cannot.
The researchers conclude that nitrogen-fixing, oxygen-producing cyanobacteria can be efficiently grown on Mars at low pressure under controlled conditions, with exclusively local ingredients.
Further refinements in the pipeline
These results are an important advance. But the authors caution that further studies are necessary: "We want to go from this proof-of-concept to a system that can be used on Mars efficiently," Verseux says. They suggest fine-tuning the combination of pressure, carbon dioxide, and nitrogen optimal for growth, while testing other genera of cyanobacteria, perhaps genetically tailored for space missions. A cultivation system for Mars also needs to be designed:
"Our bioreactor, Atmos, is not the cultivation system we would use on Mars: it is meant to test, on Earth, the conditions we would provide there. But our results will help guide the design of a Martian cultivation system. For example, the lower pressure means that we can develop a more lightweight structure that is more easily freighted, as it won't have to withstand great differences between inside and outside," concludes Verseux.
###
The project was funded by the Alexander von Humboldt Foundation.
Far underneath the ice shelves of the Antarctic, there's more life than expected, finds a recent study in the journal Frontiers in Marine Science.
During an exploratory survey, researchers drilled through 900 meters of ice in the Filchner-Ronne Ice Shelf, situated on the south eastern Weddell Sea. The site lies 260 km from the open ocean, in complete darkness and at temperatures of -2.2°C; very few animals have ever been observed in such conditions.
But this study is the first to discover the existence of stationary animals - similar to sponges and potentially several previously unknown species - attached to a boulder on the sea floor.
"This discovery is one of those fortunate accidents that pushes ideas in a different direction and shows us that Antarctic marine life is incredibly special and amazingly adapted to a frozen world," says biogeographer and lead author, Dr Huw Griffiths of British Antarctic Survey.
More questions than answers
"Our discovery raises so many more questions than it answers, such as how did they get there? What are they eating? How long have they been there? How common are these boulders covered in life? Are these the same species as we see outside the ice shelf or are they new species? And what would happen to these communities if the ice shelf collapsed?"
Floating ice shelves represent the greatest unexplored habitat in the Southern Ocean. They cover more than 1.5 million square kilometres of the Antarctic continental shelf, but only a total area similar in size to a tennis court has been studied through eight prior boreholes.
Current theories on what life could survive under ice shelves suggest that all life becomes less abundant as you move further away from open water and sunlight. Past studies have found some small mobile scavengers and predators, such as fish, worms, jellyfish or krill, in these habitats. But filter feeding organisms - which depend on a supply of food from above - were expected to be amongst the first to disappear further under the ice.
So, it came as a surprise when the team of geologists, drilling through the ice to collect sediment samples, hit a rock instead of mud at the bottom of the ocean below. They were even more surprised by the video footage, which showed a large boulder covered in strange creatures.
New Antarctic expedition needed
This is the first ever record of a hard substrate (ie a boulder) community deep beneath an ice shelf and it appears to go against all previous theories of what types of life could survive there.
Given the water currents in the region, the researchers calculate that this community may be as much as 1,500km upstream from the closest source of photosynthesis. Other organisms are also known to collect nutrients from glacial melts or chemicals from methane seeps, but the researchers won't know more about these organisms until they have the tools to collect samples of these organisms--a significant challenge in itself.
"To answer our questions we will have to find a way of getting up close with these animals and their environment - and that's under 900 meters of ice, 260km away from the ships where our labs are," continues Griffiths. "This means that as polar scientists, we are going to have to find new and innovative ways to study them and answer all the new questions we have."
Griffiths and the team also note that with the climate crisis and the collapse of these ice shelves, time is running out to study and protect these ecosystems.
CAPTION
Stationary animals - similar to sponges and potentially several previously unknown species - attached to a boulder on the sea floor.
CREDIT
Dr Huw Griffiths/British Antarctic Survey
Neanderthals and Homo sapiens used identical Nubian technology
MAX PLANCK INSTITUTE FOR THE SCIENCE OF HUMAN HISTORY
Long held in a private collection, the newly analysed tooth of an approximately 9-year-old Neanderthal child marks the hominin's southernmost known range. Analysis of the associated archaeological assemblage suggests Neanderthals used Nubian Levallois technology, previously thought to be restricted to Homo sapiens.
With a high concentration of cave sites harbouring evidence of past populations and their behaviour, the Levant is a major centre for human origins research. For over a century, archaeological excavations in the Levant have produced human fossils and stone tool assemblages that reveal landscapes inhabited by both Neanderthals and Homo sapiens, making this region a potential mixing ground between populations. Distinguishing these populations by stone tool assemblages alone is difficult, but one technology, the distinct Nubian Levallois method, is argued to have been produced only by Homo sapiens.
In a new study published in Scientific Reports, researchers from the Max Planck Institute for the Science of Human History teamed up with international partners to re-examine the fossil and archaeological record of Shukbah Cave. Their findings extend the southernmost known range of Neanderthals and suggest that our now-extinct relatives made use of a technology previously argued to be a trademark of modern humans. This study marks the first time the lone human tooth from the site has been studied in detail, in combination with a major comparative study examining the stone tool assemblage.
"Sites where hominin fossils are directly associated with stone tool assemblages remain a rarity - but the study of both fossils and tools is critical for understanding hominin occupations of Shukbah Cave and the larger region," says lead author Dr Jimbob Blinkhorn, formerly of Royal Holloway, University of London and now with the Pan-African Evolution Research Group (Max Planck Institute for the Science of Human History).
Shukbah Cave was first excavated in the spring of 1928 by Dorothy Garrod, who reported a rich assemblage of animal bones and Mousterian-style stone tools cemented in breccia deposits, often concentrated in well-marked hearths. She also identified a large, unique human molar. However, the specimen was kept in a private collection for most of the 20th century, prohibiting comparative studies using modern methods. The recent re-identification of the tooth at the Natural History Museum in London has led to new detailed work on the Shukbah collections.
CAPTION
Photos of Nubian Levallois cores associated with Neanderthal fossils. Copyright: UCL, Institute of Archaeology & courtesy of the Penn Museum, University of Pennsylvania
Although Homo sapiens and Neanderthals shared the use of a wide suite of stone tool technologies, Nubian Levallois technology has recently been argued to have been exclusively used by Homo sapiens. The argument has been made particularly in southwest Asia, where Nubian Levallois tools have been used to track human dispersals in the absence of fossils.
"Illustrations of the stone tool collections from Shukbah hinted at the presence of Nubian Levallois technology so we revisited the collections to investigate further. In the end, we identified many more artefacts produced using the Nubian Levallois methods than we had anticipated," says Blinkhorn. "This is the first time they've been found in direct association with Neanderthal fossils, which suggests we can't make a simple link between this technology and Homo sapiens."
"Southwest Asia is a dynamic region in terms of hominin demography, behaviour and environmental change, and may be particularly important to examine interactions between Neanderthals and Homo sapiens," adds Prof Simon Blockley, of Royal Holloway, University of London. "This study highlights the geographic range of Neanderthal populations and their behavioural flexibility, but also issues a timely note of caution that there are no straightforward links between particular hominins and specific stone tool technologies."
"Up to now we have no direct evidence of a Neanderthal presence in Africa," said Prof Chris Stringer of the Natural History Museum. "But the southerly location of Shukbah, only about 400 km from Cairo, should remind us that they may have even dispersed into Africa at times."
CAPTION
Photo and 3D reconstruction of a tooth of a 9-year-old Neanderthal child. Copyright: Trustees of the Natural History Museum, London
It forever changed history when it crashed into Earth about 66 million years ago.
The Chicxulub impactor, as it's known, left behind a crater off the coast of Mexico that spans 93 miles and runs 12 miles deep. Its devastating impact brought the reign of the dinosaurs to an abrupt and calamitous end by triggering their sudden mass extinction, along with the end of almost three-quarters of the plant and animal species living on Earth.
The enduring puzzle: Where did the asteroid or comet originate, and how did it come to strike Earth? Now, a pair of researchers at the Center for Astrophysics | Harvard & Smithsonian believe they have the answer.
In a study published today in Nature's Scientific Reports, Harvard University astrophysics undergraduate student Amir Siraj and astronomer Avi Loeb put forth a new theory that could explain the origin and journey of this catastrophic object.
Using statistical analysis and gravitational simulations, Siraj and Loeb calculate that a significant fraction of long-period comets originating from the Oort cloud, an icy sphere of debris at the edge of the solar system, can be bumped off-course by Jupiter's gravitational field during orbit.
"The solar system acts as a kind of pinball machine," explains Siraj, who is pursuing bachelor's and master's degrees in astrophysics, in addition to a master's degree in piano performance at the New England Conservatory of Music. "Jupiter, the most massive planet, kicks incoming long-period comets into orbits that bring them very close to the sun."
During close passage to the sun, the comets -- nicknamed "sungrazers" -- can experience powerful tidal forces that break apart pieces of the rock and ultimately produce cometary shrapnel.
"In a sungrazing event, the portion of the comet closer to the sun feels a stronger gravitational pull than the part that is further, resulting in a tidal force across the object," Siraj says. "You can get what's called a tidal disruption event, in which a large comet breaks up into many smaller pieces. And crucially, on the journey back to the Oort cloud, there's an enhanced probability that one of these fragments hit the Earth."
The new calculations from Siraj and Loeb's theory increase the chances of long-period comets impacting Earth by a factor of about 10, and show that about 20 percent of long-period comets become sungrazers.
The pair say that their new rate of impact is consistent with the age of Chicxulub, providing a satisfactory explanation for its origin and that of other impactors like it.
"Our paper provides a basis for explaining the occurrence of this event," Loeb says. "We are suggesting that, in fact, if you break up an object as it comes close to the sun, it could give rise to the appropriate event rate and also the kind of impact that killed the dinosaurs."
Evidence found at the Chicxulub crater suggests the rock was composed of carbonaceous chondrite. Siraj and Loeb's hypothesis might also explain this unusual composition.
A popular theory on the origin of Chicxulub claims that the impactor came from the main belt, the asteroid population between the orbits of Mars and Jupiter. However, carbonaceous chondrites are rare amongst main-belt asteroids but possibly widespread amongst long-period comets, providing additional support to the cometary impact hypothesis.
Other similar craters display the same composition. This includes an object that hit about 2 billion years ago and left the Vredefort crater in South Africa, which is the largest confirmed crater in Earth's history, and the impactor that left the Zhamanshin crater in Kazakhstan, which is the largest confirmed crater within the last million years. The researchers say that the timing of these impacts supports their calculations on the expected rate of Chicxulub-sized tidally disrupted comets.
Siraj and Loeb say their hypothesis can be tested by further studying these craters, others like them, and even ones on the surface of the moon to determine the composition of the impactors. Space missions sampling comets can also help.
Beyond studying the composition of comets, the new Vera Rubin Observatory in Chile may be able to observe tidal disruption of long-period comets after it becomes operational next year.
"We should see smaller fragments coming to Earth more frequently from the Oort cloud," Loeb says. "I hope that we can test the theory by having more data on long-period comets, get better statistics, and perhaps see evidence for some fragments."
Loeb says understanding this is not just crucial to solving a mystery of Earth's history but could prove pivotal if such an event were to threaten the planet.
"It must have been an amazing sight, but we don't want to see that again," he said.
###
This work was partially supported by the Harvard Origins of Life Initiative and the Breakthrough Prize Foundation.
About the Center for Astrophysics | Harvard & Smithsonian
The Center for Astrophysics | Harvard & Smithsonian is a collaboration between Harvard and the Smithsonian designed to ask--and ultimately answer--humanity's greatest unresolved questions about the nature of the universe. The Center for Astrophysics is headquartered in Cambridge, MA, with research facilities across the U.S. and around the world.