Saturday, June 26, 2021

ANOTHER ARYAN RACE MYTH FALLS

Being Anglo-Saxon was a matter of language and culture, not genetics

New evidence to answer the question 'who exactly were the Anglo-Saxons?'

UNIVERSITY OF SYDNEY

Research News

IMAGE: The famous Anglo-Saxon Sutton Hoo helmet from about 625 CE, part of the British Museum collection.

CREDIT: Elissa Blake/University of Sydney

A new study by archaeologists at the University of Sydney and Simon Fraser University in Vancouver has provided important new evidence to answer the question "Who exactly were the Anglo-Saxons?"

New findings based on skeletal remains clearly indicate that the Anglo-Saxons were a melting pot of people from both migrant and local cultural groups, not one homogeneous group from Western Europe.

Professor Keith Dobney at the University of Sydney said the team's results indicate that "the Anglo-Saxon kingdoms of early Medieval Britain were strikingly similar to contemporary Britain - full of people of different ancestries sharing a common language and culture".

The Anglo-Saxon (or early medieval) period in England runs from the 5th to the 11th centuries AD. The Early Anglo-Saxon period dates from around 410-660 AD, with migration occurring throughout all but the final 100 years (i.e. 410-560 AD).

Studying ancient skulls

Published in PLOS ONE, the collaborative study by Professor Dobney at the University of Sydney and Dr Kimberly Plomp and Professor Mark Collard at Simon Fraser University in Vancouver looked at the three-dimensional shape of the base of the skull.

"Previous studies by palaeoanthropologists have shown that the base of the human skull holds a shape signature that can be used to track relationships among human populations in a similar way to ancient DNA," Dr Plomp said. "Based on this, we collected 3D data from suitably dated skeletal collections from Britain and Denmark, and then analysed the data to estimate the ancestry of the Anglo-Saxon individuals in the sample."
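The logic of a shape-based ancestry estimate can be illustrated with a toy sketch. The snippet below is not the authors' pipeline (the paper uses geometric morphometrics with more sophisticated statistics on real basicranial landmarks); it only shows the two core ideas on simulated landmark data: Procrustes superimposition to strip away position, size, and orientation, and assigning an unknown skull to the nearest reference-group mean shape. All names and data here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def procrustes_align(ref, shape):
    """Superimpose `shape` onto `ref` (both k x 3 landmark arrays):
    centre on the centroid, scale to unit centroid size, then rotate
    with the reflection-corrected Kabsch/SVD solution."""
    a = ref - ref.mean(axis=0)
    b = shape - shape.mean(axis=0)
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    u, _, vt = np.linalg.svd(b.T @ a)
    d = np.sign(np.linalg.det(u @ vt))
    r = u @ np.diag([1.0, 1.0, d]) @ vt  # proper rotation only
    return b @ r

# Two hypothetical 12-landmark "templates", standing in for the local
# British and continental (Danish) reference samples.
template_local = rng.normal(size=(12, 3))
template_cont = template_local + rng.normal(scale=0.3, size=(12, 3))

def sample(template, n, noise=0.05):
    """Simulate n individuals as noisy copies of a group template."""
    return [template + rng.normal(scale=noise, size=template.shape)
            for _ in range(n)]

ref = template_local  # common reference for superimposition
groups = {"local": sample(template_local, 30),
          "continental": sample(template_cont, 30)}
means = {g: np.mean([procrustes_align(ref, s) for s in shapes], axis=0)
         for g, shapes in groups.items()}

def classify(shape):
    """Assign a skull to the group with the nearest mean shape."""
    aligned = procrustes_align(ref, shape)
    return min(means, key=lambda g: np.linalg.norm(aligned - means[g]))

unknown = template_cont + rng.normal(scale=0.05, size=(12, 3))
print(classify(unknown))
```

Nearest-mean assignment stands in here for the study's statistical estimation; the point is only that aligned landmark configurations carry a group signature.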

The researchers found that between two-thirds and three-quarters of early Anglo-Saxon individuals were of continental European ancestry, while between a quarter and one-third were of local ancestry.

When they looked at skeletons dated to the Middle Anglo-Saxon period (several hundred years after the original migrants arrived), they found that 50 to 70 percent of the individuals were of local ancestry, while 30 to 50 percent were of continental European ancestry, which probably indicates a change in the rate of migration and/or local adoption of culture over time.

"These findings tell us that being Anglo-Saxon was more likely a matter of language and culture, not genetics," Professor Collard said.

The debate about Anglo-Saxons

Although Anglo-Saxon origins can clearly be traced to a migration of Germanic-speaking people from mainland Europe between the 5th and 7th centuries AD, the number of individuals who settled in Britain is still contested, as is the nature of their relationship with the pre-existing inhabitants of the British Isles, most of whom were Romano-Celts.

The ongoing and unresolved argument is whether hordes of European invaders largely replaced the existing Romano-British inhabitants, or whether smaller numbers of migrants settled and interacted with locals who then rapidly adopted the new language and culture of the Anglo-Saxons.

"The reason for the ongoing confusion is the apparent contradiction between early historical texts (written some time after the events), which imply that the newcomers were both numerous and replaced the Romano-British population, and some recent biomolecular markers recovered directly from Anglo-Saxon skeletons, which appear to suggest that immigrants were few in number," said Professor Dobney.

"Our new data sits at the interface of this debate and implies that early Anglo-Saxon society was a mix of both newcomers and existing inhabitants and that, instead of wholesale population replacement, a process of acculturation resulted in Anglo-Saxon language and culture being adopted wholesale by the local population."

"It could be that this new cultural package was attractive, filling a vacuum left at the end of the Roman occupation of Britain. Whatever the reason, it lit the fuse for the English nation we have today - still made up of people of different origins who share the same language," Professor Dobney said.

###

Anglo-Saxon ancestry was 'mixed and mutable'

Skull analysis finds combination of indigenous and immigrant ancestors for early Anglo-Saxons

PLOS

Research News

IMAGE: Photogrammetry with skull.

CREDIT: Kimberly Plomp

The ancestry of early Anglo-Saxons, a subject of some debate, included immigrants from continental Europe as well as people indigenous to Great Britain, according to a study published June 23, 2021 in the open-access journal PLOS ONE by Kimberly Plomp of Simon Fraser University, British Columbia and colleagues.

Between the 5th and 7th centuries CE, Germanic-speaking people from continental Europe settled across the island of Great Britain, leading to the formation of the Anglo-Saxon ethnic group, and eventually to the development of the English language and the Kingdom of England. Historical texts describe this event as an invasion, where continental immigrants replaced indigenous people, but some archaeological evidence contradicts this story, indicating that many early Anglo-Saxons in Great Britain were of local origin.

In this study, the authors sought to determine the ancestry of Anglo-Saxons by comparing their skull anatomy to that of indigenous and continental individuals. They used 3D shape analysis to compare 236 individuals from Early and Middle Anglo-Saxon cemeteries (410-899 CE), pre-Medieval sites in England (800 BCE-410 CE), and Iron Age sites in Denmark (800 BCE-399 CE). The results indicate that, among Early Anglo-Saxon skeletons, 25-33% were of local ancestry, while among Middle Anglo-Saxon skeletons, 50-70% were local.

Thus, this study finds that Anglo-Saxons comprised individuals of both local and continental ancestry, and that the ratio between the two changed over time, possibly due to variations in immigration patterns. These results conflict with some previous data from historical texts, as well as isotopic and genetic analyses. The authors propose that future studies, sampling a wider selection of skulls and examining more regions of the skull, might provide more information to resolve these questions of Anglo-Saxon ancestry.

"Palaeoanthropologists have found that when the base of the human skull is analysed in 3D, it can be used to track relationships among human populations in a similar way to ancient DNA," explains Dr Plomp. "Based on this, we collected 3D data from suitably dated skeletal collections from Britain and Denmark, and then analysed the data to estimate the ancestry of the Anglo-Saxon individuals in the sample."

Professor Collard adds, "these findings tell us that being Anglo-Saxon was a matter of language and culture, not genetics".

According to Professor Dobney, the team's results indicate that "the Anglo-Saxon kingdoms of early Medieval Britain were strikingly similar to contemporary Britain--full of people of different ancestries sharing a common language and culture."

###

Press-only preview: https://plos.io/3q18HvE

Contact: Kimberly Plomp, kplomp@sfu.ca, (236)-991-3638; Mark Collard, mcollard@sfu.ca, (778)-782-8166; Keith Dobney, keith.dobney@sydney.edu.au, +61 2 9351 5658

Image Caption: Photogrammetry with skull

Image Credit: Kimberly Plomp

Citation: Plomp KA, Dobney K, Collard M (2021) A 3D basicranial shape-based assessment of local and continental northwest European ancestry among 5th to 9th century CE Anglo-Saxons. PLoS ONE 16(6): e0252477. https://doi.org/10.1371/journal.pone.0252477

Funding: The study was supported by the European Union's Marie Skłodowska-Curie Actions program (Horizon 2020 - 748200), the Social Sciences and Humanities Research Council of Canada (895-2011-1009), the Canada Research Chairs Program (228117 and 231256), the Canada Foundation for Innovation (203808), the British Columbia Knowledge Development Fund (862-804231), and Simon Fraser University (14518).

Competing Interests: The authors have declared that no competing interests exist.

In your coverage please use this URL to provide access to the freely available article in PLOS ONE: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0252477

3,000-year-old shark attack victim found by Oxford-led researchers

UNIVERSITY OF OXFORD

Research News

IMAGE: Original excavation photograph of Tsukumo No. 24, courtesy of the Laboratory of Physical Anthropology, Kyoto University.

CREDIT: Kyoto University

Newspapers regularly carry stories of terrifying shark attacks, but in a paper published today, Oxford-led researchers reveal their discovery of a 3,000-year-old victim - attacked by a shark in the Seto Inland Sea of the Japanese archipelago.

The research, published in Journal of Archaeological Science: Reports, shows that this body is the earliest direct evidence for a shark attack on a human; using a combination of archaeological science and forensic techniques, an international research team has carefully recreated what happened.

The grim discovery of the victim was made by Oxford researchers J. Alyssa White and Professor Rick Schulting while investigating evidence for violent trauma on the skeletal remains of prehistoric hunter-gatherers at Kyoto University. They came upon No. 24, from the previously excavated site of Tsukumo: an adult male riddled with traumatic injuries.

'We were initially flummoxed by what could have caused at least 790 deep, serrated injuries to this man,' say the Oxford pair. 'There were so many injuries and yet he was buried in the community burial ground, the Tsukumo Shell-mound cemetery site.'

They continue, 'The injuries were mainly confined to the arms, legs, and front of the chest and abdomen. Through a process of elimination, we ruled out human conflict and more commonly-reported animal predators or scavengers.'

Since archaeological cases of shark attacks are extremely rare, they turned to forensic shark attack cases for clues and worked with expert George Burgess, Director Emeritus of the Florida Program for Shark Research; the international team then put together a reconstruction of the attack.

The team concluded that the individual died more than 3,000 years ago, between 1370 and 1010 BC. The distribution of wounds strongly suggests the victim was alive at the time of attack; his left hand was sheared off, possibly a defence wound.

Individual No. 24's body had been recovered soon after the attack and buried with his people at the cemetery. Excavation records showed he was also missing his right leg, and his left leg had been placed on top of his body in an inverted position.

According to the pair, 'Given the injuries, he was clearly the victim of a shark attack. The man may well have been fishing with companions at the time, since he was recovered quickly. And, based on the character and distribution of the tooth marks, the most likely species responsible was either a tiger or white shark.'

Co-author Dr Mark Hudson, a researcher with the Max Planck Institute, says, 'The Neolithic people of Jomon Japan exploited a range of marine resources... It's not clear if Tsukumo 24 was deliberately targeting sharks or if the shark was attracted by blood or bait from other fish. Either way, this find not only provides a new perspective on ancient Japan, but is also a rare example of archaeologists being able to reconstruct a dramatic episode in the life of a prehistoric community.'

###

Notes for Editors

The paper can be seen here: https://www.sciencedirect.com/science/article/abs/pii/S2352409X21002777?dgcid=author

A novel method of 3D analysis in development by Alyssa White, John Pouncett, and Rick Schulting was used to visualise the wounds found, which can be explored at Tsukumo 24 BodyMap 3D.

Authors: J. Alyssa White, George H. Burgess, Masato Nakatsukasa, Mark J. Hudson, John Pouncett, Soichiro Kusaka, Minoru Yoneda, Yasuhiro Yamada, Rick Schulting

The University of Oxford

Oxford University has been placed number one in the Times Higher Education World University Rankings for the fifth year running, and at the heart of this success is our ground-breaking research and innovation. Oxford is world-famous for research excellence and home to some of the most talented people from across the globe. The breadth and interdisciplinary nature of our research sparks imaginative and inventive insights and solutions.

 

Mushroom growing out of fossilized ant reveals new genus and species of fungal parasite

OREGON STATE UNIVERSITY

Research News

IMAGE: Oregon State University research has identified the oldest known specimen of a fungus parasitizing an ant, and the fossil also represents a new fungal genus and species.

CREDIT: George Poinar Jr., OSU

CORVALLIS, Ore. - Oregon State University research has identified the oldest known specimen of a fungus parasitizing an ant, and the fossil also represents a new fungal genus and species.

"It's a mushroom growing out of a carpenter ant," said OSU's George Poinar Jr., an international expert in using plant and animal life forms preserved in amber to learn about the biology and ecology of the distant past.

A mushroom is the reproductive structure of many fungi, including the ones you find growing in your yard, and Poinar and a collaborator in France named their discovery Allocordyceps baltica. They found the new type of Ascomycota fungi in an ant preserved in 50-million-year-old amber from Europe's Baltic region.

"Ants are hosts to a number of intriguing parasites, some of which modify the insects' behavior to benefit the parasites' development and dispersion," said Poinar, who has a courtesy appointment in the OSU College of Science. "Ants of the tribe Camponotini, commonly known as carpenter ants, seem especially susceptible to fungal pathogens of the genus Ophiocordyceps, including one species that compels infected ants to bite into various erect plant parts just before they die."

Doing so, he explains, puts the ants in a favorable position for allowing fungal spores to be released from cup-shaped ascomata - the fungi's fruiting bodies - protruding from the ants' heads and necks. Carpenter ants usually make their nests in trees, rotting logs and stumps.

The new fungal genus and species shares certain features with Ophiocordyceps but also displays several developmental stages not previously reported, Poinar said. To name the genus, placed in the order Hypocreales, Poinar and fellow researcher Yves-Marie Maltier combined the Greek word for new - alloios - with the name of the known genus Cordyceps.

"We can see a large, orange, cup-shaped ascoma with developing perithecia - flask-shaped structures that let the spores out - emerging from the rectum of the ant," Poinar said. "The vegetative part of the fungus is coming out of the abdomen and the base of the neck. We see freestanding fungal bodies also bearing what look like perithecia, and in addition we see what look like the sacs where spores develop. All of the stages, those attached to the ant and the freestanding ones, are of the same species."

The fungus could not be placed in the known ant-infecting genus Ophiocordyceps because ascomata in those species usually come out the neck or head of an ant, Poinar said, and not the rectum.

"There is no doubt that Allocordyceps represents a fungal infection of a Camponotus ant," he said. "This is the first fossil record of a member of the Hypocreales order emerging from the body of an ant. And as the earliest fossil record of fungal parasitism of ants, it can be used in future studies as a reference point regarding the origin of the fungus-ant association."

###

Findings were published in Fungal Biology.



CAPTION

Oregon State University research has identified the oldest known specimen of a fungus parasitizing an ant, and the fossil also represents a new fungal genus and species. A mushroom is the reproductive structure of many fungi, including the ones you find growing in your yard, and OSU's George Poinar Jr. and a collaborator in France named their discovery Allocordyceps baltica. They found the new type of Ascomycota fungi in an ant preserved in 50-million-year-old amber from Europe's Baltic region. The mushroom is coming out of the ant's rectum, and vegetative part of the fungus is emerging from its abdomen and neck. (image courtesy George Poinar Jr.)

CREDIT

George Poinar Jr., OSU





 

New method could reveal what genes we might have inherited from Neanderthals

UNIVERSITY OF COPENHAGEN - THE FACULTY OF HEALTH AND MEDICAL SCIENCES

Research News

Thousands of years ago, archaic humans such as Neanderthals and Denisovans went extinct. But before that, they interbred with the ancestors of present-day humans, who still to this day carry genetic mutations from the extinct species.

Over 40 percent of the Neanderthal genome is thought to have survived in different present-day humans of non-African descent, but spread out so that any individual genome is only composed of up to two percent Neanderthal material. Some human populations also carry genetic material from Denisovans - a mysterious group of archaic humans that may have lived in Eastern Eurasia and Oceania thousands of years ago.

The introduction of beneficial genetic material into our gene pool, a process known as adaptive introgression, often happened because the material was advantageous to humans after they expanded across the globe. To name a few examples, scientists believe some of the mutations affected skin development and metabolism. But many mutations remain undiscovered.

Now, researchers from GLOBE Institute at the University of Copenhagen have developed a new method using deep learning techniques to search the human genome for undiscovered mutations.

"We developed a deep learning method called 'genomatnn' that jointly models introgression, which is the transfer of genetic information between species, and natural selection. The model was developed in order to identify regions in the human genome where this introgression could have happened," says Associate Professor Fernando Racimo, GLOBE Institute, corresponding author of the new study.

"Our method is highly accurate and outcompetes previous approaches in power. We applied it to various human genomic datasets and found several candidate beneficial gene variants that were introduced into the human gene pool," he says.

The new method is based on a so-called convolutional neural network (CNN), which is a type of deep learning framework commonly used in image and video recognition.

Using hundreds of thousands of simulations, the researchers at the University of Copenhagen trained the CNN to identify patterns in images of the genome that would be produced by adaptive introgression with archaic humans.
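The idea of treating the genome as an image can be sketched in a few lines. The code below is not genomatnn: it hand-writes a single convolution kernel to show how a block of shared alleles (the image signature of an introgressed haplotype) produces a strong filter response, whereas the real CNN learns its kernels from hundreds of thousands of simulations. The genotype matrix and kernel here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation - the core operation a CNN
    layer applies repeatedly with learned kernels."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.empty((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy "genome image": rows are haplotypes, columns are variant sites,
# entries are 0/1 alleles. An introgressed tract shows up as a block
# of haplotypes sharing a run of derived alleles.
genotypes = (rng.random((20, 40)) < 0.2).astype(float)
genotypes[5:12, 10:25] = 1.0  # planted introgression-like block

# A hand-written block detector: responds most strongly to dense
# runs of shared alleles. A trained CNN would learn such filters.
kernel = np.ones((4, 8))
response = conv2d(genotypes, kernel)

# The strongest filter response falls inside the planted block.
peak = np.unravel_index(np.argmax(response), response.shape)
```

A full classifier would stack many such convolutions with learned weights and a final decision layer; this sketch only shows why convolution is a natural fit for spotting haplotype-block patterns.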

Besides confirming already suggested genetic mutations from adaptive introgression, the researchers also discovered possible mutations that were not known to be introgressed.

"We recovered previously identified candidates for adaptive introgression in modern humans, as well as several candidates which have not previously been described," says postdoc Graham Gower, first author of the new study.

Some of the previously undescribed mutations are involved in core pathways in human metabolism and immunity.

"In European genomes, we found two strong candidates for adaptive introgression from Neanderthals in regions of the genome that affect phenotypes related to blood, including blood cell counts. In Melanesian genomes, we found candidate variants introgressed from Denisovans that potentially affected a wide range of traits, such as blood-related diseases, tumor suppression, skin development, metabolism, and various neurological diseases. It's not clear how such traits are affected in present-day carriers of the archaic variants, e.g. neutrally, positively or negatively, although historically the introgressed genetic material is assumed to have had a positive effect on those individuals carrying them," he explains.

The next stage for the research team is to adapt the method to more complex demographic and selection scenarios to understand the overall fate of Neanderthal genetic material. Graham Gower points out that the team aims to follow up on the function of the candidate variants in the genome that they found in this study.

Looking forward, it remains a challenge to search the human genome for genetic material from as yet unsampled populations, so-called ghost populations. However, the researchers are hopeful that they can further train the neural network to recognize mutations from these unsampled populations.

"Future work could also involve developing a CNN that can detect adaptive introgression from a ghost population, for cases in which genomic data from the source is unavailable," says Graham Gower.

###

Light in darkness: an experimental look at Paleolithic cave lighting

PLOS

Research News

IMAGE: Set of photographs of stone lamp experiment.

CREDIT: Medina-Alcaide et al, 2021, PLOS ONE

A recreation of three common types of Paleolithic lighting systems (torches, grease lamps, and fireplaces) illuminates how Paleolithic cave dwellers might have traveled, lived, and created in the depths of their caves, according to a study published June 16, 2021 in the open-access journal PLOS ONE by Mª Ángeles Medina-Alcaide from the University of Cantabria, Spain, and colleagues.

Humans need light to access the deepest areas of caves--and these visits also depend on the type of light available, as light intensity and duration, area of illumination, and color temperature all determine how the cave environment can be used. In this study, Medina-Alcaide and colleagues use archaeological evidence of lighting remains found across several Paleolithic caves featuring cave art in Southwest Europe to experimentally replicate the artificial lighting systems presumably used by the original human cave dwellers, allowing immediate empirical observations.

The authors conducted their experiments at Isuntza 1 Cave in the Basque region of Spain. Their replicated lighting was based as much as possible on archaeological evidence found in similar Paleolithic caves, and included five replicated torches (made variably from ivy, juniper, oak, birch, and pine resins), two stone lamps using animal fat (bone marrow from cow and deer), and a small fireplace (oak and juniper wood).

They found that the different lighting systems all had diverse features, suggesting their likely selection and use across different contexts.

Wooden torches made of multiple sticks worked best for exploring caves or crossing wide spaces: they projected light in all directions (up to almost six meters in the experiments), were easy to transport, and didn't dazzle the torchbearer despite having a light intensity almost five times greater than a double-wicked grease lamp. Torch light lasted for an average of 41 minutes in this study, with the shortest-lived torch burning for 21 minutes and the longest for 61 minutes. The torches tended to function irregularly and required close supervision when burning, though they were easy to relight via oxygenation (moving the torch quickly from side to side). The authors found the main torch disadvantage was the amount of smoke produced.

In contrast, grease lamps worked best for lighting small spaces over a long period. With a light intensity similar to a candle, they were able to light up to three meters (or more if larger or multiple wicks were added). Though grease lamps weren't well suited to transit due to their dazzling effect and poor floor illumination, they burned consistently and without much smoke for well over an hour, complementing the use of torches.

The authors made one fireplace, a static system, which burned very smokily and was extinguished after 30 minutes. They note that the location was likely not appropriate due to air currents in the cave.
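The measured intensity ratio and the observed ranges are mutually consistent under a simple inverse-square sketch: if a torch is about five times the intensity of a grease lamp, its useful range should be about sqrt(5) ≈ 2.2 times greater, roughly the six-meter versus three-meter figures reported. The visibility threshold below is an arbitrary placeholder, not a value from the study.

```python
import math

def useful_range(intensity, threshold=0.1):
    """Distance at which illuminance (intensity / d^2) drops to a
    visibility threshold; units are arbitrary in this sketch."""
    return math.sqrt(intensity / threshold)

LAMP = 1.0          # assumed candle-like lamp intensity (arbitrary units)
TORCH = 5.0 * LAMP  # the study's measured ~5x intensity ratio

ratio = useful_range(TORCH) / useful_range(LAMP)
print(round(ratio, 2))  # → 2.24, i.e. sqrt(5)
```

The ratio is independent of the threshold chosen, which is why the arbitrary units do not matter for the comparison.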

The authors note that the practical insights and observations gained from their experimental replications are invaluable for a deeper understanding of what it may have been like to access the darkest parts of inhabited caves, especially in order to create art, and emphasize that future experimental lighting studies will be useful in continuing to unravel our ancestors' activities in their caves.

The authors add: "The artificial lighting was a crucial physical resource for expanding complex social and economic behavior in Paleolithic groups, especially for the development of the first palaeo-speleological explorations and for the origin of art in caves."

###

Citation: Medina-Alcaide MÁ, Garate D, Intxaurbe I, Sanchidrián JL, Rivero O, Ferrier C, et al. (2021) The conquest of the dark spaces: An experimental approach to lighting systems in Paleolithic caves. PLoS ONE 16(6): e0250497. https://doi.org/10.1371/journal.pone.0250497

Funding: This paper has been funded by the research projects "Before art: social investment in symbolic expressions during the Upper Paleolithic in the Iberian Peninsula" (PID2019-107262 GB-I00), PI: Diego Garate, and "Learning and development of artistic abilities in Anatomically Modern Humans; a multidisciplinary approach (ApArt)" HAR2017-87739-P, PI: Olivia Rivero, both funded by the Ministry of Science, Innovation, and Universities (Spain). Besides, the University of Cordoba (José Luis Sanchidrián, Economic Unit in Prehistory) has collaborated in the publication of this paper in open access. I.I.'s Ph.D. research is funded by a grant for the training of research personnel (PIF 2019) at the University of the Basque Country (UPV/EHU). Lastly, this paper publishes a part of M.M.A.'s thesis. This PhD was financed by the Ministry of Education, Culture, and Sport of Spain (FPU fellowship 2014-2018).

Competing Interests: The authors have declared that no competing interests exist.

In your coverage please use this URL to provide access to the freely available article in PLOS ONE: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0250497

METAPHYSICS & MYSTICISM ARE PROVEN CORRECT

Greater than the sum of our parts: The evolution of collective intelligence

UNIVERSITY OF CAMBRIDGE

Research News

The period preceding the emergence of behaviorally modern humans was characterised by dramatic climatic and environmental variability - it is these pressures, occurring over hundreds of thousands of years, that shaped human evolution.

New research published today in the Cambridge Archaeological Journal proposes a new theory of human cognitive evolution entitled 'Complementary Cognition', which suggests that in adapting to dramatic environmental and climatic variability our ancestors evolved to specialise in different, but complementary, ways of thinking.

Lead author Dr Helen Taylor, Research Associate at the University of Strathclyde and Affiliated Scholar at the McDonald Institute for Archaeological Research, University of Cambridge, explained: "This system of complementary cognition functions in a way that is similar to evolution at the genetic level but, instead of underlying physical adaptation, may underlie our species' immense ability to create behavioural, cultural and technological adaptations. It provides insights into the evolution of uniquely human adaptations like language, suggesting that this evolved in concert with specialisation in human cognition."

The theory of complementary cognition proposes that our species adapts and evolves culturally through a system of collective cognitive search operating alongside genetic search: genetic search enables phenotypic adaptation (Darwin's theory of evolution through natural selection can be interpreted as a 'search' process), while cognitive search enables behavioural adaptation.

Dr Taylor continued, "Each of these search systems is essentially a way of adapting using a mixture of building on and exploiting past solutions and exploring to update them; as a consequence, we see evolution in those solutions over time. This is the first study to explore the notion that individual members of our species are neurocognitively specialised in complementary cognitive search strategies."

Complementary cognition could lie at the core of explaining the exceptional level of cultural adaptation in our species and provides an explanatory framework for the emergence of language. Language can be viewed as evolving both as a means of facilitating cooperative search and as an inheritance mechanism for sharing the more complex results of complementary cognitive search. Language is viewed as an integral part of the system of complementary cognition.

The theory of complementary cognition brings together observations from disparate disciplines, showing that they can be viewed as various faces of the same underlying phenomenon.

Dr Taylor continued: "For example, a form of cognition currently viewed as a disorder, dyslexia, is shown to be a neurocognitive specialisation whose nature in turn predicts that our species evolved in a highly variable environment. This concurs with the conclusions of many other disciplines including palaeoarchaeological evidence confirming that the crucible of our species' evolution was highly variable."

Nick Posford, CEO, British Dyslexia Association said, "As the leading charity for dyslexia, we welcome Dr Helen Taylor's ground-breaking research on the evolution of complementary cognition. Whilst our current education and work environments are often not designed to make the most of dyslexia-associated thinking, we hope this research provides a starting point for further exploration of the economic, cultural and social benefits the whole of society can gain from the unique abilities of people with dyslexia."

At the same time, this may also provide insights into understanding the kind of cumulative cultural evolution seen in our species. Specialisation in complementary search strategies and cooperatively adapting would have vastly increased the ability of human groups to produce adaptive knowledge, enabling us to continually adapt to highly variable conditions. But in periods of greater stability and abundance when adaptive knowledge did not become obsolete at such a rate, it would have instead accumulated, and as such Complementary Cognition may also be a key factor in explaining cumulative cultural evolution.

Complementary cognition has enabled us to adapt to different environments, and may be at the heart of our species' success, enabling us to adapt much faster and more effectively than any other highly complex organism. However, this may also be our species' greatest vulnerability.

Dr Taylor concluded: "The impact of human activity on the environment is the most pressing and stark example of this. The challenge of collaborating and cooperatively adapting at scale creates many difficulties and we may have unwittingly put in place a number of cultural systems and practices, particularly in education, which are undermining our ability to adapt. These self-imposed limitations disrupt our complementary cognitive search capability and may restrict our capacity to find and act upon innovative and creative solutions."

"Complementary cognition should be seen as a starting point in exploring a rich area of human evolution and as a valuable tool in helping to create an adaptive and sustainable society. Our species may owe our spectacular technological and cultural achievements to neurocognitive specialisation and cooperative cognitive search, but our adaptive success so far may belie the importance of attaining an equilibrium of approaches. If this system becomes maladjusted, it can quickly lead to equally spectacular failures to adapt - and to survive, it is critical that this system be explored and understood further."

###

 

Babies can see things that adults cannot

Visual perception changes drastically in the second half of the first year of life

CHUO UNIVERSITY

Research News

IMAGE: NEW RESEARCH SHOWS THAT INFANTS YOUNGER THAN 7 MONTHS CAN PERCEIVE OBJECTS THAT OLDER INFANTS AND ADULTS CANNOT SEE BECAUSE OF A PHENOMENON CALLED "VISUAL BACKWARD MASKING."

CREDIT: CHUO UNIVERSITY, LAIMAN

We can generally recognize an object even if it is presented only very briefly. However, if another object appears immediately after the first, our perception of the first object is impaired such that we do not notice its existence. This perceptual phenomenon, called "visual backward masking," is used in vision science to study how visual perception is processed in the brain. Interestingly, the phenomenon occurs even if the second object does not spatially overlap the first, such as a contour or four dots surrounding it.

The occurrence of this phenomenon is assumed to be due to a disruption of "feedback processing." When we see something, visual information is serially processed from lower to higher visual areas in the brain in a bottom-up manner. However, top-down feedback processing, in which visual signals are sent back from higher to lower areas, also plays a critical role in visual perception. Visual backward masking is thought to occur owing to interference with feedback processing.

"We applied backward masking to infants aged 3-8 months to examine the development of feedback processing," says Yusuke Nakashima, a postdoctoral fellow at Chuo University in Tokyo and the study's lead author. "Recent studies in vision science revealed the importance of feedback processing in visual perception, but its development is poorly understood."

To test whether backward masking occurs in infants, the researchers presented images of faces on a computer screen and measured the length of time that infants spent looking at them. As infants tend to look longer at faces, researchers can test whether infants perceive faces by measuring their looking time. The faces were presented in two ways. In one condition, a face was followed by a mask image, in which infants would not see the face if backward masking occurred. In another condition, nothing appeared after the face; thus, infants would be able to see the face.

The researchers found that infants aged 7-8 months could not see faces followed by the mask, indicating that backward masking occurred, similar to adults. In contrast, infants aged 3-6 months could perceive faces even when the faces were followed by the mask image, indicating that masking did not occur and that younger infants could see faces that older infants could not.
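The looking-time logic described above can be sketched in a few lines. This is purely an illustrative sketch, not the authors' analysis code: the looking times, group labels, and decision margin are all hypothetical.

```python
# Illustrative sketch of the preferential-looking inference described
# above; all looking times (seconds) and the margin are hypothetical.
from statistics import mean

# Hypothetical per-infant looking times for a 7-8-month-old group.
masked = [3.1, 2.9, 3.3, 3.0, 3.2]    # face followed by a mask image
unmasked = [5.8, 6.1, 5.5, 6.0, 5.9]  # face followed by nothing

def face_perceived_despite_mask(masked_times, unmasked_times, margin=1.0):
    """If infants look about as long in the masked condition as in the
    unmasked one, the face was likely perceived (masking ineffective);
    a large drop in looking time suggests masking blocked perception."""
    return mean(unmasked_times) - mean(masked_times) < margin

# The big drop in looking time for this hypothetical older group would
# indicate that backward masking occurred.
print(face_perceived_despite_mask(masked, unmasked))  # -> False
```

For a hypothetical younger group whose masked and unmasked looking times were similar, the same comparison would return True, matching the 3-6-month-olds' pattern reported in the study.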

"These results suggest that feedback processing is immature in infants younger than 7 months," says Nakashima. "That is, younger infants do not have feedback processing that backward masking should interfere, and thus, masking is ineffective for them." The results of the study demonstrated that the mechanisms for visual perception change drastically in the second half of the first year of life, from the bottom-up system to the system incorporating top-down processing.

The results also showed that objects that can be perceived in early infancy become imperceptible later in development. "This might seem counterintuitive," says Masami Yamaguchi, a professor at Chuo University. "But important visual abilities are acquired through the maturation of feedback processing."

For example, feedback processing is essential for robustly perceiving ambiguous visual images, such as occluded objects. "Younger infants whose feedback processing is immature might perceive the external world ambiguously," says Yamaguchi. "In return for susceptibility to visual masking, we acquire the ability to robustly perceive ambiguous visual scenes."

###

Paper: Nakashima Y., Kanazawa S., & Yamaguchi M. K. "Perception of invisible masked objects in early infancy" Proceedings of the National Academy of Sciences of the United States of America, 118 (27), e2103040118.

DOI: https://doi.org/10.1073/pnas.2103040118

This research was supported by the Japan Society for the Promotion of Science.

 

Performance-enhancing substance use: A link to criminal offending

New research shows performance-enhancing substance use is associated with criminal offending during young and middle adulthood

UNIVERSITY OF TORONTO

Research News

Toronto, ON - Despite research showing associations between anabolic steroid use and criminal offending, the possibility of a similar association between legal performance-enhancing substance use, such as creatine, and criminal offending remained unknown. A new study published online in the journal Drug and Alcohol Dependence now shows that both forms of performance-enhancing substance use are longitudinally associated with criminal offending among U.S. adults.

The study, which analyzed a sample of over 9,000 U.S. participants from the National Longitudinal Study of Adolescent Health (Add Health), highlights the need for more research on performance-enhancing substances to understand the complex social problems associated with their use.

"This is the first study to identify relationships between legal performance-enhancing substance use and criminal offending," says lead author Kyle T. Ganson, PhD, MSW, assistant professor at the University of Toronto's Factor-Inwentash Faculty of Social Work. "This finding is acutely salient because these substances are easily accessible and commonly used, particularly among young people."

The study highlights the importance of clinical professionals screening for performance-enhancing substance use and assessing patterns of criminal offending among young people.

"We need more research to identify effective prevention and intervention techniques to ensure that we reduce the use of these substances, as well as curtail any connection with criminal offending," says co-author Jason M. Nagata, MD, MSc, assistant professor at the University of California, San Francisco's Department of Pediatrics.

"The associations found in this study are likely explained by an intersection of behavioral, psychological, and sociocultural influences," says Ganson. "We therefore need to target this problem from a multitude of angles, including clinically and via public health and policy interventions."

###

Ganson, K.T., Testa, A., Jackson, D.B., & Nagata, J.M. (2021). Performance-Enhancing Substance Use and Criminal Offending: A 15-Year Prospective Cohort Study. Drug and Alcohol Dependence. https://doi.org/10.1016/j.drugalcdep.2021.108832

Edible cholera vaccine made of powdered rice proves safe in phase 1 human trials

Study points towards role of gut microbiome in vaccine effectiveness

UNIVERSITY OF TOKYO

Research News

IMAGE: RESEARCHERS AT THE UNIVERSITY OF TOKYO HAVE ANNOUNCED THE SUCCESSFUL RESULTS OF THE PHASE 1 CLINICAL TRIAL OF A NEW TYPE OF VACCINE TO PROTECT AGAINST CHOLERA AND TRAVELERS' DIARRHEA....

CREDIT: IMAGE BY DR. HIROSHI KIYONO, CC BY 4.0

A new vaccine to protect against deadly cholera has been made by grinding up genetically modified grains of rice. The first human trial has shown no obvious side effects and a good immune response. Researchers based at the University of Tokyo and Chiba University have published the peer-reviewed results of the Phase 1 clinical trial of the vaccine, named MucoRice-CTB, in The Lancet Microbe.

Vaccine manufacturing has made enormous strides in 2020, spurred on by COVID-19. However, the complexity of mRNA-based SARS-CoV-2 vaccines has highlighted the value of inoculations that can be made, transported and stored cheaply and without refrigeration.

The MucoRice-CTB vaccine is stable at room temperature from start to finish.

"I'm very optimistic for the future of our MucoRice-CTB vaccine, especially because of the dose escalation results. Participants responded to the vaccine at the low, medium and high doses, with the largest immune response at the highest dose," said Professor Hiroshi Kiyono, D.D.S., Ph.D., from the Institute of Medical Science at the University of Tokyo who leads the MucoRice project. Dr. Kiyono is also a faculty member at Chiba University in Japan and the University of California, San Diego, in the U.S.

Thirty volunteers received a placebo, while groups of 10 volunteers each received four doses, spaced two weeks apart, of either 3 milligrams (mg), 6 mg or 18 mg of the vaccine. Tests two and four months after the last dose revealed that volunteers who responded to the vaccine had IgA and IgG antibodies - two types of proteins the immune system produces to fight infections - specific to cholera toxin B (CTB). Participants who received a higher dose of vaccine were more likely to have CTB-specific antibodies.

An independent review board found no evidence of significant side effects.

Growing a new type of vaccine

Vibrio cholerae bacteria are spread most often by drinking water contaminated with sewage. Without medical attention, cholera can kill in mere hours due to diarrhea with severe dehydration. Cholera infects 1.3 million to 4 million people and causes 21,000 to 143,000 deaths each year, according to the World Health Organization.

There are four modern needle-free cholera vaccines, all given as drops on the tongue; however, they require cold storage and are made from whole killed or live-attenuated (weakened) cholera cells (https://www.fda.gov/media/98688/download).

The new cholera vaccine grows in genetically modified Japanese short-grain rice plants that produce a nontoxic portion of CTB that can be recognized by the immune system. CTB is similar in structure to a toxin made by some types of disease-causing E. coli bacteria, so cholera vaccines often provide cross protection against travelers' diarrhea.

Researchers grow the rice plants in a purpose-built, indoor hydroponic farm that meets WHO good manufacturing practice standards for medicines, which ensures that the vaccine remains uncontaminated and that the plants are isolated from the natural environment.

The plants produce the CTB subunit in their seeds, the edible grains of rice, and store the antigens in droplets called protein bodies with membranes made of fat.

"The rice protein bodies behave like a natural capsule to deliver the antigen to the gut immune system," said Dr. Kiyono.

Other medicines have been grown in plants, most often in the leaves - including treatments for Ebola, lymphoma and flu - but those drugs have to be extracted and purified before use. The grain-based MucoRice system avoids those extra steps and the need for cold storage, and it protects the antigens as they travel through the harsh acid of the stomach.

When the plants are mature, the rice is harvested and ground into a fine powder, then sealed in aluminum packets for storage. When people are ready to be vaccinated, the powder is mixed with about 90 milliliters (1/3 U.S. cup) of liquid and then drunk. Researchers have only tested the vaccine using saline (a salt solution equivalent to body fluids), but they expect it would work equally well with plain water.

Immunity through the gut is strong, but complicated by the microbiome

"The beautiful part of our vaccine is that it wisely uses the body's mucosal immune system through the gut for the induction of antigen-specific antibodies," said Dr. Kiyono.

MucoRice-CTB enters the body through intestinal mucosal membranes, mimicking a natural way of encountering and responding to germs. Stimulating the mucosal immune system produces two classes of antibodies that identify germs and target them for removal, IgG and IgA. Vaccines that are injected under the skin or into a muscle generally increase only IgG, not IgA, antibodies.

Volunteers who responded to MucoRice-CTB had their highest blood levels of antigen-specific IgG and IgA after eight to 16 weeks.

However, 11 of the 30 volunteers who received the vaccine showed low or no measurable immune response. All study volunteers reported never traveling outside of Japan, so it is unlikely that they had any previous exposure or natural immunity to V. cholerae or pathogenic E. coli.

"When we saw those data about the 11 low and nonresponders, we thought maybe gut microflora have an influence on the outcome of the immune response," Dr. Kiyono recalled.

The microflora or microbiome is the community of microorganisms that live in our bodies and either benefit us or are harmless. It is well accepted that the microflora of the digestive system influence health and immunity, but scientists are just beginning to understand the precise mechanisms of the relationship.

Extensive genetic analysis of all volunteers' fecal samples identified the thousands of bacterial species living in volunteers' intestines.

"In simplified terms, high responders had more diversified microflora, and in the low-responder group, diversity was much narrower," said Dr. Kiyono.

Researchers cautioned that the small size of the Phase 1 study - giving the vaccine to only 30 healthy Japanese male volunteers - means the relevance and prevalence of nonresponders is still unclear and that the total difference in microflora diversity was subtle. However, the results do hint at the larger role of microflora in vaccine effectiveness.

"It's all speculation right now, but maybe higher microflora diversity creates a better situation for strong immune response against oral vaccine," said Dr. Kiyono.

The link between the gut microbiome and vaccine effectiveness has been previously revealed by the unfortunate fact that most vaccines are developed in industrialized nations and some are then less effective when delivered in developing countries. Mucosal vaccines, including oral vaccines against polio and cholera, seem especially prone to this disparity. Most scientific theories to explain the phenomenon focus on chronic intestinal inflammation linked to poor sanitation. (https://doi.org/10.1186/1741-7007-8-129)

"Probably for every vaccination right now, even injected vaccines, we should think of the immune status of the individual based on the condition of their microflora," said Dr. Kiyono.

It remains to be seen how microflora diversity will affect the global effectiveness of the new MucoRice edible vaccine system compared with the track record of other oral vaccines.

For now, the researchers plan to work with partners in the pharmaceutical industry to bring MucoRice-CTB into the next phase of clinical trials in Japan and overseas.

###

Research Publication

Yoshikazu Yuki, Masanori Nojima, Osamu Hosono, Hirotoshi Tanaka, Yasumasa Kimura, Takeshi Satoh, Seiya Imoto, Satoshi Uematsu, Shiho Kurokawa, Koji Kashima, Mio Mejima, Rika Nakahashi-Ouchida, Yohei Uchida, Takanori Marui, Noritada Yoshikawa, Fumitaka Nagamura, Kohtaro Fujihashi, Hiroshi Kiyono. 24 June 2021. Assessment of Oral MucoRice-CTB vaccine for the safety and microbiota-dependent immunogenicity in humans: A Randomized Trial. The Lancet Microbe.

Related Links

Division of Mucosal Immunology: https://www.ims.u-tokyo.ac.jp/EnMen/index_e.html

Society for Mucosal Immunology oral history interview: https://www.socmucimm.org/about/smi-oral-history-project/dr-hiroshi-kiyono/

About the University of Tokyo

The University of Tokyo is Japan's leading university and one of the world's top research universities. The vast research output of some 6,000 researchers is published in the world's top journals across the arts and sciences. Our vibrant student body of around 15,000 undergraduate and 15,000 graduate students includes over 4,000 international students. Find out more at http://www.u-tokyo.ac.jp/en/ or follow us on Twitter at @UTokyo_News_en.

Funders

Japan Agency for Medical Research and Development (AMED), Ministry of Education, Culture, Sports, Science and Technology (MEXT), Science and Technology Research Partnership for Sustainable Development (SATREPS)