Saturday, September 09, 2023

Research into accessible virtual reality and quantum information processing


Early-stage Researchers Kathrin Gerling and Philip Willke from KIT Receive Starting Grants from the European Research Council


Grant and Award Announcement

KARLSRUHER INSTITUT FÜR TECHNOLOGIE (KIT)

Professor Kathrin Gerling (Photo: Tanja Meißner, KIT) and Tenure-track Professor Philip Willke (Photo: Private) receive ERC Starting Grants for their projects. 


CREDIT: TANJA MEISSNER, PHILIP WILLKE





Contribution to the Design of Accessible Technologies

Virtual reality (VR) has enormous potential in leisure, education, and work. Still, most VR systems are designed for non-disabled persons. The project AccessVR (Developing Experience-Centric Accessible Immersive Virtual Reality Technology) of Kathrin Gerling, Professor of Human-Computer Interaction and Accessibility at the Institute for Anthropomatics and Robotics (IAR) of KIT, aims to eliminate physical, digital, and experience-centric barriers and to make VR more accessible to disabled persons. In her project, the computer scientist will first study the requirements to be met by user interfaces and by the presentation of disability in VR. Customized VR prototypes will then be developed. Finally, an adaptable VR platform will be designed covering various workplace and entertainment scenarios. “The AccessVR project will lay the foundation for further inclusion of disabled people in VR, contribute to the design of accessible technologies, and support technological research that prioritizes accessibility from the very first day,” the researcher says.

Gerling heads the Human-Computer Interaction and Accessibility research group and co-directs the “Accessibility” real-world lab at KIT’s IAR. Her team’s research focuses on how interactive technology can be designed to support people’s self-determination, and on how to achieve experience-centric accessibility, which goes beyond the removal of barriers and aims to create enriching experiences for everyone.

 

Quantum Research Atom by Atom

Quantum mechanics is the theory that describes the laws of nature in the nanoworld, on the scale of atoms and molecules, and opens up new technological opportunities in information processing and sensor technology, for instance. Philip Willke, Tenure-Track Professor at KIT’s Physikalisches Institut (PHI), proposed ATOMQUANT to develop a new architecture, based on atomic force microscopy (AFM), for quantum information processing and magnetic sensing at the atomic level. Here, spins – the basic units of magnets – play a central role and enable the measurement of quantum mechanical properties of individual atoms and molecules. When spins are sufficiently isolated on the nanoscale, they can retain their quantum properties and remain oriented in a given direction for a long time. Within ATOMQUANT, Willke will work on improving the stability of these magnetic quantum states on surfaces by several orders of magnitude. “The results will have the potential to bring quantum research to the atomic scale. Potential quantum systems with outstanding quantum properties can be studied in situ and atom by atom,” the physicist explains.

At KIT’s PHI, Willke conducts research at the interface between quantum technologies and nanosciences. He is convinced that future challenges in science and technology will have to be addressed on the fundamental scale of matter, i.e. the atomic scale. With his team, he mainly uses scanning probe microscopy in combination with electron spin resonance to resolve and control quantum systems atom by atom. 

 

ERC Starting Grants 2023

The European Research Council (ERC) awards Starting Grants to outstanding early-stage researchers who want to start an independent career and establish a working group of their own. Selected projects are funded with up to about EUR 1.5 million each for a period of five years. In the 2023 competition, the ERC awarded Starting Grants totaling EUR 628 million to 400 projects in 24 countries. A total of 2,696 proposals were submitted to the ERC, corresponding to an approval rate of 14.8 percent.
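The quoted figures are easy to sanity-check. A quick calculation (ours, not the ERC's) confirms the approval rate and the implied average grant size:

```python
# Consistency check of the ERC Starting Grants 2023 figures quoted above.
funded_projects = 400
proposals_submitted = 2696
total_funding_eur = 628e6  # EUR 628 million in total

approval_rate = funded_projects / proposals_submitted * 100
average_grant_eur = total_funding_eur / funded_projects

print(f"Approval rate: {approval_rate:.1f} percent")             # 14.8 percent
print(f"Average grant: EUR {average_grant_eur / 1e6:.2f} million")  # EUR 1.57 million
```

The average of EUR 1.57 million sits slightly above the EUR 1.5 million core figure, consistent with the fact that ERC Starting Grants can include additional start-up funds on top of the core amount.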


Being “The Research University in the Helmholtz Association”, KIT creates and imparts knowledge for society and the environment. Its objective is to make significant contributions to mastering the global challenges in the fields of energy, mobility, and information. For this, about 9,800 employees cooperate in a broad range of disciplines spanning the natural sciences, engineering, economics, and the humanities and social sciences. KIT prepares its 22,300 students for responsible tasks in society, industry, and science by offering research-based study programs. Innovation efforts at KIT build a bridge between important scientific findings and their application for the benefit of society, economic prosperity, and the preservation of our natural basis of life. KIT is one of the German universities of excellence.


Machine learning contributes to better quantum error correction


Peer-Reviewed Publication

RIKEN




Researchers from the RIKEN Center for Quantum Computing have used machine learning to perform error correction for quantum computers—a crucial step for making these devices practical—using an autonomous correction system that, despite being approximate, can efficiently determine how best to make the necessary corrections.

In contrast to classical computers, which operate on bits that can only take the basic values 0 and 1, quantum computers operate on “qubits”, which can assume any superposition of the computational basis states. In combination with quantum entanglement, another quantum characteristic that connects different qubits beyond classical means, this enables quantum computers to perform entirely new operations, giving rise to potential advantages in some computational tasks, such as large-scale searches, optimization problems, and cryptography.
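To make the contrast with classical bits concrete, here is a minimal NumPy sketch (ours, for illustration only) of a single qubit placed in an equal superposition, and of two qubits entangled into a Bell state:

```python
import numpy as np

# A classical bit is 0 or 1; a qubit state is a normalized complex vector
# a|0> + b|1> with |a|^2 + |b|^2 = 1.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Equal superposition: the Hadamard gate applied to |0>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# Born rule: measurement probabilities are the squared amplitudes.
p0, p1 = np.abs(psi) ** 2
print(p0, p1)  # both 0.5 – the qubit yields 0 or 1 with equal probability

# Entangling two qubits with a CNOT gate gives a Bell state that no
# combination of independent single-qubit states can reproduce.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(psi, ket0)
# Only the outcomes 00 and 11 are possible, each with probability 0.5.
print(np.abs(bell) ** 2)
```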

The main challenge towards putting quantum computers into practice stems from the extremely fragile nature of quantum superpositions. Indeed, tiny perturbations induced, for instance, by the ubiquitous presence of an environment give rise to errors that rapidly destroy quantum superpositions and, as a consequence, quantum computers lose their edge.

To overcome this obstacle, sophisticated methods for quantum error correction have been developed. While they can, in theory, successfully neutralize the effect of errors, they often come with a massive overhead in device complexity, which itself is error-prone and thus potentially even increases the exposure to errors. As a consequence, full-fledged error correction has remained elusive.

In this work, the researchers leveraged machine learning in a search for error correction schemes that minimize the device overhead while maintaining good error correcting performance. To this end, they focused on an autonomous approach to quantum error correction, where a cleverly designed, artificial environment replaces the necessity to perform frequent error-detecting measurements. They also looked at “bosonic qubit encodings”, which are, for instance, available and utilized in some of the currently most promising and widespread quantum computing machines based on superconducting circuits.

Finding high-performing candidates in the vast search space of bosonic qubit encodings represents a complex optimization task, which the researchers address with reinforcement learning, an advanced machine learning method, where an agent explores a possibly abstract environment to learn and optimize its action policy. With this, the group found that a surprisingly simple, approximate qubit encoding could not only greatly reduce the device complexity compared to other proposed encodings, but also outperformed its competitors in terms of its capability to correct errors.
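The published pipeline is far more involved, but the flavor of reinforcement-learning-driven search can be sketched in a few lines. In this toy example (the candidate names and scores are entirely hypothetical), an epsilon-greedy agent learns from noisy evaluations which candidate "encoding" offers the best trade-off:

```python
import random

random.seed(0)

# Hypothetical scores: higher = better error correction per unit of overhead.
# In the real problem these would come from simulating the quantum device.
true_score = {"enc_A": 0.2, "enc_B": 0.9, "enc_C": 0.5, "enc_D": 0.7}

def noisy_reward(enc):
    """Simulated, noisy evaluation of one candidate encoding."""
    return true_score[enc] + random.gauss(0, 0.1)

# Epsilon-greedy bandit: estimate each candidate's value from samples.
estimates = {e: 0.0 for e in true_score}
counts = {e: 0 for e in true_score}
for step in range(2000):
    if random.random() < 0.1:                      # explore a random candidate
        enc = random.choice(list(true_score))
    else:                                          # exploit the current best
        enc = max(estimates, key=estimates.get)
    r = noisy_reward(enc)
    counts[enc] += 1
    estimates[enc] += (r - estimates[enc]) / counts[enc]  # running mean

best = max(estimates, key=estimates.get)
print(best)  # the agent converges on the highest-scoring encoding
```

The actual study uses a full reinforcement-learning agent over a vastly larger, structured search space, but the core loop is the same: evaluate, update the value estimates, and steer future exploration toward promising candidates.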

Yexiong Zeng, the first author of the paper, says, “Our work not only demonstrates the potential for deploying machine learning towards quantum error correction, but it may also bring us a step closer to the successful implementation of quantum error correction in experiments.”

According to Franco Nori, “Machine learning can play a pivotal role in addressing large-scale quantum computation and optimization challenges. Currently, we are actively involved in a number of projects that integrate machine learning, artificial neural networks, quantum error correction, and quantum fault tolerance.”


 

Artificial intelligence could help build pollen jigsaw of present and ancient flora


Peer-Reviewed Publication

UNIVERSITY OF EXETER

Pollen types captured via microscope 


CREDIT: THE UNIVERSITY OF EXETER




An emerging system which combines rapid imaging with artificial intelligence could help scientists build a comprehensive picture of present and historic environmental change – by swiftly and accurately analysing pollen.

Pollen grains from different plant species are unique and identifiable based on their shape. Analysing which pollen grains are captured in samples such as sediment cores from lakes helps scientists understand which plants were thriving at any given point in history, potentially dating back thousands to millions of years.

Up to now, scientists have manually counted pollen types in sediments or from air samples using a light microscope – a specialised and time-consuming task.

Now, scientists at the University of Exeter and Swansea University are combining cutting-edge technologies including imaging flow cytometry and artificial intelligence to build a system capable of identifying and categorising pollen at much faster rates. Their progress is published today in a research paper in New Phytologist. As well as building a fuller picture of past flora, the team hope the technology could one day enable more accurate pollen readings in today’s environment, which may help hayfever sufferers mitigate their symptoms.

Dr Ann Power, of the University of Exeter, said: “Pollen is an important environmental indicator, and piecing together the jigsaw of different pollen types in the atmosphere, both today and in the past, can help us build up a picture of biodiversity and climate change.

“However, recognising what plant species pollen belongs to under a microscope is incredibly labour-intensive and cannot always be done. The system we’re developing will cut the time this takes dramatically and improve classifications. This means we can build a richer picture of pollen in the environment far more swiftly, revealing how climate, human activity and biodiversity have changed over time, or better understand what allergens are in the air we breathe.”

The team has already used the system to automatically analyse a 5,500-year-old slice of lake sediment core, rapidly classifying over a thousand pollen grains. In the past, this would have taken a specialist up to eight hours to count and categorise – a task the new system completed in well under an hour.

The new system uses imaging flow cytometry – a technology typically used to investigate cells in medical research – to quickly capture pollen images. An artificial intelligence based on deep learning was then developed to identify the different types of pollen in an environmental sample, and it can make these distinctions even when the sample is imperfect.

Dr Claire Barnes, from Swansea University, said: “Up to now, the AI systems in development to categorise pollen learn from and test on the same pollen libraries – which means each sample is perfect and belongs to species previously seen by the network. These systems are not able to recognise pollen from the environment that’s taken some knocks along the way, nor to categorise pollen not included in training libraries. Incorporating a unique version of deep learning into our system means the artificial intelligence is smarter and applies a more flexible approach to learning. It can deal with poor quality images and can use shared species characteristics to predict what family of plant the pollen belongs to even if the system hasn’t seen it before during training.”
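The published network is of course far more sophisticated, but the family-level fallback Dr Barnes describes can be illustrated with a deliberately simple sketch (made-up feature vectors, species, and threshold, not the real model): classify to species when a known species is a close match, otherwise back off to the family whose average features are nearest:

```python
import numpy as np

# Hypothetical training data: one feature vector per (family, species).
species_centroids = {
    ("Poaceae", "Poa annua"):         np.array([1.0, 0.2]),
    ("Poaceae", "Lolium perenne"):    np.array([1.2, 0.4]),
    ("Betulaceae", "Betula pendula"): np.array([4.0, 3.0]),
}

def classify(x, species_threshold=0.5):
    """Return (family, species), or (family, None) for unfamiliar pollen."""
    dists = {k: np.linalg.norm(x - c) for k, c in species_centroids.items()}
    (family, species), d = min(dists.items(), key=lambda kv: kv[1])
    if d <= species_threshold:
        return family, species           # confident species-level call
    # Fallback: compare against per-family average features instead, so an
    # unseen species can still be assigned to the right family.
    fam_centroids = {}
    for (fam, _), c in species_centroids.items():
        fam_centroids.setdefault(fam, []).append(c)
    fam_dists = {f: np.linalg.norm(x - np.mean(cs, axis=0))
                 for f, cs in fam_centroids.items()}
    return min(fam_dists, key=fam_dists.get), None

print(classify(np.array([1.05, 0.25])))  # ('Poaceae', 'Poa annua')
print(classify(np.array([1.8, 0.9])))    # ('Poaceae', None) – unseen grass
```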

In the coming years, the team hopes to refine and launch the new system, and to use it to learn more about grass pollen, a particular irritant for hayfever sufferers. Dr Power said: “Some grass pollens are more allergenic than others. If we can understand better which pollens are prevalent at specific times, it would lead to improvements in the pollen forecast that could help people with hayfever plan to reduce their exposure.”

The research is supported by the Natural Environment Research Council (NERC) and the US National Institutes of Health. The paper is entitled ‘Deductive Automated Pollen Classification in Environmental samples via Exploratory Deep Learning and Imaging Flow Cytometry’, and is published in New Phytologist.

 

Nudging food delivery customers to skip the fork drastically cuts plastic waste, study shows


A new study examines how a food-delivery platform in China used “green nudges” to increase the share of no-cutlery orders by 648 percent


Peer-Reviewed Publication

UNIVERSITY OF CHICAGO




In 2021, more than 400 million metric tons of plastic waste were produced worldwide, and it is predicted that the world’s plastic waste growth will continue to outpace the efforts to reduce plastic pollution in the coming decades. As food delivery services became increasingly popular during the COVID-19 pandemic, the surge in plastic waste generated by single-use cutlery has become a key environmental challenge for many countries. A new study finds “green nudges” that encouraged customers to skip asking for cutlery with their delivery orders were dramatically successful and could be a powerful policy tool to reduce plastic waste.

“Few policies target plastic waste production at the consumer level, except charges on plastic bags,” says EPIC-China’s research director Guojun He, an author of the study and an Associate Professor at the University of Hong Kong Business School. “Our findings show that simple nudges can make a big difference in changing consumers’ behaviors and could become a tool for policymakers as they confront the immense challenge of plastic waste.”

Reducing single-use cutlery waste in the food-delivery industry is particularly important in China, the world’s largest producer and consumer of single-use cutlery. As of 2019, more than 540 million Chinese were active users of food-delivery services and each day consumed more than 50 million sets of single-use cutlery that were not adequately treated or disposed of. To reduce single-use cutlery consumption, policy-makers in China set a target of reducing its usage in food deliveries by 30 percent by 2025.

Guojun He and his co-authors Yuhang Pan, Albert Park, Yasuyuki Sawada and Elaine Tan worked with Alibaba’s online food-ordering platform Eleme, China’s second-largest food-delivery company (similar to Uber Eats and DoorDash), with more than 753 million users in 2022. The researchers evaluated the effectiveness of Alibaba’s green nudges in reducing single-use cutlery consumption. These nudges included changing the default selection to “no cutlery” and awarding green points as rewards for forgoing cutlery. Once a customer accumulated enough green points, the points could be redeemed to plant a tree in the customer’s name.

The researchers studied each user’s monthly food-ordering history over two years (2019–2020) in 10 major Chinese cities: the three treated cities with green nudges (Beijing, Shanghai, and Tianjin) and the seven control cities without them (Qingdao, Xi’an, Guangzhou, Nanjing, Hangzhou, Wuhan, and Chengdu). Among these cities, the authors randomly sampled about 200,000 active users, i.e., those who placed at least one order between 2019 and 2020.

The authors found that the green nudges—changing the default to “no cutlery” and rewarding consumers with green points—increased the share of no-cutlery orders by 648 percent. They calculated that if green nudges were applied to all of China, more than 21.75 billion sets of single-use cutlery would be saved annually, eliminating 3.26 million metric tons of plastic waste and saving 5.44 million trees (from wooden chopsticks) each year.
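The per-unit figures implied by this scale-up are easy to derive (our back-of-the-envelope inference, not numbers stated by the authors):

```python
# Back-of-the-envelope check of the nationwide scale-up figures quoted above.
sets_saved_per_year = 21.75e9      # single-use cutlery sets
plastic_saved_tonnes = 3.26e6      # metric tons of plastic waste
trees_saved = 5.44e6               # trees (wooden chopsticks)

# Implied plastic mass per cutlery set:
grams_per_set = plastic_saved_tonnes * 1e6 / sets_saved_per_year
print(f"{grams_per_set:.0f} g of plastic per set")   # ~150 g

# Implied number of chopstick sets obtained from one tree:
sets_per_tree = sets_saved_per_year / trees_saved
print(f"~{sets_per_tree:.0f} sets per tree")         # ~4,000
```

Both implied quantities (roughly 150 g of plastic per set and roughly 4,000 chopstick sets per tree) are internally consistent, which is a useful check that the three headline numbers belong to the same accounting.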

“Other food delivery platforms, such as UberEats and DoorDash, could try similar nudges to reduce cutlery consumption and plastic waste globally,” says He.

 

Researchers develop highly efficient and stable photoelectrode for water splitting using organic semiconductors


Their findings have been selected for the supplementary cover of ACS Energy Letters and published on June 9, 2023


Peer-Reviewed Publication

ULSAN NATIONAL INSTITUTE OF SCIENCE AND TECHNOLOGY (UNIST)

Professor Ji-Hyun Jang and her research team 

IMAGE: PROFESSOR JI-HYUN JANG (SECOND FROM LEFT) FROM THE SCHOOL OF ENERGY AND CHEMICAL ENGINEERING AT UNIST

CREDIT: UNIST




A team of researchers, led by Professor Ji-Hyun Jang from the School of Energy and Chemical Engineering at UNIST, has achieved a significant breakthrough in photoelectrode development. Through collaborative research with Professor Junghoon Lee from Dongseo University and Dr. Hyo-Jin Ahn from the German Engineering Research and Development Center LSTME Busan, the team successfully created a high-performance, stable photoelectrode by incorporating organic semiconductors as an intermediate layer in existing inorganic semiconductor-based photoelectrodes.

The production of green hydrogen using solar energy involves splitting water into its constituent elements through charges generated in semiconductors that absorb sunlight. Previous studies primarily focused on utilizing inorganic semiconductors for constructing photoelectrodes. However, organic semiconductors offer several advantages, such as lower costs, versatile processing methods, and easier large-scale production. Additionally, their high solar energy conversion efficiency translates to enhanced hydrogen production efficiency. Nevertheless, the susceptibility of organic semiconductors to water damage has limited their application in photoelectrodes.

To overcome this challenge, the research team applied a coating of organic semiconductors onto the surface of conventional iron oxide-based photoelectrodes to ensure stability when exposed to water. Furthermore, they implemented a catalyst (nickel/iron double-layer hydroxide) as an additional protective layer over the coated organic semiconductor to prevent direct contact with water. This innovative approach allowed the charges generated from solar energy absorption to facilitate efficient hydrogen production reactions.

Professor Jang expressed the team’s excitement about the research outcomes by saying, “By overcoming the limitations of traditional inorganic semiconductor-oriented photoelectrodes, we have demonstrated the potential for more extensive utilization of organic semiconductors in hydrogen production through photoelectrodes.”

This breakthrough not only opens up new possibilities for enhancing efficiency and stability but also contributes to advancing sustainable energy solutions for a carbon-neutral future.

The study findings were published online in ACS Energy Letters on March 12, 2023, ahead of official publication. The work was selected for the supplementary cover of the journal and was subsequently published on June 9, 2023. The main authors were Dr. Hyo-Jin Ahn (LSTME Busan) and Ki-Yong Yoon (UNIST). Funding support was provided by the National Research Foundation of Korea (NRF) and the Ministry of Science and ICT (MSIT).


Scheme for the fabrication of NiFe(OH)x/PSi/Ti–Fe2O3

Journal Reference
Hyo-Jin Ahn, Ki-Yong Yoon, Mingi Sung, et al., “Utilizing a Siloxane-Modified Organic Semiconductor for Photoelectrochemical Water Splitting,” ACS Energy Lett., (2023).

First U.S. study of nest temperature impacts on leatherback hatchlings


Researchers uncover correlation between heat and hatchling shape, performance and hatching success

Peer-Reviewed Publication

FLORIDA ATLANTIC UNIVERSITY

Heat and Leatherback Hatchlings 

VIDEO: UPON EMERGENCE, HATCHLINGS WERE TESTED FOR RIGHTING ABILITY (FLIPPING OVER WHEN PLACED ON THEIR BACK) AND CRAWLING SPEEDS. HATCHLINGS THAT WERE SMALLER AND/OR HAD A LARGER BODY DEPTH STRUGGLED TO RIGHT THEMSELVES.

CREDIT: SARAH MILTON, FLORIDA ATLANTIC UNIVERSITY




One of the most unique of all sea turtle species, the leatherback (Dermochelys coriacea), is distinguished by its smooth, “leathery” looking carapace or shell. Listed as vulnerable, this species is experiencing population declines from natural and anthropogenic threats – including climate change.

Environmental conditions affect sea turtles’ reproductive success inside the nest. Increasing sand temperatures can affect embryonic development, which can lead to decreased hatching success and hatchling performance. Hatchlings need to be in good physical shape to crawl from the nest to the water to avoid being preyed upon by seabirds, raccoons and other predators.   

A Florida Atlantic University study is the first to examine the effects of incubation temperature on leatherback hatchling performance in the United States. Studies of nest temperatures and hatchling performance to date have focused primarily on green and loggerhead sea turtles.

Results, published in the journal Endangered Species Research, showed that hatchling morphology, performance and nest success were all influenced by nest temperature. Mid- and late-season hatchling masses differed significantly from each other, with mid-season hatchlings weighing more.

For the study, temperature data loggers were placed in 13 leatherback turtle nests along 9.6 kilometers of coastline in Juno Beach on the day they were laid during the early, middle and late South Florida nesting seasons beginning in March and ending in June. Researchers looked at the relationship between nest incubation temperatures and hatching success, emergence success, hatchling morphology and locomotor performance.

Upon emergence, hatchlings were tested for righting ability (flipping over when placed on their back) and crawling speeds. To evaluate hatchling morphology, researchers used digital calipers to measure carapace length and width, body depth and flipper length. Body mass was determined using a digital scale. They also identified the overall mean incubation temperature and the maximum temperature for each nest.

Nests with lower temperatures produced longer hatchlings; the highest nest temperatures produced hatchlings with thicker body depths. Flipper length also correlated with temperature. Hatchlings from the highest nest temperatures had shorter flippers compared to hatchlings from mid-season nests that had longer flippers.

Incubation temperatures also affected hatchling performance. Righting response scores were significantly lower in hatchlings from late-season, hotter nests. Hatchlings that were smaller and/or had a larger body depth struggled to right themselves. However, 30 percent of the 119 hatchlings tested across the season could not right themselves even once.

Researchers did not find a correlation between incubation temperature and crawling, nor between crawling speed and hatchling size. Crawling speed did not significantly differ between early, mid and late nests.

“Interestingly, locomotor performance did not show the same correlations with temperature as hatching and emergence success, indicating hatchlings can be good performers even if they come from nests with poor hatching and emergence success,” said Sarah Milton, Ph.D., senior author and chair and professor, Department of Biological Sciences within FAU’s Charles E. Schmidt College of Science. “The leatherback hatchlings crawled at similar speeds, regardless of incubation mean or maximum temperature, which suggests there are other factors influencing performance. One factor that may influence terrestrial performance is their overall body morphology compared to the other species.”

Findings showed hatching and emergence success correlated with temperature, where the mid-season nests were the most successful. Nests incubating at hotter temperatures also had a shorter incubation period, which means embryo developmental time is reduced, and hatchlings are unable to convert as much of the yolk mass to body tissue growth. The leatherback turtle nests in this study had an overall lower hatching success (45 percent) than loggerhead (73 percent) and green sea turtles (70 percent). 

“There could be a temperature ‘sweet spot’ for incubation since the mid-season nests had both higher hatching and emergence success and better hatchling physical performance,” said Milton.

Mean nest temperatures ranged from 29 degrees Celsius (84 degrees Fahrenheit) to 32.5 degrees Celsius (90.5 degrees Fahrenheit) and were significantly higher in the mid- and late season compared to early season nests.

“While leatherback turtle nests are laid in the cooler months in South Florida and are deeper than nests of other species, temperatures can still rise surprisingly high,” said Milton. “Increasing temperatures due to climate change poses a significant threat to a species that already has lower nest success than other species that share the same nesting beaches.” 

Study co-author is Heather A. Seaman, an FAU graduate student pursuing her Ph.D., under Milton’s direction in the Department of Biological Sciences.

Listed as vulnerable, the leatherback is one of the most unique of all sea turtle species, distinguished by its smooth, “leathery”-looking carapace or shell.

CREDIT

Sarah Milton, Florida Atlantic University

- FAU -

About Florida Atlantic University:
Florida Atlantic University, established in 1961, officially opened its doors in 1964 as the fifth public university in Florida. Today, the University serves more than 30,000 undergraduate and graduate students across six campuses located along the southeast Florida coast. In recent years, the University has doubled its research expenditures and outpaced its peers in student achievement rates. Through the coexistence of access and excellence, FAU embodies an innovative model where traditional achievement gaps vanish. FAU is designated a Hispanic-serving institution, ranked as a top public university by U.S. News & World Report and a High Research Activity institution by the Carnegie Foundation for the Advancement of Teaching. For more information, visit www.fau.edu.


 

Dog diversity unveiled by international DNA database



Peer-Reviewed Publication

MICHIGAN MEDICINE - UNIVERSITY OF MICHIGAN

Dogs of the Michigan Medicine PR team 


CREDIT: MICHIGAN MEDICINE COMMUNICATIONS




An international consortium of scientists, led by Jeff Kidd, Ph.D., of the University of Michigan; Jennifer R. S. Meadows of Uppsala University in Sweden; and Elaine A. Ostrander, Ph.D., of the NIH National Human Genome Research Institute, is using an unprecedentedly large database of canine DNA to take an unbiased look at how our furry friends evolved into the various breeds we know and love.

A new paper, published in the journal Genome Biology, outlines what the Dog10K project discovered after sequencing the genomes of close to 2,000 samples from 321 dog breeds, as well as wild dogs, coyotes, and wolves, and comparing them to a single reference sample—that of a German Shepherd named Mischka.

Analyzing more than 48 million pieces of genetic information, they discovered that each breed dog carried around 3 million single nucleotide polymorphism differences. These SNPs, or “snips”, account for most of the genetic variation among people and dogs alike. They also found roughly 26,000 deleted sequences that were present in the German Shepherd reference but not in a given breed dog, and some 14,000 sequences present in the breed dog but missing from Mischka’s DNA.
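As a toy illustration of what a single nucleotide polymorphism is, the sketch below compares a made-up, pre-aligned sample sequence against a reference and lists the positions where single bases differ. Real pipelines such as Dog10K's align and compare whole genomes; this only shows the concept:

```python
# Stand-in sequences: the reference plays the role of Mischka's genome,
# the sample that of another breed dog (both entirely made up).
reference = "ACGTACGTACGT"
sample    = "ACGAACGTTCGT"

# A SNP is a position where a single base differs between the two.
snps = [(i, r, s)
        for i, (r, s) in enumerate(zip(reference, sample))
        if r != s]

print(snps)       # [(3, 'T', 'A'), (8, 'A', 'T')]
print(len(snps))  # 2 single-nucleotide differences
```

Deletions and insertions (the 26,000 and 14,000 sequence differences mentioned above) would show up as gaps in one sequence relative to the other rather than as substitutions, and require an alignment step that this sketch omits.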

“We did an analysis to see how similar the dogs were to each other, and it ended up that we could divide them into around 25 major groups that pretty much match up with what people would have expected based on breed origin, the dogs’ type, size and coloration,” said Kidd, a professor of Human Genetics and Computational Medicine and Bioinformatics at the U-M Medical School.

Most of the varying genes, he added, had to do with morphology, confirming that the breed differences were driven by how the dogs look.

Relative to dogs, wolves had around 14 percent more variation. And wild village dogs—dogs that live amongst people in villages or cities but aren’t kept as pets—exhibited more genetic variation than breed dogs.

The data set, which was processed using the Great Lakes high-performance computing cluster at U-M, also revealed an unusual number of retrogenes. A retrogene is a new gene that forms when RNA is reverse-transcribed into DNA and inserted back into the genome in a different spot. The study found 926 retrogenes, the most famous of which, says Kidd, is FGF4, which results in the short-leg phenotype seen in dachshunds and corgis.

“Dogs tend to have an increased amount of retrogenes which have resulted in mutations that were selected for, that perhaps people found cute and bred more of,” said Kidd. His lab is attempting to figure out why retrogenes and insertions happen so frequently in dogs.

One of the benefits of the Dog10K consortium is its size, which will enable researchers at U-M and elsewhere to examine the genetic underpinnings of other canine characteristics and even common diseases in dogs, such as cancer.

Paper cited: “Genome sequencing of 2000 canids by the Dog10K consortium advances the understanding of demography, genome function and architecture,” Genome Biol 24, 187 (2023). https://doi.org/10.1186/s13059-023-03023-7

 

The need to hunt small prey compelled prehistoric humans to produce appropriate hunting weapons and improve their cognitive abilities


The extinction of large prey drove evolutionary changes in prehistoric humans

Peer-Reviewed Publication

TEL-AVIV UNIVERSITY

Illustration of elephant hunting using spears 


CREDIT: TEL AVIV UNIVERSITY




A new study from the Department of Archaeology at Tel Aviv University found that the extinction of large prey, upon which human nutrition had been based, compelled prehistoric humans to develop improved weapons for hunting small prey, thereby driving evolutionary adaptations. The study reviews the evolution of hunting weapons from wooden-tipped and stone-tipped spears, all the way to the sophisticated bow and arrow of a later era, correlating it with changes in prey size and human culture and physiology.

 

The researchers explain: "This study was designed to examine a broader unifying hypothesis, which we proposed in a previous paper published in 2021. The hypothesis explains the cultural and physiological evolution of prehistoric humans – including increased cognitive abilities – as an adaptational response to the need to hunt progressively smaller and quicker prey. Until now, such a unifying hypothesis has been lacking in the professional literature, with the prevailing view maintaining that the changes in hunting weapons were a reflection of an essentially unexplained cognitive improvement." The study was led by Dr. Miki Ben-Dor and Prof. Ran Barkai from the Department of Archaeology at Tel Aviv University. The paper was published in Quaternary.

 

Dr. Ben-Dor: "In the present study we analyzed findings from nine prehistoric sites – in South Africa, East Africa, Spain, and France, inhabited during the transition from the Lower to the Middle Stone Age (Paleolithic), about 300,000 years ago, when Neanderthals and Homo Sapiens first emerged. In early archaeological sites of this kind, we find mostly animal bones and stone tools used to hunt and process prey. The bones reflect the relative quantities of different species hunted by humans, such as elephants, fallow deer, etc. In this study we looked for a correlation between the advent of stone-tipped spears, and the progressive decline in prey size.  Specifically, we examined the emergence of a sophisticated stone-knapping method known as the Levallois technique, which is especially indicative of cognitive development: unlike earlier knapping methods, here the craftsman first prepares a core of good-quality stone, then cuts a pointed item off with one stroke – a process that requires him/her to imagine the final outcome in advance.  We found that in all cases, at all sites, stone tips made with the Levallois technology appeared simultaneously with a relative decrease in the quantity of bones of large prey. "

Dr. Ben-Dor adds that "studies of contemporary hunter-gatherers indicate that a wooden spear is quite sufficient for hunting large prey like an elephant: the hunters first limit the animal's mobility – for example by driving it into a swamp or digging a trapping pit and concealing it with branches - then thrust the spear into the prey and wait for it to bleed. On the other hand, a middle-sized animal like a deer is much more difficult to trap, and if hit by a wooden spear it will probably run away. A more substantial wound induced by a stone-tipped spear is likely to slow it down and reduce the distance it can run before ultimately collapsing - increasing the hunter's chances of retrieving the fallen prey. This insight further elucidates our findings from hundreds of thousands of years ago, when stone-tipped spears were developed in response to the increasing scarcity of large prey."

Reviewing the evolution of prehistoric hunting, the researchers explain that "humans began to make stone tools about 3 million years ago and started to hunt about 2 million years ago, with hunting weapons evolving constantly throughout prehistoric times. Homo Erectus, the ancestor of all later types of humans, used a wooden spear, probably thrusting it into large prey from up close. Homo Sapiens and Neanderthals, emerging about 300,000 years ago, upgraded their spears by adding stone tips, which they produced with the more sophisticated Levallois technique. These stone-tipped spears were apparently used for both thrusting and hurling. About 50,000 years ago, more complex hunting systems, such as the bow and arrow and the spear thrower, came into regular use by Homo Sapiens. At the end of the Upper Paleolithic, about 25,000 years ago, new hunting aids emerged, such as dogs, traps, and fishing hooks. Facts about this continual evolution of hunting weapons, necessarily accompanied by improvements in human cognition and skills, have been known for a long time; and yet a unifying hypothesis explaining these facts, or attributing them to some change in the environment, had never been proposed. In our research we have tried to address this challenge."

IMAGE: Changes in the hunting tools and the dominant animals at Paleolithic sites (Credit: Tel Aviv University)

Prof. Barkai: "For the past ten years we have been searching for a unified explanation for focal phenomena in the cultural and biological evolution of prehistoric humans. Our excavations at the Qesem Cave site led us to conclude that elephants, a major component of the human diet in our region for a million years, disappeared about 300,000 years ago as a result of overhunting and climate change. With the huge elephants gone, humans had to find ways of obtaining the same amount of calories from a larger number of smaller animals. We therefore hypothesized that prey size had played a major part in human evolution: at first the largest animals were hunted, and when these were gone humans moved on to the next in size, and so on. Finally, when hunting was no longer energetically viable, humans began to domesticate animals and plants. That's how the agricultural revolution began."

In 2021 Prof. Barkai and Dr. Ben-Dor published a unified hypothesis that explains the physiological and cultural evolution of Paleolithic humans, including the improvement of cognitive capabilities, as adaptations to declining energetic returns caused by a progressive decrease in prey size. The present study's findings corroborate this hypothesis, following another study they published last year together with zoologists Jacob Dambitzer and Prof. Shai Meiri of TAU. Surveying data from archaeological sites dating from 1,500,000 to 20,000 years ago, the 2022 study found that the dominant species of prey at the beginning of the period was a 12-ton elephant, and at the end it was a 25 kg gazelle. In addition, the data indicate that the average weight of animals hunted by humans a million years ago was 3 tons, falling to 50 kg by 20,000 years ago. In other words, prey size decreased continually through time.

Prof. Barkai: "In the present study specifically, and in our broader unifying hypothesis in general, we propose for the first time an explanation for one of the most intriguing questions in prehistoric archaeology: Why did tools change? The usual explanation is that tools changed due to improvements in the cognitive abilities of humans. For instance, when humans were suddenly able to imagine the outcomes of a sophisticated process, they developed the Levallois technique. But one may well ask: Why did humans become smarter all of a sudden? What was the advantage of having a large brain that consumes so much energy? We demonstrate that these biological and cognitive changes correlate directly with the size of prey. To hunt small elusive animals humans had to become smarter, faster, more focused, more observant, and more collaborative. They had to develop new weapons for hunting from afar and learn how to track their prey. And they had to choose their prey carefully, with preference for high fat content, to ensure a sufficient energetic return - because hunting a large number of agile gazelles requires a much higher investment of energy than hunting one giant elephant. This, we propose, is the evolutionary pressure that generated the improvement in human ability and tools – to ensure an adequate energy return on investment (EROI)."
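As a rough illustration of the scale involved, the prey weights reported in the team's 2022 survey can be compared by body mass alone. This is only a back-of-the-envelope sketch – it ignores edible yield, hunting success rates, and pursuit costs, which all make the gazelle strategy even more expensive in practice:

```python
# Back-of-the-envelope comparison using only the body weights cited
# in the 2022 study: a 12-ton dominant prey species at the start of
# the period versus a 25 kg gazelle at the end.
elephant_kg = 12_000   # 12 tons
gazelle_kg = 25

# How many gazelles match one elephant's body mass?
gazelles_per_elephant = elephant_kg / gazelle_kg
print(f"{gazelles_per_elephant:.0f} gazelles = 1 elephant by mass")  # 480
```

Each of those hundreds of gazelles would also have to be tracked and chased individually, which is the source of the much higher energy investment the researchers describe.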

Link to the article:

https://www.mdpi.com/2571-550X/6/3/46