It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way. (K. Marx, letter to F. Engels on the Indian Mutiny)
Friday, March 29, 2024
Suppressing boredom at work hurts future productivity, study shows
UNIVERSITY OF NOTRE DAME
Boredom is more common at work than in any other setting, studies show, and employees are bored at work for more than 10 hours per week on average.
Even astronauts and police officers get bored on the job. No occupation is immune.
Boredom serves an important purpose — it signals the need to stop an action and find an alternative project. But boredom becomes problematic when it’s ignored.
New research from the University of Notre Dame shows that trying to stifle boredom prolongs its effects and that alternating boring and meaningful tasks helps to prevent the effects of one boring task from spilling over to reduce productivity on others.
The team sought to understand if, when and why experiencing boredom now might lead to attention and productivity deficits later. They tested these possibilities in three studies that examined the consequences of boredom on a task-to-task basis.
The first study drew on data from dual-career families working in a variety of industries. Participants responded to multiple surveys per day at different intervals, enabling the team to examine the relationships between boredom, attention and productivity over time. Follow-up studies used alternative methods to reach a broader audience and focused on how meaningful work tasks help mitigate boredom’s prolonged effects.
Casher Belinda, who specializes in emotions, interpersonal communication and close relationships within organizations, noted that boredom is often viewed as a nuisance emotion that any strong-willed employee should subdue for the sake of productivity.
He found that experiencing boredom at any one point in time leads to delayed or residual bouts of mind-wandering. Employees often try to “power through” boring tasks to make progress on their work goals, but he said that not only does this fail to prevent boredom’s negative effects, it’s also one of the most dysfunctional responses to boredom.
“Like whack-a-mole, downplaying boredom on one task results in attention and productivity deficits that bubble up during subsequent tasks,” he said. “Paradoxically, then, trying to suppress boredom gives its harmful effects a longer shelf life.”
Part of the solution lies in how work tasks are organized throughout the day. Although boring tasks can’t be avoided, effectively combating the negative effects of boredom requires careful consideration of the nature of different work tasks and how they are sequenced. Belinda said it helps to work strategically, looking beyond a single boring task.
“‘Playing the long game’ will help minimize the cumulative effects of boredom over the course of the day,” Belinda explained. “Following an initial boring task, employees should turn to other meaningful tasks to help restore lost energy.”
Breaking boredom: Interrupting the residual effect of state boredom on future productivity.
How commercial rooftop solar power could bring affordable clean energy to low-income homes
A new study led by Stanford researchers finds that factory and warehouse rooftops offer a big untapped opportunity to help disadvantaged communities bridge the solar energy divide.
STANFORD UNIVERSITY
Lower-income communities across the United States have long been much slower to adopt solar power than their affluent neighbors, even when local and federal agencies offer tax breaks and other financial incentives.
But commercial and industrial rooftops, such as those atop retail buildings and factories, offer a big opportunity to reduce what researchers call the “solar equity gap,” according to a new study, published in Nature Energy and led by researchers at Stanford University.
“The solar equity gap is a serious problem in disadvantaged communities, in part because of income inequalities, but also because residential solar isn’t usually practical for people who don’t own their homes,” said Ram Rajagopal, senior author of the study and associate professor of civil and environmental engineering and of electrical engineering at Stanford. “This new study shows that commercial and industrial properties have the capacity to host solar resources to fill in part of that gap.”
Untapped resources
First, the bad news. The researchers found that non-residential rooftops generate 38% less electricity in disadvantaged communities than in wealthier ones. That gap, driven mainly by lower deployment in poorer areas, has widened over the past two decades. Nevertheless, it is significantly smaller than the corresponding gap in residential solar in these neighborhoods.
The good news, the researchers say, is that non-residential buildings have large unused capacity to produce solar power for their own benefit and to supply the communities around them. In low-income communities, commercial enterprises may be more responsive to government incentives for solar power than households are. An earlier study by the same researchers found that residential customers in disadvantaged communities, who may have fewer financial resources and often don’t own their homes, show less response to tax breaks and other financial inducements.
“Using Stanford’s DeepSolar database, we estimated that solar arrays on non-residential buildings could meet more than a fifth of annual residential electricity demand in almost two-thirds of disadvantaged communities,” said Moritz Wussow, the study’s lead author.
“Also, the raw cost of that power would be less in many communities than the residential rates that local electric utilities charge,” said Wussow, who was a visiting student researcher in Rajagopal’s lab group in 2022 and 2023.
To quantify the distribution of non-residential solar power installations, the researchers used satellite images and artificial intelligence to identify the number and size of rooftop solar arrays in 72,739 census tracts across the United States. About one-third of those tracts are deemed disadvantaged by the U.S. government.
The team tracked non-residential solar deployment, as well as the amount of unused rooftop space that would be a good candidate for solar installation, from 2006 through 2016 and then again for 2022. They then calculated the average annual cost of producing solar electricity in each area, based on the amount of local sun exposure and other variables. The costs ranged from about 6.4 cents per kilowatt-hour in sun-drenched New Mexico to almost 11 cents in Alaska. But those costs were lower than residential electricity rates in many of those areas – even in many northern states.
Chad Zanocco, a co-author of the new study and a postdoctoral fellow in civil and environmental engineering, noted that getting the power to residential areas would include other costs, such as battery storage and the construction of microgrids.
“We estimate that battery storage would increase total system costs by about 50%, but even that would be practical in almost two-thirds of the disadvantaged communities we studied,” Zanocco said.
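To make the arithmetic behind these figures concrete, here is a minimal levelized-cost-of-electricity (LCOE) sketch in Python. The capital cost, operating cost, lifetime, discount rate and capacity factors are illustrative assumptions chosen only so the output lands near the reported 6.4–11 cents per kilowatt-hour range; they are not the study’s inputs, and the study’s actual model is more detailed.

# Illustrative LCOE sketch (assumed inputs, not the study's model).

def crf(rate, years):
    """Capital recovery factor: converts an upfront cost into an
    equivalent annual payment over the system lifetime."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def lcoe_cents_per_kwh(capex_per_kw, om_per_kw_yr, capacity_factor,
                       rate=0.05, years=25):
    """Levelized cost of electricity, in cents/kWh, per kW of panels."""
    annual_cost = capex_per_kw * crf(rate, years) + om_per_kw_yr  # $/kW-year
    annual_kwh = capacity_factor * 8760                           # kWh/kW-year
    return 100 * annual_cost / annual_kwh

# Hypothetical capacity factors: sunnier sites yield more energy per kW.
for region, cf in [("sunny Southwest site", 0.24), ("high-latitude site", 0.13)]:
    base = lcoe_cents_per_kwh(capex_per_kw=1600, om_per_kw_yr=20, capacity_factor=cf)
    with_storage = 1.5 * base  # the study estimates storage adds ~50% to system cost
    print(f"{region}: ~{base:.1f} cents/kWh; ~{with_storage:.1f} with storage")

Under these assumptions, even with the 50% storage adder the sunny-site figure stays below typical residential retail rates, which is the comparison the study draws; the numbers here are for orientation only.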
Economies of scale
If commercial and industrial solar arrays can feed their surplus electricity into local power grids, the researchers write, lower-income residents could gain access through community subscriptions rather than by building their own rooftop panels. Commercial and industrial sites also offer greater economies of scale, compared to individual household solar panels. Another big advantage is that non-residential power customers could also be highly sensitive to tax incentives and other government inducements, leading to greater adoption.
Further lowering barriers, the researchers noted, is the Inflation Reduction Act of 2022, which has provided billions of dollars to states and local communities for clean-energy infrastructure. That money has already reduced the cost of new microgrids.
“Beyond reducing carbon emissions and slowing climate change, increased access to solar power would offer tangible local benefits to lower-income communities,” said Zhecheng Wang, a co-author and a postdoctoral fellow at Stanford’s Institute for Human-Centered Artificial Intelligence.
“This would promote local clean and low-cost energy generation, which would also increase resilience to outages and reduce the pollution caused by fossil fuel power plants – many of which are located in low-income areas.”
Additional Stanford co-authors on the paper are: Rajanie Prabha, a PhD student in Civil & Environmental Engineering; June Flora, a senior research scholar in Civil & Environmental Engineering, and in Stanford’s School of Medicine; and Arun Majumdar, dean of Stanford Doerr School of Sustainability, professor in the departments of Energy Science & Engineering, of Mechanical Engineering, and of Photon Science at SLAC National Accelerator Laboratory, and senior fellow at the Precourt Institute and Stanford Woods Institute for the Environment. Civil & Environmental Engineering is a joint department of the Stanford School of Engineering and the Stanford Doerr School of Sustainability. Dirk Neumann, a professor of information systems research at Albert-Ludwigs-Universität in Freiburg, Germany, is also a co-author. Ram Rajagopal is also co-director of the Bits & Watts Initiative at the Precourt Institute for Energy, which is part of the Stanford Doerr School of Sustainability.
Just as many people battle seasonal colds and flu, native plants face their own viral threats. People have long known that plants can succumb to viruses just like humans. Now, a new study led by Michigan State University and the University of California, Riverside reveals a previously unknown threat: non-native crop viruses are infecting and jeopardizing the health of wild desert plants.
“For years, the ecological field assumed wild plants were immune to invasive viruses that damage crops,” said Carolyn Malmstrom, a professor of plant biology and ecology, evolution and behavior at MSU and a co-leader of the study. Kerry Mauck, an associate professor and Alfred M. Boyce Endowed Chair in Entomology, was the team leader at UC Riverside and adviser for the lead author Tessa Shates, who was a graduate student in the Mauck Lab.
“But we’ve found that we need to be just as concerned about protecting indigenous plants as we are agricultural ones,” Malmstrom said.
Published in the Phytobiomes Journal, this discovery holds significant implications for conservation efforts. The research utilizes advanced genetic sequencing and field experiments to demonstrate how insects, acting as unwitting infectors, ferry harmful pathogens from cultivated fields to native ecosystems.
The study focused on desert regions of Southern California, where wild squash plants of the genus Cucurbita thrive alongside irrigated agriculture. The team meticulously identified, marked and collected samples from the wild plants.
Then, analyzing the genetic makeup of viruses within these wild plants, the researchers discovered a surprising presence of crop pathogens like cucurbit yellow stunting disorder virus and cucurbit aphid-borne yellows virus, or CABYV.
In fact, they found that infection rates with CABYV — a non-native pathogen — could reach as high as 88% in some wild Cucurbita populations, with visible impacts on plant growth and root health, both vital for the plants’ survival in the harsh desert environment.
“These wild plants are crucial components of desert ecosystems, providing food and habitat for other species,” Malmstrom said. “Their decline from crop virus infections could have cascading effects on entire ecological communities.”
“Our findings should help the greater community recognize that our impacts on the landscapes around us are not always obvious or easy to see,” Shates said. “It’s easy to see the landscape changes of a clear-cut forest, but it is harder to recognize how hitchhiking microbes might change plant community structure over time.”
Plant virology and research at MSU
Collaboration across the country and across fields of research was an important facilitator of this work. Having leading experts in plant biology and entomology contributed to the scientific success of the project, as well as to the growth of its early-career researchers.
“In addition to her own contributions, Dr. Malmstrom has been a great mentor throughout this study,” said Shates, who is now an infectious disease scientist with the research company Quest Diagnostics.
“Doing this research, especially during the pandemic, required learning new skills and expanding my research toolkit,” Shates said. “Dr. Malmstrom was a great resource for recommending technical methods to try for data analysis and generating ‘genomes’ for the viruses in our samples.”
As an AgBioResearch scientist, Malmstrom was also able to tap into both agricultural and natural systems expertise at MSU.
“This project bridges the gap between agriculture and natural systems, reminding us that nature and agriculture are intricately linked,” said Malmstrom. “It also underscores the need for a more holistic approach to managing plant health and shows that understanding the complex dynamics of viruses in natural systems is essential for developing sustainable solutions that benefit both agriculture and biodiversity.”
Photo: Carolyn Malmstrom of Michigan State University (left) with Jaimie Kenney (center, kneeling) and Kerry Mauck of the University of California, Riverside at the Motte Rimrock Reserve, one of the field sites where the researchers discovered that viruses from crops were infecting native plants. Credit: Tessa Shates
Photo: University of California, Riverside researchers Kerry Mauck (left), Tessa Shates (center) and Jaimie Kenney (right, in distance) working in the field to document the prevalence of viral pathogens from agriculture in native plant populations.
Study reveals evidence of violence at a time of crisis in ancient Peru
Analysis of skeletons exhumed at a burial ground dating from the period 500-400 BCE, shortly after the collapse of the Chavín culture, revealed lethal injuries inflicted on men, women and children, as well as signs of material poverty
FUNDAÇÃO DE AMPARO À PESQUISA DO ESTADO DE SÃO PAULO
The transition from the fifth to the fourth century BCE (Before the Common Era) seems to have been a critical period for the Central Andes, a region now part of Peru. Researchers have found evidence of turbulence during the passage from the Middle Formative period (1200-400 BCE) to the Late Formative period (400-1 BCE). Political disintegration and intergroup violence were apparently part of the context, possibly associated with a shift from theocracy to secular government. A new study, published in the journal Latin American Antiquity, strongly reinforces these suppositions.
The study was conducted by a team of Peruvian, Colombian and Brazilian researchers led by Peruvian bioarcheologist Luis Pezo-Lanfranco, then affiliated with the Biological Anthropology Laboratory at the University of São Paulo’s Institute of Biosciences (IB-USP) in Brazil. The project was supported by FAPESP via the Grant Program for Young Investigators in Emerging Research Centers.
“We made a detailed analysis of the skeletal remains of 67 individuals excavated at a burial ground dating from the period 500-400 BCE and located in the Supe Valley region, a few kilometers from Caral, a famous ceremonial center that functioned between 2900 and 1800 BCE. There we detected injury patterns characteristic of repeated events of interpersonal violence. Among the individuals examined, 80% of the adults and adolescents died from inflicted traumatic injuries,” Pezo-Lanfranco told Agência FAPESP. He currently works in the Department of Prehistory at the Autonomous University of Barcelona (UAB) in Spain.
Perimortem injuries to the skull, face and chest observed in several individuals are compatible with lethal, probably intercommunity, violence, whose victims included children. “Our hypothesis is that a group of strangers came to the community and committed the murders. After the aggressors left, the murder victims were buried by their own people with the usual funeral rites, as suggested by the burial patterns,” he said.
Perimortem means at or near the time of death. Bone damage in perimortem injuries shows no evidence of healing. Bone damage in antemortem injuries shows evidence of healing.
Although perimortem trauma was the most frequent type of injury among the adult skeletons studied, as well as in some of the child remains, many examples of antemortem trauma were also found, and several individuals displayed both, suggesting the occurrence of at least two violent events during their lives. The first led to injuries that healed, while the second killed them.
“The markers point to exposure to repetitive and lethal violence during the course of their lives,” Pezo-Lanfranco said. The most frequent injuries were depressed fractures of the cranial vault, other maxillofacial fractures, thoracic fractures (mainly in ribs and scapulae), and “defensive” fractures of the ulna (forearm, indicating an attempt to parry a blow).
Sixty-four of the 67 individuals studied were buried in a fetal position: 12 in dorsal decubitus (lying on their back), four in ventral decubitus (on their stomach), seven in left lateral decubitus (on their left side), and 41 in right lateral decubitus. The fetal position is a recurring burial pattern in prehistoric and ancient communities worldwide. Given its association with the womb, some experts believe it reflects the expectation of rebirth after death.
Besides the signs of violence, the analysis of the bones showed a high incidence of non-specific stresses and infectious diseases, possibly associated with adverse living conditions due to a combination of a shortage of resources and population growth. The simplicity of most of the grave goods also points to poverty. Many of the skeletons were buried with plain cotton fabric, woven mats and basketry, gourds containing vegetables, cotton seeds and roots, necklaces, and pottery. “Stable isotope studies showed that staple crops were the basis for their subsistence,” Pezo-Lanfranco said.
Competition for scant resources in the Supe Valley region was probably a major factor in the collapse of the Chavín culture, which spread through Peru’s mountains and coast between 1200 and 500 BCE. Its center was Chavín de Huantar, a monumental ceremonial site in northern Peru in the Marañon River basin. The Marañón rises in the Peruvian Andes at about 5,800 m, first flowing northwest and then turning northeast to meet the Ucayali and become the Upper Amazon and Solimões in Brazil.
“The Chavín system reached exhaustion during the Middle to Late Formative transition, around 500-400 BCE. Several ceremonial centers, including Chavín de Huantar, were desacralized and abandoned. Political formations organized around the religious sphere disintegrated, perhaps characterizing the decline of theocracy and the emergence of secular government,” Pezo-Lanfranco said.
The Chavín people worshipped a “zooanthropomorphic” deity resembling a man-jaguar. Gods that combine animal and human attributes are featured in many ancient cultures around the world, including those of Crete, India and Egypt. In a purely speculative approach, some scholars think they may be later re-elaborations of prehistoric shamanic traditions in which the virtues of tutelary animals are syncretized in the figure of the shaman. This hypothesis cannot be confirmed on the basis of existing knowledge.
The name of the Chavín man-jaguar god is unknown. Unlike ancient civilizations in the Old World, the Andean people who worshiped the deity left no written records that could be deciphered to furnish more detailed information. It is worth stressing that the period in question preceded the formal establishment of the Inca Empire by almost 2,000 years. Founded by Pachacuti in 1438 CE (Common Era), the Inca Empire was the ultimate expression of thousands of years of Andean civilizations, yet it lasted for less than 100 years. The Spanish executed the last reigning Inca Emperor, Atahualpa, in 1533, and in 1572 captured and killed Túpac Amaru in Vilcabamba, where he had been leading the resistance.
For the researchers who conducted the study, the results are particularly important because of what they reveal about an era of ancient Andean history that has so far been poorly documented. Few burial grounds from the period have been excavated in the Central Andes, and fewer still contain remains as well preserved as these. Their preservation is due mainly to the region’s dry climate, which permitted detailed observation of injuries in almost intact bones.
“The study belongs to a field we call the ‘bioarcheology of violence,’ which helps us understand the nature of interpersonal conflict around the middle of the first millennium before the Common Era. Data from the same analysis, to be published soon, also offer several answers regarding the factors that modulated morbidity and mortality in this society, which developed in a hypothetical context of population pressure and political transition associated with the collapse of belief systems in a highly resource-poor environment,” Pezo-Lanfranco said.
About São Paulo Research Foundation (FAPESP)
The São Paulo Research Foundation (FAPESP) is a public institution with the mission of supporting scientific research in all fields of knowledge by awarding scholarships, fellowships and grants to investigators linked with higher education and research institutions in the State of São Paulo, Brazil. FAPESP is aware that the very best research can only be done by working with the best researchers internationally. Therefore, it has established partnerships with funding agencies, higher education, private companies, and research organizations in other countries known for the quality of their research and has been encouraging scientists funded by its grants to further develop their international collaboration. You can learn more about FAPESP at www.fapesp.br/en and visit FAPESP news agency at www.agencia.fapesp.br/en to keep updated with the latest scientific breakthroughs FAPESP helps achieve through its many programs, awards and research centers. You may also subscribe to FAPESP news agency at http://agencia.fapesp.br/subscribe.
Bioarchaeological Evidence of Violence between the Middle and Late Formative (500–400 BC) in the Peruvian North-Central Coast
UK's summer 2022 drought provides warning for future years
Scientists say improved real-time monitoring and forecasting systems would inform early mitigation measures
UK CENTRE FOR ECOLOGY & HYDROLOGY
The UK will be increasingly tested by more droughts like 2022, emphasising the importance of being prepared for similar extreme weather in future, say scientists who have analysed that summer’s events.
The newly published study by the UK Centre for Ecology & Hydrology (UKCEH) outlines how the drought evolved and its impacts on water resources, wildlife and people, compares the situation with previous droughts, and considers whether it is an indication of future events.
Summer 2022 was the joint hottest (with 2018) and fifth driest since the 1890s. The drought affected large parts of the country and was the worst in some areas since 1976. It was part of a wider European drought, believed to be the worst on the continent in 500 years.
The prolonged, exceptional heat, dry soils and low river flows had impacts across much of the UK, including water restrictions – six companies introduced hosepipe bans affecting around 20 million people – and restrictions on navigation of waterways.
Extensive challenges for agriculture included low crop and milk yields, as well as dying grass in grazing fields that forced farmers to use winter food stores. During the summer, there were nearly 25,000 wildfires; they spread easily across dry fields and also affected urban areas. Environmental impacts included algal blooms and fish kills.
A Level 4 heat health alert was issued for the first time since its introduction in 2004, and there were an estimated 2,800 excess deaths among over-65s due to heat between June and August.
That summer’s events underline our continuing vulnerability to intense droughts associated with low spring/summer rainfall alongside very high temperatures – especially given that it followed shortly after another intense summer drought, in 2018.
UKCEH hydrologist Jamie Hannaford, one of the authors of the study, said: “The 2022 drought posed significant challenges to water management and communication with the public given the speed of onset of drought conditions and impacts. It has provided water managers with an important stress test, enabling them to assess our resilience to the kind of extreme event that we will see much more of in future.”
Hydrologists classify 2022 as a summer drought, which developed relatively quickly, as opposed to a multi-year drought driven by successive dry winters. While there is significant uncertainty about how multi-year droughts may evolve in future, scientists are highly confident, based on modelling, that we will be increasingly tested by more droughts like 2022. Human-driven climate warming increases the risk of droughts like 2018 and 2022, associated with drier summers and higher temperatures.
The authors of the study, published in the Royal Meteorological Society journal Weather, say the impacts on water supply were relatively modest in terms of duration and areas affected. Like 2018, this was largely due to wetter winters before and after the drought.
They say droughts like 2022 emphasise the need for improved real-time monitoring and forecasting systems. This would give an indication of the likely impacts that may lie ahead, to help apply mitigation measures – such as restrictions on abstractions or efforts to safeguard the environment like fish rescues – at an early stage.
UKCEH oversees COSMOS-UK, a long-term network of soil moisture monitoring sites, producing live data, which was used for the 2022 drought study.
It is also leading the development of a Floods and Droughts Research Infrastructure (FDRI), funded by UK Research and Innovation (UKRI). The new instruments will produce an extensive range of new measurements across several UK catchments. The data will enable researchers to improve computer models to predict when and where droughts and floods will happen, and how severe they will be.
– Ends –
Paper information
Barker et al. 2024. An appraisal of the severity of the 2022 drought and its impacts. Weather. DOI: 10.1002/wea.4531. Open access.
For interviews and more information, please contact Simon Williams, Media Relations Officer at UKCEH, via simwil@ceh.ac.uk or +44 (0)7920 295384.
About the UK Centre for Ecology & Hydrology (UKCEH)
The UK Centre for Ecology & Hydrology is a centre for excellence in environmental science across water, land and air. We have a long history of investigating, monitoring and modelling environmental change, and our science makes a positive difference in the world.
Combining expertise in hydro-meteorology with data derived from national monitoring networks, we measure and model water to accurately predict, mitigate and manage the impacts of floods and droughts.
The UK Centre for Ecology & Hydrology is a strategic delivery partner for the Natural Environment Research Council, part of UK Research and Innovation.
An appraisal of the severity of the 2022 drought and its impacts
Movement of crops, animals played a key role in domestication
WASHINGTON UNIVERSITY IN ST. LOUIS
Archaeologist Xinyi Liu at Washington University in St. Louis teamed up with Martin Jones of the University of Cambridge to write a new paper for the Proceedings of the National Academy of Sciences that explains how recent research is connecting the science of biological domestication to early food globalization.
Liu, an associate professor of archaeology and associate chair of the Department of Anthropology in Arts & Sciences, proposes a new conceptual framework for understanding domestication, one relevant not only to anthropology but also to fields such as biology and ecology.
In this Q&A, he also offers his perspective on how understanding past conditions can help us forge a vision for the future.
***
The domestication of plants and animals is among the most significant transitions in human history. How has our understanding of domestication changed recently?
Our new article focuses on how we conceptualize domestication. A considerable intellectual legacy has depicted domestication as a series of short-lived, localized and episodic events. Some of the literature, particularly those pieces dating back to the early 20th century, envisioned the process as a transition from humans within nature to humans controlling nature in a revolutionary fashion.
The metaphor there is “revolution.” So, as people described it, there was a “Neolithic Revolution” that functioned in a similar way as the “Industrial Revolution” or the “scientific revolution” — a rapid technological shift followed by changes in societies, according to some narratives.
It is time to reconsider all this. Newly emergent evidence from the last 15 years challenges the idea of rapid domestication. This evidence shows unambiguously that plant and animal domestication in a range of species entailed a more gradual transition spanning a few thousand years across extensive geographies.
How has archaeology contributed to this line of inquiry?
Much of this evidence was brought to light by archaeological and scientific investigations. For example, archaeobotanical work in the Near East shows that it took about 5,000 years for the domestication traits of wheat to develop fully from its wild morphology. In the lower Yangtze Valley in China, research revealed a similar process: ancient communities had cultivated rice for a few millennia before the plant reached a domesticated state, in the biological sense.
So domestication has extended in time. But you also argue that it has extended in space. What does that mean?
Over the last 15 years we’ve also seen an improvement in the understanding of how people have moved domesticated plants and animals over continents. In some cases, people moved crops and stocks before the genetic changes associated with domestication were fully fixed within the species. This raises questions about the role translocations played in the domestication process.
Central to our inquiry is the relationship between domesticated crops and stocks and their free-living ancestors, or progenitors. Newer genetic evidence suggests that long-term gene flow between wild and domestic species was much more common than previously appreciated.
It makes sense: At the so-called domestication center, where ancestral varieties were dominant, such gene flow would have been very strong. No meaningful mechanism could have stopped the introgression.
But if farmers took their crops, or herders their stocks, and moved to a new environment beyond the natural distribution of the ancestors, then selection pressures would have changed dramatically. Eventually you are domesticating along a single pathway, with no return. Such a process has been documented genetically and archaeologically in a number of domesticated species, such as maize and wheat.
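A toy calculation can make this mechanism concrete. Below is a minimal single-locus sketch in Python under simplified assumptions of my own, not a model from the paper: a “domestication” allele favored under cultivation typically fails to fix while migrants keep arriving from wild ancestors, but sweeps to fixation once the population moves beyond their range.

import random

def simulate(selection=0.05, migration=0.10, pop=500, gens=400, p0=0.05):
    """Toy single-locus model: p is the frequency of a 'domestication'
    allele favored under cultivation. Each generation, a fraction
    `migration` of gene copies arrives from wild plants carrying only
    the ancestral allele; the finite population then drifts."""
    p = p0
    for _ in range(gens):
        p = p * (1 + selection) / (1 + selection * p)  # selection under cultivation
        p = (1 - migration) * p                        # swamping by wild gene flow
        p = sum(random.random() < p for _ in range(2 * pop)) / (2 * pop)  # drift
        if p in (0.0, 1.0):
            break
    return p

random.seed(1)
print("with gene flow from wild ancestors:", simulate(migration=0.10))  # typically lost
print("beyond the ancestors' range (no gene flow):", simulate(migration=0.0))  # typically fixes

The qualitative contrast is the point: recurrent introgression near the domestication center swamps selection on the new trait, while translocation beyond the ancestors’ range removes the migration term and lets the trait fix, the “single pathway, with no return.”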
How do human preferences or traditions factor in?
If crop or stock movements were entangled with the domestication process, newly introduced species would have had to adapt to the new physical environments they encountered. But they would also have been adapted to align with new cultural habits. We envision that both physical and cultural adaptation played roles in the fixation of some domestication traits.
Does this research have any implications for modern agriculture?
Understanding past conditions can help us forge visions of the future. In that sense, archaeology plays a key role in establishing the historical and community roots of a range of contemporary challenges, such as food security, planetary health and sustainability, providing solutions that draw on human experience at the deepest level.
One such example is the positive impact that archaeogenetic research on millet has had on the livelihoods of farmers across the globe. At its 75th session, the United Nations General Assembly declared 2023 the International Year of Millets to raise awareness of the crop’s deep community roots and future potential. There has been considerable recent momentum in understanding the biodiversity and historical geography of millets, a diverse group of cereals originating from several continents that includes pearl, proso (or broomcorn), foxtail, barnyard, little, kodo, browntop, finger and fonio millets.
Millets can grow on arid lands with minimal inputs and are resilient to changes in climate. They are, therefore, an ideal solution for communities to increase self-sufficiency and reduce reliance on imported cereal grains.
These grains once sustained ancient populations. Archaeology played a key role in establishing the original biogeography, domestication and early dispersals of millets. The knowledge we have gained has consequently had a profound impact on food security and conservation in areas where millets are culturally relevant.