Sunday, September 22, 2024

 

A two-dose schedule could make HIV vaccines more effective



MIT researchers find that the first dose primes the immune system, helping it generate a strong response to the second dose given a week later



Massachusetts Institute of Technology




One major reason why it has been difficult to develop an effective HIV vaccine is that the virus mutates very rapidly, allowing it to evade the antibody response generated by vaccines. 

Several years ago, MIT researchers showed that administering a series of escalating doses of an HIV vaccine over a two-week period could help overcome part of that challenge by generating larger quantities of neutralizing antibodies. However, a multidose vaccine regimen administered over such a short time is not practical for mass vaccination campaigns. 

In a new study, the researchers have now found that they can achieve a similar immune response with just two doses, given one week apart. The first dose, which is much smaller, prepares the immune system to respond more powerfully to the second, larger dose.  

This study, which was performed by bringing together computational modeling and experiments in mice, used an HIV envelope protein as the vaccine. A single-dose version of this vaccine is now in clinical trials, and the researchers hope to establish another study group that will receive the vaccine on a two-dose schedule.

“By bringing together the physical and life sciences, we shed light on some basic immunological questions that helped develop this two-dose schedule to mimic the multiple-dose regimen,” says Arup Chakraborty, the John M. Deutch Institute Professor at MIT and a member of MIT’s Institute for Medical Engineering and Science and the Ragon Institute of MIT, MGH and Harvard University.

This approach may also generalize to vaccines for other diseases, Chakraborty notes.

Chakraborty and Darrell Irvine, a former MIT professor of biological engineering and materials science and engineering and member of the Koch Institute for Integrative Cancer Research, who is now a professor of immunology and microbiology at the Scripps Research Institute, are the senior authors of the study, which appears in Science Immunology. The lead authors of the paper are Sachin Bhagchandani PhD ’23 and Leerang Yang PhD ’24.

Neutralizing antibodies

Each year, HIV infects more than 1 million people around the world, and some of those people do not have access to antiviral drugs. An effective vaccine could prevent many of those infections. One promising vaccine now in clinical trials consists of an HIV protein called an envelope trimer, along with a nanoparticle called SMNP. The nanoparticle, developed by Irvine’s lab, acts as an adjuvant that helps recruit a stronger B cell response to the vaccine. 

In clinical trials, this vaccine and other experimental vaccines have been given as just one dose. However, there is growing evidence that a series of doses is more effective at generating broadly neutralizing antibodies. The researchers believe the seven-dose escalating regimen from their earlier work succeeds because it mimics what happens when the body is exposed to a virus: The immune system builds up a strong response as more viral proteins, or antigens, accumulate in the body. 

In the new study, the MIT team investigated how this response develops and explored whether they could achieve the same effect using a smaller number of vaccine doses. 

“Giving seven doses just isn’t feasible for mass vaccination,” Bhagchandani says. “We wanted to identify some of the critical elements necessary for the success of this escalating dose, and to explore whether that knowledge could allow us to reduce the number of doses.” 

The researchers began by comparing the effects of one, two, three, four, five, six, or seven doses, all given over a 12-day period. They initially found that while three or more doses generated strong antibody responses, two doses did not. However, by tweaking the dose intervals and ratios, the researchers discovered that giving 20 percent of the vaccine in the first dose and 80 percent in a second dose, seven days later, achieved just as good a response as the seven-dose schedule.

“It was clear that understanding the mechanisms behind this phenomenon would be crucial for future clinical translation,” Yang says. “Even if the ideal dosing ratio and timing may differ for humans, the underlying mechanistic principles will likely remain the same.”

Using a computational model, the researchers explored what was happening in each of these dosing scenarios. This work showed that when all of the vaccine is given as one dose, most of the antigen gets chopped into fragments before it reaches the lymph nodes. Lymph nodes are where B cells become activated to target a particular antigen, within structures known as germinal centers.

When only a tiny amount of the intact antigen reaches these germinal centers, B cells can’t come up with a strong response against that antigen.

However, a very small number of B cells do arise that produce antibodies targeting the intact antigen. So, giving a small amount in the first dose does not “waste” much antigen but allows some B cells and antibodies to develop. If a second, larger dose is given a week later, those antibodies bind to the antigen before it can be broken down and escort it into the lymph node. This allows more B cells to be exposed to that antigen and eventually leads to a large population of B cells that can target it.

“The early doses generate some small amounts of antibody, and that’s enough to then bind to the vaccine of the later doses, protect it, and target it to the lymph node. That's how we realized that we don't need to give seven doses,” Bhagchandani says. “A small initial dose will generate this antibody and then when you give the larger dose, it can again be protected because that antibody will bind to it and traffic it to the lymph node.”
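The team's actual analysis used detailed computational models, but the core trade-off can be caricatured in a few lines of code. The sketch below is a minimal illustration, not the authors' model: every rate constant is an invented, dimensionless assumption, chosen only to show how antibody-assisted trafficking from a small priming dose can let a 20/80 split deliver more intact antigen to the lymph nodes than a single full dose.

```python
# Toy model of the proposed mechanism (NOT the authors' model).
# Intact antigen either degrades or is trafficked to the lymph node;
# antibody raised by the priming dose boosts the trafficking rate.
# All constants are invented, dimensionless assumptions.

K_DEG = 10.0   # degradation rate of free antigen (assumed)
K_LN = 0.5     # baseline trafficking rate to lymph nodes (assumed)
K_AB = 500.0   # antibody-assisted trafficking rate per unit antibody (assumed)

def delivered_fraction(antibody: float) -> float:
    """Fraction of a dose reaching the lymph node intact:
    trafficking competes with degradation."""
    k_eff = K_LN + K_AB * antibody
    return k_eff / (k_eff + K_DEG)

def antibody_after_prime(dose: float) -> float:
    """Crude assumption: antibody present a week later is proportional
    to the intact antigen the priming dose delivered."""
    return 0.2 * dose * delivered_fraction(0.0)

single_dose = 1.0 * delivered_fraction(0.0)   # full dose, no antibody help
prime, boost = 0.2, 0.8                       # the 20/80 split from the study
two_dose = (prime * delivered_fraction(0.0)
            + boost * delivered_fraction(antibody_after_prime(prime)))

print(f"intact antigen delivered, single dose: {single_dose:.3f}")
print(f"intact antigen delivered, 20/80 split: {two_dose:.3f}")
```

With these made-up constants the split schedule delivers roughly twice the intact antigen of a single dose; the real magnitude of the gain depends on kinetics the study measured in mice.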

T-cell boost

Those antigens may stay in the germinal centers for weeks or even longer, allowing more B cells to come in and be exposed to them, making it more likely that diverse types of antibodies will develop.

The researchers also found that the two-dose schedule induces a stronger T-cell response. The first dose activates dendritic cells, which promote inflammation and T-cell activation. Then, when the second dose arrives, even more dendritic cells are stimulated, further boosting the T-cell response.

Overall, the two-dose regimen resulted in a fivefold improvement in the T-cell response and a 60-fold improvement in the antibody response, compared to a single vaccine dose.

“Reducing the ‘escalating dose’ strategy down to two shots makes it much more practical for clinical implementation. Further, a number of technologies are in development that could mimic the two-dose exposure in a single shot, which could become ideal for mass vaccination campaigns,” Irvine says. 

The researchers are now studying this vaccine strategy in a nonhuman primate model. They are also working on specialized materials that can deliver the second dose over an extended period of time, which could further enhance the immune response.

The research was funded by the Koch Institute Support (core) Grant from the National Cancer Institute, the National Institutes of Health, and the Ragon Institute of MIT, MGH, and Harvard.

###

Written by Anne Trafton, MIT News



FAA proposes $630,000 fine against SpaceX for 2023 regulatory violations




A SpaceX Falcon Heavy rocket launches the Jupiter 3/EchoStar 24 communications satellite for the Hughes Network System at the Kennedy Space Center on July 28, 2023. The FAA on Tuesday proposed fines against SpaceX over that launch and another. File Photo by Joe Marino/UPI

Sept. 18 (UPI) -- The Federal Aviation Administration on Tuesday said it plans to fine SpaceX for regulatory violations.

The FAA proposed to fine the company more than $630,000, saying it failed to comply with regulations during two launches in 2023.

SpaceX on June 18, 2023, used an unapproved launch control room for the PSN Satria mission and did not conduct the required T-2 hour poll, the FAA said. On July 28, 2023, the company used an unapproved rocket propellant farm for the EchoStar XXIV/Jupiter 3 mission.

"Safety drives everything we do at the FAA, including a legal responsibility for the safety oversight of companies with commercial space transportation licenses," FAA Chief Counsel Marc Nichol said. "Failure of a company to comply with the safety requirements will result in consequences."

The proposed fine drew the ire of SpaceX's founder Elon Musk, who threatened legal action against the agency.

"SpaceX will be filing suit against the FAA for regulatory overreach," Musk said on X.

SpaceX remains in a holding pattern with the FAA over the fifth test flight of its Starship, which is expected to deliver humans and goods back to the Moon and eventually to Mars. The FAA has been slow to approve the flight even though SpaceX says it has been ready since August.

SpaceX railed against the FAA on Sept. 10 for its latest delay.

"We recently received a launch license date estimate of late November from the FAA," SpaceX said in a statement. "This is a more than two-month delay to the previously communicated date of mid-September.

"This delay was not based on a new safety concern but instead driven by superfluous environmental analysis. The four open environmental issues are illustrative of the difficulties launch companies face in the current regulatory environment for launch and reentry licensing."

SkAI launched to further explore universe



NCSA and partner institutions received a $20 million grant to create new artificial intelligence tools for space exploration



National Center for Supercomputing Applications





Funded by a five-year, $20 million grant from the U.S. National Science Foundation (NSF) and the Simons Foundation, the National Center for Supercomputing Applications is partnering with other academic institutions and federal laboratories in the Midwest to develop new artificial intelligence (AI) tools to advance astrophysics research and exploration of the universe.

Led by Northwestern University, the collaboration will establish the NSF-Simons AI Institute for the Sky, or SkAI (pronounced “sky”), one of two AI research centers that will help astronomers better understand the cosmos.

Located close to NCSA’s home in Illinois, SkAI will bring together researchers to create and deploy innovative AI mechanisms that will analyze large datasets from astronomical surveys and transform physics-based simulations in pursuit of furthering deep space exploration. SkAI will unite 83 team members from 25 partner organizations, including NCSA, Northwestern University, the University of Illinois Urbana-Champaign, the University of Chicago, Argonne National Laboratory, Fermi National Accelerator Laboratory, the University of Illinois Chicago, the Adler Planetarium and University of Wisconsin-Madison.

“Our mission at the Center for AstroPhysical Surveys (CAPS) in NCSA has been to bring together innovative software and cutting-edge hardware to tackle the most pressing questions in the universe,” said SkAI co-principal investigator and CAPS Deputy Director Gautham Narayan. “We’re very excited to have our students, postdocs, faculty and staff deepen our involvement with our colleagues at Northwestern and U of Chicago, provide the entire SkAI community access to NSF’s Delta and DeltaAI supercomputers here at NCSA and build tools and services that lead to AI methods becoming more interpretable and reliable. Our goal is to democratize AI and make it more trustworthy – not just for astrophysics and cosmology, or our campus, but for everyone. This is a big leap forward, and Illinois will lead the way.”

“I am thrilled to receive this opportunity to work with our amazing cross-disciplinary, multi-institutional team, so we can accelerate the data-driven revolution that wide and deep sky surveys will bring to the field of astronomy,” said Northwestern’s Vicky Kalogera, SkAI director and principal investigator of the grant. “We will transform our astrophysical understanding across an enormous range of scales – from stars and the transients they produce to the evolving galaxies they live in, the black holes they form, and to the dark sector of the universe and its cosmological origins.”

Astronomical surveys like the Vera C. Rubin Observatory and the CMB-Stage 4 experiment promise transformational advances in astrophysics and cosmology. Realizing these breakthroughs will require overcoming enormous challenges in data analysis, modeling and experiment design. Researchers will need tools like those SkAI will provide to interpret increasingly large and complex datasets.

“The massive amount of data that will be gathered in the coming years by the NSF-DOE Vera C. Rubin Observatory and other large-scale astronomical projects is simply too vast and rich to be fully explored with existing methods. With reliable and trustworthy AI in their toolbox, everyone from students to senior researchers will have exciting new ways to gain valuable insights leading to amazing discoveries that might otherwise remain hidden in the data,” said NSF Director Sethuraman Panchanathan.

SkAI will also be a bastion for open science and will emphasize diversity of disciplines and researchers in its work. Building on existing partnerships with urban and rural community colleges, minority-serving institutions, youth organizations, the Adler Planetarium, artists and art organizations, and industry professionals, the new institute will develop Astro-AI educational and workforce-development resources for learners from high school students through postdocs. SkAI will engage the public around Astro-AI concepts and enhance local, regional and national research capacity and knowledge transfer to the community, higher-learning institutions and industry. Activities will be designed to lower barriers to access and grow a more diverse STEM workforce.

“Our research will be guided by AI ethics principles and all SkAI members will be trained in key AI ethics practices,” Narayan said. “Our commitment to open collaboration ensures that SkAI research products are adopted widely. We will develop new, trustworthy AI tools in an open-source ecosystem and train a diverse generation of scientists and engineers to ethically apply and extend AI within academia and beyond.

“University of Illinois leadership and their vision in creating the Center for AstroPhysical Surveys has enabled us to get a head start in these areas, and was crucial to the success of SkAI, and their continuing support gives us the freedom and ability to innovate. By pushing the frontier of computing, we’re reaching ever more distant horizons in the universe.”

For more details on SkAI, check out the announcements from NSF and Northwestern University.


ABOUT CAPS

NCSA’s Center for AstroPhysical Surveys (CAPS) brings cohesion and advancement to astrophysical survey science efforts across the University of Illinois Urbana-Champaign. CAPS offers annual postdoctoral and graduate fellowship programs, publishes vast amounts of astronomical data from a broad portfolio of astrophysical surveys and collaborates across disciplines to enable trailblazing research.

 

Wastewater monitoring can detect foodborne illness, researchers find



Results of a new study suggest sewage monitoring could provide early warning of foodborne disease outbreaks to public health authorities



Penn State

Image: A sewage treatment plant in central Pennsylvania. The researchers tested raw sewage samples collected twice a week from two treatment plants in the region for non-typhoidal Salmonella and characterized isolates using whole genome sequencing, recovering 43 Salmonella isolates. (Credit: Provided by Ed Dudley/Penn State)





UNIVERSITY PARK, Pa. — First used in the 1940s to monitor for polio, wastewater surveillance proved such a powerful disease monitoring tool that the U.S. Centers for Disease Control and Prevention (CDC) established the National Wastewater Surveillance System to support SARS-CoV-2 monitoring in September of 2020. Now, a team of scientists from Penn State and the Pennsylvania Department of Health have shown that domestic sewage monitoring is useful for a foodborne pathogen as well. 

In findings published today (Sept. 19) in the Journal of Clinical Microbiology, the researchers report that the bacterium Salmonella enterica was detected in samples from two wastewater treatment plants in central Pennsylvania during June 2022.

“Non-typhoidal Salmonella is a common cause of gastroenteritis worldwide, but current surveillance for the disease is suboptimal, so in this research we evaluated the utility of wastewater monitoring to enhance surveillance for this foodborne pathogen,” said Nkuchia M’ikanatha, lead epidemiologist at the Pennsylvania Department of Health and an affiliated researcher in Penn State’s Department of Food Science in the College of Agricultural Sciences. “Wastewater testing can detect traces of infectious diseases circulating in a community, even in asymptomatic individuals, offering an early warning system for potential outbreaks.”

While health care providers are required to report salmonellosis cases, many go undetected. Salmonella bacteria, inhabiting the intestines of animals and humans, are shed in feces. The CDC estimates Salmonella causes roughly 1.35 million infections, 26,500 hospitalizations and 420 deaths annually in the U.S., primarily through contaminated food.

In June 2022, the researchers tested raw sewage samples collected twice a week from two treatment plants in central Pennsylvania for non-typhoidal Salmonella and characterized isolates using whole genome sequencing. They recovered 43 Salmonella isolates from the wastewater samples, which genomic analysis differentiated into seven serovars, groupings of microorganisms distinguished by their surface antigens. Eight of the isolates, or nearly 20%, were from a rare type of Salmonella called Baildon.

The researchers assessed genetic relatedness and epidemiologic links between the non-typhoidal Salmonella isolates from wastewater and bacteria from patients with salmonellosis. The Salmonella Baildon isolates from wastewater were genetically indistinguishable from an isolate obtained from a patient associated with a salmonellosis outbreak in the same area during the same period. Salmonella Baildon from wastewater and 42 outbreak-related isolates in the national outbreak detection database had the same genetic makeup. One of the 42 outbreak-related isolates was obtained from a patient residing in the wastewater study's sample collection catchment area, which serves approximately 17,000 people. 
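For readers unfamiliar with the terminology, "genetically indistinguishable" is typically established by counting single-nucleotide differences between sequenced genomes. The sketch below is a toy illustration of that idea, not the study's actual pipeline; the sequences, sample names and threshold are all invented.

```python
# Toy illustration of SNP-distance clustering (NOT the study's pipeline).
# Isolates whose aligned genomes differ at fewer positions than a chosen
# threshold are treated as belonging to the same cluster/outbreak.

from itertools import combinations

def snp_distance(a: str, b: str) -> int:
    """Count positions where two aligned sequences differ."""
    assert len(a) == len(b), "sequences must be aligned to equal length"
    return sum(x != y for x, y in zip(a, b))

# Invented miniature "genomes" for demonstration only.
isolates = {
    "wastewater_01": "ACGTACGTAC",
    "wastewater_02": "ACGTACGTAT",
    "patient_A":     "ACGTACGTAC",
    "unrelated":     "TTGTACCTAC",
}

THRESHOLD = 2  # illustrative cutoff; real analyses choose this carefully

for (name1, seq1), (name2, seq2) in combinations(isolates.items(), 2):
    d = snp_distance(seq1, seq2)
    verdict = "same cluster" if d <= THRESHOLD else "distinct"
    print(f"{name1} vs {name2}: {d} SNPs -> {verdict}")
```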

Salmonella Baildon is a rare serovar — reported in less than 1% of cases nationally over five years, noted M’ikanatha, the study’s first author. He pointed out that this research demonstrates the value of monitoring sewage from a defined population to supplement traditional surveillance methods for evidence of Salmonella infections and to determine the extent of outbreaks.

“Using whole genome sequencing, we showed that isolates of variant Salmonella Baildon clustered with those from an outbreak that occurred in a similar time frame,” he said. “Case reports were primarily from Pennsylvania, and one individual lived within the treatment plant catchment area. This study provides support for using domestic sewage surveillance in assisting public health agencies to identify communities impacted by infectious diseases.”

Ed Dudley, a professor of food science and the senior author on the study, said these findings highlight the potential of wastewater monitoring as an early warning system for foodborne disease outbreaks, potentially even before physicians and laboratories report cases. This proactive approach could enable health officials to swiftly trace the source of contaminated food, ultimately reducing the number of people affected, suggested Dudley, who also directs Penn State’s E. coli Reference Center.

“While it may not happen overnight, I foresee a future where many, if not most, domestic wastewater treatment plants contribute untreated sewage samples for monitoring evidence of various illnesses,” he said. “This would likely involve collaboration among public health agencies, academia and federal entities, much like our pilot study. I see this as yet another crucial lesson from the pandemic.”

Contributing to the research at Penn State were Jasna Kovac, associate professor of food science and Lester Earl and Veronica Casida Career Development Professor of Food Safety; Erin Nawrocki and Yezhi Fu, postdoctoral scholars in the Dudley Lab; Zoe Goldblum, undergraduate researcher in the Department of Food Science; and Nicholas Cesari, Division of Infectious Disease Epidemiology, Pennsylvania Department of Health.

The CDC, the U.S. Food and Drug Administration and the U.S. Department of Agriculture’s National Institute of Food and Agriculture provided funding for this research.

Image: A researcher prepares samples to search for Salmonella in sewage.

 

New data science tool greatly speeds up molecular analysis of our environment


UC Riverside-led team developed the tool through an international virtual research group



University of California - Riverside

Daniel Petras 

image: 

Photo shows Daniel Petras.

view more 

Credit: Petras lab, UC Riverside.




RIVERSIDE, Calif. -- A research team led by scientists at the University of California, Riverside, has developed a computational workflow for analyzing large data sets in the field of metabolomics, the study of small molecules found within cells, biofluids, tissues, and entire ecosystems.

Most recently, the team applied this new computational tool to analyze pollutants in seawater in Southern California. The team swiftly captured the chemical profiles of coastal environments and highlighted potential sources of pollution.

“We are interested in understanding how such pollutants get introduced in the ecosystem,” said Daniel Petras, an assistant professor of biochemistry at UC Riverside, who led the research team. “Figuring out which molecules in the ocean are important for environmental health is not straightforward because of the ocean’s sheer chemical diversity. The protocol we developed greatly speeds up this process. More efficient sorting of the data means we can understand problems related to ocean pollution faster.”

Petras and his colleagues report in the journal Nature Protocols that their protocol is designed not only for experienced researchers but also for educational purposes, making it an ideal resource for students and early-career scientists. This computational workflow is accompanied by an accessible web application with a graphical user interface that makes metabolomics data analysis accessible for non-experts and enables them to gain statistical insights into their data within minutes. 

“This tool is accessible to a broad range of researchers, from absolute beginners to experts, and is tailored for use in conjunction with the molecular networking software my group is developing,” said coauthor Mingxun Wang, an assistant professor of computer science and engineering at UCR. “For beginners, the guidelines and code we provide make it easier to understand common data processing and analysis steps. For experts, it accelerates reproducible data analysis, enabling them to share their statistical data analysis workflows and results.”
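As a concrete picture of the kind of processing such workflows automate, here is a minimal sketch of two steps common in non-targeted metabolomics: normalizing a sample-by-feature intensity table and projecting it with principal component analysis. The tiny random table is invented for demonstration, and this is not the published protocol's code.

```python
# Minimal sketch of common metabolomics statistics (NOT the published
# workflow): TIC normalization of a feature table followed by PCA.

import numpy as np

rng = np.random.default_rng(0)
# Invented intensity table: 6 samples x 40 metabolite features.
feature_table = rng.lognormal(mean=2.0, sigma=1.0, size=(6, 40))

# 1) Total-ion-current (TIC) normalization: each sample sums to 1,
#    correcting for differences in total signal between runs.
tic = feature_table / feature_table.sum(axis=1, keepdims=True)

# 2) PCA via SVD on the log-transformed, mean-centered table.
X = np.log1p(tic)
X -= X.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * S                       # sample coordinates on the PCs
explained = S**2 / np.sum(S**2)      # variance explained per component

print("variance explained by PC1, PC2:", explained[:2].round(3))
print("PC1/PC2 scores per sample:\n", scores[:, :2].round(3))
```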

Petras explained that the research paper is unique in serving as a large educational resource, organized through a virtual research group called the Virtual Multiomics Lab, or VMOL. With more than 50 scientists participating from around the world, VMOL is a community-driven, open-access initiative. It aims to simplify and democratize the chemical analysis process, making it accessible to researchers worldwide, regardless of their background or resources.

“I’m incredibly proud to see how this project evolved into something impactful, involving experts and students from across the globe,” said Abzer Pakkir Shah, a doctoral student in Petras’ group and the first author of the paper. “By removing physical and economic barriers, VMOL provides training in computational mass spectrometry and data science and aims to launch virtual research projects as a new form of collaborative science.”

All software the team developed is free and publicly available. The software development was initiated during a summer school for non-targeted metabolomics in 2022 at the University of Tübingen, where the team also launched VMOL.

Petras expects the protocol will be especially useful to environmental researchers as well as scientists working in the biomedical field and researchers doing clinical studies in microbiome science.

“The versatility of our protocol extends to a wide range of fields and sample types, including combinatorial chemistry, doping analysis, and trace contamination of food, pharmaceuticals, and other industrial products,” he said.

Petras received his master’s degree in biotechnology from the University of Applied Science Darmstadt and his doctoral degree in biochemistry from the Technical University Berlin. He did postdoctoral research at UC San Diego, where he focused on the development of large-scale environmental metabolomics methods. In 2021, he launched the Functional Metabolomics Lab at the University of Tübingen. In January 2024 he joined UCR, where his lab focuses on the development and application of mass spectrometry-based methods to visualize and assess chemical exchange within microbial communities.

The title of the paper is “Statistical analysis of feature-based molecular networking results from non-targeted metabolomics data.” 

The University of California, Riverside is a doctoral research university, a living laboratory for groundbreaking exploration of issues critical to Inland Southern California, the state and communities around the world. Reflecting California's diverse culture, UCR's enrollment is more than 26,000 students. The campus opened a medical school in 2013 and has reached the heart of the Coachella Valley by way of the UCR Palm Desert Center. The campus has an annual impact of more than $2.7 billion on the U.S. economy. To learn more, visit www.ucr.edu.

 

Study reveals large ocean heat storage efficiency during the last deglaciation




Institute of Atmospheric Physics, Chinese Academy of Sciences
Image: Melting ice in the Southern Ocean. (Credit: Jiuxin Shi)





As one of the largest heat reservoirs in the climate system, the global ocean absorbs more than 90% of the excess energy from ongoing anthropogenic warming. In the last century, the greatest warming in the ocean has occurred in the upper 500 m, with relatively weak warming in the deep ocean, corresponding to a small ocean heat storage efficiency of ~0.1.

 

Paleoceanographic observations, however, suggest that on long time scales, deep ocean warming can be comparable to or larger than surface ocean warming, with ocean heat storage efficiency during the last deglaciation about ten times its modern value. This raises the following question: what mechanisms are responsible for ocean heat uptake and storage, and how large can its efficiency be?
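As a rough guide to the quantity under discussion (the paper's formal definition may differ in detail), ocean heat storage efficiency can be thought of as the ratio of full-depth mean ocean warming to surface warming:

\[
\varepsilon \;=\; \frac{\Delta \bar{T}_{\mathrm{ocean}}}{\Delta T_{\mathrm{surface}}},
\qquad
\varepsilon_{\mathrm{modern}} \approx 0.1,
\qquad
\varepsilon_{\mathrm{deglacial}} \gtrsim 1 .
\]

On this reading, a modern value near 0.1 means the deep ocean has so far warmed much less than the surface, while a deglacial value of one or more means warming at depth kept pace with, or exceeded, warming at the surface.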

 

Recently published in Science Advances, a joint study by an international team of scientists from China and the United States has shed light on this issue. By combining state-of-the-art deglacial simulations and proxy-based reconstructions, the team resolved the three-dimensional deglacial ocean temperature change and found that deglacial ocean heat storage efficiency is substantially enhanced, to ≥1, by strong warming in intermediate-depth waters in response to deglacial forcing.

 

“Our simulations and proxy reconstructions demonstrate that the three-dimensional ocean warming during the last deglaciation was strongly nonuniform, with the strongest warming occurring at intermediate depths, in striking contrast to contemporary observations,” said Dr. Chenyu Zhu from the Institute of Atmospheric Physics at the Chinese Academy of Sciences, the study's co-first author.

 

Utilizing sensitivity experiments, the study revealed that the large warming of intermediate waters can be related to surface warming at mid-to-subpolar latitudes through ventilation, in response to greenhouse gas and ice sheet forcing, and is substantially enhanced by oceanic circulation changes associated with meltwater forcing. “The unique ocean warming structure facilitates a large ocean heat storage efficiency. In particular, this resolves the paradox implied by the conventional view that warming occurred at sites of deep-water formation that remained covered by sea ice,” said Prof. Zhengyu Liu, one of the corresponding authors of the study, from The Ohio State University.

 

“These results have valuable implications. For example, if strong surface warming and strong ventilation are collocated, as in our simulations, then the ocean will absorb more heat from the atmosphere, potentially slowing the rate of atmospheric warming,” said Prof. Peter U. Clark, another corresponding author of the study, from Oregon State University.

 

The study underscores the important role of surface warming pattern and oceanic circulation change in long-term ocean heat storage change and suggests that “the ocean can serve as a far greater reservoir of energy in the climate system than implied by contemporary observations,” according to the study.

 

Scientists turn to human skeletons to explore origins of horseback riding




University of Colorado at Boulder





As anyone who’s spent time in the saddle knows, riding a horse can be hard on your body. But can it change the way your skeleton looks?

The answer, according to archaeologists from the University of Colorado Boulder: It’s complicated. In a new study, the team drew on a wide range of evidence—from medical studies of modern equestrians to records of human remains across thousands of years.

The researchers concluded that horseback riding can, in fact, leave a mark on human skeletons, such as by subtly altering the shape of the hip joint. But those sorts of changes on their own can’t definitively reveal whether people have ridden horses during their lives. Many other activities, even sitting for long periods of time, can also transform human bones.

“In archaeology, there are vanishingly few instances in which we can tie a particular activity unequivocally to skeletal changes,” said Lauren Hosek, lead author of the study and an assistant professor in the Department of Anthropology at CU Boulder.

She and her colleagues reported their findings Sept. 20 in the journal Science Advances.

The results may have implications for researchers who study when humans first domesticated horses, and they cast doubt on a long-standing theory in archaeology known as the Kurgan hypothesis.

The first equestrians

The research lies at the center of one of the oldest debates in archaeology, said William Taylor, a co-author of the new study and curator of archaeology at the CU Museum of Natural History.

He explained that the earliest incontrovertible evidence of humans using horses for transport comes from the region around the Ural Mountains of Russia. There, scientists have uncovered horses, bridles and chariots dating to around 4,000 years ago.

But the Kurgan hypothesis, which emerged in the early 20th century, argues that the close relationship between humans and horses began much earlier. Proponents believe that around the fourth millennium B.C., ancient humans living near the Black Sea called the Yamnaya first began galloping on horseback across Eurasia. In the process, the story goes, they may have spread a primordial version of the languages that would later evolve into English, French and more.

“A lot of our understanding of both the ancient and modern worlds hinges on when people started using horses for transportation,” Taylor said. “For decades, there’s been this idea that the distribution of Indo-European languages is, in some way, related to the domestication of the horse.”

Recently, scientists have pointed to human remains from the Yamnaya culture dating back to about 3500 B.C. as a key piece of evidence supporting the Kurgan hypothesis. These ancient peoples, those researchers argued, showed evidence of wear and tear in their skeletons that likely came from riding horses.

Hips can lie

But, in the new study, Hosek and Taylor argue that the story isn’t so simple.

Hosek has spent a lot of time poring over human bones to learn lessons about the past. She explained that the skeleton isn’t static but can shift and change shape over an individual’s lifetime. If you pull a muscle, for example, a reaction can emerge at the site where the muscle attaches to the underlying bone. In some cases, the bone can become more porous or raised ridges may form.

Reading those sorts of clues, however, can be murky at best. The hip joint is one example.

Hosek noted that when you flex your legs at the hip for long periods of time, including during long horse rides, the ball and socket of the hip joint may rub together along one edge. Over time, that rubbing can cause the round socket of the hip bone to become more elongated, or oval in shape. But, she said, other activities can cause the same kind of elongation.

Archaeological evidence shows that humans used cattle, donkeys and even wild asses for transport in some areas of western Asia centuries before they first tamed horses. Ancient peoples likely yoked these beasts of burden to pull carts or even smaller, two-wheeled vehicles that looked something like a chariot.  

“Over time, this repetitive, intense pressure from that kind of jostling in a flexed position could cause skeletal changes,” Hosek said. 

She’s seen similar changes, for example, in the skeletons of Catholic nuns from the 20th century. They never rode horses, but did take long carriage rides across the American West.

Ultimately, Hosek and Taylor say that human remains on their own can’t be used to put a date on when people first started riding horses—at least not with currently available science.  

“Human skeletons alone are not going to be enough evidence,” Hosek said. “We need to couple that data with evidence coming out of genetics and archaeology and by looking at horse remains, too.”

Taylor added that the picture doesn’t look good for the Kurgan hypothesis:

"At least for now, none of these lines of evidence suggest that the Yamnaya people had domestic horses.”