Friday, December 30, 2022

Holding information in mind may mean storing it among synapses

Peer-Reviewed Publication

PICOWER INSTITUTE AT MIT

Plasticity matters 

IMAGE: RESEARCHERS COMPARED THE OUTPUT (ACTIVITY ON THE TOP AND DECODER ACCURACY ON THE BOTTOM) ASSOCIATED WITH REAL NEURAL DATA (LEFT COLUMN) AND SEVERAL MODELS OF WORKING MEMORY TO THE RIGHT. THE ONES THAT BEST RESEMBLED THE REAL DATA WERE THE "PS" MODELS FEATURING SHORT-TERM SYNAPTIC PLASTICITY.

CREDIT: MILLER LAB/THE PICOWER INSTITUTE AT MIT

Between the time you read the Wi-Fi password off the café’s menu board and the time you can get back to your laptop to enter it, you have to hold it in mind. If you’ve ever wondered how your brain does that, you are asking a question about working memory that researchers have strived for decades to answer. Now MIT neuroscientists have published a key new insight into how it works.

In a study in PLOS Computational Biology, scientists at The Picower Institute for Learning and Memory compared measurements of brain cell activity in an animal performing a working memory task with the output of various computer models representing two theories of the underlying mechanism for holding information in mind. The results strongly favored the newer notion that a network of neurons stores the information by making short-lived changes in the pattern of their connections, or synapses, and contradicted the traditional alternative that memory is maintained by neurons remaining persistently active (like an idling engine).

While both models allowed for information to be held in mind, only the versions that allowed for synapses to transiently change connections (“short-term synaptic plasticity”) produced neural activity patterns that mimicked what was actually observed in real brains at work. The idea that brain cells maintain memories by being always “on” may be simpler, acknowledged senior author Earl K. Miller, but it doesn’t represent what nature is doing and can’t produce the sophisticated flexibility of thought that can arise from intermittent neural activity backed up by short-term synaptic plasticity.

“You need these kinds of mechanisms to give working memory activity the freedom it needs to be flexible,” said Miller, Picower Professor of Neuroscience in MIT’s Department of Brain and Cognitive Sciences (BCS). “If working memory was just sustained activity alone, it would be as simple as a light switch. But working memory is as complex and dynamic as our thoughts.”

Co-lead author Leo Kozachkov, who earned his PhD at MIT in November for theoretical modeling work including this study, said matching computer models to real-world data was crucial.

“Most people think that working memory ‘happens’ in neurons—persistent neural activity gives rise to persistent thoughts. However, this view has come under recent scrutiny because it does not really agree with the data,” said Kozachkov who was co-supervised by co-senior author Jean-Jacques Slotine, a professor in BCS and mechanical engineering. “Using artificial neural networks with short-term synaptic plasticity, we show that synaptic activity (instead of neural activity) can be a substrate for working memory. The important takeaway from our paper is: these ‘plastic’ neural network models are more brain-like, in a quantitative sense, and also have additional functional benefits in terms of robustness.”

Matching models with nature

Alongside co-lead author John Tauber, an MIT graduate student, Kozachkov’s goal was not just to determine how working memory information might be held in mind, but to shed light on how nature actually does it. That meant starting with “ground truth” measurements of the electrical “spiking” activity of hundreds of neurons in the prefrontal cortex of an animal as it played a working memory game. In each of many rounds, the animal was shown an image that then disappeared. A second later it would see two images, including the original, and had to look at the original to earn a small reward. The crucial interval is that intervening second, called the “delay period,” in which the image must be kept in mind ahead of the test.

The team consistently observed what Miller’s lab has seen many times before: The neurons spike a lot when seeing the original image, spike only intermittently during the delay, and then spike again when the images must be recalled during the test (these dynamics are governed by an interplay of beta and gamma frequency brain rhythms). In other words, spiking is strong when information must be initially stored and when it must be recalled but is only sporadic when it has to be maintained. The spiking is not persistent during the delay.

Moreover, the team trained software “decoders” to read out the working memory information from the measurements of spiking activity. They were highly accurate when spiking was high, but not when it was low, as in the delay period. This suggested that spiking doesn’t represent information during the delay. But that raised a crucial question: If spiking doesn’t hold information in mind, what does?
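The decoding logic can be illustrated with a toy sketch. This is synthetic data and a generic nearest-centroid classifier, not the study’s actual pipeline: when trial-by-trial spike counts separate well by remembered image (as during encoding and recall), the decoder is accurate; when spiking is sparse and weakly informative (as during the delay), accuracy falls toward chance.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_trials(separation, n_trials=200, n_neurons=50):
    """Hypothetical spike-count data for two remembered images.

    `separation` controls how far apart the two images' mean
    population responses are relative to trial-to-trial noise.
    """
    labels = rng.integers(0, 2, n_trials)
    class_means = rng.normal(0.0, separation, (2, n_neurons))
    counts = class_means[labels] + rng.normal(0.0, 1.0, (n_trials, n_neurons))
    return counts, labels

def decoder_accuracy(X, y):
    """Nearest-centroid decoder: fit on the first half of trials,
    report accuracy on the held-out second half."""
    n = len(y) // 2
    centroids = np.stack([X[:n][y[:n] == c].mean(axis=0) for c in (0, 1)])
    dists = np.linalg.norm(X[n:, None, :] - centroids[None, :, :], axis=2)
    return float((dists.argmin(axis=1) == y[n:]).mean())

X_hi, y_hi = make_trials(separation=1.0)   # vigorous, informative spiking
X_lo, y_lo = make_trials(separation=0.05)  # sparse, weakly informative spiking
acc_hi = decoder_accuracy(X_hi, y_hi)
acc_lo = decoder_accuracy(X_lo, y_lo)
```

With well-separated responses the decoder is near-perfect; with weak separation it hovers near the 50 percent chance level, mirroring the drop the team saw during the delay period.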

Researchers including Mark Stokes at the University of Oxford have proposed that changes in the relative strength, or “weights,” of synapses could store the information instead. The MIT team put that idea to the test by computationally modeling neural networks embodying two versions of each main theory. As with the real animal, the machine learning networks were trained to perform the same working memory task and to output neural activity that could also be interpreted by a decoder.

The upshot is that the computational networks that allowed for short-term synaptic plasticity to encode information spiked when the actual brain spiked and didn’t when it didn’t. The networks featuring constant spiking as the method for maintaining memory spiked all the time including when the natural brain did not. And the decoder results revealed that accuracy dropped during the delay period in the synaptic plasticity models but remained unnaturally high in the persistent spiking models.

In another layer of analysis, the team created a decoder to read out information from the synaptic weights. They found that during the delay period, the synapses represented the working memory information that the spiking did not.

Of the two model versions that featured short-term synaptic plasticity, the more realistic was “PS-Hebb,” which features a negative feedback loop that keeps the neural network stable and robust, Kozachkov said.
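A drastically simplified toy model illustrates the core idea. This is not the paper’s PS-Hebb network; it is an invented Hebbian trace with decay, with all parameters chosen purely for the demo: activity dies away during the delay, while a short-lived change in the weights continues to carry a stimulus-specific trace that a readout could decode.

```python
import numpy as np

n = 20
rng = np.random.default_rng(1)
stimulus = rng.normal(size=n)

r = np.zeros(n)        # firing rates
W = np.zeros((n, n))   # short-term synaptic deviations from baseline

def step(r, W, inp, eta=0.5, decay_r=0.5, decay_w=0.98):
    r = decay_r * r + inp                   # activity fades fast without input
    W = decay_w * W + eta * np.outer(r, r)  # Hebbian trace, decaying slowly
    return r, W

for _ in range(5):                  # encoding: stimulus drives spiking
    r, W = step(r, W, stimulus)
for _ in range(20):                 # delay: no input, activity collapses
    r, W = step(r, W, np.zeros(n))

activity = np.linalg.norm(r)        # near zero: no persistent activity
target = np.outer(stimulus, stimulus)
similarity = np.sum(W * target) / (np.linalg.norm(W) * np.linalg.norm(target))
```

After the delay, `activity` is essentially zero while `similarity` stays high, showing how a transient weight trace, rather than ongoing spiking, can bridge the gap between encoding and recall.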

Workings of working memory

In addition to matching nature better, the synaptic plasticity models also conferred other benefits that likely matter to real brains. One was that the plasticity models retained information in their synaptic weightings even after as many as half of the artificial neurons were “ablated.” The persistent activity models broke down after losing just 10-20 percent of their synapses. And, Miller added, just spiking occasionally requires less energy than spiking persistently.

Furthermore, Miller said, quick bursts of spiking rather than persistent spiking leave room in time for storing more than one item in memory. Research has shown that people can hold up to four different things in working memory. Miller’s lab plans new experiments to determine whether models with intermittent spiking and synaptic weight-based information storage appropriately match real neural data when animals must hold multiple things in mind rather than just one image.

In addition to Miller, Kozachkov, Tauber and Slotine, the paper’s other authors are Mikael Lundqvist and Scott Brincat.

The Office of Naval Research, the JPB Foundation, and ERC and VR Starting Grants funded the research.

Association of vaccination rates of staff in nursing homes, COVID-19 outcomes among residents, staff

JAMA Network Open

Peer-Reviewed Publication

JAMA NETWORK

The findings of this study of 15,000 nursing homes suggest that before the Omicron variant wave, increasing staff vaccination rates was associated with lower incidence of COVID-19 cases and deaths among residents and staff in U.S. nursing homes. However, as newer, more infectious and transmissible variants of the virus emerged, the original 2-dose regimen of the COVID-19 vaccine as recommended in December 2020 was no longer associated with lower rates of adverse COVID-19 outcomes in nursing homes. Policy makers may want to consider longer-term policy options to increase the uptake of booster doses among staff in nursing homes. 

Authors: Soham Sinha, M.S., of the University of Chicago, is the corresponding author. 

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2022.49002)

Editor’s Note: Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Embed this link to provide your readers free access to the full-text article. This link will be live at the embargo time: http://jamanetwork.com/journals/jamanetworkopen/fullarticle/10.1001/jamanetworkopen.2022.49002?utm_source=For_The_Media&utm_medium=referral&utm_campaign=ftm_links&utm_term=122922

Thursday, December 29, 2022

Trends in ransomware attacks on hospitals, clinics, other health care delivery organizations

JAMA Health Forum

Peer-Reviewed Publication

JAMA NETWORK

The annual number of ransomware attacks on health care delivery organizations more than doubled from 2016 to 2021, exposing the personal health information of nearly 42 million patients, according to the results of this study of 374 ransomware attacks. During the study period, ransomware attacks exposed larger quantities of personal health information and grew more likely to affect large organizations with multiple facilities. Current monitoring/reporting efforts provide limited information and could be expanded to potentially yield a more complete view of how this growing form of cybercrime affects the delivery of health care. 

Authors: Hannah T. Neprash, Ph.D., of the University of Minnesota in Minneapolis, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamahealthforum.2022.4873)

Editor’s Note: Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Embed this link to provide your readers free access to the full-text article. This link will be live at the embargo time: https://jamanetwork.com/journals/jama-health-forum/fullarticle/10.1001/jamahealthforum.2022.4873?utm_source=For_The_Media&utm_medium=referral&utm_campaign=ftm_links&utm_term=122922

Scientific criteria for assessing the safety of probiotics

New paper highlights how probiotic regulatory approaches and requirements vary globally – and proposes how to harmonize them

Peer-Reviewed Publication

INTERNATIONAL SCIENTIFIC ASSOCIATION FOR PROBIOTICS AND PREBIOTICS

In past decades, the safety of many probiotics available to consumers was anchored in knowledge of their long history of safe consumption in humans. However, the range of probiotics is expanding to include non-traditional probiotic strains, which may bring health benefits but are not typically present in food sources and do not have a history of safe use.

A group of industry and government scientists convened under the auspices of the United States Pharmacopeia’s Probiotic Expert Panel recently set out to review current approaches to assessing the safety of probiotics from a scientific perspective, while also looking at what regulators currently require. The newly published paper summarizes scientific frameworks for assessing the safety of probiotics used in foods and dietary supplements, including the importance of comprehensive genomic characterization. The paper questions the value of animal testing for probiotics intended for human use. Further, the paper considers emerging technologies for safety testing as well as how manufacturing practices play an important role in the safety of the final product.

Recognizing that regulatory approaches to probiotic safety vary greatly around the world, the paper focuses on scientifically meaningful approaches for safety assessment. Risk tiers for probiotics are identified, based on available information on history of safe use. For strains of species recognized by authoritative bodies as having a sufficient history of safe use, and for which their genomes show no genes of concern, no additional testing is proposed. But if genes of concern are present or a history of safe use is lacking, additional testing is warranted. Safety of genetically modified probiotics or probiotics used as drugs was not addressed in this paper.

Dr. Mary Ellen Sanders, the Executive Science Officer for the International Scientific Association for Probiotics and Prebiotics (ISAPP), and a co-author of the paper, commented, “The current situation, with different countries having different criteria for showing safety of the same probiotic strain, is not ideal. This paper calls for a more harmonized regulatory approach to probiotic safety, based on sound scientific principles.”

Sanders says that while probiotics in foods and supplements are used widely and have a good safety record, establishing safety of innovative strains may require more extensive testing prior to marketing.

Amy Roe, Principal Toxicologist, Product Safety & Regulatory Affairs, The Procter & Gamble Company, led the writing effort. She says, “We hope that this review, which includes the use of modern technologies and approaches, will provide helpful recommendations for completing a comprehensive safety assessment of a probiotic.”

ISAPP is the international non-profit organization that works to advance the science of probiotics, prebiotics and related substances, such as synbiotics, postbiotics and fermented foods. Founded in 2002, ISAPP brings together global leading scientific experts and has shifted the paradigm for how probiotics and prebiotics are studied and understood.

USP is an independent, scientific non-profit organization whose purpose is to improve global health through public standards and related programs that help ensure the quality, safety and benefit of medicines, dietary supplements and foods.

Perfectionists are more likely to burn out, extensive study suggests


Book Announcement

TAYLOR & FRANCIS GROUP

People with perfectionist traits are more susceptible to burnout, according to new research, and it is not just work-related stress that is the cause.

Christmas is coming. We have all endured a global pandemic. There are coughs and colds everywhere. Bills are mounting. It is safe to say we are all exhausted – but when does tiredness tip into burnout?

An expert in mental health and mood disorders has been studying the phenomenon of burnout closely for several years. The extensive research has now been released in the first complete self-help guide to burnout.

The study highlights some of the warning signs of burnout and suggests that people who tend to be perfectionists are more likely to veer into burnout due to their own ‘unrelenting standards’.

What is burnout?

With the worries accompanying pandemic lockdowns, the pressures of inflation and other life stressors, many people are feeling at the end of their tether.

For some people, the cumulative effect of these prolonged periods of stress can result in burnout.

Unlike normal tiredness, the experts suggest burnout symptoms include constant exhaustion, emotional numbness and confusion at home or in the workplace.

Some conventional tools used to diagnose burnout focus on work-related stress, however mental health expert and lead author Professor Gordon Parker suggests that the impact is much more extensive.

Professor Parker said: “Most people consider burnout to be extreme tiredness, but in our studies we have found that the symptoms are much more wide-ranging.

“People struggling with burnout also suffer from cognitive dysfunction, sometimes known as ‘brain fog’ and disconnection from their friends and family, as well as the more typically-recognised reduced performance in work and tasks around the home.”

Who is most likely to burn out?

Burnout is widespread among high achievers in the workplace – but is becoming increasingly prevalent in personal lives.

Professor Parker said: “Most people think that burnout is a work problem. Actually, we found that stress experienced at work or at home can set the wheels of burnout in motion.

“Our analyses indicated that burnout may also develop as a result of predisposing personality traits, especially perfectionism.

“People with perfectionistic traits are usually excellent workers, as they’re extremely reliable and conscientious. However, they’re also prone to burnout as they set unrealistic and unrelenting standards for their own performance, which are ultimately impossible to live up to.”

What can be done about it?

Professor Parker is the founder of the Black Dog Institute, which conducts research into mood disorders and works to remove the social stigmas around mental illness.

During his extensive research on burnout, and with decades of clinical work under his belt, he has determined how to best identify and manage it.

This research is outlined in a recently published book – Burnout: A Guide to Identifying Burnout and Pathways to Recovery.

Critically, the book offers a guide for navigating out of burnout, including identifying sources and coping strategies to minimise the impact of stress.

It contains new evidence-based tools for readers to work out for themselves whether they have burnout and generate a plan for recovery based on their personal situation.

Chapters help readers recognise their own burnout patterns and provide approaches to help them regain their passions and build their resilience.

 

FURTHER INFORMATION

 

Burnout: A Guide to Identifying Burnout and Pathways to Recovery

By Gordon Parker, Gabriela Tavella, Kerrie Eyers

Published December 30, 2022  

252 Pages

Paperback: 9781032358963 | $24.95 | £18.99 

Hardback: 9781032367729 | $160.00 | £120.00 

eBook: 9781003333722 | $24.95 | £18.99 

 

About The Author 

 

Professor Gordon Parker, AO, is a clinical psychiatrist and Scientia Professor of Psychiatry at the University of New South Wales, Sydney. He previously headed university and hospital departments and was founder of the Black Dog Institute. He was the recipient of the Australian Mental Health Prize in 2020.
 

Gabriela Tavella is a research officer at University of New South Wales and is completing a PhD on burnout.
 

Kerrie Eyers, AM, is a psychologist and writer.

 

Taylor & Francis contact: 
Becky Parker-Ellis, Media Relations Manager 
Email: newsroom@taylorandfrancis.com  
Tel.: +(44) 7818 911310
Follow us on Twitter: @tandfnewsroom 

 

 

About Taylor & Francis Group 
Taylor & Francis Group partners with researchers, scholarly societies, universities and libraries worldwide to bring knowledge to life.  As one of the world’s leading publishers of scholarly journals, books, ebooks and reference works our content spans all areas of Humanities, Social Sciences, Behavioural Sciences, Science, Technology and Medicine. 

 

From our network of offices in Oxford, New York, Philadelphia, Boca Raton, Boston, Melbourne, Singapore, Beijing, Tokyo, Stockholm, New Delhi and Cape Town, Taylor & Francis staff provide local expertise and support to our editors, societies and authors and tailored, efficient customer service to our library colleagues.  

 

***** ENDS ***** 

Drivers of political violence in the United States


From the JPP&M special issue, "Marketing to prevent radicalization: Developing insights for policies"

Peer-Reviewed Publication

AMERICAN MARKETING ASSOCIATION

A researcher from The Pennsylvania State University published a commentary in the Journal of Public Policy and Marketing that examines four key contemporary “drivers” of political violence in the United States: “toxic” political polarization; “toxic” identity-based ideologies; assaults on democratic norms; and disinformation and conspiracy theories. The article also discusses some ideas about how to curb the influence of these drivers and reduce political violence.

The study is titled “Drivers of Political Violence in the United States” and is authored by James A. Piazza.

Political violence in the United States is fast becoming a key concern for experts, policymakers, and the public in the wake of the January 6, 2021, assault on the U.S. Capitol. Public opinion surveys show that nearly one third of Americans believe the use of political violence to be acceptable and a plurality of surveyed Americans express concern that widespread violent civil conflict is a real possibility in the future. What explains the growing acceptability, and potential occurrence, of political violence in U.S. political life?

The article proposes that there are four potential social phenomena, or “drivers,” that contribute to political violence in America today. It also examines the social science research that links each of these drivers to political violence.

The first is “toxic” political polarization. The supporters of the two U.S. political parties have become hostile, warring camps that have strong aversion toward one another. This level of extreme political polarization has been found by scholars to increase both support for political violence and the actual occurrence of political violence in the United States and in other democracies globally.

The second, “toxic” identity-based ideologies, involves the mainstreaming of formerly extremist political ideologies such as white nationalism and Christian nationalism. These ideologies are associated with increased acceptance of the use of political violence.

Third, liberal democratic norms, and the institutions that are based upon them, are being eroded in the United States. These norms and institutions are crucial for channeling grievances into legal, nonviolent democratic behaviors. As they weaken, political violence moves into the mainstream.

Finally, the U.S. information ecosystem, particularly social media, facilitates the dissemination of disinformation and conspiracy theories. Researchers have found that both disinformation and conspiratorial mindsets contribute to increased political violence and tolerance of political violence in democratic societies.

The article concludes by stating that all these drivers represent the mainstreaming of extremist politics with the potential of violence. To push them back into the margins of American politics, several potential reforms should be considered. These include political reforms associated with electoral politics as well as media and social media reforms.

About the Journal of Public Policy & Marketing 

Journal of Public Policy & Marketing (JPP&M) is a forum for understanding the nexus of marketing and public policy, with each issue featuring a wide range of topics including, but not limited to, ecology, ethics and social responsibility, nutrition and health, regulation and deregulation, and security and privacy.
https://www.ama.org/jppm

About the American Marketing Association (AMA) 

As the largest chapter-based marketing association in the world, the AMA is trusted by marketing and sales professionals to help them discover what is coming next in the industry. The AMA has a community of local chapters in more than 70 cities and 350 college campuses throughout North America. The AMA is home to award-winning content, PCM® professional certification, premier academic journals, and industry-leading training events and conferences.
https://www.ama.org

Water pollution, a major environmental issue, addressed by an eco-friendly material that purifies water at high speed using inexpensive raw materials

Peer-Reviewed Publication

DGIST (DAEGU GYEONGBUK INSTITUTE OF SCIENCE AND TECHNOLOGY)

□ A team led by Professor Park Chi-Young in the DGIST (President: Kuk Yang) Department of Energy Science and Engineering has developed an 'atypical porous polymer material' that can completely remove phenolic organic contaminants from water at ultra-high speed. The porous material efficiently removes not only microplastics in water but also very small volatile organic compounds (VOCs) via the photothermal effect. With cost-competitive raw materials and compatibility with solar-driven water purification, it is also expected to be commercialized as a high-efficiency adsorption material.

□ Water pollution caused by the rapid growth of the chemical industry is a major environmental problem, and a variety of purification technologies and materials have been developed to address it. Carbon-based porous materials that rely on conventional adsorption mechanisms are limited by slow adsorption rates and the high thermal energy required for recycling. Although many materials have been developed to improve contaminant removal efficiency, it has proven difficult to create one that simultaneously offers excellent recyclability, high efficiency, inexpensive raw materials, and potential for industrialization.

□ Professor Park Chi-Young's team in the DGIST Department of Energy Science and Engineering synthesized a porous polymer with excellent adsorption performance and photothermal properties by reacting an inexpensive, effective precursor. The team then experimented with an additional oxidation reaction on the polymer and, based on the results, introduced hydrophilic functional groups that enable rapid adsorption of micro-pollutants in aquatic environments.

□ Experiments further confirmed that the polymer does not require high thermal energy for recycling and can be reused many times without loss of performance. Exploiting the polymer's ability to absorb light across a broad spectrum and convert it into heat, the research team produced a water treatment membrane that evaporates water using solar energy as its driving force, and confirmed that the membrane, coated with the oxidized polymer, could purify phenolic contaminants using sunlight.

□ Professor Park said, “The technology we developed here is an unrivaled water purification technology with the world’s highest purification efficiency, removing more than 99.9% of phenolic microplastics and VOC contaminants in water at ultra-high speed. We expect it to be a highly economical, universal technology that can purify contaminated water and supply drinking water even in areas without a power supply.”

□ This research was conducted with support from the National Research Foundation of Korea's Leading Researcher Support Project and Nano and Material Technology Development Project. Cho Wan-soo of the DGIST Department of Energy Science and Engineering, Choi Gyeong-hyeon of the master-doctoral combined program, and Lee Dong-joon of the master's program participated as lead authors. The results were published as the cover paper of the 50th 2022 issue of ‘Advanced Materials,’ one of the most prestigious journals in the field of materials.

Corresponding author's email address: parkcy@dgist.ac.kr

Statement by AERA president Rich Milner and executive director Felice J. Levine on the demeaning remarks by the chancellor of Purdue University Northwest

Business Announcement

AMERICAN EDUCATIONAL RESEARCH ASSOCIATION

We are deeply disturbed by the demeaning and dehumanizing comments made by Thomas L. Keon, chancellor of Purdue University Northwest, during a winter commencement address on December 10. His remarks, in which he mocked Asian languages in a crude impression, came at a time when hate against individuals of Asian descent, Jewish heritage, and other historically marginalized groups is on the rise in the U.S.

The university’s response to this incident cannot be silence. Keon’s comments were harmful and convey a disrespect of communities of Asian descent that does not befit the leadership of higher education institutions. Universities, and their leaders, should serve as models for society. We call on Purdue University Northwest to respond appropriately to ensure that this offensive conduct is not minimized and not allowed to be swept aside. We must stand against messages and acts of belittlement and belligerence, even when made in “jest,” and work to support, elevate, and humanize all communities across the globe.

###

About AERA
The American Educational Research Association (AERA) is the largest national interdisciplinary research association devoted to the scientific study of education and learning. Founded in 1916, AERA advances knowledge about education, encourages scholarly inquiry related to education, and promotes the use of research to improve education and serve the public good. Find AERA on Twitter, Facebook, and Instagram.

Bering Land Bridge formed surprisingly late during last ice age

Princeton scientists found that the Bering Land Bridge was flooded until 35,700 years ago, with its full emergence occurring only shortly before the migration of humans into the Americas.

Peer-Reviewed Publication

PRINCETON UNIVERSITY

USCGC Polar Sea (WAGB-11) 

IMAGE: THE SAMPLES WERE COLLECTED ON THE U.S. COAST GUARD ICEBREAKER POLAR SEA ON AN AMBITIOUS TRANS-ARCTIC EXPEDITION IN 1994.

CREDIT: US COAST GUARD

A new study shows that the Bering Land Bridge, the strip of land that once connected Asia to Alaska, emerged far later during the last ice age than previously thought. 

The unexpected findings shorten the window of time that humans could have first migrated from Asia to the Americas across the Bering Land Bridge. 

The findings also indicate that there may be a less direct relationship between climate and global ice volume than scientists had thought, casting into doubt some explanations for the chain of events that causes ice age cycles. The study was published on December 27 in the Proceedings of the National Academy of Sciences.

“This result came totally out of left field,” said Jesse Farmer, postdoctoral researcher at Princeton University and co-lead author on the study. “As it turns out, our research into sediments from the bottom of the Arctic Ocean told us not only about past climate change but also one of the great migrations in human history."

Insight into ice age cycles 

During the periodic ice ages over Earth’s history, global sea levels drop as more and more of Earth’s water becomes locked up in massive ice sheets. At the end of each ice age, as temperatures increase, ice sheets melt and sea levels rise. These ice age cycles repeat throughout the last 3 million years of Earth’s history, but their causes have been hard to pin down.

By reconstructing the history of the Arctic Ocean over the last 50,000 years, the researchers revealed that the growth of the ice sheets — and the resulting drop in sea level — occurred surprisingly quickly and much later in the last glacial cycle than previous studies had suggested.

“One implication is that ice sheets can change more rapidly than previously thought,” Farmer said.

During the peak of the last ice age, known as the Last Glacial Maximum, the low sea levels exposed a vast land area that extended between Siberia and Alaska known as Beringia, which included the Bering Land Bridge. In its place today is a passage of water known as the Bering Strait, which connects the Pacific and Arctic Oceans.

Based on records of estimated global temperature and sea level, scientists thought the Bering Land Bridge emerged around 70,000 years ago, long before the Last Glacial Maximum.

But the new data show that sea levels became low enough for the land bridge to appear only 35,700 years ago. This finding was particularly surprising because global temperatures were relatively stable at the time of the fall in sea level, raising questions about the correlation between temperature, sea level and ice volume.

“Remarkably, the data suggest that the ice sheets can change in response to more than just global climate,” Farmer said. For example, the change in ice volume may have been the direct result of changes in the intensity of sunlight that struck the ice surface over the summer.

“These findings appear to poke a hole in our current understanding of how past ice sheets interacted with the rest of the climate system, including the greenhouse effect,” said Daniel Sigman, Dusenbury Professor of Geological and Geophysical Sciences at Princeton University and Farmer’s postdoctoral advisor. “Our next goal is to extend this record further back in time to see if the same tendencies apply to other major ice sheet changes. The scientific community will be hungry for confirmation.”


New context for human migration

The timing of human migration into North America from Asia remains unresolved, but genetic studies tell us that ancestral Native American populations diverged from Asian populations about 36,000 years ago, the same time that Farmer and colleagues found that the Bering Land Bridge emerged.

“It’s generally believed that the land bridge was open for a while, and then humans crossed it at some point,” Sigman said. “But our new data suggest that the land bridge was not open, and as soon as it opened up, human populations made their way into North America.”

The finding raises questions about why humans decided to migrate as soon as the land bridge opened, and how humans made their way across the land bridge with no previous knowledge of the landscape.

The researchers noted that they need to be cautious when considering these implications, as the interpretation requires combining very different types of evidence, including their new data and the findings of human geneticists and paleoanthropologists. They look forward to seeing how their results are built upon by these other scientific communities.

A window to the past

To reconstruct the history of the Bering Strait, Farmer and Sigman sought an ocean chemical fingerprint.

Pacific waters carry high concentrations of nitrogen molecules that have a distinct chemical composition, known as an isotope ratio. Today, waters from the Pacific Ocean travel northwards across the Bering Strait into the Arctic Ocean, carrying a traceable nitrogen isotope ratio.

By measuring nitrogen isotopes in sediments at the bottom of the Arctic Ocean, Farmer found that the fingerprint of Pacific Ocean nitrogen disappeared when the Bering Strait was closed during the peak of the last ice age, as expected.
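The logic of this fingerprint can be illustrated with a simple two-endmember mixing calculation: if Pacific and Arctic waters carry distinct nitrogen isotope ratios, the fraction of Pacific-sourced water can be solved from a measured sample value. The sketch below uses invented delta-15N endmember values purely for illustration; they are not the study's measurements.

```python
# Hypothetical two-endmember mixing sketch: infer the fraction of
# Pacific-sourced water from a measured nitrogen isotope ratio.
# The endmember delta-15N values are illustrative placeholders.

def pacific_fraction(d15n_sample: float,
                     d15n_pacific: float = 6.0,
                     d15n_arctic: float = 4.0) -> float:
    """Solve d15n_sample = f * d15n_pacific + (1 - f) * d15n_arctic for f,
    the fraction of Pacific-sourced water in the mixture."""
    return (d15n_sample - d15n_arctic) / (d15n_pacific - d15n_arctic)

# A sample indistinguishable from the Arctic endmember implies no
# Pacific inflow -- consistent with a closed Bering Strait.
print(pacific_fraction(4.0))  # 0.0
print(pacific_fraction(5.0))  # 0.5
```

In this framing, the "disappearance" of the Pacific fingerprint in the sediment record corresponds to the mixing fraction dropping toward zero while the strait was closed.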

But when Farmer continued his analyses further back in time – to about 50,000 years ago – he found that the Pacific nitrogen fingerprint returned far more recently than researchers had thought possible.

“When Jesse showed me his data, he didn't need to explain to me what had happened,” Sigman said. “It was too large of a change to be anything other than a previous opening of the Bering Strait.”

To understand the implications for global sea level, Farmer and Sigman collaborated with Tamara Pico, a sea level expert, professor of Earth and Planetary Sciences at UC Santa Cruz, Princeton alumna (undergraduate Class of 2014), and co-lead author on the paper. Pico compared Farmer’s results with sea level models based on different scenarios for the growth of the ice sheets.

“When Jesse contacted me I was so excited,” Pico said. “A large part of my PhD thesis was focused on how fast global ice sheets grew leading into the Last Glacial Maximum, and much of my work suggests that they might have grown faster than previously thought.”

Farmer’s nitrogen analyses provided a new set of evidence to back up Pico’s research about sea levels during the last ice age.

“The exciting thing to me is that this provides a completely independent constraint on global sea level during this time period,” Pico said. “Some of the ice sheet histories that have been proposed differ by quite a lot, and we were able to look at what the predicted sea level would be at the Bering Strait and see which ones are consistent with the nitrogen data.”
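The consistency check Pico describes can be sketched as a simple threshold test: a given ice-sheet scenario predicts a relative sea level history at the Bering Strait, and the land bridge is exposed whenever that level sits below the strait's sill depth (roughly 50 m below present sea level; the exact value and the scenario numbers below are illustrative assumptions, not the paper's data).

```python
# Illustrative screening of ice-sheet scenarios against the Bering
# Strait sill depth. SILL_DEPTH_M and the scenario values are
# assumed/invented for illustration only.

SILL_DEPTH_M = -50.0

def bridge_open_times(history, sill=SILL_DEPTH_M):
    """Return the ages (in thousands of years before present) at which
    relative sea level lies below the sill, i.e. the land bridge is
    exposed under this scenario."""
    return [age for age, rsl in history if rsl < sill]

# (age in ka, relative sea level in meters vs. present)
scenario_late_growth = [(50, -30), (40, -40), (35, -60), (20, -120)]
print(bridge_open_times(scenario_late_growth))  # [35, 20]
```

A scenario is consistent with the nitrogen record if its predicted bridge-open interval matches the interval over which the Pacific fingerprint vanishes from the sediments.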

“This study brought together experts in the Arctic Ocean, nitrogen cycling and global sea level. And the outcome has consequences not only for climate and sea level but also for human prehistory,” Farmer said. “One of the thrilling aspects of paleoclimate research is the opportunity to collaborate across such a broad range of subjects.”

“The Bering Strait was flooded 10,000 years before the Last Glacial Maximum,” by Jesse R. Farmer, Tamara Pico, Ona M. Underwood, Rebecca Cleveland Stout, Julie Granger, Thomas M. Cronin, François Fripiat, Alfredo Martínez-García, Gerald H. Haug, and Daniel M. Sigman appears in the current issue of Proceedings of the National Academy of Sciences (DOI: 10.1073/pnas.2206742119). The research was supported by the U.S. National Science Foundation (OCE-2054780 and OCE-2054757), the Tuttle and Phillips Funds of the Department of Geosciences, the Max Planck Society, and the USGS Climate Research and Development Program.