Wednesday, March 30, 2022

Compelling Sign of Alien Life: Methane Could Be the First Detectable Indication of Life Beyond Earth

Abiotic Methane Sources

Methane in a planet’s atmosphere may be a sign of life if nonbiological sources can be ruled out. This illustration summarizes the known abiotic sources of methane on Earth, including outgassing from volcanoes, reactions in settings such as mid-ocean ridges, hydrothermal vents, and subduction zones, and impacts from asteroids and comets. Credit: © 2022 Elena Hartley

A new study assesses the planetary context in which the detection of methane in an exoplanet’s atmosphere could be considered a compelling sign of life.

If life is abundant in the universe, atmospheric methane may be the first sign of life beyond Earth detectable by astronomers. Although nonbiological processes can generate methane, a new study by scientists at UC Santa Cruz establishes a set of circumstances in which a persuasive case could be made for biological activity as the source of methane in a rocky planet’s atmosphere.

This is especially noteworthy because methane is one of the few potential signs of life, or “biosignatures,” that could be readily detectable with the James Webb Space Telescope, which will begin observations later this year.

“Oxygen is often talked about as one of the best biosignatures, but it’s probably going to be hard to detect with JWST,” said Maggie Thompson, a graduate student in astronomy and astrophysics at UC Santa Cruz and lead author of the new study.

Despite some prior studies on methane biosignatures, there had not been an up-to-date, dedicated assessment of the planetary conditions needed for methane to be a good biosignature. “We wanted to provide a framework for interpreting observations, so if we see a rocky planet with methane, we know what other observations are needed for it to be a persuasive biosignature,” Thompson said.

Published today (March 28, 2022) in Proceedings of the National Academy of Sciences, the study examines a variety of non-biological sources of methane and assesses their potential to maintain a methane-rich atmosphere. These include volcanoes; reactions in settings such as mid-ocean ridges, hydrothermal vents, and tectonic subduction zones; and comet or asteroid impacts.

The case for methane as a biosignature stems from its instability in the atmosphere. Because photochemical reactions destroy atmospheric methane, it must be steadily replenished to maintain high levels.
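As a rough illustration of that balance, here is a toy steady-state box model (the flux and lifetime numbers are illustrative assumptions, not values or code from the study): the sustained methane inventory is simply the replenishing source flux multiplied by the photochemical lifetime, so a methane-rich atmosphere implies a correspondingly large, ongoing source.

```python
# Toy steady-state box model: atmospheric CH4 inventory = source flux x photochemical lifetime.
# Illustrative numbers only; they are assumptions, not values from the study.

def steady_state_inventory(source_flux_tg_per_yr: float, lifetime_yr: float) -> float:
    """Return the steady-state methane inventory (Tg) when production balances destruction."""
    return source_flux_tg_per_yr * lifetime_yr

# Roughly Earth-like illustrative inputs: ~500 Tg CH4/yr source, ~10 yr photochemical lifetime.
inventory = steady_state_inventory(source_flux_tg_per_yr=500.0, lifetime_yr=10.0)
print(f"Steady-state CH4 inventory: {inventory:.0f} Tg")

# Halve the source and the sustained inventory halves too.
print(f"With half the source: {steady_state_inventory(250.0, 10.0):.0f} Tg")
```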

“If you detect a lot of methane on a rocky planet, you typically need a massive source to explain that,” said coauthor Joshua Krissansen-Totton, a Sagan Fellow at UCSC. “We know biological activity creates large amounts of methane on Earth, and probably did on the early Earth as well because making methane is a fairly easy thing to do metabolically.”

Nonbiological sources, however, would not be able to produce that much methane without also generating observable clues to its origins. Outgassing from volcanoes, for example, would add both methane and carbon monoxide to the atmosphere, while biological activity tends to readily consume carbon monoxide. The researchers found that nonbiological processes cannot easily produce habitable planet atmospheres rich in both methane and carbon dioxide and with little to no carbon monoxide.

The study emphasizes the need to consider the full planetary context in evaluating potential biosignatures. The researchers concluded that, for a rocky planet orbiting a sun-like star, atmospheric methane is more likely to be considered a strong indication of life if the atmosphere also has carbon dioxide, methane is more abundant than carbon monoxide, and extremely water-rich planetary compositions can be ruled out.
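Stated as a toy checklist (an illustrative paraphrase of the criteria above, not the paper's actual decision procedure):

```python
# Toy checklist paraphrasing the criteria described above (illustrative only).
def methane_is_compelling(has_co2: bool, ch4_exceeds_co: bool, water_world_ruled_out: bool) -> bool:
    """Return True when atmospheric methane is more likely a strong indication of life."""
    return has_co2 and ch4_exceeds_co and water_world_ruled_out

print(methane_is_compelling(has_co2=True, ch4_exceeds_co=True, water_world_ruled_out=True))   # True
print(methane_is_compelling(has_co2=True, ch4_exceeds_co=False, water_world_ruled_out=True))  # False
```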

“One molecule is not going to give you the answer—you have to take into account the planet’s full context,” Thompson said. “Methane is one piece of the puzzle, but to determine if there is life on a planet you have to consider its geochemistry, how it’s interacting with its star, and the many processes that can affect a planet’s atmosphere on geologic timescales.”

The study considers a variety of possibilities for “false positives” and provides guidelines for assessing methane biosignatures.

“There are two things that could go wrong—you could misinterpret something as a biosignature and get a false positive, or you could overlook something that’s a real biosignature,” Krissansen-Totton said. “With this paper, we wanted to develop a framework to help avoid both of those potential errors with methane.”

He added that there is still a lot of work to be done to fully understand any future methane detections. “This study is focused on the most obvious false positives for methane as a biosignature,” he said. “The atmospheres of rocky exoplanets are probably going to surprise us, and we will need to be cautious in our interpretations. Future work should try to anticipate and quantify more unusual mechanisms for nonbiological methane production.”

Reference: “The case and context for atmospheric methane as an exoplanet biosignature” 28 March 2022, Proceedings of the National Academy of Sciences.
DOI: 10.1073/pnas.2117933119

In addition to Thompson and Krissansen-Totton, the coauthors of the paper include Jonathan Fortney, professor of astronomy and astrophysics at UCSC, Myriam Telus, assistant professor of Earth and planetary sciences at UCSC, and Nicholas Wogan at the University of Washington, Seattle. This work was supported by NASA.

Could the Internet Be Driven by Climate-Friendly “Natural Intelligence”?

Sun Exposure at Each Solar Protocol Server

Map of solar exposure over 24 hours at each Solar Protocol sun-powered network server. Credit: Tega Brain, Benedetta Piantella

Global network of solar-powered servers shows how.

The energy requirements of the Internet, the devices and systems that use it, and the servers that support it are responsible for greenhouse gas emissions equal to those of the global airline industry, and this carbon cost is growing rapidly with the expansion of blockchain-enabled transactions: the carbon footprint of a single Ethereum transaction is equivalent to that of nearly 329,000 credit card transactions.

A new project, Solar Protocol, developed by a team of researchers at the NYU Tandon School of Engineering, aims both to spotlight how this transglobal trafficking of data through the web is a major consumer of energy and driver of climate change, and to offer a potential solution.

On Tuesday, March 15, 2022, the project, initially supported by the Eyebeam Rapid Response for a Better Digital Future program and the Code for Science & Society Digital Infrastructure Incubator, received a Creative Media Award from Mozilla, which supports internet health through the development of open-source technologies in areas such as online privacy, inclusion, and decentralization. Mozilla gives the award to people and projects that illuminate how to reimagine data in a way that shifts power away from big tech platforms and toward individuals and communities.

Solar Protocol was developed by NYU Tandon professors of Technology, Culture and Society Tega Brain, who is also an assistant industry professor of integrated digital media, and Benedetta Piantella, who is also a member of the Center for Urban Science and Progress (CUSP), together with Adjunct Professor Alex Nathanson. It comprises a web platform hosted across a network of solar-powered servers set up in locations around the world. Besides being a workable system with implications for future servers, it constitutes a global installation that spotlights the politics of the web and different ways to track web traffic.

In stark contrast to large-scale, high-volume web services that algorithmically direct network traffic to whichever server gives the quickest response time, usually the closest geographically, Solar Protocol, with about a dozen volunteer-run server nodes around the world, uses the sun’s interaction with Earth as the cornerstone. How the sun shapes daily behaviors, seasonal activities and the decision-making of almost all life forms becomes the “logic” used to automate decisions in the digital network.

“Solar Protocol is a great opportunity for us as artists to foreground issues of climate change and how technology is driving it,” said Brain. “The project has catalyzed conversations about AI and automation, since in-network user traffic is decided by solar energy, so we are using a natural, dynamic form of intelligence rather than a data-driven machine-learning model; it’s an alternative proposition. Why not think of planetary limits as intelligence? After all, they will shape the future of life on Earth whether we like it or not.”

The network takes into consideration the fact that servers, each powered by photovoltaic cells, are located in different time zones and seasons, with different sun exposure and weather systems, and directs internet traffic to wherever the sun is shining. When a browser makes a request to see the Solar Protocol website, it is sent to whichever server in the network is generating the most energy.
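A minimal sketch of that routing rule appears below (the server list and field names are hypothetical; Solar Protocol publishes its own implementation as an open standard): each node reports its current photovoltaic output, and requests are pointed at the sunniest online server.

```python
# Minimal sketch of "point traffic at the sunniest server".
# The server list and fields are hypothetical illustrations, not Solar Protocol's actual telemetry.

servers = [
    {"name": "nyc",     "pv_watts": 3.2,  "online": True},
    {"name": "nairobi", "pv_watts": 41.7, "online": True},
    {"name": "sydney",  "pv_watts": 0.0,  "online": False},  # night / offline
]

def choose_server(servers: list[dict]) -> dict:
    """Return the online server currently generating the most solar power."""
    online = [s for s in servers if s["online"]]
    if not online:
        raise RuntimeError("no server is currently online")
    return max(online, key=lambda s: s["pv_watts"])

print(choose_server(servers)["name"])  # -> "nairobi"
```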

“This is in no way an alternative to the internet, so the goal here is not to scale it up. But we are publishing the system as an open standard, which means, theoretically, anyone could launch a similar network — say, a network of art museums,” said Piantella.

Brain pointed out that the project also addresses the language of the Internet, and how we speak of it in terms suggesting it has little to do with the concrete realities of our physical environment.

“We talk about the internet as the cloud, for example, and we tend to use the language of magic to describe it, making no connection to how resource-intensive it really is,” she said. “So people who get involved in the project as server stewards become very much in touch with its material reality, and with what it takes to set up a server powered by the sun. You start making different design decisions; you think about planetary limits and rethink the politics of the internet.”

Added Kofi Yeboah, Creative Media Awards Program Officer at Mozilla, “In our connected world, conversations about power, about inclusion and exclusion, and about ownership often come down to one thing: data. How data is collected, managed, and trains AI systems has an impact on billions of lives. But that impact is often invisible. Creative Media Awards like Solar Protocol make the invisible visible, revealing how data can impact everything from the environment to personal safety. The Creative Media Awards also offer a way forward, modeling ways that data can be better stewarded to empower people and communities.”

The project also has an educational component through the VIP (Vertically Integrated Projects) initiative at NYU Tandon, which allows students to participate in analyzing the network’s functional cycle.

Solar Protocol includes several collaborators from a range of communities, including faculty members in Chile, and arts, cultural and community-based organizations in multiple areas and Indigenous territories in the Caribbean, Australia, and Kenya.

How the Brain Makes Memories and Uses “Time Travel” To Remember

Neuroscience

Cedars-Sinai researchers discover neurons that separate experience into segments, then help the brain ‘time travel’ and remember.

In a study led by Cedars-Sinai, researchers have discovered two types of brain cells that play a key role in dividing continuous human experience into distinct segments that can be recalled later. The discovery offers a promising path toward the development of novel treatments for memory disorders such as dementia and Alzheimer’s disease.

The study, part of a multi-institutional BRAIN Initiative consortium funded by the National Institutes of Health and led by Cedars-Sinai, was published in the peer-reviewed journal Nature Neuroscience. As part of ongoing research into how memory works, Ueli Rutishauser, PhD, professor of Neurosurgery, Neurology, and Biomedical Sciences at Cedars-Sinai, and co-investigators looked at how brain cells react as memories are formed.

“One of the reasons we can’t offer significant help for somebody who suffers from a memory disorder is that we don’t know enough about how the memory system works,” said Rutishauser, senior author of the study, adding that memory is foundational to us as human beings.

How Human Brain Stores and Retrieves Memories

Researchers have discovered two types of brain cells that play a key role in creating memories. Credit: Sarah Pyle for Cedars-Sinai Medical Center

Human experience is continuous, but psychologists believe, based on observations of people’s behavior, that memories are divided by the brain into distinct events, a concept known as event segmentation. Working with 19 patients with drug-resistant epilepsy, Rutishauser and his team were able to study how neurons perform during this process.

Patients participating in the study had electrodes surgically inserted into their brains to help locate the focus of their epileptic seizures, allowing investigators to record the activity of individual neurons while the patients viewed film clips that included cognitive boundaries.

While these boundaries in daily life are nuanced, for research purposes, the investigators focused on “hard” and “soft” boundaries.

“An example of a soft boundary would be a scene with two people walking down a hallway and talking, and in the next scene, a third person joins them, but it is still part of the same overall narrative,” said Rutishauser, interim director of the Center for Neural Science and Medicine and the Board of Governors Chair in Neurosciences at Cedars-Sinai.

In the case of a hard boundary, the second scene might involve a completely different set of people riding in a car. “The difference between hard and soft boundaries is in the size of the deviation from the ongoing narrative,” Rutishauser said. “Is it a totally different story, or like a new scene from the same story?”

When study participants watched film clips, investigators noted that certain neurons in the brain, which they labeled “boundary cells,” increased their activity after both hard and soft boundaries. Another group of neurons, labeled “event cells,” increased their activity only in response to hard boundaries, but not soft boundaries.

Rutishauser and his co-investigators theorize that peaks in the activity of boundary and event cells—which are highest after hard boundaries, when both types of cells fire—send the brain into the proper state for initiating a new memory.

“A boundary response is kind of like creating a new folder on your computer,” said Rutishauser. “You can then deposit files in there. And when another boundary comes around, you close the first folder and create another one.”

To retrieve memories, the brain uses boundary peaks as what Rutishauser calls “anchors for mental time travel.”

“When you try to remember something, it causes brain cells to fire,” Rutishauser said. “The memory system then compares this pattern of activity to all the previous firing peaks that happened shortly after boundaries. If it finds one that is similar, it opens that folder. You go back for a few seconds to that point in time, and things that happened then come into focus.”
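As a purely illustrative toy model of that matching idea (not the study's analysis or data), retrieval can be pictured as comparing the current population firing pattern against stored post-boundary "anchor" patterns and opening the closest match:

```python
# Toy illustration (not the study's analysis): retrieval as nearest-neighbor matching
# between the current population firing pattern and stored post-boundary "anchor" patterns.
import numpy as np

rng = np.random.default_rng(0)
n_neurons = 50

# Hypothetical stored anchors: one firing-rate vector per event "folder".
anchors = {f"event_{i}": rng.random(n_neurons) for i in range(5)}

# A retrieval cue: a noisy version of the pattern recorded after boundary 3.
cue = anchors["event_3"] + 0.1 * rng.standard_normal(n_neurons)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "Open the folder" whose stored pattern is most similar to the cue.
best = max(anchors, key=lambda k: cosine(cue, anchors[k]))
print(best)  # -> "event_3"
```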

To test their theory, investigators gave study participants two memory tests.

They first showed participants a series of still images and asked them whether or not they had seen them in the film clips they had viewed. Study participants were more likely to remember images that closely followed a hard or soft boundary, when a new “memory folder” would have been created.

Investigators also showed participants pairs of images from film clips they had viewed and asked which of the images appeared first. Participants had difficulty remembering the correct order of images that appeared on opposite sides of a hard boundary, possibly because the brain had segmented those images into separate memory folders.

Rutishauser said that therapies that improve event segmentation could help patients with memory disorders. Even something as simple as a change in atmosphere can amplify event boundaries, he explained.

“The effect of context is actually quite strong,” Rutishauser said. “If you study in a new place, where you have never been before, instead of on your couch where everything is familiar, you will create a much stronger memory of the material.”

The research team included postdoctoral fellow Jie Zheng, PhD, and neuroscientist Gabriel Kreiman, PhD, from Boston Children’s Hospital; neurosurgeon Taufik A. Valiante, MD, PhD, of the University of Toronto; and Adam Mamelak, MD, professor of Neurosurgery and director of the Functional Neurosurgery Program at Cedars-Sinai.

In follow-up studies, the team plans to test the theory that boundary and event cells activate dopamine neurons when they fire, and that dopamine, a chemical that sends messages between cells, might be used as a therapy to strengthen memory formation.

Rutishauser and his team also noted during this study that when event cells fired in time with one of the brain’s internal rhythms, the theta rhythm—a repetitive pattern of activity linked to learning, memory and navigation—subjects were better able to remember the order of images they had seen. This is an important new insight because it suggests that deep brain stimulation that adjusts theta rhythms could prove therapeutic for memory disorders.

“Theta rhythms are thought to be the ‘temporal glue’ for episodic memory,” said Zheng, first author of the study. “We think that firing of event cells in synchrony with the theta rhythm builds time-based links across different memory folders.”

For more on this research, see Researchers Discover How the Human Brain Separates, Stores, and Retrieves Memories.

Reference: “Neurons detect cognitive boundaries to structure episodic memories in humans” by Jie Zheng, Andrea G. P. Schjetnan, Mar Yebra, Bernard A. Gomes, Clayton P. Mosher, Suneil K. Kalia, Taufik A. Valiante, Adam N. Mamelak, Gabriel Kreiman and Ueli Rutishauser, 7 March 2022, Nature Neuroscience.
DOI: 10.1038/s41593-022-01020-w

The study was funded by National Institutes of Health Grants number U01NS103792 and U01NS117839, National Science Foundation Grant number 8241231216, and Brain Canada.

MIT Researchers Create a Tool for Predicting the Future

Big Data Predictive Analysis AI Concept

Researchers design a user-friendly interface that helps nonexperts make forecasts using data collected over time.

Whether someone is trying to predict tomorrow’s weather, forecast future stock prices, identify missed opportunities for sales in retail, or estimate a patient’s risk of developing a disease, they will likely need to interpret time-series data, which are a collection of observations recorded over time.

Making predictions using time-series data typically requires several data-processing steps and the use of complex machine-learning algorithms, which have such a steep learning curve they aren’t readily accessible to nonexperts.

To make these powerful tools more user-friendly, MIT researchers developed a system that directly integrates prediction functionality on top of an existing time-series database. Their simplified interface, which they call tspDB (time series predict database), does all the complex modeling behind the scenes so a nonexpert can easily generate a prediction in only a few seconds.

Tool for Predicting the Future

MIT researchers created a tool that enables people to make highly accurate predictions using multiple time-series data with just a few keystrokes. The powerful algorithm at the heart of their tool can transform multiple time series into a tensor, which is a multi-dimensional array of numbers (pictured). Credit: Figure courtesy of the researchers and edited by MIT News

The new system is more accurate and more efficient than state-of-the-art deep learning methods when performing two tasks: predicting future values and filling in missing data points.

One reason tspDB is so successful is that it incorporates a novel time-series-prediction algorithm, explains electrical engineering and computer science (EECS) graduate student Abdullah Alomar, an author of a recent research paper in which he and his co-authors describe the algorithm. This algorithm is especially effective at making predictions on multivariate time-series data, which are data that have more than one time-dependent variable. In a weather database, for instance, temperature, dew point, and cloud cover each depend on their past values.

The algorithm also estimates the volatility of a multivariate time series to provide the user with a confidence level for its predictions.

“Even as the time-series data becomes more and more complex, this algorithm can effectively capture any time-series structure out there. It feels like we have found the right lens to look at the model complexity of time-series data,” says senior author Devavrat Shah, the Andrew and Erna Viterbi Professor in EECS and a member of the Institute for Data, Systems, and Society and of the Laboratory for Information and Decision Systems.

Joining Alomar and Shah on the paper is lead author Anish Agrawal, a former EECS graduate student who is currently a postdoc at the Simons Institute at the University of California at Berkeley. The research will be presented at the ACM SIGMETRICS conference.

Adapting a new algorithm

Shah and his collaborators have been working on the problem of interpreting time-series data for years, adapting different algorithms and integrating them into tspDB as they built the interface.

About four years ago, they learned about a particularly powerful classical algorithm, called singular spectrum analysis (SSA), that imputes and forecasts single time series. Imputation is the process of replacing missing values or correcting past values. Although this algorithm required manual parameter selection, the researchers suspected it could enable their interface to make effective predictions using time-series data, and in earlier work they removed the need for that manual intervention.

The single-time-series algorithm transforms the series into a matrix and applies matrix estimation procedures. The key intellectual challenge was how to adapt it to use multiple time series. After a few years of struggle, they realized the answer was something very simple: “stack” the matrices for each individual time series, treat them as one big matrix, and then apply the single-time-series algorithm to it.

This utilizes information across multiple time series naturally — both across the time series and across time, which they describe in their new paper.
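A minimal sketch of that stacking idea (an illustrative NumPy reimplementation, not the authors' code; the window length, rank, and toy data are arbitrary choices): build a Page matrix for each series, stack them side by side, and use a low-rank approximation of the big matrix to fill in missing values.

```python
# Illustrative mSSA-style imputation sketch (not the authors' implementation):
# stack each series' Page matrix into one wide matrix, then use a low-rank
# (truncated SVD) approximation to fill in missing entries.
import numpy as np

def page_matrix(x: np.ndarray, L: int) -> np.ndarray:
    """Arrange a series into non-overlapping length-L columns (Page matrix)."""
    T = (len(x) // L) * L
    return x[:T].reshape(-1, L).T          # shape (L, T // L)

def mssa_impute(series: list[np.ndarray], L: int = 25, rank: int = 3) -> np.ndarray:
    stacked = np.hstack([page_matrix(s, L) for s in series])   # one big matrix
    filled = np.where(np.isnan(stacked), np.nanmean(stacked), stacked)
    U, s, Vt = np.linalg.svd(filled, full_matrices=False)
    return (U[:, :rank] * s[:rank]) @ Vt[:rank]                 # low-rank estimate

# Two correlated toy series, one with a stretch of missing observations.
t = np.arange(500, dtype=float)
a = np.sin(2 * np.pi * t / 50)
b = 0.8 * np.sin(2 * np.pi * t / 50 + 0.3)
a[100:110] = np.nan                                             # simulate missing data
estimate = mssa_impute([a, b])
print(estimate.shape)
```

The forecasting step described in the paper fits a linear model on top of this low-rank structure; the sketch stops at imputation for brevity.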

This recent publication also discusses interesting alternatives, where instead of transforming the multivariate time series into a big matrix, it is viewed as a three-dimensional tensor. A tensor is a multi-dimensional array, or grid, of numbers. This established a promising connection between the classical field of time series analysis and the growing field of tensor estimation, Alomar says.

“The variant of mSSA that we introduced actually captures all of that beautifully. So, not only does it provide the most likely estimation, but a time-varying confidence interval, as well,” Shah says.
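One way such a time-varying band could be produced, offered here as an illustration rather than the paper's exact procedure, is to run the same low-rank estimate on the squared observations and subtract the squared mean estimate to obtain a variance estimate:

```python
# Illustrative sketch of a time-varying confidence band: estimate E[X] and E[X^2]
# with the same low-rank procedure, then var = E[X^2] - (E[X])^2.
# Details here are assumptions for illustration, not the paper's exact procedure.
import numpy as np

def low_rank_estimate(x: np.ndarray, L: int = 25, rank: int = 3) -> np.ndarray:
    M = x[: (len(x) // L) * L].reshape(-1, L).T        # Page matrix
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return ((U[:, :rank] * s[:rank]) @ Vt[:rank]).T.ravel()

rng = np.random.default_rng(0)
t = np.arange(500, dtype=float)
x = np.sin(2 * np.pi * t / 50) + rng.normal(0, 0.2, t.size)

mean_est = low_rank_estimate(x)
var_est = np.clip(low_rank_estimate(x ** 2) - mean_est ** 2, 0, None)

upper = mean_est + 1.96 * np.sqrt(var_est)             # ~95% band around the estimate
lower = mean_est - 1.96 * np.sqrt(var_est)
print(upper[:3], lower[:3])
```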

The simpler, the better

They tested the adapted mSSA against other state-of-the-art algorithms, including deep-learning methods, on real-world time-series datasets with inputs drawn from the electricity grid, traffic patterns, and financial markets.

Their algorithm outperformed all the others on imputation and it outperformed all but one of the other algorithms when it came to forecasting future values. The researchers also demonstrated that their tweaked version of mSSA can be applied to any kind of time-series data.

“One reason I think this works so well is that the model captures a lot of time series dynamics, but at the end of the day, it is still a simple model. When you are working with something simple like this, instead of a neural network that can easily overfit the data, you can actually perform better,” Alomar says.

The impressive performance of mSSA is what makes tspDB so effective, Shah explains. Now, their goal is to make this algorithm accessible to everyone.

Once a user installs tspDB on top of an existing database, they can run a prediction query with just a few keystrokes in about 0.9 milliseconds, as compared to 0.5 milliseconds for a standard search query. The confidence intervals are also designed to help nonexperts make more informed decisions by incorporating the degree of uncertainty of the predictions into their decision making.

For instance, the system could enable a nonexpert to predict future stock prices with high accuracy in just a few minutes, even if the time-series dataset contains missing values.

Now that the researchers have shown why mSSA works so well, they are targeting new algorithms that can be incorporated into tspDB. One of these algorithms utilizes the same model to automatically enable change point detection, so if the user believes their time series will change its behavior at some point, the system will automatically detect that change and incorporate that into its predictions.

They also want to continue gathering feedback from current tspDB users to see how they can improve the system’s functionality and user-friendliness, Shah says.

“Our interest at the highest level is to make tspDB a success in the form of a broadly utilizable, open-source system. Time-series data are very important, and this is a beautiful concept of actually building prediction functionalities directly into the database. It has never been done before, and so we want to make sure the world uses it,” he says.

“This work is very interesting for a number of reasons. It provides a practical variant of mSSA which requires no hand tuning, they provide the first known analysis of mSSA, and the authors demonstrate the real-world value of their algorithm by being competitive with or out-performing several known algorithms for imputations and predictions in (multivariate) time series for several real-world data sets,” says Vishal Misra, a professor of computer science at Columbia University who was not involved with this research. “At the heart of it all is the beautiful modeling work where they cleverly exploit correlations across time (within a time series) and space (across time series) to create a low-rank spatiotemporal factor representation of a multivariate time series. Importantly this model connects the field of time series analysis to that of the rapidly evolving topic of tensor completion, and I expect a lot of follow-on research spurred by this paper.”

Reference: “On Multivariate Singular Spectrum Analysis and its Variants” by Anish Agarwal, Abdullah Alomar and Devavrat Shah, 13 February 2021, arXiv (Computer Science > Machine Learning).
arXiv:2006.13448

Your Gut Microbiome Shapes Your Life. But Where Did It Come From?

Microbiome Acquisition

Artist’s concept of this research that probes the odds of a new species being acquired by a microbiome ecosystem. Credit: Navid Marvi, courtesy of the Carnegie Institution for Science

The role of chance in microbiome composition.

The gut microbiome is an ecosystem of hundreds to thousands of microbial species living within the human body. These populations affect our health, fertility, and even our longevity. But how do they get there in the first place?

New collaborative work led by Carnegie’s William Ludington reveals crucial details about how the bacterial communities that comprise each of our individual gut microbiomes are acquired. These findings, published in the Proceedings of the National Academy of Sciences, have major implications for treatments such as fecal transplants and probiotic administration.

“There is a huge amount of variation in microbiome composition between individuals,” Ludington explained. “For example, if you look at the sum total of all of the bacterial species that are adapted to live in the gastrointestinal systems of humans, most of these are not present in a majority of people. That’s how incredibly diverse these gut microbial populations are.”

A combination of elements, including genetics, diet, and environment contribute to the differences between our microbiomes. But there isn’t a direct line between these inputs and the species that successfully colonize our guts. There’s an element of chance at play each time we are exposed to a new microbe as to whether it will be acquired and become a member of our gut ecosystem. Ludington and his collaborators set out to understand the factors that shape the odds of this colonization process.

Although many researchers have studied microbiome compositions in natural populations, there have been few attempts to use a controlled environment to reveal the process by which new species successfully join the gut microbial ecosystem. Ludington and his collaborators—Eric Jones and David Sivak of Simon Fraser University and Jean Carlson of UC Santa Barbara—developed a new ecological model to understand how we acquire the specific mix of microbes that are individual to our own particular gut communities.  

Working in the comparatively much simpler microbiomes of fruit flies, the team showed that exposure to a microbial species does not guarantee its successful incorporation into the microbiome ecosystem. They found that the state of the microbiome, and interactions between existing member species, set the odds of whether a newly encountered bacterium is added into the mix.

“Even among genetically identical flies that lived in the same housing and were fed the same diets, we saw variations in microbiome composition,” Sivak said.

The researchers then used these results to build mathematical models that could probe increasingly complex scenarios by which new microbiome species could be acquired, leading to their breakthrough understanding of the community factors that shape membership in the microbiome ecosystem.
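A toy version of such a model (hypothetical interaction weights, not the paper's fitted parameters) treats each exposure as a weighted coin flip whose odds shift with the species already present:

```python
# Toy stochastic-colonization sketch (hypothetical numbers, not the study's model):
# the chance that a newly encountered species establishes depends on which
# residents are already in the community.
import random

# Pairwise effects of resident species on a newcomer's colonization odds.
# Positive = facilitation, negative = inhibition (illustrative values).
interaction = {"A": 0.20, "B": -0.15, "C": 0.05}
base_probability = 0.30

def colonization_probability(residents: set[str]) -> float:
    p = base_probability + sum(interaction[r] for r in residents)
    return min(max(p, 0.0), 1.0)

def attempt_colonization(residents: set[str], rng: random.Random) -> bool:
    return rng.random() < colonization_probability(residents)

rng = random.Random(1)
print(colonization_probability({"A", "C"}))      # facilitated: 0.55
print(colonization_probability({"B"}))           # inhibited:   0.15
print(attempt_colonization({"A", "C"}, rng))     # a single "roll of the dice"
```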

“Think of microbiome composition as a big party where the social dynamics determine who leaves early and who stays until dawn,” said Ludington.

Added Jones, the paper’s first author: “Bacterial colonization depends on a number of complicated factors that we’re just starting to understand.  We showed, for example, that some groups of species facilitate each other’s colonization and are therefore more likely to coexist.”

These group interactions have exciting implications for how microbiomes are transmitted between individuals, including how medical professionals might drive a person’s microbiome towards a desired composition.

“The beauty of the mathematical approach we deployed is that it acknowledges that colonization is a roll of the dice, but we are now able to attribute the weighting of the dice to biological interactions with a molecular basis that has been honed by evolution,” said Carlson.

The team’s findings provide a framework for quantitatively examining the mechanisms that therapies such as fecal transplants and probiotics depend upon, advancing toward the eventual goal of personalized microbiome medicine.        

Reference: “Stochastic microbiome assembly depends on context” by Eric W. Jones, Jean M. Carlson, David A. Sivak and William B. Ludington, 8 February 2022, Proceedings of the National Academy of Sciences.
DOI: 10.1073/pnas.2115877119

This work was supported by the Banting Postdoctoral Fellowship, the Pacific Institute for the Mathematical Sciences Postdoctoral Fellowship, the David and Lucile Packard Foundation, the Institute for Collaborative Biotechnologies, the U.S. Army Research Office, a Natural Sciences and Engineering Research Council of Canada Discovery Grant, the Canada Research Chairs program, a National Institutes of Health grant, National Science Foundation Integrative Organismal Systems award, the Carnegie Institution for Science endowment, and a Carnegie Institution of Canada grant.

UNLIKE TRUMP RALLIES

Research Shows NFL Stadium Openings in 2020 Had No Impact on Local COVID-19 Infections

NFL Football Stadium

The findings may inform decisions on holding large outdoor gatherings amid future public health crises.

As with most everything in the world, football looked very different in 2020. As the Covid-19 pandemic unfolded, many National Football League (NFL) games were played in empty stadiums, while other stadiums opened to fans at significantly reduced capacity, with strict safety protocols in place.

At the time it was unclear what impact such large sporting events would have on Covid-19 case counts, particularly at a time when vaccination against the virus was not widely available.

Now, MIT engineers have taken a look back at the NFL’s 2020 regular season and found that for this specific period during the pandemic, opening stadiums to fans while requiring face coverings, social distancing, and other measures had no impact on the number of Covid-19 infections in those stadiums’ local counties.

As they write in a new paper published in the Proceedings of the National Academy of Sciences, “the benefits of providing a tightly controlled outdoor spectating environment — including masking and distancing requirements — counterbalanced the risks associated with opening.”

NFL Stadium Openings Impact COVID

An MIT study finds NFL stadium openings had no impact on local Covid-19 infections during the 2020 season. Credit: Jose-Luis Olivares, MIT

The study concentrates on the NFL’s 2020 regular season (September 2020 to early January 2021), at a time when earlier strains of the virus dominated, before the rise of more transmissible Delta and Omicron variants. Nevertheless, the results may inform decisions on whether and how to hold large outdoor gatherings in the face of future public health crises.

“These results show that the measures adopted by the NFL were effective in safely opening stadiums,” says study author Anette “Peko” Hosoi, the Neil and Jane Pappalardo Professor of Mechanical Engineering at MIT. “If case counts start to rise again, we know what to do: mask people, put them outside, and distance them from each other.”

The study’s co-authors are members of MIT’s Institute for Data, Systems, and Society (IDSS), and include Bernardo García Bulle, Dennis Shen, and Devavrat Shah, the Andrew and Erna Viterbi Professor in the Department of Electrical Engineering and Computer Science (EECS).

Preseason patterns

Last year a group led by the University of Southern Mississippi compared Covid-19 case counts in the counties of NFL stadiums that allowed fans in, versus those that did not. Their analysis showed that stadiums that opened to large numbers of fans led to “tangible increases” in the local county’s number of Covid-19 cases.

But there are a number of factors in addition to a stadium’s opening that can affect case counts, including local policies, mandates, and attitudes. As the MIT team writes, “it is not at all obvious that one can attribute the differences in case spikes to the stadiums given the enormous number of confounding factors.”

To truly isolate the effects of a stadium’s opening, one could imagine tracking Covid cases in a county with an open stadium through the 2020 season, then turning back the clock, closing the stadium, then tracking that same county’s Covid cases through the same season, all things being equal.

“That’s the perfect experiment, with the exception that you would need a time machine,” Hosoi says.

As it turns out, the next best thing is synthetic control — a statistical method that is used to determine the effect of an “intervention” (such as the opening of a stadium) compared with the exact same scenario without that intervention.

In synthetic control, researchers use a weighted combination of groups to construct a “synthetic” version of an actual  scenario. In this case, the actual scenario is a county such as Dallas that hosts an open stadium. A synthetic version would be a county that looks similar to Dallas, only without a stadium. In the context of this study, a county that “looks” like Dallas has a similar preseason pattern of Covid-19 cases.

To construct a synthetic Dallas, the researchers looked for surrounding counties without stadiums, that had similar Covid-19 trajectories leading up to the 2020 football season. They combined these counties in a way that best fit Dallas’ actual case trajectory. They then used data from the combined counties to calculate the number of Covid cases for this synthetic Dallas through the season, and compared these counts to the real Dallas.
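A minimal synthetic-control sketch on made-up data (not the study's counties or code; real synthetic-control analyses typically also constrain the weights to sum to one): fit non-negative donor weights on the preseason window only, then project the counterfactual season trajectory and compare it with the observed one.

```python
# Illustrative synthetic-control sketch (toy data, not the study's analysis):
# fit non-negative donor weights on the preseason window, then project the
# "no-intervention" trajectory for the season and compare with observed cases.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
T_pre, T_post = 30, 17                        # preseason weeks / season weeks (toy sizes)

donors = rng.random((5, T_pre + T_post)).cumsum(axis=1)      # 5 donor counties' case curves
true_w = np.array([0.5, 0.3, 0.2, 0.0, 0.0])
treated = true_w @ donors + rng.normal(0, 0.05, T_pre + T_post)

# Weights chosen to match the treated county's PRESEASON trajectory only.
w, _ = nnls(donors[:, :T_pre].T, treated[:T_pre])

synthetic = w @ donors                        # counterfactual trajectory, all weeks
effect = treated[T_pre:] - synthetic[T_pre:]  # ~0 here -> no detectable stadium effect
print(np.round(effect.mean(), 3))
```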

The team carried out this analysis for every “stadium county.” They determined a county to be a stadium county if more than 10 percent of a stadium’s fans came from that county, which the researchers estimated based on attendance data provided by the NFL.

“Go outside”

Of the stadiums included in the study, 13 were closed through the regular season, while 16 opened with reduced capacity and multiple pandemic requirements in place, such as required masking, distanced seating, mobile ticketing, and enhanced cleaning protocols.

The researchers found the trajectory of infections in all stadium counties mirrored that of synthetic counties, showing that the number of infections would have been the same if the stadiums had remained closed. In other words, they found no evidence that NFL stadium openings led to any increase in local Covid case counts.

To check that their method wasn’t missing any case spikes, they tested it on a known superspreader event: the Sturgis Motorcycle Rally, held in August 2020. The analysis successfully picked up an increase in cases in Meade County, the rally’s host county, compared to a synthetic counterpart, in the two weeks following the rally.

Surprisingly, the researchers found that several stadium counties’ case counts dipped slightly compared to their synthetic counterparts. In these counties — including Hamilton, Ohio, home of the Cincinnati Bengals — it appeared that opening the stadium to fans was tied to a dip in Covid-19 infections. Hosoi has a guess as to why:

“These are football communities with dedicated fans. Rather than stay home alone, those fans may have gone to a sports bar or hosted indoor football gatherings if the stadium had not opened,” Hosoi proposes. “Opening the stadium under those circumstances would have been beneficial to the community because it makes people go outside.”

The team’s analysis also revealed another connection: Counties with similar Covid trajectories also shared similar politics. To illustrate this point, the team mapped the county-wide temporal trajectories of Covid case counts in Ohio in 2020 and found them to be a strong predictor of the state’s 2020 electoral map.

“That is not a coincidence,” Hosoi notes. “It tells us that local political leanings determined the temporal trajectory of the pandemic.”

The team plans to apply their analysis to see how other factors may have influenced the pandemic.

“Covid is a different beast [today],” she says. “Omicron is more transmissive, and more of the population is vaccinated. It’s possible we’d find something different if we ran this analysis on the upcoming season, and I think we probably should try.”

Reference: “Public health implications of opening National Football League stadiums during the COVID-19 pandemic” by Bernardo García Bulle, Dennis Shen, Devavrat Shah and Anette E. Hosoi, 22 March 2022, Proceedings of the National Academy of Sciences.
DOI: 10.1073/pnas.2114226119