Wednesday, March 30, 2022

How the Brain Makes Memories and Uses “Time Travel” To Remember

Neuroscience

Cedars-Sinai researchers discover neurons that separate experience into segments, then help the brain ‘time travel’ and remember.

In a study led by Cedars-Sinai, researchers have discovered two types of brain cells that play a key role in dividing continuous human experience into distinct segments that can be recalled later. The discovery offers a promising path toward new treatments for memory disorders such as dementia and Alzheimer’s disease.

The study, part of a multi-institutional BRAIN Initiative consortium funded by the National Institutes of Health and led by Cedars-Sinai, was published in the peer-reviewed journal Nature Neuroscience. As part of ongoing research into how memory works, Ueli Rutishauser, PhD, professor of Neurosurgery, Neurology, and Biomedical Sciences at Cedars-Sinai, and co-investigators looked at how brain cells react as memories are formed.

“One of the reasons we can’t offer significant help for somebody who suffers from a memory disorder is that we don’t know enough about how the memory system works,” said Rutishauser, senior author of the study, adding that memory is foundational to us as human beings.

How the Human Brain Stores and Retrieves Memories

Researchers have discovered two types of brain cells that play a key role in creating memories. Credit: Sarah Pyle for Cedars-Sinai Medical Center

Human experience is continuous, but psychologists believe, based on observations of people’s behavior, that memories are divided by the brain into distinct events, a concept known as event segmentation. Working with 19 patients with drug-resistant epilepsy, Rutishauser and his team were able to study how neurons perform during this process.

Patients participating in the study had electrodes surgically inserted into their brains to help locate the focus of their epileptic seizures, allowing investigators to record the activity of individual neurons while the patients viewed film clips that included cognitive boundaries.

While these boundaries in daily life are nuanced, for research purposes, the investigators focused on “hard” and “soft” boundaries.

“An example of a soft boundary would be a scene with two people walking down a hallway and talking, and in the next scene, a third person joins them, but it is still part of the same overall narrative,” said Rutishauser, interim director of the Center for Neural Science and Medicine and the Board of Governors Chair in Neurosciences at Cedars-Sinai.

In the case of a hard boundary, the second scene might involve a completely different set of people riding in a car. “The difference between hard and soft boundaries is in the size of the deviation from the ongoing narrative,” Rutishauser said. “Is it a totally different story, or like a new scene from the same story?”

When study participants watched film clips, investigators noted that certain neurons in the brain, which they labeled “boundary cells,” increased their activity after both hard and soft boundaries. Another group of neurons, labeled “event cells,” increased their activity only in response to hard boundaries, but not soft boundaries.

Rutishauser and his co-investigators theorize that peaks in the activity of boundary and event cells—which are highest after hard boundaries, when both types of cells fire—send the brain into the proper state for initiating a new memory.

“A boundary response is kind of like creating a new folder on your computer,” said Rutishauser. “You can then deposit files in there. And when another boundary comes around, you close the first folder and create another one.”

To retrieve memories, the brain uses boundary peaks as what Rutishauser calls “anchors for mental time travel.”

“When you try to remember something, it causes brain cells to fire,” Rutishauser said. “The memory system then compares this pattern of activity to all the previous firing peaks that happened shortly after boundaries. If it finds one that is similar, it opens that folder. You go back for a few seconds to that point in time, and things that happened then come into focus.”
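
The folder analogy can be made concrete with a small toy model. The Python sketch below is purely illustrative and is not the study’s analysis code: it assumes that firing patterns can be summarized as vectors, and that retrieval simply reopens the boundary-anchored pattern most similar to the current cue. All names in it are hypothetical.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two firing-rate vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def recall(cue, boundary_anchors):
    """Toy "mental time travel": compare a retrieval cue against the firing
    patterns stored just after each boundary and reopen the most similar
    "folder", returning its index and similarity score."""
    scores = [cosine_similarity(cue, anchor) for anchor in boundary_anchors]
    best = int(np.argmax(scores))
    return best, scores[best]

# Hypothetical example: three boundary-anchored activity patterns ("folders")
rng = np.random.default_rng(0)
anchors = [rng.random(50) for _ in range(3)]
cue = anchors[1] + 0.1 * rng.random(50)   # a noisy cue resembling folder 1
folder, score = recall(cue, anchors)
print(f"reopened folder {folder} (similarity {score:.2f})")
```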

To test their theory, investigators gave study participants two memory tests.

They first showed participants a series of still images and asked them whether or not they had seen them in the film clips they had viewed. Study participants were more likely to remember images that closely followed a hard or soft boundary, when a new “memory folder” would have been created.

Investigators also showed participants pairs of images from film clips they had viewed and asked which of the images appeared first. Participants had difficulty remembering the correct order of images that appeared on opposite sides of a hard boundary, possibly because the brain had segmented those images into separate memory folders.

Rutishauser said that therapies that improve event segmentation could help patients with memory disorders. Even something as simple as a change in atmosphere can amplify event boundaries, he explained.

“The effect of context is actually quite strong,” Rutishauser said. “If you study in a new place, where you have never been before, instead of on your couch where everything is familiar, you will create a much stronger memory of the material.”

The research team included postdoctoral fellow Jie Zheng, PhD, and neuroscientist Gabriel Kreiman, PhD, from Boston Children’s Hospital; neurosurgeon Taufik A. Valiante, MD, PhD, of the University of Toronto; and Adam Mamelak, MD, professor of Neurosurgery and director of the Functional Neurosurgery Program at Cedars-Sinai.

In follow-up studies, the team plans to test the theory that boundary and event cells activate dopamine neurons when they fire, and that dopamine, a chemical that sends messages between cells, might be used as a therapy to strengthen memory formation.

Rutishauser and his team also noted during this study that when event cells fired in time with one of the brain’s internal rhythms, the theta rhythm—a repetitive pattern of activity linked to learning, memory, and navigation—subjects were better able to remember the order of images they had seen. This is an important new insight because it suggests that deep brain stimulation that adjusts theta rhythms could prove therapeutic for memory disorders.

“Theta rhythms are thought to be the ‘temporal glue’ for episodic memory,” said Zheng, first author of the study. “We think that firing of event cells in synchrony with the theta rhythm builds time-based links across different memory folders.”

For more on this research, see Researchers Discover How the Human Brain Separates, Stores, and Retrieves Memories.

Reference: “Neurons detect cognitive boundaries to structure episodic memories in humans” by Jie Zheng, Andrea G. P. Schjetnan, Mar Yebra, Bernard A. Gomes, Clayton P. Mosher, Suneil K. Kalia, Taufik A. Valiante, Adam N. Mamelak, Gabriel Kreiman and Ueli Rutishauser, 7 March 2022, Nature Neuroscience.
DOI: 10.1038/s41593-022-01020-w

The study was funded by National Institutes of Health Grants number U01NS103792 and U01NS117839, National Science Foundation Grant number 8241231216, and Brain Canada.

MIT Researchers Create a Tool for Predicting the Future

Researchers design a user-friendly interface that helps nonexperts make forecasts using data collected over time.

Whether someone is trying to predict tomorrow’s weather, forecast future stock prices, identify missed opportunities for sales in retail, or estimate a patient’s risk of developing a disease, they will likely need to interpret time-series data, which are a collection of observations recorded over time.

Making predictions using time-series data typically requires several data-processing steps and the use of complex machine-learning algorithms, which have such a steep learning curve they aren’t readily accessible to nonexperts.

To make these powerful tools more user-friendly, MIT researchers developed a system that directly integrates prediction functionality on top of an existing time-series database. Their simplified interface, which they call tspDB (time series predict database), does all the complex modeling behind the scenes so a nonexpert can easily generate a prediction in only a few seconds.

Tool for Predicting the Future

MIT researchers created a tool that enables people to make highly accurate predictions using multiple time-series data with just a few keystrokes. The powerful algorithm at the heart of their tool can transform multiple time series into a tensor, which is a multi-dimensional array of numbers (pictured). Credit: Figure courtesy of the researchers and edited by MIT News

The new system is more accurate and more efficient than state-of-the-art deep learning methods when performing two tasks: predicting future values and filling in missing data points.

One reason tspDB is so successful is that it incorporates a novel time-series-prediction algorithm, explains electrical engineering and computer science (EECS) graduate student Abdullah Alomar, an author of a recent research paper in which he and his co-authors describe the algorithm. This algorithm is especially effective at making predictions on multivariate time-series data, which are data that have more than one time-dependent variable. In a weather database, for instance, temperature, dew point, and cloud cover each depend on their past values.

The algorithm also estimates the volatility of a multivariate time series to provide the user with a confidence level for its predictions.

“Even as the time-series data becomes more and more complex, this algorithm can effectively capture any time-series structure out there. It feels like we have found the right lens to look at the model complexity of time-series data,” says senior author Devavrat Shah, the Andrew and Erna Viterbi Professor in EECS and a member of the Institute for Data, Systems, and Society and of the Laboratory for Information and Decision Systems.

Joining Alomar and Shah on the paper is lead author Anish Agarwal, a former EECS graduate student who is currently a postdoc at the Simons Institute at the University of California at Berkeley. The research will be presented at the ACM SIGMETRICS conference.

Adapting a new algorithm

Shah and his collaborators have been working on the problem of interpreting time-series data for years, adapting different algorithms and integrating them into tspDB as they built the interface.

About four years ago, they learned about a particularly powerful classical algorithm, called singular spectrum analysis (SSA), that imputes and forecasts single time series. Imputation is the process of replacing missing values or correcting past values. The algorithm required manual parameter selection, but the researchers suspected it could enable their interface to make effective predictions using time-series data. In earlier work, they had removed the need for that manual intervention.

The single-time-series algorithm transforms the series into a matrix and applies matrix estimation procedures. The key intellectual challenge was how to adapt it to handle multiple time series. After a few years of struggle, the researchers realized the answer was something very simple: “stack” the matrices for each individual time series, treat the result as one big matrix, and then apply the single-time-series algorithm to it.

This naturally utilizes information across multiple time series — both across the series and across time — as they describe in their new paper.
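
As a rough sketch of that stacking idea, the Python below arranges each series into a matrix of non-overlapping windows, stacks the matrices side by side, and uses a truncated SVD as a stand-in for the matrix estimation step. The window length, rank, and data are hypothetical choices for illustration; the full mSSA algorithm adds principled parameter selection and a regression step for forecasting future values.

```python
import numpy as np

def page_matrix(series, L):
    """Arrange a time series into non-overlapping length-L columns."""
    T = (len(series) // L) * L
    return series[:T].reshape(-1, L).T          # shape (L, T // L)

def mssa_denoise(series_list, L=25, rank=5):
    """Sketch of the "stacking" idea: build one page matrix per series,
    stack them into a single big matrix, and apply low-rank matrix
    estimation (here a truncated SVD) jointly across all series."""
    stacked = np.hstack([page_matrix(s, L) for s in series_list])
    U, s, Vt = np.linalg.svd(stacked, full_matrices=False)
    low_rank = U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank, :]
    # Split the de-noised matrix back into one estimate per series.
    n_cols = stacked.shape[1] // len(series_list)
    return [low_rank[:, i * n_cols:(i + 1) * n_cols].T.ravel()
            for i in range(len(series_list))]

# Hypothetical example: two noisy series that share the same latent structure.
rng = np.random.default_rng(0)
t = np.arange(500)
noisy = [np.sin(2 * np.pi * t / 50) + rng.normal(0, 0.3, 500),
         np.cos(2 * np.pi * t / 50) + rng.normal(0, 0.3, 500)]
estimates = mssa_denoise(noisy)    # smoothed estimates of both series
```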

The recent publication also discusses an interesting alternative: instead of transforming the multivariate time series into one big matrix, it can be viewed as a three-dimensional tensor. A tensor is a multi-dimensional array, or grid, of numbers. This established a promising connection between the classical field of time-series analysis and the growing field of tensor estimation, Alomar says.

“The variant of mSSA that we introduced actually captures all of that beautifully. So, not only does it provide the most likely estimation, but a time-varying confidence interval, as well,” Shah says.

The simpler, the better

They tested the adapted mSSA against other state-of-the-art algorithms, including deep-learning methods, on real-world time-series datasets with inputs drawn from the electricity grid, traffic patterns, and financial markets.

Their algorithm outperformed all the others on imputation and it outperformed all but one of the other algorithms when it came to forecasting future values. The researchers also demonstrated that their tweaked version of mSSA can be applied to any kind of time-series data.

“One reason I think this works so well is that the model captures a lot of time series dynamics, but at the end of the day, it is still a simple model. When you are working with something simple like this, instead of a neural network that can easily overfit the data, you can actually perform better,” Alomar says.

The impressive performance of mSSA is what makes tspDB so effective, Shah explains. Now, their goal is to make this algorithm accessible to everyone.

Once a user installs tspDB on top of an existing database, they can run a prediction query with just a few keystrokes in about 0.9 milliseconds, as compared to 0.5 milliseconds for a standard search query. The confidence intervals are also designed to help nonexperts make more informed decisions by incorporating the degree of uncertainty of the predictions into their decision making.
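
The article does not reproduce tspDB’s actual query syntax, so the sketch below is hypothetical: the table names and the unexecuted “forecast” query are invented placeholders, meant only to show that a prediction query can read like an ordinary lookup pointed at a future or missing timestamp.

```python
import sqlite3

# Hypothetical sketch: the table names and the "forecast" query below are
# invented placeholders, not tspDB's real syntax.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE weather (ts TEXT, temperature REAL)")
conn.execute("INSERT INTO weather VALUES ('2022-03-30 12:00', 14.2)")

# A standard search query retrieves a stored value (about 0.5 ms, per the article):
lookup = "SELECT temperature FROM weather WHERE ts = '2022-03-30 12:00'"
print(conn.execute(lookup).fetchone())

# A prediction query would read almost identically, except the timestamp lies
# in the future or at a missing point (about 0.9 ms, per the article):
forecast = "SELECT temperature FROM weather_forecast WHERE ts = '2022-04-06 12:00'"
```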

For instance, the system could enable a nonexpert to predict future stock prices with high accuracy in just a few minutes, even if the time-series dataset contains missing values.

Now that the researchers have shown why mSSA works so well, they are targeting new algorithms that can be incorporated into tspDB. One of these algorithms utilizes the same model to automatically enable change point detection, so if the user believes their time series will change its behavior at some point, the system will automatically detect that change and incorporate that into its predictions.

They also want to continue gathering feedback from current tspDB users to see how they can improve the system’s functionality and user-friendliness, Shah says.

“Our interest at the highest level is to make tspDB a success in the form of a broadly utilizable, open-source system. Time-series data are very important, and this is a beautiful concept of actually building prediction functionalities directly into the database. It has never been done before, and so we want to make sure the world uses it,” he says.

“This work is very interesting for a number of reasons. It provides a practical variant of mSSA which requires no hand tuning, they provide the first known analysis of mSSA, and the authors demonstrate the real-world value of their algorithm by being competitive with or out-performing several known algorithms for imputations and predictions in (multivariate) time series for several real-world data sets,” says Vishal Misra, a professor of computer science at Columbia University who was not involved with this research. “At the heart of it all is the beautiful modeling work where they cleverly exploit correlations across time (within a time series) and space (across time series) to create a low-rank spatiotemporal factor representation of a multivariate time series. Importantly this model connects the field of time series analysis to that of the rapidly evolving topic of tensor completion, and I expect a lot of follow-on research spurred by this paper.”

Reference: “On Multivariate Singular Spectrum Analysis and its Variants” by Anish Agarwal, Abdullah Alomar and Devavrat Shah, 13 February 2021, arXiv.
arXiv:2006.13448

Your Gut Microbiome Shapes Your Life. But Where Did It Come From?

Microbiome Acquisition

Artist’s concept of this research that probes the odds of a new species being acquired by a microbiome ecosystem. Credit: Navid Marvi, courtesy of the Carnegie Institution for Science

The role of chance in microbiome composition.

The gut microbiome is an ecosystem of hundreds to thousands of microbial species living within the human body. These populations affect our health, fertility, and even our longevity. But how do they get there in the first place?

New collaborative work led by Carnegie’s William Ludington reveals crucial details about how the bacterial communities that comprise each of our individual gut microbiomes are acquired. These findings, published in the Proceedings of the National Academy of Sciences, have major implications for treatments such as fecal transplants and probiotic administration.

“There is a huge amount of variation in microbiome composition between individuals,” Ludington explained. “For example, if you look at the sum total of all of the bacterial species that are adapted to live in the gastrointestinal systems of humans, most of these are not present in a majority of people. That’s how incredibly diverse these gut microbial populations are.”

A combination of elements, including genetics, diet, and environment contribute to the differences between our microbiomes. But there isn’t a direct line between these inputs and the species that successfully colonize our guts. There’s an element of chance at play each time we are exposed to a new microbe as to whether it will be acquired and become a member of our gut ecosystem. Ludington and his collaborators set out to understand the factors that shape the odds of this colonization process.

Although many researchers have studied microbiome compositions in natural populations, there have been few attempts to use a controlled environment to reveal the process by which new species successfully join the gut microbial ecosystem. Ludington and his collaborators—Eric Jones and David Sivak of Simon Fraser University and Jean Carlson of UC Santa Barbara—developed a new ecological model to understand how we acquire the specific mix of microbes that are individual to our own particular gut communities.  

Working in the comparatively much simpler microbiomes of fruit flies, the team showed that exposure to a microbial species does not guarantee its successful incorporation into the microbiome ecosystem. They found that the state of the microbiome, and the interactions between existing member species, set the odds of whether a newly encountered bacterium is added into the mix.

“Even among genetically identical flies that lived in the same housing and were fed the same diets, we saw variations in microbiome composition,” Sivak said.

The researchers then used these results to build mathematical models that could probe increasingly complex scenarios by which new microbiome species could be acquired, leading to their breakthrough understanding of the community factors that shape membership in the microbiome ecosystem.
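
A toy version of this chance-plus-interactions picture can be written as a logistic model in which resident species tilt a newcomer’s colonization odds. The Python sketch below is an invented illustration, not the model from the paper; the baseline probability and interaction weights are hypothetical.

```python
import numpy as np

def colonization_prob(community, interactions, base=0.3):
    """Toy rule: a newcomer's colonization odds depend on which resident
    species are present. Positive weights mean a resident facilitates the
    newcomer; negative weights mean it inhibits colonization."""
    logit = np.log(base / (1 - base)) + interactions @ community
    return 1 / (1 + np.exp(-logit))

rng = np.random.default_rng(1)
interactions = np.array([1.5, -1.0, 0.5])  # hypothetical effects of residents A, B, C
community = np.array([1, 0, 1])            # residents A and C are present

# Expose 1,000 identical hosts carrying the same residents to the same
# newcomer: each exposure is still a weighted coin flip.
p = colonization_prob(community, interactions)
outcomes = rng.random(1000) < p
print(f"colonization probability {p:.2f}; observed rate {outcomes.mean():.2f}")
```

Even with a fixed community and fixed weights, individual exposures still succeed or fail at random, mirroring the variation the team observed among genetically identical flies.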

“Think of microbiome composition as a big party where the social dynamics determine who leaves early and who stays until dawn,” said Ludington.

Added Jones, the paper’s first author: “Bacterial colonization depends on a number of complicated factors that we’re just starting to understand.  We showed, for example, that some groups of species facilitate each other’s colonization and are therefore more likely to coexist.”

These group interactions have exciting implications for how microbiomes are transmitted between individuals, including how medical professionals might drive a person’s microbiome towards a desired composition.

“The beauty of the mathematical approach we deployed is that it acknowledges that colonization is a roll of the dice, but we are now able to attribute the weighting of the dice to biological interactions with a molecular basis that has been honed by evolution,” said Carlson.

The team’s findings provide a framework for quantitatively examining the mechanisms that therapies such as fecal transplants and probiotics depend upon, advancing toward the eventual goal of personalized microbiome medicine.        

Reference: “Stochastic microbiome assembly depends on context” by Eric W. Jones, Jean M. Carlson, David A. Sivak and William B. Ludington, 8 February 2022, Proceedings of the National Academy of Sciences.
DOI: 10.1073/pnas.2115877119

This work was supported by the Banting Postdoctoral Fellowship, the Pacific Institute for the Mathematical Sciences Postdoctoral Fellowship, the David and Lucile Packard Foundation, the Institute for Collaborative Biotechnologies, the U.S. Army Research Office, a Natural Sciences and Engineering Research Council of Canada Discovery Grant, the Canada Research Chairs program, a National Institutes of Health grant, National Science Foundation Integrative Organismal Systems award, the Carnegie Institution for Science endowment, and a Carnegie Institution of Canada grant.

Research Shows NFL Stadium Openings in 2020 Had No Impact on Local COVID-19 Infections

NFL Football Stadium

The findings may inform decisions on holding large outdoor gatherings amid future public health crises.

As with most everything in the world, football looked very different in 2020. As the Covid-19 pandemic unfolded, many National Football League (NFL) games were played in empty stadiums, while other stadiums opened to fans at significantly reduced capacity, with strict safety protocols in place.

At the time it was unclear what impact such large sporting events would have on Covid-19 case counts, particularly at a time when vaccination against the virus was not widely available.

Now, MIT engineers have taken a look back at the NFL’s 2020 regular season and found that for this specific period during the pandemic, opening stadiums to fans while requiring face coverings, social distancing, and other measures had no impact on the number of Covid-19 infections in those stadiums’ local counties.

As they write in a new paper published in the Proceedings of the National Academy of Sciences, “the benefits of providing a tightly controlled outdoor spectating environment — including masking and distancing requirements — counterbalanced the risks associated with opening.”

NFL Stadium Openings Impact COVID

An MIT study finds NFL stadium openings had no impact on local Covid-19 infections during the 2020 season. Credit: Jose-Luis Olivares, MIT

The study concentrates on the NFL’s 2020 regular season (September 2020 to early January 2021), at a time when earlier strains of the virus dominated, before the rise of more transmissible Delta and Omicron variants. Nevertheless, the results may inform decisions on whether and how to hold large outdoor gatherings in the face of future public health crises.

“These results show that the measures adopted by the NFL were effective in safely opening stadiums,” says study author Anette “Peko” Hosoi, the Neil and Jane Pappalardo Professor of Mechanical Engineering at MIT. “If case counts start to rise again, we know what to do: mask people, put them outside, and distance them from each other.”

The study’s co-authors are members of MIT’s Institute for Data, Systems, and Society (IDSS), and include Bernardo García Bulle, Dennis Shen, and Devavrat Shah, the Andrew and Erna Viterbi Professor in the Department of Electrical Engineering and Computer Science (EECS).

Preseason patterns

Last year a group led by the University of Southern Mississippi compared Covid-19 case counts in the counties of NFL stadiums that allowed fans in, versus those that did not. Their analysis showed that stadiums that opened to large numbers of fans led to “tangible increases” in the local county’s number of Covid-19 cases.

But there are a number of factors in addition to a stadium’s opening that can affect case counts, including local policies, mandates, and attitudes. As the MIT team writes, “it is not at all obvious that one can attribute the differences in case spikes to the stadiums given the enormous number of confounding factors.”

To truly isolate the effects of a stadium’s opening, one could imagine tracking Covid cases in a county with an open stadium through the 2020 season, then turning back the clock, closing the stadium, then tracking that same county’s Covid cases through the same season, all things being equal.

“That’s the perfect experiment, with the exception that you would need a time machine,” Hosoi says.

As it turns out, the next best thing is synthetic control — a statistical method that is used to determine the effect of an “intervention” (such as the opening of a stadium) compared with the exact same scenario without that intervention.

In synthetic control, researchers use a weighted combination of groups to construct a “synthetic” version of an actual scenario. In this case, the actual scenario is a county such as Dallas that hosts an open stadium. A synthetic version would be a county that looks similar to Dallas, only without a stadium. In the context of this study, a county that “looks” like Dallas has a similar preseason pattern of Covid-19 cases.

To construct a synthetic Dallas, the researchers looked for surrounding counties without stadiums, that had similar Covid-19 trajectories leading up to the 2020 football season. They combined these counties in a way that best fit Dallas’ actual case trajectory. They then used data from the combined counties to calculate the number of Covid cases for this synthetic Dallas through the season, and compared these counts to the real Dallas.
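
At its core, synthetic control is a constrained least-squares fit on the pre-intervention period. The Python sketch below uses made-up weekly case counts; the non-negative weights summing to one are the method’s standard formulation, though the paper’s exact implementation may differ.

```python
import numpy as np
from scipy.optimize import minimize

def synthetic_control(pre_treated, pre_donors):
    """Find non-negative donor weights, summing to one, that best reproduce
    the treated county's pre-season trajectory."""
    n = pre_donors.shape[1]
    loss = lambda w: np.sum((pre_treated - pre_donors @ w) ** 2)
    constraint = {"type": "eq", "fun": lambda w: w.sum() - 1}
    result = minimize(loss, np.full(n, 1 / n),
                      bounds=[(0, 1)] * n, constraints=[constraint])
    return result.x

# Hypothetical data: weekly Covid case counts for 12 pre-season weeks
# plus 18 season weeks, in five donor counties with no stadium.
rng = np.random.default_rng(1)
donors = rng.poisson(100, size=(30, 5)).astype(float)
treated = donors[:, :3].mean(axis=1) + rng.normal(0, 2, 30)

w = synthetic_control(treated[:12], donors[:12])
counterfactual = donors @ w               # "synthetic" county, full season
gap = treated[12:] - counterfactual[12:]  # estimated effect of opening
print("mean post-period gap:", round(float(gap.mean()), 2))
```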

The team carried out this analysis for every “stadium county.” They determined a county to be a stadium county if more than 10 percent of a stadium’s fans came from that county, which the researchers estimated based on attendance data provided by the NFL.

“Go outside”

Of the stadiums included in the study, 13 were closed through the regular season, while 16 opened with reduced capacity and multiple pandemic requirements in place, such as required masking, distanced seating, mobile ticketing, and enhanced cleaning protocols.

The researchers found the trajectory of infections in all stadium counties mirrored that of synthetic counties, showing that the number of infections would have been the same if the stadiums had remained closed. In other words, they found no evidence that NFL stadium openings led to any increase in local Covid case counts.

To check that their method wasn’t missing any case spikes, they tested it on a known superspreader event: the Sturgis Motorcycle Rally, held in August 2020. The analysis successfully picked up an increase in cases in Meade County, South Dakota, the rally’s host county, compared to a synthetic counterpart, in the two weeks following the rally.

Surprisingly, the researchers found that several stadium counties’ case counts dipped slightly compared to their synthetic counterparts. In these counties — including Hamilton, Ohio, home of the Cincinnati Bengals — it appeared that opening the stadium to fans was tied to a dip in Covid-19 infections. Hosoi has a guess as to why:

“These are football communities with dedicated fans. Rather than stay home alone, those fans may have gone to a sports bar or hosted indoor football gatherings if the stadium had not opened,” Hosoi proposes. “Opening the stadium under those circumstances would have been beneficial to the community because it makes people go outside.”

The team’s analysis also revealed another connection: Counties with similar Covid trajectories also shared similar politics. To illustrate this point, the team mapped the county-wide temporal trajectories of Covid case counts in Ohio in 2020 and found them to be a strong predictor of the state’s 2020 electoral map.

“That is not a coincidence,” Hosoi notes. “It tells us that local political leanings determined the temporal trajectory of the pandemic.”

The team plans to apply their analysis to see how other factors may have influenced the pandemic.

“Covid is a different beast [today],” she says. “Omicron is more transmissive, and more of the population is vaccinated. It’s possible we’d find something different if we ran this analysis on the upcoming season, and I think we probably should try.”

Reference: “Public health implications of opening National Football League stadiums during the COVID-19 pandemic” by Bernardo García Bulle, Dennis Shen, Devavrat Shah and Anette E. Hosoi, 22 March 2022, Proceedings of the National Academy of Sciences.
DOI: 10.1073/pnas.2114226119

Power of Grassroots Innovation: How a Nondescript Box Has Been Saving Lives During the COVID Pandemic

Do-It-Yourself Air Purifier

A do-it-yourself air purifier in use in a classroom. Credit: Douglas Hannah, CC BY-ND

One morning, a dozen Arizona State University students gathered to cut cardboard, tape fans, and assemble filters in an effort to build 125 portable air purifiers for local schools. That same morning, staff members at a homeless shelter in Los Angeles were setting up 20 homemade purifiers of their own, while in Brookline, Massachusetts, another DIY air purifier was whirring quietly in the back of a daycare classroom as children played.

The technology in all three cases – an unassuming duct tape-and-cardboard construction known as a Corsi-Rosenthal box – is playing an important part in the fight against COVID-19. The story of how it came to be also reveals a lot about communities as sources of innovation and resilience in the face of disasters.

A simple technology with a big effect

As it became clear that COVID-19 spread through airborne transmission, people started wearing masks and building managers rushed to upgrade their ventilation systems. This typically meant installing high-efficiency particulate air (HEPA) filters. These filters work by capturing virus-laden particles: air is forced through a porous mat, contaminants are filtered out, and clean air passes through.

The efficacy of a building’s ventilation system is governed by two factors, though, not just the quality of the filters. The amount of air moved through the ventilation system matters as well. Experts typically recommend five to six air changes per hour in shared spaces, meaning the entire volume of air in a room is replaced every 10 to 12 minutes. Systems in many older buildings can’t manage this volume, however.
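
The arithmetic behind those targets is straightforward, as the short sketch below shows. The room dimensions and the clean-air delivery rate (CADR) are hypothetical example values, not figures from the article.

```python
def air_changes_per_hour(cadr_cfm, room_volume_ft3):
    """Air changes per hour (ACH) from a clean-air delivery rate in cubic
    feet per minute and a room volume in cubic feet."""
    return cadr_cfm * 60 / room_volume_ft3

def minutes_per_air_change(ach):
    return 60 / ach

room = 30 * 30 * 9          # a hypothetical 30 x 30 ft classroom, 9 ft ceiling
print(air_changes_per_hour(600, room))   # ~4.4 ACH from a 600 cfm purifier
print(minutes_per_air_change(5))         # 5 ACH -> air replaced every 12 minutes
print(minutes_per_air_change(6))         # 6 ACH -> air replaced every 10 minutes
```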

Portable air filters are an option for augmenting ventilation systems, but they typically cost hundreds of dollars, which puts them out of range for schools and other public spaces that face budget constraints.

This is where the Corsi-Rosenthal box comes in. It’s a cube consisting of four to five off-the-shelf furnace filters topped by a standard box fan blowing outward. Once sealed together with tape, it can sit on a floor, shelf or table. The fan draws air through the sides of the cube and out the top. The units are simple, durable, and easy to make, and are more effective than simply placing a single filter in front of a box fan. It usually takes 40 minutes, minimal technical expertise, and US$60 to $90 in materials that are available from any home supply store.

Building a Corsi-Rosenthal Box Portable Air Filter

Building a Corsi-Rosenthal box portable air filter comes down to duct-taping together a set of furnace filters and a box fan. Credit: Douglas Hannah, CC BY-ND

Despite this simplicity, though, these homemade units are extremely effective. When used in a shared space like a classroom or hospital ward, they can supplement existing ventilation and remove airborne contaminants, including smoke and virus-laden particles. A raft of recent peer-reviewed research has found portable air purifiers can dramatically reduce aerosol transmission. Other preprint and under-review studies have found Corsi-Rosenthal boxes perform as well as professional units at a fraction of the cost.

Origins of the Corsi-Rosenthal box

The formal story of the Corsi-Rosenthal box began in August 2020, when Richard Corsi, an air quality expert and now dean at the University of California, Davis, pitched the idea of building cheap box-fan air filters on Twitter. Jim Rosenthal, the CEO of a Texas-based filter company, had been playing around with a similar idea and quickly built the first prototype.

Within days, tinkerers and air quality engineers alike were constructing their own Corsi-Rosenthal boxes and sharing the results on social media. A vibrant conversation emerged on Twitter, blending sophisticated technical analysis from engineers with the insight and efforts of nonspecialists.

By December, hundreds of people were making Corsi-Rosenthal boxes, and thousands more had read press coverage in outlets like Wired. In different corners of the world, people tweaked designs based on the availability of supplies and different needs. Their collective improvements and adaptations were documented by dedicated websites and blogs, as well as news reports.

In some cases, design tweaks proved to be influential. In November 2020, for example, a homeowner in North Carolina discovered an issue with air being drawn back in through the corners of the most commonly used square fans. Subsequent testing by air quality experts showed that adding a shroud to the fan increased efficiency by as much as 50%.

Weekly tweets of the hashtag #corsirosenthalbox in 2021.

Analyzing social media and news coverage gives a sense of the scale of the Corsi-Rosenthal box phenomenon. As of January 2022, more than 1,000 units were in use in schools, with thousands more in homes and offices. More than 3,500 people had used the hashtag #corsirosenthalbox on Twitter, and tens of thousands more contributed to the online conversation. News articles and explainer videos on YouTube had collectively accumulated more than 1.9 million views.

Communities as sources of innovation

The story of the Corsi-Rosenthal box is part of a broader story of the grassroots response to the COVID-19 pandemic. The early days of the pandemic did more than just take a terrible toll on people. They also galvanized a massive entrepreneurial effort, with tens of thousands of everyday citizens lending their hands to design and produce the critical medical supplies and personal protective equipment that was suddenly needed.

Assembled Corsi-Rosenthal Boxes

Corsi-Rosenthal boxes assembled and awaiting delivery to a homeless shelter in California. Credit: Douglas Hannah, CC BY-ND

My research team has been tracking these efforts. Through dozens of interviews and months of archival research, we’ve built a database of more than 200 startups – formal and informal, nonprofit and for-profit – whose activities ranged from designing oxygen concentrators to 3D printing face shields to building UV disinfection rooms. The picture of innovation that emerges is a far cry from the traditional lab coats and middle managers image that is commonly associated with new technologies.

First, few of the innovations we’ve tracked were actually invented by a single person, or even a single team. Rather, they were the joint project of broad networks of individual contributors from different backgrounds and organizations. This breadth is important because it brings more knowledge and more diverse perspectives. It can also be helpful for tapping existing knowledge. For example, as Corsi-Rosenthal boxes gained traction, the community was able to draw on earlier iterations that had been developed to help with wildfire smoke.

Second, the innovation process lacked hierarchical control. There was no single person directing where or how the technology was used. This lack of control made it easier to experiment and adapt to local conditions. One example is the development of oxygen concentrators for use in hospitals in India. Realizing that existing Western technologies failed frequently in the more humid operating environment typical of India, teams of innovators rallied to develop and share improved open-source designs.

Third, these communities shared knowledge online. This allowed individual contributors to communicate directly and share ideas, which helped knowledge spread rapidly through the network. It also meant that knowledge was more readily accessible. The detailed designs and test results from air quality engineers working on Corsi-Rosenthal boxes were readily available to anyone in the community.

Also, most of the organizations we tracked used Facebook, Twitter and Slack as tools to manage collaboration within and between organizations. As I and others have argued, this gives grassroots innovation tremendous promise – especially in a world where large-scale disruptions like a pandemic are increasingly common.

Pitfalls of grassroots innovation

Despite this promise, there are areas in which grassroots innovation communities falter. One challenge is a lack of technological sophistication and resources. While some of the communities in our study produced remarkably complex devices, the greatest contribution was in far simpler products like face shields and surgical gowns.

Then there are rules and regulations. Even when grassroots communities can produce safe and effective innovations, existing rules may not be ready to receive them. Some hospitals were unable to accept personal protective equipment provided by the community during the pandemic because of inflexible procurement policies, and today some schools continue to prohibit Corsi-Rosenthal boxes.

A final issue is sustaining effort. While grassroots communities were vital to allowing hospitals and medical facilities to remain functioning during the early days of the pandemic, many of the efforts that depended on volunteer labor eventually ran out of steam.

What this means for the future

As the second anniversary of the U.S. declaration of emergency approaches, a key lesson the world has learned is the importance of investing in indoor air quality, for example through monitoring and improved ventilation and filtration. And the value of ventilation as a noninvasive public health tool is even greater as mask mandates wane.

Another, broader lesson is the power of grassroots innovation and citizen engineering to develop these technologies. The story of the Corsi-Rosenthal box, like the thousands of other grassroots innovations developed during the pandemic, is fundamentally about people taking the welfare of their communities into their own hands. The most popular tweet shared about Corsi-Rosenthal boxes was from a 14-year-old aspiring engineer in Ontario offering to build and donate boxes to anyone in need.

Written by Douglas Hannah, Assistant Professor of Strategy and Innovation, Boston University.

This article was first published in The Conversation.

Scientists Find Ways To Reconstruct Ancient Scents

Past Scents Can Be Recovered

Past scents can be recovered from scent archives by extracting molecules using a variety of different methods. With successful detection and identification of the molecular composition of samples, ancient sources of smell can then be identified. Credit: Michelle O’Reilly

In recent years, millions of people worldwide have suffered the loss of smell due to COVID-19. Even those who have avoided infection with the new coronavirus experience the world of scent differently now due to the very masks that provide protection from the virus. This loss of olfaction has highlighted the important role of smell in how we perceive and navigate the world, and underscored the connections between olfaction and mental and physical health.

Scent has always been an integral component of the human experience, but up until now, the past has remained largely odorless. Most scents come from organic substances that decay quickly, leaving little for archaeologists to investigate thousands of years later. Now a team of researchers from the MPI for the Science of Human History is looking for new ways to bring the smellscapes of the past back to life and using smell to study past experience, behavior, and society.

“Tracking scent in the deep past is not a simple task,” says Barbara Huber, the lead author of the paper, “but the fact that history records expeditions of discovery, wars, and long-distance exchange to acquire materials with strong olfactory properties – like incense and spices – reveals how significant scent has been for humankind.”

Scent Data Can Be Contextualized

Scent data can be contextualized within the framework of relevant historical texts and visual representations, as well as the archaeological and environmental records. Connecting these different sources of information enables the reconstruction of critical aspects of ancient lifeways and society (indicated by arrows emerging from the blue circle). The icons in the outer area of the figure represent examples of some of the themes that can be addressed through this approach (grey boxes indicate broad headings). Credit: Michelle O’Reilly

Understanding the sensorial dimension of human history and the use of odorous and aromatic substances can contribute knowledge about many aspects of the past – including ritual, perfumery, hygiene, cuisine, trade, and commerce. But because scent is part of how we experience, understand, and navigate the world, ancient scents can also provide insight into more general aspects of the past, from social hierarchy and social practices to group identity.

“Scent is a powerful and underappreciated aspect of human experience,” notes Professor Nicole Boivin, senior author of the study and Director of the Department of Archaeology at the MPI for the Science of Human History. “Smells reach our brain fairly directly and motivate us in critical ways – whether to avoid danger, identify something that is good for us, or remember something from our past, for example.”

“Using only traces of scented substances preserved in archaeological artifacts and features,” adds Huber, “novel methods are revealing the powerful odors that were a cardinal feature of ancient lived realities, and that shaped human action, thoughts, emotions, and memories.”

By leveraging potent new biomolecular and omics approaches, such as proteomics and metabolomics techniques, and linking the new data with information from ancient texts, visual depictions, and the broader archaeological and environmental records, researchers stand to open up new aspects of the ancient world, our changing societies and cultures, and our evolution as a species. The authors of the new paper hope that more research into the rich ‘smellscapes’ of the past will provide insight into the sensory worlds of long ago, and the diverse ways that people have captured scents from nature in order to shape human experience.

Reference: “How to use modern science to reconstruct ancient scent” by Barbara Huber et al., 28 March 2022, Nature Human Behaviour.
DOI: 10.1038/s41562-022-01325-7