Wednesday, January 18, 2023

New Short Course & Symposium to explore first principles of collective intelligence

Meeting Announcement

SANTA FE INSTITUTE

Collective Intelligence Symposium & Short Course on June 20–23, 2023 

IMAGE: SFI WILL HOST A THREE-DAY COLLECTIVE INTELLIGENCE SYMPOSIUM & SHORT COURSE ON JUNE 20–23, 2023, FOCUSING ON FOUNDATIONAL IDEAS LIKE FIRST PRINCIPLES TO HELP ESTABLISH A RIGOROUS APPROACH TO THE STUDY OF COLLECTIVE INTELLIGENCE.

CREDIT: AI-GENERATED ART, CREATED BY LAURA EGLEY TAYLOR USING DIFFUSION BEE.

When collections of cells or ant colonies or human societies make decisions as a group, a kind of intelligence that surpasses any individual’s capabilities can arise. The Santa Fe Institute will host a three-day Collective Intelligence Symposium & Short Course on June 20–23, 2023, focusing on foundational ideas like first principles to help establish a rigorous approach to the study of collective intelligence. The event will also leap into unexplored possibilities through a Radical Ideas competition. Applications are required for all participants, and the priority deadline is February 1, 2023.

Many people, from scientists and AI researchers to policymakers, have a growing interest in collective intelligence. “For the first time in human history, we’re in a position to analyze the microscopic, individual-level data to understand how that produces collective effects,” says SFI Professor Jessica Flack, organizer of the Symposium & Short Course. “One of the goals of the meeting is to focus on the measures that we use to study collective intelligence and justify them by showing that they can be derived from individual-level behavior.”

Most of the research on collective behavior is rooted in statistical physics, biology, and dynamical systems — fields that prioritize first-principles thinking. “Conceptually, collective behavior encapsulates collective intelligence,” says Flack. “But in practice, much of the collective intelligence work is coming out of places like business and management schools where first-principles thinking is not taught.” 

The Symposium & Short Course features a series of plenary talks and interactive panel discussions, followed by evening poster presentations, whiteboard sessions, and other gatherings. Day one establishes foundational principles that are as true for adaptive matter and cells as they are for economic systems and artificial intelligence. Day two builds on that foundation to explore the nature of intelligence, and how we might best measure it, in individuals, collectives, and AI. The final day covers how groups respond to changing environments, and concludes with a Radical Ideas competition. 

All applicants to the Symposium & Short Course may submit an idea for the competition. SFI welcomes creative ideas for thinking about and harnessing collective intelligence. “The more out of the box, the better,” says Flack, adding that ideas may be shared in any medium. “It could be a painting, drawing, computer code, writing, film…you just have to be able to present it.” 

Anyone interested in participating in the Symposium & Short Course is encouraged to apply early, as space is limited. Priority applications for audience members, poster presenters, and entry in the Radical Ideas competition close on February 1, 2023.

New modelling shows how interrupted flows in Australia's Murray River endanger frogs

Australian scientists develop virtual models to reveal a crucial link between natural flooding and the extinction risk of endangered frogs

Peer-Reviewed Publication

FLINDERS UNIVERSITY

Flooding in Australia's Murray-Darling Basin is creating ideal breeding conditions for many native species that have evolved to take advantage of temporary flood conditions.

Australian scientists have now developed virtual models of the Murray River to reveal a crucial link between natural flooding and the extinction risk of endangered southern bell frogs (Litoria raniformis; also known as growling grass frogs).

Southern bell frogs are one of Australia’s 100 Priority Threatened Species. This endangered frog breeds during spring and summer when water levels increase in their wetlands. However, the natural flooding patterns in Australia’s largest river system have been negatively impacted by expansive river regulation that, in some years, sees up to 60% of river water extracted for human use.

Flinders University PhD Candidate and South Australian frog expert Rupert Mathwin and his colleagues have built computer simulations of Murray-Darling Basin wetlands filled with simulated southern bell frogs.

By changing the simulation from natural to regulated conditions, they show that modern conditions dramatically increase the extinction risk of these beloved frogs.
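The team’s published model is far more detailed, but the underlying logic can be illustrated with a minimal, purely hypothetical sketch: a simulated wetland in which frogs only recruit in flood years, so that reducing flood frequency makes runs of dry years, and therefore local extinction, more likely. Every parameter below is an illustrative placeholder, not a value from the study.

import random

# Minimal illustrative sketch, not the authors' published population model:
# a single wetland where southern bell frogs recruit only in flood ("wet") years.
# All parameter values are hypothetical placeholders.

def extinction_risk(p_wet, years=50, capacity=400, start=200,
                    dry_year_survival=0.5, wet_year_growth=1.5, runs=2000):
    """Fraction of simulated wetlands that go locally extinct within `years`."""
    extinct = 0
    for _ in range(runs):
        n = start
        for _ in range(years):
            if random.random() < p_wet:
                n = min(capacity, int(n * wet_year_growth))   # breeding pulse after inundation
            else:
                n = int(n * dry_year_survival)                # no recruitment; population shrinks
            if n == 0:
                extinct += 1
                break
    return extinct / runs

# Lowering p_wet stands in for river regulation reducing natural flood frequency;
# a smaller `capacity` stands in for a smaller wetland.
print("natural-like flooding (p_wet = 0.7):", extinction_risk(0.7))
print("regulated flooding    (p_wet = 0.3):", extinction_risk(0.3))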

Rupert Mathwin of Global Ecology at Flinders University and lead author of the study, says the data clearly indicate that successive dry years raise the probability of local extinction, and these effects are strongest in smaller wetlands.

‘Larger wetlands and those with more frequent inundation are less prone to these effects, although they are not entirely immune to them. The models present a warning: we have greatly modified the way the river behaves, and the modern river cannot support the long-term survival of southern bell frogs.

‘Regulation has effectively locked these wetlands into a state of perpetual bust. The research clearly shows that river regulation has been a driver of historical declines in frog numbers. These effects are further compounded by interactive stressors such as disease, exotic species, and land clearance.’

A century of government water policies has reduced flows below the critical thresholds required for southern bell frogs to persist, but it has also created the means to save them.

The Millennium Drought shifted environmental thinking in Australia and created policies that give the Murray-Darling Basin ownership of a portion of flow. Government scientists use these flows (sometimes called “environmental water”) to create positive ecological outcomes. This includes pumping water into dry wetlands to create breeding opportunities for southern bell frogs.  

Rupert Mathwin says ‘The flows that inform our model include the Millennium Drought that occurred from 1996 to 2009. During that time, south-eastern Australia experienced a region-wide reduction in rainfall and river flows, which, coupled with ongoing water extraction, resulted in seven or more successive dry years in all South Australian wetlands. Without environmental water, southern bell frogs would now be extinct in South Australia.’

Corey Bradshaw, Matthew Flinders Professor of Global Ecology and co-author of the study, says because large-scale flooding can no longer occur with sufficient frequency, proper management is the only way forward.

‘The few remaining frog populations that persist are at sites that are actively managed through regular inundation with environmental water applied at the right times for breeding. Our findings provide a blueprint for this process.’

Rupert Mathwin says that a single dry year is not a great risk to a healthy population, but managers should ensure that wetlands with southern bell frogs experience no more than two successive dry years to reduce the risk of local extinction.

‘Unfortunately, without human intervention in this heavily modified system, southern bell frogs will probably go extinct.’

Science of sediment transport key to river conservation & protection: Researchers

Peer-Reviewed Publication

SIMON FRASER UNIVERSITY

Researchers at Simon Fraser University (SFU) and the Massachusetts Institute of Technology (MIT) have devised a better way to measure how fast sediment flows in rivers—information that can help scientists and planners better prepare for flooding and weather-related events, understand salmon activity and even restore rivers.

Their solution, outlined in a new paper in Nature, all boils down to the shape and particular features of a sediment grain.

“Sediment transport controls the morphology of the Earth's surface—that includes the physical environments of all ecosystems, the beds of rivers and the ocean, and even terrestrial environments,” says SFU professor Jeremy Venditti, founding director of the School of Environmental Science, whose team carried out the study’s research activities in SFU’s River Dynamics Lab.

“Despite this, accurately predicting sediment transport remains a stubbornly difficult problem. Our work examines the granular dynamics of sediment transported by fluid flows, and shows that grain shape plays an important role in sediment transport rates. The model we developed substantially improves our ability to predict sediment transport.”

Bed load sediment transport involves wind or water flowing over a bed of sediment, causing grains to “roll or hop” along the bed. The researchers say sediment is critically important to the life cycle of rivers and understanding its movement has been “notoriously imprecise.”

Researchers decided to look beyond size and density and focused on two particular properties connected to a grain’s shape – its resistance to flow, or its drag, and its internal friction, which plays a part in its ability to slip past other grains.

Both factor into a new mathematical formula whose predictions were successfully matched in a series of flume experiments in the SFU lab.

During the experiments, a current of water was pumped into a small wooden flume to flow over a bed of sediment with various grain shapes, from round glass beads and chips to rectangular prisms and natural gravel. Measurements of sediment transport, drag, and internal friction of the bed were recorded.
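The new formula itself is not reproduced here, but a classical incipient-motion force balance shows in textbook form how both shape-dependent properties can enter a transport threshold: fluid drag pushes a grain downstream while internal friction resists its motion. Here C_d is the grain’s drag coefficient, μ its coefficient of internal friction, ρ_s and ρ_f the grain and fluid densities, d the grain diameter, u the near-bed flow speed, and u_c the threshold speed for motion. This is a standard textbook balance, not the model developed in the Nature paper.

\[
F_D = \tfrac{1}{2}\, C_d\, \rho_f\, u^2\, \frac{\pi d^2}{4},
\qquad
F_R = \mu\, (\rho_s - \rho_f)\, g\, \frac{\pi d^3}{6},
\]
\[
F_D = F_R \;\Longrightarrow\; u_c^2 = \frac{4}{3}\,\frac{\mu}{C_d}\left(\frac{\rho_s}{\rho_f} - 1\right) g\, d .
\]

Because a grain’s shape changes both C_d and μ, two grains of identical size and density can have very different thresholds of motion, which is the qualitative point the new model makes quantitative.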

In their paper, the research team notes: "Sediment transport is a part of life on Earth's surface, from the impact of storms on beaches to the gravel nests in mountain streams where salmon lay their eggs. Damming and sea level rise have already impacted many such terrains and pose ongoing threats. A good understanding of bed load transport is crucial to our ability to maintain these landscapes or restore them to their natural states."

Venditti has been leading research into the 2018 Big Bar Landslide that prevented salmon from getting to their Fraser River spawning grounds, to map its effects and mitigate future risks.

DOE announces $2.3 million for public-private partnerships to advance fusion energy

National labs, private companies pair up to develop cost-effective, innovative fusion energy technologies

Grant and Award Announcement

DOE/US DEPARTMENT OF ENERGY

WASHINGTON, D.C.—The U.S. Department of Energy (DOE) today announced $2.3 million in funding for 10 projects that will pair private industry with DOE’s National Laboratories to overcome challenges in fusion energy development, an area of research that captivated global attention in December when the Department announced that a team at Lawrence Livermore National Laboratory had achieved fusion ignition. Ignition, in which more energy was derived from fusion than was put into it, had never been accomplished before in a laboratory setting and raised hopes that fusion energy could play a major role in the transition to clean energy.
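For context, “more energy out than in” is usually expressed as a target gain greater than one. Using the widely reported figures from the December 2022 Lawrence Livermore shot (roughly 3.15 megajoules of fusion yield from 2.05 megajoules of delivered laser energy; these numbers are not stated in this announcement):

\[
Q \;=\; \frac{E_{\text{fusion}}}{E_{\text{laser}}} \;=\; \frac{3.15\ \text{MJ}}{2.05\ \text{MJ}} \;\approx\; 1.5 \;>\; 1 .
\]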

“We were elated when the team at Livermore delivered the news that they had achieved fusion ignition, and we knew that was just the beginning,” said U.S. Secretary of Energy Jennifer M. Granholm. “The companies and DOE scientists will build on advances from the National Labs with the entrepreneurial spirit of the private sector to advance our understanding of fusion.”

The awards announced today are provided through the DOE Office of Science’s Innovation Network for Fusion Energy (INFUSE) program, which was established in 2019 to accelerate fusion energy development through public-private research partnerships.

The projects will be led by researchers at seven private companies:

  • Commonwealth Fusion Systems (Cambridge, Massachusetts)
  • Energy Driven Technologies LLC (Champaign, Illinois)
  • Focused Energy (Austin, Texas)
  • General Atomics (San Diego, California)
  • Princeton Stellarators Inc. (Princeton, New Jersey)
  • Tokamak Energy Inc. (Bruceton Mills, West Virginia)
  • Type One Energy Group (Madison, Wisconsin) 

The awards will provide companies with access to the leading expertise and capabilities available at DOE National Laboratories to address critical scientific and technological challenges in pursuing fusion energy systems.

INFUSE solicited proposals from the fusion industry and selected projects for one- or two-year awards between $50,000 and $500,000 each, with a 20% cost share for industry partners. Outyear funding is contingent on congressional appropriations. The full list of planned awards can be found on the Office of Science’s Fusion Energy Sciences website. Full abstracts for each project are available on the INFUSE website.

Increased atmospheric dust is masking greenhouse gases’ warming effect

UCLA researchers say climate change could accelerate slightly if dust levels stop climbing

Peer-Reviewed Publication

UNIVERSITY OF CALIFORNIA - LOS ANGELES

A new study shows that global atmospheric dust — microscopic airborne particles from desert dust storms — has a slight overall cooling effect on the planet that has hidden the full amount of warming caused by greenhouse gases.

The UCLA research, published today in Nature Reviews Earth & Environment, found that the amount of desert dust has grown roughly 55% since the mid-1800s, which increased the dust’s cooling effect.

The study is the first to demonstrate the overall cooling effect of atmospheric desert dust. Some effects of atmospheric dust warm the planet, but others counteract warming — for example, by scattering sunlight back into space and dissipating the high clouds that trap heat — and the study calculated that dust’s overall effect is a cooling one.

Should dust levels decline — or even simply stop growing — warming could ramp up, said UCLA atmospheric physicist Jasper Kok, the study’s lead author.

“We show desert dust has increased, and most likely slightly counteracted greenhouse warming, which is missing from current climate models,” said Kok, who studies how particulate matter affects the climate. “The increased dust hasn’t caused a whole lot of cooling — the climate models are still close — but our findings imply that greenhouse gases alone could cause even more climate warming than models currently predict,” he said.

Kok compared the revelation to discovering, while driving a car at high speed, that the vehicle’s emergency brake had been partly engaged. Just as fully releasing the brake could cause the car to move even faster, a stop to the increase in dust levels could slightly speed up global warming.

And while atmospheric desert dust levels have increased overall since pre-industrial times, the trend has not been steady — there have been upticks and declines along the way. And because there are so many natural and human-influenced variables that can cause dust levels to increase or decrease, scientists cannot accurately project how the amounts of atmospheric dust will change in the coming decades.

Some of the microscopic airborne particles created by burning fossil fuels also temporarily contribute to cooling, Kok said. But while scientists have spent decades determining the consequences of these human-made aerosols, the precise warming or cooling effect of desert dust remained unclear until now. The challenge researchers faced was to determine the cumulative effect of the known warming and cooling effects of dust.

In addition to atmospheric interactions with sunlight and cloud cover, when dust drops back to earth, it can darken snow and ice by settling on them, making them absorb more heat. Dust also cools the planet by depositing nutrients like iron and phosphorus. When those nutrients land in the ocean, for example, they support the growth of phytoplankton that take up carbon dioxide from the atmosphere, thereby causing a net cooling effect, Kok said.

Human actions have warmed the planet by 1.2 degrees Celsius, or 2.2 degrees Fahrenheit, since about 1850. Without the increase in dust, climate change would likely have warmed the planet by about 0.1 degree Fahrenheit more already, Kok said. With the planet nearing the 2.7 degrees Fahrenheit of warming that scientists consider especially dangerous, every tenth of a degree matters, Kok said.
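The Fahrenheit figures follow from the standard interval conversion, and the 2.7-degree threshold corresponds to the widely cited 1.5-degree-Celsius target (a correspondence inferred here, not stated in the article):

\[
\Delta T_{^\circ\mathrm{F}} = 1.8\,\Delta T_{^\circ\mathrm{C}}, \qquad
1.8 \times 1.2\ ^\circ\mathrm{C} \approx 2.2\ ^\circ\mathrm{F}, \qquad
1.8 \times 1.5\ ^\circ\mathrm{C} = 2.7\ ^\circ\mathrm{F}.
\]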

“We want climate projections to be as accurate as possible, and this dust increase could have masked up to 8% of the greenhouse warming,” Kok said. “By adding the increase in desert dust, which accounts for over half of the atmosphere’s mass of particulate matter, we can increase the accuracy of climate model predictions. This is of tremendous importance because better predictions can inform better decisions of how to mitigate or adapt to climate change.”

The researchers used satellite and ground measurements to quantify the current amount of microscopic mineral particles in the air. They determined that there were 26 million tons of such particles globally — equivalent to the weight of about 5 million African elephants floating in the sky.
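As a rough sanity check on that comparison (the tonnage and elephant count are the article’s; the per-animal mass is just the implied quotient):

# Implied mass per elephant in the article's comparison; illustrative arithmetic only.
dust_mass_tons = 26e6   # estimated airborne mineral dust, from the article
elephants = 5e6         # "about 5 million African elephants"
print(dust_mass_tons / elephants, "tons per elephant")   # ~5.2 tons, roughly a large adult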

They next looked at the geologic record, gathering data from ice cores, marine sediment records and samples from peat bogs, which all show the layers of atmospheric dust that had fallen from the sky. Samples from around the world showed a steady increase in desert dust.

Dust can increase as a result of drier soils, higher wind speed and human land-use changes — diverting water for irrigation and turning marginal desert regions into grazing and agricultural land, for example. While increases in dust levels due to those types of land-use changes have taken place primarily on the borders of the world’s largest deserts, like the Sahara and Sahel in Africa and Asia’s Gobi desert, Kok said, similar changes have taken place in California’s Owens Lake and are occurring now in the Salton Sea, also in California.

But the factors that account for increased dust levels are not clear-cut or linear, Kok said, and whether the amounts of desert particulates will increase, decrease or remain relatively flat is unknown.

Kok emphasized that while the increase in atmospheric dust has somewhat masked the full potential of greenhouse gases to warm the climate, the findings don’t show that climate models are wrong.

“The climate models are very useful in predicting future climate change, and this finding could further improve their usefulness,” Kok said.

Exposure to World Trade Center dust exacerbates cognitive impairment in an animal model of Alzheimer’s

Peer-Reviewed Publication

THE MOUNT SINAI HOSPITAL / MOUNT SINAI SCHOOL OF MEDICINE

WTC particulate matter exposure 

IMAGE: GRAPHICAL ILLUSTRATION OF THE EXPOSURE TO WORLD TRADE CENTER PARTICULATE MATTER STUDY BY GIULIO PASINETTI ET AL.

CREDIT: MOUNT SINAI HEALTH SYSTEM


Also evokes central and peripheral pro-inflammatory responses

New York, NY (January 17, 2023) – Mice exposed to World Trade Center dust exhibit a significant impairment in spatial recognition and short- and long-term memory, as well as changes in genes related to immune-inflammatory responses and blood-brain barrier disruption, according to a study conducted by researchers from the Icahn School of Medicine at Mount Sinai and published January 17 in the Journal of Alzheimer’s Disease.

The study suggests a peripheral-brain immune inflammatory “cross-talking” that may increase the likelihood of cognitive decline, identifying key steps that may be therapeutically targetable in future studies of World Trade Center first responders.

“It is imperative that we understand the risk for Alzheimer’s disease in aging first responders and other subjects exposed to Ground Zero so that we can develop preventive initiatives,” said Giulio Maria Pasinetti, MD, PhD, the Saunders Family Professor of Neurology and Program Director for the Mount Sinai Center for Molecular Integrative Neuroresilience at Icahn Mount Sinai and senior author of the paper.

The September 11, 2001, terrorist attacks on the World Trade Center led to intense fires, which produced a massive, dense cloud of toxic gases and suspended pulverized debris comprising particles of varying sizes that contained metals, polychlorinated biphenyls, and polyaromatic hydrocarbons, among other known toxins, collectively known as World Trade Center particulate matter (WTCPM).

In the years following the attack and cleanup efforts, a cluster of chronic health conditions emerged among first responders who, working at Ground Zero for prolonged time periods, were repeatedly exposed to high levels of this particulate matter. Among the chronic health conditions, a growing body of scientific literature indicates that these first responders may have a greater incidence of mild cognitive impairment, as well as other neurological complications like changes in white matter connectivity and/or decreased hippocampal volume, which may put them at a greater risk of developing Alzheimer’s disease later in life.

“Based on epidemiological and preliminary data, we hypothesized that first responders repeatedly exposed to Ground Zero dusts in the first week post-disaster were placed at greater risk of age-related neurological conditions like Alzheimer’s disease and Alzheimer’s disease-related dementias due to changes in blood-brain barrier permeability, and/or neuro-immune interactions,” said Ruth Iban-Arias, PhD, a postdoctoral fellow in the Department of Neurology at Icahn Mount Sinai. “Our study revealed that acute exposure to World Trade Center particulate matter may accelerate cognitive deterioration and Alzheimer’s disease-type neuropathology in mice genetically modified to develop Alzheimer’s disease. And our transcriptomic analysis strongly suggests that this exposure may trigger generalized immune inflammatory cascades which may underlie the collective pathophysiology being experienced by first responders.”

To test their hypothesis, researchers from the Center for Molecular Integrative Neuroresilience at Mount Sinai used mice genetically engineered to develop Alzheimer’s disease (5XFAD) and wild-type mice as controls. Mice in the treatment groups were exposed to repeated intranasal instillation of WTCPM dust—which was collected at Ground Zero within 72 hours after the attacks—for three consecutive days for three weeks, reflecting the air level exposures faced by first responders at Ground Zero. The animals were exposed to WTCPM dust at high and low doses to identify a dose-dependent response.

Y-maze assay and novel object recognition behavioral tests were performed to assess working memory and learning and recognition memory, respectively. During the Y-maze assay, the mouse was placed at the start of a Y-shaped maze and allowed to roam freely for 10 minutes. Generally, mice have an innate tendency to explore the environment they have not recently visited; spatial working memory impairment in this assay is defined as behavior wherein a mouse re-enters the same arm(s) repeatedly, indicating that it does not remember which arms it has already explored. Seven days later, mice were assessed via a novel object recognition test, wherein each mouse was placed in an enclosure with two objects (a salt shaker and a toy block) and given 10 minutes to investigate. Time spent with both objects was recorded. Each mouse was removed and subsequently returned to the enclosure that contained a familiar object from the previous trial and a novel object. Cognitively intact mice display an innate tendency to spend a greater amount of time investigating the novel object rather than the familiar one. Thus, an animal that does not remember which object it has been exposed to previously will spend similar amounts of time exploring both objects.
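A common way to summarize such a test is a simple preference (or discrimination) ratio computed from the two exploration times. The sketch below shows the general idea; it is illustrative only and is not the study’s analysis code.

def novel_object_preference(time_novel_s, time_familiar_s):
    """Fraction of total exploration time spent on the novel object.

    Values near 0.5 suggest the animal does not distinguish the objects
    (it does not remember the familiar one); values well above 0.5 indicate
    intact recognition memory. Generic illustration, not the study's code.
    """
    total = time_novel_s + time_familiar_s
    return time_novel_s / total if total else float("nan")

# Hypothetical exploration times in seconds:
print(novel_object_preference(42.0, 18.0))   # ~0.70 -> clear preference for the novel object
print(novel_object_preference(29.0, 31.0))   # ~0.48 -> no preference, consistent with impairment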

Both the control and 5XFAD mice exhibited a 10 percent decrease in working memory after exposure to WTCPM dust, with only the high-exposure group displaying significant impairment compared to those not exposed to the dust. The 5XFAD mice exposed to high doses of dust and subjected to the novel object recognition task showed a 16 percent and 30 percent (short- and long-term, respectively) increased preference for exploring the familiar object rather than the novel one when compared to no-exposure mice, indicating an underlying memory alteration evidently due to dust exposure.

The researchers also performed transcriptomic analysis (study of the complete set of RNA transcripts that are produced in the genome) in the blood and hippocampus of both sets of mice.

Exposure to WTCPM dust evoked a variety of perturbations in immune function, cell signaling, and homeostatic functioning. Interestingly, a trending increase in neutrophils, the granulocytes of the innate immune system, was also noted in the peripheral blood of WTCPM-exposed 5XFAD mice, compared to 5XFAD mice exposed to saline solution containing no dust. Overall, pathways with an overarching theme of inflammation, including acute phase response signaling, were significantly upregulated.

WTCPM dust also exacerbated the neuroinflammatory profile in the mouse brain. The researchers found significant upregulation in the expression of genes involved in blood-brain barrier function.

These effects are indicative of a peripherally mounted innate immune response, which might synergistically spread neuroinflammation. Results indicate that exposure to WTCPM may have elicited peripheral immune responses, ultimately resulting in the disruption of brain endothelial tight junction proteins and leading to a permissive vascular permeability for the migration of peripheral immune modulators to the brain.

“While we should cautiously interpret the outcomes of these preclinical studies and further investigation in the clinical setting is needed, our study provides valuable information relevant to the health of first responders. The data opens a new horizon for investigations to further understand the impact that acute exposure to WTCPM dust has on the accelerated onset of Alzheimer’s and related dementias in first responders who are now reaching older age,” said Dr. Pasinetti.

The Mount Sinai research team is currently conducting preclinical studies that explore the interaction between mice expressing the human form of APOE4/4 (the highest genetic risk factor in late-onset Alzheimer’s disease) and exposure to WTCPM dust to examine the possible accelerated onset of Alzheimer’s disease-type phenotype. These studies will provide the much-needed information for preventive screening and possibly interventions in first responders and other individuals who were exposed to the dust who have genetic susceptibility to Alzheimer’s disease.

About the Mount Sinai Health System
Mount Sinai Health System is one of the largest academic medical systems in the New York metro area, with more than 43,000 employees working across eight hospitals, over 400 outpatient practices, nearly 300 labs, a school of nursing, and a leading school of medicine and graduate education. Mount Sinai advances health for all people, everywhere, by taking on the most complex health care challenges of our time — discovering and applying new scientific learning and knowledge; developing safer, more effective treatments; educating the next generation of medical leaders and innovators; and supporting local communities by delivering high-quality care to all who need it.

Through the integration of its hospitals, labs, and schools, Mount Sinai offers comprehensive health care solutions from birth through geriatrics, leveraging innovative approaches such as artificial intelligence and informatics while keeping patients’ medical and emotional needs at the center of all treatment. The Health System includes approximately 7,300 primary and specialty care physicians; 13 joint-venture outpatient surgery centers throughout the five boroughs of New York City, Westchester, Long Island, and Florida; and more than 30 affiliated community health centers. We are consistently ranked by U.S. News & World Report's Best Hospitals, receiving high "Honor Roll" status, and are highly ranked: No. 1 in Geriatrics and top 20 in Cardiology/Heart Surgery, Diabetes/Endocrinology, Gastroenterology/GI Surgery, Neurology/Neurosurgery, Orthopedics, Pulmonology/Lung Surgery, Rehabilitation, and Urology. New York Eye and Ear Infirmary of Mount Sinai is ranked No. 12 in Ophthalmology. U.S. News & World Report’s “Best Children’s Hospitals” ranks Mount Sinai Kravis Children's Hospital among the country’s best in several pediatric specialties. The Icahn School of Medicine at Mount Sinai is one of three medical schools that have earned distinction by multiple indicators: It is consistently ranked in the top 20 by U.S. News & World Report's "Best Medical Schools," aligned with a U.S. News & World Report "Honor Roll" Hospital, and top 20 in the nation for National Institutes of Health funding and top 5 in the nation for numerous basic and clinical research areas. Newsweek’s “The World’s Best Smart Hospitals” ranks The Mount Sinai Hospital as No. 1 in New York and in the top five globally, and Mount Sinai Morningside in the top 20 globally.

For more information, visit https://www.mountsinai.org or find Mount Sinai on Facebook, Twitter, and YouTube.

###


The mechanism of cosmic magnetic fields explored in the laboratory

A novel experiment sheds new light on a possible mechanism that may seed magnetic fields for the galactic dynamo

Peer-Reviewed Publication

DOE/US DEPARTMENT OF ENERGY

The Mechanism Of Cosmic Magnetic Fields Explored in the Laboratory 

IMAGE: CONTOURS OF MAGNETIC FIELDS THAT EMERGE AS A RESULT OF SELF-ORGANIZATION OF MICROSCOPIC CURRENTS RESULTING FROM THE WEIBEL INSTABILITY IN A CARBON DIOXIDE LASER-PRODUCED PLASMA PROBED BY AN ULTRASHORT RELATIVISTIC ELECTRON BEAM.

CREDIT: IMAGE COURTESY OF CHAOJIE ZHANG, UNIVERSITY OF CALIFORNIA LOS ANGELES

The Science

Plasma is matter that is so hot that the electrons are separated from atoms. The electrons float freely and the atoms become ions. This creates an ionized gas—plasma—that makes up nearly all of the visible universe. Recent research shows that magnetic fields can spontaneously emerge in a plasma. This can happen if the plasma has a temperature anisotropy—a temperature that differs along different spatial directions. This mechanism is known as the Weibel instability. It was predicted by plasma theorist Eric Weibel more than six decades ago but has only now been unambiguously observed in the laboratory. The new research finds that this process can convert a significant fraction of the energy stored in the temperature anisotropy into magnetic field energy. It also finds that the Weibel instability could be a source of the magnetic fields that permeate the cosmos.
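For reference, the temperature anisotropy that drives the instability is conventionally quantified with a single parameter; the expression below is the standard bi-Maxwellian textbook definition, not a result quoted from the paper.

\[
A \;=\; \frac{T_\perp}{T_\parallel} \;-\; 1 , \qquad A > 0 \;\;(\text{hotter across than along}) \;\Rightarrow\; \text{Weibel-unstable} .
\]

In the standard textbook treatment, the growing magnetic fluctuations appear on the electron skin-depth scale, and larger anisotropy opens a wider band of unstable wavelengths and drives faster growth.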

The Impact

Most of the visible matter in our observable universe is in the plasma state, and it is magnetized. Magnetic fields at the micro-gauss level (about a millionth of the Earth’s magnetic field) permeate the galaxies. These magnetic fields are thought to be amplified from weak seed fields by the spiral motion of the galaxies, known as the galactic dynamo. How the seed magnetic fields are created is a longstanding question in astrophysics. This new work offers a possible solution to the vexing problem of the origin of the seed fields behind these microgauss-level magnetic fields. The research used a novel platform that has great potential for studying the ultrafast dynamics of magnetic fields in laboratory plasmas relevant to astrophysics and high-energy-density physics.

Summary

First theorized six decades ago, the Weibel instability driven by temperature anisotropy is thought to be an important mechanism for self-magnetization of many laboratory and astrophysical plasmas. However, scientists have faced two challenges in unambiguously demonstrating the Weibel instability. First, until recently, researchers were not able to generate a plasma with a known temperature anisotropy as initially envisioned by Weibel. Second, researchers had no suitable technique to measure the complex and rapidly evolving topology of the magnetic fields subsequently generated in the plasma.

This work, enabled by the unique capability of the Accelerator Test Facility, a Department of Energy (DOE) user facility at Brookhaven National Laboratory, employed a novel experimental platform that allowed the researchers to create a hydrogen plasma with a known, highly anisotropic electron velocity distribution on a timescale of tens of trillionths of a second by using an ultrashort but intense carbon dioxide laser pulse. The subsequent thermalization of the plasma occurs via the self-organization of plasma currents, which produces magnetic fields driven by the Weibel instability. These fields are large enough to deflect relativistic electrons and thereby reveal an image of the magnetic fields a certain distance from the plasma. The researchers obtained a movie of the evolution of these magnetic fields with exquisite spatiotemporal resolution by using a one-picosecond relativistic electron beam to probe them.

Funding

This work was supported by the Department of Energy (DOE) Office of Science, the National Science Foundation, and NSF Graduate Research Fellowships Program. The Accelerator Test Facility is supported by the DOE Office of Science. The principal investigator’s work at UCLA is supported by the National Science Foundation (NSF), DOE Office of Science High Energy Physics program, and the NSF Graduate Research Fellowships Program.

Our future climate depends partly on soil microbes—but how are they affected by climate change?

In a surprising twist, food, not temperature, is the most important factor driving microbial release of CO2

Peer-Reviewed Publication

UNIVERSITY OF MASSACHUSETTS AMHERST

IMAGE: SAMPLES WERE HARVESTED AT TWO TIME POINTS IN JULY AND OCTOBER 2019 AT TWO LONG-TERM WARMING EXPERIMENTS, SWAN AND PH, AT THE HARVARD FOREST LONG-TERM ECOLOGICAL RESEARCH STATION, WHICH HAD BEEN ESTABLISHED FOR 13 AND 28 YEARS, RESPECTIVELY.

CREDIT: DOMEIGNOZ-HORTA ET AL., 10.1111/GCB.16544

AMHERST, Mass. – The largest terrestrial carbon sink on Earth is the planet’s soil. One of the big fears is that a warming planet will liberate significant portions of the soil’s carbon, turning it into carbon dioxide (CO2) gas, and so further accelerate the pace of planetary warming. A key player in this story is the microbe, the predominant form of life on Earth, which can either turn organic carbon—the fallen leaves, rotting tree stumps, dead roots and other organic matter—into soil, or release it into the atmosphere as CO2. Now, an international team of researchers led by the University of Massachusetts Amherst has helped to untangle one of the knottiest questions involving soil microbes and climate change: what effect does a warming planet have on the microbes’ carbon cycling?

The answer is surprising: increased temperature decreases the rate at which soil microbes respire CO2—but only in the summer. During the rest of the year, microbial activity remains largely consistent with historical levels.

But there’s a catch to this seemingly happy story.

Soil microbes are releasing less CO2 in the summer because they’re starving. And they’re starving because long-term warming is threatening the viability of deciduous trees, on whose dead leaves the microbes depend.

“One of the major outcomes of our study,” says Kristen DeAngelis, professor of microbiology at the University of Massachusetts Amherst and senior author of the study, recently published in the journal Global Change Biology, “is that all those autumn leaves mitigate the negative effects of global warming on soil microbes.” For now. But fewer dead leaves means less food for the microbes and seems to be leading to a reduction in microbial biomass during the summer.

To reach these conclusions, DeAngelis and her co-authors teamed up with two remarkable, long-term studies sited at the Harvard Forest: a project begun in 1991 by co-author Jerry Melillo on soil warming in forest ecosystems, and another, begun by co-author Serita Frey in 2006, focused on soil microbes and warming.

“Sampling soils that have been warmed for 13 and 28 years helped us elucidate how resilient microorganisms are to shifts in temperature,” says Luiz A. Domeignoz-Horta, the paper’s lead author, who completed this research while at UMass Amherst and who is now a post-doctoral fellow in the Department of Evolutionary Biology and Environmental Studies at the University of Zurich.

Though much of the attention to climate change has been understandably focused on the burning of fossil fuels, it is equally important for scientists to understand the “carbon budget,” or the complete cycle of how carbon moves through the air, soil and water. “Once I woke up to climate change, I thought, ‘What can I, as a microbiologist, do?’” says DeAngelis. This newest research gives climate modelers a better understanding of how carbon works in the soil, which will allow us all to better plan for a warming world.

This research was supported by the National Science Foundation and the U.S. Department of Energy.


Forest landslides’ frequency, size influenced more by road building, logging than heavy rain

Peer-Reviewed Publication

OREGON STATE UNIVERSITY

Lookout Creek 

IMAGE: LOOKOUT CREEK IN OREGON'S H.J. ANDREWS EXPERIMENTAL FOREST.

CREDIT: THERESA HOGUE, OSU

CORVALLIS, Ore. – A long-term Pacific Northwest study of landslides, clear-cutting timber and building roads shows that a forest’s management history has a greater impact on how often landslides occur, and how severe they are, than the amount of water coursing through a watershed.

Findings of the research, led by forest engineering associate professor Catalina Segura and graduate student Arianna Goodman of the Oregon State University College of Forestry, were published in the journal Earth Surface Processes and Landforms.

Probing the factors behind landslide frequency and magnitude is crucial because slides occur in all 50 states, causing an average of more than 25 deaths per year, according to the United States Geological Survey. The USGS puts the total annual average economic damage resulting from landslides at greater than $1 billion.

“Understanding the long-term effects of forest practices like logging and road building is critical to sustainable forest management,” Segura said. “This requires observations on time scales that capture responses to past and ongoing management practices – looking at the timing of floods, landscape susceptibility to landsliding and the delivery and movement of wood.”

Focusing on the Lookout Creek watershed in western Oregon, a research team that included Julia Jones of the OSU College of Earth, Ocean, and Atmospheric Sciences and Frederick Swanson of the U.S. Forest Service examined a decades-long history of old-growth clear-cutting and associated road construction and how those practices affected flooding, landslides, big pieces of wood jamming up waterways, and channel change.

Debris flows contribute huge quantities of sediment and large wood to streams, Segura said. Those inputs control, for decades, a channel’s response to large flows – the amount of erosion and deposition that takes place. Regulating the inputs are an area’s history of forest practices, the natural vulnerability of a watershed to erosion and the frequency of big flood events.

The 64-square-kilometer Lookout Creek watershed is part of the H.J. Andrews Experimental Forest, a long-term research site in the Cascade Range funded by the National Science Foundation and the Forest Service. Logging and road building began in the Lookout Creek area in 1950 and largely ceased by the 1980s, enabling the scientists to track forest management practices’ impact on slides and floods during and after the period of active management.

Researchers studied five time periods: 1950 to November 1964 (initial logging and road building); December 1964 to January 1965 (first major flood); February 1965 to 1995 (between-floods period); 1996 (second major flood); and 1997 to 2020 (post-flood period).

The scientists note that three zones of distinct and contrasting geologic history comprise the Lookout Creek watershed: one zone with relatively smooth terrain and U-shaped valleys; another characterized by irregular topography, rough surfaces and moderate steepness; and a third featuring V-shaped valleys, steep slopes and narrow drainages.

“In each of the five time periods, the frequency of landslides and debris flows depended on the conditions created by management practices during prior time periods,” Segura said. “Watershed responses did differ somewhat among the zones, as would be expected – places that were once glaciated and have broad valleys are less vulnerable to landslides and debris flows than steep terrain with weak, erodible rock.”

Even small floods caused landslides and stream channel changes during the first 15 years of road construction and logging, and amid ongoing logging in the early part of the time period between large flood events, she said.

“Big flooding in 1964-65, when harvesting was taking place, produced much larger geomorphic responses than the large flood of 1996, more than a decade after logging stopped,” Segura said.

Landscape effects were negligible in 2011 for the third largest flood event on record, the researchers found; by that time clear-cut areas of the forest had been replanted and the new trees were 20 to 70 years old.