Sunday, December 11, 2022

UTEP receives $5M Department of Energy grant to train next generation nuclear security workforce

UTEP-led group will provide scholarships, stipends, internships and research opportunities to underrepresented students

Grant and Award Announcement

UNIVERSITY OF TEXAS AT EL PASO


IMAGE: THE UNIVERSITY OF TEXAS AT EL PASO IN PARTNERSHIP WITH THE UNIVERSITY OF NEW MEXICO AND THE NORTH CAROLINA AGRICULTURAL AND TECHNICAL STATE UNIVERSITY WILL PREPARE THE NEXT GENERATION OF NUCLEAR SECURITY ENTERPRISE TALENT TO DEVELOP ELECTRONICS FOR EXTREME ENVIRONMENTS THROUGH A FIVE-YEAR, $5 MILLION GRANT FROM THE U.S. DEPARTMENT OF ENERGY.

CREDIT: UTEP MARKETING AND COMMUNICATIONS

EL PASO, Texas (Dec. 7, 2022) – The University of Texas at El Paso in partnership with the University of New Mexico and the North Carolina Agricultural and Technical State University will prepare the next generation of nuclear security enterprise (NSE) talent to develop electronics for extreme environments through a five-year, $5 million grant from the U.S. Department of Energy (DOE).

“With this grant, UTEP will make substantial research contributions to national security with a special emphasis on nuclear security, extreme environment electronics and computer systems,” said Kenith Meissner, Ph.D., dean of the UTEP College of Engineering. “We are excited to lead this nationally coordinated consortium of industry, government and university partners.”

Electronics for extreme environments include materials, electronic devices, sensors, circuits, electronic packaging and systems that can withstand environmental challenges such as extreme temperatures, mechanical stresses and radiation fields. Applications include the use of electronics in high-power energy conversion, space and weapon systems, all critical to DOE and its National Nuclear Security Administration (NNSA).

The grant will fund the efforts of the newly established Consortium for Education and Research in Electronics for Extreme Environments (E3C) to create a sustainable pipeline of electrical engineers from underrepresented populations for the NSE workforce. The consortium includes Hispanic-Serving Institutions UTEP and the University of New Mexico (UNM), both top-tier research universities, and North Carolina Agricultural and Technical State University (NCA&T), one of the top 10 Historically Black Colleges and Universities in the nation according to U.S. News & World Report. NNSA partners include Kansas City National Security Campus (KCNSC), Sandia National Laboratories (Sandia) and Los Alamos National Laboratory (LANL).

The award is part of NNSA’s Minority Serving Institution Partnership Program, an initiative designed to build a sustainable pipeline between DOE sites and labs and minority-serving institutions in STEM disciplines.

“This grant will provide outstanding opportunities for our students and for UTEP to grow in our capabilities in electronics research and teaching,” said Miguel Velez-Reyes, Ph.D., professor and chair of electrical and computer engineering, and the grant’s principal investigator. “We are at a particular point in time in which there is a recognition among leaders in industry and government of the pressing need to increase the electrical engineering workforce to spur domestic growth in areas such as chip manufacturing, transportation electrification, aerospace systems and advanced electronics packaging.”

Velez-Reyes also said he expects recent changes in U.S. manufacturing policies to increase the demand for electrical engineers, particularly as it pertains to electronic devices, systems and their applications.

To address that demand, over the next five years the program will provide financial support to at least 65 graduate and undergraduate students from the three academic institutions in the form of stipends, scholarships and health insurance assistance. Many other program participants will benefit from new curricula, research opportunities, internships at DOE partner facilities, and professional and career development activities.

The project’s research program will be carried out in collaboration with scientists and engineers from KCNSC, Sandia and LANL.

The consortium also will conduct outreach to community college and high school students to introduce them to career paths in electrical engineering.

At UTEP, faculty will also develop new courses in electronics for extreme environments, micro-electro-mechanical systems, and emerging additive manufacturing technologies for electronics.

Joining Velez-Reyes at UTEP are co-principal investigators Robert C. Roberts, Ph.D., assistant professor of electrical and computer engineering; David Zubia, Ph.D., professor of electrical and computer engineering; Brian E. Schuster, Ph.D., associate professor of metallurgical, materials and biomedical engineering; and Raymond C. Rumpf, Ph.D., professor of electrical and computer engineering and computational science. The lead at UNM is Daniel Feezell, Ph.D., associate professor of electrical and computer engineering, and at NCA&T is Abdullah Eroglu, Ph.D., department chair and professor of electrical engineering.

The consortium is recruiting students from electrical engineering and materials science and engineering to begin the program in spring 2023.

To learn more, visit www.e3consortium.org.

Lehigh University shares in $47 million DOE push to accelerate fusion energy research

Professor Eugenio Schuster awarded nearly $1.75 million over three years for projects that could help pave the way for the development of a fusion pilot plant in the US

Grant and Award Announcement

LEHIGH UNIVERSITY

Eugenio Schuster 

IMAGE: LEHIGH UNIVERSITY MECHANICAL ENGINEERING AND MECHANICS PROFESSOR EUGENIO SCHUSTER STANDS IN ITER'S ASSEMBLY HALL IN THIS JULY 2022 PHOTO. IN THE BACKGROUND, THE SUB-ASSEMBLY TOOL IS USED TO PREPARE A SECTION OF THE TOKAMAK VACUUM VESSEL BEFORE MOVING IT TO ITS FINAL LOCATION WITHIN THE TOKAMAK PIT.

CREDIT: PHOTO COURTESY OF EUGENIO SCHUSTER

The world’s largest nuclear fusion reactor is currently being built in the south of France. Called ITER—Latin for “the path”—the machine is the product of an international effort to essentially harness the energy-generating power of the sun.

“The goal of ITER is to produce 10 times more energy than is required to operate it,” says Eugenio Schuster, a professor of mechanical engineering and mechanics in Lehigh University's P.C. Rossin College of Engineering and Applied Science. “Everyone in the fusion community is directly or indirectly working toward ITER.”

Schuster leads the Lehigh team that recently received nearly $1.75 million in funding from the U.S. Department of Energy (DOE) to continue their work in support of ITER and to help the United States establish its own nuclear fusion reactor within the next decade. 

The award is part of a $47 million effort by the DOE announced in October to fund research around the world by U.S. scientists, and ultimately close the gap in science and technology when it comes to nuclear fusion. The awards support the Biden administration’s vision of creating commercial fusion energy.

ITER was formalized in 2006, construction began in 2013, and the facility is expected to be operational by the end of the decade. The long timeline is no surprise. For fusion to occur, two isotopes of hydrogen called deuterium and tritium must be heated to upwards of 100 million degrees Celsius. At that temperature, gas turns to plasma, the fourth state of matter. And stably containing that superheated plasma has been a tough problem to solve.

“We can’t just put it in a vessel because it will melt anything it touches,” says Schuster. “However, plasma is made of ionized atoms, meaning that nuclei and electrons are completely separated, which creates a mix of positive and negative charges. Those charged particles, ions and electrons, react to the presence of a magnetic field. So one of the approaches to containing the plasma has been to use magnetic fields to create what’s essentially an invisible bottle to hold it, and prevent those charged particles from escaping.”
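
The confinement idea Schuster describes can be made concrete with a back-of-the-envelope estimate: a charged particle in a magnetic field spirals around the field lines with a gyroradius r = mv/(qB), and at fusion temperatures in a tokamak-scale field that radius is only millimeters, far smaller than the vessel. The short sketch below works through the arithmetic; the temperature and field values are assumptions chosen to be roughly ITER-scale, not figures from the article.

```python
# Illustrative sketch (not from the article): why a magnetic field acts as an
# "invisible bottle." A charged particle spirals around field lines with a
# gyroradius r = m * v_perp / (q * B); if r is tiny compared with the machine,
# the particle cannot simply fly across the field into the wall.
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
q_e = 1.602176634e-19   # elementary charge, C
m_D = 3.344e-27         # deuteron mass, kg

T = 150e6               # assumed plasma temperature, K (order of ITER's target)
B = 5.0                 # assumed magnetic field strength, T (roughly ITER-scale)

# Thermal speed perpendicular to the field (order-of-magnitude estimate)
v_perp = math.sqrt(2 * k_B * T / m_D)
r_gyro = m_D * v_perp / (q_e * B)

print(f"thermal speed ~ {v_perp:.2e} m/s")
print(f"gyroradius    ~ {r_gyro * 1000:.1f} mm")  # millimeters, versus a meters-wide vessel
```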

While there are several approaches to magnetic confinement, Schuster says the most promising has been the tokamak, a doughnut-shaped structure that uses large coils to create the magnetic fields that confine and shape the plasma.

ITER is a tokamak, and for the past 20 years, Schuster has been part of an international team of scientists and engineers working at tokamaks in China, South Korea, and the U.S. that were designed not to produce energy, but to help researchers study the physics of the plasma and, ultimately, to make sustained fusion—and ITER’s success—a reality.

Current tokamaks work in what’s called a pulsed regime, but they’re not all created equal, says Schuster. Unlike the newer machines in South Korea and China (called KSTAR and EAST, respectively), those in the U.S. lack the superconducting coils necessary for extended operation.

“So, for example,” he says, “with the DIII-D tokamak in San Diego, which is the largest one we have here in the U.S., the pulse is around six seconds long. So we run an experiment for six seconds, and then we have to wait for the machine to cool down before we can run another one.”

KSTAR and EAST are known as long-pulse machines, and can run on the order of hundreds of seconds, which makes them ideal for experimentation that could eventually be applied to ITER, which is also designed for long-pulse and eventual steady-state operation. Ultimately, says Schuster, for nuclear fusion to be economically feasible, future nuclear-fusion reactors will have to operate in very long pulses (in comparison to the time between pulses) or in steady state.

Part of Lehigh’s DOE award will support research at KSTAR in South Korea, where Schuster’s Plasma Control Group—together with teams from General Atomics, Oak Ridge National Laboratory, and Princeton Plasma Physics Laboratory (PPPL)—will address critical research questions related to the development of ITER’s long-pulse scenarios and their active control.

“We’ll be exploiting the fact that KSTAR is a long-pulse device that is similar to ITER in terms of both its inner-wall materials and its magnetic configuration,” he says. “And the work that we’ll be doing will help prepare ITER for operation.”  

The total requested funding for this particular project is $4,347,000.

The remainder of Lehigh’s portion of the DOE award will support research that Schuster’s group will do in China at EAST. One of the goals of the project, which includes teams from General Atomics, Lawrence Livermore National Laboratory, Massachusetts Institute of Technology, PPPL, and University of California, Los Angeles, is to utilize the readiness and uniqueness of EAST to guide the design of a fusion pilot plant (FPP) here in the U.S. 

The total requested funding for the EAST project is $6,453,000.

“A few years ago, the DOE asked the National Academies of Sciences, Engineering, and Medicine to determine what was needed here in the U.S. to really advance nuclear fusion, and in particular, to bring fusion energy to the grid,” says Schuster. “And one of those recommendations was the construction of a fusion pilot plant. And that’s why the DOE is funding American researchers to do this work abroad. It will help us learn from these superconducting tokamak machines so that eventually, we can build a long-pulse reactor-degree device in this country.” 

He says this push to design an FPP by the DOE is not only in line with the current administration’s goals to diversify its clean energy portfolio, but also with intense interest from private equity firms in making nuclear fusion a reality. Schuster says the goal is to design, develop, and begin producing energy within 10 years.

“It’s a very aggressive timeline,” he says. Indeed, especially considering that ITER—with its 16-year head start—is still not operational. But Schuster says the difference comes down to investment and technology.

“If you want to make things faster, you need to invest more heavily,” he says. “In the past, investors in nuclear fusion were exclusively governments around the world, which are often constrained by budgets aligned with shorter-term priorities. But now, you have startup companies funded by billionaires, and they aren’t constrained in the same way. The U.S. DOE is presently developing partnerships with private investors to accelerate the development of fusion energy.”

The technological constraints have also changed dramatically. Developments in computational power and advancements in superconducting coils capable of generating ever stronger magnetic fields to contain the plasma have pushed the field forward, faster. 

“We now have the capability to better predict the evolution of the plasma, thanks to more powerful computational resources, to understand the physics of the plasma more precisely after decades of studies, and to process the huge amount of data that comes from plasma experiments so that we can better learn from these experimental results,” he says. “So we do believe that we’re in a position to design and build this FPP much faster than we did with ITER.”

It’s an exciting time, he says. The promise of nuclear fusion is a big one: If achieved, it could provide a limitless supply of clean, safe, and reliable energy. And now, with the force of American entrepreneurship behind it, the efforts of Schuster and so many researchers like him may finally be realized.

“At the end of the day, this fusion pilot plant is what we’ve all been working on, and waiting our whole lives for,” he says. “It’s the opportunity to develop a type of energy that comes with so many benefits in terms of pollution, climate change, and fuel security—right here at home.” 


About Eugenio Schuster

Eugenio Schuster is an expert in nuclear-fusion plasma control and leads the internationally recognized Lehigh University Plasma Control Laboratory. The members of his group have a unique background combining training in fusion and plasma physics, control theory, computational methods, and the collaborative skills needed to work effectively with a large experimental team. 

Supported by the U.S. Department of Energy (DOE), some members of his group are stationed at the two largest nuclear-fusion experimental facilities in the United States (the DIII-D National Fusion Facility within General Atomics in San Diego, California, and the National Spherical Torus Experiment Upgrade (NSTX-U) within the Princeton Plasma Physics Laboratory (PPPL) in Princeton, New Jersey). Schuster also conducts research on international superconducting tokamaks like KSTAR (South Korea) and EAST (China). He also collaborates closely with ITER, the world's largest tokamak currently under construction in France, which will be the first nuclear-fusion device to sustain fusion reactions for long periods of time and to produce net energy.

Schuster has been appointed as an ITER Scientist Fellow in Plasma Control by the ITER Organization. He has been designated by the DOE as an expert member of the Integrated Operation Scenarios (IOS) Topical Group within the International Tokamak Physics Activity (ITPA), which he is currently co-chairing. He has recently served as Leader of the Operations and Control Topical Group within the U.S. Burning Plasma Organization (BPO). He is a recipient of the prestigious National Science Foundation (NSF) CAREER award and is the founder and first chair of the Technical Committee on Power Generation within the IEEE Control Systems Society (CSS). 

Nuclear theorists collaborate to explore 'heavy flavor' particles

Leading US researchers will develop framework for describing exotic particles' behavior at various stages in the evolution of hot nuclear matter

Grant and Award Announcement

DOE/BROOKHAVEN NATIONAL LABORATORY


Tracking Heavy Quarks 

IMAGE: COLLISIONS AT THE RELATIVISTIC HEAVY ION COLLIDER (RHIC) PRODUCE A HOT SOUP OF QUARKS AND GLUONS (CENTER)—AND ULTIMATELY THOUSANDS OF NEW PARTICLES. A NEW THEORY COLLABORATION SEEKS TO UNDERSTAND HOW HEAVY QUARKS (Q) AND ANTIQUARKS (Q-BAR) INTERACT WITH THIS QUARK-GLUON PLASMA (QGP) AND TRANSFORM INTO COMPOSITE PARTICLES THAT STRIKE THE DETECTOR. TRACKING THESE "HEAVY FLAVOR" PARTICLES CAN HELP SCIENTISTS UNRAVEL THE UNDERLYING MICROSCOPIC PROCESSES THAT DRIVE THE PROPERTIES OF THE QGP.

CREDIT: BROOKHAVEN NATIONAL LABORATORY

UPTON, NY—Scientists at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory will participate in a new Topical Theory Collaboration funded by DOE’s Office of Nuclear Physics to explore the behavior of so-called “heavy flavor” particles. These particles are made of quarks of the “charm” and “bottom” varieties, which are heavier and rarer than the “up” and “down” quarks that make up the protons and neutrons of ordinary atomic nuclei. By understanding how these exotic particles form, evolve, and interact with the medium created during powerful particle collisions, scientists will gain a deeper understanding of a unique form of matter known as a quark-gluon plasma (QGP) that filled the early universe.

These experiments take place at the Relativistic Heavy Ion Collider (RHIC) at Brookhaven Lab and the Large Hadron Collider (LHC) at Europe’s CERN laboratory. Scientists accelerate and smash together the nuclei of heavy atoms at energies high enough to set free the quarks and gluelike “gluons” that hold ordinary matter together. These collisions create a soup of quarks and gluons much like the matter that existed just after the Big Bang, some 14 billion years ago.

A powerful theory, known as quantum chromodynamics (QCD), describes very accurately how the plasma’s quarks and gluons interact. But understanding how those fundamental interactions lead to the complex characteristics of the plasma—a trillion-degree, dense medium that flows like a fluid with no resistance—remains a great challenge in modern research.

The Heavy-Flavor Theory (HEFTY) for QCD Matter Topical Theory Collaboration, which will be led by Ralf Rapp from Texas A&M University, seeks to close that gap in understanding by developing a rigorous and comprehensive theoretical framework for describing how heavy-flavor particles interact with the QGP.

“With a heavy-flavor framework in place, experiments tracking these particles can be used to precisely probe the plasma’s properties,” said Peter Petreczky, a theorist at Brookhaven Lab, who will serve as co-spokesperson for the collaboration along with Ramona Vogt from DOE’s Lawrence Livermore National Laboratory. “Our framework will also provide a foundation for using heavy-flavor particles as a probe at the future Electron-Ion Collider (EIC). Future experiments at the EIC will probe different forms of cold nuclear matter which are the precursors of the QGP in the laboratory,” Petreczky said.

In heavy ion collisions at RHIC and the LHC, heavy charm and bottom quarks are produced upon initial impact of the colliding nuclei. Their large masses cause a diffusive motion that can serve as a marker of the interactions in the QGP, including the fundamental process of quarks binding together to form composite particles called hadrons.

“The framework needs to describe these particles from their initial production when the nuclei first collide, through their subsequent diffusion through the QGP, and hadronization,” Petreczky said. “And these descriptions need to be embedded into realistic numerical simulations that enable quantitative comparisons to experimental data.”
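
As a rough illustration of the diffusion picture Petreczky describes, heavy-quark transport in a hot medium is often introduced in textbooks as Brownian motion governed by a Langevin equation: a drag term plus random thermal kicks whose strength is tied to the drag by the fluctuation-dissipation relation. The sketch below shows only that generic picture, not the HEFTY collaboration's actual framework, and every number in it (charm mass, temperature, drag rate) is an assumed, round value.

```python
# Illustrative Langevin sketch of heavy-quark diffusion in a hot medium
# (generic textbook picture; all parameter values below are assumptions).
import random

M     = 1.5    # charm-quark mass, GeV (assumed)
T     = 0.30   # medium temperature, GeV (assumed)
eta   = 0.2    # drag rate, 1/(fm/c) (assumed)
kappa = 2.0 * M * T * eta   # momentum-diffusion coefficient from fluctuation-dissipation
dt    = 0.01   # time step, fm/c

def evolve(p0, t_max, rng=random.Random(0)):
    """One-dimensional Langevin update: dp = -eta*p*dt + sqrt(kappa*dt)*noise."""
    p, t = p0, 0.0
    while t < t_max:
        p += -eta * p * dt + (kappa * dt) ** 0.5 * rng.gauss(0.0, 1.0)
        t += dt
    return p

# A charm quark launched at 5 GeV/c gradually loses memory of its initial momentum
# and relaxes toward thermal values of order sqrt(M*T) ~ 0.7 GeV/c.
samples = [evolve(5.0, t_max=20.0) for _ in range(2000)]
mean_p2 = sum(p * p for p in samples) / len(samples)
print(f"<p^2> after 20 fm/c ~ {mean_p2:.2f} GeV^2 (thermal expectation M*T = {M * T:.2f})")
```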

Swagato Mukherjee of Brookhaven Lab will be a co-principal investigator in the collaboration, responsible for lattice QCD computations. These calculations require some of the world’s most powerful supercomputers to handle the complex array of variables involved in quark-gluon interactions.

“Recently there has been significant progress in lattice QCD calculations related to heavy flavor probes of QGP,” Mukherjee said. “We are in an exciting time when the exascale computing facilities and the support provided by the topical collaboration will enable us to perform realistic calculations of the key quantities needed for theoretical interpretation of experimental results on heavy flavor probes.”

In addition to lattice QCD, the collaboration will use a variety of theoretical approaches, including rigorous statistical data analysis, to obtain the transport properties of the QGP.

“The resulting framework will help us unravel the underlying microscopic processes that drive the properties of the QGP, thereby providing unprecedented insights into the inner workings of nuclear matter based on QCD,” said Rapp of Texas A&M, the principal investigator of the project.

The HEFTY collaboration will receive $2.5 million from the DOE Office of Science, Office of Nuclear Physics, over five years. That funding will provide partial support for six graduate students and three postdoctoral fellows at 10 institutions, as well as a senior staff position at one of the national laboratories. It will also establish a bridge junior faculty position at Kent State University.

Partnering institutions include Brookhaven National Laboratory, Duke University, Florida State University, Kent State University, Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, Los Alamos National Laboratory, Massachusetts Institute of Technology, Texas A&M University, and Thomas Jefferson National Accelerator Facility.

Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.

Follow @BrookhavenLab on Twitter or find us on Facebook.


Soft robot detects damage, heals itself

Peer-Reviewed Publication

CORNELL UNIVERSITY

ITHACA, N.Y. – Cornell University engineers have created a soft robot capable of detecting when and where it was damaged – and then healing itself on the spot.

“Our lab is always trying to make robots more enduring and agile, so they operate longer with more capabilities,” said Rob Shepherd, associate professor of mechanical and aerospace engineering. “If you make robots operate for a long time, they’re going to accumulate damage. And so how can we allow them to repair or deal with that damage?”

Shepherd’s Organic Robotics Lab has developed stretchable fiber-optic sensors for use in soft robots and related components – from skin to wearable technology.

For self-healing to work, Shepherd says the key first step is that the robot must be able to identify that there is, in fact, something that needs to be fixed.

To do this, researchers have pioneered a technique using fiber-optic sensors coupled with LED lights capable of detecting minute changes on the surface of the robot.

These sensors are combined with a polyurethane urea elastomer that incorporates hydrogen bonds, for rapid healing, and disulfide exchanges, for strength.

The resulting SHeaLDS – self-healing light guides for dynamic sensing – provides a damage-resistant soft robot that can self-heal from cuts at room temperature without any external intervention.

To demonstrate the technology, the researchers installed the SHeaLDS in a soft robot resembling a four-legged starfish and equipped it with feedback control. The researchers then punctured one of its legs six times, and the robot detected the damage and self-healed each cut in about a minute. The robot could also autonomously adapt its gait based on the damage it sensed.
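
The release does not spell out the control logic, but the basic idea, detect a sustained drop in transmitted light in one of the guides and then deweight the affected leg while the elastomer heals, can be sketched in a few lines. Everything below (sensor names, thresholds, weighting scheme) is hypothetical and purely illustrative, not the authors' implementation.

```python
# Hypothetical sketch of the kind of damage-detection logic a fiber-optic sensing
# skin could feed into a gait controller: each light guide reports a transmitted-light
# intensity; a sustained drop below baseline suggests a cut, and the controller
# deweights that leg while the material self-heals. Thresholds are assumptions.

BASELINE = 1.00          # normalized light intensity for an undamaged guide
DAMAGE_DROP = 0.15       # assumed fractional drop that signals a cut

def detect_damage(intensities):
    """Return the indices of light guides whose signal dropped below the threshold."""
    return [i for i, x in enumerate(intensities)
            if x < BASELINE * (1.0 - DAMAGE_DROP)]

def adapt_gait(leg_weights, damaged_legs):
    """Reduce reliance on damaged legs until their sensors recover."""
    return [0.2 * w if i in damaged_legs else w
            for i, w in enumerate(leg_weights)]

# Example: leg 2's light guide reads 20% low after a puncture.
readings = [0.99, 1.01, 0.80, 0.98]
damaged = detect_damage(readings)               # -> [2]
weights = adapt_gait([1.0, 1.0, 1.0, 1.0], damaged)
print(damaged, weights)
```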

While the material is sturdy, it is not indestructible.

“They have similar properties to human flesh,” Shepherd said. “You don’t heal well from burning, or from things with acid or heat, because that will change the chemical properties. But we can do a good job of healing from cuts.”

Shepherd plans to integrate SHeaLDS with machine learning algorithms capable of recognizing tactile events to eventually create “a very enduring robot that has a self-healing skin but uses the same skin to feel its environment to be able to do more tasks.”

The paper, “Autonomous Self-Healing Optical Sensors for Damage Intelligent Soft-Bodied Systems,” was published Dec. 7 in Science Advances.

For additional information, see this Cornell Chronicle story.

Cornell University has dedicated television and audio studios available for media interviews.

Media Note: Pictures and video of a soft robot can be viewed and downloaded here: https://cornell.box.com/v/shealdssoftrobot

-30-

Adults’ interactions at mealtimes influence children’s future relationships with food


University of Houston research shows engaged feeding yields healthier food habits that are likely to last

Peer-Reviewed Publication

UNIVERSITY OF HOUSTON

HOUSTON, December 7 — Two University of Houston researchers are developing strategies to help parents artfully sidestep showdowns at the family table. The goal is to rein in mealtime angst early in children’s lives so they can nurture positive relationships with food that can carry them into healthy adulthood.

In an article in the journal Appetite, the research team reveals that guiding children in recognizing their innate sense of fullness and helping them understand the importance of responding to its cues are two key elements in what is referred to as responsive feeding practices. The term is used by psychologists and other experts to describe parental attentiveness and engagement during feeding that affect children’s overall attitudes and behavior around food.

To clarify the concept of responsive feeding practices, it helps to bring to mind their opposites, the nonresponsive feeding practices: enforcing the ‘clean plate club,’ for example, whether a young eater is hungry or not, or offering yummy dessert as a bribe for trudging through yucky vegetables or tedious chores.

Such unfortunate directions can encourage lifelong overeating, explained Leslie A. Frankel, associate professor in the Human Development and Family Sciences Program at the College of Education, and Ritu Sampige, a biomedical sciences senior in the UH Honors College and the article’s first author.

“We consider those types of nonresponsive feeding practices to be less optimal because they override children’s ability to internally regulate how much food they should consume,” Sampige said.

In addition to paying attention to hunger and fullness cues, adults guide the mealtime atmosphere with attitudes they bring to the family table, even when they do not realize it. Staying positively involved with their children throughout family meals can make lasting differences.

“It’s not a black-and-white issue. Parents tend to use a lot of tactics to get their children to eat and behave and do all the things we need them to do. The key difference is the level to which parents are engaged at mealtime and how successful they are in avoiding nonresponsive eating behaviors and food rewards,” Frankel said.

Frankel and Sampige, with fellow research colleague and co-author Caroline Bena Kuno of the Department of Psychology at Virginia State University, are uncovering an unrecognized tie-in with parents’ mental health status.

Previous research has noted that children of parents who suffer from anxiety or depression are, themselves, more likely to experience general mental health issues. But until now, few studies have linked the issue specifically with children’s resiliency around the temptations of food.

“Parents who are more able to be responsive in the moment tend to be more successful in guiding their children on good paths to healthy eating. Helping parents get the support they need is crucial for many reasons. And now we know one more, that success at the family table involves the parents’ ability to be engaged with children and provide in-the-moment responses to each child’s fullness cues,” Frankel explained.

But take it all in balance, she stressed. “Food is often at the center of celebration, and that’s a beautiful thing. So are family trips out for ice cream and the joyful times children have with their families and friends. The important factor is not to adhere too strictly to rules – or expect every mealtime to go smoothly – but to help parents steer toward feeding practices that appreciate children’s innate sense of when to stop eating and regular mealtime rituals that honor everyone around the table,” she said.

The full article is available in the journal Appetite.


Socialness is in the eye of the beholder

New study investigates how social interaction is perceived

Peer-Reviewed Publication

DARTMOUTH COLLEGE

Although people are generally predisposed to perceive interactions as social even in unlikely contexts, they don’t always agree on exactly which information is social, according to a new Dartmouth study.

The findings show that much of the brain responds more strongly to information that is interpreted as social versus non-social.

Published in the Journal of Neuroscience, the results contribute to research illustrating how humans are drawn to social connections.

It has long been known that humans tend to perceive social information even in inanimate stimuli, such as seeing a face in a rocky outcropping or interpreting the motion of two shapes as a social interaction. Previous studies on social perception using such geometric-shaped animations have often relied on labels assigned over 20 years ago by researchers who designated which animations should be classified as social versus random (non-social) motion. However, Dartmouth’s study employs a more subjective approach and is based on participants’ own reports of whether they perceive a given animation as social or not.

“Through this research, we set out to understand how and why people can perceive the same dynamic social information differently,” says lead author Rekha Varrier, a postdoctoral fellow in the Functional Imaging and Naturalistic Neuroscience (FINN) Lab in the department of psychological and brain sciences at Dartmouth. “By taking into account people’s own perceptions, we can understand the underlying neural processes better.”

To examine the behavioral and neural correlates of “conscious” social perception, the Dartmouth team used data from the social cognition task in the Human Connectome Project. With over 1,000 healthy adult participants, the project provides researchers with a large, public dataset to work from.

For the task, participants were asked to watch 10 animations, each 20 seconds long, of two or more shapes in motion while their brain activity was recorded in an fMRI scanner. The task was designed to be balanced: five of the animations were meant to be social and the other five were not. After watching each animation, participants were asked to indicate how they perceived the content by selecting one of the following labels: social, non-social, or unsure.

The results demonstrated that participants were biased towards perceiving information as social in that they were more likely to declare an animation intended to be random as “social” than to declare one intended to be social as “random.” Furthermore, participants tended to express higher uncertainty on how to categorize animations when the stimuli were meant to be perceived as non-social, perhaps indicating a reluctance to declare content as non-social altogether.
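
One way to read that result: the bias is an asymmetry between two error rates, how often intended-random clips are labeled “social” versus how often intended-social clips are labeled “non-social,” plus a higher “unsure” rate for the random clips. The sketch below computes those quantities from made-up response counts; the numbers are illustrative, not the study's data.

```python
# Illustrative sketch with made-up numbers (not the study's data): the reported
# "social bias" can be expressed as an asymmetry between two error rates, plus
# the rate of "unsure" responses for each intended category.

# Hypothetical counts of responses per intended category.
responses = {
    "intended_social": {"social": 4500, "non-social": 300, "unsure": 200},
    "intended_random": {"social": 1200, "non-social": 3200, "unsure": 600},
}

def rate(counts, label):
    """Fraction of responses in this category that carry the given label."""
    return counts[label] / sum(counts.values())

social_label_on_random = rate(responses["intended_random"], "social")
random_label_on_social = rate(responses["intended_social"], "non-social")
unsure_on_random       = rate(responses["intended_random"], "unsure")
unsure_on_social       = rate(responses["intended_social"], "unsure")

print(f"intended-random labelled social:  {social_label_on_random:.1%}")
print(f"intended-social labelled random:  {random_label_on_social:.1%}")
print(f"'unsure' rate (random vs social): {unsure_on_random:.1%} vs {unsure_on_social:.1%}")
# A bias toward social perception shows up as the first rate exceeding the second,
# and as higher uncertainty for the intended-random clips.
```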

“Humans depend on social structures to survive,” says senior author Emily Finn, an assistant professor of psychological and brain sciences, and principal investigator of the FINN Lab at Dartmouth. “Our brains and minds might be primed to see things as social because it confers an evolutionary advantage.”

Finn also notes, “We’re likely tuned to see social information in our surroundings because the cost of missing a social interaction would likely be higher than that of falsely perceiving something as social.”

“Our fMRI results indicate that much of the brain cares about social information,” says Varrier. “We found that the neural response to social content occurs early both in time and in the cortical hierarchy.”

The findings show that the processing of social information occurs in early brain regions that are typically involved in processing visual information, including the lateral occipital and temporal regions.

Through future work, the researchers plan to develop their own set of animations that are deliberately ambiguous, which will enable them to ask more precise questions about why there are different perceptions of social interactions across individuals. The results could potentially be used to better understand autism spectrum disorder and to gain a more nuanced understanding of social perception.

The Southern Hemisphere is stormier than the Northern, and we finally know why

UChicago research offers first concrete explanation for the difference, and shows it is getting even stormier over time

Peer-Reviewed Publication

UNIVERSITY OF CHICAGO

Topographical map of the world 

IMAGE: TOPOGRAPHICAL MAP OF THE WORLD, WITH HIGHER MOUNTAIN RANGES IN DARK BROWN AND LOWER AREAS IN GREEN. THE NORTHERN HEMISPHERE HAS MORE LAND MASS AND HIGHER MOUNTAINS THAN THE SOUTHERN HEMISPHERE, WHICH CONTRIBUTES IN PART TO FEWER STORMS, ACCORDING TO A NEW STUDY.

CREDIT: NASA JET PROPULSION LABORATORY

For centuries, sailors who had been all over the world knew where the most fearsome storms of all lay in wait: the Southern Hemisphere. “The waves ran mountain-high and threatened to overwhelm [the ship] at every roll,” wrote one passenger on an 1849 voyage rounding the tip of South America.

Many years later, scientists poring over satellite data could finally put numbers behind sailors’ intuition: The Southern Hemisphere is indeed stormier than the Northern, by about 24%, in fact. But no one knew why.

A new study led by University of Chicago climate scientist Tiffany Shaw lays out the first concrete explanation for this phenomenon. Shaw and her colleagues found two major culprits: ocean circulation and the large mountain ranges in the Northern Hemisphere.

The study also found that this storminess asymmetry has increased since the beginning of the satellite era in the 1980s. The increase was shown to be qualitatively consistent with climate change forecasts from physics-based models.

‘A tale of two hemispheres’

For a long time, we didn’t know very much about the weather in the Southern Hemisphere: most of the ways we observe weather are land-based, and the Southern Hemisphere has much more ocean than the Northern Hemisphere does.

But with the advent of satellite-based global observing in the 1980s, we could quantify just how extreme the difference was. The Southern Hemisphere has a stronger jet stream and more intense weather events.

Ideas had been circulated, but no one had established a definitive explanation for this asymmetry. Shaw—along with Osamu Miyawaki (PhD’22, now at the National Center for Atmospheric Research) and the University of Washington’s Aaron Donohoe—had hypotheses from their own and other previous studies, but they wanted to take the next step. This meant bringing together multiple lines of evidence, from observations, theory, and physics-based simulations of Earth’s climate.

“You can’t put the Earth in a jar,” Shaw explained, “so instead we use climate models built on the laws of physics and run experiments to test our hypotheses.”

They used a numerical model of Earth’s climate built on the laws of physics that reproduced the observations. Then they removed different variables one at a time, and quantified each one’s impact on storminess.

The first variable they tested was topography.  Large mountain ranges disrupt air flow in a way that reduces storms, and there are more mountain ranges in the Northern Hemisphere.

Indeed, when the scientists flattened every mountain on Earth, about half the difference in storminess between the two hemispheres disappeared.

The other half had to do with ocean circulation. Water moves around the globe like a very slow but powerful conveyor belt: it sinks in the Arctic, travels along the bottom of the ocean, rises near Antarctica and then flows up near the surface, carrying energy with it. This creates an energy difference between the two hemispheres. When the scientists tried eliminating this conveyor belt, they saw the other half of the difference in storminess disappear.
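
The attribution logic here is simple to state: compute a storminess-asymmetry metric in a control simulation, recompute it in runs with one factor removed, and see what fraction of the control asymmetry disappears. The sketch below walks through that bookkeeping with made-up numbers chosen only so that each factor removes roughly half of the asymmetry, as the study found; it is not the authors' model output.

```python
# Illustrative sketch with made-up numbers (not the study's output): attribution
# by comparing a hemispheric storminess-asymmetry metric between a control run
# and runs with one factor removed at a time.

def asymmetry(storm_sh, storm_nh):
    """Simple asymmetry metric: relative excess of SH storminess over NH."""
    return (storm_sh - storm_nh) / (0.5 * (storm_sh + storm_nh))

# Hypothetical eddy-activity values in arbitrary units
# (SH ~24% stormier than NH in this toy control run).
control        = asymmetry(storm_sh=1.24, storm_nh=1.00)
flat_mountains = asymmetry(storm_sh=1.12, storm_nh=1.00)  # mountains flattened
no_ocean_heat  = asymmetry(storm_sh=1.12, storm_nh=1.00)  # ocean heat transport removed

for name, run in [("flatten mountains", flat_mountains),
                  ("remove ocean heat transport", no_ocean_heat)]:
    explained = 1.0 - run / control
    print(f"{name:>28}: removes ~{explained:.0%} of the control asymmetry")
```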

Getting even stormier

Having answered the fundamental question of why the Southern Hemisphere is stormier, the researchers moved on to examine how storminess has changed since we’ve been able to track it.

Looking over past decades of observations, they found that the storminess asymmetry has increased over the satellite era beginning in the 1980s. That is, the Southern Hemisphere is getting even stormier, whereas the change on average in the Northern Hemisphere has been negligible.

The Southern Hemisphere storminess changes were connected to changes in the ocean. The researchers found a similar ocean influence in the Northern Hemisphere, but there its effect is canceled out by increased absorption of sunlight due to the loss of sea ice and snow.

The scientists checked and found that models used to forecast climate change as part of the Intergovernmental Panel on Climate Change assessment report were showing the same signals—increasing storminess in the Southern Hemisphere and negligible changes in the Northern—which serves as an important independent check on the accuracy of these models.

It may be surprising that such a deceptively simple question—why one hemisphere is stormier than another—went unanswered for so long, but Shaw explained that the field of weather and climate physics is relatively young compared to many other fields.

It was only after World War II that scientists began to build models of the physics driving large-scale weather and climate (to which key contributions were made at the University of Chicago by Prof. Carl-Gustaf Rossby).

But a deep understanding of the physical mechanisms behind the climate and its response to human-caused changes, such as those laid out in this study, is crucial for predicting and understanding what will happen as climate change accelerates.

“By laying this foundation of understanding, we increase confidence in climate change projections and thereby help society better prepare for the impacts of climate change,” Shaw said. “One of the major threads in my research is to understand if models are giving us good information now so that we can trust what they say about the future. The stakes are high and it’s important to get the right answer for the right reason.”