Thursday, March 02, 2023

Energy: More than two million citizens power Europe’s renewable energy transition

Peer-Reviewed Publication

SCIENTIFIC REPORTS

More than two million citizens across 30 European countries have been involved in thousands of projects and initiatives as part of efforts to transition to renewable energy, according to an analysis published in Scientific Reports. These findings, which include estimated investments of between 6.2 and 11.3 billion Euros, highlight the important role of collective action in the decarbonisation of Europe.

The energy system in Europe is undergoing a significant transition towards renewables and decarbonisation. However, the contribution of citizen-led efforts, such as energy cooperatives, in this sphere is largely unknown.

Valeria Schwanitz and colleagues quantified the contributions of citizen-led energy initiatives towards the transition to low-carbon energy in 30 European countries between 2000 and 2021. They assessed the numbers of initiatives, people involved, specific energy projects and renewable energy facilities installed, and the total funds invested.

The authors estimate that 10,540 citizen-led initiatives were recorded during this time period, examples of which include a renewable energy community in Borutta, Italy and an eco-village community in Sweden. Within these initiatives, 22,830 specific projects were undertaken such as the installation of wind turbines and solar panels on local buildings, and the promotion of behavioural change and climate action within communities. The authors estimate that 2,010,600 people collectively participated in these activities – including 391,500 individuals in Germany and 306,650 individuals in Denmark.

Additionally, the authors calculate that between 6.2 and 11.3 billion Euros were invested in citizen-led energy activities. This equated to an investment of up to 5,700 Euros per individual. The installed renewable facilities had a capacity of between 7.2 and 9.9 gigawatts, and the authors calculate that these facilities produced between 8,500 and 11,700 kilowatt hours annually per person involved in the initiatives. This approximately covers the electricity needs of a typical European household.
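
As a rough consistency check on these aggregates, the per-person figures follow from simple division over the 2,010,600 participants. Below is a minimal sketch in Python; the implied capacity factor at the end is a back-of-the-envelope inference, not a figure from the paper:

# Back-of-the-envelope check of the per-person figures quoted above.
participants = 2_010_600
invest_high_eur = 11.3e9
print(round(invest_high_eur / participants))      # ~5,620 -> "up to 5,700 Euros"

# Annual generation implied by 8,500 kWh per person versus 7.2 GW installed:
generation_twh = 8_500 * participants / 1e9       # ~17.1 TWh (1 TWh = 1e9 kWh)
max_output_twh = 7.2 * 8_760 / 1e3                # ~63.1 TWh if run flat out
print(round(generation_twh / max_output_twh, 2))  # ~0.27 implied capacity factor

A capacity factor near 27 percent is plausible for a mix of wind and solar installations, which lends the reported ranges internal consistency.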

The authors conclude that more data and reporting standards are needed to develop comprehensive statistics for the contribution of citizen efforts to the energy transition in Europe.

###

Article details

Statistical evidence for the contribution of citizen-led initiatives and projects to the energy transition in Europe

DOI: 10.1038/s41598-023-28504-4

Corresponding Authors:

Valeria Schwanitz
Western Norway University of Applied Science, Sogndal, Norway
Email: valerias@hvl.no

Please link to the article in online versions of your report (the URL will go live after the embargo ends): https://www.nature.com/articles/s41598-023-28504-4.

Robot provides unprecedented views below Antarctic ice shelf


Peer-Reviewed Publication

CORNELL UNIVERSITY

High in a narrow, seawater-filled crevasse in the base of Antarctica’s largest ice shelf, cameras on the remotely operated Icefin underwater vehicle relayed a sudden change in scenery.

Walls of smooth, cloudy meteoric ice suddenly turned green and rougher in texture, transitioning to salty marine ice.

Nearly 1,900 feet above, near where the surface of the Ross Ice Shelf meets Kamb Ice Stream, a U.S.-New Zealand research team recognized the shift as evidence of “ice pumping” – a process important to the ice shelf’s stability that had never before been directly observed in an ice shelf crevasse.

“We were looking at ice that had just melted less than 100 feet below, flowed up into the crevasse and then refrozen,” said Justin Lawrence, visiting scholar at the Cornell Center for Astrophysics and Planetary Science. “And then it just got weirder as we went higher up.”

The Icefin robot’s unprecedented look inside a crevasse, and observations revealing more than a century of geological processes beneath the ice shelf, are detailed in “Crevasse Refreezing and Signatures of Retreat Observed at Kamb Ice Stream Grounding Zone,” published March 2 in Nature Geoscience.

The paper reports results from a 2019 field campaign to Kamb Ice Stream supported by Antarctica New Zealand and other New Zealand research agencies, led by Christina Hulbe, professor at the University of Otago, and colleagues. Through support from NASA’s Astrobiology Program, a research team led by Britney Schmidt, associate professor of astronomy and earth and atmospheric sciences at Cornell University, was able to join the expedition and deploy Icefin. Schmidt’s Planetary Habitability and Technology Lab has been developing Icefin for nearly a decade, beginning at the Georgia Institute of Technology.

Combined with recently published investigations of the fast-changing Thwaites Glacier – explored the same season by a second Icefin vehicle – the research is expected to improve models of sea-level rise by providing the first high-resolution views of ice, ocean and sea floor interactions at contrasting glacier systems on the West Antarctic Ice Sheet.

Thwaites, which is exposed to warm ocean currents, is one of the continent’s most unstable glaciers. Kamb Ice Stream, where the ocean is very cold, has been stagnant since the late 1800s. Kamb currently offsets some of the ice loss from western Antarctica, but if it reactivates, it could increase the region’s contribution to sea-level rise by 12%.

“Antarctica is a complex system and it’s important to understand both ends of the spectrum – systems already undergoing rapid change as well as those quieter systems where future change poses a risk,” Schmidt said. “Observing Kamb and Thwaites together helps us learn more.”

NASA funded Icefin’s development and the Kamb exploration to extend ocean exploration beyond Earth. Marine ice like that found in the crevasse may be an analog for conditions on Jupiter’s icy moon Europa, the target of NASA’s Europa Clipper orbital mission slated for launch in 2024. Later lander missions might one day search directly for microbial life in the ice.

Icefin carries a full complement of oceanographic instruments on a modular frame more than 12 feet long and less than 10 inches in diameter. It was lowered on a tether through a borehole the New Zealand team drilled through the ice shelf with hot water.

During three dives spanning more than three miles near the grounding zone where Kamb transitions to the floating Ross shelf, Icefin mapped five crevasses – ascending one – and the sea floor, while recording water conditions including temperature, pressure and salinity.

The team observed diverse ice features that provide valuable information about water mixing and melt rates. They included golf ball-like dimples, ripples, vertical runnels and the “weirder” formations near the top of the crevasse: globs of ice and finger-like protrusions resembling brinicles.

Ice pumping observed in the crevasse likely contributes to the relative stability of the Ross Ice Shelf – the world’s largest by area, the size of France – compared to Thwaites Glacier, the researchers said.

“It’s a way these big ice shelves can protect and heal themselves,” said Peter Washam, a polar oceanographer on the Icefin science team and the paper’s second author. “A lot of the melting that happens deep near the grounding line, that water then refreezes and accretes onto the bottom of the ice as marine ice.”

On the sea floor, Icefin mapped parallel sets of ridges that the researchers believe are impressions left behind by ice shelf crevasses – and a record of 150 years of activity since the Kamb stream stagnated. As its grounding line retreated, the ice shelf thinned, causing the crevasses to lift away. The ice’s slow movement over time shifted the crevasses seaward of the ridges.

“We can look at those sea floor features and directly connect them to what we saw on the ice base,” said Lawrence, the paper’s lead author, now a program manager and planetary scientist at Honeybee Robotics. “We can, in a way, rewind the process.”

In addition to Lawrence, Washam and Schmidt, Cornell co-authors of the research are Senior Research Engineers Matthew Meister, who led the Icefin engineering team, and Andrew Mullen; Research Engineer Daniel Dichek; and Program Manager Enrica Quartini. Schmidt’s team also includes Research Engineer Frances Bryson, and at Georgia Tech, doctoral students Benjamin Hurwitz and Anthony Spears.

Also contributing were partners from New Zealand at the National Institute of Water and Atmospheric Research (NIWA); University of Auckland; University of Otago; and Victoria University of Wellington.

NASA supported the research through the Planetary Science and Technology from Analog Research program’s Project RISE UP (Ross Ice Shelf and Europa Underwater Probe), and the Future Investigators in NASA Earth and Space Science and Technology program. Additional support came from New Zealand’s Antarctic Science Platform, the U.S. Antarctic Program and Victoria University of Wellington’s Hot Water Drilling initiative.

Wisconsin cave holds tantalizing clues to ancient climate changes, future shifts


Peer-Reviewed Publication

UNIVERSITY OF WISCONSIN-MADISON

Even in their dark isolation from the atmosphere above, caves can hold a rich archive of local climate conditions and how they've shifted over the eons. Formed over tens of thousands of years, speleothems — rock formations unique to caves better known as stalagmites and stalactites — hold secrets to the ancient environments from which they formed.

A newly published study of a stalagmite found in a cave in southern Wisconsin reveals previously undetected history of the local climate going back thousands of years. The new findings provide strong evidence that a series of massive and abrupt warming events that punctuated the most recent ice age likely enveloped vast swaths of the Northern Hemisphere.

The research, conducted by a team of scientists at the University of Wisconsin–Madison, appears March 2 in the journal Nature Geoscience. It's the first study to identify a possible link between ice age warm-ups recorded in the Greenland ice sheet — known as Dansgaard-Oeschger events — and climate records from deep within the interior of central North America.

"This is the only study in this area of the world that is recording these abrupt climate events during the last glacial period," says Cameron Batchelor, who led the analysis while completing her PhD at UW–Madison. Batchelor is now a postdoctoral fellow with the National Science Foundation working at the Massachusetts Institute of Technology.

The study is based on an exceptionally detailed chemical and physical analysis of a stalagmite that formed in the Cave of the Mounds, a tourist attraction and educational destination.

"At Cave of the Mounds our mission is to interpret this geologic wonder for our many annual visitors," says Joe Klimczak, general manager of the cave, which is a designated national natural landmark. "We are thrilled to deepen our understanding of the cave thanks to this world-class research and very exciting results.”

The stalagmite Batchelor and her team analyzed grew extremely slowly — taking roughly 20,000 years to reach the length of a human pinky finger.

The finger-length subterranean rock formed from a complex process that began in the sky. Water that originally fell as precipitation from the atmosphere soaked into the ground and percolated through soil and cracks in bedrock, dissolving tiny bits of limestone along the way. Some of that dissolved limestone was then left behind as countless drips of water fell from the ceiling of Cave of the Mounds, gradually accumulating into thousands of exceedingly thin layers of a mineral called calcite.

"And because those calcite layers are formed from that original precipitation, they're locking in the oxygen in the H2O originating from that precipitation," says Batchelor.

Therein lies the key to reconstructing an ancient climate record from a small, otherwise unremarkable rock. The oxygen trapped in the calcite exists in a couple of varieties – known as isotopes – that scientists can use to glean information about the environmental conditions present during the precipitation events that formed it. That includes the temperature and possible sources of the rain and snow that fell atop the Cave of the Mounds over thousands of years.
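
A short aside on the convention behind such measurements: oxygen-isotope records are typically reported in delta notation, comparing a sample’s heavy-to-light oxygen ratio against a reference standard. This is a general geochemistry formula, not one specific to this study:

\delta^{18}\mathrm{O} = \left(\frac{(^{18}\mathrm{O}/^{16}\mathrm{O})_{\mathrm{sample}}}{(^{18}\mathrm{O}/^{16}\mathrm{O})_{\mathrm{standard}}} - 1\right) \times 1000\ \text{‰}

Shifts in a calcite layer’s delta value reflect the temperature and moisture sources at the time the water originally fell as precipitation.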

Batchelor's team used a specialized imaging technique that allowed them to identify layers within the stalagmite representing annual growth bands — much like how tree rings record a season’s worth of growth. Using another technique, they identified the isotopes in the tiny layers, revealing that present-day southern Wisconsin experienced a number of very large average temperature swings of up to 10 C (or about 18 F) between 48,000 and 68,000 years ago. Several of the temperature swings occurred over the course of around a decade.

While the dating information is not precise enough to definitively tie the temperature swings to the Dansgaard-Oeschger events recorded in Greenland ice cores, the researchers can say with confidence they occurred within similar timeframes. The team also performed climate simulations that bolstered the hypothesis that warming events occurred tens of thousands of years ago in the region of North America that includes present-day Wisconsin, and that the climate records from Cave of the Mounds and the Greenland ice sheet are indeed linked.

This potential link is exciting for Batchelor because it offers a climate story about central North America that has so far gone untold. Previous research from the mid-continent has not resolved signals of these large temperature swings, also called excursions.

"One theory was that the mid-continent is relatively immune to abrupt climate changes, and that maybe that's because it's surrounded by landmass, and there's some type of buffering happening," says Batchelor. "However, when we went and measured, we saw these really large excursions, and we were like, 'Oh, no, something is definitely happening.'"

That something — a rapidly changing climate — is unfolding yet again today, thanks to humans and our use of fossil fuels. Batchelor says she hopes her work in Wisconsin, and now a cave in the Canadian subarctic that she is studying for her postdoc, helps fill a big data gap about the history and potential future of abrupt climate changes in the mid-continent of North America.

This study was supported by grants from the National Science Foundation (P2C2-1805629, EAR-1355590, EAR-1658823). Further resources were provided by the U.S. Department of Energy (DE-AC05-00OR22725), the Wisconsin Alumni Research Foundation and the Isotope Laboratory at the University of Minnesota. At UW–Madison, Shaun Marcott, Ian Orland and Feng He contributed to this study, as did R. Lawrence Edwards at the University of Minnesota.

Crowdsourced reports can quickly identify an earthquake’s impact


Peer-Reviewed Publication

SEISMOLOGICAL SOCIETY OF AMERICA

Within minutes, a statistical model based on a global database of public reports of ground shaking can be used to identify an earthquake as a high- or low-impact event, according to a new study published in The Seismic Record.

High-impact earthquakes, as defined by the study, are those associated with at least one destroyed building, at least 50 damaged buildings, at least two deaths, or any documented financial losses.

The researchers were able to provide impact results for 393 global earthquake events from 2021 within 10 minutes. Their model was developed using more than 1.5 million globally collected felt reports from more than 10,000 earthquakes of any magnitude between 2014 and 2021. The reports come from the Euro-Mediterranean Seismological Center (EMSC)’s LastQuake app, which alerts populations and collects user reports of earthquake shaking in real time.

While the model still has difficulty separating some high-impact from low-impact events, it was able to definitively label a large number of events as low-impact, according to University of Potsdam researcher Henning Lilienkamp and colleagues.

Quickly determining the impact of an earthquake is essential for decision-makers and emergency response operators, as they guide the immediate direction of the actions that can protect lives and mitigate further damage.

Of course, in some cases such as the February 2023 catastrophic sequence of earthquakes affecting parts of Turkey and Syria, “it is immediately clear that emergency measures are urgent,” Lilienkamp said.

Events like the magnitude 5.9 earthquake that hit remote areas of Afghanistan and Pakistan on 22 June 2022, causing over 1,000 fatalities, “are where our model could be of interest, because, according to the EMSC, it was not clear for hours whether considerable impact was to be expected or not,” he added.

Rapid impact assessment systems such as the U.S. Geological Survey PAGER provide impact estimates within 30 minutes—although they can return results in as little as five minutes in heavily instrumented areas—using ground acceleration data and other seismic observations, along with crowdsourced reports.

While many kinds of data can go into assessing an earthquake in its immediate aftermath, EMSC “has built up a huge source of information that has barely been utilized in a quantitative way in seismic hazard and risk-related studies so far,” Lilienkamp explained. “We were convinced that this database is too valuable to be disregarded in the long run, because it is collected efficiently and on a global scale, including in regions that lack expensive seismic instrumentation.”

The goal of Lilienkamp and colleagues was to see whether a useful assessment could be developed quickly using only crowdsourced data. The basis of their method converts a felt report into a “pseudo-intensity” value that quantifies the level of shaking.

Being able to identify an earthquake as low impact could provide some comfort to the public, as these kinds of earthquakes can still be felt and may cause anxiety as a result, the researchers note in their paper.

Lilienkamp and colleagues suggest that their method could be used to develop a “traffic light” system based on impact scores, where green-level scores would require no further action by decision-makers, yellow would prompt further investigation, and red could raise an alert.
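
The published model is statistical and trained on the EMSC felt-report database; the Python sketch below only illustrates the overall shape of such a pipeline, with invented pseudo-intensity scoring and invented traffic-light thresholds:

import statistics

def pseudo_intensity(felt_report):
    """Map a crowdsourced felt report to a shaking-intensity proxy.
    Here we simply reuse the reporter's self-rated shaking level (1-12)."""
    return felt_report["shaking_level"]

def impact_score(reports):
    """Aggregate per-event pseudo-intensities into a 0-1 impact score."""
    intensities = [pseudo_intensity(r) for r in reports]
    # Invented heuristic: rescale the mean intensity into [0, 1].
    return min(max((statistics.mean(intensities) - 2) / 8, 0.0), 1.0)

def traffic_light(score, yellow=0.3, red=0.7):
    """Suggested decision layer: green / yellow / red alert levels."""
    return "red" if score >= red else "yellow" if score >= yellow else "green"

reports = [{"shaking_level": s} for s in (3, 5, 7, 8, 6)]
score = impact_score(reports)
print(score, traffic_light(score))   # 0.475 yellow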

“As seismologists, we need to get a better understanding of how exactly decision-makers and emergency services like fire departments actually act in case of an emergency, which kind of information is useful, and at which probabilities of high impact they would prefer to raise an alarm,” said Lilienkamp. “Careful communication of our model’s abilities and the individual needs of potential end-users will be key for a practical implementation of traffic-light systems.”

For the 6 February sequence in Turkey, Lilienkamp said the LastQuake service collected about 6,500 reports from the first magnitude 7.8 shock and about 4,800 reports from the second magnitude 7.5 shock. “For the first shock it took about four and a half minutes to collect 50 reports—the number required to run our model—and after 10 minutes 1,232 reports were available.”

As is usual, there was an initial lack of reports from the area where shaking was most intense. “This effect is well known and represents the fact that people under such extreme circumstances of course prioritize finding shelter and rescuing people in danger, over submitting felt reports on their smartphones,” Lilienkamp said.

Stanford researchers develop a new way to identify bacteria in fluids

An innovative adaptation of the technology in an old inkjet printer plus AI-assisted imaging leads to a faster, cheaper way to spot bacteria in blood, wastewater, and more.


Peer-Reviewed Publication

STANFORD UNIVERSITY SCHOOL OF ENGINEERING

Pattern printout 

IMAGE: A DERIVATIVE OF THE STANFORD UNIVERSITY LOGO PRINTED FROM DROPLETS CONTAINING A 1:1 MIXTURE OF STAPHYLOCOCCUS EPIDERMIDIS BACTERIA AND MOUSE RED BLOOD CELLS (RBCS) ONTO A GOLD-COATED SLIDE. DROPLETS WERE PRINTED USING A 147 MHZ ACOUSTIC TRANSDUCER.

CREDIT: FAREEHA SAFIR

Shine a laser on a drop of blood, mucus, or wastewater, and the light reflecting back can be used to positively identify bacteria in the sample.

“We can find out not just that bacteria are present, but specifically which bacteria are in the sample – E. coli, Staphylococcus, Streptococcus, Salmonella, anthrax, and more,” said Jennifer Dionne, an associate professor of materials science and engineering and, by courtesy, of radiology at Stanford University. “Every microbe has its own unique optical fingerprint. It’s like the genetic and proteomic code scribbled in light.”

Dionne is senior author of a new study in the journal Nano Letters detailing an innovative method her team has developed that could lead to faster (almost immediate), inexpensive, and more accurate microbial assays of virtually any fluid one might want to test for microbes.

Traditional culturing methods still in use today can take hours if not days to complete. A tuberculosis culture takes 40 days, Dionne said. The new test can be done in minutes and holds the promise of better and faster diagnoses of infection, improved use of antibiotics, safer foods, enhanced environmental monitoring, and faster drug development, says the team.

Old dogs, new tricks

The breakthrough lies not in the fact that bacteria display these spectral fingerprints – that has been known for decades – but in how the team has been able to reveal those spectra amid the blinding array of light reflecting from each sample.

“Not only does each type of bacterium demonstrate unique patterns of light but virtually every other molecule or cell in a given sample does too,” said first author Fareeha Safir, a PhD student in Dionne’s lab. “Red blood cells, white blood cells, and other components in the sample are sending back their own signals, making it hard if not impossible to distinguish the microbial patterns from the noise of other cells.”

A milliliter of blood – about the size of a raindrop – can contain billions of cells, only a few of which might be microbes. The team had to find a way to separate and amplify the light reflecting from the bacteria alone. To do that, they ventured along several surprising scientific tangents, combining a four-decade-old technology borrowed from computing – the inkjet printer – and two cutting-edge technologies of our time – nanoparticles and artificial intelligence.

“The key to separating bacterial spectra from other signals is to isolate the cells in extremely small samples. We use the principles of inkjet printing to print thousands of tiny dots of blood instead of interrogating a single large sample,” explained co-author Butrus “Pierre” Khuri-Yakub, a professor emeritus of electrical engineering at Stanford who helped develop the original inkjet printer in the 1980s.

“But you can’t just get an off-the-shelf inkjet printer and add blood or wastewater,” Safir emphasized. To circumvent challenges in handling biological samples, the researchers modified the printer to put samples to paper using acoustic pulses. Each dot of printed blood is then just two trillionths of a liter in volume – more than a billion times smaller than a raindrop. At that scale, the droplets are so small they may hold just a few dozen cells.

In addition, the researchers infused the samples with gold nanorods that attach themselves to bacteria, if present, and act like antennas, drawing the laser light toward the bacteria and amplifying the signal some 1500 times its unenhanced strength. Appropriately isolated and amplified, the bacterial spectra stick out like scientific sore thumbs.

The final piece of the puzzle is the use of machine learning to compare the several spectra reflecting from each printed dot of fluid to spot the telltale signatures of any bacteria in the sample.
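
As an illustration of that final step, here is a minimal sketch of training a spectral classifier with scikit-learn. The data is random placeholder noise (so accuracy will hover at chance) and the labels are merely examples; none of this is the study’s actual dataset or model:

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier

# Placeholder "spectra": 300 samples, each with intensities at 600 bins.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 600))
y = rng.choice(["S. epidermidis", "E. coli", "red blood cell"], size=300)

# Train on one split, evaluate on the held-out split; with real spectra the
# classifier would learn each cell type's optical fingerprint.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))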

“It’s an innovative solution with the potential for life-saving impact. We are now excited for commercialization opportunities that can help redefine the standard of bacterial detection and single-cell characterization,” said senior co-author Amr Saleh, a former postdoctoral scholar in Dionne’s lab and now a professor at Cairo University.

Catalyst for collaboration

This sort of cross-disciplinary collaboration is a hallmark of the Stanford tradition in which experts from seemingly disparate fields bring their varying expertise to bear to solve longstanding challenges with societal impact.

This particular approach was hatched during a lunchtime meeting at a café on campus and, in 2017, was among the first recipients of a series of $3 million grants distributed by Stanford’s Catalyst for Collaborative Solutions. Catalyst grants are specifically targeted at inspiring interdisciplinary risk-taking and collaboration among Stanford researchers in high-reward fields such as health care, the environment, autonomy, and security.

While this technique was created and perfected using samples of blood, Dionne is equally confident that it can be applied to other sorts of fluids and target cells beyond bacteria, like testing drinking water for purity or perhaps spotting viruses faster, more accurately, and at lower cost than present methods.

Additional Stanford co-authors include former PhD student Loza Tadesse; research staff Kamyar Firouzi; Niaz Banaei, professor of pathology and of medicine at the School of Medicine; and Stefanie Jeffrey, the John and Marva Warnock Professor, Emerita, in the School of Medicine. Nhat Vu from Pumpkinseed Technologies is also a co-author. Banaei, Dionne, Jeffrey, and Khuri-Yakub are also members of Stanford Bio-X. Dionne is also senior associate vice provost of research platforms/shared facilities, a member of the Cardiovascular Institute and the Wu Tsai Neurosciences Institute, and an affiliate of the Precourt Institute for Energy. Jeffrey is also a member of the Stanford Cancer Institute. Khuri-Yakub is also a member of the Cardiovascular Institute, the Stanford Cancer Institute, and the Wu Tsai Neurosciences Institute.

This research was funded by the Stanford Catalyst for Collaborative Solutions, the Chan Zuckerberg Biohub Investigator Program, the NIH-NCATS-CTSA, the Gates Foundation, the National Science Foundation, the NIH New Innovator Award, and seed funds from the Stanford Center for Innovation in Global Health. Part of this work was performed at the Stanford Nano Shared Facilities (SNSF) and the Soft & Hybrid Materials Facility (SMF), which are supported by the National Science Foundation and the National Nanotechnology Coordinated Infrastructure.

Details of the printed dots on a gold-coated slide (a), where false coloring in the close-up of a single dot shows red blood cells in red and Staphylococcus epidermidis bacteria in blue. The researchers also printed onto an agar-coated slide (b) to show how the dots fare under incubation.

CREDIT

Fareeha Safir

Integrating humans with AI in structural design


A process that seeks feedback from human specialists proves more effective at optimization than automated systems working alone.

Peer-Reviewed Publication

MASSACHUSETTS INSTITUTE OF TECHNOLOGY                                          

Interactive Design 

IMAGE: THIS SEQUENCE SHOWS AN EXAMPLE OF THE ITERATIVE DESIGN PROCESS IN ACTION. ON TOP, YOU CAN SEE THE AI-DESIGNED INITIAL VERSION OF A SUPPORT BEAM. IN THE SECOND AND THIRD IMAGES, A HUMAN OPERATOR HIGHLIGHTS TWO SUPPORT SEGMENTS AS UNNECESSARY. THE BOTTOM IMAGE SHOWS HOW THE AI SYSTEM INCORPORATES THAT INPUT BY ELIMINATING THOSE SEGMENTS AND STRENGTHENING OTHERS TO COMPENSATE.

CREDIT: COURTESY OF DAT HA AND JOSEPHINE CARSTENSEN; EDITED BY MIT NEWS

Modern fabrication tools such as 3D printers can make structural materials in shapes that would have been difficult or impossible using conventional tools. Meanwhile, new generative design systems can take great advantage of this flexibility to create innovative designs for parts of a new building, car, or virtually any other device.

But such “black box” automated systems often fall short of producing designs that are fully optimized for their purpose, such as providing the greatest strength in proportion to weight or minimizing the amount of material needed to support a given load. Fully manual design, on the other hand, is time-consuming and labor-intensive.

Now, researchers at MIT have found a way to achieve some of the best of both of these approaches. They used an automated design system but stopped the process periodically to allow human engineers to evaluate the work in progress and make tweaks or adjustments before letting the computer resume its design process. Introducing a few of these iterations produced results that performed better than those designed by the automated system alone, and the process was completed more quickly compared to the fully manual approach.

The results are reported this week in the journal Structural and Multidisciplinary Optimization, in a paper by MIT doctoral student Dat Ha and assistant professor of civil and environmental engineering Josephine Carstensen.

The basic approach can be applied to a broad range of scales and applications, Carstensen explains, for the design of everything from biomedical devices to nanoscale materials to structural support members of a skyscraper. Already, automated design systems have found many applications. “If we can make things in a better way, if we can make whatever we want, why not make it better?” she asks.

“It’s a way to take advantage of how we can make things in much more complex ways than we could in the past,” says Ha, adding that automated design systems have already begun to be widely used over the last decade in automotive and aerospace industries, where reducing weight while maintaining structural strength is a key need.

“You can take a lot of weight out of components, and in these two industries, everything is driven by weight,” he says. In some cases, such as internal components that aren’t visible, appearance is irrelevant, but for other structures aesthetics may be important as well. The new system makes it possible to optimize designs for visual as well as mechanical properties, and in such decisions the human touch is essential.

As a demonstration of their process in action, the researchers designed a number of structural load-bearing beams, such as might be used in a building or a bridge. In their iterations, they saw that a design had an area that could fail prematurely, so they selected that feature and required the program to address it. The computer system then revised the design accordingly, removing the highlighted strut and strengthening some other struts to compensate, leading to an improved final design.

The process, which they call Human-Informed Topology Optimization, begins by setting out the needed specifications — for example, a beam needs to be this length, supported on two points at its ends, and must support this much of a load. “As we’re seeing the structure evolve” on the computer screen in response to initial specification, Carstensen says, “we interrupt the design and ask the user to judge it. The user can select, say, ‘I’m not a fan of this region, I’d like you to beef up or beef down this feature size requirement.’ And then the algorithm takes into account the user input.”
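
A minimal sketch of that loop, using a toy one-dimensional "beam" of element densities; every function and number here is an invented stand-in for the authors' actual Human-Informed Topology Optimization method:

import numpy as np

n, iters, review_every = 20, 30, 10
load = np.linspace(1.0, 0.2, n)   # pretend load profile along the beam
density = np.full(n, 0.5)         # initial uniform material distribution
user_min = {}                     # element index -> user-imposed minimum

def user_review(density):
    # Stand-in for interactive feedback: "beef up" the thinnest element.
    return {int(np.argmin(density)): 0.8}

for it in range(iters):
    # Automated step: nudge material toward heavily loaded elements.
    density = np.clip(density + 0.05 * (load - density), 0.0, 1.0)
    if (it + 1) % review_every == 0:          # pause periodically for review
        user_min.update(user_review(density))
    for i, floor in user_min.items():         # honor the user's constraints
        density[i] = max(density[i], floor)

print(np.round(density, 2))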

While the result is not as ideal as what might be produced by a fully rigorous yet significantly slower design algorithm that considers the underlying physics, she says it can be much better than a result generated by a rapid automated design system alone. “You don’t get something that’s quite as good, but that was not necessarily the goal. What we can show is that instead of using several hours to get something, we can use 10 minutes and get something much better than where we started off.”

The system can be used to optimize a design based on any desired properties, not just strength and weight. For example, it can be used to minimize fracture or buckling, or to reduce stresses in the material by softening corners.

Carstensen says, “We’re not looking to replace the seven-hour solution. If you have all the time and all the resources in the world, obviously you can run these and it’s going to give you the best solution.” But for many situations, such as designing replacement parts for equipment in a war zone or a disaster-relief area with limited computational power available, “then this kind of solution that catered directly to your needs would prevail.”

Similarly, for smaller companies manufacturing equipment in essentially “mom and pop” businesses, such a simplified system might be just the ticket. The new system they developed is not only simple and efficient to run on smaller computers, but it also requires far less training to produce useful results, Carstensen says. A basic two-dimensional version of the software, suitable for designing basic beams and structural parts, is freely available now online, she says, as the team continues to develop a full 3D version.

“By integrating engineering ‘intuition’ (or engineering ‘judgement’) into a rigorous yet computationally efficient topology optimization process, the human engineer is offered the possibility of guiding the creation of optimal structural configurations in a way that was not available to us before,” he adds. “Her findings have the potential to change the way engineers tackle ‘day-to-day’ design tasks.”

###

Written by David Chandler, MIT News Office

Smithsonian science backs Caribbean Ocean conservation: Panama will protect 54% of its oceans

Business Announcement

SMITHSONIAN TROPICAL RESEARCH INSTITUTE

Panama Blue Pioneer 

VIDEO: PANAMA, PIONERO AZUL, A SHORT VIDEO DESCRIBING HOW SMITHSONIAN SCIENCE HAS SUPPORTED PANAMA'S CREATION OF NEW MARINE PROTECTED AREAS.

CREDIT: STRI (SMITHSONIAN TROPICAL RESEARCH INSTITUTE)

Within the framework of the Our Ocean Conference on Mar. 2-3, 2023 in Panama City, Panama’s President Laurentino Cortizo and Minister of Environment Milciades Concepción added 36,058 square miles to the Banco Volcán marine protected area in the Caribbean. During the last two decades, researchers at the Smithsonian Tropical Research Institute (STRI), along with local and international collaborators, have offered much of the science backing Panama’s successful proposals to create the MPAs, bringing more than 50% of its ocean waters under some form of management or protection.

Created in 2015 with 5,487 square miles, the “Banco Volcán Managed Resources Area” contains unique natural resources, such as deep-sea mountain ranges and high biodiversity that includes various migratory species as well as protected and endangered species, all important for the health of the oceans. The proposal to expand its limits was made in response to a request from the Ministry of the Environment last year, following a review of the protected area by STRI scientist Héctor Guzmán that considered the ecological integrity of the region.

The expansion of the Banco Volcán Marine Protected Area in 2023 has not only led Panama to protect more than 54% of its territorial waters, but will also buffer climate change, protect Panama's deep-sea mountain environments, and help safeguard fauna from human interventions, including several fish and invertebrate species of high commercial value, such as the Caribbean spiny lobster (Panulirus argus). Therefore, this action will have a direct impact on the protection of an important sustainable resource for the indigenous and Afro-Caribbean coastal communities of Panama. In addition, it could maintain the connectivity of migratory routes for oceanic and marine-coastal species in the area that extends along the Caribbean coasts of Jamaica, Colombia, Honduras, Nicaragua, Costa Rica, and Panama.

“With the protection of more than half of its seas, including extensive ocean reserves on both sides of the isthmus, Panama is not only ensuring the conservation of its marine biodiversity and the livelihoods of the people who depend on these ecosystems in the long-term, but is also positioned to lead a much more ambitious regional effort,” said STRI marine biologist and MigraMar co-founder Héctor Guzmán.

STRI has provided the Panamanian government with the scientific basis for the creation of new marine protected areas for almost two decades, starting with Coiba National Park in 2004 and followed by the Las Perlas archipelago in 2007, both in coastal areas of the Pacific Ocean. In 2015, STRI led the design and scientific justification for the creation of the country’s first two oceanic marine protected areas: Banco Volcán in the Caribbean and Cordillera de Coiba in the Pacific, helping Panama to protect 13% of its oceans. With this, the country surpassed the international Aichi target for biological diversity.

A few years later, in 2021, STRI once again supported the Panamanian government with the scientific data for the expansion of the Cordillera de Coiba marine protected area. With this action, Panama achieved a total of 37,926 square miles of marine protected areas throughout the country and met the goal of the United Nations 30x30 Initiative to protect at least 30% of its marine surface by the year 2030.

“The expansion of Banco Volcán is an essential first step for large-scale regional protection of marine biodiversity,” said STRI Director Joshua Tewksbury. “A lot more science by us and by others will be required to ensure that we do the monitoring of this massive area and ensure that specific policy interventions actually create the sustainable ecosystems we all want.”

The Smithsonian Tropical Research Institute, headquartered in Panama City, Panama, is part of the Smithsonian Institution. The Institute furthers the understanding of tropical nature and its importance to human welfare, trains students to conduct research in the tropics and promotes conservation by increasing public awareness of the beauty and importance of tropical ecosystems.      

Map of Panama's extended Marine Protected Areas

                

Experiencing racism increases Black women’s heart disease risk, BU research finds

New study shows perceived racism in employment, housing, and interactions with the police is associated with a 26% higher risk of coronary heart disease.

Reports and Proceedings

BOSTON UNIVERSITY

BU epidemiologist Shanshan Sheehy 

IMAGE: BU EPIDEMIOLOGIST SHANSHAN SHEEHY SAYS THE STUDY PROVIDES THE “FIRST LONGITUDINAL EVIDENCE THAT PERCEIVED RACISM IS ASSOCIATED WITH INCREASED RISK OF CORONARY HEART DISEASE.”

CREDIT: PHOTO BY DAVID KEOUGH, CHOBANIAN & AVEDISIAN SCHOOL OF MEDICINE.

More than half of Black women in America aged 20 and older have cardiovascular diseases, according to the American Heart Association, and every year, 50,000 will die as a result. Some researchers have tied Black women’s increased risk of heart disease to genetics, others to higher rates of obesity and diabetes. A new Boston University-led study points to another key factor: experiences of racism.

A team of researchers who followed more than 48,000 Black women over 22 years found those who reported experiencing interpersonal racism in employment, housing, and in interactions with the police had a 26 percent higher risk of coronary heart disease than those who did not. The women were participants in BU’s Black Women’s Health Study, a more than 25-year effort to track the health of 59,000 women in the United States.

“This is the first longitudinal evidence that perceived racism is associated with increased risk of coronary heart disease,” says Shanshan Sheehy, a BU Chobanian & Avedisian School of Medicine assistant professor. “Racism has a real impact on the heart health of Black women.” Sheehy presented the findings at the American Heart Association’s Epidemiology, Prevention, Lifestyle & Cardiometabolic Health Scientific Sessions 2023.

The research kicked off in 1997, when participants—then with a mean age of 40.5—answered a series of questions about their experiences of racism. The first set of questions aimed to get at instances of perceived discrimination and unfair treatment while job hunting or at work, when trying to rent or buy a home, or during a law enforcement stop or search. Another set looked at experiences of interpersonal racism in everyday life: whether the women felt they’d received poorer restaurant service, been looked down upon, or treated as unintelligent, dishonest, or as a threat. Throughout the next 22 years, researchers tracked the women’s wellbeing with biennial mailed and online health questionnaires. All started the study with apparently healthy hearts; by 2019, 1,947 had developed coronary heart disease.
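
A “26 percent higher risk” in a longitudinal study of this kind corresponds to a hazard ratio of about 1.26 from a survival (time-to-event) model. As a minimal sketch, assuming a Cox proportional-hazards analysis (the study’s exact modeling details are not spelled out here) and using the lifelines library on made-up data:

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Made-up cohort: exposure flag, years of follow-up, and CHD outcome.
rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "racism_exposure": rng.integers(0, 2, n),   # 1 = reported experiences
    "followup_years": rng.uniform(1.0, 22.0, n),
    "chd_event": rng.integers(0, 2, n),         # 1 = developed heart disease
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="chd_event")
# exp(coefficient) for racism_exposure is the hazard ratio; a value of
# 1.26 would correspond to a 26 percent higher rate of disease.
print(cph.hazard_ratios_)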

Although the research team discovered an association between a higher probability of heart disease and self-reported experiences of racism in employment, housing, and interactions with the police, they found that racism in everyday life—at a store, in a restaurant—was not linked with an increased risk. Sheehy suspects that’s because while the different types of racism are pernicious and damaging, their relative consequences are varied. Someone discriminated against in a store, she says, may be able to draw on coping mechanisms—like talking with a friend—but missing out on a promotion or a mortgage is much harder to tune out.

“When we think about how racism impacts our health, it’s a psychosocial stressor,” says Sheehy, who’s also affiliated with the BU Slone Epidemiology Center. “It increases your blood pressure, your level of inflammation—all of these biological mechanisms increase your risk of coronary heart disease.”

In past papers, Black Women’s Health Study researchers have also shown a connection between perceived experiences of racism and obesity, reduced cognitive function, insomnia, preterm birth, and many other afflictions. Sheehy and her colleagues say one next step for the coronary heart disease research is to take a deeper dive into the impact of structural racism.

“Structural racism is real—on the job, in educational circumstances, and in interactions with the criminal justice system,” says coauthor Michelle A. Albert, American Heart Association president and a University of California at San Francisco professor of medicine. “Now, we have hard data linking it to cardiovascular outcomes, which means that we as a society need to work on the things that create the barriers that perpetuate structural racism.”

 

This research was funded by the National Institutes of Health. Other study coauthors are Julie R. Palmer, BU Slone Epidemiology Center director and Karin Grunebaum Cancer Research Professor, Medicine; Yvette C. Cozier, a BU School of Public Health associate professor of epidemiology; Lynn Rosenberg, an SPH professor of epidemiology; and Max Brock, a Cook County Health cardiologist.