Tuesday, October 01, 2024

 70 YEARS OF SCI-FI-TEK

A new and unique fusion reactor comes together with PPPL's contributions



First plasma is coming soon to the University of Seville’s compact spherical tokamak called SMART



DOE/Princeton Plasma Physics Laboratory

Video: Plasma testing in SMART. A microwave-heated glow discharge runs in SMART as a test of the tokamak. Credit: University of Seville




Like atoms coming together to release their power, fusion researchers worldwide are joining forces to solve the world’s energy crisis. Harnessing the power of fusing plasma as a reliable energy source for the power grid is no easy task, requiring global contributions.

The Princeton Plasma Physics Laboratory (PPPL) — a U.S. national laboratory funded by the Department of Energy (DOE) — is leading several efforts on this front, including collaborating on the design and development of a new fusion device at the University of Seville in Spain. The SMall Aspect Ratio Tokamak (SMART) strongly benefits from PPPL computer codes as well as the Lab’s expertise in magnetics and sensor systems.  

“The SMART project is a great example of us all working together to solve the challenges presented by fusion and teaching the next generation what we have already learned,” said Jack Berkery, PPPL’s deputy director of research for the National Spherical Torus Experiment-Upgrade (NSTX-U) and principal investigator for the PPPL collaboration with SMART. “We have to all do this together or it’s not going to happen.”

Manuel Garcia-Munoz and Eleonora Viezzer, both professors at the Department of Atomic, Molecular and Nuclear Physics of the University of Seville as well as co-leaders of the Plasma Science and Fusion Technology Lab and the SMART tokamak project, said PPPL seemed like the ideal partner for their first tokamak experiment. The next step was deciding what kind of tokamak they should build. “It needed to be one that a university could afford but also one that could make a unique contribution to the fusion landscape at the university scale,” said Garcia-Munoz. “The idea was to put together technologies that were already established: a spherical tokamak and negative triangularity, making SMART the first of its kind. It turns out it was a fantastic idea.” 

SMART should offer easy-to-manage fusion plasma

Triangularity refers to the shape of the plasma relative to the tokamak. The cross section of the plasma in a tokamak is typically shaped like the capital letter D. When the straight part of the D faces the center of the tokamak, it is said to have positive triangularity. When the curved part of the plasma faces the center, the plasma has negative triangularity. 
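
To make the geometry concrete, here is a minimal Python sketch (illustrative values only, not SMART's actual parameters) using a simplified Miller-type boundary parametrization, in which the sign of the triangularity parameter delta flips the D between positive and negative triangularity:

```python
import numpy as np
import matplotlib.pyplot as plt

# Simplified Miller-type plasma boundary: R0 = major radius, a = minor radius,
# kappa = elongation, delta = triangularity. The values are illustrative, not
# SMART's actual design parameters. A negative delta flips the D shape.
def boundary(R0=0.4, a=0.25, kappa=1.8, delta=0.4, n=400):
    theta = np.linspace(0.0, 2.0 * np.pi, n)
    R = R0 + a * np.cos(theta + delta * np.sin(theta))  # radial coordinate
    Z = kappa * a * np.sin(theta)                        # vertical coordinate
    return R, Z

fig, ax = plt.subplots()
for d, label in [(0.4, "positive triangularity"), (-0.4, "negative triangularity")]:
    R, Z = boundary(delta=d)
    ax.plot(R, Z, label=label)
ax.set_xlabel("R [m]")
ax.set_ylabel("Z [m]")
ax.set_aspect("equal")
ax.legend()
plt.show()
```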

Garcia-Munoz said negative triangularity should offer enhanced performance because it can suppress instabilities that expel particles and energy from the plasma, preventing damage to the tokamak wall. “It’s a potential game changer with attractive fusion performance and power handling for future compact fusion reactors,” he said. “Negative triangularity has a lower level of fluctuations inside the plasma, but it also has a larger divertor area to distribute the heat exhaust.”

The spherical shape of SMART should make it better at confining the plasma than it would be if it were doughnut shaped. Shape matters significantly for plasma confinement: that is why NSTX-U, PPPL’s main fusion experiment, is shaped more like a cored apple than the doughnut of a conventional tokamak. SMART will be the first spherical tokamak to fully explore the potential of a particular plasma shape known as negative triangularity.

PPPL’s expertise in computer codes proves essential

PPPL has a long history of leadership in spherical tokamak research. The University of Seville fusion team first contacted PPPL to implement SMART in TRANSP, a simulation software developed and maintained by the Lab. Dozens of facilities use TRANSP, including private ventures such as Tokamak Energy in England. 

“PPPL is a world leader in many, many areas, including fusion simulation; TRANSP is a great example of their success,” said Garcia-Munoz.  

Mario Podesta, formerly of PPPL, was integral to helping the University of Seville determine the configuration of the neutral beams used for heating the plasma. That work culminated in a paper published in the journal Plasma Physics and Controlled Fusion.

Stanley Kaye, director of research for NSTX-U, is now working with Diego Jose Cruz-Zabala, EUROfusion Bernard Bigot Researcher Fellow, from the SMART team, using TRANSP “to determine the shaping coil currents necessary for attaining their design plasma shapes of positive triangularity and negative triangularity at different phases of operation.” The first phase, Kaye said, will involve a “very basic” plasma. Phase two will have neutral beams heating the plasma.

Separately, Berkery, former undergraduate intern John Labbate, now a graduate student at Columbia University, and former University of Seville graduate student Jesús Domínguez-Palacios, who has since moved to an American company, used other computer codes to assess the stability of future SMART plasmas. A new paper in Nuclear Fusion by Domínguez-Palacios discusses this work.

Designing diagnostics for the long haul

The collaboration between SMART and PPPL also extends into one of the Lab’s core areas of expertise: diagnostics, the instruments with sensors used to assess the plasma. Several such diagnostics are being designed by PPPL researchers. PPPL physicists Manjit Kaur and Ahmed Diallo, together with Viezzer, are leading the design of SMART’s Thomson scattering diagnostic, for example. This diagnostic will precisely measure the plasma’s electron temperature and density during fusion reactions, as detailed in a new paper published in the journal Review of Scientific Instruments. These measurements will be complemented by ion temperature, rotation and density measurements from a charge exchange recombination spectroscopy suite developed by Alfonso Rodriguez-Gonzalez, a graduate student at the University of Seville, together with Cruz-Zabala and Viezzer.
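
As a rough illustration of the physics behind a Thomson scattering measurement (a textbook relation, not the actual SMART design), the Doppler-broadened width of the scattered laser spectrum can be inverted for the electron temperature; the 1064 nm wavelength and 90-degree scattering angle below are assumptions for the example:

```python
import numpy as np

# Textbook relation for incoherent (non-relativistic) Thomson scattering:
# the scattered spectrum is roughly Gaussian with 1/e half-width
#   d_lambda = (2 * lambda0 * sin(theta/2) / c) * sqrt(2 * kB * Te / me)
# so a measured spectral width can be inverted for the electron temperature.
c = 2.998e8          # speed of light, m/s
me = 9.109e-31       # electron mass, kg
eV = 1.602e-19       # joules per electronvolt

def te_from_width(d_lambda_m, lambda0=1064e-9, theta_deg=90.0):
    """Electron temperature (eV) from the 1/e spectral half-width (m)."""
    s = 2.0 * np.sin(np.radians(theta_deg) / 2.0)
    v = d_lambda_m * c / (s * lambda0)      # thermal speed sqrt(2 kB Te / me)
    return 0.5 * me * v**2 / eV             # Te in eV

print(te_from_width(30e-9))  # a ~30 nm width corresponds to roughly 100 eV
```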

“These diagnostics can run for decades, so when we design the system, we keep that in mind,” said Kaur. When developing the designs, it was important that the diagnostic be able to handle the temperature ranges SMART might achieve in the coming decades, not just the initial, low values, she said.

Kaur designed the Thomson scattering diagnostic from the start of the project, selecting and procuring its different subparts, including the laser she felt best fit the job. She was thrilled to see how well the laser tests went when Gonzalo Jimenez and Viezzer sent her photos from Spain. The test involved setting up the laser on a bench and shooting it at a piece of special parchment that the researchers call “burn paper.” If the laser is designed just right, the burn marks will be circular with relatively smooth edges. “The initial laser test results were just gorgeous,” she said. “Now, we eagerly await receiving other parts to get the diagnostic up and running.”

James Clark, a PPPL research engineer whose doctoral thesis focused on Thomson scattering systems, was later brought on to work with Kaur. “I’ve been designing the laser path and related optics,” Clark explained. In addition to working on the engineering side of the project, Clark has also helped with logistics, deciding how and when things should be delivered, installed and calibrated.

PPPL’s Head of Advanced Projects Luis Delgado-Aparicio, together with Marie Skłodowska-Curie fellow Joaquin Galdon-Quiroga and University of Seville graduate student Jesus Salas-Barcenas, are leading efforts to add two other kinds of diagnostics to SMART: a multi-energy, soft X-ray (ME-SXR) diagnostic and spectrometers. The ME-SXR will also measure the plasma’s electron temperature and density but using a different approach than the Thomson scattering system. The ME-SXR will use sets of small electronic components called diodes to measure X-rays. Combined, the Thomson scattering diagnostic and the ME-SXR will comprehensively analyze the plasma’s electron temperature and density. 

By looking at the different frequencies of light inside the tokamak, the spectrometers can provide information about impurities in the plasma, such as oxygen, carbon and nitrogen. “We are using off-the-shelf spectrometers and designing some tools to put them in the machine, incorporating some fiber optics,” Delgado-Aparicio said. Another new paper published in the Review of Scientific Instruments discusses the design of this diagnostic.

PPPL research physicist Stefano Munaretto worked on the magnetic diagnostic system for SMART, with the field work led by University of Seville graduate student Fernando Puentes del Pozo. “The diagnostic itself is pretty simple,” said Munaretto. “It’s just a wire wound around something. Most of the work involves optimizing the sensor’s geometry by getting its size, shape and length correct, selecting where it should be located and all the signal conditioning and data analysis involved after that.” The design of SMART’s magnetics is detailed in a new paper.
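
That “wire wound around something” is a pickup coil: the induced voltage is proportional to the rate of change of the magnetic field through the coil, so the field is recovered by integrating the digitized signal. Below is a minimal sketch of that post-processing step; the coil turns, area, sampling rate and synthetic waveform are made-up illustration values, not SMART's.

```python
import numpy as np

# A pickup coil produces V(t) = -N * A * dB/dt, so B(t) is recovered by
# numerically integrating the digitized voltage. N, A, the sampling rate and
# the synthetic field ramp are made-up illustration values, not SMART's.
N = 100          # number of turns
A = 1.0e-4       # coil area, m^2
fs = 1.0e6       # sampling rate, Hz

t = np.arange(0, 0.05, 1.0 / fs)            # 50 ms record
B_true = 0.3 * (1.0 - np.exp(-t / 0.01))    # ramping field, tesla
V = -N * A * np.gradient(B_true, t)         # what the digitizer would see
V += np.random.normal(0.0, 1e-6, t.size)    # add measurement noise

# Integrate back to the field (in practice an offset/baseline correction
# and a calibrated effective N*A would be applied first).
B_rec = -np.cumsum(V) / (N * A * fs)

print(f"final field: true {B_true[-1]:.3f} T, reconstructed {B_rec[-1]:.3f} T")
```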

Munaretto said working on SMART has been very fulfilling, with much of the team working on the magnetic diagnostics made up of young students with little previous experience in the field. “They are eager to learn, and they work a lot. I definitely see a bright future for them.”

Delgado-Aparicio agreed. “I enjoyed quite a lot working with Manuel Garcia-Munoz, Eleonora Viezzer and all of the other very seasoned scientists and professors at the University of Seville, but what I enjoyed most was working with the very vibrant pool of students they have there,” he said. “They are brilliant and have helped me quite a bit in understanding the challenges that we have and how to move forward toward obtaining first plasmas.”

Researchers at the University of Seville have already run a test in the tokamak, displaying the pink glow of argon when heated with microwaves. This process helps prepare the tokamak’s inner walls for a far denser plasma contained at a higher pressure. While that pink glow technically comes from a plasma, it is at such a low pressure that the researchers don’t consider it their real first tokamak plasma. Garcia-Munoz says that will likely happen in the fall of 2024.

Support for this research comes from the DOE under contract number DE-AC02-09CH11466, European Research Council Grant Agreements 101142810 and 805162, the Euratom Research and Training Programme Grant Agreement 101052200 — EUROfusion, and the Junta de Andalucía Ayuda a Infraestructuras y Equipamiento de I+D+i IE17-5670 and Proyectos I+D+i FEDER Andalucía 2014-2020, US-15570.





________________________________________________________________________________________


PPPL is mastering the art of using plasma — the fourth state of matter — to solve some of the world's toughest science and technology challenges. Nestled on Princeton University’s Forrestal Campus in Plainsboro, New Jersey, our research ignites innovation in a range of applications including fusion energy, nanoscale fabrication, quantum materials and devices, and sustainability science. The University manages the Laboratory for the U.S. Department of Energy’s Office of Science, which is the nation’s single largest supporter of basic research in the physical sciences. Feel the heat at https://energy.gov/science and https://www.pppl.gov.  

 

The Plasma Science and Fusion Technology Lab of the University of Seville hosts the SMall Aspect Ratio Tokamak and leads several worldwide efforts on energetic particles and plasma stability towards the development of magnetically confined fusion energy. www.psft.eu

 

NCSA, Google work together in Alaska as part of Permafrost Discovery Gateway




National Center for Supercomputing Applications





Earlier this summer, members of the National Center for Supercomputing Applications traveled to Alaska as part of their continued work with the Permafrost Discovery Gateway, a project led by the Woodwell Climate Research Center using artificial intelligence (AI) and machine learning (ML) in tracking Arctic permafrost thaw.

NCSA’s Associate Director for Software Kenton McHenry and Research Software Engineer Todd Nicholson visited Fairbanks, Alaska, along with 12 Google.org fellows to see firsthand the melting permafrost and its impacts on those who live there.

“I have seen the artifacts of melting permafrost for years through the satellite images we work with, yet I was shocked to see things firsthand,” said McHenry. “What should be flat terrain was filled with 6-feet-deep ravines, the borders of these ice-wedge polygons. The trees at 45 or greater degree angles. The sinkholes in the University of Fairbanks parking lot.”

“Seeing the permafrost research tunnel was a great experience. It shows just how significant permafrost is to the Arctic, how much it affects the land and the environment. Hearing from people about how permafrost thaw affects them in their day-to-day life lets you know how important the topic is,” said Todd Nicholson, NCSA research software engineer.

Funded by a $5 million grant from Google.org, the Permafrost Discovery Gateway, with further Google.org fellowship support, is developing and expanding a new, open-access resource that will use satellite data and AI technology to make it possible to track Arctic permafrost thaw regularly for the first time ever. This potentially game-changing resource for climate science will utilize advances in AI/ML technology to streamline the data analysis process and make it easier to rapidly identify patterns and trends in permafrost thaw datasets that will be essential to informing climate mitigation and adaptation strategies for city planners.

“We are excited to be working with Google.org to improve and extend the tools and data pipelines initially developed for the Permafrost Discovery Gateway to new use cases,” said NCSA Lead Research Software Engineer Luigi Marini after the award was announced. “Closing the time gap between remote sensing data products becoming available and permafrost data products being published, such as the pan-Arctic sub-meter scale ice-wedge polygon dataset developed by Chandi Witharana and team, will hopefully help scientists and stakeholders better understand permafrost thawing at the pan-Arctic scale. We also hope to generalize some of the technologies and tools being developed so that more scientists can leverage this work to develop new permafrost-related data pipelines.”

But NCSA’s trip didn’t just center around software and science. The Arctic adventure included underground tours, an ice hotel, team activities and more.

“On our way to the permafrost cave we stopped by to see a portion of the Trans-Alaska oil pipeline,” McHenry said. “A few of us were amazed that it was actually built on top of permafrost, and to prevent its foundation from melting, it had thousands of passive refrigeration units along it to pump the winter cold into the ground to help prevent the permafrost from thawing in the summer.”

During the same period, the project hosted 12 Google software engineers who assisted in the Arctic research. Through a program partnering with research projects like the Permafrost Discovery Gateway, Google staff apply for opportunities to work on scientific research as a change of pace from their normal work.

“Several of the fellows told us how much they really enjoyed this experience working within science and would like more opportunities to do so,” McHenry said.


ABOUT NCSA

The National Center for Supercomputing Applications at the University of Illinois Urbana-Champaign provides supercomputing, expertise and advanced digital resources for the nation’s science enterprise. At NCSA, University of Illinois faculty, staff, students and collaborators from around the globe use innovative resources to address research challenges for the benefit of science and society. NCSA has been assisting many of the world’s industry giants for over 35 years by bringing industry, researchers and students together to solve grand challenges at rapid speed and scale.

Monday, September 30, 2024

 

More CO₂ in the atmosphere during El Niño



New study analyses how the carbon cycle responds to temperatures in the tropics



Universität Leipzig




A recent study challenges previous assumptions about the connection between atmospheric CO₂ and temperatures in the tropics. Over the period from 1959 to 2011, the atmospheric CO₂ growth rate came to respond roughly twice as strongly to tropical temperatures as it had in earlier decades. This has often been attributed to increasing droughts in the tropics and to changes in carbon cycle responses caused by climate change. However, the current study conducted by researchers from the Max Planck Institute for Biogeochemistry and Leipzig University suggests that a small number of particularly strong El Niño events could be responsible for this.

Both tropical and non-tropical ecosystems absorb large amounts of carbon that were previously released into the atmosphere through human CO₂ emissions. Globally, land surface ecosystems act as a carbon sink and absorb on average around a third of human CO₂ emissions. These ecosystems are therefore a natural buffer for climate change. In the 1980s and 1990s, however, researchers observed an increased fluctuation in global carbon storage on land, and it appeared that the CO₂ growth rate was particularly sensitive to temperatures in the tropics. In a recent study, researchers from Jena and Leipzig found that this “doubling” of sensitivity was caused by the increased occurrence of El Niño events in the 1980s and 1990s compared to 1960–1979. This also includes the extreme El Niño events of 1982/83 and 1997/98. El Niño events cause severe droughts and heat waves in the tropics, which affect plant growth and thus reduce carbon uptake. In times of El Niño, vegetation even releases large amounts of carbon that would otherwise be sequestered in the soil or forests. This causes the CO₂ content in the atmosphere to increase. 
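
The “sensitivity” at issue here is essentially the regression slope of the annual CO₂ growth rate against tropical temperature anomalies, estimated separately for different periods. The following sketch illustrates that calculation on synthetic data in which the slope is deliberately doubled after 1980; it is only an illustration of the method, not a reproduction of the study's analysis:

```python
import numpy as np

# Synthetic illustration of the sensitivity estimate: the slope of the annual
# CO2 growth rate (ppm/yr) regressed on tropical temperature anomalies (K),
# computed separately before and after 1980. The data are invented, with the
# slope built in to double after 1980; the study uses observed records.
rng = np.random.default_rng(0)
years = np.arange(1959, 2012)
t_anom = rng.normal(0.0, 0.3, years.size)              # tropical T anomaly, K
gamma_true = np.where(years < 1980, 2.0, 4.0)          # ppm/yr per K
co2_growth = 1.5 + gamma_true * t_anom + rng.normal(0.0, 0.3, years.size)

def sensitivity(mask):
    """Least-squares slope of CO2 growth rate vs temperature anomaly."""
    slope, _intercept = np.polyfit(t_anom[mask], co2_growth[mask], 1)
    return slope

print("1959-1979:", round(sensitivity(years < 1980), 2), "ppm/yr per K")
print("1980-2011:", round(sensitivity(years >= 1980), 2), "ppm/yr per K")
```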

Internal climate variability as the main factor for changes in the carbon cycle
The authors of the study emphasise that this CO₂ increase is due to internal climate variability rather than a systematic change in the carbon cycle caused by climate change. “Our results show that this doubling of sensitivity is not necessarily a sign of a fundamental change in the response of the carbon cycle to climate change,” says Na Li from the Max Planck Institute for Biogeochemistry, first author of the study. Instead, it is caused by the combination of extreme El Niño events and their global impact.

“Slow-in, fast-out”: dynamics of the carbon cycle during extreme weather events 
“Through our work, we were also able to show that this phenomenon is related to the ‘slow-in, fast-out’ dynamics of the carbon cycle. This means that carbon is only slowly absorbed by ecosystems, but can be suddenly and quickly released again during extreme weather events such as strong El Niños,” explains Professor Ana Bastos from Leipzig University, senior author of the study.

New findings on reducing uncertainties in climate projections
The results of this study are important because they highlight uncertainties in future climate projections. Until now, it had been assumed that a heightened sensitivity of the CO₂ increase to temperatures in the tropics is caused by long-term climate-related changes in the carbon cycle, and thus in the global climate system. However, the study shows that extreme events can cause short-term fluctuations that do not necessarily indicate permanent changes in the carbon cycle. “These new findings could help to develop more precise climate models and reduce the uncertainties in predicting future climate scenarios,” says junior professor Dr Sebastian Sippel from Leipzig University. He also stated that we need to better understand how extreme climate phenomena such as El Niño affect carbon dynamics in order to make more reliable forecasts for the future.

Research is an important building block in the Cluster of Excellence
Professor Ana Bastos and junior professor Dr Sebastian Sippel as well as Professor Miguel Mahecha and Professor Markus Reichstein are also part of Leipzig University’s Cluster of Excellence Breathing Nature. In this context, researchers are investigating the complex links between climate change and biodiversity, and how they interact with human activity. They use innovative methods to understand patterns and dynamics of ecosystems and the atmosphere.

 

Unlocking ocean power: $3.6 million for community-centric wave energy converters



Wave energy could power millions of homes, but to make a splash in the industry, the tech must balance engineering, socio-economic and environmental trade-offs



University of Michigan

 


 


Coastal communities are partnering with a multidisciplinary research team to determine the best way to harvest wave energy at Beaver Island, Michigan, and Nags Head, North Carolina.

 

The project is led by the University of Michigan, supported with $3.6 million from the National Science Foundation. It brings together researchers from five different institutions to help provide renewable energy that addresses the needs and concerns of coastal and island communities and identifies paths to make wave energy technology competitive with solar and wind power.

 

Waves are a vast source of untapped renewable energy. They could completely cover Alaska and Hawaii's electricity needs and generate enough power along mainland U.S. coasts to keep the lights on in 130 million homes, or meet 35% of the country's electricity demand, without any direct greenhouse gas emissions, according to the National Renewable Energy Laboratory.
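
Those two figures are mutually consistent. Using rough public estimates (about 4,000 TWh of annual U.S. electricity use and roughly 10.8 MWh per household per year, both assumptions here rather than numbers from the article), 35% of national demand works out to on the order of 130 million homes:

```python
# Rough consistency check of the two NREL-derived figures quoted above.
# The US totals used here are approximate public estimates, not from the article.
us_annual_demand_twh = 4000          # approx. total US electricity use, TWh/yr
per_home_mwh = 10.8                  # approx. average household use, MWh/yr

wave_share = 0.35 * us_annual_demand_twh            # 35% of demand, TWh/yr
homes = wave_share * 1e6 / per_home_mwh             # TWh -> MWh, then homes
print(f"{wave_share:.0f} TWh/yr ~ {homes/1e6:.0f} million homes")
```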

 

Despite its promise, wave energy still can't compete with wind and solar energy in today's market because engineers haven't settled on the best way to harvest it or assess the technology. 

 

"Everybody knows what a wind turbine looks like because the research community has rallied behind a single idea," said Jeff Scruggs, U-M professor of civil and environmental engineering and the project's co-principal investigator. "For wave energy converters, that's not the case. When you look at the devices that companies are deploying for their trials, they are nothing like each other. Everybody's got their own idea about the best way to harvest wave energy."

 

Companies and laboratories have tested a wide host of concepts: bobbing buoys, submersed devices, hinged rafts, paddles and more. Each comes with its own pros and cons. A device optimized for generating energy could be more prone to damage from storms or have more environmental risks than other devices, and there are no guidelines to determine which trade-offs are acceptable or economical. 

 

"We need to develop a method to holistically assess wave energy devices, and that's something that can't be done by one person with one area of expertise working individually," said Lei Zuo, the Herbert C. Sadler Collegiate Professor of Engineering at U-M, a professor of naval architecture and marine engineering and the project's lead principal investigator.

 

The team will develop that assessment framework by getting community input from the start. For Beaver Island, wave energy might be a pathway to increased energy security and independence from expensive diesel for the island's back-up generators. At Nags Head, wave power could provide emergency electricity after catastrophic events—like hurricanes—or power devices that remove salt from seawater during emergency scenarios when freshwater may have been compromised. But the researchers need the communities' feedback to decide the best approach at each location.

 

"As researchers, we often think that communities are only recipients of our research. But coastal communities often know more about what is happening locally on the coast and about what is likely to work for their communities," said Eric Wade, assistant professor of coastal studies at East Carolina University who will assess the project's sociological impacts and co-lead community engagement.

 

Without that community-centric design, renewable energy projects are likely to face terminal pushback. In one recent example, an offshore wind energy project off the coast of Cleveland, Ohio, ended due to challenges from residents and environmental groups who feared damage to wildlife and treasured views.

 

"All of the research to date indicates that wave energy isn't likely a high risk to marine life compared to climate change, but because we don't have enough deployments to really know what the risks are, it’s hard to make that case to regulators," said Lindsay Dubbs, research associate professor at the University of North Carolina's Institute for the Environment, who will lead the project's environmental risk assessments. "That has certainly prevented some marine energy devices from being permitted and deployed in a timely manner."

 

Additional team members include: Danesh Tafti, professor of mechanical engineering at Virginia Tech; Daniel Deng, U-M adjunct professor of naval architecture and marine engineering and laboratory fellow at the Pacific Northwest National Laboratory; Gail Gruenwald, a member of the Beaver Island Association; and Bill Staby, founder of Blue Water Network LLC.

 

Zuo is also a professor of mechanical engineering, and Scruggs is also a professor of electrical and computer engineering. Wade and Dubbs are both affiliated with the Coastal Studies Institute.

 

 

Model projects energy storage needs for fossil fuel-free energy system





North Carolina State University





Researchers have developed a model that can be used to project what a nation’s energy storage needs would be if it were to shift entirely to renewable energy sources, moving away from fossil fuels for electric power generation. The model offers policymakers critical information for use when making near-term decisions and engaging in long-term energy system planning.

“We focused this study on Italy’s energy system because it has suffered significantly in recent years due to difficulties obtaining affordable natural gas following Russia’s invasion of Ukraine,” says Anderson de Queiroz, co-author of a paper on the work and an associate professor of civil, construction and environmental engineering at North Carolina State University. “That has raised questions about how Italy can make its energy system more robust. Our goal here was to develop a model that would allow us to determine what Italy’s energy storage needs would be if it moved completely away from fossil fuels and met its electricity demands with renewable resources.”

Energy storage is a critical piece of this puzzle because renewable energy sources, such as solar or wind, don’t produce energy at the same rate all the time. For example, you need to be able to store energy generated by solar power so that you can use that energy at night, when the sun is not shining.

To better understand an energy system’s energy storage needs, the researchers modified an existing optimization model called Temoa.

Specifically, the researchers modified the model to account for how renewable energy production would change during different times of day and different times of the year. For example, there would be greater solar power production during summer when days are longer, but solar power would still drop overnight. The researchers also accounted for changes in energy consumption at different times of day and during different seasons. For example, energy consumption may go up during hot summer afternoons if people are using air conditioners.

Capturing these daily and seasonal fluctuations in renewable energy production and energy consumption allowed the researchers to create a more detailed model of the energy system, which allowed them to better answer questions about the system’s energy storage needs. How much renewable energy could be redirected to storage? How much energy storage would be needed to meet demands?
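
As a toy illustration of this kind of time-sliced accounting (this is not the Temoa model, and all numbers are invented), the sketch below builds simple seasonal solar and demand profiles by hour and tracks the storage state of charge needed to absorb the mismatch:

```python
import numpy as np

# Toy hourly balance in the spirit of the seasonal/daily profiles described
# above. This is NOT the Temoa model; every number here is invented.
hours = np.arange(24)

def solar_profile(season):
    """Normalized solar output by hour; longer, stronger days in summer."""
    daylight = 14 if season == "summer" else 9
    peak = 1.0 if season == "summer" else 0.6
    out = peak * np.sin(np.pi * (hours - (12 - daylight / 2)) / daylight)
    return np.clip(out, 0.0, None)

def demand_profile(season):
    """Normalized demand, with an afternoon air-conditioning bump in summer."""
    base = 0.6 + 0.2 * np.sin(np.pi * (hours - 7) / 24)
    if season == "summer":
        base = base + 0.3 * np.exp(-((hours - 16) ** 2) / 8.0)
    return base

for season in ("summer", "winter"):
    surplus = solar_profile(season) - demand_profile(season)
    soc, peak_soc, unmet = 0.0, 0.0, 0.0   # state of charge, in demand-hours
    for s in surplus:
        soc += s                           # charge on surplus, discharge on deficit
        if soc < 0.0:                      # storage empty: remaining deficit is unmet
            unmet += -soc
            soc = 0.0
        peak_soc = max(peak_soc, soc)
    print(f"{season}: peak storage ~{peak_soc:.2f} demand-hours, unmet {unmet:.2f}")
```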

“Our modified model makes clear that increasing energy storage capacity is critical for decarbonizing Italy’s power sector, but it also offers some detailed insights,” de Queiroz says. “For example, the model suggests that Italy needs to be able to store about 10% of its electricity generation in short-term energy storage devices.”

The term “short-term energy storage” is somewhat confusing. It does not refer to how long a storage device can store energy. Rather, it refers to how long the device can sustain its maximum power output. For example, a one-hour 2-kilowatt device could release two kilowatts of power for one hour, whereas a three-hour 2-kilowatt device could release two kilowatts of power for three hours. Energy storage systems that can release the maximum power output for four hours or less are typically considered short-term energy storage devices.
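
In other words, the duration rating fixes a device's energy capacity for a given power rating, since energy capacity equals power times duration, as a quick check with the article's numbers shows:

```python
# A storage device's energy capacity is its power rating times its duration.
def energy_capacity_kwh(power_kw, duration_h):
    return power_kw * duration_h

print(energy_capacity_kwh(2, 1))   # one-hour 2-kilowatt device   -> 2 kWh
print(energy_capacity_kwh(2, 3))   # three-hour 2-kilowatt device -> 6 kWh
```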

“Our projection related to short-term energy storage devices is driven both by the energy system’s energy storage needs and the fact that these devices are the most cost-effective way to meet those needs, based on recent cost projections and estimates,” de Queiroz says.

But while this paper focuses on Italy, the modified model the researchers developed for this work can be used to project energy storage needs for any energy system.

“As the world moves toward renewable power sources, we need to find ways to account for their variability,” says de Queiroz. “Energy storage devices give us the flexibility to adjust to fluctuations in energy production while also giving us the reliability we need to meet energy demands. And models like the one we’ve demonstrated here provide critical insights for policymakers regarding their long-term energy storage needs.”

The paper, “Modeling energy storage in long-term capacity expansion energy planning: an analysis of the Italian system,” is published open access in the Journal of Energy Storage. First author of the paper is Matteo Nicoli, a Ph.D. student at Politecnico di Torino. The paper was co-authored by Victor Faria, a recent Ph.D. graduate from NC State; and by Laura Savoldi of the Politecnico di Torino.

 

Study: Conflicting goals, focus on economic development lead to underperforming streetcar systems



Streetcars meant to spur development rather than take people to highly desired spots predicted a lack of streetcar success



University of Kansas




LAWRENCE — A city’s streetcar system can be many things. But it can’t be everything. New research from the University of Kansas has found that cities with underperforming streetcar systems often get there by setting too many, sometimes conflicting, goals for what they want the transit systems to accomplish.

Joel Mendez, assistant professor of public affairs & administration at KU, conducted a study in which he analyzed streetcar systems from around the United States. He then focused on two cities with high-performing systems and two with poor-performing streetcars to see what differences caused the disparities. Results showed it is a case of placemaking vs. place taking, or focusing on a streetcar as an economic development tool versus a system that takes passengers where they want to go.

Over the past decade, more than $1 billion has been invested in streetcar systems across the nation.

“The reality is most systems are not doing great in terms of attracting passengers,” Mendez said. “This research was geared toward understanding what’s driving performance outcomes in these streetcar systems. I explored the role which goal tension plays in this situation as past research has found that transit projects can pursue as many as 25 distinct and often conflicting goals.”

Such goals commonly include improving air quality, reducing car traffic, increasing mobility for low-income residents and economic development. While all the goals have merit, Mendez said they can often contradict each other.

For the study, Mendez selected two cities with high-performing streetcar systems: Kansas City, Missouri, and Tucson, Arizona. Two underperforming cities — Atlanta and Cincinnati — were selected as well. He examined performance metrics, system policies, location characteristics and planning documents and interviewed 40 people involved in the planning and development of the four systems. Interviewees were asked what influenced decisions that shaped the streetcar systems in their respective city. Mendez found that systems that prioritized economic development in decision making tended to perform poorly. The most successful systems were in cities that emphasized system performance and placed streetcar systems in areas where people lived, worked and wanted to go for entertainment, recreational and personal reasons.

“In cities that prioritized economic development, decisions reflected that focus,” Mendez said. “For example, if you look at corridors where poor performing systems were placed, you will find twice the number of vacant parcels and properties. Such placement can maximize the economic development impact of the streetcar, but it limits its ability to serve the immediate needs of the public.”

Riders of such systems often indicated that they did not use the streetcar as it did not take them to where they wanted to go. Such placement often reflected where decision makers wanted development to go. Anticipated development, if it does come to fruition, can take time and result in the presence of empty streetcars in the meantime.

“This can sour people on the idea of streetcars and affect both political and public perceptions. Plans for expansion won’t happen if employers, workers and leaders think it’s a waste. I think it’s important for cities to focus on passenger attraction,” Mendez said.

Kansas City and Tucson, the cities with high-performing systems, focused more on place taking, or transporting people to highly desired locations, such as downtown areas with high densities of jobs, entertainment, dining and other features.

The study, published in the Journal of Planning Education and Research, also found that fare policy played a role in ridership numbers. In systems where the fares were not coordinated with other modes of transit, ridership suffered.

Portland, Oregon, installed a streetcar system about two decades ago and saw immediate success. That, coupled with an ongoing availability of federal funds for such systems, spurred many other cities to follow suit in the past decade. Other cities continue to plan streetcar systems, and Mendez said the findings could help planners, developers, policymakers and the public set goals that give such transit systems a chance for success. His larger body of research examines transit planning and policy and how it intersects with equity; future work will examine how private actors influence streetcar development and whether they do so in ways that benefit themselves over public interests.

Meanwhile, cities with high-performing streetcar systems are showing that taking people where they want to go is the surest indicator of a successful plan.

“You have to make it easy to get to if you want people to use it,” Mendez said.



Heart transplant patients from socioeconomically deprived areas face higher risk for postoperative complications, earlier death than others


The disparity persisted even when they were transplanted at high-volume, high-quality hospitals



University of California - Los Angeles Health Sciences




Heart transplant patients who live in socioeconomically disadvantaged areas are more likely to experience post-surgical complications and die within five years than patients who live in more advantaged areas, even when those patients were transplanted at topnotch high-volume hospitals, new UCLA research suggests.

The findings, to be published September 30 in the peer-reviewed Journal of Heart and Lung Transplantation, the official publication of the International Society for Heart and Lung Transplantation, suggest that a lack of access to follow-up care, likely stemming from neighborhood deprivation, is at the root of this disparity, said lead author Sara Sakowitz, MS, MPH, a medical student at the David Geffen School of Medicine at UCLA.

The paper was originally highlighted at the 2024 Society of Thoracic Surgeons National Meeting, where it was named the J. Maxwell Chamberlain Memorial Paper, representing the top paper in perioperative care.

“Our study demonstrates that access to high quality centers for cardiac transplantation does not mitigate persistent neighborhood deprivation-based disparities in patient and allograft survival,” Sakowitz said. “Rather, factors outside the immediate post-transplantation period that stem from access to longitudinal care or crucial immunosuppressive medications, appear to be implicated.”

“Altogether, this means that improving access to care is not wholly sufficient to address persistent disparities in post-transplant outcomes. We must shift our focus to addressing inequities in access to and engagement with longitudinal care, over the months and years following transplant,” she added.

The researchers examined data from the Organ Procurement and Transplantation Network (OPTN) for adults who received heart transplants between January 2005 and December 2022, with data from follow-ups through June 2023. They used a metric called the Area Deprivation Index (ADI), a composite of a neighborhood’s financial strength, economic hardship, inequality, and educational attainment to rank regions from 1 for highest socioeconomic status or “least deprived,” to 100, representing lowest socioeconomic status, or “most deprived.”
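
As a schematic of how such a ranking can be used (illustrative only, with synthetic data; the study itself used OPTN records and risk-adjusted survival models), one can bucket recipients by ADI and compare crude five-year mortality between the most deprived group and everyone else:

```python
import numpy as np
import pandas as pd

# Synthetic illustration of grouping by Area Deprivation Index (ADI):
# 1 = least deprived, 100 = most deprived. The data and the built-in mortality
# gap are invented; the study used OPTN records and risk-adjusted models.
rng = np.random.default_rng(1)
n = 38_000
adi = rng.integers(1, 101, n)
p_death_5yr = 0.20 + 0.03 * (adi > 80)      # assumed gap, for illustration only
died_5yr = rng.random(n) < p_death_5yr

df = pd.DataFrame({"adi": adi, "died_5yr": died_5yr})
df["group"] = np.where(df["adi"] > 80, "most deprived", "other")

rates = df.groupby("group")["died_5yr"].mean()
print(rates)   # crude 5-year mortality by deprivation group
```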

Death at years one, three and five post-transplant was the primary outcome the researchers measured, followed by outcomes during transplant hospitalization and organ failure at three and five years.

Of the 38,000 heart transplant recipients examined, about 20% (7,600) were from the most deprived areas.

The researchers found that people in the most deprived areas had a 14% higher risk of dying at three years and 13% higher chance of dying at five years. In addition, they faced a 14% higher risk of organ failure after three years and 13% increased risk after five years.

Even when they received care at high-quality hospitals, heart transplant recipients from the most deprived areas stood a 10% higher chance of dying at both three and five years compared with their counterparts living in less deprived areas, suggesting that treatment at a better hospital did not translate into statistically better outcomes.

Patients from deprived communities more frequently had diabetes, higher body mass index, and coronary disease compared with those from higher income areas, the researchers noted. But the disparity persisted even after adjusting for factors such as race, insurance, and comorbidities.

“Therefore, community-level socioeconomic disadvantage appears to act as a higher-level, compounding structural factor that independently shapes post-transplant outcomes,” they write.

Study limitations include a potentially insufficient granularity in OPTN data, a lack of surgeon experience information, and the possibility that the ADI scores may not completely represent patients’ socioeconomic characteristics due to how they are calculated, indicating a need for more research.

In the meantime, the researchers are further exploring the factors possibly contributing to the disparity, such as access and adherence to post-transplant medication and the impact of residential and environmental forces on health.

“Our goal is to fully characterize the complex, non-linear, and multi-faceted associations of social determinants with cardiac care and outcomes, so that we can design appropriately targeted solutions at both the local and national scale,” Sakowitz said. “To address the systemic, root causes underlying disparities in transplantation, we have to break down these large scale problems into inflection points where we can make meaningful change.”

The research was performed in the Cardiovascular Outcomes Research Laboratories (CORELAB) in the Department of Surgery under the direction of Dr. Peyman Benharash, the senior author of the report. Additional co-authors are Dr. Syed Shahyan Bakhtiyar, Dr. Saad Mallick, Amulya Vadlakonda, Dr. Nikhil Chervu, and Dr. Richard Shemin of UCLA. Bakhtiyar is also affiliated with the University of Colorado.