Tuesday, October 25, 2022

WHITE SUPREMACIST MEDICINE

Minorities face longer wait times for vital lung cancer treatment, study finds

Peer-Reviewed Publication

UNIVERSITY OF VIRGINIA HEALTH SYSTEM

IMAGE: “Our results suggest that non-white lung cancer patients have delayed time to cancer treatment compared with white patients, and this is not limited to a particular type of treatment facility," said senior researcher Rajesh Balkrishnan, PhD, of UVA Cancer Center and the University of Virginia School of Medicine’s Department of Public Health Sciences.

CREDIT: Dan Addison | UVA Communications

A new study has revealed significant racial disparities in how quickly minorities with the most common form of lung cancer receive potentially lifesaving radiation therapy compared with their white counterparts.

UVA Cancer Center researchers reviewed data from more than 222,700 patients with non-small cell lung cancer across the country. They found that the mean time to radiation initiation was 61.7 days. For white patients, it was only 60.9 days; for Black patients, it was 65.9 days; and for Asian patients, it was 71.9 days.

These differences can have real consequences, as speedy treatment initiation contributes to better outcomes. A single-week delay in treatment has been associated with a 3.2% and 1.6% increase in the risk of death for patients with stage I and stage II non-small cell lung cancer, respectively.
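To make those figures concrete, here is a minimal back-of-the-envelope sketch in Python. It is not from the study: it simply combines the reported mean times to radiation with the per-week risk figures, and it assumes the weekly increases compound multiplicatively, which is only one possible reading of the association.

```python
# Illustrative only, not from the study: translate the reported racial gaps
# in mean time to radiation into implied excess mortality risk, assuming the
# per-week increases compound multiplicatively.
MEAN_DAYS = {"White": 60.9, "Black": 65.9, "Asian": 71.9}
PER_WEEK_INCREASE = {"stage I": 0.032, "stage II": 0.016}

for race, days in MEAN_DAYS.items():
    extra_weeks = (days - MEAN_DAYS["White"]) / 7.0
    for stage, rate in PER_WEEK_INCREASE.items():
        implied = (1 + rate) ** extra_weeks - 1
        print(f"{race:6s} {stage}: +{extra_weeks:.2f} weeks of delay -> "
              f"~{implied * 100:.1f}% higher implied risk of death")
```

Under a simple linear reading, the implied excess risk is roughly proportional to the extra weeks, so the numbers barely change at delays this short.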

The troubling findings prompted the UVA researchers to call for further investigation into the underlying causes of the disparities and to urge efforts to ensure equity in treatment access.

“Our results suggest that non-white lung cancer patients have delayed time to cancer treatment compared with white patients, and this is not limited to a particular type of treatment facility," said senior researcher Rajesh Balkrishnan, PhD, of UVA Cancer Center and the University of Virginia School of Medicine’s Department of Public Health Sciences. “Collaboration among providers and community stakeholders and organizations is much needed to increase accessibility and patient knowledge of cancer and to overcome existing disparities in timely care for lung cancer patients."

Cancer Treatment Disparities

The UVA researchers say there may be multiple reasons for the racial disparities. They note that non-white patients are more likely to be uninsured, face greater socioeconomic barriers to care and may be perceived by doctors as being at risk for not following through with their treatment plans. In addition, minorities are more likely to report less satisfaction with their encounters with care providers.

“These findings shed light on the potential presence of and impact of structural racism on patients seeking cancer treatment,” the researchers write in a paper in the scientific journal Health Equity. “Further investigation into the societal determinants that perpetuate disparity in time to radiation therapy, and potential interventions in the clinical setting to improve cultural and racial sensitivity among healthcare professionals, is recommended.”

The longest waits for treatment initiation were at academic health systems, the researchers report. They suggest this may be because patients begin their treatment in community hospitals and other local facilities before they need to be transferred to academic facilities that can offer more complex care.

Racial disparities were found across all types of hospitals, however. White patients consistently had the shortest time to treatment, while Asian patients had the longest, the researchers report.

Ensuring equity in access to cancer care and improving the quality of that care are major missions of UVA Cancer Center, which this year became one of only 53 cancer centers in the country designated a Comprehensive Cancer Center by the National Cancer Institute. The designation recognizes elite cancer centers with the most outstanding cancer programs in the nation. Comprehensive Cancer Centers must meet rigorous standards for innovative research and leading-edge clinical trials.

UVA Cancer Center is the only Comprehensive Cancer Center in Virginia.

Findings Published

The lung cancer findings have been published in the scientific journal Health Equity. The research team consisted of Akhil Rekulapelli, Raj P. Desai, Aditya Narayan, Linda W. Martin, Richard Hall, James M. Larner and Balkrishnan.

To keep up with the latest medical research news from UVA, subscribe to the Making of Medicine blog at http://makingofmedicine.virginia.edu.

Non-white newborns with cystic fibrosis more likely to be missed in screening

Peer-Reviewed Publication

UNIVERSITY OF CALIFORNIA - SAN FRANCISCO

Researchers call for better newborn screens across the U.S.

Cystic fibrosis is missed more often in newborn screenings for non-white than white babies, creating higher risk for irreversible lung damage and other serious outcomes in Black, Hispanic, Asian, American Indian and Alaska Native newborns, a new study finds. 

The genetic panels used in newborn screening programs vary by state, but most tend to test for cystic fibrosis gene mutations that are common in white populations while excluding mutations more common in non-white populations, said Meghan McGarry, MD, associate professor of pediatrics at UC San Francisco (UCSF) Benioff Children’s Hospitals and first author of the study, publishing today in Pediatric Pulmonology.

“Newborn screenings are meant to be a public health measure that is equal across populations, but in practice, we are actually creating disparities because children of color are going undiagnosed for cystic fibrosis until an older age,” McGarry said. “That means they are treated later when symptoms occur, and their outcomes are often worse.”

Cystic fibrosis is one of the most common genetic disorders, with about 1,000 new cases diagnosed each year. The disease causes the malfunction of a protein that helps regulate mucus, which can lead to blockages, trapped germs and, eventually, infections such as bronchitis and pneumonia. Early diagnosis and treatment can reduce severe symptoms, such as failure to thrive, and are associated with improved nutrition, better pulmonary outcomes and survival.

Better Screenings Needed

To determine case detection rates, researchers analyzed the genetic mutations of 46,729 people in the 2020 Cystic Fibrosis Foundation Patient Registry, then calculated the rate of delayed diagnosis or false-negative tests by race and ethnicity. They also compared data across states.

In most states, a newborn screen is positive only when at least one disease-causing variant of the cystic fibrosis transmembrane conductance regulator (CFTR) gene is detected. This study found the detection rate of at least one CFTR variant in known cystic fibrosis patients was 56-77% in Asian patients, 73-86% in Black patients, 84-91% in American Indian and Alaska Native patients, and 81-94% in Hispanic patients, compared to 95-97% in white patients. The differences between races and ethnicities were even larger when the tests aimed to detect two CFTR variants instead of just one.
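As a rough illustration of the detection-rate calculation described above, the sketch below computes, for a handful of hypothetical registry patients, the share of each racial or ethnic group carrying at least one variant on a given screening panel. The patients, variants and panel are invented for the example; the study's registry analysis is far more extensive.

```python
# Minimal sketch (made-up data) of a detection-rate-by-group calculation:
# a newborn screen is "positive" if at least one panel variant is present.
import pandas as pd

patients = pd.DataFrame({
    "race_ethnicity": ["white", "Black", "Asian", "Hispanic"],
    "variants": [["F508del", "G542X"], ["3120+1G->A"], ["not_on_panel"], ["F508del"]],
})

panel = {"F508del", "G542X"}  # hypothetical state screening panel

patients["detected"] = patients["variants"].apply(
    lambda vs: any(v in panel for v in vs)
)
# Fraction of known CF patients the panel would flag, by group.
print(patients.groupby("race_ethnicity")["detected"].mean())
```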

The states with greater racial and ethnic diversity had lower detection rates for each variant panel. Likewise, detection rates were higher overall with genetic panels that included more variants compared to fewer variants, though this wasn’t the case for all races/ethnicities. 

“In order for newborn screenings to be equitable, they must include CFTR variant panels that reflect the racial and ethnic diversity of the population,” McGarry said. “Three states--Wisconsin, New York and California--do this well and use full sequencing, as well as systematically monitor and review who is being missed, what variants they had, and whether they should be adding variants to the panels.”

While race and ethnicity are social constructs, the genes that cause cystic fibrosis do vary by race, ethnicity and region, said Susanna McColley, MD, professor of pediatrics at Northwestern University Feinberg School of Medicine and physician at Ann & Robert H. Lurie Children’s Hospital of Chicago, as well as senior author on the study. 

“The bias in newborn screening testing is coupled with clinician bias against diagnosis of cystic fibrosis in Black, Hispanic, Asian, and American Indian/Native Alaskan babies that can have fatal consequences,” said McColley. “We hope this work will lead to more equitable screening in all states.” 

‘Two diseases’ based on race

About 20% of cystic fibrosis cases are from racial and ethnic minority groups, and the disease occurs in people of all ethnicities and races across the U.S. The delays in detection and treatment for non-white patients have essentially created two diseases, noted McGarry. 

“Most white kids with cystic fibrosis nowadays are never hospitalized; for them, it’s an outpatient disease and they will likely live a full life. If you are not white, however, you are more likely to be the one in the hospital all the time with a severe disease,” McGarry said. “We have a young patient now who had a normal newborn screen, and he wasn’t diagnosed with cystic fibrosis until he was on the ventilator and intubated with multiple pneumonias and permanent lung damage.” 

The goal is to diagnose children before they are a month old to prevent irreversible harm, she added.

“As early as four weeks, you can already see permanent lung damage. Some even have symptoms in utero, though most undiagnosed babies start having respiratory symptoms around 4-8 weeks,” McGarry said. “There is good research showing that if you can get diagnosed through newborn screening before having a lot of disease—rather than being diagnosed with symptoms—you have much better outcomes in the long term.” 

Co-authors and funding: Please see paper for additional co-authors and funding disclosures.   
 

About UCSF Health: UCSF Health is recognized worldwide for its innovative patient care, reflecting the latest medical knowledge, advanced technologies and pioneering research. It includes the flagship UCSF Medical Center, which is ranked among the top 10 hospitals nationwide, as well as UCSF Benioff Children’s Hospitals, with campuses in San Francisco and Oakland, Langley Porter Psychiatric Hospital and Clinics, UCSF Benioff Children’s Physicians and the UCSF Faculty Practice. These hospitals serve as the academic medical center of the University of California, San Francisco, which is world-renowned for its graduate-level health sciences education and biomedical research. UCSF Health has affiliations with hospitals and health organizations throughout the Bay Area. Visit https://www.ucsfhealth.org/. Follow UCSF Health on Facebook or on Twitter

###


 

By leaving garden waste alone, we could store 600,000 tons of CO2 per year

Danes shuttle nearly a million tons of garden waste from their gardens every year. But we would be doing the climate a great service if we just left it alone, according to the calculations of University of Copenhagen professor Per Gundersen

Reports and Proceedings

UNIVERSITY OF COPENHAGEN - FACULTY OF SCIENCE

Danes shuttle nearly a million tons of garden waste from their gardens every year. But we would be doing the climate a great service if we just left it alone, according to the calculations of University of Copenhagen professor Per Gundersen. At the same time, leaves and other garden waste are a boon for backyard biodiversity and soil.

Come fall, the sound of a leaf blower blasting leaves into piles, or the sight of a queue of garden waste-packed trailers at the recycling center is all too common, as trees shed their golden brown and yellow robes.

For the most part, Danes are happy to "tidy up" their gardens. Ministry of Environment figures report that Danes disposed of 983,000 tonnes of garden waste in 2019, all of which is transported, sorted and processed by municipal waste management systems. The largest branches and trunks are burned for bioenergy, while small branches, leaves and grass clippings are composted.

But perhaps we should leave more of our garden alone. According to Per Gundersen, a professor of forest ecology at the Department of Geosciences and Natural Resource Management at the University of Copenhagen, there are large CO2 savings to be made. Professor Gundersen calculated the implications for Denmark’s national climate account if Danes simply allowed garden waste to decompose in their gardens instead of burning it.

"If everyone went all-in and learned how to manage their garden’s branches and twigs, for example, my calculations suggest that we could store about 600,000 tons of CO2 annually," he says.

The calculations are based on a simple model that includes the different decomposition processes for leaves, twigs and branches.

After deducting the benefit of the branches and trunks that are burned for bioenergy in the current system, Gundersen calculates that the CO2 storage potential of garden waste could meet about half of one percent of Denmark’s 70 percent reduction target for 2030.

"When garden waste is burned or composted by municipal waste schemes, CO2 is returned to the atmosphere very quickly. By keeping waste in the garden, the decomposition process is significantly slower. In practice, this means that one builds up a larger and larger storage of CO2 in the garden in the form of twigs, dead branches and leaves that are left to decompose," says Professor Gundersen.

Autumn leaves are good for the garden and biodiversity

Leaves are flying about and filling up yards at this time of the year. But instead of raking them up and hauling them off to a recycling center, there are other great reasons to leave them in the yard.

“Leaves help nourish the garden as they feed an entire ecosystem of decomposers, from fungi, bacteria and tiny, invisible soil creatures to earthworms and roly-poly bugs, which help break down and metabolize organic material so that nutrients are released into the soil. These decomposers then act as an important food source for larger animals like hedgehogs and birds," explains Per Gundersen.

As microorganisms, worms and other decomposers wage war on the leaves, a large part of the biomass ends up as CO2 that is slowly released into the air over several years. A small portion of the gnawed leaves becomes soil humus, a common name for the complicated organic material that takes a very long time to decompose and is of great importance for plant health.

"A high humus content in topsoil makes for a dark soil that has a good structure and retains water and nutrients well, which is important if you want garden plants to thrive," says Per Gundersen.

Facts

  • According to figures from the Danish Ministry of Environment, Danes disposed of 983,000 tonnes of garden waste in 2019. This figure includes garden waste from businesses and parks.
  • That garden waste consists of carbon extracted from the atmosphere as CO2. Converted, the amount corresponds to about 1.2 million tonnes of bound CO2.
  • If Danes kept their garden waste in their yards, and the amount currently used for bioenergy is deducted, about half of one percent of Denmark’s 70 percent reduction target for 2030 could be met.
  • Leaves consist largely of carbon drawn from the gas CO2. Minerals taken up from the soil account for less than 5% of a leaf.
  • Leaves are small CO2 stores that absorb CO2 from the air and release some of it while decomposing.
  • Leaves have a half-life of 3-6 months (the time until half the material remains), while twigs and branches take 2-5 years. Tree trunks decay over 10-20 years.
  • Find inspiration for managing 'yard waste' on the Facebook page 'Klimahaven'

We must dare to let go of control

But what to make of all those leaves suddenly filling up the garden? Per Gundersen suggests raking them into planting beds where vegetables are grown, or over to areas where you want to avoid weeds. They can also be gathered in a part of the garden that you may be willing to let grow wild and manage itself.

"Funnily enough, while the wind tends to blow leaves around the garden, they often remain in beds or around bushes where there is less wind, which is a good thing," he says.

By spring, a large portion of the leaves will have already disintegrated and returned into the invisible garden cycle. Leaves from linden and ash trees are especially good at disappearing quickly from gardens. Oak and fruit tree leaves are slower to break down, making them ideal for soil insulation around bulbous plants or vegetables, for example.

"Our gardens can help address both the climate and biodiversity crises if we keep more of our garden waste in them. I also think it will mean a little less garden work in the long run. And it becomes more fun, because there will be a greater abundance of life around the garden. We just need to dare to let go of control and make more room for nature by allowing for twigs and a few dead branches, for example," concludes Per Gundersen.

WVU researchers dig into study of manure as organic fertilizer

Grant and Award Announcement

WEST VIRGINIA UNIVERSITY

WVU Field 

IMAGE: A manure spreader drives over a plot of land for a previous research project conducted by Elizabeth Rowen, a service assistant professor of entomology in the West Virginia University Davis College of Agriculture, Natural Resources and Design.

CREDIT: Submitted photo

These West Virginia University researchers are the No. 1 experts in No. 2. 

Led by Elizabeth Rowen, a team of faculty from the Davis College of Agriculture, Natural Resources and Design will study the use of manure as an organic fertilizer, thanks to a $750,000 grant from the United States Department of Agriculture.

More specifically, they are digging into the most effective ways to reduce insects and pathogens that threaten crops.

The U.S. is the world’s largest consumer of beef — much of which is grain-fed — and spends $233 billion annually to import wheat, corn and soybeans for organic livestock feed. However, domestic field crop producers might have an opportunity to meet those needs if the barriers to organic production, like pests and nutrient sources, were lowered.

The key may be cow manure, which helps build nutrient content and organic matter content of the soil and bolsters soil health.

“In organic production, manure is very common because it is one of the only sources of nutrients that you can have,” said Rowen, assistant professor of entomology. “You can’t use chemical sources of nutrients to grow things like organic grain to feed your cows. You have to produce them with manure or other types of composts.”

Manure not only makes an effective fertilizer, meeting plant nutritional needs, but it’s also a common one. It’s readily available in Pennsylvania and West Virginia, where the livestock and dairy industries are prevalent.

Agricultural managers can apply manure in dry-stacked, liquid or composted form, but it also may bolster weeds, insects and soil pathogens, all of which will damage organic crops. Rowen and her colleagues are studying which form is most likely to carry or combat these pests.

“We have put together a team that's going to look at the soils, the weeds, the microbiome of these plants,” Rowen said. “We’re looking at how the microbiome from the manure transfers to the soil, and how it then affects insects. Because there's good evidence that applying manure can make plants more resistant to insect pests.”

Researchers don’t know why, but possible factors include the microbial community in the soil or the micronutrients from the applied manure fertilizer.

The right soil conditions will also encourage beneficial insects like spiders and ground beetles, which are important predators in agricultural systems. Organic matter in the soil feeds and stabilizes predator populations so they can prey on destructive insects.

“We know that organic systems tend to have fewer insect pests even though they’re not getting sprayed with insecticides,” Rowen said. “But is there something about manure that’s making these plants more resilient? We’re going to be investigating in the greenhouse, trying to pick apart that system. I’m really excited about it.”

Cost effectiveness, too, plays a role in what fertilizers a farmer may choose, so the research team will be looking at ways to make an organic farm more profitable.

CAPTION: Researchers at the WVU Davis College of Agriculture, Natural Resources and Design are utilizing plots at the WVU Agronomy Farm to research effective ways to use manure as organic fertilizer.

“The U.S. is importing organic grain, and it’s got a higher economic value than conventional grain,” Rowen said. “That’s one of the reasons you would transition to growing your own organic grain, because the alternative is expensive.”

She said team member Ana Claudia Sant’Ana, assistant professor of resource economics and management, will be looking at how fertilizer management affects profitability. Other members of the research team include Ember Morrissey, Jim Kotcon, Eugenia Peña-Yewtukhiw and Rakesh Chandran.

Currently, Rowen and her colleagues are preparing for the three-year study at the WVU Agronomy Farm, where they’ll compare four different fertilizer treatments. In the spring, they’ll apply either liquid, raw (dry-stacked) or composted manure to the soil before planting corn, soy and wheat; the fourth treatment will be a control that doesn’t receive any fertilizer. Researchers will also have plots that are certified organic and others that will transition to organic. The latter will represent farmers who want to become organic and need an appropriate fertilizer.

“We want to be able to present the options,” she said. “Especially for people who are trying to decide what to invest in. Things like composting their manure and using liquid manure, those are investments in terms of equipment and time and such. If there’s really an advantage for their pest management strategies, it's valuable to have that information.”

Rowen hopes the results will support decision-making in the future. In a nation with growing interest in organic milk and beef, she sees the potential to increase organic feed production and help farmers transition to more profitable organic production.

“We drink a lot of milk and we eat a lot of beef,” she said. “We’re not getting rid of cows anytime soon.”

3D flora and fauna at your fingertips

Repository of over 1,400 highly detailed 3D models of animals and plants captured with ‘Bio-photogrammetry’ publicly available

Peer-Reviewed Publication

KYUSHU UNIVERSITY

Luna Lionfish, Pterois lunulata 

IMAGE: 3D render of a luna lionfish captured with bio-photogrammetry.

CREDIT: Kano Lab/Kyushu University

Fukuoka, Japan—Reporting in Research Ideas and Outcomes, a Kyushu University researcher has developed a new technique for scanning various plants and animals and reconstructing them into highly detailed 3D models. To date, over 1,400 models have been made available online for public use. 

Open any textbook or nature magazine and you will find stunning high-resolution pictures of the diverse flora and fauna that inhabit our world. From the botanical illustrations in Dioscorides’ De materia medica (50-70 CE) to Robert Hooke’s sketches of the microscopic world in Micrographia (1665), scientists and artists alike have worked meticulously to draw the true majesty of nature.

The advent of photography has given us even more detailed images of animals and plants both big and small, in some cases providing new information on an organism's morphology. As technology developed, digital libraries began to grow, giving us near unfettered access to valuable data, with methods like computed tomography, or CT, and MRI scanning becoming powerful tools for studying the internal structure of such creatures.

"While powerful, MRI scanning and CT methods are prohibitively expensive. You also can't collect vital information such as the organism's color," explains Yuichi Kano, associate professor of Kyushu University's Graduate Education and Research Training Program in Decision Science for a Sustainable Society. "So, we developed ‘bio-photogrammetry’ as a way to incorporate photogrammetry that could scan and render a high-quality 3D image of an organism." 

Photogrammetry is a method by which you can obtain information and measurements about objects by analyzing photos or other imagery. Today it is commonly used to scan everything from landscapes to sculptures to make digital 3D models, similar to what you find on Google Earth. 

Kano took that same methodology to make thousands of models of various organisms. 

“We suspended the sample on a fishing line and took photos from multiple angles. We would end up taking hundreds of photos of the sample, and input up to 500 of the best ones into the photogrammetry program,” explains Kano. “It is similar to how the ‘bullet time’ sequences were filmed in the first Matrix movie, except instead of Keanu Reeves on a line surrounded by cameras, we use an octopus.” 
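One plausible, if hypothetical, pre-processing step in such a workflow is culling the hundreds of captured photos down to the sharpest 500 before handing them to the photogrammetry software. The sketch below does this with OpenCV's variance-of-Laplacian sharpness measure; the folder name and the cutoff are assumptions, and the paper may select photos differently.

```python
# Sketch of one possible photo-selection step (not Kano's actual pipeline):
# rank captured photos by sharpness and keep the best 500 for photogrammetry.
import glob
import cv2  # OpenCV

def sharpness(path: str) -> float:
    """Variance of the Laplacian: higher means a sharper, better-focused photo."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    return cv2.Laplacian(img, cv2.CV_64F).var()

photos = glob.glob("specimen_photos/*.jpg")          # hypothetical capture folder
best = sorted(photos, key=sharpness, reverse=True)[:500]
print(f"Keeping {len(best)} of {len(photos)} photos for the photogrammetry software")
```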

While Kano has been working on various organisms including insects, plants, and even fungi, he is currently focusing on aquatic animals such as fish and amphibians. To date, there are over 1,400 specimens available, all free to use under the CC BY 4.0 license.

There are a few limitations in the current methodology, such as difficulty in capturing transparent creatures or making models of exceedingly small (<5 mm) or large (>1 m) organisms, but a few improvements in software and protocols could help solve such issues. 

“I hope to see this work continue to grow and be utilized in various fields like taxonomy, morphology, and ecology. It’s free to the public, so you can use it in education or even plug it into a VR machine and explore these organisms up-close. I’d like to see what some people can come up with,” concludes Kano. 

Bio-photogrammetry in action (VIDEO)

A sample organism, in this case a Long-spine Balloonfish (Diodon holocanthus), is processed (inflated), scanned, and rendered into a high-quality 3D model. Up to 500 photos go into rendering a single model. Kano demonstrates the process in the video.

A series of 3D models captured using bio-photogrammetry. The process can be used to scan a variety of organisms including plants, amphibians, and insects.

CREDIT: Kano Lab/Kyushu University

For more information about this research, see "Bio-photogrammetry: digitally archiving coloured 3D morphology data of creatures and associated challenges," Yuichi Kano, Research Ideas and Outcomes (2022). https://doi.org/10.3897/rio.8.e86985 

About Kyushu University 
Kyushu University is one of Japan's leading research-oriented institutes of higher education since its founding in 1911. Home to around 19,000 students and 8,000 faculty and staff, Kyushu U's world-class research centers cover a wide range of study areas and research fields, from the humanities and arts to engineering and medical sciences. Its multiple campuses—including the largest in Japan—are located around Fukuoka City, a coastal metropolis on the southwestern Japanese island of Kyushu that is frequently ranked among the world's most livable cities and historically known as a gateway to Asia. 

Q&A: David Kaiser on Freeman Dyson, the relentless freethinker

MIT physicist and historian of science has edited a new volume about Dyson, a famed quantum theorist and futurist.

Book Announcement

MASSACHUSETTS INSTITUTE OF TECHNOLOGY

In the early 2000s, David Kaiser first visited famed physicist Freeman Dyson at the Institute for Advanced Study in Princeton, New Jersey. By the end of the conversation, Dyson was handing over keys to his office, along with files, to help Kaiser’s research — a characteristically open gesture by Dyson, a legendary quantum theorist.

The English-born Dyson started his career as a mathematical prodigy who helped expand quantum electrodynamics (QED) — the study of the quantum behavior of light and its interaction with charged particles — and ended it as a renowned author of science books and essays. In between, Dyson dove into a wide array of subjects, including the origins of life, the design of nuclear power plants, nuclear missile disarmament, astrophysics, and space travel, becoming a hero to futurists. Dyson also courted controversy late in life for downplaying the dangers of climate change, having previously recognized the salience of the issue as early as the 1970s.

Now, Kaiser, a historian of science and active physicist himself, has edited a volume of essays about the life and career of Dyson, titled, “‘Well, Doc, You’re In’: Freeman Dyson’s Journey through the Universe,” published today by the MIT Press. (The title is a quip Richard Feynman once made after another young physicist praised Dyson’s work at a physics conference.) Contributors include the prominent technologist Esther Dyson, who is Dyson’s daughter. MIT News talked about the new book with Kaiser, who is the Germeshausen Professor of the History of Science in MIT’s Program in Science, Technology, and Society, and a professor in the Department of Physics.

Q: What is most interesting to you about Freeman Dyson?

 

A: Freeman Dyson was more than just another really smart scientist. Dyson strikes me as an unusual thinker who tackled a range of questions with tenacity. He wasn’t right every time. We know sometimes he was fantastically not correct, like every human. But his intellectual fearlessness and open-ended curiosity were astonishing. To have made the kinds of contributions he did across the span of topics in addition to physics and math — an original design of a nuclear reactor, trying to reimagine space travel, thinking hard about nuclear disarmament, the fundamentals of life sciences and biology — he wasn’t just broadly curious, he was relentless. And over the decades he emerged as an award-winning, beloved author with bestselling books. When I got to know him, I saw kindness and curiosity. He was very self-effacing and modest, a fascinating person.

 

Q: Dyson is often described as expanding and popularizing the work of the quantum theorists Richard Feynman, Julian Schwinger, and Sin-Itiro Tomonaga, who had developed QED. But how would you describe his specific accomplishment here?

 

A: Dyson was synthesizing work from multiple scientists that even his colleagues thought might not have much in common. They had each found a way to conquer the first round of corrections in QED, which had stymied generations before. That was enormous. For years until that time, the equations had kept breaking down, yielding these infinities when everyone knew the answer should be some small number. The three of them had cracked that, independently, with very different-looking methods.

 

Dyson put it all together in a way they had missed. He first demonstrated that the three distinct approaches were in fact compatible — they shared a common underpinning. And then he pushed much further than the other three, demonstrating in a real tour de force that one could calculate the corrections to arbitrary precision. Where Feynman, Schwinger, and Tomonaga had each found a way to calculate basically the percent-level corrections for a given physical process, Dyson demonstrated that the corrections could be computed, self-consistently, all the way out to parts-per-million, parts-per-billion, and beyond.

 

And indeed, thanks to Dyson’s insights, QED is the most precise scientific theory in human history. Specific quantities, such as how an electron will respond when placed in a magnetic field, can now be calculated using Dyson’s techniques all the way out to 12 decimal places and compared with stunning laboratory measurements — and the theoretical predictions and experimental measurements match all the way out to parts-per-trillion.
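As an illustration of the structure Dyson systematized (a textbook example, not something drawn from the book), consider the perturbative series for the electron's anomalous magnetic moment, where each successive power of the fine-structure constant α refines the prediction; the leading coefficient, 1/2, is Schwinger's, and the higher coefficients are omitted here:

```latex
a_e \;=\; \frac{g_e - 2}{2}
    \;=\; C_1\left(\frac{\alpha}{\pi}\right)
        + C_2\left(\frac{\alpha}{\pi}\right)^{2}
        + C_3\left(\frac{\alpha}{\pi}\right)^{3} + \cdots,
\qquad C_1 = \tfrac{1}{2}, \quad \frac{\alpha}{\pi} \approx 2.3 \times 10^{-3}
```

Each extra power of α/π suppresses the next term by roughly a factor of a thousand, which is why carrying the expansion out to higher orders, self-consistently in the way Dyson showed was possible, takes the theory from its first corrections down to the parts-per-trillion agreement described above.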

 

Dyson wasn’t just popularizing the work of others; he was supercharging the efforts of the other three. On top of that, Dyson became a remarkably effective ambassador for the new methods, patiently teaching a whole generation how to perform the new kind of calculations. You can trace the spread of these new techniques to Dyson, literally more than any person on Earth.

 

Q: Why did Dyson move on from physics? And across all these topics, were there any common characteristics in his thinking?

 

A: Partly he had this roving curiosity from his really early years onward. And from his school years, through university, and his searing experience during the Second World War, when he worked for the British military authorities, trying to mathematize strategies for bombing campaigns, he became very skeptical of hierarchy and formal organizations.

 

While thinking about problems at various scales is not unique to Dyson, he ran with it exuberantly. Details are important, and his work on QED was certainly detail-oriented, but Dyson would ask, “What’s going to dominate at a particular scale?” Most phenomena have many moving parts, and some interactions might be more important on one scale than another. That’s how many, many physicists think about the world now, but Dyson did it earlier. And he didn’t confine his approach to theoretical physics. When it came to nuclear disarmament, he realized there was an exponential growth that couldn’t possibly be sustained for above-ground testing, and that something dramatic had to change — not an incremental change. You see it in Dyson’s unusual approach to astrophysics and cosmology, where he’s not just taking a long view, but a preposterously long view — not just billions of years, but hundreds of billions of years.

 

We even see that in his forays into climate change, which became controversial later in his life. I hadn’t appreciated this, but he’d been thinking carefully and critically about climate change driven by human emissions, from the 1970s. His approach again was to say this problem is of such a scale that we’re not going to get there by incremental change. His thoughts on climate change later became harder to square with a robust scientific consensus, but in the early years, he was actually out in front of this.

 

Q: Okay, let’s talk about climate change. How did Dyson get so dug into a conflict with climate scientists and other advocates for strong climate action?

 

A: It was interesting for me to learn more about it while working on the book. I think by the end of his life he was wrong on the topic of climate change, and I just want to be clear about that. I’m not an outlier in thinking that. So, what was the nature of his shift? He was very much concerned and even alarmed early on. If you read his earlier writings on the topic, they would sound very timely, even today.

 

Part of Dyson’s change was his habit of being skeptical when an organized consensus is forming. Yet sometimes organized consensuses are solid. People work together and treat each other’s work with helpful, constructive skepticism, and the collective understanding gets better. But Dyson had a personal allergy to that. He felt a responsibility to poke at a forming consensus. Yet he was getting older and not devoting the same attention to the real details of climate models as he had in the 1970s and 1980s.

 

So, we are left with a mixed record on the topic. Right up until he passed away, he agreed that climate change was happening and that it was accelerated by human activities; he was not a climate-change denier. But he became increasingly critical of the scientific community’s approach to understanding the problem and trying to address it. Unfortunately his strongest statements on the topic came long after he had stopped actively following research on the topic. He was also insufficiently vigilant about clarifying his position for those who had adopted some of his soundbites. And that’s on him. He should have done that.  

 

Another important thread of his thinking is that he was a techno-optimist. That is, even though he recognized that human-produced climate change is real and a problem, he had a kind of faith, not a calculation but a faith, that human technological ingenuity would get there faster than the climate calamities.

 

Q: It seems like these tensions in Dyson’s career involve a distrust of authority and consensus that wasn’t always borne out. He championed space exploration, but when we put people on the moon, it came from a fairly top-down organization.

 

A: Yes, Dyson’s scientific approach did not just involve fearlessness, it was about unboundedness. But that doesn’t always work for organizing people. The unbounded imagination is critical, but that’s not the only register in which we need to think to get things done.

 

Q: Can we apply Dyson-esque thinking to Dyson’s career? What helps generate an unbounded spirit of inquiry? At the Institute for Advanced Study he didn’t have to teach, and could focus on research. But at MIT we think teaching is good and can help spur ideas. Do you have any scaled-up conclusions about how to produce wide-ranging freethinkers like Dyson?

 

A: In at least one important sense the Institute for Advanced Study does share a critical feature with MIT, which is, it can foster a focus on the long view. Having an institutional stability can enable a thinker to wander, to explore ideas across topics, in our case through tenure. And Dyson certainly was an intellectual wanderer. But he could do it safe in the knowledge that he had a home base. In other aspects, these institutions can be quite different. But nurturing a place where people can pursue ideas as their curiosity leads them — I think in many wonderful institutions of higher education around the world, we still have the great privilege to take advantage of that.

 

###

 

Written by Peter Dizikes, MIT News Office

 

Additional background

 

Book: “‘Well, Doc, You’re In’: Freeman Dyson’s Journey through the Universe”

https://mitpress.mit.edu/9780262047340/well-doc-youre-in/

Dan Karney wins EPA challenge to study how power plants ramp to meet demand, California's ‘Duck Curve’

Grant and Award Announcement

OHIO UNIVERSITY

If the sun isn't shining and the wind isn't blowing, then ramp up the fossil-fuel power plant to fill the gap. But it's not that easy or efficient to run a power grid that way, said economist Daniel Karney, who recently won a U.S. EPA challenge to study the effects of "ramping."

"Think about your car," Karney said. "If you're cruising along at highway speed, your engine is running very efficiently. But if you're at a stop sign, it takes a lot of energy to reach that highway speed.

"Ramping isn't instantaneous, meaning it can take more than 12 hours to bring a natural gas system up to capacity," Karney said. "And ramping with fossil fuels can actually cause an increase in emission rates as the system warms up. When a power company is trying to comply with government regulations, heat rates and efficiency are very important," said Karney, an associate professor of economics in the College of Arts and Sciences at Ohio University who conducts research at the intersection of energy and environmental policy.

Karney and Khyati Malik, a Ph.D. student at The Ohio State University, will use Power Sector Emissions Data from the EPA's Clean Air Markets Division to study the ramping behavior of fossil fuel-fired power plants and the subsequent impact on emissions and regulatory compliance. Malik is an Ohio University alumna, having earned an M.A. in Economics from the College of Arts & Sciences in 2018.

"Utility-scale power plants traditionally ramp generation up and down to balance grid-level electricity demand and supply, but the increased deployment of intermittent and distributed renewables may impact the need for ramping," Karney said. Therefore, the project’s first goal is to evaluate if and how Clean Air Markets Program Data can be used to determine the ramping behavior of fossil-fuel fired electricity generating units in support of EPA's regulatory compliance and program evaluation.

"We will also evaluate the potential effect of increased work-from-home employment during the pandemic on ramping and the 'Duck Curve' phenomenon," Karney said. "Ramping presents a particular challenge for electricity and associated emission markets in areas with large penetration of distributed solar capacity with a phenomenon known as the Duck Curve."

California and the Duck Curve

Californians are familiar with power grid fluctuations and brownouts and are now facing the Duck Curve too.

"The daytime ‘belly’ or dipping of the curve occurs because of distributed solar generation (think roof top solar panels) reaches its apex in late mornings and early afternoons. Then, there is a period of transition when solar generation falls quickly with the dissipation of sunlight,” Karney said. “During the drop in solar power generation, power plants then must step in and generate sufficient electricity to meet the demand; otherwise, the local grid becomes unstable, leading to brownouts. This steep increase in electricity generation is the ‘neck’ of the curve. The overall shape resembles a duck and hence provides the name."

He said power companies anticipate the need by ramping up generating capacity prior to the fall of the distributed solar power supply, but that can lead to excess generation before the grid reaches the neck. This timing is especially important as the neck can coincide with peak demand for energy-intensive activities like air conditioning.
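A stylized sketch of the geometry being described, using synthetic numbers rather than real grid data: net load is total demand minus distributed solar output, and the evening ramp is the hour-to-hour jump in net load that dispatchable plants must cover. The demand and solar profiles below are invented for illustration.

```python
# Toy "duck curve": net load = demand minus distributed solar (synthetic data).
import numpy as np

hours = np.arange(24)
demand = 28 + 6 * np.exp(-((hours - 19) ** 2) / 8)                 # GW, with an evening peak
solar = 10 * np.clip(np.sin((hours - 6) * np.pi / 13), 0, None)    # GW, daylight hours only

net_load = demand - solar          # what conventional plants must supply (the duck's shape)
ramp = np.diff(net_load)           # hour-to-hour change in that requirement

print("Belly (minimum net load): %.1f GW at hour %d" % (net_load.min(), net_load.argmin()))
print("Steepest ramp: %.1f GW/hour going into hour %d" % (ramp.max(), ramp.argmax() + 1))
```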

Could the Duck Curve and brownouts be the next West Coast phenomena to sweep east?

"With an increased distributed solar generation across the United States, there exists a possibility of the curve migrating outside of California," Karney said. "Solving this problem could require investment in energy storage and costly demand response programs in order to deal with the Duck Curve."

The Impact of Working from Home

The second goal of the project is to analyze the effect of an exogenous shock like the COVID-19 pandemic on the curve by examining how overall energy consumption changed with the surge in work from home, which is expected to continue past the pandemic.

"In the absence of work from home, the curve shows that the demand for electricity is highest when people prepare to go for work or school in the morning or after they have returned in the evening. Higher rates of work from home may lead to smoothening of the fluctuations in demand for electricity with households having a steady daytime electricity use. This implies that potentially less ramping is required in the transition period and would mitigate some need for energy storage and demand response," Karney said.

"However, work from home could theoretically exacerbate the curve's neck if demand sufficiently increased, and it is unclear if increased work from home exacerbates or lessens the effect of the curve and the subsequent expected future cost of mitigating the curve's effects on generation and emissions," he said.

Creating a Machine Learning Predictive Model

That's why Karney and Malik will be modeling the size of the ramping effect on generation and emissions, and its subsequent effect on switching from solar power generation to fossil-fuel based power generation. They will be using machine learning methods like artificial neural networks or random decision forests to identify the most efficient model.

"The use of machine learning methods to model the high frequency electricity demand and emissions data provides a data-driven predictive model that utilizes various inputs from Clean Air Markets Program Data and other datasets. We will simulate various scenarios and evaluate the frequency and size of the ramping effects," said Malik, who has working papers in the field of public economics and is currently working on her Ph.D. dissertation in the fields of energy, environmental and regional economics.

"Using the machine learning tools, we hope to identify the ramping conditions that maximize the efficiency of electricity generation and minimize the emissions, all else equal," Karney said.

"To examine if higher rates of work from home led to smoothening of the fluctuations in the net demand for grid electricity, we will construct the curves before and after the pandemic and then compare them with a counterfactual curve assuming the absence of the pandemic shock using the machine learning model," said Karney, who previously worked at ICF International running the Integrated Planning Model in support of the EPA's Clean Air Markets Division regulatory analysis.

While Karney and Malik are working with U.S. data for their project, "The Effect of Thermal Electric Generator Ramping on Emissions, Regulatory Compliance, and the Duck Curve," Karney said their work could theoretically inform international markets as well.

Chemical clues to the mystery of what’s coating Stradivari’s violins

Peer-Reviewed Publication

AMERICAN CHEMICAL SOCIETY

IMAGE: A highly precise, nanometer-scale imaging technique revealed a protein-based layer between the wood and the varnish coating of these two Stradivarius violins.

CREDIT: Adapted from Analytical Chemistry 2022, DOI: 10.1021/acs.analchem.2c02965

Stradivarius violins produce elegant music with a level of clarity that is unparalleled by modern instruments, according to some musicians. And it’s the finishing touches — mysterious treatments applied hundreds of years ago by Antonio Stradivari — that contribute to their unique look and sound. In a step toward unraveling the secret, researchers reporting in ACS’ Analytical Chemistry describe nanometer-scale imaging of two of Stradivari’s violins, revealing a protein-based layer between the wood and varnish.

Previous studies have reported that some stringed instruments crafted by Stradivari have a hidden coating underneath the shiny varnish. This coating’s purpose would have been to fill in and smooth out the wood, influencing the wood’s resonance and the sound that’s produced. Knowing the components of this film could be key to replicating the historic instruments in modern times. So, Lisa Vaccari, Marco Malagodi and colleagues wanted to find a technique that would determine the composition of the layer between the wood and varnish of two precious violins — the San Lorenzo 1718 and the Toscano 1690.

Using synchrotron radiation Fourier-transform infrared spectromicroscopy, a technique previously applied to historic violins, the team found that both samples had an intermediary layer, but this method couldn’t differentiate the layer’s composition from the adjacent wood. Then they turned to infrared scattering-type scanning near-field optical microscopy (IR s-SNOM) to analyze the samples. The IR s-SNOM apparatus includes a microscope that collects images tens of nanometers wide and measures the infrared light scattered from the coating layer and the wood, providing information about their chemical composition. The results of the new method showed that the layer between the wood and varnish of both instruments contained protein-based compounds, congregating in nano-sized patches. Because IR s-SNOM provided a detailed 3D picture of the types of substances on the violin’s surface, the researchers say it could be used in future studies to identify compounds in complex multi-layer cultural heritage samples.

The authors acknowledge CERIC-ERIC and Elettra Sincrotrone Trieste for access to experimental facilities and financial support.

The American Chemical Society (ACS) is a nonprofit organization chartered by the U.S. Congress. ACS’ mission is to advance the broader chemistry enterprise and its practitioners for the benefit of Earth and all its people. The Society is a global leader in promoting excellence in science education and providing access to chemistry-related information and research through its multiple research solutions, peer-reviewed journals, scientific conferences, eBooks and weekly news periodical Chemical & Engineering News. ACS journals are among the most cited, most trusted and most read within the scientific literature; however, ACS itself does not conduct chemical research. As a leader in scientific information solutions, its CAS division partners with global innovators to accelerate breakthroughs by curating, connecting and analyzing the world’s scientific knowledge. ACS’ main offices are in Washington, D.C., and Columbus, Ohio.

To automatically receive news releases from the American Chemical Society, contact newsroom@acs.org.
