Tuesday, December 16, 2025

Transparent ceramic could boost internet speeds and cut energy use


Researchers use an advanced physics theory to solve the performance mystery of the newly developed material



Penn State

Image caption: A new physics theory called zentropy could help explain why recently developed transparent ceramics control light far better than expected, a discovery that could lead to faster, smaller and more energy-efficient optical technologies used in communications, sensing and medical imaging.

Credit: Zi-Kui Liu/Phases Research Lab





UNIVERSITY PARK, Pa. — A new class of ceramics is not only transparent, but it can also control light with exceptional efficiency — better than any theory predicted. Now, an advanced theory put forth by a researcher at Penn State may explain why this material is so good at controlling light, which could lead to large-scale manufacturing of these materials for faster, smaller and more energy-efficient technologies used in high-speed communications, medical imaging and advanced sensing.

To solve the puzzle of why the transparent ceramics’ electro-optic properties — the ability to change how the material bends or transmits light when a voltage is applied — performed far better than predicted, Haixue Yan, reader in materials science and engineering at Queen Mary University of London, reached out to Zi-Kui Liu, a Penn State professor of materials science and engineering. Liu previously developed an advanced theory of entropy, the concept that systems trend toward disorder if no energy is applied to keep the chaos at bay. This advanced theory, known as zentropy theory, blends quantum mechanics, thermodynamics and statistical mechanics into a single predictive framework. Together with a team representing multiple institutions across six countries, they solved the mystery and published their work in the Journal of the American Chemical Society.

Ceramics offer major advantages for optical technologies because they are far cheaper to manufacture than single crystals, easier to scale into usable components and allow precise control of composition. However, to function in electro-optic devices, the material must be transparent so that light passes through it smoothly, a longstanding challenge that recent processing advances have finally overcome.

“Ceramics are much cheaper, easier to manufacture and allow precise control of the material’s chemical composition,” Liu said. “The challenge is that ceramics must be transparent, so the light can pass through them smoothly without distortion, before they can function as electro-optic materials.” 

Researchers achieved transparency by using improved manufacturing techniques that smooth out the tiny imperfections inside the ceramic, the same imperfections that would normally scatter light and make the material look cloudy. These newer methods help the ceramic’s internal grains line up more evenly with far fewer defects, allowing light to pass straight through. The research team used these techniques to create the fully transparent ceramics used in the study. This, in turn, enabled the strong electro-optic results, which surprised the researchers.

“There was no existing theory in the ferroelectrics community that could explain these results,” Liu said, explaining that Yan learned of his zentropy theory and reached out to collaborate. Liu said the team was motivated by hints in the scientific literature that transparent ferroelectric single crystals with dense domain walls could show unusually strong electro-optic behavior. Scientists suspected that if unusual electro-optic behavior appeared in single crystals with many domain walls — the internal boundaries that separate differently oriented regions inside the material — the same underlying mechanism might also show up in ceramics, which naturally contain even richer domain structures.

From analyzing the transparent ceramic materials, Yan, Liu and the rest of the team found that the same mechanism did appear — and enabled a much stronger performance. The problem, they explained, was understanding why. To understand these results, the team zoomed farther into the material than scientists normally look. In typical ferroelectric materials, the electric charge is arranged into large “domains,” which are regions made of thousands of atoms that all line up and flip direction together when a voltage is applied. These big domains work well for technologies that operate at slower, radio-frequency speeds, but they simply can’t move fast enough to respond to the incredibly rapid light waves used in photonics. According to the researchers, big domains could not account for the unusually strong electro-optic effects they saw in transparent ceramics. 

So, the team turned to high-resolution transmission electron microscopy and advanced computer simulations to look at the material on a much smaller scale. Instead of large, slow-moving domains, they found the material contained tiny pockets of polarization only a few atoms wide. These small, fast-responding structures, almost like “mini-domains,” helped explain the ultrahigh performance. 

“These very small polar features have extremely fast relaxation times,” Liu said. “They can adjust their electronic polarization almost instantly under an applied field.”  

He explained that these tiny polar regions are not static; instead, they fluctuate continuously, and this dynamic character allows them to respond at optical speeds.

 “This behavior is very different from typical ferroelectrics,” Yan said.  

Liu’s zentropy theory helped the team make sense of why the new ceramics behaved so differently from what existing ferroelectric models predicted. Zentropy is designed to capture how atoms inside a material constantly shift, vibrate and rearrange — behavior that traditional theories often treat as background noise, Liu said. Through the lens of zentropy, the researchers mapped out all the tiny structural states the atoms can adopt and  then calculated how those rapid fluctuations add up to influence the material’s overall performance. This approach is especially useful for ferroelectrics, whose internal structures are highly dynamic, particularly at the high frequencies used in photonics, according to the researchers. 
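In rough terms, zentropy treats the material as a statistical mixture of competing atomic configurations. The schematic free-energy bookkeeping below is a simplified sketch of that idea, following the general structure described in Liu’s published zentropy work rather than the specific equations used in this study:

    % Each configuration k contributes its own free energy F_k, weighted by
    % the probability p_k of the system being found in that configuration.
    \begin{align}
      Z &= \sum_k e^{-F_k/(k_B T)}, \qquad p_k = \frac{e^{-F_k/(k_B T)}}{Z}, \\
      F &= -k_B T \ln Z, \\
      S &= \sum_k p_k S_k \;-\; k_B \sum_k p_k \ln p_k .
    \end{align}

The first term in S carries the entropy within each configuration, and the second carries the entropy of switching among them; this is the mathematical sense in which rapid atomic-scale fluctuations "add up" to shape the bulk response.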

They found that the theory of zentropy could explain why the small, fast-moving polar regions they observed were able to respond at optical speeds. When a material’s internal structure breaks down into these tiny, fluctuating units, the energy needed for the polarization to flip becomes extremely low. That means the material can adjust to an applied electric field almost instantly, producing the ultrahigh electro-optic response seen in the experiments. Traditional theories, which assume larger and slower-moving domain structures, simply couldn’t account for this behavior. Liu noted that zentropy showed that the remarkable performance was not a lucky accident but a natural consequence of the material’s atomic-scale dynamics.  

“By breaking the larger system into smaller atomic units, the energy barrier for polarization changes becomes much lower,” Liu said. “That allows the response to be extremely fast.” 
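A generic Arrhenius-type estimate (not the study’s own model) makes this point concrete. If the relaxation time of a polar region scales as

    \tau \approx \tau_0 \, e^{E_b / (k_B T)},

and the barrier E_b grows with the volume of material that must flip together, then shrinking the switching unit from a many-thousand-atom domain to a few-atom polar cluster lowers E_b and shortens \tau exponentially, pushing the response toward optical frequencies.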

This understanding is key to being able to scale up future production of transparent ceramics, Liu said. The researchers have already demonstrated that their ceramics can be produced reliably at laboratory scale, and they are now working to scale production, evaluate long-term reliability and develop safer lead-free versions for industry.

“With progress in these areas, we are optimistic that practical devices could follow in the near future,” Liu said.  

Such practical devices could reshape key optical technologies — from fiber-optic internet infrastructure to self-driving car guidance systems and precision medical diagnostics, to name a few examples — that power the modern digital economy, the researchers said, explaining that lithium niobate has been the standard material in these devices for decades. Applying electricity changes how lithium niobate bends light, but only by an amount so small that it is like nudging a ruler by the width of a few atoms. The ceramics developed in this new study demonstrate electro-optic coefficients far beyond that level.
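For context, the strength of a linear electro-optic (Pockels) material is usually quoted through a coefficient r, which sets how much the refractive index shifts under an applied field. The numbers below are typical literature values for lithium niobate, used purely for illustration and not taken from this study:

    % Index change under the Pockels effect:
    \Delta n \approx \tfrac{1}{2} n^3 r E
    % With n \approx 2.2, r \approx 30\ \mathrm{pm/V}, and E = 10^6\ \mathrm{V/m}:
    \Delta n \approx 0.5 \times (2.2)^3 \times (30\times10^{-12}) \times 10^{6} \approx 1.6\times10^{-4}

An index change of roughly one part in ten thousand is the "nudging a ruler by a few atoms" scale described above; materials with much larger effective coefficients can achieve the same modulation with lower voltages or shorter devices.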

“These materials could pave the way for a new generation of electro-optic devices that are smaller, faster, more energy efficient and lower cost,” Yan said. “Potential applications include optical modulators, optical switches, communication components, sensors and integrated photonics.”  

Liu’s contributions to this research were supported by the U.S. Department of Energy and the Dorothy Pate Enright Professorship at Penn State. Full collaborator and funding details may be found in the paper.

Poor health is the primary driver of involuntary retirement, study finds



State vocational rehabilitation agencies may help individuals who otherwise would be forced to retire remain in the workforce longer



Penn State




UNIVERSITY PARK, Pa. — Poor health is the primary reason why more than half of middle-aged workers in the United States retire earlier than planned, but state vocational rehabilitation agencies that provide career assistance to individuals with disabilities may be able to help potential retirees remain in the workforce longer, according to a study led by Allison Fleming, associate professor of education at Penn State.

Fleming analyzed anonymized responses from 12,712 participants who reported a forced or voluntary retirement as part of the University of Michigan’s Health and Retirement Study, a 28-year study of U.S. adults aged 55 and over, to understand what factors drove them to leave the workforce earlier than they planned. She found that poor health was the primary driver behind forced retirements. She reported the study findings in the journal Rehabilitation Counseling Bulletin. The study is available as an online first article prior to print publication.

“Disability is something that can happen to anyone at any time,” Fleming said. “In the U.S., a greater proportion of working-age adults are over 40, and we have more people who will be entering their retirement years. But as a group, we are seeing higher trends of onset of chronic illness and disability — or CID — in midlife that could potentially be disabling for some workers. The consequences of being forced out of the workforce before you plan to can be really devastating, especially financially.”

Middle adulthood, ranging approximately from ages 45 to 60, marks a time of peak earnings potential and career success for many American workers. But this stage of life is also when many individuals experience the onset of CID. As a former vocational counselor for the state of Massachusetts, Fleming observed that workers who incurred CID in middle age had a difficult time re-entering the workforce.

“Many workers are not prepared financially to retire 10 years earlier than planned, and on average it takes two years from time of application to receive Social Security disability benefits, during which time the individual cannot work,” Fleming said. “I don’t know many people who could financially survive two years of not working.”

In addition to the financial hit, there may be health costs associated with an involuntary retirement. Fleming cited other studies that found that individuals who are forced to retire report worse physical and mental health outcomes, such as depression and anxiety, than individuals who retire as planned or voluntarily retire to pursue other work or spend more time with family.

In terms of sociodemographic factors, Fleming found that racial or ethnic minorities were up to 55% more likely than white respondents to report an involuntary retirement. Individuals with a college degree or higher were 63% less likely than those without a high school education to say they had been forced to retire early, she reported.
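Figures like "55% more likely" and "63% less likely" typically come from exponentiated logistic-regression coefficients, that is, odds ratios. The sketch below shows that generic workflow; the data file, column names and model specification are hypothetical illustrations, not the study’s actual code or the Health and Retirement Study’s variables.

    # Hypothetical sketch: estimating odds ratios for involuntary retirement.
    # Data file and column names are illustrative only.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("retirement_survey.csv")            # hypothetical dataset

    y = df["forced_retirement"]                          # 1 = forced, 0 = voluntary
    X = sm.add_constant(
        df[["poor_health", "minority", "college_degree", "age", "female"]]
    )

    result = sm.Logit(y, X).fit()

    # Exponentiated coefficients are odds ratios: 1.55 reads as "55% higher
    # odds" and 0.37 as "63% lower odds" relative to the reference group.
    print(np.exp(result.params))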

“Folks in more privileged jobs may be in a position to ask for accommodations, get flexibility, have paid time off to attend medical appointments, things like that, versus folks who don't have access to those things, whose job means they only get paid for hours worked, who can't take time off to attend medical appointments or work in very high turnover industries where workers are considered replaceable,” Fleming said. "There's privilege within our labor market that made a lot of sense to me based on my previous experience as an employment counselor, and that was reflected in this data.”

State vocational rehabilitation agencies could be in a good position to help employees and employers address those issues, Fleming said.

“Sometimes we call rehabilitation counseling the best kept secret,” she said. "Every state and territory has one of these state vocational rehabilitation agencies. The organizations are 75% federally funded and 25% state funded. They employ rehabilitation counselors who are trained in disability and employment, and that person is available to meet with someone who's applying for services, decide if they're eligible, and if so, they have the capacity to help that person come up with an employment plan.”

Employment plans can include everything from advice on accommodations like schedule flexibility or having a stool to sit on instead of standing all shift to software solutions for individuals who use computers for work but have visual impairments. Counselors may speak with employers about worker accommodations and even purchase equipment on the company’s behalf, Fleming explained.

“Trying to find a job when you’re an older worker and you have a disability or health condition is really difficult, no matter how talented or qualified you are,” she said. “I would hope that these state rehabilitation agencies would try to raise their profile a little bit, maybe partner with doctors or medical professionals who are diagnosing people in midlife with chronic conditions to say, ‘Hey, work might not be a concern for you now, but it may become one, and if so, here's an organization who could help.’”

Study puts spotlight on assessment tool used to predict deadly domestic abuse


Researchers say a risk evaluation aimed at helping protect women from domestic violence is working but could be improved to help more victims.




Swansea University




A risk evaluation aimed at helping protect women from domestic violence is working but could be improved to help more victims, according to researchers.

The Domestic Abuse, Stalking, Harassment and Honour-Based Violence (DASH) tool is used by police forces throughout the UK to assess risk of future domestic abuse.

Earlier this year Parliamentary Under-Secretary of State for Safeguarding and Violence Against Women and Girls Jess Phillips told the BBC's File on 4 that the DASH doesn't work, amid mounting evidence that it fails to correctly identify those at the highest risk of further harm.

But new research has demonstrated that DASH is an effective predictor of deadly domestic abuse, although it could be improved to help victims and the frontline staff who use it.

A team from Swansea and Cardiff universities collaborated with Dyfed-Powys Police to examine 1,000 cases of domestic violence from a total of 25,000 incidents. They compared the use of DASH in cases where the perpetrator went on to commit deadly violence – such as murder – to those where the perpetrator did not commit any further domestic abuse.

They found that people who went on to commit a deadly act of violence were five times more likely to have received a high-risk rating. This result is comparable to assessment schemes used in other countries and shows that DASH does work as intended.

For their study, the researchers went on to examine DASH in further detail, increasing its predictive ability by carefully selecting just a few of the elements from the scheme that were most powerful at identifying deadly perpetrators. This may pave the way for a more succinct but more accurate version which would not only reduce time and resources for completion of the DASH by frontline staff but also increase its efficacy.
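As a rough illustration of the kind of item-selection exercise described here, and not the authors’ actual analysis or data, one could fit a penalised logistic regression to the individual DASH items and keep only those that retain non-zero weight:

    # Hypothetical sketch of picking the most predictive DASH items.
    # File, column names and the penalty strength are illustrative only.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    df = pd.read_csv("dash_cases.csv")                    # hypothetical cases
    items = [c for c in df.columns if c.startswith("dash_q")]
    X, y = df[items], df["deadly_outcome"]                # 1 = later deadly violence

    # An L1 penalty drives the coefficients of uninformative items to zero,
    # leaving a shorter list of items that carry most of the predictive power.
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    clf.fit(X, y)

    selected = [item for item, w in zip(items, clf.coef_[0]) if w != 0]
    print("Retained items:", selected)
    print("AUC of the short form:", roc_auc_score(y, clf.decision_function(X)))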

Their findings have just been published by the international journal Forensic Sciences.

Professor Nicola Gray, from the School of Psychology, emphasised that just predicting future violence is not DASH’s main objective.

She said: “The aim of risk assessment is not merely to predict but to prevent such violence. However, all organisations work with limited resources and need to know how to target those that need the most help. It is essential that the police and other frontline workers have evidence-based practices that help them in this process.”

Lead author Karen Caulfield said: “While we have shown that DASH is working as well as other risk assessment schemes, it needs to be emphasised that no scheme can ever perfectly predict the future. There will be cases where the instrument, or human judgment, indicates minor risk, but then a terrible event occurs. Such cases are devastating for those involved and attract attention.

“However, the cases where we correctly manage the risk attract no such attention. This gives the impression that the schemes are poor when they are actually functioning well. We must continue to improve these practices to help safeguard victims.”

Dyfed-Powys Assistant Chief Constable Andrew Edwards added: “The force has committed itself to the highest standard of policing in cases of domestic abuse. Our officers are determined to help safeguard those at risk. As an organisation we are happy to have helped in facilitating this investigation and providing evidence of the efficacy of DASH.”

The smoky signature of climate change


Study quantifies role of a warming climate in wildfires and air quality




Harvard John A. Paulson School of Engineering and Applied Sciences



Key Takeaways

  • A new Harvard study pinpoints human-caused climate change as a key driver of western U.S. wildfire activity over the last three decades, accounting for 65% of total fire emissions from 1997-2020.
  • Nearly half of the exposure to harmful fine-particulate matter wildfire smoke, or PM2.5, from 1997-2020 is directly linked to climate change.
  • The study highlights the need for stronger land-management strategies such as prescribed burning to mitigate extreme wildfires.

Across the western U.S., wildfires and the dangerous smoke that results have increased in frequency and intensity since the 1990s – that much is clear. Surprisingly less clear are the exact reasons why: While greenhouse gas-related global warming is often cited as a culprit, to what extent can this claim be quantified? 

Atmospheric chemists and wildfire experts in the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have authored one of the most rigorous estimates to date of just how much of the fire damage and hazardous smoke of the last three decades is directly attributable to the warming temperatures and drier conditions caused by climate change. 

Researchers led by Loretta Mickley, senior research fellow in chemistry-climate interactions and leader of the Atmospheric Chemistry Modeling Group at Harvard SEAS, report in Proceedings of the National Academy of Sciences that climate change directly accounts for 60-82% of total burned area in western U.S. forests and 33% in central and southern California since the early 1990s. On average, that’s 65% of total fire emissions in the western U.S. between 1997 and 2020. 

In turn, from 1997 to 2020, nearly half of the most dangerous type of wildfire smoke in the western U.S. – fine particulate matter, or PM2.5, named for particles 2.5 micrometers or smaller, tiny enough to penetrate deep into the lungs and enter the bloodstream – can be traced directly back to climate change. From 2010 to 2020, climate change explains 58% of the increase in this type of smoke pollution. 

A combination of observations, machine learning models, and large climate models netted these conclusions, with the Harvard team’s analysis clearly showing how changing weather conditions and rising temperatures have affected the behavior of wildfires since the 1990s. The researchers also used a chemical transport model, GEOS-Chem, to estimate how much smoke PM2.5 is attributable to human-driven climate change. 

“Our goal was to quantify how much climate change has amplified and exacerbated smoke exposure in the western U.S.,” Mickley said. “Our hope is that this work will spur efforts to think more deeply about how we manage land and wildfires in the western U.S., as well as greater interest toward doing something about the increase in greenhouse gases.” 

To make their claims, the researchers first mapped a series of ecosystems across the western U.S., such as northwest forested mountains, Mediterranean California, and the cold deserts of the interior, and compiled decades of records on weather, levels of vegetation, and areas burned in each. Machine learning models helped them deduce how temperature, humidity, and aridity of vegetation translated into fire activity. 
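The attribution logic can be pictured as a counterfactual comparison: learn how observed weather and fuel dryness translate into burned area, then re-run that relationship with the human-caused warming and drying signal removed. The sketch below is only a schematic of that idea; it is not the authors’ pipeline (which also draws on large climate-model ensembles and the GEOS-Chem transport model for smoke PM2.5), and every file and column name is hypothetical.

    # Schematic counterfactual attribution of burned area to climate change.
    # Data files, columns and the counterfactual meteorology are hypothetical.
    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor

    obs = pd.read_csv("ecoregion_fire_weather.csv")       # observed records
    features = ["temperature", "vapor_pressure_deficit", "fuel_aridity"]

    # 1. Learn the statistical link between meteorology and burned area.
    model = RandomForestRegressor(n_estimators=500, random_state=0)
    model.fit(obs[features], obs["burned_area_km2"])

    # 2. Predict burned area under meteorology with the anthropogenic
    #    warming/drying trend removed (e.g. derived from climate models).
    cf = pd.read_csv("counterfactual_meteorology.csv")
    burned_actual = model.predict(obs[features]).sum()
    burned_counterfactual = model.predict(cf[features]).sum()

    # 3. Fraction of burned area attributable to climate change.
    attributable = (burned_actual - burned_counterfactual) / burned_actual
    print(f"Climate-attributable fraction of burned area: {attributable:.0%}")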

The analysis confirmed that pollution from other sources, such as factories, dropped significantly – around 44% – from 1997 to 2020, highlighting the success of the Clean Air Act. Yet wildfire smoke has seen an opposing trend, increasing steadily during that time. 

Hardest hit with wildfire smoke were areas that included northern California and parts of Oregon, Washington, and Idaho, where climate-driven smoke made up 44-66% of total PM2.5 between 2010 and 2020. 

“If you lived in these areas from 2010 to 2020, at least half of all of the fine PM you were breathing came from smoke,” Mickley said. 

In ongoing work, the team is trying to quantify how much the 20th-century legacy of fire suppression may have amplified the effects of climate change on wildfire activity today. Accumulated underbrush and increasingly dense forests have likely provided more fuel for these fires and therefore contributed to increased smoke exposures. 

Mickley and others’ work continues to underscore the urgent need for land managers and communities in the western U.S. to deploy counteractive measures, notably prescribed burning in the most fire-prone areas. Prescribed burning can clear out underbrush and limit the spread of potentially catastrophic fires in the future.

The research was supported by the Modeling, Analysis, Prediction, and Projection program of the Climate Program Office, part of the National Oceanic and Atmospheric Administration (Grant No. NA22OAR4310140). The paper’s first author is former Harvard postdoctoral researcher Xu Feng. Co-authors are Jed O. Kaplan, Makoto Kelp, Yang Li, and Tianjia Liu. 

Mickley will present the research at the Jacob Bjerknes Lecture, part of the American Geophysical Union’s December meeting.