Wednesday, July 30, 2025

 

TraMA: new RNA-based measure predicts mortality risk and tracks aging



“TraMA is likely to be of particular value to researchers interested in understanding the biological processes underlying health and aging, and for social, psychological, epidemiological, and demographic studies of health and aging”



Impact Journals LLC

Development of a novel transcriptomic measure of aging: Transcriptomic Mortality-risk Age (TraMA) 

image: 

Figure 1. (A) Plan of analysis for the current study. (B) Nested regression results from the HRS testing data including associations between TraMA and sociodemographic factors and health behaviors; points represent regression coefficients and bars represent 95% confidence intervals; all models include cell type and batch as covariates. Model 1 includes demographic factors; Model 2 includes variables in Model 1, as well as socioeconomic factors; Model 3 includes variables in Model 2, as well as health behaviors. (C) Regression results from the HRS testing data of health/aging outcomes on TraMA; points represent regression coefficients and bars represent 95% confidence intervals; all models include age, race/ethnicity, sex/gender, cell type, and batch as covariates. (D) Validation results from nested regression of time to death on TraMA in HRS and LLFS. Model 1 includes batch as a covariate; Model 2 includes batch, age, race/ethnicity, and sex/gender as covariates; Model 3 includes variables from Model 2, as well as RNA-based cell type as covariates.


Credit: Copyright: © 2025 Klopack et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.




BUFFALO, NY — July 28, 2025 — A new research paper was published in Aging (Aging-US) Volume 17, Issue 6, on June 13, 2025, titled “Development of a novel transcriptomic measure of aging: Transcriptomic Mortality-risk Age (TraMA).”

In this study, led by Eric T. Klopack from the University of Southern California, researchers created a new RNA-based aging measure that predicts health risks and mortality. This measure, called Transcriptomic Mortality-risk Age (TraMA), uses gene expression data to estimate a person’s biological age. The result offers a new and potentially more accurate way to track aging and understand health risks, especially for older adults.

Aging is a complex biological process that affects multiple systems in the body and increases the risk of disease and death. Scientists have long looked for reliable ways to measure biological aging. While DNA methylation and blood biomarkers are commonly used, this study focused on RNA—a molecule that reflects gene activity. By analyzing RNA sequencing data from nearly 4,000 U.S. adults aged 50 and older, the team developed TraMA to predict the probability of dying within four years.
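In broad strokes, a measure like this is built by regressing an outcome (here, death within four years) on many gene-expression features with a sparsity-inducing penalty, so that only informative transcripts survive. The sketch below is purely illustrative, on synthetic data; the paper's actual gene set, penalty choice, and survival model are not reproduced here.

```python
# Illustrative sketch of a TraMA-style workflow (assumed, not the authors' code):
# penalized regression on gene-expression features predicting 4-year mortality.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_people, n_genes = 500, 200
X = rng.normal(size=(n_people, n_genes))        # normalized expression levels
true_w = np.zeros(n_genes)
true_w[:10] = 0.8                               # a handful of genes drive risk
p = 1 / (1 + np.exp(-(X @ true_w - 1.5)))       # probability of death within 4 years
y = rng.binomial(1, p)                          # observed 4-year mortality

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
# L1 penalty zeroes out uninformative genes, mimicking feature selection.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
model.fit(X_tr, y_tr)
risk = model.predict_proba(X_te)[:, 1]          # TraMA-like mortality-risk score
auc = roc_auc_score(y_te, risk)
```

The held-out risk scores can then be treated as a biological-age-style measure and regressed on demographics, health behaviors, and outcomes, as in the paper's validation models.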

TraMA proved to be a strong and independent predictor of early death, multiple chronic diseases, poor cognitive function, and difficulties with daily activities. It was also tested in another large group of long-lived families and in several smaller datasets from patients with conditions like diabetes, sepsis, and cancer. The results confirmed the tool’s usefulness across different populations and health conditions.

“TraMA was also externally validated in the Long Life Family Study and several publicly available datasets.”

Unlike earlier RNA-based aging measures, which were often built using small or non-representative samples, TraMA was developed using modern RNA sequencing technology and a nationally representative dataset. This increases its reliability and potential for broad public health applications. The tool also demonstrated unique advantages over popular biological aging measures like GrimAge and PhenoAge, capturing distinct aspects of aging and health decline.

Importantly, TraMA tracks biological processes related to inflammation, immune function, and kidney and brain health, systems that play key roles in aging. It was also sensitive to behavioral and socioeconomic factors. For instance, smoking, obesity, and low physical activity were associated with older TraMA scores.

TraMA was also sensitive to changes in biological aging. In one study, researchers measured TraMA at two different time points and found that the more recent scores were better at predicting who would die. This suggests that TraMA can track changes in a person’s aging as their health evolves. It also performed well in both large-scale surveys and small clinical samples, making it a useful tool in many types of research.

By offering a new, accurate, and flexible method for measuring biological aging, TraMA may help researchers better understand how genes, lifestyle, and environment influence aging. This tool opens the door to more precise research on improving health and extending lifespan.

Read the full paper: DOI: https://doi.org/10.18632/aging.206272

Corresponding author: Eric T. Klopack – klopack@usc.edu

Keywords: aging, biological aging, transcriptomics, mortality, accelerated aging, machine learning

Click here to sign up for free Altmetric alerts about this article.

______

To learn more about the journal, please visit our website at www.Aging-US.com and connect with us on social media.

Click here to subscribe to Aging publication updates.

For media inquiries, please contact media@impactjournals.com.

 

Pedestrians now walk faster and linger less, researchers find


A computer vision study compares changes in pedestrian behavior since 1980, providing information for urban designers about creating public spaces




Massachusetts Institute of Technology




City life is often described as “fast-paced.” A new study suggests that’s more true than ever.

The research, co-authored by MIT scholars, shows that the average walking speed of pedestrians in three northeastern U.S. cities increased 15 percent from 1980 to 2010. The number of people lingering in public spaces declined by 14 percent in that time as well. 

The researchers used machine-learning tools to assess 1980s-era video footage captured by renowned urbanist William Whyte, in Boston, New York, and Philadelphia. They compared the old material with newer videos from the same locations. 

“Something has changed over the past 40 years,” says MIT professor of the practice Carlo Ratti, a co-author of the new study. “How fast we walk, how people meet in public space — what we’re seeing here is that public spaces are working in somewhat different ways, more as a thoroughfare and less a space of encounter.”

The paper, “Exploring the social life of urban spaces through AI,” is published this week in the Proceedings of the National Academy of Sciences. The co-authors are Arianna Salazar-Miranda MCP ’16, PhD ’23, an assistant professor at Yale University’s School of the Environment; Zhuanguan Fan of the University of Hong Kong; Michael Baick; Keith N. Hampton, a professor at Michigan State University; Fabio Duarte, associate director of the Senseable City Lab; Becky P.Y. Loo of the University of Hong Kong; Edward Glaeser, the Fred and Eleanor Glimp Professor of Economics at Harvard University; and Ratti, who is also director of MIT’s Senseable City Lab. 

The results could help inform urban planning, as designers seek to create new public areas or modify existing ones. 

“Public space is such an important element of civic life, and today partly because it counteracts the polarization of digital space,” says Salazar-Miranda. “The more we can keep improving public space, the more we can make our cities suited for convening.”

Meet you at the Met

Whyte was a prominent social thinker whose famous 1956 book, “The Organization Man,” probing the apparent culture of corporate conformity in the U.S., became a touchstone of its decade. 

However, Whyte spent the latter decades of his career focused on urbanism. The footage he filmed, from 1978 through 1980, was archived by a Brooklyn-based nonprofit organization called the Project for Public Spaces and later digitized by Hampton and his students. 

Whyte filmed at four locations across the three cities: Boston’s Downtown Crossing area; New York City’s Bryant Park; the steps of the Metropolitan Museum of Art in New York, a famous gathering point and people-watching spot; and Philadelphia’s Chestnut Street.

In 2010, a group led by Hampton shot new footage at those locations, at the same times of day Whyte had, to compare and contrast current-day dynamics with those of Whyte’s time. To conduct the study, the co-authors used computer vision and AI models to summarize and quantify the activity in the videos.
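Once a tracking model has turned video into per-frame pedestrian positions, the headline metric, average walking speed, falls out of simple differencing. The snippet below is an illustrative sketch of that last step only (the tracking pipeline itself is not shown, and the function name and coordinate conventions are assumptions, not the authors' code).

```python
# Illustrative: compute average walking speed from a tracked trajectory.
import numpy as np

def mean_walking_speed(track_xy, fps):
    """track_xy: (n_frames, 2) ground-plane positions in meters; returns m/s."""
    steps = np.diff(track_xy, axis=0)         # per-frame displacement vectors
    dists = np.linalg.norm(steps, axis=1)     # meters moved between frames
    return dists.sum() / (len(dists) / fps)   # total distance / elapsed time

# A pedestrian moving 1.5 m/s along x, sampled at 30 fps for 2 seconds:
t = np.arange(60) / 30
track = np.stack([1.5 * t, np.zeros_like(t)], axis=1)
speed = mean_walking_speed(track, fps=30)
```

Aggregating such per-track speeds across the 1980 and 2010 footage is what allows a claim like "15 percent faster" to be made.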

The researchers found that some things have not changed greatly. The percentage of people walking alone barely moved, from 67 percent in 1980 to 68 percent in 2010. On the other hand, the percentage of individuals entering these public spaces who became part of a group declined a bit. In 1980, 5.5 percent of the people approaching these spots met up with a group; in 2010, that was down to 2 percent.

“Perhaps there’s a more transactional nature to public space today,” Ratti says.

Fewer outdoor groups: Anomie or Starbucks?

If people’s behavioral patterns have altered since 1980, it’s natural to ask why. Certainly some of the visible changes seem consistent with the pervasive use of cellphones; people organize their social lives by phone now, and perhaps zip around more quickly from place to place as a result. 

“When you look at the footage from William Whyte, the people in public spaces were looking at each other more,” Ratti says. “It was a place you could start a conversation or run into a friend. You couldn’t do things online then. Today, behavior is more predicated on texting first, to meet in public space.”

As the scholars note, if groups of people hang out together slightly less often in public spaces, there could be still another reason for that: Starbucks and its competitors. As the paper states, outdoor group socializing may be less common due to “the proliferation of coffee shops and other indoor venues. Instead of lingering on sidewalks, people may have moved their social interactions into air-conditioned, more comfortable private spaces.” 

Certainly coffee shops were far less common in big cities in 1980, and the big chain coffee shops did not exist.

On the other hand, public-space behavior might have been evolving all this time regardless of Starbucks and the like. The researchers say the new study offers a proof-of-concept for its method and has encouraged them to conduct additional work. Ratti, Duarte, and other researchers from MIT’s Senseable City Lab have turned their attention to an extensive survey of European public spaces in an attempt to shed more light on the interaction between people and urban form.

“We are collecting footage from 40 squares in Europe,” Duarte says. “The question is: How can we learn at a larger scale? This is in part what we’re doing.” 

###

Written by Peter Dizikes, MIT News

Paper: “Exploring the social life of urban spaces through AI”

https://www.pnas.org/doi/10.1073/pnas.2424662122


Famous double-slit experiment holds up when stripped to its quantum essentials




MIT physicists confirm that, like Superman, light has two identities that are impossible to see at once




Massachusetts Institute of Technology

Atomic Double Slit 

image: 

Schematic of the MIT experiment: Two single atoms floating in a vacuum chamber are illuminated by a laser beam and act as the two slits. The interference of the scattered light is recorded with a highly sensitive camera depicted as a screen. Incoherent light appears as background and implies that the photon has acted as a particle passing only through one slit.


Credit: Courtesy of Wolfgang Ketterle, Vitaly Fedoseev, Hanzhen Lin, Yu-Kun Lu, Yoo Kyung Lee, and Jiahao Lyu





MIT physicists have performed an idealized version of one of the most famous experiments in quantum physics. Their findings demonstrate, with atomic-level precision, the dual yet evasive nature of light. They also happen to confirm that Albert Einstein was wrong about this particular quantum scenario. 

The experiment in question is the double-slit experiment, first performed in 1801 by the British scholar Thomas Young to show how light behaves as a wave. Today, following the formulation of quantum mechanics, the double-slit experiment is known for its surprisingly simple demonstration of a head-scratching reality: that light exists as both a particle and a wave. Stranger still, this duality cannot be simultaneously observed. Seeing light in the form of particles instantly obscures its wave-like nature, and vice versa.

The original experiment involved shining a beam of light through two parallel slits in a screen and observing the pattern that formed on a second, faraway screen. One might expect to see two overlapping spots of light, which would imply that light exists as particles, a.k.a. photons, like paintballs that follow a direct path. But instead, the light produces alternating bright and dark stripes on the screen, in an interference pattern similar to what happens when two ripples in a pond meet. This suggests light behaves as a wave. Even weirder, when one tries to measure which slit the light is traveling through, the light suddenly behaves as particles and the interference pattern disappears. 

The double-slit experiment is taught today in most high school physics classes as a simple way to illustrate the fundamental principle of quantum mechanics: that all physical objects, including light, are simultaneously particles and waves. 

Nearly a century ago, the experiment was at the center of a friendly debate between physicists Albert Einstein and Niels Bohr. In 1927, Einstein argued that a photon particle should pass through just one of the two slits and in the process generate a slight force on that slit, like a bird rustling a leaf as it flies by. He proposed that one could detect such a force while also observing an interference pattern, thereby catching light’s particle and wave nature at the same time. In response, Bohr applied the quantum mechanical uncertainty principle and showed that the detection of the photon’s path would wash out the interference pattern.

Scientists have since carried out multiple versions of the double-slit experiment, and they have all, to various degrees, confirmed the validity of the quantum theory formulated by Bohr. Now, MIT physicists have performed the most “idealized” version of the double-slit experiment to date. Their version strips down the experiment to its quantum essentials. They used individual atoms as slits, and used weak beams of light so that each atom scattered at most one photon. By preparing the atoms in different quantum states, they were able to modify what information the atoms obtained about the path of the photons. The researchers thus confirmed the predictions of quantum theory: The more information was obtained about the path (i.e. the particle nature) of light, the lower the visibility of the interference pattern was.   
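Bohr's qualitative argument has a standard quantitative form, the wave-particle duality relation (stated here as textbook background, not as the paper's own derivation): if $D$ measures how well the photon's path can be distinguished and $V$ is the visibility (contrast) of the interference fringes, then

$$ D^2 + V^2 \le 1. $$

Full path information ($D = 1$) forces the fringes to vanish ($V = 0$), and perfect fringes force $D = 0$; the MIT experiment probes the whole tradeoff in between.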

They demonstrated what Einstein got wrong. Whenever an atom is “rustled” by a passing photon, the wave interference is diminished. 

“Einstein and Bohr would have never thought that this is possible, to perform such an experiment with single atoms and single photons,” says Wolfgang Ketterle, the John D. MacArthur Professor of Physics and leader of the MIT team. “What we have done is an idealized Gedanken experiment.” 

Their results appear in the journal Physical Review Letters. Ketterle’s MIT co-authors include first author Vitaly Fedoseev, Hanzhen Lin, Yu-Kun Lu, Yoo Kyung Lee, and Jiahao Lyu, who all are affiliated with MIT’s Department of Physics, the Research Laboratory of Electronics, and the MIT-Harvard Center for Ultracold Atoms.

Cold confinement

Ketterle’s group at MIT experiments with atoms and molecules that they super-cool to temperatures just above absolute zero and arrange in configurations that they confine with laser light. Within these ultracold, carefully tuned clouds, exotic phenomena that only occur at the quantum, single-atom scale can emerge. 

In a recent experiment, the team was investigating a seemingly unrelated question, studying how light scattering can reveal the properties of materials built from ultracold atoms.  

“We realized we can quantify the degree to which this scattering process is like a particle or a wave, and we quickly realized we can apply this new method to realize this famous experiment in a very idealized way,” Fedoseev says. 

In their new study, the team worked with more than 10,000 atoms, which they cooled to microkelvin temperatures. They used an array of laser beams to arrange the frozen atoms into an evenly spaced, crystal-like lattice configuration. In this arrangement, each atom is far enough away from any other atom that each can effectively be considered a single, isolated and identical atom. And 10,000 such atoms can produce a signal that is more easily detected, compared to a single atom or two. 

The group reasoned that with this arrangement, they might shine a weak beam of light through the atoms and observe how a single photon scatters off two adjacent atoms, as a wave or a particle. This would be similar to how, in the original double-slit experiment, light passes through two slits. 

“What we have done can be regarded as a new variant to the double-slit experiment,” Ketterle says. “These single atoms are like the smallest slits you could possibly build.”

Tuning fuzz

Working at the level of single photons required repeating the experiment many times and using an ultrasensitive detector to record the pattern of light scattered off the atoms. From the intensity of the detected light, the researchers could directly infer whether the light behaved as a particle or a wave.

They were particularly interested in the situation where half the photons they sent in behaved as waves, and half behaved as particles. They achieved this by using a method to tune the probability that a photon will appear as a wave versus a particle, by adjusting an atom’s “fuzziness,” or the certainty of its location. In their experiment, each of the 10,000 atoms is held in place by laser light that can be adjusted to tighten or loosen the light’s hold. The more loosely an atom is held, the fuzzier, or more “spatially extensive,” it appears. The fuzzier atom rustles more easily and records the path of the photon. Therefore, in tuning up an atom’s fuzziness, researchers can increase the probability that a photon will exhibit particle-like behavior. Their observations were in full agreement with the theoretical description.
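The role of fuzziness can be pictured with a Debye-Waller-type estimate (a standard textbook relation offered as illustration, not the paper's exact expression): for an atom in a Gaussian wavepacket of position spread $\Delta x$, and a scattering event with photon momentum transfer $\hbar q$, the coherently scattered, interference-forming fraction of the light scales as

$$ f_{\text{coh}} = \left|\langle e^{iqx} \rangle\right|^2 = e^{-q^2 \Delta x^2}, $$

so loosening the trap (larger $\Delta x$) suppresses the fringes while making the photon's path more detectable, consistent with the tuning described above.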

Springs away

In their experiment, the group tested Einstein’s idea about how to detect the path of the photon. Conceptually, if each slit were cut into an extremely thin sheet of paper that was suspended in the air by a spring, a photon passing through one slit should shake the corresponding spring by a certain degree that would be a signal of the photon’s particle nature. In previous realizations of the double-slit experiment, physicists have incorporated such a spring-like ingredient, and the spring played a major role in describing the photon’s dual nature.

But Ketterle and his colleagues were able to perform the experiment without the proverbial springs. The team’s cloud of atoms is initially held in place by laser light, similar to Einstein’s conception of a slit suspended by a spring. The researchers reasoned that if they were to do away with their “spring,” and observe exactly the same phenomenon, then it would show that the spring has no effect on a photon’s wave/particle duality. 

This, too, was what they found. Over multiple runs, they turned off the spring-like laser holding the atoms in place and then quickly took a measurement within a millionth of a second, before the atoms became fuzzier and eventually fell down due to gravity. In this tiny amount of time, the atoms were effectively floating in free space. In this spring-free scenario, the team observed the same phenomenon: A photon’s wave and particle nature could not be observed simultaneously.

“In many descriptions, the springs play a major role. But we show, no, the springs do not matter here; what matters is only the fuzziness of the atoms,” Fedoseev says. “Therefore, one has to use a more profound description, which uses quantum correlations between photons and atoms.”

The researchers note that the year 2025 has been declared by the United Nations as the International Year of Quantum Science and Technology, celebrating the formulation of quantum mechanics 100 years ago. The discussion between Bohr and Einstein about the double-slit experiment took place only two years later. 

“It’s a wonderful coincidence that we could help clarify this historic controversy in the same year we celebrate quantum physics,” says co-author Lee.

This work was supported, in part, by the National Science Foundation, the U.S. Department of Defense, and the Gordon and Betty Moore Foundation.

###

Written by Jennifer Chu, MIT News

Paper: “Coherent and incoherent light scattering by single-atom wavepackets”

https://journals.aps.org/prl/abstract/10.1103/zwhd-1k2t

 

New spectrometer is small enough to fit on your phone



North Carolina State University
A spectrometer smaller than your thumbnail 

image: 

This photo shows a series of prototype organic photodetector-based spectrometer cells. Each metal bar is a detector capable of measuring light spectra.


Credit: Brendan O'Connor, NC State University




Researchers have successfully demonstrated a spectrometer that is orders of magnitude smaller than current technologies and can accurately measure wavelengths of light from ultraviolet to the near-infrared. The technology makes it possible to create hand-held spectroscopy devices and holds promise for the development of devices that incorporate an array of the new sensors to serve as next-generation imaging spectrometers.

“Spectrometers are critical tools for helping us understand the chemical and physical properties of various materials based on how light changes when it interacts with those materials,” says Brendan O’Connor, corresponding author of a paper on the work and a professor of mechanical and aerospace engineering at North Carolina State University. “They are used in applications that range from manufacturing to biomedical diagnostics. However, the smallest spectrometers on the market are still fairly bulky.

“We’ve created a spectrometer that operates quickly, at low voltage, and that is sensitive to a wide spectrum of light,” O’Connor says. “Our demonstration prototype is only a few square millimeters in size – it could fit on your phone. You could make it as small as a pixel, if you wanted to.”

The technology makes use of a tiny photodetector capable of sensing wavelengths of light after the light interacts with a target material. By applying different voltages to the photodetector, you can manipulate which wavelengths of light the photodetector is most sensitive to.

“If you rapidly apply a range of voltages to the photodetector, and measure all of the wavelengths of light being captured at each voltage, you have enough data that a simple computational program can recreate an accurate signature of the light that is passing through or reflecting off of the target material,” O’Connor says. “The range of voltages is less than one volt, and the entire process can take place in less than a millisecond.”
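The "simple computational program" described above amounts to a linear inverse problem: each bias voltage gives the detector a different spectral response, and stacking those responses into a matrix lets a regularized least-squares solve recover the incident spectrum from the per-voltage photocurrents. The toy below illustrates the idea on synthetic data; the Gaussian response curves, dimensions, and regularization are assumptions, not the device's measured characteristics.

```python
# Toy voltage-swept spectral reconstruction (illustrative, synthetic data).
import numpy as np

rng = np.random.default_rng(1)
n_wavelengths, n_voltages = 100, 40
wl = np.linspace(350, 950, n_wavelengths)            # nm, UV to near-infrared

# Assumed smooth, voltage-dependent responsivity curves (Gaussians here).
centers = np.linspace(380, 920, n_voltages)
R = np.exp(-((wl[None, :] - centers[:, None]) / 60.0) ** 2)

true_spectrum = np.exp(-((wl - 600) / 40.0) ** 2)    # incident light signature
currents = R @ true_spectrum + rng.normal(0, 1e-3, n_voltages)

# Ridge-regularized normal equations keep the inversion stable against noise.
recovered = np.linalg.solve(
    R.T @ R + 1e-3 * np.eye(n_wavelengths), R.T @ currents
)
```

In a real device the response matrix would come from calibration measurements rather than an analytic model, but the reconstruction step is the same small linear solve, which is why it can run in under a millisecond.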

Previous attempts to create miniaturized spectrometers have relied on complex optics, used high voltages, or have not been as sensitive to such a broad range of wavelengths.

In proof-of-concept testing, the researchers found their pixel-sized spectrometer was as accurate as a conventional spectrometer and had sensitivity comparable to commercial photodetection devices.

“In the long term, our goal is to bring spectrometers to the consumer market,” O’Connor says. “The size and energy demand of the technology make it feasible to incorporate into a smartphone, and we think this makes some exciting applications possible. From a research standpoint, this also paves the way for improved access to imaging spectroscopy, microscopic spectroscopy, and other applications that would be useful in the lab.”

The paper, “Single pixel spectrometer based on a bias-tunable tandem organic photodetector,” is published in the journal Device. First author of the paper is Harry Schrickx, a former Ph.D. student at NC State. The paper was co-authored by Abdullah Al Shafe, a former Ph.D. student at NC State; Caleb Moore, a former undergraduate at NC State; Yusen Pei, a Ph.D. student at NC State; Franky So, the Walter and Ida Freeman Distinguished Professor of Materials Science and Engineering at NC State; and Michael Kudenov, the John and Catherine Amein Family Distinguished Professor of Electrical and Computer Engineering at NC State.

The work was done with support from the National Science Foundation under grants 1809753 and 2324190, and from the Office of Naval Research under grant N000142412101.

 

Mizzou using AI to improve energy efficiency for campuses nationwide



Researchers harness machine learning to transform energy management at large universities



University of Missouri-Columbia

University of Missouri Power Plant 

image: 

University of Missouri Power Plant


Credit: University of Missouri





University of Missouri researchers are using a type of artificial intelligence to revolutionize how large campuses — including Mizzou’s own — manage energy use. Their goal is to help institutions across the nation optimize efficiency and reduce operational costs while cutting carbon emissions.

In a recently published study, Sanjeev Khanna and Saad Alsamraee, researchers in the College of Engineering, analyzed six years of data collected from Mizzou’s Combined Cooling Heat and Power Plant between 2017 and 2022. The data included campuswide energy usage and environmental variables such as air temperature, humidity, wind speed/direction and solar radiation — all tracked hourly.

Using machine learning, the research team successfully forecasted the campus’s energy demands for 2023 with a remarkable accuracy of 94%.
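A forecasting setup of this kind typically regresses hourly load on weather and calendar features. The sketch below is a hypothetical illustration on synthetic data, not the study's model; the feature set, learner, and the reading of "94% accuracy" as 100% minus mean absolute percentage error are all assumptions.

```python
# Illustrative hourly campus-load forecast from weather + time-of-day features.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(2)
n_hours = 2000
hour = rng.integers(0, 24, n_hours)                 # hour of day
temp = rng.normal(15, 10, n_hours)                  # air temperature, C
humidity = rng.uniform(20, 100, n_hours)            # relative humidity, %
X = np.column_stack([hour, temp, humidity])

# Synthetic load: daily cycle plus heating/cooling demand away from 18 C.
load = (50 + 10 * np.sin(2 * np.pi * hour / 24)
        + 1.5 * np.abs(temp - 18) + rng.normal(0, 2, n_hours))

split = 1600                                        # train on past, test on future
model = GradientBoostingRegressor(random_state=0).fit(X[:split], load[:split])
pred = model.predict(X[split:])
mape = np.mean(np.abs(pred - load[split:]) / load[split:])
accuracy = 100 * (1 - mape)                         # one common accuracy reading
```

With six years of real hourly data, the same pattern, fit on history, score on a held-out year, is what lets a plant operator anticipate peaks and valleys hour by hour.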

“By knowing when there are going to be peaks and valleys and how much energy will be needed, even on an hour-by-hour basis, we can ultimately help power plants better plan ahead so they can be as efficient as possible with energy use,” Khanna said. “This research can help universities and industries reduce carbon emissions and save money.”

The ability to predict energy lulls also enables universities to better schedule maintenance breaks with minimal disruption.

While this study focused on Mizzou’s energy data, the findings show that machine learning can help power plants across the country operate more efficiently.

Khanna, a professor of mechanical engineering and the director of the Midwest Industrial Assessment Center, has received continuous support from the U.S. Department of Energy over the past 18 years. His work throughout Missouri has not only helped local industries reduce energy use and carbon output — it has also decreased the state’s electric demand by approximately 300 megawatts and delivered an economic impact exceeding $120 million, while reducing carbon emissions by more than 0.8 million tons annually.

“The collaboration at Mizzou sets us apart from other universities,” Khanna said. “The administration is very supportive of outside-the-box thinking, and working at a land-grant university has allowed our research to help industries all across the state.”

“High-resolution energy consumption forecasting of a university campus power plant based on advanced machine learning techniques” was published in Energy Strategy Reviews.