Tuesday, April 15, 2025

 

Conservative Americans consistently distrust science




Universiteit van Amsterdam





Conservative Americans appear to distrust science more broadly than previously thought. They distrust not only science that conflicts with their worldview: compared to liberal Americans, their trust is also lower in fields that contribute to economic growth and productivity. Short interventions aimed at strengthening trust have little effect. This is apparent from new research by social psychologists at the University of Amsterdam, now published in Nature Human Behaviour.

Science helps solve major societal problems, such as pandemics and climate change. But if people do not trust scientists, they will be less likely to accept scientific solutions. ‘In America, but also in other countries, conservatives generally have lower trust in science,’ says one of the researchers involved, Bastiaan Rutjens. ‘Since the 1980s, trust in science among conservatives in America has even been plummeting.’

Part of the explanation is that scientific findings do not align with conservatives' political or economic beliefs. ‘But science is also increasingly dismissed in some circles as a “leftist hobby”, and universities as strongholds of the leftist establishment,’ Rutjens adds. The researchers wanted to gain more insight into how trust varies across scientific fields and whether short interventions could strengthen trust.

Research among almost 8,000 Americans
The researchers asked 7,800 Americans about their views on 35 different scientific professions, ranging from anthropologists to biologists and atomic physicists. They examined the differences between people who identified as conservative or liberal.
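The paper's full analysis is more elaborate, but the core comparison can be pictured as a per-profession gap in mean trust between the two groups. Below is a minimal sketch with invented ratings and a standard Welch's t-test, which is not necessarily the authors' statistical model:

```python
import pandas as pd
from scipy import stats

# Invented ratings: one 1-7 trust score per respondent per profession
ratings = pd.DataFrame({
    "ideology":   ["liberal", "liberal", "conservative", "conservative"] * 2,
    "profession": ["climate scientist"] * 4 + ["industrial chemist"] * 4,
    "trust":      [6, 7, 3, 4, 6, 5, 5, 5],
})

for profession, group in ratings.groupby("profession"):
    lib = group.loc[group["ideology"] == "liberal", "trust"]
    con = group.loc[group["ideology"] == "conservative", "trust"]
    t, p = stats.ttest_ind(lib, con, equal_var=False)  # Welch's t-test
    print(f"{profession}: gap = {lib.mean() - con.mean():+.2f} (t = {t:.2f}, p = {p:.3g})")
```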

They then tested five interventions aimed at increasing trust in scientists specifically among conservatives. These interventions addressed the reasons why people may distrust science: its perceived misalignment with moral values or the idea that scientists are not part of their in-group. The interventions highlighted how scientific results were actually in line with conservative beliefs or showcased conservative scientists.

Less trust across all fields
Liberals were found to have more trust than conservatives in all 35 scientific professions that were examined—not just in fields that align with their priorities, such as climate change or inclusion, but also in areas focused on industry. However, the differences in levels of trust were not entirely uniform, with levels varying depending on the scientific field.

The gap was particularly large for climate scientists, medical researchers and social scientists. ‘This is likely because findings in these fields often conflict with conservative beliefs, such as a free-market economy or conservative social policies,’ Rutjens explains.

The trust gap was smaller in technical and applied disciplines, such as industrial chemistry. ‘These fields are more focused on economic growth and productivity,’ Rutjens adds. ‘But it remains striking that even here, conservatives show lower trust. Their distrust extends across science as a whole.’

Deep-rooted distrust
Another striking finding was that none of the five interventions succeeded in increasing conservatives' trust in scientists. Even when the message was well aligned with their values, their attitudes hardly changed. ‘This suggests that their distrust is deeply rooted and not easily changed,’ says Rutjens.

Stronger interventions needed
Rutjens finds it difficult to predict how scientific distrust will evolve. ‘Extreme things are happening in America right now. But even here in the Netherlands we are seeing unprecedented discussions being held around science, sometimes accompanied by significant distrust.’

As the new research shows, it is difficult to strengthen trust in science using quick interventions. ‘This does not mean it is impossible, but these short interventions do not work to make science more transparent and reliable for certain groups,’ Rutjens explains. ‘We need stronger interventions that make science truly personal. What can science contribute to your life, here and now?’

Article Details
Vukašin Gligorić, Gerben van Kleef, and Bastiaan Rutjens (2025). ‘Political ideology and trust in scientists in the US.’ Nature Human Behaviour.

Rutjens contributed to this study, which was led by his PhD student Vukašin Gligorić. Gligorić recently obtained his PhD at the University of Amsterdam and will start this spring as a postdoctoral researcher at NYU Abu Dhabi.

 

Using artificial intelligence (AI) to conduct intelligent analysis of real classrooms



New research from East China Normal University sheds light on classroom dynamics using large-scale AI analysis



ECNU Review of Education

Artificial Intelligence (AI)-powered analysis reveals teacher-centered classroom trends in China 

Image: A large-scale AI-powered study has found that teacher-centered instruction remains dominant in Chinese classrooms, challenging assumptions about modern teaching methods.

Credit: Trustypics via Flickr, Creative Commons (https://openverse.org/image/70b7bb09-9e20-4e6f-9112-753dac0ba496)




Classroom teaching plays a crucial role in improving students' learning, building interpersonal relationships and peer engagement, and laying a foundation for future academic success. Analyzing classroom teaching patterns can help create an environment conducive to learning, promote diverse learning styles, and enhance student learning outcomes. Doing so, however, involves collecting a substantial quantity of classroom videos and evaluating them, and manual annotation demands considerable time and human resources.

Against this backdrop, Yihe Gao and Xiaozhe Yang from East China Normal University have now utilized AI to revolutionize classroom analysis. Their study, published online on March 24, 2025 in ECNU Review of Education, introduces the High-Quality Classroom Intelligent Analysis Standard (CEED) system. “This AI-driven system applies machine learning and multimodal data processing techniques to classify discourse, analyze behavior, and evaluate teacher-student interactions. By automating large-scale classroom video analysis, the CEED system marks a significant advancement over traditional manual observation methods, which are labor-intensive and time-consuming,” explains Gao.

The CEED system draws on AI's ability to process data across multiple dimensions, such as speech, behavior, and psychology, to evaluate classroom conditions, with an emphasis on students' learning scenarios. This multi-dimensional approach can provide a comprehensive understanding of the high-quality standards of classroom efficiency, equity, and democracy.
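The CEED implementation itself is not published in this release, but the discourse-classification step it automates can be pictured as labeling transcribed utterances with activity categories. The toy sketch below uses invented utterances, hypothetical labels, and a generic supervised text classifier rather than the authors' models:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented training utterances with hypothetical discourse labels
utterances = [
    "Today we will learn how fractions are added.",
    "Who can tell me what the numerator is?",
    "Good, that is exactly right, well done.",
]
labels = ["lecturing", "open_question", "feedback"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(utterances, labels)

# Classify a new utterance from a (hypothetical) transcript
print(clf.predict(["Why do you think the answer is one half?"]))
```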

The research team collected videos of Chinese language classes from grade 1 to grade 9. By analyzing 1,008 primary and secondary school classes, the researchers found that teacher-centered instruction prevails: on average, teacher presentation occupied 51.9% of classroom time and teacher-student interaction accounted for 30.5%, while individual tasks and group activities comprised 12.3% and 5.3%, respectively.
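Percentages like these come from summing the durations of labeled activity segments and dividing by total class time. A minimal illustration with invented segment data:

```python
from collections import defaultdict

# Invented (activity, seconds) segments for one analyzed lesson
segments = [
    ("teacher_presentation", 1200), ("teacher_student_interaction", 600),
    ("individual_task", 300), ("teacher_presentation", 400),
    ("group_activity", 150),
]

totals = defaultdict(float)
for activity, seconds in segments:
    totals[activity] += seconds

whole = sum(totals.values())
for activity, seconds in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{activity}: {100 * seconds / whole:.1f}% of class time")
```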

These findings confirm that teacher-led instruction remains predominant, particularly in higher grade levels, where open-ended questioning decreases significantly. The research challenges common assumptions that older students engage in more critical discussions, revealing instead that teachers tend to favor structured, closed-ended questions.

The researchers further examined how classroom time was distributed across grades and found that first grade had the lowest proportion of time spent on group activities, lower than grades three, four, six, and seven. The share of time devoted to teacher presentation increased with grade level, while the proportions of teacher-student interaction, group activities, and individual tasks decreased. Accordingly, teachers in higher grades tended to rely more heavily on lecturing as an instructional approach.

The study underscores the potential of AI to transform classroom evaluation and teacher training, emphasizing the implications for education policy and practice. Educators can use the CEED system to analyze their teaching methods, assess classroom engagement, and refine instructional strategies based on data-driven insights. Assessing indicators such as teacher-student interaction, teacher presentation, group activities, and individual tasks can help improve the quality of teaching, and these indicators can inform the design of training programs that streamline teaching strategies.

"Our study not only provides a snapshot of current teaching practices, but also provides educators with a tool to facilitate instructional adjustments and enhance classroom interactions," Yang noted. "AI-driven analytics open up new avenues to increase the efficiency of instructional improvement and foster more student-centered learning environments."

While the AI system has proven highly effective and accurate in analyzing language-based subjects, challenges remain in applying it to fields like physical education. Further research is needed to refine AI’s capabilities in assessing diverse classroom activities across subjects. Additionally, concerns around algorithmic bias and data interpretation highlight the need for careful implementation in educational settings.

This study represents a significant step toward integrating AI in education research and practice, offering a scalable, data-driven approach to understanding and improving classroom dynamics.

 

***

Reference

Title of original paper: Are China’s Classes Predominantly Centered Around Teacher-Presentation Instruction?—A Large-Scale Data Analysis Based on Classroom Intelligent Analysis Systems

DOI: 10.1177/20965311251322181

Journal: ECNU Review of Education

 

 

Using AI to monitor inaccessible locations of nuclear energy systems




National Center for Supercomputing Applications




Whether it’s for your vehicle or your home, from small-scale uses to the largest, the debate over the most efficient and cost-effective fuels continues. Currently, there’s no shortage of options either.

Nuclear power provides an alternative to more conventional energy options but requires rigorous systems monitoring and safety procedures. Machine learning could make keeping a close eye on key elements of nuclear systems easier and response time to issues faster.

Syed Bahauddin Alam, an assistant professor in the Department of Nuclear, Plasma & Radiological Engineering (NPRE) in the Grainger College of Engineering at the University of Illinois Urbana-Champaign, and his team worked with artificial-intelligence and machine-learning experts through Illinois Computes to develop a novel method for real-time monitoring of nuclear energy systems that can infer predictions about 1,400 times faster than traditional Computational Fluid Dynamics (CFD) simulations. NCSA research assistants and NPRE graduate students Kazuma Kobayashi and Farid Ahmed assisted in the development. 

Published in Nature’s npj Materials Degradation, Alam’s research introduces machine learning-driven virtual sensors based on deep-learning operator-surrogate models as a complement to physical sensors in monitoring critical degradation indicators. Traditional physical sensors face limitations, particularly in measuring critical parameters in hard-to-reach or harsh environments, which often result in incomplete data coverage. Moreover, traditional physics-based numerical modeling methods, such as CFD, are still too slow to provide real-time predictions in nuclear power facilities.

Instead, novel Deep Operator Neural Networks (DeepONets), when properly trained on graphics processing units (GPUs), can instantly and accurately predict complete multiphysics solutions over the entire domain. The DeepONet functions as a real-time virtual sensor, addressing the limitations of physical sensors and classical modeling, specifically by predicting key thermal-hydraulic parameters in the hot leg of a pressurized water reactor.
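The paper's trained models are not reproduced here, but the DeepONet architecture itself is well documented: a branch network encodes a sampled input function (for example, sparse sensor readings), a trunk network encodes a query coordinate, and their inner product gives the predicted field value at that point. A minimal, untrained PyTorch sketch with placeholder sizes and data:

```python
import torch
import torch.nn as nn

class DeepONet(nn.Module):
    def __init__(self, n_sensors: int, coord_dim: int, width: int = 64):
        super().__init__()
        # Branch net: encodes the input function sampled at fixed sensor locations
        self.branch = nn.Sequential(
            nn.Linear(n_sensors, width), nn.Tanh(), nn.Linear(width, width)
        )
        # Trunk net: encodes the coordinate where the output field is queried
        self.trunk = nn.Sequential(
            nn.Linear(coord_dim, width), nn.Tanh(), nn.Linear(width, width)
        )
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, u: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        # u: (batch, n_sensors) sensor readings; y: (batch, coord_dim) query points
        return (self.branch(u) * self.trunk(y)).sum(-1, keepdim=True) + self.bias

model = DeepONet(n_sensors=32, coord_dim=3)
u = torch.randn(8, 32)    # e.g., 32 boundary/inlet sensor readings per case
y = torch.rand(8, 3)      # e.g., (x, y, z) query points inside the hot leg
print(model(u, y).shape)  # -> torch.Size([8, 1])
```

Once trained, a forward pass like this is a single batched matrix computation, which is why inference can run orders of magnitude faster than a CFD solve.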

Because components are continuously subjected to extreme temperatures, pressures and radiation, proper monitoring and inspection of in-service elements of nuclear reactors is essential for long-term safety and efficiency. AI isn’t replacing human oversight but creating new ways to monitor and predict the potential failure of system elements.

“Our research introduces a new way to keep nuclear systems safe by using advanced machine-learning techniques to monitor critical conditions in real-time,” Alam said. “Traditionally, it’s been incredibly challenging to measure certain parameters inside nuclear reactors because they’re often in hard-to-reach or extremely harsh environments. Our approach leverages virtual sensors powered by algorithms to predict crucial thermal and flow conditions without needing physical sensors everywhere.

“Think of it like having a virtual map of how the reactor is operating, giving us constant feedback without having to place physical instruments in risky spots. This not only speeds up the monitoring process but also makes it significantly more accurate and reliable. By doing this, we can detect potential issues before they become serious, enhancing both safety and efficiency.”

Through the campus-funded Illinois Computes program, Alam utilized allocations on NCSA’s Delta, performing data-generation computations on central processing unit (CPU) nodes and training and evaluation tasks on a computational node with NVIDIA A100 GPUs. He collaborated with NCSA’s experts in AI-driven scientific computing and high-performance computing.

Massive thanks to Illinois Computes for funding this research. Partnering with Dr. Diab Abueidda and Dr. Seid Koric from NCSA was essential to our success. Through the campus-funded program, we leveraged Delta’s state-of-the-art supercomputing resources, including a computational node with NVIDIA A100 GPUs, to train and test our models efficiently. The NCSA technical staff provided invaluable support throughout the entire process, demonstrating the tremendous impact of combining AI with high-performance computing to advance nuclear safety. We will continue to work on unleashing the power of AI in complex energy systems, pushing the boundaries of what is possible to enhance safety, efficiency and reliability.

Syed Bahauddin Alam, assistant professor in the Department of Nuclear, Plasma & Radiological Engineering

“In this Illinois Computes project, we have fully utilized the unique high-performance computing resources and multidisciplinary expertise at NCSA and the Grainger College of Engineering to advance translational and transformative engineering research in Illinois,” said Seid Koric, senior technical associate director for Research Consulting at NCSA and research professor at the Department of Mechanical Science and Engineering.

“This collaboration exemplifies the synergy that emerges when advanced AI methods, high-performance computing resources and domain expertise converge,” said Abueidda, a research scientist at NCSA. “Working alongside Dr. Alam’s team and NCSA’s AI and HPC experts, we leveraged the U.S. National Science Foundation-funded Delta’s cutting-edge capabilities to push the boundaries of real-time monitoring and predictive analysis in nuclear systems. By uniting our specialized skill sets, we have accelerated research while enhancing the accuracy and reliability of critical safety measures.

“We look forward to continuing this interdisciplinary approach to drive transformative solutions for complex energy systems. Ultimately, these breakthroughs highlight the promise of computational science in addressing the pressing challenges of nuclear energy.”


ABOUT DELTA AND DELTAAI

NCSA’s Delta and DeltaAI are part of the national cyberinfrastructure ecosystem through the U.S. National Science Foundation ACCESS program. Delta (OAC 2005572) is a powerful computing and data-analysis resource combining next-generation processor architectures and NVIDIA graphics processors with forward-looking user interfaces and file systems. The Delta project partners with the Science Gateways Community Institute to empower broad communities of researchers to easily access Delta and with the University of Illinois Division of Disability Resources & Educational Services and the School of Information Sciences to explore and reduce barriers to access. DeltaAI (OAC 2320345) maximizes the output of artificial intelligence and machine learning (AI/ML) research. Tripling NCSA’s AI-focused computing capacity and greatly expanding the capacity available within ACCESS, DeltaAI enables researchers to address the world’s most challenging problems by accelerating complex AI/ML and high-performance computing applications running on terabytes of data. Additional funding for DeltaAI comes from the State of Illinois.

ABOUT ILLINOIS COMPUTES

Illinois Computes offers computing and data storage resources, technical expertise and support services to researchers from all domains across the University of Illinois Urbana-Champaign campus. Through the campus-funded program, NCSA will learn what additional assets are needed to fulfill the computing demands of the university and adjust the cyberinfrastructure strategy while continuing to make access to systems, interdisciplinary and technical knowledge, and support infrastructure easy to obtain. Illinois Computes removes barriers for all Illinois researchers – especially those typically underserved – to access NCSA’s growing assemblage of research computing tools and world-class staff, furthering their innovative and novel work while ensuring NCSA is a leader in the global research community.

Check out the Illinois Computes website and sign up for the monthly newsletter for more information.

 

Nested hierarchies in job skills underscore importance of basic education




Santa Fe Institute





In many careers, a person must learn foundational skills before advancing deeper into their profession. Computer programmers need a solid foundation in basic mathematics; nurses must gain clinical experience and specialized training to become nurse practitioners; a negotiator’s ability to persuade depends on solid communication and active-listening skills.

A recent paper published in Nature Human Behaviour mapped the dependency relationships between workplace skills using data from millions of job transitions and U.S. workplace surveys. The authors identified a nested structure in many professions, where advanced skills depend on prior mastery of broader skills. This nestedness, they found, has significant implications for wage inequality and career mobility in increasingly complex labor markets.

“We found that many skills aren't just complementary — they're interdependent in a directional way, acting as prerequisites for others, snowballing layer over layer to get to more specialized knowledge,” says Moh Hosseinioun, a postdoctoral fellow at Northwestern University and lead author of the study.

The study grew out of a question about how the classifications of blue-collar and white-collar jobs arise, says SFI External Professor Hyejin Youn (Seoul National University), a corresponding author on the study. “In trying to answer this question, we found that these categories emerge around specialization.” Knowledge-based work tends to require more time to build specialized skills while physical work is often learned on the job.

“It’s like a succession model in ecology — acquiring complex skills requires a sequence of prerequisites,” Youn explains. Just as predators depend on prey, which rely on vegetation, which in turn requires soil created by microbes and fungi breaking down rock, cognitive development unfolds in layers within a kind of mental ecosystem. “Advanced problem-solving — like solving partial differential equations — first depends on mastering arithmetic, understanding mathematical notation, and grasping logical principles,” she says. “Basic educational skills are the cognitive equivalent of early organisms, creating the conditions in the mental ecosystem for higher-order reasoning to emerge, and are essential for developing the advanced skills that can lead to higher wage premiums.” 
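Concretely, nestedness means that the skill sets of less specialized occupations tend to be subsets of more specialized ones. The study applies formal measures to large-scale job-transition data; the toy sketch below only illustrates the idea, with invented occupations and a simple pairwise-containment score:

```python
# Toy occupations and skill sets, invented for illustration
occupations = {
    "clerk":      {"arithmetic"},
    "analyst":    {"arithmetic", "notation", "logic"},
    "engineer":   {"arithmetic", "notation", "logic", "calculus"},
    "researcher": {"arithmetic", "notation", "logic", "calculus", "PDEs"},
}

# For each pair of occupations, what fraction of the smaller skill set
# is contained in the larger one? An average of 1.0 means perfectly nested.
names = list(occupations)
overlaps = []
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        a, b = occupations[names[i]], occupations[names[j]]
        small, large = sorted((a, b), key=len)
        overlaps.append(len(small & large) / len(small))

print(f"nestedness score: {sum(overlaps) / len(overlaps):.2f}")  # 1.00 here
```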

This job skills structure isn’t itself all that surprising, write the authors in a supplemental research briefing, but it has significant societal implications. “We find that skills that are more closely aligned with the nested hierarchy require a longer education, command higher wage premiums and are less likely to be automated,” they write.

The nested structure in job skills has become more pronounced over the past two decades, suggesting a possible increase in job polarization as longer, deeper, and more complex sequences of prerequisites may hinder newcomers. Policy interventions may be needed to avoid increasing inequality and polarization in the job market; those unable to access education for foundational skills face barriers to entering higher-wage career paths. "The more we become specialized and nested, the more inequality and disparity across the labor market will occur," says Youn.

The findings also have implications for education policy. Quick "reskilling" programs may have limited effectiveness without investment in general foundational skills, and moves by universities to remove foundational courses in favor of immediately applicable skills could have unintended consequences for graduates.

The findings also raise concerns about using AI to tackle problems that require foundational skills. “Large language models are unprecedented in how they target fundamental skills,” says Hosseinioun. “Is this an opportunity, where some of the layers of the hierarchy might be condensed? Or, if we outsource those fundamental skills, will we become unable to learn more advanced skills?”

The more our collective knowledge base grows, the harder it becomes for any one individual to master a universal set of skills. Our modern economy's demand for ever-more-specialized skills is shaping broader social and economic systems, and the highest-paying skills are locked within a series of prerequisites.

“These structured pathways systematically shape professional development and the socio-economic landscape, driving differences in rewards and career accessibility based on early choice of skill acquisition,” write the authors. “This deeper nested structure imposes greater constraints on individual career paths, amplifies disparities and has macroeconomic implications, affecting the resilience and stability of the entire system.”

Read the study “Skill dependencies uncover nested human capital” in Nature Human Behaviour (February 24, 2025). DOI: 10.1038/s41562-024-02093-2

 

Remote repairs: discovering the longevity of 3D-printed metal parts



3D-printing metal parts could save weeks of downtime, but DARPA wants a way to certify how long they'll last



University of Michigan





Image: Researchers in a University of Michigan 3D-printing lab.

To find out how long 3D-printed metal parts are likely to last in the field, the Defense Advanced Research Projects Agency is providing up to $10.3M to a University of Michigan-led team.

 

When military equipment fails in remote locations, it can take weeks for a part ordered from the manufacturer to arrive. 3D printing, specifically laser powder bed fusion, or LPBF, is an expensive way to make parts, and the parts likely aren't as sturdy as cold-forged ones. However, downtime is even more expensive in hours of lost work, so military agencies would like the option to commission locally made parts or bring along 3D printers themselves.

 

The problem is guaranteeing longevity. Military parts undergo stringent testing, and usually the manufacturing process is so uniform that samples of one part made by a particular machine reliably represent how all such parts made by that machine will fare. This is not the case for LPBF, in which defects in the material structure are more common and more random.

 

In LPBF, a bed of metal powder is hit with lasers so that it solidifies into a cross-section of the desired shape. Then more powder is spread on top, and the lasers fuse the next layer to the one below it. This continues until the part is complete.

 

"Depending on which model of LPBF printer you use, you might get different microstructures and different properties. The laser spot size and laser power levels might be different. The scanning strategies might be different. These things change the quality of the part," said Veera Sundararaghavan, U-M professor of aerospace engineering and principal investigator of the project.

 

"Our aim is to guarantee the quality of the part as you print."

 

The solution offered by Sundararaghavan and his collaborators is to carefully record the printing process and create a digital twin of each part based on the defects that emerge. Then, the team will computationally model repeated stresses on the part to figure out where cracks form and how long that takes. These fatigue models can incorporate the actual service data to predict when parts will fail. The team will validate these models with fatigue tests. The four-year project is called Predictive Real Time Intelligence for Metal Endurance, or PRIME.
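The project's fatigue models are microstructure-informed and far richer than anything shown here, but the basic bookkeeping of fatigue-life prediction can be illustrated with the classic Basquin relation and Miner's linear damage rule. All material constants below are placeholder assumptions:

```python
def cycles_to_failure(stress_amplitude: float,
                      sigma_f: float = 900.0,    # fatigue strength coefficient (MPa), assumed
                      b: float = -0.09) -> float:  # fatigue strength exponent, assumed
    """Invert Basquin's relation, stress = sigma_f * (2N)^b, for N."""
    return 0.5 * (stress_amplitude / sigma_f) ** (1.0 / b)

# Miner's rule: failure is expected when the summed damage fractions reach ~1.0
load_spectrum = [(300.0, 1e4), (200.0, 1e5), (150.0, 1e6)]  # (MPa amplitude, applied cycles)
damage = sum(n / cycles_to_failure(s) for s, n in load_spectrum)
print(f"accumulated damage: {damage:.3f}")
```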

 

"To understand the lifespan of LPBF parts, we must push the current boundaries of the field and detect even the most critical defects that impact component performance. Through the PRIME project, we are doing exactly that—leveraging state-of-the-art monitoring and AI techniques to redefine what’s possible," said Mohsen Taheri Andani, assistant professor of mechanical engineering at Texas A&M University, who is co-leading the effort to monitor LPBF printing.

 

Three partners—the additive manufacturing monitoring company Addiguru, Texas A&M University, and the ASTM Additive Manufacturing Center of Excellence—will develop techniques and standards to collect data during LPBF manufacturing. They will set up LPBF machines with an optical camera and two infrared cameras, capturing near- and far-infrared signals that reveal where heat is building up in the sample.

 

Addiguru is pioneering a multisensor integration that includes an acoustic sensor. The sensor Addiguru chose was originally designed to pick up birdsong, but here it will listen for the sounds of porosity defects in the metal. These tools will enable the team to identify defects as small as 0.025 millimeters, and the sensor suite will be designed to work with most LPBF devices.

 

"Multisensor data, combined with advanced analytics, will provide critical insights to part manufacturers. This project will enable a comprehensive, real-time assessment of part quality, helping manufacturers make informed go/no-go decisions with confidence," said Shuchi "SK" Khurana, founder and CEO of Addiguru, also co-leading the print monitoring effort.

 

Meanwhile, part of the U-M contingent will work with the 3D-printing simulation company AlphaSTAR to use that data to develop digital twins of the printed parts. They intend to combine advanced physics-based modeling of the LPBF process from AlphaSTAR with U-M's simulations of the part's structure at the microscale. The modeling and simulation of the microstructure also will help the team identify the residual stresses, or stresses that are built into the part, that may eventually contribute to its demise.

 

"The microstructures of 3D-printed parts contain crystal grains that produce different properties across different directions, brittle structures known as intermetallic phases, and internal pores that are different from those seen in their conventionally processed counterparts. Microstructure modeling will offer important inputs for fatigue life predictions," said Lei Chen, associate professor of mechanical engineering at U-M Dearborn, who plays a key role in the microstructure modeling effort.

 

Finally, U-M researchers will also work with partners at the University of California, San Diego, to run uncertainty quantification models on top of the microstructure models, predicting the resilience of the part over time by digitally testing how the metal responds to the stresses it's likely to encounter on the job. To discover whether those predictions are correct, Auburn University will perform fatigue testing on the metal parts, stressing them until they break.
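A common form of uncertainty quantification is Monte Carlo propagation: sample uncertain inputs, push each sample through the predictive model, and report the spread of outcomes. The sketch below does this over a simple Basquin-style life model with assumed parameter distributions, standing in for the project's far more detailed models:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Assumed distributions for uncertain material parameters
sigma_f = rng.normal(900.0, 45.0, n)   # fatigue strength coefficient (MPa)
b = rng.normal(-0.09, 0.005, n)        # fatigue strength exponent
stress = 250.0                         # applied stress amplitude (MPa)

# Propagate every sample through the life model: N = 0.5 * (stress/sigma_f)^(1/b)
lives = 0.5 * (stress / sigma_f) ** (1.0 / b)

print(f"median life: {np.median(lives):.3g} cycles")
print(f"5th-percentile life (conservative bound): {np.percentile(lives, 5):.3g} cycles")
```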

 

"If PRIME takes off, it’s like giving 3D printing a crystal ball—predicting the lifetime of LPBF parts across platforms and turning critical part production into a low-cost, distributed dream," Sundararaghavan said.

 

The project is funded through DARPA's Structures Uniquely Resolved to Guarantee Endurance program.