Tuesday, March 07, 2023

Counting heads: how deep learning can simplify tedious agricultural tasks

Scientists show how machine learning models can be used to automatically detect the heads of sorghum plants in drone images to derive agricultural metrics

Peer-Reviewed Publication

NANJING AGRICULTURAL UNIVERSITY THE ACADEMY OF SCIENCE

The selective breeding of grain crops is one of the main reasons why domesticated plants produce such excellent yields. Selecting the best candidates for breeding is, however, a remarkably complex task. On one hand, it requires a skilled breeder with trained eyes to assess plant resistance to disease and pests, crop growth, and other factors. On the other hand, it also requires precise tool-assisted measurements such as grain size, mass, and quality.

Although all these standard measures are useful, none of them takes into account the number of panicles or ‘heads’ per plant. Head density is closely related to crop yield in most cases, and it could easily be a staple characteristic to measure in breeding programs. However, estimating the number of heads per plant and per unit area is very time consuming and requires tedious manual work.

To address this issue, many researchers have developed machine learning models that can automatically detect individual heads on grain crops in images taken either at ground level or by drones. While these models are aimed at simplifying the otherwise manual counting process in the field, the reality is that they are usually trained in limited testing conditions and focus exclusively on head detection without providing more metrics. In other words, using these models outside of the context in which they were developed and trained can be difficult, tedious, and even yield poor results.

Against this backdrop, a research team including Professor Scott Chapman from The University of Queensland, Australia, sought to promote deep-learning models for head counting by providing a detailed pipeline outlining their use. As explained in their paper, which was recently published in Plant Phenomics, this pipeline covers most of the quirks and challenges that one could encounter when using these models. “We took various real-world variables into consideration, including data preparation, model validation, inference, and how to derive yield-specific metrics,” explains Prof. Chapman. “We aimed to outline a practical and end-to-end pipeline for head detection in sorghum.”

There are two variants of the proposed pipeline, which are demonstrated by way of two independent illustrative experiments. In the first, the researchers show how to proceed when training, testing, and validation datasets for a given machine learning model must be prepared from scratch. This is usually the case when publicly available datasets are not suitable for the target field, which can happen, for example, when one is dealing with a different stage of plant development than the available datasets cover.
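For readers who want a concrete starting point for that first experiment, a minimal sketch of one piece of it (splitting locally annotated drone-image tiles into training, validation, and test sets) is shown below. The folder name, file extension, and split ratios are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (assumptions): annotated drone-image tiles live in a local folder,
# and we want a reproducible train/validation/test split before model training.
import random
from pathlib import Path

def split_dataset(image_dir, train_frac=0.7, val_frac=0.15, seed=42):
    """Shuffle tile paths reproducibly and return (train, val, test) lists."""
    images = sorted(Path(image_dir).glob("*.jpg"))   # extension is an assumption
    random.Random(seed).shuffle(images)
    n_train = int(len(images) * train_frac)
    n_val = int(len(images) * val_frac)
    return (images[:n_train],
            images[n_train:n_train + n_val],
            images[n_train + n_val:])

train, val, test = split_dataset("sorghum_tiles/")   # hypothetical directory name
print(f"{len(train)} training, {len(val)} validation, {len(test)} test tiles")
```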

In the second experiment, the team showcases the steps required to use various pre-trained deep-learning models for sorghum head detection and/or counting. They demonstrate how the detection results (that is, the output of models that only outline sorghum heads on a set of given images) can be ‘stitched together’ into larger mosaic images. This enables one to observe and analyze large areas more easily and calculate important metrics, such as head density per tilling row or per square meter. “Our pipeline produces a high-resolution head density map that can be used for the diagnosis of agronomic variability within a field without relying on commercial software,” highlights Prof. Chapman.
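As a rough illustration of how detection outputs can become yield-related metrics, the sketch below turns per-tile head counts into a heads-per-square-metre density map. The tile size, ground sampling distance, and counts are invented for the example; the paper's mosaicking and metric calculations are more involved.

```python
# Minimal sketch (assumptions): each mosaic tile has a known ground footprint,
# and a detector has already reported how many sorghum heads it found per tile.
import numpy as np

def density_map(counts, tile_size_px, gsd_m):
    """Convert per-tile head counts to heads per square metre.
    gsd_m is the ground sampling distance in metres per pixel."""
    tile_area_m2 = (tile_size_px * gsd_m) ** 2
    return counts / tile_area_m2

counts = np.array([[12, 15, 9],
                   [14, 18, 11]])                         # invented detections per tile
dens = density_map(counts, tile_size_px=512, gsd_m=0.01)  # 512-px tiles at 1 cm/px
print(dens.round(2))                                      # heads per square metre
```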

Overall, this study will be useful to researchers and people involved in the agricultural industry alike. Not only does it explain how deep learning models can be leveraged to assess grain crops more efficiently, but it also helps unlock new functionalities for camera-equipped drones in agriculture. Notably, the proposed pipeline could be adapted to other plants besides sorghum, as Prof. Chapman remarks: “Although we demonstrated our pipeline in a sorghum field, it can be generalized to other grain species. In future works, we intend to test our pipeline on tasks involving other grain types, such as wheat and maize yield estimation.”

Let us hope this work helps bridge the fields of agriculture and machine learning to improve crop breeding and, thus, secure better food supplies.

###

Reference

Authors

Chrisbin James1, Yanyang Gu2, Andries Potgieter3, Etienne David4, Simon Madec4, Wei Guo5, Frédéric Baret6, Anders Eriksson2, and Scott Chapman1

Affiliations

1School of Agriculture and Food Sciences, The University of Queensland

2School of Information Technology and Electrical Engineering, The University of Queensland

3Queensland Alliance for Agriculture and Food Innovation, The University of Queensland

4Arvalis, Institut du Végétal

5Graduate School of Agricultural and Life Sciences, The University of Tokyo

6Institut National de la Recherche Agronomique


Evidence for the health benefits of consuming more live microbes

New study shows lower weight and better overall health in those who consume more live microorganisms daily

Peer-Reviewed Publication

INTERNATIONAL SCIENTIFIC ASSOCIATION FOR PROBIOTICS AND PREBIOTICS

Safe live microorganisms are found in a variety of foods we eat every day, from yogurt and other fermented foods, to raw fruits and vegetables. Despite the widespread idea that these mixtures of live microbes contribute to health, convincing evidence linking live dietary microbes to health benefits has been lacking.

A new study provides some of the first real-world evidence that higher consumption of live microbes may promote health. A group of scientists led by the International Scientific Association for Probiotics and Prebiotics (ISAPP) classified over 9,000 individual foods listed in the US National Health and Nutrition Examination Survey (NHANES) into three categories based on their abundance of live microbes, and then used NHANES participants’ reported food intake to quantify how much of the food they ate contained medium or high levels of microbes. They then determined how these intakes correlated with various markers of health, such as blood pressure and weight.
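In outline, the analysis amounts to tagging each food with a live-microbe category, summing each participant's intake of medium- and high-microbe foods, and relating that total to health markers. The toy sketch below mirrors that logic with invented numbers; the actual study used NHANES dietary records and models adjusted for confounders rather than a simple correlation.

```python
# Toy sketch (invented data): categorize foods, sum medium/high-microbe intake per
# person, and correlate it with one health marker.
import pandas as pd
from scipy.stats import pearsonr

food_category = {"yogurt": "high", "sauerkraut": "high",
                 "apple": "medium", "white bread": "low"}

intake = pd.DataFrame({
    "person": [1, 1, 2, 2, 3],
    "food":   ["yogurt", "apple", "white bread", "apple", "sauerkraut"],
    "grams":  [150, 120, 80, 100, 60],
})
intake["category"] = intake["food"].map(food_category)

live_microbe_g = (intake[intake["category"].isin(["medium", "high"])]
                  .groupby("person")["grams"].sum())

bmi = pd.Series({1: 24.1, 2: 29.3, 3: 23.5})            # invented marker values
r, p = pearsonr(live_microbe_g.reindex(bmi.index, fill_value=0), bmi)
print(f"correlation between live-microbe intake and BMI: r = {r:.2f}")
```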

The scientists found that increased consumption of live microbes in the diet was linked with multiple measurements of better health: more favorable blood pressure, better blood glucose and insulin, lower inflammation, as well as lower waist circumference and body mass index. This established that those who consumed higher quantities of live dietary microbes showed tangible, if modest, health benefits.

While the scientific approach did not allow the researchers to conclude that the live dietary microorganisms directly caused the health benefits, the results are consistent with plausible arguments that dietary exposure to live microorganisms in general could benefit health by increasing the diversity of microbes in the gut or by supporting immune function. Over the past century, a reduction in the amount of fermented foods in the diet and increased consumption of processed foods have led to a dramatic reduction in the number of microbes most people consume on a daily basis. This trend may be reversing, however: the dietary data used in the study showed that US adults gradually increased their live microbe consumption over the 18-year study period. This may bode well for the health of the population.

This study built on two previously published papers that conducted the preliminary work necessary to make this assessment of live dietary microbes and health.

“Although the dose-response associations we found were relatively modest, it was notable that these estimated benefits applied to several plausible and important health outcomes and were robust to adjustment for available confounders, including body mass index,” says co-lead author Prof. Dan Tancredi, PhD, of the University of California, Davis. “More research that extends these findings to other populations and research that uses study designs that permit stronger causal claims is needed, especially given the potential benefits that might be available by simply substituting into the diet more foods that have safe live microbes.”

ISAPP Executive Science Officer Mary Ellen Sanders, PhD, points out that the research focused not just on probiotics but on all microbes in foods, including environmental microbes associated with raw fruits and vegetables as well as lactic acid bacteria associated with fermented foods. The study thus differs from probiotic research, which focuses on microbes defined to the strain level, given at a specific dose, and with proven health benefits.

Co-first author Prof. Colin Hill, PhD, of University College Cork, Ireland, says it's possible that dietary advice of the future could include a recommendation for the daily consumption of high levels of live dietary microbes. “Those foods with high levels of microbes (fermented foods, raw vegetables and fruits) are all nutritionally valuable parts of a healthy and diverse diet,” he says. “Secondly, these same foods could be providing an additional, hitherto unrecognised, health benefit due to live microbes themselves that enter the gut and interact with the host microbiome, immune system and even the enteric nervous system”.

Cleveland Clinic selected for nationwide initiative to translate research to patient care

Joins 41 other U.S. health systems to carry out a multi-year Patient-Centered Outcomes Research Institute-funded initiative

Grant and Award Announcement

CLEVELAND CLINIC

March 7, 2023, Cleveland: The Patient-Centered Outcomes Research Institute (PCORI) has chosen Cleveland Clinic for a nationwide initiative to improve how healthcare systems translate research findings to clinical practice.

PCORI’s Health Systems Implementation Initiative connects 42 healthcare systems to reduce the time it takes to move a discovery in research to a new treatment for patients – an estimated 17-year lag. PCORI has committed an initial investment of up to $50 million to support the initiative.

Healthcare systems that are part of the initiative will propose projects that implement findings from PCORI-funded research, which includes more than 800 research studies focused on patient care. This is the nonprofit’s first time offering this type of implementation funding.

“We know it takes years to move a scientific discovery into clinical practice. Now, PCORI is providing the resources,” said Anita Misra-Hebert, M.D., M.P.H., director of Cleveland Clinic’s Healthcare Delivery and Implementation Science Center. “These health systems can also benefit from learning how others accelerate this process, continuously improving and tailoring care to benefit patients.”

Research funding provides data on how to improve treatment, recovery and healthcare maintenance, as well as the predictive tools and structures healthcare systems use to deliver services. Implementing the findings requires additional money, training and infrastructure, which is where health systems can hit roadblocks.

PCORI is offering a first phase of funding to build capacity for upcoming implementation projects – up to $500,000 per health system. Healthcare systems will then pitch implementation projects for the second phase of funding, which will range from $500,000 to $5 million per project.

In 2019, Cleveland Clinic launched the Healthcare Delivery and Implementation Science Center, which serves as a systemwide resource for research implementation, connecting researchers with operations leadership. The center offers consultation and education programs, and funds implementation projects through grants. Accepted projects include using 3-D models in imaging, prenatal testing during COVID-19, and using data to predict readmission risk.

Dr. Misra-Hebert is co-lead on programs through the Health Systems Implementation Initiative with Beri Ridgeway, M.D., Cleveland Clinic’s Chief of Staff. “One of the key aspects of implementing new programs or standardizing care is translating knowledge generated from research into real-world practice,” said Dr. Misra-Hebert.

“This initiative allows us to continue building infrastructure that will further expedite the adoption and implementation of scientific discoveries,” said Dr. Ridgeway.

Participants in the initiative collectively represent 800 hospitals serving 79 million unique patients across 41 states and the District of Columbia. In addition to health systems, participants include academic medical centers, community-based health systems, integrated healthcare delivery and finance systems, safety-net health systems, faith-based systems, public health care delivery systems and a medical center within the Veterans Health Administration. See the full list on PCORI’s website.

Health Systems Implementation Initiative is part of a portfolio of PCORI-funded efforts that aim to improve the awareness, uptake and use of results from patient-centered comparative effectiveness research. PCORI is an independent, non-profit organization that funds comparative clinical effectiveness research, which provides patients, their caregivers and clinicians with the evidence to make better-informed health and healthcare decisions. PCORI is committed to seeking input from a broad range of stakeholders to guide its work.

 

About Cleveland Clinic

Cleveland Clinic is a nonprofit multispecialty academic medical center that integrates clinical and hospital care with research and education. Located in Cleveland, Ohio, it was founded in 1921 by four renowned physicians with a vision of providing outstanding patient care based upon the principles of cooperation, compassion and innovation. Cleveland Clinic has pioneered many medical breakthroughs, including coronary artery bypass surgery and the first face transplant in the United States. U.S. News & World Report consistently names Cleveland Clinic as one of the nation’s best hospitals in its annual “America’s Best Hospitals” survey. Among Cleveland Clinic’s 72,500 employees worldwide are more than 5,050 salaried physicians and researchers, and 17,800 registered nurses and advanced practice providers, representing 140 medical specialties and subspecialties. Cleveland Clinic is a 6,500-bed health system that includes a 173-acre main campus near downtown Cleveland, 22 hospitals, more than 220 outpatient facilities, including locations in northeast Ohio; southeast Florida; Las Vegas, Nevada; Toronto, Canada; Abu Dhabi, UAE; and London, England. In 2021, there were 10.2 million total outpatient visits, 304,000 hospital admissions and observations, and 259,000 surgical cases throughout Cleveland Clinic’s health system. Patients came for treatment from every state and 185 countries. Visit us at clevelandclinic.org. Follow us at twitter.com/ClevelandClinic. News and resources available at newsroom.clevelandclinic.org.

Editor’s Note: Cleveland Clinic News Service is available to provide broadcast-quality interviews and B-roll upon request.

Gas monitoring at volcanic fields outside Naples exposes multiple sources of carbon dioxide emissions

New Geology research from The Geological Society of America

Peer-Reviewed Publication

GEOLOGICAL SOCIETY OF AMERICA

Image: Map of the Phlegraean volcanic fields (left), with details of carbon dioxide emissions from Solfatara crater (right). Credit: G. Buono et al., 2023.

Boulder, Colo., USA: The Phlegraean volcanic fields just west of Naples, Italy, are among the top eight emitters of volcanic carbon dioxide in the world. Since 2005, the Solfatara crater—one of many circular depressions in the landscape left by a long history of eruptions—has been emitting increased volumes of gas. Today it emits 4,000-5,000 tons of carbon dioxide each day, equivalent to the emissions from burning ~500,000 gallons of gasoline. In a new paper published ahead of print in Geology on Thursday, researchers estimate that as much as 20%–40% of the current carbon dioxide emissions are from the dissolution of calcite in the rocks, while 60%–80% is from underground magma.

“Estimating the source of the carbon dioxide is important to properly reconstruct what is happening in the magmatic system and the hydrothermal system,” says Gianmarco Buono, a volcanologist at the Italian National Institute of Geophysics and Volcanology and lead author of the study. “Our aim is to provide a tool to better discriminate the contribution of magmatic and non-magmatic carbon dioxide that can also be applied to other systems.”

When magma moves toward Earth’s surface, the decreasing pressure on the magma results in degassing—the release of gases that were previously trapped inside the magma—including water vapor, carbon dioxide, and sulfur dioxide. Scientists monitor volcanoes for unrest and possible eruptions using a variety of observations—detecting earthquakes and tremors related to magma movement, taking detailed measurements of ground deformation, and assessing the types and volumes of gases released at the surface from fumaroles—openings in the earth that emit steam and other gases.

Eruptions are often preceded by increased fluxes of gas, but that does not mean that every increase in gas emissions will be followed by an eruption. It is also possible for carbon dioxide to come from sources besides magma. Interaction between hot underground fluids and host rocks can also release carbon dioxide.


Image: Fumaroles at Solfatara crater. Credit: Lucia Pappalardo (Italian National Institute of Geophysics and Volcanology).

The Italian National Institute of Geophysics and Volcanology has been monitoring gas emissions from Solfatara crater since 1983, providing a long record of the changes in volume and composition of the gases released there. By comparing ratios of nitrogen, helium, and carbon dioxide in the emissions, researchers had previously established that the gases were coming from deep sources of magma.

“We focused mainly on geochemical variation, especially for carbon dioxide, helium, and nitrogen, because they are non-reactive species. They contain information about what is happening in the magma,” explains Buono.

But when the region started experiencing increased unrest in 2005, the data began to deviate from the chemical fingerprints of the magmas, a trend that continued to increase over time alongside rising temperatures in the shallow hydrothermal system. The unrest continued, and in 2012 the alert level was raised from green to yellow, indicating that there is heightened activity but not an imminent threat of eruption.

In addition to small earthquakes and higher gas emissions, the region also experienced deformation of the ground surface. Circulation of hot fluids underground could explain the rising temperatures, ground deformation, and increased gas emissions—interaction of hot acidic fluids with calcite in the rocks also releases carbon dioxide. Drill cores of the rocks from previous studies reveal that calcite in the rocks has similar composition to the gas emissions. The researchers estimate that 20%–40% of the carbon dioxide at the Solfatara crater site was from removal of the calcite in the host rock.
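One way to picture such an apportionment is a two-endmember mass balance, in which an observed tracer value is treated as a mixture of a magmatic and a calcite-derived endmember. The sketch below uses invented tracer values purely for illustration; it is not the authors' dataset or their exact method.

```python
# Generic two-endmember mixing sketch (illustrative values, not the study's data).
def calcite_fraction(tracer_obs, tracer_magma, tracer_calcite):
    """Fraction of CO2 attributed to calcite, assuming simple linear mixing."""
    return (tracer_obs - tracer_magma) / (tracer_calcite - tracer_magma)

f = calcite_fraction(tracer_obs=-0.5, tracer_magma=-1.5, tracer_calcite=2.0)
print(f"calcite-derived CO2: {f:.0%}; magmatic CO2: {1 - f:.0%}")
```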

The Phlegraean Fields have hosted volcanic activity since first erupting ~40,000 years ago, with the most recent eruption in 1538. There have been several unrest phases since the 1950s. The current research is part of a strategic project by the Italian National Institute of Geophysics and Volcanology, LOVE-CF: Linking surface Observables to sub-Volcanic plumbing-system: a multidisciplinary approach for Eruption forecasting at Campi Flegrei caldera (Italy).

FEATURED ARTICLE
Discriminating carbon dioxide sources during volcanic unrest: The case of Campi Flegrei caldera (Italy)
Gianmarco Buono; Stefano Caliro; Antonio Paonita; Lucia Pappalardo; Giovanni Chiodini
Contact: Gianmarco Buono, gianmarco.buono@ingv.it
https://pubs.geoscienceworld.org/gsa/geology/article-abstract/doi/10.1130/G50624.1/620796/Discriminating-carbon-dioxide-sources-during

GEOLOGY articles are online at https://geology.geoscienceworld.org/content/early/recent . Representatives of the media may obtain complimentary articles by contacting Kea Giles at the e-mail address above. Please discuss articles of interest with the authors before publishing stories on their work, and please make reference to GEOLOGY in articles published. Non-media requests for articles may be directed to GSA Sales and Service, gsaservice@geosociety.org.

https://www.geosociety.org

# # #

Heart tissue heads to space to aid research on aging and impact of long spaceflights

Business Announcement

JOHNS HOPKINS MEDICINE

Image: Tissue chambers loaded into a plate habitat designed for research aboard the International Space Station. Credit: Deok-Ho Kim and Devin Mair, Johns Hopkins Medicine.

Note: Johns Hopkins Medicine researchers Deok-Ho Kim and Devin Mair will participate in a NASA teleconference for journalists on Tuesday, March 14, at 11 a.m. ET.

Johns Hopkins Medicine researchers are collaborating with NASA to send human heart “tissue-on-a-chip” specimens into space as early as March. The project is designed to monitor the tissue for changes in heart muscle cells’ mitochondria (their power supply) and ability to contract in low-gravity conditions.

The tissue samples will be launched into space aboard SpaceX CRS-27, a resupply mission to the International Space Station, slated for liftoff no earlier than Tuesday, March 14, at NASA’s Kennedy Space Center in Florida.

Astronauts on board during the mission will also introduce three FDA-approved medicines to the samples in an effort to prevent heart cell changes known or suspected to occur in those undertaking long-duration spaceflights.

“It’s possible that what we learn from these experiments in space could also inform how we treat age-related cardiac problems,” says Deok-Ho Kim, Ph.D., professor of biomedical engineering at the Johns Hopkins University School of Medicine, because many heart cellular changes already detected in space explorers mimic changes linked to heart muscle aging in general.

To develop the microengineered human heart tissue-on-a-chip, researchers begin with human induced pluripotent stem cells grown in the laboratory. Such cells are able to develop into nearly any type of cell, and are coaxed biologically to develop into beating human cardiomyocytes, the muscle cells that make hearts contract.

Groups of cardiomyocytes form tissue that can be strung between two posts, one flexible and one stiff. The flexible post has an embedded magnet and, when placed over sensors, allows for collection of information on tissue contraction. The chamber enclosing the tissue is sealed so that liquid media feeding the tissue doesn’t float away in space. These tissue chambers are then loaded into so-called plate habitats with the magnetic sensors located beneath the tissue. The experimental payload consists of two of these plate habitats, which measure about 7 inches long, 5 inches tall and 4 inches wide.
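The release does not describe the flight hardware's data format, but contraction metrics from a post-deflection trace like the one those magnetic sensors record could in principle be extracted with standard peak detection. The sketch below applies this to a synthetic signal with an assumed sampling rate.

```python
# Sketch with a synthetic trace (assumed 100 Hz sampling, invented waveform):
# count contraction peaks to estimate a beat rate.
import numpy as np
from scipy.signal import find_peaks

fs = 100.0                                                  # samples per second (assumed)
t = np.arange(0, 10, 1 / fs)                                # 10 seconds of data
trace = np.maximum(0.0, np.sin(2 * np.pi * 1.0 * t)) ** 4   # one synthetic beat per second

peaks, _ = find_peaks(trace, height=0.5, distance=fs * 0.5)
bpm = len(peaks) / (t[-1] - t[0]) * 60
print(f"estimated beat rate: {bpm:.0f} beats per minute")
```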

Kim, his former postdoctoral researcher Jonathan Tsui, and his doctoral student Devin Mair previously sent heart tissue into space in March 2020. Those experiments, presented at the Tissue Engineering and Regenerative Medicine International Society-Americas 2022 Annual Meeting, showed that microgravity in space changed the cells’ mitochondria and the tissues’ ability to contract.

In the new experiments with their microengineered human heart tissues-on-a-chip, the scientists will focus on the proteins activated during tissue inflammation and mitochondrial dysfunction.

The astronauts aboard the space station will also test whether any of three medicines can stave off the problems anticipated in space-bound heart cells.

Funding for the research was provided by the National Institutes of Health (UH3TR003519).

Sediments, structures, shores, and storms: keeping a keen eye on eastern geology

2023 Joint Meeting of The Geological Society of America's Southeastern and Northeastern Sections

Meeting Announcement

GEOLOGICAL SOCIETY OF AMERICA

Boulder, Colo., USA: The 2023 joint meeting of the Geological Society of America’s Southeastern Section (72nd annual meeting) and Northeastern Section (58th annual meeting) will be held in Reston, Virginia, USA, on 17–19 March. We are pleased to announce a robust and diverse technical program that links the geology of the southeastern and northeastern U.S. and examines many key issues, including sea-level rise; climate and environmental change; energy and critical minerals; and diversity, equity, and inclusion in the geosciences. The meeting has many opportunities for students. Special tributes are planned for A. Conrad Neumann and William A. “Bill” Thomas.

The meeting offers short courses in applied micropaleontology, stormwater management in karst terrain, integrating ESRI mobile, online and desktop GIS for field data acquisitions, and an essential guide to luminescence dating.

Field trips will explore the Piedmont geology and the Civil War at North Anna River fall zone, Virginia; tectonic evolution of the northern Shenandoah massif, Virginia; Cretaceous and Paleocene fossiliferous sediments of Cabin Branch and Tinkers Creek, Maryland; and a visit to Capitol Hill.

Sponsors

U.S. Geological Survey

University of Pennsylvania Master of Science in Applied Geosciences

Pittsburgh Geological Society

Learn more about the meeting on the Section Meeting home page. Interested media are invited to receive complimentary registration. Contact Audrey Heun at aheun@geosociety.org to attend.

The Geological Society of America (GSA) unites a diverse community of geoscientists in a common purpose to study the mysteries of our planet (and beyond) and share scientific findings. Members and friends around the world, from academia, government, and industry, participate in GSA meetings, publications, and programs at all career levels, to foster professional excellence. GSA values and supports inclusion through cooperative research, public dialogue on earth issues, science education, and the application of geoscience in the service of humankind.

www.geosociety.org

# # #

Whistleblowers losing faith in media impact

Study shows that as newsrooms shrink, so does the trust of some former sources

Peer-Reviewed Publication

UNIVERSITY OF GEORGIA

The whistleblowers who once trusted journalism are losing faith in the institution.

A new study from the University of Georgia found that many whistleblowers who reached out to journalists in the past no longer believe the media has the same ability to motivate change, and they feel let down by a system they once trusted.

“If you don’t believe that an outlet or journalist can carry you across the finish line—meaning can effect change, attract enough attention and attract the attention of the right people—then you’re losing faith,” said Karin Assmann, study lead and assistant professor in UGA’s Grady College of Journalism and Mass Communication. “So if you think the institution of journalism no longer has the same impact, maybe because you see algorithms dictate what people pay attention to, then you lose faith.”

Distrust in institutions leads to skepticism

Assmann’s study was inspired by her own career as a journalist and a desire to protect the profession as it undergoes significant change. Assmann spoke with 16 whistleblowers who contacted journalists between the 1970s and 2010s, discussing their decisions to go to the press, their experience during interviews and their reflections on how media has changed.

Although they once believed journalists would protect sources and stories would foster accountability and evoke change, study participants noted the erosion of media’s reach throughout the last several decades.

“Especially with early whistleblowers, I would say through the early 2000s, there would be an emphasis on television,” Assmann said. “People would tune into the evening news, and pay attention to what this whistleblower had to say, trusting that it must be worthy of the audience’s attention if it made it into the evening news.”

But today, many whistleblowers believe consumers are less likely to watch the news or read a newspaper—they’re tuning into YouTube and reading brief push notifications or social posts instead. This, coupled with a loss of faith in institutions, meant that if they had another whistle to blow, they might rely on a different medium to spread the message.

“Another thing about whistleblowers is that their trust in their own institution has been damaged,” Assmann said. “So many of them see journalism as an institution that is equally damaged because they may have an ideology that tells them corporate ownership dictates how news is spread, the same way that maybe corporate ownership is responsible for whatever they’re blowing the whistle on.”

A lack of resources impacting newsrooms

Assmann’s study also cataloged the care with which whistleblowers selected journalists. They sought individuals with an established byline, experience reporting on specific topics and a dedicated audience.

“The whistleblowers talked about individuals plus institutions. Jeff Wigand, for example, was really strategic,” Assmann said, highlighting the former tobacco executive who reported that chemicals were added to a tobacco blend to increase the nicotine’s effect. “He looked at ‘60 Minutes,’ looked at the ratings. He knew the program was going to reach millions of people.”

Another example from the study was Tom Drake, who exposed excessive spending at the National Security Agency in 2010.

Drake relied on anonymity for his own safety, and he sought out a specific journalist who would understand the nuances of encryption and the intelligence community in order to make sure his report was understood and well reported.

“He had to find somebody who understood encryption, who would buy into all of the safeguards he had set up in order to share what he had to share without being found out,” Assmann said. “That is an extreme example, but other people follow that same logic and choose who they see as a subject matter expert. That kind of expertise is now at risk, I would say, in the current media landscape.”

Now, reporters are stretched thin and more likely to cover multiple topics.

“You’ll often start working at a news organization, and maybe you’ll have a beat, but you’ll also have to do 100 other things,” Assmann said. “How are you supposed to build trust? How are people supposed to recognize you as somebody who is a subject matter expert?”

Trending lack of trust

There are several factors at hand in waning faith in media, including shrinking newsrooms and a growing gap between communities and journalists. And in a world where more consumers rely on quick articles—or just headlines—to stay up to date, newsrooms need to become better resourced to rebuild trust.

“My suspicion, and I don’t think I’m alone with that, is that there is a crisis in local news,” Assmann said. “People don’t meet journalists in their everyday life anymore. Normal citizens don’t find themselves represented in local broadcast or the local paper.”

Continued research, however, can highlight opportunities to support newsrooms and rebuild some of that trust to combat negative perceptions, Assmann said.

“There’s a mistrust in the news media that’s been fostered, I would say, in the last couple of years through some politicians who have something to gain from the news media losing credibility,” Assmann said. “So I think this is a huge construction site for us to work on as journalism scholars and as journalists.”

If things continue along the same trajectory, whistleblowers could start turning to alternative forms of media to share their stories. Many interviewees in the study reported having greater trust for alternative news sources—blogs or social media—than traditional media.

But Assmann is wary of fully attributing this shift to mistrust.

“I would say that rather than calling it mistrust, it reflects them being media savvy and understanding how media networks work and audiences work,” she said. “The new, modern whistleblower may be born out of mistrust for mainstream media and just figures out that these are the best ways to get their stuff out of there. Maybe the next whistleblower will say, ‘I’ll make it a TikTok video,’ if, you know, TikTok is even still around.”

STEP Demo pilot plant achieves supercritical CO2 fluid conditions

sCO2 demonstration facility on SwRI campus advances toward system-level testing

Business Announcement

SOUTHWEST RESEARCH INSTITUTE

Image: SwRI’s John Klaerner, lead turbine engineer, and Dr. Jeff Moore, the principal investigator of the STEP Demo project, are pictured with the recently assembled sCO2 turbine for the 10 MWe demonstration plant under construction at SwRI. The facility, developed through a collaboration between SwRI, GTI Energy, GE Research and the U.S. Department of Energy/National Energy Technology Laboratory (DOE/NETL), has achieved its first operation with CO2 at supercritical fluid conditions in its compressor section, significant progress toward readying the facility for system-level testing. Credit: Southwest Research Institute.

SAN ANTONIO — March 7, 2023 — The Supercritical Transformational Electric Power (STEP) Demo pilot plant, a $155 million, 10-megawatt supercritical carbon dioxide (sCO2) test facility at Southwest Research Institute (SwRI) in San Antonio, developed in partnership with GTI Energy and GE Research and sponsored by the U.S. Department of Energy, has successfully achieved its first operation with CO2 at supercritical fluid conditions in its compressor section. This accomplishment represents significant progress toward readying the facility for system-level testing.

“This exciting milestone represents a significant advancement for a truly transformational project,” said Dr. Tim Allison, director of SwRI’s Department of Machinery. “STEP Demo is laying the groundwork for power generation that is more efficient, with a smaller footprint.”

Unlike conventional power plants, which use water as the thermal medium in power cycles, STEP is designed to use high-temperature sCO2, which increases efficiency by as much as 10% due to its favorable thermodynamic properties. Carbon dioxide is nontoxic and nonflammable, and when held above a critical temperature and pressure can act like a gas while having the density near that of a liquid.
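To put rough numbers on that behaviour, the sketch below queries the open-source CoolProp property library (not mentioned in the release) for CO2 density at two illustrative conditions: one just above the critical point, where the fluid is liquid-dense, and one at turbine-like temperatures, where it is far less dense. The chosen temperatures and pressures are examples, not STEP Demo operating points.

```python
# Illustrative property lookup (assumes the CoolProp package is installed).
from CoolProp.CoolProp import PropsSI

# CO2 critical point for reference: roughly 304 K and 7.4 MPa.
conditions = [("just above the critical point", 308.0, 8.0e6),
              ("turbine-like temperature",      973.0, 25.0e6)]

for label, T_K, P_Pa in conditions:
    rho = PropsSI("D", "T", T_K, "P", P_Pa, "CO2")     # density in kg/m^3
    print(f"{label}: T = {T_K:.0f} K, P = {P_Pa/1e6:.1f} MPa, density = {rho:.0f} kg/m^3")
```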

The efficiency of sCO2 as a working fluid allows STEP turbomachinery to be approximately one-tenth the size of conventional power plant components, providing the opportunity to shrink the environmental footprint and construction cost of any new facilities. For example, a desk-sized sCO2 turbine can power up to 10,000 homes. The technology is also compatible with concentrated solar power and industrial waste heat.

“The sCO2 power cycle is a breakthrough clean, compact, and high-efficiency power generation technology that can deliver significant environmental performance. We look forward to continued operation of the current test to demonstrate control and operability of this power cycle while validating system performance over long periods of time,” notes Bhima Sastri, Director of Energy Asset Transformation, DOE Office of Fossil Energy and Carbon Management.

The STEP Demo pilot plant is one of the largest demonstration facilities in the world for sCO2 technology, which aims to dramatically improve the efficiency, economics, operational flexibility, space requirements and environmental performance of power generation. The facility’s turbine is currently being installed and will be tested later this year.

SwRI is an industry leader in the development of sCO2 power cycles. Staff members have conducted numerous related U.S. Department of Energy projects advancing the efficiency, reliability and commercial readiness of sCO2 power cycle turbomachinery, heat exchangers, cycles and systems. The team brings extensive experience with sCO2 technology and the key building blocks to make the STEP Demo project a success and a landmark demonstration.

About STEP Demo
The STEP Demo pilot facility will demonstrate a fully integrated electricity generating power plant using transformational sCO2-based power cycle technology that can offer dramatically improved size, performance, economics, and operational flexibility, with less environmental impact. OEMs, engineering companies, and power plant owner/operators from around the globe are invited to join this open project to gain a better understanding of how sCO2 technology can improve high-efficiency power generation. www.stepdemo.us

About GTI Energy
GTI Energy is a leading research and training organization. Our trusted team works to scale impactful solutions that shape energy transitions by leveraging gases, liquids, infrastructure, and efficiency. We embrace systems thinking, open learning, and collaboration to develop, scale, and deploy the technologies needed for low-carbon, low-cost energy systems.

GTI Energy leads the STEP Demo project as the prime contractor with the U.S. Department of Energy/National Energy Technology Laboratory.
www.gti.energy

About the Office of Fossil Energy and Carbon Management
The Office of Fossil Energy and Carbon Management (FECM) conducts research, development, demonstration, and deployment that focuses on technologies to reduce carbon emissions and other environmental impacts from fossil fuel production and use and from key industrial processes, particularly the hardest-to-decarbonize applications in the electricity and industrial sectors. Priority areas of technology work include carbon capture, carbon conversion, carbon dioxide removal, carbon dioxide transport and storage, hydrogen production with carbon management, methane emissions reduction, and critical minerals production. To learn more, visit the FECM website or sign up for FECM news announcements. www.energy.gov/fecm/office-fossil-energy-and-carbon-management

About the National Energy Technology Laboratory
The National Energy Technology Laboratory (NETL) is the U.S. Department of Energy’s only Government-Owned, Government-Operated Laboratory. NETL focuses on the discovery, development, and deployment of technology solutions to enhance the nation’s energy foundation and protect the environment for future generations. These advanced technologies enable fossil fuels to produce the clean, reliable, and affordable energy needed to support increased domestic manufacturing, improve infrastructure, enhance global competitiveness, revitalize the workforce, and free the U.S. from dependence on foreign oil. www.netl.doe.gov

About GE Research
GE Research is GE’s innovation powerhouse where research meets reality. It is a world-class team of 1,000+ scientific, engineering and marketing minds (600+ Ph.D.s), working at the intersection of physics and markets, physical and digital technologies, and across a broad set of industries to deliver world-changing innovations and capabilities for their customers. www.ge.com/research

For more information, visit the STEP Demo website.

Teacher supports, guidance for elementary social studies education vary widely across U.S., report finds

Reports and Proceedings

RAND CORPORATION

A new RAND Corporation report finds that the basic infrastructure to support elementary (grades K-5) social studies instruction – academic standards, accountability requirements, assessment programs – is inadequate in many states. Even where state-level infrastructure to guide teachers’ instruction is in place, its comprehensiveness and quality vary greatly.

Support and guidance at the district and school level to underpin social studies instruction are also lacking compared to other core academic subjects. For example, elementary principals report less teacher evaluation and professional learning focused on social studies instruction than on reading/language arts, math and – to a lesser extent – science instruction.

Researchers conducted a review of state policies for social studies and analyzed results from nationally representative surveys of elementary teachers and principals about social studies instruction during the 2021-2022 school year.

“Over the past few decades, school systems have invested less in students’ civic development and more in academic and career preparation as educational priorities,” said Melissa Kay Diliberti, lead author of the report and assistant policy researcher at RAND, a nonprofit, nonpartisan research organization. “Our findings suggest that inadequate state and local infrastructure focused on social studies instruction may have affected what elementary teachers did in their classrooms in 2021-2022.”

For example, 29% of elementary principals surveyed said their schools had not adopted any recommended or required social studies curriculum materials, meaning that their schools or districts had not chosen any curricula to provide to teachers to support such instruction.

In turn, only 16% of elementary teachers surveyed reported using a required textbook for most of their social studies instructional time. More commonly, teachers cobbled together their instructional materials or leaned on self-created materials.

The researchers suggest that, ideally, all elementary social studies policies and guidance – state standards, accountability policies, assessment programs, teacher evaluation, professional learning opportunities, and guidance around materials – would work together to build coherent and strong infrastructure to support teachers’ instruction. This effort requires additional investments at all levels of the U.S. education system, from state policy to investments by school and district leaders themselves.

Other authors of “The Missing Infrastructure for Elementary (K-5) Social Studies Instruction: Findings from the 2022 American Instructional Resources Survey” are Ashley Woo and Julia H. Kaufman.

RAND Education and Labor, a division of RAND, is dedicated to improving education and expanding economic opportunities for all through research and analysis. Its researchers address key policy issues in U.S. and international education systems and labor markets, from pre-kindergarten to retirement planning.