Tuesday, October 17, 2023

 

Study reveals areas of Brazilian Amazon where no ecological research has been done



The findings evidenced high susceptibility to climate change by 2050 in 15%-18% of the areas with the most neglected biodiversity


Peer-Reviewed Publication

FUNDAÇÃO DE AMPARO À PESQUISA DO ESTADO DE SÃO PAULO

Image: Areas of the Brazilian Amazon where no ecological research has been done. Credit: Alexander Lees




Many parts of the Brazilian Amazon are neglected in ecological research, for several reasons, according to an article published in the journal Current Biology. Authored by Joice Ferreira of the Federal University of Pará (UFPA) and colleagues from many countries who also belong to the Synergize Consortium, the article identifies the areas missing from ecological research and the factors behind these gaps, pinpointing opportunities for planning new research investments in the region.

The researchers analyzed data from 7,694 ecological research sites to try to understand how logistics and human influence on the forests could explain the probability of research being done in different parts of the Amazon region. The period analyzed was 2010-20, and the survey covered nine groups of organisms: benthic invertebrates (living on the seabed or in the lowest layers of any water body), heteropterans (true bugs), odonates (dragonflies and damselflies), fish, macrophytes (aquatic plants), birds, woody vegetation, ants, and dung beetles.

“The consortium contacted people who had contributed to databases, standardized inventories and studies involving sampling efforts. Information was thereby compiled on three groups that represent Amazonian biodiversity: vertebrates, invertebrates, and plants in upland forests, flooded forests and aquatic environments – rivers, lakes, etc. This is the first paper published by the group,” said Mario Ribeiro de Moura, a researcher at the State University of Campinas’s Institute of Biology (IB-UNICAMP) in São Paulo, Brazil. He is a co-author of the article and a member of the consortium.

The findings evidenced high susceptibility to climate change by 2050 in 15%-18% of the most neglected areas in the Brazilian Amazon. The least studied areas are also the most threatened in the vicinity of the “deforestation arc”, a swathe of territory extending along the southern, southeastern and eastern borders of Amazonia, mostly in the states of Acre, Amazonas, Maranhão, Mato Grosso, Pará, Rondônia and Tocantins.

The main gaps in Amazonian ecological research were in upland areas. “This was expected and probably reflects the role played by navigable waterways in facilitating access to blackwater and whitewater inundation forest, as well as other aquatic environments,” Moura said.

Not by chance, the least pessimistic scenarios appeared along rivers in northeast Pará and Roraima, southeastern Acre and northern Rondônia. “In these areas, the future impact of climate change will be less severe, and we have more knowledge of the species that live there,” Moura said.

The study was supported by FAPESP via two postdoctoral fellowships in Brazil, one awarded to Raquel de Carvalho and the other to Angélica Faria de Resende. Moura was supported by a Young Investigator Grant and a scholarship in Brazil.

Research biases

The scientists mapped the most neglected areas of the Amazon region in terms of ecological research and superimposed on this map the areas most likely to be affected by climate change, based on a metric they developed to reflect its intensity. Deforestation and degradation data were taken from a recent study published in Science on the drivers of deforestation in the Amazon. The correlations between datasets showed that ecological research in the Amazon is more frequent in already deforested areas than in areas where deforestation is predicted over the next three decades.
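
The overlay-and-correlation logic described above can be illustrated with a minimal, self-contained sketch in Python. Everything in it is hypothetical: the grid cells, thresholds and column names are assumptions made for illustration and are not the consortium's data or code.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(42)
    n_cells = 10_000  # grid cells covering the study region (illustrative)

    cells = pd.DataFrame({
        "research_sites": rng.poisson(0.8, n_cells),            # ecological studies recorded per cell
        "climate_susceptibility": rng.uniform(0, 1, n_cells),   # 0 = low, 1 = high exposure by 2050
        "already_deforested": rng.random(n_cells) < 0.15,
        "predicted_deforestation": rng.random(n_cells) < 0.20,  # projected loss over three decades
    })

    # Which research-neglected cells also sit in the upper tail of climate susceptibility?
    neglected = cells["research_sites"] == 0
    high_risk = cells["climate_susceptibility"] > cells["climate_susceptibility"].quantile(0.85)
    share = (neglected & high_risk).sum() / neglected.sum()
    print(f"Neglected cells that are also highly climate-exposed: {share:.1%}")

    # Is research more frequent where deforestation has already happened than where it is predicted?
    print(cells.groupby("already_deforested")["research_sites"].mean())
    print(cells.groupby("predicted_deforestation")["research_sites"].mean())

In the study itself the equivalent quantities come from georeferenced survey records and published deforestation and climate projections rather than random numbers; the sketch only shows the shape of the comparison.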

“Environmental change is happening at a very fast pace, including climate change and landscape transformation. To understand how these changes affect biodiversity, we need to know what was in a given area before they happened. The Amazon is one of the last significantly conserved refuges of tropical biodiversity and essential to an understanding of the isolated effect of climate change and habitat destruction on biodiversity,” Moura said. “The study highlighted the areas at risk of environmental change in the coming years and not yet explored by scientists. Without sufficient ecological research, we won’t be able to know what’s changing and what’s being lost.”

With regard to logistics, accessibility and distance to research facilities were key predictors of the probability of research being done. “Access is a mixed blessing, as evidenced by the deforestation arc. Easy access enables researchers to reach more areas, so part of this immense arc has been thoroughly studied, but it also enables those responsible for deforestation and other malefactors to reach these areas. Little information is available on the threatened areas at the edges of the deforestation arc,” Moura said.

Access, and hence research probability, increased with proximity to transportation and research facilities for all upland organisms and most representatives of wetlands and aquatic habitats. “The length of the dry season determines ease of access by water. In flooded forest areas, the shorter the dry season, the easier it is to gain access by river, and this increases the likelihood of research. In upland areas, more severe dry seasons facilitate overland access, with less mud and inundation,” Moura said.

Forest degradation and land tenure were also moderate predictors, with consistent importance across all organism groups. Both factors affected ecological research in the same direction: research probability declined slightly in more degraded areas and in Indigenous territories, but increased in conservation units.
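
As a rough illustration of how predictors like accessibility, distance to research facilities, dry-season length, degradation and land tenure can be combined into a research-probability model, here is a minimal Python sketch fitting a logistic regression to synthetic data. The feature names, the synthetic coefficients and the choice of model are assumptions made for the example only, not the Synergize Consortium's dataset or method.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n = 5_000
    X = pd.DataFrame({
        "distance_to_transport_km": rng.uniform(0, 500, n),
        "distance_to_research_facility_km": rng.uniform(0, 800, n),
        "dry_season_months": rng.uniform(1, 6, n),
        "degradation_index": rng.uniform(0, 1, n),
        "indigenous_territory": rng.integers(0, 2, n),
        "conservation_unit": rng.integers(0, 2, n),
    })

    # Synthetic outcome: research is more likely near transport and facilities and in
    # conservation units, slightly less likely in degraded areas and Indigenous
    # territories (mirroring the directions reported above).
    logit = (
        -0.004 * X["distance_to_research_facility_km"]
        - 0.003 * X["distance_to_transport_km"]
        - 0.4 * X["degradation_index"]
        - 0.3 * X["indigenous_territory"]
        + 0.5 * X["conservation_unit"]
    )
    surveyed = rng.random(n) < 1 / (1 + np.exp(-logit))

    model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, surveyed)
    for name, coef in zip(X.columns, model.named_steps["logisticregression"].coef_[0]):
        print(f"{name:35s} {coef:+.3f}")

The signs of the fitted coefficients recover the directions built into the synthetic data, which is all such a sketch can demonstrate; the published analysis works with real survey locations and a richer set of covariates.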

In short, less research is done in degraded areas and Indigenous territories, and more in conservation units. “It’s harder to obtain access to Indigenous communities, or there may be a lack of administrative mechanisms that connect researchers with the bodies that regulate such access and with the communities themselves. We need to improve integration between the parties involved, and above all engage local communities in the knowledge creation process. Far more research goes on in conservation units than Indigenous territories, although both are types of protected area,” Moura said.

In Carvalho’s opinion, this is a distribution problem, since Indigenous territories account for some 23% of the total area of the Brazilian Amazon. “At the same time, several Indigenous territories are the best conserved parts of the Amazon biome. It would be very valuable if we could do research there,” she said.

Novel strategies

According to Moura, the Amazon Rainforest is under-represented in global databases used as a source for research on biodiversity. “As noted in the article, we need to integrate the information we have about the Amazon with global databases. The Synergize Consortium has projects that could contribute to global assessments. The information reviewed for this study mostly complies with the requirements of other databases and could be used to improve the representativeness of Amazonian biodiversity in future research on global change. The consortium plans to use this study as a basis for establishing itself as an important collaborative network for other research groups interested in analyzing environmental changes in the Amazon,” he said.

The Synergize Consortium’s principal investigators are Ferreira, who is affiliated with EMBRAPA Amazônia Oriental, a unit of the Brazilian Agricultural Research Corporation (EMBRAPA); and Filipe França, a researcher at the University of Bristol in the United Kingdom. Jos Barlow, a professor at Lancaster University, also in the UK, is a co-author of the article and a member of the consortium’s steering committee.

Moura believes the group’s findings can be used to develop novel funding strategies for the Amazon. “Once you’ve identified the gaps, you can target them for investment in conservation and research, or give more weight to research in these areas in future calls for proposals. Public policy and action plans can take these results into consideration, especially as far as biodiversity monitoring and inventorying are concerned,” he said.

About São Paulo Research Foundation (FAPESP)

The São Paulo Research Foundation (FAPESP) is a public institution with the mission of supporting scientific research in all fields of knowledge by awarding scholarships, fellowships and grants to investigators linked with higher education and research institutions in the State of São Paulo, Brazil. FAPESP is aware that the very best research can only be done by working with the best researchers internationally. Therefore, it has established partnerships with funding agencies, higher education, private companies, and research organizations in other countries known for the quality of their research and has been encouraging scientists funded by its grants to further develop their international collaboration. You can learn more about FAPESP at www.fapesp.br/en and visit FAPESP news agency at www.agencia.fapesp.br/en to keep updated with the latest scientific breakthroughs FAPESP helps achieve through its many programs, awards and research centers. You may also subscribe to FAPESP news agency at http://agencia.fapesp.br/subscribe.

 

Transforming fossil fuels: University of Oklahoma, Kansas State University successfully complete DOE project


Grant and Award Announcement

UNIVERSITY OF OKLAHOMA

Image: Pejman Kazempoor, a professor in the University of Oklahoma School of Aerospace and Mechanical Engineering in the Gallogly College of Engineering, leads OU’s effort on a project funded by the National Energy Technology Lab, Office of Fossil Energy. Credit: University of Oklahoma




NORMAN, Okla. (Oct. 17, 2023) -- The University of Oklahoma, with support from Kansas State University, has led a Department of Energy project to pioneer a new generation of reversible electrochemical cells. The cells have the potential to revolutionize energy storage by integrating seamlessly with fossil fuel assets.

The project’s objective was to conduct an extensive study aimed at developing an energy storage technology capable of efficiently converting carbon dioxide emissions captured from fossil fuel assets into valuable fuels, such as methane, says Pejman Kazempoor, Ph.D., a professor in the OU School of Aerospace and Mechanical Engineering, Gallogly College of Engineering, who leads OU’s contribution to the project. 

“One of the standout features of this reversible electrochemical technology is its ability to solve existing problems in the energy storage landscape, particularly its exceptionally high round-trip efficiency and durability. Unlike conventional energy storage technologies, this solution promises to minimize energy loss during the storage and retrieval process, making it a game-changer for the industry,” Kazempoor said.

The success of the OU and KSU collaboration has garnered international attention, with their achievements prominently featured in journals such as Nature Energy and Joule. 

“The implications of this achievement extend far beyond the laboratory, as it opens up exciting opportunities for a more efficient and sustainable energy future,” said Abu Yousuf, Ph.D., a postdoctoral fellow in the OU School of Aerospace and Mechanical Engineering.

Kazempoor says that by seamlessly integrating with fossil fuel assets and enabling the conversion of CO2 emissions into usable fuels, the technology holds the potential to usher in a new era of energy storage. The implications for reducing carbon emissions and enhancing energy efficiency are profound, offering hope for a more sustainable and environmentally friendly energy landscape, he adds. 

About the project: The agreement, in effect from Feb. 28, 2021, to Aug. 30, 2023, was funded by the National Energy Technology Lab, Office of Fossil Energy. The project, titled “Reversible Methane Electrochemical Reactors as Efficient Energy Storage for Fossil Fuel Power,” received a grant of $312,504 under Award Number DE-FE0032005.

About the Gallogly College of Engineering: Engineering has been part of the University of Oklahoma since 1908. Today, the Gallogly College of Engineering is organized into seven schools and is one of the largest colleges on the Norman campus. 

 

New polymer membranes, AI predictions could dramatically reduce energy, water use in oil refining


The membranes would improve distillation processes that account for 1% of the world’s energy use

Peer-Reviewed Publication

GEORGIA INSTITUTE OF TECHNOLOGY

Image: A sample of a DUCKY polymer membrane researchers created to perform the initial separation of crude oils using significantly less energy. Credit: Candler Hobbs, Georgia Institute of Technology




A new kind of polymer membrane created by researchers at Georgia Tech could reshape how refineries process crude oil, dramatically reducing the energy and water required while extracting even more useful materials.

The so-called DUCKY polymers — more on the unusual name in a minute — are reported Oct. 16 in Nature Materials. And they’re just the beginning for the team of Georgia Tech chemists, chemical engineers, and materials scientists. They also have created artificial intelligence tools to predict the performance of these kinds of polymer membranes, which could accelerate development of new ones.

The implications are stark: the initial separation of crude oil components is responsible for roughly 1% of energy used across the globe. What’s more, the membrane separation technology the researchers are developing could have several uses, from biofuels and biodegradable plastics to pulp and paper products.

“We're establishing concepts here that we can then use with different molecules or polymers, but we apply them to crude oil because that's the most challenging target right now,” said M.G. Finn, professor and James A. Carlos Family Chair in the School of Chemistry and Biochemistry.

Crude oil in its raw state includes thousands of compounds that have to be processed and refined to produce useful materials — gas and other fuels, as well as plastics, textiles, food additives, medical products, and more. Squeezing out the valuable stuff involves dozens of steps, but it starts with distillation, a water- and energy-intensive process.

Researchers have been trying to develop membranes to do that work instead, filtering out the desirable molecules and skipping all the boiling and cooling.

“Crude oil is an enormously important feedstock for almost all aspects of life, and most people don't think about how it's processed,” said Ryan Lively, Thomas C. DeLoach Jr. Professor in the School of Chemical and Biomolecular Engineering. “These distillation systems are massive water consumers, and the membranes simply are not. They're not using heat or combustion. They just use electricity. You could ostensibly run it off of a wind turbine, if you wanted. It’s just a fundamentally different way of doing a separation.”

What makes the team’s new membrane formula so powerful is a new family of polymers. The researchers used building blocks called spirocyclic monomers that assemble into chains with lots of 90-degree turns, forming a kinky material that doesn’t compress easily and forms pores that selectively bind and permit desirable molecules to pass through. The polymers are not rigid, which means they’re easier to make in large quantities. They also have a well-controlled flexibility or mobility that allows pores of the right filtering structure to come and go over time.

The DUCKY polymers are created through a chemical reaction that’s easy to produce at a scale that would be useful for industrial purposes. It’s a flavor of a Nobel Prize-winning family of reactions called click chemistry, and that’s what gives the polymers their name. The reaction is called copper-catalyzed azide-alkyne cycloaddition — abbreviated CuAAC and pronounced “quack.” Thus: DUCKY polymers.

In isolation, the three key characteristics of the polymer membranes aren’t new; it’s their unique combination that makes them novel and effective, Finn said.

The research team included scientists at ExxonMobil, who discovered just how effective the membranes could be. The company’s scientists took the crudest of the crude oil components — the sludge left at the bottom after the distillation process — and pushed it through one of the membranes. The process extracted even more valuable materials.

“That's actually the business case for a lot of the people who process crude oils. They want to know what they can do that’s new. Can a membrane make something new that the distillation column can't?” Lively said. “Of course, our secret motivation is to reduce energy, carbon, and water footprints, but if we can help them make new products at the same time, that's a win-win.”

Predicting such outcomes is one way the team’s AI models can come into play. In a related study recently published in Nature Communications, Lively, Finn, and researchers in Rampi Ramprasad’s Georgia Tech lab described using machine learning algorithms and mass transport simulations to predict the performance of polymer membranes in complex separations.

“This entire pipeline, I think, is a significant development. And it's also the first step toward actual materials design,” said Ramprasad, professor and Michael E. Tennenbaum Family Chair in the School of Materials Science and Engineering. “We call this a ‘forward problem,’ meaning you have a material and a mixture that goes in — what comes out? That's a prediction problem. What we want to do eventually is to design new polymers that achieve a certain target permeation performance.”

Complex mixtures like crude oil might have hundreds or thousands of components, so accurately describing each compound in mathematical terms, how it interacts with the membrane, and extrapolating the outcome is “non-trivial,” as Ramprasad put it.

Training the algorithms also involved combing through all the experimental literature on solvent diffusion through polymers to build an enormous dataset. But, just as membranes themselves could reshape refining, knowing ahead of time how a proposed polymer membrane might work would accelerate a materials design process that’s basically trial-and-error now, Ramprasad said.
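
To make the forward problem concrete, here is a minimal Python sketch of the general idea: tabulate descriptors of polymers and solvents, train a regression model on measured (here, synthetic) diffusivities, and then predict values for held-out combinations. The descriptor names, the toy data and the gradient-boosting model are assumptions made for illustration; the pipeline reported in Nature Communications couples machine learning with mass transport simulations and is considerably more involved.

    import numpy as np
    import pandas as pd
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.metrics import mean_absolute_error
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    n = 2_000
    data = pd.DataFrame({
        "polymer_free_volume": rng.uniform(0.05, 0.35, n),
        "polymer_glass_transition_K": rng.uniform(250, 600, n),
        "solvent_molar_volume": rng.uniform(50, 300, n),
        "solvent_polarity": rng.uniform(0, 1, n),
        "temperature_K": rng.uniform(280, 400, n),
    })
    # Toy stand-in for literature measurements: log-diffusivity rises with free
    # volume and temperature and falls with solvent size, plus noise.
    data["log_diffusivity"] = (
        8 * data["polymer_free_volume"]
        + 0.01 * data["temperature_K"]
        - 0.02 * data["solvent_molar_volume"]
        + rng.normal(0, 0.3, n)
    )

    X = data.drop(columns="log_diffusivity")
    y = data["log_diffusivity"]
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

    model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
    error = mean_absolute_error(y_test, model.predict(X_test))
    print(f"Held-out mean absolute error in log-diffusivity: {error:.3f}")

The held-out error plays the role of the validation against measurements described below; with real crude-oil mixtures each component and its interaction with the membrane would need its own descriptors.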

“The default approach is to make the material and test it, and that takes time. This data-driven or machine learning-based approach uses past knowledge in a very efficient manner,” he said. “It’s a digital partner: You’re not guaranteed an exact prediction, because the model is limited by the space spanned by the data you use to train it. But it can extrapolate a little bit and it can take you in new directions, potentially. You can do an initial screening by searching through vast chemical spaces and make go, no-go decisions up front.”

Lively said he’d long been a skeptic about the ability of machine learning tools to tackle the kinds of complex separations he works with.

“I always said, ‘I don't think you can predict the complexity of transport through polymer membranes. The systems are too big; the physics are too complicated. Can't do it.’”

But then he met Ramprasad: “Rather than just be a naysayer, Rampi and I took a stab at it with a couple of undergrads, built this big database, and dang. Actually, you can do it,” Lively said.

Developing the AI tools also involved comparing the algorithms’ predictions to actual results, including with the DUCKY polymer membranes. The experiments showed the AI models’ predictions were within 6% to 7% of actual measurements.

“It's astonishing,” Finn said. “My career has been spent trying to predict what molecules are going to do. The machine learning approach, and Rampi’s execution of it, is just completely revolutionary.”

This research was supported by the U.S. Department of Energy, grant No. DE-EE0007888; the European Research Council, grant No. 758370; the Kwanjeong Educational Foundation; a Royal Society University Research Fellowship; and the ExxonMobil Technology and Engineering Company. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of any funding agency.

 

The biggest barrier to getting fossil fuel workers green jobs isn’t skills — it’s location


LOCATION, LOCATION, LOCATION

Peer-Reviewed Publication

UNIVERSITY OF PITTSBURGH




Between the Inflation Reduction Act and the EU’s Just Transition Mechanism, both the United States and Europe are poised to put tens of billions of dollars toward creating green jobs. At the same time, there are conversations about how to ensure workers in the current fossil fuel industry have the skills to participate in this green revolution.

But new research published in Nature Communications shows many fossil fuel workers have the right skills already — the problem is that those new green jobs likely won’t be in the right place. The results spell a message for those planning a greener economy: If all they think about is reskilling, their efforts are unlikely to bear fruit.

“Our results challenge the prevailing narrative that we hear from policymakers,” said Morgan Frank, an assistant professor in Pitt’s School of Computing and Information and corresponding author on the study. “Yes, fossil fuel workers do appear to have the skills for green jobs. But it doesn’t look like fossil fuel workers historically have had a lot of spatial mobility over the course of their careers.”

The question of what will happen to fossil fuel workers during a transition to green energy carries high stakes: in the U.S. alone, a phase-out of fossil fuels by 2050 would displace more than 1.7 million workers. To address that question, Frank and his colleagues drew on data from the U.S. Census Bureau and the Bureau of Labor Statistics that map out the skills workers make use of in these different industries and how the workers moved between states and industries in the past.

There was a substantial overlap between the skills workers use in fossil fuel extraction and those they would need for green jobs — but then, the team mapped out the current locations of green energy plants across the country and compared them against current fossil fuel hotspots. Fossil fuel extraction happens largely in Appalachian states, Texas, New Mexico and the Midwest. Solar energy plants, to take one example, are clustered along the coasts.

“You can just eyeball this and see there’s no overlap,” said Frank. “So it’s very discouraging, even with this very simple version of the analysis.”

Running more complex models that predicted future green job potential delivered similar results. According to their simulation, in the 15 biggest regions for fossil fuel extraction, less than 1.5% of fossil fuel workers are likely to transition to green jobs.
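
The two-part test implied here, whether the skills match and whether the work is within reach, can be sketched in a few lines of Python. The skill vectors, the coordinates and the thresholds below are invented for illustration; the study itself draws on Census Bureau and Bureau of Labor Statistics records and a more detailed simulation.

    import numpy as np

    # Occupation skill profiles over a shared skill taxonomy (illustrative weights).
    fossil_skills = np.array([0.30, 0.25, 0.20, 0.15, 0.10])  # e.g. heavy machinery, safety, ...
    green_skills = np.array([0.28, 0.20, 0.22, 0.18, 0.12])

    def cosine_similarity(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    def haversine_km(lat1, lon1, lat2, lon2):
        lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
        a = np.sin((lat2 - lat1) / 2) ** 2 + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2
        return 6371 * 2 * np.arcsin(np.sqrt(a))

    skill_match = cosine_similarity(fossil_skills, green_skills)
    # Approximate distance from a Texas oil basin to a coastal solar hub (illustrative coordinates).
    distance_km = haversine_km(31.9, -102.1, 35.4, -119.0)

    # A transition is plausible only if both conditions hold.
    plausible = skill_match > 0.9 and distance_km < 100
    print(f"skill match = {skill_match:.2f}, distance = {distance_km:.0f} km, plausible: {plausible}")

With numbers like these the skills line up well but the distance does not, which is the pattern the paper reports.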

Political scientists have long suspected skills are only one barrier to a transition to green energy. Although previous studies tackled similar questions by, for instance, interviewing fossil fuel workers, this is the first study to put detailed numbers to the phenomenon.

“That allowed us to empirically back up this hunch,” said Frank. “The number one bit of feedback that I've gotten on this project, especially from political scientists, is, ‘Wow, I didn’t know that this data is just out there.’”

That’s not to say reskilling isn’t important — there’s not a perfect match in skills between fossil fuel workers and green energy workers. And there’s another issue: Many jobs in green energy would only last during the construction of a facility. It takes far fewer workers to maintain a solar plant once it’s up and running.

The team, which included Junghyun Lim of the University of North Carolina Chapel Hill and Michaël Aklin of the Swiss Federal Institute of Technology Lausanne, also performed additional analyses to clarify what kinds of investments are likely to benefit current fossil fuel extraction workers, but there are many more aspects the researchers couldn’t address with their data, such as social factors that prevent workers from relocating. They plan to include those in future studies, pulling in surveys and even data from job search sites like LinkedIn and Indeed to figure out how policymakers might address the barriers to transitioning between industries.  

What’s clear now is that the picture is more complicated than the prevailing political conversations indicate. One solution, Frank says, is that policymakers could consider targeting current regions of fossil fuel extraction for investment toward other industries as well.

“If you take a broader view, other industries like construction or manufacturing involve heavy machinery, working with tools, building materials and infrastructure,” said Frank. “It could be an easier transition for fossil fuel workers to make. In terms of skills and in terms of distance, it seems much more plausible.”

 

Could you correctly identify someone wearing sunglasses from a distance of 20 meters?


Peer-Reviewed Publication

ÅBO AKADEMI UNIVERSITY




This comprehensive study focused on three key factors: distance, lighting and facial masking, and their impact on the ability of eyewitnesses to later correctly identify individuals they have seen. In the study, eyewitnesses were asked to identify perpetrators they had seen from various distances (5, 12.5 or 20 metres) and in different lighting conditions (daylight or deep twilight). The perpetrators were shown both with and without facial masking (sunglasses, hood, or both sunglasses and hood).

The key finding of the study is that distance plays a crucial role – the longer the distance, the harder it is to later identify a person correctly. Moreover, facial masking presents a challenge even in good lighting conditions and at close proximity. Compared to other facial masking methods, sunglasses had the most negative impact on identification accuracy.

– Facial masking has a significant effect on the ability of eyewitnesses to correctly identify a perpetrator. A distance of five metres in clear daylight already presents a substantial challenge for later identifying a perpetrator who was wearing sunglasses. When the distance is 20 metres and the lighting is low (deep twilight), it becomes so difficult to later identify a perpetrator that facial masking has very little effect. If an eyewitness sees a perpetrator wearing sunglasses from a distance of 20 metres, our findings indicate that it is highly unlikely that the eyewitness would be able to correctly identify the perpetrator at a later time, says Thomas J. Nyman, Assistant Professor of Practice in Psychology at New York University Shanghai.

Julia Korkman, Professor of Practice in Legal Psychology at Åbo Akademi University, emphasises that with over 1300 participants aged from 5 to 90 years, the study is a unique example of citizen science. This makes the results robust and potentially applicable on a global scale.

– These results mean that we have a better starting point for assessing the value of eyewitness identification in relation to distance, lighting and facial masking, she says.

Article: “The masked villain: the effects of facial masking, distance, lighting, and eyewitness age on eyewitness identification accuracy” (Psychology, Crime & Law).

Link: https://www.tandfonline.com/doi/full/10.1080/1068316X.2023.2242999

 

 

Nature publication: Biochemists clear up a decades-old misconception about a key metabolic pathway


Peer-Reviewed Publication

SAARLAND UNIVERSITY

Image: Prof. Dr. Martin van der Laan. Credit: Thorsten Mohr/Saarland University




Imagine that the brightest minds in a particular discipline all agree that the object of their research looks like a triangle. What happens, then, when someone turns up and says: 'No, it's actually a square'? 'They'll say he's cuckoo!' said Martin van der Laan, whose unequivocal response directly reflects the experience of his colleague Eunyong Park from the University of California, Berkeley, about two years ago. 'He published a manuscript in which he depicted a structure of the TIM23-TIM17 complex that was quite different from what nearly every expert in the field had assumed. No biochemical experiment of the previous 25 years seemed to fit with this new structure,' explained van der Laan, professor of medical biochemistry at Saarland University, whose main area of research is mitochondria – the 'powerhouses' that drive cellular metabolism. The experts were unanimous: their research colleague in California must be mistaken.

But it turns out he wasn't. Up until recently, scientific doctrine held that the protein complex TIM23 (TIM stands for 'translocase of the inner membrane') forms a tunnel-like structure through which large protein molecules can be transported into the mitochondria from other parts of the cell – something that van der Laan demonstrated by placing his two cupped hands together with their fingertips touching. This hollow channel through the mitochondrial envelope acts like a keyhole that will only accept a molecule that has the right key. When such a molecule approaches, it gets pulled into the channel and transported, together with energy-supplying auxiliary proteins, into the interior of the mitochondrion. That, at least, was the established scientific paradigm for decades.

To help explain why this model is not correct, Martin van der Laan placed his cupped hands back to back, with his fingers pointing outwards. It was this new structure, which Eunyong Park had proposed on the basis of high-resolution cryo-electron microscopy data, that seemed so utterly surprising. The structure of the transport complex looked completely different from what had been assumed for decades. In this new structure, one of the hands represents TIM23, while the other is its 'fraternal twin' TIM17. But up until then, the TIM17 protein had not played any significant part in descriptions of the protein transport mechanism in mitochondria. It was thought that TIM17 had more of a supporting or regulatory role in the transport of proteins through the mitochondrial membrane, and that the real star of the team was TIM23.

It turns out that's not the case. Martin van der Laan and his colleague Nils Wiedemann from Freiburg have been collaborating closely for almost two decades. Their research teams recently took another close look at the TIM17-TIM23 complex and have now mapped the functional organization of the complex in great detail using advanced and highly precise biochemical methods. A key feature of this work was the re-evaluation of old data that previously had made little sense, but that now fitted perfectly with this revolutionary new picture of the TIM complex.

The results of the research work carried out in Homburg and Freiburg have completely upturned the previous long-held assumptions about how proteins get inside mitochondria, reinforcing the conclusions drawn from the structural investigations by the research group at UC Berkeley. Martin van der Laan summarized these new findings: 'Taking the new structural and biochemical data together, we can now see that proteins migrate into the mitochondria along a pan-shaped membrane opening formed by TIM17 and not via a TIM23 channel structure.' The supporting actor has suddenly become the star of the show.

According to van der Laan, it was an ingenious experimental trick that has been crucial to this biochemical paradigm shift. 'We made an artificial mitochondrial protein that gets stuck in the TIM17/23 transport pore like a cork getting stuck in the neck of a wine bottle. We then modified the trapped protein complex so that free radicals – highly reactive chemical groups – were released on our artificially engineered protein. The reaction of these free radicals with their molecular surroundings is something that we can observe with extremely high spatial resolution. What we found was that the free radicals were only active in the TIM17 half-channel,' explained Professor van der Laan. This could only mean that the mitochondrial proteins migrate across the envelope membrane in close contact with the TIM17 structure, and not via a TIM23 channel, which is the mechanism presented in practically all the standard textbooks on biochemistry and cell biology.

That alone is a pretty revolutionary discovery. But why is this finding of such importance – and not just to a handful of specialist researchers around the world? As Martin van der Laan puts it: 'Mitochondrial dysfunction can result in severe degenerative and metabolic diseases and is known to be involved in the development of Parkinson's disease, diabetes and certain types of cancer.' Improving our understanding of how proteins actually enter the mitochondria – the 'powerhouses' that maintain cellular energy supply – could facilitate the development of highly effective drugs that are better able to treat those suffering from such serious diseases.

This article is the second paper that Martin van der Laan and his research group have published in Nature, one of the world's top scientific journals.

 

 

Study: Struggling students who repeat third grade see improved achievement


Researchers found no evidence that retention caused disciplinary or attendance problems

Peer-Reviewed Publication

AMERICAN EDUCATIONAL RESEARCH ASSOCIATION




Washington, October 12, 2023—Third-grade retention can increase the reading and math scores of struggling students, with positive effects lasting into middle school, according to new research released today. The study, by NaYoung Hwang at the University of New Hampshire and Cory Koedel at the University of Missouri, was published today in Educational Evaluation and Policy Analysis, a peer-reviewed journal of the American Educational Research Association.


Despite mixed reviews among policymakers, researchers, and educators, grade retention policies are on the rise in the United States. Currently, 25 states and the District of Columbia require or allow school districts to retain students who are not proficient in reading at the end of third grade, although in recent years some states have begun to revisit or rescind their policies.

For their study, Hwang and Koedel examined Indiana’s test-based retention policy, using statewide data on third graders through seventh graders from 2011–12 to 2016–17. The authors found large increases in students’ reading and math scores for up to five years after the retention event. There was also no evidence of third-grade retention resulting in disciplinary or attendance problems. Moreover, the positive effects were consistent regardless of student gender, race/ethnicity, or family income level.

“Our findings show that repeating third grade can substantially improve educational outcomes without causing certain behavioral issues,” said NaYoung Hwang, an assistant professor of education at the University of New Hampshire. “Our results corroborate an emerging theme in the grade-retention literature that timing greatly affects how grade retention impacts student outcomes.”

Prior research has found that the impact of grade retention on students is more positive—although not entirely positive—when it occurs in early grades, with most of the negative impacts found when retention happens in the sixth grade or higher. Previous studies have also found some evidence that retention can have a harmful impact on the suspension rates of boys and Black and Hispanic students, who are at a higher baseline risk of disciplinary action. 

“There are concerns about possible negative impacts of retention on students, especially those most at risk, and it’s important that researchers continue to examine how these policies affect students of different backgrounds, grade levels, and geographic regions,” said Hwang.

Hwang and Koedel stressed that the academic and socio-emotional support provided to these students by their schools is crucial to their success.

“It’s important that scholars, educators, and policymakers continue to assess and highlight best practices for supporting the success of these students, and that states and districts learn from one another,” Hwang said.

The authors noted their study does not examine the longer run effects of grade retention, including outcomes in high school, or the effects on other non-academic outcomes, such as self-esteem, peer relationships, or confidence.  

Study citation: Hwang, N. & Koedel, C. (2023). Helping or hurting: The effects of retention in the third grade on student outcomes. Educational Evaluation and Policy Analysis. Prepublished October 12, 2023. http://doi.org/10.3102/01623737231197639

###

About AERA
The American Educational Research Association (AERA) is the largest national interdisciplinary research association devoted to the scientific study of education and learning. Founded in 1916, AERA advances knowledge about education, encourages scholarly inquiry related to education, and promotes the use of research to improve education and serve the public good. Find AERA on Facebook, Twitter, LinkedIn, and Instagram.