Friday, February 17, 2023

Physicists solve durability issue in next-generation solar cells

The UToledo discovery clears the path for the commercialization of new photovoltaic technology to replace silicon and lower the cost of solar electricity

Peer-Reviewed Publication

UNIVERSITY OF TOLEDO

Image: Dr. Yanfa Yan, UToledo Distinguished University Professor of physics and a member of the UToledo Wright Center for Photovoltaics Innovation and Commercialization.

Credit: Daniel Miller, The University of Toledo

Physicists in the U.S. have cleared a major hurdle standing in the way of commercializing solar cells made with halide perovskites, a lower-cost, higher-efficiency alternative to silicon for generating electricity from the sun.

Published in the journal Science, the clean energy research led by The University of Toledo in collaboration with the University of Washington, University of Toronto, Northwestern University and Swiss Federal Laboratories for Materials Science and Technology solved the problem with the durability of perovskite solar cells, taking the technology one step closer to powering solar panels in the consumer market.

“Perovskite solar cells offer a route to lowering the cost of solar electricity given their high power conversion efficiencies and low manufacturing cost,” said Dr. Yanfa Yan, UToledo Distinguished University Professor of physics and a member of the UToledo Wright Center for Photovoltaics Innovation and Commercialization. “However, we needed to strengthen the emerging solar cell technology’s endurance during outdoor operation.”

The technology needs to survive for decades outdoors in all kinds of weather and temperatures without corroding or breaking down.

“This challenge is no longer a roadblock to deploying the potential of perovskite solar cells,” Yan said. “Our breakthrough work improved device stability and presents ways of achieving success after a decade of research and development.”

The team discovered the ingredient that enhances adhesion and mechanical toughness.

Researchers experimentally demonstrated that perovskite solar cells treated with 1,3-bis(diphenylphosphino)propane (DPPP), a diphosphine Lewis base molecule, retained a high power conversion efficiency and exhibited superior durability after continuous operation under simulated sun illumination for more than 3,500 hours, or more than 145 days.

They used what is called one sun illumination, which is equivalent to outdoor sunlight.

“Phosphine-containing Lewis base molecules with two electron-donating atoms have a strong binding with the perovskite surface,” Yan said. “We saw the robust beneficial effects on perovskite film quality and device performance when we treated the perovskite solar cells with DPPP.”

“DPPP is also a commercialized product with low cost and easy accessibility, which make it suitable for the commercialization of perovskite solar cells,” said Dr. Zhaoning Song, a research assistant professor in Yan’s lab at UToledo and one of the authors on the new paper.

Researchers say the next step to move the technology forward is to employ their findings to make perovskite panels stable.

Dr. Chongwen Li, the first author of the study and a UToledo alumnus, worked with Yan as a graduate student. Li earned his Ph.D. in physics from UToledo in 2020. He is a postdoctoral researcher at the University of Toronto.

“Continuing to improve the stability of perovskite solar cells is a crucial priority for the ongoing decarbonization of the world’s economy,” Li said. “After successfully demonstrating that DPPP improves the stability of perovskite solar cells, we are now applying it to large-area perovskite solar panels and moving the prototype device toward commercialization.”

UToledo has been a trailblazer in solar energy research and development for more than 30 years.

It has been a decade since Yan’s team at UToledo identified the ideal properties of perovskites, compound materials with a special crystal structure, and began focusing their efforts on bringing together two different solar cells to increase the total electrical power generated by using two different parts of the sun’s spectrum.

In November, a team of scientists from UToledo, the University of Toronto and Northwestern University collaborated to create an all-perovskite tandem solar cell with record-setting voltage. The research was published in the journal Nature.

“Northwest Ohio is a global leader in solar technology, and The University of Toledo has been at the forefront of breakthrough after breakthrough. Investment and collaboration between the Department of Energy and world-class research professionals at our universities continues to pay dividends, as working men and women benefit from a growing solar industry,” said Congresswoman Marcy Kaptur (OH-09), ranking member of the House Appropriations Subcommittee on Energy and Water Development. “As a leading member on Energy and Water Development, I’ll continue to champion smart investment to power American energy independence.”

The Wright Center for Photovoltaics Innovation and Commercialization was created at UToledo in 2007 to support solar energy research and manufacturing with $18.6 million in support from the Ohio Department of Development, along with matching contributions of $30 million from federal agencies, universities and industrial partners.

Image: UToledo perovskite solar cells ready for the durability test.

U.S. unprepared for dangers posed by zoonotic diseases, new analysis concludes

Editorial in "Science" calls for integrating multiple agencies to better combat threats

Peer-Reviewed Publication

NEW YORK UNIVERSITY

The United States, the largest importer of wildlife in the world, is not prepared for future spread of animal-borne, or zoonotic, diseases due to gaps among governmental agencies designed to combat these threats, concludes a new analysis by researchers at Harvard Law School and New York University. The authors call for a “One Health” approach, integrating multiple agencies in order to better govern human-animal interactions.

The editorial, “Blind spots in biodefense,” which appears in the journal Science, is authored by Ann Linder, a research fellow at Harvard Law School’s Brooks McCormick Jr. Animal Law & Policy Program, and Dale Jamieson, a professor at New York University’s Center for Environmental and Animal Protection in the Department of Environmental Studies.

Linder and Jamieson note that the Biden administration’s recent release of its National Biodefense Strategy (NBS-22), the first update since the COVID-19 pandemic began, frames threats as largely external to the United States. 

“NBS-22 focuses primarily on bioterrorism and laboratory accidents, neglecting threats posed by routine practices of animal use and production inside the United States,” they write. 

This oversight is significant, Linder and Jamieson observe, given the United States’ past and present record at the human-animal interface:

  • More zoonotic diseases originated in the United States than in any other country during the second half of the 20th century. 
  • In 2022, the U.S. processed more than 10 billion livestock animals, the largest number ever recorded and an increase of 204 million over 2021.
  • The ongoing H5N1 avian influenza outbreak has left 58 million animals dead in backyard chicken coops and industrial farms in the U.S.
  • Since 2011, the U.S. has recorded more swine-origin influenza infections than any other country. Most occurred at state and county fairs, which attract 150 million visitors each year and where an estimated 18% of swine have tested positive. 

Moreover, they add, the current patchwork of siloed agencies and authorities is marked by a lack of coordination, leaving significant gaps and areas of underregulation. In fact, of the many agencies that govern food animal production, the U.S. Department of Agriculture is the most important, but it has no authority to regulate on-farm animal production.

The authors call for rebuilding from the ground up the U.S. regulatory system in order to combat zoonotic disease risk.

“What is needed is not simply for agencies to do their jobs better or to paper over the gaps, but a fundamental restructuring of the way that human–animal interfaces are governed,” Linder and Jamieson urge. “A One Health approach, which NBS-22 claims as its guiding principle, would take the health of other living things not merely as the occasional means or obstacles to human health, but as continuous with it. The first step in implementing such an approach would be to create a high-level process for integrating the broken mosaic of multiple agencies, with their unclear and sometimes competing mandates, into an effective, comprehensive regime.”

The editorial is based on research from the Live Animal Markets Project, which is examining global policy responses to animal markets and their role in zoonotic disease transmission. The project includes 15 individual country case studies involving local collaborators, partner institutions, and members of the core research team. The project aims to provide a comprehensive assessment that will aid policymakers, contribute to public education about zoonotic risks, and support the human health and animal protection communities. The project is led by researchers from Harvard Law School’s Brooks McCormick Jr. Animal Law & Policy Program, and New York University’s Center for Environmental and Animal Protection, and involves researchers and institutions from around the world. Kristen Stilt, Arthur Caplan, Chris Green, Bonnie Nadzam, and Valerie Wilson McCarthy contributed to this editorial.

Earthquake scientists have a new tool in the race to find the next big one

Peer-Reviewed Publication

UNIVERSITY OF TEXAS AT AUSTIN

Image: A seismograph records seismic activity near the San Andreas Fault. New research from The University of Texas at Austin Jackson School of Geosciences could aid in predicting the world’s most powerful earthquakes.

Credit: Ray_Explores/Flickr, https://flickr.com/photos/raybouk/8201310617/

An everyday quirk of physics could be an important missing piece in scientists' efforts to predict the world’s most powerful earthquakes. 

In a study published in the journal Science, researchers at The University of Texas at Austin discovered that a frictional phenomenon could be key to understanding when and how violently faults move. That’s because the phenomenon, which explains why it takes more effort to shove a heavy box from a standstill than it does to keep it moving, governs how quickly the fault surfaces bond together, or heal, after an earthquake. A fault that is slow to heal is more likely to move harmlessly, while one that heals quickly is more likely to stick until it breaks in a large, damaging earthquake.

That alone won’t allow scientists to predict when the next big one will strike — the forces behind large earthquakes are too complex — but it does give researchers a valuable new way to investigate the causes of, and the potential for, a large, damaging earthquake, the authors said.
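The static-versus-kinetic friction behavior described here is often idealized with a logarithmic healing law: the longer two fault surfaces sit in stationary contact, the more strength they regain. The sketch below is a generic illustration with made-up parameter values, not the model or data from the study:

```python
import math

def frictional_healing(hold_time_s, b=0.01, t_c=1.0):
    """Illustrative logarithmic healing law: the gain in static friction
    (delta-mu) grows with the log of the time a fault surface is held in
    stationary contact. b (healing rate) and t_c (cutoff time) are
    hypothetical example values, not measurements."""
    return b * math.log(1 + hold_time_s / t_c)

# A fast-healing fault (larger b) regains strength between slips and is
# more likely to stick, then break in a large earthquake; a slow-healing
# fault (smaller b) tends to creep or slip harmlessly.
one_year = 3.15e7  # seconds
print(f"fast-healing fault: delta-mu = {frictional_healing(one_year, b=0.02):.3f}")
print(f"slow-healing fault: delta-mu = {frictional_healing(one_year, b=0.002):.4f}")
```

The qualitative point matches the article: the healing rate, not the fault's strength alone, separates faults that creep quietly from faults that store stress until failure.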

“The same physics and logic should apply to all different kinds of faults around the world,” said the study’s co-lead author Demian Saffer, director of the University of Texas Institute for Geophysics at the Jackson School of Geosciences. “With the right samples and field observations we can now start to make testable predictions about how big and how often large seismic slip events might occur on other major faults, like Cascadia in the Pacific Northwest.”

To make the discovery, researchers devised a test that combined rocks from a well-studied fault off the coast of New Zealand with a computer model, successfully calculating that a harmless kind of “slow motion” earthquake would happen every few years because the clay-rich rocks within the fault are very slow to heal.

The rock samples the researchers tested were drilled from about half a mile under the seafloor in a fault in New Zealand. They squeezed the fault zone rocks in a hydraulic press and found that they were very slow to heal and slipped easily. When they plugged the rock data into a computer model of the fault, the result was a small, slow-motion tremor every two years, a near exact match with observations from the New Zealand fault.

The researchers think the clay-rich rocks, which are common at many large faults, could be regulating earthquakes by allowing plates to slip quietly past each other, which limits the buildup of stress. The discovery could be used to determine whether a fault is prone to slipping in large, damaging earthquakes, said study co-lead Srisharan Shreedharan, affiliate researcher at the University of Texas Institute for Geophysics and assistant professor at Utah State University.

“This doesn't get us any closer to actually predicting earthquakes, but it does tell us whether a fault is likely to slip silently with no earthquakes, or have large ground-shaking earthquakes,” he said.

At Cascadia, there is little evidence of shallow, slow-motion tremors. That’s one of the reasons the Pacific Northwest Seismic Network wants to place sensors across key areas of the fault. The new study gives them the framework to do so, said network Director Harold Tobin.

“We want to zero in on the processes in the shallow part of the fault because that’s what governs the size of the tsunami,” said Tobin, who was not part of the study. “Fault healing doesn’t explain everything, but it does give us a window into the working of subduction zone faults that we didn’t have before.”

The research was funded by the University of Texas Institute for Geophysics, the International Ocean Discovery Program, and New Zealand’s GNS Science. The New Zealand rock samples were gathered during a 2018 scientific ocean drilling mission co-led by Saffer and Laura Wallace, a research scientist at University of Texas Institute for Geophysics and GNS Science in New Zealand. Coauthors included Wallace and Charles Williams, also at GNS Science, who collaborated on the study’s computer modeling.

Demian Saffer, director of the University of Texas Institute for Geophysics (UTIG) and Laura Wallace, a UTIG research scientist, examine rock samples drilled from about half a mile under the seafloor in a fault in New Zealand during a 2018 scientific ocean drilling mission that they co-led. Lab tests revealed that clay-rich rocks are regulating earthquakes there by allowing the fault to slip harmlessly.

Credit: Tim Fulton, IODP JRSO

Study quantifies global impact of electricity in dust storms on Mars

Electrical discharge could be major driving force of Martian chlorine cycle

Peer-Reviewed Publication

WASHINGTON UNIVERSITY IN ST. LOUIS

Mars is infamous for its intense dust storms, some of which kick up enough dust to be seen by telescopes on Earth.

When dust particles rub against each other, as they do in Martian dust storms, they can become electrified, transferring positive and negative electric charge in the same way as you build up static electricity if you shuffle across a carpet.

Strong electric fields build up in dust storms on Earth, so it is perhaps unsurprising that this also happens on Mars. But what happens next? Probably not a sudden flash of lightning, as we might expect on Earth.

Instead, planetary scientist Alian Wang at Washington University in St. Louis thinks electrical discharge on Mars probably looks more like a faint glow. (None of the Mars landers, rovers or other missions have captured a real picture of it.)

“It could be somewhat like the aurora in polar regions on Earth, where energetic electrons collide with dilute atmospheric species,” said Wang, a research professor of earth and planetary sciences in Arts & Sciences.

Flashy or not, this Martian “faux-rora” still packs a hefty punch.

Wang’s new study in the journal Geophysical Research Letters shows that electricity in dust storms could be the major driving force of the Martian chlorine cycle.

As background, scientists consider chlorine one of five elements that are “mobile” on Mars (the others are hydrogen, oxygen, carbon and sulfur). This means chlorine, in different forms, moves back and forth between the surface and the atmosphere of Mars. On the ground, chloride deposits — similar to saline playas or shallow salt flats on Earth — are widespread. These deposits likely formed early in Mars’ history, when chloride salts precipitated from brine.

In the new study, Wang shows that one particularly efficient way to move chlorine from the ground to the air on Mars is by way of reactions set off by electrical discharge generated in Martian dust activities.

Wang and her collaborators conducted a series of experiments that obtained high yields of chlorine gases from common chlorides — all by zapping the solid salts with electrical discharge under Mars-like conditions. They conducted these experiments using a planetary simulation chamber at Washington University (called the Planetary Environment and Analysis Chamber, or PEACh).

“The high release rate of chlorine from common chlorides revealed by this study indicates a promising pathway for converting surface chlorides to the gas phases that we now see in the atmosphere,” said Kevin Olsen, a research fellow at The Open University in the United Kingdom and a co-author of the new study.

“These findings offer support that Martian dust activities can drive a global chlorine cycle. With the ExoMars Trace Gas Orbiter, we see repeated seasonal activity that coincides with global and regional dust storms,” he said.

Easier on Mars than on Earth

“Frictional electrification is a common process in our solar system, with Martian dust activities known to be a powerful source of electrical charge buildup,” said Wang, who is a faculty fellow of the university’s McDonnell Center for the Space Sciences. “The thin atmosphere on Mars makes it much easier for accumulated electrical fields to break down in the form of electrostatic discharge. In fact, it’s a hundred times easier on Mars than on Earth.”
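Wang’s “hundred times easier” figure is consistent with a back-of-envelope estimate in which the breakdown field scales roughly with gas number density (a crude Paschen-type approximation). The sketch below uses representative textbook values (Earth sea-level breakdown field of about 3 MV/m, typical Mars surface pressure of about 600 Pa), which are assumptions for illustration, not numbers from the study:

```python
# Back-of-envelope: electrostatic breakdown field scales roughly with
# gas number density (n ~ p / T for an ideal gas). All values below are
# representative textbook figures, not data from Wang's experiments.
E_EARTH = 3.0e6                       # V/m, approx. breakdown field of air at sea level
P_EARTH, T_EARTH = 101325.0, 288.0    # Pa, K, Earth surface conditions
P_MARS, T_MARS = 600.0, 210.0         # Pa, K, typical Mars surface conditions

density_ratio = (P_MARS / P_EARTH) * (T_EARTH / T_MARS)
E_MARS = E_EARTH * density_ratio      # estimated Mars breakdown field

print(f"Estimated Mars breakdown field: {E_MARS / 1e3:.0f} kV/m")
print(f"Breakdown is roughly {E_EARTH / E_MARS:.0f}x easier on Mars than on Earth")
```

Under these assumed values the estimate lands on the order of a hundred, in line with the quoted comparison.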

Scientists involved in the Viking missions that landed on Mars in the 1970s first proposed that dust storms might be a source of reactive chemistry on the red planet.

However, the chemical effects of dust activities were difficult to study. Some mission opportunities, like the ExoMars Schiaparelli EDM lander launched in 2016, ended in failure, so scientists turned to models and experimental studies.

In recent years, Wang and other scientists published research that shows that when electrostatic discharge interacts with chlorine salts in a Mars-like carbon dioxide-rich environment, it can generate perchlorates and carbonates, and also release chlorine as a gas.

But this new study is the first to try to quantify just how much of these chemical products are actually produced during dust storm events.

“The reaction rates are huge,” Wang said. “Importantly, the released chlorine in a short-time mid-strength electrostatic discharge process is at a percent level.” This means that during a seven-hour simulated electrostatic discharge experiment, at least one out of every 100 chloride molecules is decomposed and then releases its chlorine atom into the atmosphere.

The formation rates of carbonates and perchlorates are similar but slightly lower, at sub-percent and per-thousand levels, respectively, Wang said.

These high yields lead Wang and her team to believe that Martian dust activities can be linked to three global phenomena recently revealed by Mars missions.

Electrical discharge could explain the extremely high concentrations of perchlorate and carbonate found globally in Martian topsoil, she said. Quantitatively, the high end of the observed concentration ranges could have accumulated from dust storm-induced electrical discharge within less than half of the Amazonian period, the most recent period of Mars’ history, thought to have begun about 3 billion years ago. The high yield of chlorine atoms released from chlorides can also account for the high concentrations of hydrogen chloride observed in the Martian atmosphere during the 2018 and 2019 dust seasons, assuming a global dust storm kicks up a 1- to 10-cm-thick layer of Martian surface dust.

“No other process that we know of can do this,” Wang said, “especially with such quantitatively high yield of chlorine release.”

Study demonstrates energy-efficient conversion of nitrate pollutants into ammonia

Peer-Reviewed Publication

UNIVERSITY OF ILLINOIS AT URBANA-CHAMPAIGN, NEWS BUREAU

Image: Illinois chemical and biomolecular engineering professor Xiao Su and colleagues found a new approach for converting nitrate-contaminated waters into valuable ammonia within a single electrochemical cell. Nitrate is a known carcinogen and a cause of algal blooms in waterways.

Credit: Photo by Fred Zwicky

CHAMPAIGN, Ill. — The nitrate runoff problem, a source of carcinogens and a cause of suffocating algal blooms in U.S. waterways, may not be all gloom and doom. A new study led by the University of Illinois Urbana-Champaign demonstrates an approach for the integrated capture and conversion of nitrate-contaminated waters into valuable ammonia within a single electrochemical cell.

The study, directed by chemical and biomolecular engineering professor Xiao Su, demonstrates a device capable of an eightfold concentration of nitrate, a 24-times enhancement of ammonium production rate and a greater than tenfold enhancement in energy efficiency compared with previous nitrate-to-ammonia electrocatalysis methods.

 “By combining separation with reaction, we overcame previously existing limitations of producing ammonia directly from groundwater, where the concentrations of nitrate are very low, and thus make the conversion step inefficient,” Su said.  

The findings are published in the journal Nature Communications.

“The goal of this study was to use as little energy as possible to remove nitrate from agricultural runoff before it hits our waterways, and transform it back to a fertilizer or sell it as a chemical feedstock,” Su said. “Our technology can thus have an impact on waste treatment, sustainable chemical production and advance decarbonization. We are hoping to bring greater circularity into the nitrogen cycle.”

The team developed a unique, bifunctional electrode that can separate and up-concentrate nitrate from a water stream while converting it to ammonia in a single unit under purely electrochemical control. "The bifunctional electrode combines a redox-polymer adsorbent, which captures the nitrate, with cobalt-based catalysts that drive the electrocatalytic conversion to ammonium," Su said.

The system was tested in the lab using agricultural runoff collected from drain tiles around the U. of I. research farmlands to evaluate the potential of the technology for real-world conditions, the researchers said.

“This is a very efficient capture and conversion platform with a low footprint,” Su said. “We don’t need separate electrochemical cells for the water treatment and ammonium production or adding extra chemicals or solvents. Instead, we envision a module installed directly onto farmland and run using the power generated from the electrocatalytic process and a small solar panel.”

The team said its next goal is to develop even more selective materials used in the device to achieve higher nitrate removal and accelerate the conversion to ammonia – while engineering larger scale systems for practical deployment in the field.

Kwiyong Kim is the first author of the study, with contributions from Jaeyoung Hong and Jing Lian Ng, from the Su group. The work was carried out in collaboration with Tuan Anh Pham, from the Lawrence Livermore National Laboratory, and Alexandra Zagalskaya and Vitaly Alexandrov, from the University of Nebraska.

Su is also affiliated with the Beckman Institute for Advanced Science and Technology and is a professor of civil and environmental engineering at Illinois.

This study was supported by the National Alliance for Water Innovation, funded by the U.S. Department of Energy, and by the Institute for Sustainability, Energy, and Environment at Illinois.

The paper “Coupling nitrate capture with ammonia production through bifunctional redox-electrodes” is available online. DOI: 10.1038/s41467-023-36318-1

Beyond memorization: Text generators may plagiarize beyond 'copy and paste'


Language models, possibly including ChatGPT, paraphrase and reuse ideas from training data without citing the source, raising plagiarism concerns

Peer-Reviewed Publication

PENN STATE

UNIVERSITY PARK, Pa. — Students may want to think twice before using a chatbot to complete their next assignment. Language models that generate text in response to user prompts plagiarize content in more ways than one, according to a Penn State-led research team that conducted the first study to directly examine the phenomenon.

“Plagiarism comes in different flavors,” said Dongwon Lee, professor of information sciences and technology at Penn State. “We wanted to see if language models not only copy and paste but resort to more sophisticated forms of plagiarism without realizing it.”

The researchers focused on identifying three forms of plagiarism: verbatim, or directly copying and pasting content; paraphrase, or rewording and restructuring content without citing the original source; and idea, or using the main idea from a text without proper attribution. They constructed a pipeline for automated plagiarism detection and tested it against OpenAI’s GPT-2 because the language model’s training data is available online, allowing the researchers to compare generated texts to the 8 million documents used to pre-train GPT-2.

The scientists used 210,000 generated texts to test for plagiarism in pre-trained language models and fine-tuned language models, or models trained further to focus on specific topic areas. In this case, the team fine-tuned three language models to focus on scientific documents, scholarly articles related to COVID-19, and patent claims. They used an open-source search engine to retrieve the top 10 training documents most similar to each generated text and modified an existing text alignment algorithm to better detect instances of verbatim, paraphrase and idea plagiarism.
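The verbatim-detection step of a pipeline like this can be approximated with word n-gram overlap between a generated text and a candidate training document. The sketch below is a generic illustration of that idea only — the study used a modified text alignment algorithm, and detecting paraphrase and idea plagiarism requires semantic methods beyond simple overlap; the function name and example strings are hypothetical:

```python
def shared_ngrams(generated: str, source: str, n: int = 5) -> set:
    """Return word n-grams appearing in both texts — a crude proxy for
    verbatim plagiarism. (Illustrative only; real detectors align longer
    spans and use semantic similarity for paraphrase/idea plagiarism.)"""
    def ngrams(text):
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}
    return ngrams(generated) & ngrams(source)

# Hypothetical example: a generated sentence reusing a training-text span.
source_doc = "the quick brown fox jumps over the lazy dog near the river"
generated = "we saw that the quick brown fox jumps over a fence"
overlap = shared_ngrams(generated, source_doc)
print(overlap)  # includes ('the', 'quick', 'brown', 'fox', 'jumps')
```

Any non-empty overlap of long n-grams flags a candidate span for closer alignment; shorter n catches more reuse at the cost of more false positives.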

The team found that the language models committed all three types of plagiarism, and that the larger the dataset and parameters used to train the model, the more often plagiarism occurred. They also noted that fine-tuned language models reduced verbatim plagiarism but increased instances of paraphrase and idea plagiarism. In addition, they identified instances of the language model exposing individuals’ private information through all three forms of plagiarism. The researchers will present their findings at the 2023 ACM Web Conference, which takes place April 30-May 4 in Austin, Texas.

“People pursue large language models because as the model gets larger, its generation abilities increase,” said lead author Jooyoung Lee, a doctoral student in the College of Information Sciences and Technology at Penn State. “At the same time, they are jeopardizing the originality and creativity of the content within the training corpus. This is an important finding.”

The study highlights the need for more research into text generators and the ethical and philosophical questions that they pose, according to the researchers.

“Even though the output may be appealing, and language models may be fun to use and seem productive for certain tasks, it doesn’t mean they are practical,” said Thai Le, assistant professor of computer and information science at the University of Mississippi who began working on the project as a doctoral candidate at Penn State. “In practice, we need to take care of the ethical and copyright issues that text generators pose.”

Though the results of the study only apply to GPT-2, the automatic plagiarism detection process that the researchers established can be applied to newer language models like ChatGPT to determine if and how often these models plagiarize training content. Testing for plagiarism, however, depends on the developers making the training data publicly accessible, said the researchers.

The current study can help AI researchers build more robust, reliable and responsible language models in the future, according to the scientists. For now, they urge individuals to exercise caution when using text generators.

“AI researchers and scientists are studying how to make language models better and more robust; meanwhile, many individuals are using language models in their daily lives for various productivity tasks,” said Jinghui Chen, assistant professor of information sciences and technology at Penn State. “While leveraging language models as a search engine or as Stack Overflow to debug code is probably fine, for other purposes the language model may produce plagiarized content, with negative consequences for the user.”

The plagiarism outcome is not something unexpected, added Dongwon Lee.

“We taught language models to mimic human writing, like stochastic parrots, without teaching them how to avoid plagiarism,” he said. “Now it’s time to teach them to write properly, and we have a long way to go.”

The National Science Foundation supported this work.