Wednesday, January 17, 2024

 

New USGS map shows where damaging earthquakes are most likely to occur in US

The research-based map is the first to display an updated, comprehensive National Seismic Hazard Model for all 50 states.


Peer-Reviewed Publication

U.S. GEOLOGICAL SURVEY

National Seismic Hazard Model (2023) - Chance of Damaging Earthquake Shaking 

IMAGE: 

NATIONAL SEISMIC HAZARD MODEL (2023). MAP DISPLAYS THE LIKELIHOOD OF DAMAGING EARTHQUAKE SHAKING IN THE UNITED STATES OVER THE NEXT 100 YEARS.


CREDIT: USGS NATIONAL SEISMIC HAZARD MODEL TEAM





GOLDEN, Colo. – Nearly 75 percent of the U.S. could experience damaging earthquake shaking, according to a recent study by a U.S. Geological Survey-led team of more than 50 scientists and engineers.

This was one of several key findings from the latest USGS National Seismic Hazard Model (NSHM). The model was used to create a color-coded map that pinpoints where damaging earthquakes are most likely to occur based on insights from seismic studies, historical geologic data, and the latest data-collection technologies.
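The release does not spell out how annual hazard rates translate into the 100-year likelihoods shown on the map. As a rough, textbook-style illustration only (not the USGS methodology), the sketch below converts a hypothetical annual probability of damaging shaking at a site into a multi-decade probability, assuming independent years:

```python
# Illustrative only: converts a hypothetical annual probability of damaging
# shaking at a site into a multi-decade probability, assuming independent
# years. A textbook approximation, not the USGS NSHM calculation.

def multi_year_probability(annual_prob: float, years: int = 100) -> float:
    """Probability of at least one damaging-shaking event within `years` years."""
    return 1.0 - (1.0 - annual_prob) ** years

if __name__ == "__main__":
    for p in (0.001, 0.005, 0.02):  # hypothetical annual probabilities
        print(f"annual {p:.3f} -> 100-year {multi_year_probability(p):.2f}")
```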

The congressionally requested NSHM update was created as an essential tool to help engineers and others mitigate how earthquakes affect the most vulnerable communities by showing likely earthquake locations and how much shaking they might produce. New tools and technology identified nearly 500 additional faults that could produce a damaging quake, showcasing the evolving landscape of earthquake research.

“This was a massive, multi-year collaborative effort between federal, state and local governments and the private sector,” said Mark Petersen, USGS geophysicist and lead author of the study. “The new seismic hazard model represents a touchstone achievement for enhancing public safety.”

The latest iteration, the first 50-state comprehensive assessment, was updated from previous versions published in 2018 (conterminous U.S.), 2007 (Alaska) and 1998 (Hawaii).

Noteworthy changes in the new model show the possibility of more damaging earthquakes along the central and northeastern Atlantic Coastal corridor, including in the cities of Washington D.C., Philadelphia, New York and Boston. In addition, there is a chance for greater shaking in the seismically active regions of California and Alaska. The new model also characterizes Hawai‘i as having greater potential for shaking because of observations from recent volcanic eruptions and seismic unrest on the islands.

"Earthquakes are difficult to forecast but we’ve made great strides with this new model," said Petersen. "The update includes more faults, better-characterized land surfaces, and computational advancements in modeling that provide the most detailed view ever of the earthquake risks we face."

Key findings from the updated seismic hazard model include:

  • Risk to People: Nearly 75% of the U.S. could experience potentially damaging earthquakes and intense ground shaking, putting hundreds of millions of people at risk.
  • Widespread Hazard: 37 U.S. states have experienced earthquakes exceeding magnitude 5 during the last 200 years, highlighting a long history of seismic activity across this country.
  • Structural Implications: The updated model will inform the future of building and structural design, offering critical insights for architects, engineers, and policymakers on how structures are planned and constructed across the U.S.
  • Unified Approach: This marks the first National Seismic Hazard Model to encompass all 50 states simultaneously, reflecting a massive collaborative effort with federal, state, and local partners.
  • Not a Prediction: No one can predict earthquakes. However, by investigating faults and past quakes, scientists can better assess the likelihood of future earthquakes and how intense their shaking might be.

To read the full findings of the scientific assessment, published in the journal Earthquake Spectra, please visit: https://doi.org/10.1177/87552930231215428

 

Pudukotai Dinakarrao studying ways to protect autonomous vehicle supply chains


Grant and Award Announcement

GEORGE MASON UNIVERSITY





Sai Manoj Pudukotai Dinakarrao, Assistant Professor, Electrical and Computer Engineering, received funding for the project: "Cyber Sentinel: Safeguarding Autonomous Vehicle Supply Chains against Backdoors in Hardware."  

Pudukotai Dinakarrao is working with University of Virginia researchers who aim to deploy a backdoor attack mitigation and avoidance approach for vehicles.  

Haiying Shen, Associate Professor, Computer Science; Associate Professor, Electrical and Computer Engineering, University of Virginia, will provide advice on the experiment.  

Shen has extensive experience conducting experiments on connected autonomous vehicles (CAVs). She will provide advice on setting up the experimental environments, conducting the experiments, collecting experimental results and analyzing the results, and determining whether the approach is effective in mitigating and avoiding backdoor attacks. 

Pudukotai Dinakarrao received $100,000 from the Virginia Innovation Partnership Authority for this research. Funding began in Jan. 2024 and will end in Jan. 2025. 

###

About George Mason University

George Mason University is Virginia's largest public research university. Located near Washington, D.C., Mason enrolls 38,000 students from 130 countries and all 50 states. Mason has grown rapidly over the last half-century and is recognized for its innovation and entrepreneurship, remarkable diversity and commitment to accessibility. Learn more at http://www.gmu.edu.

 

Thermoelectric permanent magnet opens new possibilities in thermal management technologies


Magnetically enhanced transverse thermoelectric conversion


Peer-Reviewed Publication

NATIONAL INSTITUTE FOR MATERIALS SCIENCE, JAPAN

Artificially tilted multilayered materials 

IMAGE: 

SCHEMATIC DIAGRAM (LEFT) AND PHOTO (RIGHT) OF THE PERMANENT-MAGNET-BASED ARTIFICIALLY TILTED MULTILAYERED MATERIAL DEVELOPED BY THIS RESEARCH TEAM


CREDIT: KEN-ICHI UCHIDA NATIONAL INSTITUTE FOR MATERIALS SCIENCE





1. A NIMS research team has demonstrated that transverse thermoelectric conversion (i.e., energy conversion between charge and heat currents that flow orthogonally to each other) can be greatly enhanced by applying magnetic fields or utilizing magnetism. In addition, the team developed a thermoelectric permanent magnet—a new functional material capable of thermoelectric cooling and power generation—by combining permanent magnets and thermoelectric materials into a hybrid structure. These results may guide efforts to achieve thermal management and energy harvesting using common magnets.

2. The Seebeck effect and the Peltier effect have been extensively researched for their application to thermoelectric conversion (TEC) technologies. These effects are classified as longitudinal TEC phenomena—conversion between charge and heat currents that flow in parallel to each other. Although longitudinal TEC devices have higher energy conversion efficiency than their transverse counterparts, their structures are more complex. By contrast, structurally simpler transverse TEC devices can have low energy losses, low manufacturing cost, and excellent durability. To achieve the practical use of transverse TEC devices, however, their conversion efficiency needs to be improved. Transverse TEC is driven by different types of physical phenomena: magnetically induced phenomena (i.e., the magneto-thermoelectric effect) and phenomena attributed to anisotropic crystalline or electronic structures. These phenomena had previously only been researched independently of one another.
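As a rough numerical sketch of the geometric difference described above, using hypothetical coefficients and dimensions rather than values from any NIMS device: the output of a longitudinal element depends only on the temperature difference, while a transverse element's output also scales with its length perpendicular to the heat flow.

```python
# Minimal geometry sketch (hypothetical numbers, not NIMS device parameters):
# longitudinal TEC develops a voltage along the temperature gradient, while
# transverse TEC develops it at right angles to the gradient, so its output
# grows with the device dimension perpendicular to the heat flow.

S_longitudinal = 200e-6   # V/K, hypothetical Seebeck coefficient
S_transverse = 20e-6      # V/K, hypothetical transverse thermopower
delta_T = 10.0            # K, temperature difference across the element
L_heat = 5e-3             # m, length along the heat current
L_out = 50e-3             # m, length along the transverse (output) direction

# Longitudinal: V = S * dT (independent of the output-direction length)
V_long = S_longitudinal * delta_T

# Transverse: E_out = S_xy * (dT/dz), integrated over the orthogonal length
V_trans = S_transverse * (delta_T / L_heat) * L_out

print(f"longitudinal output: {V_long * 1e3:.2f} mV")
print(f"transverse output:   {V_trans * 1e3:.2f} mV")
```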

3. This NIMS research team recently fabricated an artificially tilted multilayered material—a hybrid material capable of simultaneously exhibiting three different types of TEC phenomena, including the magneto-thermoelectric effects. The team then demonstrated the enhanced cooling performance of this material due to the transverse TEC. The hybrid material was created by alternately stacking and bonding Bi88Sb12 alloy slabs, which exhibit large magneto-thermoelectric effects, and Bi0.2Sb1.8Te3 alloy slabs, which exhibit a large Peltier effect. This stack was then cut diagonally to form the artificially tilted multilayered material. When magnetic fields were applied to this material, its transverse TEC efficiency increased, an improvement attributed to the combined effects of the three types of TEC phenomena. The team then replaced the Bi0.2Sb1.8Te3 alloy slabs with permanent magnets (see the figure below) and found that the transverse TEC performance can be improved by the magneto-thermoelectric effects even without external magnetic fields.

4. This research demonstrated ways in which magnetic materials can be designed to increase their thermoelectric cooling and power generation capabilities. In future research, the team will develop materials/devices with better thermal management and energy harvesting capabilities for a sustainable society and improved IoT systems.

5. This project was carried out by Ken-ichi Uchida (Distinguished Group Leader, Research Center for Magnetic and Spintronic Materials (CMSM), NIMS), Takamasa Hirai (Researcher, CMSM, NIMS), Fuyuki Ando (Special Researcher, CMSM, NIMS), and Hossein Sepehri-Amin (Group Leader, CMSM, NIMS). This work was conducted as part of the Uchida Magnetic Thermal Management Materials Project (Research Director: Ken-ichi Uchida; grant number: JPMJER2201) under JST’s ERATO Strategic Basic Research Program.

6. This research was published in the online version of Advanced Energy Materials on November 29, 2023.

 

 

Canada and Spain scientists establish new Antarctic Ocean observatory


Ocean Networks Canada, an initiative of the University of Victoria, will extend its ocean monitoring outside Canadian waters


Business Announcement

UNIVERSITY OF VICTORIA

ONC graphic composite 

IMAGE: 

A PARTNERSHIP BETWEEN OCEAN NETWORKS CANADA, AN INITIATIVE OF THE UNIVERSITY OF VICTORIA, AND THE SPANISH NATIONAL RESEARCH COUNCIL WILL ADVANCE SCIENTIFIC UNDERSTANDING OF ONE OF THE MOST UNDER-OBSERVED PARTS OF THE PLANET, THE SOUTHERN OCEAN, OR THE ANTARCTIC OCEAN.


CREDIT: OCEAN NETWORKS CANADA.





Canadian and European experts in polar observation are joining forces in a new partnership that will see Ocean Networks Canada (ONC) operating a subsea observatory at the Spanish Antarctic Station, providing year-round, near real-time data on ocean conditions there. This is the first time that ONC will extend its ocean monitoring outside Canadian waters.

This partnership between ONC, a University of Victoria initiative, and the Spanish National Research Council (CSIC) will advance scientific understanding of one of the most under-observed parts of the planet, the Southern Ocean, or the Antarctic Ocean.

The Spanish polar research vessel Hespérides, which is transporting the ONC observatory, is en route to the Spanish Antarctic Station (BAE) Juan Carlos I, located on Livingston Island in the South Shetlands Archipelago north of the Antarctic Peninsula. It is also carrying two ONC deep-sea Argo floats that will be deployed during transit through the Drake Passage in the Southern Ocean.

The Hespérides, which departed Barcelona last fall, is scheduled to depart the Argentine port of Ushuaia this week and reach BAE Juan Carlos I in Antarctica later this month.

ONC president and CEO Kate Moran says this partnership marks a tremendous milestone in polar scientific collaboration.

“Ocean Networks Canada has been monitoring Arctic conditions since 2012 through its network of Indigenous community-led and remote coastal observatories that provide continuous ocean data, available to all, through the Oceans 3.0 data portal on the ONC website,” she says. 

“ONC’s expertise in designing and successfully operating underwater observatories able to withstand harsh polar conditions will contribute to Spain’s scientific expertise in monitoring Antarctica, a continent that is critical to this planet’s climate system, and is undergoing rapid, consequential changes that we need to understand.”

Today’s announcement from ONC and CSIC follows a recent call for the urgent expansion of ocean monitoring in the Southern Ocean. In a joint statement released at the 2023 Southern Ocean Observing System (SOOS) Symposium, 300 scientists from 25 nations said that “the chronic lack of observations for the Southern Ocean challenges our ability to detect and assess the consequences of change.”

Jordi Sorribas Cervantes, director of the Unit of Marine Technology of the CSIC, says the crew will construct and deploy the observatory on arrival as part of the activities of the 2023-24 Spanish Antarctic Campaign.

“This partnership with Ocean Networks Canada will provide vital ocean science data in the Southern Ocean, not least because the new observatory will operate year-round outside of the station’s staffed summer months,” he says. “Having access to this near-continuous data, from anywhere in the world, will help meet the current data gap challenge in the Southern Ocean.”

The proposed site of the cabled seafloor observatory is in a small embayment called Johnsons Dock at a depth of 23 metres. It is modelled on one of ONC’s Arctic observatories at Gascoyne Inlet, and will use the Iridium satellite network to transmit the data every 30 minutes to ONC for processing, archival and distribution. 

The observatory will consist of a CTD scientific instrument that measures conductivity, temperature and depth. Additional sensors will track dissolved oxygen concentrations as well as optical properties including turbidity and chlorophyll-a that are important for monitoring seawater quality at this location where freshwater glacier melt and ocean water meet.
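A minimal sketch of what one 30-minute transmission from such an observatory might contain, based on the sensors listed above; the field names, units and values are assumptions for illustration, not ONC's actual data schema.

```python
# Hypothetical record layout for one 30-minute observatory transmission,
# based on the sensors described in the release (CTD plus dissolved oxygen,
# turbidity and chlorophyll-a). Field names, units and values are assumed
# for illustration only.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ObservatoryRecord:
    timestamp_utc: datetime
    conductivity_s_per_m: float        # CTD: conductivity
    temperature_c: float               # CTD: temperature
    depth_m: float                     # CTD: depth (site is ~23 m)
    dissolved_oxygen_ml_per_l: float
    turbidity_ntu: float
    chlorophyll_a_ug_per_l: float

# Example record with made-up values
record = ObservatoryRecord(
    timestamp_utc=datetime(2024, 2, 1, 12, 0, tzinfo=timezone.utc),
    conductivity_s_per_m=2.8,
    temperature_c=-0.5,
    depth_m=23.0,
    dissolved_oxygen_ml_per_l=7.9,
    turbidity_ntu=1.2,
    chlorophyll_a_ug_per_l=0.6,
)
print(record)
```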

The observatory, along with ONC’s two autonomous deep Argo floats, will help monitor the changing biogeochemical and physical ocean processes in this under-observed, sensitive region.

Polar Knowledge Canada (POLAR) manages Canada’s scientific contributions and commitments to the Antarctic Treaty. David Hik, POLAR chief scientist and executive director, says this new partnership between Spain and Canada marks an important milestone in advancing ocean monitoring.

“We are delighted that ONC is contributing its expertise and infrastructure to Antarctica and Southern Ocean research to advance knowledge as well as Canadian leadership in polar science and technology.” 

The research partnership aligns with UVic’s commitment to the United Nations Sustainable Development Goals, and with its strengths in climate action, life below water, life on land, and sustainable cities and communities. Learn more about UVic’s climate leadership at uvic.ca/IMPACT.

“It’s exciting to see ONC’s transformative leadership in ocean science expand internationally to inform climate solutions beyond Canada’s three coasts,” says Lisa Kalynchuk, vice-president, research and innovation, at UVic and member of ONC’s board of directors. “This partnership demonstrates how coastal communities and scientists from around the world can drive technological innovation and scientific discovery.”

Learn more about ONC’s recent Argo float deployments and world-leading ocean observatories

Track the live passage of the Hespérides on its journey to the Spanish Antarctic station, BAE Juan Carlos I.

Ocean Networks Canada (ONC) operates world-leading observatories in the deep ocean, coastal waters and land of the Pacific, Atlantic and Arctic coasts of Canada. ONC’s cabled observatories supply continuous power and internet connectivity to scientific instruments, cameras and 12,000-plus ocean sensors. ONC also operates ocean mobile and land-based assets, including coastal radar. ONC, an initiative of the University of Victoria, is supported by the Government of Canada, and is one of the country’s major research facilities.  

The Spanish National Research Council (CSIC) is the largest public research institution in Spain and one of the most renowned institutions in the European Research Area. The CSIC’s Marine Technology Unit manages the Spanish Antarctic station Juan Carlos I and Camp Byers on Livingston Island, and also coordinates the overall logistics of the Spanish Antarctic campaign.

 

-- 30 --




Smooth operation of future nuclear fusion facilities is a matter of control


The Lehigh University Plasma Control Group, supported by a new $1.6 million DOE grant, continues work on advancing plasma dynamics simulation capabilities and algorithms to control superheated gasses that hold promise for limitless, clean energy


Grant and Award Announcement

LEHIGH UNIVERSITY

ITER 

IMAGE: 

THE FIRST MODULE OF ITER’S TOROIDAL PLASMA CHAMBER IS PLACED INSIDE THE TOKAMAK PIT. EACH ONE OF THE NINE MODULES, WHICH IS COMPOSED OF A 40° VACUUM VESSEL SECTOR COVERED BY THERMAL SHIELDS AND TWO SUPERCONDUCTING TOROIDAL-FIELD COILS, WEIGHS APPROXIMATELY THE EQUIVALENT OF FOUR FULLY LOADED BOEING 747S AND IS AS TALL AS A SIX-STORY BUILDING. THE ITER TOKAMAK, UNDER CONSTRUCTION IN SOUTHERN FRANCE, WILL BE THE LARGEST FUSION REACTOR IN THE WORLD. 


CREDIT: EUGENIO SCHUSTER/LEHIGH UNIVERSITY





As researchers around the world work to develop viable alternatives to fossil fuels, the prospect of nuclear fusion—harnessing the same energy-generating reactions that power the sun—has grown increasingly attractive to private equity firms.

In 2022, the U.S. Department of Energy launched a partnership with investors in the private sector to accelerate the development of fusion energy, in part through the development of a fusion pilot plant, or FPP, in the United States.

The FPP and ITER—the world’s largest nuclear fusion reactor, currently being built in France—represent the future of fusion, according to Eugenio Schuster, a professor of mechanical engineering and mechanics in Lehigh University’s P.C. Rossin College of Engineering and Applied Science. Ensuring that future is a success, however, requires the reactors to operate within safety, stability, and controllability boundaries, he says.

Schuster, who directs the Lehigh University Plasma Control Group, recently received a $1.6 million grant from the DOE to conduct experiments on the DIII-D National Fusion Facility in San Diego that will ultimately serve to improve the operation of ITER, FPP, and future reactors through the development of advanced controls and the application of machine learning.

The three-year grant is part of a $16 million DOE initiative to fund projects focused on advancing innovative fusion technology and collaborative research on small-scale experiments and on the DIII-D tokamak. The DIII-D tokamak is a device designed not to produce energy, but to help researchers study the physics of the plasma in which energy-releasing fusion reactions would take place and to develop the technology needed for reactor operation.

“There are two goals with this project,” says Schuster. “The first is to improve the operation of DIII-D itself, and the second is to address the technological issues that we’ll be facing with both ITER and FPP.”

Those issues are myriad and exceedingly complex. For one, researchers like Schuster cannot study the type of plasma, called burning plasma, that will be present in fusion reactors.

“In devices like DIII-D, we create a very hot gas in a plasma state, and the goal is to study the physics of that plasma and how to stably confine it,” he says. “But this is not the type of gas that we’re going to use in ITER or in FPP. In fact, the employed gas lowers the probability of fusion in DIII-D instead of increasing it. We don’t want to have nuclear fusion reactions in these experimental devices because then everything becomes radioactive and it becomes much more complicated to carry out experiments.”

Instead, researchers must develop the tools they’ll ultimately need to control the burning plasma of a reactor like ITER or FPP, on a test bed like DIII-D. Using DIII-D, the team will emulate the conditions of an actual nuclear reactor to better understand the dynamics of the plasma and to develop the necessary control solutions. These solutions will ultimately be extrapolated to ITER or FPP by following a model-based approach.  

“You have three key elements in the control design,” he says. “One is the control algorithm, or the controller, and that’s what we work on in my group. Based on plasma-response models, we develop algorithms that will keep the plasma dynamics stable and reach the level of performance needed in reactor operation. The other two elements are the actuators, which can modify the dynamics, and the sensors that are your eyes, basically, into what’s happening.”

His team needs to develop algorithms that are smart enough to indicate to the actuators when priorities change. For example, if an actuator is being used to control the temperature of the plasma, but a sudden instability occurs that could cause what Schuster half-jokingly describes as a “poof”—when the hot gas is no longer confined and hits the inner wall of the reactor with all the force of a small earthquake—the algorithm needs to indicate to the plant supervisor that temperature is no longer relevant. The actuator must be repurposed to stabilize the plasma.
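The toy sketch below illustrates that priority-switching idea in the simplest terms. It is not the Lehigh group's controller, and every threshold, gain and value in it is made up.

```python
# Toy supervisory control loop illustrating the actuator-repurposing idea
# described above: regulate plasma temperature until an instability metric
# crosses a threshold, then hand the actuator over to stabilization.
# Purely illustrative; not the Lehigh Plasma Control Group's algorithms.

def control_step(temperature, instability, temp_setpoint=100e6,
                 instability_limit=0.8, gain=1e-8):
    """Return (actuator_mode, actuator_command) for one control cycle."""
    if instability > instability_limit:
        # Priority shift: temperature tracking is suspended and the
        # actuator is repurposed to suppress the instability.
        return "stabilize", -instability
    # Normal operation: proportional correction toward the setpoint.
    return "regulate_temperature", gain * (temp_setpoint - temperature)

# Example cycles with made-up sensor readings
for T, inst in [(95e6, 0.1), (99e6, 0.5), (100e6, 0.95)]:
    mode, cmd = control_step(T, inst)
    print(f"T={T:.2e}, instability={inst:.2f} -> {mode}, command={cmd:+.3f}")
```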

“It’s important that the algorithms that we expect to prove on DIII-D can be extrapolated to these other machines,” he says. “They cannot fail. This poof can never happen. The plasma inside ITER and FPP will be 100 million degrees Celsius, and if gas that hot becomes unconfined, it can really damage the reactor. So we have to develop these control algorithms for a machine that cannot fail, but we can’t do any experimentation with that machine.”

He stresses that all the problems inherent in such a challenge won’t be solved within a three-year project. But the work will move the state-of-the-art in incremental yet meaningful steps forward.

Another goal of the project is to bring more researchers into the field of nuclear fusion energy. The workforce necessary for progress hasn’t kept up with the increasing interest in fusion and the nascent nuclear-fusion industry, says Schuster, and the grant will support four PhD students who will all eventually work in San Diego as part of their doctoral program.

“There are several Lehigh graduates who are now a part of the permanent staff at the DIII-D National Fusion Facility,” he says. “A key part of this project is developing the human resources for the next generation of tokamaks that we plan on building.”

He says it’s an exciting time to be in the field. Advances in magnet technology, materials, computational processing, and machine learning—combined with substantial public and private investment—are all bringing the dream of limitless, clean energy a bit closer to reality.

“If I didn’t believe in it, I wouldn’t be working on it,” he says. “I’m feeling optimistic that it will, eventually, happen.”

 

Microfossils shed light on the long fossil record of euglenoids

A 400-million-year evolutionary history

Peer-Reviewed Publication

UNIVERSITEIT UTRECHT FACULTEIT GEOWETENSCHAPPEN

Light microscope images of euglenoid cysts from the Triassic-Jurassic boundary 

IMAGE: 

LIGHT MICROSCOPE IMAGES OF EUGLENOID CYSTS FROM THE TRIASSIC-JURASSIC BOUNDARY (APPROX. 200 MILLION YEARS OLD) IN THE SCHANDELAH-1 CORE IN GERMANY (LEFT) AND FROM TRIASSIC SEDIMENTS IN WINTERSWIJK, THE NETHERLANDS (RIGHT). THE SPECIMENS ARE BETWEEN 20 AND 30 MICROMETERS IN DIAMETER. PHOTOS: BAS VAN DE SCHOOTBRUGGE.


CREDIT: BAS VAN DE SCHOOTBRUGGE, UTRECHT UNIVERSITY

Hiding in the shadows, euglenoids are a fascinating group of single-celled protists that are neither plant nor animal. Plants photosynthesize, and animals eat. Euglenoids do both. Spiraling along the murky bottoms of shallow fresh-water ponds with their long flagella, they eat organic goop, while also using their chloroplasts to convert CO2 and water with light into sugars. Because of this in-between status, euglenoids have been placed close to the very base of the eukaryotic branch on the tree-of-life that includes all plants, fungi, and animals. However, while euglenoids likely evolved more than 1 billion years ago, they appeared to have left only a very scant fossil record.

In a new study published in the journal Review of Palaeobotany and Palynology, a team of Dutch, American, British, German, and Australian scientists shed new light on a group of “problematic” microfossils that have remained a mystery for nearly a century. By comparing microscopic fossil cysts in 200-million-year-old pond sediments from cores drilled in Germany and the Netherlands to much older Paleozoic, and much younger remains in Holocene lakes in Greece, and finally to living protists in a pond in Australia, the researchers establish a 400-million-year evolutionary history of the euglenoids.

What’s in a name?

In 2012, Bas van de Schootbrugge, then at the Goethe University in Frankfurt am Main, and Paul Strother from Boston College, while working on a variety of problematic microfossils in sediments from around the Triassic-Jurassic boundary, realized that the circular striated cysts they were seeing could in fact be euglenoid cysts. “We had this amazing drill core material at our disposal that contained many unidentified microfossils, including some of the oldest butterfly remains that we published on in 2018”, said Bas van de Schootbrugge, now at Utrecht University. Paul Strother continued: “Some of the microfossils we encountered showed an uncanny similarity to cysts of Euglena, a modern representative that had been described by Slovakian colleagues. The problem was, there was only one publication in the world making this claim”.

Even more unsettling: after an extensive literature review, van de Schootbrugge and Strother realized that the same type of microfossil had been given many different names. Scientists working on Quaternary and Holocene time slices used Concentricystes, referring to a possible algal cyst with concentric ribs. But Mesozoic workers used Pseudoschizaea, originally thinking it could have been a fern spore. Even older fossils from the Permian, Devonian, Silurian and Ordovician were known as Circulisporites and Chomotriletes.

Transmission electron microscopy

After the authors had disentangled the taxonomic confusion, compiling in the process nearly 500 literature sources related to any of the four taxa, more advanced microscope techniques were needed to establish the ultrastructure of the cysts with the help of transmission electron microscopy (TEM). This required picking single specimens, embedding them, and microtome slicing by University of Wisconsin-Eau Claire co-author Wilson Taylor. Because the specimens in the Triassic-Jurassic cores were mostly damaged, the team turned to palynologist Andreas Koutsodendris at Heidelberg University (Germany), who had access to Holocene and Pliocene core samples containing abundant well-preserved specimens. Andreas Koutsodendris said: “I am encountering these cysts regularly in cores drilled in lakes, for example in Lake Vouliagmeni in Greece that we studied here, but their biological affinity has never been cleared. In fact, the cysts are commonly figured in publications by colleagues, but no one was able to really put a finger on it.” Wilson Taylor continued: “We were much surprised by the ultrastructure of the cysts. The structure of the wall does not resemble anything that is known. The ribs are not ornaments, like in pollen and spores, but part of the wall structure. The layered structure of the walls is also clearly different from many other fresh-water green algae.”

Nagging uncertainty

While the TEM analysis initially added more mystery, the results did align with a study published in 2021 by another group of colleagues that looked at the ultrastructure of Pseudoschizaea. At least it was possible to show that Holocene and Pliocene Concentricystes and Jurassic Pseudoschizaea are in fact the same. But there remained one nagging uncertainty and that was the lack of any cysts produced by living euglenoids. Wilson Taylor: “We did contact several biologists working on living euglenoids, but no one had been able to make euglenoids encyst in a lab setting, allowing for extraction and TEM analyses of the cysts”.

Microscopic life down under

Enter Fabian Weston. By chance, Strother and van de Schootbrugge stumbled across superb video material posted on YouTube by microscopy enthusiast Fabian Weston from Sydney, Australia. In 2020, Fabian Weston had put a drop of water sampled from a nearby pond in New South Wales on a microscope slide and, using his advanced set-up at The Protist Lab, filmed Euglena as it gracefully moved in and out of focus. For reasons that remain poorly understood, but could be related to the drying out of the water under the cover slip, Euglena is then seen to ball up and form a thick, ribbed wall akin to the cysts found throughout the fossil record. “Unwittingly, Fabian provided a key piece of evidence. He is probably the only person on the planet to have witnessed Euglena encyst under a microscope”, Strother said.

Significance and next steps

Based on all the available pieces of the puzzle, the authors thus link euglenoids from a pond in Australia to fossil cysts that are more than 400 million years old, establishing a deep time record of the euglenoids. “This opens the door for recognizing even older examples, for example from Precambrian records that go back to the very root of the eukaryotic tree of life”, Strother said. “Now that we know which organisms produced those cysts, we can also use them for paleo-environmental interpretations. Their abundance around two of the largest mass-extinction events of the past 600 million years is a tell-tale sign of some major upheavals on the continents related to increased precipitation under extreme greenhouse climate conditions.” Van de Schootbrugge concluded: “Perhaps related to their capabilities to encyst, these organisms have endured and survived every major extinction on the planet. Unlike the behemoths that were done in by volcanoes and asteroids, these tiny creatures have weathered it all.” Extending their research, the team intends to travel to Australia in the near future to scour for preserved Euglena cysts in pond and lake sediments in New South Wales.

Light microscope images of euglenoid cysts from Holocene to recent Lake Vouliagmeni in Greece. Specimens are between 20 and 30 micrometers in diameter. Note the fingerprint-like patterning that is a shared characteristic of all fossil forms.

CREDIT

Andreas Koutsodendris, Heidelberg University



Transmission Electron Microscope (TEM) image of the wall structure of a Holocene euglenoid cyst from Lake Vouliagmeni, Greece.

CREDIT

Wilson Taylor, University of Wisconsin - Eau Claire


Video stills from encysting Euglena from New South Wales, Australia.

CREDIT

Fabian Weston, Protist Lab Films, Galston

 

Study: New deepfake detector designed to be less biased


With facial recognition performing worse on certain races and genders, algorithms developed at UB close the gap


Reports and Proceedings

UNIVERSITY AT BUFFALO





BUFFALO, N.Y. — The image spoke for itself. 

University at Buffalo computer scientist and deepfake expert Siwei Lyu created a photo collage out of the hundreds of faces that his detection algorithms had incorrectly classified as fake — and the resulting composite clearly skewed toward darker skin tones.

“A detection algorithm’s accuracy should be statistically independent from factors like race,” Lyu says, “but obviously many existing algorithms, including our own, inherit a bias.”

Lyu, PhD, co-director of the UB Center for Information Integrity, and his team have now developed what they believe are the first-ever deepfake detection algorithms specifically designed to be less biased.

Their two machine learning methods — one that makes algorithms aware of demographics and one that leaves them blind to such information — reduced disparities in accuracy across races and genders, while, in some cases, still improving overall accuracy.

The research was presented at the Winter Conference on Applications of Computer Vision (WACV), held Jan. 4-8, and was supported in part by the U.S. Defense Advanced Research Projects Agency (DARPA). 

Lyu, the study’s senior author, collaborated with his former student, Shu Hu, PhD, now an assistant professor of computer and information technology at Indiana University-Purdue University Indianapolis, as well as George Chen, PhD, assistant professor of information systems at Carnegie Mellon University. Other contributors include Yan Ju, a PhD student in Lyu’s Media Forensic Lab at UB, and postdoctoral researcher Shan Jia.

Ju, the study’s first author, says detection tools are often less scrutinized than the artificial intelligence tools they keep in check, but that doesn’t mean they don’t need to be held accountable, too. 

“Deepfakes have been so disruptive to society that the research community was in a hurry to find a solution,” she says, “but even though these algorithms were made for a good cause, we still need to be aware of their collateral consequences.”

Demographic aware vs. demographic agnostic

Recent studies have found large disparities in deepfake detection algorithms’ error rates — up to a 10.7% difference in one study — among different races. In particular, it’s been shown that some are better at guessing the authenticity of lighter-skinned subjects than darker-skinned ones.

This can result in certain groups being more at risk of having their real image pegged as a fake, or perhaps even more damaging, a doctored image of them pegged as real. 

The problem is not necessarily the algorithms themselves, but the data they’ve been trained on. Middle-aged white men are often overrepresented in such photo and video datasets, so the algorithms are better at analyzing them than they are at analyzing underrepresented groups, says Lyu, SUNY Empire Professor in the UB Department of Computer Science and Engineering, within the School of Engineering and Applied Sciences.

“Say one demographic group has 10,000 samples in the dataset and the other only has 100. The algorithm will sacrifice accuracy on the smaller group in order to minimize errors on the larger group,” he adds. “So it reduces overall errors, but at the expense of the smaller group.”

While other studies have attempted to make databases more demographically balanced — a time-consuming process — Lyu says his team’s study is the first attempt to actually improve the fairness of the algorithms themselves.

To explain their method, Lyu uses an analogy of a teacher being evaluated by student test scores. 

“If a teacher has 80 students do well and 20 students do poorly, they’ll still end up with a pretty good average,” he says. “So instead we want to give a weighted average to the students around the middle, forcing them to focus more on everyone instead of the dominating group.”

First, their demographic-aware method supplied algorithms with datasets that labeled subjects’ gender — male or female — and race — white, Black, Asian or other — and instructed them to minimize errors on the less represented groups.

“We’re essentially telling the algorithms that we care about overall performance, but we also want to guarantee that the performance of every group meets certain thresholds, or at least is only so much below the overall performance,” Lyu says.
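A minimal sketch of that general idea, weighting samples by the size of their labeled group and checking that no group falls too far below overall accuracy, is given below. It illustrates the concept only and is not the team's actual objective function or code.

```python
# Sketch of the demographic-aware idea in its simplest form: weight each
# sample by the inverse size of its labeled group, then check that no
# group's accuracy falls more than a margin below overall accuracy.
# Illustrative only; not the UB team's method.
import numpy as np

def group_weights(group_ids):
    """Inverse-frequency weight per sample, so small groups are not drowned out."""
    groups, counts = np.unique(group_ids, return_counts=True)
    freq = dict(zip(groups, counts))
    return np.array([1.0 / freq[g] for g in group_ids])

def meets_group_thresholds(correct, group_ids, margin=0.05):
    """True if every group's accuracy is within `margin` of overall accuracy."""
    overall = correct.mean()
    for g in np.unique(group_ids):
        if correct[group_ids == g].mean() < overall - margin:
            return False
    return True

# Made-up example: 10,000 vs 100 samples, as in the imbalance example above
rng = np.random.default_rng(0)
groups = np.array([0] * 10_000 + [1] * 100)
correct = rng.random(groups.size) < np.where(groups == 0, 0.95, 0.85)
print("weights (one from each group):", group_weights(groups)[[0, -1]])
print("per-group thresholds met:", meets_group_thresholds(correct, groups))
```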

However, datasets typically aren’t labeled for race and gender. Thus, the team’s demographic-agnostic method classifies deepfake videos not based on the subjects’ demographics — but on features in the video not immediately visible to the human eye.

“Maybe a group of videos in the dataset corresponds to a particular demographic group or maybe it corresponds with some other feature of the video, but we don't need demographic information to identify them,” Lyu says. “This way, we do not have to handpick which groups should be emphasized. It’s all automated based on which groups make up that middle slice of data.”
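The sketch below illustrates the demographic-agnostic idea in its simplest form: cluster samples by their feature vectors, find the worst-performing cluster, and upweight it during training. The clustering choice and weights are assumptions, not the authors' implementation.

```python
# Sketch of the demographic-agnostic idea: group samples by clustering their
# feature vectors (no race or gender labels), measure per-cluster error, and
# upweight the worst cluster. Illustrative only.
import numpy as np
from sklearn.cluster import KMeans

def worst_cluster_weights(features, per_sample_loss, n_clusters=3, boost=3.0):
    """Return per-sample weights that emphasize the highest-loss cluster."""
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(features)
    cluster_loss = np.array([per_sample_loss[labels == c].mean() for c in range(n_clusters)])
    worst = cluster_loss.argmax()
    weights = np.ones(len(features))
    weights[labels == worst] *= boost
    return weights / weights.mean()  # keep the average weight at 1

# Made-up features and losses for illustration
rng = np.random.default_rng(1)
feats = rng.normal(size=(300, 8))
losses = rng.random(300)
w = worst_cluster_weights(feats, losses)
print("weight range:", w.min().round(2), "to", w.max().round(2))
```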

Improving fairness — and accuracy

The team tested their methods using the popular FaceForensics++ dataset and the state-of-the-art Xception detection algorithm. This improved all of the algorithm’s fairness metrics, such as equal false positive rate among races, with the demographic-aware method performing best of all.
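For reference, a fairness metric of the kind mentioned above can be computed as the gap in false positive rates across groups. The snippet below uses made-up labels purely to show the calculation.

```python
# Minimal computation of the fairness metric mentioned above: the gap in
# false positive rates across demographic groups (a false positive here
# meaning a real image flagged as fake). Data below are made up.
import numpy as np

def false_positive_rate_gap(y_true, y_pred, group_ids):
    """Max minus min FPR across groups; 0.0 means equal false positive rates."""
    fprs = []
    for g in np.unique(group_ids):
        mask = (group_ids == g) & (y_true == 0)   # real images in group g
        fprs.append((y_pred[mask] == 1).mean())   # fraction flagged as fake
    return max(fprs) - min(fprs)

y_true = np.array([0, 0, 0, 0, 1, 1, 0, 0])       # 0 = real, 1 = fake
y_pred = np.array([0, 1, 0, 0, 1, 1, 1, 1])       # detector output
groups = np.array([0, 0, 0, 0, 0, 1, 1, 1])       # hypothetical group labels
print("FPR gap:", false_positive_rate_gap(y_true, y_pred, groups))
```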

Most importantly, Lyu says, their methods actually increased the overall detection accuracy of the algorithm — from 91.49% to as high as 94.17%.

However, when using the Xception algorithm with different datasets and the FF+ dataset with different algorithms, the methods — while still improving most fairness metrics — slightly reduced overall detection accuracy.

“There can be a small tradeoff between performance and fairness, but we can guarantee that the performance degradation is limited,” Lyu says. “Of course, the fundamental solution to the bias problem is improving the quality of the datasets, but for now, we should incorporate fairness into the algorithms themselves.”