Wednesday, January 17, 2024

 

Canada and Spain scientists establish new Antarctic Ocean observatory


Ocean Networks Canada, an initiative of the University of Victoria, will extend its ocean monitoring outside Canadian waters


Business Announcement

UNIVERSITY OF VICTORIA

ONC graphic composite

IMAGE: A partnership between Ocean Networks Canada, an initiative of the University of Victoria, and the Spanish National Research Council will advance scientific understanding of one of the most under-observed parts of the planet, the Southern Ocean, or the Antarctic Ocean.

CREDIT: Ocean Networks Canada





Canadian and European experts in polar observation are joining forces in a new partnership that will see Ocean Networks Canada (ONC) operating a subsea observatory at the Spanish Antarctic Station, providing year-round, near real-time data on ocean conditions there. This is the first time that ONC will extend its ocean monitoring outside Canadian waters.

This partnership between ONC, a University of Victoria initiative, and the Spanish National Research Council (CSIC) will advance scientific understanding of one of the most under-observed parts of the planet, the Southern Ocean, or the Antarctic Ocean.

The Spanish polar research vessel Hespérides, which is transporting the ONC observatory, is en route to the Spanish Antarctic Station (BAE) Juan Carlos I, located on Livingston Island in the South Shetland Islands archipelago north of the Antarctic Peninsula. It is also carrying two ONC deep-sea Argo floats that will be deployed during transit through the Drake Passage in the Southern Ocean.

The Hespérides, which departed Barcelona last fall, is scheduled to depart the Argentine port of Ushuaia this week and reach BAE Juan Carlos I in Antarctica later this month.

ONC president and CEO Kate Moran says this partnership marks a tremendous milestone in polar scientific collaboration.

“Ocean Networks Canada has been monitoring Arctic conditions since 2012 through its network of Indigenous community-led and remote coastal observatories that provide continuous ocean data, available to all, through the Oceans 3.0 data portal on the ONC website,” she says. 

“ONC’s expertise in designing and successfully operating underwater observatories able to withstand harsh polar conditions will contribute to Spain’s scientific expertise in monitoring Antarctica, a continent that is critical to this planet’s climate system, and is undergoing rapid, consequential changes that we need to understand.”

Today’s announcement from ONC and CSIC follows a recent call for the urgent expansion of ocean monitoring in the Southern Ocean. In a joint statement released at the 2023 Southern Ocean Observing System (SOOS) Symposium, 300 scientists from 25 nations said that “the chronic lack of observations for the Southern Ocean challenges our ability to detect and assess the consequences of change.”

Jordi Sorribas Cervantes, director of the Unit of Marine Technology of the CSIC, says the crew will construct and deploy the observatory on arrival as part of the activities of the 2023-24 Spanish Antarctic Campaign.

“This partnership with Ocean Networks Canada will provide vital ocean science data in the Southern Ocean, not least because the new observatory will operate year-round outside of the station’s staffed summer months,” he says. “Having access to this near-continuous data, from anywhere in the world, will help meet the current data gap challenge in the Southern Ocean.”

The proposed site of the cabled seafloor observatory is in a small embayment called Johnsons Dock at a depth of 23 metres. It is modelled on one of ONC’s Arctic observatories at Gascoyne Inlet, and will use the Iridium satellite network to transmit the data every 30 minutes to ONC for processing, archival and distribution. 

The observatory will consist of a CTD scientific instrument that measures conductivity, temperature and depth. Additional sensors will track dissolved oxygen concentrations as well as optical properties including turbidity and chlorophyll-a that are important for monitoring seawater quality at this location where freshwater glacier melt and ocean water meet.
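To make the telemetry concrete, the sketch below shows, in Python, what one half-hourly record and its satellite payload might look like. The field names, units and packaging are assumptions made for illustration; they are not ONC's actual Oceans 3.0 data schema or transmission format.

from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

# Hypothetical reading from the cabled observatory; field names and units
# are illustrative only, not ONC's actual data schema.
@dataclass
class ObservatoryReading:
    timestamp_utc: str
    conductivity_s_per_m: float        # CTD: conductivity
    temperature_c: float               # CTD: temperature
    depth_m: float                     # CTD: depth (from pressure)
    dissolved_oxygen_ml_per_l: float
    turbidity_ntu: float
    chlorophyll_a_ug_per_l: float

def package_for_uplink(readings):
    """Bundle the latest readings into one JSON payload for a hypothetical
    30-minute satellite uplink (packaging details are assumed)."""
    return json.dumps([asdict(r) for r in readings]).encode("utf-8")

sample = ObservatoryReading(
    timestamp_utc=datetime.now(timezone.utc).isoformat(),
    conductivity_s_per_m=2.9, temperature_c=-0.8, depth_m=23.0,
    dissolved_oxygen_ml_per_l=7.5, turbidity_ntu=1.2,
    chlorophyll_a_ug_per_l=0.6,
)
print(len(package_for_uplink([sample])), "bytes queued for transmission")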

The observatory, along with ONC’s two autonomous deep Argo floats, will help monitor the changing biogeochemical and physical ocean processes in this under-observed, sensitive region.

Polar Knowledge Canada (POLAR) manages Canada’s scientific contributions and commitments to the Antarctic Treaty. David Hik, POLAR chief scientist and executive director, says this new partnership between Spain and Canada marks an important milestone in advancing ocean monitoring.

“We are delighted that ONC is contributing its expertise and infrastructure to Antarctica and Southern Ocean research to advance knowledge as well as Canadian leadership in polar science and technology.” 

The research partnership aligns with UVic’s commitment to the United Nations Sustainable Development Goals, and with its strengths in climate action, life below water, life on land, and sustainable cities and communities. Learn more about UVic’s climate leadership at uvic.ca/IMPACT.

“It’s exciting to see ONC’s transformative leadership in ocean science expand internationally to inform climate solutions beyond Canada’s three coasts,” says Lisa Kalynchuk, vice-president, research and innovation, at UVic and member of ONC’s board of directors. “This partnership demonstrates how coastal communities and scientists from around the world can drive technological innovation and scientific discovery.”

Learn more about ONC’s recent Argo float deployments and world-leading ocean observatories

Track the live passage of the Hespérides on its journey to the Spanish Antarctic station, BAE Juan Carlos I.

Ocean Networks Canada (ONC) operates world-leading observatories in the deep ocean, coastal waters and land of the Pacific, Atlantic and Arctic coasts of Canada. ONC’s cabled observatories supply continuous power and internet connectivity to scientific instruments, cameras and 12,000-plus ocean sensors. ONC also operates ocean mobile and land-based assets, including coastal radar. ONC, an initiative of the University of Victoria, is supported by the Government of Canada, and is one of the country’s major research facilities.  

The Spanish National Research Council (CSIC) is the largest public research institution in Spain and one of the most renowned institutions in the European Research Area. The CSIC’s Unit of Marine Technology manages the Spanish Antarctic station Juan Carlos I and Camp Byers on Livingston Island, and also coordinates the overall logistics of the Spanish Antarctic campaign.

 

-- 30 --



SCI-FI-TEK 70YRS IN DEVELOPMENT

Smooth operation of future nuclear fusion facilities is a matter of control


The Lehigh University Plasma Control Group, supported by a new $1.6 million DOE grant, continues work on advancing plasma dynamics simulation capabilities and algorithms to control superheated gases that hold promise for limitless, clean energy


Grant and Award Announcement

LEHIGH UNIVERSITY

ITER

IMAGE: The first module of ITER’s toroidal plasma chamber is placed inside the tokamak pit. Each of the nine modules, which is composed of a 40° vacuum vessel sector covered by thermal shields and two superconducting toroidal-field coils, weighs approximately the equivalent of four fully loaded Boeing 747s and is as tall as a six-story building. The ITER tokamak, under construction in southern France, will be the largest fusion reactor in the world.

CREDIT: Eugenio Schuster/Lehigh University





As researchers around the world work to develop viable alternatives to fossil fuels, the prospect of nuclear fusion—harnessing the same energy-generating reactions that power the sun—has grown increasingly attractive to private equity firms.

In 2022, the U.S. Department of Energy launched a partnership with investors in the private sector to accelerate the development of fusion energy, in part through the development of a fusion pilot plant, or FPP, in the United States.

The FPP and ITER—the world’s largest nuclear fusion reactor, currently being built in France—represent the future of fusion, according to Eugenio Schuster, a professor of mechanical engineering and mechanics in Lehigh University’s P.C. Rossin College of Engineering and Applied Science. Ensuring that future is a success, however, requires the reactors to operate within safety, stability, and controllability boundaries, he says.

Schuster, who directs the Lehigh University Plasma Control Group, recently received a $1.6 million grant from the DOE to conduct experiments on the DIII-D National Fusion Facility in San Diego that will ultimately serve to improve the operation of ITER, FPP, and future reactors through the development of advanced controls and the application of machine learning.

The three-year grant is part of a $16 million DOE initiative to fund projects focused on advancing innovative fusion technology and collaborative research on small-scale experiments and on the DIII-D tokamak. The DIII-D tokamak is a device designed not to produce energy, but to help researchers study the physics of the plasma in which energy-releasing fusion reactions would take place and to develop the technology needed for reactor operation.

“There are two goals with this project,” says Schuster. “The first is to improve the operation of DIII-D itself, and the second is to address the technological issues that we’ll be facing with both ITER and FPP.”

Those issues are myriad and exceedingly complex. For one, researchers like Schuster cannot study the type of plasma, called burning plasma, that will be present in fusion reactors.

“In devices like DIII-D, we create a very hot gas in a plasma state, and the goal is to study the physics of that plasma and how to stably confine it,” he says. “But this is not the type of gas that we’re going to use in ITER or in FPP. In fact, the employed gas lowers the probability of fusion in DIII-D instead of increasing it. We don’t want to have nuclear fusion reactions in these experimental devices because then everything becomes radioactive and it becomes much more complicated to carry out experiments.”

Instead, researchers must develop the tools they’ll ultimately need to control the burning plasma of a reactor like ITER or FPP, on a test bed like DIII-D. Using DIII-D, the team will emulate the conditions of an actual nuclear reactor to better understand the dynamics of the plasma and to develop the necessary control solutions. These solutions will ultimately be extrapolated to ITER or FPP by following a model-based approach.  

“You have three key elements in the control design,” he says. “One is the control algorithm, or the controller, and that’s what we work on in my group. Based on plasma-response models, we develop algorithms that will keep the plasma dynamics stable and reach the level of performance needed in reactor operation. The other two elements are the actuators, which can modify the dynamics, and the sensors that are your eyes, basically, into what’s happening.”
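For readers unfamiliar with feedback control, the minimal Python sketch below illustrates the sensor-controller-actuator loop Schuster describes, using a toy proportional controller on a single scalar quantity. It is purely illustrative: not a plasma model and not the Lehigh group's algorithms.

# Toy sensor -> controller -> actuator loop (illustrative only).
def sensor(state):
    """Measure the quantity we want to regulate."""
    return state["temperature"]

def controller(measurement, setpoint, gain=0.5):
    """Proportional control: the actuator command scales with the error."""
    return gain * (setpoint - measurement)

def actuator(state, command):
    """Apply the command to the plant, nudging the regulated quantity."""
    state["temperature"] += command
    return state

state = {"temperature": 20.0}   # arbitrary starting value
setpoint = 100.0                # arbitrary target value
for step in range(10):
    command = controller(sensor(state), setpoint)
    state = actuator(state, command)
    print(f"step {step}: temperature = {state['temperature']:.1f}")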

His team needs to develop algorithms that are smart enough to indicate to the actuators when priorities change. For example, if an actuator is being used to control the temperature of the plasma, but a sudden instability occurs that could cause what Schuster half-jokingly describes as a “poof”—when the hot gas is no longer confined and hits the inner wall of the reactor with all the force of a small earthquake—the algorithm needs to indicate to the plant supervisor that temperature is no longer relevant. The actuator must be repurposed to stabilize the plasma.

“It’s important that the algorithms that we expect to prove on DIII-D can be extrapolated to these other machines,” he says. “They cannot fail. This poof can never happen. The plasma inside ITER and FPP will be 100 million degrees Celsius, and if gas that hot becomes unconfined, it can really damage the reactor. So we have to develop these control algorithms for a machine that cannot fail, but we can’t do any experimentation with that machine.”

He stresses that all the problems inherent in such a challenge won’t be solved within a three-year project. But the work will move the state of the art forward in incremental yet meaningful steps.

Another goal of the project is to bring more researchers into the field of nuclear fusion energy. The workforce necessary for progress hasn’t kept up with the increasing interest in fusion and the nascent nuclear-fusion industry, says Schuster, and the grant will support four PhD students who will all eventually work in San Diego as part of their doctoral program.

“There are several Lehigh graduates who are now a part of the permanent staff at the DIII-D National Fusion Facility,” he says. “A key part of this project is developing the human resources for the next generation of tokamaks that we plan on building.”

He says it’s an exciting time to be in the field. Advances in magnet technology, materials, computational processing, and machine learning—combined with substantial public and private investment—are all bringing the dream of limitless, clean energy a bit closer to reality.

“If I didn’t believe in it, I wouldn’t be working on it,” he says. “I’m feeling optimistic that it will, eventually, happen.”

 

Microfossils shed light on the long fossil record of euglenoids

A 400-million-year evolutionary history

Peer-Reviewed Publication

UNIVERSITEIT UTRECHT FACULTEIT GEOWETENSCHAPPEN

Light microscope images of euglenoid cysts from the Triassic-Jurassic boundary

IMAGE: Light microscope images of euglenoid cysts from the Triassic-Jurassic boundary (approx. 200 million years old) in the Schandelah-1 core in Germany (left) and from Triassic sediments in Winterswijk, the Netherlands (right). The specimens are between 20 and 30 micrometers in diameter. Photos: Bas van de Schootbrugge.

CREDIT: Bas van de Schootbrugge, Utrecht University

Hiding in the shadows, euglenoids are a fascinating group of single-celled protists that are neither plant nor animal. Plants photosynthesize, and animals eat. Euglenoids do both. Spiraling along the murky bottoms of shallow freshwater ponds with their long flagella, they eat organic goop while also using their chloroplasts and light to convert CO2 and water into sugars. Because of this in-between status, euglenoids have been placed close to the very base of the eukaryotic branch of the tree of life that includes all plants, fungi, and animals. However, while euglenoids likely evolved more than 1 billion years ago, they appear to have left only a very scant fossil record.

In a new study published in the journal Review of Palaeobotany and Palynology, a team of Dutch, American, British, German, and Australian scientists shed new light on a group of “problematic” microfossils that have remained a mystery for nearly a century. By comparing microscopic fossil cysts in 200-million-year-old pond sediments from cores drilled in Germany and the Netherlands with much older Paleozoic remains, with much younger remains from Holocene lakes in Greece, and finally with living protists in a pond in Australia, the researchers establish a 400-million-year evolutionary history of the euglenoids.

What’s in a name?

In 2012, Bas van de Schootbrugge, then at the Goethe University in Frankfurt am Main, and Paul Strother from Boston College, while working on a variety of problematic microfossils in sediments from around the Triassic-Jurassic boundary, realized that the circular, striated cysts they were seeing could in fact be euglenoid cysts. “We had this amazing drill core material at our disposal that contained many unidentified microfossils, including some of the oldest butterfly remains that we published on in 2018,” said Bas van de Schootbrugge, now at Utrecht University. Paul Strother continued: “Some of the microfossils we encountered showed an uncanny similarity to cysts of Euglena, a modern representative that had been described by Slovakian colleagues. The problem was, there was only one publication in the world making this claim.”

Even more unsettling: after an extensive literature review, van de Schootbrugge and Strother realized that the same type of microfossil had been given many different names. Scientists working on Quaternary and Holocene time slices used Concentricystes, referring to a possible algal cyst with concentric ribs. But Mesozoic workers used Pseudoschizaea, originally thinking it could have been a fern spore. Even older fossils from the Permian, Devonian, Silurian and Ordovician were known as Circulisporites and Chomotriletes.

Transmission electron microscopy

After the authors had disentangled the taxonomic confusion, compiling in the process nearly 500 literature sources related to any of the four taxa, more advanced microscope techniques were needed to establish the ultrastructure of the cysts with the help of transmission electron microscopy (TEM). This required picking single specimens, embedding them, and microtome slicing by University of Wisconsin-Eau Claire co-author Wilson Taylor. Because the specimens in the Triassic-Jurassic cores were mostly damaged, the team turned to palynologist Andreas Koutsodendris at Heidelberg University (Germany), who had access to Holocene and Pliocene core samples containing abundant well-preserved specimens. Andreas Koutsodendris said: “I am encountering these cysts regularly in cores drilled in lakes, for example in Lake Vouliagmeni in Greece that we studied here, but their biological affinity has never been cleared. In fact, the cysts are commonly figured in publications by colleagues, but no one was able to really put a finger on it.” Wilson Taylor continued: “We were much surprised by the ultrastructure of the cysts. The structure of the wall does not resemble anything that is known. The ribs are not ornaments, like in pollen and spores, but part of the wall structure. The layered structure of the walls is also clearly different from many other fresh-water green algae.”

Nagging uncertainty

While the TEM analysis initially added more mystery, the results did align with a study published in 2021 by another group of colleagues that looked at the ultrastructure of Pseudoschizaea. At least it was possible to show that Holocene and Pliocene Concentricystes and Jurassic Pseudoschizaea are in fact the same. But there remained one nagging uncertainty and that was the lack of any cysts produced by living euglenoids. Wilson Taylor: “We did contact several biologists working on living euglenoids, but no one had been able to make euglenoids encyst in a lab setting, allowing for extraction and TEM analyses of the cysts”.

Microscopic life down under

Enter Fabian Weston. By chance, Strother and van de Schootbrugge stumbled across superb video material posted on YouTube by microscopy enthusiast Fabian Weston from Sydney, Australia. In 2020, Fabian Weston had put a drop of water sampled from a nearby pond in New South Wales on a microscope slide and, using his advanced set-up at The Protist Lab, filmed Euglena as it gracefully moved in and out of focus. For reasons that remain poorly understood, but could be related to the drying out of the water under the cover slip, Euglena is then seen to ball up and form a thick wall with ribs that is akin to the cysts found throughout the fossil record. “Unwittingly, Fabian provided a key piece of evidence. He is probably the only person on the planet to have witnessed Euglena encyst under a microscope,” Strother said.

Significance and next steps

Based on all the available pieces of the puzzle, the authors thus link euglenoids from a pond in Australia to fossil cysts that are more than 400 million years old, establishing a deep time record of the euglenoids. “This opens the door for recognizing even older examples, for example from Precambrian records that go back to the very root of the eukaryotic tree of life”, Strother said. “Now that we know which organisms produced those cysts, we can also use them for paleo-environmental interpretations. Their abundance around two of the largest mass-extinction events of the past 600 million years is a tell-tale sign of some major upheavals on the continents related to increased precipitation under extreme greenhouse climate conditions.” Van de Schootbrugge concluded: “Perhaps related to their capabilities to encyst, these organisms have endured and survived every major extinction on the planet. Unlike the behemoths that were done in by volcanoes and asteroids, these tiny creatures have weathered it all.” Extending their research, the team intends to travel to Australia in the near future to scour for preserved Euglena cysts in pond and lake sediments in New South Wales.

Light microscope images of euglenoid cysts from Holocene to recent Lake Vouliagmeni in Greece. Specimens are between 20 and 30 micrometers in diameter. Note the fingerprint-like patterning that is a shared characteristic of all fossil forms.

CREDIT

Andreas Koutsodendris, Heidelberg University



Transmission Electron Microscope (TEM) image of the wall structure of a Holocene euglenoid cyst from Lake Vouliagmeni, Greece.

CREDIT

Wilson Taylor, University of Wisconsin - Eau Claire


Video stills from encysting Euglena from New South Wales, Australia.

CREDIT

Fabian Weston, Protist Lab Films, Galston

 

Study: New deepfake detector designed to be less biased


With facial recognition performing worse on certain races and genders, algorithms developed at UB close the gap


Reports and Proceedings

UNIVERSITY AT BUFFALO





BUFFALO, N.Y. — The image spoke for itself. 

University at Buffalo computer scientist and deepfake expert Siwei Lyu created a photo collage out of the hundreds of faces that his detection algorithms had incorrectly classified as fake — and the new composition clearly had a predominantly darker skin tone.

“A detection algorithm’s accuracy should be statistically independent from factors like race,” Lyu says, “but obviously many existing algorithms, including our own, inherit a bias.”

Lyu, PhD, co-director of the UB Center for Information Integrity, and his team have now developed what they believe are the first-ever deepfake detection algorithms specifically designed to be less biased.

Their two machine learning methods, one that makes algorithms aware of demographics and one that leaves them blind to them, reduced disparities in accuracy across races and genders while in some cases also improving overall accuracy.

The research was presented at the Winter Conference on Applications of Computer Vision (WACV), held Jan. 4-8, and was supported in part by the U.S. Defense Advanced Research Projects Agency (DARPA). 

Lyu, the study’s senior author, collaborated with his former student, Shu Hu, PhD, now an assistant professor of computer and information technology at Indiana University-Purdue University Indianapolis, as well as George Chen, PhD, assistant professor of information systems at Carnegie Mellon University. Other contributors include Yan Ju, a PhD student in Lyu’s Media Forensic Lab at UB, and postdoctoral researcher Shan Jia.

Ju, the study’s first author, says detection tools are often less scrutinized than the artificial intelligence tools they keep in check, but that doesn’t mean they don’t need to be held accountable, too. 

“Deepfakes have been so disruptive to society that the research community was in a hurry to find a solution,” she says, “but even though these algorithms were made for a good cause, we still need to be aware of their collateral consequences.”

Demographic aware vs. demographic agnostic

Recent studies have found large disparities in deepfake detection algorithms’ error rates — up to a 10.7% difference in one study — among different races. In particular, it’s been shown that some are better at guessing the authenticity of lighter-skinned subjects than darker-skinned ones.

This can result in certain groups being more at risk of having their real image pegged as a fake, or perhaps even more damaging, a doctored image of them pegged as real. 

The problem is not necessarily the algorithms themselves, but the data they’ve been trained on. Middle-aged white men are often overrepresented in such photo and video datasets, so the algorithms are better at analyzing them than they are at analyzing underrepresented groups, says Lyu, SUNY Empire Professor in the UB Department of Computer Science and Engineering, within the School of Engineering and Applied Sciences.

“Say one demographic group has 10,000 samples in the dataset and the other only has 100. The algorithm will sacrifice accuracy on the smaller group in order to minimize errors on the larger group,” he adds. “So it reduces overall errors, but at the expense of the smaller group.”

While other studies have attempted to make databases more demographically balanced — a time-consuming process — Lyu says his team’s study is the first attempt to actually improve the fairness of the algorithms themselves.

To explain their method, Lyu uses an analogy of a teacher being evaluated by student test scores. 

“If a teacher has 80 students do well and 20 students do poorly, they’ll still end up with a pretty good average,” he says. “So instead we want to give a weighted average to the students around the middle, forcing them to focus more on everyone instead of the dominating group.”

First, their demographic-aware method supplied algorithms with datasets that labeled subjects’ gender (male or female) and race (white, Black, Asian or other) and instructed them to minimize errors on the less represented groups.

“We’re essentially telling the algorithms that we care about overall performance, but we also want to guarantee that the performance of every group meets certain thresholds, or at least is only so much below the overall performance,” Lyu says.

However, datasets typically aren’t labeled for race and gender. Thus, the team’s demographic-agnostic method classifies deepfake videos not based on the subjects’ demographics — but on features in the video not immediately visible to the human eye.

“Maybe a group of videos in the dataset corresponds to a particular demographic group or maybe it corresponds with some other feature of the video, but we don't need demographic information to identify them,” Lyu says. “This way, we do not have to handpick which groups should be emphasized. It’s all automated based on which groups make up that middle slice of data.”
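As a rough illustration of the underlying idea of not letting a large group drown out a small one, the Python sketch below computes a generic group-weighted loss in which every group contributes equally to the objective regardless of its sample count. The weighting scheme, group labels and numbers are assumptions made for illustration, not the UB team's published training objective.

import numpy as np

def group_weighted_loss(per_sample_loss, group_ids):
    """Average the mean loss of each group, so a group with 100 samples
    counts as much as a group with 10,000 (illustrative weighting only)."""
    per_sample_loss = np.asarray(per_sample_loss, dtype=float)
    group_ids = np.asarray(group_ids)
    group_means = [per_sample_loss[group_ids == g].mean()
                   for g in np.unique(group_ids)]
    return float(np.mean(group_means))

# Toy example: 10,000 samples from group A, 100 from group B.
rng = np.random.default_rng(0)
loss_a = rng.normal(0.1, 0.02, 10_000)   # large group, low loss
loss_b = rng.normal(0.6, 0.05, 100)      # small group, high loss
losses = np.concatenate([loss_a, loss_b])
groups = np.array(["A"] * 10_000 + ["B"] * 100)

print("plain mean loss:    ", round(float(losses.mean()), 3))   # dominated by group A
print("group-weighted loss:", round(group_weighted_loss(losses, groups), 3))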

Improving fairness — and accuracy

The team tested their methods using the popular FaceForensics++ dataset and the state-of-the-art Xception detection algorithm. The methods improved all of the algorithm’s fairness metrics, such as equal false positive rates among races, with the demographic-aware method performing best of all.

Most importantly, Lyu says, their methods actually increased the overall detection accuracy of the algorithm — from 91.49% to as high as 94.17%.

However, when the Xception algorithm was used with different datasets, and when the FaceForensics++ dataset was used with different algorithms, the methods, while still improving most fairness metrics, slightly reduced overall detection accuracy.

“There can be a small tradeoff between performance and fairness, but we can guarantee that the performance degradation is limited,” Lyu says. “Of course, the fundamental solution to the bias problem is improving the quality of the datasets, but for now, we should incorporate fairness into the algorithms themselves.”

 

A statewide survey shows the digital divide narrowing in California, but many low-income residents remain under-connected


Reports and Proceedings

UNIVERSITY OF SOUTHERN CALIFORNIA

Broadband adoption by presence of school-age children

IMAGE: Broadband adoption by presence of school-age children

CREDIT: USC Annenberg School for Communication and Journalism and the California Emerging Technology Fund





Statewide broadband adoption remains high with 91% of households in California enjoying high-speed internet access at home, according to new survey results released today by USC, the California Emerging Technology Fund and the California Department of Technology.

The overall findings are consistent with the 2021 results of the biennial Statewide Digital Equity Survey, which monitors Californians’ digital access. The latest findings also reveal that the percentage of under-connected households — those with only a smartphone — was cut in half from 6% to 3%.

However, broadband adoption among families with school-age children decreased from 97% in 2021 to 93% in 2023, likely due to the expiration of school-based programs that sponsored internet connectivity during the COVID-19 pandemic.

In addition, fewer children in K-12 households have a desktop, laptop or tablet computer at home, not shared with other family members, to use for school activities, a decline from nearly 95% in 2021 to about 72% in 2023, the researchers found.

“There was significant progress in reducing the number of under-connected households,” said Hernán Galperin, the study’s lead researcher and a professor at the USC Annenberg School for Communication and Journalism. “However, our latest data also point to the sobering reality of the challenges in reaching the most digitally disadvantaged households.”

The 2023 survey is the largest endeavor to date by USC Annenberg researchers with CETF in collaboration with the California Department of Technology to obtain a highly representative sample of Californians, said Sunne Wright McPeak, president and CEO of CETF.

“We went to great lengths developing a robust methodology to get the clearest picture of how Californians are faring in broadband adoption,” said McPeak. “We now know we still have some miles to go for Californians to achieve full adoption. The survey confirms that affordability remains the major barrier to broadband adoption and underscores the need to ensure that low-income households always will have affordable home internet service available to them.”

The researchers used a multimodal approach — an effective method for sampling hard-to-reach populations, including unconnected and under-connected households — by combining text-to-online responses with telephone interviews conducted through random digit dialing.

In addition, researchers oversampled residents in rural counties and low-income households, as well as people with disabilities.

CETF has sponsored the Statewide Digital Equity Survey since 2008. USC has been the independent research partner for conducting the CETF surveys in 2021 and 2023.


Broadband adoption in California (IMAGE)

CREDIT: University of Southern California

More Californians come online

The results show that California has made significant progress in improving internet access among specific disadvantaged groups, such as:

  • households with disabilities, up from 83% in 2021 to 91% in 2023
  • adults 60 years and older, increasing from 78% in 2021 to 90% in 2023
  • adults without a high-school degree, up from 64% in 2021 to 79% in 2023

The data also show that overall adoption trends among racial minority groups improved over the past decade. Among Asian Americans, broadband adoption levels have reached similar levels as those among non-Hispanic white residents. Hispanic/Latino residents still trail non-Hispanic white residents by about 10 percentage points; however, the most recent data show progress in broadband adoption after several years of declines, the researchers wrote.

“To truly eliminate the digital divide, we must confront racial and ethnic disparities by implementing targeted strategies and policies that uplift digitally disadvantaged communities,” said François Bar, co-author of the study and professor of communication and spatial sciences at USC Annenberg. “The insights gleaned from this latest survey can help shape these initiatives.”

###

About the survey

The Statewide Survey on Broadband Adoption has been conducted regularly since 2008.

Methodology

The 2023 Statewide Survey on Broadband Adoption and Digital Equity is one of the largest ever conducted on broadband adoption in California. The performance standards for the survey methodology were established by the California Department of Technology in collaboration with CETF. The methodology included telephone interviews conducted through random digit dialing (RDD) and text-to-online responses, as well as an oversample of pre-paid numbers to capture low-income households. People with disabilities were contacted based on a California Department of Rehabilitation randomized client list. Overall, the sample included 3,560 households, including 1,899 in the main sample. Another 1,661 households were part of a targeted oversample comprising 1,059 rural California households, 283 low-income households, and 319 people with disabilities. The overall margin of error for the survey was +/-3%.
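For context on how a margin of error relates to sample size, the Python sketch below applies the textbook simple-random-sampling formula and then inflates it with a hypothetical design effect. Surveys with weighting and oversamples, like this one, report wider margins than the naive calculation, so the numbers below are illustrative only; the published +/-3% reflects the study's actual design.

import math

def margin_of_error(n, p=0.5, z=1.96, design_effect=1.0):
    """95% margin of error for a proportion p with sample size n under
    simple random sampling, optionally inflated by a design effect
    (the design effect used below is a hypothetical example)."""
    return z * math.sqrt(design_effect * p * (1 - p) / n)

print(round(100 * margin_of_error(3560), 1), "% (naive, full sample)")
print(round(100 * margin_of_error(3560, design_effect=3.0), 1), "% (with a hypothetical design effect)")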

 

 

BSC predicts that global-mean temperature could reach the 1.5ºC warming level threshold in 2024


Annual global mean surface temperatures in 2024 will likely exceed the 1.5ºC threshold for the first time, according to the prediction carried out by the Climate Variability and Change group at the Barcelona Supercomputing Center


Reports and Proceedings

BARCELONA SUPERCOMPUTING CENTER

BSC researchers Markus Donat, Étienne Tourigny, Vladimir Lapin and Roberto Bilbao

IMAGE: BSC researchers Markus Donat, Étienne Tourigny, Vladimir Lapin and Roberto Bilbao.

CREDIT: BSC-CNS





2023 has just been confirmed as the hottest year on record, with global average temperatures exceeding pre-industrial conditions by 1.48°C, as stated by the Copernicus Programme of the European Union. Based on the BSC decadal forecast system, climate scientists from the Barcelona Supercomputing Center-Centro Nacional de Supercomputación (BSC-CNS) predicted a year ago that 2023 had a high probability of being the warmest year on record.

After the record-smashing conditions of 2023, the imminent question is how 2024 and the following years will turn out. The recently released decadal forecast indicates that annual global mean surface temperatures in 2024 will probably exceed even those of 2023, and that temperatures will continue to increase in the following years as greenhouse gas emissions continue.

The 2024-2033 prediction

Climate scientists of the Climate Variability and Change (CVC) group at the Earth Sciences Department of BSC have recently produced a decadal forecast for the next ten years, i.e., the 2024-2033 period. The BSC decadal forecast system predicts that the annual global-mean surface temperature for 2024 will be between 1.43 and 1.69ºC warmer than pre-industrial levels (defined as the average from 1850 to 1900), with a central estimate of 1.54ºC.

This means that temperatures in 2024 will likely be warmer than in 2023, and there is a high chance (74% probability) that annual global mean temperature will exceed the 1.5ºC threshold for the first time. The warming is primarily due to the continued emissions of greenhouse gases to the atmosphere due to human activities, particularly burning fossil fuels. The El Niño conditions developing in the Pacific Ocean, which are expected to peak in winter 2023/2024, also contribute to the exceptionally warm global mean temperature conditions.

BSC researcher Roberto Bilbao, who is primarily responsible for producing the BSC decadal forecast, stated: “Our decadal forecast system allows us to predict both the year-to-year variations and longer-term warming trends by considering the influences of greenhouse gas and aerosol emissions and the inherent natural variability of the climate system.”

In the next 10 years, surface temperatures are expected to continue increasing in response to continued greenhouse gas emissions. The BSC forecast system predicts that global mean temperatures averaged over the next two five-year periods (2024-2028 and 2029-2033) could reach between 1.49-1.79ºC and 1.67-1.94ºC above pre-industrial levels, respectively.

While an annual mean temperature exceeding the 1.5ºC threshold in 2024 would not necessarily breach the Paris Agreement, which refers to a 20-year average, it indicates that the world is rapidly approaching this threshold. Combining the past 10 years of observations with the 10-year BSC forecast, the mean of this 20-year period (2014-2033) is 1.41±0.05ºC. This indicates that we are on the brink of breaching the Paris Agreement in the coming years.
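The arithmetic behind that 20-year figure is simply an average of ten observed and ten forecast annual anomalies, as in the sketch below. The individual annual values are invented for illustration; they are not the observed record or the BSC forecast.

import numpy as np

# Placeholder annual global-mean anomalies relative to 1850-1900 (degC).
# These values are invented for illustration only.
observed_2014_2023 = np.array([0.93, 1.02, 1.23, 1.16, 1.10,
                               1.18, 1.27, 1.11, 1.16, 1.48])
forecast_2024_2033 = np.array([1.54, 1.56, 1.58, 1.60, 1.62,
                               1.65, 1.68, 1.72, 1.76, 1.80])

twenty_year_mean = np.concatenate([observed_2014_2023,
                                   forecast_2024_2033]).mean()
print(f"2014-2033 mean anomaly: {twenty_year_mean:.2f} degC")
# The Paris Agreement threshold is assessed against long-term averages
# like this, not against a single warm year.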

“Despite the possible year-to-year variations, where individual years can be slightly warmer or cooler than previous years, the global climate is still on a concerning warming trajectory, which is bringing us very close to breaching the goals that global leaders agreed upon in Paris in 2015,” explained ICREA Professor and co-leader of the CVC group Markus Donat.


Figure 1. Evolution of global mean surface air temperatures in observations and the latest forecasts by the decadal prediction system of the BSC.

CREDIT

BSC-CNS

The BSC decadal forecast system

Predicting the variations in climate for the near future is considered one of the most challenging problems the climate forecasting community faces. Until recently, seasonal forecasts (predictions for the next few months) and climate projections (information for long periods over the next 100 years) were the only sources of future climate information available to interested users.

However, newly developed decadal climate prediction systems can foresee variations from a year to a decade. Within this timescale, the evolution of the Earth system is impacted by both natural variability and external forcings (such as rising greenhouse gas concentrations in the atmosphere that cause global warming). Decadal predictions aim to fill the gap between seasonal predictions and climate projections, offering the potential to inform current adaptation and, thus, increase resilience.

The BSC is one of the four “Global Producing Centres of Near-Term Climate Prediction” endorsed by the World Meteorological Organization (WMO) that produce yearly operational decadal climate predictions. The CVC group develops the BSC decadal forecast system, which predicts changes in average climate conditions (such as temperature or precipitation, among many other variables) and in the frequency and intensity of extreme climate events (such as heatwaves, floods and droughts) over the next decade.

The service combines observational data with climate models (mathematical representations of the Earth’s climate, typically covering the atmosphere, ocean, sea ice and land) to provide the best estimate of the state of the climate system at a specific time. In addition, the BSC conducts research to enhance decadal predictions and to exchange knowledge with interested users, since predicting variations in climate over the upcoming 1-10 years offers multiple opportunities for adaptation to a changing climate in the near future and is crucial to support the development of a more resilient society.