Wednesday, July 10, 2024

'They're everywhere': Common foods linked to elevated levels of PFAS in body

Edward Carver, Common Dreams
July 5, 2024 

Eggs

Common foods including white rice and eggs are linked to higher levels of "forever chemicals" in the body, new research from scientists at the Geisel School of Medicine at Dartmouth shows.

The researchers also found elevated levels of per- and polyfluoroalkyl substances (PFAS) in people who consumed coffee, red meat, and seafood, based on plasma and breast milk samples of 3,000 pregnant people. The findings, published in Science of the Total Environment, add to the mounting evidence of the accumulation of PFAS, which were developed by chemical companies in the mid-20th century, in the natural environment and the body.

"The results definitely point toward the need for environmental stewardship, and keeping PFAS out of the environment and food chain," Megan Romano, a Dartmouth epidemiologist and co-author of the paper, toldThe Guardian. “Now we're in a situation where they're everywhere and are going to stick around even if we do aggressive remediation."

PFAS are a class of 16,000 compounds linked to a wide range of adverse health conditions including cancer, with research ongoing. The chemicals' development and production went effectively unregulated for decades, but they have received significant attention in recent years, with alarming studies coming out regularly.

3M, a consumer goods multinational that developed and manufactured many PFAS compounds, knew that they were accumulating dangerously in the blood of the general public, but concealed it, according to a recent investigation co-published by ProPublica and The New Yorker; the article was written by journalist Sharon Lerner, who previously reported on PFAS-related deception by 3M and DuPont for The Intercept.

Such corporations may yet face unprecedented legal action. As Steven Shapin wrote in the London Review of Books on Thursday, "It is thought that the monetary scale of American lawsuits against companies responsible for PFAS water pollution may eventually dwarf those involving asbestos and tobacco, considering that people are in a position to decide whether or not to smoke cigarettes but everybody has to drink water."

While much of the concern about PFAS has rightly centered on drinking water—in which they're found worldwide—that is just one of the ways the chemicals can get into the human body. A new study this week showed they can be absorbed through the skin.

Food intake is also a primary means of accumulation in the body, and the new Dartmouth study indicates which foods are the worst. The study doesn't explore why, though Romano discussed some possible reasons with The Guardian. Rice is likely contaminated because of PFAS in soil or agricultural water, while coffee could have PFAS because of various factors including filters. Animal products can be contaminated if, among other reasons, the ground that the animals lived off was treated with PFAS-fouled toxic sludge, which is used by farmers as a cheap alternative to fertilizer.

Even consumption of backyard chicken eggs led to elevated levels of PFAS, and that could be because of the table scraps the chickens are often fed, Romano said.

Beryl foreshadows future hurricanes, says UN weather agency

Agence France-Presse
July 5, 2024 

Tropical Storm Beryl is expected to strengthen into a major hurricane as it moves through the Caribbean over the coming days (HANDOUT)

The World Meteorological Organization, which is tracking Hurricane Beryl's deadly course through the Caribbean, told AFP that more storms with its hallmarks could be expected in the future.

The WMO, the United Nations' weather and climate agency, said the record-breaking tropical cyclone intensified rapidly, picking up energy over a warmer Atlantic Ocean and developing into a system with lots of heavy rain.

Anne-Claire Fontan, scientific officer at the WMO's tropical cyclone programme, said that Beryl signals a very active Atlantic hurricane season in 2024.

- How did Beryl develop? -


"It developed quite quickly in an area which was unusual for this time of year.

"It reached category 4 in June; that was the earliest we've ever seen. It reached category 5 quickly afterwards, so there is a very rapid intensification.

"It reached category 5 very early in the season. It's really very unusual. Hurricane Beryl really broke records.

"For more than a year, there has been a significant positive anomaly of hot water in this area.

"So that's a lot of energy for cyclones, since they feed on the energy of the ocean.


"With such a powerful system, this early in the hurricane season, it suggests... a very active season for 2024."

- Where will Beryl go next? -


"Beryl will head towards the Yucatan peninsula.

"We are expecting violent winds even if Beryl is expected to decrease in intensity.

"And then it's expected to come out into the Gulf of Mexico.

"There is a little more uncertainty on its trajectory after passing over the Yucatan, recognising that when a hurricane is cut off from... the ocean... it will probably weaken enormously.

"By returning to warm waters, it could intensify. So this is where there is uncertainty... It remains to be seen whether it will be Mexico or Texas (afterwards)."

- How will climate change affect hurricanes in future? -

"Beryl is an illustration of what we can expect in the future: systems which intensify rapidly with a lot of energy at the ocean level, therefore category 5 systems with a lot of rain.

"A warmer world with global warming does not necessarily mean more tropical cyclones... in terms of frequency.

"(But) we expect a shift towards much more powerful systems -- so with much higher winds.

"Another factor is that in a warmer atmosphere that is capable of retaining more moisture, there will be increased rain associated with tropical cyclones."

- What will their impact be? -


"The categorization of hurricanes relies on wind speeds, but the... dangers (are heavily linked to) rain, with all the hazards it brings, in the form of mudslides, flash floods. All of that will also increase.

"The sea level is rising. Tropical cyclones are associated with storm surges... which can cause catastrophic flooding when they make landfall, depending on the layout of the coast.

"So if the storm surges arrive with an already increased sea level, you can clearly see the flooding this can also cause.

"We have a huge population living near the coasts worldwide. So it will clearly be a problem to manage the populations at the coast."

- A longer hurricane season? -

"When it comes to what will happen to tropical cyclones in a warming world, in terms of a longer season, there is no information at the global level.

- Are systems geared to track such cyclones? -

"Trajectory-level tropical cyclone forecasts have improved significantly.

"There is room for improvement in terms of intensity forecasting, and in particular rapid intensification.

"Rapid intensifications are not yet well understood by digital weather forecast models. The scientific community is constantly working on this."

- How should vulnerable countries prepare? -


"It is very important that all countries do hurricane preparedness, in other words that they sensitise their population to the dangers presented by tropical cyclones, (explaining) how they should react depending on the degree of danger.

"Preparing really means systematically educating the population in advance to know how to act, to prepare their house, then the family; knowing where the shelters are."
Why is Amazon building a $2 billion TOP SECRET cloud for Australia’s military intelligence?

MILITARY INTELLIGENCE IS AN OXYMORON

The Conversation
July 5, 2024 

PopTika/Shutterstock

Amazon has secured an A$2 billion contract with the Australian Signals Directorate – the agency responsible for foreign signals intelligence and information security. A local subsidiary of Amazon Web Services will build a Top Secret Cloud to provide secure data storage for military intelligence.

Under the deal, AWS will securely manage top secret data vital to Australia’s national security. The contract is expected to last over a decade and includes building three secure data centres at undisclosed locations in Australia.

Prime Minister Anthony Albanese stated this project will “bolster our defence and national intelligence community to ensure they can deliver world-leading protection for our nation.”

Set to be operational by 2027, the project is expected to create over 2,000 jobs and cost billions more in operating expenses over the coming years. So – why Amazon? And does Australia really need it?

Why Australia needs a secret cloud


Australia faces a rising tide of security challenges. The capability to securely store military intelligence is vital to guard against a host of potential threats.

The Australian Signals Directorate’s Director-General, Rachel Noble, explained the project will provide a “state-of-the-art collaborative space for our intelligence and defence community to store and access top secret data.”

The cloud is also part of the directorate’s REDSPICE program, which aims to improve Australia’s intelligence capabilities and cyber defences. By moving to a modern cloud system, Australia can better protect its sensitive data. It will also improve coordination between different security agencies.

Why Amazon Web Services?


You may know Amazon only as an online retail giant. Amazon Web Services (AWS) is a tech subsidiary of Amazon and was a pioneer in the cloud services business.

Today, it provides cloud computing services to tens of thousands of businesses and governments worldwide.

AWS’s market share among the top ten cloud providers grew to 50.1% in 2024. Microsoft Azure and Google Cloud are the next two largest providers.

Known for its reliability, scalability and security, AWS already provides similar services to other governments and organizations globally. This includes the United States Department of Defense and the Central Intelligence Agency (CIA), as well as all three of the United Kingdom’s intelligence agencies.

Will the new cloud be safe?


When we think of “the cloud,” we often picture the internet we use every day.

However, the Top Secret Cloud that AWS will build for Australia’s military is very different. It’s a private, highly secure system entirely insulated from the public internet.

While AWS is the contractor, the data centres will be built to the Australian Signals Directorate’s specifications.

The cloud will use advanced encryption to protect data. No system is completely hack-proof, but this setup makes it extremely difficult for unauthorized individuals to access the information.

The Australian government has emphasised it will maintain full control over the data stored in the cloud. Only staff with high-level security clearance will work on the project.
Broader trend

This move to a secure cloud is part of a broader trend in government and military technology worldwide. Many countries are updating their old computer systems to take advantage of new technology. This can offer greater flexibility, better performance, and potentially lower costs in the long run.

The project also has international implications. The Top Secret Cloud will ease collaboration with partner nations.

Similar data clouds have already been established in the US and UK, allowing for the sharing of large amounts of information between allies. It’s worth noting that potential adversaries are also investing heavily in similar technology.

By developing this Top Secret Cloud, Australia aims to stay ahead of the game in the rapidly evolving cyber threat environment. In the coming years, we’ll likely see more countries adopt similar cloud systems for their defence and intelligence needs.


David Tuffley, Senior Lecturer in Applied Ethics & CyberSecurity, Griffith University

This article is republished from The Conversation under a Creative Commons license. Read the original article.
Is AI a major drain on the world's energy supply?

RHETORICAL QUESTION

Agence France-Presse
July 5, 2024 

Data centres, particularly those that power artificial intelligence programs, are driving surging demand for electricity © JACQUES DEMARTHON / AFP/File

When Google announced this week that its climate emissions had risen by 48 percent since 2019, it pointed the finger at artificial intelligence.

US tech firms are building vast networks of data centres across the globe and say AI is fuelling the growth, throwing the spotlight on the amount of energy the technology is sucking up and its impact on the environment.

How does AI use electricity?

Every time a user punches a request into a chatbot or generative AI tool, the request is fired off to a data centre.

Even before that stage, developing AI programs known as large language models (LLMs) needs a huge amount of computer power.

All the while, the computers are burning through electricity and the servers get hotter, meaning more electricity to cool them.

The International Energy Agency (IEA) said in a report earlier this year that data centres in general used roughly 40 percent of their electricity on computing and 40 percent on cooling.

Why are experts worried?

Big tech firms have been rushing to pack all their products with AI ever since OpenAI launched its ChatGPT bot in late 2022.

Plenty of experts are concerned these new products will cause electricity usage to spike.

This is firstly because AI services require more power than their non-AI analogues.

For example, various studies have shown that each request made to ChatGPT uses roughly 10 times the power of a single Google search.


So if Google switches all search queries to AI -- about nine billion a day -- it could hugely inflate the company's electricity usage.

And most of these new services and products rely on LLMs.

Training these models is extremely compute- and energy-intensive and usually requires high-powered computer chips.

They in turn require more cooling, which uses more electricity.
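
To make the scale of that comparison concrete, here is a minimal back-of-envelope sketch in Python. The per-query figures (roughly 0.3 Wh for a conventional search, ten times that for an AI query, per the ratio cited above) and the daily search volume are illustrative assumptions, not numbers reported in this article.

```python
# Back-of-envelope comparison (illustrative assumptions only):
# ~0.3 Wh per conventional Google search is a commonly cited estimate,
# and the "roughly 10 times" ratio above gives ~3 Wh per AI query.
SEARCHES_PER_DAY = 9e9                 # approximate daily Google searches (assumption)
WH_PER_SEARCH = 0.3                    # assumed energy per conventional search (Wh)
WH_PER_AI_QUERY = WH_PER_SEARCH * 10   # per the ~10x figure cited above

def annual_twh(queries_per_day: float, wh_per_query: float) -> float:
    """Convert a per-query energy figure into terawatt-hours per year."""
    wh_per_year = queries_per_day * wh_per_query * 365
    return wh_per_year / 1e12          # 1 TWh = 1e12 Wh

baseline = annual_twh(SEARCHES_PER_DAY, WH_PER_SEARCH)
with_ai = annual_twh(SEARCHES_PER_DAY, WH_PER_AI_QUERY)
print(f"Conventional search: ~{baseline:.1f} TWh/year")
print(f"All queries via AI:  ~{with_ai:.1f} TWh/year (+{with_ai - baseline:.1f} TWh)")
```

Under these assumed inputs, moving every query to AI adds on the order of nine terawatt-hours a year for search alone; the real figure depends entirely on the per-query energy, which is contested.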

How much energy does AI use?

Before the era of AI, estimates generally suggested data centres accounted for around one percent of global electricity demand.

The IEA report said data centres, cryptocurrencies and AI combined used 460 TWh of electricity worldwide in 2022, almost two percent of total global electricity demand.

The IEA estimated that the figure could double by 2026 -- roughly equivalent to Japan's total electricity consumption.

Alex De Vries, a researcher who runs the Digiconomist website, modeled the electricity used by AI alone by focusing on sales projections from the US firm NVIDIA, which has cornered the market in AI-specialized servers.


He concluded in a paper late last year that if NVIDIA's projected sales for 2023 were correct and all those servers ran at full power, they alone could be responsible for between 85.4 and 134.0 TWh of annual electricity consumption -- an amount similar to the annual usage of Argentina or Sweden.

"The numbers I put in that article were already conservative to begin with because I couldn't include things like cooling requirements," he told AFP.

And he added that adoption of NVIDIA's servers had outstripped last year's projections, so the figures would certainly be higher.
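
The arithmetic behind fleet-level estimates of this kind is simple: number of servers, times power draw per server, times hours in a year. The sketch below uses hypothetical inputs chosen only to reproduce the 85.4-134.0 TWh range quoted above; they are not figures taken from De Vries' paper.

```python
# Fleet-level estimate: servers x power draw x hours per year.
# The inputs below are hypothetical, picked to reproduce the
# 85.4-134.0 TWh range quoted above; they are not published figures.
HOURS_PER_YEAR = 8760

def fleet_twh(num_servers: int, kw_per_server: float) -> float:
    """Annual consumption in TWh for a server fleet running at full power."""
    kwh = num_servers * kw_per_server * HOURS_PER_YEAR
    return kwh / 1e9                  # 1 TWh = 1e9 kWh

servers = 1_500_000                   # assumed AI server fleet size
for kw in (6.5, 10.2):                # assumed per-server power draw (kW)
    print(f"{servers:,} servers at {kw} kW: ~{fleet_twh(servers, kw):.1f} TWh/year")
```

As the quote above notes, such estimates exclude cooling, so the true total for an AI fleet of that size would be higher.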


How are data centres coping?

Fabrice Coquio of Digital Realty, a data centre company that leases its services to others, told AFP during a visit to one of its enormous facilities north of Paris in April that AI was going to transform his industry.

"It's going to be exactly the same (as the cloud), maybe a bit more massive in terms of the deployment," he said.


Part of Digital Realty's latest data centre hub in La Courneuve -- a gigantic edifice that looks like a football stadium -- will be dedicated to AI.

Coquio explained that normal computing requests could be handled by server racks in rooms with powerful air-conditioning.

But AI racks use much more powerful components, get much hotter and require water to be physically pumped into the equipment, he said.


"For sure, this requires different servers, storage equipment, communication equipment," Coquio said.

Is it sustainable?

The biggest players in AI and data centres -- Amazon, Google and Microsoft -- have been trying to reduce their carbon footprints by buying up vast amounts of renewable energy.

Amazon official Prasad Kalyanaraman told AFP that the firm's data centre division, AWS, was "the largest purchaser of renewable energy in the world today".

AWS is committed to being a net-zero carbon company by 2040. Google and Microsoft have pledged to reach that goal by 2030.

But building new data centres and ramping up usage in existing ones is not going to help with green energy targets.

Google and Microsoft have said in recent reports that their greenhouse gas emissions have been rising in the last few years.

Google flagged a 48 percent rise from 2019 and Microsoft a 30 percent increase from 2020.

Both have squarely blamed AI.

Microsoft President Brad Smith told Bloomberg in May the pledge was a "moonshot" made before the AI "explosion", adding that "the Moon is five times as far away as it was in 2020".

© 2024 AFP
‘Jaws’ portrayed sharks as monsters, but it also inspired a generation of shark scientists

The Conversation
July 6, 2024 

Great White Shark (Wikimedia Commons)

Human fear of sharks has deep roots. Written works and art from the ancient world contain references to sharks preying on sailors as early as the eighth century B.C.E.

Relayed back to land, stories about shark encounters have been embellished and amplified. Combined with the fact that sharks do, from time to time – very rarely – bite humans, these stories have primed people for centuries to imagine terrifying situations at sea.

In 1974, Peter Benchley’s bestselling novel “Jaws” fanned this fear into a wildfire that spread around the world. The book sold more than 5 million copies in the U.S. within a year and was quickly followed by Steven Spielberg’s 1975 movie, which became the highest-grossing film in history at that time. Virtually all audiences embraced the idea, depicted vividly in the movie and its sequels, that sharks were malevolent, vindictive creatures that prowled coastal waters seeking to feed on unsuspecting bathers.

But “Jaws” also spawned widespread interest in better understanding sharks.

Previously, shark research had largely been the esoteric domain of a handful of academic specialists. Thanks to interest sparked by “Jaws,” we now know that there are many more kinds of sharks than scientists were aware of in 1974, and that sharks do more interesting things than researchers ever anticipated. Benchley himself became an avid spokesman for shark protection and marine conservation.

In my own 30-year career studying sharks and their close relatives, skates and rays, I’ve seen attitudes evolve and interest in understanding sharks expand enormously. Here’s how things have changed.


Marine biologist Greg Skomal of the Massachusetts Division of Marine Fisheries captures video footage of a white shark off Cape Cod, Oct. 21, 2022. 

Swimming into the spotlight

Before the mid-1970s, much of what was known about sharks came via people who went to sea. In 1958, the U.S. Navy established the International Shark Attack File – the world’s only scientifically documented, comprehensive database of all known shark attacks – to reduce wartime risks to sailors stranded at sea when their ships sank.

Today the file is managed by the Florida Museum of Natural History and the American Elasmobranch Society, a professional organization for shark researchers. It works to inform the public about shark-human interactions and ways to reduce the risk of shark bites.

In 1962, Jack Casey, a pioneer of modern shark research, initiated the Cooperative Shark Tagging Program. This initiative, which is still running today, relied on Atlantic commercial fishermen to report and return tags they found on sharks, so that government scientists could calculate how far the sharks had moved after being tagged.

After “Jaws,” shark research quickly went mainstream. The American Elasmobranch Society was founded in 1982. Graduate students lined up to study shark behavior, and the number of published shark studies sharply increased.

Field research on sharks expanded in parallel with growing interest in extreme outdoor sports like surfing, parasailing and scuba diving. Electronic tags enabled researchers to monitor sharks’ movements in real time. DNA sequencing technologies provided cost-effective ways to determine how different species were related to one another, what they were eating and how populations were structured.

This interest also had a sensational side, embodied in the Discovery Channel’s launch in 1988 of Shark Week. This annual block of programming, ostensibly designed to educate the public about shark biology and counter negative publicity about sharks, was a commercial venture that exploited the tension between people’s deep-seated fear of sharks and their yearning to understand what made these animals tick.

Shark Week featured made-for-TV stories that focused on fictional scientific research projects. It was wildly successful and remains so today, in spite of critiques from some researchers who call it a major source of misinformation about sharks and shark science.
Physical, social and genetic insights

Contrary to the long-held notion that sharks are mindless killers, they exhibit a wide range of traits and behavior. For example, the velvet belly lantern shark communicates through flashes of light from organs on the sides of its body. Female hammerhead sharks can clone perfect replicas of themselves without male sperm.

Sharks have the most sensitive electrical detectors thus far discovered in the natural world – networks of pores and nerves in their heads, known as ampullae of Lorenzini, after Italian scientist Stefano Lorenzini, who first described these features in the 17th century. Sharks use these networks to navigate in the open ocean, using Earth’s magnetic field for orientation.


Snorkelers swim above a whale shark near the Maldives in the Indian Ocean. The largest fish in the sea, whale sharks are filter feeders that prey on plankton. Tchami/Flickr, CC BY-SA

Another intriguing discovery is that some shark species, including makos and blue sharks, segregate by both sex and size. Among these species, cohorts of males and females of different sizes are often found in distinct groups. This finding suggests that some sharks may have social hierarchies, like those seen in some primates and hoofed mammals.

Genetic studies have helped researchers explore questions such as why some sharks have heads shaped like hammers or shovels. They also show that sharks have the lowest mutation rate of any vertebrate animal. This is notable because mutations are the raw material for evolution: The higher the mutation rate, the better a species can adapt to environmental change.

However, sharks have been around for 400 million years and have been through some of the most extreme environmental changes on earth. It’s not known yet how they have persisted so successfully with such a low mutation rate.

Gavin Naylor, director of the Florida Program for Shark Research, describes how DNA analysis provides insights into shark science.


The marquee species

White sharks, the focal species of “Jaws,” attract enormous public interest, although much about them is still unknown. They can live to age 70, and they routinely swim thousands of miles every year. Those in the Western North Atlantic tend to move north-south between Canada and the Gulf of Mexico; white sharks on the U.S. west coast move east-west between California and the Central Pacific.

We now know that juvenile white sharks feed almost exclusively on fishes and stingrays, and don’t start incorporating seals and other marine mammals into their diets until they are the equivalent of teenagers and have grown to about 12 feet long. Most confirmed white shark bites on humans seem to be by animals that are between 12 and 15 feet long. This supports the theory that almost all bites by white sharks on humans are cases of mistaken identity, where humans resemble the seals that sharks prey on.


Still in the water


Although “Jaws” had a widespread cultural impact, it didn’t keep surfers and bathers from enjoying the ocean.

Data from the International Shark Attack File on confirmed unprovoked bites by white sharks from the 1960s to the present day shows a continuous increase, although the number of incidents yearly is quite low. This pattern is consistent with growing numbers of people pursuing recreational activities at the coasts.

Around the world, there have been 363 confirmed, unprovoked bites by white sharks since 1960. Of these, 73 were fatal. The World Health Organization estimates that there are 236,000 deaths yearly due to drowning, which translates to around 15 million drowning deaths over the same time period.

In other words, people are roughly 200,000 times more likely to drown than to die from a white shark bite. Indeed, surfers are more likely to die in a car crash on the way to the beach than they are to be bitten by a shark.
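
That ratio follows directly from the figures above; here is a minimal arithmetic sketch in Python (treating the period as 1960 to 2024, an assumption based on "since 1960"):

```python
# Rough comparison of the risks cited above (illustrative arithmetic only).
years = 2024 - 1960                 # assumed span covered by the bite data
drownings_per_year = 236_000        # WHO estimate cited above
fatal_white_shark_bites = 73        # confirmed fatalities since 1960

total_drownings = drownings_per_year * years
ratio = total_drownings / fatal_white_shark_bites
print(f"~{total_drownings / 1e6:.1f} million drownings vs {fatal_white_shark_bites} fatal bites")
print(f"Drowning is roughly {ratio:,.0f} times more likely")
```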

Gavin Naylor, Director of Florida Program for Shark Research, University of Florida

This article is republished from The Conversation under a Creative Commons license. Read the original article.
Old pipes cause Texas cities to lose tens of billions of gallons of water each year

Juan Salinas II, Texas Tribune
July 8, 2024

Water from the tap is poured into a glass (Shutterstock)

This article was first published by The Texas Tribune, a nonprofit, nonpartisan media organization that informs Texans — and engages with them — about public policy, politics, government and statewide issues. Sign up for The Brief, The Texas Tribune’s daily newsletter.

Texas’ most populous cities lost roughly 88 billion gallons of water last year because of aging water infrastructure and extreme heat, costing them millions of dollars and straining the state’s water supply, according to self-reported water loss audits.

The documents show that bigger municipalities are not immune to water issues often seen in smaller, less-resourced communities around the state. All but one big city saw increased water loss from last year's audits.

While cities are losing water because of inaccurate meters or other data issues, the main factors are leaks and main breaks.

Here's how much each of Texas' biggest cities lost last year, according to their self-reported audits:

Houston: 31.8 billion gallons
San Antonio: 19.5 billion gallons
Dallas: 17.6 billion gallons
Austin: 7.1 billion gallons
Fort Worth: 5.9 billion gallons
El Paso: 4.8 billion gallons

Houston, San Antonio, Dallas, Austin, Fort Worth and El Paso must submit water loss audits to the Texas Water Development Board yearly. Other water agencies must do audits only every five years, unless the city has over 3,300 connections or receives money from the board.

“What we have right now is not sustainable [or] tenable,” said Jennifer Walker, National Wildlife Federation’s Texas Coast and Water Program director.

The cities of Houston and Dallas saw the biggest increase in lost water reported. Houston saw a 30% jump from last year's audit, while Dallas saw an increase of 18%.

Houston is the most populous city in the state, home to roughly 2.3 million Texans; it lost around 31 billion gallons of water last year.

Houston Public Works blames the region's long drought from June 2022 to December last year for the increase. Droughts cause clay in soil to dry up and shrink, stressing older water lines and making them more likely to break and leak. Officials said this, combined with aging infrastructure, led to a significant increase in water leaks across the city.

“HPW will continue to pursue all funding options available to help replace aging infrastructure,” a Houston Public Works spokesperson said.

Aging infrastructure isn’t only a Houston problem. Dallas officials said they only expected a roughly 4% increase in water loss in 2023. They saw a double-digit increase instead.

A Dallas Water Utilities spokesperson said the city is investigating the cause of the increase and “reviewing records to ensure all allowable unbilled/unmetered authorized uses were properly accounted for in the 2023 calculation.”

On the other side of North Texas, Fort Worth saw an increase from 5.6 billion gallons lost in 2022 to 5.9 billion gallons in 2023, costing Cowtown more than $8 million.

Walker, from the National Wildlife Federation, said numbers are also rising because cities are getting more accurate in reporting water loss.

Fort Worth has a “MyH2O program” that replaced all manual-read meters with remote-read meters, and it implemented a Real Water Loss Management Plan in 2020 to focus the city's efforts on leak surveys, leak detection and the creation of district metering areas.

“It is actually a testament to how we are using available data to make better decisions and improve reporting with a higher level of confidence,” said Fort Worth Water Conservation Manager Micah Reed.

Last year, voters passed a proposition that created a new fund specifically for water infrastructure projects that are overseen by the Texas Water Development Board.

The agency now has $1 billion to invest in projects that address various issues, from water loss and quality to acquiring new water sources and addressing Texas’ deteriorating pipes. It’s the largest investment in water infrastructure by state lawmakers since 2013.

Walker calls the $1 billion a “drop in the bucket.”

Texas 2036, an Austin-based think tank, estimates the state will need to spend more than $150 billion on water infrastructure over the next 50 years.

While some of the Texas Water Fund must be focused on projects in rural areas with populations of less than 150,000, Walker said the bigger cities could also receive some funding.

In San Antonio, the San Antonio Water System isn’t “waiting for [the state] to come and tackle the problem for us.”

The city lost around 19 billion gallons of water in 2023 and has seen an increase over the last five years.

“We're in a state that doesn't even fund public education,” said Robert Puente, president and CEO of the San Antonio Water System. “So good luck to us getting some money from the state on these issues.”

Earlier this week, the SAWS board of trustees unanimously approved a new five-year water conservation plan.

The city of Austin lost around 7 billion gallons of water in 2023.

Austin has hired a consultant to review its water loss practices and metrics, according to city officials. The capital city is also in the process of replacing water mains around Austin.

Walker said that while Texas lawmakers should invest more money in water infrastructure, city officials also need to hire more staff and plan better to address water loss.

The one city that lost less water in 2023 was El Paso, which reported losing 475 million fewer gallons last year. Since El Paso is in the desert, water conservation and keeping its infrastructure “watertight” are the city's main focus, said Aide Fuentes, El Paso Wastewater Treatment Manager.

“That makes us a little bit different from the rest of Texas in that sense,” Fuentes said.

El Paso Water officials aim to reduce water loss by 10%.

Walker said the data shows that cities should make the case to state lawmakers to continue addressing water infrastructure in the next legislative session. She added this issue isn’t going away.

“We really need [to] try to live with what we have and not lose the water that we already have in place and make sure that it's reaching its intended destination,” Walker said.

Disclosure: San Antonio Water System and Texas 2036 have been financial supporters of The Texas Tribune, a nonprofit, nonpartisan news organization that is funded in part by donations from members, foundations and corporate sponsors. Financial supporters play no role in the Tribune's journalism. Find a complete list of them here.

The Texas Tribune is a member-supported, nonpartisan newsroom informing and engaging Texans on state politics and policy. Learn more at texastribune.org.
Wildfire smoke linked to thousands of premature deaths every year in California alone

The Conversation
July 9, 2024 

The LNU Lightning Complex was one of the largest wildfires in California's history (AFP)

When wildfires rage, the immediate threat is obvious – but smoke from the fires actually kills far more people than the flames.

As fires become more frequent, that smoke is leading to a public health crisis.

In a new study published in the journal Science Advances, we found that wildfire smoke likely contributed to more than 52,000 premature deaths across California alone from 2008 to 2018, with an economic impact from the deaths of more than US$430 billion.

Previous studies have examined the short-term health risks from wildfire smoke, but few have assessed how exposure to wildfire smoke over years adds up to shorten human lives.



Smoke from far-off wildfires turns the skies orange over San Francisco in 2020. Running in wildfire smoke carries harmful particles deep into a person’s lungs. AP Photo/Tony Avelar

Wildfire risk and severity have grown as the climate has changed and as more people have moved into the edges of wildland areas, increasing the risk that they will start fires. Years of putting out all wildfires have also kept small fires from clearing out the undergrowth, meaning the fires that do escape have more fuel to burn.

As fires are becoming a regular occurrence in all of our lives, it’s important for communities to understand that the health risk from smoke pollution is rising, too.
Health risks

Decades of air pollution research have shown how exposure to fine particulate matter, or PM2.5, can harm a person’s long-term health.

PM2.5 is a mixture of small particles, each only a fraction of the width of a human hair. It comes from a variety of sources, such as vehicle tailpipes and factory emissions, as well as from other sources, including fires. The particles are so tiny, they can travel deep into the lungs and even enter the bloodstream.

Inhaling PM2.5 can cause short-term respiratory health problems in vulnerable populations, such as people with asthma and older adults. It also leads to long-term harm by contributing to the formation of chronic diseases, including atherosclerosis, asthma, decreases in lung function and diabetes. One reason this happens is the body’s inflammatory response to inhaling air pollutants.





Wildfire smoke contains a stew of chemicals. A growing body of research indicates that wildfire PM2.5 may have unique physical and chemical components that increase toxicity. For example, a study from the University of California, Davis, showed that wildfire smoke caused more inflammation in the lungs of rats than PM2.5 from other sources such as cars and industry. This makes the pollutant potentially more harmful to human respiratory health compared with the typical PM2.5 that people are exposed to in urban environments.

The problem with wildfire smoke

We examined the long-term health damages from breathing wildfire smoke. The results, covering more than a decade, suggest that in years with a lot of fire activity – 2017 and 2018, for example – wildfire smoke contributed to more than 10,000 deaths annually in California. For comparison, about 4,000 people die in California from traffic accidents each year.

Even though most fires are in rural, forested areas, smoke can travel hundreds or thousands of miles. So the health effects are widespread through population centers.

For example, California’s Rough Fire of 2015 burned more than 150,000 acres in a mostly rural area of Fresno County, but most mortality impacts in that year were west of the fire in a more populated area of the county and throughout the San Joaquin Valley. In 2018, the year the town of Paradise and several other communities burned, wildfire smoke may have killed as many as 12,000 Californians prematurely.

To conduct this analysis, we looked at annual estimates of wildfire smoke PM2.5 by ZIP code based on wildfire activity. Unlike most past studies, we looked specifically at exposure to PM2.5 from wildfires. This allowed us to account for that potentially increased toxicity of smoke.


Then, we used a dose-response estimate for the relationship between wildfire-specific air pollution and premature deaths. Dose-response estimates are derived from epidemiological studies that relate air pollution levels to survival. For this study, we adjusted an existing dose-response estimate to account for potential increased toxicity from wildfire smoke.
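
For readers unfamiliar with how a dose-response estimate turns pollution levels into a mortality figure, here is a generic sketch of the log-linear health-impact calculation commonly used in air pollution epidemiology. The functional form is standard, but every input below is hypothetical and for illustration only; the study's actual coefficients, exposure estimates and adjustments are described in the paper.

```python
import math

# Generic log-linear health-impact calculation of the kind used in air
# pollution epidemiology. This is a sketch of the general approach, not
# the exact model or coefficients used in the study described above.

def attributable_deaths(population: float,
                        baseline_mortality_rate: float,
                        beta: float,
                        delta_pm25: float) -> float:
    """Premature deaths attributable to an increment in PM2.5 exposure.

    population              number of people exposed
    baseline_mortality_rate annual deaths per person (all causes)
    beta                    concentration-response coefficient per ug/m3,
                            derived from an epidemiological relative risk
    delta_pm25              wildfire-attributable PM2.5 increment (ug/m3)
    """
    attributable_fraction = 1.0 - math.exp(-beta * delta_pm25)
    return population * baseline_mortality_rate * attributable_fraction

# Hypothetical illustration: 1 million people, baseline mortality of
# 0.007 deaths per person per year, beta ~0.0058 per ug/m3 (roughly a
# 6% risk increase per 10 ug/m3), and a 2 ug/m3 annual smoke increment.
print(round(attributable_deaths(1_000_000, 0.007, 0.0058, 2.0)))  # ~81 deaths
```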




How to stay safe

These findings suggest that society needs to invest in coordinated forest management, wildland-urban interface management and climate change mitigation, all of which could yield significant public health benefits through reducing the frequency and severity of wildfires and smoke exposure. We will also need to invest more to adapt to the worsened smoke by providing safe areas in schools and other public buildings.

Here are some tips for limiting how much smoke you inhale on days when the wind is carrying wildfire smoke your way:

Stay indoors: On heavy smoke days, don't go outside, and close your windows. Turn on an air purifier, if you have one. If you don't have one, consider purchasing one with a HEPA filter. The California Air Resources Board has a list of certified air filters. If you smell smoke, or know there is a fire nearby – even if you can't see it – check a resource such as the EPA and Forest Service's AirNow Fire and Smoke Map or the PurpleAir map to get a sense of whether it's safe to be outdoors.

Wear a protective mask: If you do go outside, wear a protective mask, such as an N95 if possible. If you don’t have access to an N95, a surgical mask will help too.

Avoid exercising outdoors on smoky days: Strenuous activities such as exercise draw more air into the lungs – and with it, PM2.5. Exercising also causes your heart and respiratory system to work harder. If you exercise, do so inside in a place with filtered air.

Stay informed: You can follow state and federal air quality agencies in your region on their websites and social media platforms and sign up for alerts when available.

Rachel Connolly, Research Project Director, University of California, Los Angeles and Michael Jerrett, Professor of Environmental Health Sciences, University of California, Los Angeles

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Unregulated online political ads pose a threat to democracy
The Conversation
July 9, 2024 

Social Media (Olivier DOULIERY/AFP)

Think back to the last time you scrolled through your social media feed and encountered a political ad that perfectly aligned with your views – or perhaps one that outraged you. Could you tell if it was from a legitimate campaign, a shadowy political action committee or even a foreign entity? Could you discern who paid for the ad? Chances are you couldn’t.

While television and radio political ads have been subject to strict disclosure requirements for decades, their online counterparts exist in a regulatory vacuum. Social media giants like Facebook, X – formerly Twitter – and Instagram have become central battlegrounds for political campaigns. Yet they operate without the transparency mandated for traditional broadcast media. This allows advertisers to use sophisticated microtargeting to tailor messages to voters, often exploiting detailed personal data.

Welcome to the unregulated Wild West of online political advertising, where transparency is scarce and accountability is lacking. With the 2024 U.S. presidential election in full swing, this digital frontier poses an unprecedented threat to the integrity of American democracy.


The good old days

The McCain-Feingold Act became law over two decades ago. The law, officially known as the Bipartisan Campaign Reform Act of 2002, was designed to curb the influence of money in politics and increase transparency in campaign financing. The landmark legislation, championed by Senators John McCain (R-Ariz.) and Russ Feingold (D-Wis.), includes the regulation of issue advocacy ads on television and radio.

The McCain-Feingold Act mandated disclaimers and included the “Stand by Your Ad” provision, which required candidates to personally endorse their messages in TV and radio ads. Such regulations have proved effective in maintaining a level of accountability and transparency in traditional media.

The media landscape has undergone a dramatic transformation since the bill’s passage, however. As a communications scholar who studies online advertising, I see the lack of similar regulatory measures governing online political advertisements as a glaring absence. This vacuum leaves platforms responsible for providing transparency.

At the same time, Federal Election Commission rules governing disclosure on digital political ads remain murky at best. The lack of clarity makes tracking and analyzing digital political ads a daunting task for researchers, journalists and concerned citizens.


Ad transparency studies



A recent study conducted by open internet advocacy organization Mozilla and Finnish internet research firm Check First reveals significant deficiencies in the ad transparency tools provided by major tech platforms. Ad transparency tools are collections and analysis of ads that the social media companies make publicly available. Researchers, policymakers and advocacy groups use the tools to understand ads and their effects. The deficiencies raise concerns about the potential for manipulation and deception in the lead-up to the presidential election.

The study examined the ad transparency tools of 11 major tech platforms, including X, Apple’s App Store, Google, Meta, TikTok and LinkedIn. The study found that these tools often provide incomplete data, have broken search functions and are difficult to use effectively. Among the tech giants the study evaluated, X emerged as the worst performer, with a dismal record of providing meaningful data for watchdogs and users alike.

Notably, the study focused on the efforts of these platforms to comply with the European Union’s Digital Services Act, which mandates a certain level of ad transparency. The United States, however, has no comparable requirements, leaving voters vulnerable to potential manipulation and disinformation campaigns.

A study of Meta’s ad policy enforcement found that the company missed most of the political ads on Facebook that had failed to properly identify themselves as political.

Recent academic research offers some insights into the potential effectiveness of political ad labeling. One study tested various transparency information disclosures based on enacted regulations, including the EU’s Digital Services Act, and proposed regulations, including the U.S. Honest Ads Act.



The Digital Services Act is a broad set of regulations that requires online platforms to provide real-time information about which posts are ads and who produced and financed them. The U.S. bill aims to require platforms to maintain publicly accessible records of any political ads purchased by a person or group who spends more than $500 on ads in a calendar year. It also seeks to ensure that foreign entities are not purchasing political ads to influence U.S. elections.

The researchers found that transparency measures based on these regulations were most effective in increasing users’ ability to recognize and understand persuasion attempts in advertising. However, the academic study also highlighted significant challenges in implementing ad labeling. Only 30% of participants remembered noticing the transparency information, underscoring the difficulty of making such measures effective in the fast-paced world of social media.

The stakes


The importance of ad transparency was spotlighted by a recent report from AI Forensics, a European nonprofit that investigates influential and opaque algorithms. The report, titled “No Embargo in Sight: Meta Lets Pro-Russia Propaganda Ads Flood the EU,” revealed that a massive network of pro-Russian propaganda targeted voters in France and Germany. It reached 38 million user accounts in just six months. Meta failed to identify and label the vast majority of these ads as political in a timely manner, allowing the disinformation to spread rapidly.

Experts are increasingly concerned about the potential for similar disinformation campaigns to target American voters. With wars raging in multiple global hot spots and platforms like X and Facebook struggling to monitor and report on political ads effectively, the risks of electoral interference and voter manipulation are significant.




Sens. Amy Klobuchar, D-Minn., and Mark Warner, D-Va., introduced the Honest Ads Act in 2017. Tom Williams/CQ Roll Call via Getty Images

Despite the Honest Ads Act’s high-profile bipartisan sponsors and the potential effectiveness suggested by academic research, most analysts predict that partisan gridlock and tech industry lobbying will keep the legislation from being passed before the November 2024 election. This lack of legislative action leaves the U.S. without robust ad transparency tools, making it difficult for the public to identify the sources behind political ads on digital platforms.

Advocates have called on tech platforms to prioritize the development of more robust and user-friendly ad transparency tools ahead of the election. They argue that without meaningful reforms, the integrity of the democratic process is at risk, leaving voters vulnerable to manipulation and deception.


Steven Caplan, Adjunct Instructor of Communications and Marketing, USC Annenberg School for Communication and Journalism

This article is republished from The Conversation under a Creative Commons license. Read the original article.
Rare Florida cactus becomes first U.S. species lost to sea rise

Agence France-Presse
July 9, 2024 

This undated image released by the Florida Museum shows the thick hair and spines blanketing parts of the Key Largo tree cactus, to the extent that an isolated fragment might not look like it came from a cactus at all (Jeff GAGE/AFP)


A rare species of tree cactus has gone extinct in Florida, in what is believed to be the first species lost to sea level rise in the United States, researchers said Tuesday.

The Key Largo tree cactus (Pilosocereus millspaughii) was restricted to a single small population in the Florida Keys, an archipelago off the southern tip of the state. The population was first discovered in 1992 and monitored intermittently since then.

But salt water intrusion caused by rising seas, soil erosion from storms and high tides, and herbivory by mammals placed significant pressure on the last population.

By 2021, a once-thriving group of roughly 150 stems in an isolated mangrove forest had dramatically declined to six fragile, struggling fragments, which researchers relocated for off-site cultivation in a bid to ensure their survival.

"Unfortunately, the Key Largo tree cactus may be a bellwether for how other low-lying coastal plants will respond to climate change," said Jennifer Possley, the director of regional conservation at Fairchild Tropical Botanic Garden.

Possley is the lead author on a study published Tuesday in the Journal of the Botanical Research Institute of Texas documenting the decline of the species.

Key Largo tree cacti continue to grow on a few scattered Caribbean islands, including northern Cuba and parts of the Bahamas.

These plants can reach heights exceeding 20 feet (six meters) and feature cream-colored flowers with a garlic scent that gleam in moonlight, drawing bat pollinators. Their vivid red and purple fruits are also highly attractive to birds and mammals.

Human-caused climate change is leading to water from melting ice sheets and glaciers flowing into the world's oceans.

Additionally, as water in the ocean warms it expands slightly. Both factors contribute to sea level rise.
Trump's far-right army is threatening bloodshed — believe them

Thom Hartmann
July 7, 2024 

A supporter of Republican presidential candidate, former U.S. President Donald Trump cheers ahead of a campaign rally at Sunset Park on June 9, 2024 in Las Vegas, Nevada. (Photo by Brandon Bell/Getty Images)


Kevin Roberts, who heads the Heritage Foundation (largely responsible for Project 2025), just implicitly threatened Americans that if we don’t allow him and his hard-right movement to complete their transformation of America from a democratic republic into an authoritarian state, there will be blood in the streets.

“We’re in the process of taking this country back,” he told a TV audience, adding:

“The reason that they are apoplectic right now, the reason that so many anchors on MSNBC, for example, are losing their minds daily is because our side is winning. And so I come full circle on this response and just want to encourage you with some substance that we are in the process of the second American Revolution, which will remain bloodless if the left allows it to be.”

He’s not wrong. America has been changed by a series of corrupt rulings from Republicans on the Supreme Court (exclusively; not one of these rulings has been joined by a Democratic appointee) that have reshaped America’s legal and political systems themselves.

As Roberts notes, this is really the largest issue we all face, and our mainstream media are totally failing to either recognize or clearly articulate how radically different our country is now, how far the Republicans on the Court have dragged us away from both our Founders’ vision and the norms and standards of a functioning, modern democratic republic.


First, in a series of decisions — the first written by that notorious corporatist Lewis Powell (of “Powell Memo” fame) — Republicans on the Court have functionally legalized bribery of politicians and judges by both the morbidly rich and massive corporations.

This started with Powell’s 1978 Bellotti opinion, which opened the door (already cracked a bit) to the idea that corporations are not only “persons” under the Constitution, but, more radically, are entitled to the human rights the Framers wrote into the Bill of Rights (the first ten amendments).

Using that rationale, Powell asserted that corporations, like rich people (from the Buckley decision that preceded Bellotti by two years), are entitled to the First Amendment right of free speech. But he took it a radical step farther, ruling that because corporations don’t have mouths they can use to speak with, their use of money to support politicians or carpet-bomb advertising for a candidate or issue is free speech that can’t be tightly regulated.

Citizens United, another all-Republican decision with Clarence Thomas the deciding vote (after taking millions in bribes), expanded that doctrine for both corporations and rich people, creating new “dark money” systems that wealthy donors and companies can use to hide their involvement in their efforts to get the political/legal/legislative outcomes they seek.

Last week the Republicans on the Court took even that a huge step farther, declaring that when companies or wealthy people give money to politicians in exchange for contracts, legislation, or other favors, as long as the cash is paid out after the deed is done it’s not a bribe but a simple “gratuity.”

So, first off, they’ve overthrown over 240 years of American law and legalized bribery.

Last week they also gutted the ability of federal regulatory agencies to protect average people, voters, employees, and even the environment from corporations that seek to exploit, pollute, or even engage in wage theft. This shifted power across the economic spectrum from a government elected by we the people to the CEOs and boards of directors of some of America’s most predatory and poisonous companies.

Finally, in the Trump immunity case, the Court ruled that presidents are immune from prosecution under criminal law, regardless of the crimes they commit, so long as they assert those crimes are done as part of their “official” responsibilities. And who decides what’s “official”? The six Republicans on the Supreme Court.

These actions — corporate personhood, money as speech, ending the Chevron deference to regulatory agencies, and giving the president life-and-death powers that historically have only been held by kings, shahs, mullahs, dictators, and popes — have fundamentally altered the nature of our nation.

It’s almost impossible to overstate the significance of this, or its consequences. We no longer live in America 1.0; this is a new America, one more closely resembling the old Confederacy, where wealthy families and giant companies make the rules, enforce the rules, and punish those who irritate or try to obstruct them.

In America 2.0, there is no right to vote; governors and secretaries of state can take away your vote without even telling you (although they still must go to court to take away your gun).

They can destroy any politician they choose by simply pouring enough cash into the campaign system (including dark, untraceable cash).

The president can now go much farther than Bush’s torturing and imprisoning innocent people in Gitmo without legal process: he can now shoot a person on Fifth Avenue in plain sight of the world and simply call it a necessary part of his job. Or impoverish or imprison you or me with the thinnest of legal “official” rationales.

America 2.0 is not a democracy; it’s an oligarchy, as I wrote about in The Hidden History of American Oligarchy. The South has finally — nearly — won the Civil War.

While it will be months or more likely years before all of these new powers the Republicans on the Court have given the president, rich people, and corporations begin to dawn on most Americans, they will, step by step, transform this country into something more closely resembling Hungary or Russia than the democracies of Europe and Southeast Asia.

The only remedy at this late stage in this 50+ yearlong campaign to remake America is a massive revolt this fall at the ballot box, turning Congress — by huge majorities — over to Democrats while holding the White House.

If we fail at this, while there will be scattered pockets of resistance for years, it’ll be nearly impossible to reverse the course that America’s rightwing billionaires have set us on.

There has never been a more critical time in the history of our nation outside of the last time rich oligarchs tried to overthrow our democracy, the Civil War. Like then, the stakes are nothing less than the survival of a nation of, by, and for we the people.
