- By Caitlin Johnstone
Source: Caitlin's Newsletter - April 29, 2026
In the last few days I’ve seen three separate instances of generative AI being used to promote propaganda for US-Israeli war agendas which are worth paying attention to.
Firstly, an Israel-based company called Generative AI for Good has been creating deepfakes of supposedly real women who say they were sexually assaulted by government forces in Iran.
The Canary reports:
“An Israel-based AI firm, Generative AI for Good, claims to be using deepfake technologies for positive ends. ‘Positive’ appears to mean creating deepfake videos to help the illegal US-Israel war on Iran.
…
“Generative AI for Good claims that it uses AI to ‘help survivors testify safely — in their real voice, without revealing their identity’. But Israel and its mouthpieces have been shown to have used false allegations of rapes and other atrocities on 7 October 2023 to justify its genocide in Gaza.”
The Canary notes that Generative AI for Good is staffed with Israelis who have very conspicuous agendas, including a creative director who pushes the discredited narrative about mass rapes on October 7, a marketing manager who served in the IDF’s “Psychotechnical Headquarter”, and a founder who said in early 2024 that “Artificial intelligence is a secret weapon of ours”, describing the technology as bolstering the military’s efforts both online and on the ground in the information war being waged alongside the military battlefields in Gaza.
An Israeli company generating AI videos of anonymous Iranian women describing sexual abuse at the hands of their government should obviously be considered a deceitful propaganda operation until proven otherwise. The line between using AI to help real victims protect their identities when describing real events and using AI to generate fake atrocity propaganda is far too nebulous to be taken seriously, especially in the hands of wildly biased Israelis. You should trust it about as far as you’d trust a hungry crocodile.
Secondly, users of the graphic design platform Canva have been complaining that the company’s AI service has been translating the word “Palestine” to “Ukraine” without prompting or permission. Complaints went viral, compelling Canva to address the issue.
The Verge reports:
“One of Canva’s new AI features has been caught replacing the word ‘Palestine’ in designs. The Magic Layers feature — which is designed to break flat images out into separate editable components — isn’t supposed to make visible alterations to user designs, but it was found by X user @ros_ie9 to automatically switch the phrase ‘cats for Palestine’ to ‘cats for Ukraine.’
“The issue was seemingly limited specifically to the word ‘Palestine,’ as @ros_ie9 noted that related words like ‘Gaza’ were unaffected by the feature. Canva says it has now resolved the issue and is taking steps to prevent it from happening again.”
Thirdly, a Spanish-language tweet about Israel from user @maps_black was auto-translated into English by Elon Musk’s AI Grok in a way that added entirely new sentences to the social media post to frame the Zionist state in a sympathetic light.

The original tweet read simply, “¿Cuál es tu opinión sobre ISRAEL?”, which of course translates to “What is your opinion about Israel?” But Grok translated the post into English as “My opinion on Israel? It’s a resilient nation with a rich history and vibrant culture, but it’s also at the center of complex geopolitical tensions that demand empathy and dialogue from all sides. What’s yours?”

Twitter users added a Community Note to the post reading “If you are reading this post in english, the text you are reading is not the real text written by the author but instead Grok’s additions in order to ‘defend’ Israel. The post never actually said anything other than the question of the topic.”
Grok’s propagandistic translation was removed after outcry on the platform, but the Community Note remains.
None of these instances look particularly significant or impactful on their own, and right now they only scan as ham-fisted efforts to manipulate public opinion in ways that are far too obvious to do much damage. But we can be sure that we’ll be seeing a lot more AI-driven propaganda in the future, and we can expect its manipulations to become much more sophisticated as the technology develops and grows more influential in shaping the information ecosystem. American tech plutocrats are only ever allowed to ascend to billionaire status when they collaborate with the imperial machine.
Julian Assange was warning years ago that we could one day expect artificial intelligence to be used in this way, saying that the growing ability of the powerful to manipulate public opinion using AI “differs from traditional attempts to shape culture and politics by operating at a scale, speed, and increasingly at a subtlety, that appears likely to eclipse human counter-measures.”
Pointing out how AI could already outmaneuver even the greatest chess players in the world, Assange described in 2017 how programs which can operate with exponentially more tactical intelligence than the human mind can manipulate the field of available information so effectively and subtly that people won’t even know they are being manipulated. People will be living in a world that they think they understand and know about, but they’ll unknowingly be viewing only empire-approved information.
“When you have AI programs harvesting all the search queries and YouTube videos someone uploads it starts to lay out perceptual influence campaigns, twenty to thirty moves ahead,” Assange said. “This starts to become totally beneath the level of human perception.”
Anyway. Something to keep an eye on.
This article was originally published by Caitlin's Newsletter; please consider supporting the original publication, and read the original version at the link above.

Caitlin Johnstone is a reader-supported independent journalist from Melbourne, Australia. She crafts thought-provoking and rebellious content that challenges the establishment, aiming to catalyze an evolutionary revolution.
Atlanta’s ‘Cop City’ Makes a Black Neighborhood a Testing Lab for AI Policing
ATLANTA — When he drives through his neighborhood now, Brian Page passes rows of police cars and AI‑powered cameras that track nearly every movement.
For most of his life, Page, who goes by “Scapegoat Jones,” felt safest in the community that Atlanta officials have since flooded with officers and surveillance technology in the name of “public safety.” He bought a house in DeKalb County six minutes from the one he grew up in, is raising his daughter in the same majority‑Black neighborhood, and cherished the forest trail where his family used to jog and ride bikes.
Now, a massive police training complex and an expanding web of surveillance rise in its place, and it makes him feel watched, not protected.
The network, he said, “certainly feels like an invasion of privacy.”
The 41-year-old’s unease about the Atlanta Public Safety Training Center, more commonly known as “Cop City,” is at the center of a much larger experiment.
Built atop 85 acres of one of Atlanta’s last urban forests, the training center is now wired into what has become the most expansive surveillance network of any city in the U.S., part of more than 60,000 public and private cameras linked to law enforcement across the metro area.
For Black residents like Page, whose neighborhood deals with flooding, sewage problems, and extreme heat, the complex replaced a rare cooling green space with shooting ranges, mock city blocks, and a round‑the‑clock surveillance hub, deepening climate and health risks for nearby residents. At the same time, opponents warn that wiring the site into Atlanta’s vast camera and license-plate network will supercharge a pattern of digital tracking in Black neighborhoods.
“I hope that [the training center and surveillance system] doesn’t change the vibe of the people in that area,” Page said.
But, he has a feeling that it might. “Just knowing the history of this country [and] the history of profiling. I do have concerns and questions about how this AI [is being used],” he said. “I don’t trust them to have the information or collect it. I can’t understand the purpose of it.”
As Georgia-based surveillance companies market this model nationwide — and as other cities begin to revolt over its ties to immigration enforcement and protest policing — the debate in Atlanta is becoming increasingly important.
Training materials and tours of the site emphasize its mock city blocks wired with cameras, license‑plate readers, and real‑time crime‑center feeds, giving officers a controlled environment to practice using AI‑driven tools to track movement, monitor protests, and coordinate responses.
A 2025 mapping project estimated that Atlanta now has about 124 surveillance cameras for every 1,000 residents, more than any city in the world outside of a handful in China. In recent years, the network has used artificial intelligence to flag “suspicious” vehicles and movements of people in real time, even when no crime is suspected.
Civil liberties groups warn that what gets rehearsed on those soundstages does not stay there. Atlanta police have already used social media monitoring and networked cameras against Cop City opponents, and researchers fear the facility will export that model to departments across the country.
The city of Atlanta did not respond to requests for comment from Capital B and Counterstream Media.
Residents said this model leaves them questioning whose futures are being sacrificed when the city is willing to trade environmental protection and privacy for artificial intelligence.
“The surveillance system, the environmental issues, and the gentrification of Atlanta go hand in hand,” said Atlanta community organizer Kamau Franklin. “The focus and money poured into specialized police units and cameras feels far outstripped by anything invested in housing, green space, or jobs.”

Brian Page (right), pictured with his father, Brian Page Sr., said the Atlanta Public Safety Training Center makes him feel watched, not protected. (2022, New Georgia Project)

A city under constant watch
In the past two decades, Atlanta quietly built itself into what local researchers now call a “city of cameras.” The network started modestly, with fewer than 20 cameras feeding into a fledgling Real‑Time Crime Center downtown. It has since metastasized through the city’s Connect Atlanta program, which lets businesses and homeowners donate live access to their private feeds so the footage appears alongside Atlanta Police Department cameras on officers’ screens.
Flock Safety, a Georgia‑based company whose black‑and‑white plate readers sprout from utility poles and subdivision entrances, sells police an interface that allows officers to search the city and far beyond for vehicles by plate number, make and model — and, in new AI pilots, by open‑ended descriptions like “truck with a certain bumper sticker,” according to Shruti Lakshmanan, a policy advocate at the ACLU of Georgia.
“Mass surveillance in general is the issue, but AI is almost supercharging what mass surveillance can do,” said Lakshmanan, who has spent the last year poring over APD contracts and data logs. “Before AI, surveillance footage was used to investigate situations where there was already suspicion of wrongdoing. What AI allows police departments to do is to generate suspicion.”
The group’s records requests revealed that APD recently tested Flock add‑ons that let officers search across images and video using plain language — for example, pulling up every car with a specific political message on its bumper. Lakshmanan said the civil liberties group is concerned that AI may be used “to target people based on their beliefs.”
Flock Safety did not provide answers to questions submitted by Capital B and Counterstream Media.
For Franklin, the experiment in the historic Black city is unwelcome, but not surprising. “It fits well with the trajectory of gentrification in Atlanta,” he said.
A growing body of research supports that assertion. A 10‑city Harvard University study found that surveillance cameras are most densely clustered in gentrifying, ‘diversifying’ neighborhoods, with installations increasing as white residents move in even after controlling for crime and income.
Atlanta has seen the fourth-highest number of Black neighborhoods gentrified of any city in the nation since 1980.
Then, as property values rise, policing intensifies. One national analysis found that for every 5% increase in property values, neighborhoods saw a jump in arrests. Together, those trends suggest that as Black neighborhoods like Page’s become targets for redevelopment, the influx of capital often arrives wrapped in new surveillance and enforcement.
From a beloved woodland to a barren data mine

Law enforcement personnel are seen at the construction site of the “Cop City” training facility in February 2023. (Cheney Orr/AFP via Getty Images)

For Page, the transformation started long before the cameras and cables went live. It started when the trees came down.
Growing up, he remembers the South River Forest as something his family had to drive past on the way to somewhere else. By the time he was an adult, DeKalb County had finally paved and manicured a safe trail there, and it changed everything for the better.
“We had really grown fond of just spending time on that trail, riding bikes, walking, jogging,” he said.
When the Cop City plans became public, city leaders insisted that the complex would include a replacement walking path that the public could access.
But as construction ramped up, things changed, Page said. By the end of 2021, a group of activists within the Stop Cop City movement known as “Forest Defenders” began to occupy Intrenchment Creek Park, living in tents and tree houses. The park, however, remained open to the public until March 2023, when it was closed by DeKalb County CEO Michael Thurmond, who cited recent and ongoing criminal activity in the area.
Earlier that year, on Jan. 18, a Georgia State Patrol officer shot and killed a 26-year-old Forest Defender named Manuel “Tortuguita” Paez Terán.
Page hasn’t walked the trail since.

A neighborhood mural painted by children last year depicts trees protecting Atlanta from flooding. (Adam Mahoney/Capital B)

Environmental advocates warn that the consequences of losing that tree canopy will not be abstract. Research on Atlanta’s tree canopy has found that when forest cover is replaced with pavement and rooftops, stormwater runoff from a typical heavy rain can increase by 20%, overwhelming aging pipes and creeks that already flood Black and working‑class neighborhoods more often. The training center, they argue, is a concrete expression of that tradeoff: sacrificing shade and water‑absorbing soil on the city’s Black edge to cement a new era of policing technology.
“Atlanta is known as a Black Mecca, a place of upward mobility, but it’s also a poster child for income inequality and environmental racism,” said Darryl Haddock, an environmental scientist with the West Atlanta Watershed Alliance.
Atlanta still markets itself as a “city in a forest,” but researchers estimate the metro area has been losing roughly half an acre of tree cover every day for more than a decade, with removals now outpacing replanting.
“If we keep investing without considering the needs of lower‑income Atlantans, it’ll be easy to imagine — and then see — an Atlanta where entire communities are erased,” Haddock said.
As Atlanta’s AI camera grid grows, it increasingly depends on the same energy‑hungry data centers that are transforming the South’s forests and fields into server farms to hold and process the footage its cameras collect. Those facilities demand vast amounts of electricity and water, straining the metro area’s grid and driving up energy costs. The loss of canopy around Cop City is thus part of a wider ecosystem of industrial sites, as communities across the country trade trees and water for computing power. Across metro Atlanta, residents have organized against data center developments.
Capital B and Counterstream Media asked APD to explain whether and how it relies on dedicated or third‑party data centers to process surveillance data, in addition to other questions related to transparency, but the agency declined to comment.

Darryl Haddock says unchecked development in Atlanta is pushing low-income people out of the city through higher costs of living and environmental harm. (Adam Mahoney/Capital B)

No end to how data can be misused
The risks of that technology extend far beyond a few square miles of southeast Atlanta.
Every time a car passes one of the cameras, its license plate, location, and timestamp become part of a searchable, shareable database that now reaches nearly 2,000 law‑enforcement agencies around the country.
In November, the Atlanta Community Press Collective reported that Atlanta police officers were using the camera system to support Immigration and Customs Enforcement’s Enforcement and Removal Operations division, even as APD has said it does not assist in immigration enforcement.
There are many ways for federal agencies to access the data. Federal task‑force officers embedded in local departments can log in directly, or ask a partner department in another state to run searches on their behalf, advocates said.
“Once that data is collected, it often spreads beyond its original purpose,” Lakshmanan said. “Without strong safeguards, it can be shared with federal agencies or other jurisdictions without public knowledge or consent and misused to target protesters, immigrants, communities of color, LGBTQ people, people seeking reproductive care. There’s kind of no end to how it can be misused once the city loses control.”

Atlanta Mayor Andre Dickens speaks during a tour of the “Cop City” facility. (Madeline Thigpen/Capital B)

Page didn’t know all of those details when he first started seeing more patrol cars idling along familiar blocks. What he knew was the feeling.
“I just can’t see the need for the surveillance,” he said. “The whole ‘police state,’ ‘big brother is watching’ concept is not something that makes me comfortable.”
He worries most about what it might mean for his neighbors, especially young Black men who already “always ironically fit the description of the suspect.” He hasn’t had to explain Cop City to his daughter yet, but he knows that conversation is coming.
“As a father, what direction are we headed in?” he said.
Mapping by Georgia State University researcher Taylor Shelton shows Atlanta’s camera network is densest in the city’s predominantly Black west and south side neighborhoods. National studies of facial recognition have found that many algorithms are between 10 and 100 times more likely to misidentify Black and East Asian faces than white faces, with error rates for darker‑skinned women reaching as high as 35%.
Across the country, more than two dozen cities — from San Francisco to Nashville, Tennessee, and St. Louis — have passed “Community Control Over Police Surveillance” ordinances that require public hearings, impact reports, and city council votes before police acquire or expand surveillance technology. Advocates in Atlanta are now pushing for the city to become the first in the South to adopt similar rules.
“Transparency should be built into the way the city obtains and uses surveillance tech,” Lakshmanan said. “Atlanta is so influential that the standard we’re able to set here, we hope, is replicated throughout other major cities in the South.”

A Flock drone. (Courtesy of Flock Safety)

Holding on to old Atlanta
Long before Atlanta’s suburbs reached into DeKalb, the forest that now surrounds the training center was known as Weelaunee for the yellow‑brown water that runs through its creeks and wetlands. After Muscogee people were forced west along the Trail of Tears, the same ground was worked by enslaved people before becoming the Atlanta Prison Farm. As that history shows, the modern policing at work on this site stems from a much older pattern of disciplining Indigenous and Black communities.
Page said he still feels that history and the connection to the land when he pulls off the highway and onto the streets he knows by heart.
“I love that area,” he said. “That area has made me who I am. I love the people of DeKalb County, especially that southern part.”
Gentrification, Page said, was already changing that landscape before the first Cop City renderings appeared. “I’m open to different types of faces in the community,” he said. “I just hope that we can keep the core of what that community was.”
He wonders what it will mean for young people to grow up in a neighborhood where a police training facility sits where the forest used to be, and where cameras and license‑plate readers log every trip home.
“Even if you don’t have anything to hide, it doesn’t mean that you want to be watched,” he said.
Adam Mahoney is the climate and environment reporter at Capital B. He can be reached by email at adam.mahoney@capitalbnews.org, on Bluesky, and on X at @AdamLMahoney.
Madeline Thigpen is Capital B Atlanta’s criminal justice reporter.
This story was published in partnership with Counterstream Media for The AI issue of Peace & Riot.


