December 16, 2025

Human rights advocates contend the ability to easily share Ring doorbell and security camera video with police has exacerbated racial profiling - Copyright GETTY IMAGES NORTH AMERICA/AFP JUSTIN SULLIVAN
Smart AI security cameras can become a spying tool for hackers. Many devices also collect unnecessary amounts of data, undermining privacy and handing attackers an inroad.
A new study from Surfshark reveals that modern AI security cameras do more than just watch. By standardizing facial recognition, these devices have turned home security into a source of rich biometric data covering not just camera owners but their neighbors as well.
Additionally, the apps required to operate these cameras are collecting personal information that may not be related to security functionality.
“The central risk isn’t only the capture. When people can’t meaningfully opt in or out and are not informed about where their biometric data is stored, what additional data points are being collected, and with whom it’s shared, you’ve created a privacy hazard. Scanning faces or car plates of neighbours – especially without explicit consent – should be treated as a major privacy concern if not a breach of privacy regulations,” Miguel Fornes, cybersecurity expert at Surfshark, has told Digital Journal.
Is facial recognition necessary?
Beyond video capture and alerts, many popular devices now include smart facial recognition and vehicle detection, raising the stakes of biometric and metadata exposure for camera owners and their neighbors alike.
Despite claims of enhanced safety, most camera companion apps gather additional user data unrelated to core camera functionality.
Fornes outlines the problem: “Imagine that suddenly the most personal and intimate part of your life – your home – is available to unknown individuals, with no known security controls to protect the recordings.”
Drawing on an example, Fornes cites Amazon Ring, which was recently criticized by privacy watchdogs over its “Familiar Faces” feature, designed to identify people captured on camera. The case raised concerns about consent and the handling of biometric data.

Too many AI features?
As these AI features become more prevalent, particularly facial recognition, manufacturers must navigate increasingly complex global privacy regulations. Facial recognition is strictly regulated in the EU and UK, with high privacy standards enforced under the GDPR and its UK equivalent. By contrast, frameworks in the US, Canada, and Australia are less comprehensive and vary by jurisdiction.
According to Fornes, even when the law allows it, people often overlook the fact that to enable these smart features, AI cameras need to constantly ship data back to the manufacturer’s servers: “It’s not just recordings or snapshots. Camera companion apps may be siphoning additional data points – including location, device IDs, contact information, usage patterns, and even biometrics – creating a parallel surveillance stream that can amplify the damage of any breach.”
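To make that parallel stream concrete, here is a minimal, purely hypothetical sketch of the kind of telemetry bundle a companion app could assemble alongside an ordinary clip upload. Every field name, value, and function below is invented for illustration; real vendors’ schemas and endpoints differ and are rarely disclosed in full.

```python
# Hypothetical sketch only: illustrates the *kinds* of non-video data points
# Surfshark's study flags. All field names and values are invented.
import json
import uuid
from datetime import datetime, timezone

def build_telemetry_payload(event_clip_id: str) -> str:
    """Bundle the non-video data points that ride along with a clip upload."""
    payload = {
        "device_id": str(uuid.uuid4()),                 # persistent hardware identifier
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "clip_ref": event_clip_id,                      # link back to the recording
        "location": {"lat": 51.5074, "lon": -0.1278},   # where the camera lives
        "owner_contact": "user@example.com",            # account contact information
        "usage": {"app_opens_today": 14, "live_views": 3},  # usage patterns
        "faces_matched": ["profile_7f3a"],              # reference to a biometric match
    }
    return json.dumps(payload)

# Each field is personal data in its own right; bundled together and shipped
# to a vendor's servers, they form the parallel surveillance stream Fornes
# describes, and a single breach exposes all of it at once.
print(build_telemetry_payload("clip_2025-12-16_0831"))
```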
Fornes adds: “Once a smart camera has a known vulnerability, bad actors can silently take control – turning it into a live feed that spies on you and even your neighbors. For such attackers, highly sophisticated skills are not even necessary; there are numerous readily available platforms that literally list compromised webcams, allowing anyone to break in.”
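For owners who want to gauge their own exposure, the sketch below probes a camera for commonly exposed services. The address is a placeholder for a device on your own network; scanning hardware you don’t own may be unlawful.

```python
# Minimal sketch: audit a camera *you own* for commonly exposed services.
# CAMERA_IP is a placeholder for a device on your own LAN.
import socket

CAMERA_IP = "192.168.1.50"  # placeholder: your camera's address
COMMON_PORTS = {
    23: "telnet (often ships with factory default credentials)",
    80: "http (web admin interface)",
    554: "rtsp (raw video stream)",
    8554: "rtsp-alt",
}

for port, service in COMMON_PORTS.items():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(1.0)
        is_open = sock.connect_ex((CAMERA_IP, port)) == 0
    print(f"{CAMERA_IP}:{port:<5} {'OPEN' if is_open else 'closed':<7} {service}")

# Any OPEN service reachable beyond your LAN, or guarded only by a factory
# default password, is the kind of foothold such platforms catalogue.
```

Keeping firmware current and replacing factory passwords closes the most common of these doors.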
Information gathered by AI camera makers
Among the eight leading brands analyzed in Surfshark’s study, six offer AI-powered facial recognition, seven provide smart vehicle detection, and all of them feature person detection and intelligent alerts. Despite claims of enhanced safety, most popular models require companion apps for setup, notifications, and cloud storage – apps that often collect additional information unrelated to core camera functionality.
The disclosed advertising-related data practices varied widely among the analyzed apps. Arlo stands out by collecting and sharing device IDs specifically for third-party advertising and by gathering more data types for developer advertising than any competitor, with five data types collected. Vivint and Google Nest each collect four data types for developer advertising, SimpliSafe collects three, and Amazon Ring and ADT each collect one.
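For readers who want those disclosed counts at a glance, here they are restated in a compact form. The numbers come straight from the Surfshark figures above; only the presentation is new.

```python
# Developer-advertising data-type counts as reported in Surfshark's study,
# restated from the paragraph above.
AD_DATA_TYPES = {
    "Arlo": 5,
    "Vivint": 4,
    "Google Nest": 4,
    "SimpliSafe": 3,
    "Amazon Ring": 1,
    "ADT": 1,
}

for brand, count in sorted(AD_DATA_TYPES.items(), key=lambda kv: -kv[1]):
    print(f"{brand:<12} {'#' * count} ({count} data types)")
```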

