
Majority of popular websites in U.S. and Europe are not compliant with privacy regulations


By Dr. Tim Sandle
January 21, 2025
DIGITAL JOURNAL

The Internet Corporation for Assigned Names and Numbers (ICANN), the nonprofit group that minds the internet's infrastructure, is worried about chatter at the United Nations about giving more control of the world wide web to individual governments - Copyright AFP Mark RALSTON

Privacy startup Privado has issued a report titled ‘State of Website Privacy Report’. The report finds widespread non-compliance with privacy regulations and identifies the controls needed to avoid privacy fines.

The headline finding is that 75 percent of the most-visited websites in the U.S. and Europe are not compliant with privacy regulations.

Despite stricter privacy enforcement in Europe, Privado found that a surprising 74 percent of top websites in Europe do not honour opt-in consent as required by Europe’s General Data Protection Regulation (GDPR).

Although top websites in the U.S. had a similar non-compliance rate of 76 percent for not honouring opt-out consent as required by the California Privacy Rights Act (CPRA), Privado found the median volume of compliance risks to be three times higher in the U.S.

The State of Website Privacy Report was based on data from Privado’s consent monitoring solution. Six of the 20 largest GDPR fines since 2018 are due to consent compliance violations on websites, with Amazon receiving the second-largest GDPR fine to date, $888M, for targeting users with ads without proper consent in 2021.

In the US, at least 10 companies since 2022 have been fined for violating consent compliance on websites as regulated by CPRA, the FTC (Federal Trade Commission), or HIPAA (Health Insurance Portability and Accountability Act).

With fines mounting and consumers demanding greater privacy, personal data sharing from websites has become a major legal risk for companies worldwide.

Most websites do not honour consent as required by privacy regulations in the US and Europe

To comply with the CPRA amendment to the CCPA (California Consumer Privacy Act), websites in the US must block personal data sharing with advertising third parties if the user opts out of data sharing. To comply with the GDPR, websites in Europe must block personal data collection and sharing with third parties unless the user provides opt-in consent. Despite increasing privacy fines, most websites in both regions are not honouring these consent requirements.
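To make the two consent models concrete, here is a minimal sketch (not drawn from Privado's report) of how a site might gate its advertising tags: under the GDPR model nothing loads until the user opts in, while under the CPRA model tags load unless the user has opted out. The tag URLs, the ConsentState shape, and the regime flag are illustrative assumptions.

```typescript
// Minimal sketch of consent-gated tag loading. The consent values and tag
// list are hypothetical; real sites would read consent from their CMP's API.

type Regime = "GDPR" | "CPRA";

interface ConsentState {
  regime: Regime;
  optedIn: boolean;   // explicit opt-in recorded (GDPR model)
  optedOut: boolean;  // "Do Not Sell/Share" signal recorded (CPRA model)
}

// Hypothetical advertising tags that would share personal data if loaded.
const adTags = [
  "https://ads.example-network.com/pixel.js",
  "https://tracker.example-dsp.com/tag.js",
];

function mayShareWithAdvertisers(consent: ConsentState): boolean {
  // GDPR: block unless the user has opted in.
  // CPRA: allow unless the user has opted out.
  return consent.regime === "GDPR" ? consent.optedIn : !consent.optedOut;
}

function loadAdTags(consent: ConsentState): void {
  if (!mayShareWithAdvertisers(consent)) return; // nothing loads, nothing is shared
  for (const src of adTags) {
    const script = document.createElement("script");
    script.src = src;
    script.async = true;
    document.head.appendChild(script);
  }
}

// Example: a European visitor who never clicked "Accept" loads no ad tags.
loadAdTags({ regime: "GDPR", optedIn: false, optedOut: false });
```

The point of the gate is its placement: once a third-party tag executes, the site can no longer control what data it shares.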

Non-compliant websites in the US average three times more compliance risks than those in Europe

Privacy teams typically lack the visibility and controls to track which third parties are integrated with their websites and whether those third parties are honouring consent requirements. With teams using so many third parties to optimize marketing and website performance, privacy teams need comprehensive solutions to continuously monitor consent and data flows.

Top websites in the US and Europe typically share data with over 20 third parties
(Chart: median number of third parties integrated with top websites)

Consent management platforms alone do not ensure consent compliance

Consent management platforms (CMPs) are effective at managing the complexity of implementing consent banners and data flows across websites, but CMPs cannot sufficiently monitor and validate consent compliance on their own. Privacy teams need continuous website monitoring solutions to mitigate privacy risk at scale. Such solutions should provide a real-time view of which third parties are integrated with their websites, which data elements are being sent to which third parties, and whether consent banners are functioning correctly.
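As a rough sketch of what such monitoring might look like in the browser (not a description of Privado's product), the snippet below watches a page's outgoing resource requests and flags any third-party host contacted before consent is recorded. The first-party host list and the consentGranted() check are assumptions; a production tool would also report which data elements each request carries.

```typescript
// Client-side sketch: flag third-party requests that fire before consent.

const firstPartyHosts = new Set(["www.example.com", "cdn.example.com"]); // assumed

function consentGranted(): boolean {
  // Placeholder: in practice this would query the site's CMP.
  return document.cookie.includes("consent=granted");
}

const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    const host = new URL(entry.name).hostname;
    if (!firstPartyHosts.has(host) && !consentGranted()) {
      // A real deployment would report this to a monitoring backend.
      console.warn(`Third-party request before consent: ${host}`);
    }
  }
});

// "resource" entries cover scripts, pixels, XHR/fetch and other sub-requests.
observer.observe({ type: "resource", buffered: true });
```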

Privacy code scanning and consent management platforms together can ensure privacy compliance

Privacy code scanning should be used in conjunction with a consent management platform to implement best-in-class digital tracking governance for websites and mobile apps.

Consent management platforms are important for collecting, acting on, and recording consent, but they lack the full visibility and governance to ensure personal data doesn’t improperly leak to advertising third parties.
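As a toy illustration of the privacy code scanning idea (not any vendor's implementation), the sketch below walks a website's source tree and flags files that reference known advertising domains, so those data flows can be reviewed before release. The domain list and the ./src directory are assumptions; real scanners also trace which personal data fields reach each third party.

```typescript
// Toy static scan: find source files that reference known advertising domains.
import { readdirSync, readFileSync, statSync } from "node:fs";
import { join, extname } from "node:path";

const knownAdDomains = ["doubleclick.net", "facebook.net", "ads-twitter.com"]; // example list
const sourceExtensions = new Set([".html", ".js", ".ts", ".tsx"]);

// Recursively yield source files under a directory.
function* walk(dir: string): Generator<string> {
  for (const name of readdirSync(dir)) {
    const path = join(dir, name);
    if (statSync(path).isDirectory()) yield* walk(path);
    else if (sourceExtensions.has(extname(path))) yield path;
  }
}

for (const file of walk("./src")) { // assumed source directory
  const text = readFileSync(file, "utf8");
  for (const domain of knownAdDomains) {
    if (text.includes(domain)) {
      console.log(`${file}: references ${domain}; confirm it is gated by consent`);
    }
  }
}
```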



Dr. Tim Sandle is Digital Journal's Editor-at-Large for science news. Tim specializes in science, technology, environmental, business, and health journalism. He is additionally a practising microbiologist and an author. He is also interested in history, politics and current affairs.


Business

Balancing personalization and privacy: A Canadian perspective


By Jennifer Kervin
January 23, 2025
DIGITAL JOURNAL

Image generated by OpenAI's DALL-E via ChatGPT

Eighty per cent of consumers expect personalization online, but two-thirds report having such poor experiences that they’ve disengaged from brands altogether.

The motivation to get it right is abundantly clear, but the disconnect highlights a growing challenge for businesses: how do you deliver the tailored experiences customers crave without crossing the line into invasive or unethical data practices?

Finding this balance has never been more urgent for organizations all over the world, including Canada. The tightening of privacy laws, coupled with rising consumer expectations, means that companies must approach personalization with care and transparency. The stakes are high — getting it wrong risks not only alienating customers, but also triggering costly compliance penalties — but so are the rewards for those who can get it right.
Privacy regulations reshaping the Canadian landscape

Canada’s approach to privacy is evolving in response to growing concerns over how personal data is collected and used.

An Interac survey from January 2024 found that:
77% of Canadians said their personal data is more exposed than ever.
72% are concerned that companies are able to see too much of their data.

The Consumer Privacy Protection Act (CPPA) was initially introduced as a much-needed and more stringent replacement for the Personal Information Protection and Electronic Documents Act (PIPEDA), which was signed into law way back in 2000.

Bill C-27, the Digital Charter Implementation Act (introduced in 2022) would’ve enacted the switch, and brought in two other acts as part of the bill: the Personal Information and Data Protection Tribunal Act, and the Artificial Intelligence and Data Act. These changes were intended to align Canada’s privacy standards with those of jurisdictions like the European Union, which has set the benchmark with its General Data Protection Regulation (GDPR).

Prime Minister Justin Trudeau’s resignation and the subsequent prorogation of Parliament on Jan. 6 effectively killed Bill C-27, along with all other bills that had not yet received Royal Assent, leaving the future of any updates to Canadian data privacy law up in the air.

Globally, regulations like the EU AI Act and the California Privacy Rights Act (CPRA) add further compliance layers for businesses that operate internationally.
Why personalization still matters

Consumers expect personalization, but their trust hinges on how businesses handle their data.

According to the Canadian Marketing Association, companies that excel at personalization can generate up to 40% more revenue than those that do not. At the same time, three-quarters of Canadians feel more comfortable sharing personal data when businesses are transparent about how it’s used.

Missteps such as irrelevant recommendations or intrusive ads can quickly erode trust. Conversely, brands that prioritize transparency and give customers control over their data — for example, through clear privacy settings or opt-in mechanisms — can foster deeper loyalty and engagement.

Personalization, when done ethically, offers a win-win scenario. Not only does it enhance the customer experience, but it also drives tangible business outcomes.
Why ethical AI is crucial for personalization

AI has become a cornerstone of modern personalization strategies, allowing businesses to deliver tailored experiences at scale. However, its use comes with significant risks, including bias, overreach, and potential misuse of consumer data. Ethical AI practices are essential for addressing these concerns and ensuring that AI is deployed responsibly.

The European Union’s AI Act emphasizes the need for transparency, accountability, and fairness in AI systems. These principles are critical for ensuring that AI-driven personalization respects consumer rights. Canadian businesses can look to frameworks like Microsoft’s Responsible AI principles for guidance on implementing ethical AI, but the focus must remain on how these tools align with consumer expectations.

Ultimately, businesses must treat ethical AI as a non-negotiable element of their personalization strategies. This includes regular audits, clear disclosure of how AI systems function, and safeguards to prevent misuse.
Striking the right balance

With stricter regulations on the horizon and growing consumer awareness, organizations must adopt strategies that prioritize trust, transparency, and ethical innovation.

The path forward requires more than compliance with laws, though. Businesses must actively engage with consumers, address their concerns, and make privacy a core part of their value proposition.

In an era where trust is a key differentiator, those who succeed in balancing these priorities will not only comply with regulations, but also build lasting loyalty and competitive advantage.


Jennifer Kervin is a Digital Journal staff writer and editor based in Toronto.

