Thursday, October 19, 2023

EU opens disinformation probes into Meta, TikTok

DOING WHAT THE U$ CAN'T, WON'T

The EU sent formal requests for information to Meta and TikTok in what is a first procedure launched under a new law.

The EU announced probes Thursday into Facebook owner Meta and TikTok, seeking more details on the measures they have taken to stop the spread of "illegal content and disinformation" after the Hamas attack on Israel.

The European Commission said it had sent formal requests for information to Meta and TikTok respectively in what is a first procedure launched under the EU's new digital content law, the Digital Services Act (DSA).

The EU launched a similar probe into billionaire mogul Elon Musk's social media platform X, formerly Twitter, last week.

The commission said the request to Meta related "to the dissemination and amplification of illegal content and disinformation" around the Hamas-Israel conflict.

In a separate statement, it said it wanted to know more about TikTok's efforts against "the spreading of terrorist and violent content and hate speech".

The EU's executive arm added that it wanted more information from Meta on its "mitigation measures to protect the integrity of elections".

Meta and TikTok have until October 25 to respond, with a deadline of November 8 for less urgent aspects of the demand for information.

The commission said it also sought more details about how TikTok was complying with rules on protecting minors online.

The European Union has built a powerful armory to challenge the power of big tech with its landmark Digital Services Act (DSA) and a sister law, the Digital Markets Act, that hits internet giants with tough new curbs on how they do business.

The EU's fight against disinformation has intensified since Moscow's invasion of Ukraine last year and Russian attempts to sway European public opinion.

The issue has gained further urgency since Hamas' October 7 assault on Israel and its aftermath, which sparked a wave of violent images that flooded the platforms.

The DSA came into effect in August for "very large" platforms, including Meta and TikTok, that have more than 45 million monthly European users.

The DSA bans illegal online content under threat of fines running as high as six percent of a company's global turnover.

The EU's top tech enforcer, Thierry Breton, sent warning letters to tech CEOs including Meta's Mark Zuckerberg, TikTok's Shou Zi Chew and Sundar Pichai of YouTube owner Alphabet.

Growing EU fears

Breton, EU internal market commissioner, told the executives to crack down on illegal content following Hamas' attack.

Meta said last week that it was putting special resources towards cracking down on illegal and problematic content related to the Hamas-Israel conflict.

On Wednesday, Breton expressed his fears over the impact of disinformation on the EU.

"The widespread dissemination of illegal content and disinformation... carries a clear risk of stigmatization of certain communities, destabilization of our democratic structures, not to mention the exposure of our children to ," he said.

AFP fact-checkers have found several posts on Facebook, TikTok and X promoting a fake White House document purporting to allocate $8 billion in military assistance to Israel.

And several platforms have had users passing off material from other conflicts, or even from video games, as footage from Israel or Gaza.

Since the EU's tougher action on digital behemoths, some companies, including Meta, are exploring whether to offer a paid-for version of their services in the European Union.

© 2023 AFP


What to know about the EU’s landmark digital content act

By AFP
October 19, 2023

Tech giants have been targeted by the EU for a number of allegedly unfair practices 


Daniel Aronssohn

The EU’s milestone legislation, known as the Digital Services Act, demands digital giants crack down on illegal and problematic content.

Brussels is baring its teeth, launching investigations into X, formerly Twitter, as well as Facebook parent Meta and TikTok following Hamas' assault on Israel on October 7 and its aftermath.

The DSA is a mammoth law that will force digital giants to aggressively police content online in the European Union and could see them hit with major fines.

Here are the regulation’s key elements:

– Rules for all platforms –


The law kicks in for all platforms from February 17, 2024, but since August it has already applied to very large platforms with more than 45 million active monthly users.

Among their obligations, all platforms must quickly remove illegal content or make access to it impossible as soon as they are aware of the issue.

The companies must also rapidly inform the authorities when they suspect a criminal offence that threatens people’s lives or the safety of others.

Every year, platforms must publish a report detailing the actions taken on content moderation and how long they took to respond after notification of illegal content. They must also report on the decisions taken in disputes with users.

The law also tells platforms to suspend users who frequently share illegal content such as hate speech or fake ads, while online shopping sites must verify the identities of users and block repeat fraudsters.

There are also tougher rules on targeted advertising, with a ban on such ads for children aged 17 and under.

The EU also wants users to see how their data is used. The law bans targeted advertising based on sensitive data, such as ethnicity, religion or sexual orientation.

The law does not apply to very small companies.

– Extra rules for large platforms –

The EU has named 19 “very large” platforms including Apple, Amazon, Facebook, Google, Instagram, Microsoft, Snapchat, TikTok and clothing retailer Zalando.

Amazon and Zalando have launched legal challenges to their designations.

These large platforms must assess the risks linked to their services with regard to the spread of illegal content and privacy infringements.

And they must also set up structures internally to mitigate such risks, such as improved content moderation.

The platforms must also give regulators access to their data so officials can see whether they are complying with the rules.

This access will also be shared with approved researchers.

They will be audited once a year, at their own expense, by independent organisations to ensure compliance, and must also establish an independent internal supervisor to keep an eye on whether they are in line with the rules.

– EU, national coordination –

Under the law, the EU’s 27 member states must assign a competent authority that has the powers to investigate and sanction any violation.

These authorities must work with each other and with the European Commission, the EU’s executive arm, to enforce the regulation from February.

If a digital platform provider is located in a member state, that country must enforce the rules, except for very large platforms, which come under the commission's supervision.

– Complaints, penalties –


The DSA wants to make it easier for users’ complaints to be heard.

Users will be able to lodge a complaint claiming a platform is in violation of the DSA with their competent national authority.

Online shopping sites may be held responsible for any damage from products bought by users that are non-compliant or dangerous.

Violations can be met with fines that could go up to six percent of a company’s global turnover, and for repeated non-compliance, the EU can even decide to ban offending platforms from Europe.



