EU Urges TikTok to Curb Disinformation Within 24 Hours After Hamas Attack


The European Union has issued a stark warning to social media giant TikTok over the spread of “disinformation” on its platform following the recent attack launched by Hamas on Israel. The bloc has urged Shou Zi Chew, TikTok’s chief executive, to significantly strengthen the company’s mitigation efforts within a strict 24-hour window and to ensure those efforts comply fully with European law.

Since the attack, social media firms have seen a sharp spike in the spread of inaccurate information about the conflict, including manipulated images and misattributed videos that fan the flames of discord. X, the company formerly known as Twitter, and Meta, formerly Facebook, previously received similar warnings from the EU about the dissemination of such content.


Given the popularity of TikTok among the younger demographic, the EU emphasized the platform’s responsibility to safeguard children and adolescents from exposure to harmful content. This includes, but is not limited to, violent content, terrorist propaganda, deadly challenges, and content that could potentially endanger lives.

Linda Yaccarino, CEO of the social media platform X, responded to the EU’s warning by stating that the company has flagged or removed “tens of thousands of pieces of content” since Hamas’s attack on Israel. The firm has also deactivated hundreds of accounts.

Meta, the parent company of Facebook and Instagram, received a comparable warning with a 24-hour ultimatum from the EU. While the EU declined to say whether Meta had responded, a representative from the European Commission confirmed that discussions with Meta’s compliance team are ongoing.

A Meta spokesperson said its teams are working to keep its platforms safe, taking action against content that violates company policies or local law. The company has also set up an operations center staffed with experts fluent in Hebrew and Arabic to monitor the situation closely and respond as needed.

X, meanwhile, has stepped up its own efforts, redistributing resources and refocusing internal teams on managing the content. Ms. Yaccarino said X has acted on more than 80 requests to remove content within the EU’s jurisdiction and has added explanatory notes to some posts.

However, despite the EU’s accusation regarding “illegal content,” Yaccarino said in her official response that X had not received any notices from Europol.

Under the Digital Services Act, which began applying to large platforms in 2023, “very large online platforms” must proactively remove “illegal content” and demonstrate, upon request, the measures they have taken to do so. The legislation empowers the EU to conduct interviews and investigations, and if a platform fails to comply or to address identified issues, the Commission can impose heavy fines or, as a last resort, request that the platform be temporarily banned in the EU.