EU Commission Presses Meta, TikTok for Strategies Against Illegal Content Amid Conflict


The European Union (EU) has asked the social media platforms Meta and TikTok to share their strategies for combating the proliferation of illegal content and misinformation, a move prompted by the recent Israel-Hamas conflict. The request, issued by the European Commission, the 27-member bloc's executive arm, seeks to evaluate the companies' compliance with new digital regulations designed to clean up the online environment.

Under the new Digital Services Act (DSA), which entered into force in August, technology giants such as Meta and TikTok are required to outline the steps they take to reduce and counter the dissemination of violent content, hate speech, and disinformation. The Act places heightened obligations on these companies to curb the spread of illegal content, with potential fines of up to six per cent of their annual global revenue for non-compliance.

The DSA, a ground-breaking initiative that reasserts the EU as a global leader in reining in Big Tech, is facing its first major test in the Israel-Hamas conflict. Social media has seen a deluge of photos, videos, and posts about the conflict, many pushing false narratives or presenting out-of-context material from unrelated events.

Previously, EU Commissioner Thierry Breton had sent warning letters to both platforms, citing the immense potential for misuse and the threat posed by live broadcasts of brutal acts. The Commissioner has also sought assurances about the platforms' preparedness to handle such eventualities.

The intense scrutiny follows threats by Hamas to broadcast the execution of an Israeli hostage live on social media, a threat that has thankfully gone unfulfilled so far despite the continued fighting and damage to civilian infrastructure.

In response to the EU's demand, Meta, the parent company of Facebook and Instagram, said it employs a well-tested approach to navigating crises, one that balances mitigating risk with safeguarding freedom of expression. The company has also set up a special operations center staffed with language experts to monitor the situation closely and respond swiftly to developments on the ground.

Meta says its teams are working around the clock to keep its platforms safe and to take action against any content that violates its policies or local law, including coordinating with third-party fact-checkers in the conflict region to limit the spread of misinformation.

TikTok did not comment on the request.

The companies must now respond to the Commission's questions on their crisis-response measures by Wednesday. A second deadline, November 8, covers questions on protecting electoral integrity and, specifically in TikTok's case, measures to safeguard minors.

Depending on the adequacy of their responses, the Commission may open formal proceedings and even levy fines against Meta or TikTok for providing “incorrect, incomplete, or misleading information”.