EU demands Meta and TikTok detail efforts to combat disinformation during Israel-Hamas conflict

On Thursday, the European Union stepped up its scrutiny of major technology companies, demanding that Meta and TikTok provide detailed information on their efforts to curb illegal content and disinformation during the Israel-Hamas conflict.

The European Commission, the executive body of the 27-country bloc, formally asked the social media companies to explain how they are complying with sweeping new digital rules aimed at cleaning up online platforms.

The Commission has asked Meta and TikTok to explain the measures they have taken to reduce the risk of spreading and amplifying terrorist and violent content, hate speech, and disinformation.

Under the new EU rules that came into effect in August, major technology companies face additional obligations to prevent a wide range of illegal content from flourishing on their platforms, or risk significant fines.

The new rules, known as the Digital Services Act (DSA), are being tested by the Israel-Hamas conflict. Images and videos of the bloody conflict have flooded social media, along with user posts pushing false claims and misrepresenting footage from other events.

Last week, Brussels sent its first official request under the DSA to Elon Musk's social media platform X, formerly known as Twitter.

Thierry Breton, the EU's digital policy chief, previously sent warning letters to the three social media platforms and to YouTube, emphasizing the risks posed by the war.

"In our discussions with the platforms, we specifically asked them to be prepared for the risk of direct broadcasts of executions by Hamas - an inevitable risk from which we must protect our citizens - and we are seeking guarantees that the platforms are well-prepared for such a possibility," Breton said in his speech on Wednesday.

Meta, which owns Facebook and Instagram, and the video-sharing app TikTok did not immediately respond to emailed requests for comment.

The companies have until Wednesday to respond to questions related to their crisis response. They also face a second deadline of November 8 for responses to questions about election integrity and, in the case of TikTok, child safety.

Depending on their responses, Brussels may decide to open a formal investigation into Meta or TikTok and impose fines for providing "incorrect, incomplete, or misleading information," the Commission said.