EU investigates Musk’s X over violent content, disinformation about Hamas attack


Elon Musk’s X faces further intense scrutiny in the European Union after regulators demanded answers over concerns about “illegal” and even “terrorist” content on the social media platform.

The European Commission late Thursday said it sent X a formal request for information under its strict new digital content rules, following indications that terrorist and violent content may be spreading on the platform.

X, formerly known as Twitter, received a one-week ultimatum — until Oct. 18 — to respond to regulators’ questions about “the activation and functioning” of its crisis-response protocol, and will have to respond on other questions by Oct. 31, the commission said in a news release.

“This formal request for information is a step to fully understand what measures @X is taking to ensure online safety,” commission Vice President Vera Jourova wrote in a post on X.

The move comes after the EU's internal market commissioner, Thierry Breton, set a 24-hour deadline for X owner Musk to respond to allegations that the platform was ignoring notices of illegal content related to the Israel-Hamas conflict.

In a letter to X earlier this week, which Breton posted on the platform, he reminded Musk that the company has an obligation to take "very precise" content moderation measures under the EU's Digital Services Act.

X faces possible fines if it provides the commission with “incorrect, incomplete or misleading information” in response, according to the news release. X’s spokespeople didn’t immediately respond to an email seeking comment.

John Kirby, the spokesman for the US National Security Council, told reporters Thursday he was grateful that X removed some disinformation related to the attack on Israel.

ByteDance Ltd.'s TikTok on Thursday joined Meta Platforms Inc. on a growing list of leading social media companies that Breton has warned to take prompt action to stop the spread of disinformation.

Social media companies are required under the new law to hire more content moderators and use risk mitigation methods to reduce the spread of harmful content. Companies that fail to comply could face fines of as much as 6% of annual revenue, or even be banned from the bloc if they repeatedly break the rules.
