Penfrat agrees that disinformation on X is a problem, but he is worried about political overreach. There is no legal obligation for platforms to respond within 24 hours, he says, referring to Breton’s letter. “Don't just throw out empty threats on a social media site. This is not how enforcement works,” Penfrat says. “He's playing [by] Elon Musk's rules here, rather than using the ones that he's been given by the law.”
Breton’s letter also refers to “crisis measures.” The Digital Services Act contains extra “crisis” rules, designed to be used in times of war. “However, none of the requirements in order for this mechanism to be enacted have been initiated or met, further indicating overreach by the commission,” says Asha Allen, advocacy director for Europe at the Centre for Democracy and Technology (CDT), a think tank.
Allen says she is also concerned that Breton’s letter appears to conflate illegal content and disinformation. Creating a false equivalency between the two is worrying for freedom of expression, she says. “It is for these types of reasons that the DSA treats those content types differently; on the one hand, it contains mandatory obligations to tackle illegal content, and on the other hand increases due diligence to address harmful but lawful content.”
The CDT is seeking clarification on Breton’s letter, says Allen. “We would characterize the letter as a misstep.”
What happens next is unclear. Under the new rules, the EU Commission can fine social media platforms up to 6 percent of their global turnover or, in extreme cases, block a site entirely from the EU. That would take months of investigation, however.
“There won't be immediate consequences if X doesn't address some of the allegations in the letter,” says Mathias Vermeulen, the public policy director at AWO, a data rights consultancy. “But Breton seems to hint at the fact that X's response will be taken into account by the commission when it is assessing the risk mitigation measures.”