The European Union has warned X that it may calculate fines against the social-media platform by including revenue from Elon Musk’s other businesses, including Space Exploration Technologies Corp. and Neuralink Corp., an approach that would significantly increase the potential penalties for violating content moderation rules. Under the EU’s Digital Services Act, the bloc can slap online platforms with fines of as much as 6% of their yearly global revenue for failing to tackle illegal content and disinformation or follow transparency rules.
Regulators are considering whether sales from SpaceX, Neuralink, xAI and the Boring Company, in addition to revenue generated from the social network, should be included to determine potential fines against X, people familiar with the matter said, asking not to be identified because the information isn’t public.
The European Commission has been investigating X for several potential breaches of the Digital Services Act, newly introduced rules meant to ensure platforms police illegal content. The EU is leading a global crackdown on harmful online content and disinformation that’s sparked increasingly vocal responses from Musk, who has said such measures restrict free speech.
X is a private company under Musk’s sole control. In considering revenue from his other companies, the commission is essentially weighing whether Musk himself should be regarded as the entity to fine as opposed to X itself, the people said. Tesla Inc.’s sales would be exempt from this calculation because it’s publicly traded and not under Musk’s full control, one of the people said.
The commission hasn’t yet decided whether to penalize X, and the size of any potential fine is still under discussion, the people said. The commission tends not to fine companies the maximum possible amount in antitrust cases. Penalties may be avoided if X finds ways to satisfy the watchdog’s concerns.
X would also have the opportunity to challenge any EU decision, but the final say rests with the commission, the people said. X didn’t reply to requests seeking comment. Musk has previously said on X that he will fight any DSA fine through “a very public battle in court.”
The review of X began under Thierry Breton, the EU’s former tech czar who often feuded with Musk online and had been granted special powers to enforce the DSA without the need for the commission’s rubber stamp. After Breton resigned in September, he bequeathed his fining powers to competition and digital boss Margrethe Vestager. Decisions on the penalties and how they are calculated would ultimately lie with Vestager.
“The obligations under the DSA are addressed to the provider of the very large online platform or very large online search engine,” commission spokesperson Thomas Regnier said. “This applies irrespective of whether the entity exercising decisive influence over the platform or search engine is a natural or legal person.” Regnier didn’t elaborate further on this specific case.
The EU raised concerns in August that X’s use of blue check marks for what it calls “verified” accounts could deceive users into believing the accounts are safe when “malicious state actors” have abused the system in some cases. It also said X’s lack of transparency on advertising and failure to share data with researchers may violate the DSA.
Earlier this week, Musk’s X escaped further regulatory scrutiny under the EU’s sister regulation, the Digital Markets Act, which attempts to root out competition violations online before they take hold. While regulators ultimately found that X wasn’t powerful enough to meet those rules, they said that all legal entities under Musk’s control — including “X Holdings Corp., Space X, The Boring Company, Neuralink Corporation, and X.AI, as well as Mr. Elon Musk,” according to a commission document — should be considered as a singular group.
The DSA became legally enforceable last August, laying out content rules for social media platforms, online marketplaces and app stores. It forces their owners to clamp down on misinformation and objectionable content such as hate speech, terrorist propaganda and ads for unsafe toys. Regulators have also homed in on the creation of so-called rabbit holes on social media, which suck young users deeper and deeper into often inappropriate material.