
EU tells Elon Musk to hire more staff to moderate Twitter

Policy
Mar 2023

Elon Musk and the EU are in a dispute over the Twitter owner's plan to use more volunteers and artificial intelligence to help moderate the social media platform, as the company responds to strict new rules designed to police online content.

According to four people familiar with talks between Musk, Twitter executives, and regulators in Brussels, the billionaire has been told to hire more human moderators and fact-checkers to review posts.

The demand complicates Musk's efforts to reorganize the lossmaking business he acquired for $44 billion in October. The new owner has slashed more than half of Twitter's 7,500 staff, including the entire trust and safety teams in some offices, while seeking cheaper methods to monitor tweets.

Twitter currently uses a mix of human moderation and AI technology to detect and review harmful material, in line with other social media platforms. However, it does not employ fact-checkers, unlike larger rival Meta, which owns Facebook and Instagram.

Twitter has also been using volunteer moderators for a feature called "community notes" to tackle the deluge of misinformation on the platform, but the tool is not used to address illegal content.

Musk also told EU commissioner Thierry Breton in January that the company would lean further on its AI processes, said people with direct knowledge of the talks.

Those people said Breton advised that, while it was up to Twitter to come up with the best way to moderate the site, he expected the company to hire people to comply with the Digital Services Act.

Twitter said in a statement to the Financial Times: "We intend to comply fully with the Digital Services Act and have had several productive discussions with EU officials on our efforts in this area."

"We will continue to utilize a mixture of technology and expert staff to proactively detect and remove illegal content, while community notes will enable people to learn more about potential misinformation in a way that is informative, transparent and trustworthy," the company added.

The DSA is landmark legislation that will force Big Tech groups to police their sites more aggressively for illegal content. Major platforms, including Twitter, will have to be fully compliant by September this year at the latest. Those in breach face fines of up to 6 percent of global turnover. Musk told Breton that hiring would take time but that staff would be in place to comply with the DSA this year.

Following their January meeting, Musk tweeted: "Good meeting with @ThierryBreton regarding EU DSA. The goals of transparency, accountability & accuracy of information are aligned with ours. @CommunityNotes will be transformational for the latter."

Further talks have been held between Twitter and EU regulators regarding its moderation plans in recent weeks. At those, officials acknowledged that the community notes model could help weed out a large proportion of misleading information, much as volunteer editors achieve similar results on Wikipedia.

But concerns have been raised that Twitter does not have hundreds of thousands of volunteer editors, as Wikipedia does, and that it has a poor record on non-English language content moderation, an issue that plagues other social networks.

"Community notes is not a terrible idea but Musk needs to prove that it works," said a person with direct knowledge of the talks.

A person familiar with the community notes project said the feature is only one part of Twitter's wider disinformation moderation approach.

"Platforms should be under no illusion that cutting costs risks cutting corners in an area that has taken years to develop and refine," said Adam Hadley, director of Tech Against Terrorism, a UN-backed organization that helps platforms police extremist content online. "We are worried about the signal Twitter's latest move sends to the rest of the industry."

The European Commission said: "We believe that ensuring sufficient staff is necessary for a platform to respond effectively to the challenges of content moderation, which are particularly complex in the field of hate speech. We expect platforms to ensure the appropriate resources to deliver on their commitments."

(C) 2023 The Financial Times Ltd. All rights reserved. Not to be redistributed, copied, or modified in any way.
