Musk’s ‘free speech’ agenda dismantles safety work at Twitter, insiders say

Hours after Elon Musk took control of Twitter in late October, the Trust and Safety team responsible for combating hate speech on the site received an urgent directive: Bring back the Babylon Bee.

To some Twitter employees, the order was troubling. The self-described Christian satirical site had been banned since March for refusing to delete a tweet naming Biden health official Rachel Levine its “Man of the Year.” Levine is a transgender woman, and the tweet violated a 2018 rule prohibiting Twitter users from targeting transgender people by referring to them by the name or gender they used before transitioning.

To Musk, the suspension was emblematic of the kind of liberal overreach he has vowed to purge from the platform. In a small kitchen on the second floor of the company’s San Francisco headquarters, he huddled with his lawyer, Alex Spiro, Trust and Safety lead Yoel Roth, another Twitter employee and a Tesla employee to discuss the suspension.

While the Babylon Bee’s tweet was “not cool,” Musk told them, it also wasn’t “sticks and stones” - violent threats that he argued should be the new standard for who gets booted.

Since that Friday in October, Musk’s brief reign at Twitter has been marked by chaos and upheaval as the notoriously mercurial and impatient billionaire seeks to impose his will on a company famous for its deliberative culture. Massive layoffs and Musk’s demand that remaining employees pledge to work “hardcore” hours have left one of the world’s most influential social media sites operating with a skeleton staff and experts predicting an eventual crash. Meanwhile, advertisers are fleeing, raising doubts about Musk’s ability to generate sufficient profits to satisfy investors in the $44 billion deal.

Amid the turmoil, Musk has stoked the culture-war issues that helped inspire him to purchase the company in the first place. A fierce advocate for the right “to speak freely within the bounds of the law,” Musk has moved rapidly - at times erratically - to undermine a regime built over a decade to define dangerous language on the site and protect vulnerable communities, replacing it with his own more impulsive style of ad hoc decision-making.

The Trust and Safety team lies at the heart of that battle. Known for examining every question from every angle before taking action, the team is responsible for many of the polarizing decisions that have prompted the right to cry censorship, including the ban on former president Donald Trump after his supporters attacked the U.S. Capitol on Jan. 6, 2021.

Interviews with more than a dozen current and former employees and people close to Musk who spoke on the condition of anonymity for fear of retribution, as well as documents obtained by The Washington Post, detail the clash of cultures as Musk and his allies have fired or alienated Trust and Safety team leaders and reversed their decisions. In recent days, Musk has reinstated both Trump and the Babylon Bee, along with a handful of other controversial accounts.

Now, Musk is looking to automate much of the Trust and Safety team’s work to police content - stripping some of the nuance from complicated decisions in favor of a cheaper approach.

Already, the team is dwindling. Its numbers have shrunk to dozens from a staff of 120 before Musk’s takeover. On his first night as owner, Musk fired its leader of 11 years, Trust and Safety head Vijaya Gadde. Even before the takeover, Musk had personally amplified rhetoric from conservatives calling Gadde the company’s “chief censor.” Gadde did not respond to a request for comment.

With Gadde’s departure, Roth became the highest-ranking Trust and Safety official, steering the site through the Nov. 8 midterm elections. At first, he worked with Musk to stanch a flood of hate speech unleashed under the new owner. But Roth quit two weeks later, after Musk insisted on creating a $7.99 pay-for-play blue check system - since suspended - that led to a profusion of “verified” impostor accounts.

In an op-ed Friday in the New York Times, Roth acknowledged that the old system of content moderation - a “know-it-when-I-see-it” mélange reflecting the values of advertisers, Apple and Google app store managers, and Big Tech executives - can “have a dismaying lack of legitimacy.” But Roth warned that Musk “perpetuates the same lack of legitimacy through his impulsive changes.”

“It was for this reason that I chose to leave the company: A Twitter whose policies are defined by edict has little need for a trust and safety function dedicated to its principled development,” he wrote.

Roth declined to comment further. Musk and Twitter did not respond to requests for comment.

- - -

In Twitter’s early years, the company’s executives called the social network “the free speech wing of the free speech party.” But then came GamerGate, the brutal 2014 campaign against women in the video game industry, and the site started to evolve. In 2015, Gadde committed in a Post op-ed to invest in tools and systems that would better detect and root out abuse.

After the 2016 election, it became clear that the social network had been used by Russian trolls to sow disinformation. The Trust and Safety team took on new significance and grew, increasingly developing policies that put it at the center of the culture wars that gripped the nation during the Trump administration and the coronavirus pandemic. Civil rights groups complained the team wasn’t doing enough; conservatives said they were being censored.

Despite Musk’s criticism in the lead-up to his purchase, he initially seemed more open to their work than the team expected. On Oct. 27, the day the deal closed, Gadde told her team that she’d had productive discussions with Musk.

Within hours, however, Gadde’s access to company email and other systems was cut off; she couldn’t even say a formal farewell. At a Halloween party at Twitter headquarters where workers had brought children dressed in costumes, some employees quietly left to go cry.

Twitter occupies two buildings linked by a foot bridge on the ninth floor. Across the bridge from the Halloween festivities, Roth huddled that day with Musk and his new team in a large second-floor conference room. Dubbed the war room, the space was strewn with toys, and two-year-old X Æ A-Xii, one of Musk’s children, was running around.

Musk’s takeover coincided with a critical stress test for the Trust and Safety team - the Brazil election. As they mapped out layoffs that ultimately would banish more than half the company’s 7,500 workers, Musk’s team wanted to cut off access to key content moderation tools to prevent disgruntled employees from wreaking havoc.

Roth pitched a plan to satisfy that desire while preserving sufficient access to permit his team to address any issues that might crop up in Brazil, where the incumbent, Jair Bolsonaro, uses Twitter much as Trump did and was likely to contest the results. Musk approved Roth’s plan.

The next day, Musk demanded that Twitter reverse bans imposed on the Babylon Bee and Canadian psychologist Jordan Peterson, the self-proclaimed “professor against political correctness.” Peterson was suspended from Twitter this year over a tweet that referred to transgender actor Elliot Page by the name he used before transitioning and said the actor had his “breasts removed by a criminal physician.”

Musk has long promoted the Babylon Bee on Twitter. In March 2021, when the Onion, the Bee’s rival in satire, published a barb about how he had made his money, Musk wrote, “Shame on you, Onion. This is why people are switching to @TheBabylonBee.”

But Musk’s sudden demand was problematic. If the company restored the two accounts without a clear reason, it would undo years of careful work. As owner of the company, Musk could just revoke the policy on transgender abuse. But Roth warned that such a move could make Twitter a lightning rod for the culturally charged national debate over gender during Musk’s first 24 hours at the helm.

Roth’s concerns were escalated to Musk, and they huddled in the kitchen to discuss them. Twitter already was working on a new method of handling users who broke the rules: Instead of suspending their accounts, the company would obscure offending tweets with public warnings, much as it does with election misinformation.

Musk stood down, and Roth’s deputy sent a directive on Slack to his team to overhaul the suspension policy that very afternoon. The Babylon Bee and Peterson would remain suspended until Twitter unveiled the new policy.

Shortly after, Musk announced on Twitter that he would form a new content moderation council - composed of diverse voices from across the political spectrum - to help Twitter make decisions about restoring and banning accounts.

“No major content decisions or account reinstatements will happen before that council convenes,” Musk tweeted.

For the moment, it seemed like a victory.

- - -

Through the following weekend, Twitter executives worked with Musk and his team to plan layoffs. Food, coffee and snacks were catered so that people didn’t have to leave the building. In violation of Twitter’s long-standing practice of encouraging reusable water bottles, tables in the war room were dotted with buckets of Voss water. Musk’s allies - investor David Sacks, Spiro and a host of engineers from his other companies, Tesla, SpaceX and The Boring Co. - regularly accompanied him.

That Sunday, Musk tweeted a link containing misinformation about the attack on House Speaker Nancy Pelosi’s husband. He later deleted it, although his reasoning was unclear.

Meanwhile, hate speech surged on the site as users tested Musk’s new Twitter. Civil rights groups raised concerns, and basketball star LeBron James tweeted that the rise of hate was “scary AF,” calling on Musk to take it seriously.

Trust and Safety sprang into action. The team had already been working on new rules that would result in removing more hateful slurs - a policy it had planned to roll out months later. They pulled the trigger anyway, despite warnings from some that the new rules might err on the side of removing posts that were acceptable.

“There was a lot of scrambling, a lot of disbelief,” said one former employee, who left the company this month. “How are we doing this in a week, when it was slated to take two quarters?”

When “Black Thursday” rolled around on Nov. 3, only 15 percent of Roth’s team was laid off, compared with half of the entire company. But other teams that also do critical content moderation suffered heavier cuts, including the Product Trust team that develops policies to prevent new features from harming people and the Health team that implements policies from Trust and Safety.

Civil rights groups called for an advertiser boycott. Roth tried to calm the public in a Twitter thread, where he said the company’s “core moderation capabilities remain in place.”

As layoff notices went out - first to Asia and Europe, then the United States - a group of longtime Twitter executives huddled together in a conference room and cried.

Civil rights groups and researchers who use the company as a resource noticed the effect almost immediately.

Thenmozhi Soundararajan, executive director of a Dalit rights group called Equality Labs, routinely asked Trust and Safety division employees to take down accounts violating Twitter’s rules on hate speech. Following the layoffs, her emails started bouncing back. When she made requests through Twitter’s public portal, she stopped receiving automated acknowledgments.

“It is such a dangerous time to have fired the moderators and this team,” she said in an interview, citing regional elections in India and surging misinformation and slurs. “Twitter has been already in a state of failure, this is just the nail in the coffin.”

On Nov. 7, Musk made many in the company cringe when he tweeted to his millions of followers that they should vote Republican. But Election Day otherwise came and went without major crises.

Then came Twitter Blue Verified, the paid version of the check marks that have long been appended to accounts of major corporations, celebrities, journalists and other influential people.

Already, Trust and Safety employees including Roth had made an impassioned appeal to delay the rollout - at least until after the midterms - warning the fallout could be major.

In a seven-page memo reviewed by Musk, employees warned of the risks of broad impersonation of world leaders, advertisers, brand partners, election officials and other high-profile individuals.

To help mitigate this risk, the memo recommended that Twitter include “official account” labels, which should be “visually distinguishable” and “distinguishable on every surface” including in people’s timelines and replies.

Musk agreed to delay the rollout until after the midterms. But on Nov. 9, the check marks started popping up - as did the fake accounts. Impostors purporting to be basketball star James, President Biden and major brands began fomenting chaos.

Sales executive Robin Wheeler, Roth and Musk launched an audio chat on Twitter Spaces in an attempt to reassure advertisers, joining the call from separate rooms. Musk said the new paid-for verified service would make the site more reliable and help eliminate or demote fake accounts.

Mistakes will be made, Musk acknowledged. But, he said, “If we do not try bold moves, how will we make great improvements?”

The next day, Roth quit. A Trust and Safety director gathered employees in a team meeting and shared the abrupt announcement that Roth was out. Employees were given no explanation and worried about the institutional knowledge vacuum his departure created.

That same night, an urgent memo was sent to engineers over Slack. Less than 48 hours after the launch of Musk’s first major product, the company had disabled new sign-ups for check mark subscriptions to address the impersonation issues.

In a tweet late Monday, Musk said the company would hold off relaunching Blue Verified “until there is high confidence of stopping impersonation.” He also said Twitter “will probably use different color check for organizations than individuals.”

- - -

Within Trust and Safety, morale continued to decline. On Nov. 10, the team staged a one-day sickout.

After Musk’s Wednesday ultimatum asking employees to commit to working “hardcore” hours, Trust and Safety team members and their allies inside the company huddled virtually over Google Meet and Signal groups to discuss what to do. They no longer trusted private internal Slack channels after Musk had fired several employees for their communications there.

There was a lopsided consensus that Musk didn’t value their work. The leaders who inherited Roth’s team had not even spoken to Musk, they said, and false tweets from impersonator blue-check accounts stayed up for hours. They discussed a potential mass resignation.

But some had to stay for the health insurance that protected their partners and children as well as themselves. Those who wanted out did not judge those who chose to stay.

“The message was: Do what is best for your mental health and your family,” one of the participants said.

The next day, many members of the team met in San Francisco with an engineer close to Musk, as people outside the office called in. One employee walked away calling it a “disaster.”

“Several colleagues made up their minds to leave after that,” the person said. “There appeared to be no strategy for [Trust and Safety], and Elon was just overriding or making up policy as he went without any input.”

Ella Irwin, the executive Musk tapped to replace Roth as head of the division, suggested after the meeting that Musk would learn moving forward.

“It is a complete Trust and Safety nightmare. It’s the kind of thing you work on nonstop, you make it your baby,” another person said, “and for someone to just disregard everything you are saying is a complete mental blow.”

Virtually the entire team dedicated to rooting out covert foreign influence operations was fired or quit, jeopardizing the company’s ability to detect such accounts, including those attempting to influence U.S. politics.

On Friday, Musk announced that the Babylon Bee and Peterson would be permitted back on the site - with little explanation. The Babylon Bee declined to comment.

“New Twitter policy is freedom of speech, but not freedom of reach,” Musk tweeted, adding that negative and hateful tweets would be made less visible and would not earn the company money.

Over the weekend, Musk announced that Trump would be reinstated to the platform after conducting a Twitter poll in which roughly 52 percent of participants voted to bring the former president back.

“Hope all judgy hall monitors stay on other platforms - please, I’m begging u,” Musk later tweeted.

His promise to appoint a council to bring transparency to such momentous decisions appeared to have been forgotten.

- - -

The Washington Post’s Will Oremus and Jeremy B. Merrill contributed to this report.
