It had been a long pandemic for the Twitter research team. Charged with solving some of the platform’s toughest problems surrounding harassment, extremism and disinformation, staffers headed to Napa Valley in November 2021 for a corporate retreat. Despite a tumultuous change in leadership — Jack Dorsey had recently stepped down and appointed former chief technology officer Parag Agrawal to take his place — the group felt united, even hopeful. After months of battling bad actors online, employees took a moment to relax. “We finally felt like we had a close-knit team,” said one researcher.
But during the farewell brunch on the last day, people’s phones started pinging with alarming news: their boss, Dantley Davis, Twitter’s vice president of design, had been fired. Nobody knew it was coming. “It was like a movie,” said one participant, who asked to remain anonymous because they are not authorized to speak publicly about the company. “People started to cry. I just sat there eating a croissant, taking in the mood.”
The news foreshadowed a downward spiral for the research organization. Although the group was used to reorganizations, a shake-up during an outing meant to bond the team together felt deeply symbolic.
The unrest came to a head in April, when Elon Musk signed a deal to buy Twitter. Interviews with current and former employees, along with 70 pages of internal documents, suggest the chaos surrounding Musk’s acquisition pushed some teams to the breaking point, prompting numerous health researchers to quit; some say their remaining colleagues were told to set aside projects fighting extremism in favor of a focus on bots and spam. The Musk deal may not even go through, but the effects on Twitter’s health efforts are already clear.
The health team, once tasked with fostering civil conversations on the famously uncivil platform, shrank from 15 full-time staffers to two.
In 2019, Jack Dorsey asked a fundamental question about the platform he helped create: “Can we actually measure the health of the conversation?”
On stage at a TED conference in Vancouver, the beanie-clad CEO spoke candidly about investing in automated systems to proactively detect bad behavior and “completely relieve the victim.”
That summer, the company began staffing a team of health researchers to carry out Dorsey’s mission. His speech convinced people who had worked in academia or for larger tech companies like Meta to join Twitter, inspired by the prospect of working for positive social change.
When the process worked as intended, health researchers helped Twitter think through possible misuse of new products. In 2020, Twitter was working on a tool called “unmention” that would allow users to restrict who can respond to their tweets. Researchers conducted a “red team” exercise, bringing together employees from across the company to investigate how the tool could be misused. Unmention could allow “powerful people [to] suppress dissent, discussion and correction” and “enable harassers to contact their targets [and] force targets to respond personally,” the red team wrote in an internal report.
But the process was not always smooth. In 2021, former Twitter product chief Kayvon Beykpour announced that the company’s first priority was the launch of Spaces. (“It was a full-blown attack to kill Clubhouse,” says one employee.) The team assigned to the project worked overtime to get the feature out the door and didn’t schedule a red team exercise until August 10th, three months after launch. In July, the exercise was canceled. Spaces went live without a comprehensive assessment of key risks, and white nationalists and terrorists flooded the platform, as The Washington Post reported.
When Twitter finally held a red team exercise for Spaces in January 2022, the report concluded: “We did not prioritize identifying and mitigating health and safety risks before launching Spaces. This red team came too late. Despite critical investments, in the first year and a half of building Spaces we’ve been largely reactive to the real harm inflicted by malicious actors in Spaces. We’ve relied too much on the general public to identify issues. We’ve launched products and features without adequate research into possible health consequences.”
Earlier this year, Twitter backed off plans to monetize adult content after a red team discovered the platform had failed to adequately address child sexual exploitation material. It was a problem that researchers had been warning about for years. Employees said Twitter executives were aware of the problem but noted that the company had not allocated the resources needed to fix it.
By the end of 2021, Twitter’s health researchers had spent years chasing bad actors around the platform and decided to take a more sophisticated approach to dealing with harmful content. Externally, the company was regularly criticized for allowing dangerous groups to run amok. But internally, it sometimes felt like certain groups, such as conspiracy theorists, were kicked off the platform too soon — before researchers could study their dynamics.
“The old approach was almost comically ineffective and highly reactive — a manual process of playing catch-up,” said a former employee, who wished to remain anonymous as they are not authorized to speak publicly about the company. “Simply defining and catching ‘bad guys’ is a losing game.”
Instead, researchers hoped to identify people who were on the verge of engaging with harmful tweets and nudge them toward healthier content using pop-up messages and interstitials. “The pilot will enable Twitter to identify and leverage behavioral – rather than content – cues and reach users at risk of harm with redirection to supportive content and services,” read an internal project overview viewed by The Verge.
Twitter researchers teamed up with Moonshot, a company that specializes in studying violent extremists, and launched a project called Redirect, modeled on the work Google and Facebook had done to curb the spread of harmful communities. At Google, that work had resulted in a sophisticated campaign to target people searching for extremist content with ads and YouTube videos aimed at debunking extremist messaging. Twitter planned to do the same.
The goal was to move the company from simply responding to bad accounts and messages to proactively guiding users toward better behavior.
“Twitter’s efforts to contain harmful groups usually focus on defining these groups, designating them within a policy framework, detecting their reach (through group membership and behavior), and suspending or deplatforming those within the cohort,” reads an internal project overview. “This project instead seeks to understand and address user behavior upstream. Instead of focusing on pinpointing bad accounts or content, we aim to understand how users find harmful group content and accounts, and to redirect those efforts.”
In phase one of the project, which started last year, researchers focused on three communities: racially or ethnically motivated violent extremism, violent anti-government or anti-authority extremism, and incels. In a case study on the boogaloo movement, a far-right group focused on instigating a second American Civil War, Moonshot identified 17 high-engagement influencers who used Twitter to share and spread their ideology.
The report outlined possible intervention points: one when someone searched for a boogaloo term, and another when they were about to engage with a piece of boogaloo content. “Moonshot’s approach to identifying core communities could surface users who are moving into this sphere of influence, which could trigger an interstitial message from Twitter,” the report said.
The team also suggested adding a pop-up message before users could retweet extremist content. The interventions were designed to add friction to the process of finding and engaging with harmful tweets. Done right, they would mitigate the impact of extremist content on Twitter, making it harder for these groups to gain new followers.
Before that work could be fully completed, however, Musk reached an agreement with the Twitter board to buy the company. Shortly thereafter, employees who had led the Moonshot partnership left. And in the months since Musk signed the deal, the health research team has all but evaporated, from 15 staffers to just two.
“The sale of the company to Elon Musk was the icing on the cake of a much longer track record of decisions by senior leaders at the company, which showed that safety was not a priority,” said one employee.
Multiple former researchers said the turmoil surrounding Musk’s bid to buy the company was a breaking point and prompted them to look for new jobs.
“The chaos of the deal made me realize that I didn’t want to work for a private Twitter owned by Musk, but also that I didn’t want to work for a public Twitter not owned by Musk,” a former employee said. “I just didn’t want to work for Twitter anymore.”
Phase two of the Redirect project — which would have helped Twitter understand which interventions worked and how users actually interacted with them — received funding. But by the time the money came in, no researchers were available to oversee it. Some of the employees who remained were reportedly told to deprioritize Redirect in favor of projects related to bots and spam, which Musk has seized on in his bid to pull out of the deal.
Twitter spokesperson Lauren Alexander declined to comment on the record.
An employee captured the team’s frustration in a tweet: “Absolutely not interested in what jack or any other c-suiter has to say about this acquisition,” the employee wrote, screenshotting an article about how much Twitter CEO Parag Agrawal and former CEO Jack Dorsey stood to benefit from the deal with Musk. “May you all fall down a very long staircase.” (The employee declined to comment.)
According to current employees, the tweet was reported as threatening by a colleague, and the employee was fired.