“‘VF’ refers to Twitter’s control over user visibility. It used VF to block searches of individual users; to limit the scope of a particular tweet’s discoverability; to block select users’ posts from ever appearing on the ‘trending’ page; and from inclusion in hashtag searches. All without users’ knowledge,” Weiss said.
A Twitter engineer said, in an account two other Twitter employees confirmed: “We control visibility quite a bit. And we control the amplification of your content quite a bit. And normal people do not know how much we do.”
Weiss added that the Strategic Response Team-Global Escalation Team, or SRT-GET, decided whether to limit the reach of specific users. The team handled up to 200 “cases” per day.
However, there was also a level of enforcement beyond official ticketing and beyond Twitter’s written policy, called the “Site Integrity Policy, Policy Escalation Support,” or “SIP-PES.”
“This secret group included Head of Legal, Policy, and Trust (Vijaya Gadde), the Global Head of Trust & Safety (Yoel Roth), subsequent CEOs Jack Dorsey and Parag Agrawal, and others,” Weiss continued, adding, “This is where the biggest, most politically sensitive decisions got made. ‘Think high follower account, controversial,’ another Twitter employee told us. For these ‘there would be no ticket or anything.’”
One of these accounts was Libs of TikTok (LTT), which was both placed on the Trends Blacklist and designated “Do Not Take Action on User Without Consulting With SIP-PES.”
Libs of TikTok, which Chaya Raichik started in November 2020 and which has over 1.4 million followers, received six suspensions in 2022 alone, Raichik said. Twitter, Weiss wrote, “repeatedly informed Raichik that she had been suspended for violating Twitter’s policy against ‘hateful conduct.’” Each suspension blocked Raichik from posting for up to a week.
However, Weiss continued, “in an internal SIP-PES memo from October 2022, after her seventh suspension, the committee acknowledged that ‘LTT has not directly engaged in behavior violative of the Hateful Conduct policy.’”
The group internally justified her suspensions by claiming that her posts “encouraged online harassment of ‘hospitals and medical providers’ by insinuating ‘that gender-affirming healthcare is equivalent to child abuse or grooming.’”
However, when Raichik herself was doxxed and a photo of her home and address went up on Twitter, Twitter Support responded: “We reviewed the reported content, and didn’t find it to be in violation of the Twitter rules.” The doxxing tweet is still up, and no action was taken, Weiss said.
“In internal Slack messages, Twitter employees spoke of using technicalities to restrict the visibility of tweets and subjects,” Weiss continued.
Roth said in one of these messages to a colleague that “a lot of times, SI has used technicality spam enforcements as a way to solve a problem created by Safety under-enforcing their policies. Which, again isn’t a problem per se — but it keeps us from addressing the root cause of the issue, which is that our Safety policies need some attention.”
Six days later, Roth messaged an employee on the Health, Misinformation, Privacy, and Identity research team requesting more research to support expanding “non-removal policy interventions like disabling engagements and deamplification/visibility filtering.”
“Roth wrote: ‘The hypothesis underlying much of what we’ve implemented is that if exposure to, e.g., misinformation directly causes harm, we should use remediations that reduce exposure, and limiting the spread/virality of content is a good way to do that,’” Weiss said.
Roth added, “We got Jack on board with implementing this for civic integrity in the near term, but we’re going to need to make a more robust case to get this into our repertoire of policy remediations — especially for other policy domains.”
“The authors,” who include journalists Abigail Shrier, Michael Shellenberger, Nellie Bowles and Isaac Grafstein, “have broad and expanding access to Twitter’s files,” Weiss said. “The only condition we agreed to was that the material would first be published on Twitter.
“We’re just getting started on our reporting. Documents cannot tell the whole story here. A big thank you to everyone who has spoken to us so far. If you are a current or former Twitter employee, we’d love to hear from you. Please write to: email@example.com,” Weiss said.
Weiss concluded by directing readers to journalist Matt Taibbi for the next installment.