Over the last couple of weeks, social media networks have finally enacted a glut of the content moderation policies that critics have spent years demanding. The latest crackdown comes from YouTube, which announced on Thursday that the ban hammer is dropping on QAnon videos—with caveats, of course.
While Facebook groups were seen as a digital home base for QAnon supporters to organize from, YouTube has been viewed as the catalyst that radicalized average folks through the conspiracy theories they would encounter along their way down the recommendation algorithm rabbit hole. In the world of conspiracy enthusiasts, QAnon has become a sort of unified field theory that unites new flights of the imagination with others that go back decades. YouTube has acknowledged that the broad, shifting nature of QAnon makes it difficult to craft policy to responsibly limit its spread. In a new blog post, the company put forth its own theory of how it hopes to address the issue:
Today we’re further expanding both our hate and harassment policies to prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence. One example would be content that threatens or harasses someone by suggesting they are complicit in one of these harmful conspiracies, such as QAnon or Pizzagate. As always, context matters, so news coverage on these issues or content discussing them without targeting individuals or protected groups may stay up. We will begin enforcing this updated policy today, and will ramp up in the weeks to come.
Last week, Facebook placed a blanket ban on QAnon groups and pages, a very heavy-handed move relative to the social media network’s track record of avoiding prohibitions on any legal content for as long as it can. But at the same time, the move was met with a collective resignation that Q supporters would likely just rebrand to something more innocuous. We’ve already seen this happening with the Save the Children movement’s modest success in rebranding a deranged fantasy about cannibal pedophile cabals into something that soccer moms can proudly call their own.
YouTube is keeping that in mind with its policy changes by not tying violations to a single group, and by focusing video removals only on content “that targets an individual or group with conspiracy theories that have been used to justify real-world violence.” In short: talking about QAnon or appealing to its adherents isn’t over the line (yet); but if you use the trappings of QAnon to describe an individual or religious group as part of a deep state plot to harvest adrenochrome from infants (a chemical which, incidentally, we’ve known how to synthesize in labs for decades), you’re probably headed for a ban.
QAnon has been cited as an inspiration for a range of crimes including kidnapping and murder, and the FBI has designated the movement as a domestic terror threat. Pizzagate, the predecessor to QAnon, inspired a man to fire a gun at a pizza parlor in Washington, DC because he thought it was harboring a child sex-slave ring in its basement. The gunman was sentenced to four years in prison and admitted his actions were “foolish and reckless” when he later understood he’d shot up a standard pizza shop that had neither child slaves nor a basement to put them in.
At its base, the theory holds that someone going by the pseudonym “Q” posts secret information from inside the U.S. government on the online image board known as 8Chan. Followers parse these cryptic messages and through their interpretations, they often expand the Q universe. The main storyline that most believers can get behind is that Trump is secretly working with people in the federal government to weed out a group of Satan-worshipping pedophiles who run the world. The finer details are disparate but they typically involve Democratic figures like Hillary Clinton either being executed soon or having already been secretly executed and replaced by a robot, clone, or reptilian.
YouTube is decidedly not on a mission to defeat QAnon—it merely wants to limit its role in spreading these theories and their potential for harassment or violence in the real world. As mainstream platforms reject the conspiracy theory community, the effect isn’t to make this kind of organizing impossible. Primarily, it just means that soccer moms will be more likely to head to 8Chan to participate in their fantasies, and if you’re one of them, please send us a reaction video documenting the first time you hit the site’s homepage.