Conspiracy theories about the health of Supreme Court Justice Ruth Bader Ginsburg have dominated YouTube this week, illustrating how the world's most popular video site is failing to prevent its algorithm from helping popularize viral hoaxes and misinformation.
More than half of the top 20 search results for her initials, “RBG,” on Wednesday pointed to false far-right videos, some claiming doctors are using mysterious illegal drugs to keep her alive, according to a review by The Washington Post. Ginsburg has been absent from oral arguments at the Supreme Court this week as she recuperates from recent surgery to remove cancer from her lungs. Tests revealed Friday she will need no further treatment and her recovery is on track.
The falsehoods, most of which originated with the fringe movement QAnon, dramatically outnumbered results from credible news sources: Only one of the top results came from a mainstream news site, CNN, and it was an 11-month-old interview about her career. The algorithm rewarded the conspiracy videos over reliable news based on what it calculated was their "relevance," signaling that the videos were likely new, popular or suitable to the search. By Thursday, a day after being contacted by The Post, YouTube searches for "RBG" also surfaced multiple videos from mainstream news organizations.
Tests revealed that Ginsburg has no additional cancer following her surgery in December, and no further treatment is needed, the Supreme Court announced Friday.
"Justice Ginsburg will continue to work from home next week and will participate in the consideration and decision of the cases on the basis of the briefs and the transcripts of oral arguments," court spokeswoman Kathleen Arberg said in a statement.
"Her recovery from surgery is on track. Post-surgery evaluation indicates no evidence of remaining disease, and no further treatment is required."
YouTube, a primary conduit for information, knows it has a problem. The Google-owned video site has changed its algorithms over the last year to surface more reliable videos around major news events, learning from Google search and Google News. It relies on an armada of human content moderators to vet what its algorithm flags as potentially problematic. But it is still dramatically outmatched by the pace and volume of videos that are uploaded, particularly around news events.
"While we've made good progress, we also recognize there's more to do," YouTube spokesman Farshad Shadloo said.
The hoaxes and conspiracy theories on YouTube can become gateways to more false information. YouTube's recommendation engine automatically queues up additional videos based in part on what it thinks the user might want to watch next. Users who clicked on one of the QAnon-related conspiracy theories about Ginsburg were presented with other videos claiming the existence of a "deep state" infested with demons, and still others saying a secretive Jewish cabal controls the world.
YouTube has fiercely resisted the idea that it should serve as an arbiter of truth, arguing that a combination of tech tools and savvy consumers is best equipped to help separate fact from fiction.
Shadloo said the site has added text boxes on certain searches so users can more easily fact-check information for themselves. YouTube searches for "Ruth Bader Ginsburg" resulted in more authoritative videos from news outlets including NBC and CBS this week, though many people refer to the justice by her initials.
The conspiracy theory around Ginsburg's health wasn't confined to YouTube. It became a subject of discussion in multiple Facebook groups, data from the social-monitoring firm CrowdTangle show. Others took to Twitter, where a search revealed multiple tweets, many with hundreds of shares and favorites and sporting QAnon-related hashtags, that questioned whether Ginsburg is actually dead.
The prominence of conspiracy theories about one of the country's most recognizable political figures highlights how misinformation slips through the cracks on social media sites, despite heightened efforts and high-profile promises by tech giants to police their platforms.
Facebook, which declined to comment, does not ban falsehoods but generally pushes problematic news stories shared on the site to fact-checkers and ranks content deemed to be a hoax lower in the News Feed. Twitter said Wednesday it doesn't serve as an arbiter of truth but relies in part on users who "are correcting and challenging the theories in real time."
The latest Ginsburg conspiracy theory is rooted in QAnon, which centers on a series of cryptic messages posted on anonymous online forums. Some QAnon posts cast doubt on the Supreme Court justice's health and the treatment doctors provided her. Followers of "Q" then posted videos on YouTube that explicitly cited the hoax or read its text to viewers in full.
YouTube's algorithms don't verify statements for accuracy, and the company says such an undertaking would be virtually impossible: While its current technology can flag some violations, it is still far from comprehending the nuances of conversation, context and human speech. YouTube also said it faces an overwhelming logistical challenge in attempting to police the 450 hours of video uploaded to the site every minute.
But YouTube is the world's second most-visited website - ranked only behind Google, its parent company - and many viewers look to the video giant as a prominent source for credible news and information. And much like Google, its search algorithm is closely guarded, a secrecy that can make it impossible for viewers to understand why a video was given what seems to be a seal of approval atop YouTube search results.
"Data shows that Americans increasingly rely on YouTube as a source of news and information," Virginia Sen. Mark Warner, the top Democrat on the Senate Intelligence Committee, said in a statement in response to the RBG videos. "It's inexcusable that YouTube continues to shirk its responsibility to the public by allowing misinformation and hoaxes to flourish so widely on the platform."
YouTube pledged to improve its search functions in 2017 after a flood of misleading videos overwhelmed the site following the Las Vegas mass shooting that left 58 people dead, and again last year after the site's algorithms highlighted videos attacking the teenage survivors of the Parkland school shooting. In October, YouTube chief Susan Wojcicki said that surfacing relevant results was one of the company's biggest challenges, adding: "We really want those top results to be right."
But the "RBG" results show how the algorithm still struggles with the same core vulnerabilities, spotlighting viral hoaxes and misleading videos while burying more credible content. Many of the "RBG" conspiracy-theory videos came from niche video creators and had only a few thousand views. But because YouTube's search algorithms rated the videos as having the highest "relevance" to searches for "RBG," based largely on factors such as view count and upload date, the site rocketed them to the top of the rankings, helping introduce the videos to a vast and unsuspecting new audience.
YouTube said the conspiracy-theory videos did not appear to violate its policies on harmful or dangerous content, and that it has made progress in tackling videos ripe for deception or manipulation. A viewer searching for "moon landing fake," for instance, is now shown a brief encyclopedia entry and a series of videos debunking the idea.
But the site still often fails to provide that level of context for ongoing or very recent news events, the kinds of developing stories many viewers may end up searching for. More than half of all adults surveyed by the Pew Research Center last summer said they thought YouTube was important to "understanding things happening in the world."
Joan Donovan, the director of the Technology and Social Change Research Project at Harvard University, said QAnon and other conspiracy theories have developed a "foothold on YouTube," and that their followers have been successful in using the site to reach broad new audiences.
Their prominence, and YouTube’s struggle to constrain it, has helped plunge mainstream viewers “into a bit of a spiral through many other” conspiracy-related videos, Donovan said. “They can’t tell after a while if [they’re] looking for more content to do with RBG or more content to do with QAnon.”