Untrue-Tube: Monetizing Misery and Disinformation
Welcome to 2018, a time when we clicked and swiped ourselves into a cultural dystopia for a few trillion views.
With the dust still settling from the high school mass shooting in Parkland, Florida, the information war carries on. The initial focus was on the NRA lobby, but a large-scale disinformation campaign has successfully shifted the debate to claims that the student survivors are “crisis actors.”
This is the real fake news — no quotation marks are needed.
YouTube’s almost-monthly scramble to take down trending videos that perpetuate rumors and false information is over for the time being. Using several hundred “seed” videos returned from a search on YouTube’s API for “crisis actor,” I obtained the “next up” recommendations for each of the results. This generated a network of close to 9,000 related conspiracy-themed videos. Aside from the time it took to collect the data, it was a relatively straightforward process.
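For anyone who wants to replicate that collection step, here is a minimal sketch. It assumes the YouTube Data API v3, with the search endpoint’s relatedToVideoId parameter standing in for the “next up” list; the API key, result limits, and lack of pagination are placeholders rather than the pipeline actually used here.

```python
import requests

API_KEY = "YOUR_YOUTUBE_DATA_API_KEY"  # placeholder, not the key used in this study
SEARCH_URL = "https://www.googleapis.com/youtube/v3/search"

def search_videos(query, max_results=50):
    """Return 'seed' video IDs for a search term."""
    params = {"part": "snippet", "q": query, "type": "video",
              "maxResults": max_results, "key": API_KEY}
    items = requests.get(SEARCH_URL, params=params).json().get("items", [])
    return [item["id"]["videoId"] for item in items]

def related_videos(video_id, max_results=25):
    """Return videos YouTube relates to a given video (a rough proxy for 'next up')."""
    params = {"part": "snippet", "relatedToVideoId": video_id, "type": "video",
              "maxResults": max_results, "key": API_KEY}
    items = requests.get(SEARCH_URL, params=params).json().get("items", [])
    return [item["id"]["videoId"] for item in items]

# Seed the crawl, then fan out one hop: seed video -> its recommendations.
# (Gathering several hundred seeds would require paging with pageToken.)
seeds = search_videos("crisis actor")
network = {seed: related_videos(seed) for seed in seeds}
```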
I didn’t expect to be shocked when I looked at the results.
First, I want to emphasize that this is a landscape survey of the conspiracy space: the network of YouTube videos users are exposed to after searching for “crisis actor” following the Parkland event. It is not meant to be a content analysis; exactly what each of these videos contains, I’d rather not know. Ninety percent of the titles, however, are a mixture of the shocking, the vile, and the promotional. Themes include rape-game jokes, shock reality social experiments, celebrity pedophilia, “false flag” rants, and terror-related conspiracy theories dating back to the Oklahoma City attack in 1995.
The returned videos span every one of these themes and more. It goes on and on, and on and on….
Below are snapshots of the network generated from these 9,000 “crisis actor” YouTube videos. I’m not finished with this part of the analysis yet, so these are very rough drafts. But they are interesting to look through to get a sense of the quality and breadth of content that’s being hosted, monetized, and promoted on YouTube:
Every religion, government branch, geopolitical flashpoint, and shadow organization, along with every mass shooting and domestic terror event, is seemingly represented in this focused collection.
As the graph below shows, mass shooting, false flag, and crisis actor conspiracy videos on YouTube are a well-established, if not flourishing, genre. Across this network, one actor and its associated videos stand out above the rest; see the node directly below.
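To give a sense of how a graph like the one pictured can be assembled and its standout node identified, here is a sketch using networkx. It builds on the hypothetical network dict from the collection sketch above, and it assumes in-degree (how many other videos recommend a node) as a rough proxy for “standing out”; the actual figures may have been produced with different tooling and metrics.

```python
import networkx as nx

# Directed graph: an edge A -> B means video B appeared in A's recommendations.
G = nx.DiGraph()
for seed, recs in network.items():
    for rec in recs:
        G.add_edge(seed, rec)

# Rank nodes by in-degree, i.e., how often the recommendation system points at them.
top = sorted(G.in_degree(), key=lambda pair: pair[1], reverse=True)[:10]
for video_id, indegree in top:
    print(f"{video_id}: recommended from {indegree} other videos")
```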
Every time there’s a mass shooting or terror event, the subsequent backlash helps this YouTube conspiracy genre grow in size and economic value. The search and recommendation algorithms will naturally ensure these videos are connected, and thus have more reach.
In other words, due to the increasing depth of the content offerings and ongoing optimization of YouTube’s algorithms, it’s getting harder to counter these types of campaigns with real, factual information.
I hate to take the dystopian route, but YouTube’s role in spreading this “crisis actor” content and hosting thousands of false videos is akin to a parasitic relationship with the public. This genre of videos is especially troublesome, since the content has targeted (individual) effects as well as the potential to trigger mass public reactions.
The combined view count for the top 50 mass shooting-related conspiracy videos is around 50 million. Not every single video overlaps directly with conspiracy-related subjects, but it’s worth pointing out that these 8,842 videos have registered almost four billion (3,956,454,363) views.
I suggested in earlier remarks on Twitter that YouTube’s algorithm was getting “gamed,” but I’m no longer sure. The only gaming here appears to be the use of tragic events for automated content monetization. The mass shootings in particular are especially troubling: the experiences of the least fortunate among us, including tragedy survivors, children, and their families, are being used to algorithmically profit from the most impressionable.
I’m still working on this project, but since it’s an important set of “fake news” data, I’m sharing the .csv of all the videos, with names, channels, and view, like, and dislike counts. You can browse the list here:
or download the .csv file directly here: https://query.data.world/s/BIDK7G6TGs5tFmYm8pL0uevQibg0Tv
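As an illustrative check, the headline totals above can be approximated from the shared file along these lines, assuming pandas and a view-count column named views; the actual column names in the .csv may differ.

```python
import pandas as pd

# Load the shared dataset; "views" as the column name is an assumption.
df = pd.read_csv("https://query.data.world/s/BIDK7G6TGs5tFmYm8pL0uevQibg0Tv")

print(len(df))             # number of videos (8,842 in this write-up)
print(df["views"].sum())   # total views (almost four billion here)

# Combined views of the 50 most-viewed videos in the set.
top50 = df.sort_values("views", ascending=False).head(50)
print(top50["views"].sum())
```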
In my experience, all roads in the disinformation space seem to eventually lead to YouTube. This exacerbates all of the other problems, because it allows content creators to monetize potentially harmful material while benefiting from the visibility provided by what’s arguably the best recommendation system in the world.
While YouTube doesn’t need to outright censor this material, there must, at the very least, be policies put in place that include optional filters and human moderators to help protect children and other vulnerable people from it.
*This write-up marks a full-circle return to my November 2016 study. In 2018, YouTube is no doubt one of the hubs of the misinformation ecosystem and still a key enabler in the spread of micro-propaganda.