How YouTube sent one man down an alt-right rabbit hole

YouTube has faced repeated criticism in recent weeks for hosting content created by radical actors, and has also been censured for how it has handled that criticism.

The video sharing giant’s policies on harassment and hate speech and its enforcement of these policies faced public scrutiny last week after Vox journalist and YouTube host Carlos Maza expressed his frustration with ongoing personal attacks he has faced from popular fellow YouTuber Steven Crowder.

After viewing a supercut Maza compiled of Crowder’s comments, many — including Vox’s editor-in-chief Lauren Williams and its head of video Joe Posner — questioned why YouTube allowed Crowder’s videos to remain on its site, noting that the streaming service’s rules explicitly state videos that are “deliberately posted in order to humiliate someone” are not allowed. The platform’s community guidelines also state that users cannot make “hurtful and negative personal comments/videos about another person.”

YouTube officials responded to these questions with a series of decisions, as explained in Recode Daily:

First, on Tuesday, the company said a series of videos in which the conservative media figure Steven Crowder calls Maza homophobic slurs didn’t violate its policies. Then, under public pressure, YouTube said on Wednesday it would temporarily demonetize Crowder’s channel. That didn’t resolve tensions either, as The Verge’s Elizabeth Lopatto writes: “YouTube’s policies have satisfied no one in this very public debacle.” By Wednesday night, the company published a blog post explaining its reasoning for keeping up the videos but promised to take a “hard look at our harassment policies with an aim to update them.”

A number of Google (which owns YouTube) employees spoke anonymously with The Verge’s Megan Farokhmanesh, and said the company’s handling of the issues Maza raised reflects how it has dealt with the concerns of marginalized communities, both internally and externally.

“Internal outreach to executives has not been effective in years,” one employee said. “They ignore us completely unless there is extreme unrest. We can’t trust them anymore to listen in good faith.”

Another described YouTube’s decision to leave up Crowder’s videos as “the latest in a long series of really, really shitty behavior and double-talking on the part of my employer as pertains to anything to do with queer shit.”

Crowder and creators of similar content draw a large audience to YouTube, and that audience brings in money through advertising. Kevin Roose, a reporter for the New York Times, sought to understand how creators like Crowder win fans, and over the course of his reporting, found that YouTube may help to radicalize viewers through its algorithms.

Roose followed one man — Caleb Cain, now 26 — who, as he put it, “fell down the alt-right rabbit hole” and became a viewer of videos like Crowder’s.

Cain discovered the alt-right movement on YouTube while seeking community after dropping out of college; he shared a download of his YouTube history (comprising more than 12,000 videos and 2,500 searches) with Roose that illustrated how he became steeped in far-right ideology.

Cain began watching self-help videos in 2014. At that time, he identified as a liberal, and stumbled upon the work of Stefan Molyneux; in addition to producing videos containing life advice, Molyneux also creates videos with social and political commentary, arguing for increased men’s rights and a return to the sort of gender politics that were common in previous centuries.

“He was willing to address young men’s issues directly, in a way I’d never heard before,” Cain told Roose.

As he watched more of Molyneux’s pieces, YouTube began recommending other conservative and alt-right content, which Cain watched as well. Over time, he came to internalize, identify with, and believe in the points of view expressed in both traditionally conservative and more radical videos.

“When I found this stuff, I felt like I was chasing uncomfortable truths,” Cain said. “I felt like it was giving me power and respect and authority.”

Roose writes that experts he spoke with believe YouTube’s profit model and the algorithm responsible for serving Cain and others like him video after related video can inadvertently lead to radicalization: “Critics and independent researchers say YouTube has inadvertently created a dangerous on-ramp to extremism by combining two things: a business model that rewards provocative videos with exposure to advertising dollars, and an algorithm that guides users down personalized paths meant to keep them glued to their screens.”

According to Roose, videos suggested by that algorithm drive more than 70 percent of users’ time on YouTube. And it is getting better at recommending the sorts of videos that keep users watching until the end.

YouTube reportedly updated its algorithm in 2012 to promote videos viewers actually finished watching; that led to a surge in engagement, as did a 2015 change that incorporated artificial intelligence into the algorithm’s video recommendation process. In 2017, that artificial intelligence was further refined, and it learned to pull videos tangentially related to what users already liked in order to both expand their horizons and keep them watching longer.
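The core idea behind the 2012 change — ranking recommendations by how long a video is likely to keep a viewer watching, rather than by raw clicks — can be sketched in a few lines. This is a toy illustration only, not YouTube’s actual system; all field names and values here are hypothetical.

```python
# Toy sketch of watch-time-weighted ranking (illustrative only,
# not YouTube's real recommender). Each candidate video carries
# hypothetical model estimates: click probability, predicted
# fraction of the video the user will watch, and its length.

def rank_by_watch_time(candidates):
    """Sort candidates by expected watch time, highest first.

    expected watch seconds = p(click) * predicted fraction watched
                             * video length in seconds
    """
    def expected_watch_seconds(video):
        return (video["p_click"]
                * video["predicted_fraction_watched"]
                * video["length_s"])
    return sorted(candidates, key=expected_watch_seconds, reverse=True)

candidates = [
    # High click-through but viewers bail quickly:
    {"title": "clickbait", "p_click": 0.9,
     "predicted_fraction_watched": 0.1, "length_s": 300},
    # Fewer clicks, but viewers who click stay glued:
    {"title": "engrossing", "p_click": 0.4,
     "predicted_fraction_watched": 0.8, "length_s": 600},
]

ranked = rank_by_watch_time(candidates)
# "engrossing" outranks "clickbait": 0.4 * 0.8 * 600 = 192 expected
# seconds watched versus 0.9 * 0.1 * 300 = 27.
```

Under this kind of objective, a video that holds attention to the end beats one that merely attracts clicks — which is consistent with the engagement surge the article describes after the 2012 update.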

Roose notes YouTube denies its algorithm has led users to more radical videos, but changes to it may have led Cain to a new type of video: a genre of work that employs the rhetoric and style seen in the types of alt-right content Cain had come to enjoy, but that promotes ideas of the left.

Cain found himself drawn to one creator in particular: Natalie Wynn, who makes left-leaning video essays under the name ContraPoints.

“I just kept watching more and more of that content, sympathizing and empathizing with her and also seeing that, wow, she really knows what she’s talking about,” Cain told Roose.

Eventually, he was so moved by the videos of Wynn and other YouTubers like her that he rejected the alt-right philosophies he had embraced for years, and became a content creator himself, posting liberal videos of his own in the mode of the platform’s most popular alt-right figures.

Despite the rise of figures like Wynn and Cain on the platform, criticism of YouTube for hosting incendiary content persists. And the employees Farokhmanesh spoke with were not overly optimistic that things will change soon.

Senior software engineer Irene Knapp told Farokhmanesh: “The company takes half-measures, and pats itself on the back for those half-measures,” and said the problems Maza has faced “will absolutely happen again … That’s just how it goes.”

Fear is the reason alt-right figures like those Cain followed for years will remain on the platform, another Google employee told Farokhmanesh.

“Google and YouTube don’t want to take any action against any far-right channel for fear of stoking the far right to say they’re being persecuted,” the employee said. “But that strategy doesn’t pan out. They will never stop saying they’re persecuted.”
