YouTube Is the Extremist Engine That Drives Young People to the Alt-Right


“Young people are more susceptible to having their political ideals shaped. That’s the time in your life when you’re figuring out who you are and what your politics are.”— technology researcher Becca Lewis

For David Sherratt, like so many teenagers, far-right radicalization began with video game tutorials on YouTube. He was 15 years old and loosely liberal, mostly interested in “Call of Duty” clips. Then YouTube’s recommendations led him elsewhere.

“As I kept watching, I started seeing things like the online atheist community,” Sherratt said, “which then became a gateway to the atheism community’s civil war over feminism.” Due to a large subculture of YouTube atheists who opposed feminism, “I think I fell down that rabbit hole a lot quicker,” he said.

During that four-year trip down the rabbit hole, the teenager made headlines for his involvement in the men’s rights movement, a fringe ideology whose adherents believe men are oppressed by women, and which he no longer supports. He made videos with a prominent YouTuber now beloved by the far right.

He attended a screening of a documentary on the “men’s rights” movement and hung out with other YouTubers afterward, meeting a young man who seemed “a bit off,” Sherratt said. Still, he didn’t think much of it, and ended up posing for a group picture with the man and other YouTubers. Some of Sherratt’s friends even struck up a rapport with the man online afterward, which prompted Sherratt to check out his YouTube channel.

What he found soured his outlook on the documentary screening. The young man’s channel was full of Holocaust denial content.

“I’d met a neo-Nazi and didn’t even know it,” Sherratt said.

The encounter was part of his disenchantment with the far-right political world he’d slowly entered toward the end of his childhood.

“I think one of the real things that made it so difficult to get out and realize how radicalized I’d become in certain areas was the fact that in a lot of ways, far-right people make themselves sound less far-right; more moderate or more left-wing,” Sherratt said.

Sherratt wasn’t alone. YouTube has become a quiet powerhouse of political radicalization in recent years, powered by an algorithm that a former employee says suggests increasingly fringe content. And far-right YouTubers have learned to exploit that algorithm and land their videos high in the recommendations on less extreme videos. The Daily Beast spoke to three men whose YouTube habits pushed them down a far-right path and who have since logged out of hate.

YouTube has a massive viewership, with nearly 2 billion monthly users, many of them young. The site is more popular among teenagers than Facebook and Twitter. A 2018 Pew study found that 85 percent of U.S. teens used YouTube, making it by far the most popular online platform for the under-20 set. (Facebook and Twitter, which have faced regulatory ire over extremist content, are used by 51 percent and 32 percent of teens, respectively.)

Launched in 2005, YouTube was acquired by Google the following year. The tech giant set about maximizing profits by keeping users watching videos, hiring engineers to craft an algorithm that would recommend new videos before a user had finished watching their current one.

Former YouTube engineer Guillaume Chaslot was hired in 2010 to work on the team that designed that recommendation algorithm.

“People think it’s suggesting the most relevant, this thing that’s very specialized for you. That’s not the case,” Chaslot told The Daily Beast, adding that the algorithm “optimizes for watch-time,” not for relevance.

“The goal of the algorithm is really to keep you online the longest,” he said.

That fixation on watch-time can be banal or dangerous, said Becca Lewis, a researcher with the technology research nonprofit Data & Society. “In terms of YouTube’s business model and attempts to keep users engaged on their content, it makes sense what we’re seeing the algorithms do,” Lewis said. “That algorithmic behavior is great if you’re looking for makeup artists and you watch one person’s content and want a bunch of other people’s advice on how to do your eye shadow. But it becomes a lot more problematic when you’re talking about political and extremist content.”
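To make the distinction concrete, here is a minimal sketch in Python of the two objectives Chaslot contrasts. It is hypothetical: the video names, scores, and field names are invented for illustration, not drawn from YouTube’s actual system.

```python
# Hypothetical sketch: ranking the same candidate videos by relevance
# versus by predicted watch time. Not YouTube's code; all values invented.

from dataclasses import dataclass

@dataclass
class Video:
    title: str
    relevance: float                # how well the video matches what the user looked for
    predicted_watch_minutes: float  # estimate of how long the user will keep watching

candidates = [
    Video("Eyeshadow tutorial", relevance=0.95, predicted_watch_minutes=4.0),
    Video("Outrage compilation", relevance=0.40, predicted_watch_minutes=22.0),
]

# A relevance objective surfaces the video that best matches the query.
by_relevance = max(candidates, key=lambda v: v.relevance)

# A watch-time objective surfaces whatever keeps the user on the site longest,
# regardless of how well it matches what they were looking for.
by_watch_time = max(candidates, key=lambda v: v.predicted_watch_minutes)

print(by_relevance.title)   # Eyeshadow tutorial
print(by_watch_time.title)  # Outrage compilation
```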

Chaslot said it was apparent to him even then that the algorithm could help reinforce fringe beliefs.

“I realized really fast that YouTube’s recommendation was putting people into filter bubbles,” Chaslot said. “There was no way out. If a person was into Flat Earth conspiracies, it was bad for watch-time to recommend anti-Flat Earth videos, so it won’t even recommend them.”
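A toy example of that dynamic, assuming a recommender that ranks purely by predicted watch time (again a hypothetical sketch, not YouTube’s system), shows why the counter-content never surfaces:

```python
# Hypothetical per-viewer watch-time predictions, in minutes, for a viewer
# who watches Flat Earth content. Values invented for illustration.
predicted_watch_minutes = {
    "Flat Earth proof compilation": 18.0,
    "More Flat Earth interviews":   15.0,
    "Debunking Flat Earth":          0.5,  # this viewer clicks away almost instantly
}

def recommend(predictions, k=2):
    """Return the k videos with the highest predicted watch time."""
    return sorted(predictions, key=predictions.get, reverse=True)[:k]

# The debunking video never makes the cut, so the viewer never sees it:
# the watch-time objective itself walls off the filter bubble.
print(recommend(predicted_watch_minutes))
# ['Flat Earth proof compilation', 'More Flat Earth interviews']
```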

Lewis and other researchers have noted that recommended videos often tend toward the fringes. Writing for The New York Times, sociologist Zeynep Tufekci observed that after she watched videos of Donald Trump rallies, YouTube began recommending videos “that featured white supremacist rants, Holocaust denials and other disturbing content.”

Matt, a former right-winger who asked to withhold his name, was trapped in just such a filter bubble.

He described, for instance, watching a video of Bill Maher and Ben Affleck discussing Islam, and then being recommended a more extreme video about Islam by Infowars employee and conspiracy theorist Paul Joseph Watson. That video led to the next video, and the next.

“Delve into [Watson’s] channel and start finding his anti-immigration stuff which often in turn leads people to become more sympathetic to ethno-nationalist politics,” Matt said.

“This sort of indirectly sent me down a path to moving way more to the right politically as it led me to discover other people with similar far-right views.”

Now 20, Matt has since exited the ideology and built an anonymous internet presence where he argues with his ex-brethren on the right.

“I think YouTube certainly played a role in my shift to the right because through the recommendations I got,” he said, “it led me to discover other content that was very much right of center, and this only got progressively worse over time, leading me to discover more sinister content.”

This opposition to feminism and racial equality movements is part of a YouTube subculture that describes itself as “anti-social justice,” or anti-SJW, short for “social justice warrior.”

“I think the anti-SJW stuff appeals to young white guys who feel like they're losing their status for lack of a better term.”— Andrew, a former white supremacist ...