9.17.2019

Dear Educators: Algorithms are Radicalizing our Boys

Recently, I've seen several tweets from educators asking how to teach internet culture--specifically, how to help our boys avoid radicalization in online spaces. It's imperative that we address this in schools. As someone who has had his eye on this issue for a number of years, I'll do my best to help move the dialogue forward by presenting the problems in layman's terms.

[Photo: https://www.flickr.com/photos/bryanrusch/10428426095]
First and foremost, it's important to provide a basic understanding of how these online spaces work. The engine that drives the internet is "time on site," and the best way to increase the amount of time people spend online is to manipulate them psychologically into getting and staying online. That's why games and social media companies use variable rewards--just like slot machines--to hook users. Those unpredictable rewards trigger dopamine, and dopamine yields clicks. To keep you "on site," tech companies mine your data so they know who you are, what you like, and what is most likely to get you to engage. Then they feed your data into an algorithm that manipulates you into clicking on more things you might like.
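For the technically curious, here's a toy sketch in Python of the slot-machine-style variable-reward pattern. The numbers are made up for illustration--this is not any platform's actual code--but it shows the mechanic:

```python
import random

def refresh_feed(hit_probability=0.25):
    """One 'pull of the lever': a feed refresh that unpredictably
    pays off (a like, a funny video, a reply) or comes up empty."""
    return random.random() < hit_probability

# Simulate a user refreshing their feed 20 times. Most refreshes
# pay nothing, but the *possibility* of a reward on the very next
# pull is what keeps people pulling -- the variable-reward pattern.
rewards = sum(refresh_feed() for _ in range(20))
print(f"{rewards} rewarding refreshes out of 20")
```

Because the payoff is unpredictable, every refresh carries the hope of a reward--which is exactly why this schedule is more habit-forming than a fixed one.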

What tech companies have learned is that feeding people affirmation gets clicks, but feeding people fear and outrage yields even more clicks, shares, and engagement. That's how Twitter and Facebook juice time on site. Another successful model is to nudge users towards more and more extreme content; it's part of why users watch over 1 billion hours of video on YouTube every day. "Time on site" skyrockets when users fall down a rabbit hole of extreme content, and conspiracy theories work perfectly. That's why flat-earthers and anti-vaxxers have experienced a resurgence in the internet age. For more examples and a better explanation of the above, check out Guillaume Chaslot's website AlgoTransparency and listen to Zeynep Tufekci's conversation with Ezra Klein.
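To make that concrete, here's a minimal sketch of a feed that ranks posts by predicted engagement. The posts and weights are invented for illustration, but they capture the dynamic: once outrage reliably out-earns affirmation, an engagement-trained ranker surfaces outrage first.

```python
# Invented posts and weights, purely for illustration.
posts = [
    {"title": "Beautiful sunset over the lake", "affirmation": 0.8, "outrage": 0.1},
    {"title": "You won't BELIEVE what THEY just did", "affirmation": 0.1, "outrage": 0.9},
    {"title": "Five study tips that actually work", "affirmation": 0.6, "outrage": 0.0},
]

def predicted_engagement(post):
    # If outrage earns roughly 3x the clicks and shares that affirmation
    # does, a model trained to maximize engagement learns weights like these.
    return 1.0 * post["affirmation"] + 3.0 * post["outrage"]

# The feed simply shows the highest-scoring posts first.
for post in sorted(posts, key=predicted_engagement, reverse=True):
    print(f"{predicted_engagement(post):.2f}  {post['title']}")
```

No one at these companies has to decide to promote outrage; optimizing for engagement does it automatically.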

Yes, algorithms manipulate users into staying online, and the process is self-sustaining. Becca Lewis's research on "reactionary right" video creators revealed that creators are not only radicalizing audiences; audiences radicalize creators too. As YouTube's algorithm knows all too well, users want more and more extreme content--especially people who spend a lot of time getting their information on YouTube. So if a creator wants to please his audience, he, like the algorithm, has to move towards the extremes.

In the book Network Propaganda, Harvard researchers saw this radicalizing movement happen in more traditional forms of news as well. Their research showed that Breitbart's content was further right (read: more extreme) than Fox News's content, and it was getting more clicks and more shares--especially on immigration. As a result, Fox News tacked rightward to keep up. This could only happen in an age when social media drives audiences to websites, thereby affecting their bottom line. Of course, Fox's rightward tack was partially motivated by money. Even teenagers in Macedonia realized they could make a quick buck (sometimes $3K a day!) off traffic from social media by pumping out misinformation with click-bait headlines aimed at Trump voters.

In 2016, Gallup reported that Americans' trust in mass media had dropped 8 points to an all-time low of 32%. Among Republicans, trust dropped 18 points to 14%. This lack of trust in the mass media is most pronounced among young people. Perhaps that's partially why 94% of 18- to 24-year-olds use YouTube (Pew) and 59% of Gen Zers cite YouTube as their preferred learning tool (Mary Meeker). Needless to say, teenagers are at the forefront of this algorithmic radicalization.

Do our boys fall down algorithmic rabbit holes of extreme content? Yes, and yes. If you want to know more about how, I highly suggest this thread. Seriously, read it. What it fails to mention is where our teenage boys first find the media that pushes them to believe, say, or do something "edgy" or anti-PC. In my experience as an educator, that most frequently happens 1) via gaming videos on Twitch or YouTube, including the corresponding comments sections (and chat apps like Discord), and/or 2) via political videos or podcasts on YouTube. Both Guillaume Chaslot and the Berkman Klein Center have shown YouTube's tendency to push users towards far-right videos and conspiracy theories. Zeynep Tufekci found that watching even a Hillary Clinton video prompts YouTube to start recommending Trump videos and, eventually, far-right content. Then, as the thread above explains so well, once our boys parrot something they hear in this space and get pushback from a parent, teacher, or peer, that pushback often drives them further into the media diet that prompted the inappropriate comment in the first place--and further down the rabbit holes the algorithms feed them.

For example, the most famous gaming streamer on YouTube is PewDiePie. He has recommended anti-Semitic channels and has purposely provoked critics by doing a Nazi "heil" and using Fiverr to pay people to hold up an egregiously anti-Semitic sign (Vox). The way recommendation algorithms work, if PewDiePie gets even a small percentage of his 100+ million subscribers to click on those other channels, YouTube begins to connect them and to recommend those channels to PewDiePie subscribers who never took him up on his recommendations. Connecting accounts and videos like this creates the network of edgy (to say the least) content recommended to our teenagers. For older males, Joe Rogan's podcast might be the trigger that connects listeners to "reactionary right" figures like Alex Jones and Jordan Peterson--both guests on Rogan's show.
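Here's a rough sketch of how that connecting works, using hypothetical channel names and a simple co-click count. Real recommender systems are far more sophisticated, but the principle is the same: channels that even a minority of users click on together become linked for everyone.

```python
from collections import Counter
from itertools import combinations

# Hypothetical watch histories: the channels each user has clicked.
histories = [
    {"BigStreamer", "EdgyChannelA"},
    {"BigStreamer", "EdgyChannelA", "EdgyChannelB"},
    {"BigStreamer", "LetsPlayClips"},
    {"BigStreamer", "LetsPlayClips"},
]

# Count how often each pair of channels appears in the same history.
co_clicks = Counter()
for history in histories:
    for pair in combinations(sorted(history), 2):
        co_clicks[pair] += 1

def recommend(channel, top_n=2):
    """Suggest the channels most often clicked alongside `channel`.
    A small co-clicking minority is enough to link the channels
    for every subscriber who receives recommendations."""
    scores = Counter()
    for (a, b), count in co_clicks.items():
        if channel in (a, b):
            scores[b if a == channel else a] += count
    return [c for c, _ in scores.most_common(top_n)]

print(recommend("BigStreamer"))  # e.g. ['EdgyChannelA', 'LetsPlayClips']
```

Swap "EdgyChannelA" for an anti-Semitic channel plugged by a mega-streamer, and you can see how a handful of clicks ripples out to millions of subscribers.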

I assume most educators--and nearly all administrators--have no idea what algorithmic extremism is, let alone how it works on our students. Hopefully, this article gives us enough background to begin discussing the issue and bringing it into our curriculum. To address these problems, we must lead discussions with our teenagers about their media diets, and it's crucial that we be informed, honest, and open. We should identify and empower student leaders who will act as upstanders in the face of the negatives in this space. Finally, we must emphasize do's over don'ts and positives over negatives so that we avoid pushing students away from us and towards internet communities that don't share our values. Obviously, this will be challenging, but we're educators, and none of us wants to relinquish to internet celebrities or algorithms our responsibility to impart knowledge and values to our students.