YouTube can remove Jordan Peterson, RFK Jr. interview, but should it?

Deplatforming public figures is no win for discourse.
Psychologist Jordan B. Peterson (left) and Robert F. Kennedy Jr. (right) speak during an interview that YouTube removed from Peterson’s channel.

On June 18, psychologist and political commentator Jordan Peterson and 2024 Democratic presidential candidate Robert F. Kennedy Jr. tweeted that YouTube removed an hour-and-a-half long conversation between them from Peterson’s channel.

YouTube offered no explanation at the time of the takedown, but the next day a spokesperson claimed the platform removed the video because it violated YouTube’s vaccine misinformation policy. The spokesperson told CNN that YouTube “does not allow ‘content that alleges that vaccines cause chronic side effects, outside of rare side effects that are recognized by health authorities.’”

As a private company, YouTube is allowed to do this. Rightfully, those who manage the platform have no legal obligation to allow any particular content to remain there. And notably, YouTube doesn’t even pay lip service to free speech in its policy commitments.

However, the conversation shouldn’t end there. Just because censorial action is lawful doesn’t necessarily mean it’s productive for public discourse. YouTube itself claims to have “a responsibility to support an informed citizenry and foster healthy political discourse.” But the way it conceives of carrying out this responsibility leaves a lot to be desired. 

A subsection of the platform’s “Supporting Political Integrity” webpage states:

[W]e remove policy-violative content, raise authoritative news sources, reduce the spread of election-related misinformation, and provide a range of resources for civics partners such as government officials, candidates, civics organizations, and political Creators to ensure a broad range of voices are heard. 

But can supporting “an informed citizenry,” fostering “healthy political discourse,” and ensuring “a broad range of voices are heard” be squared with disallowing any perspective that differs from that of “health authorities”?

A “healthy political discourse,” as YouTube acknowledges, means ensuring “a broad range of voices are heard.” If a political candidate who commands the support of more than 15% of Democrats, and who holds a higher favorability rating than either major party’s presumptive nominee, falls outside the acceptable range of voices, YouTube’s conception of “broad” seems conspicuously narrow. 

Especially given that Kennedy is a major public figure, the public would be well served to have the opportunity to confront his claims concerning vaccines and other subjects. This contributes to fostering an “informed citizenry.” 

But YouTube doesn’t give audiences that chance. Instead, it infantilizes viewers, treating them as if they’re incapable of hearing a given perspective without instantly adopting it.

Fortunately, YouTube is not the sole arbiter of media content. Anyone interested may still watch the interview on Twitter or listen to it on podcast platforms. And, apparently, many are interested: Peterson’s and Kennedy’s tweets linking to the video have garnered 4.2 million and 4.6 million views, respectively. 

This casts doubt on whether YouTube’s decision to remove the interview is even strategically effective if it truly hopes to dissuade people from encountering “harmful” views.

Kennedy’s detractors often describe him as a conspiracy theorist, and YouTube’s policy states that it aims to combat “harmful conspiracy theories.” If this goal contributed to the platform removing the video, the action was particularly ill-conceived. Conspiracy theories often rest on the notion that cabals of well-connected conspirators wield institutional power to stop average people from recognizing “the truth.” YouTube should critically consider whether a person prone to conspiratorial thinking would be more or less likely to believe a given theory after witnessing a powerful corporation censor someone who expresses it. 

In less extreme terms, viewpoint-based censorship necessarily places a thumb on the scale for some viewpoints and against others. Even once the thumb is removed, we shouldn’t be surprised if the weight swings back toward the speech that was suppressed. And we shouldn’t underestimate the power of the “Streisand effect,” people’s reactive desire to seek out information they’re not “supposed” to see.

So, where does that leave us? How can we reliably identify misleading and deceptive information so as not to place our trust in it?

Ironically, some of the answers YouTube itself provides on its “Media Literacy” page aren’t half bad: 

“[W]e encourage you to ask yourself some questions before you believe everything you see online,” says YouTuber Coyote Peterson in a clip representing the platform. “Like, ‘Why was this video made?’ and ‘Who made it?’ And, ‘How do I know the information is true?’ ‘Where else can I check to make sure it’s right?’”

“Remember,” he says, “you can always play detective and check more than one source.”

Instead of playing content cop, YouTube should give its viewer base the chance to heed the platform’s own advice. Otherwise, it’s setting the precedent that it, not they, knows best what’s true and what’s false, what’s harmful and what’s helpful — undermining its own assertion that we each should take it upon ourselves to examine media with a critical eye.
