‘So to Speak’ podcast transcript: What ‘On the Media’ got wrong about free speech . . . again

May 12, 2022

Note: This is an unedited rush transcript. Please check any quotations against the audio recording.

Nico Perrino: Welcome back to So to Speak, the free speech podcast, where every other week we take an uncensored look at the world of free expression through personal stories and candid conversations. I am, as always, your host Nico Perrino.

It shouldn’t be news to any of our listeners, we actually covered it in part on our last episode, but Elon Musk is buying Twitter for a reported $44 billion. This has gotten the internet up in arms. A lot of conversations about whether this will be good for free speech or bad for free speech, good for open dialogue, bad for open dialogue, good for extremism, bad for extremism on the internet.

But most relevant for us, on this show at least, is that when asked why he was buying Twitter, Elon Musk had this to say.

Elon Musk: Well, I think it’s very important that it be an inclusive arena for free speech, where – yes. Twitter has become kind of the de facto town square, so it’s just really important that people have both the reality and the perception that they are able to speak freely within the bounds of the law.

Nico Perrino: And when Twitter announced that it was being purchased by Elon Musk – and he had floated the idea, and I at least didn’t think it was gonna actually happen, and to be clear, it still might not actually happen – I have been hearing some rumblings that he might not be able to get the funding secured for it because it would require selling shares of Tesla that he can’t actually sell, so he needs to bring in other funders to help with the process.

But let’s assume for the purposes of this podcast, and as much of the internet and Twitterverse is assuming, that he is going to buy Twitter.

He said in the buyout statement as part of his statement that was released by Twitter, “Free speech is the bedrock of a functioning democracy, and Twitter is the digital town square where matters vital to the future of humanity are debated.”

He’s also said in tweets and interviews that he is a so-called free speech absolutist. Now, on the last episode, where our guests today, Nadine and Matt, appeared, we had a lot of discussion about what it actually means to be a free speech absolutist, so maybe we will cover that in today’s episode.

But the reason we’re having today’s episode is for the precise reason we had the last episode with these guests, which is WNYC’s On the Media put out an episode called Ghost in the Machine that was fairly critical of Elon Musk and of the broader support he sort of embodies for a freer and more open speech space on Twitter.

I’ve got a bunch of clips clipped up from that episode, and we’re going to respond to the arguments from that On the Media segment. To do so, as I said, we are joined again by journalist and author Matt Taibbi, former ACLU president and New York Law School professor emerita Nadine Strossen, and Carleton College history professor Amna Khalid.

Folks, welcome back onto the show.

Nadine Strossen: Great to be here.

Amna Khalid: Thank you.

Matt Taibbi: Thanks for having us.

Nico Perrino: Matt, I want to start with you because you’ve written about this a little bit. What are you making of this reported acquisition?

Matt Taibbi: Well, I think the first thing is that it’s impossible to know now what effect a potential purchase of Twitter by Elon Musk will actually have. A lot of people are purporting to know already that it’s going to mean X, Y, or Z. We have no idea. It could be good for speech, it could be bad for speech. None of that is clear.

But what is clear is the reaction of people to this idea of him purchasing Twitter has revealed a lot about contemporary attitudes towards speech and content moderation, and there are a lot of people who are reacting negatively to the idea of a single rich person buying an internet platform by saying things like, “Oh, it’s horrible that this one oligarch will have control over the speech space.”

When they’ve basically wanted exactly that for years now, and when other people have pointed out that the idea of a fully privatized speech landscape has all kinds of dangers for free speech, they’ve said the opposite. They’ve said, “Actually, no, this is a good thing. This will help us to crack down on disinformation and hate speech,” and all sorts of other things.

So, there’s a kind of hypocrisy in the reaction to the Musk purchase which I think is very revealing, especially among certain figures of the Blue Check sort of elite on Twitter. They were all sort of of one voice about this whole thing. I think the reaction by former Labor Secretary Robert Reich, that when people say they want freedom, what they’re actually asking for is freedom from accountability, I think that was a pretty common response to this news, and it says a lot about how these people view free speech in general.

Nico Perrino: Has anyone seen the, I think letter that was sent yesterday or the day before to Twitter’s advertisers from that coalition of groups, asking them essentially to stop supporting Twitter unless it retains its current moderation practices? Matt, you’re nodding your head yes.

Nadine Strossen: No, I have not.

Nico Perrino: Yes. Matt, what do you make of that?

Matt Taibbi: Well, I thought that was – again, it’s fascinating because what’s really changed about Twitter? If they’re really upset that somebody is going to censor less, or exercise less content moderation, what does that really say about what their position towards speech was previously? I thought that was very interesting.

And again, it says a lot about this new environment, where corporations believe that they have a much bigger role in policing the speech of even individuals than they ever did before.

Nico Perrino: Nadine, what are your thoughts? You’re really the one who kind of had the idea for this episode. What’s your hot take?

Nadine Strossen: Yes. Well, first of all, I do have to say it is very troubling from the perspective of not only free speech and other individual liberties, but also equality, democracy, human rights, that so much power over such significant platforms for the exchange of information and ideas is wielded by one individual. And that has been true in the past; it’s just a different individual now. It’s true for other platforms.

As the Supreme Court itself said back in 2017, it used to be subject to debate what areas geographically are the most important for the exchange of information and ideas among we the people as part of our democratic discourse. And the court said now it’s no longer debatable. It’s clear that the most important place, location, for the exchange of information and ideas is online in general, and in particular social media.

And yet they are not, as a matter of law, subject to First Amendment constraints, due process constraints, other constraints that would provide some fair and equal opportunity for people of different perspectives, different identities, etcetera, to get involved in the debate.

Now, mind you, I would strongly oppose government censorship of or controls over these private platforms, including controls over their content moderation practices. But what we have to hope for is that those who wield this power will voluntarily choose to do so in a way that respects free speech values, that promotes a free speech culture rather than a cancel culture or a culture that favors one side of the partisan debate or another on any issue.

So, in that sense, it was welcome news that Elon Musk said that he was going to adhere not to absolute, anything-goes free speech, the familiar caricature invoked by those who want to trash the values of free speech. In fact, he said that he would support on Twitter open speech within the bounds of the law.

So, here we come full circle again to our last response to On the Media. The law – First Amendment law, if we’re talking about the government – does not protect absolutely all speech, and this On the Media segment was replete with examples of the supposedly horrible, harmful, dangerous speech that would occur under an Elon Musk free speech regime that are already completely illegal under current law, such as child pornography, bullying, harassment.

So, it’s a red herring or a caricatured version of free speech that is being attacked, while overlooking the enormous positive exchanges of information and ideas that have occurred on Twitter, including human rights movements and equal justice movements that got off the ground there: Me Too, Black Lives Matter, Oscars So White, and the list goes on.

Nico Perrino: Well, I want to turn now to On the Media, but Amna, before we do that, is there anything else you wanted to add to what Matt and Nadine said?

Amna Khalid: Just for your listeners, you know, we haven’t consulted before this episode, and it’s remarkable to me how much in agreement we are, as I listen to the two of you.

Nico Perrino: That puts a lot of pressure on me because that means I gotta play bad cop here, which I’m going to do here momentarily.

Amna Khalid: Fantastic. But I do want to say one thing, which is again going back to what Matt was saying, it’s been so revelatory in terms of what the anxieties are that people have, and I think that’s what’s fascinating, is that this caricature that Nadine is talking about is such an easy crutch for people to channel their anxieties.

I think they’re well aware that these are caricatures of what they think free speech stands for, but they help aid the kind of liberal paternalism that is so rife on the left right now, and is strangely converging with the kind of paternalism that we see on the right as well in terms of what they deem is appropriate for people to read about or what they can be educated about in classrooms.

So, it’s a strange moment when we see them making bedfellows and converging on a particular point, and it took Elon Musk for that to become visible.

Nico Perrino: Well, Matt during his introduction had talked about Robert Reich’s kind of appeal that there’s such a thing as too much freedom, and both you and Nadine talked about a caricature of this. So, I want to turn to the first clip from On the Media because it is nothing if not a caricature of this discussion. This comes from the “It’s Always Sunny in Philadelphia” episode. It’s a sitcom. I forget what channel it’s on. But the conversation is between Micah Loewinger, he’s interviewing Natalie Wynn, who runs ContraPoints, which is a kind of popular left-wing YouTube channel. Let’s go to it now.

Natalie Wynn: Dennis and Mac decide to turn their bar, Paddy’s, into –

Male 1: It’s the most American bar in all of America. A place with absolute freedom.

Male 2: With no gambling restrictions.

Natalie Wynn: Anything goes. We’re gonna have women taking their tops off.

Male 1: You girls went wild.

Male 2: Way to go.

Natalie Wynn: And to them, freedom is simply the removing of restraints. We’ll have no rules, and then everything that we want to do, we’ll get to do. That’s the kind of logic.

I think that’s the kind of logic a lot of people have when they advocate for no restrictions, no rules. But of course what ends up happening is that you’re not the only one, then, who has no rules.

Male 1: That could be a bit of a problem back there, though.

Natalie Wynn: Gambling ring gets out of control, people are betting their fingers.

Male 1: Do we have any short knives?

Male 2: What?

Micah Loewinger: People are playing Russian roulette, the bizarre milk drinking –

Natalie Wynn: The McPoyles.

Micah Loewinger: Yes, they’re like an incestuous family, and they also want to cash in on the no-rules space to manifest their own dream of making out with each other.

Male 3: I heard you guys have an anything goes type situation here. Can we get a couple glasses of milk?

Male 1: What? No.

Natalie Wynn: It was a nightmare, and so they decided they had to introduce maybe a few rules.

Male 1: I think we gave people too much freedom.

Male 2: Yes, you’re right, man. I’ll call the cops.

Male 1: No, no, no, no, no. We can’t call the cops. That’s admitting failure.

Male 2: Dennis, we gave people too much freedom. That’s the problem. All they do is exploit it.

Nico Perrino: As you can see there, Dennis and Mac want to start an anything-goes bar, but they gave people too much freedom, right? And this is kind of the sort of scaremongering we sometimes see in the campus context. Like if you eliminate the free speech zone, the sky will fall, there will be free speech everywhere, and we won’t be able to control our campus. Nadine, why is this wrong?

Nadine Strossen: You’re reminding me of a speech opportunity I recently had at Emory Law School, where there had been so many suppressions of free speech in the name of protecting safety that – alleged safety, that a number of law students at this very prestigious law school decided to form the Emory Law School Free Speech Forum, and the student government association, which had the power to decide whether they had satisfied all the standards, having an advisor and whatever the other content neutral standards were, denied the request to create this group because free speech is so dangerous. Free speech is so divisive. Free speech can lead to a lot of problems.
And so that kind of reasoning, I put that in quotation marks because it’s –

Nico Perrino: It sounds like the post-9/11 Patriot Act reasoning, right?

Nadine Strossen: It’s fear mongering. It is not reasoning. And one of the dangers of this overheated rhetoric, both from the campus example that I gave, which is all too typical, and the OTM clip, is a conflation of real-world physical harm that occurs to people’s bodies, right? We’re talking about people playing Russian roulette. People slicing off their fingers.

Somehow that has become equated with encountering an idea that you find challenging, or that might truly make you uncomfortable in the sense of calling into question your most cherished ideas. That kind of discomfort – let me quote one of my heroes, Ruth Simmons, the first African American woman who was the president of an Ivy League institution. When she was president of Brown, in her opening convocation address, she said, “Education at its best is the antithesis of comfort.”

So, you know, to equate, wrongly equate physical harm and danger to whatever discomfort comes from challenging unfamiliar ideas, that itself is really, really dangerous to free speech and democracy.

Matt Taibbi: If I could, I also agree with everything that Nadine said, and also I just can’t believe how disingenuous that segment is. For a media show which – you know, these are media professionals who, whenever they broadcast anything, they have to go through a legal review or at least an implied legal review every time they go on the air. Any trained journalist knows there’s a whole series of restrictions that have always existed on the press. You know, we have to avoid libel, slander, incitement, all kinds of other things. We have to be educated in the different nuances of libel, the things that might trigger a suit even if they are legal.

Like, you know, these are things that every media professional is aware of, and the idea that someone would go on the air and say “Oh, there’s gonna be this free for all where all these things happen,” is just delusion. It’s a lie, and they know it’s a lie. It’s a straw man argument. I’m gonna hate to pull out a social media cliché, but that’s what it is. They’re presenting an image of something that has never existed in reality, not in the internet and certainly not in the news media.

Nadine Strossen: Matt, I’m gonna ask you a devil’s advocate question. So, should Elon Musk, consistent with his pledge to honor free speech within the bounds of the law, should he not allow that On the Media clip on the ground that it’s a lie, to quote you?

Matt Taibbi: You know, I don’t know. I mean, I’m more in favor of the sort of litigation-based system for all of this, which is if someone feels harmed by that speech, they can make an issue out of it. But it’s certainly deceptive, I would say, that segment. But I’m not in favor of canceling it.

Amna Khalid: I do want to come in over here and say that what they’re doing in that particular episode is very telling of what’s happening in our times, which is they’re taking a comedy sketch, which I actually think is critiquing these flat notions of free speech, and is supposed to kind of make fun of them, it’s satire and comedy about them, to point out the absurdity of them.

But they’re taking them literally and presenting them as the option. I feel like they’ve missed the entire mark about what the point of comedy and satire is, and these are the times we’re living in, where news sources are presenting that and not critiquing someone on the show for saying, “Hey, you’re being literal minded here. I don’t think we can air this. I think you need to have a more sophisticated take on this.”

Matt Taibbi: That’s such a good point. The total inability to grasp humor is so central to this whole situation, I think.

Nico Perrino: Well, we’ve been talking about how Elon said he was gonna try and move the platform towards a free speech framework that would be consistent with the law. Now, in the United States, that’s the First Amendment, right? But if you go to India, if you go to China – I don’t even know if Twitter exists in China – a lot of more repressive regimes have a lot more repressive free speech environments than the United States. What does that mean?

Nadine Strossen: I think one thing that’s very interesting is what has been happening at Facebook with its oversight board because Facebook’s oversight board is adhering to a really important recommendation that was made by David Kaye several years ago, when he was the UN Special Rapporteur for Freedom of Expression, and that is that these global tech giants should voluntarily – again, not a matter of compulsion – conform their content moderation decisions, i.e., censorship decisions, to the norms of United Nations treaties that govern freedom of speech.

And what’s interesting about that is that is the only truly global norm. Virtually every country in the world is a party to it. That’s not true of the domestic or national law of any individual country or of regional laws.

Nico Perrino: Let me push there, Nadine, because if I’m remembering kind of the UN’s treaties and its statements on free expression, they’re fairly broad, and they don’t have the centuries of jurisprudence backing them up to tell you what qualifies as incitement, what qualifies as a true threat or defamation.

So, for example, Russia has a constitution that guarantees free expression for all. North Korea has one of those as well. So, just kind of appealing to these values, without the meat on the bones that the historical jurisprudence, at least within the United States, provides, leaves a lot of play in the joints.

Nadine Strossen: That’s a really important point, Nico, and what is not very well known, including among many US First Amendment experts, is that that meat has started to be added to the bone. And when you look at the length of time – the relative youthfulness of these United Nations charters – within the past 10 years, a very robust jurisprudence has grown up, and it is very speech-protective and amazingly overlapping with robust US First Amendment principles.

And I actually did the arithmetic, and the proportion of time that there’s been speech-protective jurisprudence is about the same as under the First Amendment, where we didn’t start to develop that until the 1960s.

Let me give one example because I think many people do know that the United Nations treaties require countries to outlaw hate speech. But what is not known is that the standards and the concept for doing that are remarkably similar to US law, basically that there has to be an emergency. The speech has to directly, imminently cause certain specific harm, such as intentional incitement to imminent violence.

Now, in fairness, it’s not a coincidence, many United States free speech experts have played a big role in these UN developments, including David Kaye himself. But I think that’s an idea that has gained traction through the Facebook Oversight Board, which has struck down a lot of Facebook’s takedown decisions on hate speech and other controversial speech, saying that those blockages by Facebook are inconsistent with United Nations law.

So, I think this is a very promising development.

Nico Perrino: When I think of United Nations law, I often think of it in the context of what’s happening with Russia and Ukraine right now, and just how toothless it seems to be. Whatever you think of the United Nations, I just think of the whole infrastructure around the Security Council and what actually results in any sort of action or punitive measure against a country that violates its proposed treaties or rules. But maybe that’s different in the free speech/civil liberties/civil rights context. I don’t know. I’m not a United Nations expert like you might be, Nadine. Matt?

Matt Taibbi: No, just one thing quickly. One of the first stories I did about the content moderation movement talked about how a lot of these internet platforms have run into the problem of how they’re legally allowed to operate in countries that have let’s just say less liberal attitudes toward speech, and this has resulted in some pretty uncomfortable situations where, for instance, Facebook in Israel, the Israeli government may deem a certain site a security concern, and the next thing you know, you’re seeing thousands of sites removed from Facebook.

And I think that’s one of the things that is sort of a long-range concern for the United States, which is once you start going down the road of content moderation that’s beyond the scope of the law, and somebody starts whispering in the ears of executives at a company that oh, you know, on security grounds we need to remove this site or that site, that’s a snowball that basically never stops rolling. The tendency with these countries is that they seem to keep pressing that more and more, and I just think that’s a big danger with this situation, is that –

Nico Perrino: Yes. Do you remember – gosh, this would’ve been like 2013 or 2014 – but Mark Zuckerberg was in Europe, maybe in Germany, and Angela Merkel had a hot mic moment. This was at the time when the immigration crisis was ongoing. You had a flood of immigrants coming from the Middle East, for example, and North Africa, and there was a lot of criticism of how the European countries were handling it.

Angela Merkel on a hot mic asked Mark Zuckerberg what he was gonna do about it, and he said he’s working on it. More or less made it sound like he was going to censor some of the criticisms of how these countries had handled the situation.

So, we mentioned Facebook and its Oversight Board adding more transparency to the process. Perhaps good. But I will say this as someone who manages a communications team and has to like market and advertise, we don’t even count Facebook organic follows as members of our social media following anymore because you just can’t reach people organically on Facebook.

And this came after 2016, they really throttled your ability, if you’re just an individual user or company, to get your message out there unless you put money behind it. And if you put money behind it, Facebook can be very, very powerful. But in order to put money behind your message, you need to go through this sort of Byzantine approval process, and that might be what Elon Musk is talking about, for example, when he’s talking about authenticating humans. I don’t know. He’s got to find a way to make Facebook profitable. Or Twitter profitable, excuse me, because it’s not currently.

Nadine Strossen: Could I add something to Matt’s point quickly? Because what Matt talked about was a really important additional argument in favor of a free speech standard on Twitter, which is that you then cut off the ability of any government to put pressure through the content moderation vehicle.

And that happens not just with authoritarian governments. It happens in our country. We have politicians that are constantly putting pressure on the social media companies to take down what the politicians consider to be disinformation, what they consider to be hate speech, and guess what? It’s the ideas of the other side.

And in effect, that means that the government is delegating power that it could not constitutionally exercise under the First Amendment to pressure these companies to take down what would be constitutionally protected speech. It’s an end run around the First Amendment.

Nico Perrino: Amna, I’ve seen you nodding your head a few times. Is there something you want to add?

Amna Khalid: I’m just thinking, and this might become more relevant with the subsequent clips that you’re going to play, but to my mind, the issue isn’t what are the ideas that are going to be on Twitter now that Elon Musk is going to come on, if he comes on. It’s more about what are the kinds of algorithms that are in play to make particular voices louder than others?

And on that issue, I think the recent piece by Jonathan Haidt in The Atlantic was quite helpful in sketching out how the algorithms and the new kinds of features that were introduced, like the “like” button and the “retweet” button, have changed the scale of the number of people you can reach – and now you’ve brought up Facebook and how you can no longer organically reach people.

And I think there we might want to see some regulation and some changes, not so much in terms of who can speak and who’s going to moderate the content in terms of the ideas.

Matt Taibbi: Just quickly, to Nadine’s point before we move on, it’s absolutely true that it’s already happening in this country. It’s not just places that are more authoritarian. You know, going back to 2017, we remember the scene of the CEOs of Twitter, Facebook, Google, all dragged to the Hill, being questioned by members of the Senate. The senator from Hawaii saying, “What are your strategies for preventing the foment of discord on your platform?”

And so even as far back as then, there was this implied threat by government that if you do not get in line – they had a white paper already prepared, but with like 25 pages of new regulations and taxes that might’ve been slapped on Silicon Valley if they didn’t get in line and come up with new content moderation strategies.

And as Nadine says, this is totally an end run around the First Amendment. It’s the government exercising power, but sort of by implication and by threat, which is very dangerous, and that’s part of also this whole debate with the Elon Musk thing is because this is where we are with speech. It’s a privatized landscape. But where’s the First Amendment now? How do we protect it?

Nico Perrino: Well, I’m glad you brought that up, Matt, because that’s a perfect segue into our next clip. There have been a lot of free speech advocates, including most recently Jeffrey Rosen of the National Constitution Center, arguing within the pages of The Atlantic that we should look to the First Amendment as kind of a guiding document – or guiding 45 words, I should say – as to how we should moderate content within Twitter.

But in the On the Media segment, Micah Loewinger and Natalie Wynn talk about what that would look like and draw comparisons with 8chan. Let’s listen to it.

Natalie Wynn: But I also think that without restrictions against bullying and harassment, for example, the platform becomes unusable very fast.

Micah Loewinger: I remember I spoke to Fredrick Brennan, who was one of the admins of 8chan, who has since had a pretty massive about-face, and he told me in 2019, he said, “I was kind of aware of the political arguments that imageboard users make about free speech – you know, that it’s all just about the marketplace of ideas, and the best ideas fall out. As 8chan’s admin, I never saw any good ideas fall out. I just saw each community getting more and more extreme in their rhetoric.”

8chan was as good a free speech experiment as we’re probably gonna get for a long time.

Natalie Wynn: Yes.

Micah Loewinger: And I’m not sure what Elon Musk’s version of this experiment could possibly lead to a better outcome.

Natalie Wynn: I do think that looking at 8chan is a pretty good case study in what happens when you create an “okay, let’s just let people say anything” space. People were posting child pornography to this website on a fairly frequent basis. You know, the only people who end up using this space are kind of socially isolated, angry-at-the-world white boys in their early 20s or late teens, who enjoy the feeling of power that comes with being able to say racist things, and I think that’s not a space that most people want to use.

Nico Perrino: So, she brought up three categories of unprotected speech, including harassment and child pornography, for example. But let’s steel-man the argument, or let me reframe their argument to make it a little bit better, which is: a First Amendment-guided social media platform – is that really a platform that anyone would want to use, keeping in mind, for example, that pornography – Nadine, you’ve written a book about this – is protected under the First Amendment? Crush videos are protected under the First Amendment.

There’s a lot of speech that is protected under the First Amendment that might not make for a very usable, friendly, or civil social media space.

So, I’m just curious what other people’s thoughts are on that. Greg Lukianoff, my boss, his idea is: I understand that, but to answer some of these hard questions about what a true threat is or what incitement is, or the main thing people are concerned about, which is political viewpoint discrimination, the First Amendment has a lot to say about all of that. But it doesn’t seem to be inspiring the actual content moderation policies.

Nadine Strossen: You know, you say you’re making the opposite of the straw person argument that they’re making, Nico, but I don’t think you can escape or paper over the basic flaw in that clip, which is the assumption that there’s something somehow unique about sharing a space for communication that is governed by the First Amendment. We do that every day in the proverbial Hyde Park corner, or Central Park right outside my window, or in virtually every single media space.

The only one that is not subject to First Amendment standards because of highly criticized, justly criticized old Supreme Court decisions, is the broadcast media, ironically enough. And we seem to survive and thrive with that vibrant free speech.

Now, I would say, as many advocates of digital free speech rights argue, that in an ideal world, we would have more user empowerment, more user agency, more user freedom of choice, so that it would be easier for us to choose what algorithms are used, what content is driven to us, and in fact, we should be able to choose the filtering ourselves and take advantage of artificial intelligence to even try to create very individualized content streams.

I understand that Twitter has been making some pronouncements about experimentation in that direction.

Nico Perrino: Well, you bring up the algorithm – and I think, Amna, you had talked about this briefly – one of the things that Elon Musk wants to do is make it open source, so anyone can see how content is distributed on these platforms.

And that kind of speaks to this idea that trust in an institution is critical in order for the institution to thrive, and if people see it as being unfair, even if it’s just the perception that it’s unfair, then it’s easy to tear it down. It’s not gonna survive long as an institution. And I think Facebook’s Oversight Board is one move in that direction, and making the algorithm at Twitter open source – saying, “Here, here’s how we privilege the content, where we engage in viewpoint discrimination” – is one way to increase trust and the perception of fairness within the institution.

Amna Khalid: Can I just say also that to my mind, it’s very, very frustrating, the way in which the users are treated as if they’re – I mean, it’s below children, even. It’s the idea that ideas are contagious, and you put them out there, and they’re going to infect you instantaneously. There is no understanding that people do have agency and people do filter ideas, and people have their own ways of making sense of the world.

So, just by putting something out there, it’s not going to spread like wildfire, and indeed, this has been the case. For all the anxiety about disinformation, the studies that have been done show that, in fact, the damage is far, far less than what it is perceived to be.

So, this is an elite kind of anxiety about – and as Jacob Mchangama notes in his excellent book on free speech, this has happened every time we have a new medium of communication. So, when the printing press first started, it was like, oh my God, ideas are gonna spread and people are just gonna take them, and then it’s just gonna like take over.

It doesn’t happen that way. We need to treat other individuals as human agents who are like us. I feel like we’ve forgotten that other people, too, are like us in their fundamental humanity and the way they make sense of the world and make meaning.

Nico Perrino: I should say, the argument you made right there – and FIRE doesn’t take any position on campaign finance laws, but when there was a lot of uproar over Citizens United, that was kind of the argument that I always made, right, is that there’s something that stands between the prospective office holder and the office, and it’s the voter. And to the extent that a corporation, in this case Citizens United, which made a documentary about Hillary Clinton, or it could be the ACLU or the Sierra Club, is putting money behind a candidate or an idea, they still gotta convince people, right, with that money.

And yes, I think on the part of censors – you know, people who advocate for a more restrictive free speech environment – there’s this sense that people are just putty in the hands of other people with ideas they don’t like, that they can just be easily molded and there’s no critical thinking that can be brought to bear.

Maybe that’s true to a certain extent, but if that’s the case, then democracy really can’t work, can it? We would be ruled by Plato’s oligarchs.

Nadine Strossen: There is obviously more we can do and should do in terms of developing critical media literacy and critical reasoning and analytical skills, starting at the earliest ages. I would say even the most ardent advocate of censorship is not going to be able to root out all ideas that are potentially harmful, right, considering that that’s basically all ideas. As Oliver Wendell Holmes said, every idea is an incitement.

So, what we have to do is make people capable of, or improve their skills at, sorting the wheat from the chaff, and I think the internet presents just absolutely unparalleled resources to make that a significant reality.

Amna Khalid: And I’m having a passionate reaction to this. Sorry, Matt, I’ll pass it to you in a second. But you know, this smacks of the kind of authoritarian thinking that I know from my part of the world. It’s like we know better than you, and therefore we will decide what can and can’t be shared. That is ridiculous, and to hear that coming from the liberals, from the left, is irksome to me. It really, really gets me because it’s completely wrong-headed.

Nico Perrino: Amna, for our listeners, what’s your part of the world? Just so they know.
Amna Khalid: Pakistan. Pakistan.

Nico Perrino: Pakistan, yes.

Amna Khalid: You know, the realm of dictatorship.

Nico Perrino: Which, by the way, I made a movie that can’t be distributed in Pakistan. They have strong restrictions on like profanity and things like that, and I think they also have like a review board that has to review movies before they’re distributed within the country. There are a couple countries like that. I think South Africa, Australia, New Zealand. It’s just a big, giant effing headache, in fact, for filmmakers.

Amna Khalid: Well, actually, this is a huge problem. It’s not just like – it goes back to the heart of what we’re talking about, which is content moderation and who gets to call the shots, and who decides what’s disinformation and misinformation. So, it’s compound.

Nico Perrino: The Department of Homeland Security, apparently.

Amna Khalid: Yes, that’s why the Iraq War was founded on such a lot of good information.

Matt Taibbi: Well, actually, to that point, just quickly before we move on, firstly, going way back to that clip, I thought it was incredibly revealing that their idea of the most successful First Amendment experiment is 8chan and not the United States, for instance. Like, that’s unbelievably revealing to me.

And then the other thing is just to tie together something that Amna was saying with something that Nico was saying, you know, there is this belief that if we put a bad idea out there, it’s instantly going to infect everybody. That people are helpless to resist the charm of bad ideas.

But somehow none of these people think that there’s going to be, that those same people are going to be affected by the phenomenon that Nico was talking about, which is the loss of trust in the institution. In other words, when audiences see that a platform is putting its thumb on the scale, and maybe they’re making a decision politically in one direction or another, that inspires behavior just as much as being exposed to an idea inspires a reaction.
And the idea that people are going to be convinced by rhetoric, but not be negatively convinced by censorship, is a total contradiction and crazy to me. I just don’t understand why they never see that.

Nadine Strossen: Also, as somebody, the woman, I’m sorry, I can’t remember her name –

Nico Perrino: Natalie Nguyen.

Nadine Strossen: Right, right. On that clip, she said the only people who want to use Twitter are angry white boys who want to spout racism, right? But it suddenly popped into my head that one of the most moving, persuasive, compelling examples of counterspeech and dialogue, changing somebody’s views from hateful, discriminatory views toward completely repudiating them, is Megan Phelps-Roper, and that happened on Twitter, as she documents in great detail in her memoir, which came out a couple years ago. Unfollow: How I –

And she talks about how she had been raised in the Westboro Baptist Church, a group whose motto was – or website, www.godhatesfags.org. But they also hated basically Catholics and Jews and members of the military and anybody who wasn’t a member of the Church.

She went on Twitter in order to try to recruit followers to her church, in which she had been born and raised, and there she encountered rabbis in Israel who just started exchanging with her about the Bible verses that the Westboro Baptist Church was basing its philosophy on. And through incredibly patient, ongoing back and forth, got her to question, reexamine, and ultimately reject the ideas she had been raised in.

And that’s just one of many, many examples, and we could give so many positive examples where the actual –

Another example is how often, when people hear about or see racist or other discriminatory language online, it galvanizes them to do something, to be an anti-racist, to provide support to those who are attacked, to lobby for laws that will protect against discrimination.

So, you know, not only may it not have a negative effect, as Amna and the rest of you were saying, but it may actually have a positive impact, prompting people to question and reject hateful ideas.
Nico Perrino: Let’s go to the next clip. This clip comes from a conversation between Brooke Gladstone, who’s the main host of On the Media, and Eli Pariser, who is the head of this organization called Civic Signals. And it makes the argument that despite what the First Amendment says, from a normative standpoint, rules make people more comfortable expressing themselves. Let’s hear the argument.

Brooke Gladstone: In your research, you tease out this idea by imagining what Twitter specifically might look like if it were a physical place. It would be something like a crowded parking lot on a busy shopping day.

Eli Pariser: Yes, as opposed to, say, even like Reddit, Twitter is sort of uniquely normless. It’s very hard to figure out who’s here, what are we doing here, what are the rules of engagement. And so it’s not a surprise that the loudest and often most entitled voices get heard the most because there are no rules.

You know, communities have to have norms in order to function. One of our advisers is Nathan Matias, who has this fascinating research about Reddit where he looks at a Reddit channel where some folks saw a list of rules about how to engage, and some folks didn’t. And you might think oh, this is gonna put people off, to show them all of the rules.

Actually, the opposite was true. That especially for women and folks of color, they were more likely to engage when they saw the rules because there was some sense that number one, there are rules, and that gives a sense of organization and safety. And number two, I have equal access to them. They’re not hidden to me, and therefore I feel comfortable participating because it’s an equal playing field, or more equal playing field.

And so how we design these spaces has a lot to do with how people participate in them.

Nico Perrino: I take issue with Eli’s idea that Twitter is normless. I think Twitter very much has a set of norms that are easy to gather if you spend any time on it, kind of motivated by that idea that Twitter is not the real world. You know, there’s this world that exists with Twitter that has certain perspectives that are not the real world. So, I take issue with that.

But what about this broader idea? To the extent that you support the First Amendment or free speech, and one of the reasons you do so is the value that it amplifies voices, there is this research suggesting that rules also help to amplify voices. How should we think about that, especially in our online spaces?

Nadine Strossen: I think the call for transparency and accountability is absolutely critical, and it’s one of the norms that has been advocated by a group of digital rights advocates for many years under the banner of the Santa Clara Principles. And this is not to impose any specific content moderation obligations on them, but they at least – Nico, you say the Twitter rules are clear. The ones that I’ve read are very –

Nico Perrino: Oh no, I said the norms. The norms of the users are clear.

Nadine Strossen: Oh, the norms. The norms.

Nico Perrino: The rules are not clear.

Nadine Strossen: Touché, touché, touché. The rules have to be much more transparent, much more – again, the United Nations standard is the same as the First Amendment free speech standard, which is that they have to be narrowly tailored, and that means that they have to be understandable to a person of ordinary intelligence not only as a matter of protecting free speech, but also as a matter of due process, fundamental fairness. You have to have fair notice of what is going to transgress the rules. You have to be given an explanation as to why your post is taken down. That doesn’t always happen. You have to be given an opportunity to appeal.

I think these are the kinds of viewpoint neutral regulations that would be consistent with the kind of First Amendment free speech principles that Elon Musk pledged to in general. Obviously the devil is in the details.

Amna Khalid: I think the narrow tailoring is actually key over here because we need to accept that along with good ideas, there will be bad ideas out there. There’s no foolproof way of doing this because it’s fundamentally – the good and the bad are fundamentally value-laden, and what’s good to you may not be good to someone else.

So, the point, the narrow tailoring needs to be such not necessarily to control what you deem bad ideas, but for you to be able to put out what you deem good ideas when you want to put them out. And that, I think we often – not “we” as in – but people who are making these arguments often forget that a rule that constrains the speech that they don’t like will soon come to constrain the speech that they do like as well.

So, the alternative outcome is dangerous here. More dangerous than the outcome of narrowly tailored rules.

Matt Taibbi: I would just add that from a perspective of a journalist, having rules I always thought was empowering for us, knowing exactly where the boundaries are, knowing exactly what we can say and what we can’t say. That’s a positive in reporting because now you know exactly how far you can push things when you have to do a report, and where you’re protected, additionally.

The problem with social media is that it’s at once heavily regulated and full of rules that we don’t really know about, and also ruleless in another way. People are constantly libeling one another on Twitter, and there’s no penalty for that whatsoever.

On the other hand, you can be removed from the platform for all sorts of things without any notice. There’s no transparency whatsoever.

And so while I would kind of agree with what they’re saying, it doesn’t really apply to how the internet is run right now because the huge problem – and I’ve been interviewing people who have been removed from internet platforms for years now – they all say the same thing. Like, we get no notice, no explanation, and when we are removed, we have no way to appeal to a human being to find out what happened or to fix the situation.

That’s not having rules. That’s something else entirely. That’s a ruleless landscape. And so I kind of disagree with their premise.

Nico Perrino: Elon Musk made waves a couple, maybe it was one week ago, when he posted that graphic. Joe Rogan had a very helpful episode maybe a year or two ago, where he had, I think his name was Tim Pool, and then the main legal mind at Twitter, on to discuss the argument that Twitter discriminates based on viewpoint. And this Tim Pool guy presented some examples of what he alleged was Twitter’s left-wing bias.

And then the general counsel said, “We have to take the context into consideration.” And then he’s like, “Well, Twitter’s interpretation of the context is affected by their left-wing bias, and I would need to see an example of that.” “Well, here’s an example.”

But so you know, it’s this kind of circular deal, where you have these rules that require kind of like contextual analysis to interpret. You have that in the First Amendment, too, right? A lot of the exceptions to the First Amendment are super fact-driven, requiring you to figure out whether any given expression falls into them.

But you have guardrails, and you explain all the decisions, right? Like the people who say that courts don’t explain themselves I guess haven’t had to go through a hundred-page court decision before.

But Twitter doesn’t really have to do that. It just kind of creates the contexts and then makes its decision and says what policy you violated, but doesn’t put the facts behind it. Facebook’s Oversight Board to a certain extent does that.

And then you see the response from current Twitter employees to Elon Musk’s statement, and you’re like, “Maybe Tim Pool was right.” You know? Maybe there is sort of a bias against free speech. And then you see the Babylon Bee getting deplatformed, and the response to the Hunter Biden laptop story, and it seems to go in one direction.

Nadine Strossen: And I think that’s the most – you know, the Supreme Court has said that the bedrock principle underlying our free speech jurisprudence is viewpoint neutrality. That the decision maker may not discriminate in favor of or against particular views, ideas, messages, and that to me is the worst of what is happening with the so-called content moderation.

I say so-called because that sounds so moderate. It really is blatant viewpoint discrimination, and there is a complete unevenness in how even the comprehensive rules are enforced depending on, A, who the speaker is, B, what the idea is, and C, who has access to put pressure on the hierarchy at the social media companies. Some people get attention and others don’t. It’s arbitrary and discriminatory.

Nico Perrino: And it’s not just what’s taken down. It’s also how information is contextualized, right? We saw people posting about the COVID lab-leak theory, which, it so happens, is now one of the main theories of the United States government, having their posts tagged as misinformation.

I am a big follower of Charles Cooke, who is one of the editors over at National Review. I think his writing on free speech is some of the best out there. But he’s also like this Second Amendment guy, and he also loves roller coasters, and he writes a lot about those on his personal website. And I guess his personal website was flagged as extremist content, if you wanted to click through to it through Twitter. I can say pretty definitively, Charles Cooke is not an extremist. He’s a very moderate, Mitt Romney – although he probably didn’t – maybe that’s not the way to put it. But like old school, classical liberal conservative.

Nadine Strossen: But like all of these concepts – extremist, hate speech, disinformation – these are all inherently subjective. They completely depend on the values of whoever has the power to enforce.

Matt Taibbi: Just to cut in here quickly, I mean, I did a story this week about how Paypal, in the last couple of weeks, has deactivated the accounts of a whole series of media figures, mostly this time on the left. They have a history of doing it in the other direction, too. But it included sites like Consortium News and Mint Press, which are basically anti-war, sort of in that direction.

And again, the problem isn’t so much that they did it, it’s that there’s no explanation, and the rubric under which they make these decisions is “Well, we have a rule against false, misleading, or inaccurate information.” Well, who’s making that determination? A payment processing company? I mean, come on. If you’re gonna have people at these platforms making those decisions, especially about news organizations whose primary mission is to challenge entrenched narratives, it’s inherently going to be an uneven, unfair exercise, and it’s always gonna be subjective, as Nadine puts it. And that’s a big problem for me.

Nadine Strossen: Nico, you know, there’s a theme which may be obvious, but I think it’s worth stating, that the On the Media segment completely focused on the downsides of free speech, and they didn’t even treat them as potential downsides. As if the only thing that happens with free speech is all of these awful things: bullying and harassment and child pornography and physical violence and people cutting off their limbs and so forth.

We are talking about – even if you assume for the sake of argument that all of those asserted harms are present, and we’ve I think made a very strong case that you’re only assuming that for the sake of argument, it’s not true under the law – but even if we made that far-fetched assumption, it still would not justify rejecting Musk’s commitment to free speech because we’re talking about, well, what’s the alternative? If free speech is not the standard, in fact, if we just stick with what Twitter is already doing, that’s even worse, in many ways. Even more dangerous and more harmful than all of the posited harms from the free speech approach.

Nico Perrino: So, I don’t think we have time to go to the rest of the clips. I’d encourage our listeners to listen to it themselves. Ghost in the Machine, I believe it’s their April 29th episode. But if you all have two more minutes, I’d like to ask two more questions. I see some noddings of the heads. Beware. You don’t know what the questions are yet.

But Matt, you were talking about Paypal, right? It’s a payment processing system. There have been efforts as well to get other payment processors and credit card companies to stop servicing certain organizations. You see this in the gun manufacturing context, for example. That’s one.

Matt Taibbi: Adult content.

Nico Perrino: Adult content as well. And Nadine, you spoke to this earlier. What do we make of this idea that many people have about some of the piping of our modern internet infrastructure: ISPs, systems that prevent denial of service attacks, like Cloudflare. You remember, after the Charlottesville events, Cloudflare stopped servicing the Daily Stormer. Paypal is one. Credit card companies.

People say, well, these social media companies are common carriers, kinda like AT&T. I don’t quite see the same parallel because they do make editorial decisions to the extent you have any sort of algorithmic content moderation. Maybe that’s good or bad. But some of the piping, like can you get on the internet, can you prevent hackers from accessing your system, can I send a payment to someone for legal services, those are coming under attack now, too. And the extent to which those are undermined by any sort of ideological attack is quite chilling, perhaps more chilling than what we’re seeing on the social media front.

So, I’d like to get your guys’s perspective as to whether we should start to actually seriously consider the idea that some of these companies should be considered common carriers, just like our gas company or water company and our phone companies.

Nadine Strossen: I think that’s a very powerful argument which, interestingly enough, has been made by experts across the ideological spectrum, including even staunch libertarians. The deeper you go down into the architecture, the more compelling the argument becomes, because the less analogous they are to exercising editorial judgment and the more analogous they are to providing some critical infrastructure that is absolutely necessary to meaningfully function in the contemporary world.

Of course, again, the devil is in the details. Exactly what regulations would be enforced? But I think the basic concept, which goes back to the common law, is that when you have critical infrastructure, there is a basic obligation to treat everybody fairly and non-discriminatorily.

And I like that because that, to me, is a manifestation of the viewpoint neutrality principle, you know? No matter who you are, no matter what you believe, you should have access to these critically essential facilities for communicating.

Nico Perrino: You know, I fear that those arguments aren’t gonna get anywhere. We live in the age of the politics of expediency. It’s like, increasingly on every side of the political spectrum. Or I should – I guess spectrum, it goes two directions. I guess it could go more than that.

But there’s this like “burn it down” mentality. If it’s good for my side, then let’s pack the court. If it’s good for my side, then let’s –

Nadine Strossen: End the filibuster.

Nico Perrino: End the filibuster. If it’s good for my side, let’s leak this Supreme Court draft opinion. I just don’t see – I interviewed Norman Siegel, who used to run the New York Civil Liberties Union, and he said, “If I could have a tattoo on my body, it would go straight across my chest and it would say Neutral Principles.” But I don’t see a party for neutral principles anymore.

Nadine Strossen: Those are considered dangerous ideas on many campuses. Neutral is hate speech, I have no doubt about it, in the eyes of many now.

Matt Taibbi: Just quickly, from my perspective, I mean, I jumped from Rolling Stone to Substack, so I went from kind of a traditional corporate media organization to an independent media organization which is booming and doing great business, and I have a bigger audience than I’ve ever had before in my life. It’s been extraordinarily successful for lots of people.

But decisions like what happened with Paypal are incredibly chilling for independent journalists because it’s one thing to worry about having an article deleted from the internet. It’s another thing to have your whole business shut down. And if the threat of that is there, I know people who are already kind of retreating from where they think the line might be because that is just devastating. There’s nowhere else to go, right? If you’re gonna get cut off by Mastercard, Visa, or a company like that, there’s really no recovering from that, financially, for a lot of these folks.

So, I think that stuff is very dangerous, and that’s a place where the First Amendment is really in danger.

Nico Perrino: Yes, and you could have a platform like Substack, which is welcoming of a diverse set of views, and I forget her name, but she’s like the communications director for Substack. I’d love to have her on the podcast. But she’s like a full-throated defender of Substack’s approach to letting its writers kind of have the freedom to write.

But even Substack needs to have a way to pay its writers, right?

Matt Taibbi: Well, right. Exactly. Substack was designed specifically to avoid censorship. It was designed so that the distribution system bypasses these platforms, right? You get your content directly by email. But you still gotta find a way to get paid, and those payment companies are a weak point for speech. And people are thinking about this now. What was maybe a fantasy for authoritarian thinkers decades ago is a reality now. There are people who are actively looking for ways to get into the middle of speech they don’t like.

And as we see with the Paypal thing, or the GoFundMe episode with the Canadian truckers, it also happened with some of these independent media sites, they’re coming after speech in the same way that they’ve done it previously with things like adult entertainment and guns, as you mentioned.

Nadine Strossen: The NRA. Yes.

Matt Taibbi: Yes. And this is, I think this stuff is really scary. It’s way scarier even than the content moderation policies of Facebook or Twitter.

Nico Perrino: I do need to ask as our final question here, because this is, I think, the thing that’s motivating a lot of people’s opposition to Elon Musk, which is the Trump ban. It’s a third rail in a certain sense, and I see Amna kind of exhaling there.

Amna Khalid: Listen, I just want to be very frank, right? Like, I’m no fan of Trump’s, but we need to think, if we’re concerned about the effects on society, and we think about what’s gonna happen when you ban people, you need to also recognize that these voices go underground. They go underground, there are other platforms they go to where they’re less policed, you’re less aware of them, and then you’re suddenly surprised when someone like Trump gets elected. You’re like, “Oh, I had no idea people were thinking this.” Well, because you completely created a wall so that you couldn’t hear them, and now here we are.

So, to my mind, no fan of Trump’s, but let’s think. If we’re going to think about the consequences for our society, the consequences aren’t within these like walled, tiny silos. The consequences are all over.

So, the argument is the same as the argument for – Nadine will attest, you know, for Nazis being able to march in Skokie. It’s like if you’re not gonna let them do that, then we know it goes underground, and then it comes out worse.

So, it’s simple, in my mind. It’s not a complicated question, and this wasn’t to you, Nico, I mean, I was just saying I was getting frustrated with the question in the wider public sphere about Trump.

Nico Perrino: Yes. Well, it also speaks to the question we discussed earlier about fairness and trust in the institutions, right? To the extent you have 50% of the electorate, more or less, that supports Trump, banning him from one of the greatest social media platforms, regardless of whether you think he should be banned because he met the incitement standard or whatnot in his January 6th speech –

Nadine Strossen: Which was enforced, as with all the so-called standards, in a very inconsistent way. But one point I want to make about Trump, completely agreeing with Amna, but adding that people have to remember that freedom of speech includes not only the right of the speaker, but the right of the audience members.

You know, Amna and I want to hear what these important politicians have to say. At that point, he was still the duly elected president of the United States and commander-in-chief. I wanted to hear what his messages were.

And by the way, in terms of consequences, who knows? A number of political analysts surmise that part of the reason Trump was defeated last time around was precisely because those traditionally Republican suburban voters heard what he said on Twitter and Facebook and didn’t like it, so they decided to vote against him. So, again, it shows the complexity of human reactions.

Going back to a point Amna made earlier, that we can’t just have this simplistic “Oh, you hear Trump, you’re gonna immediately become an acolyte and support and imitate everything that he does.” One final point. When Trump got deplatformed, my former ACLU colleagues issued a statement that I thought was really powerful. They said, “You know, Trump will have alternative platforms and outlets, but we are really concerned about the people we represent. The marginalized, discriminated, voiceless people who really are especially dependent on these social media platforms.”

And I want to come back to a point I made right near the beginning, that with all of the overemphasis on the negative stuff that’s happening online, think of all of the positive movements. If you are the most ardent social justice champion, your movements got off the ground because of Twitter and #BlackLivesMatter and #MeToo and others. These were movements that had struggled for years and didn’t gain any traction, and Twitter finally gave a voice to the non-Donald Trumps of the world.

Matt Taibbi: I would just add, I had the same reaction to the removal of Trump that Bernie Sanders did, which is if there’s somebody who can turn off the president, a sitting president of the United States, who are those people, and aren’t they really running the country? I mean, that was kind of a scary moment. And again, I think for a lot of people, the shock of that was more powerful than any statement that Trump could make. So, to me it’s dangerous to even go there. Yes.

Nico Perrino: Bernie Sanders is funny on free speech issues because he always kinda has – he has kind of like this old school liberal mentality. You see modern progressives making certain arguments calling for more censorship, and we saw this in the campus context. He heard these arguments, and he’s like, “Why would we ban them? What are we afraid of? Their ideas?”

Nadine Strossen: Their ideas. Right.

Nico Perrino: And it’s just like, he just doesn’t understand the argument because he’s used to the 1960s and ‘70s, like kinda free speech culture, which I always find kind of funny. And it’s true, we’re like, are you afraid that you can’t win in the battle of ideas?

But I will say this, just kind of as a small story before we sign off, Nadine, your point about the rights of the listener is super important. I grew up and came up through high school during the time where like the four horsemen were popular. Christopher Hitchens, Daniel Dennett, Sam Harris, and – why am I forgetting the fourth one? Over there in Europe. Richard Dawkins, excuse me. And one of the shaping – and I kinda became like an anti-theist in their movement, and my opinions have changed over time because –

Amna Khalid: Because you’re human. You’re human. You changed your mind.

Nico Perrino: Intellectual journey doesn’t end when you’re 22 years old, surprise surprise. But there was this awesome documentary about Christopher Hitchens and this evangelical pastor from Idaho, Douglas Wilson, who’s like very evangelical, opposed to everything I believe. But they both have a commanding knowledge of Scripture, and so they made this documentary where they went on a road show debating Scripture. At Indiana University, where I went to college, Douglas Wilson was coming to speak. This is around the time that Hitch passed away.

And I was so excited for the opportunity, because I was going to get to do what Christopher Hitchens did. I was going to get to interrogate Douglas Wilson. And it was my first-ever in-person experience with a mob shouting down a speaker. Douglas Wilson didn't really get to speak. I didn't get to ask him any of my questions.

So, when people were talking about the event, they were talking about his right to speak. They weren't talking about my right to hear someone I had watched debate one of my heroes, and my chance to stand in the position of one of my heroes, Christopher Hitchens.

And Douglas Wilson had this great line. He said, "I always think I'm right, but I don't think I'm always right." And mob censors, I think, have this sense of their own infallibility, that they are always right, and that therefore they can determine for everyone else what is true and false at the point of censorship.

We have to leave it there.

Amna Khalid: Can I say one thing that actually came up?

Nico Perrino: Go ahead, Amna.

Amna Khalid: And I think it's worth noting, since Nadine brought it up, this idea that content neutrality is somehow seen as a bad thing. It's happening a lot on campuses: content neutrality is seen as a product of white supremacy, as is free speech itself. It's seen as the weapon of the people who want to oppress.

I just want to contest that and say, for those who see these as specifically white and Western concepts, there is a long, long history of the appreciation of these values across the world, in many, many societies. So, those who claim that these are white supremacist values are coming from a deeply ahistorical point of view. They need to go back and read some serious history and recognize that there are some universals worth upholding.

And even people of color and BIPOC people have historically seen the value of those things.

Nadine Strossen: I would say especially, not even.

Amna Khalid: Yes.

Nadine Strossen: Especially. Frederick Douglas said it best, and he was somebody who very strongly advocated for the right to receive ideas. He said, “Slavery could not survive free speech. Five years of its exercise would banish every auction block and break every chain in the South.”

Nico Perrino: Yes. Amna, you're talking about these being universal values. I think the argument that these are white supremacist ideas is also a very American-centric viewpoint that kind of erases the rest of the world, where these ideas are essential to securing rights that we in America might have right now but that people elsewhere don't.

So, I appreciate you guys staying on 15 minutes past the hour. We've got to wrap it up there. I hope to do it again soon. Well, actually, I don't, because that would mean On the Media did another one of their episodes making arguments for censorship under orchestral music, and I don't really want that.

Thanks, guys, I appreciate it.

Amna Khalid: Thank you, Nico.

Nadine Strossen: Thank you all so much. Great seeing you.

Matt Taibbi: Thank you. Take care, Amna, Nadine.

Nico Perrino: That was Matt Taibbi, Nadine Strossen, and Amna Khalid responding to an April 29th radio segment from WNYC’s On the Media. The segment was called “Ghost in the Machine,” and it will be linked in the show notes for anyone who has an interest in listening to that full episode.

I should also mention that the documentary I referred to, between Christopher Hitchens and Douglas Wilson, was called "Collision." Highly recommended for anyone interested in debates over Scripture.

This podcast is hosted, produced, and recorded by me, Nico Perrino, and edited by Aaron Reese. You can learn more about So to Speak on Twitter at twitter.com/freespeechtalk. We also now have an Instagram account, also at the @freespeechtalk handle, and we're on Facebook at facebook.com/sotospeakpodcast, although we're not reaching you there organically for reasons we discussed earlier in the episode.

We take email feedback at sotospeak@thefire.org, and you can watch this episode on YouTube at our new So to Speak channel. It’s no longer hosted on the FIRE channel.

If you enjoyed this episode, please consider leaving us a review, and until next time, I thank you all again for listening.