‘So to Speak’ podcast transcript: What does the First Amendment protect on social media?

Note: This is an unedited rush transcript. Please check any quotations against the audio recording.

Nico Perrino: Welcome back to So To Speak, the free speech podcast, where every other week, we take an uncensored look at the world of free expression through personal stories and candid conversations. I am, as always, your host, Nico Perrino. And today I wanna talk about a Texas bill that was signed into law late last year that seeks to do two things. First, it seeks to prevent viewpoint discrimination on social media platforms. And second, it seeks to compel certain disclosures of the platform's content moderation practices.

The new law has been wrapped up in court and hasn't actually gone into effect yet. But after suffering some initial legal setbacks, it recently was upheld by the Fifth Circuit Court of Appeals, in a ruling that many found surprising, and that diverges from an 11th Circuit ruling on a similar social media law in Florida. The case is arguably setting up the most consequential Supreme Court First Amendment case since New York Times versus Sullivan, which was decided in 1964. And as Charlie Warzel of The Atlantic recently put it, “The case is a battle for the soul of the internet.”

Now, before I bring in our two esteemed guests for today's show to break this all down, I wanna set a foundation for our listeners by reading you all some relevant portions of the Texas law. The law regulates social media platforms with 50 million or more users, and it has two main provisions. First, it says that a social media platform may not censor a user, a user's expression, or a user's ability to receive the expression of another person, based on the viewpoint of the user or another person. And in this context, it defines "censor" as meaning to block, ban, remove, deplatform, demonetize, deboost, restrict, deny equal access or visibility to, or otherwise discriminate against expression.

And the second thing the law seeks to do is to empower social media companies, or force, I should say, social media companies to disclose how they curate and target content to users, how they place and promote content, how they engage in content moderation, how their search functions work, and more. And in addition to these disclosure requirements, they must file a biannual transparency report on various content moderation practices, and set up a complaint and appeal system for when users’ content is moderated or removed. So, with that foundation set, joining us today to discuss and debate the Fifth Circuit ruling that upheld this Texas law is Brad Smith.

Brad is the founder and chairman of the Institute for Free Speech, which filed an amicus brief in support of the Texas law. And also joining us is Ilya Somin. He's a professor of law at George Mason University, and he has written commentary critical of the law and the Fifth Circuit's ruling. Gentlemen, thanks for coming on the show.

Brad Smith: Thanks.

Ilya Somin: Thank you very much for having us.

Nico: Brad, let's start with you. What's your read on the Texas law and the Fifth Circuit's decision?

Brad: Well, I think first we need to recognize that the question of social media here raises a number of very important free speech issues. I'm not sure that they're First Amendment issues; that's a tougher question. But we have not before seen something like this, where businesses that exercise a fair amount of influence in the dissemination of public ideas have decided that it's their mission to in fact limit the dissemination of those public ideas. And the opinion of the Fifth Circuit kind of provocatively calls this censoring and uses that term over and over.

But it's not government censorship. And I'm not sure it's helpful to keep calling it censorship in quite the way the court does, but it is censorship. It is an effort to limit opinions simply because people in the platforms don't like them. And one can argue about whether or not they're monopolies and so on. I don't really think that's terribly relevant. I think it's clear to everybody how many people rely on a small handful of platforms for news, to get their message out, and how difficult it is to navigate around them. So, for example, I know one libertarian scholar, who I won't name because I don't think he's really wedded to this position. But he's raised it, and I think he finds it at least somewhat persuasive.

And I don't know that I agree with it totally, but I think it's at least, again, somewhat persuasive. He analogizes it a little bit to the situation for African Americans traveling in the Deep South in the '50s. They could travel in the Deep South, they could. But it was very difficult for them to exercise that right because of the action taken by private companies with no monopoly power at all. And that might be an applicable type of analogy here. So, I think, first of all, we have a big speech issue. And that is potentially, at least, a compelling government interest in fostering freedom of speech.

The second thing then is the interest of the companies, which is rather interesting as a speech case. And I think the court does a pretty good job of laying out some of the problems here. The companies are not being deprived of their right to speak; they can say whatever they want on their platform or anywhere else. Nothing in the Texas law prohibits that. By the way, that's a big difference from the Florida law that was struck down by the 11th Circuit. Nor are they being required to speak in ways that we usually associate with the compelled speech cases.

They're not being required to give a statement like in Barnette; they're not being required to put a statement on their car like in Wooley v. Maynard. They're not being required to include somebody else's message in their message to their customers, as in Pacific Gas and Electric. What they're really saying is simply that they don't want to have their property used to distribute certain messages. And that really sounds much more to me like a property rights claim, which, as a libertarian, I think property rights are really important. But the Supreme Court has not thought they're important for almost 100 years. We have a lot of water under the bridge in the way we need to look at things.

So, it's an odd sort of First Amendment claim, to suggest that they're being compelled to speak in some way. And I guess the third thing I would say, just right off the top, just to kind of set the stage, and then not try to monopolize our little conversation here. But there's a lot of discussion I see about, are the platforms more like a newspaper? Is Miami Herald v. Tornillo the applicable precedent? Or are they more like a broadcast or cable network? Is Turner v. Federal Communications Commission the applicable precedent? Or are they more like a telephone company?

And I think the answer to that is, they're like internet platforms. They're not really like any of these analogies. These analogies are helpful. And as lawyers, we use analogies, but all of these run out. This is kind of a new type of thing. And we think about how much it's changed our world in, really, 20 years. I think it's worth keeping that in mind. And for that reason, just to wrap up, I was kind of pleased to see the Fifth Circuit say, "Look, we're not gonna hold this facially unconstitutional. We want to allow, perhaps, at least some experimentation and see how some of this actually impacts the platforms and other users and so on." In practice, that's maybe the wisest part of the decision.

Nico: Well, Brad, let me ask you because that was one of the curious things, speaking of the facially unconstitutional question about the decision, is it seems to reject, under Article Three, the courts’ ability to review facial challenges altogether. What do you make of that?

Brad: Well, the opinion's kind of interesting. It's got some parts where it, if not overreaches, at least puts things in a very tough tone. One commentator described it as the audition of Andy Oldham, the judge who wrote the opinion, to be the first appointee to a DeSantis administration Supreme Court. It's got some parts as well that, for example, attack the Lochner era.

And actually, one of the major efforts of conservative jurisprudence over the last 25 years has been to at least somewhat rehabilitate the Lochner era. So, it's interesting to see him slam into that and say some things that may be a little bit stronger than they needed to be for the opinion to go forward. But let's put it this way, there's no doubt that he wrote some things to catch people's attention.

Nico: Sure did. Ilya, what's your sense of the case?

Ilya: I think this is a terribly written opinion that teems with lots of horrible arguments. And at times, it is actually Orwellian. As Brad mentioned, the opinion describes as censorship what the platforms do in choosing what messages they wanna carry and which ones they don't. And it describes as free speech Texas' efforts to coerce them into carrying messages and posts that they don't want to have. That's obviously the opposite of the truth. What is censorship is when the government tries to control what private actors convey on their platforms and their property.

On the other hand, it is in fact the exercise of free speech when private actors exercise editorial judgment, in deciding what messages they want to convey and which ones they don't. And that's true, even if the decisions they make are sometimes illogical and flawed. I don't myself agree with all of the content moderation rules that Twitter has or Facebook has, and sometimes their algorithms work in a kind of stupid way. But that's the nature of freedom of speech. I also don't agree with all of the New York Times editorial decisions, or Fox News’, or others.

And it is common for media platforms of many different kinds, going back decades and centuries, newspapers, TV stations, radio stations, and many others, to allow some messages but not others. Now, it is said in the opinion and by some other commentators, well, this is different because they're not exercising as close a control, or are not as intimately connected. But there are lots of media platforms which traditionally allow a wide range of views, but not an infinite range. And they are sometimes, perhaps, inconsistent about which ones they allow.

The New York Times, The Washington Post, Fox News all allow publication of various opinion pieces that go against their editorial line. But they will forbid ones that they feel go too far, or are offensive or problematic in various ways. I think the same thing is true with Twitter and Facebook. Both in the opinion and by outside commentators who support it, there is this notion that this is different because these entities are some kind of a monopoly. Nothing could be further from the truth. If you look at survey data, many more Americans get political news and commentary from TV news or from traditional media websites.

About two-thirds of Americans say that they get political news and information from both of those sources. Fewer get their information from social media, only about 50 percent. And of those who get information from more than one kind of source, only 11 percent say they prefer social media. That's from a recent Pew Research Center study. So, there is lots of competition between social media and other forms of information provision and forms of political expression. It's not even necessarily the case that these platforms, Twitter, Facebook and so on, have a true monopoly over social media provision of political speech.

These entities themselves displaced previous big firms that were seen at the time as possible monopolies, like MySpace. If consumers become dissatisfied with Twitter and Facebook, they too might suffer in competition. In the case of Facebook, there's already a lot of evidence that people younger than my generation, and I'm already kind of getting old, don't use Facebook nearly as much as people my age or older do. And Twitter, by the way, is already used by only a relative minority of Americans, and even fewer regularly check it and post on it.

I would also add that here, unlike in the common carrier analogy that we might talk about later, part of the service that these firms are providing actually is moderation. And it turns out that most consumers actually don't want a completely unmoderated space, or one where the only moderation is that which is required by federal law, as under the Texas law. Even Donald Trump's Truth Social reserves the right to moderate and remove what it considers offensive speech. And the same thing is true for other conservative alternatives to Twitter and Facebook, like Parler.

So, this decision could go through, though I expect that it will more likely than not be overruled by the Supreme Court. I think the court has already signaled that they hold that view, or at least a majority of the justices do. But if it's allowed to stand, it would pose a serious threat to freedom of speech, because it'll allow government to impose similar kinds of restrictions on other media which they consider big, or influential, or problematic. The last point I wanna take up, and I know I've probably gone slightly long, is that Brad mentioned the issue of property rights. I actually do think that there is a constitutional property rights violation here.

And that's true not just under what libertarians might want, but under current Supreme Court takings precedent. The Cedar Point decision from last year says that it's presumptively a taking whenever the government requires even a temporary physical occupation of private property. In that case, the property was land. Here, the state of Texas is compelling an occupation of private property, media websites, by speakers or users that the owners of the site don't wanna have there.

You can say it's not physical property, but the Supreme Court has in fact said that intellectual property gets takings protection under the Fifth Amendment. Moreover, there is even a purely physical occupation here because the information and data generated by social media sites is actually stored on physical servers. And the more users there are, the more of that storage space has to be used. So, there is a physical occupation here as well. And so, in my view, there is a violation of the takings clause here, as well as a violation of the First Amendment.

That takings clause issue has not been litigated yet in either the Fifth Circuit case or the 11th Circuit case. But should the free speech argument be defeated, I would expect this takings issue to come up. And to bring it up does not require going back to the Lochner era or doing anything of the sort. It just requires applying the Supreme Court's precedent from just last year, which I think was correctly decided, by the way.

Brad: Nic, let me raise a couple points here. So, first, Ilya says this is not censorship. Well, it is censorship. It's not government censorship, but it's censorship. There's no doubt about that. And what the platforms are saying is exactly, "We want the right to censor speech on our platforms." And I think we just need to keep that in mind. Because when we get into thinking about this, libertarians have long – let's put it this way, what is the purpose of libertarianism? Is it to promote freedom, or is it to promote a really tiny little government, or maybe no government? I think it's to promote freedom.

Now, for most of the last hundreds of years, libertarians have seen government, quite correctly, as the primary threat to those freedoms. But it's never been lost on thoughtful libertarians that the private sector can infringe on freedom. Certainly, John Stuart Mill, in On Liberty, wrote at great length about the private censorship of speech, which I think he viewed really as a greater problem than public censorship of speech. I'm not sure he was right on that, but I'm not sure he was categorically wrong at all times, in all places, either.

So, I do think we need to kind of keep that notion in mind when we think about what the purpose is of what we hope to do here. Now, thoughtful libertarians are gonna be very, very uneasy about this because we do recognize the tremendous power of government, and we recognize it can be misused. We recognize that, historically, this has usually been the big enemy. So, I don't think people would be wise to jump into inviting the government into an arena like this. But I do think we need to kind of keep those two facts in mind.

What the platforms want to do is censor speech they don't like, and then we need to think about what is the best way to actually promote freedom? Not only in this case, but in the long term. So, Ilya points out 50 percent of Americans get their news from Facebook, certainly some of their news, 50 percent? I’m like, “Wow.” –

Ilya: From social media, generally, not just from Facebook.

Brad: Well, the last number I saw, Facebook was up in the mid 40s. So, I thought maybe you'd seen more recent numbers [inaudible] [00:17:18]. That's a heck of a lot of people getting their news from one source. I don't know that we've seen that since the days when there were simply three broadcast networks on the airwaves. Not all their news, but a substantial part of their news. Again, I think the whole question of whether they're monopolies or not is kind of irrelevant. Like I say, if you look at the old discrimination cases, Ollie's Barbecue is not a monopoly. It's a tiny little business.

And the bigger question is, what is the overall effect on the ability of people to speak? And we know, again, that it does matter. We know that the Hunter Biden story was buried, and we at least can think, from certain polling that has been done, that it definitely influenced the outcome of the last election. Now, the fact that something influences the outcome of an election isn't a reason to ban it, or require it, or anything else. I mean, that's why people engage in speech, to try to influence the outcomes of elections. But it does indicate the importance of what is going on here.

And that effort was not an effort to influence the outcome by speaking or persuading; it was an effort to influence the outcome by discouraging ideas from getting into the public sphere, by hiding them from the public sphere. So, I think that we need to think about that a bit more. In terms of the similar restrictions and so on, again, the opinion begins by saying, "We start with the original meaning." Well, that's probably good for the Supreme Court. I'm not sure that's always good for an intermediate appellate court, which probably should start with Supreme Court doctrine on what the original meaning is, even if they would like to go to the original meaning right away.

If there's no doctrine on point, maybe you go back to the original meaning. And here, like I say, maybe none of the doctrine exactly fits. But I do wish they focused a bit more on some of the doctrine. And again, a lot of the doctrine has gone through – we have lots of restrictions on companies that force much more compelled speech. So, we have Turner Broadcasting versus FCC, which says it's okay to force cable operators to carry certain broadcast channels. We have Pruneyard saying it's okay to have a state statute that requires a mall to allow people in the mall to set up sort of a little public table and express their point of view.

We have a lot of cases like that. And I think, as I've said a couple times, I don't think the analogies all go that far. I would say, in my mind, the best analogy is this. Way back in the day, back before you were born, Nic, and maybe even Ilya, but probably not. If you wanted to produce something, you would go down to a print shop; they would have names like Instaprint, and PIP, and Quick Print, and things like that. And you would give them the stuff and they would print it up for you. I don't remember there ever being an issue where any of those businesses would not serve somebody because they didn't like the ideas being printed there.

And to me, that's really what these platforms are. They're quick print shops, right? They don't exercise editorial discretion. They say right up front, "Look, we pretty much want everybody here." In defending themselves on challenges to 230, and I hope the listeners at least kind of know what 230 is, because otherwise we're gonna get into a long, long discussion explaining 230. But 230 is a law that insulates, obviously, the providers from certain libel suits. And to maintain that protection under 230, in just sort of a lobbying position, they steadfastly argue that they are not publishers of the material.

And that's what 230 says: they are not gonna be treated as publishers for things that are put on their platform by other people. So, you've got this kind of dual action going on here. And again, I don't think we can kind of just rather simplistically state that clearly the forces of freedom are on the side of the tech companies here, and clearly the forces of evil are on the side of the government here. And I think to call this all Orwellian is to just dismiss the serious complexities of the issues that people have and that the courts deal with.

And I'll close this thought with one fear: my big concern is that the Supreme Court's gonna feel they have to take this case. And maybe that's right, it probably is. I do know that there are pretty significant differences from the Florida law. So, it's not entirely clear you truly have a clean circuit split, but they're probably gonna feel like they have to take this case. It's of national importance. And the fact is, I don't think they're ready. I don't think they, or most of us, those of us who are sitting here commenting like we know so much, have really thought this stuff through enough to have a real sense of where the values lie, and which side is ultimately the one that does the most to protect our freedoms.

Nico: No, Ilya, I’ll bring it over to you in a second. But I wanna go back to what the court says in the majority opinion. They say, and they're kind of echoing what you're saying there, Brad, that the platforms are exercising no editorial discretion. They say the platforms are nothing like the newspaper in Miami Herald. Unlike newspapers, the platforms exercise virtually no editorial control or judgment. The platforms use algorithms to screen out certain obscene and spam-related content. And then, virtually everything else is just posted to the platform with zero editorial control or judgment.
I wanna push back on that a little bit because if you actually read the law and read its definition of censor, the law recognizes that they're doing something editorial, right? They say, “Censor means to block, ban, remove, de-platform, demonetize, deboost, restrict, or deny equal access or visibility to, or otherwise discriminate against expression.”

Which is sort of speaking to what the platforms are doing. And kind of what gives you a competitive edge, if you're TikTok, over Facebook, over Twitter, is how the platforms determine what content gets elevated or put in front of you versus what doesn't. Do you not see that as a sort of editorial judgment on how to make a platform perhaps more engaging to its end users?

Brad: Yeah. Well, a few thoughts. First off, not to directly answer, but one other thing that the court does is essentially say there's no First Amendment right to editorial discretion, really –

Nico: Yeah, I was gonna bring that up.

Brad: I disagree with the court on that. But as to what the platforms are doing, I think it illustrates a little bit the wisdom of the court in A) not making it a facial challenge, and B) it goes to the point I just made of thinking about this a bit. I don't think right now people quite know how you get at the idea of algorithms doing this kind of work. Because obviously, the algorithms are overlaid with these kinds of seemingly almost random decisions, made on small amounts of material, to take them down at different points in time.

So, it may be that these kinds of statutes need to be worked out over time, and that they need to be a lot better. When we were considering where to go on this case, at the Institute for Free Speech, [inaudible] [00:24:08] said, “These are bad statutes.” And a point I finally made was, “Look, they’re the statutes we've got, they’re the statutes that are being challenged, and you've got to decide, are you gonna try to say the government absolutely cannot do this, or are we gonna suggest that maybe the government should be allowed to do this, and maybe the time should be bought to let some of this play out?”

And one other thing that I'll just note. You mentioned, when you read the definition, one of the things that really concerns me here. Oddly enough, the cases have started with what I consider the, I don't know, the weakest case or whatever, which is the content issue. And by the way, I do think the standard is content neutral, which generally results in a lower degree of Supreme Court scrutiny to see if the law can be upheld. But the real fears, to me, that I think people should be worried about, are things like demonetizing people and so on.

If you wanna talk about something where I think the application of the analogy of the Deep South, during Jim Crow, would really come into play, it's when you start making it so that disfavored businesses literally can't get cash, transfer cash, do the kinds of things that you need to be in business nowadays. And I guess one of the questions would be, if you take Ilya’s sort of absolutist approach here, where is the stopping point on that? Is that fine, too?

That's also [inaudible] [00:25:27], we’re just gonna demonetize people, all these folks are gonna be out of business. If they're in disfavored industries, they're out of business. They'll have to set up something else. But that might be tough to do. And we saw, for example, when the Parler site went up and tried to compete with Twitter, and the other tech companies ganged up on it. And overnight, it was gone. And that's a worrisome thing.

Nico: Well, there is the idea that you can think of these internet companies as on a spectrum, right? On the one hand, you've got the internet service provider, the Verizon, that just gets your internet up and running. You've got payment processors, for example, and you've got institutions like Cloudflare that prevent denial-of-service attacks. They don't have any expressive component to them, but they allow you to exist in the internet world.

And then, further along the spectrum perhaps, is something closer to a social media company that maybe does have an expressive component, or it's like a Tumblr, where you create a community surrounding a shared set of values. If you’re a Buffalo Bills fan, for example, only Buffalo Bills fans exist here. And you can only have positive Buffalo Bills opinions. So, when I'm thinking about it, you gave the example of the print shop, right? Well, how might that be different than Masterpiece Cakeshop, right? Which has a more expressive –

Ilya: So, I wonder if I might jump in here.

Nico: Go ahead, Ilya.

Ilya: Just a couple things. One is, Parler very much still does exist. They found an alternative host and they're up there. They're not very popular, but that's not because they've somehow been forcibly driven out of business. It’s because not that many consumers like their services. The same thing is true for Trump's Truth Social, which very much exists. And there are some other smaller, sort of more right-wing-oriented types of social media sites of various kinds, most of which, by the way, do at least have their own content moderation. And that leads me to the print shop analogy.

I would maintain that in the history of print shops, I bet there are plenty of instances where somebody came into a printer and they wanted to print some kind of viewpoint that the printer thought was really awful. Racism, Nazism, Communism, what have you. And the printer said, “No, I don't wanna do that.” And in the context of Masterpiece Cakeshop, which you just mentioned, for listeners who don't know, that was a case where a baker did not wanna bake a cake with an inscription celebrating a same-sex wedding because he was, for religious and moral reasons, opposed to same-sex marriage.

The vast majority of conservatives and libertarians, correctly I think, recognize that that was an exercise of the baker's freedom of speech. Possibly also of his freedom of religion. And they, and I also, very much stood up and said, “No, the baker should be able to say no.” And the person who wants the cake baked, they should go somewhere else and get it baked there. Analogies to the Deep South, in the Jim Crow era, I think don't undermine that point because the discrimination there was massively underpinned by the government and by private violence. If you remove that, I think things would be much different.

I think you can imagine worlds where somebody's view is so unpopular that they can't do anything anywhere. That is simply not the case with people who disagree with the views of these social media websites. There are many other places where you can express your views, some of which, as I mentioned before, are much more popular than these social media sites are. And you can have more viewers, more credibility, and the like. Yes, Twitter and Facebook are big, but there's actually more evidence that something like Fox News influences public opinion and elections than that Twitter does.

But nobody argues that means Fox News should be compelled to show more left-of-center views, or other views that disagree with their editorial line. And the same thing applies here. Final relevant point on this particular issue: unlike in the Pruneyard case, which Brad mentioned, where there was a shopping mall and the court said, “California can have a law requiring the shopping malls not to kick people out based on protests or demonstrations they had.”

Part of the reasoning of the court in that decision was that the mall really was open to everybody in the general public, without exception, or almost without exception. That's not true for Twitter or Facebook, or Truth Social for that matter. They make you sign a contract before you come in, and among the provisions of that contract is that you are subject to their content moderation. There are few, if any, shopping malls that require you to sign a contract before you come in. And that contract explicitly reserves, essentially, editorial judgment.

In the court decision, they say, “Well, that judgment is exercised after the fact,” and so that makes it somehow different. But as Judge Southwick pointed out in his dissent, in some cases it's actually exercised before the fact. For example, in the case of Donald Trump and some other speakers, they simply are banned from Twitter or from Facebook entirely, because the owners of the firm have judged that their expression is so inimical in various ways to the platform's values that they don't wanna have them.

To my mind, that's no different from Fox News or the New York Times saying, “We don't want Ilya Somin ever appearing on our website or our network because we think his views are totally awful.” I might disagree with that decision. I might say it's inconsistent, or that I'm not nearly as awful as they might think I am. But they do have a First Amendment free speech right to do that. Finally, on the issue of private versus public threats to freedom: yes, I certainly recognize private actions can threaten freedom in various ways.

But when a private platform or a private owner says, “I don't want this certain kind of speech on my property, or I don't want people to enter my property at all,” that is not some kind of private coercion of freedom; that is itself the exercise of freedom. And yes, in theory, you could have the danger John Stuart Mill was worried about, where there's an opinion that is overwhelmingly agreed on by society, and purely private action can drive the dissenting opinion out. That strikes me as a much less likely or plausible scenario than the government exercising its power to force the same rules on all social media, or all media that they think are too large.

And I would note that in the Mill-type scenario, if an opinion really is overwhelmingly dominant in the private sector, to the extent that people who disagree with it are shut out, then it's certainly very unlikely in a democratic society that the government is gonna step in to protect that highly unpopular opinion. To the contrary, it's much more likely that they will step in to oppress that unpopular opinion more. So, when we have opinions that are very unpopular in the private sector, that's actually all the more reason to keep the government out of it.

Because the government is much more likely to come in and suppress those few places where that unpopular opinion can still be expressed, rather than try to come in and somehow help the unpopular opinion get more of a foothold. And if you look at the history of government policy, there are vastly more examples of the former kind, where they try to help suppress unpopular opinions, than where they try to protect unpopular opinions against private disapproval or exclusion.

And I would add that that is actually what is going on with the Texas and Florida social media laws. It's not the case that they're protecting some kind of unpopular opinion; their claim is that conservative opinions, which are, of course, the politically dominant ones in the state of Texas, are being too much excluded from Twitter, or from Facebook, or some other platforms.

And therefore, they want those opinions to be more present. So, they're not protecting the weak against the powerful; they're actually protecting the majority opinion, or at least the more common opinion in that state, against media firms that, on some points, disagree with it, or at least on some points don't want some types of right-wing speakers on their sites.

Brad: No, I'd push back on Ilya on a couple of points. First, I think you're entirely correct, and it's a really good point, just a very good prudential and practical point, to express that concern about what the government is likely to do with this power. I think that's a real issue, and not one that can at all be taken lightly. But let's take a couple of other things. So, the Masterpiece analogy, I think, again, that's one that has some persuasive power. But at the end of the day, when you look at the cake shop cases, why are they always cake shops? Right?

Well, the reason is because you're asking somebody to bake a cake and write on it, here's what I think. Right? And that makes it much more like Barnette, like Wooley v. Maynard, those kinds of cases where the government is compelling somebody to write something down. And here, Twitter is not being compelled to write any speech down. And you might add the point that nobody really thinks that because I post something on Twitter, that's what Twitter believes. Nobody thinks – I would venture a guess that literally not a single person in the United States believes that.

Could be wrong on that, but I would venture that, yes. So, I just don't think that Masterpiece quite – I think it can be distinguished. And that's what makes, again, this issue kind of tough. Also, on the question of – it's not so much whether they're encouraging popular or unpopular opinions. It's the power of individuals. Many of the people who are being knocked off these sites do not have a lot of power. And so, it's not really so much, “Is this a popular opinion or an unpopular opinion?” Although that certainly can become an important point. But a lot of these people don't have any power at all. They're certainly much less powerful than the platforms.

The Pruneyard case, again – Pruneyard could have put up a sign or two saying, “If you come into our mall, you agree you're not gonna do this.” They were still gonna lose that case. So, I don't think that the idea that there's this contract separates it. And I think the court is quite right in pointing out that – look, I believe contracts of adhesion are valid and everything else; you check that little box, “I agree to these terms,” on your computer. But the reality is that elsewhere, these platforms, like the mall in Pruneyard, are repeatedly going forth and saying, “We welcome everybody. This is America's new town square, come here.”

Oldham's opinion even includes some quotes from some of that sort of marketing material. But then they have, in this little fine print that you check the box on, “Oh, but we reserve the right to kick you off.” Well, that's Pruneyard, that's the Pruneyard case. So, I don't think that's a distinction between the two cases there, in that sense. So, again, I think some very tough issues that require – let's put it this way, I hope the Supreme Court considers them very, very carefully, if they're gonna take this case. And what I just hope is that they do it with appropriate modesty. And I'm not sure, by the way, the Fifth Circuit had appropriate modesty. But you know –

Ilya: Yeah. So, I think the one point we sort of agree on is that this circuit opinion is not modest, and modesty is not always a virtue, but in this case, they should have been more modest about making claims, many of which are utterly unsupported, either by precedent or by empirical evidence, or even just minimal common sense.

A couple of points on Masterpiece Cakeshop and Pruneyard. I think Pruneyard is a bad decision that should be overruled on both the free speech and the takings dimensions. But it is the case that the court made much of the fact that this is open to everybody. And at that mall, I think, like at the vast majority of malls, there is not in fact a person standing at the gate saying, “You cannot enter our mall unless you sign this contract, which says” –

Brad: But there's nobody at Twitter manning the gate either. Right? It would be like if you [inaudible – crosstalk] when you walked into the mall, “Sure, I'll be” – and just walked on by them. They still have that issue. And that's what Twitter's got, “Check the box.” But they're not really [inaudible – crosstalk] –

Ilya: So, on Twitter, if you wanna just go to the website and look at things, you don't have to sign a contract. But if you wanna post, you do have to sign it. And this is not just a legal technicality. By now, most Americans who are at all interested in sites like Twitter know that they have content moderation, know they have various kinds of rules. Whereas, by contrast, very few people, if you asked them, would say, “I have to sign an agreement before I enter a mall.” Certainly not an agreement about what I will say or not say when I'm there. And yes, you're right.

And Judge Oldham is right that Twitter also sometimes had this rhetoric, “We're open to everybody,” or whatever. But I think that's standard sort of advertising puffery. It's similar to saying, “We have the best burger in the world, clearly better than any other burger.” Most minimally intelligent people know that that's puffery, that they're not saying it's literally scientifically proven that they have the best burger in the world. And I would say the same thing, at least, about Twitter or Facebook's rhetoric about openness.

Finally, on Masterpiece Cakeshop and the like: yes, you can try to distinguish it, and Brad is clever at making distinctions, and Brad is one of the best lawyers we have in the First Amendment space. So, he's right, you can try to distinguish it on the basis that there the baker is actually writing something, as opposed to merely allowing somebody else to post something. But if the law compelled the baker to include a sign on his cake that was written by somebody else, or even post a sign in his shop that was written by somebody else, then I think he would still have a First Amendment speech case there, and perhaps a freedom of religion case as well.

And it wouldn't matter, even if the baker sometimes wrote inscriptions that he disagreed with, or sometimes posted signs that he disagreed with. Because we recognize, and I think Judge Southwick notes this in his dissent, that part of the right of editorial discretion and freedom of speech is the right not just to say, “I only post messages I agree with,” but to say, “I only post messages within a certain range. There are some messages that I might disagree with, but they're within the range of what I consider to be acceptable.”

But on the other hand, there are other messages that I think are, from my point of view, beyond the pale. So, I might post or allow mainstream conservative messages, or what I think are mainstream, and mainstream liberal ones, but not Nazi or communist ones, or racist ones. I might post some expression about sex that I might disagree with, but not stuff that I consider pornographic, even if that pornography is protected by the First Amendment. And so on.

And that's a common stance for newspapers, radio stations, TV stations, all these other sorts of organizations that I mentioned before. And it's entirely, I think, within the First Amendment free speech rights, and also the property rights, of social media sites to take the line that they permit a range of speech, but not an infinite range, or not even as broad a range as would be legally permissible for them to include.

Brad: So, let me make one more comment, if I can, on Masterpiece. And then, I hope that we can, in our remaining time, touch a little bit on Section Two, the disclosure requirements of the law, perhaps. So, let's suppose somebody had – and for people of a certain age, it reminds me of the kind of business that Kramer might start on Seinfeld – suppose you had a cake shop where you said, “Look, you can come in here and bake your own cake. We've got all the best ovens and the décor, little things that help you decorate the cake properly, and maybe we'll sell you great ingredients and so on, too.” Right?

If you had that sort of thing, I don't think there would be an issue. And moreover, I would expect that you would find government regulation, and people saying, “Well, if they're coming in to make their own cake, and you've offered this up, you can't prevent them from putting the message they want on their cake.” Maybe I'm wrong on that, but I don't think that would be a terribly controversial law. And I think that's largely how we've operated in a number of areas.

And also, just on Pruneyard: I don't think most people think, when they walk into a shopping mall, that they can say or do whatever they want. I don't think most people think that at all. So, I would disagree with that. So, maybe we just have an empirical disagreement, and I don't think there's any evidence on it. We might have to go out and do some polling: what do people think they can say in a shopping mall? But the other thing I did want to touch on – well, I'll let you, Nico –

Nico: Yeah, let me get in here, because we can get to the disclosure. I wanna touch a little bit on the common carrier argument here. So, you might recall the Packingham case; I believe it stemmed from a North Carolina law that prevented registered sex offenders from being on social media. The Supreme Court struck that down as unconstitutional. And as part of the rationale, they sort of argued that social media is so much a part of our daily lives, and so much a part of our modern economy, that to prevent a sex offender from being on it is to cut them off from an important part of our society. Right?

So, if you take that understanding of social media as important in our world, and then you ask yourself, “Okay, well, what if these social media companies decide that they're just gonna not let conservatives on their platforms? Or people who express conservative opinions?” Ilya, is there no room for the government to become involved there, in your mind?

Ilya: I would say it depends on what you mean by no room, but I think there should be no room, at least no room beyond laws that can pass a very high level of strict scrutiny, for them to say that these companies cannot exclude people based on their messages. Notice, you can make the same argument: what if all newspapers got together and said, “No conservatives will be published here”?

What if all non-social-media news websites said that? What if all TV stations did that? You can play this out in the same way. The fact of the matter is that in a private sector with competition, that's not gonna happen, or if it did happen, competitors would emerge, as, by the way, Fox News did emerge as a competitor for –

Nico: And didn’t that kinda happen with the Fairness Doctrine, too?

Ilya: Sorry?

Nico: Didn’t the Fairness Doctrine kind of try to get ahead of that sort of issue?

Ilya: Yes, it did do that. But the Fairness Doctrine was unconstitutional. It was repealed by the Reagan administration in the 1980s. I think certainly most libertarian and conservative scholars agreed that the Fairness Doctrine was unconstitutional. Even many on the left, I think, at this point would say so as well. So, even after the Reagan administration got rid of the Fairness Doctrine, that didn't mean that there was only left-wing media out there.

And indeed, Fox News emerged after the Fairness Doctrine was gotten rid of. So, I think, therefore, that that fear, if it's valid, could justify government control of what all sorts of media publish, including many that are more influential than social media, in terms of actually influencing people's political opinions.

As for the decision that you mentioned, I think there's a big difference between the government banning somebody from appearing on all social media of any kind, thereby, again, setting up the same rule for all companies, as opposed to a particular company saying, “We're not willing to give a sex offender access.” And by the way, I have said elsewhere that I think the definition of who counts as a sex offender is overly broad, and I am concerned about laws that, for instance, bar anybody who's a registered sex offender from living within 1,000 yards of a school or whatnot. I think that's a problem.

But it's a problem because it's a uniform rule set up by the government over an entire vast area, as opposed to just private individuals making their own decisions for themselves. And lastly, I think Brad makes the interesting analogy of: what if there was a rule for cake bakers where, unlike the owner of Masterpiece Cakeshop, these bakeries would let people come in and do a do-it-yourself message on their cake? I think there would be a First Amendment problem if the baker, or the owner of the bakery, could not say, for instance, “We allow a wide range of messages, but we forbid Nazi messages or racist messages or other ones that we especially disapprove of.”

I think that would be a serious problem. And I think lots of people would readily see that, if such a law really had that effect. The reason why people are less concerned, or some people are less concerned, about the Masterpiece situation is that today, and I think for good reason, support for same-sex marriage is a majority view; it's widespread, it's seen as entirely reasonable. So, the cake baker is seen as a bigot, I think in some respects for good reason, for thinking that this form of marriage is somehow wrong or evil.

And I agree, there is bigotry there. But I also think even people with awful opinions have the right to freedom of speech, and I would say the same thing is true for social media content moderation decisions that I disapprove of, or editorial decisions by newspapers that I disapprove of. And to call it censorship, I think, is both misleading and, yes, Orwellian. The opposite is true: compelling hosts to carry messages that they don't want is actually the real censorship.

Nico: Brad and Ilya, do you guys have 10 more minutes to keep going? I do wanna get to disclosure and I do wanna ask a question about the outcomes here. So, if you have another 10, 12 minutes, I'd appreciate it.

Ilya: Sure.

Brad: Yeah. So, Packingham, if I could just – I agree with Ilya, Packingham is about the government banning people from social media. So, it's a very different paradigm. And to me, clearly, something that erodes speech and violates the First Amendment. I wanna point out –

Nico: Yeah, but in the dicta, they do talk about the importance of social media and –

Brad: Well, that's true. That's true. And in that sense, it may support at least that sort of reasoning, or thinking about the importance of social media as a means of communication. But it certainly is not gonna be controlling, though that's a good point. The other thing I would point out is, this law, the Texas law, is not at all like the Fairness Doctrine. That is, there's no requirement that the platforms assure a diversity of views. There's no requirement that, if they allow one view to be heard, they bring another view on. Nothing like that. It's rather simply a content-neutral provision that says you can't discriminate against certain points of view.

And we haven't focused much on that legal standard, but again, under the doctrine, a content-neutral regulation typically gets a sort of intermediate scrutiny, and asks whether there's a compelling government interest. And so, I think there's not been nearly enough focus, and Ilya has focused on it some here today, but in a lot of the discussion and a lot of the criticism, there's not really been a serious discussion of, “Is there a compelling government interest? And how does that play into this idea of a content-neutral regulation?”

Ilya: I think that's a reasonable point, that the level of scrutiny might be lower for a content-neutral regulation, but it's still intermediate scrutiny, which is a pretty restrictive standard, though less so than strict scrutiny. I would add, also, I'm not convinced – this isn't much discussed in the opinions, but I'm not convinced that this law really is content neutral, because I think there's a lot of evidence now, which was in the briefs in the District Court, which suggested that Texas's real motive was not an impartial concern for all speech.

But rather, specific complaints raised by conservatives or right-of-center people. And that, therefore, the goal of this law is really to boost one side in a political debate, a side that particular platforms may not like, or may not like as much as Texas thinks they should. And I think there is a lot of doctrine, in the First Amendment area and others, which suggests that the intent of the government actor is relevant in determining what kind of regulation it is.

For example, a law that on its face is neutral, like literacy tests for voting or a poll tax, may be seen as racially discriminatory if the evidence strongly suggests that the motive is to discriminate against African Americans or some other minority. And I think there is a lot of evidence that this Texas law is meant to boost, specifically, right of center speech [inaudible – crosstalk].

Brad: Let's say the problem is that they are trying to boost right-of-center views. And let's suppose, and I think this is a contestable issue, that conservatives are being discriminated against on these platforms. They can't seek redress from a liberal state government, because the liberal state government says, “Tough, we don't care. That's great, we think that's good. We want your views kept down.” And they can't seek redress from a conservative state government, because the conservative state government will, by definition, be trying to protect conservative points of view. I just think that's a little bit of a Catch-22, that proves a little too much and goes a little too far in this particular –

Ilya: So, it's a Catch-22 that is inherent in the nature of restrictions on discrimination. You could similarly say that segregationists who wanna promote facially neutral laws that would be effective for promoting segregation face the same bind. They can't go to an anti-segregationist state government, because the anti-segregationist state government will not be sympathetic to their complaints. But if they go to a pro-segregationist state government, then that government's laws will be struck down because of their malicious intent. And I think the same thing is true here. So, the Catch-22, if there is one, is a feature, not a bug.

Brad: I'm not even sure what you're talking about here, Ilya. You're going off the railroads here. [Inaudible – crosstalk].

Nico: Yeah, let me ask about the disclosure section of this law. And Texas isn't the only state to pass a law requiring disclosure of content moderation practices for social media companies. California, I believe, in the last couple of weeks also passed a law requiring disclosure of how social media companies moderate certain categories of speech, including hate speech. Which, of course, goes undefined. And in passing that law, Governor Newsom said, “California will not stand by, as social media is weaponized to spread hate and disinformation that threaten our communities and foundational values as a country.”

And while he, or the state legislature in California, didn't go so far as to prohibit viewpoint discrimination or mandate viewpoint neutrality, it did, through its transparency requirements, seek, and it's very clear from Newsom's quote, to chill certain speech, or to put pressure on social media companies to chill certain speech. And I think you might get that same sort of effect in Texas as well. To say nothing of the concerns about proprietary information and private property: the Texas law says, for example, that you need to disclose how you use search ranking or other algorithms or procedures to determine results on your platform.

But I ask myself, “Isn't that the whole game?” Right? You're starting to set up a social media company, and the way you get market share, if you're a social media company, is to deliver content that is more interesting to an end user. That's how TikTok has taken off: its algorithm has gotten so good at delivering relevant content and interesting content to its users.

So, by being forced to provide that information to the state of Texas, are you not also being forced to turn over what makes your social media company what it is, what distinguishes you in the market? So, there are two questions: there's the chilling effect of transparency requirements, and also the sort of private property interest in your algorithm.

Brad: Yeah. No, that's a great point, and it's good on both issues. One thing that, again, we might think about is, to what extent does the court have to view all of this as an all-or-nothing proposition? Could a court say: to the extent the definition includes your algorithms, it's a problem; to the extent it includes other things, maybe it's not? So, one can go in a lot of different ways on this. I think you've also implicitly raised something, having said let's talk about the disclosure stuff.

That takes us right back, a little bit, to the content moderation issue. One of the arguments that's been made, for example, by Philip Hamburger at Columbia recently, is that we should have rules that prevent companies from keeping lots of private data. Or rules, in this situation, that might prevent them from discriminating against content on their platforms, precisely so that government cannot begin to pressure the platforms to allow government to use them for improper purposes. And we've seen a lot of evidence – I don't wanna make a conclusory statement as to what's been done.

But we've seen a lot of evidence that the government has been attempting to pressure social media platforms to censor certain information, and that the platforms have been very willing to do the bidding of government, doing things that government could not do directly. And, of course, the government has a lot of power. And over on the left, Elizabeth Warren is threatening them with antitrust actions and so on. And so, one reason one might arguably want this kind of law is so that the tech companies are gonna say, “We'd love to help you, government, but we can't, because we've got a law that prevents us from doing that.”

And, again, that might be another way in which we think about certain types of regulation as actually being something that, oddly enough, enhances freedom, or protects us from government overreach, by setting a legal standard that can't be infringed upon.

Nico: Ilya, do you have any additional thoughts on that before I turn to the final question?

Ilya: So, I don't have a very strong opinion on the constitutional aspect of the disclosure issue. I think it's much less worrisome than the content moderation rules. But I would note two things about the points that have been made so far. One is, the Gavin Newsom situation should remind people on the right who like the Texas law that two can play this game, or more than two. Liberal states would also be able to impose content neutrality restrictions or other restrictions, especially if part of the court's reasoning is that these things are common carriers, or that they can be regulated because they're big and influential, and so on.

So, the goal here is not just to protect people against the Texas government, but also against governments generally, including more left-wing ones. The second thing, when it comes to chilling effects and government pressure: I think, first, it's kind of perverse to say that because the government can pressure people, we should allow the government to pressure them even more, and get them to disclose data and so forth. The proper way to deal with government pressure is to cut it off in the first instance.

By, among other things, drawing First Amendment protections, and also property rights protections, so that if the government tries to pressure social media sites or other websites, they can simply say, “We don't care what you think, because your threat is not credible. Courts will strike down your measures against us, whether on First Amendment grounds, property rights grounds, or the like.” So, to my mind, that actually strengthens the case against disclosure requirements, rather than suggesting that they should be imposed.

Particularly since, if these things have to be disclosed, then it's actually more likely that the government will know what is being done, and therefore it can identify potentially vulnerable pressure points and the like, and actually use the disclosure requirements to its advantage. That said, I actually don't have a very strong opinion on the constitutionality of many disclosure requirements. I think if the disclosure requirements really are content neutral, they might, at least in some cases, be constitutional. But it's not an area of law I know as much about as some of the other ones that we've talked about.
And I do worry about the issue of disclosing proprietary information, which can then deter innovation or deter incentives for innovation. But I wonder if that's more of a policy question than a constitutional one. So, on the disclosure things, I think Brad certainly has more expertise than I do on that issue, with the important work on campaign finance disclosure that he's done, and probably lots of other people know more about that aspect than I do. So, while I have a few somewhat tentative thoughts on that, I'll probably leave that issue to people who know more about it than I do.

Nico: Go ahead, Brad.

Brad: Well, I was just gonna say, it is an interesting aspect because the – well, now I’ve just lost my train of thought here. [Inaudible – crosstalk].

Nico: Well, if you think of it again, you can interrupt me.

Brad: It obviously couldn’t have been too important.

Nico: I do wanna ask a closing question of you both. Well, the Texas law was set to go into effect here soon, but it looks like NetChoice filed an unopposed motion to stay the mandate that would put it into effect, pending a cert petition to the United States Supreme Court. And I guess Ken Paxton, the Attorney General in Texas, didn't oppose that. So, I don't believe the law will go into effect. It just depends on what the Supreme Court says here, I guess.

But let's say the law does go into effect, or the Supreme Court agrees with the Fifth Circuit that this law is constitutional. What does the internet look like, or at least social media? So, I'm reading this Charlie Warzel article that came out a few days ago in The Atlantic. And he quotes someone from the Stanford Cyber Policy Center, who says, “These legislators think they're opening the door to some stuff that might offend liberals. But I don't believe they realize they're also opening the door to barely legal child porn or pro-anorexia content and beheading videos. I don't think they've understood how bad the law is.”

I'm assuming that we've seen the news articles about the number of content moderators hired by Facebook and Twitter to police beheading videos, crush videos, child porn videos, preventing those from getting up on the platform. How can social media companies handle that if the Fifth Circuit ruling stands, and there are states like Texas that have a law like this in place? Are social media companies gonna have a switch that you can play with at the top of your screen, where you get a moderated version of Facebook versus an unmoderated version? I'm just trying to get a sense of what social media looks like.

Because I remember the MySpace era, where things were much less moderated. You could even change your page with your own HTML. It was very spammy, with a lot of different sorts of content that you wouldn't see on a Facebook or a Twitter. So, what does the internet look like in a Judge Oldham, Texas world?

Ilya: So, I am not an internet or a technology specialist, so I can't know for sure. But I do think that if the Texas law is allowed to stand in its full force, then you will see a lot of very unpleasant user experiences on social media. One of the reasons why virtually all social media platforms do in fact have content moderation that goes beyond what the Texas law allows is because most consumers want that. That doesn't mean that all the content moderation is good, or that there aren't some stupid decisions. I, myself, disagree with some decisions that I've seen made.

But I think a world where the content moderation is limited to the very tight space that Texas allows would be problematic. I would also worry that, depending on how the court reasons this decision, it would open the door to other kinds of regulation of social media content, and possibly other content as well. Because, as I've mentioned before, many of the kinds of arguments that Judge Oldham uses to justify the regulation of social media could just as easily apply to many other kinds of media.

And I think most of his attempts to say that it wouldn't apply to them largely fail, either because they're based on factual inaccuracies, or because there are logical fallacies in the things that he says, or both. But that said, I have to be somewhat tentative, in that I'm not an internet technology expert, and attempts to predict the future of internet discourse have often failed quite spectacularly. So, I admit, I could be wrong in some of these predictions.

But I still do think, generally, the lessons of history show that it's much better to leave the development of speech and discourse to decentralized private decision-making than for government to impose one-size-fits-all rules. And that's especially true in a dynamically evolving industry, as this one certainly is.

Brad: I agree. I mean, it's very hard to know what it would look like. And also, I'm not the guy who's developed internet sites successfully. So, in some respects, this is a question for the folks at Meta and Twitter and Amazon and so on. Tell us what's gonna happen. I do think that it's very likely, at least I would look at it as very likely, that yes, consumers will actually revolt and this will not be a popular law. Because people will find themselves getting all kinds of stuff they don't want to get. I think that certainly is one of the possibilities.

It may be that the companies, as you suggested, Nic, have sort of a moderated platform that you can opt into, or maybe that they do more things where you opt into certain groups. Not groups the way they're sometimes used on some of these sites now, but we might say – I guess I'll call them groups, but they're not exclusive groups, per se. But you say, "I wanna be on this page that doesn't include this stuff, or does include these things." I think there's a lot of ability for the companies to do a fair amount of that, to give you – in other words, a way to, in a certain sense, set your own algorithm by saying, "Here are things I don't want, here are things I do want."

But I don't know that for sure. Like Ilya, I'm not really a tech guy. So, we'll have to see how that plays out. And it's another reason for a lot of these things to be done, not on a facial basis, but on an as-applied basis. I think that's true as well of the disclosure things, which we didn't talk about much. But again, it may be that – we have lots of disclosure obligations on companies. Contract disclosure obligations, labeling disclosure obligations, we force them to put notices up in their workplace and so on.

It may be that when you get into as-applied challenges, companies would be successful in getting a lot of the mandatory disclosure stuff taken out, because it stepped too much into their trade secrets or other protected rights, what have you. And I think that's largely the case with how the net would look. So, yeah, it's a great question, how's the net gonna look? I don't know. I think it's very, very possible that when this is done, it's gonna be very unpopular. But we'd have to see.

Ilya: Can I ask a technical question about this?

Nico: Go ahead.

Ilya: In your opinion, Brad, or for that matter, in Nic's opinion, does the Texas law allow for a situation where Twitter essentially says, "When you join Twitter, you have the option of choosing either moderated Twitter or unmoderated Twitter"? Right? Because if that's permissible, then it may be that, say, 90 percent of consumers use moderated Twitter, and the unmoderated Twitter will be sort of a little cesspool that looks like Gab or something. But people who want to use Twitter to reach a potentially vast audience, but without any restrictions, they wouldn't be happy.

Brad: Yeah, and you’re right back to square one. And I don't know if the law covers that or not, to be honest.

Nico: I don't think it does. I've read the law, trying to look for any suggestion that that sort of thing might be acceptable. But it's pretty straightforward in saying you can't censor, and what censorship is permitted. Whether you can set up a censorship zone versus a non-censorship zone, I don't know. I think it's largely silent on that, and that's something that would get litigated if the law goes into effect, potentially.

Ilya: I wonder also whether Judge Oldham addresses this in his opinion – I think not. But I wonder whether Judge Oldham's opinion would read as saying there's a First Amendment right to set up a restricted zone, as long as you have an unrestricted zone on your site. My guess is Oldham would probably say there is no such First Amendment right, and that Texas or other states could ban having that sort of zone with restrictions. But you could imagine a more – what was the term that Brad used? – a more humble opinion, perhaps, that did allow that.

Nico: So, the federal government, or for that matter, state governments, could set up their own social media companies too, if they wanted to. But that doesn't seem to be the interest –

Ilya: Sure. And I think that would be less of a menace than them trying to set up one-size-fits-all rules for –

Brad: It sounds like an awful [inaudible] [01:07:57], but maybe you're right.

Ilya: We already have government websites, where the government trumpets its message in all sorts of ways. And some of them do have comments sections, and sections where you can put up petitions and so forth. So, it's not quite the same. It's not quite the same thing as Twitter. And it doesn't have as many bells and whistles as Twitter has, in most cases. But government can do that. And I'm not necessarily always a fan of it when they do it, but I think it's constitutional, and it's not threatening in the way that this Texas social media law is.

Nico: Yeah, I don't know that it would take off, if our experience with USAJobs or the healthcare exchanges is any indication –

Ilya: Absolutely. The government may not be very good at running these websites. Though, historically, government has had some success in doing propaganda of various kinds. We see Vladimir Putin in Russia, obviously, having some degree of success at least with that.

Nico: Well, guys, I've kept you longer than I promised. I feel like I say that on every podcast, I'm so sorry. But I had some technical issues getting started here. I’m glad it was able to work out. Brad and Ilya, we could keep going, but we’ll leave it there. And I thank you both for your time.

Brad: Yeah, it's fun. Thanks, Nic. Thanks, Ilya. I gotta go teach a class now, I have a Friday afternoon class. So, see you all there.

Nico: That was Brad Smith of the Institute for Free Speech and Ilya Somin of George Mason University. This podcast is hosted, produced, and recorded by me, Nico Perrino, and edited by my colleagues, Aaron Reese and Chris Maltby. You can learn more about So To Speak by subscribing to our YouTube channel or following us on Twitter or Instagram by searching for Free Speech Talk. We're also on Facebook at Facebook.com/sotospeakpodcast. And you can email us feedback, as always, at sotospeak@thefire.org. We take reviews. Those are appreciated; they help get more ears and eyes on the show. And until next time, I thank you all again for listening.