
Key Concept — Censorship gravity: The tendency of psychological, cultural, and political forces to pull societies back toward closedness and censorship. It stands for the premise that free societies are unusual and hard to maintain; regression to the mean is regression to a closed society.


In the two-season masterpiece Fleabag, the main character says something that I will never forget. She has a breakdown in which she laments the terror of freedom itself. (Why is she in the confessional talking to a priest so frankly? Well, just watch the show and make sure you stick with it through the first few episodes.)

I know what I want. I know exactly what I want. . . . It’s bad. I want someone to tell me what to wear in the morning. . . . I want someone to tell me what to wear EVERY morning. I want someone to tell me what to eat. What to like, what to hate, what to rage about, what to listen to, what band to like, what to buy tickets for, what to joke about, what not to joke about. I want someone to tell me what to believe in, who to vote for, who to love and how to... tell them. I just think I want someone to tell me how to live my life, Father, because so far I think I’ve been getting it wrong . . . and I know that’s why people want people like you in their lives, because you just tell them how to do it. You just tell them what to do and what they’ll get out at the end of it, and even though I don’t believe your bullshit, and I know that scientifically nothing I do makes any difference in the end anyway, I’m still scared. Why am I still scared? So just tell me what to do . . . Just fucking tell me what to do, Father.

I am the first to admit that, despite my commitment to free speech, some part of me agrees. Wouldn’t it be lovely if someone could just take away the uncertainty of life and the stress that permeates it? Someone to simply tell you all the answers. Even if some of the answers aren’t perfect, you’re at least relieved of the burden of agonizing over the hardest questions — or worse, agonizing and still coming to the wrong answer with only yourself to blame.

We Americans are so used to lazily thinking of freedom as a “good” that we forget to ask why it needs to be a value in the first place. If it were our natural, most basic craving, we wouldn’t need the layers of protections in our constitutions, our laws, and (hopefully) our culture. Freedom would simply be considered a fundamental aspect of any normal human society.

But it is very much not, as the great cofounder of FIRE, Alan Charles Kors, once pointed out to me. Someone had asked him to review the introduction to a book on freedom of speech that claimed, “Since the dawn of civilization man has yearned to be free.”

As Alan pointed out, “Yes, but that’s only because the rest of the human beings wanted to keep him oppressed, subdued, or even to enslave him!” We’re taught about freedom in elementary school, and we recite what we learned by rote, like the alphabet, and just as uncritically. We don’t give much mindshare to the idea that we’d be in serious trouble if someone could tell us all the right choices to make: if someone could promise safety, security, and certainty about the future, some part of us would really love that.

I certainly see that in myself. I’m always looking for expert opinions to tell me what to do on any number of things, sometimes excessively so, even for things I could probably figure out on my own. Which brings me, perhaps surprisingly, to Noom. Noom is a subscription diet app that tracks your activity and what you eat, and gives you personally tailored recommendations to help you lose weight.

Half horse, half dietician

It all started with the weight I gained in 2019. As I was recovering from treatment for my tumor last year, I gave myself permission to eat pretty much anything I wanted. The (largely expected) result was that I got all the way up to 228 pounds. In February, I decided I needed to do something about it and started using a new product called Noom.

I was extremely impressed with Noom. It incorporates all sorts of insights from behavioral psychology, from what we know about weight loss, and from what we know about motivation, and fuses them into an extremely compelling package. Noom is essentially an app that gives you short daily lessons, backed up by a real-world coach you can confer with and who supports your goals. It also includes a community of other current Noom users with whom you can share your progress, your problems, and, perhaps most importantly, your goals, as a way of keeping you motivated and holding you accountable.

Every week it delivers a new lesson in tiny chunks. Like many video games and social media platforms, it is perfectly designed to give you a little dopaminergic reward as you progress through each step. Each week’s lessons have a different theme. They start out focused on the diet approach with the best scientific backing, volumetrics, but as you go deeper into the program you learn about everything from sleep, exercise, and how to evaluate health studies to the comparative merits of diets such as low-carb, Mediterranean, and gluten-free. There’s even a lesson that looks at other countries, comparing their diets and cultural practices.

So far I am down to 182 pounds, meaning I have lost 46 pounds this year.

Am I talking about this just to brag about my weight loss? No! (Well, I hope not. But hear me out.)

How to Decide by Annie Duke

Noom is an early ancestor of something that will become a consistent part of our lives in the very near future: well-informed, well-designed apps that we can rely on to help us make better decisions or keep us motivated toward particular goals. Even a relatively primitive AI, backed by a real human therapist (creating what is hilariously known in the AI community as a “centaur”), could teach you Cognitive Behavioral Therapy, for example. The same goes for Annie Duke’s crusade to get us to practice better decision hygiene. Her new book, “How to Decide,” will be this month’s Prestigious Jack Kirby Award winner; the techniques it tries to impart through example and illustration could be coached by a centaur.

As AI improves and tech companies continue merging into Gibsonian cyberpunk megacorps, these apps will become more comprehensive: single packages that help us make better decisions, lose weight, meet ever more specific goals, and even treat anxiety and depression.

I, for one, (mostly) welcome our centaur overlords 

This is where we are headed. Of course, as many of you can see, we are almost there in some ways. A big part of me is thrilled at the prospect. Imagine growing up with an imaginary friend who is loaded with all the wisdom of the world, error-checked by professionals, and can — if not always provide the best advice on what to do — give you the best advice on what not to do. The implications of relatively small advances in technology like this are pretty profound, and as processing power increases dramatically every year, the progress here will be startling, even just during the lives of my own small children.

My hope is that this will help people lose weight, make better decisions, choose partners more intelligently, get psychological help when needed, and even, potentially, nudge us toward being a little kinder and a little more moral.

Who gets to program the Omni-living apps?

There is an attendant risk, however. Handing too much of our decision-making over to the judgment of an app will likely make us worse at making decisions for ourselves and more helpless in the areas where we are not electronically assisted. Like someone who spent too long biking with training wheels, we might not be adequately equipped for the bumps and scrapes of the road once the wheels come off. And it’s not just decision-making skills you lose by offloading your thinking to an app or authority figure — it’s the CONFIDENCE to make decisions at all. Relying on these apps for too much may breed severe decision paralysis, and even anxiety, when they cannot be used.

That’s not the only thing to fear. Some part of us, indeed I believe some part of ALL of us, would love this too much. Like Fleabag, we would love something much smarter than us to take away the burden of freedom. That burden is very real, so much so that back in the late 19th century, when intellectuals working for the Russian Emperor tried to justify their absolutist czarist system, they emphasized the peace of mind that comes from freedom from freedom. It sounds utterly silly to modern ears, but the theme has repeated over and over again. The Puritans viewed true freedom as “freedom from sin,” achieved by laws to punish sinners; Steve Jobs described the App Store’s content controls as “freedom from porn”; and even now, some would likely characterize Twitter’s censorship of some anti-Biden news stories as “freedom from propaganda.”

Who programs the programmers? 

Which brings us quickly to the lesson of the day: censorship gravity. This is my own invented term to make clear that the movement toward human freedom that began in the revolutions of the 18th century is, and remains, more unusual than we are comfortable admitting.

Given political motivations, outside threats, and our own internal wiring, the forces of conformity are always present. Once people realize that freedom necessarily means “allowing people to be wrong or misguided,” they quickly come to believe that it may not be enough to let people make up their own minds. This pulls us back toward a less free society, and toward censorship of “dangerous opinions.” I call it censorship gravity because, much like a neutron star, it’s always pulling on us — and if we get a little too close, we will be sucked back in.

That “gravity” means that human societies tend to backslide in one direction: toward greater conformity and greater control. Just as poverty is a natural state of mankind that we have to work against, censorship is a natural state we must resist with institutions, norms, laws, and education projects.

Even a benign version of this almost inevitable future of centaur-guided decision-making would mean further marginalizing our ability to be autonomous thinkers, probably increasing our anxiety when faced with the peculiar circumstances and choices of everyday life. Reliance on AI could make us more fragile and less antifragile.

But the more pernicious part is the question of how this technology will be used by those in power, those with business agendas, and those who care more about political outcomes than human flourishing. While I don’t know enough about how China currently manages its social credit score, it seems clear that the dictatorship is already restraining freedom with technology, and probably doing it quite effectively. Those efforts existed before the social credit score, however. Jacob Mchangama has studied how states (including China) extend their censorship efforts across borders, both historically and today; FIRE’s own Sarah McLaughlin has tracked how China controls speech on some American campuses. There is no reason to suspect China would decline to use AI coaching as a tool.

Who gets to program the Omni-living apps? Will those apps mean better lives for many, or most? Or will they mean more effective, yet harder to detect, control at the hands of the powerful? I don’t really know, and much like Fleabag, part of me wants those perfect answers, though my rational mind is terrified of the potential cost.


NOTE: You can see the previous articles in our series on key concepts regarding freedom of speech here:
