Privacy Is a Precondition for Morality

Every system of control narrows the field of legitimate action. Morality narrows by gating. Noise narrows by drowning. Two modes, one architecture.



Visibility is a trap.

Michel Foucault, Discipline and Punish, 1975


The morning after the adversarial pass came back, I sat down and started acting as my own lawyer.

I pulled Ulbricht’s sentencing memo. I pulled the Tornado Cash indictment. I read Harmon’s plea. SatsRail was content-blind, non-custodial, subscription rather than commission, never in the payment path between customer and merchant. I ran the argument a prosecutor would make against it, and I ran that argument against the record of what the government had done to builders on the wrong side of this question. I made the case against myself first. Then I made the case back.

The precedents had names and docket numbers. Ross Ulbricht had been serving life. Larry Harmon had pleaded. Roman Storm was in court while I sat with this. The architectures each of them had built were not mine, and the distinctions mattered — custody, content-awareness, revenue model, the presence or absence of knowledge of illegal activity — but they were going to matter in a courtroom I did not want to be in. The risk was not a mood. It had docket numbers.

That was the first defense, the easier one. The legal case held. The architecture had been designed so it would.

The second defense was the one that mattered, because I had watched the mechanism the book’s later chapters name do its work and I knew how it arrived. Not as a charge. As a memo. From a card network, a payment processor, a regulator making a phone call that nobody recorded. The weapon the incumbents had used against the companies I had worked alongside was not the criminal code. It was the moral frame. You are helping bad actors. You have no legitimate use case. Your architecture is indistinguishable from a laundering tool. By the time it landed, the bank had already closed the account and the processor had already cut the merchant off, and the court it was never submitted to was not going to save anyone anyway.

SatsRail, if it worked, removed the handle the incumbents used to apply the frame. I was not imagining they would like this. I was building it because they would not. Which meant the moral attack was not a risk. It was a certainty. The only question was whether the answer to it existed before the attack arrived, or whether I would be constructing the answer at the speed of the news cycle, six months after the first article, while my payment processor was already halfway through cutting the business off.

So I ran the adversarial pass again. This time on the moral argument. I sat in the chair of the thoughtful, well-meaning person who would say: building privacy infrastructure is an amoral act. You are helping people hide things. Hiding things is what bad actors do. A moral society wants transparency. I built the strongest version of their case against me. Then I built the answer back.

The answer is this chapter. It is not a defense I hoped I would never have to give. It is the defense I already knew I would have to give, written down in advance, while there was still time to write it carefully.

The critique confuses visibility with virtue. They are not the same thing.

Morality — not compliance — requires a genuine inner life. When someone does the right thing only because they are being watched, that is not moral behavior. It is performance. Kant’s point exactly: moral worth comes from the will behind the act, not from the watching.

A society of total surveillance does not produce moral people. It produces people who are very good at appearing moral. That distinction is everything.

The mechanism is well documented in psychology. When people know they might be observed, they stop asking “what is right?” and start asking “what will be approved?” They internalize the watcher’s gaze. The question stops forming before it becomes conscious. The chilling effect operates below the level of deliberate self-censorship.

A society that stops thinking certain thoughts is not a moral society. It is a conformist one.

The clearest case study is East Germany. At its peak, the Stasi had roughly one informant for every 63 citizens. The densest surveillance apparatus in history. What did it produce?

Not a moral population.

It produced a deeply traumatized, atomized society where trust collapsed at every level. Between neighbors, between spouses, between parents and children. After reunification, people discovered their closest family members had been filing reports on them for years. The psychological damage outlasted the regime by decades. The point is not that the Stasi was evil — it is that comprehensive surveillance destroyed the social fabric morality depends on. You cannot have moral community without trust, and surveillance is a machine for destroying trust.

The mechanism does not require a Stasi. It requires only gatekeepers who can withhold.

In December 2010, WikiLeaks lost its donation channel in an afternoon. The major card networks and payment processors cut the organization off, one by one, citing terms-of-service violations. No court had found WikiLeaks guilty of anything. The blockade was administered by the payment rail itself, on the basis of a moral claim, and it lasted years. The donations went away because the rail decided they should.

Operation Choke Point was the same mechanism turned on legal U.S. businesses. From 2013 to 2017, federal regulators leaned on banks to deny accounts to firearms dealers, payday lenders, and other industries flagged as high-risk. No statute named these businesses. No court reviewed the exclusions. Banks dropped the accounts because the regulator made it clear they should.

In authoritarian contexts the mechanism is even more direct. Journalists, activists, protesters have had their ability to transact removed — not by court order but by administrative exclusion. The bank cancels the account. The card network cancels the merchant. The reason given is policy. The reason underneath is politics.

And the mechanism keeps moving down the stack. I tried the obvious experiment — opened an incognito tab, fresh session, no history, and asked a model about private payments. The flinch was already there. Not because anyone had trained it to suspect me, but because the moral frame the bank used in 2010 has been absorbed into the weights themselves. The chokepoint has become the prior. No regulator required. No phone call to record.

When the ability to transact is contingent on approval from whoever controls the ledger, economic freedom is conditional. And conditional economic freedom has a way of becoming no economic freedom at all. Gradually, then suddenly.

The principle that threads the needle is not radical. It is the foundational logic of every free society that has ever functioned: privacy by default, accountability when there’s cause. Probable cause. Warrants. Due process. Presumption of innocence. These are all the same principle applied to different domains. The legal tradition worked this out centuries ago. The problem is that technology made mass observation so cheap that societies drifted away from it without ever consciously deciding to.

The principle answers the hardest objection cleanly. The system was never supposed to watch everyone. It was supposed to watch people when there is a specific, articulable reason to. Mass surveillance inverts this. It watches everyone, all the time, and sorts out the bad actors from the data afterward, which requires treating every person as a suspect by default. That is not a safety architecture. It is a presumption of guilt with better branding.

The warrant system does not only protect the suspect. It forces the state to articulate why it is watching someone and have a third party agree. That constraint on power is the feature, not the bug. Privacy by default protects the innocent. Accountability when there’s cause pursues the guilty. The two are not in tension. One makes the other legitimate.

Privacy is not a preference. It is not a feature. It is the soil that morality grows in.

Free and moral people require privacy as a baseline. The same way oxygen is not a feature of human flourishing but the condition that makes it possible at all.

The road to the panopticon has been paved with transparency advocates. Many of the people who built the architectures of visibility, the ad networks, the compliance regimes, the payment graphs, genuinely believed they were making the world safer. Good intentions, confidently executed. What they built was infrastructure for control, available to whoever ends up holding the keys.

The privacy builder has a clear moral theory: people deserve sovereignty over their own lives, concentrating information is concentrating power, and concentrating power ends badly. Consistently, across history, without exception.

The surveillance builder has an assumption: that whoever holds the keys will be good.

That is not a moral theory. That is a prayer.