
The barbarians are not waiting beyond the frontiers; they have already been governing us for quite some time. And it is our lack of consciousness of this that constitutes part of our predicament.
Alasdair MacIntyre, After Virtue, 1981
I noticed the pattern in April 2026, when the European Commission announced that its age-verification app was ready. Ursula von der Leyen, the Commission’s president, presented it as the next step for safer online services and compared it to COVID passports. One scan would gate access to digital services across the bloc.[1] The language was calm. Safety. Verification. Protection. It was the same register as a public-health announcement. It was not asking anyone to accept new power. It was announcing that the power was already there, and that decent people would of course assent to it.
Around the same time, British officers were detaining people across the UK for offensive comments on social media. The rate was roughly thirty arrests a day under section 127 of the Communications Act and section 1 of the Malicious Communications Act.[2] The framing in each case was familiar: a speech law had been violated; the officers were enforcing it; the system was working as designed. The language around the law was calm. Public safety. Protection from harmful speech. Different country, different mechanism, same register.
Once I saw it in one announcement, I saw it in others. Control rarely arrives wearing its own name; it arrives as protection, as responsibility, as the only reasonable thing a decent society would do. It wears the language of virtue so naturally that questioning it feels like questioning goodness itself. That is the mechanism.
The pattern is older than any living institution. But the version running now is different in ways that matter.
The Old Architecture
For most of Western history, the moral framework that justified social control was religious. The church provided the vocabulary of right and wrong, the mechanisms of accountability (confession, penance, judgment), and the metaphysical grounding that made the whole system feel inevitable rather than constructed.
This is not an anti-religious observation. It is a structural one. When a single institution holds the authority to define sin, it holds the authority to define the boundaries of acceptable thought. What counts as transgression determines what counts as obedience. And obedience, once moralized, stops looking like control. It looks like virtue.
The medieval church did not frame its authority as power. It framed it as care for your soul. The inquisitor was not controlling you. He was saving you. That framing was not incidental to the system. It was the system. The moral story made the control architecture invisible to the people living inside it.
The Vacuum
Over the last century, religion gradually stepped back from the center of public life in most Western democracies. Fewer people attend services. Fewer accept theological claims as the basis for law. Secularism won, in the sense that the old moral authority lost its grip.
But the need it served did not disappear.
Humans are social animals with a deep appetite for moral frameworks. We want to know what the rules are. We want to know who the good people are and who the bad people are. We want a shared vocabulary for judgment. Religion provided all of this. When it receded, it left a vacuum. Not of belief, but of moral authority. The seat was empty, and the seat does not stay empty for long. Whoever could reach it would.
The New Priests
The state and the technology platforms filled the seat. Not overnight, and not by conspiracy. They filled it because they were there, because they had reach, and because they had something the church never had: data.
The roles map cleanly onto the older ones. A compliance officer is doing the work a confessor used to do. Receiving disclosure, sorting it against rules, recording it. A risk score performs the moral judgment a sermon used to perform. A content moderation policy plays the role of catechism. The terms of service stand in for commandments. And deplatforming, the quiet removal of your ability to speak, transact, or participate, carries the social consequence excommunication used to carry. It just does not require an appeal.
The language changed. The structure did not. There is still an authority that defines acceptable behavior. There are still consequences for transgression. There is still a moral story that makes the whole arrangement feel natural rather than imposed.
The difference is that the old system was at least explicit about being a system of belief. The new one presents itself as neutral infrastructure. It claims to be managing risk, ensuring safety, protecting the vulnerable. These are not articles of faith. They are presented as facts. And that makes them harder to question, not easier.
Foucault called this arrangement a regime of truth. Each society produces one: the discourses it treats as true, the instruments that sort truth from error, the people authorized to speak. The current regime of truth has its apparatus in infrastructure, and its authorized speakers are the compliance systems that run that infrastructure.
The Feedback Loop
The loop works like this. First, a control measure is introduced under a moral justification. Safety, child protection, national security, financial integrity. The justification is chosen to be nearly impossible to argue against in public. Nobody wants to be the person who argued against protecting children.
Second, the moral framing makes society willing to accept less privacy. If you have nothing to hide, you have nothing to fear. If you resist the measure, you are at minimum suspicious and at maximum complicit. Privacy becomes reframed not as a right but as an obstacle to virtue.
Third, less privacy creates more data. More data creates more surface area for observation. More observation creates more capacity for control. Not just of the original threat, but of anything the system can see. And now it can see a lot.
Fourth, and this is the critical step, the expanded control apparatus generates new moral justifications for its own existence. Now that we have this data, look at what we can prevent. Now that we can see these patterns, it would be irresponsible not to act on them. The tool creates the moral argument for the tool.
The loop closes. Control produces the moral framework that justifies the next expansion of control. Each turn of the cycle feels reasonable in isolation. In aggregate, the ratchet only turns one way.
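The one-way character of the loop can be sketched as a toy simulation. Everything here is an illustrative assumption, not a measurement: the rates, the functional form, and the names are invented. The only point the sketch makes is the one in the text, that each turn looks modest in isolation while the aggregate only moves one way.

```python
# Toy model of the control ratchet described above. Each turn converts
# accumulated data into the justification for collecting more.
# All parameters are illustrative assumptions, not empirical values.

def ratchet(turns: int) -> list[float]:
    privacy = 1.0   # 1.0 = unobserved, 0.0 = totally observed
    data = 0.0      # accumulated observational capacity
    trace = []
    for _ in range(turns):
        # Steps 1 and 4: the existing data stock generates the moral
        # case for the next measure (the tool argues for the tool).
        justification = 0.1 + 0.2 * data
        # Step 2: the moral framing makes society accept less privacy.
        privacy = max(privacy * (1.0 - justification), 0.0)
        # Step 3: less privacy means more data, hence more surface area.
        data = 1.0 - privacy
        trace.append(privacy)
    return trace

trace = ratchet(6)
# Each step shrinks privacy by a modest fraction, yet the sequence
# never recovers: the ratchet turns one way.
assert all(later <= earlier for earlier, later in zip(trace, trace[1:]))
```

Nothing in the model allows privacy to increase, because nothing in the described loop ever deletes data; that asymmetry, not any particular rate, is the ratchet.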
How the Ratchet Works in Practice
The examples are not theoretical.
The push for encryption backdoors follows the pattern precisely. The moral story is child safety, the most unassailable justification available. No one who argues for end-to-end encryption wants to be positioned as indifferent to the exploitation of children. The framing is designed to make the privacy position morally untenable in public discourse. But a door does not know who is walking through it. A backdoor built for one purpose is a backdoor available for all purposes. The technical reality does not matter. The moral story does.
In financial systems, the pattern is KYC and AML regulation. The moral story is preventing money laundering and terrorism financing. The practical effect is that every person on earth who wants to participate in the financial system must first prove their identity to an intermediary, who records every transaction, indefinitely. The compliance architecture was built to catch criminals. It surveils everyone. In the United States, fewer than 1% of Suspicious Activity Reports lead to any law enforcement action. The system watches everyone to occasionally catch someone. That ratio does not get discussed.
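The ratio is worth working out. The back-of-the-envelope sketch below uses an assumed annual report volume (a rough order of magnitude, not an official figure) and simply takes the text's "fewer than 1%" at its upper bound:

```python
# Back-of-the-envelope base-rate arithmetic for dragnet monitoring.
# Assumptions: ~4 million Suspicious Activity Reports filed per year
# (order of magnitude only), with a generous 1% of them leading to
# any law enforcement action.
reports_per_year = 4_000_000
action_rate = 0.01  # "fewer than 1%" rounded up to a full 1%

acted_on = int(reports_per_year * action_rate)
filed_and_forgotten = reports_per_year - acted_on

print(f"Reports acted on per year:   {acted_on:>10,}")
print(f"Reports filed and forgotten: {filed_and_forgotten:>10,}")
print(f"Reports per useful report:   {reports_per_year // acted_on:>10,}")
```

Even with the rate rounded up, a hundred reports are filed, and a hundred people observed, for every report that leads anywhere.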
A merchant opens a business account. The bank requires identity documents, proof of address, descriptions of expected transaction volume and types, and ongoing monitoring of every payment received. If the merchant sells legal goods to willing buyers and violates no law, the surveillance continues anyway. The system does not watch you because you are suspected of something. It watches you so it can suspect you of something later if it needs to. The moral story, preventing financial crime, justifies a permanent condition of observation applied to everyone, not a targeted investigation applied to the few.
The Language Is Load-Bearing
The EU said safety. So has every announcement of new digital control architecture in the past decade: safety, compliance, responsibility, transparency. The words are not chosen to describe. They are chosen to preempt objection.
Safety. Who argues against safety? The word does not mean the absence of danger. It means the presence of monitoring. When a platform says it is making the community safer, it means it has expanded its capacity to observe, classify, and remove. Safety is the word that converts surveillance into a gift.
Compliance. The word contains its own argument. To comply is to meet a standard. The standard is presented as external and objective, like a law of physics. But compliance standards are authored by the same entities that profit from them. The compliance industry does not serve a moral framework. It is a moral framework. One that generates revenue for every institution that participates in maintaining it.
Responsibility. This is the word that gets aimed at anyone who builds infrastructure that does not collect data. You are being irresponsible. You are enabling bad actors. The framing assumes that the default state of a system is total visibility, and that reducing visibility is an active choice to enable harm. It reverses the burden. You are not required to justify watching everyone. You are required to justify not watching.
Transparency. When aimed at institutions, the word means accountability. When aimed at individuals, it means exposure. Notice who gets asked to be transparent. It is rarely the entity making the rules. It is the person subject to them. Transparency, in practice, flows upward from the governed to the governor. The governor calls this accountability. It is actually submission.
Each of these words does the same thing. It takes a control mechanism and gives it the texture of a value. You are not pushing back against a system. You are pushing back against safety, against responsibility, against transparency. And now you are the problem.
Why This Is Harder to Resist Than Religion
The old moral authority had a specific vulnerability: it was explicitly metaphysical. It required faith. You could reject the premises. You could decide you did not believe in a god who tracked your sins, and the system lost its claim on you. Millions did exactly that. Secularism was, in a real sense, the act of stepping outside the framework.
There is no outside the new framework.
The new moral authority does not ask you to believe. It asks you to comply. It does not invoke the supernatural. It invokes data, risk models, and algorithmic assessments. These carry the authority of objectivity. They feel like facts rather than claims. The priest needed you to accept a cosmology. The compliance system just needs your ID.
Worse, the new framework is distributed. There is no pope to challenge, no council to petition. The moral authority is embedded in terms of service, in payment processing rules, in content algorithms, in credit scoring models. It is everywhere and nowhere. It operates through infrastructure rather than doctrine, which makes it feel less like authority and more like the way things simply are.
When control is embedded in infrastructure, resistance looks like inconvenience at best and deviancy at worst. You are not rebelling against a belief system. You are failing to comply with a process. And processes do not have arguments with you. They just exclude you.
Havel described this in 1978 and called it post-totalitarianism. The word meant a specific thing: a system whose authority runs not through force but through the moral vocabulary of the people inside it. The greengrocer hangs the slogan in his window and is not asked to believe it. He is asked to participate in the ritual by which the system borrows his language back from him. The same arrangement arrives now by payment rail rather than by Party.
The Moral Story Writes Itself Now
The feedback loop has reached a point where the system generates its own moral justification faster than any institution could.
Social media platforms observe behavior across billions of interactions, and each new observation generates a new category of harm that justifies more observation. New forms of speech are identified as dangerous. New transaction patterns are flagged as suspicious. New behaviors are classified as risky. Each classification is a moral judgment dressed in technical language. Each creates the case for the next expansion.
The speed matters. When a crisis emerges (a shooting, a financial scandal, a public outrage), the moral demand for more control arrives within hours. The infrastructure to deliver it already exists. The expansion happens before the deliberation. And the deliberation, when it comes at all, faces a system that has already normalized the new boundary.
No prior system of moral control operated at this speed. The church took decades to shift doctrine. Legislatures take years. The algorithmic moral framework updates continuously, and each update becomes the new default.
A test: if a moral principle, fully implemented, reliably expands the authority of the institution promoting it, the principle is structural. A control mechanism wearing the language of values.
What Breaks the Loop
If the feedback loop runs on the surrender of privacy in the name of virtue, the circuit breaker is infrastructure that does not require that surrender.
Not privacy as a user preference, or as a setting you can toggle, but privacy as an architectural default. Systems where the data is not collected in the first place, where observation is not possible without specific, justified cause, where the ratchet has nothing to turn.
This is why the debate about privacy tools is never really about privacy tools. It is about whether the feedback loop has an off switch. Every system that collects data by default is a system that will eventually find a moral reason to use it. The only reliable way to prevent the misuse of data is to not have it.
The warrant system understood this. You do not get to search the house first and justify it later. The justification must precede the intrusion. That principle, applied to digital infrastructure, to payment systems, to communication networks, is the structural answer to the feedback loop: an architecture that does not require trust in the goodness of the people running the system, because trust in their goodness is no longer the load-bearing element.
Reframed that way, the live question stops being whether you trust the people currently holding the keys, and becomes whether you want a system where the keys exist at all.
A picture of the loop with nothing to turn looks like this. A buyer pays a merchant. The payment settles. No intermediary records the buyer’s identity. No compliance system assigns a risk score. No moral vocabulary is required because no judgment is being made. The transaction stays a transaction. Neither a confession nor an application for permission nor a data point in someone else’s model of who you are. The architecture has not collected what the ratchet would need.
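The same contrast can be put in terms of a data model. The types below are hypothetical, a sketch of the architectural point rather than any real payment system: privacy by default means the fields the ratchet would need are absent from the schema, not merely hidden behind a policy.

```python
from dataclasses import dataclass, fields

# A settlement record with nothing for the ratchet to turn: the
# identity, score, and retention fields simply do not exist. No
# future policy can "unlock" data that was never collected.
@dataclass(frozen=True)
class Settlement:
    amount: int    # value transferred, in minor units
    asset: str     # e.g. "EUR"
    proof: bytes   # evidence the payment is valid, not who made it

# For contrast, a surveillance-ready record: every field beyond the
# first two exists to serve future judgment, not present settlement.
@dataclass
class MonitoredPayment:
    amount: int
    asset: str
    payer_identity: str   # KYC linkage
    payee_identity: str
    risk_score: float     # a moral judgment in technical dress
    retained_until: str   # in practice, "indefinitely"

# The difference is not a privacy setting; it is the schema itself.
assert {f.name for f in fields(Settlement)} == {"amount", "asset", "proof"}
```

A toggleable privacy setting lives in the MonitoredPayment world. The Settlement world is the one where the feedback loop has nothing to collect.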
Privacy by default reads in this light less as a political position than as an engineering decision: the one that keeps the loop from closing in the first place.[3]
Notes

[1] European Commission announcement of its EU Digital Age Verification App, April 15, 2026; reported in Bloomberg the same day: https://www.bloomberg.com/news/newsletters/2026-04-15/eu-tries-to-rein-in-social-media-giants-with-new-age-verification-app. Commission President Ursula von der Leyen drew an explicit parallel to COVID-era travel passes. The full case, including security researcher Paul Moore’s 48-hour bypass demonstration, is documented in The Receipts.

[2] The Times, in April 2025, published freedom-of-information data showing that more than 12,000 people were arrested in the UK in 2023 under section 127 of the Communications Act 2003 and section 1 of the Malicious Communications Act 1988, a rate that had more than doubled since 2017. Big Brother Watch has tracked the trend across UK police forces.

[3] Timothy Leary, “Think for Yourself, Question Authority” (1991): in the transition from industrial to post-industrial information society, thinking for yourself stops being a personal pleasure and becomes a duty, precisely because the apparatus that governs an information society runs on its absence. https://www.youtube.com/watch?v=mfqRPfhxUdc