Smart Technologies in Authoritarian Hands: When Convenient Helpers Turn into Dangerous Informants

For years we have used digital technologies without taking the risks they pose to our freedom seriously. Smart devices, social networks, and connected services were seen as progress that makes everyday life easier. Abuse by companies, such as personalized advertising or data trading, was discussed, but it never seemed threatening enough to fundamentally change our behavior. That reality is shifting. With the rise of authoritarian movements, most visibly in the United States, it is becoming clear that the same technologies that promise comfort can, in the wrong hands, turn instantly into instruments of political control and repression.
Bruce Schneier, American cryptographer and security expert, described precisely this development in his recent article “Digital Threat Modeling Under Authoritarianism” in Lawfare. Schneier has been a leading voice in cybersecurity issues for decades and is known for his critical analysis of the intersections between technology, society, and politics. He has published books such as Data and Goliath and Secrets and Lies, and he teaches at Harvard Kennedy School. His core message: traditional threat modeling—that is, the systematic analysis of potential digital security risks—falls short if it ignores the political context.
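Threat modeling of the kind Schneier refers to can be made concrete with a small data structure. The sketch below is an illustrative assumption, not Schneier's own method: each threat is scored by likelihood and impact, and re-scoring the state actor's likelihood shows how a changed political context, rather than any new technology, reshuffles the ranking.

```python
from dataclasses import dataclass

@dataclass
class Threat:
    actor: str        # who could misuse the data
    asset: str        # what data is exposed
    likelihood: int   # 1 (rare) .. 5 (routine)
    impact: int       # 1 (nuisance) .. 5 (repression)

    def risk(self) -> int:
        return self.likelihood * self.impact

# Under democratic oversight, state misuse of data looks unlikely,
# so commercial nuisances dominate the model.
threats = [
    Threat("advertiser", "browsing profile", likelihood=5, impact=1),
    Threat("state agency", "location history", likelihood=1, impact=5),
]

def rerank(threats, state_likelihood):
    """Re-score state threats when the political context changes."""
    for t in threats:
        if t.actor == "state agency":
            t.likelihood = state_likelihood
    return sorted(threats, key=lambda t: t.risk(), reverse=True)

# In an authoritarian turn, the same data now tops the list.
ranked = rerank(threats, state_likelihood=4)
print([(t.actor, t.risk()) for t in ranked])
# [('state agency', 20), ('advertiser', 5)]
```

Note that nothing about the data itself changed between the two rankings; only the assumed behavior of one actor did, which is precisely Schneier's point about ignoring political context.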
From corporate databases to state surveillance
In democracies, state and commercial surveillance were long treated as distinct. On one side stood intelligence services, police, and other authorities that collected data within legal limits; on the other, companies like Google, Meta, or Amazon, which amassed data for advertising and their business models. Schneier shows how this separation is disappearing: authorities buy data from companies or access their systems directly.
His examples include Amazon Ring surveillance cameras, whose operator has cooperated with police departments, and Flock Safety, a provider of automated license plate recognition whose systems make it possible to build movement profiles of entire cities. Germany offers a parallel: police demanded access to the automatic license plate recognition cameras of the Toll Collect system as soon as they were installed, and when that failed, some German states illegally set up their own highway cameras and used them for mass license plate collection.
U.S. authorities like the Internal Revenue Service (IRS) or Homeland Security use growing data pools that are supplemented by commercial sources. These interfaces between state and corporate data collection create an infrastructure that may still seem controlled in a democracy but becomes highly dangerous in an authoritarian environment.
The underestimated danger
As long as democratic control mechanisms functioned, this merging of data seemed less problematic. But Schneier warns that such thinking is dangerously naive. Citizens in democracies have become accustomed to seeing surveillance and data abuse primarily as an economic issue. People feared spam, personalized advertising, or the resale of profile information—but not state repression. This has led to systematic underestimation of the risks posed by digital technologies.
Historical comparisons make this danger clearer. In East Germany, the Stasi used analog surveillance methods to control dissidents—an extremely effective system despite technological limitations. In China today, digital systems enable almost total control. The “Social Credit System” links economic, social, and state data to steer citizens into desired behaviors. What is seen as a “harmless” data pool in democracies can very quickly become the foundation of authoritarian control.
Mass surveillance is surveillance without specific targets. For most people, this is where the primary risks lie. Even if we’re not targeted by name, personal data could raise red flags, drawing unwanted scrutiny. — Bruce Schneier
Authoritarianism as a tipping point
The threat changes fundamentally when democratic barriers fall. In an authoritarian system, technical infrastructure merges with political power. The same data that today is used for advertising can tomorrow be used to identify dissidents, monitor movements, or put citizens under targeted pressure.
Schneier describes this transition as the core problem of threat modeling: what seems acceptable in a free state today can become a weapon against citizens in an authoritarian reality tomorrow. It is particularly critical that many citizens disclose their data voluntarily, through smart household devices, fitness trackers, or digital assistants, creating an everyday transparency that is extremely dangerous in authoritarian hands.
The danger is not theoretical. Developments in the United States show how quickly democratic structures can erode. Where rule of law and separation of powers are undermined, new power centers emerge that resort to digital surveillance tools. In Europe as well, extremist and authoritarian parties are gaining strength. This raises the question of whether societies are sufficiently prepared to protect their citizens from the misuse of smart technologies.
The risks are no longer just theoretical
The spread of connected devices makes these risks concrete. According to a 2024 Statista study, more than 350 million people worldwide use smart speakers such as Amazon Echo or Google Nest; in Germany alone, about a quarter of households have such an assistant. For connected surveillance cameras, global installations exceed 1 billion, according to figures from IHS Markit. Fitness trackers and smartwatches are mass products as well: in 2023, more than 500 million wearables were sold worldwide.
These devices constantly generate data streams—from movement profiles to communication habits to biometric values. Combined, they create a complete picture of daily life. While this is primarily an economic resource for companies, for authoritarian regimes it can become a surveillance tool of unprecedented scope.
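How separately innocuous streams combine into a complete picture can be sketched in a few lines. The records and field names below are invented for illustration; the point is only the mechanism: merging per-device logs by person and sorting by time yields a single day-in-the-life timeline.

```python
from collections import defaultdict

# Hypothetical per-device records; names and fields are invented.
speaker_log = [{"user": "alice", "time": "07:10", "query": "news"}]
tracker_log = [{"user": "alice", "time": "07:30", "gps": (52.52, 13.40)}]
camera_log  = [{"user": "alice", "time": "08:05", "location": "office entrance"}]

def fuse(*streams):
    """Merge independent device streams into one per-person timeline."""
    profile = defaultdict(list)
    for stream in streams:
        for record in stream:
            user = record["user"]
            # Keep everything except the join key.
            event = {k: v for k, v in record.items() if k != "user"}
            profile[user].append(event)
    for events in profile.values():
        events.sort(key=lambda e: e["time"])
    return dict(profile)

timeline = fuse(speaker_log, tracker_log, camera_log)
print(timeline["alice"])  # one ordered timeline from three devices
```

Each stream on its own reveals little; joined on a shared identifier, they reconstruct where a person was, when, and what they were doing, which is exactly the asymmetry between commercial convenience and state misuse described above.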
Yes, they are convenient. But are they safe in the long run?
The conclusion is clear: we must rethink our attitude toward smart technologies. Convenience and comfort can no longer be the only benchmarks. Citizens should understand that every collected dataset can be used not only for advertising but also for political repression—once the political framework changes. Schneier’s analysis is a wake-up call: digital threats arise not only from hackers or corporations but also from political developments.
Legislation also plays a key role here. Data protection laws such as the European General Data Protection Regulation (GDPR) provide protection as long as political institutions uphold them. But this protection can quickly vanish in authoritarian systems. Dependence on a few global platforms further worsens the situation: if data is centrally collected by major U.S. or Chinese companies, a political shift in one country is enough to put millions of citizens worldwide at risk.
Questioning our own behavior more critically
Schneier’s analysis shows that we can no longer distinguish between “harmless” corporate surveillance and “dangerous” state surveillance. The systems are already deeply intertwined. This fundamentally changes the threat landscape for citizens. Those who fail to question their behavior today may tomorrow become an unwitting part of a comprehensive control apparatus.
It is therefore urgently necessary for citizens, politics, and civil society to reassess the handling of smart technologies. This includes not only more conscious use but also the demand for transparency and clear rules to prevent data abuse. Only in this way can we prevent convenience and technological naivety from leading into an authoritarian reality.
Summary (tl;dr)
- Bruce Schneier analyzes in “Digital Threat Modeling Under Authoritarianism” the growing convergence of state and corporate surveillance.
- Smart technologies, long considered harmless, can in authoritarian regimes become tools of oppression.
- Citizens in democracies have so far underestimated the risks, as they were primarily perceived as an economic issue.
- Studies show: Over 350 million smart speakers, 1 billion surveillance cameras, and 500 million wearables generate massive amounts of data every day.
- A rethink in the handling of smart technology is necessary before authoritarian power structures are fully established.