Law in Contemporary Society

Privacy, Fascism, and Law: How Surveillance Chills Collective Action

-- By CiarraLee - 25 May 2025

Introduction: The Dystopian Reality

I consume a lot of science fiction entertainment, and I’ve come to see it as a lens for understanding our current world. Black Mirror is a popular dystopian series that explores the societal impact of advanced technologies through chilling and emotional stories. In “Joan is Awful,” we see a woman’s private life become a form of corporate entertainment, all because she clicked “agree” on a streaming service’s terms. In “Hated in the Nation,” state-sponsored surveillance technology backdoors are co-opted to commit murder. These stories show that privacy is treated as expendable by unchecked institutions, and that mass surveillance is not only technically possible, but legally and culturally accepted. Many of the legal questions that animate these plots are issues I want to explore in my legal career.

Surveillance by powerful institutions reshapes behavior, weakens dissent, and clears a path for authoritarian control. This is not just theoretical or dystopian. It is happening on the streets of New York, on campuses like Columbia, and in the systems that law enforcement uses every day. As a future lawyer, I feel the urgency of developing the legal, technical, and social tools needed to fight back. This paper explores the real-world mechanics of surveillance and its authoritarian implications, considers Columbia’s response to student protest as a case study, and charts the education, network, and skills I must acquire to do this work.

Relationship Between Surveillance and Fascism

Historical and Current Practices

Surveillance has long been a tool of authoritarian regimes. In East Germany, the Stasi monitored citizens through widespread informant networks, eroding trust between neighbors and within families. In Nazi Germany and Fascist Italy, surveillance was a central element of the state’s identity. The Gestapo and other agencies created a culture of total visibility, where even loyalty to the regime did not insulate individuals from being watched. The Italian fascist regime similarly used informant networks and moral surveillance to discipline women, sexual minorities, and working-class spaces. Both regimes employed surveillance to impose ideological conformity, racial and gender norms, and nationalistic obedience, using fear to reshape political activity and private life (Wright, 2018).

Contemporary democracies often reproduce these dynamics under new names and technologies. In the United States, the post-9/11 expansion of domestic surveillance under the USA PATRIOT Act normalized warrantless data collection under the guise of national security. Today, police departments like the NYPD and NOPD deploy facial recognition, drones, and cell-site simulators to monitor protestors, disproportionately targeting communities of color. When Amnesty International requested public records about these practices, the NYPD refused to comply, prompting litigation. The government’s defense of mass surveillance, even in the face of legal obligations, underscores the growing normalization of secrecy and state power over individual autonomy. In February 2025, Google quietly deleted its 2018 pledge not to develop AI for “weapons or surveillance beyond international norms,” framing the change as necessary to its partnerships with companies, governments, and organizations that share democratic values.

Case Study: Columbia’s Surveillance and Protest Suppression

At Columbia, similar dynamics are playing out in real time. In response to student protests, the university has implemented draconian security measures, including increased police presence, restrictions on building access, and tracking of student movements via ID swipes. Students have been arrested, suspended, or placed under investigation for participating in demonstrations. These disciplinary responses are framed as efforts to ensure campus safety, but in practice, they mirror state-level efforts to suppress dissent. The university’s surveillance produces a chilling effect that discourages students from organizing or speaking out. Columbia replicates the logic of the surveillance state in microcosm: using technology and ideological appeals to manage and punish political activity.

The EU’s data protection laws seem strong but lack clear prohibitions on biometric surveillance, deferring regulation to national governments that often fail to act (De Hert & Bouchagiar, 2022). This enables widespread use of facial recognition in policing and border control. Columbia’s surveillance regime follows a similar logic: broad discretion, little transparency, and appeals to order. In both cases, bureaucratic language and legal ambiguity mask the normalization of surveillance and the suppression of dissent under the guise of safety.

Defining My Practice

To meaningfully engage in legal work that challenges mass surveillance, I need to sharpen both my technical literacy and my understanding of policy. At Cornell, I studied Information Science, which equipped me to build fintech products for three years. I should build on those technical skills by continuing to learn from data scientists, engineers, and other cyber professionals who can help me understand how surveillance tools actually work and evaluate legal claims about their functionality and harms. That includes engaging directly with companies like Google and Ambient.ai that are actually building these tools.

I also need to continue to expand my network at the intersection of technology and justice. Currently, I am a junior board member at the Surveillance Technology Oversight Project. This organization engages in strategic litigation, legislative advocacy, and public education to protect people’s privacy rights in New York (and it has litigated some of the matters discussed above).

Finally, it would behoove me to also connect with government agencies and state and local law enforcement to learn their internal processes and current use cases for new technologies.

Conclusion

From local police departments evading transparency laws, to national law enforcement adopting AI tools without public debate, to tech giants abandoning ethical guardrails under pressure from outside stakeholders, these examples reveal a broad and accelerating erosion of privacy. As a lawyer, it is imperative that I understand both the technologies at stake and the communities most affected so that I can use the law to fight back against these surveillance regimes.

Sources

S.T.O.P. Initiatives Referenced

I think the best route to improvement is less Black Mirror and more real world. Plot summaries from screenplays don't help you decide what you need to learn about and what sort of network you are building in order to pursue the practice that interests you, but you do know what that is, and you are ready to start planning.

You don't say or show much here about your present levels of technical understanding. You should decide where they need to be strengthened and get the education around here that will help you. There are still way too many lawyers who don't actually understand the technology and too many technologists who think if they're smart they can know law by logic, and together they gunk up the conversation and cause lots of foolishness, not to mention harm. You want to meet people on both the light side and the dark one, and learn the conversational styles of both the rebels and the Empire. You need some history. A good shopping list will be very valuable for you.

A bunch of what you need I can teach you, but I'll be away until your third year, so what you do next year should gain for you in other ways. You're right that you'll never be far from the issues around here. But keeping them firmly in sight despite all the distracting confetti is still not going to be easy. Stick to it.



r5 - 26 May 2025 - 04:48:47 - CiarraLee