Computers, Privacy & the Constitution
-- ZoeyYichenLiu - 28 Feb 2024

The Illusion of Choice in Human Adaptation for Technology's Sake

Given the increasingly powerful digital surveillance systems developed over the past few decades, people today face an illusion of choice: to function as members of society, they must submit to this policing. To go from point A to point B, or to make a purchase, each movement creates an unavoidable data point of recorded evidence, showcasing the parasitic role technology plays in modern life. Cases like Kyllo v. United States (2001) highlight how even in the privacy of one’s own home, technology can infringe on Fourth Amendment rights. The Supreme Court ruled that police use of a thermal imaging device to detect heat patterns inside a home constitutes a “search,” presumptively requiring a warrant. This demonstrates how advancing surveillance tools used by law enforcement encroach upon privacy expectations in ways the framers of the Fourth Amendment never imagined.

In times of challenge, humans adapt. Whether it is wearing face masks to evade facial recognition software, using encryption apps to secure communications, or leaving personal devices at home to avoid location tracking, individuals take measures to protect their privacy. Yet we have not answered the question of how much adaptation we must accept before we are no longer ourselves. As each individual strives to adapt, the values embedded in our democratic ideals remain static. The Fourth Amendment was written in an era before advanced telemetry and surveillance were possible, yet we strain to apply it in today’s digital age. This creates an inherent tension, as landmark cases like Carpenter v. United States (2018) demonstrate. There, the Supreme Court worked to redefine what constitutes a “search,” ruling that police generally need a warrant before accessing cell phone location records that reveal the intimate details of a person’s movements and activity.

Why must we be expected to adapt our behavior so drastically when the language of our founding documents remains static? How is it that we are held to higher standards as individuals than the products we have created that now dictate our rights? As Carpenter v. United States (2018) illustrates, the onus remains largely upon individuals to protect their privacy through technical knowledge, ingenuity, and effort in order to assert rights supposedly guaranteed under an unchanging Constitution. This imbalance seems profoundly unjust as technology reshapes society.

At its core, technology encompasses any human-made object created to accomplish a task. As Herman and Chomsky argued in Manufacturing Consent (1988), people have been told that these innovations were developed to solve our problems and make our lives better. Yet just as the meanings of constitutional language have shifted over time, the definition of technology is vulnerable to time as well. As we have seen with facial recognition software and digital surveillance systems, technology today, and the systems of power it has amplified, imposes more and more constraints that force humans to adapt.

Until there is a broader consensus on this change in the definition and capabilities of technology, humans cannot fully grasp the implications of their behavioral adaptation, its sources, or the unconscious consent manufactured to serve technological advances. Cases like United States v. Jones (2012), in which police attached a GPS tracking device to a suspect’s car without a warrant, demonstrate how most people remain oblivious to how advancing technology affects their rights. The courts attempt to balance privacy rights against law enforcement power, yet most citizens do not grasp how surveillance has already altered their behavior. This lack of perspective persists because the legal system evolves more slowly than scientific innovation, while public awareness lags even further behind everyday behavior. Ultimately this uneven pace cannot sustain a just, democratic system. Courts must move faster to apply existing rights and privacy protections to emerging technologies, rather than leaving individuals to bear the burden.

Facing this bleak challenge, several options remain for each of us.

Advocate for privacy protections and limits on surveillance technologies. Support digital rights organizations working to advance privacy laws and public awareness. Though the law may lag behind, grassroots efforts can influence policymakers over time.

Litigate strategic test cases to establish new privacy precedents. Impact litigation groups can bring focused legal challenges around new technologies, setting vital court precedents in the process. Though piecemeal, such cases can slowly modernize privacy rights.

Demand accountability from tech companies. Lobby social media platforms, device makers, app developers, and other tech firms to make privacy and ethics central to new products. The industry itself could prevent many emerging issues through self-regulation.

Foster public understanding and debate. The more everyday citizens discuss technology's societal impact, the more momentum builds for updating rights and rules. Improved technological literacy and ethical reflection are pivotal.

In short: lawsuits, activism, corporate responsibility campaigns, and public education. With corruption and corporate-funded lobbying in mind, this multi-pronged approach beyond government change could help balance the scales between innovation and regulation. It offers hope that as technology progresses, human rights need not perish in the digital age.
