Computers, Privacy & the Constitution

Searching for Truth: The Illusion of Search Engines

-- By HafsahHanif - 01 Mar 2024

Introduction

With approximately seventy percent of the global population using the internet, search engines hold immense power in today’s society. Search engines, however, are merely the product of algorithms and computations that can be manipulated by their programmers and users alike. And although search engines are highly susceptible to manipulation and the dissemination of misinformation, their creators often evade responsibility for the harm they cause because the First Amendment protects them.

Search Engines Are Powerful but Dangerous Tools

Search engines are no longer––if they ever were––analogous to a library that stores fact-checked or credible sources. They have become tools for the propagation of propaganda and misinformation, catering to the highest bidder and curating results to maximize user engagement. While users may have a duty to verify the results a search engine provides, that duty does not excuse search engines from responsibility for the damage and violence their results may cause. When the burden of doing the heavy lifting falls on the user and never on the service provider, we set a dangerous precedent in which accountability is conveniently shifted away from those who have the power to control the dissemination of information. By absolving search engines of responsibility for the consequences of their algorithms, we effectively allow them to operate with impunity, prioritizing profit over truth and social well-being. This not only undermines legal principles of accountability but also fuels the proliferation of misinformation and divisive content.

The case of convicted murderer Dylann Roof underscores the power of online propaganda to shape extremist ideologies and incite violence. Trying to make sense of the Trayvon Martin case, Roof searched “black on White crime” on Google. The first result he saw came from the white nationalist Council of Conservative Citizens; the views he encountered there ultimately motivated his plan to massacre Black civilians in an attempt to start a race war.

Roof’s case raises the question of why and how such extremist propaganda and unreliable information were not only disseminated but also ranked highly by Google’s search engine. Many believe that Google’s ranking is an indicator of reliability and credibility; indeed, research has shown that people often look only at the first page of results a search engine returns. In reality, however, the top-ranking sites can be manipulated or curated to reflect the biases of the user. These sites may belong to those who pay premium prices to appear higher on the results page, or to those who have gamed the algorithm through search engine optimization tactics. Moreover, the algorithms underlying search engines are built on collected data and web history. Using this information, a search engine learns the user’s preferences and then directs the user toward content that confirms or reaffirms their beliefs and biases.
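To make this feedback loop concrete, consider the following Python sketch of a personalization layer that re-ranks results toward topics a user has clicked before. It is a toy model, not Google’s actual system: the data structures, the affinity score, and the bias_weight parameter are all invented for illustration.

    from collections import Counter

    def personalized_rank(results, click_history, bias_weight=0.5):
        """Blend a base relevance score with the user's topical affinity.

        Toy model: real ranking systems combine hundreds of signals."""
        # Count how often the user has clicked pages carrying each topic tag.
        topic_clicks = Counter(
            topic for page in click_history for topic in page["topics"]
        )
        total_clicks = sum(topic_clicks.values()) or 1

        def score(result):
            base = result["relevance"]  # engine-assigned relevance, 0 to 1
            # Affinity: share of past clicks matching this result's topics.
            affinity = sum(topic_clicks[t] for t in result["topics"]) / total_clicks
            return (1 - bias_weight) * base + bias_weight * affinity

        return sorted(results, key=score, reverse=True)

Even in this simplified form, any nonzero bias_weight pushes familiar pages upward: a user whose click history leans toward one narrative will see that narrative ranked first, regardless of its reliability.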

First Amendment Protection

Despite the dangers they pose, search engines continue to enjoy protection under the First Amendment, a notion that has gained judicial support in recent years even though Google’s search algorithm has historically “fail[ed] to find the middle ground between protecting free speech and protecting its users from misinformation.” The sentiment underlying this protection is that search results reflect the “individual editorial choices” of the company, a form of speech that the First Amendment protects. By “select[ing] what information it presents and how it presents it,” Google exercises its right to free speech much as a newspaper or forum editor does, free from government interference.

But this argument is inherently flawed. The information presented by Google’s search engine does not necessarily reflect the company’s “editorial choices”; rather, an algorithm developed by Google but designed to learn from user engagement often makes those “editorial choices” when determining which results to return for a given query. As deeply flawed as human beings are, algorithms are not only more susceptible to error but also lack judgment and the ability to discern right from wrong. While programmers may build in safeguards that train algorithms to detect unreliable information, those protections are ultimately easy to circumvent. By extending First Amendment protection to an artificially intelligent algorithm, courts have diminished the core principles of free speech and eroded human agency and accountability in its exercise.

Privacy Concerns

Search engines also raise concerns about individual privacy. Companies such as Google, which have built their brand around the prowess of their search engine, meet users’ demands by curating results tailored to individual preferences and search histories. While users believe the search engine is providing information useful to them, in reality the algorithm is “steer[ing] them toward content that confirms their likes, dislikes and biases.”

In the most extreme cases, such reinforcement can contribute to radicalization, as in the case of Dylann Roof, who became immersed in a spiral of extremist ideologies. More often, however, the concern with collecting web history and data is that search engines sell this information to advertisers and other third parties for profit. Google uses personal data in two ways: “It uses data to build individual profiles with demographics and interests, then lets advertisers target groups of people based on those traits,” and “[i]t shares data with advertisers directly and asks them to bid on individual ads.” In both instances, user privacy is diminished as users are directed toward specific websites, undermining individual autonomy in service of a data-driven system.
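A minimal sketch, again in Python with invented data structures, of the two uses described in the quotation: aggregating browsing events into an interest profile, then matching advertisers’ targeting criteria against that profile. Nothing here reflects Google’s actual implementation; the field names and the min_overlap threshold are hypothetical.

    from collections import Counter

    def build_profile(user_events):
        """Aggregate raw browsing events into an interest profile (toy model)."""
        profile = {"interests": Counter(), "demographics": {}}
        for event in user_events:
            for tag in event.get("tags", []):
                profile["interests"][tag] += 1
            # Inferred traits (age bracket, location, etc.) overwrite older guesses.
            profile["demographics"].update(event.get("inferred_demographics", {}))
        return profile

    def matching_campaigns(profile, campaigns, min_overlap=2):
        """Return the ad campaigns whose targeting criteria fit the profile."""
        matches = []
        for campaign in campaigns:
            overlap = sum(
                1 for tag in campaign["target_interests"]
                if profile["interests"][tag] > 0
            )
            if overlap >= min_overlap:
                matches.append(campaign["name"])
        return matches

The point of the sketch is how little the user controls: every browsing event silently enriches the profile, and the profile, not the user, determines which advertisers gain access.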

Conclusion

As gatekeepers of digital knowledge, search engines wield significant power in shaping societies and influencing decisions. They are often perceived as endless libraries containing a wealth of reliable information; in reality, they are heavily manipulated systems that influence human behavior, often without detection. By allowing search engines and internet companies to evade responsibility, we risk undermining the integrity of online information ecosystems and compromising the privacy and rights of users.


