Law in the Internet Society

Google's Algorithmic Cat and Mouse Game: The Case against Greater Transparency

FarayiMafoti - First Paper - 17 Dec 2011


GOOGLE, GIVE US A PEEK

Google Inc. has not cooked its search results to favor its own products and listings, Executive Chairman Eric Schmidt told a U.S. Senate hearing examining whether the search giant abuses its power. Members of the Senate Judiciary Committee's antitrust panel said last September that Google had grown into a dominant and potentially anti-competitive force on the Internet. The hearing should come as no surprise to anyone who has followed Google's ongoing squabbles with the FTC and the European Commission. Practically every player in the digital economy is gunning for Google these days, with some accusing it of operating a "black box" algorithm that lacks transparency or accountability. Others say Google stacks the deck against rival services, such as maps or shopping services, when it displays its own affiliated sites or content prominently in search results.

THE ARGUMENT FOR TRANSPARENCY

"Search neutralists," as they call themselves, articulate their argument against Google as follows: if search engines have become the undisputed gateway to the Internet, and are now arguably as essential a component of its infrastructure as the underlying network itself, does that not create a basis for demanding algorithmic transparency? And given that Google, the overwhelmingly dominant search engine, can apparently assert full and undisclosed editorial control over what content you see and what you don't, does that control not endanger the fundamental openness of the Internet?

THE ARGUMENT FOR TRANSPARENCY WILL BE IGNORED BY THE COMMON HERD

Set aside transparency's obvious problems of execution: (1) the more transparent the algorithm, the more vulnerable it is to gaming by spammers and, worse, the greater the chance of the algorithm being rendered useless; and (2) even if the algorithm were made transparent to regulators, regulators are unlikely to adapt fast enough to keep pace with innovation. Even then, the concept is worthwhile only to prosumers, not to consumers, and it is vital to remember that antitrust law, at least in theory, is supposed to protect consumers. All consumers see is the supposedly objective final results, not the intervention by the gatekeeper. Unless the manipulation is drastic (i.e., no relevant result appears), corrupted results are an "unknown unknown," and so no one cares. People will continue to treat search as a credence good, whose value is difficult to determine even after consumption.

A PROSUMER-INSPIRED SEARCH ARCHITECTURE

Transparency's relation to prosumers: the prosumer campaigns for a system that allows a visitor to conduct any or all of three types of search task: developing information, comparing options, and finding where to execute transactions. Algorithmic opacity would not suit the prosumer, because the prosumer (the active, tech-savvy customer who gathers information from digital media and who interprets and influences mass consumers' lifestyle and brand choices) desires greater facility with the technology in order to engage critically with it and to collaborate with others. Collaboration, or federation, is valuable to the prosumer. Currently, web search engines like Google function as weak federation mechanisms, either by returning relevant web pages for user queries or via directories of related sites. A federated architecture, by contrast, would offer a single point of entry through which users could employ specific applications optimized for their searches. To be clear, the emerging paradigm combines a multi-domain query approach with the integration of heterogeneous data sources capable of scouring the deep Web.
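The single-point-of-entry idea can be sketched in a few lines. The code below is a toy illustration only, not any real system: the domain engines, their names, and the scores are all invented for the example; it shows how one entry point fans a query out to domain-specific engines and merges their results into a single ranking.

```python
from typing import Callable

# Hypothetical domain-specific engines; each maps a query
# to a list of (result, relevance score) pairs.
def shopping_engine(query: str) -> list[tuple[str, float]]:
    return [(f"shop:{query}-deal", 0.9), (f"shop:{query}-used", 0.4)]

def maps_engine(query: str) -> list[tuple[str, float]]:
    return [(f"map:{query}-nearby", 0.7)]

ENGINES: dict[str, Callable[[str], list[tuple[str, float]]]] = {
    "shopping": shopping_engine,
    "maps": maps_engine,
}

def federated_search(query: str, domains: list[str]) -> list[tuple[str, float]]:
    """Single point of entry: fan the query out to each requested
    domain engine, then merge everything into one ranking by score."""
    merged: list[tuple[str, float]] = []
    for domain in domains:
        merged.extend(ENGINES[domain](query))
    return sorted(merged, key=lambda pair: pair[1], reverse=True)

print(federated_search("coffee", ["shopping", "maps"]))
```

The user issues one query; the dispatcher, not the user, decides which optimized back ends to consult, which is the "weak federation" of today's general-purpose engines made explicit.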

FOOD FOR THOUGHT: ONE IMPLICATION OF A FEDERATED SEARCH ARCHITECTURE

Conceptualizing a prosumer-ideal search architecture, or, as Professor Moglen puts it, "a system of federated search technology, in which we all do searching for one another in some fast and efficient manner," can prove difficult for a number of reasons, not least because it would require a revenue mechanism different from the "pay-per-click" method we are accustomed to. Existing revenue-sharing agreements between search engines and publishers, where each receives a fixed share of the profit, are no longer feasible. Consider Google's model: once a user clicks on a sponsored link, the search engine receives payment from the corresponding advertiser and passes part of it to the publisher. The search engine's payment ratio is defined by a commercial contract that exists independently of any specific search. In federated search, however, the contracts between the publisher and the domain-specific search engines must account for the fact that each engine plays a role in generating the search results. For there to be a disciplined way to estimate the search value contributed by each domain-specific engine, monetization must be performed after the ranking. This would help avoid gamesmanship among the domain-specific engines (an engine may want to bid strongly, or lower its bid, on a query for purely economic reasons).
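One way to picture "monetization after the ranking" is a settlement step that runs only once the merged ranking is fixed, so that an engine's bid cannot buy placement. The function below is a hypothetical sketch: the position-discounted credit rule is an assumption chosen purely for illustration, not a scheme the essay proposes.

```python
def settle_after_ranking(final_ranking: list[tuple[str, str]],
                         click_value: float) -> dict[str, float]:
    """Post-ranking monetization sketch: each domain engine's revenue
    share is computed from the positions its results actually earned
    in the final merged ranking, not from any pre-ranking bid.

    final_ranking: (engine, result) pairs in final rank order.
    Returns each engine's fraction of the total payout.
    """
    credits: dict[str, float] = {}
    for position, (engine, _result) in enumerate(final_ranking, start=1):
        # Assumed credit rule: higher placements earn more (1/position).
        credits[engine] = credits.get(engine, 0.0) + click_value / position

    total = sum(credits.values())
    return {engine: credit / total for engine, credit in credits.items()}

ranking = [("shopping", "deal"), ("maps", "nearby"), ("shopping", "used")]
print(settle_after_ranking(ranking, click_value=1.0))
```

Because the split is a pure function of the published ranking, an engine gains nothing by inflating or deflating a bid on a given query: the only way to earn more is to contribute results that rank well.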

 


