Wikis allow anyone with access to a computer to contribute instantly and to collaborate simultaneously with a community of any size, all with very little effort. That ease is key: wikis work largely because they are easy to access and change. Minimizing barriers to contribution creates benefits like growth and refinement, but it also increases the risk of vandalism and other abuse. To thrive, a wiki's administrators must choose the level of openness that best balances these benefits and risks.

Consider Wikipedia. In 1999, Richard Stallman said that a web-based "free encyclopedia of all knowledge" could be collaboratively authored over a period of ten or twenty years. In [[http://en.wikipedia.org/wiki/Wikipedia_History#Formulation_of_the_concept][January 2001]], Jimmy Wales put Wikipedia online and it grew even faster than Stallman had anticipated. Stallman thought that an online encyclopedia would be written by teachers who contributed whole articles in their spare time at a rate of a couple per year. But Wikipedia's open editing policy created an even more impressive division of labor by allowing anyone to contribute to or edit any article.

Countless minor and some more notable abuses followed. The same openness that powered Wikipedia's growth made it an easy target. Some prominent wikis, such as wikiHow, continued to allow anonymous editing because it resulted in more robust content. Others, like Google's Knol, raised higher barriers to contribution by requiring users to register. Wikipedia ultimately left its anonymous editing policy mostly in place, although it has adopted a series of protective measures over time.

But it turns out that anonymous editing of Wikipedia is something of a myth. In reality, unregistered editing affords only a limited degree of anonymity, because Wikipedia logs contributors' IP addresses and then makes those logs public. In August 2007, Virgil Griffith's Wikiscanner project generated a great deal of attention by using Wikipedia's public logs to trace the IP addresses behind millions of edits, revealing the extent of changes made from within various companies, government agencies, and other organizations--sometimes with remarkable precision. Many people affiliated with Wikipedia, including Jimmy Wales, publicly voiced support for Wikiscanner and touted its potential to reduce abuse.
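
To give a sense of how this kind of IP-based attribution works, here is a minimal sketch in Python (not Wikiscanner's actual code). It asks the public MediaWiki API for recent anonymous edits, whose "user" field is simply the editing IP address, and checks whether each address falls inside a hypothetical organization's network range. The range below is a reserved documentation block, not any real organization's.

<verbatim>
import ipaddress
import requests

API = "https://en.wikipedia.org/w/api.php"
# Hypothetical organization range, for illustration only (reserved documentation block).
ORG_NETWORK = ipaddress.ip_network("203.0.113.0/24")

params = {
    "action": "query",
    "list": "recentchanges",
    "rcshow": "anon",                  # only edits attributed to an IP address
    "rcprop": "user|title|timestamp",  # for anonymous edits, "user" is the IP
    "rclimit": 50,
    "format": "json",
}

resp = requests.get(API, params=params, timeout=30)
resp.raise_for_status()

for change in resp.json()["query"]["recentchanges"]:
    try:
        ip = ipaddress.ip_address(change["user"])
    except ValueError:
        continue  # skip anything that is not an IP-attributed edit
    if ip in ORG_NETWORK:
        print(f'{change["timestamp"]}  {ip}  edited "{change["title"]}"')
</verbatim>

Wikiscanner's contribution was essentially to run this kind of lookup at scale, against a database mapping address ranges to the organizations that own them.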

Wikiscanner's debut highlighted the fact that fully anonymous editing is not a priority for Wikipedia. On the one hand, why should it be? A minimal level of accountability may help prevent vandalism, censorship, and bias. But, on the other hand, there are some people living under repressive governments--probably a very large number of people--who are deterred from contributing to Wikipedia by the lack of full anonymity. These people have reason to fear that if they contribute, their IP addresses could be tracked and made public and they could face persecution as a result. Without full anonymity, their contributions are lost.

One objection here is that such users can simply employ software that masks their IP addresses. This objection fails for two reasons. First, for average computer users, anonymizing software may be difficult to acquire or use. Second, even if they are able to acquire and use software that hides their IP addresses, Wikipedia is likely to reject their contributions precisely because they were made anonymously.

Wikipedia's policies frequently result in the blocking of contributions made through open or anonymizing proxies. For instance, users in mainland China who edit through the popular Tor anonymizing network will generally be blocked by Wikipedia. And even if a user registers an account through an open proxy, the resulting edits may still face [[http://en.wikipedia.org/wiki/Blocking_of_Wikipedia_in_mainland_China#Circumvention_of_the_block][administrative blocks]] because of their anonymous origins. There are ways around such blocking, but they may not be available to many average computer users.
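
As a rough sketch of the blocking side, suppose an operator simply compares editing IPs against the Tor Project's published bulk exit-node list (real MediaWiki installations use the TorBlock extension, which is more sophisticated than this):

<verbatim>
import requests

# Plain-text list of current Tor exit-node IPs, one per line, published by the Tor Project.
EXIT_LIST_URL = "https://check.torproject.org/torbulkexitlist"

def load_tor_exits():
    resp = requests.get(EXIT_LIST_URL, timeout=30)
    resp.raise_for_status()
    return {line.strip() for line in resp.text.splitlines() if line.strip()}

def is_tor_exit(ip, exits):
    return ip in exits

if __name__ == "__main__":
    exits = load_tor_exits()
    # Placeholder addresses from reserved documentation ranges, for illustration only.
    for candidate in ["203.0.113.7", "198.51.100.23"]:
        verdict = "would be refused" if is_tor_exit(candidate, exits) else "would be allowed"
        print(candidate, verdict)
</verbatim>

The point of the sketch is that such filters operate on the IP address alone: they cannot distinguish a vandal hiding behind Tor from a dissident in Beijing who has no other safe way to contribute.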

In refusing to allow fully anonymous contributions, Wikipedia has chosen to prevent abuse in a way that hampers the wiki's development by chilling contributions in countries with repressive governments. Is refusing to accept anonymous contributions and making users' IP addresses publicly available in an effort to decrease vandalism the best way of balancing the risks of abuse against the benefits of openness?

One reason to think that Wikipedia's course of action is a wise one is that full anonymity would have to be granted indiscriminately. The benefit of allowing those living under repressive governments to contribute anonymously could therefore be canceled out by those same governments' ability to make anonymous edits as well. Anonymity may encourage contributions by the repressed, but it would also invite propaganda and governmental abuse.

A counterargument is that even if anonymous editing were allowed, those making changes in good faith would outnumber the propagandists and vandals. Anonymity makes a wiki a truly neutral conduit for information. And because changes are accessible to all, this argument goes, users will be able to see both sides of the issue and decide for themselves.

I am inclined to accept this counterargument, but I recognize that there may be no universal solution. Nor does there have to be. Wikis are cheap to set up, maintain, and mirror. Why not experiment? One flavor of Wikipedia (or any other wiki) could vary the degree of anonymity available in proportion to the risks faced by contributors on a country-by-country basis. A second could allow truly anonymous edits. A third could use wikiscanning or a mandatory registration system to maximize transparency.

-- DavidHambrick - 13 Dec 2008

Is it ironic that the comment box at the bottom of this writing has been removed? I think so.

I wonder if one solution to the repressive government/China problem might be to have a truly anonymous option that would require checking (as apparently Wikipedia is doing in Germany) by some sort of trusted moderator before an edit appears. This could be reserved for people who have some need for anonymity (so pro-Coke entries from Coke's headquarters wouldn't be allowed, but anti-Cheney sentiments from Cuban bays would be). This would allow wikis some control over people like myself who wage a constant battle to have Carrot Top listed as Canadian, but allow entries from people who might really be at risk from non-anonymous posting.

-- HamiltonFalk - 17 Dec 2008

Thanks for this comment, Hamilton. I think the kind of moderated system you describe is an interesting idea. Wikileaks.org is one example of a wiki that has adopted such a system. But even with Wikileaks, which is relatively specialized, there seems to be a danger of the moderators being overwhelmed, and I can imagine this problem would only be magnified on a more general wiki. How can your solution be made scalable? How can moderators be recruited and screened effectively when there may also be a need to protect their anonymity?

-- DavidHambrick - 17 Dec 2008

 
