Computers, Privacy & the Constitution

Note: I know this paper is somewhat misplaced within the wiki, but there's no folder for second papers yet. Also, I moved the conversation that was previously occurring here to RickSchwartzOtherIdeas. -- RickSchwartz Apr 25, 2009

Privacy ReMinder

Self Regulaxation

Privacy policies have been ineffectual restraints on data collection and use because of their inherent tension with the freedoms of contract and information gathering. The FTC only takes action against actual noncompliance with whatever terms a data collector chooses to set forth in its own privacy policy, and there is no private right of action for breach. This environment of self-regulation has encouraged the use of vague and equivocal language designed to avoid creating obligatory, and therefore enforceable, duties, while simultaneously impressing upon the unsophisticated consumer an appearance that consumer privacy is taken seriously. Perhaps over-optimistically, the fact that companies even nominally care about privacy might indicate that consumers would react adversely to an explicit disregard for the privacy of sensitive information. Though most trends suggest widespread user apathy about the collection and use of sensitive information, the recent uproar over Facebook's Terms of Service suggests that a critical mass of vocal and discontented users could incite some reform, or at least some superficial dialogue and attention on the issue. In communities reliant on their user bases, reform may be more substantial. For example, though the issue did not concern privacy, Digg's HD-DVD encryption key controversy suggests that a sufficiently large user revolt can induce an alternative course of operations.

Common Privacy

Some have suggested creating standardized privacy policies in a Privacy Commons, emulating the eminently accessible style of Creative Commons, that would create enforceable duties. Though this might increase the transparency of the policies used by a data collector that chooses to adopt such a privacy policy, transparency by itself would not incentivize broader adoption of robust privacy policies if people remain apathetic about the privacy options available to them or the impact of such policies. Without some effect on users' preferences, a Privacy Commons is unlikely to succeed, given the indifference to privacy concerns to which users are already accustomed.

Furthermore, self-selection would limit likely adopters of such policies to data collectors that would have given more protection in any event. Some suggest mandatory adoption of one of a range of standardized policies, but unless one of those policies allowed the flexibility in information collection and use that exists under the status quo, freedom of learning and thought would probably be undesirably impaired. Moreover, the multifaceted approach required to accommodate the myriad methods of data collection would make the creation of standardized and technologically relevant policies particularly difficult, especially when the technology will necessarily evolve faster than the policies.

Going even further, some would prefer the creation of machine-readable privacy policies combined with user-controllable metadata that would dictate acceptable information use and collection beforehand. An existing subscription list for AdBlock called EasyPrivacy and extensions like BetterPrivacy are the closest this idea has come to fruition, though they only block processes run by the user's browser, which covers only part of the scope of data miners' surveillance.
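As a minimal sketch of that preference-matching idea, assuming a made-up machine-readable policy format (no such standard is settled here, so every type and field name below is hypothetical), a user agent could compare a site's declared practices against the user's stated tolerances before continuing:

```typescript
// Hypothetical machine-readable policy and user-preference shapes;
// field names are illustrative, not any existing standard.
type PracticeLevel = "none" | "internal" | "third-party";

interface DeclaredPolicy {
  collectsBrowsingHistory: boolean;
  sharesData: PracticeLevel;
  retentionDays: number | null; // null = indefinite retention
}

interface UserPreferences {
  allowBrowsingHistory: boolean;
  maxSharing: PracticeLevel;
  maxRetentionDays: number;
}

const SHARING_RANK: Record<PracticeLevel, number> = {
  none: 0,
  internal: 1,
  "third-party": 2,
};

// Returns true if the site's declared practices stay within what
// the user has said is acceptable.
function policyIsAcceptable(policy: DeclaredPolicy, prefs: UserPreferences): boolean {
  if (policy.collectsBrowsingHistory && !prefs.allowBrowsingHistory) return false;
  if (SHARING_RANK[policy.sharesData] > SHARING_RANK[prefs.maxSharing]) return false;
  if (policy.retentionDays === null || policy.retentionDays > prefs.maxRetentionDays) return false;
  return true;
}
```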

PrivacyMinder

Since the FTC won't realistically require anything other than self-imposed privacy obligations, users must demand that sites adopt real obligations as a condition of use. One step in the right direction would be a browser extension (which I would tentatively name "PrivacyMinder," like the discarded AT&T project) that would prominently display, in easy-to-understand terms or iconography (again, like Creative Commons), the kinds of data collection and use the currently viewed website performs according to its privacy policy (or otherwise known facts about its data use). Privacy policies deliberately obfuscate their own terms in order to discourage all but the most ardent investigations; this extension would eliminate some of the transaction costs of parsing legalese and bring otherwise overlooked terms of use to the fore. The extension would also ideally incorporate information about the terms of use to which users must agree. Just as Firefox displays in the address or status bar whether the browser is on a secure server (and Firefox could just as easily and prominently display whether or not a website is storing cookies), the end product would hopefully give users a more automatic and intuitive understanding of the degree to which websites invade user privacy. A change in attitudes on the demand side should encourage a more bilateral dialogue about the terms a website chooses to set for itself. And, however unlikely it is for a court to hold that a unilateral act constitutes acceptance of those terms, such an extension could include a pop-up asking for affirmative assent to privacy-protecting policies before continuing to browse, as an attempt to make browsewrap binding.
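As a rough sketch of how such an indicator could sit in the browser chrome alongside the secure-connection icon, the following background script is written against today's WebExtension API rather than the Firefox add-on model of 2009, and assumes a hypothetical lookupAssessment helper backed by the subscription list discussed below (the chrome global is assumed to come from the extension environment):

```typescript
// Sketch of a WebExtension background script; lookupAssessment is a
// hypothetical helper that consults the community-maintained subscription list.
type Assessment = { label: string; color: string };

declare function lookupAssessment(domain: string): Assessment | undefined;

// Default shown when the list has no entry or the policy is too equivocal.
const NO_PROTECTION: Assessment = { label: "!", color: "#CC0000" };

chrome.tabs.onUpdated.addListener((tabId, changeInfo, tab) => {
  if (changeInfo.status !== "complete" || !tab.url) return;
  const domain = new URL(tab.url).hostname;
  const assessment = lookupAssessment(domain) ?? NO_PROTECTION;
  // Surface the rating as a per-tab badge on the toolbar icon.
  chrome.action.setBadgeText({ tabId, text: assessment.label });
  chrome.action.setBadgeBackgroundColor({ tabId, color: assessment.color });
});
```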

The icons and principles already developed by Mary Rundle or Aaron Helton would be a decent starting point for a Privacy Commons, though they are currently a bit tame. Perhaps the non-judgmental attitude these icons currently convey would reduce resistance to the extension and even induce more cooperation by sites attempting to get favorable ratings. Whatever icons are used should also be color-coded in a traffic-light style, or otherwise graded, to indicate the extent of any use or collection. Furthermore, a default icon for mealy-mouthed language that does nothing for privacy protection (i.e., "no protection granted") ought to be jarring enough to remind users to be careful about activity performed on that site.
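To illustrate the traffic-light grading in code, here is a small mapping under an assumed four-level grade scale (the scale and captions are my invention, not anything Rundle or Helton specify), with the unverified default deliberately landing on red:

```typescript
// Hypothetical grading scale; "unverified" is the jarring default
// for missing or mealy-mouthed policies.
type Grade = "strong" | "partial" | "weak" | "unverified";

const TRAFFIC_LIGHT: Record<Grade, { color: string; caption: string }> = {
  strong:     { color: "green",  caption: "Enforceable protection promised" },
  partial:    { color: "yellow", caption: "Some protection, with carve-outs" },
  weak:       { color: "red",    caption: "Broad collection and use permitted" },
  unverified: { color: "red",    caption: "No protection verified" },
};

function captionFor(grade: Grade | undefined): string {
  // Anything the community has not graded falls back to the default.
  return TRAFFIC_LIGHT[grade ?? "unverified"].caption;
}
```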

In the absence of machine-readable or standardized privacy policies, the extension could subscribe to a list containing manually generated assessments of which protections, or lack thereof, each domain's privacy policy grants, in the same way AdBlock subscribes to a list of ad servers to block. The subscription would be created and updated collaboratively through a wiki or other moderated community (perhaps EPIC?), and if the database lacked information for a given domain, or the community found the policy to be too equivocal, the default display would indicate no protection of data and potentially unlimited collection and use. Triggering the default icon might incentivize sites to adopt standardized privacy policies that the extension would automatically recognize as corresponding to various levels of protection.
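One possible shape for that subscription, purely as an illustration (the layout and field names are assumptions, not an existing list format), is a dated list of per-domain assessments whose lookups fall back to the "no protection" default:

```typescript
// Illustrative subscription format for community-maintained assessments.
interface DomainAssessment {
  domain: string;
  grade: "strong" | "partial" | "weak" | "unverified";
  reviewed: string;      // ISO date the policy was last reviewed
  policyUrl?: string;    // where the reviewed policy was found
}

interface SubscriptionList {
  version: number;
  updated: string;
  assessments: DomainAssessment[];
}

// The equivocal or missing case is assumed by default.
const DEFAULT_ASSESSMENT: Omit<DomainAssessment, "domain"> = {
  grade: "unverified",
  reviewed: "never",
};

function assess(list: SubscriptionList, domain: string): DomainAssessment {
  const found = list.assessments.find((a) => a.domain === domain);
  return found ?? { domain, ...DEFAULT_ASSESSMENT };
}
```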


Good paper, Rick. I think there are a few substantial obstacles to community-generated, easy-to-understand privacy icons:

(1) What do the icons communicate? Is the focus a company's actual collection/data use practices, or the privacy protection it binds itself to provide users? The latter could be determined by parsing privacy policies. What data a company actually collects and what it does with it, however, is mostly guesswork for an outsider.

(2) Who puts together the list? Parsing deliberately misleading privacy policies seems like a task for lawyers, not the general internet community. If EPIC or another public interest organization is to handle the list, how is the project to be funded? Are there any corporate sponsors out there that might benefit from having their competitors' disregard for privacy exposed?

(3) Inaccurate ratings could create liability for the raters. While a default "no protection" icon when the list contains no information about a website might incentivize the company to adopt a standardized policy, it could also be actionable as defamation or tortious interference if the privacy policy actually does commit that website to some protection.

(4) There are a lot of websites and privacy policies out there. This ties in to (3) and (2). If you need lawyers to put together the list, and you need the information to be accurate, then the best you might be able to accomplish (at least in the short run) is to rate popular websites.

(5) Getting wide-scale adoption. A PrivacyMinder plugin will only have a substantial impact if a lot of users use it. Firefox is still a minority browser, and if people are as apathetic about privacy as we tend to think, they're not going to go out of their way to install a privacy rating plugin. AdBlock caught on because it actually makes the web browsing experience more pleasant. If you want PrivacyMinder to be effective, you probably need to get it bundled into Firefox as a standard feature. How do we do that in the face of opposition from companies that want to collect your data?

-- AndreiVoinigescu - 26 Apr 2009

Andrei, thanks for getting the discussion going. I didn't want to waste precious words detailing what could be found at the links to the people who had begun working on some form of a Privacy Commons, but I am happy to address your concerns more explicitly and at greater length here.

(1) I think that the icons would largely indicate what practices a company could engage in without violating its own privacy policy (and thereby subjecting itself to FTC scrutiny). A supplemental icon (possibly just overlaid on the corresponding category) might indicate any additional and verifiable information that was available about actual compliance with a company's privacy policy (subject to the limitation/concern you raise in (3)).

Mary Rundle enumerates most of the substantive aspects of data collection that the icons should address, sorted into the following categories:

1. Collection Limitations (which kinds of data are collected: sensitive/personal, IP addresses, browsing patterns, etc.?)
2. Data Quality (how much data is correlated, anonymized, or simply not collected?)
3. Purpose Specification (is the data used for customer databases, internal research, marketing purposes, etc.?)
4. Use Limitation (is the data available to third parties for commercial resale, limited to intra-company data-sharing, or only for uses required to operate the service?)
5. Security Safeguards (is the data sufficiently protected from unauthorized third parties?)
6. Openness (do developers or users have access to the API/data?)
7. Individual Participation (can users control or view their own data? To that end, who owns the data? The last question may be important enough to many users to be its own icon.)
8. Accountability (is redress for disclosure of, or failure to protect, data available under the terms of the policy itself?)

I am not sure whether these elements are already too numerous and need synthesis, but Aaron Helton also raises the following factors, which are more concerned with the nature of the policy itself (the sketch after this list shows one way all ten factors might be encoded):

9. Policy Mutability/Revocability (is the policy subject to change or revocable unilaterally and without notice to the user?)
10. Policy Version (this might not need to be an icon or anything, but would be useful on the analysis side).
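Here is that sketch: a hypothetical encoding of the ten factors that the extension could consume, with invented level names standing in for whatever vocabulary a Privacy Commons eventually settles on:

```typescript
// Hypothetical encoding of the Rundle/Helton factors above; levels are
// illustrative placeholders rather than an agreed Privacy Commons vocabulary.
type FactorLevel = "protective" | "mixed" | "permissive" | "unstated";

interface PolicyAssessment {
  collectionLimitations: FactorLevel;   // 1
  dataQuality: FactorLevel;             // 2
  purposeSpecification: FactorLevel;    // 3
  useLimitation: FactorLevel;           // 4
  securitySafeguards: FactorLevel;      // 5
  openness: FactorLevel;                // 6
  individualParticipation: FactorLevel; // 7
  accountability: FactorLevel;          // 8
  policyMutability: FactorLevel;        // 9
  policyVersion?: string;               // 10: tracked for analysis, not iconized
}

// A factor the reviewers could not pin down stays "unstated", which the
// extension would render with the default "no protection" icon.
const exampleAssessment: PolicyAssessment = {
  collectionLimitations: "permissive",
  dataQuality: "unstated",
  purposeSpecification: "mixed",
  useLimitation: "permissive",
  securitySafeguards: "protective",
  openness: "unstated",
  individualParticipation: "mixed",
  accountability: "unstated",
  policyMutability: "permissive",
  policyVersion: "2009-04",
};
```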

(2) The first effort should be on creating some kind of algorithm that would parse policies to see if they were in one of the standardized forms that a Privacy Commons might develop, but since P3P has had so much trouble with machine-readable privacy settings, I imagine that this challenge is not easily overcome. I agree that the natural impulse is to have lawyers parsing the privacy policies, but I would also think that a wiki/community could be successful in this particular context, as long as there was some moderation performed by lawyers. If not, why not start a clinic at Columbia Law School or anywhere else people need to complete pro bono hours? I am aware of at least a few professors academically interested in EULAs, which would dovetail nicely with this project.
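As a sketch of the easier half of that problem -- recognizing a verbatim standardized form rather than parsing free text -- assume the Privacy Commons templates were published as canonical texts (the template names and placeholder texts below are invented); the extension could then normalize a site's policy and check it against them:

```typescript
// Assumed table of canonical Privacy Commons templates; names are invented
// and the texts are placeholders.
const STANDARD_TEMPLATES: Record<string, string> = {
  "pc-no-third-party-sharing": "...full canonical template text...",
  "pc-anonymous-analytics-only": "...full canonical template text...",
};

// Collapse whitespace and case so trivial reformatting doesn't break the match.
function normalize(text: string): string {
  return text.toLowerCase().replace(/\s+/g, " ").trim();
}

// Returns the template name if the policy is a verbatim (modulo whitespace)
// copy of a standard form, or undefined so manual community review takes over.
function recognizeStandardPolicy(policyText: string): string | undefined {
  const normalized = normalize(policyText);
  const match = Object.entries(STANDARD_TEMPLATES).find(
    ([, template]) => normalize(template) === normalized,
  );
  return match?.[0];
}
```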

In terms of alternative funding, I could even imagine Google taking some interest in the project because of the data ownership aspect of the crawling. If Google could have someone else do the work of publicizing the fact that other sites claim proprietary ownership of users' data, in the hopes that those sites' users would demand some policy reversal, it would only make Google's attempts at crawling all that data themselves all that much easier. But maybe that's overly optimistic.

(3) This is a good objection, but I think it is easily solved by couching the icons/terms as "No protection for ___ verified by PrivacyMinder," which would also reinforce the point about adopting standardized privacy policies, where it would be easy to verify that such protections were in fact granted. I think we would run into more trouble if we were using "outside information" about actual practices unless that information was absolutely verifiable, thus allowing us an affirmative truth defense to a libel action.

(4) I agree that this would be a monumental task in the absence of machine-readable or standardized policies. Depending on how strongly the advantage of anarchic policy-assessment plays in our favor, the problem might not be so significant. Practically speaking though, going by popularity rank would be a necessary first step until the other tools are solidified.

(5) Getting "pull" to apply rather than "push" is probably the toughest challenge there is here, but I think that upcoming conflicts on sites with a lot of user-generated content might push a data-collection issue like this to the fore. The better answer is that if we also bundle in something like the EasyPrivacy subscription list for AdBlock (which isn't a default subscription in AdBlock, and might be better leveraged to spread this product), then that might actually make the users' experience more enjoyable (by cutting out unnecessary data-collection processes) and create some pull.

If we can't do that, I agree that bundling is the best option. Facing corporate opposition to privacy would probably require either (1) courting favor with those companies that already offer fairly generous privacy (relative to their competitors) or (2) creating some straw men icons that would be easily satisfied by most sites. (1) might be easier than we would prospectively expect, given that Chief Privacy Officers would probably expect to be able to either outmaneuver such an extension or further publicize the company's respect for privacy through it.

I could keep going but I should probably pause and let others join in the fray. Thanks again for your thoughtful response.

-- RickSchwartz - 26 Apr 2009

Also, to add to what was previously said about existing privacy add-ons, it looks like Ghostery is something close to a proof of concept in terms of displaying what kinds of tracking techniques a given site is using. I would imagine that Ghostery's functionality could be easily integrated and simplified to enhance the functionality of PrivacyMinder.

-- RickSchwartz - 30 Apr 2009

Great article, Rick. Personally, I'm wondering more about how to get companies on board. Correct me if I'm wrong, but making it work would require 1) developing the labels themselves; 2) developing standards for both human assessment and machine-readability; 3) getting companies to conform to these standards; 4) getting browsers to include them; 5) getting users to care.

To have any hope of widespread adoption, the companies (or at least a critical mass of them) would probably want to have a say at every step. They would probably either want to agree among themselves on what the standard should be, or you could play off their instinct to seem better than the rest of the pack. In the first case, you may as well forget about the whole thing. In the second case, there's no real incentive for those who know their privacy standards are not great to join at all.

-- MislavMataija - 03 May 2009

It looks like Ghostery already does part of the job of figuring out what clickstream data is being collected. So the focus really should be on parsing out what voluntary self-restraint the privacy policies promise and displaying that in iconized form alongside an icon for what data they actually collect.

As to the real issue--adoption--maybe we don't need bundling into Firefox if we can get PrivacyMinder as a standard (default-enabled) feature in another popular plugin like Adblock Plus. It's worth sending Wladimir Palant an email to see if he'd be willing to do that (or at least to include Ghostery/the EasyPrivacy list as defaults in Adblock Plus). If not, well, you can always fork the Adblock code to fold in the extra features. But then we'd be back to square one -- engineering adoption of an unknown plugin.

Rick, you also bring up future backlash against sites that depend on user-generated content. By this, I imagine you mean places like Facebook/Myspace/Twitter/YouTube/Flickr. If/when that backlash happens, it'll be a good catalyst for the adoption of technological measures like PrivacyMinder, I'm sure. But we shouldn't stop there. If the backlash can be nurtured, then perhaps it can be channeled to achieve more legislative privacy protection as well. But Facebook at least has been pretty quick in backing down from past controversial changes. And once they do, people seem to forget all about privacy again. (Maybe they don't and I'm just jaded--but I get the impression that the average internet user has a very short attention span where privacy is concerned. There's a chasm between outrage and action.)

-- AndreiVoinigescu - 04 May 2009

Mislav: Yes, the labels should be an early step, but there have already been some substantial efforts in that direction (see Rundle and Helton), so I focused my efforts in the paper on strategy. I would be happy to create another page on the wiki for more suggestions on what the final icon set should look like, though, if people are interested in participating in that discussion. As far as getting companies to adopt the machine-readable privacy statements, the shortest (and probably unlikeliest) method would be to get the FTC to require it. Barring that deus ex machina, I'm not sure we want to allow companies to exercise any kind of control, since that would defeat the purpose of creating demand-side feedback on the suppliers. Creative Commons doesn't need publishing companies to tell it how to fashion its licenses, and if the Free World likes the idea of a Privacy Commons, I don't see why we would need companies to cooperate on the creation side; they can implement the privacy policies as they like (or else the policies probably wouldn't mean much to begin with).

Andrei: I agree. Once I found Ghostery, I was somewhat back to the original core of my idea, which is to parse privacy policies by assessing the self-imposed legal limitations (which P3P has shown to be a Very Hard Problem) and supplementing that data with other known or admitted practices. Adoption still seems like a challenge, but I would agree that there is no reason for the free world not to leverage as many of these tools as possible into one package (with freely separable elements, of course). Your average user is just going to install the most popular add-on that does what they think they want it to do. This is probably why AdBlock gets installed by most users, but EasyPrivacy does not.

-- RickSchwartz - 04 May 2009

 
