Computers, Privacy & the Constitution
Ready for review. Comments welcome.

The Evolving Problem of DMCA Takedowns

-- By BrianS - 24 Mar 2010

Introduction: Section 512(c)

The DMCA's takedown provision, 17 U.S.C. section 512(c), significantly modified the playing field for content owners, content creators, and content hosts. Under section 512(c), rightsholders can issue notifications to ISPs requesting that they remove user-uploaded content from the ISP's site because it allegedly infringes U.S. copyright law. See, e.g., Chilling Effects' DMCA page and their FAQ. This essay examines the actual use of section 512(c)'s takedown provision and the emerging problem of the automatic takedown filter.

The Hall of "Fame"

Since section 512(c)'s enactment, thousands of takedown notices have been issued. Many of those notices were issued in good faith and in keeping with section 512(c)'s intended effect. Others, however, were not.

For example, after blogger Perez Hilton posted a video clip criticizing Miss California Carrie Prejean for opposing same-sex marriage, the National Organization for Marriage (NOM) used a few seconds of his video in a response ad. Perez then issued a takedown notice alleging copyright infringement for what was clearly a fair use. As others have noted, this was an abuse of the DMCA process. Nevertheless, it knocked the clip off YouTube temporarily.

Similarly, NPR issued a takedown notice when an anti-same-sex-marriage group used 21 seconds of an NPR clip to try to make a point in its political ad. Fair use? Yes. Removed from YouTube nevertheless? Yes.

And in a third example, after NOM made a video depicting the rise in gay rights as a coming storm, a second organization obtained and released clips of the rather horrific auditions for NOM's ad. MSNBC's Rachel Maddow played some of those auditions (at about two minutes into the link) on her show to make the comedic point that "pretending to be a straight person hurt by gay marriage [] is apparently very, very challenging." In response, of course, NOM issued a takedown notice. MSNBC's use was clearly a fair use, and NOM's takedown was thus an abuse of the DMCA. Still, the content was removed.

These are not the only examples. Indeed, some studies have argued that roughly 30% of takedown notices present substantive flaws suggesting that they seek removal of noninfringing works.

The Unblinking Eye

Processing takedown notices costs service providers time and money. There is also continuous pressure on providers to "reasonably implement[] . . . a policy that provides for the termination [of access to the provider's services] . . . for repeat [copyright] infringers." See, e.g., 17 U.S.C. section 512(i)(1)(A) (providing that service providers that fail to do so lose their section 512 safe harbor). Further, service providers face pressure to implement tools that relieve rightsholders of much of the heavy lifting of policing content.

The end product? Automatic takedown filters like YouTube's ContentID. ContentID matches audio and video files provided to YouTube by rightsholders against user-uploaded content. When rightsholders sign up to use ContentID, they instruct the program what it should do when it finds a match: "monetize, track, or block" the content. Once the rightsholder has done so, ContentID is "Fully Automated. Once you're set up, Audio ID and Video ID identify, claim, and apply policies to YouTube videos for you."

In other words, tools like ContentID are fully automated DMCA takedown clones. If a rightsholder does not want any incarnation of its works on YouTube, then so it shall be; ContentID will act as a gateway, stopping user-generated content from ever appearing on the site.
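To make the mechanics concrete, the following is a minimal, hypothetical sketch of how an automated filter of this kind might apply a rightsholder's chosen policy to a fingerprint match. It is not YouTube's actual implementation; the names (ReferenceFile, Policy, apply_policies) and the toy fingerprint matching are illustrative assumptions. The structural point it illustrates is that at no step does the pipeline ask whether the matching use is a fair use.

```python
from dataclasses import dataclass
from enum import Enum


class Policy(Enum):
    """The three actions a rightsholder can choose for matched uploads."""
    MONETIZE = "monetize"
    TRACK = "track"
    BLOCK = "block"


@dataclass
class ReferenceFile:
    """A reference work registered by a rightsholder (hypothetical structure)."""
    owner: str
    fingerprint: frozenset   # toy stand-in for an audio/video fingerprint
    policy: Policy


def match_score(reference: ReferenceFile, upload_fingerprint: frozenset) -> float:
    """Toy similarity: fraction of the reference's fingerprint present in the upload."""
    if not reference.fingerprint:
        return 0.0
    return len(reference.fingerprint & upload_fingerprint) / len(reference.fingerprint)


def apply_policies(upload_fingerprint: frozenset,
                   references: list,
                   threshold: float = 0.2) -> list:
    """Fully automated: any sufficiently close match triggers the owner's chosen policy.
    Note what is absent: nothing here asks whether the matching use is commentary,
    criticism, parody, or otherwise a fair use."""
    return [(ref.owner, ref.policy)
            for ref in references
            if match_score(ref, upload_fingerprint) >= threshold]


if __name__ == "__main__":
    # A short excerpt quoted inside original commentary still matches and is blocked.
    references = [ReferenceFile(owner="RightsholderA",
                                fingerprint=frozenset({"seg1", "seg2", "seg3", "seg4", "seg5"}),
                                policy=Policy.BLOCK)]
    upload = frozenset({"seg2", "seg3", "original_intro", "original_commentary"})
    print(apply_policies(upload, references))
    # -> [('RightsholderA', <Policy.BLOCK: 'block'>)]
```

On this sketch, any fair-use judgment would have to come from outside the automated loop; that gap is exactly the problem the next section describes.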

Going Forward: The Problem

The first problem is that entirely too many non-infringing videos are removed. This is the end product of automated processes, currently incapable of assessing fair use, blocking content indiscriminately. With ContentID, there is no human element once the process is initiated. Even if your use is a transparently fair one, ContentID's unblinking digital eye cannot tell, and it silences you nevertheless.

The second problem is the nearly sanctionless opportunity for DMCA abuse: once a notice is sent, the video comes down and is, at best, put back up days later (and, at worst, never revived, with the user discouraged from future digital expression). Section 512(f) provides that "any person who knowingly materially misrepresents under this section . . . that material or activity is infringing . . . shall be liable for any damages, including costs and attorneys' fees, incurred by the alleged infringer . . . who is injured by such misrepresentation, as the result of the service provider relying upon such misrepresentation." In some courts, however, the user whose content was removed can recover only by showing "subjective bad faith" on the part of the notice issuer.

Finally, it is little comfort that the user has post-abuse self-help options. If the user takes the removed material elsewhere, he or she still faces the risk of abusive takedown notices. If the user challenges ContentID's automatic filtration, the rightsholder must review the dispute manually but can still send an actual DMCA takedown notice and silence the clip for days. The dual threat of abusive DMCA takedown notices and automatic filtration programs creates a substantially user-unfriendly atmosphere for the growing expressive culture of this generation and those to come.

Conclusion: The Solution

The solution is two-pronged: it must address both the state and the private-party components. On the state side, the DMCA should be amended to embody fair use principles. Content filters, however, are not state-made and thus may need to be addressed separately (unless revising the DMCA persuades service providers like Google to rethink or redesign them). Perhaps the best tool for that job is a dramatic increase in public advocacy.

The Internet and digital culture are robust, but they are not immune to censorship. The digital culture is here to stay, and if we allow programs like ContentID, in their current incarnation, to set the gold standard, we chart a path to significantly less expressive freedom. There is a middle path between abolishing copyright law and abolishing fair use. If we insist on retaining a highly restrictive baseline model, we must at least find a balanced middle course online.


 
