No Technical Mandates: Strong Opposition Voiced to Copyright Office

Creators, Consumers, Libraries and Tech
Tell Copyright Office “No” On Technical Mandates & Filtering

The U.S. Copyright Office’s request for input on the use of technical measures to address copyright infringement triggered a massive response in opposition. While organizations representing creators, consumers, libraries, startups and more raised the concerns of their members, thousands of comments were also filed by members of the public in strong opposition to technical mandates and filters. 


Re:Create: “Instead of focusing its attention on the role of technical measures in preventing piracy, the Copyright Office should focus on the role of technical measures in preventing legitimate fair uses and how they often stifle, rather than encourage creativity. For many online creators, interaction with, and lawful uses of, copyrighted works play an essential role in making their creations and profiting off of them. Copyright has lost its way when it starts to stand in the way of creativity rather than promoting it. Government backing of technical measures, in most cases, will do exactly that.”

Citizens and Technology Lab, Cornell University: “The automation of the DMCA notice and takedown system, leading to literally tens of millions of these notices sent by technical measures like algorithms, bots, and automated programs, magnif[ies] these chilling effects concerns significantly, raising the possibility of chilling effects at mass scale…[P]ersonally targeted legal threats like DMCA notices can have chilling effects on a range of entirely lawful online activities, including user’s free expression, social media engagement, online search, and content sharing. The automation of the DMCA’s notice and takedown scheme means that these chilling effects may be happening on a mass scale online.”

Orchestra Music Licensing Association: “[T]he current implementation of content matching and automated infringement claim technologies has caused significant harm and disruption to American orchestras because that technology does not work accurately when applied to orchestral music. If the use of such faulty technologies were expanded and even more broadly implemented, those harms would only increase. For example, right now orchestras at least have the option of hosting archived videos on their own websites to avoid these false infringement notices. This hardly suffices, because orchestras receive far more traffic on their social media accounts than they do on their websites. If these flawed technologies were even more broadly implemented, there would presumably be no options left – even poor options – to avoid the harm and disruption those technologies cause.”


GitHub: “Technical measures are ill-suited to address alleged infringement of source code. Technical measures used for detecting copyright infringement of other forms of content do not translate well to source code. The essence of source code is functional…Code takedowns have cascading effects that can harm developer ecosystems. Code on GitHub may be in use by millions of computers around the world, and a wrongful takedown can have enormous consequences to the developer ecosystem.”

Wikimedia Foundation: “One of the major obstacles to adopting technical measures is that the technology is limited and not one size fits all…Our experience at Wikimedia is that wide adoption of these technologies could have negative effects for educational and archival projects in particular…Further, our experience with the low quality of these automated technologies and their failure to consider fair use and de minimis use means that they are likely to cause harm to the overall academic and archival ecosystem. Even if Wikimedia’s systems continue to prove effective at combating copyright infringement, the use of these technologies will likely mean that smaller academic institutions and researchers using copyrighted content will find themselves with inordinate legal costs or unnecessarily restrictive rules to avoid liability, unfairly impoverishing the public sphere.”

Organization for Transformative Works: “Like most other online services outside of the dominant sites, the OTW does not have, could not build, and—if it were technologically feasible, would almost certainly not be able to afford—technologies that could identify unauthorized, infringing works and distinguish them from non-infringing works…The OTW’s constituents regularly report that their transformative videos have been blocked by filtering software, notwithstanding that they are fair uses. This is incredibly damaging but hardly surprising, when even the most highly-touted filtering software cannot account for fair uses…The implementation of ‘standard technical measures’ that seems to be contemplated by the current process would squeeze small and nonprofit OSPs and would trample fair uses without producing a corresponding benefit.” 


R Street Institute: “R Street believes it is important to allow private firms to continue to innovate and improve technical measures within the existing framework; we do not believe that government facilitation or adoption of technical measures will generate superior outcomes to the status quo.”

USTelecom – The Broadband Association: “USTelecom anticipates that the Copyright Office can play a useful role, consistent with its neutral advisory remit, by sharing information, cataloging and educating the public about existing or future technical measures for the protection of online copyright works. We do not believe the Copyright Office, however, is best positioned to actually develop technical measures or facilitate their adoption. The latter should be left to the marketplace and to voluntary multi-stakeholder standards-setting processes.”

Library Copyright Alliance: “The government cannot and should not facilitate the adoption or implementation of such technical measures. The government does not have the technical expertise to engage productively in such activities. Government involvement in filtering technologies also raises serious First Amendment issues.”

Association for Computing Machinery: “USTPC believes, however, that government should not itself seek to develop such standards. Rather, the development and maintenance of such standards would be better left to organizations well versed in creating such technical standards, such as the American National Standards Institute (ANSI) or the Institute of Electrical and Electronics Engineers (IEEE).”


Engine: “Startups operate on thin margins, and have to stretch limited resources to cover everything from R&D and product development to marketing and customer acquisition to legal compliance and human resources. The average seed-stage startup in the U.S. raises $1.2 million, a sum that is expected to cover all of its expenses for nearly two years…By contrast, the costs of filtering technologies are out of reach. For example, YouTube spent over $100 million to develop ContentID, and off-the-shelf tools like Audible Magic come with licensing fees that…cost more than $10,000 per month. Of course, startups cannot afford those amounts, especially not to catch the very, very few (if any) instances of possible infringement…One founder surveyed explained that if the use of filtering technology were required by law, ‘it would put us out of business.’”

CCIA: “Technical measures are costly and time consuming to develop, and site- and media-specific, which raises concerns about the feasibility of broad adoption…Mandatory deployment also risks perverse secondary effects: a smaller service that needs to develop content scanning technologies will need to operate in a space where established services have been spending years developing technology. Mandating the use of this technology could therefore result in these smaller services risking expensive and time-consuming litigation, or having to pay license fees just to enter the market. The ultimate consequence of this may be to reduce the number and quality of options available to the users, creators, and advertisers that use the services in question…There is no role for government here.”

Internet Archive: “[O]ne of filtering’s most pernicious effects would be to impose costs on service providers which could make operations difficult or impossible for all but the largest corporations. This would effectively condition participation as an online service provider on access to expensive resources. As a result, few but the largest for-profit enterprises could likely participate as online service providers at all. This would very likely entrench existing players and have deleterious effects not only on the Internet Archive, but also on other libraries, non-profits, users, small and mid-size companies, and our overall information ecosystem. New or non-commercial platforms for online expression would increasingly disappear.”


EFF: “The Copyright Office should also consult with security and privacy experts regarding the potential corollary effects of any technical measures…As 83 prominent Internet inventors and engineers explained in connection with the site-blocking and other measures proposed in SOPA and PIPA, any measures that interfere with internet infrastructure will inevitably cause network errors and security problems. This is true in China, Iran and other countries that censor the network today; it will be just as true of American censorship. It is also true regardless of whether censorship is implemented via the DNS, proxies, firewalls, or any other method. Types of network errors and insecurity that we wrestle with today will become more widespread, and will affect sites other than those blacklisted by the American government.”


Melos Hani-Tani: “I am a music listener, composer, and creator of videogames. I would ask anyone reading this to seriously consider if you’re willing to damage the future of human expression just for a few mediocre music and movie companies’ profits.”

Seth Erfurt: “The use of automated copyright filters not only leads to an ever-escalating arms race of finding ways around the filter, but also to people having their videos claimed even when they lawfully followed fair use standards. Normal users often find a system where all the moderation is completely automated to be rather suffocating to deal with. Creators have no real recourse a lot of the time, even when their content is outright falsely claimed. Plus, this type of filter only really works on massive platforms that can afford to develop and run them.”

Mary Raichle: “As an independent publisher and writer, automated filters make my job either much harder or impossible. I write LGBTQ+ material which is already strongly affected by the existing filters. Adding more filters onto the internet will, due to the nature of AI / the prejudices that inevitably creep into them, ensure that I would not be able to reach my market. Please do not follow this path. It only hurts smaller copyright holders and helps the corporate entities that already make life difficult for us.”