The EU recently overhauled its copyright regulatory framework with the European Parliament's adoption of a sweeping new law: the EU Directive on Copyright in the Digital Single Market[1]. As in the US, the EU's last major update to its copyright laws came at the turn of the century, and the internet landscape has changed dramatically since. Many novel digital use cases (think memes and retweets) were not envisioned by the original law, and with these innovations have come new challenges and frictions for intellectual property rights. While the new copyright law makes numerous changes, one of the most significant and contentious issues it seeks to address is the increasing difficulty that content creators like newspapers, movie studios, and musicians face in collecting adequate compensation for their digital content. To tackle this problem, the law strengthens protections for rights holders when their works are shared on content aggregators or uploaded to file-sharing platforms.

Although the Directive must first be transposed by the individual EU member states over the next two years before taking effect, it is already clear that complying with the new law will present significant technical and policy challenges both for large internet giants like Facebook and Google and for smaller media content platforms.

What Does the New Directive on Copyright Require?

The Directive recasts the hosting activities of online digital content platforms as acts covered by copyright (acts of communication to the public or making available to the public). This means that online platforms hosting content are themselves subject to copyright and must first obtain licensing agreements with creators before hosting any of their content or else face liability for infringement. The first way it achieves this is through Article 15, dubbed the "link tax," which compels platforms that provide links to news articles (e.g., Apple News, Yahoo News) to negotiate licensing agreements with publishers for hosting links to their articles. The second way is through Article 17, commonly called the "upload filter," which requires platforms where user-generated content can be uploaded (e.g., YouTube, Instagram) to undertake new efforts to prevent, rather than respond to, copyright infringement.

Currently, companies can take a reactive approach to infringement by responding to takedown requests and removing offending content from their platforms. This framework puts most of the copyright enforcement burden on creators, who must crawl the web searching for infringing uses and issue takedown notices for each violation in turn—a costly and labor-intensive exercise. While feasible for large movie studios, record labels, and other significant content creators, it is often far outside the economic means of small artists and musicians.

By designating hosting activities as acts covered by copyright, the Directive turns this approach on its head, attaching liability for infringement to the host platform without the need for a takedown notice. The platform must then demonstrate that it has made "best efforts"[2] to ensure the unavailability of unauthorized content and to prevent future uploads. This demands a proactive effort to keep infringing content from ever being hosted on the platform. How can this be achieved for a site like YouTube, where users from all over the world watch over a billion hours of content daily[3]? The Directive does not mandate any particular approach, but the clear implication is that companies hosting vast amounts of data will need to develop some type of automated filtering technology that can pre-screen potentially infringing content before it is uploaded and disseminated for public consumption.
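The Directive does not name any particular technology, but the core idea behind an upload filter can be illustrated with a toy sketch: compare a fingerprint of each incoming upload against a registry of fingerprints supplied by rights holders, and block matches before publication. Everything below is a hypothetical illustration, not any platform's actual system; the `UploadFilter` class, the exact-hash `_fingerprint` stand-in, and the in-memory registry are assumptions made for this post, and real matchers (such as Google's Content ID) rely on far more sophisticated perceptual audio and video matching.

```python
import hashlib
from dataclasses import dataclass
from typing import Dict, Optional, Tuple


@dataclass
class Claim:
    rights_holder: str
    work_title: str


class UploadFilter:
    """Illustrative pre-upload screen: flag uploads whose fingerprint
    matches a work registered by a rights holder."""

    def __init__(self) -> None:
        # fingerprint -> Claim. Real systems use robust perceptual
        # fingerprints so that re-encoded or trimmed copies still match.
        self._registry: Dict[str, Claim] = {}

    def register_work(self, content: bytes, claim: Claim) -> None:
        self._registry[self._fingerprint(content)] = claim

    def screen(self, upload: bytes) -> Tuple[bool, Optional[Claim]]:
        """Return (allowed, matched_claim) for a candidate upload."""
        claim = self._registry.get(self._fingerprint(upload))
        return (claim is None, claim)

    @staticmethod
    def _fingerprint(content: bytes) -> str:
        # Stand-in only: an exact hash misses edited copies, which are
        # precisely what production matching systems are built to catch.
        return hashlib.sha256(content).hexdigest()


if __name__ == "__main__":
    upload_filter = UploadFilter()
    upload_filter.register_work(b"<original track bytes>", Claim("Label X", "Song A"))
    allowed, claim = upload_filter.screen(b"<original track bytes>")
    if not allowed:
        print(f"Blocked: matches '{claim.work_title}' claimed by {claim.rights_holder}")
```

In practice the hard part is the matching itself: fingerprints must survive re-encoding, cropping, and trimming, and that robustness is where most of the engineering cost lies.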

Smaller companies will be subject to a lighter regime: they are required only to expeditiously remove infringing content after receiving a takedown notice. Once a platform exceeds 5 million monthly unique visitors, however, it must also make "best efforts" to prevent future uploads of content that rights holders have already flagged.[4]

What Does the Directive Mean for Information Governance?

Internal controls for monitoring and managing intellectual property rights, including IP protection, legal compliance, and remediation, have always been an important aspect of enterprise information governance. The Directive is certain to add new urgency to efforts to develop automated technologies that help fulfill these governance tasks. Previously, swiftly responding to takedown requests granted safe harbor from liability; now, potential liability attaches immediately, significantly raising the stakes.

Companies will need to invest legal and compliance resources in crafting a strategy to satisfy their copyright obligations. The scope of those resources should be commensurate with the risks and liabilities they face, yet companies must make this investment before the magnitude of those risks is fully known. Because the Directive must be transposed into the national laws of each member state before taking effect, companies will have to wait to see what laws each member state enacts, and how the courts apply them, before the scope of that liability becomes clear.

While the path forward for the largest companies will almost certainly involve some type of automated technological solution along the lines of Google's Content ID system (which cost $100 million[5] to develop), the right approach for midsize companies that exceed the 5 million monthly visitor threshold but lack the resources to develop their own automated content filtering technology will depend on a number of factors, including:

  • What is the risk profile of the content types being hosted?
  • How much control does the enterprise exercise over what’s hosted?
  • Are licensing agreements practical?
  • Is a technological solution feasible, and if so, is it proportional to the costs and risks?

Without explicit guidelines or a clear body of case law on how to satisfy the requirement of anti-infringement "best efforts," companies will struggle to mount a balanced response. It may be possible to achieve best efforts through a combination of automation and human oversight (one such arrangement is sketched below), or companies may need to license comprehensive automated filtering tools from a vendor. As member states begin transposing the Directive into their national laws and creators begin asserting their new rights, the answers to these questions will come into clearer focus.
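There is no authoritative recipe for what such a hybrid approach would look like; the sketch below is only one hypothetical arrangement, in which an automated matcher scores each upload, clear matches are blocked automatically, borderline scores are queued for human review, and everything else is published. The thresholds, the `score_fn` matcher, and the `ReviewItem` queue are assumptions made for illustration, not requirements drawn from the Directive.

```python
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class ReviewItem:
    upload_id: str
    match_score: float


@dataclass
class HybridModerationPipeline:
    """Hypothetical 'best efforts' pipeline: automation decides the clear
    cases, human moderators review the borderline ones."""

    score_fn: Callable[[bytes], float]    # plug in a real matching engine here
    block_threshold: float = 0.9          # auto-block at or above this score
    review_threshold: float = 0.5         # queue for human review at or above this score
    review_queue: List[ReviewItem] = field(default_factory=list)

    def handle_upload(self, upload_id: str, content: bytes) -> str:
        score = self.score_fn(content)
        if score >= self.block_threshold:
            return "blocked"              # near-certain match: no human needed
        if score >= self.review_threshold:
            self.review_queue.append(ReviewItem(upload_id, score))
            return "held_for_review"      # ambiguous match: a moderator decides
        return "published"                # no plausible match found


if __name__ == "__main__":
    pipeline = HybridModerationPipeline(score_fn=lambda content: 0.7)  # stub matcher
    print(pipeline.handle_upload("upload-123", b"<video bytes>"))      # -> held_for_review
    print(len(pipeline.review_queue))                                  # -> 1
```

Where to set the thresholds, and how much human review capacity to fund, is exactly the kind of proportionality judgment the factors listed above are meant to inform.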

[1] https://data.consilium.europa.eu/doc/document/PE-51-2019-INIT/en/pdf

[2] http://europa.eu/rapid/press-release_MEMO-19-1849_en.htm

[3] https://www.youtube.com/yt/about/press/

[4] http://europa.eu/rapid/press-release_MEMO-19-1849_en.htm

[5] https://www.engadget.com/2018/11/07/google-anti-piracy-report/

 

Disclaimer: The purpose of this post is to provide general education on Information Governance topics. The statements are informational only and do not constitute legal advice. If you have specific questions regarding the application of the law to your business activities, you should seek the advice of your legal counsel.