Sep 18, 2018 | Srivats Shankar and Maathangi Hariharan
The European Union has released a revised proposal for its Copyright Directive. The stated goal of the Directive is to change the existing copyright dynamic, which leaves content creators with few avenues to generate revenue. The Directive particularly targets Big Tech, which often "scrapes" content and displays it, in whole or in part, without any remuneration for the author or right holder.
While the Directive seeks to address issues that existing copyright law has so far failed to resolve, the Directive as it currently stands is highly controversial, notwithstanding its purported grounding in the protection of copyright as property under Article 17 of the Charter of Fundamental Rights of the European Union. In particular, a number of its provisions effectively facilitate and encourage the state to engage in a form of "soft policing" by monitoring content on the internet in the guise of protecting the rights of copyright holders. The impact assessment conducted by the European Union notes that the Directive would have a generally positive impact on wider access to content, on exceptions adapted to the digital and cross-border environment, and on ensuring a well-functioning marketplace for copyright. However, the analysis does not appear to discuss in any depth the implications the Directive would have for surveillance.
The provision of particular concern is Article 13, which requires "information society service providers" that store and provide access to works protected by copyright to ensure the protection of right holders. This includes the use of content recognition technologies that the legislation describes as "appropriate and proportionate." Providers are additionally obligated to give right holders information about the functioning of these recognition technologies. The Directive does not provide any mechanism to limit or control the scope or use of such recognition technology, and it is the first legal instrument of its kind in which a state mandates the application of content recognition as a matter of law. The potential ramifications of this black-box recognition mechanism, whose output the state could subsequently request for the purpose of conducting investigations or during a discovery process, are significant, and no safeguard mechanisms are provided within the Directive. The provision presents a window into the future of content surveillance and digital identity.