jasonsackey.com

Draft: Article 13

2019-04-09

Note: I wanted to get this article finished and published before the protests, and before the vote. Oh well.

Last Saturday there were demonstrations and protests across Europe against proposed new EU copyright legislation. The legislation in question concerns ‘online content-sharing service providers’, which means sites like Facebook, Twitter, and YouTube, as well as regular web hosting companies. Right now, companies running such sites are generally not liable for copyright infringement that folks get up to using their systems.

If online services were always liable for copyright-infringing sharing by their users, running them would be much riskier and more expensive. Social networks and much of our online infrastructure, cloud services and such, arguably could not exist as we know them.

So, our enlightened governments, having been convinced of the value these services provide in powering the digital economy and relieving our fellow citizens’ abysmal boredom, have exempted ‘online content-sharing service providers’ from the regular burdens of copyright compliance. Under certain conditions.

To keep this special exemption, and to stay out of the courts, sites need to work with copyright owners to help reduce infringing uses of their services. They need to follow certain practices, like ‘notice and takedown’ procedures.

As a condition for limited liability online hosts must expeditiously remove or disable access to content they host when they are notified of the alleged illegality.

https://en.wikipedia.org/wiki/Notice_and_take_down

The massive quantity of content uploaded to big services includes, apparently, quite a lot of piracy. So copyright owners understandably send lots and lots of takedown notices every day. They even churn them out automatically, guided by content-identification software. Copyright lawyer-bots. The sheer volume of notices is too much for online services to handle manually, but handle them they must! So they use automation on their side of the process too. Whether a piece of content, flagged as problematic by someone’s algorithm, gets removed or left alone is frequently decided by someone else’s algorithm, another machine, rather than by humans.
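To make that machine-versus-machine handoff concrete, here is a minimal sketch in Python. Everything in it is hypothetical; real systems like YouTube’s Content ID match perceptual fingerprints of audio and video, so that re-encoded or clipped copies still match, rather than exact hashes. But the shape of the decision is the same: one bot’s claim, resolved by another bot.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Stand-in for perceptual fingerprinting. An exact hash is only a
    # placeholder for the real feature-matching step.
    return hashlib.sha256(data).hexdigest()

# Hypothetical reference material supplied by a rightsholder's bot.
catalogue = {
    fingerprint(b"bytes of a protected film"): "Example Film (Example Studios)",
}

def takedown_decision(upload: bytes) -> str:
    # One machine resolving another machine's claim: no human reads
    # either side unless the uploader appeals.
    match = catalogue.get(fingerprint(upload))
    return f"remove (matches {match})" if match else "leave alone"

print(takedown_decision(b"bytes of a protected film"))  # remove (matches ...)
print(takedown_decision(b"someone's home video"))       # leave alone
```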

The new law expands the requirements platforms must meet to keep their limited-liability status.

http://www.europarl.europa.eu/doceo/document/A-8-2018-0245-AM-271-271_EN.pdf

They must ‘demonstrate that they have’:

made, in accordance with high industry standards of professional diligence, best efforts to ensure the unavailability of specific works and other subject matter for which the rightholders have provided the service providers with the relevant and necessary information;

Article 17(4)(b)

Big tech was against it. Civil rights advocates didn’t like it either. Now it’s law, or soon enough it will be.