Saturday, January 16, 2021

EFF reviews the various levels of the Internet stack where censorship of user content can happen

 

San Francisco, 2018

The Electronic Frontier Foundation has an important article, “Beyond platforms: Private censorship, Parler, and the Stack”, by Jillian C. York, Corynne McSherry, and Danny O’Brien.

The article links to a second piece by Joan Donovan, from Oct. 28, 2019, “Navigating the Tech Stack: When, where and how should we moderate content?”  Note the diagram “Content moderation in the Tech Stack”.

The most common takedowns come from social media platforms (Facebook, Twitter) or “free hosting” platforms, mainly YouTube, which have a “signature style” for dealing with individual users.  In principle these takedowns are driven by community guidelines (hate speech, incitement) and copyright strikes, and the platforms enjoy the controversial third-party liability protections of Section 230 and the DMCA Safe Harbor (with the EU Copyright Directive adding further complexity).

But there are many other choke points further down the stack, including hosting companies, content delivery networks (like Cloudflare), domain name registrars, and telecom ISPs.

These companies generally don’t moderate content and don’t have close relationships with individual users. Web hosts do have "acceptable use policies" banning spam, hacking, illegal behavior, and sometimes online pharmacies and weapons sales.  But they were dragged into controversy after Charlottesville, and they face the same or greater pressures today after the Capitol riots, hence the situation with Parler.

But actions by hosting providers have occurred before.  Back in 2010, Corynne McSherry discussed a DMCA takedown of an entire site, Cryptome, for publishing material from Microsoft that Cryptome maintained was fair use.

In another piece from 2010, Fred Von Lohmann discussed the copyright dangers for music bloggers.   Now it seems to me that you can usually embed a legal video of a performance or a trailer to talk about the music (but don’t count absolutely on the “server rule”, given a case in New York State; see the post of February 17, 2018).

Mike Masnick has a similar discussion of moderation at infrastructure layers on Techdirt. 

The laws regarding downstream liability protection (especially Section 230) are running into friction over public concerns not only about radicalization, but also about inequity and the need to get individuals to be more open to participating in conventional activism. Many established interests feel that the user-generated content we take for granted brings risk and instability (and disruption to legacy media).  But if you undermine the ability of individual speakers to be heard, you don’t get the next teenager who builds a coronavirus tracker and warns the world about what is coming.
