Saturday, March 03, 2012

Social media, publishing services face tricky issues with content moderation and TOS enforcement

The Berkman Center for Internet and Society at Harvard and the Center for Democracy and Technology have a paper by Erica Newland, Caroline Nolan, Cynthia Wong and Jillian York, “Account Deactivation and Content Removal: Guiding Principles and Practices for Companies and Users”, link here, from September 2011.

EFF (in a piece by Eva Galperin, March 2) recently mentioned the Guide in reference to controversies over the ways service providers implement their “terms of service” (TOS) or acceptable use policies, particularly when they respond to complaints or take-down notices from other users or members of the public. She points out that Facebook, unlike some other services, cannot restrict a take-down to a specific country.

The most important recommendation is clarity and transparency, along with the ability of the person who is the subject of a complaint to respond. Members of the public may have widely varying perceptions of what counts as acceptable use of pictures in which they appear, or of acceptable ways for others to contact them. At the other end of the problem, companies may have to keep secrets about how they protect the integrity of their services, as for example in combating spam. Because of that secrecy, it can be hard for users to know in advance what behavior might trigger take-downs. Some commentators have argued that speakers should not depend on free services for valuable speech, but should always place it on a hosted service they pay for.

Companies offering publishing and social networking services may believe that they need specific policies regarding social problems, such as bullying or, as with Tumblr’s recent policy, the promotion of self-harm, link here. It is very hard to tell the difference between “promoting” something and merely “talking about” it for purposes of constructive criticism.

Even in an environment with considerable protections from downstream liability, companies normally need well-conceived terms of service, but they must operate under the presumption that any user has standing to speak about anything.

There are companies that offer content moderation services, such as “Moderation Pro” (whose domain name uses an unusual TLD meant for “professional” use, link).

1 comment:

MicroSourcing said...

Content moderation is a perennial issue for social networking sites because social media needs to walk the line between encouraging free speech and being politically correct. While these sites want to be as inclusive as possible, they can't be platforms for divisive content.