Tuesday, June 04, 2019

Big Tech seems to welcome global regulation to keep out competitors, and might not need user-generated content in the future the way it has in the past


There is a lot of talk again about breaking up Big Tech with antitrust law, but a more comprehensive explanation is probably the one Matthew Yglesias offered on Vox on May 3.  Elizabeth Dwoskin and Tony Romm have a newer discussion in the Washington Post on June 4.

Yglesias points out that while a lot of the concern is about consumer prices and competition, and a lot more is about the unhealthy incentive for tech companies to mine consumers’ personal information, much of it is more fundamental.  You’re not supposed to be able to control the platform that distributes things (like the media content from users) and then make the same things yourself.  The problem is partly that the “algorithm” model of driving people into common chambers blurs that distinction.

I actually think, as I’ve said, that there are fundamental problems with too much content being free, and that paywalls are a good thing.  But publishers should offer the ability to bundle paywalls so that users get more varied reading experiences and more balanced information sources.


“Kneecapping them sometimes helps them” (Nick Thompson, in this Wired video on CBS).
  
More regulation might, however, favor big companies and hamper startups and individuals, because of the controversy over layered speech (and over downstream liability, as with copyright in the EU).
In the meantime, Lior Lesig is pursuing an FTC action against possible collusion among some tech companies and payment processors.
  
There is an even more fundamental problem: many better-educated people stay in echo chambers where others think in layers the way they do, but they may become fixated on certain ideas (such as that free speech means all legal speech, or, on the other hand, that minorities must be protected from hate).
   
But there is a tendency for “the masses” to be unable to follow layered thinking on the web, and to assume that bringing something up in discussion (even alt-right ethno-nationalist philosophy) is the same thing as promoting it.  So “toxic content” is hard to define.

Today, the Electronic Frontier Foundation offers a striking page, “The Impact of ‘Extremist’ Speech Regulations on Human Rights Content,” by Jillian C. York.  It leads to another paper, “Caught in the Net,” by EFF, the Syrian Archive, and Witness, written in response to the “Christchurch Call to Action.”  I’ll come back to the details soon.  We still have a problem with “meta-speech,” where the mere mention of something (like ethno-nationalism) is seen as inherently toxic.
