Tuesday, January 21, 2020

Regulation of facial recognition software: would this stop with governments?; Twitter tells a startup not to harvest images


The media has been warning the public about the dangers of new AI and facial recognition software and its use in public places.


Buzzfeed says it should be banned.  The Washington Post points out that false positives are a real risk, especially for African Americans.  This could happen even at airports, even to people carrying lawful passports, the Post warns.  The New York Times wants to allow only very limited use by law enforcement.

Other observers suggest that there should be maximum retention periods.

Many observers rightly worry that Western governments could become as invasive in public places as China and use the data to calculate social credit scores in ways that go beyond due process. 
   
That might be a particular risk for access to air travel.
   
It would seem logical to ask what happens if individual private interests get ahold of the software. 
   
For example, I have a library of maybe 20,000 personal photos.  Many of them are outdoors with no identifiable people, but some do have people, as in discos.  Since about 2010, people have generally become more sensitive about being photographed by people they don’t know (as in discos) than they used to be.  But relatively few bars have no-photography policies.  I’ve discussed the issue (with a couple of incidents) on my GLBT blog in the past.



Update: Jan. 23 

The New York Times reports, in a Business section story by Kashmir Hill, that Twitter has sent a cease-and-desist letter to a company called Clearview, demanding that it stop scraping images from Twitter for a facial recognition database.  That is obviously because the company wants to deploy the images commercially for security customers.  But theoretically that could mean a social media company or newspaper would object to people keeping images on their hard drives even for private use (never published online).  Such images might still be reached in the cloud if someone developed the tools to go after individual users.  You could imagine combining this idea with the CASE Act (not yet passed) and the Copyright Office, if you wanted to. 
