Saturday, June 22, 2019

"Wired" article shows that Congress wants to expand FOSTA-like legislation to other problems like opioids and deep fakes


Christine Biederman has a detailed article in this month's "Wired" on how the "fibbies" seized Backpage in April 2018, right after FOSTA passed, ending a nearly two-decade legal battle. The link is here.  It sits behind a metered free-article paywall (I have a print subscription).  The initial raid in Arizona was quite sudden and broke up a wedding. The title is "Inside Backpage.com's Vicious Battle with the Feds".
  

Section 230 had been used to defeat most earlier attempts against the site.  At some point, Backpage began to "sanitize" ads, removing keywords and memes so that they would not appear to support sex trafficking. It is arguable that such activity would have cost the company its "moderation privilege" (the "Good Samaritan" clause) even under Section 230 before FOSTA. It is certainly arguable that such behavior amounts to intentionally contributing to the crime (it is no longer "stochastic").

The government also used civil asset forfeiture against Backpage, adding a financial burden that made it harder for the company to defend itself legally.  Civil asset forfeiture is unusual in free-speech cases involving digital assets.

The long article warns about another state attorneys-general letter seeking to weaken Section 230 with respect to other crimes, including opioid distribution, identity theft (the subject of another blog in my setup), election meddling, and a newer problem, deep fakes.

Thursday, June 20, 2019

Hawley's bill on Section 230 provokes alarming podcasts and blog posts, including mine and Tim Pool's


A lot of talk in the past 24 hours.


First, regarding the video above: Section 230 belongs to the CDA (Communications Decency Act), not the DMCA.

OK, I gave a 100-minute "billcast" on "Stephen Ignoramus" yesterday; link.

A little comment about my work on the EMP and electric grid problem was accidentally lost in editing. I would also add that I "announced" Hawley's bill at the start of the video but forgot to really explain the "moderator paradox": Section 230 does allow reasonable moderation and algorithmic manipulation of content, yet the major social media platforms do so much of it that many view them as publishers.

I discussed a couple of my predictions: that we will see YouTube selecting whom it wants not only to monetize but even to publish at all on its platform, partly because of the EU Copyright Directive issues, but also now because "hate speech" (especially its stochastic nature on the right) is impossible to define in a way that can be moderated objectively.  I also explained that I need to keep my own speech; I can't be forced into a situation where I have to work for somebody else first (and serve their ends).

Well, all of this follows one of my own more controversial posts on Tuesday, about my plans after 2021, as well as an earlier one there (May 30) proposing bundled paywalls as part of a solution (Stephen was already familiar with that).

This morning, Tim Pool made a particularly alarming post predicting that his channel will be banned from YouTube (although he could continue on Minds, Bitchute, etc.).   He discussed the ban of Black Pigeon Speaks, which was reversed after public pressure. The channel had no strikes but was suddenly accused of hate speech under YouTube's new standards. Pool reinforces the idea that YouTube wants to be more like Netflix, with content from already professional sources and fewer "amateurs" competing with people who have jobs in media.  Pool introduced an article by David Auerbach, "The Coming Gentrification of YouTube".

Generally Google companies give plenty of notice when they plan major changes in policy which can affect creators. But YouTube seems to be moving erratically and suddenly.
  
YouTube has also made statements about “keeping people safe”.  Some minority groups are able to claim that their members are personally more vulnerable to attacks from radicalized persons  (and need much more of a zero-tolerance approach) than most other users, who will not see anything terribly wrong with the speech that is being taken down.

Wednesday, June 19, 2019

GOP Senator introduces bill to remove Section 230 protections from large platforms, unless they allow audits showing political neutrality


Senator Josh Hawley (R-MO) has introduced a bill that would blow up the business models of social media companies: larger platforms would become liable for user content unless they submit to audits showing that they are "politically neutral".  The bill is called the Ending Support for Internet Censorship Act.


It is not clear if this would apply to conventional hosting providers like the host for my Wordpress blogs.

Here is the CNBC story (Mary Catherine Wellons).

Makena Kelly has a story on The Verge.

Elizabeth Nolan Brown has a perspective on Reason here.
  
This story is so new that it will certainly change rapidly.  I will keep tabs on it. Tim Pool has a comprehensive response to it already.  He is largely right so far. At 13:20 he discusses putting Section 230 in the USMCA.



Update: June 21

Elliot Harmon of the Electronic Frontier Foundation argues that the Hawley bill would be unconstitutional and quickly fail in court, and he provides background on the 1995 Prodigy case that led to Section 230.  Remember that AOL didn't fully open up Hometown AOL until October 1996, after Section 230 had passed.

Tuesday, June 18, 2019

YouTube appears ready to ban some subject matter altogether because of user literacy issues



Attorney Viva Frei discusses how one video on his channel, in which he reviewed the legal details of a deposition in the Sandy Hook families' lawsuit against Alex Jones, was removed by YouTube after first being shadow-banned.


Toward the end of the video, Frei admits that some topics might have to be off the table on YouTube and other user-generated content platforms in the future. But if so, a journalist who uses the platform at all might call his own reputation for objectivity into question.

Here is a rather long Twitter thread from this morning. 
  
I had suggested that I looked at Red Ice TV and didn't see anything obviously white-supremacist, just normal conservatism.

A couple of tweets followed about the idea that (1) I was willing to give possibly hidden or stochastically Nazi-sympathetic sites and YouTube channels the benefit of the doubt in order to protect free speech, but (2) I would not give Carlos Maza the benefit of the doubt in his insistence that deliberately anti-gay channels be deplatformed.
   
The calculation in (2) seems to be that Carlos believes some material like Crowder's will tempt "enemies" or unstable people to attack more vulnerable LGBT people (trans, fluid, low-income, and POC), even though the same people probably don't threaten me as someone who is better off. Protecting human life takes a higher priority than protecting individualized free speech.  But if this is true, individuals like me must become more comfortable functioning as parts of groups rather than on their own.

Monday, June 17, 2019

Harvard really shouldn't let social justice mobs drive its decisions


Robby Soave on Reason discusses Harvard’s rescinding of its admission to “conservative” Parkland survivor Kyle Kashuv. 

Two years previously, at age 16, Kyle had apparently indulged in some silly behavior with racial slurs on Google Docs.  It appears that he thought this chat was "private". It also appears that social justice warriors, on both the far Left and far Right, dug it out.

Of course, as a private institution Harvard can do this lawfully, but I think it is a serious mistake for universities to pay attention to information sent to them deliberately by "mobs".  We are seeing Patreon all over again.


It is also disturbing that Harvard invited Kyle to write an explanation and apology letter, and still rescinded admission.

I remember saying some pretty horrible things myself, particularly in ninth grade (age 14), which got me a rare call into the office.  Had that happened today it would have meant expulsion and being sent to an alternative school. Dr. Phil, in some programs back around 2007 centered on Myspace, would talk about "Internet mistakes" and warn that the teen brain is not mature enough to see around corners. True, some teens are mature enough now to invent anti-cancer tests or fusion reactors.  The best teens are more mature today than they were when I was growing up.

This is a very regrettable incident. Reason reports that right-wing mobs have tried to get David Hogg disinvited, and it's hard to ignore the likelihood that Kyle's association with conservative views contributed to the rescission.

David Brooks weighs in with a paradox on how we develop morality as we grow up. 

At this point, it's important to realize that this incident happened on a Google Docs study-group document that was not supposed to be published in public mode; it was intended to remain private. We all know Dr. Phil's warnings, again, about seeing around corners. Maybe this is a matter for school discipline -- or would have been, had it been caught at the time.  Private messages really should not become fodder for SJWs in the future.

Zack Beauchamp of Vox gives a detailed explanation of how conservatives and liberals view this incident.  In general, the problem is that racism (and maybe sexism and homo/transphobia) is seen as such a structural problem that enhanced awareness of behavioral codes is demanded of everyone, even with private communications. 

Forbes's Richard Vedder regards Harvard as an "embarrassment".

Monica Hesse writes a piece in the Style section of the Washington Post (about the "Kyles" of the world) that might sound vengeful. What happens to Kyle now?

We need to pay attention to how the far Left, especially right now, acts as if it has found that combativeness works, rationalizing it by the need to protect its own tribe from random violence by "enemies".
  
If you want to see a great channel by a Harvard undergrad, look at John Fish (from Canada).  It is 99% non-political (he seems to follow Jordan Peterson's idea of individualism), but he has talked about the "attention economy" and the dangers of social media recently.

Sunday, June 16, 2019

Pinterest creates "privacy claim" risk for bloggers criticizing them (over pro-life position)


I won’t try to give all the details of the Pinterest mess. 

But a pro-life whistleblower was fired after he exposed Pinterest's internal marking of a pro-life group as "porn", and then a Project Veritas video on the matter was removed for a "privacy violation" regarding one of the employees (???).  Even Tim Pool's video on the subject was removed for showing a clip from the offending video.


There were also claims that the "privacy" violation was simply reporting someone's name from a news story, as if any blog post that does so could be taken down, even if the name had already been disclosed in a linked article.

A right-to-life argument ought to be in favor of abolishing Selective Service, for moral consistency.

I sometimes see anti-abortion demonstrators in front of an office building in Falls Church VA on Lee Highway.

Friday, June 14, 2019

EFF discusses big platforms with respect to transparency in censorship policies, post Voxadpocalypse


The Electronic Frontier Foundation reports, in a piece by Gennie Gebhart, that "social media platforms increase transparency about content removal, but many users are kept in the dark when their speech is censored" (basic link).  The series is called "Who Has Your Back?"
   
The article links to a table listing major social media platforms and rating them on (1) responding to legal requests, (2) responding to platform requests, (3) giving notice, (4) appeals notice, (5) appeals transparency, and (6) the Santa Clara Principles.

YouTube, Facebook, and Twitter were reported as meeting the Santa Clara Principles, but all have had serious problems.


Oddly, Wordpress was not listed as complying with the principles, and Vimeo was not listed as complying with anything.

There seems to be little progress on YouTube's response to Ford Fischer's situation, that is, on allowing some monetization of live (neutral) reporting of critical historical events as they develop, even when that reporting includes extremist speech.

Perhaps YouTube believes this sort of material should be produced only by larger outlets, particularly documentary film producers (HBO, Participant Media), and that the amateur model no longer works.  I would think these outlets will fund more documentary film on Charlottesville.

YouTube is very concerned about less-educated users misusing such material for radicalization, and about being blamed for profiting from it. It also seems concerned about speakers who reject the idea of protecting people specifically because they belong to particular groups rather than on individualistic values.
  
Reason has an important op-ed by Nick Gillespie arguing that muscular censorship is really bad.

Thursday, June 13, 2019

"Deep Fake" videos could set people up for ruined reputations, even deplatforming


Ben Collins of NBC News demonstrates the potential "deep fake" problem with a video in which Bill Hader (SNL) morphs into Arnold Schwarzenegger; the transition starts about 10 seconds into the video and takes six seconds.


I have to say, I would not have noticed the transition.

Similar tricks have been played with Mark Zuckerberg and Nancy Pelosi.

This is said to be a national security or homeland security problem.

Tim Pool has pointed out that it would be possible to discredit someone by deep-faking their making an overtly racist remark. Social justice activists might really try to do this.
  
It’s interesting to wonder if libel and defamation law already cover this.

Wednesday, June 12, 2019

House Judiciary Committee holds hearing with legacy newspapers on the idea of a pseudo-link tax on social media companies (effectively)



The House Judiciary Committee opened hearings (June 11) on a request from the legacy news industry for exceptions to antitrust laws written for old legacy media.   Cecilia Kang has the New York Times story here.
  
Newspapers want to be able to bargain for better prices when their stories are copied into social media.


Yet to me this sounds like a variation of the “link tax” going into effect in the European Union as part of its Copyright Directive.

Tuesday, June 11, 2019

The demonetization of independent journalists (over embedded toxic content) on YouTube is rapidly coming to a head, and could spread to other areas if not "contained"


Today I sent three tweets to YouTube executives as a formal reply to the "conflation" or "meta-content" problem posed by Ford Fischer (News2Share) and other independent journalists: when they report on protests and demonstrations, some of the people they film articulate "unacceptable content", which then becomes embedded in the "meta-content" (that is, reporting that embeds toxic content from speakers or protesters at a public event).
  
Here is the Twitter link. I've also documented this in Minds ("/jboushka").

We’re waiting to see how YouTube will answer.

Again, there are several elements.  The main problem is demonetization: independent journalists can't make a living at this, or the content can't pay for itself, which in turn causes other problems we have already talked about. YouTube fears that advertisers will object to content that contains livestreams of violent extremism, even when it is posted as journalism.

And YouTube could reasonably be concerned that some viewers will not understand the idea of journalism and will be radicalized by the embedded speech anyway.  Imagine if the Internet had been available in 1933.

But YouTube seems especially caught off guard, not having thought through a problem like this. Sundar Pichai (CEO of Google) called it a "hard computer science problem".  Ironically, John Fish, a Harvard undergraduate with a video channel about undergraduate life as a computer science major, while usually non-political, has dropped hints on his channel that these kinds of problems are coming for the whole industry. It's as if Ford had created a "final exam" problem for a university journalism course.
  
The problem could conceivably become a concern on Blogger, and in general on hosted platforms. But text (as opposed to video) is generally not viewed as so enticing (although remember how the Christchurch "manifesto" was feared and quarantined in New Zealand).  The notion that certain "ideas" must be quarantined as "viruses" seems novel, but it may be the point of some "stochastic terrorist" activity.

    
I am beginning to believe that the June 5 demonetization of many journalistic or political commentary-oriented YouTube channels (channels otherwise not supporting commerce) is only coincidentally related to the Maza-Crowder problem and may even go back to 2018, when Susan Wojcicki made an alarming statement about the EU's Article 13.

There are important stories today on Twitter, too, about embedding other people's tweets in direct messages, and about little-known ways to troll with empty honeypot accounts; Twitter will address these, but I did not get around to reporting them in detail today.

Sunday, June 09, 2019

Discovery process in Damore lawsuit could have implications for social media; a new bill balances traditional media against big tech


As I got home from Pride last night, I found this interesting video from Tim Pool reporting that Google will have to go through discovery in a lawsuit from James Damore over bias in its employment practices.


It is certainly correct that "conservatives" are not a protected class (although in immigration law political belief can define a social group, an interesting distinction).  I don't personally like doing things by protected class.  But that's part of what the 14th Amendment means as applied today.

The important point is that discovery will examine Google’s and YouTube’s business models. This could call into question many of their products and how user-generated content is monetized further, even on this platform (Blogger), where I have concerns over the long term. 
   
Pool noted the problems of depending on ads to pay for content, and suggested making big tech platforms "utilities" where you pay for metered use.
  
 I have suggested "bundled paywalls" and there is some evidence coming my way that this idea (needing funding for a startup) is getting attention. 
    
CNN this morning mentioned a new bill, HR 2054, the "Journalism Competition and Preservation Act", which would carve out an exception to existing antitrust law and allow local newspapers to bargain collectively with large tech companies. There will be hearings in the House on this soon.  The News Media Alliance has promoted the bill.  The group notes that today only 20% of newspaper revenue comes from subscriptions (down from what it used to be) and that dependence on advertisers is getting as precarious for newspapers as for tech sites, since it is easily manipulated by extremist activists.

Friday, June 07, 2019

Twitter simplifies its TOS rules; Vox seems to double down on demands on YouTube, putting embedded political content in more danger everywhere?


Two days after YouTube Purge 2.0, Twitter took a good step and simplified its rules, reducing the word count from 2500 to 600; story here.

The rules, as stated, seem fair and neutral enough, and if applied as written, may show that Dorsey et al. took seriously their confrontational interview with Tim Pool last fall.  One of the more interesting rules has to do with election integrity. But this is not an existential problem for users, because the fears over election influence that had surfaced back around 2004 would be much more applicable to bloggers and vloggers.

The controversy over the YouTube purge continued Friday, as Vox seemed to double down on its “demands” of YouTube regarding GLBTQ+ creators, but in a way that seems identarian and polarizing.

A piece by Aja Romano on Vox seems to object to the presence of hate speech even when embedded in journalistic reporting, at least as applied to Crowder's case.  But that case isn't so much about argument as about comedy and parody, for which there are other precedents.

But YouTube had already rolled out its sudden purge of monetization of controversial news videos, even those justifiable by context.  Presenting someone talking about Nazism was seen as promoting Nazism because it seems gratuitous. But then, libraries still have "Mein Kampf", right?  Or is it the quick access of the Internet that means homemade journalism about extremism is no longer to be allowed, at least as a career?
  
  
That seems to be the case for now, as Carey Wedler interviews Ford Fischer. Let's hope that YouTube rethinks this again.  Without independent journalism, Nick Sandmann would still be wrongfully seen as a pariah, because the trusted mainstream fourth estate didn't do its job and it took independent journalists to provide another set of eyes on what they had missed.

A Rolling Stone piece by Matt Taibbi is even more blunt about the meta-content problem.

Wednesday, June 05, 2019

YouTube announces a purge or demonetization of a lot of political content, amid controversy associated with a Vox journalist; Russian trolls may be using American blogs



 I was “on the road” today, and the mice played.

Seriously, YouTube (on its Creator Blog) announced a policy that will demonetize a lot of political content on YouTube and remove content that promotes "ethnic supremacy" (mostly white nationalism in the US and Europe). The blog post is "Preventing Harm to the Broader YouTube Community". YouTube says it will also shadowban marginal content, such as claims that major historical events didn't happen. Would Logan Paul's satire about Flat Earthers get downgraded?

There is a lot of detail here.  Look at the Twitter feeds of Carlos Maza, Tim Pool, and Ford Fischer.  I'll have to come back to some of this again.  There is a theory that Vox (Carlos's employer) is doing this to force YouTube to demonetize low-cost independent journalists who are seriously eroding larger established companies' ability to make revenue and causing layoffs.

One of the problems is the idea that the only safe way to handle domestic extremism (especially white supremacist ideology) is to quarantine it, to keep people from talking about it in order to look smart themselves or to make money on their channels. Some of Maza's tweets suggest he believes that merely giving an extremist a voice at all, like an interview on a news clip, will lead immature or mentally unstable people back into gun violence – and then we get back to the activism of David Hogg and his friends.
  
In fact, Ford Fischer (owner of News2Share) reports that his channel was demonetized almost immediately after the announcement, and one or two of his videos were removed.

I personally think the public is safer if people know what has been said in a rally in a public place.
     
But there is an issue with defining "trusted content providers", which, in the case of independent individuals, sounds like social credit: making sure that indie providers don't behave in a way that makes extremists feel they will get attention if they protest and propagate their views.

After the recent Facebook Purge 4.0 I feared a YouTube Purge 2.0 was coming.  It’s here.
   

Before the news about the Purge reached my cell phone (as I left a movie in Woodbridge), I had seen, at a Starbucks before the movie, an NBC story about the trolls at the Internet Research Agency in Russia. I've noticed that three of my 16 Blogger entities (Bill Retires, Bill on Major Issues, and Identity Theft) have much higher per-post access counts than the others, a pattern confirmed by Analytics.
     
That could suggest that Russian or even Chinese bots are accessing them for use in their own internal propaganda, for example, taking low birth rates and social security as an issue and transposing it to Russia.
   
This is disturbing.

Update:  June 6 (on "Voxadpocalypse")

The Verge (a Vox imprint) has an article by Elizabeth Lopatto discussing further actions YouTube could consider. 

YouTube has a page on advertiser-friendly policies.  YouTube says a content creator should voluntarily flag controversial videos for no-ads to keep his channel monetized. That means that a creator would need to create a lot of non-controversial content and earn ad revenue from that to pay for the controversial content.

Again, some of the important policy or news-based independent channels do this for a living (for their owners).  I don't.  That creates yet another set of problems that I have taken up before and will have to revisit again.  AdSense sometimes won't display ads on Blogger posts with more sensitive topics.

Here are the hate speech policies with examples.  Most of them are straightforward.  But there can be nit-picky problems.  For example, in the real world, transgenderism has to be recognized as a medical issue before insurance will pay for treatment and surgery, which is what most liberals want. There are a couple of other logical fallacies there if you look for them.

(June 7 note: I just replaced an incorrect link to the YouTube blog post; I had cached an older one.)

Tuesday, June 04, 2019

Big Tech seems to welcome global regulation to keep out competitors, and might not need user generated content in the future the way it has


There is a lot of talk again about breaking up Big Tech with antitrust laws, but a more comprehensive explanation is probably the one from May 3 by Matthew Yglesias on Vox.   Elizabeth Dwoskin and Tony Romm have a newer discussion in the Washington Post (June 4).

Yglesias points out that while a lot of it is about consumer prices and competition, and a lot more is about the unhealthy incentives for tech companies to mine consumers' personal information, much of it is more fundamental.  You're not supposed to be able to control the platform that carries things (like media content from users) and then make the same things yourself.  The problem is partly that the "algorithm" model of driving people into common echo chambers blurs that distinction.

I actually think, as I’ve said, there are fundamental problems with too much content being free, and that paywalls are a good thing.  But we should offer the ability to bundle the paywalls so that users have more varied reading experiences and more balanced information sources.


“Kneecapping them sometimes helps them.” (Nick Thompson in this Wired video on CBS).
  
More regulation might, however, favor big companies and hamper startups and individuals, because of the controversy over layered speech (and over downstream liability, as in the EU with copyright).

In the meantime, Lior Leser is pursuing an FTC action against possible collusion among some tech companies and payment processors.
  
There is an even more fundamental problem: many better-educated people stay in echo chambers where others think in layers the way they do, but they may become fixed on certain ideas (like "free speech means all legal speech", or, on the other hand, "real minorities must be protected from hate").
   
But there is a tendency for "the masses" to be unable to follow layered thinking on the web, and to think that bringing something up in discussion (even alt-right ethno-philosophy) is the same thing as promoting it.  So it is hard to define "toxic content".

Today, the Electronic Frontier Foundation offers a striking page, "The Impact of 'Extremist' Speech Regulations on Human Rights Content", by Jillian C. York.  This leads to another paper, "Caught in the Net", by EFF, the Syrian Archive, and Witness, in response to the "Christchurch Call to Action".  I'll come back to the details soon.   We've still got a problem of "meta-speech", where mere mention of something (ethno-nationalism) is seen as inherently toxic.

Saturday, June 01, 2019

David Brooks presents Solutions Journalism Network as a bridge between talk and action (maybe for less sociable people)


David Brooks, the Canadian commentator who tells us how “to be good”, has a few recent commentaries that seem to point to the need for more singleton-like people (like me) to become re-socialized. 

The latest post (as of today) talks about trolls (I might even be one, given his flexible definition) and crybullies (which only at first glance refer to the ragebait social media users from the identarian Left).

But the most important one recently seems to be the one on May 16, “The Big Story You Don’t Read About” with the tagline “Journalists don’t always cover what’s really going on”.

He talks about the Solutions Journalism Network, whose people are responding to problems, largely locally, as what Brooks calls "weavers" (#WeaveThePeople), builders of social capital.

 

I have to admit that for some time I have worked largely alone, without responsibility for anyone, which, according to some interpretations of Nassim Nicholas Taleb's "skin in the game" theory, would mean I shouldn't be allowed to keep my own individual voice online at all.  I'll come back to that later; it seems like an extreme, ironically Marxist theory. Yet we're seeing it in China already, and Big Tech is starting to think about "social credit" even here.  Should people be expected to engage in big community service projects (maybe even traveling and camping out at disaster sites) with non-profits, with people they don't know?  Should they just jump in, enlist, and take orders again?

Recently, WJLA in Washington DC reported on a case where a young boy needed a kidney transplant and, even with a donor organ available, could not receive it as long as he "lived in a hospital" (as in the recent film "Five Feet Apart").  So one of his teachers (a male) became a foster parent and took him in. That is off the charts for me, but it is engagement at a personal level that was unimaginable the way I was brought up.

Friday, May 31, 2019

Does social media interfere with teens' intellectual development and brain pruning? Well, not for the best performers, but then there are "average people"



Tim Pool is always tweeting that Twitter is a clickbait-driven cesspool.  EFF’s Elliot Harmon is a little kinder, but says that Facebook is nothing.

Now there is a study reporting that Twitter use has a negative effect on learning and the ability to develop abstract thinking in teens. The Hill (Rachel Frazin) and The Washington Post (Isaac Stanley-Becker) report here.  By the way, the studies don't even like blogging (or reading blogs) or, I guess, watching YouTube. 
    
I personally think that teens who succeed in “real life” activities in school don’t run into this, and teens typically succeed because they learn abstract thinking skills earlier in life.  Overuse of technology earlier in life can definitely be bad, as pediatricians report.


Jack Andraka invented a new pancreatic cancer test in high school and now has Stanford University behind him promoting his career, quite publicly.  He doesn't need his own video channel.  But he says he has been an avid social media user in the past, although recently his tweets have become less frequent. That probably means something. Taylor Wilson (the scientist, not the protester) built a fusion reactor as a teen and now has the career he wants at the University of Nevada, Reno.  Some teens have learned that they can make a living on social media "the right way" (and maybe avoid Logan Paul's mistakes).  Harvard undergraduate John Fish gets more views on his educational YouTube channel than Tim Pool.  The clue to all of this is learning abstract thinking early in life.  David Hogg (however you feel about his political positions) has accomplished a lot with his activism.  Despite the hype, not all social media is identarian rage bait.

Again, people have to learn layered thinking.  They can get this from positive experiences in the real world.  So too much social media can get in the way.
  
We have a problem in that our competitive society is leaving a lot of “average people” behind, and driving them into identarianism and making them susceptible to manipulation by propaganda in social media and echo chambers. Ironically, the intellectual elites hardly see that this is going on, as the rage bait never shows up in their feeds because of the algorithms.
  
Is David Pakman really going to give up social media, forever? 

Picture: a library in a museum at Stanford (my visit 2018). 

Thursday, May 30, 2019

Facebook removes video of a "Proud Boys" press conference based on its "Dangerous" ban, raising questions as to whether journalists can use the platform with integrity




Wednesday, Ford Fischer, owner of News2Share in Washington DC, reported that Facebook had removed a livestream that he had filmed near the Lincoln Memorial on Monday May 20, 2019, of the Proud Boys announcing a lawsuit against the Southern Poverty Law Center.
  
I will provide the link from YouTube for this News2Share video, this time without embedding it.  The visitor should form her own conclusions after watching it. 

The SBS in Australia has a brief “Dateline” excerpt where Gavin McInnes explains what he sees as society’s war on masculinity and manhood.  This seems more like just a challenge to radical feminism.

  
I recall a news report where some Proud Boys members were arrested in NYC after a brawl with Antifa.  It didn’t sound like a terribly earthshattering incident.

Milo Yiannopoulos had appeared at one of their events, and then Patreon bluntly told him he could not use its service because he had been associated with the group at all.
  
Wikipedia, which is generally pretty balanced, does characterize them as Neo-fascist, and yet I hear very little in the way of factual reporting that backs up such a damaging assertion.

Ford Fischer describes his issue in a long tweet and also on Facebook, which did allow his report on the removal of the video to stay. He says he was the only journalist covering the press conference.  Yet the public should know about this.

I answered this with a tweet storm of my own.

The reason for the takedown seems to be, bluntly, that Facebook had named Gavin McInnes a "dangerous individual" and has an explicit policy regarding "dangerous individuals and organizations" here.  Yet McInnes apparently has quit the group.
  
The policy seeks not only to ban them from using Facebook, but also to prohibit others from covering them with news stories or discussing them except to condemn them – effectively to "quarantine" them.

This is particularly objectionable because of Facebook's monopoly on social media – Chris Hughes is right, Facebook has too much political power.  It is hard, based on the facts, for me to believe that a few of them "deserve" such a public condemnation.
  
But it is true that a large portion of the Far Left perceives a “Nazi-like” threat to previously oppressed groups and believes, maybe from the example set by Germany in the 1930s, “quarantine” and forced solidarity is the only way to counter the threat – a combative approach. 
  
It is pretty easy to see the threat from radical Islam (ISIS and Al Qaeda) and single it out, and it is fairly easy to recognize dangerous states like North Korea and Iran and isolate them from US or Western social media.  It is much harder to separate out fascism, or even to separate it from communism at the extremes.  Anti-Semitism or overt racism is much harder to separate from passivity or indifference toward intersectional claims.  Facebook considers "white nationalism" equivalent to "white supremacy".  Yet does this mean that governments like Hungary's (ethno-identity on the right) are to be considered "dangerous"?  Furthermore, the gay community is split over the more radical demands of the trans community and the idea of personal "body fascism", as a few recent YouTube videos have shown.

The Verge and Engadget both report that Twitter ("The Church of Jack Dorsey", as Tim Pool calls it) is now re-examining how it should handle what the left calls neo-fascism, when some of it is probably closer to mainstream conservatism and sometimes even to libertarianism.

In the meantime, I have to say that if a social media company (especially Facebook) wants to “quarantine” certain individuals and groups based on a “no fly list”, it is very difficult for journalists to use the service with integrity.  The mere continued use of the platform might imply a liberal bias and undermine objectivity.  There could even be problems with "mainstream media" continuing to use the platform for news if that implies omissions and lack of objectivity now.  

On the other hand, Facebook has said it is pulling back from welcoming journalism or pretending it can replace major “professional” media with amateurs and maintain objectivity. (And the “professional” media did not prove trustworthy on the Covington Boys case;  it was independent journalists like Tim Pool who busted the original story, resulting in defamation lawsuits against several major media companies.)   It wants communities to use it for personal matters, fund raising, charities, art projects, and even emotional support. 
  
I may indeed need this when I have my music and novel ready later, but right now it's a problem when I use it for "reporting", because I can't be objective if I have to exclude certain groups or persons.





Wednesday, May 29, 2019

Self-publishing coaching site discusses personal memoirs, and guest posting on blogs



I found a good site on self-publishing that is worth passing on.  It is called "Just Publishing Advice".

I’ll point out two particular links right now.

One is a discussion of the difficulties in selling self-published non-fiction, particularly memoirs, which are to be distinguished from factual biographies usually written by a contributing author. (Yes, “biography” was a specific genre in eleventh grade English class when we did “literature”.)
  
The latter part of this article is pretty frank, and I certainly agree with its "who cares?" point. (That was a meme at a book authors' conference in Denton, TX, back in January 1988 that I still remember well.)


In my case, with the first “Do Ask Do Tell” book, there was a controversial issue (gays in the military, and my connection to it, and particularly the novelty of the issue just a few years after the peak of the AIDS crisis) that attracted attention and did result in reasonable sales in the first eighteen months or so (with three speaking engagements in the Twin Cities area, at campuses and at a Unitarian church).  The third book, with the eldercare and workplace chapters, seems more personal, and the “gay rights” chapter may seem stale now because gay marriage has since become the law of the land and is no longer controversial as it might have been.




I also want to point out their discussion of "The right and wrong way to guest post".  When Ramsay Taplin ran Blogtyrant (until June 2018), he was big on recommending guest posts.  Rick Sincere (a libertarian video and podcast host in Charlottesville, Virginia, who used to be associated with Gays and Lesbians for Individual Liberty, GLIL, back in the 1990s in Washington DC) often offers guest posts on his blog.  But they take a lot of time.

  
I do accept guest posts on two of my Wordpress blogs.  Many submissions to me seem to serve tribal interests and are one-sided, but I try to encourage contributions where some sort of moral or ethical principle can be extracted.  For example, I did publish a guest post about the border wall between Israel and the West Bank merely to reinforce the idea that border security by itself is not inherently the result of racism.


Saturday, May 25, 2019

CNN reports on worker falsely smeared for "racism" on the job by a misleading viral video


John Blake on CNN reports on a young woman who was smeared online with contrived posts after she, working at a Chipotle in St. Paul, MN, refused service to misbehaving African-American men in her store near the end of a shift.  She was called "racist" when she is a PoC herself (which is not obvious).  The men apparently conspired to make up a story about her on social media.

She was fired, and offered her job back when the company found out about the libel.

The article discusses three problems: confirmation bias, mob vengeance mentality, and the lack of inclination of individuals to question the tribe.


A similar problem occurred with the Covington Boys in January when the mainstream media misinterpreted a video fragment.

I am not sure how the mob knew her name or identity but the article reports that it is a common setup to make a viral video of someone behaving in a way that appears racist at first glance.