Monday, April 30, 2012

FCC: "Street View" led to personal data collection, which was not illegal (at the time)


Again (the second time this weekend), I don’t like to bite the silverware at the dinner table, but, for the sake of completeness if nothing else, here’s a story widely reported, here from PC World, that the FCC (not FTC) is saying that several employees and at least one manager at Google knew that its “Street View” project was collecting personal data, apparently from unsecured WiFi networks (the “wardriving” problem).  The link to the story by Adario Strange is here.

The FCC has the Report, called a “Notice of Apparent Liability for Forfeiture”, on Scribd, here. The names of individual employees are redacted.  The FCC says that the data collection did not break existing federal law, but imposed a $25,000 civil fine for allegedly not cooperating with the investigation.

Jessica Guynn has a story in the Los Angeles Times Sunday, here.  The story is illustrated with pictures of the vehicles involved. 
  
CNET has a detailed report by Edward Moyer, here.

There is no evidence that any information was otherwise mishandled or distributed, and the matter probably has no practical significance to people.  The biggest danger to individuals from any unauthorized data collection might come to those in unusually sensitive family or business situations where someone might have enough incentive to try to get and misuse the information to target them. 
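
For readers curious about the mechanics of “wardriving”: any WiFi radio in monitor mode can passively hear the beacon frames that access points broadcast, which is how roving cars can tell which networks are “unsecured” (and why payloads on open networks are readable by anyone nearby). Here is a minimal sketch using Python’s scapy library; it only lists networks and their encryption flag, and it assumes a Linux wireless interface already placed in monitor mode (the interface name “mon0” is hypothetical).

```python
# A minimal sketch: passively list nearby WiFi networks and whether they
# advertise encryption (the "privacy" capability bit in beacon frames).
# Assumes Linux, root privileges, and an interface already in monitor mode.
from scapy.all import sniff, Dot11, Dot11Beacon, Dot11Elt

seen = set()

def show_network(pkt):
    if not pkt.haslayer(Dot11Beacon):
        return
    bssid = pkt[Dot11].addr2
    if bssid in seen:
        return                      # report each access point only once
    seen.add(bssid)
    ssid = pkt[Dot11Elt].info.decode(errors="replace") or "<hidden>"
    caps = pkt.sprintf("{Dot11Beacon:%Dot11Beacon.cap%}")
    status = "secured" if "privacy" in caps else "OPEN (unsecured)"
    print(f"{bssid}  {status:18s}  {ssid}")

sniff(iface="mon0", prn=show_network, store=False)   # "mon0" is hypothetical
```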

Picture: More from DC Science Exhibit. 

Sunday, April 29, 2012

FTC is looking over the shoulder of the "search engine giant"


On Saturday, the New York Times “Business Day” published a piece by David Streitfeld and Edward Wyatt, “U.S. Move Has Google Fighting on 2 Fronts”, link  (with paywall). 

The other front is in Europe, where the legal regulation of Internet companies is stronger.

But the FTC, like the European Commission, is also looking at whether search engine placements by Google might violate antitrust laws, and be harmful to businesses, perhaps not affiliated, that do not score as well in search placements.  I won’t get into biting the “hand that feeds me” here too much, other than to urge reading of the article (paywall).  But it’s hard to see, in terms of “freedom of speech” (even commercial speech), why the company would not be free to display results and associated ads (possibly paid) "more or less" as it sees fit.   It would seem (to me, at least, from general observation) that Microsoft’s Bing and Yahoo! Safe Search (for Firefox) follow similar techniques, but have smaller penetration in the market.

I generally don't depend on raw search engines for comparison shopping.  For airfares, I usually either know about specific bargains, or I go to Priceline.  For example, airlines (American, Delta) offer ground hotel and rental car packages (sometimes significantly discounted) by "search"; could they be exposed to the same regulatory pressures?  (I generally find airline-offered deals competitive.)

Search results have also been controversial in the subject of online reputation, although Google, Bing and Yahoo! also seem to place professional-looking pages that they can find about a person first.  


Pictures: Science and Engineering Exhibit, today, Washington Convention Center; I got there rather late. 

Saturday, April 28, 2012

Political ad data for broadcast networks will be published on the Web, by FCC rule


In a New York Times column called Media Decoder, “Behind the Scenes, Between the Lines”, Brian Stelter reports that major network broadcasters will be allowed (and even required) by the FCC to move their political ad database to the Web for the public to peruse easily.

The concept sounds important because a few years ago, the “indirect” or “implicit” campaign support given to candidates by novice bloggers was seen as legally troublesome given the results of a 2002 campaign finance reform law.  This was finally resolved administratively by the FEC in 2006, still within the Bush years (an administration that probably had a political incentive to leave free rein for visible conservative bloggers).

Smaller stations and cable networks probably would not post political ads online.  (I suspect that CNN and CNBC will, however).  Independent blogging operations would not, at least for now.

The link is here.


Friday, April 27, 2012

When is behavior "selfish" and when is it "altruistic"?


While I plow through Edward O. Wilson’s “The Social Conquest of Earth”, I’m goaded into wondering which behaviors of my own, and of others around me, are “selfish” and which are “altruistic”.

The most obvious “selfish” behaviors, in the modern world, would be those that take care of oneself and that maximize one’s “pleasure”.  Going to work and doing your job dependably is, in sociological terms, “selfish” if you support only yourself.

Self-expression and “recognition gaining” seem like “selfish” actions.  Artistic creations, such as musical compositions (especially in the classical world) are usually the result of individual effort.  Because publication and performance tends to satisfy the ego, most of us would see creating and disseminating them as “selfish”.  (You have to be very good at your craft and work diligently to succeed at this, but that doesn’t change the idea that it is in a sense “selfish”.)  Likewise, blogger journalism might be seen as “selfish”.  The true-life young character Zola (in the film “High Tech, Low Life” at Tribeca, movies blog Apr 26) who goes off on an adventure to blog in China is satisfying his own need for recognition and accomplishment and a sense of influence, to the chagrin, perhaps, of his family, which wants to see him seeking a wife.  Mark Zuckerberg’s creation of Facebook seems to be driven much more by ego and a desire to “be important” than making money per se.   But Wilson might see these as partially altruistic, if the output from such individuals has a beneficial influence on human culture.

In comparison to self-driven expression, marrying and having children sounds altruistic, done for the “group” (the natural family as extended by one’s parents). But in a practical sense, most adults who have children see them as part of their “selfish interest”, just as most people see their life partners.  Generally, people are “selfish” when they pursue potential mates, and particularly if they are jealous about the competition from others.  In the gay world, men are “selfish” when they pursue potential liaisons, whether short-term, or permanent life-partners.   Likewise, people are “selfish” when they are concerned about the “attractiveness” or “desirability” of (potential or real) partners, which may not be sustainable for a whole lifetime.

However, doing things for extended family members other than a partner or children is more clearly altruistic.  Taking on foster children sounds altruistic, even if compensated.  Adopting children sounds like both, but it sounds more altruistic if the children are special-needs.  Parents often expect older children to learn to take care of younger siblings, and this is clearly expecting “altruistic” behavior, because the siblings are the result of the parents’ activity, not the older child’s choices.  Volunteerism is generally “altruistic”, and extends into activities that can involve personal risk taking, such as fire-fighting and military service.  In the past, military service could be conscripted and was an example of mandatory altruism.  The same idea exists with the idea of national service today.

I find that I am constantly “bombarded” with calls to join “other people’s causes”, and accept the direction and regimentation imposed by others, which of course can interfere with getting my own personally chosen goals accomplished. 

Balanced personalities are more likely to be willing to accept goals developed by others and even to recruit others into these goals.  At worst, they can stumble into hucksterism.  Unbalanced personalities are more likely to insist on staying on their own paths. They can become prone to narcissism. 

The concept of eusociality would probably regard "chosen" behaviors as "selfish" and those that involve responsibilities that cannot be chosen as "altruistic".  I guess nature, with the idea of group selection, denies the "Axiom of Choice".  It's the unchosen ones that form the backbone of moral teaching early in life.  


Thursday, April 26, 2012

Catholic school firing of teacher may focus attention on "reasonableness" of religious teachings, even by mainstream values


A Catholic school (high school) teacher, Emily Herx, was fired from her job after telling her employer, the St. Vincent de Paul School in Fort Wayne, IN, governed by the Catholic Diocese, that she was using sick time to take fertility treatments.  She had a medical condition that prevented her from becoming pregnant “naturally”.  But she was trying to have a child with her husband in a traditional family.

The Church claims she signed a contract to honor Catholic teachings, and having conception from any source other than marital intercourse is a “sin”.  (Conversely, the Vatican insists that access to sexuality in any fashion outside of marriage and without openness to procreation is a “sin”, an idea that many scientists say does not comport with the way nature evolved).

Her pastor said she was a “sinner”.

She is suing the diocese, but previous Supreme Court rulings suggest religious organizations can fire people for not obeying religious rules.

She says she teaches only “language and literature” but not church doctrine.

The libertarian position here is interesting.  If an employer is allowed to dismiss people for behavior inconsistent with its values, then the public is more likely to question whether these values make sense when publicity for the dismissal occurs.   Many people will learn about this case and react with the belief that Catholic teachings in this matter seem unreasonable, even according to mainstream notions about family.  (The Vatican is, of course, very determined to defend its authoritarianism on church teaching, and demands that everyone bear his own crosses to follow it.)  But in the past, of course, government and “religious values” could collude to monopolize the message being passed.
    
I have tended to look at this sort of thing as an ethical “conflict of interest”.  When I was working on my 1997 book very critical of the military’s ban on gays (and “don’t ask don’t tell” policy), I was working for an insurance company whose most visible line of business was sale of life insurance to military officers.  I felt that I should not continue earning my living from such a source.  Fortunately, after a merger, I found another position in the company in another city (Minneapolis) that did not create such a conflict.

Monday, April 23, 2012

Online reputation experts weigh in on the "do's and don'ts" of fixing one's own


I noticed a year-old article (link) in Time, “Repairing Your Damaged Online Reputation: When Is It Time to Call in the Experts?”, by Megan Gibson, April 19, 2011.  She discusses two major services, Reputation.com and Integrity Defenders (link).   It’s possible, of course, that some of the prices and information could have changed since publication.

I checked my own a few moments ago, by my legal name (“John W. Boushka”) in Google and it still looks pretty clean.  The websites that contain my name come up ahead of “doaskdotell.com”, and a Master’s Thesis from grad school shows up high, which would generally be a good thing.

It’s clear that people who have to sell to the public for a living (insurance agents, for example, or financial planners) or who are entrusted personally by the public (teachers) can have a lot at stake with the permanence of negative digital information popping up in the first few pages of a Google, Bing, or Yahoo! search.  It’s remarkable how suddenly this whole issue became a problem in the 2005-2006 time frame, in the days when Myspace was still more popular (or more universal) than Facebook.

Another good question could be, would the popularity of mobile computing affect the way reputations are perceived?

As noted in a posting here March 12, the presence of “tattletale” sites, discussed on Anderson Cooper’s afternoon show, creates issues, as people can turn in others suspected (often incorrectly) of things (like STD’s) and then possibly be extorted into some sort of reputation cleanup.  As I discussed there, it’s controversial and unclear how much the Section 230 downstream liability protection for hosts fits in with these sites. 

Professional reputation management is expensive, and requires the active cooperation of the client.  When can reputation repair be a “do it yourself” operation?

“Reputation.com”  (founded by Michael Fertik) does have a white-paper-like page on this issue, warning users of likely mistakes (link).  True, the more positive or professional stuff you put out there, the better, up to a point.  Websites and blogs with your name (which can be a problem if someone with the same or a similar name has a bad reputation), Twitter, LinkedIn and Facebook accounts tend to show up first, as do resumes and things that sound “occupational” to most search engines.  But avoid techniques that search engines consider unethical or spam, such as hiding text, link farming (probably the worst offense) or “website cloaking”.  Avoid too much literal copying (“cut and paste”) of text from other sites – this is not only a copyright risk (and visitors know I have covered the copyright troll issue – “Righthaven” – thoroughly), but it may cause your web work to look spammy.  Maybe your English teacher was doing you a favor by using turnitin.com to look for plagiarism.
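
On the “literal copying” point: search engines are believed to detect near-duplicate pages with techniques along the lines of word “shingling”. Here is a minimal illustration (not any search engine’s actual algorithm) of Jaccard similarity over five-word shingles; the file names are hypothetical.

```python
# Illustrative only: near-duplicate detection via word shingles.
def shingles(text, k=5):
    """Return the set of overlapping k-word windows in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b, k=5):
    """Jaccard similarity of two texts' shingle sets, between 0 and 1."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# A page that pastes long passages verbatim scores close to 1.0
# against its source; fully original prose scores near 0.0.
print(jaccard(open("my_page.txt").read(), open("other_page.txt").read()))
```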

An "inconvenient truth" is that many people have to scrap and compete for a living (and to provide for real-world families) and will sometimes do or say things that others consider "over the top".  And the digital world have very long memories indeed.  


Sunday, April 22, 2012

Washington Post "brainstorming column" suggests reimposing conscription


The Washington Post has an interesting “Outlook’s 4th Annual Spring Cleaning” page in the Outlook Section, p. B4, link here.  

I’ll take up a couple of the items.  One is criticism of the all-volunteer military by Thomas E. Ricks.  The commentator writes that the all-volunteer military is successful “militarily”, but not politically and ethically. He says we were reckless when we invaded Iraq and would not have done so with a draft.

Conscription raises the idea that individuals must share the risks and perils that it takes to provide them a stable and prosperous culture.  That means individuals must be open to sacrifice for the common good.  This, as I’ve noted recently, was a paramount part of “moral thinking”.   The Supreme Court has even ruled that the male-only draft is constitutional. As a “moral precept”, it began to unravel in the 1960s with the Vietnam-era conscription, of which I was a part.

The draft didn’t keep us out of Vietnam, but maybe the convenience of student deferments did.  When the debate over gays in the military erupted a quarter-century later, the irony of the whole paradigm was exposed, although by the 1990s a lot of people had forgotten we used to have a draft.

But after 9/11, Charles Moskos, one of the originators of "don't ask don't tell", argued vigorously for returning to conscription and suggested that (back in 2001) DADT could be easily repealed if the (both-sex) military draft were re-imposed. 

There’s something else about the idea of this kind of “moral duty”.  Shared sacrifice is still not the same thing as shared purpose.

Another topic is school “Grades”, by Tulane professor Melissa Harris-Perry.  When I was growing up, grades were a kind of currency, a measure of self-worth.  Life was actually pretty rich and interesting even in the days before I was on my own as an adult with my own fiat money and bills.

The topic of Software Patents (Christina Mulligan and Timothy B. Lee) I’ll take up momentarily on the trademark blog.


Friday, April 20, 2012

No, Facebook doesn't think that the PC is dead


“The Daily Beast” column in the current Newsweek says “Instagram will take Facebook into the Mobile Age”, link here. No argument with that.  But I take exception to the subtitle “the pc is dead. Long live mobile”.
         
The article says that when Zuckerberg wrote his Facebook code in his dorm room in early 2004 (probably not while drunk as in the movie), “computing was about PC’s. The Internet was about websites”.  Oh, boy, that's how the Web 1.0 world evolved in the 1990s.  I miss it. 

Well, I think there’s really a limit to mobile.  You can’t always get reliable connections everywhere.  It’s kind of stupid to watch a movie like “Inception”  (or even “Social Network”) on a phone.  The iPad displays stuff in web-like manner (although blogging on it may be a challenge).  The Kindle and Nook (and iPad) are really book-like, because they’re supposed to be.  On Amtrak, I see plenty of people working on high-end Windows-based laptops (and they really are working, for companies, with “the numbers”).  It’s easier to look at most sites on a PC than on a smart phone (even, say, Major League Baseball, mlb.com).  But, still, almost 20% of my own traffic is now mobile (and that’s recent, within the last six months, with the avalanche of people getting smart phones as 3G contracts wind up). And on a disco floor, I usually check to see if the Nats won on a smartphone.  (I’ve even gotten a drag queen to read the scores one time as part of the show.)
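
For what it’s worth, a figure like “20% mobile” usually comes from classifying the User-Agent strings in a web server’s access log. A rough sketch follows (the log path is hypothetical, the keyword list deliberately crude, and a real analytics package does this far better):

```python
import re

# Crude substring hints for mobile browsers; real classifiers are smarter.
MOBILE_HINTS = ("iPhone", "iPad", "Android", "BlackBerry",
                "Windows Phone", "Opera Mini")

# In NCSA "combined" log format the user agent is the final quoted field.
UA_RE = re.compile(r'"([^"]*)"\s*$')

total = mobile = 0
with open("access.log") as log:          # hypothetical log file path
    for line in log:
        m = UA_RE.search(line)
        if not m:
            continue
        total += 1
        if any(hint in m.group(1) for hint in MOBILE_HINTS):
            mobile += 1

if total:
    print(f"mobile share: {100.0 * mobile / total:.1f}% of {total} requests")
```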


Thursday, April 19, 2012

Can "I" get along with just a smartphone and iPad when playing on the road? "Misadventures in a perambulator"


Well, it looks like I need to get weaned from needing a conventional PC or laptop or even notebook.  Today, I stumbled through getting signed on to Google on the iPad, playing with the shift (arrow) key, keeping track of whether I’m in caps or not, looking for special characters, and so on.  I managed to get signed on and to verify, and then I tried to look at Blogger.  I couldn’t see the text of any blog posting in the editor.  I could only create the text for a new test posting.

From what I see out there, it looks like I need an application like “BloggerPlus” on the iPad, from ThinkTekCo.  It’s inexpensive ($2.99).  I haven’t used my credit card to buy anything on the iPad; it seems as though I’m really addicted to conventional keyboards; I wonder what will happen when I try.  The link for BloggerPlus at Apple is here.  I guess this is the “solution”. I’ll continue this effort later.  I'll look at this "9 essential apps" list soon, also.

I also tried to set up the Gmail application on my new Motorola Droid smart phone.  I could not get the password and Gmail account name to take, even though I used the pull-out keyboard, and I’m “sure” that I entered it right.  Do I have to tell Google that I have this phone somewhere else first?

I see 21-year-old “kids” doing complicated texts and games and probably blog posts in bars on phones only, and I don’t see how they can do it so fast and get it right.  (At least the whiz-kid “Nolan” on “Revenge” is more like 28, maybe.)

That’s the view of a senior who “grew up on mainframes”.

I’m in the midst of an effort to get my screenplays (particularly two of them) and novel into “professional shape”, while keeping all my sites running, and that’s turning out to be more of a challenge than writing my book in 1996-1997 when I was working in a “conventional” and “mainframe” salaried IT job.  It seems the market demands that I learn to do things new ways all the time, to do more with less hardware.  The idea is to be able to go to the remotest spots in the world, perhaps to “pay it forward”, and be able to stay connected and blog and write with little or no hardware.  Just a phone.

Or maybe the idea is to use telepathy.  I have a friend who has tweeted material closely related to what I was writing (on my scripts) before I had posted any hints as to what I was writing anywhere.  Maybe telepathy is the next breakthrough.  Remember the “biological Internet” in James Cameron’s “Avatar”?

By the way, I "gave in" and wrote this post from a "conventional" Windows 7 Professional laptop, with all the luxuries.  I've done it OK on the MacBook and on a little Toshiba Notebook when traveling.  But I didn't make it happen yet on the iPad or smartphone. 

Home field advantage means something.  Remember how kids on playgrounds yell "first up", not realizing that only the home team can have a walk-off win?  But we all need to be ready for "The Road".

Update, April 21:

I found a forum about blogging on the new iPad here.   I posted a question.  Inconclusive.  There's also a blog post on "Orange Crate Art", Jan. 28, 2012, here.  There have been problems at least in the new Blogger interface since January, even with the older iPad.



Update: April 23:


I've encountered some minor issues with the new Blogger not fully refreshing the computer screen, so that image content already being displayed is sometimes not overlaid.  If I close the session and reopen, the problem sometimes goes away.  But in one case, I had to page down farther to do an image insert to get away from a pre-existing YouTube display (and then cut and paste again).  It's possible that something in the way the new Blogger communicates with the display hardware (Nvidia on a PC, something else on an iPad or smart phone, operating system specific) causes this problem.

George Will goes to bat for "individual sovereignty"


I’ve written a lot on these blogs about “personal autonomy” and “individual sovereignty”, and today (April 19, 2012, an unfortunate anniversary), George F. Will has an important piece in The Washington Post on “The right to rule oneself”, p. A15, link here.  Online, the title is “The constitutional right to be left alone”.

Will writes that the Constitution does not confer rights but “secures” pre-existing rights, and that the “fundamental rights concern the liberty of individuals, not the prerogatives of collectivity”.  He also warns that in common conservative parlance “individuals’ self-governance of themselves is sacrificed to self-government understood merely as a prerogative of majorities.”  Even my own father used to say “the majority has some rights.”

The socially conservative part of the GOP, most of all with now “self-suspended” GOP presidential candidate Rick Santorum, is critical of “no fault freedom” and supportive of the idea of shared goals and common good. Even with an emphasis on individualism, there can be a proper emphasis on earning what one has without too much dependence on the unseen sacrifices of others.

Social conservatives have repeatedly complained about "court made law" when it ratifies individual self-ownership.  Such claims suggest that such persons believe that a command of the loyalty and perhaps subservience from others is an important prerequisite for their "doing what they have to do" for the "common good". 

Yesterday, on my GLBT blog, I wrote a post re-echoing the majority opinion in the 2003 case “Lawrence v. Texas”.   Under due process, an individual fundamental right, if not conferred down to the level of a specific “act” even under the parameters of consent and privacy, still exists whenever there is any meaningful psychological context at all relative to the world of the individual. Remember, the "right to be left alone" had been mentioned in impassioned dissent in Bowers v. Hardwick (1986), which Lawrence "largely" overturned.

Social conservatives, including the “natural family” crowd, may have a political bedfellow with the climate change movement, and also with the seemingly distant concern over care of elderly family members.  Both problems suggest that it’s important for the individual to have personal stakes in members of other generations besides his own. That sounds like sharing goals relating to collective good defined by the majority.  But these ideas can rise and take on moral importance competitive with usual notions of “personal responsibility”, sometimes, at least in areas of “choice”, contradicting them.  I grew up with a lot of these ideas in the Cold War 50s.  Today, the “socially conservative” Right has to deal with its own ideological internal contradiction, the idea that all government (even if applied to global problems like climate change) is bad.  The linking idea seems to be that government is less necessary when people learn to take care of each other locally, mostly in nuclear and extended families.  But that requires social structures that can undermine individual choices, on the theory that common challenges won't get successfully met unless "everybody has to share the burden".

It’s interesting, to me at least, that the last two words of Will’s piece today are “individual sovereignty”.

  

Tuesday, April 17, 2012

CISPA broadly criticized, but it's less than a "SOPA 2"


Electronic Frontier Foundation has a white paper today outlining the basics about CISPA (HR 3523, the “Cyber Intelligence Sharing and Protection Act”), link here.
  
CISPA allows private companies to turn over almost any “suspicious” email or other activity to the government.  For example, an ordinary user’s use of Tor might be deemed “suspicious”.
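
The reason “using Tor” is so easy to flag mechanically is that Tor exit-node addresses are published. A hypothetical sketch of the kind of screening a provider could do (the exit-list URL is an assumption; the Tor Project has served such lists from different endpoints over the years, and the client address below is just an example):

```python
import urllib.request

# Assumed URL; the Tor Project has published exit lists at varying endpoints.
EXIT_LIST_URL = "https://check.torproject.org/torbulkexitlist"

def tor_exit_ips():
    """Fetch the published set of Tor exit-node IP addresses."""
    with urllib.request.urlopen(EXIT_LIST_URL) as resp:
        lines = resp.read().decode().splitlines()
    return {ln.strip() for ln in lines if ln.strip() and not ln.startswith("#")}

exits = tor_exit_ips()
client_ip = "203.0.113.7"                # example address from TEST-NET-3
if client_ip in exits:
    print(client_ip, "arrived via a Tor exit node")  # the mechanical "suspicious" flag
else:
    print(client_ip, "is not a known Tor exit")
```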

Also the federal government could use this information for other law enforcement purposes, although House authors apparently removed language that seemed to allow going after copyright infringers (making it a “SOPA 2”).   Unlike the case with SOPA, however, many Silicon Valley companies support CISPA because they believe it will make their networks safer.

The Cato Institute has a short paper by Jim Harper, “Cybersecurity: Talking Points v. Substance”, link here.

A couple of other related points come to mind. The government has recently upped the talk about the dangers of major cyberattacks against infrastructure, like power grids, although one wonders why such infrastructure should be so easily accessible through the public Internet (see my Internet Safety blog Monday).

The other point concerns publishers (like me) rather than service providers.  That is to say, people in my position do sometimes learn about serious anomalies that could fall into the “see something, say something” category.   Despite all the talk of blogger immunity (whether it should follow mainstream journalists’, which itself is sometimes dubious), I have on a few occasions believed I needed to call authorities with stuff passed to me.  And I did so.  And the administration is likely to say that this is all it wants.

On another topic, I went to a Nats game tonight, and on the way, saw a poster defending Bradley Manning in the Metro (first picture, above), defining "whistleblower".  And that does not mean "snitch". 


The Nats won, 1-0, without hitting the ball at all.  A ("another") winner of the Cherry Blossom pageant threw out the first ball, but she was not the same performer as the saxophonist at Town-DC last weekend.

Above, a quick clip of Park Service Police marching before the game.  Official MLB video of the game (no embeds) is here.   

Sunday, April 15, 2012

Webroot presents survey of social networking site use and "privacy" concerns about Facebook Timeline


Webroot has a story with illustrations on the way people have been using Facebook’s Timeline.  The report also does comparative surveys by gender and age group of Facebook, Myspace, Twitter, YouTube, and Google+.  Webroot tweeted the report link Saturday. 

The report shows considerable concern in some quarters about Facebook’s Timeline, as likely to reveal compromising material that would not otherwise be noticed.  At least 39% restricted access to Facebook by some people after Timeline was launched (about 2% deactivated Facebook).  37% were concerned that employers would see inappropriate material.

Some people do all their micro-blogging on Twitter and let it all copy into Facebook, which then will still sort it by Timeline.

Timeline may indeed present time-critical news to people, based on its artificial intelligence. For example, someone who lives in a tornado watch area may see posts from other friends (or other “like’s”) about the watch sorted near the top.

People who use Facebook and Twitter to “socialize” and who discuss very specific personal or employment situations are much more likely to feel affected by Timeline.

I still find the idea that one should publish differently to different audience “circles” somewhat controversial, as materials are often forwarded.  I still publish only what can reasonably be seen in public mode, and generally avoid specifically personal materials about others.

The link for the Webroot report is here.

Notice also that Maryland has passed a law forbidding employers from asking for social networking passwords from applicants or employees (or contractors).  

Saturday, April 14, 2012

W3C holds meetings on "do not track"; deep divisions remain over issue


Rainey Reitman has an important new discussion of “Do Not Track”, April 5, on Electronic Frontier Foundation’s website, link here.  There was a meeting of the W3C (World Wide Web Consortium) Tracking Protection Working Group in Washington DC this week, April 10-13 (link).  Reitman compares the W3C effort with the apparently closed doors of the Digital Advertising Alliance (link).  Reitman is also critical of the mis-focus of Yahoo!’s own “do not track” initiative, which seems focused on gathering user interests rather than on stopping “rampant” data collection (Marketwatch story here).
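
Technically, the “Do Not Track” signal being argued over is tiny: the browser adds a one-line HTTP request header, “DNT: 1”, and everything else is policy, since honoring it is voluntary. A minimal sketch of a server reading the header (standard library only; the cookie name and value are purely illustrative):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # "DNT: 1" means the visitor asked not to be tracked.  Whether a
        # site honors it is voluntary -- that is the whole policy fight.
        self.send_response(200)
        if self.headers.get("DNT") == "1":
            body = b"DNT honored: no tracking cookie set\n"
        else:
            body = b"No DNT signal: tracking cookie set\n"
            self.send_header("Set-Cookie", "track_id=abc123")  # illustrative only
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("localhost", 8000), Handler).serve_forever()
```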

Another player is the Interactive Advertising Bureau. I found a paper, back from June 2008, urging “small publishers unite!” (link), claiming that regulatory political interests (those advocating strong “do not track” policies) could destroy the small publishing and self-broadcast business model that large ISPs and service providers have built “for us”.

My own take on this is mixed.  I have self-published on the web since 1996, when I got started first by tracking the “gays in the military” issue, expanding concentrically.  I used the web to support the material in my books, and most visitors and customers got to know me through this “passive shelter” strategy, finding me on search engines.  I didn’t use ads at all until maybe 1999, when I started using Link-Share.  I started out with a small web hosting provider, whom I had met at work, and had very stable, dependable service with him until he went out of business at the end of 2001, when I switched to a large, corporate provider (Verio).  In 2006, I added blogging.

Even though many bloggers or webmasters earn little or nothing from web advertising, or at least don’t have to depend on it, the industry as a whole does, and could not exist without visitors willing to engage in commerce in some way.  The same concept has applied to broadcast TV since the 1950s, and probably even radio before; but with broadcast there was no way to “target” individual consumers; there were only aggregate measures like Nielsen ratings (Nielsen, by the way, offers similar survey services for web browsing).   Our whole society depends somewhat on receptivity to advertisers as part of its “social capital”, and that is getting weaker as privacy concerns grow.  (Just yesterday, I had to put my home landline number on “do not call” because of a continual, time-consuming parade of solicitation calls; there are just too many of them.)

Is tracking really necessary for effective online marketing?  I can say that in my own experience (and I have not enabled any “do not track” capabilities on any browser), most of the ads I see on most sites seem correlated to where I am geographically (particularly recently when I was in Dallas) and to my general search and surfing habits.  It seems that everybody is doing it (that is, OBA or “online behavioral advertising”).  I’ve particularly noticed this on my new smartphone. Generally, on my own “home business” computer,  security software has been allowing it, and Webroot seems to have backed off from quarantining tracking cookies.  But because I’m the only user of my computers, it doesn’t cause any problems.  In a family or workplace, with many different users on one computer there could be real issues. 

Interestingly, a Pew Research Center report from Feb. 2012 found the public skeptical about the need to regulate tracking (link here; look for “Internet privacy”).  More highly educated (and probably single) consumers tended to understand that advertising is important for the services they get, particularly from Facebook and other social networking and publishing services.

Other studies, in the past, have claimed that untargeted ads can be almost as effective as those developed by OBA.  Again, in the “good old” Web 1.0 days, I personally found that in general “passive” strategies for selling oneself and selling one’s own books were pretty effective, but the world really started to change in this regard around 2005 or so; “extreme capitalism” has increased, and so have the pressures from it.

The other big threat on the “small publishing” horizon, though, is the whole downstream liability issue.  It’s true that we’ve deflected SOPA and PIPA for a while (although there are already replacements for these in the works). But there is increasing “grass roots” pressure to gut Section 230 protections in the libel area, partly because of the cyberbullying problems (I’m about to see a movie on this problem today), but also because of major abuses by some operators, such as the recent flap about the “STD Carriers” site (here, March 12, as discussed on the Anderson show).

The biggest long-range question is whether “free services” will always be profitable enough to keep operating the way they do today.  The business environment today seems to depend on large service providers with an intricate use of online marketing technology, in stark contrast to the earlier "dot-com" boom and then bubble days.  Given calls for regulation, to protect privacy and sometimes protect online reputations, they might not.  The world could become a “safer” but less innovative place.

Thursday, April 12, 2012

How are we taught about Right and Wrong? Family perspective means everything. It's about social capital


So, it’s time again for a sonata on “right and wrong”, or where our moral notions come from.  In my own experience, moral priorities seem to come from circumstance, and seem to depend on what’s visible before the horizon.  And when we're growing up, our parents usually "see" the "social capital" issue for us before "personal responsibility" looks meaningful. 

I can see I’ve worked variations on this issue, like a pitcher throwing sliders around home plate, many times before. But I feel pumped up about it again, because in recent years I have come under a lot of pressure to adopt group goals chosen (for me) by others.  And I must resist.

I also have come across the “moral” questions about homosexuality, as some other people see them. They overlap the issues regarding personal autonomy, but they aren’t the same.  I’ll come back to that.

In modern “western” (and “classical liberal”) culture, moral values depend largely on personal responsibility to own up to choices that are made, and the “right” to do something depends on the consent of other (competent adult) people involved.  The “common denominator” for calculating “right” is (fiat) money.

 Some particular activities are understood to have major potential consequences, such as “risking” having a (possibly unwanted) baby.

But that’s not the focus when one is growing up.  One must prove oneself competent in a number of areas, which often relates to academic performance in school. One must also learn good work habits, and to deal with a certain measure of self-discipline and regimentation.  Up to a point, all of this comports well with individualism, because later one will need to take care of oneself and then to get and hold down a job.  But, at least in my own experience, I came to sense that a lot of this  had to do with being able to contribute to “social capital’ and to take care of other people.  And the context of this expectation, even giving as much credit as possible to the good intentions of social conservatism, gets complicated.

The “truism” that we hear so often these days is that it is imperative to accept the goals of something larger than the self.  What that is gets complicated. Sometimes it’s “family” (and future lineage). Sometimes it’s community. Sometimes it’s religious faith, and sometimes nation.

It’s the “motives” that are served that make this interesting. The purposes that we associate with “common good” (a favorite catch-phrase of Rick Santorum) often have to do with long-term “threats” to the survival of a family and larger concentric social bodies. When I was growing up, the main issue seemed to be war (as well as the fact that the economy seemed to depend on getting many men to do dangerous jobs, like firefighting or coal mining).  In earlier times, it might have been surviving on the Great Plains as a pioneer or homesteader. (How I remember “social studies” dealing with “The Pioneers” and “The Colonists” in fifth grade.) These challenges tended, in the past, to require that people perform differentially and with “complementarity” according to their inborn biological genders.  Boys were expected to take risks fighting to protect women and children, and women took risks in bearing children, even though by the time of my boyhood (the 1950s) that peril had become largely insignificant.  For a boy to fail to do his part was evidence of “cowardice”, one of the most “mortal” of sins in my day.  (Remember “women and children first” in “Titanic”?) It made him seem like a “mooch” (like the infamous Ayn Rand character), freeloader or parasite.  All of this played out in a world supported by a male-only military draft, with deferments at one time extended to married fathers and later only to “good students”.

But the “wartime” mentality (which would carry into the Vietnam war, and which would come back after 9/11, with Iraq and Afghanistan, and with some calls to resume conscription) tended to comport with a world that saw things in terms of “us” and “them”.  You had family and country, but you also had enemies, at different levels.  If you were a man, or even a grownup at all, you learned that you might have to protect (or “immunize”) others in your family or group from the “not self” some day.  Gradually, “self” became equated to “family”, even (and most especially) for those who did not beget their own kids. "Eusociality" became part of the moral landscape for everyone.  All of this lived against a backdrop of slowly (perhaps stubbornly) declining racism – from the slavery of two centuries before, to segregation, and the changes brought about with the Civil Rights movement, not even complete today. Family life meant “you take care of your own”.  Conservative political thought stressed that, if government (and taxes) could be made smaller, people had to take care of each other in families and sometimes through other forms of social capital.

Of course, in Christian churches you learned something else.  All persons were potentially brothers and sisters.  (Music taught us that, too: look at the words to Beethoven’s famous Finale.) You had a responsibility to reach out to people not in your cohort or herd.  So moral teaching, for young people, became an exercise in learning how to draw a line, that you can’t define but “know it when you see it”, between self, family, and other groups in the rest of the world.

Caretaking also gave the family (especially) another major social purpose: giving everyone value, including, as Santorum often specifically says, “the least of us.”  (This plays at least indirectly into the debates on abortion and even contraception: to have children at all, parents have to risk having kids born with major disabilities.  But it also points to the unwillingness of many people to “care” about someone who doesn’t “turn on”.)  I got my share of this in recent years with eldercare for my mother, which lasted many more years than I expected.  I had to deal with the “moral” aspects of hiring help or possible institutional placement instead of carrying out the "physical" and emotional aspects of filial responsibility myself. Today, medicine can do a lot more to extend life than it could when I was growing up, but the do-ability of many of these life saving or enhancing therapies may well depend on the support of an entire family (including those without kids).  In earlier times, people couldn’t live as long, and there was less public attention to raising support to fight diseases ranging from breast cancer to Alzheimer’s.  Still, the unmarried or childless (usually women) were expected to stay home and take care of their parents, whose years (of helplessness) were probably not as long as they can be today. (Today we have a derogatory term for such a person, “family slave”.) Families tended to be bigger then, and older children were more likely to have experience learning to babysit or care for younger siblings – responsibility created not by their own but by their parents’ “choices”.  I was an only child and missed out on this, although at one point I recall that my parents considered adopting a sister for me when I was about nine.

My life as a child and a teen was eventful and I have great, detailed memories of this time in my life.  It was troubled because I fell behind in the “manly” pursuits.  I was teased and sometimes bullied, but not as badly as many of today’s kids (as in a particular upcoming film).  I had the impression that others regarded me as a potential “drag” should some calamity happen to us all during those Cold War days. I got the impression that family was supposed to take care of the “weak”, but persons in such a position (yes, I had to face being regarded as “the least of us”) had to accept our dependence on the authority of others.  So my world was not free the way my adult world was (when I worked dependably for decades and had my own resources) but it was still eventful and interesting. But it was a bit authoritarian. It’s obvious that this societal and political arrangement can be abused by “those in power” and history proves that it often is.   People in earlier generations could not envision that values really could change some day. Hyper-individualism is indeed an effective counter to political corruption, even if individualism doesn't always "protect" everyone or get big collective goals met. 

I also developed a motive of “upward affiliation” (to borrow a term from George Gilder, perhaps comporting with the idea of a “subjective feminine” personality in Rosenfels’s polarity theory, honoring Goethe’s idea of “The Eternal Feminine”).  I tended to regard people as having a calculated “value” which could lead to their being ranked sequentially, in a mathematically well-ordered fashion. Those who were “weaker” had to live, but relating to them would not seem intrinsically rewarding. So my own mind developed a feedback loop that seemed to express authoritarian, hierarchical values.  I saw people in strictly moral terms (as I thought I had been viewed) and I believed I was supporting (physical) "virtue", and this ought to be a good thing.

I recall those grade report cards that reported “progress of the pupil as an individual” and as “a member of the group”.  I could excel (in a labored way) at academics and particularly with my music (piano, and a good ear), but others were forcing me to support their ends.  I built, in my own mind, a lot of fantasy life about what made other people desired partners.  And in the 50s and early 60s, the psychological world equated early socialization (with peers and family) with success in starting and keeping marriages in adult life. 

I’m aware of the whole line of thought espoused by Rick Warren, that “it’s not about you” (his “Purpose-Driven Life” philosophy), and I understand that generally people have held a position that loyalty to your own immediate family and community group is a primal moral value, to the extent that you’re ready to sacrifice your own purposes for the greater good if called upon to do so. What if that “common purpose” designed by the group turns out to be wrong?  According to Warren, “that’s not your responsibility”.  But in the Internet age, where an amateur can change the whole world, that no longer sounds like a reasonable statement.  But perhaps one can stipulate, “you shouldn’t put yourself out there, for the whole world to see, until you have people you’re responsible for.”  That is, you have “your own family.” I’ve called this, “the privilege of being listened to.”

That is somewhat how people reacted to me.  Once I had made myself public with my own 15-year battle about “gays in the military”, others constantly approached me to join their causes, and took “neutrality” as not just behavioral aloofness, but as indifference, lack of compassion, even hostility. I was seen as lacking real "emotion" for real people. 

The talk about “sustainability” that has grown since 2000 or so, with the concerns over energy and particularly climate change in the physical environment, and “demographic winter” in terms of low birth rates and long periods of disability at end-of-life (along with the debates over social security, Medicare [“entitlements”] and “retirement”), certainly applies to the concerns over declining social capital and weaker families.  A future society could indeed become one where people must live and interact much more “locally” than today. So, those who advocate the “natural family” and group-driven personal priorities can make a case that involvement in raising future generations (and in caring for past ones) is an inherent responsibility for everyone, not something that just starts with heterosexual intercourse resulting in a baby. "Generativity", and a real personal stake -- having "your own skin" invested in relationships with people in other generations -- becomes seen as a moral imperative. 

All of this material – on moral challenges to individual sovereignty and hyper-individualism – overlaps a discussion on sexual orientation.

Today, we often hear “gay rights” discussed in terms of “equality”, which is an abstract social and political concept (however desirable) that does not reflect reality at the “street” level or even within a family, where the applicable concept is “complementarity”. The same applies for gender-independent rights (women’s issues).   “Equality” appears in issues like gay marriage, gay parents, and gays in the military, with the recent repeal of “don’t ask don’t tell”.  A moralist would say that equal rights is a logical corollary of equal responsibility, including the capacity to share common responsibilities, such as fight in the military and raise families.  "Gay equality" also seemed to react to another reality: those who enjoyed marital intercourse were subsidized by those who did not or could not. 

In fact, the gay marriage debate does ponder committed relationships and escape from living in “fantasy” or upward affiliation.  What it denies to some people is ratification of male initiative in marriage, a concept which some people link as equivalent to healthy social capital in families.

I lived most of my adult life without “full equality” but, after the early 1970s, able for the most part to live my “private life” the way I wanted -- although that took a great deal of focus just on my own needs (tuning out other people) just to handle the logistics of "coming out" twice!  That capability (for privacy and independence) was most challenged in the 1980s with the right wing’s political response to AIDS (as a chained “public health threat”) which has subsided with medical sanity.  But in times earlier than that, as with my William and Mary expulsion in 1961 and my “reparative therapy” at NIH in 1962, it was an issue on its own.

Why did society once have a prohibitionist attitude (to use Andrew Sullivan’s term) toward homosexuality?  Why did this “happen to me”?   After all, I was not a threat to create an unwanted pregnancy or compete for someone’s wife.  Why was my issue seen as an even bigger threat?

Again, we pay attention to “problems” that we can actually see (in a particular person).  I was an only child. For me to take myself out of any chance of continuing my parents’ lineage could be perceived as a profound insult to the family and to their marital relationship, an idea I could not have grasped when I was much younger. The feedback from my WM roommate in 1961 and then the course of “therapy” at NIH in 1962 gives some good clues.  My roommate feared impotence merely after being around me.  Such an admission sounds self-deprecating today, but people play “victim” all the time.  At NIH, the staff (despite insistence that the problem was not “just” sexual orientation) became preoccupied with the existential nature of my fantasy and ruminations, and what would happen “collectively” if I induced others to believe them.  They were very concerned about what I "noticed", what turned me on and what didn’t, and why.  They seemed to believe that, perhaps as a response to earlier physical shame, my "intellect" had shut down any procreative instinct.  I know this from reading all my patient records, which I secured in 1996 under the FOIA when I was writing my first book.  But in the 1970s, Paul Rosenfels, at the Ninth Street Center in NYC, would characterize this tendency for reenactment with titillation as the “sadistic” aspect of psychological defenses in his book “Homosexuality: The Psychology of the Creative Process” (book review blog, April 12, 2006).

I want to note that I did serve in the Army (1968-1970), after graduate school, without incident. I was in the company of relatively well educated men so talk of homosexuality tended to amount to comedy  (Tiny Tim’s butterflies) and light entertainment, and wasn’t perceived as threatening.  (Randy Shilts had made that point in his book “Conduct Unbecoming” in the early days of debating gays in the military).  In fact, the military in my case wasn’t as hostile to homosexuality as a lot of the civilian world.  But we weren’t in combat or in a situation where we shared much physical risk or sacrifice together.  (Back in the 1970s, some of the arguments advanced against gays in the military in 1993 with Clinton's proposals had been made in other civilian areas, like firehouses.) 

It’s very striking to me: I was “punished” for refusing to “join” “their” world, and trying to remain an expressive oddball, kibitzing from a distance.  I avoided their entire world of “soap opera” passions for my own, based somewhat on imagination.  In fact, I’ve never experienced jealousy or had someone jealous over me.  Have I missed something? Does that mean I haven’t “lived”?

During my adult life, I did function pretty well “living in my own world” as a computer programmer.  I would know what it is like to be depended on to be perfect in work, but the whole world of families and children would appear to belong on another planet.  I lived in urban exile. Trips home were rather like “greater than the speed of light” space travel.  They had their world and we had ours.  All of that would start to change in the 1990s, after HIV calmed down, with the debates over the Equality issues (ENDA, DADT, marriage, and parenting).  As I faced eldercare, and then a “forced buyout” effectively ending my stable I.T. career, I would have to return to the “real world”.  I would understand how many people face regimentation and forced hucksterism in the workplace. I would find myself, after some coercion, in interpersonal situations where caring and compassion were demanded from me (even in the substitute teaching environment) in a way that had never happened before.  This was a taste of what my (late) mother called “real life”, which requires the capacity to live in close quarters with other people and full permanent intimacy with one.   I found it very difficult to enter the intimate spaces of others' lives (even when asked to and "needed" to make things "all right") when I hadn't built my own first.  And that sort of thinking sets a real trap -- "I can do that only if everyone else has to."

I was also struck by how “moralism” had taken over the political debates early in my working life.  The Far Left, in the late 60s and early 70s, was demanding “equality” at the gross level of “The People” – the elimination of high salaries and inherited wealth, and even, in some talk, of Maoist-like cultural revolutions where everyone took turns being a peasant. Social conservatives simply brought all this idea of “paying your dues” back into the family. Today’s debates over higher taxes on the rich seem like a shell of how these arguments were fought in the past. 

Recently, I saw the film “Undefeated” (movies blog, April 9) about an inner-city football team.  The point was made that most of the boys didn’t have fathers at home, and that the boys tended to feel that this was a sign of their worthlessness, that their dads had rejected them as not good enough.  I felt I was “physically unworthy” of becoming a father because I hadn’t “competed” well enough in masculine pursuits as a boy. (The right wing sometimes says, “Masculinity is an achievement,” and my own head admits that is true.) I developed upward affiliation, which in a paradoxical way ratified right wing values. I traveled full circle. So I must ask myself, am I partly responsible for the fact that some of these boys don’t have fathers?  The right wing (in Santorum-talk) seems to point to the “no fault freedom” of someone like me who went down his own path, paying his bills (according to libertarian values) but not paying his dues.

I recall a line in the WB show “Seventh Heaven” where Pastor Camden says, “sex is just for married people.”  I suppose that in a “perfect moral world” imagined by social conservatives (and by the Vatican, in particular), no one would experience passion or sexuality until after marriage, when it is suddenly discovered and ignited by consummation on the wedding night, after all the rituals of doting and courtship.  That might prevent unwanted pregnancies from the majority, but for a minority (like me) it would send a message that family responsibility is expected of everyone.  If so, it all sounds like a canard, a mere artifact of deductive logic. Perhaps a perfect world like that would have high social capital and little inequality because there would be little opportunity to take advantage of it, and maybe it would be stable and crime-free.  (In fact, one of the disadvantages of a hyper-individualistic world is that losses tend to be absolute, even if caused by someone else’s wrongdoing; you can’t make everything whole by money.) On the other hand, it might be a static world with no innovation, no art, no inspiration.  The Neanderthals survived 70,000 years but finally failed because they could not create much on their own.

So what is the answer for my riddle?  I don’t need the Sermon on the Mount.  At an intellectual level, it’s unsolvable, rather like a polynomial with only imaginary roots (it’s like “x**2 + 1” rather than “x**2 – 1”), or maybe like Godel's "incompleteness".  What seems important is that everyone needs to contribute to social capital to give his own voice real meaning.

As I finish a seventh decade, I'm indeed struck by my attitude that "good" pre-exists and is to be found, rather than nurtured by emotional commitment (to the immature or "far from perfect").  Indeed, a lot of the debate today about "family values" apparently relates to finding and maintaining passion for others when it is needed rather than when it satisfies fantasy.   I'm also impressed by how I never saw a need for progeny as anything that could generate passion.  (I can recall a particularly testing moment decades ago when a female date's father teased me by offering a sandwich.)  I always felt that the "buck stops with me", with what I produce myself, with my own content. So I must now finish it.   Yes, it was "about me" and I became and remained aloof. Yet I know that many, perhaps most, people don't have the luxury of losing their place in line.  

Update: July 4, 2012

See International Issues blog (July 1) for a discussion of "fairness" around the world. Yes, conservatives think that "mandatory socialization" somehow addresses these questions -- everyone has to be responsible for someone else to get somewhere.  But gross inequality has always existed, even in earlier times when only a heavily locally socialized life was possible. Stay tuned. 

Wednesday, April 11, 2012

Can "amateur" bloggers deliver "integrity of information"?


Kathleen Parker has a perspective on p. A19 of today's Washington Post, "Tweets and crow" in print, "Whispering campaigns can take flight in new media" online, about the recent blog rumor about South Carolina governor Nikki Haley (Rep.).  The rumor was shot down by fact-checking from a USA Today reporter.

Parker writes, “Integrity of information is the one thing newspapers can promise readers that other, new media can’t deliver with the same consistency.”

That statement certainly has implications for bloggers (more so in public mode than in social media).  But then again, has the "established media" jumped the gun a bit on the Trayvon Martin story?

The link is here.

Monday, April 09, 2012

Law enforcement can get complete user records from Facebook, other services; more negative press on employer social media policies

According to a story today by Rosa Golijan on MSNBC, police can get complete archives and logs of someone's Facebook activity, apparently going back to the beginning of the account.  The story appeared with this link, and quotes Facebook's policy on cooperating with law enforcement.

I suspect that the same is true of other services, like Google+, Twitter, Myspace, Blogger, YouTube, and Wordpress, but the complexity of activity that can be searched may generally be less with these sites.

The "Phlog" of the Boston Phoenix, an alternative newspaper, has researched this capability in connection with the "Craigslist" incident in New England (the subject of a Lifetime movie about Philip Markoff in January 2011), link.  The "Phlog" has a redacted PDF of the server log sent to the Boston Police Department.

As I noted Sunday, online publishers have varying policies on trying to learn who visits their sites, a practice that has never been accepted in the print world.  But server logs can often provide "valuable" information about "who" looked for "what" on a site, down to the specific (perhaps troubling) search arguments that brought a visitor there.
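
To make that concrete, here is a minimal sketch (in Python, purely illustrative, not any publisher's actual tooling) of how someone on shared hosting might pull search phrases out of an access log.  It assumes the common Apache "combined" log format and the conventional "q=" search parameter in referrer URLs, which most search engines still exposed in 2012; neither assumption is guaranteed on any particular host.

```python
# Illustrative sketch: extract search phrases from a web server access log.
# Assumes the Apache "combined" log format and the conventional "q="
# search parameter in referrer URLs; both are assumptions, not facts
# about any specific host.

import re
from urllib.parse import urlparse, parse_qs

# combined format: IP ident user [date] "request" status bytes "referrer" "agent"
LINE_RE = re.compile(
    r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d+ \S+ "([^"]*)" "[^"]*"'
)

def search_terms(log_path):
    """Yield (visitor_ip, search_phrase) for hits referred by a search engine."""
    with open(log_path) as f:
        for line in f:
            m = LINE_RE.match(line)
            if not m:
                continue
            ip, referrer = m.groups()
            query = parse_qs(urlparse(referrer).query)
            for phrase in query.get("q", []):  # "q" is the usual parameter name
                yield ip, phrase

if __name__ == "__main__":
    for ip, phrase in search_terms("access.log"):
        print(ip, "arrived searching for:", phrase)
```

Of course, this identifies only an IP address, not a person, which is part of why the ethical question stays murky.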

Furthermore, Business Insurance (April 2, 2012) has a story by Judy Greenwald advising employers not to ask for social media passwords, and reporting an effort in Congress and some state legislatures to outlaw the practice, which is infrequent but increasing.  The same issue has another article by Sheena Harrison, p. 3, "Risks from social media, ACO's complicate medical liability", here.  Online, the title has a different spin: "Social media policies can reduce risk of reputational damage".

Picture: Union Square, demonstration, NYC, March 24. 

Are "microsurveys" a viable alternative to "paywalls" for smaller online newspapers?


There is some controversy in the digital publishing world now about a new service, Google Consumer Surveys, as an alternative to a "paywall", especially for smaller newspapers.  The Online Publishers Association (link) sent out an email about this today.  A number of papers, like the New York Daily News and the Texas Tribune, use it.  A visitor sees only part of a selected article until answering a "microsurvey" question, which helps advertisers place material on the newspaper's site effectively.  "Adweek" has an article (link) on the service, here, and says that the concept could save digital publishing.  Some businessmen, such as HDNet's Mark Cuban (who appears on "Shark Tank"), don't like the idea.  (Cuban's own blog, link, makes amusing, sometimes inspiring reading: "Don't follow your passion, follow your effort.")

The microsurvey idea sounds important particularly in view of the "do not track" debate.  But some visitors may distrust surveys and be annoyed by them, since some "social surveys" have been associated with malware served from misspelled versions of common domain names.

Google has offered a paywall service itself, called "Google One Pass", discussed here.

Another paywall service popular with some papers (maybe 200 or so) is "Press+" (or Press Plus), link here.

The Wall Street Journal has a new article about paywalls from the AP, April 3, here.

The simple fact is that, in this day of increasing sensitivity about privacy (and "do not track" options), publishing services need to maintain a business model that can bring in revenue and keep people employed, with benefits.  It's "simple capitalism", but not "extreme capitalism".



Sunday, April 08, 2012

We've had online social networking for a long time; "public mode" does imply certain ideas in ethics


We had social networking sites of sorts back in the 1990s: our listservs and discussion forums.  Most postings were read mainly by the people on a specified list, but there was no technical way to prevent others from eventually seeing a supposedly confidential post.

AOL ran many effective forums in the 1990s, including a large one on "don't ask don't tell".  There was a site called the Independent Gay Forum.  In the entertainment area, Project Greenlight, the screenwriting contest run by Miramax Pictures, ran some of the most effective forums I had seen, around 2002.

As far back as the 1980s, however, people had forums of some effectiveness, through Usenet (and, by the early 1990s, Gopher).

What Facebook and Google+ have offered (not so much Myspace) is the possibility to refine the lists of people who will receive certain content.  That’s really not so important to me, personally.  If I post something online, it’s usually public.  (My Facebook, Twitter, YouTube videos, and blogs are all public.)  I don’t post comments about individual non-public people.  I don’t conduct “relationships” online as such if they’re not in the real world.  I don’t let anyone make a big deal of friending, unfriending, or counts of friends or followers.

One way, even before 2000, that people restricted content was the "virtual office".  The Libertarian Party of Minnesota experimented with one, where content was distributed to logged-in members only.

When you author and publish a physical book (or its e-book equivalent), you generally don't have the right to know who has purchased it.  That's part of the ethical framework of publishing.  On the web, when you accept advertisers, you may sometimes be expected to monitor the nature of your visitors.  Sometimes, for security reasons (like DDoS), a web publisher might have to block certain visitors.  This sort of "opportunity" is possible with shared hosting, where server logs can be examined.  (I had to study my server logs in late 2005 after an incident when I was substitute teaching, as I have discussed here before.)  But generally, when I put something out in "public mode", I don't expect to be concerned about who reads it.  That still sounds like an important ethical idea.  (In libel law, however, "publication" is sometimes considered to have occurred when a message is transmitted to only one party who understands it.  That's one reason why "online reputation" gurus say one should not trust "privacy settings" too much.)
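
For what it's worth, the first step of that kind of log review (whether for a security block or just to see who is hitting a site) can be sketched in a few lines.  This is purely illustrative; the log layout and the request threshold below are my own assumptions, not any host's actual policy:

```python
# Illustrative sketch: count requests per client IP in an access log and
# flag unusually heavy visitors.  The threshold, and the assumption that
# the IP is the first field on each line (as in the common Apache
# formats), are assumptions for this example only.

from collections import Counter

def heavy_visitors(log_path, threshold=1000):
    """Return IPs whose request counts exceed `threshold`."""
    counts = Counter()
    with open(log_path) as f:
        for line in f:
            if not line.strip():
                continue
            ip = line.split(" ", 1)[0]  # first field in common Apache formats
            counts[ip] += 1
    return sorted(ip for ip, n in counts.items() if n > threshold)

if __name__ == "__main__":
    for ip in heavy_visitors("access.log"):
        print("candidate for review or blocking:", ip)
```

On shared hosting, any actual blocking would then be done through the host's own tools, not a script like this.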

The “passive” social networking that results from open publishing has, in fact, been very effective for me personally.  So were some of the forums that were popular in the late 1990s and early 2000s.  I don’t find that I need to fine-tune the concentric (or overlapping) circles of recipients or friends that receive content the way that others do. 

I do know that on a large college campus it's not hard to amass a thousand or so social networking friends.  (A politician has to, and probably so does a life insurance agent.)  But one can only "know" about fifty people "well" in the real world.