Sunday, January 31, 2010

Most BitTorrent use seems related to copyright infringement (Princeton study)


Jacqui Cheng has an interesting report on Ars Technica Jan. 29 about BitTorrent. Specifically, “BitTorrent census: About 99% of files copyright infringing,” link here.

Nearly half of the files studied by a Princeton student, Sauhard Sahi, were “non-pornographic” movies and TV shows. That’s a bit surprising, maybe, because most networks make most episodes of their primetime shows available for “free” Internet “anytime” viewing – you just have to sit through the commercials (just like you would on TV).

The article also points out that a record industry fear has not been borne out: P2P users can share DRM-free MP3 files more easily than ever, but these have become a small part of BitTorrent sharing, which often mislabels its files.

There’s another report on the Princeton study on the “Freedom to Tinker” website by Ed Felten, here.

Friday, January 29, 2010

Today's debate on "don't ask don't tell" and the 60s debate on draft deferments: an odd moral connection; why repealing DADT redeems me


While some momentum seems to be growing to repeal “don’t ask don’t tell” and allow gays to serve (somewhat, at least) openly in the US military, it’s well to review another moral debate from four decades ago: the Vietnam-era draft, with all its controversy about deferments.

In fact, from the late 1940s until the mid 1960s, married men were sometimes deferred, and sometimes only married men with children were deferred. In early 1963 the term “Kennedy fathers” was coined to characterize married men in genuine relationships to their children, who then had III-A deferments. The fatherhood and marriage deferments got wiped out by Johnson in 1965 as Vietnam escalated. However, student deferments remained, and changed the character of the debate. (Male) math and science majors with good grades in college and graduate school were the “safest” from conscription. The rationale was that we needed “brainpower” to defend ourselves in the Cold War. Suddenly, society had a new quasi-hero, the nerd. Deferments would finally be replaced by a lottery in 1969, and Nixon would end the draft in 1973. Congress and the president, however, have always had the authority to restore it (and there have been some minor changes, such as in 1980 when the Soviet Union had invaded Afghanistan). Selective service still operates today, and men in certain age ranges are still required to register.

In fact, right after 9/11, there was talk from some people, such as Democratic Senator Carl Levin (MI), New York Representative Rangel, and DADT author Charles Moskos, that we should restore the draft. At the same time, Moskos was starting to argue that DADT should be repealed, and in general proponents of conscription or even mandatory national service have favored ending “don’t ask don’t tell.”

It’s well to mention here that in 1981 the Supreme Court had validated the constitutional permissibility of the male-only draft, in Rostker v. Goldberg (text). In the modern world of individualism, it seems strange that the state could require men to offer themselves as potential sacrifice for the good of all, and that only men had to do this. But remember, not only was the “complementarity” of the sexes regarded with more “reverence” in earlier times, women had taken their own share of the common risk merely by undergoing childbirth. At one time, men outlived women.

Of course, some nations with conscription, namely Israel, require women to serve, and if a draft were ever reinstated in the U.S., it would probably incorporate women, too.

Once in the Army, men with college educations fared better and were less likely to wind up in combat, at least during the Vietnam years. Men who enlisted for three years or more were given some choice of MOS, and I was told by a recruiter in 1967 that “95%” of draftees got infantry. There was a slogan, “choice, not chance, in today’s Army.”

My own story becomes relevant and prescient. My expulsion from William and Mary in November 1961 for admitting homosexuality (covered here Nov. 28, 2006) was, according to the cultural and “moral” standards of that era, a cause for great shame, or perhaps an extension of shame from teenage years for not being “competitive” enough as a male. One part of my plan to redeem myself was military service. I “volunteered” for the draft physical in 1964, 1966, and 1967, and was rated 4-F, 1-Y, and finally 1-A. By 1966, the draft paperwork had stopped “asking” about “homosexual tendencies”, and by the mid 1960s there was, in effect, an unofficial “don’t ask don’t tell” policy. After finishing graduate school, I “volunteered” for the draft and then went in 2 weeks early as a two-year enlistee (at age 24). I had an RA number, “RA1937256” (I still remember it from all the BCT formations – it’s long term memory). I guess I fell into the lucky 5%. In the MOS session, the master sergeant in charge said, “hey, you missed a college grad”. I wound up with the MOS of “01E20” which was Mathematician. But I was a dud at the start of Basic, and after a few days in the infirmary with the flu, I got recycled through Special Training (Tent City at Fort Jackson). My PCPT went from 190 to 357 before I graduated from Basic in 14 weeks. (I even remember the PCPT: low crawl, horizontal ladder, run dodge and jump, and mile run).

What strikes me about all this is that our society, in fighting Communism in the Cold War, thought nothing of estimating the “value” of one young man’s life compared to someone else’s. Supposedly we had won World War II in order to put down this kind of thinking, that one person could be officially regarded as more “worthy” than another, and compel another’s “sacrifice.” In my case, the whole draft deferment system seemed almost anti-Darwinian. Someone less fit for service could let someone else put himself in the line of sacrifice. In more authoritarian ways of thinking, this could be seen as a moral outrage. In the "McCarthyism"-infected world of my coming of age (which still strikes me as having endured some lingering fascism), physical cowardice was seen as one of the worst moral deficits: if you didn't step up when called for the common good, someone else would take the hit in your place.

All of this background helps explain why I would view a repeal of “don’t ask don’t tell” as a personal vindication.

Thursday, January 28, 2010

Does it take money to distribute effective political speech?


George F. Will, in a column “Campaign finance: A ‘reform’ wisely rejected” on p. A25 of The Washington Post Thursday Jan. 28, adds more fuel to the debate about the role of money in influencing policy. His link is here.

He writes that “the court now held that dissemination of political speech requires money”, leading first to the conclusion that campaign contributions are a form of speech. But the developments of the past fifteen years or so (Internet speech, whether or not part of “social media”) have shown that policy debate can be affected by individuals with little money or competitive track record, raising questions about the “stakes” of speakers not taking as much risk or responsibility as others. The Court’s decision, as noted yesterday, would seem to protect the newbies too.

Will also notes that the overturned law would sometimes have prohibited political instructions from employers to associates. Again, “political loyalty” is an issue that I have encountered in my own career.

Will points out that voters should have access to knowledge of corporate and union contributions, and that such knowledge would help inform their votes.

Wednesday, January 27, 2010

Did the Supreme Court, while upholding corporate speech and campaign contributions, "exonerate" web self-publishing?


Remember how it was a big deal at one time to “get published”? To some extent, in some narrow areas, as in the academic world, it still is. But generally magazine articles and books were not published and released “into the wild” without a lot of supervision and due diligence. Vanity publishing (or subsidy publishing), before the 1990s at least, was very expensive. There used to be a company called “Exposition Press” and the owner, a Mr. Uhlan, wrote a book in the 1970s called “The Rogue of Publishers’ Row.”

Desktop publishing and more efficient press designs began to drive costs down in the 1990s, and self-publishers could indeed order modest runs at book manufacturers for low cost. But web publishing, as a way to be found and recognized, started to take hold around 1998 as search engines took off.

At first, self-publishing on the web tended to consist of flat sites, and sites with mostly simple HTML text and hard-coded hyperlinks tended to do well because they loaded faster, particularly for users with dialup connections. Despite the multiplicity of businesses offering search engine placement, in practice “amateur” content on static files often placed higher in search engine results than “professional” corporate content generated with sophisticated dynamic algorithms pulling content off databases.

Self-publishing could work socially because, if the content was original, interesting, and on pertinent topics, it would attract the “friends” that I wanted. This could take place in areas like screenwriting, political activism, gay rights, libertarianism, and so on.

Self-publishing, because of the almost “free entry” and elimination of most due diligence, could also pose unbounded and unpredictable downstream risks, even for those connected to the speaker (especially in connection with the workplace, sometimes the family, and sometimes with special problems like the military’s “don’t ask don’t tell”).

Social networking sites, however hesitantly, tried to turn the paradigm around. Facebook, particularly, was originally based on the idea that you should know your friends, and that there should exist some common connection, like a campus. Even as it became “global” and allowed the now controversial “everyone” concept, it still encourages the idea that some “published” material be circulated only among known “friends” – although it isn’t reasonable to say you “know” 1000 people that well (no double meaning here).

Making a personal website available to “everyone” and accessible to all search engines and robots is the logical equivalent of self-publishing a book or periodical. The author does not know who accesses his or her literature immediately.

But in practice, the risk of releasing inappropriate or damaging material in digital form, even to a known list, is still quite great. Digital records can be transferred to anyone and can never be “recalled.” Most of the time, defamation law will consider an item as “published” if it is shown to one other party who understands it. There are Victorian novels concerning defamation in handwritten letters.

The “self-publication” event, releasing material into the wild and to search engines, may be considered “gratuitous” in some circumstances (creating risk) if what the speaker has to “gain” is not clear.

Yet, there is still a good First Amendment-related question about the value of political speech for its own sake. If a large corporation or labor union can (according to the Supreme Court ruling last week) spend its own money on “published” materials that support a political candidate favorable to it, then as a corollary it seems to follow that an individual can stake out a political position on the open web with no other motive than to create a public stir and gain a sense of personal leverage with respect to a particular issue. There need be no other purpose, and the recently touted “implicit content” issue might go away.

There’s another angle to the Facebook model: that is, the “right to publish” (or "the privilege of being listened to") could depend on how “popular” you are. But that notion would not be good for public policy or healthy debate.

Monday, January 25, 2010

More writs filed to identify bloggers or webmasters who "skank" others "anonymously"


MSN is reporting more lawsuits against bloggers for online “slagging”, in this story from NineMsn, “Floodgates open for cyber-bullying writs”, link here.

All of this follows on the heels of a case involving a model who was able to get a court order disclosing the identity of an anonymous blogger who was defaming her, as discussed on this blog on Aug. 19, 2009.

Attorney John Dozier, Jr., who made a comment on a posting here Aug. 28 about Section 230 (with respect to his recent book), reports several other major similar cases, including at least two where people put up “fake websites” to cause people to appear to be defamed when found on search engines.

Saturday, January 23, 2010

A model railroad is one big thought experiment!


Today, I visited the WGH model railroad show at the Dulles Conference Center near the Dulles Airport (link for show).

The best exhibit was a complicated layout by National Capital Trackers in O gauge, with a curious feature. It seemed to comprise two layouts, appearing to join in parallel tracks in one stretch, but with no topological crossover. Using the language of Clive Barker’s 1991 novel “Imajica”, the two “dominions” remain “unreconciled.” There were lots of interesting things in the layouts, such as a steel mill and a carnival with a roller coaster. There were about ten other model railroads, including a Standard gauge layout, an N-gauge layout with a long coal train and a simulation of “mountaintop removal”, and an Acela model of Wilmington, DE (in the Blue Hen State). Most of them seemed to invoke worlds that existed more or less in the 1950s.

In a screenplay that I submitted to Project Greenlight in 2004 (“Baltimore Is Missing”), the protagonist wakes up after a bizarre “abduction” (of sorts) to find that he lives as a toy figure in a model railroad “dominion” owned by an old nemesis. (It would be especially horrible if the dominion were a model of Wilmington or, as in one scene of the screenplay, Grand Rapids, MI.) It does sound a bit like “Outer Limits” or “Twilight Zone.” He has a “wife”, and he is charged with the idea of rebuilding a new world by joining with people in other “dominions.” Now this could tie into Discover’s recent (Jan. 2010) treatise on multiverses: maybe when you tweak the physical constants in pairs, you get other universes compatible with life. For example, you could have a “weakless” universe (without the “weak force”) without elements heavier than iron – which might mean no nuclear weapons, and maybe world peace. It might be a “simpler” universe in some other ways, and smaller. Maybe Heaven lies in a “weakless” Universe. Maybe some of Clive Barker’s dominions have other physical constants, although some of the characters have learned to move back and forth between the “reconciled” dominions. (When will “Imajica” be a movie? It sounds like a great project for Lionsgate to me. And there is a great little train in the Third Dominion -- a great idea for a model railroad, as would be a railroad on a terraformed Mars.)

I got my own model train at the age of 3, in 1946, at Christmas. That is my first memory. I remember building little layouts with it, with one track to my grandparents, another to my aunt and uncle, before I started to realize that life was much more than blood relations – such a politically and socially controversial point today. We used to build toy cities in Ohio on the concrete sidewalk to the privy behind grandma’s house – recreation that my mother once thought was “baby play.” But now, model railroad layouts strike me as Thought Experiments: kingdoms, of limited horizons, where people can start over as characters in someone else’s world.

A few years ago Dulles had another model railroad show that simulated 60 miles of track to scale. Pennsylvania has a couple of large layouts, including the Choo Choo Barn in Strasburg, and Roadside America up on I-78.

As for the Dulles Exposition, it could have been better managed. The parking situation at the Dulles Expo Center is atrocious: I wound up in a dirt lot turned to mud by the melted snow from before Christmas, and some cars were getting stuck in it. The directions were wrong: you turn right when you reach Willard, not left. The whole facility needs much more management and infrastructure, and probably a for-pay commercial parking garage. I was almost struck by the overhang of an oversized truck trailer (huge momentum) passing me as I walked out. Narrow miss! Remember Oprah’s recent show on cell-phone-free and text-free driving.

Friday, January 22, 2010

New York Times to meter visitors and charge for content in 2011


The New York Times will start charging Internet visitors to view content on its website starting in January 2011, according to a report Jan. 20, 2010.

Visitors will have to register, and will have a certain free allotment per month, and then be charged on a metering system. Print subscribers (even only to the Sunday version) will be able to view the online content free.

It’s not known if other newspapers will follow suit. The Wall Street Journal requires a paid subscription for some online articles. So do a few others like Newsday, the Arkansas Democrat-Gazette, and the Albuquerque Journal. Financial articles reporting on specific companies (as linked from Yahoo!) and “professional journals” in medicine and other fields typically require paid subscriptions now.

This could make it more difficult for bloggers to gather information from multiple sources if many newspapers do this. Furthermore, visitors who clicked on links in blogs could sometimes find that their viewing of content counted toward a monthly limit or led to a request for a credit card payment. On my own blogs, I have sometimes warned visitors if I know that specific linked content requires a paid subscription.

Many newspapers charge for archived articles, but some have backed away from the practice in recent years.

The Los Angeles Times blog has a story about this here.

The basic news wire story is here. It was reported in print Thursday in the Washington Times.

The Times’s own story (“The Times to Charge for Frequent Access to Its Web Site”) by Richard Perez-Pena from Jan. 20 is here.

As to the use of a print subscription to pay for unlimited access, I do have one issue: it is a bit of a security problem for someone who lives alone, with newspapers likely to be visible at a residence when someone is not home. I wish the newspapers would consider that, and consider unlimited-use online subscriptions, too.

Thursday, January 21, 2010

RIAA wants ISPs to take on more downstream responsibility to prevent copyright infringement; FCC may agree: more than a net neutrality issue


The RIAA is still challenging the limitations on downstream liability in our legal system and claiming that ISPs should be required to limit the transfer of infringing downloads, including those through P2P networks. EFF has a link to a Computerworld story by Grant Gross, “RIAA tells FCC: ISP's need to be copyright cops,” link here.

ISPs even say that this is a “network neutrality” issue, but it is more a downstream liability exposure problem.

The FCC, in a Notice of Proposed Rulemaking from Oct. 22, 2009 (comment date Jan. 14, 2010), here, had proposed that broadband providers be allowed to engage in "reasonable network management," including preventing the transfer of unlawful content. The RIAA has also suggested that ISPs follow a practice prescribed by law in France: kicking out subscribers after three violations.

Tuesday, January 19, 2010

Visitors: I can't be responsible for how you feel about yourselves, but I do empathize with what you're saying; please understand "subjunction"!


Occasionally I receive comments about a few of my blog postings or other essays, indicating or hinting (sometimes with irony or sarcasm) that the reader has been made to feel worse about himself because of the posting.

In many postings, I review lines of thinking and personal value systems of people that I had to deal with in the past. I try to restate their postulates and the conclusions or logical consequences of their “moral” belief systems. Sometimes readers believe that merely reviewing or restating their beliefs gives these belief systems credibility and “helps the enemy”; some people will take a post "personally" even though the speaker did not know them when writing the post. I maintain that it is important to understand how other people think, and how this relates to how people thought a few decades ago.

In writing, we often state hypotheses or assertions, as if they could be tested or questioned. But many times readers do not understand that these are hypothetical thought systems, and take them as “fact”, relative to the world of the essay or blog posting’s author. Other (inflected) foreign languages (French, especially) are better at dealing with this problem because they have more explicit conjugations for the “subjunctive mood” than does (“analytic” and “simplified”) English, which places so much importance on overall context (as related to the "implicit content" problem). I’ve had a brutal lesson in this before (July 27, 2007 here), when readers did not understand that “fiction” by me was just that.

Given the nature of these postings, then, I don’t take personal responsibility for how a particular visitor “feels about himself.” There is no way that I could.

In a related area, however, I can empathize with the reader’s feelings. There are some “dark energy” areas in my own life where I sometimes believe others have not respected me and have felt free to suggest that I should allow my own life to be expropriated for the needs of others, that I should give up control over my own goals. This is quite disturbing to me and leads to “existential” questions about why I did not “perform better” at certain things or accept the idea of making conventional commitments earlier in life. On the other hand, I understand that when some people say these things they come from a more “collective” mindset that cares more about the group than the individual (that’s all too convenient for some people).

One of the biggest concerns underlying some of my more “challenging” columns is that family responsibility is not always just a consequence of personal “choice” (that is, bringing children into the world and/or saying “I do”). Some of it can apply to anyone, depending on circumstances in that person’s family of origin, beyond the control of their own choices. Informally, many people see this as an issue of trying to "get out of things." We ought to sit down and face this squarely in our policy debates. It's about "personal responsibility" and justice, but it's about that "something else" (community? family?) too. And yes, unelected filial responsibility can have a disparate impact on LGBT people.

How to monetize your presence on Twitter -- if you're popular enough


On ABC “Good Morning America” today (Tuesday Jan. 19), Tory Johnson (“Women for Hire”) covered how one can make some money from tweets. The service is called “Sponsored Tweets” (link) and is working now (after the show it became unreachable for some time).

The story appears on the second page of a report on working from home, “For You, Tweets can mean Cash”, link here.

But it seems as though you need to be pretty popular and famous on Twitter to make this work. You need at least 200 followers, so you need to be sociable. (It occurs to me that someone running multi-level marketing would have that many.) Remember, Ashton Kutcher (twitter page) has over one million followers. Remember also, he told Larry King Live that he doesn’t sleep much and gets up in the middle of the night to tweet.

I’ve always had trouble believing you can say a whole lot in a "microblog posting" of 140 characters – okay, it’s the concatenation or accumulation of them, with links, that says something. Right now, raising money for Haiti is giving celebrities a lot of material. Again, it's the social network that also attracts visitors and potential advertisers, not just self-published content.

I found another service called “Pay My Tweets” here. And it did tell me I wasn't "popular" enough!

A site called “Mashable” (remember “Facemash”?) has an article from last June, “Sponsored Tweets: The End of Twitter as We Know It?” link here.

Monday, January 18, 2010

Uganda reminds us of what anti-gay bias used to be like here; individualism should never be taken for granted


I remember, when reading Crane Brinton’s world history textbook (“A History of Civilization”) for mandatory European history in college, that the authors would always prefix each chapter with questions as to why people or states or groups behaved and developed in manners that befuddle a modern mind today.

Recently (Jan. 7) the Washington Post ran an editorial about a barbaric proposal in Uganda to track down gay people. I covered it on the LGBT blog. Yet, a half century ago (particularly in the 1950s and early 1960s) homosexuals in this country faced witch-hunts and firings, and, as in my experience in 1961, college expulsions. Justifications for these persecutions were usually stated in religious or sometimes “collectivist” terms (as they were in Uganda, according to the Post editorial). We seemed to have an area of moral thinking that was obsessed with enforcing conformity, as if this were a more important priority than preventing crimes that had actual concrete victims. (Remember the horrific pronouncements and rants of "psychologist" Paul Cameron and author Gene Antonio back in the 1980s?) Over the decades, things changed, as gay rights moved from the area of “privacy” to “equality”, as with the gay marriage debate (particularly in the United States). Today, the term “witch-hunt” usually refers to purges in the United States military associated with the “don’t ask don’t tell” policy. Because we have become a more open, individualistic, and technological society, we have changed our minds and perceptions of this issue. But it is still important to understand where these attitudes came from. Ultimately the issue is not homosexuality itself, but a weighing of the rights of the individual against the commonly perceived survival interests of the group.

The mindset that produces legislation like that proposed in Uganda sees the family unit as a basic unit of survival, and life itself as a “gift” (maybe not for many people). Therefore, anything that detracts from passing on one’s genes, or from partiality toward one’s own blood, becomes unthinkable; it seems synonymous with suicide. Back in 1961 I dealt with a similar mindset: for an only male child, announcing even “latent homosexuality” was seen as a “death sentence” for a family lineage.

But gradually any group of people has to deal with individual differences among its members, and it also has to deal with the “unfairness” that an unfettered tribal, patriarchal society inevitably produces. Most of the history we study deals with struggles between nationalities, religious groups, races, and the like, and has to do with the reallocation of resources to deal with perceived wrongs. But some of history deals with the development of laws as they relate to individuals, often acting on behalf of “families”, as they compete with or deal with one another. Gradually, religious or tribalistic and “knee-jerk” moral notions based on emotion come under the scrutiny of reason -- and in democratic societies, ultimately constitutional law.

To its credit, the “family values” system, or the “natural family” paradigm posed by Carlson and others, accomplishes two big things. First, it gets people (especially men) to care about others in ways that transcend competitive self-interest as they first learn it in youth, and (most important) to transcend introspective or reactive fantasy. (Perhaps the focus on "virtue" in some very conservative religions is a ruse to avoid unwanted intimacy!) Second, besides raising the next generation of children in a stable environment (“the obvious”), it takes care of “less competitive” people (especially the elderly) with less intervention from the state, and provides a local context to make all human life inherently valuable. This second factor, extended intergenerational responsibility, becomes all the more critical today as lifespans increase and families are smaller.

These “accomplishments” get reflected in our legal system. Much of what our legal system does with this is try to enforce monogamy: tie individual men, who might feel incentivized to spread their genes among as many nubile women as possible, to specific spouses with specific responsibilities for the children that they father. No one sees this as controversial. Men are supposed to enter traditional (heterosexual) marriage partially socialized, but be tamed by their spouses (the "women tame men" thing of George Gilder) into transcending their competitiveness and self-focus with new modes of emotion and intimacy.

But here is another area where the cultural and legal system becomes difficult. Men are more than reproductive or conjugal beings: we have individualized expressions which (especially now in the Internet age) we broadcast with pride. Some of us (like me) would rather stay in our own worlds and not even negotiate the competitive world of dating, marriage and babies. And my experience is that “we’re” a “problem” too for some people. The world sometimes demands that we develop the skills to provide for other people (especially blood family members) even though we have not made the “choice” to have procreative sexual intercourse and create new responsibilities in the form of children. The model is more like this: some of this responsibility is communal, and having children (in legally recognized marriage) earns recognition and support for carrying out communal responsibilities as well as creating new ones. It’s like a double journal entry. The demands for “involuntary family responsibility” can require deferral or sacrifice of one’s own chosen goals. When I was a teenager in the 50s and early 60s, I thought I had a good shot at becoming a classical pianist and composer, and I would have worked hard enough. But the outside world would not leave me alone; it kept making its demands of "manhood" on me!

On the other hand, there is a separate world where adults focus on the value of relationships for their own creative potential, outside the world of social approbation. But communities of adults with this kind of focus tended to do better when they were closed and somewhat secluded. That's tougher in the Internet age.


In a sense, “sodomy laws” used to have the purpose of enforcing or imposing familial connectedness and family responsibility on everyone, whatever one's own separate talents. Married couples monopolized sexuality, and those who did not form their own families were expected to subordinate themselves to the family. Sometimes certain provisions were made: unmarried women could be encouraged to become teachers and “enjoyed” a certain amount of prestige and authority (remember “Good Morning, Miss Dove”), while unmarried men were worse off, except in the Catholic Church, where they could become priests (sometimes with dangerous results). This way, those who were “different” could not threaten the system by standing at a distance from it and kibitzing. The family model was indeed a collective one, which tended to live off of fears, and it never followed the idea of individual personal responsibility that is so accepted in individualistic society today.

Laws regarding "public decency" and now the military "don't tell" law have the effect of preventing "distraction" from family intimacy when it is needed -- at least that's what a lot of people perceive.

The old system had some explaining to do, to be sure. It chided young men (like me) who were not physically competitive (making us feel like mooches), but then it seemed to turn about-face (with a self-serving double standard) and expect us to be interested in marriage and family (even in "acting" as fatherly role models ourselves) even after we had been told we had been competitive “failures” as boys. Some of the “explaining” may have come from the fact that a lot of young male competition occurred in groups (especially team sports, like football, leading to a social structure well suited to the military). The competition was supposed to occur in a context that emphasized that young men owed their “community” something: the ability to protect women and children (hence the era of conscription or the draft, and the morally tenuous student deferments of the Vietnam era -- and a neo-authoritarian mentality that decried physical "cowardice" and said, "If you don't go, someone else has to sacrifice in your place"). In time, technology would help the less “physical” people (the nerds) invent new ways to “compete” (Facebook provides an extreme example), eventually developing and reinforcing a moral value system much more centered on individual responsibility for self than for family per se.

By the 1990s, we had evolved an informal "social contract" that regarded marriage and having children as a purely personal decision whose moral significance rested with the person. That notion has been reverting in the past few years, as we begin to realize that not only child rearing but also longevity in many areas requires a lot more familial and community commitment. "Personal responsibility" now incorporates a sense of understanding one's dependence on the unseen sacrifices of others, as well as some "existential" responsibility for sustainability, especially if one has spoken out. Curiously, this deeper notion of "responsibility" seems to meld with earlier notions of "biological" loyalty and makes some familial intimacy almost compulsory. The problems of "fairness" in how family responsibility is met, particularly in a world of longer lifetimes with disability, become quite profound. That's really the importance of the gay marriage debate.

Update: Jan 19

The comment source below maintains this blog, "Gay Uganda", here.

Sunday, January 17, 2010

Newspaper staff cuts lead to more errors; Post ombudsman weighs in: newspaper staff has to deal with new media, just like bloggers!


Along the lines of the troubles of “establishment journalism”, Washington Post ombudsman Andrew Alexander has an interesting column Sunday, Jan. 17 on p A21, “Is Post editing sloppier?” The online title of the story is “Why you’re seeing more copy-editing errors in the Post”, link here.

Yes, words get mixed up (“principal” v. “principle” – a good mixup not mentioned is “comprise,” which is usually correct in the active voice rather than as “is comprised of”). Little details get misreported.

Alexander explains the problem partly in terms of economy-driven staff reductions, but also in terms of the writing style and technical skills required for effective presentation of stories (including this one) on the Web. Newspapers really do follow search engine optimization and “cost per click” and other advertising efficiency measures. Because they have to make a profit and meet a payroll partly with this revenue, it’s a much bigger issue for them than it is for “citizen journalists” (yesterday’s post), although it raises an “existential” question: what if all bloggers had to prove they could actually make money with their writing to support other people?

The care and attention to detail required in professional journalism is quite remarkable. Foster Winans related all of this in his 1989 book “Trading Secrets” (St. Martin’s) about his career at the Wall Street Journal in the 1980s, with older technology. I've noticed in the past ten years or so that little typos have become more common in "trade" (or "regularly") published books, also. I had my share in my self-published book in 1997.

Update: Jan. 18

Check The New York Times, p B6, Richard Perez-Pena, "As Shrinking Newsrooms Use Upstarts' Content, Vetting Questions Arise," link here.

Saturday, January 16, 2010

Is "citizen journalism" (aka "blogger journalism") an oxymoron?


Is the practice of “citizen journalism” or particularly “blogger journalism” a “fundamental right” as part of the First Amendment?

To the extent that citizen journalism is usually unsupervised (with some exceptions where it is sponsored by newspapers), the term is theoretically an oxymoron. “Journalism” implies complete objectivity as well as systematic, rigorous fact-checking.

Most or nearly all “citizens” would have some natural bias about some things because of “inequality” in life circumstances and experiences. If someone has no biases at all, then that person probably has no responsibility for others, so why should we listen to them? (We’ve visited this point before.)

We can speak of citizen “writings” and accept (as I recall from an interchange in a Minneapolis NWU group one time) the idea that they don’t have to be completely “objective” and maybe shouldn’t pretend to be.

Nevertheless, what we call “blogger journalism” can often, in practice, be reasonably balanced and add a lot of nuance to the debate or history of issues on top of what is usually reported by established media.

There may be some people who should not engage in “blogger journalism.” It’s particularly troubling when one is in a position of authority over others in the workplace, at least in the long term with formal reports, and is in the position to make decisions that affect subordinates, or perhaps customers. Some of this ethical concern crawled out of the woodwork (sort of “Outer Limits” style, like a “blob”) when I was writing my book involving (in part) gays in the military and was in a work situation where I worked for a company that catered to military officers – but we determined that I was not in a position to make any decisions that affected anyone, and then I transferred away from this when there was a corporate buyout anyway.

Likewise, someone responsible for a family (most people are, not always by choice) has to be careful that his self-promotion does not jeopardize his ability to earn a living and provide for the family, or otherwise expose the family.

Blogger journalism may sound gratuitous (leading to the “implicit content” problem), and that concern could melt away if the journalism becomes monetized. But that idea gets challenged by the fact that conventional newspapers are having such a hard time today. On the other hand, social media companies are very profitable, sometimes, for the entrepreneurs who started them.

As noted in the previous posting, employers are getting increasingly nervous about associates’ blogging, fearing, not so much outright disclosure of trade secrets, but disruption of relations with clients and stakeholders, and even the prospective fear that employees will write about them after they leave. But even established media companies have hired undercover reporters to work for companies they will report on (as with a famous ABC investigation of Food Lion in the 1990s).

I started down the path of “citizen journalism” myself in the 1990s as a result of my involvement with the “gays in the military” debate, based on earlier personal experiences decades previously. For better or worse, I am linked to it, although in a kind of quasi early retirement (now at 66). The ideology of it is far from perfect. I’d love to be a “real journalist.”

So, Anderson, would I love your job? You bet (but you paid your dues early). I remember, Anderson, when you were wading in the flood waters of Hurricane Ike on a Saturday night in September 2008, reporting. You should have been back in the Big Apple, where the financial crisis would erupt the next day.

Thursday, January 14, 2010

Employers: "Dooce" and "Pre-dooce"


Here’s a piece I’ve heard about from the San Diego Reader from September 3, 2008 [note, just before the financial crash!], “You blog, you’re out!”, by Michael Hemmingson (link).

Here, he talks about job applicants being “pre-dooced” by being asked about blogging: particularly, have they ever blogged about a job after they’ve left it? You see the existential point: if you did, how does a prospective employer know you won’t do the same thing to “him”? You can’t prove a negative, you know. (By the way, the new verb “dooce” refers to being fired for what you say on the Web with your own resources from home, a term that originated with Heather Armstrong in 2002 and became the name of her famous mommy blog website.)

The article gives a lot of examples of other doocings and pre-doocings, mostly for pretty silly behavior by the “victims.” The issue is clouded by many factors, like a debate over whether amateur blogging is really legitimate “citizen journalism” and the misimpression that the First Amendment would apply to private employers. As far back as 1999, a nurse was fired from a hospital in Arizona for appearing with her husband on a porn site the couple created!

The article lists a compilation of anonymous blogs (here) including blogs about work, which are not always as safe as they sound. Like social networking sites offering privacy settings, Livejournal (link) offers varying degrees of restricted access. And “Tor Project” (link) will help defend against electronic surveillance, such as of IP addresses (Tor is popular overseas with political dissidents).

As for writing about past jobs, I do think there is an issue of “common sense”. I don’t think there is a problem in discussing how systems technology in an IBM mainframe shop (or even a non-IBM mainframe shop) 30 years ago compares to how things work today, even if you name the companies, as long as you keep it “general.” (You almost would do this on a resume site anyway.) Yes, I mention NBC, Chilton (Experian), and BCBS, but I don’t think anyone minds, because nothing I say is remotely compromising of confidentiality or remotely related to specific people who were at the companies involved.

Even so, it still seems that few employers spell out blogging policies very specifically (although CNN does).

The article concludes with some tonal writing, that bloggers will have to fight to protect their freedom of speech in the practical world. Agreed.


Let me add, I don't set up sites to rate providers of services (like doctors or teachers), and I don't make postings on the many sites around that do rate providers. If I displayed a public propensity to do that, how could they work with me in the future?

Wednesday, January 13, 2010

MSNBC editorial scathes Facebook, other social media companies on attitudes toward user privacy


Helen A.S. Popkin has a stinging commentary about Facebook on MSNBC today: “Privacy is dead on Facebook. Get over it. Cool kids don’t care about privacy, claim cool CEOs. So, neither should you.” The link is here.

She includes some interesting quotes from executives at other companies. In 1999 Sun Microsystems’s Scott McNealy said, “you have zero privacy anyway. Get over it.” And Google’s Eric Schmidt said, “If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place.”

She quotes Mark Zuckerberg as saying that blogging has really taken off in the past 5 or 6 years as a social networking tool – but here she misses a point. Socializing and meeting people is one thing; publishing in order to enter debate (more like what I “innovated” – previous post) is related but not the same thing. True, social norms about what kind of information can and should be shared have evolved over time. But, when I’m a substitute teacher and a kid has found me on Google, I don’t need to be asked why I’m not married and don’t have kids. The 1993 mentality that had led to “don’t ask don’t tell” really got blown away by the techie kids.

Toward the end of her piece Popkin echoes Rick Warren as she writes “Privacy isn’t just about you, even if you have nothing to hide. Privacy is also about those in power abusing personal information, or those in power having their personal information abused in ways that can eventually affect us, the little people.”

I remember back in 1996 that the Libertarian Party was going to consider a constitutional amendment that would read simply “The right to privacy shall not be infringed.” Remember how “the right to be left alone” figured into cases like Bowers v. Hardwick and Lawrence v. Texas. We have really gone around in circles, haven’t we.

Also, check Meghan Cox Gurdon, commentary in The Washington Examiner, Jan. 14, "It's like being stalked by a computer", link here. Just read it!

Tuesday, January 12, 2010

Political blogging: just where is it headed?


I’m going to start this post with an extended (and I think, under the circumstances, fair use) quote of a Washington Times editorial dated Oct. 12, 2005, regarding campaign finance reform and blogging (in reply to a related Washington Post editorial the day before). Unfortunately, the Times editorial doesn’t pull up now, but I had saved it because it figured, through a bizarre chain of coincidental circumstances, into a serious incident when I was substitute teaching (see July 27, 2007 here) – events that sound like a humdinger and deserving of movie treatment now.

Here’s the quote:

“Political blogs aren't widely read because they are funded by some multimillion-dollar company through political advertising. As Michael Krempasky, director of RedState.org, testified before Congress last month, money has very little to do with it. ‘Bloggers don't have influence because they start with large chunks of capital -- in fact, most if not all start out as relatively lonely voices with tiny audiences. By delivering credible, interesting, and valuable content, their audience and influence grows over time, ‘ he said. In other words, blogging is an endeavor subject to the rules of the free market. Inside this unbridled exercise in free speech, the good rise to the top, while the hacks and frauds go ignored or quickly disappear. “

This editorial appeared just as social networking sites were heating up, with the public (including newspaper editorial staffs) still not quite grasping how important “online reputation” was soon going to become. Recently, I covered here how “socializing” and “publishing” online are interrelated but still different concepts. “Socializing” doesn’t always imply a desire for the limelight – and we know that Facebook, when it was started, recognized this fact much more than did Myspace.

But the Washington Times has a terrific point: an individual blogger, just on the strength of the quality of his or her postings, could be in a position to influence how millions think about a particular issue. And the blogger didn’t compete for the approval of others by the usual routes expected in the past, and doesn’t seem answerable to any specific hierarchy, whether business-related, political or familial. Of course that point, as noted here before, invokes the problem of “the privilege of being listened to.” And most of this happened because the "due diligence" and supervision element of publishing simply got dropped as unnecessary (sort of the way junk bonds were justified), with the help of some fortunate legislation (Section 230) and a certain naivete about the ultimate downstream legal risks.

There seems to be a fundamental question: just how significant can a blog be, then, since it can have such unbounded reach? Is this just a theoretical artifact of technology (including the powerful broadcasting effect of search engines – and you really don’t have to pay for placement), or is it something “real” with ethically significant consequences for our social and political system?

But the growth in the concerns about “online reputation” – and the attention that companies like Michael Fertik’s Reputation Defender, or a competitor, Reputation Hawk, now garner – seems to answer that question. Online self-publishing, freed from the restraints of supervision and accountability (and especially the expectation of financial returns), can have a real effect on things, ranging from people’s employment prospects to the future of conventional newspapers.

That leads to another concern: if self-produced content seems gratuitous or designed only to grab attention without having to compete through “normal” means or step into taking responsibility for others in manners often determined by social and familial hierarchies, then the “purpose” of content becomes an important part of it, leading to the novel legal notion called “implicit content”, hardly mentioned before 2006 but mentioned during the COPA trial. Curiously, this notion has grown while bloggers remained largely oblivious to the fact that their public postings bore the same technical legal risks of liability (for libel or copyright infringement, among other things) as more conventionally "published" literature -- leading to the new controversy over pricing media perils insurance.

On the other side of all of this, consider the way our “conventional” way of competing for reward and limelight used to work. “You” were expected to prove that you could serve as an authority figure over others, or that “you” could manipulate others into buying things that you did not produce or invent yourself but that you peddled in order to prove that you could produce more revenue for somebody else’s bottom line than your neighbor across the street. No wonder blogging, maybe combined with a prolonged period of offering free content, seems like such an attractive opportunity for the introvert. Truth counts as much as power.

Back in 2006, Fairfax County Public Schools English teacher Erica Jacobs wrote in the DC Examiner about introducing blogging to her students as a form of literature, which caused nervousness and fear among her school administrators.

All of this melds together, as new modes for the best way of representing oneself online develop, with ideas as to how to develop an integrated presence, coordinating Facebook, LinkedIn, Twitter, professional publications and perhaps blogs. Gone are the days that I evolved my own presence, the 90s environment where one had one life at work and another “private-public” life online. Peaceful coexistence couldn’t last forever. It was no longer “sustainable” when social media stirred the pot.

Social networking and self-publishing have evolved side by side, interlocked, but gradually coming together. The whole process is one grand “reconciliation” (one of novelist Clive Barker’s favorite terms) of two views of life: one centered on the value of the individual as a member of the group (especially family), and another on the individual as “competitive” global citizen.

Although I didn’t make much money at it (I’m no “accidental billionaire”, not even by a factor of thousands), I do believe that I helped to “innovate” “political blogging” in the late 1990s, motivated by a couple of specific issues (“don’t ask don’t tell” and COPA) that would engulf other issues like Steve McQueen’s “Blob”. That became the path for the second half of my life, for better or worse.

Sunday, January 10, 2010

In bizarre case, plaintiff gets entire sites ordered shut down by New Jersey court, despite Section 230


A court in New Jersey issued an order to shut down three websites and revoke associated domain registrations, all related to allegations made on these websites about the Apex Technology Group, which is involved in bringing technology workers to the United States under H1-B non-immigrant visas. The websites were apparently critical of Apex and also linked to a particular Apex document while providing some sort of commentary on it. Apex allegedly claimed that the defendants were guilty of both copyright infringement and defamation, creating a bizarre situation where a copyright owner in effect defames itself. (A variation of this problem occurred in my own experience: see July 27, 2007 on this blog.)

An article by Kurt Opsahl, “Order to Shut Down Websites Critical of Apex Technology Group is Dangerous and Wrong” (link), provides a lot of analysis.

One of the most shocking aspects of the order was its total disregard of the protections for ISPs under Section 230 of the 1996 Telecommunications Act. Another is that a whole website was shut down because of material related to one plaintiff, which might be a minuscule portion of the entire site.

Saturday, January 09, 2010

Juror home Internet access and social networking during multi-day trials becoming a problem


Jury duty has always been a civic obligation, which could disrupt a person’s livelihood, to the point that many employers would cover it. That was the case for me: I got called to jury duty four times in Dallas in the 1980s (with the one day, one trial system) and was foreman once.

But asking jurors to remain mum on the Internet while on jury duty seems too much to ask. Check out the Metro story “Social networking among jurors is trying judges’ patience,” by Del Quentin Wilber, the Washington Post, Saturday Jan. 9, here, mostly about cases in Maryland. Convictions have been overturned or at least challenged after jurors admitted doing “research” on the Internet (even just looking up words on Wikipedia), or when several jurors “friended” each other on Facebook and chatted about the case, potentially “bullying” other jurors to go along.

In the most sensitive cases in the physical world, jurors can be sequestered in hotels, and kept away from newspapers and media, including the Internet.

Attorneys could research a juror’s “online reputation”, especially blog postings, to look for prejudices that might affect a verdict in a case; but it’s unclear whether appeals courts would consider such post-mortem material seriously.

Thursday, January 07, 2010

"Right of publicity" can generate a real tort; companies and bloggers alike should take note (incident with billboard of President in Times Square)


The White House reportedly objects to the placement of a billboard above the Red Lobster in Times Square showing President Obama, in China, wearing a Weatherproof jacket (link). (I guess, following the spirit of my trademark blog, "Weatherproof" is a strong brand name even if it is a common word.) The White House says it has always prohibited misappropriation of a president’s image for advertising purposes. In fact, use of a celebrity’s image without permission to imply endorsement of a product is considered a violation of the right of publicity. Many states, including New York, have laws specifically authorizing lawsuits by the subjects of such ads. Most likely, the company will take the billboard down after a period of free publicity. The New York Magazine story is here.

Would such a doctrine bother bloggers (with “commercial activity”)? It’s probably pretty trivial; if you show an image of a celebrity and it’s informative and relevant to the story, I would think it shouldn’t be a problem. But it’s an interesting question.

Wednesday, January 06, 2010

Individualism retreats in a battle over "virtue" vs. "people"


One of the pleasures of “perfectly composed” music (one thinks of Beethoven) is the idea that some kind of virtue or beauty is being expressed as an ideal for people to follow.

We hear a lot about “virtue” these days in connection with discussions of Islam, because its attainment by strict obedience to the Koran is such an important goal for some people (there’s a lot here in the writings of Sayyid Qutb). In fact, it seems like an important goal in orthodox or conservative Judaism (the Torah), and in much of evangelical Christianity (the Bible as a whole). One feature of societies with social norms based on scripture is that it seems important to people (especially to find emotional satisfaction in committed family life as well as religious practice) to believe that others have to play by the same rules.

When I was growing up, in the grade school and tween years, I fell behind physically but, with the aid of music, actually pulled ahead academically. I inherited the idea that “winning” competitively (Trump style) was a very important “virtue”. I did pick up on the idea that men competed to prove they were capable of protecting and providing for women and children and were “worthy” of carrying on a family lineage. In historical retrospect, this seems like an extension of ideas we had fought World War II to defeat. But we would do the same things ourselves. During the Vietnam era we would shelter the “smartest” people from the dangers of conscription and allow others to be “sacrificed.” In the 50s and 60s, I looked for rationalizations of this practice.

The “ideal man” would be someone who was developmentally “superior” both academically and physically, a Clark Kent. I saw someone like that as representing “good,” and being good at both was quite a feat. At the same time, I had come to believe that I might be physically, competitively “unworthy” of performing as a man in the conventional sense. All of this sounds horrible to say today, but that is frankly how a lot of us saw things at the beginning of the 60s. We had to go back to a certain kind of fascist thinking (what were these Nietzschean ideals for, anyway?) to defeat the big red godless enemy, Communism. Of course, if someone didn’t have to struggle with anything, what would he have to say or give to others? We didn’t always see this then. In “Smallville”, Clark “struggles” to keep his “secret” (his extraterrestrial origin), a bizarre parody of “don’t ask don’t tell.” But could Clark write a book about real anguish? I wonder. Yes, he loses his father, and loses Lana; interesting question.

But nevertheless, as I remember debating with myself in a soliloquy, Carousel style, while working with the Science Honor Society (selling Cokes for a spring trip to Mount Washington) at a warm October high school football game in 1960, it seemed to make sense to “affiliate” with those who were “good”. I’ve talked a lot here about “upward affiliation.” I justified this in my own mind the way people rationalize things in foreign films (especially those from Pedro Almodovar). All of this contributed a lot to my homosexuality. The whole idea of creativity and the polarities would come later (in the 1970s, with the Ninth Street Center).

There is a conceit in all this: that I had the “power” (call it “power behind the throne”) to decide who or what was “good” (or “evil”). It seems to me that both Islam and Judaism allow this, and I had a bit of both religions in my own personal moral belief system, even though I was brought up as a Baptist. (That’s another point: it makes no sense to me that Islam and Judaism should be enemies, other than over the issue of taking Palestinian property; ideologically, it’s all nonsense; they worship the same God and share the same idea that there should be some kind of justice on earth and that people should pay for their own mistakes.) Christianity adds something new to the mix: an individual man cannot know good and evil from his own efforts or emotional reactions to things (including feelings of sexual attraction to individuals who appear to live up to some ideal, whether in a straight or gay environment, and a refusal to “feel” sexually toward those who are less “worthy” according to one’s own value system). According to Christianity, we all need Grace (although if you believe the Vatican, some of us, namely homosexuals, seem to need it more than others).

I had always rationalized “upward affiliation” as moral because of its apparent harmlessness; it involved private choices (who could become my adult “significant others”). Violence against those who challenged one’s belief system could never be morally acceptable behavior; liberty depended on non-aggression. (That’s a big difference from radical Islam, though not from conventional Islam, where the whole meaning of jihad is an internal private struggle.) But what challenged this set of assumptions, popular in the early 1990s as Bill Clinton took on the military gay ban, for example, was that the Internet would make private feelings much more public, not so much through any direct confessions as through innuendo, context, and imputed motives for writings self-published in public spaces. These feelings might be thought to have a judgmental purpose; hence, the old problem of claiming “the knowledge of good and evil”. Internet entrepreneurs and adventurers (myself included) bragged that we could master all knowledge before dealing with real people.

Christianity, it seemed, more than the other monotheistic religions, did make something of the idea that we are all connected at various levels of granularity, mainly the nuclear and extended family, where the heads of a family (a legally married monogamous couple) have some power over the values not just of minor children but of all adults who haven’t made a similar commitment. One did not really “know” things (including “virtue”) until one had dived into the pool, taken some chances, and assumed some responsibility, by starting a family oneself and taking on all the uncertainties.

That had an effect on personal moral values. In decades past, heads of families assumed they had the right to direct the loyalties of everyone under them, even adult children (remember the movie “One True Thing”?). Families were based on the idea of complementarity, that anyone could fall onto the blind side at times and need help. The "hidden curriculum" of gender and capability complementarity was supposed to grow the emotional responses that would enable even the "less competitive" to find and enjoy marriage (although there was some contradiction and double standard in this notion). After the 1960s, hyper-individualism started to grow, with the idea that every adult had to be totally accountable for the self. With individualism came the idea that one had complete control of every aspect of one’s intimate life as an adult, including the right to turn people down. (Life in the gay community really teaches that lesson.) Sometimes we called this “self-ownership.”

With demographics and with the way some medical advances have gone, this idea seems to have become a wild pendulum, swinging back the other way, as if challenging our ideas about relativity in modern physics. Lack of control seems to be the new norm. There’s no way to greatly extend people’s life spans without the cooperation of family members (institutional care will run out of room), and many medical advances for the disabled, not previously possible, can only be carried out with the “involuntary” sacrifice of family members. Family responsibility will no longer be just a matter of the consequences of sexual intercourse. As it did in the past, it pre-exists; marriage adds both privileges and more responsibility. No wonder the stakes in the same-sex marriage debate seem so high.

A young person (and that could be anyone psychologically in some kind of adolescence) must consider what it takes to "make it" with his own expression. An aspiring composer must master musical technique with years of practice, master certain concepts related to composition, and have something to say with his or her music. There is always the possibility that the needs of others (especially blood family, and not just one's own children), and the "knocks" from the demands of the outside world (even if the artist is earning a living and "paying his bills" responsibly), will derail the artist's own individual plans. On the other hand, without grounding in meeting the real needs of others (perhaps in a psychologically "creative" setting), when will someone's global expression really say anything?

Tuesday, January 05, 2010

Outpatient surgery: My hernia operation turns out to be a "smash and grab job"


Well, outpatient surgery is all the rage, and I had my own experience with it at Virginia Hospital Center, with a bilateral hernia operation, necessary after 66 years of Earth’s gravity.

So, once they put me in the temporary “day care” room, I turn on CNN, and the first thing that comes through is the story of a woman and her baby who died in childbirth when something went wrong with the epidural. And I thought they were going to use an epidural on me.

Oh, the nurse decides to put the IV in the middle of the forearm, so at the end of surgery I have to be very careful taking it off to save the hair on the arm. Underneath the bandages, I suppose they shaved, but the chest is intact. The days of “nipples to the knees” seem to have been forgotten. In fact, the only EKG leads on my body are on the side of my neck.

No, they decided that this would become a “smash and grab job,” as Matt Damon’s character puts it in Ocean’s Eleven. Three different nurses come in and ask my name and the procedure I am to have. Then the surgeon comes in for a brief discussion of post-operative care (no driving for four days, though I feel absolutely no pain at all right now). He says, “We’re going to room 111.”

Then there is absolutely nothing. The next thing I remember is the recovery room, feeling a little drunk, with ice cubes and juice being offered. The operation took an hour and seven minutes.

The worst thing about outpatient surgery is that you can’t eat or drink anything after midnight the night before. That’s because they don’t know whether they will do general anesthesia or not. But with an epidural it takes four hours before you get your feeling back; general is so much faster. And the CNN story pretty much made the decision. Sanjay Gupta seemed to be supervising the operation from on high.

Oh, yes, the hospital tells you to bring as little as possible besides your insurance cards. But at least two other surgery patients brought wireless laptops, as if they expected to tweet or blog about their experience while they were being cut. I guess they hadn’t seen the CNN story about the epidural yet.

Sometimes general anesthesia really is safer. But everything in life entails risks, right?

Monday, January 04, 2010

Newspapers deal with ethics of "paying" for anything related to freelance submissions


Well, The Washington Times has fulfilled its New Year’s resolutions, with a $1.00 hardcopy paper that has no local or sports section (no one cares that the Redskins fired Jim Zorn). It actually hired new management while laying off 40% of its news staff. And the stack of papers at an Arlington 7-Eleven at noon Monday was not moving at that price.

The Times has an excellent article on the Culture page, B5 (in the Commentary Section), “Checkbook Journalism: TV news still facing ethics issue”, link here.

Obviously media sources aren’t supposed to “pay for news”; nevertheless, it seems as though recent “accidental celebrities” Jasper Schuringa and David Goldman benefited from their adventures.

Jasper Schuringa, after tackling the terror suspect on the plane about to land in Detroit, his hand burned, still got off some cell phone photos of the suspect. ABC and CNN both paid for photos, which they say is not the same as paying for an interview. Not that Jasper shouldn’t be rewarded for his actions, but it seems as if the networks walked into a gray area here. No doubt some movie directors must wonder if he can act; imagine him with a part on CWTV’s “Smallville”.

This reminds me of the issue of bloggers getting “paid” to promote products, which the FTC is forcing bloggers to disclose; it even reminds me of past controversies over whether blog entries could be considered indirect “campaign contributions”.

The New York Times (another “Times”, and a big brother!) also addresses the issue of even paying expenses for freelancers, in a column on page 8 of Sunday’s Opinion section in “The Public Editor” by Clark Hoyt, “Times Standards: Staffers or Not”, link here. The article discusses three major freelancers, all seasoned professionals, who no longer write for the paper after violating company rules on accepting money at all. Hoyt writes, “These cases illustrate how hard it is for The Times to ensure that freelancers, who contribute a substantial portion of the paper’s content, abide by ethics guidelines that editors believe are self-evident and essential to the paper’s credibility but that writers sometimes don’t think about, or don’t think apply to their circumstances, or believe are unfair or unrealistic. Some writers do not read the guidelines carefully, and although they are encouraged to raise possible conflicts of interest with an editor, some don’t tell and are not asked.”

Where I think there could be an ethical problem is with publishing opinion pieces submitted by people in management on their own initiative, unless the manager’s employer agrees that the submission is appropriate and naturally part of the person’s job. Likewise, it can be inappropriate for a manager to blog on his own about anything remotely related to the work of the people he or she supervises, without oversight from the employer.

A distantly related incident occurred on CNN, covered in a PopEater story by Ron Shulter, "Kathy Griffin Banned From CNN", link here.

As an aside, you may want to check David Carr’s piece in the “Week in Review” section of the New York Times, Jan. 3, “Why Twitter Will Endure: So you’re drowning in a sea of information; perhaps the answer is more information”, link here. Twitter, unlike Facebook and Myspace, doesn’t expect “reciprocity.” Twitter is the ultimate example of “algorithmic authority.”

Sunday, January 03, 2010

If "we're at war", then a lot of free expression could be up for grabs


When I turned on the news Sunday morning on NBC, I kept hearing that the president is “absolutely certain” that the Christmas Day Incident in Detroit was part of a terror plot connected to Yemen and probably Somalia, and that he would meet with all of his advisors on Tuesday, when he returns to Washington, to consider emergency steps. A huge tragedy was avoided only by an ineffective device and by a particular passenger (who may present a certain political irony). This is certainly alarming for civil liberties on several fronts.

Let’s put a couple of things in perspective: the Incident would not have occurred had the suspect been re-screened properly in Amsterdam (where I have flown to and from several times). It’s hard to understand how a passenger was allowed to board an international flight to the United States after being screened only in a third-world country (particularly after paying cash). Simply improving screening at European or Asian plane changes would do a lot to improve security without imposing on civil liberties.

The alarm this morning was augmented by announcements that the United States and Britain would close their embassies in Yemen because of specific threats, but later reports indicated that these closures would be temporary.

We’ve all heard a lot of reports about passenger management in flight (the pillow issue) and the probable increased use of body scanners. We’ve heard a lot about the failure of intelligence to connect the dots, especially when the suspect’s father, a businessman living in London with moderate beliefs, talked to authorities about his son (a tip that reminds one of the Kaczynski case).

And we hear a lot about comparisons to other incidents, including Ft. Hood.

But the most disturbing comment is that ordinary Americans need to realize “we’re at war.” Cheney, remember, just before Christmas, had criticized the president for downplaying the importance of this “war.”

It’s the “war” word that can provide a rationale for all kinds of restrictions on civil liberties, going beyond airline passengers. Ironically, on the first Sunday that the shrinking Washington Times does not print a paper, the conservative paper ran a hard-hitting editorial “A Decade of Decline: Barbarians at the gates in 2000-2009”, by Jeffrey T. Kuhner, link here.

It’s true that the Bush administration did not ask Americans (outside of those “volunteers” already in uniform who got treated with multiple deployments and “stop-loss” policies ) to “sacrifice”, but it’s inevitable that this kind of talk will reoccur, with issues like national service and possible conscription (as was discussed in 2001, ironically by Charles Moskos, originally one of the authors of “don’t ask don’t tell” back in 1993).

Another issue is the “gratuitous” Internet, so critical to business models now, which can serve as an easy tool for “steganography”, or for a trick called the “dead drop”. There’s an article at ABC Australia from 2005 with Roger Clarke and Chet Hosner about how it works (link); it comments on whether you could “ban” steganography, but also makes the point that it’s already “illegal” in most western countries for terrorists to plot attacks, on the Web or anywhere; it’s detecting them that is difficult. There is also an excellent discussion, offered by Georgia Tech, of a Los Alamos article by Donovan Artz, “Digital Steganography: Hiding Data Within Data” (and “the data beneath the wax”), link (pdf) here. There was a lot of discussion of this possibility after 2001. In April 2002, an online essay of mine about 9/11 was hacked and overlaid with bizarre characters in a section where I talked about nuclear weapons.
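To make the “hiding data within data” idea concrete, here is a minimal sketch of least-significant-bit (LSB) embedding, one of the simplest techniques discussed in the steganography literature. This is my own Python illustration, not code from any of the articles cited above; the function names and the toy “carrier” buffer are just placeholders for what in practice would be raw image or audio samples, where flipping the lowest bit of each sample is imperceptible.

# A toy sketch of LSB steganography (illustrative only).
# hide() slips each bit of the secret message into the lowest bit of one
# carrier byte; reveal() reads those low bits back out.

def hide(carrier: bytearray, message: bytes) -> bytearray:
    """Embed message bits, most-significant bit first, into the carrier's low bits."""
    bits = []
    for byte in message:
        bits.extend((byte >> i) & 1 for i in range(7, -1, -1))
    if len(bits) > len(carrier):
        raise ValueError("carrier too small for message")
    out = bytearray(carrier)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit   # overwrite only the lowest bit
    return out

def reveal(carrier: bytes, length: int) -> bytes:
    """Recover `length` bytes of hidden data from the carrier's low bits."""
    message = bytearray()
    for i in range(length):
        byte = 0
        for b in carrier[i * 8:(i + 1) * 8]:
            byte = (byte << 1) | (b & 1)
        message.append(byte)
    return bytes(message)

if __name__ == "__main__":
    cover = bytearray(range(256)) * 4          # stand-in for image pixel data
    secret = b"meet at dawn"
    stego = hide(cover, secret)
    assert reveal(stego, len(secret)) == secret
    print("recovered:", reveal(stego, len(secret)))

The point of the example is that the altered carrier is visually and statistically almost indistinguishable from the original unless you know to examine the low-order bits, which is why detection, rather than prohibition, is the hard problem the cited articles dwell on.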

In fact, a few years ago, there was a lot of attention from NGOs on accounting for loose nuclear materials, some of it led by former Senator Sam Nunn with the “Nuclear Threat Initiative”. Recently, the Washington Times has presented some op-eds about the danger of small nuclear devices launched offshore by rogue elements (from Iran or North Korea as well as Al Qaeda) for the EMP effect, or even the development of microwave weapons for this purpose, discussed in Popular Science one week before 9/11 in 2001, and the subject of recent media reports about experiments carried out at Aberdeen Proving Ground in 2001.

The media has also made much of the use of the Internet to recruit young men to radical Islam. Most of these sites are hosted outside the United States, and trying to regulate the content of sites within the United States would run into the constitutional objection to any censorship based solely on explicit content, as we know from COPA. That leaves open the idea of regulating distribution, dealing with issues like free entry, free content models, and possibly mandatory insurance, as has been noted here before.

Listen to Richard Engel's comments about Al Qaeda in Yemen as compared to Afghanistan and elsewhere in this video



Note that he says Al Qaeda in Yemen encourages would-be jihadists to "do it yourself" with instructions from the Internet. Although most of these sites (as I noted) are overseas and would violate the TOS of major western service providers, this sounds like an existential attack on the "free entry" and free speech model of the Internet itself. Has anyone else noticed this?


The world seems even more dangerous today than it was in 2001.

Saturday, January 02, 2010

Blogger journalism: reaching a critical point?


I do like journalism and can imagine that I could have become an investigative reporter in another life. I like the idea of uncovering and “exposing” things, and of helping the public “connect the dots”.

And as I have covered before, I place a high priority on documenting the history of the relationship between the individual as an autonomous unit and the individual as a (loyal) member of the group, especially the family. Attitudes on this matter have migrated dramatically in the past half decade (toward individualism) but sustainability concerns and an indirect consequence of the “free entry” paradigm of social media may be to draw us together again, with a less idealized view of notions like “equality.”

Both blogger journalism and social media, interrelated and linked if somewhat different experiences, evolved unexpectedly because of quick changes that happened technically (the sudden development of search engines in the late 1990s) and legally (the passage of Section 230 and, later, the much misunderstood DMCA safe harbor). The end result is that the “due diligence”, formal fact-checking, and external supervision previously associated with publishing were ditched for newbies. This has a good effect (enriching debate with anecdotal details of “life in the grass roots” not available from the mainstream media) and bad effects (people misuse the web, sometimes maliciously, but often without knowledge of the long-term consequences, to reputations and legal exposure, for themselves and others).

The business world has two real concerns, neither of which it probably fully grasps, partly because of the existential problem that you can’t prove a negative (or prove that some undesirable event can never happen at any unspecified time in the future, a conundrum we know well from “don’t ask don’t tell”). For example, businesses can specify rules for guarding stakeholder confidential information and trade secrets, but still be queasy about whether individual associates with such easy access to global self-promotion (from home) will know where to draw the legal lines, or will honor confidences in the future after leaving the company, when an employer in a practical sense has less leverage (yes, they could sue…). And employees with direct reports, or with significant interaction with clients, could inadvertently offend such stakeholders (possibly with business or legal consequences) who find “gratuitous” postings on the web and interpret them personally, out of the context intended by the original speaker. One recent lesson, especially since the advent of social media around 2004 or so, is that the apparent “purpose” of the speaker can be as relevant (even legally) as the objective meaning and legality of the content considered apart from knowledge of the author’s identity. This is the “implicit content” problem.

I’ve actually encountered this already. There were a couple of circumstances at my last employer in the 1990s that I cannot safely discuss in detail even today, even though one particular incident could embellish what I have to say. For one thing, I don’t have legal ownership of a particular item, even though I “authored” it at work. Anyone who enters global public debate on sensitive issues the way I did is probably going to have some experience with this kind of situation, so I hope that is somewhat reassuring to people who deal with me today. These things can always happen, given enough time and opportunity. Possibly, with investor money for a “real” movie, legal permission could be obtained to disclose certain matters (the film industry calls this “script clearance”). There are some practical limits to what “amateurs” can do.

There’s another subtle consequence of placing oneself into “public life” (to borrow a line uttered by Anthony Hopkins as the notorious Hannibal character in the movies). It creates responsibility for others. I’ve heard people say that “writing a book is like having a baby.” That assertion may trivialize parenthood, but once someone has, “standing from afar” (and “on high”, as if writing a “manifesto”), commented on the actions of others who have dived into deeper “Open Water” than the author has, that author is likely to be challenged or manipulated into welcoming some of the same responsibility. Call it karma. This is a sea change in our concept of civility; in the age of "privacy" (before the open Internet), if you disapproved of something you just ignored it; now, with public outspokenness, you can't claim you're (just) "neutral" anymore.

There is some irony or paradox, perhaps unaffordable, in my own experience: my own gay values map in my own mind to a notion of "virtue" (call it "the knowledge of good and evil" if you want) that can become as "fundamental" to me as it is in formal religion, and I'm sure some of it sieves through my writings, at least indirectly; "threats" to drag me out of my own value system into unwelcome interaction with others are particularly disturbing to me. I have to say that in the past decade or so, I have learned why some people are so concerned that others be forced to behave as “righteously” (especially in religious terms) as they once had to. But I also understand how old-fashioned ideas of "public morality" involved socializing others so they could experience affection on terms determined by others in the family rather than on their own terms, leading to double standards. If you speak about us, they say, then tell us why you don’t want to take care of us.