Monday, December 31, 2007

Seize the Web from Home, and fly over the Culture Wars


A look at the checkout line offerings at a local red-letter Staples store gives a skewed impression of what is happening these days in the area of self-expression (the “look at me generation”) and entrepreneurialism. The December 2007 "PC World", with a sky-blue cover, reads “Special Issue: Secrets of the New Web: 87 Powerful Sites, Services and Tips.”

The core article seems to be by Jeff Bertolucci, “Seize the Web,” which is a bit like a college essay test question with many little parts. Well, not really. But there are a lot of components. “Star in your own Videos.” That goes way beyond YouTube and mentions Metacafe, Veoh, and Crackle (that’s not bronchitis). Spike is associated with the entry-level filmmaker’s product iFilm (all the rage in 2002, when there were waiting lists for the iMac). I used that and edited a 34-minute film with it, not polished enough for commercial distribution (and I’d need the releases). But today middle school kids are learning to use Premiere and Final Cut on their own and have, on more than one occasion, produced short films worthy of being in major festivals (I can think of one about the treatment of military veterans right off the bat).

To get back on track with this magazine, there is also “Be the Star of your own social network,” going beyond what you usually hear about Myspace and Facebook. Then there is “Blog for Show, Blog for Dough,” the latter of which this entry is about. The article recommends Blogger for openers, and gives a nod to WordPress as a step up for more advanced knowledge management through blogs.

I also like this one: “Get Your Book Read.” He discusses a number of self-publishing services with varying business models. Besides iUniverse and Xlibris, there is now Amazon’s BookSurge, as well as Lulu (Alban Berg’s opera, perhaps?) and Cafepress. Three decades ago, the only model was “subsidy publishing,” and the players were Vantage Press and Exposition. It did not enjoy a lot of accolades. Now, print-on-demand has changed everything. Some self-publishers are closer to what literary agents call “cooperative publishers” and have sophisticated packages aimed at improving the editing quality and professionalism of the product.

There is an interesting section, "Life without Software," by Scott Spanbauer, discussing how most functions can be performed interactively over the web with server-side computing, without one's own desktop applications. But that sounds like returning to the culture of the mainframe, doesn't it?

A somewhat different view of self-promotion comes from the beige “Success from Home”, at least the September 2007 issue. Actually, this magazine may comport more with some of the evangelical community, as there is a banner “Stephen R. Covey: Discover the Power of a Family Mission Statement.” Inside, there are a lot of materials about sales businesses run from home, some of the material centered around Isagenix, a weight-loss product. But there is also a story “Donald Trump / Robert Kiyosaki: Why They Want You to Be Rich.” Inside there is an article by John Fleming, “Way of the Future,” that presents the case for “Direct Selling,” and later there is a piece by Trump and Kiyosaki, “Why We Recommend Direct Marketing.”

There is a certain contrast between the tone of these two magazines. The PC World issue is centered around personal autonomy, and is based on the (to some people, incredible) idea that self-produced material will get noticed, even in a world where there are hundreds of millions of blogs, profiles and videos. That’s true, and it is simply a result of “exponential mathematics,” the same binary, mathematical behavior that drives Google’s own success, yet it seems so befuddling to high school students in Algebra II. There is self-interest in a great education, particularly in the fundamentals (like math). They really do pay off for you.

The “Success” magazine seems closer to family values, to be sure, but even that has to find roots in individual self-interest. Here, again, so much of family financial planning depends on an immutable mathematical fact (even beyond the control of God): compound interest. That’s why it’s so important to pay off credit card debts fully each cycle (many families don’t). But Success is going beyond that, of course, to get into the idea that individuals and families can free themselves from dependency on employer benevolence: if you work for yourself and can make money at it, you can’t be laid off. The work-from-home world can lead into some interesting byways, such as the cash-flow business; getting into these sometimes means a large upfront fee for a seminar and introductory materials, and a quick go/no-go decision.
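To make the compound interest arithmetic concrete, here is a minimal Java sketch (the balance and rate figures are made up purely for illustration, not taken from any real card agreement) of how a carried balance snowballs when it is not paid off each cycle:

    // Hypothetical illustration: the balance and APR are made-up figures.
    public class CompoundInterestDemo {
        public static void main(String[] args) {
            double balance = 5000.00;    // carried-over balance (example)
            double apr = 0.18;           // 18% nominal annual rate (example)
            double monthlyRate = apr / 12.0;

            for (int month = 1; month <= 12; month++) {
                balance += balance * monthlyRate;   // interest accrues each cycle
                System.out.printf("Month %2d: $%.2f%n", month, balance);
            }
        }
    }

Left unpaid for a year, the example balance grows by about 19.6 percent, more than the nominal 18 percent, because the monthly compounding feeds on itself. That is the whole trap.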

When it comes to independence and entrepreneurialism, there are a couple of indispensables. One is content, and the other is notability. You need to have something worth selling. Personally, I feel a lot better about selling something that I developed myself, and that advances social good beyond just the bottom line or taking care of one family. I can believe that people “need” what I have, but I have to make them “want” it so that it can generate a profit. That is a key component in giving it “notability” (I still don’t have a Wikipedia entry on me, although there is certainly a major one on “don’t ask don’t tell”).

In general, although this is not without exceptions, people who have kids and families to support may be more psychologically attuned to making a living by selling the services and products of others, because they believe that taking care of their own families trumps addressing the “needs of the world.” For many people, the post-World War II economic boom was based on salesmanship and on manipulating demand as a “profession,” and that helped set the atmosphere of the 50s. At the same time, underlying problems, centered first around national security, forced a significant subset of the population, the predecessors of today’s geeks, to focus on truth for its own sake.

It is true that people generally get paid to promote the work of others. It’s understandable how this stretches our notions of ethics. In Tony Kushner’s play (and the derived HBO film) “Angels in America” there is a late scene dealing with the “immorality” of making a living ghostwriting court opinions whose logic one personally does not believe.

All of this has been going on against an increasing din of counter-hype about the dangers of too much self-promotion, too much self-expression (especially on the Internet), the shredding of social contracts (especially the sundering of unions), and the risks posed by shady operators (the scams, the identity thefts, the hoaxes, the predators, the hackers), as well as the dangers posed by external threats (global warming, pandemics, oil shocks, and terrorism) and by family demographics (birth rates and life spans, pensions and social security, and long term care). All of these are promoted as reasons to give up some individual sovereignty and learn social interdependence again, starting first with the obligatory nuclear family. Particularly disturbing are persistent media reports during the past two years about employers misinterpreting information they glean by trying to use social networking sites and blogs to do “background investigations” on employees and applicants, often getting wrong impressions, sometimes about the wrong person (the “reputation defender” problem). Yet, some of this chaos has always been with us. Just look at the Roaring 20s, and what followed. What’s different today is asymmetry, the potential unpredictable “power” of any one person acting on his own recognizance, counter to inefficient bureaucracy. Or is it that different? Just what did happen in Germany in the 1930s?

What we have is a perfect storm, a low pressure system that feeds itself and draws all within. We call it the “culture wars”. It is very hard to see how things will wind up. I do know that the tone of some of my postings expresses a deep concern about where we could be headed, compared to the optimism of my first book ten years ago.

Wednesday, December 26, 2007

"Brother's Keeper" as a Biblical and as a practical social concept


The term “brother’s keeper” often comes up as a moral counterweight to “selfishness” or self-promotion in a competitive society. The term originally came from the story of Cain and Abel in Genesis Chapter 4, where Cain asked, “Am I my brother’s keeper?” Of course, literally speaking, the application seems limited: Cain had, after all, aggressed upon his brother and harmed him directly. The story seems limited to direct harm to others, and perhaps even more so within the family or tribe.

In the New Testament, of course, Jesus gave a whole new meaning to the idea of charity. There are many passages about charity and “loving your enemies,” both in the Gospels and in the Epistles that follow. One good direct example would be I Corinthians 10:24, “Let no one seek his own good, but the good of his neighbor.”

The early Christians lived quite a socialized underground existence and practiced “Kingdom economics.” But what does this say about the appropriateness with which we set and follow our own personal goals in today’s world? We know that a free economy has to be competitive if people are to have the necessary incentives to make their lives better, but that also implies that people must work together and learn interdependence. Some of the issues that have been flying around since 9/11 (and these include natural external problems like global warming and pandemics) bring up discussion of the need for people to accept interdependence and “watch their neighbors” even as technology has allowed many people in the West to live and set their own priorities very independently, often enough even within the context of the traditional nuclear family as well as with GLBT and other lifestyles. In general, people have become more responsible for their own expressive and perceptual well-being, and blood kinship or lineage ties have tended to become less automatic and perhaps weaker. This does not work out well for everyone.

The polarity theory of Paul Rosenfels (discussed often on these blogs) dealt a lot with psychological defenses, and remedying these often meant accepting some sort of meaningful interdependence, while at the same time maintaining the capacity for selectivity.

The legal system in the West has gradually tended to place more emphasis on personal responsibility, even within the legal family unit. That trend has been essential in developing modern ideas of GLBT rights and equality. However, there are many relics in our legal system that seem to reflect a concern over the possibility that one person’s conduct can entice himself or others into harmful actions. Drug laws are still a particularly important example (of "brother's keeper" laws). In the age of the Internet, profiles and search engines, it’s easy to examine the problems that can arise, as we saw with the COPA trial, and with increasing concern over the risks people can inadvertently cause for themselves and others when they have instant topological access to the entire planet with their content. We don’t know conclusively where this is headed.

Tuesday, December 25, 2007

Those were the days: when bus travel and hitchhiking were in flower


On Friday, Dec. 16, 1966 (I had to check this on a perpetual calendar website), I and several friends from the University of Kansas “took off.” One of the guys had rented an “in transit” car (a full-sized Olds, quite luxurious then) to be driven from KCMO (aka Metropolis, for Smallville fans) to Longview, WA. At age 23, I had been raised in the DC area and had never been west of Topeka, KS. That would change quickly. We were lucky to have mild, dry weather.

By 3 AM we could see the lights of Denver from I-70 as we drove past the old Stapleton airport. We took US-40 to climb through the mountains (the Eisenhower Tunnel, a favorite hangout of writer Stephen King, had not been started yet), and the sun came out as we got through the Front Range; in one of those open space parks we played in the fresh, dry snow powder. We drove straight through, reaching the Salt Lake Desert Saturday night. I took a long turn driving through Nevada that night and almost fell asleep. I remember the reflectors on the road at the California border near Tahoe. We got to San Francisco Sunday. We crashed in sleeping bags on the floor of one student’s home in Palo Alto (I slept 12 hours to catch up). Monday we headed north (we diverted to Bodega Bay, the site of Alfred Hitchcock's "The Birds," and ran around on the beach), finding common Pacific Northwest rain at night and reaching Seattle Tuesday. We headed north again and crashed at another friend’s house in Vancouver BC (aka Metropolis II, where Smallville is really filmed), with the weather still rainy. Wednesday morning I would fly home to DC: a prop plane to Seattle, a DC-9 to Chicago, and a 727 to National. The whole process took only about seven hours.

In retrospect the 5-day car trip seems a bit like a "Sam and Dean" adventure from Supernatural. I remember some of the car radio music: "Lady Godiva," "Chestnuts Roasting on an Open Fire," etc.

Over Thanksgiving 1967, I would take another western trip. A grad student friend would deliver me to the Greyhound station in Lawrence, KS, for the bus to Tulsa. I would meet a friend in Bartlesville Thanksgiving morning. I remember the brown, barren appearance of the Oklahoma plains in November. By that night I would be crossing the Divide again, at Sandia Mountain, before entering Albuquerque. I remember Gallup, and I stayed with another friend in a dorm at Northern Arizona University in Flagstaff Friday night. (The bus station warned of “high elevation,” 6900 feet, and there was some snow around.) Saturday, I met an old chum from my William and Mary fiasco (described already on this blog) in Los Angeles, and spent the night at his family’s home in Pasadena. He was the guy who claimed to have written 57 symphonies. Now, I never knew about the credibility of this, but at W&M he had played a piano rendition of an (that is, his) E-flat major piano concerto, about 30 minutes, three movements, and I could reconstruct it today. It was sort of Hummel-like (lots of octave passages; it had a lugubrious arioso slow movement in G minor), and I think it was authentic. (Just see my drama blog for Dec. 2006 for a much more credible news story about music composition.) I left LA by bus Sunday night and was back on campus Tuesday morning, missing only one day of class. I remember Gallup on the way back. And I remember the Topology hour exam that Tuesday morning.

The last of these youthful trips would occur in January 1968, after finishing and passing my Master’s orals. I had been teaching algebra as an assistant instructor, and took the students’ final exams with me and graded them on the bus to western Kansas, where I visited an old roommate who lived in Tribune (almost on the Colorado border).

When I returned home to DC at the end of that month, I had just a week or so before being inducted into the Army in Richmond, on February 8, 1968. Tom Brokaw says that 1968 was the pivotal year that would lead to our culture wars, and I think he is right.

In the thirty-plus years that would follow (two years Army, 30+ years of work), I would run around the western states many times (including job interviews before leaving the Army), usually with rental cars. I would make it to Europe three times, but never the other continents. (Russia is on the list now. Maybe Dubai. Oh, it would be wonderful to be able to rent a place in Spain and just write novels.) Travel would get more complicated, as one had to “secure” one’s life before leaving. And now, there is all the 9/11 security. In grad student days, though, I had few possessions and nobody worried about dorm or apartment security. Those days are sadly long gone. As one song went (I heard it many times on the radio in Army barracks), “Those were the days, my friend, we thought they’d never end.” But they did.

Monday, December 24, 2007

The Christmas Story: it fits modern concerns about reputation


At the First Baptist Church of the City of Washington DC this Sunday (Dec. 23, 2007), pastor Dr. James Somerville gave a sermon about Christmas (specifically, the announcement nine months before) that actually ties together a lot of the concerns I’ve been discussing recently. Technically, the sermon was not about the Annunciation to Mary (Luke 1:26-38), but rather the notification of Joseph by a dream with an angel in Matthew 1:18-25. The use of a dream almost sounds like an official notification by letter, or even email, in today’s society. It’s important to note that the Annunciation is also discussed in the Koran, and the underlying concepts are every bit as important to Muslims as to Christians. Jimmy Carter's recent controversial book about Palestine mentions the Church of the Annunciation in Nazareth.

The underlying problem for Joseph was reputation. As Dr. Somerville pointed out, small town America used to be a place where you could get your name in the local paper by growing the biggest turnip. It was that way in Nazareth. Joseph was a well respected man who was engaged to Mary, and who was expected to add on living quarters after marrying her and before consummating the marriage. As an institution, marriage was even more fundamental to the core of being than it is for us today, with all of our political debates. Mary was quite young (although not too young to marry by the moral standards of her society). So when she became visibly pregnant (maybe like Juno in the current film), everyone wanted to know who the father was.

In this culture, such circumstances could easily have become a source of false shame for Joseph. Other townspeople would think that Mary had been with another man, which is bad not only for her, but for Joseph as well, as he could not maintain the loyalty of his fiancée or wife. Yes, soap operas revolve around this. Yet, as the sermon went on, Joseph let God turn this situation into one of historical honor.

Reputation in that society was quite a collective experience, as it has been for most of history. Generally, people have understood that their actions and personal reputations can affect the reputations of other family members. (The last sentence of the Old Testament, in the book of Malachi, reinforces the notion of family reputation in a curious manner.) The importance of biological lineage is demonstrated by the fact that, for purposes of being registered and counted for Roman taxation, "Saint Joseph" and Mary had to travel to Bethlehem, the birthplace of David, Joseph's ancestor in the House of David. Familial and tribal reputation, as a collective concept, may be even more central to many Muslim cultures. Since the rise of civil rights and individualism, which accompanied the development of technology in the latter part of the last century (starting particularly in the 1960s), our culture has gradually accepted a more privatized view of “reputation.”

Of course, the Internet has mixed all of this up. Search engines became important in the mid-to-late 90s, and social networking sites did the same a few years ago. Now, anyone can mediate his own reputation across the entire globe and, unfortunately, others can also (sometimes by bearing false witness, as mentioned in the Ten Commandments) seriously damage someone’s reputation online without a lot of accountability. (There are companies like Reputation Defender that are trying to change that.) Curiously, the technology that makes the topology of today’s Internet possible was well under development in the Defense industry by the 1960s, at a time (as with the 1968 film “2001: A Space Odyssey”) when we expected space travel to revolutionize everything; instead it was instant information travel, to the extent that the very Lebesgue measure of the space we live in changed.

One problem with reputation is that it is a very subjective concept, defined in the eye of the beholder. True, many of the abuses on the Internet involve mean-spirited or frankly libelous gossip. But at deeper levels “reputation” seems to be a troubling notion indeed. One can deal with troubling subject matter, and others may perceive (whether correctly or not) the desire to write about it as evidence of personal problems or as a lack of personal competitiveness (which could affect dependent family members). Or one can present a rebellious parody of oneself in some fictitious setting, or perhaps a fun picture of oneself bucking the system, and find that potential or current employers take it very seriously. Employers have a tough problem in sensing what matters for reputation. They may feel queasy about issues like sexual orientation but know that prejudicial behavior is no longer acceptable, and so tend to focus on narrow behaviors that are easy to rule out, like alcohol and drug abuse.

Another problem is that employers (and to a large extent families) use the reputations of their leaders to represent the interests of their customers, clients (or family members) to the public at large. Conceptually, the Internet, with its global search engine capability, complicates this because it enables the individual to speak on his own without the approval of others. That sounds wonderful and democratizing, but it undermines the notion that the individual can represent the interests of his clients publicly in any competitive or adversarial setting. Society is only beginning to grasp the scope of this as an ethical problem.

The Scriptures, of course, deal with something bigger still: the idea that man, whether acting on his own or through the family, corporate interest, tribe, or political unit that he belongs to, can really have his own way in life by his own ability. That’s an idea that crystallizes in objectivism (or what we call today "radical individualism" or "personal autonomy"), but it is something that man develops on his own once spiritual and religious structures give him the ethical compass to do so. The scriptures (of any faith) constantly remind us of the limits of our own best efforts, and of the fact that we need to accept interdependence with one another and with some kind of Supreme Being. And, yes, there is the Gift of Grace, so symbolized by this time of year.

Ronald Reagan once said that the one book he could not do without is the Bible. Perhaps that’s an exaggeration, but on the issue of personality and reputation it has a lot to say after all, particularly at Christmas. But the issue is also taken up by the Koran. Its underlying moral importance to a progressive society transcends all specific creeds but belongs in all of them.



Update: Jan. 20, 2008

McLaughlin's "One on One" (NBC4) today at noon presented Boston University professor Paula Fredriksen, author of "Jesus of Nazareth, King of the Jews: A Jewish Life and the Emergence of Christianity" (Vintage, 2000). She indicated that the notion of virginity comes from the prophecy in the Greek version of Isaiah 7:14, "parthenos," which had been "almah," or "young girl," in Hebrew, so the idea of virginity is not necessarily supported. (The King James does use the word "virgin".) Jesus could instead be a legitimate biological descendant of David. The program also presented Jeffrey Sheler, with an opposing view. The program also considered the common belief that anti-Semitism arose because of beliefs that the Jews in Jerusalem were complicit in the crucifixion (as in Mel Gibson's "The Passion of the Christ").



But, the virginity concept is important for a subtle reason that extends the reputation concept already discussed. It suggests that sometimes men can be expected to support children that they did not biologically father. In our modern system of morality, this must be controversial.

Sunday, December 23, 2007

Pastor dramatizes controversial Parable of the Talents


One of the most controversial New Testament parables is the Parable of the Talents (Matthew 25:14-30), where a master gives unequal gifts to three servants. The servant who got the least earns nothing and is scorned. There is a somewhat similar parable in Luke 19:12-27, the Parable of the Minas, where the bequests are equal.

The parable has always had social importance because it sometimes seems to justify the idea that some people have more worldly wealth (or abilities) than others, because they do more with what they have (for others or for society). Objectivists (Ayn Rand) have tended to believe in this kind of thinking. There are other interpretations, however, one of which says that the servant who received the least did nothing to expose a scam. The Wikipedia article is here.

Nevertheless, many people feel that the Parable relates to the importance of having a grounding in the real world and of actually obtaining results that benefit others. It can even be viewed as justifying "bottom line" thinking in economics.

AP has a story by Helen O’Neill, “Ohio Congregation Lives Giving Parable,” about a Chagrin Falls, Ohio pastor, Hamilton Throckmorton, who gave a sermon last September in which he gave the members of the congregation $50 bills and challenged them to go out and earn money for missions. He gave each child $10. It sounds more like the Minas parable.

Of course, this is a case where people are told by those in “authority” to go out and raise money for specific causes of others. It is not an exercise in personal creativity.

The AP story by Helen O'Neill is here; it was a headline on AOL Saturday, Dec. 22.

There is a similar parable in which a vineyard owner pays each worker the same, even though some workers came much later (Matthew 20:1-16), wiki link here.

Saturday, December 22, 2007

Academicians want to research social networking sites


Social networking sites, especially Facebook, are attracting the attention of academicians who want to solve a kind of chicken-and-egg problem. Do personal values control the formation of social networks (and “friends” lists), or is it the other way around?

The New York Times had a story about this on Monday, Dec. 17, by Stephanie Rosenbloom, “On Facebook, Scholars Link Up with Data: A Networking Site is Now a Magnet for Researchers,” link here. Universities mentioned in the story include Harvard, UCLA, and Carnegie Mellon, the last of which wants to study the privacy issue. Facebook is viewed as a more desirable site for research than some others (Myspace) because the groups are more closed and members are less likely to make up fake material or remain anonymous. Some universities, however, like Indiana University, have barred research on social networking sites without explicit permission.

There are some links to research on social networking sites, such as here.

Universities continue to be wary of the way students use these sites. A page at the University of Delaware warns that schools look at them in hiring graduate assistants. However, a subscription article in The Chronicle of Higher Education warns, “Beware of Using Social Networking Sites to Monitor Students, Lawyers Say,” by Martin Van Der Werf, link here. Universities have been quick to advise students that employers look at entries on these sites, although there are serious concerns about the accuracy of information in them and even about identifying the right people (previous posting).

Wednesday, December 19, 2007

Profiles and blogs: Employers can easily "identify" the wrong person, and nobody will ever know


A week or so ago I went through an online search engine exercise that underscores the problems of employers checking blogs and social networking profiles of job applicants and employed associates. There is a real danger, with many people, that employers will simply find the wrong person. Even with job application information, it may be hard to verify that a particular profile or blog really matches the person in question. It’s even more disturbing when reacting to comments made about a person by others and found by search engines. It’s easy to find the wrong person.

There is a particular person (whom I will not name here because the identity doesn’t matter to make my point) who helps organize film events in the GLBT and Hebrew communities. There is another person with the same name who makes short films. There is a third person with that name who sings and makes albums. They are about the same age and Internet pictures look a bit alike. Pictures (especially when reduced in size in blogs or thumbnails, or when in motion in grainy videos as on YouTube) can be deceptive. Such images should not be used as lineups. Although there is nothing derogatory (from the perspective of most people in the American middle-of-the-road mainstream) in any of this material, I can imagine what could happen if there were, and an employer misidentified a clip, picture, post or profile. It could happen.

A secondary problem arises when an employer does not call in an applicant for an interview, or does not extend an offer, and never contacts the applicant again. The legal climate of “employment at will” encourages employers to make employment decisions “for no reason” and offer no explanation; offering an explanation involving online speech might seem to invite legal problems. (The First Amendment only protects speakers from the government, often including government employers, or in certain litigation issues like fair use; it does not protect people from private employers.) I have reason to believe that this has happened, maybe to me (as far back as 2003) and maybe to others that I know about. Also, it appears to happen sometimes to the wrong person, and sometimes because of information (or videos) posted by other parties (which can refer to the wrong person). There are no real safeguards against this. This is a major motivation for the company and site called "Reputation Defender" and other similar services.



In 2005, when I was substitute teaching, I encountered another situation that illustrates this problem. I had posted a fictitious screenplay on my site in which a character similar to me is put in a very disturbing situation with a seemingly negative outcome. (Whether that matters legally is an unsettled question, but it has been litigated [in the pre-Internet print world] in California and New York with varying results -- the "Touching" and "Bell Jar" cases; in any case I never bothered to put the standard movie "fictitious" disclaimer on my draft -- but even that may not matter.) While I was on an assignment at a particular Fairfax County high school, a newspaper editorial appeared discussing the possible impact of campaign finance reform on bloggers. I showed this to an English teacher-intern (that topic certainly belongs in school) and then gave her a handwritten reference to my site, with some files relevant to that essay. The next day I got a phone call at home saying that I had given her a reference to an “inappropriate website” because this other material was on the same site. They had known about the material previously (in part from search engines) but would not assume that I was the same person who wrote the screenplay (even though WHOIS on domains makes that easy to check in my case) until I mentioned my domain in connection with the campaign finance reform issue. This is definitely a “don’t ask don’t tell” kind of problem.

The issue may be less important with teens who keep their profiles whitelisted or private, which all major social networking sites allow and encourage (sometimes require). Even so, profile information seems to circulate the way gossip did in the old bricks and mortar world.

The Human Resources world does need to get its act together on fair employment practices involving employee online speech.

Dec. 26, 2007

I've made a related posting about network TV videos about Reputation Defender on my TV blog, here.

Sunday, December 16, 2007

Close to half of adults check themselves, others on search engines


Almost half of all adults with Internet access look themselves up on at least one search engine (often Google, resulting in the company name becoming a reflexive verb), and most do not find derogatory or disturbing information that surprises them. Slightly more check out other people. An increasing number of adults are concerned about online "reputation" and say that their jobs demand it. More adults than teens, however, allow their blogs and profiles to remain public.

All of this came out in an AP story today by Anick Jesdanun, here. AOL's numbers in its own survey were over 70%.

Sometimes debt collectors or lawyers use search engines for skip tracing purposes.

I just tried it again tonight with my own legal name ("John W. Boushka") and I noticed how visibly my "professional resume" (about computer programming) shows up from my "johnwboushka.com" domain, whereas all my political writings show up on "doaskdotell.com" and, of course, these blogs (also "billboushka.com"). Some will say, isn't this manipulative or unprofessional? In some jobs it would be, and I've covered that before. (I am "quasi" retired.) The problem is that it's important for individuals to fight for their own rights themselves rather than let special interests and paid lobbyists do that for them. Not everyone has that right or luxury. And I've covered that before, too.

Update: Dec. 19

View the related story on Switched.com, here.

On Dec. 20, a Work & Family column by Sue Shellenbarger on p. D1 of The Wall Street Journal referred to the under-25 "Look at Me Generation" and the willingness of people to disclose personal information, like early pregnancy (which employers sometimes find and are concerned about), on social networking sites.

Thursday, December 13, 2007

FAIR USE Act and PRO IP Act: Congress must be careful


The Electronic Frontier Foundation is encouraging users to become familiar with and support the “Freedom and Innovation Revitalizing U.S. Entrepreneurship Act of 2007” (also called the “FAIR USE Act”). One sponsor is Rep. Rick Boucher (D-VA). The bill would limit secondary liability for hardware manufacturers that offer goods that can be used for infringement, and expand fair use in certain circumstances where movie studios and music companies may feel obliged to go after individual abusers.

Boucher’s own explanation of the bill is here. It is viewable on LOC Thomas as H.R. 1201.

There is a general problem that publicly traded media companies perceive a fiduciary responsibility (to their shareholders) to go after every possible infringement on their materials or brands. Unfortunately, they do not have much of a “golden compass” for distinguishing between real piracy and incidental infringement by home users, professors, or schools, and they have been unable to change their business models rapidly enough to keep up with broadband technology, software, and hardware (P2P, BitTorrent, iPods, etc.). Hardware manufacturers and ISPs alike could face downstream liability for the behavior of their customers unless there is some kind of legislative or judicial oversight of how media companies apply copyright laws. Generally, the courts (especially in MGM v. Grokster) have tried to articulate the idea that secondary liability exists when the defendant’s business is predicated on encouraging infringement. But this concept can be very hard to limit and pin down. There is also an undertone of wanting to eliminate “low budget” competition that does not have to deal with the unions, guilds, rents, and other high levels of expense of mainline media businesses.

The language of the FAIR USE bill is also available here: http://action.eff.org/site/DocServer/boucher_hr_1201.pdf?docid=461. EFF supports the bill in a posting, "Support the FAIR USE Act" (Electronic Frontier Foundation), dated back on Feb. 27, 2007, link here. However, EFF criticizes a related “PRO IP Act” in a Dec. 17 posting by Richard Esguerra, “PRO IP Act Aims to Increase Infringement Penalties and Expand Government Enforcement,” here.

There are some criticisms of the FAIR USE Act as well, as on Ars Technica (“The Art of Technology”), by Ken Fisher, “Digital Fair Use bill introduced to US House (sans teeth),” here. Tim Lee writes an analysis, “FAIR USE Act analysis: DMCA reform left on the cutting room floor,” here.

On balance, both bills mentioned here seem to leave a lot of unanswered questions that are going to have to be followed carefully.

See my WordPress blog for legal references, here.

Tuesday, December 11, 2007

Independence and Inter-dependence: closing the circle of the culture wars


After I came of age and became my own boss (this didn’t happen in full until I was 26, out of the Army, and finally had my own apartment), I learned in time the mores of the new adult world as it emerged post-60s, post-Watergate. Don’t cling to people, be your own man, accomplish something, be selective about the people you share intimacy with, and be comfortable with being alone at times. “Being your own man” became the preventive vaccine for jealousy, and that sort of mantra became popular in gay counseling centers in New York and San Francisco even in the 1970s.

Sometimes when riding the subway lines above ground in the outer boroughs of New York City, I would note the social contrasts. Manhattan seemed to be the province of singles. (That’s not totally true; the rich in the Trump towers have their own family scenes.) The boroughs, with their detached homes, were for families. You would hear people make that kind of contrast between the social cultures of states like California and Texas. Yet, even among families, cultural weather fronts were developing, over the place of women, traditional gender roles, how to raise kids, and the like.

When I moved from New York to Dallas in 1979, I saw more of “family culture” for a while, even at events as mundane as Friday night “north Dallas” poker parties held by coworkers.

I came to see myself as a kind of Phylos the Tibetan, a dweller on two planets, a Dominion-hopper (to use Clive Barker's metaphor of reconciling cultures in his 1991 fantasy novel Imajica). The urban exile that I lived in, where I could take the time to focus on my own interpersonal needs well into my thirties, seemed a bit circumscribed. I could visit the Outside, but that’s what I was, a guest, a visitor on pass.

The gay male community would get a dose of social reality in the 1980s with the way it handled the demands of the AIDS crisis. Quickly, buddy programs evolved in various cities, such as the Gay Men’s Health Crisis in New York, the Oak Lawn Counseling Center in Dallas, the Whitman-Walker Clinic in Washington, and the Minnesota AIDS Project (eventually incorporating “Pride Alive”) in Minneapolis. But buddy programs did not constitute family responsibility in the usual sense (although for some volunteers it seemed to). In the beginning, most clients did not live long; those who did became local “stars.” Later, as rapidly improving anti-retroviral medications greatly extended life spans and enabled PWAs to go back to work, the nature of the need in these buddy programs changed. The programs became more politicized and the volunteer duties more specialized.

In the 1990s, the political debates over social responsibility shifted, with emphasis again on the GLBT community through the emerging battles over gays in the military, gay marriage and partnerships, and gay adoption and custody. The meaning of all this went beyond its effect on GLBT people; the debates started to revive notions that previous generations (like Tom Brokaw’s “Greatest Generation”) had taken for granted: that some social responsibilities exist in common and need to be shared, and that this need affects the way people conduct their relationships and their lives. In particular, there was emerging tension between those raising families (usually children but also, rapidly becoming important because of demographics, those with eldercare responsibilities) and “singletons” who seemed to do just fine on their own (many employers now actually preferred them) but whose expressiveness could affect families (look at the issues of Internet censorship and minors, such as the recent COPA litigation). The communal pressures seemed to increase after 9/11, various financial scandals, and a growing awareness that natural perils like global warming and pandemics could force us to accept increased social interdependence, the kind that extended nuclear families used to offer. Persons who had lived productively without having kids would be expected to take on family responsibilities, most of all eldercare, but also, given the practical directions of the job market (the need for teachers, nurses, etc.), all kinds of new interpersonal responsibilities with other people’s children.

I found pressures, both within the family and on the job in substitute teaching situations, to adopt a more community-centered attitude about social connections. Independence and selectivity were not absolute virtues in the “real life” world. The do-it-yourself and broadcast-yourself possibilities of the Internet could be turned upside down into pressuring people into social conformity. In various situations, people demanded responsiveness and receptiveness, deferential loyalty, even protectiveness from me, when I had been used to a world in which I had been “God of my own space” (sort of like being god of one’s own blogs). The responsiveness extends indefinitely, unlike the circumstances of being a "baby buddy" twenty years ago for PWAs. People sometimes would advance notions that in my world of occasional hyper-individualism would seem self-deprecating. It was all right, even a virtue, to need the presence, attention, even pampering of others, to know that one was valued. This sounds like a culture shock, but perhaps it simply runs full circle, back to school days, when learning to share chores and do them with precision, to fit in with the social demands of others who would protect “me”, was a big deal.

Coordinate post on GLBT blog.

Sunday, December 09, 2007

Tinseltown (et al.) WGA strike: let's get it settled


When I was growing up, my father, who was a self-employed manufacturer’s representative (in the days when domestic manufacturing was like knighthood in flower), used to make fun of labor unions and strikes. As a computer programmer and individualist, my own career was never directly affected by labor problems (except when at NBC, as a programmer, I worked two months as a replacement boom man on a soap opera set during the 1976 NABET strike – in the days when video journalism was new and rapidly changing the business). I have always been personally sympathetic to “right to work” logic. But collective bargaining is a legally guaranteed right of all workers. The practical effect of strikes, particularly in rust belt industries, has been considerable. Some strikes against municipalities and transit systems have been illegal, and unions almost brought New York City to a standstill in the 1970s with the financial crisis (remember the Daily News headline: “Ford to City: Drop Dead”!), although the Teachers’ Union helped to settle it. I still remember the futile bargaining in the early 1970s to save the 35-cent New York City subway fare.

When I started writing and publishing in the 1990s, I discovered the National Writers Union, which has made a big deal out of writers’ republication royalties when their articles are placed by their publishers on the Internet, often as “free” content (sometimes with a charge when archived) supported by web advertising.

So it is now with the WGA (Writers Guild of America) strike in LA, tracked in detail in the Nov. 4 entry of my TV blog (go here). It seems that the talks broke down Friday, Dec. 7, in the usual baby play, posturing and nastiness. Actually, the demand by unions or guilds for fair royalties from republication or redistribution on the Internet and DVD comports with modern business models of paying workers by “piecework” or with commissions or royalties according to sales numbers. Such a model, compared to a flat salary or hourly wage with benefits (guilds actually negotiate health insurance benefits for their freelance members), generally can lead to more jobs, and in the movie business, can increase the probability that a given spec script (even mine) might actually get made. So I can understand, even according to the philosophy of the free market, why the royalty demands matter.

Nevertheless, there are other demands that are more problematic. One is that the guilds want to cover reality TV shows. Since these shows are based on a “winner take all” mentality (or at least “winner take most”), it’s less clear that this is appropriate. Imagine Jeopardy or “The Apprentice” contestants being represented by a guild!

Can some movies go on without the Guild? I suppose so. Many small films, with DVDs available from Netflix, seem to be made without the unions (SAG does make specific agreements available for small budget indie films, however). High concept might well come from a writer who tries his or her own spec script in order to advance an original idea. However, professional industry members will often be needed to make it practical to actually produce a releasable film. This is all the more true of a script to be adapted for a TV series, where the scenes have to fit into exact time slices. (I have one such script, “Make the A-List”, that lends itself to adaptation into a CW-like series.) Furthermore, there are all kinds of technical issues that come up with shooting scripts that generally require the skills of professionals. So “amateurs” have a legitimate self-interest in seeing this dispute settled with a new contract. Hence, I offer my own bit of jawboning with this posting.

It’s good to see that the Broadway strike by stagehands was settled, although there were a lot of losses in New York over Thanksgiving. Here the issues were more about working conditions and safety, as well as money, according to the union.

Parties concerned about these labor matters in the film, TV and theater industries ought to look at Tilda Swinton's 2002 speech "In the Spirit of Derek Jarman," made at a film festival in Edinburgh, Scotland, and available on a Fine Line / Image DVD of Jarman's 1991 film "Edward II" (from Christopher Marlowe's controversial play). She talks a lot about balancing money concerns with artistic integrity.

Update: Feb. 3, 2008

Major media outlets report major progress and a possible settlement, with announcement the week of Feb 3. Steve Gorman's story on Reuters is "Major progress reported in writers talks" and the link is here.

Saturday, December 08, 2007

Major case involving anonymity of blogger tests an important First Amendment concept


Anonymous speech has long been defended as a fundamental First Amendment right. The concept probably derives from the notion of speech in earlier times as tied to assembly and redress, where individuals were not necessarily identified and were considered entitled to act as a group, just as they are with collective bargaining. The Supreme Court strengthened the notion of anonymous speech as a fundamental right in the 1995 case McIntyre v. Ohio Elections Commission, as explained by EFF here:

There is a recent case in New Jersey in which a township, Manalapan, filed an indemnification suit against an attorney regarding cleanup of a property that the attorney helped the township buy. A typical story is by Kathy Baratta, “Judge denies attorney’s request to dismiss case,” in the Englishtown News Transcript, here.

In the meantime, an anonymous blog named daTruthSquad emerged, at http://datruthsquad.blogspot.com/. The township issued a subpoena trying to identify the blogger, as it claimed that the blogger could be the defendant.

On Nov. 28, 2007, the Electronic Frontier Foundation asked a Superior Court judge in New Jersey to deny the subpoena. The story is “Blogger fights for free speech in New Jersey: EFF Defends Critics from Local Government’s Heavy Handed Tactics,” story here.

A couple of sidebars come up here. One is that anonymity has become important especially for minors and families, as high school and college students need to be able to express ideas without attracting harm in certain cases (well discussed in the media) or without their expressions winding up before employers or schools. Of course, this point has a double edge: sometimes political speech is effective only when the speaker is willing and eager to be known.

Another important point that comes to mind is the state where this is going on. New Jersey has a history of cases associated with speech, with some townships in the past fining rabbis or other people for writing or working at home, supposedly out of “protectionist” (from the point of view of commercial property owners) zoning laws in some communities, a topic I have noted before on my issues blog. I don’t know if this is still going on, in a world where we need to encourage telecommuting and cutting down on fuel and carbon emissions.

NBC Today reported on 12/9 disturbing examples of people impersonating others in blogs (and this has been covered on Dr. Phil, as when a kid impersonated a school administrator with a Myspace profile). These are hardly protected examples of anonymous speech.

Monday, December 03, 2007

Sean Taylor: memorial service today


The 2-1/2 hour memorial service for Washington Redskins safety Sean Taylor in Florida today was moving and beautiful, as broadcast on NBC4 here in Washington (not quite all of it; more on that later), and the “in your face” sermon by Taylor’s Adventist pastor brought it to a passionate close. But there are a lot of separate points to make.

First, I was shocked that a wound like this proved fatal 24 hours after surgery had started. Had exactly the same wound occurred in Iraq and been treated immediately, I honestly believe military surgeons would have saved him, and I wish they had been available for this kind of wound here.

Second, the circumstances of the robbery and shooting are particularly troubling. One could say that had Sean been armed at home (and had immediate access to a weapon) he would be alive today. I have always supported the individual’s right to bear arms to protect himself and family, and I wonder if this will affect the case about the DC law before the Supreme Court, even though it happened in Florida. (Subsequently, at a megachurch in Colorado, a female security guard was able to save lives because she was armed, in another tragic incident Dec. 9).

But it is the motive of the intruders, who knew Sean and other people close to him, that the public needs to know. It’s easy to throw the book and demand “life without parole” for all four men (there may be more) involved, but that may not be possible in Florida without premeditation of the actual shooting (it is possible in some states and may be in DC). It’s easy to imagine ideas about greed, envy, resentment and covetousness (some of the “seven deadly sins”). The ideas of Charles Karelis in his book “The Persistence of Poverty” (discussion here) may be relevant, although factually we haven’t been told the financial status of the defendants yet. What’s important is to get into the heads of the perpetrators, especially because they knew the victim, had been in his home (apparently) and knew some of his friends and acquaintances, and, presuming they had been involved in the robbery eight days earlier, had behaved in a bizarre manner. We need to know what made them tick at a psychological level. There seems to be an issue of inherited, derived or personal shame, which, as we know from international problems and tragic incidents in schools and malls, can be a dangerous and unacceptable emotion when people can get weapons illegally (or are prodded by others, as overseas, to martyr themselves). The public needs to know this in order to help prevent this kind of crime in all communities, with all victims. Right now, we have mostly speculation and theories, but the truth may surprise us yet.

But it is the character, personal and religious aspect of all of this that strikes me the most. Coach Joe Gibbs has pointed out how Sean Taylor’s life came together once he became a father (Gibbs also attributed this to God, not just to Sean on his own). Of course, it’s much better to become a father after getting married (to the mother), but here it’s the fact that parenthood seemed to give him a reason to be a good person that is so striking. That’s not necessary for all men by any means, but it apparently is for some. Early in the service, Joe Gibbs made a simple public prayer for faith in Jesus Christ, but the Adventist sermon at the end really challenged everyone for “not giving God enough time” and for walking away from responsibility for others, leaving people (possibly the suspects) deserted and with no incentive to live productively.

In Washington, NBC pre-empted “iVillage” and “Days of our Lives,” and apparently DC viewers may not have the opportunity to see those episodes at all. (WUSA, the CBS affiliate, pre-empted some of its schedule, but WJLA (ABC) did not.) I checked, and there were copies on HD channels, to which I don’t subscribe, and these might have carried the funeral. Major networks need to offer pre-empted programs on other cable (or satellite) channels and let viewers know where they are. Also, NBC4 rudely broke away from the sermon at 2 PM and allowed “Ellen” (which may have more clout) to start. I watched the rest of the sermon on the NBC4 website, but the station should have told the public its plans.

The photo was taken along "Alligator Alley" in the Florida Everglades (Nov 2004).

Friday, November 30, 2007

News associations pressure search engines to adopt "Automated Content Access Protocol" (ACAP)


News associations and publishers are advocating that search engine companies adopt an extension to the “robots.txt” concept that is supposed to block robots when that is desired by the publisher. The enhancement is called Automated Content Access Protocol (ACAP).
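For background, the existing robots.txt convention is simple: a publisher who wants to keep compliant crawlers out of part of a site posts a plain text file at the site root. Here is a minimal sketch (the directory and crawler names are made up for illustration; the actual ACAP directives extend this vocabulary, and I won't try to reproduce them from memory):

    # robots.txt, served from the site root
    # Keep all compliant crawlers out of the paid archive,
    # but let them index everything else.
    User-agent: *
    Disallow: /archive/

    # Shut out one particular crawler entirely (example name).
    User-agent: ExampleBot
    Disallow: /

ACAP's complaint, in effect, is that this all-or-nothing vocabulary says nothing about how much of a story a search engine may display or how long it may keep a cached copy.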

In recent years, book publishers have been concerned about the way the insides of books are readily displayed by search engines, and news groups (like the Associated Press and Reuters, and the European Publishers Council) have objected to search engines’ displaying pictures and significant blocks of content almost immediately after publication. Nevertheless, the AP has sometimes given permission for search engine companies to host specific stories quickly.

The AP has expressed concern that professional journalists risk their lives (or sometimes imprisonment in foreign countries) to provide stories, and reproduction of them without permission can make their reporting economically unviable.

The Washington Post story on this item ("Publishers Seeking Web Controls: News Organizations Propose Tighter Search Engine Rules") is by Anick Jesdanun (of the AP) and appears on page A2 (Business) of the Nov. 30, 2007 Washington Post, link here.

As covered several times on this blog, deep linking by bloggers and other sites and publishers appears to be legal as long as the quoting site does not “frame” the link as if the content were its own. The legal background has been covered several times (for example, Feb. 7, 2006, here, and March 9, 2007, here). EFF’s main link appears to be here.

Linking is essentially like a bibliographic reference or footnote in a high school or college term paper. Some news sites have clauses about “republishing or redistributing” stories, but facts by themselves are not copyright protected (in a few cases, though, they could be trade secrets or security classified). The writer is on safest ground when he or she adds perspective to the item, comparing it to other items previously published on the topic (as in accepted research). A writer’s own personal experience with an issue also adds perspective, although that can sometimes raise other issues (for employers, family, or others connected to the person who might believe they could be affected by the information or the connection); this was an issue in the Nov. 29 blog posting (below).
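To make the linking distinction concrete, here is a minimal HTML sketch (the URL is a placeholder): a deep link openly points the reader at the source, while "framing" embeds the source page so that it appears to be part of one's own site, which is where the legal trouble starts.

    <!-- A deep link: plainly a reference to someone else's story. -->
    <a href="http://www.example.com/news/story123.html">AP story on ACAP</a>

    <!-- Framing: embedding the same story so it looks like one's own
         content. This is the practice that has drawn legal objections. -->
    <iframe src="http://www.example.com/news/story123.html"></iframe>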

Thursday, November 29, 2007

Epiphany: All that Personal Stuff (or All that Jazz): constructors and self-instantiation on Web 2.0


I remember a particular epiphany. On a Saturday in early August, 1994, I walked into a family restaurant in Sterling, CO (known for rumors of alien cattle mutilations), picked up a rag and had a burger. I vaguely remember now a story about some local soldier discharged under the new “don’t ask don’t tell.” As I walked out of the restaurant and got into my rental stick-shift car to drive to Scottsbluff and Cheyenne, I knew that I would write my book. I knew that I would put myself in the public limelight, whatever the consequences, over this issue. I felt that events earlier in my life (well documented on these blogs) justified it and could affect the long-term political and even judicial debate. Later that evening, at the motel bar in Cheyenne, I would actually meet a man who claimed to be gay and an LTC in Commandant Carl Mundy’s Marine Corps. Later in the trip I would circle back to Colorado Springs, given the furor over Colorado’s “Amendment 2,” and, within earshot of the Focus on the Family property, have lunch with another migrated friend who asked if I was out there to “run around.” There would be more “planes, motels, and rental automobiles” in the mid 90s as I burned vacation time “running around” the country digging up more stories (and mountain driving with a clutch).

The book would lead to the websites and blogs of today (and the screenplay treatments); the DADT issue would bud out into practically every other issue in the culture wars: the transition of a society based on family and tribal ties, grounded partly in emotion and automaticity, into one in which individualism and objectivism have developed new world views for younger generations. Economically, extreme capitalism and globalization would accompany the psychological transformations, banging up against international realities: the limitations of this planet and the problem of being perceived as living off the sacrifices of less developed peoples. 9/11 would wake us up. Issues like terror, war, the draft, the idea of mandatory national service, global warming, energy shortages, pandemics, disasters, workplace risks (including recent reports on graveyard-shift dangers) and the wide array of individual human tragedies and misfortunes (now including eldercare demographics) portrayed in the media – all of these remind us that "burdens" are not shared equally and can in time threaten or undermine the "individual sovereignty" model of freedom – and they are all interconnected (ultimately back to "DADT") and sometimes seem to justify older notions of "public morality." Variation in unchosen responsibility does help explain some reckless behavior (the subprime mess, for openers). However, individual freedom should not be taken for granted, and everyone, regardless of how "born," should take some personal responsibility in addressing these problems, sometimes requiring "sacrifices" outside normal notions of the market economy (and not necessarily starting only when one has children).

By the end of the 90s, it was clear that something profound had happened because of Internet technology. Anyone with a good enough message could make himself (or herself) a celebrity in cyberspace, simply because the topology (or, in the language of advanced calculus, the measure) of cyberspace turned upside down all the economic and other valuation notions that had previously run the bricks-and-mortar world. We had a "second life" that was not exactly "real life" but could certainly affect "reality." Domain names could be hoarded and sold almost as if they were real estate. For one thing, by the end of the 90s, the pretense of "privacy" that was supposed to be protected by "don't ask don't tell" had been shredded by search engines. The world had morphed that quickly.

A sociologist would call this phenomenon "asymmetry." One individual, because of a twist in circumstances (here, the leveraging of suddenly available technology for self-publishing or "self-instantiation," to borrow from Java), gains influence way beyond his normal competitive ability or what would seem appropriate for his social responsibilities. Shawn Fanning made this point when he created Napster, which, despite its period of legal copyright problems, certainly taught the music industry that it needed to change its business model quickly.
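To make the Java metaphor literal, here is a tiny, purely hypothetical sketch (the class and names are mine, not anyone's real code) of "self-instantiation": a constructor that publishes a reference to the new object to a global, searchable index as a side effect of being created, so the instance is visible to the world the moment it exists. Java programmers will recognize the well-known hazard of letting "this" escape the constructor, which is part of the point.

    import java.util.ArrayList;
    import java.util.List;

    public class SelfPublisher {
        // A global, publicly searchable "index" -- think search engine.
        static final List<SelfPublisher> GLOBAL_INDEX =
                new ArrayList<SelfPublisher>();

        private final String message;

        public SelfPublisher(String message) {
            this.message = message;
            // The object "publishes" itself before anyone vets it:
            // visible worldwide the instant the constructor runs.
            GLOBAL_INDEX.add(this);
        }

        public static void main(String[] args) {
            new SelfPublisher("my story, findable by anyone");
            System.out.println("Indexed: " + GLOBAL_INDEX.size()); // prints 1
        }
    }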

What I found was that one Internet domain, because of the effect of search engines (which started to become important in 1998), could have a real influence on political issues, if it was structured concentrically around the relationships among the issues and if it kept up with the news. It took just one person, very little money, no real profits, and, most of all, no organization or employees. Previously, political advocacy had always been organizational and usually partisan. People had always been encouraged to band together in "solidarity" and engage in single-minded campaigns to get their way. Minority groups tended to present themselves as "victims," and GLBT groups tended to depend on superficial arguments about immutability because that had worked with race. People generally had depended on lobbyists or organizations to represent them and boil issues down to litanies of need; and often, people could not personally speak out on specific topics because of possible conflicts with the workplace (potentially legal) or sometimes family; there was sometimes a lot of social pressure to spin certain issues in a slanted fashion to manipulate political or social comfort. Now, there was the capability to present issues in a compact but structured fashion, for one person or a small company to "control" them, and to keep them deployed at low or no cost. In this state of affairs, politics could change forever. This might not bode well for the future of K Street.

It's important to realize how quickly things on the web get indexed and found everywhere on the planet (and that can include even the tribal areas of Pakistan), even when there are umpteen million new blogs or sites created every day or week. Well-organized and well-presented material (it doesn't have to be fancy – in fact, static content often gets loaded and found more quickly) gets found and has an impact.

In a world where book publishers exercised so much legal due diligence before going to press, it seems like an accident that "amateurs" were turned loose to post their wares on the Web with no real understanding of the possible legal consequences. In theory, intellectual property law (copyright, libel, right of publicity, etc.) works the same way on the web as in print, but in practice usually only defendants with deeper pockets are pursued (the exception is illegal downloading through P2P, which is a totally different issue, but it arose with Napster, as above). Now, I do follow the expectations of i.p. law as normally understood in my postings. I fully credit sources, and I don't target people. (I generally don't publish the names of unconvicted defendants even though they were published in conventional newspaper stories first. I don't report on "private" matters of named celebrities, even though sometimes I can "see" them even in public places.) Much of my argument comes from personal material, most of which happened years or decades ago. With the passage of time, I believe, the value to the public of knowing about a historical incident that bears on an issue today (like the military DADT) can be considerable, while the practical effect on others involved in the incident diminishes (they are usually unnamed or pseudonymed, though conceivably they could recognize themselves, or others could, from the circumstances).

However, in the past few years, and especially since about 2005, we have heard a barrage of reports about people getting into serious trouble over their behavior on the web. It's true that the "wild west" mentality of the early web days and the suddenly apparent (unknown before the mid 1990s) capability to broadcast content globally without much obvious personal accountability set an example that some people misinterpret as an invitation to risky or harmful behavior online. It used to be that the most common problem was people getting fired for badmouthing their employers, who would then find the postings through search engines. Sometimes people tried to influence share prices with rumors and got into trouble with the SEC.

Then the nature of the problems started to expand as social networking sites with Web 2.0 became all the rage. Teenagers (and sometimes adults, especially women) got into trouble by giving away provocative personal information and attracting bad people. Teens may say they want the public limelight, but often they do not fully understand the global nature of "fame"; they are more concerned about in-crowd popularity and see social networking sites as a primary social tool. Teenagers (and sometimes adults, as in the recent case in Missouri) would use the web to harass and tease each other. Sometimes people, in fun or to protest social convention, would post derogatory pictures of themselves (especially of drug use or underage drinking) and lose jobs, or at least job opportunities, when employers trolled for this. People, sometimes ignorant of libel or privacy law, would post derogatory information or pictures of others on the web, and employers would find that too. People began to confuse the concepts of social interaction with legitimate publication. Social networking sites have responded by helping and encouraging customers to whitelist their material (keep it restricted to known lists of users), but even so a lot of derogatory material would get passed around into the wrong hands, just as in Gossip Girl. "Reputation defense" became a new industry.

Legal battles have raged since the mid 1990s about Internet pornography (the CDA and later COPA, the latter of which I became involved with as a plaintiff because other material can be confused with pornography), but the creeping problem is much more subtle. Some lawyers call it "implicit content" – the "between the lines" meaning that visitors may attribute to material that they find out of context by searches on the web. Implicit content may concern the motives of the speaker, as they would be perceived by the visitor based on what the visitor is likely to know about the speaker's familial, social or business position. A careless rendering of the concept could imply that a speaker has a particular personal problem because he or she addressed a certain controversial topic in a posting that itself is legitimate and presents no legal problems in the usual sense. Socratic online "thought experiments" can cause real trouble if they are perceived the way Orson Welles's "The War of the Worlds" broadcast was.

That's why, for employers especially, the presence of "self-instantiation" on the web can present previously unknown problems, with many ambiguous and gray areas. It's especially important in jobs where the person makes decisions about others, holds unusual public trust (teachers), has to accept forced intimacy on the job (the military, fire, law enforcement), or has access to very sensitive information (intelligence). We may be coming to realize that "free speech" and the "right to publish" should not be identical concepts. For example, one could propose a rule that people in sensitive jobs should accept "professional" supervision of all their personal web activities or whitelist all of their material – the job would be the only "portal" to the public, and the employer would have legal ownership of the person's "right of publicity." Another, more common-sense rule would be that one should not make a posting that compromises others' confidence that he can do his job, even if he has a legitimate political motive (he may have to consider resigning instead if the issue is compelling enough).

Some people say that I should "keep a low profile" unless I am willing to take the risks (emotional and otherwise) that others believe they must take (often for family). (Some have asked why I don't have the guts to run for office rather than remain an Internet pamphleteer. Then I'd have to be partisan and beg for money.) This notion is just unacceptable. I do what I think I have to do to contribute to my world in my own way.

I look upon that "before and after" 1994 moment in Colorado as the selection of my second career, a commitment to rationalize the content and process of debate (starting with DADT). A future job could force me into a "quiet period" of some sort, or a major restructuring. Money matters (no more specifics here). But I won't do something where the point is to manipulate others into buying something that I had nothing to do with. No "we give you the words" stuff – I want to "give" the words rather than "take" them. The resolution of my career must remain with content that comes from me before I can sell it. It does seem like I am pushing "the knowledge of good and evil," as I believe that what people know should not depend just on their familial, tribal, religious or even business associations.

Sunday, November 25, 2007

New ventures try to sell user-generated content, while many surfers lurk


Last June, at the Digital Media conference at the AFI theater in Silver Spring, MD, user-generated content was all the rage in the symposia and discussions.

Saturday, Nov. 24, the Business Day section of The New York Times featured a story by Evelyn Nussenbaum, “Publisher Gets Web Readers to Fill Pages of Its Magazines,” here.

The story first discusses Halsey Minor and his technology info site CNET. I have been familiar with it for a number of years, and am on the email list. Shortly after my end-of-2001 layoff and buyout retirement, I remember getting a lot of emails at home about the world of an information technology manager, about all of the security and productivity issues. A manager of someone else’s world was the last thing, then at age 58, that I wanted to be. I had always been an individual contributor. The tone of those articles, along with the message boards and discussions that followed, struck me.

The article goes on to discuss the collaboration with Paul Cloutier and several ventures, including JPGMag and Everyman. JPG encourages member visitors to submit material (mainly photographs and visuals) that other members rate or vote on. The "best" material gets selected for a print magazine, which does appear to face some challenges staying in the black. Advertising in both the magazine and on the website is kept within certain bounds so as not to disturb or annoy visitors or readers. This could be a way for some people to get "published" by a third party without working with agents in the usual manner or making more conventional formal submissions. This is real publication of work offering royalties, not just payment-free publication of LTEs (letters to the editor) or forum commentaries.

Then, in today's New York Times Magazine (Nov. 25), p. 26, Virginia Heffernan has a picaresque essay, "The Medium: In Defense of Lurking: Solitary Consumption, not interactivity, may be the best thing about the Web," link here.

Actually, I used to think that was what "surfing" was all about, especially back in the 90s. In these Patriot Act days, you wonder if someone could get too interested in tracking the sites you visited by IP number – and there are some things that are patently illegal to have images of on your computer. It's a much bigger issue at work, but it could matter at home, too, or on a family computer shared with other family members. Still, her article discusses a couple of varied issues, one of which is our loss of reading – especially literary material – for pleasure. (I remember, in English literature at George Washington University, reading a Wordsworth poem about how poetry was supposed to give pleasure.) She mentions that professional actors (especially stage actors) have little time to read things other than the scripts they perform or audition for (even more so for Shakespeare-type actors – who at least can get as far as Marlowe), and that teachers (especially in the humanities) have little time, after grading papers and preparing lessons, for material that goes outside approved school curricula – even for themselves – and that creates certain tensions among teachers when outside influences (substitutes) come in the door. She also talks about how some "lurkers" insist on enforcing their constitutional right to anonymity when they blog (a good thing in these days when employers mine Myspace and Facebook pages) – and that a New Republic editor got sacked for blogging anonymously on the magazine's site, and that a foods executive got into trouble for promoting his stock anonymously on an investor trash board.

So it goes.

Picture: The new Washington-Lee High School in Arlington VA (I graduated in 1961), taken Thanksgiving Day.

Wednesday, November 21, 2007

Wi-Fi linked to autism -- something to this, or an urban legend?


The media has a story tonight: "Wi-Fi causing autism?" One immediate source is John Biggs, and the "Crunch Gear" link is this.

The study appeared in the Australasian Journal of Clinical Environmental Medicine, authored by Dr. George Carlo, and claims that exposure to Wi-Fi signals causes heavy metal ions (like those you study in high school chemistry) to be trapped in certain brain cells in infants or toddlers.

The Computer Weekly article is by John-Paul Kamath and is called "Wi-Fi linked to childhood autism." The link is here. There is a suggestion that autism in young boys has increased in the same time frame that wireless became widespread. There is also concern about living near "hot spots."

The story mentions another study at "Penn University" that disputes these claims.

The story was posted on AOL today here, and suggested that parents had not been adequately screened about drug use. This suggestion brought very angry comments on AOL from parents of affected kids.

I had mentioned autism in my posting about libertarianism yesterday, where it seems that our actions and behaviors and expressions have effects that we don't take into account -- a questioning of "consequentialism".

Distantly related is a story by Martin Fackler in the Nov. 18 New York Times, "In Korea, a Boot Camp Cure for Web Obsession" for teens with cyber-addiction, link here.

Is this real, or is it an urban legend? After all, "I read it on the Internet!"

Tuesday, November 20, 2007

Yes, it's getting harder to remain a libertarian


I wrote on July 30 that it is getting harder to remain a libertarian – and actually LP party people have been saying that privately ever since 9/11. Then, I wrote about this from a mostly personal perspective. Now some of the recent trends that accompany globalization make it useful to look at the “big problems” categorically.

The most important problem seems to be that the Western world consumes and pollutes, per person, out of proportion to its share of population. It's obvious that competition from big countries with large populations (most of all China) that want our standard of living can produce huge political and security problems.

That concern decomposes into two overlapping areas: global warming and oil consumption. Global warming, caused apparently by carbon emissions from developed countries, is likely to impact less developed parts of the world (floods in Bangladesh, droughts in Africa) most severely – meaning that the consequences aren't apparent at once to the "perpetrators" but do impact the "victims" by social class. Of course, all of this gets said as we realize that nature alone can produce huge climate changes (as it has over hundreds of millions of years – that's why there's oil in the desert Middle East). Still, the evidence that man has contributed to the current situation is overwhelming, and so is the unpredictability of what may actually happen (what about the possible sudden loss of the Gulf Stream?).

The affiliated concern is oil consumption. It's possible to reduce oil consumption while increasing global warming, and it's possible that developed countries, when consuming fossil fuels, may pollute less (and emit less carbon) because of better technology. But it's obvious that China and other countries are becoming competitive consumers of oil, and that Western consumers are open to condemnation from Muslim zealots who claim that their lands have been occupied for cheap oil. This sort of thing leads to tirades like that of far left-wing Venezuelan president Hugo Chavez, who recently threatened that oil would go to $200 a barrel if the US went on any more "redacted" military misadventures. ABC World News Tonight, in "A Closer Look," ran a comparative simulation of oil use once China has the same rate of automobile ownership as Americans (without re-engineering away from petroleum fuel).

Another area where "collective" forces reign is public health and the new concerns about pandemics. In the 1980s, the religious right tested the waters with the idea that private sexual behaviors (spreading HIV) could jeopardize the lives of the "innocent." That died out, but in the past few years we are seeing new concerns about "healthy carriers" (or the moral issues associated with personal "typhoid Marys") of truly communicable diseases. It's a complicated picture. Some of it is related to overuse of antibiotics – again, an example where "personal actions" complicate future public health. But with most viruses and many bacteria, the practical reality is that people normally develop resistance as they go through life. A person can be sickly as a child and then have very little work or productivity disruption as an adult, because resistance is built up over a lifetime of crowds, gatherings, bars, and intimacy. The problem is that some populations (kids, the elderly, the immunosuppressed) are vulnerable to infections normally tolerated when carried by others – and there are some sudden surprises, like the unpredictable behavior of "superbugs." Even so, draconian government actions against those identified with TB seem hard to justify, since TB is normally not easy to transmit.

And on top of all this, there is the concern about a fatal pandemic – bird flu, or the emergence of a virus so novel that the world population is unprepared for it. The mysteries of the 1918 Spanish flu have been decoded, and we realize it can happen again. But it is not personal behavior in the West that is the issue (other than increased mobility, as with air travel); it seems the driver may be agricultural practices in less developed parts of the world, where people live close to poultry and farm animals. But why hasn't that been more of a problem in the past? Well, maybe it really has.

We find ourselves in a tug of war – can market driven technology keep us ahead of the threats? Why don’t we have an effective avian flu vaccine yet for mass use? One reason could be careless implementation of product liability laws. That’s a good libertarian argument. Technology can reduce threats and save lives. John Stossel makes that point all the time.

The social history of the past few decades has been the development of personal autonomy and individual sovereignty, at least for consenting adults. This has a tendency to leave a lot of people behind (those who are personally less “competitive” or more dependent on the socialization of others around them), and that leads to backlashes and demands that everyone be accountable to their own family and community (even if they don’t have their own children – and falling birth rates among the affluent are creating still another collective demographic problem with eldercare). The sociologist’s name for those demands is “public morality.”

An interesting sidebar to this could be the unexplained epidemic of autism (also Asperger's) in children (especially boys), about which an important letter appears in today's (Nov. 20, 2007) Washington Times. There are unproven theories about vaccines and toxins, but it's also possible that culture (and excessive infant media exposure) has something to do with it, making toddler brains decide to withdraw from what seems like gratuitous or unnecessary "real" social stimulation.

The challenges ahead (global warming, terrorism, pandemics – as well as the usually overblown shocks presented on the History Channel's "Mega-Disasters" series) may well force people to accept more socialization and local or familial interdependence than has been expected of the past couple of generations. We even hear that said now in conjunction with the possibility of running out of oil without a global technological infrastructure that can support globalization on "current sunlight" – Matthew Simmons (in his book "Twilight in the Desert") wrote that we need to learn to produce more locally and trade and travel less (in contradiction to the ideas of Thomas Friedman's "flat world"). But we heard that before, back in the "collectivist" 1970s with the oil shocks then, and we muddled through, and personal freedom got stronger, along with productivity. Will that be true again?

Note: Feb 21 2008. Since this post was written, there has emerged more medical evidence linking autism and related conditions to genetics.

Friday, November 16, 2007

More major news sites offer a lot of free content


I’ve written about the previous “controversy” over deep linking, such as on Feb. 7, 2006 on this blog.

Today, Nov. 16, 2007, Frank Ahrens has a story on page D01, Business, The Washington Post, “Web Sites Tear Down That Wall,” here.

Ahrens discusses the trend for more newspapers and other online publications to offer more of their online content for free and depend on ad revenue. Some papers offer recent content free (even though you pay to purchase a hard copy of essentially the same content). Most papers charge for full stories from "archives" that have aged a certain amount, but recently some papers have lengthened the free period. Some papers allow a certain monthly volume of stories (by email address or IP address) free, then require registration, and then require payment for higher volume. A few may charge for Internet-only content that is not available in print.
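As a sketch of the "metered" model just described (the thresholds and names below are my own hypothetical choices, not any paper's actual policy), the decision logic is quite simple:

    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical metered-access policy: a few free stories per
    // month per reader (keyed by email or IP address), then
    // registration required, then payment for higher volume.
    public class MeteredPaywall {
        private static final int FREE_LIMIT = 5;        // assumed threshold
        private static final int REGISTERED_LIMIT = 20; // assumed threshold

        private final Map<String, Integer> viewsThisMonth =
                new HashMap<String, Integer>();

        public String checkAccess(String readerId, boolean registered) {
            int views = viewsThisMonth.containsKey(readerId)
                    ? viewsThisMonth.get(readerId) : 0;
            viewsThisMonth.put(readerId, views + 1);
            if (views < FREE_LIMIT) return "FREE";
            if (!registered) return "MUST_REGISTER";
            if (views < REGISTERED_LIMIT) return "FREE";
            return "MUST_PAY";
        }
    }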

An issue arises for papers now because much of their content is found by search engines, in what we might call "sideways" access. That means the visitor misses many of the intended ads. Bloggers (like me) often give direct deep links, as a practical matter, to give the visitor immediate verification of the truth of the comment and immediate access to more details behind the story. (It's still like footnoting.) I usually try to give the page, section, and exact date of the print version, if known. It would be possible to give only the root link and then spell out the exact URL as unlinked text (to encourage a visit through the paper's own navigation), but this seems silly and unproductive. Newspapers can, of course, try to target their ads to the content of each specific page, as there are plenty of software products in the advertising market that do this.

Since I live in the DC area, my radar screen often starts with The Washington Post and Washington Times, but I try to look at as many sources as practical. I have lived in Minneapolis and Dallas and am familiar with the publications there, as well as The New York Times and the WSJ (discussed in Ahrens's story). AP, Reuters, and other wire services provide stories to many papers, which often can be carried for only a couple of weeks. I try to link to the originals if I can find them, but sometimes I cannot.

Some papers will defeat deep links by redirecting them to the home page, in order to force the visitor to see all the ads and go through their own search engines. The Washington Times has done this recently, and I find that even with the paper's own search, many of its older stories have disappeared (they don't even show up to be purchased with a credit card).

Of course, other Internet businesses have gone to the ad model, including major ISPs like AOL, and this has been traumatic for employees.

Picture: From Sharpton's demonstration in Washington DC today, which I went to and covered myself (see Issues blog).

Thursday, November 15, 2007

Why "don't ask don't tell" is still a galactic-central issue


I recall in the mid 1990s that a trial attorney from a major Washington DC (Pennsylvania Ave.) law firm told a GLBT audience that the military gay ban (aka the "don't ask don't tell" policy) was one of the most important issues the community faced, and that it could potentially become one of the nation's most important national security and public policy issues. That is so despite the fact that the number of people affected is, compared to many issues, relatively small (although close to 12,000 people have been discharged since 1993 under the policy).

Some of the effects of the ban are practical. The military often provides careers and college educations to those who are otherwise disadvantaged, and the tuition recoupment cases are particularly troubling. From a security perspective, the ban seems to be causing the military to lose scarce language and medical skills.

But the biggest concern is somewhat more symbolic. The ban, at least until studied in detail, implies to the public that GLBT people are not fit to do their share in defending the country, a point that has become more important again in a post 9/11 world where we cannot take our freedom for granted. If they are not fit to serve, why should they have equal rights in other areas?

I came of age when we had the Vietnam-era military draft. I remember the controversy about student deferments, which were eliminated with the lottery in 1969. I also remember the moral dichotomies: it seemed that a draft penalized the healthy and fit. It penalized men, and assumed that they had to prove they could protect women and children before they had rights to their own lives. A lot of this kind of thinking started to recede after Nixon ended the draft around 1973, but since 9/11 we have heard a lot of talk about restoring it and about how the "sacrifices" should be shared equally. It gets expanded into discussions of universal national service. You wouldn't need formal conscription (although the Selective Service System is still intact and ready to go); you could have a system with much stronger carrots, and some leading Democrats are pushing that angle. At least that answers the objection of "involuntary servitude." Still, the obvious question is: what about gays? "Civilian" service only? What about the Peace Corps in Muslim countries?

That brings up the central justification for the "ban" as it emerged in the 1993 debate: that it was necessary to protect the "privacy" of straight soldiers in conditions of forced intimacy. That also led to discussions about unit cohesion. I had an upside-down relationship with this whole debate. I had been thrown out of William and Mary as a freshman in November 1961 after telling the Dean of Men that I was gay, and it was assumed that my presence and the knowledge of my predisposition threatened the delicate "masculinity" or morals of normal boys. (Although, after a stay-at-home college education and dorm graduate school, I would personally serve in the Army stateside for two years without incident, not exactly openly.) That's similar to what the military was thinking. But we run into forced intimacy in other occupations: fire departments (this was a debate in NYC in the 1970s), law enforcement, intelligence. It seems that these concerns have not been as serious in those areas. One wonders about medicine, nursing, and even teaching, where at work one might have to attend to the custodial needs of a non-consenting adult.

To its credit, when Congress passed the 1993 law, it tried to say that the military is "different" from these other areas. Nevertheless, the message was clear: gay people, men especially, should not be trusted in some intimate same-sex environments.

I even thought at the time that some sort of more “humane” kind of “don’t tell” could work. After all, back then, the concern was “having a life” (we called it a “private life” and civil libertarians talked about “private choices” as “fundamental rights”) and not having to bring it into the workplace. This debate was going on just before the general public caught on to the Internet, and in a few years search engines were making everything public, to the point that public openness and candor became a virtue that seemed more important to many people than “privacy”, even in living quarters or in barracks.

It is true, there has been steady improvement in defense-related civilian employment for gays. I applied for a top secret clearance when I had a civilian job in 1972, and the outcome was indeterminate. By the early 1990s, the situation had become much better, and in 1995 the Clinton administration issued an executive order that supposedly protected gays in security clearances and civilian defense employment.

I was in an odd position, working in the 1990s for a company that specialized in selling life insurance to military officers when I decided to do the book. I was not comfortable with this, but fortunately the company was acquired, and I took a position with the acquiring company in Minneapolis. Still, this raises “moral” questions. If I am unsuitable to take up arms for my country, should I make a living off of it as a civilian? There are some people who would say no.

We get to the recent issue of ENDA, which, along with hate crimes legislation, represents a practical advance in rights for many GLBT people, but it is far from perfect because of the various exclusions (transgender, military, disparate impact, etc.). But, really, we come back to the question of sharing responsibility.

It’s interesting how history tracks. Clinton’s attempt to lift the military ban came up after the “first” 1990-1991 Persian Gulf War, which represented more of a threat to security than maybe many of us realize today. The war showed (just as World War II and Vietnam had -- indeed, the Army had stopped “asking” by 1966), that the military can look the other way on gays when it needs people deployed in real combat. It was about this time – when there was real need – that security clearances were being handled more fairly, too. And, to close the circle, the American success (in the Persian Gulf War) then may well have helped contribute to the pressure on the Soviet Union at the end of 1991, leading to the collapse of old-style communism. And now, elements of that War helped set the stage for 9/11 and our situation with Iraq, Afghanistan, etc. and Al Qaeda today. (I leave the conduct of Bush on Iraq for another time.) All of these historical trends relate. And, by the way, this was all accomplished with a volunteer military, until Bush started the repeat Reserve and Guard deployments to Iraq and the “back door draft” leading to calls for a real draft again.

Not long after Clinton raised the issue of the military ban, gay marriage (at least in Hawaii and then Vermont) percolated as an issue in the mid 1990s, to explode with the Massachusetts decision in 2004. But even in the 90s we were starting to see "shared responsibility" as part of the debate. Here the domestic responsibility includes raising the next generation (the constant call for adoptive and foster parents, the publicity over the teacher shortage and "no child left behind") and caring for the last one (the demographics of eldercare with a lower birthrate, and how filial responsibility could come to affect GLBT people). All of these issues are interconnected, like routes in a graph theory problem.

It's encouraging that individualism has held together somewhat even after 9/11, with Lawrence v. Texas (2003) and the successful outcome in getting COPA declared unconstitutional at a district court trial last year. The political pressure to lift the ban through democratic legislative processes, given how things are evolving (and the evangelical political meltdown after all the scandals), makes Marty Meehan's bill seem credible, when a few years ago it would have sounded like a fantasy (constitutional challenges to the ban have not fared well in appeals courts because of "deference to the military"). A good blockbuster film on it could help seal its demise.

In the period after Stonewall, the real gains were in being able to lead our lives within reason, at least in larger cities. In the 80s, many of us fought for our lives. In the 90s, the thinking changed, away from the emphasis on protecting private choices to the importance of full equality because equality is an important part of sharing responsibility (and sharing psychological “risk taking” too). In general, full public equality is important especially in situations where one needs to be counted on as an authority figure or role model (like a teacher). All of this only became clearer after the 9/11 events. The sensitivity of so many people to having their feelings protected when they make and try to keep marital commitments and raise families, while other people "tell" and speak with such candor on global media never before available, seems to be a major stumbling block.

There is an LCR-SLDN announcement today about a display on the National Mall in Washington around the end of Nov. 2007, discussed here.