Friday, February 29, 2008

Data Broker practices might damage reputations


The Erickson Times today (in the March 2008 print edition) has a follow-up report on p. 4 about data brokers, by Michael G. Williams, "Caught in the data broker web: Will Congress protect us from personal information leaks?" I had discussed the previous issue on this blog on Jan. 30.

The story relates a particularly troubling narrative about Intelius: criminal convictions of individuals who merely share a name with the person being investigated turned up on the same results page. This practice resembles problems already being reported on the Internet, where employers attempt to check a person's "online reputation" and can easily pull up the wrong person. The data broker industry argues that there is nothing inherently wrong with offering information from the "public domain" to customers (like landlords or employers) as long as it is collected legally.

The Senate is considering a bill that would require data brokers to disclose data records to consumers on request, and the House would prohibit the sale of Social Security numbers.

On Feb. 22, the Washington Post reported that Reed Elsevier, owner of Books in Print and LexisNexis, was attempting to acquire ChoicePoint, story here. A merger could make it easier for Congress to regulate the data broker industry and treat it much like the credit reporting industry and even the debt collection business.

Update: May 13, 2008

Vauhini Vara has a story in The Wall Street Journal, "New Sites Make It Easier To Spy on Your Friends," p. D1, link here. Among the sites discussed are ZabaSearch, Wink, Spock, Spokeo, and Zillow. A 2005 letter to me from Congressman Jim Moran (D-VA) about such sites is posted here.

Thursday, February 28, 2008

"Court reporting", globalization-style


On a Saturday in early August, 1994, while on “vacation” with a rental car, I had a personal epiphany as I ate lunch in a family diner in Sterling, Colorado, supposedly the home of the “cattle mutilation” controversy. Somehow I came to the irrevocable decision that I would do my book on the military gay ban, and ease myself into a “second career” someday exploring all the “individual freedom and responsibility” issues that seemed concentric to the ban. I’ve explained how that is in previous posts (and the connection to my own experience at William and Mary in 1961). I remember the rest of that day, driving through western Nebraska, visiting Scottsbluff, admiring the magpies and scrub jays, and staying in Cheyenne that night. I knew life would change, although only gradually at first.

My entire “paying” career had consisted of positions as an “individual contributor” in information technology. When I “retired” at the end of 2001 (post 9/11), the previous twelve years had been spent in the life insurance industry. The “natural” question is, why not make a secondary living “selling” what I know, which then would have been life insurance and annuities? I have indeed been approached about becoming an insurance agent (several times), but also about a lot of other things: the cash flow business, the charity surplus goods business, and so on. I have considered the “career switcher” route into public school teaching (mathematics). All of these involved manipulating people to convey a simplified, pre-digested message where a boss says “we give you the words.” (Note well: some retirees took up the mortgage sales business, and look at where it is now.)

My books and websites express a certain intellectual conceit, that of “knowledge management.” That is, one can design a kind of “Wikipedia on steroids” that catalogues all major political, social and scientific controversies (Wordpress is really good at this!) in such a way that the “average person” can navigate into it. (As Roger Ebert said in one of his movie reviews, “go into the Website”.) One person can start this, although one needs the help of others in a “networked journalism” operation. Of course, there are issues, such as fact checking and journalistic objectivity. The underlying potential is that, with a public informational infrastructure like this, lobbyists and conventional politicians will lose out. Simplified appeals to voters who just want their immediate comfort needs met will no longer work. Pie in the sky? Perhaps.

In fact, a large portion of advertised “opportunities” for writers are essentially jobs that require dumbing down material for lazy readers in order to sell them things. Many of them are ghost-writing other people’s agendas (such as unions’). Some of them have to do with “training” CDs and courses. The wiki idea ought to make money ethically, but it requires a lot of thought and staying power, and personal independence and autonomy.

Of course, most “normal people” have family-driven commitments to others, and find it necessary to ask for “biased” help from politicians, labor unions or the like to meet them. Their question becomes "what can you do for Me (and My Family)?" and they are more concerned about winning converts than winning arguments. It's interesting how this notion of family commitment will lead some people to perceive that "political participation" means volunteering for specific candidates in the most partisan fashion. (The NBC Nightly News video (2/28/2008) on this mode of political participation is here.) Family responsibility creates tension with intellectual objectivity. So, the logical question becomes, what does this say about me?

I did spend much of my “coming of age” years and subsequent adult life focusing on my own “psychic needs.” The “second coming” in the early 1970s required an inordinate amount of self-focus. It required privacy and a separate "universe" that was large and rich enough (the large cities) for me to build "a different life" in; it needed to allow reconciliation forays back into the "real world," but it did not require complete "equality," because the outside world was not demanding much other than the work that I did for my salary (OK, there was on-call and unpaid overtime; that was part of my "karma"). It also led to impatience with those who wanted to manipulate the spin for their short-term advantage, with no sense of “intellectual honesty.” Eric Schneiderman has a relevant article in the March 10, 2008 issue of The Nation (“Transforming the Liberal Checklist: It’s time for progressives to demand a bolder, ‘transformational’ politics,” link here; also see the Web letters). It’s true, starting in the 1980s conservatives (unfortunately, the “Moral Majority” sometimes) started recasting the issues debate in terms of personal responsibility and, less definitively perhaps, intra-family or filial responsibility. Now, changing demographics (increased life spans, fewer children) have led to a new kind of “moral focus” on the “self-absorption” that putatively interferes with forming and keeping intact new families. (Link.)

I’m obviously caught in the middle of this. In the past few years, there has been a lot of personal pressure on me to “interact” more to meet the real needs of other people. Let me backtrack and say this has happened before, as in the 1970s at the (New York City East Village) Ninth Street Center (when I was urged to start by “volunteering” to wash the dishes after the Saturday night chicken-and-tomato-aspic potlucks), or in the very early 1980s in Dallas when there was concern about housing gay Cuban refugees, or later, obviously, with the AIDS epidemic and the need for buddies (and “baby buddies” who helped with personal conveniences). But none of those compare qualitatively to the personal demands of eldercare, and of the constant pressures today to become involved in teaching, mentoring, etc. The economics of this is part of the issue, and there is a real risk that states will start enforcing filial responsibility laws, and disproportionately on the childless. But what’s more remarkable is that this new call for personal charity requires an emotional connection that is alien to people like me.

I spent a few decades living in more or less a different, although “reconciled” (to use Clive Barker’s terminology) universe where individual sovereignty was the strongest value. In the past few years, post “retirement” myself, many things have changed. Besides the eldercare issue, I found myself in classrooms with kids who demanded emotional connection and presumed that I had led a life (preferably married with children myself, a wrong assumption) that would support my use of emotion to demand “respect” as an authority figure. Others would approach me about “sales” jobs that obviously depended on structured social connections that are alien to the way I have lived.

I do have my own world of emotions, which start from within. Music is one way to map them. So, one asks, what good is it to become the Universal Scribe if no particular person matters? Well, people do find this material and give feedback, and I find that when I produce something valuable and have enough personal mobility, I do attract the people that I want. Yet that doesn’t seem fair to some people. “You” (let me be informal and address the reader here!) are supposed to attract the people who will “need” you. I have had “relationships” that, for me, generated emotional intensity. But there was no element of permanent post-adolescent commitment, which is more for the safety and protection of a lineage than for me and a partner. There was no need for a “marriage.”

Yet, it seems, as a moral matter, others (or “society”) have a right to demand this of me. If that’s true, and if it has a legitimate “moral” foundation, then there can be ramifications. In the future, others like me could be compelled to live close to other family members in order to be able to care for them if necessary. Or my right to free speech or self-expression or “self-promotion” in public spaces like the Internet could be pre-conditioned on accountability to someone. That may seem far off (constitutionally, given recent court activity), but it sounds like what some people want. Family responsibility need not wait for conception, or even a sex act.

There is a lot of coverage of autism and Asperger's syndrome in the media today, as with Feb. 27’s very informative session on Larry King Live on CNN. They seem at least reasonably related to “emotional autism,” an unwillingness to accept the gratuitous emotions of much family life. A doctor made a point about the word “autism” itself, which connotes a “turning within” in the mind, a preoccupation with the self that precludes productive engagement with the outside world, introversion carried to the ultimate extreme. That may not be completely correct or fair as a characterization. But it strikes me as interesting that a syndrome that (at least when clinically milder) people used to attribute to a ”selfishness-driven” character flaw or moral failure when I was growing up (it was the responsibility of all men to protect women and children a few decades ago) so obviously has biological explanations, and what these may be is disturbing and controversial. (The show gave statistics showing that autism has risen rapidly in boys in the past thirty years with no clear explanation.) When I substitute taught and found myself in a couple of classes with severely handicapped students, I saw that the instructional program did focus on their taking “moral” responsibility for their own behaviors as much as possible.

I do get some “angry” emails once in a while. In August, someone asked when I would pick up a hammer and go down to New Orleans and pay my dues. The speaker wanted Anderson Cooper to do the same (see the TV blog for August 2007). And I know that some employers act as if they would like to see me behave in a more partisan and protectionist manner, even if that meant taking “advantage” of anti-discrimination laws or sentiment. (I recall the incident when I was asked if I would borrow swimming trunks and get into the pool in front of kids.) That would make other “non-competitive” people in the conventional social sense feel better about themselves. I want no part of it.

For now, but not forever, I am a world “court reporter.” I’d like to find real money and build this into a film. I’d enjoy a visit to the Kodak Theater some day.

Monday, February 25, 2008

Blogs and campaign finance: the past has a lesson for today


Several times in the past two years I’ve taken up the subject of blogging and campaign finance reform. As readers recall, there was some controversy around the end of 2005, after a federal judge had ruled that the FEC could not categorically exclude Internet communications from possible regulation under the Bipartisan Campaign Reform Act of 2002. A descriptive link at the FEC is here; the statute text is here. The opinion by Colleen Kollar-Kotelly (9/2004) can be found in a dynamic link near the top of this page on the Agonist. An important development later, recall, was the Supreme Court ruling in Federal Election Commission v. Wisconsin Right to Life, et al., link here (Findlaw opinion here) in 2007, which would limit the reach of the law in cases where the intention was to discuss an issue rather than endorse a particular political candidate.

In October 2005, the Washington Post and the Washington Times had a face-off on this issue. The Post editorial was called “Cyber Loophole,” and the link is still here.
The Washington Times editorial was called “Suffocating the First Amendment,” Oct. 12, 2005, and the link no longer works (it resolves only to the Washington Times home page). The Times came to the alarming conclusion that no ordinary person would be able to afford to open a blog without the expense of hiring a lawyer, because there would be no way to know when one wasn’t making an “untraceable” (to parody the name of a popular movie) under-the-table campaign contribution. In 2006, the FEC finally amended the rules to relieve the minds of bloggers somewhat.

The question, two years later, and maybe important to raise in an election year, is, how could a personal blog conceivably corrupt campaign finance reform anyway? The Times editorial really didn’t explain that. The legal point, in circulation at the time and perhaps forgotten now, was that a political position on anything (say immigration, or health care reform), if it matched the position of a particular candidate, could be construed as an endorsement of the candidate. Therefore, the “value” of the blog posting, even if it were posted as free content without compensation, could be viewed as a “contribution” that could not be accounted for. Thankfully the Supreme Court curtailed some of this madness in 2007, with the opinion above. At the time of the debate, there was a lot of concern over links to candidates’ pages.

Still, there is an important point that lingers. When a blogger or web publisher makes a posting in a public space, accessible to search engines, the law is probably going to assume, at some point in its standard-making, that the speaker has a motive and a purpose for the posting, that the speaker expects something to “happen” as a result. The situation may be less problematic, ironically, when the speaker has been compensated by a third party and is conveying another party’s message. It may become particularly nettlesome with free content, and lead to situations where enticement could be alleged.

One observation is that personal blogs are found by others and are effective. That point is not lost on employers, who have been “investigating” applicants and employees with search engines, sometimes, as we note, finding the wrong person and not being too conscientious about their own ethical practices as employers. The problem is best known with social networking profiles, but has been coming on for a long time, with ordinary sites and then blogs, even before social networking sites per se became big players in UGC (user-generated content) scenes. The mathematics of binary search makes it much easier to find a particular person, or an item by that person, quickly, even among billions of web pages, than most novices can imagine. In assessing incidents where employers (or their “clients”) are perturbed when they learn of what associates have said “in the wild,” that point needs to be borne in mind.
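To make that point concrete, here is a minimal sketch (my own illustration, not anything from the coverage cited above) of why indexed lookup scales so well: binary search over a sorted index of N keys needs only about log2(N) comparisons, so even billions of entries resolve in a few dozen steps.

```python
import math
from bisect import bisect_left

def binary_search(sorted_keys, target):
    """Return the position of target in sorted_keys, or -1 if absent."""
    i = bisect_left(sorted_keys, target)  # each probe halves the search range
    return i if i < len(sorted_keys) and sorted_keys[i] == target else -1

names = sorted(["alice", "bob", "carol", "dave"])  # stand-in for a huge index
print(binary_search(names, "carol"))         # 2
# Ten billion indexed entries need only about 34 probes:
print(math.ceil(math.log2(10_000_000_000)))  # 34
```

That is why a name buried among billions of pages surfaces in a fraction of a second once it has been indexed.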

During the political campaigns this year, it seems appropriate to use this mathematical reality to advantage in promoting pseudo-direct democracy. Whatever Hillary and Barack say about their respective health care plans, what “ordinary people” say about the specific proposals matters. If 75% of the blogs supported dropping employer-sponsored health care and replacing it with a fairer system, that would be very important information for whoever gets elected, even if the short-term effect is to favor one candidate (even somebody like Nader now) over another.

My earlier important posts on this were: Jan. 2006; Jan. 2007 (study the News.com link there); and June 2007 (Supreme Court).

Thursday, February 21, 2008

DADT books: what happened to my "Bill of Rights 2"? It's still there, but look at demographics


With my big “Do Ask Do Tell” book in 1997, followed by the small “fundamental rights” supplement in 1998 and second DADT book in 2002, I urged the idea of a “Bill of Rights 2”. I stressed the idea of building a firewall between government and individual rights. Given the tone of many of my recent posts, many will wonder what has “happened”? Why all the recent moralizing?

In fact, I touched on the idea that a BOR2 might be accompanied by a “Bill of Responsibilities.” We all know that freedom cannot be taken for granted. Beyond the obvious objectivist notion of absolute responsibility for one’s own acts, there is a necessity to share the burdens of defending freedom. Some of these duties mix in with interpersonal family obligations that go beyond the obvious priorities that get reset by having children. Some "social rights" (indeed the last two of FDR's "Four Freedoms") only exist because of cooperative efforts and sometimes individual sacrifices.

There has long been an undertone in conservative literature of disapproval of “self-absorption.” Recently, this concept has been connected to lower birth rates and changing age and race demographics, which are seen as a result of weakened families and as having dire potential political consequences in the future, especially related to immigration. I covered this yesterday on the issues blog with discussion of a recent article in The Nation by Kathryn Joyce. It seems that some social conservatives regard personal "introversion" as self-indulgent and as a quality that makes someone a burden or even a danger.

The self-focus gets mixed in with religious notions of accepting “the spirit.” Rick Warren talks about this with his notion of “Purpose-Driven” life. “It’s not about you,” he writes. Then, as in the previous post here, James Somerville talks about the fact that you can’t be born again by focusing on yourself and becoming a better (or more meritorious) person.

Another concern has to do with the way technology works in tandem with self-concept. Technological advances, culminating with the self-expression opportunities on the Internet, have led to the drive for people to be more independent of expected or “automatic” social and familial emotional ties. There is this “Da Vinci” paradigm for leading one’s life: be your own person, building your own interpersonal relations through a selective funnel opening from your own work. That model works especially well for introverted people, but it doesn’t lead to strong families or communities. Technological infrastructure can be undermined by political or religious enemies (terrorists) or by natural disasters, and people can be forced to live in a more interdependent manner again.

People like me do become targets of this kind of thinking. We are accused of “cheating the system” by experiencing sexuality without taking the risk of possible family responsibility that comes with having children. But that idea misses the mark. I came from a family, and normally I owe it some emotional loyalty, regardless of my public or private life. I talked about the evolution of my emotional distance from typical family emotions on the glbt blog recently, here. Is this kind of thinking a form of scapegoating? Yes, and it covers up other problems, like wholesale discrimination and the unfair distribution of unearned inherited wealth among families. As Joyce points out in her article, there is no question that this sort of thinking feeds the agendas of those who want male-headed patriarchal families. A deeper point, however, seems to concern what supports good marriages need in order to foster and raise children, and one of the things they seem to need is some loyalty and an assurance of support in time of need. The emotional connectivity that this can require is difficult, and I talked about this recently on the glbt blog, here. There are plenty of family responsibilities that people take on that don’t come from their own sexual intercourse: eldercare, the need to take care of or sacrifice for siblings; and these are hardly shared equitably throughout society. Politicians seem to be uncomfortable raising these issues in mainstream debates.

That’s why it comes back to deciding just what should be expected of someone like me, as I wrote on this blog on Oct. 9, link here. Some people are very determined that those who are “different” should not reach the public except through conventional committed connections to other people, governed first of all by the nuclear family (and its anchor in “marriage”). That relates to old-fashioned ideas of "public morality" where an atmosphere of conformity keeps people motivated toward maintaining the empathy that family life needs, and (sad to say) keeps people like me from being able to distract families. This kind of requirement does tend to protect other family members in some situations, but it also contributes to jealousy and the “soap opera” syndrome.

Our economic system generally assumes individualism, and that people respond to incentives based on choices (and that includes choices that result in marriage and children). So does our legal system, up to a point, until it deals with the special privileges of marriage and parenthood, which others subsidize. People with family responsibility may feel that they should not have to compete “with neutrality” with those without such responsibility. To some extent "family values" seem to contradict "individualism" and "self-reliance" as sometimes the family concept seems to stress responsiveness to people, sociability, and perception of the self as part of a necessary group, even if it contributes to an "adult" individuality later.

Every society has always had a large fraction of adults who do not personally reproduce. Societies, even relatively liberal ones, generally expect these adults to supplement family care efforts, including being able to raise children (sometimes siblings’ children after family tragedies, a favorite theme of several movies and television series) and particularly doing a large share of the eldercare, the latter increasing rapidly as life spans increase while there are fewer children. Society expects some emotional reception and response, and even expects such individuals to serve as additional “role models” for the next generation. Finally, society generally expects such individuals to be able to share other responsibilities like military or civilian volunteer service. It’s logical that any well-respected individual would have proven at some point that he or she can support others besides the self, and can take care of dependents. All of this had been part of the moral mindset of earlier generations – the sense of duty that went with a higher standard of living than much of the rest of the world -- until the 1960s and 1970s, when it fell apart, not just because of the Civil Rights movement but because the values of government itself became discredited with Vietnam and Watergate. Another way to look at this expectation found in religiously conservative cultures is to say, "you don't go out into the world on your own until you are accountable to someone or others, usually in the family, usually a spouse and children, because this makes things right in terms of karma (or in terms of religious practice)."

Of course, that was the undergirder for my own arguments about removing government policies that make certain people (GLBT) second-class citizens or (in the past) “unapprehended felons.” To be worthy of respect, one needs to be able to share the responsibility for defending freedom, even if there is no formal draft. I wrote a statement about this in October 2007, here. One needs to have stable relationships recognized, and to have the ability to adopt children when otherwise suitable and qualified. One needs, as Andrew Sullivan once wrote, to end “public discrimination” just so that “different” people will be able to participate when needed in sharing the sensitive jobs and tasks that keep a free society going, and then enable at least some majority of people to raise their own families. What “outliers” like me experienced was a kind of generous urban exile that kept me away from real “responsibility,” but all of that changed in the last decade, because of family “demographics” and public visibility associated with the media and Internet.

Of course, this challenges the “blinder” belief system that many people think they need to stay involved in their marriages. They need social approbation, a cover that pampers them and demands some deference from others, especially those like me who want a permanent "adolescence". Generally, they do not experience an expressive relationship only as a creative end in itself; they are drawn into a collective experience of raising another generation. But some of the social changes being floated during the past fifteen years might not seem so polarizing if there were an expectation that everyone serves, everyone shares. That probably doesn’t ring very true to a lot of people. Changing demographics, and the likelihood that many states might start enforcing their filial responsibility laws, as well as pushes to supply strong carrots for national service, could change that. None of that necessarily would violate a “Bill of Rights 2,” because it wouldn’t involve government compelling people beyond what they already owe.

Other important posts:

Perspective on libertarianism now: July 2007 (blog); Nov. 2007; Fall 2006 (submission)

Related posting on GLBT blog, Feb. 26, here.

Picture: lunar eclipse, 2/20/2008, toward the end (as the clouds lifted)

Monday, February 18, 2008

Born once and Born again: the pastor says "relax": a political and moral dilemma


In June 1979 I went on a weekend retreat with the Metropolitan Community Church of Dallas on a prairie near Abilene, Texas, at a facility called El Rancho Vista. In suffocating heat, we gathered at midnight Saturday around a pond and had group prayer. A particular guy named Skip clutched me and prayed for me, making an unusual comment that I was less capable somehow than the others in the group. I was on the spot. Suddenly, there were claps of thunder, a strong refreshing wind, and heavy rain, driving us back to the bunkhouses. The confrontation was cut short. A month later, Skip would call me and invite me to dinner at the Lucas B&B on Oak Lawn in Dallas, and ask me when I would get to know God.

Now this sounds like a heavy way to begin a posting, but bear with me. In August of that year, one of my friends sang at a Sunday night service at the MCC, then on Reagan (the property had been bought clandestinely from a Church of Christ; this was long before the Cathedral of Hope was built), a hymn called “He’s Alive!” During the service, a woman who said she hadn’t walked in ten years got up and walked toward the altar. Afterward, there was a big Sunday night supper celebration at the Belle Pepper.

Fast forward to February 17, 2008, at the First Baptist Church of the City of Washington DC, where the Rev. James Somerville gives a sermon called “Labor and Delivery.” He makes an analogy (probably a common one) between leaving the comfort of the womb and being born into the world as a baby, and then being “born again.” There is nothing to do but relax, he says. Let it happen. There is no way to be born again just by being a better person.

I’ll sidestep the “faith v. works” debate here, believing that the two concepts are somewhat equivalent. Some religious faiths even within Christendom (the LDS or Mormon Church, for example) place more emphasis on works as affecting one’s position in the hereafter.

I don’t particularly relate to being suddenly “born again” while on this planet. I can imagine the reactions. I’m used to hearing opinions (as I noted at the start of this post, when applied to me), as to who has “the Spirit.” What does matter is what happens on passing. Once, driving back to Dallas from Oklahoma on a Sunday night, I heard a preacher on the radio say, “the first thing that happens is that two angels by your side escort you to the judgment room.” I heard a lot of radio debates about post-tribulationism and pre-tribulationism while living in Texas in the 1980s.

It makes sense that, on passing, one is shown the wonders of the Universe, and answers to questions about life on other worlds, that one understands the basic essence of physics, and maybe even that “information” somehow is omnipresent, regardless of the limiting speed of light and mathematics of relativity. However, one does not have the capacity to climb onto Blogger and record what one has learned. One doesn’t have the capacity to do anything or express anything. One must just “relax.” Surely, the transition to the Afterlife from this life is as radical as the transition to this life from the Womb.

Here is where the idea of karma becomes attractive. It’s a well known idea from Rosicrucianism and New Age practice. (Book review here.) It seems just. Your position in the hereafter is affected by how “well” you do, even if someone who wronged you cut your life short. The only way to make it better is to live again and do better. That, of course, brings up the subject of reincarnation. Most of us have a “first memory” from about age 3 or 4 (mine is of an electric train around a Christmas tree at age 3), and before that there is only nothingness – despite the 4.5 billion years of Earth, and tens of thousands of years of human history, of ancestors whose sacrifices and procreation made our experience today possible. Their past problems don’t seem real to us, but they should. Strong memories from very early in life, which seem fresh decades later, help make us “who we are.” If my specific parents had not come together, would I exist? That gets into the philosophy and "mystery" (even in terms of the most basic concepts of physics) of the soul or "inner self," and no one knows -- it is a matter of religious faith.

Nevertheless, standard “New Age” belief is that to have a successful afterlife, you have to live numerous times until you get everything right. You can fix things only during earthly incarnations. (No one deals cleanly with whether they are all on this planet; British novelist and filmmaker Clive Barker took up that problem in his 1991 novel Imajica, and says that he wrote that novel as a personal theological journey.) One variation of the Gospel is that Christ’s sacrifice made this repetitive process possible. Although group and family karma might exist, too, you do have to save yourself, it seems. That comports well with the ethical theories underneath radical individualism and even objectivism. You do have to share with others, however, and pay back (or pay forward) to the system the emotional empathy and kindness that was afforded you.

But one aspect of almost all spiritual and religious practice has to do with service to others, which might be reckoned as an individual measure, or might be expected as a surrender to group identity and experience. To be frank, much of the debate over “family values” concerns what should be expected of those disinclined to form their own biological lineage (often GLBT people). The older notions of public morality viewed families as units that deemphasized preoccupation with the “performance” of individual members within the unit, some of whom may have been “different” or not “competitive” enough to have their own families; so it was both the privilege and the responsibility of their parents to protect and manage them as part of the family unit, even as adults. Such individuals were expected to stay home and remain caregivers, or sometimes take up carefully structured roles in the community (priests or unmarried teachers). Such a view seems at first more compatible with an orthodox Christian view of permanent salvation by Grace alone (after only one incarnation). By focusing on intra-familial and intra-fellowship connections and "faith," it neutralizes the kind of thinking that "measures people" or lets them be "labeled" as "second-class citizens." Because such a system of "public morality" focuses on the emotional well-being and motivation of the group as a whole, it tends to be intolerant of excesses in personal expression and challenges to familial or religious authority (even "blasphemy"). But such a view also invites abuse and its own moral complacency, encouraging and tolerating such practices as segregation or slavery, and later discrimination, as well as group economic unfairness. Such a system starts to seem "irrational." Individualism answers these abuses by putting everybody on the track to race on their own, but some people still start ahead in line, and some people as individuals get left behind, to save themselves, often an impossibility. A true moral dilemma.

Picture above: Mormon Temple in Kensington MD, just north of I-495 outside Washington DC.

Saturday, February 16, 2008

Corporate blogging policies: they probably need upgrading


Around 2002 or so, some employers started discussing the idea of developing corporate blogging policies. This development at the time arose out of concerns over the possibility of legal compromise (such as trade secrets or private information of stakeholders), and after media reports of a few spectacular firings (and the emergence of the new verb “dooce”). There was a tendency to lump blogging at work (on company blogs) and at home together, because in the global world, most visitors could see both corporate and personal blogs.

There are some web references on these, such as Groundswell. On that page, the Harvard, Sun, and Microsoft Scoble links still work right now. Most of the rules mentioned match common sense. Harvard mentions the concept of a “Creative Commons license,” an idea that the Electronic Frontier Foundation has promoted for artists to help solve the DRM controversies raging today.

In March 2000, I had written a white paper on this whole subject, and I had gone more deeply into the concerns. I was concerned that it would be very easy for a manager to trigger a hostile workplace problem by discussing his personal views on a controversial subject. For example, illegal immigration is controversial, and most reasonable people find differing sides of the issue to be intellectually legitimate. A person might express strong opinions that illegal immigrants should be deported or be denied services. Then the person becomes a manager at work, has (perhaps unknowingly) illegal immigrants as subordinates and is accused of evaluating them unfairly. The possibilities are endless.
Teachers could run into this problem when they have the authority to grade students and affect their college prospects.

The issue became more complicated around 2004 as social networking sites became more popular. At the same time, personal websites, originally tending to be simple flat and linked file systems like mine in 1998, were being supplemented or replaced by formal weblogs, often subordinated to the social networking sites. The issue that disturbed employers was the dissemination of personal information, often “negative” by older social norms (or legal norms like drinking ages), sometimes connected to political or social protests or “teenage rebellion,” that could be perceived by stakeholders as indicating personal untrustworthiness. But “defamation” or “bad reputation” could live in the mind of the beholder (or customer) more than in the mind of the speaker.

Another aspect of the problem is global reach. A web posting typically will get indexed by search engines and can be retrieved anywhere in the world, outside of censorship in certain countries (like China, or some Islamic countries). But social networking companies and blogging companies began to offer whitelisting, the ability to keep access to content restricted to registered or known stakeholders. Even so, some employers looked at Facebook or Myspace profiles of job applicants even when these profiles were supposed to be marked “private.”
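As an aside, here is a minimal sketch of what whitelisting amounts to functionally; the names are invented for illustration and do not reflect any particular site's API.

```python
# Hypothetical illustration of per-post whitelisting; all names are invented.
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    body: str
    whitelist: set = field(default_factory=set)  # empty set means public

def can_view(post: Post, viewer: str) -> bool:
    """The author always sees the post; others only if public or whitelisted."""
    return viewer == post.author or not post.whitelist or viewer in post.whitelist

draft = Post("alice", "members-only entry", whitelist={"bob"})
print(can_view(draft, "bob"))    # True: a registered, known stakeholder
print(can_view(draft, "carol"))  # False: not on the allow list
```

The mechanism itself is simple; as the employer anecdotes suggest, the weak point is whether the gate is actually enforced against every channel of access.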

I have suggested before that people with certain kinds of job responsibilities (direct reports, speaking for the company in public, lobbying, sales in public, teaching with grading responsibilities) blog “globally” (outside of a whitelisted network) only with certain formal supervision. To some extent, this kind of issue could be managed with “conflict of interest” policies already well known to business. I suggested this when making up a business ethics quiz for a testing company in 2003, and created quite a bit of controversy.

Still, people in particular jobs as individual contributors may find themselves wanting to get promotions, or needing to because of circumstances (even from the family) driving them to do so. Postings made by themselves in the past could create conflicts given new and not previously encountered responsibilities for subordinates. Personal information, especially of a controversial nature, could be troubling (and that could include GLBT-relevant information). So such persons might have to go through an exercise in removing posts, and even getting search engines to remove these from caches and Internet archives. This can be done, but it can be laborious and cumbersome. Many people have dozens of pages of references to them in their “search engine backgrounds.” That’s why companies like “Reputation Defender” have sprung up trying to offer “professionals” an efficient process for doing this.
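One standard, well-documented lever for part of that cleanup is the robots exclusion protocol, which asks crawlers to skip (and stop re-caching) retired directories. Here is a minimal sketch using Python's standard library; the site and paths are hypothetical.

```python
# Verify that a robots.txt rule would block crawlers from a retired directory.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Equivalent to serving this robots.txt at the site root:
#   User-agent: *
#   Disallow: /old-posts/
rp.parse(["User-agent: *", "Disallow: /old-posts/"])

print(rp.can_fetch("*", "http://example.com/old-posts/essay.html"))  # False
print(rp.can_fetch("*", "http://example.com/index.html"))            # True
```

Search engines also honor per-page signals such as the "noarchive" robots meta tag, but purging copies already held in caches and archives still goes through each service's own removal process, which is the laborious part described above.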

The Human Resources and legal worlds, as well as school boards now, have slowly begun to crystallize positions on these issues. In 2006, the American Management Association published Nancy Flynn's "Blog Rules." George Washington University associate law professor Daniel Solove has written (and Yale University Press has published) a book on online reputation, "The Future of Reputation: Gossip, Rumor and Privacy on the Internet" (my review is here). Dr. Phil has taken up the problem a couple of times, especially with respect to teachers (and also Internet bullying).

It still is unclear where all this is headed.

Note: Feb. 20, 2008


I've discussed the Wikileaks case that surfaced Feb. 15 on another blog (Feb 20), here. Stay tuned, as this issue will surely develop quickly.

Wednesday, February 13, 2008

Do literary "diaries" resembled today's "blogs"?



When I started at George Washington University as a freshman in the spring semester of 1962 (the blog entry for Nov. 28, 2006 explains the catastrophe that had led to this), I placed out of the first semester of English composition. At GW, one took two semesters of literature before taking the second composition “term paper” course, in the second half of the sophomore year.

I digress here. A student in sophomore “English 4” could write a term paper on anything reasonable; the whole point of the course was to learn how to write academic research papers, at age 19 as a new “adult.” I wrote mine on Gustav Mahler’s influence on modern composers. A good friend from the chess club actually wrote his on vampires. Three weeks into the course, we had to hand in an “annotated bibliography” for the term paper, one of the most difficult assignments I had ever had for my age at the time. I remember staying up until 3 AM at the kitchen table to finish it. We read Huckleberry Finn in class, but that was mostly for fun (although I remember some startling passages about personal appearance and future wealth). At the end of the course, we had to return the term papers, to keep them out of fraternity files (this was long before “turnitin.com”).

The point of the year-long literature course (English Literature in my case) was partly to have more content to write about in the research course. Because I started in the spring, I took the second half first, starting with Wordsworth. I remember having about sixty pages of a thick gray anthology, “British Poetry and Prose,” to read for each class, two-thirds of it poetry, and the professor gave regular pop “card quizzes” that were ¼ of the grade (thankfully, he dropped the lowest quizzes, and offered plenty of choice of questions on the final). The poems and prose pieces looked like little glimpses into a world very different from ours, but the point of them was to study how intimate and frank writings give an understanding of what made people “tick” in a particular culture. I seem to remember the idea of poetry as giving “pleasure” from Wordsworth, and one of his famous poems (“Ode: Intimations of Immortality from Recollections of Early Childhood,” 1807) became the motivation for the famous film “Splendor in the Grass,” which I saw that lost semester in 1961.

One card quiz that I failed was on the assignment to read Thomas Carlyle, portions of “Sartor Resartus.” You can sample his work here and notice the second paragraph (Gravitation), where he seems to sum up how he thinks about science, and then proceeds to develop his metaphor of clothing. (Maybe the idea wasn’t lost on John T. Molloy and his “Dress for Success” books.) One can check up on his fictional character Diogenes Teufelsdröckh. What I seem to remember was a book of lots of short self-reflective pieces, annotating themselves as they unfolded. They strike me as something comparable to today’s idea of weblogs, when they are collected, cross-referenced (Wordpress does this well), and presented formally. Maybe one could do that with Heather Armstrong’s “dooce” site. They give the reader a detailed look at what is really going on in the intellectual and somewhat personal lives of people on a day-to-day basis. Their reflexivity is part of the technique. It's true, though, that blogs today tend to be a lot more informal in tone than the polished short pieces often presented in anthologies.

Some writers consider Sartor, “this new kind of book,” an early example of existentialism. Now that term gets thrown around today by conservative writers in describing the threat from radical Islam; on his feet, Rudy Giuliani pulled the term out of his mind in one of the debates, and I suspect that John McCain will do the same. The term is perhaps overused. I’ve used it to characterize the concerns people have over the logical outcomes of one’s personal intentions, especially when these intentions involve inherent contradictions (it gets mixed up with rationalism and radical individualism).

Earlier lit courses (and this would have been covered in the “first” semester, which I would take subsequently, in the fall of 1962) presented many diarists, such as John Evelyn and Samuel Pepys, whose Diary is our main source today of information about daily life in England in the 17th century. That does sound like a blog, doesn’t it! Most high school English classes require students to keep handwritten journals, but for the most part, teachers have been reluctant to post them online as "blogs."

One could say that the individual Canterbury Tales are almost like blog entries (in verse). Even in 1962, the literature professor was willing to offer the view that the Pardoner is homosexual. Or perhaps one could say that of the individual Parables in the New Testament, or even of individual chapters in the Epistles. One could collect them as individual postings, and categorize them according to the kinds of moral issues or social problems that they address.

Update: Feb. 22, 2008

See NBC4's story about a Maryland man's "postsecret" blog based on secrets sent to him in art; the blog consists of the artwork.

Tuesday, February 12, 2008

Potomac Primary today: record turnout for a primary?


Well, today is indeed Chesapeake Primary (or Potomac Primary) day. This time, I did not work the polls. (PBS has a documentary “By the People: Democracy in the Wild”; review here.) Instead of offering “national service” for a sixteen-hour day, I simply went to the polling place (where I had worked in November) and voted as a private citizen. Yes, they recognized me.

Virginia is still using the WinVote system, which does not leave a detailed trail by vote, but does provide very detailed summary reports. Also, in Virginia, the voter goes to the same polling place for either party's Primary, and does not have to be pre-registered for one party; the voter can decide at the last minute (when checking in) which Primary to vote in, but can vote in only one of them. This time the "technical process" is “complicated” by the fact that the poll workers have to count the voters for the Democratic and Republican primaries separately when one logs in. The poll worker then has to click “which primary” when accessing the machine with the smart key. But there is only one choice to make for any one voter.

The poll here in north Arlington was crowded when I went at 9:30 AM. It was much more crowded than it was for the election in November, when I worked. If the mid-term election in 2006 is any indication, the crowds will probably dwindle by about 10:30 AM, pick up during the lunch hour, dwindle, and then become very heavy after 4:30 PM. The polls opened at 6 AM (people were waiting to get in from the February cold and dark at 5:45, I was told), and will close at 7 PM. Anyone in the building at 7 PM can vote, but no one is admitted after the polls close. The likelihood is that it will take thirty minutes or so to clear the line after the polls close. I had to wait about twenty minutes to vote this morning.

I had planned to vote for Hillary, but I was so taken by Oprah Winfrey’s program yesterday on building a dream house for a family that I rewarded Oprah with a last-minute vote for Barack Obama. The media stars really can make candidates look good, even at the last minute.

On ABC’s “The View” this morning, Barbara Walters said that Clinton-Obama could be a dream team, but not the other way around.

Monday, February 11, 2008

Oliver North op-ed on cultural norms for gender responsibility


I recall a critical point in James Cameron’s 1997 epic film, “Titanic,” where the call for access to the lifeboats was “Women and Children First.” Later, Jack Dawson (Leonardo DiCaprio) became very chivalrous in protecting socialite Rose (Kate Winslet), even as they hung on in the freezing north Atlantic. Eventually Jack perishes whereas Rose goes on.

In an op-ed on p B03 of the Sunday Feb. 10, 2008 Washington Times, titled “Women and children first,” Oliver North discussed first how the concept (or phrase) came from a sea incident in 1852 with a British frigate called the Birkenhead near Danger Point in South Africa. The story he spins sounds like it could have become the topic of an unwritten Benjamin Britten opera, where the British, in their naval empire building, have to encounter every moral, social and political issue known to man abroad, and then develop literature about the issues for high school students to study centuries later. The North link is here.

North goes on to discuss the horrific practice in Iraq of using disabled people (especially women) for suicide attacks. The Nazis did the same thing, as did the gulags of Joseph Stalin or Pol Pot. It’s not necessary to belabor callous disregard for human life in totalitarian or extremist cultures.

North had a lively radio talk show in the 1990s. He ran a security company then, and he once mentioned that his own employment policy was something akin to a modest “don’t ask don’t tell.” One can gauge his positions on social issues from a particular quote from around Aug. 1995. “Life is mostly about meeting obligations to others, with occasional moderate self-indulgence.” It turns out that this is a quote from Andrew Peyton Thomas, “Can We Ever Go Back?,” The Wall Street Journal, Aug. 9, 1995. That broadcast occurred when I was gearing up to write my first book.

When I was growing up, I was made very aware of the expectations of men, that their first priority was to protect the women and children around them. I had fallen physically behind (I don’t know why, but an attack of measles at age 6 could have something to do with it) and such an expectation seemed burdensome. This cultural rule carried over to the whole issue of the Vietnam era draft. At one time married men and fathers were to be exempt, but that was dropped around 1965 when Johnson and McNamara (liberal Democrats, to be sure) escalated in Vietnam. But student deferments remained until 1969, and became one of the most contested moral issues of the day, comparable in importance to “don’t ask don’t tell” today.

The draft seemed to penalize the young men who were the most fit. That always struck me, in my coming of age, as “morally questionable.” The same could be said today of the “backdoor draft” in Iraq, created by repeated and extended tours of the “volunteers” who joined the military; and now the media coverage of the wounded provides horrific images. In the spirit of the previous posting, it certainly provides a test of the meaning of marriage. We can go on about that. But really, the same is true in other areas of society. The most physically fit are likely to enter fields like law enforcement or become firefighters. We depend on the fit to do our most dangerous jobs. Journalist Sebastian Junger wrote about this in his book Fire (2001), published around the time of 9/11. Finally, our debate over “family values” leads to the idea that the ability of men to “compete” to take care of others is a morally relevant requirement for credible participation in the public space.

In his 1996 State of the Union Address, President Bill Clinton said, “The era of big government is over. But we cannot go back to the time when our citizens were left to fend for themselves.” I remember later in 1996 a speech by Melinda Paras of the National Gay and Lesbian Task Force, given at Fairfax County’s Government Center, where she derided conservative lack of “fairness” and “compassion” as if they were interchangeable. She got into some discussion of the Americans with Disabilities Act of 1990.

On one job in the 1990s, there was a young man with an eyesight disability, almost to the point of legal blindness. He was properly accommodated with a much larger than usual computer monitor. He was the most productive person in the information technology department. Everyone depended on him to solve arcane problems in server scripts that no one else understood. He developed a Wikipedia-like website at work on how to research all of these problems. He ran his own business as a small ISP (which I used from 1997-2001). He had a family. This is how ADA is supposed to work. It doesn’t always work out that well.

Left wing people have sometimes told me that I should regard myself as “disabled.” The physical problems, along with academic success, did not keep me out of the Army, but they kept me in a “safe place” (no Vietnam) when I had my “draft hitch” 1968-1970. In a way, it isn’t fair. Donald Trump says, “life’s not fair.” The moral challenge is to overcome one’s own personal crosses.

ABC "World News Tonight" had a story about a Walgreens distribution center in Anderson, SC, "Employees at This Walgreens Distribution Center Are More Able Than Disabled; Executive With Autistic Son Made It His Mission to Set Up Employment Opportunity," link here.

Sunday, February 10, 2008

Individual sovereignty has a kink to untie


One frigid Saturday night in mid-December 1972, I found myself in a dark, drafty, abandoned tenement in Newark, NJ, listening to the angry rhetoric of the People’s Party of New Jersey (supposedly connected to Dr. Spock) as various “radicals” proposed their platform. Earlier that day I had been learning to ski, GLM method, in the Appalachian foothills. All of this was the closing of my ”coming of age,” just before I would “come out” for a second time. I remember some of the proposals: abolition of inherited wealth. Limitation of maximum income to $50,000 a year. (Imagine that today!) Abolish “profit” as a legitimate motive of business. (That sounds like “Reds,” doesn’t it.) I recall a young woman who worked as a secretary complaining that she was called “the girl” in the office. Someone called out, “will anyone making more than $5,000 a year stand up and be counted?” I declined, but I felt like an unwelcome intruder and “spy,” working for corporate America -- Sperry Univac -- for $14,000 a year, at the time a decent middle-class salary for a young unmarried male. Not only “rich people” but the pseudo-bourgeois middle class were the “enemy.”

The level of street-wise or “grass roots” indignation about the unfairness of things was now quite manifest to me. But it had been so before. Back in the mid 60s, I had been party to countless debates about the draft and the “fairness” of student deferments. With World War II and Korea still recent history, there was a definite perspective that defending freedom was an obligation that must be shared (an idea that Tom Brokaw talks about today in his “Greatest Generation” and “Boom” books). By now, even the most conservative among us knew that Vietnam was, at best, very suspect (we had lived through “Medium Cool” 1968, when I actually was drafted, finally). In the mid 1960s, the Civil Rights movement had blossomed, with Martin Luther King’s leadership (and tragic end in 1968). In 1969, Stonewall had occurred, three weeks before man walked on the Moon. As of that evening, the ramifications of Watergate were still not widely known, as Nixon had just been re-elected in a landslide. Nixon, remember, used to blast the war protestors (and draft dodgers splitting to Canada) as spoiled upper-class kids who knew nothing of sacrifice. (But Nixon would end the draft in 1973.)

The runes of the Far Left had become predictive and evidentiary of a new trend in the way we looked at morality. For all the rants now seemed to focus on the unfairness of individuals’ receiving unearned wealth (like inheritance) while others lived in poverty. Notice the tone of the moral rhetoric: it focused on the individual. And this was the talk of the Far Left, a fringe element bordering on Communism, or at least radical socialism. Individuals should be assessed as to what they “deserved,” and wealth should be expropriated from the undeserving and redistributed to the needy. A purification was to occur. In fact, entrenched Communism (especially in the Soviet Union) did just that: it maintained its “deserving elite” (including the Soviet chess masters). In China, the “Cultural Revolution” spurred by Maoism was perhaps the most extreme example of this kind of thinking in modern history, as “intellectuals” were forced to toil in the countryside.

We had long heard moral debates. But in the past, back in the 50s, they had generally talked about people in groups. Dr. Edward H. Pruden in the First Baptist Church of the City of Washington DC had often preached about racial justice and ending segregation. I heard some of his sermons in the old building (replaced in 1955), when I was barely old enough to understand the idea of abstract moral discussion taken beyond the tenets of a religious faith and applied to the real world. Pruden had even asked and analyzed how the people of Germany could have allowed Nazism to overrun the country in the 1930s. Yet, in those days, these sorts of moral questions were understood as political in nature, to be solved en masse by organized political and legal efforts pressuring governments to change. They did not seem to compel the individual to act in an unusual way, other than to vote, perhaps attend some rallies, or sometimes work for political candidates. They really did not demand “giving up” things.

By the 1970s, though, we understood that these moral questions subsumed a lot of personal responsibility. Even the early gay movement knew that. There may have existed some income and cultural differences between suburban gays (GANNJ) in New Jersey – who met in Unitarian churches and liberally-owned properties that would rent to them – and the radicals at 99 Wooster Street in New York’s Soho (the “Firehouse”), but by now everyone understood that moral responsibility for things was becoming privatized. It was up to the individual to craft his own life according to some personal moral grounds. Nowhere was the view clearer than at the Ninth Street Center in New York, and it seemed at times that the East Village was where this kind of thinking flourished.

And personal responsibility would take a different turn in the 1980s as Jerry Falwell’s Moral Majority (and televangelists like James Robison) took it up. Right and wrong no longer belonged to the group; it was the individual who was to be measured, assessed and judged.

Of course, in the middle of this lived the central social institution, “the Family,” which one high school student compared to a “banana” in a theme when I was subbing. The Family always defined a “local moral universe.” Loyalty to one’s flesh and blood was always a primal value. Parents expect respect throughout life from their children, even as adults, and they expect siblings to take care of one another. It isn’t hard to see where this leads in the debate about the meaning of marriage and “sexual morality.” As society set it up, marriage confers responsibility, social status, prestige, and preference on those who will commit themselves to lifelong monogamous marital sexuality. The ability to confine sexual intercourse strictly to legal marriage, having and raising children within the family that results, confers the right to influence the lives of others who have not married and had children of their own. Historically, that became elaborated to the point that whole political alliances were predicated on arranging the right marriages and bearing the right heirs. Same-sex marriage, while it progressively looks toward the idea of sharing social responsibility among generations, still challenges the “meaning” given to marital sexual intercourse that seems so important to so many people.

Every society has plenty of men and women who do not marry and/or do not bear children. Society has always had other outliers from the normal family structure, even plenty of “only” children. In the past, those who did not procreate were expected to fit into the social structures created by their parents, who had procreated. The women were expected to be the ones who stayed home and looked after aging parents. Society worked at creating “legitimate” places where such “different” people could fit in without causing disruption to the social status quo. Unmarried women often became teachers, and school systems worked hard at making them seem credible to parents and children as authority figures, with strict rules of teacher “morality.” The Roman Catholic Church, using theological justification, carved out honored or respected places (as the “non-marrying kind”) for nuns and priests, who had real authority, at the “price” of not only physical but intellectual chastity. In recent times, for priests, the credibility of this arrangement has broken down, but there were plenty of problems with moral credibility in earlier centuries, too. Very talented men could often get away with doing what they wanted, but generally most men felt heavy pressure to marry and have children, even if they were not so inclined.

But one of the accomplishments of the modern western family was to strike some kind of balance between individualism and the group. The “family wage” developed as a concept, eventually to recede after the 60s. There was an implicit understanding that those who did not “compete” well for mates should expect and receive an arrangement of mutual support with their biological families. In exchange for cultural “loyalty” to their family “reputation,” their families would help support them and shield them from the global measurement and judgment of apparently growing radical individualism. Such individuals were expected to experience a lot of “faith” and accept the goals of others in the family as their own goals. After all, others within the family, such as the elderly and the disabled (since medical problems strike many families randomly), needed to count on them for a sense of meaning as well as, sometimes, financial support. Only the extremely gifted could break away from this.

Of course, with time, and especially from the 60s on, this social arrangement became less credible and less acceptable. After Betty Friedan (and despite the condescending tone of women’s magazines of the time), women advanced in the workplace (Univac, where I was employed in 1972, was one of the most progressive employers then for moving women into management) and did not “need” men according to the superficial idea of family division of labor, a complementarity that had “protected” families before. Gay men sought, among other things, emancipation from heterosexual expectations of “competitive performance” (and explicit message board postings on the Internet today still reflect that concern); yet, at the same time, gay male values, experienced in “upward affiliation,” seemed to worship social paradigms of “masculinity,” creating a certain existential paradox.

The Vatican would become increasingly vocal in its promotion of the whole paradigm of abstinence-until-marriage-and-procreation as an extension of reverence for human life (as usually stated in the debates over abortion and being “pro-life”). In a certain purely intellectual way, Vatican moral theory makes sense, but it eventually becomes circular. In practice, marriages foundered because the government discouraged marriage with its welfare policies, and because straight men took the cue that it was OK to “love ‘em and leave ‘em,” becoming deadbeat dads. Yet social conservatives and pastors on the religious right often blamed gay men for walking away from the moral responsibility of biological loyalty, leaving their families (and parents dedicated to and transformed by lifelong matrimony) abandoned and gutted, a rather amazing claim to make at the same time as claiming that active gays are only about 1% of the population. All of this moral finger-pointing went on in a culture now ready to place increasing responsibility on the individual instead of the group the individual had come from. Furthermore, the moral debate would forget that only a couple of decades before, the horrible collective abuses of segregation and racial discrimination (which had grown out of slavery) had been accepted as an inevitable result of keeping people focused on the needs of their own immediate biological families. We would see social justice (“karma”) first as a matter of redistributions among nations or religious or ethnic groups or extended families, and in time come to see it in terms of fairness among the members of any particular group or family. Individualism could easily marginalize some people within any particular family.

The radical individualism points up a basic fact about the way we conduct these debates. We expect to rationalize policy decisions by the effect they have on secular society, now seen as a collection of autonomous individuals. We expect the market to take care of things (a stance sometimes criticized as “market fundamentalism” or “extreme capitalism”). Yet people have tended to look to religious precepts for relatively straightforward moral principles to address all of these problems. The Bible, in both Testaments, is filled with parables that explore the moral tension in the way people see themselves and must balance their expression as individuals against the needs of the society around them and of “God.” (The “Rich Young Ruler” incident is a good example, as are the parables of the “Talents” and the “Vineyard.”) Nothing really seems that simple.

Most people in previous generations saw the nuclear family as an intermediary, a social arrangement (and commitment) that not only raised kids but also absorbed some of the competitive pressure that normally comes upon adults. That’s why marriage conferred so many privileges, privileges that the unmarried and/or childless sometimes feel forced to “subsidize” and “sacrifice” for. What we have now is a somewhat confusing situation of conflicting moral expectations, and a need for a new paradigm. In the coming world most adults will find that some family responsibility (eldercare, and sometimes sibling care or even participating in raising other people’s children) will be expected of them regardless of their own sexual or procreative activity. Sexual intercourse surely creates responsibility, but it also comes from responsibility. Burdens must be shared, something previous generations (like Tom Brokaw’s “Greatest”) understood. And the painful thing is that these responsibilities can be life-changing, with a serious effect on the decisions a person can make about his own life. Some external situations (pandemics, natural disasters, global warming, terrorism) can place more emphasis on social interdependence than our current culture accepts. An overall term for this problem could be “pay your dues.”

The adult finds that, even if he or she does not form a family and have children of his or her own, he or she is called upon to prove the ability to provide for others, or else slide back into a pattern of emotional and social co-dependency on family and group, with limited opportunities for personal choice and expression. The nuclear family, buttressed around marriage, is supposed to take care of people in such situations. Of course, “competitive” performance throughout life, starting in school as a young person, becomes essential to breaking out of this co-dependency and gaining more self-respect as a free adult. That sounds like a dose of objectivism, but there is no way around it in a free society. The pressures on those with "families" to compete have serious practical consequences, as illustrated by the subprime mess -- and, yes, corporate greed (as well as short-term thinking in financial markets) is a major factor, too. They are all part of what David Callahan, a Princeton-trained political scientist, called "The Cheating Culture" in his 2004 book.

Some of the most perplexing problems now associated with the Internet – such as “reputation defense” – have to do with the sudden “empowerment” of individuals to find a global audience without necessarily maintaining connection to, or accountability to, others. This Internet opportunity may seem to rescue some people from the “objectivist harness” just mentioned. But "global expression" may be a privilege, dependent on technology, earned after meeting inevitable obligations to others -- an idea that earlier generations understood as implied by liberty, but one that seems to need restating today. Indeed, the notion of "family reputation" could even become significant in the law (again) as this problem evolves on the Web. One subtle problem, vexing employers as noted in the previous post, is that almost anyone may find himself or herself perceived as a “role model” by others, including other people’s children (“OPC”). It’s becoming apparent that this is rather universal, and not always a direct personal choice.

I believe in individual sovereignty. But it does sound like it needs to be earned. Things happen, people get challenged, whatever their wishes; and then they need principles.

Thursday, February 07, 2008

Internet self-parody generates "bad karma"


On these blogs, I’ve talked in a few postings about the disturbing trend for web speakers (especially teens) to “defame” themselves on the Web, particularly on social networking sites, as a form of what they believe to be political or social protest against what seem like arbitrary laws that protect teens (such as regarding drinking) or conformist social standards.

Media reports indicate how seriously unnerved potential and current employers are by this practice. This whole development, in the last three years or so, is a bit of a surprise to everyone—from both sides.

Many times when this happens, the speaker has not had his or her own children, or has not experienced responsibility for or accountability to anyone besides the self. The speaker may well be unwilling to accept this kind of responsibility except on his or her own terms, because that is a basic principle of modern individualism. But the employer reasonably wonders whether the person understands that stakeholders or clients will want to “depend” on the person, at least in a professional capacity, and would not be able to do so if the person “defames” the self in a public space, even to make a political point. The content may in itself be legitimate and legal (although underage drinking, for example, is not, even when merely caught on video), but the individual would not present the self this way if others had to depend on him or her. It’s a bit of a cultural and even moral dilemma, good for Sunday school. Even Donald Trump weighed in on this in an angry outburst "in the boardroom" on "The Apprentice," when a candidate wanted to give up his exemption for having won the previous week; Trump called that "life-threatening" behavior.

While we have grown used to “radical individualism” in the past couple of decades, and use it to justify many expressive behaviors, the idea of interdependence has crept back into public debate, partly because of all the new “external” potential problems. After all, the debate over “family values” (to speak of it generically) has a lot to do with the emotional connection and loyalty that many people (like parents) believe they have a right to expect from others, to avoid being marginalized, even as they have made today’s lifestyles possible. The intergenerational connections seem necessary if people are to have enough incentive to take seriously problems like global warming. People have discussed this concept recently even in a “New Age” context, in conjunction with books and films like “The Secret,” as on Oprah yesterday. It all sounds like karma.

Wednesday, February 06, 2008

Strategic career planning in "pseudo-retirement" -- redux


About fourteen months after my “big” layoff (at the end of 2001) I was at a job fair (still in Minneapolis) and ran across a financial institution that said it was “expanding” with a franchise-type operation that would make a lot of money by getting people to roll over their portfolios. Actually, as I searched my own memory, I had gotten a bizarre phone call from a headhunter on Sept. 13, 2001, two days after 9/11, while I was still working in my IT job (with three months still to go), about this very opportunity, and I wondered what about me had attracted the call.

I went to the “interview” near the MSP airport (in early 2003), and the manager presented a private pep talk, with slides and pictures, of a business that would convince people to convert whole life policies to term life, saving them money. This was said to be a $40 trillion market. It was a dog-and-pony sales presentation. I did not say a lot, and I don’t recall exactly what I asked (nothing too controversial or sharp-edged, in my mind). But he did say, “We give you the words,” and mentioned that Nicole Kidman had said that about screenwriters at an Academy Awards show. But then, twenty minutes into the event, he suddenly said he didn’t think I was a good fit. Okay. But then he kept defending his business for ten more minutes after telling me that. It was as if he suddenly suspected that I was an undercover journalist sent there to expose him.

Perhaps he had looked me up with search engines. (Why else would he have brought up the subject of the Oscars out of context?) Social networking sites hardly existed yet, and blogs as they are set up today were just coming into vogue. (Dooce.com was already known; a Minnesota filmmaker was starting a project called “Blogumentary”; and the dot-com boom had gone bust.) But then I ran hppub.com (something that I showed off in the tube city at the Saloon on Saturday nights before people hit the dance floor), which already had loads of controversial material in flat and linked html files based on my books, and even then I got about 1000 hits a day on it. (I had doaskdotell.com to propose the movie, but I would not merge hppub.com into it until 2005.)

It was all for the best perhaps. But I did have an IT career in life insurance, was still collecting supplementary unemployment (I had a part-time job getting contributions for the Minnesota Orchestra and one short freelance writing (multiple-choice test writing) contract with a testing company), and I could have been pressured into something like “sales.” After all, if the industry had paid my rent for a dozen years with stability, why wouldn’t I want to “sell” it?

In the spring of 2005, in fact, I would get calls from two life insurance companies (out of the blue) about becoming an insurance agent. I went to one of the presentations, and took the screening test. (Would you buy insurance from an agent? I answered yes, because I wanted to continue the interview. I did let myself get talked into buying a AAA policy in 1979, but I wouldn’t today.) They actually wanted someone who knew the industry, and I do. They wanted the technical knowledge. But the trouble is, selling insurance this way involves a lot of social schmoozing and generating contacts for leads. I don’t conduct a social life that is conducive to that. (It helps to be married, or at least to have been married and raised kids, even though companies deny they would “discriminate.”) Furthermore, you had to give up all other income while in training and collecting a “training bonus.” That would end these blogs, probably.

In early 2006, I got a bizarre unsolicited call on a Friday evening from a company looking for someone to supervise disadvantaged teens who would scour shopping malls or go door-to-door selling surplus restaurant meals and similar services, with money to be raised for designated charities. I wondered why I was even called, but I went to see what it was. The company seemed legitimate enough, and people were starting work the next day. I said no, though, because I don’t chase people down in public places and disturb them, even for good causes.

I’ve already talked about the substitute teaching episodes. There were situations with certain kinds of students who needed more emotional connection than I was willing to give.

And I read all the time that technical sales is a big field now. But it poses the same problem: a lot of schmoozing, pampering and personal manipulation that many techies, who are often introverted, dislike and find gratuitous and unwelcome.

So much for all of this negativity. Most of it – all of it – is the dislike of the idea of being a middleman who manipulates people and produces no real content.

So what are some of the “real jobs” that I would like to do in ex-retirement?

How about solving some problems? I can think of a few.

One of the obvious ones is setting up “due diligence” address and identity verification systems, mandatory for large lenders to use, that would shut down most identity theft. That’s covered in my blog on “consumer identity protection” (check the profiles). The USPS NCOA system would be a starting point. I have worked with that.
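To make the idea concrete, here is a minimal sketch of the kind of check a lender could be required to run before extending credit. Everything in it is hypothetical: real NCOA data is licensed through USPS-approved vendors, and production systems use CASS-certified address normalization, so the records, field names, and matching logic below are stand-ins for illustration only.

```python
# Hypothetical sketch: flag credit applications whose address does not match
# the consumer's address history or a registered change of address.
# Real NCOA data is licensed through USPS vendors; the records here are mocked.

def normalize(addr: str) -> str:
    """Crude normalization; real systems use CASS-certified software."""
    return " ".join(addr.upper().replace(".", "").replace(",", "").split())

# Mock "NCOA-style" store: consumer id -> (address on file, forwarding address or None)
ADDRESS_HISTORY = {
    "C1001": ("123 Main St, Arlington VA 22201", "500 Oak Ave, Minneapolis MN 55403"),
    "C1002": ("77 Elm St, Dallas TX 75201", None),
}

def verify_application(consumer_id: str, claimed_addr: str) -> str:
    """Return PASS/FAIL/REVIEW for the address claimed on an application."""
    record = ADDRESS_HISTORY.get(consumer_id)
    if record is None:
        return "REVIEW: no address history on file"
    on_file, forwarded = record
    claimed = normalize(claimed_addr)
    if claimed == normalize(on_file):
        return "PASS: matches address on file"
    if forwarded and claimed == normalize(forwarded):
        return "PASS: matches registered change of address"
    return "FAIL: mismatch -- require manual identity proofing"

if __name__ == "__main__":
    print(verify_application("C1001", "500 Oak Ave, Minneapolis MN 55403"))
    print(verify_application("C1002", "9 Imposter Ln, Nowhere KS 66002"))
```

The point is the workflow, not the matching code: an application mailed to an address the consumer never registered would fail automatically, which is exactly the step most identity thieves count on lenders skipping.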

Another is to develop a content labeling system and paradigm that actually works and meets the concerns addressed by COPA (the Child Online Protection Act of 1998) without hindering the ability of adults to speak freely on the Internet. Many organizations have this well underway, but real leadership is needed to tie together the loose ends. I was one of the plaintiffs challenging COPA under EFF.
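The mechanism itself is simple to show. The sketch below is only a toy: the label vocabulary is made up (real efforts like the W3C's PICS and the ICRA vocabulary defined their own syntax and categories), but it illustrates the division of labor COPA never achieved: the publisher self-rates a page, the label travels with the page, and filtering happens on the client side, leaving adult speech untouched.

```python
# Hypothetical sketch of publisher-side content labeling. The vocabulary
# ("adult-themes", "violence", etc.) is invented for illustration; real
# schemes (W3C PICS, ICRA) defined their own syntax and categories.

LABEL_VOCABULARY = {"adult-themes", "violence", "gambling", "none"}

def label_meta_tag(labels: set) -> str:
    """Emit an HTML meta tag that a browser-side filter could honor."""
    unknown = labels - LABEL_VOCABULARY
    if unknown:
        raise ValueError("unknown labels: %s" % unknown)
    return '<meta name="content-labels" content="%s">' % ",".join(sorted(labels))

def client_allows(page_labels: set, blocked: set) -> bool:
    """Client-side filter: block the page if any label is on the block list."""
    return not (page_labels & blocked)

if __name__ == "__main__":
    print(label_meta_tag({"adult-themes"}))
    # A parent's filter blocking adult themes suppresses the page,
    # while an adult with no block list sees it freely -- COPA's stated goal.
    print(client_allows({"adult-themes"}, blocked={"adult-themes"}))  # False
    print(client_allows({"adult-themes"}, blocked=set()))             # True
```

The hard parts, of course, are not in the code: getting publishers to label honestly, and getting browsers and filters to agree on one vocabulary, are the "loose ends" that need leadership.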

Another would be to address the issues of what kind of telecommunications competition model really does facilitate both profit growth and robust continued input from user-generated content. This is the network neutrality “problem,” augmented by concerns about security and ethics in how Internet advertising really works. It is still a nebulous area.

Another is a sensible replacement for DRM (digital rights management) which has made legally naïve but technically clever consumers and media companies enemies like lions and hyenas. Yes, artists (and media companies) must get paid. But there are better ways.

Closer to what I have done would be to develop best practices to replace the “Internet background investigations” that employers have suddenly taken to doing on job applicants and sometimes current employees. Rather than simply getting consumers to sign up to have their online reputations watched, real coordination is needed among employers, associates, and companies offering to manage reputations. Yes, this could be a big early 21st Century issue, one that nobody saw coming until about three years ago.

As I noted on the previous posting, I think that I could help a major media news network “connect the dots” or organize the interconnections among stories in order to further assist viewers and probably increase ratings and therefore sponsor advertising revenue.

On the insurance idea, yes, I would love to analyze the way to make long term care insurance or state partnerships work. I just don't want to hunt down leads as "prey."

And, oh yes, I would love to be holed up in a condo in LA, Toronto, Vancouver, or wherever movies are made, to work 100 hours a week on making the “movie” on DADT, combining Colonial Williamsburg, the 1961 Expulsion, and the present-day “dadt.” I am struck by the power of great movies to convey moral concepts that are hard to put into words. “Atonement” and “There Will Be Blood” proved that this year.

So much for “American Pie in the sky” or “shoot for the Moon.” What about dusting off my mainframe skills? Although I renewed my Brainbench certifications in key programming areas last year, I haven’t worked in this area since the “retirement” at the end of 2001. I’ll probably say more about this on the IT blog soon.



In the critical content areas that I discussed, I do accept, with enthusiasm, the need for management skills, “people skills,” and salesmanship. I am prepared to deal with these. You always have to be able to sell your own work. Yes, I will sell my own work, and work done in reasonable collaboration with others. And I recognize that, if I created the right opportunity (maybe with an office right next to the Nationals’ new baseball stadium, who knows??) I’d have to behave online in a manner consistent with the job (as I have outlined in previous postings). I just don’t want to be a peddler or huckster. No one needs to “give me the words.”

Monday, February 04, 2008

Revisiting knowledge management, especially on political issues (eve of Super Tuesday)


On the eve of the Super Tuesday primaries in both major parties, let’s think back about how representative democracy works, in the minds of most voters.

Most people vote for legislators and executives (that is, the Executive Branch of government) and, in some areas, even judges whom they believe will represent their interests and views. That notion underlies partisanship. It also tends to lead to pork-barrel projects and a lot of inertia in the influence of special interests.

The same trend exists with NGO’s, political advocacy organizations, PAC’s, and the like. Most of them prepare form letters to email to legislators, or arrange demonstrations. Most of these groups do a “we give you the words” number and tell voters or constituents what to say to “professional politicians.” Spin used to be left to the professionals.

Yet, the academic idea of an educated voter is someone who is familiar with all the issues. Public secondary and subsequently higher education, especially in liberal arts, is supposed to give the young adult a balanced view of society’s problems and how they may be attacked, often based on historical examples.

Issues are connected (like “dots”) in more ways than many people see. There are so many examples. For example, if we were to bring back the draft or even compulsory national service, what would happen with “don’t ask don’t tell”? Politicians talk about reforming health care all the time but so far have very little to say about eldercare. It’s easy for pressure groups to make a case for paid family leave (and we should certainly understand how well it works in many other Western countries), but then you have to ask what is going to be expected of the childless. Concerns about global warming and pandemics raise fundamental questions about individualism and social interdependence. Very few people understand anti-gay sentiments beyond the obvious homage to religion, when there are serious questions about how family responsibility is generated and shared. We have a body of law around the First Amendment and free speech, but few people understand the practical challenges posed by the Internet and search engines (the “reputation defender” issue). Overall, when one puts all the issues together in a cake mix, one gets the impression that the way burdens get shared by individuals is an important issue, an idea that was understood a half century ago but that has been gradually forgotten.

Major media companies must, of course, present all the issues; but there is a tendency for the issues to seem fragmented when encountered by viewers, or even by visitors to major media websites. That’s partly because the news is a moving target, and the individual subtargets keep changing. Some news programs, like “Anderson Cooper 360°” on CNN, make more of an effort to correlate issues than others do. In his own documentary film about global warming, “Planet in Peril,” Cooper used the term “interconnected.”

This takes us to the issue of “knowledge management,” as Jimmy Wales calls it. Certainly encyclopedia sites like Wikipedia (and Citizendium, the “citizens’ compendium”) accomplish much of that. However, they necessarily must stick to provable facts, carefully supported by references. Wikipedia has attracted attention in the past year by being much stricter about facts and notability. At the same time, there needs to be a way to categorize and correlate views, opinions, and cultural filters.

I’ve tried to do that with these blogs (as well as the doaskdotell.com site). The underlying theme is individualism and the many trends in many areas that challenge individualism. If a visitor looks at a number of postings on a number of the blogs, he or she will get the feel of what this means. In retrospect, a more effective way to accomplish this could be to use Wordpress and its feature for dividing postings into Categories. Another way could be to keep all of them in a SQL Server database, properly set up and normalized; a sketch of what I mean follows.
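Here is a minimal sketch of that normalized layout, under stated assumptions: I use SQLite rather than SQL Server so the example is self-contained, and the table and column names are my own invention. The point is the shape: postings and categories live in separate tables, joined many-to-many, so one posting can sit under several issues and a visitor can pull up everything filed under, say, “reputation.”

```python
# Minimal sketch of a normalized postings/categories schema. SQLite is used
# here so the example runs self-contained; names are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE posting (
    id    INTEGER PRIMARY KEY,
    title TEXT NOT NULL,
    body  TEXT NOT NULL
);
CREATE TABLE category (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL UNIQUE
);
-- Many-to-many join: one posting can address several issues.
CREATE TABLE posting_category (
    posting_id  INTEGER REFERENCES posting(id),
    category_id INTEGER REFERENCES category(id),
    PRIMARY KEY (posting_id, category_id)
);
""")
conn.executemany("INSERT INTO category(name) VALUES (?)",
                 [("reputation",), ("COPA",), ("network neutrality",)])
conn.execute("INSERT INTO posting(title, body) VALUES (?, ?)",
             ("Reputation defense for teachers", "..."))
conn.execute("""INSERT INTO posting_category
                SELECT p.id, c.id FROM posting p, category c
                WHERE c.name = 'reputation'""")

# A visitor pulls every posting filed under a given issue:
rows = conn.execute("""SELECT p.title FROM posting p
                       JOIN posting_category pc ON pc.posting_id = p.id
                       JOIN category c ON c.id = pc.category_id
                       WHERE c.name = ?""", ("reputation",)).fetchall()
print(rows)  # [('Reputation defense for teachers',)]
```

Wordpress Categories give you this join table for free, which is why it may be the more practical route; the raw schema just makes the “connect the dots” idea explicit.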

I do have a new blog on another site, in Wordpress format. The link is here. The intention is to document bills (mainly in Congress) and court opinions that can have a major impact on expressive freedoms. Right now I’ve started with Internet censorship (bills related to COPA, which was overturned), and network neutrality. I may include trademark soon. Later, depending on future political initiatives, I might cover topics like repealing “don’t ask don’t tell,” health care, filial responsibility and family leave, or other new interpretations of old intellectual property law issues in view of the Internet.

Does this approach invoke the problem of "the knowledge of good and evil"? It's true that people used to depend on social networks (that is, family, church, business, corporate distribution) for their information, and with the Internet that is all changing. The opportunity to take ownership of how one disseminates and gets information can have ramifications for how people avoid and build personal relationships. That doesn't necessarily please everyone.

Friday, February 01, 2008

"Reputation defense" and maintenance for teachers -- a redux


When I was getting out of the Army in 1970, I remember that the Command Sergeant Major said, “well, specialist, after all, you are civilian-oriented.” (That was in a world where, among the “EM,” the word “lifer” had been a pejorative.) Yes, I was to be a free spirit. I could do what I wanted with my life. In recent years, things have started to come full circle.

It’s natural for me to apply the concept of “reputation defense” to my “post retirement” teaching career that aborted on the tarmac. The life of a career teacher may be a bit like the life of a career policeman or soldier.

I’ve given the details before, but I just want to ponder the concept of “reputation” here, for teachers.

As I’ve indicated, there were two “big problems” in my two years of intermittent subbing. The first of these was some feedback that I didn’t seem to want to take charge of classes as an authority figure, ready and “willing” to wield discipline. That was particularly a problem in middle schools.

Let me note, first, that for many kinds of classes – especially AP and Honors – a laid-back approach really works and leads to productive class time. Once a class comprises students who get the concept of “enlightened self-interest” as Ayn Rand would have defined it, you hand out the lesson plans, help with the appropriate classwork-related Internet searches, and let them fly. (Sometimes there is a test, especially in calculus.)

But the demand in the teaching area is mostly with younger, underprivileged, or special education students. Sub instructions in one school district said to greet the students as they came into the class (often not practical) and to maintain close eye contact or supervision. Those instructions were general for all grades, but they presume the lowest common denominator.

Now I wonder if such aggressive intervention is really appropriate or in the school’s best interest, especially from a short-term sub there only for a couple of days. I think it isn’t. But what really disturbed me about several incidents was that, it seemed, the school wanted me to posture, to pretend to be something I wasn’t. Now, acting is fine when you’re in a play or in the movies. But not when people really are supposed to believe it – which raises a certain irony in the subject of “reputation.”

What was I supposed to portray? The image of a substitute parent (the notorious “in loco parentis” concept)? The optimistic social poise that goes with raising kids, so energetically depicted in Rafe Esquith’s recent book “Teach Like Your Hair’s on Fire”? I, even given my thirty years of adult “urban exile,” was supposed to convey the image of the proper heterosexual male: in my case, the grandfather who can “protect” his lineage like Victor in “Days of our Lives.” I was supposed to leave the impression of someone who could fulfill both the competitive and emotive requirements of marriage with children, even attachment parenting, or, failing that, at least find personal fulfillment in subordinating myself, in absolute loyalty, to the family commitment of someone above me. (That used to be the expectation of unmarried female teachers; look at the 1872 teachers’ rules. It’s really OK for a much younger teacher to live up to this, even a male: President James Garfield started out as a schoolmaster and actually boarded in his pupils’ parents’ homes, and that does require “reputation.”) But at over 60, I could not manipulate the environment to leave that impression.

Then there was the flap about my web activity, already covered before. I did raise the possibility that the federal “don’t ask don’t tell” law for the military could raise legal questions if an openly gay man were expected to give custodial care. I know it upset a couple of school principals that I even made such a suggestion. As far as I am concerned, that’s another reason to repeal the law. But the main concern was over a screenplay script for a fictitious short film which, because of the apparent resemblance of a character to me, could be viewed as incriminating. This, in the context of the web and search engines, raises legal questions that have never been faced before. My intention with this "dreamcatching" was to present a thought experiment as a "reactive political protest." Although I got the job back and taught again for a semester, I finally decided to stop. The recent and subsequent heavy media coverage of seemingly related problems, though for incidents that were actually very different, certainly could make my "reputation" appear worse, even "ex post facto."

To become a permanent teacher, even at age 64, I would need to apply for a graduate education program and pay tuition for 15-24 hours of classes. Though requirements vary and there is financial assistance of various kinds, one needs “recommends” from schools, or references about working with “kids,” both to get into a graduate program and then later to get a job. The real question is “reputation,” and it is a slippery slope. In the world of public school teaching, it has a lot to do with meeting popular expectations of being a (for me, male) role model and authority figure, and being able to use social manipulation to transmit messages determined by others. One has to “live the life” as well as “do the job.” In that sense, teaching is a bit like the military. It requires an “enlistment.” Teaching did not always enjoy a good “reputation” itself as a career until more recently, with the emphasis on NCLB. Low pay and government bureaucracy were real downers from the point of view of go-getters; it took a world forced to accept the need for “service” to bring about a change of heart. Any career like that (teaching, the military, law enforcement, even medicine) has narrower ideas of acceptable “reputation” than the open society at large. A teacher has to be more concerned with personal service to the next generation than with his or her own content issues.

One observation is particularly troubling. For several decades, we have accepted the idea that single adults could live the way they wanted, although they often did so in urban settings away from the “real lives” of “families.” That has changed recently, but so have expectations of socialization and interdependence, as the big problems of the world have become more pressing since 9/11. It’s not good to come across as an “unsuitable parent,” even if one never had or wanted kids.

I still recall that shortly after my William and Mary Expulsion in 1961 (Nov. 28, 2006 on this blog -- see archive links), my father warned, "From now on, you have to worry about what everyone thinks." Forty-five years later, it seems that I might well have become a mathematics teacher, and a good one, in "retirement" had the 1993 "don't ask don't tell" law, intended only for the military but with serious secondary implications for other similar areas, not been passed. Reputation in the more communal social context of ingrained family values is often beyond the analysis of defamation law as we normally apply it.



Update: Feb. 5, 2008

I've made a correlated posting about substitute teachers and Internet reputation on my issues blog, here.