Sunday, August 31, 2008

"Pumpkins are orange": notes from kindergarten in 1949, and a prologue for today


Although most of my substitute teaching occurred at high schools and a few middle schools, I did experiment for a while with a few assistant assignments at a top elementary school in Arlington, Science Focus, with its outdoor wildlife lab on an interior court.

Just once, I assisted with kindergarten, and noticed that the teacher, a young woman, would often assemble the kids to sit on a carpet for various kinds of fast-paced drills. Most activities occurred together or in small groups around tables, in a room filled with colorful pictures and maps and tops. The classroom was a garish place, looking a bit like a Disney movie set. Specialized skills, including math drills, were taught by visiting teachers, some of them young men. The idea of a group splitting up and going separate ways and coming back together was barely introduced. At all times, moving from place to place and in the cafeteria, kids had to be closely supervised. The degree of attention was a bit of culture shock for someone who had not fathered kids.

When I was growing up, public school started with first grade (I remember “registering in August” as a big deal). The teachers were all women then (until junior high school), many of them unmarried. The Snopes account of the 1872 Rules for Teachers makes for some amusement in these days of First Amendment battles. But in those days, teachers sometimes lived in the homes of their students’ parents (as the eventual president James Garfield once did).


For kindergarten, my parents sent me to a “private” school, run in the home of a married (possibly widowed) woman who had done this for years. Most of the time the class met in the basement. My parents said that you go to kindergarten to learn to “behave.” Indeed we did. (There were no SOL’s then.) You learned social expectations. I recall a drawing exercise around Halloween involving pumpkins. The teacher got after me for drawing them red. “Pumpkins are orange,” she said. Well, at the time I liked red (a primary color). Maybe I shouldn’t have, because for discipline a child had to go sit in “the Red Chair,” which was not Jay and Mark Duplass’s “The Puffy Chair.”

Later in the school year, probably the spring of 1949, the teacher tried a mischievous social experiment. She divided the class into two parts, with, it seemed, a higher proportion of girls and better “behaved” or functioning boys to go upstairs, while the rest of us stayed in the basement. The physical metaphor was reinforced with her terminology, which might be offensive in today’s world. She called us “brownies and elves” (starting in the basement). Yup, this was Virginia, nearing the end of segregation. Times do change, and no teacher would dare use that first term today. In fact, I don’t think that this experiment, with its intended tone, would be acceptable today. I was very much in the basement, still. That image and experience would stick in my mind and I would remember it for years. I would remember it when Washington’s baseball Senators would finish in the cellar most years, a team whose mismanagement reflected the serious racial problems in the City in the 1950s.

All this happened less than four years after “The Greatest Generation” had won World War II, partly to defeat an ideology based on the idea that some groups of people were “better” than others. Several years before Brown v. Board of Education, we were hardly ready to apply consistently, in our own land, the principles we had fought for.

But the private kindergarten teacher, operating on her own property, may have been conveying a legitimate message. In any free society with competition, some people will be “better off” than others at any particular time. Life is supposed to be a meritocracy, maybe, but some of it will depend just on plain luck, too (just as a baseball team doesn’t get any slack for injuries – as they say in Army Basic, “That’s the breaks!”) So a moral system will expect that those who have more (the “Elves”, upstairs) will share with those who have less, and sometimes protect and look after them. This part of it I don’t recall in as much detail, but the point is well taken. I do remember being allowed just a peek at the class upstairs. Just a glance, and then back to the basement.

Of course, in that time, the “Family” was supposed to take care of problems like this. Since that time, we’ve expected government programs to make up for inequalities, and that may be necessary as long as we understand what we are doing. (There was plenty of hardship exposed in personal testimonials before Obama’s speech in Denver Thursday night.)

We have, in the past few decades, developed a much more individualistic culture, one that looks at everyone in terms of “personal responsibility” on a global scale. From a biological viewpoint, Nature doesn’t work that way or cater to hyper-individualism as a moral paradigm. Within any family or community, people are born with somewhat randomized sets of abilities, quantitatively and qualitatively. To have gifted citizens, you have to have “others,” upon whom the more advantaged depend, like it or not. There is no nice way to say it. Lion prides, for example, will not raise their less well-off cubs. Human culture cannot behave that way (we fought a war over it, remember). So a moral system expects individuals to share responsibilities without too much undue emphasis on origin (or "upward affiliation"). Remember, the teacher’s stratification of students was far from perfect or “just” on an individual scale; it was not intended to be. "Merit" could sometimes deceive everyone, only to crumble on feet of clay.

American society, perhaps derived from Victorian values as well as religion (Catholicism and various more conservative Protestant and non-denominational strains of Christianity), had a rule at one time that “religious right” pastors still sometimes utter: “No experience of sexuality unless married.” Note: “unless”, as well as “until.” “It’s just that simple,” one evangelical pastor said on an early Sunday morning NBC broadcast recently. “One partner of the opposite sex per lifetime.” Well, everybody gets one chance, then. And, you’re right, not many people take it seriously today, or believe it could be enforced. The Supreme Court in 2003 struck down the idea that criminal laws could be based on such a concept. Mainstream America probably no longer wants this.

It is important, however, to remember what such a “rule” pretended to accomplish. It was a convenient way to promise that everyone (child, adult, or elder) within the family unit would get taken care of by those in “power” (the married couple), without too much self-consciousness. The committed marital couple had tremendous power to manipulate the "complementarity" within its family unit for the "common good" of the family as a unit, not necessarily for the best interests of each individual on his own. Everyone, whether or not individually having his or her own children (by getting married as an adult), shared family responsibility. There just was no other possibility, because (in theory) there was no other legally or “morally” acceptable access to sexuality and any self-expression based on it. Notions of immutability and “second-class citizenship” had no meaning in a strictly family-driven culture.

Of course, such an arrangement accepted gross inequalities among groups, and exploitation of one group by another. That’s one reason why sodomy laws and prohibitionist sexual mores came to be seen as a ruse. Individualism fit well with egalitarianism in some ways, if you could accept the idea that you really earn what you have. In a real world, it’s never that simple. In an individualistic world, there are still unfairnesses and dependencies that are often out of sight. One by one, on an individual level, some people could be left behind. Until that became unacceptable. (After all, our most controversial education law is “no child left behind.”)

So, today we have a system of individualism, particularly the idea that the individual receives and sends information from and to the public without the consent of others “in power” (especially the family). This is very important in the West, and much less accepted in, say, China and Islamic countries. Nevertheless, we find individual sovereignty challenged by concerns over sustainability (global warming, energy, pandemics, religious terror), and demographics (demands for eldercare with fewer children, with many childless adults, and much more visible media attention to impoverished “other people’s children” without families). We have to be very conscious of “fairness” as we contemplate ideas like national service or sharing the military burden, and possibly renewing older notions of family responsibility, and we wonder if our openness about speech is being seen by many as an invitation for abuse and sometimes grievance airing or even retribution. There is no simple behavioral rule that takes care of these things. Or maybe there would be, if we could see how “the Golden Rule” applies to an open, free, individualistic society with fewer structured obligations than in the past.

Along these lines, for LGBT people (for simplicity I now refer to “people,” though the “self-righteous” are going to insist on talking about “behaviors”), there are proposals to accept marriage and to accept “open” gays into intimate services like the military (which does not seem to be so difficult for other democracies like Britain and Israel). These ideas comport with ideas like “pay your dues” or “do your share” (or even “everyone serves”). But in some minds they disrupt the deep sense of social support that some perceive as essential for lasting marital intimacies. We wind up with a culture that is permissive but unequal and potentially unstable. We say to those who do not conform to the personal motives that normally accompany blood loyalty: go ahead, and have your own life for a while, but if there is sacrificing to be done, we’ll come after you, or you’ll wait in line behind us.

Friday, August 29, 2008

Must journalists avoid talking about their personal lives?


Should professional journalists keep their personal business out of their reporting, and refuse to disclose it?

That impression certainly exists in the major media, where there is an understanding that bringing personal life experiences into reporting would undermine the objectivity of the news.

This expectation is strongest with respect to personal relationships. Generally, consensual adult relationships are not thought of as having public significance. For example, CNN “360” host Anderson Cooper has said, “The whole thing about being a reporter is that you're supposed to be an observer and to be able to adapt with any group you’re in,” and so he never talks about his personal life. (Source: the Wikipedia article.) This idea generally would apply even to marital matters and children. But sometimes the “rules” are changed and programs are aired giving the personal histories of reporters. ABC aired a major special about Barbara Walters recently.

Of course, a journalist’s professional history can be interesting and relevant to his or her reporting today, and sometimes that history can mesh with family matters.

But what could be more interesting is if the person did have a traumatic event earlier in life, and if that event or sequence really does relate to social and political issues today, even if the events occurred many years or even decades ago.

I have been getting into “blogger journalism” and I may indeed want to try “real” journalism. I’ve covered a number of the “reputation” issues on recent posts (including CNN’s policy regarding employees’ speaking for themselves on issues). In my case, events that happened in the 1960s, particularly, are relevant to today’s debates on a number of issues, particularly “gay rights” and especially “don’t ask don’t tell”, and in another sense, to issues like national service and the possibility that conscription could resume.

I made a related post today on my GLBT blog, here.

Thursday, August 28, 2008

Reputation defense: social networking profiles and self-publishing represent different, if overlapping, issues; which appeared first?


The debate and media coverage of “online reputation defense” is starting to bifurcate into two somewhat overlapping areas. Actually, it’s the second area that major broadcast media and now legal books (and even reputation defense companies) seem to heed the most. That is, as pointed out by Palfrey and Gasser in their new book “Digital Natives” (see my “Books blog” for Aug. 26), a whole generation is living a sort of parallel or “second life” in cyberspace, particularly making social and business contacts and developing (at least outside of marriage) a social and public identity. It’s pretty easy to imagine that this would concern employers, particularly in business models predicated on building and maintaining personal “client lists” and “bringing in business.” The distinction between work and “private life”, so sacrosanct two decades ago, particularly with respect to LGBT issues (remember how things were with the HIV crisis?) seems almost obliterated. This development is particularly interesting as politicians realize that they will probably have to scrap the military’s “don’t ask don’t tell.”

But it’s the first area – self-publication with a particular context – that most concerns me, as it has since the late nineties, once search engines became effective, but even several years before social networking sites came into being. My idea (“do ask do tell” would make a good colloquial buzzword for it) was to accumulate libertarian-oriented political arguments within a certain logical structure, and publish them with very little or no bureaucracy, and have the ideas get around through search engines or even “word of mouth”, so that special interests in the political scene could not easily continue to present public policy as a battle between the special interests of various groups. Although I had a lot to argue about individual liberties (especially LGBT issues), the format of delivering the message was a concept that generally appeals to conservatives, who often criticize the influence of pork and special interests (although during the Bush years they have been guilty of it) and the collective playing of the “I’m a victim” card.

The self-publishing involved personal history, but that should be seen apart from “personal information.” Some of my own history relates to the arguments that I make (particularly about the military policy, as well as various free speech issues, and eldercare). I think that the history strengthens the arguments. But it is true that the history “identifies” me publicly and that can be problematic in some kinds of situations. From a “social networking” point of view, my intention is to meet the “right people” because of the work I have done and published, not just to meet them directly through friends’ lists or something like a “Gossip Girl” world. That is a technique that actually works, but it may separate me from people who need me.

One important aspect of the “self-publication” approach is that I develop all my own materials and present them. I don’t depend on an organization to speak for me or protect my interests in an adversarial battle. I don’t depend on HRC or the AARP to protect me. I don’t want to admit to depending on them.

It also means that normally, I won’t be willing to work publicly for specific political candidates, because no candidate can take care of me. I have to be responsible enough for myself. Pretty idealistic, perhaps.

My own history with self-publication raised employment concerns, but they were more a matter, from my perspective, of “conflict of interest” than “reputation.” In fact, because of “reputation” concerns and possibly also torts (invasion of privacy, maybe even accidental libel) I stay away from personal rumors about others: who your boss dates, who was stripped on the gay disco floor last weekend, etc. That sort of thing has caused trouble for others, particularly in the workplace, even before online social networking came along. (In fact, I feel that even if you’re a celebrity, you have a right to go where you want without the amateur paparazzi.)

When I worked on my first book, covering, in large part, the “gays in the military” issue as it evolved in the 1990s, I was working for a life insurance company that, among other things, specialized in selling to military officers. Even though I was just an “individual contributor,” I feared a conflict of interest. We vetted the issue with HR, and eventually I applied for and got a transfer enabled by a “friendly” corporate takeover. (Some of these are actually good for employees, you know.) One point that came out of the discussions was that I never made decisions about others, particularly underwriting decisions. You see what the locus of concern for corporations was in earlier Internet days. Nobody was worried about “reputation” in the sense of drinking or drugs. But I can imagine troublesome scenarios that could occur (even if they didn’t). What if I had had access to HIV-related information about servicemembers? I didn’t, but the public wouldn’t necessarily know that.

One result of this vetting process was reinforcing the idea that different policy issues really do link up in surprising ways (often overlooked by politicians), and that there are a lot of dots to connect on this issue “game board.”

Over time, I came to see how this could extend to many areas. Of course, employers guard their trade secrets, but in some cases general public statements made by associates could be interpreted as giving out compromising information. Of course, consumer data security has become a much more sensitive issue today than it was even ten years ago. Moreover, public statements could be construed as hostility to certain classes of people, an issue for management employees (or underwriters), or people who give grades, like public school teachers.

Around 2001 or so, we started seeing occasional debates on the Internet as to whether employers needed to have off-the-job “blogging policies.” Once in a while, we would hear of a sensational case of someone being fired (most famously Heather Armstrong in 2002, over posts on her “mommy blog” dooce.com, which gave English the verb “dooce”).

The social networking site “reputation” and “friends” problems (as discussed Aug. 25) overtook these “conflict of interest” concerns, and we know that many employers started using search engines and social networking profiles to do “background checks”, often with very misleading results since none of these products were designed for that purpose. Even so, older concerns remained. I reported on this blog on Aug. 2 that CNN apparently has a strict policy on employees’ commenting on issues, and I suppose that other news organizations may have similar policies. (Visitors: does anyone know?)

I had to figure out what I would do as I planned on reentering the job market. I wrote a few pieces on the “conflict of interest” issues with self-publication, one as far back as 2000. The main piece is the “personal blogging policy,” which I posted here in February 2005; it gives links to other discussions. Nancy Flynn wrote a book “Blog Rules” which SHRM (Society for Human Resource Management) and the AMA (American Management Association) sell, and it reflects the pre-Facebook concerns over blogs and personal sites (link is there; I reviewed this before I had my books blog).

One point that I made was that people in certain kinds of jobs (where they are paid to make decisions about others or to speak for others rather than for themselves) probably should not self-publish in areas accessible to search engines at all, unless they are carefully supervised, and their content remains “static” (the way a published book does) and is allowed to age and become dated. I didn’t feel that merely not mentioning place of employment was “enough”; I rather bought the idea that, in the “digital native age”, one has just “one identity”, no matter how many “cyber dominions” one lives in. The “advent” of social networking (however sudden) may not make as much difference as one thinks: profiles can be kept private and away from search engines, although in practice it’s unclear how “private” such profiles have really remained.

When I was substitute teaching, I took only short-term assignments and had no authority. Yet, eventually I ran into “difficulty.” (See July 27, 2007 on this blog). Had I worked up to a “long term sub” where I would have the authority to give grades, according to my own rules, I would have had to drop everything. I would even have lost the right to keep using my “do ask do tell” domain name. That wasn’t worth it unless I would make a complete career change.

One may ask, why not just restrict talking about certain subject matter, as part of a “blogging policy”? In a narrow sense, that can work: don’t talk about your own workplace. (A lot of people have gotten in trouble over this on personal blogs, not just Heather Armstrong.) But imagine what it means to demand, “don’t talk about your sexual orientation.” (Hint: the military.) Don’t talk about the elderly, because you’re too close to a situation. Don’t talk about anything “controversial,” thereby “admitting” that it is somehow an issue for you. That could undermine “business” or relationships with clients, who have to “trust” you. Well, maybe it could, so that’s why some people probably shouldn’t talk about anything outside of approved channels. It’s not a matter of censoring content. It’s a matter, perhaps, of standing, and of the way one presents oneself publicly in relation to the job one has taken on.

One lesson from all of this would be: when you advance in a career in a way that makes you publicly visible and that presents your “identity” (even as a “digital native”), make sure you are doing something that represents your values. You may have to stake your personal “reputation” on what you do. Make sure you really believe in it.

There is a corresponding lesson for “business” here: if it has (rather suddenly) become so vulnerable to the “reputations” of its individual agents, maybe business needs to reconsider how it operates. Maybe the old time ideas of manipulation and selling do need to change. Customer service definitely needs to change.

One other thing about all of this: what’s in it for me? Maybe to sell the “life story” (chuckle) as a movie? Actually, I’m working on that. But shorter term, it’s a good question, since there isn’t a lot of obvious “reward” in it. I do have the satisfaction of showing that one person can “write everything down”; the fact that someone does that keeps the politicians on their toes. Actually, I think that’s true. Maybe I could prove it. But who benefits? What specific human being (whom I’m accountable to) is better off? That’s a good existential question with some potential significance, even legal meaning in some situations. Is "being right" in public a fundamental right, or does it depend on "being accountable" (to someone specific)? Could all of this mean that we all have a moral obligation to become “partisan” eventually? (I use the mathematical meaning of the word “eventually”, as compared to “frequently”, here.) I do get the question, why won’t I run for office (to make myself “real”)? And I thought I’d answered it. But, maybe I will run some day.

Monday, August 25, 2008

Employers check social networking site "friends" lists


Tonight, NBC4 (at the end of the 6 o’clock news hour) ran a quick story about a “new” issue for jobseekers with social networking profiles. Employers now look at “friends” lists on Myspace, Facebook, LinkedIn, etc. for more potential “references.” One employer, who screens candidates who must handle clients’ money, said that this is an important part of a background check. Michael Fertik, founder of “Reputation Defender,” appeared and affirmed that the “friends list” (the “company you keep”) is coming to be perceived as part of someone’s “online reputation.” “It’s a two-way street,” an employer said. The days of resumes and prefabricated “reference” lists are over. And yup, technology that allows instant fame, perhaps with effort but without "competing" in a normal bureaucratic manner (or taking on "family responsibility"), can invite the quick snap judgments of others.

Indeed. A lot of questions come to mind. How does the employer know for sure that the right profile has been found? How does the employer track down the “friends” and contact them? Is the candidate told this will be done? What about a candidate who does not use social networking sites at all? What if a candidate simply keeps the list "private"?

Of course, there are many lines of work based on the idea of building referrals or leads and bringing in business from individual clients. It makes sense that an employer would regard online social networking as relevant for such jobs. And there are jobs where someone is entrusted with personal information and assets.

But there should evolve some sense of “best practices” in these situations.

It’s all too easy to imagine other uses of this sort of gumshoeing. Say a co-op board in a building checks the Facebook “associations” of an applicant to move into the building. Maybe even a landlord would do that.

Or, imagine a more typical "professional" job hire. The employer looks at the Facebook Friends lists and finds only "other" screenwriters. Nothing wrong with that, is there? But it might convey the idea that the candidate just wants to work for a while to spy on the industry and then make a movie!

Reputation management now is more than what you write or depict on the web, viewed objectively. It has to do with the motives your presence (in totality or in parts) conveys, and the company you keep, and what others say about you, it seems. The whole idea of "private life" that developed in the 70s (especially with respect to LGBT issues) evaporates in the age of the Internet. In the 1970s, I used to think of Manhattan and the outer boroughs as separate universes. Now, to paraphrase Clive Barker, they are all "reconciled."

Imagine, too, how ridiculous the military's 1993 "don't ask don't tell" has become in the age of the Internet. "Unit cohesion" can supposedly be affected by "reputation."

Social networking sites were not originally designed with such use in mind. I don’t think Mark Zuckerberg had any idea that the service (now Facebook) that he launched from his Harvard dorm room in 2004 would come to this.

I could not find today’s story at NBC4, but I found an earlier one (Business Week) from May 2, 2008, “Do Online Reputation Management Services Work?”, link here, with mention of an “Online Reputation Management Association.”

Reiterating comment policy for these blogs (yes, times are changing!)


Recently, I have gone through a number of my blogs, and reviewed older comments, made before I started monitoring and moderating them (which I found I needed to do last spring after some abuses). I found a few that appeared spammy or that had potentially harmful links, and deleted them. In a few cases, I found the same comment repeated, on the same or other blogs, and deleted all but one.

I took this action partly because of the increased attention in the community to “spam in blogs” and in blog comments.

I noticed that in one case, Blogger had deleted a comment because it had a potentially harmful link that I had missed (probably a malware download). But I found this only once.

I’ve also noticed that I do not get as many “spammy” comments to reject as I used to. Some of them now seem to be getting trapped by robots before I get them. I did not put the captcha requirement into my profiles. But still, the volume of inappropriate comments received is down. I am grateful for that. This also seems true of my wordpress blog (about law and technology) where the ISP is Verio. The number of comments that need to be rejected is down there, too.

I accept any comment that is on-subject and that stays within the normal parameters of “terms of service” content acceptability. I don’t reject a comment because it disagrees with what I said, or even if it criticizes me or my work. I do reject comments with links to “adult” sites (I found one with a hidden such link), or to any site offering malware. Usually I reject comments with links to “SiteAdvisor” red or yellow sites. Theoretically, I could run a legal risk if (even just to identify what an unfamiliar linked site is) I follow a mislabeled URL that actually causes certain kinds of illegal content to download on my machine (before SiteAdvisor knows enough to intercept it). It’s OK to link to a commercial site (promoting a product or service) if there is some credible relationship between the link and the content of the blog. I prefer that a comment not make repeated deep links within the same site unless really necessary to argue a point. I found at least one comment that made repeated links but had been bloated with phony text to make it look legitimate. (I can't allow links to sites that directly offer porn, hentai, etc.; it is OK to link to sites that talk about these as "subject matter".)
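(Just as an illustration, not something I actually run: here is a minimal sketch, in Python, of the kind of link screening I do by hand. The domain lists and limits are hypothetical placeholders; real screening still relies on judgment and on tools like SiteAdvisor.)

```python
import re

# Hypothetical lists; in practice I check each unfamiliar link by hand
# against SiteAdvisor ratings rather than maintaining a blocklist.
BLOCKED_DOMAINS = {"malware.example", "adult.example"}
MAX_LINKS_PER_DOMAIN = 2   # discourage repeated deep links into one site

LINK_RE = re.compile(r'https?://([^/\s"\'>]+)', re.IGNORECASE)

def comment_passes_link_checks(comment_text: str) -> bool:
    """Return False if the comment trips one of the link rules described above."""
    domains = [d.lower() for d in LINK_RE.findall(comment_text)]
    # Rule 1: no links to adult or malware domains.
    if any(d.endswith(bad) for d in domains for bad in BLOCKED_DOMAINS):
        return False
    # Rule 2: no piling up deep links into the same site.
    if any(domains.count(d) > MAX_LINKS_PER_DOMAIN for d in set(domains)):
        return False
    return True

# The first comment would be held back; the second would go through.
print(comment_passes_link_checks("Great post! http://adult.example/page"))    # False
print(comment_passes_link_checks("I disagree; see http://someblog.example"))  # True
```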

Also, I need to dot my legal "I's" here and state that I can't "warranty" the "safety" or legality of any content in a comment from a third party even if I looked at it. (Some lawyers would say it's better not to screen at all!) The applicable law is Section 230 of the 1996 Telecommunications Act (the "good part" of the Communications Decency Act, much of which was struck down in 1997). It states, in part, that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." Electronic Frontier Foundation (EFF) has a reference here.

A blog is a bit like a miniature China, a “dictatorship”, or an autocracy. The owner dictates what goes on there. It’s just that everybody gets to be the lord of his own jungle, like the “Iams” cat.

Picture: An Uptown neighborhood “beast” from my days in Minneapolis.

Sunday, August 24, 2008

High school English: what should students learn from literature assignments? We need the classics after all!


I remember the first day of class in sophomore English, in a stuffy third-floor classroom filled with musty books of the classics, right after lunch, back in 1958. This was the world of the “good books,” the classics. A former college football star, Mr. Davis, still in his 20s, taught it, and gave the course quite a bit of intellectual rigor, all the way to the final exam where we had to psychoanalyze Mark Antony from Julius Caesar. I must add, here, a bit of fact: in my day, senior high school started in tenth grade; ninth grade had been “junior high school” (before middle school came into use), and in “junior high,” English and social studies (humanities) had been linked as “general education,” an interim step between the one-teacher model of grade school and the campus atmosphere (supposedly) of high school.

The routine in those days was to alternate grammar and literature, in three or four week segments. Grammar was basically like algebra: structured, following a pattern. (By the way, taking a foreign language – in my case, French – made English grammar a lot easier to get, as a side benefit.) Our first literature unit was to read Shakespeare’s Julius Caesar. Later in the year we would read George Eliot’s (Mary Ann Evans’s) Silas Marner (New American Library, 1960: It starts with “In the days when the spinning wheels hummed busily…”). Later we would read various poems and short stories. He liked giving pop quizzes (those dreaded “reading quizzes,” as English teachers call them today), and well structured tests. With the Eliot novel, what comes to mind is the child character Eppie, and the cleansing effect of almost “mandatory,” inherited “family responsibility” for someone else’s child after material and “sovereignty” loss. We had themes assigned around various paradigms: definition, character sketch, argument. One girl wrote a theme trying to “prove” that God exists. That same year, I took plane geometry and learned what a “proof” really is.

Today, August 24, the Sunday Washington Post Outlook Section publishes a perspective by private high school English teacher Nancy Schnog, “We’re Teaching Books that Don’t Stack Up” (link). Kids wonder why they have to study stuffy old and irrelevant classics (Nathaniel Hawthorne’s “The Scarlet Letter”), when there are some books, like J.D. Salinger’s “The Catcher in the Rye,” that high school students relate to. She makes the point that Salinger’s classic is often taught in eighth or ninth grade, when it is juniors and seniors who relate to the character. (I remember the early line about “old guys …” and so did some buddies in my Army barracks!) In fairness, teachers really have tried to introduce much livelier books, sometimes with controversy, such as with the Indiana teacher who tried to offer “Freedom Writers” (my post). Now, many teachers assign Zora Neale Hurston’s “Their Eyes Were Watching God.” Students do like F. Scott Fitzgerald’s “The Great Gatsby,” and kids like to take note of the famous passage at the end of Chapter 3 where Nick Carraway says: “Every one suspects himself of at least one of the cardinal virtues, and this is mine: I am one of the few honest people that I have ever known.” The character Nick Fallon (Blake Berris) from “Days of our Lives” seems based on this original Nick.

Now, most high school kids are rushing to finish their summer reading assignments. Schnog writes about an email dialogue with a former high school student who doesn’t see the point of studying classical fiction when there are real problems to be solved. Science fiction comes up in the dialogue. Indeed, science fiction, as the boy would point out, allows one to imagine other moral and social systems and how they would play out. (I think Clive Barker’s “Chinese puzzle” fantasy “Imajica,” however “perverse,” would make great summer reading for an AP class because of the intractable moral, religious and political problems it poses.)

The biggest reason that English departments assign “classics” (which include non-fiction, actually, like many of Shakespeare’s and other classical plays) is to teach context. By taking four years of English, students learn an absolutely indispensable skill for the adult world of work and interaction in almost any area: interpreting media materials within the context of the social problems that the materials address. One learns the historical, technological, political, and social circumstances in both Elizabethan England and ancient Rome in order to understand what the Cobbler is talking about as Julius Caesar opens. By studying literature of many different periods and cultures, one learns what context means in relation to our own issues. One of the biggest practical problems for parents who must monitor their kids’ media and Internet exposure is that minors do not always get the context with respect to which some materials were intended to be interpreted. It’s not hard to see that the whole sequence with the character Eppie in Silas Marner has some meaning in our debate about moral values (and “family values”) today.

One other observation about those reading quizzes. I saw a lot of them as a substitute teacher. Since I am working on a novel manuscript of my own, seeing a lot of these quizzes helps me craft my own work. I can ask myself the question, “if my novel were taught in an English class 50 years from now, what reading questions would an English teacher ask about this chapter?” I can usually guess, based on what I’ve seen (the perennial favorite, William Golding’s “Lord of the Flies,” makes a good comparison). I can tell if the plot and characters are hanging together. The same goes for screenwriting, when writers get together and review sequences of pages, or arrange table readings. The point is to follow the point of each scene and each interaction, keeping in mind the “three part structure” expected by the movie industry.

(For the “best teacher I ever had” see this blog for Sept. 14, 2007).

Saturday, August 23, 2008

When do you own your own life?


I remember anniversaries, particularly when they cycle back to the same day of the week on a perpetual calendar, as affected by leap years. Five years ago yesterday, Friday Aug. 22, 2003, I drove away (back to "The Fifth Dominion") from the six years of a very interesting and productive life in Minneapolis, living in the Churchill Apartments downtown, across the street from the main Post Office, two blocks from the Mississippi River. From the penthouse pool room and gym, I could watch the river, with its barges and bridges (one of which has had to be replaced) on one side, and look at the downtown skyscrapers on the other. For 52 months I had worked for ReliaStar and ING, and had a 1000 foot walk, very convenient in 1998 when I recovered from a broken hip. Even after the “retirement” I could walk to a part-time job at the Minnesota Orchestra, in the Oakwood Apartments near Symphony Hall, fifteen minutes by skyway from the Churchill. That summer, I had worked two months as a debt collector, and actually driven to a location near the airport.
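(A quick aside, purely as a hedged sketch: Python's standard datetime module will confirm the leap-year arithmetic behind that five-year cycle.)

```python
from datetime import date

# Aug. 22, 2003 and Aug. 22, 2008 fall on the same weekday because the
# 1,827 days between them (five years including two leap days) divide evenly by 7.
d1, d2 = date(2003, 8, 22), date(2008, 8, 22)
print(d1.strftime("%A"), d2.strftime("%A"))    # Friday Friday
print((d2 - d1).days, (d2 - d1).days % 7)      # 1827 0
```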

I had driven away from an independent life before, back on Tuesday, June 28, 1988, from my Pleasant Grove condo in Dallas. I had lived there for 9-1/2 years, during the whole Reagan era. But the oil price drops and the “epidemic” of leveraged buyouts and hostile takeovers common then had made my own IT career questionable. I had worked for Chilton for six years, but had accumulated a technical background outside what was thought to be readily marketable. So I could not stretch my luck. When I found an opportunity closer to my “home of origin” I had to go. I wound up renting, then selling the condo for a loss, then having to deal with an unqualified assumption. I eventually got a lot of my money back, and as the military gay ban became a political issue in 1993, my own personal story became interesting publicly. I imagined a “second career” as a writer, and started planning and writing my first book, which came out in 1997. The potential conflicts from that (I’ve gone into detail elsewhere) led to the transfer to Minneapolis and six very interesting and eventful years. For one thing, the independent film community in the Twin Cities is quite strong.

The second time around, the re-relocation resulted in large part from mandatory “family responsibility.” That can occur even for those who do not beget children, and provides a flip side to a quite existential moral debate. That’s one of the reasons why the “gay marriage” debate is so important even to single people not personally wanting a legally recognized relationship, but it seems that only Jonathan Rauch and maybe the California state supreme court are willing to admit that.

When returning to my condo in Dallas or, in more recent eras, my ample highrise apartment in downtown Minneapolis (whether from “home” or from a personal trip, even Europe), I felt a certain pleasure when unlocking and opening the apartment door and viewing the accoutrements of a life that, however modest by 21st century standards of rampant materialism and consumerism, was mine. My property. My connections with people. My values. I could look across the living area and see the computers, books (including what I had authored), CD’s, speakers, and even, for a while, model train set, a kind of miniature Imajica. I would bring up the cable TV and computer, and in a few minutes be back to “normal”. The apartment would disorient some (and suggest "no family responsibility") but at least one celebrity journalist's apartment on television looked a lot like mine. How many people have stuffy "workrooms" that they never show to company (or to television cameras)?

To do what was morally necessary, I had to give up some of that personal sovereignty in 2003. In the sense that “logical consequences,” applied mercilessly to radical individualism, demand it (even Al Gore criticized taking “Reason” too far), I became a second-class citizen as I drove away from the Churchill garage, back through Wisconsin, Illinois, Indiana, Ohio, etc. The trip out over Labor Day weekend in 1997, on the other hand, had been a personal celebration (a police officer in Chicago even let me off a speeding ticket when he saw my authored book and coffin picture in the back). This time, my life had melted away. The Kimball piano, now badly out of tune, had been donated and taken out a week before. The movers had come Aug. 19 and I had even signed a gypsy moth declaration. I had spent the last three nights cleaning up in an empty apartment, with just a laptop and dialup for connectivity. Wednesday night, I had gone to my last event at IFP and watched (at the Bryant Lake Bowl) a documentary film about blogging, which warned that employers would soon start firing people for what they write in blogs. Yes, now in this Myspace era, that has happened.

I say “second class citizen” and the California supreme court explained that in its May opinion. I don’t want to be too overbearing with this, because I had long known that at some point my life would have to take a “time out.” So, indeed, at age 60, this would happen.

I grew up during the Cold War era, when personal sovereignty started to come to be respected, not only as a result of the Civil Rights movement and Stonewall, but because of the practical realization that a free society needs cultural diversity to nurture the talents it needs to preserve itself. It needed its nerds, and started letting them off the hook from the demands of a macho, patriarchal “moral” system designed to guarantee stability for the “average Joe” as he accepted his marital and familial commitments and a personally "competitive" arena. During the Bush years, we’ve seen the pendulum swing back. The cycle has happened before. One thing to remember: marriage, through implementing complementarity, does answer some basic moral questions about individual karma.

Friday, August 22, 2008

I have a "Letter to the Editor" on retirees as teachers published in a local newspaper this morning


I have a “Letter to the Editor” in The Washington Times today, Friday, August 22, 2008 at this URL. The newspaper inserted a “theme” photograph of the new teachers’ lounge at Wakefield High School in South Arlington, in the online version. In print, the letter appears on p. A26. The letter refers to an Aug. 19 Washington Times editorial, "Don't Know Much About Math", here.

I subbed at Wakefield at times. Actually, the last assignment I ever took in Arlington occurred in early October 2005. The assignment included some sections with some severely economically disadvantaged students, and there were some serious discipline problems requiring intervention by security. Within a certain subpopulation of students, “someone like me” simply was not respected or tolerated, even as a temporary “authority figure”. In the minds of such students, I just hadn’t “paid my dues” as a man. I will say that in high schools, such experiences were unusual. I had over 220 assignments in about four semesters of subbing (including many others at Wakefield) and most of them went very well.

Arlington has three high schools: Yorktown, Washington-Lee, and Wakefield. The most affluent area is north Arlington around Yorktown. I graduated from Washington-Lee myself in 1961. W-L has a new building, of which it is very proud; but it was not completed until after I had stopped subbing. Arlington adult education also takes place at W-L. I’ve taken a screenwriting class there. Wakefield is getting a temporary makeover but is due for a new building. Wakefield has the largest percentage of economically challenged students due to the demographics of Arlington County.

I actually had gotten bounced from two Arlington middle schools for “poor classroom management” – an inability to maintain discipline, but in both cases it was a few students who created the problems.

The Washington Times letter, as I originally wrote it, had said “small but significant minority” of students. The paper was concerned that the word “minority” could be construed as specifically related to race. But in the Wakefield incident, I actually believe that race was an issue in the minds of the specific misbehaving students needing big-time discipline. In practice (and at some risk of falling for stereotypes), I have to admit that the most effective authority figure for a disadvantaged African American teenager is often a young adult African American male with some “bearing” (and skill in sports), not an older or elderly and timid-appearing white male. In other places, much has been written, such as this paper citing schools in Florida, about recruiting young African American men to become teachers, link here.

I had written about the discipline problems on this blog July 25, 2007 (not 2008), “Substitute Teaching Redux I”. In the two middle school incidents, it was student age and immaturity, not race or minority status particularly, that became a serious matter with a few students. I also wrote about teacher speech on the Internet – in my own personal experience – on that blog two days later, July 27. The major media may soon be covering this controversy in more detail.

I came of age during the period following “Brown v. Board of Education” (1954), and school integration did occur “with all deliberate speed” indeed. In the 1960s, forced bussing was a serious social controversy, and resisted by most parents. I found a 1978 Time Magazine article on the topic here. My Movies Blog (check the profile) has reviews of documentary films that deal with this issue (race and public education) June 23, 2008 and July 7, 2008.

I've written about substitute teaching problems in my issues blog, too, as in December 2006, here.

Picture (above): The "new" Washington-Lee High School, Arlington VA. When I attended, a three-story brick building was there; my homeroom was in room 307 on the third floor; many of my classes were on the first floor with a view of the athletic field, where the "Generals" played and won a famous Thanksgiving Day football game 3-0 in 1960.

Picture (below): Wakefield High School in south Arlington.

Thursday, August 21, 2008

"Second Career" talk: I don't "play family" for money


At 65, I may transition back to work at some point – I think I must – and again I just want to lay down some ground rules, at least for now. Yes, I am on social security, and until full retirement age (66.2) I would have to report any excess earnings if they happen. Right now, I am working, of sorts – but here I mean a so-called “real job”. And I hope that doesn’t mean a graveyard shift at the 7-11, paying my dues.

I’ll bypass the personal circumstances for the moment and admit that I might well relocate again at some point. That’s OK.

OK, my dream job would be to get money for one of my screenplays, get paid by the production company, hole up in a condo in LA, Toronto, Vancouver (that’s more likely, familiar to someone who watches Smallville) or maybe London or even Madrid and get it through shooting, working 18 hours a day, living modestly. Or perhaps even try Williamsburg. What happened there more than a couple centuries ago seeded our idea of political liberty, but I think what happened there about 50 years ago has more to do with psychological liberty, and maybe “sustainability.”

So, Matt Damon wrote on the Miramax Greenlight site about trying to enter the movies, something like “don’t do it.” The odds are that long. So, seriously, I would like to work on one of the “big problems” about law, society and the Internet. Sounds pretentious. But the list of issues is long enough, and they are interconnected, viz: best practices for employers and “online reputation defense”; new due diligence procedures for consumer identity protection (hint: use NCOA); content labeling and filtering; fighting spam by charge systems; less error-prone TOS enforcement, human computation and images, more realistic page-ranking schemes, more customer-friendly advertising, fair use in blogging, news integration, and, particularly, “rationalizing” consumer home internet security (along with the "Internet driver's license" proposals).

All of this requires some “selling” on the job, but that refers to selling ideas or systems that I had a legitimate “creative” role in developing.

Another idea is to go back to the mainframe / midtier / GUI background I worked in before “retirement” at the end of 2001. I haven’t done it in seven years, but sometimes I think I could get on a plane to Minneapolis and resume the job I had as if nothing had happened. That makes a good dream.

Most of the mainframe jobs are contracts run by staffing firms, which must be starting to worry about online reputations, even if they steadfastly maintain that their main concern is still being able to “do” the job. (It should be.) I’ve talked about this on the I.T. blog, but the short answer has to be that professionals don’t want to invest their public “reputations” in older technology.

With any job like this, I would behave online in a manner agreed upon with the employer. In addition to the “reputation” issue, there are questions about continuing to be visible publicly through blogging if one is in a position to make decisions about others, or responsible for representing another entity publicly (rather than just oneself).

I made a decision to become publicly involved with the debate over “don’t ask don’t tell” and all the concentric issues that followed, in 1994 when I was 51. I would work in my regular job for seven more years, as history unfolded. Because of the public controversy, and my unusual life-experience situation with respect to the issue, I felt it was a rather irrevocable decision. I felt that I made the decision late enough in life.

I have been approached several times in the last few years about the kind of job where one has to develop a business built on personal clients and referrals, in order to provide them individually with specific services or sell them specific financial products. This activity includes life insurance agent and financial planner, as well as, possibly, tax preparer. I would obviously be able to take the courses, pass the tests and get licensed in these areas, and do the analytical, mathematical job on the computer. But I have a problem right now with the idea of a career based on commissions and on selling services to individuals, when I have made myself public with respect to somewhat controversial issues, and want to be able to talk about the issues --- all of them, because they connect up in these never ending round robins -- publicly, which might undermine the idea that I have a client’s own personal best interests at heart.

There’s another part to this, too. My perspective is that jobs based on selling “other people’s work” (not mine) on commission to listed clients seem to require a lot of social schmoozing, which I don’t want to be party to, and even “playing family”, which for me right now is a particularly humiliating proposition. I see a lot of schmoozing in the emails. I still get emails inviting the trading of client-referral lists, which seem to date from when I vetted New York Life in 2005. I see how the business must work. And, frankly, some sales-based businesses seem to depend more on the ability to manipulate people (the "Always Be Closing" mentality from the comedy film "100 Mile Rule") than on the essential value or soundness of what is being sold. Even so, many people see the ability to "manipulate" others as an essential part of "worthiness."

I ran into similar concerns with substitute teaching (“authority” or social “role model” issues in some contexts) and conversion to regular teaching would require a large investment in graduate education courses at my age (I discussed this a lot toward the end of July 2007 on this blog).

I do have a conceptual ethical concern. I do want to land in a media-related or Internet-related area where I would be “listened to” and there is indeed a “privilege of being listened to.” Our culture has not quite come to grips with this yet, but it may have a lot to do with proving yourself as a person (apart from your “work” or “ideas”) in being able to “take care of other people” even if you didn’t have your own children. That reverses a modern mental health paradigm, popular with individualism since the late 70s, that encourages defining oneself first before making commitments. Society, it seems, cannot afford to let adults stay in perpetual adolescence. Perhaps it will no longer be willing to. This is a sea change again, to be sure. So my plans do come across as a kind of chess gambit, maybe like the Benko. I’ve already played some cards, pushed some pawns pretty far.

One thing about a lot of these sales jobs and “people manipulation” jobs. I think there are people whose idea of “non-discrimination” is to see me pretend to play the “family man” role in front of clients or students. That makes them feel better about themselves, pretend to be politically correct, and let them say, “see, we told you so.” I want no part of such “affirmative action.”

A "Second Career" is not synonymous with "Second Life."

Wednesday, August 20, 2008

Rick Warren and "Sustainability 401"


If I were to state in detail my circumstances right now, I could predict what most people (particularly professionals) would recommend. I can anticipate how they will react. But their gut attitudes won’t match those articulated by candidates regarding major issues. Maybe Barack Obama was clever in paying heed Saturday night to Rick Warren’s “It isn’t about me”. Yes, there are things that affect our lives that are bigger than we are. It isn’t always your right to carve out your own meaning in life from the perspective of a narrow liberty interest. Justice is bigger than man is. I know the tenets of faith.

Nevertheless, I still like to reduce moral debates, in so far as they affect me and any one individual, down to a “lowest common denominator” in that “Algebra I” sense. That common factor seems to be this: we often benefit from sacrifices that others made and we often can’t see or perceive those involuntary “burnt offerings” at all. The “moral” issues of my coming-of-age time seemed to revolve around that point. For a time, the issue of the Vietnam era draft and the “unfairness” of deferments seemed to filter down to the ultimate question of individual deservedness. Young men were expected to become competent in defending women and children, even before they married. “Group morality” was somehow different. We studied it in history, and the idea that it could matter individually started to sink in a bit when we studied slavery, and then segregation, and had to ponder how some of us had benefited from the earlier sacrifices of others. But, even during the Civil Rights movement of the 60s, we tended to regard these questions as matters to be resolved by collective action, not “personal responsibility.” When I was a young man and starting out on my own in the working world in the late 60s and early 70s, the radical Left was very vocal about this notion of “fairness.”

We see this concern today in several areas, particularly the growing debate over whether quasi-mandatory national service should be expected, or even the draft should be reinstated (and then, what happens with “don’t ask don’t tell”). We also see it in the discussions of the environment, energy tipping points, global warming, and sustainability as now an issue of personal ethics. The Beijing Olympics reminds us that “sustainability” has come up before: a few decades ago, the Chinese tried to address “sustainability” with the horrific “cultural revolution” and then with the one-child-per-family policy.

Another place we see it is in the debate over “family values.” The issue of “gay marriage” has stumbled along, and made us ponder the idea that intergenerational responsibilities need to be shared personally, and also the idea that we used to depend on certain inflexible social structures (marriage and its whole cultural paradigm) to provide the emotional infrastructure that makes intergenerational responsibility possible. Radical individualism has taken the idea of “personal responsibility” to its logical conclusions, and reached something like a quantum or Heisenberg paradox. Think of a family that has one gifted teen or adult child, and another child who is disabled. (I can name some specific examples, even among celebrities, but I won’t here; I’ve covered them elsewhere in the blogs.) The gifted child can be the most responsible person possible as an individual, but he (or she) still owes his existence or “success” to the fact that his parents took a risk that entailed bearing the disabled child (who cannot follow the demands of “personal responsibility” in the modern individualistic sense) as well. Family responsibility, then, entails collective risks and responsibilities that all must share, according to that ethical thinking. Some of the responsibility may be financial, but a lot of it seems to involve maintaining emotional cohesion within extended families. This expected or almost mandatory motivational and emotional "loyalty to blood" tends to protect the more "dependent" or less "competitive" family members (even as adults) according to ability, and socializes life within the family unit, but definitely not outside of it. Many conventionally married parents with children perceive that they need to count on this filial loyalty to justify the personal (and heavily socially supported) commitments they have made to their marriages. Curiously, even conservatives don’t seem to want to say this too loudly. Carried to an extreme, this idea leads to the unwanted behavior one sees in soap operas, and in other cultures that take patriarchal family values to their limit (radical Islam).

At a practical level, there is considerable cultural conflict between those who have children (or other family responsibilities, "chosen" or not) and those who don't. Some of the conflict increases with "globalization." People with fewer "responsibilities" can "lowball" those with more when competing in the workplace, fueling the debates on unions, offshoring, and the like. On the other hand, sometimes single people do more work than people with families for the same pay (often when "on call" or in a pinch), and the people with families are unaware that the personal lives of their coworkers were disrupted. The "unseen sacrifice" can go in either direction. Elinor Burkett took up this problem with her 2000 book "The Baby Boon: How Family-Friendly America Cheats the Childless."

As I’ve noted before, a system that maximizes individual liberty, in the libertarian sense, involves allowing some unknowable risks (the kind that insurance companies can’t make actuarial predictions for), and these can become disconcerting. Authoritarian cultures (China today still somewhat fits this description) can take care of people within the family or community unit with reasonable stability or even “sustainability,” but must deal with large group injustices, whatever their ideology.

We’ve seen the unpredictable and sometimes tragic results that build up over time when some new freedom is exercised by individuals in large numbers. We know how demographics and behavior combined to produce the AIDS epidemic in the 1980s, but in this country we have learned to manage and live with it, by and large. In a completely different area, we see how the freedom of self-expression and particularly self-promotion on the Internet has created novel issues, and sometimes innocent or unwilling “victims,” that our legal system has not figured out how to deal with cleanly yet. A third issue that ultimately is related will be eldercare, where medical technology offers both opportunities and serious, perhaps unwelcome, challenges. The way medicine is practiced, the elderly can live much longer, but often in a frail and dependent condition (with a ten-fold increase in Alzheimer's Disease in the past fifteen years); there are fewer children to care for them, and institutional resources, which nursing homes and the long term care insurance industry are trying to address, may not be sufficient. People who have never had children and constructed productive “different” lives according to their own value systems may learn that they are more tethered to the hidden responsibilities for others than they could ever have imagined.

Tuesday, August 19, 2008

Chess: recalling being competitive in USCF tournaments


Oh, will I ever become active in tournament chess again?

I haven’t played a rated game since 2000, but as a Life USCF member I still have a rating – it prints on my Chess Life mailing label, as 1752. If I were to enter a random tournament right now, I would play about as well as the Nats have recently. I’m just not competitive, since I have migrated away in other directions (and been pulled away). Being a competitive tournament chess player is like fielding a competitive major league team, eh? (See my posting Sunday.)

I played my first rated game in George Washington University team matches, back in the fall of 1964. I lost my first three games, as I recall, then got a draw with White against a Najdorf Sicilian, and then beat a Dragon by castling Queenside and storming Black with pawns. (I hadn’t learned the niceties of the Yugoslav Attack yet, believe me.) The first tournament that I ever entered was in the spring of 1965, at a place called the Washington Chess Divan, which in those days was near Capitol Hill. I lost the first three games (like the Nats) but won a “Sunday doubleheader” to finish 2-3. I somehow remember a Steinitz Ruy Lopez and an opponent actually allowing a pin against an uncastled king and the loss of a piece.

I played some rated matches in college with other players of my own strength. One match in 1965 ended 2-2 with Black winning all four games, decisively. It’s common with weaker players (or players with midrange ratings) for Black to win more of the games, because it is often White who is the first to patz. With some players at GW, chess became an obsession. One player, the star of the team, started failing his courses and wound up getting drafted and going to Vietnam, although he survived. (We played postal games.) Another enlisted to avoid the draft and actually did very well in languages and in military intelligence.

I wound up in the Army myself after graduate school, and actually played a lot of chess. I helped organize post tournaments at Fort Eustis, and got “orders” to play in the Armed Forces Championship at Fort Meade, Maryland (in the back yard of the National Security Agency, halfway between Washington and Baltimore). In that tournament, I won my only game ever with the Black side of a French Defense Winawer variation where White goes pawn-grabbing (very few White players accept the challenge in practice), and I remember that the game ended as a 70-move endgame race. Later, in December, in a tournament in Newport News, VA, I would defeat Armed Forces Champion Robert Powell on the White side of a Kings Indian where he varied and sacrificed an exchange, and seemed to be winning, when I came up with a sham Queen sacrifice that won the material back (with mate threats) and entered the endgame a piece ahead. I remember the crowd around the table watching a master be defeated by somebody with a rating 600 points lower. But it happens. In any given game, anything can happen, just as in the NFL. Bill Goichberg, who ran so many Continental Chess Association tournaments in hotels all over the country in the 70s, used to say, “Only your own mistakes can beat you.” Is life really like that? Not quite.

In the 1960s I was already beginning to notice that Chess, while pretending to emulate war and the ideas of Clausewitz, provides, with its 32 pieces and 64 squares, a paradigm for the moral ideas of life. Let me digress a moment. I recall that Joe Steffan, in his book “Honor Bound” (leading to the debate on gays in the military), gives an account of his junior summer at sea as a midshipman on a submarine, and talks about playing chess games, but never says how good he was at the game. I believe that service academies used to teach the game; maybe they still do.

There was a moral debate growing in the chess community about opening theory and “style” then. Queen Pawn openings were thought to be more “solid,” and a chess player’s maturity depended on learning to play more “solidly.” That view turned out to be way too dogmatic over time, but I began to understand how many things in life evolve in a manner analogous to “positional play” in Chess. In chess theory, we have debates about when pawn centers are strong, or simply overextended and vulnerable to attack (as in the Grunfeld defense). We have debates about the relative value of a Bishop and Knight (rather sounding like the controversy over the Knights Templar). Sometimes, there are questions about when center pawn majorities outweigh wing majorities. Or when is an isolated pawn a strong battering ram, or an endgame liability? Going into retirement on a sound foundation is like trading into the Endgame. But some of the biggest moral dilemmas in life correspond to the chess concept of zugzwang, or compulsion to move. One can slide into a situation where one is not immediately threatened but where one runs long-term social risks because of karma, and become pinned down, unable to move without harm (like being in a bind in a chess game). The obvious example, of course, is the basic king-and-pawn ending where the losing side must move his King away and the pawn is promoted (this is set up by having the “Opposition,” which again is something akin to moral karma).

The idea of “zugzwang” is showing up in subtle ways in modern chess. In the late 1960s, an opening called the Benko Gambit became popular, because in many variations White (up a pawn) can’t find anything constructive to do and deteriorates. Recently, in some offbeat variations of the Sicilian (and a related position from Bird’s Opening), White has experimented with an early Queen sacrifice, for two minor pieces, where Black is constantly in practical zugzwang afterwards.

The other main period of my life where I was active in tournaments was in the early and mid 1980s while living in Dallas. During that time, the Dallas Chess Club moved from East Dallas to Forest Lane up north near EDS. For two or three years I did pretty well, and got my rating almost to 2000. Sometimes I would beat a 2200 player in a tournament, usually because he got overextended, and it seemed that more than half of my wins came as Black, even against higher-rated players, though I tended to hover near .500 ball (the tournament system tends to force that to happen). One time I got a draw with Black against a 2400 player with a piece sacrifice that netted a perpetual, shortly after the opening (a Saemisch Kings Indian). But another time, in a notorious and published game, I lost in about 17 moves with White by having my Queen trapped without being directly attacked (on the white side of an English). The score is here.
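As an aside, there is a simple way to see why that “.500 ball” gravity works: Swiss-system pairings keep matching you with players close to your own rating, and for close ratings the Elo-style expected score sits near one half. Here is a minimal sketch of the standard expected-score formula in Python (my own illustration; the exact USCF rating formulas differ in detail):

# Elo-style expected score: a rough sketch, not the exact USCF formula.
def expected_score(own_rating, opponent_rating):
    # Expected fraction of points against a single opponent.
    return 1.0 / (1.0 + 10 ** ((opponent_rating - own_rating) / 400.0))

print(round(expected_score(1950, 2000), 2))   # about 0.43 -- near .500 against a close rating
print(round(expected_score(1950, 2200), 2))   # about 0.19 -- an occasional win over an Expert is plausible
print(round(expected_score(1800, 2400), 2))   # about 0.03 -- a 600-point upset really is rare

The numbers match the experience: against opponents paired near your own rating you gravitate toward an even score, while the big upsets, though memorable, are statistically unusual.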

In the early 1990s, after coming back to the DC area, I got active again for a time. A couple of times I beat “Experts” on the white side of a Kings Indian, both times by forcing the exchange of light-square bishops. It’s amazing how often players allow that. In 2005, I assisted the First Baptist Church of the City of Washington DC with a one-day chess tournament for students from DC schools. That would be a good thing to do again.

I had thought about starting a Chess Openings blog, but I’ve wondered about copyright issues if I simply reproduced analysis made by others. It’s amazing that 40 years later openings don’t seem to be played out, despite computers. It would seem that software engineering paradigms for computers playing chess could be related to other efforts to teach computers to interpret images, a recent subject of controversy in the security industry (and subject of a video by Luis von Ahn that I expect to discuss in more detail soon).

I show a score book of some games from the 1960s in the picture. Many of the paper scores have been lost over the years.

I discussed Bobby Fischer’s career in January on the International issues blog, here;
(also check that blog for Sept. 24, 2007 for a posting on Garry Kasparov; I discussed Kasparov’s “How Life Imitates Chess” here.)

Note on the United States Chess Federation: The link is above; its acronym “uscf” appears in unrelated domain names. In the 1960s, it was located on 11th St in New York City, across the street from the Cast Iron Building where I would live from 1974-1978.

Monday, August 18, 2008

From encyclopedias, to news stories, to "gonzo journalism"


As I explained on my GLBT blog, I updated Wikipedia for the first time today, making a small note about another 1993 government study before “don’t ask don’t tell” was passed. I was quite impressed by the existing entry, and particularly by how much social controversy can be conveyed in stating relatively objective and verifiable facts.

In an encyclopedia, a writer does not speculate or offer opinions or interpretations; he or she simply organizes and writes out the known information. Arguments previously made by others about an issue may, however, be organized and enumerated as long as they are properly cited. A well written article would contrast opposing viewpoints in close proximity so that the deeper problems may be understood.

The same observation generally applies to newspaper articles, when they cover events of enough scope to require some review of the subject matter. Typically, a news story about a Supreme Court opinion may be like that, forcing the reporter to review the historical facts about the issue. Even more interesting will be a news story about oral arguments, because the reporter will list arguments made at the event (and the Justices’ questions) that tend to suggest what kind of outcome may occur.

In a commentary, a writer could then propose his or her own conclusions and possible recommendations, based on articulated facts. That’s how we are taught to write term papers in high school and college. The conclusion is supposedly justified by what has been found, not by the personal opinions of the author. At the end, however, the “gonzo journalism” element may enter. Personal life experiences of the writer, particularly if unusual or traumatic, may influence his or her conclusions. We move out of the area of accepted professional journalism (or even neutral “open source”) into “creative” writing or expression, where the writer attempts to change the perspective on a problem from that point forward. His or her right to do so is part of our tradition of free speech. What seems controversial is how much responsibility should be taken for the results of the speech. But it is personal history that does give standing, after all – eventually to deal with “everything.”

Sunday, August 17, 2008

Washington Nationals Baseball Team Is as Inept as Were the Senators from the 1950s


There is a favorite term in Human Resources, called “propinquity” – the tendency of people on the same team in close proximity to bond together and take sides. Even in the same company, rivalries develop. I can recall that at work in the 1970s, we programmers at one location in midtown Manhattan would say about the team in the financial district, “They’re bad.”

And the same is true in sports, especially baseball. The Washington Nationals managed to get their spanking new stadium, throwing out “undesirables” from the SE neighborhood for “real estate development” (some of it subprime, of course), including a number of businesses important to the LGBT community.

At least we could have a winning team. Usually teams do well for a while when they get new stadiums. But not this year's Nats. Right now, they have lost 10 in a row, have a record of 44-81 (a pct of .352), and lost all six games on a home stand. How can you go 0-6 at home? They’re even getting blown out in most games; few are even close. It's hard to believe that they started the season with three straight wins.

You can go to baseball-reference.com and look at sorry teams in the past. (By the way, the site gives Pythagorean projections of W-L records for each team each year based on runs allowed v. runs scored, an interesting statistical calculation for high school math classes.) In 1958, the old Senators were 61-93 and lost the last 13 in a row, scoring very few runs (getting swept by the Red Sox in one series with every game ending 2-0). In 1959, the Senators lost 18 in a row in July and August, including all the games on a western road trip (they were winning going into the bottom of the Ninth in two of the Chicago games -- remember the home field advantage!). In those days, teams were divided for scheduling very symmetrically, into East (Washington, Baltimore, New York, Boston) and West (Cleveland, Chicago, Detroit, Kansas City). Each team visited every other city 4 times a season. In those days, you went on the road for two weeks at a time. The Senators used to dread those "western" trips, which in earlier days were done on the train.
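For math class purposes, the Pythagorean projection is easy enough to compute yourself. Here is a minimal sketch in Python (my own illustration, not baseball-reference.com's code; the classic Bill James version uses an exponent of 2, while refinements use a slightly adjusted exponent):

# Pythagorean expectation: a rough sketch using the classic exponent of 2.
def pythagorean_record(runs_scored, runs_allowed, games, exponent=2.0):
    # Expected winning percentage from run totals alone.
    pct = runs_scored ** exponent / (runs_scored ** exponent + runs_allowed ** exponent)
    wins = round(pct * games)
    return pct, wins, games - wins

# Hypothetical run totals for a weak-hitting, weak-pitching team over 154 games:
pct, w, l = pythagorean_record(runs_scored=550, runs_allowed=750, games=154)
print(f"Projected {w}-{l} ({pct:.3f})")

The interesting exercise for a class is to compare a team's actual record with its projection; teams that beat the projection usually did well in one-run games, and teams that fall short (like bad Senators or Nats clubs) usually lost the close ones.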



The performance of the old Senators was obviously degraded by lackadaisical management from the Griffith family, as well as social tensions in Washington associated with racism and oppressive McCarthyism, which did not bode well for attendance. The “new Senators,” eventually under Bob Short, faced a similar fate during the distractions of Vietnam and Nixon (exception: 1969 was a good year). The “new” Senators moved into boring RFK (first called “DC Stadium”) in 1962. I miss the old Griffith Stadium (near where Howard University is and probably near the Town DC club now), with its asymmetry and erratic dimensions, a reverse of Fenway Park, but bigger. I think the Nationals Park outfield should have been built to imitate Griffith Stadium.

In 1961, the “new” Senators (expansion team) were 30-30 on the date of my graduation from high school, and were 31-70 for the rest of the season. The old Senators are now the Twins, the “new” Senators are the Rangers, and the Nats used to be the Expos.



At least they can’t finish as badly as the Detroit Tigers in 2003, who won only 43 games, in a relatively new park. And by 2006, the Tigers were in the playoffs. (The Mets won only 40 in 1962, and a World Series in 1969.) It doesn’t have to take too long to turn things around.

What an MLB team needs is aggressive management, and a big league player at every position. Yes, we’ve had our injuries, but most Nats are back, and they still lose. You need a lineup in which every guy can hit the ball hard, between the outfielders at least. You need a pitching staff in which every starter is capable of 7 good innings. (Remember the 1954 Indians?) And you need a closer. Can managing a baseball team be that hard?

It’s interesting how the old, big cities, mostly in the Northeast and upper Midwest, tend to have consistently good baseball year after year: New York, Boston, Chicago. The same is more or less true for Los Angeles. Maybe the size of the TV market really does matter.

Thursday, August 14, 2008

Free content, free software, "free" publishing movements face challenges; favorable ruling recently


Remember how “it” used to go with political organizations? They followed something called “parliamentary procedure” or “Roberts Rules of Order.” There were formal rules as to speaking for and against motions. The aim of the organization was nearly always adversarial, to advance the political well-being of some group of persons. Furthermore, the speaker was generally expected to have some kind of “standing” with respect to the issue.

“Standing” is a really important concept in the law when it comes to litigation, especially trying to overturn laws as unconstitutional. A party must be impacted in some demonstrable way by the offending legislation or policy.

In speech, the same idea used to work in a less formal way, certainly in the way formal motions were introduced and processed. The sum of all of this processing was bureaucracy, influence peddling, pork, kowtowing to special interests, and oversimplification of issues. Constituents tend to see the world in a restricted view, related only to their own immediate familial interests in straightforward fashion.

Then, as we know, came the Internet (without invention by Al Gore). Close in behind followed the search engines, blogger software and social networking sites, and other Web 2.0 and 3.0 gizmos. Anyone could write about anything and publish instantly. Search engines became amazingly efficient. There was the opportunity for a previously unknown person to amalgamate ideas or arguments in novel ways, to present new world-views, and sell them. This activity would have been very much “out of order” in the old world.

This opportunity has an effect on a debate. Any speaker can play “devil’s advocate” when articulating for himself, and hold the established lobbyists and organizations accountable for oversimplifying positions to please specific constituents. The material stays out there, and if the speaker, however meager his funding, builds a significant online reputation, he can have a significant impact on the debate as “a team of one.” It sounds self-indulgent. But it works.

Perhaps some will think I overstate things, but I do think I’ve been effective in keeping people honest on a number of issues, ranging from the military gay ban to filial responsibility, to what “family values” really means, and to some of the subtle problems emerging with free speech and Internet law (COPA is only the beginning). Yes, in my ten years online, I’ve seen COPA go down, the Lawrence v. Texas opinion, and the real possibility of repealing “don’t ask don’t tell.”

Sometimes static websites, sometimes based on books available online, are more effective in presenting total world views than blogs. That was my approach until about 2006. But because blogging software now offers categorization and various ancillary facilities, and because Blogger has a “next blog” feature that essentially can make anyone’s blog entry a “breaking news” headline, blogging can add to the body of integrated knowledge with “newness” most effectively. (That’s another reason why splogs complicate life so much.) Wordpress is carrying this capability further, perhaps, with its database features.

Today (Aug. 14), on p A4 of the Washington Times, Matthew Sheffield, in his Analysis/Opinion Poll-Tech column (not yet online), wrote about “How traditional media lose audience to the Web.” Sheffield writes that the media establishment is partial to the liberals, and makes arguments relating to the way the media broke stories about Sen. John Edwards relative to the earlier “scandal” of Sen. Larry Craig. Sheffield makes it sound like the typical case of partisanship, influence, etc. Perhaps he’s right. It makes me grimace when I think about CNN’s (and probably other news organizations’) muzzling of employees speaking for themselves, as I wrote on this blog recently. But the bigger problem is simply that major media news tends to be bullet-point and disjointed. True, there are news programs, like ABC’s 20-20 and CNN’s 360 (Anderson Cooper), that make a genuine effort to connect things. But individual bloggers are much more likely to “prune” the news to make original connections that actually can influence how the public perceives the developing issues.

Most of the blogosphere may seem to be one-subject oriented, to be sure. Today, the Business Section of The New York Times had a report (by Claire Cain Miller) on how well advertisers like “mommy blogs”, especially “irreverent” ones like Heather Armstrong’s “dooce.com” (link). But blogging setups that seem to cover a wide range of topics can be arranged to convey a more unified point of view to those who will follow a range of material as the blogger presents it.

What, then, is the source of discomfort with this? The “wild west” atmosphere at some levels of the Net can become a victim of its own “generosity.” When you offer something for free, you attract bad actors who can endanger the activities of all participants. It can be difficult to filter them out without catching “the innocent.” Various solutions can be proposed, including charging for part or all of the services that we now use freely. Other solutions could include screening those who will participate, and raising the barrier to entry. Or a service could actually require proven benchmark financial results for a speaker to remain active, but that idea could also "backfire."

In the commercial world, and particularly the world before the Web, money was the common denominator by which one measured “eligibility” for things. Call it the Trump Effect. In the Web world and the “Second Life” of cyberspace, fame and recognition can become as valuable to the speaker as the bottom line. But the problem can be serious if the speaker must show “accomplishments” in monetarily quantifiable terms.

That takes us back to the social understanding of speech that prevailed before the Internet. Speech was generally appreciated when it came from those who had “standing” and responsibility for something, or for other people (family members) affected by the issue. That, after all, is what encouraged the Balkanization of issues, partisanship, and ideas of solidarity, but it also seemed part of the moral expectations we demanded of people as social beings. Could people be pre-screened according to such expectations? I suppose web services could set themselves up this way (Christian sites, for example), but I don’t know how much traction they would get. In the radical Islamic world, however, this mentality obviously can lead to the abuses that have become familiar.

There are various ways, perhaps largely theoretical so far, that global speakers can put themselves (and perhaps other family members or persons connected somehow) at risk. ISP’s and publishing services require indemnification clauses as part of any TOS arrangement. Although to date individual users have very rarely been pursued for judgments (just extreme cases) and the legal climate generally limits downstream liability for service providers, there are cases in progress (like Viacom and YouTube) that could eventually undermine this arrangement, and there are technical developments that could undermine advertising models (by making it too easy for web surfers to suppress them). There can be practical concerns that "uncommitted" speakers could attract nuisance behavior when they address emotionally controversial topics without obviously clean self-interest or purpose, thereby risking murky legal problems (like enticement) or security issues for others. (I would add that without "controversy" about -- say, radical Islam or whatever else -- there would be no point for me to speak publicly, because then I would just become "partisan.") Then the business model that allows speakers “free entry” or low-cost entry might no longer be practical, and the practical legal risks of liability could grow. I recall an extended conversation about this (potential downstream liability situations) with an AOL consultant at the Libertarian Party booth at GLBT pride in Minneapolis as far back as 2000.

The legal risks for speakers and service providers could grow if economic concerns were to make major media outlets believe that their ability to earn profits and employ journalists under the usual standards of professionalism had been gutted. That may seem far-fetched, as much news (from combat areas, for example) cannot be reported directly by “amateurs.” But over time, this “fair competition” issue could drive legal harassment of various kinds, as the recent claims of copyright infringement by the Associated Press against some particular blogging sites seem to illustrate.

A recent (Federal Circuit) Appeals Court decision may help us keep our heading. In a complicated case involving patent and copyright both (and the “Creative Commons” concept), the court ruled against Kam Industries, which deals with model railroading software (curious considering my post Tuesday), and for software developers who demand credit and attribution even if the software they distribute is “free.” The New York Times story by John Markoff is “Ruling Is a Victory for Supporters of Free Software”, and appears in the Business section here. The case is Jacobsen v. Katzer et al, Opinion (pdf) here.

Wednesday, August 13, 2008

Students' and teachers' use of social networking sites ("friend" lists) creates controversy


Randy Turner, an English teacher in a middle school in Joplin, MO has set up a Myspace page and encouraged students to create accounts and add him as a “friend” for communicating questions about assignments.

Of course, the practice is creating controversy, since Myspace and Facebook have come to be perceived as largely intended for “social” networking. The word "friend," appropriate in a social context (or even a political one -- just look at the Myspace pages of candidates), may send the wrong message or connotation about the appropriate teacher-student relationship, given the teacher's role as an adult role model and authority figure. Myspace has over 72 million users, and Facebook over 37 million. Of course, repeated major media reports from around the country about various incidents among teachers (especially from about 2005) do fuel the controversy.

Facebook was originally limited to educational institutions but has expanded to everyone over a certain age. Myspace, Facebook, and other sites have age limits that require minors under certain ages to make profiles private.

Of course, in practice, some businesses use these sites for “professional” social networking, although more professional-specific sites (or online profile management sites like Ziggs) may be more appropriate for many companies. Furthermore, adults are starting to use the sites for networking around certain kinds of activities, like screenwriting.

The CNN "Back to School" story ("Online student-teacher friendships can be tricky", by Mallory Simon) is here.

Some school districts around the country forbid teachers to have students as “friends” on social networking sites. Some states, like Missouri, are considering laws prohibiting the practice, at least below certain grades.

High school teachers sometimes place course syllabuses online, and sometimes allow them to be viewed publicly on the Web. Generally, the practice does not cause trouble, but it allows students to compare the content with similar courses around the country if they hunt for them. Professors often do so. Sometimes the URLs are simply not linked so they don't get picked up by search engines, or the pages can be blocked with meta tags or a robots.txt file entry, as in the examples below.
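For reference, the two blocking mechanisms look something like this (generic examples, not any particular school's configuration). A robots.txt file at the site's root can ask well-behaved crawlers to skip a directory (the "/syllabus/" path here is just a placeholder):

User-agent: *
Disallow: /syllabus/

And a meta tag placed in an individual page's head section asks search engines not to index that page or follow its links:

<meta name="robots" content="noindex, nofollow">

Neither method actually hides the page from someone who has the address; they only keep it out of search results.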

It seems logical that school districts should implement closed "blackboard" applications for teacher-student interaction online, and that schools can thereby monitor the "tone" of the online "relationships." Mr. Turner's use of a commercial social networking site would seem to reflect the lack of adequate specific computer networking applications in the school system (or the money to deploy them).

Update:

Mr. Turner's blog is here. The entry is called "Teachers and Myspace." (Please see comment 3).

I have a somewhat distantly related comment about an Indiana teacher's assigning a controversial book ("Freedom Writers") on my issues blog on June 29, link here (with a comment from the teacher).