Saturday, March 31, 2007

Schools ban social networking sites, as the controversy over information sources grows

Even Hillary Clinton is on MySpace, and she has over 32,000 “friends”. Within the past year or so, however, many public school systems and private schools have been blocking access not only to MySpace but to all social networking sites and other massive free-entry portals like YouTube. A typical account is “Schools block MySpace; kids fight back” by Nate Anderson at Ars Technica, link here: Of course, the web is full of unblockers that kids get caught using by school librarians. (I don't know if they are banning Blogger, too, which can certainly pose some issues with some of the personal blogs.)

There is, as always, a deeper problem. All of these sites have a lot of valuable material that would actually fit into a mainstream school program, if submitted and screened by educators. Kids will use the material at home, and hopefully for many of them, the extra material helps them put what they are taught as “approved curricula” in perspective with the real world. Some parents don’t like this, because they believe that, as one of the perks of marriage and family life, they have the “right” to mold their kids strictly according to their own religious and cultural beliefs, even in a pluralistic society. The real “dangers” come at home and parents need to learn themselves the right way to use the Net and pass this on.

But a lot of stuff on these sites is frivolous and, at the very least, distracting (I can imagine what could happen if a student googled the teacher’s name and found the teacher’s profile during class); and a small portion is very dangerous, exposing kids and their families, as well as schools, to harmful people, to say the least. That’s what you live with in a world that respects free speech. Schools say that kids surf social networking sites and personal blogs when they are supposed to be doing school work.

That certainly happens. I see it as a substitute teacher. There are plenty of unblocked sites that students surf when not watched, including hip-hop and fight-club sites. What matters is what always matters: the motivation of the individual students to do the work. You have a certain population of students who will skip when there is a sub, or goof off and sleep, or just play. You also have a population that realizes that the young people who do excel (and become visible in the media) didn’t get there by playing hooky and “taking advantage” of things when there was a sub. (Note: actor Jared Padalecki, who plays future law student Sam in Supernatural, was a presidential scholar in high school, and Ashton Kutcher seriously considered medical school.) They get the fact that, whatever one’s bias in the culture wars, this is a competitive world with very serious problems for the next generation of young adults to tackle in the real workplace (global warming, for starters). Getting a free public education (and AP credits) before needing college loans and running up credit card debt is a good deal. And, guess what, the real world, even for superstars, has lots of 18-hour work days. I tell students that a classwork assignment is training for the real-world workplace (I had thirty-plus years of it in information technology), where there are deadlines and quality expectations. It doesn’t always sink in, and pretty soon you hear hip-hop in the classroom. With a lot of adults at work, it seems like maybe it never sank in, either. A lot of people would miss their “due dates.”

Learning, of course, is hard sometimes. The only way you get comfortable with algebra (a subject which creates enormous mental blocks it seems), physics, chemistry, foreign language, or writing is to do it essentially every day until it is second nature. You can’t skip things.

As for the other-world web sites, we have always had a layering of preferred sources of information. In the 1950s (my school days), years before personal computers, making the effort to type themes was a controversy, and there was a pecking order of preferred references for term papers. Encyclopedias, however well edited, were to be shunned in favor of the original books if they could be found in public libraries (I even used the Library of Congress in high school), and legitimate periodicals (the kind indexed in Reader’s Guide) were always more up-to-date on many things than most books. Today, it is just that much more complicated.

Update: March 14-15, 2007

The Pentagon announced that military personnel will no longer be able to access MySpace, YouTube, and a number of other video self-publishing sites from military computers overseas, since they tie up military networks. In some cases, soldiers will be able to access these sites with their own computers. I have more details on the international blog, here.

Friday, March 30, 2007

RIAA, P2P, MGM v Grokster -- still a problem

The April 5, 2007 issue of Rolling Stone has an article, “RIAA’s Campus Crackdown,” by Steve Knopper. The subtitle of the article is “Thousands of college kids targeted for illegal file-sharing.” The general technique is for the RIAA to compile a list of IP addresses involved in supposed illegal downloads, notify the colleges and universities that own them, and have the schools pass on letters to students offering quick settlements in lieu of being sued. A typical settlement may be $3000 or so, whereas a suit could claim hundreds of dollars per song and sometimes add up to millions for a single student. The practical reality is that students, already facing student loans and debt, probably cannot afford to fight. There are real concerns in dormitories that students could be misidentified, as the article points out. Similarly, home users have sometimes been contacted by the industry with settlement demands (rather like a call from a debt collector), sometimes reaching parents unaware of their kids’ illegal downloading.

One can read Electronic Frontier Foundation’s writeup and many links here.

The industry offers the website p2plawsuits with some rather blunt short answers to FAQs. It also has, on its home page, an invitation: “do you want to settle a case online?”

A related site is MusicUnited. This site discusses the audio flag provisions of H.R. 5252, the “Communications Opportunity, Promotion and Enhancement Act of 2006”, here.

All of this follows the Supreme Court decision in MGM v. Grokster in June 2005, which I discuss at this link. The Court agreed with the idea of service-provider downstream liability when the business model is predicated on infringement, but not with downstream liability in other cases. Many of us have been concerned that the recording industry does not like to see technology used by low-cost newbies who could take advantage of free entry and compete with large companies. The record companies will, of course, deny this.

My own feeling about all of this is mixed. As an “artist” I know that I have to figure out a way to sell my books, web materials and screenplay scripts. That depends on a legal environment in which investors know that copyrights must be respected. On the other hand, I don’t want to see this kind of thinking used to bully out competition that offers material for free or for low cost – and this is a related controversy. My own scripts would not lead to the kind of movies that are obvious targets for piracy.

Personally, there are so many opportunities now to download materials (or rent films) legally for very low cost that P2P infringement sounds like dumb behavior. (Even dumber are the bogus CDs from street vendors. I wouldn't touch them.) However, when the P2P practice started, record companies had been overly insular, not releasing music legally in small units that consumers could afford.

Personally, I don’t engage in P2P file sharing at all now, although that could change if I decide to use it later to distribute some of my own video material. Then I can see running into problems.

Wednesday, March 28, 2007

Citizens' Compendium -- an alternative to Wikipedia?

Media reports and academic circles have criticized the open-entry style (and supposed lack of factual reliability) of the free encyclopedia Wikipedia in recent months. AP reporter Brian Bergstein (Washington Times, March 26, 2007, link) now reports about a new online encyclopedia called Citizendium, The Citizens’ Compendium, founded by Larry Sanger. The site warns that during the last week of March there could be some outages because of server upgrades.

The Compendium requires authors to disclose their identities in most cases, and has much more careful procedures for editing and fact-checking. So far the volume of articles is relatively small (compared to Wikipedia) but the site is looking for volunteer authors.

An author must submit a biography, which would establish the author’s professional credibility to speak about the item. That is important because an encyclopedia usually is concerned mostly with factual presentation and organization. For example, an encyclopedia would present factual discussions of a legal concept (say, copyright, or libel). For a political issue, an encyclopedia would present a factual history, which might tend to be difficult to verify in detail. The problem is that political opinion is much more difficult to present in an “objective” manner. The preferred contributor for an article on “social security” might be a lawyer who practices social security litigation. On the other hand, “social security” as a political and ideological issue is a much more difficult subject to manage. One can present the history of “gay marriage” as a political issue (even discussing the state constitutional amendments and “judicial activism”) but find it much harder to present the psychological problems in a professional manner appropriate for an encyclopedia.

According to Bergstein’s story, there is one possible exception to the no-anonymity rule, when a volunteer’s employer “prohibits outside writing.” The fact that this possibility is even mentioned in the AP story indicates that employers may be starting to implement rules like this. I have, elsewhere, proposed similar rules for jobs involving direct reports or making decisions about others (link). I have not run into anything like this yet with recruiters in information technology. As widely reported, employers have (within the past two years) suddenly become very concerned about what job applicants say on their own blogs and social networking site profiles. Actually, I think that when an employee presents a professional article to Compendium, there is less of an ethical issue (than with personal blogs or profiles), because there is a third party (Compendium) supervising what the person posts in public, and that allows the possibility of writing contract rules in terms of possible conflict-of-interest involving a third party (rather than just in terms of the content that the employee writes).

I have not yet signed on as a volunteer for Compendium, but I may. But I might find I would be restricted to areas in which I had more professional certification (mathematics, information technology).

There have been other online encyclopedias proposed (such as Nupedia). What I would like to see, in addition to a typical encyclopedia reference format, is a database that can trace what people think about various sensitive issues.
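One way such an issue-position database could be organized is sketched below. This is purely hypothetical: the schema (issue, position, argument tables), the table names, and the sample data are all my own illustration, not anything Citizendium or any existing site actually implements.

```python
import sqlite3

# Hypothetical schema: each issue has positions, and each position has
# supporting arguments, so a reader can trace who holds what view and why.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE issue    (id INTEGER PRIMARY KEY, topic TEXT);
CREATE TABLE position (id INTEGER PRIMARY KEY,
                       issue_id INTEGER REFERENCES issue(id), stance TEXT);
CREATE TABLE argument (id INTEGER PRIMARY KEY,
                       position_id INTEGER REFERENCES position(id), text TEXT);
""")
db.execute("INSERT INTO issue VALUES (1, 'social security')")
db.execute("INSERT INTO position VALUES (1, 1, 'raise retirement age')")
db.execute("INSERT INTO argument VALUES (1, 1, 'longer life expectancy')")

# Trace every argument for every position on every issue.
rows = db.execute("""
SELECT issue.topic, position.stance, argument.text
FROM issue
JOIN position ON position.issue_id = issue.id
JOIN argument ON argument.position_id = position.id
""").fetchall()
print(rows)  # [('social security', 'raise retirement age', 'longer life expectancy')]
```

The point of the design is that opposing viewpoints become queryable side by side, rather than buried in prose.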

It will be interesting to see if the academic world accepts the Citizens’ Compendium model as an acceptable research source for citation.

My earlier story on Wikipedia and academics is here (scroll to Jan 23). My database proposal comes from here (scroll to the “Chicken Little” blog on Dec 20), or go to my sample "crude" manual database.

Tuesday, March 27, 2007

For the "Kids" -- bring back essay tests

On Monday, March 26, DC Examiner ran a whimsical op-ed by Fairfax County high school English teacher Erica Jacobs, “A Teacher’s Weekend.” The link is here. She gives a diary or journal of getting through grading 137 essay exams, shortly before quarter grades are due. She even, tongue-in-cheek perhaps, invites the public to help her grade her next set of essay papers.

These days, when I substitute, I see a large number of multiple-choice exams. This is even true in math and physics, where, when I went to school, tests were all problems, with full credit given only for showing all work. I even remember, in pre-calculator days, the mantra of “slide rule accuracy” on physics and chemistry tests. All states now mandate a heavy schedule of exams that students must pass at different grades to graduate; in Virginia they are called SOLs (Standards of Learning). Most of these tests are multiple choice, although there are free-response writing tests. I am by education a math person, and I have to say that the algebra and geometry tests don’t look that bad. (In geometry, make sure you understand inverse, converse, and contraposition in logical reasoning!) Full-time teachers eventually have to pass Praxis exams in their subject areas, and these multiple-choice tests often have many problems of some complexity, more than one can reasonably work in two hours. When I worked for a life insurance company, I got certified in LOMA, which was ten multiple-choice tests, often with compound-condition questions. In information technology, companies like Brainbench offer an array of certification tests, the individual multiple-choice questions often being complex problems (how many rows would be returned by this compound SQL query, etc.). I have been paid to make up at least one such test, and perhaps test composition will provide income in the future of pseudo-retirement. In most states, people pass an easy multiple-choice test to get a driver’s license. On one job, as a debt collector, you had to score 100 on a multiple-choice and true-false test on the FDCPA law before starting work. One good variation of the multiple-choice test that I see chemistry teachers use is one where the student has to write a brief explanation of his choice for full credit.
There are other variations, including guessing penalties, and more than one correct answer (which LOMA handles with compound condition questions: A & D, C only, B only, etc.)
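To make the "how many rows would this compound SQL query return" style of question concrete, here is a hypothetical example of my own (the table, data, and query are invented; Brainbench's actual questions may differ). The test-taker must trace each row against the compound condition:

```python
import sqlite3

# A small hypothetical table for a certification-style question.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policies (id INTEGER, state TEXT, premium REAL)")
conn.executemany(
    "INSERT INTO policies VALUES (?, ?, ?)",
    [(1, "VA", 120.0), (2, "VA", 95.0), (3, "MD", 200.0), (4, "DC", 80.0)],
)

# Question: how many rows does this compound query return?
# (VA policies over $100, OR any policy over $150)
rows = conn.execute(
    "SELECT id FROM policies "
    "WHERE (state = 'VA' AND premium > 100) OR premium > 150"
).fetchall()
print(len(rows))  # 2 -- ids 1 and 3 satisfy the compound condition
```

Row 1 passes the first branch, row 3 passes the second, and rows 2 and 4 pass neither, so the answer is two. The tricky part of such questions is usually the parenthesization of AND and OR.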

Nevertheless, the real world of work is not a multiple-choice world. It is about getting jobs done, getting things right in a production environment day after day (after careful testing and implementation, in information technology), and often about people skills and the ability to persuade others to accept your employer’s message (that is, the dreaded word, selling). And the real world of social justice and politics is multi-faceted indeed. It is about understanding how people very different from the self think, and why. And the best way to see that students understand this is, guess what, to make them put it in words and write essays.

In 11th grade at Washington-Lee High School in Arlington, VA back in 1959-1960 (then, during the Sputnik era, one of the top ten public high schools in the nation) I had a “Virginia and United States History” teacher who insisted on giving all-essay tests. A typical test would have identifications and then four essay questions, to finish in a 55-minute period (we didn’t have alternating blocks in those days). You got writer’s cramp. I made a 79 on the first test (that was a D according to the scale) because I lost 15 points on a question about mercantilism in the British colonies. (He got that test, given on a Monday, graded and back to us the next day!) I don’t remember a lot of the other questions now. In those days, it was acceptable to use the word “Negro,” and he once asked a question about the role of the former slaves in presidential elections during Reconstruction. We had to write an in-class book report on John Kennedy’s Profiles in Courage. As the year progressed, he would offer choices of questions. On the final, we had to answer ten out of 25 questions, and I was barely able to do that, but I eked out an A in the course. I recall one question about the significance of the Fall Line, a concept that would mean something to East Coast residents but probably be overlooked in California schools. The teacher created so much controversy that he had to give one multiple-choice test at mid-term. The big bugaboo for him when he graded essay answers was “leaving out” things. He wanted you to understand all the reasons something happened. Whenever you came to class, you waited for one of two commands: either “Take out a paper and pencil for a quiz please” (yes, a short essay, and it could be on current events), or “Turn in your texts…” (The Big Relief for the day.)

We had essay tests in other classes. On a tenth grade English final, I remember that there was a question on Julius Caesar regarding Antony’s motives. In government, more of the tests were objective, but the semester test in January 1961 was to compare communism, fascism and democracy, something done well by the World Book Encyclopedia of that day.

In mathematics, we would learn to write structured essays by proving theorems in plane geometry. Remember the “Given” and “To Prove” and the charts of statements and reasons (usually definitions, postulates, or previously proved theorems)? That is structured reasoning and the beginning of critical thinking. Later, as in advanced calculus in college and in graduate school, one simply writes mathematical proofs in paragraph form. A good example is proving that the Euler number e is irrational, by assuming that it is rational and reaching a contradiction.
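The paragraph-style proof mentioned above runs roughly as follows (a standard sketch, stated here from the usual series definition of e):

```latex
% Claim: e is irrational. Assume for contradiction e = p/q with integers p, q >= 1.
% Multiply the series e = \sum_{n \ge 0} 1/n! by q! and subtract the first q+1 terms:
\[
  x \;=\; q!\left(e - \sum_{n=0}^{q} \frac{1}{n!}\right)
    \;=\; \sum_{n=q+1}^{\infty} \frac{q!}{n!}.
\]
% The left side is an integer: q!\,e = (q-1)!\,p is an integer, and so is
% each term q!/n! for n <= q. But the right side is bounded by a geometric series:
\[
  0 \;<\; x \;<\; \sum_{k=1}^{\infty} \frac{1}{(q+1)^{k}} \;=\; \frac{1}{q} \;\le\; 1.
\]
% So x would be an integer strictly between 0 and 1, a contradiction.
```

The whole argument is two displayed lines plus connective prose, which is exactly the "paragraph form" contrast with the two-column statements-and-reasons charts of plane geometry.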

Social studies concepts are often hard to get in high school because they refer to worlds different from what we live in today. The same tends to be true of literary classics that are often mandatory reading in high school and often the subject of tests. (How well do most people today really appreciate what Samuel Clemens (aka Mark Twain) was really doing when he wrote Huckleberry Finn?) They also usually refer to problems that treat people in groups (by nationality, religion, race, gender, etc.) rather than as individuals. That is partly because our notions of public morality have always assumed people could morally be circumscribed in their own opportunities by mandatory loyalty to family, tribe, nation, religion, or the like. Modern liberalism and individualism, growing substantially since I graduated from high school, have challenged these assumptions.

That’s where the critical thinking comes full circle, as to tackle the kinds of problems we have today, people have to tie all of these loose ends together. That’s what that history teacher demanded of us. (That history teacher, a military combat veteran but very much a political liberal himself, was determined to train a generation of social and political activists, as was my government teacher the next year.) Yesterday, I attended a demonstration, on the West Lawn of the Capitol, for lifting the “don’t ask don’t tell” policy regarding gays in the military. The arguments seem simple enough, and what one doesn’t see at such an event is the enormous social and political complexity (enough to fill 185,000 words in my own “screed” discussed in previous postings) that has made an issue like this come to a head – even in Iraq – and become a genuine national security problem. A good topic for an essay question in a government class these days is “judicial activism,” and another good topic would be “federalism”. What hot-button problems illustrate the constitutional questions? Two of them would be sodomy laws (Lawrence v. Texas in 2003), and now gay marriage, where the activism is by state judges and there are questions of national recognition and local democratic choice. In my own thinking, I have wanted to cast all of these questions about liberty and public morality in terms of “personal responsibility” (including individual sharing of burdens) and objectivism. Yet, I can see how some of these don’t get resolved without the heart, and the capacity of adults to connect with people, including “kids”, at their own level. Our educational system has to prepare students for a competitive world and society, and yet it may be forgetting where cooperation has to start.

Sunday, March 25, 2007

The DADT Showdown: Part 3: Reorg of The Screed

If I were to rewrite my “screed” – that first “Do Ask Do Tell” book from 1997, what would be the payoff today? The optimistic “libertarian future” certainly has not materialized.

The book was organized around the 1993 debate on gays in the military (resulting in the “don’t ask don’t tell” policy) as a centerpiece, and a steppingstone to a broader debate about the responsibilities of citizenship. That was “chapter 4” and it looked back to an earlier chapter on conscription and student deferments, and then the opening chapter on my William and Mary expulsion for “latent homosexuality” and reparative therapy at NIH – a curious foreshadowing of its role twenty years later in fighting AIDS and HIV.

The fifth chapter, however, would break away from the idea of providing a recipe for solving the controversies over “family” and gender in civilian life from comparison to the military paradigm. Instead, it would become an account of the development of cyberspace and the Internet, and about all of the social, legal and ethical controversies that attend to the availability of this “alternate universe” as a source of quick fame. The centerpiece of this account could be the Philadelphia COPA trial, which I attended one day in October 2006. On March 22, 2007, COPA was declared unconstitutional by a federal judge.

The logical conclusion, for a Chapter 6, could be the citizenship conference (previous post), if it really were held. Of course, instead of a book, I could think in terms of a documentary film. Dream on.

Some people have said that the Introduction is egotistical. Maybe so, although the agent didn’t seem to feel that way back in 1996. Now, it strikes me that way too; by starting with the first chapter immediately, I could establish the historical reason why I have standing to speak to the issues. The Introduction was intended to establish the objectivistic nature of the philosophy that would follow. It also acknowledged in advance a disturbing (and to some, offensive) notion that the right wing keeps dangling without being able to state: that homosexuality, in men at least, represents some sort of existential character defect, whether immutable and biological or not, that amounts to a personal moral hazard. Or, to put it in another pejorative, it seems fixed on a kind of competitive and narcissistic adolescence that does not want to let go of its own rationality and judgment and "jump in" to the pampering emotions that other people perceive as necessary to transmit and nurture new life -- and as necessary to "prove" religious faith. That impression certainly comports with the behavior of the Bush administration during the past six years, and has a lot to do with the "less rational" objections to gay marriage and to allowing gays to serve in the military. And personally, I don't see this "attitude" as a pejorative.

I also now wonder if it is necessary or appropriate to list all kinds of specific measures that should be passed (most of all the proposed constitutional amendments in the present Chapter 6). What’s becoming more important is the nature of the moral debate -- the availability and organization of political knowledge, the understanding of how other people think, and the underlying questions about when we have to fess up and become our “brothers’ keepers”. The real moral issues seem to present more faces, or dimensions, that have previously remained hidden.

Saturday, March 24, 2007

The DADT showdown Part 2: "Bill of Rights 2" and "Bill of Responsibilities"

Earlier postings have built up to this next proposal, about a citizenship convention. I’ve talked about the concept of a “Bill of Rights 2”, at least as an informal document if not legally or constitutionally binding. The document would incorporate a corresponding “Bill of Responsibilities”. This was discussed on a January 8, 2007 posting here:

And I’ve talked about the way we organize all of these opposing viewpoints about social and political issues, and try to understand how other people think about them. We can even propose database technologies and techniques and business models as to how something like this should be implemented.

I get an overwhelming sense from all that is going on, that the common denominator of all of these “moral” issues is the way burdens in any free society have to be shared. That gets into the cakemix of all of our attitudes about marriage and family, and how some people want to keep the family out of fully rational discussions about equality and rights. Family helps make some of this sharing transparent. But it still needs to be understood at a certain intellectual level. Ultimately, like denominators in algebra class, issues have to become rationalized.

I can envision forums being held in several cities around the country. My choice of locations would be Williamsburg, Dallas, Minneapolis, and southern California. The agenda would be constructed around several major policy issues where individual contribution and “sacrifice” are relevant, such as (1) global warming, (2) terrorism, natural catastrophe and pandemic preparedness, (3) national and military service, and (4) the perks and responsibilities of marriage (if any) and intergenerational responsibilities (including filial responsibility).

A couple of points strike me as particularly important. One is the notion that the responsibility of the individual to family doesn’t get created merely by having (procreating) children. Expecting everyone, including childless adults, to respond to family or filial responsibility actually strengthens the social safety net. Is this to be expected? The point here is that it generally doesn’t get debated openly (outside of a few books, as where Elinor Burkett talks about “cheating the childless”). A component of this is meeting adaptive needs before claiming a personal psychological surplus, and using that surplus to build constructive relationships, as documented in my experience with the Ninth Street Center (the Chapter 3 reference below).

The other major point concerns the striking and rapid development of the cyberworld as an alternate living “space” with its own topology and metrics, which allow almost anyone to become a celebrity without accountability or “normal” social connections. Fame has become a new currency replacing money, and sometimes organic relationships. Can this freeing capacity go on forever?

Generally, courts have been quite supportive of individual rights in more recent decades, including free speech and free-entry into a public space. They have tended to analyze “moral” issues in terms of individual fundamental rights and responsibilities rather than abstract notions of common good, which many people feel must apply to institutions like marriage. Some people feel that courts practice “judicial activism” and can endanger long term social stability and resiliency (to disaster) by over-focus on the individual. The extreme capitalism of our society in recent decades stresses individual sovereignty and self-reliance, and these virtues need to be brought into balance with the practical, if contingent, needs for local interdependence.

The convention procedure would follow a protocol like “the area of Mutual Agreement” which I recounted in my first book’s Chapter 3.

At this stage of this proposal, I am quite struck by my own personal "moral" position. As I noted, I have gotten a lot of pressure at certain times to become more actively attentive to other people, and to develop the practical "manual" skills to carry out my share of adaptive responsibilities. Some of that needling occurs because of the inordinate attention I had to place on my own performance and then psychic comfort when I was younger. This gets elaborated into a demand for more emotional receptiveness to connecting with other people at their own place in life. The impression that I get these days is that the public responsiveness of "different" people like me is essential for "normal" (I grasp for a less pejorative adjective) people to find the incentive to marry, become parents, and remain personally focused on their marriages. Call this "public morality" if you will. Perhaps that is even a reasonable idea, but carrying out such expectations would not make me "straight." But it might help make things "fairer."

Here is an earlier writeup from mid 2006 on this proposal.

Wednesday, March 21, 2007

The "do ask do tell" showdowns: Part I: Principia

One of the major points in my first two books of my 1997-1998 glory days was promotion of the idea that the “choice of a consenting adult significant other” should be a fundamental right. Along with this was “the right to be left alone.” This whole dichotomy was thoroughly rehearsed in two famous Supreme Court cases: Bowers v. Hardwick (1986) and then Lawrence v. Texas (2003), regarding sodomy laws and public morality. Of course, what starts out as a private, emotive experience has a subterranean influence on the lives of others, in the values that are conveyed and that, in the Internet age, have become much more public.

In fact, the private v. public dichotomy, so well known in software engineering, seems to lie beneath a lot of today’s moral debate. One can become a public “citizen of the world” without even continuing his own biological family. Except in rare instances, this just wasn’t conceivable for many people until the modern age.
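The software-engineering version of the dichotomy: a class exposes a public interface while keeping private state hidden, yet the private state still shapes everything the public side does. A minimal Python sketch (the class and its names are my own illustration, not anything from the text):

```python
class Citizen:
    """Illustrative only: the public/private split familiar from software."""

    def __init__(self, name: str):
        self.name = name           # public: visible to any caller
        self._private_life = []    # private by convention: internal detail

    def publish(self, statement: str) -> str:
        # The private record quietly accumulates, influencing the public
        # interface even though callers never see it directly.
        self._private_life.append(statement)
        return f"{self.name} says: {statement}"

c = Citizen("Author")
print(c.publish("free entry changes everything"))
```

In Python the privacy is only a naming convention (the leading underscore), which arguably fits the essay's point: the boundary between private and public is social as much as technical.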

That is pretty much the case with me, and at various points in my life (including the William and Mary Expulsion in 1961), others have certainly tried to interfere with what modern constitutional thought (at least Lawrence v. Texas, whether “judicial activism” or not) asserts is my fundamental right.

Keep in mind, the modern idea of sexual and expressive freedom has developed to be commensurate with a very pointed idea of personal responsibility. One is accountable for his own actions (and sometimes the law comes down today in draconian fashion when people cross the line), but has rather nebulous responsibility for the indirect psychological influence he or she has on others (with the exception of one’s own minor children, who become an absolute responsibility).

Why do people do this—impose themselves on people like me? Why won’t they get off my back? I mean to pose this question with a research, textbook tone. I’m going to try to emphasize principles on this particular post, and go into specific personal stuff to the minimal extent necessary.

I can, however, enumerate some specific items of social and motivational conformity that many people would like to see from me before I make myself visible on the public stage. The central expectation seems to be that I would become emotionally receptive to people “as people” rather than as “part-objects” – manifestations of my own ideas or ideals. Sometimes this expectation comes across to me as a request for deference or even sacrifice. I am supposed to develop my self-concept by responding to others in an immediate local environment (whether blood family or some kind of assumed neighborhood community) before I develop and publicize my own ideas. A corollary was that, as a boy, I was to learn to do my chores, often when they weren’t objectively necessary, because I needed to develop the manual and protective skills demanded of men (relative to women and children) in our society as it was then. We had the draft then, which accepted the idea that men offer themselves as sacrifice before they have their own lives (even their own families). Rhonda Byrne grazed this problem in “The Secret” when she criticized the notion of “sacrifice”, which she sees as contradictory to the idea of “giving”, which presumes one has cultivated one’s own talents and has something to give of one's own volition. What good, they ask, is an idea unless I use it to make some specific person in need better first?

The Gospels seem to call for this kind of other-centeredness, even to the point of making it evidentiary of Faith and receptiveness to what we know as Grace. A couple of the most controversial parables – the Rich Young Ruler (if you can call that a parable), and the Parable of the Talents – play on this kind of problem. Early Christians lived a highly socialized life with communal property and great loyalty to one another that transcended the direction of the individual as we know it today. It’s not so much that individualism is “wrong” as that it has to account for the culture that created it.

What do people want from this (or sometimes from me)? Some people do see the social and religious supports for their position in the family as indispensable. It’s not just that society says “you are a better person because you are a married father.” It’s that the possibility of measurement or comparison to others on a global scale (the psychological equivalent of a FICO score) is taken off the table as anything worth thinking about. The male homosexual, in their view, creates an intolerable public paradox. On the one hand, his “upward affiliation” comes across as a commentary on who should have a lineage, but (on the other) he often seems unwilling (or “unworthy”) to continue his own lineage and show biological loyalty (essentially paying back his parents by providing "vicarious immortality" through biological lineage). That view seems exacerbated today by the ease with which people can express themselves on a global stage – where “fame” has become a new currency almost equivalent to fiat money. That concern is also exacerbated by the "fantasy world" maintained by the media, icons in a fantasy world that cannot be maintained in the "real life" of families. But this grounding in "moral collectivism" and "the greater good" was always present, in the past implemented with laws that seem “irrational” to the believer in individual sovereignty, especially sodomy laws, which seemed like the centerpiece of public morality – a system to get most adult men to do what society needs them to do: create, raise, and remain loyal to their families. Of course, this “homophobia” deteriorates quickly into patriarchal values and religious tribalism, and the extreme endpoint of this sort of thinking is something like the Taliban in Afghanistan. And competition among intra-loyal family units can easily be exploited by politicians and unfair businessmen (indeed, a justification for the growth of individual rights and "gay rights" as an add-on to the 60s Civil Rights movement).
But unfettered “fundamentalist individualism” (with excessive concern over “moral hazard”, as we see in debates about health care and disaster insurance) can itself lead into dark places.

Yes, many people see family “loyalty to blood” as an essential virtue, and its necessity stems from a practical reality of civilization. Until the modern era, people were largely confined by their familial circumstances (with visible but rare exceptions, which became more common in America than they ever had anywhere else in the world). People had to find “meaning” in their family experience; there was no reasonable alternative. So any cultural change that would dilute what people experience in the family would be opposed. This was especially important to people who did not beget their own children, but stayed around to take care of other family members. In another area, feminism challenged these ideas, as the idea of a woman with her own career was seen as threatening to stay-at-home moms.

Family solidarity has indeed been necessary for survival, as many episodes in history (the Holocaust, for one) will show. Some critics of the modern economic forces of globalization (like “just in time” economics) see these as going hand-in-hand with weaker families, and fear that the ability of a democratic society to hold together in certain kinds of exogenous catastrophes, such as a bird flu pandemic, is jeopardized. Once mandatory psychological collectivism and socialization are taken as a given, then all of the usual political issues that one typically studies in history take on their usual meaning: nationalism, comparative political ideology (fascism, communism, classical liberalism, capitalism), religious conflict, war, discrimination, slavery, segregation, etc.

Does all of this filter down to any useful moral principle that can be cleanly articulated? One could maintain, for example, that one should have accountability to others when visible on the public stage. I call this the “pay your dues” philosophy. This could be stated as an expectation that every adult prove that he or she can support someone besides the self. This notion could also challenge the idea that family responsibility exists only when one procreates a child; indeed, one could claim that raising the next generation and caring for the previous one is a responsibility that must be shared by everyone. It seems to me like a reasonable balance between individualism and a real sense that some kind of “collective” cooperation needs to continue. At a constitutional level, this could lead (as with the recent 2nd Amendment case before the DC Circuit) to discussions of when rights belong strictly to individuals or when they subsume some kind of accountability to the welfare of a family or group. One point, though, must come through. This is not the same kind of morality formulation one usually hears. It is an expectation of citizenship that goes beyond integrity, or even fidelity. It says more than just that one must remain faithful to a marital spouse if one has “chosen” to have children, or that one must not try to scam people. It is more even than morality’s “third normal form” as in my 1996 essay, originally intended to be a chapter of my first book in 1997.

Many social conservatives see the emotional complementarity of the heterosexual world of marriage and family as necessary, in a practical sense, for people to live up to such an expectation. Perhaps the mores of the heterosexual world make it easier, but we have a bit of a “chicken and egg” problem. The world seems simpler if sexuality – with its emotional release of self from normal rational concerns – is always tied to openness to having children and then providing for them as a first priority; anyone who refuses to open up to this wants to play “God” with the “knowledge of good and evil.” But, again, this is not the same moral issue as “abortion” (and, indeed, the idea that individuals have to be open to sacrificing themselves, as in war, seems to contradict absolute respect for human life). That seems to be what notions of “public morality” have always been about. These notions have been put out as straightforward even if irrational.

Where does all of this lead? I think it leads to a discussion of the social obligations of citizenship – the Part II of this post, the idea that there could be a “Bill of Responsibilities” to go with a “Bill of Fundamental Rights” (or a “Bill of Rights 2”). Specific challenges like uneven population growth, filial responsibility for eldercare, global warming, pandemics, and terrorism raise questions about how burdens are shared and how over-individualism may exaggerate resentment and problems around the world. Gay people will need to recognize that the idea of expected service and citizenship obligations can co-exist with homosexuality, and indeed repealing “don’t ask don’t tell” and supporting gay marriage could support such an idea. But then, that penetrates the emotional “black hole” with which the straight world wants to keep protecting the family bed, from which no intellectual cognition may escape. We will always have a tug between aesthetic culture and reproductive emotion.

I come back to myself. I still maintain that if my life is hijacked for “the lives of others,” I don’t see how Grace can save me for anything meaningful. If freedom means anything, then being able to accomplish what one will must mean something.

Coordinated blogger entry here.

Also: COPA was struck down on March 22. Blogger entry here.

Saturday, March 17, 2007

Iraq protest at Pentagon -- hard to get to

I was in the Bryant Lake Bowl theater in Minneapolis on March 19, 2003, when the Iraq war started with the "shock and awe" campaign over Baghdad. IFP had just finished a free public program, and the CNN coverage was shown over the TV in the theater and bar.

Four years later we have a protest in Washington, in the north parking lot at the Pentagon. I set out to snap some pictures of the protest signs, on a cold and windy St. Patrick's Day, with snow rather late in the season for Washington. The march started at the Lincoln Memorial, went across Memorial Bridge, and onto the parking lot.

But what I found out on foot was how hard it was to see anything if you didn't join the march from the beginning, in the cold. The Blue Line segment on the Metro that would have stopped at Arlington Cemetery (the closest stop) was shut down. You can park at Pentagon City Mall and walk through the tunnel, but security will not let you cross the lot to the north side (or carry a camera). This was a brutal day weatherwise, and I think they knew it. They are perfectly willing to be open to the public for a 9/11 memorial commemoration -- which is entirely appropriate -- but today they didn't want it to be too easy even to see the protestors, who were out of sight because of the geography of the area.

True, forty years ago there was a similar protest over the Vietnam war, and it turned ugly. This one went well. But all I saw of it was the protestors returning toward Memorial Bridge as it broke up.

There are stills from earlier protests at this link.

Sunday, March 11, 2007

NY Times Magazine ethicist on checking profiles

The New York Times Magazine, March 11, 2007, p. 26, has a column "Online Extracurriculars" by ethicist Randy Cohen. An interviewer for a college, having researched the applicant on the Internet with a search engine, asks whether she should ask the applicant about the applicant's social networking profile or blog. Mr. Cohen says no. Even though the blog is in a public space, the writer perceives it as like a diary. He notes that the University of Nevada, Las Vegas has announced it will not look at online information in evaluating applicants. In this case, the college in question asked the interviewer for the URL of the blog anyway.

Of course, a blogger is likely to post information because he or she wants to influence debate and believes that personal stories bear on the debate, with more precision than what lobbying organizations representing someone can offer.

I believe that employers and universities should not consider online information from search engines, not out of concerns about maturity or pseudo-privacy, but more because mainstream employers and schools should not be trying to force social conformity. The exception would be egregious information (such as related to crime) that can be confirmed. It is legitimate for law enforcement to use blogs as evidence in some cases. As noted before in this blog, "dream catching" or "fake self-incrimination" can pose testy potential legal problems for employers or schools should they find it.

Employers and schools may be inclined to read blogs and profiles for "inference" -- asking themselves what motivated the speaker to make a particular posting, what the speaker is "accomplishing" with the posting, or what may be inferred about the speaker from what she says. The type of reading here is like what students are asked to do on SAT's (or teachers do on ETS PRAXIS exams): first, read for literal meaning, and then draw conclusions about what a writer likely believes about herself or about other people or entities in some sensitive situation.

Some employers may tend to take the position that someone should not say something on a blog or profile, visible to the public and search engines, that he or she could not say in an open conversation at work. An employer might believe that "you" should "earn the right to be famous" by playing by the old fashioned rules (proving that you can sell what you write or say first in conventional markets) before you have a right to be heard in public. Otherwise, anything you say in a public space can reasonably be taken as indicative about your suitability to remain in your job. It's not clear that this notion is legally (or morally) supportable, but employers and speakers need to ask the question, and the legal community needs to think it through conceptually.

Also, remember (as in a posting a few days ago) that information about someone posted (on the public Web) by others is often very unreliable and should not normally be used in evaluating applicants. Normally, a job applicant (at least at individual contributor levels) should not be expected to regulate his informal social reputation online for job purposes.

Friday, March 09, 2007

Spoofing of sender cell phone numbers for text messages


ABC affiliate WJLA in Washington DC had a "7 On Your Side" report March 9, 2007 on text messaging. The report alleges that some cell companies allow Internet text messaging from their sites and allow the message sender to put in any cell number, without verifying the number. This has caused at least one incident where a 13-year-old boy got inappropriate messages, and the owner of the cell number was contacted. Fortunately, it was determined quickly that the cell phone number had been spoofed, as happens with spam e-mail on the Internet. (One wonders if the cell company web site might even have accepted an incorrect land line number if spoofed, or a cell number from a different company. However, it would be easy to establish that a given land line number could not have sent the message.)

There might be a risk that people could be falsely accused of and prosecuted for solicitation by electronic device if law enforcement did not carefully investigate the possibility of phone number spoofing in text messaging. (In Virginia, the applicable law is 18.2, text here.) This law has been enforced before with text messages. As with email, there is a need for automatic verification of senders to prevent spoofing. However, the intention of such an act seems to be only vandalism or harassment, as the recipient of the inappropriate message would never reach the actual sender if an illegal contact had really been intended.
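The verification called for here is not complicated. As a rough sketch (all the names and numbers below are hypothetical, and nothing here reflects any real carrier's system), a web-to-SMS gateway could refuse to accept a claimed sender number until a one-time code texted to that number is echoed back:

```python
import secrets

# Hypothetical sketch of sender verification for a web-to-SMS gateway.

_pending = {}  # claimed sender number -> expected one-time code

def start_verification(claimed_number, send_sms):
    """Text a one-time code to the number the web user claims to own."""
    code = f"{secrets.randbelow(10**6):06d}"
    _pending[claimed_number] = code
    send_sms(claimed_number, f"Your verification code is {code}")
    return code  # returned only so this sketch can be exercised directly

def confirm_sender(claimed_number, supplied_code):
    """Accept the claimed number only if the code round-trips; single-use."""
    expected = _pending.pop(claimed_number, None)
    return expected is not None and secrets.compare_digest(expected, supplied_code)
```

Only someone who actually controls the handset ever sees the code, so a spoofer never completes the round trip; this is the same pattern already used to confirm e-mail addresses for mailing lists.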

Intentional spoofing of cell numbers, as with email addresses, for illegal messages should itself lead to a separate felony charge. State laws should be modified to cover this possibility.

Another aspect of this story that is particularly shocking is that in at least one case (Sprint) the cell company actually charged the false sending number for the illegal text messages, until Channel 7 pointed out the error. The fact that the cell phone system would have a gaping hole and bill for it seems like shoddy systems work and is particularly galling.

The WJLA story, by Ross McLaughlin, is here.

There was a novel in the 1990s that touched on this kind of scenario with a "telephone virus," called The Trojan Project, by Edmund Contoski, review here. Another example could be Stephen King's 2006 meltdown thriller horror novel Cell, not to be confused with the Jennifer Lopez film The Cell.

My own rule on deep links -- OK

There has been an issue with one of my own still photos on another site. I have previously discussed deep linking on this blog.

I welcome links and deep links to any of my blogs or websites, as long as the user knows that she is visiting a different site (that is, a window opens up, either a new window with the "target" parameter, or a replacement window from which the visitor can use the back button). Publishers on other websites, especially social networking sites, profiles and photo albums, should not inline to my site with an img tag's src attribute to make it appear that a photo is theirs. This has the potential of causing bandwidth or certain other legal issues, and I do believe that this is copyright infringement. (It does seem, however, that repeated requests for embedded images usually use browser caching and often do not contribute to bandwidth charges in practice.)
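For what it is worth, the standard server-side deterrent against this kind of inlining is a Referer check: serve the image only when the requesting page is on your own host. A minimal sketch follows (the allowed host names are hypothetical placeholders; the header can be absent or forged, so this is a deterrent, not real protection):

```python
from urllib.parse import urlparse

# Hypothetical allow-list: serve images only to pages on these hosts.
ALLOWED_HOSTS = {"example.com", "www.example.com"}

def may_serve_image(referer_header):
    """Decide whether to serve an embedded-image request.

    An absent Referer (direct visit, bookmark, privacy-conscious browser)
    is allowed; a Referer from a foreign host -- which is what inlining
    from another site produces -- is refused.
    """
    if not referer_header:
        return True
    host = urlparse(referer_header).hostname or ""
    return host.lower() in ALLOWED_HOSTS
```

Web servers like Apache can apply the same rule in configuration (mod_rewrite on HTTP_REFERER) without any application code.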

Contact me (info in the profiles) if there is a photo you want. It just depends; maybe I could send it, or direct the visitor how to take a similar one that would belong to the visitor legally.

I don't have a particular concern right now about embedded thumbnails, as displayed by search engines for image searches. They don't seem to represent a real issue, as the way images get displayed, it is clear where they come from.

My earlier posting on deep linking was in Feb 2006, here.

Thursday, March 08, 2007

Video Curriculum Vitae : disparate impact?

Time magazine, March 5, 2007, p. 51, has a provocative story by Lisa Takeuchi Cullen, "It's a Wrap: You're Hired! Recruiters, get your popcorn. The YouTube generation is discovering the video resume." They are talking about home movies -- that is, individually shot digital video CVs -- what we call a filmed curriculum vitae, a detailed discussion of your professional background, emphasizing accomplishments related to the job you are seeking. It is more detailed than a resume. If done with video, the job seeker should do a good job with the filming, lighting (be careful about glare and shadows), sound (use a shotgun microphone if possible) and video editing (use a package like Adobe Premiere (Windows) or Final Cut (Apple), both of which are gradually coming down in price). Larger or better funded high schools now often offer elective technology courses that include video editing skills; it is a good skill to learn. And, of course, be well-dressed for the presentation.

There are numerous web references. Here is one on "about job searching", and here is an article "The Video CV: The Way to Get a Law Job."

(By the way, when looking at this subject in search engines, the other term that this acronym refers to is "composite video", a term related to television monitors and color mixing.)

For some jobs, a visual resume sounds appropriate. What would be the most obvious example? Perhaps as a cast member, actor, or any job where you appear in public regularly as a spokesperson. In that business, the "head shot" is a staple that agencies expect. It doesn't sound very appropriate, in most cases, for a computer programmer.

Some employers are afraid to look at video CV's, however, because of downstream legal implications. A video CV will give away the person's race and appearance, and it is easy to imagine how an employer could fear that others will accuse the employer of favoring people with one kind of appearance over another, leading eventually to disparate impact lawsuits. It's not clear how credible this fear really is. It's easy to muddy the waters here with all of the debate about affirmative action.

My own libertarian instinct is to have at it if you can. The most important factor in a job search is whether, relative to all the other applicants, you have what the employer needs to actually do the job.

Wednesday, March 07, 2007

Law students finding that digital reputations can follow them

It seems that half of the postings on this particular blog have dealt with employers and the “Myspace problem”, but a story by Ellen Nakashima and Meg Smith in The Washington Post today, March 7, 2007, p. A1 is one of the most disturbing of all. The story is titled “Harsh Words Die Hard on the Web: Law Students Feel Lasting Effects of Anonymous Attacks,” at this link.

Of particular concern was the story of some female law students who could not find employment, and the students’ belief that employers may have found negative materials about them on chat boards, one in particular being AutoAdmit, which is run by Jarret Cohen. The news story identifies him as an insurance agent, so I am surprised that he can run this site without a conflict of interest with his company, unless he is totally independent, as some agents are. (When I interviewed New York Life in June 2005, I was told that new agents could have “no outside income” – not even ads from a website – and stopped the interviews immediately. I presume that the insurance company would have objected to the attention I could draw to myself even if there were no ads. See the update below.)

A cursory look shows that most of the site has reasonably valuable material, and probably only a little of it is derogatory about anyone. There does seem to have occurred one “contest” among certain students that was sexually oriented and probably not appropriate public behavior among law students. Cohen has steadfastly refused to censor material on principled grounds. There would be a legal question as to whether Cohen has any legal exposure for the content posted by visitors, under the Section 230 rule (which says that he may not have any), and you can read about this at EFF here. (See update below.)

Individual posters could be liable for libel if their defamatory content were actually untrue, and anonymity does not protect them from liability; their identities can be subpoenaed in some cases.

But the legal questions, as noted before, pale in comparison to the practical business and ethical questions. When a few companies created social networking sites about three years ago, their youthful managements probably did not envision a world where employers would feel compelled to bottom-feed on the sites for due diligence on hires.

Of particular concern is searching the web for comments about an applicant made by another party. Employers should know that much of that kind of material, especially on message boards and chat logs, is unreliable. Do they consider it a test of an applicant if they can “manage” their reputations online, or mediate what other people say about them in public? That is exactly what some new companies like Reputation Defender purport to do, as discussed in this November 30 blog entry.

An older concern (old hat by now, since widespread media reports starting around the end of 2005) is the concern from employers that applicants may deliberately disparage themselves on their own profiles and blogs in order to make what they consider valuable social arguments or protests (as against underage drinking or controlled substance laws, for example) through parody. This practice (“dreamcatching”) is coming to be seen as a no-no in the human resources world (including school systems) because of the potential legal liability it could create for employers.

The world of conventional employment, however, is itself being exposed in all of its ethical weaknesses by this whole issue of employee web exposure through search engines. I spent my career as an individual contributor in information technology, and my reputation in the outside world meant relatively little (as long as I did nothing wrong at work). Individual contributor jobs that appeal to introverted nerds have tended to go overseas, although now they may be coming back. Many people, however, make a living by selling to the outside world, or by getting paid to represent someone else’s point of view, rather than their own. This must be true of a lot of lawyers, especially trial lawyers. So many employers naturally fear that a person’s public “reputation” (or association with controversy in any area) can affect their ability to get business. So employers may even want their associates to turn over the management of their online reputation to “professionals” (a concept that I discussed in this post).

There is one more caveat to all of this. If someone, as an applicant, feels so strongly about the problems in a company or a field that she would be misrepresenting herself if she worked there, then maybe she shouldn't. Or maybe she can work there and do something about the problems, but she could remain silent for only so long. The flip side of this is censorship. I have said before that I fear that employers could inadvertently, and out of fear of unseen "paper cutting" edges, turn online presence into a test for social conformity, the way some of them look at office dress.

Tory Johnson weighed in on March 8 on ABC "Good Morning America" and noted that even when teens and college students limit their postings to private groups (a facility that blogging companies and social networking sites now promote), less scrupulous "friends" may make unfavorable postings about a person on the open Internet. A "friends" list on Myspace of 500 friends will leak anything "private." She suggested that every job applicant perform a "narcisurf" on their own name(s) in several major search engines. ABC presented the owner of Reputation Defender, and Chris Cuomo and Johnson suggested rebutting unfavorable (and false) comments, contacting ISPs and, in rare cases at least, litigation.

I'll add that I don't post "rumors" about people and identify them by searchable name. And I don't post the fact that I saw such-and-such a celebrity on the disco floor of a gay bar. (By the way, I think that celebrities should be able to circulate in public places without being chased; younger celebrities generally expect that.) What I post on blogs comes from verifiable sources, which I list and usually give links for, or at least specific dates and references, as in term paper footnotes and bibliographies.

Update: Saturday March 10, 2007

Ellen Nakashima has a follow-on story (in the March 10 Washington Post, Business Section, p. D1) "Law School Deans Speak Out on Web Site Content: Yale, Penn Condemn Anonymous Attacks", at this link.
Apparently the AutoAdmit site has removed advertisements because of complaints. The site is reported as co-owned by Anthony Ciolli, a third-year law student at the University of Pennsylvania (Philadelphia). The report suggested that the student's activity could affect his bar admission process. However, the question remains whether site owners or moderators are responsible for content posted by visitors. The free speech community insists that moderators should not be, and sometimes the suggestion is made that "ignorance is bliss" -- do no editing or moderating at all, to avoid potential liability.

Again, one wonders about all of the message boards and discussion forums run by media companies, that usually cause few problems, and by various other advocacy groups, even though once in a while flaming occurs or libelous comments get made on them.

Other links:

An important posting today (March 7) about trademark and copyright is on another of my blogs, here.

An important posting today about proposed new laws requiring social networking sites to do age verification is here.

Update: April 3, 2007. A young professorial candidate loses GMU opportunities, apparently after inappropriate remarks made as a teenager were found by search engines

Ian Shapira has a story in today's The Washington Post, page B01, at this link, "Racist Writing as a Teen Haunted GMU Candidate." The story relates that the now 22-year-old professorial candidate, Kiwi Camara, had, as a precocious student at 16, used the "n" word in discussing a 1948 Supreme Court decision that had banned racially-based restrictive covenants. Camara was dropped from consideration for a position at George Mason University in Fairfax, VA, after he apparently invited people to "google him" and the remarks were found. There is a debate as to whether a remark made as a teenager should be held against him, or whether it is so isolated as to be insignificant. Universities are well known to be ultra-sensitive to racist, sexist or homophobic speech, however.

A search on his name this morning does not easily show the article or language.

Picture: US Court House in Alexandria, VA, site of the Libby verdict yesterday.

Tuesday, March 06, 2007

The Most Dangerous Game: Brains v. Brawn, and the cultural wars

In teaching English these days, the “rubric” is all the rage. This is a set of guidelines for writing to make it accepted in the professional, academic and mainstream worlds. In English classes, themes are assigned to exercise the rubric in a straightforward way, starting with the “essay map.” The rubric emphasizes a three part structure – we see that in film and documentary – starting with a thesis, followed by presenting factual evidence, and coming up with conclusions. That is, after all, how professional journal articles should be written, most of all in the medical world. The writer stays away from her own opinions and experiences and sticks to well documented facts.

In one high school English class, ninth graders read Richard Connell’s famous short story “The Most Dangerous Game.” Then the students are asked to prepare an essay on the notion that the story explores the “brains vs. brawn” problem, supporting their theses with incidents in the story. The text of the short story becomes the universe for research, but the underlying principle is the same, to stick to what can be supported from evidence.

Sometimes the assignment is enriched by showing the 1932 RKO Radio movie, and the kids will be asked to make a Venn diagram comparing the story to the film, and then elaborate further in the essay as to whether their points are based on the story, the film or both. Director David Fincher picked up on this in his recent film Zodiac, in which the earlier film forms an important plot subtext that drives the rather introverted hero Robert Graysmith in his search for “the truth.” (Blog entry is here.)

Published media (whether articles, books, movies, or even databases) of course deals with all of the ambiguities of modern life (sometimes in comparison to earlier civilizations). The “brains vs brawn” problem translates roughly into “individual sovereignty” vs. “public morality.” (“Personal autonomy” is a good synonym for “individual sovereignty”.) This is what we know as “the culture wars.”

We can go into many directions with this – as I have on my blogs and websites – but one concern here is that inevitably, to make a difference, the writer (me at least) finds that to have a public impact he needs his nuclear weapon – his “personal stuff.” Personal materials add credibility and richness to what otherwise might be limited by expression in SAS-derived tables (the staple of social science research) from a K-Street lobbying firm. Personal stuff can also be dangerous because it can be misinterpreted, and can inadvertently involve others or put them into an enemy’s sights. We all know this well with gay issues.

I come back to the rubric now. For a few years, I maintained a number of static “editorials” on the website, keyed to footnote files from my books, but what I found quickly was that these editorials were hitting moving targets. I would add paragraphs and more footnotes as news items appeared, and quickly the editorials lost that neat three-part structure.

That is where the blogs come in. As news items about any one of a number of disturbing or controversial issues occur, I post them, and in the posting I relate the news item to any other materials that I know, sometimes relating to personal experience (which I “know” to be true in fact). In theory, if the blog subject lines were coded according to a scheme, they could be cross-referenced automatically back to the editorials. Better would be a software package or database to do this, and that is something I am prototyping privately. Welcome to information technology, SQL, maybe even DB2 and mainframe programming.
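To make the idea concrete, here is a minimal sketch of such a cross-reference in SQL, run through Python's sqlite3 module. The schema, topic codes, and titles are my own illustration, not the private prototype mentioned above:

```python
import sqlite3

# Illustrative schema: posts carry topic codes, and a join recovers
# which static editorials each post effectively updates.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE editorial (code TEXT PRIMARY KEY, title TEXT);
CREATE TABLE post (id INTEGER PRIMARY KEY, subject TEXT);
CREATE TABLE post_topic (post_id INTEGER, code TEXT);
""")
cur.execute("INSERT INTO editorial VALUES ('EMP', 'Employment and online reputation')")
cur.execute("INSERT INTO post VALUES (1, 'Law students and digital reputations')")
cur.execute("INSERT INTO post_topic VALUES (1, 'EMP')")

# Cross-reference: which editorials does post 1 update?
cur.execute("""
SELECT e.title FROM editorial e
JOIN post_topic t ON t.code = e.code
WHERE t.post_id = ?
""", (1,))
print(cur.fetchall())  # [('Employment and online reputation',)]
```

The same SELECT, scaled up, would translate directly to DB2 on a mainframe; the point is only that a coded subject line makes the linkage a simple join rather than a manual edit.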

I have, at some points in my life, definitely had my freedom intruded upon by others for what I perceive as “irrational” reasons (at least given modern libertarian ideas about “personal responsibility”). I can induce all kinds of theories from these experiences. Sexual orientation, as practiced in private by consenting adults, does have a public meaning which some people (relative to their own personal abilities) see as disruptive to the meaning that they get from conventional marital (and religious) socialization. Of course that kind of argument can go both ways. It all started a few decades ago with “brains vs. brawn.” What we have to face now, in open debate, is that voluntary procreation of children is not the only event that, at least as a moral matter, induces family responsibility or its socialization.

Related blogger post: "Just for authority".

Laptop batteries continue to raise potential safety concerns

USA Today, in a story on Tuesday, March 6, 2007 by Peter Eisler and Alan Levin, titled "Batteries Can Pose Fire Risk to Planes; laptops, phones lead to new rules" (link here), continues the concerns raised last summer by the recall of lithium laptop computer batteries, hundreds of thousands of them, by several vendors, because of rare occurrences of spontaneous fires.

The Transportation Security Administration is reportedly looking at its carry-on policy again. Currently, laptop computers and cell phones are permitted either in carry-on or in checked luggage. See the link for current information. The USA Today story advises against placing non-rechargeable lithium batteries in checked luggage, and the TSA already prohibits shipping them on passenger jets.

It's understandable that even a remote risk is unacceptable in commercial aviation. One problem is that this spills over into other areas. Could hotels or apartment property owners be concerned about hazards posed by electronic devices owned by guests or tenants? My own Dell Inspiron laptop battery was not on the recall list, but it does get hot with long use. I have made it a safety precaution at home to disconnect it when the laptop is not in use or when I am away from home for an extended period. I usually work on the laptop with the power cord plugged into a Belkin uninterruptible power supply. That seems a little "safer." I routinely pat-check all outlets, surge protectors and power supplies for unreasonable or unexpected amounts of heat detectable to the touch.

The chemistry of lithium batteries is interesting (as well as the theory behind it), and makes a good sidebar discussion in a high school advanced placement chemistry textbook.
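As a sketch of what that sidebar might cover (my summary, not from the USA Today story): in a common rechargeable lithium-ion cell with a graphite anode and a cobalt-oxide cathode, discharge shuttles lithium ions from anode to cathode, roughly:

```latex
% Discharge half-reactions of a LiCoO2 / graphite lithium-ion cell
\begin{align*}
\text{anode (oxidation):}\quad
  & \mathrm{Li}_x\mathrm{C}_6 \;\longrightarrow\; \mathrm{C}_6 + x\,\mathrm{Li}^{+} + x\,e^{-} \\
\text{cathode (reduction):}\quad
  & \mathrm{Li}_{1-x}\mathrm{CoO}_2 + x\,\mathrm{Li}^{+} + x\,e^{-} \;\longrightarrow\; \mathrm{LiCoO}_2
\end{align*}
```

The fire hazard comes less from the lithium chemistry itself than from the flammable organic electrolyte: an internal short circuit can trigger an exothermic "thermal runaway" that ignites it, which is why a manufacturing defect in a tiny fraction of cells can still produce spontaneous fires.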

The battery industry attributes much of the problem to cheaper counterfeit batteries not made to UL standards, but we know that many of the batteries recalled in the summer of 2006 came from well-established companies. It is important for people to be able to travel with laptops and work with them safely. This is an established part of efficient business. Even school systems and teachers use them widely, and classrooms often have electronics left on, or in situations that could pose marginal fire hazards after the school day is over and everyone is gone; I have noticed this as a substitute teacher.

No one was concerned about this a few years ago, although problems had occurred as early as 1999. Perhaps laptop and battery manufacturers should offer lower-powered but "safer" batteries for travel as optional accessories while the battery industry works on this important manufacturing safety issue.

Monday, March 05, 2007

Corporate social networking

The New York Times ran a story on Saturday, March 3, 2007, p. B1, "Business Day", by Brad Stone, "Social Networking's Next Phase: Services Like YouTube Seen As Way to Conduct Business." The story mentions a deal by Cisco to purchase Five Across, an "affinity network" design firm. Cisco is also buying assets of Tribe Net.

This reminds me of the "meetup groups" and similar sites. I have wanted to use these to break into screenwriting support, but in the DC area I have not gotten around to making this work. In Minneapolis there was a screenwriters' workshop that held regular sessions and table readings. Often these affinity groups, which are frequently political in nature as well as (or instead of) professional (like GLIL in Washington -- Gays and Lesbians for Individual Liberty), have gotten by with listservers and occasional socials. Craigslist has also helped promote this sort of networking.

But if companies are ready to embrace "social networking," why do they feel the need to "spy" on the social networking profiles of their job applicants and employees?

Sunday, March 04, 2007

Always Be Closing II: Another company thinks I would manage sales people -- weird

I talked about this a bit in my "Always Be Closing" post on Nov. 29, 2006, at this link. About three weeks ago, on a Friday evening around 6 PM, I got a call from a recruiter about my resume -- and it wasn't my Dice resume (for information technology). I went in to a suburban office park in Northern Virginia for the interview. The deal was that they arrange to sell surplus capacity of businesses -- especially restaurants -- through word-of-mouth salesmen at shopping malls, door-to-door, etc., offering deals with a lot of the proceeds going to a charity.

Now, my first reaction is, I don't like to be approached this way myself, so I don't approach people in public that way. But I can't pour cold water on the idea too much. Keep the ice cubes out. After all, going to a mall, be it Tysons in No. Virginia or the Mall of America in Minneapolis, is a bit of a social experience.

And think about Donald Trump and The Apprentice. Many of his tasks involve having his teams set up kiosks or exhibits at malls or on NYC or (now) LA streets to attract customers. If you have a contest, a drawing, prizes, or a benefit, attract visitors (maybe even to meet a celebrity), and want to sell something (be it lemonade, paintings, or tickets to the opera), with some of the proceeds going to a charity, that is certainly fine. In August 2005, in fact, I went to the King of Prussia Mall near Philadelphia, where Teen People had an exhibit, and got to talk to both Gregory Smith (Ephram) and Chris Pratt (Bright) from The WB's Everwood while the Teen People exhibit blasted the song "It's hard out here for a p--p" from Hustle and Flow. Since I hang around the movies, and especially did so with IFP when I lived in Minneapolis, I can see how this kind of thing could work.

One area where there is a lot of excess capacity is movie theater seats, especially during the week. One idea to fill seats could be to play licensed tapes of concerts and operas, just as Regal has started some Saturday afternoon closed circuit broadcasts from the Metropolitan Opera.

I worked 14 months as a caller for the Minnesota Orchestra in the 2002-2003 period before coming back to the DC area. We would call for the Young People's Concerts for the Guaranty Fund of the Orchestra. So I am familiar with the concept of arts development and with the idea that it can go somewhere.

One other thing about this job call: I was supposed to supervise or train other runners at the mall or on the street after two weeks. There is nothing in my resume that suggests being a manager for the sake of being a manager, unless the job refers to content that I have developed myself. I wonder about this. If I take a job requiring direct reports just for their own sake, that could cause conflict-of-interest problems with my writing.

I will keep everyone posted.