Sunday, November 30, 2008
The Business Section of the Sunday, Nov. 30, 2008 New York Times explores the growing tension between “privacy” and efficient communication in an article by John Markoff, “You’re Leaving a Digital Trail. Should You Care?” The link is here.
The Massachusetts Institute of Technology is trying an experiment with some students: in exchange for the ability (for experimental purposes) to monitor the students’ every move, it offers a free smartphone.
It may not seem like that much of a step up. Consider how much we are monitored already. Cars have GPS devices and OnStar, which can rescue you from an accident but can also let the government (or a car rental company) track you. Thomas Friedman proposes an “energy Internet” that would reward consumers for efficient use of resources, but it would have to track them to do so. It may even become possible to monitor individuals’ carbon footprints, as if a footprint were a new kind of pseudo-currency that could be scored. But there is obvious reason for concern about misuse by all kinds of entities, including insurance companies.
Recently, on the books blog, I reviewed law professor Daniel Solove’s “Understanding Privacy”; privacy has certainly become a dynamic concept. I think we tend to confuse “privacy” with “personal sovereignty,” which has expressive components that can actually contradict or confound privacy as we used to understand it. Expressive sovereignty can affect the perceived privacy needs of others connected to a person, especially other family members, an ethically important problem when those family members are not “chosen” (like spouses or one’s own children) but are blood or familial relations established by parents or by others.
In the past few years, we've developed a notion of "online reputation," which arises because previously "non-public" people can become publicly known (as to "who they are") through the Internet, as a result of the sum total of their activities (including content posted or distributed by others). Now such "reputation" can be affected by tracking technology. “Online reputation” has become a fluid concept because it is so connected to a person’s circumstances, and it can sometimes affect the "reputations" of others beyond their capability of choice, possibly inviting misinterpretation or misuse by previously unconcerned parties. The more control a person has over his or her own life, the easier it is to live “expressively” in a wild world without significantly compromising the privacy that “matters.” But not everyone has the opportunity for this kind of independent life, and not everyone agrees that it is such a good or sustainable thing.
Picture: No, not Massachusetts, but the state capitol in Annapolis, MD
Friday, November 28, 2008
This week, I read and reviewed (on my Book Review blog) Malcolm Gladwell’s “Outliers”, in which the author explains the occasional examples of super success.
I used to be a purist in how I look at human “worth” or success – that everyone owns his own moral outcome and destiny. But I realize that in practice, Gladwell must be right. People who “succeed” do benefit from combinations of circumstances coming together at just the right time. And there is no question that there are a lot of relatively painless things that could be done to spread the opportunity around for more people. Still, judging from practical experience, it seems that, person for person, some people simply have more of certain gifts than others. In churches they call these the “gifts of the spirit” (often in January, in post-Epiphany sermons).
What worries me more is the loss of opportunity. In fact, when I was growing up, and as a young man hearing a lot from the Left in the 60s and 70s, “equality of opportunity” seemed like a moral mantra. It seemed that our society had gone to great lengths to determine who “deserved” a better life, based largely (during the Cold War) on academic smarts and family role-model fulfillment, but we were all too willing to allow those “less deserving” to be sacrificed in war. We had a draft, and we mulled over deferments for the married, for fathers, and then for students, gradually eliminating these. That, to me, seemed the most important possible moral issue of the time, even though we had won world wars against nations promoting ideologies of inherent “superiority.”
Gladwell, in fact, discusses the role of military service in making or breaking personal fortunes, and it can work both ways. People lose “opportunity” because of war or because of all kinds of calamities beyond their control. But, around the world (particularly in poorer countries or families), a bigger reason is family responsibility imposed by the activity of parents. In Africa, many eldest children have to raise siblings left to them when parents die of AIDS. But that sort of involuntary responsibility happens all the time in all cultures. The pure libertarian idea of “personal responsibility” seems hardly workable in a real world for many people.
It’s always sounded to me that the communitarian values in the Gospels are related to the practical reality that most people’s “direction” is shaped as much by forces around them (and the needs of others) as by their own innate abilities. The “Parable of the Talents” (mentioned as a chapter head in Gladwell’s book) offers a particularly grueling moral paradox. It has always seemed to me that this kind of communitarianism is an invitation for abuse of the promise of Salvation by Grace. For example, I don’t want to be “used” for someone else’s purposes after someone else’s wrongdoing, and then told that Salvation makes it all right. (I think I’ve said elsewhere that my not following a classical music career was as much a factor of the Cold War and of the “moral concerns” of my coming of age as of my own ability; I really could have and would have “done the work.”) If there were no Salvation, then people would understand that harm is objectively real and cannot be prayed away. That’s why the New Age idea of “karma” has always appealed to me. It seems to offer both Salvation and some kind of earthly justice. It makes sense to me that the Cross could have made karma resolution possible (even if that is not ordinary theology).
Thursday, November 27, 2008
"Fame" is a new kind of "money," or at least property; domain name registrars don't always protect their customers from "the establishment"
It seems that we now live in a world with two kinds of currency. One is the familiar fiat money, the stuff of “the economy”. The other one is more nebulous, a kind of hyper-individualized self-concept connected with “fame”. We all know that the Internet has made much more of this possible.
But “self-promotion”, as I’ve noted, moves in two directions. One is the ability of anyone to investigate and unearth content and publish it with almost no capital. In some cases, the party may be able to attract considerable attention to the self. This asymmetric possibility disturbs those for whom social hierarchy and organization are important virtues.
I went down this path, because I wanted to make various, nuanced arguments that established pressure-groups, even friendly ones, don’t like to make. I wanted to see more intellectual maturity in what we argue. For example, to argue for “gay rights” (to use the term loosely) we need to talk about more than just immutability or minority oppression. Because of individual bloggers, “the truth is out there” and “special interests” in any political area cannot count on just hoodwinking an oversocialized public.
The other area of instant “fame” is social networking, and that has attracted more of the controversy (“reputation defense”) in the past three years or so. But, the way it is usually practiced, “social networking” really tends to be localized within specific communities or even schools. It doesn’t result in global fame, but it seems to create similar risks and exposures. The fact that we now allow people to expose themselves and others to risks that used to be contained and quantified within the “established” media industries could lead to demands for a crackdown, such as mandatory insurance or a rollback of intermediary downstream liability protections like Section 230 and the DMCA safe harbor.
In fact, Matt Zimmerman has an article on EFF’s website, “Censorship in the 21st Century: Targeting Intermediaries.” The link is here. Particularly vulnerable, according to the article, are domain name registrars. In a few cases, domains have been taken down when registrars did not bother to think through the fact that the law protects them from liability (at least as it stands now). The major cases include a very leftist New York Times spoof (a fictitious July 4, 2009 issue is here), over which the De Beers diamond company, not amused by a fake ad in that edition, went after the registrar Joker.com. The State of Kentucky illegally went after GoDaddy for registering online gambling sites. And the Swiss bank Julius Baer went after Dynadot, the domain name registrar for Wikileaks.
The New York Times made light of this by recalling a “Not The New York Times” caper from 1978, link here from Nov. 14, 2008.
Most of the cases that attract attention in the Internet community tend to be bizarre in some way. But the threat of stifling ordinary speakers who don’t “pay their dues” first, and who enjoy the limelight without the money to back it up, seems to be growing.
Tuesday, November 25, 2008
I’ve looked at Internet speech, whether on blogs, social networking profiles, or other formats, as potentially politically significant coming from almost anyone, if well enough conceived. Today (Nov. 25, 2008), Danielle Allen (UPS Foundation Professor at the Institute for Advanced Study in Princeton, NJ [as explained here]) has a piece called “Citizenship 2.0” on p. A15 of The Washington Post, link here.
Allen seems to feel that Internet debate tended to have a conservative bent at first, particularly as the Republican, Newt Gingrich backlash against Bill Clinton rose in the mid-1990s. Examples include the founding of the Drudge Report (now, on Drudge, hear ye this: a shocking Russian prediction about a future breakup of the U.S.!) and Free Republic [link] (“independent, grass-roots conservatism on the Web”). The Left, with “Move On” (link) (“Democracy in Action”), did not gain much Internet mass until around 2003, as George Soros (the “reflexivity,” “open society,” and “anti market fundamentalism” guy) got involved (although Soros had talked a lot about his theories in his writing in the late 90s).
Allen says that the Right tends to like a “Wikipedia” approach to reporting news, inviting a potpourri of contributors, whereas the Left (the Huffington Post [link], for example) prefers a controlled, hierarchical approach to selecting and organizing what is published. The Obama campaign, she says, went back to the “Wikipedia” approach.
The theory makes sense. The Left seems identified with structured planning of society (just listen to Michael Moore talk about what should be done with the auto industry), whereas the Right, at a certain level, praises personal initiative in a piecemeal and piecework fashion. But then the Right wants to install controls at the family and religious levels, with a potential to delve into concerns about who has earned “the privilege of being listened to.”
Since my blogs and sites are pretty much a one-man show (although I accept and publish all reasonable comments), it sounds like I follow the hierarchical model, even though, as a whole, much of my material sounds like it comes from the Right (at least in a Log Cabin sense). Yet what really matters is how the material is structured and organized, and especially how one “connects the dots” within it. With a corporate structure of some kind, I would quickly have a much more wiki-like operation. All of this is inherent in “knowledge management” itself. It’s all about “the knowledge of good and evil.”
Monday, November 24, 2008
I got at least one random call a few months ago inviting me to become some sort of mortgage loan broker or manager, even when we all knew the subprimes were going bad. The New York Times today (Nov 24) runs an editorial on p A22, “Return of the predators” and it’s not talking about Chris Hansen’s Dateline show. Characterized now as “sharks without a frenzy” (or maybe snakes dropped from a plane), they’re back now as “loan-modification companies.”
The Times talks about both “for profit” and nonprofit consumer loan-mod companies, and the idea that many homeowners in trouble don’t necessarily go looking for help on their own. They have to be courted, it seems, by salespersons. Then the salesperson, to keep his or her job, still must follow the “always be closing,” “100 Mile Rule” sales culture.
The link for the editorial is here.
Update: Nov. 28, 2008
A North Carolina federal bankruptcy judge, Rich Leonard, describes how loan salesmen pushed bill-consolidation ARMs not tied to the prime rate, and how this contributed to the foreclosure problem. He argues that bankruptcy law ought to allow modification of mortgages. "Chapter 13 is not a walk in the park," he writes. The op-ed appears on p. A29 of the Nov. 28 Washington Post and is titled "A Win-Win for Bankruptcy Reform," link here.
Now, it’s Citigroup. Does this string on bailouts ever end? Lehman Brothers, AIG, all the major European banks, partial nationalization of most other big American banks, multiple government receiverships, the big three auto makers with tin cups begging for handouts like bag ladies in shopping malls. Oh, right, they arrived on private jets.
I accidentally misread my Goodyear statement two weeks ago and, caught by the Veterans Day bank holiday (which I forgot about until too late), got the $39 charge. I went ahead and just paid off the entire card immediately. I didn’t get a past-due notice (I got in before the closing date) but missed the due date. I wonder how that affects my FICO score. At least, no bailout for me. Maybe that helps the economy, because Goodyear can hardly be in good shape itself.
The big 3 auto companies make (well, made) elephant cars that nobody wants. Actually, I bought a Chevette in 1979 that fell apart quickly, but by the late 1980s American manufacturers really were turning things around in quality. But, now, it seems they’ve fallen off the cliff as to what consumers need. Just watch Michael Moore’s “Roger & Me.”
The banks goad us into tricky accounts and hit us with fees and whopping surcharges.
The airlines take away services and keep us on tarmacs for hours. (Southwest, you’re much better than the rest.)
Builders offer subprime mortgages to buyers for amounts two or three times what they could have qualified for a few years before.
Wall Street traders simply have failed to learn right from wrong and throw our money away, into the toilet. At least, they wouldn't pass the test on Business Ethics that I authored for Brainbench in 2003.
Employers hold their people to quotas and build businesses on “lead lists” and an “always be closing” sales culture. The movie “100 Mile Rule” says what really happens, sad to say.
Physicians continue to order unnecessary tests and practice defensive, tort-proof medicine. Lawyers chase ambulances. So employers have to drop their health insurance.
So now the government (at the tail end of a "Republican" administration run on "market fundamentalism") bails out all of corporate America. What do share prices mean when most of corporate America is becoming socialized, as we become the People's Republic of America? Maybe Michael Moore is right. Is this the end of capitalism?
At least some of our CEOs would have been turned into Chairman Mao's first set of countryfied peasants had they lived through the "Cultural Revolution," while union workers and mid-level professionals learn to do "real jobs," perhaps, according to the "free market cultural revolution." Go ahead, Anderson Cooper, keep naming names in your list of "culprits of the collapse."
What gets nationalized next? Blogs?
Is it any wonder we’ve entered Great Depression 2?
"Greed is good" (Gordon Gekko, aka Michael Douglas, in "Wall Street"). Nevermore.
Sunday, November 23, 2008
Today, Nov. 23, at a Sunday school class, I heard a reserve Army officer discuss the rules for “sole surviving sons” and daughters. The policy is often forgotten in today’s individualistic culture. But military policy, both in Selective Service and accession and in combat deployment, tries to respect the cultural value many parents place on having a biological lineage, especially a son. This particular officer said that, as a sole surviving son, he was not allowed to enlist until he had a son himself, who also served in the Army. He, already a surgeon, then applied for a direct commission as an Army medical doctor in rehabilitative surgery, repairing combat wounds.
The sole survivor concept fits into the 1998 Steven Spielberg film "Saving Private Ryan" about D-Day, with Matt Damon and Tom Hanks, from Dreamworks.
The About.com website has some material on this. A sole surviving son or daughter may not be assigned to combat duty without relatives’ permission, even in the volunteer military (as it has been since 1973). The web link is this.
Sole survivors do have to register for Selective Service, but are entitled to peacetime deferment should the draft be reinstated. The appropriate link is this.
Saturday, November 22, 2008
I wanted to run down the major topic areas of greatest concern to me. They all have to do with personal sovereignty, and the major forces at work that can affect it.
(1) General morality
(1a) Personal morality in terms of responsibility for choices made
(1b) Personal responsibility in terms of karma and unseen dependencies on others
(1c) Is interdependence necessary, and can intimacy be “forced” sometimes?
(1d) Sustainability as a moral value
(1e) Asymmetry as a moral distractor
(1f) Transparency of information
(1g) Power structures as a source of stability
(1i) Liberty in terms of governance structures, v. liberty in terms of "granularity" of personal sovereignty
(1j) Why we need to be able to count on the law, and may be nothing without it
(2) LGBT Issues:
(2a) “Don’t ask don’t tell”, gay marriage, and gay parents: how these issues relate to “sharing responsibilities” and “sharing uncertainty”
(2b) How the institution of marriage affects true singles and people without partners
(2c) How eldercare and longer lifespans may particularly affect singles, the childless, and LGBT people
(2d) What really causes what we call “homophobia”
(3) Retirement issues
(3a) Filial responsibility laws – a deadly legal trap, and the next big financial shocker
(3b) Paying for and providing custodial care, and why politicians fail to talk about it
(3c) What’s the truth about social security?
(3d) Can we “really” save our pensions?
(4) Speech issues
(4a) “Don’t ask don’t tell” mentality and the “real life world”
(4b) reputation, in the eyes of the beholder: ethical practices for employers, including "blogging policies"
(4c) Implicit content (a huge new legal gray area, particularly for the Web!)
(4d) Accountability of speakers: downstream liability, and tort reform
(4e) “Free entry” is not necessarily a “fundamental right”
(4f) How to make filters, label, and accountability software work
(4g) Critical copyright and trademark litigation and bills
(4h) Will the marketplace genuinely give "net neutrality"?
(5) Communications security issues
(5a) Internet “driver’s license” and potential liabilities for security problems
(5b) Schemes to end spam
(5c) “Absolute liability” offenses
(6) Service issues
(6a) The “pay your dues” concept
(6b) the military v. other forms of service
(6c) “don’t ask don’t tell” and any future draft
(6d) “don’t ask don’t tell” and other jobs or services requiring “forced intimacy”
(7) Market issues
(7a) Transparency of markets as a libertarian concept
(7b) What is Soros's "better regulation"?
(7c) Could regulation of markets imply regulation of distribution of speech?
(8) National security
(8a) Which threats could defeat the U.S. and west?
(8b) Which threats are the most intractable?
(8c) What causes the west’s “moral vulnerability”?
Picture: from the newly reopened National Museum of American History, Washington DC.
Thursday, November 20, 2008
If I were a parent with teenage kids, a high priority for me as a father would be that they be able to understand what is going on in the world on their own, without needing others to prompt them to buy things. At about seventh or eighth grade, a young adolescent should start to grasp what the adult world will demand of him or her, and start developing some interest in more adult goals. Gradually, the learning accumulates to the point that the teen can start to connect the dots as to what is going on. The earlier someone learns this, the more welcome school and academic challenges will become.
There are two pieces to this. One is, of course, education as we are used to conceiving of it (including education from parents). But today, with technology changing rapidly and social norms also being tested, people have to be able to anticipate that serious challenges to their lives can develop from the outside world very quickly. We’ve seen all kinds of examples: oil shocks, financial scandals, pandemics (including STDs and HIV), 9/11, natural disasters (Katrina, wildfires, etc.), and most recently the financial Panic of 2008. Schools are stressed, but they probably need to spend more course time on social studies than in the past, because social problems have become more uncertain and complex. And schools need to teach students the legal and ethical issues surrounding the Internet; most school systems are not equipped to do so without asking for more help from Silicon Valley companies.
The other piece is the information itself. Gradually, the web (and the “semantic web”) provides the structure not just for looking things up but for cross referencing them and for seeing the trends and implications of developments. It encourages a certain intellectual self-sufficiency. No longer does one depend on a religious, business or familial “power structure” to determine what one will know or believe. The political value of propaganda and special interests decreases, and “the knowledge of good and evil” can be achieved.
Today (Nov. 20) the New York Times has a story on the “National Page” by Tamar Lewin, “Study Finds Teenagers’ Internet Socializing Isn’t Such a Bad Thing,” link here. The “free entry” Internet went from a forum for publishing, discussion, and perhaps business or political self-promotion to social networking, a development that brought the online and physical worlds together and broke down walls of “privacy” and “ownership.” What social networking sites reinforce is the idea that the “transparency” of information (so much discussed now during the fiscal crisis) includes an awareness of the source. They remind us that structure and “power” in social relationships still matter, particularly when people have to do things they otherwise wouldn’t “choose” to do, just to get along or to manage.
That brings me back to “implicit content.” The advent of social networking sites a few years ago made people sensitive to the purpose or motive behind people’s postings or content. For example, if a high school principal goes to see the movie “Elephant” (Gus Van Sant), she will probably say “it’s only a movie,” however horrific its premise and content. But if a teacher in that school had authored the same screenplay, posted it on her own personal domain, and let it be found, she’d probably be gone quickly, and maybe even get questioned by the district attorney. What people make of things (and “reputation”) is creating problems that we are not close to having an ethical or legal handle on.
Tuesday, November 18, 2008
I submitted an op-ed to the Washington Times on Monday, Oct. 27. Since I haven't heard anything, I thought I would run it in my main blog. See also my Letter to the Editor that was published Aug. 22, 2008. Please also note that I have major accounts of "what happened" when I was substituting on July 25 and July 27, 2007 (not 2008) on this blog (see the Archives).
The Teacher Shortage: Career-Switching and Substitutes
Recently, the newspapers have discussed the public schools’ difficulty in recruiting math and science teachers (particularly men) who are both good at their academic subject matter and effective with children.
The problem spills over into substitute teaching in many states. Virginia may be typical. In many districts, one can work as a short-term sub with only 60 hours of college and without a teaching license, but the work is hourly, irregular and has no benefits. The job attracts retirees who have “real world” work experience, and sometimes draws graduate students, musicians, actors, or others who may be transitioning to what they really want to do.
School systems know that substitutes can have isolated rocky experiences, so rather than investigate individual complaints about classroom performance, they typically have a “three strikes” rule: a sub is fired after accumulating, over all elapsed time, three “do not sends” from individual schools. If an inexperienced sub works long enough, there is a practical likelihood that the sub will “expire,” accumulating the three complaints, perhaps after several hundred jobs. Short-term subs often rotate among many schools and naturally may find it difficult to command respect from certain students. Nevertheless, when schools note that a specific sub has had discipline issues two or three times, even with students who are normally poorly behaved, they often eliminate the sub from their school. In time some subs get fired, and I’ve personally heard other former subs talk about this problem.
I completed over 200 assignments from 2004 to 2007 in Arlington and Fairfax. With more “mature” classes (particularly Honors and AP) I found the job rewarding, and students seemed to benefit from contact with an older person who had decades of work experience in the business world. But I ran into a few serious sporadic problems with discipline (euphemistically called “classroom management”), particularly in middle schools. Not having been a parent and having led an “individualistic” life, I simply did not “connect” with certain kinds of students, who seemed to need focused, parent-like attention. Furthermore, in a couple of isolated special education situations, I was placed into circumstances that, because of the surprise element, I found frankly humiliating. I would have been happy to help students work algebra word problems or factor polynomials for eight hours a day, but that isn’t what the job was.
Furthermore, my own public website led to a controversy at one school. A small amount of material on the site, not objectionable by normal standards and protected by the First Amendment (as applied to public employees), could have been misconstrued (out of context) by students or parents because of some extraneous or coincidental circumstances. But I have to admit, I relate to my world in a much more individualized, less “socialized” way than “normal,” and students might need to perceive me as a “role model” according to their cultural frame of reference rather than mine. Subs may be regarded as role models and authority figures by students or parents, but subs may not have the actual classroom authority or compensation to live up to this expectation, and may view themselves as just temporary facilitators. This can create a dangerous situation, where students or administrators or others in the outside world may magnify the significance of things substitute teachers say or do even outside of work.
From my experience, I think that school systems need to rethink carefully how they recruit teachers from other professions, especially those making “career switches” after layoffs or early retirement. They need to realize that there are many different perceptions of “role model” and how proactive teachers should be with discipline and motivation. Particularly, school systems should hire substitutes more carefully, with an eye to eventual licensure, and provide more training (toward licensure) on school system campuses. School systems must craft clearer policies for teachers’ Internet speech, even when done from personal resources at home, when it can be found so easily by students (with search engines) and be misinterpreted. Students, after all, have yet to navigate English and social studies in high school to learn what content context is all about.
If school systems really need more teachers, they must get serious, even in this difficult economic climate. A “career-switching” retiree like me then has to determine if the entire process of practice as a substitute or student teacher and course work (and the financial and time commitment) for the career change to teaching is really appropriate.
Friday, November 14, 2008
Yesterday (Nov. 13) Dr. Phil made a typical remark, that life isn’t “fair” (Donald Trump has said that, too). I thought back to my posting Wednesday about “fairness.”
A half century ago, we had an unwritten social contract that regulated the lives of individuals in relation, particularly, to the family. It went something like this: if you did anything for yourself – experienced intimacy, or gained attention or recognition – you willingly accepted some uncertainty and shared the potential burdens of others, especially in the family. That got to be focused on the “no sexuality outside of marriage” idea (and all of the social pampering of courtship, marriage, and “blessed events”) – it was a way to guarantee that everyone shared the responsibility and “risk,” financial and emotional, of providing the next generation (and caring for the past one, which is rapidly becoming a major issue again). It had always been mixed with gender role expectations. There was a sense that if this principle were not universally followed, trust and morality would break down and no one would be able to raise their own families. When some kind of duty is "mandatory," people may find "meaning" in fulfilling it. Yes, often this idea was traced back to religion and various passages of scripture. It involved covering the idea of individual self-concept with identification with faith and family, often established by social structures ruled by others “in power.” This fundamental idea explains what we know as “homophobia.”
We changed our minds about this, starting the process largely in the 60s, because of the confluence of a number of factors, including technology and a recognition that intellectual and personality diversity could actually be a survival resource for a culture when challenged militarily or from any external source. Many factors went into this process, which followed on the heels of the Civil Rights movement. These factors included increasing opportunities for women and gender equity in work and even (almost) the military, and “gay rights.” Some people think we went too far, turning the baby boomers into a “me generation.” But the end result was that previously sacrosanct ideas about blood loyalty and family cohesion were breached. Younger generations wanted to place more emphasis on “fundamental rights” defined in terms of personal choice (and to define responsibility in terms of that choice), self-expression, intimacy with consenting adult significant others, and a flexible notion of “right to privacy” (which has certainly changed with the Internet). Previous ideas of freedom, all the way back to the American Revolution, had left the family and local community as an intermediary between the individual and the capacity to reach and affect the rest of the world.
Where does “fairness” fit in? In the 50s, there was a definite idea of “local fairness” about the place for people in the family, and indeed society became duplicitous in its insistence that every adult should (whatever the competitive concerns) be able to start and keep one (through monogamous marriage). There was gross unfairness at the macro level, most of all, in the United States, based on race, but also on economic class. Indeed, much of the economic instability of the past could be cast in terms of extreme differences between the spending power of the very rich and the poor. The emphasis on individualism, and the transparency of information transmitted to and from the individual (“the knowledge of good and evil”), have gone a long way toward reducing past injustices based on race, religion, nationality, or any other group affiliation, as we now see with the result of our latest presidential election. But the asymmetry inherent in hyper-individualism can introduce and exaggerate injustices of its own and introduce previously unimagined perils. There are always people who fall behind and feel left out, who were better off in a world where the family for its own sake had more emotional meaning. We have gotten used to somewhat exaggerated ideas of personal freedom and sovereignty, and must hang onto a “stalking wild pendulum.” There is no such thing as a perfect system or a perfect moral ideology. It is people who are good and bad. There is always a need to balance choice with duty.
What anyone’s opinions (including mine) are about addressing all of these inequities is not so important as is understanding the history of how things came to be the way they are today. A major purpose of the detailed narratives of my “Do Ask Do Tell” books (especially the first one) is to trace how things changed. The period in which I came of age (from the late 50s to about the time of Watergate) is as critical to our understanding of individual rights as any in our history. Yes, I would make a movie about it if I had the resources.
Thursday, November 13, 2008
Barack Obama’s transition team is openly asking for sensitive information from applicants for the political appointments that it will fill. This item appeared on ABC's "Good Morning America" Nov. 13. Some of the interesting questions include:
(1) Identifying the applicant’s “Facebook” page or, presumably, any other social networking page (Myspace) or Profile (Linkedin) or blog or personal website.
(2) Identifying the legal spouse’s job to avoid nepotism (not clear if this would include domestic partners who were not legally married)
(3) A history of any bad text messages or emails that one might have sent
It’s important to note that the federal government does not ask these questions with normal civil service jobs (such as on USAJOBS).
But, given the concern over “online reputation” the way the media has reported it recently, it brings up the question whether some private employers (even for less “sensitive” jobs) or even college admissions officers will start asking specifically about social networking profiles and blogs.
Later, on ABC "World News Tonight", the report expanded to show the "63 questions" on the form, which include listing all previous relationships and intimate partners.
Wednesday, November 12, 2008
Recently the Dr. Phil show has covered the topic of school bullying, and has introduced a book by Dr. Phil’s son Jay. Other books on the topic, by Nancy Willard and Susan Lipkins, add material to understanding it. Today I want to put a personal spin on the subject, with what happened to me in the 50s, and lead into a discussion about the flip side of how we view structured hierarchical social relationships in a society that wants to value individual freedom.
I remember feeling tormented, somewhat in grade school, but especially in seventh grade. Since I was not competitive physically as a boy, I began to develop feelings of not just modesty but also shame about my body, a most unacceptable emotion. I do not recall the incidents as well as I recall the questions about being expected to “hit back” and then the occasions when I did retaliate. A few times I fought with my fingernails (possibly causing a serious cosmetic injury that year; I still cringe) and then verbally, in one particularly upsetting incident in ninth grade. In senior high school, I was very much better off. But, as I have covered in the blogs, I had quite an experience in my first semester at college (1961) (William and Mary) and wound up being expelled for admitting homosexuality. I remember skipping a session where freshmen were supposed to be hazed.
The modern interpretation of this problem is to tell the victims that it is not their fault. Certainly, when I went to school, administrations were unwilling to stop the “picking on” of kids who were different, and even today schools say that their hands are tied in policing behavior off the school campus (as on the Internet). Why did authorities have such a diffident attitude? I think there is a natural bias in our culture to encourage “strength” and to expect the “weak” to accept their place in the world. This is a problem.
Some of our more intelligent animals, like wolves and lions, have hierarchical social organizations that do not allow all members the same rights or even the chance to reproduce. They have developed power structures so that their groups, viewed as wholes, have the greatest chance of surviving and reproducing in an uncertain, possibly hostile world. Modern human society, we thought, is supposed to be different. We have the rule of law. If everyone is equal before the law, we can guarantee a level of individual rights. But we have to “guarantee” that the law will work.
I recall after my expulsion my father said once, “Now we have to worry about what everybody thinks.” Appearance is more important than truth. It’s easy to dismiss this as intellectually illogical. In a real world, social dependencies matter. We all depend on the sacrifices or subsidies of others that we don’t see. So we get a sense that people have to be responsible for each other, whether they choose to or not. Taking that responsibility requires competitive skill. In earlier generations, notions about competition were tied to gender, and to ideas that men had inherent obligations and a “duty” to “protect” women and children (first in their own families) that went beyond the scope of personal choice. Someone with more ability could do more for others, but also could have more power. That meant that someone with less ability (me, then, in terms of male capabilities) did what others said. There was a sense that such a hierarchical order was morally “right.” This moral paradigm reflected a sense that the outside world, even the legal system and market economy, was uncertain, and that families and local communities needed reliable “chains of command,” a locus of authority, and a sense of loyalty from members. This kind of “moral thinking” accepts the existence of “enemies” as a real part of the entire moral landscape of “right and wrong” and is willing to vilify people who are too disruptive to harmony. Indeed, it’s not hard to get from that kind of thinking to the totalitarian systems that we were supposed to be fighting (and had beaten in World War II).
My own father would make something of this, trying to make me learn to do certain manual tasks, unnecessary in a practical sense, a certain way according to the discipline imposed by others. It seemed he wanted me to accept the idea of doing certain things just so, to show respect for “authority” or power for its own sake.
In time, I got a pass on my physical inadequacies. I got my C’s in gym, and the high school didn’t count them in the GPA. The Cold War world had awakened to the idea that it needed its dorks and nerds. (It also meant that my talent for piano and composition had to be set aside.) Yet getting that pass left a certain moral imbalance. I finally served in the military, but after graduate school (and my recovery from the expulsion). I was sheltered while in the Army, whereas others who were physically more forceful but not academically inclined made sacrifices from which, according to the moral standards of the time (centering on the moral debate over draft deferments), I benefited. One can suggest that, while school administrators should have stopped the bullying, I should have been made to measure up. (That is my “Point 1.”) That would be much easier today, when modern medicine could determine the cause of my social, mechanical and physical development problems.
Had I overcome these issues, would my life have been different? Probably. I might have dated, married, had children, and learned “family responsibility”. And then perhaps I would have wanted to break the marriage to experience what came next anyway. Which outcome was right? For me to spare someone a possible divorce, or to create and share parenting responsibility? It’s not necessarily a wash.
Now, I could say that this narrative belongs on the GLBT blog, but I think the problems are broad enough to belong here. This brings me to how I “came out” – twice, as narrated in my books. Because of the effect of the teasing and “bullying” and the way I reacted to cultural norms, I developed a different psychological strategy. I came to value, through a process of upward affiliation, men who were both “smart” and “strong.” This got elaborated into a fantasy life and erotic feelings that mapped back to a belief that I knew and could choose what was morally “good.” One of the most important points (“Point 2” of this essay) about this process is that I was admitting to myself that I “bought” the moral values that say that notions like duty and local power are meaningful and necessary. I did not believe that everyone was equally “worthy.” Furthermore, modern technological culture offered other opportunities for self-expression (for some of us) besides procreation. (I could turn that around and make it more negative.)
During the William and Mary and NIH treatment periods in my life, as I’ve explained, I noticed a flip in the attitudes of those who had previously made fun of me. They sensed a dark side to my personal values: I could place myself in a position of being able to remind them that they could fail physically, too. I think that this observation helps explain, in a practical sense, a lot of what we call “homophobia” or “homohatred” today. It’s obviously relevant to the way the military views the problem. It is also an admission that competitive male heterosexuality invites its own contradictions, which somehow must be contained.
Remember, I made no pretense then of claiming “equality.” I just wanted the freedom to “be myself.” For a period in my life, that freedom was taken away to assuage the emotional needs of others. I would eventually understand that many marriages, to remain active and stable, are predicated on the notion that the families they form become a source of identity for everyone, including other adults who don’t marry and have their own children. Marriage seemed to need to claim a monopoly on how sexuality was to be used. If one wanted full adulthood, one needed to be open to the particular kinds of intimacy and emotional self-giving that marriage and parenthood require. This turns upside down the modern way of looking at problems like teen pregnancy as simply a matter of making bad choices. Openness to procreation and sharing in its uncertainties was to be expected of everyone. For example, this was a very important part of Vatican moral thinking, to the point that the Catholic Church managed the issue of men who do not want to procreate separately, offering a celibate priesthood, so that these men would still participate meaningfully in carrying on the generations. It’s important to remember that both family and church were expected to provide a source of collective identity to protect those who really, through “no fault of their own,” were less able and less competitive. This is really how people thought, like it or not.
The world, it seemed once I was in early adulthood, would feel more comfortable if I, despite my lack of manly competitiveness, would marry and have children myself, so I couldn’t remain in a position to “kibitz” the masculinity of others. (Kibitzing the chess games of others is fun.)
Nevertheless, for about three decades or so, I would enjoy my “urban exile,” living my own life my own way, as if on another planet. Until the 1990s, the gay community was relatively separate socially from the rest of society, so others learned to tolerate the existential problems that gays could raise. The one huge exception to this was the reaction to the AIDS crisis in the 1980s.
But in the 1990s things changed, for a couple of reasons. The biggest stimulus was probably the opening of the public Internet and World Wide Web, which brought different kinds of people into what author Clive Barker calls “reconciliation.” The other big reason was the emergence of political issues, particularly gays in the military, which President Bill Clinton would bring up when he tried to lift the ban, as well as gay marriage and civil unions, which were percolating in the 90s. One major context for these issues was that gay people should share the risks and responsibilities of a free society with everyone else. Now, with a global economy and the family less central in American life, the idea that someone who did not share these responsibilities could become a “second class citizen” became important. But – just think about it – the new paradigm for equality and the libertarian, hyper-individualistic idea of “personal responsibility” could make less secure or “less competitive” people in “the straight world” feel personally exposed.
In 1997, I entered the world of self-publishing (and self-promotion) with my first “Do Ask Do Tell” book. In the first few years I focused on “fundamental rights” and on building a belief system around “personal responsibility,” which could be cast as requiring sometimes involuntary responsibility for others as one’s “dues.” But some of that is a way of projecting the more communal idea of encapsulating responsibility in the family. Circumstances went wrong. We had 9/11, financial scandals (with executives who no longer knew right from wrong), and destabilizing concerns about (nuclear) terror, pandemics and global warming; and I had my eldercare responsibility. The notion that one has to deal with other people on terms other than one’s own choosing (the “personal responsibility” and “harm” principles) in an unstable world became morally compelling. Family responsibility wasn’t limited to adults who had “chosen” to have kids.
I still maintain, as I always have, a conviction that there is an uncanny connection between the issues that led to my college expulsion (and my subsequent draft physicals and military service) and the 90s debate on “don’t ask don’t tell” that continues until today. This, over time (including my involvement in the COPA litigation), morphed into a “paradigm”: in the larger world, “don’t ask don’t tell” came to mean accepting a certain amount of moral hypocrisy and information hiding so that we could get along without dead-end existential confrontations. The DADT philosophy seemed to infect our entire government, leading to today’s Wall Street crisis.
One of the most important concepts to me became public information transparency. (Surely, investors needed that!) People tend to participate in democracy by letting surrogates (aka paid lobbyists, often enough) speak for them, which encourages the “I’m an oppressed victim” public whining and the childish demonstration campaigns, and deliberate public disinformation that we saw (from the religious right) in measures like Proposition 8 in California. (Both sides look foolish!) Indeed, ethical conflicts can keep people from speaking truthfully for themselves. But that’s the only way we can have a society where we really understand how other people think.
Now, I’m at "Point 3", which is that some parties are unnerved by my drawing public attention to myself when I haven’t taken on the same “family responsibility” as other people (as “real men”) and don’t personally experience their struggles before speaking.
I did embrace the libertarian point of view with the first book, and I still do to some extent. But I’ve come to realize that having no government and no regulation can lead back to an overly decentralized world of bullies and enemies. I am not trying to interfere with people getting organized help from government in any area, like collective bargaining, health care, eldercare or, now, mortgages. (When it comes to foreclosure, as John McCain said, “I am my neighbor’s keeper,” it seems.) What I present is a lot of observations, with links or connections between the observations, pointing out often overlooked problems; I am not really advocating a specific (ideological) agenda. But I do think that there needs to be a fundamental change in the way people understand issues and pursue them. People need to become much more objective, and much more willing to understand the “other side(s).” The free flow of information on the Internet helps us achieve that. But the abuses and gaping exposures (for the innocent) still can threaten the entire “free entry” system that we have gotten used to, and very suddenly, as I’ve pointed out in the past several posts. We could get back to a system where people must have “standing” before they have earned the privilege of being listened to.
In recent years, some parties have tried to challenge me to function as if I could act as an authority figure (for “authority’s” sake) and bond with others in forms of “intimacy” that used to be expected of men who make themselves visible to others. They have tried to “accommodate” me with situations where I could serve as a “male role model.” Some of these situations occurred in bizarre (and in one or two cases, potentially humiliating) circumstances when I was substitute teaching. Others have occurred with calls about jobs predicated on manipulating others through sales, or even in charity-related circumstances. I have a reaction to this. You want to say “don’t tell” and ask me to “pretend” to be a father figure, in a world where the United States Code has a statute (relating, to be sure, to the military) stating as a matter of law that I am not fit to assume some of the responsibility for preserving freedom. Indeed, had the bullying that I talked about earlier never happened, I might feel much more all right about this. Okay, that’s what accommodation means, you say. My unwillingness to take this kind of responsibility for someone in circumstances that you dictate makes a perfect excuse for calling me “hostile.” And maybe it’s “dangerous” to have more than a low profile after admitting that one is no “authority figure” able to “protect” others (outside of depending on the law). But, still, you want an act. In recent years, I have gotten interested in acting, but in situations where people know it is acting (the movies).
Again, it seems as though the underlying idea is that, before being heard from, someone should prove that he (or she) can support others beside the self, even if single. Someone should compete for responsibility and authority and get concrete results, measured in numbers or financial results. Okay, you say, I worked for twelve years in life insurance, so why don’t I want to prove that I can manipulate people into buying it. Well, it was a good living, but it isn’t something that I did myself. I’m interested in managing an activity (and the people working on it) when it has to do with a problem that I have researched and concluded is critical (I’ve documented plenty of them on these blogs).
Most of my life, I’ve had a sense that most people want to see everyone play by the same rules, on a level playing field. They also want to see people share risk and uncertainty, and own up to the way they depend on others out of sight. That was the idea of “morality” that I grew up with. There was never any question that you took responsibility for your own actions. You understood that you had some responsibility for others. We call this “fairness” or “justice,” and we expect the political and legal system to work for us in these areas. It doesn’t always, which is one reason why people put stock in matters of faith and family and accept interdependence. I have come to understand indignation and self-righteousness, and in my own beliefs I can become as “fundamentalist” as anyone else. No one wants to be forced to support others in a manner against their beliefs. No one wants to feel cheated. No one wants to believe they are expected to subsidize wrongdoing. But even saying this, I can understand how such thinking can run away from people, in other parts of the world, and lead to the tragedies we have seen.
I think that we will need to have some townhalls or public events about how new challenges will be met on an individual level, as issues like national service, climate change, and worldwide economic imbalances portend major changes in the way we live. Perhaps we will indeed adopt a new mantra, “pay your bills and pay your dues.”
I guess I need to say how I will pay mine. It’s a lot easier if I can get paid to tackle these problems, and don’t have to play into some phony hierarchy first.
A stable democracy needs both “truth” and “right”. “Truth” refers to the transparency and accuracy of information easily available to the public. “Right” refers to a “power structure” reasonably related to the merit of the people occupying it. Without “right” (and even information coming from the “right” places), truth will not be recognized or acted upon. Without truth, power structures will “make” wrong rather than right.
Tuesday, November 11, 2008
DMCA 10-year anniversary: We may owe today's Web to DMCA and Section 230 immunities; what is the price?
David Kravets has a provocative article on the Wired Blog Network, dated Oct. 27, 2008: “Ten Years Later, Misunderstood DMCA is the law that saved the web.” The law originally passed in 1998 as an apparent compromise between the need for spontaneity on the web, and protection of genuine copyright concerns.
The main point of the article is that the “safe harbor” provision protects intermediaries (like ISPs) if they take down allegedly infringing materials immediately. Unfortunately, this has become a “shoot first” practice that sometimes burdens small speakers. In practice, it probably does not much affect speakers who do “all their own work” and do little copying or quoting. And in practice the biggest problem seems to be with copying videos. Sometimes there are controversies over reproducing excerpts from news stories, especially from the AP.
The link to the story is here.
The “yank first” practice hit John McCain’s campaign, as YouTube took down some videos, as in the story “YouTube to McCain: You Made Your DMCA Bed, Lie in It,” by Sarah Lai Stirland, link here.
The other main “safe harbor” kind of law is Section 230 of the 1996 Telecommunications Act (better known as part of the “Communications Decency Act,” the “decency” portion of which was struck down in 1997 by the Supreme Court, to be “replaced” by COPA, which would also fall). Section 230 by and large grants forum hosts and bloggers immunity from liability for comments posted by others, and protects ISPs and publishing services from liability for content. That is, it treats ISPs like “phone companies” rather than publishers. The analogy does not always work in other industries. In commercial motion pictures, for example, both the production company and the distribution company could be held liable for an intellectual property tort (like libel).
But a major threat is that some parties believe that “amateur fame” or amateur-generated content is not a fundamental right that the law should bend over backwards to protect, when there are so many loopholes for abuse that sometimes (although rarely as a mathematical proportion) result in tragic circumstances. That sets up the controversy we have today (as in my posting yesterday, Nov. 10).
Picture: George Washington University, including old student union building mentioned in my post Friday Nov. 7.
Monday, November 10, 2008
On September 28 on this blog, I discussed the subject of insurance for bloggers. I added some more references to that entry yesterday after getting an email from someone in the insurance business.
Media perils insurance, also called media risks, has long been around. Until the Internet age, it largely insured book and magazine publishers (and authors) and broadcast journalists. In general, the insurance industry knew how it could calculate the actuarial risk for “conventional” large media organizations who followed well-established procedures, especially for fact-checking. As noted before, there have been a few attempts to offer media perils coverage for bloggers and less "established" writers. Some companies have tried to underwrite such coverage under "umbrella" policies associated with auto or home liability; but intellectual property issues are so different from physical property problems that it does not sound like a good idea to try to bundle them this way; doing so could conceivably jeopardize the more conventional portions of casualty and liability coverage.
Obviously, estimating the “asymmetric risk” for “amateur” bloggers seems to involve accepting a lot of uncertainty. It rather resembles “L’Hopital’s Rule” in calculus: you have two unpredictable functions that you are trying to compare in a limiting case. One unpredictable factor is the likelihood that someone will go after an amateur with no real pockets. It has not happened much so far, although the ABA says that this is increasing. On the denominator side is the fact that bloggers typically don’t have the resources for fact checking. Another uncertainty is the range of subject matter, and the range of setups of personal sites, from conventional sites to blogs to social networking profiles. Still another uncertainty is the possibility of “libel tourism” litigation from overseas (where the websites can be read), although the likelihood of collection of judgments is perhaps very low. This ties into concerns about frivolous litigation or SLAPP lawsuits intended to bully or intimidate people of lesser resources into silence. Some states, like California, have made considerable progress in reining in this problem. It should be noted that SLAPP (strategic lawsuit against public participation) was an issue before the Internet and tended to focus on local, real-world protest activities, often against local “powers that be.”
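The point about comparing two uncertain quantities can be made concrete with a toy calculation. Here is a minimal sketch in Python using the basic actuarial identity (expected annual loss = claim frequency × average severity); every number in it is hypothetical, chosen only to illustrate how wide the plausible premium range becomes when neither factor is well known:

```python
# Illustrative sketch only: a toy expected-loss calculation showing why
# pricing media-perils coverage for "amateur" bloggers is so uncertain.
# All figures below are hypothetical, not actual actuarial data.

def expected_annual_loss(claim_frequency, average_severity):
    """Basic actuarial identity: expected loss = frequency x severity."""
    return claim_frequency * average_severity

# For an established publisher, both factors are known within narrow bands,
# so a premium can be quoted with confidence.
established = expected_annual_loss(claim_frequency=0.02, average_severity=50_000)

# For an amateur blogger, both factors are wildly uncertain, so plausible
# estimates of the expected loss span orders of magnitude.
amateur_low = expected_annual_loss(claim_frequency=0.0001, average_severity=5_000)
amateur_high = expected_annual_loss(claim_frequency=0.01, average_severity=500_000)

print(established)                  # 1000.0
print(amateur_high / amateur_low)   # 10000.0 -- a four-order-of-magnitude spread
```

A spread that wide is effectively unpriceable, which is consistent with why insurers have hesitated to underwrite this market at all.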
Still another problem is “online reputation.” This is the area where the practical problems live on the Internet. Reputation is in the eye of the beholder, and some people in publicly sensitive jobs could be more harmed by rogue comments on the web than others. Families also vary in their sensitivity to “reputation.” The “Gossip Girl” syndrome would surely worry insurers.
The other problem that could worry insurers is controversial subject matter. They may worry that controversy will attract frivolous litigation, with unpredictable results. If a site discusses the topic of “gays in the military” or “don’t ask don’t tell”, an insurer may wonder if the purpose of the site is to do “forced outings” of specific gays in the military (obviously inviting legitimate litigation), rather than discuss DADT as a social and political issue. The insurer may simply be unable to “tell.”
In my case, it’s also the range of subject matter. My blogs cover “everything” because I believe that all political and social issues are interconnected. I think, for example, that there are ethical and political connections among such issues as national service, a possible draft, gays in the military, gay marriage, gay adoption, eldercare, and Internet speech. To cover one of them it becomes necessary to cover them all. Otherwise the issues get balkanized. The ability of individuals or small groups to cover “everything” (and "connect the dots") provides an important break against special interests (who admittedly want to protect their turf and lobbying jobs). Otherwise, we are left with juvenile activism where people carry picket signs or email form letters written by others but have no political or social intellectual awareness beyond their own immediate “needs”. Democrats – take heed – you’ve been guilty of exploiting this!
There have been a few horrific tragedies resulting from wrongful Internet behavior. Most of these have to do with cyberbullying, impersonation or outright defamation, often targeted at relatively narrow audiences such as public school student populations, but often made available for the world to see. By and large, these incidents have nothing to do with “issues,” but their notoriety can affect the long-term ability of everyone who wants to use the Internet as a forum for debating issues.
The danger is that people will demand that laws be passed to require all amateur “global speakers” to have insurance, and such a measure would shut down much of the Internet world as we know it. (Imagine the effects on Wall Street.) The facile model would be auto insurance. One can, after all, make the argument that the First Amendment, while conferring the right to speak or assemble in a real-world manner, does not guarantee a fundamental right to “fame,” “free entry,” or global distribution. If one has global reach, one should demonstrate the resources and personal accountability to cover the risk one might create for others – at least that’s how the secondary argument could go. “Asymmetry” putatively triggers novel moral concerns that challenge our previously sacrosanct model for individual rights and particularly individualized speech. But such an argument sidesteps the real problems: frivolous litigation, and an ambiguous notion of what “reputation” and even “privacy” mean today, as George Washington University law professor Daniel Solove has pointed out in his last two books. Furthermore, it ignores the lack of ethical standards in the Human Resources (and college admissions) world as to how Internet profiles and blogs should be viewed when found during the application process.
A concomitant risk is that political pressure could be brought to bear to weaken Section 230 protections for ISP’s and comment facilitators, or to favor large media interests in the way DMCA safe harbor provisions are used. The Viacom litigation in process speaks to that point.
There is a lot at stake here, not just future Internet company earnings and viability, but also the way our democracy functions. This issue really can become cannon fodder for the special interests. The new president Obama will surely hear about this and need to think about it carefully, if quietly at first.
What should be done? Well, it’s incremental. The most important step is to seriously consider national tort reform, to reduce lawsuit abuse. Maybe we need a “loser pays” system, or at least much more careful judicial oversight of intellectual property litigation. (This brings up all the issues that came out of the DMCA and P2P disputes, of course.) We should have national standards with respect to the SLAPP issue. And we need to pass the in-process legislation discouraging foreign “libel tourism” immediately.
Another step is to include intellectual property instruction in public schools. Kids need to learn about copyright, patent, trademark, and, more importantly, the harms of libel and defamation. Dr. Phil (and his son Jay) has done us a good favor by opening up the subject of cyberbullying. But schools are hardly prepared now to offer this instruction. Many times administrators and teachers themselves are ill informed on the problems. They need to hire help from the software industry.
Another point is that concepts like “privacy” and “right of publicity” need some rethinking, and there may need to occur some changes on common torts manuals.
Likewise, the Human Resources world (and college admissions world, too) needs to sit down and figure out what practices in “background checking” candidates on the Internet are ethical. There is a lot of carelessness in the way this is done, as search engines and social networking sites (and Web 2.0) were never designed to be used this way. What started as an opportunity for open democracy could be perverted into an instrument of social conformity. Just look at China! This is an area where the president can persuade or jawbone business, but there is not much (about "private" HR practices) to be easily done legislatively.
The law background of our new president should serve him well in understanding these issues. But this area is essentially non-partisan. Some of the points I bring up sound like they come from conservatives and libertarians. Indeed, Democrats will have to resist the temptation to support changes in the legal environment that suppress individual initiative, encourage co-dependency, and favor well-established special interests.
Sunday, November 09, 2008
Some colleges check applicants' social networking profiles: again, showing concern about "online reputation"
Here we go again. The NBC Today show reports this morning (Sunday Nov. 9) that about 10% of colleges and universities (or their admissions officers) check Facebook and Myspace profiles (and presumably other personal blogs) of applicants, some without informing applicants. This report comes from the Kaplan Test Prep and Admissions service in this press release, dated Sept. 18, 2008. The title of the story is "At top schools, one in ten college admissions officers visits applicants' social networking sites".
However Northwestern University told NBC that it considered a college’s doing so without informing students to be unethical. Many other colleges seem to agree. Many reports fail to note the risk that a college could pull a profile or blog for the wrong person and no one would ever know it. Colleges could, however, ask students to identify any profiles as part of the application process (and so could employers).
The general impression is that social networking activity is more likely to hurt students (give colleges a reason to reject them) than help them.
There could be other reasons to visit social networking sites besides checking reputation: to evaluate writing style, spelling, and general competence!
It’s important to note that the First Amendment protects individuals from government interference with speech, not from that of private employers or institutions. It’s an interesting question whether a publicly funded state college or university is violating the First Amendment by checking behind a student’s back. Public employees, even public school teachers, do have some First Amendment protection relative to work.
Employees of private businesses are protected by the right to organize and engage in collective bargaining. But they are not necessarily protected by any supposed right to distribute their own speech publicly, which social networking sites and blogs enable very effectively.
It is very interesting to see the enormous variety of perceptions about the First Amendment in our culture and how it tracks to an enormous range of opinion regarding personal reputation (including, now, online reputation). It would be an interesting project to compare free speech today to what the concept of the Founding Fathers was when they wrote the Bill of Rights, or what the colonists actually experienced before the Revolutionary War.
Kevin Tibbles provided the report.
Friday, November 07, 2008
I remember sitting down with a cheeseburger and fries in the canteen on the first floor of the old George Washington University student union on G Street in Washington, next to a fire station, across the street from the then-famous Quigley’s, one early evening in the last week of October 1962, and watching President Kennedy address the nation about the Cuban Missile Crisis on a black-and-white television monitor right above my line of sight. An hour later I would be safely tucked away in class, but I remember that it hit me how dangerous things were getting.
At the time, I had a somewhat embarrassing living arrangement that I didn’t talk about. I was an in-patient at the National Institutes of Health in Bethesda, MD, in a project studying two groups of mental patients: those who had experienced difficulty adjusting to college (an obvious and now ironic Cold War concern), and those with “family” problems. I was supposed to belong to the former group, but they sometimes mixed.
I was the only patient who regularly left the Clinical Center to go to college. I took the tediously long Friendship Heights bus line to get to and from school, “on pass.” I was the only patient who really followed what was going on in the world (we called it “on the Outside”). There was no television on the Unit, and probably the doctors wanted it that way.
There were perhaps a dozen patients, some my age (then 19), and I got along well with some of them, and not with some others. Each week there was a full schedule of individual therapy, group therapy, family therapy (sometimes with art), group activities, and “unit government”. I do recall being nudged to disclose certain fantasy materials in the repeated individual therapy sessions, as if such a "confession" would lead to a miraculous, curative revelation. It didn't happen, of course. I thought of myself as more “intact” so sometimes I could bully others in subtle fashions just a little myself. (I’m going to get into the bullying problem, as recently covered on a Dr. Phil show, in more detail on another post soon.) For example, one time we had a ping pong “tournament” and I disoriented a couple of the other patients by winning games by playing defensively, avoiding slams, and keeping the ball on the table.
I was probably the only patient who knew about the Cuban Missile Crisis and what it could mean. Yes, I mentioned it a few times, and in a somewhat unkind fashion. Some of my life in the "Unit" during that period plays out in my mind like a Clint Eastwood movie (an episode from "Changeling" does bring all of this back to mind). But it speaks to a point that is not nice.
Of course, we all knew the doctrine of MAD (mutually assured destruction). But, in the back of our minds, we imagined something worse: maybe an exchange of one or two devices, and a world that eventually could be recovered from but that would ruin a generation of lives, including mine. We could face something like Japan after Hiroshima and Nagasaki. A world like that at first would have no use for someone like me. I think one could have said that about anyone who had failed to perform according to society’s social norms (including the other patients). No wonder the doctors didn’t want the patients to know what was going on.
Of course, we all know the history. Under President Kennedy’s leadership (whatever you think of it), we came back from the brink of that fall crisis, had a Civil Rights movement, a gay rights movement, increasing individualism, even a Reagan Revolution. There were severe crises, like the oil shocks of the 70s, and AIDS in the 80s. Today, at least with 9/11, we have been forced to contemplate how we could adjust to a true History Channel “mega disaster”. We now realize that some of these dangers can be manmade in a longer term sense: pandemics (it’s ironic how we look at avian flu twenty years after AIDS was a social controversy), global warming. We come to question the moral validity of individualism, and start to appreciate why societies, as a whole, until a few decades ago, felt that they had to enforce rigid social codes to function and survive at all.
The patients in that NIH Ward (then 3 West) in the Clinical Center were there in large part because they did not or could not conform to social norms and expectations. That’s certainly why I was there. (The entry on Nov. 28, 2006 on this blog explains my William and Mary expulsion for homosexuality.) Probably, each one of us had our reasons for believing that society’s demands, at least at a certain individual level, were wrong, unjust, or even “irrational.”
I do understand that even democratic societies like ours don’t always let citizens have complete autonomy over their own adult lives. I grew up in a world where there was a draft, and where student deferments (in the Vietnam War that would follow) provided one of the sharpest moral controversies of the era. Nuclear families had tremendous authority not only over minor children (obviously necessary to raise them) but even over unmarried adults, because the family could form a center of identity for those less able to compete for themselves, and because the world we lived in could not always "afford" to let individuals chart their own course in life regardless of their family origins. (Again, half the patients on the ward were there for "family" problems, but these patients were all adults themselves!) In the 60s, we started to feel we could afford to really take individual rights seriously, and address the social justice issues that the “group values” of our society (the family) had allowed to continue. Ironically, this would develop so soon after the Cuban Missile Crisis and even the Kennedy assassination in Dallas in 1963. I do remember that Friday afternoon in November 1963, when I (having long been discharged from NIH) was working at my first full-time job at the National Bureau of Standards (then in Washington, where UDC is now), and my boss came in and told me. I remember waiting for the bus on K Street later, going to my parents’ home, wondering if the Soviets would blow us back into survival mode.
In August 1997, right after publishing my first "Do Ask Do Tell" book and before moving to Minneapolis, I would take a Sunday tour of the Cold War bunker underneath the Greenbrier Hotel in White Sulphur Springs, W Va.
Like it or not, our social values of that time, the ones that shaped the life I have led since, were formed by circumstances beyond our locus of control.
Wednesday, November 05, 2008
Jury selection consultants look at blogs, social networking sites of prospective jurors during voir dire
Various media sources say that jury consultants and trial attorneys are turning to the web to vet jurors these days. Attorneys often look for social networking profiles, blogs, personal sites, or comments by others – in a manner similar to the “background investigations” done by some employers – in screening jurors for attitudes and potential bias for or against a particular litigant or defendant or about a particular issue.
There are several stories in the media about this. A good one is by Julie Kay, “Social Networking Sites Help Vet Jurors” in the National Law Journal, Aug. 13, 2008, link here.
Today (Nov. 5, 2008), on p A19, The Washington Post (under “Around the Nation”) has a reprinted story from the Los Angeles Times by Carol J. Williams, “Consultants comb the Internet for clues about potential jurors”, especially during voir dire. I could not find the story on line in either newspaper.
Several stories mention the Jury Research section of the National Legal Research Center in Charlottesville, VA.
I got called to jury duty four times while living in Dallas in the 1980s (one day or one trial). During the last incident, my presence on the jury in a malpractice case apparently helped compel a plaintiff to settle once the lawyers realized (from my statements) that I had been an AIDS activist and, from a couple of newspaper letters, had considerable layman's medical knowledge.
Tuesday, November 04, 2008
Election Day relatively orderly despite huge turnout; controversies over certifying voting technology continue
‘Twas a gentler, quieter voting experience than I expected. The media were showing four-block lines everywhere in the DC area, and talking about three-hour lines for early voting all over the country. Yet, at around 10 AM this morning, I was about third in line to have my voter ID checked, and only had to wait for about six voters to get to a WinVote machine.
The actual software worked slightly differently this time, displaying your choices to you before prompting you to press Vote. There seemed to be one extra step.
In previous elections, election judges in Virginia would disconnect individual WinVote machines and carry them out to cars for disabled voters. That proved to be time consuming and to perhaps increase the risk of machine crashes, so now sealed paper ballots (counted by Scantrons, like multiple-choice tests) are used for voters unable to stand in line at the polling place.
I also heard that two of the machines had been crashing or hanging. I’ve worked polls before, but did not do so this time, given the prospect of a 20-hour day. There are elaborate procedures for rebooting a machine and bringing it back up (and reconnecting it to the local closed network), and it does not lose the votes already cast (they are supposed to be stored on a memory stick). But the machines do not provide a detailed paper audit trail, which probably should be required. (Although Virginia seems to have an argument that a paper trail actually increases the risk of problems.)
There is a story by Kim Zetter in Wired, “Lab that Tests and Certifies Voting Machines Suspended”, Oct. 29, 2008, link here. The laboratory in question is SysTest Labs. However, the Election Assistance Commission (EAC), which accredits such labs with technical input from the National Institute of Standards and Technology (NIST), maintains that the company did not set up an adequate quality plan.
There is an older story by Zachary A. Goldfarb in the Washington Post, Feb. 19, 2007, “Campaign Strengthens for a Voting Paper Trail,” link here, with pros and cons about paper trails.
Later in the day Tuesday, there were scattered reports of voting machine failures in Virginia, and one or two precincts had to go to paper ballots.
P.S. As for the green sign in the picture, "Yes, we can!"
Monday, November 03, 2008
Today I reviewed a book “Give Me Liberty” by Naomi Wolf, which includes a section on blogging by Elizabeth Curtis. The review is here. I thought I would add a few of the references that she gives.
One of these is “Alternet,” which “creates original journalism and amplifies the best of hundreds of other independent media sources”. The idea is to capture, categorize and organize information on concerns of political progressives so that visitors can “connect the dots.” Alternet offers a convenient syndication feed (daily or weekly) of major stories to visitors.
Then there is a “smart paperboy” site called Bloglines (“The Same Internet, Minus The Clutter”), which delivers favorite content to a home sign-on page. This is a convenient way to keep up with things, though I use Mixx to get news this way. I signed up for a Bloglines account but never got the validation email.
This particular blog of mine (you can look up the other associated blogs on my Profile for more specific subject matter) tracks, at a high level, developments in some of the more “existential” problems underneath our modern idea of personal freedom. It also tracks how Internet and communications technology interacts with legal developments (legislation and litigation) that seek to sort out some of the unprecedented ethical and potentially legal problems the Internet can create.
Another similar service, perhaps more convenient for those with Google accounts, is Google reader, which shows up on the Account menu. You can subscribe to blogs or feeds (you can also do it on the Blogger dashboard if you use it).
Another resource is “Blog Carnival,” where “someone takes the time to find really good blog posts on a given topic, and then puts all those posts together in a blog post called a ‘carnival’”.
Bloggers have a number of tools available to analyze their traffic, including Google Analytics, Urchin, and Site Meter.
One of the most important potentialities of these blog syndication, subscription, and aggregation (“carnival”) tools is the ability to conduct online deliberation of issues: to let visitors figure out what is going on with a particular controversy (for example, credit default swaps in the financial world) so they are not caught by surprise by news developments and, more importantly, do not perceive news on some subjects as personally polarizing. The ability to aggregate blog and personal website material could lead to another level of Web deployment (a semantic web) that could eventually have legal benefits, providing review and oversight efficiently at acceptable cost.
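The mechanics behind readers like Bloglines or Google Reader reduce to fetching and parsing feeds. Below is a minimal sketch, using only the Python standard library, that parses a tiny invented RSS 2.0 document (the feed content and names here are mine, for illustration only) and collects item titles and links; real aggregators add fetching, scheduling, and deduplication on top of this.

```python
import xml.etree.ElementTree as ET

# A tiny, invented RSS 2.0 feed, inlined so the example is self-contained.
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>Post A</title><link>http://example.com/a</link></item>
    <item><title>Post B</title><link>http://example.com/b</link></item>
  </channel>
</rss>"""

def parse_feed(xml_text):
    """Return (title, link) pairs for each <item> in an RSS 2.0 document."""
    root = ET.fromstring(xml_text)
    return [(item.findtext("title", default=""),
             item.findtext("link", default=""))
            for item in root.iter("item")]

for title, link in parse_feed(SAMPLE_RSS):
    print(title, "->", link)   # e.g. "Post A -> http://example.com/a"
```

A “carnival” or aggregation page is essentially the output of this loop run over many feeds and then sorted or curated by topic.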
Previously this column has discussed "Blog Talk Radio" (Mar 25, 2008) and RSS feeds.
There’s one more story today about Internet advertising, that I found on Mixx; it is a welcome uptick during this period of so much bad news about business. Here's the link. By way of comparison, on Nov. 5 there is a story on CNBC about News Corp (Fox) that shows that, while Myspace still performs decently, it is getting harder, for some operations at least, to maintain good ad revenue on the Internet, because of consumer caution.
Saturday, November 01, 2008
Upside-down mortgages -- I got burned during the last crisis in Texas in the early 90s: learning by "conditioning" and "Just say no!"
I recall a moment, the Tuesday after Christmas (Monday had been the legal holiday) in 1994 when, returning home from work, I found a disturbing envelope at the bottom of my apartment mailbox. It was from the mortgage company.
Three years before, I had sold the Texas condo in which I had once resided to my renter, who assumed my FHA loan (which I had converted from an original VA loan at 12-1/2%). I had lost $10,000 on a condo that had originally sold for just $40,000 in 1984. But real estate values dropped in much of Texas when, under Reagan’s policies, oil prices fell in 1985 – good for the rest of the country but not for Texas. I had moved back East in 1988.
The purchaser had defaulted, and because it had been an unqualified assumption, I was responsible for the payments. FHA no longer allows unqualified assumptions because of this problem.
I don’t want to spill too many details, but we (my lawyer and I) got the homeowner paying again. When she got into trouble again, we demanded payment and threatened foreclosure, forcing a Chapter 13 bankruptcy. But it had a happy ending: she paid all the money back, with attorney’s fees and a small amount of interest. We took care of the entire problem, however small, with no government bailout.
You can imagine, then, that I was rather aghast in 2005 or so when the media started talking about the subprime boom. A retiree, I got a couple of unsolicited phone calls urging me to go out and make money selling these kinds of mortgages; anybody could do it. It didn’t sound right. I had been burned before. “Just say no,” I said. But Wall Street hadn’t been burned (or perhaps it had forgotten Texas around 1990, when some condo prices fell into the teens).
Furthermore, back in the early 1990s, some defaulting homeowners really did get sued for deficiencies. True, a lot of people in Texas just walked and mailed the keys. But some borrowers got in real trouble over this. The gory details are spelled out in an orange paperback (having nothing to do with ING “orange”) by James A. Wiedemer. "A Homeowner's Guide to Foreclosure: How to Protect your Home and your Rights." Dearborn, Dearborn Financial Press, 1992. There were “letter lawsuits” and process servers and the like, and a fine art for getting default judgments to ruin borrowers’ credit.
Actually, the whole thing was rather stressful for me at the time. I remember the phone conversations with realtors and my lawyer, and I remember all the talk about “personal responsibility.” I didn’t have the same family responsibilities or life circumstances as the borrower. I don’t want to get into private matters any more than I have to, other than (to make a discussion point here) to say that at work, people who knew about this said “you could marry her.” That sounds crass (given all the debates about the institution today), but it does point to a social problem – sometimes we have to take care of others besides ourselves and the people we choose to marry and the children we have. The “moral” problems are just more complicated than that. It comes back to “sustainability.”
And so now we see news reports about the financial crisis growing out of the problems with upside-down, adjustable and subprime mortgages. In the real world, true, no one can expect borrowers to cover these problems today the way we did in the 90s in a relatively “simple” case. But I have stayed away from all of this mortgage mess for just that reason – I’ve been burned before. I think of a Dr. Phil show where a young man, with a wife and baby (and “family responsibility” to mollify the social conservatives so addicted to the “idea” of marriage) decides to flip multiple houses (several a week, all over the country) to support his family rather than go to college and learn real skills for “real employers”. Now he is several million dollars in debt and has walked away from a number of homes. We wonder where our moral center has really gone.
Media reports do talk about the deficiency problem as a legal issue in other countries, like Australia. It has been an issue here before in the U.S. More to the point is to go back to requiring more down payment again, so that mortgage holders have a stake in their homes.
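The “stake in their homes” point can be made with simple arithmetic. Here is a minimal sketch with invented figures (the function name and numbers are mine, not any lender’s formula), simplified to ignore amortization, interest, and fees:

```python
def equity(price_paid, down_payment, current_value):
    """Borrower's equity after a change in home value.

    Simplified sketch: the loan balance is assumed to remain at its
    original amount (no amortization, interest, or fees).
    """
    loan_balance = price_paid - down_payment
    return current_value - loan_balance

# Hypothetical figures: a $200,000 home that falls 10% to $180,000.
# With 5% down ($10,000), the borrower is "upside down":
print(equity(200_000, 10_000, 180_000))   # -10000
# With 20% down ($40,000), the same drop still leaves positive equity:
print(equity(200_000, 40_000, 180_000))   # 20000
```

The larger the down payment, the larger a price decline the borrower can absorb before walking away becomes tempting, which is the whole argument for requiring one.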
No question, there has to be structured help for homeowners, and there’s no way to do it without the whole process becoming “politicized”. It sounds anti-libertarian, but when it comes to residential property values, you are your brother’s keeper.
Remember the etymology of the word “mortgage”. It means “dead pledge.”