16 November 2011

Teachable moment


I attended Eureka High School, Community Unit School District 140, in Eureka, Illinois from 1982 to 1986. My best subjects were math, French, art, history, and music, but there's one thing I learned while I was there that has stuck with me more than any other piece of knowledge: there was a door on the far left of the front of the building that, if you pulled on the handle firmly and gave the base a swift kick, would pop open every time, allowing you entry after the building was closed at 4:00.

On its own, this piece of information was not that useful—after all, who wanted to be in the school any longer than they had to? But with the knowledge of this exploit came a host of implications. The person who showed me this trick was a dedicated student, a good kid. So I learned that good people break rules. If you were going to use this technique, you still had to do it in plain sight. So I learned that you can get away with things if you act like you're supposed to be doing them. As time went on, I slowly came to learn that nearly everyone knew about the door, probably even most of the teachers. So I learned what an open secret was, and how all communities have them.

In academic circles the phrase for this sort of thing is a "teachable moment." Often the most important educational moments are unplanned. They arise organically from life experience and deal with large issues: How do you deal with failure? With adversity? Where do you draw a moral line? What's the right thing to do?

I thought about this yesterday when my brother, who also attended Eureka High School and who now writes for the Daily Show, Facebooked a link to a story about a teacher at EHS being suspended for showing segments of Jon Stewart's show to his government and law class. It's unclear what specifically happened because the article is so awful, but I suspect that the teacher warned his students against Googling "Santorum" and some of the district's parents—those lacking a sense of humor—were upset. That piece of information might have been helpful for readers trying to understand the article; the Pantagraph reporter did, however, note the teacher's salary, so we could all be incensed by what a boondoggle public education is.

Amidst the predictable comments on the article pages supporting freedom of expression and deploring the teacher's alleged bias, this one, from "teach78," stood out:
Spin it anyway you want; there is no educational "value" in the Daily Show.
teach78 is probably right about this, but there is plenty of educational value in how District 140 dealt with the situation. The best way to handle things is not through private negotiations but through public fiat. A parent's sense of indignation is more important than a man's occupation. Pick the right side or you will be dealt with. These are the lessons that Eureka students will take from this event, and they might stick with them longer than a faulty door.


03 November 2011

This is the amount of energy (B) has left over

When I was a kid the second-most fascinating book on my parents' shelves was the small pamphlet whose prosaic title was "Revised U.S. Edition of the Official Royal Canadian Air Force Exercise Plans For Physical Fitness" (underline original). The cover depicts a pair of smiling Canucks in uniform striding across the tarmac away from their plane, presumably back from a long day of protecting Kapuskasing from bogeys. This book was actually "two books in one," as it contained both the official RCAF exercise regimens for men and for women. For some reason, the women's plan was a minute longer per day at 12 minutes to the men's 11, but still, such a deal.

I'm really not joking when I say this book held a strange fascination for me. There was something oddly authoritative about this being the actual regimen for a military service with the word "Royal" right there in its name. It claimed to be all you needed to achieve fitness, and as a chubby little nerd I believed it. I've always been a sucker for claims of expertise as well as for things that are short. Because I am gullible and I am lazy. But I was not alone in my interest in the program; 5BX (the men's plan, short for Five Basic Exercises) was a big hit in the 1960s, and its promotion of regular, intensive calisthenics paved the way for the aerobics craze of the 1970s.

Looking at the book today (I found a PDF of the 5BX pamphlet, which was originally published on its own), what I'm most captivated by are not the exercises—a series of standards, some of which are neither effective nor safe—but the lengthy introduction, which makes the case for physical fitness. Of the original 32 pages only 10 actually described the exercises; the rest were a series of cartoons, charts, and pep talk meant to convince the sluggard to adopt this life-changing process.

And what wonderful bullshit they are! Beautiful examples of how graphics can skirt around meaning and imply that information is being given, without actually saying anything. My favorite is the following chart on page 7:
So much quantitative data is implied; none given. Look at that physical capacity scale, presented in authoritative percentiles. What is it? I don't know, but that guy slouching at the desk sure doesn't have any. Look at how much heavier the shirtless guy's energy reserve is! More energy weighs more. And don't you want to have more so you can enjoy your recreational activities? Activities like 5BX?

23 September 2011

A Case of Identity



All of us know who Sherlock Holmes is. He's that detective guy who wears a deerstalker cap and an Inverness cape, smokes a meerschaum pipe, and says, "elementary, my dear Watson." Except that only the detective part comes from Holmes's creator, Arthur Conan Doyle. Doyle once described Holmes as wearing a traveling cap with ear flaps, but it was the illustrator Sidney Paget who gave Holmes the deerstalker and cape. Doyle described several pipes smoked by Sherlock, but they were straight pipes of clay, cherrywood, or briar, as were common among Victorian gentlemen; the large-bowled, curved pipe was introduced by American actor William Gillette, who became famous for his portrayal of the sleuth. And the phrase "elementary, my dear Watson" never appears in any of Doyle's stories; it was first written by P. G. Wodehouse.

These days we have any number of complementary and competing Holmeses, including wonderful legacy portrayals by Basil Rathbone and Jeremy Brett; revisionist Holmeses like Guy Ritchie's action hero as portrayed by Robert Downey, Jr.; and alternative-world Holmeses like Steven Moffat's contemporary-set series starring Benedict Cumberbatch. There are hundreds if not thousands of Holmes stories, with some (like Nicholas Meyer's famous The Seven-Per-Cent Solution) taking great pains to fit within Doyle's style and timeline, and others jettisoning verisimilitude entirely and having Holmes fight aliens or travel through time. And on top of all of these, we have a multitude of parodies and pastiches, from the Muppet Sherlock Hemlock to Mad's Shermlock Shomes to the Firesign Theatre's Hemlock Stones, who finally took on the case for which the world was not ready.

One might think that with all these Holmeses, audiences might be confused. Where to start? Which one is "real"? And what about the integrity of the creation? Is that being preserved? But in practice, there is no difficulty. Communities of readers and viewers can pick and choose which Sherlocks they'd like to support; most probably have several favorites. Likewise, if they find one portrayal out of character or another story's plot falls flat, they're free to ignore them. And the original stories are still there, still untouched and unrevised (although Doyle himself indulged in revisionism by bringing Holmes back from the dead after killing him off). Doyle's canonical works are unaffected by other writers' contributions—if anything, the derivative works keep the originals fresh and relevant.

I like to bring up Holmes when people fret about authorial intent and an artist "owning" their work, or complain about fan fiction being an affront of some sort, or insist that working from another artist's setting or character is unimaginative or a rip-off. The tendency is to see Sherlock Holmes as a special case, owing to his ubiquity in the culture. But isn't his prominent position precisely because of the participatory way the culture at large uses him? In my post about George Lucas, I champion the idea (which I didn't originate) that the eventual relevance of a piece of art owes more to the work of its audience than to the work of its creator. I know this is a controversial idea, especially to people who like the idea of the Artist as an Auteur, uniquely responsible for his creation. But it seems to me that when artists practice their craft they are joining a game in progress, learning the rules by watching others, and then moving the ball along; after that it's up to the audience to decide if they scored or not, and why it mattered, and how it mattered, and before long it's the readers or viewers or listeners who have possession.

15 September 2011

A bad motivator


For the last couple of weeks there has been a great disturbance on the Internet, as if millions of geeks suddenly cried out in terror. I'm talking about the latest batch of changes George Lucas has made to his Star Wars movies, this time on the occasion of their Blu-ray release. The long story short here is that ever since the "special edition" releases of the 1990s, Lucas has been altering the original three Star Wars films; sometimes substantially, with new scenes and actors swapped in digitally; sometimes trivially, with newer visual effects and bleeps and farts. This isn't a bad thing in theory—directors' cuts are usually greeted as definitive versions, and many artists can't resist the urge to go back and tweak their earlier work. (Walt Whitman added poems to, subtracted poems from, and generally rewrote poems in every new edition of Leaves of Grass in his lifetime.)

But in the case of Lucas the changes to the original have been so awful, and the memories he's tinkering with are held so dear, that it seems a kind of spite is driving him at this point. I'm not going to list the details of all the alterations here (that's what Google's for), but suffice it to say, it's understandable that people might want to have available the original version of a film (and here I'm talking about the first, 1977 Star Wars) that holds such a central place in the history of film and society. But Lucas says no; this is his vision, you get it all—all retrofitted to mesh with the awful prequels—or you get nothing.

If the original film—now thirty-five years old—had been released under the original fourteen-year copyright term (renewable once), this would all be moot. Criterion would be free to release a restored original version with commentary by historians. Wal-Mart could release a budget version with all the incest taken out. And Lucas? Lucas would still be free to alter his films in any way he wanted to. He could stick Jar-Jar into every damn frame if he liked, and all of the fans who valued his intent over their childhood memories (there must be at least four or five of them) would be free to purchase these enhanced versions. The point is, Art with a capital A would be served and Commerce with a capital ¢ would be served as well.

There are rights holders like Lucas whose bad dealings with the art they own come from an honest belief that they're doing what's right. Then there are rights holders like Disney, who are motivated entirely by their desire to monetize their holdings as efficiently as possible. The famed "Disney Vault"—Disney's practice of bringing properties in and out of print in cycles—is a good example of this. They aren't doing this to benefit their films or their audience—they're just making sure their products are not in competition with each other. Disney animation from the '30s, '40s, and '50s is central to our cultural heritage, but we're kept from it, not by the artists who actually produced it (they're all gone) but by a marketing ploy. Similarly, Disney gets to remove any scenes or elements from its films that might affect their salability.

It's not just pop culture that suffers from the heavy hand of rights holders: no less a luminary than James Joyce has also been affected. The current executor of the estate of this seminal Modernist, so central to world literature, is the artist's grandson, Stephen Joyce. In the name of protecting his grandfather's legacy, the younger Joyce has aggressively hindered access to the artist's letters: bringing suit (or threatening to) against biographers and scholars whose work he deems harmful, prohibiting public performances of his grandfather's work, destroying letters by Joyce's daughter, and hoarding unpublished writings. He even said no to Kate Bush using Molly's soliloquy in a song. Thankfully, the works of Joyce are about to enter the Public Domain—at the end of this year, only sixty years later than they should have.

Afterword: In the article on Joyce linked above, the author writes: "It is understandable and reasonable that the heirs of an author [...] would gain a financial benefit for a certain time from that author’s work, in the same way that a descendant who has been left a farm or a house is entitled to a financial gain from it." I note this because I think it's a fallacy often used to justify the passing of copyright to one's heirs. The correct analogy would be that as a farmer may leave his farm and equipment to the next generation, so too an author may bequeath to their heirs their own tools of production: pens, paper, notes, typewriter or computer. An analog to copyright for the farmer would be if the farmer's heirs continued to receive residuals on crops produced many decades before.

Safety dance

The museum where I work is considered a port by the TSA, because lenders in other countries ship us artwork in sealed crates. The upshot of this is that my fellow staff and I have had to have our backgrounds screened and we've received training on terrorist threats. The process has been relatively painless, but it did lead me to try to imagine how anyone might possibly take advantage of us as a conduit for nefarious cargo. It would require the culprits to have prior knowledge of which exhibitions we were going to be staging, long enough in advance to go to the country from which the art was being lent (which would also require infiltration), get hired by whichever art moving company the lender was going to use (and since this gets bid on, they'd have to have operatives in several), and then, under the watchful eye of the conservators overseeing packing, slip their IED (or whatever) into the crate. Then they would have to perform a similar inside job once the art arrived in America.



In short, it's vanishingly unlikely; but the TSA is just doing their job, which is keeping us safe. The question is, how safe is safe? That's a question of nuance, and the TSA doesn't do nuance. We all know the result: the long lines to be groped, barefoot, at airports. But the far greater cost is the cost to the soul: the constant agitation, fear, paranoia. If you see something, say something. Security is everyone's business. Think of the children. And so we do think of them, and we fret and obsess and distrust.

The most hilarious and shameful example of how this mindset messes with us is the Boston Mooninite scare of 2007. After the BPD misidentified the harmless LED signs as potential bombs and brought the city to a panicked halt, they proceeded to accuse the street artists who had installed the signs of intentionally perpetrating a hoax—when it was they who had misled the public. Instead of admitting their error and promising to adjust their policies to avoid such problems in the future, the police threatened the "perpetrators" with criminal action and accepted two million dollars from TBS as "compensation," all the while complimenting themselves on their vigilance.

In his novel White Noise, Don DeLillo posits that the more modern medicine staves off death, the more we end up fearing death. A similar calculus applies to safety: the more security we achieve, the more that last bit of safety eludes us. In the absence of clear and present threats we manufacture wars that cannot be won: on terror, on drugs. But here's the thing: total security does not exist. As Kij Johnson wrote, "Nothing is certain. You can lose everything. Eventually, even at your luckiest, you will die and then you will lose it all." It's a hard truth, but accepting it is the only remedy.

14 September 2011

Smells like victory

My older brother used to play an elaborate game with a friend of his using those little plastic army men you'd buy in buckets. An entire back yard was the playing surface and moves were made in turns using a ruler; each piece got a set number of inches. Once in range of enemy pieces, dice were rolled to determine damage inflicted. Then once a piece had been "killed" the real fun took place: using a lighter and an aerosol can of lubricant, the poor soldier would be torched until it caught fire and melted into an olive drab pool.


Later variations on this game included napalm, in the form of a two-gallon milk jug set alight so it dripped flaming gobs of polyethylene on the hapless fighters; I also remember one afternoon when a fort was constructed of styrofoam and likewise torched, although it never really fully caught. But it did produce large oily plumes of smoke that seemed half ink and half air and were no doubt full of dozens of toxins. For that matter, none of us stopped to consider whether the can we were using for flamethrower fuel was likely to ignite, or that the late-August grass was crisp and brown.

As pointless and dangerous as these pyrotechnics were, I have fond memories of them. In fact, it's because they were pointless and dangerous that I have fond memories. If I had ended up burning myself I'd probably enjoy the memory more, because everyone loves their scars. I have a particularly large one on the bottom of my right index finger where I almost chopped the digit off by sticking it into a spinning exercise bike wheel when I was four; I have another at the base of my thumb to mark the time I fell backwards down some stairs and slammed my hand through the window of my back door. I love them both.

Looking back on our stupid choices and telling scandalous stories about the bad things we did is one of life's joys. There's the old adage that our mistakes are what makes us who we are; this is true, but I think our love of stories of drinking binges and disastrous romantic encounters and quarry diving and childhood games on thin ice speaks to us on a baser level. We like to imagine a time free from responsibility and filled with possibility.

We just don't want to imagine these things for our own kids.

06 September 2011

I stop being polite and start being Real (McCoy)

My surname is McCoy, which is a funny name because when I tell it to people they either shoot back some incredibly witty question about whether I'm the real one or not, or they stare blankly and ask if I said McCory. This is odd because while there is a famous space doctor named McCoy and a famous feuding hillbilly family named McCoy and a famous pottery company named McCoy and a famous jazz pianist named McCoy and a famous perky cruise director named McCoy, I have never once met a McCory or even heard of the name aside from having it repeated back to me by the aforementioned vacant starers.

Out here in Boston everyone is Irish or pretending to be Irish, and most of them will ask if my family is from Ulster or Cork and then squint intently as though my reply will result in either a hug or an uppercut. However, the story I was told as a wee bairn was that our family name was changed from MacKay when we came from Scotland, no doubt searching the New World for peat. My father had a book of the tartans of Highland clans, and my brother Robert and I looked at the page for our ancestral clan so often that the spine cracked and the book fell open to the MacKays' somewhat plain green-and-blue plaid.


I was a credulous child and I accepted this story about my heritage without question until my mid-twenties, when doubt began to set in. Too many people I met seemed to think that the name was Irish. Digging around I discovered that while the McCoys were in fact related to the MacKays, the name variants go back many centuries and predate any Atlantic crossings: the name change may have originated with gallowglasses who moved to Ulster in the 13th century. My father has confirmed that the first immigrant in our family named McCoy, James, came to America from Ulster in the 18th century when he was 14, and there's a charming story about how, during the passage, he boasted about being good with horses, eventually securing employment in the New World at a stable.

The thing is, whoever he was and wherever he came from, he was just a guy who happened to share my last name. Going back a few generations I have more English surnames than Scots (or Scots-Irish or whatever), and somewhere in there are German names, and French, and allegedly four or five generations back I have Cherokee and Creek great-great-great-great grandmothers. Calling myself Scottish seems as ridiculous as calling myself Native American based on whatever fraction of genetic material I share with these ancestors. My father grew up in Oklahoma and Indiana; my mother in Virginia and Indiana; really I'm a mutt.

But don't tell my uncle; he is proud enough of our alleged Scottishness to have purchased a kilt, a tam o' shanter, a set of bagpipes, and an Aberdeen Terrier. When I married my wife Marina (who is, by the way, unquestionably 100% Latvian by descent), he presented her with a pin to welcome her into the Clan MacKay. I often wonder what actual Scots would think of him, if he were magically transported in full costume to some pub. Probably if it were in Edinburgh they'd humor him, but I can't imagine it ending in anything other than a pummeling in Glasgow.

The fact is, despite having a distinctively Gaelic name, I have no sense of myself as having any ethnicity at all.  I have lived in small towns and big cities; even though I've lived in Boston for over 20 years, I don't think of myself as "from" here. But I don't think of myself as "from" anywhere. The idea of having a strong sense of ethnic identity is foreign to me. Maybe it's my white male privilege talking, but I like to think that my personality and abilities arise organically from my own choices and experiences and are not the result of any national character. Ethnic pride has a dark side—the belief that your own clan or race or creed is exceptional is the seed of prejudice and nationalism.

Still, I am not immune to the charms of ethnic imagination. I see Marina's strong connection to her Baltic heritage, and it makes sense for the daughter of immigrant parents. Everywhere you go in Boston you see shamrock tattoos, and that's nuts, but kind of cool, too. And as for me, I will happily recite Burns if you hand me a wee drap o' whisky. But bagpipes? That's just silly.

29 August 2011

If it don't happen here, it don't happen anywhere

In the hours leading up to Irene's collision with New York City, there were cries of derision around the twitterverse and blogosphere (and not simply because "twitterverse" and "blogosphere" are godawful non-words). Whatsamatter, hipsters? snarked the rest of the country. Can't handle a tiny li'l category one? Try living in Florida, we get three category fives before most breakfasts.




Now I don't live in New York, but I did once long ago, and my little brother does now, and so do about nineteen million other people, hipsters or not, and I couldn't help but feel a little defensive on their behalf (especially since the minute a single snowflake falls in Texas they break out the sackcloth and ashes). The fact is, no one really knew what was going to happen with regard to the coastal surge, or what effect hurricane winds would have on 19th-century brownstones, and because it's nineteen million people on a handful of fucking islands, I thought people could cut them a little slack for being a tiny bit jumpy.

So we all know now that, thank LaGuardia's ghost, Irene had minimal-to-no impact on New York. The key words being: on New York. Pennsylvania got creamed. Vermont is under water. Here in Boston, just a block from my house, half a tree fell on a neighbor's home. Over twenty people died. And so of course, this morning all you see from the twitterverse and blogosphere (my apologies) is New Yorkers firing up their old swagger and complaining that everyone got excited about nothing.

So I take it back: you guys are all jerks.

25 August 2011

Guilty of immorality

In the 19th century it was common for employers to insist that their workforce attend church services regularly. In Lowell, Massachusetts, an 1848 handbook for women working in the mills stated, "The company will not employ anyone who is habitually absent from public worship on the Sabbath, or known to be guilty of immorality." Servants in Victorian and Edwardian households were expected to use a portion of what little time off they were given to attend Church of England services, lest they give in to their baser instincts. The upper classes felt justified in taking a paternal interest in directing the spiritual lives of the laboring class: they were, after all, looking out for their employees' immortal souls. And if morality could also be a club to keep the rabble malleable, all the better.



I was thinking about this history when I read this Atlantic collection of interviews with employers about the "mistakes" job seekers make. "Sanitize [your] net presence," chides one interviewer. "Those drunken spring break pictures have got to go." We've all heard stories of people getting into trouble because of what they wrote in their blogs or because of what they do in their spare time, but this interviewer's unapologetic tone still took me aback. It's not simply the absurdity of an employer thinking that what a candidate did on spring break has any bearing on their fitness. It's the fact that they were even looking at that candidate's Facebook page in the first place. I mean, I find the idea of my mom reading my status creepy, let alone my boss. (Hi, Mom!)

But, the argument goes, if you choose to make your life public on the Net, don't employers get to use that against you? The problem is that in most cases the offending revelations have nothing to do with the employee's fitness. It's that they had the audacity to post a photo of themselves in a bikini, or they used the word fuck in their blog, or they felt they had to support one candidate or another. In other words, the employer is seeking to get their unruly workforce to adhere to a moral code which goes far beyond the concerns of the workplace.

These days it's illegal to make religious decisions for your employees, either by requiring that they practice a certain faith or by prohibiting them from adhering to another. We recognize such efforts as wrong-headed, patronizing, unfair. What we need now is to extend this understanding to the secular choices as well. And HR Dude? Quit being such a creeper.

18 August 2011

The same old played out scenes

This week in copyright comes the news that the rights to many songs from 1978 could transfer from record companies back to the original artists. The mechanism is a provision in the 1976 Copyright Act called "termination rights," which allows artists to reclaim copyrights held by their publishers after 35 years.



I'll admit that the main attraction of this story is watching the publishers hem and haw over the finer points of the law while maneuvering to hold onto these properties. These are the same people who pose as protectors of artists' rights when they sue file sharers (and pocket the settlements); now that their interests are in conflict with the creators they allegedly champion, they are making no attempt to hide their hypocrisy.

But while I'll happily grab a bag of popcorn to watch the recording industry's shameful display, I think there's a deeper lesson here. The fact that the RIAA is spending money on lawyers to fight over 35-year-old rights demonstrates once again that Copyright in its current form is failing at its basic goal of promoting the production of new work. Why should Columbia be looking for new talent when it's more profitable to fight over Darkness on the Edge of Town? For that matter, why should Bruce Springsteen write anything new if he can snag those rights?

Highway 17

Now that I have a new computer I have about three or four months to enjoy being technologically current before sliding inexorably back into obsolescence. For me (to my wife's sadness) this means playing all the games my old computer couldn't handle, which is approximately all of them.

One evening about a week ago I was playing Half-Life 2 when I noticed I was feeling feverish and sweating; however, it's summer in Boston and my basement can get kind of stuffy, so I turned on a fan and went back to whacking headcrabs with a crowbar. All at once I felt like throwing up, which I almost never do. I thought I had come down with something and I staggered to bed.



Feeling better the next day, I resumed the game and promptly needed a lie-down, fast. So it had really happened: I'd become motion sick from a video game. The nausea wasn't half as bad as feeling like a wuss; I had really become an old guy for whom freaking Half-Life was too heavy a dish. I waited for the floor to stop spinning and then picked myself up off it. Then I typed "Motion sickness Half Life 2" into the search bar.

Turns out, this is a common problem with Half-Life 2, with discussions on dozens of message boards. Amongst all the jibes of n00b and fucking pussy go back to Tetris, I learned that the likely cause of my reaction was the field of vision the game depicts. While most first-person games use a 90-degree angle of view (which roughly corresponds to real life), Half-Life 2 is set to 75 degrees, which creates a sort of tunnel effect, like looking through a pair of binoculars (which can also make me a little dizzy). Fortunately, the game allows you to adjust the field of vision, and I did. And that was it. The relief was immediate and complete, and I was able to continue through the game's roughest, shakiest camera bits with no ill effects at all.
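
A footnote for anyone who lands on this post mid-queasiness: as best I can reconstruct it, the fix comes down to a single console variable. The names below are from my own fuzzy memory of the Source engine rather than anything official, so treat them as a sketch and double-check before trusting them.

    // In the developer console (enabled, if memory serves, under Options > Keyboard > Advanced):
    fov_desired 90   // widen the field of view from the stock 75 degrees toward 90

One line, instant relief.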

But while I've enjoyed the game and also like not being sick, the whole experience has been unsettling. My mind had been tricked by a tiny wedge of virtual sight into debilitating illness. Not only that, but the solution was mundane, mechanical, predictable. At some level I know that what I call myself is a series of biological and psychological processes, but the full implications of that are not something I dwell on. I don't believe in an animating soul that is the truest self, but I do like to think that the mind is more than a chemical/electrical call and response. But those 15 degrees seem to say otherwise.

09 August 2011

What we talk about when we talk about patents III

Granting patents is society's way of saying that certain devices or processes are original; implicit in the system is the idea that novelty is to be especially rewarded. It's an inherently individualistic, anti-cooperative approach to innovation. It's also one based on the romance of competition as the basic mechanism of progress. Americans have great faith in the adversarial: our government, our legal system, our economy are all based on the idea that the clash of interests will result in great laws, or justice, or prosperity. But the ugly truth is that competition doesn't only produce better things; it also produces better ways of eliminating your competition.



And to be honest, there aren't a lot of lightbulbs waiting to be patented. No one is going to find a seventh simple machine. Invention isn't so much a process of aha! as it is of hmm. It's about looking into the current state of technology and finding a good place to continue. This is particularly true when it comes to software development, and it's interesting that developers themselves have been vocal opponents of the idea of software patents. But holders of patents are, by and large, not inventors but corporations, and for them the main attraction of patents is to build up an arsenal of potential lawsuits or to protect themselves from said lawsuits.

Back in the 1970s there was a Parker Brothers game called The Inventors in which players would purchase zany old-timey inventions, patent them, and then seek royalties. Amusingly, the inventions themselves were all of questionable value and completely interchangeable from a gameplay standpoint. One wrinkle was that until "patented" the inventions could be stolen by other players. So the lesson was not that patents help to bring useful ideas to the market so much as that they are chips in a legal game. While I don't think satire was the goal of the game designers, they got this all pretty much on the nose.

05 August 2011

What we talk about when we talk about patents II

America loves the single guy against the world. There's one story that colors our ideas about invention and innovation more than any other, and that's the story of Thomas Edison. The tale of a telegraph operator pulling himself up through ingenuity to secure more than 1,000 patents and usher in the electrical age is charming and magical, to the point that the light bulb itself has become the symbol of sudden insight. Of course the problem with this story is that it's mostly crap.



Not that Edison wasn't a genius, because he was; not that he didn't oversee remarkable innovations, because he did. But the majority of his work, including the development of a commercially viable incandescent bulb, consisted of incremental improvements on other people's ideas, carried out by an army of work-for-hire inventors who were treated with varying degrees of fairness. Edison is famous for his poor treatment of Nikola Tesla, and for subsequently fighting a wrong-headed battle with his former employee over whether electrical distribution should use AC or DC current. But Edison had an even darker side, a ruthless side. He vigorously protected the copyrights to his motion pictures even as he duplicated and exhibited Georges Méliès's A Trip to the Moon without compensation.

Edison was a complex man with a mixed legacy. But my point here is he was not solely, or even primarily, responsible for the various patents he acquired. Nonetheless, his legend lives on in the way we think about patents: the solitary inventor with original insight needs protection from the "theft" of the fruits of his genius. He is rewarded with riches, we are rewarded with innovation. It's a pretty story, and it would be harmless enough, except that it's riddled with false assumptions about the nature of innovation and the importance of originality. It's this last myth—that purely original ideas are to be valued above incremental improvements, that purely original ideas exist at all—that has done the most damage to the way that patents are awarded and rewarded.

(to be continued)

04 August 2011

What we talk about when we talk about patents

Last week's This American Life show on patent trolls did a good job of explaining some of the more vexing problems with the patent system in a way that an uninterested layperson could understand or even find compelling. One of the hard things about being an intellectual property wonk is that the subject is so full of nuance and history and theory that you can't really form a good soundbite.



Patent trolls are a pretty easy target for scorn. Acquiring as many poorly defined patents as one can, not to produce goods or services but to sue other producers when alleged violations take place, is reprehensible even to the most disinterested observer, and it makes for a good story. Even if you have no idea what prior art is, you can feel indignant on behalf of the poor programmer who slaves away on his app only to be smacked with a lawsuit at launch because someone else says they invented the idea of icons. But if you want to move beyond complaining about a specific bully to talk about why software patents themselves make no sense, then you have to discuss the history of logic gates and difference engines and virtual machines and what algorithms are and oh god just kill me already.

Ultimately I think that the only way to get a layperson to think critically about patents (and about copyright and trademarks) is to ask what it is we want patents to do, and what it is they're really doing. Because it's not about violators stealing something that belongs to someone else. It's not about property at all. The constitutional basis for patents and copyright is the promotion of science and the arts, and we should judge our system by the question: are we producing more and/or better ideas with the system we have in place? Would we have more and/or better inventions with a different system—or even with none?

(to be continued)

26 July 2011

After a good meal and a good pipe

Borkum Riff. About once a year, I'll catch the distinctive smell of whiskey-soaked pipe tobacco, and for a moment I turn to look for my dad. When I do, I look up, because I'm six years old: my dad hasn't smoked since the 1970's. But that smell, of tobacco and sweet cream, was such a constant part of my childhood that it's burnt into my memory so deeply it would take a pipe cleaner to remove it from my hippocampus.

Whenever this happens I don't really know how to feel. On the one hand, it's a smell that never fails to transport me to my youth. On the other, my dad quit smoking after I had a long outburst telling him with all the earnestness of a child that I hated the fact that he smoked and that it gave me headaches and that I was sure it was going to kill him and I wouldn't have a father. I remember that I couldn't stop shivering for hours from the emotional surge. I am, of course, very glad that my father did stop smoking, as he's still with us today.

And yet. My father owned several pipes and they were all beautiful. He had curved rose-colored pipes made of burls (cherrywood? walnut?) that looked wise and mysterious; he had sharp-angled black pipes that looked like they belonged to Mr. Fantastic. He even had a corn-cob pipe that was hokey and wonderful. He had pipe tools for tamping and scraping and cleaning, and a carousel that he kept his pipes in. And he had tins of tobacco with pictures of three-masted ships and peculiar European men and maps of the world on them. And when he read The Hobbit to my brother Robert and me and got to the part about Gandalf and Thorin blowing magical smoke rings, I knew exactly how that must have looked.

My ambivalence about pipes can be summed up nicely by Curious George. In the original H.A. Rey book from 1941, the monkey George is unceremoniously removed from the jungle by the Man in the Yellow Hat for sale to a zoo; upon arrival in America, he spends what is supposed to be his last night of freedom in the Man's home, where we are told "After a good meal and a good pipe, George was tired."





When my kids were little, they constantly watched a VHS recording of the 1982 stop-motion version of Curious George, which faithfully stuck to the text of Rey's book and did in fact show George enjoying his pipe; however, immediately after George takes a couple of puffs, the film shows him becoming ill and the Man in the Yellow Hat guiltily putting the pipe away—presumably shamed into quitting himself. And this part always made me very, very angry. Why? Why the need to editorialize? Children already know that George is doing something wrong, something forbidden. That's what makes it fun. And Dad, I'm glad you stopped smoking. But I hope you can still find something wrong to do now and then.

25 July 2011

Hertz schmertz

So I haven't blogged in a few weeks and this is where I give a lame excuse like "my computer died." Except in my case, my computer died: the screen developed a dead stripe about 2 inches wide, slightly right of center. Googling (on my iPod) seemed to confirm that the LCD had become disconnected and would need replacing, which would probably cost more than the machine was worth, and so it was time for a new one.



I had been planning on getting a new Mac for some time; in spite of my reputation amongst friends and family as the guy who knows a lot about computers, I'd been nursing along an obsolete machine for many years now, running an operating system that was two generations gone. So I should have welcomed the death of my old box and skipped to the store with plastic in hand. Instead, I found myself begrudging the purchase; when I brought the bulky box home I felt a strange lack of enthusiasm.

There was a time when a new computer was a big deal, a life event, a first kiss. I remember when my wife and I purchased our first Mac (a Classic II) back in graduate school; I believe that with an academic discount and a newly introduced inkjet printer it cost us about $2K. We brought it home like nervous parents who feared crib death. It was hard to believe we had anything that valuable in our apartment. For the next five years we wrote every one of our grad school papers on that nine-inch black-and-white screen and dipped our toes into the exciting new world of CompuServe with our blisteringly fast 28.8K modem.

We've run through a lot of computers since then, and while each has been faster and prettier, acquiring them is less and less glamorous. As technology becomes more advanced, I care about it less and less. There was a time when I could rattle off the hertz for any of a dozen CPUs. I can't begin to tell you any specs for my shiny new iMac. Is it dual core or quad core? For that matter, what's a core?

Maybe it's better for me not to care so much about this stuff. In the end, tech lust is just more materialism, and maybe letting go of that is another step towards enlightenment. Maybe what's good is that the ubiquity of computers and smart devices means I'm interested not in the tools but in what I can do with them. But to be honest, I miss the obsession.

29 June 2011

Reagan and me

Author's note: This is a recycled post that dates back originally to 1997, when I was hand-coding my first web pages in error-filled HTML 2.0. Now that I'm using this blog for my public writing I'm moving the essay here. I've made a few edits for style but it's mostly unchanged. If you've read it already (or even if you haven't), feel free to ignore it. 



I grew up in Eureka, Illinois, a town of about four and a half thousand souls. Eureka was once called Walnut Grove, but had to change names for reasons which remain mysterious to me. Someone told me that the discovery of a second town in Illinois also named Walnut Grove necessitated the change. I tried to verify this story, but a glance through the atlas revealed no other towns with that name. Perhaps this second Walnut Grove also had a re-christening. In any case, today you will find few walnut trees in Eureka; a blight in 1910 killed nearly all of them. Eureka was also once the Pumpkin Capital of the World, but somehow this title, too, has been lost. Today, our rivals in Morton, Illinois, reign as pumpkin kings, and all that is left of Eureka's cannery is a crumbling brick ruin.

In fact, by the time I came along, Eureka had only one feature that distinguished it from other midwestern fourth-generation German farming towns: its college. And the college was famous because of Reagan. A tiny, private, church-affiliated school, Eureka College gave a diploma to future president Ronald Reagan in 1932. I was twelve when Reagan took the oath of office, and the town was bursting with pride. A large sign appeared in front of the courthouse which read "Visit Eureka College, alma mater of President Ronald Reagan. Go four blocks, then two blocks south." It was left for the seeker to decide in which direction the initial four blocks lay. A year or so later someone noticed the mistake and added a tiny caret and the scrawled word "west" to the sign.

The college quickly scrambled to capitalize on Reagan's fame. A portrait featured prominently on the prospectus and other recruiting materials. Eventually, a Reagan Scholarship was established—somewhat ironically, as Reagan himself claimed that his grade average in college was "closer to the C level required for [sports] eligibility than it was to straight A's." Perhaps the only one unhappy about the college's love-in with the President was my father, at that time Dean of the College. My dad was (and remains) an old-school Stevenson liberal, as well as something of an academic conservative; apart from the obvious political differences he had with the Reagan Administration, the lauding of such an undistinguished scholar by a place of higher learning rankled him.

During his two terms in office, Reagan made several trips to Eureka for photo-ops and the occasional speech. A week before each arrival the Secret Service would arrive in town, black-suited and comlinked. No one knew exactly where they stayed—Eureka has no hotels. One day they simply appeared, pacing intently up and down Main Street past the five and dime, lurking amongst the greeting cards in the Hallmark store. For the most part they stuck to the Eureka College campus, where they endlessly staked out the dozen or so dormitories and classroom buildings, whispering into their sleeves to one another.

On one visit, shortly before the 1980 elections when he was still only a candidate, Reagan came to light a bonfire at Eureka College. The cheering students arrived early and the pom-pon squad did routines dressed in skirts in spite of the autumn cold. The high school pep band, which included my brother on trombone, played the Eureka High School Fight Song ("On, Eureka, win this game, fight to put our foes to shame") and the Star-Spangled Banner. They played those thirty-two bars again and again, for hours. The cheerleaders huddled together for warmth. Suddenly, Reagan's limo arrived and the Secret Service pushed the teenagers back to either side as the band played Hail to the Chief. Reagan emerged, smiling, from his car; an agent handed him an already burning torch, which the President threw onto the pyre. A few waves to the cameras and he was gone.

A more substantial visit by Reagan came when he spoke at Eureka College's 1982 commencement. The speech took place in Eureka College's Reagan Athletic Center, and drew a large audience from the national press as well as from the town. Observers filled the basketball court; along one foul shot line sat a row of boom microphones and video cameras huddled together. To one side of the gymnasium, the hundred or so graduating seniors of Eureka College sat, humble observers of their own graduation. As Dean of the College, my father was to appear on the dais sitting next to Reagan. For this he needed security clearance in the form of a color-coded lapel pin; I was warned not to follow him beyond the marked areas (that is, into the men's locker room). This was less than a year after John Hinckley Jr.'s attempt on Reagan's life, and my youthful and paranoid mind raced with images of agents swarming over me and beating me to the linoleum after one misstep. As a self-pitying teen with something of a persecution complex, the thought of such a fate appealed to me, but I stayed in my place anyway. Two weeks after the graduation, my grandmother called my father to congratulate him: a photograph of him sitting next to the President had been printed in People Magazine. "I never thought I'd see my son there!" she proudly exclaimed.

The town's biggest Reagan moment by far came two years later, when he spoke at Eureka College on its Founder's Day. In a speech sponsored by Time Magazine, Reagan was to detail his proposed Strategic Arms Reduction Talks (and, at the same time, put the final nail into SALT II's coffin). Again he gave his speech in Reagan Athletic Center, and again the national press descended on Eureka, but in greater numbers than ever before or since. This time I was in the pep band, playing my brother's discarded trombone. We played an enthusiastic, but error-laden, version of Eureka College's song, 'Neath the Elms, while listening for the sound of the presidential helicopter overhead. The gymnasium was filled with an army of reporters, photographers, and cameramen from every major network, newspaper, and magazine in the country. My father had in the intervening years resigned as Dean, and he and a couple dozen other faculty members decided to wear armbands to protest both Reagan's anti-Sandinista policy and his belligerence towards the Soviet Union. The college's administration had been forewarned, however, and seated the faculty far in the back, out of sight of the cameras. So with that potential embarrassment defused, the speech went off without a hitch. After the President spoke and the applause ended, Reagan flew off to spend the night in the Sands Hotel in Las Vegas. Behind him, the hundreds of reporters sent their copy off by wire. For one day at least, the dateline of "Eureka, Illinois" would appear in papers around the world.

In 1986 I left Eureka to attend college. Like most teens from small, midwestern towns, I couldn't get away from home fast enough. That I was leaving one backwater town behind to attend school in another backwater town didn't matter much. Soon the day-to-day concerns of books, papers, and my sex life pushed aside political concerns. By the time Iran-Contra broke in 1987, it seemed more like a nightly sitcom to me than a national outrage. In 1988, the Reagan Administration was dead—long live the Bush Administration.

There was to be an epilogue to my dealings with Mr. Reagan. Starting in the final year of his presidency, Eureka College lobbied hard to receive his Presidential Library. For months, the college's administration held its breath, but to no avail: Simi Valley, California, got the papers. In Eureka the rumor was that Nancy Reagan, never a fan of her husband's humble origins, had decided that a West Coast home for the library was more respectable. But the Reagans did throw a bone to Eureka in the form of the "Reagan Memorabilia." If Simi Valley was to get the major documents of the Presidency, Eureka was to get the clutter from the Reagans' attic: T-shirts, paperbacks, presentation gifts, and assorted bric-a-brac. Some items held marginal interest—several keys to several cities, for example—but on the whole the Memorabilia was the sort of detritus one finds at garage sales. The task of sorting out the few wheat berries from the plentiful chaff fell, coincidentally, upon my mother, newly appointed librarian for Eureka College. Dutifully, she dusted off those items she could and placed them in glass cases on the first floor of Melick Library. But she still had several boxes of—well, of junk—left. What to do with those?

That Christmas, under the tree, all the children had special gifts, courtesy of the Reagans. My future wife, Marina, gratefully received Nancy's copy of Jane Seymour's Guide to Romantic Living, and I tore the wrapper off Ronald's first edition of Tom Clancy's The Hunt for Red October—a novel Reagan reportedly called "un-put-downable." I've since given the book away unread, although I've seen and enjoyed the movie. But in this respect, at least, I resemble the former president—to judge by the wear on the pages, he only made it a third of the way through before setting the novel aside.

27 June 2011

Violators will be shot

My comments last week were a bit of a mess and I want to clarify. I'm not in favor of abolishing copyright, and I have nothing against people trying to profit from their work. But I do worry about what seems to be an ever-broadening sense of entitlement on the part of rights holders. Not only have copyright terms been extended to a point that they have long ceased to be incentives to creation (why should Disney create new characters when they can continue to coast on ones that are 90 years old?), but also (and more troubling), rights holders are increasingly demanding absolute control over every word, note, or pixel that they produce. Forget about Warhol not being able to paint soup cans. We're living in a world where the French government claims copyright over any image of the Eiffel Tower at night.

Bullet hole brushes by obsidiandawn.com

In at least one work on Maisel's portfolio site, the image is largely made up of someone else's sculpture. Perhaps he had the sculptor's permission; I doubt he bothered to secure the permission of every architect whose work he's photographed. Of course all these photos are transformative. But they all rely upon the existing work of other artists, some to a greater extent than they rely upon any choice Maisel made or technique he employed. Whether Baio's work was transformative is something we can argue about, but not something that will ever be decided in court.

When I read discussions of this situation here and elsewhere I'm struck most by the type of comment that begins "Well, I make my living as an artist, and I think..." More often than not, the comment goes on to defend Maisel and any and all claims of the rights of the artist against an antagonistic world. The implication is that the interests of the artist—and by extension, of Art—are served best by the broadest interpretation and application of copyright holders' rights. But is this true? Are artists really better off playing hardball? And perhaps more importantly, how does such behavior serve the culture at large?

One word that I've seen a lot in defenses of Maisel is "respect"—as in, people need to respect the work that photographers do. That would be a reasonable point if Maisel's approach had been proportionate to the imagined harm. As it stands, it's like supporting Old Man Potter at the end of the block who likes to take out neighborhood dogs with a shotgun when they enter his yard. He's only defending his property.

The word I'd like to stress is "humility." All art is derivative, but photography especially relies on photomechanical processes for its production. As much art as the photographer brings to the process, she still depends on pre-existing subjects. These include the myriad works of architects, fashion designers, engineers, and even other photographers that make up the scenes she captures. I would hope that photographers, perhaps more than other visual artists, would understand that art is fodder for art, and that if we make it necessary to contact, ask permission from, and pay every possible rights holder out there, we are pricing a lot of people out of making a lot of work, and that's also bad for artists—and everyone.

23 June 2011

The sound of money

There's a fable that is told in many different versions around the world. In some European versions the hero is a wandering clown, but in the version I first read as a boy it was Ōoka Tadasuke, the 18th century Japanese magistrate who has many folktales attached to him. The story goes: an innkeeper overhears a poor student tell a friend that he always eats his rice as the innkeeper is preparing his fish, and thus the smell improves his meal. The angry innkeeper brings the student before Ōoka, demanding payment for the stolen smell. Ōoka responds by having the student spill his pocketful of coins from one hand to another, and tells the innkeeper that the smell of fish has been repaid by the sound of money.


The popularity of this story speaks to a deep, common-sense understanding that there are some things of value that are beyond commerce. The value of these things lies partly in their having no ownership, in the way they float in the air, literally or figuratively. Unfortunately, this doesn't mean that indignant, entitled, or greedy individuals won't try to assert their claim.

Making the Internet rounds today is this disturbing piece on Waxy.org. Andy Baio, a tech enthusiast, made a series of chiptune versions of Miles Davis's tunes on Kind of Blue; when he distributed the files, he included a version of the Kind of Blue cover that had been highly pixelated. The original photographer of the Davis portrait, Jay Maisel, pursued Baio for copyright infringement. The case was settled out of court for $32K.

I'm not going to go into the legal issue here—I have strong feelings about the stupidity of the current state of intellectual property law and I'll bore you all some other time—but I do think that the story poses the question, what motivates an artist to take such disproportionate and vindictive action against a fan? It's not money—Mr. Maisel is one of the most successful commercial photographers in the world. In Baio's account, he quotes Maisel's lawyer:
"He is a purist when it comes to his photography," his lawyer wrote. "With this in mind, I am certain you can understand that he felt violated to find his image of Miles Davis, one of his most well-known and highly-regarded images, had been pixellated, without his permission […]"
With all due respect, I'm certain I can't understand how Maisel's hurt feelings are worth $32K. What's being referenced here is the noxious concept of Moral Rights, the idea that an artist's right to preserve the integrity of a piece of work can and should prevent anyone else from editing it in any way—including, apparently, using the original work as a springboard for something new. The American system of copyright, for the most part, does not recognize Moral Rights (it's mostly concerned with people getting paid), but the romantic ideal that somehow the artist's intent should trump future artists' intent forever and ever is anti-Art. Everything is derivative. That's the way culture happens.

I make a part of my living as a designer and illustrator. I have used existing works of art as inspiration, as reference material, as grist for the mill. Maybe some other artists have taken bits and pieces from what I've done and made something new. I don't know of any transformative works made from mine, but I have found places where my art was used unaltered, without attribution or payment. If someone were to use one of my illustrations commercially, I would ask for payment; if they were to use it in a way I found offensive, I would ask them to stop.

But really? Most of the time, it just makes me smile—because I know my work is floating out there along with all the other smells.

22 June 2011

Fight for your right

I'm not really the type to have role models; people are too complicated, too full of good and bad to be credible heroes to me. But if pressed for an example of someone who showed great courage in speaking truth to power I would have to go with William Gaines.


Gaines is best known today as the longtime publisher of MAD Magazine, and while financing a rag whose purpose was to pervert middle-American values and raise several generations of smart-ass punks would be enough to commend him, I think his finest moment came in 1954, when he was the publisher of the EC Comics line of horror comics: Tales from the Crypt, The Vault of Horror, Shock SuspenStories, etc. Red panic was in the air in those days, and a crackpot psychiatrist by the name of Fredric Wertham had just published a book entitled Seduction of the Innocent, which claimed that violent comic books were perverting American youth, turning them into either Communists or homosexuals or both. The upshot of all of this was that Gaines was called before the U.S. Senate Subcommittee on Juvenile Delinquency to give an accounting of why his comics were so gory and disgusting and devoid of redeeming qualities and really wasn't he ashamed of himself.

You can read his testimony in its entirety online, but long story short: he refused to play the game. When asked to justify himself—to explain what possible good could come of comics featuring beheadings and eviscerations played for laughs—he shrugged: "It would be just as difficult to explain the harmless thrill of a horror story to a Dr. Wertham as it would be to explain the sublimity of love to a frigid old maid."

He stood condemned before he spoke a word, of course, and within a year EC Comics was forced to stop publishing its horror line because of the new guidelines set forth by the Comics Code Authority. But I've always loved the image of this bespectacled, schlubby man effectively thumbing his nose at the idea that entertainment had to serve some noble purpose. Here was a guy who refused to justify something that had, and needed, no justification. There are many people who pay lip service to the idea of free speech when that speech is in service of a cause. It takes courage to defend free speech in the service of nothing other than simple self-gratification.

I often think of Gaines whenever someone ties themselves in rhetorical knots trying to answer scolds and censors who aren't happy with the kind of fun others are having. Video games promote hand-eye coordination; I'm not wasting time on the Internet, I'm building my social network; I'm only reading it for the articles. Look at the convoluted points activists make when trying to legalize marijuana: they talk about pot's medicinal uses, the ways the fibers can be used to make rope and the ways seeds can be used to make oil and the ways the roots can be used to make kitchenette sets. What they don't say is: "I want to get high, and I think it's fair that I be free to do that in a safe and legal way." 

There are some things we do for shits and giggles. Most of the best things: roller coasters, horror movies, bourbon, sex. You have your own list, and it's probably different, but you don't have to justify it to me.

21 June 2011

I get knocked down

Parenthood is a project made of anxiety and delight, but most especially drudgery; and of the tedious bits, some of the worst are the endless hours spent playing a child's first board games. The absolute nadir of these is of course Candyland, whose cruel and capricious nature has driven most parents to stack the deck in their child's favor (or in their own favor, who cares, so long as the damn game ends already for God's sake). But a close second is the game Chutes and Ladders.


Chutes and Ladders is the kinder, gentler cousin of Snakes and Ladders; of the two, the latter name more accurately describes the feeling of playing the game, which is not unlike a case of the DT's. In the event you never were a child, here's a description: the game consists of a race to the final square interrupted by a string of random reversals of fate in which your piece ascends ladders or descends chutes. These titular features are arranged in such a way that victory is eternally snatched from young innocent hands and the average game length is six hours (including two nap times).

The picture above is the version of the game I played with my daughter Kate when she was three; the board, apparently from the 1970's, includes scenes that attempt to provide karmic justification for the players' rises and falls. In one square, a girl mixes batter and so is rewarded by a ladder leading to a cake; in another, a boy reaches for cookies on a high shelf and falls down a chute that ends with a concussion and a possible lifetime of epilepsy. In one confusing pair, a child either hands his mother her purse or absconds with it and is rewarded via a ladder with ice cream; in yet another, a boy skates on thin ice, only to chute to what we must presume is an icy death.

I don't think that Kate was impressed by the lessons of these scenes, but being a rational child she liked the depictions of cause and effect. She was especially pleased by a sequence where pulling a cat's tail results in a scratched face for the abuser. Kate herself has to this day a scar from the family cat that she received in exactly the same way; perhaps she found some sort of atonement via proxy. But for me there was no comfort in these tiny morality plays, because in spite of the veneer of a just universe the game is still entirely one of chance. In fact, the rewards and punishments only made the underlying randomness of it all the more depressing. What did it matter if sweeping the floor earned the little girl a trip to the movies when her stab at the housework was just one of six random options in the first place?

Fortunately, the days of playing this game are long behind me; I think we got rid of our copy at a garage sale, or maybe it's squashed flat beneath the weight of better games on our basement shelves. But at night sometimes the game still haunts me. I wake from a dream of sudden falling and I wonder whether my life is really a series of actions and consequences or just one die roll after another.