29 August 2011

If it don't happen here, it don't happen anywhere

In the hours leading up to Irene's collision with New York City, there were cries of derision around the twitterverse and blogosphere (and not simply that "twitterverse" and "blogosphere" are godawful non-words). Whatsamatter, hipsters? snarked the rest of the country. Can't handle a tiny li'l category one? Try living in Florida, we get three category fives before most breakfasts.

Now I don't live in New York, but I did once long ago, and my little brother does now, and so do about nineteen million other people, hipsters or not, and I couldn't help but feel a little defensive on their behalf (especially since the minute a single snowflake falls in Texas they break out the sackcloth and ashes). The fact is, no one really knew what was going to happen with the coastal surge, or what effect hurricane winds would have on 19th-century brownstones, and because it's nineteen million people on a handful of fucking islands, I thought people could cut them a little slack for being a tiny bit jumpy.

So we all know now that, thank LaGuardia's ghost, Irene had minimal-to-no impact on New York. The key words being: on New York. Pennsylvania got creamed. Vermont is underwater. Here in Boston, just a block from my house, half a tree fell on a neighbor's home. Over twenty people died. And so of course, this morning all you see from the twitterverse and blogosphere (my apologies) is New Yorkers firing up their old swagger and complaining that everyone got excited over nothing.

So I take it back: you guys are all jerks.

25 August 2011

Guilty of immorality

In the 19th century it was common for employers to insist that their workforce attend church services regularly. In Lowell, Massachusetts, an 1848 handbook for women working in the mills stated, "The company will not employ anyone who is habitually absent from public worship on the Sabbath, or known to be guilty of immorality." Servants in Victorian and Edwardian households were expected to use a portion of what little time off they were given to attend Church of England services, lest they give in to their baser instincts. The upper classes felt justified in taking a paternal interest in directing the spiritual lives of the laboring class: they were, after all, looking out for their employees' immortal souls. And if morality could also be a club to keep the rabble malleable, all the better.

I was thinking about this history when I read this Atlantic collection of interviews with employers about the "mistakes" job seekers make. "Sanitize [your] net presence," chides one interviewer. "Those drunken spring break pictures have got to go." We've all heard stories of people getting into trouble because of what they wrote in their blogs or what they do in their spare time, but the unapologetic tone of this interviewer still took me aback. It's not simply the absurdity of an employer thinking that what a candidate did on spring break has any bearing on their fitness for the job. It's the fact that they were even looking at that candidate's Facebook page in the first place. I mean, I find the idea of my mom reading my status creepy, let alone my boss. (Hi, Mom!)

But, the argument goes, if you choose to make your life public on the Net, don't employers get to use that against you? The problem is that in most cases the offending revelations have nothing to do with the employee's fitness. It's that they had the audacity to post a photo of themselves in a bikini, or they used the word fuck in their blog, or they dared to support one candidate or another. In other words, the employer is seeking to get their unruly workforce to adhere to a moral code that goes far beyond the concerns of the workplace.

These days it's illegal to make religious decisions for your employees, either by requiring that they practice a certain faith or by prohibiting them from practicing another. We recognize such efforts as wrong-headed, patronizing, unfair. What we need now is to extend this understanding to secular choices as well. And HR Dude? Quit being such a creeper.

18 August 2011

The same old played out scenes

This week in copyright comes the news that the rights to many songs from 1978 could transfer from record companies to the original artists. It's a provision of the 1976 Copyright Act called "termination rights," which allows artists to reclaim copyrights they assigned to their publishers after 35 years.

I'll admit that the main attraction of this story is watching the publishers hem and haw over the finer points of the law while maneuvering to hold onto these properties. These are the same people who pose as protectors of artists' rights when they sue file sharers (and pocket the settlements); now that their interests are in conflict with the creators they allegedly champion, they are making no attempt to hide their hypocrisy.

But while I'll happily grab a bag of popcorn to watch the recording industry's shameful display, I think there's a deeper lesson here. The fact that the RIAA is spending money on lawyers over 35-year-old rights demonstrates once again that copyright in its current form is failing at its basic goal of promoting the production of new work. Why should Columbia be looking for new talent when it's more profitable to fight over Darkness on the Edge of Town? For that matter, why should Bruce Springsteen write anything new if he can snag those rights?

Highway 17

Now that I have a new computer, I have about three or four months to enjoy being technologically current before sliding inexorably back into obsolescence. For me (to my wife's sadness) this means playing all the games my old computer couldn't handle, which is approximately all of them.

One evening about a week ago I was playing Half-Life 2 when I noticed I was feeling feverish and sweating; however, it's summer in Boston and my basement can get kind of stuffy, so I turned on a fan and went back to whacking headcrabs with a crowbar. All at once I felt like throwing up, which I almost never do. I thought I had come down with something and I staggered to bed.

Feeling better the next day, I resumed the game and promptly needed to lie down, fast. So it had really happened: I'd become motion sick from a video game. The nausea wasn't half as bad as feeling like a wuss; I had really become an old guy for whom freaking Half-Life was too heavy a dish. I waited for the floor to stop spinning and then picked myself up off it. Then I typed "motion sickness Half-Life 2" into the search bar.

Turns out, this is a common problem with Half-Life 2, with discussions on dozens of message boards. Amongst all the jibes of n00b and fucking pussy go back to Tetris, I learned that the likely cause of my reaction was the field of vision the game depicts. While most first-person games use a 90-degree angle of view (which roughly corresponds to real life), Half-Life 2 is set to 75 degrees, producing a sort of tunnel effect, like looking through a pair of binoculars (which can also make me a little dizzy). Fortunately, the game allows you to adjust the field of vision, and I did. And that was it: the relief was immediate and complete, and I was able to continue through the game's roughest, shakiest camera bits with no ill effects at all.
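For the curious, you can put a rough number on what those 15 degrees do. This is just my own back-of-the-envelope sketch of a standard perspective projection, not anything from the game's actual code: an object's apparent size on screen scales with 1/tan(fov/2), so narrowing the view from 90 to 75 degrees effectively zooms the whole world in.

```python
import math

def magnification(fov_degrees):
    """Relative on-screen size of an object at a given horizontal FOV.

    Under a perspective projection, apparent width scales with
    1 / tan(fov / 2): the narrower the field of view, the more
    the scene is magnified.
    """
    return 1.0 / math.tan(math.radians(fov_degrees) / 2.0)

# Compare Half-Life 2's 75-degree view with the more typical 90 degrees.
zoom = magnification(75) / magnification(90)
print(f"A 75-degree view magnifies the scene about {zoom:.2f}x vs. 90 degrees")
# → A 75-degree view magnifies the scene about 1.30x vs. 90 degrees
```

In other words, everything in the game looks roughly 30 percent bigger and moves correspondingly faster across your vision, which is plausibly why my inner ear filed a complaint.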

But while I've enjoyed the game and also like not being sick, the whole experience has been unsettling. My mind had been tricked by a tiny wedge of virtual sight into debilitating illness. Not only that, but the solution was mundane, mechanical, predictable. At some level I know that what I call myself is a series of biological and psychological processes, but the full implications of that are not something I dwell on. I don't believe in an animating soul that is the truest self, but I do like to think that the mind is more than a chemical/electrical call and response. But those 15 degrees seem to say otherwise.

09 August 2011

What we talk about when we talk about patents III

Granting patents is society's way of saying that certain devices or processes are original; implicit in the system is the idea that novelty is to be especially rewarded. It's an inherently individualistic, anti-cooperative approach to innovation. It's also one based on the romance of competition as the basic mechanism of progress. Americans have great faith in the adversarial: our government, our legal system, our economy are all based on the idea that the clash of interests will result in great laws, or justice, or prosperity. But the ugly truth is that competition doesn't only produce better things; it also produces better ways of eliminating your competition.

And to be honest, there aren't a lot of lightbulbs waiting to be patented. No one is going to find a seventh simple machine. Invention isn't so much a process of aha! as it is of hmm. It's about looking at the current state of technology and finding a good place to continue. This is particularly true when it comes to software development, and it's telling that developers themselves have been vocal opponents of the idea of software patents. But holders of patents are, by and large, not inventors but corporations, and for them the main attraction of patents is to build up an arsenal of potential lawsuits, or to protect themselves from said lawsuits.

Back in the 1970s there was a Parker Brothers game called The Inventors in which players would purchase zany old-timey inventions, patent them, and then seek royalties. Amusingly, the inventions themselves were all of questionable value and completely interchangeable from a gameplay standpoint. One wrinkle was that until "patented," the inventions could be stolen by other players. So the lesson was not that patents help bring useful ideas to market so much as that they are chips in a legal game. While I don't think satire was the goal of the game designers, they got it all pretty much on the nose.

05 August 2011

What we talk about when we talk about patents II

America loves the single guy against the world. There's one story that colors our ideas about invention and innovation more than any other, and that's the story of Thomas Edison. The tale of a telegraph operator picking himself up through ingenuity to secure more than 1,000 patents and usher in the electrical age is charming and magical, to the point that the light bulb itself has become the symbol of sudden insight. Of course, the problem with this story is that it's mostly crap.

Not that Edison wasn't a genius, because he was; not that he didn't oversee remarkable innovations, because that's true as well. But the majority of his work, including the development of a commercially viable incandescent bulb, consisted of incremental improvements on other people's ideas, carried out by an army of work-for-hire inventors who were treated with varying degrees of fairness. Edison is famous for his poor treatment of Nikola Tesla, and for subsequently fighting a wrong-headed battle with his former employee over whether electrical distribution should use AC or DC current. But Edison had an even darker side, a ruthless side. He vigorously protected the copyrights to his motion pictures even as he duplicated and exhibited Georges Méliès's A Trip to the Moon without compensation.

Edison was a complex man with a mixed legacy. But my point here is that he was not solely, or even primarily, responsible for the various patents he acquired. Nonetheless, his legend lives on in the way we think about patents: the solitary inventor with original insight needs protection from the "theft" of the fruits of his genius. He is rewarded with riches, we are rewarded with innovation. It's a pretty story, and it would be harmless enough, except that it's riddled with false assumptions about the nature of innovation and the importance of originality. It's this last myth—that purely original ideas are to be valued above incremental improvements, that purely original ideas exist at all—that has done the most damage to the way that patents are awarded and rewarded.

(to be continued)

04 August 2011

What we talk about when we talk about patents

Last week's This American Life show on patent trolls did a good job of explaining some of the more vexing problems with the patent system in a way that an uninterested layperson could understand, or even find compelling. One of the hard things about being a wonk about intellectual property is that the subject is so full of nuance and history and theory that you can't really form a good soundbite.

Patent trolls are a pretty easy target for scorn. Acquiring as many poorly defined patents as one can, not to produce goods or services but to sue actual producers over alleged violations, is pretty reprehensible even to the most disinterested observer, and it makes for a good story. Even if you have no idea what prior art is, you can feel indignant on behalf of the poor programmer who slaves away on his app only to be smacked with a lawsuit at launch because someone else says they invented the idea of icons. But if you want to move beyond complaining about a specific bully to talk about why software patents themselves make no sense, then you have to discuss the history of logic gates and difference engines and virtual machines and what algorithms are and oh god just kill me already.

Ultimately I think that the only way to get a layperson to think critically about patents (and about copyright and trademarks) is to ask what it is we want patents to do, and what it is they're really doing. Because it's not about violators stealing something that belongs to someone else. It's not about property at all. The constitutional basis for patents and copyright is the promotion of science and the arts, and we should judge our system by the question: are we producing more and/or better ideas with the system we have in place? Would we have more and/or better inventions with a different system—or even with none?

(to be continued)