Two Quick Reads and Two Videos in Two Weeks ...
In the past two weeks I came across two quick reads and two videos that caused me to make some connections worthy of thought.
Bruce Schneier, writing on May 15, 2008, at Wired, made me stop and think about all of the "free" services I routinely explore for their value-added potential in education. I often just make up absurd information when that information is required of me and I don't want to provide it (like, for an email address: email@example.com). I have never stopped to think about the lifespan or later possible use of this meaningless, inaccurate information. I just don't want any more junk mail. Bruce writes:
Our data is a part of us. It's intimate and personal, and we have basic rights to it. It should be protected from unwanted touch. [Source: Our Data, Ourselves]
We teach children about the socially expected behaviors surrounding our personal physical space, from casual to intimate. This article really got me to stop and think about the virtual me, my data (financial, health, social, professional, civic...) and the socially and legally appropriate ways that information should be touched--information, accurate or not, that comes to represent me and to affect decisions made about and for me, perhaps without my ever knowing those decisions were being made. I might not even know the information was aggregated and used.
I also watched Jonathan Zittrain's presentation (from April 11, 2008, at the Tribeca Grand in NYC) about his new book, The Future of the Internet and How to Stop It. The video of the presentation (about an hour) is graciously made available by the New York Greater Metropolitan Area chapter of the Internet Society at this link. In the presentation Jonathan talks about the generative nature of the internet versus a new push to use "tethered devices" as he calls them--devices that close innovation and are controlled by the manufacturer even after the sale. [I have written briefly about the internet as an operating system before. Jonathan's ideas helped me clarify some of my thinking.]
He mentions several really interesting examples, culminating with the FBI paying to have the OnStar system remotely reprogrammed in a car owned by people of interest, so that everything spoken in the car was transmitted to the FBI through OnStar without anyone in the car being aware. He goes on to say that, driven by consumer demand, we have built an unrivaled infrastructure of cell phones and other devices that could be leveraged for surveillance (by the good guys and the bad).
And then I read this article about the National Cyber Security Initiative by Ryan Singel at Wired:
... would spend billions on unproven, embryonic technology, and possibly illegal or ill-advised projects, according to the analysis ...
While many of the specifics of the plan are classified, U.S. intelligence chief Michael McConnell told the New Yorker in January that he wants the National Security Agency to begin eavesdropping on the internet, and a McConnell aide said the spy agency was prepared to examine the content of e-mails, file transfers and Google searches without a warrant. [Source: Report: Government's Cyber Security Plan Is Riddled With New Spying Programs]
I'm not really passing any judgment on these examples. Like most everyone else, I want the bad guys caught. I want us to prevent the bad people from doing bad things to good people. But larger issues may be at stake, issues worthy of careful thought and scrutiny. None of us wants to wake up one morning and ask, "How in the world did we get here?!"
The rapid pace at which our technologies are developing is vastly outstripping our awareness of the issues that surround that development and our capacity to have informed conversations about those issues--conversations needed to establish public policy and legal frameworks that are reasonable and fair and that appropriately safeguard and balance the best interests of a free democratic society, a capitalist economy, and the rights of the individual. And not only is the pace of development rapid; can it also be completely invisible to public scrutiny and democratic oversight? Should it be? These are complex questions!
And during the week, on a less weighty yet more immediately personal level, I also came across this video interview at Switched with Clay Shirky, adjunct professor teaching New Media in the graduate Interactive Telecommunications Program at NYU. Clay really informs my thinking about the internet.
The issues broadly touched on in this post are complex and have long-term implications for freedom, safety, democracy, privacy, and economic sustainability, to name but a few. In order to have more informed conversations with our children about these significant, developing concerns, we need to have greater public and professional conversation about data security, privacy, and ways we can move our social, political, and legal structures to develop policy frameworks that keep pace with the challenges technology brings to our daily lives.
These are challenging and exciting times!