Wednesday, October 31, 2007

AT&T Slashdotted

...and also Wired-Blog-Networked (?) with this story about a research paper from 2001 about a language for data-mining, more provocatively called a "Programming Language for Mass Surveillance". While I'm not happy with much of what my former employer seems to have gotten up to since I left, I guess I know too much of the backstory on this to subscribe to this particular round of hysteria. As an employee back in the late '90s and a sometime colleague of the principals on the paper, I'd readily believe that the original purpose of "Hancock" was, as claimed, problems more like detecting long-distance fraud than supporting the NSA and their ilk. (As an aside, at least 2/3 of the authors of this paper are now at Google.)

This all points to an intriguing problem, really, for those of us engaged in R&D. Any sufficiently general tool can be used for many purposes, some good and some evil - and as tool creators, we have little control over the eventual effect of what we do. In fact, the better you are as a scientist and engineer, the more general-purpose your results and tools will be - so in CS, at least, it's likely that you'll facilitate something unpleasant sometime in your career. Certainly, we can make choices about where we work and how we direct our energies, but the bottom line is that it is not as scientists, but as citizens (and consumers), that we need to decide to what uses technology will be put.

And one man's language for mass surveillance might be another man's language for analyzing protein-protein interactions in the search for cancer cures.

1 comment:

amberman said...

William, that's even true when you're teaching. You have no control over how your students apply their knowledge, although you can certainly try to instill an ethical consciousness. We are collectively responsible for how our governments and corporations apply the tools we've created.

MB