Clifford Stoll wrote The Cuckoo’s Egg in 1989, telling the true story of how he started investigating 75¢ of computer time nobody had paid for and ended up catching an international hacker passing through his computers to gain access to military secrets. The classic story is a fascinating mix of technical detail and the thrilling action of hunting an invisible criminal through the phone lines.
My favorite passage, though, comes at the very end. After the overseas spy is caught and brought to justice, another hacker slips into Stoll’s system and for a moment the whole process starts over again. Stoll writes:
He got in through an unprotected astronomy computer run by a couple of infrared astronomers. They didn’t care about security . . . they just needed to connect to the network. Because I exchange programs with them, we’d set up our systems to work as one—you didn’t need a password to move from their computer to mine.
…
A couple days later the SOB called me. Said his name was Dave. From Australia. “I broke in to show that your security isn’t very good.”
“But I don’t want to secure my computer,” I replied. “I trust other astronomers.”
And that’s the moral, as true today as in the 1980s. We don’t want to secure most systems. Certainly I wouldn’t want my online bank account accessible to common thieves, but a database of research or a casual blog shouldn’t require elaborate protective measures.
This is just as true in the physical world. I wouldn’t put my valuables in a bank vault with no lock, but the classroom doors where I went to college were always open.
Unfortunately, in software, leaving any door unlocked can grant access to resources beyond the application itself, so we sink a fortune into securing even the most trivial software against every imaginable attack. That's a high price to pay for protection against the pranks of mischievous hackers.