Last week I got to spend an evening reinstalling my wife’s laptop because she picked up some malware. She received an email from a friend with a link to a YouTube video that prompted her to install a new codec.
Except the codec wasn’t really a codec.
The link wasn’t really to YouTube.
The email wasn’t really from her friend.
Here we are in 2010, 65 years after the advent of "program instructions as data" (the von Neumann architecture), and the brilliant innovations that propelled computing forward now haunt us. The power and complexity of computing are amplified by the connectivity of the Internet, creating an ecosystem ripe for exploitation through clever social engineering combined with some technical trickery.
And it’s not fair. It’s not fair to the users who suffer, who lose data, whose systems are held hostage. It’s not fair, and the blame for these calamities cannot be laid at the feet of careless or naive users. Daily computing tasks should not require constant vigilance, let alone deep technical knowledge.
Many of us schooled in the last century hold on to notions of software that are simply unsafe for many of our users. We insist on building software that runs in environments that are not inherently safe. But there are trends afoot: the closed ecosystem of the iPhone and the iPad, the coming closed ecosystem of Windows Phone 7, and the increasing ubiquity of Internet applications. Slowly but surely we are moving toward paradigms in which the average user will have much less to worry about because software is vetted and sandboxed.
As a developer I don’t want to be sandboxed in every situation. I want to be able to make educated decisions about software and run it on my devices, but the default should be safe and secure. It’s shameful that things are still so treacherous.