August 31, 2004

Crypto-Guru Bruce Schneier on IT Threats

Just after I posted regarding IE and malware, I came across this interesting Bruce Schneier interview at Neowin. Bruce is a well-known cryptographer and security expert, and I've read his informative book, "Secrets and Lies". As you can tell from my prior post, I heartily agree with Bruce, particularly on this point: "What do you see as the biggest threat in the IT age?"

Regarding the ultimate responsibility for security: "If you were to look at 3 areas - The Software Designer, The Systems Administrator, The User - who would you say should bear the burden of responsibility for security? Or do you perceive it to be a shared responsibility?"

Again, this illustrates my point about users simply not knowing any better while contributing to the problem. But that's reality, like it or not. Some may take the initiative to better protect themselves (especially after getting burned at least once), and others won't. While there's a lot of self-help available online (you know it's bad when WSJ's Walt Mossberg covers spyware this month), it only goes so far.

When it comes to security, people are often the weakest link in the chain. Just ask the law firm whose longtime bookkeeper fell prey to a Nigerian e-mail scam, embezzling $2.1 million in the process. The breach in security wasn't just the person who embezzled the money, but also the management under which it occurred, and the bank manager who approved all of the wire transfers even though the bookkeeper was not authorized to make them. I also recommend reading Sharon Nelson and John Simek's enlightening article, "Disgruntled Employees in Your Law Firm: The Enemy Within".

Please don't misunderstand this as having a "down on people" tone; I can assure you it's not. It's about recognizing some of the root causes of security breaches and thereby being better prepared. For example, "social engineering" preys on our fundamental tendency to trust one another, especially in a seemingly routine context.

I too would like to see software developers better address the issue. But unlike Bruce, I don't see it as quite the rosy picture he paints. Iterative security testing, while welcome, would no doubt lengthen the development cycle and increase the overall cost of the software. Since it's not practical to expect all software developers to include an equally effective level of security testing and remediation, and since virus and trojan authors generally find ways to proliferate their malware faster than developers can detect and close the holes, we're still going to need all of our expensive security software and experts to keep us relatively secure. Overall, we'd probably be more secure, but it's going to cost us. How much? As he mentioned, it's tough to determine the most cost-effective way to allocate responsibility.

Not all that long ago, it occurred to me that the free market would probably determine how much security is appropriate, and Bruce lays this out regarding Microsoft: "The company is not a charity, and it doesn't make sense for them to make their products more secure than the marketplace demands. And right now the marketplace doesn't demand security."

Lastly, Bruce offers good advice, but inherent in it is the requirement for self-education (my emphasis added): "Do you have any practical advice for our readers, in terms of staying secure, and safe?"
Topic(s):
Privacy & Security
Posted by Jeff Beard