Why We May Never Have IT Security

I’ve been asked repeatedly why security is so bad. For years I’ve just ignored the question, figuring it was pretty obvious to anyone who spent more than a few seconds observing IT. However, I’ve come to the conclusion that it’s not obvious. Most people don’t get why IT security is hard and getting harder and why we’ll never truly have IT security to the point where we don’t have to worry about it.

Much of the problem stems from the fact that IT security is pretty much in the same place it was back in the 70s and 80s. It’s stagnant. This isn’t a problem with the folks in IT security but rather a sad indictment of IT itself. IT has been stagnant since the 70s and 80s; not much has changed. And therein lies the problem.

But why do I say that it’s the same as the 70s and 80s? Because no one is attacking the real problem: the actual IT infrastructure we have. We are still using the same technologies invented back in the 70s and 80s (sometimes earlier!) and attempting to interconnect those technologies in ways they were never meant to be connected. This has meant hacking and kludging technology such as networking onto systems that were never meant to be networked. Or, if they did have some rudimentary ability to network, they were meant to operate in an open environment free of threats, such as a tightly integrated academic environment.

What we see, then, is a network of computers that were never meant to be networked.

This is evidenced in a variety of ways, but the most telling remains that, until recently, computer systems up for “evaluation” by NIST or the NCSC in the US would be evaluated in stand-alone mode only. If the system were networked, the evaluation would no longer be valid. Some attempts have been made to resolve this issue with the Canadian Criteria and, finally, the Common Criteria, but the systems themselves are architected, designed, and implemented in ways that never took networking into account.

One need only look at the main operating systems to realize this is true: Unix (and its variants) and Windows.

Neither of these was built with security in mind. Neither was built with the network as part of its fundamental being. Both were designed and built to run on solitary desktops/minis/mainframes and only subsequently to talk to a network through a defined interface. They are not network operating systems. And therein lies another of the fundamental issues.

All that said, the biggest problem is the end-user. Users have become comfortable with their favourite desktop environments. They are also, for the most part, not computer savvy, which means someone who is computer savvy can manipulate them. And that’s where a lot of the socially engineered malfeasance comes into play. The operating systems are cumbersome, not designed to be small or elegant, which results in a lot of heavy interaction wherein attacks can play out; and the end-user is usually too trusting and thus easily manipulated into doing what a tech-savvy person would consider “stupid” things.

And the solution? At the moment none, unfortunately. It remains a patchwork quilt. And it sucks.

However, there is a light at the end of the tunnel: virtualization. We know how malware has to operate: against a known target application/operating system. The best thing to do is to encapsulate, to sandbox, every application so that malware is contained within a specific sandbox. Computer performance is nearly there, and disk space is such a commodity, that we could have multiple instantiations of an OS, say one per application, and things would work perfectly well. Imagine a desktop running dozens of instantiations of Windows 98, each instantiation running a single application such as Word. By controlling the traffic/communication between these island sandboxes we could finally have some level of control over malware. By placing proper policy-based controls on the traffic between instantiations we could ensure that only acceptable traffic was routed and that improper traffic was tagged and contained for examination or elimination.
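To make that concrete, here is a minimal sketch, in Python, of what such a policy-based broker between sandboxes might look like. Everything in it is hypothetical and invented for illustration (the `SandboxBroker` class, the rule format, the sandbox names); it merely shows acceptable traffic being routed between instantiations while everything else is tagged and contained:

```python
# Hypothetical sketch of a policy-based broker sitting between
# application sandboxes. All names here are invented for illustration.

from dataclasses import dataclass, field


@dataclass
class Message:
    source: str        # sandbox the traffic came from, e.g. "word-vm"
    destination: str   # sandbox it wants to reach, e.g. "printer-vm"
    channel: str       # kind of traffic, e.g. "print-job", "clipboard"
    payload: bytes
    tags: list = field(default_factory=list)


class SandboxBroker:
    """Routes inter-sandbox traffic that matches an explicit allow-list;
    everything else is tagged and quarantined for later examination."""

    def __init__(self, policy):
        # policy: a set of (source, destination, channel) triples
        self.policy = policy
        self.quarantine = []

    def route(self, msg: Message) -> bool:
        if (msg.source, msg.destination, msg.channel) in self.policy:
            deliver(msg)               # hand off to the destination sandbox
            return True
        msg.tags.append("policy-violation")
        self.quarantine.append(msg)    # contain for examination or elimination
        return False


def deliver(msg: Message) -> None:
    # Stand-in for the hypervisor's inter-VM communication channel.
    print(f"delivered {msg.channel}: {msg.source} -> {msg.destination}")


if __name__ == "__main__":
    broker = SandboxBroker({("word-vm", "printer-vm", "print-job")})
    broker.route(Message("word-vm", "printer-vm", "print-job", b"..."))
    broker.route(Message("word-vm", "mail-vm", "macro-exec", b"..."))  # quarantined
```

The point of the design is that the allow-list lives outside every sandbox, so malware that owns one OS instantiation still can’t originate traffic the policy doesn’t already permit.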

This, of course, is years away. But companies such as Microsoft are in a perfect place to implement it. It does mean making lighter, smaller versions of the operating system, but that shouldn’t be a big deal. And when we see technology such as Xen Client, one quickly realizes that these sandboxes can perform as if they’re talking to bare hardware. This allows even gaming environments to be sandboxed and contained efficiently and effectively.

However, if we continue the way we’re going we’ll only become less and less secure. That’s great for the bad guys and the good guys, as both will remain gainfully employed for the long term. But it’s bad for the poor end-users and corporations who just want to get their work done without having to deal with the lunacy of trying to maintain a computer system for which they have no proper training.

After watching IT security for over 30 years I can only say I have little hope that a logical approach will gain any ground. Maybe it will if it’s implemented completely transparently, so the end-user sees what he or she is accustomed to seeing. But an industry that’s built around the latest cool gadget may just be unable to sell a more secure way of using a computer, and so we may be doomed to an endless succession of breaches and patches. For those in the IT security business, it’ll be their own personal Groundhog Day: an endless repetition of what needs doing, what should be done, and what isn’t done, over and over and over again.

