Thursday, November 19

On Predictability

I've been thinking a bit lately, and I've come up with this maxim:

Anything predictable is exploitable.

I'm going to confine this to network traffic at this point, since it's the only application for which I've given this much thought, but I'm willing to bet that it holds true in other areas as well.

Keep in mind that exploitability is not, in itself, a bad thing. A Web browser predicts that an http:// server is running on port 80 and exploits that. Thanks to that assumption, most people never have to know what the previous sentence even means. Occasionally you run across a server that isn't using port 80 for Web traffic, and you have to specify the port explicitly in your address bar, but the ability to assume and predict a standard port is good.
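That default-port prediction can be sketched in a few lines of Python. This is just an illustration (`effective_port` is a hypothetical helper, not a standard API), but it's essentially the decision every browser makes:

```python
# Illustrative sketch of the "predict the standard port" rule a client
# applies when a URL doesn't name a port explicitly.
from urllib.parse import urlsplit

def effective_port(url: str) -> int:
    """Return the port to connect to, predicting the scheme's default."""
    parts = urlsplit(url)
    if parts.port is not None:        # port given explicitly in the URL
        return parts.port
    return 443 if parts.scheme == "https" else 80  # predictable defaults

print(effective_port("http://example.com/"))       # no port: assume 80
print(effective_port("http://example.com:8080/"))  # explicit port wins
```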

On the other hand, exploitation can be bad. In old TCP stacks, initial sequence numbers were predictable: they started from a known value and incremented by a fixed amount. That predictability lets an attacker forge legitimate-looking responses and hijack sessions, an attack usually called TCP sequence prediction. More modern implementations of TCP/IP choose the initial sequence number randomly; this prevents prediction and exploitation.
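The contrast can be shown with a toy sketch. The increment value below is hypothetical and real stacks do this in the kernel, but the point survives: one sequence is trivially guessable from a single observation, the other is not.

```python
# Toy contrast: predictable vs. randomized initial sequence numbers (ISNs).
# Purely illustrative; the increment of 64000 is a made-up stand-in.
import secrets

class OldStack:
    """Old-style stack: each connection's ISN follows a fixed pattern."""
    def __init__(self):
        self.counter = 0

    def next_isn(self) -> int:
        self.counter += 64000          # hypothetical fixed increment
        return self.counter % 2**32

class ModernStack:
    """Modern stack: ISN drawn from a CSPRNG, so it can't be guessed."""
    def next_isn(self) -> int:
        return secrets.randbits(32)

old = OldStack()
print([old.next_isn() for _ in range(3)])     # trivially predictable
modern = ModernStack()
print([modern.next_isn() for _ in range(3)])  # unpredictable
```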

In network security, predictability can lead to exploitability at many steps in the chain, and not all of those steps are technological. Take, for instance, the predictability that some percentage of users will click the link in a spam email message. Because massive amounts of email can be sent at virtually no cost, even a tiny positive response rate yields a return on investment that keeps spam campaigns profitable. [0]
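The back-of-the-envelope arithmetic makes the point. Every number below is invented for illustration (the empirically measured figures are in the Spamalytics paper cited in [0]; these are not them):

```python
# Hypothetical spam economics: predictably nonzero response rate plus
# near-zero sending cost equals profit. All figures are made up.
messages_sent   = 100_000_000
cost_per_msg    = 0.00001     # hypothetical: sending in bulk is cheap
response_rate   = 0.00001     # hypothetical: 1 in 100,000 clicks and buys
profit_per_sale = 50.0        # hypothetical revenue per conversion

cost    = messages_sent * cost_per_msg
revenue = messages_sent * response_rate * profit_per_sale
print(f"cost=${cost:,.0f}  revenue=${revenue:,.0f}  "
      f"profit=${revenue - cost:,.0f}")
```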

The same thing applies to cryptography. The only way to protect a message in transit against decryption is to produce ciphertext that is as close to pattern-free true randomness as possible. If there's any way to detect patterns, and therefore make predictions, exploitation will soon follow.
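A minimal sketch of what "detecting patterns" means in practice: a weak cipher (here, a hypothetical single-byte XOR) preserves the plaintext's byte-frequency skew, while output from a cryptographic random source shows no such skew. Simple frequency counting is enough to tell them apart.

```python
# Pattern detection by byte-frequency analysis. The single-byte XOR
# "cipher" is a deliberately weak, hypothetical example.
import os
from collections import Counter

plaintext = b"predictability begets exploitability " * 200
weak = bytes(b ^ 0x5A for b in plaintext)   # weak cipher: skew survives
strong = os.urandom(len(plaintext))         # stand-in for good ciphertext

def top_share(data: bytes) -> float:
    """Fraction of bytes taken by the single most common byte value."""
    return Counter(data).most_common(1)[0][1] / len(data)

print(f"weak cipher:  {top_share(weak):.3f}")   # large skew: a pattern
print(f"random bytes: {top_share(strong):.3f}") # near-uniform: no pattern
```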

I can go on, but I think I'll stop here for now. I think, however, that the ability for prediction to beget exploitation is the driving force behind security these days (not limited to computer security). For instance, it's the regulated unpredictability of financial systems like the stock market that keeps people from reliably exploiting them for their own gain!

I might have more thoughts later, but I just wanted to put this out there while it was on my mind.

[0] "Spamalytics: An Empirical Analysis of Spam Marketing Conversion." Chris Kanich, Christian Kreibich, Kirill Levchenko, Brandon Enright, Geoffrey M. Voelker, Vern Paxson, and Stefan Savage. Communications of the Association for Computing Machinery 52(9):99-107, September 2009.

1 comment:

AJ said...

Hey Brad, That's a really interesting theory, and I think it will hold true when applied to a very wide variety of, well, applications.

Creative Commons License
This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 United States License. Permissions beyond the scope of this license may be available by emailing the author (use the link above).
