Tuesday, July 29, 2008

Bignum arithmetic and premature optimization

Donald Knuth, the patron saint of algorithm analysis, once famously said “We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.” Programmers of a thoughtful bent constantly argue over what this means, and at times whether it is even true. Mostly, they ignore its effect on security.

As new programming languages become ever more “high level” and dynamic, they get further and further from forcing the programmer to cater to the way computers “think”. This provides significant advantages for developing software swiftly and easily, sometimes at a real cost to the efficiency of the code itself. Moore’s Law, however, ensures that in many (if not most) cases those efficiency costs are absorbed by the hardware so thoroughly that users never see the difference, at least in a one-to-one comparison of general software functionality. In fact, for the same general functionality, software written in a higher level language will often outperform software written in a lower level language, if each is run on hardware contemporary with the language’s inception.

Of course, featuritis often adds far greater weight to an application, and that weight combines with the greater resource usage of higher level dynamic languages to slow things down to the point where we start noticing something is wrong. That, however, is an entirely separate matter.

There are those who will argue that choosing a language based on the comparative performance characteristics of programs written in it is itself a case of premature optimization. When all you need is a command line utility that will complete its task in under half a second, and Ruby can fill that need, resorting to assembly language to eke maximum performance out of the program certainly seems like a bad trade, considering that Ruby programs tend to be much easier to write and maintain.

There is certainly a case to be made for lower level languages contributing to greater security. Knowing assembly language, or even a higher level “portable assembly” language such as C, helps the programmer wrap his brain around the concepts of von Neumann architecture computation. Even if you write all your software in a very high level language like Ruby, knowing what’s going on “under the hood”, as it were, can yield great rewards when some careful, precise tinkering is necessary, and can help you understand the implications of what you’re doing with all those high level language constructs. This applies to security implications as much as to performance, portability, and stability implications.

Don’t take anything said here as dissuading you from learning lower level, static languages such as C or assembly. Even if you never use them in earnest, knowing these languages will help you really understand what you’re doing with higher level, dynamic languages, and may help you make your code more secure.

On the other hand, high level dynamic languages such as Ruby provide a lot of time-saving linguistic constructs that, often as a happy accident, actually improve the security of your code without any effort on your part. An example is “bignum” handling.

In programming languages such as C, integers have limits to how big they can get. For instance, an unsigned integer variable might be limited to 16 bits, able to represent values between 0 and 2^16 - 1 (i.e. 0 to 65535). In unsigned 16 bit integer arithmetic, 65535 + 1 = 0, because the arithmetic wraps around modulo 2^16; the type simply cannot represent a value outside the range 0-65535. In some cases, trying to stick a larger value into a variable than its data type can handle can crash the program, provide improper access to memory, or cause any of a number of other security issues. For this reason, programmers in languages like C need to be careful about how they use limited precision data types.
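To make that concrete, here is a minimal C sketch of 16 bit unsigned wraparound (the variable name is mine, purely for illustration):

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        uint16_t counter = 65535;  /* the largest value a 16 bit unsigned integer can hold */

        counter = counter + 1;     /* wraps around modulo 2^16 */
        printf("65535 + 1 = %u\n", (unsigned)counter);  /* prints "65535 + 1 = 0" */

        return 0;
    }

If counter were being used to size a buffer or enforce a length limit, that silent wrap to zero is exactly the kind of mistake that turns into a security hole.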

Arbitrary precision arithmetic, also known as “bignum arithmetic”, is an arithmetic technique implemented in a programming language whereby the size of an integer is limited only by the restrictions of the hardware itself, essentially by how much RAM the system has. In practice, this takes the form of a numeric representation that automatically grows as needed, rather than being fixed to a predetermined range before any value is stored in a variable or otherwise handled by the program. Because this greatly reduces the inherent danger of accepting overly large inputs, bignum arithmetic can prove a great boon to the security of a program.

Such arbitrary precision arithmetic capabilities can be had in languages such as C, via libraries like BigDigits, GMP (the GNU Multiple Precision Arithmetic Library), and CLN, but this is not the default behavior of the language and requires explicit use by the programmer. Languages such as Ruby, on the other hand, employ bignum arithmetic by default, as it is needed, without any intervention on the part of the programmer to make numeric values extensible. It’s important to understand concepts like fixed precision integer arithmetic, of course, but it’s not important to use it all the time, or even most of the time.
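As a rough sketch of what the explicit route looks like, here is a small C program using GMP (linked with -lgmp); compare it to Ruby, where an expression like 2 ** 64 + 1 simply works with no setup at all:

    #include <gmp.h>
    #include <stdio.h>

    int main(void)
    {
        mpz_t big;                  /* an arbitrary precision integer */

        mpz_init(big);
        mpz_ui_pow_ui(big, 2, 64);  /* big = 2^64, too large for any fixed 64 bit type */
        mpz_add_ui(big, big, 1);    /* big = 2^64 + 1; the value grows as needed, never wraps */

        gmp_printf("2^64 + 1 = %Zd\n", big);

        mpz_clear(big);
        return 0;
    }

The point is not that GMP is hard to use, but that in C every bignum is something you must ask for explicitly, while in Ruby it is simply what integers do.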

There are programmers who would complain about this, because arbitrary precision arithmetic generally imposes an efficiency penalty on programs that make use of it. In most cases, however, such concerns constitute a case of Knuth’s “premature optimization”, because unnecessary use of fixed precision arithmetic can lead to surprising behavior from your software if you make a mistake in development and some unexpected input overflows an integer.

For security purposes, it’s generally the case that Ruby’s way of doing this is the right way to do it: default to avoiding the all too common dangers of fixed precision arithmetic altogether. The only fly in the ointment is the rare occasion where the performance penalty of arbitrary precision arithmetic really matters, or the rare field of endeavor where it matters often.

When a nanosecond improvement in runtime is not needed, choose the tools that will make it easy to write more secure code.
