4 Tips for Better Security Programming

As everybody knows, flaws in programming languages allow anyone with the know-how to gain control of a PC. To be fair, it’s often not the fault of the language itself, but the implementation; that’s frequently been the case with Java, although years of use have made most programmers aware of its quirks, such as checked exceptions, and of the biggest potential vulnerabilities. That’s not to pick on Java; by and large, it’s a solid design. In the case of C, though, the blame can be put fairly and squarely on the language itself, despite the steps that have been taken to fix it.

Consider the Heartbleed bug, which affected many websites last year. It targeted OpenSSL, a much-used implementation of the Transport Layer Security (TLS) protocol, which is designed to allow secure communications over networks. In a TLS heartbeat request, one side sends the other a payload string along with a field stating that payload’s length; the receiver is expected to echo the payload back. But if the stated length is much larger than the actual payload, then not only is the payload returned, but also all the data in the memory locations following it. Oops. This flaw was possible mainly because of the way C works: at a very low level, with no runtime bounds checking. Because of those attributes, C requires programmers to think harder about what they’re doing, and to use automated tools to try to catch such bugs.
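To see what that means in code, here’s a heavily simplified sketch of the Heartbleed pattern. This is not OpenSSL’s actual code; the names and structure are invented for illustration. The bug is that the copy trusts the attacker-supplied length instead of the number of bytes actually received:

[cpp]
#include <stdlib.h>
#include <string.h>

/* Illustrative only -- not OpenSSL's real heartbeat handler.
 * payload points to the bytes the peer actually sent;
 * claimed_len is the length field the peer claimed. */
unsigned char *heartbeat_response(const unsigned char *payload,
                                  size_t claimed_len)
{
    unsigned char *response = malloc(claimed_len);
    if (response == NULL)
        return NULL;

    /* Bug: copies claimed_len bytes without checking it against
     * the payload's real size, so an inflated length leaks
     * whatever happens to sit in memory after the payload. */
    memcpy(response, payload, claimed_len);
    return response;
}
[/cpp]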

Check Your Bounds

In C, it's ludicrously easy to do something silly. Here's probably the simplest example:

[cpp]
#include <stdio.h>
#include <string.h>

int main(int argc, char *argv[])
{
    char str[10];
    strcpy(str, argv[1]);
    return 0;
}
[/cpp]

What happens if this is called from the command line with a string that’s longer than nine characters? It clobbers the memory after the end of the array; nothing might happen, but the copy might also overwrite the stack or other variables. Unfortunately, C doesn’t let you resize a fixed-size array at runtime. Calling strlen and then dynamically allocating that length plus one (for the terminating '\0') would prevent this; so would calling strncpy, which takes a maximum length. Both fixes are sketched below. But many developers don’t review their work.
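Here’s a minimal sketch of both fixes, assuming the program still just copies its first argument; the variable names are mine:

[cpp]
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(int argc, char *argv[])
{
    if (argc < 2)
        return 1;

    /* Fix 1: measure the input, then allocate exactly enough. */
    size_t len = strlen(argv[1]);
    char *dynamic = malloc(len + 1);   /* +1 for the terminating '\0' */
    if (dynamic == NULL)
        return 1;
    memcpy(dynamic, argv[1], len + 1);

    /* Fix 2: keep the fixed buffer, but cap the copy length. */
    char fixed[10];
    strncpy(fixed, argv[1], sizeof fixed - 1);
    fixed[sizeof fixed - 1] = '\0';    /* strncpy may not terminate */

    printf("%s\n%s\n", dynamic, fixed);
    free(dynamic);
    return 0;
}
[/cpp]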

Check Operations

This is a particularly important aspect of C, but it also applies to other programming languages. When you write to a file and close it, do you double-check that the close was successful? Probably not; and in most cases, things will turn out okay nonetheless. But what if the disk was full and the close failed? On the comp.lang.c group, a developer named Eric Sosman cited a case in which a company’s product failed to check the close, and ended up zapping a file of customer data. In the example below, if fclose(stream) fails, the code blithely carries on deleting the good data:

[cpp]
stream = fopen(tempfile, "w");
if (stream == NULL) ...
while (more_to_write)
    if (fwrite(buffer, 1, buflen, stream) != buflen) ...
fclose(stream);

/* The new version has been written successfully.
 * Delete the old one and rename. */
remove(realfile);
rename(tempfile, realfile);
[/cpp]

Writing a new file and then deleting the previous version is a common way of doing things, whether in C or some other programming language. But did I check the file close? Mea culpa...
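Here’s a sketch of the same pattern with every step checked, wrapped in a hypothetical save_file helper (the function name and signature are mine, not from Sosman’s example). The old file is only removed once the close has succeeded:

[cpp]
#include <stdio.h>

/* Hypothetical helper -- name and signature are illustrative.
 * Writes buflen bytes to tempfile, and only replaces realfile
 * once every step, including the close, has succeeded. */
int save_file(const char *tempfile, const char *realfile,
              const char *buffer, size_t buflen)
{
    FILE *stream = fopen(tempfile, "w");
    if (stream == NULL)
        return -1;

    if (fwrite(buffer, 1, buflen, stream) != buflen) {
        fclose(stream);
        remove(tempfile);        /* leave the old, good data alone */
        return -1;
    }

    if (fclose(stream) != 0) {   /* a full disk can surface here */
        remove(tempfile);
        return -1;
    }

    /* Now it really has been written successfully. */
    remove(realfile);
    return rename(tempfile, realfile);
}
[/cpp]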

Java Applets: Just Say No

Most of the world has now said “no” to Java applets. Java is a pretty secure and well-designed language, and the Java Virtual Machine (JVM), though written in C/C++, is a thoroughly tested piece of software. Then there are Java applets, which have been around for twenty years; these small Java programs run in the browser and, until recently, ran much faster than JavaScript. But given that applets can access thousands of classes and methods, securing them has been an ongoing and not very successful battle. Java applets run inside the sandbox, an area that is, in theory, completely isolated from the rest of your PC and thus highly secure. In practice, though, they’re very unsafe: Kaspersky noted some 161 Java security vulnerabilities in 2014. Applets are why Java has its mixed security reputation; not running Java in your browser is a good way to stay safe. (Note that Java applications run through Java Web Start (JWS) are safer; they launch outside the browser.)

Stay Aware of Odd Behaviors

This one is clever and comes from developer Alex MacCaw. Note that the alert and the closing </script> tag on the right-hand side are inside a string literal, yet the alert still fires. That’s because the HTML parser knows nothing about JavaScript strings: it ends the script element at the first </script> it sees, and everything after that is parsed as fresh markup, including the new <script> tag. Very naughty:

[js]
<script>
var str = "</script><script>alert('Pwned');</script>";
</script>
[/js]
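One common defense, if you must embed markup like this inside a script, is to break up the closing tag so the HTML parser never sees it; escaping the slash inside the string is a minimal sketch of that idea:

[js]
<script>
// Escaping the "/" means the parser never encounters a literal
// "</script>" inside the string, so the script element stays open
// and the string remains just a string.
var str = "<\/script><script>alert('Pwned');<\/script>";
</script>
[/js]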

Conclusion

As a desktop, mobile and Web developer, I find the amount of knowledge required to maintain solid security scary, especially when it comes to Web apps. Sure, someone should rethink the whole Web and redesign it to be secure by default, but that’s not likely to happen anytime soon. Instead, make a point of checking your work, and don’t take anything for granted.