As everybody knows, flaws in programming languages let anyone with the know-how take control of a PC. To be fair, it's often not the language itself at fault but the implementation; that's frequently been the case with Java, although years of use have made most programmers aware of the biggest potential vulnerabilities, such as the applet sandbox.
Last year, the Heartbleed bug affected many websites. It targeted OpenSSL, a much-used implementation of the Transport Layer Security (TLS) protocol, which is designed to allow secure communications over networks. A TLS heartbeat request carries a payload string along with a field stating that payload's length, and the server is expected to echo the payload back. But OpenSSL never checked the stated length against the actual payload, so a request that claimed a much larger length got back not just the string, but all the data in the memory locations following it.
This flaw was possible mainly because of the way C works: at a very low level, and without runtime bounds checking. Those attributes mean C requires programmers to think harder about what they're doing, and to use automated tools to try to catch such bugs.
Check Your Bounds
In C, it’s ludicrously easy to do something silly. Here’s probably the simplest example:
int main(int argc, char *argv[])
{
    char buffer[10];
    strcpy(buffer, argv[1]);   /* no bounds check */
}
What happens if this is called from the command line with a string that's longer than nine characters? The copy clobbers the memory after the end of the buffer; nothing might happen, or it might overwrite the stack or other variables.
Unfortunately, C doesn't let you resize an array once it's compiled. Calling strlen and then dynamically allocating that length plus one (for the terminating '\0') would prevent this; so would calling strncpy, which takes a maximum length. But many developers don't review their work.
Checking the result of every operation is a particularly important discipline in C, but it applies to other programming languages as well.
When you write to a file and close it, do you double-check that the close was successful? Probably not; and in most cases, things will turn out okay nonetheless. But what if the disk was full and the close failed? On the comp.lang.c group, a developer named Eric Sosman cited a case in which a company's product failed to check the result of a close, and ended up zapping a file of customer data.
In the example below, if fclose(stream) fails, the code blithely carries on and deletes the good data:
stream = fopen(tempfile, "w");
if (stream == NULL) …
if (fwrite(buffer, 1, buflen, stream) != buflen) …
fclose(stream);
/* The new version has been written successfully. Delete
 * the old one and rename. */
remove(realfile);
rename(tempfile, realfile);
Writing a file and then deleting the previous version is a common way of doing things, whether in C or some other programming language. But did I check the file close? Mea culpa…
Java Applets: Just Say No
Most of the world has now said “no” to Java applets.
Java applets run inside the sandbox, an area that is, in theory, completely isolated from the rest of your PC and thus highly secure. In practice, though, they're very unsafe: Kaspersky noted some 161 Java security vulnerabilities in 2014. Applets are why Java has its mixed security reputation; not running Java in your browser is a good way to stay safe. (Note that Java applications launched through Java Web Start (JWS) don't run inside the browser, so they avoid this particular attack surface.)
Stay Aware of Odd Behaviors
This one is clever, and comes from developer Alex MacCaw. Note that the alert and the </script> tags on the right-hand side are inside a string, yet the alert still fires. Very naughty:
var str = "</script><script>alert('Pwned');</script>";
The trick works when this JavaScript is embedded inline in an HTML page: the HTML parser knows nothing about JavaScript string literals, so it ends the script element at the first </script> it encounters and then treats the injected <script> block that follows as real code. The standard defense is to break up the closing tag inside the string, for example by writing "<\/script>".
As a desktop, mobile and Web developer, I find the amount of knowledge required to maintain solid security scary, especially when it comes to Web apps. Sure, someone should rethink the whole Web and redesign it to be secure by default, but that's not likely to happen anytime soon. Instead, make a point of checking your work, and don't take anything for granted.