Who could have imagined that, more than 46 years after its creation, the programming language C would not only be going strong, but still rank highly on various programming-language indexes such as RedMonk and TIOBE? On GitHub, the language places tenth with regard to pull requests—even beating newer languages such as Swift.
Here’s one secret to C’s longevity: its constant evolution. Since 1972, the language has iterated through multiple versions, with ANSI C (also known as ISO C) becoming the standard in 1989. Personally, I like C99, the version supported by Microsoft and Digital Mars, although it’s been supplanted by C11, published in 2011. (I like C99 because you can declare variables anywhere in the code, not just at the start of functions; it also has one-line comments that start with //.)
The C11 standard adds multi-threading and improved Unicode support, plus other features that help you write safer code (such as the optional bounds-checking functions and the removal of gets). Not all C compilers support it yet, but some that do include GCC 4.7 and higher, Clang, Intel C 16.0, and Pelles C, a free IDE and C compiler for 32- and 64-bit Windows.
Without a doubt, C is the programming language with the widest software ecosystem. Take Linux, for example: some 31 percent of Debian Linux is written in C, according to OpenHub’s analysis. This is mostly the kernel with its 19.5 million lines of code (as of Linux 4.1 three years ago). When you consider Linux’s footprint (i.e., most of the internet’s infrastructure, not to mention its top-ranked supercomputers, and all of its best-known relational databases), you begin to grasp the full impact of C on the computing world.
To be fair, another reason why C remains ubiquitous is that it’s always been a go-to tool for developers. For example, the Oracle Database was originally written in Assembly back in ye olden days of 1977, before the company’s engineers rewrote it in C six years later. The key-value stores Redis and LMDB, the memory caching system memcached, and Aerospike (a NoSQL database) are all written in C. If that weren’t enough, there’s also Perl’s core interpreter, CPython (the reference implementation), roughly 50 percent of R, Git, Subversion, the Apache web server, the NGINX web server, and Chrome OS (C/C++). If you’re a tech pro who wants to learn how to manipulate the very foundation of the web, learning C is a really good idea.
Why did C become so popular in the decades following its inception? Before C, you had Assembly (also known as Assembler, if you want to risk a fight over tools, languages and terminology), PL/I, PL/360, PL/S, PL-8, and even FORTRAN (there’s a big list here). But C was far more portable, and much simpler, than Assembly, which made it easier to develop C compilers, which in turn let developers ship code faster. Many packages have been recoded in C for ease of maintenance. (Some programming languages, such as IBM Informix 4GL, Eiffel, Nim, and Cython, compile to C; that saves having to develop a compiler backend, although LLVM has probably reduced that need by this point.)
I recently used the SDL2 open-source library, which is written in C and meant to provide an abstraction layer for computer multimedia hardware; in one program, a screen full of moving objects managed to achieve 5,000 frames per second, without syncing with the retrace. If you want to try out C GUI programming, be sure to experiment with the single-header open source ANSI C GUI library Nuklear.
C is firmly embedded in modern-day computing. C++ is an obvious candidate to replace it, but I suspect it never will; same with Go and Rust. Even if the perfect programming language to replace C ever emerged, the costs associated with substituting legacy code make C’s short-term demise unlikely. Ten years ago, for example, the Linux Foundation estimated the cost of building out the Fedora 9 Linux distribution at $10.8 billion; that’s a pretty good yardstick for judging the cost of a widespread code replacement. In short, C is lodged in our infrastructure like a deer tick.
For the time being, C should be considered “good enough,” and for tech professionals who are looking for a skill to add to their toolbox, you could do far, far worse.