How to Know When Old Technology Isn't Outdated
I wrote recently about the things I look for when considering a job or a contract, and it spawned a lot of good discussion in a few different areas. Some people talked about their criteria for considering a job; others talked about the importance of just plain having a job. One thread that came up was the idea of "old" technology, and what constitutes old. From a candidate's point of view, when does the technology you've been working in become so "old" that finding another job gets difficult? Here's the relevant part of the quote:
I have spent the past 15 years working for the same company in “old” technology programming in C.
That comment really struck me, because I don't think of C as an outdated technology. Yes, it's "old" in the sense that it's been around for decades. However, it's still in current use. Granted, it's become a bit of a niche technology, but it remains a valuable skill in certain industries and technology areas. I worked in the storage industry for a while, and competition was fierce for experienced C programmers. At file system vendors (and they do exist!), C, and more specifically kernel hacking in C, is still a source of jobs. I'm sure there are other scenarios as well.

There are numerous other technologies that are old but still in current use. Ping, for example, just had its 40th birthday. I used it yesterday to confirm that yes, the network really was down. Linux (or more precisely, the Linux kernel) dates to 1991, and it powers hot technologies like the Amazon EC2 cloud. Most of the technologies that run the Internet, such as TCP, are old, too. It's possible to get jobs working in all of these technologies at modern and growing companies. Old is modern, for these techs.

Old doesn't have to mean outdated. Old technologies don't have to die; some just go niche!

Image: Picture of a person typing at a keyboard with a matrix effect by Bigstock