Chances are you thought of a company like Amazon (for AWS and its related cloud services), Apple (shiny devices and usability), or Twitter (zero to ubiquity in just a few years). They all have done great things. But in 20 or 40 years, will any of them still matter?
There’s an argument to be made that cloud services legitimately changed the face of computing. But Twitter? That is going down in history as the Tamagotchi of the early 2010s. And Apple’s computers will look as odd and archaic as old computer ads look now, even if they were amazing for their time.
So where do we find the true revolutions in computing? In the 1950s and 1960s, DARPA spawned technologies that became the Internet and user interface features we take for granted today. (Windows + Mouse + Presentation Programs + Links = Awesome!). Xerox PARC gave us Ethernet, the PC, GUIs and object-oriented programming. Bell Labs kicked in with UNIX and C. Decades later, all of these still play a large role in computing.
So if you want to work on technologies that will still be influential when your children are software engineers, where do you go? Let’s start by recognizing how many of these inventions started life as pure research. They took years to develop and didn’t have practical applications when inspiration struck – technology came first, application later. Where does that kind of thing happen today?
Welcome to the technology research lab, modern version. Research labs are thin on the ground. PARC is now a standalone company, working on more targeted projects. Bell Labs is barely profitable. DARPA is still going strong, but has tightened its focus. Academia is still an option, but you’d better be a good grant writer. And there’s one more giant still standing: IBM’s Research Labs. (Remember Watson, its Jeopardy-playing computer?)
If you want to spend your career advancing the science of computing, consider research. Jobs are still available — you just have to look.