Some Surprising Research on IT Job Obsolescence (Video)


Underlying every IT professional’s career is a needling, constant fear that their technical skills will become obsolete. Given that most IT skills begin eroding after just two years, that fear is understandable, says Dr. Damien Joseph, an assistant professor at Nanyang Technological University, who authored the paper Sustainable IT-Specific Human Capital: Coping with the Threat of Professional Obsolescence along with Christine Koh and Arthur Foo.

Traditionally, the thinking has been that to protect your career, you should simply learn and adopt more skills; more skills translate to greater employability. But Joseph contends IT professionals can also defer development to other people: if their company needs a new IT skill, they can let a coworker spend the time to learn it.

While the options for someone who defers skill learning are limited, it’s not the death of their career. Even if you defer the new knowledge, you should still be able to find jobs at other organizations; Joseph theorizes that these organizations tend to run less up-to-date technologies.

Michael Gallivan, an assistant professor at Georgia State University, wrote a dissertation on the same topic fifteen years ago. He believes that maintaining technical skills is now the employee’s responsibility. In the mid-’90s and before, it was seen as the company’s responsibility. Today, that’s changed, he says: employees realize it’s at least a 50/50 shared responsibility, if not more heavily weighted toward the employee.

5 Responses to “Some Surprising Research on IT Job Obsolescence (Video)”

  1. I don’t have the ability, where I am, to watch the video. However, I am curious about the good Doctor’s claim that “most IT skills begin eroding at just two years.” Does he have empirical evidence supporting his position? If so, he should have cited the source. And are there specific skills that erode, or do all of them become a little, or a lot, less sharp?

    • @Mike: Sorry you can’t watch the video, and we don’t have a transcript. But I don’t think one can dispute that IT skills erode, given how quickly the technology we work with evolves. In his interview, Damien Joseph cites the research behind the two-year figure. It’s worth watching if you can find a place to see it.

    • Mike, this is based on his own research, so that’s what he’s citing. But he treats eroding skills as the norm: as technology keeps advancing, it’s up to the IT worker to keep their skills current, because what you knew five years ago doesn’t necessarily apply to what you’re doing today. Right? What he argues is that sometimes it’s a good idea NOT to update your skills and to let others take over, and you’ll still have job security. Interesting take.

  2. I watched the video. Interesting how the term “re-skilling” is now also obsolete 😉 Staying “up to date” is fine if your employer is also “up to date.” Otherwise you might be learning skills that will be of no use in your existing position. If you’re a job hopper, you’ll probably hop to a company on the bleeding edge. If not, what to do? Of course, those “legacy skills” are (allegedly) all the rage these days as company after company finds the WunderKinds, whose entire world is MS, rather lacking in the skills needed to maintain, and develop, ba-zillions of lines of code written in not-very-up-to-date languages. Or perhaps I have no idea about which I speak.

    • You’re right on target, Mike; that’s exactly what he was talking about. There are plenty of companies that don’t move ahead and need people who can handle legacy technology, so updating your skill set isn’t always necessary.