Having one or more certifications sounds pretty sensible in today’s world, doesn’t it? Many jobs demand proof that you’ve mastered a particular technology. But is the argument for spending lots of time and money to earn a certification as ironclad as it seems?
I’m arguing “no,” and here’s why.
Software Technology Moves Too Fast
If you’re attempting to master a subject or industry whose fundamentals don’t change much—medicine and engineering come to mind—then a test (or a series of tests) is a good way of verifying that you have the knowledge necessary to operate within that sphere.
But software is different. It evolves quickly, and knowledge that's relevant today will seem hopelessly outdated sooner than you think. Let me pose a hypothetical. Say you needed a certificate to develop iOS apps; that shiny certificate would probably be outdated every year or two as Apple rolls out new versions of its software.
“We’re now only recruiting iOS 8-certified developers with Swift certificates,” a hypothetical job interviewer might say about your hypothetical qualifications, “and by the end of next quarter, that requirement will change to Swift 2015 certificates only.”
Yes, I’m exaggerating for effect, but the principle holds: The evolution of technology makes many certificates obsolete pretty quickly (with some exceptions—some software, like the Linux kernel, doesn’t change all that rapidly).
The Certifiers Are… Who?
In theory, anyone can set up a certification business: Create a few online tests, charge a fee for an online course, pay for a little marketing, and—hey, presto—you're ready to issue certificates. But what can online tests really verify?
I'm a little skeptical of programmer tests that demand exact syntax knowledge, for example. Such things test only a small part of what programmers do—and with the increasing sophistication of IDEs, I'm betting that many programmers rely on automation and other tools to get the bulk of their work done. For example, iOS 8 added over 4,000 new APIs, and I certainly haven't used more than a small fraction of them.
Employers Don’t Really Care
Recruiters sometimes have trouble determining a developer’s degree of technical experience, and so insist upon certificates or tests to judge abilities. If you manage to get past them to the job interview, the interviewer (provided they’re also a developer) can usually get a good feel for your actual programming ability and whether you’ll fit well with the group.
My personal experience—and I don’t think I’m alone, by a long shot—is that employers just want someone who can do the job and fit in. Most don’t care if you have a certificate.
It’s a Rip-Off
A few years ago, a job agency insisted that I become a ScrumMaster. While the underlying principles of Scrum seem okay, there's a whole certification-related ecosystem that seems designed solely to extract money for training and certificates (e.g., Certified Scrum Product Owner, Certified Scrum Developer, and so on). It's one thing to choose to pay for training; it's another to be forced to pay for it.
And if you're forced to pay for it, it's hard not to conclude that the certification's sole reason for existing is as a moneymaker.
It Only Proves You Can Pass Tests
Certificates are only as good as the people who create the certificate tests. I've done a few online tests and, despite being highly experienced in C#, there's one set of online tests so syntax-oriented that I've never scored more than 35 percent on it. I doubt that anyone knows C# syntax to the depth these tests require; passing them would show only that you know a narrow segment of the language, not that you can use it in a real-life context.
I'm obviously not a fan of formal certification. While many jobs require one or more certificates, lots of tech pros have forged perfectly fine careers without them. Don't let the complicated world of certifications keep you from pursuing what you want.