In tech, certifications and degrees are often used as a barometer for how well a prospective employee will perform once hired. But there’s good reason to believe those benchmarks may not tell the whole story, and may not even be necessary.
Recently, Google launched a certification for intermediate web developers. The program itself is a bit opaque; we don’t know what a developer would gain from it versus a MOOC or bootcamp course, and there’s nothing to suggest a hiring manager or client will be impressed by a Google-generated badge on your social media accounts. Save for Google’s name carrying some weight, this certification is probably not going to win anyone over.
Other certifications may carry more weight with recruiters and hiring managers. For example, Robert Half says certifications may increase your salary “by 5 to 10 percent.” Established disciplines such as .NET and Java are on the company’s list of in-demand skills, and many of the “trending” certifications involve security.
Choosing the best path toward obtaining a certification isn’t easy. The two main avenues for rapid skills-building, MOOCs and bootcamps, offer very different approaches. There is typically a steep price difference, too: MOOCs are less costly, provided you finish the course in a reasonable amount of time, while bootcamps are more expensive but force you to finish within a compressed timeframe.
But here’s something that might make that commitment a little easier: employers may be willing to pay for your certification. A recent study by Dice and The Linux Foundation suggests employers are at least open to the idea of shelling out for certs, and some MOOC providers such as Coursera even allow large companies to customize their own coursework for bespoke certifications.
However, such customization programs may blur the line between certification and continuing education. If a Cisco employee takes a bespoke Coursera program for ‘data science,’ it may not be reflective of the state of the industry. On the other hand, a hiring manager may assume a developer with a highly specialized certification knows quite a bit about that subject area, particularly if the cert is of extremely narrow scope.
No matter how many certifications you have under your belt, though, it’s unlikely that prospective employers will give you a “pass” on their hiring process. Nobody will see a Udacity Nanodegree on your CV and let you skip the whiteboard or pair-coding portion of the job interview. Whatever’s listed on your résumé, you still have to prove yourself, especially if your next interviewer believes that MOOCs and bootcamps have suspect reputations.
To put it another way: no matter how many certifications you earn, employers want to know that you actually have the most up-to-date knowledge of your chosen subject area. Rather than plunge into a formal program, many developers piece together continuing education via framework documentation, blog posts, and sites such as Ray Wenderlich. Sites such as HackerEarth also allow companies to create custom coding challenges for prospective hires, which puts candidates into something closer to a real working environment.
All the hand-wringing about certifications doesn’t solve the last mile: you have to prove yourself to your interviewer(s). Certificates look good as a footnote on a résumé, but falling on your face in a technical interview will almost always lose you the job. Certificate or not, showing that you can solve problems (or at least not rage-quit them) is still the most important quality you can demonstrate to an employer.