Last week, we hosted a Reddit AMA (on /r/cscareerquestions) in which we discussed the post-pandemic tech-job market, including the top technology skills in demand, whether machine learning is overhyped, and how people can get back into the tech industry after some time away. It was a fun and informative session; for those who couldn’t attend, we’re posting some of the questions and answers below (and in a follow-up article).
We’ve also added links within the answers for a bit more context. Remember: no matter your experience or skill set, now’s an interesting and exciting time to explore opportunities in the tech space.
AWS, Azure, GCP, Kubernetes, Docker, Terraform/Ansible/Pulumi: it’s widely understood that the cloud is here to stay. What trends within the cloud community have you noticed that might not be mainstream knowledge just yet?
Excellent question, and a complicated one given the unpredictability of the cloud ecosystem (and the tech industry at large). I’ve been hearing quite a bit of chatter about companies becoming more comfortable embracing a multi-cloud paradigm, as opposed to relying on a single vendor such as AWS, mostly out of fear of vendor lock-in.
Some companies are already using a mix of cloud services, including hybrid cloud (on-premises infrastructure combined with some cloud), but if that approach gets even more popular, it could present both challenges and opportunities for technologists. Opportunity, obviously, because being able to deploy and manage a complex environment translates into lots of job possibilities and job security. Challenge, because many cloud specialists have chosen to devote most of their time and resources to a single stack such as AWS, and widening their knowledge to a broader ecosystem will take more effort (especially when you factor in the possibility of needing to earn more certifications).
Some developers with whom I’ve spoken have also made a big deal of edge computing, which is another buzzword that’s been drifting around but hasn’t *quite* hit the mainstream. Edge computing can help companies overcome some latency and processing issues, especially those trying to wrangle enormous datasets for insights, A.I., etc. Technologists who talk about it often mention it in the same breath as ML, so they’re presumably envisioning a future in which ML is used to help network engineers, cloud architects, etc. manage a distributed network in a way that’s both secure and efficient.
How much of the buzz around machine learning do you think is overhyped? Do you expect to see a decrease in the rate of hiring for machine learning positions in the near future?
Great question. There *is* quite a bit of hype around A.I. and machine learning, because a lot of non-tech people read those terms and think the technology has progressed to ‘2001’ levels. That isn’t the case, of course; anyone who’s interacted with a chatbot or yelled at Alexa knows that the tech is decades away from anything even remotely resembling generalized artificial intelligence.
Burning Glass (BG), which collects and analyzes millions of job postings from across the country, predicts that machine learning jobs will grow an astounding 39.3 percent over the next decade. These jobs also pay really well at the moment; BG claims the median salary currently stands at $107,000. But the number of tech jobs that request machine learning skills is actually still fairly low; for instance, just 4 percent of software engineer/developer job postings ask for machine learning skills, and it’s 8 percent among data analysts.
So I actually think the machine learning hiring rate is going to go *up* over the next decade, especially as the underlying technologies become more commodified and easier to use, and as more companies decide to devote the budget to “smartening” their existing apps and services. But the actual hiring rate right now is still fairly low despite the buzz.
How can new grads differentiate themselves in a job market that seems to be filled with senior developer postings but very few junior positions?
Fantastic question. It’s understandably frustrating when you see a lot of job postings that ask for 10+ years of experience and a wide range of skills. Fortunately, there are ways to catch a hiring manager’s attention while standing out in a crowded field of applicants.
Specialization is key. I was actually talking to a cloud engineer yesterday afternoon who emphasized how every stage of his career unlocked once he chose to focus on a particular technology or sub-industry. For example, he devoted tons of effort to earning all possible AWS and Azure certifications when he became interested in cloud architecture, and then added a cybersecurity focus on top of that. Although he’s relatively new to a tech career, he’s been able to make significant progress by positioning himself as a cloud/cybersecurity specialist who knows the most popular cloud platforms extremely well.
That being said, you don’t necessarily need certifications to prove you’re highly specialized, although they can really help; some recruiters and hiring managers like seeing certifications listed on your application materials, because they’re concrete evidence that you have the relevant skills. At Tesla, for example, you can land a job utilizing machine learning and A.I. provided you can actually demonstrate those skills in a series of (very difficult) tests. And a portfolio of projects on GitHub can show a hiring manager that you’re excellent at a particular tech discipline, even if you’ve had relatively few jobs up to this point.
When breaking into the job market, you also shouldn’t discount the power of mentors and advisors to help you stand out, especially when applying for junior roles. Online communities (like certain subreddits), college alumni networks, and more experienced colleagues can turn you on to new opportunities and offer good advice on how to apply for a particular position.