Building Bots: What Skills Do You Need?
“May I help you?” It’s a phrase you can say without much thinking. But programming a piece of software to say it—and then act on the answer—represents a whole new world of complexity. That’s because a bot needs to understand nuance and context, filter out pauses and wayward noises (“eh,” “um”), and analyze for ultimate intent. But it’s a problem that the tech industry must solve: vocal interaction with software, as exemplified by Apple’s Siri and Amazon’s Alexa, is clearly the way of the future. And more companies are turning to online bots, where conversations are typed as opposed to spoken, in order to fill a variety of needs.

Listen Before Speaking

The quest to make computers speak and understand has been underway far longer than you might think. For example, Synthetix has been in the bot game since 2001, when it began crafting “virtual agents.” That initial project was less a chatbot than a rudimentary offshoot of a website, giving “more stock answers from a known base,” according to Scott Swope, the company’s head of U.S. strategy, alliances and channels. “[Then we] started putting in natural language understanding and machine learning in the 2012 time frame,” he added. “Now we are seeing the sector ‘smart up.’”

A lot of tech giants, including Facebook and Microsoft, are very interested in having developers use their tools to build conversation-ready bots from the ground up. Some of this newfound interest stems from an abundance of massive, cheap compute power: if you’re a developer interested in building a bot or a voice-activated piece of software, you can tap into the enormous server farms run by AWS and other providers. There are also “a lot of open source libraries for pipeline developers,” Swope said. “There is more on the software side.”

Over a decade ago, bots had a limited scope and automated just a select few online processes. “Today, anyone can use tools like API.ai or Wit.ai and start building a basic bot,” explained John Dumoulin, director of applied research at Next IT, which designs conversational A.I. systems.

The rise of bots and voice-activated assistants, of course, raises the question of whether robots will begin replacing humans in certain key positions, most notably customer service. “The way we see it, bots—and A.I. in general—should augment, rather than replace, human intelligence,” said Adam Orentlicher, Director of Horizontal Applications for IBM Watson, Big Blue’s famous A.I. project. “Human agents are working with bots to provide the best service for customers—we call this the ‘human-bot tango.’ Customers may interact with bots on the front-end—getting initial questions answered—before being elevated to a customer service representative who can provide additional support.” That setup “gives human agents more time to resolve complicated customer issues, while the bot can resolve more basic queries and push an escalation based on customer’s intent or sentiment,” he added.
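Orentlicher’s “human-bot tango” comes down to a routing decision: answer what the bot knows, hand the rest to a person. Here is a minimal sketch of that pattern in Python; the intents, the keyword-based sentiment score, and the escalation threshold are invented for illustration and stand in for what a production system would get from a trained intent classifier and sentiment model.

```python
# A minimal sketch of the "human-bot tango": answer known intents,
# escalate to a human when the intent is unknown or sentiment turns
# negative. Every name and threshold here is an illustrative assumption,
# not any vendor's actual API.

FAQ_ANSWERS = {
    "store_hours": "We're open 9am-6pm, Monday through Saturday.",
    "reset_password": "You can reset your password from the account settings page.",
}

NEGATIVE_WORDS = {"angry", "terrible", "useless", "refund", "cancel"}

def sentiment_score(message: str) -> float:
    """Crude stand-in for a real sentiment model: share of negative words."""
    words = [w.strip(".,!?") for w in message.lower().split()]
    return sum(w in NEGATIVE_WORDS for w in words) / max(len(words), 1)

def handle(message: str, intent: str) -> str:
    """Answer known intents directly; otherwise push an escalation."""
    if intent not in FAQ_ANSWERS or sentiment_score(message) > 0.2:
        return "Let me connect you with a customer service representative."
    return FAQ_ANSWERS[intent]

print(handle("What are your store hours?", "store_hours"))             # answered by the bot
print(handle("This is useless, I want to cancel.", "cancel_account"))  # escalated to a human
```

In a real deployment the intent label and sentiment score would come from a trained model or a platform service rather than keyword counts, but the hand-off logic itself stays this simple.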

What Language Was That?

When it comes to building bots and assistants, many firms rely on a proprietary platform that allows them to build from scratch. For tech pros, that means deep familiarity with a variety of languages and technologies, especially since these digital servants end up interacting with a lot of other systems. At Synthetix, for example, the foundation is Jabberwocky, a “home-brew language” for processing and storing data. Add to this PHP, Node.js, Angular, JavaScript, jQuery, NoSQL (AWS DynamoDB), SQL (AWS Aurora), and Simple Storage Service (AWS S3) for the natural language/machine learning components, Swope said. “We’re not using the AWS ML [machine learning] resources at present as we have our own ideas about how to work with Machine Learning,” he added, “and prefer to build [rather] than rent or buy for anything that could be classed as fundamental intellectual property.”

Beyond that, the company tries to stick to “vanilla” programming languages, with an internal preference for PHP. Synthetix’s products must handle a lot of traffic, meaning a lot of quick interfacing with a database despite the choppy input of the spoken word. This is where Jabberwocky comes in, pre-processing everything from query expansion to synonyms, stemming, and phonetics. Bytecode sequences are usually cached in memory.

Another chatbot shop, Paphus Solutions, also makes “multi-lingual” demands of its programmers. “We are a Java shop, so Java is very important; we also look for experience with Android, iOS, JavaScript, web, python, AI/NLP/NN, TensorFlow and other library experience,” said company president James Sutherland.
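Swope doesn’t spell out Jabberwocky’s internals, but the pre-processing steps he lists (query expansion, synonyms, stemming) plus in-memory caching can be sketched in a few lines of Python. Everything below, from the synonym table to the crude suffix stripper and the cache size, is a hypothetical stand-in rather than Synthetix’s actual implementation.

```python
from functools import lru_cache

# Hypothetical synonym table; a real system would load a large,
# domain-specific thesaurus.
SYNONYMS = {
    "return": "refund",
    "parcel": "package",
    "purchase": "buy",
}

# Deliberately crude stemming: strip common suffixes so "returning" and
# "returned" collapse to the same stem. A real pipeline would use
# something like the Porter algorithm.
SUFFIXES = ("ing", "ed", "es", "s")

def stem(word: str) -> str:
    for suffix in SUFFIXES:
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

# Cache normalized queries in memory, loosely echoing the idea of keeping
# pre-processed sequences ready for fast database lookups.
@lru_cache(maxsize=4096)
def normalize(query: str) -> tuple:
    """Lowercase, strip punctuation, stem, and map tokens to canonical terms."""
    tokens = (t.strip(".,!?") for t in query.lower().split())
    stems = (stem(t) for t in tokens)
    return tuple(SYNONYMS.get(s, s) for s in stems)

print(normalize("Returning my parcel"))   # ('refund', 'my', 'package')
print(normalize("Where is my package?"))  # ('where', 'is', 'my', 'package')
```

Both queries land on the same canonical term, “package,” which is the whole point: however choppy the input, the database sees a small, predictable vocabulary.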

The Stuff You Don’t Learn in School

As with every new technology, bots and digital assistants suffer from a shortage of learned practitioners. Firms can hire for specific programming skills, but it is going to take a “little extra” on the part of the job candidate to get hired. “There is not really a skillset that we look for because we only deal with new technology,” Sutherland explained. “So no one has the skills; we are always developing new technology using other new technology.”

Sutherland is on the lookout for developers with intuition, who can puzzle out solutions without needing a tutorial or a walkthrough. “It is more important to have someone that can learn iOS development in a couple days, than someone who has iOS experience,” he continued. “But mainly just try to hire really smart people, which is a very difficult thing to do, and something that a résumé is not going to tell you.”

Nor do companies need someone with a doctorate in artificial intelligence. “After deploying for enterprise businesses for over 15 years, we have found regular programmers with a balanced understanding to be just as effective in building our products and technologies,” said Next IT’s Dumoulin. “Many skills that are considered ‘must-have’ for [an] AI/Bot can actually be overkill for bot creation, though scripting conversational solutions [does] require scripting tools for handling context state, as well as a logic programming system like Prolog or Datalog for slot filling and unification tasks.” (A bare-bones sketch of that slot-filling idea closes out this article.)

In addition to an inquisitive spirit and good learning skills, programmers who prosper in this brave new world often have some sort of background in natural conversation. This doesn’t necessarily mean natural-language processing; an education in linguistics, psychology, or sociology could also provide the necessary perspective. That’s because those fields attempt to pick apart the mechanics of human conversation—essential when you’re trying to build a bot or assistant that follows those rules. (Synthetix, for example, actively seeks out these eclectic programmers with backgrounds in language structure, linguistics, and advanced math.) “These ‘conversational’ engineers can work together with programmers who are engaged in back-end development to create a more well-rounded, engaging bot,” Orentlicher said. In theory, this could open the door to bots and digital assistants capable of handling not only basic customer-service requests, but also the nuanced back-and-forth that comes with counseling or negotiation.

At this juncture, bots and digital assistants are mostly about commerce; think about asking the module in your home to order more toilet paper, or dealing with a package-delivery issue with an online chatbot. Consequently, developers who can easily grasp business and client needs are valued; there are delivery and brand issues to consider along with the linguistic and cognitive-psychology ones.

While there’s a lot of buzz around bots, some companies are frustrated with the current pool of candidates. “The quality of the candidates being released from colleges could be a lot better,” Swope said. As the market matures, however, it could compel more tech pros to learn the right mix of skills—and perhaps create the next great digital-assistant platform.
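To put Dumoulin’s point about context state and slot filling in concrete terms, consider the toilet-paper order above: the bot cannot act until it knows what to order and how much. A minimal Python sketch, with plain dictionaries and regular expressions standing in for the Prolog- or Datalog-style unification he mentions, and with slots, patterns, and prompts that are entirely made up, might look like this:

```python
import re

# Minimal slot filling: the conversation "context" is a plain dictionary,
# and the bot keeps asking until every required slot is filled. All slots,
# patterns, and prompts here are illustrative assumptions.

REQUIRED_SLOTS = ("item", "quantity")

PATTERNS = {
    "item": re.compile(r"\b(toilet paper|paper towels|batteries)\b"),
    "quantity": re.compile(r"\b(\d+)\b"),
}

PROMPTS = {
    "item": "What would you like to order?",
    "quantity": "How many would you like?",
}

def fill_slots(message: str, context: dict) -> dict:
    """Copy any slot values found in the message into the conversation context."""
    for slot, pattern in PATTERNS.items():
        match = pattern.search(message.lower())
        if match and slot not in context:
            context[slot] = match.group(1)
    return context

def next_prompt(context: dict) -> str:
    """Ask for the first missing slot, or confirm the order once all are filled."""
    for slot in REQUIRED_SLOTS:
        if slot not in context:
            return PROMPTS[slot]
    return f"Ordering {context['quantity']} x {context['item']}. Confirm?"

context = {}
context = fill_slots("I need to order more toilet paper", context)
print(next_prompt(context))   # "How many would you like?"
context = fill_slots("Make it 2 packs", context)
print(next_prompt(context))   # "Ordering 2 x toilet paper. Confirm?"
```

A production assistant would add real entity recognition and dialogue management on top, but the bookkeeping at its core (what does the bot already know, and what does it still need to ask?) is exactly this kind of context-state tracking.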