What Apple's HomePod Means for Developers
Apple's much-rumored HomePod made its debut at WWDC, and the natural question is “Why?” A developer event is no place to showcase hardware that devs can't use to create apps, but there's still plenty of reason to be excited about Apple's home hub.

The squat speaker is actually sort of brilliant. On paper, it has some of the best specs around: seven tweeters, each with its own amplifier, a four-inch woofer, and six microphones to listen for your queries. It also has a bit of nostalgia going for it: HomePod is an unabashed nod to the iPod, a device Apple used to change the music industry. After seeing the speaker in person at WWDC, I can say the sound is impressive.

Onstage, Apple emphasized music and how the device works hand in glove with Apple Music to feed you playlists and samplings from your favorite artists. Meanwhile, Siri took something of a back seat. While Amazon and Google lean on their digital assistants to sell speakers, Apple is approaching it from the other end, pushing music rather than artificial intelligence as a reason to buy. Indeed, it's a truism of home hubs that streaming music is what owners use them for most of the time. This marketing may be just clever enough to work: “Buy this amazing speaker, and, oh yeah, it also comes with Siri.”

In terms of Siri's functionality, nothing has changed with the introduction of HomePod: Siri queries are still encrypted, and it still dips into the data in your account to find upcoming events and the like.

Developers should pay close attention to HomePod's evolution. Apple pitched its new Music API at workout apps, but the real angle is playlists and surfacing music within apps. Similarly, a new AirPlay 2 API adds support for multi-room streaming.

HomePod will have more people querying Siri, too. Machine learning will play a big part there, and Apple's new Core ML framework is going to be used widely once HomePod drops; it has most of its roots in image detection, but it also arrives alongside tools for natural language processing. The NSLinguisticTagger class “provides a uniform interface to a variety of natural language processing functionality with support for many different languages and scripts.” Apple says it can be used to “segment natural language text into paragraphs, sentences, or words, and tag information about those tokens, such as part of speech, lexical class, lemma, script, and language.”

Making itself at home in your home, HomePod also works with HomeKit. Anything you can ask Siri to do for your connected home gadgets on the phone, you can do with HomePod. The device itself uses the A8 processor of iPhones past, suggesting it's capable of much more than Apple is giving it credit for.

All told, developers can tap into SiriKit, the Apple Music API, Core ML, and HomeKit for use with HomePod. That's enough reason for any developer to consider just how HomePod can fit into their app or service when it launches later this year. To make that concrete, the sketches below show roughly what a first pass at each of those hooks might look like in Swift.
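On the SiriKit side, the entry point is the familiar Intents extension: Siri hands your extension an intent, and you hand back a handler. Here's a minimal sketch using the workouts domain purely as an example of the shape; Apple hasn't spelled out exactly which domains HomePod will relay, and the IntentHandler class name is just the extension template's convention:

```swift
import Intents

// A minimal SiriKit Intents extension. The workouts domain here is
// illustrative; which domains HomePod relays is up to Apple.
class IntentHandler: INExtension, INStartWorkoutIntentHandling {

    // Siri asks the extension for an object that can handle each intent.
    override func handler(for intent: INIntent) -> Any {
        return self
    }

    // Respond to "start my workout" by deferring to the host app.
    func handle(intent: INStartWorkoutIntent,
                completion: @escaping (INStartWorkoutIntentResponse) -> Void) {
        let response = INStartWorkoutIntentResponse(code: .continueInApp,
                                                    userActivity: nil)
        completion(response)
    }
}
```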
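For music, an app can already queue Apple Music catalog content through the StoreKit and MediaPlayer frameworks, which is the likely foundation for that playlist angle. A sketch, assuming the user has an Apple Music subscription; the store IDs below are placeholders rather than real catalog entries:

```swift
import StoreKit
import MediaPlayer

// Ask permission to reach the user's Apple Music account, then queue
// tracks by catalog store ID and start playback.
SKCloudServiceController.requestAuthorization { status in
    guard status == .authorized else { return }
    let player = MPMusicPlayerController.systemMusicPlayer
    player.setQueue(with: ["1234567890", "0987654321"]) // placeholder IDs
    player.play()
}
```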
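The multi-room piece of AirPlay 2 hangs off the audio session's route-sharing policy. This sketch opts a playback session into long-form audio routing (`.longFormAudio` is the current name for the policy that arrived alongside AirPlay 2), which lets the system steer output to speaker groups:

```swift
import AVFoundation

// Opt the app's audio session into AirPlay 2-style routing, so the
// system can send playback to multi-room speaker groups.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playback,
                            mode: .default,
                            policy: .longFormAudio,
                            options: [])
    try session.setActive(true)
} catch {
    print("Audio session setup failed: \(error)")
}
```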
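NSLinguisticTagger is the easiest of these to kick the tires on, since it ships in Foundation. A short sketch that tags each word of a sample query with its lexical class (the sentence is just an arbitrary example):

```swift
import Foundation

// Tag each word in a sample query with its lexical class (noun, verb...).
let text = "Play something from my favorite artists."
let tagger = NSLinguisticTagger(tagSchemes: [.lexicalClass], options: 0)
tagger.string = text

let range = NSRange(text.startIndex..., in: text)
let options: NSLinguisticTagger.Options = [.omitPunctuation, .omitWhitespace]

tagger.enumerateTags(in: range, unit: .word,
                     scheme: .lexicalClass, options: options) { tag, tokenRange, _ in
    if let tag = tag, let wordRange = Range(tokenRange, in: text) {
        print("\(text[wordRange]): \(tag.rawValue)")
    }
}
```

Run against the sample sentence, that prints a part-of-speech label for each word, the kind of preprocessing an app might do before handing text to a Core ML model.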
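And on the HomeKit side, anything an app can reach through HMHomeManager is also reachable by Siri, whether the request starts on a phone or a HomePod. A sketch that simply enumerates homes and accessories; a real app also needs the HomeKit entitlement and a usage-description string, and the HomeLister name is just for illustration:

```swift
import HomeKit

// Enumerate the homes and accessories HomeKit knows about. Anything
// listed here is controllable by Siri, and therefore by HomePod.
class HomeLister: NSObject, HMHomeManagerDelegate {
    let manager = HMHomeManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    // Called once HomeKit finishes loading the home configuration.
    func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
        for home in manager.homes {
            print("Home: \(home.name)")
            for accessory in home.accessories {
                print("  Accessory: \(accessory.name)")
            }
        }
    }
}
```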