Siri Made Subtle Inroads at WWDC 2017

Apple Watch Siri WWDC 2017

Many expected Siri to take center stage at WWDC 2017. It did – but also didn’t. Apple had no landmark changes to the voice-activated digital assistant in store, but there’s still a lot going on. Here’s what’s new with Siri and SiriKit.

Siri’s iconic voice will change with iOS 11, the next iteration of Apple’s mobile operating system. To me, it sounds a lot more like Google Assistant than Siri as we know her now (the male voice is also new). It’s an effort to make an assistant that sounds less robotic.

Another change is Siri’s button. The microphone icon is going away, replaced by a morphing swirl of colors much like the macOS Siri icon and the new SiriKit iconography. It resembles the floating color wave on the iPhone and Apple Watch, which also appears on the new HomePod. That seems to be Siri’s look and feel going forward.

Developers have a few new domains to take advantage of, which let users do things like transfer money between accounts or add an item to a to-do list. Siri also translates: if you need to know how to say something in another language, it can answer with both spoken and written translations.

Apple Watch is also getting a healthy dose of Siri, including a new Siri watch face that surfaces timely reminders and appointments based on your calendar and offers proactive nudges, such as reminding you when to leave for an appointment. It feels a bit like Apple is beta-testing Siri’s incoming capabilities, but more on that in a minute.

SiriKit WWDC 2017

Siri for Developers

Siri wasn’t the star of WWDC 2017, and Apple CEO Tim Cook moved through the Siri portion of the keynote quickly. But under the hood, it feels like Siri is being set up to play a big role at WWDC 2018.

If we’re drawing parallels, Apple’s new thinking somewhat mirrors Microsoft’s. In Redmond, Microsoft has a new “Intelligent Cloud, Intelligent Edge” strategy that essentially makes machine learning and user-generated data the core of what Cortana, the company’s own digital assistant, can do. (It also positions Cortana as the interface rather than a standalone digital assistant.)

Apple is taking a similar tack while still adding to Siri’s capabilities, which looks like a hedged bet. SiriKit is alive and well, and Apple is doubling down on it. Within SiriKit, developers can get a bit more contextual thanks to added resolutions for users.

An example: if a user said, “Send an iMessage to Jenny, ‘hey I’ll be there shortly,’” Siri would resolve which “Jenny” the user meant, and which number to reach her on, before sending the message along.
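On the developer side, that resolution step lives in a SiriKit extension. Below is a rough sketch of how a messaging app might resolve recipients before a send; the handler class name and the hard-coded contact list are illustrative stand-ins for a real app’s contact store, not Apple sample code.

```swift
import Intents

// Illustrative SiriKit handler for the messaging domain.
class SendMessageHandler: NSObject, INSendMessageIntentHandling {

    // Hypothetical app contacts; a real app would query its own store.
    let knownContacts = ["Jenny Appleseed", "Jenny Smith", "Marco Arment"]

    func resolveRecipients(for intent: INSendMessageIntent,
                           with completion: @escaping ([INPersonResolutionResult]) -> Void) {
        guard let recipients = intent.recipients, !recipients.isEmpty else {
            // Nothing spoken; Siri will ask who to message.
            completion([.needsValue()])
            return
        }
        let results: [INPersonResolutionResult] = recipients.map { person in
            let matches = knownContacts.filter {
                $0.lowercased().contains(person.displayName.lowercased())
            }
            switch matches.count {
            case 0:
                return .unsupported()
            case 1:
                // Exactly one match: resolve silently and move on.
                let handle = INPersonHandle(value: matches[0], type: .unknown)
                return .success(with: INPerson(personHandle: handle, nameComponents: nil,
                                               displayName: matches[0], image: nil,
                                               contactIdentifier: nil, customIdentifier: nil))
            default:
                // Two "Jenny"s: Siri asks the user which one they meant.
                let candidates = matches.map { name in
                    INPerson(personHandle: INPersonHandle(value: name, type: .unknown),
                             nameComponents: nil, displayName: name, image: nil,
                             contactIdentifier: nil, customIdentifier: nil)
                }
                return .disambiguation(with: candidates)
            }
        }
        completion(results)
    }

    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // A real app would hand off to its messaging service here.
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}
```

The key idea is that Siri, not the app, drives the disambiguation dialogue: the handler just returns a resolution result, and Siri takes care of asking “which Jenny?” on its behalf.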

That functionality is being made available elsewhere, too. In iOS 11, Siri will ask follow-up questions. If a user did a Spotlight search for “best Thai food,” Siri would tap into Maps and find some Thai restaurants in the area. From there, it may ask if you want to make a reservation, which could direct you to a Maps-connected service such as OpenTable to save your spot for dinner.

It’s that same predictive behavior we see in the new Apple Watch face. In many ways this feels like an open beta test, and it likely ties back to machine learning. Apple made an aggressive push into machine learning this year with Core ML, its new machine learning framework, essentially asking developers to build machine learning into their own apps, which again mirrors Microsoft’s initiatives.

Microsoft’s Azure has its own machine learning framework, and developers are asked to tap into it for Windows applications – but web services are also able to use it. The difference is that Azure’s is cloud-based, while Apple prefers the more secure on-device approach. On-device processing is slower, but it can become more personalized over time. If a user searched for Thai food often, Siri may eventually offer to book a table at their favorite restaurant rather than suggest they check out OpenTable.

For developers, Siri is still all about intents and domains within SiriKit; it’s just a lot better. Machine learning will round things out over time, and it’s starting to feel like 2018 may be Siri’s true coming-out party. With nearly a full year of apps and machine learning under its belt, Siri may be able to unleash some truly great features next year.

There’s no doubt Apple is behind the curve, and adopting machine learning in 2017 is a very late play. The upside is that the company is working hard, and developers are already weaving Core ML into their apps. Siri still isn’t great, but there’s a lot of evidence it’s about to be the digital assistant we’ve always wanted.
