iOS 12 Boosts AR, Machine Learning, Siri Tools for Developers
If you’re an iOS developer, chances are good that you’ve been poking around the various beta versions of iOS 12 for at least a couple of weeks. But whether or not you’ve had that chance, it’s worth taking a moment to explore the new features now that the operating system is publicly available.

Siri Shortcuts

For starters, Apple would like more developers to embrace Siri, the company’s voice-activated digital assistant. To that end, it has built the Shortcuts API, which allows Siri to quickly access apps. For example, a company that created a “smart” thermostat could have a Shortcut that allows iPhone users to remotely control their home temperature via their voice. Nate Swanner has an extensive breakdown of how Shortcuts work, as well as their limitations.
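
To make that concrete, here is a minimal sketch of how an app might expose an action to Siri Shortcuts using the NSUserActivity donation path on iOS 12. The activity type, titles, and invocation phrase are hypothetical placeholders for the thermostat example above, not anything Apple prescribes.

```swift
import UIKit
import Intents

// A sketch of donating a "set temperature" action as a Siri Shortcut (iOS 12+).
// The identifiers and strings below are hypothetical.
func donateThermostatShortcut(from viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.thermostat.set-temperature")
    activity.title = "Set Home Temperature"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true            // lets Siri suggest this action as a shortcut
    activity.suggestedInvocationPhrase = "Set my thermostat"

    // Marking the activity as current on a visible view controller donates it to the system,
    // so it can appear in Siri suggestions and the Shortcuts app.
    viewController.userActivity = activity
}
```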

Machine Learning

For those interested in machine learning, Apple’s new Core ML 2 speeds up the processing of machine-learning models. First launched with iOS 11, Core ML is a framework with a lot of moving parts and some tricky code to support (including five classes dedicated to object detection and tracking, two for horizon detection, and five “superclasses” for vision). You can check out what we had to say about using it on iOS here.

There’s also Create ML, a framework that supposedly lets developers build machine-learning models without any machine-learning expertise. It is integrated into playgrounds in Xcode 10, which means you can observe model-creation workflows in real time. In theory, this will help developers and companies that want to tackle tasks such as image and text classification but don’t have machine-learning experts already on staff.
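
As a rough illustration of how little machine-learning code that workflow involves, here is a sketch of training an image classifier with Create ML in an Xcode 10 macOS playground. The dataset paths and the one-folder-per-label layout are assumptions for the example, not part of any particular project.

```swift
import CreateML
import Foundation

// Hypothetical dataset: one subfolder per label (e.g. "cats/", "dogs/") containing images.
let trainingDir = URL(fileURLWithPath: "/Users/me/Datasets/Pets/Training")
let testingDir  = URL(fileURLWithPath: "/Users/me/Datasets/Pets/Testing")

// Create ML infers the labels from the subdirectory names and trains the model.
let classifier = try MLImageClassifier(trainingData: .labeledDirectories(at: trainingDir))

// Check how the model does on held-out data before shipping it.
let evaluation = classifier.evaluation(on: .labeledDirectories(at: testingDir))
print("Classification error: \(evaluation.classificationError)")

// Export a .mlmodel file that Core ML can load in an iOS app.
try classifier.write(to: URL(fileURLWithPath: "/Users/me/PetClassifier.mlmodel"))
```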

Augmented Reality

Apple wants the iPhone and iPad to dominate augmented reality (AR). The key to that ambition is ARKit, the company’s platform for building AR apps and games. ARKit 2, the iteration accompanying iOS 12, can detect more complex real-world objects such as furniture or sculptures; there’s also enhanced 2D image tracking, which can recognize product boxes, magazine covers, and more. Apple offers a host of SDKs and documentation for building AR apps.
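
For a sense of what that looks like in code, here is a minimal sketch of turning on ARKit 2’s 3D object detection and 2D image tracking in a world-tracking session. The “AR Resources” asset-catalog group name is an assumption; it would contain reference objects and images scanned or added ahead of time.

```swift
import ARKit

// A sketch of an ARKit 2 session configured for object and image detection (iOS 12+).
func startDetectionSession(on sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()

    // Detect scanned real-world 3D objects, such as a piece of furniture or a sculpture.
    configuration.detectionObjects = ARReferenceObject.referenceObjects(
        inGroupNamed: "AR Resources", bundle: nil) ?? []

    // Track flat 2D images, such as product boxes or magazine covers.
    configuration.detectionImages = ARReferenceImage.referenceImages(
        inGroupNamed: "AR Resources", bundle: nil)
    configuration.maximumNumberOfTrackedImages = 2     // simultaneous image tracking, new in iOS 12

    sceneView.session.run(configuration)
}
```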

A Smarter Future

It’s clear that Apple wants to make iOS “smarter” in the years ahead, anticipating users’ needs and providing next-generation experiences in AR (and possibly even VR). In order to accomplish that goal, it will need developers who take advantage of Siri, augmented reality, and machine learning. If you’re an Apple developer but haven’t gotten around to exploring what iOS 12 can do, head over to the developer portal.