Apple’s AR Headset is Real, It’s Just Not Really Here (Yet)

We’ve been expecting Apple’s augmented reality (AR) headset for quite some time. With every Apple event at which it fails to appear, the chin-scratching intensifies. "Maybe Apple doesn’t actually have an AR headset for us!" we think.

Nope. It’s real... just not ready for prime-time.

Code spelunkers have discovered mention of an ARDisplayDevice framework within the Xcode 11 GM seed. So, there’s our AR… display device.

Naturally, we all want to know when this headset will land. Other tidbits suggest we won’t see it for some time, sadly. As we noted in our roundup of iPhone 11 (and what it means for technologists like you), the addition of spatial audio and Dolby Atmos to the iPhone 11 Pro line is a hat-tip from Apple that this “ARDisplayDevice” is coming.

As with its new triple-camera array, Apple frames the feature (for now) as great for media; and sure, watching a movie on our phones will be a lot cooler with better audio. But it's also a very solid foundation for the coming AR evolution.
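
Positional audio is something developers can already wire up today with AVFoundation, and it's the kind of plumbing a spatial-audio AR experience would build on. A minimal sketch, with the file name and coordinates as our own placeholders rather than anything pulled from Apple's AR code:

```swift
import AVFoundation

let engine = AVAudioEngine()
let environment = AVAudioEnvironmentNode()
let player = AVAudioPlayerNode()

engine.attach(environment)
engine.attach(player)

// Route the player through the environment node so it can be positioned
// in 3D, then out to the hardware. Spatialization only applies to mono
// sources, so "ambience.caf" is assumed to be a mono file.
let monoFormat = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)
engine.connect(player, to: environment, format: monoFormat)
engine.connect(environment, to: engine.mainMixerNode, format: nil)

// Listener at the origin; the source sits two meters ahead and slightly right.
environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)
player.position = AVAudio3DPoint(x: 0.5, y: 0, z: -2)
player.renderingAlgorithm = .HRTFHQ

if let url = Bundle.main.url(forResource: "ambience", withExtension: "caf"),
   let file = try? AVAudioFile(forReading: url) {
    try? engine.start()
    player.scheduleFile(file, at: nil, completionHandler: nil)
    player.play()
}
```

Swap those hard-coded positions for anchors coming out of an ARKit session and you have the beginnings of audio that tracks the world around the wearer.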

The upcoming iOS 13 directly mentions the StarBoard framework, which is a system shell for supporting augmented reality modes. Those “modes” are “worn” and “held,” which we posited earlier could mean the headset has an accompanying dongle of some sort. Now it seems the “held” version of AR is a game controller. From the aforementioned spelunkers:

The GameController framework in iOS 13 also has a gamepad profile for a device meant to be used while using stereo AR apps. The controller profile has a clicky trackpad, a trigger button, and a system (home?) button. Handheld controller for Apple's headset? 🤔— Steve Troughton-Smith (@stroughtonsmith) September 10, 2019

And:

iOS 13.1 beta 3 and iOS 13.0 GM include the new StarBoard system shell, to run stereo AR apps. Is this real life? https://t.co/TxaX0un1dk pic.twitter.com/9LRuvIIzyc— Guilherme Rambo (@_inside) September 11, 2019

And finally:

The iOS 13 GM also comes with a readme file (!) for how employees can run Stereo AR apps on an iPhone when you don't have access to Apple's headset 😳 pic.twitter.com/SeZEHW8p0S— Steve Troughton-Smith (@stroughtonsmith) September 10, 2019
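
The stereo AR controller profile itself is private, but the public GameController framework gives a sense of what developers would actually write against. A hedged sketch, using a standard extended gamepad as a stand-in for whatever Apple's "clicky trackpad plus trigger" device turns out to be:

```swift
import GameController

// Listen for a controller connecting, then read the sort of inputs the
// leaked profile describes: a pointing surface, a trigger, and buttons.
// Keep the observer token alive for as long as you care about connections.
let connectionObserver = NotificationCenter.default.addObserver(
    forName: .GCControllerDidConnect,
    object: nil,
    queue: .main
) { notification in
    guard let controller = notification.object as? GCController,
          let gamepad = controller.extendedGamepad else { return }

    // Trigger presses, e.g. a "select" action in a stereo AR app.
    gamepad.rightTrigger.valueChangedHandler = { _, value, pressed in
        print("Trigger: \(value), pressed: \(pressed)")
    }

    // Thumbstick input standing in for the rumored trackpad.
    gamepad.leftThumbstick.valueChangedHandler = { _, x, y in
        print("Pointer moved to (\(x), \(y))")
    }
}

GCController.startWirelessControllerDiscovery(completionHandler: nil)
```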


We’ve posited Apple is setting up its dominoes for a Spring AR push, too. In our roundup, we wrote: “The A13 SoC is too powerful for its own good,” adding, “the A13 SoC is powerful enough to manage advanced augmented reality features… and we’d have to think Apple focusing on the camera system and machine learning is foreshadowing for the next few years of hardware development.”

Furthermore, we’ve suggested the three-camera array found on the iPhone 11 Pro and Pro Max will be mimicked on the incoming headset: a wide-angle ‘normal’ viewer, an ultra-wide camera for scanning peripheral environments, and a telephoto lens for appreciating depth of field (both near- and far-field). In its demo, Apple showed off how the camera could zoom out when you couldn't physically move backward.
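
For what it's worth, iOS 13 already exposes that triple-camera module to developers as a single virtual device. A quick sketch; the fallback list is our own choice, and whether a headset would expose anything similar is pure speculation:

```swift
import AVFoundation

// Ask for the triple-camera module first, then fall back to whatever
// the current device actually has.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInTripleCamera, .builtInDualWideCamera, .builtInWideAngleCamera],
    mediaType: .video,
    position: .back
)

if let camera = discovery.devices.first {
    print("Using \(camera.localizedName)")
    // A virtual device like the triple camera switches between its
    // constituent wide, ultra-wide, and telephoto lenses as you zoom.
    print("Constituents: \(camera.constituentDevices.map { $0.localizedName })")
}
```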

So: Apple has provided us with a series of depth-sensing cameras and spatial audio features... plus code hinting at a headset, alongside a game controller driver unique to AR. This tells us its first pass at an AR headset will be really good, but perhaps not fully evolved. ARKit 3 brought us object and people occlusion, which needs smarter depth-sensing cameras.
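
People occlusion, at least, is concrete today. Roughly what opting into it looks like in ARKit 3 (the bare ARSCNView here stands in for one you'd actually put on screen):

```swift
import ARKit

// Opt into ARKit 3's people occlusion when the hardware supports it
// (A12 Bionic and later). The view here would normally live in your
// view controller's hierarchy.
let sceneView = ARSCNView(frame: .zero)
let configuration = ARWorldTrackingConfiguration()

if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

sceneView.session.run(configuration)
```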

Apple also refreshed its A-series chipset for the iPhone 11 Pro and Pro Max with a heavy lean into machine learning, which tells us the headset will lean on the phone to do the heavy (processor) lifting. The rumor mill has focused on the "AMX" component, which seemed destined for augmented reality, even if it wasn't framed as such (we still think the machine learning prowess of the A13 lends itself to AR in a big way).
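
For a flavor of what that offloading looks like in practice, here's a hedged sketch of running a Core ML model against ARKit's camera feed via Vision, entirely on the phone. `SceneClassifier` is a hypothetical model class of our own, not anything Apple ships:

```swift
import ARKit
import Vision

// Classify each camera frame on-device. In a real app you'd throttle this
// and move the Vision work off the delegate callback.
final class FrameClassifier: NSObject, ARSessionDelegate {
    private lazy var request: VNCoreMLRequest? = {
        // SceneClassifier is a placeholder for a compiled Core ML model.
        guard let model = try? VNCoreMLModel(for: SceneClassifier().model) else { return nil }
        return VNCoreMLRequest(model: model) { request, _ in
            if let top = (request.results as? [VNClassificationObservation])?.first {
                print("Saw: \(top.identifier) (\(top.confidence))")
            }
        }
    }()

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let request = request else { return }
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, options: [:])
        try? handler.perform([request])
    }
}
```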

ARKit 3 also has a first pass at gesture recognition by way of its motion capture feature, but as we've noted, it’s not quite ready for full-on gesture controls, so it seems we’ll have to use a game controller for now.
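
For the curious, that motion-capture piece is already in the SDK. A bare-bones sketch, assuming a device that supports body tracking (A12 and up):

```swift
import ARKit

// Track a person in the camera feed and read one joint's transform.
final class BodyTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARBodyTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let body as ARBodyAnchor in anchors {
            // Joint names come from ARSkeleton.JointName; .rightHand is one of them.
            if let rightHand = body.skeleton.modelTransform(for: .rightHand) {
                print("Right hand at \(rightHand.columns.3)")
            }
        }
    }
}
```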

The AR headset is almost certainly real. Somewhere inside the spaceship, Apple engineers are tinkering with it. We’re betting a Spring event brings it to the masses (with new AirPods that may have spatial audio features of some sort), or we see it unveiled at WWDC 2020 alongside ARKit 4. But all of this tells us that whatever Apple brings forward won’t be designed as always-on glasses, as many have imagined.