According to a new report from Bloomberg, Apple’s next iPhones – a successor to the mid-range XR, and two new “Pro” models – will carry an A13 SoC. That’s expected, but the report also notes an “AMX” component to the A13 that may be dedicated to augmented reality (AR).
Here’s the full language from Bloomberg’s report:
All of the new iPhones will have faster A13 processors. There’s a new component in the chip, known internally as the “AMX” or “matrix” co-processor, to handle some math-heavy tasks, so the main chip doesn’t have to. That may help with computer vision and augmented reality, which Apple is pushing as a core feature of its mobile devices.
This is a nuanced move by Apple. First, the same report suggests Apple is also planning new features for the incoming iPhones’ three-camera array. Supposedly, this array brings the mobile device “closer to professional video cameras,” letting users retouch, apply effects, alter colors, reframe, and crop video live.
A good example of these kinds of effects is Apple’s Clips app, which allows you to apply effects to video before recording begins – but not during. If Bloomberg is accurate (and it typically is with Apple rumors), Apple will essentially be baking Clips into its Camera app, and adding some pro-level features (a la FiLMiC Pro).
“AMX” seems well-suited for this purpose. It also looks like a component that could power real-time augmented reality features. At a basic level, advanced video editing and augmented reality are very similar: both modify the view of a scene in real time. Each adds some kind of “augmentation,” and each must understand the scene so the “new” look holds up no matter how the camera moves or is oriented.
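To illustrate why a dedicated matrix co-processor would matter here: keeping virtual content pinned to a moving scene boils down to repeated 4×4 matrix multiplications every frame, composing the camera’s pose with each anchor’s position. This is a minimal sketch of that math in Python/NumPy; the names and scenario are illustrative, not Apple’s API.

```python
import numpy as np

def translation(x, y, z):
    # 4x4 homogeneous translation matrix
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

# A virtual anchor placed 2 m in front of the world origin
anchor_world = translation(0.0, 0.0, -2.0)

# Camera pose: the device has since moved 0.5 m to the right
camera_pose = translation(0.5, 0.0, 0.0)

# To render the anchor from the camera's viewpoint, compose
# inverse(camera_pose) with the anchor's world transform --
# one of thousands of such multiplies per frame that a
# dedicated matrix unit could offload from the main cores.
view = np.linalg.inv(camera_pose) @ anchor_world

# The anchor should now appear 0.5 m to the camera's left,
# still 2 m ahead: translation component (-0.5, 0.0, -2.0)
print(view[:3, 3])
```

The per-frame cost scales with the number of tracked anchors and the camera’s frame rate, which is why offloading this arithmetic to co-processor hardware pays off.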
The new three-camera array may prove critical here. Much as Apple’s Portrait Mode bokeh comes with constraints, we expect real-time video editing to be limited in some way, such as to a certain aspect ratio. The third camera rumored for the next iPhones is believed to capture wider-angle shots and help record the scene more fully. If a friend gets cropped out of a group shot, or a building doesn’t fit in the frame, this wider-angle view may be able to stitch the missing element back in and produce a much better image.
With real-time video augmentation, the same wide-angle shot could be used to pre-process a scene. The same could be coming for AR features. More pointedly, this could be laying even more runway for Apple’s rumored AR glasses to take off in 2020.
Let’s first dial it back to 2017. Months after Apple purchased a company focused on some pretty good augmented reality glasses (which it positioned as useful for training athletes more than playing games), Apple CEO Tim Cook said “the technology itself doesn’t exist” for a quality AR headset. He specifically pointed to field-of-view as an issue.
(A previous Bloomberg report notes Apple has a dedicated AR operating system, ‘rOS,’ coming in 2020. This may coincide with the launch of bespoke eyewear.)
We’ve pointed out all along that heads-up AR glasses from Apple will likely not be positioned as always-on eyewear. Part of that reasoning has to do with hardware. The eyewear’s stems may house AirPods-like cylindrical batteries and tether wirelessly to the iPhone via the W1 Bluetooth chip. The frames may feature three cameras: one on either side of the lenses, and a wide-angle camera in the center.
Apple analyst Ming-Chi Kuo, another trustworthy source, says production of the headset should begin late this year or early next. It’s unclear who would get the go-ahead to manufacture the device, but if it’s the same vendor or vendors contributing to the iPhone supply chain, that timeframe suits the typical slowdown in iPhone production.
We’re obviously speculating, but the evidence for eyewear is mounting. Furthermore, though Apple likes to update ARKit at WWDC, it isn’t shy about offering off-cycle half-step updates: ARKit 1.5 was introduced in January 2018, and ARKit 2.0 landed at WWDC in June 2018.
We don’t expect to see the glasses, or any sort of ‘rOS’ system for glasses or augmented reality, before WWDC 2020. If Apple releases another half-step update to ARKit this winter (perhaps with gesture recognition billed as an extension of its RealityKit framework), it would be the final clue we need to know the glasses are coming. Apple is no stranger to tiptoeing into major new features or platforms.