Apple’s fondness for augmented reality (AR) is well-documented. A new report from Bloomberg details the company’s possible progress in developing the technology.
Citing unnamed sources “with knowledge of the company’s plans,” Bloomberg says Apple already has a small army dedicated to AR for iPhone:
Hundreds of engineers are now devoted to the cause, including some on the iPhone camera team who are working on AR-related features for the iPhone, according to one of the people. One of the features Apple is exploring is the ability to take a picture and then change the depth of the photograph or the depth of specific objects in the picture later; another would isolate an object in the image, such as a person’s head, and allow it to be tilted 180 degrees. A different feature in development would use augmented reality to place virtual effects and objects on a person, much the way Snapchat works.
The AR team is well-rounded, too. Not only does it apparently touch every mobile corner of Apple, it’s led by alumni from powerhouse companies. Mike Rockwell, formerly of Dolby, is quarterbacking things alongside Yury Petrov (who came over from Oculus), Duncan McRoberts (formerly of Meta), Avi Bar-Zeev (HoloLens and Google Earth), Cody White (Amazon’s ‘Lumberyard’ project), and Tomlinson Holman (Lucasfilm).
Apple also has people working on AR glasses, according to the report. Though the glasses are “a ways off,” it seems Apple is still interested in a dedicated, physical AR product. While there are no details on what may be coming, all bets point to wireless glasses tethered to your iPhone. If that’s accurate, the glasses would likely work in unison with apps on your phone to overlay content onto the real world.
That’s a big initiative, and a lot to ask consumers to get used to. For now, we’re best served by augmented reality apps. Our most recent “big” taste of AR was Pokémon Go, which showcased how a bit of location data and mapping, mixed with sprites, could translate into a popular and lucrative app. The game also serves as a baseline explainer for AR to the mass of less-than-tech-savvy iPhone owners out there.
AR for Developers
For developers, this news frames the big picture. We’ve long expected Apple to get involved with either augmented reality or virtual reality (AR/VR) at some point in the near future. Tim Cook has been complimentary of AR, and his comments now seem like breadcrumbs leading us to think about AR from Apple instead of full-fledged VR.
While not every app will make good use of AR (I really can’t think of an AR use-case for a calculator, for example), those that depend on location data have cause to reconsider their app as a candidate for AR augmentation. The simplest example would be something like the native Best Buy app directing you to in-store merchandise using mapping data and your phone’s camera; instead of clumsily poking around the store, AR could serve as a digital escort to get you in, buying, and out the door quickly (and probably coming back more frequently).
To get to that point, however, we’ll need some official tooling from Apple. It’s not clear when such APIs or an ‘ARKit’ might arrive, but considering the timing of this news, WWDC 2017 may be AR’s coming-out party for iOS. Then again, that may also be too early: tipping its AR hat before the corresponding hardware ships could invite increased scrutiny for clues about Apple’s nascent headset. Buzzfeed also hints that such hardware will have its own operating system, much like watchOS for the Apple Watch. That sounds plausible, and may be why so many software people are said to be working on the program.
Apple has already laid some groundwork for mobile augmented reality. Metal, Apple’s low-level API for accessing the GPU with minimal overhead, will likely play a big part in Apple’s AR initiatives. In addition to providing better graphical performance, Metal has some interesting tidbits that AR developers may be able to take advantage of.
MetalArrayTexture demonstrates array textures: “a collection of one or two-dimensional images of identical size and format, arranged in layers,” which means fewer texture switches and thus less overhead. It suits terrain in an application, using your orientation to index into the array based on a point’s elevation.
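Setting up such an array texture comes down to a single descriptor. Here’s a minimal Swift sketch; the 256×256 size and eight layers are hypothetical values, imagined as one terrain tile per elevation band:

```swift
import Metal

// Sketch: configure a 2D array texture descriptor.
// Dimensions and layer count are hypothetical examples.
let desc = MTLTextureDescriptor()
desc.textureType = .type2DArray
desc.pixelFormat = .rgba8Unorm
desc.width = 256
desc.height = 256
desc.arrayLength = 8  // e.g. one layer per elevation band

// Creating the texture requires a GPU; the default device
// may be nil in environments without Metal support.
if let device = MTLCreateSystemDefaultDevice(),
   let texture = device.makeTexture(descriptor: desc) {
    // A shader can now sample any layer by index without
    // binding a different texture object per layer.
    print("Created array texture with \(texture.arrayLength) layers")
}
```

Because all layers live in one texture object, a fragment shader can pick a layer per-pixel, which is where the “fewer texture switches” savings come from.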
MetalBasic3D demonstrates rendering layers as 3D objects, replete with shading and perspective, based on device orientation.
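The perspective part of that rendering boils down to a standard projection matrix applied to each layer’s vertices. As a rough sketch of the underlying math in plain Swift (no Metal types; the field-of-view and clip-plane values are arbitrary examples):

```swift
import Foundation

// Sketch of a right-handed perspective projection matrix, stored
// row-by-row, of the kind a Metal renderer feeds its vertex shader.
func perspectiveMatrix(fovY: Double, aspect: Double,
                       near: Double, far: Double) -> [[Double]] {
    let yScale = 1.0 / tan(fovY / 2.0)  // vertical field of view
    let xScale = yScale / aspect        // keep pixels square
    let zRange = far / (near - far)     // remap depth into clip space
    return [
        [xScale, 0,      0,             0],
        [0,      yScale, 0,             0],
        [0,      0,      zRange,       -1],
        [0,      0,      near * zRange, 0],
    ]
}

// Example: 90° vertical field of view on a square viewport.
let m = perspectiveMatrix(fovY: .pi / 2, aspect: 1.0,
                          near: 0.1, far: 100.0)
```

Orientation from the device’s motion sensors then feeds a rotation matrix that is combined with this projection, which is how a flat layer ends up drawn “in perspective.”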
Going back to that example of navigation inside a Best Buy store, these tools could help display an item as though it were sitting on a shelf (possibly taking store lighting into account, too!), oriented to your position: the item would float as a layer on-screen, but show the correct side of the box depending on where you stand in the aisle. That’s about as real-world as mobile AR can conceptually get without dedicated AR glasses.
If additional rumors are to be believed, Apple will also solve the chicken-and-egg problem of iPhone processing power and battery life. An incoming OLED display logically leaves room for a larger battery, and AR-laden apps are sure to tax your phone’s power source. There’s not much that power-pinching developers can do behind the scenes, but users are sure to notice and complain if their phones run out of juice before they can use iMessage stickers.