Apple Vision Pro Offers New Opportunities for Developers

During this year’s Worldwide Developers Conference (WWDC), Apple unveiled the Vision Pro, its long-awaited virtual reality (VR) headset.

As is typical for Apple launches, the rumor mill got some features of the final product right. For example, the Vision Pro allows the wearer to adjust the opacity of the virtual environment; when you twist a dial on the headset, you can see more (or less) of your physical surroundings. The Vision Pro's onboard software also integrates with other Apple products, so you can do things like expand a Safari browser screen to the size of a room.

Apple is pushing the Vision Pro as a “spatial computer,” useful for everything from capturing 3D images to gaming. That’s a different marketing strategy than the one embraced by Meta, which is pushing its VR headsets primarily as gaming and conferencing platforms. Unlike Meta, which prices its headsets in the $300 to $1,000 range, Apple has slapped a hefty $3,500 sticker price on the first version of the Vision Pro, which could drive away some cost-conscious consumers.

If Apple wants the Vision Pro to succeed, it will need developers and engineers who recognize the platform’s potential and build must-have apps and services. That’s one reason why Apple waited until this year’s WWDC to roll out the device, which won’t actually hit store shelves until early 2024; it wants to give the tech professionals in attendance a glimpse at the tech and its underlying operating system, dubbed visionOS.

The all-important visionOS SDK is slated to arrive in late June. Those building apps for the platform will need to keep the following features in mind:

  • Windows: Based on Apple’s public demo, the Vision Pro’s interface emphasizes interactive windows floating in space. Developers will build these “windows” for their apps via SwiftUI and integrate 3D elements.
  • Spaces: The Vision Pro features “Shared Space,” where apps/windows float side-by-side, as well as “Full Space,” in which one app dominates the user’s view. The best analogy is probably running multiple apps in small windows on your desktop versus using a single app in full-screen view.
  • Volume: Volumes, also constructed via SwiftUI, are scenes “that can showcase 3D content using RealityKit or Unity, creating experiences that are viewable from any angle in the Shared Space or an app’s Full Space,” according to Apple.
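As a rough illustration of how windows and volumes fit together, here is a minimal SwiftUI sketch of a visionOS app that declares a standard 2D window alongside a volumetric one. The app name, window identifiers, and the "Globe" asset are hypothetical; the `WindowGroup`, `.windowStyle(.volumetric)`, and `Model3D` APIs are part of Apple's visionOS SDK.

```swift
import SwiftUI
import RealityKit

@main
struct GalleryApp: App {
    var body: some Scene {
        // A conventional 2D window that floats in the Shared Space.
        WindowGroup(id: "main") {
            ContentView()
        }

        // A volume: a bounded 3D scene viewable from any angle.
        // "Globe" is a placeholder name for a bundled 3D asset.
        WindowGroup(id: "globe") {
            Model3D(named: "Globe")
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.5, height: 0.5, depth: 0.5, in: .meters)
    }
}

struct ContentView: View {
    // Environment action for opening another window by its identifier.
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        Button("Show Globe") {
            openWindow(id: "globe")
        }
    }
}
```

Note how the volume is just another `WindowGroup` with a volumetric style applied; the system, not the app, decides where it sits in the Shared Space.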

In addition to SwiftUI and RealityKit, Apple is encouraging developers and engineers to master ARKit, its augmented-reality toolkit, and Unity’s developer tools. Apple’s Xcode app-building toolkit includes Reality Composer Pro, ideal for building, modeling, and iterating on 3D content for visionOS apps.
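For a Full Space experience, those tools come together in an `ImmersiveSpace` scene that loads content authored in Reality Composer Pro via a `RealityView`. The sketch below assumes a hypothetical asset named "ImmersiveScene"; the scene and view types are part of the visionOS SDK.

```swift
import SwiftUI
import RealityKit

// A Full Space scene: one app takes over the user's entire view.
struct ImmersiveSceneDefinition: Scene {
    var body: some Scene {
        ImmersiveSpace(id: "immersive") {
            RealityView { content in
                // Load an entity authored in Reality Composer Pro.
                // "ImmersiveScene" is an assumed asset name.
                if let scene = try? await Entity(named: "ImmersiveScene") {
                    content.add(scene)
                }
            }
        }
    }
}
```

An app would typically present this space with the `openImmersiveSpace` environment action, moving the user from the Shared Space into the app's Full Space.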