The iPhone X is coming. With initial impressions having landed early this week, the device itself will start hitting Apple Stores (and users’ homes) this Friday. Developers may need to tweak their apps for the new-look iPhone, but that’s not hard.
The work associated with iPhone X support is (mostly) minimal. The first consideration is the ‘notch’ up top, where the device houses various cameras and sensors. Those sensors are meant for Face ID, Apple’s new method for device authentication.
Curiously, Apple doesn’t extend the hardware hub across the screen, instead opting for two small ‘ears’ on either side (where the time and status indicators live). That’s also not something developers can route around: as Apple clearly notes, apps can’t simply ‘black out’ the space up top and present their content in a carded format.
Apple also tells developers they should avoid hiding the status bar. It may be the clearest example of why the ‘notch’ exists in the first place. “The display height on iPhone X provides more vertical space for content than the displays of 4.7[-inch] iPhones, and the status bar occupies an area of the screen your app probably won’t fully utilize,” writes Apple. “The status bar also displays information people find useful. It should only be hidden in exchange for added value.”
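In practice, leaving the status bar alone means not overriding the system default. As a minimal sketch (the view controller name here is hypothetical), the relevant hook is UIKit’s `prefersStatusBarHidden` property:

```swift
import UIKit

class FeedViewController: UIViewController {
    // Keep the status bar visible by default; per Apple's guidance,
    // hide it only when doing so adds real value (e.g. full-screen
    // media playback), not just to reclaim the notch area.
    override var prefersStatusBarHidden: Bool {
        return false
    }
}
```

If this value changes at runtime, calling `setNeedsStatusBarAppearanceUpdate()` prompts the system to re-query it.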
The safeAreaLayoutGuide API (a layout guide on every view, not a framework) dictates where text and interactive elements should go. A basic tenet to keep in mind is that the top and bottom of the iPhone X screen should be avoided when it comes to buttons and interactive elements. This is especially true in landscape mode, where the top and bottom become sides. This might seem fussy, but CARROT Weather developer Brian Mueller tells us it helped his app; with more screen real estate, he was able to add more context.
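A minimal sketch of that advice (the controller and button names are illustrative): pin Auto Layout constraints to `safeAreaLayoutGuide` rather than to the raw screen edges, and the system keeps controls clear of the notch and the home-indicator region in every orientation.

```swift
import UIKit

class ContentViewController: UIViewController {
    let actionButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        actionButton.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(actionButton)

        // Constrain to the safe area, not to view.bottomAnchor or
        // view.leadingAnchor, so the button avoids the notch (in
        // landscape) and the home-indicator strip (in portrait).
        let guide = view.safeAreaLayoutGuide
        NSLayoutConstraint.activate([
            actionButton.leadingAnchor.constraint(equalTo: guide.leadingAnchor, constant: 16),
            actionButton.trailingAnchor.constraint(equalTo: guide.trailingAnchor, constant: -16),
            actionButton.bottomAnchor.constraint(equalTo: guide.bottomAnchor, constant: -16)
        ])
    }
}
```

On older devices the safe area simply matches the screen (minus the status bar), so the same constraints work everywhere.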
This all leads us to gestures, which are now critical for iOS. Apple chose to eliminate the Home button from its latest device, instead asking users to swipe up from the bottom of the screen to go home.
Gestures also drive multitasking. Swiping along the bottom of the iPhone X display switches between apps, and both ‘ears’ on either side of the notch are gesture-based: swiping down from the left ear opens notifications, while the right ear accesses Control Center. Those areas are now sacred ground; Apple specifically asks developers (especially game devs) to steer clear when possible:
People rely on these gestures to work in every app. In rare cases, immersive apps like games might require custom screen-edge gestures that take priority over the system’s gestures—the first swipe invokes the app-specific gesture and a second swipe invokes the system gesture. This behavior (known as edge protect) should be implemented sparingly, as it makes it harder for people to access the system-level actions.
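For the rare app that genuinely needs edge protect, UIKit exposes it through `preferredScreenEdgesDeferringSystemGestures`. A minimal sketch (the class name is hypothetical):

```swift
import UIKit

class GameViewController: UIViewController {
    // "Edge protect": defer system gestures along the bottom edge so the
    // first swipe reaches the game; a second swipe still goes home.
    // Apple advises using this sparingly.
    override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
        return [.bottom]
    }
}
```

If the set of protected edges changes mid-session (say, only during gameplay), calling `setNeedsUpdateOfScreenEdgesDeferringSystemGestures()` asks the system to re-read the property.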
Many of the remaining considerations are minimal. Though the underlying API hasn’t changed, Apple encourages developers to reference Face ID rather than Touch ID for authentication when applicable. Keeping artwork in view is also important; iPhone X brings a new aspect ratio, and content may be cropped or pillarboxed if not optimized for the new display.
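Getting the Face ID/Touch ID wording right can be done at runtime with LocalAuthentication’s `LAContext.biometryType` (new in iOS 11). One possible sketch, with a hypothetical helper function:

```swift
import LocalAuthentication

// Choose the right label for an unlock button. Note biometryType is
// only meaningful after canEvaluatePolicy(_:error:) has been called.
func biometricButtonTitle() -> String {
    let context = LAContext()
    var error: NSError?
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        return "Enter Passcode"  // Biometrics unavailable or not enrolled
    }
    switch context.biometryType {
    case .faceID:  return "Unlock with Face ID"
    case .touchID: return "Unlock with Touch ID"
    default:       return "Unlock"
    }
}
```

The authentication call itself is unchanged, which is why Apple frames this as a labeling issue rather than an API migration.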
Happily, most considerations are handled by boilerplate, system-wide rules, and while the screen is richer, the pixel density won’t require graphical element changes. All told, the dev work is minor; making sure apps are optimized for the App Store may actually prove more intensive.