With WWDC 2019 just around the corner, the rumor and speculation mills are in full swing. The latest rumors suggest ‘Marzipan’ is getting full-featured APIs for cross-platform apps, and augmented reality (AR) is getting a dedicated app for developers.
APIs for Marzipan were always expected at some point. The current thinking is that, by 2020, we’ll reach full integration between iOS and macOS app binaries, with APIs bridging the natural gaps between the two platforms. One of the major schisms is the Mac’s input system (keyboard, trackpad, Touch Bar) and features such as the menu bar. 9to5Mac reports incoming APIs will allow UIKit apps (those written for iPhone and, more specifically, iPad) to work better on the Mac by allowing developers to tap into (pun intended) the Touch Bar and the menu bar. Sweet.
Split View apps written for UIKit will be resizable on the Mac, and those UIKit apps will gain multi-window capabilities, making them a lot more like native Mac apps.
If you’re keeping tabs, this news further pushes the idea that ‘Marzipan’ apps will be UIKit apps that relegate macOS’s AppKit framework to a series of APIs. Rather than weaving keyboard or trackpad support right into UIKit, Apple will expose it through an API (which is probably the cleanest approach).
Augmented reality (AR) is getting some attention, too. The Apple framework for augmented reality, ARKit, will have a new Swift-only framework, according to the report. Developers will also see a companion app for ARKit that allows them to “create AR experiences visually.” This sounds like AR devs will have a standalone developer environment to rival others, like Unity, but specific to augmented reality and ARKit proper. This report was light on details, but Apple has not been shy about claiming AR is significant to its future plans. An ARKit-specific IDE wouldn’t be a bad idea (or, you know, Swift Playgrounds, finally).
Based on the rumor mill, developers may also get new Siri intents for things like music playback, travel tickets, sending message attachments, and voice calls. Document scanning will become a framework of its own rather than a feature confined to Notes, and Core ML models will soon be updatable on-device. There may also be broader access to the Taptic Engine and the ability to read more types of NFC tags.
All good things… but we’ll have to wait until June 3, when WWDC kicks off, to see what Apple actually rolls out.