iOS 11 From a Developer’s Perspective
Last week Apple presented the newest iteration of its mobile operating system, iOS 11. The announcement took place at WWDC, in keeping with Apple’s long-standing tradition of debuting new versions of iOS at its developer conference.
The annual event, which gathers developers from across the globe, was held in San Jose this time, practically on the doorstep of Apple’s new campus in Cupertino, Apple Park, colloquially known as the “Spaceship”. I took the chance to go through the new release from a developer’s perspective, and my findings are below.
Overall, the number of new features is impressive, so this article covers just those that caught my personal attention. If you are interested in more, here’s the full list from Apple; be warned that it’s a long read.
Embedded Augmented Reality (AR) Support
There are a number of SDKs you can use to bring AR experiences to your iOS apps, such as the multiplatform ARToolKit, but this time Apple decided to take matters into its own hands, so all hail ARKit. What sets it apart from third-party alternatives is seamless integration with existing iOS frameworks long used in game development, such as SceneKit and SpriteKit. That allows developers to easily create AR apps leveraging their existing skills with those frameworks, and even existing assets such as scenes in the SCN format. Just create a sample project using the ARKit template available in Xcode 9, select the Content Technology (e.g. SceneKit) and try it for yourself!
With SceneKit content, the workflow is rather straightforward and similar to regular node creation in SceneKit itself. A new ARSCNView does most of the heavy lifting, leaving to the developer the optional spawning of new entities, called anchors, that define the real-world position and orientation of the scenes to be added and aligned. Things do get more complicated with multiple scenes and their manipulation, but developing a basic AR app with ARKit took me less than an hour!
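To give a sense of how little code is involved, here is a minimal sketch along the lines of the Xcode 9 ARKit template (the ship.scn asset comes from that template; the class name is my own, and the configuration class name has shifted between early betas and the release SDK):

```swift
import UIKit
import SceneKit
import ARKit

class ARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        // Reuse an existing SceneKit asset as the AR content.
        if let scene = SCNScene(named: "art.scnassets/ship.scn") {
            sceneView.scene = scene
        }
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Start a session that tracks the device's position and orientation.
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

Anchors can then be spawned explicitly, e.g. placing one a metre in front of the camera with `var t = matrix_identity_float4x4; t.columns.3.z = -1; sceneView.session.add(anchor: ARAnchor(transform: t))`.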
Using SpriteKit content appears to be similar, but I didn’t experiment much with it. There’s also an option to use Metal, presumably for those concerned with squeezing out maximum performance, but unfortunately as of Xcode 9 Beta 1 even the respective template does not compile, which is to be expected of a first developer preview.
Machine Learning and Computer Vision
Apple also presented its take on computer vision, supplying developers with CoreML, a framework already powering many advanced features in Apple’s own apps such as Camera and Siri.
Basic CoreML usage is completely straightforward programmatically; it depends heavily, however, on the trained models that allow the AI to do its job, and those can be quite large. Fortunately, Apple has some pre-generated models available free of charge, so doing basic computer vision tricks such as object recognition comes down to simply calling the prediction() method of a loaded model. An alternative is the Vision framework, which also debuted in iOS 11 and appears to be a higher-level layer on top of CoreML and the same trained models from Apple.
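As a sketch of how that looks in practice, assuming one of Apple’s downloadable models (Resnet50 here) has been added to the project — Xcode then generates a Swift class of the same name — classification boils down to a single prediction() call; the Vision path wraps the very same model:

```swift
import CoreML
import Vision

// Direct CoreML usage: the generated Resnet50 class exposes prediction().
func classify(_ pixelBuffer: CVPixelBuffer) {
    let model = Resnet50()
    if let output = try? model.prediction(image: pixelBuffer) {
        print(output.classLabel)  // the most likely label
    }
}

// The higher-level Vision alternative, built on the same model.
func classifyWithVision(_ pixelBuffer: CVPixelBuffer) throws {
    let vnModel = try VNCoreMLModel(for: Resnet50().model)
    let request = VNCoreMLRequest(model: vnModel) { request, _ in
        if let top = (request.results as? [VNClassificationObservation])?.first {
            print(top.identifier, top.confidence)
        }
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try handler.perform([request])
}
```

Note that the input image size and pixel format must match what the model expects; Vision takes care of that scaling for you, which is one argument for the higher-level route.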
NFC for third party apps
NFC is finally open to third-party iOS developers: the new Core NFC framework can be used to scan and read tags from nearby NFC-enabled devices. The software side does not seem complex to use programmatically; however, only two iPhone models, the 7 and 7 Plus, fully support NFC NDEF tags.
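A rough sketch of the reading flow, using the Core NFC delegate API (the class name is my own, and the app additionally needs the NFC entitlement and a usage description in its Info.plist):

```swift
import CoreNFC

class TagReader: NSObject, NFCNDEFReaderSessionDelegate {
    func beginScanning() {
        // The session presents the system scanning UI and ends after one read.
        let session = NFCNDEFReaderSession(delegate: self,
                                           queue: nil,
                                           invalidateAfterFirstRead: true)
        session.begin()
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didDetectNDEFs messages: [NFCNDEFMessage]) {
        // Each message carries one or more NDEF records (payloads).
        for message in messages {
            for record in message.records {
                print(record.typeNameFormat, record.payload)
            }
        }
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didInvalidateWithError error: Error) {
        print("Session ended: \(error.localizedDescription)")
    }
}
```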
Accessing Apple Music with ease via Music Kit
Developers are now able to access the entire contents of the user’s music library in their apps, including offline content and the full Apple Music catalog of millions of songs. Searching for and fetching media info is performed via a web API, with responses coming back as standard JSON, while playback is handled through the Media Player framework. This could be an interesting feature for multimedia apps, in my opinion.
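As a hedged sketch of the two halves, a catalog search against the Apple Music web API followed by playback through Media Player might look like this (the developer token is a placeholder you would mint through your developer account, and the storefront is assumed to be “us”):

```swift
import MediaPlayer

// Search the Apple Music catalog; responses come back as standard JSON.
func search(term: String, developerToken: String) {
    var components = URLComponents(string: "https://api.music.apple.com/v1/catalog/us/search")!
    components.queryItems = [URLQueryItem(name: "term", value: term),
                             URLQueryItem(name: "types", value: "songs")]
    var request = URLRequest(url: components.url!)
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
    URLSession.shared.dataTask(with: request) { data, _, _ in
        if let data = data,
           let json = try? JSONSerialization.jsonObject(with: data) {
            print(json)  // song results, including their catalog store IDs
        }
    }.resume()
}

// Hand a store ID from the search results to the system player for playback.
func play(storeID: String) {
    let player = MPMusicPlayerController.systemMusicPlayer
    player.setQueue(with: [storeID])
    player.play()
}
```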
Native PDF display and manipulation
Finally, PDFKit has been ported to iOS from the Mac, so there is no need to rely on UIWebView with its basic rendering abilities, on third-party SDKs, or on complicated Core Graphics routines. The PDFView we are used to dealing with on the Mac can now handle all PDF viewing and navigation operations in iOS apps just as easily. Use an instance of PDFDocument to load a PDF from a URL and associate it with the PDFView – that’s pretty much it.
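The whole setup fits in a few lines; a minimal sketch (the bundled file name is a placeholder, and the URL could equally point to a remote document):

```swift
import UIKit
import PDFKit

class PDFViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let pdfView = PDFView(frame: view.bounds)
        pdfView.autoScales = true  // fit pages to the view
        view.addSubview(pdfView)

        // Load the document and hand it to the view; PDFView takes care
        // of rendering, zooming and page navigation from here.
        if let url = Bundle.main.url(forResource: "manual", withExtension: "pdf") {
            pdfView.document = PDFDocument(url: url)
        }
    }
}
```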
That concludes my short dive into the newest iOS; there’s much more cool stuff (say, the revamped iOS Simulator, which you can finally run in multiple instances). Nice job, Apple. I’m looking forward to the customer release, which should happen this fall.