Apple’s Best New Announcement, According to a Developer

Plus, one unpleasant surprise

During its product event on Tuesday, Apple announced new updates for the Apple Watch line, a new “Apple One” bundle of services, and plenty more — but what got me most excited as a developer is the new iPad Air.

The new tablet comes with an updated bezel design that looks like the current iPad Pro’s, several color options, a Touch ID sensor built into the top button, and a suite of performance upgrades. The iPad Air now runs on the A14 chip, with a 6-core CPU and a Neural Engine not yet seen in any other iPad. The Neural Engine can perform up to 11 trillion operations per second, which means that developers using CoreML in their apps will see an immense improvement in how quickly those apps can run machine learning tasks.

One of the developers Apple used to showcase the new ML capabilities was Karim Morsy, who created the app djay Pro AI. Karim explained how with the new iPad Air, users can now DJ in the air, without touching the iPad. The app uses the camera to track the user’s hand movements and then uses machine learning to translate those movements into spinning the turntables. When I saw that, I envisioned an app that could help you learn and improve your sign language by tracking your hand movements.
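djay Pro AI’s actual implementation isn’t public, but camera-based hand tracking like this is exactly what Apple’s Vision framework added in iOS 14 with `VNDetectHumanHandPoseRequest`. A minimal sketch, assuming a `CGImage` camera frame is already available (error handling trimmed for brevity):

```swift
import Vision
import CoreGraphics

// Detect hand joints in a single camera frame and print the ones
// the model is confident about. Joint locations come back as
// normalized image coordinates with a confidence score.
func detectHandPose(in frame: CGImage) {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 2

    let handler = VNImageRequestHandler(cgImage: frame, orientation: .up)
    try? handler.perform([request])

    guard let observation = request.results?.first as? VNHumanHandPoseObservation,
          let points = try? observation.recognizedPoints(.all) else { return }

    for (joint, point) in points where point.confidence > 0.5 {
        print("\(joint.rawValue): \(point.location)")
    }
}
```

An app would feed each frame from an `AVCaptureSession` through a function like this, then map joint positions over time to gestures — the same pipeline a sign-language coaching app could build on.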

Game developers should be pumped, as well. Alexey Scherbakov, developer of War Robots, showed improved graphics and textures with the iPad Air.

Credit: Apple

I haven’t used CoreML in any of my own apps yet, but the new iPad Air may be just the push I need to dive in. If I were working on a photo-sharing app, I might use CoreML for an image classification feature that pre-populates tags for photo posts. An art app could use it to recommend color palettes or pairings to complement the colors in a user’s piece. A language-learning app could give users feedback on how well they’re speaking a new language. With faster processing times, CoreML lets developers take advantage of machine learning features without sacrificing performance.
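The tag-suggestion idea above is a standard CoreML-plus-Vision pattern. A rough sketch, where `PhotoTagger` is a hypothetical classification model bundled with the app (any model trained in Create ML would slot in the same way):

```swift
import Vision
import CoreML
import CoreGraphics

// Run a bundled image classifier over a photo and return the labels
// it is confident about as suggested tags. `PhotoTagger` and the 0.7
// confidence cutoff are placeholders, not a real model or tuned value.
func suggestTags(for photo: CGImage, completion: @escaping ([String]) -> Void) {
    guard let coreMLModel = try? PhotoTagger(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else {
        completion([])
        return
    }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        let tags = (request.results as? [VNClassificationObservation])?
            .filter { $0.confidence > 0.7 }
            .map { $0.identifier } ?? []
        completion(tags)
    }

    let handler = VNImageRequestHandler(cgImage: photo)
    try? handler.perform([request])
}
```

On an A14-class Neural Engine, a call like this runs fast enough to fire as the user composes a post, rather than as a background job.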

After the product announcements (in the last two minutes of the presentation, in fact), Apple dropped another bomb on developers. Tim Cook announced that iOS 14, iPadOS 14, watchOS 7, and tvOS 14 would be available on Wednesday. My jaw nearly dropped to the floor: it means that, ideally, all of us app developers would need to download the production version of Xcode 12 (which lets us build and submit iOS 14-compatible versions of our apps), fix any issues our apps may have, finish implementing any new iOS 14 features, submit to the App Store, and get approved, all within 24 hours.

Obviously, that’s nearly impossible, so users may run into a number of bugs if they download the new OSes on day one. App developers have to worry not only about their own code being up to date, but also about the code of any third-party frameworks they depend on for certain features to work. Many developers use frameworks like Realm, for local storage, or AppsFlyer, for mobile attribution. Luckily, third-party frameworks can ship releases directly through GitHub, so they aren’t beholden to the App Store review process; but if their current versions aren’t compatible with iOS 14, they’ll need to release new versions as soon as possible.
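One practical defense while waiting on dependencies is to pin each one to an exact known-good version and bump only once an iOS 14-compatible release ships. A minimal Swift Package Manager sketch — the URL and version number below are placeholders, not real coordinates:

```swift
// swift-tools-version:5.3
import PackageDescription

let package = Package(
    name: "MyApp",
    platforms: [.iOS(.v13)],
    dependencies: [
        // Pin to an exact version so an OS-day release of the
        // dependency can't silently break the build.
        .package(url: "https://github.com/example/SomeFramework.git", .exact("2.3.1"))
    ],
    targets: [
        .target(name: "MyApp", dependencies: ["SomeFramework"])
    ]
)
```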

Usually, Apple gives developers several days to a week after the announcement to submit and ship their apps for the new OS. (Last year’s September event was held on the 10th, and iOS 13 launched on the 19th.) This announcement came as a huge shock to me, so if you know a fellow iOS developer, make sure to send them a virtual hug. This will be several days of scrambling and stress for the majority of us.

Originally published here.