I had high hopes for day five, but I ended up pulling a muscle in my shoulder and just couldn’t get my head on right to get through online content. I did make some major progress, but then petered out… My backup plan was to power through more today, but I ended up working on my own apps instead: fixing accessibility issues, designing new icons, and smoothing out some paper cuts. Overall, a very rewarding day. The sessions I did get through are as follows:
- Discover machine learning & AI frameworks on Apple Platforms – this foundational session went through a few key concepts: how Apple is using ML in its own apps, how to leverage the new foundation models as a developer, and finally how to bring your own models to the device.
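To give a feel for how simple the developer-facing side is, here’s a minimal sketch of calling the on-device foundation model. This assumes the FoundationModels framework shown at WWDC25; the instructions string and function name are my own, and real code would also want an availability check:

```swift
import FoundationModels

// Hypothetical helper: ask the on-device model to summarize some text.
func summarize(_ text: String) async throws -> String {
    // A session carries instructions that frame every request.
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one sentence."
    )
    let response = try await session.respond(to: text)
    return response.content
}
```

The appealing part is that this runs entirely on device, so no prompt data ever leaves the user’s hardware.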
- Dive deeper into Writing Tools – since Apple Intelligence was first introduced in 2024, the improvements made to Writing Tools have been extensive. Not only that, but you can now customize native views to limit which features you wish to expose to your users. You can also customize the writingToolsResultOptions to let Writing Tools know the type of text to expect.
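In UIKit terms, both customizations land as properties on the text view. A quick sketch, assuming the UITextView properties Apple added for Writing Tools (double-check the exact option names against the docs):

```swift
import UIKit

let textView = UITextView()

// Limit Writing Tools to inline suggestions instead of the full experience.
textView.writingToolsBehavior = .limited

// Tell Writing Tools what kind of results this field can accept;
// here, plain text and lists, but no rich text or tables.
textView.allowedWritingToolsResultOptions = [.plainText, .list]
```

Opting down like this is handy for fields that store plain strings, where a pasted-in table would just be mangled.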
- Elevate the design of your iPad app – this session takes you through changes you may wish to make to your iPad app to take advantage of the new “liquid glass” design language. While many of the features will be automatically applied if you just recompile, understanding how window resizing, navigation bars, pointers, and the menu bar work will allow your app to really shine on iPad.
- Embracing Swift concurrency – the biggest change in Swift 6.2, in my humble opinion, is the new Approachable Concurrency model. Since most apps trying to adopt the Swift 6 concurrency model were quickly overwhelmed with warnings and issues, the new model allows developers to declare that their app is, by default, single threaded. You then add concurrency deliberately, which greatly simplifies getting your app set up. I switched one of my apps to this model and it removed 50% of the warnings I had been trying to resolve.
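For a package target, the switch is a couple of build settings. This is a sketch of what I understand the Swift 6.2 configuration to look like; the setting names come from the Swift evolution proposals, so verify them against your toolchain:

```swift
// swift-tools-version: 6.2
import PackageDescription

let package = Package(
    name: "MyApp",  // hypothetical package name
    targets: [
        .target(
            name: "MyApp",
            swiftSettings: [
                // Everything in this target is implicitly @MainActor,
                // i.e. single threaded unless you opt code out.
                .defaultIsolation(MainActor.self),
                // Async functions run on the caller's actor by default
                // instead of hopping to a background executor.
                .enableUpcomingFeature("NonisolatedNonsendingByDefault")
            ]
        )
    ]
)
```

In Xcode app targets the same knobs show up as build settings rather than Package.swift entries.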
- Enhance child safety with PermissionKit – there has been a lot of legislative activity around child safety online and in app stores across the US, with some states mandating online ID verification for content. One of the challenges is that this creates yet another place for personal information to leak. One of the interesting aspects of the new API Apple has provided is the ability to define “age ranges” that are queryable without giving websites or apps direct access to PII (personally identifiable information). I hope that this approach gets adopted more broadly; however, many companies would rather have your PII so that they can sell that data. Anyway, the session goes through how to use the PermissionKit API, including how to create “ask” experiences, which have a child’s device notify the parent that the child wants access and require positive confirmation from the parent.
- Enhance your app’s multilingual experience – TextKit 2, introduced a few years back, really steps up this year with the ability to correctly handle two languages at the same time in the same input field. Think about writing words in both left-to-right and right-to-left languages in the same sentence. Blows my mind!
- Evaluate your app for Accessibility Nutrition Labels – this is the session that really sidetracked me. There was so much I needed to do after running an accessibility audit on my simplest app that understanding it all became even more important. This session does an awesome job of going through specific examples for each section of the Nutrition Label. Highly recommended!
- Share visionOS experiences with nearby people – while I know I will never convince my wife to try the Vision Pro, let alone have two of them in our house, this session goes through the architecture for both nearby people and remote users, explaining how placement, recentering, and FaceTime integration all work. Fascinating discussion of shared anchors too!
- Set the scene with SwiftUI in visionOS – a look at how you can now integrate SwiftUI with Volumes and Immersive Spaces in a much more fluid manner. The new scene bridging capabilities allow UIKit apps to also support volumes and immersive spaces. Hopefully this means we will see a lot more visionOS apps coming soon!
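For anyone who hasn’t touched visionOS yet, a volume really is just a SwiftUI scene with a different window style. A minimal sketch (the app name, scene id, and “Globe” asset are all placeholders of mine):

```swift
import SwiftUI
import RealityKit

@main
struct GlobeApp: App {
    var body: some Scene {
        // A volume is a WindowGroup rendered with the volumetric style.
        WindowGroup(id: "globe") {
            // Model3D loads a 3D asset by name from the app bundle.
            Model3D(named: "Globe")
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.5, height: 0.5, depth: 0.5, in: .meters)
    }
}
```

The scene bridging piece is what lets an existing UIKit app open a scene like this without rewriting its whole lifecycle in SwiftUI.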
- Say hello to the new look of app icons – the final session I watched before my shoulder and neck pain took me out of being able to focus. This session did a great job of getting me psyched to update my own apps. A member of Apple’s design team takes you through how Apple updated its own icons across the board. Well done!