WWDC 25, Day 2

Well, this year’s virtual WWDC is packed with even more activities, all of which make it really hard to get into my normal review of a ton of sessions. Yesterday I attended three different group labs. These are new for 2025, and end up being a Webex with a team of Apple engineers from various areas. The format is Q&A with the attendees, with questions upvoted. At only one hour long, they can get pretty detailed, but they also interrupt my flow for the day. Today I am only attending two of them, so I am hoping to get caught up on the various virtual sessions.

I attended the following group labs:

  • Developer Tools group lab – the discussion primarily focused on the new Swift Assist features (AI). This is going to have a huge impact on how developers work on their code.
  • Camera & Photos frameworks group lab – the discussion here was surprisingly focused on the LiDAR features of the camera. I am thinking there were a lot of people from the auto industry on the Webex. There were also some discussions about how to improve performance of photo views.
  • Swift group lab – most of this discussion was about Swift concurrency, which is not a big surprise, as the introduction of Swift 6 has forced a lot of developers to make their apps more concurrency safe.

In between the various labs, I tried to get a few videos in:

  • What’s new in App Store Connect – The major changes here were exposing many of the App Store Connect features via APIs for enterprises to integrate into their existing pipelines, along with new features in the iPad app. Review summaries generated by Apple’s LLM were also enabled.
  • What’s new in Apple Device Management and Identity – This was extremely informative. I recently took a new job and my company uses Managed Apple IDs. It has really impacted my ability to use my Vision Pro on a daily basis (I won’t put my work account on my personal devices). Apple’s new improvements to Business and School accounts mean that enterprises can now control even more granular capabilities on the device. There were also major improvements to pre-provisioning and reuse of devices. The Vision Pro is now supported in a major way, with (for the most part) all the same features as Mac and iOS.
  • What’s new in visionOS – I am going to have to go back and review this one multiple times. A few key items were access to the on-device foundation models, new accessories including support for Sony’s VR controllers, shared experiences, and a whole lot of content focused on enterprise usage models. These include improvements for managing multiple users sharing a single device, and for multiple people sharing the same virtual space when in the same room.
  • What’s new in watchOS 26 – The changes here were pretty minor, mostly UX refinements; however, the one feature that seemed meaningful was the introduction of RelevanceKit. This will allow widgets, stacks, and notifications to be much more aware of location, time of day, and other patterns. The one part that worries me about this feature is that it opens a new vector for stores to advertise.
  • What’s new in Widgets – To me, the biggest change in Widgets is that they are now available on visionOS and can be pinned to a location. This pinning is also available for other windows, and will make the Vision Pro even more relevant. You can also now use APNs (the Apple Push Notification service) to update widgets.
  • What’s new in Xcode – The integration of Swift Assist and local models is the part I am most excited about. The new Icon Composer and the updates to string catalogs also look great.