Based on the shape of the display, the new design language has three foundational patterns:
Dial
Infographic
List
Use the background to convey additional information – this is key to migrating widgets. This is backgroundContent – not just a visual flourish
New material applications (this is to align with xrOS / visionOS) –
Translucency establishes hierarchy
New navigation patterns –
Use the Digital Crown to do things without covering the screen.
Vertical pagination is now used for multiple tabs in an app; use it instead of horizontal pagination.
Use object permanence of items across pages if you extend beyond the height of the display.
Source List is a new pattern; see the World Clock app for an example.
This API is all about lifting subjects from images, e.g. pulling your favorite pet out of a picture to be used elsewhere
Subject Lifting
With a simple long press on an image you can share the subject, and now lift any subject and turn it into a sticker. This works automatically for most code from last year.
Just set an appropriate interaction type; .automatic does a combination of all types (see the sketch below)
.imageSegmentation will only do subject lifting
.automaticTextOnly – same as iOS 16 behavior – text selection and data detectors
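A minimal sketch of how that wiring looks, assuming a plain UIImageView (the view controller name is illustrative):

```swift
import UIKit
import VisionKit

// Minimal sketch: attach an ImageAnalysisInteraction to an image view so the
// system handles subject lifting and text selection for you.
final class PhotoViewController: UIViewController {
    private let imageView = UIImageView()
    private let interaction = ImageAnalysisInteraction()
    private let analyzer = ImageAnalyzer()

    override func viewDidLoad() {
        super.viewDidLoad()
        imageView.addInteraction(interaction)
    }

    func analyze(_ image: UIImage) async throws {
        let configuration = ImageAnalyzer.Configuration([.text, .visualLookUp])
        let analysis = try await analyzer.analyze(image, configuration: configuration)
        interaction.analysis = analysis
        interaction.preferredInteractionTypes = .automatic  // combination of all types
    }
}
```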
Visual Look Up
Allows users to learn about pets, nature, landmarks, art, and media. iOS 17 adds food, products, and signs & symbols.
This helps with laundry tag information!
Works in six languages to start.
Analysis is entirely on the device. Includes feature extraction.
Once you look up an object, it goes to the server for more information.
Add .visualLookUp to your analyzer configuration
Badges are displayed on subjects for lookup etc.
Data Scanner and Live Text – allow for OCR
Optical flow – enables high-speed scanning and comes for free when recognizing text.
Currency Support
Find and interact with monetary values by setting textContentType to .currency in the initializer, just like email addresses or phone numbers
Provides both bounds and a transcript; the session shows a code example for totaling a receipt (a sketch follows below).
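A hedged sketch of what that might look like – the .currency content type is the new iOS 17 piece, and ReceiptScanner is an illustrative name:

```swift
import UIKit
import VisionKit

// Minimal sketch: scan for monetary values just like email addresses or
// phone numbers, and read each one as it is recognized.
final class ReceiptScanner: UIViewController, DataScannerViewControllerDelegate {
    func makeScanner() -> DataScannerViewController {
        let scanner = DataScannerViewController(
            recognizedDataTypes: [.text(textContentType: .currency)],
            qualityLevel: .balanced,
            recognizesMultipleItems: true,
            isHighlightingEnabled: true
        )
        scanner.delegate = self
        return scanner
    }

    // Each recognized item provides bounds and a transcript.
    func dataScanner(_ dataScanner: DataScannerViewController,
                     didAdd addedItems: [RecognizedItem],
                     allItems: [RecognizedItem]) {
        for case let .text(text) in addedItems {
            print(text.transcript, text.bounds.topLeft)
        }
    }
}
```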
Live Text – more regions:
Can detect document structure: list detection arrived in iOS 16, and now tables are detected too – it will even merge cells if needed.
Context-aware data detectors will pull additional information from surrounding areas, like a name and address.
Last year you could get the .transcript; now you can also get ranges, the selected text, attributed strings, and be notified when text changes.
Expanded platform support
All about the Mac – Catalyst support being rolled out
Catalyst –
A simple recompile gets you Live Text, Subject Lifting, and Visual Look Up; no QR support yet (you can do it via the Vision framework)
Native Mac API
ImageAnalyzer
ImageAnalysisOverlayView – a subclass of NSView; just add it above your image content (adding it as a subview of the content view is simplest)
Basically the same as iOS (except machine-readable codes are a no-op)
contentsRect – the overlay needs to know the rectangle where the image content sits within the view
If you use an NSImageView you can just set trackingImageView on the overlay and the system handles it for you
Contextual Menus (Mac)
You can combine items in the same menu; this provides much better right-click menus based on the selected item.
Items are identified by tags; there is a struct that defines them. Code examples are in the session (a hedged sketch follows below).
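A hedged sketch of the delegate callback as the session describes it – PhotoOverlayController and the menu item are illustrative, and the exact tag constants should be checked against the docs:

```swift
import AppKit
import VisionKit

// Hedged sketch: merge your own items into the overlay's contextual menu
// so the user gets one unified right-click menu.
final class PhotoOverlayController: NSObject, ImageAnalysisOverlayViewDelegate {
    func overlayView(_ overlayView: ImageAnalysisOverlayView,
                     updatedMenuFor menu: NSMenu,
                     for event: NSEvent,
                     at point: CGPoint) -> NSMenu {
        // System items are identified by tags (ImageAnalysisOverlayView.MenuTag);
        // append your own items to the same menu.
        let item = NSMenuItem(title: "Open in Editor",
                              action: #selector(openInEditor(_:)),
                              keyEquivalent: "")
        item.target = self
        menu.addItem(item)
        return menu
    }

    @objc func openInEditor(_ sender: NSMenuItem) { /* ... */ }
}
```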
Scene containers have been updated for visionOS – you can apply the .volumetric window style to a WindowGroup
RealityView
The ImmersiveSpace {} scene type allows for full immersion, or mixed immersion to create an AR view (see the sketch below)
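A minimal sketch of the two scene types together; the app name and the "Globe" asset are illustrative:

```swift
import SwiftUI
import RealityKit

// Minimal sketch of the new visionOS scene types: a volumetric window
// plus an ImmersiveSpace whose immersion style can be mixed or full.
@main
struct GlobeApp: App {
    var body: some Scene {
        WindowGroup {
            RealityView { content in
                if let globe = try? await Entity(named: "Globe") {
                    content.add(globe)
                }
            }
        }
        .windowStyle(.volumetric)

        ImmersiveSpace(id: "solar-system") {
            RealityView { content in
                // Build the fully immersive scene here.
            }
        }
        .immersionStyle(selection: .constant(.mixed), in: .mixed, .full)
    }
}
```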
The redesigned UX in watchOS 10 is based on existing SwiftUI views; you get the new transitions for free.
The .containerBackground modifier enables background transitions on push and pop.
ToolbarItem gains .topBarTrailing and .bottomBar placements (see the sketch after this list).
Date Picker and Selection in List have been added to Apple Watch
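A minimal sketch of those watchOS 10 pieces – the view and its content are illustrative:

```swift
import SwiftUI

// Minimal sketch: a container background that animates on push/pop,
// plus the new watchOS 10 toolbar placements.
struct CityDetailView: View {
    var body: some View {
        ScrollView {
            Text("Cupertino")
        }
        .containerBackground(Color.indigo.gradient, for: .navigation)
        .toolbar {
            ToolbarItem(placement: .topBarTrailing) {
                Button("Add", systemImage: "plus") { }
            }
            ToolbarItem(placement: .bottomBar) {
                Button("Options", systemImage: "ellipsis") { }
            }
        }
    }
}
```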
Widgets on iPad, Home Screen, Desktop, and interactive controls
New framework updates for SwiftUI – focus on major updates to MapKit (will need to see how I can reformat my Map integrations – Meet MapKit for SwiftUI); Swift Charts gains scrolling charts plus donut and pie charts.
Simplified Data Flow
Start with data model created in SwiftData
@Observable models use SwiftUI patterns for data flow: if you pass a model through intermediate views, those views won't be invalidated unless they actually read a property of the model.
See the session Discover Observation in SwiftUI.
SwiftData models are defined entirely in code and get persistence and observability.
Add a .modelContainer to your app's scene.
Add @Query to the model in your views (tells SwiftData to fetch from the database); also works for document-based apps. A sketch of the whole flow follows below.
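A minimal sketch of that flow end to end; the Trip model is illustrative:

```swift
import SwiftUI
import SwiftData

// Minimal sketch of the SwiftData flow: define a model, attach a
// container to the scene, and fetch with @Query.
@Model
final class Trip {
    var name: String
    var startDate: Date
    init(name: String, startDate: Date) {
        self.name = name
        self.startDate = startDate
    }
}

@main
struct TripsApp: App {
    var body: some Scene {
        WindowGroup {
            TripList()
        }
        .modelContainer(for: Trip.self)   // sets up persistence
    }
}

struct TripList: View {
    @Query(sort: \Trip.startDate) private var trips: [Trip]  // fetches from the store

    var body: some View {
        List(trips) { trip in
            Text(trip.name)
        }
    }
}
```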
Document Groups – automatically get sharing, renaming and more
Inspectors – create side bars or sheets based on platform.
Dialog customization
Help Links –
Lists and tables get fine-tuning options:
Tables – Column order and visibility, and new Disclosure Table rows (for grouping)
Sections now have programmatic expansion.
Style formatting for smaller lists or tables.
Extraordinary animations
Animations can be improved with keyframes, triggered by state changes (runs a set of animation tracks in parallel).
Phase animators are simpler than keyframes – step through a sequence of animations (sketch below).
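A minimal sketch of a phase animator stepping a heart through a pulse each time a counter changes (the view is illustrative):

```swift
import SwiftUI

// Minimal sketch: on each tap, the phase animator runs through the
// scale phases and settles back at the first one.
struct LikeButton: View {
    @State private var likeCount = 0

    var body: some View {
        Image(systemName: "heart.fill")
            .phaseAnimator([1.0, 1.5], trigger: likeCount) { view, scale in
                view.scaleEffect(scale)
            } animation: { _ in
                .spring(duration: 0.3, bounce: 0.6)
            }
            .onTapGesture { likeCount += 1 }
    }
}
```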
Haptic feedback uses the new SensoryFeedback API – add a .sensoryFeedback modifier (sketch below); check the HIG's Playing Haptics guidance.
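A minimal sketch, assuming a simple toggle as the trigger:

```swift
import SwiftUI

// Minimal sketch of the SensoryFeedback API: play success haptics
// whenever the task state flips.
struct TaskRow: View {
    @State private var isDone = false

    var body: some View {
        Toggle("Done", isOn: $isDone)
            .sensoryFeedback(.success, trigger: isDone)
    }
}
```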
.visualEffect modifier – no GeometryReader needed – allows an animation based on position on screen.
Text now supports style interpolation with .foregroundStyle, so you can do cool Metal shaders using ShaderLibrary.
Symbols now have a .symbolEffect modifier (apply it to a single image or an entire view). Check out Animate Symbols in your app.
You can apply .textScale to a text view.
The .typesettingLanguage modifier makes room for languages that need more space (like Thai).
Enhanced interactions
The .scrollTransition modifier applies effects to items in a scroll view.
The .containerRelativeFrame modifier sizes views relative to other screen parts.
.scrollTargetLayout()
.scrollPosition lets you react to the position within the scroll view (the sketch below combines these).
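A minimal sketch combining these scroll APIs (the content is illustrative):

```swift
import SwiftUI

// Minimal sketch: items fade and scale as they enter the viewport,
// each item is sized relative to the scroll view, and the view tracks
// which item is currently scrolled to.
struct PhotoStrip: View {
    @State private var scrolledID: Int?

    var body: some View {
        ScrollView(.horizontal) {
            LazyHStack {
                ForEach(0..<20, id: \.self) { index in
                    RoundedRectangle(cornerRadius: 12)
                        .fill(.blue)
                        .containerRelativeFrame(.horizontal)
                        .scrollTransition { content, phase in
                            content
                                .opacity(phase.isIdentity ? 1 : 0.4)
                                .scaleEffect(phase.isIdentity ? 1 : 0.8)
                        }
                }
            }
            .scrollTargetLayout()
        }
        .scrollPosition(id: $scrolledID)
    }
}
```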
The .allowedDynamicRange modifier is now supported to show full fidelity; use it sparingly (it is likely a performance hog).
New accessibility options: you can add .accessibilityZoomAction so VoiceOver can access other actions, like swipe actions.
Color can look up custom colors defined in your asset catalog.
Menus now have improved visual styles
New BorderShape styles for buttons
.onKeyPress allows for actions based on key presses, with modifiers (sketch below).
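A minimal sketch, assuming Cmd-Return should submit a draft:

```swift
import SwiftUI

// Minimal sketch of .onKeyPress with a modifier check:
// Cmd-Return submits, everything else passes through.
struct ComposerView: View {
    @State private var draft = ""

    var body: some View {
        TextEditor(text: $draft)
            .onKeyPress(phases: .down) { press in
                if press.key == .return, press.modifiers.contains(.command) {
                    print("Submitting: \(draft)")
                    return .handled
                }
                return .ignored
            }
    }
}
```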
if/else and switch can now be used as expressions – this simplifies initialization and other conditionals.
Improved error messages and more directly aligned to where the problem is. (This will come in handy in SwiftUI)
Swift generics preserve type identity, and this has been extended to handle multiple argument types (parameter packs), so you are no longer limited in the number of arguments that can be passed. Use <each Result> (sketch below).
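A hedged sketch of the Swift 5.9 parameter-pack syntax (the function name is illustrative):

```swift
// Hedged sketch of parameter packs: one generic function that accepts any
// number of differently-typed arguments and preserves each argument's type.
func makeTuple<each Result>(_ value: repeat each Result) -> (repeat each Result) {
    (repeat each value)
}

let mixed = makeTuple("query", 42, true)  // inferred as (String, Int, Bool)
```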
Macro System –
Macros are APIs – you import the module that defines them.
The @Observable macro is key: it turns a class into a fully observable model (see the Expand on Swift Macros session and the sketch below).
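A minimal sketch; the Document model is illustrative:

```swift
import Observation
import SwiftUI

// Minimal sketch of the @Observable macro: views that read `title`
// update when it changes; views that merely pass `document` along do not.
@Observable
final class Document {
    var title = "Untitled"
    var text = ""
}

struct TitleView: View {
    var document: Document

    var body: some View {
        Text(document.title)   // depends only on `title`
    }
}
```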
Swift everywhere
Low ceremony code to make it easier to scale
Future of Foundation – starting a rewrite of Foundation for Apple and non-Apple platforms. This moves the implementation of Foundation to Swift.
Drives major performance updates across calendar (20%), date formatting (150%), and JSON coding (200-500%)
New opt-in capabilities focus on code ownership; check them out on the Swift forums.
You can now interoperate between C++ and Swift. A workgroup on the Swift forums is driving this.
CMake now supports Swift and can be used to mix and match C++ and Swift build files; there is a sample repo to help you get started.
Actors and Concurrency
This section explained synchronization and how actors and tasks behave.
Improved asset handling to allow it to be supported in code completion and renames
Strings for internationalization will now show if they are stale (that’s cool)
Improved documentation styling and a new assistant which shows the documentation previews.
Check out Create Rich Documentation with DocC
Support for Swift macros – they are used in the standard library, Foundation, and SwiftData – learn more about @Observable, #Predicate, and @Model
Cmd-Shift-A allows you to access any menu options quickly.
To expand a macro, use Cmd-Shift-A → Expand Macro.
(See the Expand on Swift Macros and Write Swift Macros sessions.)
Navigating
Bookmarks navigator (in the left panel) to get to things you've flagged.
Great for Find Queries
You can do “Conforming Types” queries in find panel
Sharing
Changes navigator and commit editor – allow you to view modifications in a single scrolling view.
Staging bar allows for easy unstaging
Testing
Updated test navigator (rewritten in Swift), 45% faster.
Can filter tests by result types, etc.
Insights on patterns of results
UI test playback is supported with video captures via the test details view.
Debugging
OSLog integration within Xcode: the Xcode console now allows filtering, shows severity via color, and lets you jump to the line of code from a log entry (sketch below).
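A minimal sketch of the Logger usage the new console understands (the subsystem and category strings are illustrative):

```swift
import OSLog

// Minimal sketch of structured logging: the new Xcode console can filter
// on subsystem/category, color by severity, and jump back to this source line.
let logger = Logger(subsystem: "com.example.myapp", category: "sync")

func syncDidFail(_ error: Error) {
    logger.error("Sync failed: \(error.localizedDescription, privacy: .public)")
}
```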
Distributing
Xcode Cloud handles versioning, signing, and distribution; adds TestFlight test details to provide information to your testers; and will also notarize your Mac apps.
XCFrameworks now have signature verification, which shows who produced and signed the framework.
Privacy manifests allow for an improved privacy report for your app.
Most common distribution settings are bundled with Xcode.
Time to get into the details… OK, this is just a post of my raw notes, no real analysis. I am looking forward to starting all the sessions tomorrow. But tonight I still have to watch the Design Awards.
Darin Adler – VP of Internet Technologies and Privacy
Language, Frameworks, Tools, and Services come together to make a Platform
Over 300 frameworks on Apple’s products.
Interesting to see Apollo Reddit app on their main screen
Swift and SwiftUI
Holly Borla – Engineering Manager
New kind of API – Swift Macros, i.e. annotations
Generates code in Xcode
You can expand Macros to see what they will do.
All macros provide custom feedback in Xcode
Attached macros allow you to add functionality to your existing code, like adding @AddAsync to a function. They can be expanded while debugging so you can see what they do.
Many of the new APIs will use Macros
Can now do bidirectional interoperability with C++
SwiftUI – updates
New support for pie charts
Expanded MapKit support
Animation updates
Auto transfer gesture velocity to animation
.spring(duration:bounce:)
Animation in Symbols are now supported
Animation Phase
Full support for Key Framing
Data flow is getting simpler – you will only have to deal with @State and @Environment.
You can use @Observable on a class (as an annotation); it sets everything up for you, and you don't need @State for it.
Core Data gives way to SwiftData.
Uses the @Model macro and @Query annotations.
Update your widget with the same @Query so it can surface the same data.
App Experiences — Johnathan
WidgetKit
Make your app more glanceable and usable.
Minor updates make widgets available in the new iPhone StandBy mode
Available on iPad and the Mac desktop
New Widget Architecture allows for continuity
Identify your background and padding
Add buttons or toggles for interactivity in your widget (sketch below)
Can see the widget timeline in Xcode Previews… (this may help me fix my own app's widget issue)
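A hedged sketch combining the marked background with an interactive button; RefreshIntent and the widget content are illustrative:

```swift
import SwiftUI
import WidgetKit
import AppIntents

// Hedged sketch: a marked container background (so StandBy/desktop can
// restyle or remove it) plus an interactive button driven by an AppIntent.
struct RefreshIntent: AppIntent {
    static var title: LocalizedStringResource = "Refresh"
    func perform() async throws -> some IntentResult { .result() }
}

struct CaffeineWidgetView: View {
    var body: some View {
        VStack {
            Text("82 mg")
            Button(intent: RefreshIntent()) {
                Label("Refresh", systemImage: "arrow.clockwise")
            }
        }
        .containerBackground(for: .widget) {
            Color.cyan.gradient
        }
    }
}
```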
AppIntents
If you wrap your intent in an App Shortcut it will show up next to your app in the Shortcuts app (sketch below)
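A hedged sketch of wrapping an intent in an App Shortcut (the names and phrases are illustrative):

```swift
import AppIntents

// Hedged sketch: an AppShortcutsProvider surfaces the intent next to the
// app in the Shortcuts app, with no user setup required.
struct OpenJournalIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Journal"
    func perform() async throws -> some IntentResult { .result() }
}

struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenJournalIntent(),
            phrases: ["Open my journal in \(.applicationName)"],
            shortTitle: "Open Journal",
            systemImageName: "book"
        )
    }
}
```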
TipKit
Feature discovery in your app – allows you to educate your users.
You can create simple targeting rules for when a tip shows for a piece of functionality.
AirDrop
Tap devices together from the share sheet and data passes between them.
Hardware Features
Gaming
Easier than ever to port games to the Mac.
It evaluates your existing game via an emulation environment.
Convert and compile shaders via the Metal shader converter (supported in Xcode and on Windows).
MetalFX allows for straightforward porting of code.
Camera
AVCapture
Zero Shutter Lag, Overlapping Captures, Deferred Processing (move Deep Fusion into the background)
Later this year – Volume Shutter buttons
HDR-capable displays will improve how photos are presented. There is no industry standard for HDR photos, so Apple has been working with the industry to create an ISO HDR standard for photos.
Video Conferencing
Reactions etc. are added in the camera pipeline by default, but your app can detect when they are being used and update accordingly.
ScreenCaptureKit allows for better screen sharing (share just the two apps you want).
Added external camera support with iPad and tvOS
Continuity Camera can be used easily in code.
WatchOS
containerBackground, pagination, ToolbarItem, NavigationSplitView, and the new spring animations – if you are using SwiftUI, it is automatic for you.
New custom workout API
New core motion API – for swing analysis (like golf or tennis)
Values
Chris Fleizach – Sr. Manager, Accessibility
1.3 billion people with disabilities around the world.
Added ability to detect doors in magnifier app.
For those sensitive to animation and flashing lights:
Pause Animated Images – stops animation in animated GIFs
Dim Flashing Lights – will darken the screen automatically when flashing lights occur (automatic if you use AVFoundation) – Apple open-sourced this algorithm
visionOS has tons of accessibility features built in from the start
Katie – Privacy
Privacy prompt improvements, like add-only permission for Calendar; Photos has a new picker.
App Privacy – help users to show how you protect their data.
For third-party SDKs – privacy manifests, which will be combined into a summary for the app.
Supply chain signatures for third-party SDKs validate that they were signed correctly.
Communication Safety
This was added to messages in iOS15
It is now available across the entire platform via the Sensitive Content Analysis framework, which adds blurring too.
Chris Markiewicz – App Store
Merchandising UI for improved exposure in StoreKit
Can add subscription store view easily and it works across all platforms.
Will automatically determine version to show the user
SKAdNetwork – for measuring downloads based on Advertising, can now measure reengagement.
Tools
Ken Orr – Sr. Mgr Xcode and Swift Playgrounds
Xcode –
Code completion uses surrounding code for prioritization and will also automatically define symbols for your assets.
Previews – using swift Macros
Works across all UI Frameworks
Git Staging is integrated in Xcode
Can see unpushed commits
Testing –
Redesign of test report, includes video recording of test results with navigation timeline.
Can show accessibility frames
Xcode cloud
Improved workflows
Added tester notes for test flight
Linker speed increased
Deep dive into visionOS
Michael – VP
Fundamentals – apps create:
Windows
Volumes
Spaces
A dedicated Full Space
SwiftUI and UIKit for UI
RealityKit for visuals and spatial audio
ARKit for objects
With SwiftUI you can add depth or a 3D object for layering; visionOS apps can then separate those depth layers.
Using .offset(z: value)
Creating a Volume in SwiftUI
And SwiftUI renders through RealityKit
Ornaments are for controls and toolbars
For full scenes – you want to use RealityKit
Automatically adjusts to lighting conditions
Can also create portals into 3D Scenes
Dynamic Foveation – uses eye tracking to focus processing power on the area the user is actually looking
IBL – Image Based Lighting object for customizing lighting effects
MaterialX is the shader format for environments
Can add attachments to 3D objects
ARKit – allows objects to interact with the real world
Always running – persistence and world mapping are automatically handled for you
Plane Estimation
Scene Reconstruction
Image Anchoring
World Tracking
Skeletal Hand Tracking
Accessibility
People with physical or motor disabilities can use their eyes or voice
Can use finger, arm or head for selection
Supports VoiceOver information
Dynamic Type, Reduced Transparency and Accessibility UX features are available
These Busca-Aegeria – development tools
Xcode has it all in it.
Simulator has multiple scenes for lighting
Mac Virtual Display
Reality Composer Pro – to import and organize 3D Objects and integrates in Xcode for development workflow
Jason – Tour
Demo of Reality Composer Pro
TestFlight will be available on the device from the start
Unity – using their tools on top of RealityKit
They can coexist in the Shared Space
Ralph from Unity
Can bring over Unity work and reimagine it for Vision Pro
Demo of What the Golf
Additional Features
Goals – Use Features and Preserve Privacy
Virtual sound should behave as if it were real in the room; there is real-time spatial audio rendering.
User Input is private by design.
Sharing and collaboration – you can use SharePlay to share any window on a FaceTime call.
Personas are in 3D for other Vision Pro users…
Spatial Personas will show them in the same space (outside of the window).
Will be available as a developer preview later this year
What’s next
Start designing, developing, and testing; the Vision Pro dev kit will be available later this month.
To see how your app works, there will be Vision Pro labs – in the US in Cupertino, and in Germany in München.
Good morning! 15th Anniversary of the App Store, which changed everything. Nice to start the discussion with Developers. Time for some breakthrough experiences.
Over to John to talk about the Mac. (Interior shot)
15-inch MacBook Air (nice to see a female exec with the details) – 18-hour battery life! Definitely a nice machine for the people. Nice to show copy and paste with the phone. Starting at $1299. (This may be the one to replace Susie's machine. The screen size and even the lowest spec would blow away her current machine.)
Pro machine update
Mac Studio – used at NBC for SNL. Getting an update (Jennifer Munn – Dir. Product Engineering) – getting the M2 Max and adding the M2 Ultra: 24-core CPU, 76-core GPU, 32-core Neural Engine, and 192GB of unified memory. Can support six Pro Display XDRs.
Mac Pro with Apple silicon – adds PCI expansion in the Mac Pro case. All are based on the M2 Ultra, with the equivalent of seven Afterburner cards built in.
All machines are orderable today and will ship next week. This completes the transition to Apple silicon.
Software platform time… (Craig)
iOS
Updates to Phone, FaceTime, and Messages – personalized contact posters… so people who get your call see the screen you want to present.
It's like personalizing your Lock Screen.
This is using CallKit
Live Voicemail – should you answer or not? Live transcription in real time to decide if you want to answer the phone.
When you call someone on FaceTime you can leave a video message.
Messages –
Better search filters to add people or terms to find data
Jump up to the first message you haven’t seen
Swipe to reply inline
Transcribed audio messages
Inline location.
Check-In to see that people made it home okay, with shared data – End to End encrypted.
New Stickers
Add effects
Tap and rotate
And they are system wide
AirDrop updates
NameDrop on phone, watch, etc. to hand off contact information
Bring devices close together – can share pictures, share play, etc.
Keyboard and Dictation improvements.
On-device machine learning improves autocorrect, using a transformer language model. (Notice they say "transformer language model" instead of "large language model".)
Adept Ullal – special moments (new feature)
Journal (app) – coming late this year. There is a new Suggestions API to help in the Journal app, also usable for developers. On device processing, E2E encryption and ability to “lock” your journal.
StandBy – clock, the power of widgets, pictures, and Smart Stacks.
Now you can just say "Siri" instead of "Hey Siri".
Offline maps
Improved People albums, and pets – because they are people too
iPadOS 17
Final Cut and Logic Pro on iPad (just last month)
Widgets on the Lock Screen, and they are now interactive!
Love the Astronomy screen for iPad
Live activities on Lock Screen too
Health on iPad
PDF improvements
Auto recognize fields in PDF
Signatures, and reply back in Mail as a completed form
Notes app will auto-expand PDFs for viewing and annotation with Apple Pencil – live collaboration!
Stage Manager improvements: positioning and sizing, and using the camera in an external monitor
Games new and coming
macOS
The Name is – Sonoma, yes it is about wine
New screen savers
Widgets – available somewhere new – i.e. on the desktop … widgets are faded or tinted based on what you are doing.
Continuity will allow widget handoff from iPhone to Mac
Gaming (macOS)
Metal 3 – huge performance
Game Mode – prioritizes the CPU and GPU for the game being played.
Game Porting Toolkit – to see how to make conversion simpler for shaders and graphics code
Death Stranding game coming to Mac later this year.
Video conferencing enhancements – Presenter Overlay, reactions overlay
Safari
Webkit improvements
Privacy improvements – locks private browsing windows, blocks trackers, and removes tracking parameters from URLs
Sharing passkeys and passwords for your group. (E2E encrypted)
Profiles – set up work, personal, etc.; these can be tied to Focus modes.
Web apps – created from a web page; just use Add to Dock from the File menu.
AirPods updates
Adaptive Audio – combines noise cancellation and pass-through, dynamically blending to match the conditions of your surroundings.
Personalized volume to give best media experience in the moment
Conversational awareness – start talking and it turns down the music and focuses on the voices in front of you.
Calls on the go, will leverage adaptive audio
Improved automatic switching
AirPlay updates
On-device intelligence will learn which device you want to share to.
AirPlay support in hotels – share via AirPlay to the TV in your hotel room.
Apple Music / CarPlay – SharePlay in the car.
tvOS and Apple TV
New Control Center.
Use your iPhone to find the remote.
Photos to the TV easily
FaceTime on the TV (uses continuity camera)
FaceTime API and Continuity API for developers
watchOS 10 Features
Comprehensive app redesign across the platform
Turn crown for widgets using Smart Stack
Widgets can hold complications
World clock app was updated
Activity app redesigned
Added Cycling
Connect to Bluetooth cycling computers
In watchOS 10, cycling becomes a full-display Live Activity on your phone.
Added Hiking
New waypoint tracking (when did you last have cell coverage)
Also added topographic maps and elevation details
Two new watch faces – Palette and Snoopy & Woodstock
Health improvements on Apple Watch
Mental health – reflect on your state of mind to improve wellness, in the Mindfulness app
Can also do this on iPhone
Assess current risk of depression or anxiety
Vision health – Myopia
80-120 minutes a day outside in daylight (measured using the ambient light sensor)
Screen distance – uses the TrueDepth camera to measure if the device is too close
All info is encrypted on device and only unlockable with your lock code, passcode, etc.
Back to Craig
Lots of new APIs across all platforms, 175 sessions
Back to Tim
One more thing!!! Years in the making – AR is a profound technology
Vision Pro Gallery
Apple Vision Pro! – A new kind of computer
Eyes, hands, and voice control
Spatial computing
(Alan Dye)
Home view looks like watch
Environments transform the space for you
This is the holodeck
Subtle gestures
“EyeSight” – the pass-through of your eyes
Always in sync with your devices – via iCloud
Change focus by turning your head
3D objects – pull one out and look at it from any angle
“You have the room to do it all!”
Works with Magic Keyboard and trackpad. Look at your Mac and bring it into view.
Remote collaboration and integration with FaceTime and Siri… you can now put people around you (life-sized).
At home –
The power of spatial computing: capture and experience photos and videos in new ways
Panoramas let you feel like you are standing where you took them
First 3D camera! Allows you to capture 3D video
Movie-theater experience with spatial audio.
Game controller support, 100 Games available on day 1.
Disney's Bob Iger on stage – this will allow them to bring fans closer to the characters they love
Partnering the greatest storytelling company in the world with the greatest technology company
Disney+ will be available on day 1
Richard Howarth to talk about the design
3D-formed and laminated glass.
Quiet thermal design, textile parts, flexes and conforms to your face. Don't need AirPods; the band is changeable, with micro-adjustments. Custom optical inserts.
All-day use when plugged in, two hours on battery.
Mike – Technology
Advanced real-time sensor system, spatial audio, and incredible resolution
64 pixels in the space of one iPhone pixel; 23 million pixels (more than a 4K TV for each eye)
Three-element lens… true 4K and UHD range.
Audio –
Integrated audio pods allows for ambient sound and matches it to your room (using audio ray tracing)
Eye tracking
You don’t need hand controllers.
The foundation is Apple silicon: an M2 plus a new chip called R1
Virtually eliminates lag – 12 ms, 8 times faster than the blink of an eye
FaceTime creates an authentic representation of you: you enroll your face to create a digital Persona that matches you
visionOS – designed from the ground up for spatial computing
Susan Prescott – Developers
Reimagine existing apps and design new ones
A few developers played with it
The Heartbeat to give a new education view
JigSpace –
Stagent from PTC
Sky Guide – this looks way cool!
Reality Composer Pro
Same frameworks from iPad and iOS are available – so available at launch
Unity will be brought to visionOS
All to be covered in the State of the Union
Mike –
Privacy and Security
Optic ID, encrypted, never leaves the device, etc.
Where you look is not communicated to apps, only where you touch with your hand
Tomorrow is the start of WWDC 2023, and the rumors are all coalescing around the idea of the Apple Headset. It’s been 9 years since I tried on, and ultimately bought, Google Glass, so the promise of an Apple headset is more than exciting to me.
While people are complaining about a rumored price of $3,000, I don't consider that to be unreasonable for a developer kit. The ability to experiment with a new piece of Apple tech is pretty exciting to me. I just hope that they make it available quickly, like they did years ago when the Mac mini dev kit was provided for developers (even though they indicated that the dev kit was running an A12 processor).
But there is a whole lot more to look forward to at this year's WWDC: a revamp of watchOS, iPadOS Lock Screen updates, more advancements with SwiftUI, and hopefully improvements between CoreData and CloudKit. Yeah, I know, that last one is my pipe dream. I've been hoping for years now that Apple would take a SwiftUI approach to CoreData, making it easier to use across local and remote data synchronization.
As has been the case for the last few years, Apple is not holding WWDC in person (as such). They are continuing to stream the sessions from Tuesday to Friday, releasing content at 8am PDT each day. To that end, I am taking the week off from my day job and time-shifting to California hours, spending 8 or more hours a day trying to get through as much content as I can. There is some benefit to this, as I can back up and replay sessions to learn the content better; but I find that I learned so much more when WWDC was in person and I could talk to others between sessions, cementing the ideas in my long-term memory.
I will be writing a daily blog on the sessions I find most interesting. Of course, on Monday I will be watching the Keynote, State of the Union, and Apple Design Awards, and I hope to provide a summary of salient information on each of those.
Last night, while I was trying to doze off, I started thinking through the expected announcement of Apple's VR headset. I had read the rumor that Apple was going to give people the ability to see the user's eyes by having a screen on the outside. This way, when you are viewing through the headset in "pass-through" mode, you can look eye to eye.
I've always said that Apple lets you know what is coming if you just look closely enough. This got me thinking about the FaceTime change a few years back that dynamically adjusted the video so that the eyes always looked at the camera.
This feature now makes sense… If I am in the VR space and look up the popup contact card that tells me whom I am talking to, I don't want them to notice that I am not looking at them.