Meet Safari for Spatial Computing

Should look familiar – it is the same WebKit underneath.

Touch the page to follow a link, or look at it and tap your fingers together.

The tab overview has been completely redesigned.

You can arrange tab views to surround you.

Best practices – 

  • Websites work out of the box using responsive design
  • Use CSS viewport units
  • Resize based on media and container queries
  • Use SVG for UI elements
  • For bitmap images, use devicePixelRatio to load the right resolution

Natural Interactions

  • The main model is eyes plus hand gestures: just look around and pinch to select (the system captures where your eyes were looking)
  • Touch the page with your index finger – that is a pointerdown event. For most interactions you don't need to worry about this low-level detail
  • Media queries – it reports like a touch screen: a coarse pointer that doesn't support hovering
  • Elements will highlight based on eye tracking, so the system knows what to select
  • Make sure to use cursor: pointer; to show that something is interactive.
  • Test your highlight regions in the xrOS simulator

Platform optimization

  • The screen is a bit more abstract, much like pixels became when Retina displays were introduced.
  • Full-screen windows can be resized on the platform, so they can be larger than expected.
  • With requestAnimationFrame, always measure the elapsed time between frames

Integrating with 3D Content

  • Adjust your website to use Quick Look so that you can pull 3D objects out of the website and anchor them in real space.
  • All you need is a link to the USDZ model and a preview image
  • It will leverage the advanced lighting and rendering from ARKit
  • A new HTML element is being added: <model interactive> <source src="…"> </model>
  • A WebXR dev preview is available to learn more
  • New settings screen in Safari on the platform to enable various 3D features and WebXR

Discover Observation in SwiftUI

Lets you define models in standard Swift types.

What is Observation?

  • New feature for tracking changes to properties
  • Macros automatically generate the code needed to track and react to any changes that occur.
  • When a property used by the UI changes, the UI updates
  • Adds performance improvements, since the UI only updates when it needs to
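
A minimal sketch of the macro in use (the Trip type and its properties are hypothetical):

import Observation

// The @Observable macro generates the change-tracking code
// for these stored properties.
@Observable class Trip {
	var name = ""
	var isFavorite = false
}

Any view that reads name or isFavorite in its body will update when they change.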

SwiftUI Property Wrappers

  • You will still need @State if the view needs its own state – like a sheet presentation
  • @Environment allows you to propagate values globally – Observable types work well here.
  • @Bindable is lightweight – just use the $ syntax to create bindings. You will need that for things like TextField (see the sketch below).
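
A sketch of the wrappers together, reusing the hypothetical Trip model above:

import SwiftUI

struct TripEditView: View {
	@Bindable var trip: Trip                 // enables $trip.name bindings
	@State private var showingSheet = false  // view-local state still uses @State

	var body: some View {
		TextField("Name", text: $trip.name)
		Button("Details") { showingSheet = true }
			.sheet(isPresented: $showingSheet) { Text(trip.name) }
	}
}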

Advanced uses

  • You can use arrays, optionals, or any type that contains an @Observable type – tracking is based on the specific instance
  • If a computed property is not composed from stored properties, you need to tell Observation when the property is read and when it changes. You do this by creating your own getter and setter (see the sketch below); look this up for more detail.
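
A sketch of that getter/setter pattern, using the macro-generated access(keyPath:) and withMutation(keyPath:) helpers (the Schedule type is hypothetical):

import Observation

@Observable class Schedule {
	@ObservationIgnored private var _title = ""  // storage Observation should not instrument

	var title: String {
		get {
			access(keyPath: \.title)             // report the read to Observation
			return _title
		}
		set {
			withMutation(keyPath: \.title) {     // report the change to Observation
				_title = newValue
			}
		}
	}
}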

ObservableObject – how to update your code

  • The Observable macro will simplify your code
  • Take a look at your public class <name> { }: remove the ObservableObject conformance and the @Published annotations, add @Observable, then change your views – most @ObservedObject annotations can simply be deleted (see the sketch below)
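
A before/after sketch of the migration (Library is a hypothetical model):

import Observation

// Before:
// class Library: ObservableObject {
//     @Published var books: [String] = []
// }

// After:
@Observable class Library {
	var books: [String] = []
}

In the view, @ObservedObject var library: Library becomes plain var library: Library.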

Make Features Discoverable with TipKit

Create a Tip

  • Title and message – use a direct action label as the title and easy-to-remember instructions in the message
  • Don't use tips for promotional or error messages.
  • Add a TipsCenter.shared.configure() call in the init() of the @main App type.
  • Add icons and color-match with the application. You may wish to add an action button if there are related settings.
  • Set treatment and placement (see the sketch after this list).
    • Pop-over – points to an element or button to direct the user (.popoverMiniTip)
    • Inline – adjusts the app's UI so no UI is blocked.
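
A minimal sketch of a tip definition (FavoriteTip and its strings are hypothetical; the protocol requirements shown here match the shipping TipKit API):

import SwiftUI
import TipKit

struct FavoriteTip: Tip {
	var title: Text {
		Text("Save as a Favorite")        // direct action label
	}
	var message: Text? {
		Text("Tap the heart to keep this item handy.")
	}
	var image: Image? {
		Image(systemName: "heart")        // icon matched to the app
	}
}

In a view you would show it inline with TipView(FavoriteTip()) or attach it as a pop-over to the element it points at.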

Eligibility rules

  • Don't be spammy or irrelevant. If the feature has already been discovered, don't spam users with the tip. If users use your app very infrequently, you may not want to show tips either.
  • The rules are (see the sketch after this list):
    • Parameter-based rules – persistent, based on state such as a Boolean
    • Event-based rules – define an action that must have occurred before the tip is shown.
    • Donate the event when the action happens
    • You can create custom events with associated types and use them in rules
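
A sketch of both rule types (the tip, event, and parameter names are hypothetical):

import SwiftUI
import TipKit

struct TranslateTip: Tip {
	// Event that must be donated before the tip can show
	static let didOpenConversation = Tips.Event(id: "didOpenConversation")

	// Persistent parameter the rule can test
	@Parameter static var isPremiumUser: Bool = false

	var title: Text { Text("Translate This Conversation") }

	var rules: [Rule] {
		#Rule(Self.$isPremiumUser) { $0 == true }
		#Rule(Self.didOpenConversation) { $0.donations.count >= 3 }
	}
}

Donate the event where the action happens, e.g. TranslateTip.didOpenConversation.donate().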


Display and dismissal 

  • Don't show a tip forever, and don't show multiple at once
  • You can set a DisplayFrequency (.daily, .hourly, or a custom duration); you can also use .immediate
  • You can use .ignoresDisplayFrequency(true) for a specific tip
  • If the feature gets used, the tip should be dismissed: call the invalidate function with the user-performed-action reason (see the sketch after this list)
  • You can also set a .maxDisplayCount to avoid annoying the user with the same tip over and over.
  • Tip state can be synced via iCloud, so the same tip isn't shown again on a different device
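
A sketch of those options using the beta names from the session (later seeds renamed some of these, e.g. Tips.configure and .actionPerformed; favoriteTip is hypothetical):

import TipKit

// At app startup: show at most one eligible tip per day.
try? TipsCenter.shared.configure {
	DisplayFrequency(.daily)
}

// When the user performs the feature the tip was teaching:
favoriteTip.invalidate(reason: .userPerformedAction)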

Test tips

  • You can work around eligibility rules while testing. Use TipsCenter.showAllTips(), .showTips with specific tip IDs, .hideTips to prevent them, or .hideAllTips()
  • .resetDatastore() can be used to clean it up
  • You can also use com.apple.TipKit.showAllTips in your launch arguments (or any of the other commands)

Running your iPad and iPhone apps in the Shared Space

The majority of apps will run fine on visionOS (check it out in the simulator)

Built-in Behaviors

  • Built on iOS foundations – more than likely your app will just work.
  • iPhone and iPad apps appear in light mode; iPad apps run in landscape, iPhone-only apps in portrait. If you support rotation you will see a rotation button. Minimum and maximum sizes will cause a bounce when resizing.
  • People can use a trackpad or game controller with your app, along with touch.
  • Local authentication is forwarded to Optic ID

Functional Differences

  • There's no notion of rotating the entire device, so you may define the default orientation by adding UIPreferredDefaultInterfaceOrientation to your Info.plist
  • UISupportedInterfaceOrientations – decides whether you get a rotation button
  • UIRequiredDeviceCapabilities – decides whether your app can be offered on the device.
  • Gestures – work differently
    • Eyes & hands, with a maximum of two simultaneous inputs (each hand acts as a single touch)
  • ARKit – ARSession
    • Significantly updated to handle the new architecture and privacy model.
    • Existing AR views won't work on this device; you will have to do some work here.
  • Location support – same as iPad
  • Look to Dictate – lets you quickly dictate around the screen. You will get a microphone in search fields, for example.
    • The API is available but disabled by default for iOS and iPad apps
    • Use .searchDictationBehavior to enable it in SwiftUI
  • Use availability checks to make sure an API is available before you try to use it.
  • xrOS Device (Designed for iPad) is a new target

Choose your experience

  • Most apps don't need changes at all; you can rebuild against the xrOS SDK, but it is not required.
  • SpriteKit and Storyboards are only in iOS
  • You get additional immersion with the xrOS SDK – Volumes, etc.
  • You will get the system look and feel when using the xrOS SDK in the simulator. Ornaments anchor on the sides and bottom.

Check out Meet SwiftUI for Spatial Computing to see how apps will look and learn what you need.

Bring Widgets to New Places

New locations for widgets – first introduced in iOS 14, added to the Lock Screen in iOS 16

The new locations are the Desktop on Mac, the iPad Lock Screen, StandBy mode on iPhone, and the Smart Stack on watchOS

Transition to content margins

  • Padding is automatically applied to prevent content from getting too close to the edges. This replaces safe areas; .contentMarginsDisabled() works like ignoring the safe area (see the sketch below).
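
A sketch of opting out on a widget configuration (PhotoWidget, its provider, and the kind string are placeholders):

import SwiftUI
import WidgetKit

struct PhotoEntry: TimelineEntry {
	let date: Date
}

struct PhotoProvider: TimelineProvider {
	func placeholder(in context: Context) -> PhotoEntry { PhotoEntry(date: .now) }
	func getSnapshot(in context: Context, completion: @escaping (PhotoEntry) -> Void) {
		completion(PhotoEntry(date: .now))
	}
	func getTimeline(in context: Context, completion: @escaping (Timeline<PhotoEntry>) -> Void) {
		completion(Timeline(entries: [PhotoEntry(date: .now)], policy: .never))
	}
}

struct PhotoWidget: Widget {
	var body: some WidgetConfiguration {
		StaticConfiguration(kind: "PhotoWidget", provider: PhotoProvider()) { entry in
			Color.blue   // placeholder edge-to-edge content
		}
		.contentMarginsDisabled()   // draw into the margins, like ignoring the safe area
	}
}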

Add a removable background to widgets

  • You need a removable background for the Lock Screen, etc.
  • Add containerBackground(for: .widget) { /* your original background color here in the closure */ }
  • The Smart Stack will use this approach.
  • If you have no background to remove (or one that doesn't make sense to remove), use .containerBackgroundRemovable(false) – see the sketch after this list
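
A sketch of both modifiers together (the view and color are illustrative):

import SwiftUI
import WidgetKit

struct CaffeineWidgetView: View {
	var body: some View {
		VStack {
			Text("200 mg")
		}
		.containerBackground(for: .widget) {
			Color.cyan        // the background the system may remove
		}
	}
}

// On the configuration, opt out when removal makes no sense:
// StaticConfiguration(...) { ... }
//     .containerBackgroundRemovable(false)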

Dynamically adjust layout

  • Change the layout when the background is removed; this is good for reading from far away in StandBy mode (see the sketch below)
  • It also blends better on the Lock Screen.
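
A sketch using the environment value that reports whether the background is shown (the two layouts are placeholders):

import SwiftUI
import WidgetKit

struct AdaptiveWidgetView: View {
	@Environment(\.showsWidgetContainerBackground) var showsBackground

	var body: some View {
		if showsBackground {
			Text("Detailed layout")        // normal Home Screen look
		} else {
			Text("Big, glanceable layout") // StandBy / Lock Screen look
				.font(.largeTitle)
		}
	}
}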

Prepare for vibrant rendering mode

  • This is for the Lock Screen
  • Use the .widgetRenderingMode environment value to see which mode you are in (see the sketch below).
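
A sketch reading the rendering mode (the image names and branches are illustrative):

import SwiftUI
import WidgetKit

struct VibrantAwareView: View {
	@Environment(\.widgetRenderingMode) var renderingMode

	var body: some View {
		if renderingMode == .vibrant {
			Image("logo-monochrome")  // desaturated content for vibrant rendering
		} else {
			Image("logo-color")
		}
	}
}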

Meet ActivityKit

This is all about Live Activities

Overview

  • Live Activities are a glanceable way of keeping track of a task or event. They have a defined start and end time.
  • Updates can come either from the app in the background or via push notifications
  • The Dynamic Island can display two Live Activities at a time. They will show their minimal presentation.
  • You can press one to expand it for more information
  • Now supported in StandBy mode and on iPad
  • It leverages the widget enhancements to add buttons or toggles.
  • Use the ActivityKit framework – it is declarative, very similar to building Home Screen widgets. Live Activities are only available after a discrete user action and are user-moderated, like notifications. They must support both the Lock Screen and the Dynamic Island.

Lifecycle

  • The app goes through different stages:
    • Request
    • Update
    • Observe activity state
    • End
  • Request – the app must be in the foreground; configure the initial content and configuration.
    • Set up the content with its information and state details (like when it will become stale). A relevance score is also optional.
    • Define pushType for remote versus local updates
  • Update
    • When key things happen in the application you can send an update; you can also display an alert if the event is critical.
  • Activity state changes can happen at any time during the Live Activity's lifecycle; active, ended, dismissed, and stale are the possible values. Based on the state changes, you can do cleanup and other activities.
  • End
    • Create the final content and provide it to the activity
    • Choose the dismissal policy (see the sketch after this list)
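
A minimal sketch of that flow; DeliveryAttributes and its fields are hypothetical:

import ActivityKit

struct DeliveryAttributes: ActivityAttributes {
	struct ContentState: Codable, Hashable {
		var etaMinutes: Int
	}
	var orderNumber: Int
}

func runDeliveryActivity() async throws {
	// Request (the app must be in the foreground)
	let activity = try Activity.request(
		attributes: DeliveryAttributes(orderNumber: 42),
		content: .init(state: .init(etaMinutes: 30), staleDate: nil),
		pushType: nil)   // nil = local updates only

	// Update when something meaningful happens in the app
	await activity.update(.init(state: .init(etaMinutes: 10), staleDate: nil))

	// End with final content and a dismissal policy
	await activity.end(.init(state: .init(etaMinutes: 0), staleDate: nil),
	                   dismissalPolicy: .default)
}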

Building Live Activity UI

  • You need a widget configuration to describe the Live Activity (see the sketch after this list).
  • You need to define all of the Dynamic Island presentations
    • Add compactLeading, compactTrailing, expanded, and minimal views.
  • When more than one Live Activity is running, the system decides which apps are shown in the Dynamic Island.
  • The expanded presentation has multiple areas defined by the system: .leading, .trailing, and .bottom.
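
A sketch of the configuration shape, reusing the hypothetical DeliveryAttributes from the previous sketch:

import ActivityKit
import SwiftUI
import WidgetKit

struct DeliveryLiveActivity: Widget {
	var body: some WidgetConfiguration {
		ActivityConfiguration(for: DeliveryAttributes.self) { context in
			// Lock Screen / StandBy presentation
			Text("ETA: \(context.state.etaMinutes) min")
		} dynamicIsland: { context in
			DynamicIsland {
				DynamicIslandExpandedRegion(.leading) { Text("Order") }
				DynamicIslandExpandedRegion(.trailing) { Text("\(context.state.etaMinutes) min") }
				DynamicIslandExpandedRegion(.bottom) { Text("On the way") }
			} compactLeading: {
				Image(systemName: "box.truck")
			} compactTrailing: {
				Text("\(context.state.etaMinutes)")
			} minimal: {
				Image(systemName: "box.truck")
			}
		}
	}
}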

Design and Build apps for watchOS 10

The goals are to surface timely information, communicate at a glance, take advantage of the screen, and make things consistent.

Design Principles

  • Timely and relevant: in the moment
  • Focused and highly specialized for brief interactions
  • Unique to Apple Watch – utilize the Digital Crown; this is becoming a new focus item for Apple (it should be backed up by touch)
  • Consider the journey people will take from the moment they raise their wrist – the Smart Stack is an example of this.

Navigation

  • NavigationSplitView updates
    • Borrows the two-column layout from Weather on iPad – the list is now a button in the top leading corner – great if you have a source list with detail views. Start with the detail view when the user opens your app. Try it with no title
    • The transition between the detail view and the source list is animated. Great for showing comparative data. (Should add number of cards in my view for card app.)
    • Same API on the watch as on other platforms.
  • TabView updates
    • Now designed to be vertical. Content can be expanded inline so that you stay on a single screen by default but still support long localized text, etc.
    • Activity is a great example of this. (Uses .tabViewStyle(.verticalPage); see the sketch after this list)
    • Combine it with .containerBackground(color, for: .tabView) for seamless blending
    • A TabView will automatically expand a list if you add it to the grouping.
    • Animations scale information as it moves to new locations. You can now drive animation based on the TabView; you can use .matchedGeometryEffect to do so.
  • NavigationStack (core paradigm)
    • Use a NavigationStack if the other two don't address your workflow.
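
A sketch of the vertical TabView pattern (the page content and color are placeholders):

import SwiftUI

struct SessionTabs: View {
	var body: some View {
		TabView {
			Text("Summary")   // first vertical page
			Text("Detail")    // second vertical page
		}
		.tabViewStyle(.verticalPage)                                // watchOS 10 vertical paging
		.containerBackground(Color.indigo.gradient, for: .tabView)  // seamless blending
	}
}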

Layout System Updates

  • Dial based views
    • Great for dense info delivered at a glance, you can also provide 4 different controls 
  • Infographic Views
  • List views
    • Scroll through content to find what you need
  • If you use these layout patterns, then toolbar placement and the like don't need extra padding. It just works.

Color and Materials

  • Four background materials – Ultra Thin, Thin, Regular and Thick. 
  • Full screen background gradient 
  • You can use Primary, Secondary, Tertiary, Quaternary prominence layers – with vibrant versions to ensure legibility 

Meet watchOS 10

Widgets are surfaced automatically.

Be focused and specialized

Design for glances

Based on the shape of the display, the new design language has three foundational patterns:

  • Dial
  • Infographic
  • List

Use the background to convey additional information – this is key to the migration of the widgets. This is backgroundContent – not just a visual flourish

New material applications (this is to align with xrOS/visionOS) –

  • Translucency establishes hierarchy

New navigation patterns – 

  • Use the Crown to do things without covering the screen.
  • Vertical pagination is now used for multiple tabs in an app. Use this instead of horizontal pagination.
  • Use object permanence of items across pages
  • If you extend beyond the height of the display 
  • Source List is a new pattern. See World Clock to see what this means

Meet SwiftData

Code-focused, using macros, to work naturally with SwiftUI

Using the model Macro

  • @Model – a new model schema from Swift code
  • Simply decorate the class with @Model
  • Attributes are inferred from properties
  • Relationships are inferred from the model
  • You can use @Attribute or @Relationship on your properties to drive uniqueness and delete behavior, for example:
    • @Attribute(.unique) var name: String
    • @Relationship(.cascade) var bucketList: [BucketListItem]? = []
  • Check out the Model your schema with SwiftData session (and the sketch below)
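
A sketch of a complete model using the beta syntax above (Trip and BucketListItem follow the session's example names):

import SwiftData

@Model
class BucketListItem {
	var title: String
	init(title: String) { self.title = title }
}

@Model
class Trip {
	@Attribute(.unique) var name: String        // enforce uniqueness
	var startDate: Date
	@Relationship(.cascade) var bucketList: [BucketListItem]? = []  // cascade deletes

	init(name: String, startDate: Date) {
		self.name = name
		self.startDate = startDate
	}
}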

Working with Data

  • Model Container
    • The persistence backend; can be customized with migration options
  • Model Context
    • Watches for changes
    • Does tracking, fetching, saving, and undoing changes.
      • It is usually accessed through the Environment.
    • Predicate and FetchDescriptor
    • Strongly typed query construction
    • Related objects can be pre-fetched
    • Basic CRUD is supported, with a simple context.save()
    • See the Dive deeper into SwiftData session (and the sketch after this list)
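
A sketch of setting up a container and fetching, assuming the hypothetical Trip model above:

import SwiftData
import SwiftUI

@main
struct TripsApp: App {
	var body: some Scene {
		WindowGroup {
			ContentView()
		}
		.modelContainer(for: Trip.self)   // sets up the persistence backend
	}
}

// Strongly typed fetching with a predicate and sort:
func upcomingTrips(in context: ModelContext) throws -> [Trip] {
	let now = Date.now
	let descriptor = FetchDescriptor<Trip>(
		predicate: #Predicate { $0.startDate > now },
		sortBy: [SortDescriptor(\.startDate)])
	return try context.fetch(descriptor)
}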

Use SwiftData with SwiftUI

  • Seamless integration  with SwiftUI
  • Automatically fetching and updating UI
  • Simple code example
import SwiftData
import SwiftUI

struct ContentView: View {
	@Query(sort: \.startDate, order: .reverse) var trips: [Trip]
	@Environment(\.modelContext) var modelContext

	var body: some View {
		NavigationStack {
			List {
				ForEach(trips) { trip in 
					// …
				}
			}
		}
	}
}
  • Since this is Observable, you don’t need to use @State – it just works.
  • Check Migrate to SwiftData

What’s new in VisionKit

This API is all about lifting subjects from images, i.e. pulling out your favorite pet from a picture to be used elsewhere 

Subject Lifting

  • With a simple long press on an image, you can share the subject. Now you can lift any subject and turn it into a sticker. This works automatically for most code from last year (see the sketch after this list).
    • Just set an appropriate interaction type; .automatic does a combination of all types
    • .imageSegmentation will only do subject lifting
    • .automaticTextOnly – same as iOS 16 behavior – text selection and data detectors
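
A sketch of configuring the interaction (the host image view is a placeholder):

import UIKit
import VisionKit

@MainActor
func configureLifting(on imageView: UIImageView) {
	let interaction = ImageAnalysisInteraction()
	interaction.preferredInteractionTypes = .automatic  // or .imageSegmentation / .automaticTextOnly
	imageView.addInteraction(interaction)
}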

Visual Look Up

  • Allows users to learn about pets, nature, landmarks, art, and media. iOS 17 adds food, products, and signs & symbols
    • This helps with laundry tag information!
    • Works in 6 languages to start.
  • Analysis is entirely on the device, including feature extraction.
    • Once you look up the object, it goes to the server for more information.
    • Add .visualLookUp to your analyzer configuration (see the sketch after this list)
    • Badges are displayed on subjects available for lookup, etc.
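
A sketch of running the analyzer with Visual Look Up enabled (the interaction comes from the previous sketch):

import UIKit
import VisionKit

@MainActor
func analyze(_ image: UIImage, interaction: ImageAnalysisInteraction) async throws {
	let configuration = ImageAnalyzer.Configuration([.text, .visualLookUp])
	let analyzer = ImageAnalyzer()
	let analysis = try await analyzer.analyze(image, configuration: configuration)
	interaction.analysis = analysis   // badges appear on subjects that support lookup
}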

Data Scanner and Live Text – allow for OCR

  • Optical flow – can now do high-speed scanning; it comes for free when recognizing text.
  • Currency support
    • Find and interact with monetary values by setting textContentType to .currency in the initializer, just like email addresses or phone numbers (see the sketch after this list)
    • Provides both bounds and a transcript. A code example in the video totals a receipt.
  • Live Text – more regions:
    • Can detect document structure, like list detection (iOS 16), but now can also detect tables. It will even merge cells if needed.
    • Context-aware data detectors – will pull additional information from surrounding areas, like a name next to an address
    • You could get the .transcript last year; you can now get ranges, selected text, attributed text, and identify when text changes.
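
A sketch of a currency scanner (the quality and highlighting options are illustrative):

import UIKit
import VisionKit

@MainActor
func makeCurrencyScanner() -> DataScannerViewController {
	DataScannerViewController(
		recognizedDataTypes: [.text(textContentType: .currency)],
		qualityLevel: .balanced,
		isHighlightingEnabled: true)
}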

Expanded platform support

  • All about the Mac – Catalyst support is being rolled out
  • Catalyst –
    • A simple recompile gets you Live Text, Subject Lifting, and Visual Look Up; no QR support yet (you can do it via the Vision framework)
  • Native Mac API (see the sketch after this list)
    • ImageAnalyzer
    • ImageAnalysisOverlayView – a subclass of NSView; just add it above your image content (adding it as a subview of the content view is simpler)
    • Basically the same as on iOS (except machine-readable codes are a no-op)
    • Contents rect – it needs to know the rectangle bounds of the image within the view.
    • If you use an NSImageView you can just set trackingImageView on the overlay and the system will do it for you
  • Contextual menus (Mac)
    • You can combine items in the same menu; this provides much better right-click menus based on the selected item.
    • Items are identified by tags; there is a struct that defines them. Code examples are in the session.
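
A sketch of the native Mac overlay setup (the host image view is a placeholder):

import AppKit
import VisionKit

@MainActor
func configureOverlay(on imageView: NSImageView) {
	let overlay = ImageAnalysisOverlayView()
	overlay.frame = imageView.bounds
	overlay.autoresizingMask = [.width, .height]
	overlay.preferredInteractionTypes = .automatic
	overlay.trackingImageView = imageView   // the system keeps the contents rect in sync
	imageView.addSubview(overlay)           // add the overlay above the image content
}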