Build spatial experiences with RealityKit

RealityKit was introduced in 2019.

Walks through the “Hello World” sample to explain the concepts.

RealityKit and SwiftUI

  • RealityKit lets you add 3D elements to your windows and views from SwiftUI.
  • By putting a model in a separate window using the new volumetric window style, it becomes a fully 3D object viewable from any angle. You can define a specific size (in meters), which stays consistent. An immersive space is a scene type, not just a window or view; you create it with ImmersiveSpace(id:) { RealityView }. Note that opening an immersive space is asynchronous (see the sketch after this list).
  • Sessions to watch – Meet SwiftUI for spatial computing and Take SwiftUI to the next dimension.
  • Go beyond the window with SwiftUI – takes you through the various types of spaces in more detail.
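A minimal sketch of what this looks like, assuming a placeholder space ID ("Solar") and an empty RealityView:

```swift
import SwiftUI
import RealityKit

@main
struct HelloWorldApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }

        // An immersive space is its own scene type, not a window.
        // "Solar" is a placeholder identifier.
        ImmersiveSpace(id: "Solar") {
            RealityView { content in
                // Add your entities to `content` here.
            }
        }
    }
}

struct ContentView: View {
    // Opening an immersive space is asynchronous.
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Enter") {
            Task { await openImmersiveSpace(id: "Solar") }
        }
    }
}
```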

Entities and Components

  • Entities are container objects; an entity must have components to do something. Components are things like models and transforms. (Really good description of this.)
  • Check out Explore Materials in Reality Composer Pro
  • Transforms place things in space. Properties are used for scale and motion.
  • There are functions to convert between RealityKit and SwiftUI coordinate systems
  • You can create your own components
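A minimal sketch of a custom component; SpinComponent and its speed property are hypothetical names, not from the session:

```swift
import RealityKit

// A hypothetical custom component that stores a spin speed.
struct SpinComponent: Component {
    var speed: Float = 1.0  // radians per second
}

// Register the component type once, e.g. at app launch.
SpinComponent.registerComponent()

// Attach it to an entity like any built-in component.
let entity = Entity()
entity.components.set(SpinComponent(speed: 2.0))
```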

RealityView

  • This is a SwiftUI view for placing entities in your app. You add entities to a content instance (see the sketch after this list).
  • You can connect state to component properties so that you can express the connection between SwiftUI state and the model.
    • These are observable so they only change when objects that they depend on change
  • The convert functions change between entity space and SwiftUI space.
    • This is useful for scaling, etc.
  • You can subscribe to RealityView events in SwiftUI via .subscribe(to:) to run code based on events.
  • You can also attach SwiftUI views to entities – check out Enhance your spatial computing app with RealityKit.
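A minimal sketch of a RealityView that adds an entity to its content instance; the asset name "Scene" is a placeholder:

```swift
import SwiftUI
import RealityKit

struct ModelView: View {
    var body: some View {
        RealityView { content in
            // Load a placeholder entity and add it to the content instance.
            if let entity = try? await Entity(named: "Scene") {
                content.add(entity)
            }
        }
    }
}
```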

Input, Animation, and Audio

  • You can add gestures to RealityKit views; the target entity must have an InputTargetComponent and a CollisionComponent (see the sketch after this list).
  • You can add these components via Reality Composer Pro
  • USDZ files can reference other usd files
  • The default shape for a collision component is a cube; match it as closely as you can to your object’s shape.
  • You can add a HoverEffectComponent to make your app react to where you are looking. This is done outside of the app’s process for privacy.
  • Built-in animation types: From-To-By, Orbit, and Time Sampled.
  • RealityKit sounds are spatial by default. No additional reverb is added to ambient sources; channel audio is great for background music.
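A minimal sketch of a tap gesture targeted at entities; it assumes the entities already carry the input target and collision components (for example, added in Reality Composer Pro):

```swift
import SwiftUI
import RealityKit

struct TapDemoView: View {
    var body: some View {
        RealityView { content in
            // Entities added here need an InputTargetComponent and a
            // CollisionComponent to receive input.
        }
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // value.entity is the entity that was tapped.
                    print("Tapped \(value.entity.name)")
                }
        )
    }
}
```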

Custom Systems

  • You can combine existing functionality in different ways to create components or systems.
  • Work with Reality Composer Pro content in Xcode – tells you more about this.
  • Systems are code that acts on entities and components. This lets you structure the code that implements your app’s behavior (see the sketch after this list).
  • Registering a system in your app makes it available across your app.
  • You can filter entities via an entity query so that only matching entities are affected by the system.
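A minimal sketch of a system that acts only on entities matching an entity query; it reuses the hypothetical SpinComponent from the earlier sketch:

```swift
import RealityKit

struct SpinSystem: System {
    // Only entities that have a SpinComponent match this query.
    static let query = EntityQuery(where: .has(SpinComponent.self))

    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.scene.performQuery(Self.query) {
            guard let spin = entity.components[SpinComponent.self] else { continue }
            // Rotate around the y axis based on elapsed time.
            entity.transform.rotation *= simd_quatf(
                angle: spin.speed * Float(context.deltaTime),
                axis: [0, 1, 0]
            )
        }
    }
}

// Register once, e.g. at app launch, to make it available across the app:
// SpinSystem.registerSystem()
```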

Meet Reality Composer Pro

This was a walkthrough of the tool.

It supports compositing, particle emitters, and positional audio.

I tried to follow along, but the Xcode beta did not include Reality Composer Pro (yet), and when I went to download additional tools it was not there – and Safari crashed on me.

It uses the .usda format for the project and packages it as a Swift Package.

You can use Object capture to add items into Reality Composer Pro.

Nice to see that they support WASD for movement around the scene…gamers will feel comfortable with this navigation.

Excellent discussion on Particle Emitter creation, along with performance implications.

Grouping objects into other objects is key when authoring audio, particle emitters, and objects.

There are three audio types: Spatial, Ambient, and Channel.

The Work with Reality Composer Pro content in Xcode session should be reviewed to see how to add this to your code.

The Statistics tab is needed to understand the performance implications of your model. Simplifying your model can help reduce the triangle count.

Explore materials in Reality Composer Pro should also be reviewed to understand model transitions.

Debug with Structured Logging


New debug console 

  • The message is the focus, not all the prefixes, so you can add them back in via the metadata options flag. Metadata now shows up below the message.
  • A yellow background indicates an error; a red background indicates a fault.
  • You can also press space on a single log entry to see its full details.
  • You can also use filtering to show only what you care about, and you get autocomplete in the filter field. You can also right-click a log entry to show or hide similar entries.

Live debugging

  • Demo 

LLDB Improvements 

  • Use p instead of po to get the details of an object.
  • Hard to remember all the new commands? Try dwim-print (“do what I mean” print) – you can just use p for it.
  • po will now do what you want, plus use custom object descriptions.

Tips for logging

  • OSLog is for debugging; print is only for stdio.
  • You should log any tasks being done, along with their results.
  • You get all the metadata if you use OSLog – import OSLog into the project. Define your Logger with a subsystem (your app) and a category (your function); see the sketch after this list.
  • Consider:
    • Creating multiple log handles for different components.
    • Using OSLogStore for collection in the field.
    • OSLog is a tracing facility – it can do performance measurement too.
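A minimal sketch of defining a Logger; the subsystem and category strings are placeholders:

```swift
import OSLog

// Subsystem identifies your app, category identifies the component.
let logger = Logger(subsystem: "com.example.MyApp", category: "Networking")

func fetchData() {
    logger.debug("Starting fetch")
    // ... do the work ...
    let count = 42
    logger.info("Fetch finished with \(count) items")
}
```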

Meet Safari for Spatial Computing

Should look familiar – it is the same WebKit underneath.

Touch the page to follow a link, or look at it and tap fingers together

The tab overview has been completely redesigned.

You can have tab views surround you.

Best practices – 

  • Websites work out of the box when they use responsive design.
  • Use CSS viewport units.
  • Resize based on media and container queries.
  • Use SVG for UI elements.
  • For bitmap assets, use devicePixelRatio for image loading.

Natural Interactions

  • The main model is eye and hand gestures: just look around and pinch to select (the event captures where your eyes were looking).
  • Touching the page with your index finger produces a pointerDown event. For most pages you don’t need to worry about these low-level interactions.
  • Media queries – it behaves like a touch screen: coarse pointer and no hover support.
  • Elements highlight based on eye tracking so the system knows what to select.
  • Make sure to use cursor: pointer; to show that something is interactive.
  • Test your highlight regions in the xrOS simulator.

Platform optimization

  • Screen is a bit more abstract, just like the change of pixels when retina was introduced.
  • Full-screen windows can be resized on the platform, so they can be larger than expected.
  • With requestAnimationFrame, always measure the elapsed time between frames.

Integrating with 3D Content

  • Adjust your website to use Quick Look so users can pull 3D objects out of the website and anchor them in real space.
  • All you need is a link to the usdz model and a preview image.
  • It will leverage the advanced lighting and rendering from ARKit.
  • A new HTML element is proposed: <model interactive><source src="…"></model>.
  • A WebXR dev preview is available to learn more.
  • There is a new settings screen in Safari on the platform to enable various 3D features and WebXR.

Discover Observation in SwiftUI

Lets you define models using standard Swift types.

What is Observation?

  • A new feature for tracking changes to properties.
  • Macros automatically generate the code for you to track and respond to any changes that occur (see the sketch after this list).
  • When a property the view reads (including a computed property) changes, the UI updates.
  • This adds performance improvements, since the UI only updates when it needs to.
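A minimal sketch of an @Observable model driving a view; the type and property names are placeholders:

```swift
import SwiftUI
import Observation

@Observable
class DonutModel {
    var donuts: [String] = ["Glazed", "Sprinkled"]
}

struct DonutListView: View {
    var model: DonutModel

    var body: some View {
        // The view re-renders only when `donuts` changes.
        List(model.donuts, id: \.self) { donut in
            Text(donut)
        }
    }
}
```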

SwiftUI Property Wrappers

  • You will still need @State if the view needs its own state – like a sheet presentation.
  • @Environment allows you to propagate values globally – Observable types work well here.
  • @Bindable is lightweight: just use the $ syntax to create bindings. You will need it for things like TextField (see the sketch below).
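A minimal sketch of @Bindable producing a binding for a TextField; names are placeholders:

```swift
import SwiftUI
import Observation

@Observable
class OrderModel {
    var note: String = ""
}

struct OrderEditor: View {
    @Bindable var order: OrderModel

    var body: some View {
        // $order.note yields a Binding<String> for the TextField.
        TextField("Order note", text: $order.note)
    }
}
```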

Advanced uses

  • You can use arrays, optionals, or any type that contains an @Observable – tracking is based on the specific instance.
  • If a computed property is not backed by stored properties, you need to tell Observation that the property was read and when it changes; you do this by writing your own getter and setter (look this up for more detail).

ObservableObject – how to update your code

  • The Observable macro will simplify your code.
  • Take your existing class, remove the ObservableObject conformance and @Published, then update your views: @ObservedObject usages can be changed, and mostly just deleting annotations will get you there (see the sketch below).
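A minimal before/after sketch of that migration, with placeholder names:

```swift
import SwiftUI
import Observation

// Before:
// class LibraryModel: ObservableObject {
//     @Published var books: [String] = []
// }

// After – drop the conformance and @Published, add @Observable:
@Observable
class LibraryModel {
    var books: [String] = []
}

// In views, @ObservedObject/@StateObject annotations can usually be
// dropped or replaced with plain properties or @State.
```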

Make Features Discoverable with TipKit

Create a Tip

  • Title and message – use a direct action label as the title and easy-to-remember instructions in the message (see the sketch after this list).
  • Don’t use tips for promotional or error messages.
  • Add the TipsCenter.shared. ?? call in the init() of the app’s @main struct.
  • Add icons and match colors with the application. You may wish to add an action button if there are related settings.
  • Set treatment and placement.
    • Pop-over – points to an element or button to direct the user (.popoverMiniTip).
    • Inline – adjusts the app’s UI so no UI is blocked.
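A minimal sketch of defining a tip, based on my reading of the beta TipKit API (names and strings are placeholders):

```swift
import SwiftUI
import TipKit

struct FavoritesTip: Tip {
    var title: Text {
        Text("Save as a Favorite")   // direct action label
    }
    var message: Text? {
        Text("Tap the heart to add this item to your favorites.")
    }
    var image: Image? {
        Image(systemName: "heart")   // icon that matches the app
    }
}
```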

Eligibility rules

  • Don’t be spammy or irrelevant. If the feature has already been discovered, don’t spam users with the tip. If users use your app very infrequently, you may not want to show tips either.
  • Rules are:
    • Parameter-based rules – persistent, based on state (e.g. a Boolean); see the sketch after this list.
    • Event-based rules – define an action that must have occurred before the tip is shown.
    • Donate the event when it happens.
    • You can create custom events with associated types and use them in rules.
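A minimal sketch of a parameter-based rule, again based on my reading of the beta TipKit API (names are placeholders and the exact macro spelling may differ):

```swift
import SwiftUI
import TipKit

struct FavoritesRuleTip: Tip {
    // Persistent parameter the rule is evaluated against.
    @Parameter
    static var hasViewedItem: Bool = false

    var rules: [Rule] {
        // Only show the tip after the user has viewed an item.
        #Rule(Self.$hasViewedItem) { $0 == true }
    }

    var title: Text {
        Text("Save as a Favorite")
    }
}

// Elsewhere in the app, set the parameter when the action happens:
// FavoritesRuleTip.hasViewedItem = true
```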


Display and dismissal 

  • Don’t show a tip forever, and don’t show multiple tips at once.
  • You can set a DisplayFrequency (.daily, .hourly, or a custom duration); you can also use .immediate.
  • You can use .ignoresDisplayFrequency(true) for a specific tip.
  • If the feature is used, the tip should be dismissed; you can then call the invalidate function with the “user performed action” reason.
  • You can also set a .maxDisplayCount to avoid annoying the user with the same tip over and over.
  • Tip state can be synced via iCloud so you don’t show the same tip on different devices.

Test tips

  • You can work around eligibility rules. Use TipsCenter.showAllTips(), or .showTips with specific tip IDs, or .hideTips to prevent specific tips, or .hideAllTips().
  • .resetDatastore() can be used to clean everything up.
  • You can also pass com.apple.TipKit.showAllTips in your launch arguments (or any of the other commands).

Running your iPad and iPhone apps in the Shared Space

The majority of apps will run fine on visionOS (check it out in the simulator).

Built-in Behaviors

  • Built on iOS foundations, more than likely your app will just work.
  • iPhone and iPad apps run in light mode; iPad apps appear in landscape, while iPhone-only apps appear in portrait. If you support rotation you will see a rotation button. Resizing past the min and max size causes a bounce.
  • You can use a trackpad or game controller with your app, along with touch.
  • Local authentication is forwarded to Optic ID

Functional Differences

  • There’s no notion of rotating the entire device, so you may define a default orientation – add UIPreferredDefaultInterfaceOrientation to your Info.plist.
  • UISupportedInterfaceOrientations – determines whether you get a rotation button.
  • UIRequiredDeviceCapabilities – determines whether your app can be offered on the device.
  • Gestures – work differently
    • Eyes & Hands, maximum of two simultaneous inputs (each hand is a single button)
  • ARKit – ARSession
    • Significantly updated to handle new architecture and privacy.
    • Existing AR views won’t work on this device; you will have to do some work here.
  • Location support – same as iPad
  • Look to Dictate – lets you quickly dictate in fields around the screen. You will get a microphone in search fields, for example.
    • The API is available but disabled by default for iOS and iPad apps.
    • Use .searchDictationBehavior to enable it in SwiftUI.
  • Use availability checks to make sure a feature is available before you try to use it.
  • xrOS Device (Designed for iPad) is a new target.

Choose your experience

  • Most apps don’t need any changes at all; you can rebuild against the xrOS SDK, but it is not required.
  • SpriteKit and Storyboards are iOS-only.
  • You get additional immersion with the xrOS SDK – volumes, etc.
  • You will get the system look and feel when using the xrOS SDK in the simulator. Ornaments anchor on the sides and bottom.

Check out Meet SwiftUI for spatial computing to see how apps will look and learn what you need.

Bring Widgets to New Places

New locations for widgets – widgets were first introduced in iOS 14 and added to the Lock Screen in iOS 16.

New locations are the desktop on Mac, the iPad Lock Screen, StandBy mode on iPhone, and the Smart Stack on watchOS.

Transition to content margins

  • Padding is automatically applied to prevent content from getting too close to the edges. This replaces the safe area; .contentMarginsDisabled() behaves like ignoring the safe area.

Add a removable background to widgets

  • You need to remove the background for the Lock Screen and similar contexts.
  • Add containerBackground(for: .widget) { /* your original background color here in the closure */ } (see the sketch after this list).
  • The Smart Stack uses this approach.
  • If you have no background to remove (or one that doesn’t make sense to remove), use .containerBackgroundRemovable(false).
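A minimal sketch of adopting the removable container background in a widget view; the content and color are placeholders:

```swift
import SwiftUI
import WidgetKit

struct MyWidgetView: View {
    var body: some View {
        VStack {
            Text("Hello")
        }
        .containerBackground(for: .widget) {
            // Your original background goes inside the closure.
            Color.blue
        }
    }
}
```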

Dynamically adjust layout

  • Change the layout when the background is removed; this is good for reading from far away in StandBy mode.
  • It also blends better on the Lock Screen.

Prepare for vibrant rendering mode

  • This is for the Lock Screen.
  • Use the .widgetRenderingMode environment value to see which mode you are in (see the sketch below).
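A minimal sketch of reading the rendering mode from the environment; the view content is a placeholder:

```swift
import SwiftUI
import WidgetKit

struct MyWidgetContent: View {
    @Environment(\.widgetRenderingMode) private var renderingMode

    var body: some View {
        if renderingMode == .vibrant {
            // Adjust for the Lock Screen's vibrant appearance.
            Text("Vibrant")
        } else {
            Text("Full color")
        }
    }
}
```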

Meet ActivityKit

This is all about Live Activities

Overview

  • Live Activities are a glanceable way to keep track of a task or event that has a defined start and end.
  • They can be updated either from your app in the background or via push notifications.
  • The Dynamic Island can display two Live Activities at a time; they will show the minimal presentation.
  • You can press one to expand it for more information.
  • Now supported in StandBy mode and on iPad.
  • They leverage the widget enhancements to add buttons or toggles.
  • Use the ActivityKit framework – it is declarative, very similar to building Home Screen widgets. Live Activities are only available after a discrete user action and are user-moderated like notifications. They must support both the Lock Screen and the Dynamic Island.

Lifecycle

  • The App goes thru different stages,
    • Request
    • Update
    • Observe activity state
    • End
  • Request: the app must be in the foreground; configure the initial content and configuration (see the sketch after this list).
    • Set up the update with your information and provide state details (like when the content will become stale). Relevance is also optional.
    • Define the pushType for remote versus local updates.
  • Update
    • Based on key moments in the application you can send an update; you can also display alerts if the event is critical.
  • Activity state changes can happen at any time during the Live Activity’s lifecycle (active, ended, dismissed, and stale are the possible values). Based on the state changes, you can do cleanup and other work.
  • End
    • Create final content and provide it when ending the activity.
    • Choose the dismissal policy.
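A minimal sketch of requesting a Live Activity from the foreground; the attributes, content state, and values are placeholders:

```swift
import ActivityKit

struct DeliveryAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var minutesRemaining: Int
    }
    var orderNumber: String
}

func startDeliveryActivity() throws {
    let attributes = DeliveryAttributes(orderNumber: "1234")
    let state = DeliveryAttributes.ContentState(minutesRemaining: 30)

    // Request while the app is in the foreground; staleDate is optional.
    let activity = try Activity.request(
        attributes: attributes,
        content: .init(state: state, staleDate: nil),
        pushType: nil   // use .token for remote push updates
    )
    print("Started activity \(activity.id)")
}
```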

Building Live Activity UI

  • You need a widget configuration to describe the Live Activity (see the sketch after this list).
  • You need to define all of the Dynamic Island presentations:
    • Add compactLeading, compactTrailing, expanded, and minimal views.
  • When more than one Live Activity is running, the system decides which apps are shown in the Dynamic Island.
  • The expanded presentation has multiple regions defined by the system: .leading, .trailing, .bottom, and so on.
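A minimal sketch of the widget configuration for the Dynamic Island and Lock Screen, reusing the placeholder DeliveryAttributes from the earlier sketch:

```swift
import SwiftUI
import WidgetKit
import ActivityKit

struct DeliveryLiveActivity: Widget {
    var body: some WidgetConfiguration {
        ActivityConfiguration(for: DeliveryAttributes.self) { context in
            // Lock Screen / StandBy presentation.
            Text("\(context.state.minutesRemaining) min remaining")
        } dynamicIsland: { context in
            DynamicIsland {
                // Expanded presentation regions defined by the system.
                DynamicIslandExpandedRegion(.leading) {
                    Text("Order")
                }
                DynamicIslandExpandedRegion(.trailing) {
                    Text("\(context.state.minutesRemaining) min")
                }
            } compactLeading: {
                Image(systemName: "box.truck")
            } compactTrailing: {
                Text("\(context.state.minutesRemaining)")
            } minimal: {
                Image(systemName: "box.truck")
            }
        }
    }
}
```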

Design and Build apps for watchOS 10

The goals are to surface timely information, communicate at a glance, take advantage of the screen, and make things consistent.

Design Principles

  • Timely and relevant: in the moment 
  • Focused and highly specialized for brief interaction
  • Unique to Apple Watch – utilize the Digital Crown; this is becoming a new focus item for Apple (it should be backed up by touch).
  • Consider the journey people will take from the moment they raise their wrist – the Smart Stack is an example of this.

Navigation

  • NavigationSplitView updates
    • Borrows the two-column layout from Weather on iPad – the list is now a button at the top leading edge of the screen; great if you have a source list with detail views. Start with the detail view when the user opens your app. Try it with no title.
    • The transition between the detail view and the source list is animated. Great for showing comparative data. (Should add the number of cards in my view for the card app.)
    • Same API on the watch as on other platforms.
  • TabView updates
    • TabView is now designed to be vertical. Content can expand inline so that you stay on a single screen by default but still support longer or localized text, etc.
    • Activity is a great example of this. (Uses .tabViewStyle(.verticalPage); see the sketch after this list.)
    • Combine it with .containerBackground(color, for: .tabView) for seamless blending.
    • TabView will automatically expand a list if you add it to the grouping.
    • Animations scale information as it moves to new locations. You can now drive animation based on the TabView; you can use .matchedGeometryEffect to do so.
  • NavigationStack (core paradigm)
    • Use a NavigationStack if the other two don’t fit your workflow.
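A minimal sketch of the vertical TabView pattern with a blended container background; the views and color are placeholders:

```swift
import SwiftUI

struct ActivityStyleView: View {
    var body: some View {
        TabView {
            SummaryView()
            DetailView()
        }
        .tabViewStyle(.verticalPage)
        .containerBackground(Color.blue.gradient, for: .tabView)
    }
}

struct SummaryView: View {
    var body: some View { Text("Summary") }
}

struct DetailView: View {
    var body: some View { Text("Detail") }
}
```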

Layout System Updates

  • Dial based views
    • Great for dense info delivered at a glance; you can also provide four different controls.
  • Infographic Views
  • List views
    • Scroll through content to find what you need.
  • If you use these layout patterns, then toolbar placement and the like don’t need extra padding – it just works.

Color and Materials

  • Four background materials – Ultra Thin, Thin, Regular and Thick. 
  • Full screen background gradient 
  • You can use Primary, Secondary, Tertiary, Quaternary prominence layers – with vibrant versions to ensure legibility