Create a great spatial playback experience

Creating a great video playback experience in your app

Media Experience

  • Builds on the same AVFoundation APIs as other iOS and iPadOS apps; AVKit builds on top of them to create the right experience on each platform
  • AVFoundation can render performantly in RealityKit
  • You must use the matching visionOS SDK
  • Use AVKit and AVFoundation
    • Adding the item after creating the player may improve performance
  • Use a UIViewControllerRepresentable wrapper and then attach it to your window's view (see the sketch after this list)
  • The screen and audio are anchored in space, so they stay consistent while you move around. The basics of the controls are also consistent: the video player controls appear when you look at the screen and tap, then disappear after a time, or you can look and tap again.
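
Here is a minimal sketch of that wrapper, assuming a hypothetical PlayerView type and a video URL you supply; the player is created first and the item attached afterward, per the note above.

```swift
import SwiftUI
import AVKit

// A minimal sketch; PlayerView and the url parameter are assumptions, not from the session.
struct PlayerView: UIViewControllerRepresentable {
    let url: URL

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        // Create the player first, then attach the item afterward,
        // which may improve startup performance.
        let player = AVPlayer()
        controller.player = player
        player.replaceCurrentItem(with: AVPlayerItem(url: url))
        return controller
    }

    func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {}
}
```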

Advanced Features

  • Thumbnail scrubbing gives a preview while seeking; if a trick play track is available it can be used.
  • You can use the interstitials UI to programmatically identify ads or other key points in the scrubber
  • Contextual actions allow you to add buttons
  • Custom Info View Controllers allow you to show metadata and/or related content
  • Immersive spaces – you can add in other 3D Assets, your viewing screen will automatically anchor and position itself, moving the controls closer to the viewer
  • Provide feedback about custom UI controls; Apple is working hard to update this new player

Other use cases

  • Inline media can be done in a window: an AVPlayerViewController presents inline automatically if it doesn't cover the whole window. It won't be able to be 3D because of the other content in the window
  • For splash screens and similar cases, use the RealityKit VideoPlayerComponent.
  • This gives an aspect-ratio-correct mesh with captions and better performance.
  • If you want video as an effect on your own geometry, use RealityKit's VideoMaterial (see the sketch after this list)
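
A minimal sketch of both approaches, assuming a hypothetical makeVideoEntity helper, plane size, and a url you supply:

```swift
import RealityKit
import AVFoundation

// A minimal sketch; the helper names and parameters are assumptions.
func makeVideoEntity(url: URL) -> Entity {
    let player = AVPlayer(url: url)
    let entity = Entity()

    // VideoPlayerComponent gives an aspect-ratio-correct mesh with captions support.
    entity.components.set(VideoPlayerComponent(avPlayer: player))
    player.play()
    return entity
}

// For video as an effect on your own geometry, use VideoMaterial instead.
func makeVideoScreen(url: URL) -> ModelEntity {
    let player = AVPlayer(url: url)
    let material = VideoMaterial(avPlayer: player)
    let screen = ModelEntity(mesh: .generatePlane(width: 1.6, height: 0.9),
                             materials: [material])
    player.play()
    return screen
}
```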

Conclusion

  • The session includes a table to help you decide which type of player you need
  • Check out the DestinationVideo sample project

Work with Reality Composer Pro content in Xcode

This is a continuation of the prior Reality Composer Pro session, where you created the scene and materials.

Load 3D Content 

  • You can load a full scene via the Entity async initializer – just give it the bundle you created in Reality Composer Pro, then create a RealityView (see the sketch after this list)
  • It's easiest to use a Swift package for the Reality Composer Pro content; the package is much easier to load and manage in Xcode
  • ECS – Entity Component System – this is the term and approach that is used by RealityKit and Reality Composer Pro
    • Entity is a thing (and can be invisible)
    • Components are attributes and data
    • System – is where behavior lives – updating once per frame
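
A minimal sketch of loading the scene, assuming the generated package is named RealityKitContent and the scene is named "Scene":

```swift
import SwiftUI
import RealityKit
import RealityKitContent   // the Swift package generated by Reality Composer Pro (name assumed)

struct DioramaView: View {
    var body: some View {
        RealityView { content in
            // Load the full scene asynchronously from the package's bundle.
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
            }
        }
    }
}
```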

Components 

  • You can add components in Swift or in Reality Composer Pro
    • you can add as many as you want, but only one of each type
    • You can create your own components too
      • The sample here creates a PointOfInterest component (see the sketch after this list)
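
A minimal sketch of such a custom component; the stored property is an assumption for illustration:

```swift
import RealityKit

// A custom component in the spirit of the session's PointOfInterest example.
struct PointOfInterestComponent: Component, Codable {
    var name: String = ""   // assumed field, not from the session
}

// Entities hold at most one component of each type:
// entity.components.set(PointOfInterestComponent(name: "Ridge Trail"))
```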

User interface

  • To put SwiftUI content in a RealityKit scene – use the Attachments API
    • They are passed through RealityView's three closures: make (runs only once), update (called only when the SwiftUI view changes), and attachments (a view builder)
    • Each attachment is a normal SwiftUI view identified by a Hashable value, so you can fetch it with attachments.entity(for: "hashable value") and then add it with content.add(attachmentEntity) (see the sketch after this list)
    • To make this data-driven, we need to create the attachment views dynamically. We can do this by creating invisible entities in the Reality Composer Pro scene, querying for them, and creating a view for each one. Using @State lets us know when new buttons are created, so we can emit them from the view builder and add them as entities.
    • We can query for all entities with the specific component attached, in our case the PointOfInterest component.
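
A minimal sketch of the three closures, assuming the released Attachment(id:) form of the API and a hypothetical "pin" attachment (the session's beta code identified attachments with .tag on plain views instead):

```swift
import SwiftUI
import RealityKit

struct AnnotatedView: View {
    var body: some View {
        RealityView { content, attachments in
            // make: runs only once.
            if let pin = attachments.entity(for: "pin") {
                content.add(pin)
            }
        } update: { content, attachments in
            // update: called when SwiftUI state this view depends on changes.
        } attachments: {
            // Any Hashable value works as the identifier.
            Attachment(id: "pin") {
                Text("Point of Interest")
            }
        }
    }
}
```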

Play audio

  • Add an Audio Emitter for the audio component and preview it in the editor. In the app you have to load the source and tell it to play
  • Use the AudioFileResource API and pull the audio from the usda file. Prepare the audio and then call play (see the sketch after this list)
  • You can introduce faders to morph between sounds and terrains
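
A minimal sketch, where the resource path, scene file name, and package name are assumptions based on how Reality Composer Pro audio is typically referenced:

```swift
import RealityKit
import RealityKitContent   // generated package (name assumed)

func playAmbience(on entity: Entity) async throws {
    // The named path points at the audio resource inside the .usda scene file (names assumed).
    let resource = try await AudioFileResource(named: "/Root/Ambience_wav",
                                               from: "Scene.usda",
                                               in: realityKitContentBundle)
    let controller = entity.prepareAudio(resource)
    controller.play()

    // Faders can morph between sounds, e.g. controller.fade(to: -20, duration: 2)
}
```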

Material properties

  • The shader graph was created to allow morphing between the two maps. Look at the node graph in the shader section of Reality Composer Pro
  • You can modify parameter values in code (see the sketch below)
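
A minimal sketch of setting a shader graph parameter from code, assuming a hypothetical "Progress" parameter name:

```swift
import RealityKit

func setTerrainProgress(_ value: Float, on model: ModelEntity) throws {
    // Fetch the ShaderGraphMaterial authored in Reality Composer Pro, update it, and write it back.
    guard var material = model.model?.materials.first as? ShaderGraphMaterial else { return }
    try material.setParameter(name: "Progress", value: .float(value))
    model.model?.materials = [material]
}
```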

This session is very deep and should be watched; this blog post should help you understand why you should go watch it at https://developer.apple.com/wwdc23/10273

What’s new in privacy

Understanding what you collect and allowing users to control it is key. Apple's privacy pillars align with GDPR and extend into on-device processing and security protections.

New Tools

  • There are new APIs for the embedded Photos picker, the screen capture picker, write-only calendar access, Oblivious HTTP, and Communication Safety
  • The Photos picker allows your app to access only a subset of pictures; you can embed this picker into your app in iOS 17 and macOS Sonoma
    • If you use the new embedded picker, you will not have to request permissions
    • Embed the Photos Picker in your App – check this session
    • The new permissions dialog makes it very obvious what can be shared, and will periodically remind users what your app has access to
  • Screen Capture Picker
    • Prior to Sonoma you had to provide access to the whole screen; in Sonoma the system will present a window picker on your behalf
    • You will be able to record selected content for the duration.  
    • A screen sharing menu bar item will be displayed.
    • What’s new in ScreenCaptureKit – session
  • Calendar access can now be limited to only adding items, since calendars contain a lot of private information from a user's perspective
    • EventKit's system UI allows creating events by default, without a permission prompt
    • If you build your own UI for creating entries, there is a new prompt to request write-only permission (see the sketch after this list)
    • If you need full access you can ask once for an upgrade; you will not be able to ask again
    • Apps are transitioned to write-only by default, and older versions of EventKit will only ask for write access
    • Discover Calendar and EventKit – session
  • Oblivious HTTP API – hides client IP address from your server
    • This will also hide data from network operators 
    • This may add additional challenges for your app, so you can now use OHTTP to help protect app usage by separating the who from the what. This lightweight standard protocol allows the network operator to see only traffic to the relay, not the endpoint.
    • This is already used by Private relay
  • There are additional considerations if you use this, depending on your app's architecture
  • Communication Safety – to address sensitive content
    • This is the protection that hides nudity from children. It has been expanded beyond Messages to AirDrop, contact posters, the Photos picker, and the Phone app.
    • It is available for all users, not just kids
    • This on device technology is available for apps to deal with sensitive data.
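
A minimal sketch of requesting the new write-only calendar access from EventKit on iOS 17; the event details are assumptions:

```swift
import EventKit

func addEvent(titled title: String, at date: Date) async throws {
    let store = EKEventStore()

    // Write-only access is enough for apps that only insert events.
    guard try await store.requestWriteOnlyAccessToEvents() else { return }

    let event = EKEvent(eventStore: store)
    event.title = title
    event.startDate = date
    event.endDate = date.addingTimeInterval(3600)
    event.calendar = store.defaultCalendarForNewEvents
    try store.save(event, span: .thisEvent)
}
```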

Platform Changes

  • Mac app data protection, Advanced Data Protection, Safari private browsing, and Safari app extensions
    • Some locations on disk have system-managed permissions, like Desktop, Documents, and Downloads; some apps keep data in other locations like ~/Library or ~/Library/Containers
      • A user must now give permission before an app can access data in a container belonging to a different developer
      • Use App Sandbox to protect your users' data
      • If you make no changes, permission is asked for by default; it is valid for as long as your app is open, after which it is reset. You should provide a meaningful purpose string.
        • Use NSOpenPanel, which runs outside of your process, to let the user locate the data themselves before any prompt appears
        • Backup or disk management tools that have already been granted Full Disk Access will not need to ask
        • If your apps share the same signature, they have access to data from your own other apps. You can specify an NSDataAccessSecurityPolicy to change from the default "same team" access.
    • Advanced Data Protection (added in 2022) provides end-to-end encryption. If you use CloudKit you get ADP for your app when the user enables ADP on their iCloud account
      • Use CKAsset and the encrypted variants for all data types in your app
      • Use the encryptedValues API to simplify adoption (see the sketch after this list)
    • Safari Private Browsing enables protection from fingerprinting and advanced tracking protections (you can also turn these on for normal browsing)
      • By default in private mode, known tracking methods are not allowed, you can see the blocking in the web inspector
      • Tracking information on links is automatically stripped away
    • There is a new permission model that lets users decide on a per-site basis whether an extension can run, and whether it can run in private browsing mode.
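
A minimal sketch of the encryptedValues API, assuming a hypothetical "Note" record type with a "body" field:

```swift
import CloudKit

func makeEncryptedNote(body: String) -> CKRecord {
    let record = CKRecord(recordType: "Note")
    // Fields written through encryptedValues are stored encrypted; with
    // Advanced Data Protection enabled they are end-to-end encrypted.
    record.encryptedValues["body"] = body
    return record
}
```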

Spatial Input Model

  • To achieve these privacy goals, hands and eyes are processed by an internal system component, so your app only gets the resulting tap or touch events.

What’s new in App Store Connect

Monetize your app

  • Think about pricing and in app purchases
  • StoreKit views for SwiftUI simplify adding in-app subscriptions to your app
  • Setting pricing needs to consider international pricing; there are 900 price points based on a base region, and you can have the store automatically adjust pricing to manage currency exchange changes.

Manage testers

  • TestFlight makes it easy to test on all platforms, with testers on each platform. You get data about your testers to see how engaged they are: whether they installed, sessions, crashes, feedback, and now which device they installed on.
  • You can put release controls in place to send specific builds to internal teams vs. external teams.
  • Using Xcode Cloud you can easily upload TestFlight "What to Test" notes, either from a custom build script or directly from a local text file
  • Sandbox testing can now support Family Sharing by combining test accounts into a family group. Up to 6 accounts can be in a single family
  • You can also, on-device, modify renewal rates, test interrupted purchases, clear purchase history, and see the family grouping

Build your store presence

  • You can configure your product page with privacy nutrition labels, and you can now add information related to visionOS for surroundings, hands, and head movement.
  • App release scheduling supports pre-orders on a regional basis for soft launches. This is on the availability page.
  • Product page optimization lets you see which pages your users like best; these tests now persist across app releases.

Automate with APIs

  • The API allows for customizing workflows, syncing with internal systems, and dealing with customer reviews and responses. Support for Game Center is being added later this year
    • This is for leaderboards and achievements
    • You can do scores and achievement unlocks
    • Manage fraudulent activities
    • Match based on custom rules
  • For API authentication
    • You can generate marketing and customer services API Keys 
    • You can also create a user-based key 

Verify app dependencies with digital signatures

Automatically verify the integrity of your dependencies through signature verification. This addresses the supply chain security attacks that have been happening.

Just like the discussion on privacy reporting, it is ultimately each developer’s responsibility to ensure all your signatures are correct and you are not using maliciously modified code.

Dependency Signatures

  • Code signing links the binary, Info.plist, and privacy manifest to your developer identity. This is done via a CDHash and the developer identity from your certificate. The hash allows validation that the object has not been modified and was signed by your key.
  • New Signature verification will automatically handle the dependencies in your project
    • It tracks identities of frameworks in your project.  
    • Note the chain of trust will help across changes in certificates, etc.
  • A revoked certificate will trigger an alert to allow you to resolve or remove the framework from your application.

App developers

  • Demo with the Backyard Birds app: for any XCFramework you can see the signing information
  • Notice the Kind and Status fields. When you build the app, all signatures are validated.
  • You will get an error if they don't match, along with an alert offering to accept the change or trash the framework. Be careful with any self-signed certificate.
  • You should start using only signed SDKs

SDK Authors

  • For SDK Authors, it becomes important to sign your SDKs so that developers can confirm that the code has not been manipulated 
  • You can either use Apple Developer program (ADP) or Self Signed certificates.
  • For ADP you should use Apple Distribution and Development Certificates
    • Attestation is provided by Apple for these certificates
    • Validation is handled automatically
  • You can sign your code with: codesign --timestamp -v --sign "YOUR CERTIFICATE Apple Distribution: (XXXX)" App.xcframework
    • The same command is used with a self-signed certificate if you are not an ADP member
  • You should start signing right away either with ADP or self-signed

Update your app for watchOS 10

While this is a code-along session, at the time of my viewing there was no link provided. I searched the forums and found the link here, but it gave me a page not found. Hopefully by the time others see this blog post the link will work.

Check out the Meet watchOS 10 and Design and build apps for watchOS 10 sessions.

When you first compile a watchOS 9 app against watchOS 10 you get many features automatically, if you were using standard controls and options. Shifting to NavigationSplitView can be achieved by switching from NavigationStack, moving your content into the detail view builder, and adding a list view for navigation (a sketch follows).
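
A minimal sketch of that move, using hypothetical Landmark data; the selection drives the detail view:

```swift
import SwiftUI

// Hypothetical model purely for illustration.
struct Landmark: Identifiable, Hashable {
    let id = UUID()
    let name: String
    static let samples = [Landmark(name: "Lake"), Landmark(name: "Ridge")]
}

struct ContentView: View {
    @State private var selection: Landmark.ID?

    var body: some View {
        NavigationSplitView {
            // The new List becomes the navigation source…
            List(Landmark.samples, selection: $selection) { landmark in
                Text(landmark.name)
            }
        } detail: {
            // …and the former stack content moves into the detail builder.
            if let selection,
               let landmark = Landmark.samples.first(where: { $0.id == selection }) {
                Text(landmark.name)
            } else {
                Text("Choose a landmark")
            }
        }
    }
}
```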

Breaking long views into vertical TabViews makes it easier to split content into logical sections.

Adding a toolbar and setting background materials to provide additional information makes it easier to help users.

Take SwiftUI to the next dimension

This reviews a handful of APIs for working with windows, volumes, and the Full Space.

Volumes

  • This is used to emphasize 3D content.  You get a fixed scaled container, which maintains the same size regardless of distance.
  • You just use .windowStyle(.volumetric) to get started; it's that simple (see the sketch below)
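
A minimal sketch of a volumetric window; the app name, GlobeView, and the defaultSize values are assumptions:

```swift
import SwiftUI

@main
struct GlobeApp: App {
    var body: some Scene {
        WindowGroup {
            GlobeView()
        }
        .windowStyle(.volumetric)   // a fixed-scale 3D container
        .defaultSize(width: 0.6, height: 0.6, depth: 0.6, in: .meters)
    }
}

struct GlobeView: View {
    var body: some View { Text("3D content goes here") }
}
```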

3D views and layout

  • The Model3D API is used to load USDZ objects and more. It is similar to AsyncImage
  • Loading a Model3D goes through phases,
    • These include .empty, .failure, and .success
  • You should use the .scaledToFit() modifier so that the model fits within the volume
  • By default the model sits flush with the back of the volume, so you want to use a .frame(depth:alignment:) modifier
  • Then use .overlay to attach labels
  • Using a TimelineView you can add .animation to the 3D view so it has motion (see the sketch after this list)
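
A minimal sketch using the placeholder form of Model3D (the phase-based form mentioned above works similarly); the asset name, frame depth, and alignment value are assumptions:

```swift
import SwiftUI
import RealityKit

struct MoonView: View {
    var body: some View {
        Model3D(named: "Moon") { model in
            model
                .resizable()
                .scaledToFit()                            // fit the model within the volume
                .frame(depth: 300, alignment: .front)     // pull it off the back of the volume
        } placeholder: {
            ProgressView()                                // shown while the asset loads
        }
    }
}
```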

RealityView

  • For more complex scenes or experiences you should use RealityView instead of Model3D
  • Using Entity/Component system – check out “Build Spatial Experience with RealityKit” and “Enhance your spatial computing app with RealityKit”
  • Adding attachments allows you to provide annotations relative to entities. You need to use a Hashable value for the .tag assigned to attachments
    • .look(at:) allows you to position and orient items (see the sketch after this list)
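
A minimal sketch of positioning an annotation entity next to a model and aiming it with look(at:); the entity names and offset are assumptions:

```swift
import RealityKit

func place(label: Entity, nextTo moon: Entity) {
    // Offset the annotation from the model it describes.
    label.position = moon.position + SIMD3<Float>(0.3, 0.2, 0)
    // Orient the label so it points back at the model.
    label.look(at: moon.position, from: label.position, relativeTo: nil)
}
```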

3D gestures

  • To interact with these entities you need to configure them for input; this is achieved by adding both an InputTargetComponent and a CollisionComponent so that you can add spatial tap gestures
  • There is also a .targetedToEntity so that gestures will fail if they are not on the specific entity.  Use .onEnded to retrieve the coordinates of the gesture.
  • You can add a Model3D as an attachment to a RealityView entity; don't forget to add a .tag
  • You can combine gestures in a gesture modifier to provide more functionality (see the sketch after this list)
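
A minimal sketch of making an entity tappable; the sphere entity and radius are assumptions:

```swift
import SwiftUI
import RealityKit

struct TappableMoonView: View {
    let moon = ModelEntity(mesh: .generateSphere(radius: 0.2))

    var body: some View {
        RealityView { content in
            // Both components are required before the entity can receive spatial input.
            moon.components.set(InputTargetComponent())
            moon.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.2)]))
            content.add(moon)
        }
        .gesture(
            SpatialTapGesture()
                .targetedToEntity(moon)   // the gesture fails unless the tap lands on this entity
                .onEnded { value in
                    print("Tapped \(value.entity.name)")
                }
        )
    }
}
```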

Simplify distribution in Xcode and Xcode Cloud

This is all about the CI/CD pipeline and speeding up the iteration process.

Express TestFlight Distribution

  • An .xcarchive is an optimized release build of your app with debug symbols; it is repackaged for distribution
  • Xcode lets you use the archive either to go to your team only, without going to the App Store, or to build a workflow that will ultimately be released to the App Store.
  • Nice to see the internal testing option, which really makes it quick and easy to test with your team – one button click! You can use App Store Connect to add "What to Test" information and receive screenshot feedback.
  • Xcode Cloud allows you to build workflows to customize the CI/CD process, this is now discoverable in the “Integrate” menu.  You can have workflows build based on various tags, branches, or other rules.  You can use Git Commit messages to include notes to testers. 

Automating notarization 

  • You can notarize your app so you can distribute without using the App Store, by uploading to the notary service, which checks for malware, etc.
  • To notarize in Xcode, go to the Organizer, select an archive, click Distribute, and choose Direct Distribution
  • If you use the Notarize post-action in Xcode Cloud, it can automatically notarize your app for direct distribution

Model your Schema with SwiftData

Utilize schema Macros

  • Watch the Meet SwiftData and Build an app with SwiftData sessions
  • Sample trip app is used with multiple relationships against the main @Model class
  • Schema Macros allow you to customize the behavior of Schemas
    • @Attribute(.unique) – ensures a value is unique; inserting a duplicate causes an update instead of a new item
    • You can provide constraints on primitive value types (numeric, String, or UUID)
    • You can also decorate a to-one relationship
  • Renaming a variable would be seen as a new property, so you can use @Attribute(originalName: "old_name") var newName: String
    • This will ensure a simple migration
    • @Attribute also provides support for external data, transformable values, Spotlight integration, and a hash modifier
  • Relationships automatically get implicit inverses with the default delete rule. You can add @Relationship(.cascade) to delete related items instead of just nulling them out.
  • For non-persisted properties, just add @Transient; the value will be calculated at use time and must have a logical default (see the sketch after this list)
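
A minimal sketch pulling these macros together, echoing the session's Trip example; the property names and related type are assumptions, and the cascade rule is written here with the released deleteRule: label:

```swift
import Foundation
import SwiftData

@Model
final class Trip {
    @Attribute(.unique) var name: String                            // duplicate inserts become updates
    @Attribute(originalName: "start_date") var startDate: Date      // simple migration from the old name

    @Relationship(deleteRule: .cascade) var bucketList: [BucketListItem] = []

    @Transient var isUpcoming = false                               // not persisted; recomputed at use time

    init(name: String, startDate: Date) {
        self.name = name
        self.startDate = startDate
    }
}

@Model
final class BucketListItem {
    var title: String
    init(title: String) { self.title = title }
}
```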

Evolving schemas

  • To handle updates to your schemas between releases, you use VersionedSchema to define distinct releases
  • Use a SchemaMigrationPlan to order all the updates needed to migrate the data
    • Define each migration stage as either lightweight (which requires no code) or custom (which requires you to write code for things like de-duplication)
  • Annotate migrations so that when you build your plan, it will do the migration for you
  • You set up the ModelContainer with the current schema and a migration plan in your app, and it will upgrade data when needed (see the sketch below)
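
A minimal sketch of the shape of a migration plan; the schema names, version numbers, and the de-duplication step are assumptions:

```swift
import SwiftData

enum TripsSchemaV1: VersionedSchema {
    static var versionIdentifier = Schema.Version(1, 0, 0)
    static var models: [any PersistentModel.Type] { [Trip.self] }
}

enum TripsSchemaV2: VersionedSchema {
    static var versionIdentifier = Schema.Version(2, 0, 0)
    static var models: [any PersistentModel.Type] { [Trip.self] }
}

enum TripsMigrationPlan: SchemaMigrationPlan {
    static var schemas: [any VersionedSchema.Type] { [TripsSchemaV1.self, TripsSchemaV2.self] }

    static var stages: [MigrationStage] {
        [.custom(fromVersion: TripsSchemaV1.self,
                 toVersion: TripsSchemaV2.self,
                 willMigrate: { context in
                     // De-duplicate records here before a unique constraint is enforced.
                     try context.save()
                 },
                 didMigrate: nil)]
    }
}

// let container = try ModelContainer(for: Trip.self, migrationPlan: TripsMigrationPlan.self)
```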

Meet Object Capture for iOS

Object Capture for iOS allows you to use computer vision to create lifelike 3D models. This could already be done with the Object Capture API on the Mac, but now you can do on-device reconstruction on iOS. There is a sample application to learn how to do this. To me this looks like an update of the app shared earlier this year / late last year.

The code is available in the developer documentation, but you should certainly bookmark this page – https://developer.apple.com/documentation/Updates/wwdc2023

More objects with LiDAR

  • Performs best with objects that have extra detail, but low-texture objects are now improved by using LiDAR; the system augments the model based on the point cloud
  • Still avoid transparent or reflective objects

Guided Capture

  • This automatically captures images and LiDAR data, and provides feedback and guidance on how to capture.
  • The capture dial indicates which areas have images – kind of like when you scan your face
  • You will get notified if there is not enough light.  Also use diffused light to minimize reflection 
  • You want a consistent distance when scanning and keep the object within the frame
  • Don't forget to flip the object if it is rigid; if the object is repetitive, flipping may be problematic.
  • There is now an API to tell you if the object has been captured well enough for flipping, and it will recommend which way you should flip.

iOS API

  • The ObjectCaptureSession API is what you want to look up to find more information. This is basically a state machine that moves between ready, detecting, capturing, finishing, and finally completed (see the sketch after this list)
  • The APIs are in RealityKit and SwiftUI (https://developer.apple.com/documentation/realitykit/objectcapturesession)
  • You specify a location where images are stored during the initialization phase
  • Your app needs to provide its own UI to control the capture session for the user.
  • The detection phase allows you to identify the bounding box of the object so that it knows what you'd like to capture
  • Capturing generates a point cloud to show your progress; once you are finished, you need to provide your own UI to complete the capture, start additional passes, or flip the object.
  • The finishing process waits for all data to be saved and then automatically moves to the completed state
  • The completed state then allows you to do on-device reconstruction; if finishing fails you will have to create a new session.
  • Creating a 3D model uses the reconstruction API: a PhotogrammetrySession is pointed at the images to process and generates a usdz model file. There is more on this from WWDC21.
  • Mac will also use LiDAR data, and supports higher resolution than the iOS device.  You can just use Reality Composer Pro on the Mac and won’t have to write any code.
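
A minimal sketch of starting a session and kicking off reconstruction; the directory URLs and the view structure are assumptions:

```swift
import SwiftUI
import RealityKit

struct CaptureView: View {
    @State private var session = ObjectCaptureSession()

    var body: some View {
        ObjectCaptureView(session: session)
            .onAppear {
                // Point the session at a folder for the captured images (path assumed).
                let imagesDirectory = URL.documentsDirectory.appending(path: "Images/")
                session.start(imagesDirectory: imagesDirectory)
            }
        // Your own UI then drives session.startDetecting(), session.startCapturing(), etc.
    }
}

// After capture completes, on-device reconstruction uses PhotogrammetrySession:
// let photogrammetry = try PhotogrammetrySession(input: imagesDirectory)
// try photogrammetry.process(requests: [.modelFile(url: outputURL)])
```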

Reconstruction enhancements 

  • Mac performance has been improved, along with providing an estimated processing time
  • You can now also obtain poses for your images to pre-configure and optimize reconstruction
  • You can also customize the number of triangles with a new custom detail level