Inspectors in SwiftUI: Discover the details

Inspector

  • This shows views of selected content; for example, the Keynote sidebar showing you details about a selected item.
  • This is available on macOS, iPadOS, and iOS, and includes programmatic control of presented state and width.
  • It is a structural API, like NavigationStack or .popover.
  • To add an inspector, you pass a Bool binding and provide the content in the trailing view builder.
  • Inspectors use a group style by default and are not resizable by default, but adding .inspectorColumnWidth(min:ideal:max:) with an ideal value makes them resizable; width changes are handled by the system and retained across app launches (see the sketch after this list).
  • It has different behaviors based on the content it is attached to, which impacts toolbar placement and overall UI characteristics. On macOS it is a bit simpler. If you are using a split view, the inspector should be placed in the detail section.
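
A minimal sketch of the basic API, assuming the iOS 17/macOS 14 .inspector modifier (view content is illustrative):

```swift
import SwiftUI

struct DetailView: View {
    @State private var showInspector = false

    var body: some View {
        Text("Selected item")
            .inspector(isPresented: $showInspector) {
                // The trailing view builder supplies the inspector content.
                Text("Attributes of the selected item")
                    .inspectorColumnWidth(min: 200, ideal: 260, max: 320)
            }
            .toolbar {
                Button("Info") { showInspector.toggle() }
            }
    }
}
```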

Presentation customization

  • iOS 16.4 introduced new presentation features, and they are also enabled for inspectors (see the sketch after this list)
  • .presentationBackground(_:)
  • .presentationBackgroundInteraction(_:)
  • .presentationCornerRadius(_:)
  • .presentationContentInteraction(_:)
  • .presentationCompactAdaptation(_:)
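
A sketch of how these combine with an inspector; my assumption is that they take effect when the inspector is presented as a sheet (e.g., in compact size classes):

```swift
import SwiftUI

struct InspectorHost: View {
    @State private var showInspector = true

    var body: some View {
        Text("Details")
            .inspector(isPresented: $showInspector) {
                Text("Inspector content")
                    .presentationBackground(.thinMaterial)
                    .presentationBackgroundInteraction(.enabled)
                    .presentationCornerRadius(24)
            }
    }
}
```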

Get started with privacy manifests

This is about simplifying the creation of Privacy Nutrition Labels and holding developers accountable for the APIs and SDKs they use.

You are responsible for all code in your app. I can imagine a fight brewing between SDK creators and the developers who are ultimately responsible for the usage of these SDKs.

I highly recommend you check out this presentation yourself at https://developer.apple.com/wwdc23/10060

Privacy manifests

  • Third-party developers can provide information about their SDK to be included in your app.
  • Create a PrivacyInfo.xcprivacy file in Xcode, which defines the data collected and linked to the app and how it is being used (I should update this for my own apps).
  • Check the App privacy details page in the App Store documentation.

Privacy Report 

  • This pulls together all the information in one place, aggregated across all content. Right-click an app archive to open its context menu and choose Generate Privacy Report.
    • This only works if you have created PrivacyInfo.xcprivacy files, and they must be included in the archive.

Tracking domains

  • Control network connections from your app.
  • Some SDKs may default to tracking, or depend on you to request permission and assume the answer is yes.
    • If you specify tracking domains in your privacy manifest, connections to them are automatically disallowed unless the user agrees.
    • If the same domain has both tracking and non-tracking functions, split the tracking onto a separate domain.
  • Instruments in Xcode will show you domains that are used to track users across websites, so run that instrument on your code to confirm you are OK; then declare the domains in your privacy manifest.
  • Fingerprinting is NEVER allowed

Required reason APIs

  • To support important use cases while avoiding fingerprinting, Apple has created categories of APIs, each with a list of approved reasons.
  • For example, NSFileSystemFreeSize (free disk space); check the Required Reason APIs list in the developer documentation (and see the sketch after this list).
  • The documentation links to a feedback form if you have a valid reason that is not listed.
  • You must clearly state why you use these APIs in your privacy manifest.
  • Check the privacy-impacting SDKs list in the developer documentation.
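
For context, a sketch of an API call in the disk-space category; using something like this (directly or via an SDK) is what triggers the declaration requirement:

```swift
import Foundation

// Querying free disk space falls into a required-reason category
// (e.g. NSFileSystemFreeSize), so an approved reason must be declared
// in PrivacyInfo.xcprivacy if your app or its SDKs do this.
func availableDiskSpace() throws -> Int {
    let home = URL(fileURLWithPath: NSHomeDirectory())
    let values = try home.resourceValues(forKeys: [.volumeAvailableCapacityKey])
    return values.volumeAvailableCapacity ?? 0
}
```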

Starting in fall 2023, Apple will start sending informational emails to developers. Starting in spring 2024, the reported issues must be reviewed and addressed, and privacy manifests will be expected at that point.

Fix failures faster with Xcode test reports

A tour of test reports in Xcode

Structuring tests

  • Test methods are individual tests
  • Test classes – groups of test methods (see the sketch after this list)
  • Test bundles – one or more test classes
  • Unit vs. UI tests
    • Unit tests – source code
    • UI tests – user actions
  • Test plans – run over the app and can include both unit and UI tests
  • Configurations – say how to set up the environment for your tests
    • Language and location
    • Code coverage
    • Test repetitions
  • Run Destinations
    • Devices to run on 
    • In the IDE you can run against one destination
    • In Xcode Cloud you can choose many
  • In the sample test plan shown, one line is a single test run
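
A minimal sketch of that structure, with a stand-in type defined inline so the tests compile:

```swift
import XCTest

// A test class groups related test methods; each method is one test.
// `Greeter` is a stand-in type defined here for illustration.
struct Greeter {
    func greet(_ name: String) -> String { "Hello, \(name)!" }
}

final class GreeterTests: XCTestCase {
    func testGreetsByName() {
        XCTAssertEqual(Greeter().greet("Ana"), "Hello, Ana!")
    }

    func testGreetingIsNotEmpty() {
        XCTAssertFalse(Greeter().greet("Bo").isEmpty)
    }
}
```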

Explore the test report

  • The report provides a high-level summary of your test run, highlighting important patterns
  • One stop shop
  • Richer failure information for UI testing
  • You can run within CI and get a report
  • In a sample run report, you can open Insights to see the notable issues across all configurations and runs; further down you can find the actual test failures.
  • In the run view you can find the failure message and the call stack, so you can go directly to your source code.
  • For UI tests you get a video of the test at the point where it failed, so you can see the failure in context. You also have video of the entire test, so you can see the full behavior.
  • Clicking on an event takes you to that point in the test run and shows the video scrubber

Explore SwiftUI Animation

Overview of animation capabilities (to be honest, a lot of this was over my head, which probably explains why my apps don’t have a lot of animation).

Anatomy of an update

  • SwiftUI tracks a view’s dependencies – if anything changes, the view is invalidated and its body is called again to redraw the view.
  • If you add an animation, the body is called with new values, including any animatable attribute. If that attribute changes, SwiftUI makes a copy and interpolates the transition from the old value to the new value. Built-in animations then update off the main thread, which is very efficient and doesn’t call your view code.
  • There are two aspects – animatable attributes, and animations that describe how they change over time.

Animatable

  • Your animatable data must conform to VectorArithmetic so the animation can process it as a list of values.
  • ScaleEffect lets you independently animate four different values. It is a public type, so you can look at it if you want to learn how to create your own animatable views (see the sketch after this list).
  • Really good demo of the actual updates along the timeline of the animation
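
A minimal sketch of a custom animatable modifier following that pattern (names illustrative):

```swift
import SwiftUI
import Foundation

// Conforming to Animatable and exposing animatableData (a VectorArithmetic
// value) lets SwiftUI interpolate between the old and new values.
struct Pulse: ViewModifier, Animatable {
    var phase: Double  // 0...1 progress of the pulse

    var animatableData: Double {
        get { phase }
        set { phase = newValue }
    }

    func body(content: Content) -> some View {
        content.scaleEffect(1 + 0.1 * sin(phase * .pi))
    }
}
```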

Animation

  • You can customize withAnimation by passing in a specific animation. There are three basic categories – timing curve, spring, and higher-order animations (which modify a base animation).
  • Apple recommends using spring animations – and a spring (.smooth) is the default if you use withAnimation { }.
  • New category – custom animations. animate, shouldMerge, and velocity are the three requirements for creating a custom animation (see the sketch after this list).
    • All three operate on vectors; only animate is required.
    • shouldMerge allows you to handle a user interrupting your executing animation.
    • velocity allows velocity to be preserved when a running animation is combined with a new one.
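
A minimal sketch of the CustomAnimation protocol (iOS 17), implementing only the required animate method as a linear ramp:

```swift
import SwiftUI

// A linear ramp over `duration`; returning nil ends the animation.
struct MyLinearAnimation: CustomAnimation {
    var duration: TimeInterval

    func animate<V: VectorArithmetic>(value: V,
                                      time: TimeInterval,
                                      context: inout AnimationContext<V>) -> V? {
        guard time < duration else { return nil }  // animation finished
        return value.scaled(by: time / duration)
    }
}

// Usage: withAnimation(Animation(MyLinearAnimation(duration: 0.5))) { ... }
```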

Transaction

  • This is a family of APIs; Transaction is a dictionary that propagates all the context for the current update.
  • This section of the talk explains how the transaction dictionary is used across the attribute graph.
  • This behavior enables APIs to control an animation; use a view modifier like .transaction { transaction in ... }.
    • Be careful with overriding – you should prefer .animation(.bouncy, value: selected) to remove accidental animation.
    • There is a new scoped version of .animation that applies an animation (e.g. .smooth) only to the modifiers in its body closure, which reduces the likelihood of accidental animation.
  • You can now extend the transaction dictionary with your own TransactionKey for use in custom animations (see the sketch after this list).
  • There are two new variants of the transaction modifier that make accidental animation even less likely.
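
A minimal sketch of a custom transaction key, assuming the iOS 17 TransactionKey API (names illustrative):

```swift
import SwiftUI

// Define a key with a default value, then expose it as a
// computed property on Transaction.
struct AvatarTappedKey: TransactionKey {
    static let defaultValue = false
}

extension Transaction {
    var avatarTapped: Bool {
        get { self[AvatarTappedKey.self] }
        set { self[AvatarTappedKey.self] = newValue }
    }
}

// Usage: set it when triggering the change...
//   withTransaction(\.avatarTapped, true) { selected.toggle() }
// ...and read it to choose an animation:
//   .transaction { $0.animation = $0.avatarTapped ? .bouncy : .smooth }
```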

Explore enhancements to App Intents

Widgets

  • Widget configuration allows you to provide options on the “back side” of the widget – these are parameters, forming an ordered list.
  • You define your scheme right in the App Intent code (see the sketch after this list).
  • You can provide dynamic parameters, which are supported by queries – check out Dive into App Intents from WWDC 2022.
  • You can migrate your widgets to App Intents with a single button. Once you do, you can remove your definition file.
    • You may have to adjust some of the resultant code
  • Whenever a customer updates your app, their intents are automatically upgraded.
  • Widgets can now perform actions
  • App Intents can be used for Siri and Shortcuts too – Bring your Widget to Life is a great session to catch up on this.
  • Dynamic options in queries – you can create queries that depend on other intent parameters via @IntentParameterDependency.
  • Array size allows you to limit the size of a parameter array so you can ensure things fit in the widget family being displayed.
  • You can also use ParameterSummary to show a sequence of items, including a new When function to show content only under certain sizes.
  • Continue User Activity – allows you to position users in the right place in your app when they tap on your widget.
  • RelevantIntentManager is used to help surface your intent at the right spot and time for a user.
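
A minimal sketch of a widget configuration intent (all names are illustrative, not from the session):

```swift
import AppIntents

// The configuration scheme lives right in the intent definition;
// each @Parameter becomes an option on the back of the widget.
struct SelectCityIntent: WidgetConfigurationIntent {
    static var title: LocalizedStringResource = "Select City"
    static var description = IntentDescription("Choose the city to display.")

    @Parameter(title: "City", default: "Cupertino")
    var city: String
}
```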

Developer experience

  • Framework support allows you to place your intents in frameworks via AppIntentsPackage, reducing compile time, code distribution size, etc. (see the sketch after this list)
    • May need to refactor my Wasted Time App Intents
  • App Shortcuts in extensions – you can create an AppShortcutsProvider in an extension so that your app does not have to be launched when using your shortcuts.
  • The compiler extracts App Intent information into a Metadata.appIntents bundle – this process has been sped up in Xcode 15.
  • Ability to continue an intent in your app – if you start an intent outside of your app, you can add ForegroundContinuableIntent conformance to enable this.
  • Added support for Apple Pay in App Intents; this was enabled in iOS 16.5.
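
A minimal sketch of the AppIntentsPackage pattern, assuming a hypothetical framework target:

```swift
import AppIntents

// In the framework target: mark it as an App Intents package.
public struct MyKitIntentsPackage: AppIntentsPackage { }

// In the app target: re-export the framework's intents so the
// compiler's metadata extraction can find them.
struct MyAppIntentsPackage: AppIntentsPackage {
    static var includedPackages: [any AppIntentsPackage.Type] {
        [MyKitIntentsPackage.self]
    }
}
```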

Shortcuts app integration

  • You can use App Intents in Shortcuts, App Shortcuts, Siri voice, the Apple Watch Ultra Action button, and Focus filters.
  • And now you can add more integration with SwiftUI apps, interactive Live Activities, and interactive widgets.
  • You should create a good ParameterSummary; make sure it reads like a sentence (see the sketch after this list).
  • Use isDiscoverable as needed – especially if an intent is only valuable within your app itself, like only in your interactive widgets.
  • You can also now provide progress via the ProgressReportingIntent protocol.
  • Find actions are easier to integrate with – add EntityPropertyQuery or EnumerableEntityQuery (the latter is easier, but optimized for a small number of entities).
  • Intent description has been updated with ResolvedValueName to be more descriptive based on the action being taken.
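
A minimal sketch of a parameter summary that reads like a sentence (intent and parameter names are illustrative):

```swift
import AppIntents

struct AddItemIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Item"

    @Parameter(title: "Item")
    var item: String

    // Reads as a sentence in the Shortcuts editor: "Add <item> to the list".
    static var parameterSummary: some ParameterSummary {
        Summary("Add \(\.$item) to the list")
    }

    func perform() async throws -> some IntentResult {
        .result()  // real work would go here
    }
}
```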

Evolve your ARKit app for spatial experiences

Prepare your experience 

  • Expand your app beyond the window – by default, apps launch into the Shared Space, like multiple apps on a desktop.
  • You can use the Shared Space or a Full Space – a Full Space gives you more features, like anchor entities and ARKit.
  • Prepare your content
    • Use USD to create your 3D content; it is at the heart of 3D content, and you can use USD files directly in Reality Composer Pro.
    • If you have custom materials, you will need to rebuild them in the shader editor in Reality Composer Pro.
    • You can access Reality Composer Pro directly from Xcode.

Use RealityView

Bring in your content

  • In the Shared Space – just add entities directly to the RealityView content; you can see more details in the session above.
  • In a Full Space – a key benefit is that you can anchor content to specific surroundings. Unlike iOS, you do not need to ask for permission to use anchor entities.
  • ARKit allows WorldAnchors and anchor persistence – you will be required to have user permission to use ARKit capabilities.

Raycasting

  • Allows you to reach out beyond arm’s length.
  • Requires collision components.
  • You can raycast with system gestures or hand tracking.
  • By generating mesh anchors of the environment to reconstruct the scene, you create an entity for each mesh anchor with a transform and a collision component to track the environment. These entities represent the surroundings.
  • This raycasting then allows you to add an input target so you can select a position in world space.
  • By using a HandAnchor from ARKit, you can raycast to identify the collision point in the scene, then create a world anchor at that position to save it. You then place the entity and transform it to the world anchor – placing the item in the real world.
  • By giving the placed item a collision component, you can now interact with it as if it were really there.

ARKit updates

  • Receiving anchors has changed to address spatial computing.
  • Now you have an à la carte selection of data providers for the anchors you wish to receive.
  • On xrOS, data providers deliver asynchronous anchor updates that are decoupled from other updates. You don’t get ARFrames anymore; the system handles this automatically, which reduces latency.
  • World anchor persistence
    • The system continuously persists this mapping for you.
    • Just use world anchors with a WorldTrackingProvider, which will allow you to load and unload virtual content (see the sketch after this list).
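
A minimal sketch of that flow, assuming the visionOS ARKit API shape (ARKitSession plus WorldTrackingProvider):

```swift
import ARKit

// Run only the providers you need (the à la carte model), then consume
// anchor updates asynchronously, decoupled from rendering.
final class WorldTracking {
    let session = ARKitSession()
    let provider = WorldTrackingProvider()

    func start() async throws {
        try await session.run([provider])
        for await update in provider.anchorUpdates {
            switch update.event {
            case .added, .updated:
                break  // load or reposition content for update.anchor
            case .removed:
                break  // unload content tied to this anchor
            }
        }
    }
}
```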

Enhance your spatial computing app with RealityKit

There are a lot of new features in RealityKit, as highlighted on this slide.

RealityView attachments

  • This allows you to attach SwiftUI content to a RealityKit scene.
  • Using a RealityView, you add your various entities and content; you add a new attachments parameter to your closure, plus a view builder where you add your SwiftUI elements, each with a .tag of any hashable value. Then you use entity(for:) with the same tag to add it to the scene (see the sketch after this list).
  • This creates a view attachment entity that you can add like any other entity.
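
A minimal sketch of the attachments flow as described in the session (content names are illustrative):

```swift
import SwiftUI
import RealityKit

struct ImmersiveView: View {
    var body: some View {
        RealityView { content, attachments in
            // Look up the SwiftUI attachment by its tag and add it
            // to the scene like any other entity.
            if let label = attachments.entity(for: "label") {
                label.position = [0, 0.2, 0]
                content.add(label)
            }
        } attachments: {
            Text("Hello, volume!")
                .padding()
                .tag("label")
        }
    }
}
```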

Video Playback

  • VideoPlayerComponent is a new component type for embedding video inside a scene.
  • Load the file from the bundle (or another location), create an AVPlayer instance, and then create the VideoPlayerComponent (see the sketch after this list).
    • This will create a video mesh with the appropriate aspect ratio for the video.
  • You can handle both 3D and 2D video, and the player will also handle captions.
  • By default the video will be 1 meter in height, but you can scale it to the size you’d like.
  • It also supports passthrough tinting to match colors in the environment.
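
A minimal sketch of those steps (the asset name is illustrative):

```swift
import AVFoundation
import RealityKit

// Builds an entity that plays a bundled video file.
func makeVideoEntity() -> Entity {
    let entity = Entity()
    let url = Bundle.main.url(forResource: "intro", withExtension: "mp4")!  // hypothetical asset
    let player = AVPlayer(url: url)

    var video = VideoPlayerComponent(avPlayer: player)
    video.isPassthroughTintingEnabled = true  // tint passthrough to match the video

    entity.components.set(video)
    player.play()
    return entity
}
```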

Portals

  • Portals render a view with different lighting, masked by the portal’s geometry.
  • First create an entity with a WorldComponent. You can attach children to this entity, and they will only appear within the portal.
  • You create the portal with a model, a transform, and a PortalComponent that is targeted at the world entity (see the sketch after this list).
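
A minimal sketch of that setup (mesh and sizes are illustrative):

```swift
import RealityKit

// Builds a portal showing a separately lit world.
func makePortal() -> (world: Entity, portal: Entity) {
    // Children of the world entity render only inside the portal's mask.
    let world = Entity()
    world.components.set(WorldComponent())
    // ...add sky, terrain, etc. as children of `world`...

    // The portal: a plane with a portal material, targeting the world.
    let portal = Entity()
    portal.components.set(ModelComponent(
        mesh: .generatePlane(width: 1, height: 1, cornerRadius: 0.5),
        materials: [PortalMaterial()]
    ))
    portal.components.set(PortalComponent(target: world))
    return (world, portal)
}
```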

Particle Emitters

  • Now that we have a world and a portal into it, we can add a particle emitter to give it some excitement.
  • You can create particle emitters in code with RealityKit, or design them in Reality Composer Pro (see the sketch after this list).
  • Even if you create your emitter in Reality Composer Pro, you can still modify it in code.
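
A small sketch of the in-code route, assuming the ParticleEmitterComponent API (parameter values are illustrative):

```swift
import RealityKit

// Attaches a simple particle effect to an entity.
func addSparkles(to entity: Entity) {
    var particles = ParticleEmitterComponent()
    particles.emitterShape = .sphere
    particles.mainEmitter.birthRate = 500  // particles per second
    entity.components.set(particles)
}
```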

Anchors

  • Now let’s attach the portal to a wall… this is the purpose of anchors.
  • There are two tracking modes, .once and .continuous – .once will not move once it is placed; .continuous will move with the anchor.
  • To use anchors you must be in an ImmersiveSpace, because you will need to render things outside of the current space.
  • The AnchorEntity will look for a vertical wall that is at least 1 m by 1 m – and that’s it! (See the sketch below.)
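
A minimal sketch of that anchor, assuming the AnchorEntity plane-target API:

```swift
import RealityKit

// Anchors content to the first vertical wall at least 1 m × 1 m;
// .once keeps the content in place after initial placement.
func makeWallAnchor(for content: Entity) -> AnchorEntity {
    let anchor = AnchorEntity(
        .plane(.vertical, classification: .wall, minimumBounds: [1, 1]),
        trackingMode: .once
    )
    anchor.addChild(content)
    return anchor
}
```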

Design with SwiftUI

SwiftUI as a design Tool

  • Don’t spend time recreating common elements.
  • Some tools are really good at making things easy – they lower the floor.
  • Other tools are focused on being powerful – they raise the ceiling.
  • SwiftUI tries to strike a balance between the two.
  • Being declarative makes it easy to understand what you are trying to do.
  • You also get access to simple system controls like color pickers and framers.
  • You also have access to the unique capabilities of Apple hardware, and to all the frameworks.

Getting the details right

  • Modern interfaces are dynamic, with complex flows and interactive elements.  These are all surfaced in SwiftUI.
  • By testing on the device, you can see directly how things work and discover usability issues, like the zoom speed of the default map on the Apple Watch
  • Check whether the ticker animation is available for my own apps.

Designing for interactions

  • Animations give you a sense of how your design feels, not just how it looks.
  • By building a quick prototype in SwiftUI, they could test interactions with a real device.

Testing your ideas

  • You need to confirm how things work; design can get carried away with ideal scenarios, but you need to test with “real” scenarios and find where they break.
  • You can do this in SwiftUI by trying lots of different values on a real device.
  • Create one-off design tools to test multiple designs at one time; these are easy to build in SwiftUI.

Presenting your work

  • A great way to share with others – you can use SwiftUI to share designs via on-device demos.
  • It allows designs to explain themselves, cutting down on meetings.
  • No slide deck can compare to one spectacular demo

Build Widgets for the Smart Stack on Apple Watch

Another code-along session, using WidgetKit and App Intents. Check out the code, which is available in the Session 10029 folder – https://developer.apple.com/documentation/watchOS-Apps/updating-your-app-and-widgets-for-watchos-10

Widget Configuration

  • AppIntentConfiguration is new in watchOS 10 and will be used in this project.
  • Make sure you check out last year’s Dive into App Intents to get started, and then review Explore enhancements to App Intents.

Timeline Setup

  • A timeline entry holds all the data needed to render a widget on a particular date.

Widget Views

  • There was significant work on making a nice-looking widget, but for some reason I could not get the previews to show it. It kept crashing, and then it said there was no preview defined. Oh well, I can see it in the simulator.

Timeline

  • Make sure you create entries for the current date and future dates, so that you fill the timeline with relevant data.
  • Also, set up an AppIntentRecommendation (see the sketch after this list).
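
A minimal sketch of an entry plus a gallery recommendation, assuming an AppIntentTimelineProvider with an illustrative configuration intent:

```swift
import WidgetKit
import AppIntents

// Hypothetical configuration intent for the widget.
struct SelectStationIntent: WidgetConfigurationIntent {
    static var title: LocalizedStringResource = "Select Station"

    @Parameter(title: "Station", default: "Main St")
    var station: String
}

// One entry: everything needed to render the widget at `date`.
struct DepartureEntry: TimelineEntry {
    let date: Date
    let nextDeparture: String
}

// In the AppIntentTimelineProvider: pre-configured options shown
// in the watchOS widget gallery.
func recommendations() -> [AppIntentRecommendation<SelectStationIntent>] {
    [AppIntentRecommendation(intent: SelectStationIntent(),
                             description: "Next departure")]
}
```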

Relevance (this determines when our widget should be prioritized in the Smart Stack)

  • Updating the relevance information at the right time in your app is key. This session does a great job of explaining the why behind each update in the code.

Enhance your app’s audio experience with AirPods

AirPods Automatic Switching for macOS

  • This is based on the user’s intent, using “now playing” registration and user activities.
  • App Store apps support this by default.
  • Best practices –
    • Your app should register for now playing if it is a media or long-form audio app.
    • Conference or gaming apps should not register for now playing.
    • You should use the Audio Services APIs to avoid unexpected behavior.
    • Conference apps should only enable the microphone during the meeting and close it when the meeting session is done.
    • You should select the default route to play audio, and avoid playing silence if the user hits pause.

Press to mute and unmute support in iOS 17/macOS 14

  • Press to mute and unmute adds convenience – mute or unmute with a simple press of the stem.
  • All CallKit apps get this support by default.
  • To do this yourself, use AVAudioApplication to configure application-wide audio behavior (see the sketch after this list).
  • On macOS it works a little differently – the app is responsible for muting any uplink audio when the gesture is performed, and there are additional API components you need to use.
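
A minimal sketch of the iOS side, assuming AVAudioApplication’s mute APIs in iOS 17 (the notification userInfo key is my assumption):

```swift
import AVFAudio

// Observe mute-state changes (e.g. triggered by an AirPods stem press).
func observeMuteGesture() {
    NotificationCenter.default.addObserver(
        forName: AVAudioApplication.inputMuteStateChangeNotification,
        object: nil,
        queue: .main
    ) { note in
        // Assumed userInfo key; update the app's UI for the new state.
        if let muted = note.userInfo?[AVAudioApplication.muteStateKey] as? Bool {
            print("Microphone muted:", muted)
        }
    }
}

// Application-wide input mute, independent of any single audio session.
func setMicrophoneMuted(_ muted: Bool) {
    try? AVAudioApplication.shared.setInputMuted(muted)
}
```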

Spatial Audio with AirPods

  • 80% of Apple Music listeners use spatial audio.
  • Note the last two options don’t have an API interface; they are enabled automatically for apps that register via Now Playing.