Update Live Activities with Push Notifications

ActivityKit allows you to display Live Activities that show status and updates from your app

Preparations

  • You should understand how push notifications work: your app requests a push token from APNs, sends it to your server, and the server uses that token to send payloads through APNs, which delivers them to the device.
  • There is a new APNs liveactivity push type – only available over token-based connections to APNs.
  • Modify your app so it can handle push notifications – add the Push Notifications capability under Signing & Capabilities in Xcode
  • Requesting a push token is an asynchronous process, so you can’t read one immediately after requesting the activity with pushType: .token. The proper way to handle this is to set up a Task that awaits the activity’s push token updates (see the sketch after this list).
  • The push token is converted to a hexadecimal string for debugging/logging purposes.
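
A minimal sketch of requesting a Live Activity with a push token and awaiting token updates; the AdventureAttributes type and its fields are illustrative, not the session’s exact sample code:

import ActivityKit

// Hypothetical attributes type for illustration.
struct AdventureAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var currentHealthLevel: Double
        var eventDescription: String
    }
}

func startActivity() throws {
    let initialState = AdventureAttributes.ContentState(
        currentHealthLevel: 1.0,
        eventDescription: "Adventure has begun!")

    // Request the Live Activity with a push type of .token.
    let activity = try Activity.request(
        attributes: AdventureAttributes(),
        content: .init(state: initialState, staleDate: nil),
        pushType: .token)

    // The token arrives asynchronously; await updates in a Task.
    Task {
        for await pushToken in activity.pushTokenUpdates {
            // Hex-encode for logging; send the raw token to your server.
            let token = pushToken.map { String(format: "%02x", $0) }.joined()
            print("Activity push token: \(token)")
        }
    }
}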

First push update

  • This is an http request to APNs – you must provide the following headers
    • apns-push-type: liveactivity
    • apns-topic: <BUNDLE_ID>.push-type.liveactivity
    • apns-priority: 5 (low priority; high priority is 10)
  • Your payload is a JSON package with the appropriate values. In the ranger sample app it looks like this:
{
    "aps": {
        "timestamp": 1685952000,
        "event": "update",
        "content-state": {
            "currentHealthLevel": 0.941,
            "eventDescription": "Power Panda found a sword!"
        }
    }
}
  • Note the timestamp is seconds since 1970; event is the action for the Live Activity – it should be update or end; content-state is the JSON that decodes to your activity’s ContentState and determines which data is displayed.
  • Don’t set any custom encoding strategies for your JSON – or the system may not decode your content state correctly
  • You can use the terminal to send a push to APNs without requiring changes to your server – this is covered in the developer documentation
  • If you use the terminal, grab the push token from the log in the step above and set an environment variable ACTIVITY_PUSH_TOKEN with that value
    • Now if you use the terminal to test your push notification it will have the correct token 
    • Here’s a sample curl command:
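
A hedged sample of the curl command (the sandbox host and variable names follow Apple’s documentation; adjust the bundle ID and payload to your app):

curl \
  --http2 \
  --header "apns-topic: ${BUNDLE_ID}.push-type.liveactivity" \
  --header "apns-push-type: liveactivity" \
  --header "apns-priority: 5" \
  --header "authorization: bearer ${AUTHENTICATION_TOKEN}" \
  --data '{
      "aps": {
          "timestamp": '$(date +%s)',
          "event": "update",
          "content-state": {
              "currentHealthLevel": 0.941,
              "eventDescription": "Power Panda found a sword!"
          }
      }
  }' \
  https://api.sandbox.push.apple.com/3/device/${ACTIVITY_PUSH_TOKEN}
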
  • Note you still need your authentication token (a bearer token) for the APNs API, and make sure you are using HTTP/2 when connecting to APNs
  • To debug, look at device logs in the Console app and watch the following processes – liveactivitiesd, apsd, and chronod

Priority and alerts

  • Make sure you use the correct priority; start with low priority – it is opportunistic and saves the user’s battery
  • There is no limit to low priority updates
  • High-priority updates are delivered immediately, but there is a budget based on device conditions.
  • If you have a frequent-update requirement there is a “requires frequent high-priority updates” feature for apps, which gets a higher update budget but may still be throttled
  • Just add a new key to the Info.plist: NSSupportsLiveActivitiesFrequentUpdates = YES
  • Note users can disable this – so check the .frequentPushesEnabled property on ActivityAuthorizationInfo()
  • You can add an alert object to a push notification – with a title, body, and sound. Note you should localize this message via a localized string object; this will generate an alert on the device. To use a custom sound, include it as an app resource and then set it in the payload (see the example below).
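
A hedged example of an update payload with an alert object (the values are illustrative; the loc-key form is the standard localized-alert convention, and the sound file must be bundled as an app resource):

{
    "aps": {
        "timestamp": 1685952000,
        "event": "update",
        "content-state": {
            "currentHealthLevel": 0.0,
            "eventDescription": "Power Panda has been knocked down!"
        },
        "alert": {
            "title": {
                "loc-key": "%@ is knocked down!",
                "loc-args": ["Power Panda"]
            },
            "body": {
                "loc-key": "Use a potion to heal %@!",
                "loc-args": ["Power Panda"]
            },
            "sound": "HeroDown.caf"
        }
    }
}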

Enhancements

  • Sending a push payload with “event”: “end” ends the Live Activity – you can also add a custom “dismissal-date” (again in seconds since 1970) to control when it is removed.
  • You can add a “stale-date” to allow the view to react when users miss an update – then update your UI to change state based on it. (See the example payload below.)
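
A hedged example of an end payload with a custom dismissal date (values illustrative):

{
    "aps": {
        "timestamp": 1685952000,
        "event": "end",
        "dismissal-date": 1685959200,
        "content-state": {
            "currentHealthLevel": 0.0,
            "eventDescription": "The adventure is over. Power Panda needs rest."
        }
    }
}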

Migrate to SwiftData

Generate model classes 

  • Use your managed object model to generate your SwiftData model: open your model file in Xcode and select Editor -> Create SwiftData Code
    • You can also write the model classes by hand (see the sketch below).
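
A minimal sketch of what a SwiftData model class looks like, assuming a simple hypothetical Trip entity:

import Foundation
import SwiftData

// Hypothetical entity; generated code from a Core Data model looks similar.
@Model
final class Trip {
    var name: String
    var destination: String
    var startDate: Date
    var endDate: Date

    init(name: String, destination: String, startDate: Date, endDate: Date) {
        self.name = name
        self.destination = destination
        self.startDate = startDate
        self.endDate = endDate
    }
}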

Complete adoption

  • When fully migrating, you are replacing the Core Data stack with the SwiftData stack
  • Focus on understanding your Core Data model designs, and check that they are supported in SwiftData
  • Highlights
    • Generate model classes, and delete the CoreData Managed Object model
    • Delete the persistent file that setup the stack
    • Set up the ModelContainer – via a modifier on a WindowGroup. This also sets up the model context in the environment
    • You insert new objects into the modelContext – it saves implicitly to persist data, so you can remove your explicit save calls
    • You can use @Query instead of @FetchRequest (see the sketch after this list)
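
A hedged sketch of full adoption, reusing the illustrative Trip model from the earlier sketch:

import SwiftUI
import SwiftData

@main
struct TripsApp: App {
    var body: some Scene {
        WindowGroup {
            TripList()
        }
        // Sets up the container and puts a model context in the environment.
        .modelContainer(for: Trip.self)
    }
}

struct TripList: View {
    // @Query replaces @FetchRequest.
    @Query(sort: \Trip.startDate) private var trips: [Trip]
    @Environment(\.modelContext) private var context

    var body: some View {
        NavigationStack {
            List(trips) { trip in
                Text(trip.name)
            }
            .toolbar {
                Button("Add") {
                    // Insert into the context; the implicit save persists it.
                    context.insert(Trip(name: "New Trip", destination: "TBD",
                                        startDate: .now, endDate: .now))
                }
            }
        }
    }
}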

Coexist with Core Data

  • If you want coexistence you will end up with two different persistence stacks talking to the existing store
  • Before loading the persistent store, make sure both stacks use the same store URL and turn on .setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey) – if you don’t do this, the store will be in read-only mode (see the sketch after this list)
  • SwiftData is only available starting in iOS 17 and macOS Sonoma
  • Requirements
    • Use a namespace for class names to ensure there are no collisions – but entity names will stay the same
    • Keep schemas in sync – adding properties in the same way on both sides
    • You must keep track of schema versioning – check out Model your schema with SwiftData from WWDC23
    • If you are using UIKit or AppKit you need to keep using Core Data
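
A minimal sketch of the Core Data side of coexistence, assuming an illustrative store URL and container name:

import CoreData

func makeCoexistingContainer() -> NSPersistentContainer {
    // Point Core Data at the same store URL the SwiftData stack uses.
    let storeURL = URL.applicationSupportDirectory.appending(path: "Trips.sqlite")
    let description = NSPersistentStoreDescription(url: storeURL)
    // Required for coexistence; without it the store is read-only.
    description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)

    let container = NSPersistentContainer(name: "Trips")
    container.persistentStoreDescriptions = [description]
    container.loadPersistentStores { _, error in
        if let error {
            fatalError("Failed to load store: \(error)")
        }
    }
    return container
}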

Integrate with motorized iPhone stands using DockKit

Introduction to DockKit

  • Allows iPhone to act as the brains of motorized camera stands – controlling pitch and yaw
  • Stands have an LED indicator to let you know when tracking is active, plus simple buttons for power and for disabling tracking
  • You pair a phone with the stand – all the control is in the iPhone at the system level, so any app that uses camera APIs can use this feature
  • Demonstrated a prototype stand that allows the camera to track the speaker.

How it works

  • It works with the camera processing pipeline in iOS: it estimates trajectory and then adjusts to keep the user in frame.
  • The process runs as a daemon and controls actuators on the device – at 30 fps.
  • It uses bounding boxes and can track multiple users via face and body detection.
  • DockKit starts with a primary tracked person and will try to frame others, but will ensure that the primary stays in frame.

Custom Control

  • You can change framing, change what is being tracked, or directly control the motorized accessory 
  • You must register for accessory state changes in order to do this: DockAccessoryManager.shared.accessoryStateChanges – there are only two states you must handle, docked and undocked (see the sketch after this list).
  • You can change cropping via a framing mode (left, center, or right) or you can set a specific region of interest.
  • You can also do custom motor control – just call setSystemTrackingEnabled(false) – then you can adjust the rotation yourself: pitch (X) and yaw (Y)
  • You can also add your own inference using Vision, custom ML, or other heuristics – use this to decide what you want to track. Just set a bounding box on the item to track the observation you’ve defined. Observations can be of type .humanFace or .object
  • The current Vision framework can already detect a number of items that can be turned into trackable observations
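
A rough sketch of listening for dock state changes; this assumes the async accessoryStateChanges sequence and event shape shown in the session, so the exact names may differ:

import DockKit

func observeDockState() async throws {
    for await state in try DockAccessoryManager.shared.accessoryStateChanges {
        switch state.state {
        case .docked:
            // Safe to start framing or custom tracking work here.
            print("Accessory docked")
        case .undocked:
            print("Accessory undocked")
        default:
            break
        }
    }
}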

Device animations

  • There are four built in animations – yes, no, wakeup and kapow!
  • You can setup a motion detector to trigger the animations

Go beyond the window with SwiftUI

How to leverage visionOS using SwiftUI; many of the tools and frameworks have been extended over the last few years, making AR more accessible across devices and platforms.

This is all about the Immersive Space in visionOS. Both windows and volumes let you display content within their bounds. To go outside of those bounds you must use an immersive space.

Spaces are a container to present your UI

Get Started with Spaces

  • Extending the “Hello World” app to explore the solar system. Begin by defining an ImmersiveSpace; while one or more spaces can be defined within an app, you can only have one open at a time.
  • Opening a space replaces the Shared Space with your Full Space, and implicitly defines its own coordinate system based on the user’s position (see the sketch below)
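
A minimal sketch of defining an ImmersiveSpace alongside a window; the "solar-system" id and the views are illustrative:

import SwiftUI

@main
struct WorldApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }

        // A space is a scene type, declared like a WindowGroup.
        ImmersiveSpace(id: "solar-system") {
            SolarSystemView()
        }
    }
}

struct ContentView: View {
    var body: some View { Text("Hello World") }
}

struct SolarSystemView: View {
    var body: some View { Text("Solar system content goes here") }
}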

Display Content

  • It is a scene type, so you can place any SwiftUI view in it. Anything you place uses the same layout system you are used to, but since the origin is at the user’s feet, you will want to position content using RealityView. RealityView has built-in support for asynchronous loading, and you use ARKit anchors for placement.
  • Remember that RealityKit uses a different coordinate system than SwiftUI. Check out Build spatial experiences with RealityKit (WWDC23)
  • You should define an id or a value type (or both) for the space. For control you have the .openImmersiveSpace and .dismissImmersiveSpace actions from the @Environment; the system will automatically animate the transition. They are asynchronous, so you can act on the result when they complete (see the sketch after this list).
  • The order of scenes in your app body decides which one shows up first.
  • You should use Model3D for asynchronous loading of your assets 
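
A hedged sketch of opening the space from a window, reusing the illustrative "solar-system" id (the environment actions are the real API; the result handling is a sketch):

import SwiftUI

struct SolarSystemControls: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace
    @Environment(\.dismissImmersiveSpace) private var dismissImmersiveSpace

    var body: some View {
        Button("View Solar System") {
            Task {
                // Asynchronous: the result tells you whether the space opened.
                switch await openImmersiveSpace(id: "solar-system") {
                case .opened:
                    break
                default:
                    print("The space could not be opened")
                }
            }
        }
    }
}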

Managing your Space

  • Scene management and coordinate conversion are key to understand. 
  • Scene phases should be handled to let the user know what is going on, for when an alert pops up or other interruptions occur.
    • You can do things like changing scale to let the user know
  • For repositioning items between SwiftUI and 3D content, the session walks through a coordinate-conversion graphic.
  • Note that the window and the 3D object use the Y axis differently. Using transform(in: .immersiveSpace) will give you a conversion.
  • There may be conversions you need to handle between private immersive space vs. Group Immersive Space
  • Immersion Styles
    • This is a different presentation for how your environment looks: mixed, progressive, or full
    • The default is mixed; you can use the scene modifier .immersionStyle to define the styles your scene supports (see the sketch after this list).
    • The progressive style allows you to see people around you and have some interaction; you can choose the level of progression, but if you go all the way it will take you to full immersion
      • By pressing the digital crown you go back to passthrough
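
A hedged sketch of declaring the supported immersion styles (the view name and space id are illustrative):

import SwiftUI

@main
struct SolarSystemApp: App {
    // Tracks the current style; the system moves between supported styles.
    @State private var style: ImmersionStyle = .progressive

    var body: some Scene {
        ImmersiveSpace(id: "solar-system") {
            SolarSystemView()
        }
        // Declares which immersion styles this scene supports.
        .immersionStyle(selection: $style, in: .mixed, .progressive, .full)
    }
}

struct SolarSystemView: View {
    var body: some View { Text("Solar system content goes here") }
}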

Customization

  • A scene manifest is used to launch your app directly into full immersion – set UIApplicationPreferredDefaultSceneSessionRole to UISceneSessionRoleImmersiveSpaceApplication in the Info.plist
  • If you want to toggle back to a window set UISceneInitialImmersionStyle to UIImmersionStyleMixed 
  • Use .preferredSurroundingsEffect to set the surroundings effect (e.g., dimming passthrough) when you move to full immersion
  • Hiding the user’s hands means you can show virtual hands instead – you use RealityView to do this. The hands are entities driven by ARKit and the hand-tracking API: check for hand anchors and their updates, and apply the anchor’s transform to your entity. See “Evolve your ARKit app for spatial experiences” from WWDC23.

Explore pie charts and interactivity in Swift Charts

Pie charts

  • Pie charts are new to Swift Charts
  • They don’t have axes and don’t really show precision – so they’re great for casual, intuitive analysis.
  • You add a SectorMark and can customize the look – increasing the inner radius gives you a donut chart
  • The code for a pie chart is pretty easy –

Chart(data, id: \.name) { element in
    SectorMark(
        angle: .value("Sales", element.sales)
    )
    .foregroundStyle(by: .value("Name", element.name))
}

  • That’s it. By using SectorMark, you end up with the pie chart; you can add other properties and modifiers via innerRadius, angularInset, and .cornerRadius

Selection

  • This is interactivity for your chart. Check out the heart rate chart by Apple.
  • The .chartXSelection(value: $selection) modifier allows you to capture the selection information and then provide additional detail like a popover or annotation; you can use drag gestures to do a range selection (see the sketch after this list)
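
A hedged sketch of x-axis selection on a heart-rate-style chart; HeartRateSample and its fields are hypothetical:

import SwiftUI
import Charts

struct HeartRateSample: Identifiable {
    let id = UUID()
    let date: Date
    let bpm: Int
}

struct HeartRateChart: View {
    let samples: [HeartRateSample]
    @State private var rawSelectedDate: Date?

    var body: some View {
        Chart {
            ForEach(samples) { sample in
                LineMark(
                    x: .value("Time", sample.date),
                    y: .value("BPM", sample.bpm)
                )
            }
            // Draw a rule at the selected point; hang a popover/annotation here.
            if let rawSelectedDate {
                RuleMark(x: .value("Selected", rawSelectedDate))
            }
        }
        // Binds the x-axis selection; nil when nothing is selected.
        .chartXSelection(value: $rawSelectedDate)
    }
}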

Scrolling

  • Navigating the data: just add .chartScrollableAxes, set a visible domain with .chartXVisibleDomain, and provide .chartScrollPosition as a binding to say what part of the data to show in the scroll region. Adding .chartScrollTargetBehavior allows you to snap to boundaries when things scroll out of view (see the sketch below).
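
A hedged sketch of a horizontally scrollable chart; SalesDay and the 30-day window are illustrative:

import SwiftUI
import Charts

struct SalesDay: Identifiable {
    let id = UUID()
    let date: Date
    let sales: Int
}

struct ScrollableSalesChart: View {
    let days: [SalesDay]
    @State private var scrollPosition = Date.now

    var body: some View {
        Chart(days) { day in
            BarMark(
                x: .value("Day", day.date, unit: .day),
                y: .value("Sales", day.sales)
            )
        }
        .chartScrollableAxes(.horizontal)
        // Show roughly 30 days at a time (length is in seconds for dates).
        .chartXVisibleDomain(length: 3600 * 24 * 30)
        // Binding that reads and controls the scroll position.
        .chartScrollPosition(x: $scrollPosition)
    }
}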

Discover Quick Look for spatial computing 

Quick Look is a system framework to preview and edit files. It is secure and private, protecting you from unsafe files. Just hit Space on macOS or long-press on iOS. On visionOS you pinch and drag the file outside of the application window – this is called Windowed Quick Look. It supports pinch and drag for zooming on USDZ files.

Windowed quick look

  • Allows for QL outside of your app, so you can put it next to key content. You can also close your app and the preview will persist. Some file types support SharePlay – allowing group collaboration.
  • Apps can present this by providing an NSItemProvider with a URL so the item can be dragged outside of your window. This is a copy of the provided file, so edits won’t be sent back to your application’s file. (Add an .onDrag modifier to an item in SwiftUI – see the sketch after this list.)
  • On websites, you can achieve this with AR content linking – and get the visionOS feature by default – check Advances in AR Quick Look from WWDC 2019. You can also get this by simply adding rel="ar" to a link anchor in your HTML; Safari will then present your item in a Quick Look preview.
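
A hedged sketch of the drag-out provider (fileURL is hypothetical):

import SwiftUI

struct PreviewableItem: View {
    let fileURL: URL

    var body: some View {
        Label(fileURL.lastPathComponent, systemImage: "doc")
            .onDrag {
                // The system receives a copy of the file for windowed Quick Look.
                NSItemProvider(contentsOf: fileURL) ?? NSItemProvider()
            }
    }
}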

In-app Quick look

  • All you need to do is pass a URL to the quickLookPreview modifier to get a full-screen preview sheet. Or if you use a QLPreviewController, you can provide customizations; existing code using these will just work on visionOS (see the sketch after the list below)
  • Supported file types include:
    • Images
    • Videos
    • Audio Files
    • PDF documents
    • HTML documents
    • RTF (Rich text format) documents
    • Text files that conform to public.text
    • iWork and Office documents
    • Zip archives
    • USDZ
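
A hedged sketch of the SwiftUI modifier path (documentURL is hypothetical):

import SwiftUI
import QuickLook

struct DocumentRow: View {
    let documentURL: URL
    @State private var previewURL: URL?

    var body: some View {
        Button("Preview") {
            previewURL = documentURL
        }
        // Presents a full-screen preview while the binding is non-nil.
        .quickLookPreview($previewURL)
    }
}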

Design Shortcuts for Spotlight

Spotlight lets you search for anything, and now App Shortcuts will show up next to your app in the search results

Ideation

  • Focus on essential, habitual actions in your app.
  • Don’t be compelled to have a lot of shortcuts
  • Design them to be predictable and personal
    • Also personalize them based on decisions your users make – e.g., values pinned in your app should be pinned in the shortcuts

Creation

  • Every shortcut is either an action or entity (verb or noun)
  • All actions are shown with an SF Symbol in the search results; your entities should use an icon shape
  • Be brief with titles, yet easily understood
  • Think beyond just the shape; include other visual cues
  • Color – you can add a solid or gradient as a complementing color, or use a secondary tint color
  • Behavior – app launch, Live Activity, or snippet – launch is usually the right action. Design dynamic Live Activities (WWDC23) is a good session
    • If you do a snippet you may want to present it and then go into your app

Discovery

  • To make your App Shortcuts discoverable – add synonyms for your shortcut phrase; provide a handful of these.

Design dynamic Live Activities

This covers the Lock Screen, StandBy, and the Dynamic Island

On the Lock Screen

  • They appear at the top of the list, with a 14pt margin on all layouts. Don’t try to replicate the notification layouts; be unique and make it graphically rich.
  • Think about quick glances, and only include buttons if they serve a critical function.
  • Using similar iconography, color, and fonts will make it match your app. If you use your logo, don’t put the app icon.
  • Spacing – use space to focus information, but be as compact as possible. You can dynamically change height as you have more or less information to display.
  • Transitions – when updating between moments you can apply transitions, like the numeric content transition, to count important numbers up and down. This is an iOS 17 beta feature. Elements animate between updates.
  • Alerting – you should alert when there is an update that requires the user’s attention.
  • Remove your activity when it is no longer relevant.

StandBy

  • You can update your activity for StandBy – your layout is upscaled 200% and your background color is extended. Background elements may get cut off and cause visual issues; try removing them. Make sure all assets and images are at a high enough resolution. In Night Mode, you will get a nice red tint.

For Dynamic Island –

  • Use rounded shapes, thicker shapes, and use of color
  • Objects should be concentric with the shape of the dynamic island
  • Think of blurring objects to check how they would look in an optically good spot; make sure you stay inside the island. Using rounded rectangles will help
  • Use an inset and/or separator line
  • There are three sizes:
    • Compact – informational only, showing the most essential information. Be snug against the sensor region. If you are showing multiple sessions, think about ticking between views. If you need to show an alert, think about expanding the island to show the data
    • Minimal
    • Expanded view – users can press into the island to get to this. Show the essence of your app, be in harmony with your app’s colors, etc. Try to maintain relevant placement of items from the compact view. Also avoid having a “forehead”; wrap content around the island.

Demystify SwiftUI Performance

Building a mental model for performance in SwiftUI

Performance feedback Loop

  • Symptoms – slowness, broken animation, or spinner
  • Measure – to get data
  • Identify cause
  • Optimize to fix the root cause – then re-measure to verify
  • Resolve – and repeat the process as needed

Pre-requisites

  • Identity
  • View lifetime and value

Watch Demystify SwiftUI from WWDC 2021

Dependencies

  • Understand how a view depends on other views and their dependencies – all views ultimately resolve to a leaf view
  • The update process can be understood via the view graph of dependencies; this includes dynamic properties (like @Environment values). The process recurses through all the nodes that require updates, propagating down to all the leaves.
  • To investigate, you can use the _printChanges method in SwiftUI: set a breakpoint and evaluate “expression Self._printChanges()” in lldb. This is a best-effort debugging tool for understanding why a view changed. You can also add a let _ = Self._printChanges() to a view body and it will print to the console (see the sketch after this list). You MUST remove this code before submitting to the App Store
  • By extracting views so each only depends on the specific data it displays, you get more targeted updates
  • You can use @Observable, which will also help reduce dependencies
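
A minimal sketch of the in-body form (RangerRow is illustrative); remember to remove it before shipping:

import SwiftUI

struct RangerRow: View {
    let name: String

    var body: some View {
        // Debug-only: prints why this view's body was re-evaluated.
        let _ = Self._printChanges()
        Text(name)
    }
}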

Faster View updates

  • Analyze Hangs in Instruments from WWDC23 should be reviewed
  • There is a tech talk on Hitches
  • They both originate from a slow update (like data filtering)
  • You shouldn’t do synchronous work when loading your data in a view (I should check this in my own apps)

Identity in Lists and Tables

  • These are complex, advanced controls that can cause performance issues. Understanding identity is key to improving this.
  • Identity helps SwiftUI manage view lifetime
    • Adding a ForEach in a List impacts performance: the list needs to know how many rows (views) to create, and identity is used in creating them.
    • For filters, you should not filter inline in the view builder, as all of the rows still need to be created. You can move the filter into the ForEach, but that is slow too. So you should move the filter into the model – that is the fastest method (see the sketch after this list).
  • Avoid using AnyView in ForEach
  • Flatten nested ForEach constructs, except in sectioned lists, where SwiftUI understands how to optimize this. (I can possibly use this for the category view in my own app)
  • These same rules work in Tables too
  • Keep the number of rows per element in Tables and Lists constant so the total row count stays cheap to compute
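
A hedged sketch of moving a filter into the model; the Dog types are illustrative:

import SwiftUI
import Observation

struct Dog: Identifiable {
    let id = UUID()
    var name: String
    var isFavorite: Bool
}

@Observable
final class DogStore {
    var dogs: [Dog] = []

    // Filter once in the model instead of inline in the List/ForEach,
    // so row identity and count stay cheap to compute.
    var favorites: [Dog] { dogs.filter(\.isFavorite) }
}

struct FavoritesList: View {
    let store: DogStore

    var body: some View {
        List(store.favorites) { dog in
            Text(dog.name)
        }
    }
}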

Create practical workflows in Xcode Cloud

This session will create sample workflows based on three different case studies.

Solo developer

  • One app available on two platforms, with most code on the main branch and a few dependencies
  • Sends builds to TestFlight for friends, manually releases the app
  • Key is simplicity that is reliable and maintainable
  • Simple what, where and when
    • You start with what version of macOS and Xcode you want 
    • By default it will build whenever code is pushed to the main branch
      • For solo you may want to build regardless of branch
    • Select an Archive action with TestFlight and App Store as the target
    • Set a post-action of TestFlight – External team, and select a group
    • Create an archive action for each platform, plus the post-actions that you’d like
    • To add a dependency beyond just Swift packages – more information is available in the documentation – you can create a “post-clone” script that runs, for example, a Python command to include additional items in the build environment.

Medium-sized team

  • A globally distributed team of developers, QA, and PM. Separate development branches with pull requests to merge, using tags to mark a release
  • Both UI and unit testing are done
  • Multiple TestFlight plans – they use Xcode Cloud to communicate
  • Create three workflows
    • Pull request
      • Whenever a new pull request is created, tests are run
      • Add a new pull request start condition with a target branch of Beta, and remove the branch start condition
      • Create a new Action and select a Test target
        • Add multiple devices to cover appropriate mix for simple testing
      • With a successful build on this workflow – a developer can deliver their code to the Beta branch
    • Beta workflow
      • This is to deliver your code to the QA team – it will run every time a change is delivered to the Beta branch
      • Change the branch start condition from Main to Beta
      • Add an archive action and select TestFlight (Internal Testing Only)
      • Add a post-action to upload to App Store Connect for the internal testers
      • Also add a test action to ensure you don’t deliver a broken build to the QA team.
      • You can add a custom icon via custom scripts so testers know it is a beta build – run it as a “pre-build” script
    • Release workflow
      • A developer merges the Beta branch into the Main branch and adds a “release candidate” tag
      • Steps are basically the same as the Beta workflow – you can duplicate the workflow and then update the start condition. In this case use “Tag changes” – and add the value “tags beginning with release/”
      • Change the Archive action to TestFlight and App Store
      • In the post-action, change the TestFlight internal group and add the “Executive Group”; also add a second TestFlight post-action for the external team
      • You can now add a Notify post-action to send a notification to Slack

Large team

  • This is a very large team, similar to the medium-sized team. There is an app that has been around since the beginning of the App Store. There are tons of tests and many TestFlight testers, both internal and external; they use Slack, and they already have CI/CD via an in-house system developed by one team member, with limited knowledge sharing.
  • They also use a project management (PM) tool with various dashboards that allow others to view status
  • Migration to Xcode Cloud should be done in chunks, to allow a successful migration over time.
  • First, create a releasable workflow – start here, leveraging the flow from the medium-sized team above. Once it works you can integrate it into your process
  • Next, create a workflow for testing, then migrate your tests. Tests may fail more often than before, so you can set Xcode Cloud tests to “Not required to pass” – you can see the results, but failures won’t stop the code from being delivered. This lets you fix your tests off the critical path. You can create multiple test plans, migrating reliable plans over time and making them required.
  • Finally, create integrations with existing processes – look at integration with the existing PM system and reporting. Some of this will be similar to earlier steps; you can create powerful scripts and use the Xcode Cloud APIs and/or webhooks. There is more about this in Apple’s documentation.