Animate with Springs

All about engaging your user with better animations

Why Springs

  • Animations provide a level of continuity; watching something move is more natural than just seeing it in a new place. Preserving velocity is important to make the motion look natural.
  • Ease in and out is a Bézier animation defined by a curve and a duration; to me this adds gravity to the animation.
  • A spring is like an ease-in-and-out animation; however, if you pair a gesture with an animation, a spring looks more natural when you flick or throw the object.
  • The motion of a spring is not only a bouncing animation; it is about how the animation ends: a slow and natural stop, not like hitting a wall.

How Springs Work

  • You are modeling the motion of an object attached to a spring. This is affected by the mass of the object, the stiffness of the spring, and the damping of the system (aka friction).
    • The initial position is where the animation starts, and the target is the resting position of the spring.
  • While changing those properties makes sense for a physical system, in software we use duration and bounce to describe the spring. Increasing the duration makes the animation take longer, and bounce shapes the curve (greater than 0 adds a bounce, 0 gives a smooth curve, and less than 0 flattens the curve into a slower stop).
  • All of the math is implemented for you. A bouncy spring is like a cosine wave; at 100% bounce it oscillates back and forth as A * cos(2π * t / duration). You can see all three curves in the session’s chart, where the blue and green curves are defined at the bottom and the dark curve is defined by the previous formula.
  • To preserve velocity, our cosine curve will start off with a downward slope.
  • The velocity can come from the velocity of a gesture, or from the velocity of an interrupted animation. The session goes through the rest of the calculations for those who are interested.
  • You can use a completion handler that is based on the perceptual duration instead of the settling duration to kick off other activities

How to use Springs

  • Springs are the default for SwiftUI
  • You can explicitly use the presets .snappy, .bouncy, and .smooth, and tune them with a duration or extra bounce
  • You can customize the .spring completely 
  • There is also a Spring model type to programmatically convert parameters, or you can do the math yourself (see the sketch below)
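
Here’s a minimal sketch of what these options look like in SwiftUI; the SpringDemo view is hypothetical, while the presets, .spring(duration:bounce:), and the Spring model type are the APIs the session covers:

import SwiftUI

struct SpringDemo: View {
    @State private var isExpanded = false

    var body: some View {
        Circle()
            .frame(width: isExpanded ? 200 : 100)
            .onTapGesture {
                // A preset, tuned with a duration and extra bounce.
                withAnimation(.bouncy(duration: 0.6, extraBounce: 0.1)) {
                    isExpanded.toggle()
                }
            }
    }
}

// A fully custom spring, described by duration and bounce.
let custom = Animation.spring(duration: 0.5, bounce: 0.3)

// The Spring model type converts duration/bounce into physical parameters.
let spring = Spring(duration: 0.5, bounce: 0.3)
let stiffness = spring.stiffness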

What’s new in Wallet and Apple Pay

Payments, Order tracking, and Identity

Payments:

  • Apple Pay Later
    • Introduced in the US in 2023 – 4 separate payments – trackable in Wallet. A new API supports Apple Pay Later in both apps and on the web
  • A Learn More link will provide details and an explanation; the calculator view will show the split into 4 payments, etc.
  • For the web you use an Apple JavaScript SDK
  • You must have an entitlement in your app, or register your site, to use this feature
  • Preauthorized payments
    • Enabled in iOS 16; deferred payments have now been added, along with recurring and automatic reload payments.
    • They support both fixed and variable amounts… they can be used for preordering an item or for booking a reservation
    • This is tied to the user’s account, not their device
    • You will need an Apple Pay merchant token
    • Cancellation policies must contain the date, time, and time zone
  • Transfer Funds with Apple Pay
    • This is a new feature in iOS 17 that allows users to transfer money from their Apple Pay account to a card in their Wallet.
    • There is a new request type that uses a minimal amount of data.
    • You need to register as a merchant in the Apple Developer Portal
    • PKPaymentNetwork and PKMerchantCapability show the network and features supported.
    • PKDisbursementSummaryItem represents the final amount to be received on the card (net of charges and fees)
    • A PKDisbursementRequest represents the new transaction
    • This also supports Instant Funds Transfers from some institutions
    • This is only available on iOS and iPadOS – not the web or macOS

Order Tracking:

  • Introduced in iOS 16; in 16.4 you can share orders in Messages, and there is an order tracking widget, with support for maps.
  • System Integration
    • shippingType to support carriers
      • You can indicate if it is being shipped or delivered
    • You can associate orders with enterprise apps for improved tracking
  • Enhancements
    • Adding new ways to represent payment information, attach receipts, and describe whether it is a purchase or a refund.
  • New ways to add Orders
    • New API to check for an existing order, add or update an order, and respond as appropriate
    • You use FinanceKit API for these features
    • This is also enabled for the web via the JavaScript SDK

Identity:

  • IDs were enabled in iOS 15.4 – Verify with Wallet was added last year in iOS 16
  • Adding Tap to Present ID on iPhone; this builds on top of Tap to Pay on iPhone
  • Your app can request verification of specific ID data, and the user is able to decide if they are willing to present the information.
  • The system is wireless and secure – at no point does the user have to hand over the device to send the information
  • The data is cryptographically signed by the appropriate government or enterprise system so it cannot be tampered with.
  • And the system is more private, as you only share the data required for the verification.
  • The types of requests include:
    • Display request – for things like age or name
    • Data request – for a wider set of elements, which are returned to the requesting app; this requires an additional entitlement
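
For context, a rough sketch of what such a request looks like with the Verify with Wallet API in PassKit; the exact names and signatures here are from memory and should be treated as assumptions:

import PassKit

// Hedged sketch - treat the API details below as assumptions.
let descriptor = PKIdentityDriversLicenseDescriptor()
descriptor.addElements([.age(atLeast: 21)], intentToStore: .willNotStore)

let request = PKIdentityRequest()
request.descriptor = descriptor
request.merchantIdentifier = "merchant.com.example"   // hypothetical
request.nonce = Data()                                // use a server-provided nonce

let controller = PKIdentityAuthorizationController()
controller.requestDocument(request) { document, error in
    // document contains only the elements the user agreed to share
}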

What’s new in Core Data

Composite attributes

  • New type of attribute that encapsulates complex and custom data types
    • You can nest them and now create them in the Core Data model editor
    • There is a demo of how to adopt these new Composite Attributes
    • Watching the demo I can only think: how will SwiftData handle this, and if it does already, how much easier will it be?

Stage your migrations

  • It is always preferable to do lightweight migrations – they are built in and will happen automatically. Check out last year’s Evolve your app’s Schema session – I captured my thoughts on that session here in the section on Core Data.
  • When it is too complex, you can do a staged migration (this is new):
    • For migrating changes that don’t conform to lightweight migration
    • It can simplify your app
    • It provides opportunities for your app to gain execution control during the migration to perform specific tasks
      • You need to identify when changes don’t conform to lightweight migration
        • You can manually review the changes
        • You can try to open the store with the lightweight options NSMigratePersistentStoresAutomaticallyOption and NSInferMappingModelAutomaticallyOption set to true.
        • You’ll receive a hash error if they are not compatible
        • Or you can use NSMappingModel.inferredMappingModel(forSourceModel:destinationModel:), which will either return the inferred model or nil if it cannot (see the sketch at the end of this section)
      • Decompose the changes into a series of conforming migrations
      • Describe the sequence of ordering using the new APIs
      • Have Core Data execute an event loop to do the migration
    • The complex model change described in the session may work for my migration; I will have to see if I can recreate it with multiple models
  • Staged migration allows you to mix and match (in sequence) lightweight and custom migration stages in your model migration.
    • You create a willMigrateHandler to do custom work during migration; for each fetched entity you copy in the data and relate them.
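
Here’s a minimal sketch of the compatibility check described above; the model URLs are hypothetical:

import CoreData

// Check whether an inferred (lightweight) mapping exists between two model versions.
let sourceModel = NSManagedObjectModel(contentsOf: sourceModelURL)!           // hypothetical URL
let destinationModel = NSManagedObjectModel(contentsOf: destinationModelURL)! // hypothetical URL

if let mapping = try? NSMappingModel.inferredMappingModel(
    forSourceModel: sourceModel,
    destinationModel: destinationModel
) {
    // Lightweight migration can handle this step.
    _ = mapping
} else {
    // This step doesn't conform - it needs a custom migration stage.
}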

Defer your migrations

  • If your lightweight migrations take too much time, they can frustrate your users.
    • You can now defer some of the migration work to a later date, for example dropping indices or deleting a column – you defer this cleanup by setting
    • NSPersistentStoreDeferredLightweightMigrationOptionKey to true – which is only available for SQLite, but is available back to iOS 14
    • You can check the store’s metadata to see if that value is set, and you can then process the deferred work by running NSPersistentStoreCoordinator.finishDeferredLightweightMigration()
    • Consider using the Background Tasks API to schedule this work
  • You can combine deferred and staged migrations
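
Here’s a minimal sketch of that flow; the container name is hypothetical:

import CoreData

let container = NSPersistentContainer(name: "Model")
if let description = container.persistentStoreDescriptions.first {
    // Opt in to deferred lightweight migration (SQLite only, back to iOS 14).
    description.setOption(true as NSNumber,
                          forKey: NSPersistentStoreDeferredLightweightMigrationOptionKey)
}

container.loadPersistentStores { _, error in
    guard error == nil else { return }
    // Later, e.g. from a scheduled background task, finish the deferred work.
    try? container.persistentStoreCoordinator.finishDeferredLightweightMigration()
}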

What’s new in Background Assets

Check out Meet Background Assets from WWDC 2022

Feature Recap

  • Designed to prevent waiting when launching your app; you use a framework and an App Extension to download out-of-band content via a CDN or managed server
    • It will download content that can be placed in your app’s sandbox for use by the app
  • Supported on macOS, iOS, and iPadOS
  • Can run at install time before the user runs the app, periodically in the background, or when the app is not running
  • There are time limits on the run to optimize battery life
    • Consider memory mapping your data, as you are also limited in the amount of memory your extension can use
    • The extension can be throttled if the app is not used very often
  • The Background Assets framework types are prefixed with BA
  • If the user has turned off Background App Refresh or is in Low Power Mode, your extension will not run.
  • BADownloadManager is a singleton that can be used throughout your app. 
  • All downloaded assets are marked purgeable by default. If you increase their size after download, they will be marked non-purgeable

What’s New

  • Essential downloads – integrated into the system, they block the user from launching the app and occur during app install. These downloads have priority over normal background downloads.
  • Non-essential downloads will be automatically launched after essential downloads are ready. Note the UI change that shows which parts of the install are the app, the app install, and essential assets.
  • If a user turns off In-App Content then essential downloads will not download automatically. You will need to code your application to handle downloading these assets differently.
  • You can convert an essential download to a non-essential download, and then re-enqueue the download
  • You also need to set up your Info.plist

Sample Implementation

  • First add your required Info.plist keys
  • Create your App Extension
  • Make sure your app extension and app are both using the same team identifier
  • Make sure to use .withExclusiveControl for your download session 
  • Promoting a download from background to foreground does not restart it; it just continues from where it was at a higher priority (see the sketch below).
  • Always use a move operation to move the finished download into the app package
  • So far none of my apps have a need for these features… but I do have a few game ideas which may need this in the future, so I’m definitely keeping this in my back pocket
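
Here’s a hedged sketch of the promotion flow mentioned above; the exact state checks are assumptions:

import BackgroundAssets

let manager = BADownloadManager.shared
manager.withExclusiveControl { acquiredLock, error in
    guard acquiredLock else { return }
    do {
        for download in try manager.fetchCurrentDownloads() where download.state != .finished {
            // This does not restart the download; it continues at a higher priority.
            try manager.startForegroundDownload(download)
        }
    } catch {
        // handle the error
    }
}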

Debugging Guidance

  • Since the extension launches during install or periodic system events, you must use the tool “xcrun backgroundassets-debug” in a terminal to debug with a device paired with your Mac. The device must be in Developer Mode
    • You can now trigger events to see how the app or extension behaves.

Update Live Activities with Push Notifications

ActivityKit allows you to display live activities to show status and updates

Preparations

  • You should understand how push notifications work. Your app requests a push token from APNs and sends it to your server; the server then uses that token to send pushes, and APNs delivers the payload to the device.
  • There is a new APNs live activity push type – only available with token-based APNs connections.
  • Modify your app so it can handle push notifications – add the capability in Signing & Capabilities in Xcode
  • Requesting a push token is an asynchronous process, so you can’t request one immediately after requesting your activity with pushType: .token. The proper way to handle this is to set up a Task that awaits the token updates for the activity (see the sketch below).
  • In that code we convert the token to a hexadecimal string for debugging purposes
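
A minimal sketch of that flow, assuming a hypothetical AdventureAttributes type, an initialState value, and a sendPushTokenToServer helper:

import ActivityKit

let activity = try Activity<AdventureAttributes>.request(
    attributes: AdventureAttributes(hero: "Power Panda"),   // hypothetical
    content: .init(state: initialState, staleDate: nil),
    pushType: .token
)

Task {
    for await pushToken in activity.pushTokenUpdates {
        // Convert the token data to a hexadecimal string for debugging.
        let token = pushToken.map { String(format: "%02x", $0) }.joined()
        await sendPushTokenToServer(token)                  // hypothetical
    }
}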

First push update

  • This is an HTTP request to APNs – you must provide the following headers
    • apns-push-type: liveactivity
    • apns-topic: <BUNDLE_ID>.push-type.liveactivity
    • apns-priority: 5 (this is low priority; high priority is 10)
  • Your payload is a package with appropriate values. In the ranger app it looks like this:
{
	"aps": {
		"timestamp": 1685952000,
		"event": "update",
		"content-state": {
			"currentHealthLevel": 0.941,
			"eventDescription": "Power Panda found a sword!"
		}
	}
}
  • Note the timestamp is in seconds since 1970, event is the action for the live activity – it should be update or end – and content-state is the JSON package the live activity uses to decide which data to display.
  • Don’t set any custom encoding strategies for your JSON – or the system may not function correctly
  • You can use the terminal to send payloads to APNs without requiring changes to your server – this is covered in the developer documentation
  • If you use the terminal, grab the push token from the log in the step above and set an environment variable ACTIVITY_PUSH_TOKEN with the value
    • Now if you use the terminal to test your push notification it will have the correct token
    • Here’s a sample curl command (see the sketch at the end of this section)
  • Note you still need your authentication token for the API, based on your bearer token, and make sure you are using HTTP/2.
  • To debug, look at the device logs in the Console app for the following processes – liveactivitiesd, apsd, and chronod
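
My guess at what that command looks like, assuming the APNs sandbox endpoint; AUTH_TOKEN is a hypothetical variable holding your bearer token, and the topic uses a hypothetical bundle ID:

curl -v --http2 \
  --header "apns-topic: com.example.adventure.push-type.liveactivity" \
  --header "apns-push-type: liveactivity" \
  --header "apns-priority: 5" \
  --header "authorization: bearer $AUTH_TOKEN" \
  --data '{"aps":{"timestamp":1685952000,"event":"update","content-state":{"currentHealthLevel":0.941,"eventDescription":"Power Panda found a sword!"}}}' \
  "https://api.sandbox.push.apple.com/3/device/$ACTIVITY_PUSH_TOKEN"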

Priority and alerts

  • Make sure you use the correct priority; start by using low priority – it is opportunistic and saves the user’s battery
  • There is no limit to low priority updates
  • High priority updates are delivered immediately, and there is a budget based on the device condition.
  • If you have a frequent update requirement there is a “requires frequent high-priority updates” feature for apps, which gets a higher update budget but may still be throttled
  • Just add a new key to the Info.plist: NSSupportsLiveActivitiesFrequentUpdates = YES
  • Note users can disable this – so check the .frequentPushesEnabled property on ActivityAuthorizationInfo()
  • You can add an alert object to a push notification – with a title, body, and sound; note you should localize this message via a localized string object. This will generate an alert on the device. To add custom sounds you must include the sound as an app resource, and you can then set it in the payload (see the sketch below).
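
A sketch of what the alert portion of the payload might look like – the alert shape and the sound file name are my assumptions:

{
	"aps": {
		"timestamp": 1685952000,
		"event": "update",
		"content-state": {
			"currentHealthLevel": 0.0,
			"eventDescription": "Power Panda fainted!"
		},
		"alert": {
			"title": "Power Panda Fainted",
			"body": "Send a potion quickly!",
			"sound": "herodown.aiff"
		}
	}
}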

Enhancements

  • Send a push payload with “event”: “end” to end the live activity – you can then add a custom dismissal-date (again in seconds since 1970).
  • You can add a “stale-date” to allow a view to react to users missing an update – then update your UI to change state based on it.

Migrate to SwiftData

Generate model classes 

  • Use your managed object model to generate your SwiftData model: open your model file in Xcode and select Editor -> Create SwiftData Code
    • You can also create the model classes yourself.

Complete adoption

  • When fully migrating, you are replacing the Core Data stack with the SwiftData stack
  • Focus on understanding your Core Data model designs, and check whether they are supported in SwiftData
  • Highlights
    • Generate model classes, and delete the Core Data managed object model
    • Delete the persistence file that set up the stack
    • Set up the ModelContainer – via a modifier on the WindowGroup. This also sets up the context for the environment
    • You insert new objects into the modelContext – it has an implicit save to persist data; you can disable the implicit save
    • You can use @Query instead of FetchRequest (see the sketch below)
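
A minimal sketch of those highlights, assuming a hypothetical Trip model:

import SwiftUI
import SwiftData

@Model
final class Trip {
    var name: String
    init(name: String) { self.name = name }
}

@main
struct TripsApp: App {
    var body: some Scene {
        WindowGroup { TripList() }
            .modelContainer(for: Trip.self)   // also sets up the environment's context
    }
}

struct TripList: View {
    @Environment(\.modelContext) private var context
    @Query(sort: \Trip.name) private var trips: [Trip]

    var body: some View {
        List(trips) { trip in Text(trip.name) }
            .toolbar {
                Button("Add") { context.insert(Trip(name: "New Trip")) }   // implicit save
            }
    }
}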

Coexisting with Core Data

  • If you want coexistence you will end up with two different persistence stacks talking to the existing store
  • Before loading the persistent store you will need to make sure both stacks use the same URL and turn on .setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey) – if you don’t do this, you will be in read-only mode (see the sketch at the end of this section)
  • SwiftData is only available in iOS 17 and macOS Sonoma
  • Requirements
    • Use a namespace to ensure there is no collision – but entities will stay the same
    • Keep schemas in sync – adding properties in the same way
    • You must keep track of schema versioning – check out Model your Schema with SwiftData from WWDC23
    • If you are using UIKit or AppKit you still need Core Data
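
A minimal sketch of the coexistence setup, assuming a shared storeURL and the same hypothetical Trip model as above:

import CoreData
import SwiftData

let storeURL = URL.documentsDirectory.appending(path: "Trips.sqlite")

// Core Data side: same URL, with persistent history tracking turned on
// (without it you end up in read-only mode).
let container = NSPersistentContainer(name: "Trips")
if let description = container.persistentStoreDescriptions.first {
    description.url = storeURL
    description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
}
container.loadPersistentStores { _, error in /* handle error */ }

// SwiftData side: point a ModelContainer at the same file.
let configuration = ModelConfiguration(url: storeURL)
let modelContainer = try ModelContainer(for: Trip.self, configurations: configuration)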

Integrate with motorized iPhone stands using DockKit

Introduction to DockKit

  • Allows iPhone to act as the compute engine for motorized camera stands – controlling pitch and yaw
  • Stands have an LED indicator to let you know the stand is tracking, and simple buttons for power and disabling tracking
  • You pair a phone with the stand – all the control is in the iPhone at the system level, so any app that uses the camera APIs can use this feature
  • They demonstrated a prototype stand that allows the camera to track the speaker.

How it works

  • It works with the camera processing pipeline in iOS; it estimates trajectory and then adjusts to keep the user in frame.
  • The process runs as a daemon and controls the stand’s actuators – at 30 fps.
  • It uses bounding boxes and can track multiple users via face and body detection.
  • DockKit starts with a primary tracked person and will try to frame others, but will ensure that the primary stays in frame.

Custom Control

  • You can change framing, change what is being tracked, or directly control the motorized accessory 
  • You must register for accessory state changes in order to do this: DockAccessoryManager.shared.accessoryStateChanges – there are only two states you must handle, docked and undocked.
  • You can change cropping via a framing mode (left, center, or right), or you can set a specific region of interest.
  • You can also do custom motor control – just call setSystemTrackingEnabled(false) – then you can adjust x and y for rotation/pitch (x) and tilt/yaw (y)
  • You can also add your own inference using Vision, custom ML models, or other heuristics – use this to decide what you want to track. Just set a bounding box on the item to track the observation you’ve defined. You can use a type of .humanFace or .object
  • The current Vision framework can already detect a number of things that can be turned into trackable items (see the sketch below).
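
A rough sketch of the state-change and custom-control pieces, using only the names mentioned above; the shape of the state events and the method signatures are assumptions:

import DockKit

// Hedged sketch - treat the event/property details as assumptions.
func observeDockState() async throws {
    for await stateEvent in try DockAccessoryManager.shared.accessoryStateChanges {
        switch stateEvent.state {
        case .docked:
            // Take over from system tracking to drive the motors yourself.
            try await DockAccessoryManager.shared.setSystemTrackingEnabled(false)
        case .undocked:
            break   // stop any custom control
        default:
            break
        }
    }
}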

Device animations

  • There are four built-in animations – yes, no, wakeup, and kapow!
  • You can set up a motion detector to trigger the animations

Go beyond the window with SwiftUI

How to leverage visionOS using SwiftUI: many of the tools and frameworks have been extended over the last few years, making AR more accessible across devices and platforms.

This is all about the Immersive Space in visionOS. Both windows and volumes let you display content within their bounds. To go outside of the window you must use an immersive space.

Spaces are a container to present your UI

Get Started with Spaces

  • The session extends the “Hello World” app to explore the solar system. Begin by defining an ImmersiveSpace; while one or more spaces can be created within an app, you can only have one open at a time (see the sketch below).
  • Opening a space replaces the Shared Space with your Full Space, and implicitly defines its own coordinate system based on the user’s position
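
A minimal sketch of defining an immersive space alongside a window; the view names are hypothetical:

import SwiftUI

@main
struct WorldApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()   // hypothetical window content
        }

        // One or more spaces can be defined, but only one can be open at a time.
        ImmersiveSpace(id: "solar-system") {
            SolarSystemView()   // hypothetical view
        }
    }
}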

Display Content

  • It is a scene type, so you can place any SwiftUI view in it. Anything you place uses the same layout system you are used to, but since the origin is at the user’s feet, you will want to adjust placement by using RealityView. RealityView has built-in support for asynchronous loading, and you use ARKit anchors for placement.
  • Remember that RealityKit uses a different coordinate system than SwiftUI. Check out Build spatial experiences with RealityKit (WWDC23)
  • You should define an id or a value type, or both. For control you have the .openImmersiveSpace and .dismissImmersiveSpace actions from the @Environment; the system will automatically animate the transition. They are asynchronous, so you can act on them when they succeed (see the sketch below).
  • The order of scenes in your app will decide which one shows up first.
  • You should use Model3D for asynchronous loading of your assets
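
A minimal sketch of those open/dismiss actions; the id matches the space defined earlier:

import SwiftUI

struct SolarSystemToggle: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace
    @Environment(\.dismissImmersiveSpace) private var dismissImmersiveSpace
    @State private var isShowing = false

    var body: some View {
        Button(isShowing ? "Hide Solar System" : "Show Solar System") {
            Task {
                if isShowing {
                    await dismissImmersiveSpace()
                    isShowing = false
                } else {
                    // Asynchronous; act on the result when it arrives.
                    switch await openImmersiveSpace(id: "solar-system") {
                    case .opened: isShowing = true
                    default: isShowing = false
                    }
                }
            }
        }
    }
}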

Managing your Space

  • Scene management and coordinate conversion are key to understand.
  • Scene phases should be handled to let the user know what is going on, for when an alert pops up or other interactions occur.
    • You can do things like changing scale to let the user know
  • For repositioning items between SwiftUI and 3D content, take a look at the graphic in the session.
  • Note that the window and the 3D object use the Y axis differently. If you use transform(in: .immersiveSpace) you will get a conversion.
  • There may be conversions you need to handle between a private Immersive Space and a group Immersive Space
  • Immersion styles
    • These are different presentations for how your environment looks: mixed, progressive, or full
    • The default is mixed; you can use the scene modifier .immersionStyle to define the styles your scene supports (see the sketch below)
    • The progressive style allows you to see people around you and have some interaction; you can choose the level of progression, but if you go all the way it will take you to full immersion
      • By pressing the Digital Crown you go back to passthrough
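
A minimal sketch of declaring the supported styles; the space and view are the hypothetical ones from earlier:

import SwiftUI

@main
struct WorldApp: App {
    @State private var style: ImmersionStyle = .mixed

    var body: some Scene {
        ImmersiveSpace(id: "solar-system") {
            SolarSystemView()   // hypothetical view
        }
        // Declare which styles this space supports; the binding tracks the active one.
        .immersionStyle(selection: $style, in: .mixed, .progressive, .full)
    }
}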

Customization

  • A scene manifest is used to launch your app directly into full immersion – set UIApplicationPreferredDefaultSceneSessionRole to UISceneSessionRoleImmersiveSpaceApplication in the Info.plist
  • If you want to start back in a window instead, set UISceneInitialImmersionStyle to UIImmersionStyleMixed
  • Use .preferredSurroundingsEffect to set the effects when you change to full immersion
  • Hiding hands means you can show virtual hands instead – you use RealityView to do this. The hands are an entity; using ARKit and the hand tracking API, check for hand tracking anchors and updates and apply the transform to the anchor. Look at the session “Evolve your ARKit app for spatial experiences” from WWDC23.

Explore pie charts and interactivity in Swift Charts

Pie charts

  • Pie charts are new to Swift Charts
  • They don’t have axes and don’t really show precision – so they’re great for casual, intuitive analysis.
  • You can add a sector mark and customize the look – by increasing the inner radius you get a donut chart
  • The code for a pie chart is pretty easy –

Chart(data, id: \.name) { element in
    SectorMark(
        angle: .value("Sales", element.sales)
    )
    .foregroundStyle(by: .value("Name", element.name))
}

  • That’s it. By using SectorMark you end up with the pie chart; you can add other properties and modifiers via .innerRadius, .angularInset, and .cornerRadius

Selection

  • This is interactivity for your chart. Check out the heart rate chart by Apple.
  • Using the value-selection modifier chartXSelection(value: $selection) allows you to capture the selection information and then provide additional information like a popover or annotation; you can use DragGestures to do a range selection (see the sketch below)
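
A minimal sketch of value selection, reusing the hypothetical sales data from the pie chart example:

import SwiftUI
import Charts

struct SalesChart: View {
    let data: [(name: String, sales: Int)]   // hypothetical data shape
    @State private var selectedName: String?

    var body: some View {
        Chart(data, id: \.name) { element in
            BarMark(
                x: .value("Name", element.name),
                y: .value("Sales", element.sales)
            )
        }
        // Binds the x-value under the touch; nil when nothing is selected.
        .chartXSelection(value: $selectedName)
    }
}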

Scrolling

  • Navigating the data: just add .chartScrollableAxes, set a domain with .chartXVisibleDomain, and provide a .chartScrollPosition as a binding to say what part of the data to show in the scroll section. Adding .chartScrollTargetBehavior allows you to snap to boundaries when things scroll out of the view (see the sketch below).
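
A minimal sketch of a scrollable chart; the data shape and domain length are hypothetical:

import SwiftUI
import Charts

struct ScrollingSalesChart: View {
    let data: [(day: Date, sales: Int)]   // hypothetical data shape
    @State private var scrollPosition = Date.now

    var body: some View {
        Chart(data, id: \.day) { element in
            BarMark(
                x: .value("Day", element.day, unit: .day),
                y: .value("Sales", element.sales)
            )
        }
        .chartScrollableAxes(.horizontal)
        .chartXVisibleDomain(length: 3600 * 24 * 30)   // show 30 days at a time
        .chartScrollPosition(x: $scrollPosition)
        // Snap to day boundaries as content scrolls out of view.
        .chartScrollTargetBehavior(
            .valueAligned(matching: DateComponents(hour: 0))
        )
    }
}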

Discover Quick Look for spatial computing 

Quick Look is a system framework to preview and edit files. It is secure and private, protecting you from unsafe files. Just hit Space on macOS or long-press on iOS. On visionOS you pinch and drag the file outside of the application window. It supports pinch and drag for zooming on USDZ files. This is called Windowed Quick Look.

Windowed quick look

  • Allows for Quick Look outside of your app, so you can put a preview next to key content. You can also close your app and the preview will persist. Some files will provide SharePlay – this allows for group collaboration.
  • Apps can present this with an NSItemProvider, using a URL, to drag a preview outside of your window. This is a copy of the provided file, so edits won’t be sent back to your application’s file. (Add an .onDrag modifier to an item in SwiftUI – see the sketch below.)
  • On websites, you can achieve this with AR Content linking – and get the xrOS feature by default – check Advances in AR Quick Look from 2019. You can also get this by simply adding rel=“ar” to a link anchor in your HTML; Safari will then present your item in a Quick Look preview.
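
A minimal sketch of the drag-out flow in SwiftUI; the file URL is hypothetical:

import SwiftUI

struct DraggablePreviewItem: View {
    let fileURL: URL   // e.g. a USDZ file in your app's sandbox

    var body: some View {
        Label(fileURL.lastPathComponent, systemImage: "doc")
            .onDrag {
                // The system receives a copy of the file, so edits made in the
                // windowed preview won't flow back to your app's original file.
                NSItemProvider(contentsOf: fileURL) ?? NSItemProvider()
            }
    }
}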

In-app Quick look

  • All you need to do is pass a URL to the quickLookPreview modifier to get a full-screen preview sheet. Or if you use a QLPreviewController, you can provide customizations; existing code using these will just work on xrOS (see the sketch after the file types below).
  • Supported file types include:
    • Images
    • Videos
    • Audio Files
    • PDF documents
    • HTML documents
    • RTF (Rich text format) documents
    • Text files that conform to public.text
    • iWork and Office documents
    • Zip archives
    • USDZ
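
A minimal sketch of the SwiftUI path; the bundled file name is hypothetical:

import SwiftUI
import QuickLook

struct ManualButton: View {
    @State private var previewURL: URL?

    var body: some View {
        Button("View Manual") {
            // Setting the binding presents the full-screen preview sheet.
            previewURL = Bundle.main.url(forResource: "manual", withExtension: "pdf")
        }
        .quickLookPreview($previewURL)
    }
}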