Keep up with the keyboard

The keyboard has changed a bit over the last few years. It has new languages, it floats, and it handles multiple screens.

Out of process keyboard

  • This is the new architecture – it allows for improved privacy and security
  • The keyboard is now a process outside of your app; it communicates asynchronously, with the system and your app requesting updates from each other. Ultimately you get a text insertion – this may introduce some slight timing issues
  • Frees memory from your app
  • Provides flexibility for future changes 

Design for the keyboard

  • Note that in the old model, you moved your view up to account for the keyboard. In the new model, you need to adjust your app to its intersection with the keyboard overlay (as in Stage Manager). You may have multiple scenes that have to adjust.
  • If you use the mini-toolbar, behavior in Stage Manager is different than outside of Stage Manager.
  • The keyboard layout guide was introduced in iOS 15 for UIKit; it has been adopted by iOS apps and is the recommended way to handle the keyboard.
    • view.keyboardLayoutGuide.topAnchor.constraint(equalTo: textView.bottomAnchor).isActive = true
    • This has been updated to allow for more customization in iOS 17 – see the sketch after this list
  • SwiftUI automatically handles the common cases for you by adjusting the safe area.
  • Notifications
    • In the past you had to listen for will show, did show, etc. and then process these yourself. But with the introduction of Stage Manager those patterns didn’t work
    • It certainly seems that many of these changes are why Stage Manager only got as far as it did last year.
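
Something like this in UIKit (a sketch rather than the session's exact code – textView is a placeholder for your own input view):

```swift
override func viewDidLoad() {
    super.viewDidLoad()

    // Pin the input view just above the keyboard (iOS 15+).
    view.keyboardLayoutGuide.topAnchor
        .constraint(equalTo: textView.bottomAnchor).isActive = true

    // iOS 17 customization: opt the guide out of tracking the bottom safe
    // area, so your own backdrop can extend to the screen edge when the
    // keyboard is dismissed.
    view.keyboardLayoutGuide.usesBottomSafeArea = false
}
```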

New text entry APIs

  • Inline predictions use on-device processing and context to provide suggestions. This is enabled by default for most text fields. This is powerful
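
In UIKit, opting a field in or out looks roughly like this (a sketch; inlinePredictionType is the new iOS 17 text input trait):

```swift
import UIKit

let textView = UITextView()
// Inline predictions are on by default for most fields; set this
// explicitly to opt a field in (.yes) or out (.no).
textView.inlinePredictionType = .yes
```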

Embed the Photo Picker in your App

With the new photo picker, you don’t need to request any permissions to use it. It takes only a few lines of code.

Embedded Picker

  • The access model runs in a separate process outside of your app. Only what is selected is passed back to your app.
    • You can use the new .photosPickerDisabledCapabilities modifier to turn off certain features of the picker
    • You can disable various accessories too, with the .photosPickerAccessoryVisibility modifier
    • You can even change the size of the picker
    • You can use the new .photosPickerStyle(.inline) to make it more naturally a part of your app
    • PhotosPicker(selectionBehavior: .continuous) allows your app to respond to each selection of an image – see the sketch after this list
  • There’s a new privacy badge on the picker
  • A detailed discussion of which options can be disabled is in the session, including search, selection actions, and more
  • The picker styles include presentation, inline, and compact (a single row, scrolling horizontally)
  • This API is available for iOS, iPadOS, macOS, along with SwiftUI, AppKit and UIKit – it was not listed for xrOS or VisionOS
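
Putting those modifiers together, an embedded picker looks roughly like this (a sketch; the capability and accessory values shown are just examples):

```swift
import PhotosUI
import SwiftUI

struct EmbeddedPicker: View {
    @State private var selection: [PhotosPickerItem] = []

    var body: some View {
        PhotosPicker(selection: $selection,
                     selectionBehavior: .continuous,
                     matching: .images) {
            Text("Select photos")
        }
        .photosPickerStyle(.inline)                           // embed in your UI
        .photosPickerDisabledCapabilities(.selectionActions)  // turn features off
        .photosPickerAccessoryVisibility(.hidden, edges: .bottom)
        .frame(height: 300)                                   // you control the size
    }
}
```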

Options Menu

  • This is a new menu that gives users control of what is shared with your app. They can choose to remove metadata and location, for example.

HDR and Cinematic

  • The system may automatically transcode assets to formats like JPEG, but if you want to include HDR data
    • you need to set the .current encoding policy
    • and use .image or .movie for the content type
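
In the SwiftUI picker this maps to the preferredItemEncoding parameter (a sketch; $selection as in the earlier example):

```swift
// Ask for the current (original) encoding so HDR data is not transcoded away.
PhotosPicker(selection: $selection,
             matching: .images,
             preferredItemEncoding: .current) {
    Text("Select photos")
}
```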

Elevate your windowed app for Spatial Computing

Spatial computing means your apps fit into your surroundings.

While SwiftUI is the focus, UIKit can take advantage of much of this content.

SwiftUI in the Share Space

  • Most of the system applications were written in SwiftUI – so you can see that they are similar to iPad apps while taking full advantage of the environment
  • This session updates the Backyard Birds sample app, so until the visionOS extensions are available you won’t be able to code along. Watch this session to become familiar for when they are available.
  • You need to add a new run destination for the “new platform”; if you use native rather than Designed for iPad, it will change to the new glass background. You don’t have to deal with light and dark

Polish your app

  • When updating your custom views, note that physical resolution may make some assets blurry; try updating your content assets with vector assets. If you use bitmaps you may see blurring
    • Change the “Scales” parameter in the inspector of the asset and choose Single Scale for vectors; also select Preserve Vector Data.
  • Change solid color backgrounds, as they will not change contrast with glass. Add vibrancy to provide additional clarity on glass. It’s there by default with standard controls – so things like .foregroundStyle(.tertiary), etc. You can remove color scheme checks too.
  • Interactive targets should be reviewed to see how they work on the new platform.  If you have standard controls you should be ok, but if you’ve customized them, you will need to review your code.
    • There are hover effects to show focus. If you have custom controls you should add your own hover effects – by default you can just use .hoverEffect() – and you can add a contentShape modifier to clean up the view (see the sketch after this list).
    • By changing a view to a button, you get more appropriate changes.
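
A sketch of both approaches (CustomBadge and openProfile are placeholders):

```swift
// Custom view: add a hover effect, with a content shape to clean up
// the highlight region.
CustomBadge()
    .contentShape(.hoverEffect, RoundedRectangle(cornerRadius: 12))
    .hoverEffect()

// Better: make it a real Button and get the platform's hover and
// pressed states for free.
Button(action: openProfile) {
    CustomBadge()
}
```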

Brand new concepts

  • Top-level layout changes – you should use a TabView instead of a sidebar; this gives you the side buttons that expand with titles when the user focuses on them.
    • This also gives more room for the content; the control is called an ornament
  • The bottom toolbar is also an ornament, created with the toolbar modifier and the .bottomOrnament placement option.
    • You can build custom ornaments 
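
Roughly like this (a sketch – BirdList and addBird are placeholders):

```swift
TabView {
    BirdList()
        .tabItem { Label("Birds", systemImage: "bird") }
        .toolbar {
            // Places the button in an ornament below the window.
            ToolbarItem(placement: .bottomOrnament) {
                Button("Add", systemImage: "plus", action: addBird)
            }
        }
}
```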

Dive Deeper into SwiftData

  • You’ll want to download the sample app from here. Make sure to change the group and application identifier
  • You get undo and autosave when you switch applications.  Works with your basic classes and structs 
  • It uses a new @Model macro (see the sketch after this list)
  • Where possible SwiftData will infer structure from your code, but you can be very explicit – check out Model your Schema with SwiftData for more information.
  • The schema is applied to the ModelContainer class. And instances of the classes are mapped to a ModelContext in your code.
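
The macro itself is tiny (a minimal sketch; Trip is a stand-in for your own type):

```swift
import SwiftData

@Model
final class Trip {
    var name: String
    var startDate: Date

    init(name: String, startDate: Date) {
        self.name = name
        self.startDate = startDate
    }
}
```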

Configure Persistence

  • The model container is the bridge between the schema and where it is stored
  • You can instantiate one easily with try ModelContainer(for: SCHEMA.self); it will infer related types
  • The ModelConfiguration class describes the persistence of the schema
    • On disk or in memory
    • File location (or generate one for you)
    • Read Only mode
    • And you can tell it which CloudKit container to use
  • Note in the session’s example we define all the schemas we will use and where we want to store them, including our CloudKit container for each schema as appropriate (since we want to keep People separate from Trips data)
  • Finally we would create the container with `let container = try ModelContainer(for: fullSchema, configurations: trips, people)` – expanded in the sketch below
  • You can use the modelContainer modifier on a view or scene to describe which containers you will use in that view or scene
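
Expanded, that configuration looks something like this (a hedged sketch – the store names, URLs, and CloudKit identifiers are placeholders, and the labels may differ slightly from the session’s seed build):

```swift
let fullSchema = Schema([Trip.self, Person.self])

let trips = ModelConfiguration("trips",
                               schema: Schema([Trip.self]),
                               url: URL.documentsDirectory.appending(path: "trips.store"),
                               cloudKitDatabase: .private("com.example.trips"))

let people = ModelConfiguration("people",
                                schema: Schema([Person.self]),
                                url: URL.documentsDirectory.appending(path: "people.store"),
                                cloudKitDatabase: .private("com.example.people"))

// The container brings the full schema and both stores together.
let container = try ModelContainer(for: fullSchema, configurations: trips, people)
```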

Track and Persist changes

  • The model and the modelContext work together to track changes
  • The modelContainer modifier binds the modelContext into the @Environment
  • Changes are stored as snapshots in the modelContext until you call context.save() – this will persist changes to modelContainer
  • This modelContext works in coordination with the ModelContainer – which supports rollback and reset (for undo and autosave)
  • The modelContainer modifier has an isUndoEnabled: value. This means that system gestures like shake and three-finger swipe will automatically do undo for you.
  • Autosave will save during system events like moving to the foreground or background. You can disable it via isAutosaveEnabled. It is enabled by default in applications, but it is disabled for modelContexts created by hand (see the sketch below).
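
A sketch of the context in use (Trip is the placeholder model from above):

```swift
import SwiftUI
import SwiftData

struct AddTripButton: View {
    @Environment(\.modelContext) private var context

    var body: some View {
        Button("Add Trip") {
            context.insert(Trip(name: "Weekend", startDate: .now))
            try? context.save()   // optional – autosave also runs on system events
        }
    }
}

// Opting into the system undo gestures when binding the container:
// .modelContainer(for: Trip.self, isUndoEnabled: true)
```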

Modeling at scale

  • Background operations, sync and batch processing all work with model objects
  • Using the #Predicate macro you can simplify queries and subqueries using Swift.
  • You can add additional tuning parameters with the enumerate function on modelContext; it is implicitly efficient, using platform best practices like batching (which you can modify if you desire). It also implements mutation guards by default; you can override that with allowEscapingMutations (see the sketch below).
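
A sketch of the predicate and batched enumeration (context is a ModelContext as above; the batch size is illustrative):

```swift
let today = Date.now
let upcoming = FetchDescriptor<Trip>(
    predicate: #Predicate<Trip> { trip in trip.startDate > today })

try context.enumerate(upcoming, batchSize: 1000) { trip in
    // Read each trip; mutations are guarded unless you opt in
    // with allowEscapingMutations.
}
```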

Design considerations for vision and motion

This is a research-based session.

Visual Depth Cues

  • Make sure your content provides depth triggers so the brain can perceive the data correctly
  • The visual system interprets what is perceived; making things agree around the line of sight is important for visual comfort. Depth cues help the brain handle this – if they are not right you can cause eye fatigue or double vision for your users.
  • You can use size, gentle motion, color, blur, light, shadow, backgrounds, and occlusion 
  • Conflicting cues can also cause issues for your user. Repeating patterns can cause eyes to mis-focus across multiple items, causing issues.

Content Parameters

  • Place reading content farther than arm’s length, and let users adjust for comfort.
  • Content for direct interaction or quick glances can be placed closer.
  • Use blur and transparency to help users focus their eyes.
  • Use high contrast for reading, and keep it centered to keep the user from having to move head back and forth.
  • Use slow transitions between dark and light scenes

Eye effort

  • Minimize the eye effort demanded from your users: it is most comfortable to look down, or left and right, so use those directions for content to reduce eye strain.
  • Upward and diagonal movement is the most effort.
  • Long-term content should be in the center, and slightly down, of the field of view.
  • Allow for natural break points in your experience to allow for eye rest.

Motion of virtual objects

  • The inner ear is used, along with your eyes, to perceive motion. If these two disagree you can get dizzy or sick to your stomach.
  • If things move toward the user, you should make your objects semi-transparent to reduce discomfort

Head-locked content

  • When possible use this, since the content will not come at the user. You can use a lazy follow, which moves content into position slowly over time, reducing issues.

Motion in windows

  • Pay attention to the motion of content within a window; make sure the horizon stays aligned with the real horizon. The focus of expansion should be slow and predictable (and within the field of view); reduce pure rotational changes. You can use a quick fade during a large shift instead. Smaller objects and plain textures are better.

Oscillating motion

  • Sustained oscillation should be avoided. Think of Count Floyd’s 3D House of Beef. If you have to do oscillation, remember to make the content semi-transparent

Customize on-device speech recognition 

iOS 10 introduced speech recognition

Speech is designed to convert audio, via an acoustic model, to a phonetic representation, which is then transcribed to a written representation. Sometimes there are multiple matches, so we must do more than just that. Looking at context, we can disambiguate values with a language model. This was how it was modeled in iOS 10.

In iOS 17 you can customize the language model for your app to make recognition more appropriate for it. You boost your model with phrases that your app needs, and you can tune it to weight certain phrases more heavily. You can also use templates to load a lot of patterns, as in chess.

You can also define spellings and pronunciations for domains like medical, etc. Again, a chess example:
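
It looks something like this (a sketch based on the session; the phrases, pronunciation, and identifiers are illustrative):

```swift
import Speech

let data = SFCustomLanguageModelData(locale: Locale(identifier: "en_US"),
                                     identifier: "com.example.chess",
                                     version: "1.0") {
    // Boost phrases your app expects to hear.
    SFCustomLanguageModelData.PhraseCount(phrase: "Play the Albin counter gambit",
                                          count: 10)

    // Teach spelling and pronunciation (phonemes given in X-SAMPA).
    SFCustomLanguageModelData.CustomPronunciation(grapheme: "Winawer",
                                                  phonemes: ["w I n aU @r"])
}
try await data.export(to: URL(filePath: "/var/tmp/CustomLMData.bin"))
```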

Training data is bound to a single locale – so you will need to use standard localization methods.

Loading a language model will have latency so run on a background thread and hide behind some UI, like a loading screen or other method.

Customization data is never sent over the network; you must force recognition to run on-device, otherwise the language model will not be loaded.

Animate with Springs

All about engaging your user with better animations

Why Springs

  • Animations provide a level of continuity: watching something move is more natural than just seeing it appear in a new place. Velocity is important to make it look natural
  • Ease in and out is a bezier animation defined by a curve and duration; to me this adds gravity to the animation
  • A spring is like an ease-in-and-out animation; however, if you use a gesture with an animation, a spring looks more natural when you flick or throw the object
  • The motion of a spring is not only a bouncing animation; it is about how the animation ends: a slow and natural stop, not like hitting a wall.

How Springs Work

  • You are modeling a motion of an object attached to a spring. This is impacted by the mass of the object, the stiffness of the spring, and the damping of the system (aka friction).
    • Initial position of the animation, and the target is the resting position of the spring.
  • While changing those properties makes sense for a physical system, in software we use duration and bounce to describe the spring. Adding duration makes it take longer, and bounce changes the curve (greater than 0 bounces, 0 is a smooth curve, and less than 0 takes a flatter, longer path to stop).
  • All of the math is implemented for you. A bouncy spring is like a cosine wave; at 100% bounce it oscillates back and forth: A * cos(2π * t / duration). You can see all three curves in the session’s chart, where the blue and green curves are defined at the bottom and the dark curve is defined by the previous formula.
  • To preserve velocity, our cosine curve will start off with a downward slope.
  • The velocity can come from the velocity of a gesture, or from the velocity of an interrupted animation. The session goes through the rest of the calculations for those who are interested.
  • You can use a completion handler that uses perceptual duration instead of the settling duration to process other activities 

How to use Springs

  • Springs are the default for SwiftUI
  • You can explicitly use the presets .snappy, .bouncy, and .smooth, and tune them with a duration or extra bounce
  • You can customize the .spring completely 
  • There is also a Spring model type to programmatically convert parameters, or you can do the math yourself (see the sketch below)
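
A sketch of the options (isExpanded and offset are placeholder state; values are illustrative):

```swift
// Presets, tunable with duration or extra bounce.
withAnimation(.snappy) { isExpanded.toggle() }

// A fully custom spring.
withAnimation(.spring(duration: 0.6, bounce: 0.2)) { offset = .zero }

// The Spring model type converts parameter representations for you.
let spring = Spring(duration: 0.5, bounce: 0.3)
let stiffness = spring.stiffness   // derived physical parameter
let damping = spring.damping
```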

What’s new in Wallet and Apple Pay

Payments, Order tracking, and Identity

Payments:

  • Apple Pay Later
    • 2023 US introduction – 4 separate payments, trackable in Wallet. A new API supports Apple Pay Later in both apps and on the web
  • Learn More will provide details and explanation; the calculator view will show the 4-payment split, etc.
  • For the web you use an Apple JavaScript SDK
  • You must have an entitlement in your app or register your site to use this feature
  • Preauthorized payments
    • Enabled in iOS 16; deferred payments have been added, along with recurring and automatic reload payments.
    • They support both fixed or variable amounts… can be used for preordering an item or for booking a reservation 
    • This is tied to the user’s account, not their device
    • You will need an Apple Pay merchant token
    • Cancellation policies must contain date, time, and time zone
  • Transfer Funds with Apple Pay
    • This is a new feature in iOS 17 that allows users to transfer money from their Apple Pay account to a card in their wallet.
    • There is a new request type, that uses minimal amount of data.
    • You need to register as a merchant in the Apple Developer Portal
    • PKPaymentNetwork and PKMerchantCapability show the network and features supported.
    • PKDisbursementSummaryItem represents the final amount to be received on the card (net charges and fees)
    • A PKDisbursementRequest is the new transaction
    • This also supports Instant Funds Transfers from some institutions
    • This is only available for iOS and iPadOS – not the web or macOS

Order Tracking:

  • Introduced in iOS 16; in 16.4 you can share orders in Messages, and there is an order tracking widget, with support for maps.
  • System Integration
    • shippingType to support carriers
      • You can indicate if it is being shipped or delivered
      • You can associate orders with enterprise apps for improved tracking
  • Enhancements
    • Adding new ways to represent payment information, attach receipts, and describe whether a transaction is a purchase or refund.
  • New ways to add Orders
    • New API to check for an existing order, add or update an order, and respond as appropriate
    • You use the FinanceKit API for these features
    • This is also enabled for the web via the JavaScript SDK

Identity:

  • IDs were enabled in iOS 15.4 – Verify with Wallet was added last year in iOS 16
  • Adding Tap to Present ID on iPhone; this builds on top of Tap to Pay on iPhone
  • Your app can request verification of specific ID data, and the user is able to decide if they are willing to present the information.
  • The system is wireless and secure – at no point does the user have to hand over the device to send the information
  • The data is cryptographically signed by the appropriate government or enterprise system so it cannot be tampered with.
  • And the system is more private as you only share the required data needed for the verification.
  • The types of requests include –
    • Display request – for things like Age or Name
    • Data request – for a wider set of elements, which are returned to the requesting app and requires an additional entitlement

What’s new in Core Data

Composite attributes

  • New type of attribute that encapsulates complex and custom data types
    • You can nest them and now create them in the Core Data model editor
    • There is a demo of how to adopt these new Composite Attributes
    • Watching the demo I can only think, how will SwiftData handle this, and if it does already, how much easier will it be

Stage your migrations

  • It is always preferable to do lightweight migrations – they are built in and happen automatically. Check out last year’s Evolve your app’s Schema – I captured my thoughts on that session here in the section on Core Data.
  • When it is too complex, you can do a staged migration (this is new)
    • For migration of non-conforming lightweight changes
    • Simplify your app
    • Provides some opportunities for your app to gain execution control during the migration to perform specific tasks
      • You need to identify when changes don’t conform with lightweight migration
        • You can manually review the changes
        • You can try to open the store with the lightweight option using NSMigratePersistentStoresAutomaticallyOption and NSInferMappingModelAutomaticallyOption set to true.
        • You’ll receive a hash error if they are not compatible
        • Or you can use NSMappingModel.inferredMappingModel(forSourceModel:destinationModel:) which will either return the inferred model or nil if it cannot 
      • Decompose into a series of conforming migrations
      • Describe the sequence of ordering using the new APIs
      • Have Core Data execute an event loop to do the migration
    • The complex model change that is described may work for my migration, I will have to see if I can recreate this with multiple models
  • Adding Staged migration allows you to mix and match (in sequence) your model migration from both lightweight and custom migration stages.
    • You create a .willMigrateHandler to do custom work during migration; for each fetched entity you copy in the data and relate the objects (see the sketch below).
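
Put together, a staged migration looks roughly like this (a hedged sketch – the model references, checksums, container, and handler body are placeholders for your own versions):

```swift
let v1 = NSManagedObjectModelReference(model: modelV1, versionChecksum: checksumV1)
let v2 = NSManagedObjectModelReference(model: modelV2, versionChecksum: checksumV2)

let stage = NSCustomMigrationStage(migratingFrom: v1, to: v2)
stage.willMigrateHandler = { manager, stage in
    // Fetch entities, copy data into the new attributes, relate objects.
}

let migrationManager = NSStagedMigrationManager([stage])
container.persistentStoreDescriptions.first?
    .setOption(migrationManager,
               forKey: NSPersistentStoreStagedMigrationManagerOptionKey)
```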

Defer your migrations

  • If your lightweight migrations take too much time, they can frustrate your users.
    • You can now do deferred migrations, putting off some work to a later date – for example dropping indices or deleting a column – you can defer this cleanup by setting
    • NSPersistentStoreDeferredLightweightMigrationOptionKey to true – which is only available for SQLite, but is available back to iOS 14
    • You can check the store’s metadata to see if that value is set, and then process the deferred work by running NSPersistentStoreCoordinator.finishDeferredLightweightMigration() (see the sketch after this list)
    • Consider using Background Task API to schedule this work
  • You can combine deferred and staged migrations
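
A sketch of opting in and finishing the work later (container is a placeholder NSPersistentContainer):

```swift
// Opt in when loading the store (SQLite only, back to iOS 14).
container.persistentStoreDescriptions.first?
    .setOption(true as NSNumber,
               forKey: NSPersistentStoreDeferredLightweightMigrationOptionKey)

// Later – for example from a scheduled background task – finish the
// deferred work.
try container.persistentStoreCoordinator.finishDeferredLightweightMigration()
```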

What’s new in Background Assets

Check out Meet Background Assets from WWDC 2022

Feature Recap

  • Designed to prevent waiting when launching your app, use a framework and App Extension to download out-of-band content via a CDN or managed server
    • It will download content that can be placed in your app’s sandbox for use by the app
  • Supported on macOS, iOS, and iPadOS
  • Can run at install time before the user runs the app, periodically in the background, or when the app is not running
  • There are time limits on the run to optimize battery life
    • Consider memory mapping your data, as you are also limited in the amount of memory used by your app
    • The extension can be throttled if the app is not used very often
  • The Background Asset manager is prefixed with BA
  • If the user has turned off Background App Refresh or is in Low Power Mode, your extension will not run.
  • BADownloadManager is a singleton that can be used throughout your app. 
  • All downloaded assets are marked purgeable by default. If you increase their size after download, they will be marked non-purgeable

What’s New

  • Essential downloads are integrated into the system and stop the user from launching the app before they finish. They occur during app install. These downloads have priority over normal background downloads.
  • Non-essential downloads will be automatically launched after essential downloads are ready. Note the UI changes that show which parts of the install download are App, App Install, and Essential Assets.
  • If a user turns off In-App Content, then you will not get essential downloads automatically. You will need to code your application to handle downloading these assets differently.
  • You can convert an essential download to a non-essential download, and then re-enqueue this download
  • You also need to set up your Info.plist

Sample Implementation

  • First add your required Info.plist keys
  • Create your App Extension
  • Make sure your app extension and app are both using the same team identifier
  • Make sure to use .withExclusiveControl for your download session (see the sketch after this list)
  • Promoting from background to foreground does not restart the download, it just continues from where it was at a higher priority.
  • Always use move operations to move the finished download into the app package
  • So far none of my apps have a need for these features.. but I do have a few game ideas which may need this in the future.  So definitely keeping this in my back pocket 
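
The exclusive-control pattern looks roughly like this (a sketch):

```swift
import BackgroundAssets

BADownloadManager.shared.withExclusiveControl { lockAcquired, error in
    guard lockAcquired else { return }
    // Schedule, promote, or cancel downloads here while holding the lock.
}
```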

Debugging Guidance

  • Since the extension launches during install or periodic system events, you must use the tool “xcrun backgroundassets-debug” in a terminal to debug with a device paired with your Mac. The device must be in Developer Mode
    • You can now trigger events to see how the app or extension behaves.