Build programmatic UI with Xcode Previews

Get instant visual feedback as you build UI in code.

What are previews

  • A preview is simply a code snippet that makes and displays a view; the new #Preview macro makes this easy
  • Previews are designed for fast iteration – only a minimal amount of code is recompiled when you make changes, and the preview is redisplayed
  • They don’t emulate your code… they are code

Writing previews

  • Start with the macro #Preview  
  • Add one or more trailing closures of content
  • You can configure the preview – Name it, or pass values
  • You can preview Views and Widgets
  • For Views:
    • For SwiftUI, you can place the view in other views, like List{ YourView() }
    • You can pass a name and configuration traits
    • The canvas only appears when there is a Preview in the file.  It works in one of three modes, selected with the buttons at the bottom left of the canvas window: Live, Selection (or static mode), and Variants (which shows multiple versions by type).
    • Multi-cursor editing via Cmd-Option-E
    • There is a settings control to change type size, dark mode, etc.
  • For Widgets:
    • There are two kinds to preview: widgets driven by a timeline provider (or individual entries) and Live Activities (see below)
    • Timeline Provider
      • Preview the widget, its timeline provider, and the families to use for previewing (this is a variadic list)
      • You get animations by default if you step thru the timelines
      • You can craft specific entries to set the values you’d like to see
    • At this point in the video, I stopped and went over to my app Wasted Time and changed the widget previews to use this new configuration.  I know I have a bug in my widget I’ve not been able to resolve yet, and figured that by setting this up I would be able to debug better.  What I noticed is that I had to add .containerBackground((some color), for: .widget) to my widgets to get the previews to work.
      • Now back to the session
    • You can pin a preview so that it stays visible while you navigate to other files
    • Live Activities (just set the attributes and pass in a set of states to test)
    • Check out “Bring Widgets to Life” for more
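
Here is a minimal sketch of both preview styles; CardView, CardWidget, and CardEntry are hypothetical placeholders for your own view, widget, and timeline entry types:

```swift
import SwiftUI
import WidgetKit

// Preview a plain SwiftUI view, named and placed inside a List.
#Preview("Card in a list") {
    List {
        CardView()
    }
}

// Preview a widget in a family, stepping through hand-crafted timeline entries.
#Preview(as: .systemSmall) {
    CardWidget()
} timeline: {
    CardEntry(date: .now, count: 1)
    CardEntry(date: .now.addingTimeInterval(60), count: 2)
}
```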

Previews in your project

  • Previewing in Libraries
    • Library targets can be used in any project
    • Previews need an app to run in.  This could be your app, or another app derived from the active scheme, or Xcode can generate an app called XCPreviewAgent to load your preview automatically.
    • If you split your app into libraries you can create smaller schemes with more efficient build times.
    • You can also create simple preview-only apps to specify needed entitlements, etc. (like the photo library entitlement).  You will need to set a dependency and then embed this app in your library.
  • Providing sample assets
    • To add data to your previews, you can add assets to your Preview Content / Preview Assets catalog – by marking these as development assets you make sure the build doesn’t include them in your product.  To set this up, go to the Build Settings of your project, filter on “development assets”, then add the path or drag the content folder into the popover.
  • Leveraging Devices
    • You can leverage assets and data on an actual device for development.  To do this, use the preview device picker at the bottom of the canvas; under More you can see additional devices you have added.  If you plug your device into your Mac, you can pick it in the popup and the preview will run on that device.  You still have access to all the preview settings and configurations.

Build an app with SwiftData

Check out “Meet SwiftData” to cover the basics.  This is a code along session (https://developer.apple.com/wwdc23/10154) and you should download the code from https://developer.apple.com/documentation/SwiftUI/Building-a-document-based-app-using-SwiftData 

The code-along gets you to build a cross-platform flashcard app.

Meet the App

  • This is the flash card app located at the above links.

SwiftData models

  • If you are converting to SwiftData you change your class to @Model and remove ObservableObject and @Published from it.  Then, in the views where you update the object, change from @ObservedObject to @Bindable
  • For more on this, check out the “Discover Observation in SwiftUI” session
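
A minimal sketch of that conversion, using a hypothetical Card class from the flashcard app:

```swift
import SwiftData
import SwiftUI

// Before: class Card: ObservableObject { @Published var front: String ... }
// After: mark it @Model and drop ObservableObject/@Published entirely.
@Model
final class Card {
    var front: String
    var back: String

    init(front: String, back: String) {
        self.front = front
        self.back = back
    }
}

// In views that edit the object, switch @ObservedObject to @Bindable.
struct CardEditorView: View {
    @Bindable var card: Card

    var body: some View {
        TextField("Front", text: $card.front)
    }
}
```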

Querying models to display in UI

  • Changing your content view from @State to @Query and removing the sample data assignment lets you get your data directly from SwiftData
  • @Query is a property wrapper that fetches data from SwiftData and triggers view updates.  You can add simple parameters like (sort: \.valueToSortOn), (order:), and (filter:), and you can have multiple @Query properties in a view.
  • .modelContainer(for: NameOfClass.self) sets up a model container, which creates the whole storage stack.  Every view has a single model container.  If you don’t set one up, it won’t save or query.  Just set it up on the WindowGroup scene.
  • You should update your previews with sample data.  Creating an in-memory set of sample data achieves this.  Look at how this is done in the “PreviewSampleData.swift” file in the project (this failed for me and persisted on both the starter code-along project and the completed final project provided above; I am guessing this may be an Xcode bug).
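
A minimal sketch of the query and container setup, continuing the hypothetical Card model above:

```swift
import SwiftData
import SwiftUI

struct ContentView: View {
    // Replaces the old @State sample-data array; SwiftData supplies the cards
    // and triggers view updates when they change.
    @Query(sort: \Card.front) private var cards: [Card]

    var body: some View {
        List(cards) { card in
            Text(card.front)
        }
    }
}

@main
struct FlashCardApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        // Creates the whole storage stack for the Card model.
        .modelContainer(for: Card.self)
    }
}
```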

Creating and updating

  • You need to accept the modelContext of the view in order to do saves and updates.  Just grab it from the Environment
  • Save is automatic for you
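
A minimal sketch, again assuming the hypothetical Card model:

```swift
import SwiftData
import SwiftUI

struct AddCardView: View {
    // The view's model context comes from the environment set up by .modelContainer.
    @Environment(\.modelContext) private var modelContext

    var body: some View {
        Button("Add Card") {
            // Insert a new model object; saving happens automatically.
            modelContext.insert(Card(front: "New front", back: "New back"))
        }
    }
}
```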

[Bonus] Document-based apps

  • By changing to a document-based app, you can treat each flashcard deck as a separate document.  To do this, you will have to define the document’s package type, but doing so is almost trivial.

Build accessible apps with SwiftUI and UIKit

Accessibility is key for Apple.  I agree with this philosophy, having even written an article years ago for the magazine called “Enabled” on how OS/2 was made accessible.  I loved that the magazine was called “Enabled” instead of “Disabled” as it sent a message to the reader that they were not less than.

Accessibility Enhancements

  • Adding hints via .accessibilityAddTraits(.isToggle) lets the system know that this button has a toggle trait to provide proper hint and switch button information
    • Also available in UIKit
  • You can add new announcements to do things like letting the user know a screen is loading
    • Accessibility notifications can be created for “Announcement”, “LayoutChanged”, “ScreenChanged”, and “PageScrolled” messages to assistive technologies, etc.
    • The API is pretty simple – `AccessibilityNotification.Announcement("TEXT").post()`
    • You can set announcement priority so that a series of announcements going out in sequence don’t overlap and cut each other off
  • The .accessibilityZoomAction modifier allows for zooming via VoiceOver
  • The .allowsDirectInteraction trait lets you define a region of the screen for direct interaction with your app, bypassing VoiceOver – .silentOnTouch and .requiresActivation are available as options
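
A minimal sketch of the toggle trait and an announcement; FilterButton and the announcement text are hypothetical:

```swift
import SwiftUI

struct FilterButton: View {
    @State private var isFiltering = false

    var body: some View {
        Button("Filter") { isFiltering.toggle() }
            // Tells assistive technologies this button behaves like a toggle.
            .accessibilityAddTraits(.isToggle)
    }
}

// Posting an announcement, e.g. when a screen starts loading.
func announceLoading() {
    AccessibilityNotification.Announcement("Loading results").post()
}
```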

Improve accessibility visuals

  • .contentShape(.accessibility, shape) – controls the shape used for accessibility interactions on screen.  You can apply this to a view to create custom cursor visuals.

Keep state up-to-date

  • Block-based attribute setters are coming to UIKit
  • You can now keep underlying accessibility attributes up to date by providing a closure for an attribute; the closure is re-evaluated whenever the attribute is accessed by assistive technologies.
  • The closure must return the correct type; this makes it easier to keep elements up to date.
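
A minimal sketch of the UIKit block-based setters, assuming a hypothetical zoom view whose value changes over time:

```swift
import UIKit

final class ZoomViewController: UIViewController {
    private let zoomView = UIView()
    private var zoomLevel = 1

    override func viewDidLoad() {
        super.viewDidLoad()
        zoomView.isAccessibilityElement = true
        // The closure is re-evaluated each time assistive technology reads the value,
        // so the attribute stays up to date without manual bookkeeping.
        zoomView.accessibilityValueBlock = { [weak self] in
            guard let self else { return nil }
            return "\(self.zoomLevel)x zoom"
        }
    }
}
```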

Bring Widgets to Life

You can now create interactions with widgets.  You can have animation in widgets to show how content has changed.

Animations

  • By default the system animates content whenever things change
  • In regular views you use State, but widgets use timelines, not state
  • By default this will be a spring transition, but you can use any animation you wish from SwiftUI out of the box… you can learn more in Explore SwiftUI animation
  • The new preview API will let you work thru this by showing you the timeline.
    • You can use #Preview(as: WidgetFamily.systemSmall) { } timeline: { entries, … }
    • For text you can use .contentTransition for important numeric values
    • Use .id(value) so that it knows when to create a new view
  • Use containerBackground to show up on all platforms.
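
A minimal sketch of these modifiers on a widget view; CounterWidgetView and CountEntry are hypothetical:

```swift
import SwiftUI
import WidgetKit

// CountEntry is a hypothetical timeline entry with a numeric count and a label.
struct CounterWidgetView: View {
    let entry: CountEntry

    var body: some View {
        VStack {
            Text("\(entry.count)")
                // Animates the important numeric value when the entry changes.
                .contentTransition(.numericText())
            Text(entry.label)
                // A new id tells SwiftUI to treat this as a new view and transition it.
                .id(entry.label)
                .transition(.push(from: .bottom))
        }
        // Required so the widget renders correctly on all platforms.
        .containerBackground(.fill.tertiary, for: .widget)
    }
}
```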

Interactivity

  • You can now execute actions from a widget.  Widgets are stateless; they run in a widget extension discovered by the system, independently from your app.  The extension takes the entries from the timeline and uses them to create a series of views to show at the appropriate time.  Your view code only runs during archiving; you can force an update by calling .reloadTimelines whenever data changes in your app.  Updates to widgets are done as best effort, but reloads are guaranteed to run.
  • You can use Buttons and Toggles for interactivity, so you need to represent actions as App Intents – just like you did for Shortcuts or Siri.
  • Check out “Dive into App Intents” and “Explore enhancements to App Intents”
  • Importing both SwiftUI and AppIntents provides Button and Toggle support in widgets
  • Note that perform() in an AppIntent is an async function.  Make sure you persist all info needed to reload your widget.
  • Note the AppIntent will also become available in Siri and Shortcuts
  • Note you can build and run a widget target directly, and Xcode will install it on the Home Screen for you
  • Widget updates can show a small latency on macOS; you can add the .invalidatableContent() modifier to mark content that may be stale while an update is pending.  Don’t overuse this feature.
  • Toggles update their state optimistically by pre-rendering both states
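
A minimal sketch of a widget button backed by an App Intent; CompleteTaskIntent, TaskStore, and TaskEntry are hypothetical:

```swift
import AppIntents
import SwiftUI
import WidgetKit

// A hypothetical intent that marks a task done; it runs in the widget extension.
struct CompleteTaskIntent: AppIntent {
    static var title: LocalizedStringResource = "Complete Task"

    @Parameter(title: "Task ID")
    var taskID: String

    init() {}

    init(taskID: String) {
        self.taskID = taskID
    }

    func perform() async throws -> some IntentResult {
        // Persist everything the widget needs before returning;
        // the system reloads the timeline afterwards.
        TaskStore.shared.markDone(taskID)   // hypothetical store
        return .result()
    }
}

struct TaskWidgetView: View {
    let entry: TaskEntry   // hypothetical timeline entry

    var body: some View {
        // Button(intent:) becomes available once SwiftUI and AppIntents are both imported.
        Button(intent: CompleteTaskIntent(taskID: entry.taskID)) {
            Label("Done", systemImage: "checkmark")
        }
        .containerBackground(.fill.tertiary, for: .widget)
    }
}
```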

Beyond scroll views

Improvements to scroll views in SwiftUI – these allow content to expand beyond the viewport.

You can define which axis it will scroll – horizontal or vertical.  The scroll view resolves the safe area into margins it applies to its content.

Margins and safe area

  • You can add a horizontal margin with .safeAreaPadding(.horizontal, hMargin) – note the margin value does not use a dot prefix.
  • ScrollView resolves safe area into the margins to show its content, for things like scroll indicators, etc.
  • .contentMargins API allows you to control the content separately from the safe area.
  • By default you get a standard deceleration rate for the target content offset when lifting your finger.  Now you can add the .scrollTargetBehavior() modifier to do things like paging scrolls, or use .viewAligned to align to views marked with .scrollTargetLayout()
  • Using lazy stacks requires the scrollTargetLayout() modifier to ensure that scrolling lands in the right place
  • You can also conform to ScrollTargetBehavior to define your own custom behavior with the updateTarget method
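
A minimal sketch combining these modifiers; Card and CardView are hypothetical:

```swift
import SwiftUI

struct CardCarousel: View {
    let cards: [Card]   // Card is a hypothetical Identifiable model

    var body: some View {
        ScrollView(.horizontal) {
            LazyHStack(spacing: 12) {
                ForEach(cards) { card in
                    CardView(card: card)   // hypothetical view
                }
            }
            // Required with lazy stacks so view-aligned scrolling targets the right views.
            .scrollTargetLayout()
        }
        // Insets the content independently of the safe area (indicators keep the default).
        .contentMargins(.horizontal, 16, for: .scrollContent)
        // Deceleration settles aligned to a view edge.
        .scrollTargetBehavior(.viewAligned)
    }
}
```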

Target and Positions

  • New API containerRelativeFrame modifier allows you to take on the size of the container, without having to use a GeometryReader.  This would be good for my card app.
  • The .scrollIndicators(.hidden) modifier has been around.  Default behavior is to hide the indicators but allow them to show when a mouse is connected; you can pass .never to force the scroll indicators off entirely.  You can then add views with additional “paddle” buttons to provide scrolling controls on those platforms.  I really don’t like that you can force scroll indicators off.  I think this keeps hiding functionality that a user needs to have.  I can see this being used by bad actors.
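
A minimal sketch of container-relative sizing and the indicator override; PhotoStrip is a hypothetical view:

```swift
import SwiftUI

struct PhotoStrip: View {
    var body: some View {
        ScrollView(.horizontal) {
            LazyHStack {
                ForEach(0..<10) { _ in
                    Rectangle()
                        .fill(.blue.gradient)
                        // Each item sizes itself relative to the scroll view:
                        // two columns visible at a time, no GeometryReader needed.
                        .containerRelativeFrame(.horizontal, count: 2, spacing: 8)
                }
            }
        }
        // Forces the scroll indicators off even when a mouse or trackpad is attached.
        .scrollIndicators(.never)
    }
}
```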

Scroll Transitions

  • To visually alter a view based on where it is within the visible region of the scroll view – this is achieved with scroll transitions
  • You can define effects based on the transition phase
  • This uses a new protocol called VisualEffect:
    • Scale, rotation, and offset are all visual effects
    • You cannot use this for things that change the overall size of the content
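
A minimal sketch of a scroll transition that fades and scales rows near the edges; FadingRow is a hypothetical view:

```swift
import SwiftUI

struct FadingRow: View {
    var body: some View {
        ScrollView {
            ForEach(0..<20) { index in
                Text("Row \(index)")
                    .frame(maxWidth: .infinity)
                    .padding()
                    .background(.thinMaterial, in: RoundedRectangle(cornerRadius: 12))
                    // phase describes where the view is relative to the visible region;
                    // opacity and scale are visual effects, so they cannot change layout size.
                    .scrollTransition { content, phase in
                        content
                            .opacity(phase.isIdentity ? 1 : 0.4)
                            .scaleEffect(phase.isIdentity ? 1 : 0.9)
                    }
            }
        }
    }
}
```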

Meet SwiftUI for Spatial Computing 

This is a high-level review of SwiftUI’s extensions for xrOS 

All of xrOS was built in SwiftUI

Buttons can provide audio feedback when pressed.  With Spatial computing we have to use Scenes, see the three types defined in other sessions – Window, Volume, and Full Space. 

Breaking down the structure of a window

  • Start with glass background
  • Within that you have the same navigation containers – like TabView, and NavigationSplitView with navigation and lists
  • Tabs use labels; their titles are exposed when hovered
  • Ornaments are new – they provide an overlay on the bottom edge of a window (with a slight overlap)
  • You should apply various Material treatments to the backgrounds within windows – these then dynamically adjust based on what is behind them

Interactions

  • Eyes, by looking
  • Hands, reach out to touch
  • Pointer, connected track pad keyboard, etc.
  • Accessibility, supports all the controls from other platforms

Hover effects:

  • Critical to make your app responsive and assist with targeting
  • Added automatically to most controls.  If you create your own custom control style, you will want to add this yourself
    • I will have to correct my custom button style in Wasted Time
    • Just add .hoverEffect() to your button style
  • Elevate your windowed app for Spatial computing (session to watch)
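
A minimal sketch of a custom button style that opts into the hover effect; CardButtonStyle is a hypothetical name:

```swift
import SwiftUI

// Custom controls don't get the hover highlight automatically, so add it in the style.
struct CardButtonStyle: ButtonStyle {
    func makeBody(configuration: Configuration) -> some View {
        configuration.label
            .padding()
            .background(.regularMaterial, in: Capsule())
            .hoverEffect()   // lets the system highlight the control when targeted
            .opacity(configuration.isPressed ? 0.7 : 1)
    }
}
```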

Volumes:

  • Just change to .windowStyle(.volumetric) and you can set a .defaultSize()
  • Model3D is similar to image but always loads async – and it is also just another SwiftUI View
  • If you add a ZStack around a Model3D – it will be aware of the content space and depth.
  • You can add .padding3D(.front, 200)
  • You can also use .glassBackgroundEffect(in:)
  • You can add .rotation3DEffect(rotation, axis: .y)
  • You can use RealityView to provide full access to RealityKit content 
  • Check out “Build Spatial experience with RealityKit”
  • Gestures and attachments are enabled with RealityView
  • “Take SwiftUI to the next Dimension”
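
A minimal sketch of a volumetric window showing a 3D asset; GlobeApp and the “Globe” asset name are hypothetical:

```swift
import SwiftUI
import RealityKit

@main
struct GlobeApp: App {
    var body: some Scene {
        WindowGroup {
            // Model3D always loads asynchronously, similar to AsyncImage.
            Model3D(named: "Globe") { model in
                model
                    .resizable()
                    .scaledToFit()
            } placeholder: {
                ProgressView()
            }
            .rotation3DEffect(.degrees(30), axis: .y)
        }
        // A volume: a window style designed for bounded 3D content in the Shared Space.
        .windowStyle(.volumetric)
        .defaultSize(width: 0.6, height: 0.6, depth: 0.6, in: .meters)
    }
}
```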

Full Spaces:

  • Gives you full control of the environment.
  • Just add ImmersiveSpace(id: ) { View() } 
  • You can adjust immersion styles on the scene and transition between them: Mixed, Full, and Progressive (a middle ground)
  • You can use ARKit to make your space even more interactive and use physics with the real world.
  • There’s a deep dive in Go Beyond the Window with SwiftUI
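
A minimal sketch of declaring an immersive space alongside a window; SolarSystemView and the “solar” id are hypothetical:

```swift
import SwiftUI

@main
struct SolarApp: App {
    @State private var immersionStyle: ImmersionStyle = .mixed

    var body: some Scene {
        WindowGroup {
            ContentView()   // hypothetical launcher window
        }

        // A Full Space scene; open it with the openImmersiveSpace action using id "solar".
        ImmersiveSpace(id: "solar") {
            SolarSystemView()   // hypothetical content view
        }
        // .mixed layers content over passthrough; .full and .progressive are the other styles.
        .immersionStyle(selection: $immersionStyle, in: .mixed, .full, .progressive)
    }
}
```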

Meet MapKit for SwiftUI

Expanded SwiftUI API

Use MapKit to add maps to your app.

This is a code-along session.

Adding a marker to a Map is like adding a view to a List.

Markers are used to display content on a map.

Annotation allows you to provide a view at a specific location.

Using the content builder closure allows you to put things on the map.

You can use .mapStyle to change the display of the map; .standard is like a paper map.  Adding elevation: .realistic lets the map render terrain information.
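
A minimal sketch of the map content builder and style; BeachMap and the coordinate are hypothetical:

```swift
import MapKit
import SwiftUI

struct BeachMap: View {
    // Hypothetical coordinate for illustration.
    let parking = CLLocationCoordinate2D(latitude: 42.35, longitude: -71.06)

    var body: some View {
        Map {
            // A system-styled marker, added like a row in a List.
            Marker("Parking", coordinate: parking)

            // An annotation shows an arbitrary SwiftUI view at a location.
            Annotation("Entrance", coordinate: parking) {
                Image(systemName: "figure.wave")
                    .padding(6)
                    .background(.thinMaterial, in: Circle())
            }
        }
        // A standard "paper" map that also renders realistic terrain elevation.
        .mapStyle(.standard(elevation: .realistic))
    }
}
```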

Search results can be shown on the map. 

MKLocalSearch will find places near the coordinates you provide.

Markers created from map items are automatically colored and styled; you can override this and provide your own image or even three letters of text.

What the map shows is controlled by a MapCamera (explicit) or a MapCameraPosition (dynamic).

If you provide a binding to the position state, MapKit will update based on the user’s position.

.onMapCameraChange informs us when there is a change to what is made visible.
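
A minimal sketch of a position binding, the camera-change callback, and map controls; SearchableMap is a hypothetical view:

```swift
import MapKit
import SwiftUI

struct SearchableMap: View {
    // .automatic lets the map frame its content; MapKit updates the binding
    // whenever something else (like user interaction) moves the camera.
    @State private var position: MapCameraPosition = .automatic
    @State private var visibleRegion: MKCoordinateRegion?

    var body: some View {
        Map(position: $position) {
            UserAnnotation()   // shows the user's location on the map
        }
        .onMapCameraChange { context in
            // Called when the visible region changes, e.g. to re-run a search here.
            visibleRegion = context.region
        }
        .mapControls {
            MapUserLocationButton()
            MapCompass()
            MapScaleView()
        }
    }
}
```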

LookAroundPreview pulls in a scene from an MKLookAroundSceneRequest, which returns a Look Around scene.

MapPolyline can be used to show a driving route.

MapPolygon and MapCircle can be used to mark areas on the map.

MapUserLocationButton() will show where the user is

MapCompass() will show the compass on the map

MapScaleView() shows the map scale on screen so you can judge the zoom level.

If you use the .mapControls modifier they will be placed in their default positions.  But since they are just views, you can also place them wherever you want in your app.

Get Started with building apps for spatial computing

Fundamentals

  • By default apps launch into shared space – kinda like multiple windows on a Mac. And people are connected to their environment with passthrough 
  • Volumes allow for bounded 3D content in a space
  • Dedicated full space – only your app and content appears in an immersive view
  • You can choose to render in a PassThrough or Fully Immersive space
  • Interactions are via eyes or hands by default.  For these interactions there are a bunch of gestures, covered in another blog post of mine.
  • You can use ARKit skeletal hand tracking to create app specific hand features
  • Also supports Wireless devices, game controllers, keyboards, etc.
  • SharePlay and Group Activities allow for shared space
  • Check out designing and building spatial experience 
  • Privacy is a core principle
    • The system curates data and interactions and delivers hover effects or touch events
    • The system asks the user when more information is needed
  • Everything starts with Xcode – supports everything you need including Project Management support and Platform SDK
  • You will want to look at RealityKit metrics in Instruments
  • Reality Composer Pro supports particle emitters, which were recently added to RealityKit
    • You don’t have to build an app to test your content on the headset if you use Reality Composer Pro
  • Unity is also supported for people who are familiar with it

Where to Start

  • New app – You can use the new app template for the platform, where you can choose Window or Volume and then add an entry point for an Immersive Space; if you choose Space, it will add a second scene called Space to show you how to integrate it.
  • There are code samples being published:
    • Destination Video shows you how to build a shared immersive experience
    • Happy Beam is a game example in an immersive space with custom hand gestures
    • Hello World shows how to transition between different environments and modes
  • To start migrating an app, the iPad variant is preferred over iPhone variant.
    • They will retain their light mode style, and rotation is supported – see “Run your iPad and iPhone apps in the Shared Space” to learn more.
    • If you add a new destination you can now target and recompile for xrOS – the app will take on the new characteristics, spacing, sizing, etc.

How to build

  • Start with the Hello World app to see how it is done.  Note that basic windows can have 3D content, some views end up in a volume, and another view provides a full immersion style.
    • This is a great app to see the spectrum of modalities
    • Windows serve as a starting point for your app.  They are written in SwiftUI and can include 2D and 3D content.
      • Just add a windowGroup to your scene
      • Model3D(named: "") is very much like Image(named: "")
      • Actions are familiar gesture recognizers and a few unique ones
  • You can add the gestures to Model3D to allow the user to manipulate a 3D object
  • Volume is a new style of window, designed for 3D content, really built for a shared space, so content must remain within the Volume
    • It is a .windowStyle(.volumetric) set with size dimensions.
    • Add a RealityView to a scene; this allows any number of entities to be managed by SwiftUI
    • You should learn more from various Spatial Computing with RealityKit sessions
  • A Full Space hides all other apps, leaving only your app visible.  Your objects can interact with the surroundings.  Check out “Meet ARKit for spatial computing”.  You use the Digital Crown to dial in the level of immersion.
    • There are two styles of immersion – .mixed (layers your content on top of passthrough) and .full (only shows your content).  There is also .progressive, which allows the user to change the level of immersion.
    • Good idea to add a button to allow a user to switch modes

Expand on Swift Macros

Why Macros

  • Lots of built-in code expansion, like Codable and property wrappers
  • Macros let you build your own language features, to eliminate tedium and boilerplate

Design Philosophy

  • Different from C Macros
  • Distinctive use sites
  • All start with either # or @ depending on type of macros
  • Complete, type-checked, validated – macros must be all of this
  • Inserted in predictable, additive ways.  A macro cannot delete anything.
  • Should not be magic – you can expand the macro inline or step into it in the debugger, even if it comes from a closed-source library.

Translation Model

  • Basic concept: the compiler extracts the macro usage from your code and sends it to a compiler plugin, which runs in a separate, sandboxed process and returns an expansion of code – the expansion is added to your program and then compiled with your code.
  • All macros have to be declared so that the compiler can do the prior step – the declaration is specifically the API for the macro.  You can either import it via a library/framework or write it inline in your code.
    • It must define its role
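
A minimal sketch of a declaration for a freestanding expression macro; the module and type names are hypothetical:

```swift
// The role, the signature (the macro's API), and the #externalMacro that names
// the compiler plugin module and type implementing the expansion.
@freestanding(expression)
public macro stringify<T>(_ value: T) -> (T, String) =
    #externalMacro(module: "MyMacrosPlugin", type: "StringifyMacro")

// Usage: the expansion happens at compile time.
let (result, code) = #stringify(2 + 3)   // (5, "2 + 3")
```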

Macro Roles

  • Role is a set of rules for the macro
    • Where it can be used
    • What types of code it expands into
    • Where the expansions are inserted
  • They are responsible for achieving the goals.
  • There are two roles that create freestanding macros and five that create attached macros
  • An expression is a piece of code that executes and presents a result.  
  • Detailed examples for each of these types of macros are gone through in the session
  • If two different macros are attached to the same code it doesn’t matter about ordering, since they all are based on the original code.

Macro Implementation

  • Comes after an = sign, and it is always another macro
  • Usually an external macro implemented by a compiler plugin
  • #externalMacro specifies the plugin to call and use to expand the code.
  • You need to import SwiftSyntax in your macro code to work with Swift source syntax.  Check out the “Write Swift macros” session or the SwiftSyntax developer documentation.
  • Import SwiftSyntaxMacros – protocols and types for writing macros
  • Import SwiftSyntaxBuilder – provides convenience tools
  • You should include comprehensive error messages for when someone tries to use the macro incorrectly
  • You can provide Fix-it buttons, highlights, and other information in your error messages when the macro is used incorrectly

Writing correct macros

  • Name collisions – you can use the makeUniqueName() method on the macro expansion context
  • When Swift needs to use names outside of the macro, you need to use makeUniqueName() to solve this
  • Macros can only use the information that the compiler provides to them, so you can’t create a macro that inserts the current date and time (as an example)
  • Testing – a macro is an ordinary Swift module, so you should write normal Swift unit tests for it
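
A minimal sketch of using makeUniqueName() inside an expansion; WrapInStorageMacro is a hypothetical peer macro:

```swift
import SwiftSyntax
import SwiftSyntaxBuilder
import SwiftSyntaxMacros

public struct WrapInStorageMacro: PeerMacro {
    public static func expansion(
        of node: AttributeSyntax,
        providingPeersOf declaration: some DeclSyntaxProtocol,
        in context: some MacroExpansionContext
    ) throws -> [DeclSyntax] {
        // makeUniqueName returns an identifier guaranteed not to collide with
        // names in the user's code or in other macro expansions.
        let name = context.makeUniqueName("storage")
        return ["var \(name): Int = 0"]
    }
}
```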

What’s New in App Clips

App Clips are lightweight versions of your app that allow users to try it without installing.

There are three updates this year:

New size limit

  • App Clips need to be small to ensure an instant experience
  • You can now go up to a 50MB size limit
  • If you want to use QR codes or NFC tags, then you are still limited to 16MB
  • If you target iOS 15 or earlier, you are still limited to 10MB

Default App Clip links

  • Many App Clips require only a single experience; previously you had to provide a website to host the associated data.
  • See “Configure and link your App Clips” from WWDC20 for more information
  • Default links are Apple-generated URLs, so your App Clip can be invoked without any setup on your end.
  • You can get the invocation information with NSUserActivity
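
A minimal sketch of reading the invoking link in a SwiftUI App Clip; ClipRootView and ContentView are hypothetical:

```swift
import SwiftUI

struct ClipRootView: View {
    @State private var invokedURL: URL?

    var body: some View {
        ContentView()   // hypothetical App Clip UI
            // App Clip invocations arrive as a browsing-web user activity;
            // parse the URL to decide which experience to show.
            .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                invokedURL = activity.webpageURL
            }
    }
}
```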

Invoke From your App

  • Now it can be invoked from your app, so you can support things like food pickup in a messaging app.
  • Use the Link Presentation API – via an LPMetadataProvider request
  • Or use the default App Clip link 
  • This linking behavior does not require going through Safari
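
A minimal sketch of fetching link metadata before presenting an App Clip from within your app; the URL is a placeholder:

```swift
import LinkPresentation

func fetchClipMetadata() {
    guard let url = URL(string: "https://appclip.apple.com/id?p=com.example.Clip") else { return }
    let provider = LPMetadataProvider()
    // Fetch rich metadata (title, icon) to render a preview of the App Clip experience.
    provider.startFetchingMetadata(for: url) { metadata, error in
        if let metadata {
            print(metadata.title ?? "No title")
        }
    }
}
```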