As always, I am going to give my first impressions of all the sessions I go through. Today is always the busiest day and I may have some spillover into tomorrow. But I am excited to get started:
Let’s start with what is new in Xcode –
- Xcode is 30% smaller – I noticed this by the initial download and installation.
- The first thing I did was download the Foodtruck app that will be used in many of the demos this year.
- Nice to see Swift plugins to improve the development environment.
- Dynamic type variants should help in improving my various apps on different devices and resolutions.
- Improvements in TestFlight Feedback allow you to quickly and easily see the feedback from your beta testers
- The new Hangs view shows you issues in code running in production.
- Different sized images can be automatically generated via the “Single Size” option in the inspector.
Next – What’s New in Swift:
- Community update
- DocC and Swift Setter were open sourced last year.
- C++ Interoperability and Swift Website were started as two new community workgroups
- Swift Mentorship was started last year. I signed up for this to see what I can learn and contribute
- Added support for Native Toolchain for CentOS7 and Amazon Linux 2 – RPMs are available but experimental at this time.
- Swift is now being used in Apple’s Secure Enclave
- Swift Packages
- TOFU – Trust On First Use – a package’s fingerprint is recorded on first download. It will be validated on later downloads, and an error reported if it changes.
- Command plugins – These include doc generation, source code reformatting, and other tools. Embracing developer tools this way should expand Swift uptake
- Creating a plugin with DocC was shown as a demo – you can now run it from Xcode. (Directly executed at any time)
- Build Tool plugins – allow you to inject steps in the build process. Think of this as extending activities during a complex build environment.
- You can use aliases to deal with module collisions
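Module aliasing can be sketched in a package manifest. This is a hedged example – the package and module names here are made up, assuming two dependencies that both vend a module named `Utils` (Swift 5.7 tools):

```swift
// swift-tools-version: 5.7
// Package.swift — sketch of module aliasing to resolve a name collision.
import PackageDescription

let package = Package(
    name: "MyApp",
    dependencies: [
        .package(url: "https://github.com/example/swift-game", from: "1.0.0"),
        .package(url: "https://github.com/example/swift-draw", from: "1.0.0"),
    ],
    targets: [
        .target(name: "MyApp", dependencies: [
            // Both packages vend a module named "Utils"; alias one of them
            // so it is imported as GameUtils instead.
            .product(name: "Utils", package: "swift-game",
                     moduleAliases: ["Utils": "GameUtils"]),
            .product(name: "Utils", package: "swift-draw"),
        ]),
    ]
)
```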
- Performance improvements
- The driver can now be used as a framework within the build system, so you can improve parallelization in the build process.
- Type Checking has sped up due to how they deal with generics
- Runtime – improved launch time due to how protocols are being handled. They can now be cached
- Concurrency updates
- Async/await has been further fleshed out – it can now be back-deployed to earlier operating systems
- Added data race avoidance – this is at the thread level and should address many bugs related to data race conditions
- The goal is to get to full thread safety by Swift 6. The current release is Swift 5.7
- You can enable stricter checking in the build settings.
- Distributed keyword lets the compiler know that an actor can be on a different machine – even across the network. You will need await and try to handle any network errors that occur.
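A minimal sketch of the distributed keyword, assuming the LocalTestingDistributedActorSystem that ships in the Distributed module (a real deployment would plug in a networked actor system):

```swift
import Distributed

// Sketch only: a distributed actor whose methods may be invoked
// from another process or machine.
distributed actor Player {
    typealias ActorSystem = LocalTestingDistributedActorSystem

    let name: String
    init(name: String, actorSystem: ActorSystem) {
        self.name = name
        self.actorSystem = actorSystem
    }

    // distributed funcs are callable across the wire, so their
    // parameters and results must be Codable.
    distributed func makeMove() -> String { "rock" }
}

// Call site: both `try` and `await` are required, because the call
// may cross the network and can fail in transit.
// let move = try await player.makeMove()
```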
- A new set of open source async algorithms was released (building on the concurrency features introduced in Swift 5.5)
- Actor prioritization will allow for improved performance
- There is a new concurrency view in Instruments so you can visualize and optimize your concurrency code.
- Expressive Swift
- New shorthand for if let and guard lets you drop the right-hand side of the = and just use the optional’s name.
- Swift is very type-strict, unlike C, which allows automatic conversions. Now when you pass pointers to C from Swift, you won’t get Swift errors for valid pointer conversions.
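The optional-binding shorthand is a one-line change; a small sketch:

```swift
// Swift 5.7 shorthand: drop the right-hand side of the = when the
// shadowing name matches the optional.
let nickname: String? = "Geek"

// Before: if let nickname = nickname { ... }
if let nickname {
    print("Hello, \(nickname)")   // nickname is non-optional in this scope
}

// The same shorthand works with guard:
func greet(_ name: String?) -> String {
    guard let name else { return "Hello, stranger" }
    return "Hello, \(name)"
}
```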
- New string tooling improves parsing. There’s a new declarative approach to string parsing via Regex – you can now use words instead of symbols by importing the RegexBuilder library. You can make patterns reusable and recursive, and you can still use regex literals inside the builder if you want.
- Compatible with UTS#18 with extensions.
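A small sketch of the RegexBuilder DSL described above – parsing a hypothetical "name: quantity" line with words instead of symbols (Swift 5.7):

```swift
import RegexBuilder

// References let you pull typed values out of a match by name.
let quantity = Reference(Int.self)
let name = Reference(Substring.self)

let line = Regex {
    Capture(as: name) { OneOrMore(.word) }
    ": "
    // transform: converts the matched digits to Int during matching.
    Capture(as: quantity) { OneOrMore(.digit) } transform: { Int($0)! }
}

if let match = "apples: 12".firstMatch(of: line) {
    print(match[name], match[quantity])
}

// Regex literals still work, and can be embedded in a builder:
let digits = /\d+/
```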
- Generic code clarity
- This will require you to add the any keyword ahead of any instance of a protocol used as a type, versus as a generic type. (Not fully sure I understand this one yet, but I will dig into it more over the next 12 months.)
- Primary associated types can now be constrained
- Generics are improved by the some keyword – minimizing the amount of boilerplate you need when creating them.
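My current understanding of the any / some distinction, as a sketch (type names are made up for illustration):

```swift
// `any` marks an existential: a box that can hold any conforming type.
// `some` marks an opaque-but-concrete type, and in Swift 5.7 it can
// replace old generic boilerplate in parameter position.
protocol Vehicle {
    var wheels: Int { get }
}
struct Bike: Vehicle { let wheels = 2 }
struct Car: Vehicle { let wheels = 4 }

// Mixed concrete types require the explicit `any` spelling.
let garage: [any Vehicle] = [Bike(), Car()]

// Equivalent to `func describe<V: Vehicle>(_ vehicle: V)`, minus the boilerplate.
func describe(_ vehicle: some Vehicle) -> String {
    "\(vehicle.wheels) wheels"
}
```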
Next – What’s new in SwiftUI:
- New SwiftUI app structure
- Swift Charts (new Framework)
- State-driven charts in SwiftUI for data visualization
- A chart is just “some View”
- It handles localization, Dynamic Type, and Dark Mode automatically
- Navigation and Windows
- Stacks – Push / Pop
- New Container view – NavigationStack – wraps a root content view
- Improves handling of stack
- Split Views
- New NavigationSplitView allows 2- and 3-column layouts
- Works with value-based NavigationLinks, and automatically collapses to a stack in smaller-width environments
- Look at The SwiftUI Cookbook for Navigation
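A minimal sketch of the new NavigationStack with value-based links (iOS 16). The Recipe model is made up for illustration:

```swift
import SwiftUI

struct Recipe: Hashable { let name: String }

struct RecipesView: View {
    let recipes = [Recipe(name: "Tacos"), Recipe(name: "Pad Thai")]

    var body: some View {
        NavigationStack {                      // wraps a root content view
            List(recipes, id: \.name) { recipe in
                // Pushes a value, not a destination view.
                NavigationLink(recipe.name, value: recipe)
            }
            // The stack maps pushed values to destinations in one place.
            .navigationDestination(for: Recipe.self) { recipe in
                Text(recipe.name)
            }
        }
    }
}
```

Swapping `NavigationStack` for `NavigationSplitView` gives the multi-column layout, and the same value-based links collapse to a stack in compact widths.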
- Scenes – Multi-window views
- WindowGroup is the base way for app …
- Now you can create a “Window”, which allows a single window for an app, and you can add a keyboard shortcut for that window to be displayed. You can use openWindow() to open the window.
- SwiftUI will remember a user’s changes across launches
- And the .presentationDetents modifier can be used on iOS to present resizable sheets from the bottom.
- MenuBar extras are now available in SwiftUI
- MenuBarExtra()
- You can use this to build an entire app in the menu bar, or add a menu bar extra to be available.
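A sketch of a menu-bar-only macOS app with the new MenuBarExtra scene (app name and action are hypothetical):

```swift
import SwiftUI
import AppKit

// Sketch: the whole app lives in the menu bar via a MenuBarExtra scene.
@main
struct UtilityApp: App {
    var body: some Scene {
        MenuBarExtra("Utility", systemImage: "hammer") {
            Button("Do Something") { /* hypothetical action */ }
            Divider()
            Button("Quit") { NSApplication.shared.terminate(nil) }
        }
    }
}
```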
- Added updates for each of them.
- Advanced Controls
- Forms
- System Settings uses this
- Forms allow you to create consistent and well formed designs for the detail view
- You use Form {} and Section {}, and apply the .formStyle(.grouped) modifier
- You can also use the LabeledContent() view to pair a simple text label with an item
- Titles and subtitles can have updated text based on state
- Controls
- Can configure TextField to expand along an axis: .vertical, and you can add a line limit for the expansion
- Date picker now supports non-contiguous selection
- Aggregate toggles are also available now.
- Steppers can now have a format for their value. (And they’re available on watchOS – I should change Wasted Time to use a stepper)
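The new control options above can be sketched together; a hedged example (iOS 16-era APIs):

```swift
import SwiftUI

struct ControlsDemo: View {
    @State private var notes = ""
    @State private var minutes = 5

    var body: some View {
        Form {
            // Grows vertically as the user types, up to 4 lines.
            TextField("Notes", text: $notes, axis: .vertical)
                .lineLimit(1...4)

            // Stepper can format its bound value directly.
            Stepper(value: $minutes, format: .number) {
                Text("Minutes")
            }
        }
        .formStyle(.grouped)
    }
}
```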
- Tables
- Now supported on iPadOS – this uses the same code as macOS – Table { TableColumn() } – and will render on compact devices like iPhone
- You can add a contextMenu(forSelectionType:) to act on a selection of items in a table. It works on single, multiple, or no selected rows.
- You now have a new toolbar design on iPadOS – toolbars can be customized and re-ordered by users, which will be saved across app launches.
- ToolbarItem(placement: .secondaryAction)
- Basic search is available with the .searchable modifier. You can now add tokens for customized search, and can also add scopes. Lots of new stuff here.
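A sketch of Table on iPadOS with the selection-based context menu – the Person model is made up for illustration:

```swift
import SwiftUI

struct Person: Identifiable {
    let id = UUID()
    let name: String
    let city: String
}

struct PeopleTable: View {
    let people = [Person(name: "Ana", city: "Lisbon"),
                  Person(name: "Ben", city: "Austin")]
    @State private var selection = Set<Person.ID>()

    var body: some View {
        // Same Table code as macOS; collapses gracefully on iPhone.
        Table(people, selection: $selection) {
            TableColumn("Name", value: \.name)
            TableColumn("City", value: \.city)
        }
        // Invoked for single, multiple, or empty selections.
        .contextMenu(forSelectionType: Person.ID.self) { items in
            Button(items.isEmpty ? "New Person" : "Delete") { /* hypothetical */ }
        }
    }
}
```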
- Sharing (transferable)
- Photos
- The new picker is privacy preserving
- Can be placed anywhere in your app
- Need to check out PhotosPicker – you add a binding for the selection (looks very easy to use)
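It does look easy; a sketch of the binding-based PhotosPicker, assuming you just want the raw image data:

```swift
import SwiftUI
import PhotosUI

struct AvatarPicker: View {
    @State private var selection: PhotosPickerItem?
    @State private var imageData: Data?

    var body: some View {
        // Privacy-preserving: the app only sees what the user picks.
        PhotosPicker(selection: $selection, matching: .images) {
            Label("Choose Photo", systemImage: "photo")
        }
        .onChange(of: selection) { item in
            Task {
                // Loading is async and may fail, hence try? and the optional.
                imageData = try? await item?.loadTransferable(type: Data.self)
            }
        }
    }
}
```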
- Sharing
- Standard sharing view just use ShareLink to enable it within your app
- Provide content and preview
- Should add to card app to share cool cards found
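A minimal ShareLink sketch with content plus a preview (the URL and titles are placeholders):

```swift
import SwiftUI

struct CardShareView: View {
    // Hypothetical shareable item.
    let cardURL = URL(string: "https://example.com/card/123")!

    var body: some View {
        // Standard system share sheet; just supply the item and a preview.
        ShareLink(
            item: cardURL,
            preview: SharePreview("Cool Card",
                                  image: Image(systemName: "star"))
        )
    }
}
```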
- Transferable
- New Swift-first way of transferring data across applications
- Supports drag and drop and uses .dropDestination()
- You need to conform to Codable and provide a custom content type
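A sketch of a custom Transferable type. The `com.example.card` identifier is hypothetical and would also need to be declared as an exported type in the app’s Info.plist:

```swift
import SwiftUI
import CoreTransferable
import UniformTypeIdentifiers

// Codable + a custom content type is all a simple Transferable needs.
struct Card: Codable, Transferable {
    let title: String

    static var transferRepresentation: some TransferRepresentation {
        CodableRepresentation(contentType: .card)
    }
}

extension UTType {
    // Hypothetical identifier; declare it in Info.plist as an exported type.
    static let card = UTType(exportedAs: "com.example.card")
}

// A view can then accept drops of cards:
// someView.dropDestination(for: Card.self) { cards, location in
//     handle(cards)
//     return true
// }
```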
- Graphics and layout
- Shape Styles
- Color has new gradient properties based on the color you use
- Shape has a new Shadow modifier
- Previews in Xcode 14 allows for variants to allow you to easily see things at the same time. And without writing configuration code.
- Layouts
- Applied Geometry
- Grid is a new container view to allow for improved layouts
- New Layout protocol allows you to build really complex abstractions.
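A small sketch of the new Grid container – cells align across rows automatically, unlike nested stacks:

```swift
import SwiftUI

struct StatsGrid: View {
    var body: some View {
        Grid(alignment: .leading, horizontalSpacing: 12) {
            GridRow {
                Text("Metric")
                Text("Value")
            }
            // Keep the divider from stretching the column width.
            Divider().gridCellUnsizedAxes(.horizontal)
            GridRow {
                Text("Sessions")
                Text("42")
            }
        }
    }
}
```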
On to “Get to know Developer Mode” –
I noticed this right away when some of my local apps I’ve been working on wouldn’t run. So I had to figure out how to turn it on for my iOS and iPadOS devices. Seems like Apple is working to lock down more things on the device to help increase security on devices.
- What is Developer Mode?
- iOS 16 and watchOS 9 – disabled by default
- Has to be enabled per device
- Persists across reboots and system updates
- Most common distribution modes do not require Developer Mode: TestFlight, App Store, enterprise.
- Using Developer Mode
- You need it if you want to run and install development-signed applications
- Use testing automation
- Beta releases will have the option visible
- It is under Settings -> Privacy & Security
- Automation Flow
- There are tools to automate multiple devices, but you need to have passcode turned off
- You can use devmodectl on macOS Ventura to enable this by default
Now on to security with Meet Passkey:
I wonder if 1Password and LastPass are working with Apple to enable this in their password managers.
- Passkeys are now available and should be adopted by everyone
- With Autofill and password managers – you currently have 2-3 steps just to log in and it’s longer if you use second factor
- Passkeys change this to a 1-step login – the credential is stored in your Keychain and is unique per account and strong.
- The process will use a QR code if you try to log in from an unknown or non-Apple device – you can then scan it from your phone and use it to log in.
- Shared accounts are supported – you can take a trusted account in Keychain and use the Share button to allow another person to use the same key.
- Designing for Passkeys
- 1st, they are replacements for passwords – faster, easier, and much more secure
- Common phrase is “passkey”
- In SF Symbols person.key.badge and person.key.badge.fill
- You don’t need to design new UI for login… just keep the user name field
- You can present them as a first class entry via AutoFill
- There are additional UI options
- Passkeys and AutoFill
- Use WebAuthn on the backend to allow this to work
- On Apple platforms, the ASAuthorization family of APIs is used
- You need to set up associated domains with webcredentials (the 2017 session will help you understand)
- Make sure you use the username content type in your UI
- Fetch challenge from server
- Create provider and request via ASAuthorizationPlatformPublicKeyCredentialProvider and createCredentialAssertionRequest
- Pass the request to the ASAuthorizationController
- And start the request via .performAutoFillAssistedRequests(). If you use .performRequests() you will get a modal request.
- You will get a didCompleteWithAuthorization callback containing an assertion – read the values, confirm them with your server, and complete the sign-in
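The steps above can be sketched as one function. This is hedged – the relying-party identifier is a placeholder and the challenge must come from your server:

```swift
import AuthenticationServices

// Sketch of the AutoFill-assisted passkey sign-in flow.
func beginPasskeySignIn(challenge: Data,
                        delegate: ASAuthorizationControllerDelegate) {
    let provider = ASAuthorizationPlatformPublicKeyCredentialProvider(
        relyingPartyIdentifier: "example.com")   // placeholder domain
    let request = provider.createCredentialAssertionRequest(challenge: challenge)

    let controller = ASAuthorizationController(authorizationRequests: [request])
    controller.delegate = delegate
    // Non-modal: offers passkeys through the QuickType bar.
    // Use performRequests() instead for a modal presentation.
    controller.performAutoFillAssistedRequests()
}

// In the delegate's didCompleteWithAuthorization callback, the credential
// is an assertion whose signature and authenticatorData you verify server-side.
```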
- This will also support sign-in with a nearby device
- This also works on the web via the WebAuthn API – the session shows a typical JavaScript example for people who are working on websites.
- Note Passkeys are replacing Safari’s legacy platform authenticator
- Streamlining Sign-in
- Supports passkey allow lists – this shows a list of potential passkeys by default for a specific site; you can also use the username to restrict the list to appropriate values.
- Silent fallback requests – by default, if there are no matching passkeys, it will show a QR code to sign in from a nearby device. You can fall back to showing traditional login features, like username and password, but you must handle error.code == .canceled to show that form (or other logic)
- Combined credential requests – this will allow the picker to show that you have a passkey and another account with password
- How Passkeys Work
- Current technology requires a server to store a hashed and salted value, which, if someone else gets it, allows them to get into your account
- Passkeys use public/private key pairs. The public key is stored on the server and available to anyone. It is used in a single-use challenge sent to your device, where the private key is stored in secure hardware. The device uses the private key to produce a solution, which is sent back to the server. If the server can validate the solution with your public key, you are logged in. The server can only say a solution is valid – it CANNOT create new solutions. The public/private key approach makes things significantly harder for attackers: since you should NEVER provide your private key, it is near impossible for a hacker to phish you. However, like all other security, if you hand over your private key, you are giving someone else what they need to hack you.
- Passkey QR sign-in also uses Bluetooth for a proximity exchange, so a hacked email or fake site can’t provide the proximity challenge – again, they can’t phish you. (The proximity check is done in the browser, not by the server.)
- Multi-Factor Authentication
- Today, multi-factor authentication allows you to combine factors to improve security.
- Passkeys eliminate the human factor from phishing, and can’t be leaked from the server, since the private key is secure on your hardware.
Meet Desktop Class iPad
Focus on Navigation Bar Updates:
- Styling and controls
- UINavigationBar allows for new UI optimized for iPad, with styles like Navigator, Browser, and Editor
- Default style is Navigator – this is a traditional UINavigationBar
- Browser – changes to address history and location. Title is leading
- Editor – Leading aligned title – great for destination activities
- These last two have lots of space in the center – which allows you to present new buttons to the user. Overflow is supported in all modes (…)
- UIBarButtonItems are now grouped – Fixed Groups always appear first and cannot be removed or moved by customization
- Movable groups cannot be removed but can be moved.
- Optional groups are both movable and removable – they will be collapsed when the UI needs to save space
- Customization will apply these rules… and then allow the user to make the changes.
- Overflow contains any item that won’t fit along with the item to customize the bar.
- On macOS this all becomes NSToolbar
- Document interactions
- UINavigationBar supports adding a menu to the title group – to add actions on the content as a whole
- By default these are Duplicate, Move, Rename, Export, and Print
- You can add your own items to the menu, filter away some defaults, etc.
- On Mac Catalyst these items exist in the File menu; for items you add, you have to use UIMenuBuilder to put them in an appropriate menu
- Search updates
- In iOS 16, search now takes up less space – it is presented inline
- Suggestions appear and can be updated along with the query
- Conform to UISearchResultsUpdating and use UISearchSuggestionItem
Getting the most out of Xcode Cloud
- Xcode Cloud Review:
- Introduced last year for CI/CD
- Can build, run automated tests in parallel and distribute to users
- Review an existing workflow
- First check that current workflows are working, and look for optimization opportunities
- The build details Overview provides info
- Build duration is time-based; usage is effort-based and is what’s used to calculate your compute usage
- Xcode Cloud usage dashboard
- This shows you usage, trends, and available remaining compute.
- Best Practices
- Avoid unintended builds (based on start conditions)
- Don’t build duplicate commits
- Can also use the files-and-folders condition so a build doesn’t start if only docs or other folders changed
- Predefined actions, Analyze, Archive, Build, Test, etc.
- Tests should be based on a concise set of devices (you don’t need to test on all devices); there is an alias for recommended devices.
- To skip a build based on the type of change, just add [ci skip] to the end of the commit message.
- Revisit Optimized Build
- The usage dashboard shows you the impacts of your changes
- Given that Apple is starting to charge for this, I will monitor my usage on my current app I am testing to see if it is still worth it
Bring your world into augmented reality
Using the Object Capture API and RealityKit to create 3D models of real-world objects
- Object Capture recap
- Model and textures are combined to the model
- Last year’s session goes through the details
- There are some great models and tools available. Check out the GOAT app to try on shoes via AR.
- ARKit camera enhancements
- Take good photos from all sides; if you use the camera app on iPhone or iPad, it will capture scale and orientation
- Leverage ARKit for 3D Guidance UI – this is built into ARKit
- The higher the image resolution the better the model quality.
- New high-resolution API at native camera resolution – while still using other ARKit features. On iPhone 13 (so far), it does not interrupt the ARSession.
- EXIF metadata is available in the photos – just use ARWorldTrackingConfiguration.recommendedVideoFormatForHighResolutionFrameCapturing
- Then set config.videoFormat = hiResCaptureVideoFormat
- session.run(config)
- You then call session.captureHighResolutionFrame()
- AVCaptureDevice allows you to capture the underling device properties directly
- All of this is in Discover ARKit 6
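The capture steps above can be consolidated into a short sketch (ARKit 6, iOS 16):

```swift
import ARKit

// Configure the session for high-resolution frame capture.
func enableHighResCapture(session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    if let hiResFormat = ARWorldTrackingConfiguration
        .recommendedVideoFormatForHighResolutionFrameCapturing {
        config.videoFormat = hiResFormat
    }
    session.run(config)
}

// Capture a single frame at native camera resolution,
// without interrupting the running ARSession.
func snap(session: ARSession) {
    session.captureHighResolutionFrame { frame, error in
        guard let frame else { return }
        // frame.capturedImage is the high-resolution pixel buffer;
        // EXIF metadata is available via frame.exifData.
        _ = frame.exifData
    }
}
```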
- Best practice guidelines
- Characteristics – sufficient texture detail (transparent objects are really hard)
- Minimal reflective surfaces
- Ensure the object is rigid if you flip it
- Limited fine structure – you will need to get close to address that
- Ideal environment has diffuse even lighting
- Lots of space around the object
- Capture from various heights, and make sure that the object is large in the field of view.
- Lots of overlap is important
- (Around 80 photos is good)
- Get the bottom (another 20 pictures)
- Landscape mode for capturing a long object
- Copy the photos to a Mac and process them. There are four detail levels for models – Reduced, Medium, Full, and Raw – iOS is limited to the first two. The others are for pro workflows.
- End to end Workflow
- Demo of the workflow
- Use the PhotogrammetrySession API (from WWDC 2021)
- Reality Converter allows you to change colors and textures on objects
- Recommend checking out RealityKit session from 2021
- Will certainly go over this section multiple times… a great example of showing how to use 3D objects in a game
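The capture-to-model workflow above can be sketched with PhotogrammetrySession on macOS. The paths are placeholders, and I’m assuming the Reduced detail level here:

```swift
import RealityKit

// Sketch: turn a folder of captured photos into a USDZ model.
func buildModel() throws {
    let input = URL(fileURLWithPath: "/tmp/CapturedPhotos", isDirectory: true)
    let output = URL(fileURLWithPath: "/tmp/model.usdz")

    let session = try PhotogrammetrySession(input: input)
    // .reduced / .medium also run on iOS; .full and .raw are pro workflows.
    try session.process(requests: [.modelFile(url: output, detail: .reduced)])

    Task {
        // Outputs arrive asynchronously; completion signals the model is written.
        for try await result in session.outputs {
            if case .processingComplete = result {
                print("Model written to \(output.path)")
            }
        }
    }
}
```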
Create parametric 3D room scans with RoomPlan
Cool to see members of the prototyping and video team. RoomPlan is a framework to scan a room to build a parametric model of the room.
This should allow for significant changes in interior design and other home remodeling tools.
- Scanning experience API
- A UIView subclass with world-space feedback, realtime model generation, and coaching & user guidance
- To use in your own app, it’s just four steps:
- You can also add delegate classes to opt out of processing and/or to export the USDZ model
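A hedged sketch of the out-of-the-box scanning UI: present a RoomCaptureView, run a session, and export the parametric result as USDZ (the export path is a placeholder):

```swift
import UIKit
import RoomPlan

class ScanViewController: UIViewController, RoomCaptureViewDelegate {
    private var captureView: RoomCaptureView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // The framework-provided view supplies coaching and a live model.
        captureView = RoomCaptureView(frame: view.bounds)
        captureView.delegate = self
        view.addSubview(captureView)
        captureView.captureSession.run(
            configuration: RoomCaptureSession.Configuration())
    }

    // Called with the processed parametric model when scanning finishes.
    func captureView(didPresent processedResult: CapturedRoom, error: Error?) {
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("room.usdz")   // placeholder path
        try? processedResult.export(to: url)
    }
}
```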
- Data API
- Scan -> Process -> Export is the basic workflow
- There’s a simple RealityKit app – I downloaded the code.. can’t run it tonight.. but hope to get to it later this week
- Best Practices
- Need at least 50 lux
- Problematic things
- Mirrors and Glass present problems
- High Ceilings
- Dark Surfaces
- For high resolution
- Close doors
- Open curtains
- Provide feedback during the scan
- Battery and thermals are things to consider – you don’t really want to scan for longer than 5 minutes
Bringing Continuity Camera to your macOS App
- What is continuity camera?
- Just bring your iPhone close to your Mac and it works wirelessly
- It will also show up as an external camera and microphone on your macOS
- Wired requires USB; wireless requires both Bluetooth and Wi-Fi
- You get a one-time onboarding dialog per app.
- You now get the devices available
- You get additional video effects – In control center you then have the ability to add additional effects
- With Continuity Camera you get background blur even on non-Apple Silicon Macs
- You can also combine video effects together
- Desk View mode – Portrait camera placement is best. Also turn on Center stage.
- I gave this a try with my iPhone by holding it up behind my Mac. It really was magical to see my external keyboard and fingers on the trackpad, as if from an overhead camera
- All notifications will be paused when you use continuity camera
- Building a magical experience
- New Camera Management APIs are available.
- You should enable the APIs to automatically switch cameras instead of requiring the user to select from a dropdown. You can set the userPreferredCamera property, and it is key-value observable (KVO), letting you intelligently switch to the most-preferred camera.
- systemPreferredCamera is read-only – it presents the best choice on the system.
- Recommend adding a new UI element to enable and disable auto selection mode.
- Recipe:
- Automatic Camera Selection is Off
- Stop KVO systemPreferredCamera
- Update session’s input device with user selection
- Set userPreferredCamera when user picks a camera
- Automatic Camera Selection is On – resume KVO of systemPreferredCamera and follow it
- Will need to check out the sample app – Continuity Camera Sample
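My reading of the recipe above, as a sketch (session reconfiguration details elided):

```swift
import AVFoundation

final class CameraSelector {
    let session = AVCaptureSession()
    private var observation: NSKeyValueObservation?

    // Automatic selection ON: track the system's best choice via KVO.
    func enableAutoSelection() {
        observation = AVCaptureDevice.self.observe(
            \.systemPreferredCamera, options: [.new]
        ) { [weak self] _, _ in
            if let device = AVCaptureDevice.systemPreferredCamera {
                self?.switchInput(to: device)
            }
        }
    }

    // Automatic selection OFF: stop observing and honor the user's pick.
    func userPicked(_ device: AVCaptureDevice) {
        observation = nil
        // Recording the choice informs the system's future ranking.
        AVCaptureDevice.userPreferredCamera = device
        switchInput(to: device)
    }

    private func switchInput(to device: AVCaptureDevice) {
        // Reconfigure the session's video input (details elided).
    }
}
```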
- New APIs on macOS
- Your Mac app can now use iPhone camera features
- Supports up to 12 MP in AVCapturePhotoOutput
- Can also prioritize photo quality vs. speed
- Flash capture is also enabled
- Along with metadata indicating whether there is a face or human body in the image from the iPhone
- Video, movie, and other formats are now also supported with Continuity Camera
- The Desk View is exposed as a separate device in device discovery
Adopt Desktop class editing interactions
- Edit Menus
- Presentation is based on the input mode used; you get context menus with a right-click or the secondary menu option.
- Data detector integration does things like: if you select an address you will get Get Directions and Open in Maps. No code adoption required.
- To add your own actions, customize via UITextViewDelegate – or return nil to get the standard system menu
- UIMenuController is deprecated
- Instead we now have UIEditMenuInteraction, which allows for context menu presentation on secondary click
- This is contextual based on the location of the touch in the system.
- The behavior on Mac will be familiar to that modality
- You now have Preferred element size – similar to how widgets are handled on iPadOS and iOS
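A sketch of the UIMenuController replacement: attach a UIEditMenuInteraction and present it from a gesture (iOS 16):

```swift
import UIKit

class CanvasViewController: UIViewController, UIEditMenuInteractionDelegate {
    private var editMenu: UIEditMenuInteraction!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Replaces the deprecated UIMenuController.
        editMenu = UIEditMenuInteraction(delegate: self)
        view.addInteraction(editMenu)

        let press = UILongPressGestureRecognizer(
            target: self, action: #selector(didPress))
        view.addGestureRecognizer(press)
    }

    @objc private func didPress(_ gesture: UILongPressGestureRecognizer) {
        guard gesture.state == .began else { return }
        // The menu is contextual to the touch location.
        let config = UIEditMenuConfiguration(
            identifier: nil, sourcePoint: gesture.location(in: view))
        editMenu.presentEditMenu(with: config)
    }
}
```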
- Find and Replace
- New UI component for in app find and replace editing feature. Works across macOS, iOS, and iPadOS all you need to do to enable is set .isFindInteractionEnabled = true for UITextView, WKWebView, PDFView or Quick Look (already setup).
- With a hardware keyboard, all shortcuts work as expected; just make sure the view can become first responder
- You can make it available via a navigation bar
- On a scroll view it will adapt to trait collection changes.
- It also supports regular expressions – wow!
- If you want to add it to another view, like a list view, you can add UIFindInteraction to any view.
- After you add it to your custom view, just set up a UIFindInteractionDelegate
- You can also handle everything yourself, like a manual find-and-replace feature you’ve written
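Both paths above fit in a few lines; a sketch:

```swift
import UIKit

// For UITextView (and WKWebView / PDFView), it is literally one property.
func enableFind(in textView: UITextView) {
    textView.isFindInteractionEnabled = true
}

// For a fully custom view, attach UIFindInteraction yourself and supply
// a session delegate that defines what "find" means in that view.
func attachFind(to customView: UIView,
                delegate: UIFindInteractionDelegate) {
    let interaction = UIFindInteraction(sessionDelegate: delegate)
    customView.addInteraction(interaction)
}
```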
- Highly recommend watching this session – both for the presenters’ dry humor and the amount of detail they provide about something that seems so easy to just use. They had me at the closing image.
Complications and Widgets: Reloaded
Adding both watch complications and iPhone Lock Screen widgets
- Complications timeline
- Immediately accessible, high-value information; a tap takes you to the relevant area in the app
- They are being remade in WidgetKit – I will have to recreate my complication from Wasted Time for the Watch
- Accessory Corner family is specific for WatchOS
- Go further with WidgetKit
- Colors
- System controls the look in one of three modes: full color, accented, or vibrant
- There is a WidgetRenderingMode
- Accented mode keeps the original opacity
- In vibrant rendering mode, your content is desaturated and then adapted to the environment it is in
- Avoid using transparent colors in vibrant mode
- Using AccessoryWidgetBackground will fix a lot of issues
- Project Setup
- To update your existing widget app.
- You can add a new Widget target if you want to add watch app to existing iOS widget
- For watchOS you need to create a preconfigured list of intents in your provider
- Making glanceable views
- New auto-updating progress views, so you don’t have to do so many timeline updates
- Make sure to use font styles – things like headline, body, title, and caption
- Also look at ViewThatFits to provide better options for tough-to-fit elements
- Privacy
- You should have settings for your widgets and complications for when they are redacting content or are in a low-luminance state. This is the first hint at an always-on iPhone screen.
- To test these states you can use the @Environment(\.isLuminanceReduced) variable.
- Use .privacySensitive() to redact some values in your widgets
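A sketch of handling both states in a widget view (the field names are made up):

```swift
import SwiftUI
import WidgetKit

struct BalanceWidgetView: View {
    // True in the low-luminance (always-on / dimmed) state.
    @Environment(\.isLuminanceReduced) private var isLuminanceReduced
    let balance: String

    var body: some View {
        VStack {
            Text("Balance")
            Text(balance)
                .privacySensitive()   // redacted when content is locked
        }
        // Dim-friendly styling for the reduced-luminance state.
        .opacity(isLuminanceReduced ? 0.6 : 1.0)
    }
}
```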