WWDC 2023 – Day 1 – Platform State of the Union

Time to get into the details… Ok, this is just a post of my raw notes, no real analysis. I am looking forward to starting all the sessions tomorrow. But tonight I still have to watch the Design Awards.

  • Darin Adler – VP of Internet Technologies and Privacy
  • Language, Frameworks, Tools, and Services come together to make a Platform
  • Over 300 frameworks on Apple’s products. 
  • Interesting to see Apollo Reddit app on their main screen
  • Swift and SwiftUI
    • Holly Borla – Engineering Manager
    • New kind of API – Swift Macros, i.e. annotations
      • Generates code in Xcode 
      • You can expand Macros to see what they will do.
      • All macros provide custom feedback in Xcode
      • Attached Macros let you add functionality to your existing code, like adding @AddAsync to a function. You can expand the macro while debugging so you can see exactly what it does.
      • Many of the new APIs will use macros (a declaration sketch follows this list)
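Since macros come up throughout the new APIs, here is a minimal sketch of declaring and using a freestanding macro, based on the standard Swift macro package template; `stringify` and the `MyMacrosPlugin` module are illustrative, not an Apple API:

```swift
// Declaring a freestanding expression macro. The expansion logic lives in a
// separate compiler plugin target (assumed here to be "MyMacrosPlugin").
@freestanding(expression)
public macro stringify<T>(_ value: T) -> (T, String) =
    #externalMacro(module: "MyMacrosPlugin", type: "StringifyMacro")

// Using it: Xcode can expand #stringify(17 + 25) in place so you can inspect
// the generated code, which evaluates to (42, "17 + 25").
let (result, code) = #stringify(17 + 25)
print("\(code) = \(result)")
```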
    • Can now do bidirectional interoperability with C++
    • SwiftUI – updates
      • New support for pie charts via Swift Charts (see the sketch below)
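A quick sketch of a pie chart with Swift Charts’ new `SectorMark`, assuming iOS 17; the data model here is made up:

```swift
import SwiftUI
import Charts

struct Slice: Identifiable {
    let id = UUID()
    let category: String
    let sales: Double
}

struct SalesPieChart: View {
    // Illustrative sample data.
    let slices = [
        Slice(category: "Cachapa", sales: 120),
        Slice(category: "Crêpe", sales: 90),
        Slice(category: "Injera", sales: 60)
    ]

    var body: some View {
        Chart(slices) { slice in
            // SectorMark renders each value as a slice of a pie/donut chart.
            SectorMark(angle: .value("Sales", slice.sales))
                .foregroundStyle(by: .value("Category", slice.category))
        }
    }
}
```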
      • Expanded MapKit support
      • Animation updates
        • Gesture velocity is automatically transferred into the animation
        • .spring(duration:, bounce:) (sketch after this list)
        • Animations in SF Symbols are now supported
        • Animation phases
        • Full support for keyframing
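A small sketch of the new spring syntax from the list above; the view is illustrative:

```swift
import SwiftUI

struct BouncyCircle: View {
    @State private var isExpanded = false

    var body: some View {
        Circle()
            .frame(width: isExpanded ? 200 : 100)
            .onTapGesture {
                // New spring syntax: described by duration and bounce
                // instead of response/damping fraction.
                withAnimation(.spring(duration: 0.5, bounce: 0.3)) {
                    isExpanded.toggle()
                }
            }
    }
}
```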
      • Data flow – getting simpler, you will only have to deal with @State and @Environment
        • You can apply the @Observable macro to a class and observation is set up for you, without the old ObservableObject boilerplate (sketch below)
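A minimal sketch of the simplified data flow using the new @Observable macro; the model and view names are made up:

```swift
import SwiftUI
import Observation

// The @Observable macro generates the observation machinery; no
// ObservableObject conformance or @Published properties needed.
@Observable
class Library {
    var bookCount = 0
}

struct LibraryView: View {
    // The view just reads the model; SwiftUI tracks which properties are
    // actually used and re-renders only when those change.
    var library: Library

    var body: some View {
        Button("Add book (\(library.bookCount))") {
            library.bookCount += 1
        }
    }
}
```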
      • Core Data’s functionality now comes in Swift-native form as SwiftData.
        • Uses the @Model macro and the @Query property wrapper.
        • Use the same @Query in your widget so it can surface the same data (sketch below).
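A rough sketch of those SwiftData annotations; the `Trip` model and its properties are illustrative:

```swift
import SwiftUI
import SwiftData

// @Model turns a plain Swift class into a persisted model.
@Model
class Trip {
    var name: String
    var startDate: Date

    init(name: String, startDate: Date) {
        self.name = name
        self.startDate = startDate
    }
}

struct TripList: View {
    // @Query fetches live results from the model container; the same
    // query can back a widget's timeline data.
    @Query(sort: \Trip.startDate) private var trips: [Trip]

    var body: some View {
        List(trips) { trip in
            Text(trip.name)
        }
    }
}

// At the app level, the container is attached to the scene, e.g.:
// WindowGroup { TripList() }.modelContainer(for: Trip.self)
```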
    • App Experiences — Johnathan
      • WidgetKit
        • Make your app more glanceable and usable.
        • Minor updates let your widget appear in the new iPhone StandBy mode
        • Available on iPad and the Mac desktop
        • New widget architecture allows for Continuity (iPhone widgets on the Mac)
        • Identify your background and padding so the system can adapt your widget to these new contexts
        • Adding buttons or toggles makes your widget interactive (sketch after this list)
        • Can see widget timeline in Xcode previews… (this may help me fix my own app’s widget issue)
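A sketch of an interactive widget button, assuming iOS 17’s `Button(intent:)` support in WidgetKit; the intent, entry, and view names are all made up:

```swift
import WidgetKit
import SwiftUI
import AppIntents

// An intent the system runs when the widget's button is tapped.
struct RefreshFeedIntent: AppIntent {
    static var title: LocalizedStringResource = "Refresh Feed"

    func perform() async throws -> some IntentResult {
        // Kick off the app's refresh logic here.
        return .result()
    }
}

// Assumed timeline entry type for this sketch.
struct FeedEntry: TimelineEntry {
    let date: Date
    let headline: String
}

struct FeedWidgetView: View {
    var entry: FeedEntry

    var body: some View {
        VStack {
            Text(entry.headline)
            // Buttons and toggles in widgets execute an AppIntent
            // without launching the app.
            Button(intent: RefreshFeedIntent()) {
                Label("Refresh", systemImage: "arrow.clockwise")
            }
        }
        // Identifying the background lets the system remove it in
        // contexts like StandBy or the Lock Screen.
        .containerBackground(.fill.tertiary, for: .widget)
    }
}
```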
      • AppIntents
        • If you wrap your intent in an App Shortcut it will show up next to your app in the Shortcuts app (sketch below)
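And a sketch of wrapping an intent in an App Shortcut so it appears in the Shortcuts app, reusing the hypothetical `RefreshFeedIntent` from the widget sketch above:

```swift
import AppIntents

// Declaring an App Shortcut makes the intent available as soon as the app
// is installed, with no user setup required.
struct FeedAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: RefreshFeedIntent(),
            phrases: ["Refresh my \(.applicationName) feed"],
            shortTitle: "Refresh Feed",
            systemImageName: "arrow.clockwise"
        )
    }
}
```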
      • TipKit
        • Feature discovery in your app – allows you to educate your users.
        • You can set simple rules to target when a tip about a piece of functionality is shown (sketch below).
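A rough sketch of what TipKit usage looks like; treat the exact names (`Tip`, `.popoverTip`, `Tips.configure()`) as assumptions based on the iOS 17 framework:

```swift
import SwiftUI
import TipKit

// A tip describing a feature you want users to discover.
struct FavoriteTip: Tip {
    var title: Text { Text("Save as Favorite") }
    var message: Text? { Text("Tap the heart to find this item again later.") }
}

struct ItemView: View {
    private let favoriteTip = FavoriteTip()

    var body: some View {
        Button {
            // favoriting logic…
        } label: {
            Image(systemName: "heart")
        }
        // Shows the tip anchored to this button when the tip is eligible.
        .popoverTip(favoriteTip)
        .task {
            // Typically configured once at app launch.
            try? Tips.configure()
        }
    }
}
```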
      • AirDrop
        • Tap the share sheet while devices are close together and the data passes between the devices.
  • Hardware Features
    • Gaming
      • Easier than ever to port games to Mac
        • The Game Porting Toolkit evaluates your existing game in an emulation environment
        • Convert and compile shaders – via the Metal Shader Converter (supported in Xcode and on Windows)
        • MetalFX upscaling helps ported games perform well
    • Camera
      • AVCapture
        • Zero Shutter Lag, Overlapping Captures, Deferred Processing (move Deep Fusion into the background)
        • Later this year – Volume Shutter buttons
      • HDR-capable displays can show photos with much more of their captured dynamic range. There is no industry standard for HDR photos, so Apple has been working with the industry to create an ISO HDR standard for photos.
      • Video Conferencing
        • Reactions etc. are added at the camera level by default, but your app can detect when they are used and update accordingly
        • ScreenCaptureKit – allows for better screen sharing (share just the two apps you want)
        • Added external camera support on iPad and tvOS
        • Continuity Camera can be adopted easily in code
    • WatchOS
      • containerBackground, pagination, ToolbarItem, NavigationSplitView, and the new spring animations. If you are using SwiftUI much of this is automatic for you (sketch after this list)
      • New custom workout API
      • New core motion API – for swing analysis (like golf or tennis)
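A small sketch of the new watchOS 10 look, assuming the `containerBackground(_:for:)` modifier; the content is illustrative:

```swift
import SwiftUI

struct WorkoutSummary: View {
    var body: some View {
        NavigationStack {
            List {
                Text("Outdoor Run")
                Text("5.2 km · 27 min")
            }
            .navigationTitle("Today")
            // watchOS 10: a full-bleed gradient behind the navigation
            // container, matching the new system look.
            .containerBackground(Color.green.gradient, for: .navigation)
        }
    }
}
```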
  • Values
    • Chris Fleizach – Sr. Manager, Accessibility
      • 1.3 billion people with disabilities around the world.
      • Added the ability to detect doors in the Magnifier app.
      • For people sensitive to animation and flashing lights –
        • Pause Animated Images – stops animation in animated GIFs
        • Dim Flashing Lights – will darken the screen automatically when flashing lights occur (automatic if you use AVFoundation) – Apple is open-sourcing this algorithm
      • visionOS has tons of Accessibility features built in from the start
    • Katie – Privacy
      • Privacy prompt improvements, like add-only permission for Calendar. Photos has a new picker
      • App Privacy – helps you show users how you protect their data.
        • Third-party SDKs can provide Privacy Manifests – these get combined into a summary for your app
        • Signatures for third-party SDKs let you validate that the SDK was signed correctly (supply chain protection)
      • Communication Safety
        • This was added to messages in iOS15
        • It is now available across the entire platform – the Sensitive Content Analysis framework, which adds blurring too.
    • Chris Markiewicz – App Store
      • Merchandising UI for improved exposure in StoreKit
      • Can add a subscription store view easily and it works across all platforms (sketch below).
      • Will automatically determine version to show the user
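A minimal sketch of the subscription store view; the subscription group ID is a placeholder:

```swift
import SwiftUI
import StoreKit

struct PlusUpgradeView: View {
    var body: some View {
        // Renders the whole subscription group: plan names, prices,
        // the subscribe button, and the user's current state.
        SubscriptionStoreView(groupID: "21329654") // placeholder group ID
    }
}
```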
      • SKAdNetwork – for measuring downloads based on Advertising, can now measure reengagement.
  • Tools
    • Ken Orr – Sr. Mgr Xcode and Swift Playgrounds
      • Xcode –
        • Code completion will use surrounding code for prioritization, and symbols are automatically generated for your asset catalog (images, colors)
        • Previews – now built on a Swift macro (#Preview)
        • Works across all UI frameworks (sketch below)
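The new preview syntax is itself a macro; a minimal sketch with an illustrative view:

```swift
import SwiftUI

struct GreetingView: View {
    var body: some View {
        Text("Hello, WWDC!")
    }
}

// The #Preview macro replaces the old PreviewProvider boilerplate and also
// works for UIKit and AppKit view controllers.
#Preview {
    GreetingView()
}
```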
        • Git Staging is integrated in Xcode
        • Can see unpushed commits
        • Testing –
          • Redesign of test report, includes video recording of test results with navigation timeline.
          • Can show accessibility frames
        • Xcode Cloud
          • Have improved workflows
          • Added tester notes for TestFlight
          • Linker speed increased
  • Deep dive into visionOS
    • Michael – VP
      • Fundamentals – apps are built from:
      • Windows
      • Volumes
      • Spaces
      • A dedicated Full Space gives your app the whole environment
      • SwiftUI and UIKit for UI
      • RealityKit for visuals and spatial audio
      • ARKit for interacting with the real world
  • With SwiftUI you can add depth or a 3D object for layering; visionOS apps can then separate those layers in space
  • Using .offset(z: value)
  • Creating a volume in SwiftUI (sketch after this list)
  • And SwiftUI renders through RealityKit
  • Ornaments are for controls and toolbars
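A rough sketch pulling those SwiftUI pieces together for visionOS (volumetric window style plus a z offset); the app and asset names are made up:

```swift
import SwiftUI
import RealityKit

@main
struct GalleryApp: App {
    var body: some Scene {
        // A volume: a bounded 3D window the user can place in the room.
        WindowGroup {
            SculptureView()
        }
        .windowStyle(.volumetric)
    }
}

struct SculptureView: View {
    var body: some View {
        VStack {
            Text("Featured Sculpture")
            Model3D(named: "Sculpture")   // loads a bundled 3D asset
                .offset(z: 40)            // lift it toward the viewer
        }
    }
}
```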
  • For full scenes – you want to use RealityKit
    • Automatically adjusts to lighting conditions
    • Can also create portals into 3D Scenes 
    • Dynamic Foveation – uses eye tracking to focus processing power on the area the user is actually looking 
    • IBL – Image Based Lighting object for customizing lighting effects
    • MaterialX is the open standard used for custom shaders and materials
    • Can add SwiftUI attachments to 3D objects (RealityView sketch below)
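A minimal sketch of embedding RealityKit content in SwiftUI on visionOS via `RealityView` (attachments omitted); the generated sphere stands in for real scene content:

```swift
import SwiftUI
import RealityKit

struct FloatingSphereView: View {
    var body: some View {
        RealityView { content in
            // Build a simple RealityKit entity in place of a real asset.
            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [SimpleMaterial(color: .cyan, isMetallic: true)]
            )
            content.add(sphere)
        }
    }
}
```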
  • ARKit – allows objects to interact with the real world
    • Always running – persistence and world mapping are automatically handled for you (sketch after this list)
    • Plane Estimation
    • Scene Reconstruction
    • Image Anchoring
    • World Tracking 
    • Skeletal Hand Tracking
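A rough sketch of consuming the new ARKit data providers on visionOS via `ARKitSession`; treat the provider names and update loop as assumptions based on the announced API:

```swift
import ARKit

// Runs hand and world tracking, then reacts to hand anchor updates.
func startHandTracking() async {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()
    let worldTracking = WorldTrackingProvider()

    do {
        try await session.run([handTracking, worldTracking])
        for await update in handTracking.anchorUpdates {
            // Each update carries a HandAnchor with skeletal joint data.
            print("Hand anchor updated:", update.anchor.id)
        }
    } catch {
        print("ARKit session failed:", error)
    }
}
```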
  • Accessibility
    • People with physical or motor disabilities can interact using just their eyes or their voice
    • Can also use a finger, arm, or head for selection
    • Supports VoiceOver information 
    • Dynamic Type, Reduced Transparency and Accessibility UX features are available
  • Development tools
    • Xcode has it all built in.
    • The Simulator has multiple simulated scenes and lighting conditions
    • Mac Virtual Display
    • Reality Composer Pro – to import and organize 3D objects; integrates into the Xcode development workflow
  • Jason – Tour
    • Demo of Reality Composer Pro
    • TestFlight will be available on the device from the start
  • Unity – using their tools on top of RealityKit
    • They can coexist in the Shared Space
    • Ralph from Unity
      • Can bring over Unity work and reimagine it for Vision Pro
        • Demo of What the Golf
  • Additional Features
    • Goals – Use Features and Preserve Privacy
    • Virtual sound should behave as if it were real in the room. There is real-time spatial audio rendering.
    • User Input is private by design.
    • Sharing and collaboration – can use SharePlay to share any window on a FaceTime call.
    • Personas are in 3D for other Vision Pro users…
    • Spatial Personas will show people in the same space (outside of the window).
      • Will be available as a developer preview later this year
  • What’s next
    • Start designing, developing, and testing; the Vision dev kit will be available later this month.
    • To see how your app works on the device, there will be Vision Pro labs: in the US in Cupertino, and in Germany in München.
    • Brand new App Store