Work with Reality Composer Pro content in Xcode

This is a continuation of the prior Reality Composer Pro session, where you created the scene and materials.

Load 3D Content 

  • You can load a full scene via an Entity async initializer – just give it the bundle from the package you created in Reality Composer Pro, then add the result inside a RealityView (see the sketch after this list)
  • It’s easiest if the Reality Composer Pro package lives in a Swift Package – that makes it much easier to load and manage in Xcode
  • ECS – Entity Component System – is the term and approach used by RealityKit and Reality Composer Pro
    • Entity is a thing (and can be invisible)
    • Components are attributes and data
    • System – where behavior lives, updating once per frame
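
Here's a minimal sketch of that loading flow, assuming Apple's template naming – a RealityKitContent Swift package that exports a realityKitContentBundle accessor, and a scene called "DioramaAssembled" (both hypothetical names):

```swift
import SwiftUI
import RealityKit
import RealityKitContent  // hypothetical: the Reality Composer Pro Swift package

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // The async Entity initializer loads the named scene
            // from the Reality Composer Pro package's bundle.
            if let scene = try? await Entity(named: "DioramaAssembled",
                                             in: realityKitContentBundle) {
                content.add(scene)
            }
        }
    }
}
```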

Components 

  • You can add components in Swift or in Reality Composer Pro
    • You can add as many as you want, but only one of each type
    • You can create your own components too
      • The sample here creates a PointOfInterest component (see the sketch after this list)
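
A minimal sketch of a custom component, assuming a single hypothetical name field (the session's real PointOfInterest component carries more data than this):

```swift
import RealityKit

// Conforming to Codable lets Reality Composer Pro read and write
// the component inside the scene file.
public struct PointOfInterestComponent: Component, Codable {
    public var name: String = ""   // hypothetical field
    public init() {}
}

// Register the component once (e.g. at app launch) so RealityKit
// can decode it from the Reality Composer Pro scene:
// PointOfInterestComponent.registerComponent()

// Later, once entities are in a scene, you can find everything
// that carries the component:
func pointsOfInterest(under rootEntity: Entity) -> [Entity] {
    let query = EntityQuery(where: .has(PointOfInterestComponent.self))
    return rootEntity.scene?.performQuery(query).map { $0 } ?? []
}
```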

User interface

  • To put SwiftUI content in a RealityKit scene – use the Attachments API
    • They are part of RealityView, which takes a make closure (run only once), an update closure (called only when the SwiftUI view changes), and an attachments view builder
    • The attachments view builder holds normal SwiftUI views, each identified by a hashable value – you can then fetch the entity RealityKit creates for each view with attachments.entity(for:) and add it to the content
    • To make this data-driven, we need to create the attachment views dynamically. We can author invisible entities in the Reality Composer Pro scene, query for them in code, and create a view for each of them. Storing the created views in @State lets SwiftUI know when new buttons exist, so the attachments view builder can emit them and we can add the resulting entities
    • We can query for all entities with a specific component attached – in our case, the PointOfInterest component (see the sketch after this list)
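
A minimal sketch of the attachments flow, assuming the shipping RealityView API (the exact attachments syntax has shifted across SDK betas) and a hypothetical "RavinePin" identifier:

```swift
import SwiftUI
import RealityKit

struct PinnedDioramaView: View {
    var body: some View {
        RealityView { content, attachments in
            // make closure: runs once. Fetch the entity RealityKit
            // created for the attachment view and place it in the scene.
            if let pin = attachments.entity(for: "RavinePin") {
                pin.position = [0, 0.2, 0]   // hypothetical placement
                content.add(pin)
            }
        } update: { content, attachments in
            // update closure: called only when the SwiftUI view changes.
        } attachments: {
            // A normal SwiftUI view, identified by a hashable value.
            Attachment(id: "RavinePin") {
                Button("Ravine") { /* hypothetical action */ }
            }
        }
    }
}
```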

Play audio

  • Add an Audio Emitter via the audio component in Reality Composer Pro and preview it in the editor. In the app you still have to load the source and tell it to play
  • Use the AudioFileResource API to pull the audio from the .usda file. Prepare the audio and then call play (see the sketch after this list)
  • You can introduce faders to crossfade between sounds as you morph between terrains
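
A minimal sketch of the load-prepare-play flow; the resource path and scene file name here are hypothetical, following the naming above:

```swift
import RealityKit
import RealityKitContent  // hypothetical: the Reality Composer Pro Swift package

// `emitter` is the entity carrying the audio component in the scene.
func playForestAmbience(on emitter: Entity) async throws {
    // Pull the audio resource out of the Reality Composer Pro scene file.
    let resource = try await AudioFileResource(
        named: "/Root/Forest_Sounds_wav",    // hypothetical resource path
        from: "DioramaAssembled.usda",       // hypothetical scene file
        in: realityKitContentBundle
    )
    // prepareAudio returns a controller; fade(to:duration:) on that
    // same controller is what enables crossfading between sounds.
    let controller = emitter.prepareAudio(resource)
    controller.play()
}
```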

Material properties

  • The shader graph was created to allow morphing between the two maps. Look at the node graph in the Shader Graph section of Reality Composer Pro
  • You can modify parameter values in code (see the sketch below)
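
A minimal sketch of pushing a new value into a Shader Graph material, with hypothetical parameter and entity names:

```swift
import RealityKit

// "Progress" is a hypothetical parameter exposed by the node graph.
func setTerrainMorph(_ progress: Float, on terrain: ModelEntity) throws {
    guard var material = terrain.model?.materials.first as? ShaderGraphMaterial else {
        return
    }
    try material.setParameter(name: "Progress", value: .float(progress))
    // Materials are value types, so write the modified copy back.
    terrain.model?.materials = [material]
}
```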

This session is very deep and should be watched – this blog post will help you understand why. Watch it at https://developer.apple.com/wwdc23/10273