Build spatial experiences with RealityKit
Discover how RealityKit can bring your apps into a new dimension. Get started with RealityKit entities, components, and systems, and learn how you can add 3D models and effects to your app on visionOS. We'll also take you through the RealityView API and demonstrate how to add 3D objects to windows, volumes, and spaces to make your apps more immersive. And we'll explore combining RealityKit with spatial input, animation, and spatial audio.
Chapters
- 0:00 - Introduction
- 2:50 - RealityKit and SwiftUI
- 9:36 - Entities and components
- 12:20 - RealityView
- 15:37 - Input, animation and audio
- 23:00 - Custom systems
- 26:15 - Wrap-up
Resources
Related Videos
WWDC23
- Develop your first immersive app
- Enhance your spatial computing app with RealityKit
- Evolve your ARKit app for spatial experiences
- Explore the USD ecosystem
- Get started with building apps for spatial computing
- Go beyond the window with SwiftUI
- Meet ARKit for spatial computing
- Meet Reality Composer Pro
- Meet SwiftUI for spatial computing
- Meet UIKit for spatial computing
- Take SwiftUI to the next dimension
- Work with Reality Composer Pro content in Xcode
-
♪ Mellow instrumental hip-hop ♪

Hello. I'm John, and I'm an engineer on the RealityKit team. Today, I'm delighted to introduce you to the new RealityKit for creating spatial experiences. RealityKit is a framework for realistically rendering, animating, and simulating 3D models and effects. We introduced RealityKit in 2019 and have added a lot of new features since then. If you've already used RealityKit for building apps on other devices, you'll find that there's a lot in common. With RealityKit, you can augment your app's 2D windows with 3D content, bring 3D content front and center in a separate window, or bring yourself and your environment into an immersive experience. RealityKit is a core 3D framework on Apple platforms, and especially on visionOS, it offers a lot of features. In this presentation, I'll focus on some key features, like entities, components, and RealityView, which will introduce you to RealityKit and show you how to use it. I'll also mention sessions that cover other concepts or go into more detail. Let's get started.

I'll use the Hello World experience to explain the concepts in this presentation. But before I get to those concepts, let me take you through the three different modules that are part of this experience. The first module, Planet Earth, presents a 3D globe in its own window that you can interact with from any angle. The second module, Objects in Orbit, immerses you in a model of the Earth, the Moon, and a satellite, demonstrating animations, Spatial Audio, and custom behaviors, like the trace that follows the satellite. The third module, the Solar System, contains a fully immersive experience, which you can learn more about in other sessions. I'm going to show you how to build the 3D elements of this Hello World experience using SwiftUI, RealityKit, and Reality Composer Pro. Let's dive in.

I'll get started by talking about how you can use RealityKit together with SwiftUI to bring your app into the next dimension. Next, I'll examine the building blocks of RealityKit: entities like the Earth model and the components that implement its behaviors. You'll learn about the features of RealityView, a new SwiftUI view for 3D models and effects. Then I'll explain how you can handle input and bring your app to life with animation and Spatial Audio. Finally, I'll talk about unlocking the potential of RealityKit with custom components and systems.

Let's begin by exploring how RealityKit and SwiftUI work together. SwiftUI is how you define your views and windows, and RealityKit lets you add 3D elements. For example, the World app uses SwiftUI to display a standard 2D window with a few buttons. Tapping the Planet Earth button on the left navigates to a detail view, which shows a 2D illustration of the Earth. But what if I want to replace that 2D image with a 3D globe? Adding 3D content to a 2D window is easy using the Model3D view in RealityKit. Let's go over the code. Here's the SwiftUI view, which displays that globe image. I'll change it to display a 3D globe by importing RealityKit and changing the image to a Model3D view, referencing a USD file in my project called Globe. We can customize the loaded model before displaying it by adding two pieces of code: a content closure for the model that is returned, and a placeholder view builder to specify a view that's displayed while the model is loading. In the content closure, I'll add resizable and scaledToFit modifiers to make sure the model fits into the available space in my UI.
And because Model3D loads its model asynchronously, there is a placeholder view builder, which you can use to customize the view that's displayed during the loading process. In this case, I'm just using the built-in ProgressView to display a spinner. Finally, I need to add the globe 3D model as a resource to the app or Swift package.
And now the model appears inline in the app with the same appearance that it has in Quick Look or Reality Composer Pro.

Placing 3D content in a 2D window is great, but I want my 3D model to be front and center. To accomplish that, I'm going to put the globe in a separate window. I'll add a button to the app's detail view to open that new window. And instead of using a regular window, which displays its contents against a 2D background, I'll use a new window style that places its contents in a volume. This is called a volumetric window, and it's great for viewing 3D content. Unlike a 2D window, a volumetric window is meant to be used from any angle, so it's ideal for 3D models. A volumetric window also keeps a fixed size that is true to life. You can scale a model in a volumetric window to be one meter across, and it will always be displayed at exactly that size. I think that's a great fit for the globe in Hello World.

Let's go through the process of adding a volumetric window. First, I'll add a window group to my app. Window groups act as a template that an app can use to open new windows. I'll give the new window an identifier to distinguish it from this app's main window. Next, I'll add a windowStyle(.volumetric) modifier to the window group. I'll also add a defaultSize modifier to give this window a size in meters. Finally, I'll add a button to the detail view. To make the button open the window I've just added, I'll add a property that gives me access to the openWindow action from the SwiftUI environment.
Then I'll call that action from my button.
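The button itself doesn't appear in the code listings below, so here's a minimal sketch of this step. The GlobeButton name is hypothetical; the "planet-earth" identifier matches the window group defined above.

import SwiftUI

struct GlobeButton: View {
    // Read the openWindow action from the SwiftUI environment.
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        Button("View Globe") {
            // Open the volumetric window group registered with this identifier.
            openWindow(id: "planet-earth")
        }
    }
}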
Let's run this app in the simulator. When I press the View Globe button, a volumetric window appears. Now I can interact with the globe from multiple angles, not just the front.
But sometimes, the key to unlocking the experience you want to create is immersion. In the Objects in Orbit module of the World app, you're immersed in an animated model of the Earth and its satellites that demonstrates their orbits. This uses an immersive space, a new scene type that allows your app to place 3D elements anywhere in space. When you open an immersive space, your app can go beyond the bounds of a window and provide a magical experience.

Adding an immersive space is similar to adding a window group. It's a new scene in the body of my app. Here I'm using RealityView, which will give me more control over the scene than Model3D. I'll go over RealityView in more detail in a few minutes. Like before, I'll add a button to the app's detail view. I'll get the openImmersiveSpace action from the environment and call it with the identifier of the scene I've defined.
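This button also has no standalone listing, so here's a minimal sketch. The OrbitsButton name is hypothetical; the "objects-in-orbit" identifier matches the ImmersiveSpace scene shown in the listings below.

import SwiftUI

struct OrbitsButton: View {
    // Read the openImmersiveSpace action from the SwiftUI environment.
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("View Orbits") {
            // Opening a space is asynchronous, so wrap the call in a Task.
            Task {
                await openImmersiveSpace(id: "objects-in-orbit")
            }
        }
    }
}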
Note that the openImmersiveSpace action is asynchronous. It completes once the space has finished opening. When I press the View Orbits button, an immersive space appears. This is already stunning, but you can make it more engaging by adding interactivity, animation, and audio with RealityKit.

Whether you're working with 2D windows containing 3D content, or volumetric windows that emphasize your 3D models, I encourage you to check out these SwiftUI sessions if you haven't already. The "Meet SwiftUI for spatial computing" session is an overview of what's new with SwiftUI on this platform. The "Take SwiftUI to the next dimension" session demonstrates how to get the most out of 3D content in a window. There are also multiple styles of immersion. The Solar System module of Hello World uses a fully immersive space that hides passthrough and displays its own backdrop. The "Go beyond the window with SwiftUI" session goes into detail about all styles of immersive spaces. If you're thinking about creating an immersive experience, I highly recommend this talk.

You've encountered two ways to use RealityKit in your SwiftUI views: the easy-to-use Model3D and RealityView. RealityView is what I'll use for the rest of this session because it allows you to compose your 3D content using RealityKit entities. So what is a RealityKit entity? An entity is a container object. If you create an empty entity from code, it won't do anything. To make an entity render or give it behavior, it must have components. Each component enables some specific behavior for an entity. Here are some examples.

The Earth entity in this app is implemented with two components: a model component, which gives the entity a mesh and materials, and a transform component, which places the entity in 3D space. The same is true of the satellite entity. The model component renders a mesh and applies materials to it. These Earth and satellite models were created in a digital content creation tool, exported to a USDZ file, and loaded into RealityKit. These meshes have a physically-based material applied to them to give them their final appearance. A material uses a set of textures and shaders to describe how the surface of a mesh responds to light. To learn more about materials, I recommend watching Niels' session, "Explore materials in Reality Composer Pro."

In addition to a model, these entities have a transform component. The transform component places an entity in 3D space. You can control an entity's position, orientation, and scale by setting properties on its transform component, as well as by setting the entity's parent. RealityKit uses the same 3D coordinate conventions as ARKit and many other 3D engines: the origin is at the center of the RealityView, the y-axis points upward, the z-axis points towards you, and the x-axis points to your right. One unit is one meter. Note that these conventions are different from SwiftUI's conventions. There are functions on RealityView's content instance that make it easy to convert back and forth between the RealityKit and SwiftUI coordinate spaces.

Every entity has a transform, but not every entity has a model. Sometimes an entity is assembled out of multiple child entities, each with its own set of components. This gives you more programmatic control. For example, you could play individual animations on the transforms of child entities. RealityKit contains a variety of components depending on what you want to do.
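To make this concrete, here's a minimal sketch of assembling an entity from a model component and a transform entirely in code. The sphere mesh, material, and positions are placeholder values, not the session's actual Earth asset.

import RealityKit

// Build an entity from components in code. The mesh and material
// here are placeholders standing in for the USDZ-based Earth model.
let earth = ModelEntity(
    mesh: .generateSphere(radius: 0.2),                           // model component: mesh
    materials: [SimpleMaterial(color: .blue, isMetallic: false)]  // model component: materials
)

// The transform component places the entity in 3D space (units are meters).
earth.position = [0, 0, 0]
earth.scale = [1, 1, 1]

// Child entities are positioned relative to their parent.
let satellite = Entity()
earth.addChild(satellite)
satellite.position = [0.5, 0, 0] // half a meter to Earth's right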
I'll talk about some specific components today: collision, input target, and hover effect, to name a few. I'll also demonstrate how to create your own components.

Now that we've got a sense of how entities and components work, let's use RealityView to place those entities in your app. RealityView is a SwiftUI view that contains any number of entities. Entities need to be added to a RealityView in order to be rendered, animated, and simulated. So, how does a RealityView work? RealityView provides a content instance that lets you add entities to the view. This is an easy way to get started if you have already loaded an entity or if you want to create an entity programmatically. But this closure is asynchronous, so it's simple to load an entity from a file and display it in your view. Here I asynchronously load an Earth model from a USD file and add it to the content instance once the load has completed. You can also load more than one model and place both of them in your RealityView. Instead of being laid out next to each other, these models will coincide in space. If that's not what you want, you can change the position of the entities added to the view. This example positions the Moon entity half a meter to the right.

Once you've set up your RealityView, you may want to connect state in your app to properties stored on RealityKit components. RealityView lets you express the connections between SwiftUI-managed state and the entities in a RealityView with an update closure. This makes it easy to drive the behavior of 3D models with a source of truth from your app's data model. This view loads a model and applies a rotation controlled by whoever uses the view. Note that the code in the update section only runs when the values it depends on change.

If you're building a UI with a combination of 2D and 3D elements, you'll sometimes need to convert coordinates between views and entities. RealityView provides coordinate conversion functions between SwiftUI view coordinate spaces and RealityKit entity coordinate spaces. The RealityView's content instance provides a convert function that converts points, bounding boxes, and transforms from a SwiftUI coordinate space to an entity's local space, or vice versa. Here, I get the minimum length of any of the view's dimensions and scale the loaded entity to fit in the available space.

RealityView also provides a mechanism for subscribing to events published by entities and components. In this example, I play an animation authored in the loaded USD file after the load completes. The content instance has a subscribe(to:) function, which adds an event handler. This example runs some code when an animation completes. There are RealityKit events published for all kinds of things, from animation to physics to audio. You can also attach SwiftUI views to entities. The attachments feature of RealityView makes it easy to position views in 3D space. Check out Yujin's session, "Enhance your spatial computing app with RealityKit," to learn more.

There's a lot you can do with RealityView. But let's get back to our celestial bodies and bring them to life. First, I'll show you how to add a drag gesture so that you can reposition the Earth entity. And then I'll explain animations and Spatial Audio. Here's an example RealityView containing three entities. You can add a gesture to a RealityView, like any other SwiftUI view, and it will hit test against entities in that view.
To receive input, the entity must have both an input target component and a collision component. When a touch event is handled by RealityView, it will ignore any entities that don't have both collision and an input target. Only this last entity has both components, so gestures added to this RealityView will only react to input directed at this entity. To make the Earth entity draggable, I'll give it an input target component and a collision component, and add a drag gesture to the RealityView.

To add the components, I'll use Reality Composer Pro. Reality Composer Pro is a new developer tool that lets you compose, edit, and preview 3D content. I'll just use it to add a few components to an entity. To learn more about what you can do with Reality Composer Pro, check out Eric's session, "Meet Reality Composer Pro." The World app already has a World Assets package set up containing the USD files that this experience uses. I'll open that package in Reality Composer Pro.
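As an aside, the same components can also be attached in code rather than in Reality Composer Pro. A minimal sketch, assuming an already-loaded earth entity; the half-meter collision radius is an assumed value, and the hover effect component is covered a little later in the session.

import RealityKit

// Attach input components in code instead of Reality Composer Pro.
// Assumes an already-loaded `earth` entity; the 0.5-meter radius is
// an assumed approximation of the visual model.
earth.components.set(InputTargetComponent())
earth.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.5)]))

// The hover effect component highlights the entity when you look at it.
earth.components.set(HoverEffectComponent())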
The Earth model is in a USDZ archive, which is self-contained and not meant to be modified. Instead of modifying that asset, I'll create a new USD scene file and reference the Earth asset. USD files can reference other USD files and layer changes on top of them without actually altering the referenced file. Nondestructive editing like this is really handy when you need to make small changes to a USD file that someone else is working on. I'll create a new scene named DraggableGlobe and drag in the globe file to create a reference to it.
Now I can add components to it. I'll add an input target component and also a collision component. The default shape for the collision component is a cube; I'll change it to a sphere so it better matches the model. It's important for the collision shape to be a reasonable approximation of the visual model. The closer the match, the more intuitive interactions with the model will be.

I want to be able to move the Earth model around, so I'll add a drag gesture to the RealityView. A standard SwiftUI drag gesture will work, but I can make the gesture manipulate specific entities rather than the entire view by adding a targetedToEntity modifier on the gesture. When the gesture's value changes, I'll change the entity's position to match. There's one critical detail, though: the gesture's value is in SwiftUI's coordinate space, so I have to convert it to RealityKit's coordinate space in order to change the entity's position. All the pieces are now in place. So in the Objects in Orbit module, I can now pinch and drag to move the Earth around.

Great, our app is now interactive. But I'd like my Earth entity to indicate that it's interactive. There's a RealityKit component we can use for this: the HoverEffectComponent. Hover effects provided by SwiftUI and RealityKit are the only way to make your app react to where you're looking. This effect is applied outside of your app's process in a privacy-preserving manner. I'll add a hover effect component to the Earth entity when it's added to the RealityView. Now, the Earth model lights up when the pointer is over it to indicate that I am able to interact with it.

Next, let's move on to animations. RealityKit has a number of built-in animation types, such as from-to-by animations, which animate a property from an initial value to a final value; orbit animations, which cause an entity to revolve around its parent; and time-sampled animations, which progress frame by frame through a series of values. I'll set up an orbit animation on the Moon. The Moon will take 30 seconds to complete a full orbit, and the orbital axis will be the y-axis. And I'll make sure that the orbit starts from the Moon's current position. Once I've defined the properties of this orbit animation, I'll generate an animation resource for it and play that animation on the Moon entity. And now the Moon orbits the Earth. To me, this is the magical moment. With an animation in place, the scene feels alive.

But while animation helps bring your 3D content to life, Spatial Audio makes your model feel like it's really there. There are three types of audio in RealityKit: spatial, ambient, and channel. Let's look at each one of them in more detail. RealityKit sounds are spatial by default, so audio sources sound like they actually exist in your surroundings. The Spatial Audio Component lets you customize how your objects emit sound into your space to make them even more realistic or more artistic. Use directivity to emit sounds in all directions or to project sound in a specific direction. The Ambient Audio Component is great for multichannel files, which capture the sound of an environment. No additional reverb is added to ambient sources; each channel of the ambience is played from a fixed direction. And finally, the Channel Audio Component sends the audio file's channels directly to the speakers without any spatial effects. This is ideal for background music that is not connected to any visual element. You can add audio to your scene in Reality Composer Pro and interface with it using RealityKit.
Or you can hook up audio in code. Let's take a look. I'll add a bit of looping audio to an orbiting satellite. First, I'll add a Spatial Audio Component to an empty entity that will act as the audio source. A directivity of 0.75 creates a tight beam of sound in a particular direction. I'll rotate that audio source entity around its y-axis so that the audio is projected in the direction I want. I'll then load a looping audio clip from a resource and play it on the audioSource entity by calling playAudio. Let's see this in action. Since the Spatial Audio source is configured with a tightly focused directivity, the audio can be heard clearly on my side of the Earth, but it is quieter when the satellite is on the other side.

That was input, animation, and audio. You can build more functionality in RealityKit by combining its existing functionality in different ways. There are two primary tools you can use for this purpose: defining your own components and defining your own systems. A component contains the data controlling one aspect of a 3D experience. Components are grouped into entities. Without components, an entity does nothing; each component supplies a single element of an entity's implementation. You've learned that a transform component positions an entity and that a model component renders a 3D model. In addition to the predefined components that RealityKit provides, you can define your own components. Here's an example custom component that contains a traceMesh object that my coworker Paul has created. The trace component type conforms to the Component protocol, so you can get and set this component on any entity at runtime. You can also adopt a data-driven workflow by defining a component in a Swift package and conforming it to the Codable protocol. Codable components will appear in the Reality Composer Pro interface and can be directly added to entities at design time. You can learn more about custom components in the talk "Work with Reality Composer Pro content in Xcode."

I already went through entities earlier in this talk, and I just covered components. Next, let's talk about systems. Systems contain code that acts on entities and components. Taken together, entities, components, and systems, or ECS, are a tool for modeling the appearance and behavior of your 3D experience. Systems are a way to structure the code that implements your app's behaviors. Code in a system runs at a regular interval, acting upon your components' current state. For example, this TraceSystem updates a line mesh that's traced behind the satellite entity as it orbits the Earth. On each update, it adds the entity's current position to the trace. Once a system is registered, it automatically applies everywhere in your app that you use RealityKit. Registering the trace system in my app's initializer causes it to update for all relevant entities.

But what entities are relevant, and when does the system update? This system only wants to update entities with a trace component, so I create an entity query that filters to entities that have a trace component. In the update function, the system passes in the entity query and also specifies that it wants to update entities when rendering. The rendering condition means that this system will update at an appropriate rate for smooth animations. Here's the trace system in action, adding the entity's position to the line mesh in order to produce a fluid custom animation. Systems are a really effective way to implement a variety of effects and behaviors.
RealityKit has a lot of features that make it easy to build 3D apps. You can use RealityKit and RealityView to add 3D elements to views, windows, and immersive spaces defined with SwiftUI. You can load USD files, handle gestures, and play animation and Spatial Audio, all using RealityKit. RealityKit provides many predefined components, but you can also define custom components and systems for your app's specific needs.

With that, I've covered the concepts you need to get started with RealityKit. Yujin's session, "Enhance your spatial computing app with RealityKit," will take you through more features of RealityKit, like portals, particle emitters, attachments, and more. And Amanda's session, "Work with Reality Composer Pro content in Xcode," takes you through the process of building an immersive app using Reality Composer Pro, RealityKit, Xcode previews, and the simulator. There are a lot of exciting features in RealityKit that you can use in your app. I'm really excited to see the wonderful experiences you'll create. Thanks for watching. ♪
-
3:40 - Model3D
import SwiftUI
import RealityKit

struct GlobeModule: View {
    var body: some View {
        Model3D(named: "Globe") { model in
            model
                .resizable()
                .scaledToFit()
        } placeholder: {
            ProgressView()
        }
    }
}
-
5:52 - Volumetric window
import SwiftUI
import RealityKit

// Define a volumetric window.
struct WorldApp: App {
    var body: some SwiftUI.Scene {
        // ...

        WindowGroup(id: "planet-earth") {
            Model3D(named: "Globe")
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.8, height: 0.8, depth: 0.8, in: .meters)
    }
}
-
7:31 - ImmersiveSpace
import SwiftUI
import RealityKit

// Define an immersive space.
struct WorldApp: App {
    var body: some SwiftUI.Scene {
        // ...

        ImmersiveSpace(id: "objects-in-orbit") {
            RealityView { content in
                // ...
            }
        }
    }
}
-
12:40 - RealityView
import SwiftUI
import RealityKit

struct Orbit: View {
    let earth: Entity

    var body: some View {
        RealityView { content in
            content.add(earth)
        }
    }
}
-
12:54 - RealityView asynchronous loading and entity positioning
import SwiftUI
import RealityKit

struct Orbit: View {
    var body: some View {
        RealityView { content in
            async let earth = ModelEntity(named: "Earth")
            async let moon = ModelEntity(named: "Moon")

            if let earth = try? await earth,
               let moon = try? await moon {
                content.add(earth)
                content.add(moon)
                moon.position = [0.5, 0, 0]
            }
        }
    }
}
-
13:54 - Earth rotation
import SwiftUI
import RealityKit

struct RotatedModel: View {
    var entity: Entity
    var rotation: Rotation3D

    var body: some View {
        RealityView { content in
            content.add(entity)
        } update: { content in
            entity.orientation = .init(rotation)
        }
    }
}
-
14:27 - Converting coordinate spaces
import SwiftUI
import RealityKit

struct ResizableModel: View {
    var body: some View {
        GeometryReader3D { geometry in
            RealityView { content in
                if let earth = try? await ModelEntity(named: "Earth") {
                    let bounds = content.convert(geometry.frame(in: .local),
                                                 from: .local, to: content)
                    let minExtent = bounds.extents.min()
                    earth.scale = [minExtent, minExtent, minExtent]
                }
            }
        }
    }
}
-
14:56 - Play an animation
import SwiftUI
import RealityKit

struct AnimatedModel: View {
    @State var subscription: EventSubscription?

    var body: some View {
        RealityView { content in
            if let moon = try? await Entity(named: "Moon"),
               let animation = moon.availableAnimations.first {
                moon.playAnimation(animation)
                content.add(moon)
            }
            subscription = content.subscribe(to: AnimationEvents.PlaybackCompleted.self) { _ in
                // ...
            }
        }
    }
}
-
18:31 - Adding a drag gesture
import SwiftUI
import RealityKit

struct DraggableModel: View {
    var earth: Entity

    var body: some View {
        RealityView { content in
            content.add(earth)
        }
        .gesture(DragGesture()
            .targetedToEntity(earth)
            .onChanged { value in
                earth.position = value.convert(value.location3D,
                                               from: .local, to: earth.parent!)
            })
    }
}
-
20:20 - Playing a transform animation
// Playing a transform animation
let orbit = OrbitAnimation(name: "Orbit",
                           duration: 30,
                           axis: [0, 1, 0],
                           startTransform: moon.transform,
                           bindTarget: .transform,
                           repeatMode: .repeat)

if let animation = try? AnimationResource.generate(with: orbit) {
    moon.playAnimation(animation)
}
-
22:12 - Adding audio
// Create an empty entity to act as an audio source.
let audioSource = Entity()

// Configure the audio source to project sound out in a tight beam.
audioSource.spatialAudio = SpatialAudioComponent(directivity: .beam(focus: 0.75))

// Change the orientation of the audio source (rotate 180º around the y-axis).
audioSource.orientation = .init(angle: .pi, axis: [0, 1, 0])

// Add the audio source to a parent entity, and play a looping sound on it.
if let audio = try? await AudioFileResource(named: "SatelliteLoop",
                                            configuration: .init(shouldLoop: true)) {
    satellite.addChild(audioSource)
    audioSource.playAudio(audio)
}
-
23:47 - Defining a custom component
import RealityKit

// Components are data attached to an Entity.
struct TraceComponent: Component {
    var mesh = TraceMesh()
}

// Entities contain components, identified by the component's type.
func updateTrace(for entity: Entity) {
    var component = entity.components[TraceComponent.self] ?? TraceComponent()
    component.update()
    entity.components[TraceComponent.self] = component
}

// Codable components can be added to entities in Reality Composer Pro.
struct PointOfInterestComponent: Component, Codable {
    var name = ""
}
-
24:51 - Defining a system
import SwiftUI
import RealityKit

// Systems supply logic and behavior.
struct TraceSystem: System {
    static let query = EntityQuery(where: .has(TraceComponent.self))

    init(scene: Scene) {
        // ...
    }

    func update(context: SceneUpdateContext) {
        // Systems often act on all entities matching certain conditions.
        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            addCurrentPositionToTrace(entity)
        }
    }
}

// Systems run on all RealityKit content in your app once registered.
struct MyApp: App {
    init() {
        TraceSystem.registerSystem()
    }
}