Enhance your spatial computing app with RealityKit
Go beyond the window and learn how you can bring engaging and immersive 3D content to your apps with RealityKit. Discover how SwiftUI scenes work in tandem with RealityView and how you can embed your content into an entity hierarchy. We'll also explore how you can blend virtual content and the real world using anchors, bring particle effects into your apps, add video content, and create more immersive experiences with portals.
Chapters
- 0:00 - Introduction
- 2:05 - RealityView attachments
- 6:11 - Video playback
- 10:59 - Portals
- 15:04 - Particle emitters
- 17:06 - Anchors
- 19:28 - Wrap-up
Related Videos
WWDC23
- Build spatial experiences with RealityKit
- Deliver video content for spatial experiences
- Explore rendering for spatial computing
- Go beyond the window with SwiftUI
- Meet ARKit for spatial computing
- Meet Reality Composer Pro
- Take SwiftUI to the next dimension
- Work with Reality Composer Pro content in Xcode
♪ Mellow instrumental hip-hop ♪
Hello, my name is Yujin, and I'm an engineer on the RealityKit team.
Today, I'm going to show you new features in RealityKit that you can use to enhance your spatial computing apps.
Since we released RealityKit in 2019, we've seen apps use its rich feature set to create some amazing experiences.
Now spatial computing adds even more features to RealityKit, like portals, particle emitters, RealityView attachments, and many more.
In the session titled "Build spatial experiences with RealityKit," we learned about the basic building blocks of RealityKit: entities, which are container objects; components, which define specific behavior on entities; and systems, which act on both entities and components to add functionality.
We covered the RealityView API, which acts as a bridge between SwiftUI and RealityKit.
We also showed you how to add interaction, animations, and Spatial Audio to your RealityKit scene.
If you haven't watched it already, I highly recommend that you check out that session.
In this session, we will cover new features in RealityKit that will help make your app even more engaging and immersive.
First, we will learn how to embed SwiftUI views into our RealityKit content using attachments in RealityView.
Next, we will look at how to add video playback within our RealityKit scene.
Then we will learn how to use portals to open a window to an alternate world.
We'll go through how to use the Particle Emitters API to enhance your scene with visual effects.
Finally, we'll learn how to use anchors in RealityKit to attach 3D content to real-world locations, such as a wall.
Let's get started with RealityView attachments.
Attachments are a useful way to embed SwiftUI content into your RealityKit scene.
In this example app, I've used attachments to put text labels beneath the models of the earth and moon.
I've also attached a view that explains how the moon affects tides on our ocean.
Let's see how to make this in code.
Inside of my app, I'm using a RealityView to render my earth model.
A RealityView is a view that lets us add RealityKit entities.
Entities need to be added to a RealityView in order to be rendered, animated, and simulated.
Here we simply load an entity for the earth and add it to the RealityView's content.
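Here's a minimal sketch of that starting point, assuming an "Earth" entity in the app bundle (the view name is illustrative):

```swift
import SwiftUI
import RealityKit

struct EarthView: View {  // hypothetical view name
    var body: some View {
        RealityView { content in
            // Load the model asynchronously and add it to the content
            // so RealityKit renders, animates, and simulates it.
            guard let earth = try? await Entity(named: "Earth") else { return }
            content.add(earth)
        }
    }
}
```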
Let's now change our RealityView to use attachments.
Attachments are views that can be placed at specific locations relative to your RealityKit content.
There are two parts to setting up attachments.
First, there's the added parameter in the make closure of our RealityView.
Second, there's an attachments view builder that is added to our RealityView.
Let's cover the attachments view builder first.
Here you can provide SwiftUI views that you want to add to your RealityKit content.
In this example, I've added a text view to label the Earth.
We'll also add a tag modifier to the view so that we can later identify it when our view gets delivered to the make closure as an entity.
This tag can be any hashable value.
Here I've used the string earth_label.
In the make closure of our RealityView, the attachments parameter contains our views that are now represented as entities.
To get our view in entity form, we call entity(for:) on our attachments and pass in the same tag that we provided in the view builder, earth_label.
The result that we get is a view attachment entity, which we can add to our RealityKit content, just like any other entity.
To make the label appear beneath the earth, we'll add the attachment as a child of our earth entity and position it slightly below.
We can now repeat this process for all the other attachments we want to add using a different tag for each.
Let's take a look in Xcode.
In my sample app, I'll add three attachments to my RealityView.
First, I'll add a label below the earth.
I'll also do the same for the moon.
Finally, I'll add a short paragraph explaining the role of the moon's orbit on the tides.
I've styled this using a glassBackgroundEffect in SwiftUI.
In the make closure of the RealityView, I'll add the corresponding entities to my content.
First, I'll add the earthAttachment below the earth.
I'll do the same for the moon.
Finally, I'll place the tides explainer to the left of my container entity.
I'll build and run my app, and we'll see the attachments that I've created displayed next to my models.
Let's recap the data flow for attachments.
Attachments start off in the attachments view builder in our RealityView.
Here, we can provide SwiftUI views that we want to add to our RealityKit scene.
In the make closure of our RealityView, we get the attachments back as entities, which we can then add to our scene.
We can also update the entities inside of the update closure.
This closure is called when there are changes to our SwiftUI view state.
You can use this to respond to dynamically changing content in your RealityView.
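Here's a minimal sketch of that flow; the labelOffset state and the tag are illustrative assumptions:

```swift
import SwiftUI
import RealityKit

struct MoonOrbitControls: View {  // hypothetical view name
    // Hypothetical piece of view state; any change re-runs the update closure.
    @State private var labelOffset: Float = -0.15

    var body: some View {
        RealityView { content, attachments in
            guard let earth = try? await Entity(named: "Earth") else { return }
            content.add(earth)
            if let label = attachments.entity(for: "earth_label") {
                label.position = [0, labelOffset, 0]
                earth.addChild(label)
            }
        } update: { content, attachments in
            // Called when SwiftUI view state changes; reposition the attachment.
            if let label = attachments.entity(for: "earth_label") {
                label.position = [0, labelOffset, 0]
            }
        } attachments: {
            Attachment(id: "earth_label") {
                Text("Earth")
            }
        }
    }
}
```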
For a more detailed usage of attachments, check out the session "Work with Reality Composer Pro content in Xcode."
RealityView attachments are a useful way of adding text content and other UI elements to a scene.
Additionally, we can add a video to our app to make it more engaging.
To do this, let's use VideoPlayerComponent.
VideoPlayerComponent is a new component type in RealityKit that is used for embedding video content inside of a 3D scene.
As a reminder, components define specific behavior that you can attach to entities.
To play a video using VideoPlayerComponent, we'll first load a video file from our resources bundle.
Then we'll use that to create an AVPlayer instance.
With it, we can now create a VideoPlayerComponent.
When you attach a VideoPlayerComponent to an entity, a rectangular mesh that matches the aspect ratio of the video is automatically generated for you.
This behavior is analogous to existing video player APIs, such as VideoPlayer in SwiftUI and AVPlayerLayer in Core Animation.
However, since RealityKit is a 3D framework, your video will be represented as an entity with a mesh so that you can move and position it in 3D space.
All video formats that are supported by AV Foundation will work with VideoPlayerComponent, including 2D video formats and 3D video using MV-HEVC.
Finally, VideoPlayerComponent will automatically display captions that are provided through the AVPlayer.
To learn more about how to create your own video content, including 3D videos, check out the session entitled "Deliver video content for spatial experiences."
To add video to my RealityKit scene, we'll first create an AVPlayerItem using the URL to my video asset.
We'll then create an AVPlayer.
On the entity, we'll add a VideoPlayerComponent initialized with the AVPlayer that we just created.
VideoPlayerComponent will automatically generate a mesh that is sized based on the aspect ratio of my video.
Because RealityKit works in real-world units, by default, the video will be one meter in height.
To make the video a different size, we can scale the entity.
In my case, I'd like the video to be 40 centimeters tall, so we'll multiply the entity scale by 0.4.
Finally, we're ready to play the video.
We'll set the current item to our AVPlayerItem, and then call play on the AVPlayer.
Let's rebuild and run our app with this code.
I've added a Learn More button to our app, which will add the video entity to our scene.
On button click, I'll fade in the video using an OpacityComponent and a FromToByAnimation.
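Here's a minimal sketch of one way such a fade-in could look; the helper name and the half-second duration are my own choices:

```swift
import RealityKit

// Fade an entity in from fully transparent, assuming it was just added to the scene.
func fadeIn(_ entity: Entity) {
    // Start fully transparent via an OpacityComponent.
    entity.components.set(OpacityComponent(opacity: 0))

    // Animate the opacity bind target from 0 to 1.
    let fade = FromToByAnimation<Float>(from: 0, to: 1,
                                        duration: 0.5,  // arbitrary choice
                                        bindTarget: .opacity)
    if let resource = try? AnimationResource.generate(with: fade) {
        entity.playAnimation(resource)
    }
}
```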
For our video content, I've prepared a short clip that explains the role of the Moon's gravitational force on the Earth's rising tides.
Let's take a look.
The moon orbits our planet.
Its gravitational pull exerts a powerful force on our oceans, causing them to bulge ever so slightly towards the lunar sphere.
VideoPlayerComponent respects the systemwide preferences for captions.
Let's turn them on in the Settings app under the Accessibility section.
And so it is that twice a day, in a never-ending cycle, the tides rise and fall, driven by this unceasing interplay of earth and moon.
VideoPlayerComponent also supports passthrough tinting.
When this feature is enabled, your passthrough content is adjusted to match colors in the video.
This is the same treatment that is used when watching movies and TV shows inside of the TV app on this platform.
To use passthrough tinting, you can set the isPassthroughTintingEnabled property to true.
You can also subscribe to VideoPlayerEvents to be notified when properties on a VideoPlayerComponent change, such as the content type, viewing mode, and video size.
To subscribe to events, you can call the subscribe function on your RealityView's content and specify the event type and entity.
You can respond to events inside of the event handler closure.
VideoPlayerComponent is a great addition to our 3D scene.
So far, our app features a model of the earth and moon, but I'd like to present it against a backdrop of outer space.
I think it would be pretty cool if we could make ourselves a magic window in the room that reveals the moon's orbit in outer space.
We can do this using a portal to render our scene.
A portal creates an opening to a different world that is visible through a mesh surface.
Entities inside of this world use separate lighting and are masked by the portal's geometry.
This example demonstrates three distinct features in RealityKit.
First, a portal is used to render the scene in outer space.
Then a particle effect is used to decorate the rim of the portal.
Finally, anchoring is used to place the portal on the wall of our room.
Let's start with portals.
To make a portal, we must first create a world.
To do this, we add an entity in our scene that has a World component.
This component marks its entity tree as belonging to a different world.
Entities in a world are only visible through a portal surface.
To add content to our world, we can attach entities as children of the world entity.
Here, we'll add models for the sky, earth, and moon, as well as an ImageBasedLight to define the lighting inside of the world.
All descendants of the world entity will appear only inside of this world.
Next, we'll make a portal.
To do this, we add an entity with a model component.
The model component contains two properties, a mesh and a material.
For the mesh, we'll generate a circular plane to act as the surface of the portal.
For the material, we'll assign a new portal material in order to make the mesh appear as a portal.
To connect the portal with our world, we'll add a portal component to the entity and set its target property to the world entity.
This allows the portal to act as a mask to reveal the content inside of our world.
Let's see how this looks in code.
In our RealityView, I've added calls to two functions that we'll implement: makeWorld and makePortal.
In our makeWorld function, we'll create a world entity and populate it with the portal's contents.
In the makePortal function, we'll create a portal and link it to the world that we just created.
Finally, we'll add both of these entities to our RealityView's content.
Let's dive into each of these functions.
Inside of the makeWorld function, we create an entity and attach a WorldComponent.
Next, we load an EnvironmentResource to use as our ImageBasedLight.
We'll apply this to the world using the ImageBasedLightComponent and ImageBasedLightReceiverComponent.
To learn more about image-based lighting in RealityKit, check out the session "Explore rendering for spatial computing."
Next, we'll populate the world with our contents.
I'll load models for the earth, moon, and sky, and add them to the world as children.
Because these entities are children of the world, they will only be visible through the portal.
Let's move on to the makePortal function.
To make a portal, we first need a mesh.
We'll create one by making a model component for the entity.
To make our portal circular, we'll generate a plane with equal dimensions and a corner radius that is half the size.
I'll also create a PortalMaterial to use as a material for the ModelComponent.
Finally, we'll also attach a portal component that is initialized with the world entity that we created earlier.
This links the portal with the world so that we can see the world's contents through the mesh.
Next, let's decorate the rim of the portal with a particle effect.
To do this, we can use the ParticleEmitterComponent provided in RealityKit.
Particle emitters can be used to represent many different visual effects in RealityKit, such as sparks, snow, and impact effects.
Particle emitters can be created either via Reality Composer Pro or at runtime using RealityKit through the ParticleEmitterComponent.
Here, I've prepared a particle asset using Reality Composer Pro.
We can use this to decorate the portal that we created earlier.
Let's load this into our scene and modify the particle properties at runtime using RealityKit.
To update the particles over time, I've created a custom system called ParticleTransitionSystem.
Here, we'll use an EntityQuery to find entities that have a ParticleEmitterComponent.
Inside of the system update, we'll perform our query and iterate over the resulting entities.
On each entity, we'll call the function updateParticles, which we will implement next.
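Note that a custom system has to be registered before RealityKit will run it; here's a minimal sketch of registering it at app launch (the app and root view names are hypothetical):

```swift
import SwiftUI
import RealityKit

@main
struct SolarSystemApp: App {  // hypothetical app name
    init() {
        // Register the custom system once; RealityKit then calls its
        // update(context:) each frame for every scene.
        ParticleTransitionSystem.registerSystem()
    }

    var body: some SwiftUI.Scene {
        WindowGroup {
            ContentView()  // hypothetical root view
        }
    }
}
```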
To learn more about custom systems in RealityKit, check out the session "Build spatial experiences with RealityKit."
Inside of our updateParticles function, we'll first get the ParticleEmitterComponent from the entity.
The ParticleEmitterComponent contains many properties that control various aspects of particle look and behavior.
Here, we'll set the lifeSpan and vortexStrength properties based on the entity's scale, so that as the entity grows in size, the particles start spinning faster around the portal.
Finally, let's apply our changes by assigning the component back to the entity.
And we are set.
To learn about all the different properties on particle emitters, check out the session "Meet Reality Composer Pro."
We're almost done adding the final touch to our app.
To finish, let's attach our portal to the wall in our room.
To do this, we can use anchors in RealityKit.
Anchors can be used to place content on walls, floors, or locations relative to your head or hand.
Anchors in RealityKit support two tracking modes, .continuous and .once.
When using the continuous tracking mode, the anchor entity moves along with the anchor over time, such as when your head moves.
When using the once tracking mode, the anchor entity will not move after being positioned once.
To be notified when an entity becomes anchored, you can subscribe to the AnchoredStateChanged event in RealityKit.
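Here's a minimal sketch that combines one-time tracking with an AnchoredStateChanged subscription; the trackingMode initializer argument reflects my assumption of the API shape:

```swift
import RealityKit

// One-time anchoring plus an AnchoredStateChanged subscription.
func addAnchoredPortal(to content: RealityViewContent,
                       portal: Entity) -> EventSubscription {
    let anchor = AnchorEntity()
    anchor.components.set(AnchoringComponent(
        .plane(.vertical, classification: .wall, minimumBounds: [1, 1]),
        trackingMode: .once))  // position once; don't follow the anchor afterwards
    anchor.addChild(portal)
    content.add(anchor)

    // Fires when the entity's anchored state changes; keep the returned
    // subscription alive for as long as you want the callback.
    return content.subscribe(to: SceneEvents.AnchoredStateChanged.self, on: anchor) { event in
        print("Anchored:", event.isAnchored)
    }
}
```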
Note that while you can parent entities to anchors to place 3D content, the transforms of the anchors themselves are not visible to the app, in order to preserve user privacy.
To get access to anchor transforms, you will need to use ARKit.
For more information on this, check out the session "Meet ARKit for spatial computing."
To use anchors in our app, we first need to modify our app to use an immersive space.
An immersive space is a special type of container that allows your app to render content outside of the window.
To do this, we can add an ImmersiveSpace to our SwiftUI scene.
We'll also add an .immersionStyle modifier and set it to mixed.
Inside of the ImmersiveSpace, we can use a RealityView to place content that will be anchored.
To learn more about Immersive Spaces, check out the session "Go beyond the window with SwiftUI."
Inside of our RealityView, we can use an anchor entity as a container for our portal.
We initialize an anchor entity with a specification of the type of surface that we would like to anchor our content on.
In our case, we are looking for a vertical wall with a minimum size of one meter by one meter.
When an anchor is found that matches the specification, RealityKit will automatically attach our content to the wall.
And we are finally done.
When we run our app, we get a portal that is attached to the wall.
From portals and particles to anchors and attachments, RealityKit provides many features that let you build immersive experiences.
Let's summarize everything that we went over in this session.
Attachments in RealityView let you embed SwiftUI content inside of your entity hierarchy so that you can place UI elements alongside 3D elements.
VideoPlayerComponent, portals, and particle effects let you add dynamic elements to enhance your scene in RealityKit.
Finally, anchors let you attach 3D content to real-world surfaces such as your floor or wall.
The session "Build spatial experiences with RealityKit" goes over key concepts like entities, components, and RealityView.
The session "Work with Reality Composer Pro content in Xcode" takes you through the process of building an immersive app using Reality Composer Pro together with RealityKit.
I can't wait to see all the things you'll create using these new features in RealityKit.
Thank you for watching.
♪
2:30 - Attachments
```swift
import SwiftUI
import RealityKit

struct MoonOrbit: View {
    var body: some View {
        RealityView { content, attachments in
            guard let earth = try? await Entity(named: "Earth") else {
                return
            }
            content.add(earth)

            if let earthAttachment = attachments.entity(for: "earth_label") {
                earthAttachment.position = [0, -0.15, 0]
                earth.addChild(earthAttachment)
            }
        } attachments: {
            Attachment(id: "earth_label") {
                Text("Earth")
            }
        }
    }
}
```
8:03 - VideoPlayerComponent
```swift
import AVFoundation
import RealityKit

public func makeVideoEntity() -> Entity {
    let entity = Entity()

    // Create an AVPlayerItem from the video asset in the app bundle.
    let asset = AVURLAsset(url: Bundle.main.url(forResource: "tides_video", withExtension: "mp4")!)
    let playerItem = AVPlayerItem(asset: asset)

    // Attach a VideoPlayerComponent; RealityKit generates a mesh
    // matching the video's aspect ratio automatically.
    let player = AVPlayer()
    entity.components[VideoPlayerComponent.self] = .init(avPlayer: player)

    // The mesh is one meter tall by default; scale it to 40 centimeters.
    entity.scale *= 0.4

    player.replaceCurrentItem(with: playerItem)
    player.play()

    return entity
}
```
10:05 - Passthrough tinting
```swift
var videoPlayerComponent = VideoPlayerComponent(avPlayer: player)
videoPlayerComponent.isPassthroughTintingEnabled = true
entity.components[VideoPlayerComponent.self] = videoPlayerComponent
```
10:40 - VideoPlayerEvents
```swift
content.subscribe(to: VideoPlayerEvents.VideoSizeDidChange.self, on: entity) { event in
    // ...
}
```
13:12 - Portal
```swift
import SwiftUI
import RealityKit

struct PortalView: View {
    var body: some View {
        RealityView { content in
            let world = makeWorld()
            let portal = makePortal(world: world)

            content.add(world)
            content.add(portal)
        }
    }
}

public func makeWorld() -> Entity {
    let world = Entity()
    world.components[WorldComponent.self] = .init()

    let environment = try! EnvironmentResource.load(named: "SolarSystem")
    world.components[ImageBasedLightComponent.self] = .init(source: .single(environment),
                                                            intensityExponent: 6)
    world.components[ImageBasedLightReceiverComponent.self] = .init(imageBasedLight: world)

    let earth = try! Entity.load(named: "Earth")
    let moon = try! Entity.load(named: "Moon")
    let sky = try! Entity.load(named: "OuterSpace")
    world.addChild(earth)
    world.addChild(moon)
    world.addChild(sky)

    return world
}

public func makePortal(world: Entity) -> Entity {
    let portal = Entity()

    // A circular plane: equal dimensions with a corner radius of half the size.
    portal.components[ModelComponent.self] = .init(mesh: .generatePlane(width: 1,
                                                                        height: 1,
                                                                        cornerRadius: 0.5),
                                                   materials: [PortalMaterial()])
    portal.components[PortalComponent.self] = .init(target: world)

    return portal
}
```
15:50 - Adding particles around the portal
```swift
import RealityKit

public class ParticleTransitionSystem: System {
    private static let query = EntityQuery(where: .has(ParticleEmitterComponent.self))

    // Required by the System protocol.
    public required init(scene: Scene) {}

    public func update(context: SceneUpdateContext) {
        let entities = context.scene.performQuery(Self.query)
        for entity in entities {
            updateParticles(entity: entity)
        }
    }
}

public func updateParticles(entity: Entity) {
    guard var particle = entity.components[ParticleEmitterComponent.self] else {
        return
    }

    let scale = max(entity.scale(relativeTo: nil).x, 0.3)

    // Larger entities get stronger vortex motion and longer-lived particles.
    let vortexStrength: Float = 2.0
    let lifeSpan: Float = 1.0
    particle.mainEmitter.vortexStrength = scale * vortexStrength
    particle.mainEmitter.lifeSpan = Double(scale * lifeSpan)

    entity.components[ParticleEmitterComponent.self] = particle
}
```
18:19 - Anchoring the portal
```swift
import SwiftUI
import RealityKit

struct PortalApp: App {
    @State private var immersionStyle: ImmersionStyle = .mixed

    var body: some SwiftUI.Scene {
        ImmersiveSpace {
            RealityView { content in
                // Anchor the portal to a vertical wall at least 1 m by 1 m.
                let anchor = AnchorEntity(.plane(.vertical,
                                                 classification: .wall,
                                                 minimumBounds: [1, 1]))
                content.add(anchor)
                anchor.addChild(makePortal())
            }
        }
        .immersionStyle(selection: $immersionStyle, in: .mixed)
    }
}
```