-
Go beyond the window with SwiftUI
Get ready to launch into space — a new SwiftUI scene type that can help you make great immersive experiences for visionOS. We'll show you how to create a new scene with ImmersiveSpace, place 3D content, and integrate RealityView. Explore how you can use the immersionStyle scene modifier to increase the level of immersion in an app and learn best practices for managing spaces, adding virtual hands with ARKit, adding support for SharePlay, and building an "out of this world" experience!
Chapters
- 0:00 - Introduction
- 3:27 - Get Started
- 5:51 - Display content
- 12:00 - Managing your Space
- 19:12 - Customization
Resources
Related Videos
WWDC23
- Build spatial experiences with RealityKit
- Build spatial SharePlay experiences
- Develop your first immersive app
- Discover Metal for immersive apps
- Elevate your windowed app for spatial computing
- Enhance your spatial computing app with RealityKit
- Meet SwiftUI for spatial computing
- Take SwiftUI to the next dimension
-
♪ Mellow instrumental hip-hop ♪ Raffael Hannemann: Hello, and welcome to Go Beyond the Window with SwiftUI. I'm Raffa, an engineer at Apple, and later I'll be joined by my colleague Mark. Today, we're going to show you how easy it is to leverage the full power of xrOS to create truly immersive experiences with the tools and frameworks you already know.
You may already be familiar with augmented reality from developing AR apps for iOS. In the past few years, we've introduced and extended a number of tools and frameworks, including ARKit and RealityKit, for creating rich AR apps for iPhone and iPad. These apps allow you to blur the line between the real world and your imagination by augmenting the user's surroundings with interactive user interfaces and virtual objects. This year, with the launch of xrOS, we are bringing AR to a whole new level, starting with immersive experiences. In these experiences, your application displays UI, including windows and three-dimensional content, anywhere around you. The surroundings remain visible and, in fact, even become part of the experience. You can anchor elements of your app to surfaces and augment and enrich the real world with virtual objects and effects. And then there are fully immersive experiences, which go one step further and cover your entire space. Your app takes complete control over what you see. Think about all the possibilities this will unlock. And best of all, all of this is possible with the tools, frameworks, and patterns that you are already familiar with. And at the core of this is SwiftUI's Immersive Space. Let's get started.
In the other sessions, you've learned that this year, we are adding the third dimension to SwiftUI. You can present windows and volumes on xrOS and display three-dimensional user interface elements with the easy-to-use declarative patterns of SwiftUI. Both windows and volumes let you display content within their bounds. So what if you want to make the most of the infinite space that xrOS offers and create a truly immersive experience? You may want to place your items beyond the window's bounds, all around your head, and be right in the middle of it. And this is what we've designed Space for. Next to windows and volumes, Spaces are a kind of container to present your user interface on xrOS. In this session, we'll focus on Spaces and how you can use them to create immersive experiences. Let's first get started with Space and learn how to display your content. Mark will then show you how to manage your Space and launch directly into a Space, and explain all the customizations that Spaces allow.
So let's get started and check out some code. I am really excited about space exploration. To continue the World app that we've been working on in the other sessions, we're going to extend the app step by step with a Space that lets us explore our solar system. Spaces are a new scene type in SwiftUI called Immersive Space. As you would do with other scene types, you define an Immersive Space in your app and can then open and dismiss it at any time. You can make your whole app consist of only one single Space, but you can also extend your existing app by adding one or more Spaces next to your windows and volumes. Your app can have one Space open at a time. Before opening another Space, you first dismiss the current one. And again, similarly to other scene types, you place your view hierarchy in the body of the scene. By placing our SolarSystem in the ImmersiveSpace, it will be rendered without any clipping boundaries. Let's just take a moment and point out how easy this is. With just these three lines, we've brought our solar system view into a rich, immersive experience. Let's dive into the details. Having a Space open enables a couple of special behaviors that make this scene stand out from the other scene types.
When multiple apps are running side by side, they are all displayed together in the same space, which is why we are calling this the shared space. Once your app displays an Immersive Space scene, your app enters what we call a Full Space. Your app will then be the only one visible to the user. All other applications will go away to make room for your content to appear without any distractions. Later, once you dismiss your space, the other apps will reappear. Since Immersive Space is a scene, it implicitly defines its own coordinate system. So everything that you place in a Space is positioned relative to the Space's own origin. And the origin of a Space is below the user, close to the user's feet when the Space is first opened. So now you know the basics. Let's move on and talk about how you display the content of your Space. ImmersiveSpace is a scene type, so you place your view hierarchy right in it. An ImmersiveSpace can take any SwiftUI view, and while it doesn't have any clipping bounds, a Space still lays out its content within its layout bounds. Anything you place in the Space uses the same layout system that you are already familiar with. But since the origin of your Space is near the user's feet, you probably don't just want to put your content down there.
Let's talk about RealityView. If you want to make the most of SwiftUI, ARKit, and RealityKit, we encourage you to use ImmersiveSpace together with the powerful features of the new RealityView. ImmersiveSpace and RealityView go hand in hand, and were specifically designed together to provide all the features you need for building great immersive experiences. For example, RealityView comes with built-in support for asynchronous loading of assets, as shown here, for loading and displaying a star field. But beyond asynchronous loading, putting a RealityView in an Immersive Space scene enables a lot more. You can place elements within your RealityView on ARKit anchors. And since your app gains access to hand and head-pose data while a Space is open, you can use that data to position your entities within a RealityView. Mark will show you something cool later. A note about coordinate spaces: RealityView uses RealityKit for displaying its content. So when positioning entities within a RealityView, keep in mind that the coordinate space orientation is different from SwiftUI's layout system. In SwiftUI, the y-axis points downwards and the z-axis points towards you. This applies to windows, volumes, and Immersive Spaces, whereas in RealityKit, the y-axis points upwards. There is a lot to cover about RealityView. Make sure you add "Build spatial experiences with RealityKit" to your watchlist for all the details.
Now let's write some code. We're going to use the WorldApp, or at least a simplified version of it, and add an immersive solar system to it step by step. We start by defining an ImmersiveSpace. Similarly to WindowGroup, you can assign an identifier, a value type, or both. In this case, we assign the solar identifier. We will use this identifier later to open the Space. We then place a SolarSystem view into the Space. Let's also define a simple standard window for our app, which we want to show up when the app launches, with a control to view our solar system. This is similar to what the World app does. So we define a new launch window using a WindowGroup, and we add some information together with a control that will allow us to open our Space. That control is just a button.
When it's clicked, we want to change its title and open our Space. For controlling windows, SwiftUI provides the openWindow and dismissWindow environment actions. And for Immersive Space, we are adding the new openImmersiveSpace and dismissImmersiveSpace actions. We obtain the two actions from the environment. We can then use these actions when the button is invoked. When opening the Space, we pass in the identifier we defined earlier. Since only one Space can be open at a time, the dismissImmersiveSpace action doesn't need any argument. The system animates your Space in and out with a certain duration. These environment actions are async so that you can react to the completion of the animation. Opening an Immersive Space may fail, and openImmersiveSpace will tell you via its result whether the call failed or succeeded. Make sure to have proper error handling. Coming back to our app that we defined at the beginning, we can now just add our LaunchWindow right here. Notice the order of our two scenes. The LaunchWindow is the first in our list of scenes, so SwiftUI will display the launch window when the app starts. The Immersive Space won't be visible at launch, but will only show up once the user clicks on our button. Let's run this on the simulator. When our app launches, we get to see the launch window. And then with just a click on our button, the solar system appears right in our living room.
So now we've defined a multiscene app consisting of a standard window and a Space displaying our solar system. You've seen the models used in the World app. When building your immersive app, you're surely going to want to display some 3D assets with a lot of detail in your Space. Keep in mind that it may take some time for your assets to be fully loaded and ready to be rendered by your app. For the best user experience, make sure to leverage the new Model3D and RealityView APIs, which load your 3D assets asynchronously. In this code here, we display text while the model is still loading and an error in case something went wrong. And now, Mark is going to tell you how to manage your Space and, even better, how to launch into Space.
Mark Ma: Thanks, Raffa. As we just demonstrated, it was incredibly easy to integrate Immersive Spaces into our World app with just a couple of lines of code. Transforming your app into an immersive experience also involves managing your Space alongside the system with scene phases, converting coordinates between your Space and other scenes, and presenting it with different styles. Similar to other SwiftUI scene types, Immersive Space supports the same scene phases, which are handled by the system. This also means your Space is always in one of SwiftUI's scene phases. Opening the Space moves it to the active phase. And at any point in time, the system may change it to the inactive phase. For example, if you step out of the system-defined boundary or a system alert shows, your Space and windows are hidden temporarily, moving them to the inactive phase. Once the user re-enters the experience, your Space and windows will be made visible, updating their scene phase to be active again. For our World app, we can add a few quick lines of code to handle the inactive scene phase. Let's scale down our Earth model to half size to help indicate that the state of the Space has changed. Let's also make sure that we handle the active phase to restore the contents. And keep in mind, the Space can be dismissed at any time by hardware or software means. So let's check this out in the Simulator. We'll open up our Space and demonstrate how our app handles the inactive phase. For example, this might be triggered when an alert shows up. When the alert pops up, note that the Space content changed in scale as a result of our previous sample code. And when we dismissed the alert, the Space scaled back, reacting to the now active phase. SwiftUI makes handling and animating these transitions really easy and convenient.
Another great way to manage your Space is by integrating content from your other windows with your Space. For example, if you want to reposition the Earth model next to the main window, we'll need to know the position of the window in our Immersive Space coordinate system, but both objects define their own coordinate systems. So to help resolve this, SwiftUI provides a new coordinate space, named immersive space, that represents the coordinate system of an Immersive Space. To access this coordinate system, we encapsulate the window's content inside a GeometryReader3D. Then, by using the existing APIs that take a coordinate space, like transform, and passing in the immersive space type, we can get the proxy's transform in the new coordinate system. Using this transform, we'll update the Earth's position on tap. Let's run this on the simulator. We'll reopen our Space so we'll have our Earth and the main window visible.
We've slightly shifted the window, and we want to reposition Earth right in front of it. Now when the Earth gets tapped, it'll get positioned where we expect. And with coordinate conversions, it's that easy to position our content exactly where we want it and move assets between the Space and the window. Another time to use coordinate conversions is with Immersive Spaces in SharePlay, where we can manage our content's position across not only a private Immersive Space, but also a group Immersive Space. If your app supports SharePlay and group Immersive Spaces, when other participants join, the system may move the origin of the space to a shared location defined by spatial templates. For more information, please check out the session, "Build spatial SharePlay experiences."
Our World app now handles scene phases and can combine content from other windows, but we still have yet to use the full capabilities that Space offers. We'll explore immersion styles next to make the experience even more amazing. Immersion styles offer different presentations of how your Space content can take over your surroundings. You can present your content alongside your surroundings with the mixed style, within a portal in front of you with the progressive style, or with the full style that surrounds you in all directions. Let's update our app to take advantage of all these styles. Let's reopen our app and go back to where we defined an Immersive Space. Right now, the space presents the solar system in a mixed immersion style, which is the default one. It's easy to change the style and also have it be dynamic. First, let's add a new state variable of type ImmersionStyle and assign a default value we want the Space to begin with. Let's keep the mixed style here. We then use the immersionStyle scene modifier and define the list of styles we want our Space to support. In order to have a reference to the current style, we pass in our state variable as a binding. If we pass the binding to our solar system, we can also read the current style and control it to transition to any of the supported styles. In this sample, we'll transition on a magnification gesture so that as we scale up the solar system, we'll go to a different style.
So far, we've been running our World app on the simulator to show you how easy it was to bring an Immersive Space into it. But to get a really great sense of how these styles work with our surroundings, let's run our experience on a device. And later, we'll show you even more customizations that really enhance your on-device experience. You open the Space in the default style, which results in the mixed immersion style. This style is great, but you might want to become a bit more immersed in the content and maybe see some stars. So you can perform the magnification gesture. And as the content grows bigger, eventually the Space transitions into the progressive style. This style is the bridge between a passthrough and a fully immersive experience. It allows you to see the Immersive Space content within the portal that's in front of you, along with your surroundings. This style feels pretty immersive, but also lets you be aware of what surrounds you. This also means you can chat with people nearby, know where to sit to be comfortable, and even interact with the surroundings. And once you're comfortable, by turning the Digital Crown, you increase the immersion of the style. Isn't that really cool? Now you're floating like an astronaut in the galaxy.
And if you want to see more of your surroundings again, just turn the Digital Crown back to decrease the immersion. This lets you quickly and easily be in control of how immersive the content will be within the progressive style. But perhaps you'd like to be in full immersion all the time. This is great for experiences that surround you or instantly transport you to a whole different world. So far, you've learned how easy it is to transition to different styles based on gestures. Going to full immersion is no different, and you'll experience that as you scale up the Earth again to update the style binding. Notice how easy and seamless it was to transition between the different styles. Now the Space has become fully immersive. And with SwiftUI, it only took a couple of lines of code. And by pressing the Digital Crown, you can go back to passthrough whenever you're ready to leave the experience.
We've just demonstrated different ways to manage your Space by reacting to scene phase changes and controlling the style. Now let's add some final enhancements to take our Space to the next level. The spatial computing capabilities of the device allow your Space to be easily enhanced to feel even more exciting. So let's go over a couple of options, like launching directly into a Space, adding effects to the surroundings, and virtual hands.
So far, our app allows us to open a Space with a click of a button. But what if you want to launch an immersive experience right when your app starts, like if you have a fully immersive game? In order to launch directly into an Immersive Space, you'll need to configure the scene manifest for your app. Just set the ImmersiveSpace application role and the immersion style. Attach your Space content as you normally would, and it'll open right away. You can also have your app go back to a window if the user chooses to dismiss the Space. Next, the surroundings effect preference allows us to dim the passthrough so the Space content is even more in focus. We'd like to have our surroundings be dimmed when the Space transitions to the progressive style. We set the preferredSurroundingsEffect modifier to the dark effect, so when the solar system appears, our surroundings will automatically be dimmed. The upperLimbVisibility modifier allows us to hide our hands while in a Space that's fully immersive, since no passthrough is available. For our World experience, we'll simply set the visibility to hidden when we open our Space. And just like that, we can change the upperLimbVisibility preference.
Hiding your hands while in the full immersion style means we can show virtual hands instead, and we'll show some space gloves in our World app. Let's start by creating a new view called SpaceGloves. Next, we'll add a RealityView so we can have our gloves render in our Space. Then we'll create a root entity in our RealityView to add entities to, so they can also be rendered. Then we'll load an asset onto an entity and add it as a child of our root. To correctly place the entities, we'll need to use ARKit and its hand tracking API, and we'll also need to start the hand tracking system. Our next step is making sure the assets correctly anchor to our hands. We'll need to check for hand tracking anchor updates and then check the hand's chirality. We'll then make sure the hand asset's transform is the same as the anchor's. In this example, we also made sure our assets have the same joint names as the ones ARKit provides. This way, we can map the anchor skeleton joint names correctly, and the glove entity will automatically be anchored for us. So let's go back to where our Space is defined and make sure to include the SpaceGloves view. That's all we need for virtual hands. For more ARKit customization and in-depth details, check out "Evolve your ARKit app for spatial experiences."
Now let's try these customizations out on device. The World experience starts again, and the Space will reopen in the default immersion style. By using the magnify gesture on the Earth, the app will transition into the progressive style. When the Space opens, the code will modify the surroundings to be dark. You can make yourself feel even more immersed by utilizing the surroundings effect API to dim the passthrough. It was easy to apply, and it's a great way to focus on the experience. This is pretty immersive right now, but you can take it a step further with our next customization.
As our previous code demonstrated, when you transition to full immersion, your hands will disappear and the virtual space gloves will appear where your hands would be thanks to hand tracking. By using RealityView with ARKit and enabling hand tracking, you were able to launch into space like a virtual astronaut, and it feels really awesome.
With just a couple of enhancements and customizations, we were able to turn our World app into a fully immersive experience that took us beyond the Shared Space. And now it's up to you to use the new Immersive Space API to create experiences effortlessly, show them off with different styles, and be creative with the possible customizations. It's a powerful and easy-to-use API that gives you all the tools necessary to transform your surroundings and create new and immersive experiences. Thank you for joining us. ♪
-
4:18 - Defining an ImmersiveSpace
@main
struct WorldApp: App {
    var body: some Scene {
        ImmersiveSpace {
            SolarSystem()
        }
    }
}
-
6:53 - RealityView in an ImmersiveSpace
ImmersiveSpace {
    RealityView { content in
        let starfield = await loadStarfield()
        content.add(starfield)
    }
}
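The listing above loads the starfield asynchronously and adds it at the Space's origin, which sits near your feet. As a minimal sketch of the coordinate convention described in the transcript (RealityKit's y-axis points up, and negative z points away from you), the variation below raises a loaded entity to roughly eye level. It is not part of the session's sample code; the "Earth" asset name and the offsets are placeholder assumptions.
ImmersiveSpace {
    RealityView { content in
        // Hypothetical asset name; substitute a model from your own bundle.
        if let earth = try? await Entity(named: "Earth") {
            // RealityKit convention: +y is up, -z is in front of the viewer.
            // Lift the entity about 1.5 m above the Space's origin and place it 2 m ahead.
            earth.position = SIMD3<Float>(0, 1.5, -2)
            content.add(earth)
        }
    }
}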
-
8:17 - ImmersiveSpace with a SolarSystem view
@main
struct WorldApp: App {
    var body: some Scene {
        ImmersiveSpace(id: "solar") {
            SolarSystem()
        }
    }
}
-
9:00 - LaunchWindow
struct LaunchWindow: Scene {
    var body: some Scene {
        WindowGroup {
            VStack {
                Text("The Solar System")
                    .font(.largeTitle)
                Text("Every 365.25 days, the planet and its satellites [...]")
                SpaceControl()
            }
        }
    }
}
-
9:11 - SpaceControl button using Environment actions for opening and dismissing an ImmersiveSpace scene
struct SpaceControl: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace
    @Environment(\.dismissImmersiveSpace) private var dismissImmersiveSpace
    @State private var isSpaceHidden: Bool = true

    var body: some View {
        Button(isSpaceHidden ? "View Outer Space" : "Exit the solar system") {
            Task {
                if isSpaceHidden {
                    // Handle the result of opening the Space.
                    let result = await openImmersiveSpace(id: "solar")
                    switch result {
                    case .opened:
                        isSpaceHidden = false
                    case .userCancelled, .error:
                        isSpaceHidden = true
                    @unknown default:
                        isSpaceHidden = true
                    }
                } else {
                    await dismissImmersiveSpace()
                    isSpaceHidden = true
                }
            }
        }
    }
}
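Because only one Space can be open at a time, switching to a different Space means dismissing the current one before opening the next. The following is a minimal sketch of such a control, not taken from the session; it assumes the same SwiftUI context as the SpaceControl listing above, and the "orbit" identifier stands in for a hypothetical second ImmersiveSpace defined elsewhere in the app.
struct SpaceSwitcher: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace
    @Environment(\.dismissImmersiveSpace) private var dismissImmersiveSpace

    var body: some View {
        Button("Switch to the orbit view") {
            Task {
                // Dismiss whatever Space is currently open...
                await dismissImmersiveSpace()
                // ...then open the other one, checking the result as usual.
                let result = await openImmersiveSpace(id: "orbit")
                if case .error = result {
                    print("Failed to open the orbit space")
                }
            }
        }
    }
}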
-
10:44 - WorldApp using LaunchWindow and ImmersiveSpace
@main
struct WorldApp: App {
    var body: some Scene {
        LaunchWindow()
        ImmersiveSpace(id: "solar") {
            SolarSystem()
        }
    }
}
-
11:32 - Model3D with phase handling
Model3D(named: "Earth") { phase in
    switch phase {
    case .empty:
        Text("Waiting")
    case .failure(let error):
        Text("Error \(error.localizedDescription)")
    case .success(let model):
        model.resizable()
    }
}
-
13:04 - Scene Phases
@main
struct WorldApp: App {
    @EnvironmentObject private var model: ViewModel
    @Environment(\.scenePhase) private var scenePhase

    var body: some Scene {
        ImmersiveSpace(id: "solar") {
            SolarSystem()
                .onChange(of: scenePhase) {
                    switch scenePhase {
                    case .inactive, .background:
                        model.solarEarth.scale = 0.5
                    case .active:
                        model.solarEarth.scale = 1
                    @unknown default:
                        break
                    }
                }
        }
    }
}
-
14:21 - Coordinate Conversions
var body: some View {
    GeometryReader3D { proxy in
        ZStack {
            Earth(
                earthConfiguration: model.solarEarth,
                satelliteConfiguration: [model.solarSatellite],
                moonConfiguration: model.solarMoon,
                showSun: true,
                sunAngle: model.solarSunAngle,
                animateUpdates: animateUpdates
            )
            .onTapGesture {
                if let translation = proxy.transform(in: .immersiveSpace)?.translation {
                    model.solarEarth.position = Point3D(translation)
                }
            }
        }
    }
}
-
16:34 - Immersion Styles
@main
struct WorldApp: App {
    @State private var currentStyle: ImmersionStyle = .mixed

    var body: some Scene {
        ImmersiveSpace(id: "solar") {
            SolarSystem()
                .simultaneousGesture(
                    MagnifyGesture()
                        .onChanged { value in
                            let scale = value.magnification
                            // Check the larger threshold first so the full style is reachable.
                            if scale > 10 {
                                currentStyle = .full
                            } else if scale > 5 {
                                currentStyle = .progressive
                            } else {
                                currentStyle = .mixed
                            }
                        }
                )
        }
        .immersionStyle(selection: $currentStyle, in: .mixed, .progressive, .full)
    }
}
-
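In the Customization chapter, the transcript describes launching directly into an Immersive Space by configuring the app's scene manifest with the ImmersiveSpace application role and an initial immersion style. No listing for that configuration appears in this section; the Info.plist sketch below is a reconstruction rather than the session's own sample, so treat the key names and values as assumptions to verify against the current documentation.
<key>UIApplicationSceneManifest</key>
<dict>
    <!-- Assumed key and value names; verify against the visionOS documentation. -->
    <key>UIApplicationPreferredDefaultSceneSessionRole</key>
    <string>UISceneSessionRoleImmersiveSpaceApplication</string>
    <key>UISceneConfigurations</key>
    <dict>
        <key>UISceneSessionRoleImmersiveSpaceApplication</key>
        <array>
            <dict>
                <key>UISceneInitialImmersionStyle</key>
                <string>UIImmersionStyleFull</string>
            </dict>
        </array>
    </dict>
</dict>
-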
20:08 - Surrounding Effects
@main
struct WorldApp: App {
    @State private var currentStyle: ImmersionStyle = .progressive

    var body: some Scene {
        ImmersiveSpace(id: "solar") {
            SolarSystem()
                .preferredSurroundingsEffect(.systemDark)
        }
        .immersionStyle(selection: $currentStyle, in: .progressive)
    }
}
-
20:30 - Upper Limbs Visibility
@main
struct WorldApp: App {
    @State private var currentStyle: ImmersionStyle = .full

    var body: some Scene {
        ImmersiveSpace(id: "solar") {
            SolarSystem()
        }
        .immersionStyle(selection: $currentStyle, in: .full)
        .upperLimbVisibility(.hidden)
    }
}
-
20:52 - Hand Anchoring
struct SpaceGloves2: View {

    let arSession = ARKitSession()
    let handTracking = HandTrackingProvider()

    var body: some View {
        RealityView { content in
            let root = Entity()
            content.add(root)

            // Load the left glove
            let leftGlove = try! Entity.loadModel(named: "assets/gloves/LeftGlove_v001.usdz")
            root.addChild(leftGlove)

            // Load the right glove
            let rightGlove = try! Entity.loadModel(named: "assets/gloves/RightGlove_v001.usdz")
            root.addChild(rightGlove)

            // Start ARKit session and fetch anchorUpdates
            Task {
                do {
                    try await arSession.run([handTracking])
                } catch let error as ProviderError {
                    print("Encountered an error while running providers: \(error.localizedDescription)")
                } catch let error {
                    print("Encountered an unexpected error: \(error.localizedDescription)")
                }

                for await anchorUpdate in handTracking.anchorUpdates {
                    let anchor = anchorUpdate.anchor
                    switch anchor.chirality {
                    case .left:
                        // Anchor the left glove to the left hand and mirror its joint rotations.
                        leftGlove.transform = Transform(matrix: anchor.transform)
                        for (index, jointName) in anchor.skeleton.definition.jointNames.enumerated() {
                            leftGlove.jointTransforms[index].rotation =
                                simd_quatf(anchor.skeleton.joint(named: jointName).localTransform)
                        }
                    case .right:
                        // Anchor the right glove to the right hand and mirror its joint rotations.
                        rightGlove.transform = Transform(matrix: anchor.transform)
                        for (index, jointName) in anchor.skeleton.definition.jointNames.enumerated() {
                            rightGlove.jointTransforms[index].rotation =
                                simd_quatf(anchor.skeleton.joint(named: jointName).localTransform)
                        }
                    }
                }
            }
        }
    }
}