Streaming is available in most browsers,
and in the Developer app.
Enhance the immersion of media viewing in custom environments
Extend your media viewing experience using Reality Composer Pro components like Docking Region, Reverb, and Virtual Environment Probe. Find out how to further enhance immersion using Reflections, Tint Surroundings Effect, SharePlay, and the Immersive Environment Picker.
Chapters
- 0:00 - Introduction
- 1:48 - Immersive playback
- 3:33 - Custom Docking Region
- 7:56 - Media reflections
- 10:14 - Grounding the experience
- 14:34 - Immersive environment picker
- 15:33 - SharePlay
Resources
- Building an immersive media viewing experience
- Destination Video
- Enabling video reflections in an immersive environment
- Forum: Media Technologies
Hello. My name is Jonathan, and I am on the visionOS Program team at Apple. Today we will look at how you can bring the scale and wonder of a movie theater to your media viewing experience on Apple Vision Pro.
Just like the magic that happens when the lights dim in your favorite movie theater, you can now create a more immersive experience by casting a beautiful media reflection onto your custom environment, like you see here in the new Studio environment from the Destination Video sample. After the lights have dimmed, you can increase the visual ambiance by tinting the passthrough so that your audience’s hands match your environment’s lighting, and set a reverb preset to fill your environment with audio atmosphere. See the smooth transition between media and environment? And you have the option to offer multiple variants to diversify the look and feel of your scene, all while the system lighting features and shadow integration ground the entire viewing experience in your custom environment.
In this session, I will go over the foundation on which you can build your media viewing experience so that playback can be more immersive. I will show you how you can use the new custom docking component in Reality Composer Pro to define the ideal location for video playback in your custom environment. We’ll look at media reflections from the video playing in a custom environment. Then I’ll take you through how you can use visuals and sound to ground your custom environment’s overall experience. We’ll go through how you can make your custom environment more discoverable, using the immersive environment picker. And lastly, I will finish up with how a SharePlay enabled app can sync the environment and allow multiple participants to share an immersive watching experience. Let’s start with the foundational elements of your space and media viewing experience.
While watching media within a window that you can resize and reposition is a great experience in itself, the immersive capabilities on Apple Vision Pro offer you a lot more than that. To make your video watching experience more immersive, you should start with the immersiveSpace scene-type and take advantage of the features in AVKit. The immersiveSpace scene-type works well because it enables media apps to provide the audience with customizable progressive immersion, meaning the audience can set the desired amount of immersion by turning the Digital Crown on Apple Vision Pro, from no immersion to fully immersive. There are other benefits to the immersiveSpace scene-type, like custom tint color and passthrough brightness, which we will come back to later in this session.
To learn more about customizing your immersive space, check out the session "Dive deep into volumes and immersive spaces".
The recommended way to build an incredible media viewing experience for visionOS is to take advantage of the features in AVKit, and in particular AVPlayerViewController, which integrates with the system for enhanced video playback capabilities.
AVPlayerViewController supports HLS streaming and makes it simple to provide a familiar playback experience that matches system apps like TV and Music. The streamlined playback controls allow the audience to focus on the media. On visionOS, showing AVPlayerViewController in fullscreen offers many great capabilities, one of which is participating in the system docking behavior. Docking enhances the fullscreen experience in an immersive space by placing the video screen in a fixed location.
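To make this foundation concrete, here is a minimal sketch of an immersive media app; the PlayerView wrapper and the stream URL are placeholders of mine, not part of the sample:

import SwiftUI
import AVKit

@main
struct MediaApp: App {
    // Track the current immersion style; .progressive lets the audience
    // adjust the amount of immersion with the Digital Crown.
    @State private var immersionStyle: any ImmersionStyle = .progressive

    var body: some Scene {
        ImmersiveSpace(id: "Player") {
            PlayerView()
        }
        .immersionStyle(selection: $immersionStyle, in: .progressive)
    }
}

// A placeholder wrapper that hosts AVPlayerViewController for
// system-integrated video playback.
struct PlayerView: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        // HLS streams work out of the box; this URL is a placeholder.
        controller.player = AVPlayer(url: URL(string: "https://example.com/master.m3u8")!)
        return controller
    }

    func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {}
}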
By default, the system determines the docking location, but now you can customize this location by specifying a custom docking region. You can see the use of a custom docking region in the Destination Video sample, which is built with the AVKit and ImmersiveSpace foundation. I’ll show you how you can use the Docking Region component in Reality Composer Pro to place your video into the ideal location of your custom environment. Here I am in Reality Composer Pro. You should use Reality Composer Pro to prepare 3D content and visualize your experience. I am going to use Reality Composer Pro to add the Video Dock preset to this new project. On the left side is the Scene Navigator. At the bottom of the Scene Navigator, there is a plus sign that is the insert button. I am selecting the insert button, and then in the Environment menu, selecting the Video Dock preset.
I can see a part of the Video Dock preset in the viewport now. The viewport allows me to visualize, navigate through, and preview the 3D scene. I am using Frame Scene in the Viewport menu so I can better see the whole Video Dock preset. Back in the Scene Navigator, I am expanding the Video Dock entity to see that it contains a Player entity.
When I select the Player entity, I have new properties available on the right side in the Inspector. The Inspector provides an easy way to edit the properties of my Player entity, such as the position and rotation.
The Inspector for the Player entity also contains a Docking Region Component at the bottom. This lets me customize my Docking Region.
Before I customize my Docking Region, I am selecting Reset Camera in the Viewport menu.
This moves the camera so that I can see the origin.
The immersiveSpace is a scene, so it implicitly defines its own coordinate system. Everything that you place in a Space is positioned relative to the space's origin. And the origin of a Space is below the audience, close to their feet when the Space is first opened. In the center of the Viewport, I can see the dockingRegion. The position of the dockingRegion determines the location of the video player.
So, I want to position my dockingRegion relative to the origin.
Back in the Inspector, I can see the width property in the Docking Region component. The width of the Docking Region defines a bounding region that is the maximum size for media playback, using a 2.4:1 aspect ratio. For example, a 10-meter-wide Docking Region yields a bounding region about 4.2 meters tall. Videos that are larger than the maximum width will be scaled to fit within the bounding region.
I can change the width property to customize my Docking Region’s size. As you can see, when I change the Docking Region’s width, the height follows.
So, now that my Docking Region has more width and height, I need to raise my Player entity. I am changing the position transform to raise it. The player feels good slightly above the ground, like the screen in a movie theater.
This is just the start of a project; let’s check out something more complete.
In the Destination Video sample, you will see there is a folder named Studio with a Reality Composer Pro project inside. I am opening the Studio project to view its setup and selecting Reset Camera in the Viewport menu so that I can preview.
You can see how the Docking Region in the Studio is positioned so that there are minimal objects between an audience and the media playback. The intention is to provide the audience with a clear view.
The Studio has a sense of symmetry and an architectural thinking for why the Docking Region belongs at this location in the space. The placement is meant to celebrate the space, make the media playback fit the room, and provide a more immersive experience.
Make sure your custom environment is designed to accommodate the docking region. We recommend providing a comfortable viewing angle to avoid causing strain or discomfort during longer viewing sessions. Previewing in Reality Composer Pro is helpful, but be sure to review your environment and docking placement on Apple Vision Pro. The perception of scale and layout can be quite different between a monitor and Apple Vision Pro, and nothing beats testing on device.
Now that I have my dockingRegion in the ideal location, let’s look at how video playback can reflect onto a custom environment.
Back in Destination Video, the media reflections on the Studio surfaces help to show where media is playing and ground it in this space. Make your media reflections fit your custom environment, and they will help your audience understand where the video is playing in your virtual space.
There are two kinds of media reflections among Reality Composer Pro’s ShaderGraph nodes: Reflection Specular and Reflection Diffuse. The reflections create interactions between the media content and the virtual space that increase the sense of realism and immersion.
Use these reflections as depth cues so your audience can understand where the screen is in your custom environment. This helps to bring focus to your video playback. Specular reflections provide direct reflections of the video content, useful for very glossy surfaces like metals, mirrors, and water.
To set up specular media reflection, all that is needed is an entity in your scene with a dockingRegion component and a material on a surface with the Reflection_Specular node connected. RealityKit will use the dockingRegion location and the ground’s texture coordinates to compute the reflection and render the reflected media onto your surface. Diffuse reflections provide a softer falloff of video content, useful for rougher, more organic surfaces like a wooden floor or concrete.
The implementation provided in the Video Dock preset is just an example. If you move the Docking Region, ground entity, or use your own mesh, then you will need to recompute texture coordinates for Reflection_Diffuse and import that data into Reality Composer Pro.
The Studio uses diffuse reflection because the softer falloff is a better fit for the floor material properties. Specular reflection is included in the Studio just for demonstration purposes.
You should check out the visionOS system environments for examples that integrate reflections in a more balanced way, complementing the media and the environment without being a distraction. There’s a lot more to learn, so I recommend checking out the Destination Video sample and supporting materials, which go deeper on how to enable specular reflections and configure diffuse reflections.

Next, let’s look at how you can use visuals and sound to ground your custom environment’s overall experience. We will use some new features, such as the Virtual Environment Probe, Brightness and Tint, and Reverb.

Let’s start with the Virtual Environment Probe. The Virtual Environment Probe describes a location in terms of its color variation and illumination intensity. The system can use this information to automatically shade any Physically Based Rendering material, closely approximating the way light bounces off objects. You can use the probe to provide environment lighting for objects in your ImmersiveSpace, and if you are creating an environment using the progressive immersion style, this virtual environment lighting will gradually replace your real-world lighting for objects outside of the ImmersiveSpace. For example, this metallic sphere is reflecting a mix of the Studio and my real-world surroundings. The effect is a realistic object that looks natural in progressive immersion.
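As a rough sketch, an app might attach probe lighting to its environment’s root entity in code along these lines. The component and source API shapes here are assumptions on my part, and "StudioEnvironment" is a placeholder resource name, so check the Destination Video sample for the exact signatures:

import RealityKit

// Assumed API shape; verify against the Destination Video sample.
func addEnvironmentProbe(to root: Entity) async throws {
    // "StudioEnvironment" is a placeholder EnvironmentResource name.
    let resource = try await EnvironmentResource(named: "StudioEnvironment")
    var probe = VirtualEnvironmentProbeComponent()
    probe.source = .single(.init(environment: resource))
    root.components.set(probe)
}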
And if your scene supports two variants, like Destination Video’s dark and light Studio, you can configure a Virtual Environment Probe with two pre-generated Environment Resources and a blend factor between them. In this case, your app code drives a smooth transition between the dark and light resources, while the system drives immersion blending. It’s best to keep the transition brief; in the visionOS system environments, the dark-to-light transition is one and a half seconds long.

Next up: Brightness and Tint. Depending on the look of your custom environment, you may want the lighting on your audience’s hands to blend well with your environment lighting, like when the lights dim in a movie theater. Now in visionOS, you can customize tint and brightness to soften the edges between your audience’s surroundings and your media playback, while also making their hands look like they belong in your experience.
See how the look of my hand changes to match the dark version of the Studio? This helps me feel more immersed in the custom environment and focus on the viewing experience.
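Here is a minimal sketch of adopting brightness and tint together, assuming a dark environment; StudioEnvironmentView and the tint color value are illustrative placeholders:

import SwiftUI

struct StudioSpace: Scene {
    var body: some Scene {
        ImmersiveSpace(id: "Studio") {
            StudioEnvironmentView()   // placeholder for your environment content
                // Tint the passthrough, including the audience's hands,
                // so it blends with the environment's lighting.
                .preferredSurroundingsEffect(.colorMultiply(Color(white: 0.2)))
        }
        // Hint to the system that this space presents dark, media-focused content.
        .immersiveContentBrightness(.dark)
    }
}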
Take a look at the immersiveContentBrightness and surroundingsEffect code samples in Destination Video to inform how you use these features in your media app. Try using subtle tint colors that strengthen the sense of immersion without distracting the audience.

While these improved visuals already make your environment look nice, you can also use the new Reverb component in Reality Composer Pro to make your authored spatial sounds, and the system sounds, better match your custom environment, which further increases immersion.
In the real world, sound reverberates, bouncing off walls and furniture before arriving at a person’s ears. By default, visionOS reverberates spatial audio sources by simulating the acoustics of a person’s real-world environment.
Reverberation can also be simulated for a virtual environment. If you’ve visited the Mount Hood system environment, then you’ve probably heard the frogs croaking at night. Those spatial sounds have reverb which simulates an outdoor environment. Now you can apply reverberation to spatial and system sounds in your custom environment. To enable Reverb, I am using the Add Component button in the Inspector to add the Reverb component to my new project.
Then I can go to the Inspector and select the reverb preset that best fits. The presets are a curated set of high-quality, naturalistic reverb settings that add resonance and simulate the experience of hearing spatial sounds in an environment that’s similar to my custom environment.
It’s recommended to select a Reverb preset for your custom environment, even if your environment doesn’t provide custom audio, so that the system can use the preset to spatialize system sounds like UI and buttons. These system sounds are spatial audio sources present at runtime.
For example, use the Medium Room (Treated) preset to add an optimal listening experience with a subtle reverb applied, or if your custom environment is an outdoors space, then use the Outdoor preset.
Or go big if you have a large indoor environment and use the Very large room reverb preset. This is the preset used in the Studio.
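If you ever need to set this from code instead of Reality Composer Pro, a sketch along these lines should be close, assuming RealityKit’s ReverbComponent preset API; verify the preset case names against the documentation:

import RealityKit

// Assumed preset API; the Studio uses the Very Large Room preset.
func applyReverb(to environmentRoot: Entity) {
    environmentRoot.components.set(
        ReverbComponent(reverb: .preset(.veryLargeRoom))
    )
}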
For more details on how to implement and tune your reverb, see the RealityKit Audio session titled "Enhance your spatial computing app with RealityKit audio". Now that you have learned how to enhance the immersion of media playback in a custom environment, you can take advantage of the Immersive Environment Picker to make your custom environment appear in the same list as the visionOS system environments. Let’s take a closer look.
See how the immersive environment picker now includes a new section for Destination Video with our Studio environment? Selecting the Studio opens the environment and docks the media into its docking region.
You can add your custom environments to this list as well. Simply declare your custom environment by describing it with metadata, like a title and image, and let your audience discover your environment when they go to dock.
Use the .immersiveEnvironmentPicker modifier on a view to add your immersive environments to video players inside that view. There, you can declare how each of your custom environments will appear in the picker. For each environment, include a title, image, and the immersive space to open when it’s selected.
Let's talk about SharePlay. If you are already using SharePlay to sync media, AVKit now brings synchronization of the environment state as well, allowing people to further share an immersive watching experience. Let's see how that works. This code snippet uses a GroupSession to synchronize shareable content between devices. To sync media on the session, pass it to the AVPlayer's AVPlaybackCoordinator. To sync the viewing environment on the same session, first import both AVKit and GroupActivities, and then pass the session to the AVPlayerViewController's AVGroupExperienceCoordinator. Now, when the activity participants watch fullscreen, they can share the same viewing environment.
You’ve just learned how to use the Immersive Space scene-type and AVKit as the foundation for your playback experience, how you can enhance immersion with a custom Docking Region and media reflections, and how to ground the experience with the Virtual Environment Probe, Reverb, and Brightness and Tint. Using these features to increase immersion will help your audience feel present in your space, so they can believe in it and enjoy the experience you want to give them. Remember to give your audience a clear and comfortable view of your media, without distractions, so that they can focus on the experience. We also showed how to get started with the Immersive Environment Picker and SharePlay. I hope this session has been a helpful primer on how to make your media playback experience more immersive.
Be sure to take a deeper dive into the Destination Video sample and the multiple sessions available. I can’t wait for the immersive experience of watching media playback in the custom environments you build.
Code
15:14 - Add environments to the Immersive Environment Picker
WindowGroup {
    ContentView()
        .immersiveEnvironmentPicker {
            ForEach(viewModel.environmentItems) { item in
                Button(item.title, image: item.thumbnail) {
                    Task {
                        await openImmersiveSpace(id: item.id)
                    }
                }
            }
        }
}
15:47 - Synchronization of an environment state using SharePlay
import AVKit
import GroupActivities

for await session in MyActivity.sessions() {
    // Join the session, activate the activity, etc.

    // Sync media playback across participants.
    playerViewController
        .player?
        .playbackCoordinator
        .coordinateWithSession(session)

    // Sync the viewing environment across participants.
    playerViewController
        .groupExperienceCoordinator
        .coordinateWithSession(session)
}