Explore USD tools and rendering
Discover the latest advancements in tooling to help you generate, inspect, and convert Universal Scene Description (USD) assets. We'll learn about updates to these tools and help you integrate them into your content creation pipeline. We'll also explore the power of USD Hydra rendering, and show how you can integrate it into your own apps. For an introduction to USD, watch "Understand USD fundamentals" from WWDC22.
Resources
- ASWF USD Working Group
- Creating a 3D application with Hydra rendering
- Introduction to Universal Scene Description (USD)
- USDZ schemas for AR
Related Videos
WWDC21
- Create 3D models with Object Capture
- Create 3D workflows with USD
- Explore advanced rendering with RealityKit 2
♪ instrumental hip hop music ♪ Hi there. Welcome to this year's WWDC. My name is Stella. And I'm Alex. Stella and I work together on the many parts of USD at Apple. Today we'll explore updates to USD tools and rendering with you. Take it away, Stella! USD is a foundational technology. With its growing and deepening integration into content creation tools, it is enabling more and more ways of creating assets and content, and rendering for games, AR, film, and the web, all with USD at the center.
Today, we will focus on two parts of the ecosystem: tools and rendering.
Let's start with updates to our USD tools.
We'll then show you how your assets look even better with new lighting in AR Quick Look, take a deep dive into USD rendering, and last but not least, show you how to integrate Hydra.
Let's get started with USD Tools in the Apple ecosystem.
We'll cover USDZ Tools, Reality Converter, and then follow up with additional tools and frameworks that can help you create USDZ content.
USDZ Tools is a package that contains essential command line USD Python tools to help you generate, validate, and inspect USDZ files. The package also includes a converter called usdzconvert that creates USDZ files from other major 3D file formats.
The Python scripts give you powerful tools for automation and batch processing. Also, they are a great way for you to explore USD and learn how to use the API.
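For instance, since usdzconvert takes an input file and an optional output file (see the --help output later in this page), a minimal shell sketch for batch-converting a folder of glTF files might look like this; the folder and file names are illustrative, and usdzconvert is assumed to be on your PATH:

% for f in assets/*.gltf; do usdzconvert "$f" "${f%.gltf}.usdz"; done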
This year, we're bringing you Python 3 and Apple Silicon support. We've also upgraded the USD version to give you simple access to new USD features and performance improvements.
On top of that, we added great new features to usdzconvert. You can now convert OBJ files with materials to USDZ using the -useObjMtl flag. We also added support for points and lines from glTF files, and scene time for animations from FBX files. To display all usdzconvert options and ways to customize your conversion, just type "usdzconvert --help" in the terminal.
This will show you all the options you can use with usdzconvert, such as adding copyright information or providing the diffuseColor or normalMap for your 3D model, and much more. Alternatively, if you prefer a UI editor over the command line, we also have Reality Converter. It is built on USDZ Tools and provides the same capabilities as usdzconvert, but in an editor window, making it easy to convert, view, and customize USD content on macOS.
Simply import common 3D file formats, such as OBJ, glTF, and FBX, to view the converted USDZ result. This year, we've also improved the UI to streamline your workflow. You can select a texture to view more information...
Customize material properties with your own textures...
Add copyright or edit file metadata...
And choose classic or new lighting, which we'll get into in more detail later in the session.
You can even preview your USDZ object under a variety of lighting conditions with built-in options and adjust exposure accordingly. On the asset side, we added a new feature that lets you choose texture quality when exporting a USDZ file. By default, textures are exported at their original, uncompressed size. But if you prefer to reduce the size of your USDZ files, you now have the option to compress textures to the JPEG format.
In this example, we used Object Capture to scan this chess piece in high resolution. In order to reduce the file size without losing mesh detail, we used Reality Converter to export the model with compressed textures. The visual difference is hardly noticeable, and we got a whopping 80% reduction in the resulting file size! Here is more exciting news: Reality Converter can now fix issues with your USD assets automatically! It will correct mismatched attributes and connection types, fix a stage with multiple top-level prims and a missing default prim, update deprecated attributes, and add missing stage metadata. We've also added universal binary support, so it now runs natively on Apple Silicon.
Now, what if you want to create 3D models from scratch? Last year, we launched Object Capture as a RealityKit API on macOS, which provides an innovative way to create USDZ assets. You can then use Reality Composer to compose a scene with multiple assets. This year, we are bringing you RoomPlan API, which lets you create parametric 3D scans of your room.
I highly recommend you check out these three sessions. Together, these technologies make it easier than ever for anyone to create immersive AR experiences. All the tools we covered today are available to download right now on the AR Creation Tools page on the Apple developer website. Please check them out! Next, let's take a look at AR Quick Look's new lighting.
AR Quick Look is the built-in, system-wide AR viewer on iOS which enables you to place 3D USDZ objects in your physical space, like on a table, a wall, or a floor, and interact with them with simple touch gestures. You can even make interactive scenes built with Reality Composer and save them to a USDZ file which you can share with others on iMessage, Mail, Notes, and more. We constantly strive to make AR Quick Look rendering more realistic, and this year, we are introducing improved lighting in AR Quick Look, which is brighter, with enhanced contrast and improved shape definition to make your assets look even better.
Here is an example of AirPods Pro in object mode with classic and new lighting. The AirPods Pro look great in both lighting conditions, and you will notice the new lighting option offers brighter color, higher contrast, and more highlights.
Now let's place the AirPods Max on the desk to view it in AR mode with new lighting! Stunning, isn't it? So how do you apply the new lighting to your assets? It's really simple. You just have to pick the lighting that works best for your content by setting the new preferredIblVersion metadata in your USDZ file.
Here we have set the value of preferredIblVersion to 2, which tells AR Quick Look to use the new lighting system.
Let's understand this in more detail. The preferredIblVersion metadata can hold a value of 0, 1, or 2.
An asset with preferredIblVersion set to 0 will use the system default lighting.
If set to 1, it will continue to use the classic lighting.
A value of 2 will give you the new lighting.
This means you can easily update your current assets to the new lighting as well! We recommend that you make an explicit choice and set this metadata in all of your assets. One way to do that is usdzconvert with the -preferredIblVersion flag; for example, you can convert the fire hydrant OBJ file to USDZ with the new lighting, as sketched below.
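A minimal sketch of that conversion, assuming usdzconvert is on your PATH, the flag spelling -preferredIblVersion, and an illustrative input file name:

% usdzconvert firehydrant.obj firehydrant.usdz -preferredIblVersion 2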
Or you can use Reality Converter, which will now use the new lighting by default. But if you want the classic lighting, there is a new option in the Property Panel.
Here we preview the AirPods Max asset with both the classic and new lighting.
This provides an easy way to compare the differences.
When you click the lighting icon, the applied lighting is also highlighted for your reference.
Lastly, for assets without this metadata, AR Quick Look will determine the lighting version automatically based on the file's date-time stamp.
If the asset was created after July 1st, 2022, it will use the new lighting.
Older assets will continue to use the classic lighting so they don't change how they look.
Now, I'll hand it over to Alex for USD rendering. Thanks, Stella. We've already seen a lot of rendered USD assets today. Now we'll explore what makes USD rendering and Hydra a great choice for your 3D content creation pipeline and how you can integrate it in your own applications.
First, let's take a step back and talk about rendering in general.
A renderer takes a collection of 3D models, cameras, and lights as input and uses them to generate an image. There are many different renderers, and each one of them is built for a specific purpose and optimized for a different use case.
Some renderers are designed for real-time applications like rendering a character in a game or an AR scene.
Others take much longer but produce a more photorealistic image, for example, producing visual effects for a Hollywood movie.
All renderers make choices about their features and are unique.
Different USD renderers for different use cases also exist on Apple platforms. We're adding documentation about these different renderers to developer.apple.com.
It will help you understand the differences between them, explain which USD features they support, and provide guidance on how to author USD files that work best for you.
One of the renderers on Apple platforms is RealityKit, which features a photorealistic rendering system, optimized for interactive augmented reality experiences, and is used in AR Quick Look. It is the primary renderer of USDZ files.
Another option for USD rendering on macOS is Pixar's open-source renderer, Storm. It is optimized for production-grade assets and designed for real-time preview of large-scale scenes.
It's a great technology for you to visualize and iterate on assets as they flow through your content creation pipeline.
Storm uses a technology called Hydra.
Hydra is a core aspect of the USD open source project. So next, let's understand Hydra and its connection to Storm. We start with this diagram from earlier. Here, we use Storm as the renderer.
The input is usually called "the scene graph," and Storm produces a preview render of it.
But what if you want to generate a photorealistic image of the same scene using a different renderer? That's exactly what Hydra is made for. Instead of tightly coupling the scene to the renderer, Hydra acts as an abstract layer in between them to transport data from scenes to renderers.
This allows you to easily swap out your renderer based on your needs at any time without any changes to your scene graph. For example, you might use Storm and Pixar's RenderMan.
In your content creation pipeline, you can use Storm for fast renders and quick iteration, and then switch to RenderMan to produce the final image when you're ready.
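As a hedged sketch of what such a swap can look like in code: UsdImagingGLEngine exposes Hydra render delegates as plugins, so switching renderers amounts to selecting a different plugin id. The RenderMan plugin id below is illustrative and requires Pixar's hdPrman delegate to be built and installed:

// List the Hydra render delegates available to the engine.
for (TfToken const& pluginId : UsdImagingGLEngine::GetRendererPlugins()) {
    std::cout << UsdImagingGLEngine::GetRendererDisplayName(pluginId) << std::endl;
}

// Storm is the default; switch to RenderMan's delegate when it's available.
_engine->SetRendererPlugin(TfToken("HdPrmanLoaderRendererPlugin"));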
The same goes for the scene graph. Your input to Hydra can be a USD scene graph or a different one.
This allows you to leverage the same renderer in multiple applications, even if each of them has its own, totally different scene graph. The interfaces that connect Hydra with scene graphs and renderers are called delegates: scene graphs are connected to Hydra via scene delegates, and renderers are connected to Hydra via render delegates. And that is Hydra for you! Foundry's Nuke 13 is already using Hydra to render the viewport, enabling Nuke artists on macOS to get better quality and an interactive experience in Nuke's 3D system.
This is achieved by using Hydra with a custom scene delegate and Metal-accelerated Storm.
We are excited to share that we've worked with Pixar Animation Studios to publicly release Metal-accelerated Storm to open source. It is ready for you to use in USD 22.05. Now that we've seen the power of Hydra, let's go through a Hydra sample application with the typical use case of a USD scene graph and Storm as the renderer. It will get you started with using Hydra to build content creation tools and 3D workflow applications.
Our sample application, HydraPlayer, is very simple but powerful. It renders a USD file with Storm and lets us move the camera around it.
And we're excited to make HydraPlayer a public sample project to get you started right away! It is available in the session resources and comes with full instructions in the readme.
I encourage you to pause this video now, download the project, and then follow along as we walk through it.
In your Xcode project, you will find four classes: the AppDelegate, Camera, Renderer, and ViewController. The AppDelegate is basically your root object and manages interactions with the system. The Camera class is a simple representation of the USD scene camera that makes it easy to translate user input.
The renderer class will handle all our interactions with USD and Hydra.
Last but not least, the ViewController handles our user input.
Before we can build and launch HydraPlayer, there are three things to do: prepare the environment, use Rosetta on Apple Silicon Macs, and download and build USD and Hydra. So let's get started.
First, we prepare our development environment. Make sure you have Xcode, Python, and CMake installed.
Now let's open up a terminal for the other two steps.
If you are on an Apple Silicon Mac, you need to run under Rosetta while USD is transitioning to fully support arm64. To do this, use the arch command.
Once your environment is ready, we have to download the USD and Hydra source code. Both live in the same GitHub repository at PixarAnimationStudios/USD.
Now that we have the code, we can build it. USD comes with a convenient Python build script. We add the flags --generator Xcode and --no-python, and specify where we want to install USD.
Let's put it next to the source code at "USDInstall." Once USD has finished building, we are ready to build HydraPlayer! Let's go back to this diagram one more time and use it to identify key parts of HydraPlayer to check out in detail.
We'll look at how to load the 3D models, how to set up the camera, and set the scene light. Then we'll learn how to get our scene graph to Storm and finally, how to produce an image for the application window.
So let's get started with loading our 3D models from USD. In viewDidAppear in our ViewController, we create a file picker with an NSOpenPanel when the view appears the first time.
Once a file has been selected, the renderer can start setting up our scene and load the USD file.
Loading the file is very simple with the USD API. We simply open the USD stage at the file path.
That's it. We have our file loaded.
Next, we set up our camera. In our code, this is straightforward. In setupCamera, we first create a new scene camera.
Then we calculate the world size and center of our scene. Next, we move the camera back to a good distance and focus it on the center. This way, our camera captures the whole scene.
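The world size and center come from calculateWorldCenterAndSize in the sample; a minimal sketch of that computation, assuming USD's UsdGeomBBoxCache:

// Compute the world-space bounding box of the whole stage.
UsdGeomBBoxCache bboxCache(UsdTimeCode::Default(),
                           {UsdGeomTokens->default_, UsdGeomTokens->render});
GfRange3d range =
    bboxCache.ComputeWorldBound(_stage->GetPseudoRoot()).ComputeAlignedRange();
_worldCenter = range.GetMidpoint();        // center of the scene
_worldSize = range.GetSize().GetLength();  // diagonal length as a size estimate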
Great, now we have our camera set up. Next, lighting. We keep it easy: we create one simple ambient light and set its position to match the camera, and that's it. With that, we have our full scene with 3D models, a camera, and a light! We can now pass our scene to Storm. First, we need to initialize our render engine. Here, we create a new UsdImagingGLEngine, which is the class name of Storm. The most important parameter here is the rootPath; it points the engine to our USD stage with our 3D models. To learn more about the other parameters and UsdImagingGLEngine, please check out Pixar's USD API documentation.
Next, we set our camera state and our lighting state in the engine. And last but not least, we define how we want Storm to render by setting render parameters. For example, we want a transparent background so that the rendered image works nicely with our app's color scheme. This is also where we specify the time code, which is important for scenes with animation.
Now we're ready to render an image! We've already done all the necessary setup, so the render command is just one line of code. We get the result and display it in our window, and there we are! We're rendering the USD toy plane! And actually, we're rendering not just one plane. HydraPlayer can easily render thousands of animated assets with tens of millions of triangles.
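For reference, a hedged sketch of that render call and the result lookup with the UsdImagingGLEngine API, inside the sample's drawWithHydraAt:viewSize: method:

// Render the stage from the pseudo-root with the parameters set up earlier.
_engine->Render(_stage->GetPseudoRoot(), params);

// Fetch the color AOV as a texture to display in the view.
HgiTextureHandle colorTexture = _engine->GetAovTexture(HdAovTokens->color);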
If you haven't already, check out the resources for this session, download the sample project, and have fun exploring it further.
Together we explored updates to Apple's USD tools that make them more powerful and make your assets look even better. We learned about Hydra's features and went through the HydraPlayer sample project to see how you can integrate it into your own applications.
To find out more about important concepts of USD, check out the session "Understand USD fundamentals."
Thank you. ♪ ♪
3:00 - Python usdzconvert --help
% python usdzconvert --help
usdzconvert 0.66
usage: usdzconvert inputFile [outputFile]
       [-h] [-version] [-f file] [-v]
       [-path path[+path2[...]]]
       [-url url]
       [-copyright copyright]
       [-copytextures]
       [-metersPerUnit value]
       // ...
       [-diffuseColor r,g,b]
       [-diffuseColor <file> fr,fg,fb]
       [-normal x,y,z]
       [-normal <file> fx,fy,fz]
       // ...
9:00 - Choosing lighting in usda metadata
// asset.usda
#usda 1.0
(
    customLayerData = {
        dictionary Apple = {
            int preferredIblVersion = 2
        }
    }
)
17:50 - Build USD + Hydra
// Rosetta
% arch -x86_64 /bin/zsh

// Download source code
% git clone https://github.com/PixarAnimationStudios/USD.git

// Build USD + Hydra
% python3 USD/build_scripts/build_usd.py --generator Xcode --no-python USDInstall
18:54 - Load USD ViewController
// AAPLViewController.mm
- (void)viewDidAppear
{
    NSOpenPanel* panel = [NSOpenPanel openPanel];
    panel.allowedContentTypes = @[UTTypeUSD, UTTypeUSDZ];
    [panel beginWithCompletionHandler:^(NSModalResponse result) {
        if (result == NSModalResponseOK)
        {
            NSURL* url = panel.URLs[0];
            [self->_renderer setupScene:[url path]];
        }
    }];
}

// AAPLRenderer.mm
- (bool)loadStage:(NSString*)filePath
{
    _stage = UsdStage::Open([filePath UTF8String]);
    // ...
}
19:30 - Create Scene Camera
// AAPLRenderer.mm
- (void)setupCamera
{
    _viewCamera = [[AAPLCamera alloc] initWithRenderer:self];
    [self calculateWorldCenterAndSize];
    [_viewCamera setDistance:_worldSize];
    [_viewCamera setFocus:_worldCenter];
}
19:54 - Create Scene Light
// AAPLRenderer.mm
GlfSimpleLight computeCameraLight(const GfMatrix4d& cameraTransform)
{
    // Derive the light position from the camera transform so the light
    // follows the camera.
    GfVec3f cameraPosition = GfVec3f(cameraTransform.ExtractTranslation());

    GlfSimpleLight light;
    light.SetPosition(GfVec4f(cameraPosition[0], cameraPosition[1], cameraPosition[2], 1));
    return light;
}
20:17 - Transport to Storm
// AAPLRenderer.mm
- (void)initializeEngine
{
    _engine.reset(new UsdImagingGLEngine(_stage->GetPseudoRoot().GetPath(),
                                         excludedPaths,
                                         SdfPathVector(),
                                         SdfPath::AbsoluteRootPath(),
                                         driver));
}

// AAPLRenderer.mm
- (HgiTextureHandle)drawWithHydraAt:(double)timeCode viewSize:(CGSize)viewSize
{
    _engine->SetCameraState(modelViewMatrix, projMatrix);
    _engine->SetLightingState(lights, _material, _sceneAmbient);

    UsdImagingGLRenderParams params;
    params.clearColor = GfVec4f(0.0f, 0.0f, 0.0f, 0.0f);
    params.frame = timeCode;
    // ...
}