AR Quick Look, meet Object Capture
Discover simple ways to bring your Object Capture assets to AR Quick Look while optimizing for visual quality and file size. Explore ways you can integrate AR Quick Look and Object Capture to help create entirely new experiences. To get the most out of this session, we recommend first watching “Advances in AR Quick Look” from WWDC19. You can also learn how to integrate Apple Pay and custom actions with AR on the web through “Shop online with AR Quick Look” from WWDC20.
Resources
- Adding an Apple Pay Button or a Custom Action in AR Quick Look
- AR Quick Look Gallery
- ARKit
- Search developer forums for AR Quick Look
Hi, everyone. Thank you for tuning in. My name is Jerry, and I'm excited to present to you how to bring Object Capture assets into AR Quick Look. As a reminder, AR Quick Look is the built-in, system-wide AR viewer for 3D models across iOS, available in Safari, Messages, Files, and more. Apps and websites can embed 3D models, allowing people to view and interact with them in their own environment.
This can be a great way to showcase a product, promote an event, or provide additional content in an immersive experience, such as this interactive demo for the Apple TV+ show "For All Mankind" where you can learn more about a cosmonaut space suit.
In this presentation, I'll be talking about how to use the Object Capture API to create 3D content for AR Quick Look, as well as some of the best practices to keep in mind, depending on your use case.
Then, I'll give a quick recap on how to integrate AR Quick Look in your app or website.
And finally, I'll show some exciting applications that AR Quick Look and Object Capture can help create.
Let's get started with the 3D content creation process for your AR Quick Look experience.
Previously, if you wanted to create 3D content for AR Quick Look, it required the use of a 3D-modeling software. However, they are usually expensive and can be difficult to approach for everyone. This year, we've introduced our new Object Capture API, built into RealityKit, as an alternative way to generate a USDZ file. It allows you to create a high-quality 3D model by using a collection of still images of the real-world object.
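Here's a minimal sketch of that workflow in code, assuming a folder of photos already on disk. PhotogrammetrySession is RealityKit's Object Capture API on macOS; the paths and the function name here are placeholders.

import Foundation
import RealityKit

// Generate a USDZ from a folder of photos using Object Capture.
func createModel() async throws {
    let imagesFolder = URL(fileURLWithPath: "/path/to/PotPhotos", isDirectory: true)
    let outputURL = URL(fileURLWithPath: "/path/to/Pot.usdz")

    let session = try PhotogrammetrySession(input: imagesFolder)

    // Request a single model file at the Reduced detail level.
    try session.process(requests: [
        .modelFile(url: outputURL, detail: .reduced)
    ])

    // Watch the session's output stream until processing finishes.
    for try await output in session.outputs {
        switch output {
        case .processingComplete:
            print("Saved model to \(outputURL.path)")
        case .requestError(_, let error):
            print("Reconstruction failed: \(error)")
        default:
            break // Ignore progress and other messages in this sketch.
        }
    }
}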
Object Capture does the heavy lifting to create a USDZ file that can be viewed directly in AR Quick Look. You can also use Reality Composer if you want to add interactive custom behaviors to your model. For example, you can add tap triggers and camera actions to your digital scene.
Together, these technologies now make it easier than ever for anyone to create an immersive AR experience. Let's see a demo of this workflow in action. I've just started an online shop selling handmade pottery, and here's one of the pots.
I'll show you how to create a 3D model of this pot and add behaviors so that customers can preview it on their own desk with a variety of succulents before buying. But first, let's see it in action.
Here is the virtual succulent placed on my desk. Let me go in for a closer look. You can see the geometric spirals in great detail.
OK, now let's preview a few other succulents. I'll tap on the green one... and it switches to the red one. Let's check out a few more. Now I'll tap again.
Oh, I really like this gray one! It really matches with the pot. I'm curious how it looks compared to the real one.
Wow, that looks very convincing, except the leaves have grown a lot bigger since I scanned this a few weeks ago.
Now, as a friendly reminder, Object Capture does not make your virtual plants grow automatically, but if you want, you can simulate that by changing the scale using Reality Composer. I think this would make a wonderful asset for my online store. Now let me walk you through how I created this from scratch.
The first step for me was to take photos of the succulent pot by itself, from all angles, on a neutral background.
I then used Object Capture to generate a USDZ model. Similarly, I repeated these steps for each of the succulent plants to generate three separate USDZ models. Now let me show you how to swap different succulent plants in the same pot by adding behaviors in Reality Composer.
I've already started a project where I've added the pot base and arranged the three succulents so they rest properly inside the pot. Now let's add the behaviors. First, I added a "hide on start" behavior so initially only the green succulent is shown. When someone taps on the green succulent, we want it to swap and show the red one. For this, I've added a tap trigger, along with "hide" and "show" actions. I then repeated these steps so we can swap from red to gray and from gray to green. And that's it! Now we just need to export the asset, and we can view it in AR Quick Look like we just saw earlier.
Now that we've just seen how easy it is to create content using Object Capture, let's talk about the best practices when exporting models for viewing in AR Quick Look.
When exporting a USDZ file using Object Capture, it is important to determine which detail setting to use.
Here's a table to illustrate the different characteristics of each setting. There's always a tradeoff between visual fidelity and file size. Exporting a USDZ with a small file size is an important consideration for AR Quick Look because it results in a shorter download time for users.

No matter which detail setting you choose, Object Capture will always use the same underlying algorithms to generate a high-quality reconstruction, but it applies varying levels of mesh decimation and texture down-sampling depending on the specified detail setting.

The Reduced and Medium detail settings offer a great balance between visual fidelity and file size. Therefore, when creating content for AR Quick Look, we recommend that you first export in both the Reduced and Medium detail settings. Then you can choose which to keep after evaluating both the visual differences and other important considerations, which I will talk about next.

As we've just seen, the Reduced detail setting will generate models with the smallest file size. This makes it ideal for web distribution, as it reduces the download time before your model can be viewed. This is also the recommended setting if you plan on combining multiple assets in a single scene, perhaps to showcase a collection of succulents. These attributes make the Reduced detail setting a great choice for the majority of use cases.
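If you're following the hypothetical PhotogrammetrySession sketch from earlier, both exports can be requested in a single call; reducedURL and mediumURL are placeholder destinations.

// Request both detail levels in one pass, then compare the two
// USDZ files before deciding which one to ship.
try session.process(requests: [
    .modelFile(url: reducedURL, detail: .reduced),
    .modelFile(url: mediumURL, detail: .medium)
])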
In certain situations where you notice a significant difference in visual quality between the Reduced and Medium settings, it may be more suitable to use the Medium detail setting. This is usually the case with a very complex object, one that would require you to take hundreds of images to reconstruct with Object Capture. For example, here I've taken photos of the succulent and pot base together. Keep in mind, however, that Medium detail will generate a USDZ with a larger file size, so it's best not to combine multiple such models together. Similarly, the Medium detail setting is more suitable for apps that ship with a local copy of the asset on the device, which eliminates download time.

In summary, we recommend you evaluate both detail settings and base your decision on your object and use case. Be sure to test your asset on a variety of iOS hardware to ensure device compatibility and performance.

With either detail setting, it's best to ensure quality at the source, so always take sharp images and avoid motion blur to get a good-quality reconstruction. Also, make sure to provide sufficient overlap of at least 70% between adjacent photos and fill the frame with the object. For more details, I encourage you to read the "Capturing Photographs" article and check out the "Object Capture" session.
Now that you know the best practices, let's recap how to integrate AR Quick Look. As you know, it's possible to embed AR Quick Look in your app or website with just a few lines of code. Let's go over a quick recap of the integration.
If you are embedding AR Quick Look in an app, you can use the Quick Look framework. Here we create a new QLPreviewController and assign "self" as its dataSource. Then we present it like a regular view controller.
Of course, we also have to implement the QLPreviewControllerDataSource protocol, which tells Quick Look how many previews to show and what each one is going to be.
Here we create an ARQuickLookPreviewItem with the URL to a local file on disk. Then we return it so the system knows to present AR Quick Look.
We also provide ways to customize the experience, such as disabling content scaling in AR mode so that people can always see the object at its true scale when placed into the world. To disable scaling in an app, set the preview item's allowsContentScaling property to false.
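For example, inside the data source method shown in the listing at the end of this page, you might set the property on the preview item; fileURL is a placeholder for a local USDZ on disk.

// Show the object at its real-world size; people can't pinch to rescale it.
let previewItem = ARQuickLookPreviewItem(fileAt: fileURL)
previewItem.allowsContentScaling = false
return previewItem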
If you are integrating the AR Quick Look experience on your website, you can add an a-tag snippet to your HTML, replacing the URL with your own model and image thumbnail. Be sure to include the rel="ar" attribute; this adds the AR badge icon to the thumbnail. It's possible to disable content scaling in AR when embedded on web pages too: append allowsContentScaling=0 as a URL fragment to the model's URL.
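A minimal example of such a snippet, with placeholder paths for the model and thumbnail:

<a rel="ar" href="/models/pot.usdz#allowsContentScaling=0">
  <img src="/images/pot-thumbnail.jpg">
</a>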
When embedded on websites, you can also surface Apple Pay and custom actions, like preorder, directly in AR Quick Look to allow customers to take the next step in your order flow.
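As a rough sketch, these are passed as URL fragment parameters on the model's URL; the parameter names come from the "Adding an Apple Pay Button or a Custom Action in AR Quick Look" article linked under Resources, and the values below are placeholders. A custom action banner uses a callToAction parameter instead of the Apple Pay ones.

<!-- Surface an Apple Pay button inside AR Quick Look (placeholder values). -->
<a rel="ar" href="/models/pot.usdz#applePayButtonType=buy&checkoutTitle=Handmade%20Pot&checkoutSubtitle=Ceramic%20planter&price=$39.99">
  <img src="/images/pot-thumbnail.jpg">
</a>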
So that's a quick summary of just some of the things you can do with AR Quick Look. For more details, I encourage you to check out our previous sessions.
Now let's talk about some real-world applications with AR Quick Look. With the announcement of Object Capture, we've now made the entire process of creating and distributing 3D content much more accessible for everyone. Object Capture does the heavy lifting of generating a USDZ file. Reality Composer makes it easy to create multi-asset scenes and add interactivity to models. AR Quick Look provides a great viewing experience for apps and websites on iOS and is also available on macOS.
Together, these technologies allow for the creation of both improved and new experiences in a variety of fields. Let's take a look at some examples for inspiration.

E-commerce is one of the most popular use cases for AR Quick Look, and for good reason. Viewing 3D models with AR Quick Look helps customers visualize products in their own space, whether it's shoes, photo frames, or furniture. Now, with Object Capture, you, as a retailer, can easily create 3D models of your product without needing prior 3D experience, like this Fragment Design x Air Jordan 3 sneaker that the team behind the GOAT app created using Object Capture. You can also take advantage of Reality Composer to add behaviors that enhance the viewing experience. For example, similar to the succulent demo, where I can swap between different succulent plants, you can make an app that allows customers to preview different outfits or purses.
Another use case is for museums. Historical artifacts are often stored in protective cases, making it difficult to view from all angles. However, by capturing the object in 3D, viewers can see and interact with a detailed virtual representation of the artifact on their own device. Not only does AR Quick Look allow you to tour museums remotely, but it can also enable new forms of in-person exhibits that can only be experienced in a virtual world.
You can also add voiceover audio to explain the 3D content or to help people who are blind or have low vision, like this Zemi figure from the Metropolitan Museum of Art. Let's have a listen. It was made around 1000 AD by a sculptor of the Taíno civilization of the Antilles archipelago in the Caribbean Sea, probably in what is now the Dominican Republic.
Another great use case of Object Capture and AR Quick Look is in the field of education. Diagrams and videos have historically been 2D, which can sometimes make it challenging to convey 3D concepts. Now, with augmented reality, we can teach in, well, 3D. Teachers can now create engaging lessons by leveraging 3D models generated with Object Capture. And technology can provide immediate feedback and interactivity so students can learn and discover at their own pace.
This doesn't just help teachers with lesson planning. Kids can also express their creativity in AR. For example, the Qlone app allows kids to create 3D models of their favorite toy or art project. It provides a convenient visual guide and automatically takes pictures as appropriate so kids can scan objects by themselves. Qlone is working on incorporating the Object Capture technology with seamless integration between their iOS and Mac apps, making it easier for kids to share their creations with friends and family.

I've invited the kids of our Apple colleagues to share some of the work that they've done. After capturing photos and generating 3D models with Object Capture, the kids then added digital interactions using Reality Composer. Let's check it out.
The first is a dinosaur costume that comes to life with audio. We also have a nice set of plates and cups displayed.
And here's a cool interaction where the speech bubbles between the puffer fish and octopi are always facing you.
Those are some really cool, creative art projects.
As you've just seen, anyone can now develop immersive AR experiences from scratch using a combination of Object Capture, Reality Composer, and AR Quick Look. Object Capture creates high-quality assets that are ready to be viewed in AR Quick Look. Reality Composer makes it easy to combine multiple assets into a single scene and add interactivity to models.
For more information, I encourage you to visit the AR Quick Look Gallery page for examples and check out the "Object Capture" session. And that's it. Thank you for watching, and enjoy the rest of WWDC.
8:02 - Integrating AR Quick Look in your app
// File: MyPreviewController.swift

import UIKit
import QuickLook
import ARKit // ARQuickLookPreviewItem is part of ARKit

func presentARQuickLook() {
    let previewController = QLPreviewController()
    previewController.dataSource = self
    present(previewController, animated: true)
}

// MARK: QLPreviewControllerDataSource

func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
    return 1 // A single model to preview
}

func previewController(
    _ controller: QLPreviewController,
    previewItemAt index: Int) -> QLPreviewItem {
    let previewItem = ARQuickLookPreviewItem(fileAt: fileURL) // Local file URL
    return previewItem
}