Bring Continuity Camera to your macOS app
Discover how you can use iPhone as an external camera in any Mac app with Continuity Camera. Whether you're building video conferencing software or an experience that makes creative use of cameras, we'll show you how you can enhance your app with automatic camera switching. We'll also explore how to recognize user-preferred and system-preferred cameras, take you through APIs for high-resolution and high-quality photo capture from iPhone's video stream, and more. To learn more about camera capture, watch "Discover advancements in iOS camera capture" from WWDC22.
♪ instrumental hip hop music ♪

Hi, my name is Karen Xing. I'm an engineer on the Camera Software team. Welcome to "Bring Continuity Camera support to your macOS app." To start this session, I will talk about what Continuity Camera is. Next, I will discuss how your application can build an automatic camera selection experience with Continuity Camera. And finally, I will walk through the APIs that are new in macOS 13 for Continuity Camera.

With Continuity Camera, you can now use iPhone as your webcam. Setup is seamless; just bring your iPhone close to your Mac. And it works wirelessly, so you can quickly join a call. Your iPhone will appear on your Mac as an external camera and microphone under several conditions. First, you must be running macOS 13 and iOS 16. Both Mac and iPhone must be signed into the same Apple ID using two-factor authentication. For a wired connection, the phone needs to be connected to the Mac over USB; for a wireless connection, the two devices need to be in proximity and have both Bluetooth and Wi-Fi turned on.

Rather than talking through it, let me show you right away how magical Continuity Camera looks on devices. Here I have a MacBook Pro and an iPhone 13 Pro. Both devices are signed in to the same Apple ID.
The phone is placed on a stand attached to my MacBook. I'll be joining a video conferencing call with my colleague Eric today to show you how we can use Continuity Camera in Zoom.
The app is launched using the built-in camera first, and then an onboarding dialog shows up describing what you can do with the new camera. The dialog shows up one time after your Mac is upgraded to macOS 13, when you open a camera application for the first time and there's an iPhone eligible for Continuity Camera.
Hi, Eric! Eric: Oh, Karen! Hi! Karen: After the onboarding dialog is shown on the system, Continuity Camera and microphone devices will become available in all applications.
Let's switch to use this camera and see how it looks.
Continuity Camera uses the rear camera system on the iPhone, so you get the same great video quality that you expect from iPhone. And it works with all four orientations of the phone.
The portrait orientation gives you a more zoomed-in field of view compared to landscape orientation.
Continuity Camera also lets you do things that were never before possible with a webcam, including several new video effects. You're probably already familiar with Center Stage and Portrait video effects introduced in iOS 14.5 and macOS 12.3. If not, I highly recommend watching the "What's new in Camera Capture" session from WWDC 2021 to learn more about system video effects and how to interact with them in applications. Let's go to Control Center and enable system video effects on Continuity Camera.
Center Stage keeps you in frame as you move around in the scene.
Portrait blurs the background and naturally puts the focus on you. Portrait was previously only supported on Apple silicon Macs, but with Continuity Camera, it is now available on both Intel and Apple silicon Macs.
Studio Light is a new system video effect available on macOS 13. It is supported by Continuity Camera when using iPhone 12 or newer. Enable this when you want to look your best on screen. It provides a stunning lighting effect that dims the background and illuminates your face. Studio Light is perfect for tough lighting situations, like when you're in front of a window. Even though I'm showing you each video effect separately for a clear comparison, all of them work well together.
And any combination of the effects can be applied at the same time.
Here's another exciting feature I really want to show you for Continuity Camera. When you want to work together and share what's on your desk, you can now use Desk View. The Desk View app comes with macOS 13 and can be launched right here in Control Center.
It works like an overhead camera setup, without needing all the complicated equipment. iPhone will split the Ultra Wide camera feed in two, showing off your desk and face both at the same time, so you can collaborate on a school project or teach a friend a knitting stitch. It leverages the extended vertical field of view of our Ultra Wide angle camera, applies perspective distortion correction onto cropped frames, and then rotates the frames to create this Desk View. You can use the share window function available in most video conferencing apps to share this Desk View feed, running in parallel with the main video camera feed.
Desk View can also be used alone, without streaming from the main camera at the same time. But when you do stream from both Desk View and the main camera, we recommend enabling Center Stage on the main camera for better framing that captures your face and body. The feature is supported when the phone is placed in either landscape or portrait orientation. The portrait orientation provides the most versatility, as there's a larger vertical field of view. There's also a Desk View camera API to provide customized integration suitable for your application. I will talk about the API in a moment.

During a video conferencing call on your Mac, we want you to focus on the session, but we also want to make sure you are not missing anything important. When Continuity Camera is in use, all notifications on your phone will be silenced, and important call notifications will be forwarded to your Mac. Bye, Eric! Eric: Bye, Karen!

Karen: We've just talked about all the great experiences available to users without writing a single line of new code in your application. But with some adoption of new APIs, you can make the Continuity Camera experience even more magical and polished in your app. Now that most users will have at least two camera devices on the Mac, we've thought more about how cameras should be managed. Prior to macOS 13, when a device was unplugged or a better camera became available on the system, a manual selection step was usually required in applications. We'd like to offer customers a magical experience by switching cameras automatically in applications. We've added two new APIs in the AVFoundation framework to help you build this behavior in your app: the class properties userPreferredCamera and systemPreferredCamera on AVCaptureDevice.

userPreferredCamera is a read/write property. You will need to set this property whenever a user picks a camera in the application. This allows the AVCaptureDevice class to learn the user's preference, store a list of cameras for each application across app launches and reboots, and use that information to suggest a camera. It also takes into account whether any camera becomes connected or disconnected. This property is key-value observable and intelligently returns the best selection based on user preference. When the most recently preferred device becomes disconnected, it automatically changes to the next available camera in the list. Even when there's no user selection history, or none of the preferred devices are connected, the property will always try to return a camera device that's ready to use, prioritizing cameras that have streamed before. It only returns nil when there's no camera available on the system.

systemPreferredCamera is a read-only property. It incorporates userPreferredCamera as well as a few other factors to suggest the best choice among the cameras present on the system. For example, this property will return a different value than userPreferredCamera when a Continuity Camera shows up signaling that it should be automatically chosen. The property also tracks device suspensions internally, so it prioritizes unsuspended devices over suspended ones. This is helpful for building automatic switching behavior that changes to another camera if the built-in camera gets suspended when the MacBook lid is closed. Continuity Camera signals that it should be automatically chosen when the phone is placed on a stationary stand in landscape orientation, its screen is off, and it is either connected to the Mac over USB or within close range of the Mac.
In this scenario, the user's intention is clear that the device should be used as Continuity Camera.
When adopting the systemPreferredCamera API, you should always key-value observe this property and update your AVCaptureSession's video input device accordingly to offer a magical camera selection experience. userPreferredCamera and systemPreferredCamera are already adopted by first-party applications. With more and more applications adopting these APIs, we will be able to provide customers a universal and consistent method of camera selection on Apple devices.
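In code, that pattern might look like the following sketch. The class name CameraSelector, its helper methods, and the string-based KVO on the AVCaptureDevice class object are illustrative assumptions, not code shown in the session; Apple's "Continuity Camera Sample" demonstrates the canonical approach.

```swift
import AVFoundation

/// A minimal sketch of automatic and manual camera selection.
final class CameraSelector: NSObject {
    let session = AVCaptureSession()
    private var activeInput: AVCaptureDeviceInput?
    private var observingSPC = false
    // systemPreferredCamera is a *class* property, so one way to key-value
    // observe it is classic string-based KVO on the class object itself.
    private let deviceClass = AVCaptureDevice.self as AnyObject as! NSObject

    // Automatic mode: follow systemPreferredCamera whenever it changes.
    func enterAutomaticMode() {
        guard !observingSPC else { return }
        deviceClass.addObserver(self, forKeyPath: "systemPreferredCamera",
                                options: [.initial, .new], context: nil)
        observingSPC = true
    }

    // Manual mode: stop following systemPreferredCamera; the user's
    // explicit pick drives the session directly.
    func enterManualMode() {
        guard observingSPC else { return }
        deviceClass.removeObserver(self, forKeyPath: "systemPreferredCamera")
        observingSPC = false
    }

    // In both modes, record every explicit user pick so the system can
    // learn the user's preference history.
    func userPicked(_ device: AVCaptureDevice) {
        AVCaptureDevice.userPreferredCamera = device
        if !observingSPC { switchSession(to: device) }
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?,
                               change: [NSKeyValueChangeKey: Any]?,
                               context: UnsafeMutableRawPointer?) {
        guard keyPath == "systemPreferredCamera" else { return }
        if let device = AVCaptureDevice.systemPreferredCamera {
            switchSession(to: device)
        }
    }

    private func switchSession(to device: AVCaptureDevice) {
        guard activeInput?.device != device,
              let input = try? AVCaptureDeviceInput(device: device) else { return }
        session.beginConfiguration()
        if let current = activeInput { session.removeInput(current) }
        if session.canAddInput(input) {
            session.addInput(input)
            activeInput = input
        }
        session.commitConfiguration()
    }
}
```

Note that in auto mode a user's pick only sets userPreferredCamera; the KVO callback then follows the resulting systemPreferredCamera change, so there is a single code path that swaps the session's input.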
Let me show you a demo of what automatic switching with Continuity Camera looks like in FaceTime. Here in FaceTime, I'm in the Automatic Camera Selection mode. For applications that want to offer both manual and automatic behavior, we recommend adding a new UI for enabling and disabling auto mode.
FaceTime is currently streaming from the built-in camera. When I pick up the phone from the desk and place it on a stand behind the MacBook...
...FaceTime switches to stream from the Continuity Camera seamlessly. That is where the new class property systemPreferredCamera comes in: the property's value changes to Continuity Camera when the phone is in a position ready to stream. You might want to build your application in a similar way. Here's my recipe for implementing Automatic Camera Selection and manual selection modes.

When Automatic Camera Selection is on, start key-value observing the systemPreferredCamera property, and follow it whenever it changes by updating your session's input device. In auto mode, we highly recommend still providing options to let users pick a camera themselves. When a different camera gets picked, set userPreferredCamera to that device, which then gets reflected in the systemPreferredCamera property's value.

When Automatic Camera Selection is off, stop key-value observing the systemPreferredCamera property. Instead of following systemPreferredCamera, update the session's input device with the user-picked camera in manual mode. But just as in auto mode, you still need to set the userPreferredCamera property every time a user picks a different camera, so the user's history of preferred cameras is maintained and the right camera can be suggested when returning to Automatic Camera Selection mode. For best practices on how to incorporate the userPreferredCamera and systemPreferredCamera APIs, please check out the new sample app, "Continuity Camera Sample."

Besides bringing a magical webcam experience to the Mac, Continuity Camera also presents you with new opportunities to harness the power of iPhone-specific camera features in your Mac app. We've added a few AVCapture APIs on macOS 13 to help applications better utilize Continuity Camera devices. We're bringing the amazing quality of iPhone photo capture to macOS, thanks to Continuity Camera.

First off, we support capturing high-resolution photos. Previously, macOS only supported photo capture at video resolution. Starting with macOS 13, you will be able to capture up to 12-megapixel photos with Continuity Camera. Enable this by first setting highResolutionCaptureEnabled to true on the AVCapturePhotoOutput object before starting the capture session, and then setting the highResolutionPhotoEnabled property to true on your photo settings object for each capture.

In addition to capturing high-res photos, Continuity Camera supports controlling how photo quality should be prioritized against speed: first set the maximum photo quality prioritization on the photo output object, then choose the prioritization for each capture by setting the photoQualityPrioritization property on the AVCapturePhotoSettings object. To learn more about choosing the right prioritization for your application, please check out "Capture high-quality photos using video formats" from WWDC 2021.

Another photo-related feature is flash capture. You can now set flashMode on your photo settings object to control whether the flash should be on, off, or automatically chosen based on the scene and lighting conditions.
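Putting those three pieces together, a capture flow might look like this sketch. It assumes the session's video input is already the Continuity Camera; the PhotoReceiver class name and top-level structure are illustrative.

```swift
import AVFoundation

// Assumed: `session` already has the Continuity Camera as its video input.
let session = AVCaptureSession()
let photoOutput = AVCapturePhotoOutput()

session.beginConfiguration()
if session.canAddOutput(photoOutput) {
    session.addOutput(photoOutput)
}
// Opt in to high-resolution (up to 12 MP) capture before the session runs.
photoOutput.isHighResolutionCaptureEnabled = true
// Set the ceiling for per-capture quality prioritization.
photoOutput.maxPhotoQualityPrioritization = .quality
session.commitConfiguration()
session.startRunning()

// Per-capture settings: high-res, quality over speed, automatic flash.
let settings = AVCapturePhotoSettings()
settings.isHighResolutionPhotoEnabled = true
settings.photoQualityPrioritization = .quality
settings.flashMode = .auto  // .off, .on, or .auto

// Minimal delegate that receives the finished photo.
final class PhotoReceiver: NSObject, AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, let data = photo.fileDataRepresentation() else { return }
        // Persist or display the encoded image data here.
        _ = data
    }
}

let receiver = PhotoReceiver()
photoOutput.capturePhoto(with: settings, delegate: receiver)
```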
We are also making AVCaptureMetadataOutput available on macOS to allow processing timed metadata produced by a capture session. You can now stream face metadata objects and human body metadata objects from iPhone. Let's go through how to set up a session to receive face metadata objects. After setting up the session with the proper video input and output, create an AVCaptureMetadataOutput and call addOutput to add it to the session. To receive face metadata in particular, set the object types array on the output to include the face object type. Make sure the requested metadata types are supported by checking the availableMetadataObjectTypes property. Then set up the delegate to receive metadata callbacks. After the session starts running, you will get callbacks with face metadata objects produced in real time.
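Here's a minimal sketch of that setup, assuming the session's video input is already configured; the FaceReceiver class name is illustrative.

```swift
import AVFoundation

// Delegate that receives metadata objects in real time while the session runs.
final class FaceReceiver: NSObject, AVCaptureMetadataOutputObjectsDelegate {
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        for face in metadataObjects.compactMap({ $0 as? AVMetadataFaceObject }) {
            print("Face \(face.faceID) at \(face.bounds)")
        }
    }
}

// Assumed: `session` already has the Continuity Camera as its video input.
let session = AVCaptureSession()
let metadataOutput = AVCaptureMetadataOutput()
let receiver = FaceReceiver()

session.beginConfiguration()
if session.canAddOutput(metadataOutput) {
    session.addOutput(metadataOutput)
}
// Only request metadata types the connected camera actually supports.
if metadataOutput.availableMetadataObjectTypes.contains(.face) {
    metadataOutput.metadataObjectTypes = [.face]
}
metadataOutput.setMetadataObjectsDelegate(receiver, queue: .main)
session.commitConfiguration()
session.startRunning()
```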
Besides AVCapturePhotoOutput and AVCaptureMetadataOutput, Continuity Camera also supports video data output, movie file output, and AVCaptureVideoPreviewLayer. Here are the video formats supported by Continuity Camera that you'll want to be aware of when integrating this camera into your application: three 16-by-9 formats -- from 640 by 360 to 1080p -- and one 4-by-3 format, 1920 by 1440. You can choose between formats supporting up to 30 frames per second or up to 60 frames per second, based on your needs.

Another major addition is the Desk View device API. The Desk View camera is exposed as a separate AVCaptureDevice. There are two ways you can find this device. The first is by looking up the deskViewCamera device type in a device discovery session. Alternatively, if you already know the AVCaptureDevice object of the main video camera, you can use the companionDeskViewCamera property on that device to access its Desk View device. This API is helpful for pairing the main camera and Desk View device when there are multiple Continuity Camera devices around. Once you have the AVCaptureDevice object of the desired Desk View camera, you can use it with a video data output, movie file output, or video preview layer in the capture session, just as you can with other camera devices. The Desk View device currently supports one streaming format in the 420v pixel format, with a resolution of 1920 by 1440 and a maximum frame rate of 30 fps.
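Both lookup paths might look like this in practice. This is a sketch; starting from systemPreferredCamera is just one convenient way to get a main-camera device.

```swift
import AVFoundation

// 1) Discover Desk View cameras directly by device type.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.deskViewCamera],
    mediaType: .video,
    position: .unspecified
)
let anyDeskView = discovery.devices.first

// 2) Start from a known main camera and ask for its companion device,
// which keeps the pairing correct when multiple iPhones are nearby.
if let mainCamera = AVCaptureDevice.systemPreferredCamera,
   let deskView = mainCamera.companionDeskViewCamera {
    // Use `deskView` like any other camera: as the device of an
    // AVCaptureDeviceInput feeding a video data output, movie file
    // output, or AVCaptureVideoPreviewLayer.
    print("Paired Desk View camera: \(deskView.localizedName)")
}
```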
This is the end of the session. You learned about Continuity Camera, how to build magical camera selection on macOS, and a handful of new APIs for integrating Continuity Camera into your Mac application. I'm excited to see you adopt all these APIs. Have a great rest of WWDC! ♪