Reaching the Big Screen with AirPlay 2
AirPlay lets you share videos, photos, music, and more from Apple devices to your Apple TV, favorite speakers, and popular smart TVs. Learn how to deliver the best possible experience for AirPlay video, including automatic route selection for long form content, remote control, now playing metadata, and video quality considerations.
Resources
- Becoming a now playable app
- FairPlay Streaming
- Incorporating Ads into a Playlist
- Integrating AirPlay for Long-Form Video Apps
- Presentation Slides (PDF)
All right. Good morning, everyone.
My name is Jonathan Bennett and I work with the Connected Media team at Apple.
And I'm happy to be here today helping kick off the conference, talking with you about AirPlay.
So to start, let's talk about what you can do with AirPlay.
You can use AirPlay to wirelessly send a video to watch on your TV or listen to music throughout your home.
And you can also share pretty much anything from your iOS or macOS screen using AirPlay Mirroring.
And so for today's talk, we're going to focus on video, and we'll go over some of the enhancements we've made, both to the TV destinations you can play to and to the AirPlay experience on sending devices.
So when it comes to destinations, Apple TV used to be the only way that you could AirPlay a video to a TV, and it's still a great choice for many people. And as you heard earlier this year, we've been working with some of the largest TV manufacturers to build AirPlay support directly into the TVs. This dramatically increases the choices people have on where they can AirPlay their video to, and it adds a huge number of TVs that you can now take advantage of.
So just look for this label on the box and you'll know that it supports AirPlay Mirroring, multi-room music and, of course, video. And this is a high-quality video experience, with both Apple TV and AirPlay-capable TVs able to support up to 4K HDR video with surround sound. So those are some of the great new additions that we've made to TVs. And when it comes to AirPlay sending devices, I'd like to go over some of the enhancements we've made to iOS to make it easier than ever for people to get the content they love from your app to the TV. So first, your video app can now sort TV destinations to the top of the AirPlay list, making it easier than ever to find the TV you want to play a video on.
We've also made enhancements and enriched the remote controls available to allow people to interact with your video quickly and easily. And we've also made improvements to allow people to better multitask on their iOS devices so they can receive a phone call or check out that cute video that they just got from an iMessage while your video continues playing on the TV.
And lastly, iOS can now intelligently suggest a TV to AirPlay to, using on-device learning. So I can just sit back on the couch, launch your video app, and just tap on my favorite show and have it automatically show up on the TV. With great new TV destinations and all of these experience improvements, now is the time to make sure that your video apps fully support AirPlay.
So I'd like to take the rest of this talk to go over five key areas where you can add this great experience.
First, we'll make sure that you have high-quality video available for the TV.
Then we'll provide a way for people to get that video onto the TV using an in-app picker. And once it's on the TV, we'll make sure that people can control it and we'll make sure that that video keeps on playing while they use their iOS device for other things. And lastly, if you have an app that plays primarily long-form video like movies or TV, there are some additional steps that you should take to make sure that your users benefit from intelligent AirPlay Suggestions. So let's start by talking about high-quality video.
HTTP Live Streaming is a fast and adaptive way to stream high-quality video to any device. And you're likely already using HLS to deliver video to your iOS apps.
Now, some of you might be curating your playlists to focus on those types of devices, like an iPhone screen or an iPad screen.
But remember, at any time a user could take that video and want to AirPlay it to a TV. So it's important that you have a full range of variants in your playlist that are appropriate for a TV, including variants that support 4K, HDR and surround sound.
Now, if in your playlist you have multiple video codecs that you use to encode your content, be sure to also provide a full range of variants for each video codec. This makes sure that your video is fully supported on any device it plays back on and it also makes sure that the player can transition smoothly between variants using the same video codec.
Now, one thing that I find myself doing frequently is fast forwarding or seeking through video. And whether that's to skip a scary part of a movie that I'm watching with my kids or to find that interesting topic in the middle of a WWDC video, having fast frame previews while seeking allows me to find what I want quickly and confidently. So you can add that support to your videos by including an I-frame variant in your HLS playlist. And lastly, if you have high-value content that you need to protect and manage, you're likely using FairPlay Streaming to do that. And FairPlay Streaming is fully supported whenever and wherever you AirPlay, and that includes AirPlaying to an Apple TV. And I'm also happy to say that FairPlay Streaming is supported on every smart TV with built-in AirPlay support. Now, there are more details about validating your HLS streams for AirPlay than we can cover here today, so we've made a separate video associated with this session and I encourage you to check that out. So, having high-quality content is great, but we also want to make sure that we have smooth playback.
Now, if you have pre-roll, mid-roll, or post-roll content that you need to insert into your video, use HLS's discontinuity tag to do that. This allows the player and its bandwidth estimates to remain intact, so your interstitial video plays in high quality, and when it ends and you resume the main content, that content can immediately begin in high quality as well. It's, again, important to make sure that these interstitial videos have the same video formatting as the main content, such as the same video codec, and this allows these transitions to happen quickly and smoothly. Now, if you can't use HLS's discontinuity tag for some reason, or you have multiple pieces of content that you want to play serially, we encourage you to use AVQueuePlayer when AirPlaying.
And if you can't use AVQueuePlayer when AirPlaying, please use the same AVPlayer instance and just replace the current item with your new content. You don't want to destroy an AVPlayer and create a new one for every piece of video or swap between instances, because when AirPlaying, the system doesn't know that a new piece of video is going to be arriving. And so it might start tearing down the engine and you don't want to introduce this type of delay just because you're transitioning between videos.
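To make that concrete, here's a minimal sketch of both approaches; the class and its method names are just placeholders for illustration.

```swift
import AVFoundation

final class PlaybackController {
    // Keep one player for the whole session rather than recreating it per video.
    let player = AVPlayer()

    // Preferred when you know the upcoming items: queue them so the system
    // can keep the AirPlay session primed between videos.
    let queuePlayer = AVQueuePlayer()

    func playSerially(urls: [URL]) {
        queuePlayer.removeAllItems()
        for url in urls {
            // Passing nil appends each item to the end of the queue.
            queuePlayer.insert(AVPlayerItem(url: url), after: nil)
        }
        queuePlayer.play()
    }

    // If a queue isn't an option, swap the item on the existing player
    // instead of tearing the player down and creating a new one.
    func play(url: URL) {
        player.replaceCurrentItem(with: AVPlayerItem(url: url))
        player.play()
    }
}
```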
All right, we've got high-quality content. Now, let's give a way for people to get that to the TV.
And you can do that by adding an AirPlay picker into your app.
And new in iOS 13, you can decide how you want to order those destinations. To add an AirPlay picker, use AVRoutePickerView. It's been available since iOS 11 and it's now available on macOS.
Now, some of you might be using MPVolumeView's route button properties to get a picker in your app. And in iOS 13, the route button properties are now deprecated.
Now, don't worry, you can still use MPVolumeView for putting volume UI into your app. But when it comes to route buttons, use AVRoutePickerView.
So let's see how you can add that, and it's really simple. All I have to do is create an AVRoutePickerView and add it to my view hierarchy.
And in iOS 13, I can set prioritizesVideoDevices to true and that will give me a route button icon that is focused on video with the AirPlay video logo, and it also sorts those TV destinations to the top making it super easy to find that TV I want to play to.
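As a rough sketch in a UIKit app (the container view and frame are placeholders):

```swift
import AVKit
import UIKit

func addAirPlayButton(to containerView: UIView) {
    // AVRoutePickerView has been available since iOS 11, and is now on macOS too.
    let routePickerView = AVRoutePickerView(frame: CGRect(x: 0, y: 0, width: 44, height: 44))

    // New in iOS 13: show the AirPlay video icon and sort TV destinations
    // to the top of the picker.
    routePickerView.prioritizesVideoDevices = true

    containerView.addSubview(routePickerView)
}
```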
So now that I have video playing on the TV, let's make sure that people have a way to control it and this is really important. And so we've made sure that remote controls are more convenient and capable than ever throughout the system.
For example, I can control my video with lock screen controls and we've added additional capabilities when you've targeted a TV. I can also use Siri and issue play commands and say, you know, play this on my TV.
And it's not just about controlling what's on your iOS device but also the devices around you. For example, you can pull down Control Center and target HomePods that are playing music throughout your home, or even a tvOS app that is playing a video, and see what's playing and control it from almost any device.
So, what happens if your app doesn't support remote controls? Well, here I have my app and it's AirPlaying to the TV. But if I were to lock the screen, I wouldn't know that the video is playing and, more importantly, I would have no way to interact with that video.
So by supporting remote controls, people can see what's playing and how far along they are in that video, they get a great way to interact with it quickly, and there's even a way to get back to the app that's currently playing that video by just tapping the artwork or the title. So, let's go through how you can add these remote controls in your app, and it starts with becoming the Now Playing app. You become the Now Playing app by doing two things: one is to support at least one remote control command, and the other is to indicate active playback with the appropriate configuration.
And on iOS, that means activating your audio session with a non-mixable category. And for most video apps, that's going to be the playback category. On macOS, and for iOS code running on the Mac, you'll do something slightly different: you're going to grab the MPNowPlayingInfoCenter and change the playback state whenever you start or end your video experience. When it comes to supporting remote controls, if you're using AVKit's AVPlayerViewController to present your video, it handles remote controls for you, so you're all set.
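As a rough sketch, indicating playback on each platform might look like this; the function name is a placeholder, and the exact category, mode, and error handling should match what your app needs:

```swift
import MediaPlayer
#if os(iOS)
import AVFoundation
#endif

// Call this when the user starts playback of a primary piece of content.
func indicateActivePlayback() throws {
    #if os(iOS)
    // iOS: activating a non-mixable session (the playback category) is what
    // lets your app become the Now Playing app.
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .moviePlayback)
    try session.setActive(true)
    #elseif os(macOS)
    // macOS: reflect playback state explicitly on the Now Playing info center.
    MPNowPlayingInfoCenter.default().playbackState = .playing
    #endif
}
```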
If you're using AVFoundation and a custom UI to present your video, you'll need to use MPRemoteCommandCenter and MPNowPlayingInfoCenter in order to get equivalent functionality. And so I'd like to spend some time going step by step through how you use these MediaPlayer APIs to build out this functionality. So here I am back at my lonely lock screen, and let's just add some basic support by supporting the play command. I can do that by creating a setup method for supporting remote controls. And we'll be coming back to this method again and again as we continue to build out our support. I first grab the shared instance of the remote command center. And the remote command center supports many different types of media commands. Here, we're supporting the play command. And to that, we'll add a target handler to receive that command and do something with it. So here, we'll start our playback. And once I've done that, I'll return a status on whether that command succeeded or failed.
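Here's a sketch of that setup method; the player parameter stands in for whatever drives playback in your app:

```swift
import AVFoundation
import MediaPlayer

func setupRemoteCommands(for player: AVPlayer) {
    let commandCenter = MPRemoteCommandCenter.shared()

    // Support the play command, and report back whether we handled it.
    commandCenter.playCommand.addTarget { _ in
        guard player.currentItem != nil else { return .noSuchContent }
        player.play()
        return .success
    }
}
```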
And this is the basic structure you're going to use for any remote command that you support. So to flesh out our playback command handling, we'll support the pause command and the togglePlayPauseCommand by adding handlers for those. So this got me a platter with a play/pause button and I at least know what app is playing, but it'd be really great to know what video is playing. And you can do that by providing Now Playing info.
And Now Playing info comes in two parts: one is the item metadata, such as the artwork or title, and the other is the playback state, such as the duration and the current elapsed time. So in my code, there are two major events where I'll want to update this information. One is when an item changes, so going from episode one to episode two.
The other is when I have a current item and the playback state changes, so my rate changes or I seek to a new position in the video. So when an item changes, I'm going to take the metadata that I've packaged up in a structure just to pass it around through my app, and I'm going to update that static Now Playing info such as the title and the artwork.
And if I have it, I'm going to update the playback info at this time as well.
Now, when a playback state change happens, all I need to do is update that playback info. I don't need to update artwork and spend time doing all of that. So let's see how you can update some of this information. And the first thing I'm going to do is grab the Now Playing info dictionary from the shared info center. And to this dictionary, I'm going to update my metadata. So here, I'm updating title and artwork. And there are many other types of metadata that are supported and that I might support in my app.
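A sketch of that metadata update, assuming a simple ItemMetadata struct of your own that carries a title and optional artwork:

```swift
import MediaPlayer
import UIKit

// A stand-in for however your app packages up item metadata.
struct ItemMetadata {
    let title: String
    let artwork: UIImage?
}

func updateNowPlayingMetadata(_ metadata: ItemMetadata) {
    let infoCenter = MPNowPlayingInfoCenter.default()
    var nowPlayingInfo = infoCenter.nowPlayingInfo ?? [:]

    // Update (or clear) every property you support, so stale values from the
    // previous item never linger in the dictionary.
    nowPlayingInfo[MPMediaItemPropertyTitle] = metadata.title
    if let image = metadata.artwork {
        nowPlayingInfo[MPMediaItemPropertyArtwork] =
            MPMediaItemArtwork(boundsSize: image.size) { _ in image }
    } else {
        nowPlayingInfo[MPMediaItemPropertyArtwork] = nil
    }

    // Hand the dictionary back to the info center for presentation.
    infoCenter.nowPlayingInfo = nowPlayingInfo
}
```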
Regardless of what I support, it's important to update this metadata or clear it out, nil it out if a particular item doesn't have that type of property. You don't want a mix of old and new metadata in the dictionary. Once I've updated it, though, I just pass it back to the info center for presentation. So, it's really simple to update Now Playing information. And now, let's see how we update the playback info.
And it's done very similarly. I grab the Now Playing info dictionary, and to that, I'm going to update the duration, the elapsed playback time, and the playback rate.
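And a matching sketch for those dynamic values, again with the player as a stand-in:

```swift
import AVFoundation
import MediaPlayer

func updateNowPlayingPlaybackInfo(for player: AVPlayer) {
    guard let item = player.currentItem else { return }

    let infoCenter = MPNowPlayingInfoCenter.default()
    var nowPlayingInfo = infoCenter.nowPlayingInfo ?? [:]

    // Only these three values need refreshing on significant changes;
    // the system interpolates progress in between.
    nowPlayingInfo[MPMediaItemPropertyPlaybackDuration] = item.duration.seconds
    nowPlayingInfo[MPNowPlayingInfoPropertyElapsedPlaybackTime] = item.currentTime().seconds
    nowPlayingInfo[MPNowPlayingInfoPropertyPlaybackRate] = player.rate

    infoCenter.nowPlayingInfo = nowPlayingInfo
}
```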
And now, I only need to update this on significant change events. I don't need to update this every second to see progress through the remote UIs. As long as I update these three items when a rate changes or I have a significant jump, then the system can infer where you are in your video at any given moment and update the remote UIs accordingly. And again, once I've updated this, I'll just pass it back to the info center to get presented. So I know what video is playing, and it's really nice to be able to navigate through that video. So you can add support for seeking with a playhead scrubber, or skipping forward and backward in your video. So let's see how we do this.
So we're back in our method for setting up remote control commands. And I'm going to add support for the skip forward command. And this is one of the commands that supports additional configuration, and here, I'm setting the preferred skip interval to 15 seconds. I then add a handler for that command.
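Sketched out, that might look something like this (the seek logic is simplified, and the player is again a stand-in):

```swift
import AVFoundation
import CoreMedia
import MediaPlayer

func setupSkipForwardCommand(for player: AVPlayer) {
    let commandCenter = MPRemoteCommandCenter.shared()

    // Ask for a 15-second skip button; this is only a preference.
    commandCenter.skipForwardCommand.preferredIntervals = [15]

    commandCenter.skipForwardCommand.addTarget { event in
        guard let skipEvent = event as? MPSkipIntervalCommandEvent,
              player.currentItem != nil else { return .commandFailed }

        // Honor the interval delivered with the event; Siri may ask for a
        // different amount than the preferred 15 seconds.
        let target = player.currentTime() + CMTime(seconds: skipEvent.interval,
                                                   preferredTimescale: 600)
        player.seek(to: target)
        return .success
    }
}
```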
And it's important to note here that I'm using the interval that's passed to me through the command's event. Just because I set a preferred interval doesn't mean that's going to be used every time. For example, I could ask Siri to jump forward by 42 seconds. And that's the value that's going to get passed to me and the one that I should honor. I'll then round out my support by supporting the skip backward command and the change playback position command, which is used when seeking. Now, for some apps and some types of video, you might need to temporarily disable navigation, and you can easily do that by setting the isEnabled property on the commands that you need to disable. You don't need to remove the target handler and add it back.

So one thing new this year that we've added to remote control UIs is the ability to select and change audio languages and subtitles. So if you have video with these types of options, you're going to want to add this type of support for remote controlling. This allows people to watch the video the way they want and need easily. So you can add this support by providing a list of audible and legible language options that represent the audio languages and subtitles. And you can also then identify the current selections in each of those groups. And lastly, you're going to want to handle changes to those selections via the remote UIs.

So let's see how we provide this information. And it starts by creating a mapping. So I'm going to load the audible and legible language groups from my AVAsset. And I'm going to create a mapping of these by translating them into the MediaPlayer objects that represent these options. And I can use convenience functions to do that. And I'll maintain this mapping in my app so that when I receive a language option from a remote command, I can eventually translate it back into the AVFoundation object and set it on my AVPlayerItem to change the selection.
So once I have this mapping, I can update that info. And here, I'm doing that in the update method for my playback info, where I've already added the playback duration and elapse time.
And to that, I'm going to add the list of available option groups and their current selections. So now, we have to handle changes to those selections. And I can do that in my setup method for remote controls, where I'll add support for the enableLanguageOptionCommand.
And this handler is going to first get the current playback item.
And then use my mapping to take the language option that I received and translate it to the AVFoundation objects.
I'll then use AVPlayerItem's select option in group method to select the new option on the current item. And I'll want to update my Now Playing info and other internal state and handle that playback change. Now, there are some types of option groups that could have nothing selected. For example, turning off subtitles and closed captions. And this comes through with the disableLanguageOptionCommand. So you want to make sure to support both the enable and disable commands.
And the handler is very similar, first, getting the current item and using my mapping to find the group.
But now, I'm going to check that the group allows empty selection. And if it does, I'm going to use the same select method but pass nil as the option, and this will deselect all the options in that group.
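Putting those pieces together, here's a condensed sketch. The controller class, the way it correlates the MediaPlayer and AVFoundation objects, and the names are all illustrative; the sample project associated with this session covers the full flow.

```swift
import AVFoundation
import MediaPlayer

final class LanguageOptionController {
    private let playerItem: AVPlayerItem

    // Keep the AVFoundation groups alongside their MediaPlayer representations
    // so a remote command's MPNowPlayingInfoLanguageOption can be mapped back.
    private var groups: [(avGroup: AVMediaSelectionGroup,
                          mpGroup: MPNowPlayingInfoLanguageOptionGroup)] = []

    init(playerItem: AVPlayerItem) {
        self.playerItem = playerItem
        let asset = playerItem.asset
        // In a real app, load the asset's media selection keys asynchronously first.
        for characteristic in [AVMediaCharacteristic.audible, .legible] {
            if let avGroup = asset.mediaSelectionGroup(forMediaCharacteristic: characteristic) {
                groups.append((avGroup, avGroup.makeNowPlayingInfoLanguageOptionGroup()))
            }
        }
        setupLanguageCommands()
    }

    // Add the available groups and the current selections to the playback info.
    func addLanguageOptions(to nowPlayingInfo: inout [String: Any]) {
        nowPlayingInfo[MPNowPlayingInfoPropertyAvailableLanguageOptions] = groups.map { $0.mpGroup }
        nowPlayingInfo[MPNowPlayingInfoPropertyCurrentLanguageOptions] = groups.compactMap {
            playerItem.currentMediaSelection.selectedMediaOption(in: $0.avGroup)?
                .makeNowPlayingInfoLanguageOption()
        }
    }

    private func setupLanguageCommands() {
        let commandCenter = MPRemoteCommandCenter.shared()

        commandCenter.enableLanguageOptionCommand.addTarget { [weak self] event in
            guard let self = self,
                  let languageEvent = event as? MPChangeLanguageOptionCommandEvent,
                  let (avGroup, avOption) = self.match(languageEvent.languageOption) else {
                return .noSuchContent
            }
            self.playerItem.select(avOption, in: avGroup)
            return .success
        }

        commandCenter.disableLanguageOptionCommand.addTarget { [weak self] event in
            guard let self = self,
                  let languageEvent = event as? MPChangeLanguageOptionCommandEvent,
                  let (avGroup, _) = self.match(languageEvent.languageOption),
                  avGroup.allowsEmptySelection else {
                return .commandFailed
            }
            // Passing nil deselects every option in the group (e.g. subtitles off).
            self.playerItem.select(nil, in: avGroup)
            return .success
        }
    }

    // Simplified correlation: match on display name. A real app would maintain a
    // more robust mapping between its MediaPlayer and AVFoundation option objects.
    private func match(_ mpOption: MPNowPlayingInfoLanguageOption)
        -> (AVMediaSelectionGroup, AVMediaSelectionOption)? {
        for (avGroup, _) in groups {
            if let avOption = avGroup.options.first(where: { $0.displayName == mpOption.displayName }) {
                return (avGroup, avOption)
            }
        }
        return nil
    }
}
```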
So I've got a full set of remote controls that I can support. When do I register and deregister for these? For most video apps, you're going to register when you have content to play and that's when a user selects a video in your app.
And you also want to deregister when you have nothing to play, so when a user exits your video. And you'll use that opportunity to also clear out the Now Playing info dictionary to make sure that there's no stale data hanging around. Now, I mentioned that it's not just about controlling what's playing from your device but the devices around you. So if you have a tvOS app that you make, try playing some content on Apple TV, pull down Control Center on an iPhone, and target that Apple TV. Check that you're seeing the video information that you expect and that you're able to control it as you expect. The same goes for using Siri to control your TV.
So the examples that I went through just now are all part of a sample project that we've made available associated with this session. And I encourage you to check that out.
All right. We got high-quality content that we can get on the TV and we can control it. When it's playing on the TV, it's really nice to be able to use my phone or my iPad for other things.
And there are two parts to supporting a great multitasking experience. One is to make sure that your video keeps on playing while the user multitasks. And the other is to make sure that if there's already video playing on the TV that your app and navigating through its UI doesn't accidentally interrupt that experience.
So, when I'm watching a video on my phone, my video app is usually staying in the foreground because I need to look at it on the screen. But when I'm AirPlaying on the TV, I'm now free to use my iPhone for other things like check my messages or mail, or social media.
If I'm using a video app that doesn't support background media, the video is going to stop the moment I leave that app. And so we want to make sure that your video can continue playing. And you can do this by supporting the audio background mode in your app's Info.plist.
And you'll also configure your AVAudioSession for background playback. For most video apps, that means using the playback category. And this provides media-appropriate behaviors in your app, including playing in the background. Another thing about the playback category is that it'll interrupt other media. So it's really important to make sure that you only activate your audio session when a user presses play and begins playback on a primary piece of content in your app. Now, once you go active, you can stay active.
There's no need to deactivate your audio session just because the user paused the video. Now, media interruptions can happen at any time. And so, it's important to listen to those notifications on AVAudioSession.
And when an interruption begins, you update your UI and internal state. And when it ends, check the notification's payload, and make sure that you check the shouldResume key and honor that value. So, only automatically resume your video if it returns true.
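A sketch of that observation, with the actual pause and resume behavior left as placeholder closures for your own playback code:

```swift
import AVFoundation

final class InterruptionObserver {
    var pausePlayback: () -> Void = {}
    var resumePlayback: () -> Void = {}
    private var token: NSObjectProtocol?

    init() {
        token = NotificationCenter.default.addObserver(
            forName: AVAudioSession.interruptionNotification,
            object: AVAudioSession.sharedInstance(),
            queue: .main
        ) { [weak self] notification in
            self?.handle(notification)
        }
    }

    private func handle(_ notification: Notification) {
        guard let info = notification.userInfo,
              let typeValue = info[AVAudioSessionInterruptionTypeKey] as? UInt,
              let type = AVAudioSession.InterruptionType(rawValue: typeValue) else { return }

        switch type {
        case .began:
            // An interruption started: update UI and internal state.
            pausePlayback()
        case .ended:
            // Only resume automatically if the payload says we should.
            if let optionsValue = info[AVAudioSessionInterruptionOptionKey] as? UInt,
               AVAudioSession.InterruptionOptions(rawValue: optionsValue).contains(.shouldResume) {
                resumePlayback()
            }
        @unknown default:
            break
        }
    }

    deinit {
        if let token = token { NotificationCenter.default.removeObserver(token) }
    }
}
```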
When you have video already playing on the TV, it's also important to test your app: launch it, navigate through its UI, and make sure that nothing accidentally AirPlays or interrupts that current video experience. And I'd like to go over a couple of examples that you might run into. For one, there's sometimes video that should never AirPlay. If you have an animation on app launch that you achieved using a video asset, you don't want that interrupting the current experience. The same goes for video assets you use for other types of visual flourish or to promote content within your app. You can make sure that those videos don't interrupt the current experience and stay local on the device by setting allowsExternalPlayback to false. Your app should only AirPlay a video when a user has directly interacted with that video. And to that end, you should not auto-play primary content just because your app launched or came into the foreground. When it comes to media, people should always be in control of when playback begins.
And this is really important, and I want to go over a couple examples of why. So here, I am listening to music on my iPhone and I have a video app that supports on-demand video that I can watch but also auto-plays a live TV channel when the app comes to the foreground.
So if I'm listening to music and I launch that app, that music is going to stop and I had no control over it. And I might have launched that app for a purpose other than watching that live video. You want to make sure that people can launch your app confidently, knowing that their current experience won't get interrupted. And I'd like to call out one more important example that is new this year.
For long-form video apps, iOS can suggest a TV to automatically play to when the video begins. And when this happens, the system provides a notification telling the user that the video will automatically play on the TV. And people can tap that notification or the status bar icon to change that suggested destination. Now, if I have an app that auto-plays its content, it's possible that that video starts playing on the TV before I've had a chance to really see and interact with that notification.
By requiring a user action to begin playback, you give people the power to decide both when and where they play their video, and that creates a great media experience. And so, to talk with you more about long-form video apps and their integration with intelligent AirPlay Suggestions, I'd like to invite Marty Pye up on stage. Thank you, Jonathan. Hello, everyone. My name is Marty Pye. I'm an AVKit engineer here at Apple. And I'm here today to tell you a little bit more about the advanced routing features we built into iOS 13 and how your long-form video apps can take advantage of these features. So first, we'll talk about what a long-form video app actually is. Then I'll talk a bit more about what these specific routing behaviors are that we've built for long-form video apps. And finally, I'll show you how to integrate these features into your code.
So what is a long-form video app? As the name suggests, any app that primarily plays long-form video content is considered a long-form video app. This content includes things such as movies, or TV-type content like TV shows, sports, et cetera. If your app primarily plays content from social media or other types of short news clips, then it would not qualify as long-form video. We encourage you to assess your app holistically to decide which category it belongs in.
So once you've decided that your app is long-form video, how will the AirPlay routing behavior differ for such apps? Let's take a look at an example.
Imagine you're watching a documentary on banana slugs and you're AirPlaying this content from your iPhone to an Apple TV.
New in iOS 13, when your friend sends you his latest sweet dirt biking video, you can go ahead and watch this short content locally on your device without interrupting the banana slug documentary that's playing on the TV. Another feature we've built into iOS 13 is AirPlay Suggestions.
If the user goes to play some content and the AirPlay routing system detects a nearby likely AirPlay device, then the system may prompt the user and suggest to AirPlay to this device. Or, depending on the confidence, the system might also automatically route playback to that device.
So now that you know about these features, how do you take advantage of them in your app? Well, there are two things you need to do. And the first thing is registering your app as a long-form video app.
So how do we do this? It's really very simple.
All you need to do is add the AVInitialRouteSharingPolicy key to your app's Info.plist and set it to long-form video.
By setting this in the plist, we can ensure that your app behaves like every other long-form video app on the system from the moment it gets launched, before you create any of your playback objects.
Now, the second thing you need to do in order to leverage the AirPlay Suggestions is prepare the routing system prior to playback. I'm first going to give you a little conceptual overview of how that works, and then we'll dive into how to achieve it in code. So in your app, you most likely have some sort of browsing UI where users can scroll through different pieces of content, select the piece of content they would like to play, and hit Play.
Well, it's at this transition from browsing to actual playback where you should prepare the AirPlay routing system. And what happens when you prepare the routing system? Well, it may decide to automatically play the content locally on the device or route it to an external screen, be it via a likely nearby AirPlay device, or if you have an external screen wired into the device. However, depending on the level of confidence, the system may also prompt the user asking whether they would like to play to a nearby AirPlay device.
And then depending on the user's choice, control will then be handed back to your app with information about where the content is going to play.
So let's take a look at how to do this in code.
When the user decides to hit Play, you should call prepareRouteSelectionForPlayback on the AVAudioSession. And this method takes a completion handler.
Now, this completion handler will be called right away if the AirPlay routing system can make an automatic decision.
However, there may also be a prompt shown asking the user whether they would like to play to this nearby AirPlay device. If the user decides they would like to play the content locally, then your completion handler will be called with route selection type local. Now, you can go ahead and present your playback UI. For example, AVPlayerViewController, or you can present your own custom playback UI.
If the user instead decides they would like to AirPlay the content and they select the living room, then your completion handler will be called with route selection type external.
And here, you have the opportunity to present some sort of custom remote playback UI with controls dedicated to controlling something that's playing on an external screen. Lastly, don't forget to handle the case that the user decides to cancel playback.
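Here's a rough sketch of that flow; the two presentation closures stand in for your own playback UI and remote-control UI:

```swift
import AVFoundation

func userDidTapPlay(presentLocalPlayerUI: @escaping () -> Void,
                    presentRemotePlaybackUI: @escaping () -> Void) {
    AVAudioSession.sharedInstance().prepareRouteSelectionForPlayback { shouldStartPlayback, routeSelection in
        // Hop to the main queue before touching UI.
        DispatchQueue.main.async {
            guard shouldStartPlayback else {
                // The user cancelled; stay in the browsing UI.
                return
            }
            switch routeSelection {
            case .local:
                // Play on this device: present AVPlayerViewController or your own UI.
                presentLocalPlayerUI()
            case .external:
                // Content will play on an external screen: show remote-control style UI.
                presentRemotePlaybackUI()
            case .none:
                break
            @unknown default:
                break
            }
        }
    }
}
```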
Now, in order to test your app and how it behaves in the various different scenarios with AirPlay Suggestions, we've built some toggles into the developer settings. If you drill down into Settings, then Developer, you'll find this new entry, AirPlay Suggestions.
And in here, you have the choice of three distinct modes.
The Default mode is just the default system behavior. You can also select a mode where the system will always prompt the user and ask whether content should AirPlay to a specific device. Or you can force the system into a mode where it will always automatically route to a specific device. And you can select which device this applies to in the list of nearby routes below. Once you've forced the system into a specific mode, make sure that you reset the media services.
If you're concerned about whether AirPlay Suggestions will handle every possible use case where your users might want to AirPlay, then don't worry. At the end of the day, you're still fully in control of when and where things will AirPlay. You can always change which device you're AirPlaying to via the AVRoutePickerView inside of your app, via the controls inside of AVPlayerViewController, or via Control Center.
These new features just make it even easier for your users to get your content onto the device of their choosing.
We encourage you to check out the associated sample code where we demonstrate in a simple test app how to integrate the behaviors we've described today.
So what have we learned today? We've learned how to integrate an AirPlay picker into our app via AVRoutePickerView.
We've learned how to register for and handle remote control commands via MPRemoteCommandCenter, and how to populate system UI with relevant Now Playing information.
We've learned how to take advantage of AirPlay multitasking and how to register your app as a long-form video app.
And lastly, we've learned how to prepare the routing system prior to playback in order to fully leverage AirPlay Suggestions.
With that, I would like to thank you all for joining us on today's session. We can't wait to see these new AirPlay features integrated into your apps. Enjoy the rest of the conference.
[ Applause ]