Support HDR images in your app
Learn how to identify, load, display, and create High Dynamic Range (HDR) still images in your app. Explore common HDR concepts and find out about the latest updates to the ISO specification. Learn how to identify and display HDR images with SwiftUI and UIKit, create them from ProRAW and RAW captures, and display them in CALayers. We'll also take you through CoreGraphics support for ISO HDR and share best practices for HDR adoption.
Hi! My name is Jackson. And I'm David. In this session, I will provide some background about HDR images and recently published standards in this area.
Then I'll cover how to go about supporting these images in an app using new and existing APIs.
David will dive into the details of handling an HDR image pipeline, and I will wrap up with some more advanced topics around displaying high dynamic range content.
Let's dive in with how HDR works.
In the physical world, humans can perceive an enormous range of light levels, thanks to our eyes' ability to adapt.
In contrast, a typical standard dynamic range, or SDR, display can only produce a limited range of light. This means that when an image of a scene is captured, the wide range of light levels has to be somehow compressed into the smaller SDR range.
With a high dynamic range, or HDR, display, you can show a much greater range of light levels without having to compress them. This lets you display images that look more like the original scene and are brighter and more vibrant.
We've had the ability to capture high dynamic range for many years, but in the past, you would have had to take that captured range and compress it into the SDR display range. Now, when displaying on an HDR display, you can render the scene more like it originally appeared.
For example, in this image of a sunrise over a snow scene, there are areas that fall into a wide range of real-world light levels. On an SDR display, you can only accurately represent part of the scene. With an HDR display, you can represent much more of the scene without compromising contrast.
So, having a display with an HDR range lets us render parts of a scene brighter than the brightest SDR white.
This is commonly referred to as headroom.
In this paradigm, reference white is the brightest white the SDR display would have produced. Anything above that point is headroom.
In past talks, we introduced Extended Dynamic Range, or EDR, for interacting with content that can be rendered in the headroom of an HDR-capable display.
In the EDR paradigm, reference white is 1.0, and the peak is the maximum value the display can represent. The HDR APIs I introduce today use EDR to implement a more complete pipeline for high dynamic range content.
If you want to learn more about EDR, check out the "Explore HDR Rendering with EDR" talk.
Here is an example of HDR in action. This SDR image of a person sitting in front of a window looks good when the white of the paper in the book is just under the reference white. Anything brighter, such as the window, gets rolled off or clipped.
However, when you can display the image in HDR, you can show much more detail in the highlights and retain contrast more reliably across the scene. This is the benefit you can get from supporting HDR.
So why support HDR images? If you are building an app where user-created or provided content is important, supporting HDR will make that experience even better.
HDR support is available on almost all Apple platforms, and we've introduced these APIs to ensure that you can take full advantage of Apple's incredible display hardware.
Another important reason to consider HDR support now is that Apple has been working with the International Organization for Standardization, through its technical committee for photography, to publish a new technical specification for HDR images this year. This specification, TS 22028-5, provides a structure for encoding HDR content into existing still image formats without compromising quality.
I will refer to HDR images that follow this ISO specification as "ISO HDR" to avoid confusion with other forms of HDR such as HDR video, capture, or displays.
Recalling our scale from earlier, typical SDR images, including sRGB and Display P3, define black and white as 0.2 and 80 candelas per square meter, respectively. ISO HDR, meanwhile, defines black and a default reference white as 0.0005 and 203. Everything above 203 is headroom.
So what's in these new image files? The specification requires Hybrid Log-Gamma, HLG, or Perceptual Quantizer, PQ, as the encoding transfer function. These are functionally analogous to the gamma curves used in SDR images.
The color primaries for ISO HDR files are the BT.2020 primaries. This is a wide-gamut color space that, until now, has only been commonly used in video.
To avoid issues with banding, HDR images are required to be 10 bits or more per component. This means that some formats, like HEIF, can encode HDR, while others, like traditional JPEG, cannot be 22028-5 compliant, as they only support 8 bits per component. And for the required metadata, both traditional ICC profiles and CICP tags are valid. Together, these requirements define the new ISO HDR files.
There are some additional optional metadata fields associated with ISO HDR files that might be relevant to you. The reference environment tag defines the ambient viewing conditions for the content's reference environment.
The diffuse white luminance defines where the reference white falls for this content. The default is the 203 I mentioned earlier.
The scene-referred tag can be used when HLG is the transfer curve. It defines whether the image content is scene or display referred. The default value for this tag is display referred.
The mastering and content color volume tags are common to existing HDR video and define information about the color ranges present in the image.
Lastly, the content light level tag provides information about the light level of the scene in the image.
For more information on ISO HDR, check out the specification on the ISO website.
In addition to ISO HDR, I am very excited to tell you for the first time how to access the best version of images captured on iPhone.
Since 2020, trillions of iPhone images have been captured with additional data that allows us to reconstruct an HDR representation from the SDR image. I call this type of HDR "Gain Map HDR." Today, David and I will be showing you new APIs for accessing this HDR representation in your app, giving you the option of showing incredible HDR images from any generation of Gain Map HDR already in your Photos libraries.
Now let's talk about how to use these new APIs to incorporate HDR images into an app.
The APIs I'm going to show you are available in SwiftUI, UIKit, and AppKit. Let's take a look at the SwiftUI and UIKit APIs.
In this example, I have an ISO HDR image file accessible via URL, and I want to display it. All I have to do is create a UIImage and provide it to an Image View along with the new allowedDynamicRange modifier to enable high dynamic range. It's that simple.
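Here's a minimal sketch of that SwiftUI path, assuming `url` points at a local ISO HDR file (the view name is mine):

```swift
import SwiftUI
import UIKit

struct HDRPhotoView: View {
    let url: URL  // assumed to reference an ISO HDR image file

    var body: some View {
        if let uiImage = UIImage(contentsOfFile: url.path) {
            Image(uiImage: uiImage)
                .resizable()
                .scaledToFit()
                // Opt this view in to high dynamic range rendering (iOS 17).
                .allowedDynamicRange(.high)
        }
    }
}
```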
Similarly, in a UIKit app, you can set the new UIImageView property "preferredImageDynamicRange," and voila, an HDR result.
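A matching UIKit sketch, under the same assumptions:

```swift
import UIKit

let imageView = UIImageView(image: UIImage(contentsOfFile: url.path))
// Ask UIKit to render the image's HDR content when the display allows (iOS 17).
imageView.preferredImageDynamicRange = .high
```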
The dynamic range properties include three options for how to handle HDR content. These options work on SwiftUI Image views, UIImageView, and NSImageView.
The high option lets the system know that you want to display high dynamic range content and lets us do the heavy lifting of mapping that content onto the current display, including updating when the display state changes. Note that if the image is not HDR, you will get the exact same experience as without the dynamicRange flag. You can safely use these options with non-HDR content.
The standard option disables high dynamic range rendering and instead displays all content as SDR. This means tone mapping content outside of the SDR range. This is also how images would be shown on displays that have no HDR capability.
Last, the constrainedHigh option should be used when you want to show some HDR but not the full range of the content.
Why would you want to show only some HDR, not all of it? Well, there are a few possible reasons.
In this example, I have a Stack view containing thumbnails of many images. Some of these images are HDR, and some are not.
If I use the high option, this is what you get: the HDR images are very bright, but the SDR images now look dull, maybe even inactive.
Now let's use the constrainedHigh option. By limiting how much headroom the HDR content is allowed to use, I make the film strip look much more consistent. You can still tell HDR images apart from SDR ones, but I no longer have the problem that SDR images look gray or inactive.
Another reason you might want to use constrainedHigh or standard for a particular image view is that HDR content can sometimes be very bright, and you might not want it to take attention away from other aspects of your app. For example, here is a smaller image that, when displayed with full HDR, looks like the most important part of the app but is drawing attention away from important controls and information.
Before I move on, you may have noticed that there is no option here that does not involve tone mapping the image. If you are in a situation where you do not want to have the OS do the tone mapping for you, you will need to use a lower level API which I will discuss later in this session.
An important aspect of HDR to keep in mind is that it requires a pipeline that does not clamp or otherwise degrade the HDR data. The APIs we discuss today are all fully supported, but deprecated APIs may not have an HDR-safe pipeline. For example, if you are resizing images using the deprecated UIGraphicsBeginImageContextWithOptions, you will lose HDR and wide-gamut color. This should be avoided when creating an HDR-capable app.
If you are trying to create a thumbnail, UIKit introduced a thumbnail API on UIImage in iOS 15. If you don't need precise sizing control, this is the recommended way to get an HDR thumbnail. If you need more control or need support prior to iOS 15, UIKit offers UIGraphicsImageRenderer.
By using the imageRendererFormat, UIKit knows how to construct a renderer that won't cause the HDR information in the image to be lost when redrawing it.
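Here's a sketch of both approaches; `image` and the target size are placeholders:

```swift
import UIKit

let targetSize = CGSize(width: 200, height: 200)

// iOS 15+: the simple, HDR-safe thumbnail path.
let thumbnail = image.preparingThumbnail(of: targetSize)

// More control: redraw with UIGraphicsImageRenderer, using the image's own
// renderer format so HDR and wide-gamut data survive the redraw.
let renderer = UIGraphicsImageRenderer(size: targetSize, format: image.imageRendererFormat)
let resized = renderer.image { _ in
    image.draw(in: CGRect(origin: .zero, size: targetSize))
}
```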
Let's take a look at one common way to get image data into an app.
PhotoKit provides interfaces for an app to access the Photos library. In my app, I've added a Photos Picker to my main view, making it easy to access user-selected images. Because the PhotosPicker may try to transcode images into a format that does not retain the HDR data, I am going to use the "current" encoding policy and the generic "images" matching type.
For more information on how the Photos Picker works, check out the "Embed the Photos picker in your app" session. With ISO HDR images, I can create a UIImage from the DataRepresentation and use that directly with any of my image views with no extra code.
If I'm supporting Gain Map HDR as well, I can use the new UIImageReader to get the HDR representation when it's available. This API will return the HDR representation by default when on an HDR display and the SDR version otherwise.
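Putting those pieces together, a sketch of the picker flow (the state property names are mine):

```swift
import SwiftUI
import PhotosUI

struct LibraryImagePicker: View {
    @State private var selection: PhotosPickerItem?
    @State private var image: UIImage?

    var body: some View {
        PhotosPicker("Select a photo",
                     selection: $selection,
                     matching: .images,
                     preferredItemEncoding: .current)  // avoid HDR-stripping transcodes
        .onChange(of: selection) { _, item in
            Task {
                guard let data = try? await item?.loadTransferable(type: Data.self) else { return }
                // UIImageReader returns the Gain Map HDR representation when
                // one is available on an HDR display; otherwise the SDR version.
                image = UIImageReader.default.image(data: data)
            }
        }
    }
}
```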
The APIs we've discussed so far are not dependent on an image being HDR or knowing that an image is HDR. Recall that when you let an image view know it should be showing high dynamic range, it doesn't matter if that image is HDR. However, you may have a pipeline or app that wants to identify if an image is HDR.
With UIKit, you can check the isHighDynamicRange property to determine if the contents are ISO HDR compatible.
With AppKit, CoreGraphics, and CoreImage, you will need to check the CGColorSpace of the image. The CGColorSpaceUsesITUR_2100TF function returns true for ISO HDR images.
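For example, a small helper combining both checks (the function name is mine):

```swift
import UIKit

func containsHDR(_ image: UIImage) -> Bool {
    // UIKit-level check (iOS 17).
    if image.isHighDynamicRange { return true }
    // Lower-level check: ISO HDR images use an ITU-R 2100 transfer function.
    if let colorSpace = image.cgImage?.colorSpace {
        return CGColorSpaceUsesITUR_2100TF(colorSpace)
    }
    return false
}
```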
HDR images can use a wide range of headroom. For example, current iPhones produce images that use up to 8 times headroom. However, only some displays can show HDR, and not all HDR displays are the same.
iPhone 14 can show HDR highlights up to 8 times brighter than the reference white, while the 12.9-inch iPad Pro and MacBook Pro can show up to 16 times, and the Pro Display XDR can show up to 400 times.
Most other Apple displays can show up to 2 times headroom. However, this may not be enough for most HDR content. There are also external displays with HDR capability that are supported. There is not an exhaustive list of these displays available; however, there is an API for you to determine the capabilities of the display your app is currently shown on.
You can query potentialEDRHeadroom on iOS and iPadOS, and maximumPotentialExtendedDynamicRangeColorComponentValue on macOS, to determine the capabilities of the display your app is appearing on.
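As a sketch (in a real app, prefer querying the screen your window is actually on rather than the main screen):

```swift
#if os(iOS)
import UIKit
// 1.0 means no headroom; larger values mean the display can exceed SDR white.
let headroom = UIScreen.main.potentialEDRHeadroom
#else
import AppKit
let headroom = NSScreen.main?.maximumPotentialExtendedDynamicRangeColorComponentValue ?? 1.0
#endif
```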
Before we move into more advanced topics, let's talk about when displaying HDR makes sense. As I've discussed, HDR looks great, and you should consider supporting it when showing images is a major part of your app. But it can sometimes be distracting, so if you don't need the extra pop that HDR can give you, consider the constrainedHigh or standard options.
Let's recap. You now know how to identify ISO HDR images, display HDR images, access ISO HDR and Gain Map HDR from the Photos library, and determine whether your display is HDR. Now David will walk you through reading, writing, and manipulating HDR images.

Thank you, Jackson. When working with HDR images, there are a few common operations that your app is likely to support: reading ISO HDR or Gain Map HDR images from a file or data into memory; modifying images in memory while retaining HDR content; converting from one image class to another without losing HDR; and finally, writing an HDR image to an ISO HDR file.

A critical property of a functional HDR image pipeline is that image objects have an associated color space. For example, both CGImage and CIImage objects use the CGColorSpace API for this. Images can have a variety of supported color spaces, but an ISO HDR image will have a CGColorSpace that is either ITU-R 2100 HLG or PQ.
With that in mind, let's start with how to read ISO HDR images. UIImage and NSImage now automatically support reading ISO HDR images. ColorSync, Apple's color management infrastructure, will handle the HDR ICC profiles and provide image objects suitable for display.
When reading Gain Map HDR images, you can request an HDR representation by creating a UIImageReader configuration that prefers high dynamic range. Note that this new behavior only impacts Gain Map HDR images.

Just like with NSImage and UIImage, Core Image automatically supports reading ISO HDR files. All you need to do is use the CIImage contentsOfURL API. The resulting CIImage object will automatically contain the correct recipe to convert from the file's color space to the Core Image extended-range working space.
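Both reads, sketched together; `url` is assumed to point at the image file:

```swift
import UIKit
import CoreImage

// Gain Map HDR: request the HDR representation explicitly (iOS 17).
var configuration = UIImageReader.Configuration()
configuration.prefersHighDynamicRange = true
let reader = UIImageReader(configuration: configuration)
let hdrUIImage = reader.image(contentsOf: url)

// ISO HDR: Core Image picks up the file's HDR color space automatically.
let ciImage = CIImage(contentsOf: url)
```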
You can inspect the image object's recipe by using Xcode's QuickLook feature when debugging your code. In this example, the QuickLook popover shows that the image is converted from the PQ ISO HDR color space.
Your code can also inspect the color space of the file via the image's colorSpace property.
This may be an SDR color space, such as sRGB or Display P3, or an HDR color space.
If you prefer to use the CoreGraphics API, then you can get the equivalent behavior by using CGImageSourceCreateImageAtIndex with the new decodeRequest key set to decodeToHDR.
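A sketch of that CoreGraphics path:

```swift
import ImageIO

let options = [kCGImageSourceDecodeRequest: kCGImageSourceDecodeToHDR] as CFDictionary
if let source = CGImageSourceCreateWithURL(url as CFURL, nil),
   let hdrImage = CGImageSourceCreateImageAtIndex(source, 0, options) {
    // hdrImage carries an HDR CGColorSpace when the file contains HDR content.
}
```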
A few minutes ago, Jackson described why you might want to limit HDR images to SDR.
Similarly, apps using Core Image may want to override its automatic HDR support to ensure images are tone mapped to SDR. This can be useful when you want to avoid using HDR for certain scenarios, such as feature detection. To enable this, you simply need to provide the toneMapHDRtoSDR option when creating the CIImage.
In this case, the returned CIImage object will contain a recipe step that tone maps the HDR source into SDR range before any other operations are applied.
Note that this option only has an effect if the image has an HDR color space.
The resulting CIImage will look the same as specifying that an image view should use the dynamicRange.standard option. Also, this has the equivalent behavior to using CGImageSourceCreateImageAtIndex with decodeRequest set to decodeToSDR.
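A one-line sketch of that SDR load:

```swift
import CoreImage

// Tone map any HDR content down to SDR while loading.
let sdrImage = CIImage(contentsOf: url, options: [.toneMapHDRtoSDR: true])
```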
Traditionally, Gain Map HDR images would display the full dynamic range in the Photos app, but only the SDR representation was available to APIs such as Core Image and ImageIO. I am really excited to describe new API that will allow your application to access the full range of Gain Map HDR images.
The API is super simple to use. Just provide the expandToHDR option when initializing the CIImage.
In this case, the returned CIImage object will contain a recipe that combines the primary image with the gain map to produce an HDR image.
The image's colorSpace property will be an HDR color space when the file contains the additional gain map data to support this.
This behavior is equivalent to using CGImageSourceCreateImageAtIndex with the decodeRequest key set to decodeToHDR.
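And the corresponding HDR expansion, as a sketch:

```swift
import CoreImage

// Combine the SDR primary image with its gain map to produce HDR (iOS 17).
let hdrImage = CIImage(contentsOf: url, options: [.expandToHDR: true])
```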
These options will also work with RAW files, which I will now talk about in more detail.
ProRAW images from iPhone and RAW images from cameras are flexible image formats that give significant creative control to a photographer. This includes the ability to render parts of a scene into HDR headroom. Many RAW formats contain plenty of dynamic range and simply need to be processed into an unconstrained form. Let me describe how this works.
First, if your application just wants to show the default SDR look for a RAW file, create an image from a URL as usual.
But if your application wants to show the default HDR rendering look instead, all you need to do is add the new expandToHDR option.
However, if your app wants to unlock the full functionality of RAWs, then your code should create a CIRAWFilter from the URL.
If you just ask that filter for its output image, you will get a CIImage with the default look. But the key advantage of this API is that a filter can be easily modified.
Each CIRAWFilter instance has several properties that your app can change to alter the output image. These properties are well described in the "Capture and process ProRAW images" session, but let's review one that is especially relevant to this HDR discussion.
The amount of dynamic range for a RAW image can be adjusted to any value from 0 to 1 using the extendedDynamicRangeAmount property, which is analogous to the view dynamic range options that Jackson described earlier.
The default value of this property is 0, which indicates that the output image should be SDR. The maximum value is 1, which indicates that the output image should make the most of the headroom present in the file, as in the sketch below.
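Here, `url` is assumed to point at a RAW or ProRAW file:

```swift
import CoreImage

if let rawFilter = CIRAWFilter(imageURL: url) {
    // 0 (the default) renders SDR; 1 uses the full headroom in the file.
    rawFilter.extendedDynamicRangeAmount = 1.0
    let hdrOutput = rawFilter.outputImage
}
```

That wraps up the various ways to read ISO HDR images.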
Next, let's discuss some recommendations for how to modify HDR images.
Core Image provides a powerful and flexible API for working with HDR images because it contains over 150 built-in filters that support HDR.
All of these filters can either generate or process images that contain HDR content.
All of these filters just work because the Core Image working color space is unclamped and linear, which allows RGB values outside the 0 to 1 range.
As you develop your app, you can check if a given filter supports HDR.
To do this, you create an instance of a filter, then ask the filter's attributes for its categories, and then check if the array contains the category high dynamic range.
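For example, a sketch of that check (CIColorControls is just an arbitrary example filter):

```swift
import CoreImage

if let filter = CIFilter(name: "CIColorControls") {
    let categories = filter.attributes[kCIAttributeFilterCategories] as? [String] ?? []
    let supportsHDR = categories.contains(kCICategoryHighDynamicRange)
}
```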
Please see the "Display EDR content with Core Image, Metal, and SwiftUI" session for more information on built-in CI filters and custom CI kernels.

Next, let's discuss writing an HDR image to an ISO HDR file. Often, your app will want to write image objects in memory to a new file representation. Traditionally, using the UIImage jpegData and pngData APIs would save an 8-bit precision SDR image.
New this year, UIImage can automatically write ISO HDR images using either 16-bit PNG or 10-bit HEIF format when an object contains HDR content. It will also convert to ISO HDR if the original image was a Gain Map HDR image.
Similarly, Core Image can write an HDR PNG file when you specify an HDR color space and call writePNGRepresentationOfImage requesting the RGBA16 format.
Or Core Image can write an HDR TIFF file when you specify an HDR color space and call writeTIFFRepresentationOfImage requesting the RGBA16 format.
Note that both PNG and TIFF use lossless compression and will result in much larger file sizes.
As a result, the best practice is to write a HEIF file using writeHEIF10RepresentationOfImage and specify an HDR color space.
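A combined sketch of these three writers, assuming `image` is a CIImage in memory and the destination URLs are yours:

```swift
import CoreImage

let context = CIContext()
let hdrColorSpace = CGColorSpace(name: CGColorSpace.itur_2100_PQ)!

do {
    // Best practice: compact, 10-bit HEIF.
    try context.writeHEIF10Representation(of: image, to: heifURL,
                                          colorSpace: hdrColorSpace)
    // Lossless alternatives (much larger files):
    try context.writePNGRepresentation(of: image, to: pngURL,
                                       format: .RGBA16, colorSpace: hdrColorSpace)
    try context.writeTIFFRepresentation(of: image, to: tiffURL,
                                        format: .RGBA16, colorSpace: hdrColorSpace)
} catch {
    print("Failed to write HDR image: \(error)")
}
```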
There may be occasions where you need to convert from one framework class to another or one color space to another.
The process to convert between the image classes UIImage, CIImage, CGImage, IOSurface, and CVPixelBuffer remains largely the same. That said, here are a few things to look out for when working with an HDR pipeline.
Let's first discuss converting to IOSurface or CVPixelBuffer objects. These image types are useful because, for example, they can be used as the contents of a CALayer. They can also hold bi-planar, chroma-subsampled images, which is very memory efficient. Before you use a CVPixelBuffer, be sure to declare that it has ISO HDR compatible content. The first step is to create the pixel buffer with an appropriate format, such as 10-bit biplanar full range.
While you are at it, for best performance, be sure to specify that the buffer should be surface-backed by providing the IOSurfacePropertiesKey.
Next, be sure to add attachments to the CVPixelBuffer so that the system knows that it contains ISO HDR compatible color space properties.
Once you have a CVPixelBuffer, converting it to a CIImage is trivial. Just call the CIImage withCVPixelBuffer API. And you can convert from a CIImage to a CVPixelBuffer by using a CIContext to render to the buffer.
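Here's a sketch of the whole flow; the 1920x1080 dimensions are arbitrary, and `image` stands for an existing CIImage:

```swift
import CoreImage
import CoreVideo

// Create a 10-bit biplanar full-range pixel buffer, backed by an IOSurface.
var pixelBuffer: CVPixelBuffer?
let attributes = [kCVPixelBufferIOSurfacePropertiesKey as String: [:] as [String: Any]] as CFDictionary
CVPixelBufferCreate(kCFAllocatorDefault, 1920, 1080,
                    kCVPixelFormatType_420YpCbCr10BiPlanarFullRange,
                    attributes, &pixelBuffer)

if let buffer = pixelBuffer {
    // Tag the buffer as BT.2020 primaries with the HLG transfer function.
    CVBufferSetAttachment(buffer, kCVImageBufferColorPrimariesKey,
                          kCVImageBufferColorPrimaries_ITU_R_2020, .shouldPropagate)
    CVBufferSetAttachment(buffer, kCVImageBufferTransferFunctionKey,
                          kCVImageBufferTransferFunction_ITU_R_2100_HLG, .shouldPropagate)
    CVBufferSetAttachment(buffer, kCVImageBufferYCbCrMatrixKey,
                          kCVImageBufferYCbCrMatrix_ITU_R_2020, .shouldPropagate)

    // Wrap the buffer in a CIImage, or render an existing CIImage into it.
    let wrapped = CIImage(cvPixelBuffer: buffer)
    CIContext().render(image, to: buffer)
}
```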
Moving along, there are several situations where your app may want to convert between Core Image and the CGImageRef API.
If you want this conversion to preserve HDR content, you should choose an HDR color space and request a deep pixel format such as RGBA16 or RGBAh format.
And new this year, CoreImage added the RGB10 format, which is deep but uses half the memory.
Converting a CIImage to a CGImage is very convenient, given that CGImages are supported in a wide variety of APIs. But be aware that this path is not recommended for user-interactive rendering. For the fastest performance, it is best to have Core Image render directly to an MTKView, or via a pixel buffer to a CALayer.
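A sketch of an HDR-preserving conversion, assuming `image` is a CIImage:

```swift
import CoreImage

let context = CIContext()
let hdrColorSpace = CGColorSpace(name: CGColorSpace.itur_2100_HLG)!
// .RGBAh and .RGBA16 are deep formats; .RGB10 (new this year) halves the memory.
let cgImage = context.createCGImage(image,
                                    from: image.extent,
                                    format: .RGB10,
                                    colorSpace: hdrColorSpace)
```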
Speaking of CALayers, let's return to Jackson to learn more about the lower-level APIs you might need for more complex workflows.

Thanks, David! CALayers are a powerful tool when you need the best rendering performance or more control over how your content is composited into your app.
To enable HDR rendering on CALayers, you can now set the wantsExtendedDynamicRangeContent property. This is similar to the property used by CAMetalLayers to enable displaying content in the headroom of your display.
The key difference between these two methods is that the CALayer property enables tone mapping of the layer contents, while the CAMetalLayer does not. What does this mean in practice? This image and plot show content with 10 times headroom. When it is rendered to a display with at least 10 times headroom available, both layers behave identically. Let's assume now that the display only has 5 times headroom available.
In the CAMetalLayer case, the image data above 5 times will be clamped to what the display can show, resulting in a sharp discontinuity in the image.
In the CALayer case, the image will be tone mapped to avoid that discontinuity. The exact tone mapping algorithm used depends on the transfer curve used with that image. For more information about these algorithms, you can refer to the ITU standards for HLG and PQ.
CALayers provide a fast and simple way to get HDR content onto the screen, while CAMetalLayers give you the freedom to create your own tone mapping pipeline.
To directly use CALayer to render HDR, you must use one of these available classes. An object of type CGImage, CVPixelBuffer, or IOSurface that is tagged appropriately as ISO HDR will be rendered and tone mapped by the CALayer. If you want to use the CALayer directly and aren't using one of these classes, you can use one of the methods David described to convert to them.
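A short sketch, where `hdrCGImage` stands for an appropriately tagged CGImage:

```swift
import QuartzCore

let layer = CALayer()
// Opt the layer in to EDR rendering with system tone mapping (iOS 17).
layer.wantsExtendedDynamicRangeContent = true
// The contents must be an HDR-tagged CGImage, CVPixelBuffer, or IOSurface.
layer.contents = hdrCGImage
```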
When working with an HDR workflow, it's important to use the correct pixel formats. These pixel formats are safe to use when handling HDR data: 16- and 32-bit float formats always support high dynamic range; 16-bit integer formats will also work for supporting HDR content in appropriate file formats and contexts; and finally, there are 10-bit pixel formats that you can use when memory and file size are important. This is the default bit depth for most compressed ISO HDR images. There are also CoreGraphics flags when creating a CGImage that can be used for HDR content. Like the previous list, you can use float, half float, 16-bit integer, and 10-bit RGB formats.
One final important topic when introducing new functionality like this is backwards compatibility. What can you do to support older versions of iOS and macOS when dealing with HDR images? For ISO HDR images, CoreImage provides the toneMapHDRtoSDR option to convert HDR to SDR. Similarly, when rendering using a CoreGraphics CGContext, you can target an SDR CGColorspace, and the image will be tone mapped to that space. For Gain Map HDR, use version checks to gate when the new expandToHDR options are used. When these options are omitted, the SDR version of the file will always be loaded instead of the HDR version.
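As a sketch, a hypothetical helper that gates the option on OS version:

```swift
import CoreImage

// Hypothetical helper: load the best available representation of a
// Gain Map HDR file, falling back to SDR on older OS versions.
func loadBestRepresentation(from url: URL) -> CIImage? {
    if #available(iOS 17.0, macOS 14.0, *) {
        return CIImage(contentsOf: url, options: [.expandToHDR: true])
    } else {
        return CIImage(contentsOf: url)  // SDR version of the file
    }
}
```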
To wrap up, we've introduced new APIs for reading, writing, and displaying HDR images, showed you how to access Gain Map HDR representations, and given you APIs for working with a fully HDR-capable pipeline.
We can't wait to see the amazing things you make with HDR! Thanks for watching!