Bring your app to Siri
Learn how to use SiriKit and App Intents to expose your app's functionality to Siri and Apple Intelligence. Discover which intents are already available for your use, and how to adopt App Intent domains to integrate actions from your app into the system. Find out what metadata to provide, how to make your entities searchable with Spotlight, how to integrate your app with in-app search, and much more.
Chapters
- 0:00 - Introduction
- 1:44 - What's new with Siri
- 8:34 - Actions
- 15:38 - Personal context
Welcome to the session “Bring your app to Siri”. My name is Daniel Niemeyer, and I'm a software engineer on Siri. Today, I'll cover how you can bring your app to Siri. I'll start by talking through our established frameworks for doing so, then cover what's new this year with the introduction of Apple Intelligence, including new APIs that allow Siri to perform actions in your app, and some exciting personal context capabilities that you can integrate directly. Finally, I'll wrap things up with key takeaways from today's session.

So, why should you integrate your app with Siri? With Siri, people can take actions with your app no matter where they are on their device. Siri is also a great way for people to quickly perform actions while they're looking at your app. Let's explore how you can do this with our existing frameworks, SiriKit and App Intents.

In iOS 10, we introduced SiriKit. It allows developers to use intents provided by the system to power actions people already ask Siri to do, such as playing music or sending a text message. SiriKit domains are still the best way for you to enable these kinds of features, and later in this session you'll see how they've gotten even better this year.

In iOS 16, we introduced App Intents, a new framework to integrate your app with Siri, Shortcuts, Spotlight, and more. If your app does not overlap with an existing SiriKit domain, App Intents is the right framework for you. Check out this year's “Bring your app's core features to users with App Intents” video to learn more about it.

This year, we are making significant improvements to Siri thanks to Apple Intelligence. With new large language models, the experience is getting better than it has ever been. If you haven't integrated your app with Siri, now's a really exciting time to do it. With Apple Intelligence, we have the foundational capabilities we need to take a major step forward with Siri in three key ways. First, Siri can now sound more natural as it speaks to you. Second, Siri is now more contextually relevant, and more personal to you. Apple Intelligence will provide Siri with on-screen awareness, so it'll be able to understand what you are looking at and take action on it. And third, thanks to richer language understanding, you can now speak to Siri more naturally. Even if you stumble over your words, Siri will understand what you're getting at. If your app has already adopted SiriKit, you get these improvements automatically, for free.
But we didn't want to stop there. We have re-imagined the Siri experience to understand more of the things you do on your device, so Siri can take more actions in and across apps on your behalf. We did this by investing deeply in the App Intents framework as a means of connecting the vast world of apps to Apple Intelligence. As a result, we've built a series of new APIs called App Intent domains. Let me show them to you now.

Domains are collections of App Intents-based APIs designed for specific kinds of functionality, like Books, Camera, or Spreadsheets. In iOS 18, we are releasing twelve of these domains. Mail and Photos are available for you to try today, and over the next few months we'll be rolling out more. Each of these includes a broad set of new actions that are trained and tested to support flexible voice interactions while still being really easy to adopt. This year, Siri is gaining support for over 100 different actions across the twelve domains. As an example of what you can do with these, an app like Darkroom will be able to use the Set Filter intent to give its users the ability to say, “Apply a cinematic preset to the photo I took of Mary yesterday.” If your app has features covered by any of these domains, these new APIs were designed for you, and we would love to hear your feedback! And this is just the beginning: Siri will be able to understand and take more actions in more apps over time. Now let me show you how to build actions for App Intent domains, through what we call Assistant Schemas.
Schema is an overloaded term, so let me define it. The dictionary calls it “a conception of what is common to all members of a class; a general or essential type or form.” Let's unpack what that means in the context of Siri.
Apple Intelligence is powered by foundation models that give Siri new capabilities in the domains we just talked about. These models are trained to expect an intent with a particular shape. This shape is what we call a schema. And Assistant Schemas is what we call the API. If you build an App Intent with the right shape, you’ll benefit from our training and don’t need to worry about the complexities of natural language. All you need to do is write a perform method and let the platform take care of the rest.
This year we've built schemas for over 100 kinds of intents, like creating a photo or sending an email. They each define a set of inputs and outputs that are common to all adopters of that intent. This is what I mean by shape. In the middle of all this geometry sits your perform method, with full creative freedom to define an experience that is right for your app.

Now let me walk you through the life cycle of a Siri request with Apple Intelligence to demonstrate Assistant Schemas in action. Everything starts with a user request. This request is routed to Apple Intelligence for processing through models. Our models are specifically trained to reason over schemas, allowing Apple Intelligence to predict a schema based on the user's request. Once an appropriate schema is selected, the request is routed to a toolbox. This toolbox contains a collection of App Intents from all the apps on your device, grouped by their schema. By conforming your intent to a schema, you give the model the ability to reason over it. Finally, the action is performed by invoking your App Intent. The result is presented and the output is returned.

Enough about diagrams; let's jump to some code. Here's an App Intent for creating photo albums. Conforming to a schema is as easy as adding one additional Swift macro before my App Intent declaration. This new macro takes an argument for a schema, which is always bound to an App Intent domain. In this example, photos is the domain and createAlbum is the schema. Since the shape of a schema is already known at compile time, I no longer need to provide additional metadata for my App Intent, so the code can be further reduced. Schema-conforming intents are easier to define in code, thanks to their known shape. That said, shapes aren't always the most flexible. But don't worry, in this case you can square the circle: Assistant Intents can also be extended with optional parameters if needed, giving the shape a little bit of flexibility. The App Intents framework doesn't just include intents for building actions. It also includes entities for modeling concepts in your app, like the AlbumEntity returned from this intent.
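To make that concrete, here's a minimal sketch of a schema-conforming intent using the AssistantIntent macro and photos/createAlbum schema from the session; the AlbumEntity return type is defined below, and the AlbumStore call is a placeholder, not the sample app's actual code.

```swift
import AppIntents

// A minimal sketch, assuming the photos/createAlbum schema shown above.
// Because the schema's shape is known at compile time, no titles or
// other metadata are needed on the intent or its parameter.
@AssistantIntent(schema: .photos.createAlbum)
struct CreateAlbumIntent: AppIntent {
    var name: String

    func perform() async throws -> some IntentResult & ReturnsValue<AlbumEntity> {
        // Hypothetical model call; replace with your app's album creation.
        let album = try await AlbumStore.shared.createAlbum(named: name)
        return .result(value: album)
    }
}
```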
We've also added a new macro for exposing App Entities to Siri. Just like intents, adopting this new type is as easy as adding one line of code to your AppEntity declaration. Assistant Entities can also benefit from pre-defined shapes, resulting in a more concise App Entity implementation. And just like Assistant Intents, Assistant Entities can stretch far beyond their shapes by declaring new optional properties if needed, like this Album Color property on my Album Entity. And finally, we couldn't leave enums behind. Exposing an AppEnum to Siri is as easy as its friends, Entity and Intent. Simply add the new AssistantEnum macro to your enum declaration, and we take care of the rest.
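Here's a sketch of what those declarations might look like. The AssistantEntity and AssistantEnum macros are from the session; the photos/album and assetType schema names, the query stub, and the property names are illustrative assumptions.

```swift
import AppIntents

// A sketch, assuming a photos/album entity schema; the pre-defined
// shape supplies most metadata, and albumColor is an optional property
// that stretches beyond it.
@AssistantEntity(schema: .photos.album)
struct AlbumEntity: AppEntity {
    static let defaultQuery = Query()

    let id: String
    var name: String
    var creationDate: Date?

    // Optional property extending the schema's shape.
    var albumColor: String?

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }

    struct Query: EntityQuery {
        // Hydrate entities from the app's model layer (stubbed here).
        func entities(for identifiers: [AlbumEntity.ID]) async throws -> [AlbumEntity] { [] }
    }
}

// Enums don't impose any shape on their cases; the assetType schema
// name here is an illustrative assumption.
@AssistantEnum(schema: .photos.assetType)
enum AssetType: String, AppEnum {
    case photo
    case video

    static let typeDisplayRepresentation: TypeDisplayRepresentation = "Asset Type"
    static let caseDisplayRepresentations: [AssetType: DisplayRepresentation] = [
        .photo: "Photo",
        .video: "Video",
    ]
}
```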
Unlike its friends, Assistant Enums don't impose any shape on enumeration cases, allowing you full expressivity during implementation. Now that I have introduced the new Assistant Schema macros, how about an Xcode demo?

I have been working on a media library app to view and organize photos on my device. Let me show you how to integrate my app with Siri to perform actions. This app will also be available alongside this talk as downloadable sample code. Launching my app takes you into this beautiful gallery containing both photos and videos available on my device. I can click on a photo to view it fullscreen.
I can also click on this menu at the top to perform actions like favorite and share.
Now, let’s switch over to Xcode so I can show you the app’s code.
My app has two fundamental model types: Assets and Albums. Let me show you how Asset is defined. Asset is an object that represents a photo or video in my library. Assets have many properties, such as title and creationDate.
At the bottom of the file, I have a computed property called entity, which returns an AppEntity modeled after my Asset class. As you can see, this entity contains many properties that are hydrated directly from my model. Let me select it and use Jump to Definition to see how my AssetEntity is defined. This entity models a given asset in my library. It exposes many properties, such as title, creation date, location, and more. Entities are a great way for you to model your app's content. When combined with App Intents, they allow the system to perform actions with your entities. Speaking of actions, I already wrote an App Intent to open a particular AssetEntity in my app. Here, I have an OpenAssetIntent that accepts a target parameter (the asset to open) and a few app dependencies that I need to perform this action, such as my app's navigation manager.
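For reference, here's a condensed sketch of the shape being described; the sample app's real implementation exposes more properties, and the exact names and types here are best guesses from the narration.

```swift
import AppIntents

// A condensed sketch of the AssetEntity described above (title,
// creation date, location); the sample app has more properties.
struct AssetEntity: AppEntity {
    static let typeDisplayRepresentation: TypeDisplayRepresentation = "Asset"
    static let defaultQuery = AssetQuery()

    let id: String

    @Property(title: "Title")
    var title: String?

    @Property(title: "Creation Date")
    var creationDate: Date?

    @Property(title: "Location")
    var location: String?

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title ?? "Asset")")
    }
}

struct AssetQuery: EntityQuery {
    // Hydrate entities from the app's model layer (stubbed here).
    func entities(for identifiers: [AssetEntity.ID]) async throws -> [AssetEntity] { [] }
}
```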
Let me show you how to expose this Intent to Siri and integrate my app with Apple Intelligence. This year with App Intent domains, I can do this by adopting the new open photos schema. Let me add the new Assistant Intent macro to my App Intent declaration.
As I start typing, I get code completion suggestions that help me along the way. First let me pick the photos domain, and second, the openAsset schema.
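After picking the schema, the declaration looks something like this sketch; the photos/openAsset schema and the target parameter follow the session, while NavigationManager and its open method are assumptions based on the demo narration.

```swift
import AppIntents

// A sketch of the intent after adopting photos/openAsset. The target
// parameter follows the schema's shape; NavigationManager and its
// open method are assumed app dependencies.
@AssistantIntent(schema: .photos.openAsset)
struct OpenAssetIntent: OpenIntent {
    var target: AssetEntity

    @Dependency
    var navigation: NavigationManager

    @MainActor
    func perform() async throws -> some IntentResult {
        navigation.open(asset: target)
        return .result()
    }
}
```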
That’s it. Let me build the app now.
Oh, what's this new error doing here? When exposing my intent to Siri, I also need to make sure any associated entities or enums are exposed as well. Let's jump to the definition of AssetEntity to fix this error.
Ah, it looks like I forgot to also expose my entity. Let’s fix that by adding the new Assistant Entity macro to its declaration.
As I start typing, auto-complete once again suggests photos.
I can now pick asset as the schema for this entity.
Let’s try building one more time.
Great, it looks like that fixed the previous error with my intent, though by adding this new conformance, I am now getting a new build failure in my entity. Our models are trained to expect an entity with a particular shape. By conforming my entity to a schema, the compiler is able to perform additional checks that validate the shape of my entity. In this case, it looks like I am missing a property for hasSuggestedEdits. Let’s add it now.
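In code, the fix amounts to one more stored property so the entity matches the schema's shape. A sketch, with other properties elided and hasSuggestedEdits taken from the error described above:

```swift
import AppIntents

// The entity, now conforming to photos/asset; the schema's shape
// requires hasSuggestedEdits, so adding it clears the build error.
@AssistantEntity(schema: .photos.asset)
struct AssetEntity: AppEntity {
    static let defaultQuery = AssetQuery()

    let id: String
    var title: String?
    var creationDate: Date?

    // The property the compiler reported as missing.
    var hasSuggestedEdits: Bool

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title ?? "Asset")")
    }
}
```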
And voila, by adding the missing property, my entity now matches the shape of the schema and my build succeeds. The compiler is a great tool to help you conform existing App Intents to schemas. If you’re starting from scratch, we are also exposing code snippets that fill in the required shape on your behalf.
My model already has a method for favoriting photos. Let’s use it to build a new App Intent to expose this functionality to Siri.
Let's go back to my AssetIntents file.
This time, let’s use a template instead.
All I have to do is replace this place holder with my actual entity type.
Retrieve my app dependencies.
And implement perform.
And that’s it.
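Filled in, the template comes out looking something like this sketch; the photos/updateAsset schema matches the UpdateAssetIntent named later in this session, while the optional parameter and the model dependency are illustrative assumptions.

```swift
import AppIntents

// A sketch of the completed template, assuming a photos/updateAsset
// schema; the optional isFavorited parameter lets Siri update just
// that field, and MediaLibrary is a hypothetical model dependency.
@AssistantIntent(schema: .photos.updateAsset)
struct UpdateAssetIntent: AppIntent {
    var target: AssetEntity
    var isFavorited: Bool?

    @Dependency
    var library: MediaLibrary

    func perform() async throws -> some IntentResult {
        if let isFavorited {
            // Calls the model's existing favoriting method (assumed name).
            try await library.setFavorite(isFavorited, for: target.id)
        }
        return .result()
    }
}
```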
The compiler is happy and my build now succeeds. As you saw from the demo, Assistant Schemas enable additional build time validation on existing App Intents. This validation ensures your implementation of the schema matches the shape on which our models have been trained.
We are also exposing Xcode snippets to make it even easier for you to build these from scratch.
Now let's talk about how to test and run these in the Shortcuts app. Like any App Intent, schema-conforming App Intents automatically appear as actions in the Shortcuts app, connecting them to the entire Shortcuts ecosystem. This includes personal automations, Home Screen shortcuts, and more. The Shortcuts app is a great way for you to test Assistant Schemas today. In the future, these same intents and entities will automatically work with Siri. Remember OpenAssetIntent and UpdateAssetIntent from the previous demo? Let me show you how to use the Shortcuts app to perform these actions and test them end-to-end. Let me start by launching the Shortcuts app. I can tap the plus button at the top to create a new shortcut. I can then tap on AssistantSchemas to filter for actions from my app. Let's start with Open Photo.
Great, now let's save it.
I now have a new tile. Tapping on it will perform the OpenAssetIntent you saw earlier.
I am presented with a few options to choose from. I can select one to open it in my app.
As you can see, the action was performed and my app navigated to the right picture. As you saw from the demo, Shortcuts are a great way for me to test actions today. In the future, the same intents will automatically work with Siri.

Now that I have built and verified schema-conforming actions in my app, I'm excited to give you a sneak peek at personal context. Apple Intelligence will equip Siri with a rich understanding of your personal context. That means Siri can search and reason over all the information available across your device in a safe and private way. Let's start with in-app search. Built on top of the existing ShowInAppSearchResultsIntent, it allows the system to tap directly into your app's search capabilities. Siri will navigate the user directly to your search results.
Using this App Intent, an email app like Superhuman will be able to give users the ability to say, “Find bicycles on Superhuman,” and view results in their app.
Here's an App Intent for displaying search results in my app. It conforms to the existing ShowInAppSearchResultsIntent type. Today, we are introducing a new Assistant Schema under the system domain to integrate your app's built-in search functionality with Siri. To conform to this new schema, simply add a Swift macro before your App Intent declaration. This search intent can also benefit from pre-defined shapes; you can even drop the ShowInAppSearchResultsIntent type, resulting in a more concise App Intent implementation.
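Here's a sketch of that transformation. The system/search schema and the StringSearchCriteria type come from the existing ShowInAppSearchResultsIntent API; the navigation call is a stand-in for your app's own routing.

```swift
import AppIntents

// A sketch after adopting the system/search schema. Before, this
// intent conformed to ShowInAppSearchResultsIntent and declared
// searchScopes; with the macro, that conformance can be dropped.
@AssistantIntent(schema: .system.search)
struct SearchIntent: AppIntent {
    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        // Route the user to the app's search results UI (stand-in call).
        NavigationManager.shared.openSearch(with: criteria.term)
        return .result()
    }
}
```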
Building on top of the previous demo, wouldn’t it be great if Siri could search for photos in my app? With the new system search macro, I can. Let me show you how.
My app already has a built-in search bar that allows me to filter for pictures by location.
I can type “New” to filter for pictures of New York.
Let's go back to Xcode and see how I can expose this functionality to Siri. My navigation manager already has a method called openSearch(with:), which routes the user to a page displaying search results matching the given criteria. I also already have a Search Assets App Intent; it leverages my navigation manager to display results in my app. With the new App Intent domain for system actions, I can now expose this same intent to Siri and Apple Intelligence. Let me show you how! By adding the new Assistant Schema for search, Siri will be able to route users directly to my search results UI. And that's it. Let's build and run to test this in action with Shortcuts.
Once again, I can open the Shortcuts App, and create a new action. I can filter for my app.
This time, let's use Search Photos, and save it. Let's run it now.
Since this App Intent takes a parameter for search criteria, I get prompted for a value. Let’s search for San Fran.
As you can see, my action was performed successfully, and I now see this beautiful photo of the Golden Gate Bridge. As you saw from the demo, in-app search is a great way to bring users into my app to show results in its beautiful UI.

This year we are expanding Siri's capabilities so it can do even more when I give it a deeper understanding of my app's content. Now, thanks to Apple Intelligence, Siri is gaining the ability to do semantic search. This means that when I search for pets, it's not just looking for the word “pet”; it will find cats, dogs, and maybe even snakes. Now, with LLMs, Siri understands what a pet is. Once it finds your content, it can take action directly on it, like sharing your favorite pet photo with your friends.
For your apps, you'll be able to use the App Intents framework to define entities that provide this additional context. Conform to the new API called IndexedEntity to give Siri the ability to search your app's content, making information available in the semantic index.
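Here's a brief sketch of what that conformance might look like, assuming the IndexedEntity protocol and the CoreSpotlight donation API covered in “What's new in App Intents”; where and when you donate entities is up to your app.

```swift
import AppIntents
import CoreSpotlight

// A sketch: opting an entity into the semantic index via the new
// IndexedEntity protocol.
extension AssetEntity: IndexedEntity {}

// Donate entities so Siri can search over them, for example whenever
// the library's contents change.
func donateToIndex(_ assets: [AssetEntity]) async throws {
    try await CSSearchableIndex.default().indexAppEntities(assets)
}
```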
For more information about IndexedEntity, check out my colleague's video, “What's new in App Intents”. Thanks to the capabilities of Apple Intelligence, this year marks the start of a new era for Siri. The tools you need to integrate your app with Siri are available starting today, and in software updates, we'll be rolling out the in-app actions and personal context you just heard about. In this session, I walked you through the process of bringing my media library app to Siri, but we have yet to see a demo of it in action. Before I end today's session, here is a sneak peek of what you'll be able to do with your app in the future.
Add this photo to the California album.
Email it to Josh.
Wrapping up. Thanks to Apple Intelligence's new large language models, Siri is now more capable, more flexible, and more intelligent than it's ever been. SiriKit and App Intents are the two frameworks for integrating your app with Siri. If your app does not overlap with an existing SiriKit domain, App Intents is the right framework for you. Speaking of intelligence, Siri will be able to take actions inside apps on your behalf. Adopt the new Assistant Schemas API so your app can benefit from this too.
Here are some great videos to watch next.
Thanks for watching!