Designing Audio-Haptic Experiences
Learn essential sound and haptic design principles and concepts for creating meaningful and delightful experiences that engage a wider range of human senses. Discover how to combine audio and haptics, using the Taptic Engine, to add a new level of realism and improve feedback in your app or game.
Resources
- Core Haptics
- Human Interface Guidelines: Playing haptics
- Playing a Custom Haptic Pattern from a File
- Playing Collision-Based Haptic Patterns
- Updating Continuous and Transient Haptic Parameters in Real Time
- Presentation Slides (PDF)
I'm sure you're familiar with that sound.
It's been part of our life for years. In 2019 though, I think we can do better. I'm Camille Moussette, interaction designer on the Apple design team. And, I'm Hugo Verweij, sound designer on the design team.
This session is about designing great audio-haptic experiences. Our goal with this talk is for you to be inspired, and to leave you with practical ideas about how to design great sound and haptics that, when used together in the right way, can bring a new dimension to your app.
During the next 30 minutes, we'll talk about three things. First, we'll introduce what an audio-haptic experience is. Then we'll look at three guiding principles to help you design those great experiences.
Lastly, we'll look at different techniques, and practical tips, to make those experiences great and truly compelling.
So, what is an audio-haptic experience? Well, let's start by listening to a sound. OK, now let's lower the sound. What happens if I lower it even further? Whoa. It's so low I can't hear it any more. You know, our ears just don't register it any more. But, if you would put your finger on the speaker, you could still feel it move back and forth.
We designed the Taptic Engine specifically to play these low frequencies that you can only feel.
Here it is in the iPhone.
And, next to it, the speaker module. The haptic sensations from the Taptic Engine are synchronized to the sounds coming from the speaker. And, the result is what we call an audio-haptic experience. But, haptic sensations are meant to be felt, so because we are presenting this on stage, and on screen, we need your help in imagining what this would feel like. We'll do our best to help you by visualizing the haptics like this.
Or, by playing a sound that resembles the haptic, like this. We will also visualize these experiences on the timeline, and Camille will tell you some more about how that's done in a quick haptics design primer.
iOS 13 introduced a new API for designing your own custom haptics. It's called Core Haptics. This new API allows you, developers, to fully use the Taptic Engine in iPhone.
The Taptic Engine is capable of rendering a wide range of experiences. It can generate a custom vibration, like this. It looks like this, and it should sound and feel like this. So, as you've seen, we're using a waveform and sound to represent haptics. As Hugo said, you need to imagine this in your hand as a silent experience. This should be felt and not heard. So, we can play these continuous experiences.
We can also have something that's shorter and more compact. It's a single cycle, and we call this experience a transient.
It's much more momentary, and it feels like an impact, or a strike, or a tap.
Very momentary. And then we actually can refine it further. Moving forward, we'll use basic shapes to represent haptics in different patterns. So, our transient becomes our simple rectangle.
And, because our Taptic Engine is an exceptional piece of haptic engineering, we can modulate the experience in different ways. First, we can modulate the intensity or the amplitude. We can also make it feel more round, or soft.
At the other extreme, we can make it more precise and crisp.
So, this experience is possible with the Taptic Engine.
So, in the end, this completes our quick introduction to haptic design, and what the Core Haptics API is all about. We have one dimension, intensity, that you can modulate, and another design dimension, haptic sharpness, that you are in control of, for two types of events: continuous and transient.
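As a rough sketch of what these building blocks look like in code, here is a minimal Core Haptics example that plays a single transient with a given intensity and sharpness. The wrapping function is just for illustration; only the engine, event, and parameter types come from the API discussed here.

```swift
import CoreHaptics

// Minimal sketch: play one transient haptic with a chosen intensity and sharpness.
func playTransient(intensity: Float, sharpness: Float) throws {
    // Not every device has a Taptic Engine, so check first.
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    // In a real app, keep one engine around instead of creating a new one per call.
    let engine = try CHHapticEngine()
    try engine.start()

    let event = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: intensity),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: sharpness)
        ],
        relativeTime: 0)

    let pattern = try CHHapticPattern(events: [event], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```

Low sharpness values feel round and soft; high values feel precise and crisp.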
Now, let's look at the three guiding principles that we want to share with you today. First is causality.
Then, we have harmony.
And, lastly we have utility.
These concepts, or approaches, are used throughout the work that we do at Apple, and we think they can help you as well in your own app experience. For each of them, we'll look at the concept and explain it through a few examples. Let's get started.
Causality: for feedback to be useful, it must be obvious what caused it. So, imagine being a soccer player, kicking this ball. What would the experience feel like? There is a clear relationship, an obvious relationship, between the cause, the foot colliding with the ball, and the effect, the sound of the impact and the feel of the impact.
Now, what this experience sounds and feels like is determined by the qualities of the interacting objects. The material of the shoe, the material of the ball, and then, the dynamics of the action. Is it a hard kick or a soft kick? And, the environment. The acoustics of the stadium, or the soccer field.
Because we are so familiar with these things, it would not make sense at all to use a sound that is very different. Let's try it out, and take it way over the top.
Very strange. That doesn't really work.
Now, when designing sounds for your experiences, think about what it would feel and sound like if what you interact with were a physical object. As an example, let's look at the Apple Pay confirmation.
We wanted the sound and the haptics to perfectly match the animation on screen, the check mark.
So, where do we start? Well, what sounds do you associate with making a payment? What does money sound like? And, what is the interaction of making a payment, using Apple Pay? And, of course we have to look at that animation of the check mark on screen.
It should feel positive, like confirming a successful transaction. Here are a few examples of sounds that were candidates for this confirmation.
This is the first one. This one is very pleasant. But, it sounded a bit too happy and frivolous.
The next one worked really well with the check mark of the animation. But, we felt that its character wasn't really right. It was a little too harsh.
And then, there's this one, the one that we chose in the end. And, that you all know.
It's not too serious, and it's clearly a confirmation. OK. So, we've got our sound. Now, on to the haptics. Our first idea was to mimic the waveform of the sound because it matches perfectly, but after some experimentation, we found that two simple taps actually did a better job. I like to see these as little mini-compositions, where we have two instruments: one that you can hear, and one that you can feel, the haptics.
They don't always necessarily have to play the same thing, but they do have to play in the same tempo.
Here they are together. Notice the lower sound indicating the haptics. OK. And, this is then the final experience with the animation. Again, imagine feeling the taps in your hand when you pay.
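In code, a two-tap "mini-composition" like this could be sketched with Core Haptics along the following lines. The 0.15-second spacing and the parameter values are illustrative guesses, not the shipping Apple Pay pattern, and the engine is assumed to be already created and started elsewhere.

```swift
import CoreHaptics

// Sketch: two transient taps playing in tempo with a confirmation sound.
// Timing and parameter values are illustrative assumptions.
func playConfirmationTaps(on engine: CHHapticEngine) throws {
    let taps = [0.0, 0.15].map { time in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
            ],
            relativeTime: time)
    }
    let pattern = try CHHapticPattern(events: taps, parameters: [])
    // Start the haptics at the same moment as the sound so both "instruments" stay in tempo.
    try engine.makePlayer(with: pattern).start(atTime: CHHapticTimeImmediate)
}
```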
Next, let's look at our second guiding principle, harmony.
Harmony means that things should feel the way they look and the way they sound. In the real world, audio, haptics, and visuals are naturally in harmony because of the clear cause-and-effect relationship. In the digital world, though, we have to do this work manually. New experiences are created in an additive process.
The input and output need to be specifically designed by you, the developer. Let's start by creating a simple interface with the visual. We have a simple sphere dropping and colliding with the bottom of the screen. Let's add audio feedback.
Now, we choose a sound that corresponds to a physical impact, or the bounce of the sphere.
It needs to be short, precise, and clear, and we also modulate the amplitude based on the velocity of the hit. Now, let's do additional work and introduce the third sense, haptic feedback.
So, imagine feeling that hit in your hand. Again, we're trying to design in harmony the sphere hitting the bottom of the screen, so we choose a transient event with high sharpness. We also modulate the intensity to match the velocity of the bounce.
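A sketch of that velocity mapping with Core Haptics might look like the following. The velocity range and the mapping are illustrative assumptions, and the engine is assumed to be created and started elsewhere.

```swift
import CoreHaptics

// Sketch: map the sphere's impact velocity onto the intensity of a sharp transient.
// The 5 m/s "full strength" threshold is an illustrative assumption.
func playBounce(on engine: CHHapticEngine, velocity: Float) throws {
    let normalized = min(max(velocity / 5.0, 0), 1)

    let event = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: normalized),
            // High sharpness for a precise, crisp impact, in harmony with the visual hit.
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 1.0)
        ],
        relativeTime: 0)

    let pattern = try CHHapticPattern(events: [event], parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: CHHapticTimeImmediate)
}
```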
We're not done yet. Because it's really important to think about synchronization between the three senses.
That's where the magic happens. That's where the illusion of a real ball colliding with the wall takes shape. So, here's an example where we broke the rules, and we introduced latency between the visual and the rest of the feedback.
It's clearly broken, and the illusion of a real bouncing ball is completely not there. So, harmony requires great care and attention from you, but when done well, can create very delightful and magical experiences.
Let's look at harmony in terms of the relationship between interaction, visuals, audio, and haptics, in terms of qualities and overall behaviors. We'll look at a simple green dot on screen that we'll animate, and think through what kind of audio and what kind of haptics make sense with that green dot. So, if we add a snappy pop, or a different pulse, what kind of audio, or what kind of haptics, work with these visuals? What if we have a large object on screen? Does it sound different? Does it feel different than a tiny little dot? If we have a different dynamic behavior, a different energy level, a pressing, pulsating dot that really calls for attention might want a different sound and different haptics. And, lastly, something that feels calm, like a heartbeat, warrants a different type of feedback. So, think about the pace, the energy level, and the different qualities that you're trying to convey in your app. Design feedback that tells a consistent and unified story.
I'll illustrate how the harmony principle helps us with designing sound and haptics for the Apple Watch crown. We were all used to our phones and the old-school vibrations coming from them when the Apple Watch came out as the first device with a Taptic Engine. It was the first device that could precisely synchronize sound and haptics.
For Series 4, haptics and a very subtle sound were added to the rotation of the crown. Remember the sharp and precise haptic that Camille described earlier? That was the one that we used for the crown.
But, it was scaled down to match the small size of the crown. And so, the haptics are felt in the finger touching the crown, rather than on the wrist.
For sounds, we looked at the world of traditional watch making for inspiration.
We listened to and recorded all kinds of different watches, some of which sounded quite remarkable, like this one. And then, there are other mechanical objects in the real world with a similar sound, like bicycle hubs. We wanted to find a sound that would feel natural coming from a device like this.
We took these sounds as inspiration before we started crafting our own. And, this is the result. On your wrist, it sounds very quiet, just like you would expect coming from a watch. The perfect coordination between sound and haptics creates the illusion of a mechanical crown. And then, to match this mechanical sensation, our motion team changed the animation so it snaps to the sound and the haptics when using the crown. Let's look at that. And, I'll play it again. Look at the crown, visualizing the haptics. The result is a precise mechanical feel, which is in perfect harmony with what you see and what you hear.
Next, let's look at our third guiding principle, utility. Utility is about adding audio and haptic feedback only when you can provide clear value and benefit to your app experience. Use moderation. Don't add sound and haptics just because you can. Let's look at a simple ARKit app that we made to illustrate this point. In this app, we place a virtual timer in the environment, and the interaction is dependent on the distance to that virtual timer. Let's look at the video first.
So, in this app, we purposely designed audio-haptic feedback to complement the AR interaction, the most significant part of the user experience. Moving closer to the timer, or moving away from it, modulates the audio-haptic experience. The three senses are coherent and unified. We refrained from adding other sound effects or haptic feedback to interactions with the different elements, or other minor interactions in the app.
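One way to sketch that kind of distance-driven modulation is with an advanced pattern player and dynamic parameters. The pattern, the looping, and the distance-to-intensity mapping below are illustrative assumptions, not the app's actual implementation.

```swift
import CoreHaptics

// Sketch: a looping continuous haptic whose intensity follows the distance to a virtual object.
func makeProximityPlayer(on engine: CHHapticEngine) throws -> CHHapticAdvancedPatternPlayer {
    let hum = CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.5),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.3)
        ],
        relativeTime: 0,
        duration: 1.0)

    let player = try engine.makeAdvancedPlayer(with: CHHapticPattern(events: [hum], parameters: []))
    player.loopEnabled = true
    return player
}

// Call whenever the camera moves, with the current distance to the timer in meters.
// The 2 m range is an illustrative assumption: closer means stronger feedback.
func updateProximity(player: CHHapticAdvancedPatternPlayer, distance: Float) throws {
    let intensity = 1.0 - min(max(distance / 2.0, 0), 1)
    try player.sendParameters(
        [CHHapticDynamicParameter(parameterID: .hapticIntensityControl, value: intensity, relativeTime: 0)],
        atTime: CHHapticTimeImmediate)
}
```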
It is often a good idea not to add sound and haptics. So, start by identifying possible locations in your app for audio-haptic feedback, and then focus only on the elements where it can enhance the experience, or communicate something important.
And then, are you tempted to add more? Well, maybe don't. It will overwhelm people, and it will diminish the value of what's really important.
So, to recap, here are the guiding principles one more time. We spoke about causality. How it can help to think about what makes the sound, and what causes the haptics.
About harmony. How sound, haptics, and visuals work together in creating a great experience.
And, utility.
How to look at the experience from the point of view of the person using your app.
Next, let's look at the techniques and practical tips that we can use, together with these three guiding principles, to create great audio-haptic experiences.
First, a small recap of the primitives available in Core Haptics. We have two building blocks that you can work with. The first one is called a transient, and it's a sharp, compact haptic experience that you can feel, like a tap or a strike. The second one is a continuous haptic experience that extends over time. You can specify the duration, how long it should last.
For transients, there are two design dimensions available and under your control. We have haptic intensity, and we have haptic sharpness, to create something more round or soft at the lower values, and something more precise, mechanical, and crisp at the upper bound. Intensity changes the amplitude of the experience, as expected.
For continuous, we have the same two design dimensions. With sharpness and intensity, we're able to create a more organic, rumble-like experience that extends over time, or something that is more precise and more mechanical at the upper values of sharpness. But there are many more details and capabilities in the Core Haptics API. Be sure to check out the online documentation. Now, when designing sounds, keep in mind what will work best with these haptics.
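A tiny sketch of that continuous building block, at the two ends of the sharpness dimension; the values are illustrative, so try both on a device to feel the difference.

```swift
import CoreHaptics

// Sketch: the same continuous building block at both ends of the sharpness dimension.
func makeContinuousEvent(sharpness: Float, duration: TimeInterval) -> CHHapticEvent {
    CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: sharpness)
        ],
        relativeTime: 0,
        duration: duration)
}

let softRumble  = makeContinuousEvent(sharpness: 0.1, duration: 1.0)  // organic, rumble-like
let preciseBuzz = makeContinuousEvent(sharpness: 0.9, duration: 1.0)  // precise, mechanical
```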
For a sharp transient, a chime with a sharp attack will probably work really well. But, if we have a sound that's much smoother, using those same haptics is probably not such a good idea.
So, for something like this, a continuous haptic, ramping up and down probably works better. But, these are not hard rules. There's a lot of room to experiment. And, sometimes, you may find out that the opposite of what you thought would work is actually better. And, this was the case for the Apple Watch alarm that sounds like this.
For a sound like this, you may want to add a haptic like this, because they pair together perfectly.
But, can we make it better? Can we keep experimenting? Maybe flip it around and change the timing? This creates anticipation by ramping up the haptics and then quickly cutting it off and playing the sound. There's a clear action-reaction, and the sound plays as an answer to the haptic. This works really well for the Apple Watch alarm.
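As a sketch of that anticipation idea in Core Haptics, one could ramp a continuous haptic up with a parameter curve, cut it off, and schedule the sound just after. All durations, curve values, and the sound file are illustrative assumptions, not the actual Apple Watch alarm.

```swift
import CoreHaptics

// Sketch: ramp the haptic up, cut it off, then let the sound answer it.
func playAnticipationAlert(on engine: CHHapticEngine, soundURL: URL) throws {
    let audioID = try engine.registerAudioResource(soundURL)

    // A continuous haptic lasting half a second...
    let ramp = CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.4)],
        relativeTime: 0,
        duration: 0.5)

    // ...whose intensity rises from nothing to full, then stops abruptly when the event ends.
    let rise = CHHapticParameterCurve(
        parameterID: .hapticIntensityControl,
        controlPoints: [
            .init(relativeTime: 0.0, value: 0.0),
            .init(relativeTime: 0.5, value: 1.0)
        ],
        relativeTime: 0)

    // The sound plays as the answer, right after the cutoff.
    let chime = CHHapticEvent(audioResourceID: audioID, parameters: [], relativeTime: 0.55)

    let pattern = try CHHapticPattern(events: [ramp, chime], parameterCurves: [rise])
    try engine.makePlayer(with: pattern).start(atTime: CHHapticTimeImmediate)
}
```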
Next, it's pretty common to have a number of events back-to-back to convey different types of experiences. In this case, we have four transient events. And, we noticed that when we presented this to different people, they didn't necessarily feel the first one. The first one becomes a ghost haptic. So, the sequence of four taps is actually reported as a triple tap only.
This could be a problem, or an opportunity.
We could use this ghost effect, where the first event isn't fully perceived, as a priming effect. Let's look at the example of a third-party alert on watchOS. This is the sound and the haptics of that third-party notification. So, this is a really important notification that we want to make sure the user perceives and acknowledges clearly. So, in this case, we use our ghost effect, or our primer in this case, to wake up the skin, and make sure that it's completely ready to feel what's to come.
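The same primer idea could be sketched with Core Haptics on iPhone roughly as follows; the timings and values are illustrative assumptions, not the watchOS notification pattern itself.

```swift
import CoreHaptics

// Sketch: a gentle "primer" transient just before the main taps,
// so the skin is awake when the important part of the pattern plays.
func makePrimedNotificationPattern() throws -> CHHapticPattern {
    let primer = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.4),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.3)
        ],
        relativeTime: 0)

    // The main taps follow a short beat later, at full strength.
    let mainTaps = [0.25, 0.4, 0.55].map { time in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.7)
            ],
            relativeTime: time)
    }

    return try CHHapticPattern(events: [primer] + mainTaps, parameters: [])
}
```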
Let's listen and feel it. So, in this case, we have a clear presentation and recognition of our main notification experience. Next, we can also create contrast between very similar experiences. Here is the sound for the left navigation cue on watchOS. It sounds like this. With our harmony guiding principle, we end up with a really nice haptic that pairs really well with that sound. So, we have a series of double strikes, which sounds and feels like this. Now, if we look at the right cue, we have a similar sound, but slightly different. We can notice the little difference between the left and the right in the audio, but if we continue to follow our harmony principle with the haptics, we end up with an identical pattern between the left and the right experience.
In this case, for the right cue, we double up the haptics from the double strikes. And then, we have true contrast between left and right.
Let's listen to and feel this experience. Again, we have contrast between the left and the right for very similar audio experiences.
So, by now, you have quite a few tools to create your own experiences. We would like to show you one more example to illustrate the points we made. This is a full-screen effect in Messages. The sound and haptics are perfectly synchronized to the animation. And, it's a delightful moment for a special occasion. Let's look at it one more time.
Now, if you haven't yet, I encourage you to try this out on your own iPhone, to experience the haptics yourself. And now, a few more thoughts to consider in addition to the guiding principles that we shared.
The best results come when sound, haptics, and visuals are designed hand-in-hand. Are you an animator? Collaborate with a sound or interaction designer and vice versa. It's the best way to come to a unified experience.
Imagine using your own app for the very first time.
What would you like it to sound like, or feel like? And, then imagine using it 100 times more.
Does it still help you to hear and feel these things? Or, are you overwhelmed? Experience it, and take away all the things that don't feel compelling, or that are not useful. And, don't be afraid to experiment. Try things out. Prototype.
We've seen that you may just come across something amazing, by trying something new.
We're looking forward to seeing, hearing, and feeling what you will come up with in your own apps.
See this URL for more information. Thank you very much. [ Applause ]