Better Haptics, 10x Faster with Lofelt Studio

Daniel Büttner
10 min read · Dec 18, 2020

With iOS 13, Apple introduced Core Haptics — a sophisticated API for creating haptic experiences on iPhones. Last year, we published an in-depth article examining Core Haptics, and explained that this API enables you to create custom haptic content and export it as an AHAP (Apple Haptic and Audio Pattern) file. You then use the AHAP file to integrate haptic content into a game or app.
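
For context, integrating an exported clip takes only a few lines of Core Haptics code. Here is a minimal playback sketch, assuming a bundled file named BowRelease.ahap (a placeholder name), with error handling trimmed:

```swift
import CoreHaptics

// Minimal AHAP playback sketch; "BowRelease" is a placeholder file name.
func playAHAP() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics,
          let url = Bundle.main.url(forResource: "BowRelease", withExtension: "ahap")
    else { return }
    let engine = try CHHapticEngine()
    try engine.start()
    try engine.playPattern(from: url)  // plays the pattern described in the AHAP file (iOS 13+)
}
```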

Core Haptics is definitely a step in the right direction, but it and similar hardware-level APIs still pose significant hurdles for creative haptic design. Developers must learn platform-specific characteristics and then write long, text-based files. The process offers no visual support: no plots, waveforms or envelope displays that show at a glance what the pattern will feel like. And there is often no way to test the haptic experience without the time-consuming cycle of recompiling the app and pushing it to the host device.

We introduced Lofelt Studio to help accelerate and simplify the haptic design process. Studio offers drag-and-drop simplicity that lets you rapidly translate audio files into haptic clips. It provides an intuitive visual editor so you can easily customize those clips, and it enables you to test your haptic clips instantly on a mobile phone. Our goal is to help developers create better haptic experiences for their games and apps, faster — without requiring any coding.

We wanted to see how Studio compares with manual haptic design processes in the real world. How difficult is it for developers to design haptics using hardware-level APIs, and can Studio really deliver better experiences in less time?

To answer these questions, we asked two iOS developers to create haptic experiences for two game sound effects. For the sake of brevity, we’ll walk you through one developer’s process (both devs followed similar steps) and share his pain points. Then, we’ll show you the haptic design process for the same audio files using Studio to see whether it truly offers advantages over manual design.

Meet the iOS developers

Tanay Singhai and Nick Arner

Tanay was, until recently, a research intern in a haptic computing lab at the University of Waterloo in Canada. He studied “juicy” design, in which you provide large amounts of audio, visual and vibrotactile feedback to enhance a game player’s experience. He is now a software engineer for Facebook.

Tanay’s typical haptic design workflow involves writing AHAP files by hand and then testing files on his iPhone using a custom app. He occasionally uses Haptrix to visualize the AHAP file, but he doesn’t use it frequently for editing.

Nick is a developer and researcher with a background in music technology who works in the field of human-computer interaction. He is primarily interested in sensors, human interfaces, augmented reality (AR) and ubiquitous computing. Nick works in the San Francisco Bay Area.

Both Tanay and Nick were asked to design haptic clips for two audio files. One audio file was a bow releasing an arrow; the other was a footstep in snow. Though the developers approached the task slightly differently, they followed broadly similar steps, so we focus only on Tanay’s process.

Designing haptics manually

Tanay’s screen capture (playing at 4x the original speed)

Step 1: Tanay created an Xcode project with two buttons — one for each of the haptic clips that he intended to create.

An Xcode project with buttons for two haptic experiences.
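
Tanay’s actual project is linked later in this article; a minimal sketch of such a harness, with illustrative view and file names, might look like this:

```swift
import SwiftUI
import CoreHaptics

// Two buttons, one per AHAP clip; file names are illustrative.
struct HapticTestView: View {
    var body: some View {
        VStack(spacing: 24) {
            Button("Bow Release") { play("BowRelease") }
            Button("Footstep in Snow") { play("Snow1") }
        }
    }

    // Creates a throwaway engine per tap for simplicity;
    // a real app would keep a single CHHapticEngine alive.
    private func play(_ name: String) {
        guard let url = Bundle.main.url(forResource: name, withExtension: "ahap"),
              let engine = try? CHHapticEngine()
        else { return }
        try? engine.start()
        try? engine.playPattern(from: url)
    }
}
```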

Step 2: Using Adobe Audition, Tanay played the audio clips — “Bow Release” and “Snow 1” — so he could start to get a sense of what each haptic pattern should feel like. He wrote down the time stamps for particular moments where he thought there should be transient feedback or some sort of continuous ramp up or down (or some other change).

Tanay listened to the audio files with Adobe Audition.

He kept track of the audio’s changing amplitude values as well. Using a scale from 0.0 to 1.0, he captured values by scrubbing the audio — moving the playhead across the audio waveform.

Tanay recorded amplitude levels for the footstep-in-snow audio file.

Step 3: Tanay then created the first AHAP file from the time/value pairs he had noted. He started by copying transient events, continuous events and control curves from previous haptic files he had on hand, then modified their time, intensity and sharpness values.

Tanay created the AHAP file.
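
An AHAP file is plain JSON. The fragment below uses made-up values rather than Tanay’s actual file, but it shows the three building blocks he worked with: a transient event, a continuous event and an intensity control curve.

```json
{
  "Version": 1.0,
  "Pattern": [
    {
      "Event": {
        "Time": 0.0,
        "EventType": "HapticTransient",
        "EventParameters": [
          { "ParameterID": "HapticIntensity", "ParameterValue": 0.8 },
          { "ParameterID": "HapticSharpness", "ParameterValue": 0.6 }
        ]
      }
    },
    {
      "Event": {
        "Time": 0.05,
        "EventType": "HapticContinuous",
        "EventDuration": 0.4,
        "EventParameters": [
          { "ParameterID": "HapticIntensity", "ParameterValue": 0.5 },
          { "ParameterID": "HapticSharpness", "ParameterValue": 0.3 }
        ]
      }
    },
    {
      "ParameterCurve": {
        "ParameterID": "HapticIntensityControl",
        "Time": 0.05,
        "ParameterCurveControlPoints": [
          { "Time": 0.0, "ParameterValue": 0.5 },
          { "Time": 0.4, "ParameterValue": 0.0 }
        ]
      }
    }
  ]
}
```

A single typo in any of these numbers, such as the 0.9 versus 0.09 mistake Tanay describes below, silently changes how the pattern feels.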

Step 4: Tanay repeated those steps for the second audio file, a footstep in snow. He listened to the audio file and wrote down time stamps where there were transients or where there should be a continuous vibration. For the footstep, he decided to work only with the transients and omit any continuous vibration.

Step 5: He then built the project in Xcode so he could try the haptic effects on his iPhone. This was the first time he could actually feel the haptic patterns he was creating and determine whether they worked as he wanted.

Step 6: “Iterate, iterate, iterate.” For each small tweak Tanay wanted to make, he had to recompile the app and push it to the iPhone again. He saved each iteration of both haptic patterns with a “_new.ahap” suffix so he could go back and compare iterations later.

Iterating is time-consuming with this approach, and the process becomes increasingly error-prone as the AHAP file grows.

The developer’s perspective

We asked Tanay about his pain points and his assessment of the results he produced.

Where he struggled

“I sometimes made typos in the AHAP files (for example, typing 0.9 instead of 0.09), and I wouldn’t realize I did something wrong until I actually played it on the device and recognized that something was off. A visualization of the AHAP files would have helped me catch these mistakes instantly.”

Iteration time

“In general, iteration is a little slow. If I have a continuous event with intensity and sharpness controls, and if I decide to change their timings, I have to manually change the numbers for each one. If I mess up during this process, I will end up with a typo/bug in the haptic feel.”

Scalability

“The haptic designs here were relatively simple. But creating intricate haptics longer than five seconds would be very challenging by hand because there would be so many numbers to keep track of. Some visualization is an absolute must.”

Quality

“Even though I wrote everything by hand, the timing and quality of the haptic feedback in general felt really good, even on my first try.”

Here are visual plots of the haptic effects that Tanay created:

AHAP graph for bow release.
AHAP graph for footstep in snow.

View the source code for Tanay’s project on GitHub.

This video shows Nick’s design process, which is similar to Tanay’s:

Nick’s screen capture (playing at 4x the original speed)

Streamlining haptic design with Lofelt Studio

As we’ve seen, the manual design process can be time-consuming and tedious. Tanay and Nick had no visual tool to help them envision the envelopes they were creating. And they had no easy way to feel the haptic experiences as they designed: every change they wanted to evaluate meant recompiling the app and pushing it to an iPhone.

Designing haptics with Lofelt Studio is a significantly different process. Let’s look at how our team designed haptics for the same two audio files used by Tanay and Nick.

Lofelt Studio screen capture (playing at original speed)

Step 1: Create a haptic file automatically.

With Lofelt Studio, you can drag an audio file into the Studio desktop app and hit “Analyze.” That’s exactly what we did, starting with the bow release audio file used by Tanay and Nick.

Studio then automated those initial, manual steps. It used an advanced algorithm to analyze the audio file we dragged into the interface, and turned the data into matching haptic envelopes and emphasis points. Studio created not only an amplitude envelope but also a frequency envelope from the original audio. The analysis took mere seconds to complete.
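
Lofelt hasn’t published its analysis algorithm, but the general idea of deriving an amplitude envelope can be illustrated with a toy windowed-RMS pass over the audio samples. This is a simplified sketch, not Lofelt’s implementation:

```swift
import Foundation

// Toy amplitude-envelope extraction: RMS over fixed-size windows.
// Produces one breakpoint per window, in 0.0-1.0 for normalized audio.
func amplitudeEnvelope(samples: [Float], windowSize: Int = 1024) -> [Float] {
    stride(from: 0, to: samples.count, by: windowSize).map { start in
        let window = samples[start..<min(start + windowSize, samples.count)]
        let meanSquare = window.reduce(0) { $0 + $1 * $1 } / Float(window.count)
        return meanSquare.squareRoot()
    }
}
```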

We then reduced the number of automatically created breakpoints so that each point of emphasis stood out from the less important moments.

Step 2: Connect the phone and test haptics instantly.

Immediately after analyzing the audio sample, we opened the Studio mobile app and scanned the QR code that appeared on the Studio desktop app. This connected the mobile and desktop apps. The haptic clip and any edits we made were instantly synchronized.

We were then able to tweak the envelopes and instantly feel the result without having to set up and compile an Xcode project. For example, we could add or remove emphasis points to see how each change affected the haptic experience.

Step 3: Start iterating.

We clicked “Done” and started editing the breakpoints and emphasis points. This is the creative part of the process. We were able to play the haptics along with the original audio, which was very useful. We were also able to mute the phone’s audio so we could feel the haptics without sound.

Step 4: Save the project and export.

After just a few minutes, we were able to create a haptic experience that we were happy with. The bow release felt nice and snappy, and it perfectly matched the audio.

We saved the project so we could come back and edit it further if needed. Then, we exported a .HAPTIC clip — the equivalent of an AHAP file — that would later play inside our iOS app using the Lofelt SDK.
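
We won’t cover SDK integration in depth here. As a rough sketch, playback might look like the following; note that the LofeltHaptics class and its load and play methods are our assumptions for illustration, so consult the SDK documentation for the actual API:

```swift
import LofeltHaptics

// Assumed API sketch: the LofeltHaptics class and its load/play methods
// are illustrative assumptions, not confirmed SDK signatures.
func playHapticClip() throws {
    guard let url = Bundle.main.url(forResource: "BowRelease", withExtension: "haptic"),
          let clip = try? String(contentsOf: url, encoding: .utf8)
    else { return }
    let haptics = try LofeltHaptics()
    try haptics.load(clip)  // load the exported .haptic clip (JSON)
    try haptics.play()      // play it on the device's actuator
}
```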

We repeated the process for the second audio file.

So, how does designing with Lofelt Studio compare to a manual design process?

1. Get started faster

Scrubbing through an audio file and noting time stamps is tedious. Lofelt Studio lets you get to the creative part faster: it automatically generates a precise haptic envelope and breakpoints, which is much faster and more accurate than plotting an envelope by hand.

2. Test results immediately

You shouldn’t need to create and compile an Xcode project to evaluate your work. Imagine if you were a web designer and you had to publish your graphics on a website before you could see what you were creating.

Studio gives you an intuitive, creative workflow for haptic design, and it enables you to test haptic experiences instantly on a mobile phone. You can refine the experience and feel the edits in real time.

3. Deliver better, more detailed haptic experiences

A “better” haptic experience is a subjective judgment. And, as Tanay noted, you can certainly create good haptic experiences manually. But with Lofelt Studio, you can capture the subtleties of an audio file and translate them into a more nuanced, detailed haptic experience, with less work.

In our tests, Lofelt Studio created envelopes with more amplitude and frequency points than the envelopes the developers created by hand. Those more detailed envelopes can produce more realistic experiences for users.

Bow release

Footstep in snow

4. Focus on the creative process

Manual haptic design requires programming skills. Studio is a no-code solution that opens haptic design to a broader range of users, including people with backgrounds in audio and user experience (UX) design. You can focus on the creative process of translating audio into engaging, realistic haptic experiences instead of worrying whether you’ve made a typo when coding an AHAP file.

5. Finish 10x faster

Both Tanay and Nick spent a little more than an hour creating two haptic clips. With Studio, we created a single haptic clip in about 6 minutes.

Of course, you might be able to work faster with manual design, or you might decide to take longer with Studio. But you can start tweaking the envelope and iterate much faster with Studio, since you can test your haptic experiences as you design.

6. Export cross-platform data

For our test, we focused on creating haptic clips for iPhones, since AHAP files work only on iOS devices. The Android Vibrator API takes an entirely different approach to segmenting haptic data. The data is somewhat transferable, but it requires a lot of manual work.

At Lofelt, we want to make it easy to design haptics once for multiple platforms. We’ve just launched a beta version of Lofelt Studio where any .HAPTIC clip you create will work on Android devices and other platforms.

Take the challenge yourself

Want to take the haptics design challenge yourself? The data and code from these tests are publicly available on GitHub. We’d love to hear your feedback. The Lofelt Studio project is also on GitHub.

Thanks to Tanay and Nick for participating in this haptics design challenge, and for sharing insights into their workflows.

Start a 30-day trial of Lofelt Studio, which includes iOS and Android support, and a plug-in for Unity.
