The Keys to Delivering Better Haptics on Android

Daniel Büttner
13 min read · Feb 12, 2021


A comprehensive gap analysis of haptics capability differences between Android and iOS

TL;DR

In the last few years, Apple has raised the bar for haptics, enabling app and game developers to deliver engaging experiences on iPhones. Meanwhile, Android’s diverse ecosystem has made it challenging for developers to deliver experiences at the same quality level. Taking a closer look, there are significant differences between the two platforms throughout their haptic tech stacks — from the hardware actuators, drivers and hardware abstraction layers (HALs) up to the application APIs that developers use to define the haptic experience for their apps. To close this gap and support rich, real-time experiences such as mobile gaming, the Android platform needs framework enhancements that can enable on-the-fly haptic playback, consistent transient playback and better synchronization between audio and haptics. The Lofelt haptic framework for Qualcomm’s Snapdragon mobile platform augments Android with advanced playback capabilities to address key gaps.

Today’s smartphone designs highlight the essential need for advanced haptics. Devices with large, glass screens that serve as the primary user interface need ways to provide feedback other than through physical buttons.

Haptics should have progressed alongside touch displays. But for many years, the haptics industry stalled, in part because of a classic chicken-and-egg problem. Without popular mobile apps containing rich haptic content, manufacturers lacked the incentive to add more sophisticated actuators to their devices. Meanwhile, app and game developers were reluctant to spend time and resources designing haptic experiences that could not be played back on existing devices.

In the last few years, the haptics industry has started to move forward, but progress has been uneven. Apple forged ahead with the introduction of its Taptic Engine actuators and Core Haptics API, raising the bar for haptics on mobile devices. The Google Android ecosystem has progressed at a slower pace.

How has the Apple iOS platform raced ahead and why has Android trailed? Where are the gaps between the two platforms? Most importantly, what can be done to accelerate the progress toward delivering richer, more engaging haptic experiences across all platforms?

Solving the chicken-and-egg problem at Apple

Apple has made haptics a top priority with a multi-year strategy to dramatically enhance the touch experience for mobile devices. And because Apple produces the hardware and software needed for delivering high-quality haptic experiences, the company has been in a strong position to drive broad change within its ecosystem.

The introduction of the Taptic Engine to Apple iPhones in 2015 signaled Apple’s strong commitment to haptics. The Taptic Engine is a well-designed and relatively large actuator. Apple shipped phones with Taptic Engines for years before there were easy ways for third-party developers to capitalize on them.

To facilitate the design and development of haptic experiences for the Taptic Engine, Apple introduced Core Haptics — a powerful API — in 2019. This was a turning point for the haptics industry. Developers now had both API access and the hardware needed for producing sophisticated haptic experiences across all newer iPhones. Beyond simple notifications, developers could simulate button clicks on a touch screen and build full multimodal experiences for games and movies.

While Apple was able to solve the chicken-and-egg problem within its own ecosystem, some mobile app developers are still cautious about making haptics a core part of their experience because of the gaps between iOS and Android devices. Developers creating apps for both platforms often want to achieve at least 80% of the iOS experience on 80% of Android devices. To truly move the haptics industry forward, the Android platform needs to offer capabilities similar to those of the iOS platform.

Facing conflicting Android goals

Why has the Android ecosystem been slower than the iOS ecosystem in enabling the design and delivery of engaging haptic experiences? Android’s vast ecosystem and platform neutrality mean device manufacturers may choose their own components — such as displays, speakers and haptic drivers — to meet different price points and capabilities. In addition, device manufacturers using Android often create their own custom hardware abstraction layer (HAL) implementations. The HAL is what enables software to control the hardware components.

The differences in device components, the limitations of customized HAL implementations and the lack of a reference HAL implementation for all Android devices make it difficult for app developers to deliver consistent high-quality haptic experiences across all Android devices. A haptic effect for an Android game will feel different on a Samsung smartphone than on a Xiaomi smartphone, even though both are running Android.

The iOS platform does not have the same level of fragmentation. Because Apple controls both the software and the hardware, it can help ensure consistency for haptics across various device models.

In recent years, Android phone OEMs have attempted to work around the framework limitations and lack of content support by generically creating haptics from the app’s audio. There are several problems with this approach:

  • App audio often mixes effects with music and voices, and a context-unaware audio-to-haptics algorithm running on a phone in real time will not produce satisfactory results. Unsurprisingly, user reactions to this capability have been very mixed, probably because of the unsatisfactory quality and the heavy-handed application of generic haptic feedback.
  • This approach takes away the creative decision from content creators. There is no way for developers to stop the phone from superimposing a generic haptic experience on their app UX. Phones should not decide how a game or app feels.
  • Haptics, like audio and visuals, needs parity and consistency across devices. Especially in mobile gaming, players will learn haptic cues designed by the game developer, and they will expect those to remain consistent when playing the same game on another device.

While generically rendered haptic feedback might be a viable temporary solution, it will never become a widely accepted experience and will quickly be superseded by curated haptics. Device makers who want to use haptics as a differentiator should instead focus on a premium experience through improved components, such as a more performant actuator or advanced drive electronics.

So, how can developers deliver consistent high-quality haptic experiences across Android devices while operating within Android’s open ecosystem? To answer that question, let’s analyze key gaps between iOS and Android platforms in terms of the capabilities for delivering haptic experiences.

Analyzing haptic gaps between iOS and Android platforms

There are significant gaps between iOS and Android platforms throughout the haptic technology stack. Identifying key differences is the first step in improving the quality and consistency of haptics for Android, and ultimately driving the entire industry forward.

The following gap analysis focuses primarily on the software framework and drivers that allow an application to render a rich haptic signal. A detailed actuator analysis will be covered in a separate article.

This gap analysis examines the software and hardware layers in the technology stack.

Hardware — Haptic actuators

Working from the bottom up, the hardware layer is where we see the first notable gap between iOS and Android platforms. Apple’s commitment to haptics is clearly evident in the size of its Taptic Engine and the large amount of internal real estate the company has devoted to that actuator within the iPhone. Although Apple has slightly reduced the size of the Taptic Engine in recent iPhone models, it is consistently larger than the actuators in Android devices. (For comparative measurements of actuators in multiple iPhones and Android smartphones, see “Comparing actuator sizes in Apple iOS and Android devices.”)

Averaged actuator footprints of iOS and Android actuators, to scale.

Why is size so important? Haptics is physics. To accelerate a mass (like a smartphone) and produce a strong vibration, you need sufficient force. The size of the actuator and its size in relation to the weight of the phone help determine how much force the actuator can produce. The size of the Taptic Engine gives iPhones a physical advantage in producing rich, high-intensity haptic effects.
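As a rough illustration: force = mass × acceleration, so producing a peak acceleration of 1 g (about 9.8 m/s²) in a 200-gram smartphone requires roughly 0.2 kg × 9.8 m/s² ≈ 2 N of force. A larger actuator, with more moving mass, can generate that force with less displacement, which is what makes strong, sharp effects feasible.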

Hardware — Drive electronics

Apple’s possible use of drive electronics further distinguishes iPhone hardware from Android device hardware. An Apple patent filing (U.S. patent 10,007,344) suggests that iPhones use a sophisticated closed-loop feedback system: a sensor might detect a change in the resonant frequency of the phone, such as when a user holds it, and an advanced driver integrated circuit (IC) then adjusts the drive signal to ensure the user still feels the intended haptic effects. Given the closed, proprietary nature of Apple’s technology, these are only assumptions based on IP filings and teardowns.

Operating system / Hardware abstraction layer (HAL)

The iPhone HAL and the Android HAL are designed to address different priorities — mainly because Android devices are part of an open ecosystem and iPhones are not.

The iPhone HAL is likely fully optimized for the Taptic Engine and its capabilities. In theory, Apple might measure the response from the Taptic Engine within each iPhone model and then provide that data to the software controlling the actuator.


Because Android is open source, the Android HAL was designed to be customized. Android device manufacturers need to tailor the HAL to their choice of haptic hardware — not only a particular actuator model but also the type of motor. Consequently, unlike the iPhone HAL, the Android HAL is not optimized for one particular actuator type.

These fundamental differences between the iPhone HAL and the Android HAL ultimately contribute to differences in the quality of haptic effects that developers can deliver on each platform without additional work. So, what capabilities would Android developers require to deliver iOS-quality haptics despite the diverse Android device ecosystem?

  • Device-specific signal rendering: An operating system’s signal render engine translates relative amplitude and frequency values from an app or game into a signal that drives an actuator. iPhones include device-specific signal rendering that produces consistent output across different phones; Android devices do not (though device manufacturers can write their own render engines).
  • Hardware calibration profile: A hardware calibration profile contains specific values for the device, such as frequency response or maximum voltage for safe moving-mass displacement in a voice-coil motor (VCM). The profile helps map incoming signals to values tailored to the device. iPhones likely use a closed-loop method to calibrate the drive signal for the Taptic Engine, allowing the system to display a broad range of haptic signals. On Android phones, an actuator or system profile is neither mandatory nor standardized, so an application sitting above the HAL can only express generic haptics through it.

These two points relate to the same overarching issue: the platform needs to understand the vibration characteristics of the device it is running on, and it must be able to create a drive signal that matches those characteristics. iPhones map signals to specific device characteristics. By contrast, Android device manufacturers are responsible for implementing the HAL and incorporating any optional hardware profiling and calibration. Android app developers can only create generic haptic output. Even if an Android device is calibrated to display haptics over a broad frequency range and with sharp transients, an Android app cannot use this capability through the standard Android API.
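To make the calibration-profile idea concrete, here is a minimal Kotlin sketch of what such a profile might contain. This is purely illustrative: Android defines no such type, and all names and fields here are hypothetical.

```kotlin
// Hypothetical sketch only: Android defines no such profile type.
// A device-specific calibration profile that a HAL or render engine
// could use to map relative haptic values to a safe drive signal.
data class HapticCalibrationProfile(
    val resonantFrequencyHz: Float,   // measured f0 of the actuator
    val qFactor: Float,               // sharpness of the resonance peak
    val maxSafeVoltage: Float,        // limit protecting moving-mass displacement
    val usableFrequencyRangeHz: ClosedFloatingPointRange<Float>
)

// Map a relative amplitude (0.0–1.0) to a drive voltage that respects
// the device's physical limits.
fun driveVoltage(profile: HapticCalibrationProfile, relativeAmplitude: Float): Float =
    relativeAmplitude.coerceIn(0f, 1f) * profile.maxSafeVoltage
```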

Application layer

At the application layer, iOS developers use Core Haptics, while Android developers use the Android Vibrator API or other, newer APIs. Both Core Haptics and the Android Vibrator API can handle simple notification and confirmation feedback (as defined by the Haptics Industry Forum), where synchronization with audio is not important:

Notification — Captures a user’s attention as a result of an asynchronous event. Pagers have used this modality since the 1980s.

Confirmation — Provides users with confirmation that their intent was executed. Virtual keyboards, mechanical button replacements and UI interactions provide confirmation feedback.
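For reference, confirmation feedback of this kind takes only a few lines with the standard Android Vibrator API. A minimal sketch using the predefined click effect (available since Android 10):

```kotlin
import android.content.Context
import android.os.VibrationEffect
import android.os.Vibrator

// Confirmation feedback: play the device's predefined click effect,
// e.g., in response to a button press. Requires the VIBRATE permission;
// createPredefined() is available from API 29 (Android 10).
fun playConfirmationClick(context: Context) {
    val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
    vibrator.vibrate(VibrationEffect.createPredefined(VibrationEffect.EFFECT_CLICK))
}
```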

But multimodal experiences, such as for mobile gaming and media streaming, require more sophisticated real-time control of haptic playback than notifications or confirmations. Where are the gaps between Core Haptics and the Android API in supporting multimodal experiences?

  • Real-time, low-latency parameter control: Beyond playing back a predefined haptic file, apps and games need to create and manipulate haptic playback in real time. For example, you might want to reduce the intensity of a haptic effect in your game if a game character is farther away from the object generating that effect. Or you might want to render a haptic signal for a car engine following the user’s acceleration of the car. Core Haptics is designed to support real-time control; the Android API is not designed for continuous real-time control.
  • Interpolation in amplitude and frequency control: To create a natural haptic feeling, developers must be able to create smooth amplitude ramps by sending envelope breakpoints to the part of the system generating the drive signal. The signal generator must then interpolate between breakpoint values to create a smooth signal ramp. Core Haptics offers interpolation during real-time playback and when playing predefined Apple Haptic and Audio Pattern (AHAP) files. Android allows only stepwise changes of intensity and no control over frequency at all (see the sketch after this list).
  • Sharp haptic effects, often referred to as transients: Android does offer some support for transient playback via primitives, but not all phones support those primitives. In addition, those primitives can’t be played in parallel with continuous vibrations. Transient playback should be standardized and adhere to the same on-the-fly playback principles as continuous vibrations to help ensure consistent playback across different devices.
  • Unlimited playback duration: With Core Haptics, a haptic effect can technically play for an infinite amount of time without glitches or resets. On Android, playback is commonly reset after a certain duration, depending on the device.
  • Audio haptic synchronization: To deliver an immersive, multimodal experience, you need to synchronize haptics with other modes, such as audio. And that synchronization must be maintained over long periods of time. AHAP files can play back audio files and guarantee synchronization between audio and haptics. Synchronization using the AudioTimestamp API is theoretically possible, but it is rather complex for developers to manage. And importantly, previous versions of Android had an issue that caused haptics to drift.
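The following minimal Kotlin sketch illustrates two of the limitations above with the standard APIs: createWaveform() only steps between discrete amplitude levels without interpolation, and transient primitives are a separate, device-dependent path that cannot be mixed into a running continuous vibration.

```kotlin
import android.os.VibrationEffect
import android.os.Vibrator

// Stepwise amplitude control: each segment holds a constant amplitude
// (1–255) for its duration; the platform does not interpolate between steps.
fun playSteppedRamp(vibrator: Vibrator) {
    val timings = longArrayOf(50, 50, 50, 50)       // duration of each step in ms
    val amplitudes = intArrayOf(64, 128, 192, 255)  // discrete levels, no smoothing
    vibrator.vibrate(VibrationEffect.createWaveform(timings, amplitudes, -1))
}

// Transient playback via primitives (API 30+): a separate path that not
// all devices support and that can't be mixed into a running continuous
// vibration. The support check itself requires API 31.
fun playTransientClick(vibrator: Vibrator) {
    if (vibrator.areAllPrimitivesSupported(VibrationEffect.Composition.PRIMITIVE_CLICK)) {
        vibrator.vibrate(
            VibrationEffect.startComposition()
                .addPrimitive(VibrationEffect.Composition.PRIMITIVE_CLICK, 1.0f)
                .compose()
        )
    }
}
```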

The Android Vibrator API cannot sync haptics and audio. However, Android does offer an under-documented alternative path for driving haptics using an Ogg file that carries an extra haptic channel. This approach provides amplitude and frequency control as well as audio-haptic synchronization. Nevertheless, it imposes several limitations on app developers: perhaps most importantly, only very few phones support playing the haptic channel of an Ogg file.
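For completeness, here is a minimal sketch of this Ogg path, assuming a device that supports haptic channels and an asset that actually carries one (the resource parameter is illustrative):

```kotlin
import android.content.Context
import android.media.AudioAttributes
import android.media.AudioManager
import android.media.MediaPlayer

// Play an Ogg resource whose extra channel carries haptic data.
// Haptic channels are muted by default; unmuting them (API 29+) only has
// an effect on the few devices whose audio/haptics HAL supports this path.
// The resource ID is illustrative; the asset must actually contain a
// haptic channel authored alongside the audio.
fun playAudioCoupledHaptics(context: Context, resId: Int) {
    val attributes = AudioAttributes.Builder()
        .setUsage(AudioAttributes.USAGE_MEDIA)
        .setHapticChannelsMuted(false)  // route the haptic channel to the vibrator
        .build()
    val player = MediaPlayer.create(
        context, resId, attributes, AudioManager.AUDIO_SESSION_ID_GENERATE
    )
    player.start()
}
```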

Android’s haptic APIs have evolved over the years.

Overall, Core Haptics offers far more advanced and consistent capabilities than the Android Vibrator API. Still, Core Haptics leaves room for improvement: using it in applications with complex, multimodal experiences (e.g., gaming) requires a sizable investment from developers. A higher-level framework, integration with game engines and more mature design tools would help simplify haptics integration for application developers.

There are significant differences in haptic capabilities and tools available for iOS and Android platforms.

Closing the gaps: Focusing on near-term improvements

As we’ve shown, there are substantial gaps between iOS and Android platforms at multiple layers of the technology stack. Not all gaps need to be closed immediately. But to help developers create rich, engaging haptic experiences across multiple Android devices, we believe these four improvements are mandatory:

  1. Real-time haptics streaming: Developers need an API for real-time haptics streaming that can enable real-time parameter control, mixing of simultaneous haptic events (including transients) and real-time generation of haptic streams from audio (a hypothetical interface is sketched after this list).
  2. Consistent, standardized transient playback: The Android platform should support consistent, standardized transient playback for click-like effects and dynamic expression. Transient playback should include amplitude control and provide reliable waveform definition across Android devices.
  3. Callback for interruptions: A callback should notify the Android haptic player of any interruption, such as when a call or text message notification plays, so that it can resume playback afterward.
  4. Haptics and audio synchronization: Developers should have a way to synchronize haptics and audio playback, both for pre-rendered and on-the-fly haptics. A callback could notify the haptic player when audio actually starts playing.
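To make the first improvement more tangible, here is a purely hypothetical Kotlin sketch of what a real-time streaming API could look like. None of these types exist in Android today; the interface simply restates the requirements above as code.

```kotlin
// Purely hypothetical: none of these types exist in Android today.
// A streaming interface with continuous parameter control and transient
// mixing, restating improvement 1 above as code.
interface HapticStream {
    fun start()
    fun stop()

    // Continuous control: the render engine would interpolate between
    // successive values instead of stepping.
    fun setAmplitude(amplitude: Float)  // 0.0–1.0, relative to device maximum
    fun setFrequency(frequency: Float)  // 0.0–1.0, mapped to the device's usable range

    // Mix a sharp transient on top of the running continuous signal.
    fun playTransient(intensity: Float, sharpness: Float)
}

// Example: tie engine haptics to a car's RPM in a game loop.
fun onGameTick(stream: HapticStream, normalizedRpm: Float) {
    stream.setAmplitude(normalizedRpm)
    stream.setFrequency(0.2f + 0.6f * normalizedRpm)
}
```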

In addition to these technical improvements, developers need greater transparency about Android devices’ haptic capabilities. Developers shouldn’t have to spend time searching online for device specifications or — worse — testing their app experience on potentially hundreds of devices.

Looking ahead: From Snapdragon integration to mobile accessories and industry standards

At Lofelt, we are committed to accelerating the progress of the haptics industry. And we believe that reducing haptic-related gaps between Android and iOS platforms will be vital in moving the entire industry forward. To that end, Lofelt and Qualcomm Technologies, Inc. will deliver an advanced haptic framework to the popular Snapdragon mobile platform. The Lofelt framework will be a software option for Android OEMs using Qualcomm Snapdragon processors, which today power more than 50 percent of all Android smartphones.

The work we’re doing at Lofelt to help deliver high-quality experiences to mobile devices will not be limited to smartphones. Our aim is to enable developers to bring rich, engaging haptic experiences to the growing number of actuator-equipped, Bluetooth-connected smartphone peripherals, such as game pads. With iOS 14, Apple extended Core Haptics capabilities to include haptic feedback on game controllers, such as the Sony PS5 DualSense, providing a single API for all haptic playback on the phone or peripheral. In the future, the new haptic framework we are introducing with Qualcomm will enable Android developers to deliver haptics to device accessories without having to deal with numerous device-specific APIs.

In the meantime, our work also extends beyond the specific technologies we are providing to developers and device manufacturers. In particular, we are deeply involved in the creation of haptics standards. We proposed the VT-1 specification, which includes parameters and requirements for creating high-quality, realistic user experiences through vibrotactile (VT) haptic feedback. And as a founding member of the Haptics Industry Forum (HIF), we have collaborated with other haptics experts to create a harmonized HD Haptics Specification. Through these and other efforts, we hope to accelerate changes in the haptics industry that will benefit all participants.

Lofelt Studio is currently available for iOS and Android. Sign up here to join our developer community.

Learn more about the Lofelt partnership with Qualcomm Technologies, Inc.: lofelt.com/qualcomm

A special thanks to the entire Lofelt team for their support and extensive research 🖤
