James's Blog

Sharing random thoughts, stories and ideas.

Hacking Physics

Posted: Dec 19, 2020
◷ 6 minute read

I’ve always joked that Apple would integrate proprietary physics if they had control over the laws of nature. In their quest to achieve an ever-better user experience across their products, physics often seems to be the last obstacle standing in the way. With the launch of the M1-based Macs, Apple has truly reached total vertical integration on their desktop computing platform, all the way from software applications to the literal (Apple) silicon. If going further were possible, say if the speed of light could be increased beyond $c$, and if doing so would make Safari load websites faster, Apple would no doubt do it. Of course, the joke is that they obviously cannot create “proprietary physics”. But I think they are trying really hard to “hack” their way there.

Display tech offers a great example of how Apple hacks physics: their HDR implementation. Since the contrast levels demanded by High Dynamic Range content simply aren’t achievable on regular LCD screens, HDR content requires special hardware (e.g. LCDs with local dimming, or OLED) to be viewed properly. So experiencing HDR on Standard Dynamic Range (SDR) displays seems to be blocked by physics, and most people don’t give it a second thought (“just buy a better screen”). And even though Apple cannot change physics, they push on the boundary as much as they can, with their special handling of HDR on SDR screens. The gist of what they do is simple. The essence of the HDR experience comes down to contrast, and even though they cannot make pixels brighter than their maximum (gated by physics), they can dim the regular pixels (such as those in the UI), so that when they have to show HDR content on an SDR display, there is some extra dynamic range headroom. This extended dynamic range is fake of course, a “hack”, since the pixels aren’t any brighter than the hardware’s maximum (nowhere near the 1,000 nits required for real HDR), but the perceived effect is certainly real. Through this hack, Apple has achieved what should not be possible under the laws of physics: (perceived) extended dynamic range on SDR screens.
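To make the trick concrete, here is a minimal sketch (not Apple’s actual pipeline; the function name, nit values, and headroom factor are all invented for illustration). The idea is simply to dim everything by a constant factor so that SDR white no longer sits at the panel’s physical maximum, leaving room above it for HDR highlights:

```python
def render_edr(pixel_nits, sdr_white_nits=500, headroom=2.0):
    """Map a pixel's intended brightness (in nits) to a 0..1 panel value.

    Hypothetical "extended dynamic range" trick: SDR content is dimmed
    by `headroom`, so SDR white lands at 1/headroom of full panel
    brightness, and HDR highlights up to sdr_white_nits * headroom
    still fit below the physical maximum.
    """
    value = pixel_nits / (sdr_white_nits * headroom)
    # Clip at the panel's physical limit -- physics still wins at the top.
    return min(value, 1.0)
```

With these made-up numbers, SDR white (500 nits) renders at half the panel’s brightness, while a 1,000-nit HDR highlight maps to full brightness: nothing on screen is physically brighter than before, but the contrast between them is real.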

This brings me to the recently launched AirPods Max. To me, this product is not merely a set of high-end active noise canceling headphones, but part of Apple’s journey to achieve new and better experiences through hacking physics. The audio world has an age-old problem similar to the visual world’s: everyone’s physical playback devices are different, so the listening experience is never the same as what the original creator intended. In theory, EQ can be used to deal with this problem, but it is hardly effective in practice due to the lack of integration across the audio distribution pipeline. Furthermore, the requirements on the physical playback device can change (e.g. more bass response is needed for parts of a track), and manually tuning the EQ in real time is infeasible. Apple’s integration, on the other hand, can actually solve this problem. Just like their phones and computers, Apple now controls the entire audio stack, from distribution (Apple Music and Apple TV+) all the way to the physical diaphragms pushing air next to our ears. This means they can, for example, do the same extended dynamic range “hack” for audio, achieving perceived sound signatures that shouldn’t be possible because of physics, in both music and videos.

To go even further, and keep in mind this is entering purely speculative territory, Apple can connect their integrated stack with the source, i.e. audio production. By extending the current standards of audio file formats[1], special EQ metadata could be embedded in tracks by producers to help playback exactly match the intended experience. Every artist may use a different set of studio monitors to master their recordings, each with its own physical audio response characteristics. Our playback devices add even more diversity to the mix. But if all these variables are stored and passed along the audio production pipeline, then the original audio experience can more or less be replicated by the computational audio engine on the AirPods, even though physically the audio devices along the chain are all different. Another “hack” to achieve something that shouldn’t be possible due to physics. And until we can directly stimulate our brain, these hacks are perhaps the best we can do.
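A toy version of what such metadata could enable (entirely speculative, like the idea itself; the band layout and function names are invented): if a track carries the frequency response of the producer’s monitors, and the playback device knows its own response, the engine can compute a per-band correction so the listener hears something close to what the producer heard.

```python
# Hypothetical per-band responses in dB at some reference frequencies.
REFERENCE_BANDS_HZ = [60, 250, 1000, 4000, 12000]

def compensation_eq(producer_response_db, device_response_db):
    """Per-band gain (dB) the playback engine would apply so that
    device response + compensation matches the producer's monitors."""
    return [p - d for p, d in zip(producer_response_db, device_response_db)]
```

For example, if the playback device is 2 dB weak in the bass band and 3 dB hot in the treble band relative to the mastering monitors, the correction boosts the former and cuts the latter. The point isn’t the arithmetic, which is trivial; it’s that the correction can only be computed if the metadata survives the whole pipeline, which is exactly what vertical integration makes possible.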

For me, the hardest part of software development has always been connecting the perfect, precise world of computing with the imperfect, noisy world of the real. Taking this more broadly, the hardest thing for any technology to do is reconciling the difference in cost of changing bits vs. changing atoms. Vertical integration, from software to hardware, often results in some of the best technologies precisely because of its power to bridge that gap: it brings programmability to atoms, to a certain degree[2]. Achieving the effects of “hacking” physics, i.e. doing (by faking) the physically “impossible”, is the pinnacle of such integrations. It requires such a complex, symbiotic system of software (for the hacking) and hardware (to affect physics) that very few have been able to do it successfully[3], let alone at the ubiquitous scale of Apple[4].

Exploiting the rare ability to hack physics has been, and undoubtedly will continue to be, a key part of Apple’s strategy of differentiation. In the increasingly commoditized world of streaming services (be it for music, TV, movies, or games), content has been one of the only ways to stand out - “I choose Netflix over Apple TV+ because the show I watch is exclusively produced on Netflix.” The new and improved experiences made possible through hacking physics could bring yet another dimension of differentiation to this space. If the audiovisual and interactive experiences are tangibly better on one platform, and that difference cannot be easily closed by competitors, then the product essentially decommoditizes. Spatial audio with dynamic head tracking, Apple’s take on the virtual surround sound experience, is but a small taste of the kind of differentiated experiences to come. Should Apple implement the dynamic audio EQ metadata system described above, fully integrated into the music production pipeline (which they could foreseeably do), or any number of similar things that I did not think of, it’ll probably be impossible for Spotify to replicate and compete against. And if the experience truly matters, both content creators and consumers will shift over time: the better product always wins in the long run.

The attraction (and trap) of ecosystems, like Apple’s, is that the whole is much greater than the sum of its parts. The AirPods Max, being another part added to the whole, is therefore both greater than just another pair of ANC headphones, and an enhancer of all the other parts in the ecosystem. Neither of these properties is static: through firmware and other software updates, both will improve over time[5]. It is much more fitting to think of the AirPods as a pair of computers on your head with voice coils attached than as a pair of traditional headphones. More important than the improvements to sound quality, noise canceling, and usability, it further opens the door for everything else in the Apple ecosystem to deliver programmable audio to your brain.


  1. Similar to what Apple has done for the raw image format, DNG, with their new ProRAW format, as detailed in this blog post from Halide. ↩︎

  2. In my opinion, this is one of the original core visions of IoT. Unfortunately it has been morphed into a dysfunctional mess in recent years, acting more as a data gathering mechanism serving the ad-tech industry than anything else. ↩︎

  3. Tesla is another company that has managed to do this quite well, which is the reason it is often compared with Apple for their similarities. ↩︎

  4. Even Apple, as good as they are, cannot do this perfectly, far from it. Even with full control across both the software and hardware stacks, flaws that cause suboptimal experiences are still common (e.g. AirPods automatic device switching is still noticeably unreliable for me, especially when moving between macOS and iOS). More than anything else, this just indicates the sheer difficulty of the underlying problem. ↩︎

  5. From my own experience, the in-ear AirPods Pro have had their sound quality, noise canceling ability, and usability improved through over-the-air firmware updates since their release. ↩︎