While convalescing from a bout of Lyme disease, Hall decided that he needed to set his synesthesia loose in the wild, as he puts it. He and his digital collaborators developed software that siphons the luminance and color values from the video cameras in iPhones and iPads (only later-generation devices like the iPhone 4, 4S, and iPad 2 will work correctly) and uses them to trigger stereo samples from a library of CD-quality audio composed for the purpose.
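Hall hasn't published the app's internals, but the core mapping he describes, turning per-frame brightness into sample triggers, might be sketched like this. Everything here is hypothetical: the function names, the luminance formula (standard Rec. 601 weights), and the idea of binning brightness into a sample bank are my illustrative assumptions, not Sonified's actual code.

```python
# Hypothetical sketch: map a camera frame's average luminance to an index
# into a bank of audio samples. Darker frames trigger lower-numbered
# samples, brighter frames higher ones. Not Sonified's actual algorithm.

def frame_luminance(pixels):
    """Average luminance (Rec. 601 weights) of (r, g, b) pixels in 0-255."""
    total = sum(0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels)
    return total / len(pixels)

def pick_sample(luminance, num_samples):
    """Bin a 0-255 luminance value into one of num_samples sample slots."""
    index = int(luminance / 256 * num_samples)
    return min(index, num_samples - 1)  # clamp the 255 edge case

# A bright frame selects a late slot in an 8-sample bank; a dark one, slot 0.
bright_frame = [(240, 240, 240)] * 4
dark_frame = [(10, 10, 10)] * 4
bright_slot = pick_sample(frame_luminance(bright_frame), 8)
dark_slot = pick_sample(frame_luminance(dark_frame), 8)
```

In a real app this would run on every frame delivered by the camera, crossfading samples as the index changes rather than retriggering them abruptly.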
When Hall (who helped create the lush “painted world” sequence in What Dreams May Come, the 1998 film starring Robin Williams) told me about Sonified in an email, I knew I had to try it myself. After downloading it from the App Store, I boarded a streetcar here in San Francisco, slipped on a pair of headphones, and aimed my phone out the window just as the train streaked past a row of brightly painted Victorian houses, accelerating through shafts of sunlight and shade on its way into a tunnel.
The effect of the audio-visual-kinesthetic link-up was unexpectedly profound. Instead of feeling like Sonified was imposing its digital soundtrack on the world, I felt I was accessing a layer of reality that is normally hidden from us. It was like a little dose of Morpheus’ red pill in The Matrix.
To give potential users a foretaste of the experience, Hall has uploaded videos to YouTube here, here, here, and here. But experiencing Sonified second-hand rather misses the point. The thrill of using the app is having it respond to optical nuances in real time as you move through spaces that come alive in new and surprising ways. Hall’s ethereal sonic palette may be a bit New Agey for some tastes, but the software offers a teasing glimpse of how much more we could be doing with these powerful multimedia platforms in our pockets.
The videos are mesmerizingly cool. I’ve experienced synesthesia once, but it was decades ago, under the influence of drugs. It was, to use an overused word that is entirely appropriate here, awesome.