Have you ever tried to hear someone talking in a room whilst other people are also talking in the background? It can seem difficult at times, yet our ability to pick out words and sounds from a wash of background noise is remarkably sophisticated, and we are barely even aware of it.
Imagine you are walking around with your eyes closed, paying attention to all the sounds your feet make. Without looking, did you walk on tile, carpet, or stone? Is the space around you echoing, or are you wearing something soft enough to dampen the sound? Is the space large or small? Inside or outside? We are constantly perceiving very detailed information about the world around us, just by using our ears.
Have you ever seen a blind person using audible ‘clicks’ to determine their surroundings? Some blind people claim that this echolocation is not only highly effective, but in some ways superior to sight. One user of this technique, Daniel Kish, points out that people dependent on sight can only see what is directly in front of them, whilst he can hear in all directions.
Back in 3500 BC, it is said that Egyptian auditors were commissioned by the Pharaoh to compare separate reports of the same stockpiles to check that no one was stealing. If the auditors had compared the data visually for hours on end, it would have demanded intense focus and likely produced many mistakes. Instead, the auditors would sound out their reports at the same time; any discrepancies between the sounds could be easily heard, and the thieves would be spotted.
According to Scientific American, Bruce Walker (a professor of psychology and director of the Georgia Institute of Technology’s Sonification Lab) says:
“The auditory system is the best pattern-recognition device that we know of. If you’re looking through a data set and trying to understand what’s going on, it’s often easier and more efficient to listen to the sound of it rather than looking at a screen or a printed version.”
When we want to know the weather forecast, we often head straight to a webpage or news report that shows us a visual display. Although doing so might be quick and accurate, this article may give you pause. I’d like to think Wav4kst offers a much more immersive experience: perhaps the falling of notes as the temperature drops over time allows for deeper engagement with, and understanding of, the forecast. I think it is important for us to seriously consider how we engage with data, and to explore the possibilities of sonic representation.
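The mapping described above, where falling temperatures become falling notes, can be sketched in a few lines. This is only an illustration of the general idea, not Wav4kst's actual implementation; the function name, note range, and linear scaling are all assumptions chosen for clarity. Here each temperature reading is scaled linearly onto a range of MIDI note numbers, so the coldest reading sounds the lowest note and the warmest sounds the highest:

```python
def temperature_to_pitches(temps, low_note=48, high_note=72):
    """Map a series of temperature readings onto MIDI note numbers.

    Hypothetical sketch: scales temps linearly so the minimum
    temperature maps to low_note (C3) and the maximum to high_note (C5).
    """
    lo, hi = min(temps), max(temps)
    if hi == lo:
        # A flat forecast sounds as a single repeated middle note.
        return [(low_note + high_note) // 2] * len(temps)
    span = hi - lo
    return [
        round(low_note + (t - lo) / span * (high_note - low_note))
        for t in temps
    ]

# A forecast of steadily dropping temperatures produces a
# descending melody.
print(temperature_to_pitches([20, 15, 10, 5]))
```

Playing these note numbers back in order (with any synthesiser or MIDI library) turns the forecast into the descending phrase the listener hears as the temperature falls.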