Within the past few months, immersive technology has enabled researchers to study sound perception in realistic lab settings. Virtual reality was used to probe auditory spatial awareness in lifelike listening studies. In the experiment, a participant puts on a headset that places them in a parklike setting. They are told that they will hear a sound and that they should turn their head in the direction it seems to come from. While they do this, the experimenter (Travis Moore, in this case) manipulates two essential localization cues. The first is the difference in timing between when the sound wave reaches each of the listener's ears (measured in millionths of a second). The second is the difference in sound pressure level registered at each ear.
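The write-up doesn't say how the study models these cues, but the timing cue (the interaural time difference) is often approximated with the classic Woodworth spherical-head formula. Here is a minimal sketch in Python; the function name, default head radius, and speed of sound are my own illustrative assumptions, not values from the study:

```python
import math

def woodworth_itd(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Approximate the interaural time difference (ITD), in microseconds,
    for a source at the given azimuth (0 = straight ahead, 90 = directly
    to one side), using Woodworth's spherical-head model:
    ITD ~= (a / c) * (theta + sin(theta))."""
    theta = math.radians(azimuth_deg)
    itd_seconds = (head_radius_m / speed_of_sound) * (theta + math.sin(theta))
    return itd_seconds * 1e6  # convert seconds to microseconds

# A source straight ahead produces no timing difference,
# while one directly to the side yields several hundred microseconds.
print(round(woodworth_itd(0)))   # 0
print(round(woodworth_itd(90)))
```

The level cue (the interaural level difference) is harder to sketch this simply because it depends strongly on frequency, which is part of why the brain's weighting of the two cues is interesting to measure.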
What they found was considerable variability in how much weight subjects' brains assign to each cue. This matters because we don't yet know how the integration of the two cues plays out in real-world listening tasks. Technologies like the ones used in this study should help yield better hearing aids for the hearing impaired, more accurate diagnosis of auditory disorders, and a richer sound experience in video games. Perhaps simulated acoustics can eventually be made indistinguishable from the real world! If that ever happens, I wonder how different it would be to watch a movie and have it sound like it's happening around you in real life. I'm thinking that would be really cool!