Displays: Special issue on the sonification of real-time data

1 minute read

David Worrall and I guest-edited the April 2017 special issue of the journal Displays on the sonification of real-time data. This special issue presents five articles drawn from across the range of sonification practice, all of which focus on how to communicate information about real-time data.

The contents are as follows:

  1. “Preface to the Special Issue on Sonification” by Paul Vickers, David Worrall, and Richard So.
  2. “Comparative study on the effect of Parameter Mapping Sonification on perceived instabilities, efficiency, and accuracy in real-time interactive exploration of noisy data streams” by David Poirier-Quinot, Gaetan Parseihian, and Brian F.G. Katz.
  3. “Sonification of a network’s self-organized criticality for real-time situational awareness” by Paul Vickers, Chris Laing, and Tom Fairfax.
  4. “Towards a systematic approach to real-time sonification design for surface electromyography” by S. Camille Peres, Daniel Verona, Tariq Nisar, and Paul Ritchey.
  5. “The sound of smile: Auditory biofeedback of facial EMG activity” by Yuki Nakayama, Yuji Takano, Masaki Matsubara, Kenji Suzuki, and Hiroko Terasawa.
  6. “EcoSonic: Auditory peripheral monitoring of fuel consumption for fuel-efficient driving” by Jan Hammerschmidt and Thomas Hermann.

You can see the full issue at http://www.sciencedirect.com/science/journal/01419382/47/supp/C.