What We Do

The Auditory Computation & Neurophysiology Lab investigates the neural mechanisms underlying auditory perception. We are interested in how information about sound sources (e.g., who is talking? from where? who else is out there?) is processed by the neocortex and reflected in the electrical activity of neurons.

We design and perform listening tasks modeled on real-life situations. Through this research, we hope to provide strategies for building enriched sensory environments, improving the quality of life for listeners both with and without hearing impairments.

Tools We Use

Electrophysiology

When sounds reach our ears, sound energy is transduced into electrical nerve impulses (often called “spikes”). We record these firing patterns with microelectrodes, which are small enough to monitor the activity of single neurons in the brain. The neural signals are translated in real time into sequences of numbers, which the brain uses to form an internal image of the outside world.
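As a rough illustration of how spikes are pulled out of a recorded voltage trace, the sketch below detects threshold crossings in a simulated signal. The waveform, sampling rate, and the common median-based noise estimate are all illustrative assumptions, not our actual recording pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated extracellular voltage: 1 s of baseline noise plus a few
# injected deflections standing in for spikes (all values illustrative).
fs = 20_000                        # assumed sampling rate, Hz
trace = rng.normal(0.0, 1.0, fs)   # baseline noise, arbitrary units
spike_times = [2000, 7500, 14000]  # sample indices of injected events
for t in spike_times:
    trace[t:t + 20] += 8.0         # crude spike waveform

# A common first step: estimate the noise level robustly, then mark
# upward threshold crossings as candidate spike events.
threshold = 5.0 * np.median(np.abs(trace)) / 0.6745
crossings = np.flatnonzero((trace[1:] >= threshold) & (trace[:-1] < threshold))

print(crossings)  # indices just before each detected event
```

Real spike sorting goes much further (filtering, waveform extraction, clustering by shape), but thresholding like this is the usual entry point.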

Psychoacoustics

Psychoacoustics investigates the subjective experience of the physical attributes of sounds. The tasks often focus on detection and discrimination among multiple acoustic events. Psychophysical performance provides insights into the internal representation of the outside world.
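Performance on detection tasks like these is often summarized by the sensitivity index d′, which separates a listener's true sensitivity from their response bias. A minimal sketch, assuming the standard equal-variance Gaussian model from signal detection theory:

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """Sensitivity index from a yes/no detection task.

    hit_rate: proportion of signal trials answered "yes"
    fa_rate:  proportion of noise trials answered "yes" (false alarms)
    """
    z = NormalDist().inv_cdf          # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Example: 90% hits with 20% false alarms
print(round(d_prime(0.90, 0.20), 2))
```

A d′ of 0 means the listener cannot distinguish signal from noise at all; larger values mean better discrimination regardless of how liberally or conservatively they respond.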

Computational modeling

Neural modeling investigates the behavior of neurons and the dynamic interactions among them using tractable numerical simulations. By imitating the functions of a biological system, neural models provide insights into the physiological bases of perception.
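One of the simplest such models is the leaky integrate-and-fire neuron: the membrane potential leaks toward rest, integrates input current, and emits a spike whenever it crosses threshold. The sketch below uses illustrative parameter values and simple Euler integration, not a model fitted to data.

```python
import numpy as np

def lif_spikes(current, dt=0.1, tau=10.0,
               v_rest=-65.0, v_thresh=-50.0, v_reset=-65.0):
    """Leaky integrate-and-fire neuron; returns spike times in ms.

    current: one input value per time step (arbitrary units).
    All parameters are illustrative placeholders.
    """
    v = v_rest
    spikes = []
    for i, drive in enumerate(current):
        # Membrane potential decays toward rest while integrating input.
        v += dt * (-(v - v_rest) + drive) / tau
        if v >= v_thresh:          # threshold crossing: spike, then reset
            spikes.append(i * dt)
            v = v_reset
    return spikes

# A constant suprathreshold current makes the model fire regularly;
# a weaker current that settles below threshold produces no spikes.
spikes = lif_spikes(np.full(5000, 20.0))   # 500 ms of constant drive
print(len(spikes))
```

Despite its simplicity, this kind of model already reproduces basic firing behavior and serves as a building block for simulating networks of interacting neurons.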