Research Projects

Sensory Systems

When we speak, we do more than just create an acoustic signal. Rather, we generate a richly multimodal array of sensory information, including sight, hearing, and the many different somatosenses. We show how listeners use this information to supplement speech perception and processing. For example, when a listener feels a puff of air, they are more likely to think they heard /p/ than /b/ (Gick and Derrick, 2009). Conversely, we show that modifying or removing sensory information disturbs speech processing and forces speakers and/or listeners to compensate in other ways.
