Active Projects

Auditory perception and working memory for nonverbal sounds

A number of theoretical perspectives in psychology emphasize the distinction between verbal and visuospatial processing.  In other words, it is widely held that people think in words/language and images.  But what about nonspeech sounds like music and environmental sounds?  Do people remember sounds as sounds per se?  Does the capacity to remember sounds require that the person covertly verbalize or imitate the sound (via articulation, like humming along to a tune, etc.)?  We are working on empirical and theoretical approaches to understanding perception and working memory for nonverbal sounds.

Auditory displays for in-vehicle technologies

Auditory displays will play an important role for drivers as in-vehicle technologies bring more diverse information into cars.  We are examining the design of auditory displays for in-vehicle technologies and how they may promote situation awareness during driving, especially with emerging technologies that support vehicle automation.  Our recent research has begun to examine how auditory displays can promote situation awareness in self-driving cars.

Acceptance of self-driving cars

Self-driving cars have captured the public imagination, and people have begun to establish expectations about how the advent of self-driving cars will change their lives.  The realization of self-driving cars, however, will face many human factors challenges.  Researchers have only begun to examine the factors that will affect drivers' acceptance of vehicle automation, yet acceptance will be critical to the ultimate success of self-driving cars.  We have developed a scale to measure acceptance of self-driving cars, and we have used the scale to begin examining how a priori expectations about self-driving cars affect acceptance.

Technology and interruptions

Research in our lab is exploring the impact of auditory interruptions (alerts, alarms, etc.) and other types of interruptions from technology such as smartphones and messaging services.  How and when will interruptions disrupt our ability to accomplish everyday tasks? What theoretical models of multitasking and task interruption can best explain results to date? Are interruptions from technology different from in-person interruptions?  How does the availability of messaging technology impact productivity during work tasks?

Previous Projects

Audio assistive technologies in education and testing accommodations

Technology has increasingly allowed students with disabilities, such as visual impairments, to pursue educational opportunities in science, technology, engineering, and mathematics (STEM) fields.  Many disciplines in STEM rely heavily on graphs and diagrams to represent abstract concepts.  Our research has examined how sound can be used to improve the delivery of STEM educational curricula for all learners, including those with disabilities.  In particular, research has suggested that auditory graphs could make traditionally visual learning materials accessible to learners with a wide range of sensory capabilities.  Further, high-stakes standardized testing plays an important role in determining educational and career outcomes for many people.  Very little research, however, has examined how accommodations affect the validity of standardized tests.  Research in our lab has examined how graphs, diagrams, and tables are presented in accommodated versions of tests for people with visual impairments.

Auditory pareidolia

We studied the role of top-down perceptual processing in the perception of purported electronic voice phenomena.  Our findings indicate that suggesting a paranormal context results in a criterion shift for perceiving human voices in ambiguous auditory stimuli, even among participants who self-report as skeptics.  Further, people showed little agreement about the content of the perceived utterances in the ambiguous stimuli.