Abstract
Cochlear implants (CIs) are neuroprosthetic devices that provide access to speech and spoken language to children with severe-to-profound sensorineural hearing loss. Despite this language access, many children with CIs do not acquire age-expected vocabularies.
Children with CIs are often included in mainstream classrooms where they are expected to rapidly learn new words; however, CIs convey a degraded representation of the auditory signal and children with CIs must make sense of this impoverished signal using an immature language system. Together, these factors make listening in a classroom a significant challenge. To improve speech perception outcomes, children with CIs may be instructed to look at the speaker’s mouth while listening to speech. It is assumed that access to visual speech information is sufficient to overcome the poor acoustic conditions of the classroom and support speech perception.
While more than a decade of research demonstrates the benefit of audiovisual speech cues for word and sentence recognition, little is known about their benefit during novel word learning, a more demanding task that young children face daily.
This talk will review data from a novel word learning project in which eye tracking was used to examine looking behaviors while children with and without CIs learned novel words. Individual patterns of looking, and their relationship with word learning success, will be discussed.
About the speaker
Kristen is an Adjunct Professor in the Hearing, Speech, and Language Sciences department, a member of the Rehabilitation Engineering Research Center (RERC), and the Principal Investigator of the Learning Lab for Kids with Implants (KiWI) at Gallaudet University. Her primary research interest is learning outcomes among children who use cochlear implants.
This event is open to everyone; no registration is needed.