Worster, E., Pimperton, H., Ralph-Lewis, A., Monroy, L., Hulme, C., & MacSweeney, M. (2017). Eye movements during visual speech perception in deaf and hearing children. Language Learning, 68(1), 159–179. https://doi.org/10.1111/lang.12264
For children who are born deaf, lipreading (speechreading) is an important source of access to spoken language. We used eye tracking to investigate the strategies used by deaf (n = 33) and hearing (n = 59) 5- to 8-year-olds during a sentence speechreading task. The proportion of time spent looking at the mouth during speech correlated positively with speechreading accuracy. In addition, all children showed a tendency to watch the mouth during speech and to watch the eyes when the model was not speaking. The extent to which the children used this communicative pattern, which we refer to as social tuning, positively predicted their speechreading performance, with the deaf children showing a stronger relationship than the hearing children. These data suggest that better speechreading skills are seen in those children, both deaf and hearing, who can guide their visual attention to the appropriate part of the image and who have a good understanding of conversational turn-taking.
Institute for Learning Sciences and Teacher Education