Executive Function & Implicit Learning in Deaf Children

Children born deaf are commonly reported to show delays in cognitive development, including executive function and implicit learning: two domains that are crucial for healthy development & academic success. To address these delays, we first need to understand their cause. This research tests the predictions of two competing theories: auditory deprivation and language deprivation. We do so by including children who are deaf but do not experience language deprivation, by virtue of being born to parents who already know American Sign Language. Visit braingamesresearch.uconn.edu for more information or to sign up!

Measuring Language Access Profiles in Deaf Children

Most deaf children are born to parents who do not already know a sign language; these families face difficult questions about what kind(s) of language exposure will maximize their child's developmental potential. To find the answers, researchers and clinicians need a way to more accurately capture the complex and dynamic nature of the linguistic input that deaf children receive. In partnership with the Deaf & Hard of Hearing Program at Boston Children's Hospital, we are developing the Language Access Profile Tool to meet that need.

Language Access Profiles & Developmental Outcomes in Deaf Children

How does a deaf child's language access profile from birth to age 3 relate to that child's language proficiency (in English and ASL), cognitive development, and social-emotional/behavioral health at age 5?  Is there an optimal profile?  Are there tradeoffs?  In partnership with the Deaf & Hard of Hearing Program at Boston Children's Hospital, we are conducting a cross-sectional study to begin characterizing these relationships; we will then conduct a prospective longitudinal study to more fully address these questions.

Cognitive & Communicative Influences on Language Structure

Where does the structure of human language come from?  Nativist theories have emphasized internal (cognitive) influences, while functionalist theories have emphasized interpersonal (communicative) pressures.  Naturalistic evidence from homesign & emerging sign languages provides new data, but without empirical control.  Computational simulations and experimental semiotics provide experimental control but sacrifice the validity that comes with embodied communication.  My approach combines the best of both perspectives, by studying the emergence of structure in a natural communicative modality (silent gesture) under controlled experimental conditions.

How Does Social Structure Impact Language Structure?

When a new language comes into being, how do people agree on what to call things? Does that process happen faster when all communication is channeled through a central hub, or is it more effective for everyone to communicate with as many other people as possible?  Building on previous data from naturalistic fieldwork and computational simulations, we use a silent gesture paradigm to experimentally manipulate the communicative structure of a small social network.