Methods & Documentation
We use eye-tracking, audio discrimination, EEG, and corpus analysis to examine word learning in infants.
Potential Eligibility
| Infant | Eye-Tracking | Discrimination | EEG | Corpus Contribution |
|---|---|---|---|---|
| Typically Developing | ✓ | | ✓ | |
| Profound Hearing Impairment | ✓ | | ✓ | |
| Profound Vision Impairment | | | ✓ | ✓ |
Eye-Tracking
For typically-developing infants and infants with hearing impairments, we use eye-tracking to gain insight into infant word learning and word differentiation.
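To give a rough sense of the kind of measure these studies yield, the sketch below computes the proportion of time an infant looks at a named target image in a window after word onset. The column names, area-of-interest labels, and window boundaries are illustrative assumptions, not the lab's actual pipeline.

```python
import pandas as pd

# Hypothetical gaze samples exported from an eye-tracker:
# one row per sample, with time relative to word onset (ms)
# and the area of interest (AOI) the infant is fixating.
samples = pd.DataFrame({
    "time_ms": [100, 150, 200, 250, 300, 350, 400, 450],
    "aoi": ["target", "target", "distractor", "target",
            "target", "away", "target", "distractor"],
})

# Restrict to an illustrative analysis window after word onset.
window = samples[(samples["time_ms"] >= 200) & (samples["time_ms"] <= 450)]

# Proportion of on-screen looking time spent on the named target.
on_screen = window[window["aoi"].isin(["target", "distractor"])]
prop_target = (on_screen["aoi"] == "target").mean()
print(f"Proportion of looks to target: {prop_target:.2f}")
```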
EEG (electroencephalogram)
For typically-developing children and those with profound hearing or visual impairments, we also use EEG. Using this non-invasive method, we look for certain patterns of brain activity after infants hear different types of words. (For the curious: we look for a P100 or N400 component.)
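To give a flavor of how such patterns are quantified, the sketch below averages hypothetical epoched EEG data for two word types and compares mean amplitude in a typical N400 window (roughly 300–500 ms after word onset). The arrays, sampling rate, and window are illustrative assumptions, not the lab's analysis code.

```python
import numpy as np

rng = np.random.default_rng(0)
sfreq = 250  # hypothetical sampling rate (Hz)
times = np.arange(-0.2, 0.8, 1 / sfreq)  # seconds relative to word onset

# Hypothetical single-channel epochs: (n_trials, n_samples) in microvolts.
epochs_known = rng.normal(0.0, 5.0, size=(40, times.size))
epochs_novel = rng.normal(0.0, 5.0, size=(40, times.size))

# Average across trials to get an event-related potential (ERP) per condition.
erp_known = epochs_known.mean(axis=0)
erp_novel = epochs_novel.mean(axis=0)

# Compare mean amplitude in an illustrative N400 window (300-500 ms).
n400 = (times >= 0.3) & (times <= 0.5)
diff = erp_novel[n400].mean() - erp_known[n400].mean()
print(f"N400-window amplitude difference (novel - known): {diff:.2f} µV")
```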
Corpus building and analysis
We are in the process of collecting day-long recordings of infants with profound visual impairments. The infants simply wear a vest all day with an audio recording device (LENA) nestled inside. We also invite families to optionally complete a video-recorded play session in the lab.
The Bergelson Lab already has a corpus of audio- and video-recordings from typically-developing infants, part of a project called SEEDLingS.
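As a toy example of the kind of analysis such recordings support once transcribed, the sketch below tallies word frequencies from plain-text transcript files. The directory layout and file format are hypothetical; the lab's actual annotations are richer than this.

```python
from collections import Counter
from pathlib import Path

def word_frequencies(transcript_dir: str) -> Counter:
    """Count word tokens across plain-text transcripts (one utterance per line)."""
    counts = Counter()
    for path in Path(transcript_dir).glob("*.txt"):
        for line in path.read_text(encoding="utf-8").splitlines():
            counts.update(line.lower().split())
    return counts

# Hypothetical usage: the ten most frequent words across a set of transcripts.
freqs = word_frequencies("transcripts/")
print(freqs.most_common(10))
```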
Audio discrimination tasks
The Bergelson Lab also uses experimental paradigms such as habituation and Headturn Preference Procedure. Using these methods, we measure infants' looking and listening times to different stimuli in order to learn about aspects of early language development.
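The dependent measure in these paradigms is typically a per-trial looking or listening time. The sketch below shows one simple way such times might be compared across stimulus types; the data and the paired t-test are illustrative assumptions, not the lab's analysis pipeline.

```python
import numpy as np
from scipy.stats import ttest_rel

# Hypothetical per-infant mean listening times (seconds) for two stimulus types,
# e.g., familiar vs. novel word lists in a Headturn Preference Procedure study.
familiar = np.array([8.2, 7.5, 9.1, 6.8, 7.9, 8.4])
novel = np.array([6.9, 6.1, 8.0, 6.2, 7.1, 7.3])

# Paired comparison across infants: do listening times differ by stimulus type?
result = ttest_rel(familiar, novel)
print(f"mean difference = {np.mean(familiar - novel):.2f} s, "
      f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```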
Lab Documentation
- Bergelson Lab wiki (GitBook): we are currently reorganizing it (as of July 2024) and will add a link once we are finished
- Bergelson Lab GitHub repo
- SEEDLingS GitHub repo