Abstract
Learning and processing natural language requires the ability to track syntactic relationships between words and phrases in a sentence, which are often separated by intervening material. These nonadjacent dependencies can be studied using artificial grammar learning paradigms and structured sequence processing tasks. These approaches have been used to demonstrate that human adults, infants, and some nonhuman animals are able to detect and learn dependencies between nonadjacent elements within a sequence. However, learning nonadjacent dependencies appears to be more cognitively demanding than detecting dependencies between adjacent elements, and it occurs only in certain circumstances. In this review, we discuss different types of nonadjacent dependencies in language and in artificial grammar learning experiments, and how these differences might impact learning. We summarize different types of perceptual cues that facilitate learning by highlighting the relationship between dependent elements, bringing them closer together physically, attentionally, or perceptually. Finally, we review artificial grammar learning experiments in human adults, infants, and nonhuman animals, and discuss how the similarities and differences observed across these groups can provide insights into how language is learned across development and how these language-related abilities might have evolved.
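For readers unfamiliar with how nonadjacent dependencies are built into artificial grammar learning stimuli, the sketch below illustrates one common design: A-X-B sequences in which an initial element A predicts a final element B across a variable middle element X. This is a minimal illustrative sketch only; the syllable inventory, pairings, and function names are assumptions chosen for the example and are not the materials of the studies reviewed here.

```python
import random

# Illustrative sketch (not the review's materials): generate A-X-B style
# sequences in which the first element predicts the last element across a
# variable intervening element, a common way nonadjacent dependencies are
# instantiated in artificial grammar learning experiments.

# Hypothetical vocabulary: two A...B dependency pairs and a pool of fillers.
DEPENDENCY_PAIRS = [("pel", "rud"), ("vot", "jic")]      # A ... B pairs
MIDDLE_ELEMENTS = ["wadim", "kicey", "puser", "fengle"]  # variable X items


def make_grammatical_sequence(rng: random.Random) -> tuple[str, str, str]:
    """Return an A-X-B triplet that respects the nonadjacent dependency."""
    a, b = rng.choice(DEPENDENCY_PAIRS)
    x = rng.choice(MIDDLE_ELEMENTS)
    return (a, x, b)


def make_ungrammatical_sequence(rng: random.Random) -> tuple[str, str, str]:
    """Return an A-X-B' triplet whose final element comes from the wrong pair."""
    (a, _), (_, wrong_b) = rng.sample(DEPENDENCY_PAIRS, 2)
    x = rng.choice(MIDDLE_ELEMENTS)
    return (a, x, wrong_b)


if __name__ == "__main__":
    rng = random.Random(42)
    print("Grammatical (A predicts B across a variable X):")
    for _ in range(3):
        print("  ", " ".join(make_grammatical_sequence(rng)))
    print("Ungrammatical (dependency violated):")
    for _ in range(3):
        print("  ", " ".join(make_ungrammatical_sequence(rng)))
```

In designs of this kind, learning is typically assessed by whether participants discriminate grammatical from ungrammatical test sequences; the adjacent A-X and X-B transitions are uninformative by construction, so success requires tracking the nonadjacent A...B relationship.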
| Original language | English |
| --- | --- |
| Pages (from-to) | 843-858 |
| Number of pages | 16 |
| Journal | Topics in Cognitive Science |
| Volume | 12 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - Jul 2020 |
Austrian Fields of Science 2012
- 501030 Cognitive science
Keywords
- Artificial grammar
- Human
- Infant
- Non-adjacent dependency
- Nonhuman animal
- Primate
- Structured sequence processing