Abstract
Learning and processing natural language requires the ability to track syntactic relationships between words and phrases in a sentence, which are often separated by intervening material. These nonadjacent dependencies can be studied using artificial grammar learning paradigms and structured sequence processing tasks. These approaches have been used to demonstrate that human adults, infants, and some nonhuman animals are able to detect and learn dependencies between nonadjacent elements within a sequence. However, learning nonadjacent dependencies appears to be more cognitively demanding than detecting dependencies between adjacent elements, and it only occurs in certain circumstances. In this review, we discuss different types of nonadjacent dependencies in language and in artificial grammar learning experiments, and how these differences might impact learning. We summarize different types of perceptual cues that facilitate learning by highlighting the relationship between dependent elements, bringing them closer together physically, attentionally, or perceptually. Finally, we review artificial grammar learning experiments in human adults, infants, and nonhuman animals, and discuss how similarities and differences observed across these groups can provide insights into how language is learned across development and how these language-related abilities might have evolved.
Original language | English |
---|---|
Pages (from-to) | 843-858 |
Number of pages | 16 |
Journal | Topics in Cognitive Science |
Volume | 12 |
Issue number | 3 |
DOIs | |
Publication status | Published - July 2020 |
ÖFOS 2012
- 501030 Cognitive science