The Relation between Language and Cognitive Development Under the Microscope

By: Inas Essa

There is a strong link between language and cognition that plays an important role in cognitive development, beginning early in infancy. Although spoken language seems to lead this process for hearing infants, new research indicates that it is not the sole route: sign language, a language presented in the visual modality, can serve the same role.

In a groundbreaking study conducted at Northwestern University, researchers found that sign language, just like spoken language, supports cognitive development in hearing infants who had never been exposed to sign language. In their press release, the researchers indicated that this finding expands our understanding of the powerful and inherent link between language and cognition in humans.


The Link Between Human Language and Cognition

The new study builds on prior research showing that very young infants, 3 to 4 months of age, are prepared to acquire any human language, whether presented in the visual or auditory modality, and can build a link between that language and basic cognitive processes.

“We had already established a precocious link between acoustic signals and infant cognition. But we had not yet established whether this initial link is sufficiently broad to include sign language, even in infants never exposed to a sign language,” said Sandra Waxman, the lead researcher in the study.

Based on this, the researchers hypothesized that hearing infants may initially link any language, whether spoken or signed, to cognition. Their study was therefore designed to answer the following question: if infants' initial language-cognition link is reserved specifically for spoken language, then infants observing sign language should fail to form object categories. However, if infants' initial link is broad enough to include all human languages, then infants observing sign language, like those listening to spoken language, should successfully form object categories.


The Experiment

The researchers conducted the experiment with 113 hearing infants, ranging from 4 to 6 months old, none of whom had previously been exposed to American Sign Language (ASL) or any other sign language. The goal was to compare infants' categorization performance in a sign language condition with their performance in a nonlinguistic control condition.

First, infants were familiarized with a series of category examples (e.g., eight fish), each presented by a woman who either signed while pointing and gazing toward the objects (ASL condition) or pointed and gazed without language (non-linguistic control condition). After that, all infants viewed two static images: a new member of the same category (e.g., a new fish) and a new object from a different category (e.g., a dinosaur).


A Deeper Connection Revealed

The researchers found that at 4 months old, infants in the ASL condition, but not the control condition, distinguished between the two test objects, indicating that they had successfully formed the object category (e.g., fish). They were as successful as age-mates who listened to their native (spoken) language. By 5 to 6 months, however, observing ASL no longer conferred this cognitive advantage.

“What surprised us the most was that it was specifically the linguistic elements of the ASL that did the trick—not merely pointing and gesturing. Pointing and gesturing are communicative signals for sure, but they are not linguistic signals,” said Waxman.

These findings highlight how broad infants' early link between language and cognition is, and how that link supports cognitive development.