How Can Language Change to Maximize Its Efficiency?


By: Inas Essa

Language and communication are inseparable: everyday spoken language changes constantly, adopting new words and expressions that make communication easier. Likewise, American Sign Language (ASL), which is used by deaf and hard-of-hearing people and relies on a visual-gestural modality, has changed over the years to make it easier to process. Researchers from Boston University, in a new study published in the journal Cognition, suggest that this evolution makes signs easier for people to recognize.

Sign Language Perception

During sign perception, viewers look almost exclusively at the lower part of the signer's face rather than down at the hands. This means that signs articulated far from this area must be perceived through peripheral vision, which is less sharp than central vision. In the new study, a research team from Boston University, Syracuse University, and the Rochester Institute of Technology used Artificial Intelligence (AI) to identify how ASL signs have changed over the years and how those changes affect language acquisition and perception.

They found that the harder a sign is to perceive, for example one with uncommon handshapes such as the sign for light, the closer it is produced to the signer's face, where people look during sign perception. Conversely, common signs with easy-to-perceive handshapes, such as the sign for children, are produced farther from the face, in the perceiver's peripheral vision.

Analyzing ASL Evolution

The research team analyzed the evolution of ASL with the help of an AI tool that processed videos of more than 2,500 signs from an ASL lexicon, using the algorithm to estimate the positions of the signer's body and limbs.

"We feed the video into a machine learning algorithm that uses computer vision to figure out where key points on the body are," says Caselli, a Boston University Wheelock College of Education & Human Development assistant professor. "We can then figure out where the hands are relative to the face in each sign;" After that, the researchers match that with data from the ASL lexicon to measure how often the signs and handshapes are used.

The team behind this project is diverse, bringing together signing researchers and computer vision scientists. The researchers believe their findings are important for understanding how sign languages work and develop, which can in turn help improve Deaf education.

References

sciencedirect.com

bu.edu