Can AI learn American Sign Language? Just Ask a Student Developer in India

Learn American Sign Language from an AI. Photo credit: Inquirer.net

 

Learn American Sign Language From AI

How cool would it be if artificial intelligence (AI) could learn American Sign Language? Is it even possible? Well, Priyanjali Gupta, a third-year engineering student at India’s Vellore Institute of Technology (VIT), developed an AI model that translates American Sign Language into English in real time. Gupta’s model was inspired by data scientist Nicholas Renotte’s video on real-time sign language detection. According to an Inquirer.net article, “She invented the AI model using Tensorflow object detection API that translates hand gestures using transfer learning from a pre-trained model named ssd_mobilenet.” When Gupta signed basic signs such as Hello, I Love You, Thank You, Please, Yes, and No, the AI translated these phrases into English.
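
To make that pipeline a little more concrete, here is a minimal sketch of what real-time detection with a pre-trained SSD MobileNet can look like in TensorFlow. This is not Gupta’s actual code: the TensorFlow Hub model URL stands in for her fine-tuned checkpoint, and the sign label map and confidence threshold are assumptions for illustration (the generic Hub model predicts everyday object classes, not signs, until it is fine-tuned on annotated webcam images).

```python
# Rough sketch: run a pre-trained SSD MobileNet on webcam frames and draw
# any detections. A sign-translation model would be this same loop with a
# detector fine-tuned on hand-labeled images of signs.
import cv2                    # pip install opencv-python
import numpy as np
import tensorflow as tf
import tensorflow_hub as hub  # pip install tensorflow tensorflow_hub

# Stand-in for a fine-tuned checkpoint: a generic SSD MobileNet v2 from TF Hub.
detector = hub.load("https://tfhub.dev/tensorflow/ssd_mobilenet_v2/2")

# Hypothetical label map; a real sign model would define its own classes.
LABELS = {1: "hello", 2: "thank you", 3: "i love you", 4: "please", 5: "yes", 6: "no"}

cap = cv2.VideoCapture(0)  # default webcam
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    # The detector expects a uint8 tensor of shape [1, height, width, 3] in RGB.
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    result = detector(tf.convert_to_tensor(rgb[np.newaxis, ...], dtype=tf.uint8))

    boxes = result["detection_boxes"][0].numpy()
    scores = result["detection_scores"][0].numpy()
    classes = result["detection_classes"][0].numpy().astype(int)

    h, w, _ = frame.shape
    for box, score, cls in zip(boxes, scores, classes):
        if score < 0.6:  # assumed confidence threshold
            continue
        ymin, xmin, ymax, xmax = box
        cv2.rectangle(frame, (int(xmin * w), int(ymin * h)),
                      (int(xmax * w), int(ymax * h)), (0, 255, 0), 2)
        cv2.putText(frame, LABELS.get(cls, str(cls)),
                    (int(xmin * w), int(ymin * h) - 8),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 255, 0), 2)

    cv2.imshow("sign detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```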

In an interview with Analytics Drift, Gupta said, “The dataset is manually made with a computer webcam and given annotations. The model, for now, is trained on single frames. To detect videos, the model has to be trained on multiple frames, for which I’m likely to use Long Short-Term Memory (LSTM) networks.” She also said it was a challenging project and that she believed the open-source community would find a solution soon.
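
Gupta has not published that multi-frame model, so the sketch below is only a guess at what such an LSTM classifier might look like in Keras. The sequence length, feature size, and sign vocabulary are made-up values for illustration; a real version would feed in per-frame features extracted from her annotated webcam clips.

```python
# Minimal sketch (not Gupta's published code): a Keras LSTM that classifies a
# short sequence of frames as one of a handful of signs.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

SEQ_LEN = 30      # frames per clip (assumed)
N_FEATURES = 126  # e.g. 2 hands x 21 landmarks x 3 coords from a hand tracker (assumed)
SIGNS = ["hello", "thank you", "i love you", "please", "yes", "no"]

model = models.Sequential([
    layers.Input(shape=(SEQ_LEN, N_FEATURES)),
    layers.LSTM(64, return_sequences=True),  # learn patterns across frames
    layers.LSTM(32),                         # collapse the sequence to one vector
    layers.Dense(32, activation="relu"),
    layers.Dense(len(SIGNS), activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

# Dummy data just to show the expected shapes.
x = np.random.rand(8, SEQ_LEN, N_FEATURES).astype("float32")
y = tf.keras.utils.to_categorical(np.random.randint(0, len(SIGNS), 8), len(SIGNS))
model.fit(x, y, epochs=1, verbose=0)

print(SIGNS[int(np.argmax(model.predict(x[:1], verbose=0)))])
```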

Learning American Sign Language From SignAloud and How Deaf People Feel About It

Learn American Sign Language from SignAloud. Photo credit: TechEBlog
Thomas Pryor and Navid Azodi, two graduates from the University of Washington, invented ‘SignAloud’, gloves that translate American Sign Language into speech or text, in an effort to bridge the communication gap between Deaf and hearing people. Their invention won $10,000 from the Lemelson-MIT competition in 2016. According to an article from Interactive Accessibility, “Sensors in the gloves record hand position and movement and send the data via Bluetooth to a central computer that analyzes the data through various sequential statistical regressions. When a match with a gesture is found, the corresponding word or phrase is played through a speaker.”
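
As a purely illustrative aside (SignAloud’s software has never been published, and “sequential statistical regressions” could mean many things), the matching step described in that quote is roughly the idea sketched below: compare an incoming glove reading against stored gesture templates and emit the closest word. Every template, value, and threshold here is hypothetical.

```python
# Toy illustration only; this is not SignAloud's actual algorithm.
# It shows the general idea of matching a glove sensor reading against
# stored gesture templates and returning the closest word.
import numpy as np

# Hypothetical templates: one averaged sensor vector (flex + motion values) per sign.
TEMPLATES = {
    "hello":     np.array([0.9, 0.8, 0.7, 0.1, 0.2]),
    "thank you": np.array([0.2, 0.9, 0.4, 0.6, 0.1]),
    "yes":       np.array([0.1, 0.1, 0.9, 0.9, 0.3]),
}
THRESHOLD = 0.5  # assumed maximum distance to count as a match

def match_gesture(reading: np.ndarray):
    """Return the closest gesture word, or None if nothing is close enough."""
    word, dist = min(((w, np.linalg.norm(reading - t)) for w, t in TEMPLATES.items()),
                     key=lambda pair: pair[1])
    return word if dist < THRESHOLD else None

# Pretend this reading just arrived over Bluetooth from the glove sensors.
print(match_gesture(np.array([0.85, 0.82, 0.65, 0.15, 0.25])))  # -> "hello"
```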

Is ‘SignAloud’ beneficial in helping bridge the communication gap between Deaf and hearing people? In an article in The Atlantic, Lance Forshay, who directs the ASL program at UW, said, “Initially, I didn’t want to deal with [SignAloud, the UW project] because this has been a repeated phenomenon or fad. I was surprised and felt somehow betrayed because they obviously didn’t check with the Deaf community or even check with ASL program teachers to make sure that they are representing our language appropriately.”

Although ‘SignAloud’ was a cool invention, it did not authentically represent ASL or the Deaf community. The gloves only translate the hands, their position and their movement; they do not capture other important aspects such as facial expressions or body movement. “Key parts of the grammar of ASL include ‘raised or lowered eyebrows, a shift in the orientation of the signer’s torso, or a movement of the mouth,’” reads a letter from Deaf community members and Deaf culture experts. “Even perfectly functioning gloves would not have access to facial expressions.”

Is Learning American Sign Language From An AI Ideal?

Technology is advancing, and people are able to create complex inventions. While it is amazing that people come up with inventions such as AIs that can translate ASL into English in hopes of bridging the communication gap between Deaf and hearing people, learning ASL from an AI is probably neither ideal nor realistic, for several reasons.

1) AI is very limited.

As mentioned, American Sign Language is not just about communicating with the hands; it also includes facial expressions and body movements. Facial expressions include “raised or lowered eyebrows, a shift in the orientation of the signer’s torso, or a movement of the mouth.” These expressions carry different meanings when signing. For instance, raised or lowered eyebrows are used depending on what kind of question is being asked: raised eyebrows typically signal a yes-or-no question, while lowered eyebrows typically mark a question that requires more than a yes-or-no answer. Body movements include shifting when referring to different speakers in a dialogue, or showing pride versus timidity, and so forth. You must see the person’s face and whole body to get the full input of both facial expressions and body language. Many people prefer to learn American Sign Language virtually or in person, where they can see the signer’s whole body: the signing, the facial expressions, and the body movements.

2) AI will not be able to convey the significance of facial expressions, body language, ASL grammar, and sentence structure, or key aspects of Deaf culture and the Deaf community.

ASL is an expressive language, and facial expressions and body language are significant when signing. Facial expressions and body language can change the meaning of a story. ASL’s grammar and sentence structure are not the same as in English. For instance, the proper sentence structure in English is, “I am going to the store,” but in ASL, the sentence changes to, “Store I go.” The person who programs the AI is probably not Deaf; therefore, the program could easily convey inaccurate ASL.

3) AI will not be able to answer questions.

When someone is learning a new language, that person will usually have lots of questions about the language’s structure. Unless the AI is programmed with deep knowledge of ASL linguistics and key aspects of Deaf culture, and is consistently immersed in the Deaf community, it would be impossible for it to answer most questions accurately. Real life is constantly changing, and people, along with their language, adapt to those changes. New signs are still being invented today. AI would not be able to keep up with those changes and would quickly be full of outdated information. The AI would hold only superficial knowledge: basic signs and their English translations.

4) AI does not have the everyday, real-life experience of an actual Deaf person.

AI has a long way to go before it even comes close to simulating a real person’s knowledge. For example, there are many different ways to sign the same word, and which one is used depends on a person’s style of signing. AI is not yet able to recognize most specific signs or signers’ styles. To become fluent in ASL, the best methods are to watch slow-motion ASL video classes, take private one-on-one lessons, attend Deaf socials, and interact with Deaf people. You can learn a great deal from real-life conversations about how ASL is used in day-to-day life.

5) A conversation with AI feels unrealistic and inauthentic.

AI is very robotic and cannot sign as fast or as smoothly as a real person. A real person’s expressions are far more animated than any known AI’s, which makes the conversation more personal and meaningful. AI might be good to practice with when there is no one to practice with at home, or when you need to brush up on the most basic signs. It is always highly recommended that beginner signers interact with Deaf people in real-life conversations. There are many kinds of slang, Deaf jokes, and Deaf stories that an AI cannot tell or understand.

Even Though Technology Is Advancing, Deaf People Still Value Their Language and Culture

In conclusion, it is great that people are inventing new kinds of AI that help bridge the communication gap between Deaf and hearing people. However, ASL, Deaf culture, and the Deaf community hold a lot of history and significance. Many Deaf people feel that AI would only take away the core value of both their language and their culture. If AI teaches ASL, the language could easily be modified incorrectly and stray from authentic ASL structure, and Deaf people want to prevent that from happening. Deaf teachers always make sure the correct signs, grammar, and sentence structure are being taught. In the end, AI would not make communication between Deaf and hearing people better or easier. The ideal solution is for hearing people to learn American Sign Language, either online or in person, from a real Deaf teacher. When more hearing people start to learn true American Sign Language, it will make Deaf people’s lives and communication a whole lot easier.

 
