
From the invention of Braille to the development of hearing aids, society has strived to overcome challenges faced by the Deaf and Deafblind communities. Today, artificial intelligence stands at the forefront of this movement, creating new possibilities and solutions. In a world driven by technology, accessibility tools are constantly evolving.
For example, the Ray-Ban Meta AI-powered smart glasses are equipped with an HD video camera, open-ear speakers and a small button for capturing images. They also have Bluetooth capability, so they can make phone calls and connect to smartphone applications. The glasses are compatible with the Be My Eyes app, which supports blind and low-vision individuals by connecting them with sighted volunteers.
In addition to compatibility with Be My Eyes, the Meta glasses can connect to a BrailleNote Touch device, which enables blind and Deafblind users to type messages in Braille and send them to a connected device. With the Meta glasses, a Deafblind user can capture an image of their surroundings and receive a Braille description of what the camera captured.

Despite these promising applications, many challenges remain. Denise Crochet, a professor of practice in American Sign Language and a registered interpreter, notes the pitfalls of AI accessibility tools.
“One of the ongoing discussions about AI, specifically within the Deaf community and within the ASL community, is the lack of distinction between [spoken English and ASL] … We see a lot of stakeholders and power structures kind of battling that out … thinking that captioning is the key to everything,” Crochet said. “It’s that continual thrust of belief that English is superior to all other languages.”
For many Deaf and Deafblind individuals, American Sign Language is the only language they grew up using, and learning another language, such as English, is difficult.
“If your first language is American Sign Language, and you don’t have access to a spoken language because you are deaf, then learning and acquiring that second language is a bigger challenge … [that] may lead to miscommunication [and] disfluency,” Crochet said.
Sign language recognition AI is one of the fastest-growing and most researched areas of AI accessibility. This technology reads a signer's movements and handshapes and translates them into English. While the idea could be revolutionary, capturing all of the nuance contained within a sign may be impossible.
“There’s a lot of moving parts to American Sign Language, and I don’t just mean that as a pun,” Crochet said. “An artificial intelligence [model] can be taught to recognize hand shapes, movement, location and palm orientation … but that stickler is the non-manual marker. And while it might be able to look at the face, it doesn’t mean it can interpret [it].”
As the world of artificial intelligence develops, the possibilities for Deaf and Deafblind individuals continue to grow.
“I think we need to proceed with due diligence,” Crochet said. “There’s definitely a potential for it … replace humans’ employment … [but] there’s no way that AI would be better off in all of those schemas [of real-life interactions], because it just doesn’t have the dimension to do that just yet.”