My daughter is in her second year of learning American Sign Language, and it seems NVIDIA is on the road to putting her out of a translator job. On a serious note, having this on your phone could make everyday communication so much easier. Ironic, though, putting this in podcast form, considering the deaf will never hear it.
By bridging the gap between a visual language and a written one, the model opens endless possibilities, according to Ahmed. Its ability to map body language could help predict certain health conditions. And in an augmented-reality world, phones wouldn't even be a necessary medium: translated words could simply appear beside the speaker's face.
Discussion
Source: [H]ardOCP – NVIDIA AI Podcast on American Sign Language