Local governments in Japan are turning to artificial intelligence to improve communication with people who are deaf or hard of hearing at public service counters.
A system jointly developed by the University of Electro-Communications in Tokyo and SoftBank Corp. converts sign-language gestures into written text.
While the system currently requires equipment at counters, municipalities hope it will eventually be usable with just a smartphone.
[Photo: The display of a system that recognizes sign language movements using artificial intelligence. (Photo courtesy of SoftBank Corp.)(Kyodo)]
At the Narashino city office in Chiba Prefecture, near Tokyo, a woman with a hearing disability asked directions to the restroom using sign language while standing in front of a camera.
A text translation appeared on a staff member’s computer display after about three seconds. The spoken response then appeared as text on the screen in front of the woman, making for a smooth interaction.
"Sure Talk," the AI system that translates sign language into Japanese text, uses image recognition technology to analyze the skeletal movements of several parts of the body, such as the fingers and arms. Sign language footage from hundreds of people was digitized to develop the system.
Although conversations with people who are deaf or hard of hearing can be carried out in writing, the AI system is "much smoother because the translation takes place in real time," a Narashino city official said.
Mito in Ibaraki Prefecture and Chofu in Tokyo have also set up desks at city offices to assist people using the system.
"Sure Talk" still has considerable room for improvement. Currently, it can accurately translate gestures into only about 1,500 Japanese words. "A huge amount of sign language data is needed to build a model for accurate translations of signs into Japanese text," said a SoftBank engineer involved in developing the system.
To improve the system's accuracy, the mobile communications and internet services company has deemed it necessary to launch a website and smartphone app requesting cooperation from the public, calling on as many people as possible to send in sign language videos.
In a related development, Hokkaido University and Nippon Telegraph and Telephone East Corp. are jointly developing an AI-based automated sign-language translation system.
They aim to set up cameras in hospitals, pharmacies, tourist sites and other places so that people who are deaf or hard of hearing can make inquiries even when no staff member capable of using sign language is present.
The Nippon Foundation and Google LLC have developed “Sign Town,” a game to help people enjoy learning sign language. Players advance when they can answer questions correctly using signs directed at computer cameras. The game is available for free on the public interest foundation’s website.
But for AI to be an effective tool for people who are deaf or hard of hearing, the technology must capture far more nuance, argues the Japanese Federation of the Deaf, something that will be difficult to achieve in the short term, if at all.
“As sign language has dialects and regionally unique expressions, the current AI-based sign language translation is not enough. If made more accurate, it will become an effective means for having simple conversations and responding to inquiries at public offices and other places,” an official of the federation said.