BOSTON - Even though Joan Nash has been using American Sign Language for most of her life and has made a career of teaching deaf and hearing-impaired children, she is sometimes stumped when she encounters a sign she has never seen.
She can't just look it up in a dictionary. At least not yet.
Nash, a doctoral student at Boston University, is part of a team working on an interactive video project that would allow someone to demonstrate a sign in front of a camera, and have a computer program interpret and explain its meaning.
"Sometimes when I see a sign I don't know it can be frustrating as you run around asking people and trying to find out what it is," she said.
American Sign Language has no written form, and although there are print and video ASL dictionaries, they are organized by English word: one needs to know what a sign means before one can look it up. That's sort of like trying to figure out the meaning of a foreign word by looking it up under its English equivalent.
"I know from my own experience that it's really hard if you see a sign that you don't know, either in a class, in a video you've been assigned to watch, or even if you see it on the street, to figure out what it means," said linguistics professor Carol Neidle, one of the project's lead researchers along with BU's Stan Sclaroff and Vassilis Athitsos at the University of Texas-Arlington.
The researchers, working with a three-year, $900,000 grant from the National Science Foundation, are in the early stages of the project, capturing thousands of ASL words on video in a brightly lit Boston University lab.
The goal is to develop a lexicon of more than 3,000 signs. The meaning of each sign is determined not just by the shape of the hands, but also by the movements of the hands and arms, and even by facial expressions.
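In programming terms, each lexicon entry might be modeled something like the sketch below. This is a hypothetical illustration; the field names and handshape labels are assumptions, not the BU project's actual data format.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of one dictionary entry. The fields mirror the
# properties the researchers describe: handshape, hand/arm movement,
# and facial expression. Names and values are illustrative only.
@dataclass
class SignEntry:
    gloss: str                            # English label, e.g. "SWEEP"
    handshapes: list[str]                 # start and end handshapes
    movement: str                         # description of hand/arm motion
    facial_expression: str | None = None  # non-manual marker, if any
    video_clips: list[str] = field(default_factory=list)  # recorded examples

lexicon: list[SignEntry] = [
    SignEntry("SWEEP", handshapes=["B", "B"], movement="repeated brushing arc"),
    SignEntry("SWIMMING", handshapes=["B", "B"], movement="alternating forward strokes"),
]
```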
As Nash scrolls through hundreds of words alphabetically in English - sweep, sweetheart, swimming, symbol, system - Elizabeth Cassidy, a native ASL speaker, signs them for four cameras, three in front of her and one on her right. Two cameras shoot close-ups from different angles, and one takes a wider shot.
Cassidy is one of four "linguistic consultants" who will eventually sign for the cameras.
Cassidy grew up in a family with three deaf siblings and was signing before she could speak, but even she sees unfamiliar signs once in a while. "A project like this is a long time coming," she said.
The researchers aim to use the technology to develop a multimedia ASL dictionary that would help parents better communicate with deaf children, and help sign language students.
There are more than 20 million Americans classified as either deaf or hard of hearing, nearly one million of whom are children, according to Gallaudet University, a leading school for the deaf.
"Ninety per cent of deaf children are born into hearing families," Neidle said. "And it's not uncommon for parents to have difficulty understanding their kids."
The technology under development could help.
For example, if a deaf child signs to a parent who doesn't understand the sign, the parent could sit down in front of a computer, replicate the sign in front of any commercial webcam, and the program would identify possible translations by recognizing the sign's visual properties.
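In outline, that lookup could work as a nearest-neighbor search: reduce the webcam clip to a vector of measurements of the hands, arms and face, then rank the signs in the lexicon by how closely their vectors match. The sketch below is a simplified assumption about how such matching might be coded, not the team's actual algorithm; extract_features is a placeholder for the difficult computer-vision step.

```python
import numpy as np

def extract_features(video_frames: np.ndarray) -> np.ndarray:
    """Reduce a clip to a fixed-length vector describing handshape,
    movement and facial expression. Placeholder: the real work
    (hand tracking, trajectory encoding) would happen here."""
    ...

def best_matches(query: np.ndarray,
                 lexicon_features: np.ndarray,  # one row per known sign
                 glosses: list[str],            # English labels, row-aligned
                 top_k: int = 5) -> list[str]:
    """Return the top_k closest signs by Euclidean distance."""
    distances = np.linalg.norm(lexicon_features - query, axis=1)
    return [glosses[i] for i in np.argsort(distances)[:top_k]]
```

The program could then present the top few candidates, with their recorded videos, and let the parent pick the one that matches what the child signed.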
The technology could also be an important tool for students of ASL, the fourth-most studied foreign language in U.S. colleges, according to a 2007 report by the Modern Language Association of America. As many as two million people in the United States use sign language, said Bobbie Beth Scoggins, president of the National Association of the Deaf.
Sclaroff, chairman of Boston University's computer science department, even envisions the technology being used for computer-based automatic translation and searches of ASL video streams.
Scoggins said there is a tremendous need for such technology with the rising interest in learning ASL. "It provides a better scope of the language for people, whether they are parents of deaf children, students, teachers or people who are simply curious about American Sign Language," she said.