Evaluating American Sign Language Generation Through the Participation of Native ASL Signers
By Huenerfauth, Matt; Zhao, Liming; Gu, Erdan; Allbeck, Jan
ASSETS 2007 - The Ninth International ACM SIGACCESS Conference on Computers and Accessibility, pp. 211-218
Publication Date: October 15-17, 2007
Overview of a module that generates computer animations of American Sign Language (ASL) sentences containing classifier predicates, a type of ASL phrase used to indicate the spatial location, size, shape, and movement of objects. The prototype module can translate a limited range of English input sentences into animations of ASL performance in which an onscreen 3D human-like character performs a set of classifier predicates to convey the locations and movements of the entities in the English text. For an evaluation study, two groups of animations of 10 sentences each were developed: ASL classifier-predicate animations produced by the system and, for comparison, Signed-English animations reflecting the current state of the art in broad-coverage English-to-sign translation. The study recruited 15 participants who were native ASL signers to rate the animations on (1) grammaticality, (2) understandability, and (3) naturalness of movement, and (4) to identify which visualization correctly matched each animation. Results showed the classifier-predicate generator's higher scores to be statistically significant: over 80 percent overall, compared with about 54 percent for the Signed-English animations. Implications for feedback-driven improvements to the ASL generator are discussed.
Published by: Association for Computing Machinery (Website: http://www.acm.org)
SIGACCESS (ACM Special Interest Group on Accessible Computing) (Website: http://www.sigaccess.org)