Synthetic Animation of Deaf Signing Gestures (Abstract)

Richard Kennaway, University of East Anglia

This paper was presented at the International Gesture Workshop 2001, held 18-20 April at City University, London. A revised version of the full paper appears in the published proceedings (LNAI vol. 2298, eds. Wachsmuth & Sowa, pp. 146-157) and is linked below.

We describe a method for synthesizing deaf signing animations from a description of signs in terms of the HamNoSys transcription system. This work forms one component of the ViSiCAST project, a three-year multi-partner project whose aim is the automated signing of broadcast television.

HamNoSys is a broad phonetic transcription of the physical action of producing a sign. Our animation software must make precise the notation's fuzzy categories, such as "near to" or "fast", and must fill in details the notation leaves implicit, such as how to move the arms so as to place the hands in the specified positions.
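
To make this concrete, here is a minimal sketch, in Python, of what such defuzzification and gap-filling can look like. The distance table, the resolve_proximity function, and the two-link planar inverse-kinematics routine are all hypothetical simplifications invented for this page, not the actual ViSiCAST implementation; a real arm has more joints and moves in three dimensions.

    import math

    # Hypothetical mapping from fuzzy proximity categories to
    # distances in metres; a real system would tune these per signer.
    PROXIMITY = {"touching": 0.0, "close": 0.05, "near": 0.15, "far": 0.40}

    def resolve_proximity(category):
        """Make a fuzzy distance category precise as a number."""
        return PROXIMITY[category]

    def two_link_ik(x, y, l1, l2):
        """Place the hand of a planar two-link arm at (x, y).

        Returns (shoulder, elbow) joint angles in radians. This is
        the kind of detail a transcription leaves implicit: it names
        the hand position, not the arm posture that achieves it.
        """
        d2 = x * x + y * y
        # Clamp to tolerate unreachable or borderline targets.
        c = max(-1.0, min(1.0, (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)))
        elbow = math.acos(c)
        shoulder = math.atan2(y, x) - math.atan2(
            l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
        return shoulder, elbow

    # Example: place the hand "near" a point on the chest midline.
    offset = resolve_proximity("near")
    print(two_link_ik(0.25, 0.30 + offset, 0.30, 0.28))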

We use a simplified biocontrol model to generate realistic animations of the movement from one posture to another.
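
The sketch below conveys the general flavour of such a model, assuming a critically damped second-order controller that drives each joint angle toward its target posture; the specific equations and the stiffness value omega are illustrative assumptions, not the model used in the paper.

    def damped_move(start, target, omega=8.0, dt=1.0 / 60.0, steps=60):
        """Drive one joint angle from start toward target.

        Critically damped second-order dynamics: the joint
        accelerates toward the target and settles without
        overshoot, giving a smooth, human-looking approach.
        omega (rad/s) controls how briskly the movement completes.
        """
        x, v = start, 0.0
        trajectory = [x]
        for _ in range(steps):
            a = omega * omega * (target - x) - 2.0 * omega * v
            v += a * dt   # semi-implicit Euler integration
            x += v * dt
            trajectory.append(x)
        return trajectory

    # Rotate an elbow from 0.2 rad to 1.4 rad over one second at 60 fps.
    frames = damped_move(0.2, 1.4)
    print("final angle: %.3f rad" % frames[-1])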

For further information see the following links:

Synthetic animation

Background material


This page created 2001-Apr-24 by Richard Kennaway.