Abstract: The goal of the study reported in the paper was to
determine whether Ambient Occlusion Shading (AOS) has a significant effect on users' perception of American Sign Language (ASL) finger spelling animations. Seventy-one (71) subjects
participated in the study; all subjects were fluent in ASL. The participants were asked to watch forty (40) sign language animation
clips representing twenty (20) finger-spelled words. Twenty (20) clips did not show ambient occlusion, whereas the other twenty (20) were
rendered using ambient occlusion shading. After viewing each animation, subjects were asked to type the word being finger-spelled and rate its legibility. Findings show that the presence of AOS had a significant effect on the subjects' perception of the signed words.
Subjects were able to recognize the animated words rendered with AOS with a higher level of accuracy, and the legibility ratings of the animations showing AOS were consistently higher across subjects.
Abstract: We introduce a new interactive 3D simulation system of ocular motion and expressions suitable for: (1) character animation applications to game design, film production, HCI (Human Computer Interface), conversational animated agents, and virtual reality; (2) medical applications (ophthalmic neurological and muscular pathologies: research and education); and (3) real-time simulation of unconscious cognitive and emotional responses (for use, e.g., in psychological research). The system comprises: (1) a physiologically accurate parameterized 3D model of the eyes, eyelids, and eyebrow regions; and (2) a prototype device for real-time control of eye motions and expressions, including unconsciously produced expressions, for application as in (1), (2), and (3) above. The 3D eye simulation system, created using state-of-the-art computer animation technology and 'optimized' for use with an interactive and web-deliverable platform, is, to our knowledge, the most advanced and realistic available so far for applications in character animation and medical pedagogy.
Abstract: In this paper we report a study aimed at determining
the most effective animation technique for representing ASL
(American Sign Language) finger-spelling. Specifically, in the study
we compare two commonly used 3D computer animation methods
(keyframe animation and motion capture) in order to ascertain which
technique produces the most 'accurate', 'readable', and 'close to
actual signing' (i.e., realistic) rendering of ASL finger-spelling. To
accomplish this goal we developed 20 animated clips of finger-spelled
words and designed an experiment consisting of a
web survey with rating questions. Seventy-one (71) subjects, ages 19-45, participated
in the study. Results showed that recognition of the words was
correlated with the method used to animate the signs. In particular,
the keyframe technique produced the most accurate representation of the
signs (i.e., participants were more likely to identify the words
correctly in keyframed sequences than in motion-captured
ones). Further, findings showed that the animation method had an
effect on the reported scores for readability and closeness to actual
signing; the estimated marginal mean readability and closeness were
greater for keyframed signs than for motion-captured signs. To our
knowledge, this is the first study aimed at measuring and comparing
accuracy, readability and realism of ASL animations produced with
different techniques.
Abstract: In this paper we report a study aimed at determining
the effects of animation on usability and appeal of educational
software user interfaces. Specifically, the study compares 3
interfaces developed for the Mathsigner™ program: a static
interface, an interface with highlighting/sound feedback, and an
interface that incorporates five Disney animation principles. The
main objectives of the comparative study were to: (1) determine
which interface is the most effective for the target users of
Mathsigner™ (i.e., children ages 5-11), and (2) identify any Gender
and Age differences in using the three interfaces. To accomplish
these goals we have designed an experiment consisting of a
cognitive walkthrough and a survey with rating questions. Sixteen
children ages 7-11 participated in the study, ten males and six
females. Results showed no significant interface effect on user task
performance (i.e., task completion time and number of errors);
however, interface differences were seen in rating of appeal, with
the animated interface rated more 'likeable' than the other two.
Task performance and rating of appeal were not affected
significantly by Gender or Age of the subjects.
Abstract: The objective of this research is the design and
evaluation of an immersive Virtual Learning Environment (VLE) for
deaf children. Recently we developed a prototype immersive
VR game to teach sign language mathematics to deaf students in
grades K-4 [1] [2]. In this paper we describe a significant extension of the
prototype application. The extension includes: (1) user-centered
design and implementation of two additional interactive
environments (a clock store and a bakery), and (2) user-centered
evaluation including development of user tasks, expert panel-based
evaluation, and formative evaluation. This paper is one of the few to
focus on the importance of user-centered, iterative design in VR
application development, and to describe a structured evaluation
method.