    Motion analysis report

    Human motion analysis is the task of converting actual human movements into computer-readable data. Such movement information may be obtained through active or passive sensing methods. Active methods include physical measuring devices such as goniometers on joints of the body, force plates, and manually operated sensors such as a Cybex dynamometer. Passive sensing decouples the position-measuring device from actual human contact. Passive sensors include Selspot scanning systems (since there is no mechanical connection between the subject's attached LEDs and the infrared sensing cameras), sonic (spark-based) three-dimensional digitizers, Polhemus six-dimensional tracking systems, and image processing systems based on multiple views and photogrammetric calculations.
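
    To make the photogrammetric idea concrete, the sketch below triangulates one marker from two calibrated camera views by finding the point midway between the two viewing rays. It is only an illustration of the multi-view principle, not code from the report; the two-camera setup and the helper name are assumptions.

        def triangulate_midpoint(c1, d1, c2, d2):
            """Closest-point (midpoint) triangulation of two camera rays.

            c1, c2 are camera centers; d1, d2 are unit direction vectors
            from each camera toward the same marker.  Returns the 3-D point
            midway between the closest points on the two rays.
            Hypothetical helper, for illustration only.
            """
            def dot(u, v):
                return sum(ui * vi for ui, vi in zip(u, v))

            w = [p - q for p, q in zip(c1, c2)]
            a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
            d, e = dot(d1, w), dot(d2, w)
            denom = a * c - b * b          # approaches 0 for near-parallel rays
            t1 = (b * e - c * d) / denom
            t2 = (a * e - b * d) / denom
            p1 = [ci + t1 * di for ci, di in zip(c1, d1)]
            p2 = [ci + t2 * di for ci, di in zip(c2, d2)]
            return [(u + v) / 2 for u, v in zip(p1, p2)]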

    TEMPUS: Simulating personnel and tasks in a 3-D environment

    The latest TEMPUS installation occurred in March 1985. Another update is slated for early June 1985. An updated User's Manual is in preparation and will be delivered approximately mid-June 1985. NASA JSC has full source code listings and internal documentation for installed software. NASA JSC staff has received instruction in the use of TEMPUS. Telephone consultations have augmented on-site instruction.

    Strength Modeling Report

    Strength modeling is a complex and multi-dimensional issue. There are numerous parameters to the problem of characterizing human strength, most notably: (1) position and orientation of body joints; (2) isometric versus dynamic strength; (3) effector force versus joint torque; (4) instantaneous versus steady force; (5) active force versus reactive force; (6) presence or absence of gravity; (7) body somatotype and composition; (8) body (segment) masses; (9) muscle group involvement; (10) muscle size; (11) fatigue; and (12) practice (training) or familiarity. In surveying the available literature on strength measurement and modeling, an attempt was made to examine as many of these parameters as possible. The conclusions reached so far point toward the feasibility of implementing computationally reasonable human strength models. The assessment of accuracy of any model against a specific individual, however, will probably not be possible on any realistic scale. Taken statistically, strength modeling may be an effective tool for general questions of task feasibility and strength requirements.
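
    As a rough illustration of the kind of statistical, parameterized model the survey finds feasible, the sketch below folds a population-percentile torque figure, a joint-angle falloff, and a fatigue factor into a single static estimate. Every constant, function name, and the quadratic angle term are illustrative assumptions, not values from the report.

        def max_joint_torque(percentile_torque, joint_angle_deg,
                             peak_angle_deg=70.0, fatigue=0.0):
            """Illustrative static strength estimate for one joint.

            percentile_torque -- population-based maximum torque (N*m) for the
                                 chosen somatotype/percentile (parameters 7-8)
            joint_angle_deg   -- current joint position (parameter 1)
            fatigue           -- 0.0 fresh .. 1.0 exhausted (parameter 11)
            The quadratic angle falloff and all constants are placeholders.
            """
            angle_factor = max(0.0, 1.0 - 1e-4 * (joint_angle_deg - peak_angle_deg) ** 2)
            return percentile_torque * angle_factor * (1.0 - fatigue)

        def task_feasible(required_torque, percentile_torque, joint_angle_deg, fatigue=0.0):
            """Statistical feasibility check: can this population segment
            produce the torque the task demands at the given posture?"""
            return required_torque <= max_joint_torque(percentile_torque,
                                                       joint_angle_deg, fatigue=fatigue)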

    Eyes Alive

    For an animated human face model to appear natural, it should produce eye movements consistent with human ocular behavior. During face-to-face conversational interactions, the eyes convey conversational turn-taking and reflect the agent's thought processes through gaze direction, saccades, and scan patterns. We have implemented an eye movement model based on empirical models of saccades and statistical models of eye-tracking data. Face animations using stationary eyes, eyes with random saccades only, and eyes with statistically derived saccades are compared to evaluate whether they appear natural and effective while communicating.
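
    A minimal sketch of the statistically driven saccade idea follows: each saccade's magnitude, direction, duration, and the gaze hold before it are drawn from simple distributions. The distributions and constants here are illustrative placeholders; the model described above fits them to empirical saccade models and recorded eye-tracking data.

        import random

        GAZE_DIRECTIONS = ["up", "down", "left", "right",
                           "up-left", "up-right", "down-left", "down-right"]

        def sample_saccade():
            """Draw one saccade from simple statistical models (placeholder values)."""
            magnitude_deg = min(30.0, random.expovariate(1.0 / 8.0))  # small saccades dominate
            direction = random.choice(GAZE_DIRECTIONS)                # could be non-uniform
            duration_ms = 2.2 * magnitude_deg + 21.0                  # main-sequence-style linear fit
            dwell_ms = random.expovariate(1.0 / 750.0)                # gaze hold before the saccade
            return magnitude_deg, direction, duration_ms, dwell_ms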

    A3I visibility modeling project

    The Army-NASA Aircrew/Aircraft Integration (A3I) program is supporting a joint project to build a visibility computer-aided design (CAD) tool. CAD has become an essential tool in modern engineering applications. CAD tools are used to create engineering drawings and to evaluate potential designs before they are physically realized. The visibility CAD tool will provide the design engineer with a tool to aid in the location and specification of windows, displays, and controls in crewstations. In an aircraft cockpit, the location of instruments and the emissive and reflective characteristics of the surfaces must be determined to assure adequate aircrew performance. The visibility CAD tool will allow the designer to ask and answer many of these questions in the context of a three-dimensional graphical representation of the cockpit. The graphic representation of the cockpit is a geometrically valid model of the cockpit design. A graphic model of a pilot, called the pilot manikin, can be placed naturalistically in the cockpit model. The visibility tool has the capability of mapping the cockpit surfaces and other objects modeled in this graphic design space onto the simulated pilot's retinas for a given visual fixation.
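
    The geometric core of such a retinal mapping can be sketched as follows: take the line of sight from the eye to the fixation point, measure each cockpit point's angular eccentricity from that line, and test it against a field-of-view limit. This is only an illustration of the projection idea; the function names, the cone-shaped field of view, and the 95-degree half-angle are assumptions, and occlusion by cockpit geometry is ignored.

        import math

        def eccentricity_deg(eye, fixation, target):
            """Angle (degrees) between the gaze line (eye -> fixation) and
            the line from the eye to a cockpit point."""
            def unit(v):
                n = math.sqrt(sum(c * c for c in v)) or 1.0
                return [c / n for c in v]
            gaze = unit([f - e for f, e in zip(fixation, eye)])
            ray = unit([t - e for t, e in zip(target, eye)])
            cos_a = max(-1.0, min(1.0, sum(g * r for g, r in zip(gaze, ray))))
            return math.degrees(math.acos(cos_a))

        def visible(eye, fixation, target, fov_half_angle_deg=95.0):
            """Crude visibility test against a field-of-view cone (no occlusion)."""
            return eccentricity_deg(eye, fixation, target) <= fov_half_angle_deg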

    Disk Generators for a Raster Display Device

    A simple modification of Horn's circle drawing procedure yields a disk generator for a class of graphic devices capable of drawing rectangular areas. Another variation produces the disk one scan line at a time, allowing it to be drawn at the refresh rate of the display. The calculations involve only additions and binary shifts.
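
    In the spirit of the abstract, here is a sketch of a scan-line disk generator that uses only integer additions and binary shifts; it is written from the stated idea, not taken from the paper, so the details may differ from Horn's actual modification.

        def disk_spans(cx, cy, r):
            """Yield a filled disk one scan line at a time as (y, x_left, x_right)
            spans, using only integer additions and binary shifts."""
            x = r
            e = 0                          # invariant: e = r*r - x*x - y*y >= 0
            for y in range(r + 1):
                if y:
                    e -= (y << 1) - 1      # y grew by 1, so y*y grew by 2y - 1
                    while e < 0:           # pull the span in until it fits the circle
                        e += (x << 1) - 1
                        x -= 1
                yield cy + y, cx - x, cx + x
                if y:
                    yield cy - y, cx - x, cx + x

        # Example: rasterize a radius-5 disk into a character grid.
        grid = [[" "] * 17 for _ in range(17)]
        for y, x0, x1 in disk_spans(8, 8, 5):
            for x in range(x0, x1 + 1):
                grid[y][x] = "#"
        print("\n".join("".join(row) for row in grid))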

    Animation 2000++

    In the next millennium, computer animation will be both the same as now and also very different. Animators will always have tools that allow them to specify and control, through manual interactive interfaces, every nuance of shape, movement, and parameter settings. But whether for skilled animators or novices, the future of animation will present a fantastically expanded palette of possibilities: techniques, resources, and libraries for creating and controlling movements.

    Modeling and Animating Human Figures in a CAD Environment

    With the widespread acceptance of three-dimensional modeling techniques, high-speed hardware, and relatively low-cost computation, modeling and animating one or more human figures for the purposes of design assessment, human factors, task simulation, and human movement understanding has become feasible outside the animation production house environment. This tutorial will address the state of the art in human figure geometric modeling, figure positioning, figure animation, and task simulation.

    Virtual Humans for Animation, Ergonomics, and Simulation

    The last few years have seen great maturation in the computation speed and control methods needed to portray 3D virtual humans suitable for real interactive applications. We first describe the state of the art, then focus on the particular approach taken at the University of Pennsylvania with the Jack system. Various aspects of real-time virtual humans are considered, such as appearance and motion, interactive control, autonomous action, gesture, attention, locomotion, and multiple individuals. The underlying architecture consists of a sense-control-act structure that permits reactive behaviors to be locally adaptive to the environment, and a PaT-Net parallel finite-state machine controller that can be used to drive virtual humans through complex tasks. Finally, we argue for a deep connection between language and animation and describe current efforts in linking them through the JackMOO extension to lambdaMOO.
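
    The PaT-Net idea can be sketched schematically as several finite-state nets stepped in parallel inside a sense-control-act loop, each net testing conditions on the sensed environment and firing an action as it transitions. The class and function names below are hypothetical; the actual PaT-Net system is part of Jack and is not reproduced here.

        class Net:
            """One finite-state net: state -> list of (condition, action, next_state)."""
            def __init__(self, start, transitions):
                self.state = start
                self.transitions = transitions

            def step(self, senses):
                for condition, action, next_state in self.transitions.get(self.state, []):
                    if condition(senses):
                        action(senses)              # act
                        self.state = next_state     # transition
                        break

        def run(nets, sense, ticks=100):
            """Sense-control-act loop: each net gets one transition chance per tick,
            so the nets run in parallel (interleaved) over the shared environment."""
            for _ in range(ticks):
                senses = sense()
                for net in nets:
                    net.step(senses)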

    Virtual Beings
