
    3D User Interfaces for General-Purpose 3D Animation

    Modern 3D animation systems let a growing number of people generate increasingly sophisticated animated movies, frequently for tutorials or multimedia documents. However, although these tasks are inherently three-dimensional, these systems' user interfaces are still predominantly two-dimensional, which makes it difficult to interactively input complex animated 3D movements. We have developed Virtual Studio, an inexpensive and easy-to-use 3D animation environment in which animators can perform all interaction directly in three dimensions, using 3D devices to specify complex 3D motions. Virtual tools are visible mediators that provide interaction metaphors for controlling application objects, and an underlying constraint solver lets animators tightly couple application and interface objects. Users define an animation by recording the effect of their manipulations on models; Virtual Studio then applies data-reduction techniques to generate an editable representation of each animated element that is manipulated.
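The abstract does not name Virtual Studio's data-reduction algorithm. As a hedged illustration of the idea, the sketch below reduces a densely sampled (time, value) motion channel, as recorded from a user's manipulations, to a sparse set of editable keyframes using a Ramer-Douglas-Peucker-style recursion; the tolerance and the algorithm choice are assumptions, not the paper's method.

```python
def reduce_curve(points, tol):
    """Reduce a sampled (time, value) curve to keyframes within 'tol'."""
    if len(points) < 3:
        return list(points)
    (t0, v0), (t1, v1) = points[0], points[-1]

    def dev(p):
        # Deviation of a sample from the chord joining the endpoints.
        t, v = p
        chord = v0 + (v1 - v0) * (t - t0) / (t1 - t0)
        return abs(v - chord)

    idx = max(range(1, len(points) - 1), key=lambda i: dev(points[i]))
    if dev(points[idx]) <= tol:
        # Everything between the endpoints is close enough to a line:
        # keep only the two endpoint keyframes.
        return [points[0], points[-1]]
    # Otherwise split at the worst sample and recurse on both halves.
    left = reduce_curve(points[: idx + 1], tol)
    right = reduce_curve(points[idx:], tol)
    return left[:-1] + right

# A recorded step motion: ten samples collapse to four keyframes.
samples = [(t, 0.0 if t < 5 else 1.0) for t in range(10)]
keys = reduce_curve(samples, tol=0.05)
```

The reduced `keys` list is the kind of compact, per-element representation an animator could then edit directly instead of touching raw samples.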

    Hypermedia = hypercommunication

    New hardware and software technology has given application designers the freedom to bring a new realism to human-computer interaction. High-quality images, motion video, stereo sound and music, speech, touch, and gesture provide richer data channels between the person and the machine. Ultimately, this will lead to richer communication between people, with the computer as an intermediary. The whole point of hyper-books, hyper-newspapers, and virtual worlds is to transfer the concepts and relationships, the 'data structure', from the mind of the creator to that of the user. Some of the characteristics of this rich information channel are discussed, and some examples are presented.

    Pose-Timeline for Propagating Motion Edits

    Motion editing often requires repetitive operations to modify similar action units so that they give a similar effect or impression. This paper proposes a system for efficiently and flexibly editing a sequence of iterative actions with a few intuitive operations. Our system visualizes a motion sequence on a summary timeline with editable pose-icons, and drag-and-drop operations on the timeline enable intuitive control of temporal properties of the motion such as timing, duration, and coordination. This graphical interface is also suited to transferring kinematic and temporal features between two motions through simple interactions, with a quick preview of the resulting poses. Our method also integrates the concept of edit propagation, by which the manual modification of one action unit is automatically transferred to the other units, which are robustly detected by a similarity search technique. We demonstrate the efficiency of our pose-timeline interface with a propagation mechanism for the timing adjustment of mutual actions and for motion synchronization with a music sequence.
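The edit-propagation idea can be sketched in a few lines: an edit applied to one action unit is copied to every other unit whose pose sequence falls within a similarity threshold of the edited unit. The mean-absolute-difference metric, the threshold, and the flat joint-angle representation below are illustrative assumptions; the paper's similarity search is more elaborate.

```python
def unit_distance(a, b):
    """Mean absolute joint-angle difference between two equal-length units."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def propagate_edit(units, edited_index, edit, threshold=0.1):
    """Apply 'edit' (a function on a unit) to the edited unit and to
    every other unit similar to it, leaving dissimilar units untouched."""
    reference = units[edited_index]
    return [
        edit(u) if unit_distance(u, reference) <= threshold else u
        for u in units
    ]

# Three action units as flat joint-angle lists; units 0 and 2 are similar,
# so scaling unit 0 also propagates to unit 2 but not to unit 1.
units = [[0.0, 1.0, 2.0], [5.0, 5.0, 5.0], [0.05, 1.0, 2.0]]
scaled = propagate_edit(units, 0, lambda u: [x * 2 for x in u])
```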

    Synchronized computational architecture for generalized bilateral control of robot arms

    A master six-degree-of-freedom Force Reflecting Hand Controller (FRHC) is available at a master site, where a received image displays, in essentially real time, a remote robotic manipulator that is controlled in the corresponding six degrees of freedom by command signals transmitted to the remote site in accordance with the movement of the FRHC at the master site. Software is user-initiated at the master site to establish the basic system conditions; a physical movement of the FRHC in Cartesian space is then reflected at the master site by six absolute numbers that are sensed, translated, and computed as a difference signal relative to the earlier position. The change in position is transmitted in that differential form over a high-speed synchronized bilateral communication channel, which simultaneously returns robot-sensed response information to the master site as forces applied to the FRHC, so that the FRHC reflects the feel of what is taking place at the remote site. A system-wide clock rate is selected high enough that the operator at the master site experiences the force-reflecting operation in real time.
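The differential bilateral loop described above can be sketched for a single axis: each clock tick, the master transmits a position difference rather than an absolute position, the remote arm integrates it, and a sensed contact force travels back to be reflected at the hand controller. The spring-contact force model and the numeric parameters below are assumptions for illustration only.

```python
def bilateral_step(master_pos, prev_master_pos, slave_pos, wall=1.0, k=100.0):
    """One synchronized clock tick of the differential bilateral loop."""
    delta = master_pos - prev_master_pos   # differential command signal
    slave_pos += delta                     # remote arm tracks the delta
    # Remote force sensor: spring contact once the arm penetrates the wall.
    force = k * (slave_pos - wall) if slave_pos > wall else 0.0
    return slave_pos, force                # force is reflected to the FRHC

slave, prev, forces = 0.0, 0.0, []
for master in [0.3, 0.6, 0.9, 1.2]:        # operator pushes the FRHC forward
    slave, f = bilateral_step(master, prev, slave)
    forces.append(f)
    prev = master
```

Free motion returns zero force; on the last tick the remote arm penetrates the simulated wall and a proportional force is fed back, which is the "feel" the abstract describes.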

    Audeosynth: music-driven video montage

    We introduce music-driven video montage, a media format that offers a pleasant way to browse or summarize video clips collected from various occasions, such as gatherings and adventures. In music-driven video montage, the music drives the composition of the video content: according to musical movement and beats, video clips are organized to form a montage that visually reflects the experiential properties of the music. Creating such a montage by hand, however, takes enormous manual work and artistic expertise. In this paper, we develop a framework for automatically generating music-driven video montages. The input is a set of video clips and a piece of background music. By analyzing the music and video content, our system extracts carefully designed temporal features from the input, casts the synthesis problem as an optimization, and solves for the parameters through Markov chain Monte Carlo sampling. The output is a video montage whose visual activities are cut and synchronized with the rhythm of the music, rendering a symphony of audio-visual resonance.
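Casting montage synthesis as an optimization solved by MCMC can be sketched as follows: assign one clip to each musical segment so that each clip's visual "activity" matches the segment's beat energy, and explore assignments with Metropolis sampling. The squared-error cost, single-site proposals, and temperature are illustrative assumptions, not the paper's actual formulation.

```python
import math
import random

def cost(assignment, clip_activity, beat_energy):
    """Squared mismatch between assigned clip activity and beat energy."""
    return sum((clip_activity[c] - e) ** 2
               for c, e in zip(assignment, beat_energy))

def mcmc_montage(clip_activity, beat_energy, steps=5000, temp=0.05, seed=0):
    rng = random.Random(seed)
    n = len(beat_energy)
    state = [rng.randrange(len(clip_activity)) for _ in range(n)]
    best, best_cost = state[:], cost(state, clip_activity, beat_energy)
    for _ in range(steps):
        # Propose re-assigning one random segment to a random clip.
        proposal = state[:]
        proposal[rng.randrange(n)] = rng.randrange(len(clip_activity))
        dc = (cost(proposal, clip_activity, beat_energy)
              - cost(state, clip_activity, beat_energy))
        # Metropolis rule: always accept improvements, sometimes accept worse.
        if dc <= 0 or rng.random() < math.exp(-dc / temp):
            state = proposal
        c = cost(state, clip_activity, beat_energy)
        if c < best_cost:
            best, best_cost = state[:], c
    return best, best_cost

clips = [0.1, 0.5, 0.9]        # per-clip visual activity (assumed features)
beats = [0.9, 0.1, 0.5, 0.9]   # per-segment musical energy (assumed features)
order, c = mcmc_montage(clips, beats)
```

On this toy instance the sampler finds the zero-cost assignment that pairs the calmest clip with the quietest segment and the most active clip with the strongest beats.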