6,436 research outputs found

    Towards human technology symbiosis in the haptic mode

    Search and rescue operations are often undertaken in dark and noisy environments in which rescue teams must rely on haptic feedback for exploration and safe exit. However, little attention has been paid specifically to haptic sensitivity in such contexts or to the possibility of enhancing communicational proficiency in the haptic mode as a life-preserving measure. Here we discuss the design of a haptic guide robot, inspired by careful study of the communication between a blind person and a guide dog. In the case of this partnership, the development of a symbiotic relationship between person and dog, based on mutual trust and confidence, is a prerequisite for successful task performance. We argue that a human-technology symbiosis is equally necessary and possible in the case of the robot guide. But this depends on the robot becoming 'transparent technology' in Andy Clark's sense. We report on initial haptic mode experiments in which a person uses a simple mobile mechanical device (a metal disk fixed with a rigid handle) to explore the immediate environment. These experiments demonstrate the extreme sensitivity and trainability of haptic communication and the speed with which users develop and refine their haptic proficiencies in using the device, permitting reliable and accurate discrimination between objects of different weights. We argue that such trials show the transformation of the mobile device into a transparent information appliance and the beginnings of the development of a symbiotic relationship between device and human user. We discuss how these initial explorations may shed light on the more general question of how a human mind, on being exposed to an unknown environment, may enter into collaboration with an external information source in order to learn about, and navigate, that environment.

    Visualisation techniques, human perception and the built environment

    Historically, architecture has a wealth of visualisation techniques that have evolved throughout the period of structural design, with Virtual Reality (VR) being a relatively recent addition to the toolbox. To date the effectiveness of VR has been demonstrated from conceptualisation through to final stages and maintenance; however, its full potential has yet to be realised (Bouchlaghem et al, 2005). According to Dewey (1934), perceptual integration was predicted to be transformational, as the observer would be able to ‘engage’ with the virtual environment. However, environmental representations are predominantly focused on the area of vision, despite evidence that the experience is multisensory. In addition, there is a marked lack of research exploring the complex interaction of environmental design and the user, such as the role of attention or conceptual interpretation. This paper identifies the potential of VR models to aid communication for the Built Environment with specific reference to human perception issues.

    Is Vivaldi smooth and takete? Non-verbal sensory scales for describing music qualities

    Studies on the perception of music qualities (such as induced or perceived emotions, performance styles, or timbre nuances) make extensive use of verbal descriptors. Although many authors have noted that particular music qualities can hardly be described by means of verbal labels, few studies have tried alternatives. This paper aims at exploring the use of non-verbal sensory scales, in order to represent different perceived qualities in Western classical music. Musically trained and untrained listeners were required to listen to six musical excerpts in major key and to evaluate them from a sensorial and semantic point of view (Experiment 1). The same design (Experiment 2) was conducted using musically trained and untrained listeners who were required to listen to six musical excerpts in minor key. The overall findings indicate that subjects’ ratings on non-verbal sensory scales are consistent throughout, and the results support the hypothesis that sensory scales can convey some specific sensations that cannot be described verbally, offering interesting insights to deepen our knowledge of the relationship between music and other sensorial experiences. Such research can foster interesting applications in the field of music information retrieval and timbre space exploration, together with experiments applied to different musical cultures and contexts.

    From presence to consciousness through virtual reality

    Immersive virtual environments can break the deep, everyday connection between where our senses tell us we are and where we are actually located and whom we are with. The concept of 'presence' refers to the phenomenon of behaving and feeling as if we are in the virtual world created by computer displays. In this article, we argue that presence is worthy of study by neuroscientists, and that it might aid the study of perception and consciousness.

    Exploring relationships between touch perception and surface physical properties

    This paper reports a study of materials for confectionery packaging. The aim was to explore the touch perceptions of textures and identify their relationships with the surfaces' physical properties. Thirty-seven tactile textures were tested, including 22 cardboards, nine flexible materials and six laminate boards. Semantic differential questionnaires were administered to assess responses to touching the textures against six word pairs: warm-cold, slippery-sticky, smooth-rough, hard-soft, bumpy-flat, and wet-dry. Four physical measurements were conducted to characterize the surfaces' roughness, compliance, friction, and the rate of cooling of an artificial finger when touching the surface. Correlation and regression analyses were carried out to identify the relationships between people's responses and the physical measurements. Results show that touch perception is often associated with more than one physical property, and the strength and form of the combined contribution can be represented by a regression model. © 2009 Chen, Shao, Barnes, Childs, & Henson
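    The regression step described above — relating a perceptual rating to several physical measurements at once — can be sketched as an ordinary least-squares fit. All data values below are invented for illustration and are not from the study; only the structure (four physical predictors, one semantic differential rating) follows the abstract.

    ```python
    import numpy as np

    # Hypothetical measurements for 8 surfaces: columns are roughness,
    # compliance, friction, and cooling rate (all normalised, illustrative).
    X = np.array([
        [0.8, 0.2, 0.5, 0.9],
        [0.3, 0.7, 0.4, 0.3],
        [0.6, 0.1, 0.8, 0.7],
        [0.2, 0.9, 0.3, 0.2],
        [0.9, 0.3, 0.6, 0.8],
        [0.4, 0.6, 0.2, 0.4],
        [0.7, 0.4, 0.7, 0.6],
        [0.1, 0.8, 0.1, 0.1],
    ])
    # Hypothetical mean ratings on one word pair (e.g. warm-cold).
    y = np.array([4.1, 2.0, 3.6, 1.5, 4.3, 2.4, 3.5, 1.2])

    # Add an intercept column and fit by ordinary least squares.
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    predicted = A @ coef
    r_squared = 1 - np.sum((y - predicted) ** 2) / np.sum((y - y.mean()) ** 2)
    print(coef)       # intercept followed by one weight per physical property
    print(r_squared)  # proportion of rating variance the model explains
    ```

    A rating loading on several non-zero weights at once is exactly the paper's point that one touch perception can combine contributions from more than one physical property.
    
    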

    A Framework to Describe, Analyze and Generate Interactive Motor Behaviors

    While motor interaction between a robot and a human, or between humans, has important implications for society as well as promising applications, little research has been devoted to its investigation. In particular, it is important to understand the different ways two agents can interact and generate suitable interactive behaviors. Towards this end, this paper introduces a framework for the description and implementation of interactive behaviors of two agents performing a joint motor task. A taxonomy of interactive behaviors is introduced, which can classify tasks and cost functions that represent the way each agent interacts. The role of an agent interacting during a motor task can be directly explained from the cost function this agent is minimizing and the task constraints. The novel framework is used to interpret and classify previous works on human-robot motor interaction. Its implementation power is demonstrated by simulating representative interactions of two humans. It also enables us to interpret and explain the role distribution and switching between roles when performing joint motor tasks.
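    The idea that an agent's role follows from the cost function it minimizes can be illustrated with a toy simulation. This is a minimal sketch under assumed dynamics and weights, not the paper's implementation: two agents jointly move a shared point toward a target, each greedily minimizing its own one-step quadratic cost J_i = w_err_i(x_next - target)^2 + w_eff_i u_i^2 while assuming the other applies no force. A "leader" weights tracking error heavily; a "follower" weights effort.

    ```python
    def greedy_force(x, target, dt, w_err, w_eff):
        # Closed-form minimizer of the one-step quadratic cost over u_i.
        return w_err * dt * (target - x) / (w_err * dt**2 + w_eff)

    dt, target = 0.1, 1.0
    x = 0.0
    leader = dict(w_err=10.0, w_eff=0.1)   # cares mostly about the goal
    follower = dict(w_err=0.5, w_eff=1.0)  # mostly yields

    forces = []
    for _ in range(50):
        u1 = greedy_force(x, target, dt, **leader)
        u2 = greedy_force(x, target, dt, **follower)
        x += (u1 + u2) * dt                # shared-point dynamics
        forces.append((u1, u2))

    print(round(x, 3))                     # shared point converges to the target
    # Role distribution emerges from the weights alone: the leader
    # contributes far more force than the follower at every step.
    print(sum(abs(u1) for u1, _ in forces) > sum(abs(u2) for _, u2 in forces))
    ```

    Swapping the two agents' weight dictionaries mid-run would produce the kind of role switching the framework is designed to describe.
    
    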