
    Towards human technology symbiosis in the haptic mode

    Search and rescue operations are often undertaken in dark and noisy environments in which rescue teams must rely on haptic feedback for exploration and safe exit. However, little attention has been paid specifically to haptic sensitivity in such contexts, or to the possibility of enhancing communicative proficiency in the haptic mode as a life-preserving measure. Here we discuss the design of a haptic guide robot, inspired by careful study of the communication between a blind person and a guide dog. In this partnership, the development of a symbiotic relationship between person and dog, based on mutual trust and confidence, is a prerequisite for successful task performance. We argue that a human-technology symbiosis is equally necessary and possible in the case of the robot guide, but that this depends on the robot becoming 'transparent technology' in Andy Clark's sense. We report on initial haptic mode experiments in which a person uses a simple mobile mechanical device (a metal disk fixed with a rigid handle) to explore the immediate environment. These experiments demonstrate the extreme sensitivity and trainability of haptic communication and the speed with which users develop and refine their haptic proficiencies in using the device, permitting reliable and accurate discrimination between objects of different weights. We argue that such trials show the transformation of the mobile device into a transparent information appliance and the beginnings of a symbiotic relationship between device and human user. We discuss how these initial explorations may shed light on the more general question of how a human mind, on being exposed to an unknown environment, may enter into collaboration with an external information source in order to learn about, and navigate, that environment.

    Exploring haptic interfacing with a mobile robot without visual feedback

    Search and rescue scenarios are often complicated by low- or no-visibility conditions. The lack of visual feedback hampers orientation and causes significant stress for human rescue workers. The Guardians project [1] pioneered a group of autonomous mobile robots assisting a human rescue worker operating within close range. Trials were held with firefighters of South Yorkshire Fire and Rescue. It became clear that the subjects were by no means prepared to give up their procedural routines and the feeling of security those routines provide: they simply ignored instructions that contradicted their routines.

    Symbol Emergence in Robotics: A Survey

    Humans can learn the use of language through physical interaction with their environment and semiotic communication with other people. It is very important to obtain a computational understanding of how humans can form a symbol system and acquire semiotic skills through their autonomous mental development. Recently, many studies have been conducted on the construction of robotic systems and machine-learning methods that can learn the use of language through embodied multimodal interaction with their environment and other systems. Understanding human social interactions, and developing a robot that can smoothly communicate with human users over the long term, requires an understanding of the dynamics of symbol systems and is therefore crucially important. The embodied cognition and social interaction of participants gradually change a symbol system in a constructive manner. In this paper, we introduce a field of research called symbol emergence in robotics (SER). SER is a constructive approach towards an emergent symbol system. The emergent symbol system is socially self-organized through both semiotic communications and physical interactions with autonomous cognitive developmental agents, i.e., humans and developmental robots. Specifically, we describe some state-of-the-art research topics concerning SER, e.g., multimodal categorization, word discovery, and double articulation analysis, that enable a robot to obtain words and their embodied meanings from raw sensory-motor information, including visual information, haptic information, auditory information, and acoustic speech signals, in a totally unsupervised manner. Finally, we suggest future directions of research in SER.
    Comment: submitted to Advanced Robotics

    Business Case and Technology Analysis for 5G Low Latency Applications

    A large number of new consumer and industrial applications are likely to change classic operator business models and open a wide range of new markets. This article analyses the most relevant 5G use cases that require ultra-low latency, from both technical and business perspectives. Low latency services pose challenging requirements on the network, and to fulfill them operators need to invest in costly changes to their infrastructure. In this sense, it is not clear whether such investments will be amortized by these new business models. In light of this, specific applications and requirements are described and the potential market benefits for operators are analysed. Conclusions show that operators have clear opportunities to add value and position themselves strongly in the increasing number of services to be provided by 5G.
    Comment: 18 pages, 5 figures

    Virtual and Mixed Reality in Telerobotics: A Survey


    Integrating Olfaction in a Robotic Telepresence Loop

    In this work we propose enhancing a typical robotic telepresence architecture by considering olfactory and wind-flow information in addition to the common audio and video channels. The objective is to expand the range of applications where robotic telepresence can be applied, including those related to the detection of volatile chemical substances (e.g. land-mine detection, explosive deactivation, operations in noxious environments, etc.). Concretely, we analyze how the sense of smell can be integrated into the telepresence loop, covering the digitization of the gases and wind flow present in the remote environment, the transmission through the communication network, and their display at the user location. Experiments under different environmental conditions are presented to validate the proposed telepresence system when localizing a gas emission leak in the remote environment.
    Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
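    The abstract above describes a pipeline in which gas and wind-flow readings are digitized on the remote robot, transmitted over the network alongside audio and video, and displayed to the operator. A minimal sketch of what one sample in such a stream might look like (the schema, field names, and JSON encoding here are illustrative assumptions, not the paper's actual protocol):

    ```python
    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class OlfactoryFrame:
        """One remote-side sample in a telepresence stream (hypothetical schema)."""
        timestamp_s: float    # capture time, seconds
        gas_ppm: float        # gas sensor reading, parts per million
        wind_speed_ms: float  # wind speed magnitude, m/s
        wind_dir_deg: float   # wind direction relative to robot heading, degrees

    def encode(frame: OlfactoryFrame) -> bytes:
        """Serialize a sample for transmission next to the audio/video channels."""
        return json.dumps(asdict(frame)).encode("utf-8")

    def decode(payload: bytes) -> OlfactoryFrame:
        """Reconstruct the sample at the operator's display side."""
        return OlfactoryFrame(**json.loads(payload.decode("utf-8")))

    sample = OlfactoryFrame(timestamp_s=12.5, gas_ppm=430.0,
                            wind_speed_ms=1.8, wind_dir_deg=270.0)
    assert decode(encode(sample)) == sample  # lossless round trip
    ```

    In a real system the payload would be synchronized with the video stream and the operator-side display could drive, for example, an odor dispenser or an on-screen gas-concentration overlay.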