991 research outputs found

    A novel plasticity rule can explain the development of sensorimotor intelligence

    Full text link
    Grounding autonomous behavior in the nervous system is a fundamental challenge for neuroscience. In particular, self-organized behavioral development raises more questions than it answers. Are there special functional units for curiosity, motivation, and creativity? This paper argues that these features can be grounded in synaptic plasticity itself, without requiring any higher-level constructs. We propose differential extrinsic plasticity (DEP) as a new synaptic rule for self-learning systems and apply it to a number of complex robotic systems as a test case. Without specifying any purpose or goal, seemingly purposeful and adaptive behavior develops, displaying a certain level of sensorimotor intelligence. These surprising results require no system-specific modifications of the DEP rule but arise rather from the underlying mechanism of spontaneous symmetry breaking due to the tight brain-body-environment coupling. The new synaptic rule is biologically plausible and would be an interesting target for neurobiological investigation. We also argue that this neuronal mechanism may have been a catalyst in natural evolution. Comment: 18 pages, 5 figures, 7 videos
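    As a rough illustration of the kind of rule the abstract describes, the sketch below implements a generic differential-Hebbian weight update driven by correlations between successive sensor-value derivatives. The variable names, gain, and normalization are illustrative assumptions, not the authors' exact DEP formulation.

```python
import numpy as np

def differential_hebbian_update(C, x_prev, x_now, x_next, lr=0.01):
    """Illustrative differential-Hebbian step (not the published DEP rule).

    C : (n, n) controller matrix coupling n sensors to n motors
    x_prev, x_now, x_next : consecutive length-n sensor readings
    lr : learning rate (assumed value)
    """
    dx_now = x_now - x_prev          # sensor derivative at time t
    dx_next = x_next - x_now         # sensor derivative at time t+1
    # Weights change with the correlation between successive sensor
    # derivatives, so the controller comes to mirror the body-environment
    # dynamics without any externally specified goal.
    C = C + lr * np.outer(dx_next, dx_now)
    # Row normalization keeps the controller in a responsive working regime.
    return C / (np.linalg.norm(C, axis=1, keepdims=True) + 1e-8)

# Toy usage: 4 sensors/motors, three consecutive sensor snapshots.
rng = np.random.default_rng(0)
C = rng.normal(size=(4, 4))
x0, x1, x2 = rng.normal(size=(3, 4))
C = differential_hebbian_update(C, x0, x1, x2)
```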

    Sensory Manipulation as a Countermeasure to Robot Teleoperation Delays: System and Evidence

    Full text link
    In the field of robotics, robot teleoperation for remote or hazardous environments has become increasingly vital. A major challenge is the lag between command and action, which degrades operator awareness and performance and increases mental strain. Even with advanced technology, mitigating these delays, especially in long-distance operations, remains challenging. Current solutions largely focus on machine-based adjustments, yet there is a gap in using human perception to improve the teleoperation experience. This paper presents a unique method of sensory manipulation to help humans adapt to such delays. Drawing from motor learning principles, it suggests that modifying sensory stimuli can lessen the perception of these delays. Instead of introducing new skills, the approach uses existing motor coordination knowledge. The aim is to minimize the need for extensive training or complex automation. A study with 41 participants explored the effects of altered haptic cues in delayed teleoperation. These cues were sourced from advanced physics engines and robot sensors. Results highlighted benefits such as reduced task time and improved perception of visual delays. Real-time haptic feedback significantly contributed to reduced mental strain and increased confidence. This research emphasizes human adaptation as a key element in robot teleoperation, advocating for improved teleoperation efficiency via swift human adaptation rather than solely optimizing robots for delay adjustment. Comment: Submitted to Scientific Reports
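    The abstract mentions haptic cues sourced from physics engines and robot sensors; the sketch below shows one plausible way such a cue could be rendered, blending an undelayed locally simulated contact force with the delayed force reported by the remote robot. The blending weight, function names, and toy contact model are illustrative assumptions, not the system described in the paper.

```python
def blended_haptic_force(local_sim_force, delayed_robot_force, alpha=0.7):
    """Blend an undelayed, locally simulated force with the delayed
    measured force from the remote robot (alpha is an assumed weight)."""
    return alpha * local_sim_force + (1.0 - alpha) * delayed_robot_force

def local_contact_force(penetration_depth, stiffness=800.0):
    """Toy penalty-based contact model a local physics engine might use:
    force proportional to penetration depth, zero when not in contact."""
    return stiffness * max(penetration_depth, 0.0)

# Example: the tool tip penetrates a virtual surface by 2 mm locally,
# while the delayed robot measurement still reports 1.2 N.
f_local = local_contact_force(0.002)        # 1.6 N from local simulation
f_cue = blended_haptic_force(f_local, 1.2)  # cue sent to the haptic device
```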

    Aerospace medicine and biology: A continuing bibliography with indexes, supplement 204

    Get PDF
    This bibliography lists 140 reports, articles, and other documents introduced into the NASA scientific and technical information system in February 1980

    Investigating the Impact of Visuohaptic Simulations for the Conceptual Understanding of Electric Field for Distributed Charges

    Get PDF
    The present study assessed the benefits of a multisensory intervention on the conceptual understanding of the electric field for distributed charges in engineering and technology undergraduate students. A novel visuohaptic intervention was proposed, focused on exploring the forces around different electric field configurations for distributed charges, namely a point charge, an infinitely long line of charge, and a uniformly charged ring. Before-and-after effects of the visuohaptic intervention, which includes instructional scaffolding, are compared. Three single-group studies were conducted to investigate the effect among three different populations: (a) undergraduate engineering students, (b) undergraduate technology students, and (c) undergraduate engineering technology students from a different demographic setting. The findings from the three studies suggest that the haptic modality intervention is beneficial, allowing students to improve their conceptual understanding of the electric field for distributed charges, although only groups (b) and (c) showed a statistically significant increase in conceptual understanding. The findings also indicate a positive learning perception among all three groups
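    For reference, the three charge configurations named in the abstract have standard closed-form field magnitudes; the sketch below computes them. The function names and example values are illustrative, not taken from the study's materials.

```python
import math

EPS0 = 8.854e-12                    # vacuum permittivity, F/m
K = 1.0 / (4.0 * math.pi * EPS0)    # Coulomb constant, N*m^2/C^2

def e_point(q, r):
    """Field magnitude a distance r from a point charge q: E = k*q / r^2."""
    return K * q / r**2

def e_infinite_line(lam, r):
    """Field a distance r from an infinite line with linear charge density
    lam: E = lam / (2 * pi * eps0 * r)."""
    return lam / (2.0 * math.pi * EPS0 * r)

def e_ring_axis(q, R, z):
    """On-axis field of a uniformly charged ring (total charge q, radius R)
    at distance z from its center: E = k*q*z / (z^2 + R^2)^(3/2)."""
    return K * q * z / (z**2 + R**2) ** 1.5

# Illustrative values: 1 nC of charge, distances in meters.
print(e_point(1e-9, 0.05))
print(e_infinite_line(1e-9, 0.05))
print(e_ring_axis(1e-9, 0.10, 0.05))
```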

    Human Machine Interfaces for Teleoperators and Virtual Environments

    Get PDF
    In March 1990, a meeting was held on the general theme of teleoperation research into virtual environment display technology. This is a collection of conference-related fragments that gives a glimpse of the potential of the following fields and how they interplay: sensorimotor performance; human-machine interfaces; teleoperation; virtual environments; performance measurement and evaluation methods; and design principles and predictive models

    Investigating The Impact Of Visuohaptic Simulations For Conceptual Understanding In Electricity And Magnetism

    Get PDF
    The present study examined the efficacy of a haptic simulation used as a pedagogical tool to teach freshman engineering students about electromagnetism. A quasi-experimental, design-based research study was conducted in two iterations to compare the possible benefits the haptic device provided to students' cognitive learning. In the first iteration, the performance of learners who used visual-only simulations was compared with that of learners who used visuohaptic simulations. In the second iteration, the learning materials and experimental procedures were modified to strengthen the research design. The research hypothesis states that multimodal presentation of information may lead to better conceptual understanding of electromagnetism than visual presentation alone

    New control architecture based on PXI for a 3-finger haptic device applied to virtual manipulation

    Get PDF
    To perform advanced manipulation of remote environments, such as grasping, more than one finger is required, which implies higher requirements for the control architecture. This paper presents the design and control of a modular 3-finger haptic device that can be used to interact with virtual scenarios or to teleoperate dexterous remote hands. In a modular haptic device, each module allows interaction with a scenario using a single finger; hence, multi-finger interaction can be achieved by adding more modules. Control requirements for a multi-finger haptic device are analyzed, and a new hardware/software architecture for this kind of device is proposed. The software architecture described in this paper is distributed, and the modules communicate with one another to enable remote manipulation. Moreover, an application in which this haptic device is used to interact with a virtual scenario is shown
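    As a rough sketch of the modular idea described above, the code below models each finger as an independent module running its own impedance-style control law, with a coordinator that only collects per-module commands. The class names, gains, and update scheme are illustrative assumptions, not the paper's PXI-based architecture.

```python
from dataclasses import dataclass, field

@dataclass
class FingerModule:
    """One haptic module: a single finger with a simple impedance law."""
    name: str
    stiffness: float = 500.0   # N/m (assumed gain)
    damping: float = 5.0       # N*s/m (assumed gain)
    position: float = 0.0
    velocity: float = 0.0

    def force_command(self, target_position: float) -> float:
        """Impedance-style force toward the virtual contact point."""
        return (self.stiffness * (target_position - self.position)
                - self.damping * self.velocity)

@dataclass
class MultiFingerController:
    """Coordinator: adding a module adds one finger of interaction."""
    modules: list = field(default_factory=list)

    def step(self, targets: dict) -> dict:
        # Each module computes its own command; the coordinator only
        # gathers them, mirroring a distributed architecture.
        return {m.name: m.force_command(targets[m.name]) for m in self.modules}

# Three-finger usage: thumb, index, middle, each with its own target.
controller = MultiFingerController(
    [FingerModule("thumb"), FingerModule("index"), FingerModule("middle")])
forces = controller.step({"thumb": 0.010, "index": 0.012, "middle": 0.008})
```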

    Investigating Real-time Touchless Hand Interaction and Machine Learning Agents in Immersive Learning Environments

    Get PDF
    The recent surge in the adoption of new technologies and innovations in connectivity, interaction technology, and artificial realities can fundamentally change the digital world. eXtended Reality (XR), with its potential to bridge virtual and real environments, creates new possibilities for developing more engaging and productive learning experiences. Evidence is emerging that this sophisticated technology offers new ways to improve the learning process for better student interaction and engagement. Recently, immersive technology has garnered much attention as an interactive technology that facilitates direct interaction with virtual objects in the real world. Furthermore, these virtual objects can be surrogates for real-world teaching resources, allowing for virtual labs. Thus, XR could enable learning experiences that would not be possible in impoverished educational systems worldwide. Interestingly, concepts such as virtual hand interaction and techniques such as machine learning are still not widely investigated in immersive learning. Hand interaction technologies in virtual environments can support the kinesthetic learning pedagogical approach, and the need for their touchless mode of interaction has increased exceptionally in the post-COVID world. By implementing and evaluating real-time hand interaction technology for kinesthetic learning and machine learning agents for self-guided learning, this research has addressed these underutilized technologies to demonstrate the efficiency of immersive learning. This thesis has explored different hand-tracking APIs and devices to integrate real-time hand interaction techniques. These hand interaction techniques and integrated machine learning agents using reinforcement learning are evaluated with different display devices to test compatibility. The proposed approach aims to provide self-guided, more productive, and interactive learning experiences. Further, this research has investigated ethics, privacy, and security issues in XR and covered the future of immersive learning in the Metaverse.
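    Of the techniques named above, the reinforcement learning agents are the most algorithmic; the sketch below is a generic tabular Q-learning loop for a tiny guided-navigation task, offered only to illustrate how such an agent improves its policy through interaction. The environment, state and action sets, and hyperparameters are assumptions, not the thesis's actual agents or XR toolchain.

```python
import random

# Toy "self-guided learning" task: reach checkpoint state 4 from state 0
# by moving left/right along a 5-state line.
N_STATES, ACTIONS = 5, (-1, +1)
GOAL, ALPHA, GAMMA, EPS = 4, 0.5, 0.9, 0.1   # assumed hyperparameters
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def choose(state):
    """Epsilon-greedy action selection over the Q-table."""
    if random.random() < EPS:
        return random.choice(ACTIONS)                  # explore
    return max(ACTIONS, key=lambda a: Q[(state, a)])   # exploit

for episode in range(200):
    s = 0
    while s != GOAL:
        a = choose(s)
        s_next = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s_next == GOAL else -0.01           # small step cost
        best_next = max(Q[(s_next, b)] for b in ACTIONS)
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])
        s = s_next
```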

    A Novel Haptic Simulator for Evaluating and Training Salient Force-Based Skills for Laparoscopic Surgery

    Get PDF
    Laparoscopic surgery has evolved from an 'alternative' surgical technique to a mainstream surgical technique. However, learning this complex technique holds unique challenges for novice surgeons due to their 'distance' from the surgical site. One of the main challenges in acquiring laparoscopic skills is the acquisition of force-based, or haptic, skills. The neglect of this aspect of skills training by popular training methods (e.g., the Fundamentals of Laparoscopic Surgery (FLS) curriculum) has led many medical skills professionals to research new, efficient methods for haptic skills training. The overarching goal of this research was to demonstrate that a set of simple, simulator-based haptic exercises can be developed and used to train users in the skilled application of forces with surgical tools. A set of salient, or core, haptic skills that underlie proficient laparoscopic surgery was identified based on published time-motion studies. Low-cost, computer-based haptic training simulators were prototyped to simulate each of the identified salient haptic skills. All simulators were tested for construct validity by comparing surgeons' performance on the simulators with the performance of novices with no previous laparoscopic experience. An integrated 'core haptic skills' simulator capable of rendering the three validated haptic skills was built. To examine the efficacy of this novel salient haptic skills training simulator, novice participants were tested for training improvements in a detailed study. Results from the study demonstrated that simulator training enabled users to significantly improve force application for all three haptic tasks. Research outcomes from this project could greatly influence surgical skills simulator design, resulting in more efficient training
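    As an illustration of how force-application skill might be scored in such a simulator, the sketch below compares a recorded tool-force trace against a target force profile using root-mean-square error. The metric, sampling assumptions, and pass threshold are hypothetical, not the simulator's actual scoring method.

```python
import math

def force_rmse(applied, target):
    """Root-mean-square error between applied and target force samples (N)."""
    if len(applied) != len(target):
        raise ValueError("traces must be sampled at the same instants")
    return math.sqrt(sum((a - t) ** 2 for a, t in zip(applied, target))
                     / len(applied))

# Example: a trainee tries to hold a steady 2.0 N grasp over five samples.
target_trace = [2.0] * 5
trainee_trace = [1.6, 1.9, 2.3, 2.1, 1.8]
score = force_rmse(trainee_trace, target_trace)
passed = score < 0.25   # hypothetical proficiency threshold in newtons
```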