    An Immersive Telepresence System using RGB-D Sensors and Head Mounted Display

    We present a tele-immersive system that enables people to interact with each other in a virtual world using body gestures in addition to verbal communication. Beyond the obvious applications, including general online conversations and gaming, we hypothesize that our proposed system would be particularly beneficial to education by offering rich visual content and interactivity. One distinct feature is the integration of egocentric pose recognition that allows participants to use their gestures to demonstrate and manipulate virtual objects simultaneously. This functionality enables the instructor to effectively and efficiently explain and illustrate complex concepts or sophisticated problems in an intuitive manner. The highly interactive and flexible environment can capture and sustain more student attention than the traditional classroom setting and thus delivers a compelling experience to the students. Our main focus here is to investigate possible solutions for the system design and implementation and to devise strategies for fast, efficient computation suitable for visual data processing and network transmission. We describe the techniques and experiments in detail and provide quantitative performance results, demonstrating that our system can run comfortably and reliably in different application scenarios. Our preliminary results are promising and demonstrate the potential for more compelling directions in cyberlearning. Comment: IEEE International Symposium on Multimedia 201
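    The abstract above emphasises fast visual data processing and low-bandwidth network transmission of RGB-D streams. The snippet below is a minimal, hypothetical sketch of that idea, not the authors' pipeline: it downsamples a 16-bit depth frame and deflate-compresses it before sending. The frame size, downsampling factor, and codec choice are assumptions made purely for illustration.

```python
# Illustrative only (not the authors' pipeline): shrink a 16-bit depth frame and
# losslessly compress it before network transmission. Frame size, downsampling
# factor, and the deflate codec are assumptions made for this sketch.
import zlib
import numpy as np

def pack_depth_frame(depth, factor=2):
    """Downsample a uint16 depth image by `factor` and deflate-compress it."""
    small = np.ascontiguousarray(depth[::factor, ::factor])  # naive spatial downsampling
    return zlib.compress(small.tobytes(), 1)                 # level 1: fast, low latency

def unpack_depth_frame(payload, shape):
    """Receiver side: decompress and restore the downsampled depth image."""
    raw = zlib.decompress(payload)
    return np.frombuffer(raw, dtype=np.uint16).reshape(shape)

# Example: a synthetic 640x480 depth frame shrinks to a fraction of its raw ~600 KB.
frame = (np.random.rand(480, 640) * 4000).astype(np.uint16)
payload = pack_depth_frame(frame)
restored = unpack_depth_frame(payload, (240, 320))
print(len(payload), restored.shape)
```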

    Immersive Demonstrations are the Key to Imitation Learning

    Achieving successful robotic manipulation is an essential step towards robots being widely used in industry and home settings. Recently, many learning-based methods have been proposed to tackle this challenge, with imitation learning showing great promise. However, imperfect demonstrations and a lack of feedback from teleoperation systems may lead to poor or even unsafe results. In this work we explore the effect of demonstrator force feedback on imitation learning, using a feedback glove and a robot arm to render fingertip-level and palm-level forces, respectively. 10 participants recorded 5 demonstrations of a pick-and-place task with 3 grippers, under conditions with no force feedback, fingertip force feedback, and fingertip and palm force feedback. Results show that force feedback significantly reduces demonstrator fingertip and palm forces, leads to lower variation in demonstrator forces, and yields recorded trajectories that are quicker to execute. Using behavioral cloning, we find that agents trained to imitate these trajectories mirror these benefits, even though the agents are shown no force data during training. We conclude that immersive demonstrations, achieved with force feedback, may be the key to unlocking safer, quicker-to-execute dexterous manipulation policies. Comment: This paper is accepted for presentation at the IEEE International Conference on Robotics and Automation (ICRA) 202
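    Behavioural cloning, as used in the paper above, is supervised regression from demonstrated states to actions. The sketch below is a deliberately minimal illustration with a linear policy on synthetic data; the state and action dimensions and the demonstration data are placeholders, not the paper's setup.

```python
# Minimal behavioural-cloning sketch (not the paper's implementation): fit a
# linear policy mapping states to actions by least squares, then query it at a
# state. The state/action dimensions and synthetic data are placeholders.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in demonstrations: 500 (state, action) pairs, 7-D state, 2-D action.
states = rng.normal(size=(500, 7))
true_map = rng.normal(size=(7, 2))
actions = states @ true_map + 0.01 * rng.normal(size=(500, 2))

# Behavioural cloning as supervised regression: least-squares fit of a linear map.
W, *_ = np.linalg.lstsq(states, actions, rcond=None)

def policy(state):
    """Return the cloned action for a given state."""
    return state @ W

# The cloned action closely matches the demonstrated action for a seen state.
print(policy(states[0]), actions[0])
```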

    Development of an Augmented Reality Interface for Intuitive Robot Programming

    As the demand for advanced robotic systems continues to grow, new technologies and techniques that can improve the efficiency and effectiveness of robot programming are imperative. Robot programming relies heavily on the effective communication of tasks between the user and the robot. To address this issue, we developed an Augmented Reality (AR) interface that incorporates Head Mounted Display (HMD) capabilities and integrated it with an active learning framework for intuitive programming of robots. This integration enables the execution of conditional tasks, bridging the gap between user and robot knowledge. The active learning model, guided by the user, incrementally programs a complex task and, after encoding the skills, generates a high-level task graph. The holographic robot then visualises the individual skills of the task in order to increase the user's intuition of the whole procedure, using sensory information retrieved from the physical robot in real time. The interactive aspect of the interface can be utilised in this phase by giving the user the option of actively validating the learnt skills or changing them, thus generating a new skill sequence. The user can also teach the real robot through teleoperation using the HMD, which increases the directness and immersion of the teaching procedure while the physical robot is safely manipulated from a distance. The evaluation of the proposed framework is conducted through a series of experiments employing the developed interface on the real system. These experiments aim to assess the degree of intuitiveness the interface features provide to the user and to determine the extent of similarity between the virtual system's behaviour during the robot programming procedure and that of its physical counterpart.
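    As a purely hypothetical illustration of the kind of high-level task graph such a framework might encode, a conditional skill sequence can be represented as nodes with outcome-labelled transitions. The skill names, outcomes, and branching logic below are assumptions for the sketch, not taken from the paper.

```python
# Hypothetical conditional task graph: skills as nodes, observed outcomes select
# the next skill. All names and conditions here are illustrative placeholders.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Skill:
    name: str
    # Maps an observed outcome label (e.g., a perception result) to the next skill.
    transitions: Dict[str, str] = field(default_factory=dict)

graph = {
    "approach": Skill("approach", {"reached": "grasp"}),
    "grasp": Skill("grasp", {"success": "place", "failure": "approach"}),
    "place": Skill("place", {"done": "end"}),
    "end": Skill("end"),
}

def execute(graph: Dict[str, Skill], start: str, observe: Callable[[str], str]) -> List[str]:
    """Walk the task graph, asking `observe(skill_name)` which branch to take."""
    trace, current = [], start
    while current != "end":
        trace.append(current)
        outcome = observe(current)
        current = graph[current].transitions.get(outcome, "end")
    return trace

# Example run with a stubbed observer that always reports the nominal outcome.
nominal = {"approach": "reached", "grasp": "success", "place": "done"}
print(execute(graph, "approach", lambda skill: nominal[skill]))  # ['approach', 'grasp', 'place']
```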

    Exploring the future of mathematics teaching: Insight with ChatGPT

    This study aims to provide a comprehensive overview of the future of mathematics teaching from the perspective of ChatGPT, an advanced language-processing artificial intelligence (AI) developed by OpenAI. The edited chat transcripts produced with ChatGPT suggest that the future of mathematics teaching will see the integration of technology and AI to provide personalized learning experiences and blended learning environments, alongside computational thinking, data literacy, and statistics. Problem-solving, critical thinking, and interdisciplinary connections will continue to be emphasized, and equity and inclusion will remain crucial. AI is expected to revolutionize mathematics education, but thoughtful implementation, ongoing professional development, and pedagogical considerations are essential. The future of teaching mathematics will continue to evolve; therefore, teachers and lecturers need to keep abreast of the latest developments and adapt to them while remaining committed to providing quality teaching.

    A review on manipulation skill acquisition through teleoperation-based learning from demonstration

    Manipulation skill learning and generalization have gained increasing attention due to the wide application of robot manipulators and the rapid growth of robot learning techniques. In particular, learning from demonstration has been exploited widely and successfully in the robotics community and is regarded as a promising direction for realizing manipulation skill learning and generalization. In addition to the learning techniques, immersive teleoperation enables a human to operate a remote robot through an intuitive interface and achieve telepresence. Combining learning methods with teleoperation, and adapting the learned skills to different tasks in new situations, is therefore a promising way to transfer manipulation skills from humans to robots. This review aims to provide an overview of immersive teleoperation for skill learning and generalization in complex manipulation tasks. To this end, the key technologies, e.g. manipulation skill learning, multimodal interfacing for teleoperation, and telerobotic control, are introduced. An overview is then given of the most important applications of immersive teleoperation platforms for robot skill learning. Finally, the survey discusses the remaining open challenges and promising research topics.
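    One family of skill-encoding methods commonly covered by such reviews is dynamic movement primitives (DMPs), which learn a forcing term from a demonstration and can then regenerate the motion towards a new goal. The sketch below is a compact 1-D DMP under assumed gains and a synthetic demonstration; it illustrates the general technique, not any specific system from the review.

```python
# Compact 1-D discrete DMP sketch: learn a forcing term from one synthetic
# demonstration, then roll the skill out towards a new goal. Gains, basis count,
# and the demonstration itself are illustrative assumptions.
import numpy as np

alpha, beta, alpha_x = 25.0, 25.0 / 4.0, 3.0   # critically damped transformation gains
n_basis, tau = 20, 1.0

# Synthetic demonstration: a smooth 1-D reach from 0 to 1 over one second.
t = np.linspace(0.0, 1.0, 200)
dt = t[1] - t[0]
y_demo = 0.5 * (1.0 - np.cos(np.pi * t))
yd_demo = np.gradient(y_demo, dt)
ydd_demo = np.gradient(yd_demo, dt)
y0, g = y_demo[0], y_demo[-1]

# Canonical-system phase and Gaussian basis functions over that phase.
x = np.exp(-alpha_x * t / tau)
centers = np.exp(-alpha_x * np.linspace(0.0, 1.0, n_basis))
widths = 1.0 / (np.diff(centers, append=centers[-1] / 2.0) ** 2 + 1e-8)
psi = np.exp(-widths[None, :] * (x[:, None] - centers[None, :]) ** 2)

# Locally weighted fit of the forcing-term weights from the demonstration.
f_target = tau**2 * ydd_demo - alpha * (beta * (g - y_demo) - tau * yd_demo)
xi = x * (g - y0)
w = np.array([(psi[:, i] * xi * f_target).sum() / ((psi[:, i] * xi**2).sum() + 1e-10)
              for i in range(n_basis)])

def rollout(new_goal, steps=200):
    """Integrate the learned DMP towards a (possibly new) goal position."""
    y, v, xs = y0, 0.0, 1.0
    trajectory = []
    for _ in range(steps):
        psi_s = np.exp(-widths * (xs - centers) ** 2)
        f = (psi_s @ w) / (psi_s.sum() + 1e-10) * xs * (new_goal - y0)
        vdot = (alpha * (beta * (new_goal - y) - v) + f) / tau
        v += vdot * dt
        y += (v / tau) * dt
        xs += (-alpha_x * xs / tau) * dt
        trajectory.append(y)
    return np.array(trajectory)

# Reproduce the demonstration's goal, then adapt the same skill to a new goal of 1.5.
print(rollout(new_goal=g)[-1], rollout(new_goal=1.5)[-1])
```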

    Virtual laboratories for education in science, technology, and engineering: A review

    Within education, concepts such as distance learning and open universities are now becoming more widely used for teaching and learning. However, due to the nature of the subject domain, the teaching of science, technology, and engineering still lags behind in the use of new technological approaches (particularly for online distance learning). The reason for this discrepancy is that these fields often require laboratory exercises to provide effective skill acquisition and hands-on experience, and it is often difficult to make these laboratories accessible online: either the real lab must be enabled for remote access, or it must be replicated as a fully software-based virtual lab. We argue for the latter concept, since it offers some advantages over remotely controlled real labs, which are elaborated further in this paper. New technologies are now emerging that can overcome some of the potential difficulties in this area, including computer graphics, augmented reality, computational dynamics, and virtual worlds. This paper summarizes the state of the art in virtual laboratories and virtual worlds in the fields of science, technology, and engineering. The main research activity in these fields is discussed, but special emphasis is put on robotics due to the maturity of this area within the virtual-education community. This is no coincidence: given its broadly multidisciplinary character, robotics is a perfect example to which all the other fields of engineering and physics can contribute. The use of virtual labs for other scientific and non-robotic engineering purposes can thus be seen to share many of the same learning processes, from supporting the introduction of new concepts in learning about science and technology and introducing more general engineering knowledge, through to supporting more constructive (and collaborative) education and training activities in a more complex engineering topic such as robotics. The objective of this paper is to outline this problem space in more detail and to create a valuable source of information that can help to define the starting position for future research.

    Augmented Reality and Robotics: A Survey and Taxonomy for AR-enhanced Human-Robot Interaction and Robotic Interfaces

    This paper contributes to a taxonomy of augmented reality and robotics based on a survey of 460 research papers. Augmented and mixed reality (AR/MR) have emerged as a new way to enhance human-robot interaction (HRI) and robotic interfaces (e.g., actuated and shape-changing interfaces). Recently, an increasing number of studies in HCI, HRI, and robotics have demonstrated how AR enables better interactions between people and robots. However, research often remains focused on individual explorations, and key design strategies and research questions are rarely analyzed systematically. In this paper, we synthesize and categorize this research field along the following dimensions: 1) approaches to augmenting reality; 2) characteristics of robots; 3) purposes and benefits; 4) classification of presented information; 5) design components and strategies for visual augmentation; 6) interaction techniques and modalities; 7) application domains; and 8) evaluation strategies. We formulate key challenges and opportunities to guide and inform future research in AR and robotics.

    Recent Advancements in Augmented Reality for Robotic Applications: A Survey

    Robots are expanding from industrial applications to daily life, in areas such as medical robotics, rehabilitative robotics, social robotics, and mobile/aerial robotics systems. In recent years, augmented reality (AR) has been integrated into many robotic applications, including medical, industrial, human-robot interaction, and collaboration scenarios. In this work, AR for both medical and industrial robot applications is reviewed and summarized. For medical robot applications, we investigated the integration of AR in (1) preoperative and surgical task planning; (2) image-guided robotic surgery; (3) surgical training and simulation; and (4) telesurgery. AR for industrial scenarios is reviewed in (1) human-robot interaction and collaboration; (2) path planning and task allocation; (3) training and simulation; and (4) teleoperation control/assistance. In addition, the limitations and challenges are discussed. Overall, this article serves as a valuable resource for those working in AR and robotic research, offering insights into the recent state of the art and prospects for improvement.

    A Virtual Reality Laboratory for Blended Learning Education: Design, Implementation and Evaluation

    Launched during the pandemic, the EU-funded JANUS project aimed to ensure the continuity of student workshops at universities using a virtual reality (VR) robotics laboratory. With the return to normality, the project has been redesigned to capitalise on the positive outcomes of the experience. The VR lab provides safe and unrestricted access to the labs and to experiments with the machines, reducing the consequences of student mistakes and improving the user experience by allowing the experiment to be repeated from different angles, some of which are impossible to access in the real lab. In addition, integration with an interactive learning platform called "ViLLE" allows for continuous assessment of the learning experience. Self-evaluation of the material taught and learned can be integrated with the execution of the exercises, paving the way for Kaizen. Two VR workshops for the blended learning of robotics were developed during the JANUS project. Their evaluation reported favourable responses from the students, whose learning performance was indirectly measured.