A Low-Cost Tele-Presence Wheelchair System
This paper presents the architecture and implementation of a tele-presence
wheelchair system based on tele-presence robot, intelligent wheelchair, and
touch screen technologies. The tele-presence wheelchair system consists of a
commercial electric wheelchair, an add-on tele-presence interaction module, and
a touchable live video image based user interface (called TIUI). The
tele-presence interaction module provides video chatting between an elderly or
disabled person and family members or caregivers, and also captures live video
of the environment for tele-operation and
semi-autonomous navigation. The user interface, developed in our lab, allows an
operator to access the system from anywhere and to directly touch the live
video image of the wheelchair to push it, as if pushing it in person. This
paper also discusses the evaluation of the user experience.
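The "touch the live video to push the wheelchair" interaction can be pictured as mapping a touch point on the video frame to a motion command. The sketch below is an illustrative assumption, not the TIUI's actual design: the function name, image geometry, and velocity gains are all hypothetical.

```python
# Hypothetical sketch of a "touch to push" mapping: a touch at pixel (x, y)
# on the live video frame becomes a wheelchair velocity command.
# Frame size, gains, and the mapping itself are illustrative assumptions.

def touch_to_velocity(x, y, width=640, height=480,
                      max_linear=0.8, max_angular=1.0):
    """Map a touch at pixel (x, y) to (linear m/s, angular rad/s).

    Touching nearer the top of the image drives faster forward;
    touching left/right of centre steers toward that side.
    """
    # Normalise to [-1, 1], with (0, 0) at the image centre.
    nx = (x - width / 2) / (width / 2)
    ny = (height / 2 - y) / (height / 2)   # up is positive
    linear = max(0.0, ny) * max_linear     # only drive forward
    angular = -nx * max_angular            # touch right => turn right
    return linear, angular

# Touching the top-centre of a 640x480 frame commands full-speed,
# straight-ahead motion; touching the right edge commands a pure turn.
print(touch_to_velocity(320, 0))
print(touch_to_velocity(640, 240))
```

A real system would of course project the touch point onto the ground plane using the camera calibration; this sketch only conveys the interaction concept.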
The Analysis of design and manufacturing tasks using haptic and immersive VR - Some case studies
The use of virtual reality in interactive design and manufacture has been researched extensively, but the practical application of this technology in industry is still very much in its infancy. This is surprising, as one would have expected that, after some 30 years of research, commercial applications of interactive design or manufacturing planning and analysis would be widespread throughout the product design domain. One of the major but less well-known advantages of VR technology is that logging the user yields a great deal of rich data, which can be used to automatically generate designs or manufacturing instructions, analyse design and manufacturing tasks, map engineering processes and, tentatively, acquire expert knowledge. The authors feel that the benefits of VR in these areas have not been fully disseminated to the wider industrial community and, with the advent of cheaper PC-based VR solutions, perhaps a wider appreciation of the capabilities of this type of technology may encourage companies to adopt VR solutions for some of their product design processes. With this in mind, this paper describes in detail applications of haptics in assembly, demonstrating how user task logging can lead to the analysis of design and manufacturing tasks at a level of detail not previously possible, as well as giving usable engineering outputs. The haptic 3D VR study involves the use of a Phantom and 3D system to analyse and compare this technology against real-world user performance. This work demonstrates that the detailed logging of tasks in a virtual environment gives considerable potential for understanding how virtual tasks can be mapped onto their real-world equivalents, as well as showing how haptic process plans can be generated in a similar manner to the conduit design and assembly planning HMD VR tool reported in PART A.
The paper concludes with the authors' view of how the use of VR systems in product design and manufacturing should evolve in order to enable future industrial adoption of this technology.
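The task logging the abstract describes reduces, at its simplest, to recording timestamped interaction events and aggregating them into per-step durations for analysis. The event names and log format below are illustrative assumptions, not the authors' logging schema.

```python
# A minimal sketch of user task logging in a virtual assembly environment:
# timestamped start/end events per task step are reduced to per-step durations.
# Step names and the log layout are illustrative assumptions.

from collections import defaultdict

def step_durations(events):
    """events: list of (timestamp_s, step_name, 'start'|'end').
    Returns total seconds spent in each step."""
    open_at = {}
    totals = defaultdict(float)
    for t, step, kind in sorted(events):
        if kind == "start":
            open_at[step] = t
        elif kind == "end" and step in open_at:
            totals[step] += t - open_at.pop(step)
    return dict(totals)

log = [
    (0.0, "grasp_part", "start"), (2.5, "grasp_part", "end"),
    (2.5, "align_part", "start"), (7.0, "align_part", "end"),
    (7.0, "insert_part", "start"), (9.2, "insert_part", "end"),
]
print(step_durations(log))  # seconds spent grasping, aligning, inserting
```

From such per-step timings one can compare virtual task performance against real-world equivalents, as the study does.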
The use of UTAUT and Post Acceptance models to investigate the attitude towards a telepresence robot in an educational setting
(1) Background: In the last decade, various investigations in the field of robotics have created several opportunities for further innovation in student education. However, despite scientific evidence, there is still strong scepticism surrounding the use of robots in some social fields, such as personal care and education; (2) Methods: In this research, we present a new tool, the HANCON model, developed by merging and extending the constructs of two established and proven models: the Unified Theory of Acceptance and Use of Technology (UTAUT) model, used to examine the factors that may influence the decision to use a telepresence robot as an instrument in educational practice, and the Post Acceptance Model, used to evaluate acceptability after actual use of a telepresence robot. The new tool is implemented and used to study the acceptance of a Double telepresence robot by 112 pre-service teachers in an educational setting; (3) Results: The analysis of the experimental results predicts and demonstrates a positive attitude towards the use of the telepresence robot in a school setting and confirms the applicability of the model in an educational context; (4) Conclusions: The constructs of the HANCON model could predict and explain the acceptance of social telepresence robots in social contexts.
Towards Assistive Feeding with a General-Purpose Mobile Manipulator
General-purpose mobile manipulators have the potential to serve as a
versatile form of assistive technology. However, their complexity creates
challenges, including the risk of being too difficult to use. We present a
proof-of-concept robotic system for assistive feeding that consists of a Willow
Garage PR2, a high-level web-based interface, and specialized autonomous
behaviors for scooping and feeding yogurt. As a step towards use by people with
disabilities, we evaluated our system with 5 able-bodied participants. All 5
successfully ate yogurt using the system and reported high rates of success for
the system's autonomous behaviors. Also, Henry Evans, a person with severe
quadriplegia, operated the system remotely to feed an able-bodied person. In
general, people who operated the system, including Henry, reported that it was
easy to use. The feeding system also incorporates corrective actions
designed to be triggered either autonomously or by the user. In an offline
evaluation using data collected with the feeding system, a new version of our
multimodal anomaly detection system outperformed prior versions.

Comment: This short 4-page paper was accepted and presented as a poster on
May 16, 2016 at the ICRA 2016 workshop on 'Human-Robot Interfaces for Enhanced
Physical Interactions' organized by Arash Ajoudani, Barkan Ugurlu, Panagiotis
Artemiadis, and Jun Morimoto. It was peer reviewed by one reviewer.
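The abstract does not specify how the multimodal anomaly detector works, so the following is only a generic sketch of the underlying idea: fuse several sensor channels (e.g. force and sound during feeding) into a single anomaly score and flag executions whose score exceeds a threshold learned from successful trials. All names, values, and thresholds are illustrative assumptions.

```python
# Generic sketch of multimodal anomaly detection for a robot feeding task:
# per-modality z-scores against statistics from nominal trials are averaged
# into one anomaly score. Modalities, statistics, and the threshold are
# illustrative assumptions, not the paper's detector.

def zscore(value, mean, std):
    return abs(value - mean) / std if std > 0 else 0.0

def anomaly_score(sample, stats):
    """sample: {modality: value}; stats: {modality: (mean, std)}.
    Returns the mean z-score across modalities."""
    scores = [zscore(sample[m], *stats[m]) for m in sample]
    return sum(scores) / len(scores)

# Statistics gathered from nominal (successful) trials -- illustrative values.
nominal = {"force_N": (1.0, 0.2), "audio_rms": (0.05, 0.01)}

normal_sample = {"force_N": 1.1, "audio_rms": 0.05}
anomalous = {"force_N": 3.0, "audio_rms": 0.12}

THRESHOLD = 2.0
print(anomaly_score(normal_sample, nominal) < THRESHOLD)  # nominal trial
print(anomaly_score(anomalous, nominal) > THRESHOLD)      # flagged anomaly
```

An anomaly flag like this could then trigger the corrective actions the abstract mentions, either autonomously or at the user's request.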
Tele-media-art: web-based inclusive teaching of body expression
International conference held in Olhão, Algarve, 26-28 April 2018. The Tele-Media-Art project aims to improve the online distance learning and artistic teaching process applied in two test scenarios, the doctorate in digital art-media and the lifelong learning course "the experience of diversity", by exploiting multimodal telepresence facilities encompassing diversified visual, auditory and sensory channels, as well as rich forms of gestural/body interaction. To this end, a telepresence system was developed to be installed at Palácio Ceia, in Lisbon, Portugal, headquarters of the Portuguese Open University, to support methodologies of artistic teaching in a mixed regime, face-to-face and online distance, that are inclusive of blind and partially sighted students. This system has already been tested with a group of subjects, including blind people. Although positive results were achieved, more development and further tests will be carried out in the future. This project was financed by the Calouste Gulbenkian Foundation under Grant number 142793.
uC: Ubiquitous Collaboration Platform for Multimodal Team Interaction Support
A human-centered computing platform that improves teamwork and transforms the "human-computer interaction experience" for distributed teams is presented. This Ubiquitous Collaboration, or uC ("you see"), platform's objective is to transform distributed teamwork (i.e., work occurring when teams of workers and learners are geographically dispersed and often interacting at different times). It achieves this goal through a multimodal team interaction interface realized through a reconfigurable open architecture. The approach taken is to integrate: (1) an intuitive speech- and video-centric multimodal interface to augment more conventional methods (e.g., mouse, stylus and touch), (2) an open and reconfigurable architecture supporting information gathering, and (3) a machine-intelligent approach to the analysis and management of heterogeneous live and stored sensor data to support collaboration. The system will transform how teams of people interact with computers by drawing on both the virtual and physical environment.
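The reconfigurable, multimodal integration the uC abstract describes is commonly realized as a publish/subscribe architecture: modality-specific producers (speech, video, touch) publish events onto a shared bus, and collaboration services subscribe by modality. The bus API and event shape below are assumptions for illustration, not the uC platform's actual interfaces.

```python
# Illustrative sketch of a multimodal event bus: producers publish events
# tagged with their modality; services subscribe per modality. The class and
# method names are hypothetical, not uC's API.

from collections import defaultdict

class EventBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, modality, handler):
        """Register a handler for one modality (e.g. 'speech', 'video')."""
        self.subscribers[modality].append(handler)

    def publish(self, modality, payload):
        """Deliver a payload to every handler subscribed to the modality."""
        for handler in self.subscribers[modality]:
            handler(payload)

bus = EventBus()
transcript = []
bus.subscribe("speech", transcript.append)

# A speech-recognition producer would publish recognised utterances:
bus.publish("speech", "share the design document")
print(transcript)  # ['share the design document']
```

Because producers and consumers only agree on modality names, new sensors or analysis services can be plugged in without changing existing components, which is the kind of reconfigurability the platform claims.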