Usability of augmented reality technology in tele-mentorship for managing clinical scenarios - A study protocol
Background: Tele-mentorship is considered a solution for training and providing professional assistance at a distance. Tele-mentoring is a method in which a mentor interactively guides a mentee at a different geographic location in real time using a technological communication device. During a healthcare procedure, tele-mentoring can enable a medical expert, remote from the treatment site, to guide a less-experienced practitioner at the point of care. Augmented Reality (AR) technology has been incorporated into tele-mentoring systems in healthcare environments globally. However, evidence is lacking on the usability of AR technology for tele-mentoring healthcare professionals in the management of clinical scenarios.
Aim: This study aims to evaluate the usability of AR technology in tele-mentorship for managing clinical scenarios.
Methods: This study uses a quasi-experimental design. Four experienced health professionals and a minimum of twelve novice health practitioners will be recruited for the roles of mentors and mentees, respectively. In the experiment, each mentee, wearing an AR headset, will perform a maximum of four different clinical scenarios in a simulated learning environment. A mentor, located in a separate room and using a laptop, will provide the mentee with remote instruction and guidance following the standard treatment protocol proposed for each scenario. The scenarios of Acute Coronary Syndrome, Acute Myocardial Infarction, Pneumonia Severe Reaction to Antibiotics, and Hypoglycaemic Emergency are selected, and the corresponding clinical management protocols have been developed. Outcome measures include the mentors' and mentees' perceptions of the AR technology's usability, mentorship effectiveness, and the mentees' self-confidence and skill performance.
Ethics: The protocol was approved by the Tasmania Health and Medical Human Research Ethics Committee (Project ID: 23343). The complete pre-registration of our study can be found at https://osf.io/q8c3u/
Enabling symmetric collaboration in public spaces through 3D mobile interaction
Collaboration is common in workplaces, in various engineering settings, and in our daily activities. However, how to effectively engage collaborators with collaborative tasks has long been an issue due to various situational and technical constraints. The research in this paper addresses this issue in a specific scenario: enabling users to interact with public information from their own perspective. We describe a 3D mobile interaction technique that allows users to collaborate with other people by creating a symmetric and collaborative ambience, which in turn can increase their engagement with public displays. In order to better understand the benefits and limitations of this technique, we conducted a usability study with a total of 40 participants. The results indicate that the 3D mobile interaction technique promotes collaboration between users and also improves their engagement with the public displays.
Symmetric evaluation of multimodal human-robot interaction with gaze and standard control
Control of robot arms is often required in engineering and can be performed using different methods. This study examined and symmetrically compared the use of a hand controller, an eye gaze tracker, and a combination of the two in a multimodal setup for control of a robot arm. Tasks of different complexities were defined, and twenty participants completed an experiment using these interaction modalities to solve the tasks. More specifically, there were three tasks: the first was to navigate a chess piece from one square to another pre-specified square; the second was the same as the first task, but required more moves to complete; and the third was to move multiple pieces to reach a pre-defined arrangement of the pieces. Further, while gaze control has the potential to be more intuitive than a hand controller, it suffers from limitations with regard to spatial accuracy and target selection. The multimodal setup aimed to mitigate the weaknesses of the eye gaze tracker, creating a superior system without simply relying on the controller. The experiment shows that the multimodal setup improves performance over the eye gaze tracker alone (p < 0.05).
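To illustrate the general idea of fusing coarse gaze input with an explicit controller confirmation, the following is a minimal sketch only; the square size, the majority-vote snapping, the confirm-button flow, and the simulated gaze samples are all assumptions for illustration and are not taken from the paper's actual system.

```python
# Hypothetical sketch (not the authors' implementation): gaze proposes a
# chessboard square, a controller button press commits the selection.
# Gaze samples are simulated; a real system would read them from an eye tracker.

import random
from collections import Counter

SQUARE_SIZE = 0.1  # assumed board square size in metres
BOARD_SIZE = 8

def gaze_to_square(x, y):
    """Snap a noisy gaze point (board coordinates in metres) to a square index."""
    col = min(max(int(x / SQUARE_SIZE), 0), BOARD_SIZE - 1)
    row = min(max(int(y / SQUARE_SIZE), 0), BOARD_SIZE - 1)
    return row, col

def select_square(gaze_samples, confirm_pressed):
    """
    Multimodal selection: nothing happens until the controller button is pressed;
    a majority vote over recent gaze samples damps the tracker's spatial jitter.
    """
    if not confirm_pressed:
        return None
    votes = Counter(gaze_to_square(x, y) for x, y in gaze_samples)
    return votes.most_common(1)[0][0]

if __name__ == "__main__":
    # Simulate noisy gaze around square (row 3, col 4) plus a button press.
    true_x, true_y = 4 * SQUARE_SIZE + 0.05, 3 * SQUARE_SIZE + 0.05
    samples = [(true_x + random.gauss(0, 0.02), true_y + random.gauss(0, 0.02))
               for _ in range(30)]
    print(select_square(samples, confirm_pressed=True))  # typically (3, 4)
```

In this sketch the gaze channel only nominates a target, while the controller provides the unambiguous commit action, which is one plausible way a multimodal setup can compensate for the gaze tracker's limited spatial accuracy and target selection.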