
    A Scenario Analysis of Wearable Interface Technology Foresight

    Although the importance and value of wearable interfaces have gradually been recognized, wearable interface technologies and the priority of adopting them have so far not been clearly identified. To fill this gap, this paper focuses on the technology planning strategy of organizations interested in developing or adopting wearable interface technologies. Based on a scenario analysis approach, a technology planning strategy is proposed. In this analysis, thirty wearable interface technologies are classified into six categories, and the importance and risk of these categories are then evaluated under two possible scenarios. The main findings are that most brain-based wearable interface technologies are rated high-to-medium importance and high risk in all scenarios, and that scenario changes have less impact on voice-based and gesture-based wearable interface technologies. These results provide a reference for organizations and vendors interested in adopting or developing wearable interface technologies.
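
    The scenario evaluation can be pictured as a small importance/risk matrix per scenario. The following Python sketch, using hypothetical category names and ratings rather than the paper's actual data, shows one way to flag which technology categories change rating when the scenario changes.

    # Illustrative sketch only: the category names and ratings below are
    # assumptions, not the ratings reported in the paper.
    from dataclasses import dataclass

    @dataclass
    class Rating:
        importance: str  # "high" | "medium" | "low"
        risk: str        # "high" | "medium" | "low"

    # Hypothetical importance/risk ratings for each technology category
    # under two possible scenarios.
    ratings = {
        "scenario_A": {
            "brain-based":   Rating("high", "high"),
            "voice-based":   Rating("medium", "low"),
            "gesture-based": Rating("medium", "low"),
        },
        "scenario_B": {
            "brain-based":   Rating("medium", "high"),
            "voice-based":   Rating("medium", "low"),
            "gesture-based": Rating("medium", "low"),
        },
    }

    def scenario_sensitive(ratings):
        """Return categories whose importance or risk changes across scenarios."""
        scenarios = list(ratings.values())
        baseline = scenarios[0]
        changed = set()
        for other in scenarios[1:]:
            for category, rating in baseline.items():
                if other[category] != rating:
                    changed.add(category)
        return sorted(changed)

    print(scenario_sensitive(ratings))  # prints ['brain-based'] for the sample data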

    Computer Aided Drafting Virtual Reality Interface

    Computer Aided Drafting (CAD) is pervasive in engineering fields today. It has become indispensable for planning, creating, visualizing, troubleshooting, collaborating, and communicating designs before they exist in physical form. From the beginning, CAD was created to be used by means of a mouse, keyboard, and monitor. Along the way, other, more specialized interface devices were created specifically for CAD that allowed for easier and more intuitive navigation within a 3D space, but they were at best stopgap solutions. Virtual Reality (VR) allows users to navigate and interact with digital 3D objects and environments the same way they would in the real world. For this reason, VR is a natural CAD interface solution. With VR as an interface for CAD software, creating models becomes more intuitive and visualizing them becomes second nature. For this project, a prototype VR CAD program was created using Unreal Engine for use with the HTC Vive and compared against traditional WIMP (windows, icons, menus, pointer) interface CAD programs on the time it takes to learn each program, the time it takes to create similar models, and impressions of using each program, specifically the intuitiveness of the user interface and of model manipulation. FreeCAD, SolidWorks, and Blender were the three traditional interface modeling programs chosen for the comparison because of their widespread use for modeling in 3D printing, industry, and gaming, respectively. During the course of the project, two VR modeling programs were released, Google Blocks and MakeVR Pro; because they are of a similar type to the prototype software created in Unreal Engine, they were included in the comparison. The comparison showed that the VR CAD programs were faster to learn and to create models in, and more intuitive to use, than the traditional interface CAD programs.
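
    As a minimal illustration of how the comparison metrics could be tabulated, the sketch below aggregates average learning and modelling times per program. The program names follow the study, but all timing values are made-up placeholders, not measured results.

    # Sketch of aggregating per-program comparison metrics; the numbers are
    # placeholders for illustration only.
    from statistics import mean

    # (program, minutes_to_learn, minutes_to_model) per participant session.
    sessions = [
        ("VR prototype (Unreal Engine)", 18, 25),
        ("VR prototype (Unreal Engine)", 22, 30),
        ("Google Blocks", 10, 20),
        ("MakeVR Pro", 15, 28),
        ("FreeCAD", 45, 60),
        ("SolidWorks", 50, 55),
        ("Blender", 55, 70),
    ]

    def summarize(sessions):
        """Average learning and modelling time per program."""
        by_program = {}
        for program, learn, model in sessions:
            by_program.setdefault(program, []).append((learn, model))
        return {
            program: (mean(l for l, _ in times), mean(m for _, m in times))
            for program, times in by_program.items()
        }

    for program, (learn, model) in sorted(summarize(sessions).items()):
        print(f"{program:30s} learn {learn:5.1f} min   model {model:5.1f} min")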

    Designing to Support Workspace Awareness in Remote Collaboration using 2D Interactive Surfaces

    Increasing distributions of the global workforce are leading to collaborative work among remote coworkers. The emergence of such remote collaborations is essentially supported by technology advancements of screen-based devices, ranging from tablets or laptops to large displays. However, these devices, especially personal and mobile computers, still suffer from certain limitations caused by their form factors that hinder supporting workspace awareness through non-verbal communication such as bodily gestures or gaze. This thesis thus aims to design novel interfaces and interaction techniques to improve remote coworkers’ workspace awareness through such non-verbal cues using 2D interactive surfaces. The thesis starts off by exploring how visual cues support workspace awareness in facilitated brainstorming of hybrid teams of co-located and remote coworkers. Based on insights from this exploration, the thesis introduces three interfaces for mobile devices that help users maintain and convey their workspace awareness with their coworkers. The first interface is a virtual environment that allows a remote person to effectively maintain his/her awareness of his/her co-located collaborators’ activities while interacting with the shared workspace. To help a person better express his/her hand gestures in remote collaboration using a mobile device, the second interface presents a lightweight add-on for capturing hand images on and above the device’s screen and overlaying them on collaborators’ devices to improve their workspace awareness. The third interface strategically leverages the entire screen space of a conventional laptop to better convey a remote person’s gaze to his/her co-located collaborators. Building on top of these three interfaces, the thesis envisions an interface that supports a person using a mobile device to effectively collaborate with remote coworkers working with a large display. Together, these interfaces demonstrate the possibilities to innovate on commodity devices to offer richer non-verbal communication and better support workspace awareness in remote collaboration.
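
    As a rough illustration of the overlay idea behind the second interface, the following Python sketch blends a captured hand image (assumed to be segmented, with a transparent background) onto a screenshot of the shared workspace using Pillow. The file names and the transparency factor are assumptions; the add-on's actual capture and transmission pipeline is not shown.

    # Minimal sketch: composite a semi-transparent hand layer over the
    # collaborator's view of the shared workspace. File names are hypothetical.
    from PIL import Image

    def overlay_hand(workspace_path, hand_path, alpha=0.6):
        """Blend a segmented hand image (RGBA, transparent background)
        over a screenshot of the shared workspace."""
        workspace = Image.open(workspace_path).convert("RGBA")
        hand = Image.open(hand_path).convert("RGBA").resize(workspace.size)
        # Scale the hand layer's alpha channel so the workspace stays visible.
        faded = hand.copy()
        faded.putalpha(hand.getchannel("A").point(lambda a: int(a * alpha)))
        return Image.alpha_composite(workspace, faded)

    if __name__ == "__main__":
        overlay_hand("workspace.png", "hand.png").save("shared_view.png")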

    Methodological Challenges in Eye-Tracking based Usability Testing of 3-Dimensional Software – Presented via Experiences of Usability Tests of Four 3D Applications

    Eye-tracking based usability testing and User Experience (UX) research are widespread in the development processes of various types of software; however, specific difficulties arise during usability tests of three-dimensional (3D) software. When analysing screen recordings with gaze plots, heatmaps of fixations, and statistics of Areas of Interest (AOI), methodological problems occur when the participant wants to rotate, zoom, or move the 3D space. The data gained regarding the menu bar are largely interpretable; however, the data regarding the 3D environment are hardly interpretable, or not at all. Our research tested four software applications with this problem in mind: the ViveLab and Jack Digital Human Modelling (DHM) applications, and the ArchiCAD and CATIA Computer Aided Design (CAD) applications. Our original goal was twofold. Firstly, with these usability tests, we aimed to identify issues in the software. Secondly, we tested the utility of a new methodology that was included in the tests. This paper summarizes the results on the methodology based on individual experiments with the different software applications. One of the main ideas behind the adopted methodology is to instruct the participants, during certain subtasks of the tests, not to move the 3D space while they perform the given tasks. During the experiments, we applied a Tobii eye-tracking device, and after task completion, each participant was interviewed. Based on these experiences, the methodology appears to be both useful and applicable, and its visualisation techniques for one or more participants are interpretable.
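
    To make the AOI statistics concrete, the sketch below counts fixations and total fixation duration per screen region from an exported fixation list. The column names, AOI rectangles, and file name are assumptions for illustration, not the actual Tobii export format.

    # Sketch of per-AOI fixation statistics from a CSV of fixations.
    import csv
    from collections import defaultdict

    # AOIs as named screen rectangles: (x_min, y_min, x_max, y_max) in pixels.
    AOIS = {
        "menu_bar": (0, 0, 1920, 80),
        "3d_viewport": (0, 80, 1920, 1080),
    }

    def aoi_for(x, y):
        """Name of the AOI containing the fixation point, if any."""
        for name, (x0, y0, x1, y1) in AOIS.items():
            if x0 <= x < x1 and y0 <= y < y1:
                return name
        return "outside"

    def aoi_stats(path):
        """Fixation count and total fixation duration (ms) per AOI."""
        counts = defaultdict(int)
        durations = defaultdict(float)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                name = aoi_for(float(row["fixation_x"]), float(row["fixation_y"]))
                counts[name] += 1
                durations[name] += float(row["duration_ms"])
        return counts, durations

    if __name__ == "__main__":
        counts, durations = aoi_stats("fixations.csv")
        for name in AOIS:
            print(f"{name}: {counts[name]} fixations, {durations[name]:.0f} ms")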

    IDATER online conference: graphicacy and modelling 2010

    IDATER online conference: graphicacy and modelling 2010

    Algorithms for sketching surfaces

    CISRG discussion paper ; 1

    Development of a virtual reality milling machine for knowledge learning and skill training

    Current methods of training personnel on high-cost machine tools involve the use of both classroom and hands-on practical training. Practical training requires the operation of costly equipment, and the trainee has to be under close personal supervision. The main aim of this project is to reduce the amount of practical training and its inherent cost, time, danger, risk of personal injury, and material requirements by utilising virtual reality technology. In this study, an investigation into the use of virtual reality for training operators and students to use a milling machine was carried out. The investigation was divided into two sections. The first was the development of the milling machine in a 3D virtual environment, where the real machine was re-constructed in virtual space. This was carried out by creating objects and assembling them together. The complete milling machine was then properly modelled and rendered so it could be viewed from all viewpoints. The second section was to add motion to the virtual world. The machine was made to function as the real machine does. This was achieved by attaching Superscape Control Language (SCL) code to the objects. The developed milling machine allows the users to choose the material, speed, and feed rate. Upon activation, the virtual machine simulates the machining process, and instantaneous data on the machined part can be generated. The results were satisfactory: the milling machine was modelled successfully and was able to perform according to the tasks set. Using the developed virtual model, the ability to train students and operators to use the milling machine has been achieved.
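
    As an illustration of the kind of instantaneous machining data such a virtual machine could report from the chosen material, speed, and feed settings, the Python sketch below applies the standard spindle-speed and material-removal-rate relations. The cutting-speed table and sample values are illustrative assumptions, not values taken from the project; only the formulas are standard milling relations.

    # Sketch of instantaneous machining data for a virtual milling exercise.
    import math

    # Recommended cutting speeds in m/min (illustrative values only).
    CUTTING_SPEED = {"aluminium": 250.0, "mild_steel": 90.0, "brass": 180.0}

    def spindle_rpm(material, cutter_diameter_mm):
        """Spindle speed N = 1000 * Vc / (pi * D)."""
        return 1000.0 * CUTTING_SPEED[material] / (math.pi * cutter_diameter_mm)

    def removal_rate(feed_mm_per_min, depth_mm, width_mm):
        """Material removal rate in mm^3/min: MRR = f * ap * ae."""
        return feed_mm_per_min * depth_mm * width_mm

    rpm = spindle_rpm("aluminium", cutter_diameter_mm=12.0)
    mrr = removal_rate(feed_mm_per_min=300.0, depth_mm=2.0, width_mm=10.0)
    print(f"spindle speed: {rpm:.0f} rpm, removal rate: {mrr:.0f} mm^3/min")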