
    Examining the use of visualisation methods for the design of interactive systems

    Human-Computer Interaction (HCI) design has historically involved people from different fields. Designing HCI systems with people of varying backgrounds and expertise can bring different perspectives and ideas, but discipline-specific language and design methods can hinder such collaborations. The application of visualisation methods is a way to overcome these challenges, but to date selection tools have tended to focus on a single facet of HCI design methods, and no research has attempted to assemble a collection of HCI visualisation methods. To fill this gap, this research seeks to establish an inventory of HCI visualisation methods and identify ways of selecting amongst them. Creating the inventory enables designers to discover and learn about methods they may not have used before or be familiar with, while categorising the methods provides a structure for new and experienced designers to determine appropriate methods for their design project. The aim of this research is to support designers in the development of HCI systems through better selection and application of visualisation methods. This is achieved through four phases. In the first phase, three case studies are conducted to investigate the challenges and obstacles that influence the choice of a design approach in the development of HCI systems; their findings form the design requirements for a visualisation methods selection and application guide. In the second phase, the Guide is developed. In the third phase, the Guide is evaluated by employing it in the development of a serious training game to demonstrate its applicability. In the fourth phase, a user study is designed to evaluate the serious training game, and through this evaluation the Guide is validated. This research contributes to the knowledge surrounding visualisation tools used in the design of interactive systems. The compilation of HCI visualisation methods establishes an inventory of methods for interaction design, and the identification of Selection Approaches brings together the ways in which visualisation methods are organised and grouped. By mapping visualisation methods to Selection Approaches, the study provides a way for practitioners to select a visualisation method to support their design practice. The Selection Guide provides five filters that help designers identify suitable visualisation methods based on the nature of the design challenge, and the Application Guide presents the methodology of each visualisation method in a consistent format, easing method comparison and ensuring there is comprehensive information for each method. Finally, a user study evaluating the serious training game is presented; two learning objectives were identified and mapped to Bloom's Taxonomy to support like-for-like comparison with future studies.
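    To illustrate the kind of filter-based selection the Guide describes, the following is a minimal sketch of narrowing a method inventory by attributes of the design challenge; the filter names and method entries are hypothetical and not taken from the thesis.

```python
# Hypothetical sketch of filter-based method selection; the filters and
# method records are illustrative, not the thesis's actual Guide content.
INVENTORY = [
    {"name": "Storyboard",      "stage": "concept", "fidelity": "low"},
    {"name": "Wireframe",       "stage": "detail",  "fidelity": "medium"},
    {"name": "Paper prototype", "stage": "concept", "fidelity": "low"},
]

def select(inventory, **filters):
    """Keep methods whose attributes match every supplied filter value."""
    return [m["name"] for m in inventory
            if all(m.get(k) == v for k, v in filters.items())]

print(select(INVENTORY, stage="concept"))  # ['Storyboard', 'Paper prototype']
```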

    Natural user interfaces for interdisciplinary design review using the Microsoft Kinect

    As markets demand engineered products faster, waiting on the cyclical design processes of the past is not an option. Instead, industry is turning to concurrent design and interdisciplinary teams. When these teams collaborate, engineering CAD tools play a vital role in conceptualizing and validating designs. These tools require significant user investment to master, due to challenging interfaces and an overabundance of features. These challenges often prohibit team members from using these tools for exploring designs. This work presents a method allowing users to interact with a design using intuitive gestures and head tracking, all while keeping the model in a CAD format. Specifically, Siemens' Teamcenter® Lifecycle Visualization Mockup (Mockup) was used to display design geometry while modifications were made through a set of gestures captured by a Microsoft Kinect™ in real time. This proof-of-concept program allowed a user to rotate the scene, activate Mockup's immersive menu, move the immersive wand, and manipulate the view based on head position. This work also evaluates gesture usability and task completion time for the proof-of-concept system. A cognitive model evaluation method was used to evaluate the premise that gesture-based user interfaces are easier to use and learn, with regard to time, than a traditional mouse-and-keyboard interface. Using a cognitive model analysis tool allowed the rapid testing of interaction concepts without the significant overhead of user studies and full development cycles. The analysis demonstrated that the Kinect™ is a feasible interaction mode for CAD/CAE programs. In addition, the analysis pointed out limitations in the gesture interface's ability to compete, time-wise, with easily accessible customizable menu options.
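    As a rough illustration of the kind of cognitive-model comparison described above, the sketch below estimates task times in a Keystroke-Level-Model style for a "rotate the scene" action. The standard KLM operator times are the published values from Card et al.; the single-gesture time G is a purely assumed placeholder, not a measured value from this work.

```python
# Keystroke-Level-Model style estimate (a rough sketch, not this work's
# actual cognitive model).  Standard KLM operator times (Card et al.):
#   K = keystroke/button press, P = point with mouse, H = home hands,
#   M = mental preparation.  The gesture time G is an assumed placeholder.
KLM = {"K": 0.2, "P": 1.1, "H": 0.4, "M": 1.35}
G = 1.5  # assumed duration of one mid-air gesture (hypothetical)

def estimate(sequence, times):
    """Sum operator times for a sequence of KLM operators."""
    return sum(times[op] for op in sequence)

# Rotate the scene: mouse/keyboard = mentally prepare, home to mouse,
# point at the view control, press-drag; gesture = mentally prepare, gesture.
mouse_kbd = estimate(["M", "H", "P", "K"], KLM)
gesture = KLM["M"] + G

print(f"mouse/keyboard ~ {mouse_kbd:.2f} s, gesture ~ {gesture:.2f} s")
```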

    Development of Working Facility to Improve Work Posture at Packaging Section in Organic Vegetable Industry

    Working facilities may influence a worker's working posture. CV. Tani Organik Merapi (CV. TOM) is an organic vegetables company. The company provides a "dingklik" (footstool) for working. While working on the footstool, workers must bend their legs to a high degree, as well as their backs. This poor working posture is caused by inadequate working facilities, so it is important to check working postures to know whether an improvement of the working facilities is needed. The Ovako Working Analysis System (OWAS) was used to assess the severity of working postures in each of CV. TOM's working activities, and the packaging section was identified as having the worst working posture. A new working facility was therefore designed, using CATIA V5, to improve the working posture in the packaging section. Three ergonomics tools were used to compare the footstool working posture with the new working facility: Rapid Upper Limb Assessment (RULA), Manual Task Risk Assessment (ManTRA), and Rodgers Muscle Fatigue Analysis (RMFA). It was found that the new working facility can replace the dingklik, giving a lower and safer posture score. Keywords: Ergonomics; CATIA; Comfortability; Facility; Working posture
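    The sketch below gives a flavour of an OWAS-style posture check. The real OWAS action categories come from a published lookup table over back, arm, leg and load codes; the simplified scoring rule used here is only an illustrative placeholder, not the actual OWAS table.

```python
# Sketch of an OWAS-style posture check.  The simplified rule below is an
# illustrative placeholder, not the published OWAS lookup table.
def owas_action_category(back, arms, legs, load):
    """back 1-4, arms 1-3, legs 1-7, load 1-3 (higher = worse posture/load)."""
    strain = (back - 1) + (arms - 1) + (legs - 1) + (load - 1)
    if strain >= 7:
        return 4   # corrective action needed immediately
    if strain >= 5:
        return 3   # corrective action needed as soon as possible
    if strain >= 3:
        return 2   # corrective action needed in the near future
    return 1       # no corrective action needed

# Packaging while bent over a low footstool, legs deeply flexed (example codes):
print(owas_action_category(back=2, arms=1, legs=4, load=1))
```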

    Design and Validation of Hardware-in-the-Loop Testbed for Proximity Operations Payloads

    The research presented here is a new testbed design for CubeSat and payload testing and development. This research demonstrates a low-cost, hardware-in-the-loop testing apparatus for use with university CubeSat programs, for testing throughout the different levels of the development process. The average university CubeSat program undergoes very little hardware-in-the-loop testing; most of the focus is targeted towards performance testing and environmental testing, which occur after completion of the development process. This research shows that, for minimal schedule and cost impact, testing can occur early in the development process. The testbed presented here demonstrates suitable accuracy to be used for advanced mission testing and regularly throughout the process until completion. The testbed maintains a low-cost, modular design and ease of integration into new and existing programs. In addition, some modifications and upgrades are suggested to further increase the performance of the testbed. The success of the testbed can be seen through the implementation of actual satellite telemetry with rendezvous and docking missions, the testbed's performance, and the results of that experiment.

    Sketching in 3D : towards a fluid space for mind and body

    Thesis (S.M. in Architecture Studies)--Massachusetts Institute of Technology, Dept. of Architecture, 2013. Cataloged from PDF version of thesis. Includes bibliographical references (p. 80-82). This thesis explores a new type of computer-aided sketching tool for 3-dimensional designs. Sketching, as a process, has been used as an effective way to explore and develop ideas in the design process. However, when designers deal with volumetric designs in 3-dimensional space, current sketching means, including traditional free-hand sketching and contemporary computer-aided design (CAD) modeling, have limitations such as dimensional inconsistency and non-intuitive interactions. By observing the roles of sketching in the design process and reviewing the history of design tools, this thesis investigates and proposes new digital methods of 3-dimensional sketching that take advantage of the motion-detecting and computer-vision technology that is widely available today. In this thesis, two prototype tools were developed and compared. The first prototype uses a motion-detecting sensor, a projection screen, and gesture-tracking software; the movement of the user's hands becomes the intuitive interface to shape 3-dimensional objects in the virtual space. The second prototype, developed in collaboration with Nagakura, uses a hand-held tablet computer with a marker-based augmented reality technique; the hand-held device displays the virtual object from desired angles and works as a virtual tool, like a chisel, plane, drill, or glue gun, to shape virtual objects in 3-dimensional space. Testing these two prototypes in use, and comparing the resulting objects and user responses, revealed the strengths and weaknesses of these different 3-dimensional sketching environments. The proposed systems provide a possible foundation for novel computer-aided sketching applications that take advantage of both the physical and virtual worlds. by Woongki Sung. S.M. in Architecture Studies.
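    A minimal sketch of the gesture-driven shape editing the first prototype describes: the mesh vertices nearest the tracked hand are dragged by the hand's displacement. The sensor interface, mesh representation and tool radius are assumptions for illustration, not the thesis's implementation.

```python
# Minimal sketch of gesture-driven shape editing: move the mesh vertices
# nearest the tracked hand by the hand's displacement.  Sensor API, mesh
# format and the tool radius are assumptions for illustration.
import numpy as np

def sculpt_step(vertices, hand_prev, hand_now, radius=0.05):
    """vertices: (N, 3) array; hand_prev/hand_now: 3-vectors in the same space."""
    delta = np.asarray(hand_now) - np.asarray(hand_prev)
    dists = np.linalg.norm(vertices - np.asarray(hand_now), axis=1)
    mask = dists < radius            # vertices within the "tool" radius
    vertices[mask] += delta          # drag them with the hand
    return vertices

verts = np.random.rand(100, 3)
verts = sculpt_step(verts, hand_prev=[0.50, 0.5, 0.5], hand_now=[0.52, 0.5, 0.5])
```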

    The challenges in computer supported conceptual engineering design

    Computer Aided Engineering Design (CAED) supports the engineering design process during detail design, but it is not commonly used in the conceptual design stage. This article explores through the literature why this is, and how the engineering design research community is responding through the development of new conceptual CAED systems and HCI (Human-Computer Interface) prototypes. First, the requirements and challenges for future conceptual CAED and HCI solutions to better support conceptual design are explored and categorised. Then the prototypes developed in both areas since 2000 are discussed. Characteristics already considered, and those required for future development of CAED systems and HCIs, are proposed and discussed, one of the key ones being experience. The prototypes reviewed offer innovative solutions, but only address selected requirements of conceptual design, and are thus unlikely to provide a solution which would fit the wider needs of the engineering design industry. More importantly, while the majority of prototypes show promising results, they are of low maturity and require further development.

    User-based gesture vocabulary for form creation during a product design process

    There are inconsistencies between the nature of conceptual design and the functionalities of the computational systems supporting it, which disrupt the designers' process by focusing on technology rather than on designers' needs. A need was identified for the elicitation of hand gestures appropriate for the requirements of conceptual design, rather than gestures arbitrarily chosen or selected for ease of implementation. The aim of this thesis is to identify natural and intuitive hand gestures for conceptual design, performed by designers (3rd and 4th year product design engineering students and recent graduates) working on their own, without instruction and without limitations imposed by the facilitating technology. This was done via a user-centred study including 44 participants, in which 1785 gestures were collected. Gestures were explored as a sole means for shape creation and manipulation in virtual 3D space. Gestures were identified, described in writing, sketched, coded based on the taxonomy used, categorised based on hand form and the path travelled, and variants identified. They were then statistically analysed to ascertain agreement rates between the participants, the significance of the agreement, and the likelihood of the number of repetitions for each category occurring by chance. The most frequently used and statistically significant gestures formed the consensus vocabulary for conceptual design. The effect of the shape of the manipulated object on the gesture performed, and whether the sequence of gestures participants proposed differed from established CAD solid modelling practices, were also observed. The vocabulary was evaluated by non-designer participants, both theoretically and in a VR environment, and the outcomes showed that the majority of gestures were appropriate and easy to perform. Participants selected their preferred gestures for each activity, and a variant of the vocabulary for conceptual design was created as an outcome, which aims to ensure that extensive training is not required, extending the ability to design beyond trained designers only.
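    Gesture elicitation studies of this kind typically report agreement using a formulation such as Vatavu and Wobbrock's agreement rate; the sketch below shows that calculation for a single referent on hypothetical data. Whether the thesis used exactly this statistic is an assumption.

```python
# Sketch of the agreement-rate calculation commonly used in gesture
# elicitation studies (Vatavu & Wobbrock's AR formulation); the data and the
# choice of this exact statistic are assumptions.
from collections import Counter

def agreement_rate(proposals):
    """proposals: list of gesture labels proposed by participants for one referent."""
    n = len(proposals)
    if n < 2:
        return 0.0
    same_pairs = sum(c * (c - 1) for c in Counter(proposals).values())
    return same_pairs / (n * (n - 1))

# Example: 10 participants proposing a gesture for "rotate object".
print(agreement_rate(["twist"] * 6 + ["swipe"] * 3 + ["grab"]))  # -> 0.4
```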

    Effects of activity time limitation on gesture elicitation for form creation

    Cognitive processing employed during design includes both time-critical and time-consuming types of thinking. The ability to match the pace of design generation or modification with the designer's thinking processes can be particularly important with gesture-based interfaces for form creation, especially where the representation modes of input and response may influence the choice of activities performed. In gesture elicitation studies in particular, time-consuming design activities can shift the focus towards forming analogies between the problem at hand and prior knowledge and experience, rather than towards intuitive gesture suggestions that would best fit the given representation mode. However, design methodologies do not prescribe or discuss time limitations and their use in this context. In this paper, time limitation is explored during a gesture elicitation study for three-dimensional object creation, modification and manipulation, by comparing two study parts: one where a time limitation was imposed and one where time was unlimited. The resulting gesture durations in both parts were comparable, and the elicited gestures were similar in nature and employed the same elements of hand motion, supporting the hypothesis that time limitation can be a useful methodological approach when gestures are used for interaction with 3D objects and the representation and interaction modalities are matched.
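    As an illustration of the kind of comparison reported, the sketch below contrasts gesture durations from a time-limited and an unlimited condition using a non-parametric test. Both the data and the choice of the Mann-Whitney U test are assumptions for illustration, not the paper's actual analysis.

```python
# Hedged sketch: compare gesture durations under time-limited vs. unlimited
# conditions.  Data values and the Mann-Whitney U test are illustrative
# assumptions, not the study's reported analysis.
from scipy.stats import mannwhitneyu

limited   = [1.8, 2.1, 1.6, 2.4, 1.9, 2.0]   # seconds, hypothetical
unlimited = [2.0, 1.7, 2.3, 2.2, 1.8, 2.1]   # seconds, hypothetical

stat, p = mannwhitneyu(limited, unlimited, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.3f}")  # comparable durations -> large p
```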

    Real-time evaluation and feedback system for ergonomics on the shop floor.

    Despite greatly increased automation in manufacturing industries, manual operations still exist, and the ergonomic risk factors that arise from manual operations can lead to Work-Related Musculoskeletal Disorders (WMSDs). To mitigate the risk, manual operations should be assessed to identify whether any risk, such as awkward posture, exists. Most assessments are carried out offline, but this cannot alert operators and prevent them from adopting awkward postures in time. Hence, given the popularity of flexible manufacturing systems that require immediate response to changes, there is a need for real-time assessment. The aim of this research is therefore to develop a real-time knowledge-based ergonomic assessment system for use in the real-time evaluation of work postures on the shop floor and the provision of feedback to workers, using 3D motion sensors. The developed intelligent system utilizes knowledge from health and safety (H&S) guidelines, a set of rules and an inference engine to automatically capture and assess a worker's postures and provide real-time feedback to the worker through an easy-to-understand user interface. The system has been validated using several case studies, including the posture assessment of 6 operators assembling an engine valve, 4 seated researchers conducting desk-based reading, and 15 operators lifting, assembling and hammering an IKEA table. When tested, the system proved to achieve real-time assessment, easy-to-understand feedback, and reliable measurements, with a Cronbach's alpha of 0.978 (p = 0.045) and a Kendall's coefficient of concordance of 0.634 (p = 0.000). The main contribution of this work lies in providing real-time feedback to workers, in three sub-areas: i) development of a real-time Kinect-based tool for H&S-compliant ergonomic assessment; ii) development of a knowledge-based real-time feedback system for improved posture assessment; iii) provision of real-time feedback to alert workers in time. The novelty of this research is the development of a knowledge-based system for real-time ergonomic assessment and feedback to workers using 3D motion sensors. PhD in Manufacturing.
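    A minimal sketch of the sort of rule a knowledge-based system like this could apply in real time: trunk flexion is computed from two 3D joint positions and compared against a threshold. The joint names, the 20-degree limit and the feedback wording are assumptions, not the system's actual H&S rules.

```python
# Sketch of a rule-based posture check from 3D joints (illustrative only;
# the threshold and feedback text are assumptions, not the system's rules).
import numpy as np

def trunk_flexion_deg(hip, shoulder):
    """Angle between the hip->shoulder vector and the vertical axis."""
    trunk = np.asarray(shoulder) - np.asarray(hip)
    vertical = np.array([0.0, 1.0, 0.0])            # sensor Y axis assumed up
    cosang = trunk @ vertical / np.linalg.norm(trunk)
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

def feedback(hip, shoulder, limit_deg=20.0):
    angle = trunk_flexion_deg(hip, shoulder)
    return f"Trunk flexion {angle:.0f} deg - " + (
        "straighten your back" if angle > limit_deg else "posture OK")

print(feedback(hip=[0, 0.9, 0], shoulder=[0.25, 1.4, 0]))
```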

    Developing a mixed reality assistance system based on projection mapping technology for manual operations at assembly workstations.

    Manual tasks play an important role in socially sustainable manufacturing enterprises. Manual operations are commonly used for low-volume production, but are not limited to it. Operational models in manufacturing systems, in the case of x-to-order paradigms (e.g. assembly-to-order), may require manual operations to speed up the ramp-up time of new product configuration assemblies. The implication of manual operations in any production line is that the manufacturing or assembly process becomes more susceptible to human error, which can translate into delays, defects and/or poor product quality. In this scenario, virtual and augmented realities can offer significant advantages in supporting the human operator in manual operations. This research work presents the development of a mixed (virtual and augmented) reality assistance system that provides real-time support in manual operations. A review of mixed reality techniques and technologies was conducted, from which a projection mapping solution was chosen for the proposed assistance system. Hardware and software components were selected according to the specific requirements of the demonstration environment. The developed mixed reality assistance system was able to guide a user without any prior knowledge through the successful completion of the specific assembly task.
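    At the core of a projection-mapping assistance system is mapping a 3D point on the workstation to projector pixel coordinates; the sketch below does this with a calibrated pinhole model. The intrinsic and extrinsic values are placeholder assumptions, not the system's actual calibration.

```python
# Sketch of the core projection-mapping step: map a 3D workstation point to
# projector pixels with a pinhole model.  All calibration values are assumed.
import numpy as np

K = np.array([[1400.0, 0.0, 960.0],      # projector intrinsics (assumed)
              [0.0, 1400.0, 540.0],
              [0.0,    0.0,   1.0]])
R = np.eye(3)                            # projector pose w.r.t. workstation
t = np.array([0.0, 0.0, 1.2])            # (assumed calibration result)

def project(point_3d):
    """Return (u, v) projector pixel for a 3D point in workstation coordinates."""
    cam = R @ np.asarray(point_3d) + t   # transform into projector frame
    u, v, w = K @ cam
    return u / w, v / w

# Highlight the bin the operator should pick from next (example point):
print(project([0.10, -0.05, 0.0]))
```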