759 research outputs found

    Portable product miniaturization and the ergonomic threshold

    Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 1997. Includes bibliographical references (p. 124-125). By David H. Levy.

    Haptic controls in cars for safer driving

    With the spread of state-of-the-art technologies geared towards the human senses, haptics has been introduced as a way of using the sense of touch to solve real-world problems or to enhance existing experiences. This thesis focuses on using haptic technology in cars to make the driving experience safer. Modern vehicles carry GPS units, music systems, sunroofs and a number of other electronic devices. Interacting with these devices while driving often takes the driver's eyes off the road and raises safety concerns. We propose a haptic design that uses the sense of touch as a mode of controlling and coordinating the various convenience devices found within a car. A pattern of distinguishable haptic feedback linked to each device allows the user to operate these devices by touch alone, eliminating reliance on visual interaction. Installed in an easily accessible location such as the steering wheel, the design helps reduce driver distraction. The system was simulated using a desktop Phantom haptic interface, and a prototype was developed that can be installed in any vehicle. The prototype has been tested with a limited number of convenience devices; further development and enhancements could incorporate more devices and other user preferences. The main objective of this research is to integrate these functions in a robust manner that keeps the driver's vision constantly on the road: distinct, distinguishable haptic responses act as unique signatures for specific convenience devices, allowing the driver to identify each device and manipulate its settings by touch.
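The pattern-to-device binding described in this abstract can be sketched as a simple lookup. The pulse encodings and device names below are illustrative assumptions, not the thesis's actual design:

```python
# Illustrative mapping of distinguishable haptic pulse patterns to in-car
# convenience devices. Encodings (pulse counts/durations) are assumptions.
PATTERNS = {
    (1,): "music system",   # one short pulse
    (2,): "GPS",            # two short pulses
    (1, 3): "sunroof",      # short pulse followed by a long pulse
}

def identify_device(pattern):
    """Return the device bound to a felt pulse pattern, or 'unknown'."""
    return PATTERNS.get(tuple(pattern), "unknown")

print(identify_device([2]))     # → GPS
print(identify_device([9, 9]))  # → unknown
```

The key design property is that each pattern must be distinguishable by touch alone, so the driver never needs to look away from the road to confirm which device is selected.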

    Interaction Methods for Smart Glasses: A Survey

    Since the launch of Google Glass in 2014, smart glasses have mainly been designed to support micro-interactions. The ultimate goal of becoming a full augmented reality interface has not yet been attained due to an encumbrance of controls. Augmented reality involves superimposing interactive computer graphics onto physical objects in the real world. This survey reviews current research issues in the area of human-computer interaction for smart glasses. It first studies the smart glasses available on the market and then investigates the interaction methods proposed in the wide body of literature. The interaction methods can be classified into hand-held, touch, and touchless input; this paper focuses mainly on the latter two. Touch input can be further divided into on-device and on-body, while touchless input can be classified into hands-free and freehand. Next, we summarize the existing research efforts and trends, in which touch and touchless input are evaluated against a total of eight interaction goals. Finally, we discuss several key design challenges and the possibility of multi-modal input for smart glasses. Peer reviewed.
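The survey's interaction-method taxonomy (hand-held vs. touch vs. touchless, each with subcategories) can be expressed as a small hierarchy. The leaf examples here are illustrative assumptions, not taken from the survey itself:

```python
# Taxonomy of smart-glasses interaction methods as classified in the survey.
# Leaf examples are illustrative assumptions, not drawn from the paper.
taxonomy = {
    "hand-held": ["controller", "smartphone"],
    "touch": {
        "on-device": ["temple touchpad"],
        "on-body": ["palm", "forearm"],
    },
    "touchless": {
        "hands-free": ["voice", "head gesture"],
        "freehand": ["mid-air hand gesture"],
    },
}

def categories(tax, prefix=""):
    """Flatten the hierarchy into dotted category paths."""
    for key, value in tax.items():
        path = f"{prefix}{key}"
        if isinstance(value, dict):
            yield from categories(value, path + ".")
        else:
            yield path

print(sorted(categories(taxonomy)))
```

Representing the classification as data rather than prose makes it easy to tabulate each surveyed interaction method against the eight interaction goals the paper evaluates.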

    Advancements in Sensor Technologies and Control Strategies for Lower-Limb Rehabilitation Exoskeletons: A Comprehensive Review

    Lower-limb rehabilitation exoskeletons offer a transformative approach to enhancing recovery in patients with movement disorders affecting the lower extremities. This comprehensive systematic review delves into the literature on sensor technologies and the control strategies integrated into these exoskeletons, evaluating their capacity to address user needs and scrutinizing their structural designs regarding sensor distribution as well as control algorithms. The review examines various sensing modalities, including electromyography (EMG), force, displacement, and other innovative sensor types, employed in these devices to facilitate accurate and responsive motion control. Furthermore, the review explores the strengths and limitations of a diverse array of lower-limb rehabilitation-exoskeleton designs, highlighting areas of improvement and potential avenues for further development. In addition, the review investigates the latest control algorithms and analysis methods that have been utilized in conjunction with these sensor systems to optimize exoskeleton performance and ensure safe and effective user interactions. By building a deeper understanding of the diverse sensor technologies and monitoring systems, this review aims to contribute to the ongoing advancement of lower-limb rehabilitation exoskeletons, ultimately improving the quality of life for patients with mobility impairments.

    What you see is what you feel: on the simulation of touch in graphical user interfaces

    This study introduces a novel method of simulating touch with merely visual means. Interactive animations are used to create an optical illusion that evokes haptic percepts like stickiness, stiffness and mass within a standard graphical user interface. The technique, called optically simulated haptic feedback, exploits the domination of the visual over the haptic modality and the general human tendency to integrate the various senses. The study began with an aspiration to increase the sensorial qualities of the graphical user interface. With the introduction of the graphical user interface – and in particular the desktop metaphor – computers have become accessible to almost anyone; all over the world, people from various cultures use the same icons, folders, buttons and trashcans. However, from a sensorial point of view this computing paradigm is still extremely limited. Touch can play a powerful role in communication. It can offer an immediacy and intimacy unparalleled by words or images. Although few doubt this intrinsic value of touch perception in everyday life, examples in modern technology where human-machine communication utilizes the tactile and kinesthetic senses as additional channels of information flow are scarce. Hence, it has often been suggested that improvements in the sensorial qualities of computers could lead to more natural interfaces. Various researchers have been creating scenarios and technologies that could enrich the sensorial qualities of our digital environment. Some have developed mechanical force feedback devices that enable people to experience haptics while interacting with a digital display. Others have suggested that the computer should ‘disappear’ into the environment and proposed tangible objects as a means to connect the digital and the physical environment.
While the scenarios of force feedback, tangible interaction and the disappearing computer are maturing, millions of people still work with a desktop computer interface every day. In spite of its obvious drawbacks, the desktop computing model has penetrated deeply into our society and cannot be expected to disappear overnight. Radically different computing paradigms will require the development of radically different hardware. This takes time, and it is as yet unclear when, if ever, other computing paradigms will replace the current desktop computing setup. It is for that reason that we pursued another approach towards physical computing. Inspired by renaissance painters, who centuries ago invented illusionary techniques like perspective and trompe l'oeil to increase the presence of their paintings, we aim to improve the physicality of the graphical user interface without resorting to special hardware. Optically simulated haptic feedback, described in this thesis, has a lot in common with mechanical force-feedback systems, except that in mechanical force-feedback systems the location of the cursor is manipulated as a result of the force sent to the haptic device (force-feedback mouse, trackball, etc.), whereas in our system the cursor location is manipulated directly, resulting in a purely visual force feedback. By applying tiny displacements to the cursor's movement, tactile sensations like stickiness, touch, or mass can be simulated. In chapter 2 we suggest that the active cursor technique can be applied to create richer interactions without the need for special hardware. The cursor channel is transformed from an input-only channel into an input/output channel. The active cursor displacements can be used to create various (dynamic) slopes as well as textures and material properties, which can provide the user with feedback while navigating the on-screen environment. 
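The active-cursor idea of simulating a "hole" by displacing the pointer can be sketched as follows. The function names and the particular displacement formula are our own illustrative assumptions, not the thesis's implementation:

```python
import math

def hole_displacement(x, y, cx, cy, radius=40.0, strength=3.0):
    """Illustrative active-cursor 'hole': nudge the reported cursor
    position toward the hole centre (cx, cy) while the pointer is
    inside the hole's radius; leave it untouched outside."""
    dx, dy = cx - x, cy - y
    dist = math.hypot(dx, dy)
    if dist == 0 or dist > radius:
        return x, y  # outside the simulated hole: no displacement
    # Pull is strongest near the centre and fades to zero at the rim.
    pull = strength * (1.0 - dist / radius)
    return x + pull * dx / dist, y + pull * dy / dist

# Crossing a hole centred at (100, 100), the pointer drifts toward it.
print(hole_displacement(110, 100, 100, 100))  # → (107.75, 100.0)
```

Because the user's hand motion and the displaced on-screen position disagree slightly, the visual modality dominates and the user perceives the discrepancy as a physical slope rather than as a cursor glitch.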
In chapter 3 the perceptual illusion of touch, resulting from the domination of the visual over the haptic modality, is described in the larger context of prior research and experimentally tested. Using both the active cursor technique and a mechanical force feedback device, we generated bump and hole structures. In a controlled experiment the perception of the slopes was measured, comparing the optical with the mechanical simulation. Results show that people can recognize optically simulated bump and hole structures, and that active cursor displacements influence the haptic perception of bumps and holes. Depending on the simulated strength of the force, optically simulated haptic feedback can take precedence over mechanically simulated haptic feedback, but also the other way around. When optically simulated and mechanically simulated haptic feedback counteract each other, however, the weight attributed to each source of haptic information differs between users. It is concluded that active cursor displacements can be used to optically simulate the operation of mechanical force feedback devices. An obvious application of optically simulated haptic feedback in graphical user interfaces is to assist the user in pointing at icons and objects on the screen. Given the pervasiveness of pointing in graphical interfaces, every small improvement in a target-acquisition task represents a substantial improvement in usability. Can active cursor displacements be applied to help users reach their goal? In chapter 4 we test the usability of optically simulated haptic feedback in a pointing task, again in comparison with the force feedback generated by a mechanical device. In a controlled Fitts'-law type experiment, subjects were asked to point and click at targets of different sizes and distances. Results show that rendering hole-type structures underneath the targets improves the effectiveness, efficiency and satisfaction of the target-acquisition task. 
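Fitts'-law pointing experiments of this kind are commonly analysed with the Shannon formulation of the index of difficulty; a minimal sketch, with made-up example values:

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' law: ID = log2(D/W + 1), in bits."""
    return math.log2(distance / width + 1)

def index_of_performance(distance, width, movement_time):
    """Throughput in bits/second for one pointing condition."""
    return index_of_difficulty(distance, width) / movement_time

# Hypothetical condition: 256 px to a 16 px target, acquired in 0.85 s.
ID = index_of_difficulty(256, 16)        # log2(17) ≈ 4.09 bits
IP = index_of_performance(256, 16, 0.85)
print(round(ID, 2), round(IP, 2))        # → 4.09 4.81
```

A higher index of performance at equal task difficulty, as the abstract reports for the smaller targets, means the same amount of pointing "information" is transmitted in less time.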
Optically simulated haptic feedback results in lower error rates, more satisfaction, and a higher index of performance, which can be attributed to the shorter movement times realized for the smaller targets. For larger targets, optically simulated haptic feedback resulted in movement times comparable to those of mechanically simulated haptic feedback. Since current graphical interfaces are not designed with tactility in mind, the development of novel interaction styles should also be an important research path. Before optically simulated haptic feedback can be fully brought into play in more complex interaction styles, designers and researchers need to experiment further with the technique. In chapter 5 we describe a software prototyping toolkit, called PowerCursor, which enables designers to create interaction styles using optically simulated haptic feedback without having to do elaborate programming. The software engine consists of a set of ready-made force field objects – holes, hills, ramps, rough and slick objects, walls, whirls, and more – that can be added to any Flash project, as well as force behaviours that can be added to custom-made shapes and objects. These basic building blocks can be combined to create more complex and dynamic force objects. This setup should allow users of the toolkit to creatively design their own interaction styles with optically simulated haptic feedback. The toolkit is implemented in Adobe Flash and can be downloaded at www.powercursor.com. Furthermore, in chapter 5 we present a preliminary framework of the expected applicability of optically simulated haptic feedback. Illustrated with examples created so far with the beta version of the PowerCursor toolkit, we discuss some of the ideas for novel interaction styles. 
Besides being useful in assisting the user while navigating, optically simulated haptic feedback might be applied to create so-called mixed-initiative interfaces – one can, for instance, think of an installation wizard which guides the cursor towards the recommended next step. Furthermore, since optically simulated haptic feedback can be used to communicate material properties of textures or 3D objects, it can be applied to create aesthetically pleasing interactions – which, as computers migrate into domains other than the office environment, are becoming more relevant. Finally, we discuss the opportunities for applications outside the desktop computer model. We discuss how, in principle, optically simulated haptic feedback can play a role in any graphical interface where the input and output channels are decoupled. In chapter 6 we draw conclusions and discuss future directions. We conclude that optically simulated haptic feedback can increase the physicality and quality of our current graphical user interfaces without resorting to specialized hardware. Users are able to recognize haptic structures simulated by applying active cursor displacements to the user's mouse movements. Our technique of simulating haptic feedback optically opens up an additional communication channel with the user that can enhance the usability of the graphical interface. However, the active cursor technique is not expected to replace mechanical haptic feedback altogether, since it can be applied only in combination with a visual display and thus will not work for visually impaired people. Rather, we expect that the ability to employ tactile interaction styles in a standard graphical user interface could catalyze the development of novel physical interaction styles and, in the long term, might instigate the acceptance of haptic devices. With this research we hope to have contributed to a more sensorial and richer graphical user interface. 
Moreover, we have aimed to increase our awareness and understanding of media technology and simulations in general. Our scientific research results are therefore deliberately presented within a social-cultural context that reflects upon the dominance of the visual modality in our society and the ever-increasing role of media and simulations in people's everyday lives.

    Multimodal Human-Machine Interface For Haptic-Controlled Excavators

    The goal of this research is to develop a human-excavator interface for the haptic-controlled excavator that makes use of the multiple human sensing modalities (visual, auditory, haptic) and efficiently integrates these modalities to ensure an intuitive, efficient interface that is easy to learn and use and is responsive to operator commands. Two empirical studies were conducted to investigate conflict in the haptic-controlled excavator interface and to identify the level of force feedback that yields the best operator performance.