7 research outputs found

    Bimodal Feedback for In-car Mid-air Gesture Interaction

    This demonstration showcases novel multimodal feedback designs for in-car mid-air gesture interaction. It explores the potential of multimodal feedback types for mid-air gestures in cars and how these can reduce eyes-off-the-road time, thus making driving safer. We will show four different bimodal feedback combinations that provide effective information about interaction with in-car systems: visual-auditory, auditory-ambient (peripheral vision), ambient-tactile, and tactile-auditory. Users can interact with the system after a short introduction, creating an exciting opportunity to deploy these displays in cars in the future.

    Evaluation of Haptic Patterns on a Steering Wheel

    Infotainment systems can increase mental workload and divert visual attention away from the road ahead. When these systems give information to the driver, providing it through the tactile channel on the steering wheel might improve driving behaviour and safety. This paper describes an investigation into the perceivability of haptic feedback patterns using an actuated surface on a steering wheel. Six solenoids were embedded along the rim of the steering wheel, creating three bumps under each palm. At most four of the six solenoids were actuated simultaneously, resulting in 56 patterns to test. Participants were asked to keep to the middle of the road in the driving simulator as closely as possible. Overall recognition accuracy of the haptic patterns was 81.3%, and the identification rate increased with a decreasing number of active solenoids (up to 92.2% for a single solenoid). There was no significant increase in lane deviation or steering angle during haptic pattern presentation. These results suggest that drivers can reliably distinguish between cutaneous patterns presented on the steering wheel. Our findings can assist in delivering non-critical messages to the driver (e.g. driving performance, incoming text messages) without decreasing driving performance or increasing perceived mental workload.
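The pattern count reported in the abstract follows from simple combinatorics: with six solenoids and at most four actuated at once, the number of distinct non-empty activation patterns is the sum of binomial coefficients C(6, k) for k = 1..4. A quick illustrative check (not code from the paper):

```python
from math import comb

# Six solenoids on the wheel rim; at most four actuated simultaneously.
SOLENOIDS = 6
MAX_ACTIVE = 4

# Count non-empty activation subsets of size 1..4: C(6,1)+C(6,2)+C(6,3)+C(6,4)
pattern_count = sum(comb(SOLENOIDS, k) for k in range(1, MAX_ACTIVE + 1))
print(pattern_count)  # → 56, matching the 56 patterns tested in the study
```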

    Novel Multimodal Feedback Techniques for In-Car Mid-Air Gesture Interaction

    This paper presents an investigation into the effects of different feedback modalities on mid-air gesture interaction for infotainment systems in cars. Car crashes and near-crash events are most commonly caused by driver distraction. Mid-air interaction is a way of reducing driver distraction by reducing the visual demand of infotainment systems. Despite a range of available modalities, feedback in mid-air gesture systems is generally provided through visual displays. We conducted a simulated driving study to investigate how different types of multimodal feedback can support in-air gestures. The effects of different feedback modalities on eye gaze behaviour and on the driving and gesturing tasks are considered. We found that feedback modality influenced gesturing behaviour; notably, drivers corrected falsely executed gestures more often in non-visual conditions. Our findings show that non-visual feedback can significantly reduce visual distraction.

    Evaluating Haptic Feedback on a Steering Wheel in a Simulated Driving Scenario

    This paper investigates the perceivability of haptic feedback patterns delivered via an actuated surface on a steering wheel. Six solenoids were embedded along the surface of the wheel, creating three bumps under each palm. The solenoids can be used to create a range of different tactile patterns. Following the design recommendation by Gallace et al. [Gallace2006a], at most four of the six solenoids were actuated simultaneously, resulting in 57 patterns to test. A simulated driving study was conducted to investigate (1) the optimal number of actuated solenoids and (2) the most perceivable haptic patterns. A relationship between the number of actuated solenoids and the pattern identification rate was established: perception accuracy drops above three active solenoids. Haptic patterns mirrored symmetrically on both hands were perceived more accurately. Practical applications for displaying tactile messages on the steering wheel include blind spots, upcoming road conditions, and navigation information (i.e. conveying information discreetly to the driver).

    Interaction with a Vehicle's Infotainment System: Implementation and Evaluation of Gestures and Haptic Techniques

    Nowadays, most carmakers incorporate touchscreens in their vehicles' infotainment systems. Unlike traditional interfaces with tangible, physical controls such as buttons, touchscreens cannot transmit tactile feedback about the controls' positioning, shape, or other physical characteristics. This means that interacting with these devices requires the user's visual attention. As driving is a mainly visual task, using these systems leads to increased driver distraction and, consequently, an increased risk of accident or near-accident. To address these disadvantages, this work's main focus was creating an innovative system that allows interacting with a modern infotainment system's touchscreen in a more intuitive fashion that requires less visual attention from the user. To accomplish this goal, a system composed of two modules was designed: a module responsible for the interaction with the touchscreen, and a module responsible for providing the user with haptic feedback. The first module consists of a system that tracks the user's hand and translates the pointing gesture into a discrete position on the screen, allowing the system to provide feedback before the user touches the screen. The second module consists of a set of haptic actuators mounted on the steering wheel, where the driver rests his or her left hand. This module allows the transmission of relevant information in the form of vibration patterns. The modules were developed independently and integrated into the final system after the necessary iterations.
The system was evaluated through a set of tasks commonly performed in infotainment systems and was compared against traditional interaction with these systems, without any type of haptic feedback or unusual interaction method. Relevant metrics, such as the number of glances away from the road, the total duration of those glances, driving deviation, and subjective workload, were recorded. The system met expectations, leading to improvements across several of the recorded metrics, and did not negatively affect driving performance or subjective workload, despite introducing feedback and a new interaction technique.
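The thesis describes translating a tracked pointing gesture into a discrete on-screen position so feedback can be given before touch. The thesis does not give the algorithm; the sketch below is a minimal illustration of the idea, assuming a normalized (0..1) fingertip position snapped to a hypothetical grid of screen targets — the grid size and clamping are illustrative assumptions, not the author's actual design.

```python
def point_to_cell(x_norm, y_norm, cols=3, rows=2):
    """Map a normalized pointing position (0..1) to a discrete grid cell.

    Hypothetical sketch: snaps a continuous hand-tracking estimate to one
    of rows*cols screen targets so the UI can highlight the target before
    the finger actually touches the screen.
    """
    # Clamp so positions slightly off-screen still snap to an edge cell.
    x = min(max(x_norm, 0.0), 0.999)
    y = min(max(y_norm, 0.0), 0.999)
    return int(y * rows), int(x * cols)  # (row, col)

# Pointing at the centre-right of a 3x2 menu selects row 0, column 2:
print(point_to_cell(0.9, 0.2))  # → (0, 2)
```

Snapping to a coarse grid (rather than reporting raw coordinates) is what makes early, stable feedback possible: small tracking jitter stays within one cell.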

    Nonlinear Modeling and Control of Driving Interfaces and Continuum Robots for System Performance Gains

    With the rise of (semi)autonomous vehicles and continuum robotics technology and applications, there has been increasing interest in controller and haptic interface designs. The presence of nonlinearities in the vehicle dynamics is the main challenge in the selection of control algorithms for real-time regulation and tracking of (semi)autonomous vehicles. Moreover, control of continuum structures with infinite dimensions proves difficult due to their complex dynamics and the soft, flexible nature of the manipulator body. Trajectory tracking and control of automobile and robotic systems require control algorithms that can effectively deal with the nonlinearities of the system without approximation, despite modeling uncertainties and input disturbances. Control strategies based on a linearized model are often inadequate for meeting precise performance requirements, so nonlinear techniques must be considered. Nonlinear control systems provide tools and methodologies for the design and realization of (semi)autonomous vehicles and continuum robots with extended specifications based on their operational mission profiles. This dissertation provides insight into various nonlinear controllers developed for (semi)autonomous vehicles and continuum robots as a guideline for future applications in the automobile and soft robotics fields. A comprehensive assessment of the approaches and control strategies, as well as insight into future areas of research in this field, is presented.
First, two vehicle haptic interfaces, a robotic grip and a joystick, both accompanied by nonlinear sliding mode control, were developed and studied on a steer-by-wire platform integrated with a virtual reality driving environment.
An operator-in-the-loop evaluation with 30 human test subjects was used to investigate these haptic steering interfaces over a prescribed series of driving maneuvers through real-time data logging and post-test questionnaires. A conventional steering wheel with a robust sliding mode controller was used across all driving events for comparison. Test subjects operated these interfaces on a track comprising a double lane-change maneuver and a country-road driving event. Subjective and objective results demonstrate that the driver's experience can be enhanced by up to 75.3% with a robotic steering input when compared to the traditional steering wheel during extreme maneuvers such as high-speed driving and sharp turns (e.g., hairpin turns). Second, a cellphone-inspired portable human-machine interface (HMI) that incorporates the directional control of the vehicle as well as brake and throttle functionality into a single holistic device is presented. A nonlinear adaptive control technique and an optimal control approach based on driver intent were also proposed to accompany the mechatronic system for combined longitudinal and lateral vehicle guidance. Designed to assist disabled drivers by ergonomically eliminating extensive arm and leg movements, the device was tested on a driving simulator platform. Human test subjects evaluated the mechatronic system with various control configurations through obstacle avoidance and city-road driving tests, and a conventional steering wheel and pedals were also used for comparison. Subjective and objective results from the tests demonstrate that the mobile driving interface with the proposed control scheme can enhance the driver's performance by up to 55.8% when compared to the traditional driving system during aggressive maneuvers.
The system's superior performance during certain vehicle maneuvers and the approval received from the participants demonstrate its potential as an alternative driving adaptation for disabled drivers. Third, a novel strategy is designed for trajectory control of a multi-section continuum robot in three-dimensional space to achieve accurate orientation, curvature, and section length tracking. The formulation connects the continuum manipulator's dynamic behavior to a virtual discrete-jointed robot whose degrees of freedom are directly mapped to those of a continuum robot section under the constant-curvature hypothesis. Based on this connection, a computed torque control architecture is developed for the virtual robot, for which inverse kinematics and dynamic equations are constructed and exploited, with appropriate transformations developed for implementation on the continuum robot. The control algorithm is validated in a realistic simulation and implemented on a six degree-of-freedom, two-section OctArm continuum manipulator. Both simulation and experimental results show that the proposed method can manage simultaneous extension/contraction, bending, and torsion actions on multi-section continuum robots with decent tracking performance (e.g. steady-state arc length and curvature tracking errors of 3.3 mm and 130 mm⁻¹, respectively). Last, semi-autonomous vehicles equipped with assistive control systems may experience degraded lateral behaviors when aggressive driver steering commands compete with high levels of autonomy. This challenge can be mitigated with effective operator intent recognition, which can configure automated systems in context-specific situations where the driver intends to perform a steering maneuver. In this article, an ensemble learning-based driver intent recognition strategy has been developed.
A nonlinear model predictive control algorithm was designed and implemented to generate haptic feedback for lateral vehicle guidance, assisting drivers in accomplishing their intended action. To validate the framework, operator-in-the-loop testing with 30 human subjects was conducted on a steer-by-wire platform with a virtual reality driving environment. The roadway scenarios included lane changes, obstacle avoidance, intersection turns, and highway exits. The automated system with learning-based driver intent recognition was compared to both an automated system with a finite state machine-based driver intent estimator and an automated system without any driver intent prediction across all driving events. Test results demonstrate that semi-autonomous vehicle performance can be enhanced by up to 74.1% with a learning-based intent predictor. The proposed holistic framework, which integrates human intelligence, machine learning algorithms, and vehicle control, can help solve the driver-system conflict problem, leading to safer vehicle operations.
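The abstract's third contribution names a computed torque (inverse dynamics) control architecture for the virtual discrete-jointed robot. The standard form of that law is τ = M(q)(q̈_des + K_d ė + K_p e) + C(q, q̇)q̇ + G(q): the model terms cancel the dynamics, leaving linear PD error dynamics. The single-joint sketch below illustrates only this generic law; the scalar model terms are placeholders, not the OctArm's actual dynamics.

```python
def computed_torque(q, qd, q_des, qd_des, qdd_des, m, c, g, kp, kd):
    """Computed-torque (inverse dynamics) control law for one joint.

    Illustrative sketch of the control architecture named in the abstract.
    m, c, g stand in for the inertia, Coriolis/damping, and gravity terms
    of the virtual discrete-jointed robot; values here are hypothetical.
    """
    e = q_des - q      # position tracking error
    ed = qd_des - qd   # velocity tracking error
    # Feedback linearization: cancel c*qd and g, impose PD error dynamics.
    return m * (qdd_des + kd * ed + kp * e) + c * qd + g

# Unit inertia, no Coriolis/gravity, unit position error at rest:
tau = computed_torque(q=0.0, qd=0.0, q_des=1.0, qd_des=0.0, qdd_des=0.0,
                      m=1.0, c=0.0, g=0.0, kp=4.0, kd=2.0)
print(tau)  # → 4.0, i.e. kp * e with zero velocity and acceleration terms
```

With an exact model, substituting τ into m·q̈ + c·q̇ + g = τ gives ë + k_d ė + k_p e = 0, so the tracking error decays according to the chosen gains.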