5,456 research outputs found

    A Multimodal Approach for Monitoring Driving Behavior and Emotions

    Get PDF
    Studies have indicated that emotions can be significantly influenced by environmental factors; in the driving context, these factors can shape drivers' emotional state and, accordingly, their driving behavior. Furthermore, as demand for autonomous vehicles is expected to increase significantly within the next decade, a proper understanding of drivers'/passengers' emotions, behavior, and preferences will be needed in order to create an acceptable level of trust with humans. This paper proposes a novel semi-automated approach for understanding the effect of environmental factors on drivers' emotions and behavioral changes through a naturalistic driving study. The setup includes a frontal road camera and a facial camera, a smartwatch for tracking physiological measurements, and a Controller Area Network (CAN) serial data logger. The results suggest that the driver's affect is highly influenced by road type and weather conditions, which have the potential to change driving behavior. For instance, when emotional metrics are defined as valence and engagement, the results reveal significant differences in drivers' emotions across weather conditions and road types. Participants' engagement was higher in rainy and clear weather compared to cloudy weather. Moreover, engagement was higher on city streets and highways compared to one-lane roads and two-lane highways.
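
    A minimal sketch of how such a comparison could be run on the collected data, assuming per-trip engagement and valence scores tagged with weather and road type; the file name, column names, and the one-way ANOVA are illustrative assumptions, not the paper's exact analysis pipeline.

    ```python
    # Hypothetical sketch: compare mean driver engagement across weather conditions.
    # File layout and column names are assumed for illustration only.
    import pandas as pd
    from scipy import stats

    # Assumed columns: trip_id, weather ('clear' | 'rainy' | 'cloudy'),
    # road_type, engagement (0-100), valence (-1..1)
    df = pd.read_csv("naturalistic_driving_emotions.csv")

    # One-way ANOVA: does engagement differ across weather conditions?
    groups = [g["engagement"].values for _, g in df.groupby("weather")]
    f_stat, p_value = stats.f_oneway(*groups)
    print(f"Engagement vs. weather: F={f_stat:.2f}, p={p_value:.4f}")

    # Per-condition means show the direction of the effect
    print(df.groupby("weather")["engagement"].mean().round(2))
    ```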

    A Neural Model of How the Brain Computes Heading from Optic Flow in Realistic Scenes

    Full text link
    Animals avoid obstacles and approach goals in novel cluttered environments using visual information, notably optic flow, to compute heading, or direction of travel, with respect to objects in the environment. We present a neural model of how heading is computed that describes interactions among neurons in several visual areas of the primate magnocellular pathway, from retina through V1, MT+, and MSTd. The model produces outputs which are qualitatively and quantitatively similar to human heading estimation data in response to complex natural scenes. The model estimates heading to within 1.5° in random dot or photo-realistically rendered scenes and within 3° in video streams from driving in real-world environments. Simulated rotations of less than 1 degree per second do not affect model performance, but faster simulated rotation rates deteriorate performance, as in humans. The model is part of a larger navigational system that identifies and tracks objects while navigating in cluttered environments.
    National Science Foundation (SBE-0354378, BCS-0235398); Office of Naval Research (N00014-01-1-0624); National Geospatial-Intelligence Agency (NMA201-01-1-2016)
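
    As an illustration of the underlying geometry (not the paper's neural V1/MT+/MSTd model), the sketch below estimates heading as the focus of expansion of a purely translational optic-flow field via least squares; the function name and synthetic test data are assumptions.

    ```python
    # Illustrative only: classical least-squares focus-of-expansion (FOE) estimate
    # from a translational flow field. Heading corresponds to the image point
    # from which translational flow vectors radiate.
    import numpy as np

    def estimate_foe(points, flows):
        """points: (N, 2) image positions; flows: (N, 2) flow vectors (pure translation)."""
        d = flows / (np.linalg.norm(flows, axis=1, keepdims=True) + 1e-9)
        n = np.stack([-d[:, 1], d[:, 0]], axis=1)   # normal to each flow direction
        # Each flow vector constrains the FOE x by n_i . x = n_i . p_i
        b = np.einsum("ij,ij->i", n, points)
        foe, *_ = np.linalg.lstsq(n, b, rcond=None)
        return foe

    # Synthetic check: flow radiating from a known FOE should be recovered.
    rng = np.random.default_rng(0)
    true_foe = np.array([320.0, 240.0])
    pts = rng.uniform(0, 640, size=(200, 2))
    flow = (pts - true_foe) * 0.05                  # pure expansion away from the FOE
    print(estimate_foe(pts, flow))                  # ~ [320. 240.]
    ```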

    Human Requirements Validation for Complex Systems Design

    Get PDF
    One of the most critical phases in complex systems design is the requirements engineering process. During this phase, system designers need to accurately elicit, model and validate the desired system based on user requirements. Smart driver assistive technologies (SDAT) belong to a class of complex systems that are used to alleviate accident risk by improving situation awareness, reducing driver workload or enhancing driver attentiveness. Such systems aim to draw drivers' attention to critical information cues that improve decision making. Discovering the requirements for such systems necessitates a holistic approach that addresses not only functional and non-functional aspects but also human requirements such as drivers' situation awareness and workload. This work describes a simulation-based user requirements discovery method. It utilizes the benefits of a modular virtual reality simulator to model driving conditions and discover user needs that subsequently inform the design of prototype SDATs that exploit the augmented reality method. Herein, we illustrate the development of the simulator, the elicitation of user needs through an experiment, and the prototype SDAT designs using the Unity game engine.

    A Fuzzy-Logic Approach to Dynamic Bayesian Severity Level Classification of Driver Distraction Using Image Recognition

    Get PDF
    Detecting and classifying driver distractions is crucial in the prevention of road accidents. These distractions impact both driver behavior and vehicle dynamics. Knowing the degree of driver distraction can aid in accident prevention techniques, including transitioning control to a Level 4 semi-autonomous vehicle when a high distraction severity level is reached. Thus, enhancement of Advanced Driving Assistance Systems (ADAS) is a critical component in the safety of vehicle drivers and other road users. In this paper, a new methodology is introduced that uses an expert knowledge rule system to predict the severity of distraction in a contiguous set of video frames, using the Naturalistic Driving American University of Cairo (AUC) Distraction Dataset. From a multi-class distraction system comprising face orientation, drivers' activities, hands, and previous driver distraction, a severity classification model is developed as a discrete dynamic Bayesian (DDB) model. Furthermore, a Mamdani-based fuzzy system was implemented to map the multi-class distractions into a severity level of safe, careless or dangerous driving. Thus, if a high level of severity is reached, the semi-autonomous vehicle will take control. The results further show that some instances of driver distraction may quickly transition from careless to dangerous driving in a multi-class distraction context.
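
    A minimal Mamdani-style sketch of the severity idea described above, assuming two illustrative inputs (a per-frame distraction probability and its persistence over consecutive frames); the membership shapes, rule base, and max-of-firing-strength decision are assumptions, not the paper's actual rule system.

    ```python
    # Hypothetical Mamdani-style fuzzy severity sketch (not the paper's rule base).
    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def severity(distraction_prob, persistence_frames):
        # Fuzzify the inputs
        low   = tri(distraction_prob, -0.4, 0.0, 0.5)
        med   = tri(distraction_prob,  0.2, 0.5, 0.8)
        high  = tri(distraction_prob,  0.5, 1.0, 1.4)
        short = tri(persistence_frames, -10, 0, 15)
        long_ = tri(persistence_frames,  10, 30, 50)

        # Rule base: min for AND, max to aggregate rules per output class
        scores = {
            "safe":      max(low, min(med, short)),
            "careless":  max(min(med, long_), min(high, short)),
            "dangerous": min(high, long_),
        }
        return max(scores, key=scores.get), scores

    print(severity(0.9, 40))   # -> ('dangerous', ...)
    print(severity(0.3, 5))    # -> ('safe', ...)
    ```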