592 research outputs found

    Analysis of Disengagements in Semi-Autonomous Vehicles: Drivers' Takeover Performance and Operational Implications

    Get PDF
    This report analyzes the reactions of human drivers placed in simulated autonomous-technology disengagement scenarios. The study was executed in a human-in-the-loop setting, within a high-fidelity integrated car simulator capable of handling both manual and autonomous driving. A population of 40 individuals was tested, with control takeover quantified by: i) response times (considering steering, throttle, and braking inputs); ii) vehicle drift from the lane centerline after takeover, as well as overall (integral) drift over an S-turn curve, compared to a baseline obtained in manual driving; and iii) accuracy metrics to quantify human factors associated with the simulation experiment. The independent variables considered for the study were the age of the driver, the speed at the time of disengagement, and the time at which the disengagement occurred (i.e., how long automation had been engaged). The study shows that changes in vehicle speed significantly affect all the variables investigated, pointing to the importance of setting thresholds for the maximum operational speed of vehicles driven in autonomous mode when the human driver serves as back-up. The results show that establishing such an operational threshold could reduce the maximum drift and lead to better control during takeover, perhaps warranting a lower speed limit than that for conventional vehicles. With regard to the age variable, neither the response-time analysis nor the drift analysis provides support for any claim to limit the age of drivers of semi-autonomous vehicles.
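    The two takeover metrics described above could, for example, be computed from a simulator time series as in the sketch below. This is a hedged illustration only, not the study's actual code: the column names, units, and the input-detection threshold are assumptions.

```python
# Hedged sketch: takeover response time and lane-drift metrics from a simulator log.
# Column names and the 0.05 input threshold are illustrative assumptions.
import numpy as np
import pandas as pd

def takeover_metrics(log: pd.DataFrame, t_disengage: float,
                     input_threshold: float = 0.05) -> dict:
    """log columns: t [s], steer, throttle, brake (normalized inputs), lane_offset [m]."""
    after = log[log["t"] >= t_disengage]

    # (i) response time: delay until any steering/throttle/brake input exceeds the threshold
    active = (after[["steer", "throttle", "brake"]].abs() > input_threshold).any(axis=1)
    response_time = after.loc[active, "t"].min() - t_disengage

    # (ii) drift: peak distance from the lane centerline, plus the integral of |offset|
    # over the manoeuvre (trapezoidal rule), to be compared against a manual baseline
    t = after["t"].to_numpy()
    offset = after["lane_offset"].abs().to_numpy()
    max_drift = float(offset.max())
    integral_drift = float(np.sum(0.5 * (offset[1:] + offset[:-1]) * np.diff(t)))

    return {"response_time_s": response_time,
            "max_drift_m": max_drift,
            "integral_drift_m_s": integral_drift}
```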

    Human Factors Research on the Design of Automotive Head-Up Displays

    Get PDF
    Doctoral dissertation, Seoul National University Graduate School, Department of Industrial Engineering, College of Engineering, August 2020 (Advisor: Woojin Park). Head-up display (HUD) systems were introduced into the automobile industry as a means for improving driving safety. They superimpose safety-critical information on top of the driver's forward field of view and thereby help drivers keep their eyes forward while driving. Since their first introduction about three decades ago, automotive HUDs have been available in various commercial vehicles. Despite the long history and potential benefits of automotive HUDs, however, the design of useful automotive HUDs remains a challenging problem. In an effort to contribute to the design of useful automotive HUDs, this doctoral dissertation research conducted four studies. In Study 1, the functional requirements of automotive HUDs were investigated by reviewing the major automakers' automotive HUD products, academic research studies that proposed various automotive HUD functions, and previous research studies that surveyed drivers' HUD information needs. The review results indicated that: 1) the existing commercial HUDs perform largely the same functions as the conventional in-vehicle displays, 2) past research studies proposed various HUD functions for improving driver situation awareness and driving safety, 3) autonomous driving and other new technologies are giving rise to new HUD information, and 4) little research is currently available on HUD users' perceived information needs. Based on the review results, this study provides insights into the functional requirements of automotive HUDs and suggests future research directions for automotive HUD design. In Study 2, the interface design of automotive HUDs for communicating safety-related information was examined by reviewing the existing commercial HUDs and the display concepts proposed by academic research studies. Each display was analyzed in terms of its functions, behaviors, and structure. Related human factors display design principles and empirical findings on the effects of interface design decisions were also reviewed where information was available. The results indicated that: 1) information characteristics suitable for the contact-analog and unregistered display formats, respectively, are still largely unknown, 2) new types of displays could be developed by combining or mixing existing displays or display elements at both the information and interface element levels, and 3) the human factors display principles need to be applied appropriately to the situation and only to the extent that the resulting display respects the limitations of human information processing, with balance among the principles being important to an effective design. On the basis of the review results, this review suggests design possibilities and future research directions for the interface design of safety-related automotive HUD systems. In Study 3, automotive HUD-based take-over request (TOR) displays were developed and evaluated in terms of drivers' take-over performance and visual scanning behavior in a highly automated driving situation. Four types of TOR displays were comparatively evaluated through a driving simulator study: Baseline (an auditory beeping alert), Mini-map, Arrow, and Mini-map-and-Arrow. Baseline simply alerts the driver to an imminent take-over and was always included when the other three displays were provided. Mini-map provides situational information. Arrow presents the action direction for the take-over. Mini-map-and-Arrow provides the action direction together with the relevant situational information. This study also investigated the relationship between drivers' initial trust in the TOR displays and their take-over and visual scanning behavior. The results indicated that providing a machine-made decision combined with situational information, as in Mini-map-and-Arrow, yielded the best overall results in the take-over scenario. Drivers' initial trust in the TOR displays also had significant associations with their take-over and visual behavior: the higher-trust group primarily relied on the proposed TOR displays, while the lower-trust group tended to check the situational information more through the traditional displays, such as the side-view or rear-view mirrors. In Study 4, the effects of interactive HUD imagery location on driving and secondary task performance, driver distraction, preference, and workload associated with using a scrolling list while driving were investigated. A total of nine HUD imagery locations across the full windshield were examined through a driving simulator study. The results indicated that the HUD imagery location affected all the dependent measures, that is, driving and task performance, drivers' visual distraction, preference, and workload. Considering both objective and subjective evaluations, interactive HUDs should be placed near the driver's line of sight, especially near the bottom-left of the windshield.
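    As a rough illustration of the Study 3 trust analysis summarized above, the sketch below performs a median split on initial-trust ratings and compares take-over times between the higher- and lower-trust groups. The data, column names, and the choice of Welch's t-test are assumptions for illustration, not the dissertation's actual procedure.

```python
# Hedged sketch: median split on initial trust, then a group comparison of
# take-over time. All numbers below are made up for illustration.
import numpy as np
import pandas as pd
from scipy import stats

df = pd.DataFrame({
    "initial_trust":   [3.2, 4.5, 2.8, 4.9, 3.7, 4.1, 2.5, 4.8, 3.0, 4.4, 3.9, 2.9],
    "takeover_time_s": [2.9, 2.1, 3.4, 1.9, 2.6, 2.3, 3.6, 2.0, 3.1, 2.2, 2.5, 3.3],
})

# split drivers into higher- and lower-trust groups at the median trust rating
median_trust = df["initial_trust"].median()
df["trust_group"] = np.where(df["initial_trust"] >= median_trust, "higher", "lower")

# compare mean take-over time between the two groups (Welch's t-test)
hi = df.loc[df["trust_group"] == "higher", "takeover_time_s"]
lo = df.loc[df["trust_group"] == "lower", "takeover_time_s"]
t_stat, p_value = stats.ttest_ind(hi, lo, equal_var=False)

print(df.groupby("trust_group")["takeover_time_s"].mean())
print(f"Welch t = {t_stat:.2f}, p = {p_value:.3f}")
```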

    Is the driver ready to receive just car information in the windshield during manual and autonomous driving?

    Get PDF
    Automation is changing the world. As in aviation, car manufacturers are currently developing autonomous vehicles. However, the autonomy of these vehicles is not complete, and the driver is still needed at certain moments. How the transition between manual and autonomous driving is handled, and how this transition information is shown to the driver, is a challenge for ergonomics. New displays are being studied to facilitate these transitions. This study used a driving simulator to investigate whether augmented reality information can positively influence the user experience during manual and autonomous driving. We compared two ways of presenting this communication to the driver. The "AR concept" displays all the information on the windshield so that the driver can access it more easily. The "IC concept" displays the information that appears in today's cars, using the instrument cluster and the e-HUD. Results indicate that user experience (UX) is influenced by the concepts, with the "AR concept" providing a better UX in all transition states. In terms of confidence, the results also revealed higher scores for the "AR concept". The type of concept influenced neither the takeover times nor the takeover behavior. In terms of situational awareness (SA), the "AR concept" leaves drivers more aware during availability and activation of the function. This study provides implications for automotive companies developing the next generation of car displays.

    Virtual environment for designing and validating future on-board interfaces for autonomous vehicles

    Full text link
    This thesis presents a novel synthetic environment for supporting advanced exploration of user interfaces and interaction modalities for future transport systems. The main goal of the work is the definition of novel interface solutions designed to increase trust in self-driving vehicles. The basic idea is to give passengers insight into the information available to the Artificial Intelligence (AI) modules on board the car, including the driving behaviour of the vehicle and its decision making. Most currently existing academic and industrial testbeds and vehicular simulators are designed to reproduce with high fidelity the ergonomic aspects of the driving experience. However, they have a very low degree of realism with respect to the digital components of the various traffic scenarios, including the visuals of the driving simulator and the behaviours of other vehicles on the road and of pedestrians. High visual fidelity of the testbed is an important prerequisite for supporting the design and evaluation of future on-board interfaces. An innovative experimental testbed, based on the hyper-realistic video game GTA V, has been developed to satisfy this need. To showcase its experimental flexibility, a set of selected user studies presenting novel self-driving interfaces and associated user-experience results is described. These studies explore the capability of inducing trust in autonomous vehicles and examine head-up displays (HUDs), augmented reality (AR) and directional audio solutions. The work includes three core phases focusing on the development of software for the testbed, the definition of relevant interfaces and experiments, and focused testing with panels comprising different user demographics. The specific experimental testbed comprises: GTA V as the test environment, owing to its complex scenario and hyper-realistic graphics; a steering wheel and pedals for active driving; DeepGTA as the self-driving framework; and Tobii Eye Tracking as an input device for capturing user intentions. Specific investigations will focus on the design and exploration of a set of alternative visual feedback mechanisms (adopting AR visualizations) for conveying information about the surrounding environment and the AI's decision making. Their performance will be assessed with real users with respect to their capability to foster trust in the vehicle and to the understandability of the provided signals. Moreover, additional accessory studies will focus on the exploration of different designs for triggering the driving handover, i.e. the transfer of vehicle control from the AI to human drivers, which is a central problem in current embodiments of self-driving vehicles. Mateu Gisbert, C. (2018). Novel synthetic environment to design and validate future onboard interfaces for self-driving vehicles. http://hdl.handle.net/10251/112327

    AutoDRIVE: A Comprehensive, Flexible and Integrated Cyber-Physical Ecosystem for Enhancing Autonomous Driving Research and Education

    Full text link
    Prototyping and validating hardware-software components, sub-systems and systems within the intelligent transportation system-of-systems framework requires a modular yet flexible and open-access ecosystem. This work presents our attempt towards developing such a comprehensive research and education ecosystem, called AutoDRIVE, for synergistically prototyping, simulating and deploying cyber-physical solutions pertaining to autonomous driving as well as smart city management. AutoDRIVE features both software and hardware-in-the-loop testing interfaces with openly accessible scaled vehicle and infrastructure components. The ecosystem is compatible with a variety of development frameworks, and supports both single- and multi-agent paradigms through local as well as distributed computing. Most critically, AutoDRIVE is intended to be modularly expandable to explore emergent technologies, and this work highlights various complementary features and capabilities of the proposed ecosystem by demonstrating four such deployment use-cases: (i) autonomous parking using a probabilistic robotics approach for mapping, localization, path planning and control; (ii) behavioral cloning using computer vision and deep imitation learning; (iii) intersection traversal using vehicle-to-vehicle communication and deep reinforcement learning; and (iv) smart city management using vehicle-to-infrastructure communication and the internet of things.

    From Manual Driving to Automated Driving: A Review of 10 Years of AutoUI

    Full text link
    This paper gives an overview of the ten-year development of the papers presented at the International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutoUI) from 2009 to 2018. We categorize the topics into two main groups, namely, manual driving-related research and automated driving-related research. Within manual driving, we mainly focus on studies of user interfaces (UIs), driver states, augmented reality and head-up displays, and methodology; within automated driving, we discuss topics such as takeover, acceptance and trust, interacting with road users, UIs, and methodology. We also discuss the main challenges and future directions for AutoUI and offer a roadmap for research in this area.

    A Voice and Pointing Gesture Interaction System for Supporting Human Spontaneous Decisions in Autonomous Cars

    Get PDF
    Autonomous cars are expected to improve road safety, traffic and mobility. It is projected that in the next 20-30 years fully autonomous vehicles will be on the market. The advancement of research and development of this technology will allow the disengagement of humans from the driving task, which will become the responsibility of the vehicle intelligence. In this scenario, new vehicle interior designs are proposed, enabling more flexible human-vehicle interactions inside the car. In addition, as some important stakeholders propose, control elements such as the steering wheel and the accelerator and brake pedals may no longer be needed. However, this disengagement of user control is one of the main issues related to user acceptance of this technology. Users do not seem to be comfortable with the idea of giving all the decision power to the vehicle. In addition, there can be location-aware situations in which the user makes a spontaneous decision and requires some type of vehicle control, such as stopping at a particular point of interest or taking a detour from the pre-calculated autonomous route of the car. Vehicle manufacturers maintain the steering wheel as a control element, allowing the driver to take over the vehicle if needed or wanted, but this constrains the previously mentioned human-vehicle interaction flexibility. Thus, there is an unsolved dilemma between providing users enough control over the autonomous vehicle and its route so that they can make spontaneous decisions, and preserving interaction flexibility inside the car. This dissertation proposes the use of a voice and pointing-gesture human-vehicle interaction system to solve this dilemma. Voice and pointing gestures have been identified as natural interaction techniques to guide and command mobile robots, potentially providing the needed user control over the car; at the same time, they can be executed anywhere inside the vehicle, enabling interaction flexibility. The objective of this dissertation is to provide a strategy to support this system. For this, a method based on the intersection of pointing rays is developed for computing the point of interest (POI) that the user is pointing to. Simulation results show that this POI computation method outperforms the traditional ray-casting-based approach by 76.5% in cluttered environments and 36.25% in combined cluttered and non-cluttered scenarios. The whole system is developed and demonstrated using a robotics simulator framework. The simulations show how voice and pointing commands performed by the user update the predefined autonomous path, based on the recognized command semantics. In addition, a dialog feedback strategy is proposed to resolve conflicting situations such as ambiguity in the POI identification. This additional step is able to resolve the previously mentioned POI computation inaccuracies and allows the user to confirm, correct or reject the performed commands in case the system misunderstands them.
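    The abstract does not detail the pointing-ray intersection method, so the snippet below is only a hedged sketch of the general idea: it estimates a pointed-at POI as the least-squares "intersection" of several pointing rays, i.e. the 3D point minimizing the summed squared distance to all rays. The ray origins and directions are illustrative values, not data from the dissertation.

```python
# Hedged sketch: least-squares intersection of pointing rays for POI estimation.
import numpy as np

def least_squares_poi(origins, directions):
    """origins: (n,3) ray start points; directions: (n,3) pointing directions."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(np.asarray(origins, float), np.asarray(directions, float)):
        d = d / np.linalg.norm(d)          # unit pointing direction
        P = np.eye(3) - np.outer(d, d)     # projector onto the plane normal to d
        A += P                             # accumulate normal equations
        b += P @ o
    return np.linalg.solve(A, b)           # point closest to all rays

# two pointing samples taken a moment apart (e.g., as the car moves)
origins = [[0.0, 0.0, 1.2], [1.0, 0.0, 1.2]]
directions = [[10.0, 5.0, 0.0], [9.0, 5.2, 0.0]]
print(least_squares_poi(origins, directions))  # approximate POI location
```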

    License to Supervise: Influence of Driving Automation on Driver Licensing

    Get PDF
    Using highly automated vehicles while a driver remains responsible for safe driving places new and demanding requirements on the human operator. This is because the automation creates a gap between drivers' responsibility and the human capability to take that responsibility, especially for unexpected or time-critical transitions of control. This gap is not being addressed by current practices of driver licensing. Based on a literature review, this research collects requirements on drivers for enabling safe transitions of control attuned to human capabilities. This knowledge is intended to help system developers and authorities identify the requirements on human operators to (re)take responsibility for safe driving after automation.

    Exploring safer visual feedback in human-machine handover in highly autonomous vehicles

    Get PDF
    Driving is becoming increasingly automated, and the automated driving system is gradually replacing the driver, which will inevitably have a significant impact on the driving experience. This study investigates the design of a dashboard for highly automated vehicles that would provide the driver with relevant information during the human-machine handover. After reviewing previous studies, analyzing the state of the art and generating user scenarios, we developed design guidelines, prototypes and user-experience videos. The video served as a "research instrument" for testing with users and for exploring and learning about consequences and interpretations. The test results suggest that the mental workload the task imposes on the user tends to decrease from automation level 3 to level 4. The reduced workload can ensure more effective alerts and alarms, which can potentially make driving safer. Regarding the implementation of the design, conclusions are drawn mainly on three aspects: the necessity of combining several modalities at automation level 4, the need to redefine the color of the assisting AR bars we designed because the red and orange ones were indistinguishable, and the disagreement on whether and how to display speed information at handover. The practice and application of our prototype in these scenarios supports previous studies and provides a reliable, creative solution for other researchers and designers.