3,536 research outputs found

    Vision for Looking at Traffic Lights: Issues, Survey, and Perspectives

    Get PDF

    Real-Time Traffic Light Recognition Based on C-HOG Features

    Get PDF
    This paper proposes a real-time traffic light detection and recognition algorithm for intelligent vehicles, based on C-HOG features (combined color and HOG features) and a Support Vector Machine (SVM). The algorithm first extracts red and green regions from the video and screens out candidate areas. It then extracts C-HOG features for each class of light and finally trains an SVM classifier for the corresponding light categories. Accurate real-time status information is obtained from the classifier's decision function, and experimental results confirm both the accuracy and the real-time performance of the approach.
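The abstract's pipeline (color-region extraction, then HOG-style features, then classification) can be sketched in a few lines. The exact color-space rule and feature layout are not specified in the abstract, so the mask threshold and the "color statistic + orientation histogram" feature below are illustrative assumptions, not the paper's method:

```python
import numpy as np

def color_mask(img, channel=0, thresh=150):
    # Hypothetical red-region mask: dominant-channel threshold (the paper's
    # exact color-space rule is not given in the abstract).
    c = img[..., channel].astype(float)
    others = np.delete(img, channel, axis=-1).astype(float).max(axis=-1)
    return (c > thresh) & (c > others + 30)

def hog_like(gray, bins=9):
    # Coarse HOG-style descriptor: one unsigned-orientation histogram,
    # weighted by gradient magnitude, over the whole patch.
    gy, gx = np.gradient(gray.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)
    hist, _ = np.histogram(ang, bins=bins, range=(0, np.pi), weights=mag)
    return hist / (hist.sum() + 1e-9)

def c_hog_feature(img, bins=9):
    # "C-HOG" sketched here as: color-mask coverage + HOG histogram.
    mask = color_mask(img)
    gray = img.mean(axis=-1)
    return np.concatenate([[mask.mean()], hog_like(gray, bins)])

img = np.zeros((32, 32, 3), dtype=np.uint8)
img[8:24, 8:24, 0] = 200          # synthetic red blob
feat = c_hog_feature(img)
print(feat.shape)                 # (10,): 1 color stat + 9 orientation bins
```

In the full system, vectors like `feat` for each candidate region would be fed to an SVM trained per light category; the decision-function score then gives the recognition result.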

    Pilot Visual Detection of Small Unmanned Aircraft on Final Approach during Nighttime Conditions

    Get PDF
    In December 2020, the Federal Aviation Administration (FAA) announced the release of a new final rule permitting operators of small unmanned aircraft systems (sUAS) to perform routine night operations. Public comments on the Notice of Proposed Rulemaking indicated potential safety concerns regarding a pilot’s ability to spot a low-altitude sUAS during nighttime conditions. Leveraging data from the FAA’s UAS Sighting Report Database, the research team evaluated the significance of aircraft encounters with UAS at night. Researchers conducted an in-flight experiment in which 10 pilots performed an instrument approach to an airport during nighttime conditions while a multi-rotor sUAS presented a potential collision hazard. The sUAS was equipped with lighting visible for 3 miles with a sufficient flash rate to avoid a collision, as specified by the new regulation. Participants performed five approaches, with the sUAS flying different scripted encounter profiles. Participants were asked to indicate when they visually spotted the sUAS, with sighting data recorded by an onboard observer. Geolocation information from the aircraft and the sUAS was compared at the time of each reported sighting to assess visibility distance and orientation. The sUAS was successfully spotted during 30 percent (n = 12) of the testing passes. Hovering sUAS were spotted at the same rate as moving sUAS; however, sUAS in motion were spotted at a much greater range. Researchers noted that disproportionately higher spotting rates occurred when the sUAS was oriented on the starboard side of the aircraft vs. the port side. It is believed that airport lighting systems may have obscured or otherwise camouflaged port-side sUAS encounters. When asked to estimate distance to an encountered sUAS, most participants underestimated, perceiving the sUAS to be much closer than it actually was.
Additionally, the researchers assessed the potential for the participants to initiate evasive maneuvers, based on the distance and closure rate of the aircraft and sUAS at the time of sighting. Based on the FAA’s Aircraft Identification and Reaction Time Chart, collision avoidance would have been successful during only 15 percent of encounters (n = 6). The research team recommends that Remote Pilots employ vigilant traffic awareness during nighttime operations, leverage ADS-B (In) technology, and monitor Common Traffic Advisory Frequencies to maintain situational awareness, particularly when operating in proximity to airports.
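The evasive-maneuver assessment reduces to a simple comparison: the time to collision at the moment of sighting must exceed the pilot's total see-and-avoid budget. A minimal sketch, where the 12.5-second budget is an assumed placeholder rather than the value actually read off the FAA chart:

```python
# Back-of-envelope evasion check: distance and closure rate at sighting
# yield a time to collision, which is compared against the pilot's total
# recognize-decide-react budget. REACTION_BUDGET_S is an illustrative
# assumption, not the figure from the FAA Aircraft Identification and
# Reaction Time Chart.
REACTION_BUDGET_S = 12.5

def can_evade(sighting_distance_m, closure_rate_mps):
    time_to_collision = sighting_distance_m / closure_rate_mps
    return time_to_collision > REACTION_BUDGET_S

# e.g. approach speed ~60 kt (~31 m/s) toward a hovering sUAS:
print(can_evade(250, 31.0))   # ~8 s to impact: too late
print(can_evade(500, 31.0))   # ~16 s: enough margin
```

This illustrates why sightings of moving sUAS at greater range matter: range at sighting, not sighting alone, determines whether avoidance is possible.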

    Tracking both pose and status of a traffic light via an Interacting Multiple Model filter

    Get PDF
    Whether for driver assistance systems or autonomous vehicles, detecting traffic lights (status and pose) is required when Intelligent Transport Systems go downtown. Because detection algorithms can still misclassify the traffic light status, this paper proposes a solution that largely avoids this problem. An Interacting Multiple Model (IMM) filter is used to track both the position and the status of a traffic light over time, increasing traffic light recognition performance for automation purposes.
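The key idea, filtering noisy per-frame status classifications through a model of how the light actually behaves, can be sketched with the discrete half of an IMM-style filter: a mode-probability update over the hidden status. The transition and likelihood values below are illustrative assumptions, not the paper's tuning:

```python
import numpy as np

# Simplified mode-probability update: the light status (red, amber, green)
# is a hidden mode; per-frame classifier outputs are noisy observations.
STATES = ["red", "amber", "green"]
T = np.array([[0.90, 0.05, 0.05],   # a red light tends to stay red
              [0.10, 0.80, 0.10],
              [0.05, 0.05, 0.90]])
# P(classifier output column | true state row): 80%-accurate detector (assumed)
L = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])

def step(belief, observed_idx):
    predicted = T.T @ belief                  # prediction / mode mixing
    updated = L[:, observed_idx] * predicted  # weight by observation likelihood
    return updated / updated.sum()

belief = np.full(3, 1 / 3)
# One frame misclassified as green (index 2) barely moves the belief
# when the surrounding frames agree on red (index 0).
for obs in [0, 0, 2, 0, 0]:
    belief = step(belief, obs)
print(STATES[int(belief.argmax())])   # "red"
```

The full IMM filter additionally runs a bank of motion models to track the light's pose, mixing their state estimates with these same mode probabilities; the sketch keeps only the status dimension.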

    Predicting Inattentional Blindness with Pupillary Response in a Simulated Flight Task

    Get PDF
    Inattentional blindness (IB) is the failure of observers to notice the presence of a clearly viewable but unexpected visual event when attentional resources are diverted elsewhere. Knowing when an operator is unable to respond to or detect an unexpected event may help improve safety during task performance. Unfortunately, it is difficult to predict when such failures might occur. The current study was a secondary analysis of data collected in the Human and Autonomous Vehicle Systems Laboratory at NASA Langley Research Center. Specifically, 60 subjects (29 male; normal or corrected-to-normal vision; mean age 34.5 years, SD = 13.3) were randomly assigned to one of three automation conditions (full automation, partial automation, and full manual) and took part in a simulated flight landing task. The dependent variable was the detection or non-detection of an IB occurrence (a truck on the landing runway). Scores on the NASA-TLX workload rating scale varied significantly by automation condition: the full automation condition reported the lowest subjective task load, followed by partial automation and then the manual condition. IB detection varied significantly across automation conditions. The moderate-workload partial automation condition exhibited the lowest likelihood of IB occurrence, while the low-workload full automation condition did not differ significantly from the manual condition. Subjects who reported higher task demand had increased pupil dilation, and subjects with larger pupil dilation were more likely to detect the runway incursion. These results show that eye tracking may be used to identify periods of reduced unexpected-visual-stimulus detection for possible real-time IB mitigation.
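The reported pupil-dilation effect is directional: larger dilation, higher detection likelihood. A toy illustration on synthetic data, where the logistic relationship and its coefficients are assumptions standing in for whatever analysis the study actually used:

```python
import numpy as np

# Synthetic illustration of the reported relationship: trials with larger
# (standardized) pupil dilation detect the runway incursion more often.
# Coefficients 0.5 and 1.5 are arbitrary choices for the toy model.
rng = np.random.default_rng(0)
dilation = rng.normal(0.0, 1.0, 200)               # standardized pupil dilation
p_detect = 1 / (1 + np.exp(-(0.5 + 1.5 * dilation)))
detected = rng.random(200) < p_detect

# Detection rate among high- vs low-dilation trials
high = detected[dilation > 0].mean()
low = detected[dilation <= 0].mean()
print(high > low)   # True: dilation tracks detection in this toy sample
```

A real-time mitigation scheme would invert this: monitor dilation online and flag low-dilation periods as windows of elevated IB risk.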