
    Understanding and ameliorating non-linear phase and amplitude responses in AMCW Lidar

    Amplitude modulated continuous wave (AMCW) lidar systems commonly suffer from non-linear phase and amplitude responses due to a number of known factors such as aliasing and multipath interference. In order to produce useful range and intensity information it is necessary to remove these perturbations from the measurements. We review the known causes of non-linearity, namely aliasing, temporal variation in correlation waveform shape and mixed pixels/multipath interference. We also introduce other sources of non-linearity, including crosstalk, modulation waveform envelope decay and non-circularly symmetric noise statistics, that have been ignored in the literature. An experimental study is conducted to evaluate techniques for mitigation of non-linearity, and it is found that harmonic cancellation provides a significant improvement in phase and amplitude linearity.
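    As an illustration of the aliasing mechanism and of harmonic cancellation, the following minimal Python sketch (harmonic amplitudes and step counts are illustrative assumptions, not values from the paper) samples an idealized correlation waveform at four phase steps, shows the cyclic phase error caused by the aliased 3rd and 5th harmonics, and cancels it by averaging two acquisitions whose sampling phase differs by pi/4.

```python
import numpy as np

# Idealized AMCW correlation waveform: fundamental plus odd harmonics
# (amplitudes are illustrative assumptions, not measured values).
HARMONICS = {1: 1.0, 3: 0.15, 5: 0.05}

def correlation_samples(true_phase, n_steps=4, offset=0.0):
    """Sample the correlation waveform at n_steps equally spaced phase
    steps, optionally shifted by `offset` (the harmonic-cancellation shift)."""
    theta = 2.0 * np.pi * np.arange(n_steps) / n_steps + offset
    return sum(a * np.cos(k * (theta - true_phase)) for k, a in HARMONICS.items())

def phasor(samples, offset=0.0):
    """DFT bin 1 of the samples, rotated back by the sampling offset."""
    n = len(samples)
    b1 = np.sum(samples * np.exp(-1j * 2.0 * np.pi * np.arange(n) / n))
    return b1 * np.exp(-1j * offset)

def wrap(err):
    return (err + np.pi) % (2.0 * np.pi) - np.pi

true_phase = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)

# Plain 4-step estimate: harmonics 3 and 5 alias onto the fundamental bin
# and produce a cyclic phase error (one visible source of non-linearity).
plain = np.array([-np.angle(phasor(correlation_samples(p))) for p in true_phase])

# Harmonic cancellation: average two acquisitions whose sampling phase
# differs by pi/4; the aliased 3rd and 5th harmonics cancel in the average.
cancelled = np.array([
    -np.angle(0.5 * (phasor(correlation_samples(p))
                     + phasor(correlation_samples(p, offset=np.pi / 4), offset=np.pi / 4)))
    for p in true_phase])

print("max |phase error|, plain 4-step:", np.max(np.abs(wrap(plain - true_phase))))
print("max |phase error|, cancelled   :", np.max(np.abs(wrap(cancelled - true_phase))))
```

    In this toy model the cancelled estimate is linear to numerical precision, while the plain 4-step estimate carries a periodic error whose size scales with the harmonic content of the correlation waveform.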

    A New Vehicle Localization Scheme Based on Combined Optical Camera Communication and Photogrammetry

    The demand for autonomous vehicles is increasing gradually owing to their enormous potential benefits. However, several challenges, such as vehicle localization, are involved in the development of autonomous vehicles. A simple and secure algorithm for vehicle positioning is proposed herein that does not require massive modification of the existing transportation infrastructure. For vehicle localization, vehicles on the road are classified into two categories: host vehicles (HVs), which estimate the positions of other vehicles, and forwarding vehicles (FVs), which move in front of the HVs. An FV transmits modulated data from its tail (or back) light, and the camera of the HV receives that signal using optical camera communication (OCC). In addition, streetlight (SL) data are used to ensure the position accuracy of the HV. Determining the HV position minimizes the relative position variation between the HV and the FV. Using photogrammetry, the distance between the FV or SL and the HV camera is calculated from the image area it occupies on the image sensor. By comparing the change in distance between the HV and the SLs with the change in distance between the HV and the FV, the positions of the FVs are determined. The performance of the proposed technique is analyzed, and the results indicate a significant improvement. Experimental distance measurements validated the feasibility of the proposed scheme.
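    The distance-from-image-area step can be sketched with a pinhole-camera relation: a planar target of physical area A at distance d projects to roughly A * f^2 / d^2 pixels (f in pixels), so d is approximately f * sqrt(A / a). The helper below uses this simplified relation with made-up values; the paper's exact photogrammetric formulation may differ.

```python
import math

def distance_from_image_area(real_area_m2, image_area_px, focal_length_px):
    """Pinhole-camera estimate of target distance from its projected area.

    A planar target of physical area A (m^2) at distance d projects to
    roughly A * f^2 / d^2 pixels, so d ~ f * sqrt(A / a).
    """
    return focal_length_px * math.sqrt(real_area_m2 / image_area_px)

# Illustrative values only (not from the paper): a 0.02 m^2 tail light
# occupying 180 px^2, seen by a camera with a 1400 px focal length.
d_fv = distance_from_image_area(0.02, 180.0, 1400.0)
print(f"estimated HV-to-FV distance: {d_fv:.1f} m")
```

    Tracking how this estimate changes over successive frames, for both the FV and the SLs, is what the scheme compares to resolve the FV positions.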

    Deep Drone Racing: From Simulation to Reality with Domain Randomization

    Dynamically changing environments, unreliable state estimation, and operation under severe resource constraints are fundamental challenges that limit the deployment of small autonomous drones. We address these challenges in the context of autonomous, vision-based drone racing in dynamic environments. A racing drone must traverse a track with possibly moving gates at high speed. We enable this functionality by combining the performance of a state-of-the-art planning and control system with the perceptual awareness of a convolutional neural network (CNN). The resulting modular system is both platform- and domain-independent: it is trained in simulation and deployed on a physical quadrotor without any fine-tuning. The abundance of simulated data, generated via domain randomization, makes our system robust to changes of illumination and gate appearance. To the best of our knowledge, our approach is the first to demonstrate zero-shot sim-to-real transfer on the task of agile drone flight. We extensively test the precision and robustness of our system, both in simulation and on a physical platform, and show significant improvements over the state of the art.
    Comment: Accepted as a Regular Paper to the IEEE Transactions on Robotics Journal. arXiv admin note: substantial text overlap with arXiv:1806.0854
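    A minimal sketch of the domain-randomization idea (the parameter set and ranges below are assumptions for illustration, not the authors' actual simulation configuration): each simulated training rollout draws illumination and gate appearance from broad distributions, so the CNN never overfits to a single visual style.

```python
import random
from dataclasses import dataclass

@dataclass
class SceneParams:
    """One randomized simulation scene (fields are illustrative only)."""
    light_intensity: float      # ambient illumination scale
    light_azimuth_deg: float    # light direction
    gate_hue: float             # gate colour shift
    gate_width_m: float         # gate size jitter
    background_id: int          # which backdrop texture to use

def sample_scene(rng: random.Random) -> SceneParams:
    """Domain randomization: draw scene appearance from broad ranges so a
    perception network trained only in simulation sees wide visual variation."""
    return SceneParams(
        light_intensity=rng.uniform(0.3, 1.5),
        light_azimuth_deg=rng.uniform(0.0, 360.0),
        gate_hue=rng.uniform(-0.2, 0.2),
        gate_width_m=rng.uniform(1.2, 1.8),
        background_id=rng.randrange(20),
    )

rng = random.Random(0)
for scene in (sample_scene(rng) for _ in range(5)):
    print(scene)   # each rollout would render and fly in a different-looking scene
```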