
    DragonflEYE: a passive approach to aerial collision sensing

    This dissertation describes the design, development, and testing of a passive wide-field optical aircraft collision sensing instrument titled "DragonflEYE". Such a "sense-and-avoid" instrument is desired for autonomous unmanned aerial systems operating in civilian airspace. The instrument was configured as a network of smart camera nodes and implemented using commercial, off-the-shelf components. An end-to-end imaging train model was developed and important figures of merit were derived. Transfer functions arising from intermediate mediums were discussed and their impact assessed. Multiple prototypes were developed. The expected performance of the instrument was iteratively evaluated on the prototypes, beginning with modeling activities followed by laboratory tests, ground tests, and flight tests. A prototype was mounted on a Bell 205 helicopter for flight tests, with a Bell 206 helicopter acting as the target. Raw imagery was recorded alongside ancillary aircraft data and stored for offline assessment of performance. The "range at first detection" (R0) is presented as a robust measure of sensor performance, based on a suitably defined signal-to-noise ratio. The analysis treats target radiance fluctuations, ground clutter, atmospheric effects, platform motion, and random noise elements. Under the measurement conditions, R0 exceeded flight crew acquisition ranges. Secondary figures of merit are also discussed, including time to impact, target size and growth, and the impact of resolution on detection range. The hardware was structured to facilitate a real-time hierarchical image-processing pipeline, with selected image-processing techniques introduced. In particular, the height of an observed event above the horizon compensates for angular motion of the helicopter platform.
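    The range-at-first-detection idea can be sketched numerically. The sketch below is an illustration only: the inverse-square SNR falloff, the reference SNR, and the detection threshold are assumed values, not the dissertation's imaging-train model.

```python
# Illustrative sketch (not the dissertation's actual model): "range at first
# detection" R0 is taken as the largest range at which a simple SNR-vs-range
# curve still meets a detection threshold. The 1/R^2 falloff and all numbers
# below are assumptions for illustration only.

def snr_at_range(r_m, snr_ref=100.0, r_ref_m=1000.0):
    """Toy SNR model: signal falls off as 1/R^2 from a reference point."""
    return snr_ref * (r_ref_m / r_m) ** 2

def range_at_first_detection(ranges_m, threshold=6.0):
    """Largest range whose SNR meets the threshold; None if never detected."""
    detected = [r for r in ranges_m if snr_at_range(r) >= threshold]
    return max(detected) if detected else None

ranges = [500.0 * k for k in range(1, 21)]  # candidate ranges, 0.5 km to 10 km
r0 = range_at_first_detection(ranges, threshold=6.0)
```

    With these assumed values the detection threshold is last met at 4 km; a real evaluation would replace the toy SNR model with the measured signal chain.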

    Confronting the Challenge of Modeling Cloud and Precipitation Microphysics

    In the atmosphere, microphysics refers to the microscale processes that affect cloud and precipitation particles and is a key linkage among the various components of Earth's atmospheric water and energy cycles. The representation of microphysical processes in models continues to pose a major challenge, leading to uncertainty in numerical weather forecasts and climate simulations. In this paper, the problem of treating microphysics in models is divided into two parts: (i) how to represent the population of cloud and precipitation particles, given the impossibility of simulating all particles individually within a cloud, and (ii) uncertainties in the microphysical process rates owing to fundamental gaps in knowledge of cloud physics. The recently developed Lagrangian particle-based method is advocated as a way to address several conceptual and practical challenges of representing particle populations using traditional bulk and bin microphysics parameterization schemes. For addressing critical gaps in cloud physics knowledge, sustained investment for observational advances from laboratory experiments, new probe development, and next-generation instruments in space is needed. Greater emphasis on laboratory work, which has apparently declined over the past several decades relative to other areas of cloud physics research, is argued to be an essential ingredient for improving process-level understanding. More systematic use of natural cloud and precipitation observations to constrain microphysics schemes is also advocated. Because it is generally difficult to quantify individual microphysical process rates from these observations directly, this presents an inverse problem that can be viewed from the standpoint of Bayesian statistics. Following this idea, a probabilistic framework is proposed that combines elements from statistical and physical modeling. Besides providing rigorous constraint of schemes, there is an added benefit of quantifying uncertainty systematically. Finally, a broader hierarchical approach is proposed to accelerate improvements in microphysics schemes, leveraging the advances described in this paper related to process modeling (using Lagrangian particle-based schemes), laboratory experimentation, cloud and precipitation observations, and statistical methods.
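    The Bayesian inverse-problem framing can be illustrated with a toy example. Everything below is an assumption for illustration (a single rate parameter k, a linear forward model, Gaussian observation noise, and a flat prior evaluated on a grid); it is not the paper's scheme.

```python
# Toy Bayesian constraint of one "process rate" parameter k: observations are
# modeled as y = k * t + Gaussian noise, and the posterior over k is evaluated
# on a grid with a flat prior. All models and numbers are illustrative.
import math

def posterior_on_grid(obs, times, k_grid, sigma=0.5):
    """Unnormalized Gaussian likelihood x flat prior, normalized over the grid."""
    post = []
    for k in k_grid:
        loglik = sum(-0.5 * ((y - k * t) / sigma) ** 2 for y, t in zip(obs, times))
        post.append(math.exp(loglik))
    z = sum(post)
    return [p / z for p in post]

times = [1.0, 2.0, 3.0, 4.0]
obs = [2.1, 3.9, 6.2, 7.8]                     # synthetic data near k = 2
k_grid = [0.5 + 0.1 * i for i in range(31)]    # candidate k in [0.5, 3.5]
weights = posterior_on_grid(obs, times, k_grid)
k_map = k_grid[weights.index(max(weights))]    # maximum a posteriori estimate
```

    The spread of `weights` around `k_map` is the systematic uncertainty quantification the abstract refers to; a realistic scheme would replace the linear forward model with a microphysical process model.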

    Aeolus Ocean -- A simulation environment for the autonomous COLREG-compliant navigation of Unmanned Surface Vehicles using Deep Reinforcement Learning and Maritime Object Detection

    Heading towards navigational autonomy in unmanned surface vehicles (USVs) in the maritime sector can fundamentally lead towards safer waters as well as reduced operating costs, while also providing a range of exciting new capabilities for oceanic research, exploration, and monitoring. However, achieving such a goal is challenging. USV control systems must, safely and reliably, be able to adhere to the international regulations for preventing collisions at sea (COLREGs) in encounters with other vessels as they navigate to a given waypoint while being affected by realistic weather conditions, either during the day or at night. To deal with the multitude of possible scenarios, it is critical to have a virtual environment that is able to replicate the realistic operating conditions USVs will encounter, before they can be implemented in the real world. Such "digital twins" form the foundations upon which Deep Reinforcement Learning (DRL) and Computer Vision (CV) algorithms can be used to develop and guide USV control systems. In this paper, we describe the novel development of a COLREG-compliant DRL-based collision-avoidant navigational system with CV-based awareness in a realistic ocean simulation environment. The performance of the trained autonomous agents resulting from this approach is evaluated in several successful navigations to set waypoints in both open-sea and coastal encounters with other vessels. A binary executable version of the simulator with trained agents is available at https://github.com/aavek/Aeolus-Ocean
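    One concrete COLREG constraint such a controller must respect can be sketched as a simple rule check. This is an illustrative reading of Rule 15 (crossing situations: a power-driven vessel that has another on her starboard side must give way); the bearing convention and the 112.5-degree sector are assumptions of this sketch, not the paper's formulation.

```python
# Toy COLREG Rule 15 check: in a crossing encounter, own ship gives way when
# the other vessel lies in the starboard sector. Relative bearing is assumed
# to be measured in degrees clockwise from own bow; the 112.5-degree limit
# (22.5 degrees abaft the beam) mirrors the sidelight arc. Illustration only.
def crossing_give_way(relative_bearing_deg):
    """True if own ship must give way in a crossing encounter (Rule 15 sketch)."""
    return 0.0 < relative_bearing_deg % 360.0 < 112.5
```

    A DRL reward or action mask could penalize maneuvers that violate such checks; the paper's actual compliance mechanism may differ.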

    Spaceborne radar observations: A guide for Magellan radar-image analysis

    Geologic analyses of spaceborne radar images of Earth are reviewed and summarized with respect to detecting, mapping, and interpreting impact craters, volcanic landforms, eolian and subsurface features, and tectonic landforms. Interpretations are illustrated mostly with Seasat synthetic aperture radar and shuttle-imaging-radar images. Analogies are drawn for the potential interpretation of radar images of Venus, with emphasis on the effects of variation in Magellan look angle with Venusian latitude. In each landform category, differences in feature perception and interpretive capability are related to variations in imaging geometry, spatial resolution, and wavelength of the imaging radar systems. Impact craters and other radially symmetrical features may show apparent bilateral symmetry parallel to the illumination vector at low look angles. The styles of eruption and the emplacement of major and minor volcanic constructs can be interpreted from morphological features observed in images. Radar responses that are governed by small-scale surface roughness may serve to distinguish flow types, but do not provide unambiguous information. Imaging of sand dunes is rigorously constrained by specific angular relations between the illumination vector and the orientation and angle of repose of the dune faces, but is independent of radar wavelength. With a single look angle, conditions that enable shallow subsurface imaging to occur do not provide the information necessary to determine whether the radar has recorded surface or subsurface features. The topographic linearity of many tectonic landforms is enhanced on images at regional and local scales, but the detection of structural detail is a strong function of illumination direction. Nontopographic tectonic lineaments may appear in response to contrasts in small-scale surface roughness or dielectric constant. The breakpoint for rough surfaces will vary by about 25 percent through the Magellan viewing geometries from low to high Venusian latitudes. Examples of anomalies and system artifacts that can affect image interpretation are described.
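    The angular dependence of the roughness breakpoint can be sketched with the Rayleigh smoothness criterion, h < λ / (8 cos θ). Magellan's 12.6 cm S-band wavelength is real; the two look angles below are assumed endpoints for illustration, so the computed variation need not match the quoted 25 percent.

```python
# Rayleigh smoothness criterion applied to Magellan's S-band radar: a surface
# appears radar-smooth when its RMS height is below lambda / (8 cos(theta)).
# The wavelength is Magellan's actual 12.6 cm; the two look angles are assumed
# endpoints chosen for illustration, not the mission's exact geometry.
import math

WAVELENGTH_CM = 12.6  # Magellan S-band radar wavelength

def rayleigh_breakpoint_cm(look_angle_deg):
    """RMS height (cm) below which a surface appears radar-smooth."""
    return WAVELENGTH_CM / (8.0 * math.cos(math.radians(look_angle_deg)))

low_lat = rayleigh_breakpoint_cm(45.0)   # assumed near-periapsis look angle
high_lat = rayleigh_breakpoint_cm(20.0)  # assumed high-latitude look angle
variation = (low_lat - high_lat) / high_lat  # fractional change in breakpoint
```

    Because the breakpoint scales as 1/cos θ, steeper look angles near periapsis shift the smooth/rough boundary upward by tens of percent, which is the latitude dependence the abstract describes.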

    Application of Resonant and Non-Resonant Laser-Induced Plasmas for Quantitative Fuel-to-Air Ratio and Gas-Phase Temperature Measurements

    In this work, two laser-induced plasma techniques are used for gas-phase chemical and temperature measurements. The first technique, laser-induced breakdown spectroscopy (LIBS), is applied for fuel-to-air ratio (FAR) measurements in a well-calibrated Hencken flame. In Chapter I, relevant technical and background information for each technique is provided. In Chapter II, measurements are first performed for high-pressure (1-11 Bar) methane-air flames, for which calibration curves are generated using the emission ratio of hydrogen at 656 nm and ionic nitrogen at 568 nm. The effect of pressure on the sensitivity and precision of the resulting calibration curves is evaluated. Results indicate a degradation of measurement precision as environmental pressure increases, with data indicating that fluctuations of the plasma play a major part in this behavior. Expanding upon this work with LIBS, a comparison of FAR calibration curve results for an atmospheric methane-air Hencken flame using three different laser pulse widths, in the femto-, pico-, and nanosecond regimes, is presented in Chapter III. The results are discussed in the context of potential advantages for high-pressure LIBS-based FAR measurements. Results indicate that while nanosecond duration pulses provide better precision at 1 Bar conditions, femtosecond duration pulses might be better suited for high-pressure measurements. In Chapter IV, the radar REMPI technique, which uses microwave scattering from a plasma created by selective multiphoton ionization of molecular oxygen, is used for gas-phase temperature measurements through the wall of ceramic-enclosed environments. Specifically, measurements are performed through the wall of a heated laboratory flow reactor and through the wall of a ceramic well-stirred reactor. Results show good agreement with thermocouple and/or computational modeling and demonstrate the effectiveness of radar REMPI for through-the-wall measurements. In Chapter V, a new technique is discussed, namely acoustic REMPI, which utilizes the pressure wave generated by the creation of the REMPI plasma for diagnostics. The acoustic emission from the plasma is characterized and used for gas-phase temperature measurements. Comparison with radar REMPI shows a high level of agreement. Finally, in Chapter VI, a summary of the work in this dissertation is provided along with a discussion of potential future work.
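    One plausible physical basis for an acoustic temperature measurement is the ideal-gas speed of sound, c = sqrt(γRT/M). The sketch below simply inverts that relation; whether the dissertation's acoustic REMPI analysis uses this exact relation is an assumption here, not something the abstract states.

```python
# Illustrative inversion of the ideal-gas sound-speed relation
# c = sqrt(gamma * R * T / M) to obtain gas temperature from a measured
# speed of sound. This is an assumed physical basis for acoustic
# thermometry, not necessarily the dissertation's method.
R = 8.314        # J/(mol K), universal gas constant
GAMMA_AIR = 1.4  # heat-capacity ratio of air (diatomic approximation)
M_AIR = 0.02897  # kg/mol, mean molar mass of air

def temperature_from_sound_speed(c_m_per_s, gamma=GAMMA_AIR, molar_mass=M_AIR):
    """Invert c = sqrt(gamma R T / M) for the gas temperature in kelvin."""
    return c_m_per_s ** 2 * molar_mass / (gamma * R)

# Sanity check: ~343 m/s in room-temperature air should give roughly 293 K.
t_room = temperature_from_sound_speed(343.0)
```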

    Spectral LADAR: Active Range-Resolved Imaging Spectroscopy

    Imaging spectroscopy using ambient or thermally generated optical sources is a well-developed technique for capturing two-dimensional images with high per-pixel spectral resolution. The per-pixel spectral data is often a sufficient sampling of a material's backscatter spectrum to infer chemical properties of the constituent material to aid in substance identification. Separately, conventional LADAR sensors use quasi-monochromatic laser radiation to create three-dimensional images of objects at high angular resolution compared to RADAR. Advances in dispersion-engineered photonic crystal fibers in recent years have made high-spectral-radiance optical supercontinuum sources practical, enabling this study of Spectral LADAR, a continuous polychromatic spectrum augmentation of conventional LADAR. This imaging concept, which combines multispectral and 3D sensing at a physical level, is demonstrated with 25 independent and parallel LADAR channels and generates point cloud images with three spatial dimensions and one spectral dimension. The independence of spectral bands is a key characteristic of Spectral LADAR. Each spectral band maintains a separate time waveform record, from which target parameters are estimated. Accordingly, the spectrum computed for each backscatter reflection is independently and unambiguously range unmixed from multiple target reflections that may arise from transmission of a single panchromatic pulse. This dissertation presents the theoretical background of Spectral LADAR, a shortwave infrared laboratory demonstrator system constructed as a proof-of-concept prototype, and the experimental results obtained by the prototype when imaging scenes at standoff ranges of 45 meters. The resultant point cloud voxels are spectrally classified into a number of material categories, which enhances object and feature recognition. Experimental results demonstrate the physical-level combination of active backscatter spectroscopy and range-resolved sensing to produce images with a level of complexity, detail, and accuracy that is not obtainable with data-level registration and fusion of conventional imaging spectroscopy and LADAR. The capabilities of Spectral LADAR are expected to be useful in a range of applications, such as biomedical imaging and agriculture, but particularly when applied as a sensor in unmanned ground vehicle navigation. Applications to autonomous mobile robotics are the principal motivators of this study, and are specifically addressed.
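    The per-band independence can be sketched as independent time-of-flight peak picking in each spectral channel, with range R = ct/2. The waveforms, threshold, and peak-picking rule below are synthetic illustrations, not the prototype's actual processing.

```python
# Sketch of per-band range unmixing: each spectral channel keeps its own
# return waveform, and ranges are estimated from each channel's peaks
# independently via R = c * t / 2. Waveforms and threshold are synthetic.
C = 299_792_458.0  # speed of light, m/s

def ranges_from_waveform(times_s, amplitudes, threshold):
    """Return one range per local maximum above threshold (toy peak picking)."""
    ranges = []
    for i in range(1, len(amplitudes) - 1):
        if (amplitudes[i] >= threshold
                and amplitudes[i] > amplitudes[i - 1]
                and amplitudes[i] >= amplitudes[i + 1]):
            ranges.append(C * times_s[i] / 2.0)
    return ranges

# Two spectral bands observing the same pulse: band A sees two surfaces,
# band B (absorbed by the nearer surface) sees only the farther one.
t = [i * 1e-9 for i in range(10)]               # 1 ns sample spacing
band_a = [0, 0, 5, 0, 0, 0, 3, 0, 0, 0]
band_b = [0, 0, 0, 0, 0, 0, 4, 0, 0, 0]
ranges_a = ranges_from_waveform(t, band_a, threshold=1.0)
ranges_b = ranges_from_waveform(t, band_b, threshold=1.0)
```

    Because each band is unmixed on its own, a reflection's spectrum can be assembled per range bin without ambiguity between the two surfaces, which is the property the abstract emphasizes.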

    Soft Target for Advanced Emergency Braking System Daimler Trucks

    This report provides an overview of the AEBS Soft Target project delivered to Daimler Trucks North America as part of the 2016-2017 Mechanical Engineering Senior Design class at California Polytechnic State University, San Luis Obispo. The purpose was to build a soft target to test Advanced Emergency Braking Systems (AEBS) on Daimler's large trucks. Though this design is for Daimler specifically, there may be other interested parties such as highway safety groups and rival auto manufacturers. Currently, no suitable alternative products satisfy every requirement for Daimler to validate their systems. They require a target that does not damage their trucks, is visible to their sensor systems, is mountable to a moving frame, can be reset quickly, and is a cheaper long-term testing solution than their current setup. The team built a target with an improved car profile and appearance compared to preexisting targets, and at a very low cost. The truss, bumpers, and tarp proved durable in Cal Poly's testing environment. However, the base connections are a weak point of the design and failed when run over in testing. Fortunately, these pieces are extremely quick and inexpensive to replace. Further full-scale testing would better validate these results for truck impact.

    Measurable Safety of Automated Driving Functions in Commercial Motor Vehicles

    With the further development of automated driving, functional performance increases, creating the need for new and comprehensive testing concepts. This doctoral work aims to enable the transition from quantitative mileage to qualitative test coverage by aggregating the results of both knowledge-based and data-driven test platforms. The validity of the test domain can thus be extended cost-effectively throughout the software development process to achieve meaningful test termination criteria.

    Measurable Safety of Automated Driving Functions in Commercial Motor Vehicles - Technological and Methodical Approaches

    Driver assistance systems and automated driving contribute substantially to improving the road safety of motor vehicles, particularly commercial vehicles. As automated driving develops further, functional performance increases, creating requirements for new, holistic testing concepts. Novel verification and validation methods are required to guarantee the safety assurance of higher levels of automated driving functions. The goal of this work is to enable the transition from a quantitative mileage count to a qualitative test coverage by aggregating test results from knowledge-based and data-driven test platforms. Adaptive test coverage thus aims at a compromise between efficiency and effectiveness criteria for the safety assurance of automated driving functions in the product development of commercial vehicles. This work comprises the design and implementation of a modular framework for customer-oriented validation of automated driving functions at reasonable cost. Starting from conflict management for the requirements of the test strategy, highly automated test approaches are developed. Each test approach is integrated with its respective test objectives to form the basis of a context-driven test concept. The main contributions of this work address four focal points:

    * First, a co-simulation approach is presented with which the sensor inputs in a hardware-in-the-loop test bench can be simulated and/or stimulated using synthetic driving scenarios. The presented setup offers a phenomenological modeling approach to achieve a compromise between model granularity and the computational cost of real-time simulation. This method is used for the modular integration of simulation components, such as traffic simulation and vehicle dynamics, to model relevant phenomena in critical driving scenarios.

    * Next, a measurement and data-analysis concept for the worldwide validation of automated driving functions is presented, which provides the scalability to record vehicle-sensor and/or environment-sensor data of specific driving events on the one hand, and permanent data for statistical validation and software development on the other. Measurement data from country-specific field trials are recorded and stored centrally in a cloud database.

    * Then, an ontology-based approach for integrating a complementary knowledge source from field observations into a knowledge management system is described. The grouping of recordings is realized by means of an event-based time-series analysis with hierarchical clustering and normalized cross-correlation. From each extracted cluster and its parameter space, the probability of occurrence of each logical scenario and the probability distributions of the associated parameters can be derived. Through the correlation analysis of synthetic and naturalistic driving scenarios, the requirements-based test coverage is extended adaptively and systematically with executable scenario specifications.

    * Finally, a prospective risk assessment is carried out as an inverted confidence level of measurable safety, using sensitivity and reliability analyses. The failure region can be identified in the parameter space in order to predict the probability of failure for each extracted logical scenario using various sampling methods, such as Monte Carlo simulation and adaptive importance sampling.

    The estimated probability of a safety violation for each clustered logical scenario thus yields a measurable safety prediction. The presented framework makes it possible to close the gap between knowledge-based and data-driven test platforms in order to consistently extend the knowledge base covering the operational design domains. In summary, the results show the benefits and challenges of the developed framework for measurable safety via a confidence measure of the risk assessment. This enables a cost-efficient extension of the validity of the test domain throughout the software development process in order to reach the required test termination criteria.
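    The Monte Carlo step can be sketched for a single logical scenario. The scenario model (a toy time-to-collision criterion) and the parameter distributions below are illustrative assumptions, not the framework's actual scenarios or parameters.

```python
# Toy Monte Carlo estimate of the failure probability of one logical scenario:
# sample the scenario's parameter space and count safety violations. The
# time-to-collision criterion and the parameter distributions are assumptions
# chosen for illustration only.
import random

def violates_safety(initial_gap_m, closing_speed_mps, ttc_limit_s=2.0):
    """Toy criterion: violation if time-to-collision falls below the limit."""
    if closing_speed_mps <= 0.0:
        return False  # vehicles not closing: no conflict in this toy model
    return initial_gap_m / closing_speed_mps < ttc_limit_s

def estimate_failure_probability(n_samples, seed=0):
    """Plain Monte Carlo over an assumed scenario parameter space."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        gap = rng.uniform(10.0, 100.0)   # assumed initial gap range (m)
        speed = rng.gauss(10.0, 5.0)     # assumed closing speed (m/s)
        if violates_safety(gap, speed):
            failures += 1
    return failures / n_samples

p_fail = estimate_failure_probability(20_000)
```

    For rare failures, plain Monte Carlo needs prohibitively many samples, which is why the work also mentions adaptive importance sampling: sampling is then concentrated near the identified failure region and reweighted.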

    Machine Learning in Nuclear Physics

    Advances in machine learning methods provide tools that have broad applicability in scientific research. These techniques are being applied across the diversity of nuclear physics research topics, leading to advances that will facilitate scientific discoveries and societal applications. This Review gives a snapshot of nuclear physics research that has been transformed by machine learning techniques.
