
    Classification and Verification of Online Handwritten Signatures with Time Causal Information Theory Quantifiers

    We present a new approach for online handwritten signature classification and verification based on descriptors stemming from Information Theory. The proposal uses the Shannon Entropy, the Statistical Complexity, and the Fisher Information evaluated over the Bandt and Pompe symbolization of the horizontal and vertical coordinates of signatures. These six features are easy and fast to compute, and they are the input to a One-Class Support Vector Machine classifier. The results surpass state-of-the-art techniques that employ higher-dimensional feature spaces, which often require specialized software and hardware. We assess the consistency of our proposal with respect to the size of the training sample, and we also use it to classify the signatures into meaningful groups. Comment: Submitted to PLOS ONE.
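    A minimal sketch (not the authors' code) of the feature pipeline described above: Bandt-Pompe ordinal-pattern probabilities are computed for each coordinate, from which a normalized Shannon entropy, a Jensen-Shannon statistical complexity and a discretized Fisher information are derived, and the six features feed a One-Class SVM. The embedding dimension D, delay tau, SVM parameters and the synthetic signatures are illustrative assumptions.

```python
import itertools
import numpy as np
from sklearn.svm import OneClassSVM

def bandt_pompe_probs(series, D=4, tau=1):
    """Ordinal (Bandt-Pompe) pattern probabilities of a 1-D series."""
    patterns = {p: i for i, p in enumerate(itertools.permutations(range(D)))}
    counts = np.zeros(len(patterns))
    for start in range(len(series) - (D - 1) * tau):
        window = series[start:start + D * tau:tau]
        counts[patterns[tuple(np.argsort(window))]] += 1
    return counts / counts.sum()

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def hcf_features(series, D=4, tau=1):
    """Normalized entropy H, statistical complexity C, Fisher information F."""
    p = bandt_pompe_probs(np.asarray(series, dtype=float), D, tau)
    n = len(p)
    H = shannon(p) / np.log(n)                       # normalized Shannon entropy
    u = np.full(n, 1.0 / n)                          # uniform (equilibrium) distribution
    js = shannon((p + u) / 2) - 0.5 * shannon(p) - 0.5 * shannon(u)
    js_max = -0.5 * ((n + 1) / n * np.log(n + 1) - 2 * np.log(2 * n) + np.log(n))
    C = (js / js_max) * H                            # Jensen-Shannon statistical complexity
    F = 0.5 * np.sum((np.sqrt(p[1:]) - np.sqrt(p[:-1])) ** 2)  # one common discretized Fisher measure
    return H, C, F

def signature_features(x, y, D=4, tau=1):
    """Six descriptors: (H, C, F) for the horizontal and vertical coordinates."""
    return np.array(hcf_features(x, D, tau) + hcf_features(y, D, tau))

# Train on genuine signatures of one writer; forgeries should be flagged as outliers.
rng = np.random.default_rng(0)
genuine = [signature_features(np.cumsum(rng.normal(size=300)),
                              np.cumsum(rng.normal(size=300))) for _ in range(20)]
clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1).fit(genuine)
print(clf.predict(genuine[:3]))    # +1 = accepted as genuine, -1 = rejected
```

    In a verification setting one would typically threshold clf.decision_function on held-out genuine signatures rather than use the raw +1/-1 prediction; note also that the Fisher term depends on the ordering chosen for the Bandt-Pompe patterns.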

    Sonic Booms in Atmospheric Turbulence (SonicBAT): The Influence of Turbulence on Shaped Sonic Booms

    The objectives of the Sonic Booms in Atmospheric Turbulence (SonicBAT) Program were to develop and validate, via research flight experiments under a range of realistic atmospheric conditions, one numeric and one classic turbulence model research code using traditional N-wave booms in the presence of atmospheric turbulence, and to apply these models to assess the effects of turbulence on the levels of shaped sonic booms predicted from low boom aircraft designs. The SonicBAT program successfully investigated sonic boom turbulence effects through flight experiments at two NASA centers, Armstrong Flight Research Center (AFRC) and Kennedy Space Center (KSC), collecting a comprehensive set of acoustic and atmospheric turbulence data that were used to validate the numeric and classic turbulence models developed. The validated codes were incorporated into the PCBoom sonic boom prediction software and used to estimate the effect of turbulence on the levels of shaped sonic booms associated with several low boom aircraft designs.

The SonicBAT program was a four-year effort that consisted of turbulence model development and refinement throughout the entire period, as well as extensive flight test planning that culminated with the two research flight tests being conducted in the second and third years of the program. The SonicBAT team, led by Wyle, included partners from the Pennsylvania State University, Lockheed Martin, Gulfstream Aerospace, Boeing, Eagle Aeronautics, Technical & Business Systems, and the Laboratory of Fluid Mechanics and Acoustics (France). A number of collaborators, including the Japan Aerospace Exploration Agency, also participated by supporting the experiments with human and equipment resources at their own expense. Three NASA centers, AFRC, Langley Research Center (LaRC), and KSC, were essential to the planning and conduct of the experiments.

The experiments involved precision flight of either an F-18A or F-18B executing steady, level passes at supersonic airspeeds in a turbulent atmosphere to create sonic boom signatures distorted by turbulence. The flights spanned a range of atmospheric turbulence conditions at NASA Armstrong and Kennedy in order to provide a variety of conditions for code validation. The SonicBAT experiments at both sites were designed to capture simultaneous F-18A or F-18B onboard flight instrumentation data, high fidelity ground-based and airborne acoustic data, surface and upper air meteorological data, and additional meteorological data from ultrasonic anemometers and SODARs to determine the local atmospheric turbulence and boundary layer height.

    Flight Safety Assessment and Management.

    This dissertation develops a Flight Safety Assessment and Management (FSAM) system to mitigate aircraft loss-of-control risk. FSAM enables switching between the pilot/nominal autopilot system and a complex flight control system that can potentially recover from high-risk situations but can be hard to certify. FSAM monitors flight conditions for high-risk situations and selects the appropriate control authority to prevent or recover from loss of control. The pilot/nominal autopilot system is overridden only when necessary to avoid loss of control. FSAM development is pursued using two approaches. First, finite state machines are manually prescribed to manage control mode switching. Constructing finite state machines for FSAM requires careful consideration of possible exception events, but provides a computationally tractable and verifiable means of realizing FSAM. The second approach poses FSAM as a decision-theoretic problem of reasoning under uncertainty using Markov Decision Processes (MDPs), offering a less tedious knowledge engineering process at the cost of computational overhead. Traditional and constrained MDP formulations are presented. Sparse sampling approaches are also explored to obtain suboptimal solutions to FSAM MDPs. MDPs for takeoff and icing-related loss-of-control events are developed and evaluated. Finally, this dissertation applies verification techniques to ensure that finite state machine or MDP policies satisfy system requirements. Counterexamples obtained from verification techniques aid in FSAM refinement. Real-world aviation accidents are used as case studies to evaluate FSAM formulations. This thesis contributes decision-making and verification frameworks to realize flight safety assessment and management capabilities. Novel flight envelopes and state abstractions are prescribed to aid decision making. PhD, Aerospace Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/133348/1/swee_1.pd
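    As an illustration of the MDP approach mentioned above (a toy sketch, not the dissertation's FSAM formulation), the following solves a three-state control-authority-switching problem by value iteration. The states, transition probabilities and rewards are invented for demonstration only.

```python
import numpy as np

states = ["nominal", "high_risk", "loss_of_control"]
actions = ["keep_nominal", "switch_recovery"]

# P[a][s, s']: probability of moving from state s to s' under action a (assumed numbers).
P = {
    "keep_nominal":    np.array([[0.95, 0.05, 0.00],
                                 [0.10, 0.60, 0.30],
                                 [0.00, 0.00, 1.00]]),
    "switch_recovery": np.array([[0.90, 0.10, 0.00],
                                 [0.70, 0.25, 0.05],
                                 [0.00, 0.00, 1.00]]),
}
# Rewards: heavy penalty for loss of control, small cost for overriding the pilot.
R = {"keep_nominal":    np.array([0.0, -1.0, -100.0]),
     "switch_recovery": np.array([-0.5, -1.5, -100.0])}

gamma, V = 0.95, np.zeros(len(states))
for _ in range(500):                                  # value iteration to (near) convergence
    V = np.max([R[a] + gamma * P[a] @ V for a in actions], axis=0)
policy = [actions[i] for i in
          np.argmax([R[a] + gamma * P[a] @ V for a in actions], axis=0)]
print(dict(zip(states, policy)))                      # overrides the pilot only in the high-risk state
```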

    Application of Remote Sensing to the Chesapeake Bay Region. Volume 2: Proceedings

    A conference was held on the application of remote sensing to the Chesapeake Bay region. Copies of the papers, resource contributions, panel discussions, and reports of the working groups are presented.

    Calibration and performance of the tile calorimeter of ATLAS with cosmic ray muons

    Doctoral thesis, Physics, Universidade de Lisboa, Faculdade de Ciências, 2011.

The installation of the ATLAS detector in the experimental cavern took place from 2005 until 2009. During this period, technicians, engineers and physicists worked intensively on the preparation of the detector for its main objective: probing the new frontiers of high energy physics at the LHC, the particle collider with the largest centre-of-mass energy (14 TeV nominal) and very high luminosities (10^34 cm^-2 s^-1 nominal). This thesis was developed in this challenging environment, which involved all ATLAS members in the preparation of the detector for collisions, during the period of detector commissioning with cosmic ray muons and with the calibration and monitoring systems. In 2008, single beam data were available for a short period and were used to study the detector response. This large effort was fundamental to prepare the detector for the first collisions at the LHC, which started in November 2009. Before collisions started, the only high energy particles available for studies with the LHC detectors were the muons produced by the interaction of cosmic particles in the atmosphere. These cosmic ray muons are the only detectable particles reaching the Earth's surface in quantities large enough to study the performance of the different sub-systems of the ATLAS detector. The work I developed during my PhD, detailed in this document, is centered on the energy calibration and synchronization of the Tile Calorimeter, the barrel hadronic calorimeter of ATLAS, using cosmic ray muons. The two main topics of study are summarized below.

Contribution to the energy calibration of the Tile Calorimeter: an electromagnetic energy scale was set in testbeam using high energy particles for 12% of the Tile Calorimeter modules. My contribution was centered on the validation of the global energy scale algorithm and of the detector's energy response uniformity in φ using the TileMuonFitter. The results presented in this document show that both the energy scale application, from testbeam to all modules in the experimental cavern, and the energy uniformity in φ are better than 5%. A 3% difference between radial layers A and D is measured; it is not completely understood and must be studied later using, e.g., isolated muons from collisions. The data stream and method used nevertheless show that full coverage in φ can be achieved for these measurements. These results, obtained with an independent method, are consistent with an earlier analysis reported in the readiness paper of the Tile Calorimeter [18]. Calorimeters are not designed for the detection of muons; nevertheless, muons play an important role in the commissioning of the LHC detectors and in the physics program. Before reaching the muon chambers, the muons produced in collisions lose energy in the calorimeter volume, and corrections for this energy loss are necessary to improve the precision of the muon momentum measurement. This correction must be applied to any muon crossing the calorimeter volume, in particular in fundamental processes used in the final calibration of the detector, which include complex objects such as the Z boson decaying to two muons. Lepton isolation techniques are used in the so-called golden channel for the Higgs boson discovery, the decay to four leptons H→ZZ→4l, for the rejection of QCD background. The Tile Calorimeter performance with muons can also have an important impact on physics beyond the Standard Model, such as Supersymmetry, for instance in the search for stable massive particles, since some of these massive particles are characterized by an energy loss in the calorimeter similar to that of muons. The work developed with cosmic muons can also be applied later, using muons produced in collisions, to monitor the EM scale during LHC operation. The work developed with cosmic ray muons is therefore not only important for the commissioning of the detector but is also relevant for the LHC physics to be done with the ATLAS detector. Understanding the response of the Tile Calorimeter to muons, and keeping the EM energy scale under control, are fundamental to achieving the best performance of the ATLAS detector.

Synchronization of the Tile Calorimeter: the Tile Calorimeter synchronization was established during 2008 by combining measurements with the laser system and with high energy particles, namely cosmic ray muons and muons from single beam. The work presented in this thesis uses both types of muons, but with different objectives in mind. The single beam data were used to measure corrections to the velocity of propagation of light in the clear fibers, a parameter used in the laser synchronization; the measured value of 18.5 cm/ns resulted in the update of this parameter in the laser calibration system. The work done with cosmic muons consisted in the determination of the time offsets of the Tile Calorimeter, measured both for towers and for individual cells. The time offsets were calculated as the residuals after the synchronization made with the laser system. The final results show that the cosmic ray muon and single beam data agree to within 2 ns. Timing is fundamental for the operation of the detector: all systems must be internally synchronized and externally synchronized with the LHC clock (f = 1/(25 ns), given by the bunch crossing). Timing plays an important role in the energy measurement because the stringent operating conditions of the LHC require the online signal reconstruction for the Tile Calorimeter channels to be done without iterations; the time of each channel must therefore be known with a precision of the order of a few nanoseconds so that the correct parameters are chosen for the online reconstruction method. Time is also used to select particles coming from p-p collisions, to provide quality factors in the selection of events, and it is the most sensitive quantity for the discovery of slow long-lived particles, also called stable massive particles, which are predicted in models beyond the Standard Model.

This thesis is divided into 7 chapters. The first is introductory and presents the Large Hadron Collider, the ATLAS detector and its physics goals. Chapter 2 describes the Tile Calorimeter in some detail, presenting the geometry, the calibration systems and the performance features obtained from the last testbeam results. The following chapters are dedicated to the commissioning of the Tile Calorimeter with cosmic ray muons. Chapter 3 presents the motivations for the work developed, focusing on the energy scale and synchronization of the Tile Calorimeter; these quantities are important for the overall detector performance and have a larger importance in specific physics channels. Chapter 4 introduces the commissioning and gives a brief overview of the activities during this stage; it is mostly descriptive but also reports in some detail the activities to which I contributed during the development of my thesis work. The main contributions to the Tile Calorimeter commissioning are included in the next two chapters: Chapter 5 presents the results on the energy scale and uniformity in φ using the TileMuonFitter, and Chapter 6 is dedicated to the methods and results for the synchronization with cosmic ray muon data. Finally, conclusions are given in Chapter 7. Fundação para a Ciência e Tecnologia (SFRH/BD/27416/2006).
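    As a rough illustration of the cosmic-ray muon timing analysis described above (a sketch with invented data and layout, not the thesis code), per-cell time offsets can be estimated as the mean residual between the measured cell time and the time expected from the reconstructed muon track, after the laser-based synchronization has been applied:

```python
import numpy as np

rng = np.random.default_rng(1)
n_cells, n_events = 64, 5000
true_offsets = rng.normal(0.0, 2.0, n_cells)           # ns, residual per-cell offsets (unknown)
cell_ids = rng.integers(0, n_cells, n_events)           # cell crossed by each muon
residuals = true_offsets[cell_ids] + rng.normal(0.0, 1.5, n_events)  # measured minus expected time

# Per-cell offset = mean residual; statistical uncertainty = std / sqrt(N).
offsets = np.array([residuals[cell_ids == c].mean() for c in range(n_cells)])
errors = np.array([residuals[cell_ids == c].std(ddof=1) / np.sqrt(np.sum(cell_ids == c))
                   for c in range(n_cells)])

print(f"largest |offset| = {np.abs(offsets).max():.2f} ns, "
      f"typical uncertainty = {errors.mean():.2f} ns")
```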

    OIL SPILL MODELING FOR IMPROVED RESPONSE TO ARCTIC MARITIME SPILLS: THE PATH FORWARD

    Maritime shipping and natural resource development in the Arctic are projected to increase as sea ice coverage decreases, resulting in a greater probability of more and larger oil spills. The increasing risk of Arctic spills emphasizes the need to identify state-of-the-art oil trajectory and sea ice models and the potential for their integration. The Oil Spill Modeling for Improved Response to Arctic Maritime Spills: The Path Forward (AMSM) project, funded by the Arctic Domain Awareness Center (ADAC), provides a structured approach to gather expert advice to address U.S. Coast Guard (USCG) Federal On-Scene Coordinator (FOSC) core needs for decision-making. The National Oceanic & Atmospheric Administration (NOAA) Office of Response & Restoration (OR&R) provides scientific support to the USCG FOSC during oil spill response. As part of this scientific support, NOAA OR&R supplies decision support models that predict the fate (including chemical and physical weathering) and transport of spilled oil. Oil spill modeling in the Arctic faces many unique challenges, including the limited availability of environmental data (e.g., currents, wind, ice characteristics) at fine spatial and temporal resolution to feed the models. Despite these challenges, OR&R's modeling products must provide adequate spill trajectory predictions so that response efforts minimize economic, cultural and environmental impacts, including those to species, habitats and food supplies. The AMSM project addressed the unique needs and challenges associated with Arctic spill response by: (1) identifying state-of-the-art oil spill and sea ice models, (2) recommending new components and algorithms for oil and ice interactions, (3) proposing methods for improving communication of model output uncertainty, and (4) developing methods for coordinating oil and ice modeling efforts.
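    As a rough illustration of the transport component of such fate-and-transport models (a toy sketch, not NOAA's operational code), the following advects Lagrangian oil particles with an assumed constant current, a conventional ~3% wind-drift factor and a random-walk term for horizontal diffusion; operational models use gridded ocean, wind and ice fields and add weathering processes.

```python
import numpy as np

rng = np.random.default_rng(42)
n_particles, n_steps, dt = 1000, 48, 3600.0        # 48 hourly steps
current = np.array([0.10, 0.05])                   # m/s, assumed constant surface current
wind = np.array([5.0, -2.0])                       # m/s, assumed constant wind
diffusivity = 1.0                                  # m^2/s, horizontal eddy diffusivity

pos = np.zeros((n_particles, 2))                   # all particles released at the spill site
for _ in range(n_steps):
    drift = current + 0.03 * wind                  # current plus ~3% wind-drift rule of thumb
    pos += drift * dt + rng.normal(0.0, np.sqrt(2 * diffusivity * dt), (n_particles, 2))

print("mean displacement after 48 h (km):", pos.mean(axis=0) / 1000.0)
```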