
    Unmanned-Aircraft-System-Assisted Early Wildfire Detection with Air Quality Sensors

    Numerous hectares of land are destroyed by wildfires every year, causing harm to the environment, the economy, and the ecology. More than fifty million acres have burned in several states as a result of recent forest fires in the western United States and Australia. According to scientific predictions, as the climate warms and dries, wildfires will become more intense and frequent, as well as more dangerous. These unavoidable catastrophes emphasize how important early wildfire detection and prevention are. The energy management system described in this paper uses an unmanned aircraft system (UAS) with air quality sensors (AQSs) to monitor spot fires before they spread. The goal was to develop an efficient autonomous patrolling system that detects early wildfires while maximizing the battery life of the UAS to cover broad areas. The UAS will send real-time data (sensor readings, thermal imaging, etc.) to a nearby base station (BS) when a wildfire is discovered. An optimization model was developed to minimize the total amount of energy used by the UAS while maintaining the required levels of data quality. Finally, the simulations showed the performance of the proposed solution under different stability conditions and for different minimum data rate types.
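    The energy-minimization step lends itself to a small illustration. Below is a minimal sketch assuming a toy linear model in which the UAS allocates dwell time across patrol zones subject to per-zone data requirements; the zone count, power draws, data rates, and data requirements are illustrative assumptions, not the paper's model.

```python
# A minimal sketch (not the paper's model): allocate UAS dwell time per
# patrol zone to minimize battery energy while collecting enough sensor
# data per zone. All numbers below are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

power = np.array([90.0, 110.0, 100.0])      # W drawn while surveying each zone (assumed)
rate = np.array([4.0, 6.0, 5.0])            # Mb/s of sensor/thermal data per zone (assumed)
min_data = np.array([120.0, 240.0, 180.0])  # Mb required per zone for detection quality

# Decision variable: dwell time t_i (s) in each zone.
# minimize sum(power_i * t_i)  s.t.  rate_i * t_i >= min_data_i,  t_i >= 0
res = linprog(
    c=power,
    A_ub=-np.diag(rate),   # -rate_i * t_i <= -min_data_i
    b_ub=-min_data,
    bounds=[(0, None)] * 3,
)
print("dwell times (s):", res.x)
print("total energy (J):", res.fun)
```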

    Data Augmentation for Synthetic Aperture Radar Using Alpha Blending and Deep Layer Training

    Human-based object detection in synthetic aperture radar (SAR) imagery is complex and technical, laboriously slow but time critical: the perfect application for machine learning (ML). Training an ML network for object detection requires very large image datasets with embedded objects that are accurately and precisely labeled. Unfortunately, no such SAR datasets exist. Therefore, this paper proposes a method to synthesize wide field-of-view (FOV) SAR images by combining two existing datasets: SAMPLE, which is composed of both real and synthetic single-object chips, and MSTAR Clutter, which is composed of real wide-FOV SAR images. Synthetic objects are extracted from SAMPLE using threshold-based segmentation before being alpha-blended onto patches from MSTAR Clutter. To validate the novel synthesis method, individual object chips are created and classified using a simple convolutional neural network (CNN); testing is performed against the measured SAMPLE subset. A novel technique is also developed to investigate training activity in deep layers. The proposed data augmentation technique produces a 17% increase in the accuracy of measured SAR image classification. This improvement shows that any residual artifacts from segmentation and blending do not negatively affect ML, which is promising for future use in wide-area SAR synthesis.
    Outstanding Thesis. Major, United States Air Force. Approved for public release; distribution is unlimited.
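    The synthesis step described above (threshold-based segmentation followed by alpha blending) can be sketched in a few lines. This is a minimal sketch assuming normalized single-channel chips; the threshold, feathering radius, and helper name blend_chip are illustrative, not the paper's implementation.

```python
# A minimal numpy sketch of the synthesis step: threshold-segment an
# object chip, then alpha-blend it onto a same-sized clutter patch.
# Threshold value and feathering radius are illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter

def blend_chip(chip: np.ndarray, clutter: np.ndarray,
               threshold: float = 0.3, feather: float = 1.0) -> np.ndarray:
    """Paste a segmented SAR object chip onto a clutter patch."""
    # Threshold-based segmentation: bright pixels belong to the object.
    mask = (chip > threshold).astype(float)
    # Soften the mask edge so segmentation artifacts blend smoothly.
    alpha = gaussian_filter(mask, sigma=feather)
    # Per-pixel alpha blending: out = alpha*object + (1-alpha)*background.
    return alpha * chip + (1.0 - alpha) * clutter

rng = np.random.default_rng(0)
chip = rng.random((64, 64)) ** 4        # stand-in single-object chip
clutter = 0.1 * rng.random((64, 64))    # stand-in clutter patch
synthetic = blend_chip(chip, clutter)
```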

    Detecting and approximating decision boundaries in low dimensional spaces

    We present a method for detecting and approximating fault lines or surfaces, or more generally decision curves, in two and three dimensions with guaranteed accuracy. Reformulated as a classification problem, our method starts from a set of scattered points along with the corresponding classification algorithm to construct a representation of the decision curve by points with a prescribed maximal distance to the true decision curve. In doing so, our algorithm ensures that the representing point set covers the decision curve in its entire extent and features local refinement based on the geometric properties of the decision curve. We demonstrate applications of our method to problems related to the detection of faults, to Multi-Criteria Decision Aid and, in combination with Kirsch's factorization method, to solving an inverse acoustic scattering problem. In all applications considered in this work, our method requires significantly fewer pointwise classifications than previously employed algorithms.
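    One building block of such a point-based approximation can be illustrated with bisection: given two scattered points that are classified differently, repeatedly halve the segment between them until the midpoint lies within a prescribed distance of the decision curve. A minimal sketch follows, with a stand-in classifier (the unit circle) rather than any of the applications above.

```python
# A minimal sketch of one building block of such a method: bisect the
# segment between two oppositely classified points until the midpoint
# lies within a prescribed distance `tol` of the decision curve.
import numpy as np

def classify(x: np.ndarray) -> int:
    """Toy stand-in classifier: which side of the unit circle is x on?"""
    return 1 if np.linalg.norm(x) > 1.0 else 0

def boundary_point(a, b, tol=1e-6):
    """Return a point within tol of the boundary between a's and b's classes."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    assert classify(a) != classify(b), "endpoints must be classified differently"
    while np.linalg.norm(b - a) > 2 * tol:
        mid = 0.5 * (a + b)
        if classify(mid) == classify(a):
            a = mid
        else:
            b = mid
    return 0.5 * (a + b)

p = boundary_point([0.0, 0.0], [2.0, 0.0])
print(p)  # ~[1, 0], on the unit circle up to tol
```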

    Efficient and portable Winograd convolutions for multi-core processors

    We take a step forward towards developing high-performance codes for the convolution operator, based on the Winograd algorithm, that are easy to customise for general-purpose processor architectures. In our approach, the portability of the solution is achieved via vector instructions from Intel SSE/AVX2/AVX512 and ARM NEON/SVE, to exploit the single-instruction multiple-data capabilities of current processors, together with OpenMP pragmas to exploit multi-threaded parallelism. While this comes at the cost of sacrificing a fraction of the computational performance, our experimental results on three distinct processors, the Intel Xeon Skylake, ARM Cortex-A57 and Fujitsu A64FX, show that the impact is affordable and still renders a Winograd-based solution that is competitive with lowering, GEMM-based convolution.
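    The arithmetic at the heart of such kernels is the Winograd transform itself. Below is a minimal numpy sketch of the 1D F(2,3) case, which produces two outputs of a three-tap filter with four multiplications instead of six; the matrices are the standard F(2,3) transforms, while the SIMD intrinsics and OpenMP parallelism discussed in the paper are deliberately omitted.

```python
# A minimal numpy sketch of the 1D Winograd F(2,3) transform: 2 outputs
# of a 3-tap filter from a 4-sample tile using 4 multiplications.
import numpy as np

BT = np.array([[1, 0, -1, 0],
               [0, 1,  1, 0],
               [0, -1, 1, 0],
               [0, 1,  0, -1]], dtype=float)   # input transform
G  = np.array([[1, 0, 0],
               [0.5, 0.5, 0.5],
               [0.5, -0.5, 0.5],
               [0, 0, 1]], dtype=float)        # filter transform
AT = np.array([[1, 1, 1, 0],
               [0, 1, -1, -1]], dtype=float)   # output transform

def winograd_f23(d: np.ndarray, g: np.ndarray) -> np.ndarray:
    """Two outputs of the valid correlation of a 4-sample tile d with a 3-tap g."""
    return AT @ ((G @ g) * (BT @ d))   # elementwise product = 4 multiplies

d = np.array([1.0, 2.0, 3.0, 4.0])
g = np.array([1.0, 1.0, 1.0])
print(winograd_f23(d, g))                # [6. 9.]
print(np.correlate(d, g, mode="valid"))  # matches the direct computation
```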

    Multiple Imaging Modalities for Investigating Soft-Hard Tissue Interfaces

    Interfaces of hard and soft tissues in the body play a crucial role in processes such as skeletal growth, as well as in distributing stresses during load-bearing activities. The mechanically dissimilar tissues can be studied individually, but how they integrate at the interface, both by collagen and by mineralisation, is an under-explored research area. This is of importance because these interfaces are particularly prone to damage. In the case of the endplate, hypermineralisation of the cartilaginous endplate has been correlated with degeneration of the intervertebral discs (IVDs) and chronic lower back pain. For the skull of infants, abnormalities in mineralisation of the cranial sutures lead to deformities of the skull, resulting in increased intracranial pressure and developmental complications for the child. Specific questions addressed in this thesis include: how does the osteocyte lacunae canaliculi network (OLCN) in irregular bones compare to the previously studied long bone? How are collagen fibres arranged at the soft-hard tissue interfaces? And how does the mineral density change with distance from the soft-hard interfaces?
    This PhD project investigated these research questions experimentally. The spine experiments used the central section of 1-year-old ovine lumbar 4-5 samples in the coronal plane to assess the vertebral body-endplate-IVD interfaces cranial to the IVD. The skull experiments used intact 6-week-old murine samples to assess the suture-cranial plate interface for the interfrontal, sagittal, squamous, and cranial sutures. These were dissected, dehydrated, stained, embedded in polymethylmethacrylate, and polished, followed by multimodal imaging. The imaging techniques used were confocal laser scanning microscopy to assess the OLCN, scanning electron microscopy to map the spatial distribution of minerals, and second harmonic generation to investigate the collagen across these mechanically complex tissues.
    Analysis of the OLCN in the spine used Python scripts to quantify the network density, the lacunae density, and the direction of the network with respect to the nearest blood vessel. Quantification of minerals in the skull used quantitative backscattered electron imaging to obtain the calcium weight % from the pixel intensity. Polarised second harmonic generation was used to quantify the principal direction of the collagen bundles, as well as the dispersion of the collagen fibres making up the bundles.
    Results were both qualitative and quantitative. Mineralisation patterns in the vertebral endplate (VEP) show heterogeneity, with higher degrees of mineralisation in the mineralised cartilage. Canaliculi density within the VEP ranges from 0.05-0.14 μm/μm³, similar to values reported in long bone, and the collagen across the cartilage-bone interface has the same principal direction, but the cartilage shows a greater degree of dispersion. For the suture-cranial plate interface, the mineral density values ranged between 15 and 22%, with higher values located at the sites of growth and at edges close to non-mineralised tissues. The collagen is continuous across the mineralisation front, either becoming more ordered once within the bone tissue or spanning the soft-hard interface as Sharpey's fibres.
    The soft-hard interface, which defines the boundary of mineralised tissue, is spatially distinct from the interface between the major collagen types, type I and type II. This observation is seen in both the spine and the cranial sutures. This thesis outlines reliable methods to image and quantify the OLCN, mineralisation, and collagen in mechanically dissimilar tissues, and establishes a baseline for future experiments to expand on how these features may change with age or disease. The results are in agreement with similar findings in the literature, and are novel in that these specific tissues have not previously been quantified by their OLCN, mineralisation, and collagen arrangement at this scale. Findings in this thesis show that there are multiple spatially distinct interfaces of the different constituent components as tissues transition from mineralised to non-mineralised.
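    The grey-level-to-calcium conversion mentioned above is, in standard qBEI practice, a linear calibration against two reference standards. A minimal sketch follows; the standards' grey values, the assigned wt% values, and the soft-tissue mask threshold are illustrative assumptions rather than the thesis's calibration.

```python
# A minimal sketch of the qBEI step: map backscattered-electron grey
# levels to calcium weight % via a linear two-point calibration.
# All calibration constants below are illustrative assumptions.
import numpy as np

def calibrate(grey_c: float, grey_al: float):
    """Linear grey-level -> Ca wt% map from a carbon standard (0 wt%)
    and an aluminium standard (assigned reference wt%, assumed 25.0)."""
    ca_c, ca_al = 0.0, 25.0
    slope = (ca_al - ca_c) / (grey_al - grey_c)
    return lambda img: ca_c + slope * (np.asarray(img, float) - grey_c)

grey_to_ca = calibrate(grey_c=25.0, grey_al=225.0)  # assumed standard grey values
image = np.random.default_rng(1).integers(80, 240, size=(128, 128))
ca_map = grey_to_ca(image)
mineralised = ca_map[ca_map > 5.0]   # crude mask excluding soft tissue (assumed)
print(f"mean Ca wt%: {mineralised.mean():.1f}")
```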

    Offene-Welt-Strukturen: Architektur, Stadt- und Naturlandschaft im Computerspiel

    What role do algorithms play in image construction and in the depiction of world and weather in computer games? How does the design of spaces, levels, and topographies influence players' decisions and behaviour? Is Brutalism the first genuine architectural style of computer games? What significance do landscape gardens and national parks have in structuring game worlds? How is nature depicted in times of climate change? Particularly over the last 20 years, digital game worlds have adapted features of the physical, real world more meticulously than ever. Through elaborate production processes and complex visualization strategies, this approximation to the rest of our everyday world is always produced in dependence on game mechanics and worldliness. As the example of open-world games makes clear, the adoption of particular world views and pictorial traditions leads to ideological implications that go far beyond the narrative conventions transferred from other media formats on which research has focused so far. With his theory of architecture as a medial hinge, the author reveals that digital game worlds possess medium-specific properties that previously could not be grasped and awaited investigation. By interweaving concepts from, among others, media studies, game studies, philosophy, architectural theory, human geography, landscape theory, and art history, Bonner develops a transdisciplinary theoretical model and, with the analytical methods derived from it, makes it possible for the first time to understand and name the complex structure of today's computer games, from indie games to AAA open worlds. With "Offene-Welt-Strukturen", the architectonics of digital game worlds becomes comprehensively accessible.

    Computational modeling of biological nanopores

    Throughout our history, we humans have sought to better control and understand our environment. To this end, we have extended our natural senses with a host of sensors: tools that enable us to detect both the very large, such as the merging of two black holes at a distance of 1.3 billion light-years from Earth, and the very small, such as the identification of individual viral particles from a complex mixture. This dissertation is devoted to studying the physical mechanisms that govern a tiny, yet highly versatile sensor: the biological nanopore. Biological nanopores are protein molecules that form nanometer-sized apertures in lipid membranes. When an individual molecule passes through this aperture (i.e., "translocates"), the temporary disturbance of the ionic current caused by its passage reveals valuable information on its identity and properties. Despite this seemingly straightforward sensing principle, the complexity of the interactions between the nanopore and the translocating molecule means that it is often very challenging to unambiguously link the changes in the ionic current with the precise physical phenomena that cause them. It is here that the computational methods employed in this dissertation have the potential to shine, as they are capable of modeling nearly all aspects of the sensing process with near-atomistic precision. Beyond familiarizing the reader with the concepts and state of the art of the nanopore field, the primary goals of this dissertation are fourfold: (1) develop methodologies for accurate modeling of biological nanopores; (2) investigate the equilibrium electrostatics of biological nanopores; (3) elucidate the trapping behavior of a protein inside a biological nanopore; and (4) map the transport properties of a biological nanopore. In the first results chapter of this thesis (Chapter 3), we used 3D equilibrium simulations [...]
    Comment: PhD thesis, 306 pages. Source code available at https://github.com/willemsk/phdthesis-tex
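    The sensing principle itself, a transient dip in the ionic current during translocation, can be illustrated with a toy event detector. A minimal sketch on a synthetic current trace; the baseline current, noise level, and threshold factor are illustrative assumptions, not values from the dissertation.

```python
# A minimal sketch of nanopore event detection: find temporary ionic-
# current blockades by thresholding below the open-pore baseline.
# The synthetic trace and all constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
fs = 10_000                                # sample rate, Hz (assumed)
i_open = 100.0                             # open-pore current, pA (assumed)
trace = i_open + rng.normal(0, 2.0, fs)    # 1 s of baseline noise
trace[3000:3400] -= 40.0                   # injected translocation event
trace[7000:7150] -= 40.0                   # a second, shorter event

threshold = 0.7 * i_open                   # event = current below 70% of baseline
blocked = trace < threshold
# Find contiguous blocked runs from the edges of the boolean mask.
edges = np.flatnonzero(np.diff(blocked.astype(int)))
starts, ends = edges[::2] + 1, edges[1::2] + 1
for s, e in zip(starts, ends):
    depth = i_open - trace[s:e].mean()
    print(f"event: dwell {(e - s) / fs * 1e3:.1f} ms, blockade {depth:.1f} pA")
```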

    Extending the reach of uncertainty quantification in nuclear theory

    The theory of the strong interaction, quantum chromodynamics (QCD), is unsuited to practical calculations of nuclear observables, and approximate models for nuclear interaction potentials are required. In contrast to phenomenological models, chiral effective field theories (χEFTs) of QCD grant a handle on the theoretical uncertainty arising from the truncation of the chiral expansion. Uncertainties in χEFT are preferably quantified using Bayesian inference, but quantifying reliable posterior predictive distributions for nuclear observables presents several challenges. First, χEFT is parametrized by unknown low-energy constants (LECs) whose values must be inferred from low-energy data of nuclear structure and reaction observables. There are 31 LECs at fourth order in Weinberg power counting, leading to a high-dimensional inference problem which I approach by developing an advanced sampling protocol using Hamiltonian Monte Carlo (HMC). This allows me to quantify LEC posteriors up to and including fourth chiral order. Second, the χEFT truncation error is correlated across independent variables such as scattering energies and angles; I model these correlations using a Gaussian process. Third, the computational cost of computing few- and many-nucleon observables typically precludes their direct use in Bayesian parameter estimation, as each observable must be computed in excess of 100,000 times during HMC sampling. The one exception is nucleon-nucleon scattering observables, but even these incur a substantial computational cost in the present applications. I sidestep such issues using eigenvector-continuation emulators, which accurately mimic exact calculations while dramatically reducing the computational cost. Equipped with Bayesian posteriors for the LECs and a model for the truncation error, I explore the predictive ability of χEFT, presenting the results as the probability distributions they always were.
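    For readers unfamiliar with the sampler, the update underlying HMC can be shown compactly. Below is a minimal sketch of a single leapfrog-based HMC step on a stand-in Gaussian target, not the 31-dimensional LEC posterior; the step size and trajectory length are illustrative assumptions.

```python
# A minimal sketch of one Hamiltonian Monte Carlo update on a stand-in
# standard-normal target. Step size and path length are illustrative.
import numpy as np

def grad_neg_logp(q):
    return q                                   # -logp = q.q/2 for a standard normal

def hmc_step(q, rng, eps=0.1, n_leap=20):
    p = rng.normal(size=q.shape)               # resample momentum
    q_new, p_new = q.copy(), p.copy()
    p_new -= 0.5 * eps * grad_neg_logp(q_new)  # leapfrog integration
    for _ in range(n_leap - 1):
        q_new += eps * p_new
        p_new -= eps * grad_neg_logp(q_new)
    q_new += eps * p_new
    p_new -= 0.5 * eps * grad_neg_logp(q_new)
    # Metropolis accept/reject on the Hamiltonian error.
    h_old = 0.5 * q @ q + 0.5 * p @ p
    h_new = 0.5 * q_new @ q_new + 0.5 * p_new @ p_new
    return q_new if rng.random() < np.exp(h_old - h_new) else q

rng = np.random.default_rng(3)
q = np.zeros(4)
samples = []
for _ in range(2000):
    q = hmc_step(q, rng)
    samples.append(q.copy())
print(np.std(samples, axis=0))   # ~1 in each dimension for this target
```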

    Spectral methods for solving elliptic PDEs on unknown manifolds

    In this paper, we propose a mesh-free numerical method for solving elliptic PDEs on unknown manifolds, identified with randomly sampled point-cloud data. The PDE solver is formulated as a spectral method where the test function space is the span of the leading eigenfunctions of the Laplacian operator, which are approximated from the point-cloud data. While the framework is flexible for any test function space, we consider the eigensolutions of a weighted Laplacian obtained from a symmetric Radial Basis Function (RBF) method induced by a weak approximation of a weighted Laplacian on an appropriate Hilbert space. In particular, we consider a test function space that encodes the geometry of the data yet does not require us to identify and use the sampling density of the point cloud. To attain a more accurate approximation of the expansion coefficients, we adopt a second-order tangent space estimation method to improve the RBF interpolation accuracy in estimating the tangential derivatives. This spectral framework allows us to efficiently solve the PDE many times subject to different parameters, which reduces the computational cost in related inverse problem applications. In a well-posed elliptic PDE setting with randomly sampled point-cloud data, we provide a theoretical analysis to demonstrate the convergence of the proposed solver as the sample size increases. We also report numerical studies that show the convergence of the spectral solver on simple manifolds and on unknown, rough surfaces. Our numerical results suggest that the proposed method is more accurate than a graph Laplacian-based solver on smooth manifolds; on rough manifolds, the two approaches are comparable. Due to the flexibility of the framework, we empirically found improved accuracies in both smoothed and unsmoothed Stanford bunny domains by blending the graph Laplacian eigensolutions and the RBF interpolator.
    Comment: 8 figures
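    The core spectral idea, solving the PDE in the span of the leading Laplacian eigenfunctions where the operator is diagonal, can be illustrated on a known manifold. A minimal sketch on the unit circle follows, with a periodic finite-difference Laplacian standing in for the paper's RBF-based approximation from point-cloud data; grid size, shift, and mode count are illustrative assumptions.

```python
# A minimal sketch of the spectral idea on a known manifold (the unit
# circle): solve (-Laplacian + c) u = f in the leading eigenbasis.
import numpy as np

n, c, k = 256, 1.0, 31                 # grid points, shift, eigenfunctions kept
theta = 2 * np.pi * np.arange(n) / n
h = 2 * np.pi / n

# Periodic second-difference Laplacian on the circle.
L = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1))
L[0, -1] = L[-1, 0] = 1.0
L /= h**2

# Leading eigenpairs of -L (smallest eigenvalues = smoothest modes).
lam, phi = np.linalg.eigh(-L)
lam, phi = lam[:k], phi[:, :k]

f = np.cos(3 * theta)                  # right-hand side
u_hat = (phi.T @ f) / (lam + c)        # Galerkin solve: diagonal in eigenbasis
u = phi @ u_hat

u_exact = np.cos(3 * theta) / (9 + c)  # since -u'' = 9 cos(3*theta)
print("max error:", np.abs(u - u_exact).max())
```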