
    Sensitivity analysis and probabilistic re-entry modeling for debris using high dimensional model representation based uncertainty treatment

    Well-known tools developed for satellite and debris re-entry perform break-up and trajectory simulations in a deterministic sense and do not perform any uncertainty treatment. Treating the uncertainties associated with the re-entry of a space object requires a probabilistic approach. A Monte Carlo campaign is the intuitive way to perform a probabilistic analysis; however, it is computationally very expensive. In this work, we use a recently developed approach, based on a new derivation of the high dimensional model representation (HDMR) method, to implement a computationally efficient probabilistic re-entry analysis. Both aleatoric and epistemic uncertainties that affect the aerodynamic trajectory and ground impact location are considered. The method is applicable to both controlled and uncontrolled re-entry scenarios. The resulting ground impact distributions are far from the typically used Gaussian or ellipsoid distributions.
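    As a hedged illustration of the general idea, the sketch below propagates uncertainty through a toy first-order cut-HDMR surrogate in place of a Monte Carlo campaign on the full model. The placeholder response function, anchor point, and input ranges are invented for the example and are not the authors' re-entry code.

```python
import numpy as np
from scipy.interpolate import interp1d

def expensive_model(x):
    # Placeholder for a break-up/trajectory code: a nonlinear scalar response
    # (e.g. downrange impact distance) to three normalised uncertain inputs.
    return np.exp(0.3 * x[0]) + np.sin(2.0 * x[1]) + 0.5 * x[2] ** 2

dim, n_grid = 3, 9
anchor = np.zeros(dim)                 # cut point, e.g. nominal input values
f0 = expensive_model(anchor)

# First-order cut-HDMR: f(x) ~ f0 + sum_i f_i(x_i), where each univariate
# component is tabulated by varying one input at a time around the anchor.
grid = np.linspace(-1.0, 1.0, n_grid)
components = []
for i in range(dim):
    vals = []
    for g in grid:
        x = anchor.copy()
        x[i] = g
        vals.append(expensive_model(x) - f0)
    components.append(interp1d(grid, vals, kind="cubic"))

# Monte Carlo on the cheap surrogate: dim * n_grid = 27 true-model runs
# support ~1e5 samples; the output distribution need not be Gaussian.
rng = np.random.default_rng(0)
samples = rng.uniform(-1.0, 1.0, size=(100_000, dim))
y = f0 + sum(components[i](samples[:, i]) for i in range(dim))
print(f"mean = {y.mean():.3f}, std = {y.std():.3f}")
```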

    Methodology of a Troposphere Effect Mitigation Processor for SAR Interferometry

    Troposphere effect mitigation based on numerical weather prediction (NWP) is a current research topic in SAR interferometry (InSAR), and especially in persistent scatterer interferometry (PSI). For this reason, a scientific troposphere effect mitigation processing system has been developed. The objective of this paper is to present the methodology of the four developed algorithms, demonstrate application examples, discuss the methods' characteristics, and recommend techniques for operational systems.
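    The sketch below illustrates the general principle behind NWP-based troposphere mitigation: integrate refractivity profiles from a weather model into zenith delays at the two acquisition dates, map them to the slant direction, and subtract the differential delay phase from the interferogram. The toy profiles, constant incidence angle, and scene-wide correction are illustrative simplifications, not the processor's actual algorithms.

```python
import numpy as np

wavelength = 0.031                       # m, C-band radar wavelength (assumed)
inc = np.deg2rad(35.0)                   # incidence angle, assumed constant

def zenith_delay(N, dh):
    # One-way zenith tropospheric delay from refractivity N:
    # ZTD = 1e-6 * integral of N over height.
    return 1.0e-6 * np.sum(N) * dh

dh = 50.0
h = np.arange(0.0, 10_000.0, dh)         # height levels, m
N_date1 = 320.0 * np.exp(-h / 8_000.0)   # toy NWP refractivity profiles
N_date2 = 335.0 * np.exp(-h / 7_600.0)   # (unitless N-units) at the two dates

# Differential zenith delay -> slant delay -> two-way interferometric phase.
dztd = zenith_delay(N_date2, dh) - zenith_delay(N_date1, dh)
dslant = dztd / np.cos(inc)              # simple 1/cos mapping function
dphi = 4.0 * np.pi / wavelength * dslant # radians

# Subtract the (here scene-constant) correction from a toy interferogram.
ifg = np.random.default_rng(1).normal(0.0, 0.5, (256, 256))
ifg_corrected = np.angle(np.exp(1j * (ifg - dphi)))
print(f"differential ZTD = {dztd*100:.2f} cm, phase correction = {dphi:.2f} rad")
```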

    Surrogate model for probabilistic modeling of atmospheric entry for small NEOs

    Near Earth Objects (NEOs) enter the Earth's atmosphere on a regular basis. Depending on the size and the object and entry parameters, these objects can burn up through ablation (complete evaporation), undergo fragmentation of varying nature, or impact the ground unperturbed. The parameters that influence the physics during entry are either unknown or highly uncertain. In this work, we propose a probabilistic approach to simulating entry. Probabilistic modeling typically requires an expensive Monte Carlo approach. Here, we develop and present a novel engineering approach of building surrogate models for simulating atmospheric entry, accounting for drag, ablation, evaporation, fragmentation, and ground impact.
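    For context, the sketch below integrates the standard single-body drag/ablation equations that entry surrogates of this kind are typically trained against. All coefficients, the exponential atmosphere, and the fixed flight-path angle are textbook placeholders rather than the paper's model.

```python
import numpy as np
from scipy.integrate import solve_ivp

Cd, Ch, Q = 1.0, 0.1, 8.0e6     # drag coeff., heat-transfer coeff., heat of ablation (J/kg)
rho0, H = 1.225, 7_160.0        # sea-level density (kg/m^3), scale height (m)
rho_m, g = 3_300.0, 9.81        # bulk density (kg/m^3), gravity (m/s^2)

def rhs(t, y):
    v, m, h, gamma = y          # speed, mass, altitude, flight-path angle (down +)
    m = max(m, 1.0)             # guard against overshoot below the burn-up event
    rho = rho0 * np.exp(-h / H)
    A = np.pi * (3.0 * m / (4.0 * np.pi * rho_m)) ** (2.0 / 3.0)  # sphere cross-section
    dv = -Cd * rho * A * v**2 / (2.0 * m) + g * np.sin(gamma)     # drag + gravity
    dm = -Ch * rho * A * v**3 / (2.0 * Q)                         # ablation mass loss
    dh = -v * np.sin(gamma)
    return [dv, dm, dh, 0.0]    # fixed entry angle (simplification)

y0 = [19_000.0, 1.0e4, 100_000.0, np.deg2rad(30.0)]   # 19 km/s, 10 t, 100 km
burnt = lambda t, y: y[1] - 1.0                        # stop when <1 kg remains
ground = lambda t, y: y[2]                             # ...or at ground impact
burnt.terminal = ground.terminal = True
sol = solve_ivp(rhs, (0.0, 120.0), y0, events=[burnt, ground], max_step=0.05)
v, m, h = sol.y[0, -1], sol.y[1, -1], sol.y[2, -1]
print(f"end state: v = {v/1e3:.2f} km/s, m = {m:.1f} kg, h = {h/1e3:.1f} km")
```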

    Measuring the mixing scale of the ISM within nearby spiral galaxies

    The spatial distribution of metals reflects, and can be used to constrain, the processes of chemical enrichment and mixing. Using PHANGS-MUSE optical integral field spectroscopy, we measure the gas-phase oxygen abundances (metallicities) across 7,138 HII regions in a sample of eight nearby disc galaxies. In Paper I (Kreckel et al. 2019) we measured and reported linear radial gradients in the metallicities of each galaxy and qualitatively searched for azimuthal abundance variations. Here, we examine the two-dimensional variation in abundances once the radial gradient is subtracted, Delta(O/H), in order to quantify the homogeneity of the metal distribution and to measure the mixing scale over which HII-region metallicities are correlated. We observe low (0.03–0.05 dex) scatter in Delta(O/H) globally in all galaxies, with significantly lower (0.02–0.03 dex) scatter on small (<600 pc) spatial scales. This is consistent with the measurement uncertainties and implies that the two-dimensional metallicity distribution is highly correlated on scales of <600 pc. We compute the two-point correlation function for metals in the disc in order to quantify the scale lengths associated with the observed homogeneity. This mixing scale correlates better with the local gas velocity dispersion (of both cold and ionized gas) than with the star formation rate. Selecting only HII regions with enhanced abundances relative to a linear radial gradient, we do not observe increased homogeneity on small scales. This suggests that the observed homogeneity is driven by mixing that introduces material from large scales rather than by pollution from recent and ongoing star formation.
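    A minimal sketch of the key measurement, the two-point correlation of the residuals Delta(O/H) as a function of HII-region separation, is given below on synthetic positions and residuals; the binning and normalisation are generic choices, not the PHANGS-MUSE pipeline.

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(2)
pos = rng.uniform(0.0, 10_000.0, size=(500, 2))   # HII-region positions, pc
doh = rng.normal(0.0, 0.04, 500)                  # Delta(O/H) residuals, dex

sep = pdist(pos)                                      # pairwise separations
prod = pdist(doh[:, None], lambda u, v: u[0] * v[0])  # pairwise residual products

# Bin the normalised products by separation: xi(r) = <d(x) d(x+r)> / var(d).
bins = np.linspace(0.0, 3_000.0, 16)
idx = np.digitize(sep, bins)
for k in range(1, len(bins)):
    xi = prod[idx == k].mean() / doh.var()
    print(f"{bins[k-1]:6.0f}-{bins[k]:6.0f} pc: xi = {xi:+.3f}")
# With uncorrelated synthetic residuals xi ~ 0 in every bin; correlated real
# data would show xi > 0 at small separations, decaying over the mixing scale.
```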

    Spatial statistics and analysis of Earth's ionosphere

    Thesis (Ph.D.), Boston University. The ionosphere, a layer of Earth's upper atmosphere characterized by energetic charged particles, serves as a natural plasma laboratory and supplies proxy diagnostics of space weather drivers in the magnetosphere and the solar wind. The ionosphere is a highly dynamic medium, and the spatial structure of observed features (such as auroral light emissions, charge density, temperature, etc.) is rich with information when analyzed in the context of fluid, electromagnetic, and chemical models. Obtaining measurements with higher spatial and temporal resolution is clearly advantageous. For instance, measurements obtained with a new electronically steerable incoherent scatter radar (ISR) present a unique space-time perspective compared to those of a dish-based ISR; however, this modality has unique ambiguities that must be carefully considered. The ISR target is stochastic, and the fidelity of fitted parameters (ionospheric densities and temperatures) requires integrated sampling, creating a tradeoff between measurement uncertainty and spatio-temporal resolution. Spatial statistics formalizes the relationship between spatially dispersed observations and the underlying process(es) they represent. A spatial process is regarded as a random field with its distribution structured (e.g., through a correlation function) such that data sampled over a spatial domain support inference or prediction of the process. Quantification of uncertainty, an important component of scientific data analysis, is a core value of spatial statistics. This research applies the formalism of spatial statistics to the analysis of Earth's ionosphere using remote sensing diagnostics. In the first part, we consider the problem of volumetric imaging using phased-array ISR based on optimal spatial prediction ("kriging"). In the second part, we develop a technique for reconstructing two-dimensional ion flow fields from line-of-sight projections using Tikhonov regularization. In the third part, we adapt our spatial statistical approach to global ionospheric imaging using total electron content (TEC) measurements derived from navigation satellite signals.
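    As an illustration of the building block used in the first part, the sketch below performs simple kriging (optimal spatial prediction) of a zero-mean field from scattered observations; the exponential covariance model, noise level, and geometry are assumptions for the example, not the thesis implementation.

```python
import numpy as np
from scipy.spatial.distance import cdist

def cov(d, sigma2=1.0, ell=50.0):
    # Exponential covariance model C(d) = sigma2 * exp(-d / ell), d in km.
    return sigma2 * np.exp(-d / ell)

rng = np.random.default_rng(3)
obs_pts = rng.uniform(0.0, 200.0, size=(40, 2))        # measurement locations
K = cov(cdist(obs_pts, obs_pts)) + 0.05 * np.eye(40)   # obs covariance + noise
y = rng.multivariate_normal(np.zeros(40), K)           # synthetic observations

# Prediction grid and simple-kriging mean/variance for a zero-mean field.
gx, gy = np.meshgrid(np.linspace(0, 200, 50), np.linspace(0, 200, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
k0 = cov(cdist(grid, obs_pts))                         # cross-covariances
pred = k0 @ np.linalg.solve(K, y)                      # kriging mean
var = 1.0 - np.einsum('ij,ji->i', k0, np.linalg.solve(K, k0.T))
print(f"prediction range [{pred.min():.2f}, {pred.max():.2f}], "
      f"max kriging variance {var.max():.2f}")
```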

    Multifidelity domain-aware learning for the design of re-entry vehicles

    The multidisciplinary design optimization (MDO) of re-entry vehicles presents many challenges associated with the plurality of domains that characterize the design problem and with the multi-physics interactions. Aerodynamic and thermodynamic phenomena are strongly coupled and determine the heat loads that affect the vehicle along the re-entry trajectory, which drive the design of the thermal protection system (TPS). The preliminary design and optimization of re-entry vehicles would benefit from accurate high-fidelity aerothermodynamic analyses, which usually require expensive computational fluid dynamics simulations. We propose an original formulation for multifidelity active learning that considers both the information extracted from data and domain-specific knowledge. Our scheme is developed for the design of re-entry vehicles and is demonstrated for the case of an Orion-like capsule entering the Earth's atmosphere. The design process aims to minimize the mass of propellant burned during the entry maneuver, the mass of the TPS, and the temperature experienced by the TPS along the re-entry. The results demonstrate that our multifidelity strategy achieves a significant improvement of the design solution with respect to the baseline. In particular, the outcomes of our method are superior to the design obtained through a single-fidelity framework, as a result of the principled selection of a limited number of high-fidelity evaluations.
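    The sketch below shows one generic way to realize a multifidelity active-learning loop: a cheap low-fidelity model is corrected by an interpolated discrepancy trained on a few expensive evaluations, and each new high-fidelity sample is placed where a leave-one-out disagreement measure is largest. The one-dimensional toy functions and the acquisition heuristic are assumptions, not the paper's formulation.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

f_lo = lambda x: np.sin(8.0 * x)                      # cheap low-fidelity model
f_hi = lambda x: np.sin(8.0 * x) + 0.3 * x**2 - 0.1   # expensive high-fidelity model

X_hi = np.array([[0.1], [0.5], [0.9]])                # initial HF design sites
pool = np.linspace(0.0, 1.0, 101)[:, None]            # candidate designs

def fit_delta(X):
    # Discrepancy surrogate delta(x) ~ f_hi(x) - f_lo(x) on the HF sites.
    return RBFInterpolator(X, f_hi(X[:, 0]) - f_lo(X[:, 0]))

for it in range(5):
    delta = fit_delta(X_hi)
    # Leave-one-out disagreement as a cheap stand-in for predictive uncertainty.
    spread = np.zeros(len(pool))
    for i in range(len(X_hi)):
        keep = np.arange(len(X_hi)) != i
        spread += np.abs(fit_delta(X_hi[keep])(pool) - delta(pool))
    # Do not re-select (numerically) existing sites.
    dist = np.min(np.abs(pool[:, 0][:, None] - X_hi[:, 0][None, :]), axis=1)
    spread[dist < 1e-9] = -np.inf
    x_new = pool[np.argmax(spread)]
    X_hi = np.vstack([X_hi, x_new])
    print(f"iteration {it}: new high-fidelity sample at x = {x_new[0]:.2f}")

delta = fit_delta(X_hi)
err = np.abs(f_lo(pool[:, 0]) + delta(pool) - f_hi(pool[:, 0])).max()
print(f"max multifidelity surrogate error: {err:.4f}")
```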

    Combined Computational-Experimental Design of High-Temperature, High-Intensity Permanent Magnetic Alloys with Minimal Addition of Rare-Earth Elements

    AlNiCo magnets are known for high-temperature stability and superior corrosion resistance and have been widely used in various applications. The reported magnetic energy density, (BH)max, for these magnets is around 10 MGOe. Theoretical calculations show that a (BH)max of 20 MGOe is achievable, which would help close the gap between AlNiCo and rare-earth-element (REE) based magnets. An extended family of AlNiCo alloys consisting of eight elements was studied in this dissertation; it is therefore important to determine the composition-property relationships between each of the alloying elements and their influence on the bulk properties. In the present research, we proposed a novel approach that efficiently uses a set of computational tools based on several concepts of artificial intelligence to address the complex problem of designing and optimizing high-temperature REE-free magnetic alloys. A multi-dimensional random number generation algorithm was used to generate the initial set of chemical concentrations. These alloys were then examined for phase equilibria and associated magnetic properties as a screening tool to form the initial set of alloys. The alloys were manufactured and tested for the desired properties. These properties were fitted with a set of multi-dimensional response surfaces, and the most accurate meta-models were chosen for prediction. The properties were then simultaneously extremized using a set of multi-objective optimization algorithms, which provided the concentrations of each alloying element for optimized properties. A few of the best predicted Pareto-optimal alloy compositions were then manufactured and tested to evaluate the predicted properties. These alloys were added to the existing data set and used to improve the accuracy of the meta-models. The multi-objective optimizer then used the new meta-models to find a new set of improved Pareto-optimized chemical concentrations. This design cycle was repeated twelve times in this work. Several of these Pareto-optimized alloys outperformed most of the candidate alloys on most of the objectives. Unsupervised learning methods such as Principal Component Analysis (PCA) and Hierarchical Cluster Analysis (HCA) were used to discover patterns within the dataset. This demonstrates the efficacy of the combined meta-modeling and experimental approach in the design optimization of magnetic alloys.
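    One iteration of the meta-model/optimization cycle described above can be sketched as follows: fit quadratic response surfaces to measured properties, predict over candidate compositions, and keep the Pareto-optimal set for the next manufacturing round. The two synthetic objectives and the quadratic basis are stand-ins, not the dissertation's actual meta-models.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.dirichlet(np.ones(4), size=30)   # measured compositions of 4 elements
# Two synthetic properties standing in for (BH)max and thermal stability.
y1 = X @ np.array([12.0, 8.0, 5.0, 3.0]) - 10.0 * (X[:, 0] - 0.4) ** 2
y2 = X @ np.array([3.0, 9.0, 7.0, 12.0]) - 8.0 * (X[:, 3] - 0.3) ** 2

def quad_features(X):
    # Full quadratic response-surface basis: [1, x_i, x_i * x_j].
    n, d = X.shape
    cross = [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack([np.ones(n), X] + cross)

w1, *_ = np.linalg.lstsq(quad_features(X), y1, rcond=None)
w2, *_ = np.linalg.lstsq(quad_features(X), y2, rcond=None)

cand = rng.dirichlet(np.ones(4), size=5_000)      # candidate compositions
p1, p2 = quad_features(cand) @ w1, quad_features(cand) @ w2

# Pareto filter (maximise both): sweep by p1, keep strictly improving p2.
front, best2 = [], -np.inf
for i in np.argsort(-p1):
    if p2[i] > best2:
        front.append(i)
        best2 = p2[i]
print(f"{len(front)} Pareto-optimal candidates out of {len(cand)}; "
      f"these would be manufactured and fed back into the meta-models")
```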

    Bayesian approach to ionospheric imaging with Gaussian Markov random field priors

    The ionosphere is the layer of the atmosphere at roughly 60–1000 km altitude in which solar radiation and fast particles originating from the Sun have stripped electrons from gas atoms and molecules. The resulting ions and free electrons carry an electric charge that interacts with electric and magnetic fields, so the ionosphere plays a significant role in radio communication. It can enable long over-the-horizon radio broadcasts by reflecting a transmitted electromagnetic signal back towards the ground. On the other hand, the ionosphere also affects higher-frequency signals that pass through it: in satellite positioning, for example, the ionospheric effect must at best be accounted for, and at worst it can prevent positioning entirely. The most visible and best-known ionospheric phenomenon is probably the aurora. One of the central quantities in ionospheric research is the number of free electrons per cubic metre. In practice, the electron density can be measured with radars, such as the EISCAT radar system located in Norway, Finland and Sweden, and with rocket or satellite measurements. These measurements can be very precise, but they provide information only along the radar beam or in the vicinity of the instrument, so studying the ionosphere over a wider region with such methods is difficult and expensive. Existing positioning satellites and receiver networks make it possible to measure the ionospheric electron density on a regional and even global scale as a by-product of their primary purpose. The temporal and spatial coverage of these satellite measurements is good and constantly growing, but compared with precise radar measurements each individual measurement carries considerably less information. In this doctoral work, a software package was developed for three-dimensional imaging of the ionospheric electron density. The method is based on the theory of mathematical inverse problems and resembles the tomographic imaging methods used in medicine. Because the satellite measurements are informationally incomplete, the work concentrates on how the search for a solution can be aided by physical prior knowledge expressed statistically. In particular, a new correlation-prior method based on Gaussian Markov random fields was applied. The method significantly reduces the memory required in the computations, which shortens the computation time and enables a higher imaging resolution.

    The ionosphere is the partly ionised layer of Earth's atmosphere caused by solar radiation and particle precipitation. The ionisation can start at 60 km and extend up to 1000 km altitude. Often the interest in the ionosphere is in the quantity and distribution of the free electrons. The electron density is related to the ionospheric refractive index, and thus sufficiently high densities affect the electromagnetic waves propagating in the ionised medium. This is why HF radio signals can reflect from the ionosphere, enabling broadcasts over the horizon, but it is also an error source in satellite positioning systems. The ionospheric electron density can be studied e.g. with specific radars and satellite in situ measurements. These instruments can provide very precise observations, however, typically only in the vicinity of the instrument.
To make observations on regional and global scales, given the volume of the domain and the price of the aforementioned instruments, indirect satellite measurements and imaging methods are required. Mathematically, ionospheric imaging suffers from two main complications. First, due to the very sparse and limited measurement geometry between satellites and receivers, it is an ill-posed inverse problem: the measurements do not carry enough information to reconstruct the electron density, and thus additional information is required in some form. Second, to obtain sufficient resolution, the resulting numerical model can become computationally infeasible. In this thesis, the Bayesian statistical background for ionospheric imaging is presented. The Bayesian approach provides a natural way to account for different sources of information with corresponding uncertainties and to update the estimated ionospheric state as new information becomes available. Most importantly, Gaussian Markov random field (GMRF) priors are introduced for the application of ionospheric imaging. The GMRF approach makes the Bayesian approach computationally feasible through sparse prior precision matrices. The Bayesian method is indeed practicable, and many of the widely used methods in ionospheric imaging revert back to the Bayesian approach. Unfortunately, the approach cannot escape the inherent lack of information provided by the measurement set-up, and, similarly to other approaches, it is highly dependent on the additional subjective information required to solve the problem. It is shown here that the use of GMRFs provides a genuine improvement for the task, as this subjective information can be understood and described probabilistically in a meaningful and physically interpretable way while keeping the computational costs low.
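    The computational point about GMRF priors can be made concrete with the sketch below: in a linear Gaussian model, the posterior precision is the sum of a sparse data term and the sparse prior precision, so the MAP estimate follows from a single sparse solve. The 1-D profile, toy ray geometry, and hyperparameters are assumptions for the example, not the thesis software.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

n = 200                                    # profile resolution
x_true = np.exp(-((np.arange(n) - 120.0) / 25.0) ** 2)   # Chapman-like layer

# Measurement operator: 30 random integrated segments (line-integral analogue
# of TEC measurements along satellite-to-receiver rays).
rng = np.random.default_rng(5)
rows = []
for _ in range(30):
    a, b = np.sort(rng.integers(0, n, 2))
    r = np.zeros(n)
    r[a:b + 1] = 1.0
    rows.append(r)
A = sp.csr_matrix(np.array(rows))
sigma = 0.5
y = A @ x_true + rng.normal(0.0, sigma, 30)

# GMRF prior: sparse precision built from second differences (smoothness).
D = sp.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n))
Q_prior = 100.0 * (D.T @ D) + 1e-4 * sp.eye(n)

# Posterior precision stays sparse, so the MAP estimate is one sparse solve.
Q_post = (A.T @ A) / sigma**2 + Q_prior
x_map = spsolve(Q_post.tocsc(), (A.T @ y) / sigma**2)
rel_err = np.linalg.norm(x_map - x_true) / np.linalg.norm(x_true)
print(f"relative error of MAP estimate: {rel_err:.2f}")
```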