
    Spectrophotometric calibration of low-resolution spectra

    Low-resolution spectroscopy is a frequently used technique. Aperture prism spectroscopy in particular is an important tool for large-scale survey observations, and the ongoing ESA space mission Gaia is currently the most relevant example. In this work we analyse the fundamental limitations of the calibration of low-resolution spectrophotometric observations and introduce a calibration method that avoids simplifying assumptions about the smearing effects of the line spread functions. To this end, we developed a functional analytic mathematical formulation of the problem of spectrophotometric calibration. In this formulation, the calibration process can be described as a linear mapping between two suitably constructed Hilbert spaces, independently of the resolution of the spectrophotometric instrument. The presented calibration method can provide a formally unusual but precise calibration of low-resolution spectrophotometry with non-negligible widths of line spread functions. We used the Gaia spectrophotometric instruments to demonstrate that the calibration method of this work can potentially provide a significantly better calibration than methods neglecting the smearing effects of the line spread functions.
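    Because the abstract frames calibration as a linear mapping between function spaces, a small numerical sketch can make the idea concrete. Everything below (the Gaussian line spread function, the Legendre basis, the grid sizes, and the Tikhonov regularisation) is our illustrative assumption, not the construction used in the paper:

```python
import numpy as np

# Sketch: spectrophotometric calibration as a linear mapping. A "true"
# spectrum is represented by coefficients c in a basis; the instrument
# smears it with a line spread function (LSF) and samples it in pixels,
# giving the observation model y = A @ c + noise. The Gaussian LSF,
# Legendre basis, and all grid sizes are illustrative assumptions.

rng = np.random.default_rng(0)

wav = np.linspace(400.0, 900.0, 500)      # fine wavelength grid (nm)
pix = np.linspace(420.0, 880.0, 60)       # pixel centres (nm)
sigma_lsf = 15.0                          # non-negligible LSF width (nm)

# Basis functions: low-order Legendre polynomials on the wavelength grid.
n_basis = 12
x = (wav - wav.mean()) / (np.ptp(wav) / 2)              # map to [-1, 1]
Phi = np.polynomial.legendre.legvander(x, n_basis - 1)  # shape (500, 12)

# LSF smearing matrix: each pixel integrates a Gaussian around its centre.
dlam = wav[1] - wav[0]
L = np.exp(-0.5 * ((pix[:, None] - wav[None, :]) / sigma_lsf) ** 2)
L *= dlam / (sigma_lsf * np.sqrt(2.0 * np.pi))          # shape (60, 500)

A = L @ Phi   # the (discretised) linear map between the two spaces

# Simulate an observation of a known spectrum, then invert the mapping
# with Tikhonov-regularised least squares, which trades a little bias
# for stability against the smearing.
c_true = rng.normal(size=n_basis) / (1.0 + np.arange(n_basis))
y = A @ c_true + rng.normal(scale=1e-3, size=pix.size)

lam = 1e-6
c_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_basis), A.T @ y)
print("max coefficient error:", np.abs(c_hat - c_true).max())
```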

    The Application of Polynomial Response Surface and Polynomial Chaos Expansion Metamodels within an Augmented Reality Conceptual Design Environment

    The engineering design process consists of many stages. In the conceptual phase, potential designs are generated and evaluated without considering specifics. Winning concepts then advance to the detail design and high fidelity simulation stages. At this point in the process, very accurate representations are made for each design and are then subjected to rigorous analysis. With the advancement of computer technology, these last two phases have been very well served by the software community. Engineering software such as computer-aided design (CAD), finite element analysis (FEA), and computational fluid dynamics (CFD) has become an inseparable part of the design process for many engineered products and processes. Conceptual design tools, on the other hand, have not undergone this type of advancement, and much of the work is still done with little to no digital technology. Detail oriented tools require a significant amount of time and training to use effectively. This investment is considered worthwhile when high fidelity models are needed. However, conceptual design has no need for this level of detail. Instead, rapid concept generation and evaluation are the primary goals. Considering the lack of adequate tools to suit these needs, new software was created. This thesis discusses the development of that conceptual design application.

    Traditional design tools rely on a two dimensional mouse to perform three dimensional actions. While many designers have become familiar with this approach, it is not intuitive to an inexperienced user. In order to enhance the usability of the developed application, a new interaction method was applied. Augmented reality (AR) is a developing research area that combines virtual elements with the real world. This capability was used to create a three dimensional interface for the engineering design application. Using specially tracked interface objects, the user's hands become the primary method of interaction. Within this AR environment, users are able to perform many of the basic actions available within a CAD system, such as object manipulation, editing, and assembly. The same design environment also provides real time assessment data. Calculations for center of gravity and wheel loading can be done with the click of a few buttons, and results are displayed to the user in the AR scene.

    In order to support the quantitative analysis tools necessary for conceptual design, additional research was done in the area of metamodeling. Metamodels are capable of providing approximations for more complex analyses. In the case of the wheel loading calculation, the approximation takes the place of a time consuming FEA simulation. Two different metamodeling techniques were studied in this thesis: polynomial response surface (PRS) and polynomial chaos expansion (PCE). While only the wheel loading case study was included in the developed application, an additional design problem was analyzed to assess the capabilities of both methods for conceptual design. In the second study, the maximum stresses and displacements within the support frame of a bucket truck were modeled. The source data for building the approximations was generated via an FEA simulation of digital mockups, since no legacy data was available. With this information, experimental models were constructed by varying several factors, including the distribution of source and test data, the number of input trials, the inclusion of interaction effects, and the addition of third order terms.

    Comparisons were also drawn between the two metamodeling techniques. For the wheel loading models, third order models with interaction effects provided a good fit of the data (root mean square error of less than 10%) with as few as thirty input data points. With minimal source data, however, second order models and those without interaction effects outperformed their third order counterparts. The PRS and PCE methods performed almost equivalently with sufficient source data; differences began to appear at the twenty trial case. PRS was more suited to wider distributions of data, while the PCE technique better handled smaller distributions and extrapolation to larger test data. The support frame problem represented a more difficult analysis with non-linear responses. While initial third order results from the PCE models were better than those for PRS, both had significantly higher error than in the previous case study. However, with simpler second order models and sufficient input data (more than thirty trials), adequate approximation results were achieved. The less complex responses had error around 10%, and the error for the non-linear response was reduced to around 20%. These results demonstrate that useful approximations can be constructed from minimal data. Such models, despite the uncertainty involved, can provide designers with helpful information at the conceptual stage of a design process.
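    As a concrete illustration of the PRS side of this comparison, the sketch below fits a third-order polynomial response surface with interaction effects to thirty sample points and reports a relative root-mean-square error, mirroring the fit criterion quoted above. The function standing in for the FEA wheel-loading simulation, and all names, are hypothetical:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.preprocessing import PolynomialFeatures

# Sketch: a third-order polynomial response surface (PRS) with
# interaction effects, fitted to thirty sample points. The function
# below is a hypothetical stand-in for the expensive FEA wheel-loading
# simulation; it is not data from the thesis.

rng = np.random.default_rng(1)

def simulated_response(X):
    # Hypothetical smooth response in three design variables.
    x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
    return 3.0 * x1 ** 2 * x2 - 1.5 * x2 * x3 + 0.8 * x3 ** 3 + 2.0

# Thirty input trials, matching the smallest "good fit" case above.
X_train = rng.uniform(-1.0, 1.0, size=(30, 3))
y_train = simulated_response(X_train)

# Degree-3 polynomial features include all interaction (cross) terms.
poly = PolynomialFeatures(degree=3)
model = LinearRegression().fit(poly.fit_transform(X_train), y_train)

# Judge the metamodel on fresh test points via a relative RMSE; a value
# under ~10% corresponds to the criterion quoted in the abstract.
X_test = rng.uniform(-1.0, 1.0, size=(200, 3))
y_test = simulated_response(X_test)
y_pred = model.predict(poly.transform(X_test))
rel_rmse = np.sqrt(mean_squared_error(y_test, y_pred)) / np.ptp(y_test)
print(f"relative RMSE: {rel_rmse:.2%}")
```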

    Automating the Analysis of Uncertainties in Multi-Body Dynamic Systems Using Polynomial Chaos Theory

    Variation occurs in many multi-body dynamic (MBD) systems in the geometry, mass, or forces. This variation creates uncertainty in the responses of an MBD system. Understanding how MBD systems respond to variation is imperative for the design of a robust system. However, simulating how variation propagates into the solution is complicated, as most MBD systems cannot be simplified into a system of ordinary differential equations (ODE). This presentation shows the automation of an uncertainty analysis of an MBD system with variation. The first step in automating the solution is to create a robust algorithm based on the Constrained Lagrangian formulation for deriving the equations of motion. Using the Constrained Lagrangian algorithm as a starting point, the new process presented uses polynomial chaos theory (PCT) to embed the stochastic parameters into the equations of motion. To accomplish this, the concept of Variational Work is derived and implemented in the solution. Variational Work applies PCT to the energy terms and the Principle of Virtual Work of the Constrained Lagrangian rather than applying PCT to the equations of motion. Using an automated process for applying PCT to an MBD system, some example problems are solved. Each of these problems is compared to a Monte Carlo analysis using the deterministic automation process. Some of the examples are non-textbook problems, which reveal limitations in the application of PCT to an MBD system. These limitations, and possible ways of overcoming them, are discussed.
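    The sketch below illustrates how PCT represents an uncertain dynamic response and how its statistics compare with a Monte Carlo reference. Note that it uses a non-intrusive Gauss-Hermite collocation variant on a one-degree-of-freedom pendulum, not the intrusive Variational Work formulation described above; the 5% length uncertainty and all other specifics are our assumptions:

```python
import numpy as np
from math import factorial
from scipy.integrate import solve_ivp

# Sketch: polynomial chaos for a one-DOF dynamic system with a random
# parameter. NOTE: this is a *non-intrusive* Gauss-Hermite collocation
# variant, not the intrusive Variational Work formulation described in
# the presentation. Pendulum length L = L0 * (1 + 0.05 * xi), xi ~ N(0,1);
# the 5% uncertainty and every other specific here are assumptions.

L0, g, t_end = 1.0, 9.81, 2.0

def theta_at_tend(xi):
    # Integrate one deterministic realisation of the pendulum.
    L = L0 * (1.0 + 0.05 * xi)
    rhs = lambda t, y: [y[1], -(g / L) * np.sin(y[0])]
    sol = solve_ivp(rhs, (0.0, t_end), [0.3, 0.0], rtol=1e-8, atol=1e-10)
    return sol.y[0, -1]

# Project the response onto probabilists' Hermite polynomials He_k:
# c_k = E[y(xi) He_k(xi)] / k!, estimated by Gauss-Hermite quadrature.
order, n_nodes = 4, 8
nodes, weights = np.polynomial.hermite_e.hermegauss(n_nodes)
weights = weights / np.sqrt(2.0 * np.pi)    # normalise to a probability
H = np.polynomial.hermite_e.hermevander(nodes, order)   # He_k at nodes
norms = np.array([factorial(k) for k in range(order + 1)])  # E[He_k^2]

y_nodes = np.array([theta_at_tend(x) for x in nodes])
coeffs = (H * weights[:, None] * y_nodes[:, None]).sum(axis=0) / norms

mean_pce = coeffs[0]
var_pce = np.sum(coeffs[1:] ** 2 * norms[1:])

# Brute-force Monte Carlo comparison, as in the presentation's studies.
xi_mc = np.random.default_rng(2).normal(size=1000)
y_mc = np.array([theta_at_tend(x) for x in xi_mc])
print(f"PCE  mean={mean_pce:.5f}  var={var_pce:.3e}")
print(f"MC   mean={y_mc.mean():.5f}  var={y_mc.var():.3e}")
```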

    Tilted Euler characteristic densities for Central Limit random fields, with application to "bubbles"

    Local increases in the mean of a random field are detected (conservatively) by thresholding a field of test statistics at a level $u$ chosen to control the tail probability or $p$-value of its maximum. This $p$-value is approximated by the expected Euler characteristic (EC) of the excursion set of the test statistic field above $u$, denoted $\mathbb{E}\varphi(A_u)$. Under isotropy, one can use the expansion $\mathbb{E}\varphi(A_u)=\sum_k\mathcal{V}_k\rho_k(u)$, where $\mathcal{V}_k$ is an intrinsic volume of the parameter space and $\rho_k$ is an EC density of the field. EC densities are available for a number of processes, mainly those constructed from (multivariate) Gaussian fields via smooth functions. Using saddlepoint methods, we derive an expansion for $\rho_k(u)$ for fields which are only approximately Gaussian, but for which higher-order cumulants are available. We focus on linear combinations of $n$ independent non-Gaussian fields, whence a Central Limit theorem is in force. The threshold $u$ is allowed to grow with the sample size $n$, in which case our expression has a smaller relative asymptotic error than the Gaussian EC density. Several illustrative examples including an application to "bubbles" data accompany the theory. (Published in the Annals of Statistics, http://dx.doi.org/10.1214/07-AOS549.)
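    For orientation, the baseline that the paper refines is the Gaussian expected-EC expansion itself. The sketch below evaluates $\mathbb{E}\varphi(A_u)=\sum_k\mathcal{V}_k\rho_k(u)$ for a unit-variance Gaussian field, using the standard Gaussian EC densities; the 2-D square search region with unit roughness is an illustrative assumption:

```python
import numpy as np
from math import erf, pi

# Sketch: the Gaussian baseline of the expected-EC p-value
# approximation. For a unit-variance Gaussian field with unit roughness,
# the EC densities are rho_0(u) = P(Z > u) and, for k >= 1,
# rho_k(u) = (2*pi)^(-(k+1)/2) * He_{k-1}(u) * exp(-u^2/2),
# with He the probabilists' Hermite polynomials. The search region here
# (a 2-D square of side s) is an illustrative assumption.

def gaussian_ec_density(k, u):
    if k == 0:
        return 0.5 * (1.0 - erf(u / np.sqrt(2.0)))
    He = np.polynomial.hermite_e.HermiteE.basis(k - 1)
    return (2.0 * pi) ** (-(k + 1) / 2.0) * He(u) * np.exp(-u ** 2 / 2.0)

def expected_ec(u, side=10.0):
    # Intrinsic volumes of an s-by-s square: V0 = 1 (Euler characteristic),
    # V1 = 2s (half the perimeter), V2 = s^2 (area).
    V = [1.0, 2.0 * side, side ** 2]
    return sum(Vk * gaussian_ec_density(k, u) for k, Vk in enumerate(V))

# The expected EC of the excursion set approximates P(max > u) for high u.
for u in (2.5, 3.0, 3.5, 4.0):
    print(f"u = {u}:  E[EC] = {expected_ec(u):.4f}")
```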

    Feature extraction for image quality prediction


    Constructing velocity distributions in crossed-molecular beam studies using fourier transform doppler spectroscopy

    The goal of our scattering experiments is to derive the velocity distribution and differential cross section, and to elucidate the dynamics, of a bimolecular collision via pure rotational spectroscopy. We have explored the use of a data reduction model to directly transform rotational line shapes into the differential cross section and speed distribution of a reactive bimolecular collision. This inversion technique, known as Fourier Transform Doppler Spectroscopy (FTDS) and initially developed by James Kinsey [1], deconvolves the velocity information contained in one-dimensional Doppler profiles to construct the non-thermal, state-selective three-dimensional velocity distribution. By employing an expansion in classical orthogonal polynomials, the integral transform in FTDS can be simplified into a set of purely algebraic expressions; this is the Taatjes method [2]. In this investigation, we extend the Taatjes method for general use in recovering asymmetric velocity distributions. We have also constructed a hypothetical asymmetric distribution from adiabatic scattering in Argon-Argon to test the general method. The angle- and speed-components of the sample distribution were derived classically from a Lennard-Jones 6-12 potential, with collisions at 60 meV, and mapped onto Radon space to generate a set of discrete Doppler profiles. The sample distribution was reconstructed from these profiles using FTDS. Both distributions were compared along with derived total cross sections for the Argon-Argon system. This study serves as a template for constructing velocity distributions from bimolecular scattering experiments using the FTDS inversion technique.
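    The geometric core of FTDS, that a Doppler profile is a line-of-sight projection of the velocity distribution, can be illustrated with an isotropic toy model. The sketch below projects an assumed speed distribution into a Doppler profile and inverts it analytically; the asymmetric, state-selective case treated by the Taatjes-method expansion is substantially more involved:

```python
import numpy as np

# Sketch: the geometric core of FTDS. A one-dimensional Doppler profile
# is the line-of-sight projection of the 3-D velocity distribution. For
# an *isotropic* speed distribution P(v) the profile is
#   D(w) = integral from |w| to infinity of P(v) / (2 v) dv,
# which inverts to P(v) = -2 v dD/dv. The shell-like P(v) below and the
# isotropy itself are toy assumptions; they are not the experiment's
# actual distribution or the Taatjes expansion.

v = np.linspace(0.0, 2000.0, 4001)             # speed grid (m/s)
dv = v[1] - v[0]
P = np.exp(-0.5 * ((v - 800.0) / 120.0) ** 2)  # speed distribution
P /= P.sum() * dv                              # normalise numerically

# Forward projection: tail integral of P(v)/(2v), i.e. the Doppler
# profile for non-negative line-of-sight velocities w = v[i].
integrand = np.where(v > 0.0, P / (2.0 * np.clip(v, dv, None)), 0.0)
D = np.cumsum(integrand[::-1])[::-1] * dv

# Inversion: P(v) = -2 v dD/dv via a centred finite difference.
P_rec = -2.0 * v * np.gradient(D, dv)
print("max relative recovery error:", np.abs(P_rec - P).max() / P.max())
```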

    Image representation and compression using steered hermite transforms


    An instrumental measure for the perceived blockiness in JPEG-coded images
