
    BONNSAI: a Bayesian tool for comparing stars with stellar evolution models

    Powerful telescopes equipped with multi-fibre or integral-field spectrographs, combined with detailed models of stellar atmospheres and automated fitting techniques, allow the analysis of large numbers of stars. These datasets contain a wealth of information that requires new analysis techniques to bridge the gap between observations and stellar evolution models. To that end, we develop BONNSAI (BONN Stellar Astrophysics Interface), a Bayesian statistical method that is capable of comparing all available observables simultaneously to stellar models while taking observed uncertainties and prior knowledge, such as initial mass functions and distributions of stellar rotational velocities, into account. BONNSAI can be used to (1) determine probability distributions of fundamental stellar parameters such as initial masses and stellar ages from complex datasets, (2) predict stellar parameters that have not yet been observationally determined, and (3) test stellar models to further advance our understanding of stellar evolution. An important aspect of BONNSAI is that it singles out stars that cannot be reproduced by stellar models, through χ² hypothesis tests and posterior predictive checks. BONNSAI can be used with any set of stellar models and currently supports massive main-sequence single-star models of Milky Way and Large and Small Magellanic Cloud composition. We apply our new method to mock stars to demonstrate its functionality and capabilities. In a first application, we use BONNSAI to test the stellar models of Brott et al. (2011a) by comparing the stellar ages inferred for the primary and secondary stars of eclipsing Milky Way binaries. Ages are determined from dynamical masses and radii that are known to better than 3%. We find that the stellar models reproduce the Milky Way binaries well. BONNSAI is available through a web interface at http://www.astro.uni-bonn.de/stars/bonnsai.
    Comment: Accepted for publication in A&A; 15 pages, 10 figures, 4 tables
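    In outline, the method amounts to evaluating a Bayesian posterior over a grid of stellar models. Below is a minimal sketch of that idea, assuming a toy two-observable grid, Gaussian observational errors, and a Salpeter-like power-law IMF prior; the grid values, priors, and names are illustrative and are not BONNSAI's actual tables or interface.

```python
import numpy as np

# Hypothetical model grid: each row is (M_ini [Msun], age [Myr],
# predicted Teff [K], predicted log(L/Lsun)).  In practice this would
# come from a stellar-evolution grid such as Brott et al. (2011a).
grid = np.array([
    [10.0,  5.0, 25000, 3.9],
    [10.0, 15.0, 23500, 4.0],
    [12.0,  5.0, 27000, 4.1],
    [12.0, 15.0, 25500, 4.2],
    [15.0,  5.0, 30000, 4.4],
    [15.0, 15.0, 28000, 4.5],
])

# Observed star: Teff = 26000 +/- 1000 K, log(L/Lsun) = 4.15 +/- 0.10
obs = np.array([26000, 4.15])
err = np.array([1000, 0.10])

# Gaussian likelihood of each grid model given the observables
chi2 = np.sum(((grid[:, 2:] - obs) / err) ** 2, axis=1)
like = np.exp(-0.5 * chi2)

# Salpeter IMF prior on initial mass, flat prior on age
prior = grid[:, 0] ** -2.35

post = like * prior
post /= post.sum()

# Marginalised posterior over initial mass
for m in np.unique(grid[:, 0]):
    print(f"M_ini = {m:4.1f} Msun : P = {post[grid[:, 0] == m].sum():.3f}")
```

    The posterior predictive checks mentioned in the abstract would then compare replicated observables drawn from such a posterior against the measured ones, flagging stars no model in the grid can reproduce.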

    High-precision astrometry on the VLT/FORS1 at time scales of few days

    We investigate the accuracy of astrometric measurements with the VLT/FORS1 camera and consider potential applications. The study is based on two-epoch (2000 and 2002/2003) frame series of observations of a selected Galactic Bulge sky region that were obtained with FORS1 during four consecutive nights each. Reductions were carried out with a novel technique that eliminates atmospheric image motion and does not require a distinction between targets and reference objects. The positional astrometric precision was found to be limited only by the accuracy of the determination of the star photocentre, which is typically 200-300 microarcsec per single measurement for bright unsaturated stars of B=18-19. Several statistical tests have shown that at time scales of 1-4 nights the residual noise in measured positions is essentially white noise, with no systematic instrumental signature and no significant deviation from a Gaussian distribution. Some evidence of good astrometric quality of the VLT for frames separated by two years has also been found. Our data show that the VLT with the FORS1/2 cameras can be effectively used for astrometric observations of planetary microlensing events and other applications where high accuracy is required; that accuracy is expected to reach 30-40 microarcsec for a series of 50 frames (one hour with the R filter).
    Comment: 11 pages, 9 figures, accepted for publication in A&A
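    Because the per-frame residuals are shown to be white and Gaussian, the precision of an epoch mean should improve as 1/√N with the number of frames. A quick sanity check of the quoted figures (illustrative numbers only, taking the mid-range of the per-frame precision):

```python
import math

single_frame = 250e-6   # per-frame precision in arcsec (mid-range of 200-300 microarcsec)
n_frames = 50           # frames in one series (about one hour in the R filter)

# For uncorrelated (white) noise, the error of the mean shrinks as 1/sqrt(N)
series_precision = single_frame / math.sqrt(n_frames)
print(f"{series_precision * 1e6:.0f} microarcsec")  # ~35, within the quoted 30-40
```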

    Space biology initiative program definition review. Trade study 4: Design modularity and commonality

    The relative cost impacts (up or down) of developing Space Biology hardware using design modularity and commonality are studied. Recommendations are provided for how hardware development should be accomplished to meet optimum design modularity requirements for Life Science investigation hardware. In addition, the relative cost impacts of implementing commonality across all Space Biology hardware are defined. A cost analysis and supporting recommendations for levels of modularity and commonality are presented. A mathematical or statistical cost analysis method is also provided that is capable of supporting the development of production design modularity and commonality impacts within parametric cost analysis.
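    The abstract does not spell out the cost method itself. Purely as an illustration of how commonality can enter a parametric cost estimate, the sketch below shares one non-recurring design cost across the items that reuse a common module; all numbers, coefficients, and function names are hypothetical.

```python
# Illustrative parametric cost sketch (not the study's actual model):
# commonality lets several hardware items share one non-recurring design cost.

def program_cost(n_items, nre_per_design, unit_cost, commonality=0.0):
    """Total cost of n_items when a fraction `commonality` of them
    reuses a single common design instead of unique ones."""
    unique_designs = n_items * (1.0 - commonality) + (1 if commonality > 0 else 0)
    return unique_designs * nre_per_design + n_items * unit_cost

# 20 instruments, $2M non-recurring cost per unique design, $0.5M per unit
baseline = program_cost(20, 2.0, 0.5, commonality=0.0)
common   = program_cost(20, 2.0, 0.5, commonality=0.5)  # half share one design
print(f"baseline ${baseline:.1f}M vs 50% commonality ${common:.1f}M")
```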

    On exponential cosmological type solutions in the model with Gauss-Bonnet term and variation of gravitational constant

    A D-dimensional gravitational model with a Gauss-Bonnet term is considered. When an ansatz with diagonal cosmological-type metrics is adopted, we find solutions with exponential dependence of the scale factors (with respect to a "synchronous-like" variable) which describe an exponential expansion of "our" 3-dimensional factor space and obey the observational constraints on the temporal variation of the effective gravitational constant G. Among them there are two exact solutions in dimensions D = 22, 28 with constant G, and also an infinite series of solutions in dimensions D ≥ 2690 with the variation of G obeying the observational data.
    Comment: 21 pages, 12 figures, LaTeX; eq. (2.1) is modified, several sentences are added, a typo in eq. (3.13) is eliminated
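    For orientation, here is a reconstruction (mine, not the paper's numbered equations) of the kind of ansatz the abstract describes: a diagonal cosmological-type metric with exponential scale factors, where the effective gravitational constant is governed by the volume of the internal factor space.

```latex
% Reconstruction for orientation only -- not the paper's actual equations.
% Diagonal cosmological-type metric with exponential scale factors:
ds^2 = -dt^2 + \sum_{i=1}^{D-1} e^{2 v^i t}\,(dx^i)^2 .
% "Our" 3-dimensional factor space expands exponentially when
% v^1 = v^2 = v^3 = H > 0.  The effective 4-dimensional gravitational
% constant scales inversely with the internal volume, so
G_{\mathrm{eff}}(t) \propto \exp\!\Big(-\sum_{i=4}^{D-1} v^i t\Big),
% and an exactly constant G requires the internal expansion rates to cancel:
\sum_{i=4}^{D-1} v^i = 0 .
```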

    Space biology initiative program definition review. Trade study 3: Hardware miniaturization versus cost

    The optimum hardware miniaturization level with the lowest cost impact for space biology hardware was determined. Space biology hardware and/or components, subassemblies, and assemblies that are the most likely candidates for miniaturization are defined, and the relative cost impacts of such miniaturization are analyzed. A mathematical or statistical analysis method is provided that is capable of supporting the development of parametric cost analysis impacts for levels of production design miniaturization.
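    As with the companion commonality study, the method itself is not given in the abstract. The sketch below only illustrates the shape of the trade behind an "optimum miniaturization level": development cost rises while launch (mass/volume) cost falls as miniaturization increases, so total cost has an interior minimum. All coefficients are hypothetical.

```python
# Illustrative trade sketch (not the study's actual model).

def total_cost(level, dev_base=1.0, launch_base=4.0):
    """level in [0, 1]: 0 = off-the-shelf packaging, 1 = maximum miniaturization."""
    development = dev_base * (1.0 + 3.0 * level ** 2)  # escalating design effort
    launch      = launch_base * (1.0 - 0.7 * level)    # lighter, smaller hardware
    return development + launch

levels = [i / 10 for i in range(11)]
best = min(levels, key=total_cost)
print(f"lowest-cost miniaturization level ~ {best:.1f}")  # interior optimum, ~0.5 here
```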

    Locating Encrypted Data Hidden Among Non-Encrypted Data using Statistical Tools

    This research tests the security of software protection techniques that use encryption to protect code segments containing critical algorithm implementations, in order to prevent reverse engineering. Using the National Institute of Standards and Technology (NIST) Tests for Randomness, encrypted regions hidden among non-encrypted bits of a binary executable file are located. Ciphertext from four encryption algorithms (AES, DES, RSA, and TEA) and three block sizes (10, 100, and 500 32-bit words) was tested during the development of the techniques described in this research. The test files were generated from the Win32 binary executable file of Adobe's Acrobat Reader version 7.0.9. The culmination of this effort is a technique capable of locating 100% of the encrypted regions with no false-negative error and minimal false-positive error at 95% confidence. The encrypted region must be encrypted with a strong encryption algorithm whose ciphertext appears statistically random to the NIST Tests for Randomness, and the size of the encrypted region must be at least 100 32-bit words (3,200 bits).
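    The idea can be approximated with the frequency (monobit) test from NIST SP 800-22, the simplest test in the suite, slid across the file in 3,200-bit windows. A minimal sketch follows; the actual research applies the full NIST test suite with tuned thresholds, so a lone monobit test as used here will over-flag, and the window stride and significance level are assumptions.

```python
import math
import os

def monobit_p_value(window: bytes) -> float:
    """NIST SP 800-22 frequency (monobit) test: p-value for the
    hypothesis that the window's bits are random."""
    n = len(window) * 8
    ones = sum(bin(b).count("1") for b in window)
    s_obs = abs(2 * ones - n) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

def find_random_regions(data: bytes, window_bytes=400, step=64, alpha=0.01):
    """Slide a 3,200-bit (400-byte) window over the file and flag
    windows whose bits look statistically random (candidate ciphertext)."""
    return [off for off in range(0, len(data) - window_bytes + 1, step)
            if monobit_p_value(data[off:off + window_bytes]) >= alpha]

# Demo: structured (all-zero) bytes with an 800-byte random region inside,
# standing in for ciphertext embedded in an executable.
data = bytes(4096) + os.urandom(800) + bytes(4096)
print(find_random_regions(data))  # flagged offsets cluster around 4096
```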