
    Reliability model for component-based systems in COSMIC (a case study)

    Software component technology has a substantial impact on modern IT evolution. The benefits of this technology, such as reusability, complexity management, time and effort reduction, and increased productivity, have been key drivers of its adoption by industry. One of the main issues in building component-based systems is the reliability of the composed functionality of the assembled components. This paper proposes a reliability assessment model based on the architectural configuration of a component-based system and the reliability of the individual components, which is usage- or testing-independent. The goal of this research is to improve the reliability assessment process for large software component-based systems over time, and to compare alternative component-based system design solutions prior to implementation. The novelty of the proposed reliability assessment model lies in the evaluation of the component reliability from its behavior specifications, and of the system reliability from its topology; the reliability assessment is performed in the context of the implementation-independent ISO/IEC 19761:2003 International Standard on the COSMIC method, chosen to provide the component's behavior specifications. In essence, each component of the system is modeled as a discrete-time Markov chain derived from its behavior specifications expressed with extended-state machines. Then, a probabilistic analysis by means of Markov chains is performed to analyze any uncertainty in the component's behavior. Our hypothesis states that the less uncertainty there is in the component's behavior, the greater the reliability of the component. The system reliability assessment is derived from a typical component-based system architecture with composite reliability structures, which may include compositions of serial, parallel, and p-out-of-n reliability structures.
The approach of assessing component-based system reliability in the COSMIC context is illustrated with the railroad crossing case study. © 2008 World Scientific Publishing Company
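The serial, parallel, and p-out-of-n compositions mentioned in the abstract follow standard reliability algebra. As a minimal sketch (the function names and the composite example are illustrative, not taken from the paper), assuming independent components:

```python
from math import comb

def serial(rels):
    # Serial structure: the system works only if every component works.
    p = 1.0
    for r in rels:
        p *= r
    return p

def parallel(rels):
    # Parallel structure: the system fails only if every component fails.
    q = 1.0
    for r in rels:
        q *= (1.0 - r)
    return 1.0 - q

def k_out_of_n(r, k, n):
    # p-out-of-n structure with n identical components of reliability r:
    # the system works if at least k of the n components work (binomial sum).
    return sum(comb(n, i) * r**i * (1.0 - r)**(n - i) for i in range(k, n + 1))

# Hypothetical composite: a serial chain of one component, a redundant
# parallel pair, and a 2-out-of-3 voting stage.
system = serial([0.99, parallel([0.9, 0.9]), k_out_of_n(0.95, 2, 3)])
```

Such compositions nest freely, which is what allows the system reliability to be read off the architectural topology.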

    Reconstructing the Cosmic Velocity and Tidal Fields with Galaxy Groups Selected from the Sloan Digital Sky Survey

    [Abridged] Cosmic velocity and tidal fields are important for understanding the cosmic web and the environments of galaxies, and can also be used to constrain cosmology. In this paper, we reconstruct these two fields in the SDSS volume from dark matter halos represented by galaxy groups. Detailed mock catalogues are used to test the reliability of our method against uncertainties arising from redshift distortions, survey boundaries, and false identifications of groups by our group finder. We find that both the velocity and tidal fields, smoothed on a scale of ~2 Mpc/h, can be reliably reconstructed in the inner region (~66%) of the survey volume. The reconstructed tidal field is used to split the cosmic web into clusters, filaments, sheets, and voids, depending on the signs of the eigenvalues of the tidal tensor. The reconstructed velocity field nicely shows how the flows diverge from the centers of voids and converge onto clusters, while sheets and filaments have flows that are convergent along one and two directions, respectively. We use the reconstructed velocity field and the Zel'dovich approximation to predict the mass density field in the SDSS volume as a function of redshift, and find that the mass distribution closely follows the galaxy distribution even on small scales. We find a large-scale bulk flow of about 117 km/s in a very large volume, equivalent to a sphere with a radius of ~170 Mpc/h, which seems to be produced by the massive structures associated with the SDSS Great Wall. Finally, we discuss potential applications of our reconstruction to studying the environmental effects of galaxy formation, to generating initial conditions for simulations of the local Universe, and to constraining cosmological models. The velocity, tidal, and density fields in the SDSS volume, specified on a Cartesian grid with a spatial resolution of ~700 kpc/h, are available from the authors upon request. Comment: 35 pages, 13 figures, accepted for publication in MNRAS
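The cluster/filament/sheet/void split by eigenvalue signs described above can be sketched as follows; this is a generic illustration of the sign-counting rule, not the authors' code, and the zero threshold is an assumption:

```python
import numpy as np

def classify_web(tidal_tensor, lambda_th=0.0):
    # Classify a point of the cosmic web from the eigenvalues of the
    # symmetric 3x3 tidal tensor at that point. Counting eigenvalues
    # above the threshold: 3 positive -> cluster (collapse along all
    # axes), 2 -> filament, 1 -> sheet, 0 -> void.
    eigvals = np.linalg.eigvalsh(tidal_tensor)
    n_pos = int(np.sum(eigvals > lambda_th))
    return ["void", "sheet", "filament", "cluster"][n_pos]
```

A nonzero `lambda_th` is sometimes used in the literature to tune the volume fractions of the four environments.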

    Early Quantitative Assessment of Non-Functional Requirements

    Non-functional requirements (NFRs) of software systems are a well-known source of uncertainty in effort estimation. Yet, quantitatively approaching NFRs early in a project is hard. This paper makes a step towards reducing the impact of uncertainty due to NFRs. It offers a solution that incorporates NFRs into the functional size quantification process. The merits of our solution are twofold: first, it lets us quantitatively assess the NFR modeling process early in the project, and second, it lets us generate test cases for NFR verification purposes. We chose the NFR framework as a vehicle to integrate NFRs into the requirements modeling process and to apply quantitative assessment procedures. Our solution proposal also rests on the functional size measurement method COSMIC-FFP, adopted in 2003 as the ISO/IEC 19761 standard. We extend its use for NFR testing purposes, which is an essential step for improving NFR development and testing effort estimates, and consequently for managing the scope of NFRs. We also discuss the advantages of our approach and the open questions related to its design.
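For readers unfamiliar with the quantification step, COSMIC (ISO/IEC 19761) sizes a functional process by counting its data movements, one COSMIC Function Point (CFP) per movement of type Entry, Exit, Read, or Write. A minimal sketch, with an entirely hypothetical "log in" process as the example:

```python
# Sketch of COSMIC functional size measurement: 1 CFP per data movement.
MOVEMENT_TYPES = {"Entry", "Exit", "Read", "Write"}

def cosmic_size(functional_process):
    # functional_process: list of (movement_type, data_group) tuples.
    for mtype, _ in functional_process:
        if mtype not in MOVEMENT_TYPES:
            raise ValueError(f"unknown data movement: {mtype}")
    return len(functional_process)  # each movement contributes 1 CFP

# Hypothetical process: credentials enter, stored data is read,
# a result message exits -> 3 CFP.
login = [("Entry", "user credentials"),
         ("Read", "stored credentials"),
         ("Exit", "login result")]
```

Incorporating NFRs then amounts to modeling their operationalizations as additional functional processes that are sized the same way.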

    Particle detection technology for space-borne astroparticle experiments

    I review the transfer of technology from accelerator-based equipment to space-borne astroparticle detectors. Requirements for the detection, identification, and measurement of ions, electrons, and photons in space are recalled. The additional requirements and restrictions imposed by the launch process in manned and unmanned space flight, as well as by the hostile environment in orbit, are analyzed. Technology readiness criteria and risk mitigation strategies are reviewed. Recent examples are given of missions and instruments in orbit, under construction, or in the planning phase. Comment: Technology and Instrumentation in Particle Physics 2014 (TIPP 2014), June 2-6, 2014, Amsterdam, The Netherlands

    Galaxy evolution by color-log(n) type since redshift unity in the Hubble Ultra Deep Field

    We explore the use of the color-log(n) plane (where n is the global Sersic index) as a tool for subdividing the high-redshift galaxy population in a physically motivated manner. Using a series of volume-limited samples out to z=1.5 in the Hubble Ultra Deep Field (UDF), we confirm the correlation between color-log(n) plane position and visual morphology observed locally and in other high-redshift studies in the color and/or structure domain. Via comparison to a low-redshift sample from the Millennium Galaxy Catalogue we quantify evolution by color-log(n) type, accounting separately for the specific selection and measurement biases against each. Specifically, we measure decreases in B-band surface brightness of 1.57 +/- 0.22 mag/sq.arcsec and 1.65 +/- 0.22 mag/sq.arcsec for `blue, diffuse' and `red, compact' galaxies, respectively, between redshift unity and the present day. Comment: 12 pages, 6 figures, to be published in A&A (accepted 29/10/08)

    Space vehicle propulsion systems: Environmental space hazards

    The hazards that exist in geolunar space and that may degrade, disrupt, or terminate the performance of space-based LOX/LH2 rocket engines are evaluated. Accordingly, a summary of the open literature pertaining to the geolunar space hazards is provided. Approximately 350 citations and about 200 documents and abstracts were reviewed; the documents selected give current and quantitative detail. The methodology was to categorize the various space hazards in relation to their importance in specified regions of geolunar space. Additionally, the effects of the various space hazards on spacecraft and their systems were investigated. It was found that further investigation of the literature would be required to assess the effects of these hazards on propulsion systems per se; in particular, possible degrading effects on exterior nozzle structure, directional gimbals, and internal combustion chamber integrity and geometry.

    Non-functional requirements: size measurement and testing with COSMIC-FFP

    The non-functional requirements (NFRs) of software systems are well known to add a degree of uncertainty to the process of estimating the cost of any project. This paper contributes to the achievement of more precise project size measurement through incorporating NFRs into the functional size quantification process. We report on an initial solution proposed to deal with the problem of quantitatively assessing the NFR modeling process early in the project, and of generating test cases for NFR verification purposes. The NFR framework has been chosen for the integration of NFRs into the requirements modeling process and for their quantitative assessment. Our proposal is based on the functional size measurement method COSMIC-FFP, adopted in 2003 as the ISO/IEC 19761 standard. In this paper, we also extend the use of COSMIC-FFP for NFR testing purposes. This is an essential step for improving NFR development and testing effort estimates, and consequently for managing the scope of NFRs. We discuss the merits of the proposed approach and the open questions related to its design.

    Experimental Study Using Functional Size Measurement in Building Estimation Models for Software Project Size

    This paper reports on an experiment that investigates the predictability of software project size from software product size. The predictability research problem is analyzed at the stage of early requirements by accounting for the size of functional requirements as well as the size of non-functional requirements. The experiment was carried out with 55 graduate students in Computer Science from Concordia University in Canada. In the experiment, a functional size measure and a project size measure were used in building estimation models for sets of web application development projects. The results show that project size is predictable from product size. Further replications of the experiment are, however, planned to obtain more results to confirm or disconfirm our claim.
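An estimation model of the kind described is typically an ordinary least squares fit of project size against functional product size. A minimal sketch, with invented data points (the CFP and effort values below are illustrative, not the experiment's data):

```python
# Fit project size (e.g. person-hours) as a linear function of
# functional product size (e.g. COSMIC Function Points) by OLS.
def fit_ols(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b  # project_size ~= a + b * product_size

# Illustrative data: functional size in CFP vs. project size in hours.
cfp    = [20, 35, 50, 65, 80]
effort = [110, 180, 260, 330, 410]
a, b = fit_ols(cfp, effort)

def predict(product_size):
    return a + b * product_size
```

Replications would then compare the fitted slope and the model's residual error across project sets.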

    ELUCID - Exploring the Local Universe with reConstructed Initial Density field III: Constrained Simulation in the SDSS Volume

    A method we developed recently for the reconstruction of the initial density field in the nearby Universe is applied to the Sloan Digital Sky Survey Data Release 7. A high-resolution N-body constrained simulation (CS) of the reconstructed initial condition, with 3072^3 particles evolved in a 500 Mpc/h box, is carried out and analyzed in terms of the statistical properties of the final density field and its relation with the distribution of SDSS galaxies. We find that the statistical properties of the cosmic web and the halo populations are accurately reproduced in the CS. The galaxy density field is strongly correlated with the CS density field, with a bias that depends on both galaxy luminosity and color. Our further investigations show that the CS provides robust quantities describing the environments within which the observed galaxies and galaxy systems reside. Cosmic variance is greatly reduced in the CS, so that the statistical uncertainties can be controlled effectively even for samples of small volumes. Comment: submitted to ApJ, 19 pages, 22 figures. Please download the high-resolution version at http://staff.ustc.edu.cn/~whywang/paper
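The luminosity- and color-dependent bias mentioned above relates a galaxy overdensity field to the simulated matter overdensity field on the same grid. A simple linear estimator, as a generic sketch (not the authors' estimator), is b = <delta_g * delta_m> / <delta_m^2>:

```python
import numpy as np

def linear_bias(delta_g, delta_m):
    # Least-squares amplitude of delta_g regressed on delta_m over all
    # grid cells: b = <delta_g * delta_m> / <delta_m^2>.
    dg = np.asarray(delta_g, dtype=float).ravel()
    dm = np.asarray(delta_m, dtype=float).ravel()
    return float(np.dot(dg, dm) / np.dot(dm, dm))
```

Computing this estimator separately for subsamples split by luminosity or color exposes the dependence the abstract describes.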