5,708 research outputs found

    Scientific applications of radio and radar tracking in the space program Conference proceedings

    Get PDF
    Radar and radio tracking applications in the space program.

    ICE Second Halley radial: TDA mission support and DSN operations

    Get PDF
    The article documents the operations encompassing the International Cometary Explorer (ICE) second Halley radial experiment, centered around March 28, 1986. The support was provided by the Deep Space Network (DSN) 64-meter subnetwork. Near-continuous support was provided during the last two weeks of March and the first two weeks of April to ensure the collection of adequate background data for the Halley radial experiment. During the last week of March, plasma wave measurements indicated that ICE was within the Halley heavy-ion pick-up region.

    Cross-correlations in scaling analyses of phase transitions

    Get PDF
    Thermal or finite-size scaling analyses of importance-sampling Monte Carlo time series in the vicinity of phase transition points often combine different estimates of the same quantity, such as a critical exponent, with the intent of reducing statistical fluctuations. We point out that because such estimates originate from the same time series, they exhibit often pronounced cross-correlations which are usually ignored even in high-precision studies, generically leading to significant underestimation of statistical fluctuations. We suggest using a simple extension of the conventional analysis that takes correlation effects into account, which leads to improved estimators with often substantially reduced statistical fluctuations at almost no extra cost in terms of computation time. Comment: 4 pages, RevTEX4, 3 tables, 1 figure.
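
    A minimal sketch of the idea (not the paper's code): two estimates of the same quantity derived from the same time series are combined with weights from the full covariance matrix, and the correlation-aware error is compared with the naive error that ignores the cross-covariance. The block values below are simulated stand-ins for jackknife or binning estimates.

```python
import numpy as np

# Hypothetical correlated block estimates of the same quantity
# (e.g. a critical exponent) from one Monte Carlo time series.
rng = np.random.default_rng(0)
common = rng.normal(size=200)                       # shared fluctuations
a = 1.0 + 0.05 * common + 0.02 * rng.normal(size=200)
b = 1.0 + 0.05 * common + 0.03 * rng.normal(size=200)

est = np.array([a.mean(), b.mean()])
cov = np.cov(np.vstack([a, b])) / len(a)            # covariance of the means

# Optimal weights for a linear combination with weights summing to one.
ones = np.ones(2)
w = np.linalg.solve(cov, ones)
w /= ones @ w

combined = w @ est
var_full = w @ cov @ w                              # correlation-aware variance
var_naive = (w**2 * np.diag(cov)).sum()             # ignores cross-correlation

print(f"combined = {combined:.4f} +/- {np.sqrt(var_full):.4f}")
print(f"naive error estimate: {np.sqrt(var_naive):.4f} (underestimated)")
```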

    Optimal discrete stopping times for reliability growth tests

    Get PDF
    Often, the duration of a reliability growth development test is specified in advance and the decision to terminate or continue testing is made at discrete time intervals. These features are normally not captured by reliability growth models. This paper adapts a standard reliability growth model to determine the optimal time at which to plan to terminate testing. The underlying stochastic process is developed from an order-statistic argument, with Bayesian inference used to estimate the number of faults within the design and classical inference procedures used to assess the rate of fault detection. Inference procedures within this framework are explored, and it is shown that the Maximum Likelihood Estimators possess a small bias and converge to the Minimum Variance Unbiased Estimator after a few tests for designs with a moderate number of faults. It is shown that the likelihood function can be bimodal when there is conflict between the observed rate of fault detection and the prior distribution describing the number of faults in the design. An illustrative example is provided.
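
    An illustrative sketch, under assumptions that are not necessarily the paper's exact model: a Poisson prior on the number of faults N is combined with a simple detection likelihood in which each fault is found in any single test with probability p, and the joint posterior mode over (N, p) is located on a grid. All numbers are hypothetical.

```python
import numpy as np
from scipy.special import comb
from scipy.stats import poisson

n, k = 10, 7            # hypothetical: 10 discrete tests run, 7 faults found
mu = 12.0               # hypothetical prior mean number of faults in the design
p_grid = np.linspace(0.01, 0.6, 60)
N_grid = np.arange(k, 40)

def log_post(N, p):
    """Log posterior of (N, p) for k faults detected after n tests."""
    q = (1.0 - p) ** n                              # P(a fault stays undetected)
    like = comb(N, k) * (1 - q) ** k * q ** (N - k)
    return np.log(like) + poisson.logpmf(N, mu)

grid = np.array([[log_post(N, p) for p in p_grid] for N in N_grid])
i, j = np.unravel_index(np.argmax(grid), grid.shape)
print(f"joint posterior mode: N = {N_grid[i]}, p = {p_grid[j]:.2f}")
```

    When the observed detection rate conflicts with the prior on N, a surface like this can show two competing modes, which is the bimodality the abstract refers to.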

    sscMap: An extensible Java application for connecting small-molecule drugs using gene-expression signatures

    Get PDF
    Background: Connectivity mapping is a process for recognizing novel pharmacological and toxicological properties in small molecules by comparing their gene-expression signatures with others in a database. A simple and robust method for connectivity mapping with increased specificity and sensitivity was recently developed, and its utility demonstrated using experimentally derived gene signatures. Results: This paper introduces sscMap (statistically significant connections' map), a Java application designed to undertake connectivity mapping tasks using the recently published method. The software is bundled with a default collection of reference gene-expression profiles based on the publicly available dataset from the Broad Institute Connectivity Map 02, which includes data from over 7000 Affymetrix microarrays, for over 1000 small-molecule compounds, and 6100 treatment instances in 5 human cell lines. In addition, the application allows users to add their custom collections of reference profiles and is applicable to a wide range of other 'omics technologies. Conclusions: The utility of sscMap is twofold. First, it serves to make statistically significant connections between a user-supplied gene signature and the 6100 core reference profiles based on the Broad Institute expanded dataset. Second, it allows users to apply the same improved method to custom-built reference profiles which can be added to the database for future referencing. The software can be freely downloaded from http://purl.oclc.org/NET/sscMap Comment: 3 pages, 1 table, 1 eps figure.
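
    A hedged illustration in the spirit of connectivity mapping, not sscMap's actual implementation: a signed, rank-based connection score between a query gene signature and one reference expression profile. Gene names and values are made up.

```python
import numpy as np

# Hypothetical reference profile (differential expression per gene)
# and a signed query signature (+1 up-regulated, -1 down-regulated).
reference = {"GENE_A": 2.1, "GENE_B": -1.7, "GENE_C": 0.4,
             "GENE_D": -0.2, "GENE_E": 3.0}
signature = {"GENE_A": +1, "GENE_B": -1, "GENE_E": +1}

# Rank reference genes by absolute differential expression, keeping the
# sign of regulation; the most-changed gene receives the largest rank.
genes = sorted(reference, key=lambda g: abs(reference[g]))
signed_rank = {g: (i + 1) * np.sign(reference[g]) for i, g in enumerate(genes)}

raw = sum(sign * signed_rank[g] for g, sign in signature.items())
max_score = sum(range(len(genes), len(genes) - len(signature), -1))
print(f"connection score = {raw / max_score:+.3f}")   # normalized to [-1, 1]
```

    In a full analysis the score would be computed against every reference profile and assessed for statistical significance, which is the part sscMap's published method addresses.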

    Cost-effectiveness of asthma control: an economic appraisal of the GOAL study

    Get PDF
    Background: The Gaining Optimal Asthma ControL (GOAL) study has shown the superiority of a combination of salmeterol/fluticasone propionate (SFC) compared with fluticasone propionate alone (FP) in terms of improving guideline-defined asthma control. Methods: Clinical and economic data were taken from the GOAL study, supplemented with data on health-related quality of life, in order to estimate the cost per quality-adjusted life year (QALY) for each of three strata (previously corticosteroid-free, low- and moderate-dose corticosteroid users). A series of statistical models of trial outcomes was used to construct cost-effectiveness estimates across the strata of the multinational GOAL study, including adjustment to the UK experience. Uncertainty was handled using the non-parametric bootstrap. Cost-effectiveness was compared with other treatments for chronic conditions. Results: Salmeterol/fluticasone propionate improved the proportion of patients achieving totally and well-controlled weeks, resulting in a similar QALY gain across the three strata of GOAL. Additional costs of treatment were greatest in stratum 1 and least in stratum 3, with some of the costs offset by reduced health care resource use. Cost-effectiveness by stratum was £7600 (95% CI: £4800–10 700) per QALY gained for stratum 3; £11 000 (£8600–14 600) per QALY gained for stratum 2; and £13 700 (£11 000–18 300) per QALY gained for stratum 1. Conclusion: The GOAL study previously demonstrated the improvement in total control associated with the use of SFC compared with FP alone. This study suggests that this improvement in control is associated with cost-per-QALY figures that compare favourably with other uses of scarce health care resources.
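
    A minimal sketch of the kind of calculation described, assuming simulated patient-level costs and QALYs rather than GOAL data: an incremental cost-effectiveness ratio (cost per QALY gained) with a non-parametric bootstrap confidence interval.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
# Simulated stand-ins for per-patient cost (£) and QALYs in each arm.
cost_sfc, qaly_sfc = rng.normal(900, 150, n), rng.normal(0.80, 0.05, n)
cost_fp,  qaly_fp  = rng.normal(650, 120, n), rng.normal(0.78, 0.05, n)

def icer(idx_a, idx_b):
    """Incremental cost per QALY gained for the resampled patients."""
    d_cost = cost_sfc[idx_a].mean() - cost_fp[idx_b].mean()
    d_qaly = qaly_sfc[idx_a].mean() - qaly_fp[idx_b].mean()
    return d_cost / d_qaly

point = icer(np.arange(n), np.arange(n))
boot = [icer(rng.integers(0, n, n), rng.integers(0, n, n)) for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"cost per QALY: £{point:,.0f} (95% CI £{lo:,.0f} to £{hi:,.0f})")
```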

    Linear regression for numeric symbolic variables: an ordinary least squares approach based on Wasserstein Distance

    Full text link
    In this paper we present a linear regression model for modal symbolic data. The observed variables are histogram variables according to the definition given in the framework of Symbolic Data Analysis, and the parameters of the model are estimated using the classic Least Squares method. An appropriate metric is introduced in order to measure the error between the observed and the predicted distributions; in particular, the Wasserstein distance is proposed. Some properties of this metric are exploited to predict the response variable as a direct linear combination of the other independent histogram variables. Measures of goodness of fit are discussed. An application on real data corroborates the proposed method.
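
    A short sketch of the error metric itself: for one-dimensional distributions the squared L2 Wasserstein distance is the integral of the squared difference of the quantile functions, W_2^2(F, G) = \int_0^1 (F^{-1}(t) - G^{-1}(t))^2 dt. The samples below are illustrative, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, 5000)     # "observed" distribution
y = rng.normal(0.3, 1.2, 5000)     # "predicted" distribution

# Approximate the integral over t in (0, 1) on an even probability grid.
t = np.linspace(0.001, 0.999, 999)
w2_sq = np.mean((np.quantile(x, t) - np.quantile(y, t)) ** 2)
print(f"squared Wasserstein distance ~ {w2_sq:.3f}")
```

    Because the distance is an ordinary L2 norm between quantile functions, squared-error loss and least-squares fitting carry over naturally to histogram-valued observations.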

    Analyzing 2D gel images using a two-component empirical bayes model

    Get PDF
    Background: Two-dimensional polyacrylamide gel electrophoresis (2D gel, 2D PAGE, 2-DE) is a powerful tool for analyzing the proteome of an organism. Differential analysis of 2D gel images aims at finding proteins that change under different conditions, which leads to large-scale hypothesis testing as in microarray data analysis. Two-component empirical Bayes (EB) models have been widely discussed for large-scale hypothesis testing and applied in the context of genomic data, but they have not been implemented for the differential analysis of 2D gel data. In the literature, the mixture and null densities of the test statistics are estimated separately; the estimation of the mixture density does not take into account assumptions about the null density, so there is no guarantee that the estimated null component will be no greater than the mixture density, as it should be. Results: We present an implementation of a two-component EB model for the analysis of 2D gel images. In contrast to the published estimation method, we propose to estimate the mixture and null densities simultaneously using a constrained estimation approach, which relies on an iteratively re-weighted least-squares algorithm. The assumption about the null density is naturally taken into account in the estimation of the mixture density. This strategy is illustrated using a set of 2D gel images from a factorial experiment, and the proposed approach is validated using a set of simulated gels. Conclusions: The two-component EB model is very useful for large-scale hypothesis testing. In proteomic analysis, the theoretical null density is often not appropriate. We demonstrate how to implement a two-component EB model for analyzing a set of 2D gel images and show that it is necessary to estimate the mixture density and empirical null component simultaneously. The proposed constrained estimation method always yields valid estimates and more stable results, and the estimation approach can be applied to other contexts where large-scale hypothesis testing occurs.
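
    An illustrative sketch of the two-component framework, not the paper's constrained IRLS estimator: test statistics are modelled as f(z) = p0*f0(z) + (1 - p0)*f1(z), an empirical null f0 is fitted to the central z-values, and the local false discovery rate p0*f0(z)/f(z) is capped at 1 as a crude stand-in for the constraint that the null component never exceed the mixture density. All data are simulated.

```python
import numpy as np
from scipy.stats import norm, gaussian_kde

rng = np.random.default_rng(3)
z = np.concatenate([rng.normal(0, 1, 1800),      # null spots
                    rng.normal(3, 1, 200)])      # differentially expressed spots

f = gaussian_kde(z)                              # mixture density estimate
centre = z[np.abs(z - np.median(z)) < 1.5]       # central data -> empirical null
mu0, sd0 = centre.mean(), centre.std()
p0 = 0.95                                        # assumed null proportion

local_fdr = np.minimum(1.0, p0 * norm.pdf(z, mu0, sd0) / f(z))
print(f"spots called differential (local fdr < 0.2): {(local_fdr < 0.2).sum()}")
```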

    Randomized Benchmarking of Quantum Gates

    Full text link
    A key requirement for scalable quantum computing is that elementary quantum gates can be implemented with sufficiently low error. One method for determining the error behavior of a gate implementation is to perform process tomography. However, standard process tomography is limited by errors in state preparation, measurement and one-qubit gates. It suffers from inefficient scaling with the number of qubits and does not detect adverse error-compounding when gates are composed in long sequences. An additional problem is that the error probabilities desirable for scalable quantum computing are of the order of 0.0001 or lower, and experimentally proving such low errors is challenging. We describe a randomized benchmarking method that yields estimates of the computationally relevant errors without relying on accurate state preparation and measurement. Since it involves long sequences of randomly chosen gates, it also verifies that error behavior is stable when used in long computations. We implemented randomized benchmarking on trapped atomic ion qubits, establishing a one-qubit error probability per randomized pi/2 pulse of 0.00482(17) in a particular experiment. We expect this error probability to be readily improved with straightforward technical modifications. Comment: 13 pages.
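
    A hedged sketch of the standard randomized-benchmarking analysis: the average survival probability over random sequences of length m is fitted to an exponential decay A*p**m + B, and the error per randomized gate for a single qubit is estimated as r = (1 - p)/2. The data below are synthetic, not the trapped-ion measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic survival probabilities for a qubit with ~0.5% error per gate.
lengths = np.array([2, 4, 8, 16, 32, 64, 128, 256])
true_p = 1 - 2 * 0.005
rng = np.random.default_rng(4)
survival = 0.5 * true_p ** lengths + 0.5 + rng.normal(0, 0.005, lengths.size)

def decay(m, A, p, B):
    """Exponential fidelity decay with sequence length m."""
    return A * p ** m + B

(A, p, B), _ = curve_fit(decay, lengths, survival, p0=(0.5, 0.99, 0.5))
print(f"error per randomized gate ~ {(1 - p) / 2:.4f}")
```

    Because state-preparation and measurement errors are absorbed into the constants A and B, the extracted decay rate p reflects only the gate errors, which is the key advantage over process tomography noted in the abstract.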