
    Probabilities of spurious connections in gene networks: Application to expression time series

    Motivation: The reconstruction of gene networks from gene expression microarrays is gaining popularity as methods improve and as more data become available. The reliability of such networks could be judged by the probability that a connection between genes is spurious, resulting from chance fluctuations rather than from a true biological relationship. Results: Unlike the false discovery rate and positive false discovery rate, the decisive false discovery rate (dFDR) is exactly equal to a conditional probability without assuming independence or the randomness of hypothesis truth values. This property is useful not only in the common application to the detection of differential gene expression, but also in determining the probability of a spurious connection in a reconstructed gene network. Estimators of the dFDR can estimate each of three probabilities: 1. the probability that two genes that appear to be associated with each other lack such association; 2. the probability that a time ordering observed for two associated genes is misleading; 3. the probability that a time ordering observed for two genes is misleading, either because they are not associated or because they are associated without a lag in time. The first probability applies to both static and dynamic gene networks; the other two apply only to dynamic gene networks. Availability: Cross-platform software for network reconstruction, probability estimation, and plotting is freely available from http://www.davidbickel.com as R functions and a Java application. Comment: Like q-bio.GN/0404032, this was rejected in March 2004 because it was submitted to the math archive. The only modification is a corrected reference to q-bio.GN/0404032, which was not modified at all
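    As a rough illustration of the first probability, the fraction of spurious connections among called edges can be estimated by comparing observed association scores against a permutation null. The sketch below is a minimal NumPy stand-in on synthetic data; the correlation scores, threshold, and permutation estimator are illustrative choices, not the dFDR estimators or the software described in the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy expression matrix: 50 genes x 20 time points (synthetic noise,
# so every apparent edge is spurious by construction).
X = rng.normal(size=(50, 20))

def edge_scores(X):
    """Absolute Pearson correlation for every gene pair."""
    C = np.corrcoef(X)
    iu = np.triu_indices_from(C, k=1)
    return np.abs(C[iu])

obs = edge_scores(X)
threshold = 0.6          # call an edge when |r| exceeds this

# Permutation null: shuffle each gene's series independently, which
# destroys associations while keeping the marginal distributions.
null_rates = []
for _ in range(200):
    Xp = np.apply_along_axis(rng.permutation, 1, X)
    null_rates.append(np.mean(edge_scores(Xp) > threshold))

# Crude estimate of P(edge is spurious | edge called): expected null
# call rate over the observed call rate, capped at 1.
called = max(np.mean(obs > threshold), 1e-12)
fdr_hat = min(1.0, float(np.mean(null_rates)) / called)
print(round(fdr_hat, 3))
```

    On pure-noise data as here, the estimate is close to 1, as it should be; with genuinely associated genes it drops toward the true spurious fraction.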

    Depletion forces near a soft surface

    We investigate excluded-volume effects in a bidisperse colloidal suspension near a flexible interface. Inspired by a recent experiment by Dinsmore et al. (Phys. Rev. Lett. 80, 409 (1998)), we study the adsorption of a mesoscopic bead on the surface and show that depletion forces could in principle lead to particle encapsulation. We then consider the effect of surface fluctuations on the depletion potential itself and construct the density profile of a polymer solution near a soft interface. Surprisingly, we find that the chains accumulate at the wall, whereas the density displays a deficit of particles at distances larger than the surface roughness. This non-monotonic behavior demonstrates that surface fluctuations can have major repercussions on the properties of a colloidal solution. On average, the additional contribution to the Gibbs adsorbance is negative. The amplitude of the depletion potential between a mesoscopic bead and the surface increases accordingly. Comment: 10 pages, 5 figures

    Optimal full estimation of qubit mixed states

    We obtain the optimal scheme for estimating unknown qubit mixed states when an arbitrary number N of identically prepared copies is available. We discuss the case of states in the whole Bloch sphere as well as the restricted situation where these states are known to lie on the equatorial plane. For the former case we find that the optimal measurement does not depend on the prior probability distribution provided it is isotropic. Although the equatorial-plane case does not have this property for arbitrary N, we give a prior-independent scheme which becomes optimal in the asymptotic limit of large N. We compute the maximum mean fidelity in this asymptotic regime for the two cases. We show that within the pointwise estimation approach these limits can be obtained in a rather easy and rapid way. This derivation is based on heuristic arguments that are made rigorous by using van Trees inequalities. The interrelation between the estimation of the purity and the direction of the state is also discussed. In the general case we show that they correspond to independent estimations, whereas for the equatorial-plane states this is only true asymptotically. Comment: 19 pages, no figures
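    The fidelity being maximized has a closed form for qubits in terms of Bloch vectors. The sketch below simulates a simple separable tomographic scheme (measuring each Pauli operator on a batch of copies) and scores it with that fidelity; this is an illustrative baseline, not the optimal collective measurement derived in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# True mixed state given by its Bloch vector (|r| < 1 means mixed).
r_true = np.array([0.4, 0.2, 0.5])

def estimate_bloch(r, n_per_axis):
    """Estimate the Bloch vector by measuring sigma_x, sigma_y, sigma_z
    on n_per_axis copies each -- a simple separable scheme for
    illustration, not the paper's optimal measurement."""
    est = np.empty(3)
    for k in range(3):
        p_up = (1 + r[k]) / 2                  # Born rule, outcome +1
        ups = rng.binomial(n_per_axis, p_up)
        est[k] = 2 * ups / n_per_axis - 1
    norm = np.linalg.norm(est)
    return est / norm if norm > 1 else est     # project into Bloch ball

def fidelity(r, s):
    """Uhlmann fidelity between qubit states with Bloch vectors r, s:
    F = (1 + r.s + sqrt((1 - |r|^2)(1 - |s|^2))) / 2."""
    return (1 + r @ s + np.sqrt(max(0.0, (1 - r @ r) * (1 - s @ s)))) / 2

r_hat = estimate_bloch(r_true, n_per_axis=2000)
print(round(fidelity(r_true, r_hat), 4))
```

    The mean fidelity of such separable schemes falls short of the optimal collective measurement at finite N, which is the gap the paper quantifies.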

    Ablation debris control by means of closed thick film filtered water immersion

    The performance of debris control for laser-ablation-generated debris by means of open immersion techniques has been shown to be limited by flow-surface ripple effects on the beam and by the loss of ablation plume pressure through splashing of the immersion fluid. To eradicate these issues, a closed technique has been developed which ensured a controlled geometry for both optical interfaces of the flowing liquid film. This prevented splashing, ensured repeatable machining conditions, and allowed control of the liquid flow velocity. To investigate the performance benefits of this closed immersion technique, bisphenol A polycarbonate samples were machined using filtered water at a number of flow velocities. The results demonstrate the efficacy of the closed immersion technique: a 93% decrease in debris is produced when machining under closed filtered water immersion; the average debris particle size becomes larger, with an equal proportion of small and medium sized debris being produced when laser machining under closed flowing filtered water immersion; large debris is shown to be displaced further by a given flow velocity than smaller debris, showing that the action of flow turbulence in the duct has more impact on smaller debris. Low flow velocities were found to be less effective than high flow velocities at controlling the positional trend of deposition of laser ablation generated debris; but use of excessive flow velocities resulted in turbulence-motivated deposition. This work is of interest to the laser micromachining community and may aid in the manufacture of 2.5D laser etched patterns covering large area wafers and could be applied to a range of wavelengths and laser types

    Data-driven efficient score tests for deconvolution problems

    We consider testing statistical hypotheses about densities of signals in deconvolution models. A new approach to this problem is proposed. We construct score tests for deconvolution with a known noise density and efficient score tests for the case of an unknown noise density. The tests are combined with model selection rules that choose reasonable model dimensions automatically from the data. Consistency of the tests is proved
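    The basic mechanism can be seen in a toy version: a hypothesis about the unobserved signal density translates, through the known noise law, into a testable statement about the observations. The statistic below is a crude moment-based stand-in for the efficient score tests of the abstract, on simulated data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Deconvolution model: Y = X + eps, with known noise eps ~ N(0, sigma^2).
sigma = 0.5

def variance_score_stat(y, sigma):
    """Crude moment-based test of H0: X ~ N(0, 1).  Under H0 the
    observations satisfy Y ~ N(0, v) with v = 1 + sigma^2, so
    E[Y^2] = v and Var(Y^2) = 2*v^2, making the statistic below
    asymptotically N(0, 1).  (A stand-in for the paper's efficient
    score tests, not the authors' construction.)"""
    v = 1 + sigma**2
    return np.sqrt(len(y)) * (np.mean(y**2) - v) / (v * np.sqrt(2))

n = 5000
y_null = rng.normal(0, 1.0, n) + rng.normal(0, sigma, n)  # H0 true
y_alt  = rng.normal(0, 1.5, n) + rng.normal(0, sigma, n)  # signal sd is 1.5

print(round(variance_score_stat(y_null, sigma), 2),
      round(variance_score_stat(y_alt, sigma), 2))
```

    A single moment has power only against specific alternatives; the data-driven model selection in the paper is what lets the test adapt its dimension to the unknown departure.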

    Reconstruction Mechanism of FCC Transition-Metal (001) Surfaces

    The reconstruction mechanism of (001) fcc transition metal surfaces is investigated using a full-potential all-electron electronic structure method within density-functional theory. Total-energy supercell calculations confirm the experimental finding that a close-packed quasi-hexagonal overlayer reconstruction is possible for the late 5d metals Ir, Pt, and Au, while it is disfavoured in the isovalent 4d metals (Rh, Pd, Ag). The reconstructive behaviour is driven by the tensile surface stress of the unreconstructed surfaces; the stress is significantly larger in the 5d metals than in 4d ones, and only in the former case does it overcome the substrate resistance to the required geometric rearrangement. It is shown that the surface stress for these systems is due to d-charge depletion from the surface layer, and that the cause of the 4th-to-5th row stress difference is the importance of relativistic effects in the 5d series. Comment: RevTeX 3.0, 12 pages, 1 PostScript figure available upon request, 23 May 199

    A frequentist framework of inductive reasoning

    Reacting against the limitation of statistics to decision procedures, R. A. Fisher proposed for inductive reasoning the use of the fiducial distribution, a parameter-space distribution of epistemological probability transferred directly from limiting relative frequencies rather than computed according to the Bayes update rule. The proposal is developed as follows using the confidence measure of a scalar parameter of interest. (With the restriction to one-dimensional parameter space, a confidence measure is essentially a fiducial probability distribution free of complications involving ancillary statistics.) A betting game establishes a sense in which confidence measures are the only reliable inferential probability distributions. The equality between the probabilities encoded in a confidence measure and the coverage rates of the corresponding confidence intervals ensures that the measure's rule for assigning confidence levels to hypotheses is uniquely minimax in the game. Although a confidence measure can be computed without any prior distribution, previous knowledge can be incorporated into confidence-based reasoning. To adjust a p-value or confidence interval for prior information, the confidence measure from the observed data can be combined with one or more independent confidence measures representing previous agent opinion. (The former confidence measure may correspond to a posterior distribution with frequentist matching of coverage probabilities.) The representation of subjective knowledge in terms of confidence measures rather than prior probability distributions preserves approximate frequentist validity. Comment: major revision
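    For the textbook normal-mean case, a confidence measure and its combination with an independent opinion measure can be computed in a few lines. The values and the precision-weighted combination rule below are illustrative (they match the Bayesian update for a normal mean), not the paper's general combination procedure:

```python
import numpy as np

# Confidence measure for a normal mean with known sigma: after seeing
# the sample mean xbar of n draws, the confidence distribution is
# N(xbar, sigma^2/n), whose quantiles reproduce the usual
# confidence-interval endpoints.  Numbers here are hypothetical.
xbar_data,  sd_data  = 1.2, 1.0 / np.sqrt(25)   # data-based measure
xbar_prior, sd_prior = 0.8, 1.0 / np.sqrt(16)   # prior-opinion measure

# Precision-weighted combination of two independent normal
# confidence measures.
w1, w2 = sd_data**-2, sd_prior**-2
mean_c = (w1 * xbar_data + w2 * xbar_prior) / (w1 + w2)
sd_c = (w1 + w2) ** -0.5

# 95% interval from the combined measure (z = 1.959964).
z = 1.959964
lo, hi = mean_c - z * sd_c, mean_c + z * sd_c
print(round(mean_c, 3), round(lo, 3), round(hi, 3))
```

    The combined interval is shorter than either input interval, which is the sense in which prior opinion tightens a confidence-based inference without abandoning coverage.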

    On Verifiable Sufficient Conditions for Sparse Signal Recovery via ℓ1 Minimization

    We propose novel necessary and sufficient conditions for a sensing matrix to be "s-good" - to allow for exact ℓ1-recovery of sparse signals with s nonzero entries when no measurement noise is present. Then we express the error bounds for imperfect ℓ1-recovery (nonzero measurement noise, nearly s-sparse signal, near-optimal solution of the optimization problem yielding the ℓ1-recovery) in terms of the characteristics underlying these conditions. Further, we demonstrate (and this is the principal result of the paper) that these characteristics, although difficult to evaluate, lead to verifiable sufficient conditions for exact sparse ℓ1-recovery and to efficiently computable upper bounds on those s for which a given sensing matrix is s-good. We also establish instructive links between our approach and the basic concepts of Compressed Sensing theory, such as the Restricted Isometry and Restricted Eigenvalue properties
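    The noiseless ℓ1-recovery problem the abstract refers to is itself a linear program. A minimal sketch (sizes and the Gaussian sensing matrix are illustrative; this demonstrates the recovery problem, not the paper's verifiable conditions):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)

# Sensing matrix and an s-sparse signal (sizes are illustrative).
m, n, s = 40, 100, 4
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, s, replace=False)] = rng.normal(size=s)
b = A @ x_true

# l1 minimization as a linear program: write x = u - v with u, v >= 0
# and minimize sum(u) + sum(v) subject to A(u - v) = b.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
x_hat = res.x[:n] - res.x[n:]

print(np.linalg.norm(x_hat - x_true) < 1e-5)
```

    Whether such recovery succeeds for all s-sparse signals is exactly the s-goodness property; the paper's contribution is making sufficient conditions for it efficiently checkable.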

    Maximum Likelihood Estimator for Hidden Markov Models in continuous time

    The paper studies large sample asymptotic properties of the Maximum Likelihood Estimator (MLE) for the parameter of a continuous time Markov chain, observed in white noise. Using the method of weak convergence of likelihoods due to I. Ibragimov and R. Khasminskii, consistency, asymptotic normality and convergence of moments are established for the MLE under certain strong ergodicity conditions on the chain. Comment: Warning: due to a flaw in the publishing process, some of the references in the published version of the article are confused
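    A time-discretized toy version of the setup conveys the flavor of the estimation problem: a two-state chain sampled on a grid and observed in Gaussian noise, with the rate parameter estimated by maximizing an HMM forward-algorithm likelihood. All parameter values are hypothetical, and this discrete filter is a stand-in for the continuous-time white-noise likelihood analysed in the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

# Two-state chain with symmetric switching rate lam, sampled every dt
# and observed in Gaussian noise (a discretized toy model).
lam_true, dt, T, noise_sd = 1.0, 0.05, 400.0, 0.7
levels = np.array([0.0, 1.0])
n = int(T / dt)

# Simulate the chain; for a symmetric 2-state chain the exact
# per-step flip probability is (1 - exp(-2*lam*dt)) / 2.
p_flip = 0.5 * (1 - np.exp(-2 * lam_true * dt))
states = np.empty(n, dtype=int)
states[0] = 0
flips = rng.random(n) < p_flip
for k in range(1, n):
    states[k] = states[k - 1] ^ int(flips[k])
y = levels[states] + rng.normal(0, noise_sd, n)

# Gaussian emission weights; the normalizing constant is omitted
# because it does not depend on lam and cancels in the argmax.
E = np.exp(-0.5 * ((y[:, None] - levels) / noise_sd) ** 2)

def loglik(lam):
    """HMM forward-algorithm log-likelihood for switching rate lam."""
    p = 0.5 * (1 - np.exp(-2 * lam * dt))
    P = np.array([[1 - p, p], [p, 1 - p]])
    alpha = np.array([0.5, 0.5])
    ll = 0.0
    for em in E:
        alpha = (alpha @ P) * em
        c = alpha.sum()
        ll += np.log(c)
        alpha = alpha / c
    return ll

# Crude grid-search MLE over the switching rate.
grid = np.linspace(0.2, 3.0, 15)
lam_hat = grid[int(np.argmax([loglik(l) for l in grid]))]
print(lam_hat)
```

    As the observation horizon T grows, the estimate concentrates around the true rate, which is the consistency and asymptotic-normality regime the paper makes rigorous in continuous time.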