779 research outputs found
PRS23 High Cost Cystic Fibrosis Patients as Identified in a US Claims Database: A Closer Look at the Tail
PRS8 Predicted Survival for North American Patients with Cystic Fibrosis Adjusted for Cohort Specific Covariates
Recommended from our members
Systematic evaluation of spliced alignment programs for RNA-seq data
High-throughput RNA sequencing is an increasingly accessible method for studying gene structure and activity on a genome-wide scale. A critical step in RNA-seq data analysis is the alignment of partial transcript reads to a reference genome sequence. To assess the performance of current mapping software, we invited developers of RNA-seq aligners to process four large human and mouse RNA-seq data sets. In total, we compared 26 mapping protocols based on 11 programs and pipelines and found major performance differences between methods on numerous benchmarks, including alignment yield, basewise accuracy, mismatch and gap placement, exon junction discovery and suitability of alignments for transcript reconstruction. We observed concordant results on real and simulated RNA-seq data, confirming the relevance of the metrics employed. Future developments in RNA-seq alignment methods would benefit from improved placement of multimapped reads, balanced utilization of existing gene annotation and a reduced false discovery rate for splice junctions.
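Two of the benchmarks named above, alignment yield and basewise accuracy, reduce to simple ratios over tallies from a simulated data set. A minimal sketch; the counts below are invented for illustration and are not taken from the study:

```python
# Hypothetical tallies for one aligner on a simulated read set.
# All numbers are illustrative, not results from the comparison.
results = {
    "total_reads": 1_000_000,     # reads in the input set
    "aligned_reads": 940_000,     # reads the aligner placed on the genome
    "correct_bases": 92_500_000,  # aligned bases matching their simulated origin
    "aligned_bases": 94_000_000,  # all aligned bases
}

# Alignment yield: fraction of input reads that were aligned at all.
yield_ = results["aligned_reads"] / results["total_reads"]

# Basewise accuracy: of the bases that were aligned, the fraction placed
# at their true (simulated) genomic position.
accuracy = results["correct_bases"] / results["aligned_bases"]

print(f"alignment yield:   {yield_:.1%}")
print(f"basewise accuracy: {accuracy:.1%}")
```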
Performance assessment of promoter predictions on ENCODE regions in the EGASP experiment
BACKGROUND: This study analyzes the predictions of a number of promoter predictors on the ENCODE regions of the human genome as part of the ENCODE Genome Annotation Assessment Project (EGASP). The systems analyzed operate on various principles and we assessed the effectiveness of different conceptual strategies used to correlate produced promoter predictions with the manually annotated 5' gene ends.

RESULTS: The predictions were assessed relative to the manual HAVANA annotation of the 5' gene ends. These 5' gene ends were used as the estimated reference transcription start sites. With the maximum allowed distance for predictions of 1,000 nucleotides from the reference transcription start sites, the sensitivity of predictors was in the range 32% to 56%, while the positive predictive value was in the range 79% to 93%. The average distance mismatch of predictions from the reference transcription start sites was in the range 259 to 305 nucleotides. At the same time, using transcription start site estimates from DBTSS and H-Invitational databases as promoter predictions, we obtained a sensitivity of 58%, a positive predictive value of 92%, and an average distance from the annotated transcription start sites of 117 nucleotides. In this experiment, the best performing promoter predictors were those that combined promoter prediction with gene prediction. The main reason for this is the reduced promoter search space that resulted in smaller numbers of false positive predictions.

CONCLUSION: The main finding, now supported by comprehensive data, is that the accuracy of human promoter predictors for high-throughput annotation purposes can be significantly improved if promoter prediction is combined with gene prediction. Based on the lessons learned in this experiment, we propose a framework for the preparation of the next similar promoter prediction assessment.
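The sensitivity and positive predictive value figures above come from a distance-based matching of predictions to reference transcription start sites. A minimal sketch with toy coordinates and the 1,000 nt threshold; the specific matching convention used here (one-to-many matching, PPV as the fraction of predictions near any reference site) is an assumption for illustration, not the exact EGASP protocol:

```python
def evaluate(predictions, reference_tss, max_dist=1000):
    """Count a reference TSS as found if some prediction lies within
    max_dist nucleotides of it; predictions near no reference TSS
    count as false positives. Positions are genomic coordinates in nt."""
    matched_refs, matched_preds = set(), set()
    for i, p in enumerate(predictions):
        for j, t in enumerate(reference_tss):
            if abs(p - t) <= max_dist:
                matched_preds.add(i)
                matched_refs.add(j)
    tp = len(matched_refs)                       # reference 5' ends found
    fn = len(reference_tss) - tp                 # reference 5' ends missed
    sens = tp / (tp + fn)                        # sensitivity
    ppv = len(matched_preds) / len(predictions)  # positive predictive value
    return sens, ppv

# Toy positions, purely illustrative.
ref = [10_000, 50_000, 90_000]
pred = [10_250, 49_100, 200_000]
sens, ppv = evaluate(pred, ref)
print(f"sensitivity = {sens:.2f}, PPV = {ppv:.2f}")
```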
Quantum state merging and negative information
We consider a quantum state shared between many distant locations, and define a quantum information processing primitive, state merging, that optimally merges the state into one location. As announced in [Horodecki, Oppenheim, Winter, Nature 436, 673 (2005)], the optimal entanglement cost of this task is the conditional entropy if classical communication is free. Since this quantity can be negative, and the state merging rate measures partial quantum information, we find that quantum information can be negative. The classical communication rate also has a minimum rate: a certain quantum mutual information. State merging enabled one to solve a number of open problems: distributed quantum data compression, quantum coding with side information at the decoder and sender, multi-party entanglement of assistance, and the capacity of the quantum multiple access channel. It also provides an operational proof of strong subadditivity. Here, we give precise definitions and prove these results rigorously.
Comment: 23 pages, 3 figures
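The negativity of the conditional entropy S(A|B) = S(AB) - S(B) can be checked directly for a maximally entangled pair. A minimal numerical sketch of just the entropy arithmetic, not the paper's formalism:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

# Maximally entangled two-qubit state |phi+> = (|00> + |11>)/sqrt(2).
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_ab = np.outer(phi, phi)

# Reduced state of B: partial trace over A (indices ordered (a, b, a', b')).
rho_b = rho_ab.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

# Conditional entropy S(A|B) = S(AB) - S(B); pure state, maximally mixed
# marginal, so the result is -1 bit: negative partial quantum information.
s_cond = von_neumann_entropy(rho_ab) - von_neumann_entropy(rho_b)
print(s_cond)  # approximately -1
```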
Tema Con Variazioni: Quantum Channel Capacity
Channel capacity describes the size of the nearly ideal channels, which can be obtained from many uses of a given channel, using an optimal error correcting code. In this paper we collect and compare minor and major variations in the mathematically precise statements of this idea which have been put forward in the literature. We show that all the variations considered lead to equivalent capacity definitions. In particular, it makes no difference whether one requires mean or maximal errors to go to zero, and it makes no difference whether errors are required to vanish for any sequence of block sizes compatible with the rate, or only for one infinite sequence.
Comment: 32 pages, uses iopart.cls
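The equivalence of the mean-error and maximal-error criteria rests on a standard expurgation argument, sketched here in outline (this is the textbook argument, not the paper's full proof):

```latex
Let a code have $M = 2^{nR}$ codewords with individual errors
$\varepsilon_1,\dots,\varepsilon_M$ and mean error
$\bar{\varepsilon} = \frac{1}{M}\sum_{i=1}^{M}\varepsilon_i$.
By Markov's inequality, at most $M/2$ codewords satisfy
$\varepsilon_i > 2\bar{\varepsilon}$; discarding them leaves a code of
$M/2$ codewords with \emph{maximal} error at most $2\bar{\varepsilon}$
and rate
\[
  \frac{1}{n}\log_2\frac{M}{2} \;=\; R - \frac{1}{n}
  \;\longrightarrow\; R \quad (n \to \infty).
\]
Hence vanishing mean error at rate $R$ implies vanishing maximal error
at the same asymptotic rate.
```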
Efficient and feasible state tomography of quantum many-body systems
We present a novel method to perform quantum state tomography for
many-particle systems which are particularly suitable for estimating states in
lattice systems such as of ultra-cold atoms in optical lattices. We show that
the need for measuring a tomographically complete set of observables can be
overcome by letting the state evolve under some suitably chosen random circuits
followed by the measurement of a single observable. We generalize known results
about the approximation of unitary 2-designs, i.e., certain classes of random
unitary matrices, by random quantum circuits and connect our findings to the
theory of quantum compressed sensing. We show that for ultra-cold atoms in
optical lattices established techniques like optical super-lattices, laser
speckles, and time-of-flight measurements are sufficient to perform fully
certified, assumption-free tomography. Combining our approach with tensor
network methods - in particular the theory of matrix-product states - we
identify situations where the effort of reconstruction is even constant in the
number of lattice sites, allowing in principle to perform tomography on
large-scale systems readily available in present experiments.Comment: 10 pages, 3 figures, minor corrections, discussion added, emphasizing
that no single-site addressing is needed at any stage of the scheme when
implemented in optical lattice system
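The idea of turning a single fixed observable into a tomographically complete family by random evolution can be illustrated in the simplest possible setting: one qubit, Haar-random unitaries, and exact expectation values. This toy sketch (linear inversion of the Bloch vector; the state and all parameters are invented for illustration, and none of this is the paper's lattice scheme) shows why one observable plus random circuits suffices:

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_unitary(d, rng):
    """Sample a Haar-random d x d unitary via QR of a complex Gaussian matrix."""
    z = (rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    ph = np.diag(r) / np.abs(np.diag(r))
    return q * ph  # fix the column-phase ambiguity of QR

# Pauli matrices and an (assumed, illustrative) unknown state via its Bloch vector.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
paulis = [sx, sy, sz]
r_true = np.array([0.3, -0.5, 0.6])
rho = 0.5 * (np.eye(2) + sum(c * s for c, s in zip(r_true, paulis)))

# Each random circuit U maps the single fixed observable Z to U^dag Z U = n.sigma,
# so measuring only Z after the circuit probes rho along a random direction n.
A_rows, b = [], []
for _ in range(30):
    u = haar_unitary(2, rng)
    obs = u.conj().T @ sz @ u
    n = [np.real(np.trace(obs @ s)) / 2 for s in paulis]  # Bloch direction of obs
    A_rows.append(n)
    b.append(np.real(np.trace(obs @ rho)))  # exact (noise-free) expectation value

# Linear inversion: solve n_k . r = m_k in least squares for the Bloch vector.
r_est, *_ = np.linalg.lstsq(np.array(A_rows), np.array(b), rcond=None)
print(np.round(r_est, 6))  # recovers the Bloch vector [0.3, -0.5, 0.6]
```

With noiseless expectation values the overdetermined linear system is consistent, so the least-squares solution reproduces the Bloch vector exactly; the compressed-sensing machinery in the paper addresses the far harder many-body version of this inversion.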
Entanglement and localization of wavefunctions
We review recent works that relate entanglement of random vectors to their
localization properties. In particular, the linear entropy is related by a
simple expression to the inverse participation ratio, while next orders of the
entropy of entanglement contain information about e.g. the multifractal
exponents. Numerical simulations show that these results can account for the
entanglement present in wavefunctions of physical systems.Comment: 6 pages, 4 figures, to appear in the proceedings of the NATO Advanced
Research Workshop 'Recent Advances in Nonlinear Dynamics and Complex System
Physics', Tashkent, Uzbekistan, 200
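Both quantities the review relates can be evaluated directly for a random vector. The sketch below just computes the inverse participation ratio and the linear entropy of entanglement for an illustrative 4 x 4 bipartition; the precise relation between their averages is the paper's result and is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random normalized real vector on a bipartite system of two 4-level parts.
dA = dB = 4
psi = rng.normal(size=dA * dB)
psi /= np.linalg.norm(psi)

# Inverse participation ratio: IPR = sum_i |psi_i|^4 measures localization
# (1/N for a uniformly delocalized vector, 1 for a single basis state).
ipr = np.sum(np.abs(psi) ** 4)

# Linear entropy of entanglement across the A|B cut:
# S_L = 1 - Tr(rho_A^2), with rho_A the reduced density matrix of part A.
m = psi.reshape(dA, dB)
rho_a = m @ m.conj().T
s_lin = 1.0 - np.real(np.trace(rho_a @ rho_a))

print(f"IPR = {ipr:.4f}, linear entropy = {s_lin:.4f}")
```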