Exact likelihood computation in Boolean networks with probabilistic time delays, and its application in signal network reconstruction
Motivation: For biological pathways, it is common to measure a gene expression time series after various knockdowns of genes that are putatively involved in the process of interest. These interventional time-resolved data are most suitable for the elucidation of dynamic causal relationships in signaling networks. Even with this kind of data, it is still a major and largely unsolved challenge to infer the topology and interaction logic of the underlying regulatory network. Results: In this work, we present a novel model-based approach involving Boolean networks to reconstruct small- to medium-sized regulatory networks. In particular, we solve the problem of exact likelihood computation in Boolean networks with probabilistic exponential time delays. Simulations demonstrate the high accuracy of our approach. We apply our method to data of Ivanova et al. (2006), where RNA interference knockdown experiments were used to build a network of the key regulatory genes governing mouse stem cell maintenance and differentiation. In contrast to previous analyses of that data set, our method can identify feedback loops and provides new insights into the interplay of some master regulators in embryonic stem cell development. Availability and implementation: The algorithm is implemented in the statistical language R. Code and documentation are available at Bioinformatics online. Contact: [email protected] or [email protected] Supplementary information: Supplementary materials are available at Bioinformatics online.
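The abstract leaves the model details to the paper, but the underlying dynamical object — an asynchronous Boolean network in which each node that disagrees with its logic function flips after an exponentially distributed delay — can be simulated in a few lines. The following is a minimal generative sketch; the example network, rates, and function names are illustrative assumptions, not the authors' implementation (the paper computes the likelihood exactly rather than by simulation):

```python
import random

def simulate_boolean_network(logic, state, rates, t_max, rng):
    """Gillespie-style simulation of an asynchronous Boolean network.

    Each node i whose logic function logic[i](state) disagrees with its
    current value flips after an Exp(rates[i])-distributed delay; the
    first pending flip to fire is chosen rate-proportionally.
    Returns the trajectory as a list of (time, state) pairs.
    """
    t = 0.0
    state = list(state)
    trajectory = [(t, tuple(state))]
    while True:
        # Nodes whose target value differs from their current value
        pending = [i for i, f in enumerate(logic) if f(state) != state[i]]
        if not pending:
            break  # fixed point: no node wants to change
        total_rate = sum(rates[i] for i in pending)
        t += rng.expovariate(total_rate)  # waiting time to the next flip
        if t >= t_max:
            break
        # Pick which pending node flips first, proportional to its rate
        i = rng.choices(pending, weights=[rates[j] for j in pending])[0]
        state[i] = logic[i](state)
        trajectory.append((t, tuple(state)))
    return trajectory
```

For instance, with `logic = [lambda s: 1, lambda s: s[0]]` (a constitutively active node driving a downstream target) and initial state `(0, 0)`, the trajectory settles in the fixed point `(1, 1)` after two exponentially delayed flips.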
Robustness of adiabatic quantum computation
We study the fault tolerance of quantum computation by adiabatic evolution, a
quantum algorithm for solving various combinatorial search problems. We
describe an inherent robustness of adiabatic computation against two kinds of
errors, unitary control errors and decoherence, and we study this robustness
using numerical simulations of the algorithm.
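The behavior of an adiabatic algorithm is governed by the minimum spectral gap of the interpolating Hamiltonian H(s) = (1 − s) H_B + s H_P, since the required runtime scales inversely with a power of that gap. A minimal numerical sketch, using a toy single-qubit interpolation rather than the combinatorial search problems studied in the paper:

```python
import numpy as np

def min_gap(h_initial, h_final, steps=201):
    """Minimum gap between the two lowest levels of
    H(s) = (1 - s) * h_initial + s * h_final for s in [0, 1]."""
    gaps = []
    for s in np.linspace(0.0, 1.0, steps):
        evals = np.linalg.eigvalsh((1 - s) * h_initial + s * h_final)
        gaps.append(evals[1] - evals[0])  # eigvalsh returns ascending order
    return min(gaps)

# Toy example: transverse-field start, computational-basis target.
sigma_x = np.array([[0.0, 1.0], [1.0, 0.0]])
sigma_z = np.array([[1.0, 0.0], [0.0, -1.0]])
```

For H_B = −σ_x and H_P = −σ_z the gap is 2·sqrt((1 − s)² + s²), which is minimized at s = 1/2 where it equals √2.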
Simultaneous characterization of sense and antisense genomic processes by the double-stranded hidden Markov model
Hidden Markov models (HMMs) have been extensively used to dissect the genome into functionally distinct regions using data such as RNA expression or DNA binding measurements. It is a challenge to disentangle processes occurring on complementary strands of the same genomic region. We present the double-stranded HMM (dsHMM), a model for the strand-specific analysis of genomic processes. We applied dsHMM to yeast using strand-specific transcription data, nucleosome data, and protein binding data for a set of 11 factors associated with the regulation of transcription. The resulting annotation recovers the mRNA transcription cycle (initiation, elongation, termination) while correctly predicting strand-specificity and directionality of the transcription process. We find that pre-initiation complex formation is an essentially undirected process, giving rise to a large number of bidirectional promoters and to pervasive antisense transcription. Notably, 12% of all transcriptionally active positions showed simultaneous activity on both strands. Furthermore, dsHMM reveals that antisense transcription is specifically suppressed by Nrd1, a yeast termination factor.
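The double-stranded extension is specific to the paper, but the computational backbone of any such genome annotation is the scaled HMM forward recursion for the sequence likelihood. A generic single-strand sketch (the parameter shapes and names are assumptions, not the dsHMM implementation):

```python
import numpy as np

def forward_log_likelihood(pi, A, B, obs):
    """Log-likelihood of a discrete observation sequence under an HMM.

    pi:  (K,)   initial state distribution
    A:   (K, K) transition matrix, A[i, j] = P(z_t = j | z_{t-1} = i)
    B:   (K, M) emission matrix,   B[k, m] = P(x_t = m | z_t = k)
    obs: sequence of observation indices

    Uses the scaled forward recursion so that long genomic tracks do
    not underflow: at each position the forward vector is renormalized
    and the log of the normalizer is accumulated.
    """
    alpha = pi * B[:, obs[0]]
    log_lik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for x in obs[1:]:
        alpha = (alpha @ A) * B[:, x]
        log_lik += np.log(alpha.sum())
        alpha = alpha / alpha.sum()
    return log_lik
```

As a sanity check, with uniform initial, transition, and emission probabilities over two states and two symbols, any length-T sequence has likelihood 0.5^T.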
Determination of the noise parameters in a one-dimensional open quantum system
We consider an electron magnetically interacting with a spin-1/2 impurity,
embedded in an external environment whose noisy term acts only on the
impurity's spin, and we find expressions for the electron transmission and
reflection probabilities in terms of the phenomenological noise parameters.
Moreover, we give a simple example of the necessity of complete positivity for
physical consistency, showing that a positive but not completely positive
dissipative map can lead to negative transmission probabilities.
Microscopic theory of energy dissipation and decoherence in solid-state systems: A reformulation of the conventional Markov limit
We present and discuss a general density-matrix description of
energy-dissipation and decoherence phenomena in open quantum systems, able to
overcome the intrinsic limitations of the conventional Markov approximation. In
particular, the proposed alternative adiabatic scheme does not threaten
positivity at any time. The key idea of our approach rests in the temporal
symmetrization and coarse graining of the scattering term in the Liouville-von
Neumann equation, before applying the reduction procedure over the environment
degrees of freedom. The resulting dynamics is genuinely Lindblad-like and
recovers Fermi's golden rule features in the semiclassical limit.
Applications to the prototypical case of a semiconductor quantum dot exposed to
incoherent phonon excitation peaked around a central mode are discussed,
highlighting the success of our formalism with respect to the critical issues
of the conventional Markov limit.
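The point of a genuinely Lindblad-form generator is that the reduced density matrix remains positive at all times, which naive Markovian approximations can violate. A minimal sketch integrating a Lindblad master equation for a dephasing qubit (toy operators and rate chosen for illustration, not the paper's quantum-dot model):

```python
import numpy as np

def lindblad_step(rho, h, ls, gammas, dt):
    """One explicit Euler step of the Lindblad master equation
    d rho/dt = -i[H, rho]
               + sum_k gamma_k (L_k rho L_k^+ - {L_k^+ L_k, rho} / 2)."""
    drho = -1j * (h @ rho - rho @ h)
    for l, g in zip(ls, gammas):
        ld = l.conj().T
        drho += g * (l @ rho @ ld - 0.5 * (ld @ l @ rho + rho @ ld @ l))
    return rho + dt * drho

def evolve(rho, h, ls, gammas, t, steps=1000):
    """Integrate the Lindblad equation from 0 to t with `steps` Euler steps."""
    dt = t / steps
    for _ in range(steps):
        rho = lindblad_step(rho, h, ls, gammas, dt)
    return rho
```

For pure dephasing (L = σ_z, H = 0) the off-diagonal element of ρ decays as exp(−2γt), while the trace is preserved and the eigenvalues of ρ stay non-negative throughout, as a Lindblad generator guarantees.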
Complete positivity and entangled degrees of freedom
We study how some recently proposed noncontextuality tests based on quantum
interferometry are affected if the test particles propagate as open systems in
presence of a gaussian stochastic background. We show that physical consistency
requires the resulting markovian dissipative time-evolution to be completely
positive.
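Complete positivity can be checked numerically via the Choi–Jamiołkowski isomorphism: a linear map on d×d matrices is completely positive iff its Choi matrix is positive semidefinite. A generic sketch (the transpose map below is the textbook example of a map that is positive but not completely positive, mirroring the consistency issue the paper raises):

```python
import numpy as np

def choi_matrix(channel, d):
    """Choi matrix of a linear map on d x d matrices.

    channel: function mapping a (d, d) ndarray to a (d, d) ndarray,
    assumed Hermiticity-preserving so the Choi matrix is Hermitian.
    Block (i, j) of the Choi matrix is channel(|i><j|).
    """
    c = np.zeros((d * d, d * d), dtype=complex)
    for i in range(d):
        for j in range(d):
            e_ij = np.zeros((d, d), dtype=complex)
            e_ij[i, j] = 1.0
            c[i * d:(i + 1) * d, j * d:(j + 1) * d] = channel(e_ij)
    return c

def is_completely_positive(channel, d, tol=1e-9):
    """A map is completely positive iff its Choi matrix is PSD."""
    evals = np.linalg.eigvalsh(choi_matrix(channel, d))
    return bool(evals.min() > -tol)
```

For the transpose map the Choi matrix is the SWAP operator, whose eigenvalue −1 certifies that transposition, although positive, is not completely positive; the identity map and the dephasing channel ρ ↦ (ρ + σ_z ρ σ_z)/2 both pass the test.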
Optimizing mycobacteria molecular diagnostics: No decontamination! Human DNA depletion? Greener storage at 4 °C!
INTRODUCTION
Tuberculosis (TB) is an infectious disease caused by the group of bacterial pathogens known as the Mycobacterium tuberculosis complex (MTBC) and is one of the leading causes of death worldwide. Timely diagnosis and treatment of drug-resistant TB is a key pillar of WHO's strategy to combat global TB. The time required to carry out drug susceptibility testing (DST) for MTBC via the classic culture method is in the range of weeks, and such delays have a detrimental effect on treatment outcomes. Given that molecular testing takes hours to 1 or 2 days, its value in treating drug-resistant TB cannot be overstated. When developing such tests, one wants to optimize each step so that tests succeed even when confronted with samples that have a low MTBC load or contain large amounts of host DNA. This could improve the performance of the popular rapid molecular tests, especially for samples with mycobacterial loads close to the limits of detection. Optimization could have an even greater impact on tests based on targeted next-generation sequencing (tNGS), which typically require larger quantities of DNA. This matters because tNGS can provide more comprehensive drug resistance profiles than the relatively limited resistance information offered by rapid tests. In this work we endeavor to optimize the pre-treatment and extraction steps for molecular testing.
METHODS
We begin by choosing the best DNA extraction device by comparing the amount of DNA extracted by five commonly used devices from identical samples. Following this, the effect that decontamination and human DNA depletion have on extraction efficiency is explored.
RESULTS
The best results were achieved (i.e., the lowest Ct values) when neither decontamination nor human DNA depletion were used. As expected, in all tested scenarios the addition of decontamination to our workflow substantially reduced the yield of DNA extracted. This illustrates that the standard TB laboratory practice of applying decontamination, although being vital for culture-based testing, can negatively impact the performance of molecular testing. As a complement to the above experiments, we also considered the best Mycobacterium tuberculosis DNA storage method to optimize molecular testing carried out in the near- to medium-term. Comparing Ct values following three-month storage at 4 °C and at -20 °C showed little difference between the two.
DISCUSSION
In summary, for molecular diagnostics aimed at mycobacteria, this work highlights the importance of choosing the right DNA extraction device, indicates that decontamination causes significant loss of mycobacterial DNA, and shows that samples preserved for further molecular testing can be stored at 4 °C just as well as at -20 °C. Under our experimental settings, human DNA depletion gave no significant improvement in Ct values for the detection of MTBC.
Non-Standard neutral kaons dynamics from D-brane statistics
The neutral kaon system can be effectively described by non-unitary,
dissipative, completely positive dynamics that extend the usual treatment. In
the framework of open quantum systems, we show how the origin of these
non-standard time evolutions can be traced to the interaction of the kaon
system with a large environment. We find that D-branes, effectively described
by a heat-bath of quanta obeying infinite statistics, could constitute a
realistic example of such an environment.
Quantum fluctuation theorem: Can we go from micro to meso?
Quantum extensions of the Gallavotti-Cohen fluctuation theorem (FT) for the
entropy production have been discussed by several authors. There is a practical
gap between microscopic forms of FT and mesoscopic (i.e. not purely
Hamiltonian) forms for open systems. In a microscopic setup, it is easy to
state and to prove FT. In a mesoscopic setup, it is difficult to identify
fluctuations of the entropy production. (This difficulty is absent in the
classical case.) We discuss a particular mesoscopic model: a Lindblad master
equation, in which we state FT and, more importantly, connect it rigorously
with the underlying microscopic FT. We also remark that FT is satisfied by the
Lesovik-Levitov formula for statistics of charge transport. (Conference proceedings, Brussels, March 2006; to appear in Comptes rendus - Physique.)