
    Thermodynamic stability criteria for a quantum memory based on stabilizer and subsystem codes

    We discuss and review several thermodynamic criteria that have been introduced to characterize the thermal stability of a self-correcting quantum memory. We first examine the use of symmetry-breaking fields in analyzing the properties of self-correcting quantum memories in the thermodynamic limit: we show that the thermal expectation values of all logical operators vanish for any stabilizer and any subsystem code in any spatial dimension. On the positive side, we generalize the results in [R. Alicki et al., arXiv:0811.0033] to obtain a general upper bound on the relaxation rate of a quantum memory at nonzero temperature, assuming that the quantum memory interacts with a thermal bath via a Markovian master equation. This upper bound is applicable to quantum memories based on either stabilizer or subsystem codes. Comment: 23 pages. v2: revised introduction, various additional comments, and a new section on gapped Hamiltonians
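The commutation structure underlying stabilizer codes can be checked mechanically in the binary-symplectic representation. The sketch below is illustrative only (it uses the 3-qubit bit-flip code, not the codes analyzed in the paper): it verifies that the logical operators commute with every stabilizer generator while anticommuting with each other.

```python
import numpy as np

def symp(x_bits, z_bits):
    """Pauli operator on n qubits as a binary-symplectic vector (x | z)."""
    return np.array(x_bits + z_bits, dtype=int)

def commute(p, q):
    """Symplectic inner product mod 2: 0 means commute, 1 means anticommute."""
    n = len(p) // 2
    return (p[:n] @ q[n:] + p[n:] @ q[:n]) % 2 == 0

# 3-qubit bit-flip code: stabilizer generators Z1Z2 and Z2Z3
S = [symp([0, 0, 0], [1, 1, 0]), symp([0, 0, 0], [0, 1, 1])]
logical_X = symp([1, 1, 1], [0, 0, 0])  # X1X2X3
logical_Z = symp([0, 0, 0], [1, 0, 0])  # Z1

assert all(commute(logical_X, s) for s in S)
assert all(commute(logical_Z, s) for s in S)
assert not commute(logical_X, logical_Z)
```

The same symplectic test extends directly to subsystem codes, where gauge operators need only commute with the stabilizer group.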

    Compressed Sensing Performance Analysis via Replica Method using Bayesian framework

    Compressive sensing (CS) is a new methodology for capturing signals at a rate lower than the Nyquist sampling rate when the signals are sparse or sparse in some domain. In this paper, the performance of CS estimators is analyzed using tools from statistical mechanics, in particular the replica method. This method has been used to analyze large communication systems such as Code Division Multiple Access (CDMA) and multiple-input multiple-output (MIMO) systems. Replica analysis, nowadays rigorously proved, is an efficient tool for analyzing large systems in general. Specifically, we analyze the performance of estimators used in CS, such as the LASSO (Least Absolute Shrinkage and Selection Operator) estimator and the zero-norm regularizing estimator, as special cases of the maximum a posteriori (MAP) estimator, using a Bayesian framework to connect the CS estimators with the replica method. We use both the replica symmetric (RS) ansatz and the one-step replica symmetry breaking (1RSB) ansatz, claiming that the latter is needed when the problem is not convex. This work is primarily analytical; numerical results are deferred to a next step. Comment: The analytical work and results were presented at the 2012 IEEE European School of Information Theory in Antalya, Turkey between the 16th and the 20th of April
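The LASSO estimator discussed above can be demonstrated numerically. This is a minimal sketch, not the paper's replica analysis: it recovers a synthetic sparse signal from random sub-Nyquist measurements using iterative soft thresholding (ISTA), a standard solver for the LASSO objective. All problem sizes and the regularization weight are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 100, 8                    # signal length, measurements, sparsity
A = rng.normal(size=(m, n)) / np.sqrt(m) # random measurement matrix
x_true = np.zeros(n)
support = rng.choice(n, k, replace=False)
x_true[support] = rng.normal(size=k)
y = A @ x_true                           # noiseless compressive measurements

def ista(A, y, lam=0.01, iters=500):
    """Iterative soft thresholding for the LASSO objective
    0.5 * ||y - A x||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = x + (A.T @ (y - A @ x)) / L  # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # shrinkage
    return x

x_hat = ista(A, y)
rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(rel_err)
```

With m well above the sparsity level k, the l1-regularized estimate recovers the sparse signal up to the small bias introduced by the shrinkage.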

    Measuring neutrino masses with a future galaxy survey

    We perform a detailed forecast of how well a Euclid-like photometric galaxy and cosmic shear survey will be able to constrain the absolute neutrino mass scale. Adopting conservative assumptions about the survey specifications and assuming complete ignorance of the galaxy bias, we estimate that the minimum mass sum, sum m_nu ~ 0.06 eV in the normal hierarchy, can be detected at 1.5 sigma to 2.5 sigma significance, depending on the model complexity, using a combination of galaxy and cosmic shear power spectrum measurements in conjunction with CMB temperature and polarisation observations from Planck. With better knowledge of the galaxy bias, the significance of the detection could potentially reach 5.4 sigma. Interestingly, neither Planck+shear nor Planck+galaxy alone can achieve this level of sensitivity; it is the combined effect of galaxy and cosmic shear power spectrum measurements that breaks the persistent degeneracies between the neutrino mass, the physical matter density, and the Hubble parameter. Notwithstanding this remarkable sensitivity to sum m_nu, Euclid-like shear and galaxy data will not be sensitive to the exact mass spectrum of the neutrino sector; no significant bias (< 1 sigma) in the parameter estimation is induced by fitting inaccurate models of the neutrino mass splittings to the mock data, nor does the goodness-of-fit of these models suffer any significant degradation relative to the true one (Delta chi_eff^2 < 1). Comment: v1: 29 pages, 10 figures. v2: 33 pages, 12 figures; added sections on shape evolution and constraints in more complex models, accepted for publication in JCAP
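The degeneracy breaking described above is the standard Fisher-forecast mechanism: two probes that are individually degenerate along different parameter directions become jointly constraining when their Fisher matrices are summed. The numbers below are purely illustrative toy values, not Euclid or Planck forecasts.

```python
import numpy as np

# Toy 2-parameter Fisher matrices (illustrative numbers only): each probe
# alone is nearly degenerate, but along different directions, so their sum
# is well conditioned.
F_shear  = np.array([[100.,  99.], [ 99., 100.]])  # degenerate along (1, -1)
F_galaxy = np.array([[100., -99.], [-99., 100.]])  # degenerate along (1,  1)

def sigma_marg(F, i=0):
    """1-sigma marginalized error on parameter i: sqrt of (F^-1)_ii."""
    return np.sqrt(np.linalg.inv(F)[i, i])

print(sigma_marg(F_shear))             # large: the degeneracy inflates the error
print(sigma_marg(F_shear + F_galaxy))  # combining the probes breaks it
```

Because the Fisher matrices of independent probes add, the marginalized error on the first parameter shrinks by roughly an order of magnitude in this toy example once the second probe is included.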

    Stability of the replica symmetric solution in diluted perceptron learning

    We study the role played by dilution in the average behavior of a perceptron model with continuous couplings, using the replica method. We analyze the stability of the replica symmetric solution as a function of the dilution field for the generalization and memorization problems. Through a Gardner-like stability analysis, we show that at any fixed ratio α between the number of patterns M and the dimension N of the perceptron (α = M/N), there exists a critical dilution field h_c above which the replica symmetric ansatz becomes unstable. Comment: Stability of the solution in arXiv:0907.3241, 13 pages (some typos corrected)
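The memorization setting at fixed α = M/N can be simulated directly. The sketch below is an illustrative toy, not the paper's replica calculation: it trains a perceptron on M = αN random patterns with a fixed dilution mask that clamps a fraction of the couplings to zero, and checks that the remaining couplings can still store the patterns when the effective load is below capacity.

```python
import numpy as np

rng = np.random.default_rng(1)
N, alpha = 200, 0.5
M = int(alpha * N)                        # load alpha = M/N patterns per coupling
xi = rng.choice([-1, 1], size=(M, N))     # random binary patterns
sigma = rng.choice([-1, 1], size=M)       # random target labels (memorization)

# Dilution: a fixed mask clamps a fraction p of the couplings to zero.
p = 0.3
mask = (rng.random(N) > p).astype(float)

w = np.zeros(N)
for _ in range(200):                      # perceptron learning rule
    errors = 0
    for mu in range(M):
        if sigma[mu] * (w @ xi[mu]) <= 0:
            w += sigma[mu] * xi[mu] * mask  # update only undiluted couplings
            errors += 1
    if errors == 0:                       # all patterns stored
        break

train_err = np.mean(np.sign(w @ xi.T) != sigma)
print(train_err)
```

With 30% of the couplings diluted, the effective load M / (0.7 N) is still well below the capacity of the continuous-coupling perceptron, so the learning rule finds a separating solution.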

    Phase Diagram of the Kitaev-type Model on a Decorated Honeycomb Lattice in the Isolated Dimer Limit

    An effective model in the isolated dimer limit of the Kitaev-type model on a decorated honeycomb lattice is investigated at finite temperature. The ground state of this model is shown exactly to be a chiral spin liquid with spontaneous breaking of time reversal symmetry. We map out the finite-temperature phase diagram using the mean-field approximation and Monte Carlo simulation. We find that the phase transition between the high-temperature paramagnetic phase and the low-temperature chiral spin liquid phase is always of second order in the Monte Carlo results, although a tricritical point appears in the mean-field phase diagram. The finite-size scaling analysis of the Monte Carlo data indicates that the phase transition belongs to the two-dimensional Ising universality class. Comment: 8 pages, 4 figures
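The Monte Carlo machinery referenced above, single-spin-flip Metropolis sampling followed by finite-size scaling, is standard for transitions in the 2D Ising universality class. As a minimal illustration (the plain square-lattice Ising model, not the decorated-honeycomb effective model of the paper), the sketch below shows the magnetization dropping between the ordered and paramagnetic phases.

```python
import numpy as np

def metropolis_ising(L=16, T=2.0, sweeps=100, seed=0):
    """Single-spin-flip Metropolis for the 2D Ising model on an L x L
    periodic lattice; returns |magnetization| per spin of the final state."""
    rng = np.random.default_rng(seed)
    s = np.ones((L, L), dtype=int)        # ordered start
    for _ in range(sweeps):
        for _ in range(L * L):
            i, j = rng.integers(L, size=2)
            nb = (s[(i + 1) % L, j] + s[(i - 1) % L, j]
                  + s[i, (j + 1) % L] + s[i, (j - 1) % L])
            dE = 2 * s[i, j] * nb         # energy cost of flipping spin (i, j)
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                s[i, j] = -s[i, j]
    return abs(s.mean())

# Below Tc ~ 2.269 the magnetization stays large; well above Tc it is small.
print(metropolis_ising(T=1.5), metropolis_ising(T=4.0))
```

In a universality-class study one would repeat such runs for several lattice sizes L and collapse the data with the 2D Ising exponents, which is the finite-size scaling analysis the abstract describes.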

    Complexity, BioComplexity, the Connectionist Conjecture and Ontology of Complexity

    This paper develops and integrates major ideas and concepts on complexity and biocomplexity - the connectionist conjecture, universal ontology of complexity, irreducible complexity of totality & inherent randomness, perpetual evolution of information, emergence of criticality and equivalence of symmetry & complexity. This paper introduces the Connectionist Conjecture, which states that the one and only representation of Totality is the connectionist one, i.e. in terms of nodes and edges. This paper also introduces an idea of a Universal Ontology of Complexity and develops concepts in that direction. The paper also develops ideas and concepts on the perpetual evolution of information and the irreducibility and computability of totality, all in the context of the Connectionist Conjecture. The paper indicates that control and communication are the prime functionals responsible for the symmetry and complexity of complex phenomena. The paper takes the stand that the phenomenon of life (including its evolution) is probably the nearest to what we can describe with the term “complexity”. The paper also assumes that signaling and communication within the living world and of the living world with the environment create the connectionist structure of biocomplexity. With life and its evolution as the substrate, the paper develops ideas towards the ontology of complexity. The paper introduces new complexity-theoretic interpretations of fundamental biomolecular parameters. The paper also develops ideas on the methodology to determine the complexity of “true” complex phenomena.