
    Good Quantum Convolutional Error Correction Codes And Their Decoding Algorithm Exist

    Quantum convolutional codes were recently introduced as an alternative way to protect vital quantum information. To complete the analysis of quantum convolutional codes, I report a way to decode certain quantum convolutional codes based on the classical Viterbi decoding algorithm. This decoding algorithm is optimal for a memoryless channel. I also report three simple criteria to test whether decoding errors in a quantum convolutional code will terminate after a finite number of decoding steps whenever the Hilbert-space dimension of each quantum register is a prime power. Finally, I show that certain quantum convolutional codes are in fact stabilizer codes; hence, these quantum convolutional stabilizer codes have fault-tolerant implementations. Comment: Minor changes, to appear in PR
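    The classical Viterbi algorithm on which this decoding builds can be sketched for an ordinary (non-quantum) convolutional code. The rate-1/2, constraint-length-3 code below (generators 7 and 5 in octal) is a standard textbook choice, not a construction from the paper:

```python
# Hard-decision Viterbi decoding for a classical rate-1/2,
# constraint-length-3 convolutional code (generators 7 and 5 in octal).
# Toy textbook parameters -- not the quantum construction of the paper.

G = (0b111, 0b101)  # generator polynomials

def parity(x):
    return bin(x).count("1") % 2

def encode(bits):
    """Shift each information bit into a 3-bit register; emit 2 parity bits."""
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & 0b111
        out += [parity(state & g) for g in G]
    return out

def viterbi(received):
    """Keep, per 2-bit trellis state, the cheapest path (Hamming metric)."""
    metrics, paths = {0: 0}, {0: []}
    for i in range(0, len(received), 2):
        r = received[i:i + 2]
        new_metrics, new_paths = {}, {}
        for s, m in metrics.items():
            for b in (0, 1):
                full = ((s << 1) | b) & 0b111
                branch = [parity(full & g) for g in G]
                cost = m + sum(e != x for e, x in zip(branch, r))
                ns = full & 0b11
                if ns not in new_metrics or cost < new_metrics[ns]:
                    new_metrics[ns] = cost
                    new_paths[ns] = paths[s] + [b]
        metrics, paths = new_metrics, new_paths
    return paths[min(metrics, key=metrics.get)]
```

    With hard decisions, the survivor with the smallest Hamming distance to the received word is kept at each state; a single flipped bit is corrected here because this code's free distance is 5.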

    Butterfly distribution along altitudinal gradients: temporal changes over a short time period

    Mountain ecosystems are particularly sensitive to changes in climate and land cover, but at the same time they can offer important refuges for species, in contrast to the more altered lowlands. To explore the potential role of mountain ecosystems in butterfly conservation and to assess the vulnerability of alpine species, we analyzed the short-term changes (2006-2008 vs. 2012-2013) in butterflies' distribution along altitudinal gradients in the NW Italian Alps. We sampled butterfly communities once a month (62 sampling stations, 3 seasonal replicates per year, from June to August) using semi-quantitative sampling techniques. The monitored gradient ranges from the montane to the alpine belt (600-2700 m a.s.l.) within three protected areas: Gran Paradiso National Park (LTER, Sitecode: LTER_EU_IT_109), Orsiera Rocciavrè Natural Park and Veglia Devero Natural Park. We investigated butterflies' temporal changes following a hierarchical approach, to assess potential relationships between the species and community levels. As a first step, we characterized each species in terms of habitat requirements, elevational range and temperature preferences, and we compared plot occupancy and altitudinal range changes between time periods (2006-2008 vs. 2012-2013). Secondly, we focused on the community level, analyzing temporal changes in species richness and community composition. The species-level analysis highlighted a general increase in mean occupancy and significant changes at both altitudinal boundaries. Looking at the ecological groups, we observed an increase of generalist and highly mobile species at the expense of specialist and less mobile ones. At the community level, we observed a significant increase in species richness and in the community temperature index, and a tendency towards homogenization within communities. Despite the short time period considered, butterfly species distributions and communities changed considerably. In light of these results, it is fundamental to continue monitoring activities to understand whether we are facing transient changes or the first signals of an imminent trend.
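    The community temperature index used in such analyses is conventionally the abundance-weighted mean of the species temperature indices of the butterflies recorded in a plot. A minimal sketch with hypothetical species and values (not data from the study):

```python
# Community temperature index (CTI): the abundance-weighted mean of the
# species temperature indices (STI) of the species recorded in a plot.
# Species names, counts, and STI values are hypothetical illustrations.

def cti(counts, sti):
    """counts: species -> abundance; sti: species -> temperature index."""
    total = sum(counts.values())
    return sum(n * sti[sp] for sp, n in counts.items()) / total

sti = {"A": 8.0, "B": 10.0, "C": 12.0}  # warmer-adapted species: higher STI
early = {"A": 10, "B": 10}              # earlier, cooler-adapted community
late = {"A": 2, "B": 10, "C": 8}        # later, warmer-adapted community
```

    An increase in CTI between sampling periods, as reported above, indicates a shift toward more thermophilous communities.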

    Error-correcting code on a cactus: a solvable model

    An exact solution to a family of parity-check error-correcting codes is provided by mapping the problem onto a Husimi cactus. The solution obtained in the thermodynamic limit recovers the replica symmetric theory results and provides a very good approximation to finite systems of moderate size. The probability propagation decoding algorithm emerges naturally from the analysis. A phase transition between decoding success and failure phases is found to coincide with an information-theoretic upper bound. The method is employed to compare Gallager and MN codes. Comment: 7 pages, 3 figures, with minor correction
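    The probability-propagation (sum-product) decoder referred to above is built from local message updates. As one illustrative piece, the standard check-node rule in log-likelihood form combines incoming messages with the tanh rule; this is a generic textbook update, not the cactus-specific derivation:

```python
import math

# Check-node update of the sum-product ("probability propagation")
# decoder in log-likelihood form: the message sent to one variable
# combines the incoming LLRs from every *other* variable on the parity
# check via the tanh rule. Generic textbook update, illustrative values.

def check_update(incoming_llrs):
    prod = 1.0
    for m in incoming_llrs:
        prod *= math.tanh(m / 2.0)
    return 2.0 * math.atanh(prod)
```

    The sign of the output follows the parity of the incoming signs, and its magnitude never exceeds that of the least reliable input.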

    Statistical Physics of Irregular Low-Density Parity-Check Codes

    Low-density parity-check codes with irregular constructions have been recently shown to outperform the most advanced error-correcting codes to date. In this paper we apply methods of statistical physics to study the typical properties of simple irregular codes. We use the replica method to find a phase transition which coincides with Shannon's coding bound when appropriate parameters are chosen. The decoding by belief propagation is also studied using statistical physics arguments; the theoretical solutions obtained are in good agreement with simulations. We compare the performance of irregular with that of regular codes and discuss the factors that contribute to the improvement in performance. Comment: 20 pages, 9 figures, revised version submitted to JP
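    For intuition about belief propagation on such codes, the erasure-channel special case reduces to peeling: any parity check with exactly one erased bit determines that bit. The small irregular parity-check matrix below (unequal column weights) is our own toy example, not a code from the paper:

```python
# Peeling (belief-propagation) decoder for the binary erasure channel:
# repeatedly resolve any check with exactly one erased bit. H is a toy
# irregular parity-check matrix (column weights differ between bits).

H = [
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 0, 0, 1, 1],
]

def peel(received):
    """received: list of 0/1/None (None = erasure); filled word or None."""
    word = list(received)
    progress = True
    while progress:
        progress = False
        for row in H:
            erased = [j for j, h in enumerate(row) if h and word[j] is None]
            if len(erased) == 1:
                known = sum(word[j] for j, h in enumerate(row)
                            if h and word[j] is not None)
                word[erased[0]] = known % 2
                progress = True
    return word if None not in word else None
```

    Decoding fails exactly when the erased positions form a stopping set, i.e. every check touches zero or at least two erasures.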

    Assessment of landscape change’s impact on Alpine species distribution using a multi-scale approach

    There is a strong relation between biodiversity and traditional land use in Mediterranean areas. In these highly human-dominated regions, traditional activities profoundly shape the landscape, with strong consequences for biodiversity patterns. However, in the last few decades rapid socio-economic change has led to the abandonment of "marginal" land, modifying landscape structures. Available remote sensing data can provide information about environmental changes, but temporal and spatial gaps (e.g., the limited temporal archive of historical aerial images and the coarser spatial resolution of satellite data) can reduce the applicability of the gained information. Considering the scale-dependency of ecological processes, we propose a multi-temporal, multi-scale approach, combining remotely sensed and field data, to monitor changes in vegetation and landscape structures and to evaluate their role in shaping Alpine species distribution. The study area is the Gran Paradiso National Park (NW Italian Alps), and we focused both on five altitudinal transects, representative of three altitudinal belts, and on the landscape level. First, from the interpretation of historical aerial photos in sampled areas, we reconstructed the land cover changes that occurred during the last decades and extended this information to the entire Park landscape through a supervised classification of satellite data. Furthermore, we developed a low-cost UAV (Unmanned Aerial Vehicle) survey procedure adapted to the Alpine environment, integrated with botanical sampling, in order to obtain high-resolution land cover maps in test areas that replace the use of aerial photos in the supervised classification of satellite data. This multi-scale analysis of landscape change allows us to detail how environmental patterns affect Alpine animal species distributions, from discrete areas up to the entire Park area.
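    The supervised classification step mentioned above can be illustrated in miniature with a nearest-centroid classifier over pixel spectra. Class names and band values below are hypothetical; an actual study would use a standard remote-sensing toolchain rather than this sketch:

```python
# Minimal nearest-centroid supervised classification of pixel spectra:
# compute a mean spectrum per labeled class, then assign each pixel to
# the nearest class centroid. Band values and class names are
# hypothetical; a real workflow would use a remote-sensing toolchain.

def train(samples):
    """samples: class -> list of spectra; returns class -> centroid."""
    return {c: [sum(band) / len(spectra) for band in zip(*spectra)]
            for c, spectra in samples.items()}

def classify(pixel, centroids):
    return min(centroids,
               key=lambda c: sum((p - m) ** 2
                                 for p, m in zip(pixel, centroids[c])))

training = {"forest": [[0.1, 0.4], [0.2, 0.5]],
            "grassland": [[0.4, 0.8], [0.5, 0.9]]}
```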

    Search algorithms as a framework for the optimization of drug combinations

    Combination therapies are often needed for effective clinical outcomes in the management of complex diseases, but at present they are generally based on empirical clinical experience. Here we suggest a novel application of search algorithms, originally developed for digital communication, modified to optimize combinations of therapeutic interventions. In biological experiments measuring the restoration of the decline with age in heart function and exercise capacity in Drosophila melanogaster, we found that search algorithms correctly identified optimal combinations of four drugs with only one third of the tests performed in a fully factorial search. In experiments identifying combinations of three doses of up to six drugs for selective killing of human cancer cells, search algorithms resulted in a highly significant enrichment of selective combinations compared with random searches. In simulations using a network model of cell death, we found that the search algorithms identified the optimal combinations of 6-9 interventions in 80-90% of tests, compared with 15-30% for an equivalent random search. These findings suggest that modified search algorithms from information theory have the potential to enhance the discovery of novel therapeutic drug combinations. This report also helps to frame a biomedical problem that will benefit from an interdisciplinary effort and suggests a general strategy for its solution. Comment: 36 pages, 10 figures, revised version
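    As a schematic of how a guided search can beat a fully factorial sweep, the sketch below runs a greedy coordinate search over discrete dose levels on a synthetic response surface. It is our own illustration of the evaluation savings, not one of the algorithms used in the paper:

```python
import itertools

# Greedy coordinate search over discrete dose levels: adjust one drug at
# a time, keeping the best level for that drug before moving on. The
# response function is synthetic; this only illustrates the evaluation
# savings over a fully factorial search, not the paper's algorithms.

LEVELS = (0, 1, 2)  # three dose levels per drug

def coordinate_search(response, n_drugs, sweeps=2):
    combo, evals = [0] * n_drugs, 0
    for _ in range(sweeps):
        for i in range(n_drugs):
            scores = {}
            for lvl in LEVELS:
                scores[lvl] = response(tuple(combo[:i] + [lvl] + combo[i + 1:]))
                evals += 1
            combo[i] = max(scores, key=scores.get)
    return tuple(combo), evals

def factorial_search(response, n_drugs):
    grid = list(itertools.product(LEVELS, repeat=n_drugs))
    return max(grid, key=response), len(grid)

# Synthetic single-peak response; the peak location is arbitrary.
target = (2, 1, 0, 1)
def response(combo):
    return -sum((c - t) ** 2 for c, t in zip(combo, target))
```

    On this surface, two sweeps over four drugs cost 24 evaluations versus 81 for the full 3^4 factorial grid, in the same spirit as the roughly one-third saving reported above.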

    Structural Learning of Attack Vectors for Generating Mutated XSS Attacks

    Web applications suffer from cross-site scripting (XSS) attacks resulting from incomplete or incorrect input sanitization. Learning the structure of attack vectors could enrich the variety of manifestations in generated XSS attacks. In this study, we focus on generating more threatening XSS attacks for the state-of-the-art detection approaches that can find potential XSS vulnerabilities in Web applications, and propose a mechanism for structural learning of attack vectors with the aim of generating mutated XSS attacks in a fully automatic way. Mutated XSS attack generation depends on the analysis of attack vectors and the structural learning mechanism. For the kernel of the learning mechanism, we use a hidden Markov model (HMM) as the structure of the attack vector model to capture the implicit manner of the attack vector; this manner benefits from the syntactic meanings labeled by the proposed tokenizing mechanism. Bayes' theorem is used to determine the number of hidden states in the model for generalizing the structural model. The paper makes the following contributions: (1) it automatically learns the structure of attack vectors from practical data analysis to build a structural model of attack vectors; (2) it mimics the manner and elements of attack vectors to extend the ability of testing tools to identify XSS vulnerabilities; (3) it helps verify the flaws of blacklist sanitization procedures in Web applications. We evaluated the proposed mechanism with Burp Intruder on a dataset collected from public XSS archives. The results show that mutated XSS attack generation can identify potential vulnerabilities. Comment: In Proceedings TAV-WEB 2010, arXiv:1009.330
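    Generation from a learned HMM works by alternating emissions and hidden-state transitions. The toy model below uses harmless placeholder states and tokens rather than real attack-vector structure:

```python
import random

# Sampling token sequences from a small hidden Markov model: hidden
# states model structural position, emissions are surface tokens. All
# states, tokens, and probabilities are harmless toy placeholders, not
# the attack-vector model learned in the paper.

TRANS = {"open": {"body": 1.0},
         "body": {"body": 0.5, "close": 0.5},
         "close": {}}                     # no outgoing transition: stop
EMIT = {"open": {"<": 1.0},
        "body": {"a": 0.6, "b": 0.4},
        "close": {">": 1.0}}

def sample(dist, rng):
    r = rng.random()
    for item, p in dist.items():
        r -= p
        if r <= 0:
            return item
    return item  # guard against floating-point underflow

def generate(rng):
    state, out = "open", []
    while True:
        out.append(sample(EMIT[state], rng))
        if not TRANS[state]:
            return "".join(out)
        state = sample(TRANS[state], rng)
```

    Every sampled string respects the structure encoded in the transitions (here, an opener, a variable-length body, a closer), which is what lets an HMM-based generator mimic a learned vector family.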

    Confluence Modulo Equivalence in Constraint Handling Rules

    Previous results on proving confluence for Constraint Handling Rules are extended in two ways in order to allow a larger and more realistic class of CHR programs to be considered confluent. Firstly, we introduce the relaxed notion of confluence modulo equivalence into the context of CHR: while confluence for a terminating program means that all alternative derivations for a query lead to the exact same final state, confluence modulo equivalence only requires the final states to be equivalent with respect to an equivalence relation tailored to the given program. Secondly, we allow non-logical built-in predicates such as var/1 and incomplete ones such as is/2, which are ignored in previous work on confluence. To this end, a new operational semantics for CHR is developed which includes such predicates. In addition, this semantics differs from earlier approaches by its simplicity without loss of generality, and it may also be recommended for future studies of CHR. For the purely logical subset of CHR, proofs can be expressed in first-order logic, which we show is not sufficient in the present case. We have introduced a formal meta-language that allows reasoning about abstract states and derivations with meta-level restrictions that reflect the non-logical and incomplete predicates. This language represents subproofs as diagrams, which facilitates a systematic enumeration of proof cases, pointing forward to mechanical support for such proofs.
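    The relaxed notion can be illustrated on a tiny abstract rewrite system: below, plain confluence fails (two distinct final states), but confluence modulo an equivalence that identifies them holds. The states and rules are toy stand-ins, not CHR program states:

```python
from itertools import product

# Brute-force check of confluence modulo equivalence on a tiny abstract
# rewrite system: all final states reachable from a start state must be
# pairwise equivalent. States and rules are toy stand-ins, not CHR states.

STEPS = {"s": {"a", "b"}, "a": {"fa"}, "b": {"fb"}}  # s rewrites two ways

def finals(state):
    """All terminal states reachable from `state` (assumes termination)."""
    if state not in STEPS:
        return {state}
    out = set()
    for nxt in STEPS[state]:
        out |= finals(nxt)
    return out

def confluent_modulo(start, equiv):
    return all(equiv(x, y) for x, y in product(finals(start), repeat=2))
```

    With strict equality as the equivalence, the system is not confluent; with an equivalence tailored to identify the two final states, it is confluent modulo that equivalence, mirroring the relaxation described above.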

    Enhanced stochastic optimization algorithm for finding effective multi-target therapeutics

    Background: For treating a complex disease such as cancer, we need effective means to control the biological network that underlies the disease. However, biological networks are typically robust to external perturbations, making it difficult to beneficially alter the network dynamics by controlling a single target. In fact, multi-target therapeutics is often more effective than monotherapy, and combination drugs are commonly used these days for treating various diseases. A practical challenge in combination therapy is that the number of possible drug combinations increases exponentially, which makes the prediction of the optimal drug combination a difficult combinatorial optimization problem. Recently, a stochastic optimization algorithm called the Gur Game algorithm was proposed for drug optimization, which was shown to be very efficient in finding potent drug combinations. Results: In this paper, we propose a novel stochastic optimization algorithm that can be used for effective optimization of combinatory drugs. The proposed algorithm analyzes how the concentration change of a specific drug affects the overall drug response, thereby making an informed guess on how the concentration should be updated to improve the drug response. We evaluated the performance of the proposed algorithm on various drug response functions and compared it with the Gur Game algorithm. Conclusions: Numerical experiments clearly show that the proposed algorithm significantly outperforms the original Gur Game algorithm in terms of reliability and efficiency. This enhanced optimization algorithm can provide an effective framework for identifying potent drug combinations that lead to optimal drug response.
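    A response-guided stochastic dose search in the spirit described above can be sketched as follows. This is our own simplified stand-in with a synthetic response function, not the algorithm from the paper:

```python
import random

# Response-guided stochastic dose search: repeatedly nudge one drug's
# dose and keep only moves that improve the measured response. A
# simplified, schematic stand-in for the enhanced algorithm of the
# abstract; the response surface below is synthetic, not experimental.

def optimize(response, n_drugs, steps=200, rng=None):
    rng = rng or random.Random(0)
    dose = [0.5] * n_drugs              # start mid-range, doses in [0, 1]
    for _ in range(steps):
        i = rng.randrange(n_drugs)      # pick one drug to perturb
        trial = list(dose)
        trial[i] = min(1.0, max(0.0, trial[i] + rng.choice([-0.1, 0.1])))
        if response(trial) > response(dose):   # keep only improving moves
            dose = trial
    return dose, response(dose)

# Synthetic response peaked at a hidden optimal dose combination.
target = [0.8, 0.2, 0.6]
def response(dose):
    return 1.0 - sum((d - t) ** 2 for d, t in zip(dose, target))
```

    Because each accepted move is informed by an observed change in response, the search homes in on a potent combination with far fewer measurements than exhaustive dose scanning would require.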

    Concurrent codes: a holographic-type encoding robust against noise and loss

    Concurrent coding is an encoding scheme with 'holographic'-type properties that are shown here to be robust against a significant amount of noise and signal loss. This single encoding scheme is able to correct for random errors and burst errors simultaneously, but does not rely on cyclic codes. A simple and practical scheme has been tested that displays perfect decoding when the signal-to-noise ratio is of order -18 dB. The same scheme also displays perfect reconstruction when a contiguous block of 40% of the transmission is missing. In addition, this scheme is 50% more efficient in terms of transmitted power requirements than equivalent cyclic codes. A simple model is presented that describes the process of decoding and can determine the expected computational load, as well as the critical levels of noise and missing data at which false messages begin to be generated.
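    The flavour of concurrent (BBC-style) coding can be sketched in miniature: every prefix of the message sets one 'mark' in a shared channel, and the decoder grows candidate prefixes whose marks are all present, so spurious marks from noise only add branches to prune. The slot count and hash below are arbitrary toy parameters, not those of the tested scheme:

```python
import hashlib

# Miniature concurrent (BBC-style) encoding: every prefix of the message
# sets one mark in a channel of SLOTS positions; the decoder extends
# candidate prefixes bit by bit, keeping those whose mark is present.
# Slot count and hash are arbitrary toy choices, not the tested scheme.

SLOTS = 4096

def mark(prefix):
    digest = hashlib.sha256(prefix.encode()).digest()
    return int.from_bytes(digest[:4], "big") % SLOTS

def encode(bits):
    return {mark(bits[:i + 1]) for i in range(len(bits))}

def decode(channel, length):
    prefixes = [""]
    for _ in range(length):
        prefixes = [p + b for p in prefixes for b in "01"
                    if mark(p + b) in channel]
    return prefixes

msg = "101101"
channel = encode(msg) | {7, 99, 1234}   # noise: spurious extra marks
```

    Because the true prefixes' marks are always present, the correct message always survives decoding; noise can only add false candidates rather than destroy the true one, which is the intuition behind the critical noise levels at which false messages begin to appear.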