
    Fidelity threshold for long-range entanglement in quantum networks

    We present a strategy to generate long-range entanglement in noisy quantum networks. We consider a cubic lattice whose bonds are partially entangled mixed states of two qubits, and where quantum operations can be applied perfectly at the nodes. In contrast to protocols designed for one- or two-dimensional regular lattices, we find that entanglement can be created between arbitrarily distant qubits if the fidelity of the bonds is higher than a critical value, independent of the system size. Therefore, we show that a constant overhead of local resources, together with connections of finite fidelity, is sufficient to achieve long-distance quantum communication in noisy networks. Comment: published version
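    The size-independent fidelity threshold is reminiscent of classical bond percolation on a cubic lattice: above a critical bond probability, arbitrarily distant sites are connected with high probability. A minimal sketch of that classical analogy only (a plain percolation simulation, not the paper's entanglement protocol; the lattice size, probabilities, and trial counts are arbitrary illustrative choices):

```python
import random
from collections import deque

def connected(n, p, rng):
    # Bond percolation on an n x n x n cubic lattice: keep each nearest-
    # neighbour bond independently with probability p, then test whether
    # two opposite corners end up in the same cluster (via BFS).
    sites = [(x, y, z) for x in range(n) for y in range(n) for z in range(n)]
    adj = {s: [] for s in sites}
    for (x, y, z) in sites:
        for dx, dy, dz in ((1, 0, 0), (0, 1, 0), (0, 0, 1)):
            nb = (x + dx, y + dy, z + dz)
            if nb in adj and rng.random() < p:
                adj[(x, y, z)].append(nb)
                adj[nb].append((x, y, z))
    start, goal = (0, 0, 0), (n - 1, n - 1, n - 1)
    seen, queue = {start}, deque([start])
    while queue:
        s = queue.popleft()
        if s == goal:
            return True
        for t in adj[s]:
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return False

rng = random.Random(0)
high = sum(connected(6, 0.60, rng) for _ in range(20))  # far above threshold
low = sum(connected(6, 0.05, rng) for _ in range(20))   # far below threshold
print(high, low)  # high connects in almost every trial, low in almost none
```

    Above the threshold, growing the lattice does not reduce the corner-to-corner connection probability to zero, which mirrors the paper's claim that a fixed bond fidelity suffices at any distance.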

    Reconstructing a Simple Polytope from its Graph

    Blind and Mani (1987) proved that the entire combinatorial structure (the vertex-facet incidences) of a simple convex polytope is determined by its abstract graph. Their proof is not constructive. Kalai (1988) found a short, elegant, and algorithmic proof of that result. However, his algorithm always has exponential running time. We show that the problem of reconstructing the vertex-facet incidences of a simple polytope P from its graph can be formulated as a combinatorial optimization problem that is strongly dual to the problem of finding an abstract objective function on P (i.e., a shelling order of the facets of the dual polytope of P). Thereby, we derive polynomial certificates both for the vertex-facet incidences and for the abstract objective functions in terms of the graph of P. The paper is a variation on joint work with Michael Joswig and Friederike Koerner (2001). Comment: 14 pages
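    To fix the objects involved, here is a hand-built toy example (the 3-cube; this illustrates what the input and output of reconstruction are, not Kalai's algorithm or the optimization formulation): the graph of a simple d-polytope is d-regular, and the vertex-facet incidences are exactly what must be recovered from that graph alone.

```python
from itertools import product

# Graph of the 3-cube, a simple 3-polytope: vertices are 0/1 triples,
# edges join vertices that differ in exactly one coordinate.
vertices = list(product((0, 1), repeat=3))
edges = {(u, v) for u in vertices for v in vertices
         if u < v and sum(a != b for a, b in zip(u, v)) == 1}
degree = {v: sum(v in e for e in edges) for v in vertices}

# The vertex-facet incidences that reconstruction must recover: each facet
# of the cube fixes one coordinate and, because the polytope is simple,
# every vertex lies on exactly 3 facets.
facets = [{v for v in vertices if v[i] == c} for i in range(3) for c in (0, 1)]

print(all(d == 3 for d in degree.values()))              # graph is 3-regular
print(all(sum(v in f for f in facets) == 3 for v in vertices))
```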

    Hypergraphic LP Relaxations for Steiner Trees

    We investigate hypergraphic LP relaxations for the Steiner tree problem, primarily the partition LP relaxation introduced by Koenemann et al. [Math. Programming, 2009]. Specifically, we are interested in proving upper bounds on the integrality gap of this LP and in studying its relation to other linear relaxations. Our results are the following. Structural results: we extend the technique of uncrossing, usually applied to families of sets, to families of partitions. As a consequence, we show that any basic feasible solution to the partition LP formulation has sparse support: although the number of variables can be exponential, the number of positive variables is at most the number of terminals. Relations with other relaxations: we show the equivalence of the partition LP relaxation with other known hypergraphic relaxations. We also show that these hypergraphic relaxations are equivalent to the well-studied bidirected cut relaxation if the instance is quasibipartite. Integrality gap upper bounds: we show an upper bound of sqrt(3) ≈ 1.732 on the integrality gap of these hypergraphic relaxations in general graphs. In the special case of uniformly quasibipartite instances, we show an improved upper bound of 73/60 ≈ 1.216. By our equivalence theorem, the latter result implies an improved upper bound for the bidirected cut relaxation as well. Comment: Revised full version; a shorter version will appear at IPCO 2010
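    The integrality-gap statements compare the cost of an optimal (integral) Steiner tree against an LP optimum. For concreteness, here is a brute-force sketch of the integral side of that comparison on a tiny instance (the instance, node names, and helper are illustrative choices; no LP relaxation is solved here):

```python
from itertools import combinations

def mst_cost(nodes, weights):
    # Prim's algorithm on the subgraph induced by `nodes`; returns None
    # if that subgraph is disconnected.
    nodes = set(nodes)
    tree, cost = {next(iter(nodes))}, 0
    while tree != nodes:
        best = None
        for u in tree:
            for v in nodes - tree:
                w = weights.get((u, v), weights.get((v, u)))
                if w is not None and (best is None or w < best[0]):
                    best = (w, v)
        if best is None:
            return None
        cost += best[0]
        tree.add(best[1])
    return cost

# Toy instance: three terminals pairwise at distance 2, plus one Steiner
# node 's' at distance 1 from each terminal.
weights = {('a', 'b'): 2, ('b', 'c'): 2, ('a', 'c'): 2,
           ('s', 'a'): 1, ('s', 'b'): 1, ('s', 'c'): 1}
terminals, steiner = ['a', 'b', 'c'], ['s']

# A minimum Steiner tree may use any subset of the Steiner nodes.
best = min(c for k in range(len(steiner) + 1)
           for extra in combinations(steiner, k)
           if (c := mst_cost(terminals + list(extra), weights)) is not None)
print(best)  # 3: routing through s beats the terminal-only MST of cost 4
```

    This exponential enumeration is exactly what the LP relaxations avoid; the integrality gap measures how much is lost by doing so.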

    On Multilingual Training of Neural Dependency Parsers

    We show that a recently proposed neural dependency parser can be improved by joint training on multiple languages from the same family. The parser is implemented as a deep neural network whose only input is orthographic representations of words. In order to parse successfully, the network has to discover how linguistically relevant concepts can be inferred from word spellings. We analyze the representations of characters and words that are learned by the network to establish which properties of the languages were accounted for. In particular, we show that the parser has approximately learned to associate Latin characters with their Cyrillic counterparts and that it can group Polish and Russian words that have a similar grammatical function. Finally, we evaluate the parser on selected languages from the Universal Dependencies dataset and show that it is competitive with other recently proposed state-of-the-art methods, while having a simple structure. Comment: preprint accepted into the TSD201
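    A minimal sketch of a purely orthographic input representation of the kind described, where words from both scripts share one character vocabulary (the word list, padding scheme, and index assignment are illustrative assumptions, not the paper's preprocessing):

```python
# Build one character vocabulary over words from related languages and
# encode each word as a fixed-width sequence of character indices.
PAD = 0  # index 0 reserved for padding

def build_vocab(words):
    chars = sorted({ch for w in words for ch in w})
    return {ch: i + 1 for i, ch in enumerate(chars)}

def encode(word, vocab, width):
    ids = [vocab[ch] for ch in word]
    return ids + [PAD] * (width - len(ids))

# Polish (Latin script) and Russian (Cyrillic script) words share the
# vocabulary, so a network consuming these indices can, in principle,
# learn cross-script character associations.
words = ['woda', 'вода', 'matka', 'мать']
vocab = build_vocab(words)
width = max(len(w) for w in words)
encoded = [encode(w, vocab, width) for w in words]
print(len(vocab), encoded[0])
```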

    Configuration mixing of angular-momentum projected triaxial relativistic mean-field wave functions

    The framework of relativistic energy density functionals is extended to include correlations related to the restoration of broken symmetries and to fluctuations of collective variables. The generator coordinate method is used to perform configuration mixing of angular-momentum projected wave functions, generated by constrained self-consistent relativistic mean-field calculations for triaxial shapes. The effects of triaxial deformation and of K-mixing are illustrated in a study of spectroscopic properties of low-spin states in ^{24}Mg. Comment: 15 pages, 11 figures, 4 tables, accepted for publication in Phys. Rev.
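    For orientation, in standard generator-coordinate-method notation (a textbook-style sketch; the precise conventions and the treatment of particle-number projection are assumptions here, not necessarily this paper's definitions), the correlated states are superpositions of projected mean-field configurations, with weights fixed by the Hill-Wheeler-Griffin equation:

```latex
% GCM ansatz: mix angular-momentum (and particle-number) projected
% mean-field states |Phi(q)> over collective coordinates q and K
\left|\Psi^{JM}_{\alpha}\right\rangle
  = \sum_{q,K} f^{J\alpha}_{K}(q)\,
    \hat{P}^{J}_{MK}\,\hat{P}^{N}\,\hat{P}^{Z}\,\left|\Phi(q)\right\rangle

% Hill--Wheeler--Griffin equation for the weight functions f, with
% Hamiltonian and norm kernels H and N evaluated between projected states
\sum_{q',K'} \left[ \mathcal{H}^{J}_{KK'}(q,q')
  - E^{J}_{\alpha}\, \mathcal{N}^{J}_{KK'}(q,q') \right]
  f^{J\alpha}_{K'}(q') = 0
```

    The sum over K is what realizes the K-mixing mentioned in the abstract; restricting to axial shapes would collapse it to a single K value.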

    The no-core shell model with general radial bases

    Calculations in the ab initio no-core shell model (NCSM) have conventionally been carried out using the harmonic-oscillator many-body basis. However, the rapid falloff (Gaussian asymptotics) of the oscillator functions at large radius makes them poorly suited for the description of the asymptotic properties of the nuclear wavefunction. We establish the foundations for carrying out no-core configuration interaction (NCCI) calculations using a basis built from general radial functions and discuss some of the considerations that enter into using such a basis. In particular, we consider the Coulomb-Sturmian basis, which provides a complete set of functions with a realistic (exponential) radial falloff. Comment: 7 pages, 3 figures; presented at Horizons on Innovative Theories, Experiments, and Supercomputing in Nuclear Physics 2012, New Orleans, Louisiana, June 4-7, 2012; submitted to J. Phys. Conf. Ser.
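    The contrast between Gaussian and exponential falloff can be made concrete numerically. A small sketch (unnormalized reduced radial functions in arbitrary units, with both length parameters set to 1 for illustration; the indexing conventions are an assumption, and the Laguerre recurrence is the standard three-term one):

```python
import math

def genlaguerre(n, alpha, x):
    # Generalized Laguerre polynomial L_n^alpha(x) via the standard
    # three-term recurrence.
    if n == 0:
        return 1.0
    lm1, l = 1.0, 1.0 + alpha - x
    for k in range(1, n):
        lm1, l = l, ((2*k + 1 + alpha - x) * l - (k + alpha) * lm1) / (k + 1)
    return l

def sturmian(n, l, b, r):
    # Unnormalized Coulomb-Sturmian reduced radial function:
    # (2br)^(l+1) exp(-br) L_n^{2l+1}(2br) -- exponential falloff.
    x = 2.0 * b * r
    return x**(l + 1) * math.exp(-b * r) * genlaguerre(n, 2*l + 1, x)

def oscillator(n, l, b, r):
    # Unnormalized harmonic-oscillator reduced radial function:
    # (r/b)^(l+1) exp(-r^2/2b^2) L_n^{l+1/2}(r^2/b^2) -- Gaussian falloff.
    x = r / b
    return x**(l + 1) * math.exp(-0.5 * x * x) * genlaguerre(n, l + 0.5, x * x)

# At large radius the oscillator function is vastly smaller: a Gaussian
# tail cannot mimic the exponential tail of a bound-state wavefunction.
r = 10.0
print(sturmian(0, 0, 1.0, r) > oscillator(0, 0, 1.0, r))
```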

    At what stage in the drinking process does drinking water affect attention and memory? Effects of mouth rinsing and mouth drying in adults

    Drinking water is important for health, and there is agreement that drinking water facilitates certain cognitive processes. However, the mechanism underlying the effect of drinking water on cognition is unknown. While attention performance is improved by even a very small drink, memory performance seems to require larger drinks for enhancement. This suggests that attention could be affected earlier in the drinking process than memory. We aimed to elucidate the mechanism further by investigating which stage of the drinking process influences performance on cognitive tasks. To this end, we compared mouth rinsing and mouth drying. Improved attention performance after mouth rinsing would suggest that the responsible mechanism is located in the mouth and occurs early in the drinking process, before swallowing. Eighty-seven adults participated in either a treatment (mouth rinsing or mouth drying) or control (no intervention) condition. They were assessed on measures of visual attention, short-term memory, subjective thirst and mood at baseline and again 20 minutes later, after the intervention. Our results showed that mouth rinsing improved visual attention, but not short-term memory, mood or subjective thirst. Mouth drying did not affect performance. Our results support the hypothesis that different mechanisms underlie the effect of drinking water on different cognitive processes. They suggest that merely sipping water, as opposed to having a large drink, can improve attention.