A very brief introduction to quantum computing and quantum information theory for mathematicians
This is a very brief introduction to quantum computing and quantum
information theory, primarily aimed at geometers. Beyond basic definitions and
examples, I emphasize aspects of interest to geometers, especially connections
with asymptotic representation theory. Proofs of most statements can be found
in standard references.
Cell size, genome size, and maximum growth rate are near-independent dimensions of ecological variation across bacteria and archaea.
Among bacteria and archaea, maximum relative growth rate, cell diameter, and genome size are widely regarded as important influences on ecological strategy. Via the most extensive data compilation so far for these traits across all clades and habitats, we ask whether they are correlated and if so how. Overall, we found little correlation among them, indicating they should be considered as independent dimensions of ecological variation. Nor was correlation evident within particular habitat types. A weak nonlinearity (6% of variance) was found whereby high maximum growth rates (temperature-adjusted) tended to occur in the midrange of cell diameters. Species identified in the literature as oligotrophs or copiotrophs were clearly separated on the dimension of maximum growth rate, but not on the dimensions of genome size or cell diameter.
Operator entanglement of two-qubit joint unitary operations revisited: Schmidt number approach
Operator entanglement of two-qubit joint unitary operations is revisited. The
Schmidt number is an important attribute of a two-qubit unitary operation, and
may be connected with the entanglement measure of the unitary operator. We
find that the entanglement measure of two-qubit unitary operators is
classified by the Schmidt number of the unitary operators. The exact relation
between the operator entanglement and the parameters of the unitary operator
is also clarified.

Comment: To appear in the Brazilian Journal of Physics
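The operator Schmidt number discussed above can be computed numerically as the
rank of the "reshuffled" unitary matrix. A minimal numpy sketch (the function
name and tolerance are my own choices, not from the paper):

```python
import numpy as np

def schmidt_number(U, tol=1e-10):
    # Treat the 4x4 two-qubit unitary as a tensor U[a,b,a',b'], then regroup
    # indices into a matrix with rows (a,a') and columns (b,b') ("reshuffling").
    T = U.reshape(2, 2, 2, 2)                   # indices a, b, a', b'
    M = T.transpose(0, 2, 1, 3).reshape(4, 4)   # rows (a,a'), cols (b,b')
    # The number of nonzero singular values of M is the operator Schmidt rank.
    s = np.linalg.svd(M, compute_uv=False)
    return int(np.sum(s > tol))

CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]], float)
SWAP = np.array([[1, 0, 0, 0], [0, 0, 1, 0], [0, 1, 0, 0], [0, 0, 0, 1]], float)
print(schmidt_number(np.eye(4)))  # 1
print(schmidt_number(CNOT))       # 2
print(schmidt_number(SWAP))       # 4
```

Identity factorizes as a single product term, CNOT needs two, and SWAP needs
all four (it is a sum over the Pauli basis), matching the three possible
Schmidt-number classes 1, 2, and 4.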
Quantum computation by local measurement
Quantum computation is a novel way of information processing which allows,
for certain classes of problems, exponential speedups over classical
computation. Various models of quantum computation exist, such as the
adiabatic, circuit and measurement-based models. They have been proven
equivalent in their computational power, but operate very differently. As such,
they may be suitable for realization in different physical systems, and also
offer different perspectives on open questions such as the precise origin of
the quantum speedup. Here, we give an introduction to the one-way quantum
computer, a scheme of measurement-based quantum computation. In this model, the
computation is driven by local measurements on a carefully chosen, highly
entangled state. We discuss various aspects of this computational scheme, such
as the role of entanglement and quantum correlations. We also give examples of
ground states of simple Hamiltonians which enable universal quantum computation
by local measurements.

Comment: 36 pages, single column, 6 figures; not the published version (as
restricted by the journal), please refer to ARCMP for the final published
version
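The elementary step of the one-way model described above can be simulated
directly: the input qubit is entangled with a |+> ancilla by a CZ gate, and a
measurement of the input qubit in a rotated basis drives a gate on the
ancilla. A minimal numpy sketch (function names are mine; the basis convention
(|0> ± e^{-i theta}|1>)/sqrt(2), realizing the gate X^m H Rz(theta), is one
common choice, not necessarily the paper's):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
CZ = np.diag([1, 1, 1, -1]).astype(complex)

def one_way_step(psi, theta, outcome):
    # Entangle the input state with a fresh |+> ancilla via CZ.
    plus = np.array([1, 1]) / np.sqrt(2)
    state = CZ @ np.kron(psi, plus)
    # Project qubit 1 onto (|0> + sign * e^{-i theta}|1>)/sqrt(2).
    sign = 1 if outcome == 0 else -1
    bra = np.array([1, sign * np.exp(-1j * theta)]).conj() / np.sqrt(2)
    out = np.kron(bra, np.eye(2)) @ state       # remaining state of qubit 2
    return out / np.linalg.norm(out)

# Check: for outcome m the ancilla ends up in X^m H Rz(theta)|psi>.
theta = 0.7
psi = np.array([0.6, 0.8], complex)
Rz = np.diag([1, np.exp(1j * theta)])
for m in (0, 1):
    got = one_way_step(psi, theta, m)
    want = np.linalg.matrix_power(X, m) @ H @ Rz @ psi
    want = want / np.linalg.norm(want)
    phase = np.vdot(want, got)                  # compare up to global phase
    assert np.allclose(got, phase / abs(phase) * want, atol=1e-10)
```

The random outcome m only introduces a known Pauli byproduct X^m, which is why
the computation can be kept deterministic by adapting later measurement bases.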
Quantum uniqueness
In the classical world one can construct two identical systems which have
identical behavior and give identical measurement results. We show this to be
impossible in the quantum domain. We prove that, under the same quantum
measurement, two different quantum systems cannot always yield identical
results, provided the possible measurement outcomes belong to a non-orthogonal
set. This is interpreted as quantum uniqueness - a quantum feature which has no
classical analog. Its tight relation to the objective randomness of quantum
measurements is discussed.

Comment: Presented at the 4th Feynman Festival, June 22-26, 2009, in Olomouc,
Czech Republic
Evaluation of the Inheritance of the Complex Vertebral Malformation Syndrome by Breeding Studies
To investigate the congenital complex vertebral malformation syndrome (CVM) in Holstein calves, two breeding studies were performed including 262 and 363 cows, respectively. Cows were selected from the Danish Cattle Database based on pedigree and insemination records. Selected cows were progeny of sires with an established heterozygous CVM genotype and pregnant after insemination with semen from another sire with heterozygous CVM genotype. Following calving, breeders were asked to state whether the calf was normal and were requested to submit dead calves for necropsy. In both studies, significantly fewer CVM-affected calves than expected were obtained; a finding probably reflecting extensive intrauterine mortality in CVM-affected foetuses. The findings illustrate increased intrauterine mortality as a major potential bias in observational studies of inherited disorders.
A genome-wide study of Hardy–Weinberg equilibrium with next generation sequence data
Statistical tests for Hardy–Weinberg equilibrium have been an important tool for detecting genotyping errors in the past, and remain important in the quality control of next generation sequence data. In this paper, we analyze complete chromosomes of the 1000 genomes project by using exact test procedures for autosomal and X-chromosomal variants. We find that the rate of disequilibrium largely exceeds what might be expected by chance alone for all chromosomes. Observed disequilibrium is, in about 60% of the cases, due to heterozygote excess. We suggest that most excess disequilibrium can be explained by sequencing problems, and hypothesize mechanisms that can explain exceptional heterozygosities. We report higher rates of disequilibrium for the MHC region on chromosome 6, regions flanking centromeres, and p-arms of acrocentric chromosomes. We also detected long-range haplotypes and areas with incidental high disequilibrium. We report disequilibrium to be related to read depth, with variants having extreme read depths being more likely to be out of equilibrium. Disequilibrium rates were found to be 11 times higher in segmental duplications and simple tandem repeat regions. The variants with significant disequilibrium are seen to be concentrated in these areas. For next generation sequence data, Hardy–Weinberg disequilibrium seems to be a major indicator for copy number variation.
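The exact test used in such analyses conditions on the observed allele counts
and sums the probabilities of heterozygote counts as extreme as the one
observed. A minimal Python sketch of this standard conditional distribution
(the function name and the two-sided convention are my own choices, not
necessarily the paper's exact procedure):

```python
from math import lgamma, exp, log

def hwe_exact_pvalue(n_AA, n_Aa, n_aa):
    """Two-sided exact HWE test: sum the probabilities of all heterozygote
    counts whose probability does not exceed that of the observed table."""
    n = n_AA + n_Aa + n_aa
    n_A = 2 * n_AA + n_Aa                        # count of A alleles

    def logprob(het):
        # P(het | n, n_A) under HWE, via log-factorials for stability.
        hom_A = (n_A - het) // 2
        hom_a = n - hom_A - het
        return (lgamma(n + 1) - lgamma(hom_A + 1) - lgamma(het + 1)
                - lgamma(hom_a + 1) + het * log(2)
                + lgamma(n_A + 1) + lgamma(2 * n - n_A + 1) - lgamma(2 * n + 1))

    # Feasible heterozygote counts share the parity of n_A.
    feasible = range(n_A % 2, min(n_A, 2 * n - n_A) + 1, 2)
    probs = {h: exp(logprob(h)) for h in feasible}
    p_obs = probs[n_Aa]
    return min(1.0, sum(p for p in probs.values() if p <= p_obs + 1e-12))
```

For example, the perfectly HWE-proportioned table (1, 2, 1) gives p = 1, while
(0, 4, 0), an all-heterozygote excess, gives p = 11/35 ≈ 0.314 at this tiny
sample size; power to flag excess heterozygosity grows with sample size.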
Measuring measurement
Measurement connects the world of quantum phenomena to the world of classical
events. It plays both a passive role, observing quantum systems, and an active
one, preparing quantum states and controlling them. Surprisingly - in the light
of the central status of measurement in quantum mechanics - there is no general
recipe for designing a detector that measures a given observable. Compounding
this, the characterization of existing detectors is typically based on partial
calibrations or elaborate models. Thus, experimental specification (i.e.
tomography) of a detector is of fundamental and practical importance. Here, we
present the realization of quantum detector tomography: we identify the optimal
positive-operator-valued measure describing the detector, with no ancillary
assumptions. This result completes the triad, state, process, and detector
tomography, required to fully specify an experiment. We characterize an
avalanche photodiode and a photon number resolving detector capable of
detecting up to eight photons. This creates a new set of tools for accurately
detecting and preparing non-classical light.

Comment: 6 pages, 4 figures, see video abstract at
http://www.quantiki.org/video_abstracts/0807244
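Detector tomography amounts to inverting the relation p_i = Tr(rho_i Pi) for
the POVM elements Pi, given an informationally complete set of probe states.
A toy numpy sketch for a qubit detector by plain linear inversion (all names
and the probe set are illustrative assumptions; the experiment above probes
photon detectors with coherent states and uses constrained optimization,
not bare least squares):

```python
import numpy as np

# Informationally complete probe set: |0>, |1>, |+>, |+i> density matrices.
probes = []
for vec in [np.array([1, 0]), np.array([0, 1]),
            np.array([1, 1]) / np.sqrt(2), np.array([1, 1j]) / np.sqrt(2)]:
    probes.append(np.outer(vec, vec.conj()))

# A POVM element we pretend is unknown: a noisy projector onto |0>.
Pi0_true = np.array([[0.9, 0.0], [0.0, 0.1]])

# "Measured" outcome probabilities p_i = Tr(rho_i Pi0).
p = np.array([np.trace(r @ Pi0_true).real for r in probes])

# Linear inversion: Tr(rho Pi) = sum_{j,k} rho[j,k] Pi[k,j], so each row of
# the design matrix is the flattened transpose of a probe state.
A = np.array([r.T.reshape(-1) for r in probes])
Pi0 = np.linalg.lstsq(A, p, rcond=None)[0].reshape(2, 2)
Pi1 = np.eye(2) - Pi0          # completeness fixes the second outcome
```

With an informationally complete probe set the linear system is full rank and
the reconstruction is exact up to numerical noise; real detector tomography
additionally enforces positivity of the reconstructed POVM elements.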