Application of Permutation Group Theory in Reversible Logic Synthesis
The paper discusses various applications of permutation group theory in the
synthesis of reversible logic circuits consisting of Toffoli gates with
negative control lines. An asymptotically optimal synthesis algorithm for
circuits consisting of gates from the NCT library is described. An algorithm
for gate complexity reduction, based on equivalent replacements of gate
compositions, is introduced. A new approach for combining a group-theory-based
synthesis algorithm with a Reed-Muller-spectra-based synthesis algorithm is
described. Experimental results are presented to show that the proposed
synthesis techniques allow a reduction in input lines count, gate complexity or
quantum cost of reversible circuits for various benchmark functions.
Comment: In English, 15 pages, 2 figures, 7 tables. Proceedings of the RC 2016 conference.
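The group-theoretic view underlying this line of work, in which every reversible gate acts as a permutation of the 2^n input patterns, can be made concrete with a short sketch (a minimal Python illustration of NCT gates as permutations, not the paper's synthesis algorithm; all names are ours):

```python
from itertools import product

def toffoli(controls, target, n):
    """Permutation on {0,1}^n implemented by a generalized Toffoli gate.
    `controls` maps wire index -> required value (1 = positive control,
    0 = negative control); `target` is flipped when all controls match."""
    perm = {}
    for bits in product((0, 1), repeat=n):
        out = list(bits)
        if all(bits[c] == v for c, v in controls.items()):
            out[target] ^= 1
        perm[bits] = tuple(out)
    return perm

def compose(p, q):
    """Permutation obtained by applying p first, then q."""
    return {x: q[p[x]] for x in p}

# The NCT library on a 3-wire circuit:
not0 = toffoli({}, 0, 3)            # NOT on wire 0 (no controls)
cnot01 = toffoli({0: 1}, 1, 3)      # CNOT: control wire 0, target wire 1
ccx = toffoli({0: 1, 1: 1}, 2, 3)   # Toffoli: controls 0 and 1, target 2

# Every NCT gate is an involution: applying it twice gives the identity
assert compose(ccx, ccx) == {x: x for x in ccx}
```

Synthesis then amounts to factoring a target permutation into a product of such gate permutations, which is where the permutation-group machinery enters.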
Modeling hormonal and inflammatory contributions to preterm and term labor using uterine temporal transcriptomics
Quantification of Continuous Variable Entanglement with only Two Types of Simple Measurements
Here we propose an experimental set-up in which it is possible to measure the
entanglement of a two-mode Gaussian state, be it pure or mixed, using only
simple linear optical devices. After a proper unitary manipulation of the
two-mode Gaussian state only number and purity measurements of just one of the
modes suffice to give us a complete and exact knowledge of the state's
entanglement.
Comment: v1: 4 pages, 1 figure, RevTex4; v2: title and abstract changed, new discussion paragraph added; v3: published version.
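For context, the quantity the proposed set-up measures can be computed directly when the covariance matrix is known. The sketch below uses the standard logarithmic-negativity formula for two-mode Gaussian states (not the paper's linear-optics measurement scheme), under the convention that the vacuum covariance matrix is the identity, and evaluates it for a two-mode squeezed vacuum:

```python
import numpy as np

def log_negativity(cov):
    """Logarithmic negativity of a two-mode Gaussian state from its 4x4
    covariance matrix (ordering x1, p1, x2, p2; vacuum = identity)."""
    A, B, C = cov[:2, :2], cov[2:, 2:], cov[:2, 2:]
    delta = np.linalg.det(A) + np.linalg.det(B) - 2 * np.linalg.det(C)
    # Smallest symplectic eigenvalue of the partially transposed state
    nu = np.sqrt((delta - np.sqrt(delta**2 - 4 * np.linalg.det(cov))) / 2)
    return max(0.0, -np.log(nu))

def tmsv_cov(r):
    """Covariance matrix of a two-mode squeezed vacuum with squeezing r."""
    c, s = np.cosh(2 * r), np.sinh(2 * r)
    return np.array([[c, 0, s, 0],
                     [0, c, 0, -s],
                     [s, 0, c, 0],
                     [0, -s, 0, c]])

print(log_negativity(tmsv_cov(0.7)))  # equals 2r analytically
```

Note that conventions differ across the literature (vacuum variance 1 vs 1/2, log base e vs 2); the formula above must be adapted accordingly.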
Systematic review of statistical approaches to quantify, or correct for, measurement error in a continuous exposure in nutritional epidemiology.
BACKGROUND: Several statistical approaches have been proposed to assess and correct for exposure measurement error. We aimed to provide a critical overview of the most common approaches used in nutritional epidemiology.
METHODS: MEDLINE, EMBASE, BIOSIS and CINAHL were searched for reports published in English up to May 2016 in order to ascertain studies that described methods aimed to quantify and/or correct for measurement error for a continuous exposure in nutritional epidemiology using a calibration study.
RESULTS: We identified 126 studies, 43 of which described statistical methods and 83 that applied any of these methods to a real dataset. The statistical approaches in the eligible studies were grouped into: a) approaches to quantify the relationship between different dietary assessment instruments and "true intake", which were mostly based on correlation analysis and the method of triads; b) approaches to adjust point and interval estimates of diet-disease associations for measurement error, mostly based on regression calibration analysis and its extensions. Two approaches (multiple imputation and moment reconstruction) were identified that can deal with differential measurement error.
CONCLUSIONS: For regression calibration, the most common approach to correct for measurement error used in nutritional epidemiology, it is crucial to ensure that its assumptions and requirements are fully met. Analyses that investigate the impact of departures from the classical measurement error model on regression calibration estimates can be helpful to researchers in interpreting their findings. With regard to the possible use of alternative methods when regression calibration is not appropriate, the choice of method should depend on the measurement error model assumed, the availability of suitable calibration study data and the potential for bias due to violation of the classical measurement error model assumptions.
On the basis of this review, we provide some practical advice for the use of methods to assess and adjust for measurement error in nutritional epidemiology.
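The regression-calibration idea discussed above can be sketched with simulated data under the classical measurement error model (a minimal illustration; the variable names, sample sizes and parameter values are ours, not from any study in the review):

```python
import numpy as np

rng = np.random.default_rng(0)
n_main, n_cal = 5000, 500
beta = 0.5                                  # true exposure-outcome association

# Classical error model: observed intake W = true intake X + error U
x_cal = rng.normal(0, 1, n_cal)             # calibration study: true X observed
w_cal = x_cal + rng.normal(0, 1, n_cal)
x_main = rng.normal(0, 1, n_main)           # main study: only W observed
w_main = x_main + rng.normal(0, 1, n_main)
y_main = beta * x_main + rng.normal(0, 1, n_main)

# Naive analysis regresses Y on W and is attenuated
# (roughly beta/2 here, since var(X) = var(U) = 1)
naive = np.polyfit(w_main, y_main, 1)[0]

# Regression calibration: estimate E[X | W] in the calibration study,
# then use the predicted exposure in place of W in the main analysis
slope, intercept = np.polyfit(w_cal, x_cal, 1)
x_hat = slope * w_main + intercept
corrected = np.polyfit(x_hat, y_main, 1)[0]
```

The sketch ignores the extra uncertainty from estimating the calibration equation; in practice, standard errors for the corrected estimate must account for it (e.g. via the bootstrap).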
KODAMA: an R package for knowledge discovery and data mining
Summary: KODAMA, a novel learning algorithm for unsupervised feature extraction, is specifically designed for analysing noisy and high-dimensional data sets. Here we present an R package of the algorithm with additional functions that allow improved interpretation of high-dimensional data. The package requires no additional software and runs on all major platforms. Availability and Implementation: KODAMA is freely available from the R archive CRAN (http://cran.r-project.org). The software is distributed under the GNU General Public License (version 3 or later).
Physical limits of inference
I show that physical devices that perform observation, prediction, or
recollection share an underlying mathematical structure. I call devices with
that structure "inference devices". I present a set of existence and
impossibility results concerning inference devices. These results hold
independent of the precise physical laws governing our universe. In a limited
sense, the impossibility results establish that Laplace was wrong to claim that
even in a classical, non-chaotic universe the future can be unerringly
predicted, given sufficient knowledge of the present. Alternatively, these
impossibility results can be viewed as a non-quantum mechanical "uncertainty
principle". Next I explore the close connections between the mathematics of
inference devices and of Turing Machines. In particular, the impossibility
results for inference devices are similar to the Halting theorem for TM's.
Furthermore, one can define an analog of Universal TM's (UTM's) for inference
devices. I call those analogs "strong inference devices". I use strong
inference devices to define the "inference complexity" of an inference task,
which is the analog of the Kolmogorov complexity of computing a string. However
no universe can contain more than one strong inference device. So whereas the
Kolmogorov complexity of a string is arbitrary up to specification of the UTM,
there is no such arbitrariness in the inference complexity of an inference
task. I end by discussing the philosophical implications of these results,
e.g., for whether the universe "is" a computer.
Comment: 43 pages; updated version of the Physica D paper, which originally appeared in the 2007 CNLS conference on unconventional computation.
Quantum state transformations and the Schubert calculus
Recent developments in mathematics have provided powerful tools for comparing
the eigenvalues of matrices related to each other via a moment map. In this
paper we survey some of the more concrete aspects of the approach with a
particular focus on applications to quantum information theory. After
discussing the connection between Horn's Problem and Nielsen's Theorem, we move
on to characterizing the eigenvalues of the partial trace of a matrix.
Comment: 40 pages. Accepted for publication in Annals of Physics.
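As a small numerical companion to the last point (plain numpy, our own illustration of the object studied, the spectrum of a partial trace, rather than the Schubert-calculus machinery itself):

```python
import numpy as np

def partial_trace_B(rho, dA, dB):
    """Trace out subsystem B of a density matrix on C^dA tensor C^dB."""
    return rho.reshape(dA, dB, dA, dB).trace(axis1=1, axis2=3)

# Maximally entangled two-qubit state |Phi+> = (|00> + |11>)/sqrt(2)
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho = np.outer(phi, phi)

# Its reduced state is maximally mixed: both eigenvalues equal 1/2
print(np.linalg.eigvalsh(partial_trace_B(rho, 2, 2)))
```

The question the survey addresses is, in effect, which eigenvalue spectra of the reduced state are compatible with a given global spectrum.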
Multi-dimensional photonic states from a quantum dot
Quantum states superposed across multiple particles or degrees of freedom offer an advantage in the development of quantum technologies. Creating these states deterministically and with high efficiency is an ongoing challenge. A promising approach is the repeated excitation of multi-level quantum emitters, which have been shown to naturally generate light with quantum statistics. Here we describe how to create one class of higher-dimensional quantum state, a so-called W-state, which is superposed across multiple time bins. We do this by repeated Raman scattering of photons from a charged quantum dot in a pillar microcavity. We show this method can be scaled to larger dimensions with no reduction in coherence or single-photon character. We explain how to extend this work to enable the deterministic creation of arbitrary time-bin encoded qudits.
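A time-bin W-state of dimension d (one photon in an equal superposition over d bins) has a simple state-vector form; the toy construction below (plain numpy, our own illustration, not the Raman-scattering scheme itself) builds it:

```python
import numpy as np

def w_state(d):
    """W-state over d time bins: a single excitation in an equal
    superposition, (|10...0> + |01...0> + ... + |00...1>) / sqrt(d)."""
    psi = np.zeros(2 ** d)
    for k in range(d):
        psi[2 ** k] = 1 / np.sqrt(d)   # basis index with a 1 in bin k only
    return psi

w3 = w_state(3)
assert np.isclose(w3 @ w3, 1.0)        # normalized for any d
```

Scaling the dimension here just means adding time bins, which mirrors the abstract's claim that the scheme extends to larger d.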
Evaluation of the physiological quality of coffee seeds by means of seedling image analysis.
Preserving the impossible: conservation of soft-sediment hominin footprint sites and strategies for three-dimensional digital data capture.
Human footprints provide some of the most publicly emotive and tangible evidence of our ancestors. To the scientific community they provide evidence of stature, presence and behaviour, and in the case of early hominins potential evidence with respect to the evolution of gait. While rare in the geological record, the number of footprint sites has increased in recent years, along with the analytical tools available for their study. Many of these sites are at risk from rapid erosion, including the Ileret footprints in northern Kenya, which are second only in age to those at Laetoli (Tanzania). Unlithified, soft-sediment footprint sites such as these pose a significant geoconservation challenge. In the first part of this paper, conservation and preservation options are explored, leading to the conclusion that 'record and digitally rescue' is the only viable approach. Key to such strategies is the increasing availability of three-dimensional data capture, via optical laser scanning and/or digital photogrammetry. Within the discipline there is a developing schism between those who favour one approach over the other, and geoconservationists and the scientific community require some form of objective appraisal of these alternatives. Consequently, in the second part of this paper we evaluate these alternative approaches and the role they can play in a 'record and digitally rescue' conservation strategy. Using modern footprint data, digital models created via optical laser scanning are compared to those generated by state-of-the-art photogrammetry. Both methods give comparable, although subtly different, results. These data are evaluated alongside a review of field deployment issues, to provide guidance to the community on the factors that need to be considered in the digital conservation of human and hominin footprints.