Approximate transformations of bipartite pure-state entanglement from the majorization lattice
We study the problem of deterministic transformations of an \textit{initial}
pure entangled quantum state, $|\psi\rangle$, into a \textit{target} pure
entangled quantum state, $|\phi\rangle$, by using \textit{local operations and
classical communication} (LOCC). A celebrated result of Nielsen [Phys. Rev.
Lett. \textbf{83}, 436 (1999)] gives the necessary and sufficient condition
that makes this entanglement transformation process possible. Indeed, this
process can be achieved if and only if the majorization relation $\psi \prec \phi$
holds, where $\psi$ and $\phi$ are probability vectors obtained by taking
the squares of the Schmidt coefficients of the initial and target states,
respectively. In general, this condition is not fulfilled. However, one can
look for an \textit{approximate} entanglement transformation. Vidal \textit{et
al.} [Phys. Rev. A \textbf{62}, 012304 (2000)] have proposed a deterministic
LOCC transformation that yields the target state most approximate to
$|\phi\rangle$ in terms of maximal fidelity between them. Here, we show a
strategy to deal with approximate entanglement transformations based on the
properties of the \textit{majorization lattice}. More precisely, we propose as
approximate target state one whose Schmidt coefficients are given by the
supremum of $\psi$ and $\phi$ in the majorization lattice. Our proposal is
inspired by the observation that fidelity does not respect the majorization
relation in general. Remarkably enough, we find that for some particularly
interesting cases, such as two-qubit pure states or the entanglement
concentration protocol, both proposals coincide.
Comment: Revised manuscript close to the accepted version in Physica A (10 pages, 1 figure)
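Nielsen's criterion reduces to a partial-sum comparison of the squared Schmidt coefficients sorted in decreasing order. A minimal sketch of that check (the probability vectors `psi` and `phi` below are hypothetical examples, not taken from the paper):

```python
def majorizes(q, p, tol=1e-12):
    """Return True if q majorizes p (written p ≺ q): every partial sum of
    q sorted in decreasing order dominates the corresponding one of p."""
    ps = sorted(p, reverse=True)
    qs = sorted(q, reverse=True)
    sp = sq = 0.0
    for a, b in zip(ps, qs):
        sp += a
        sq += b
        if sq < sp - tol:
            return False
    return True

# Hypothetical squared Schmidt coefficients (probability vectors)
psi = [0.5, 0.3, 0.2]   # initial state
phi = [0.6, 0.3, 0.1]   # target state

# Nielsen: |psi> -> |phi> by deterministic LOCC is possible iff psi ≺ phi
print(majorizes(phi, psi))  # prints True: the transformation is possible
```

When the check fails, the lattice-based proposal replaces `phi` by the supremum of `psi` and `phi` in the majorization lattice, which by construction majorizes `psi`.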
Multi-class classification based on quantum state discrimination
We present a general framework for the problem of multi-class classification using classification functions that can be interpreted as fuzzy sets. We specialize these functions in the domain of Quantum-inspired classifiers, which are based on quantum state discrimination techniques. In particular, we use unsharp observables (Positive Operator-Valued Measures) that are determined by the training set of a given dataset to construct these classification functions. We show that such classifiers can be tested on near-term quantum computers once these classification functions are "distilled" (on a classical platform) from the quantum encoding of a training dataset. We compare these experimental results with their theoretical counterparts and we pose some questions for future research.
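To make the idea of classification by state discrimination concrete, here is a generic sketch (not the authors' POVM construction): each class is represented by a mixed state averaging its encoded training samples, and a test point is assigned to the class whose state has the largest overlap with it. All names and data are illustrative.

```python
from math import sqrt

def normalize(v):
    """Amplitude-encode a real feature vector as a unit vector."""
    n = sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def outer(v):
    """Density matrix |v><v| of a real unit vector."""
    return [[a * b for b in v] for a in v]

def class_state(samples):
    """Mixed state: uniform average of the encoded training samples."""
    d = len(samples[0])
    rho = [[0.0] * d for _ in range(d)]
    for s in samples:
        for i, row in enumerate(outer(normalize(s))):
            for j, x in enumerate(row):
                rho[i][j] += x / len(samples)
    return rho

def overlap(rho, sigma):
    """Tr(rho sigma) for real symmetric matrices."""
    d = len(rho)
    return sum(rho[i][j] * sigma[j][i] for i in range(d) for j in range(d))

def classify(x, class_states):
    """Assign x to the class whose state overlaps most with |x><x|."""
    rho_x = outer(normalize(x))
    return max(class_states, key=lambda c: overlap(rho_x, class_states[c]))

# Hypothetical toy training set: two classes of 2-d points
train = {"a": [[1.0, 0.1], [0.9, 0.2]], "b": [[0.1, 1.0], [0.2, 0.8]]}
states = {c: class_state(xs) for c, xs in train.items()}
print(classify([1.0, 0.0], states))  # prints a
```

The per-class overlaps `Tr(rho_x sigma_c)` play the role of the fuzzy membership values described in the abstract; normalizing them would yield a membership distribution over classes.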
On the lattice structure of probability spaces in quantum mechanics
Let C be the set of all possible quantum states. We study the convex subsets
of C with attention focused on the lattice theoretical structure of these
convex subsets and, as a result, find a framework capable of unifying several
aspects of quantum mechanics, including entanglement and Jaynes' Max-Ent
principle. We also encounter links with entanglement witnesses, which leads to
a new separability criteria expressed in lattice language. We also provide an
extension of a separability criteria based on convex polytopes to the infinite
dimensional case and show that it reveals interesting facets concerning the
geometrical structure of the convex subsets. It is seen that the
above-mentioned framework is also capable of generalization to any statistical
theory via the so-called 'convex operational models' approach. In particular, we show
how to extend the geometrical structure underlying entanglement to any
statistical model, an extension which may be useful for studying correlations
in different generalizations of quantum mechanics.
Comment: arXiv admin note: substantial text overlap with arXiv:1008.416
On the connection between Complementarity and Uncertainty Principles in the Mach-Zehnder interferometric setting
We revisit, in the framework of Mach-Zehnder interferometry, the connection
between the complementarity and uncertainty principles of quantum mechanics.
Specifically, we show that, for a pair of suitably chosen observables, the
trade-off relation between the complementary path information and fringe
visibility is equivalent to the uncertainty relation given by Schr\"odinger
and Robertson, and to the one provided by Landau and Pollak as well. We also
employ entropic uncertainty relations (based on R\'enyi entropic measures) and
study their meaning for different values of the entropic parameter. We show
that these different values define regimes which yield qualitatively different
information concerning the system, in agreement with findings of [A. Luis,
Phys. Rev. A 84, 034101 (2011)]. We find that there exists a regime for which
the entropic uncertainty relations can be used as criteria to pinpoint
nontrivial states of minimum uncertainty.
Comment: 7 pages, 2 figures
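The role of the entropic parameter can be illustrated with the Rényi entropy itself: small orders weight rare outcomes more heavily, while large orders are dominated by the most likely outcome. A minimal sketch (the two-outcome distribution stands in for hypothetical path probabilities in a Mach-Zehnder setting):

```python
from math import log

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) in bits; alpha = 1 is the Shannon limit."""
    if alpha == 1.0:
        return -sum(x * log(x, 2) for x in p if x > 0)
    return log(sum(x ** alpha for x in p if x > 0), 2) / (1.0 - alpha)

# Hypothetical unbalanced path-probability distribution
p = [0.9, 0.1]
for alpha in (0.5, 1.0, 2.0):
    print(alpha, round(renyi_entropy(p, alpha), 3))
```

For a fixed distribution the entropy decreases monotonically with the order, so different choices of `alpha` probe qualitatively different aspects of the same statistics, in line with the regimes discussed in the abstract.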
Material Cycles and Chemicals: Dynamic Material Flow Analysis of Contaminants in Paper Recycling
This study provides a systematic approach for the assessment of contaminants in materials for recycling, using paper recycling as an illustrative example. Three selected chemicals, bisphenol A (BPA), diethylhexyl phthalate (DEHP), and mineral oil hydrocarbons (MOHs), are evaluated within the paper cycle. The approach combines static material flow analysis (MFA) with dynamic material and substance flow modeling. The results indicate that phasing out chemicals is the most effective measure for reducing chemical contamination. However, this scenario was also associated with a considerable lag phase (between approximately one and three decades) before the presence of chemicals in paper products could be considered insignificant. While improved decontamination may appear to be an effective way of minimizing chemicals in products, it may also result in lower production yields. Optimized waste material source-segregation and collection was the least effective strategy for reducing chemical contamination if overall recycling rates are to be maintained at the current level (approximately 70% for Europe). The study provides a consistent approach for evaluating contaminant levels in material cycles. The results clearly indicate that mass-based recycling targets are not sufficient to ensure high-quality material recycling.
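The multi-decade lag phase can be illustrated with a toy first-order stock model (not the paper's MFA model): after phase-out, the contaminant persists only because the recycled fibre fraction carries it forward, while virgin fibre is assumed contaminant-free. Function name, threshold, and perfect-mixing assumption are all illustrative.

```python
def years_until_insignificant(recycling_rate, threshold=0.01):
    """Years until the relative contaminant level falls below `threshold`,
    assuming the recycled fraction carries the contaminant forward and
    virgin input is contaminant-free after phase-out (perfect mixing)."""
    level, years = 1.0, 0
    while level >= threshold:
        level *= recycling_rate   # dilution by clean virgin fibre each year
        years += 1
    return years

# At roughly the European paper recycling rate of 70%:
print(years_until_insignificant(0.70))  # prints 13 (on the order of a decade)
```

Even this crude geometric-dilution picture reproduces the order of magnitude reported in the study: with high recycling rates, reaching a 1% residual level takes on the order of one to a few decades.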
The importance of constraint relief caused by rubber cavitation in the toughening of epoxy
Many rubber-toughened epoxies are thought to derive the bulk of their toughness through the processes of rubber cavitation and plastic shear-yielding in the epoxy matrix. Constraint relief has been considered to be a key mechanism which allows extra plastic shear deformation to occur. The present work attempts to provide direct experimental evidence of the constraint relief effect by combining testing geometries that vary the degree of constraint with microscopic observations. The results show that the success of a rubber as a toughening agent for epoxies is closely related to its ability to cavitate. Evidence for local constraint relief is presented. Upon cavitation of the rubber, the stress state in a specimen with initial constraint is found to change to a plane stress state. The constraint relief circumvents or delays the crack initiation in the matrix, which allows more plastic deformation to occur.
Peer reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/44723/1/10853_2005_Article_BF01352202.pd
Failure mechanisms in alloy of polyamide 6,6/polyphenylene oxide under severe conditions
Toughening mechanisms of a polyamide 6,6/polyphenylene oxide alloy containing an elastomer tested under a slow rate, an impact rate, and a low temperature have been investigated using various microscopy techniques. It is found that the toughening mechanisms of the alloy may change from crazing/shear yielding, to crack bridging/crazing, and to transparticle failure, depending on the testing conditions. Except for the low-temperature, high-strain-rate testing condition and in the plane stress region of the crack, the crazing mechanism has been observed in all the conditions we studied. When the testing rate is high, the shear yielding mechanism is suppressed; multiple crazing and particle bridging mechanisms appear to dominate.
Peer reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/44700/1/10853_2004_Article_BF00557130.pd
Progress on HL-LHC Nb3Sn Magnets
The high-luminosity Large Hadron Collider (HL-LHC) project aims to increase the number of collisions in the LHC by a factor of ten in the decade 2025-2035. One essential element is the superconducting magnets around the interaction region points, where large-aperture magnets will be installed to further reduce the beam size at the interaction point. The core of this upgrade is the Nb3Sn triplet, made up of 150-mm aperture quadrupoles in the range of 7-8 m. The project is shared between the European Organization for Nuclear Research and the US Accelerator Upgrade Program, based on the same design and on the two strand technologies. The project is ending the short-model phase and entering prototype construction. We report the main results of the short-model program, including quench performance and field quality. A second important element is the 11 T dipole, which replaces a standard dipole to make space for additional collimators. This magnet is also ending model development and entering the prototype phase. A critical point in its design is the large current density, which allows increasing the field from 8 to 11 T with the same coil cross section as in the LHC dipoles. This is also the first two-in-one Nb3Sn magnet developed so far. We report the main test results and the critical aspects.
Exome-wide somatic mutation characterization of small bowel adenocarcinoma
Small bowel adenocarcinoma (SBA) is an aggressive disease with limited treatment options. Despite previous studies, its molecular genetic background has remained somewhat elusive. To comprehensively characterize the mutational landscape of this tumor type, and to identify possible targets of treatment, we conducted the first large exome sequencing study on a population-based set of SBA samples from all three small bowel segments. Archival tissue from 106 primary tumors with appropriate clinical information was available for exome sequencing from a patient series consisting of a majority of confirmed SBA cases diagnosed in Finland between 2003 and 2011. Paired-end exome sequencing was performed using Illumina HiSeq 4000, and OncodriveFML was used to identify driver genes from the exome data. We also defined frequently affected cancer signalling pathways and performed the first extensive allelic imbalance (AI) analysis in SBA. Exome data analysis revealed significantly mutated genes previously linked to SBA (TP53, KRAS, APC, SMAD4, and BRAF), recently reported potential driver genes (SOX9, ATM, and ARID2), as well as novel candidate driver genes, such as ACVR2A, ACVR1B, BRCA2, and SMARCA4. We also identified clear mutation hotspot patterns in ERBB2 and BRAF. No BRAF V600E mutations were observed. Additionally, we present a comprehensive mutation signature analysis of SBA, highlighting established signatures 1A, 6, and 17, as well as the previously unvalidated signature U2. Finally, comparison of the three small bowel segments revealed differences in tumor characteristics. This comprehensive work unveils the mutational landscape and most frequently affected genes and pathways in SBA, providing potential therapeutic targets, and novel and more thorough insights into the genetic background of this tumor type.
Peer reviewed.