1,876 research outputs found
Using Perspective Taking to De-Escalate Commitment to Software Product Launch Decisions
In software product development settings, when things go awry and the original plan loses credibility, managers often choose to honor the originally announced product launch schedule anyway, in effect launching a product that may be seriously compromised in terms of both functionality and reliability. In this study, we draw on the perspective of escalation of commitment to investigate adherence to original product launch schedules despite negative feedback. Specifically, we use the notion of perspective taking to propose a de-escalation tactic. Through a laboratory experiment, we found strong support that taking the perspective of individuals who can be negatively affected by a product launch can indeed effectively promote de-escalation of commitment. Furthermore, we found that the experience of anticipated guilt mediates the relationship between perspective taking and de-escalation, and that this indirect effect is significantly greater when a decision maker's personal cost associated with de-escalation is high rather than low.
Graph-theoretical optimization of fusion-based graph state generation
Graph states are versatile resources for various quantum information processing tasks, including measurement-based quantum computing and quantum repeaters. Although the type-II fusion gate enables all-optical generation of graph states by combining small graph states, its non-deterministic nature hinders the efficient generation of large graph states. In this work, we present a graph-theoretical strategy to effectively optimize fusion-based generation of any given graph state, along with a Python package OptGraphState. Our strategy comprises three stages: simplifying the target graph state, building a fusion network, and determining the order of fusions. Using the proposed method, we evaluate the resource overheads of random graphs and various well-known graphs. Additionally, we investigate the success probability of graph state generation given a restricted number of available resource states. We expect that our strategy and software will assist researchers in developing and assessing experimentally viable schemes that use photonic graph states.
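Why non-deterministic fusion makes resource overhead explode can be seen with a toy model (this is an illustration only, not the OptGraphState algorithm): suppose a graph state is assembled by a balanced binary tree of fusions, each succeeding with probability `p_fusion`, where a failed fusion destroys both input states and the affected branch must be rebuilt from scratch.

```python
def expected_resources(levels: int, p_fusion: float) -> float:
    """Expected number of basic resource states (e.g. small star states)
    consumed to build a graph state via a balanced binary tree of `levels`
    fusion rounds, assuming a failed fusion destroys both inputs and the
    affected branch is rebuilt from scratch.

    Recurrence: N(k) = 2 * N(k-1) / p  =>  N(k) = (2 / p) ** k.
    """
    n = 1.0
    for _ in range(levels):
        n = 2.0 * n / p_fusion  # two inputs per fusion, 1/p attempts on average
    return n

# With the bare type-II fusion success probability p = 1/2, the overhead
# grows as 4**k in the tree depth k, which is why choosing a good fusion
# network and fusion order matters.
```

The exponential dependence on tree depth under a fixed success probability is what a graph-theoretical optimization of the fusion network and fusion order tries to mitigate.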
Parity-encoding-based quantum computing with Bayesian error tracking
Measurement-based quantum computing (MBQC) in linear optical systems is promising for near-future quantum computing architecture. However, the nondeterministic nature of entangling operations and photon losses hinder the large-scale generation of graph states and introduce logical errors. In this work, we propose a linear optical topological MBQC protocol employing multiphoton qubits based on the parity encoding, which turns out to be highly photon-loss tolerant and resource-efficient even under the effects of nonideal entangling operations that unavoidably corrupt nearby qubits. For the realistic error analysis, we introduce a Bayesian methodology, in conjunction with the stabilizer formalism, to track errors caused by such detrimental effects. We additionally suggest a graph-theoretical optimization scheme for the process of constructing an arbitrary graph state, which greatly reduces its resource overhead. Notably, we show that our protocol is advantageous over several other existing approaches in terms of fault tolerance, resource overhead, or feasibility of basic elements. Comment: Main text: 15 pages, 10 figures / Supplemental Material: 17 pages, 8 figures
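The core of Bayesian error tracking can be illustrated with a minimal single-qubit caricature (a sketch only; the paper's method works with the stabilizer formalism and correlated multi-qubit errors, and all probabilities below are made-up parameters):

```python
def posterior_error_prob(prior: float,
                         p_trigger_given_error: float,
                         p_trigger_given_no_error: float,
                         syndrome_triggered: bool) -> float:
    """Bayes update of the probability that a qubit carries a Pauli error,
    given one (possibly noisy) syndrome outcome.

    prior                  : probability of an error before the measurement
    p_trigger_given_error  : P(syndrome fires | error present)
    p_trigger_given_no_error: P(syndrome fires | no error)
    """
    if syndrome_triggered:
        like_err, like_ok = p_trigger_given_error, p_trigger_given_no_error
    else:
        like_err, like_ok = 1.0 - p_trigger_given_error, 1.0 - p_trigger_given_no_error
    evidence = prior * like_err + (1.0 - prior) * like_ok
    return prior * like_err / evidence
```

Repeating such updates over many syndrome rounds yields per-qubit error probabilities that can be fed to a decoder as edge weights, which is the general spirit of tracking errors probabilistically rather than assuming a uniform error rate.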
Fault-tolerant quantum computation by hybrid qubits with bosonic cat-code and single photons
Hybridizing different degrees of freedom or physical platforms potentially offers various advantages in building scalable quantum architectures. Here we introduce a fault-tolerant hybrid quantum computation that takes advantage of both discrete-variable (DV) and continuous-variable (CV) systems. In particular, we define a CV-DV hybrid qubit with a bosonic cat code and a single photon, which is implementable in current photonic platforms. Thanks to the cat code encoded in the CV part, the dominant loss errors are readily correctable without multiqubit encoding, while the logical basis is inherently orthogonal due to the DV part. We design fault-tolerant architectures by concatenating hybrid qubits with an outer DV quantum error-correction code such as a topological code, exploring their potential merits in developing scalable quantum computation. We demonstrate by numerical simulations that our scheme is at least an order of magnitude more resource efficient than all previous proposals in photonic platforms, allowing us to achieve a record-high loss threshold among existing CV and hybrid approaches. We discuss its realization not only in all-photonic platforms but also in other hybrid platforms, including superconducting and trapped-ion systems, which allows us to find various efficient routes toward fault-tolerant quantum computing. Comment: 21 pages, 8 figures
MetaMix: Meta-state Precision Searcher for Mixed-precision Activation Quantization
Mixed-precision quantization of efficient networks often suffers from activation instability encountered in the exploration of bit selections. To address this problem, we propose a novel method called MetaMix, which consists of a bit-selection phase and a weight-training phase. The bit-selection phase alternates between two steps: (1) a mixed-precision-aware weight update, and (2) bit-search training with the fixed mixed-precision-aware weights; combined, these reduce activation instability in mixed-precision quantization and contribute to fast and high-quality bit selection. The weight-training phase exploits the weights and step sizes trained in the bit-selection phase and fine-tunes them, thereby offering fast training. Our experiments with efficient and hard-to-quantize networks, i.e., MobileNet v2 and v3 and ResNet-18 on ImageNet, show that the proposed method pushes the boundary of mixed-precision quantization, in terms of accuracy vs. operations, by outperforming both mixed- and single-precision SOTA methods.
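To make the bit-selection problem concrete, here is a deliberately simplified toy: uniform quantization of per-layer activations plus a greedy search that lowers bit-widths until an average-bit budget is met. This is an illustration of the search space only, not the MetaMix algorithm, which learns bit selections jointly with mixed-precision-aware weights during training.

```python
import numpy as np

def quantize(x: np.ndarray, bits: int) -> np.ndarray:
    """Uniform symmetric quantization of `x` to 2**bits - 1 levels."""
    qmax = 2 ** (bits - 1) - 1
    step = np.abs(x).max() / qmax
    return np.clip(np.round(x / step), -qmax, qmax) * step

def pick_bits(activations, candidates=(2, 4, 8), budget_bits=4.0):
    """Greedy toy bit search: start every layer at the highest precision,
    then repeatedly lower the bit-width of the layer whose quantization
    error grows least, until the average bit-width meets the budget."""
    bits = [max(candidates) for _ in activations]
    while sum(bits) / len(bits) > budget_bits:
        costs = []
        for i, act in enumerate(activations):
            lower = [b for b in candidates if b < bits[i]]
            if not lower:
                costs.append((float("inf"), i, bits[i]))
                continue
            nb = max(lower)
            err = float(np.mean((act - quantize(act, nb)) ** 2))
            costs.append((err, i, nb))
        err, i, nb = min(costs)
        if err == float("inf"):
            break  # nothing left to lower
        bits[i] = nb
    return bits
```

A static greedy search like this ignores the training-time interaction between bit choices and weights; the instability of that interaction is exactly what the two-phase design above targets.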
CSGM Designer: a platform for designing cross-species intron-spanning genic markers linked with genome information of legumes.
Background: Genetic markers are tools that can facilitate molecular breeding, even in species lacking genomic resources. An important class of genetic markers is those based on orthologous genes, because they can guide hypotheses about conserved gene function, a situation that is well documented for a number of agronomic traits. For under-studied species, a key bottleneck in gene-based marker development is the need to develop molecular tools (e.g., oligonucleotide primers) that reliably access genes with orthology to the genomes of well-characterized reference species. Results: Here we report an efficient platform for the design of cross-species gene-derived markers in legumes. The automated platform, named CSGM Designer (URL: http://tgil.donga.ac.kr/CSGMdesigner), facilitates rapid and systematic design of cross-species genic markers. The underlying database is composed of genome data from five legume species whose genomes are substantially characterized. Use of CSGM is enhanced by graphical displays of query results, which we describe as "circular viewer" and "search-within-results" functions. CSGM provides a virtual PCR representation (eHT-PCR) that predicts the specificity of each primer pair simultaneously in multiple genomes. CSGM Designer output was experimentally validated for the amplification of orthologous genes using 16 genotypes representing 12 crop and model legume species, distributed among the galegoid and phaseoloid clades. Successful cross-species amplification was obtained for 85.3% of PCR primer combinations. Conclusion: CSGM Designer spans the divide between well-characterized crop and model legume species and their less well-characterized relatives. The outcome is PCR primers that target highly conserved genes for polymorphism discovery, enabling functional inferences and ultimately facilitating trait-associated molecular breeding.
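The essence of a virtual PCR check like eHT-PCR can be sketched in a few lines. This is a simplified exact-match model of primer binding, not the tool's actual implementation, which must also handle mismatches, multiple genomes, and product-size constraints:

```python
def revcomp(seq: str) -> str:
    """Reverse complement of a DNA sequence."""
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(seq.upper()))

def virtual_pcr(genome: str, fwd: str, rev: str, max_product: int = 5000):
    """Return the predicted amplicon if `fwd` binds the plus strand and
    `rev` binds the minus strand downstream (exact matches only), else None."""
    genome = genome.upper()
    start = genome.find(fwd.upper())
    if start < 0:
        return None  # forward primer does not bind
    site = genome.find(revcomp(rev), start + len(fwd))
    if site < 0:
        return None  # no downstream reverse-primer site
    end = site + len(rev)
    if end - start > max_product:
        return None  # product too long to amplify
    return genome[start:end]  # predicted PCR product
```

Running such a check for one primer pair against several reference genomes at once is, in spirit, what predicting cross-species primer specificity amounts to.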
Leukoaraiosis is associated with pneumonia after acute ischemic stroke
Diagnostic criteria for stroke-associated pneumonia based on the CDC criteria. (DOCX 25 kb)