
    Measured Quantum Fourier Transform of 1024 Qubits on Fiber Optics

    The quantum Fourier transform (QFT) is a key function in realizing quantum computers. A QFT followed by measurement was demonstrated on a simple fiber-optic circuit. The QFT was shown to be robust against imperfections in the rotation gates. The error probability was estimated to be 0.01 per qubit, corresponding to error-free operation on 100 qubits. The error probability can be further reduced by taking a majority vote over the accumulated results, and this reduction enabled a successful QFT demonstration on 1024 qubits.
    Comment: 15 pages, 6 figures, submitted to EQIS 2003 Special Issue, Int. J. Quantum Information
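The two quantitative claims above can be sketched numerically: the QFT is a unitary matrix, and with a per-run error probability of 0.01, repeating a computation an odd number of times and taking the majority outcome suppresses the error binomially. A minimal sketch (the function names are ours, not the paper's):

```python
import numpy as np
from math import comb

def qft_matrix(n_qubits):
    """Unitary matrix of the quantum Fourier transform on n_qubits."""
    N = 2 ** n_qubits
    j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    return np.exp(2j * np.pi * j * k / N) / np.sqrt(N)

# Sanity check: the QFT matrix is unitary.
F = qft_matrix(4)
assert np.allclose(F.conj().T @ F, np.eye(16))

def majority_error(p, repeats):
    """Probability that the majority of `repeats` independent runs is wrong,
    given per-run error probability p (repeats should be odd)."""
    return sum(comb(repeats, k) * p**k * (1 - p)**(repeats - k)
               for k in range(repeats // 2 + 1, repeats + 1))

print(majority_error(0.01, 1))  # baseline: 0.01
print(majority_error(0.01, 5))  # majority of 5 runs: orders of magnitude smaller
```

With p = 0.01, five repetitions already push the majority-vote error below 1e-4, which is the mechanism behind scaling the demonstration from 100 to 1024 qubits.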

    Scalability of Shor's algorithm with a limited set of rotation gates

    Typical circuit implementations of Shor's algorithm involve controlled rotation gates of magnitude π/2^{2L}, where L is the binary length of the integer N to be factored. Such gates cannot be implemented exactly using existing fault-tolerant techniques. Approximating a given controlled π/2^d rotation gate to within δ = O(1/2^d) currently requires both a number of qubits and a number of fault-tolerant gates that grow polynomially with d. In this paper we show that this additional growth in space and time complexity would severely limit the applicability of Shor's algorithm to large integers. Consequently, we study in detail the effect of using only controlled rotation gates with d less than or equal to some d_max. It is found that integers up to length L_max = O(4^{d_max}) can be factored without significant performance penalty, implying that the cumbersome techniques of fault-tolerant computation only need to be used to create controlled rotation gates of magnitude π/64 if integers thousands of bits long are to be factored. Explicit fault-tolerant constructions of such gates are also discussed.
    Comment: Substantially revised version, twice as long as the original. Two tables converted into one 8-part figure; new section added on the construction of arbitrary single-qubit rotations using only the fault-tolerant gate set. Substantial additional discussion and explanatory figures added throughout. (8 pages, 6 figures)

    Statistical Assertions for Validating Patterns and Finding Bugs in Quantum Programs

    In support of the growing interest in quantum computing experimentation, programmers need new tools to write quantum algorithms as program code. Compared to debugging classical programs, debugging quantum programs is difficult because programmers have only a limited ability to probe the internal states of quantum programs; those states are difficult to interpret even when observations are available; and programmers do not yet have guidelines for what to check for when building quantum programs. In this work, we present quantum program assertions based on statistical tests on classical observations. These allow programmers to decide whether a quantum program state matches its expected value as one of three types of state: classical, superposition, or entangled. We extend an existing quantum programming language with the ability to specify quantum assertions, which our tool then checks in a quantum program simulator. We use these assertions to debug three benchmark quantum programs in factoring, search, and chemistry. We share what types of bugs are possible, and lay out a strategy for using quantum programming patterns to place assertions and prevent bugs.
    Comment: In The 46th Annual International Symposium on Computer Architecture (ISCA '19). arXiv admin note: text overlap with arXiv:1811.0544
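A statistical assertion of this kind can be sketched as a hand-rolled chi-squared check on measurement counts. This is illustrative only: the paper extends an existing quantum language, whereas the function name, critical value, and toy counts below are our own:

```python
from collections import Counter

def assert_uniform(counts, outcomes, critical_value):
    """Illustrative statistical assertion: check that measured bitstring counts
    are consistent with a uniform superposition, using a chi-squared statistic
    compared against a caller-supplied critical value for the test's degrees
    of freedom (len(outcomes) - 1)."""
    total = sum(counts.values())
    expected = total / len(outcomes)
    chi2 = sum((counts.get(o, 0) - expected) ** 2 / expected for o in outcomes)
    assert chi2 < critical_value, f"state deviates from uniform (chi2={chi2:.2f})"

# 1000 simulated shots after Hadamards on two qubits: roughly uniform over
# the four outcomes, so the assertion should pass.
shots = Counter({"00": 251, "01": 246, "10": 259, "11": 244})
assert_uniform(shots, ["00", "01", "10", "11"], critical_value=7.81)  # chi2, df=3, 95%
```

The same skeleton covers the "classical" state type (expected mass on one outcome) by changing the expected distribution; the entangled case additionally requires checking correlations between registers.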

    A Quantitative Measure of Interference

    We introduce an interference measure that quantifies the amount of interference present in any physical process that maps an initial density matrix to a final density matrix. In particular, the interference measure enables one to monitor the amount of interference generated in each step of a quantum algorithm. We show that a Hadamard gate acting on a single qubit is a basic building block for interference generation and realizes one bit of interference, an "i-bit". We use the interference measure to quantify interference for various examples, including Grover's search algorithm and Shor's factorization algorithm. We distinguish between "potentially available" and "actually used" interference, and show that for both algorithms the potentially available interference is exponentially large. However, the amount of interference actually used in Grover's algorithm is only about 3 i-bits and is asymptotically independent of the number of qubits, while Shor's algorithm indeed uses an exponential amount of interference.
    Comment: 13 pages of LaTeX; research done at http://www.quantware.ups-tlse.fr
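The paper's measure is defined on the whole process, not reproduced here. As a loose, illustrative stand-in, one can watch a Hadamard turn a coherence-free state into one carrying a full unit of off-diagonal weight, using the standard l1 coherence of a density matrix (our choice of proxy, not the paper's measure):

```python
import numpy as np

def l1_coherence(rho):
    """l1 coherence: sum of the magnitudes of the off-diagonal elements.
    Used here only as an illustrative proxy for interference-generating
    capability; the paper defines its i-bit measure on the full process."""
    return np.sum(np.abs(rho)) - np.trace(np.abs(rho)).real

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
rho0 = np.array([[1, 0], [0, 0]], dtype=complex)  # |0><0|: no coherence
rho1 = H @ rho0 @ H.conj().T                      # equal superposition

print(l1_coherence(rho0))  # 0.0
print(l1_coherence(rho1))  # 1.0
```

In this proxy, a single Hadamard on |0⟩ generates exactly one unit of coherence, echoing (but not reproducing) the paper's result that a Hadamard realizes one i-bit.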

    Resource Requirements for Fault-Tolerant Quantum Simulation: The Transverse Ising Model Ground State

    We estimate the resource requirements (the total number of physical qubits and the computational time) needed to compute the ground-state energy of a 1-D quantum transverse Ising model (TIM) of N spin-1/2 particles, as a function of the system size and the numerical precision. This estimate is based on analyzing the impact of fault-tolerant quantum error correction in the context of the Quantum Logic Array (QLA) architecture. Our results show that, owing to the exponential scaling of the computational time with the desired precision of the energy, a significant amount of error correction is required to implement the TIM problem. Comparing our results to the resource requirements for a fault-tolerant implementation of Shor's quantum factoring algorithm reveals that the required logical qubit reliability is similar for both the TIM problem and the factoring problem.
    Comment: 19 pages, 8 figures
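For small N, the TIM ground-state energy that such a quantum simulation targets can be computed classically by exact diagonalization, which is how resource estimates of this kind are typically validated. A minimal sketch; the couplings J and h, the open-chain convention, and all names are our assumptions:

```python
import numpy as np
from functools import reduce

X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
I = np.eye(2)

def op_on(site, op, n):
    """Embed a single-site operator at `site` in an n-spin chain via Kronecker products."""
    return reduce(np.kron, [op if i == site else I for i in range(n)])

def tim_hamiltonian(n, J=1.0, h=1.0):
    """Open-chain transverse Ising model: H = -J sum_i Z_i Z_{i+1} - h sum_i X_i."""
    H = sum(-J * op_on(i, Z, n) @ op_on(i + 1, Z, n) for i in range(n - 1))
    H = H + sum(-h * op_on(i, X, n) for i in range(n))
    return H

# Ground-state energy of a 6-spin chain at J = h = 1 (the critical coupling).
E0 = np.linalg.eigvalsh(tim_hamiltonian(6)).min()
print(E0)
```

The 2^N scaling of the matrix dimension is exactly why a fault-tolerant quantum simulation becomes interesting for larger N, and why the paper's resource analysis matters.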

    Effects of imperfections for Shor's factorization algorithm

    We study the effects of imperfections induced by residual couplings between qubits on the accuracy of Shor's algorithm, using numerical simulations of realistic quantum computations with up to 30 qubits. The factoring of numbers up to N = 943 shows that the width of the peaks whose frequencies allow one to determine the factors grows exponentially with the number of qubits. However, the algorithm remains operational up to a critical coupling strength ε_c, which drops only polynomially with log₂ N. The numerical dependence of ε_c on log₂ N is explained by analytical estimates that allow one to obtain the scaling for the functionality of Shor's algorithm on realistic quantum computers with a large number of qubits.
    Comment: 10 pages, 10 figures, 1 table. Added references and new data. Erratum added as appendix. 1 figure and 1 table added. Research is available at http://www.quantware.ups-tlse.fr

    Star Architecture as Socio-Material Assemblage

    Taking inspiration from new materialism and assemblage thinking, the chapter treats star architects and iconic buildings as socio-material network effects that do not pre-exist action but are enacted in practice, in the materiality of design crafting and city building. Star architects are here conceptualized as part of broader assemblages of actors and practices 'making star architecture' a reality, and the buildings they design are considered not just as unique and iconic objects but are dis-articulated as complex crafts mobilizing skills, technologies, materials, and forms of knowledge not necessarily ascribable to architecture. Moving beyond narrow criticism that focuses on the symbolic order of icons as unique creations and alienated repetitions of capitalist development, the chapter aims to widen the scope of critique by bridging culture and economy, symbolism and practicality, making star architecture available to a broad, fragmented arena of (potential) critics, unevenly equipped with critical tools and differentiated experiences.

    Assessing the climate change impacts of biogenic carbon in buildings: a critical review of two main dynamic approaches

    Wood is increasingly perceived as a renewable, sustainable building material. The carbon it contains, biogenic carbon, comes from biological processes and is characterized by a rapid turnover in the global carbon cycle. Increasing the use of harvested wood products (HWP) from sustainable forest management could provide much-needed mitigation efforts and carbon removals. However, the combined climate change benefits of sequestering biogenic carbon, storing it in harvested wood products, and substituting for more emission-intensive materials are hard to quantify. Different methodological choices and assumptions can lead to opposite conclusions, and there is no consensus on the assessment of biogenic carbon in life cycle assessment (LCA). Since LCA is increasingly relied upon for decision and policy making, incorrect biogenic carbon assessment could lead to inefficient or counterproductive strategies, as well as missed opportunities. This article presents a critical review of biogenic carbon impact assessment methods, compares two main approaches to including time considerations in LCA, and suggests the one that seems better suited to assessing the impacts of biogenic carbon in buildings.

    Full Counting Statistics of Non-Commuting Variables: the Case of Spin Counts

    We discuss the full counting statistics (FCS) of non-commuting variables, taking as an example the measurement of successive spin counts in non-collinear directions. We show that, owing to an irreducible detector back-action, the FCS in this case may be sensitive to the dynamics of the detectors and may differ from the predictions obtained using a naive version of the projection postulate. We present a general model of detector dynamics and a path-integral approach to the evaluation of the FCS. We then concentrate on a simple "diffusive" model of the detector dynamics in which the FCS can be evaluated with a transfer-matrix method. The resulting probability distribution of spin counts is characterized by anomalously large higher cumulants and deviates substantially from Gaussian statistics.
    Comment: 11 pages, 3 figures

    Bell Correlations and the Common Future

    Reichenbach's principle states that in a causal structure, correlations of classical information can stem either from a common cause in the common past or from a direct influence from one of the correlated events to the other. The difficulty of explaining Bell correlations through a mechanism in that spirit can be read as questioning either the principle or even its basis: causality. In the former case, the principle can be replaced by its quantum version, which accepts an entangled state as a common cause, leaving the phenomenon as mysterious as ever on the classical level (on which, after all, it occurs). If, more radically, the causal structure itself is questioned, closed space-time curves may become possible that, as argued in the present note, can give rise to non-local correlations if the to-be-correlated pieces of classical information meet in the common future, which they must if the correlation is to be detected in the first place. The result is a view resembling Brassard and Raymond-Robichaud's parallel-lives variant of Hermann's and Everett's relative-state formalism, avoiding "multiple realities."
    Comment: 8 pages, 5 figures