
    Sharp Quantum vs. Classical Query Complexity Separations

    We obtain the strongest separation between quantum and classical query complexity known to date -- specifically, we define a black-box problem that requires exponentially many queries in the classical bounded-error case, but can be solved exactly in the quantum case with a single query (and a polynomial number of auxiliary operations). The problem is simple to define and the quantum algorithm solving it is also simple when described in terms of certain quantum Fourier transforms (QFTs) that have natural properties with respect to the algebraic structures of finite fields. These QFTs may be of independent interest, and we also investigate generalizations of them to noncommutative finite rings. Comment: 13 pages, change in title, improvements in presentation, and minor corrections. To appear in Algorithmica
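
    For reference (the paper's exact construction may differ in details such as normalisation or character choice), the standard quantum Fourier transform over a finite field \mathbb{F}_q with q = p^m is built from the additive characters of the field via the trace map:

        \[
          \mathrm{QFT}_{\mathbb{F}_q} : \; |x\rangle \;\longmapsto\;
          \frac{1}{\sqrt{q}} \sum_{y \in \mathbb{F}_q}
          \omega_p^{\mathrm{Tr}(xy)} \, |y\rangle ,
          \qquad
          \omega_p = e^{2\pi i / p}, \quad
          \mathrm{Tr}(z) = \sum_{k=0}^{m-1} z^{p^k} .
        \]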

    One-qubit fingerprinting schemes

    Fingerprinting is a technique in communication complexity in which two parties (Alice and Bob) with large data sets send short messages to a third party (a referee), who attempts to compute some function of the larger data sets. For the equality function, the referee attempts to determine whether Alice's data and Bob's data are the same. In this paper, we consider the extreme scenario of fingerprinting in which Alice and Bob each send either a one-bit (classical) or a one-qubit (quantum) message to the referee for the equality problem. Restrictive bounds are demonstrated on the error probability of one-bit fingerprinting schemes, and it is shown that one-qubit fingerprinting schemes can easily be constructed that outperform any one-bit fingerprinting scheme. The author hopes that this analysis will provide results useful for performing physical experiments, which may help to advance implementations of more general quantum communication protocols. Comment: 9 pages; fixed some typos; changed order of bibliographical references
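
    As a toy numpy sketch (not the paper's construction), one can encode each n-bit string as an equatorial qubit state and let a hypothetical referee compare the two qubits with a swap test, which for unequal inputs wrongly reports "equal" with probability (1 + |<psi_x|psi_y>|^2)/2; the worst-case error then depends entirely on how well separated the encoded states are:

        import numpy as np
        from itertools import combinations

        def state(x, n):
            # Encode the n-bit string x as the equatorial qubit state
            # (|0> + e^{i theta_x}|1>)/sqrt(2) -- a hypothetical encoding.
            theta = 2 * np.pi * x / 2**n
            return np.array([1.0, np.exp(1j * theta)]) / np.sqrt(2)

        def worst_case_error(n):
            # Swap test on the two fingerprints: for x != y the referee
            # wrongly reports "equal" with probability (1 + |<x|y>|^2) / 2.
            worst = 0.0
            for x, y in combinations(range(2**n), 2):
                overlap = abs(np.vdot(state(x, n), state(y, n)))**2
                worst = max(worst, (1 + overlap) / 2)
            return worst

        print(worst_case_error(3))   # nearby phases make this encoding weak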

    Minimally complex ion traps as modules for quantum communication and computing

    Optically linked ion traps are promising as components of network-based quantum technologies, including communication systems and modular computers. Experimental results achieved to date indicate that the fidelity of operations within each ion trap module will be far higher than the fidelity of operations involving the links; fortunately, internal storage and processing can effectively upgrade the links through the process of purification. Here we perform the most detailed analysis to date of this purification task, using a protocol which is balanced to maximise fidelity while minimising the device complexity and the time cost of the process. Moreover, we 'compile down' the quantum circuit to device-level operations including cooling and shuttling events. We find that a linear trap with only five ions (two of one species, three of another) can support our protocol while incorporating desirable features such as 'global control', i.e. laser control pulses need only target an entire zone rather than differentiating one ion from its neighbour. To evaluate the capabilities of such a module, we consider its use both as a universal communications node for quantum key distribution and as the basic repeating unit of a quantum computer. For the latter case we evaluate the threshold for fault-tolerant quantum computing using the surface code, finding acceptable fidelities for the 'raw' entangling link as low as 83% (or under 75% if an additional ion is available). Comment: 15 pages, 8 figures
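
    For orientation only (the balanced protocol analysed in the paper differs), the textbook BBPSSW recurrence illustrates how repeated purification rounds upgrade a noisy Werner-state link of fidelity F, e.g. starting from the 83% raw-link figure quoted above:

        def bbpssw_round(F):
            # One round of the standard BBPSSW purification recurrence for
            # Werner states of fidelity F; returns (new fidelity, success prob).
            num = F**2 + ((1 - F) / 3)**2
            den = F**2 + 2 * F * (1 - F) / 3 + 5 * ((1 - F) / 3)**2
            return num / den, den

        F = 0.83   # 'raw' entangling-link fidelity at the quoted threshold
        for r in range(3):
            F, p = bbpssw_round(F)
            print(f"round {r + 1}: F = {F:.4f}, success probability = {p:.3f}")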

    Finding flows in the one-way measurement model

    The one-way measurement model is a framework for universal quantum computation, in which algorithms are partially described by a graph G of entanglement relations on a collection of qubits. A sufficient condition for an algorithm to perform a unitary embedding between two Hilbert spaces is for the graph G, together with input/output vertices I, O \subset V(G), to have a flow in the sense introduced by Danos and Kashefi [quant-ph/0506062]. For the special case of |I| = |O|, using a graph-theoretic characterization, I show that such flows are unique when they exist. This leads to an efficient algorithm for finding flows, by a reduction to solved problems in graph theory. Comment: 8 pages, 3 figures; somewhat condensed and updated version, to appear in PR
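
    As a small illustration (a verification sketch under the usual definition, not the paper's flow-finding algorithm), a candidate successor map f and partial order can be checked against the Danos-Kashefi flow conditions directly:

        def is_flow(adj, I, O, f, leq):
            # adj: dict vertex -> set of neighbours; f: successor map on V \ O;
            # leq(a, b) encodes the partial order a <= b (assumed reflexive).
            for x in adj:
                if x in O:
                    continue
                fx = f[x]
                if fx in I or fx not in adj[x]:   # f maps V\O into V\I; x ~ f(x)
                    return False
                if not leq(x, fx):                # x <= f(x)
                    return False
                if any(not leq(x, y) for y in adj[fx] if y != x):
                    return False                  # x <= every neighbour of f(x)
            return True

        # Example: path 1 - 2 - 3 with I = {1}, O = {3}, natural order on labels
        adj = {1: {2}, 2: {1, 3}, 3: {2}}
        print(is_flow(adj, {1}, {3}, {1: 2, 2: 3}, lambda a, b: a <= b))  # True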

    A vector error correction model (VECM) of FTSE/JSE SA Listed Property Index and FTSE/JSE SA Capped Property Index

    Abstract: In this paper the Efficient Market Hypothesis (EMH) is investigated on an empirical and theoretical basis. The closing (Close_t), intraday high (High_t), intraday low (Low_t) and opening (Open_t) values of the FTSE/JSE SA Listed Property Index (FTJ253) and the FTSE/JSE Capped Property Index (FTJ254) are used to explore the impact on returns of a one-standard-deviation shock. The interrelationship between these four value series was examined using the Johansen cointegration test, a vector error correction model (VECM) and an impulse response function. The results of these tests provide an indication of the short- and long-run dynamics of all the variables included, and of the reaction of the variables to a one-standard-deviation shock. The results obtained indicate that there is an opportunity for arbitrage when the price deviates from the long-run equilibrium, until a new equilibrium is reached.
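
    A minimal sketch of this workflow in Python with statsmodels follows; the file name and column layout are assumptions, and the lag order is a placeholder rather than a value selected in the paper:

        import pandas as pd
        from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

        # Four value series for one index, e.g. FTJ253 (names are assumptions)
        df = pd.read_csv("ftj253.csv", parse_dates=True, index_col=0)
        df = df[["open", "high", "low", "close"]]

        # Johansen cointegration test via the trace statistic
        rank = select_coint_rank(df, det_order=0, k_ar_diff=2, method="trace")
        print(rank.summary())

        # Fit the VECM at the selected cointegration rank
        res = VECM(df, k_ar_diff=2, coint_rank=rank.rank, deterministic="co").fit()

        # Impulse responses to a one-standard-deviation (orthogonalised) shock
        res.irf(20).plot(orth=True)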

    A mean-variance analysis of the global minimum variance portfolio constructed using the CARBS indices

    Abstract: The purpose of this paper is to construct a global minimum variance portfolio (GMVP) using the log returns of the CARBS (Canada, Australia, Russia, Brazil, South Africa) indices. The weights obtained indicate that most of the portfolio should be invested in Canadian equity. The return series of the CARBS indices and the GMVP appear consistent with the stylised facts of financial time series. Further empirical analysis shows that the CAPM relationship holds for Canada, South Africa, and the GMVP. The systematic risk (β) of the GMVP is the lowest, and that of the Russian equity index is the highest. However, the R² of all the models indicates that the CAPM relationship is not a good fit for all the variables, and beta can therefore not be considered a reliable measure of risk.
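
    The GMVP weights have the standard closed form w = Σ⁻¹1 / (1ᵀΣ⁻¹1), and beta is cov(r, r_m)/var(r_m). A short numpy sketch (using simulated placeholder returns and an equal-weight 'market' proxy, both assumptions) is:

        import numpy as np

        def gmvp_weights(returns):
            # Global minimum variance weights w = S^{-1} 1 / (1' S^{-1} 1)
            # from a (T x N) matrix of log returns.
            S = np.cov(returns, rowvar=False)
            w = np.linalg.solve(S, np.ones(S.shape[0]))
            return w / w.sum()

        def capm_beta(asset, market):
            # Systematic risk (beta) of a return series against a market proxy.
            return np.cov(asset, market)[0, 1] / np.var(market, ddof=1)

        rng = np.random.default_rng(0)
        R = rng.normal(0.0, 0.01, size=(500, 5))   # placeholder CARBS returns
        w = gmvp_weights(R)
        print("weights:", np.round(w, 3))
        print("GMVP beta:", capm_beta(R @ w, R.mean(axis=1)))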

    Simulation techniques and value-at-risk of the CARBS Indices

    Abstract: In this paper, simulation techniques are used to estimate the value-at-risk (VaR) of the CARBS equity indices and a global minimum variance portfolio. The empirical analysis is divided into two parts: the first deals with simulating normally distributed returns in order to estimate VaR, while in the second part calibrated univariate GARCH models are used to simulate return series that are consistent with the stylised facts of financial time series. When a normal distribution is assumed, the GARCH model forecast of the returns produces the most reliable result. Finally, when GARCH processes are simulated, the EGARCH model is superior.
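
    A minimal sketch of both parts, assuming placeholder parameters rather than values calibrated to the CARBS data, is:

        import numpy as np

        rng = np.random.default_rng(42)

        def normal_var(mu, sigma, alpha=0.01, n_sims=100_000):
            # Part 1: Monte Carlo VaR under normally distributed returns,
            # read off as the alpha-quantile loss of the simulated returns.
            sims = rng.normal(mu, sigma, n_sims)
            return -np.quantile(sims, alpha)

        def simulate_garch(omega, a, b, n=1000):
            # Part 2: simulate a GARCH(1,1) return path; a calibrated model
            # would take (omega, a, b) from a fit to the index data.
            r = np.empty(n)
            var = omega / (1 - a - b)          # start at unconditional variance
            for t in range(n):
                r[t] = rng.normal(0.0, np.sqrt(var))
                var = omega + a * r[t]**2 + b * var
            return r

        print("99% VaR:", normal_var(0.0005, 0.012))
        print("GARCH sample std:", simulate_garch(1e-6, 0.08, 0.90).std())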

    Cymothoa hermani sp. nov. (Isopoda, Cymothoidae, Crustacea), a parasitic isopod, collected off the Zanzibar coast, Tanzania from the mouth of a parrotfish (Scaridae)

    Cymothoa hermani sp. nov., a buccal fish-parasitic isopod, is described from off Unguja Island, Zanzibar, from the buccal cavity of the marbled parrotfish, Leptoscarus vaigiensis. Cymothoa hermani sp. nov. is characterised by the unique bulbous ornamentation on pereonite 1, the anterolateral angles of pereonite 1 rounded and produced past the frontal margin of the cephalon, and pereopods with long and slender dactyli. No other species of Cymothoa is known from parrotfishes. This description increases the number of Cymothoa species known from the southwestern Indian Ocean to four.

    Climatic variability and statistics: a simulation study of the power and robustness of some tests used to check the homogeneity of time series

    Statistical analysis of hydrometeorological time series is often used to identify climatic variations. Most often this analysis consists of applying and interpreting statistical tests of time-series homogeneity. Hydrological time series (rainfall and runoff data) are often short and do not always comply with the hypotheses of the statistical methods. Through simulation, we have investigated the power and the robustness of some tests widely used in studies of climatic variability. In each case studied, one hundred samples of fifty elements were generated based on the main characteristics of natural rainfall series. A shift in the mean was used to represent a possible climatic variation. The procedures studied are the rank correlation test, Pettitt's test, Buishand's test, Lee and Heghinian's Bayesian procedure, and Hubert and Carbonnel's segmentation procedure for hydrometeorological series.
    Each simulation of one hundred samples was used to assess the performance of the methods with respect to a specific characteristic of the series: normality or non-normality, autocorrelation, trend, or shift in the variance. First, stationary series were simulated to evaluate the type I error of the tests. Then series were simulated with a break in the mean of varying amplitude, from 25% to 100% of the standard deviation. The rank correlation test, Pettitt's test, Buishand's test and the segmentation procedure with a significance level of 1% (the significance level of Scheffé's test) reject as heterogeneous fewer than ten of one hundred homogeneous simulated series, a result consistent with the type I error of a statistical test. By contrast, Lee and Heghinian's Bayesian method rejects about 40% of the series, which means that this procedure must only be applied under the hypothesis of heterogeneity. The estimated power of the methods reaches 40% to 50% when the break in the mean exceeds 75% of the standard deviation. Independent series were simulated from normal, log-normal and Pearson distributions to compare the performance of the methods requiring normality; the results show that non-normality has no significant impact on their performance. However, the simulations do show that independence of the successive elements of the series is essential for the performance to hold: a trend in the series makes the tests ineffective, except for the rank correlation test, for which the alternative hypothesis is a trend. No method appears robust against both negative and positive autoregressive dependence. The procedures requiring a constant variance are robust when the series keeps a constant mean, but are more or less influenced by a simultaneous break in the mean and the standard deviation.
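
    As a rough illustration of the simulation setup described above, the sketch below implements Pettitt's test from its standard statistic and estimates its power on 100 samples of 50 values with a mid-series mean shift of 0.75 standard deviations. It is a minimal reconstruction under these assumptions, not the authors' code, and the 5% significance level is a placeholder:

        import numpy as np

        rng = np.random.default_rng(1)

        def pettitt_p(x):
            # Pettitt's change-point test: U_t = sum_{i<=t} sum_{j>t} sgn(x_i - x_j),
            # K = max_t |U_t|, approximate significance p ~ 2 exp(-6 K^2 / (n^3 + n^2)).
            n = len(x)
            s = np.sign(np.subtract.outer(x, x))          # s[i, j] = sgn(x_i - x_j)
            U = np.array([s[:t, t:].sum() for t in range(1, n)])
            K = np.abs(U).max()
            return min(1.0, 2 * np.exp(-6 * K**2 / (n**3 + n**2)))

        # 100 samples of 50 values, break in the mean of 0.75 standard deviations
        n, shift, rejections = 50, 0.75, 0
        for _ in range(100):
            x = rng.normal(0.0, 1.0, n)
            x[n // 2:] += shift
            rejections += pettitt_p(x) < 0.05
        print(f"estimated power: {rejections}%")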

    NDM-526: INVESTIGATION OF STABLE AND UNSTABLE FIBER-REINFORCED ELASTOMERIC ISOLATORS

    Fiber-reinforced elastomeric isolators (FREIs) are a potentially low-cost alternative to conventional steel-reinforced elastomeric isolators. FREIs can exhibit a non-linear horizontal force-displacement relationship characterized by a softening and stiffening phase, similar to other adaptive isolation devices such as the triple friction pendulum. This non-linear relationship is a consequence of unique deformations that occur during horizontal displacement, denoted as rollover, which causes softening, and full rollover, which causes stiffening. The magnitude of the softening due to rollover is primarily governed by the width-to-total-height aspect ratio of the FREI. If the aspect ratio is low, below about 2.5, the isolator may be susceptible to horizontal instability, where the tangential stiffness becomes negative before increasing due to full rollover. Design codes prevent the use of an isolation system susceptible to horizontal instability within the design displacement. In this paper, experimental testing is used to calibrate a numerical model of a base-isolated structure using horizontally unstable and stable FREIs. The performance of the structure is evaluated based on peak displacement of the isolation layer and peak acceleration of the base-isolated structure. For the isolators considered, it is shown that the horizontal instability does not have a negative impact on the performance of the structure. It is postulated that some level of horizontal instability may be allowed in the design of unbonded FREIs.
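
    As a rough sketch of the softening-stiffening behaviour described above (coefficients are illustrative, not calibrated to the paper's test data), a polynomial backbone curve with a negative cubic term reproduces rollover softening, a region of negative tangent stiffness (the horizontally unstable case), and restiffening at full rollover:

        import numpy as np

        def frei_force(x, k1=1.0, k3=-0.9, k5=0.3):
            # Backbone force: linear + softening cubic + stiffening quintic.
            return k1 * x + k3 * x**3 + k5 * x**5

        def tangent_stiffness(x, k1=1.0, k3=-0.9, k5=0.3):
            # d(force)/dx; negative values flag horizontal instability.
            return k1 + 3 * k3 * x**2 + 5 * k5 * x**4

        for xi in np.linspace(0.0, 1.6, 9):
            print(f"x = {xi:.2f}  F = {frei_force(xi):+.3f}  "
                  f"kt = {tangent_stiffness(xi):+.3f}")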