
    Single-qubit unitary gates by graph scattering

    We consider the effects of plane-wave states scattering off finite graphs, as an approach to implementing single-qubit unitary operations within the continuous-time quantum walk framework of universal quantum computation. Four semi-infinite tails are attached at arbitrary points of a given graph, representing the input and output registers of a single qubit. For a range of momentum eigenstates, we enumerate all of the graphs with up to $n = 9$ vertices for which the scattering implements a single-qubit gate. As $n$ increases, the number of new unitary operations increases exponentially, and for $n > 6$ the majority correspond to rotations about axes distributed roughly uniformly across the Bloch sphere. Rotations by both rational and irrational multiples of $\pi$ are found. Comment: 8 pages, 7 figures
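    Any single-qubit gate is a 2x2 unitary and hence, up to a global phase, a rotation of the Bloch sphere. The following numpy sketch (mine, not the paper's enumeration code; the helper name bloch_rotation is hypothetical) recovers the rotation angle and axis from such a unitary, e.g. one read off from a graph's scattering amplitudes:

```python
import numpy as np

def bloch_rotation(U):
    """Decompose a 2x2 unitary U = exp(i*alpha) * R_n(theta) and return the
    rotation angle theta and the Bloch-sphere axis n (standard SU(2)
    decomposition; illustrative only, not taken from the paper)."""
    alpha = np.angle(np.linalg.det(U)) / 2.0          # global phase
    V = U * np.exp(-1j * alpha)                       # now det(V) = 1
    # V = cos(theta/2) I - i sin(theta/2) (n . sigma)
    theta = 2.0 * np.arccos(np.clip(np.real(np.trace(V)) / 2.0, -1.0, 1.0))
    paulis = [np.array([[0, 1], [1, 0]]),             # X
              np.array([[0, -1j], [1j, 0]]),          # Y
              np.array([[1, 0], [0, -1]])]            # Z
    s = np.sin(theta / 2.0)
    if np.isclose(s, 0.0):                            # identity: axis is arbitrary
        return theta, np.array([0.0, 0.0, 1.0])
    axis = np.array([np.imag(np.trace(p @ V)) / (-2.0 * s) for p in paulis])
    return theta, axis

# Example: the Hadamard gate is a rotation by pi about the (x+z)/sqrt(2) axis.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
print(bloch_rotation(H))
```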

    Portfolio optimization when risk factors are conditionally varying and heavy tailed

    Assumptions about the dynamic and distributional behavior of risk factors are crucial for the construction of optimal portfolios and for risk assessment. Although asset returns are generally characterized by conditionally varying volatilities and fat tails, the normal distribution with constant variance continues to be the standard framework in portfolio management. Here we propose a practical approach to portfolio selection. It takes both the conditionally varying volatility and the fat-tailedness of risk factors explicitly into account, while retaining analytical tractability and ease of implementation. An application to a portfolio of nine German DAX stocks illustrates that the model is strongly favored by the data and that it is practically implementable. Classification: C13, C32, G11, G14, G18.
    The assessment of risk and the optimal composition of securities portfolios depend in particular on the assumptions made about the underlying dynamics and the distributional properties of the risk factors. In empirical financial-market analysis it is widely accepted that returns of financial time series exhibit time-varying volatility (heteroskedasticity) and that their conditional distribution deviates from the normal distribution. In particular, the tails of the distribution carry more probability mass than under the normal distribution ('fat tails'), and the observed distribution is often not symmetric. Nevertheless, the assumption of normality with constant variance remains the basis of the mean-variance approach to portfolio optimization. In this study we propose a practical mean-scale approach to portfolio selection that accounts both for the conditional heteroskedasticity of returns and for their non-normal distributional properties. To this end we model the risk-factor dynamics with a GARCH-type process and use stable distributions in place of the normal distribution. The proposed factor model retains good analytical properties and is, moreover, straightforward to implement. An illustrative application of the model to nine stocks from the German stock index (DAX) shows its superior fit to the data and demonstrates its applicability for portfolio optimization.
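    As a rough illustration of the ingredients (a toy stand-in, not the paper's stable-distribution factor model, and with simulated rather than DAX data), the sketch below filters conditional volatilities with a GARCH(1,1) recursion and plugs the resulting conditional covariance matrix into a minimum-variance allocation for nine assets:

```python
import numpy as np

def garch_filter(r, omega, alpha, beta):
    """GARCH(1,1) recursion: sigma_t^2 = omega + alpha*r_{t-1}^2 + beta*sigma_{t-1}^2.
    In practice the parameters would be estimated by maximum likelihood."""
    sigma2 = np.empty_like(r)
    sigma2[0] = r.var()                            # initialise at the sample variance
    for t in range(1, len(r)):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

def min_risk_weights(cov):
    """Minimum-variance weights (summing to one) for a conditional covariance
    matrix; a simple stand-in for the paper's mean-scale optimisation."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

# Toy data: 9 assets, 500 daily returns (the DAX data themselves are not reproduced here).
rng = np.random.default_rng(0)
returns = 0.01 * rng.standard_normal((500, 9))
corr = np.corrcoef(returns, rowvar=False)
vol = np.array([np.sqrt(garch_filter(returns[:, i], 1e-6, 0.05, 0.90)[-1])
                for i in range(returns.shape[1])])
cond_cov = corr * np.outer(vol, vol)               # scale correlations by conditional vols
print(np.round(min_risk_weights(cond_cov), 3))
```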

    Levine's Isomorph Dictionary

    Among word buffs, 1971 will undoubtedly be remembered as the year that the Compact Edition of the Oxford English Dictionary was published, making this monumental work available at one-third the price and one-sixth the bulk of the original. An exceedingly useful lexicographic tool has been placed in the hands of many who formerly had to make a trip to the library to consult it. By contrast, one of the least-heralded publishing events of 1971 was the appearance of Jack Levine's A List of Pattern Words of Lengths Two Through Nine. Nevertheless, I predict that the Levine dictionary may have a greater impact than the COED on word buffs. The information in the COED has been available in the OED for decades, but Levine's dictionary enables the logologist to view Webster's Unabridged in an entirely new light: specifically, it groups together all words with the same underlying pattern, such as EXCESS and BAMBOO (and, in fact, 23 rarer words also having the letter-pattern abcadd). Furthermore, the COED costs $75, but the Levine dictionary may be obtained free while the limited supply lasts.
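    The letter-pattern idea is easy to mechanize; the short Python sketch below (mine, not Levine's) maps each word to its pattern and groups words that share one, reproducing the EXCESS/BAMBOO example:

```python
from collections import defaultdict
from string import ascii_lowercase

def letter_pattern(word):
    """First distinct letter -> 'a', second -> 'b', and so on;
    EXCESS and BAMBOO both yield 'abcadd'."""
    mapping = {}
    for ch in word.lower():
        mapping.setdefault(ch, ascii_lowercase[len(mapping)])
    return "".join(mapping[ch] for ch in word.lower())

def isomorph_dictionary(words):
    """Group a word list by shared letter pattern, as Levine's dictionary does."""
    groups = defaultdict(list)
    for w in words:
        groups[letter_pattern(w)].append(w)
    return groups

print(letter_pattern("EXCESS"), letter_pattern("BAMBOO"))        # abcadd abcadd
print(isomorph_dictionary(["EXCESS", "BAMBOO", "PATTERN"])["abcadd"])
```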

    Construction of a Pragmatic Base Line for Journal Classifications and Maps Based on Aggregated Journal-Journal Citation Relations

    A number of journal classification systems have been developed in bibliometrics since the launch of the Citation Indices by the Institute of Scientific Information (ISI) in the 1960s. These systems are used to normalize citation counts with respect to field-specific citation patterns. The best-known system is the so-called "Web-of-Science Subject Categories" (WCs). In other systems, papers are classified by algorithmic solutions. Using the Journal Citation Reports 2014 of the Science Citation Index and the Social Science Citation Index (n = 11,149 journals), we examine options for developing a new system based on journal classifications into subject categories using aggregated journal-journal citation data. Combining routines in VOSviewer and Pajek, a tree-like classification is developed. At each level one can generate a map of science for all the journals subsumed under a category. Nine major fields are distinguished at the top level. Further decomposition of the social sciences is pursued, for the sake of example, with a focus on journals in information science (LIS) and science studies (STS). The new classification system improves on alternative options by avoiding the problem of randomness in each run that has hitherto made algorithmic solutions irreproducible. Limitations of the new system are discussed (e.g., the classification of multi-disciplinary journals). The system's usefulness for field normalization in bibliometrics should be explored in future studies. Comment: accepted for publication in the Journal of Informetrics, 20 July 201
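    As a rough sketch of the kind of input involved (not the VOSviewer/Pajek workflow used in the paper, and with invented citation counts), aggregated journal-journal citation relations can be treated as a weighted graph and partitioned with a deterministic modularity routine, each community playing the role of one category at a given level of the tree:

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical aggregated citation counts: (citing journal, cited journal, citations).
citations = [
    ("Journal of Informetrics", "Scientometrics", 420),
    ("Scientometrics", "Journal of Informetrics", 380),
    ("JASIST", "Scientometrics", 250),
    ("Physical Review D", "Physical Review Letters", 900),
    ("Physical Review Letters", "Physical Review D", 870),
]

# Fold the directed counts into an undirected weighted graph and partition it with
# the deterministic Clauset-Newman-Moore greedy modularity routine; each community
# stands in for one category at a given level of the classification tree.
G = nx.Graph()
for citing, cited, n in citations:
    if G.has_edge(citing, cited):
        G[citing][cited]["weight"] += n
    else:
        G.add_edge(citing, cited, weight=n)

for i, community in enumerate(greedy_modularity_communities(G, weight="weight")):
    print(f"category {i}: {sorted(community)}")
```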

    Probing the non-linear structure of general relativity with black hole binaries

    Observations of the inspiral of massive binary black holes (BBH) in the Laser Interferometer Space Antenna (LISA) and stellar-mass binary black holes in the European Gravitational-Wave Observatory (EGO) offer a unique opportunity to test the non-linear structure of general relativity. For a binary composed of two non-spinning black holes, the non-linear general relativistic effects depend only on the masses of the constituents. In a recent letter, we explored the possibility of a test to determine all the post-Newtonian coefficients in the gravitational-wave phasing. However, mutual covariances dilute the effectiveness of such a test. In this paper, we propose a more powerful test in which the various post-Newtonian coefficients in the gravitational-wave phasing are systematically measured by treating three of them as independent parameters and demanding their mutual consistency. LISA (EGO) will observe BBH inspirals with a signal-to-noise ratio of more than 1000 (100) and thereby test the self-consistency of each of the nine post-Newtonian coefficients that have so far been computed, by measuring the lower-order coefficients to a relative accuracy of $\sim 10^{-5}$ (respectively, $\sim 10^{-4}$) and the higher-order coefficients to a relative accuracy in the range $10^{-4}$-$0.1$ (respectively, $10^{-3}$-$1$). Comment: 5 pages, 4 figures. Revised version, accepted for publication in Phys. Rev
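    Schematically (a standard parametrization, not quoted from the paper), the frequency-domain phasing of a non-spinning inspiral is a post-Newtonian series

    $$
    \Psi(f) \;=\; 2\pi f\, t_c \;-\; \phi_c \;+\; \sum_{k} \psi_k\, f^{(k-5)/3},
    \qquad \psi_k = \psi_k(m_1, m_2),
    $$

    so that in general relativity every coefficient $\psi_k$ is fixed by the two component masses alone. The proposed test treats three of the $\psi_k$ as independent parameters: two are used to solve for $(m_1, m_2)$, and the measured value of the third is then checked for consistency with the value those masses predict.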