Single-qubit unitary gates by graph scattering
We consider the effects of plane-wave states scattering off finite graphs, as an approach to implementing single-qubit unitary operations within the continuous-time quantum walk framework of universal quantum computation. Four semi-infinite tails are attached at arbitrary points of a given graph, representing the input and output registers of a single qubit. For a range of momentum eigenstates, we enumerate all of the graphs with up to n vertices for which the scattering implements a single-qubit gate. As n increases, the number of new unitary operations increases exponentially, and the majority correspond to rotations about axes distributed roughly uniformly across the Bloch sphere. Rotations by both rational and irrational multiples of π are found.
Comment: 8 pages, 7 figures
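Each gate produced this way is a 2×2 unitary, and placing it on the Bloch sphere means extracting its rotation angle and axis. A minimal sketch in plain Python (the function name and interface are illustrative, not taken from the paper):

```python
import cmath
import math

def bloch_rotation(U):
    """Decompose a 2x2 unitary as exp(i*phase) * R_n(theta), where
    R_n(theta) = cos(theta/2) I - i sin(theta/2) (n . sigma).
    Returns (theta, n) with n a unit rotation axis on the Bloch sphere."""
    (a, b), (c, d) = U
    # Factor out the global phase: det(U) = exp(2i*phase)
    g = cmath.exp(-1j * cmath.phase(a * d - b * c) / 2.0)
    a, b, c, d = g * a, g * b, g * c, g * d   # now in SU(2)
    cos_half = max(-1.0, min(1.0, ((a + d) / 2).real))  # = cos(theta/2)
    theta = 2.0 * math.acos(cos_half)
    s = math.sin(theta / 2.0)
    if s < 1e-12:                  # identity (up to phase): axis arbitrary
        return 0.0, (0.0, 0.0, 1.0)
    # Match the entries of cos(theta/2) I - i sin(theta/2) (nx X + ny Y + nz Z)
    nx = ((b + c) / (-2j * s)).real
    ny = ((c - b) / (2.0 * s)).real
    nz = ((a - d) / (-2j * s)).real
    return theta, (nx, ny, nz)
```

For example, the Pauli-X gate `[[0, 1], [1, 0]]` comes out as a rotation by π about the x axis.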
Portfolio optimization when risk factors are conditionally varying and heavy tailed
Assumptions about the dynamic and distributional behavior of risk factors are crucial for the construction of optimal portfolios and for risk assessment. Although asset returns are generally characterized by conditionally varying volatilities and fat tails, the normal distribution with constant variance continues to be the standard framework in portfolio management. Here we propose a practical approach to portfolio selection. It takes both the conditionally varying volatility and the fat-tailedness of risk factors explicitly into account, while retaining analytical tractability and ease of implementation. An application to a portfolio of nine German DAX stocks illustrates that the model is strongly favored by the data and that it is practically implementable. Classification: C13, C32, G11, G14, G18.
The assessment of risk and the optimal composition of securities portfolios depend, in particular, on the assumptions made about the underlying dynamics and the distributional properties of the risk factors. It is widely accepted in empirical financial-market analysis that the returns of financial time series exhibit time-varying volatility (heteroskedasticity) and that the conditional distribution of the returns deviates from the normal distribution. In particular, the tails of the distribution carry a higher probability density than under normality ('fat tails'), and the observed distribution is frequently asymmetric. Nevertheless, the normality assumption with constant variance continues to form the basis of the mean-variance approach to portfolio optimization. In the present study, we propose a practical mean-scale approach to portfolio selection that is able to account both for the conditional heteroskedasticity of the returns and for the deviations from normality. To this end, we employ GARCH-like dynamics for the risk factors and use stable distributions in place of the normal distribution. The proposed factor model both has good analytical properties and is, moreover, easy to implement. An illustrative application of the proposed model to nine stocks from the German stock index (DAX) demonstrates its better fit to the data and its applicability for the purpose of portfolio optimization.
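The factor dynamics described above can be sketched in a few lines. The paper pairs GARCH-like dynamics with stable distributions; the sketch below substitutes Student-t innovations (also heavy-tailed, but easy to sample with the standard library), so it illustrates the structure rather than the authors' exact model:

```python
import math
import random

def simulate_garch_t(n, omega=1e-6, alpha=0.08, beta=0.90, df=5, seed=42):
    """Simulate n returns with GARCH(1,1) conditional variance and
    unit-variance Student-t innovations (requires df > 2)."""
    rng = random.Random(seed)
    h = omega / (1.0 - alpha - beta)   # start at the unconditional variance
    returns = []
    for _ in range(n):
        # Student-t draw: normal over sqrt(chi-square / df), rescaled to unit variance
        chi2 = sum(rng.gauss(0, 1) ** 2 for _ in range(df))
        z = rng.gauss(0, 1) / math.sqrt(chi2 / df) * math.sqrt((df - 2) / df)
        r = math.sqrt(h) * z
        returns.append(r)
        h = omega + alpha * r * r + beta * h   # GARCH(1,1) variance recursion
    return returns
```

The same recursion, applied per risk factor, gives the conditionally varying scale that the mean-scale portfolio selection then uses in place of a constant variance.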
Levine's Isomorph Dictionary
Among word buffs, 1971 will undoubtedly be remembered as the year that the Compact Edition of the Oxford English Dictionary was published, making this monumental work available at one-third the price and one-sixth the bulk of the original. An exceedingly useful lexicographic tool has been placed in the hands of many who formerly had to make a trip to the library to consult it. By contrast, one of the least-heralded publishing events of 1971 was the appearance of Jack Levine's A List of Pattern Words of Lengths Two Through Nine. Nevertheless, I predict that the Levine dictionary may have a greater impact than the COED on word buffs. The information in the COED has been available in the OED for decades, but Levine's dictionary enables the logologist to view Webster's Unabridged in an entirely new light: specifically, it groups together all words with the same underlying pattern, such as EXCESS and BAMBOO (and, in fact, 23 rarer words also having the letter-pattern abcadd). Furthermore, the COED costs $75, but the Levine dictionary may be obtained free while the limited supply lasts.
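The letter-pattern idea is easy to make concrete. A short sketch (function names are illustrative) that computes a word's pattern and groups isomorphs the way Levine's list does:

```python
from collections import defaultdict

def letter_pattern(word):
    """Return the isomorph pattern of a word: the first distinct letter
    becomes 'a', the second 'b', and so on (EXCESS, BAMBOO -> 'abcadd')."""
    mapping = {}
    out = []
    for ch in word.lower():
        if ch not in mapping:
            mapping[ch] = "abcdefghijklmnopqrstuvwxyz"[len(mapping)]
        out.append(mapping[ch])
    return "".join(out)

def group_isomorphs(words):
    """Group words that share the same underlying letter pattern."""
    groups = defaultdict(list)
    for w in words:
        groups[letter_pattern(w)].append(w)
    return dict(groups)
```

Running `letter_pattern` on EXCESS and BAMBOO yields "abcadd" for both, so they land in the same group.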
Insights and approaches using deep learning to classify wildlife.
The implementation of intelligent software to identify and classify objects and individuals in visual fields is a technology of growing importance to operatives in many fields, including wildlife conservation and management. To non-experts, the methods can be abstruse and the results mystifying. Here, in the context of applying cutting-edge methods to classify wildlife species from camera-trap data, we shed light on the methods themselves and the types of features these methods extract to make efficient identifications and reliable classifications. The current state of the art is to employ convolutional neural networks (CNNs) encoded within deep-learning algorithms. We outline these methods and present results obtained in training a CNN to classify 20 African wildlife species with an overall accuracy of 87.5% from a dataset containing 111,467 images. We demonstrate the application of a gradient-weighted class-activation-mapping (Grad-CAM) procedure to extract the most salient pixels in the final convolution layer. We show that these pixels highlight features in particular images that in some cases are similar to those used to train humans to identify these species. Further, we used mutual-information methods to identify the neurons in the final convolution layer that consistently respond most strongly across a set of images of one particular species. We then interpret the features in the image regions where the strongest responses occur, and present dataset biases that were revealed by these extracted features. We also used hierarchical clustering of feature vectors (i.e., the state of the final fully-connected layer in the CNN) associated with each image to produce a visual-similarity dendrogram of identified species. Finally, we evaluated the relative unfamiliarity of images that were not part of the training set, contrasting images of the 20 species "known" to our CNN with images of species that were "unknown" to it.
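The Grad-CAM step reduces to a weighted sum of the final convolution layer's feature maps, with weights given by globally average-pooled gradients, followed by a ReLU. A framework-agnostic sketch, assuming the activations and gradients have already been extracted from the network:

```python
import numpy as np

def grad_cam(activations, gradients):
    """Grad-CAM heatmap from the final convolution layer.

    activations: (K, H, W) feature maps for one image
    gradients:   (K, H, W) gradients of the class score w.r.t. those maps
    (Both would be pulled out of a deep-learning framework; here they are inputs.)
    """
    # Channel weights: global-average-pool the gradients
    weights = gradients.mean(axis=(1, 2))            # shape (K,)
    # Weighted sum of the feature maps, then ReLU
    cam = np.einsum("k,khw->hw", weights, activations)
    cam = np.maximum(cam, 0.0)
    if cam.max() > 0:
        cam /= cam.max()                             # normalise to [0, 1]
    return cam
```

Upsampling the resulting H×W map to the input resolution and overlaying it on the photo highlights the salient pixels described above.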
Construction of a Pragmatic Base Line for Journal Classifications and Maps Based on Aggregated Journal-Journal Citation Relations
A number of journal classification systems have been developed in
bibliometrics since the launch of the Citation Indices by the Institute of
Scientific Information (ISI) in the 1960s. These systems are used to normalize
citation counts with respect to field-specific citation patterns. The best
known system is the so-called "Web-of-Science Subject Categories" (WCs). In
other systems, papers are classified by algorithmic solutions. Using the Journal
Citation Reports 2014 of the Science Citation Index and the Social Science
Citation Index (n of journals = 11,149), we examine options for developing a
new system based on journal classifications into subject categories using
aggregated journal-journal citation data. Combining routines in VOSviewer and
Pajek, a tree-like classification is developed. At each level one can generate
a map of science for all the journals subsumed under a category. Nine major
fields are distinguished at the top level. Further decomposition of the social
sciences is pursued for the sake of example with a focus on journals in
information science (LIS) and science studies (STS). The new classification
system improves on alternative options by avoiding the problem of randomness in
each run that has made algorithmic solutions hitherto irreproducible.
Limitations of the new system are discussed (e.g., the classification of multi-disciplinary journals). The system's usefulness for field-normalization in bibliometrics should be explored in future studies.
Comment: accepted for publication in the Journal of Informetrics, 20 July 201
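The reproducibility point can be illustrated with a deterministic agglomerative clustering of journal citation profiles, which always yields the same tree for the same input. A toy sketch (average linkage on cosine distance; the journal names and citation vectors are invented, and the paper itself uses VOSviewer and Pajek rather than this code):

```python
import math

def cosine(u, v):
    """Cosine similarity between two citation-profile vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def agglomerate(vectors, n_clusters):
    """Average-linkage agglomerative clustering on cosine distance.
    vectors: dict of journal name -> citation-profile vector."""
    clusters = [[name] for name in vectors]

    def dist(c1, c2):
        # average pairwise cosine distance between two clusters
        pairs = [(a, b) for a in c1 for b in c2]
        return sum(1 - cosine(vectors[a], vectors[b]) for a, b in pairs) / len(pairs)

    while len(clusters) > n_clusters:
        # deterministically merge the closest pair of clusters
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: dist(clusters[ij[0]], clusters[ij[1]]))
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters
```

Stopping the merging at successive levels produces the tree-like structure from which a map can be generated at each level.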
Probing the non-linear structure of general relativity with black hole binaries
Observations of the inspiral of massive binary black holes (BBH) in the Laser
Interferometer Space Antenna (LISA) and stellar mass binary black holes in the
European Gravitational-Wave Observatory (EGO) offer a unique opportunity to
test the non-linear structure of general relativity. For a binary composed of
two non-spinning black holes, the non-linear general relativistic effects
depend only on the masses of the constituents. In a recent letter, we explored
the possibility of a test to determine all the post-Newtonian coefficients in
the gravitational wave-phasing.
However, mutual covariances dilute the effectiveness of such a test. In this
paper, we propose a more powerful test in which the various post-Newtonian
coefficients in the gravitational wave phasing are systematically measured by
treating three of them as independent parameters and demanding their mutual
consistency. LISA (EGO) will observe BBH inspirals with a signal-to-noise ratio
of more than 1000 (100) and thereby test the self-consistency of each of the
nine post-Newtonian coefficients that have so far been computed, by measuring
the lower-order coefficients to a relative accuracy of (respectively, ) and the higher-order coefficients to a relative accuracy in the range -0.1 (respectively, -1).
Comment: 5 pages, 4 figures. Revised version, accepted for publication in Phys. Rev.
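The proposed consistency test has a simple structure: for non-spinning binaries every phasing coefficient is a function of the same two masses, so three independently measured coefficients must agree at some point of the (m1, m2) plane. A sketch with hypothetical stand-in coefficient functions (the real post-Newtonian expressions are not reproduced here; only the shared mass dependence matters):

```python
# Hypothetical stand-ins for post-Newtonian phasing coefficients: in the
# real analysis each psi_k(m1, m2) comes from the PN expansion of the
# gravitational-wave phase; here we only need that all depend on the masses.
def psi_a(m1, m2):
    return (m1 * m2) / (m1 + m2) ** 2              # symmetric mass ratio

def psi_b(m1, m2):
    return (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2     # chirp mass

def psi_c(m1, m2):
    return m1 + m2                                 # total mass

def consistent(measured, sigmas, grid):
    """Grid-scan the (m1, m2) plane for a point where every measured
    coefficient agrees with its prediction within its error bar."""
    funcs = (psi_a, psi_b, psi_c)
    for m1 in grid:
        for m2 in grid:
            if all(abs(f(m1, m2) - v) <= s
                   for f, v, s in zip(funcs, measured, sigmas)):
                return True
    return False
```

If no common (m1, m2) exists within the error bars, the measured coefficients are mutually inconsistent, signalling a deviation from the general-relativistic prediction.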