
    A Hyper-Relation Characterization of Weak Pseudo-Rationalizability

    I provide a characterization of weakly pseudo-rationalizable choice functions---that is, choice functions rationalizable by a set of acyclic relations---in terms of hyper-relations satisfying certain properties. For those hyper-relations Nehring calls extended preference relations, the central characterizing condition is weaker than (hyper-relation) transitivity but stronger than (hyper-relation) acyclicity. Furthermore, the relevant type of hyper-relation can be represented as the intersection of a certain class of its extensions. These results generalize known, analogous results for path-independent choice functions.
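
    For orientation, one standard way to formalize this notion (notation assumed here, in the Aizerman--Malishevski tradition, rather than quoted from the paper): a choice function C is pseudo-rationalizable by relations R_1, ..., R_k when its choices are exactly the options maximal under at least one of the relations,

        C(A) = \bigcup_{i=1}^{k} \max(A, R_i), \qquad \max(A, R) = \{\, x \in A : \text{no } y \in A \text{ with } y \mathbin{R} x \,\},

    with weak pseudo-rationalizability requiring only that each R_i be acyclic.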

    The physics and the mixed Hodge structure of Feynman integrals

    This expository text is an invitation to the relation between quantum field theory Feynman integrals and periods. We first describe the relation between the Feynman parametrization of loop amplitudes and world-line methods, by explaining that the first Symanzik polynomial is the determinant of the period matrix of the graph, and that the second Symanzik polynomial is expressed in terms of world-line Green's functions. We then review the relation between Feynman graphs and variations of mixed Hodge structures. Finally, we provide an algorithm for generating the Picard-Fuchs equation satisfied by the all-equal-mass banana graphs in a two-dimensional space-time to all loop orders. (v2: 34 pages, 5 figures. Minor changes, references added. String-Math 2013 proceedings contribution.)
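
    For reference, the graph-polynomial definitions behind that statement are standard (textbook conventions assumed, up to overall signs and metric choice): for a graph with Schwinger parameters x_e attached to the edges,

        \mathcal{U}(x) = \sum_{T \text{ spanning tree}} \; \prod_{e \notin T} x_e, \qquad \mathcal{F}(x, p) = \sum_{(T_1, T_2) \text{ spanning 2-forest}} \Big( \sum_{p_j \to T_1} p_j \Big)^{2} \prod_{e \notin T_1 \cup T_2} x_e \; + \; \mathcal{U}(x) \sum_{e} m_e^2 \, x_e,

    where the squared momentum is the total external momentum flowing into the forest component T_1.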

    Luttinger States at the Edge

    An effective wavefunction for the edge excitations in the fractional quantum Hall effect can be found by dimensionally reducing the bulk wavefunction. Treated this way, the Laughlin ν = 1/(2n+1) wavefunction yields a Luttinger model ground state. We identify the edge-electron field with a Luttinger hyper-fermion operator, and the edge electron itself with a non-backscattering Bogoliubov quasi-particle. The edge-electron propagator may be calculated directly from the effective wavefunction using the properties of a one-dimensional one-component plasma, provided a prescription is adopted which is sensitive to the extra flux attached to the electrons.
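
    For context, the bulk state being dimensionally reduced is the standard Laughlin wavefunction at filling ν = 1/(2n+1) (standard symmetric-gauge form, not quoted from the paper):

        \Psi(z_1, \dots, z_N) = \prod_{i<j} (z_i - z_j)^{2n+1} \, \exp\Big( -\frac{1}{4\ell_B^2} \sum_i |z_i|^2 \Big),

    where the z_i are electron positions in complex notation and \ell_B is the magnetic length.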

    Building Efficient Query Engines in a High-Level Language

    Abstraction without regret refers to the vision of using high-level programming languages for systems development without experiencing a negative impact on performance. A database system designed according to this vision offers both increased productivity and high performance, instead of sacrificing the former for the latter as is the case with existing, monolithic implementations that are hard to maintain and extend. In this article, we realize this vision in the domain of analytical query processing. We present LegoBase, a query engine written in the high-level language Scala. The key technique to regain efficiency is to apply generative programming: LegoBase performs source-to-source compilation and optimizes the entire query engine by converting the high-level Scala code to specialized, low-level C code. We show how generative programming makes it easy to implement a wide spectrum of optimizations, such as introducing data partitioning or switching from a row to a column data layout, which are difficult to achieve with existing low-level query compilers that handle only queries. We demonstrate that sufficiently powerful abstractions are essential for dealing with the complexity of the optimization effort, shielding developers from compiler internals and decoupling individual optimizations from each other. We evaluate our approach with the TPC-H benchmark and show that: (a) with all optimizations enabled, LegoBase significantly outperforms a commercial database and an existing query compiler; (b) programmers need to provide just a few hundred lines of high-level code to implement the optimizations, instead of the complicated low-level code required by existing query compilation approaches; and (c) the compilation overhead is low compared to the overall execution time, making our approach usable in practice for compiling query engines.
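
    To make the generative-programming idea concrete, here is a minimal, self-contained Scala sketch (a hypothetical illustration; the operator algebra, CCompiler, emit() and the lineitem columns are all invented here, not the LegoBase API): a high-level query plan is compiled by emitting specialized C for that fixed plan, so the generated code carries no interpretation overhead.

        // Hypothetical sketch of source-to-source query compilation, not LegoBase's API.
        sealed trait Op
        case class Scan(table: String, cols: Seq[String]) extends Op
        case class Filter(child: Op, pred: String) extends Op
        case class Project(child: Op, cols: Seq[String]) extends Op

        object CCompiler {
          // Emit specialized C for a fixed plan: the plan is "baked in" at
          // generation time instead of being interpreted tuple by tuple.
          def compile(op: Op): String = op match {
            case Scan(t, cols) =>
              s"for (int i = 0; i < ${t}_len; i++) {  /* load ${cols.mkString(", ")} from $t */\n"
            case Filter(child, pred) =>
              compile(child) + s"  if (!($pred)) continue;\n"
            case Project(child, cols) =>
              compile(child) + s"  emit(${cols.mkString(", ")});\n}\n"
          }
        }

        object Demo extends App {
          val plan = Project(
            Filter(Scan("lineitem", Seq("l_price", "l_qty")), "l_qty[i] > 10"),
            Seq("l_price[i]"))
          println(CCompiler.compile(plan))  // prints the specialized C loop
        }

    In this style, an optimization such as a row-to-column layout change becomes a rewrite of the code generator rather than a hand-edit of low-level C.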

    What does inflation really predict?

    If the inflaton potential has multiple minima, as may be expected in, e.g., the string theory "landscape", inflation predicts a probability distribution for the cosmological parameters describing spatial curvature (Omega_tot), dark energy (rho_Lambda, w, etc.), the primordial density fluctuations and primordial gravitational waves. We compute this multivariate probability distribution for various classes of single-field slow-roll models, exploring its dependence on the characteristic inflationary energy scales, the shape of the potential V and the choice of measure underlying the calculation. We find that unless the characteristic scale Delta-phi on which V varies happens to be near the Planck scale, the only aspect of V that matters observationally is the statistical distribution of its peaks and troughs. For all energy scales and plausible measures considered, we obtain the predictions Omega_tot ~ 1 +- 0.00001, w = -1 and rho_Lambda in the observed ballpark but uncomfortably high. The high-energy limit predicts n_s ~ 0.96, dn_s/dlnk ~ -0.0006, r ~ 0.15 and n_t ~ -0.02, consistent with observational data and indistinguishable from eternal phi^2-inflation. The low-energy limit predicts five parameters but prefers larger Q and redder n_s than observed. We discuss the coolness problem, the smoothness problem and the pothole paradox, which severely limit the viable class of models and measures. Our findings bode well for detecting an inflationary gravitational wave signature with future CMB polarization experiments, with the arguably best-motivated single-field models favoring the detectable level r ~ 0.03. (Abridged. Replaced to match accepted JCAP version; improved discussion and references. 42 pages, 17 figures.)
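
    For reference, those high-energy-limit numbers are the textbook slow-roll predictions for V(phi) = (1/2) m^2 phi^2 (standard results quoted for context, not a computation from the paper). With N e-folds of inflation remaining, epsilon = eta = 1/(2N) and

        n_s - 1 = 2\eta - 6\epsilon = -\frac{2}{N}, \qquad r = 16\epsilon = \frac{8}{N}, \qquad n_t = -2\epsilon = -\frac{1}{N},

    so N ~ 50 gives n_s ~ 0.96, r ~ 0.16 and n_t ~ -0.02, matching the quoted values.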

    Judging the Rationality of Decisions in the Presence of Vague Alternatives

    The standard framework of decision theory is subjected to partial revision with regard to the use of the notion of an alternative. An approach to judging the rationality of a decision-maker's behavior is suggested for various cases of incomplete observability and/or controllability of alternatives. The approach stems from the conventional axiomatic treatment of rationality in general choice theory and proceeds by modifying the description of alternative modes of behavior into a generalized model that requires no explicit consideration of alternatives. Criteria of rationality in the generalized decision model are proposed. For the conventional model in choice theory, these criteria reduce to the well-known criteria of regularity (binariness) of choice functions. Game-theoretic and economic examples are considered.
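
    For reference, the binariness (regularity) condition invoked at the end is the standard one (notation assumed here): a choice function C is binary when a single relation generates all of its choices,

        C(A) = \{\, x \in A : x \succeq y \ \text{for all } y \in A \,\}

    for some binary relation \succeq on the set of alternatives.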

    Portfolio Selection in Multidimensional General and Partial Moment Space.

    This paper develops a general approach for the single-period portfolio optimization problem in a multidimensional general and partial moment space. A shortage function is defined that looks for possible increases in odd moments and decreases in even moments. A main result is that this shortage function ensures sufficient conditions for global optimality. It also forms a natural basis for developing tests on the influence of additional moments. Furthermore, a link is made with an approximation of an arbitrary order of a general indirect utility function. This nonparametric efficiency measurement framework permits one to differentiate mainly between portfolio efficiency and allocative efficiency. Finally, information can, in principle, be inferred about the revealed risk aversion, prudence, temperance and other higher-order risk characteristics of investors. Keywords: shortage function, efficient frontier, K-moment portfolios.
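
    A minimal sketch of a shortage function in moment space (notation assumed here; the paper's exact formulation may differ): for a portfolio with moment vector y and a direction g whose entries are positive for odd moments (mean, skewness, ...) and negative for even moments (variance, kurtosis, ...),

        S(y; g) = \sup \{\, \theta \ge 0 : y + \theta g \in \mathcal{F} \,\},

    where \mathcal{F} is the attainable set of moment vectors; S(y; g) = 0 exactly when y already lies on the efficient frontier, and S(y; g) > 0 measures the distance to it.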

    Next-to-leading order predictions for Z gamma+jet and Z gamma gamma final states at the LHC

    We present next-to-leading order predictions for final states containing leptons produced through the decay of a Z boson in association with either a photon and a jet, or a pair of photons. The effect of photon radiation from the final state leptons is included and we also allow for contributions arising from fragmentation processes. Phenomenological studies are presented for the LHC in the case of final states containing charged leptons and in the case of neutrinos. We also use the procedure introduced by Stewart and Tackmann to provide a reliable estimate of the scale uncertainty inherent in our theoretical calculations of jet-binned Z gamma cross sections. These computations have been implemented in the public code MCFM. (30 pages, 10 figures.)
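
    For readers unfamiliar with it, the Stewart-Tackmann prescription treats the scale uncertainties of the inclusive cross sections as uncorrelated, so for the exclusive 0-jet bin sigma_0 = sigma_total - sigma_{>=1 jet} the uncertainty is combined in quadrature:

        \Delta_0^2 = \Delta_{\text{total}}^2 + \Delta_{\ge 1}^2,

    which avoids the artificially small uncertainty that correlated scale variation can produce through cancellation between the two terms.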

    Learning Models over Relational Data using Sparse Tensors and Functional Dependencies

    Integrated solutions for analytics over relational databases are of great practical importance as they avoid the costly repeated loop data scientists have to deal with on a daily basis: select features from data residing in relational databases using feature extraction queries involving joins, projections, and aggregations; export the training dataset defined by such queries; convert this dataset into the format of an external learning tool; and train the desired model using this tool. These integrated solutions are also a fertile ground for theoretically fundamental and challenging problems at the intersection of relational and statistical data models. This article introduces a unified framework for training and evaluating a class of statistical learning models over relational databases. This class includes ridge linear regression, polynomial regression, factorization machines, and principal component analysis. We show that, by synergizing key tools from database theory, such as schema information, query structure, functional dependencies and recent advances in query evaluation algorithms, with tools from linear algebra, such as tensor and matrix operations, one can formulate relational analytics problems and design efficient (query and data) structure-aware algorithms to solve them. This theoretical development informed the design and implementation of the AC/DC system for structure-aware learning. We benchmark the performance of AC/DC against R, MADlib, libFM, and TensorFlow. For typical retail forecasting and advertisement planning applications, AC/DC can learn polynomial regression models and factorization machines with at least the same accuracy as its competitors and up to three orders of magnitude faster, whenever the competitors do not run out of memory, exceed the 24-hour timeout, or encounter internal design limitations. (61 pages, 9 figures, 2 tables.)
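
    To illustrate the structure-aware idea on a toy case (a hedged Scala sketch under assumed toy schemas, not AC/DC code): when a model feature multiplies attributes coming from two relations joined on a key, the aggregate over the join factorizes into per-relation partial sums, so it can be computed without ever materializing the join.

        // Toy illustration: SUM(x1 * x2) over R(k, x1) JOIN S(k, x2),
        // computed without materializing the join result (Scala 2.13+).
        object FactorizedAggregate extends App {
          val R: Seq[(Int, Double)] = Seq((1, 2.0), (1, 3.0), (2, 5.0))
          val S: Seq[(Int, Double)] = Seq((1, 10.0), (2, 1.0), (2, 4.0))

          // Naive: enumerate the join, then aggregate.
          val naive = (for { (k1, x1) <- R; (k2, x2) <- S if k1 == k2 } yield x1 * x2).sum

          // Factorized: push the aggregate past the join. Per join key k,
          // SUM(x1 * x2) = (SUM of x1 in R's group k) * (SUM of x2 in S's group k).
          val rSums = R.groupMapReduce(_._1)(_._2)(_ + _)  // key -> partial sum of x1
          val sSums = S.groupMapReduce(_._1)(_._2)(_ + _)  // key -> partial sum of x2
          val factorized =
            rSums.collect { case (k, r) if sSums.contains(k) => r * sSums(k) }.sum

          println(s"naive = $naive, factorized = $factorized")  // both print 75.0
        }

    The same decomposition underlies the sum-product aggregates needed, e.g., for the gram matrix of ridge regression, which is where exploiting query structure and functional dependencies pays off.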