2,239 research outputs found

    The vicious cycle: fundraising and perceived visibility in US presidential primaries

    Scholars of presidential primaries have long posited a dynamic positive feedback loop between fundraising and electoral success. Yet existing work on both directions of this feedback remains inconclusive and is often explicitly cross-sectional, ignoring the dynamic aspect of the hypothesis. Pairing high-frequency FEC data on contributions and expenditures with Iowa Electronic Markets data on the perceived probability of victory, we examine the bidirectional feedback between contributions and viability. We find robust, significant positive feedback in both directions. This might suggest multiple equilibria: a candidate initially anointed as the front-runner could sustain that status solely through the fundraising advantage it confers, despite possessing no advantage in quality. However, simulations suggest the feedback loop cannot, by itself, sustain an advantage. Given the observed durability of front-runners, it would thus seem that either some other feedback is at work or the process by which the initial front-runner is identified is itself informative of candidate quality (or both).
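
    A minimal sketch of the kind of point such simulations make, using a toy linear two-variable feedback system rather than the paper's estimated model; the function name, alpha, beta, and the decay rate rho are all hypothetical values chosen for illustration.

```python
import numpy as np

# Toy linear feedback between perceived viability v and contributions c.
# NOT the paper's estimated model: alpha, beta and the decay rate rho
# are hypothetical values chosen purely for illustration.
def simulate(v0, c0, alpha=0.3, beta=0.3, rho=0.5, steps=30):
    """Iterate mutual positive feedback with per-period decay rho."""
    v, c = v0, c0
    path = [(v, c)]
    for _ in range(steps):
        v_next = rho * v + alpha * c   # viability fed by past contributions
        c_next = rho * c + beta * v    # contributions fed by past viability
        v, c = v_next, c_next
        path.append((v, c))
    return np.array(path)

# An anointed front-runner starts with a large early lead but no quality
# advantage.  In this toy model the lead is self-sustaining only if the
# feedback is strong enough (roughly, if rho + sqrt(alpha * beta) exceeds 1);
# with the values above, both candidates' leads decay instead.
front_runner = simulate(v0=1.0, c0=1.0)
also_ran = simulate(v0=0.2, c0=0.2)
print(front_runner[-1], also_ran[-1])
```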

    Discrete Scale Invariance and the "Second Black Monday"

    Evidence is offered for log-periodic (in time) fluctuations in the S&P 500 stock index during the three years prior to the October 27, 1997 "correction". These fluctuations were expected on the basis of a discretely scale invariant rupture phenomenology of stock market crashes proposed earlier.
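
    For context, the log-periodic precursor form commonly used in this literature combines a power-law acceleration toward the critical time with oscillations that are periodic in the logarithm of the time remaining before the crash. A minimal sketch of that functional form, with purely illustrative parameter values (not fitted to the S&P 500 series analyzed here):

```python
import numpy as np

# Minimal sketch of the log-periodic price form used in this literature:
#   p(t) = A + B*(tc - t)^m * [1 + C*cos(w*ln(tc - t) + phi)].
# All parameter values below are illustrative, not fitted to market data.
def log_periodic(t, tc, A, B, m, C, w, phi):
    dt = tc - t
    return A + B * dt**m * (1.0 + C * np.cos(w * np.log(dt) + phi))

t = np.linspace(0.0, 2.9, 500)   # time in years, hypothetical crash at tc = 3.0
p = log_periodic(t, tc=3.0, A=950.0, B=-150.0, m=0.45, C=0.1, w=7.5, phi=1.0)
print(p[:3])   # oscillations accelerate as t approaches tc
```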

    Programming of inhomogeneous resonant guided wave networks

    Photonic functions are programmed by designing the interference of local waves in inhomogeneous resonant guided wave networks composed of power-splitting elements arranged at the nodes of a nonuniform waveguide network. Using a compact, yet comprehensive, scattering matrix representation of the network, the desired photonic function is designed by fitting structural parameters according to an optimization procedure. This design scheme is demonstrated for plasmonic dichroic and trichroic routers in the infrared frequency range.
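
    A toy illustration of the general design-by-optimization idea described here (fitting the parameters of a scattering-matrix network model to a target spectral response), using a simple feed-forward chain of lossless 2×2 couplers and phase sections rather than the paper's plasmonic resonant guided wave network; every function name and parameter below is an assumption made for illustration only.

```python
import numpy as np
from scipy.optimize import minimize

# Toy design-by-optimization against a scattering-matrix model.
# This is NOT the paper's resonant guided wave network: it is a simple
# feed-forward chain of lossless 2x2 couplers and phase sections.
def coupler(theta):
    """Lossless 2x2 power-splitting element."""
    return np.array([[np.cos(theta), 1j * np.sin(theta)],
                     [1j * np.sin(theta), np.cos(theta)]])

def phase(phi):
    """Relative phase accumulated between the two arms."""
    return np.diag([np.exp(1j * phi), 1.0])

def transmission(params, k):
    """Power routed to port 0; phases scale with the 'color' parameter k."""
    thetas, phis = params[:3], params[3:]
    field = np.array([1.0 + 0j, 0.0])
    for th, ph in zip(thetas, phis):
        field = coupler(th) @ phase(k * ph) @ field
    return np.abs(field[0])**2

# Dichroic-style target: route color k=1.0 to port 0 and k=1.5 to port 1.
def objective(params):
    return (transmission(params, 1.0) - 1.0)**2 + transmission(params, 1.5)**2

result = minimize(objective, x0=np.full(6, 0.5), method="Nelder-Mead")
print(result.x, objective(result.x))
```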

    Gravitational Analogues of Non-linear Born Electrodynamics

    Gravitational analogues of the nonlinear electrodynamics of Born and of Born and Infeld are introduced and applied to the black hole problem. This work is mainly devoted to the 2-dimensional case in which the relevant Lagrangians are nonpolynomial in the scalar curvature.
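
    For orientation, the Born and Born-Infeld electrodynamics Lagrangians are square-root (hence nonpolynomial) functions of the field invariants; a gravitational analogue of the kind alluded to here replaces the field invariant with the scalar curvature R. The second expression below is only a schematic illustration of "nonpolynomial in R", not the specific Lagrangian studied in the paper:

```latex
% Born's nonlinear electrodynamics (b is the field-strength scale):
\mathcal{L}_{\mathrm{Born}} = b^{2}\left(1-\sqrt{1+\tfrac{1}{2b^{2}}F_{\mu\nu}F^{\mu\nu}}\right)

% A schematic gravitational analogue, nonpolynomial in the scalar curvature R
% (illustrative only; not the paper's Lagrangian):
\mathcal{L}_{\mathrm{grav}} = b^{2}\left(1-\sqrt{1+\tfrac{R}{b^{2}}}\right)
```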

    On Randomized Algorithms for Matching in the Online Preemptive Model

    We investigate the power of randomized algorithms for the maximum cardinality matching (MCM) and the maximum weight matching (MWM) problems in the online preemptive model. In this model, the edges of a graph are revealed one by one and the algorithm is required to always maintain a valid matching. On seeing an edge, the algorithm has to either accept or reject it; if the edge is accepted, any adjacent edges already in the matching are discarded. The complexity of the problem is settled for deterministic algorithms, but almost nothing is known for randomized algorithms: a lower bound of 1.693 is known for MCM, with a trivial upper bound of 2, and an upper bound of 5.356 is known for MWM. We initiate a systematic study of randomized algorithms for these problems, with the aim of isolating and understanding the difficulty. We begin with a primal-dual analysis of the deterministic algorithm due to McGregor. All deterministic lower bounds are on instances which are trees at every step. For this class of (unweighted) graphs we present a randomized algorithm which is 28/15-competitive. The analysis is a considerable extension of the (simple) primal-dual analysis for the deterministic case. The key new technique is that the distribution of primal charge to dual variables depends on the "neighborhood" and needs to be done after having seen the entire input. The assignment is asymmetric, in that an edge may assign different charges to its two endpoints. The proof also depends on a non-trivial structural statement about the performance of the algorithm on the input tree. The other main result of this paper is an extension of the deterministic lower bound of Varadaraja to a natural class of randomized algorithms which decide whether to accept a new edge using independent random choices.
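
    A minimal sketch of the online preemptive model itself, with a McGregor-style deterministic rule for the weighted case (accept a new edge only if it is sufficiently heavier than the matched edges it conflicts with, evicting those edges); the improvement factor GAMMA and all names below are illustrative assumptions, not the constants analyzed in the paper.

```python
# Online preemptive matching: edges arrive one by one, a valid matching is
# maintained at all times, and accepting an edge evicts conflicting matched
# edges. GAMMA is an illustrative improvement threshold, not the paper's
# analyzed constant.
GAMMA = 1.5

def online_preemptive_mwm(edge_stream, gamma=GAMMA):
    """edge_stream yields (u, v, w) triples; returns the final matching."""
    matching = {}  # vertex -> (u, v, w) of the edge currently matching it
    for u, v, w in edge_stream:
        conflicts = {id(e): e for e in (matching.get(u), matching.get(v)) if e}
        if w > gamma * sum(e[2] for e in conflicts.values()):
            for eu, ev, _ in conflicts.values():   # preempt conflicting edges
                matching.pop(eu, None)
                matching.pop(ev, None)
            matching[u] = matching[v] = (u, v, w)
    return set(matching.values())

print(online_preemptive_mwm([(1, 2, 1.0), (2, 3, 2.0), (3, 4, 1.0)]))
# -> {(2, 3, 2.0)}: the middle edge preempts (1, 2) and blocks (3, 4)
```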

    More on A Statistical Analysis of Log-Periodic Precursors to Financial Crashes

    We respond to Sornette and Johansen's criticisms of our findings regarding log-periodic precursors to financial crashes. Included in this paper are discussions of the Sornette-Johansen theoretical paradigm, traditional methods of identifying log-periodic precursors, the behavior of the first differences of a log-periodic price series, and the distribution of drawdowns for a securities price.
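
    As a pointer to the drawdown statistic mentioned here: in this literature a drawdown is usually the cumulative loss over a run of consecutive down-closes. A minimal sketch under that definition (conventions vary, e.g. coarse-grained drawdowns tolerate small counter-moves, so this need not match the paper's exact construction):

```python
import numpy as np

# Drawdowns as runs of consecutive down-closes: each drawdown is the
# cumulative log-loss accumulated until the first up-close ends the run.
def drawdowns(prices):
    prices = np.asarray(prices, dtype=float)
    returns = np.diff(np.log(prices))
    runs, current = [], 0.0
    for r in returns:
        if r < 0:
            current += r           # extend the current run of declines
        elif current < 0:
            runs.append(-current)  # run ended; record its magnitude
            current = 0.0
    if current < 0:
        runs.append(-current)
    return np.array(runs)

print(drawdowns([100, 99, 97, 98, 96, 95, 97]))  # two drawdowns of ~3% each
```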

    Semi-Streaming Set Cover

    This paper studies the set cover problem under the semi-streaming model. The underlying set system is formalized in terms of a hypergraph G = (V, E) whose edges arrive one by one, and the goal is to construct an edge cover F ⊆ E with the objective of minimizing the cardinality (or cost, in the weighted case) of F. We consider a parameterized relaxation of this problem, where given some 0 ≤ ε < 1, the goal is to construct an edge (1 − ε)-cover, namely, a subset of edges incident to all but an ε-fraction of the vertices (or of their benefit, in the weighted case). The key limitation imposed on the algorithm is that its space is limited to (poly)logarithmically many bits per vertex. Our main result is an asymptotically tight trade-off between ε and the approximation ratio: we design a semi-streaming algorithm that, on input graph G, constructs a succinct data structure D such that for every 0 ≤ ε < 1, an edge (1 − ε)-cover that approximates the optimal edge (1 − ε)-cover within a factor of f(ε, n) can be extracted from D (efficiently and with no additional space requirements), where f(ε, n) = O(1/ε) if ε > 1/√n, and f(ε, n) = O(√n) otherwise. In particular, for the traditional set cover problem we obtain an O(√n)-approximation. This algorithm is proved to be best possible by establishing a family (parameterized by ε) of matching lower bounds.
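
    A small sketch of the definitions and the stated trade-off for the unweighted case: a checker for the edge (1 − ε)-cover property, and the approximation factor f(ε, n) up to constant factors; the helper names are assumptions made for illustration.

```python
import math

# Unweighted case of the definitions above: an edge (1 - eps)-cover must
# touch all but an eps-fraction of the vertices, and the achievable
# approximation factor is f(eps, n) (shown here up to constant factors).
def is_partial_cover(vertices, chosen_edges, eps):
    covered = set().union(*chosen_edges) if chosen_edges else set()
    return len(set(vertices) - covered) <= eps * len(set(vertices))

def approx_factor(eps, n):
    """Asymptotic trade-off f(eps, n), ignoring the hidden constants."""
    return 1.0 / eps if eps > 1.0 / math.sqrt(n) else math.sqrt(n)

V = range(10)
edges = [{0, 1, 2, 3}, {4, 5, 6}, {7, 8}]
print(is_partial_cover(V, edges, eps=0.1))   # vertex 9 uncovered: 1/10 <= 0.1 -> True
print(approx_factor(0.25, n=10_000))         # eps > 1/sqrt(n), so 1/eps regime -> 4.0
```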

    Can Retreads Be as Good as New?: Reflections on the Value of Law as a Second Career

    Equivalent Representations of Non-Exponential Discounting Models

    I characterize the entire class of consumption rules for finite-horizon models in which consumption is proportional to lifetime wealth. Any such rule can be obtained from a preference model with CRRA period utility. In a steady state with constant interest rates, a proportional consumption rule can be derived from a model with time-consistent preferences or from a model with possibly time-inconsistent preferences in which a household continually reoptimizes future utility discounted relative to the present instant. These two preference models coincide only in the special case where the discount function is exponential; more generally, there are two distinct yet observationally equivalent preference models. Hyperbolic-like discounting may arise because it is a simpler way for the brain to process a standard exponential discount function after accounting for mortality risk.
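
    One standard way such hyperbolic-like behavior can emerge, not necessarily the paper's mechanism: exponential discounting at rate ρ combined with an uncertain mortality hazard. Averaging exp(−ht) over an exponential prior on the hazard h with mean λ gives 1/(1 + λt), so the implied discount rate declines with horizon. A minimal sketch, with illustrative parameter values:

```python
import numpy as np

# Illustrative mechanism (not necessarily the paper's): exponential
# discounting at rate rho, combined with an uncertain mortality hazard h.
# Averaging exp(-h*t) over an exponential prior with mean lam yields
# 1/(1 + lam*t), so the combined discount function is hyperbolic-like:
# its implied rate rho + lam/(1 + lam*t) declines with horizon t.
rho, lam = 0.03, 0.02
t = np.array([1.0, 5.0, 10.0, 30.0])

discount = np.exp(-rho * t) / (1.0 + lam * t)
implied_rate = rho + lam / (1.0 + lam * t)   # -d ln D(t) / dt
print(discount)
print(implied_rate)   # declines from ~0.0496 toward rho as t grows
```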