2,007 research outputs found

    On Fast and Robust Information Spreading in the Vertex-Congest Model

    Full text link
    This paper initiates the study of the impact of failures on the fundamental problem of \emph{information spreading} in the Vertex-Congest model, in which, in every round, each of the $n$ nodes sends the same $O(\log n)$-bit message to all of its neighbors. Our contribution to coping with failures is twofold. First, we prove that the randomized algorithm which chooses uniformly at random the next message to forward is slow, requiring $\Omega(n/\sqrt{k})$ rounds on some graphs, which we denote by $G_{n,k}$, where $k$ is the vertex-connectivity. Second, we design a randomized algorithm that makes dynamic message choices, with probabilities that change over the execution. We prove that for $G_{n,k}$ it requires only a near-optimal number of $O(n\log^3{n}/k)$ rounds, despite a rate of $q=O(k/n\log^3{n})$ failures per round. Our technique of choosing probabilities that change according to the execution is of independent interest. Comment: Appears in SIROCCO 2015 conference
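
    The uniform-random baseline that the first result concerns can be sketched directly. Below is a minimal, illustrative simulation, not the paper's algorithm or its graph family $G_{n,k}$: each node broadcasts one uniformly chosen known message per round, and nodes crash for a round with an assumed probability `fail_prob`; the regular graph used as a stand-in is also an assumption.

```python
# Minimal sketch of uniform-random forwarding in a round-based, vertex-congest-style
# model. The graph and failure model are illustrative assumptions, not the paper's setup.
import random
import networkx as nx

def uniform_spreading_rounds(G, fail_prob=0.0, rng=random.Random(0)):
    # Each node initially knows only its own message (identified by its node id).
    known = {v: {v} for v in G.nodes}
    n = G.number_of_nodes()
    rounds = 0
    while any(len(known[v]) < n for v in G.nodes):
        rounds += 1
        # A node may fail for this round with probability fail_prob.
        alive = {v for v in G.nodes if rng.random() >= fail_prob}
        sent = {v: rng.choice(sorted(known[v])) for v in alive}
        for v in alive:
            for u in G.neighbors(v):
                if u in alive:
                    known[u].add(sent[v])
    return rounds

if __name__ == "__main__":
    G = nx.random_regular_graph(d=8, n=64, seed=1)   # stand-in graph, not G_{n,k}
    assert nx.is_connected(G)
    print("rounds:", uniform_spreading_rounds(G, fail_prob=0.01))
```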

    Physics of Fashion Fluctuations

    Full text link
    We consider a market where many agents trade many different types of products with each other. We model the development of collective modes in this market, and quantify these by fluctuations that scale with time with a Hurst exponent of about 0.7. We demonstrate that individual products in the model occasionally become globally accepted means of exchange, and simultaneously become very actively traded. Thus collective features similar to money spontaneously emerge, without any a priori reason. Comment: 9 pages RevTeX, 5 Postscript figures
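
    The Hurst exponent quoted above can be estimated with standard rescaled-range (R/S) analysis. The sketch below is not the authors' market model; it only shows the R/S estimator applied to a plain Gaussian increment series (for which the expected exponent is about 0.5), with the synthetic series and window schedule as illustrative assumptions.

```python
# Standard rescaled-range (R/S) estimate of the Hurst exponent.
# Applied here to white-noise increments as a placeholder series.
import numpy as np

def hurst_rs(increments, min_window=8):
    x = np.asarray(increments, dtype=float)
    n = len(x)
    windows, rs_values = [], []
    w = min_window
    while w <= n // 2:
        rs = []
        for start in range(0, n - w + 1, w):
            chunk = x[start:start + w]
            cum = np.cumsum(chunk - chunk.mean())
            r = cum.max() - cum.min()          # range of the cumulative deviation
            s = chunk.std(ddof=0)              # standard deviation of the chunk
            if s > 0:
                rs.append(r / s)
        if rs:
            windows.append(w)
            rs_values.append(np.mean(rs))
        w *= 2
    # The slope of log(R/S) against log(window size) estimates H.
    slope, _ = np.polyfit(np.log(windows), np.log(rs_values), 1)
    return slope

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    print("Hurst estimate:", round(hurst_rs(rng.normal(size=20000)), 3))
```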

    (Quantum) Space-Time as a Statistical Geometry of Fuzzy Lumps and the Connection with Random Metric Spaces

    Get PDF
    We develop a kind of pregeometry consisting of a web of overlapping fuzzy lumps which interact with each other. The individual lumps are understood as certain closely entangled subgraphs (cliques) in a dynamically evolving network which, in a certain approximation, can be visualized as a time-dependent random graph. This strand of ideas is merged with another one, deriving from ideas developed some time ago by Menger et al., namely the concept of probabilistic or random metric spaces, representing a natural extension of the metrical continuum into a more microscopic regime. It is our general goal to find a better adapted geometric environment for the description of microphysics. In this sense one may also view it as a dynamical randomisation of the causal-set framework developed by e.g. Sorkin et al. In doing this we incorporate, as a perhaps new aspect, various concepts from fuzzy set theory. Comment: 25 pages, Latex, no figures, some references added, some minor changes relating to previous work
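
    Purely as an illustration of the "cliques in a time-dependent random graph" picture, the sketch below tracks the largest maximal clique across snapshots of a slowly rewired Erdős-Rényi graph; the graph size, edge probability, and rewiring fraction are assumptions, not quantities taken from the paper.

```python
# Illustrative only: maximal cliques ("lumps") in a slowly rewired random graph.
import random
import networkx as nx

def evolve(G, rewire_fraction=0.05, rng=random.Random(0)):
    """Return a copy of G with a small fraction of edges rewired at random."""
    H = G.copy()
    edges, nodes = list(H.edges), list(H.nodes)
    for u, v in rng.sample(edges, max(1, int(rewire_fraction * len(edges)))):
        H.remove_edge(u, v)
        a, b = rng.sample(nodes, 2)
        H.add_edge(a, b)
    return H

if __name__ == "__main__":
    rng = random.Random(0)
    G = nx.gnp_random_graph(60, 0.15, seed=1)
    for t in range(5):
        largest = max(nx.find_cliques(G), key=len)   # a maximal clique of largest size
        print(f"t={t}: largest clique size {len(largest)}")
        G = evolve(G, rng=rng)
```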

    Effects of aerodynamic interaction between main and tail rotors on helicopter hover performance and noise

    Get PDF
    A model test was conducted to determine the effects of aerodynamic interaction between the main rotor, tail rotor, and vertical fin on helicopter performance and noise in hover out of ground effect. The experimental data were obtained from hover tests performed with a 0.151-scale Model 222 main rotor, tail rotor, and vertical fin. Of primary interest was the effect of the location of the tail rotor with respect to the main rotor. Penalties on main rotor power due to interaction with the tail rotor ranged up to 3%, depending upon tail rotor location and orientation. Penalties on tail rotor power due to fin blockage alone ranged up to 10% for pusher tail rotors and up to 50% for tractor tail rotors. The main rotor wake had only a second-order effect on these tail rotor/fin interactions. Design charts are presented showing the penalties on main rotor power as a function of the relative location of the tail rotor.

    Analysis of weighted networks

    Full text link
    The connections in many networks are not merely binary entities, either present or not, but have associated weights that record their strengths relative to one another. Recent studies of networks have, by and large, steered clear of such weighted networks, which are often perceived as being harder to analyze than their unweighted counterparts. Here we point out that weighted networks can in many cases be analyzed using a simple mapping from a weighted network to an unweighted multigraph, allowing us to apply standard techniques for unweighted graphs to weighted ones as well. We give a number of examples of the method, including an algorithm for detecting community structure in weighted networks and a new and simple proof of the max-flow/min-cut theorem. Comment: 9 pages, 3 figures
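
    The mapping itself is simple to sketch: an edge of integer weight w becomes w parallel edges in a multigraph, so that, for example, multigraph degree reproduces weighted strength, and unweighted techniques apply unchanged. The snippet below is a minimal illustration using networkx; how to handle non-integer weights (e.g. by rescaling) is left as an assumption.

```python
# Minimal sketch of the weighted-network-to-multigraph mapping described above.
import networkx as nx

def weighted_to_multigraph(G):
    M = nx.MultiGraph()
    M.add_nodes_from(G.nodes)
    for u, v, data in G.edges(data=True):
        w = int(data.get("weight", 1))   # assumes integer weights; rescale otherwise
        for _ in range(w):
            M.add_edge(u, v)             # w parallel edges stand in for weight w
    return M

if __name__ == "__main__":
    G = nx.Graph()
    G.add_weighted_edges_from([("a", "b", 3), ("b", "c", 1), ("a", "c", 2)])
    M = weighted_to_multigraph(G)
    # Degree in the multigraph equals strength (sum of weights) in the weighted graph.
    print(dict(M.degree()))   # {'a': 5, 'b': 4, 'c': 3}
```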

    Realizability of the Lorentzian (n,1)-Simplex

    Full text link
    In a previous article [JHEP 1111 (2011) 072; arXiv:1108.4965] we have developed a Lorentzian version of the Quantum Regge Calculus in which the significant differences between simplices in Lorentzian signature and Euclidean signature are crucial. In this article we extend a central result used in the previous article, regarding the realizability of Lorentzian triangles, to arbitrary dimension. This technical step will be crucial for developing the Lorentzian model in the case of most physical interest: 3+1 dimensions. We first state (and derive in an appendix) the realizability conditions on the edge-lengths of a Lorentzian n-simplex in total dimension n=d+1, where d is the number of space-like dimensions. We then show that in any dimension there is a certain type of simplex which has all of its time-like edge lengths completely unconstrained by any sort of triangle inequality. This result is the d+1 dimensional analogue of the 1+1 dimensional case of the Lorentzian triangle. Comment: V1: 15 pages, 2 figures. V2: Minor clarifications added to the Introduction and Discussion sections. 1 reference updated. This version accepted for publication in JHEP. V3: minor updates and clarifications; this version closely corresponds to the version published in JHEP.
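
    For orientation only: the Euclidean analogue of such edge-length realizability conditions is the classical Cayley-Menger criterion. The sketch below computes that determinant; it is not the Lorentzian condition derived in the paper, and the example data are placeholders.

```python
# Euclidean analogue only (not the paper's Lorentzian criterion): for k+1 points with
# squared pairwise distances d2, the Cayley-Menger determinant equals
# (-1)^(k+1) * 2^k * (k!)^2 * V^2, so a non-degenerate Euclidean k-simplex requires
# the determinant to have sign (-1)^(k+1).
import numpy as np

def cayley_menger_det(d2):
    """d2: (k+1)x(k+1) matrix of squared edge lengths, zeros on the diagonal."""
    d2 = np.asarray(d2, dtype=float)
    m = d2.shape[0]
    M = np.ones((m + 1, m + 1))
    M[0, 0] = 0.0
    M[1:, 1:] = d2
    return np.linalg.det(M)

if __name__ == "__main__":
    # Equilateral triangle with unit sides: k = 2, so the determinant should be negative.
    d2 = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
    print(cayley_menger_det(d2))   # -3.0 up to floating-point error, i.e. area sqrt(3)/4
```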

    Making sense of violence risk predictions using clinical notes

    Get PDF
    Violence risk assessment in psychiatric institutions enables interventions to avoid violence incidents. Clinical notes written by practitioners and available in electronic health records (EHR) are valuable resources that are seldom used to their full potential. Previous studies have attempted to assess violence risk in psychiatric patients using such notes, with acceptable performance. However, they do not explain why classification works and how it can be improved. We explore two methods to better understand the quality of a classifier in the context of clinical note analysis: random forests using topic models, and choice of evaluation metric. These methods allow us to understand both our data and our methodology more profoundly, setting up the groundwork for improved models that build upon this understanding. This is particularly important when it comes to the generalizability of evaluated classifiers to new data, a trustworthiness problem that is of great interest due to the increased availability of new data in electronic format.
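
    A minimal sketch of the kind of pipeline mentioned above, not the authors' exact setup: LDA topic proportions extracted from note text feed a random forest, and the model is scored with ROC AUC rather than plain accuracy to highlight the role of the evaluation metric. The toy notes, labels, and parameter values below are placeholders.

```python
# Sketch: topic-model features + random forest for note-based risk classification.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

notes = [
    "patient agitated and shouting at staff",
    "calm during the day, attended group therapy",
    "threatened another patient, required de-escalation",
    "slept well, no incidents reported",
] * 25                       # toy corpus; real notes would come from the EHR
labels = [1, 0, 1, 0] * 25   # 1 = violence incident followed, 0 = none

counts = CountVectorizer(stop_words="english").fit_transform(notes)
topics = LatentDirichletAllocation(n_components=5, random_state=0).fit_transform(counts)

X_train, X_test, y_train, y_test = train_test_split(
    topics, labels, test_size=0.3, stratify=labels, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("ROC AUC:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))
```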

    The Fermat-Torricelli problem in normed planes and spaces

    Full text link
    We investigate the Fermat-Torricelli problem in d-dimensional real normed spaces or Minkowski spaces, mainly for d=2. Our approach is to study the Fermat-Torricelli locus in a geometric way. We present many new results, as well as give an exposition of known results that are scattered in various sources, with proofs for some of them. Together, these results can be considered to be a minitheory of the Fermat-Torricelli problem in Minkowski spaces and especially in Minkowski planes. This demonstrates that substantial results about locational problems valid for all norms can be found using a geometric approach.
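
    Numerically, the Fermat-Torricelli point is simply the minimizer of the sum of distances to the given points. The sketch below, not taken from the paper, uses an l_p norm as a stand-in for a general Minkowski norm and finds the minimizer by direct optimization; the sample points are placeholders.

```python
# Sketch: Fermat-Torricelli point under an l_p norm via direct minimization.
import numpy as np
from scipy.optimize import minimize

def fermat_torricelli_point(points, p=2.0):
    points = np.asarray(points, dtype=float)
    # Objective: sum of l_p distances from x to the given points.
    cost = lambda x: np.sum(np.linalg.norm(points - x, ord=p, axis=1))
    res = minimize(cost, points.mean(axis=0), method="Nelder-Mead")
    return res.x

if __name__ == "__main__":
    pts = [(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)]
    print("Euclidean norm:", fermat_torricelli_point(pts, p=2.0))
    print("l_1 norm:      ", fermat_torricelli_point(pts, p=1.0))
```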

    Synchronization Model for Stock Market Asymmetry

    Full text link
    The waiting time needed for a stock market index to undergo a given percentage change in its value is found to have an up-down asymmetry, which, surprisingly, is not observed for the individual stocks composing that index. To explain this, we introduce a market model consisting of randomly fluctuating stocks that occasionally synchronize their short-term draw-downs. These synchronous events are parameterized by a "fear factor" that reflects the occurrence of dramatic external events which affect the financial market. Comment: 4 pages, 4 figures
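
    A minimal sketch of this kind of model, with parameters that are assumptions rather than the paper's: each stock follows an independent random walk in log-price, but with a small "fear factor" probability per step all stocks take a synchronized negative shock. The asymmetry is summarized here by return skewness rather than the paper's waiting-time statistic; the index retains the skew while a single stock's idiosyncratic noise largely washes it out.

```python
# Sketch: synchronized draw-downs produce index asymmetry that single stocks lack.
import numpy as np

def simulate(n_stocks=100, n_steps=50000, fear=0.02, drop=0.02, sigma=0.01, seed=0):
    rng = np.random.default_rng(seed)
    shocks = rng.normal(0.0, sigma, size=(n_steps, n_stocks))
    panic = rng.random(n_steps) < fear            # synchronized draw-down steps
    shocks[panic] -= drop                         # all stocks drop together
    log_prices = np.cumsum(shocks, axis=0)
    index = log_prices.mean(axis=1)               # equal-weight index in log scale
    return np.diff(index), np.diff(log_prices[:, 0])

def skew(x):
    x = x - x.mean()
    return np.mean(x**3) / np.mean(x**2) ** 1.5

index_ret, stock_ret = simulate()
print("index skew:", round(skew(index_ret), 3),
      " single-stock skew:", round(skew(stock_ret), 3))
```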