
    Leverage Causes Fat Tails and Clustered Volatility

    We build a simple model of leveraged asset purchases with margin calls. Investment funds use what is perhaps the most basic financial strategy, called "value investing", i.e. systematically attempting to buy underpriced assets. When funds do not borrow, the price fluctuations of the asset are normally distributed and uncorrelated across time. All this changes when the funds are allowed to leverage, i.e. borrow from a bank, to purchase more assets than their wealth would otherwise permit. During good times competition drives investors to funds that use more leverage, because they have higher profits. As leverage increases, price fluctuations become heavy-tailed and display clustered volatility, similar to what is observed in real markets. Previous explanations of fat tails and clustered volatility depended on "irrational behavior", such as trend following. Here instead this comes from the fact that leverage limits cause funds to sell into a falling market: A prudent bank makes itself locally safer by putting a limit on leverage, so when a fund exceeds its leverage limit, it must partially repay its loan by selling the asset. Unfortunately this sometimes happens to all the funds simultaneously when the price is already falling. The resulting nonlinear feedback amplifies large downward price movements. At the extreme this causes crashes, but the effect is seen at every time scale, producing a power law of price disturbances. A standard (supposedly more sophisticated) risk control policy in which individual banks base leverage limits on volatility causes leverage to rise during periods of low volatility, and to contract more quickly when volatility gets high, making these extreme fluctuations even worse. Comment: 19 pages, 8 figures
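    The forced-selling feedback described above can be sketched as a toy simulation. Everything below (the parameter values, the linear price-impact rule, the wealth reset for wiped-out funds) is an illustrative assumption, not the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(42)

V = 1.0        # perceived fundamental value of the asset
LAM = 5.0      # bank-imposed leverage limit (assumed)
BETA = 50.0    # aggressiveness of the value-investing fund (assumed)
IMPACT = 0.1   # linear price impact of the fund's trading (assumed)
STEPS = 50_000

p = 1.0        # asset price
w = 0.05       # fund wealth
shares = 0.0   # fund position
rets = []

for _ in range(STEPS):
    p_old = p
    # exogenous noise-trader shock to the price
    p = max(p * np.exp(rng.normal(0.0, 0.005)), 1e-6)

    # mark the fund to market; a wiped-out fund is replaced
    w += shares * (p - p_old)
    if w <= 0.0:
        w, shares = 0.05, 0.0

    # value investing: target exposure grows with mispricing,
    # but is capped at LAM * wealth (the margin constraint)
    target = min(BETA * max(V - p, 0.0), LAM) * w / p
    trade = target - shares          # negative trade = forced selling
    # the fund's own trading moves the price, so deleveraging
    # into a falling market pushes the price down further
    p = max(p * (1.0 + IMPACT * trade), 1e-6)
    shares = target
    rets.append(np.log(p / p_old))

rets = np.asarray(rets)
kurt = ((rets - rets.mean()) ** 4).mean() / rets.var() ** 2 - 3.0
```

    The key line is the cap `LAM * w / p`: when the price falls, wealth falls, the cap tightens, and the fund must sell, amplifying the move.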

    Scaling-violation phenomena and fractality in the human posture control systems

    By analyzing the movements of persons during quiet standing by means of wavelet statistics, we observe multiple scaling regions in the underlying body dynamics. The use of the wavelet-variance function opens the possibility of relating scaling violations to different modes of posture control. We show that scaling behavior becomes close to perfect when correctional movements are dominated by the vestibular system. Comment: 12 pages, 4 figures, to appear in Phys. Rev.
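    The wavelet-variance analysis can be illustrated on a synthetic signal. The Haar wavelet below is a stand-in for whatever wavelet family the authors used, and the uncorrelated random walk is a stand-in for a centre-of-pressure recording; for such a signal the wavelet variance should scale as scale^2, i.e. a single perfect scaling region of slope 2 in a log-log plot:

```python
import numpy as np

def haar_wavelet_variance(x, scales):
    """Variance of (normalised) discrete Haar wavelet coefficients
    of the signal x at each dyadic scale."""
    out = []
    for s in scales:
        n = (len(x) // (2 * s)) * 2 * s
        means = x[:n].reshape(-1, s).mean(axis=1)           # block means
        d = np.sqrt(s / 2.0) * (means[0::2] - means[1::2])  # Haar detail
        out.append(d.var())
    return np.array(out)

rng = np.random.default_rng(1)
walk = np.cumsum(rng.normal(size=2**17))   # stand-in for a COP trajectory

scales = np.array([2, 4, 8, 16, 32, 64, 128, 256])
wv = haar_wavelet_variance(walk, scales)
slope = np.polyfit(np.log2(scales), np.log2(wv), 1)[0]   # ≈ 2 for a walk
```

    A real posture recording would show breaks in this line (scaling violations), which the wavelet-variance function localises in scale.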

    Inhomogeneous cloud coverage through the Coulomb explosion of dust in substellar atmospheres

    Context. Recent observations of brown dwarf spectroscopic variability in the infrared infer the presence of patchy cloud cover. Aims. This paper proposes a mechanism for producing inhomogeneous cloud coverage due to the depletion of cloud particles through the Coulomb explosion of dust in atmospheric plasma regions. Charged dust grains Coulomb-explode when the electrostatic stress of the grain exceeds its mechanical tensile stress, which results in grains below a critical radius a < a_crit^Coul being broken up. Methods. This work outlines the criteria required for the Coulomb explosion of dust clouds in substellar atmospheres, the effect on the dust particle size distribution function, and the resulting radiative properties of the atmospheric regions. Results. Our results show that for an atmospheric plasma region with an electron temperature of T_e = 10 eV (≈ 10^5 K), the critical grain radius varies from 10^-7 to 10^-4 cm, depending on the grains' tensile strength. Higher critical radii up to 10^-3 cm are attainable for higher electron temperatures. We find that the process produces a bimodal particle size distribution composed of stable nanoscale seed particles and dust particles with a ≥ a_crit^Coul, with the intervening particle sizes defining a region devoid of dust. As a result, the dust population is depleted, and the clouds become optically thin in the wavelength range 0.1-10 μm, with a characteristic peak that shifts to longer wavelengths as more sub-micrometer particles are destroyed. Conclusions. In an atmosphere populated with a distribution of plasma volumes, this will yield regions of contrasting radiative properties, thereby giving a source of inhomogeneous cloud coverage. The results presented here may also be relevant for dust in supernova remnants and protoplanetary disks.
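    One common form of the Coulomb-explosion criterion balances the electrostatic surface stress eps0*phi^2/(2 a^2) of a charged sphere against its tensile strength. The surface potential phi = 20 V used below is an illustrative guess of order a few k*T_e/e for T_e = 10 eV, not the paper's value; with tensile strengths between 10^3 and 10^9 Pa this toy criterion lands in the quoted 10^-7 to 10^-4 cm range:

```python
import math

EPS0 = 8.854e-12   # vacuum permittivity, F/m

def critical_radius(phi_surface, tensile_strength):
    """Radius (m) below which the electrostatic stress eps0*phi^2/(2 a^2)
    exceeds the grain's tensile strength, so the grain Coulomb-explodes."""
    return phi_surface * math.sqrt(EPS0 / (2.0 * tensile_strength))

phi = 20.0   # assumed surface potential in volts for T_e = 10 eV
for sigma in (1e3, 1e6, 1e9):                  # tensile strength in Pa
    a_cm = critical_radius(phi, sigma) * 100.0
    print(f"sigma = {sigma:8.0e} Pa  ->  a_crit ~ {a_cm:.1e} cm")
```

    Weaker (fluffier) grains thus have much larger critical radii, which is why the quoted range spans three decades.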

    On the robustness of q-expectation values and Renyi entropy

    We study the robustness of functionals of probability distributions, such as the Rényi and nonadditive S_q entropies, as well as the q-expectation values, under small variations of the distributions. We focus on three important types of distribution functions, namely (i) continuous and bounded, (ii) discrete with a finite number of states, and (iii) discrete with an infinite number of states. The physical concept of robustness is contrasted with the mathematically stronger conditions of stability and Lesche-stability for functionals. We explicitly demonstrate that, in the case of continuous distributions, once unbounded distributions and those leading to negative entropy are excluded, both the Rényi and nonadditive S_q entropies as well as the q-expectation values are robust. For the discrete finite case, the Rényi and nonadditive S_q entropies and the q-expectation values are robust. For the infinite discrete case, where the Rényi entropy and the q-expectations are known to violate Lesche-stability and stability, respectively, we show that one can nevertheless state conditions which guarantee physical robustness. Comment: 6 pages, to appear in Europhys. Lett.
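    The discrete finite case can be checked numerically: a tiny perturbation of the distribution should produce a tiny change in both functionals. The number of states, the perturbation size, and q = 0.7 are arbitrary illustrative choices:

```python
import numpy as np

def renyi(p, q):
    """Rényi entropy S_q^R = ln(sum p^q) / (1 - q)."""
    return np.log(np.sum(p ** q)) / (1.0 - q)

def tsallis(p, q):
    """Nonadditive (Tsallis) entropy S_q = (1 - sum p^q) / (q - 1)."""
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

rng = np.random.default_rng(0)
p = rng.dirichlet(np.ones(50))            # finite discrete distribution

delta = 1e-6 * rng.normal(size=50)        # small perturbation ...
delta -= delta.mean()                     # ... that preserves normalisation
p2 = np.clip(p + delta, 1e-12, None)
p2 /= p2.sum()

q = 0.7
dR = abs(renyi(p2, q) - renyi(p, q))      # both changes stay small,
dS = abs(tsallis(p2, q) - tsallis(p, q))  # illustrating robustness
```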

    Parkinson's Law Quantified: Three Investigations on Bureaucratic Inefficiency

    We recast three famous descriptive essays of C. N. Parkinson on bureaucratic inefficiency in a quantifiable and dynamical socio-physical framework. In the first model we show how recent opinion-formation models for small groups can be used to understand Parkinson's observation that decision-making bodies such as cabinets or boards become highly inefficient once their size exceeds a critical 'Coefficient of Inefficiency', typically around 20. A second observation of Parkinson - sometimes referred to as Parkinson's Law - is that the growth of bureaucratic or administrative bodies usually goes hand in hand with a drastic decrease in their overall efficiency. In our second model we view a bureaucratic body as a system through which workers flow: they enter, are promoted to various internal levels over time, and leave the system after having served for a certain time. Promotion is usually associated with an increase in the number of subordinates. Within the proposed model it becomes possible to work out a phase diagram of the conditions under which bureaucratic growth can be confined. In our last model we assign individual efficiency curves to workers throughout their life in administration, and compute the optimum time to send them to old-age pension in order to ensure a maximum of efficiency within the body - in Parkinson's words, we compute the 'Pension Point'. Comment: 15 pages, 5 figures
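    The 'Pension Point' idea, retiring a worker at the moment that maximises the average efficiency delivered over the whole career, can be sketched with a toy efficiency curve. The rise-then-decline curve and its timescale below are invented for illustration and are not the authors' functional form:

```python
import numpy as np

T = 15.0                              # assumed learning timescale, years
t = np.linspace(0.01, 60.0, 6000)     # years of service
eff = t * np.exp(-t / T)              # toy efficiency: rises, peaks at T, declines

# average efficiency delivered by a worker who retires after t years
avg = np.cumsum(eff) / np.arange(1, t.size + 1)
pension_point = t[np.argmax(avg)]     # the average peaks when the marginal
                                      # efficiency drops to the career average
```

    Note that the optimum falls well after the peak of the individual curve: it pays to keep a worker while their current efficiency still exceeds their career average.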

    An extended formalism for preferential attachment in heterogeneous complex networks

    In this paper we present a framework for the extension of the preferential attachment (PA) model to heterogeneous complex networks. We define a class of heterogeneous PA models, where node properties are described by fixed states in an arbitrary metric space, and introduce an affinity function that biases the attachment probabilities of links. We perform an analytical study of the stationary degree distributions in heterogeneous PA networks. We show that their degree densities exhibit a richer scaling behavior than their homogeneous counterparts, and that the power-law scaling in the degree distribution is robust in the presence of heterogeneity.
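    A minimal sketch of such a heterogeneous PA process, with scalar node states on [0, 1] and an exponentially decaying affinity (both illustrative choices, not the paper's general formalism):

```python
import numpy as np

rng = np.random.default_rng(3)

def grow(n, affinity):
    """Heterogeneous preferential attachment: each new node j links to an
    existing node i with probability ∝ degree(i) * affinity(x_i, x_j)."""
    x = rng.uniform(0.0, 1.0, n)      # fixed node states in a metric space
    deg = np.zeros(n, dtype=int)
    deg[0] = deg[1] = 1               # seed graph: one edge between nodes 0, 1
    for j in range(2, n):
        w = deg[:j] * affinity(x[:j], x[j])
        i = rng.choice(j, p=w / w.sum())
        deg[i] += 1
        deg[j] += 1
    return deg

# affinity decaying with the distance between node states (assumed form)
deg = grow(5000, lambda xi, xj: np.exp(-np.abs(xi - xj)))
```

    Setting the affinity to a constant recovers the homogeneous Barabási-Albert case; a heavy-tailed degree sequence (large hubs) survives the heterogeneity, in line with the robustness result.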

    Statistical mechanics of scale-free networks at a critical point: Complexity without irreversibility?

    Based on a rigorous extension of classical statistical mechanics to networks, we study a specific microscopic network Hamiltonian. The form of this Hamiltonian is derived from the assumption that individual nodes increase/decrease their utility by linking to nodes with a higher/lower degree than their own. We interpret utility as an equivalent to energy in physical systems and discuss the temperature dependence of the emerging networks. We observe the existence of a critical temperature T_c where total energy (utility) and network architecture undergo radical changes. Along this topological transition we obtain scale-free networks with complex hierarchical topology. In contrast to models for scale-free networks introduced so far, the scale-free nature emerges within equilibrium, with a clearly defined microcanonical ensemble and the principle of detailed balance strictly fulfilled. This provides clear evidence that 'complex' networks may arise without irreversibility. The results presented here should find a wide variety of applications in socio-economic statistical systems. Comment: 4 pages, 5 figures
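    A toy version of such an equilibrium network model can be simulated with Metropolis link moves at temperature T. The energy below, -|k_i - k_j| summed over links (rewarding connections between unequal degrees), is an invented stand-in for the paper's utility-based Hamiltonian, and holding the link count fixed only loosely mimics its microcanonical setting:

```python
import numpy as np

rng = np.random.default_rng(7)
N, M, TEMP = 60, 120, 0.1       # nodes, links (held fixed), temperature

def energy(adj):
    """Toy Hamiltonian: each link (i, j) contributes -|k_i - k_j|,
    i.e. energy is the negative total 'utility' of the network."""
    k = adj.sum(axis=1)
    i, j = np.nonzero(np.triu(adj, 1))
    return -float(np.abs(k[i] - k[j]).sum())

# random initial graph with exactly M links
adj = np.zeros((N, N), dtype=int)
pairs = [(i, j) for i in range(N) for j in range(i + 1, N)]
for t in rng.choice(len(pairs), M, replace=False):
    i, j = pairs[t]
    adj[i, j] = adj[j, i] = 1

E = E0 = energy(adj)
for _ in range(20_000):
    ones = np.argwhere(np.triu(adj, 1))          # existing links
    zeros = np.argwhere(np.triu(adj == 0, 1))    # absent links
    a, b = ones[rng.integers(len(ones))]
    c, d = zeros[rng.integers(len(zeros))]
    adj[a, b] = adj[b, a] = 0                    # propose moving one link
    adj[c, d] = adj[d, c] = 1
    E_new = energy(adj)
    if E_new <= E or rng.random() < np.exp((E - E_new) / TEMP):
        E = E_new                                # accept (Metropolis rule)
    else:
        adj[a, b] = adj[b, a] = 1                # reject: restore old link
        adj[c, d] = adj[d, c] = 0
```

    At low temperature the dynamics condenses links onto a few high-degree hubs, a crude analogue of the hierarchical structures reported below T_c.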

    Transport on complex networks: Flow, jamming and optimization

    Many transport processes on networks depend crucially on the underlying network geometry, although the exact relationship between the structure of the network and the properties of transport processes remains elusive. In this paper we address this question by using numerical models in which both structure and dynamics are controlled systematically. We consider the traffic of information packets, including the processes of driving, searching, and queuing. We present the results of extensive simulations on two classes of networks: a correlated cyclic scale-free network and an uncorrelated homogeneous weakly clustered network. By measuring different dynamical variables in the free flow regime we show how the global statistical properties of the transport are related to the temporal fluctuations at individual nodes (the traffic noise) and the links (the traffic flow). We then demonstrate that these two network classes appear as representative topologies for optimal traffic flow in the regimes of low density and high density traffic, respectively. We also determine statistical indicators of the pre-jamming regime on different network geometries and discuss the role of queuing and dynamical betweenness for the traffic congestion. The transition to the jammed traffic regime at a critical posting rate on different network topologies is studied as a phase transition with an appropriate order parameter. We also address several open theoretical problems related to the network dynamics.
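    The free-flow versus jamming transition at a critical posting rate can be illustrated with a deliberately crude model: random-walk routing on a complete graph with FIFO queues and one forwarded packet per node per step. None of this matches the paper's topologies or routing rules; it only shows the generic queue behaviour below and above the critical rate:

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(5)
N, STEPS = 20, 500

def run(post_rate):
    """Return the total number of queued packets after STEPS steps."""
    queues = [deque() for _ in range(N)]
    for _ in range(STEPS):
        # posting: each node creates a packet with probability post_rate
        for i in range(N):
            if rng.random() < post_rate:
                dest = (i + rng.integers(1, N)) % N   # random other node
                queues[i].append(dest)
        # forwarding: every node moves its head-of-queue packet one hop
        moves = []
        for i in range(N):
            if queues[i]:
                dest = queues[i].popleft()
                nxt = (i + rng.integers(1, N)) % N    # random-walk routing
                if nxt != dest:                       # not delivered yet
                    moves.append((nxt, dest))
        for nxt, dest in moves:
            queues[nxt].append(dest)
    return sum(len(q) for q in queues)

free = run(0.01)   # below the critical rate: deliveries keep up
jam = run(0.20)    # above it: queues grow roughly linearly in time
```

    The total queue length (or its growth rate) plays the role of the order parameter: bounded in the free-flow phase, extensive in the jammed phase.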

    Schumpeterian economic dynamics as a quantifiable minimum model of evolution

    We propose a simple quantitative model of Schumpeterian economic dynamics. New goods and services are endogenously produced through combinations of existing goods. As soon as new goods enter the market they may compete against already existing goods; in other words, new products can have destructive effects on existing goods. As a result of this competition mechanism existing goods may be driven out of the market - often causing cascades of secondary defects (Schumpeterian gales of destruction). The model leads to a generic dynamics characterized by phases of relative economic stability followed by phases of massive restructuring of markets - which could be interpreted as Schumpeterian business 'cycles'. Model time series of product diversity and productivity reproduce several stylized facts of economic time series on long timescales, such as GDP or business failures, including non-Gaussian fat-tailed distributions, volatility clustering, etc. The model is phrased in an open, non-equilibrium setup which can be understood as a self-organized critical system. Its diversity dynamics can be understood in terms of the time-varying topology of the active production networks. Comment: 21 pages, 11 figures
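    The combinatorial production/destruction mechanism can be sketched as a boolean dynamical system: random rules let pairs of active goods produce or suppress a third good. Rule counts, densities, and the innovation rate below are arbitrary illustrative choices, not the paper's calibration:

```python
import numpy as np

rng = np.random.default_rng(9)
N, RULES, STEPS = 100, 200, 400

# random catalytic rules, degenerate combinations allowed for simplicity:
prod = rng.integers(0, N, size=(RULES, 3))   # goods (i, j) together produce k
dest = rng.integers(0, N, size=(RULES, 3))   # goods (i, j) together suppress k

x = np.zeros(N, dtype=bool)
x[rng.choice(N, 10, replace=False)] = True   # initial set of active goods
diversity = []
for _ in range(STEPS):
    new = x.copy()
    produced = prod[x[prod[:, 0]] & x[prod[:, 1]], 2]
    destroyed = dest[x[dest[:, 0]] & x[dest[:, 1]], 2]
    new[produced] = True
    new[destroyed] = False       # destruction applied last, so it wins ties
    # rare spontaneous innovation keeps the system open / non-equilibrium
    if rng.random() < 0.1:
        new[rng.integers(N)] = True
    x = new
    diversity.append(int(x.sum()))
```

    The `diversity` series shows plateaus punctuated by restructuring events, when a destruction cascade removes goods whose outputs then lose their own inputs.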