1,387 research outputs found

    The Futility of Utility: how market dynamics marginalize Adam Smith

    Econometrics is based on the nonempirical notion of utility. Prices, dynamics, and market equilibria are supposed to be derived from utility. Utility is usually treated by economists as a price potential; at other times, utility rates are treated as Lagrangians. Assumptions of integrability of Lagrangians and dynamics are made implicitly and uncritically. In particular, economists assume that price is the gradient of utility in equilibrium, but I show that price as the gradient of utility is an integrability condition for the Hamiltonian dynamics of an optimization problem in econometric control theory. One consequence is that, in a nonintegrable dynamical system, price cannot be expressed as a function of demand or supply variables. Another consequence is that utility maximization does not describe equilibrium. I point out that the maximization of Gibbs entropy would describe equilibrium, if equilibrium could be achieved, but equilibrium does not describe real markets. To emphasize the inconsistency of the economists' notion of 'equilibrium', I discuss both deterministic and stochastic dynamics of excess demand and observe that Adam Smith's stabilizing hand is not to be found in deterministic or stochastic dynamical models of markets, nor in the observed motions of asset prices. Evidence for stability of asset prices in free markets simply has not been found. Comment: 46 pages.

    Optimal leverage from non-ergodicity

    In modern portfolio theory, the balancing of expected returns on investments against uncertainties in those returns is aided by the use of utility functions. The Kelly criterion offers another approach, rooted in information theory, that always implies logarithmic utility. From their respective perspectives, the two approaches seem incompatible, constraining investors' risk preferences either too loosely or too tightly. The conflict can be understood on the basis that the multiplicative models used in both approaches are non-ergodic, which leads to ensemble-average returns differing from time-average returns in single realizations. The classic treatments, from the very beginning of probability theory, use ensemble averages, whereas the Kelly result is obtained by considering time averages. Maximizing the time-average growth rate for an investment defines an optimal leverage, whereas growth rates derived from ensemble-average returns depend linearly on leverage. The latter measure can thus incentivize investors to maximize leverage, which is detrimental to time-average growth and overall market stability. The Sharpe ratio is insensitive to leverage; its relation to optimal leverage is discussed. A better understanding of the significance of time irreversibility and non-ergodicity, and of the resulting bounds on leverage, may help policy makers in reshaping financial risk controls. Comment: 17 pages, 3 figures. Updated figures and extended discussion of ergodicity.
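    The central distinction in this abstract, that ensemble-average and time-average growth rates of a multiplicative process can disagree, can be shown with a minimal sketch (the payoff numbers below are illustrative choices, not taken from the paper):

    ```python
    def ensemble_average_multiplier(up=1.5, down=0.6, p=0.5):
        # Expected per-round wealth multiplier, averaged over the ensemble
        # of outcomes: 0.5*1.5 + 0.5*0.6 = 1.05, i.e. apparent growth.
        return p * up + (1 - p) * down

    def time_average_multiplier(up=1.5, down=0.6, p=0.5):
        # Per-round growth factor experienced by a single trajectory over
        # long times: the geometric mean, sqrt(1.5*0.6) ~ 0.949, i.e. decay.
        return up ** p * down ** (1 - p)
    ```

    The gap between 1.05 and roughly 0.949 is the non-ergodicity the paper exploits: maximizing the ensemble average rewards leverage that destroys time-average growth.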

    Measuring time preferences

    We review research that measures time preferences, i.e., preferences over intertemporal tradeoffs. We distinguish between studies using financial flows, which we call “money earlier or later” (MEL) decisions, and studies that use time-dated consumption or effort. Under different structural models, we show how to translate what MEL experiments directly measure (required rates of return for financial flows) into a discount function over utils. We summarize empirical regularities found in MEL studies and the predictive power of those studies. We explain why MEL choices are driven in part by factors that are distinct from underlying time preferences. Funded by the National Institutes of Health (NIA R01AG021650 and P01AG005842) and the Pershing Square Fund for Research in the Foundations of Human Behavior.
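    As a minimal sketch of the first translation step the abstract describes (the function names and the continuous-compounding assumption are mine, not the paper's), an MEL choice revealing indifference between an amount now and a larger amount later pins down a required rate of return:

    ```python
    import math

    def implied_rate(x_now, y_later, t_years):
        # Rate r such that x_now = y_later * exp(-r * t_years): the
        # required rate of return revealed by indifference in an MEL choice.
        return math.log(y_later / x_now) / t_years

    def discount_factor(r, t_years):
        # Exponential discounting over money; translating this into a
        # discount function over utils needs further structural assumptions,
        # which is the review's point.
        return math.exp(-r * t_years)
    ```

    For example, indifference between $100 now and $110 in one year implies a rate of about 0.095 per year under this assumed functional form.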

    Classes of fast and specific search mechanisms for proteins on DNA

    Problems of search and recognition appear over different scales in biological systems. In this review we focus on the challenges posed by interactions between proteins, in particular transcription factors, and DNA, and on possible mechanisms that allow for fast and selective target location. We first argue that DNA-binding proteins can be classified, broadly, into three distinct classes, which we illustrate using experimental data. Each class calls for a different search process, and we discuss how the search mechanisms proposed over the years might apply to each class. The main thrust of this review is a new mechanism based on barrier discrimination. We introduce the model and analyze its consequences in detail. It is shown that this mechanism applies to all classes of transcription factors and can lead to a fast and specific search. Moreover, it is shown that the mechanism has interesting transient features which allow for stability at the target despite rapid binding and unbinding of the transcription factor from the target. Comment: 65 pages, 23 figures.
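    For context, the classic facilitated-diffusion estimate that much of this search literature builds on can be sketched as follows (a textbook-style approximation of the standard Berg-von Hippel picture, not the barrier-discrimination mechanism this review introduces):

    ```python
    def total_search_time(genome_length_bp, sliding_length_bp, tau_1d, tau_3d):
        # The protein alternates 1D sliding rounds, each scanning about
        # sliding_length_bp base pairs in time tau_1d, with 3D excursions of
        # duration tau_3d. On average roughly genome_length_bp / sliding_length_bp
        # rounds are needed before the target is found.
        rounds = genome_length_bp / sliding_length_bp
        return rounds * (tau_1d + tau_3d)
    ```

    In this approximation the search time is minimized when the time is split evenly between sliding and 3D excursions, which is why an optimal sliding length exists.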

    Fundamental Framework for Technical Analysis

    Starting from the characterization of the past time evolution of market prices in terms of two fundamental indicators, price velocity and price acceleration, we construct a general classification of the possible patterns characterizing deviations or defects from the random-walk market state and its time-translational invariant properties. The classification relies on two dimensionless parameters: the Froude number, characterizing the relative strength of the acceleration with respect to the velocity, and the forecast time horizon nondimensionalized by the training period. Trend-following and contrarian patterns are found to coexist and depend on the dimensionless time horizon. The classification is based on the symmetry requirements of invariance with respect to change of price units and of functional scale invariance in the space of scenarii. This "renormalized scenario" approach is fundamentally probabilistic in nature and exemplifies the view that multiple competing scenarii have to be taken into account for the same past history. Empirical tests are performed on about nine to thirty years of daily returns of twelve data sets comprising some major indices (Dow Jones, S&P 500, Nasdaq, DAX, FTSE, Nikkei), some major bonds (JGB, TYX) and some major currencies against the US dollar (GBP, CHF, DEM, JPY). Our "renormalized scenario" approach exhibits statistically significant predictive power in essentially all market phases. In contrast, a trend-following strategy and a trend + acceleration following strategy each perform well only in different and specific market phases. The value of the "renormalized scenario" approach lies in the fact that it always finds the better of the two, based on a calculation of the stability of their predicted market trajectories. Comment: LaTeX, 27 pages.
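    The two indicators the framework starts from have straightforward finite-difference estimators; the dimensionless combination below is one plausible reading of the paper's Froude number, offered as an assumption for illustration, not necessarily the paper's exact definition:

    ```python
    def velocity_and_acceleration(prices):
        # First and second finite differences of the price series at the
        # most recent time step: the two fundamental indicators.
        v = prices[-1] - prices[-2]
        a = prices[-1] - 2.0 * prices[-2] + prices[-3]
        return v, a

    def froude_number(prices, horizon):
        # Assumed dimensionless form: strength of the acceleration
        # contribution over the forecast horizon relative to the velocity.
        v, a = velocity_and_acceleration(prices)
        return a * horizon / v
    ```

    With prices 1, 2, 4 the velocity is 2 and the acceleration 1, so the ratio grows linearly with the chosen horizon, consistent with the paper's emphasis on horizon-dependent classification.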

    Evolutionary games on graphs

    Game theory is one of the key paradigms behind many scientific disciplines, from biology to the behavioral sciences to economics. In its evolutionary form, and especially when the interacting agents are linked in a specific social network, the underlying solution concepts and methods are very similar to those applied in non-equilibrium statistical physics. This review gives a tutorial-type overview of the field for physicists. The first three sections introduce the necessary background in classical and evolutionary game theory, from the basic definitions to the most important results. The fourth section surveys the topological complications implied by non-mean-field-type social network structures in general. The last three sections discuss in detail the dynamic behavior of three prominent classes of models: the Prisoner's Dilemma, the Rock-Scissors-Paper game, and Competing Associations. The major theme of the review is in what sense and how the graph structure of interactions can modify and enrich the picture of long-term behavioral patterns emerging in evolutionary games. Comment: Review, final version, 133 pages, 65 figures.
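    A minimal sketch of the kind of model the last sections survey: a Prisoner's Dilemma on a ring graph with synchronous "imitate-the-best" updating (the payoff values and update rule here are illustrative choices, not the review's specific parametrization):

    ```python
    B, C = 2.0, 1.0  # donation-game payoffs: benefit B > cost C > 0

    def payoff(me, other):
        # me, other: 1 = cooperate, 0 = defect
        return B * other - C * me

    def step(strategies):
        # One synchronous round on a ring: each agent plays both neighbours,
        # then copies the strategy of the best-scoring agent among itself
        # and its two neighbours ("imitate the best").
        n = len(strategies)
        scores = [payoff(strategies[i], strategies[(i - 1) % n])
                  + payoff(strategies[i], strategies[(i + 1) % n])
                  for i in range(n)]
        return [strategies[max(((i - 1) % n, i, (i + 1) % n),
                               key=lambda j: scores[j])]
                for i in range(n)]
    ```

    Iterating step shows how cooperative clusters persist or shrink depending on the benefit-to-cost ratio, the basic network-reciprocity effect the review discusses.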