
    Solution of polynomial Lyapunov and Sylvester equations

    A two-variable polynomial approach to solving the one-variable polynomial Lyapunov and Sylvester equations is proposed. Lifting the problem from the one-variable to the two-variable context gives rise to associated lifted equations which live on finite-dimensional vector spaces. This allows for the design of an iterative solution method inspired by the method of Faddeev for the computation of matrix resolvents. The resulting algorithms are especially suitable for applications requiring symbolic or exact computation.
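    In the constant-matrix special case, the Sylvester equation reduces to the familiar AX + XB = C, which standard numerical libraries solve directly. A minimal numeric sketch in Python (illustrating the equation itself, not the authors' symbolic Faddeev-type algorithm):

    ```python
    import numpy as np
    from scipy.linalg import solve_sylvester

    # Constant-matrix Sylvester equation A X + X B = C, the simplest
    # instance of the family of equations treated in polynomial form above.
    A = np.array([[3.0, 1.0], [0.0, 2.0]])
    B = np.array([[4.0, 0.0], [1.0, 5.0]])
    C = np.array([[1.0, 2.0], [3.0, 4.0]])

    X = solve_sylvester(A, B, C)

    # The residual A X + X B - C should be numerically zero.
    print(np.allclose(A @ X + X @ B, C))  # True
    ```

    The polynomial (and symbolic) setting of the paper requires different machinery; this only fixes the notation for the underlying matrix equation.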

    Dynamics of the Fisher Information Metric

    We present a method to generate probability distributions that correspond to metrics obeying partial differential equations obtained by extremizing a functional $J[g^{\mu\nu}(\theta^i)]$, where $g^{\mu\nu}(\theta^i)$ is the Fisher metric. We postulate that this functional of the dynamical variable $g^{\mu\nu}(\theta^i)$ is stationary with respect to small variations of that variable. This yields a dynamical approach to the Fisher information metric and allows one to impose symmetries on a statistical system in a systematic way. The work is mainly motivated by the entropy approach to nonmonotonic reasoning. (Comment: 11 pages)
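    For a concrete instance of the Fisher metric $g^{\mu\nu}(\theta^i)$, consider the univariate Gaussian family with parameters $(\mu, \sigma)$. A short Python sketch (a standard illustration, not the paper's variational method) comparing the closed form $g = \mathrm{diag}(1/\sigma^2,\, 2/\sigma^2)$ against a Monte Carlo estimate of the expected outer product of the score:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma = 1.0, 2.0

    # Score components for N(mu, sigma^2):
    #   d/dmu    log p = (x - mu) / sigma^2
    #   d/dsigma log p = ((x - mu)^2 - sigma^2) / sigma^3
    x = rng.normal(mu, sigma, size=500_000)
    score = np.stack([(x - mu) / sigma**2,
                      ((x - mu)**2 - sigma**2) / sigma**3])

    # Fisher metric = expected outer product of the score.
    g_mc = score @ score.T / x.size
    g_exact = np.diag([1 / sigma**2, 2 / sigma**2])
    print(g_mc)  # close to diag(0.25, 0.5)
    ```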

    Metric on a Statistical Space-Time

    We introduce a concept of distance for a space-time in which the notion of a point is replaced by the notion of a physical state, e.g. a probability distribution. We apply ideas from information theory and compute the Fisher information matrix on such a space-time; this matrix is the metric on that manifold. We apply these ideas to a simple model and show that the Lorentzian metric can be obtained if we assume that the probability distributions describing space-time fluctuations take complex values. Such complex probability distributions appear in non-Hermitian quantum mechanics. (Comment: 7 pages)

    Identifiability of generalised Randles circuit models

    The Randles circuit (a parallel resistor-capacitor pair in series with another resistor) and its generalised topology have been widely employed in electrochemical energy storage systems such as batteries, fuel cells and supercapacitors, as well as in biomedical engineering, for example to model the electrode-tissue interface in electroencephalography and baroreceptor dynamics. This paper studies the identifiability of generalised Randles circuit models, that is, whether the model parameters can be estimated uniquely from input-output data. It is shown that generalised Randles circuit models are structurally locally identifiable. The condition that makes the model structure globally identifiable is then discussed. Finally, the estimation accuracy is evaluated through extensive simulations.
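    A minimal numerical version of a local identifiability check (illustrative only; the paper's analysis is structural): sample the basic Randles impedance Z(s) = R0 + R1/(1 + R1·C1·s) at a few frequencies and test that the Jacobian with respect to the parameters (R0, R1, C1) has full column rank, which indicates local identifiability at that parameter point. The parameter values and frequencies below are arbitrary examples.

    ```python
    import numpy as np

    def impedance(theta, s):
        # Basic Randles impedance: series resistor R0 plus parallel R1 || C1.
        R0, R1, C1 = theta
        return R0 + R1 / (1 + R1 * C1 * s)

    theta0 = np.array([0.05, 0.2, 100.0])          # example parameter point
    freqs = 2j * np.pi * np.array([0.01, 0.1, 1.0, 10.0])

    # Central finite-difference Jacobian of the sampled impedance.
    eps = 1e-7
    J = np.empty((len(freqs), 3), dtype=complex)
    for k in range(3):
        d = np.zeros(3)
        d[k] = eps
        J[:, k] = (impedance(theta0 + d, freqs)
                   - impedance(theta0 - d, freqs)) / (2 * eps)

    # Full column rank of the real-stacked Jacobian => locally identifiable.
    rank = np.linalg.matrix_rank(np.vstack([J.real, J.imag]))
    print(rank)  # 3
    ```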

    Information geometric methods for complexity

    Research on the use of information geometry (IG) in modern physics has witnessed significant advances recently. In this review article, we report on the utilization of IG methods to define measures of complexity in classical and, whenever available, quantum physical settings. A paradigmatic example of a dramatic change in complexity is given by phase transitions (PTs). Hence we review both global and local aspects of PTs, described in terms of the scalar curvature of the parameter manifold and the components of the metric tensor, respectively. We also report on the behavior of geodesic paths on the parameter manifold, used to gain insight into the dynamics of PTs. Going further, we survey measures of complexity arising in the geometric framework. In particular, we quantify the complexity of networks in terms of the Riemannian volume of the parameter space of a statistical manifold associated with a given network. We are also concerned with complexity measures that account for the interactions of a given number of parts of a system that cannot be described in terms of a smaller number of parts of the system. Finally, we investigate complexity measures of entropic motion on curved statistical manifolds that arise from a probabilistic description of physical systems in the presence of limited information. The Kullback-Leibler divergence, the distance to an exponential family and the volumes of curved parameter manifolds are examples of essential IG notions exploited in our discussion of complexity. We conclude by discussing the strengths, limits, and possible future applications of IG methods to the physics of complexity. (Comment: review article, 60 pages, no figures)
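    One of the IG notions named in the abstract, the Kullback-Leibler divergence, has a simple closed form for two univariate Gaussians. A brief Python sketch (an illustration, not taken from the review) verifying the closed form against a direct numerical integration:

    ```python
    import numpy as np

    def kl_gauss(mu1, s1, mu2, s2):
        # Closed-form KL(N(mu1, s1^2) || N(mu2, s2^2)).
        return np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

    def kl_numeric(mu1, s1, mu2, s2):
        # Riemann-sum integration of p * log(p / q) on a wide, fine grid.
        x = np.linspace(-30.0, 30.0, 200_001)
        p = np.exp(-(x - mu1)**2 / (2 * s1**2)) / (s1 * np.sqrt(2 * np.pi))
        q = np.exp(-(x - mu2)**2 / (2 * s2**2)) / (s2 * np.sqrt(2 * np.pi))
        return np.sum(p * np.log(p / q)) * (x[1] - x[0])

    print(kl_gauss(0.0, 1.0, 1.0, 2.0))  # close to 0.443 nats
    ```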

    Three ways to look at mutually unbiased bases

    This is a review of the problem of mutually unbiased bases in finite-dimensional Hilbert spaces, real and complex. A geometric measure of "mubness" is also introduced and applied to some recent calculations in six dimensions (partly done by Bjorck and by Grassl). Although this does not yet solve any problem, some appealing structures emerge. (Comment: 18 pages. Talk at the Vaxjo Conference on Foundations of Probability and Physics, June 200)
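    Two orthonormal bases of a d-dimensional Hilbert space are mutually unbiased when every squared overlap |⟨e_i|f_j⟩|² equals 1/d. A quick Python sketch (the textbook computational/Fourier example, not the paper's "mubness" measure) checking this in the dimension-six case the abstract mentions:

    ```python
    import numpy as np

    d = 6
    # Columns of F are the Fourier basis vectors; the identity matrix
    # gives the computational basis.
    j, k = np.meshgrid(np.arange(d), np.arange(d))
    F = np.exp(2j * np.pi * j * k / d) / np.sqrt(d)

    # Overlaps of computational basis vectors with Fourier basis vectors.
    overlaps = np.abs(np.eye(d).conj().T @ F) ** 2
    print(np.allclose(overlaps, 1 / d))  # True: the two bases are unbiased
    ```

    The hard open question in d = 6 is how many such bases exist simultaneously, not whether one unbiased pair exists.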

    Higher coordination with less control - A result of information maximization in the sensorimotor loop

    This work presents a novel learning method in the context of embodied artificial intelligence and self-organization which makes as few assumptions and restrictions as possible about the world and the underlying model. The learning rule is derived from the principle of maximizing the predictive information in the sensorimotor loop. It is evaluated on robot chains of varying length with individually controlled, non-communicating segments. A comparison of the results shows that maximizing the predictive information per wheel leads to more highly coordinated behavior of the physically connected robots than maximization per robot. Another focus of this paper is the analysis of the effect of chain length on the overall behavior of the robots. It is shown that longer chains with less capable controllers outperform shorter chains with more complex controllers. An explanation is found and discussed in the information-geometric interpretation of the learning process.
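    The quantity being maximized, the predictive information, is the mutual information between past and future of the sensor process. A toy Python sketch (a plug-in histogram estimator on a synthetic binary stream, not the paper's learning rule) showing how a one-step predictive information can be estimated from data:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy binary sensor stream with memory: each symbol repeats the
    # previous one with probability 0.9, so past and future correlate.
    n = 200_000
    flips = rng.random(n) < 0.1
    flips[0] = False
    s = np.cumsum(flips) % 2  # s[t] = s[t-1] XOR flips[t]

    # Plug-in estimate of the one-step predictive information
    # I(x_t ; x_{t+1}) from the joint histogram of adjacent symbols.
    joint = np.bincount(2 * s[:-1] + s[1:], minlength=4).reshape(2, 2)
    joint = joint / joint.sum()
    px, py = joint.sum(1), joint.sum(0)
    mi = np.sum(joint * np.log2(joint / np.outer(px, py)))
    print(mi)  # close to 1 - H(0.1) ≈ 0.531 bits
    ```

    For this Markov chain the exact value is 1 - H(0.1) bits, where H is the binary entropy, so the estimate gives a quick sanity check on the estimator.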