265 research outputs found

    Memory functions and Correlations in Additive Binary Markov Chains

    A theory of additive Markov chains with long-range memory, proposed earlier in Phys. Rev. E 68, 06117 (2003), is developed and used to describe the statistical properties of long-range correlated systems. A convenient characteristic of such systems, the memory function, and its relation to the correlation properties of the system are examined. Various methods for finding the memory function from the correlation function are proposed. The inverse problem (calculation of the correlation function from a prescribed memory function) is also solved. This is demonstrated for an analytically solvable model of a system with a step-wise memory function.
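    As an illustration of the kind of system the abstract describes, here is a minimal Python sketch of a binary additive Markov chain with a step-wise memory function, together with an empirical estimate of its correlation function. The memory length, step height and the additive form of the conditional probability are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 20                 # memory length (assumed, for illustration only)
F = np.full(N, 0.02)   # step-wise memory function: F(r) = const for r <= N

def generate_chain(length, F, a_bar=0.5):
    """Generate a binary additive Markov chain driven by the memory function F."""
    N = len(F)
    out = list(rng.integers(0, 2, size=N).astype(float))    # random initial segment
    while len(out) < length:
        window = np.asarray(out[-N:][::-1])                  # a_{i-1}, ..., a_{i-N}
        p_one = a_bar + float(np.dot(F, window - a_bar))     # additive conditional probability
        p_one = min(max(p_one, 0.0), 1.0)                    # keep it a valid probability
        out.append(1.0 if rng.random() < p_one else 0.0)
    return np.asarray(out)

def correlation(a, max_lag):
    """Empirical correlation function K(r) = <(a_i - a_bar)(a_{i+r} - a_bar)>."""
    a = a - a.mean()
    return np.array([np.mean(a[:-r] * a[r:]) for r in range(1, max_lag + 1)])

chain = generate_chain(200_000, F)
print(correlation(chain, 10))   # the slow decay of K(r) reflects the long-range memory
```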

    Thomas Decomposition and Nonlinear Control Systems

    This paper applies the Thomas decomposition technique to nonlinear control systems, in particular to the study of the dependence of the system behavior on parameters. Thomas' algorithm is a symbolic method which splits a given system of nonlinear partial differential equations into a finite family of so-called simple systems which are formally integrable and define a partition of the solution set of the original differential system. In general, different simple systems of a Thomas decomposition describe different structural behavior of the control system. The paper gives an introduction to the Thomas decomposition method and shows how notions such as invertibility, observability and flat outputs can be studied. A Maple implementation of Thomas' algorithm is used to illustrate the techniques on explicit examples.
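    As a toy illustration of the partition idea only (a purely algebraic analogue; the paper's differential-algebraic setting and its Maple implementation are not reproduced here), a single equation splits into disjoint simple systems that combine equations with inequations:

```latex
% The solution set of the single equation x*y = 0 is partitioned into two
% disjoint simple systems, obtained by a case distinction on whether x vanishes;
% together they cover every solution exactly once.  (In the differential case,
% each simple system is in addition formally integrable.)
\[
  \{\, (x, y) : x\, y = 0 \,\}
  \;=\;
  \underbrace{\{\, x = 0 \,\}}_{\text{simple system } S_1}
  \;\sqcup\;
  \underbrace{\{\, y = 0,\; x \neq 0 \,\}}_{\text{simple system } S_2}.
\]
```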

    Hierarchical Models in the Brain

    This paper describes a general model that subsumes many parametric models for continuous data. The model comprises hidden layers of state-space or dynamic causal models, arranged so that the output of one provides input to another. The ensuing hierarchy furnishes a model for many types of data, of arbitrary complexity. Special cases range from the general linear model for static data to generalised convolution models, with system noise, for nonlinear time-series analysis. Crucially, all of these models can be inverted using exactly the same scheme, namely dynamic expectation maximization. This means that a single model and optimisation scheme can be used to invert a wide range of models. We present the model and a brief review of its inversion to disclose the relationships among apparently diverse generative models of empirical data. We then show that this inversion can be formulated as a simple neural network and may provide a useful metaphor for inference and learning in the brain.
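    A minimal sketch of the hierarchical construction, assuming a simplified two-layer linear state-space form: the output of the higher layer serves as the input to the layer below. The dimensions, coefficients and noise level are illustrative, and the inversion by dynamic expectation maximization is not attempted here.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_layer(u, A, B, C, noise_std=0.05):
    """Simulate x_{t+1} = A x_t + B u_t + w_t and return the outputs v_t = C x_t."""
    x = np.zeros(A.shape[0])
    v = []
    for u_t in u:
        v.append(C @ x)
        x = A @ x + B @ np.atleast_1d(u_t) + noise_std * rng.standard_normal(x.shape)
    return np.array(v)

T = 200
cause = np.sin(np.linspace(0.0, 4.0 * np.pi, T))      # exogenous cause entering at the top

# Higher layer: slow dynamics driven by the exogenous cause.
v2 = simulate_layer(cause,
                    A=np.array([[0.98]]), B=np.array([[0.10]]), C=np.array([[1.0]]))

# Lower layer: faster dynamics driven by the *output* of the layer above.
v1 = simulate_layer(v2.ravel(),
                    A=np.array([[0.80]]), B=np.array([[0.50]]), C=np.array([[1.0]]))

print(v1[:5].ravel())   # data generated by the two-layer hierarchy
```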

    Algebraic estimation in partial derivatives systems: parameters and differentiation problems

    Two goals are sought in this paper: to provide a succinct overview of algebraic techniques for numerical differentiation and parameter estimation for linear systems, and to present novel algebraic methods in the case of several variables. The state-of-the-art review in the introduction is followed by a brief description of the methodology in the subsequent sections. Our new algebraic methods are illustrated by two examples in the multidimensional case. Some algebraic preliminaries are given in the appendix.
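    A minimal single-variable sketch of the flavour of algebraic differentiation the overview covers: the derivative is obtained by integrating the measured signal against a weight that annihilates the constant Taylor term, which is what makes such estimators robust to zero-mean noise. The window length and test signal are assumptions for illustration, and the multidimensional methods of the paper are not reproduced.

```python
import numpy as np

# Approximating y(t - tau) on [0, T_w] by the first-order Taylor polynomial
# y(t) - y'(t)*tau and integrating against the weight (T_w - 2*tau) kills the
# constant term (its integral is zero) and isolates the slope:
#     y'(t) ~= (6 / T_w**3) * integral_0^{T_w} (T_w - 2*tau) y(t - tau) dtau.
# The weighted integral equals the slope of a least-squares line on the window,
# so the estimate refers (approximately) to the window centre.
def algebraic_derivative(y, dt, window):
    """Estimate the first derivative on each sliding window of `window` samples."""
    T_w = (window - 1) * dt
    tau = np.arange(window) * dt                      # tau = 0, dt, ..., T_w
    w = (6.0 / T_w**3) * (T_w - 2.0 * tau)            # annihilating weight for y(t - tau)
    trap = np.full(window, dt); trap[[0, -1]] *= 0.5  # trapezoidal quadrature weights
    est = np.empty(len(y) - window + 1)
    for i in range(window - 1, len(y)):
        seg = y[i - window + 1:i + 1][::-1]           # y(t), y(t - dt), ..., y(t - T_w)
        est[i - window + 1] = np.sum(w * seg * trap)
    return est, T_w

dt, window = 1e-3, 101
t = np.arange(0.0, 2.0, dt)
rng = np.random.default_rng(2)
y_noisy = np.sin(2 * np.pi * t) + 0.01 * rng.standard_normal(t.size)

d_est, T_w = algebraic_derivative(y_noisy, dt, window)
d_true = 2 * np.pi * np.cos(2 * np.pi * (t[window - 1:] - T_w / 2))  # derivative at window centre
print(np.max(np.abs(d_est - d_true)))   # stays within a few percent of the amplitude
```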

    S-D logic-informed customer engagement: Integrative framework, revised fundamental propositions, and application to CRM

    Advance online publication in 2016.

    Anterior pituitary lobe and gallbladder (Hypophysenvorderlappen und Gallenblase)

    • …