
    Identification of Sparse Reciprocal Graphical Models

    In this paper we propose an identification procedure for the sparse graphical model associated with a Gaussian stationary stochastic process. The identification paradigm exploits the approximation of autoregressive processes through reciprocal processes in order to improve the robustness of the identification algorithm, especially when the order of the autoregressive process becomes large. We show that the proposed paradigm leads to a regularized, circulant matrix completion problem whose solution only requires computing the eigenvalues of matrices whose dimension equals the dimension of the process.
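    The computational claim above rests on a standard property of circulant matrices: they are diagonalized by the discrete Fourier transform, so all of their eigenvalues come from the DFT of the generating sequence. The following NumPy sketch checks that property in the scalar case; it is only an illustration of the underlying structure, not the paper's algorithm, and the numbers are made up.

```python
import numpy as np

# An N x N circulant matrix is diagonalized by the DFT, so its eigenvalues are
# just the DFT of its generating (first-row) sequence.  Illustrative values only.
N = 8
c = np.array([2.0, -0.5, 0.1, 0.0, 0.0, 0.0, 0.1, -0.5])   # symmetric generator
C = np.array([np.roll(c, i) for i in range(N)])             # circulant matrix

eig_direct = np.sort(np.linalg.eigvalsh(C))                 # dense eigendecomposition
eig_fft = np.sort(np.fft.fft(c).real)                       # same eigenvalues via the DFT
assert np.allclose(eig_direct, eig_fft)
```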

    A Maximum Entropy Solution of the Covariance Extension Problem for Reciprocal Processes

    Stationary reciprocal processes defined on a finite interval of the integer line can be seen as a special class of Markov random fields restricted to one dimension. Non-stationary reciprocal processes have been extensively studied in the past, especially by Jamison, Krener, Levy, and co-workers. The specialization of the non-stationary theory to the stationary case, however, does not seem to have been pursued in sufficient depth in the literature. Stationary reciprocal processes (and reciprocal stochastic models) are potentially useful for describing signals which naturally live in a finite region of the time (or space) line. Estimation or identification of these models from observed data still seems to be an open problem, one which can lead to many interesting applications in signal and image processing. In this paper, we discuss a class of reciprocal processes which is the acausal analog of autoregressive (AR) processes, familiar in control and signal processing. We show that maximum likelihood identification of these processes leads to a covariance extension problem for block-circulant covariance matrices. This generalizes the famous covariance band extension problem for stationary processes on the integer line. As in the usual stationary setting on the integer line, the covariance extension problem turns out to be a basic conceptual and practical step in solving the identification problem. We show that the maximum entropy principle leads to a complete solution of the problem.
    Comment: 33 pages, to appear in the IEEE Transactions on Automatic Control
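    The role of the block-circulant structure can be made concrete: the DFT taken across the block index block-diagonalizes a block-circulant covariance, so spectral computations decouple into independent problems whose size is the dimension of the process. The sketch below is my own numerical check of that decomposition (with made-up blocks), not the identification procedure of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, m = 6, 2                             # N points on the discrete circle, process dimension m

# Block sequence with S_{N-k} = S_k^T (and symmetric S_0, S_{N/2}), so that the
# assembled block-circulant matrix Sigma is symmetric.
S = [np.zeros((m, m)) for _ in range(N)]
S[0] = 3.0 * np.eye(m)
for k in range(1, N // 2):
    S[k] = 0.2 * rng.standard_normal((m, m))
    S[N - k] = S[k].T
middle = 0.2 * rng.standard_normal((m, m))
S[N // 2] = (middle + middle.T) / 2

# Assemble Sigma: block (i, j) is S[(j - i) mod N].
Sigma = np.block([[S[(j - i) % N] for j in range(N)] for i in range(N)])

# The DFT across the block index yields N Hermitian m x m spectral blocks whose
# eigenvalues, pooled together, are exactly the eigenvalues of Sigma.
omega = np.exp(2j * np.pi / N)
blocks = [sum(S[t] * omega ** (t * k) for t in range(N)) for k in range(N)]

eigs_full = np.sort(np.linalg.eigvalsh(Sigma))
eigs_blocks = np.sort(np.concatenate([np.linalg.eigvalsh(B) for B in blocks]))
assert np.allclose(eigs_full, eigs_blocks)
```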

    On the Geometry of Maximum Entropy Problems

    We show that a simple geometric result suffices to derive the form of the optimal solution in a large class of finite- and infinite-dimensional maximum entropy problems concerning probability distributions, spectral densities and covariance matrices. These include Burg's spectral estimation method and Dempster's covariance completion, as well as various recent generalizations of the above. We then apply this orthogonality principle to the new problem of completing a block-circulant covariance matrix when an a priori estimate is available.
    Comment: 22 pages
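    Dempster's covariance completion, mentioned above as one instance covered by the orthogonality principle, has a well-known characterization: the maximum entropy (determinant-maximizing) completion places zeros in the inverse covariance at the unspecified positions. The sketch below checks this on a hypothetical 3 x 3 example with made-up numbers; it is not taken from the paper.

```python
import numpy as np

# Specified data: diagonal and first off-diagonal of a 3 x 3 covariance.
s11, s22, s33 = 2.0, 1.5, 1.0
s12, s23 = 0.7, 0.4

# Closed-form maximum entropy completion of the missing (1, 3) entry.
s13 = s12 * s23 / s22

Sigma = np.array([[s11, s12, s13],
                  [s12, s22, s23],
                  [s13, s23, s33]])

# The unspecified entry of the inverse vanishes at the max-entropy completion.
assert abs(np.linalg.inv(Sigma)[0, 2]) < 1e-12

# Cross-check: log det (the Gaussian entropy up to constants) is maximized at
# s13 among nearby symmetric completions.
def logdet(x):
    M = Sigma.copy()
    M[0, 2] = M[2, 0] = x
    return np.linalg.slogdet(M)[1]

grid = np.linspace(s13 - 0.2, s13 + 0.2, 101)
assert abs(grid[np.argmax([logdet(x) for x in grid])] - s13) < 5e-3
```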