
    Innovation Representation of Stochastic Processes with Application to Causal Inference

    Typically, real-world stochastic processes are not easy to analyze. In this work we study the representation of any stochastic process as a memoryless innovation process triggering a dynamic system. We show that such a representation is always feasible for innovation processes taking values over a continuous set. However, the problem becomes more challenging when the alphabet size of the innovation is finite. In this case, we introduce both lossless and lossy frameworks, and provide closed-form solutions and practical algorithmic methods. In addition, we discuss the properties and uniqueness of our suggested approach. Finally, we show that the innovation representation problem has many applications. We focus our attention on Entropic Causal Inference, which has recently demonstrated promising performance compared to alternative methods.
    Comment: arXiv admin note: text overlap with arXiv:1611.04035 by other authors
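    In schematic form, an innovation representation of the kind described above expresses each sample of the process as a deterministic function of its past, driven by an independent innovation term. The notation below (the maps g_n and the innovation sequence W_n) is assumed for illustration and is not taken from the abstract:

```latex
% Schematic innovation representation (illustrative notation):
% {W_n} is an i.i.d. ("memoryless") innovation sequence, independent of the
% past of the process, and g_n is a deterministic map (the "dynamic system")
% that produces the next sample from the past and the current innovation.
\[
  X_n = g_n\!\left(X_{n-1}, X_{n-2}, \dots, X_1, \, W_n\right),
  \qquad
  W_n \perp \left(X_1, \dots, X_{n-1}\right),
  \qquad
  \{W_n\} \ \text{i.i.d.}
\]
```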

    PhD Dissertation: Generalized Independent Components Analysis Over Finite Alphabets

    Independent component analysis (ICA) is a statistical method for transforming an observable multi-dimensional random vector into components that are as statistically independent from each other as possible. Usually the ICA framework assumes a model according to which the observations are generated (such as a linear transformation with additive noise). ICA over finite fields is a special case of ICA in which both the observations and the independent components are over a finite alphabet. In this thesis we consider a formulation of the finite-field case in which an observation vector is decomposed into its independent components (as much as possible) with no prior assumption on the way it was generated. This generalization is also known as Barlow's minimal redundancy representation and is considered an open problem. We propose several theorems and show that this hard problem can be accurately solved with a branch-and-bound search tree algorithm, or tightly approximated with a series of linear problems. Moreover, we show that there exists a simple transformation (namely, an order permutation) which provides a greedy yet very effective approximation of the optimal solution. We further show that while not every random vector can be efficiently decomposed into independent components, the vast majority of vectors do decompose very well (that is, within a small constant cost) as the dimension increases. In addition, we show that we may practically achieve this favorable constant cost with a complexity that is asymptotically linear in the alphabet size. Our contribution provides the first efficient set of solutions to Barlow's problem with theoretical and computational guarantees. Finally, we demonstrate our suggested framework in multiple source coding applications.
    Comment: PhD Dissertation
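    The following is a minimal sketch of one plausible reading of the order-permutation heuristic mentioned above, restricted to binary vectors: relabel the symbols so that the k-th most probable observation is assigned the codeword whose integer value is k, and compare the sum of marginal (bit-wise) entropies before and after relabeling. The function names and the toy distribution are illustrative assumptions, not taken from the dissertation, whose exact assignment rule and guarantees may differ.

```python
import numpy as np

def marginal_entropy_sum(pmf, d):
    """Sum of marginal bit entropies (in bits) of a pmf over {0,1}^d.

    pmf[i] is the probability of the binary vector whose integer value is i.
    """
    total = 0.0
    for j in range(d):
        # Marginal probability that bit j equals 1.
        p1 = sum(p for i, p in enumerate(pmf) if (i >> j) & 1)
        for p in (p1, 1.0 - p1):
            if p > 0:
                total -= p * np.log2(p)
    return total

def order_permutation(pmf):
    """Relabel symbols so the k-th most probable one maps to integer codeword k.

    Returns the pmf indexed by the new codewords (probabilities sorted descending).
    """
    return np.sort(pmf)[::-1]

# Toy example: a random pmf over {0,1}^3.
rng = np.random.default_rng(0)
d = 3
pmf = rng.dirichlet(np.ones(2 ** d))

print("sum of marginal entropies, original :", marginal_entropy_sum(pmf, d))
print("sum of marginal entropies, permuted :", marginal_entropy_sum(order_permutation(pmf), d))
```

    In this toy run the relabeled vector has a smaller sum of marginal entropies, which is the quantity a minimal-redundancy (Barlow-style) representation seeks to reduce; the greedy relabeling requires only a sort, in line with the low-complexity claim in the abstract.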