6,089 research outputs found
An Alternative Proof for the Identifiability of Independent Vector Analysis Using Second Order Statistics
In this paper, we present an alternative proof characterizing the (non-)identifiability conditions of independent vector analysis (IVA). IVA extends blind source separation to several mixtures by taking statistical dependencies between mixtures into account. We focus on IVA in the presence of real Gaussian data with temporally independent and identically distributed samples. This model is always non-identifiable when each mixture is considered separately; however, it can be shown to be generically identifiable within the IVA framework. Our proof differs from previous ones in being based on a direct factorization of a closed-form expression for the Fisher information matrix. Our analysis rests on a rigorous linear-algebraic formulation and leads to a new type of factorization of a structured matrix. The proposed approach is therefore of potential interest for a broader range of problems.
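The second-order statistic this line of work exploits can be illustrated with a minimal NumPy sketch (all parameters here are illustrative assumptions, not taken from the paper): corresponding Gaussian sources in different mixtures are correlated across datasets, and that cross-dataset covariance is what restores identifiability.

```python
import numpy as np

rng = np.random.default_rng(0)
K, N, T = 2, 3, 10_000  # datasets (mixtures), sources per dataset, samples

# Each "source component vector" (SCV) collects the n-th source from every
# dataset; dependence across datasets is what makes Gaussian IVA identifiable.
S = np.zeros((K, N, T))
for n in range(N):
    C = np.array([[1.0, 0.7],
                  [0.7, 1.0]])  # assumed cross-dataset covariance of SCV n
    S[:, n, :] = rng.multivariate_normal(np.zeros(K), C, size=T).T

# One independent mixing matrix per dataset; each dataset alone is a plain
# Gaussian ICA problem and hence non-identifiable on its own.
A = [rng.standard_normal((N, N)) for _ in range(K)]
X = [A[k] @ S[k] for k in range(K)]

# Second-order statistic used across datasets: the cross-covariance matrix.
C12 = X[0] @ X[1].T / T
print(np.round(C12, 2))
```

Within a single dataset the sources are uncorrelated Gaussians, so only the cross-dataset covariance `C12` carries the information that distinguishes the true unmixing solution.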
Hidden Markov Model Identifiability via Tensors
The prevalence of hidden Markov models (HMMs) in various applications of
statistical signal processing and communications is a testament to the power
and flexibility of the model. In this paper, we link the identifiability
problem with tensor decomposition, in particular, the Canonical Polyadic
decomposition. Using recent results in deriving uniqueness conditions for
tensor decomposition, we are able to provide a necessary and sufficient
condition for the identification of the parameters of discrete time finite
alphabet HMMs. This result resolves a long-standing open problem regarding the
derivation of a necessary and sufficient condition for uniquely identifying an
HMM. We then further extend recent preliminary work on the identification of
HMMs with multiple observers by deriving necessary and sufficient conditions
for identifiability in this setting.
Comment: Accepted to ISIT 2013. 5 pages, no figures
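The link between HMMs and the canonical polyadic (CP) decomposition can be seen concretely: conditioned on the middle hidden state, three consecutive observations are independent, so their joint distribution is a low-rank tensor. The sketch below (with made-up HMM parameters, assumed for illustration) builds that tensor and verifies its CP structure.

```python
import numpy as np

# Tiny HMM: 2 hidden states, 3 output symbols (illustrative parameters).
pi = np.array([0.6, 0.4])               # initial distribution of h1
P  = np.array([[0.9, 0.1],
               [0.2, 0.8]])             # transitions P[h, h']
O  = np.array([[0.7, 0.2, 0.1],
               [0.1, 0.3, 0.6]])        # emissions O[h, symbol]

# Joint distribution of three consecutive observations, by enumeration
# over the hidden chain (a=h1, b=h2, c=h3).
T3 = np.einsum('a,ai,ab,bj,bc,ck->ijk', pi, O, P, O, P, O)

# Given the middle hidden state h2, the observations y1, y2, y3 are
# conditionally independent, so T3 admits a CP decomposition whose rank
# equals the number of hidden states:
#   T3[i,j,k] = sum_h  w[h] * B1[i,h] * B2[j,h] * B3[k,h]
w  = pi @ P                      # distribution of h2
B1 = O.T @ np.diag(pi) @ P / w   # B1[i,h] = P(y1 = i | h2 = h)
B2 = O.T                         # B2[j,h] = P(y2 = j | h2 = h)
B3 = (P @ O).T                   # B3[k,h] = P(y3 = k | h2 = h)

T_cp = np.einsum('h,ih,jh,kh->ijk', w, B1, B2, B3)
print(np.allclose(T3, T_cp))
```

Uniqueness conditions for the CP decomposition then translate directly into conditions under which the factors, and hence the HMM parameters, can be recovered from `T3`.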
Complex Random Vectors and ICA Models: Identifiability, Uniqueness and Separability
In this paper, the conditions for identifiability, separability and uniqueness
of linear complex-valued independent component analysis (ICA) models are
established. These results extend the well-known conditions for solving
real-valued ICA problems to complex-valued models. Relevant properties of
complex random vectors are described in order to extend the Darmois-Skitovich
theorem for complex-valued models. This theorem is then used to prove a result
for each of the above ICA model concepts. Both circular and
noncircular complex random vectors are covered. Examples clarifying the above
concepts are presented.
Comment: To appear in IEEE TR-IT, March 200
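The circular/noncircular distinction the abstract refers to can be demonstrated numerically. A complex random variable z is circular (more precisely, second-order circular, or proper) when its pseudo-covariance E[z·z] vanishes; the sketch below contrasts the two cases with assumed example distributions.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 200_000

# Circular complex Gaussian: i.i.d. real and imaginary parts with equal
# variance, so the pseudo-covariance E[z^2] vanishes.
z_circ = (rng.standard_normal(T) + 1j * rng.standard_normal(T)) / np.sqrt(2)

# Noncircular example: unequal real/imaginary variances make E[z^2] != 0.
z_nonc = rng.standard_normal(T) + 0.3j * rng.standard_normal(T)

cov  = lambda z: np.mean(z * np.conj(z))   # covariance        E[z z*]
pcov = lambda z: np.mean(z * z)            # pseudo-covariance E[z z]

print(abs(pcov(z_circ)))   # close to 0: circular
print(abs(pcov(z_nonc)))   # clearly nonzero: noncircular
```

Separation results that hold for circular sources need not carry over to noncircular ones, which is why the two cases are treated separately.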
Identifying long-run behaviour with non-stationary data
Copyright © 2000 Université Catholique de Louvain
Results on the identification of non-linear models are used to supplement the traditional form of the order condition with sufficient conditions. The sufficient conditions yield a two-step procedure: first checking generic identification, then testing identifiability. This approach can be extended to sub-blocks of the system and generalizes to non-linear restrictions. The procedure is applied to an empirical model of the exchange rate, which is identified by diagonalising the system.