
    Modeling of widely-linear quaternion valued systems using hypercomplex algorithms

    Get PDF
    The data-driven optimal modeling and identification of widely-linear quaternion-valued synthetic systems is achieved using quaternion-valued gradient-based algorithms. To account rigorously for the second-order statistics of the quaternion system, the quaternion least mean square (QLMS) and widely-linear quaternion least mean square (WL-QLMS) algorithms were selected. The QLMS is shown to successfully model quaternion-valued systems, while the WL-QLMS models both quaternion and widely-linear quaternion-valued systems by taking into account the full second-order statistics of the system. Analysis shows that both algorithms are able to adapt to the non-stationary nature of the systems. This approach is supported by simulations of various synthetic systems.
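
    A minimal sketch of a generic quaternion LMS-style update is given below for orientation. The quaternion representation, the exact conjugation pattern of the update, and the function names are assumptions for illustration, since published QLMS variants derive the gradient terms differently via the HR calculus; the widely-linear WL-QLMS additionally filters the three quaternion involutions of the input.

        import numpy as np

        def qmul(a, b):
            # Hamilton product of quaternions stored as (..., 4) arrays [w, x, y, z].
            aw, ax, ay, az = a[..., 0], a[..., 1], a[..., 2], a[..., 3]
            bw, bx, by, bz = b[..., 0], b[..., 1], b[..., 2], b[..., 3]
            return np.stack([aw*bw - ax*bx - ay*by - az*bz,
                             aw*bx + ax*bw + ay*bz - az*by,
                             aw*by - ax*bz + ay*bw + az*bx,
                             aw*bz + ax*by - ay*bx + az*bw], axis=-1)

        def qconj(a):
            # Quaternion conjugate: negate the vector part.
            return a * np.array([1.0, -1.0, -1.0, -1.0])

        def qlms_step(w, x, d, mu=0.01):
            # One LMS-style step for a length-L quaternion filter (w, x: (L, 4) arrays):
            #   y = sum_i conj(w_i) x_i,   e = d - y,   w_i <- w_i + mu * x_i * conj(e)
            # This is one common variant, not necessarily the paper's exact update;
            # the WL-QLMS would also filter the involutions x^i, x^j, x^k of the input.
            y = qmul(qconj(w), x).sum(axis=0)
            e = d - y
            return w + mu * qmul(x, qconj(e)), e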

    An Invitation to Hypercomplex Phase Retrieval: Theory and Applications

    Full text link
    Hypercomplex signal processing (HSP) provides state-of-the-art tools to handle multidimensional signals by harnessing the intrinsic correlations among the signal dimensions through Clifford algebra. Recently, the hypercomplex representation of the phase retrieval (PR) problem, wherein a complex-valued signal is estimated through its intensity-only projections, has attracted significant interest. The hypercomplex PR (HPR) arises in many optical imaging and computational sensing applications that usually comprise quaternion- and octonion-valued signals. Analogous to traditional PR, measurements in HPR may involve complex, hypercomplex, Fourier, and other sensing matrices. This set of problems opens opportunities for developing novel HSP tools and algorithms. This article provides a synopsis of the emerging areas and applications of HPR with a focus on optical imaging. Comment: 10 pages, 4 figures, 2 tables
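
    For reference, the classical phase-retrieval measurement model can be written as below (standard notation assumed here, not quoted from the article); in HPR the unknown signal, and possibly the sensing vectors, take quaternion or octonion values, and the trivial global-phase ambiguity becomes a unit-norm hypercomplex factor.

        % Intensity-only measurements of an unknown signal x in C^N
        y_m = \left| \langle \mathbf{a}_m, \mathbf{x} \rangle \right|^2, \qquad m = 1, \dots, M,
        % with known sensing vectors a_m; x is recoverable at best up to a
        % global phase factor e^{j\theta}.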

    Filtering and Tracking with Trinion-Valued Adaptive Algorithms

    Get PDF
    A new model for three-dimensional processes based on the trinion algebra is introduced for the first time. Compared with the pure quaternion model, the trinion model is more compact and computationally more efficient, while offering comparable performance in adaptive linear filtering. Moreover, the trinion model can effectively represent the general relationship of state evolution in Kalman filtering, where the pure quaternion model fails. Simulations on real-world wind recordings and synthetic data sets are provided to demonstrate the potential of this new modeling method.
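
    To make the compactness argument concrete, the sketch below implements a trinion product under one multiplication convention reported in the trinion literature (i*i = j, j*j = -i, i*j = j*i = -1); this table and the code are assumptions for illustration and should be checked against the paper. A length-L trinion filter carries 3L real parameters versus 4L for a pure-quaternion filter.

        import numpy as np

        def tmul(a, b):
            # Product of trinions a = a0 + a1*i + a2*j, stored as length-3 arrays,
            # under the assumed rules i*i = j, j*j = -i, i*j = j*i = -1.
            a0, a1, a2 = a
            b0, b1, b2 = b
            return np.array([a0*b0 - a1*b2 - a2*b1,   # real part
                             a0*b1 + a1*b0 - a2*b2,   # i part
                             a0*b2 + a2*b0 + a1*b1])  # j part

        # A 3-D sample (e.g. a wind vector) maps onto one trinion directly,
        # whereas a pure-quaternion encoding pads it with an unused real part.
        v_trinion = np.array([1.0, -0.5, 2.0])                # 3 real numbers
        v_pure_quaternion = np.array([0.0, 1.0, -0.5, 2.0])   # 4 real numbers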

    The geometry of proper quaternion random variables

    Full text link
    Second-order circularity, also called properness, for complex random variables is a well-known and studied concept. In the case of quaternion random variables, some extensions have been proposed, leading to applications in quaternion signal processing (detection, filtering, estimation). Just like in the complex case, circularity for a quaternion-valued random variable is related to the symmetries of its probability density function. As a consequence, properness of quaternion random variables should be defined with respect to the most general isometries in 4D, i.e. rotations from SO(4). Based on this idea, we propose a new definition of properness, namely (μ1, μ2)-properness, for quaternion random variables, using an invariance property under the action of the rotation group SO(4). This new definition generalizes previously introduced properness concepts for quaternion random variables. A second-order study is conducted and symmetry properties of the covariance matrix of (μ1, μ2)-proper quaternion random variables are presented. Comparisons with previous definitions are given, and simulations illustrate the newly introduced concept in a geometric manner. Comment: 14 pages, 3 figures
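
    As standard background (the article's new definition is instead phrased via invariance under SO(4), which is not reproduced here), second-order properness can be summarized as follows; the notation is assumed.

        % Complex case: a zero-mean z is proper (second-order circular) iff
        %   E[z^2] = 0,
        % i.e. its pseudo-covariance vanishes.
        %
        % Quaternion case: with the involutions q^\eta = -\eta\, q\, \eta, \ \eta \in \{i, j, k\},
        % the full second-order statistics of a zero-mean q are captured by
        %   C = E[q\, q^*], \qquad C_\eta = E[q\, (q^\eta)^*], \quad \eta \in \{i, j, k\},
        % and q is commonly called Q-proper when all three complementary
        % covariances C_\eta vanish.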

    PHNNs: Lightweight Neural Networks via Parameterized Hypercomplex Convolutions

    Get PDF
    Hypercomplex neural networks have been shown to reduce the overall number of parameters while ensuring competitive performance by leveraging the properties of Clifford algebras. Recently, hypercomplex linear layers have been further improved by involving efficient parameterized Kronecker products. In this article, we define the parameterization of hypercomplex convolutional layers and introduce the family of parameterized hypercomplex neural networks (PHNNs), which are lightweight and efficient large-scale models. Our method learns the convolution rules and the filter organization directly from data, without requiring a rigidly predefined domain structure. PHNNs are flexible enough to operate in any user-defined or tuned domain, from 1D to nD, regardless of whether the algebra rules are preset. Such malleability allows multidimensional inputs to be processed in their natural domain without appending extra dimensions, as is done instead in quaternion neural networks (QNNs) for 3D inputs such as color images. As a result, the proposed family of PHNNs operates with 1/n of the free parameters of its real-domain analog. We demonstrate the versatility of this approach across multiple application domains by performing experiments on various image and audio datasets, in which our method outperforms real- and quaternion-valued counterparts.
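
    A minimal numpy sketch of the parameterized hypercomplex (Kronecker-product) weight construction that such layers build on is given below; the array names, shapes, and the linear (non-convolutional) setting are illustrative assumptions, with the convolutional layers of the article applying the same construction to filter banks.

        import numpy as np

        def phm_weight(A, F):
            # Build an (out, in) weight as a sum of Kronecker products,
            #   W = sum_i kron(A[i], F[i]),
            # where A (n, n, n) holds the "algebra rule" matrices learned from data
            # and F (n, out//n, in//n) holds the blocks of filter parameters.
            return sum(np.kron(A[i], F[i]) for i in range(A.shape[0]))

        # Toy parameter count for n = 4 and a 64 x 64 layer: a real linear layer
        # stores 64 * 64 = 4096 weights, while this parameterization stores
        # n**3 + n * (64 // n)**2 = 64 + 1024 = 1088, roughly 1/n of the real count.
        n, d = 4, 64
        A = np.random.randn(n, n, n)
        F = np.random.randn(n, d // n, d // n)
        W = phm_weight(A, F)
        assert W.shape == (d, d)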