Underdetermined Source Separation of Finite Alphabet Signals Via L1 Minimization
This paper addresses the underdetermined source separation problem for finite alphabet signals. We present a new framework for recovering finite alphabet signals, formulating the problem as the recovery of sparse signals from highly incomplete measurements. It is known that sparse solutions can be obtained by L1 minimization through convex optimization. In our problem, this relaxation fails to recover sparse solutions; however, this does not impair the reconstruction of the finite alphabet signals. Simulation results show that the approach provides good recovery properties and good image separation performance.
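As a concrete illustration of the idea (a minimal sketch with an assumed binary alphabet and a generic random measurement matrix, not the paper's exact formulation), a {0,1}-valued signal can be recovered from underdetermined measurements by solving the box-constrained L1 relaxation and projecting the result back onto the alphabet:

```python
# Sketch: recover a binary {0,1} signal from underdetermined linear measurements
# by solving the box-constrained L1 relaxation
#     min 1'x   s.t.   Ax = y,  0 <= x <= 1,
# then projecting each entry onto the alphabet by rounding.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m = 20, 17                      # 17 measurements of a length-20 signal
x_true = rng.integers(0, 2, n)     # finite-alphabet source, entries in {0,1}
A = rng.standard_normal((m, n))    # underdetermined measurement/mixing matrix
y = A @ x_true

# Because x is constrained to [0,1], the L1 objective is just the sum of entries.
res = linprog(c=np.ones(n), A_eq=A, b_eq=y, bounds=(0, 1))
x_hat = np.round(res.x).astype(int)   # project back onto the alphabet
print("exact recovery:", np.array_equal(x_hat, x_true))
```

Note that the LP solution itself need not be sparse; it is the projection onto the finite alphabet that restores the signal, which mirrors the abstract's observation that failure of the sparsity relaxation does not prevent finite-alphabet reconstruction.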
New Decoding Strategy for Underdetermined MIMO Transmission Using Sparse Decomposition
In this paper, we address the problem of large-dimension decoding in MIMO systems. Optimal maximum-likelihood detection becomes infeasible in practice when the number of antennas, the channel impulse response length, or the source constellation size grows too large. We consider a MIMO system with a finite constellation and model it as a system with sparse signal sources. We formulate the decoding problem as an underdetermined sparse source recovery problem and apply L1 minimization to solve it. The resulting decoding scheme is applied to large MIMO systems and to frequency-selective channels. We also review the computational cost of several L1-minimization algorithms. Simulation results show significant improvement over existing receivers.
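The sparse reformulation can be sketched as follows (assumed details: a 4-PAM alphabet, flat fading, and a one-hot encoding; the paper's exact construction may differ). Encoding each Q-ary symbol as a one-hot vector of length Q makes the unknown exactly n-sparse out of n·Q entries, turning MIMO decoding into an underdetermined sparse recovery problem:

```python
# Sketch: map a finite-constellation MIMO system y = H s onto a sparse model
# y = Phi x, where x is the block one-hot encoding of the symbol vector s.
import numpy as np

alphabet = np.array([-3, -1, 1, 3])       # 4-PAM constellation (assumption)
n_tx, n_rx = 4, 3                         # underdetermined: fewer rx antennas
rng = np.random.default_rng(1)

symbols = rng.choice(alphabet, n_tx)      # transmitted symbol vector s
H = rng.standard_normal((n_rx, n_tx))     # flat-fading channel matrix

# One-hot (sparse) encoding: x has exactly one 1 per length-Q block.
Q = len(alphabet)
x = np.zeros(n_tx * Q)
for i, s in enumerate(symbols):
    x[i * Q + np.where(alphabet == s)[0][0]] = 1.0

# Equivalent measurement matrix acting on the sparse vector x.
Phi = H @ np.kron(np.eye(n_tx), alphabet)   # y = H s = Phi x
y = H @ symbols
assert np.allclose(Phi @ x, y)
```

An L1-minimization decoder can then be run on the pair (Phi, y), with the block one-hot structure guaranteeing that the true solution is exactly n_tx-sparse.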
Sparse Bayesian Learning Approach for Discrete Signal Reconstruction
This study addresses the problem of discrete signal reconstruction from the
perspective of sparse Bayesian learning (SBL). Generally, it is intractable to
perform the Bayesian inference with the ideal discretization prior under the
SBL framework. To overcome this challenge, we introduce a novel discretization
enforcing prior to exploit the knowledge of the discrete nature of the
signal-of-interest. By integrating the discretization enforcing prior into the
SBL framework and applying the variational Bayesian inference (VBI)
methodology, we devise an alternating update algorithm to jointly characterize
the finite alphabet feature and reconstruct the unknown signal. When the
measurement matrix is i.i.d. Gaussian per component, we further embed the
generalized approximate message passing (GAMP) into the VBI-based method, so as
to directly adopt the ideal prior and significantly reduce the computational
burden. Simulation results demonstrate substantial performance improvement of
the two proposed methods over existing schemes. Moreover, the GAMP-based
variant outperforms the VBI-based method with an i.i.d. Gaussian measurement
matrix, but fails to work for non-i.i.d. Gaussian matrices.
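For intuition, the scalar denoising step that a GAMP-style method with an ideal discrete prior reduces to can be sketched as follows (the function name and the default uniform prior are illustrative assumptions): given a pseudo-observation r = x + N(0, tau) of a symbol x drawn from a finite alphabet, it returns the posterior mean and variance.

```python
# Sketch of the scalar MMSE denoiser under an ideal discrete (finite-alphabet)
# prior, the kind of per-component step used inside AMP/GAMP iterations.
import numpy as np

def discrete_mmse_denoiser(r, tau, alphabet, prior=None):
    """Posterior mean/variance of x in `alphabet` given r = x + N(0, tau)."""
    alphabet = np.asarray(alphabet, dtype=float)
    if prior is None:
        prior = np.full(alphabet.size, 1.0 / alphabet.size)
    # Unnormalized log-posterior for each alphabet point (numerically stable).
    logw = np.log(prior) - (r - alphabet) ** 2 / (2.0 * tau)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    mean = w @ alphabet
    var = w @ (alphabet - mean) ** 2
    return mean, var

# As the noise vanishes, the estimate snaps to the nearest alphabet point.
m, _ = discrete_mmse_denoiser(0.9, 1e-4, [-1.0, 1.0])
print(m)  # ≈ 1.0
```

The posterior variance returned alongside the mean is what message-passing schemes feed back to track the effective noise level across iterations.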
Interference Reduction in Wireless Transmission Systems
Wireless communications have seen exponential growth and rapid progress over the past few decades. Mobile communications have evolved from the first generation, developed primarily for voice, to the fourth generation, known as Long Term Evolution (LTE), which offers increased capacity and speed through a new radio interface together with core-network improvements. Overall throughput and transmission reliability are among the essential measures of service quality in a wireless system, and both are governed mainly by interference management in a multi-user network. Interference management is at the heart of wireless regulation and is essential for maintaining a desirable throughput while avoiding the detrimental impact of interference at unintended receivers. Our work falls within the framework of interference networks in which each user is equipped with one or more antennas. The goal is to address the challenges these communications face, taking into account both the achievable rate and the complexity cost. We propose several precoding and decoding designs for the case where transmitters have limited cooperation, based on a technique called interference alignment. We also address detection in the absence of any precoding design and introduce a low-complexity detection scheme based on sparse decomposition.
An Adaptive Strategy for Sensory Processing
Recognizing objects and detecting associations among them is essential for the survival of organisms. The ability to perform these tasks derives from representations of objects obtained by processing information along sensory pathways. Our current understanding of sensory processing rests on two sets of foundational theories: the efficient coding hypothesis and the hierarchical assembly of object representations. These theories suggest that sensory processing aims to identify independent features of the environment and to progressively represent objects as comprehensive combinations of these features. Separately, the two sets of theories have successfully explained the detection of associations and perceptual invariance, respectively; reconciling them in one unified theory, however, has remained challenging. The efficient coding hypothesis deems independent features essential for detecting associations, but achieving consistency in representations requires hierarchically assembling multiple comprehensive structures corresponding to the same object, which ignores independence among those structures.
Here we propose an alternative framework for sensory processing in which the system, instead of finding the truly independent components of the environment, aims to represent objects by their most informative structures. Using theoretical arguments, we show that this strategy allows the system to represent sensory cues efficiently without necessarily acquiring knowledge of the statistical properties of all possible inputs. Through mathematical simulations, we find that the framework can describe the known characteristics of early sensory processing stages and permits the consistent input representations observed at later stages of processing. We also demonstrate that the framework can be implemented in a biologically plausible neuronal circuit and can explain aspects of experience and learning from corrupted inputs. Thus, this framework provides a novel perspective and a unified description of sensory processing in its entirety.
Finite Alphabet Blind Separation
This thesis considers a particular blind source separation problem, where the sources are assumed to only take values in a known finite set, denoted as the alphabet. More precisely, one observes M linear mixtures of m signals (sources) taking values in the known finite alphabet. The aim in this model is to identify the unknown mixing weights and sources, including the number of sources, from noisy observations of the mixture.
Finite Alphabet Blind Separation (FABS) occurs in many applications, for instance in digital communications with mixtures of multilevel pulse amplitude modulated digital signals. The main motivation for this thesis, however, comes from cancer genetics, where one aims to infer copy number aberrations of different clones in a tumor.
In the first part of this thesis, we provide necessary and sufficient identifiability conditions and obtain exact recovery within a neighborhood of the mixture.
In the second part, we study FABS for single mixtures M=1 within a change-point regression setting with Gaussian error. We provide uniformly honest lower confidence bounds and estimators with exponential convergence rates for the number of source components. With this at hand, we obtain consistent estimators with optimal convergence rates (up to log-factors) and asymptotically uniform honest confidence statements for the weights and the sources. We explore our procedure with a data example from cancer genetics.
In the third part, we consider multivariate FABS, where several mixtures M > 1 are observed. For Gaussian error we show that the least squares estimator (LSE) attains the minimax rates, both for the prediction and for the estimation error. As exact computation of the LSE is not feasible, an efficient approximate algorithm is proposed; simulations suggest that it approximates the LSE well.
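To make the model concrete, here is a toy sketch of single-mixture FABS (M = 1) with a binary alphabet (illustrative only, not the thesis's estimation procedure): when the weights are known and well separated, each of the 2^m source combinations maps to a distinct mixture level, so the sources can be decoded by nearest-level lookup.

```python
# Toy FABS setting: y_i = sum_j w_j f_j(i) + noise, with sources f_j in {0,1}.
import itertools
import numpy as np

rng = np.random.default_rng(2)
alphabet = (0, 1)
w = np.array([0.7, 0.3])                    # mixing weights (sum to 1)
n = 200
F = rng.integers(0, 2, (n, 2))              # source matrix, entries in alphabet
y = F @ w + 0.02 * rng.standard_normal(n)   # noisy single mixture

# All 2^m combinations of source values and their noiseless mixture levels.
combos = np.array(list(itertools.product(alphabet, repeat=2)))
levels = combos @ w                         # approx {0.0, 0.3, 0.7, 1.0}

# Decode each observation to the combination with the nearest mixture level.
F_hat = combos[np.argmin(np.abs(y[:, None] - levels[None, :]), axis=1)]
print("error rate:", (F_hat != F).mean())
```

The identifiability conditions of the first part correspond, in this toy picture, to the requirement that all 2^m mixture levels are distinct; estimating the weights and the number of sources themselves is the harder problem the thesis addresses.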
Theoretical Optimization of Enzymatic Biomass Processes
This dissertation introduces a complete, stochastically based algorithmic framework, Cellulect, to study, optimize, and predict hydrolysis processes of the structured biomass cellulose.
The framework combines a comprehensive geometric model of the cellulosic substrate, with microstructured crystalline/amorphous region distributions, distinct monomers, polymer chain-length distributions, and free-surface-area tracking. An efficient tracking algorithm, formulated serially, updates the system reaction by reaction while preserving the notion of real time.
Advanced types of enzyme actions (random cuts, reducing/non-reducing end cuts, orientation, and the possibility of a fixed position of active centers) and their modular structure (a carbohydrate-binding module with a flexible linker and a catalytic domain) are taken into account within the framework. Enzyme entities are modelled as state machines, which provides a reliable, powerful, and maintainable approach for capturing known enzyme features and can be extended with features not considered in the present work.
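The state-machine idea can be sketched as follows (the state names and allowed transitions here are illustrative assumptions, not the framework's actual design):

```python
# Sketch: enzyme entities as state machines with an explicit transition table.
from enum import Enum, auto

class EnzymeState(Enum):
    FREE = auto()        # in solution
    ADSORBED = auto()    # attached to the substrate surface via the CBM
    COMPLEXED = auto()   # catalytic domain engaged with a polymer chain
    INHIBITED = auto()   # blocked by a soluble product polymer

# Allowed transitions; anything not listed is rejected.
TRANSITIONS = {
    EnzymeState.FREE: {EnzymeState.ADSORBED},
    EnzymeState.ADSORBED: {EnzymeState.FREE, EnzymeState.COMPLEXED,
                           EnzymeState.INHIBITED},
    EnzymeState.COMPLEXED: {EnzymeState.ADSORBED},
    EnzymeState.INHIBITED: {EnzymeState.ADSORBED},
}

class Enzyme:
    def __init__(self):
        self.state = EnzymeState.FREE

    def transition(self, new_state):
        if new_state not in TRANSITIONS[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state

e = Enzyme()
e.transition(EnzymeState.ADSORBED)
e.transition(EnzymeState.COMPLEXED)
```

Making the transition table explicit is what gives the maintainability noted above: a new enzyme feature becomes a new state plus a few table entries rather than scattered conditional logic.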
The extensive probabilistic description of the catalytic mechanism further includes adsorption, desorption, competitive inhibition by soluble product polymers, and dynamic bond-breaking reactions that depend on the states of monomers and their polymers within the substrate. All incorporated parameters refer to specific system properties, providing a one-to-one relationship between degrees of freedom and available model features.
Finally, time propagation of the system is based on a modified stochastic Gillespie algorithm, which provides an exact stochastic time-reaction propagation scheme that accounts for the random nature of reaction events as well as their random occurrence times.
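For context, one step of the classical (unmodified) Gillespie direct method can be sketched as follows (the toy first-order reactions and rate constants are illustrative assumptions): draw the waiting time to the next reaction from an exponential with rate equal to the total propensity, then pick which reaction fired in proportion to its propensity.

```python
# Minimal sketch of the Gillespie direct method for first-order reactions.
import numpy as np

def gillespie_step(rng, counts, rates):
    """One exact stochastic step; returns (waiting_time, reaction_index)."""
    propensities = rates * counts            # first-order propensities a_j
    total = propensities.sum()
    dt = rng.exponential(1.0 / total)        # time to the next reaction event
    j = rng.choice(len(rates), p=propensities / total)
    return dt, j

rng = np.random.default_rng(3)
counts = np.array([100.0, 50.0])   # molecule counts for two species
rates = np.array([1.0, 0.5])       # first-order rate constants
t = 0.0
for _ in range(10):
    dt, j = gillespie_step(rng, counts, rates)
    t += dt
    counts[j] -= 1                 # species j is consumed (toy update rule)
```

Because both the waiting time and the reaction choice are sampled from the exact distributions implied by the propensities, the trajectory is statistically exact rather than a time-discretized approximation.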
The framework is ready for constrained input-parameter estimation against empirical product-concentration profiles using common optimization routines. It has been verified against the literature data available for the most common enzyme kinds (EG, β-G, CBH).
A sensitivity analysis of the estimated model parameters was carried out, showing their dependence on various experimental inputs. The optimization behavior under underdetermined conditions is inspected and visualized.
Results and predictions for mixtures of optimized enzymes are provided, together with a practical way to implement and use the Cellulect framework. Comparison of the obtained results with experimental data from the literature demonstrates the high flexibility, efficiency, and accuracy of the presented framework for predicting the cellulose hydrolysis process.