
    Design and Implementation of Efficient Algorithms for Wireless MIMO Communication Systems

    Over the last decade, one of the most important technological advances underpinning the new generation of wireless broadband is communication through multiple-input multiple-output (MIMO) systems. MIMO technologies have been adopted by many wireless standards such as LTE, WiMAX and WLAN, mainly because of their ability to increase the maximum transmission rate, together with the reliability and coverage achieved by current wireless communications, without requiring extra bandwidth or additional transmit power. However, the advantages provided by MIMO systems come at the expense of a substantial increase in the cost of implementing multiple antennas and in receiver complexity, which has a great impact on power consumption. For this reason, the design of low-complexity receivers is an important topic addressed throughout this thesis. First, the use of MIMO channel matrix preprocessing techniques is investigated, either to reduce the computational cost of optimal decoders or to improve the performance of suboptimal linear, SIC or tree-search detectors. A detailed description of two widely used preprocessing techniques is presented: the Lenstra-Lenstra-Lovász (LLL) method for lattice reduction (LR) and the VBLAST ZF-DFE algorithm. Both the complexity and the performance of the two methods are evaluated and compared. In addition, a low-cost implementation of the VBLAST ZF-DFE algorithm is proposed and included in the evaluation. Second, a low-complexity tree-search MIMO detector, called the variable-breadth K-Best (VB K-Best) detector, is developed. The main idea of this method is to exploit the impact of the channel matrix condition number on data detection in order to reduce the complexity of such systems.
    Roger Varea, S. (2012). Design and Implementation of Efficient Algorithms for Wireless MIMO Communication Systems [Doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/16562
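    To make the condition-number idea behind the VB K-Best detector concrete, the sketch below implements a plain K-Best tree search over a QPSK alphabet in which the list size K is chosen from the condition number of the channel matrix. It is an illustration only, not the thesis's algorithm; the QPSK alphabet, the threshold of 10 and the K values 2 and 8 are assumptions.

```python
import numpy as np

QPSK = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

def choose_k(H, k_small=2, k_large=8, threshold=10.0):
    """Use a wider search only when the channel matrix is ill-conditioned."""
    return k_large if np.linalg.cond(H) > threshold else k_small

def k_best_detect(y, H, alphabet=QPSK, K=None):
    """Breadth-first K-Best tree search for y = H s + n over a finite alphabet."""
    n_tx = H.shape[1]
    K = choose_k(H) if K is None else K
    Q, R = np.linalg.qr(H)
    z = Q.conj().T @ y
    candidates = [(0.0, [])]                 # (accumulated metric, partial symbols)
    for i in range(n_tx - 1, -1, -1):        # detect from the last layer upwards
        expanded = []
        for metric, partial in candidates:
            for s in alphabet:
                sym = [s] + partial          # symbols for layers i .. n_tx-1
                interf = sum(R[i, i + j] * sym[j] for j in range(len(sym)))
                expanded.append((metric + abs(z[i] - interf) ** 2, sym))
        candidates = sorted(expanded, key=lambda c: c[0])[:K]   # keep the best K
    return np.array(candidates[0][1])

# Toy usage: 4x4 MIMO channel, one QPSK symbol per transmit antenna.
rng = np.random.default_rng(0)
H = (rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))) / np.sqrt(2)
s = rng.choice(QPSK, 4)
y = H @ s + 0.05 * (rng.standard_normal(4) + 1j * rng.standard_normal(4))
print("detected equals transmitted:", np.array_equal(k_best_detect(y, H), s))
```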

    Lattice reduction and list based low complexity MIMO detection and its applications.

    Multiple input multiple output (MIMO) is an important technique for improving the spectral efficiency of wireless communications. In MIMO systems, signals usually have to be detected jointly at the receiver. While maximum likelihood (ML) MIMO detection provides optimal performance with full receive diversity, its complexity grows exponentially with the number of transmit antennas. Thus, lattice reduction (LR) and list based detectors have been developed to reduce the complexity. In this thesis, we first apply the partial maximum a posteriori probability (PMAP) principle to the list-based method for MIMO detection. It is shown that PMAP-based list detection outperforms conventional list detection with a reasonably low complexity. To further improve the performance for slow fading MIMO channels, we develop column reordering criteria (CRC) for LR-based list detection. It is shown that, with the proposed CRC, LR-based list detection can provide near-ML performance with a sufficiently low complexity. Then, we develop a complexity-efficient pre-voting cancellation based detection with pre-voting vector selection criteria for underdetermined MIMO systems and show that this scheme can achieve near-ML performance with full receive diversity. An extension of MIMO systems is multiuser MIMO systems, where user selection becomes an effective way to increase diversity (multiuser diversity). If multiple users are selected to access the channel at a time, the selection problem becomes a combinatorial problem, where an exhaustive search may lead to very high computational complexity. Therefore, we propose a low-complexity greedy user selection scheme with an iterative LR updating algorithm for the case where an LR-based MIMO detector is used. It is shown that the proposed selection scheme can provide performance comparable to that of the combinatorial schemes with much lower complexity.
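    As a rough illustration of what a list-based detector does (this is a generic sketch, not the PMAP- or CRC-based schemes developed in the thesis), the code below builds a short candidate list around the zero-forcing estimate for a real-valued 4-PAM system and returns the list member with the smallest ML metric. The alphabet, the list construction rule and the dimensions are assumptions.

```python
import itertools
import numpy as np

PAM4 = np.array([-3.0, -1.0, 1.0, 3.0])       # real 4-PAM alphabet (assumption)

def list_detect(y, H, alphabet=PAM4, per_layer_keep=2):
    """Keep the closest symbols per layer around the ZF estimate, then pick the
    list member that minimizes the ML metric ||y - H s||^2."""
    n_tx = H.shape[1]
    s_zf = np.linalg.pinv(H) @ y              # zero-forcing soft estimate
    shortlist = [alphabet[np.argsort(np.abs(alphabet - s_zf[i]))[:per_layer_keep]]
                 for i in range(n_tx)]
    best, best_metric = None, np.inf
    for cand in itertools.product(*shortlist):
        s = np.array(cand)
        metric = np.linalg.norm(y - H @ s) ** 2
        if metric < best_metric:
            best, best_metric = s, metric
    return best

# Toy usage: 4x4 real channel, 2^4 = 16 list members instead of 4^4 = 256.
rng = np.random.default_rng(1)
H = rng.standard_normal((4, 4))
s = rng.choice(PAM4, 4)
y = H @ s + 0.1 * rng.standard_normal(4)
print("detected:", list_detect(y, H), "transmitted:", s)
```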

    Guidance and control of an autonomous underwater vehicle

    A cooperative project between the Universities of Plymouth and Cranfield was aimed at designing and developing an autonomous underwater vehicle named Hammerhead. The work presented herein is to formulate an advanced guidance and control system and to implement it in the Hammerhead. This involves the description of the Hammerhead hardware from a control system perspective. In addition to the control system, an intelligent navigation scheme and a state-of-the-art vision system were also developed; however, the development of these submodules is outside the scope of this thesis. The traditional way to model an underwater vehicle is to derive painstaking mathematical models based on the laws of physics and then simplify and linearise the models about some operating point. One of the principal novelties of this research is the use of system identification techniques on actual vehicle data obtained from full-scale in-water experiments. Two new guidance mechanisms have also been formulated for cruising-type vehicles. The first is a modification of the proportional navigation guidance for missiles, whilst the other is a hybrid law which combines several guidance strategies employed during different phases of the flight. In addition to the modelling process and guidance systems, a number of robust control methodologies have been conceived for Hammerhead. A discrete-time linear quadratic Gaussian with loop transfer recovery based autopilot is formulated and integrated with the conventional and more advanced guidance laws proposed. A model predictive controller (MPC) has also been devised which is constructed using artificial intelligence techniques such as genetic algorithms (GA) and fuzzy logic. A GA is employed as an online optimization routine whilst fuzzy logic is exploited as an objective function in the MPC framework. The GA-MPC autopilot has been implemented in Hammerhead in real time and results demonstrate excellent robustness despite the presence of disturbances and ever-present modelling uncertainty. To the author's knowledge, this is the first successful application of a GA to real-time optimization for controller tuning in the marine sector, and the thesis thus makes an extremely novel and useful contribution to control system design in general. The controllers are also integrated with the proposed guidance laws, which is also considered an invaluable contribution to knowledge. Moreover, the autopilots are used in conjunction with a vision-based altitude information sensor and simulation results demonstrate the efficacy of the controllers to cope with uncertain altitude demands.
    J&S MARINE LTD., QINETIQ, SUBSEA 7 AND SOUTH WEST WATER PL
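    For readers unfamiliar with the guidance law the thesis modifies, the following is the textbook planar form of proportional navigation, in which the commanded lateral acceleration is the navigation constant times the closing speed times the line-of-sight rate. It is not Hammerhead's modified or hybrid law; the constant N = 3 and the toy geometry are assumptions.

```python
import numpy as np

def pn_lateral_accel(p_vehicle, v_vehicle, p_target, v_target, N=3.0):
    """Classical planar proportional navigation: a_cmd = N * Vc * lambda_dot."""
    r = p_target - p_vehicle                    # line-of-sight (LOS) vector
    v_rel = v_target - v_vehicle                # relative velocity
    los_rate = (r[0] * v_rel[1] - r[1] * v_rel[0]) / np.dot(r, r)   # lambda_dot
    closing_speed = -np.dot(r, v_rel) / np.linalg.norm(r)           # Vc
    return N * closing_speed * los_rate

# Toy usage: vehicle at the origin moving east at 1.5 m/s, fixed waypoint ahead-left.
a_cmd = pn_lateral_accel(np.array([0.0, 0.0]), np.array([1.5, 0.0]),
                         np.array([50.0, 10.0]), np.array([0.0, 0.0]))
print(f"commanded lateral acceleration: {a_cmd:.4f} m/s^2")
```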

    Establishing global error bounds for model reduction in combustion

    Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Chemical Engineering, 2013. Cataloged from the PDF version of the thesis. Includes bibliographical references (pages 223-239). In addition to theory and experiment, simulation of reacting flows has become important in policymaking, industry, and combustion science. However, simulations of reacting flows can be extremely computationally demanding due to the wide range of length scales involved in turbulence, the wide range of time scales involved in chemical reactions, and the large number of species in detailed chemical reaction mechanisms in combustion. To compensate for limited available computational resources, reduced chemistry is used. However, the accuracy of these reduced chemistry models is usually unknown, which is of great concern in applications; if the accuracy of a simplified model is unknown, it is risky to rely on the results of that model for critical decision-making. To address this issue, this thesis derives bounds on the global error in reduced chemistry models. First, it is shown that many model reduction methods in combustion are based on projection; all of these methods can be described using the same equation. After that, methods from the numerical solution of ODEs are used to derive separate a priori bounds on the global error in the solutions of reduced chemistry models, for both projection-based and non-projection-based reduced chemistry models. The distinguishing feature between the two sets of bounds is that the bounds for projection-based reduced chemistry models are stronger than those for non-projection-based models. In both cases, the bounds are tight, but tend to drastically overestimate the error in the reduced chemistry. The a priori bounds on the global error in the solutions of reduced chemistry models demonstrate that if the error in the time derivatives of the state variables in the reduced model is controlled, then the error in the reduced model solution is also controlled; this thesis proves that result for the first time. Source code is included for all results presented. After presenting these results, the development of more accurate global error information is discussed. Using the error bounds above, in concert with more accurate global error information, it should be possible to better assess the accuracy and reliability of reduced chemistry models in applications. By Geoffrey Malcolm Oxberry, Ph.D.
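    The "control the derivative error, control the solution error" result has the flavor of a classical Gronwall-type estimate. The following standard bound is shown only to illustrate that flavor under textbook assumptions; the thesis's a priori bounds are sharper and distinguish projection-based from non-projection-based reductions.

```latex
% Standard Gronwall-type bound (illustration only, not the thesis's bounds).
% Let x'(t) = f(x(t)) be the full model and z'(t) = g(z(t)) the reduced model,
% with x(0) = z(0), f Lipschitz with constant L, and a uniform defect bound
% \|f(y) - g(y)\| \le \delta for all states y. Then, for t \ge 0,
\[
  \|x(t) - z(t)\| \;\le\; \frac{\delta}{L}\,\bigl(e^{L t} - 1\bigr),
\]
% so controlling the error \delta in the time derivatives controls the global
% error in the reduced-model solution, at the price of exponential growth in t.
```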

    A Cooperative Approach for Composite Ontology Matching

    Ontologies have proven to be an essential element in a range of applications in which knowledge plays a key role. Resolving the semantic heterogeneity problem is crucial to allow interoperability between ontology-based systems. This makes automatic ontology matching, as an anticipated solution to semantic heterogeneity, an important research issue. Many different approaches to the matching problem have emerged from the literature. An important issue in ontology matching is to find effective ways of choosing among many techniques and their variations, and then combining their results. An innovative and promising option is to formalize the combination of matching techniques using agent-based approaches, such as cooperative negotiation and argumentation. In this thesis, a formalization of the ontology matching problem following an agent-based approach is proposed. This proposal is evaluated using state-of-the-art data sets. The results show that the consensus obtained by negotiation and argumentation represents intermediary values which are close to those of the best matcher. As the best matcher may vary depending on specific differences between data sets, cooperative approaches are an advantage. *** Ontologies are essential elements in knowledge-based systems. Resolving the semantic heterogeneity problem is fundamental to allow interoperability between ontology-based systems. Automatic ontology matching can be seen as a solution to this problem. Different and complementary approaches to the problem have been proposed in the literature. An important aspect of matching consists in selecting the appropriate set of approaches and their variations, and then combining their results. A promising option involves formalizing the combination of matching techniques using cooperative agent-based approaches, such as negotiation and argumentation. In this thesis, the formalization of the problem of combining matching techniques using such approaches is proposed and evaluated. The evaluation, which involves test sets suggested by the scientific community, shows that the consensus obtained by negotiation and argumentation is not exactly an improvement over all individual results, but represents intermediary values that are close to those of the best technique. Since the best technique may vary depending on specific differences between multiple data sets, cooperative approaches are an advantage.
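    As a deliberately naive stand-in for the combination step (the thesis formalizes it via cooperative negotiation and argumentation between agents, which is not reproduced here), the snippet below merges the similarity scores that several matchers assign to one candidate correspondence into a weighted average, which by construction is an intermediary value. Matcher names, weights and scores are invented.

```python
def combine_scores(scores, weights=None):
    """scores: {matcher_name: similarity in [0, 1]} -> consensus similarity."""
    if weights is None:
        weights = {name: 1.0 for name in scores}     # equal weights by default
    total = sum(weights[name] for name in scores)
    return sum(weights[name] * s for name, s in scores.items()) / total

# Toy usage: three matchers judging whether "Author" corresponds to "Writer".
scores = {"string_matcher": 0.62, "wordnet_matcher": 0.88, "structure_matcher": 0.71}
print(f"consensus similarity: {combine_scores(scores):.2f}")
```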

    A comparison of the CAR and DAGAR spatial random effects models with an application to diabetics rate estimation in Belgium

    When hierarchically modelling an epidemiological phenomenon on a finite collection of sites in space, one must always take a latent spatial effect into account in order to capture the correlation structure that links the phenomenon to the territory. In this work, we compare two autoregressive spatial models that can be used for this purpose: the classical CAR model and the more recent DAGAR model. Unlike the former, the latter has a desirable property: its ρ parameter can be naturally interpreted as the average neighbor pair correlation and, in addition, this parameter can be directly estimated when the effect is modelled using a DAGAR rather than a CAR structure. As an application, we model the diabetics rate in Belgium in 2014 and show the adequacy of these models in predicting the response variable when no covariates are available.
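    The point about the interpretability of ρ can be seen numerically. The toy sketch below builds a proper CAR precision matrix Q = τ(D − ρW) on a made-up four-site adjacency graph and compares ρ with the average correlation between neighboring sites implied by the inverse of Q; in a CAR model the two differ, whereas the abstract notes that the DAGAR ρ is interpretable as exactly that average. The graph and parameter values are assumptions.

```python
import numpy as np

W = np.array([[0, 1, 0, 1],           # symmetric adjacency of 4 sites (a cycle)
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
D = np.diag(W.sum(axis=1))            # diagonal matrix of neighbor counts
rho, tau = 0.7, 1.0                   # made-up CAR parameters
Q = tau * (D - rho * W)               # proper CAR precision matrix
Sigma = np.linalg.inv(Q)              # implied covariance of the spatial effect
corr = Sigma / np.sqrt(np.outer(np.diag(Sigma), np.diag(Sigma)))
avg_neighbor_corr = corr[W == 1].mean()
print(f"CAR rho = {rho}, implied average neighbor-pair correlation = {avg_neighbor_corr:.3f}")
```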

    A Statistical Approach to the Alignment of fMRI Data

    Multi-subject functional Magnetic Resonance Imaging (fMRI) studies are critical. Because anatomical and functional structure varies across subjects, image alignment is necessary. We define a probabilistic model to describe functional alignment. By imposing a prior distribution, such as the matrix von Mises-Fisher distribution, on the orthogonal transformation parameter, anatomical information is embedded in the estimation of the parameters, i.e., combinations of spatially distant voxels are penalized. Real applications show an improvement in the classification and interpretability of the results compared to various functional alignment methods.
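    The unregularized special case of such an alignment is the classical orthogonal Procrustes problem, sketched below for two subjects' time-by-voxel response matrices; the abstract's model goes further by placing a matrix von Mises-Fisher prior on the orthogonal transform so that spatially distant voxel combinations are penalized. The matrix sizes and noise level are assumptions.

```python
import numpy as np

def procrustes_align(X_source, X_target):
    """Orthogonal R minimizing ||X_source @ R - X_target||_F (closed form via SVD)."""
    U, _, Vt = np.linalg.svd(X_source.T @ X_target)
    return U @ Vt

# Toy usage: the target subject is an orthogonally rotated copy of the source plus noise.
rng = np.random.default_rng(2)
X_src = rng.standard_normal((200, 10))                 # 200 time points, 10 voxels
R_true, _ = np.linalg.qr(rng.standard_normal((10, 10)))
X_tgt = X_src @ R_true + 0.01 * rng.standard_normal((200, 10))
R_hat = procrustes_align(X_src, X_tgt)
print("transform recovery error:", np.linalg.norm(R_hat - R_true))
```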

    Mathematical Models for Planning and Controlling Air Quality; Proceedings of an IIASA Workshop, October 1979

    Air-quality management problems fall into three main classes: it is difficult to obtain a reliable picture of all the physicochemical processes involved, comprehensive assessments of the costs and benefits of alternative control strategies are not easily made, and the technology for pollution abatement is not yet well established. Various mathematical or formal management models do exist but the overall impact of modeling on decision making has so far been relatively small. The first aim of the IIASA Workshop on which this volume is based was to bridge the gap between air-quality modeling and management. As described in the ten papers in Part One, Workshop participants examined the goals actually pursued by decision makers, the potential role of mathematical models in air-quality management, and the extent to which modeling has been used in real situations in a number of countries. The Workshop's second aim, reported in the eight papers in Part Two, was to consider the unusual strategy of real-time emission control. An extended description of the IIASA case study of the Venetian Lagoon area was presented, together with contributions on real-time forecast and control schemes in operation in Japan and Italy

    Advanced Process Monitoring for Industry 4.0

    This book reports recent advances in Process Monitoring (PM) to cope with the many challenges raised by the new production systems, sensors and “extreme data” conditions that emerged with Industry 4.0. Concepts such as digital twins and deep learning are brought to the PM arena, pushing forward the capabilities of existing methodologies to handle more complex scenarios. The evolution of classical paradigms such as Latent Variable modeling, Six Sigma and FMEA is also covered. Applications span a wide range of domains such as microelectronics, semiconductors, chemicals, materials and agriculture, as well as the monitoring of rotating equipment, combustion systems and membrane separation processes.