1,639 research outputs found

    Intelligent flight control systems

    The capabilities of flight control systems can be enhanced by designing them to emulate functions of natural intelligence. Intelligent control functions fall into three categories. Declarative actions involve decision-making, providing models for system monitoring, goal planning, and system/scenario identification. Procedural actions concern skilled behavior and have parallels in guidance, navigation, and adaptation. Reflexive actions are spontaneous, inner-loop responses for control and estimation. Intelligent flight control systems acquire knowledge of the aircraft and its mission and adapt to changes in the flight environment. Cognitive models form an efficient basis for integrating 'outer-loop/inner-loop' control functions and for developing robust parallel-processing algorithms.
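    As a rough illustration of how the three categories could sit in one 'outer-loop/inner-loop' control cycle, the following is a minimal, hypothetical layering; all function names, state variables, and gains are assumptions made for illustration and are not taken from the paper.

```python
# Minimal sketch of a layered "outer-loop/inner-loop" controller;
# names and gains are illustrative assumptions, not from the paper.

def declarative_plan(state, waypoints):
    """Decision-making: pick the next goal and monitor system health."""
    goal = next((w for w in waypoints if w > state["position"]), state["position"])
    return {"goal": goal, "healthy": abs(state["rate"]) < 10.0}

def procedural_guidance(state, plan):
    """Skilled behaviour: turn the goal into a rate reference (guidance/navigation)."""
    return 0.5 * (plan["goal"] - state["position"])

def reflexive_control(state, rate_ref, gain=0.8):
    """Spontaneous inner-loop response: track the rate reference."""
    return gain * (rate_ref - state["rate"])

def control_step(state, waypoints):
    plan = declarative_plan(state, waypoints)      # outer loop (declarative)
    rate_ref = procedural_guidance(state, plan)    # middle loop (procedural)
    return reflexive_control(state, rate_ref)      # inner loop (reflexive)

print(control_step({"position": 0.0, "rate": 0.1}, [1.0, 2.0, 3.0]))
```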

    Massive MIMO is a Reality -- What is Next? Five Promising Research Directions for Antenna Arrays

    Massive MIMO (multiple-input multiple-output) is no longer a "wild" or "promising" concept for future cellular networks - in 2018 it became a reality. Base stations (BSs) with 64 fully digital transceiver chains were commercially deployed in several countries, the key ingredients of Massive MIMO have made it into the 5G standard, the signal processing methods required to achieve unprecedented spectral efficiency have been developed, and the limitation due to pilot contamination has been resolved. Even the development of fully digital Massive MIMO arrays for mmWave frequencies - once viewed as prohibitively complicated and costly - is well underway. In a few years, Massive MIMO with fully digital transceivers will be a mainstream feature at both sub-6 GHz and mmWave frequencies. In this paper, we explain how the first chapter of the Massive MIMO research saga has come to an end, while the story has just begun. The coming wide-scale deployment of BSs with massive antenna arrays opens the door to a brand new world where spatial processing capabilities are omnipresent. In addition to mobile broadband services, the antennas can be used for other communication applications, such as low-power machine-type or ultra-reliable communications, as well as non-communication applications such as radar, sensing, and positioning. We outline five new Massive MIMO-related research directions: Extremely large aperture arrays, Holographic Massive MIMO, Six-dimensional positioning, Large-scale MIMO radar, and Intelligent Massive MIMO. (Comment: 20 pages, 9 figures, submitted to Digital Signal Processing)
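    As a small companion to the spectral-efficiency claim, here is a hedged Monte Carlo sketch of the uplink spectral efficiency achieved by maximum-ratio (MR) combining at a 64-antenna BS over i.i.d. Rayleigh fading; the antenna/user counts and SNR are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: average uplink spectral efficiency with MR combining for an
# M-antenna BS and K single-antenna users; parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
M, K, snr = 64, 8, 1.0          # BS antennas, users, per-user transmit SNR

def avg_se_mr(trials=200):
    se = []
    for _ in range(trials):
        # i.i.d. Rayleigh channels, one column per user
        H = (rng.standard_normal((M, K)) + 1j * rng.standard_normal((M, K))) / np.sqrt(2)
        for k in range(K):
            v = H[:, k]                                    # MR combining vector
            signal = snr * np.abs(v.conj() @ H[:, k]) ** 2
            interference = snr * sum(np.abs(v.conj() @ H[:, j]) ** 2
                                     for j in range(K) if j != k)
            noise = np.linalg.norm(v) ** 2                 # unit-variance noise
            se.append(np.log2(1.0 + signal / (interference + noise)))
    return float(np.mean(se))

print(avg_se_mr())   # grows roughly like log2(M) for fixed K, i.e. the array gain
```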

    Multifidelity uncertainty quantification with models based on dissimilar parameters

    Multifidelity uncertainty quantification (MF UQ) sampling approaches have been shown to significantly reduce the variance of statistical estimators while preserving the bias of the highest-fidelity model, provided that the low-fidelity models are well correlated. However, maintaining a high level of correlation can be challenging, especially when models depend on different uncertain input parameters, which drastically reduces the correlation. Existing MF UQ approaches do not adequately address this issue. In this work, we propose a new sampling strategy that exploits a shared space to improve the correlation among models with dissimilar parametrization. We achieve this by transforming the original coordinates onto an auxiliary manifold using the adaptive basis (AB) method (Tipireddy and Ghanem, 2014). The AB method has two main benefits: (1) it provides an effective tool to identify the low-dimensional manifold on which each model can be represented, and (2) it enables easy transformation of polynomial chaos representations from high- to low-dimensional spaces. This latter feature is used to identify a shared manifold among models without requiring additional evaluations. We present two algorithmic flavors of the new estimator to cover different analysis scenarios, including those with legacy and non-legacy high-fidelity data. We provide numerical results for analytical examples, a direct field acoustic test, and a finite element model of a nuclear fuel assembly. For all examples, we compare the proposed strategy against both single-fidelity and MF estimators based on the original model parametrization.
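    For readers unfamiliar with MF UQ sampling, the following is a minimal two-fidelity control-variate sketch of the variance-reduction idea the abstract builds on; the toy models, sample sizes, and weight estimate are assumptions made for illustration and are not the estimators or examples from the paper.

```python
# Hedged sketch of a two-fidelity control-variate estimator: a cheap,
# correlated low-fidelity model reduces the variance of the high-fidelity
# mean estimate while preserving its bias. Toy models only.
import numpy as np

rng = np.random.default_rng(1)

def f_hi(x):            # expensive high-fidelity model (toy)
    return np.sin(x) + 0.1 * x**2

def f_lo(x):            # cheap, well-correlated low-fidelity model (toy)
    return np.sin(x)

N_hi, N_lo = 50, 5000                       # few expensive runs, many cheap ones
x_hi = rng.normal(size=N_hi)                # shared input distribution
x_lo = rng.normal(size=N_lo)

y_hi, y_lo_paired = f_hi(x_hi), f_lo(x_hi)
# Control-variate weight from the paired samples
alpha = np.cov(y_hi, y_lo_paired)[0, 1] / np.var(y_lo_paired, ddof=1)

# MF estimate: high-fidelity mean corrected by the cheaply estimated LF mean
mu_mf = y_hi.mean() + alpha * (f_lo(x_lo).mean() - y_lo_paired.mean())
print(mu_mf, y_hi.mean())
```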

    Coordinated Multi-cell Beamforming for Massive MIMO: A Random Matrix Approach

    We consider the problem of coordinated multi-cell downlink beamforming in massive multiple-input multiple-output (MIMO) systems consisting of N cells, Nt antennas per base station (BS), and K user terminals (UTs) per cell. Specifically, we formulate a multi-cell beamforming algorithm for massive MIMO systems that requires only a limited amount of information exchange between the BSs. The design objective is to minimize the aggregate transmit power across all the BSs subject to satisfying the user signal-to-interference-plus-noise ratio (SINR) constraints. The algorithm requires the BSs to exchange parameters that can be computed solely from the channel statistics rather than the instantaneous CSI. We make use of tools from random matrix theory to formulate the decentralized algorithm. We also characterize a lower bound on the set of target SINR values for which the decentralized multi-cell beamforming algorithm is feasible. We further show that the performance of our algorithm asymptotically matches the performance of the centralized algorithm with full CSI sharing. While the original result focuses on minimizing the aggregate transmit power across all the BSs, we formulate a heuristic extension of this algorithm to incorporate a practical constraint in multi-cell systems, namely the individual BS transmit power constraints. Finally, we investigate the impact of imperfect CSI and the pilot contamination effect on the performance of the decentralized algorithm, and propose a heuristic extension of the algorithm to accommodate these issues. Simulation results illustrate that our algorithm closely satisfies the target SINR constraints and achieves minimum power in the massive MIMO regime. In addition, it provides substantial power savings compared to zero-forcing beamforming when the number of antennas per BS is of the same order of magnitude as the number of UTs per cell.
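    For reference, a hedged sketch of the kind of power-minimization problem described above, in assumed notation (not the paper's): with downlink beamformer w_{n,k} for user k in cell n and channel h_{m,n,k} from BS m to that user, coordinated beamforming minimizes the aggregate transmit power subject to per-user SINR targets gamma_k:

```latex
\min_{\{\mathbf{w}_{n,k}\}} \; \sum_{n=1}^{N} \sum_{k=1}^{K} \|\mathbf{w}_{n,k}\|^{2}
\quad \text{s.t.} \quad
\frac{\bigl|\mathbf{h}_{n,n,k}^{H}\mathbf{w}_{n,k}\bigr|^{2}}
     {\sum_{(m,j)\neq(n,k)} \bigl|\mathbf{h}_{m,n,k}^{H}\mathbf{w}_{m,j}\bigr|^{2} + \sigma^{2}}
\;\ge\; \gamma_{k},
\qquad \forall\, n=1,\dots,N,\; k=1,\dots,K .
```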

    Exploring digital, augmented, and data-driven modelling tools in product engineering and design

    Tools are indispensable for all diligent professional practice. New concepts and possibilities for paradigm shifts are emerging with recent computational developments in digital tools. However, new tools built on key concepts such as "Big Data", "Accessibility", and "Algorithmic Design" are fundamentally changing the input and position of the Product Engineer and Designer. After a contextual introduction, this dissertation extracts three pivotal criteria from an analysis of the state of the art in Product Design Engineering. For each criterion, the most relevant and paradigmatic emerging concepts are explored and then positioned and compared within the Product Lifecycle Management wheel, where potential risks and gaps are identified for exploration in the experimental part. The empirical experiences are of two kinds: the first consists of case studies from Architecture and Urban Planning, drawn from the student's professional experience, which served as a pretext and inspiration for the experiments carried out directly for Product Design Engineering. These begin with a set of isolated explorations and analyses, continue with a hypothetical experience derived from them, and end with a deliberative section that culminates in a list of risks and changes concluded from all the previous work. The urgency of reflecting on what will change in that role and position, and on what kind of ethical and/or conceptual reformulations should take place for the profession to maintain its intellectual integrity and, ultimately, to survive, is abundantly clear. (Master's in Product Engineering and Design)

    The Engineering of Software-Defined Quantum Key Distribution Networks

    Quantum computers will change the cryptographic panorama. A technology once believed to lie far in the future is increasingly close to real-world applications. Quantum computers will break the algorithms used in our public key infrastructure and in our key exchange protocols, forcing a complete retooling of cryptography as we know it. Quantum key distribution is a physical-layer technology immune to quantum or classical computational threats. However, it requires a physical substrate, and optical fiber has been the usual choice, most of the time used just as a point-to-point link for the exclusive transport of the delicate quantum signals. Its integration into a real-world shared network has not been attempted so far. Here we show how the new programmable software network architectures, together with specially designed quantum systems, can be used to produce a network that integrates classical and quantum communications, including management, in a single, production-level infrastructure. The network can also incorporate new quantum-safe algorithms and use the existing security protocols, thus bridging the gap between today's network security and the quantum-safe network of the future. This can be done in an evolutionary way, without zero-day migrations and the corresponding upfront costs. We also present how the technologies have been deployed in practice using a production network. (Comment: 7 pages, 4 figures, accepted for publication in the IEEE Communications Magazine, Future Internet: Architectures and Protocols issue)
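    One common way to combine "new quantum-safe algorithms" with "existing security protocols" is to mix a QKD-delivered key with a classically negotiated (e.g. post-quantum) secret before handing the result to the existing protocol; the sketch below shows such a hybrid key derivation and is an illustrative assumption, not the paper's specified mechanism.

```python
# Hedged sketch: deriving one session key from a QKD-delivered key plus a
# classically exchanged (e.g. post-quantum KEM) secret, HKDF-style.
# Key sources and construction are illustrative assumptions only.
import hmac, hashlib, os

def hkdf(salt, ikm, info, length=32):
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()           # extract step
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                                     # expand step
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

qkd_key = os.urandom(32)   # stand-in for a key delivered by the QKD layer
pqc_key = os.urandom(32)   # stand-in for a post-quantum KEM shared secret

session_key = hkdf(salt=b"qkd+pqc", ikm=qkd_key + pqc_key, info=b"session")
print(session_key.hex())
```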