64 research outputs found

    Adaptive Intelligent Systems for Extreme Environments

    As embedded processors become more powerful, a growing number of embedded systems equipped with artificial intelligence (AI) algorithms are being used in radiation environments to perform routine tasks and reduce the radiation risk for human workers. On the one hand, commercial off-the-shelf devices and components are becoming increasingly popular because of their low price, making such tasks more affordable. On the other hand, their use presents new challenges in improving radiation tolerance, supporting multiple AI tasks, and delivering power efficiency in harsh environments. Three pieces of research work have been completed in this thesis: 1) a fast simulation method for the analysis of single event effects (SEE) in integrated circuits, 2) a self-refresh scheme to detect and correct bit-flips in random access memory (RAM), and 3) a hardware AI system with dynamic hardware accelerators and AI models for increased flexibility and efficiency. In practical implementations, variations in physical parameters such as the nature of the particle, the linear energy transfer and the circuit characteristics can have a large impact on simulation accuracy, which significantly increases the complexity and cost of transistor-level simulation workflows and makes it difficult to conduct SEE simulations for large-scale circuits. Therefore, in the first research work, a new SEE simulation scheme is proposed to offer a fast and cost-efficient method for evaluating and comparing the performance of large-scale circuits subject to the effects of radiation particles. The advantages of transistor-level and hardware description language (HDL) simulations are combined to produce accurate SEE digital error models for rapid error analysis in large-scale circuits. Under the proposed scheme, time-consuming back-end steps are skipped, and the SEE analysis of a large-scale circuit can be completed in just a few hours. In high-radiation environments, bit-flips in RAMs can not only occur but also accumulate, and typical error mitigation methods cannot handle high error rates at low hardware cost. In the second work, an adaptive scheme combining error-correcting codes and refreshing techniques is proposed to correct errors and mitigate error accumulation in extreme radiation environments. The scheme continuously refreshes the data in RAMs so that errors cannot accumulate. Because the proposed design shares the same ports with the user module without changing the timing sequence, it can easily be applied to systems whose hardware modules are designed with fixed read and write latency. Implementing intelligent systems with constrained hardware resources is a further challenge. In the third work, an adaptive hardware resource management system for multiple AI tasks in harsh environments was designed. Inspired by the "refreshing" concept in the second work, we utilise a key feature of FPGAs, partial reconfiguration, to improve the reliability and efficiency of the AI system. More importantly, this feature provides the capability to manage the hardware resources for deep learning acceleration. In the proposed design, the on-chip hardware resources are dynamically managed to improve the flexibility, performance and power efficiency of deep learning inference systems.
The deep learning units provided by Xilinx are used to perform multiple AI tasks simultaneously, and the experiments show significant improvements in power efficiency across a wide range of scenarios with different workloads. To further improve system performance, the concept of reconfiguration was extended and an adaptive deep learning (DL) software framework was designed. This framework provides a significant level of adaptability support for various deep learning algorithms on an FPGA-based edge computing platform. To meet the specific accuracy and latency requirements derived from the running applications and operating environments, the platform can dynamically update its hardware and software (e.g., processing pipelines) to achieve better cost, power, and processing efficiency than a static system.
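
    The second contribution, the refresh-and-correct (scrubbing) scheme, can be illustrated with a minimal software sketch. The fragment below is a toy under stated assumptions, not the thesis's FPGA design: it uses a textbook Hamming(7,4) code on 4-bit words, a hypothetical 16-word memory, and randomly injected single-bit upsets, and shows how a periodic read-correct-write-back pass prevents errors from accumulating.

        # Toy illustration of ECC-based memory scrubbing (not the thesis's design).
        # A Hamming(7,4) code protects each 4-bit word; a periodic scrub pass reads,
        # corrects and writes back every word so single-bit upsets cannot accumulate.
        import random

        def hamming74_encode(nibble):
            d = [(nibble >> i) & 1 for i in range(4)]      # data bits d0..d3
            p1 = d[0] ^ d[1] ^ d[3]                        # parity over positions 1,3,5,7
            p2 = d[0] ^ d[2] ^ d[3]                        # parity over positions 2,3,6,7
            p3 = d[1] ^ d[2] ^ d[3]                        # parity over positions 4,5,6,7
            return [p1, p2, d[0], p3, d[1], d[2], d[3]]    # codeword positions 1..7

        def hamming74_correct(bits):
            s1 = bits[0] ^ bits[2] ^ bits[4] ^ bits[6]
            s2 = bits[1] ^ bits[2] ^ bits[5] ^ bits[6]
            s3 = bits[3] ^ bits[4] ^ bits[5] ^ bits[6]
            syndrome = s1 + 2 * s2 + 4 * s3                # 1-based position of the flipped bit
            if syndrome:
                bits[syndrome - 1] ^= 1
            data = bits[2] | (bits[4] << 1) | (bits[5] << 2) | (bits[6] << 3)
            return data, bool(syndrome)

        random.seed(0)
        memory = [hamming74_encode(v) for v in range(16)]  # hypothetical 16-word "RAM"

        for addr in random.sample(range(16), 4):           # inject simulated single-event upsets
            memory[addr][random.randrange(7)] ^= 1

        corrected = 0                                      # one scrub (refresh) pass
        for addr, word in enumerate(memory):
            data, had_error = hamming74_correct(word)
            memory[addr] = hamming74_encode(data)          # write back the clean codeword
            corrected += had_error
        print(f"scrub pass corrected {corrected} word(s)")
        assert [hamming74_correct(w)[0] for w in memory] == list(range(16))

    In the thesis this logic lives in hardware and shares the RAM ports with the user module; the sketch only conveys the refresh-and-correct cycle.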

    Unmanned Aerial Vehicle (UAV)-Enabled Wireless Communications and Networking

    The emerging massive density of human-held and machine-type nodes implies larger traffic deviations in the future than we are facing today. The future network will be characterized by a high degree of flexibility, allowing it to adapt smoothly, autonomously, and efficiently to quickly changing traffic demands in both time and space. This flexibility cannot be achieved when the network’s infrastructure remains static. To this end, the topic of UAV (unmanned aerial vehicle)-enabled wireless communications and networking has received increased attention. As mentioned above, the network must serve a massive density of nodes that can be either human-held (user devices) or machine-type nodes (sensors). If we wish to serve these nodes properly and optimize their data, a proper wireless connection is fundamental. This can be achieved by using UAV-enabled communications and networks. This Special Issue addresses the many issues that still need to be resolved before UAV-enabled wireless communications and networking can be properly rolled out.

    DVB-RCS2 satellite return link: queue modeling and game theory-based resource allocation optimization

    Doctoral thesis, Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Elétrica, 2022. Satellite networks are expected to play a vital role in future communication systems, with complex features and seamless integration with ground-based infrastructure. This work proposes three main contributions. First, it presents a novel simulation framework capable of providing a detailed assessment of a satellite communication network's performance in realistic scenarios, employing an event-driven methodology and modeling the communications network as a discrete event system (DES). The work focuses on the return link of the Digital Video Broadcast Return Channel via Satellite (DVB-RCS2) standard. Three different scenarios demonstrate possible uses of the simulator's output to understand the network's dynamic behavior and reach an optimal system operating point. Each scenario explores a different feature of the simulator, and the simulated range covers a large territory with thousands of users, which in our case study was the country of Brazil. Second, this work introduces a novel modification of the conventional game theory-based timeslot assignment method, applying it to the DVB-RCS2 system.
This procedure considers the terminal's spectral efficiency as a weighting parameter and uses it as an input to the convex optimization problem that results from the Nash Bargaining Solution. The approach guarantees the fulfillment of Quality of Service (QoS) constraints while maintaining a higher fairness measure; results show a 5% improvement in fairness, a 73% decrease in the standard deviation of fairness between frames, and a 12.5% increase in the average normalized terminal BTU allocation satisfaction. Lastly, we present an alternative queuing model for the DVB-RCS2 return link using Markov chains, developed to predict traditional queue parameters such as traffic intensity, average queue size and average waiting time, among others. We used data gathered from a series of simulations with the DES framework to validate this queuing model as a useful numerical approximation of the real application scenario. Finally, we present the alpha allocation algorithm (GTAC), which keeps the average waiting time of a terminal in the queue below a threshold while optimizing the average terminal throughput.
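
    For readers unfamiliar with the weighted Nash bargaining formulation referenced above, a generic sketch of such a timeslot allocation problem is given below. The notation (x_i, d_i, w_i, C) is assumed for illustration and is not taken from the thesis:

        \begin{align}
        \max_{x_1,\dots,x_N} \quad & \sum_{i=1}^{N} w_i \log\left(x_i - d_i\right) \\
        \text{s.t.} \quad & \sum_{i=1}^{N} x_i \le C, \qquad x_i \ge d_i \;\; \forall i,
        \end{align}

    where x_i is the capacity (timeslots) allocated to terminal i, d_i its minimum QoS demand, C the total frame capacity, and w_i a weight that grows with the terminal's spectral efficiency. The logarithmic Nash bargaining objective is what provides the fairness properties reported above, while the weights bias the allocation toward terminals that use the spectrum more efficiently.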

    Cyber Security

    This open access book constitutes the refereed proceedings of the 16th International Annual Conference on Cyber Security, CNCERT 2020, held in Beijing, China, in August 2020. The 17 papers presented were carefully reviewed and selected from 58 submissions. The papers are organized according to the following topical sections: access control; cryptography; denial-of-service attacks; hardware security implementation; intrusion/anomaly detection and malware mitigation; social network security and privacy; systems security.

    Telecommunications Networks

    This book guides readers from the basics of rapidly emerging networks to more advanced concepts and future expectations of telecommunications networks. It identifies and examines the most pressing research issues in telecommunications, and it contains chapters written by leading researchers, academics and industry professionals. Telecommunications Networks - Current Status and Future Trends covers surveys of recent publications that investigate key areas of interest such as IMS, eTOM, 3G/4G, optimization problems, modeling, simulation and quality of service. The book, which is suitable for both PhD and master's students, is organized into six sections: New Generation Networks, Quality of Services, Sensor Networks, Telecommunications, Traffic Engineering and Routing.

    Advanced Trends in Wireless Communications

    Physical limitations on wireless communication channels impose huge challenges to reliable communication. Bandwidth limitations, propagation loss, noise and interference make the wireless channel a narrow pipe that does not readily accommodate a rapid flow of data. Researchers therefore aim to design systems that are suited to operating in such channels in order to deliver a high quality of service. The mobility of communication systems also requires further investigation to reduce the complexity and power consumption of the receiver. This book aims to provide highlights of current research in the field of wireless communications. The subjects discussed are very valuable to communication researchers as well as to researchers in related wireless areas. The book chapters cover a wide range of wireless communication topics.

    Efficient Decision Support Systems

    This series is directed to the diverse managerial professionals who are leading the transformation of individual domains by using expert information and domain knowledge to drive decision support systems (DSSs). The series offers a broad range of subjects addressed in specific areas such as health care, business management, banking, agriculture, environmental improvement, natural resource and spatial management, aviation administration, and hybrid applications of information technology aimed at interdisciplinary issues. The series is composed of three volumes: Volume 1 covers general concepts and the methodology of DSSs; Volume 2 covers applications of DSSs in the biomedical domain; Volume 3 covers hybrid applications of DSSs in multidisciplinary domains. The books are built around decision support strategies in the new infrastructure, assisting readers in making full use of the creative technology to manipulate input data and to transform information into useful decisions for decision makers.

    On the queuing delay of time-varying channels in Low Earth Orbit satellite constellations

    Low Earth Orbit (LEO) satellite constellations are envisioned as a complementary or integrated part of 5G and future 6G networks for broadband or massive access, given their capability of full Earth coverage in inaccessible or very isolated environments. Although the queuing and end-to-end delays of such networks have been analyzed for channels with fixed statistics, there is currently a lack of understanding of the effects of more realistic time-varying channels on traffic aggregation across such networks. Therefore, in this work we propose a queuing model for LEO constellation-based networks that captures the inherent variability of realistic satellite channels, where ground-to-satellite/satellite-to-ground links may present extremely poor connection periods due to the Land Mobile Satellite (LMS) channel. We verify the validity of our model with an extensive event-driven simulation framework that captures the characteristics of the considered scenario. We then study the queuing and end-to-end delay distributions under such channels for various link, traffic, packet and background conditions, observing a good match between theory and simulation. Our results show that, on an established path between two ground stations through the constellation, the ground-to-satellite/satellite-to-ground links and the background traffic have a much stronger impact on the end-to-end delay, in mean and particularly in variance, even with moderate queues, than the unobstructed inter-satellite connections in outer space. This might hinder the usability of these networks for services with stringent time requirements. This work was supported in part by the European Union’s Horizon 2020 Research and Innovation Programme under Grant 861111, in part by the Innovation Fund Denmark project Drones4Energy under Project J.nr. 8057-00038A, and in part by the Spanish Government through the Ministerio de Economía y Competitividad, Fondo Europeo de Desarrollo Regional (MINECO-FEDER), project Future Internet Enabled Resilient smart CitiEs (FIERCE) under Grant RTI2018-093475-AI00.
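
    As a rough intuition for why time-varying links inflate queuing delay, the following toy discrete-time simulation contrasts an always-good link with a two-state (good/bad) channel in the spirit of LMS blockage periods. It is not the paper's model; the transition probabilities, service rates and load below are all assumed for illustration.

        # Toy discrete-time queue over a two-state (good/bad) link, loosely inspired
        # by LMS blockage periods. Not the paper's model: transition probabilities,
        # service rates and offered load are assumed for illustration.
        import math
        import random
        import statistics

        def simulate(p_good_to_bad, p_bad_to_good, n_slots=200_000, seed=1):
            rng = random.Random(seed)

            def poisson(lam):                        # Knuth's method, fine for small lam
                limit, k, p = math.exp(-lam), 0, 1.0
                while True:
                    p *= rng.random()
                    if p <= limit:
                        return k
                    k += 1

            arrivals_per_slot = 4                    # mean offered load (packets/slot)
            service_good, service_bad = 10, 0        # packets served per slot by state
            good, queue, delays = True, [], []
            for t in range(n_slots):
                if good and rng.random() < p_good_to_bad:      # channel state evolution
                    good = False
                elif not good and rng.random() < p_bad_to_good:
                    good = True
                queue.extend([t] * poisson(arrivals_per_slot))  # arrivals
                for _ in range(service_good if good else service_bad):
                    if not queue:
                        break
                    delays.append(t - queue.pop(0))             # queuing delay in slots
            return statistics.mean(delays)

        print("mean delay, always-good link :", simulate(0.0, 1.0), "slots")
        print("mean delay, good/bad channel :", simulate(0.001, 0.01), "slots")

    Even though both settings have ample average capacity, the good/bad channel yields a much larger mean (and especially variance of) delay, qualitatively mirroring the effect discussed above.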

    Routing in the Space Internet: A contact graph routing tutorial

    A Space Internet is possible, as long as the delay and disruption challenges imposed by the space environment are properly tackled. Because these conditions are not well addressed by the terrestrial Internet, more capable Delay-Tolerant Networking (DTN) protocols and algorithms are being developed. In particular, the principles and techniques for routing among ground elements and spacecraft in near-Earth orbit and deep space are enacted in the Contact Graph Routing (CGR) framework. CGR blends a set of non-trivial algorithm adaptations, space operations concepts, time-dynamic scheduling, and specific graph models. The complexity of that framework suggests a need for a focused discussion to facilitate its direct and correct apprehension. To this end, we present an in-depth tutorial that collects and organizes first-hand experience in researching, developing, implementing, and standardizing CGR. The content is laid out in a structure that considers the planning, route search and management, and forwarding phases bridging the ground and space domains. We rely on intuitive graphical examples, supporting code material, and references to flight-grade CGR implementation details where pertinent. We hope this tutorial will serve as a valuable resource for engineers and that researchers can also apply the insights presented here to topics in DTN research. Affiliations: Fraire, Juan Andres, Universidad Nacional de Córdoba, Facultad de Ciencias Exactas, Físicas y Naturales, Argentina, and Universitat Saarland, Germany; De Jonckère, Olivier, Technische Universität Dresden, Germany; Burleigh, Scott C., California Institute of Technology, United States.
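
    To give a flavour of the route search phase discussed in the tutorial, the sketch below runs an earliest-arrival Dijkstra over a list of scheduled contacts. It is a deliberately simplified illustration of the idea behind CGR, not the standardized algorithm (it ignores volume limits, residual window checks, overbooking and route list management), and the contact plan is hypothetical.

        # Simplified illustration of the earliest-arrival idea behind CGR route search
        # (not the standardized algorithm): Dijkstra over a list of scheduled contacts.
        # The contact windows, delays and topology below are hypothetical.
        import heapq
        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Contact:
            frm: str      # transmitting node
            to: str       # receiving node
            start: float  # contact start time (s)
            end: float    # contact end time (s)
            owlt: float   # one-way light time (s)

        def earliest_arrival(contacts, source, dest, t0=0.0):
            """Return (arrival_time, path) for the earliest arrival at dest."""
            best = {source: t0}
            heap = [(t0, source, [source])]
            while heap:
                t, node, path = heapq.heappop(heap)
                if node == dest:
                    return t, path
                if t > best.get(node, float("inf")):
                    continue                            # stale queue entry
                for c in contacts:
                    if c.frm != node or c.end <= t:
                        continue                        # contact unusable at time t
                    arrival = max(t, c.start) + c.owlt  # wait for the window, then propagate
                    if arrival < best.get(c.to, float("inf")):
                        best[c.to] = arrival
                        heapq.heappush(heap, (arrival, c.to, path + [c.to]))
            return float("inf"), []

        # Hypothetical contact plan: ground station GS1 -> satellites -> GS2
        plan = [
            Contact("GS1", "SAT1", 10, 60, 0.02),
            Contact("SAT1", "SAT2", 30, 90, 0.01),
            Contact("SAT2", "GS2", 80, 120, 0.02),
            Contact("SAT1", "GS2", 200, 250, 0.02),
        ]
        print(earliest_arrival(plan, "GS1", "GS2", t0=0.0))
        # -> earliest arrival at about 80.02 s via GS1 -> SAT1 -> SAT2 -> GS2,
        #    beating the later direct SAT1 -> GS2 contact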

    Delay over LEO networks: modeling and analysis

    Technology has evolved rapidly over a relatively short period of time, to such an extent that conventional networks and Non-Terrestrial Networks, which have always been there, are now being jointly considered to provide coverage from anywhere, regardless of location, and to meet the increasing demand, in which the looming of IoT communications is particularly relevant. Among the existing satellite orbits, this project focuses on LEO communications. One of their key advantages is low latency, due to the shorter distance to the Earth. Another key feature is the “always-on” connectivity, which is ensured by the continuous movement with respect to the Earth. In this BSc thesis, we have thoroughly studied the performance of the satellite-to-ground channel, as well as the subsequent inter-satellite links. The analysis is carried out with an event-driven simulator developed by the Grupo de Ingeniería Telemática. We extensively analyze the statistics of the obtained results in order to shed light on the network performance in different realistic scenarios. Grado en Ingeniería de Tecnologías de Telecomunicación.