
    Prediction And Allocation Of Live To Virtual Communication Bridging Resources

    This document summarizes a research effort focused on improving live-to-virtual (L-V) communication systems. The purpose of this work is to address a significant challenge facing the tactical communications training community through the development of the Live-to-Virtual Relay Radio Prediction Algorithm and implementation of the algorithm into an Integrated Live-to-Virtual Communications Server prototype device. The motivation for the work and the challenges of integrating live and virtual communications are presented. Details surrounding the formulation of the prediction algorithm and a description of the prototype system, hardware, and software architectures are shared. Experimental results from discrete event simulation analysis and prototype functionality testing accompany recommendations for future investigation. If the methods and technologies summarized are implemented, an estimated equipment savings of 25%-53% and an estimated cost savings of $150,000.00 - $630,000.00 per site are anticipated. Thus, a solution to a critical tactical communications training problem is presented through the research discussed

    An Overview on Application of Machine Learning Techniques in Optical Networks

    Today's telecommunication networks have become sources of enormous amounts of widely heterogeneous data. This information can be retrieved from network traffic traces, network alarms, signal quality indicators, users' behavioral data, etc. Advanced mathematical tools are required to extract meaningful information from these data and to make decisions pertaining to the proper functioning of the networks. Among these mathematical tools, Machine Learning (ML) is regarded as one of the most promising methodological approaches to perform network-data analysis and enable automated network self-configuration and fault management. The adoption of ML techniques in the field of optical communication networks is motivated by the unprecedented growth of network complexity faced by optical networks in the last few years. Such complexity increase is due to the introduction of a huge number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes, etc.) that are enabled by the usage of coherent transmission/reception technologies, advanced digital signal processing and compensation of nonlinear effects in optical fiber propagation. In this paper we provide an overview of the application of ML to optical communications and networking. We classify and survey relevant literature dealing with the topic, and we also provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have recently appeared, the application of ML to optical networks is still in its infancy: to stimulate further work in this area, we conclude the paper by proposing new possible research directions

    Robust, Resilient and Reliable Architecture for V2X Communication

    The new developments in mobile edge computing (MEC) and vehicle-to-everything (V2X) communications have positioned 5G and beyond to answer the market need for future emerging intelligent transportation systems and smart city applications. The major attractive features of V2X communication are the inherent ability to adapt to any type of network, device, or data, and to ensure robustness, resilience and reliability of the network, which is challenging to realize. In this work, we propose to drive these features further with a novel robust, resilient and reliable architecture for V2X communication based on harnessing MEC and blockchain technology. A three-stage computing service is proposed. Firstly, a hierarchical computing architecture is deployed over the vehicular network, constituting cloud computing (CC), edge computing (EC), and fog computing (FC) nodes. Resources and databases can migrate from the high-capacity cloud services (furthest from the individual nodes of the network) to the medium-level edge and low-level fog nodes, according to computing service requirements. Secondly, the resource allocation stage filters the data according to its significance, ranks the nodes according to their usability, and selects the network technology according to its physical channel characteristics. Thirdly, we propose a blockchain-based transaction service that ensures reliability. We discuss two use cases for experimental analysis: plug-in electric vehicles in smart grid scenarios, and massive IoT data services for autonomous cars. The results show that car connectivity prediction is accurate 98% of the time, and that 92% more data blocks are added using the micro-blockchain solution compared to the public blockchain, which reduces the time to sign and compute the proof-of-work (PoW) and delivers a low-overhead Proof-of-Stake (PoS) consensus mechanism. This approach can be considered a strong candidate architecture for future V2X and, more generally, for everything-to-everything (X2X) communications
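    The consensus cost the abstract refers to can be illustrated with a toy proof-of-work miner: lowering the difficulty, as a permissioned micro-blockchain can, shrinks the expected number of hash attempts per block. This is a minimal sketch under that assumption; the block payload and difficulty below are illustrative, not the paper's actual chain design.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Find a nonce whose SHA-256 digest starts with `difficulty` zero hex digits."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce
        nonce += 1

# A low-difficulty block (micro-blockchain style) is found after only a
# few hundred attempts on average; each extra zero digit multiplies the
# expected work by 16, which is where public-chain PoW cost comes from.
nonce = mine("vehicle-telemetry-block", difficulty=2)
digest = hashlib.sha256(f"vehicle-telemetry-block:{nonce}".encode()).hexdigest()
```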

    Cross-Subject Continuous Analytic Workload Profiling Using Stochastic Discrete Event Simulation

    Operator functional state (OFS) in remotely piloted aircraft (RPA) simulations is modeled using electroencephalograph (EEG) physiological data and continuous analytic workload profiles (CAWPs). A framework is proposed that provides solutions to the limitations that stem from lengthy training data collection and labeling techniques associated with generating CAWPs for multiple operators/trials. The framework focuses on the creation of scalable machine learning models using two generalization methods: 1) the stochastic generation of CAWPs and 2) the use of cross-subject physiological training data to calibrate machine learning models. Cross-subject workload models are used to infer OFS on new subjects, reducing the need to collect truth data or train individualized workload models for unseen operators. Additionally, stochastic techniques are used to generate representative workload profiles using a limited number of training observations. Both methods are found to reduce data collection requirements at the cost of machine learning prediction quality. The costs in quality are considered acceptable due to drastic reductions in machine learning model calibration time for future operators
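    The cross-subject calibration idea can be sketched in a few lines: pool labeled physiological features from other operators, fit one model, and infer workload for an unseen operator with no individual truth data. The two-dimensional features and nearest-centroid classifier here are illustrative stand-ins for the EEG feature pipeline and ML models the framework actually uses.

```python
def centroids(samples):
    """Mean feature vector per workload label, from pooled training subjects."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, f in enumerate(features):
            acc[i] += f
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in acc] for lbl, acc in sums.items()}

def classify(features, cents):
    """Assign the label of the nearest centroid (squared Euclidean distance)."""
    return min(cents, key=lambda lbl: sum((f - c) ** 2
                                          for f, c in zip(features, cents[lbl])))

# Pooled (features, workload-label) data from two calibration subjects.
pooled = [([0.2, 0.1], "low"), ([0.3, 0.2], "low"),
          ([0.8, 0.9], "high"), ([0.7, 0.8], "high")]
cents = centroids(pooled)
label = classify([0.75, 0.85], cents)  # sample from an unseen operator
```

As the abstract notes, the price of this generalization is some prediction quality: the pooled model averages over inter-subject variability instead of modeling each operator individually.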

    A survey of measurement-based spectrum occupancy modeling for cognitive radios

    Spectrum occupancy models are very useful in cognitive radio designs. They can be used to increase spectrum sensing accuracy for more reliable operation, to remove spectrum sensing for higher resource usage efficiency, or to select channels for better opportunistic access, among other applications. In this survey, various spectrum occupancy models from measurement campaigns taken around the world are investigated. These models extract different statistical properties of the spectrum occupancy from the measured data. In addition to these models, spectrum occupancy prediction is also discussed, where autoregressive and/or moving-average models are used to predict the channel status at future time instants. After comparing these different methods and models, several challenges are also summarized based on this survey
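    The autoregressive prediction the survey discusses can be sketched as an AR(1) model fit to a measured duty-cycle series: the next channel state is extrapolated from the previous observation. The least-squares fit, synthetic trace, and 0.5 busy/idle threshold below are illustrative assumptions; real campaigns would fit higher-order models to measured power levels.

```python
def fit_ar1(series):
    """Least-squares fit of x[t] = a * x[t-1] + b."""
    xs, ys = series[:-1], series[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    b = my - a * mx
    return a, b

def predict_next(series, a, b, threshold=0.5):
    """Predict the next duty cycle and classify the channel as busy or idle."""
    x_next = a * series[-1] + b
    return x_next, ("busy" if x_next >= threshold else "idle")

# Synthetic occupancy (duty-cycle) trace with slow time correlation.
trace = [0.1, 0.15, 0.2, 0.35, 0.5, 0.6, 0.7, 0.75, 0.8, 0.82]
a, b = fit_ar1(trace)
value, state = predict_next(trace, a, b)
```

A cognitive radio would use such a forecast to skip sensing a channel it expects to be busy, or to rank candidate channels for opportunistic access.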

    2020 NASA Technology Taxonomy

    This document is an update (new photos used) of the PDF version of the 2020 NASA Technology Taxonomy that will be available to download on the OCT Public Website. The updated 2020 NASA Technology Taxonomy, or "technology dictionary", uses a technology-discipline-based approach that realigns like technologies independent of their application within the NASA mission portfolio. This tool is meant to serve as a common technology-discipline-based communication tool across the agency and with its partners in other government agencies, academia, industry, and across the world

    A Cognitive Routing framework for Self-Organised Knowledge Defined Networks

    This study investigates the applicability of machine learning methods to routing protocols for achieving rapid convergence in self-organized knowledge-defined networks. The research explores the constituents of the Self-Organized Networking (SON) paradigm for 5G and beyond, aiming to design a routing protocol that complies with the SON requirements. Further, it also exploits a contemporary discipline called Knowledge-Defined Networking (KDN) to extend the routing capability by calculating the “Most Reliable” path rather than the shortest one. The research identifies the potential key areas and possible techniques to meet the objectives by surveying the state of the art of the relevant fields, such as QoS-aware routing, hybrid SDN architectures, intelligent routing models, and service migration techniques. The design phase focuses primarily on the mathematical modelling of the routing problem and approaches the solution by optimizing at the structural level. The work contributes the Stochastic Temporal Edge Normalization (STEN) technique, which fuses link and node utilization for cost calculation; MRoute, a hybrid routing algorithm for SDN that leverages STEN to provide constant-time convergence; and Most Reliable Route First (MRRF), which uses a Recurrent Neural Network (RNN) to approximate route reliability as its routing metric. Additionally, the research outcomes include a cross-platform SDN Integration framework (SDN-SIM) and a secure migration technique for containerized services in a Multi-access Edge Computing environment using Distributed Ledger Technology. Future work targets the development of 6G standards and compliance with Industry 5.0, enhancing the present outcomes in light of Deep Reinforcement Learning and Quantum Computing
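    The cost-fusion idea attributed to STEN can be sketched as a shortest-path search over edge costs that blend link utilization with the downstream node's utilization, so a lightly loaded but longer route can beat a congested shortest one. The weighting, the toy graph, and the Dijkstra formulation below are illustrative assumptions, not the paper's actual STEN formula.

```python
import heapq

def fused_cost(link_util, node_util, alpha=0.5):
    """Blend link and node utilization into a single edge cost."""
    return alpha * link_util + (1 - alpha) * node_util

def most_reliable_path(graph, node_util, src, dst):
    """Dijkstra over fused costs; returns (total_cost, path)."""
    pq = [(0.0, src, [src])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, link_util in graph.get(node, []):
            if nxt not in seen:
                step = fused_cost(link_util, node_util[nxt])
                heapq.heappush(pq, (cost + step, nxt, path + [nxt]))
    return float("inf"), []

# Toy topology: adjacency list of (neighbor, link utilization),
# plus a per-node utilization map. The A->B->D route wins because
# both its links and its intermediate node are lightly loaded.
graph = {"A": [("B", 0.2), ("C", 0.9)], "B": [("D", 0.3)], "C": [("D", 0.1)]}
node_util = {"A": 0.1, "B": 0.2, "C": 0.8, "D": 0.3}
cost, path = most_reliable_path(graph, node_util, "A", "D")
```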