
    An investigation on the use of SNR distributions for the optimisation of coarse-fine spectrum sensing for cognitive radio

    This thesis investigates the optimisation of Coarse-Fine (CF) spectrum sensing architectures under a distribution of SNRs for Dynamic Spectrum Access (DSA). Three different detector architectures are investigated: the Coarse-Sorting Fine Detector (CSFD), the Coarse-Deciding Fine Detector (CDFD) and the Hybrid Coarse-Fine Detector (HCFD). To date, the majority of the work on coarse-fine spectrum sensing for cognitive radio has focused on a single value for the SNR. This approach overlooks the key advantage that CF sensing has to offer, namely that high-powered signals can be easily detected without extra signal processing. By considering a range of SNR values, the detector can be optimised more effectively and greater performance gains can be realised. This work considers the optimisation of CF spectrum sensing schemes where security and performance are treated separately. Instead of optimising system performance at a single, constant, low SNR value, the system is instead optimised for the average operating conditions. Security is still provided such that the safety specifications are met at low SNR values. By decoupling security and performance, the system's average performance increases whilst maintaining the protection of licensed users from harmful interference. The different architectures considered in this thesis are investigated in theory, simulation and physical implementation to provide a complete overview of the performance of each system. This thesis provides a method for estimating SNR distributions which is quick, accurate and relatively low cost. The CSFD is modelled and the characteristic equations are found for the CDFD scheme. The HCFD is introduced and optimisation schemes for all three architectures are proposed. Finally, using the Implementing Radio In Software (IRIS) test-bed to confirm simulation results, CF spectrum sensing is shown to be significantly quicker than naive methods, whilst still meeting the required interference probability rates and without requiring substantial increases in receiver complexity.
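
    As a minimal illustration of the coarse-fine idea described above (a sketch under assumed parameters, not the optimisation developed in the thesis), the following Python snippet simulates a two-stage energy detector in which a short coarse stage decides clear-cut channels immediately and only ambiguous channels pay for the longer fine stage. The sample counts, thresholds, and the Gaussian SNR distribution are illustrative assumptions.

```python
# Illustrative sketch (not the thesis's actual optimisation): a two-stage
# coarse-fine energy detector evaluated over a distribution of SNRs,
# compared against a single-stage "naive" detector in average sample cost.
import numpy as np

rng = np.random.default_rng(0)

def channel_energy(snr_db, n_samples):
    """Average energy of a Gaussian signal with power `snr` observed in unit-variance noise."""
    snr = 10 ** (snr_db / 10)
    received = np.sqrt(snr) * rng.standard_normal(n_samples) + rng.standard_normal(n_samples)
    return np.mean(received ** 2)

def coarse_fine_sense(snr_db, n_coarse=32, n_fine=512,
                      coarse_hi=2.5, coarse_lo=0.8, fine_thresh=1.2):
    """Coarse stage decides clear-cut channels; ambiguous ones go to the fine stage.
    Returns (occupied_decision, samples_used)."""
    coarse = channel_energy(snr_db, n_coarse)
    if coarse > coarse_hi:       # strong signal: decide with coarse samples only
        return True, n_coarse
    if coarse < coarse_lo:       # clearly empty: decide with coarse samples only
        return False, n_coarse
    return channel_energy(snr_db, n_fine) > fine_thresh, n_coarse + n_fine

# Average the sensing cost over an assumed SNR distribution instead of a single SNR.
snrs_db = rng.normal(loc=0.0, scale=6.0, size=5_000)
cf_cost = np.mean([coarse_fine_sense(s)[1] for s in snrs_db])
print(f"average samples per channel: coarse-fine {cf_cost:.0f} vs naive 512")
```

    Averaging the sample cost over the SNR distribution, rather than designing for a single worst-case SNR, is what lets the coarse stage pay off whenever a high-powered signal makes the decision easy.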

    Key management for wireless sensor network security

    Wireless Sensor Networks (WSNs) have attracted great attention not only in industry but also in academia due to their enormous application potential and unique security challenges. A typical sensor network can be seen as a combination of a number of low-cost sensor nodes which have very limited computation and communication capability, memory space, and energy supply. The nodes are self-organized into a network to sense or monitor surrounding information in an unattended environment, while the self-organization property makes the networks vulnerable to various attacks. Many cryptographic mechanisms that solve network security problems rely directly on secure and efficient key management, making key management a fundamental research topic in the field of WSN security. Although key management for WSNs has been studied in recent years, the majority of the literature has focused on some assumed vulnerabilities along with corresponding countermeasures. The specific application, which is an important factor in determining the feasibility of a scheme, has been overlooked to a large extent in the existing literature. This thesis is an effort to develop a key management framework and specific schemes for WSNs by which different types of keys can be established and distributed in a self-healing manner, and explicit/implicit authentication can be integrated according to the security requirements of the expected applications. The proposed solutions would provide a reliable and robust security infrastructure for facilitating secure communications in WSNs. There are five main parts in the thesis. In Part I, we begin with an introduction to the research background, problem definitions, and an overview of existing solutions. From Part II to Part IV, we propose specific solutions, including purely Symmetric Key Cryptography based solutions, purely Public Key Cryptography based solutions, and a hybrid solution. While there is always a trade-off between security and performance, analysis and experimental results show that each proposed solution can achieve the expected security aims with acceptable overheads for specific applications. Finally, we recapitulate the main contributions of our work and identify future research directions in Part V.
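
    For a concrete flavour of symmetric-key establishment in WSNs (a minimal sketch of one common approach, not the specific schemes proposed in the thesis), the snippet below derives a pairwise link key from a pre-loaded master secret and the two node identifiers, so that no key material is ever transmitted; the function and parameter names are hypothetical.

```python
# Minimal sketch of one common symmetric-key approach to pairwise key
# establishment in a WSN (illustrative only; not the thesis's schemes):
# each node derives a link key from a pre-loaded master secret and the
# two node identifiers, so the key itself never travels over the air.
import hmac
import hashlib

def derive_pairwise_key(master_secret: bytes, node_a: int, node_b: int) -> bytes:
    """Both endpoints compute the same key by ordering the IDs consistently."""
    lo, hi = sorted((node_a, node_b))
    material = lo.to_bytes(4, "big") + hi.to_bytes(4, "big")
    return hmac.new(master_secret, material, hashlib.sha256).digest()

# Example: nodes 17 and 42 independently derive the same 256-bit link key.
master = b"pre-deployment master secret"        # loaded before deployment (assumed)
k_a = derive_pairwise_key(master, 17, 42)       # computed on node 17
k_b = derive_pairwise_key(master, 42, 17)       # computed on node 42
assert k_a == k_b
```

    A single shared master secret is, of course, a single point of compromise, which is one reason key-management research, including this thesis, also considers public-key and hybrid alternatives despite their higher overheads.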

    High-Performance Modelling and Simulation for Big Data Applications

    This open access book was prepared as a Final Publication of the COST Action IC1406 “High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)” project. Long considered important pillars of the scientific method, Modelling and Simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predict and analyse natural and complex systems in science and engineering. As their level of abstraction rises to give a better discernment of the domain at hand, their representation becomes increasingly demanding of computational and data resources. High Performance Computing, on the other hand, typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. A seamless interaction of High Performance Computing with Modelling and Simulation is therefore arguably required in order to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest for these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.

    Technology 2001: The Second National Technology Transfer Conference and Exposition, volume 2

    Proceedings of the workshop are presented. The mission of the conference was to transfer advanced technologies developed by the Federal government, its contractors, and other high-tech organizations to U.S. industries for their use in developing new or improved products and processes. Volume two presents papers on the following topics: materials science, robotics, test and measurement, advanced manufacturing, artificial intelligence, biotechnology, electronics, and software engineering.

    Unleashing the Power of Edge-Cloud Generative AI in Mobile Networks: A Survey of AIGC Services

    Artificial Intelligence-Generated Content (AIGC) is an automated method for creatively generating, manipulating, and modifying valuable and diverse data using AI algorithms. This survey paper focuses on the deployment of AIGC applications, e.g., ChatGPT and DALL-E, at mobile edge networks, namely mobile AIGC networks, that provide personalized and customized AIGC services in real time while maintaining user privacy. We begin by introducing the background and fundamentals of generative models and the lifecycle of AIGC services at mobile AIGC networks, which includes data collection, training, fine-tuning, inference, and product management. We then discuss the collaborative cloud-edge-mobile infrastructure and technologies required to support AIGC services and enable users to access AIGC at mobile edge networks. Furthermore, we explore AIGC-driven creative applications and use cases for mobile AIGC networks. Additionally, we discuss the implementation, security, and privacy challenges of deploying mobile AIGC networks. Finally, we highlight some future research directions and open issues for the full realization of mobile AIGC networks.
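
    As a hypothetical illustration of the collaborative cloud-edge infrastructure discussed above (a sketch under assumed latency and capacity figures, not a mechanism from the survey), the snippet below routes an AIGC inference request either to a distilled edge model or to the cloud model based on estimated latency, request size, and quality requirements; all names and thresholds are assumptions.

```python
# Hypothetical edge-cloud routing decision for AIGC inference (names and
# thresholds are assumptions, not from the survey): small, latency-sensitive
# requests are served by a distilled edge model, while large or
# quality-critical requests fall back to the cloud model.
from dataclasses import dataclass

@dataclass
class AIGCRequest:
    prompt_tokens: int
    max_output_tokens: int
    latency_budget_ms: float
    quality_critical: bool

EDGE_TOKEN_LIMIT = 1_024        # assumed capacity of the edge model
EDGE_MS_PER_TOKEN = 4.0         # assumed edge decoding speed
CLOUD_RTT_MS = 200.0            # assumed round-trip time to the cloud
CLOUD_MS_PER_TOKEN = 3.0        # assumed cloud decoding speed

def route(request: AIGCRequest) -> str:
    """Return 'edge' or 'cloud' for a single AIGC inference request."""
    total_tokens = request.prompt_tokens + request.max_output_tokens
    edge_latency = total_tokens * EDGE_MS_PER_TOKEN
    cloud_latency = CLOUD_RTT_MS + total_tokens * CLOUD_MS_PER_TOKEN
    if request.quality_critical or total_tokens > EDGE_TOKEN_LIMIT:
        return "cloud"
    return "edge" if edge_latency <= min(cloud_latency, request.latency_budget_ms) else "cloud"

print(route(AIGCRequest(prompt_tokens=64, max_output_tokens=128,
                        latency_budget_ms=2_000, quality_critical=False)))  # -> edge
```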

    FRIEND: A Cyber-Physical System for Traffic Flow Related Information Aggregation and Dissemination

    The major contribution of this thesis is to lay the theoretical foundations of FRIEND, a cyber-physical system for traffic Flow-Related Information aggrEgatioN and Dissemination. By integrating resources and capabilities at the nexus between the cyber and physical worlds, FRIEND will contribute to aggregating traffic flow data collected by the huge fleet of vehicles on our roads into a comprehensive, near real-time synopsis of traffic flow conditions. We anticipate providing drivers with a meaningful, color-coded, at-a-glance view of flow conditions ahead, alerting them to congested traffic. FRIEND can be used both to provide accurate information about traffic flow and to propagate this information. The workhorse of FRIEND is the ubiquitous lane delimiters (a.k.a. cat's eyes) on our roadways that, at the moment, are used simply as dumb reflectors. Our main vision is that by endowing cat's eyes with a modest power source and detection and communication capabilities, they will play an important role in collecting, aggregating and disseminating traffic flow conditions to the driving public. We envision the cat's eyes system to be supplemented by road-side units (RSUs) deployed at regular intervals (e.g. every kilometer or so). The RSUs placed on opposite sides of the roadway constitute a logical unit and are connected by optical fiber under the median. Unlike inductive loop detectors, adjacent RSUs along the roadway are not connected to each other, thus avoiding the huge cost of optical fiber. Each RSU contains a GPS device (for time synchronization), an active Radio Frequency Identification (RFID) tag for communication with passing cars, a radio transceiver for RSU-to-RSU communication and a laptop-class computing device. The physical components of FRIEND collect traffic flow-related data from passing vehicles. The collected data is used by FRIEND's inference engine to build beliefs about the state of the traffic, to detect traffic trends, and to disseminate relevant traffic flow-related information along the roadway. The second contribution of this thesis is the development of an incident classification and detection algorithm that can classify different types of traffic incidents and then notify the necessary targets of the incident. We also compare our incident detection technique with other VANET techniques. Our third contribution is a novel strategy for information dissemination on highways. First, we aim to prevent secondary accidents. Second, we notify drivers far away from the accident of the expected delay, giving them the option to continue or exit before reaching the incident location. A new mechanism tracks the source of the incident while notifying drivers away from the accident. The longer the incident persists, the further the information needs to be propagated. Furthermore, the denser the traffic, the faster it will back up: on high-density highways, an incident may form a backup of vehicles faster than on low-density highways. To account for this, information needs to be propagated as a function of traffic density and time.
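
    To make the final point concrete (an illustrative sketch using a simple deterministic queueing estimate, not FRIEND's actual dissemination rule), the snippet below estimates how far upstream a warning should be propagated as an incident persists, growing with both elapsed time and the imbalance between arriving and discharging traffic; the flow values, jam density, and safety margin are assumptions.

```python
# Illustrative sketch (not FRIEND's actual dissemination rule): estimate how
# far upstream an incident warning should be propagated as a function of
# traffic density and elapsed time, using a simple deterministic queueing
# model. Flow values, jam density, and the safety margin are assumptions.
def dissemination_range_km(arrival_flow_vph: float,
                           discharge_flow_vph: float,
                           jam_density_vpkm: float,
                           elapsed_h: float,
                           margin_km: float = 2.0) -> float:
    """Backup length grows with the flow imbalance and with time; the warning
    range is the estimated backup plus a fixed safety margin."""
    excess_vehicles = max(0.0, (arrival_flow_vph - discharge_flow_vph) * elapsed_h)
    backup_km = excess_vehicles / jam_density_vpkm
    return backup_km + margin_km

# Dense traffic (1800 veh/h arriving, a blocked lane discharging 900 veh/h):
# after 30 minutes the queue holds ~450 vehicles, roughly 3 km at jam density.
print(dissemination_range_km(1800, 900, 150, 0.5))   # ~5.0 km including margin
```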

    The Federal Conference on Intelligent Processing Equipment

    Research and development projects involving intelligent processing equipment within the following U.S. agencies are addressed: Department of Agriculture, Department of Commerce, Department of Energy, Department of Defense, Environmental Protection Agency, Federal Emergency Management Agency, NASA, National Institutes of Health, and the National Science Foundation.