
    A secure intelligent system for internet of vehicles: Case study on traffic forecasting

    Significant efforts have been made in vehicle-to-vehicle communications, which now enable the Internet of Vehicles (IoV). However, current IoV solutions are unable to capture traffic data both accurately and securely. Another drawback of current deep-learning-based IoV models is that they do not tune hyperparameters efficiently. In this paper, a new system known as the Secure and Intelligent System for the Internet of Vehicles (SISIV) is developed. A deep learning architecture based on graph convolutional networks and an attention mechanism is implemented. In addition, blockchain technology is used to protect data transmission between nodes in the IoV system. Moreover, the hyperparameters of the generated deep learning model are intelligently selected using a branch-and-bound technique. To validate SISIV, experiments were conducted on four networked-vehicle databases dealing with prediction problems. In terms of forecasting rate (> 90%), F-measure (> 80%), and attack detection (< 75%), the results clearly show the superiority of SISIV over baseline systems. Moreover, compared to state-of-the-art traffic-prediction solutions, SISIV enables efficient and reliable prediction of traffic flow in an IoV context.
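The abstract mentions branch-and-bound hyperparameter selection without detailing it. A minimal, hypothetical sketch of the general technique follows; the `domains`, `bound`, and `evaluate` functions are illustrative assumptions, not SISIV's actual components:

```python
import heapq
import itertools

# Hypothetical sketch: branch-and-bound over a discrete hyperparameter
# grid, minimising a validation loss. `bound` must be optimistic (never
# larger than the loss of any completion) for pruning to be exact.
def branch_and_bound(domains, bound, evaluate):
    """domains: dict name -> candidate values; bound(partial) -> float;
    evaluate(complete) -> float (e.g. validation loss after training)."""
    names = list(domains)
    tie = itertools.count()           # tie-breaker so dicts are never compared
    best_loss, best_cfg = float("inf"), None
    heap = [(bound({}), next(tie), {})]
    while heap:
        b, _, partial = heapq.heappop(heap)
        if b >= best_loss:            # prune: cannot beat the incumbent
            continue
        if len(partial) == len(names):
            loss = evaluate(partial)  # complete assignment: train and score
            if loss < best_loss:
                best_loss, best_cfg = loss, dict(partial)
            continue
        name = names[len(partial)]    # branch on the next hyperparameter
        for value in domains[name]:
            child = {**partial, name: value}
            cb = bound(child)
            if cb < best_loss:        # only keep promising branches
                heapq.heappush(heap, (cb, next(tie), child))
    return best_cfg, best_loss
```

Unlike grid search, branches whose optimistic bound already exceeds the best loss found so far are discarded without training a model.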

    Towards a New Generation of Permissioned Blockchain Systems

    With the release of Satoshi Nakamoto's Bitcoin system in 2008, a new decentralized computation paradigm, known as blockchain, was born. Bitcoin promised a trading network for virtual coins, publicly available for anyone to participate in but owned by nobody. Any participant could propose a transaction, and a lottery mechanism decided in which order these transactions would be recorded in a ledger, with an elegant mechanism to prevent double spending. The remarkable achievement of Nakamoto's protocol was that participants did not have to trust each other to behave correctly for it to work. As long as more than half of the network participants adhered to the correct code, the recorded transactions on the ledger would be both valid and immutable. Ethereum, as the next major blockchain to appear, improved on the initial idea by introducing smart contracts, which are decentralized Turing-complete stored procedures, thus making blockchain technology interesting for the enterprise setting. However, its intrinsically public data and prohibitive energy costs needed to be overcome. This gave rise to a new type of system called permissioned blockchains. With these, access to the ledger is restricted and trust assumptions about malicious behaviour have been weakened, allowing more efficient consensus mechanisms to find a global order of transactions. One of the most popular representatives of this kind of blockchain is Hyperledger Fabric. While it is much faster and more energy efficient than permissionless blockchains, it has to compete with conventional distributed databases in the enterprise sector. This thesis aims to mitigate Fabric's three major shortcomings. First, compared to conventional database systems, it is still far too slow. This thesis shows how the performance can be increased by a factor of seven by redesigning the transaction processing pipeline and introducing more efficient data structures.
Second, we present a novel solution to Fabric's intrinsic problem of low throughput for workloads with transactions that access the same data. This is achieved by analyzing the dependencies of transactions and selectively re-executing transactions when a conflict is detected. Third, this thesis tackles the preservation of private data. Even though access to the blockchain as a whole can be restricted, in a setting where multiple enterprises collaborate this is not sufficient to protect sensitive proprietary data. Thus, this thesis introduces a new privacy-preserving blockchain protocol based on network sharding and targeted data dissemination. It also introduces an additional layer of abstraction for the creation of transactions and interaction with data on the blockchain. This allows developers to write applications without the need for low-level knowledge of the internal data structure of the blockchain system. In summary, this thesis addresses the shortcomings of the current generation of permissioned blockchain systems.
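The dependency analysis with selective re-execution described above can be illustrated with a toy sketch based on read/write sets. The `Transaction` shape and `re_execute` hook here are illustrative assumptions, not Fabric's actual transaction format or the thesis's implementation:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: each transaction records the key versions it read
# during simulation and the values it intends to write. At commit time,
# a transaction whose reads are stale (a prior transaction in the block
# wrote one of its keys) is re-executed instead of rejected.

@dataclass
class Transaction:
    tx_id: str
    read_set: dict                                  # key -> version read
    write_set: dict = field(default_factory=dict)   # key -> new value

def commit_block(txs, ledger, versions, re_execute):
    """Apply txs in order; re-execute on read/write conflict."""
    for tx in txs:
        stale = [k for k, v in tx.read_set.items()
                 if versions.get(k, 0) != v]
        if stale:
            # Conflict detected: selectively re-execute against the
            # current state rather than aborting the transaction.
            tx = re_execute(tx, ledger, versions)
        for key, value in tx.write_set.items():
            ledger[key] = value
            versions[key] = versions.get(key, 0) + 1
    return ledger
```

Under an abort-on-conflict policy (Fabric's default validation), the stale transaction would simply be invalidated; re-executing it instead is what preserves throughput on contended keys.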

    Mathematics in Software Reliability and Quality Assurance

    This monograph concerns the mathematical aspects of software reliability and quality assurance and consists of 11 technical papers in this emerging area. Included are the latest research results related to formal methods and design, automatic software testing, software verification and validation, coalgebra theory, automata theory, hybrid systems, and software reliability modeling and assessment.

    Quality-aware predictive modelling & inferential analytics at the network edge

    The Internet of Things has grown by an enormous number of devices in recent years, and with the upcoming idea of the Internet of Everything this growth will be even faster. These embedded devices are connected to a central server, e.g. the Cloud. A major task is to send the generated data to this central collection point for further analysis and modelling. The devices' network and deployed system are constrained in terms of energy, bandwidth, connectivity, latency, and privacy. To overcome these constraints, Edge Computing has been introduced to enable devices to perform computation near the data source. With the increase of embedded devices and the Internet of Things, continuous data transmission between devices and Central Locations has become infeasible, so efficient communication and computational offloading are required. Edge Computing enables devices to run lightweight algorithms locally to reduce raw-data transmission over the network. The quality of predictive analytics tasks is of high importance, as user satisfaction and decision making depend on the outcome. Therefore, this thesis investigates the ability to perform predictive analytics and model inference on Edge Devices with communication-efficient, latency-efficient, and privacy-efficient procedures, focusing on quality-aware results. The first part of the thesis focuses on reducing data transmission between the device and the central location. Two possible energy-efficient methodologies to control data forwarding are introduced: prediction-based and time-optimised. Both data forwarding strategies aim to maintain the Central Location's quality of analytics by introducing reconstruction policies. The second part provides a mechanism to enable edge-centric analytics towards latency-efficient network optimisation. One aspect shows the importance of locally generated analytical models on Edge Devices, embracing each device's data subspace.
Furthermore, two possible ensemble-pruning methods are introduced that allow the aggregation of individual models at the Central Location towards accurate query predictions. The conclusion chapter presents the importance of privacy-efficient local learning and analytics on Edge Devices. With the aid of Federated Learning, it is possible to train analytical models locally in a privacy-preserving manner. Furthermore, for continuously changing environments, the parallel deployment of personalisation and generalisation for quality-aware predictions is highlighted and demonstrated through experimental evaluation.
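The prediction-based forwarding with reconstruction described in the first part can be sketched minimally. This is a hypothetical illustration using a last-value predictor and a fixed error threshold `theta`; the thesis's actual predictors and reconstruction policies are not specified in the abstract:

```python
# Hypothetical sketch of prediction-based forwarding: the Edge Device
# transmits a reading only when its predictor (here, hold the last sent
# value) would err by more than theta; the Central Location replays the
# same predictor to reconstruct the full stream within that tolerance.

def forward(readings, theta):
    """Yield (index, value) pairs the device actually transmits."""
    last = None
    for i, v in enumerate(readings):
        if last is None or abs(v - last) > theta:
            last = v                  # predictor state updates on send
            yield i, v

def reconstruct(sent, length):
    """Central-side reconstruction: hold the last transmitted value."""
    stream, last = [], None
    sent = dict(sent)
    for i in range(length):
        last = sent.get(i, last)      # gap -> reuse the predictor's value
        stream.append(last)
    return stream
```

Because both sides run the same predictor, every reconstructed value deviates from the true reading by at most `theta`, which is how a quality bound on the Central Location's analytics can be maintained while suppressing most transmissions.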