Blockchain at the Edge: Performance of Resource-Constrained IoT Networks
The proliferation of IoT across various technological realms has resulted in a massive surge of unsecured data. The use of complex security mechanisms for securing these data is highly restricted owing to the low-power, low-resource nature of most IoT devices, especially at the Edge. In this article, we propose to use blockchains to extend security to such IoT implementations. We deploy an Ethereum blockchain consisting of both regular and constrained devices connecting to the blockchain through wired and wireless heterogeneous networks. We additionally implement a secure and encrypted networked clock mechanism to synchronize the non-real-time IoT Edge nodes within the blockchain. Further, we experimentally study the feasibility of such a deployment, and the bottlenecks associated with it, by running the cryptographic operations necessary for blockchains on IoT devices. We study the effects of network latency, an increasing number of constrained blockchain nodes, data size, Ether, and blockchain node mobility during the transaction and mining of data within our deployed blockchain. This study serves as a guideline for designing secure solutions for IoT implementations under various operating conditions, such as those encountered for static IoT nodes and mobile IoT devices.
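The abstract does not specify the authors' encrypted clock mechanism, but the core idea of synchronizing a non-real-time node against a networked time source can be sketched with a single request/response offset estimate (Cristian's algorithm, a standard technique and only an assumption about the approach; the function name and timestamps below are hypothetical):

```python
def estimate_clock_offset(request_ts, server_ts, response_ts):
    """Estimate the local clock's offset from a time server using one
    request/response exchange, assuming symmetric network delay.
    All timestamps are in seconds; request_ts and response_ts are read
    from the local clock, server_ts from the server's reply."""
    round_trip = response_ts - request_ts
    # Assume the server sampled its clock at the midpoint of the RTT.
    estimated_server_now = server_ts + round_trip / 2
    return estimated_server_now - response_ts

# Example: node sends at local t=100.0, server replies with its time
# 105.2, and the reply arrives at local t=100.4 (RTT = 0.4 s).
offset = estimate_clock_offset(100.0, 105.2, 100.4)
print(offset)  # 5.0 -> local clock is about 5 s behind the server
```

In a deployment like the one described, the exchange itself would run over an encrypted channel; the arithmetic above is unchanged by that.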
Overhead Management Strategies for Internet of Things Devices
Overhead (time and energy) management is paramount for IoT edge devices, considering their typically resource-constrained nature. In this thesis we present two contributions for lowering the resource consumption of IoT devices. The first contribution is minimizing the overhead of the Transport Layer Security (TLS) authentication protocol in the context of IoT networks by selecting a lightweight cipher suite configuration. TLS is the de facto authentication protocol for secure communication in Internet of Things (IoT) applications. However, the processing and energy demands of this protocol are two essential parameters that must be taken into account given the resource-constrained nature of IoT devices. For the first contribution, we study these parameters using a testbed in which an IoT board (Cypress CYW43907) communicates with a server over an 802.11 wireless link. Although TLS supports a wide array of cipher suites, in this work we focus on DHE RSA, ECDHE RSA, and ECDHE ECDSA, which are among the most popular ciphers used due to their robustness. Our studies show that ciphers using Elliptic Curve Diffie-Hellman (ECDHE) key exchange are considerably more efficient than ciphers using Diffie-Hellman (DHE). Furthermore, ECDSA signature verification consumes more time and energy than RSA signature verification for ECDHE key exchange. This study helps IoT designers choose an appropriate TLS cipher suite based on application demands, computational capabilities, and available energy resources.
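Restricting a TLS endpoint to one of the cipher suite families compared in the study can be expressed with Python's standard ssl module (a sketch using OpenSSL cipher-string names, not the thesis's actual embedded test harness; the helper name is hypothetical):

```python
import ssl

def make_context(cipher_string):
    """Build a client-side TLS context restricted to the given OpenSSL
    cipher string, e.g. an ECDHE-RSA suite as studied in the thesis."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    # Cipher strings configure TLS <= 1.2 suites; cap the version so the
    # restriction is meaningful for the handshake.
    ctx.maximum_version = ssl.TLSVersion.TLSv1_2
    ctx.set_ciphers(cipher_string)
    return ctx

# One of the ECDHE-RSA suites that the study finds efficient.
ctx = make_context("ECDHE-RSA-AES128-GCM-SHA256")
print([c["name"] for c in ctx.get_ciphers()])
```

`set_ciphers` raises `ssl.SSLError` if the underlying OpenSSL build supports none of the requested suites, which makes misconfiguration visible at setup time rather than at handshake time.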
The second contribution of this thesis is deploying supervised machine learning anomaly detection algorithms on an IoT edge device to reduce data transmission overhead and cloud storage requirements. With continuous monitoring and sensing, millions of Internet of Things sensors all over the world generate tremendous amounts of data every minute. As a result, recent studies have begun to raise the question of whether to send all the sensing data directly to the cloud (i.e., direct transmission), or to preprocess such data at the network edge and only send necessary data to the cloud (i.e., preprocessing at the edge). Anomaly detection is particularly useful as an edge mining technique for reducing transmission overhead in such a context, when the frequently monitored activities contain only a sparse set of anomalies. This work analyzes the potential overhead savings of machine-learning-based anomaly detection models on the edge in three different IoT scenarios. Our experimental results show that by choosing appropriate anomaly detection models, we are able to effectively reduce the total transmission energy as well as minimize the required cloud storage. We show that Random Forest, Multilayer Perceptron, and Discriminant Analysis models can viably save time and energy on the edge device during data transmission. K-Nearest Neighbors, although reliable in terms of prediction accuracy, demands exorbitant overhead and results in a net time and energy loss on the edge device. In addition to presenting our model results for the different IoT scenarios, we provide guidelines for model selection through analysis of the tradeoffs involved, such as training overhead, prediction overhead, and classification accuracy.
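The edge-side filtering idea — classify readings locally and transmit only the flagged anomalies — can be sketched with scikit-learn on synthetic data (a hedged illustration, not the thesis's datasets or tuned models; all values below are made up):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic sensor stream: mostly normal readings plus a sparse set of
# anomalies (e.g. temperature in degrees C).
normal = rng.normal(20.0, 1.0, size=(500, 1))
anomalies = rng.normal(35.0, 1.0, size=(10, 1))
X = np.vstack([normal, anomalies])
y = np.array([0] * 500 + [1] * 10)

# Train the edge-side detector; Random Forest is one of the models the
# thesis finds viable on the edge device.
clf = RandomForestClassifier(n_estimators=20, random_state=0).fit(X, y)

# At run time, only readings flagged anomalous are sent to the cloud.
stream = np.vstack([rng.normal(20.0, 1.0, size=(95, 1)),
                    rng.normal(35.0, 1.0, size=(5, 1))])
to_send = stream[clf.predict(stream) == 1]
print(f"transmitted {len(to_send)} of {len(stream)} readings")
```

The transmission saving is exactly the fraction of readings the model suppresses; the tradeoff the thesis analyzes is whether the prediction cost per reading stays below the cost of transmitting it.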
When Mobile Blockchain Meets Edge Computing
Blockchain, the backbone technology of the currently popular Bitcoin digital currency, has become a promising decentralized data management framework. Although blockchain has been widely adopted in many applications, e.g., finance, healthcare, and logistics, its application in mobile services is still limited. This is due to the fact that blockchain users need to solve preset proof-of-work puzzles to add new data, i.e., a block, to the blockchain. Solving the proof-of-work, however, consumes substantial resources in terms of CPU time and energy, which is not suitable for resource-limited mobile devices. To facilitate blockchain applications in future mobile Internet of Things systems, multi-access mobile edge computing appears to be an auspicious solution for solving the proof-of-work puzzles on behalf of mobile users. We first introduce a novel concept of edge computing for mobile blockchain. Then, we introduce an economic approach to edge computing resource management. Moreover, a prototype of a mobile edge computing enabled blockchain system is presented with experimental results to justify the proposed concept.
Comment: Accepted by IEEE Communications Magazine
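The proof-of-work puzzle whose CPU cost motivates the offloading can be illustrated with a toy hash-based search (a minimal sketch, not Bitcoin's or Ethereum's actual mining algorithm; the difficulty and encoding choices are arbitrary):

```python
import hashlib

def solve_pow(block_data: bytes, difficulty_bits: int) -> int:
    """Brute-force a nonce such that SHA-256(block_data || nonce) has at
    least `difficulty_bits` leading zero bits -- the kind of search a
    mobile device would offload to an edge server."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# Low difficulty for illustration; each added bit roughly doubles the
# expected number of hash evaluations, hence the energy cost.
nonce = solve_pow(b"block payload", 16)
print("nonce:", nonce)
```

The exponential growth of the search cost with `difficulty_bits` is precisely why an economic model for pricing edge computing resources is needed: miners with different budgets demand different amounts of hashing service.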