13 research outputs found

    Role of artificial intelligence in cloud computing, IoT and SDN: Reliability and scalability issues

    Information technology is increasingly dominated by artificial intelligence (AI), which plays a key role in providing better services. The inherent strengths of AI are driving companies towards a modern, secure, and insight-driven arena in which to address current and future challenges. Key technologies such as cloud computing, the internet of things (IoT), and software-defined networking (SDN) are emerging as future applications and delivering benefits to society. Integrating AI with these innovations at scale takes their efficiency to the next level. Data generated by heterogeneous devices are received, exchanged, stored, managed, and analysed to automate and improve the performance of the overall system and make it more reliable. These new technologies are not free of limitations, however, and their synthesis raises many challenges in terms of scalability and reliability. This paper therefore discusses the role of AI, along with the issues and opportunities that all communities confront when integrating these technologies, with a focus on reliability and scalability. It puts forward future directions for the scalability and reliability concerns that arise during this integration, enabling researchers to address the current research gaps.

    From distributed coordination to field calculus and aggregate computing

    This work has been partially supported by: EU Horizon 2020 project HyVar (www.hyvar-project.eu), GA No. 644298; ICT COST Action IC1402 ARVI (www.cost-arvi.eu); Ateneo/CSP D16D15000360005 project RunVar (runvar-project.di.unito.it). Aggregate computing is an emerging approach to engineering complex coordination for distributed systems, based on viewing system interactions in terms of information propagating through collectives of devices, rather than in terms of individual devices and their interactions with their peers and environment. The foundation of this approach is the distillation of a number of prior approaches, both formal and pragmatic, proposed under the umbrella of field-based coordination, culminating in the field calculus, a universal functional programming model for specifying and composing collective behaviours with equivalent local and aggregate semantics. This foundation has been elaborated into a layered approach to engineering the coordination of complex distributed systems, building up to pragmatic applications through intermediate layers that encompass reusable libraries of program components. Furthermore, some of these components are formally shown to satisfy properties such as self-stabilisation, which transfer to whole application services by functional composition. In this survey, we trace the development and antecedents of the field calculus, review the field calculus itself and the current state of aggregate computing theory and practice, and discuss a roadmap of current research directions with implications for the development of a broad range of distributed systems.
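    The abstract's idea of computations over "collectives of devices" can be illustrated with the canonical aggregate-computing building block: a hop-count gradient field that self-stabilises through repeated neighbour exchange. The sketch below is a simplified synchronous simulation of that idea, not the authors' field calculus itself.

```python
# Minimal sketch (not the field calculus proper): a hop-count "gradient"
# field. Each device repeatedly takes the minimum of its neighbours'
# values plus one, while sources hold 0; the field self-stabilises to the
# hop distance from the nearest source.

def gradient_round(values, neighbours, sources):
    """One synchronous round of the gradient computation."""
    new_values = {}
    for device, nbrs in neighbours.items():
        if device in sources:
            new_values[device] = 0
        else:
            nbr_vals = [values[n] for n in nbrs]
            new_values[device] = min(nbr_vals, default=float("inf")) + 1
    return new_values

# A 5-device line topology with device 0 as the source.
neighbours = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
values = {d: float("inf") for d in neighbours}
for _ in range(10):  # iterate until the field stabilises
    values = gradient_round(values, neighbours, {0})
print(values)  # each device ends up with its hop distance to the source
```

    Self-stabilisation, mentioned in the abstract, is visible here: whatever the initial values, repeated rounds converge to the same final field.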

    Low-Power Wide-Area Networks: A Broad Overview of its Different Aspects

    Low-power wide-area networks (LPWANs) are gaining popularity in the research community due to their low power consumption, low cost, and wide geographical coverage. LPWAN technologies complement and outperform short-range and traditional cellular wireless technologies in a variety of applications, including smart city development, machine-to-machine (M2M) communications, healthcare, intelligent transportation, industrial applications, climate-smart agriculture, and asset tracking. This review paper discusses the design objectives of LPWANs and the methodologies they use to provide extensive coverage for low-power devices. We also explore how the presented LPWAN architectures employ various topologies, such as star and mesh. We examine many current and emerging LPWAN technologies, as well as their system architectures and standards, and evaluate their ability to meet each design objective. In addition, the possible coexistence of LPWAN with other technologies, combining their best attributes to provide an optimal solution, is also explored. Following that, various LPWAN technologies are compared and their market opportunities investigated. Furthermore, various LPWAN use cases are analysed, highlighting their benefits and drawbacks; this aids in the selection of the best LPWAN technology for a given application. Before concluding the work, the open research issues and challenges in designing LPWANs are presented.
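    The coverage-versus-throughput trade-off at the heart of LPWAN design can be made concrete with LoRa, one of the technologies such surveys cover. The sketch below uses the standard first-order formulas (symbol time 2^SF / BW, raw bit rate SF · BW / 2^SF · coding rate); an exact airtime figure would need the full packet time-on-air formula from the radio datasheet.

```python
# Back-of-the-envelope LoRa data-rate sketch: higher spreading factors (SF)
# extend range but lengthen each symbol, shrinking throughput. These are
# the standard first-order formulas, not a full time-on-air calculation.

def lora_symbol_time(sf, bw_hz):
    return (2 ** sf) / bw_hz            # seconds per symbol

def lora_bit_rate(sf, bw_hz, cr=4 / 5):
    return sf * (bw_hz / 2 ** sf) * cr  # raw bits per second

for sf in (7, 12):
    t_ms = lora_symbol_time(sf, 125_000) * 1000
    rate = lora_bit_rate(sf, 125_000)
    print(f"SF{sf}: {t_ms:.3f} ms/symbol, {rate:.0f} bit/s")
```

    At a 125 kHz bandwidth this shows roughly a 16x drop in raw bit rate between SF7 and SF12, which is why LPWAN design objectives (coverage, battery life, capacity) pull against each other.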

    An Optimized IoT Architecture based on Fog Computing with a new Method of Data Transfer Control

    Over the years, distributed and grid computing paradigms have evolved into cloud computing, which has become a common approach in the Internet of Things (IoT). The growing popularity of the cloud computing paradigm lies mainly in the simple management of end devices, uniform access to many services, elasticity of available resources, and cost savings. Alongside these advantages, the expansion of IoT devices and the demand for speed and data volume have opened the way for new computing paradigms. The fog computing paradigm brings data processing nearer to the end devices while preserving the cloud connection, leading to lower latency, higher efficiency, and location awareness. The overall aim of the dissertation is the design and implementation of an optimised IoT network architecture that adopts the fog computing paradigm. To eliminate the need to build completely new infrastructure, the optimised network architecture is based on LoRaWAN, which has already been deployed in many locations and offers long-distance communication with low power consumption. This raises several challenges that need to be overcome. For the fog computing paradigm to function correctly, it was necessary to explore a new method of controlling the data transfer between IoT gateways and the cloud service. The methods explored in this dissertation are both static (based on predefined values) and dynamic (based on machine learning).

    Leveraging Machine Learning Techniques towards Intelligent Networking Automation

    In this thesis, we address some of the challenges that the Intelligent Networking Automation (INA) paradigm poses. Our goal is to design schemes leveraging Machine Learning (ML) techniques to cope with situations that involve hard decision-making actions. The proposed solutions are data-driven and consist of an agent that operates at network elements such as routers, switches, or network servers. The data are gathered from realistic scenarios, either actual network deployments or emulated environments. To evaluate the enhancements that the designed schemes provide, we compare our solutions to non-intelligent ones. Additionally, we assess the trade-off between the obtained improvements and the computational costs of implementing the proposed mechanisms. Accordingly, this thesis tackles the challenges presented by four specific research problems. The first topic addresses the problem of balancing traffic in dense Internet of Things (IoT) network scenarios, where the end devices and the Base Stations (BSs) form complex networks. By applying ML techniques to discover patterns in the association between the end devices and the BSs, the proposed scheme can balance the traffic load in an IoT network to increase the packet delivery ratio and reduce the energy cost of data delivery. The second research topic proposes an intelligent congestion control for internet connections at edge network elements. The design includes a congestion predictor based on an Artificial Neural Network (ANN) and an Active Queue Management (AQM) parameter tuner. Similarly, the third research topic presents an intelligent solution to inter-domain congestion. Unlike the second topic, this problem considers the preservation of private network data by means of Federated Learning (FL), since network elements of several organizations participate in the intelligent process. Finally, the fourth research topic presents a framework for efficiently gathering network telemetry (NT) data.
    The proposed solution takes a traffic-aware approach, so that the NT data are intelligently collected and transmitted by the network elements. All the proposed schemes are evaluated through use cases considering standardized networking mechanisms. We therefore envision that the solutions to these specific problems encompass a set of methods that can be utilized in real-world scenarios towards the realization of the INA paradigm.
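    The Federated Learning step underpinning the inter-domain topic above is, at its core, Federated Averaging: each organisation trains on its private traffic data and shares only model weights, which a coordinator averages weighted by local sample counts. The sketch below uses toy weight vectors as placeholders, not the thesis's actual ANN.

```python
# Sketch of the Federated Averaging (FedAvg) aggregation step: clients
# never share raw data, only weight vectors, which the coordinator
# combines weighted by each client's local dataset size.

def fed_avg(client_weights, client_sizes):
    """Weighted average of per-client weight vectors (FedAvg)."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    avg = [0.0] * dim
    for w, n in zip(client_weights, client_sizes):
        for i in range(dim):
            avg[i] += w[i] * (n / total)
    return avg

# Three network domains with differently sized private datasets.
weights = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
sizes = [100, 100, 200]
print(fed_avg(weights, sizes))  # [3.5, 4.5]
```

    The domain with 200 samples pulls the global model twice as hard as each of the others, which is the point of size-weighted averaging.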

    A Distributed Audit Trail for the Internet of Things

    Sharing Internet of Things (IoT) data over open-data platforms and digital data marketplaces can reduce infrastructure investments, improve sustainability by reducing the required resources, and foster innovation. However, due to the inability to audit the authenticity, integrity, and quality of IoT data, third-party data consumers cannot assess the trustworthiness of received data. Therefore, it is challenging to use IoT data obtained from third parties for quality-relevant applications. To overcome this limitation, the IoT data must be auditable. Distributed Ledger Technology (DLT) is a promising approach for building auditable systems. However, existing solutions do not integrate authenticity, integrity, data quality, and location into an all-encompassing auditable model and focus only on specific parts of auditability. This thesis aims to provide a distributed audit trail that makes the IoT auditable and enables sharing of IoT data between multiple organizations for quality-relevant applications. To this end, we designed and evaluated the Veritaa framework. The Veritaa framework comprises the Graph of Trust (GoT) as the distributed audit trail and a DLT to immutably store the transactions that build the GoT. The contributions of this thesis are summarized as follows. First, we designed and evaluated the GoT, a DLT-based Distributed Public Key Infrastructure (DPKI) with a signature store. Second, we designed a Distributed Calibration Certificate Infrastructure (DCCI) based on the GoT, which makes quality-relevant maintenance information of IoT devices auditable. Third, we designed an Auditable Positioning System (APS) to make positions in the IoT auditable. Finally, we designed a Location Verification System (LVS) to verify location claims and prevent physical-layer attacks against the APS. All these components are integrated into the GoT and build the distributed audit trail.
    We implemented a real-world testbed to evaluate the proposed distributed audit trail. This testbed comprises several custom-built IoT devices connectable over Long Range Wide Area Network (LoRaWAN) or Long-Term Evolution Category M1 (LTE Cat M1), and a Bluetooth Low Energy (BLE)-based Angle of Arrival (AoA) positioning system. All these low-power devices can manage their identity and secure their data on the distributed audit trail using the IoT client of the Veritaa framework. The experiments suggest that a distributed audit trail is feasible and secure, and that the low-power IoT devices are capable of performing the required cryptographic functions. Furthermore, the energy overhead introduced by making the IoT auditable is limited and reasonable for quality-relevant applications.
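    The tamper-evidence property that makes an audit trail trustworthy can be illustrated with a stdlib-only hash chain: each record embeds the hash of its predecessor, so altering any past entry invalidates everything after it. This is a simplified sketch of the principle, not the Veritaa implementation, which additionally uses public-key signatures and a distributed ledger.

```python
# Simplified hash-chained audit log: the DLT-backed audit trail relies on
# the same property that a record's hash commits to all history before it.

import hashlib
import json

def append_record(chain, payload):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"payload": payload, "prev": prev_hash}
    body["hash"] = hashlib.sha256(
        json.dumps({"payload": payload, "prev": prev_hash}, sort_keys=True).encode()
    ).hexdigest()
    chain.append(body)

def verify_chain(chain):
    prev = "0" * 64
    for rec in chain:
        expected = hashlib.sha256(
            json.dumps({"payload": rec["payload"], "prev": rec["prev"]},
                       sort_keys=True).encode()
        ).hexdigest()
        if rec["prev"] != prev or rec["hash"] != expected:
            return False
        prev = rec["hash"]
    return True

chain = []
append_record(chain, {"sensor": "temp-01", "value": 21.4})
append_record(chain, {"sensor": "temp-01", "value": 21.6})
print(verify_chain(chain))           # True
chain[0]["payload"]["value"] = 99.9  # tamper with history
print(verify_chain(chain))           # False
```

    Replacing the SHA-256 digest with a device's public-key signature (as Veritaa does) additionally binds each record to an identity, giving authenticity as well as integrity.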

    Electromagnetic Side-Channel Resilience against Lightweight Cryptography

    Side-channel attacks are an unpredictable risk factor in cryptography. Therefore, observing leakages through the physical parameters of digital devices, i.e., power consumption, electromagnetic (EM) radiation, etc., is essential to minimise the vulnerabilities associated with cryptographic functions. Compared to the costs of the past, performing side-channel attacks with inexpensive test equipment is becoming a reality. Internet-of-Things (IoT) devices are resource-constrained, and lightweight cryptography is a novel approach, in progress towards IoT security, intended to provide sufficient data and privacy protection in such a constrained ecosystem. Cryptanalysis of the physical leakages of these emerging ciphers is therefore crucial. EM side-channel attacks have a significant impact on digital forensics nowadays. In the existing literature, power analysis receives considerable research attention, whereas other phenomena, such as EM, should continue to be properly evaluated for their role in forensic analysis. The emphasis of this thesis is on lightweight cryptanalysis. The preliminary investigations found no prior Correlation EM Analysis (CEMA) of the PRESENT lightweight algorithm. PRESENT is a block cipher that promises to be adequate for IoT devices and is expected to be used commercially in the future. In an effort to fill this research gap, this work examines the capabilities of a correlation EM side-channel attack against PRESENT. For that, the Substitution box (S-box) of PRESENT was targeted in its first round, using a minimum number of EM waveforms compared to other work in the literature, namely 256. The attack indicates the possibility of retrieving 8 of the 10 bytes of the secret key. The experimental process started from a Simple EMA (SEMA) and was gradually enhanced up to a CEMA. The thesis presents the methodology of the attack modelling and the observations, followed by a critical analysis.
    A technical review of IoT technology and a comprehensive literature review on lightweight cryptology are also included.
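    The generic CEMA recipe the thesis applies can be sketched end to end: for every key guess, predict the S-box output's Hamming weight and correlate the predictions with the measured traces; the correct guess yields the highest Pearson correlation. The traces below are simulated with a Hamming-weight-plus-noise model rather than real EM captures, and the attack targets a single 4-bit key nibble for brevity.

```python
# Illustrative correlation attack (CPA/CEMA style) on one first-round
# PRESENT S-box lookup, using 256 simulated "waveforms".

import random
import statistics

SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]  # the PRESENT S-box

def hw(x):
    return bin(x).count("1")  # Hamming-weight leakage model

def pearson(xs, ys):
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den if den else 0.0

random.seed(1)
secret_key = 0xA
plaintexts = [random.randrange(16) for _ in range(256)]
# Simulated leakage: HW of the S-box output plus Gaussian measurement noise.
traces = [hw(SBOX[p ^ secret_key]) + random.gauss(0, 0.5) for p in plaintexts]

def best_guess(plaintexts, traces):
    scores = {k: pearson([hw(SBOX[p ^ k]) for p in plaintexts], traces)
              for k in range(16)}
    return max(scores, key=scores.get)

print(hex(best_guess(plaintexts, traces)))  # guess with the highest correlation
```

    A real CEMA repeats this per key nibble over every sample point of the captured EM waveforms; the simulation collapses each trace to the single leaking point.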

    An investigation of change in drone practices in broadacre farming environments

    The application of drones in broadacre farming is influenced by novel and emergent factors. Drone technology is subject to legal, financial, social, and technical constraints that affect the Agri-tech sector. This research showed that emerging improvements to drone technology influence the analysis of precision data, resulting in disparate and asymmetrically flawed Ag-tech outputs. The novelty of this thesis is that it examines the changes in drone technology through the lens of entropic decay. It considers the planning and controlling of an organisation's resources to minimise harmful effects through systems change. The rapid advances in drone technology have outpaced the systematic approaches that precision agriculture insists are the backbone of reliable ongoing decision-making. Different models and brands take data from different heights, at different times of the day, and with flights of differing velocities. Drone data are in a state of decay: no longer equally comparable to past years' harvest and crop data, they are now mixed into a blended environment of brand-specific variations in height, image resolution, air speed, and optics. This thesis investigates the problem of the rapid emergence of image-capture technology in drones and the corresponding shift away from the established measurements and comparisons used in precision agriculture. New capabilities are applied in an ad hoc manner as different features are rushed to market. At the same time, existing practices are subtly changed to suit individual technology capability. The result is a loose collection of technically superior drone imagery, with a corresponding mismatch of year-to-year agricultural data. The challenge is to understand and identify the difference between uniformly accepted technological advances and market-driven changes that demonstrate entropic decay. The goal of this research is to identify best-practice approaches for UAV deployment in broadacre farming.
    This study investigated the benefits of a range of characteristics for optimising data collection technologies. It identified widespread discrepancies demonstrating broadening decay in precision agriculture and productivity. The pace of drone development diverges so rapidly from mainstream agricultural practices that the once reliable yearly crop data no longer share statistically comparable metrics. Whilst farmers have relied upon decades of satellite data that has used the same optics, time of day, and flight paths, the innovations that drive increasingly smarter drone technologies are also highly problematic, since they render each successive past year's crop metrics outdated in terms of sophistication, detail, and accuracy. In five years, the standardised height for recording crop data has changed four times. New innovations, coupled with new rules and regulations, have altered the once reliable practice of recording crop data. In addition, the cost of entry in adopting new drone technology varies sufficiently that agriculturalists are acquiring multiple versions of different UAVs with variable camera and sensor settings, and vastly different approaches to flight records, data management, and recorded indices. Without addressing this problem, the true benefits of optimisation through machine learning are prevented from improving harvest outcomes for broadacre farming. The key findings of this research reveal a complex, constantly morphing environment that is seeking to build digital trust and reliability in an evolving global market in the face of rapidly changing technology, regulations, standards, networks, and knowledge. The once reliable discipline of precision agriculture is now a fractured melting pot of "first to market" innovations and highly competitive sellers.
    The future of drone technology is destined for further uncertainty as it struggles to establish a level of maturity that can return broadacre farming to consistent global outcomes.