Software Engineering Applications enabled by Blockchain Technology: A Systematic Mapping Study
The novel yet disruptive blockchain technology has attracted growing attention due to its intrinsic potential. Beyond the conventional domains that benefit from this potential, such as finance, supply chain, and healthcare, blockchain use cases in software engineering have emerged recently. In this study, we aim to contribute to the body of knowledge of blockchain-oriented software engineering by providing an overview of the software engineering applications enabled by blockchain technology. To do so, we carried out a systematic mapping study and identified 22 primary studies. We then extracted data within the research type, research topic, and contribution type facets. Findings suggest an increasing trend of studies since 2018. They also reveal the potential of blockchain technologies as an alternative to centralized systems, such as GitHub, Travis CI, and cloud-based package managers, and as a means to establish trust between parties in collaborative software development. We also found that smart contracts can automate a variety of software engineering activities that usually require human reasoning, such as the acceptance phase, payments to software engineers, and compliance adherence. Although the field is not yet mature, we believe this systematic mapping study provides a holistic overview that may benefit researchers interested in bringing blockchain to the software industry and practitioners willing to understand how blockchain can transform software development.
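To make the smart-contract claim above concrete, here is a minimal Python sketch of escrow-style logic that a contract could encode to release a developer's payment once the acceptance test suite passes. The AcceptanceEscrow class, its fields, and the amounts are hypothetical illustrations, not artifacts from any of the 22 mapped studies.

from dataclasses import dataclass

@dataclass
class AcceptanceEscrow:
    """Hypothetical escrow mirroring a smart contract that releases
    payment to a developer once all acceptance tests pass."""
    client: str
    developer: str
    amount: float
    released: bool = False

    def submit_results(self, tests_passed: int, tests_total: int) -> bool:
        # Release funds only when every acceptance test passes; on an
        # actual blockchain this check would be enforced by contract
        # code rather than by a trusted intermediary.
        if not self.released and tests_total > 0 and tests_passed == tests_total:
            self.released = True
        return self.released

escrow = AcceptanceEscrow(client="acme", developer="dev42", amount=1500.0)
print(escrow.submit_results(tests_passed=120, tests_total=120))  # True: payment released

Because the release condition is evaluated by the contract itself, neither party has to trust a central platform to settle the payment, which is the trust-establishing property the mapped studies point to.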
Edge intelligence in smart grids: a survey on architectures, offloading models, cyber security measures, and challenges
The rapid development of new information and communication technologies (ICTs) and the deployment of advanced Internet of Things (IoT)-based devices have led to the study and implementation of edge computing technologies in smart grid (SG) systems. In addition, substantial work in the literature has incorporated artificial intelligence (AI) techniques into edge computing, resulting in the promising concept of edge intelligence (EI). In this article, we provide an overview of the current state of the art in EI-based SG adoption from a range of angles, including architectures, computation offloading, and cybersecurity concerns. The objectives of this article are fourfold. First, we discuss EI and SGs separately: we highlight contemporary concepts closely related to edge computing, its fundamental characteristics, and essential enabling technologies from an EI perspective; we discuss how AI has aided in optimizing the performance of edge computing; and we emphasize the important enabling technologies and applications of SGs from the perspective of EI-based SGs. Second, we explore both general edge computing architectures and EI-based architectures from the perspective of SGs. Third, we address two basic questions about computation offloading: what is computation offloading, and why do we need it? We also divide the primary articles into two categories based on the number of users included in the model: single-user or multi-user. Finally, we review the cybersecurity threats associated with edge computing and the methods used to mitigate them in SGs. The survey concludes that most viable architectures for EI in smart grids consist of three layers (device, edge, and cloud) and that computation offloading techniques should be framed as optimization problems and solved effectively to increase system performance. This article is intended to serve as a primer for emerging scholars interested in the study of EI in SGs.
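As a concrete illustration of framing offloading as an optimization problem, the following Python sketch picks the execution site (device, edge, or cloud, matching the three-layer architecture the survey identifies) that minimizes a weighted latency-energy cost for a single-user task. All constants and the cost model are illustrative placeholders, not results from the surveyed papers.

# Minimal sketch: single-user computation offloading framed as a
# discrete optimization over execution sites. All numbers below are
# illustrative placeholders.

TASK_CYCLES = 2e9        # CPU cycles required by the task
DATA_BITS = 4e6          # input size to transmit when offloading

SITES = {
    #          (CPU freq Hz, uplink bps, tx power W, local power W)
    "device": (1e9,  None, 0.0, 0.9),
    "edge":   (5e9,  2e7,  0.5, 0.0),
    "cloud":  (2e10, 1e7,  0.5, 0.0),
}

def cost(site: str, w_latency: float = 0.6, w_energy: float = 0.4) -> float:
    freq, uplink, tx_power, local_power = SITES[site]
    tx_time = 0.0 if uplink is None else DATA_BITS / uplink
    exec_time = TASK_CYCLES / freq
    latency = tx_time + exec_time
    # Local execution pays compute energy; offloading pays transmission energy.
    energy = local_power * exec_time + tx_power * tx_time
    return w_latency * latency + w_energy * energy

best = min(SITES, key=cost)
print(best, round(cost(best), 3))

Multi-user formulations extend the same idea by optimizing the joint assignment of many tasks under shared bandwidth and edge-capacity constraints, which is why the survey stresses treating offloading as an optimization problem rather than a fixed policy.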
Smart healthcare system for severity prediction and critical tasks management of COVID-19 patients in IoT-fog computing environments
COVID-19 has overwhelmed healthcare systems around the world. Severe cases must be identified as early as possible so that services and treatment can be deployed and intensified. Many biomarkers are being investigated to track a patient's condition; unfortunately, these may overlap with the symptoms of other diseases, making it more difficult for a specialist to diagnose or predict the severity of a case. This research develops a Smart Healthcare System for Severity Prediction and Critical Tasks Management (SHSSP-CTM) for COVID-19 patients. On the one hand, a machine learning (ML) model is proposed to predict the severity of COVID-19 disease. On the other hand, a multi-agent system is proposed to prioritize patients according to the seriousness of their COVID-19 condition and then provide complete network management from the edge to the cloud. Clinical data, including Internet of Medical Things (IoMT) sensor readings and Electronic Health Record (EHR) data of 78 patients from one hospital in the Wasit Governorate, Iraq, were used in this study. Different data sources are fused to generate new feature patterns, and data mining techniques such as normalization and feature selection are applied. Two models, logistic regression (LR) and random forest (RF), are used as baseline severity predictors. A multi-agent algorithm (MAA), consisting of a personal agent (PA) and a fog node agent (FNA), controls the prioritization of COVID-19 patients. The highest prediction results are achieved with data fusion and selected features, where all examined classifiers show a significant increase in accuracy. Compared with state-of-the-art methods, the RF model showed high and balanced prediction performance with 86% accuracy, 85.7% F-score, 87.2% precision, and 86% recall. In addition, compared with the cloud, the MAA showed very significant performance: resource usage was 66% in the proposed model versus 34% in the traditional cloud, delay was 19% in the proposed model versus 81% in the cloud, and consumed energy was 31% in the proposed model versus 69% in the cloud. The findings of this study allow for the early detection of three severity levels, lowering mortality rates.
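The predictive pipeline the abstract describes (normalization, feature selection, then LR and RF baselines over three severity classes) can be sketched with scikit-learn as follows. Since the IoMT/EHR records are not public, synthetic data stands in for the fused feature patterns, and every hyperparameter here is illustrative rather than taken from the paper.

# Minimal sketch of the described baseline pipeline:
# normalization -> feature selection -> classifier.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in: 78 patients, three severity classes.
X, y = make_classification(n_samples=78, n_features=20, n_informative=8,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for name, clf in [("LR", LogisticRegression(max_iter=1000)),
                  ("RF", RandomForestClassifier(n_estimators=200, random_state=0))]:
    pipe = Pipeline([("scale", StandardScaler()),              # normalization
                     ("select", SelectKBest(f_classif, k=10)), # feature selection
                     ("model", clf)])
    pipe.fit(X_tr, y_tr)
    print(name, round(pipe.score(X_te, y_te), 3))

The abstract's finding that fused, selected features lift every classifier's accuracy corresponds here to the SelectKBest step: swapping it out of the pipeline and re-running the comparison is the natural ablation.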