16 research outputs found

    A New Handover Management Model for Two-Tier 5G Mobile Networks

    There has been an exponential rise in mobile data traffic in recent times due to the increasing popularity of portable devices such as tablets, smartphones, and laptops. This rapid rise has put extreme stress on network service providers and forced telecommunication engineers to look for innovative solutions to meet the increased demand. One solution is fifth-generation (5G) wireless communication, which can address these challenges by offering very high wireless area capacity and the potential to cut power consumption. Small cells are a fundamental mechanism of 5G technology: they enable higher capacity and frequency reuse. However, small-cell deployment leads to frequent handovers of mobile nodes. Given the importance of small cells in 5G, this paper examines a new resource management scheme that minimizes the handover rate of mobile phones through careful resource allocation in a two-tier network. The resource management problem is formulated as an optimization problem, and a heuristic approach is used to solve it. The solution is evaluated and validated through simulation and testing, during which performance improved by 12% in terms of handover cost. The proposed model is therefore more efficient than the existing model.
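    The abstract does not describe the heuristic itself, so the following is only a minimal illustrative sketch, under assumed names and cost values, of the kind of greedy tier-assignment that could reduce handover cost in a two-tier (macro cell / small cell) network:

```python
# Hypothetical greedy sketch (not the paper's actual heuristic): assign each
# user to the tier (macro vs. small cell) with the lower expected handover
# cost, subject to per-tier capacity. All names and costs are illustrative.

def assign_users(users, macro_capacity, small_capacity):
    """users: list of (user_id, macro_cost, small_cost) tuples.
    Returns {user_id: 'macro' | 'small'}, greedily reducing total cost."""
    # Handle the users with the largest cost gap between tiers first.
    ranked = sorted(users, key=lambda u: -abs(u[1] - u[2]))
    assignment, macro_used, small_used = {}, 0, 0
    for uid, macro_cost, small_cost in ranked:
        prefers_small = small_cost < macro_cost
        if prefers_small and small_used < small_capacity:
            assignment[uid], small_used = 'small', small_used + 1
        elif macro_used < macro_capacity:
            assignment[uid], macro_used = 'macro', macro_used + 1
        else:
            assignment[uid], small_used = 'small', small_used + 1
    return assignment
```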

    Cryptanalysis of an online/offline certificateless signature scheme for Internet of Health Things

    Recently, Khan et al. ["An online-offline certificateless signature scheme for internet of health things," Journal of Healthcare Engineering, vol. 2020] presented a new certificateless offline/online signature scheme for the Internet of Health Things (IoHT) to fulfill the authenticity requirements of the resource-constrained environment of IoHT devices. The authors claimed that the proposed scheme is formally secure against a Type-I adversary under the Random Oracle Model (ROM). Unfortunately, their scheme is insecure against adaptive chosen-message attacks: we demonstrate that an adversary can forge a valid signature on a message by replacing the public key. Furthermore, we performed a comparative analysis of selected parameters, including computation time, communication overhead, security, and formal proof, by employing Evaluation based on Distance from Average Solution (EDAS). The analysis shows that the scheme of Khan et al. offers no advantage over previous schemes, even though the authors utilized a lightweight hyperelliptic curve cryptosystem with a smaller key size of 80 bits. Finally, we give some suggestions on the construction of a concrete security scheme under the ROM.

    Blockchain Empowered Federated Learning Ecosystem for Securing Consumer IoT Features Analysis

    Resource-constrained Consumer Internet of Things (CIoT) devices are controlled through gateway devices (e.g., smartphones, computers) connected to Mobile Edge Computing (MEC) servers or a cloud regulated by a third party. Recently, Machine Learning (ML) has been widely used in automation, consumer behavior analysis, device quality improvement, and similar tasks. Typical ML makes predictions by analyzing customers' raw data in a centralized system, which raises security and privacy issues such as data leakage, privacy violations, and a single point of failure. To overcome these problems, Federated Learning (FL) emerged as an initial solution that provides services without sharing personal data. In FL, a centralized aggregator collects the collaborating clients' local updates and averages them into a global model used for the next round of training. However, the centralized aggregator raises similar issues: a single point of control can leak the updated model and interrupt the entire process. Additionally, research has shown that data can be reconstructed from model parameters. Beyond that, since the gateway (GW) device has full access to the raw data, it can also threaten the entire ecosystem. This research contributes a blockchain-controlled, edge-intelligence federated learning framework that provides a distributed learning platform for CIoT. The federated learning platform allows collaborative learning without sharing users' raw data, and the blockchain network replaces the centralized aggregator and ensures secure participation of gateway devices in the ecosystem. Furthermore, blockchain is trustless, immutable, and anonymous, encouraging CIoT end users to participate. We evaluated the framework and the federated learning outcomes using the well-known Stanford Cars dataset. Experimental results prove the effectiveness of the proposed framework.
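    The averaging step that a centralized aggregator performs in standard FL (the step the paper's blockchain network replaces) can be sketched in plain Python; the function name and list-based weight representation are illustrative, not the paper's implementation:

```python
# Minimal FedAvg sketch (no ML framework): the server combines each client's
# locally trained weights into a global model by a weighted average, where
# the weight is the size of the client's local dataset.

def fed_avg(client_weights, client_sizes):
    """client_weights: list of per-client parameter lists (equal length).
    client_sizes: number of local training samples per client.
    Returns the size-weighted average parameter list (the global model)."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    global_model = []
    for j in range(n_params):
        # weighted mean of parameter j across all clients
        global_model.append(
            sum(w[j] * s for w, s in zip(client_weights, client_sizes)) / total
        )
    return global_model
```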

    Leveraging convolutional neural network for COVID-19 disease detection using CT scan images

    In 2020, the world faced an unprecedented pandemic outbreak of coronavirus disease (COVID-19), which poses severe threats to patients suffering from diabetes, kidney problems, and heart problems. The lack of a rapid testing mechanism is a primary obstacle to controlling the spread of COVID-19. Current tests rely on the reverse transcription-polymerase chain reaction (RT-PCR), which takes around 4–6 h to identify COVID-19 patients. Various studies have recommended AI-based models leveraging machine learning, deep learning, and neural networks to classify COVID-19 and non-COVID patients from chest X-ray and computerized tomography (CT) scan images. However, no model can be claimed as a standard, since the models use different datasets. Convolutional neural network (CNN)-based deep learning models are widely used in image analysis to diagnose and classify various diseases. In this research, we develop a CNN-based diagnostic model to detect COVID-19 patients by analyzing the features in CT scan images. We fed a publicly available CT scan dataset into the proposed CNN model to classify COVID-19-infected patients. The model achieved 99.76%, 96.10%, and 96% accuracy in the training, validation, and test phases, respectively, a score of 0.986 in area under the curve (AUC), and 0.99 in the precision-recall curve (PRC). We compared the model's performance to that of three state-of-the-art pretrained models (MobileNetV2, InceptionV3, and Xception). The results show that the model can be used as a diagnostic tool for digital healthcare, particularly for COVID-19 chest CT image classification.
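    As a minimal illustration of the core operation inside such CNN models (not the paper's actual architecture), a single 2-D convolution over an image can be sketched in plain Python:

```python
# Illustrative sketch of the building block of a CNN layer: a filter (kernel)
# slides over the image and produces a feature map; deeper layers combine
# many such maps into a classification. Nested lists stand in for tensors.

def conv2d(image, kernel):
    """Valid (no-padding) 2-D cross-correlation on nested lists."""
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(image) - kh + 1, len(image[0]) - kw + 1
    out = [[0.0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            # dot product of the kernel with the image patch at (i, j)
            out[i][j] = sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw)
            )
    return out
```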

    Dickson polynomial-based secure group authentication scheme for Internet of Things

    The Internet of Things (IoT) paves the way for modern smart industrial applications and smart cities. A trusted authority acts as the sole controller that monitors and maintains communication between IoT devices and the infrastructure, and communication between the IoT devices of different trusted entities relies on generated security certificates. Establishing trust by generating security certificates for the IoT devices in a smart-city application can be costly. To address this, we propose a secure group authentication scheme that establishes trust among a group of IoT devices owned by several entities. Most existing authentication techniques are designed for individual device authentication and are merely reused for group authentication; in contrast, the Dickson polynomial-based secure group authentication scheme is a dedicated solution for group authentication. The secret keys used in the proposed technique are generated using the Dickson polynomial, which enables the group to authenticate without generating excessive network traffic overhead. Blockchain technology is employed to enable secure, efficient, and fast data transfer among the IoT devices of each group deployed at different places. The proposed scheme is resistant to replay, man-in-the-middle, tampering, side-channel, signature forgery, impersonation, and ephemeral secret key leakage attacks; to accomplish this, we have employed a hardware-based physically unclonable function (PUF). The implementation has been carried out in Python and deployed and tested on a blockchain using the Ethereum Goerli testnet.
    Performance analysis against various benchmarks shows that the proposed framework outperforms its counterparts, with better performance in terms of computation, communication, storage, and latency.
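    The Dickson polynomial primitive underlying the scheme can be evaluated with a simple linear recurrence. The sketch below only illustrates the primitive and its composition property (which is what makes it useful for key agreement); the scheme's actual parameters and key-derivation steps are not given in the abstract:

```python
# Dickson polynomial of the first kind, evaluated mod a prime p via the
# recurrence D_0 = 2, D_1 = x, D_n = x*D_{n-1} - a*D_{n-2}.
# For a = 1 these polynomials commute under composition:
# D_m(D_n(x, 1), 1) = D_{m*n}(x, 1), a Diffie-Hellman-like property.

def dickson(n, x, a, p):
    """Evaluate the degree-n Dickson polynomial D_n(x, a) over GF(p)."""
    if n == 0:
        return 2 % p
    prev, cur = 2 % p, x % p          # D_0, D_1
    for _ in range(n - 1):
        prev, cur = cur, (x * cur - a * prev) % p
    return cur
```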

    A Framework to Support Interdisciplinary Engagement with Learning Analytics

    Learning analytics can provide an excellent opportunity for instructors to gain an in-depth understanding of students' learning experiences in a course. However, certain technological challenges, namely the limited availability of learning analytics data because of learning management system restrictions, can make accessing this data seem impossible at some institutions. Furthermore, even where instructors have access to a range of student data, there may not be organized efforts to support students across various courses and university experiences. In this chapter, the authors discuss the issue of learning analytics access and ways to leverage learning analytics data between instructors, and in some cases administrators, to create interdisciplinary opportunities for comprehensive student support. The authors consider the implications of these interactions for students, instructors, and administrators. Additionally, the authors focus on some of the technological infrastructure issues involved in accessing learning analytics and discuss the opportunities available for faculty and staff to take a multi-pronged approach to addressing overall student success.

    Dynamic replication strategies in data grid systems: A survey

    In data grid systems, data replication aims to increase availability, fault tolerance, load balancing, and scalability while reducing bandwidth consumption and job execution time. Several classification schemes for data replication have been proposed in the literature: (i) static vs. dynamic, (ii) centralized vs. decentralized, (iii) push vs. pull, and (iv) objective-function based. Dynamic data replication is performed in response to the changing conditions of the grid environment. In this paper, we present a survey of recent dynamic data replication strategies. We study and classify these strategies, taking the target data grid architecture as the sole classifier, discuss their key points, and provide a feature comparison according to important metrics. Furthermore, the impact of data grid architecture on dynamic replication performance is investigated in a simulation study. Finally, some important issues and open research problems in the area are pointed out.
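    One simple dynamic replication policy of the kind such surveys classify is popularity-threshold replication: replicate a file at a site once local demand for it crosses a threshold. The sketch below is illustrative only (real strategies also weigh bandwidth, storage capacity, and grid topology), and all names are hypothetical:

```python
# Illustrative popularity-threshold replication decision: each file access
# at a site is recorded, and the file becomes a replication candidate at
# that site once its local access count reaches the threshold.

def update_and_decide(access_counts, site, filename, threshold=3):
    """Record an access and return True if the file should be replicated
    at `site`. access_counts maps (site, filename) -> access count."""
    key = (site, filename)
    access_counts[key] = access_counts.get(key, 0) + 1
    return access_counts[key] >= threshold
```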

    Leveraging Deep Learning Techniques for Malaria Parasite Detection Using Mobile Application

    Malaria is a contagious disease that affects millions of lives every year. Traditional laboratory diagnosis of malaria requires an experienced examiner and careful inspection to discriminate healthy from infected red blood cells (RBCs). It is also very time-consuming and may produce inaccurate reports due to human error. Cognitive computing and deep learning algorithms simulate human intelligence to support better decisions in applications such as sentiment analysis, speech recognition, face detection, and disease detection and prediction. Thanks to advances in cognitive computing and machine learning, these techniques are now widely used to detect and predict early disease symptoms in the healthcare field, and with early prediction results, healthcare professionals can make better decisions about patient diagnosis and treatment. Machine learning algorithms also help humans process huge, complex medical datasets and turn them into clinical insights. This paper leverages deep learning algorithms to detect malaria, a deadly disease, in a mobile healthcare solution built as an effective mobile system. The objective is to show how a deep learning architecture such as a convolutional neural network (CNN) can detect malaria effectively and accurately in real time from input images and reduce manual labor through a mobile application. To this end, we evaluate the performance of a custom CNN model using a cyclical stochastic gradient descent (SGD) optimizer with an automatic learning rate finder and obtain an accuracy of 97.30% in classifying healthy and infected cell images, with a high degree of precision and sensitivity. This outcome brings microscopy-based malaria diagnosis to a mobile application, improving the reliability of treatment where medical expertise is lacking.
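    The cyclical learning-rate schedule commonly paired with SGD (a triangular wave between a lower and an upper bound) can be sketched as follows; the bounds and cycle length here are assumptions for illustration, not the paper's reported settings:

```python
# Sketch of a triangular cyclical learning-rate schedule: the rate ramps
# linearly from base_lr up to max_lr over step_size steps, then back down,
# repeating. An LR finder would pick base_lr and max_lr empirically.

def triangular_lr(step, base_lr=1e-4, max_lr=1e-2, step_size=100):
    """Learning rate at a given training step under a triangular cycle."""
    cycle_pos = step % (2 * step_size)
    # distance from the peak of the cycle, scaled to [0, 1]
    x = abs(cycle_pos / step_size - 1.0)
    return base_lr + (max_lr - base_lr) * (1.0 - x)
```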

    Evaluation through Realistic Simulations of File Replication Strategies for Large Heterogeneous Distributed Systems

    File replication is widely used to reduce file transfer times and improve data availability in large distributed systems. Replication techniques are often evaluated through simulations; however, most simulation platform models are oversimplified, which calls into question the applicability of the findings to real systems. In this paper, we investigate how platform models influence the performance of file replication strategies on large heterogeneous distributed systems, based on common existing techniques such as prestaging and dynamic replication. The novelty of our study resides in our evaluation using a realistic simulator. We consider two platform models: a simple hierarchical model and a detailed model built from execution traces. Our results show that conclusions depend on the modeling of the platform and its capacity to capture the characteristics of the targeted production infrastructure. We also derive recommendations for the implementation of an optimized data management strategy in a scientific gateway for medical image analysis.