432 research outputs found

    Resource Allocation in the Cognitive Radio Network-Aided Internet of Things for the Cyber-Physical-Social System: An Efficient Jaya Algorithm

    Get PDF
    Currently, there is growing demand for communication network bandwidth from the Internet of Things (IoT) within the cyber-physical-social system (CPSS), which calls for progressively more powerful techniques for exploiting scarce spectrum resources. Cognitive radio networks (CRNs), one of the most important such solutions, are therefore used to realize the IoT effectively. In general, dynamic resource allocation plays a crucial role in the design of CRN-aided IoT systems. For this task, orthogonal frequency division multiplexing (OFDM) has been identified as one of the most successful technologies, as it employs a multi-carrier parallel radio transmission strategy. In this article, drawing on the swarm intelligence paradigm, a solution approach based on an efficient Jaya algorithm, called PA-Jaya, is proposed to deal with the power allocation problem in cognitive OFDM radio networks for the IoT. Because the proposed PA-Jaya algorithm is free of algorithm-specific parameters, satisfactory computational performance can be achieved on this problem. For this constrained optimization problem, simulation results show that, compared with some popular algorithms, PA-Jaya further improves spectrum utilization efficiency with faster convergence while maximizing the total transmission rate.
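    The Jaya update rule the abstract alludes to is parameter-free: each candidate moves toward the current best solution and away from the worst, with only random coefficients. The sketch below applies it to a toy OFDM power-allocation problem (maximize sum-rate under a total power budget); the gains, noise level, and budget-projection repair step are illustrative assumptions, not the paper's actual PA-Jaya.

```python
import numpy as np

def jaya_power_allocation(gains, p_total=1.0, noise=1e-3,
                          pop_size=20, iters=200, seed=0):
    """Illustrative Jaya search for OFDM subcarrier power allocation.

    Maximizes sum_i log2(1 + p_i * g_i / noise) subject to
    sum_i p_i = p_total, p_i >= 0.  Toy model, not the paper's PA-Jaya.
    """
    rng = np.random.default_rng(seed)
    n = len(gains)

    def repair(p):
        p = np.clip(p, 1e-9, None)
        return p * (p_total / p.sum())   # project onto the power budget

    def rate(p):
        return float(np.sum(np.log2(1.0 + p * gains / noise)))

    pop = np.array([repair(rng.random(n)) for _ in range(pop_size)])
    fit = np.array([rate(p) for p in pop])
    for _ in range(iters):
        best = pop[fit.argmax()].copy()
        worst = pop[fit.argmin()].copy()
        for k in range(pop_size):
            r1, r2 = rng.random(n), rng.random(n)
            # Jaya move: toward the best, away from the worst --
            # no algorithm-specific parameters to tune.
            cand = repair(pop[k] + r1 * (best - np.abs(pop[k]))
                                 - r2 * (worst - np.abs(pop[k])))
            f = rate(cand)
            if f > fit[k]:               # greedy replacement
                pop[k], fit[k] = cand, f
    return pop[fit.argmax()], fit.max()
```

    The budget constraint is handled by a simple projection (rescaling to the power budget); the paper's constraint handling may differ.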

    Energy-efficient non-orthogonal multiple access for wireless communication system

    Get PDF
    Non-orthogonal multiple access (NOMA) has been recognized as a potential solution for enhancing the throughput of next-generation wireless communications. NOMA is a promising option for 5G networks due to its superior spectrum efficiency (SE) compared to orthogonal multiple access (OMA). From the perspective of green communication, energy efficiency (EE) has become a new performance indicator. A systematic literature review is conducted to investigate the energy-efficient approaches researchers have employed in NOMA. We identified 19 subcategories related to EE in NOMA across 108 publications, 92 of which are from the IEEE website. To help the reader, a summary of each category is explained and elaborated in detail. From the literature review, it has been observed that NOMA can enhance the EE of wireless communication systems. At the end of this survey, future research directions, particularly machine learning algorithms such as reinforcement learning (RL) and deep reinforcement learning (DRL) for NOMA, are also discussed.
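    The SE advantage over OMA that the survey builds on can be illustrated with a standard two-user downlink model: both users share the full band via power-domain superposition, and the strong user cancels the weak user's signal with successive interference cancellation (SIC). All numbers below (power, channel gains, power split) are illustrative assumptions, not values from the survey.

```python
import math

def noma_sum_rate(p, h_strong, h_weak, a_weak, n0=1.0):
    """Two-user downlink NOMA sum rate (bits/s/Hz), toy model.

    a_weak: fraction of power given to the weak user.  The strong user
    removes the weak user's signal via SIC before decoding its own,
    while the weak user treats the strong user's signal as noise.
    """
    r_strong = math.log2(1 + (1 - a_weak) * p * h_strong / n0)
    r_weak = math.log2(1 + a_weak * p * h_weak /
                       ((1 - a_weak) * p * h_weak + n0))
    return r_strong + r_weak

def oma_sum_rate(p, h_strong, h_weak, n0=1.0):
    """OMA baseline: each user gets half the bandwidth at full power."""
    return 0.5 * (math.log2(1 + p * h_strong / n0)
                  + math.log2(1 + p * h_weak / n0))
```

    With, e.g., p=10, h_strong=2.0, h_weak=0.2 and 80% of the power given to the weak user, the NOMA sum rate exceeds the OMA baseline, which is the SE gain the surveyed EE schemes exploit.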

    A novel design approach for 5G massive MIMO and NB-IoT green networks using a hybrid Jaya-differential evolution algorithm

    Get PDF
    Our main objective is to reduce power consumption by responding to the instantaneous bit rate demanded by the user for 4th Generation (4G) and 5th Generation (5G) Massive MIMO network configurations. Moreover, we present and address the problem of designing green LTE networks with Internet of Things (IoT) nodes. We consider the new NarrowBand-IoT (NB-IoT) wireless technology that will emerge in current and future access networks. In this context, we apply emerging evolutionary algorithms to green network design. We investigate three different cases to show the performance of the newly proposed algorithm, namely the 4G, 5G Massive MIMO, and NB-IoT technologies. More specifically, we investigate Teaching-Learning-Based Optimization (TLBO), the Jaya algorithm, the self-adaptive differential evolution jDE algorithm, and other hybrid algorithms. We introduce a new hybrid algorithm named Jaya-jDE that combines concepts from both the Jaya and jDE algorithms in an effective way. The results show that 5G Massive MIMO networks require about 50% less power consumption than 4G ones, and that NB-IoT in-band deployment requires about 10% less power than guard-band deployment. Moreover, Jaya-jDE emerges as the best algorithm based on the results.
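    The jDE ingredient of the hybrid is differential evolution in which each individual carries its own mutation factor F and crossover rate CR, re-randomized with a small probability each generation. A minimal sketch of that self-adaptation (generic minimizer, not the paper's Jaya-jDE hybrid or its network-design objective):

```python
import numpy as np

def jde_minimize(f, bounds, pop_size=30, iters=300, seed=0):
    """Sketch of self-adaptive DE (jDE): per-individual F and CR are
    re-randomized with probability 0.1 and survive only if the trial
    vector they produced replaces its parent."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(lo)
    pop = lo + rng.random((pop_size, dim)) * (hi - lo)
    fit = np.apply_along_axis(f, 1, pop)
    F = np.full(pop_size, 0.5)
    CR = np.full(pop_size, 0.9)
    for _ in range(iters):
        for i in range(pop_size):
            # jDE self-adaptation of the control parameters
            Fi = 0.1 + 0.9 * rng.random() if rng.random() < 0.1 else F[i]
            CRi = rng.random() if rng.random() < 0.1 else CR[i]
            a, b, c = rng.choice([j for j in range(pop_size) if j != i],
                                 3, replace=False)
            mutant = np.clip(pop[a] + Fi * (pop[b] - pop[c]), lo, hi)
            cross = rng.random(dim) < CRi
            cross[rng.integers(dim)] = True   # at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            ft = f(trial)
            if ft <= fit[i]:                  # greedy selection
                pop[i], fit[i], F[i], CR[i] = trial, ft, Fi, CRi
    return pop[fit.argmin()], fit.min()
```

    A hybrid in the spirit of Jaya-jDE would interleave moves like this with Jaya's best/worst-guided update; the exact combination rule is the paper's contribution and is not reproduced here.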

    Privacy protection and energy optimization for 5G-aided industrial internet of things

    Get PDF
    5G is expected to revolutionize every sector of life by providing interconnectivity of everything everywhere at high speed. However, massively interconnected devices and fast data transmission will bring challenges of privacy as well as energy deficiency. In today's fast-paced economy, almost every sector depends on energy resources, while the energy sector itself relies mainly on fossil fuels, which constitute about 80% of energy globally. This massive extraction and combustion of fossil fuels has many adverse impacts on health, the environment, and the economy. The newly emerging 5G technology has changed the existing way of life by connecting everything everywhere using IoT devices. 5G-enabled IIoT devices have transformed everything from traditional to smart, e.g. smart cities, smart healthcare, smart industry, and smart manufacturing. However, massive I/O technologies for providing D2D connections have also created privacy issues that need to be addressed. Privacy is a fundamental right of every individual, and 5G industries and organizations need to preserve it for their stability and competency. Therefore, privacy at all three levels (data, identity, and location) needs to be maintained. Further, energy optimization is a major challenge that must be addressed to leverage the potential benefits of 5G and 5G-aided IIoT: the billions of IIoT devices expected to communicate over 5G networks will consume a considerable amount of energy while energy resources remain limited. To fill these gaps, we provide a comprehensive framework that will help energy researchers and practitioners better understand 5G-aided Industry 4.0 infrastructure and optimize energy resources while improving privacy. The proposed framework is evaluated using case studies and mathematical modelling.
© 2020 Institute of Electrical and Electronics Engineers Inc. All rights reserved.

    Internet of Things and Sensors Networks in 5G Wireless Communications

    Get PDF
    This book is a printed edition of the Special Issue "Internet of Things and Sensors Networks in 5G Wireless Communications" that was published in Sensors.

    Internet of Things and Sensors Networks in 5G Wireless Communications

    Get PDF
    The Internet of Things (IoT) has attracted much attention from society, industry, and academia as a promising technology that can enhance day-to-day activities, enable the creation of new business models, products, and services, and serve as a broad source of research topics and ideas. A future digital society is envisioned, composed of numerous wirelessly connected sensors and devices. Driven by huge demand, massive IoT (mIoT), or massive machine-type communication (mMTC), has been identified as one of the three main communication scenarios for 5G. In addition to connectivity, computing, storage, and data management are also long-standing issues for low-cost devices and sensors. The book is a collection of outstanding technical research and industrial papers covering new research results, with a wide range of features within the 5G-and-beyond framework. It provides a range of discussions of the major research challenges and achievements within this topic.

    Unmanned Aerial Vehicle (UAV)-Enabled Wireless Communications and Networking

    Get PDF
    The emerging massive density of human-held and machine-type nodes implies larger traffic deviations in the future than we face today. The future network will be characterized by a high degree of flexibility, allowing it to adapt smoothly, autonomously, and efficiently to quickly changing traffic demands in both time and space. This flexibility cannot be achieved when the network's infrastructure remains static. To this end, the topic of UAV (unmanned aerial vehicle)-enabled wireless communications and networking has received increased attention. As mentioned above, the network must serve a massive density of nodes that can be either human-held (user devices) or machine-type nodes (sensors). To properly serve these nodes and optimize their data, a proper wireless connection is fundamental, and this can be achieved using UAV-enabled communications and networks. This Special Issue addresses the many issues that still stand in the way of properly rolling out UAV-enabled wireless communications and networking.

    HBMFTEFR: Design of a Hybrid Bioinspired Model for Fault-Tolerant Energy Harvesting Networks via Fuzzy Rule Checks

    Get PDF
    Designing energy harvesting networks requires modelling of energy distribution under different real-time network conditions. These networks showcase better energy efficiency, but are affected by internal & external faults, which increase the energy consumption of affected nodes. Due to this, the probability of node failure and network failure increases, which reduces QoS (Quality of Service) for the network deployment. To overcome this issue, various fault tolerance & mitigation models have been proposed by researchers, but these models require large training datasets & real-time samples for efficient operation. This increases computational complexity, storage cost & end-to-end processing delay of the network, which reduces its QoS performance under real-time use cases. To mitigate these issues, this text proposes the design of a hybrid bioinspired model for fault-tolerant energy harvesting networks via fuzzy rule checks. The proposed model initially uses a Genetic Algorithm (GA) to cluster nodes depending upon their residual energy & distance metrics. Clustered nodes are processed via Particle Swarm Optimization (PSO), which assists in deploying a fault-tolerant & energy-harvesting process. The PSO model is further augmented via a hybrid Ant Colony Optimization (ACO) model with Teacher Learner Based Optimization (TLBO), which assists in value-based fault prediction & mitigation operations. All bioinspired models are trained once during initial network deployment, and then evaluated for each subsequent communication request. After a pre-set number of communications, the model re-evaluates average QoS performance and incrementally reconfigures the selected solutions. Due to this incremental tuning, the model is observed to consume less energy and showcases lower complexity when compared with other state-of-the-art models.
Upon evaluation, it was observed that the proposed model showcases 15.4% lower energy consumption, 8.5% faster communication response, 9.2% better throughput, and 1.5% better packet delivery ratio (PDR) when compared with recently proposed energy harvesting models. The proposed model also showcased better fault prediction & mitigation performance than its counterparts, making it useful for a wide variety of real-time network deployments.
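    The first stage, GA clustering on residual energy & distance metrics, can be sketched as selecting cluster heads that score well on a weighted fitness of head energy and member-to-head distance. The fitness weights, node coordinates, and GA operators below are illustrative assumptions, not the paper's HBMFTEFR design.

```python
import math
import random

def ch_fitness(heads, nodes, energies, w_energy=0.6, w_dist=0.4):
    """Toy fitness for a cluster-head set (higher is better): favors
    heads with high residual energy and members that are close to
    their nearest head.  Illustrates the GA stage only."""
    mean_energy = sum(energies[h] for h in heads) / len(heads)
    mean_dist = sum(min(math.dist(p, nodes[h]) for h in heads)
                    for p in nodes) / len(nodes)
    return w_energy * mean_energy - w_dist * mean_dist

def ga_select_heads(nodes, energies, k=3, pop_size=30, gens=100, seed=0):
    """Simple GA over k-subsets of node indices."""
    rng = random.Random(seed)
    n = len(nodes)
    pop = [rng.sample(range(n), k) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda h: ch_fitness(h, nodes, energies),
                 reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(survivors)):
            p1, p2 = rng.sample(survivors, 2)
            # crossover: merge parents, dedupe, keep first k genes
            child = list(dict.fromkeys(p1 + p2))[:k]
            if rng.random() < 0.2:  # mutation: swap in an unused node
                child[rng.randrange(k)] = rng.choice(
                    [g for g in range(n) if g not in child])
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda h: ch_fitness(h, nodes, energies))
```

    The PSO, ACO, and TLBO stages described above would then refine routing and fault prediction on top of these clusters; those stages are not sketched here.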

    Brain Neoplasm Classification & Detection of Accuracy on MRI Images

    Get PDF
    Abnormal, uncontrolled cell growth in the brain, commonly known as a brain tumor, can exert immense pressure on nerves and blood vessels, causing irreversible harm to the body. Early detection of brain tumors is the key to avoiding such complications, and tumor detection can be performed with various advanced machine learning and image processing algorithms. Brain tumors have proven challenging to treat, largely owing to the biological characteristics of these cancers, which often conspire to limit progress. First, by infiltrating one of the body's most vital organs, these tumors are frequently located beyond the reach of even the most skilled neurosurgeon. They also sit behind the blood-brain barrier (BBB), the system of tight junctions and transport proteins that shields delicate brain tissue from exposure to factors in the general circulation, thereby also blocking exposure to systemic chemotherapy [6,7]. Furthermore, the unique developmental, genetic, epigenetic, and microenvironmental features of the brain frequently render these tumors resistant to conventional and novel treatments alike. These difficulties are compounded by the rarity of brain tumors relative to many other forms of cancer, limiting the level of funding and interest from the pharmaceutical industry and attracting a relatively small and fragmented research community.

    Internet of Underwater Things and Big Marine Data Analytics -- A Comprehensive Survey

    Full text link
    The Internet of Underwater Things (IoUT) is an emerging communication ecosystem developed for connecting underwater objects in maritime and underwater environments. The IoUT technology is intricately linked with intelligent boats and ships, smart shores and oceans, automatic marine transportation, positioning and navigation, underwater exploration, disaster prediction and prevention, as well as intelligent monitoring and security. The IoUT has an influence at various scales, ranging from a small scientific observatory, to a mid-sized harbor, to global oceanic trade. The network architecture of the IoUT is intrinsically heterogeneous and should be sufficiently resilient to operate in harsh environments. This creates major challenges in terms of underwater communications, whilst relying on limited energy resources. Additionally, the volume, velocity, and variety of data produced by sensors, hydrophones, and cameras in the IoUT is enormous, giving rise to the concept of Big Marine Data (BMD), which has its own processing challenges. Hence, conventional data processing techniques will falter, and bespoke Machine Learning (ML) solutions have to be employed for automatically learning the specific BMD behavior and features, facilitating knowledge extraction and decision support. The motivation of this paper is to comprehensively survey the IoUT, BMD, and their synthesis. It also aims to explore the nexus of BMD with ML. We set out from underwater data collection and then discuss the family of IoUT data communication techniques, with an emphasis on the state-of-the-art research challenges. We then review the suite of ML solutions suitable for BMD handling and analytics. We treat the subject deductively from an educational perspective, critically appraising the material surveyed.
    Comment: 54 pages, 11 figures, 19 tables; IEEE Communications Surveys & Tutorials, peer-reviewed academic journal