
    Optimal Timer Based Selection Schemes

    Timer-based mechanisms are often used to help a given (sink) node select the best helper node among many available nodes. Specifically, a node transmits a packet when its timer expires, and the timer value is a monotone non-increasing function of its local suitability metric. The best node is selected successfully if no other node's timer expires within a 'vulnerability' window after its timer expiry, and so long as the sink can hear the available nodes. In this paper, we show that the optimal metric-to-timer mapping that (i) maximizes the probability of success or (ii) minimizes the average selection time subject to a minimum constraint on the probability of success, maps the metric into a set of discrete timer values. We specify, in closed form, the optimal scheme as a function of the maximum selection duration, the vulnerability window, and the number of nodes. An asymptotic characterization of the optimal scheme turns out to be elegant and insightful. For any probability distribution function of the metric, the optimal scheme is scalable, distributed, and performs much better than the popular inverse metric timer mapping. It even compares favorably with splitting-based selection when the latter's feedback overhead is accounted for.
    Comment: 21 pages, 6 figures, 1 table, submitted to IEEE Transactions on Communications, uses stackrel.st
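
    The mechanism summarized above lends itself to a quick numerical check. The following minimal Monte Carlo sketch (not from the paper) estimates the success probability when each node maps a uniform metric to a timer through the simple linear, monotone non-increasing mapping t = t_max * (1 - mu); the metric distribution, the mapping, and the name success_prob are illustrative assumptions rather than the paper's optimal discrete scheme.

    import random

    def success_prob(n_nodes=10, t_max=1.0, vuln=0.05, trials=100_000):
        # Each node's metric mu is uniform on [0, 1]; t = t_max * (1 - mu) is
        # monotone non-increasing, so the best node's timer always expires first.
        successes = 0
        for _ in range(trials):
            timers = sorted(t_max * (1.0 - random.random()) for _ in range(n_nodes))
            # Success: the runner-up's timer expires outside the vulnerability window.
            if timers[1] - timers[0] > vuln:
                successes += 1
        return successes / trials

    print(success_prob())

    Sweeping n_nodes or vuln in such a simulation illustrates why the choice of metric-to-timer mapping matters as the number of contending nodes grows.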

    How does Federated Learning Impact Decision-Making in Firms: A Systematic Literature Review


    Towards Secure and Privacy-Preserving IoT enabled Smart Home: Architecture and Experimental Study

    Internet of Things (IoT) technology is increasingly pervasive in all aspects of our life, and its usage is anticipated to increase significantly in future Smart Cities to support their myriad of revolutionary applications. This paper introduces a new architecture that can support several IoT-enabled smart home use cases with a specified level of security and privacy preservation. The security threats that may target such an architecture are highlighted, along with the cryptographic algorithms that can prevent them. An experimental study is performed to provide more insight into the suitability of several lightweight cryptographic algorithms for securing the constrained IoT devices used in the proposed architecture. The obtained results show that many modern lightweight symmetric cryptography algorithms, such as CLEFIA and TRIVIUM, are optimized for hardware implementations and can consume up to 10 times more energy than the legacy techniques when implemented in software. Moreover, the experimental results highlight that CLEFIA significantly outperforms TRIVIUM under all of the investigated test cases, and the latter performs 100 times worse than the legacy cryptographic algorithms tested.
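
    As a rough illustration of the kind of software benchmarking described above, the sketch below times a legacy cipher (AES-128-CTR via the Python cryptography package) in a simple throughput loop; any software CLEFIA or TRIVIUM implementation exposing an encrypt(data) callable could be dropped into the same harness. The payload size, round count, and function names are assumptions, and raw throughput is only a proxy for the energy measurements reported in the study.

    import os
    import time
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def throughput_mbit_s(encrypt, payload_size=1024, rounds=10_000):
        # Encrypt the same payload repeatedly and report megabits per second.
        payload = os.urandom(payload_size)
        start = time.perf_counter()
        for _ in range(rounds):
            encrypt(payload)
        elapsed = time.perf_counter() - start
        return (payload_size * rounds * 8) / (elapsed * 1e6)

    key, nonce = os.urandom(16), os.urandom(16)
    cipher = Cipher(algorithms.AES(key), modes.CTR(nonce))

    def aes_ctr_encrypt(data):
        enc = cipher.encryptor()
        return enc.update(data) + enc.finalize()

    print(f"AES-128-CTR (legacy baseline): {throughput_mbit_s(aes_ctr_encrypt):.1f} Mbit/s")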

    Separation Framework: An Enabler for Cooperative and D2D Communication for Future 5G Networks

    Soaring capacity and coverage demands dictate that future cellular networks will soon need to migrate towards ultra-dense networks. However, network densification comes with a host of challenges that include compromised energy efficiency, complex interference management, cumbersome mobility management, burdensome signaling overheads, and higher backhaul costs. Interestingly, most of the problems that beleaguer network densification stem from one feature common to legacy networks, i.e., the tight coupling between the control and data planes, regardless of their degree of heterogeneity and cell density. Consequently, in the wake of 5G, the control and data plane separation architecture (SARC) has recently been conceived as a promising paradigm that has the potential to address most of the aforementioned challenges. In this article, we review various proposals that have been presented in the literature so far to enable SARC. More specifically, we analyze how, and to what degree, various SARC proposals address the four main challenges in network densification, namely energy efficiency, system-level capacity maximization, interference management, and mobility management. We then focus on two salient features of future cellular networks that have not yet been adopted at a wide scale in legacy networks and thus remain a hallmark of 5G, i.e., coordinated multipoint (CoMP) and device-to-device (D2D) communications. After providing the necessary background on CoMP and D2D, we analyze how SARC can act as a major enabler for CoMP and D2D in the context of 5G. This article thus serves as both a tutorial and an up-to-date survey on SARC, CoMP, and D2D. Most importantly, it provides an extensive outlook on the challenges and opportunities that lie at the crossroads of these three mutually entangled emerging technologies.
    Comment: 28 pages, 11 figures, IEEE Communications Surveys & Tutorials 201

    Recent advances in radio resource management for heterogeneous LTE/LTE-A networks

    As heterogeneous networks (HetNets) emerge as one of the most promising developments toward realizing the target specifications of Long Term Evolution (LTE) and LTE-Advanced (LTE-A) networks, radio resource management (RRM) research for such networks has, in recent times, been intensively pursued. However, recent research has mainly concentrated on interference mitigation, while other RRM aspects, such as radio resource utilization, fairness, complexity, and QoS, have received comparatively little attention. In this paper, we aim to provide an overview of the key challenges arising from HetNets and highlight their importance. Subsequently, we present a comprehensive survey of the RRM schemes that have been studied in recent years for LTE/LTE-A HetNets, with a particular focus on those for femtocells and relay nodes. Furthermore, we classify these RRM schemes according to their underlying approaches. In addition, these RRM schemes are qualitatively analyzed and compared to each other. We also identify a number of potential research directions for future RRM development. Finally, we discuss the gaps in current RRM research and the importance of multi-objective RRM studies.

    Failure Analysis in Next-Generation Critical Cellular Communication Infrastructures

    The advent of communication technologies marks a transformative phase in critical infrastructure construction, where the meticulous analysis of failures becomes paramount in achieving the fundamental objectives of continuity, security, and availability. This survey enriches the discourse on failures, failure analysis, and countermeasures in the context of next-generation critical communication infrastructures. Through an exhaustive examination of the existing literature, we discern and categorize prominent research orientations, namely resource depletion, security vulnerabilities, and system availability concerns. We also analyze constructive countermeasures tailored to address the identified failure scenarios and their prevention. Furthermore, the survey emphasizes the imperative for standardization in addressing failures related to Artificial Intelligence (AI) within the ambit of sixth-generation (6G) networks, accounting for the forward-looking perspective on the envisioned intelligence of the 6G network architecture. By identifying new challenges and delineating future research directions, this survey can help guide stakeholders toward unexplored territories, fostering innovation and resilience in critical communication infrastructure development and failure prevention.