
    Relaying in the Internet of Things (IoT): A Survey

    The deployment of relays between Internet of Things (IoT) end devices and gateways can improve link quality. In cellular-based IoT, relays have the potential to reduce base station overload. The energy expended in single-hop long-range communication can be reduced if relays listen to transmissions of end devices and forward these observations to gateways. However, incorporating relays into IoT networks faces some challenges. IoT end devices are designed primarily for uplink communication of small observations toward the network; hence, opportunistically using end devices as relays requires a redesign of the medium access control (MAC) layer protocol of such end devices and possibly the addition of new communication interfaces. Additionally, the wake-up times of IoT end devices need to be synchronized with those of the relays. For cellular-based IoT, infrastructure relays can be used, while noncellular IoT networks can leverage the presence of mobile devices for relaying, for example, in remote healthcare. The latter, however, raises the problems of incentivizing relay participation and managing relay mobility. Furthermore, although relays can increase the lifetime of IoT networks, deploying relays implies the need for additional batteries to power them, which can erode the energy efficiency gain that relays offer. Therefore, designing relay-assisted IoT networks that provide acceptable trade-offs is key, and this goes beyond adding an extra transmit RF chain to a relay-enabled IoT end device. There has been increasing research interest in IoT relaying, as the available literature demonstrates. This paper surveys works that consider these issues to capture the state of the art, offer design insights for network designers, and motivate future research directions.
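
    A minimal back-of-the-envelope sketch of the energy trade-off described above: under a simple distance^alpha transmit-energy model, placing a relay between an end device and the gateway lowers the energy the device spends per observation but shifts a receive-and-forward cost onto the relay's own battery. The path-loss exponent, energy coefficients, distances and packet size below are illustrative assumptions, not values from the survey.

# Toy energy model: transmit energy grows with distance^alpha, and the relay
# pays an extra receive/forward cost. All constants are assumed for illustration.
PATH_LOSS_EXP = 3.0        # assumed path-loss exponent
TX_ENERGY_COEFF = 1e-13    # assumed transmit energy per bit per metre^alpha (J)
RX_ENERGY_PER_BIT = 5e-8   # assumed receive/forward electronics cost per bit (J)

def single_hop_energy(bits, distance_m):
    """Energy for an end device to reach the gateway directly."""
    return TX_ENERGY_COEFF * bits * distance_m ** PATH_LOSS_EXP

def relayed_energy(bits, d_device_relay_m, d_relay_gateway_m):
    """Return (end-device energy, relay energy) for a device->relay->gateway path."""
    device_tx = TX_ENERGY_COEFF * bits * d_device_relay_m ** PATH_LOSS_EXP
    relay_rx = RX_ENERGY_PER_BIT * bits
    relay_tx = TX_ENERGY_COEFF * bits * d_relay_gateway_m ** PATH_LOSS_EXP
    return device_tx, relay_rx + relay_tx

if __name__ == "__main__":
    bits = 8 * 50                                      # a 50-byte IoT observation
    direct = single_hop_energy(bits, 1000.0)           # 1 km direct uplink
    dev, relay = relayed_energy(bits, 500.0, 500.0)    # relay at the midpoint
    print(f"direct uplink, end device: {direct:.2e} J")
    print(f"relayed, end device:       {dev:.2e} J")
    print(f"relayed, relay cost:       {relay:.2e} J (paid by the relay battery)")

    With a midpoint relay and alpha = 3, each hop costs roughly one eighth of the direct link in this assumed model, so the end device's energy drops sharply while the relay absorbs a comparable per-hop cost, which is exactly the trade-off the abstract highlights.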

    Two-path successive relaying schemes in the presence of inter-relay interference

    Relaying is a promising technique for improving wireless network performance. A conventional relay transmits and receives signals in two orthogonal channels because of the half-duplex constraint of wireless networks, which results in inefficient use of spectral resources. Two-Path Successive Relaying (TPSR) has been proposed to recover this loss in spectral efficiency. However, the performance of TPSR is degraded by Inter-Relay Interference (IRI). This thesis investigates the performance of TPSR affected by IRI and proposes several schemes to improve relaying reliability, throughput and secrecy. Simulations revealed that the existing TPSR can perform worse than the conventional Half Duplex Relaying (HDR) scheme. Opportunistic TPSR schemes are proposed to improve the capacity performance. Several relay pair selection criteria are developed to ensure the selection of the best-performing relay pair. Adaptive schemes that dynamically switch between TPSR and conventional HDR are proposed to further improve performance. Simulation and analytical results show that the proposed schemes can achieve up to 45% ergodic capacity improvement and lower outage probability compared to baseline schemes, while achieving the maximum diversity-multiplexing tradeoff of the multi-input single-output channel. In addition, this thesis proposes secrecy TPSR schemes to protect wireless transmissions from eavesdroppers. The use of two relays in the proposed schemes delivers more robust secrecy transmission, while the use of scheduled jamming signals improves the secrecy rate. Simulation and analytical results reveal that the proposed schemes can achieve up to 62% ergodic secrecy capacity improvement and quadratically lower intercept and secrecy outage probabilities compared to existing schemes. Overall, this thesis demonstrates that the proposed TPSR schemes deliver performance improvements in throughput, reliability and secrecy in the presence of IRI.
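
    The comparison at the heart of the thesis, namely TPSR removing the half-duplex pre-log penalty while suffering from IRI at the listening relay, can be sketched with a small Monte Carlo experiment. Rayleigh block fading, a decode-and-forward min-of-the-two-hops rate model, and the chosen SNR and interference levels are assumptions made here for illustration; the thesis's exact system model, relay pair selection criteria and secrecy schemes are not reproduced.

# Monte Carlo sketch: ergodic capacity of conventional half-duplex relaying (HDR)
# vs. two-path successive relaying (TPSR) under inter-relay interference (IRI).
# Fading model, SNR and IRI levels are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N = 200_000          # number of fading realisations
snr = 10.0           # assumed average per-hop SNR (linear)
iri = 10.0           # assumed inter-relay interference-to-noise ratio (linear)

# |h|^2 for unit-variance Rayleigh fading is exponentially distributed with mean 1.
g_sr = rng.exponential(1.0, N)   # source -> relay channel gain
g_rd = rng.exponential(1.0, N)   # relay -> destination channel gain
g_rr = rng.exponential(1.0, N)   # relay -> relay (interference) channel gain

# Conventional HDR: two orthogonal slots per message, hence the 1/2 pre-log factor.
rate_hdr = 0.5 * np.log2(1.0 + snr * np.minimum(g_sr, g_rd))

# TPSR: the source transmits in every slot (no 1/2 factor), but the listening
# relay is hit by the other relay's simultaneous transmission.
sinr_hop1 = snr * g_sr / (1.0 + iri * g_rr)
rate_tpsr = np.log2(1.0 + np.minimum(sinr_hop1, snr * g_rd))

print(f"HDR  ergodic capacity ~ {rate_hdr.mean():.3f} bit/s/Hz")
print(f"TPSR ergodic capacity ~ {rate_tpsr.mean():.3f} bit/s/Hz")

    Increasing the iri parameter in this toy model reproduces the qualitative effect noted above: the spectral-efficiency advantage of TPSR erodes and can fall below the HDR baseline when inter-relay interference is strong.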

    Review on Radio Resource Allocation Optimization in LTE/LTE-Advanced using Game Theory

    Recently, there has been a growing trend toward applying game theory (GT) to various engineering fields in order to solve optimization problems with different competing entities/contributors/players. Research in the fourth generation (4G) wireless network field has also exploited this theory to overcome long term evolution (LTE) challenges such as resource allocation, which is one of the most important research topics. In fact, an efficient design of resource allocation schemes is the key to higher performance. However, the standard does not specify the optimization approach used to perform radio resource management, so it was left open for study. This paper presents a survey of existing game-theory-based solutions to the 4G-LTE radio resource allocation problem and its optimization.
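
    As a concrete illustration of the kind of formulation such surveys cover, the sketch below models uplink power allocation as a non-cooperative pricing game solved by best-response iteration. The channel-gain matrix, noise level, price parameter and logarithmic utility are assumptions chosen for illustration and are not taken from any specific paper in the survey.

# Toy non-cooperative power-control game solved by best-response dynamics.
# Each user i maximises  ln(1 + G[i,i]*p_i / (noise + interference_i)) - c*p_i,
# whose first-order condition gives the closed-form best response used below.
import numpy as np

G = np.array([[1.0, 0.2, 0.1],
              [0.3, 0.9, 0.2],
              [0.1, 0.2, 0.8]])   # G[i, j]: assumed gain from user j's transmitter to user i's receiver
noise = 0.1                       # assumed receiver noise power
c = 0.5                           # assumed linear price on transmit power
p_max = 5.0                       # assumed per-user power budget

p = np.zeros(3)
for _ in range(100):              # iterate best responses until (approximate) convergence
    for i in range(3):
        noise_plus_intf = noise + sum(G[i, j] * p[j] for j in range(3) if j != i)
        # Best response: p_i = 1/c - (noise + interference)/G[i,i], clipped to [0, p_max].
        p[i] = np.clip(1.0 / c - noise_plus_intf / G[i, i], 0.0, p_max)

sinr = np.array([G[i, i] * p[i] / (noise + sum(G[i, j] * p[j] for j in range(3) if j != i))
                 for i in range(3)])
print("equilibrium powers:", np.round(p, 3))
print("resulting SINRs:   ", np.round(sinr, 3))

    Repeated best responses in such pricing games typically settle at a Nash equilibrium, which is the kind of operating point the surveyed game-theoretic resource allocation schemes aim to characterize or improve upon.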