471 research outputs found

    Buffer-Aided Relaying with Adaptive Link Selection - Fixed and Mixed Rate Transmission

    We consider a simple network consisting of a source, a half-duplex DF relay with a buffer, and a destination. We assume that the direct source-destination link is not available and that all links undergo fading. We propose two new buffer-aided relaying schemes. In the first scheme, neither the source nor the relay has CSIT, and consequently both nodes are forced to transmit with fixed rates. In contrast, in the second scheme, the source does not have CSIT and transmits with a fixed rate, but the relay has CSIT and adapts its transmission rate accordingly. In the absence of delay constraints, for both fixed rate and mixed rate transmission, we derive the throughput-optimal buffer-aided relaying protocols, which select either the source or the relay for transmission based on the instantaneous SNRs of the source-relay and relay-destination links. In addition, for the delay-constrained case, we develop buffer-aided relaying protocols that achieve a predefined average delay. Compared to conventional relaying protocols, which select the transmitting node according to a predefined schedule independent of the instantaneous link SNRs, the proposed buffer-aided protocols with adaptive link selection achieve large performance gains. In particular, for fixed rate transmission, we show that the proposed protocol achieves a diversity gain of two as long as an average delay of more than three time slots can be afforded. Furthermore, for mixed rate transmission with an average delay of E{T} time slots, a multiplexing gain of r = 1 - 1/(2E{T}) is achieved. Hence, for mixed rate transmission and sufficiently large average delays, buffer-aided half-duplex relaying with and without adaptive link selection does not suffer from a multiplexing gain loss compared to full-duplex relaying. Comment: published in IEEE Transactions on Information Theory.
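    A minimal sketch of the link-selection idea (not the paper's throughput-optimal protocol): in each slot, a greedy rule serves the relay-destination link whenever the buffer holds data and that link can support the fixed rate, and otherwise lets the source fill the buffer. Rayleigh fading, the fixed rate, and the greedy decision rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(num_slots=100_000, mean_snr_sr=10.0, mean_snr_rd=10.0, rate=1.0):
    """Toy buffer-aided relaying with a simplified adaptive link-selection rule.

    Assumptions (not from the paper): Rayleigh fading on both links, a fixed
    transmission rate `rate` bits/slot, and a greedy rule that serves the
    relay-destination link whenever it is not in outage and the buffer is
    non-empty, otherwise the source-relay link.
    """
    buffer_bits = 0.0
    delivered = 0.0
    thr_snr = 2.0 ** rate - 1.0                 # SNR needed to support the fixed rate
    for _ in range(num_slots):
        snr_sr = rng.exponential(mean_snr_sr)   # instantaneous S-R SNR
        snr_rd = rng.exponential(mean_snr_rd)   # instantaneous R-D SNR
        if buffer_bits >= rate and snr_rd >= thr_snr:
            buffer_bits -= rate                 # relay transmits to destination
            delivered += rate
        elif snr_sr >= thr_snr:
            buffer_bits += rate                 # source transmits to relay
    return delivered / num_slots

print("toy throughput (bits/slot):", simulate())
```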

    Throughput Maximization for Mobile Relaying Systems

    This paper studies a novel mobile relaying technique, where relays of high mobility are employed to assist the communications from source to destination. By exploiting the predictable channel variations introduced by relay mobility, we study the throughput maximization problem in a mobile relaying system via dynamic rate and power allocations at the source and relay. An optimization problem is formulated for a finite time horizon, subject to an information-causality constraint, which results from the data buffering employed at the relay. It is found that the optimal power allocations across the different time slots follow a "stair-case" water-filling (WF) structure, with non-increasing and non-decreasing water levels at the source and relay, respectively. For the special case where the relay moves unidirectionally from source to destination, the optimal power allocations reduce to the conventional WF with constant water levels. Numerical results show that, with appropriate trajectory design, mobile relaying is able to achieve a tremendous throughput gain over conventional static relaying. Comment: submitted for possible conference publication.
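    For intuition, the sketch below implements classical water-filling over per-slot channel gains via bisection on the water level. It is not the paper's solution, which additionally enforces the information-causality constraint and thereby produces the monotone "stair-case" water levels at source and relay.

```python
import numpy as np

def water_filling(gains, total_power, tol=1e-9):
    """Classical water-filling power allocation over parallel (time) slots.

    Solves max sum(log2(1 + g_t * p_t)) s.t. sum(p_t) <= total_power, p_t >= 0.
    This is the single-water-level solution; the mobile-relaying problem adds
    an information-causality constraint, which turns the single level into a
    stair-case of monotone water levels (not modeled here).
    """
    gains = np.asarray(gains, dtype=float)
    lo, hi = 0.0, total_power + 1.0 / gains.min()
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)                       # candidate water level
        p = np.maximum(0.0, mu - 1.0 / gains)      # per-slot power at this level
        if p.sum() > total_power:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.0, lo - 1.0 / gains)

p = water_filling([0.5, 1.0, 2.0, 4.0], total_power=4.0)
print("per-slot powers:", p, "sum:", p.sum())
```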

    Relay assisted device-to-device communication with channel uncertainty

    The gains of direct communication between user equipment in a network may not be fully realised because of the separation between the user equipment and the fading experienced by the channel between them. To fully realise the gains that direct (device-to-device) communication promises, idle user equipment can be exploited to serve as relays that enable device-to-device communication. The availability of several potential relay user equipment creates a problem: how to select the relay user equipment. Moreover, unlike infrastructure relays, user equipment are carried around by people, and these users are self-interested. The problem of relay selection therefore goes beyond choosing which device should assist in relayed communication; it must also cater for user self-interest. Another problem in wireless communication is the unavailability of perfect channel state information. This creates uncertainty in the channel, so selection algorithms need to be designed with channel-uncertainty awareness in mind. The work in this thesis therefore considers the design of relay user equipment selection algorithms that are not only device centric but also relay user equipment centric; furthermore, the designed algorithms are channel uncertainty aware. Firstly, a stable-matching-based relay user equipment selection algorithm is put forward for underlay device-to-device communication, with a channel uncertainty aware approach proposed to cater for imperfect channel state information at the devices; the algorithm is combined with a rate-based mode selection algorithm. Next, to account for the queue state at the relay user equipment, a cross-layer selection algorithm is proposed for a two-way decode-and-forward relay set-up; the proposed algorithm employs a deterministic uncertainty constraint on the interference channel and solves the selection problem heuristically. Then a cluster head selection algorithm is proposed for device-to-device group communication constrained by channel uncertainty in the interference channel; the formulated rate maximization problem is solved for deterministic and probabilistic constraint scenarios, and the problem is extended to a multiple-input single-output scenario for which robust beamforming is designed. Finally, relay utility and social distance based selection algorithms are proposed for a full-duplex decode-and-forward device-to-device communication set-up, with a worst-case approach proposed for the full channel uncertainty scenario. Results from computer simulations indicate that the proposed algorithms offer spectral efficiency, fairness and energy efficiency gains. The results also clearly show the deterioration in network performance when perfect channel state information is assumed.
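    As a rough illustration of the stable-matching ingredient, the following is a generic Gale-Shapley deferred-acceptance sketch with made-up preference lists; in the thesis, preferences would be derived from channel-uncertainty-aware rate estimates and combined with mode selection.

```python
def deferred_acceptance(device_prefs, relay_prefs):
    """Gale-Shapley deferred acceptance: devices propose, relays hold the best offer.

    `device_prefs[d]` is device d's ranked list of relays; `relay_prefs[r]` is
    relay r's ranked list of devices (complete lists assumed). Returns a stable
    one-to-one matching. The example inputs below are illustrative only.
    """
    rank = {r: {d: i for i, d in enumerate(prefs)} for r, prefs in relay_prefs.items()}
    free = list(device_prefs)                  # devices that still need a relay
    next_proposal = {d: 0 for d in device_prefs}
    match = {}                                 # relay -> device currently held
    while free:
        d = free.pop()
        if next_proposal[d] >= len(device_prefs[d]):
            continue                           # d has exhausted its list, stays unmatched
        r = device_prefs[d][next_proposal[d]]
        next_proposal[d] += 1
        if r not in match:
            match[r] = d                       # relay accepts first offer
        elif rank[r][d] < rank[r][match[r]]:
            free.append(match[r])              # relay prefers d; previous device freed
            match[r] = d
        else:
            free.append(d)                     # relay rejects d; d proposes again later
    return {d: r for r, d in match.items()}

device_prefs = {"D1": ["R1", "R2"], "D2": ["R1", "R2"]}
relay_prefs = {"R1": ["D2", "D1"], "R2": ["D1", "D2"]}
print(deferred_acceptance(device_prefs, relay_prefs))   # {'D2': 'R1', 'D1': 'R2'}
```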

    On the Benefits of Network-Level Cooperation in Millimeter-Wave Communications

    Relaying techniques for millimeter-wave wireless networks represent a powerful solution for improving the transmission performance. In this work, we quantify the benefits in terms of delay and throughput for a random-access multi-user millimeter-wave wireless network assisted by a full-duplex network cooperative relay. The relay is equipped with a queue, for which we analyze the performance characteristics (e.g., arrival rate, service rate, average size, and stability condition). Moreover, we study two possible transmission schemes: fully directional and broadcast. In the former, the source nodes transmit a packet either to the relay or to the destination by using narrow beams, whereas in the latter the nodes transmit to both the destination and the relay in the same timeslot by using a wider beam, but with lower beamforming gain. In our analysis, we also take into account the beam alignment phase that occurs every time a transmitter node changes the destination node. We show that the beam alignment duration, as well as the position and number of transmitting nodes, significantly affects the network performance. Moreover, we illustrate the optimal transmission scheme (i.e., broadcast or fully directional) for several system parameters and show that fully directional transmission is not always beneficial; in some scenarios, broadcasting and relaying can improve the performance in terms of throughput and delay. Comment: arXiv admin note: text overlap with arXiv:1804.0945
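    The relay-queue analysis can be pictured with a toy discrete-time queue: Bernoulli packet arrivals and Bernoulli service, stability when the arrival rate is below the service rate, and average delay recovered via Little's law. The arrival and service probabilities below are arbitrary illustrative values, not quantities from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def relay_queue(arrival_prob, service_prob, slots=200_000):
    """Toy discrete-time relay queue with Bernoulli arrivals and services.

    A crude stand-in for the paper's queueing analysis: packets arrive at the
    relay with probability `arrival_prob` per slot and one packet is forwarded
    with probability `service_prob` when the queue is non-empty. Stability
    requires arrival_prob < service_prob.
    """
    q = 0
    q_sum = 0
    arrivals = 0
    for _ in range(slots):
        if q > 0 and rng.random() < service_prob:
            q -= 1                              # relay forwards one packet
        if rng.random() < arrival_prob:
            q += 1                              # new packet enters the relay queue
            arrivals += 1
        q_sum += q
    avg_q = q_sum / slots
    avg_delay = avg_q / (arrivals / slots)      # Little's law: N = lambda * T
    return avg_q, avg_delay

print("avg queue size, avg delay:", relay_queue(arrival_prob=0.3, service_prob=0.5))
```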

    Relaying in the Internet of Things (IoT): A Survey

    The deployment of relays between Internet of Things (IoT) end devices and gateways can improve link quality. In cellular-based IoT, relays have the potential to reduce base station overload. The energy expended in single-hop long-range communication can be reduced if relays listen to transmissions of end devices and forward these observations to gateways. However, incorporating relays into IoT networks faces some challenges. IoT end devices are designed primarily for uplink communication of small-sized observations toward the network; hence, opportunistically using end devices as relays requires a redesign of the medium access control (MAC) layer protocol of such end devices and possibly the addition of new communication interfaces. Additionally, the wake-up time of IoT end devices needs to be synchronized with that of the relays. For cellular-based IoT, the possibility of using infrastructure relays exists, while noncellular IoT networks can leverage the presence of mobile devices for relaying, for example in remote healthcare. However, the latter raises the problems of incentivizing relay participation and managing the mobility of relays. Furthermore, although relays can increase the lifetime of IoT networks, deploying relays implies the need for additional batteries to power them, which can erode the energy efficiency gain that relays offer. Therefore, designing relay-assisted IoT networks that provide acceptable trade-offs is key, and this goes beyond adding an extra transmit RF chain to a relay-enabled IoT end device. There has been increasing research interest in IoT relaying, as demonstrated in the available literature. Works that consider these issues are surveyed in this paper to provide insight into the state of the art, offer design guidance for network designers, and motivate future research directions.

    Optimization of hybrid cache placement for collaborative relaying

    Traditional wireless multi-hop relaying systems suffer from inefficient use of bandwidth resources. This letter studies the use of content caching at distributed relays to tackle this problem and improve the performance of collaborative relaying. We propose a hybrid caching scheme that is jointly optimized with the transmission schemes, to achieve a fine balance between the signal cooperation gain and the caching diversity gain. The optimization problem of cache placement to minimize the outage probability is studied and is shown to be convex. Numerical results demonstrate significant outage performance gains over traditional relaying without caching
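    A purely illustrative sketch of the cache-split trade-off: a single parameter alpha divides each relay's cache between shared popular content (signal cooperation) and distinct content (caching diversity), and a convex toy objective is minimized over alpha. The outage proxy below is hypothetical and is not the letter's outage expression.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def toy_outage(alpha, snr=10.0, library=20):
    """Hypothetical, illustrative outage proxy (NOT the letter's model).

    `alpha` is the fraction of each relay's cache devoted to the most popular
    (shared) content; the remainder stores distinct files for caching diversity.
    The two terms below are placeholders that merely make the trade-off convex.
    """
    coop_term = np.exp(-snr * (1.0 + alpha))     # stronger signal cooperation as alpha grows
    diversity_term = (alpha ** 2) / library      # fewer distinct cached files as alpha grows
    return coop_term + diversity_term

res = minimize_scalar(toy_outage, bounds=(0.0, 1.0), method="bounded")
print("toy optimal cache split alpha:", res.x)
```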

    Cooperative Diversity with Mobile Nodes: Capacity Outage Rate and Duration

    The outage probability is an important performance measure for cooperative diversity schemes. However, in mobile environments, the outage probability does not completely describe the behavior of cooperative diversity schemes, since the mobility of the involved nodes introduces variations in the channel gains. As a result, the capacity outage events are correlated in time, and second-order statistical parameters of the achievable information-theoretic capacity, such as the average capacity outage rate (AOR) and the average capacity outage duration (AOD), are required to obtain a more complete description of the properties of cooperative diversity protocols. In this paper, assuming slow Rayleigh fading, we derive exact expressions for the AOR and the AOD of three well-known cooperative diversity protocols: variable-gain amplify-and-forward, decode-and-forward, and selection decode-and-forward relaying. Furthermore, we develop asymptotically tight high signal-to-noise ratio (SNR) approximations, which offer important insights into the influence of various system and channel parameters on the AOR and AOD. In particular, we show that, on a double-logarithmic scale, the AOR asymptotically decays with the SNR with a slope that depends on the diversity gain of the cooperative protocol (similar to the outage probability), whereas the AOD asymptotically decays with a slope of -1/2, independent of the diversity gain. Comment: IEEE Transactions on Information Theory (2011).
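    To make the "slope on a double-logarithmic scale" statement concrete, the snippet below evaluates the exact outage probability of two-branch selection diversity in Rayleigh fading and estimates its high-SNR slope, which approaches -2 (the diversity gain). The paper's AOR and AOD additionally depend on second-order (Doppler) statistics that this toy calculation does not model.

```python
import numpy as np

def outage_prob_selection(snr_db, threshold=1.0):
    """Exact outage probability of 2-branch selection diversity in Rayleigh fading.

    P_out = (1 - exp(-threshold/snr))^2, which decays like snr^-2 at high SNR,
    i.e. a slope of -2 (the diversity gain) on a double-logarithmic scale.
    """
    snr = 10.0 ** (np.asarray(snr_db) / 10.0)
    return (1.0 - np.exp(-threshold / snr)) ** 2

snr_db = np.array([20.0, 25.0, 30.0, 35.0, 40.0])
p_out = outage_prob_selection(snr_db)
slope = np.polyfit(snr_db / 10.0, np.log10(p_out), 1)[0]   # d log10(P_out) / d log10(SNR)
print("log-log slope ~", slope)                            # close to -2 = diversity gain
```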