
    From 5G to 6G: Has the Time for Modern Random Access Come?

    This short paper proposes the use of modern random access for IoT applications in 6G. A short overview of recent advances in uncoordinated medium access is provided, highlighting the gains that can be achieved by leveraging smart protocol design intertwined with advanced signal processing techniques at the receiver. The authors' vision of the benefits such schemes can yield for beyond-5G systems is presented, with the aim to trigger further discussion. (Comment: 2 pages, 1 figure; presented at the 6G Summit, Levi, Finland, 2019.)

    Multiple Access for Massive Machine Type Communications

    The internet we have known thus far has been an internet of people, as it has connected people with one another. However, these connections are forecast to make up only a minuscule fraction of future communications. The internet of tomorrow is, indeed, the internet of things. The Internet of Things (IoT) promises to improve all aspects of life by connecting everything to everything. An enormous amount of effort is being exerted to turn this vision into a reality. Sensors and actuators will communicate and operate in an automated fashion with no or minimal human intervention. In the current literature, these sensors and actuators are referred to as machines, and the communication amongst them is referred to as Machine-to-Machine (M2M) communication or Machine-Type Communication (MTC). As IoT requires a seamless mode of communication that is available anywhere and anytime, wireless communications will be one of the key enabling technologies for IoT.

    In existing wireless cellular networks, users with data to transmit first need to request channel access. All access requests are processed by a central unit that, in return, either grants or denies the request. Once granted access, users' data transmissions are non-overlapping and interference free. However, as the number of IoT devices is forecast to be in the order of hundreds of millions, if not billions, in the near future, the access channels of existing cellular networks are predicted to suffer from severe congestion and, thus, to incur unpredictable latencies. In random access, on the other hand, users with data to transmit access the channel in an uncoordinated and probabilistic fashion, requiring little or no signalling overhead. This reduction in overhead comes at the expense of reliability and efficiency due to the interference caused by contending users. In most existing random access schemes, packets are lost when they experience interference from other packets transmitted over the same resources. Moreover, most existing random access schemes are best-effort schemes with almost no Quality of Service (QoS) guarantees.

    In this thesis, we investigate the performance of different random access schemes in different settings to resolve the problem of massive access by IoT devices with diverse QoS guarantees. First, we take a step towards re-designing existing random access protocols so that they are more practical and more efficient. For many years, researchers have adopted the collision channel model in random access schemes: a collision is the event of two or more users transmitting over the same time-frequency resources. In the event of a collision, all the involved data is lost, and users need to retransmit their information. In practice, however, data can be recovered even in the presence of interference, provided that the power of the signal is sufficiently larger than the power of the noise plus interference. Based on this, we re-define a collision as the event of the interference power exceeding a pre-determined threshold. We propose a new analytical framework, inspired by error control codes on graphs, to compute the probability of packet recovery failure, and we optimize the random access parameters using evolution strategies. Our results show a significant improvement in performance in terms of reliability and efficiency.
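    To make the redefined collision event concrete, the following sketch simulates a slotted random access channel in which a packet survives whenever its signal-to-interference-plus-noise ratio clears a decoding threshold. It is a minimal illustration only: the transmission probability, fading model, threshold and user count are placeholder assumptions, not parameters from the thesis, and the thesis's graph-based analytical framework is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_USERS = 200        # contending IoT devices (assumed)
TX_PROB = 0.05         # per-slot transmission probability (assumed)
SINR_THRESHOLD = 2.0   # linear decoding threshold (assumed)
NOISE_POWER = 1.0      # normalized noise power
NUM_SLOTS = 10_000

successes = 0
transmissions = 0

for _ in range(NUM_SLOTS):
    active = rng.random(NUM_USERS) < TX_PROB
    # Exponential received powers model Rayleigh fading with unit mean (assumption).
    powers = rng.exponential(1.0, NUM_USERS) * active
    total = powers.sum()
    transmissions += active.sum()
    for p in powers[active]:
        interference = total - p
        sinr = p / (NOISE_POWER + interference)
        # A packet is "collided" only if interference pushes its SINR below the
        # threshold, unlike the classical model where any overlap destroys all packets.
        if sinr >= SINR_THRESHOLD:
            successes += 1

print(f"throughput: {successes / NUM_SLOTS:.3f} packets/slot")
print(f"packet success rate: {successes / max(transmissions, 1):.3f}")
```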
    Next, we focus on supporting heterogeneous IoT applications and accommodating their diverse latency and reliability requirements in a unified access scheme. We propose a multi-stage approach in which each group of applications transmits in different stages with different probabilities. We propose a new analytical framework to compute the probability of packet recovery failure for each group in each stage, and we again optimize the random access parameters using evolution strategies. Our results show that the proposed scheme can outperform the coordinated access schemes of existing cellular networks when the number of users is very large.

    Finally, we investigate random non-orthogonal multiple access schemes, which are known to achieve higher spectral efficiency and to support higher loads. In our proposed scheme, user detection and channel estimation are carried out via pilot sequences transmitted simultaneously with the users' data. Here, a collision is defined as the event of two or more users selecting the same pilot sequence; collided transmissions are treated as interference to the remaining users. We first derive the distribution of the interference power, and then use it to obtain simple yet accurate analytical bounds on the throughput and outage probability of the proposed scheme, considering both joint decoding and successive interference cancellation. We show that the proposed scheme is especially useful for short packet transmissions.
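    The pilot-collision event in the random non-orthogonal access scheme is purely combinatorial, so a quick sanity check of its probability is easy to write down. The sketch below is an illustrative calculation under the assumption that K active users draw pilots uniformly at random from a pool of N sequences; it is not a bound or result from the thesis.

```python
import numpy as np

def pilot_collision_prob(num_users: int, num_pilots: int) -> float:
    """Probability that a given active user shares its pilot with at least
    one of the other (num_users - 1) users, under uniform random selection."""
    return 1.0 - (1.0 - 1.0 / num_pilots) ** (num_users - 1)

def monte_carlo_check(num_users: int, num_pilots: int, trials: int = 100_000) -> float:
    """Empirical estimate of the same probability for user 0."""
    rng = np.random.default_rng(1)
    picks = rng.integers(0, num_pilots, size=(trials, num_users))
    collided = (picks[:, 1:] == picks[:, [0]]).any(axis=1)
    return float(collided.mean())

if __name__ == "__main__":
    K, N = 20, 64  # placeholder load and pilot-pool size (assumed)
    print(f"analytical : {pilot_collision_prob(K, N):.4f}")
    print(f"monte carlo: {monte_carlo_check(K, N):.4f}")
```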

    Massive MIMO for Internet of Things (IoT) Connectivity

    Massive MIMO is considered to be one of the key technologies in the emerging 5G systems, but it is also a concept applicable to other wireless systems. Exploiting the large number of degrees of freedom (DoFs) of massive MIMO is essential for achieving high spectral efficiency, high data rates and extreme spatial multiplexing of densely distributed users. On the one hand, the benefits of applying massive MIMO to broadband communication are well known, and there is a large body of research on designing communication schemes to support high rates. On the other hand, using massive MIMO for the Internet of Things (IoT) is still a developing topic, as IoT connectivity has requirements and constraints that differ significantly from those of broadband connections. In this paper, we investigate the applicability of massive MIMO to IoT connectivity. Specifically, we treat the two generic types of IoT connections envisioned in 5G: massive machine-type communication (mMTC) and ultra-reliable low-latency communication (URLLC). The paper fills this gap by identifying the opportunities and challenges in exploiting massive MIMO for IoT connectivity. We provide insights into the trade-offs that emerge when massive MIMO is applied to mMTC or URLLC and present a number of suitable communication schemes. The discussion then turns to network slicing of the wireless resources and the use of massive MIMO to simultaneously support IoT connections with very heterogeneous requirements. The main conclusion is that massive MIMO can bring benefits to scenarios with IoT connectivity, but it requires tight integration of physical-layer techniques with the protocol design. (Comment: Submitted for publication.)
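    As a rough illustration of why the large number of DoFs matters for IoT links, the toy simulation below estimates the uplink SINR of a single user under maximum-ratio combining with i.i.d. Rayleigh channels: the array gain grows roughly linearly with the number of antennas M, while multi-user interference does not. The channel model, power normalization and antenna counts are assumptions chosen for illustration, not schemes analyzed in the paper.

```python
import numpy as np

def average_mrc_sinr(num_antennas: int, num_users: int, snr: float,
                     trials: int = 500, seed: int = 0) -> float:
    """Average post-combining SINR of user 0 with maximum-ratio combining,
    i.i.d. Rayleigh channels, and equal transmit power (illustrative model)."""
    rng = np.random.default_rng(seed)
    sinrs = []
    for _ in range(trials):
        # Channel matrix: num_antennas x num_users, CN(0, 1) entries.
        H = (rng.standard_normal((num_antennas, num_users)) +
             1j * rng.standard_normal((num_antennas, num_users))) / np.sqrt(2)
        h0 = H[:, 0]
        w = h0 / np.linalg.norm(h0)          # MRC combining vector for user 0
        signal = snr * np.abs(w.conj() @ h0) ** 2
        interference = snr * sum(np.abs(w.conj() @ H[:, k]) ** 2
                                 for k in range(1, num_users))
        # Noise power after combining is 1 because the combiner has unit norm.
        sinrs.append(signal / (interference + 1.0))
    return float(np.mean(sinrs))

if __name__ == "__main__":
    for M in (8, 64, 256):  # placeholder antenna counts (assumed)
        sinr = average_mrc_sinr(M, num_users=10, snr=1.0)
        print(f"M = {M:4d}: average SINR ~ {10 * np.log10(sinr):.1f} dB")
```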

    EC-CENTRIC: An Energy- and Context-Centric Perspective on IoT Systems and Protocol Design

    The radio transceiver of an IoT device is often where most of its energy is consumed. For this reason, most research so far has focused on low-power circuits and energy-efficient physical-layer designs, with the goal of reducing the average energy per information bit required for communication. While these efforts are valuable per se, their actual effectiveness can be partially neutralized by ill-designed network, processing and resource management solutions, which can become a primary factor of performance degradation in terms of throughput, responsiveness and energy efficiency. The objective of this paper is to describe an energy-centric and context-aware optimization framework that accounts for the energy impact of the fundamental functionalities of an IoT system and that proceeds along three main technical thrusts: 1) balancing signal-dependent processing techniques (compression and feature extraction) against communication tasks; 2) jointly designing channel access and routing protocols to maximize the network lifetime; 3) providing self-adaptability to different operating conditions through the adoption of suitable learning architectures and of flexible/reconfigurable algorithms and protocols. After discussing this framework, we present some preliminary results that validate the effectiveness of the proposed line of action and show how the use of adaptive signal processing and channel access techniques allows an IoT network to dynamically trade lifetime for signal distortion, according to the requirements dictated by the application.
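    The first thrust, balancing on-node processing against communication, can be caricatured with a simple per-round energy model: spending more energy on compression leaves fewer bits for the radio to send. The sketch below is purely illustrative; the energy coefficients and cost shapes are assumptions, not values or models from the paper's framework.

```python
import numpy as np

# Illustrative per-node energy model (all constants are placeholder assumptions).
E_TX_PER_BIT = 50e-9       # J/bit spent by the radio
E_PROC_PER_BIT = 5e-9      # J/bit of processing cost per input bit
RAW_BITS = 8 * 1024        # bits produced per sensing round

def total_energy(compression_ratio: float) -> float:
    """Total energy for one round: processing cost grows with how aggressively
    we compress, radio cost shrinks with the number of bits left to send."""
    proc = E_PROC_PER_BIT * RAW_BITS * compression_ratio ** 2   # assumed convex cost
    tx = E_TX_PER_BIT * RAW_BITS / compression_ratio            # fewer bits on air
    return proc + tx

if __name__ == "__main__":
    ratios = np.linspace(1.0, 10.0, 200)
    energies = [total_energy(r) for r in ratios]
    best = ratios[int(np.argmin(energies))]
    print(f"energy-minimizing compression ratio ~ {best:.2f}")
    print(f"energy at that ratio ~ {min(energies) * 1e6:.1f} uJ per round")
```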

    Coded Pilot Random Access for Massive MIMO Systems
