
    Massive MIMO for Internet of Things (IoT) Connectivity

    Massive MIMO is considered to be one of the key technologies in the emerging 5G systems, but it is also a concept applicable to other wireless systems. Exploiting the large number of degrees of freedom (DoFs) of massive MIMO is essential for achieving high spectral efficiency, high data rates and extreme spatial multiplexing of densely distributed users. On the one hand, the benefits of applying massive MIMO for broadband communication are well known, and there has been a large body of research on designing communication schemes to support high rates. On the other hand, using massive MIMO for the Internet of Things (IoT) is still a developing topic, as IoT connectivity has requirements and constraints that are significantly different from those of broadband connections. In this paper we investigate the applicability of massive MIMO to IoT connectivity. Specifically, we treat the two generic types of IoT connections envisioned in 5G: massive machine-type communication (mMTC) and ultra-reliable low-latency communication (URLLC). The paper fills this important gap by identifying the opportunities and challenges in exploiting massive MIMO for IoT connectivity. We provide insights into the trade-offs that emerge when massive MIMO is applied to mMTC or URLLC and present a number of suitable communication schemes. The discussion continues to the questions of network slicing of the wireless resources and the use of massive MIMO to simultaneously support IoT connections with very heterogeneous requirements. The main conclusion is that massive MIMO can bring benefits to scenarios with IoT connectivity, but it requires tight integration of the physical-layer techniques with the protocol design. Comment: Submitted for publication
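    The array gain underlying the abstract's claim about DoFs and spectral efficiency can be illustrated with a minimal simulation (a toy sketch, not from the paper: a single uplink user, i.i.d. Rayleigh fading, maximum-ratio combining, with all parameter values assumed for illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def mrc_rate(num_antennas, snr_db=0.0, trials=2000):
        """Average uplink rate (bit/s/Hz) of a single user served by a base
        station with `num_antennas` antennas using maximum-ratio combining
        over i.i.d. Rayleigh fading."""
        snr = 10 ** (snr_db / 10)
        # h ~ CN(0, I_M); the MRC output SNR is snr * ||h||^2
        h = (rng.standard_normal((trials, num_antennas)) +
             1j * rng.standard_normal((trials, num_antennas))) / np.sqrt(2)
        gains = np.sum(np.abs(h) ** 2, axis=1)
        return float(np.mean(np.log2(1 + snr * gains)))

    for m in (1, 16, 64, 256):
        print(m, "antennas:", round(mrc_rate(m), 2), "bit/s/Hz")
    ```

    Since E[||h||^2] = M, the post-combining SNR grows linearly with the antenna count, so the per-user rate grows roughly as log2(M) at fixed transmit power; serving many users on the same resources is what multiplies this into the large sum rates the abstract refers to.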

    Power allocation and user selection in multi-cell multi-user massive MIMO systems

    Submitted in fulfilment of the academic requirements for the degree of Master of Science (MSc) in Engineering, in the School of Electrical and Information Engineering (EIE), Faculty of Engineering and the Built Environment, at the University of the Witwatersrand, Johannesburg, South Africa, 2017. The benefits of massive Multiple-Input Multiple-Output (MIMO) systems have made them a solution for future wireless networking demands. Increasing the number of base station antennas in massive MIMO systems increases capacity: the throughput grows linearly with the number of antennas. To reap all the benefits of massive MIMO, resources should be allocated optimally amongst users. Many factors have to be taken into consideration in resource allocation in multi-cell massive MIMO systems (e.g. intra-cell and inter-cell interference, large-scale fading, etc.). This dissertation investigates user selection and power allocation algorithms in multi-cell massive MIMO systems. The focus is on designing algorithms that maximize a particular cell of interest's sum-rate capacity while taking into consideration the interference from other cells. To maximize the sum-rate capacity there is a need to optimally allocate power and to select the optimal number of users to be scheduled. Global interference coordination has very high complexity and is infeasible in large networks. This dissertation extends previous work and proposes suboptimal per-cell resource allocation models that are feasible in practice. Interference is introduced when non-orthogonal pilots are used for channel estimation, resulting in pilot contamination. Resource allocation values from interfering cells are unknown in per-cell resource allocation models, hence the inter-cell interference has to be modelled. To tackle the problem, sum-rate expressions are derived to enable power allocation and user selection algorithm analysis.
The dissertation proposes three different approaches for solving resource allocation problems in multi-cell multi-user massive MIMO systems for a particular cell of interest. The first approach proposes a branch-and-bound (BnB) algorithm, which models the inter-cell interference in terms of the intra-cell interference by assuming that the statistical properties of the intra-cell interference in the cell of interest are the same as in the other interfering cells. The inter-cell interference is therefore expressed as the intra-cell interference multiplied by a correction factor. The correction factor takes into consideration the pilot sequences used in the interfering cells in relation to those used in the cell of interest, and the large-scale fading between the users in the interfering cells and the users in the cell of interest. The resource allocation problem is modelled as a mixed-integer programming problem, which is NP-hard and not solvable in polynomial time by known methods. To solve it, the problem is converted into a convex optimization problem by relaxing the user selection constraint, and dual decomposition is then applied. In the second approach (the two-stage algorithm), a mathematical model is proposed for maximum user scheduling in each cell; the scheduled users are then optimally allocated power using the multilevel water-filling approach. Finally, a hybrid algorithm is proposed which combines the two approaches described above. In the hybrid algorithm, the cell of interest first allocates resources in the interfering cells using the two-stage algorithm to obtain near-optimal resource allocation values, and then uses these values to perform its own resource allocation using the BnB algorithm. The two-stage algorithm is chosen for resource allocation in the interfering cells because it has a much lower complexity than the BnB algorithm.
The BnB algorithm is chosen for resource allocation in the cell of interest because it achieves a higher sum rate in a sum-rate maximization problem than the two-stage algorithm. Performance analysis and evaluation of the developed algorithms are presented mainly through extensive simulations, and the designed algorithms are compared to existing solutions. In general, the presented results demonstrate that the proposed algorithms outperform the existing solutions.
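The second stage of the two-stage algorithm rests on classical water-filling, which can be sketched as follows (a minimal single-level version for a set of already-scheduled users, assuming `gains` are their effective per-unit-power SNRs; the dissertation's multilevel variant and the BnB machinery are not reproduced here):

```python
import numpy as np

def waterfill(gains, total_power):
    """Water-filling power allocation: maximize sum_i log2(1 + p_i * g_i)
    subject to sum_i p_i <= total_power and p_i >= 0.
    Returns the optimal power vector for effective channel gains `gains`."""
    g = np.asarray(gains, dtype=float)
    # The water level mu satisfies sum over active users of (mu - 1/g_i)
    # equal to total_power; find the largest active set with positive powers.
    inv = np.sort(1.0 / g)
    for k in range(len(g), 0, -1):
        mu = (total_power + inv[:k].sum()) / k
        if mu > inv[k - 1]:   # the k strongest users all get positive power
            break
    return np.maximum(mu - 1.0 / g, 0.0)

# Three scheduled users with unequal gains and a total power budget of 2
p = waterfill([4.0, 1.0, 0.25], 2.0)
print(np.round(p, 3))  # -> [1.375 0.625 0.   ]
```

As the example shows, the weakest user is shut off entirely once the water level falls below its inverse gain, which is why user selection and power allocation interact in the sum-rate maximization the abstract describes.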

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential in terms of supporting a broad range of complex, compelling applications in both military and civilian fields, where users are able to enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex, heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have had great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in the compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services as well as scenarios of future wireless networks. Comment: 46 pages, 22 figures
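    One of the reinforcement-learning use cases surveyed here, dynamic spectrum access by a cognitive radio, can be illustrated with a toy epsilon-greedy bandit (a sketch under assumed conditions, not the article's method: channel success probabilities are fixed but unknown to the agent, and all parameter values are illustrative):

    ```python
    import random

    def epsilon_greedy_channels(success_prob, steps=5000, eps=0.1, seed=0):
        """A cognitive radio learns which channel transmits successfully most
        often. `success_prob[i]` is the (hidden) success rate of channel i.
        Returns the agent's estimated value of each channel."""
        rng = random.Random(seed)
        n = len(success_prob)
        counts = [0] * n
        values = [0.0] * n        # running mean reward per channel
        for _ in range(steps):
            if rng.random() < eps:
                ch = rng.randrange(n)                        # explore
            else:
                ch = max(range(n), key=lambda i: values[i])  # exploit
            reward = 1.0 if rng.random() < success_prob[ch] else 0.0
            counts[ch] += 1
            values[ch] += (reward - values[ch]) / counts[ch]  # incremental mean
        return values

    est = epsilon_greedy_channels([0.2, 0.5, 0.8])
    print([round(v, 2) for v in est])
    ```

    After enough interactions the agent's value estimates concentrate near the true success rates and it exploits the best channel, which is the interactive decision-making pattern the article attributes to reinforcement learning in wireless settings.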