28 research outputs found

    D6.3 Intermediate system evaluation results

    The overall purpose of METIS is to develop a 5G system concept that fulfils the requirements of the beyond-2020 connected information society and to extend today's wireless communication systems to new use cases. First, this deliverable presents an updated view of the overall METIS 5G system concept. Thereafter, simulation results for the most promising technology components supporting the METIS 5G system concept are reported. Finally, simulation results are presented for one relevant aspect of each Horizontal Topic: Direct Device-to-Device Communication, Massive Machine Communication, Moving Networks, Ultra-Dense Networks, and Ultra-Reliable Communication.
    Popovski, P.; Mange, G.; Fertl, P.; Gozálvez-Serrano, D.; Droste, H.; Bayer, N.; Roos, A.... (2014). D6.3 Intermediate system evaluation results. http://hdl.handle.net/10251/7676

    D3.3 Final performance results and consolidated view on the most promising multi-node/multi-antenna transmission technologies

    This document provides the most recent updates on the technical contributions and research challenges addressed in WP3. Each Technology Component (TeC) has been evaluated under a uniform assessment framework of WP3 based on the simulation guidelines of WP6. The performance assessment is supported by simulation results that are in a mature and stable state. An update on the Most Promising Technology Approaches (MPTAs) and their associated TeCs is the main focus of this document. Based on the input of all the TeCs in WP3, a consolidated view of WP3 on the role of multi-node/multi-antenna transmission technologies in 5G systems is also provided. This consolidated view is further supported by a presentation of the impact of the MPTAs on the METIS scenarios and the addressed METIS goals.
    Aziz, D.; Baracca, P.; De Carvalho, E.; Fantini, R.; Rajatheva, N.; Popovski, P.; Sørensen, JH.... (2015). D3.3 Final performance results and consolidated view on the most promising multi-node/multi-antenna transmission technologies. http://hdl.handle.net/10251/7675

    AN EFFICIENT INTERFERENCE AVOIDANCE SCHEME FOR DEVICE-TO-DEVICE ENABLED FIFTH GENERATION NARROWBAND INTERNET OF THINGS NETWORKS

    Narrowband Internet of Things (NB-IoT) is a low-power wide-area (LPWA) technology built on long-term evolution (LTE) functionalities and standardized by the 3rd-Generation Partnership Project (3GPP). Owing to its support for massive machine-type communication (mMTC) and for diverse IoT use cases with rigorous requirements on connectivity, energy efficiency, reachability, reliability, and latency, NB-IoT has attracted the research community. However, as the capacity needs of the various IoT use cases expand, the numerous functionalities of the LTE evolved packet core (EPC) may become overburdened and suboptimal, and several research efforts are in progress to address these challenges. This thesis therefore gives an overview of those efforts, with a specific focus on the optimized architecture of the LTE EPC functionalities, the 5G architectural design for NB-IoT integration, the enabling technologies necessary for 5G NB-IoT, the coexistence of 5G new radio (NR) with NB-IoT, and feasible architectural schemes for deploying NB-IoT with cellular networks. The thesis also presents cloud-assisted relaying with backscatter communication as part of a detailed study of the technical performance attributes and channel communication characteristics of the NB-IoT physical (PHY) and medium access control (MAC) layers, with a focus on 5G, and explores the numerous drawbacks that come with simulating these systems. The enabling market for NB-IoT, the benefits for a few use cases, and the potential critical challenges associated with their deployment are all highlighted. Fortunately, the cyclic-prefix orthogonal frequency-division multiplexing (CP-OFDM) waveform adopted by 3GPP NR for enhanced mobile broadband (eMBB) services does not prohibit the use of other waveforms in other services, such as the NB-IoT service for mMTC.
As a result, the coexistence of 5G NR and NB-IoT must be kept orthogonal (or quasi-orthogonal) to minimize the mutual interference that limits the degrees of freedom in the overall waveform design. Coexistence with 5G will therefore introduce a new interference challenge, distinct from that of the legacy network, even though the NR's coexistence with NB-IoT is expected to improve network capacity, expand user data-rate coverage, and make communication more robust through frequency reuse. Interference can make channel estimation difficult for NB-IoT devices, limiting user performance and spectral efficiency. Various existing interference-mitigation solutions either add to the network's overhead, computational complexity, and delay, or are hampered by low data rates and coverage, and are therefore unsuitable for an NB-IoT network given the low-complexity nature of its devices. A device-to-device (D2D) communication based interference-control technique thus becomes an effective strategy for addressing this problem. This thesis uses D2D communication to relieve the network bottleneck in dense 5G NB-IoT networks prone to interference. For D2D-enabled 5G NB-IoT systems, the thesis presents an interference-avoidance resource allocation that considers the less favoured cell-edge NUEs. To reduce computational complexity and interference power, the scheme divides the optimization problem into three sub-problems. First, in an orthogonal deployment technique using channel state information (CSI), the channel gain factor is leveraged by selecting a probable reuse channel with higher QoS control. Second, a bisection search finds the power control that maximizes the network sum rate. Third, the Hungarian algorithm builds a maximum bipartite matching to choose the optimal pairing pattern between the set of NUEs and the set of D2D pairs.
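The three-stage decomposition described above can be sketched as follows. This is an illustrative toy, not the thesis's model: the channel gains, power budgets, and SINR threshold are invented for the example, the bisection step stands in for the thesis's power-control search, and SciPy's `linear_sum_assignment` plays the role of the Hungarian matching.

```python
# Hypothetical sketch of the three-stage D2D resource-reuse idea:
# (1) evaluate candidate reuse channels from channel gains (CSI),
# (2) bisection-search the largest D2D power keeping the NUE's SINR
#     above a QoS threshold, (3) Hungarian matching of NUEs to D2D pairs.
# All numeric values are illustrative assumptions.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
n_nue, n_d2d = 4, 4
noise = 1e-9
p_nue, p_max = 0.2, 0.1                                  # transmit powers, W
g_nue = rng.uniform(1e-6, 1e-5, (n_nue,))                # NUE -> base station
g_d2d = rng.uniform(1e-5, 1e-4, (n_d2d,))                # D2D tx -> D2D rx
g_d2d_to_bs = rng.uniform(1e-8, 1e-7, (n_d2d,))          # D2D tx -> base station
g_nue_to_d2d = rng.uniform(1e-8, 1e-7, (n_nue, n_d2d))   # NUE -> D2D rx

def sum_rate(i, j, p):
    """Sum rate (bit/s/Hz) when D2D pair j reuses NUE i's channel at power p."""
    sinr_nue = p_nue * g_nue[i] / (noise + p * g_d2d_to_bs[j])
    sinr_d2d = p * g_d2d[j] / (noise + p_nue * g_nue_to_d2d[i, j])
    return np.log2(1 + sinr_nue) + np.log2(1 + sinr_d2d)

def best_power(i, j, sinr_min=3.0):
    """Bisection on the largest D2D power keeping NUE i's SINR above sinr_min."""
    lo, hi = 0.0, p_max
    for _ in range(60):
        mid = (lo + hi) / 2
        sinr_nue = p_nue * g_nue[i] / (noise + mid * g_d2d_to_bs[j])
        lo, hi = (mid, hi) if sinr_nue >= sinr_min else (lo, mid)
    return lo

# Stage 3: utility matrix + Hungarian algorithm (maximize total sum rate).
utility = np.array([[sum_rate(i, j, best_power(i, j)) for j in range(n_d2d)]
                    for i in range(n_nue)])
rows, cols = linear_sum_assignment(-utility)
for i, j in zip(rows, cols):
    print(f"NUE {i} <-> D2D pair {j}: {utility[i, j]:.2f} bit/s/Hz")
```

The negation of the utility matrix turns SciPy's cost-minimizing assignment into the sum-rate-maximizing matching the abstract describes.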
According to the numerical results, the proposed approach improves the D2D sum rate and the overall network SINR of the 5G NB-IoT system. The D2D pair's performance is affected by its maximum power constraint, its location, the pico base station (PBS) cell radius, the number of potential reuse channels, and the cluster distance. In the simulations, the scheme achieves SINR performance 28.35%, 31.33%, and 39% higher than the ARSAD, DCORA, and RRA algorithms when the number of NUEs is twice the number of D2D pairs, and 2.52%, 14.80%, and 39.89% higher than ARSAD, RRA, and DCORA when the numbers of NUEs and D2D pairs are equal. Likewise, it achieves a D2D sum rate 9.23%, 11.26%, and 13.92% higher than ARSAD, DCORA, and RRA when the number of NUEs is twice the number of D2D pairs, and 1.18%, 4.64%, and 15.93% higher than ARSAD, RRA, and DCORA, respectively, with equal numbers of NUEs and D2D pairs. These results demonstrate the efficacy of the proposed scheme. The thesis also addresses the setting in which the cell-edge NUE's QoS is threatened by long transmission distances, delays, low bandwidth utilization, and high system overhead, all of which degrade 5G NB-IoT network performance. In this case, most cell-edge NUEs boost their transmit power to maximize network throughput, and integrating a cooperative D2D relaying technique into 5G NB-IoT heterogeneous network (HetNet) uplink spectrum sharing increases both the system's spectral efficiency and its interference power, further degrading the network. Using a max-max SINR (Max-SINR) approach, the thesis proposes an interference-aware D2D relaying strategy that improves the QoS of a cell-edge NUE in 5G NB-IoT to achieve optimum system performance. The Lagrangian-dual technique optimizes the transmit power of the cell-edge NUE to the relay under an average interference-power constraint, while the relay uses a fixed transmit power towards the NB-IoT base station (NBS).
To choose an optimal D2D relay node, the channel-to-interference-plus-noise ratio (CINR) of all available D2D relays is used to maximize the minimum cell-edge NUE's data rate while ensuring that the cellular NUEs' QoS requirements are satisfied. Best-harmonic-mean, best-worst, and half-duplex relay selection, as well as a direct D2D communication scheme, were among the other selection strategies studied. The simulation results reveal that the Max-SINR selection scheme outperforms all other selection schemes except the direct D2D scheme, thanks to the high channel gain between the two communicating devices. The proposed algorithm achieves 21.27% SINR performance, nearly identical to the half-duplex scheme, and outperforms the best-worst and harmonic-mean selection techniques by 81.27% and 40.29%, respectively. As the number of D2D relays increases, the capacity rises by 14.10% and 47.19% over the harmonic-mean and half-duplex techniques, respectively. Finally, the thesis outlines future research on interference control, open research directions on PHY and MAC properties, and a SWOT (Strengths, Weaknesses, Opportunities, and Threats) analysis, presented in Chapter 2, to encourage further study of 5G NB-IoT.
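The CINR-based max-min relay selection can be illustrated with a small sketch. This is an assumed toy model (random gains, a shared interference term, a half-duplex rate penalty), not the thesis's exact formulation: the relay whose weaker hop has the best CINR bounds the two-hop end-to-end rate.

```python
# Illustrative Max-SINR (max-min CINR) relay selection: a two-hop D2D relay
# link is limited by its weaker hop, so pick the relay maximizing that
# bottleneck. Gains and interference values are invented for the example.
import numpy as np

rng = np.random.default_rng(1)
n_relays = 5
noise = 1e-9
interference = rng.uniform(1e-10, 1e-9, n_relays)  # per-relay interference
g_src_relay = rng.uniform(1e-7, 1e-6, n_relays)    # cell-edge NUE -> relay
g_relay_bs = rng.uniform(1e-7, 1e-6, n_relays)     # relay -> NB-IoT base station

cinr_hop1 = g_src_relay / (noise + interference)
cinr_hop2 = g_relay_bs / (noise + interference)
bottleneck = np.minimum(cinr_hop1, cinr_hop2)      # weaker hop per relay

best = int(np.argmax(bottleneck))                  # max-min selection
rate = 0.5 * np.log2(1 + bottleneck[best])         # 1/2: two-slot half-duplex
print(f"selected relay {best}, achievable rate {rate:.2f} bit/s/Hz")
```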

    C-Band Airport Surface Communications System Standards Development, Phase I

    This document is provided as part of ITT's NASA Glenn Research Center Aerospace Communication Systems Technical Support (ACSTS) contract NNC05CA85C, Task 7: "New ATM Requirements--Future Communications, C-Band and L-Band Communications Standard Development." The proposed future C-band (5091- to 5150-MHz) airport surface communication system, referred to as the Aeronautical Mobile Airport Communications System (AeroMACS), is anticipated to increase overall air-to-ground data communications capacity by using new spectrum (i.e., not very high frequency (VHF)). Although some critical services could be supported, AeroMACS will also target noncritical services, such as weather advisory and aeronautical information services, as part of an airborne System Wide Information Management (SWIM) program. AeroMACS is to be designed and implemented in a manner that will not disrupt other services operating in the C-band. This report covers the definition of the AeroMACS concepts of use, high-level system requirements, and architecture; the performance of supporting system analyses; the development of AeroMACS test and demonstration plans; and the establishment of an operational AeroMACS capability in support of C-band aeronautical data communications standards to be advanced in both international (International Civil Aviation Organization, ICAO) and national (RTCA) forums. This includes the development of system parameter profile recommendations for AeroMACS based on the existing Institute of Electrical and Electronics Engineers (IEEE) 802.16e-2009 standard.

    Narrowband LTE in machine-to-machine satellite communications

    Recent trends in wireless Machine-to-Machine (M2M) communication and the Internet of Things (IoT) have created new demand for more efficient low-throughput wireless data connections. Besides the traditional wireless standards, which focus on high-bandwidth data transfer, a new generation of Low Power Wide Area Networks (LPWAN) has emerged that targets less power-demanding, low-throughput devices requiring inexpensive data connections. The recently released NB-IoT (Narrowband IoT) specification extends the existing 4G/LTE standard, making LPWAN cellular connectivity more easily accessible to IoT devices. A narrower bandwidth and lower data rates, combined with a simplified air interface, make it less resource-demanding while still benefiting from the widely deployed LTE technologies and infrastructure. Applications such as wide-scale sensor or asset-tracking networks can benefit from global network coverage and readily available low-cost user equipment, which new narrowband IoT satellite networks could make possible. In this thesis, the NB-IoT specification and its applicability to satellite communication are discussed. LTE and NB-IoT are designed primarily for terrestrial use, and their utilization in Earth-to-space communication raises new challenges, such as the timing and frequency synchronization requirements of Orthogonal Frequency-Division Multiplexing (OFDM) techniques.
Many of these challenges can be overcome through specification adaptations and other existing techniques, making minimal changes to the standard and allowing terrestrial cellular networks to be extended to global satellite access.
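A back-of-the-envelope calculation shows why frequency synchronization is the sticking point mentioned above. The orbit speed and carrier frequency below are illustrative assumptions (a typical LEO orbit and an S-band carrier, neither taken from the thesis); the 15 kHz subcarrier spacing is the standard NB-IoT/LTE value.

```python
# Why OFDM synchronization is hard over satellite: worst-case line-of-sight
# Doppler from a LEO satellite vs. the 15 kHz NB-IoT subcarrier spacing.
# Orbit speed and carrier frequency are illustrative assumptions.
C = 3.0e8            # speed of light, m/s
v_leo = 7.5e3        # typical LEO orbital speed, m/s (assumed)
f_carrier = 2.0e9    # assumed S-band carrier, Hz
scs = 15e3           # NB-IoT subcarrier spacing, Hz

doppler_max = (v_leo / C) * f_carrier   # worst-case Doppler shift, Hz
print(f"max Doppler: {doppler_max/1e3:.0f} kHz "
      f"({doppler_max/scs:.1f}x the 15 kHz subcarrier spacing)")
```

Under these assumptions the uncompensated Doppler is several times the subcarrier spacing, so orthogonality is destroyed unless the specification is adapted (e.g. by pre-compensating the frequency offset, as the terrestrial-standard adaptations above suggest).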

    User Association in 5G Networks: A Survey and an Outlook

    26 pages; accepted to appear in IEEE Communications Surveys and Tutorials

    Review on Radio Resource Allocation Optimization in LTE/LTE-Advanced using Game Theory

    Recently, there has been a growing trend toward applying game theory (GT) to various engineering fields in order to solve optimization problems with competing entities, contributors, or players. Research in the fourth-generation (4G) wireless network field has also exploited this theory to overcome long-term evolution (LTE) challenges such as resource allocation, one of the most important research topics. In fact, an efficient design of resource-allocation schemes is the key to higher performance. However, the standard does not specify the optimization approach for radio resource management, leaving it open for study. This paper presents a survey of existing game-theory-based solutions to the 4G-LTE radio resource allocation problem and its optimization.
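A classic instance of the game-theoretic resource allocation this survey covers is iterative water-filling: each link best-responds to the others' interference until the dynamics settle at a Nash equilibrium. The two-link, two-channel numbers below are invented for illustration and are not taken from the paper.

```python
# Minimal sketch of game-theoretic power allocation: two interfering links
# iteratively water-fill their power budgets over shared subchannels, each
# treating the other's signal as noise (best-response / Nash dynamics).
# Gains, noise, and budgets are illustrative assumptions.
import numpy as np

def waterfill(gain, interference, p_total):
    """Water-filling over subchannels with effective floor (noise+I)/gain."""
    inv = interference / gain                 # effective noise floor per channel
    inv_s = np.sort(inv)
    k = len(inv_s)
    while k > 0:
        mu = (p_total + inv_s[:k].sum()) / k  # water level using k channels
        if mu > inv_s[k - 1]:                 # all k channels stay above floor
            break
        k -= 1
    return np.maximum(mu - inv, 0.0)          # power above the floor

noise = 1e-3
g = np.array([[1.0, 0.4], [0.5, 1.2]])        # direct gains, link x channel
x = np.array([[0.2, 0.1], [0.15, 0.25]])      # cross-interference gains
p = np.full((2, 2), 0.5)                      # start from a uniform split
for _ in range(50):                           # best-response iteration
    for i in range(2):
        j = 1 - i
        p[i] = waterfill(g[i], noise + x[j] * p[j], p_total=1.0)
print(np.round(p, 3))
```

At the fixed point neither link can improve its rate unilaterally, which is exactly the Nash-equilibrium notion the surveyed schemes optimize around.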

    Radio Resource Management in LTE-Advanced Systems with Carrier Aggregation

    In order to meet the ever-increasing demand for wireless broadband services from fast-growing numbers of mobile users, the Long Term Evolution-Advanced (LTE-A) standard has been proposed to improve the system capacity and spectral efficiency of fourth-generation (4G) wireless mobile communications. Many advanced techniques are incorporated in LTE-A systems to jointly improve system performance, among which Carrier Aggregation (CA) is considered one of the most promising, with profound significance even in the upcoming 5G era. Component carriers (CCs) from various portions of the spectrum are logically concatenated to form a much larger virtual band, remarkably boosting system capacity and user data throughput. However, the unique features of CA pose many emerging challenges, as well as brand-new opportunities, for Radio Resource Management (RRM) in LTE-A systems. First, although multi-CC transmission can bring higher throughput, it may incur more intensive interference on each CC and more power consumption for users, so the performance gain of CA under different conditions needs to be fully evaluated. Besides, as CA offers flexible CC selection and cross-CC load balancing and scheduling, enhanced RRM strategies should be designed to further optimize overall resource utilization. In addition, CA enables frequency reuse at CC resolution, adding another dimension to inter-cell interference management in heterogeneous networks (HetNets); new interference management mechanisms should be designed to take advantage of CA. Last but not least, CA empowers LTE-A systems to aggregate licensed spectrum with unlicensed spectrum, offering a capacity surge, yet how to balance traffic between licensed and unlicensed spectrum and how to achieve harmonious coexistence with other unlicensed systems are still open issues.
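The core CA idea stated above, that capacity adds across the aggregated component carriers, can be made concrete with a tiny Shannon-bound calculation. The bandwidths and SINRs below are invented for illustration, not figures from the dissertation.

```python
# CA in one line: aggregate capacity = sum of per-CC Shannon capacities.
# Per-CC bandwidth and SINR values are illustrative assumptions.
import math

ccs = [  # (bandwidth in Hz, SINR as a linear ratio) per component carrier
    (20e6, 10.0),  # licensed 20 MHz CC with good SINR
    (20e6, 3.0),   # second CC with more interference
    (20e6, 1.0),   # unlicensed CC shared with other systems
]
total = sum(b * math.log2(1 + s) for b, s in ccs)
print(f"aggregate Shannon capacity: {total/1e6:.0f} Mbit/s")
```

The per-CC SINRs differ precisely because each CC sees its own interference, which is why the dissertation treats CC selection and per-CC power control as distinct RRM levers.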
To this end, the dissertation focuses on the new functionalities introduced by CA to optimize RRM performance in LTE-A systems. The main objectives are four-fold: 1) to fully evaluate the benefits of CA from different perspectives under different conditions via both theoretical analysis and simulations; 2) to design cross-layer CC selection, packet scheduling, and power control strategies to optimize the target performance; 3) to analytically model the interference of HetNets with CA and propose dynamic interference mitigation strategies in a CA scenario; and 4) to investigate the impact of LTE transmissions on other unlicensed systems and develop enhanced RRM mechanisms for harmonious coexistence. To achieve these objectives, we first analyze the benefits of CA by investigating the user accommodation capabilities of the system in the downlink admission-control process. LTE-A users with CA capabilities and legacy LTE users are considered. Analytical models are developed to derive the maximum number of users that can be admitted into the system given the user QoS requirements and traffic features. The results show that, with only slightly higher spectrum utilization, the system can admit as many as twice the number of LTE-A users as LTE users when the user traffic is bursty. Second, we study RRM in a single-tier LTE-A system and propose a cross-layer dynamic CC selection and power control strategy for uplink CA. Specifically, the uplink power offset effects caused by multi-CC transmission are considered. An estimation method for user bandwidth allocation is developed, and a combinatorial optimization problem is formulated to improve user throughput by maximizing user power utilization. Third, we explore the interference management problem in multi-tier HetNets with CC-resolution frequency reuse. An analytical model is devised to capture the random behavior of the femtocells using stochastic geometry theory.
The interaction between the base stations of different tiers is formulated as a two-level Stackelberg game, and a backward induction method is exploited to obtain the Nash equilibrium. Last, we focus on mechanism design for licensed and unlicensed spectrum aggregation. An LTE MAC protocol for the unlicensed spectrum is developed with coexistence with Wi-Fi systems in mind. The protocol captures the asynchronous nature of Wi-Fi transmissions in the time-slotted LTE frame structure and strikes a tunable tradeoff between LTE and Wi-Fi performance. An analysis is also presented to reveal the essential relations among the parameters of the two systems. In summary, the dissertation aims at fully evaluating the benefits of CA in different scenarios and making full use of those benefits to develop efficient and effective RRM strategies for better LTE-Advanced system performance.
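The two-level Stackelberg game solved by backward induction can be sketched with a deliberately simplified pricing model (not the dissertation's): a macro-tier leader quotes an interference price, each femtocell follower best-responds with a transmit power, and the leader searches its price grid already knowing that response. All utilities, gains, and the interference cap are invented for the example.

```python
# Toy two-level Stackelberg game via backward induction: the follower's
# best response is solved first (in closed form), then the leader optimizes
# its price given that response. The model and all numbers are assumptions.
import numpy as np

noise, g_femto, g_cross, i_cap = 1e-3, 1.0, 0.5, 0.2

def follower_power(price):
    """Follower best response: argmax_p ln(1 + p*g/noise) - price*p."""
    return max(1.0 / price - noise / g_femto, 0.0)

def leader_utility(price):
    p = follower_power(price)
    if p * g_cross > i_cap:   # cross-tier interference cap violated
        return -np.inf
    return price * p          # revenue collected from the femtocell

prices = np.linspace(0.1, 50.0, 500)
best_price = max(prices, key=leader_utility)   # leader move, knowing the BR
p_star = follower_power(best_price)
print(f"price {best_price:.2f} -> femto power {p_star:.3f}, "
      f"interference {p_star * g_cross:.3f} (cap {i_cap})")
```

The leader's optimum sits at the lowest feasible price, which drives the follower's interference right up to the cap; this leader-anticipates-follower structure is the backward-induction step the abstract refers to.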