
    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex, compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making, because of the complex heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radio (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services as well as scenarios of future wireless networks. (Comment: 46 pages, 22 figures)

    Survey of Spectrum Sharing for Inter-Technology Coexistence

    Increasing capacity demands in emerging wireless technologies are expected to be met by network densification and spectrum bands open to multiple technologies. These will, in turn, increase the level of interference and also result in more complex inter-technology interactions, which will need to be managed through spectrum sharing mechanisms. Consequently, novel spectrum sharing mechanisms should be designed to allow spectrum access for multiple technologies, while efficiently utilizing the spectrum resources overall. Importantly, it is not trivial to design such efficient mechanisms, not only due to technical aspects, but also due to regulatory and business model constraints. In this survey we address spectrum sharing mechanisms for wireless inter-technology coexistence by means of a technology circle that incorporates the technical and non-technical aspects in a unified, system-level view. We thus systematically explore the spectrum sharing design space consisting of parameters at different layers. Using this framework, we present a literature review on inter-technology coexistence with a focus on wireless technologies with equal spectrum access rights, i.e. (i) primary/primary, (ii) secondary/secondary, and (iii) technologies operating in a spectrum commons. Moreover, we reflect on our literature review to identify possible spectrum sharing design solutions and performance evaluation approaches useful for future coexistence cases. Finally, we discuss spectrum sharing design challenges and suggest future research directions.

    Improving LTE network performance after a migration from CDMA2000 to LTE

    CDMA2000 technology has been widely used on the 450 MHz band. Recently, the equipment availability and improved performance offered by LTE have started driving operators to migrate their networks from CDMA2000 to LTE. The migration may leave network performance in a suboptimal state. This thesis presents four methods to positively influence LTE network performance after a CDMA2000-to-LTE migration, especially on the 450 MHz band. Three of the four presented methods are evaluated in a live network: cyclic prefix length, handover parameter optimization, and uplink coordinated multipoint (CoMP) transmission. The objective was to determine the effectiveness of each method. The research methods included field measurements and network KPI collection. The results show that the normal cyclic prefix length is sufficient for LTE450 even though the cell radius may be up to 50 km; only special cases exist where the cyclic prefix should be extended, and operators should consider solving such problems individually instead of implementing the extended cyclic prefix network-wide. Handover parameter optimization turned out to be an important point of attention after a CDMA2000-to-LTE migration: if the handover parameters are not attended to, a significant number of unnecessary handovers may occur. In the initial situation, about 50% of the handovers in the network were estimated to be unnecessary. By adjusting the handover parameter values, 47.28% of the handovers per user were eliminated and no negative effects were detected. Coordinated multipoint transmission has been widely discussed as an effective way to improve LTE network performance, especially at the cell edges. Many challenges must be overcome before it can be applied to the downlink, and implementing it between cells in different eNBs also involves challenges. Thus, only intra-site uplink CoMP transmission was tested.
The results show that the performance improvements were significant at the cell edges, as theory predicted.
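The cyclic prefix finding above can be illustrated with a small back-of-the-envelope sketch (not taken from the thesis; the LTE cyclic prefix durations are standard values). The CP must cover the multipath delay spread between the earliest and latest echoes, not the one-way propagation delay, which timing advance compensates, so even a 50 km cell can get by with the normal CP when the excess path difference stays small:

```python
# Sketch: how much excess path (longest echo minus shortest) each LTE
# cyclic prefix length can absorb. Values below are the standard LTE
# CP durations; the thesis conclusion is illustrated, not reproduced.

C = 299_792_458.0          # speed of light, m/s

NORMAL_CP_S = 4.69e-6      # LTE normal cyclic prefix, seconds
EXTENDED_CP_S = 16.67e-6   # LTE extended cyclic prefix, seconds

def max_excess_path_km(cp_seconds: float) -> float:
    """Longest multipath excess path (km) a CP of this length absorbs."""
    return C * cp_seconds / 1_000.0

print(f"normal CP covers ~{max_excess_path_km(NORMAL_CP_S):.1f} km of excess path")
print(f"extended CP covers ~{max_excess_path_km(EXTENDED_CP_S):.1f} km of excess path")
```

With a normal CP absorbing roughly 1.4 km of excess path, extended CP (about 5 km) is only needed in unusual propagation scenarios, which matches the thesis recommendation to handle such cells individually.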

    Optimized traffic scheduling and routing in smart home networks

    Home networks are evolving rapidly to include heterogeneous physical access and a large number of smart devices that generate different types of traffic with different distributions and different Quality of Service (QoS) requirements. Due to their particular architectures, which are very dense and very dynamic, the traditional one-pair-node shortest path solution is no longer efficient for handling inter-smart-home-network (inter-SHN) routing constraints such as delay, packet loss, and bandwidth over all-pair heterogeneous links. In addition, current QoS-aware scheduling methods consider only conventional priority metrics based on the IP Type of Service (ToS) field to make bandwidth allocation decisions. Such priority-based scheduling methods are not optimal for providing both QoS and Quality of Experience (QoE), especially for smart home applications, since higher-priority traffic does not necessarily require more stringent delay than lower-priority traffic. Moreover, current QoS-aware scheduling methods in the intra-smart-home network (intra-SHN) do not consider the concurrent traffic caused by fluctuations in intra-SHN traffic distributions. Thus, the goal of this dissertation is to build an efficient heterogeneous multi-constrained routing mechanism and an optimized traffic scheduling tool in order to maintain cost-effective communication between all wired and wireless connected devices in inter-SHNs and to effectively process concurrent and non-concurrent traffic in the intra-SHN. This will help Internet service providers (ISPs) and home users enhance the overall QoS and QoE of their applications while maintaining relevant communication in both inter-SHNs and the intra-SHN. To meet this goal, three key issues must be addressed in our framework, summarized as follows: i) how to build a cost-effective routing mechanism in heterogeneous inter-SHNs?
ii) how to efficiently schedule multi-sourced intra-SHN traffic based on both QoS and QoE? and iii) how to design an optimized queuing model for concurrent intra-SHN traffic while considering its QoS requirements? As part of our contributions to solving the first problem, we present an analytical framework for dynamically optimizing data flows in inter-SHNs using Software-Defined Networking (SDN). We formulate a QoS-based routing optimization problem as a constrained shortest path problem and then propose an optimized solution (QASDN) to determine the minimal cost between all pairs of nodes in the network, taking into account the different types of physical access and the network utilization patterns. To address the second issue and bridge the gap between QoS and QoE, we propose a new queuing model for QoS-level Pair traffic with mixed arrival distributions in the Smart Home network (QP-SH), which makes dynamic QoS-aware scheduling decisions that meet the delay requirements of all traffic while preserving its degrees of criticality. A new metric is defined, combining the ToS field with the maximum number of packets that the system's service can process within the maximum required delay. Finally, as part of our contribution to the third issue, we present an analytic model for QoS-aware scheduling optimization of concurrent intra-SHN traffic with mixed arrival distributions, using probabilistic queuing disciplines. We formulate a hybrid QoS-aware scheduling problem for concurrent traffic in the intra-SHN, propose an innovative queuing model (QC-SH) based on the auction model from game theory to provide fair multiple access over different communication channels/ports, and design an applicable model to implement the auction game on both sides (traffic sources and the home gateway) without changing the structure of the IEEE 802.11 standard.
The results of our work offer SHNs more effective data transfer between all heterogeneous connected devices with optimal resource utilization, dynamic QoS/QoE-aware traffic processing in the SHN, and an innovative model for optimizing concurrent SHN traffic scheduling with an enhanced fairness strategy. Numerical results show improvements of up to 90% in network resource utilization, 77% in bandwidth, 40% in scheduling with QoS and QoE, and 57% in concurrent traffic scheduling delay using our proposed solutions compared with traditional methods.
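The constrained-shortest-path core of QoS routing described above can be sketched briefly. This is an illustrative toy, not the dissertation's QASDN implementation: the graph, node names, and the single delay constraint are assumptions, and the search is a plain Dijkstra over (node, accumulated delay) states.

```python
# Illustrative sketch (not the dissertation's QASDN): minimum-cost path
# subject to an end-to-end delay bound. Each edge carries (cost, delay);
# we run Dijkstra over (node, accumulated-delay) states so a dearer but
# faster detour can beat a cheap but slow direct link.
import heapq

def cheapest_path_within_delay(graph, src, dst, max_delay):
    """graph: {node: [(neighbor, cost, delay), ...]}.
    Returns the minimal cost of a src->dst path whose total delay is
    <= max_delay, or None if no such path exists."""
    best = {}                 # (node, accumulated delay) -> best cost seen
    heap = [(0, 0, src)]      # (cost, delay, node); popped in cost order
    while heap:
        cost, delay, node = heapq.heappop(heap)
        if node == dst:
            return cost       # first pop of dst is the cheapest feasible path
        if best.get((node, delay), float("inf")) < cost:
            continue
        for nxt, c, d in graph.get(node, []):
            nd = delay + d
            if nd > max_delay:
                continue      # would violate the delay constraint
            nc = cost + c
            if nc < best.get((nxt, nd), float("inf")):
                best[(nxt, nd)] = nc
                heapq.heappush(heap, (nc, nd, nxt))
    return None

# Toy topology: the direct link is cheap but slow; the relay is dearer but fast.
g = {"gw": [("ap", 1, 9), ("relay", 2, 2)], "relay": [("ap", 2, 2)]}
print(cheapest_path_within_delay(g, "gw", "ap", 5))   # 4 (forced via relay)
print(cheapest_path_within_delay(g, "gw", "ap", 10))  # 1 (direct link allowed)
```

Tightening the delay bound changes which path is cost-optimal, which is exactly why a plain shortest-path solver is insufficient for the multi-constrained inter-SHN setting.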

    Improvement of 5G performance through network densification in millimetre wave band

    Recently, there has been substantial growth in mobile data traffic due to the widespread adoption of data-hungry devices such as mobile phones and laptops. The anticipated high traffic demands and low latency requirements stemming from the Internet of Things (IoT) and Machine Type Communications (MTC) can only be met with radical changes to the network paradigm, such as harnessing the millimetre wave (mmWave) band in Ultra-Dense Networks (UDN). This thesis addresses many challenges, problems and questions that arise in the research and design stage of a 5G network. The main challenges of 5G in mmWave can be characterised by the following attributes: (i) huge traffic demands with very high data rate requirements; (ii) high interference in UDN; (iii) increased handovers in UDN, higher dependency on Line-of-Sight (LOS) coverage, and high shadow fading; and (iv) massive MTC traffic due to billions of connected devices. In this work, software simulation tools have been used to evaluate the proposed solutions. We have introduced a 5G network based on network densification, which includes densification over frequency through mmWave, and densification over space through a higher number of antennas, Higher Order Sectorisation (HOS), and denser deployment of small cells. Our results show that the densification theme significantly improved network capacity and user Quality of Experience (QoE): a UDN can efficiently raise the user experience to the level that the 5G vision promised. However, one of the drawbacks of using UDN and HOS is the significant increase in Inter-Cell Interference (ICI), which can degrade the performance of a wireless network, particularly in UDN, due to the increased interference from surrounding cells. ICI has therefore been addressed in this work to increase the gain of densification: we have used Fractional Frequency Reuse (FFR) for ICI Coordination (ICIC) in the UDN and HOS environment.
The work shows that FFR improved network performance in terms of cell-edge data throughput and average cell throughput, and maintained the peak data throughput at a certain threshold. Additionally, HOS has shown even greater gain over default sectored sites when the interference is carefully coordinated. To generalise the principle of densification, we have introduced the Distributed Base Station (DBS) as the envisioned network architecture for 5G in mmWave. Remotely distributed antennas in the DBS architecture have been harnessed in order to compensate for the high path loss that characterises mmWave propagation. The proposed architecture has significantly improved user data throughput, decreased the unnecessary handovers that result from a dense network, increased the LOS coverage probability, and reduced the impact of shadow fading. Additionally, this research discusses the regulatory requirements in the mmWave band for the Maximum Permissible Exposure (MPE). Finally, scheduling massive MTC traffic in 5G has been considered. MTC is expected to contribute the majority of IoT traffic, and in this context an algorithm has been developed to schedule this type of traffic. The results demonstrate the gain of using distributed antennas for MTC traffic in terms of spectral efficiency, data throughput, and fairness, showing considerable improvement in these performance metrics. The combination of these contributions has provided a remarkable increase in data throughput, helping to achieve the 5G vision of "massive" capacity and to support human and machine traffic.
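The FFR idea used for ICIC above can be sketched in a few lines. This is a generic strict-FFR illustration, not the thesis's configuration: the band labels, the SINR-based edge test, and the 3 dB threshold are all assumptions.

```python
# Illustrative sketch (not the thesis implementation) of strict Fractional
# Frequency Reuse: cell-centre users share one reuse-1 band, while cell-edge
# users of neighbouring cells are kept on orthogonal reuse-3 sub-bands,
# which is what suppresses inter-cell interference at the edge.

CENTRE_BAND = "A"              # reuse-1 band, shared by every cell's centre
EDGE_BANDS = ["B", "C", "D"]   # reuse-3 bands, rotated across cells

def ffr_band(cell_id: int, user_sinr_db: float, edge_threshold_db: float = 3.0) -> str:
    """Assign a sub-band; users below the (assumed) SINR threshold are
    treated as cell-edge and moved off the shared reuse-1 band."""
    if user_sinr_db >= edge_threshold_db:
        return CENTRE_BAND
    return EDGE_BANDS[cell_id % 3]

# Edge users of three neighbouring cells land on mutually orthogonal bands:
print([ffr_band(c, user_sinr_db=-2.0) for c in (0, 1, 2)])  # ['B', 'C', 'D']
print(ffr_band(0, user_sinr_db=10.0))                        # 'A'
```

Because neighbouring cells rotate through the edge bands, an edge user never shares its sub-band with the adjacent cells' edge users, trading some spectrum for the cell-edge throughput gains the thesis reports.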

    Machine Learning for Unmanned Aerial System (UAS) Networking

    Fueled by the advancement of 5G New Radio (5G NR), rapid development has occurred in many fields. Compared with conventional approaches, beamforming and network slicing enable 5G NR to achieve a tenfold improvement in latency, connection density, and experienced throughput over 4G Long Term Evolution (4G LTE). These advantages pave the way for the evolution of Cyber-Physical Systems (CPS) on a large scale. Reduced power consumption, advances in control engineering, and the simplification of the Unmanned Aircraft System (UAS) make large-scale UAS networking deployment feasible; such networks can carry out multiple complex missions simultaneously. However, the limitations of conventional approaches still make it challenging to trade off massive management against efficient networking on a large scale. Using 5G NR and machine learning, the contributions of this dissertation can be summarized as follows. I proposed a novel Optimized Ad-hoc On-demand Distance Vector (OAODV) routing protocol to improve the throughput of intra-UAS networking; the new protocol reduces system overhead and is efficient. To improve security, I proposed a blockchain scheme to mitigate malicious base stations for cellular-connected UAS networking, and a proof-of-traffic (PoT) mechanism to improve the efficiency of blockchain for large-scale UAS networking. Inspired by the biological cell paradigm, I proposed cell-wall routing protocols for heterogeneous UAS networking. With 5G NR, the interconnections between UAS networks can strengthen the throughput and elasticity of UAS networking. With machine learning, the routing schedules for intra- and inter-UAS networking can enhance the throughput of UAS networking on a large scale. Inter-UAS networking can achieve the globally max-min throughput via edge coloring, and I leveraged upper and lower bounds to accelerate the edge-coloring optimization.
This dissertation paves the way for UAS networking at the intersection of CPS and machine learning, showing that UAS networking can achieve outstanding performance in a decentralized architecture. Concurrently, it gives insights into large-scale UAS networking. These results are fundamental to integrating UAS into the National Airspace System (NAS), which is critical to both manned and unmanned aviation. The dissertation provides novel approaches for promoting UAS networking on a large scale; the proposed approaches extend the state of the art of UAS networking in a decentralized architecture, and all of these contributions can advance the establishment of UAS networking with CPS.
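The edge-coloring step mentioned above can be illustrated with a short sketch. This is a generic greedy edge coloring, not the dissertation's algorithm: colours stand in for interference-free time slots, and Vizing's theorem bounds the number of colours needed between the maximum degree D (lower bound) and D + 1 (upper bound), the kind of bounds the abstract says were leveraged.

```python
# Illustrative sketch (not the dissertation's algorithm): greedy proper
# edge colouring of a UAS link graph. Two links sharing a UAS node must
# get distinct colours (transmit slots); the greedy pass assigns each
# edge the smallest colour unused at either endpoint.
from collections import defaultdict

def greedy_edge_coloring(edges):
    """edges: list of (u, v) pairs. Returns {edge: colour} such that no
    two edges sharing an endpoint receive the same colour."""
    used_at = defaultdict(set)  # node -> colours already used at that node
    colour_of = {}
    for u, v in edges:
        c = 0
        while c in used_at[u] or c in used_at[v]:
            c += 1              # smallest colour free at both endpoints
        colour_of[(u, v)] = c
        used_at[u].add(c)
        used_at[v].add(c)
    return colour_of

# A small 4-node link graph; node "a" has degree 3, so at least 3 colours
# are needed, and greedy stays within the Vizing-style D + 1 bound.
links = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a"), ("a", "c")]
print(greedy_edge_coloring(links))
```

Each colour class is a set of non-interfering links that can be scheduled in the same slot, which is how an edge colouring turns into a throughput-fair transmission schedule.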