
    Self organization of tilts in relay enhanced networks: a distributed solution

    Despite years of physical-layer research, the capacity enhancement potential of relays is limited by the additional spectrum required for Base Station (BS)-Relay Station (RS) links. This paper presents a novel distributed solution that instead exploits a system-level perspective. Building on a realistic system model with impromptu RS deployments, we develop an analytical framework for tilt optimization that can dynamically maximize the spectral efficiency of both the BS-RS and BS-user links in an online manner. To obtain a distributed self-organizing solution, the large-scale system-wide optimization problem is decomposed into small-scale local subproblems by applying the design principles of self-organization in biological systems. The local subproblems are non-convex but, owing to their very small scale, can be solved via standard nonlinear optimization techniques such as sequential quadratic programming. The performance of the developed solution is evaluated through extensive simulations for an LTE-A type system and compared against a number of benchmarks, including a centralized solution obtained via brute force, which also gives an upper bound to assess the optimality gap. Results show that the proposed solution can enhance average spectral efficiency by up to 50% compared to fixed tilting, with negligible signaling overhead. The key advantage of the proposed solution is its potential for autonomous and distributed implementation.
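
    To make the scale of such a local subproblem concrete, the sketch below solves a toy two-variable tilt subproblem with sequential quadratic programming via SciPy's SLSQP. The objective function, tilt bounds, and all coefficients are illustrative assumptions, not the paper's actual system model:

```python
import numpy as np
from scipy.optimize import minimize

def neg_spectral_efficiency(tilts):
    """Toy surrogate for a local subproblem objective (hypothetical).

    Each link's gain peaks at an assumed best tilt angle; the paper's
    real model couples antenna patterns, pathloss, and interference.
    """
    bs_tilt, rs_tilt = tilts
    bs_rs = np.log2(1 + 10 * np.exp(-0.5 * ((bs_tilt - 8) ** 2 + (rs_tilt - 5) ** 2)))
    bs_ue = np.log2(1 + 10 * np.exp(-0.5 * (bs_tilt - 6) ** 2))
    return -(bs_rs + bs_ue)  # negate: minimizing this maximizes total SE

# Small-scale non-convex problem, solved with sequential quadratic programming
result = minimize(
    neg_spectral_efficiency,
    x0=[6.0, 6.0],                      # initial tilt guess (degrees)
    method="SLSQP",                     # SciPy's SQP implementation
    bounds=[(0.0, 15.0), (0.0, 15.0)],  # assumed feasible tilt range
)
print("tilts:", result.x, "spectral efficiency:", -result.fun)
```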

    A SON Solution for Sleeping Cell Detection Using Low-Dimensional Embedding of MDT Measurements

    Automatic detection of cells which are in outage has been identified as one of the key use cases for Self Organizing Networks (SON) for emerging and future generations of cellular systems. A special case of cell outage, referred to as a Sleeping Cell (SC), remains particularly challenging to detect in state-of-the-art SON, because in this case the cell goes into outage or may perform poorly without triggering an alarm for the Operation and Maintenance (O&M) entity. Consequently, no SON compensation function can be launched unless the SC situation is detected via drive tests or through complaints registered by the affected customers. In this paper, we present a novel solution to address this problem that makes use of minimization of drive test (MDT) measurements recently standardized by 3GPP and NGMN. To overcome the processing complexity challenge, the MDT measurements are projected onto a low-dimensional space using the multidimensional scaling method. We then apply state-of-the-art k-nearest neighbor and local outlier factor based anomaly detection models to the pre-processed MDT measurements to profile the network behaviour and to detect SCs. Our numerical results show that the proposed solution can automate the SC detection process with 93% accuracy.
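
    A minimal sketch of this detection pipeline using scikit-learn, with synthetic stand-in data (the feature layout, embedding dimension, and parameter choices are assumptions for illustration, not the paper's settings):

```python
import numpy as np
from sklearn.manifold import MDS
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(0)
# Synthetic stand-in for MDT reports: rows are measurements, columns are
# KPIs such as RSRP from serving and neighbor cells (hypothetical layout).
normal = rng.normal(loc=-80, scale=4, size=(200, 10))
degraded = rng.normal(loc=-110, scale=4, size=(10, 10))  # sleeping-cell-like
mdt = np.vstack([normal, degraded])

# Step 1: project the high-dimensional reports onto a low-dimensional
# space via multidimensional scaling
embedding = MDS(n_components=2, random_state=0).fit_transform(mdt)

# Step 2: local-outlier-factor anomaly detection on the embedding
lof = LocalOutlierFactor(n_neighbors=20)
labels = lof.fit_predict(embedding)        # -1 marks anomalous reports
print("flagged as anomalous:", np.where(labels == -1)[0])
```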

    A cell outage management framework for dense heterogeneous networks

    In this paper, we present a novel cell outage management (COM) framework for heterogeneous networks with split control and data planes, a candidate architecture for meeting future capacity, quality-of-service, and energy efficiency demands. In such an architecture, the control and data functionalities are not necessarily handled by the same node. The control base stations (BSs) manage the transmission of control information and user equipment (UE) mobility, whereas the data BSs handle UE data. An implication of this split architecture is that an outage to a BS in one plane has to be compensated by other BSs in the same plane. Our COM framework addresses this challenge by incorporating two distinct cell outage detection (COD) algorithms to cope with the idiosyncrasies of both data and control planes. The COD algorithm for control cells leverages the relatively larger number of UEs in the control cell to gather large-scale minimization-of-drive-test report data and detects an outage by applying machine learning and anomaly detection techniques. To improve outage detection accuracy, we also investigate and compare the performance of two anomaly-detecting algorithms, i.e., k-nearest-neighbor- and local-outlier-factor-based anomaly detectors, within the control COD. For data cell COD, on the other hand, we propose a heuristic Grey-prediction-based approach, which can work with the small number of UEs in a data cell, by exploiting the fact that the control BS manages UE-data BS connectivity and receives periodic updates of the reference signal received power statistic between the UEs and the data BSs in its coverage. The detection accuracy of the heuristic data COD algorithm is further improved by applying a Fourier series correction to the residual error that is inherent to a Grey prediction model. Our COM framework integrates these two COD algorithms with a cell outage compensation (COC) algorithm that can be applied to both planes. Our COC solution utilizes an actor-critic-based reinforcement learning algorithm, which optimizes the capacity and coverage of the identified outage zone in a plane by adjusting the antenna gain and transmission power of the surrounding BSs in that plane. The simulation results show that the proposed framework can detect both data and control cell outages and compensate for the detected outage in a reliable manner.
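
    As a rough sketch of the Grey-prediction idea behind the data cell COD, the snippet below implements a textbook GM(1,1) model and forecasts a signal-quality series; a large gap between forecast and actual reports could then flag an outage. The input series, threshold logic, and the paper's Fourier residual correction are omitted or assumed here:

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Textbook one-variable Grey model GM(1,1): fit on series x0, forecast ahead."""
    x0 = np.asarray(x0, dtype=float)
    n = len(x0)
    x1 = np.cumsum(x0)                           # accumulated generating series
    z1 = 0.5 * (x1[1:] + x1[:-1])                # mean sequence of x1
    B = np.column_stack([-z1, np.ones(n - 1)])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]  # develop/grey coefficients
    k = np.arange(n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # restored accumulated series
    x0_hat = np.diff(x1_hat)                     # back to the original domain
    return x0_hat[n - 1:]                        # the `steps` forecast values

# Hypothetical degradation indicator per reporting period (e.g., negated
# RSRP in dB, kept positive as GM(1,1) expects a non-negative series)
signal_level = [82, 83, 85, 88, 92, 97]
print("forecast:", gm11_forecast(signal_level, steps=2))
# Comparing forecasts against incoming reports yields the residual that
# an outage-detection threshold (not shown) would act on.
```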

    Interpretable AI-based large-scale 3D pathloss prediction model for enabling emerging self-driving networks

    In modern wireless communication systems, radio propagation modeling to estimate pathloss has always been a fundamental task in system design and optimization. The state-of-the-art empirical propagation models are based on measurements in specific environments and are limited in their ability to capture the idiosyncrasies of various propagation environments. To cope with this problem, ray-tracing based solutions are used in commercial planning tools, but they tend to be extremely time-consuming and expensive. We propose a Machine Learning (ML)-based model that leverages novel key predictors for estimating pathloss. By quantitatively evaluating various ML algorithms in terms of predictive, generalization, and computational performance, our results show that the Light Gradient Boosting Machine (LightGBM) algorithm overall outperforms the others, even with sparse training data, providing a 65% increase in prediction accuracy compared to empirical models and a 13x decrease in prediction time compared to ray-tracing. To address the interpretability challenge that thwarts the adoption of most ML-based models, we perform extensive secondary analysis using the SHapley Additive exPlanations (SHAP) method, yielding many practically useful insights that can be leveraged for intelligently tuning the network configuration, selectively enriching training data in real networks, and building lighter ML-based propagation models to enable low-latency use cases.
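
    A compact sketch of the modeling-plus-interpretation loop described above, with synthetic features standing in for the paper's predictors (the feature set, the toy pathloss formula, and all hyperparameters are assumptions for illustration):

```python
import numpy as np
import lightgbm as lgb
import shap

rng = np.random.default_rng(0)
# Hypothetical predictors: distance, frequency, Tx/Rx antenna heights,
# and a clutter indicator. The paper's actual feature set differs.
X = rng.uniform(size=(2000, 5))
dist, freq, h_tx, h_rx, clutter = X.T
# Toy log-distance pathloss plus noise, not measured data
y = (40 + 35 * np.log10(1 + 9 * dist) + 20 * np.log10(1 + freq)
     - 5 * h_tx - 2 * h_rx + 3 * clutter
     + rng.normal(scale=2, size=2000))

# Gradient-boosted trees (LightGBM) as the pathloss regressor
model = lgb.LGBMRegressor(n_estimators=300, learning_rate=0.05)
model.fit(X, y)

# SHAP secondary analysis: per-feature contribution to each prediction
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X[:100])
print("mean |SHAP| per feature:", np.abs(shap_values).mean(axis=0))
```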

    Q-Map Application for Enrichment of a Mobile Directory Assistance Service

    The project described in this paper involves designing and developing a mobile map application, called the Qatar Map (Q-Map), which supports a telephone directory assistance service running over the terrestrial cellular network. The application uses WAP Push technology to extend the features available in a conventional directory assistance service. Q-Map enables the network agent to respond to a subscriber requesting a business's telephone number with supplementary information. In addition to the telephone number, this information includes a web address (URL) through which the subscriber can access a Google map covering the business's area, along with any marketing content (e.g., advertising) uploaded earlier by that business. The service is also offered online through the Internet: the subscriber can access the Q-Map website using a web browser on either a PC or a mobile handset.

    Challenges in 5G: how to empower SON with big data for enabling 5G

    While an al dente character of 5G is yet to emerge, network densification, miscellany of node types, split of control and data plane, network virtualization, heavy and localized caching, infrastructure sharing, concurrent operation at multiple frequency bands, simultaneous use of different medium access control and physical layers, and flexible spectrum allocations can be envisioned as some of the potential ingredients of 5G. It is not difficult to prognosticate that with such a conglomeration of technologies, the complexity of operation and OPEX can become the biggest challenge in 5G. To cope with similar challenges in the context of 3G and 4G networks, self-organizing networks (SONs) have recently been researched extensively. However, the ambitious quality-of-experience requirements and the emerging multifarious vision of 5G, with their associated scale of complexity and cost, demand a significantly different, if not totally new, approach toward SONs in order to make 5G technically as well as financially feasible. In this article we first identify the challenges that hinder the current self-optimizing networking paradigm from meeting the requirements of 5G. We then propose a comprehensive framework for empowering SONs with big data to address the requirements of 5G. Under this framework we first characterize big data in the context of future mobile networks, identifying its sources and future utilities. We then explicate the specific machine learning and data analytics tools that can be exploited to transform big data into the right data, which provides a readily usable knowledge base to create end-to-end intelligence of the network. We then explain how a SON engine can build on the dynamic models extractable from the right data. The resultant dynamicity of a big data empowered SON (BSON) makes it more agile and can essentially transform SON from a reactive to a proactive paradigm, hence acting as a key enabler for 5G's extremely low latency requirements. Finally, we demonstrate the key concepts of our proposed BSON framework through a case study of a problem that the classic 3G/4G SON fails to solve.

    Microdiversity on Rician fading channels
