
    A survey of machine learning techniques applied to self organizing cellular networks

    In this paper, a survey of the literature of the past fifteen years on Machine Learning (ML) algorithms applied to self-organizing cellular networks is presented. For future networks to overcome current limitations and address the issues of existing cellular systems, more intelligence must be deployed so that a fully autonomous and flexible network can be enabled. This paper focuses on the learning perspective of Self Organizing Network (SON) solutions and provides not only an overview of the most common ML techniques encountered in cellular networks, but also a classification of each paper in terms of its learning solution, together with examples. The authors also classify each paper in terms of its self-organizing use case and discuss how each proposed solution performed. In addition, the most commonly found ML algorithms are compared in terms of certain SON metrics, and general guidelines are proposed on when to choose each ML algorithm for each SON function. Lastly, this work outlines future research directions and the new paradigms that more robust and intelligent algorithms, together with data gathered by operators, can bring to the cellular networks domain, fully enabling the concept of SON in the near future.

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex, compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex, heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have had great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services and scenarios of future wireless networks.
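
    As a toy illustration of the reinforcement-learning branch surveyed above, the following sketch applies Q-learning, in its stateless (bandit) form, to a simplified cognitive-radio channel-selection task. The environment, the channel idle probabilities, and all parameter values are illustrative assumptions, not material from the article.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical environment: each channel is idle (transmission succeeds)
        # with a fixed but unknown probability. These numbers are illustrative only.
        idle_prob = np.array([0.2, 0.8, 0.5, 0.9])
        n_channels = len(idle_prob)

        q = np.zeros(n_channels)        # Q-value per channel (stateless bandit form)
        alpha, epsilon = 0.1, 0.1       # learning rate and exploration rate

        for step in range(5000):
            # Epsilon-greedy action selection over channels.
            if rng.random() < epsilon:
                a = int(rng.integers(n_channels))
            else:
                a = int(np.argmax(q))
            reward = 1.0 if rng.random() < idle_prob[a] else 0.0  # success = idle channel
            q[a] += alpha * (reward - q[a])                        # Q-learning update

        print("learned Q-values:", np.round(q, 2))   # should approach idle_prob
        print("preferred channel:", int(np.argmax(q)))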

    Bicycles Mobility Prediction

    The growth in mobile wireless communication requires sharp solutions to mobility problems, which include poor handover management, interference at access points, excessive load in macrocells, and other related issues. With the deployment of small-cell networks in 5G mobile systems, these problems intensify; mobility prediction schemes therefore arise to mitigate them. Predicting mobility is not a trivial task because of the vast number of variables that characterize a mobility route, which translates into unpredictability and randomness. The task of this work is to overcome these challenges by building a solid mobility prediction architecture that can analyze big data and find patterns in mobility, in order to ultimately perform reliable predictions. The models introduced in this dissertation are two deep learning schemes based on an Artificial Neural Network (ANN) architecture and a Long Short-Term Memory (LSTM) architecture. Prediction was made at two levels: short-term and long-term. We verified that in the short-term domain both models performed equivalently, with successful results; in long-term prediction, however, the LSTM model surpassed the ANN model, making the LSTM approach the stronger model in all prediction aspects. Implementing this model in cellular networks is an important asset for optimizing processes such as routing and caching, as the network can allocate the necessary resources to provide a better user experience. With this optimization impact and with the emergence of the Internet of Things (IoT), the prediction model can support and improve the development of smart applications related to our daily mobility routine.
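
    To make the LSTM approach concrete, below is a minimal sketch of the kind of next-cell prediction model the dissertation describes, assuming mobility is discretised into cell IDs. The architecture sizes, the synthetic random-walk targets, and the training loop are illustrative assumptions, not the authors' configuration; PyTorch is used here purely for convenience.

        import torch
        import torch.nn as nn

        # Toy setup: mobility is a sequence of discrete cell IDs; the model
        # predicts the next cell from a short history. Sizes are illustrative.
        n_cells, emb_dim, hidden = 50, 16, 32

        class NextCellLSTM(nn.Module):
            def __init__(self):
                super().__init__()
                self.emb = nn.Embedding(n_cells, emb_dim)
                self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True)
                self.head = nn.Linear(hidden, n_cells)

            def forward(self, x):                 # x: (batch, seq_len) of cell IDs
                h, _ = self.lstm(self.emb(x))
                return self.head(h[:, -1])        # logits for the next cell

        model = NextCellLSTM()
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.CrossEntropyLoss()

        # Synthetic training data: random histories with a fabricated
        # "next cell" target, standing in for real mobility traces.
        x = torch.randint(0, n_cells - 1, (256, 10))
        y = (x[:, -1] + 1) % n_cells

        for epoch in range(20):
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()
        print("final training loss:", float(loss))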

    Non-linear echo cancellation - a Bayesian approach

    The echo cancellation literature is reviewed, a Bayesian model is introduced, and it is shown how it can be used to model and fit nonlinear channels. An algorithm for cancellation of echo over a nonlinear channel is developed and tested. This nonlinear algorithm is shown to converge for both linear and nonlinear channels and to be superior to linear echo cancellation when canceling an echo through a nonlinear echo-path channel.
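
    The thesis develops a Bayesian algorithm; as a conventional point of reference, the sketch below implements a standard adaptive canceller that augments an NLMS filter with cubic regressors (a truncated Volterra-style expansion), one common way to capture a nonlinear echo path. The simulated echo path and all parameter values are invented for illustration; this is not the Bayesian method the work proposes.

        import numpy as np

        rng = np.random.default_rng(1)

        # Simulated nonlinear echo path: a memoryless cubic distortion feeding
        # a short FIR channel. Entirely illustrative, not the thesis's model.
        def echo_path(x):
            z = x + 0.3 * x**3
            h = np.array([0.6, -0.3, 0.1])
            return np.convolve(z, h)[: len(x)]

        n, taps = 20000, 3
        far_end = rng.standard_normal(n)
        mic = echo_path(far_end) + 0.01 * rng.standard_normal(n)

        # Adaptive canceller: NLMS over linear + cubic regressors, so the
        # filter can track the nonlinearity a purely linear canceller misses.
        w = np.zeros(2 * taps)
        mu, eps = 0.5, 1e-6
        err = np.zeros(n)
        for t in range(taps, n):
            seg = far_end[t - taps + 1 : t + 1][::-1]   # most recent sample first
            u = np.concatenate([seg, seg**3])           # linear + cubic terms
            y = w @ u                                   # echo estimate
            err[t] = mic[t] - y                         # cancelled (residual) signal
            w += mu * err[t] * u / (eps + u @ u)        # NLMS update

        print("residual echo power (last 1000 samples):", np.mean(err[-1000:] ** 2))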

    Machine Learning Meets Communication Networks: Current Trends and Future Challenges

    The growing network density and the unprecedented increase in network traffic, caused by the massively expanding number of connected devices and online services, require intelligent network operations. Machine Learning (ML) has been applied in this regard to different types of networks and networking technologies to meet the requirements of future communicating devices and services. In this article, we provide a detailed account of current research on the application of ML in communication networks and shed light on future research challenges. Research on the application of ML in communication networks is described for: i) the three layers, i.e., the physical, access, and network layers; and ii) novel computing and networking concepts such as Multi-access Edge Computing (MEC), Software Defined Networking (SDN), and Network Functions Virtualization (NFV), along with a brief overview of ML-based network security. Important future research challenges are identified and presented to help spur further research in key areas in this direction.

    Cell identity allocation and optimisation of handover parameters in self-organised LTE femtocell networks

    A thesis submitted to the University of Bedfordshire in partial fulfilment of the requirements for the degree of Doctor of Philosophy.
    A femtocell is a small cellular base station used by operators to extend indoor service coverage and enhance overall network performance. In Long Term Evolution (LTE), a femtocell works under macrocell coverage and combines with the macrocell to constitute a two-tier network. Compared to the traditional single-tier network, the two-tier scenario creates many new challenges, which led the 3rd Generation Partnership Project (3GPP) to introduce an automation technology called Self-Organising Network (SON) in order to achieve lower cost and enhanced network performance. This thesis focuses on inbound and outbound handovers (handover between femtocell and macrocell); in detail, it provides suitable solutions for predicting the intensity of femtocell handover, for Physical Cell Identity (PCI) allocation, and for handover triggering parameter optimisation, all implemented within the SON framework. In order to efficiently manage radio resource allocation, this research investigates the conventional UE-based prediction model and proposes a cell-based prediction model for the intensity of a femtocell's handovers, which overcomes the drawbacks of the conventional models in the two-tier scenario. The predictor is then used in the proposed dynamic group PCI allocation approach to solve the problem of PCI allocation for femtocells. Based on SON, this approach is implemented in the structure of centralised Automated Configuration of Physical Cell Identity (ACPCI); it overcomes the drawbacks of the conventional method by reducing inbound handover failures of Cell Global Identity (CGI). The thesis also tackles optimisation of the handover triggering parameters to minimise handover failure: a dynamic hysteresis-adjusting approach for each User Equipment (UE) is proposed, using the UE's received average Reference Signal-Signal to Interference plus Noise Ratio (RS-SINR) as the criterion. Based on SON, this approach is implemented in the structure of hybrid Mobility Robustness Optimisation (MRO) and is able to offer a uniquely optimised hysteresis value to each individual UE in the network. To evaluate the performance of the proposed approaches against existing methods, a System Level Simulation (SLS) tool provided by the Centre for Wireless Network Design (CWiND) research group is utilised, which models the two-tier structure of LTE femtocell-based networks.
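
    The per-UE hysteresis idea can be sketched schematically: map each UE's averaged RS-SINR to a hysteresis value, then apply an A3-style handover entry condition. The thresholds and dB values below are invented placeholders, not the thesis's tuned parameters.

        # Schematic per-UE hysteresis adjustment (all values are illustrative).

        def hysteresis_db(avg_rs_sinr_db: float) -> float:
            """Map a UE's averaged RS-SINR to a hysteresis value.

            Intuition: a UE with poor SINR should hand over sooner (small
            hysteresis); a UE with good SINR can afford a larger margin that
            suppresses ping-pong handovers. Thresholds here are assumptions.
            """
            if avg_rs_sinr_db < 0.0:
                return 1.0
            if avg_rs_sinr_db < 10.0:
                return 2.0
            return 4.0

        def a3_event_triggered(serving_rsrp_db: float, target_rsrp_db: float,
                               avg_rs_sinr_db: float, offset_db: float = 0.0) -> bool:
            """A3-style entry condition: target exceeds serving by hysteresis + offset."""
            hys = hysteresis_db(avg_rs_sinr_db)
            return target_rsrp_db > serving_rsrp_db + hys + offset_db

        # Example: a cell-edge UE (low SINR) triggers earlier than a strong one.
        print(a3_event_triggered(-95.0, -93.5, avg_rs_sinr_db=-3.0))  # True  (hys = 1 dB)
        print(a3_event_triggered(-95.0, -93.5, avg_rs_sinr_db=15.0))  # False (hys = 4 dB)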

    Artificial intelligence (AI) methods in optical networks: A comprehensive survey

    Artificial intelligence (AI) is an extensive scientific discipline which enables computer systems to solve problems by emulating complex biological processes such as learning, reasoning and self-correction. This paper presents a comprehensive review of the application of AI techniques for improving the performance of optical communication systems and networks. The use of AI-based techniques is first studied in applications related to optical transmission, ranging from the characterization and operation of network components to performance monitoring, mitigation of nonlinearities, and quality-of-transmission estimation. Applications related to optical network control and management are then reviewed, including topics such as optical network planning and operation in both transport and access networks. Finally, the paper presents a summary of opportunities and challenges in optical networking where AI is expected to play a key role in the near future. Funding: Ministerio de Economía, Industria y Competitividad (Projects EC2014-53071-C3-2-P and TEC2015-71932-REDT).