
    A survey of machine learning techniques applied to self organizing cellular networks

    In this paper, a survey of the literature of the past fifteen years on Machine Learning (ML) algorithms applied to self-organizing cellular networks is performed. For future networks to overcome current limitations and address the issues of today's cellular systems, it is clear that more intelligence needs to be deployed, so that a fully autonomous and flexible network can be enabled. This paper focuses on the learning perspective of Self Organizing Networks (SON) solutions and provides not only an overview of the most common ML techniques encountered in cellular networks but also a classification of each paper in terms of its learning solution, together with examples. The authors also classify each paper in terms of its self-organizing use case and discuss how each proposed solution performed. In addition, the most commonly found ML algorithms are compared in terms of certain SON metrics, and general guidelines on when to choose each ML algorithm for each SON function are proposed. Lastly, this work also outlines future research directions and the new paradigms that more robust and intelligent algorithms, together with data gathered by operators, can bring to the cellular networks domain, fully enabling the concept of SON in the near future.
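The idea of matching a learning algorithm to a SON function can be made concrete with a toy reinforcement-learning sketch. Everything below is a hypothetical illustration, not taken from the survey: the three power levels, the load model, and the reward are invented. An agent learns which transmit-power level keeps cell load closest to a target, a much-simplified stand-in for a self-optimisation use case.

```python
import random

random.seed(0)
ACTIONS = [0, 1, 2]      # power levels: low, medium, high (hypothetical)
TARGET_LOAD = 0.6

def reward(action):
    # Hypothetical model: each power level yields a fixed cell load;
    # reward is the (negated) distance from the target load.
    load = {0: 0.3, 1: 0.6, 2: 0.9}[action]
    return -abs(load - TARGET_LOAD)

# Bandit-style Q-learning with epsilon-greedy exploration.
q = {a: 0.0 for a in ACTIONS}
alpha, epsilon = 0.1, 0.2
for _ in range(500):
    if random.random() < epsilon:
        a = random.choice(ACTIONS)        # explore
    else:
        a = max(q, key=q.get)             # exploit current estimates
    q[a] += alpha * (reward(a) - q[a])    # one-step value update

best = max(q, key=q.get)                  # converges to the medium level here
```

The same epsilon-greedy/value-update skeleton underlies many of the RL-based SON solutions the survey classifies; only the state, action, and reward definitions change per use case.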

    Network-based business process management: embedding business logic in communications networks

    Advanced Business Process Management (BPM) tools enable the decomposition of previously integrated and often ill-defined processes into re-usable process modules. These process modules can subsequently be distributed over the Internet across many different actors, each with their own specialization and economies of scale. The economic benefits of process specialization can be huge. However, how should actors in a business network find, select, and control the best partner for each part of the business process, in such a way that the best overall result is achieved? This management challenge requires more advanced techniques and tools in the enabling communications networks. An approach has been developed to embed business logic into the communications networks in order to optimize the allocation of business resources from a network point of view. Initial experimental results have been encouraging, while at the same time demonstrating the need for more robust techniques in a future of massively distributed business processes. Keywords: active networks; business process management; business protocols; embedded business logic; genetic algorithms; internet distributed process management; payment systems; programmable networks; resource optimization
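As a rough illustration of the genetic-algorithm idea mentioned in the keywords, the sketch below evolves an assignment of process modules to partner actors so that total cost is minimised. The cost table, population size, and operators are all illustrative assumptions, not the paper's actual method.

```python
import random

random.seed(1)
COSTS = [          # COSTS[module][provider]: hypothetical quotes
    [4, 2, 7],     # module 0
    [3, 6, 1],     # module 1
    [5, 4, 4],     # module 2
]
N_MODULES, N_PROVIDERS = len(COSTS), len(COSTS[0])

def fitness(chrom):
    # Higher fitness = lower total cost of the chosen providers.
    return -sum(COSTS[m][p] for m, p in enumerate(chrom))

def crossover(a, b):
    cut = random.randrange(1, N_MODULES)     # one-point crossover
    return a[:cut] + b[cut:]

def mutate(chrom):
    i = random.randrange(N_MODULES)          # reassign one module at random
    chrom[i] = random.randrange(N_PROVIDERS)
    return chrom

# Initial population: random module-to-provider assignments.
pop = [[random.randrange(N_PROVIDERS) for _ in range(N_MODULES)]
       for _ in range(20)]
for _ in range(40):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                       # truncation selection (elitist)
    pop = parents + [mutate(crossover(*random.sample(parents, 2)))
                     for _ in range(10)]

best = max(pop, key=fitness)                 # cheapest assignment found
```

With only three modules this is trivially solvable by enumeration; the point is the shape of the search loop, which scales to the large partner-selection spaces the abstract has in mind.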

    AI Solutions for MDS: Artificial Intelligence Techniques for Misuse Detection and Localisation in Telecommunication Environments

    This report considers the application of Artificial Intelligence (AI) techniques to the problem of misuse detection and misuse localisation within telecommunications environments. A broad survey of techniques is provided, covering inter alia rule-based systems, model-based systems, case-based reasoning, pattern matching, clustering and feature extraction, artificial neural networks, genetic algorithms, artificial immune systems, agent-based systems, data mining, and a variety of hybrid approaches. The report then considers the central issue of event correlation, which is at the heart of many misuse detection and localisation systems. The notion of being able to infer misuse by correlating individual, temporally distributed events within a multiple-data-stream environment is explored, and a range of techniques is examined, covering model-based approaches, `programmed' AI, and machine-learning paradigms. It is found that, in general, correlation is best achieved via rule-based approaches, but that these suffer from a number of drawbacks, such as the difficulty of developing and maintaining an appropriate knowledge base, and the inability to generalise from known misuses to new, unseen misuses. Two distinct approaches are evident. One attempts to encode knowledge of known misuses, typically within rules, and uses this to screen events. This approach cannot generally detect misuses for which it has not been programmed, i.e. it is prone to issuing false negatives. The other attempts to `learn' the features of event patterns that constitute normal behaviour and, by observing patterns that do not match expected behaviour, detect when a misuse has occurred. This approach is prone to issuing false positives, i.e. inferring misuse from innocent patterns of behaviour that the system was not trained to recognise.
Contemporary approaches are seen to favour hybridisation, often combining detection or localisation mechanisms for both abnormal and normal behaviour, the former to capture known cases of misuse, the latter to capture unknown cases. In some systems, these mechanisms even work together, updating each other to increase detection rates and lower false-positive rates. It is concluded that hybridisation offers the most promising future direction, but that a rule- or state-based component is likely to remain, being the most natural approach to the correlation of complex events. The challenge, then, is to mitigate the weaknesses of canonical programmed systems so that learning, generalisation, and adaptation are more readily facilitated.
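The rule-based event-correlation idea described above can be sketched as a small sliding-window correlator: an alert is raised when a given pattern of events occurs, in order, within a time window. The event names, pattern, and window size below are invented for illustration and stand in for the knowledge base such systems maintain.

```python
def correlate(events, pattern=("login_fail", "login_fail", "login_ok"),
              window=60):
    """Return timestamps at which `pattern` completes within `window` seconds.

    `events` is a time-ordered list of (timestamp, event_name) pairs.
    """
    alerts, recent = [], []
    for t, name in events:
        # Drop events that have fallen out of the sliding time window.
        recent = [(rt, rn) for rt, rn in recent if t - rt <= window]
        recent.append((t, name))
        # Rule fires if the pattern occurs as an ordered subsequence
        # of the events currently in the window.
        names_iter = iter(n for _, n in recent)
        if all(p in names_iter for p in pattern):
            alerts.append(t)
            recent = []          # reset so one burst yields one alert
    return alerts

# Two failed logins followed by a success within 60 s triggers the rule;
# isolated later events do not.
stream = [(0, "login_fail"), (10, "login_fail"), (20, "login_ok"),
          (200, "login_ok"), (300, "login_fail")]
hits = correlate(stream)         # -> [20]
```

The drawbacks the report identifies are visible even here: the pattern must be hand-written, and any misuse that does not match it passes silently, i.e. a false negative.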

    Privacy and Accountability in Black-Box Medicine

    Black-box medicine—the use of big data and sophisticated machine learning techniques for health-care applications—could be the future of personalized medicine. Black-box medicine promises to make it easier to diagnose rare diseases and conditions, identify the most promising treatments, and allocate scarce resources among different patients. But to succeed, it must overcome two separate but related problems: patient privacy and algorithmic accountability. Privacy is a problem because researchers need access to huge amounts of patient health information to generate useful medical predictions. And accountability is a problem because black-box algorithms must be verified by outsiders to ensure they are accurate and unbiased, but this means giving outsiders access to this health information. This article examines the tension between the twin goals of privacy and accountability and develops a framework for balancing that tension. It proposes three pillars for an effective system of privacy-preserving accountability: substantive limitations on the collection, use, and disclosure of patient information; independent gatekeepers regulating information sharing between those developing and verifying black-box algorithms; and information-security requirements to prevent unintentional disclosures of patient information. The article examines and draws on a similar debate in the field of clinical trials, where disclosing information from past trials can lead to new treatments but also threatens patient privacy.

    Cost based optimization for strategic mobile radio access network planning using metaheuristics

    The evolution of mobile communications over recent decades has been driven by two main factors: the emergence of new applications and user needs, and technological advances. The services offered to mobile terminals have evolved from the classic voice and short-message (SMS) services to more attractive ones, rapidly adopted by end users as a result, such as video telephony, video streaming, online gaming, and mobile broadband Internet access (MBAS). All these new services have become a reality thanks to technological advances such as new shared-medium access techniques, new coding and modulation schemes for the exchanged information, multiple-antenna transmission and reception systems (MIMO), etc. An important aspect of this evolution was the liberalisation of the sector in the early 1990s, in which the regulatory role played by the national regulatory authorities (NRAs) has been fundamental. One of the main problems addressed by each nation's NRA is the determination of the costs of wholesale services, that is, services between mobile operators, most notably the cost of call termination or interconnection. The interconnection service makes communication between users of different operators possible, and gives all users access to the full range of services, including those not provided by a particular operator, through the use of another operator's network. The main objective of this thesis is the minimisation of investment costs in network equipment, which has a direct bearing on the setting of interconnection tariffs, as will be seen throughout this work.
Achieving this objective is divided into two parts: first, the development of a set of algorithms for the optimal dimensioning of a radio access network (RAN) for a mobile communication system; second, the design and application of optimisation algorithms for the optimal distribution of services over the set of existing mobile technologies (OSDP). The network design module provides four distinct algorithms for dimensioning and planning the mobile access network. These algorithms are applied in an environment that is multi-technology, considering second- (2G), third- (3G), and fourth-generation (4G) systems; multi-user, taking into account different user profiles with their respective traffic loads; and multi-service, including voice, low-rate data services (64-144 kbps), and mobile broadband Internet access. The second part of the thesis distributes the set of services optimally over the technologies to be deployed. The goal of this part is to make efficient use of the existing technologies, reducing investment costs in network equipment. This is possible thanks to the technological differences between mobile systems, which make second-generation systems well suited to providing voice and short-messaging services, whereas third-generation networks perform better for data services. Finally, the mobile broadband service is native to the latest-generation networks, such as High Speed Packet Access (HSPA) and 4G.
Both modules have been applied to an extensive set of experiments supporting techno-economic analyses, such as studying the performance of HSPA and 4G technologies for providing the mobile broadband service, and analysing realistic deployment scenarios for the 4G networks whose roll-out will begin next year, coinciding with the auction of frequencies in the 800 MHz band. A study has also been carried out on the deployment of 4G networks in the 800 MHz, 1800 MHz, and 2600 MHz bands, comparing the investment costs obtained after optimisation. In all cases, the improvement in investment costs obtained after applying both modules has been demonstrated, enabling a reduction in the determination of service-provision costs. The studies in this thesis focus on Spain; however, all the implemented algorithms are applicable to any other European country, as evidenced by the fact that the network design algorithms have been used in several regulatory projects.
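The service-to-technology distribution (OSDP) step lends itself to a compact metaheuristic sketch. The simulated-annealing toy below assigns each service class to one technology so that a hypothetical equipment-cost table is minimised; the costs, cooling schedule, and parameters are illustrative assumptions, not the thesis's actual algorithms.

```python
import math
import random

random.seed(42)
SERVICES = ["voice", "low_rate_data", "mobile_broadband"]
TECHS = ["2G", "3G", "4G"]
COST = {  # hypothetical per-service equipment cost on each technology
    ("voice", "2G"): 1, ("voice", "3G"): 2, ("voice", "4G"): 3,
    ("low_rate_data", "2G"): 4, ("low_rate_data", "3G"): 2,
    ("low_rate_data", "4G"): 3,
    ("mobile_broadband", "2G"): 9, ("mobile_broadband", "3G"): 5,
    ("mobile_broadband", "4G"): 2,
}

def total_cost(assign):
    return sum(COST[(s, t)] for s, t in assign.items())

# Start from a random assignment and anneal toward lower cost.
assign = {s: random.choice(TECHS) for s in SERVICES}
temp = 5.0
while temp > 0.01:
    s = random.choice(SERVICES)
    cand = dict(assign, **{s: random.choice(TECHS)})  # perturb one service
    delta = total_cost(cand) - total_cost(assign)
    # Always accept improvements; accept worse moves with a probability
    # that shrinks as the temperature cools.
    if delta < 0 or random.random() < math.exp(-delta / temp):
        assign = cand
    temp *= 0.95
```

Under this cost table the search settles on the pattern the abstract describes: voice on 2G, low-rate data on 3G, and mobile broadband on the latest-generation network.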