
    SIMULATING LONG TERM EVOLUTION SELF-OPTIMIZING BASED NETWORKS

    With the first 3rd Generation Partnership Project (3GPP) Long Term Evolution (LTE) networks being deployed, more complexity is added to existing cellular mobile networks, and more capital (CAPEX) and operational (OPEX) expenditure will be needed. In addition, users' rising demand for new services and higher data rates requires more efficiency from operators. To address this, 3GPP Release 8 has introduced the Self-Organizing Network (SON) concept, a set of self-configuration, self-optimizing and self-healing functions that allow the automation of labor-intensive tasks, reducing operational and capital costs. While the pressure to cut operational expenditure remains, operators are still skeptical about the efficiency of these functions. In this paper, Physical Cell Identity (PCI) conflict detection and resolution, Automatic Neighbor Relation (ANR) and automatic Handover Parameter Optimization (HPO) functions are proposed as part of a simulator for LTE SON-based networks. Based on user-defined inputs, these functions allow operators to closely predict and gather optimal policy input values for SON algorithms while maintaining desirable network performance. Based on a real network scenario, results show the simulator's clear benefit compared with other proposals.
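
    For illustration, the sketch below shows how a PCI conflict check of the kind such a simulator automates might look, using a simple neighbor-relation table. The class and function names are hypothetical and are not taken from the paper's simulator.

```python
# Minimal sketch of PCI collision/confusion detection over a neighbor-relation
# table; Cell and detect_pci_conflicts are illustrative names, not the paper's.
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class Cell:
    cell_id: str
    pci: int                                      # Physical Cell Identity, 0..503 in LTE
    neighbors: set = field(default_factory=set)   # cell_ids of neighboring cells

def detect_pci_conflicts(cells):
    """Return (collisions, confusions) found in the neighbor-relation table."""
    by_id = {c.cell_id: c for c in cells}
    collisions, confusions = [], []
    for cell in cells:
        pci_to_neighbors = defaultdict(list)
        for nid in cell.neighbors:
            neighbor = by_id[nid]
            # Collision: a cell and a direct neighbor share the same PCI.
            if neighbor.pci == cell.pci:
                collisions.append((cell.cell_id, nid))
            pci_to_neighbors[neighbor.pci].append(nid)
        # Confusion: two different neighbors of the same cell share a PCI.
        for pci, nids in pci_to_neighbors.items():
            if len(nids) > 1:
                confusions.append((cell.cell_id, pci, nids))
    return collisions, confusions
```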

    Final Report of the ModSysC2020 Working Group - Data, Models and Theories for Complex Systems: new challenges and opportunities

    Final Report of the ModSysC2020 Working Group at University Montpellier 2. At University Montpellier 2, the modeling and simulation of complex systems has been identified as a major scientific challenge and one of the priority axes in interdisciplinary research, with major potential impact on training, the economy and society. Many research groups and laboratories in Montpellier are already working in that direction, but typically in isolation within their own scientific discipline. Several local actions have been initiated in order to structure the scientific community with interdisciplinary projects, but with little coordination among the actions. The goal of the ModSysC2020 (modeling and simulation of complex systems in 2020) working group was to analyze the local situation (strengths and weaknesses, current projects), identify the critical research directions and propose concrete actions in terms of research projects, equipment facilities, human resources and training to be encouraged. To guide this perspective, we decomposed the scientific challenge into four main themes, for which there is a strong background in Montpellier: (1) modeling and simulation of complex systems; (2) algorithms and computing; (3) scientific data management; (4) production, storage and archiving of data from the observation of natural and biological media. In this report, for each theme, we introduce the context and motivations, analyze the situation in Montpellier, identify research directions and propose specific actions in terms of interdisciplinary research projects and training. We also provide an analysis of the socio-economic aspects of modeling and simulation through use cases in various domains such as life science and healthcare, environmental science and energy. Finally, we discuss the importance of revisiting student training in fundamental domains such as modeling, computer programming and databases, which are typically taught too late, in specialized master's programs.

    Contribution to the study of Self Organizing & Self Optimizing Network techniques for LTE mobile communications networks

    The continuous appearance of new services that consume large amounts of bandwidth, together with users' growing demand for Internet access wherever they happen to be, is considerably increasing the complexity of mobile telephone networks. Current mechanisms make it possible to manage the limited radio resources in heterogeneous GSM/UMTS/LTE networks. By establishing a hierarchical cell structure and multilayer management functions, voice and data traffic can be distributed so as to offer customers the best service. Moreover, user mobility and geographical distribution vary greatly within a single territory, so networks must also adapt to this scenario. Urban environments require a network design with many elements, in which cells of different types (macro, micro, pico, etc.) coexist according to service demand, customer concentration and mobility. In addition, these networks have evolved technologically: in Spain, GSM 900, DCS 1800, UMTS 2100, UMTS 900 and LTE 1800 systems (and, in the short term, LTE 800) are currently in service simultaneously, with a high degree of interaction and handovers between them. To provide service continuity across all these networks, and to maximize the return on the investments already made in the existing infrastructure (GSM, UMTS R99, HSPA), the LTE standard defines an "interworking" mechanism, a set of functions that allows all the networks offering mobile telephone service to interact. This increase in complexity will require new investment in order to properly manage network optimization, operation and maintenance and, consequently, to adapt to a fast-changing market in which new services, terminals and business models constantly appear. In this context the SON (Self Organizing Networks) concept emerges, whose objective is to automate laborious tasks in the configuration, commissioning and optimization of network parameters, as well as to respond adequately to unexpected events, thereby reducing operating costs and substantially improving network quality. User mobility is one of the main characteristics of the mobile telephone service, so one way to guarantee quality is a correct design of the neighbor relations between nearby cells, such that the terminal always has a cell to attach to while it moves, thus giving continuity to the service. This thesis aims to design an automatic optimization algorithm that generates neighbor relations between cells in order to offer the best possible network quality. This work presents theoretical studies on the characteristics of the cellular structures used in mobile communications networks, with special emphasis on interference and overreach as the main factors limiting coverage, capacity and data rate. It also details current trends in the different optimization methods and presents a series of case studies carried out in Telefónica's UMTS network, whose results can be extrapolated to the LTE system.
    As a solution to the constraints of a real network compared with theoretical designs, this thesis proposes an ANR (Automatic Neighbour Relation) algorithm for generating neighbour lists that optimize handovers between cells, increasing the number of completed calls and reducing the number of calls dropped or interrupted on the radio link. Propagation models show the differences between free-space loss and loss in complex urban environments, as well as the effects that terminal mobility produces on the characterization of the mobile channel. This is especially relevant in heterogeneous scenarios where some users communicate over nearly free-space paths while others receive a signal that is strongly attenuated and distorted after multiple reflections. Finding an optimal solution that offers good coverage, quality and capacity at the lowest cost is in this case a very complex task, so the different lines of research in this area must be studied. In addition, the arrival of LTE brings with it a cheaper and simpler network structure, which includes the automation of optimization tasks. Following the consensus reached in different international forums and projects, this thesis describes the use cases and the most common algorithms and methods proposed by several authors. To evaluate the characteristics of the test scenarios, measurement analyses and simulations were carried out using innovative methods in order to characterize the system under high interference. In a first phase, changes were made to the configuration of isolated cells in order to check the mutual dependence that exists between all cells. This procedure demonstrated that in dense scenarios it is not possible to optimize the performance of a cell or base station without affecting the surrounding cells. Based on the conclusions of these tests, a novel ANR algorithm was designed that proposes different neighbor lists according to a set of constraints. Power and interference measurements obtained with a call-trace tool showed very scattered and sometimes contradictory values, which makes it difficult to select the best adjacent cell to complete a handover as subscribers move. The algorithm is able to combine the best options over a very wide, high-interference area of the network, which represents a considerable improvement over the usual methods of neighbor list definition. The algorithm was applied to Telefónica's UMTS network in the Valle de la Orotava, which includes the towns of Puerto de la Cruz, La Orotava and Los Realejos. Owing to the topography of the valley and the dispersion of the population, interference and overreach levels are higher than in a normal urban scenario, so network performance is lower. With the proposed method, the total number of successful calls increased and the number of dropped or interrupted calls was reduced.
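
    As a rough illustration of the neighbor-list idea described above (not the thesis's actual ANR algorithm), the following sketch ranks candidate neighbors by averaged measured power taken from scattered measurement reports and keeps the strongest ones under simple constraints; all names and thresholds are assumptions.

```python
# Illustrative sketch of neighbor-list generation from scattered UE measurement
# reports, using plain averaging; parameter values are placeholders.
from collections import defaultdict
from statistics import mean

def build_neighbor_list(serving_cell, reports, max_neighbors=16, min_rsrp_dbm=-110.0):
    """reports: iterable of (serving_cell_id, detected_cell_id, rsrp_dbm) tuples.
    Returns detected cells ranked by mean measured power for the serving cell."""
    samples = defaultdict(list)
    for serving, detected, rsrp in reports:
        if serving == serving_cell and detected != serving_cell:
            samples[detected].append(rsrp)
    # Average the scattered measurements, drop weak candidates, keep the strongest.
    ranked = sorted(
        ((cell, mean(vals)) for cell, vals in samples.items() if mean(vals) >= min_rsrp_dbm),
        key=lambda item: item[1],
        reverse=True,
    )
    return [cell for cell, _ in ranked[:max_neighbors]]

# Example: three measurement reports seen by cell "A".
reports = [("A", "B", -85.0), ("A", "B", -90.0), ("A", "C", -105.0)]
print(build_neighbor_list("A", reports))  # ['B', 'C']
```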

    Sensors Fault Diagnosis Trends and Applications

    Fault diagnosis has always been a concern for industry. In general, diagnosis in complex systems requires the acquisition of information from sensors and the processing and extraction of the required features for the classification or identification of faults. Therefore, fault diagnosis of sensors is clearly important, as faulty information from a sensor may lead to misleading conclusions about the whole system. As engineering systems grow in size and complexity, it becomes more and more important to diagnose faulty behavior before it can lead to total failure. In light of the above issues, this book is dedicated to trends and applications in modern sensor fault diagnosis.
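
    As a toy illustration of the acquire/extract/classify idea (not taken from the book), the sketch below flags samples where two nominally redundant sensors disagree beyond an assumed 3-sigma threshold.

```python
# Toy residual-based sensor fault check; sensor signals and the 3-sigma
# threshold are assumptions made for illustration only.
import numpy as np

def flag_faulty_samples(primary, redundant, sigma=3.0):
    """Flag samples where two nominally redundant sensors disagree too much."""
    residual = np.asarray(primary) - np.asarray(redundant)
    threshold = sigma * residual.std()
    return np.abs(residual - residual.mean()) > threshold

primary = np.sin(np.linspace(0, 10, 200))
redundant = primary + np.random.default_rng(1).normal(scale=0.01, size=200)
redundant[120] += 0.5                       # inject a spurious reading
print(np.flatnonzero(flag_faulty_samples(primary, redundant)))  # -> [120]
```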

    A formal architecture-centric and model driven approach for the engineering of science gateways

    From n-tier client/server applications, to more complex academic Grids, or even the most recent and promising industrial Clouds, the last decade has witnessed significant developments in distributed computing. In spite of this conceptual heterogeneity, Service-Oriented Architecture (SOA) seems to have emerged as the common underlying abstraction paradigm, even though different standards and technologies are applied across application domains. Suitable access to data and algorithms resident in SOAs via so-called ‘Science Gateways’ has thus become a pressing need in order to realize the benefits of distributed computing infrastructures. In an attempt to inform service-oriented systems design and development in Grid-based biomedical research infrastructures, the applicant has consolidated work from three complementary experiences in European projects, which have developed and deployed large-scale, production-quality infrastructures and, more recently, Science Gateways to support research in breast cancer, pediatric diseases and neurodegenerative pathologies respectively. In analyzing the requirements from these biomedical applications, the applicant was able to elaborate on commonly faced issues in Grid development and deployment, while proposing an adapted and extensible engineering framework. Grids implement a number of protocols, applications and standards, and attempt to virtualize and harmonize access to them. Most Grid implementations are therefore instantiated as superposed software layers, often resulting in a low quality of services and quality of applications, making design and development increasingly complex and rendering classical software engineering approaches unsuitable for Grid development. The applicant proposes the application of a formal Model-Driven Engineering (MDE) approach to service-oriented development, making it possible to define Grid-based architectures and Science Gateways that satisfy quality-of-service requirements, execution platform and distribution criteria at design time. A novel investigation is thus presented on the applicability of the resulting grid MDE (gMDE) to specific examples, and conclusions are drawn on the benefits of this approach and its possible application to other areas, in particular Distributed Computing Infrastructures (DCI) interoperability, Science Gateways and Cloud architecture development.

    Smart Classifiers and Bayesian Inference for Evaluating River Sensitivity to Natural and Human Disturbances: A Data Science Approach

    Excessive rates of channel adjustment and riverine sediment export represent societal challenges; impacts include: degraded water quality and ecological integrity, erosion hazards to infrastructure, and compromised public safety. The nonlinear nature of sediment erosion and deposition within a watershed and the variable patterns in riverine sediment export over a defined timeframe of interest are governed by many interrelated factors, including geology, climate and hydrology, vegetation, and land use. Human disturbances to the landscape and river networks have further altered these patterns of water and sediment routing. An enhanced understanding of river sediment sources and dynamics is important for stakeholders, and will become more critical under a nonstationary climate, as sediment yields are expected to increase in regions of the world that will experience increased frequency, persistence, and intensity of storm events. Practical tools are needed to predict sediment erosion, transport and deposition and to characterize sediment sources within a reasonable measure of uncertainty. Water resource scientists and engineers use multidimensional data sets of varying types and quality to answer management-related questions, and the temporal and spatial resolution of these data are growing exponentially with the advent of automated samplers and in situ sensors (i.e., “big data”). Data-driven statistics and classifiers have great utility for representing system complexity and can often be more readily implemented in an adaptive management context than process-based models. Parametric statistics are often of limited efficacy when applied to data of varying quality, mixed types (continuous, ordinal, nominal), censored or sparse data, or when model residuals do not conform to Gaussian distributions. Data-driven machine-learning algorithms and Bayesian statistics have advantages over Frequentist approaches for data reduction and visualization; they allow for non-normal distribution of residuals and greater robustness to outliers. This research applied machine-learning classifiers and Bayesian statistical techniques to multidimensional data sets to characterize sediment source and flux at basin, catchment, and reach scales. These data-driven tools enabled better understanding of: (1) basin-scale spatial variability in concentration-discharge patterns of instream suspended sediment and nutrients; (2) catchment-scale sourcing of suspended sediments; and (3) reach-scale sediment process domains. The developed tools have broad management application and provide insights into landscape drivers of channel dynamics and riverine solute and sediment export
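
    To make the data-driven classification idea concrete, here is a minimal, hypothetical sketch of a random forest classifier separating two assumed sediment-source classes from synthetic watershed features; it is not the study's actual model or data.

```python
# Minimal sketch assuming scikit-learn; predictors, labels, and the two-class
# sediment-source framing are synthetic placeholders, not the study's data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Synthetic predictors: e.g., discharge, turbidity, land-use fraction (200 samples).
X = rng.normal(size=(200, 3))
# Synthetic labels: 0 = upland source, 1 = channel/bank source.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)   # tree ensembles tolerate non-Gaussian data
print("mean CV accuracy:", scores.mean())
clf.fit(X, y)
print("feature importances:", clf.feature_importances_)
```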

    A reusable benchmark of brain-age prediction from M/EEG resting-state signals

    Population-level modeling can define quantitative measures of individual aging by applying machine learning to large volumes of brain images. These measures of brain age, obtained from the general population, have helped characterize disease severity in neurological populations, improving estimates of diagnosis or prognosis. Magnetoencephalography (MEG) and electroencephalography (EEG) have the potential to further generalize this approach towards prevention and public health by enabling assessments of brain health at large scales in socioeconomically diverse environments. However, more research is needed to define methods that can handle the complexity and diversity of M/EEG signals across diverse real-world contexts. To catalyse this effort, here we propose reusable benchmarks of competing machine learning approaches for brain age modeling. We benchmarked popular classical machine learning pipelines and deep learning architectures previously used for pathology decoding or brain age estimation in four international M/EEG cohorts from diverse countries and cultural contexts, including recordings from more than 2500 participants. Our benchmarks were built on top of the M/EEG adaptations of the BIDS standard, providing tools that can be applied with minimal modification to any M/EEG dataset provided in the BIDS format. Our results suggest that, regardless of whether classical machine learning or deep learning was used, the highest performance was reached by pipelines and architectures involving spatially aware representations of the M/EEG signals, leading to R² scores between 0.60 and 0.71. Hand-crafted features paired with random forest regression provided robust benchmarks even in situations in which other approaches failed. Taken together, this set of benchmarks, accompanied by open-source software and high-level Python scripts, can serve as a starting point and quantitative reference for future efforts at developing M/EEG-based measures of brain aging. The generality of the approach renders this benchmark reusable for other related objectives such as modeling specific cognitive variables or clinical endpoints.
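
    As a rough sketch of the "hand-crafted features plus random forest regression" baseline mentioned above (using synthetic placeholder features rather than the benchmark's actual M/EEG pipelines), one could evaluate cross-validated R² like this:

```python
# Minimal sketch assuming scikit-learn; features and ages are synthetic
# stand-ins for real M/EEG-derived features, for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_subjects, n_features = 300, 60
age = rng.uniform(20, 80, n_subjects)          # placeholder chronological ages
X = rng.normal(size=(n_subjects, n_features))  # placeholder hand-crafted features
X[:, :10] += age[:, None] * 0.05               # a few features weakly encode age

model = RandomForestRegressor(n_estimators=500, random_state=42)
r2 = cross_val_score(model, X, age, cv=10, scoring="r2")
print(f"cross-validated R^2: {r2.mean():.2f} +/- {r2.std():.2f}")
```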

    Hydrolink 2019/1. Drone

    Topic: Drone