
    Machine Learning at the Edge: A Data-Driven Architecture with Applications to 5G Cellular Networks

    The fifth generation of cellular networks (5G) will rely on edge cloud deployments to satisfy the ultra-low latency demand of future applications. In this paper, we argue that such deployments can also be used to enable advanced data-driven and Machine Learning (ML) applications in mobile networks. We propose an edge-controller-based architecture for cellular networks and evaluate its performance with real data from hundreds of base stations of a major U.S. operator. In this regard, we provide insights on how to dynamically cluster and associate base stations and controllers according to the global mobility patterns of the users. We then describe how the controllers can be used to run ML algorithms to predict the number of users in each base station, and a use case in which these predictions are exploited by a higher-layer application to route vehicular traffic according to network Key Performance Indicators (KPIs). We show that the prediction accuracy improves when the machine learning algorithms rely on the controllers' view, and consequently on the spatial correlation introduced by user mobility, compared to predictions based only on the local data of each individual base station. Comment: 15 pages, 10 figures, 5 tables. IEEE Transactions on Mobile Computing.
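
    A minimal sketch of the controller-view prediction idea (synthetic data; KMeans clustering and ridge regression are illustrative assumptions, not the paper's models): base stations are clustered by their load profiles, and the next-slot user count of one station is predicted from the lagged counts of its whole cluster versus its own history only.

```python
# Sketch only: cluster base stations by mobility/load profiles, then compare a
# cluster-wide predictor against a local-only predictor for one base station.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_bs, n_slots = 40, 500
counts = rng.poisson(lam=50, size=(n_bs, n_slots)).astype(float)  # users per BS per slot

# Cluster base stations by their average daily load profile (assumed 24-slot days).
profiles = counts[:, : 24 * (n_slots // 24)].reshape(n_bs, -1, 24).mean(axis=1)
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(profiles)

target_bs = 0
peers = np.flatnonzero(clusters == clusters[target_bs])

# Lag-1 features: controller (cluster-wide) view vs. local-only view.
X_cluster, X_local = counts[peers, :-1].T, counts[[target_bs], :-1].T
y = counts[target_bs, 1:]

split = int(0.8 * len(y))
for name, X in [("cluster view", X_cluster), ("local only", X_local)]:
    model = Ridge().fit(X[:split], y[:split])
    mae = np.abs(model.predict(X[split:]) - y[split:]).mean()
    print(f"{name}: MAE = {mae:.2f}")
```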

    Exploiting Map Topology Knowledge for Context-predictive Multi-interface Car-to-cloud Communication

    While the automotive industry is currently facing a contest among different communication technologies and paradigms for predominance in the connected-vehicles sector, the diversity of application requirements makes it unlikely that a single technology will be able to fulfill all given demands. Instead, the joint usage of multiple communication technologies is a promising candidate, as it allows benefiting from their characteristic strengths (e.g., using low-latency direct communication for safety-related messaging). Consequently, dynamic network interface selection has become a field of scientific interest. In this paper, we present a cross-layer approach for context-aware transmission of vehicular sensor data that exploits mobility control knowledge for scheduling the transmission time with respect to the anticipated channel conditions of the corresponding communication technology. The proposed multi-interface transmission scheme is evaluated in a comprehensive simulation study, where it achieves significant improvements in data rate and reliability.
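
    As a rough illustration of the scheduling idea (the route forecasts, interface names, and deadline handling below are assumptions for the sketch, not the paper's scheme), a delay-tolerant transmission can be deferred to the upcoming road segment and interface with the best anticipated channel conditions:

```python
# Sketch only: pick the (road segment, interface) with the highest predicted
# data rate before a delivery deadline, instead of transmitting immediately.
from dataclasses import dataclass

@dataclass
class SegmentForecast:
    segment_id: str
    eta_s: float                     # anticipated arrival time on this segment [s]
    rate_mbps: dict[str, float]      # predicted rate per interface on this segment

def schedule_transmission(forecasts: list[SegmentForecast], deadline_s: float):
    """Return the (segment, interface, rate) with the best predicted rate before the deadline."""
    best = None
    for seg in forecasts:
        if seg.eta_s > deadline_s:
            continue
        iface, rate = max(seg.rate_mbps.items(), key=lambda kv: kv[1])
        if best is None or rate > best[2]:
            best = (seg.segment_id, iface, rate)
    return best

route = [
    SegmentForecast("A", 5.0, {"LTE": 4.0, "WiFi": 0.0}),
    SegmentForecast("B", 20.0, {"LTE": 18.0, "WiFi": 30.0}),
    SegmentForecast("C", 60.0, {"LTE": 9.0, "WiFi": 0.0}),
]
print(schedule_transmission(route, deadline_s=45.0))  # -> ('B', 'WiFi', 30.0)
```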

    Towards Data-driven Simulation of End-to-end Network Performance Indicators

    Novel vehicular communication methods are mostly analyzed simulatively or analytically, as real-world performance tests are highly time-consuming and cost-intensive. Moreover, the high number of uncontrollable effects makes it practically impossible to reevaluate different approaches under exactly the same conditions. However, as these methods massively simplify the effects of the radio environment and various cross-layer interdependencies, the results for end-to-end indicators (e.g., the resulting data rate) often differ significantly from real-world measurements. In this paper, we present a data-driven approach that exploits a combination of multiple machine learning methods for modeling the end-to-end behavior of network performance indicators within vehicular networks. The proposed approach can be exploited for fast and close-to-reality evaluation and optimization of new methods in a controllable environment, as it implicitly considers cross-layer dependencies between measurable features. Within an example case study for opportunistic vehicular data transfer, the proposed approach is validated against real-world measurements and a classical system-level network simulation setup. Although the proposed method requires only a fraction of the computation time of the latter, it achieves a significantly better match with the real-world evaluations.
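
    A minimal sketch of such a data-driven surrogate, assuming synthetic training data and an illustrative feature set (RSRP, SINR, speed, payload size) rather than the paper's exact setup, combines several regressors into one model of the end-to-end data rate:

```python
# Sketch only: a stacked combination of ML regressors as a data-driven surrogate
# that maps measurable context features to an end-to-end data rate.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 2000
X = np.column_stack([
    rng.uniform(-110, -70, n),   # RSRP [dBm]
    rng.uniform(0, 25, n),       # SINR [dB]
    rng.uniform(0, 30, n),       # vehicle speed [m/s]
    rng.uniform(0.1, 5.0, n),    # payload size [MB]
])
# Assumed ground-truth relation, used only to generate the synthetic example data.
y = 0.3 * (X[:, 0] + 110) + 1.2 * X[:, 1] - 0.2 * X[:, 2] + rng.normal(0, 2, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
surrogate = StackingRegressor(
    estimators=[("rf", RandomForestRegressor(n_estimators=100, random_state=0)),
                ("gb", GradientBoostingRegressor(random_state=0))],
    final_estimator=Ridge(),
)
surrogate.fit(X_tr, y_tr)
print("MAE [Mbit/s]:", np.abs(surrogate.predict(X_te) - y_te).mean())
```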

    Prediction-based techniques for the optimization of mobile networks

    International Mention in the doctoral degree. Mobile cellular networks are complex systems whose behavior is characterized by the superposition of several random phenomena, most of them related to human activities such as mobility, communications, and network usage. However, when observed in their totality, the many individual components merge into more deterministic patterns, and trends start to become identifiable and predictable. In this thesis we analyze a recent branch of network optimization that is commonly referred to as anticipatory networking and that entails the combination of prediction solutions and network optimization schemes. The main intuition behind anticipatory networking is that knowing in advance what is going on in the network can help to understand potentially severe problems and to mitigate their impact by applying solutions while they are still in their initial stages. Conversely, network forecasts might also indicate a future improvement in the overall network condition (e.g., reduced load or better signal quality reported by users). In such a case, resources can be assigned more sparingly, requiring users to rely on buffered information while waiting for the better conditions under which it will be more convenient to grant more resources. At the beginning of this thesis we survey the current anticipatory networking panorama and the many prediction and optimization solutions proposed so far. In the main body of the work, we propose our novel solutions to the problem, as well as the tools and methodologies we designed to evaluate them and to perform a real-world evaluation of our schemes. By the end of this work it will be clear that not only is anticipatory networking a very promising theoretical framework, but it is also feasible and can deliver substantial benefits to current and next-generation mobile networks. In fact, with both our theoretical and practical results we show evidence that more than one third of the resources can be saved, and that even larger gains can be achieved for data rate enhancements. Official Doctoral Programme in Telematic Engineering (Programa Oficial de Doctorado en Ingeniería Telemática). Committee: President: Albert Banchs Roca; President: Pablo Serrano Yañez-Mingot; Secretary: Jorge Ortín Gracia; Member: Guevara Noubi
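
    A minimal sketch of the anticipatory idea (illustrative numbers and units; not one of the thesis's actual schemes): if a user's per-slot channel quality can be predicted, a fixed data demand can be served by allocating resources preferentially in the slots predicted to be best, so fewer resources are spent while conditions are poor.

```python
# Sketch only: fill the slots with the best predicted spectral efficiency first,
# deferring most of the allocation until conditions are forecast to improve.
def anticipatory_allocation(predicted_eff, demand_bits, capacity_rb_per_slot, bits_per_rb_per_eff=1000):
    """Return resource blocks per slot, serving the demand in the best predicted slots first."""
    alloc = [0] * len(predicted_eff)
    remaining = demand_bits
    # Visit slots from best to worst predicted efficiency.
    for slot in sorted(range(len(predicted_eff)), key=lambda s: -predicted_eff[s]):
        if remaining <= 0:
            break
        bits_per_rb = predicted_eff[slot] * bits_per_rb_per_eff
        rbs = min(capacity_rb_per_slot, -(-remaining // bits_per_rb))  # ceiling division
        alloc[slot] = int(rbs)
        remaining -= rbs * bits_per_rb
    return alloc

# User predicted to move from poor to good coverage: most resources are deferred
# to the later, better slots instead of being spent while the channel is poor.
print(anticipatory_allocation([0.5, 0.8, 3.0, 4.5], demand_bits=400_000, capacity_rb_per_slot=50))
```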

    Anticipatory Buffer Control and Quality Selection for Wireless Video Streaming

    Video streaming is in high demand by mobile users, as recent studies indicate. In cellular networks, however, the unreliable wireless channel leads to two major problems. Poor channel states degrade video quality and interrupt the playback when a user cannot sufficiently fill its local playout buffer: buffer underruns occur. In contrast, good channel conditions cause common greedy buffering schemes to pile up very long buffers. Such over-buffering wastes expensive wireless channel capacity. To keep buffering in balance, we employ a novel approach. Assuming that we can predict data rates, we plan the quality and download time of the video segments ahead. This anticipatory scheduling avoids buffer underruns by downloading a large number of segments before a channel outage occurs, without wasting wireless capacity by excessive buffering. We formalize this approach as an optimization problem and derive practical heuristics for segmented video streaming protocols (e.g., HLS or MPEG-DASH). Simulation results and testbed measurements show that our solution essentially eliminates playback interruptions without significantly decreasing video quality.
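
    The planning step can be sketched as a simple heuristic (assumed segment duration, bitrate ladder, and rate trace; not the paper's exact formulation): for each segment, pick the highest quality whose download, simulated against the predicted rate trace, still finishes before that segment's playback deadline.

```python
# Sketch only: anticipatory quality selection against a predicted per-second rate trace.
SEG_DURATION_S = 2.0
BITRATES_MBPS = [0.8, 1.5, 3.0, 6.0]            # available quality levels

def simulate_download(rates, start_t, size_mbit):
    """Walk the per-second predicted rate trace until size_mbit has been transferred."""
    t, remaining = start_t, size_mbit
    while remaining > 1e-9:
        sec = int(t)
        if sec >= len(rates):
            return None                          # prediction horizon exceeded
        if rates[sec] <= 0:
            t = float(sec + 1)                   # predicted outage: wait it out
            continue
        dt = min(remaining / rates[sec], sec + 1 - t)
        remaining -= rates[sec] * dt
        t += dt
    return t

def plan_segments(predicted_rate_mbps, n_segments, startup_delay_s=2.0):
    """For each segment, pick the highest bitrate that still meets its playback deadline."""
    plan, t = [], 0.0                            # t = current position of the download clock
    for i in range(n_segments):
        deadline = startup_delay_s + i * SEG_DURATION_S
        for bitrate in sorted(BITRATES_MBPS, reverse=True):
            finish = simulate_download(predicted_rate_mbps, t, bitrate * SEG_DURATION_S)
            if finish is not None and finish <= deadline:
                plan.append((i, bitrate))
                t = finish
                break
        else:                                    # even the lowest quality misses its deadline
            plan.append((i, BITRATES_MBPS[0]))
            t = simulate_download(predicted_rate_mbps, t, BITRATES_MBPS[0] * SEG_DURATION_S) or t
    return plan

# Good channel for 10 s, then a predicted 6 s outage: segments whose downloads would
# cross the outage are planned at lower quality, and all playback deadlines are met.
trace = [8.0] * 10 + [0.0] * 6 + [8.0] * 10
print(plan_segments(trace, n_segments=12))
```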

    Machine learning based context-predictive car-to-cloud communication using multi-layer connectivity maps for upcoming 5G networks

    While cars were long considered only a means of personal transportation, they are currently evolving into mobile sensor nodes that gather highly up-to-date information for crowdsensing-enabled big data services in a smart city context. Consequently, upcoming 5G communication networks will be confronted with massive increases in Machine-type Communication (MTC) and require resource-efficient transmission methods in order to optimize the overall system performance and provide interference-free coexistence with human data traffic using the same public cellular network. In this paper, we bring together mobility prediction and machine learning based channel quality estimation in order to improve the resource efficiency of car-to-cloud data transfer by scheduling the transmission time of the sensor data with respect to the anticipated behavior of the communication context. In a comprehensive field evaluation campaign, we evaluate the proposed context-predictive approach in a public cellular network scenario, where it is able to increase the average data rate by up to 194% while simultaneously reducing the mean uplink power consumption by up to 54%.
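
    A minimal sketch of the context-predictive transfer idea (the grid-map data model and threshold rule are assumptions for illustration, not the paper's exact scheme): a connectivity map stores learned data-rate estimates per geographic cell, and buffered sensor data is transmitted only when the predicted rate at the current or an imminent position is favorable.

```python
# Sketch only: connectivity-map lookup plus a threshold rule for opportunistic
# car-to-cloud transmission of buffered, delay-tolerant sensor data.
from collections import defaultdict

class ConnectivityMap:
    """Grid cell -> running mean of measured/predicted uplink data rate [Mbit/s]."""
    def __init__(self, cell_size_m=25.0):
        self.cell_size_m = cell_size_m
        self.stats = defaultdict(lambda: [0.0, 0])    # cell -> [rate sum, sample count]

    def _cell(self, x, y):
        return (int(x // self.cell_size_m), int(y // self.cell_size_m))

    def update(self, x, y, rate_mbps):
        s = self.stats[self._cell(x, y)]
        s[0] += rate_mbps
        s[1] += 1

    def predict(self, x, y, default=1.0):
        s = self.stats[self._cell(x, y)]
        return s[0] / s[1] if s[1] else default

def should_transmit(cmap, position, predicted_positions, threshold_mbps=10.0):
    """Transmit now only if the current cell is good and no upcoming cell is predicted to be better."""
    now = cmap.predict(*position)
    ahead = max((cmap.predict(*p) for p in predicted_positions), default=0.0)
    return now >= threshold_mbps and now >= ahead

cmap = ConnectivityMap()
cmap.update(10, 10, 3.0)      # poor cell (e.g., cell edge)
cmap.update(120, 10, 25.0)    # good cell further along the route
print(should_transmit(cmap, (12, 11), [(118, 12)]))   # False: defer to the better cell ahead
print(should_transmit(cmap, (118, 12), [(200, 15)]))  # True: transmit in the good cell
```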