
    TCP in 5G mmWave Networks: Link Level Retransmissions and MP-TCP

    MmWave communications, one of the cornerstones of future 5G mobile networks, are characterized at the same time by a potential multi-gigabit capacity and by a very dynamic channel, sensitive to blockage, wide fluctuations in the received signal quality, and possibly also sudden link disruption. While the performance of physical and MAC layer schemes that address these issues has been thoroughly investigated in the literature, the complex interactions between mmWave links and transport layer protocols such as TCP are still relatively unexplored. This paper uses the ns-3 mmWave module, with its channel model based on real measurements in New York City, to analyze the performance of the Linux TCP/IP stack (i) with and without link-layer retransmissions, showing that they are fundamental to reach a high TCP throughput on mmWave links, and (ii) with Multipath TCP (MP-TCP) over multiple LTE and mmWave links, illustrating the throughput-optimal combinations of secondary paths and congestion control algorithms under different conditions.
    Comment: 6 pages, 11 figures, accepted for presentation at the 2017 IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS)
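
    The key intuition here, that link-layer retransmissions shrink the loss rate TCP itself observes, can be illustrated with a back-of-envelope calculation. The sketch below uses the classic Mathis approximation B ≈ (MSS/RTT)·sqrt(3/(2p)) rather than the paper's ns-3 simulations, and all link parameters are hypothetical.

```python
# Back-of-envelope illustration (not the paper's ns-3 methodology): with up to
# k link-layer retransmissions and per-transmission loss q, the residual loss
# that reaches TCP is q^(k+1); the Mathis model then gives the throughput gain.
import math

MSS = 1460 * 8          # segment size in bits (hypothetical)
RTT = 0.020             # round-trip time in seconds (hypothetical)
q = 0.05                # per-transmission loss probability on the mmWave link

def mathis_throughput(p):
    """Approximate steady-state TCP throughput (bit/s) at loss rate p."""
    return (MSS / RTT) * math.sqrt(3.0 / (2.0 * p))

for k in [0, 1, 2, 3]:                  # k link-layer retransmission attempts
    residual = q ** (k + 1)             # loss rate that survives to TCP
    print(f"k={k}: residual loss {residual:.2e}, "
          f"throughput ~ {mathis_throughput(residual)/1e6:.1f} Mbit/s")
```

    Even a handful of retransmission attempts drives the residual loss toward zero, consistent with the paper's finding that link-layer retransmissions are fundamental for high TCP throughput on mmWave links.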

    WiLiTV: A Low-Cost Wireless Framework for Live TV Services

    With the evolution of HDTV and Ultra HDTV, the bandwidth requirement for IP-based TV content is rapidly increasing. Consumers demand uninterrupted service with a high Quality of Experience (QoE). Service providers are constantly trying to differentiate themselves by innovating new ways of distributing content more efficiently, at lower cost and with higher penetration. In this work, we propose a cost-efficient wireless framework (WiLiTV) for delivering live TV services, consisting of a mix of wireless access technologies (e.g., Satellite, WiFi and LTE overlay links). In the proposed architecture, live TV content is injected into the network at a few residential locations using satellite dishes. The content is then further distributed to other homes using a house-to-house WiFi network or via an overlay LTE network. Our problem is to construct an optimal TV distribution network with the minimum number of satellite injection points, while preserving the highest QoE, for different neighborhood densities. We evaluate the framework using realistic time-varying demand patterns and a diverse set of home location data. Our study demonstrates that the architecture requires 75-90% fewer satellite injection points compared to traditional architectures. Furthermore, we show that most cost savings can be obtained using simple and practical relay routing solutions.
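
    At its core, choosing injection points is a coverage problem: pick the fewest homes with satellite dishes such that every other home is reachable over a relay link. The toy sketch below greedily solves a single-hop version; the actual WiLiTV formulation also models QoE, LTE overlay links and time-varying demand, and the coordinates and relay range here are hypothetical.

```python
# Toy greedy selection of satellite injection points: repeatedly pick the home
# whose dish would cover the most still-uncovered homes within WiFi relay range.
import math, random

random.seed(1)
homes = [(random.uniform(0, 1000), random.uniform(0, 1000)) for _ in range(60)]
RANGE = 250.0  # assumed single-hop WiFi relay reach in meters

def covers(a, b):
    return math.dist(a, b) <= RANGE

uncovered = set(range(len(homes)))
injection_points = []
while uncovered:
    # greedily choose the home whose dish covers the most uncovered homes
    best = max(range(len(homes)),
               key=lambda i: sum(covers(homes[i], homes[j]) for j in uncovered))
    injection_points.append(best)
    uncovered -= {j for j in uncovered if covers(homes[best], j)}

print(f"{len(injection_points)} injection points for {len(homes)} homes")
```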

    Change detection in teletraffic models

    In this paper, we propose a likelihood ratio test to detect distributional changes in common teletraffic models. These include traditional models like the Markov modulated Poisson process and processes exhibiting long-range dependence, in particular, Gaussian fractional ARIMA processes. A practical approach is also developed for the case where the parameter after the change is unknown. We observe that the algorithm is robust enough to detect slight perturbations of the parameter value after the change. A comprehensive set of numerical results, including results for the mean detection delay, is provided.
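
    For a concrete flavor of such a detector, the sketch below runs a CUSUM-style log-likelihood ratio test for a rate change in a simple Poisson count sequence. The paper's models (MMPP, fractional ARIMA) are richer, and here the post-change rate is assumed known.

```python
# Minimal CUSUM log-likelihood ratio detector for a Poisson rate change.
import numpy as np

rng = np.random.default_rng(0)
lam0, lam1 = 10.0, 13.0               # pre- and post-change rates (assumed known)
counts = np.concatenate([rng.poisson(lam0, 200), rng.poisson(lam1, 200)])

# per-sample log-likelihood ratio for Poisson(lam1) vs Poisson(lam0)
llr = counts * np.log(lam1 / lam0) - (lam1 - lam0)

# CUSUM recursion: S_t = max(0, S_{t-1} + llr_t); alarm when S_t > threshold
threshold, S = 20.0, 0.0
for t, z in enumerate(llr):
    S = max(0.0, S + z)
    if S > threshold:
        print(f"change declared at sample {t} (true change at 200)")
        break
```

    The gap between the alarm time and the true change point is the detection delay the paper reports on; raising the threshold lowers the false alarm rate at the cost of a longer delay.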

    Machine Learning at the Edge: A Data-Driven Architecture with Applications to 5G Cellular Networks

    The fifth generation of cellular networks (5G) will rely on edge cloud deployments to satisfy the ultra-low latency demand of future applications. In this paper, we argue that such deployments can also be used to enable advanced data-driven and Machine Learning (ML) applications in mobile networks. We propose an edge-controller-based architecture for cellular networks and evaluate its performance with real data from hundreds of base stations of a major U.S. operator. We provide insights on how to dynamically cluster and associate base stations and controllers, according to the global mobility patterns of the users. We then describe how the controllers can be used to run ML algorithms to predict the number of users in each base station, and present a use case in which these predictions are exploited by a higher-layer application to route vehicular traffic according to network Key Performance Indicators (KPIs). We show that the prediction accuracy improves when the ML algorithms rely on the controllers' view, and consequently on the spatial correlation introduced by user mobility, compared to predictions based only on the local data of each base station.
    Comment: 15 pages, 10 figures, 5 tables. IEEE Transactions on Mobile Computing
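
    A minimal stand-in for the controller-view idea on synthetic data: cluster base stations (here by location via k-means), then compare predicting a cell's next user count from its own history against predicting it from the histories of all cells in its cluster. The data, features and model below are illustrative, not the paper's operator traces or methods.

```python
# Synthetic comparison: local-only vs. controller-view prediction of user counts.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
T, n_cells = 500, 12
base = 50 + 10 * np.sin(np.linspace(0, 20, T))[:, None]   # shared mobility trend
users = base + rng.normal(0, 3, (T, n_cells))             # per-cell noise

pos = rng.uniform(0, 10, (n_cells, 2))                    # fake cell locations
cluster = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pos)

cell = 0
peers = np.where(cluster == cluster[cell])[0]
X_local = users[:-1, [cell]]        # local history only
X_ctrl = users[:-1][:, peers]       # controller view: the whole cluster
y = users[1:, cell]                 # next-step user count to predict

split = 400
for name, X in [("local", X_local), ("controller", X_ctrl)]:
    m = Ridge().fit(X[:split], y[:split])
    err = np.abs(m.predict(X[split:]) - y[split:]).mean()
    print(f"{name}: mean abs error {err:.2f} users")
```

    Because the peer cells share the mobility-induced trend, pooling their histories lets the regressor average out per-cell noise, which mirrors the spatial-correlation argument in the abstract.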

    Empowering the 6G Cellular Architecture with Open RAN

    Innovation and standardization in 5G have brought advancements to every facet of the cellular architecture. This ranges from the introduction of new frequency bands and signaling technologies for the radio access network (RAN), to a core network underpinned by micro-services and network function virtualization (NFV). However, like any emerging technology, the pace of real-world deployments does not instantly match the pace of innovation. To address this discrepancy, one of the key aspects under continuous development is the RAN, with the aim of making it more open, adaptive, functional, and easy to manage. In this paper, we highlight the transformative potential of embracing novel cellular architectures by transitioning from conventional systems to the progressive principles of Open RAN. This promises to make 6G networks more agile, cost-effective, energy-efficient, and resilient. It opens up a plethora of novel use cases, ranging from ubiquitous support for autonomous devices to cost-effective expansions in regions previously underserved. The principles of Open RAN encompass: (i) a disaggregated architecture with modular and standardized interfaces; (ii) cloudification, programmability and orchestration; and (iii) AI-enabled data-centric closed-loop control and automation. We first discuss the transformative role Open RAN principles have played in the 5G era. Then, we adopt a system-level approach and describe how these Open RAN principles will support 6G RAN and architecture innovation. We qualitatively discuss potential performance gains that Open RAN principles yield for specific 6G use cases. For each principle, we outline the steps that research, development and standardization communities ought to take to make Open RAN principles central to next-generation cellular network designs.
    Comment: This paper is part of the IEEE JSAC SI on Open RAN. Please cite as: M. Polese, M. Dohler, F. Dressler, M. Erol-Kantarci, R. Jana, R. Knopp, T. Melodia, "Empowering the 6G Cellular Architecture with Open RAN," in IEEE Journal on Selected Areas in Communications, doi: 10.1109/JSAC.2023.333461

    Deep Learning based Fast and Accurate Beamforming for Millimeter-Wave Systems

    The widespread proliferation of mmWave devices has led to a surge of interest in antenna arrays. This interest in arrays is due to their ability to steer beams in desired directions, for the purpose of increasing signal power and/or decreasing interference levels. To enable beamforming, array coefficients are typically stored in look-up tables (LUTs) for subsequent referencing. While LUTs enable fast sweep times, their limited memory size restricts the number of beams the array can produce. Consequently, a receiver is likely to be offset from the main beam, decreasing received power and resulting in sub-optimal performance. In this letter, we present BeamShaper, a deep neural network (DNN) framework which enables fast and accurate beamsteering in any desirable 3-D direction. Unlike traditional finite-memory LUTs, which support a fixed set of beams, BeamShaper utilizes a trained NN model to generate the array coefficients for arbitrary directions in real time. Our simulations show that BeamShaper outperforms contemporary LUT-based solutions in terms of cosine similarity and central angle, at time scales only slightly higher than those of LUT-based solutions. Additionally, we show that our DNN-based approach has the added advantage of being more resilient to the effects of quantization noise generated while using digital phase-shifters.
    Comment: 7 pages, 5 figures, accepted to Milcom 2023. Not published yet (Sep 2023)
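
    To make the idea concrete, the sketch below trains a small MLP to map a steering angle to the weights of an 8-element half-wavelength uniform linear array, then scores it with the cosine similarity between predicted and analytic steering vectors. This is an assumed, simplified stand-in: the actual BeamShaper targets arbitrary 3-D directions and quantized phase shifters, and its architecture is not specified here.

```python
# Minimal NN-generates-array-coefficients sketch (assumptions: 8-element ULA,
# half-wavelength spacing, azimuth-only steering, unquantized weights).
import torch, math

N = 8                                          # array elements (assumed)

def steering(theta):
    """Real/imag parts of the analytic ULA steering vector for angles theta."""
    n = torch.arange(N, dtype=torch.float32)
    phase = math.pi * torch.sin(theta)[:, None] * n
    return torch.cos(phase), torch.sin(phase)

net = torch.nn.Sequential(
    torch.nn.Linear(1, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 2 * N),                # predicts real+imag of N weights
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    theta = (torch.rand(256) - 0.5) * math.pi  # random angles in (-pi/2, pi/2)
    re, im = steering(theta)
    pred = net(theta[:, None])
    loss = torch.mean((pred[:, :N] - re) ** 2 + (pred[:, N:] - im) ** 2)
    opt.zero_grad(); loss.backward(); opt.step()

theta = torch.tensor([0.3])                    # an angle not tied to any LUT entry
re, im = steering(theta)
pred = net(theta[:, None]).detach()
target = torch.cat([re, im], dim=1)
cos = torch.nn.functional.cosine_similarity(pred, target).item()
print(f"cosine similarity vs analytic weights: {cos:.4f}")
```

    Unlike a LUT, the trained network can be queried at any continuous angle, which is the property the letter exploits to avoid beam-offset losses.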

    Back to the future: Throughput prediction for cellular networks using radio KPIs

    The availability of reliable predictions for cellular throughput would offer a fundamental change in the way applications are designed and operated. Numerous cellular applications, including video streaming and VoIP, embed logic that attempts to estimate achievable throughput and adapt their behaviour accordingly. We believe that providing applications with reliable predictions several seconds into the future would enable profoundly better adaptation decisions and dramatically benefit demanding applications like mobile virtual and augmented reality. The question we pose and seek to address is whether such reliable predictions are possible. We conduct a preliminary study of throughput prediction in a cellular environment using statistical machine learning techniques. Accurate prediction is very challenging in large-scale cellular environments because they are characterized by highly fluctuating channel conditions. Using simulations and real-world experiments, we study how the prediction error varies as a function of the prediction horizon and the granularity of the available data. In particular, our simulation experiments show that the prediction error for mobile devices can be reduced significantly by combining measurements from the network with measurements from the end device. Our results indicate that it is possible to accurately predict achievable throughput up to 8 seconds into the future, with the 50th percentile of errors below 15% for mobile and 2% for static devices.
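
    A small stand-in experiment in the spirit of the study: predict throughput h steps ahead from a sliding window of past KPIs, comparing device-only features against device plus network-side features. The traces, feature set and model below are synthetic illustrations, not the paper's simulation or measurement setup.

```python
# Horizon-h throughput prediction from sliding-window KPI features.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
T = 3000
load = np.convolve(rng.normal(0, 1, T), np.ones(50) / 50, mode="same")  # cell load (network KPI)
snr = np.convolve(rng.normal(0, 1, T), np.ones(20) / 20, mode="same")   # device-side SNR
tput = 20 - 8 * load + 5 * snr + rng.normal(0, 1, T)                    # Mbit/s, toy model

W, h = 10, 8                                     # history window, prediction horizon
idx = np.arange(W, T - h)
y = tput[idx + h]                                # throughput h steps in the future
dev = np.stack([tput[idx - k] for k in range(W)] +
               [snr[idx - k] for k in range(W)], axis=1)   # device-only features
net = np.stack([load[idx - k] for k in range(W)], axis=1)  # network-side features

split = 2000
for name, X in [("device only", dev), ("device + network", np.hstack([dev, net]))]:
    m = RandomForestRegressor(n_estimators=50, random_state=0).fit(X[:split], y[:split])
    err = np.abs(m.predict(X[split:]) - y[split:]).mean()
    print(f"{name}: mean abs error {err:.2f} Mbit/s")
```

    Sweeping h in such a setup is the analogue of the paper's study of error versus prediction horizon, and adding the network-side load feature mirrors its finding that combining network and end-device measurements reduces the error for mobile devices.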