6,577 research outputs found
LTE Spectrum Sharing Research Testbed: Integrated Hardware, Software, Network and Data
This paper presents Virginia Tech's wireless testbed supporting research on
long-term evolution (LTE) signaling and radio frequency (RF) spectrum
coexistence. LTE is continuously refined and new features are released. As the
communications contexts for LTE expand, new research problems arise and include
operation in harsh RF signaling environments and coexistence with other radios.
Our testbed provides an integrated research tool for investigating these and
other research problems; it allows analyzing the severity of the problem,
designing and rapidly prototyping solutions, and assessing them with
standard-compliant equipment and test procedures. The modular testbed
integrates general-purpose software-defined radio hardware, LTE-specific test
equipment, RF components, free open-source and commercial LTE software, a
configurable RF network and recorded radar waveform samples. It supports RF
channel emulated and over-the-air radiated modes. The testbed can be remotely
accessed and configured. An RF switching network allows for designing many
different experiments that can involve a variety of real and virtual radios
with support for multiple-input multiple-output (MIMO) antenna operation. We
present the testbed, the research it has enabled, and some valuable lessons
learned that may help in designing, developing, and operating future wireless
testbeds.
Comment: In Proceedings of the 10th ACM International Workshop on Wireless
Network Testbeds, Experimental Evaluation & Characterization (WiNTECH),
Snowbird, Utah, October 201
Optimized LTE Data Transmission Procedures for IoT: Device Side Energy Consumption Analysis
The efficient deployment of Internet of Things (IoT) over cellular networks,
such as Long Term Evolution (LTE) or the next generation 5G, entails several
challenges. For massive IoT, reducing the energy consumption on the device side
becomes essential. One of the main characteristics of massive IoT is small data
transmissions. To improve support for them, the 3GPP has included two novel
optimizations in LTE: one based on the Control Plane (CP), and the
other on the User Plane (UP). In this paper, we analyze the average energy
consumption per data packet using these two optimizations compared to
the conventional LTE Service Request procedure. We propose an analytical model to
calculate the energy consumption for each procedure based on a Markov chain. In
the considered scenario, the three procedures perform similarly for large and
small Inter-Arrival Times (IATs), while for medium IATs CP reduces the
energy consumption per packet by up to 87% thanks to its connection release
optimization.
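The trade-off described above can be sketched numerically. The model below is an illustrative toy, not the paper's actual Markov-chain model: all power, energy, and timer values are invented placeholders, and the three procedures differ only in signalling cost and in whether the device holds a connected "tail" after transmitting.

```python
# Illustrative (not the paper's model): average device energy per small
# uplink packet for three LTE small-data procedures, vs. inter-arrival time.
# All power/energy/timing numbers are made-up placeholders.

def energy_per_packet(iat_s, procedure):
    """Average device energy (joules) to deliver one small uplink packet."""
    P_TX, P_RX, P_IDLE = 0.25, 0.10, 0.005   # watts (assumed)
    T_DATA = 0.01                             # s to transmit the packet

    if procedure == "legacy":                 # full Service Request
        e_signalling = 0.030                  # J: RACH + full RRC setup (assumed)
        t_tail = 10.0                         # s: inactivity timer before release
    elif procedure == "up":                   # UP optimization (RRC suspend/resume)
        e_signalling = 0.015                  # fewer signalling messages (assumed)
        t_tail = 10.0
    elif procedure == "cp":                   # CP optimization (data over NAS)
        e_signalling = 0.010
        t_tail = 0.0                          # connection released right away
    else:
        raise ValueError(procedure)

    # The connected tail cannot outlast the gap to the next packet.
    tail = min(t_tail, iat_s)
    idle = max(iat_s - T_DATA - tail, 0.0)
    return e_signalling + P_TX * T_DATA + P_RX * tail + P_IDLE * idle

for iat in (1.0, 60.0, 3600.0):              # small, medium, large IAT
    print(iat, {p: round(energy_per_packet(iat, p), 4)
                for p in ("legacy", "up", "cp")})
```

Even with these invented numbers, the shape matches the abstract's qualitative finding: at medium IATs the legacy and UP procedures pay for a long connected tail on every packet, which the CP optimization's immediate release avoids.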
Control-data separation architecture for cellular radio access networks: a survey and outlook
Conventional cellular systems are designed to ensure ubiquitous coverage with an always present wireless channel irrespective of the spatial and temporal demand of service. This approach raises several problems due to the tight coupling between network and data access points, as well as the paradigm shift towards data-oriented services, heterogeneous deployments and network densification. A logical separation between control and data planes is seen as a promising solution that could overcome these issues by providing data services under the umbrella of a coverage layer. This article presents a holistic survey of existing literature on the control-data separation architecture (CDSA) for cellular radio access networks. As a starting point, we discuss the fundamentals, concepts, and general structure of the CDSA. Then, we point out limitations of the conventional architecture in futuristic deployment scenarios. In addition, we present and critically discuss the work that has been done to investigate potential benefits of the CDSA, as well as its technical challenges and enabling technologies. Finally, an overview of standardisation proposals related to this research vision is provided.
Handover parameter optimization in LTE self-organizing networks
This paper presents a self-optimizing algorithm that tunes the handover (HO) parameters of an LTE (Long-Term Evolution) base station in order to improve the overall network performance and diminish negative effects (call dropping, HO failures). The proposed algorithm picks the best hysteresis and time-to-trigger combination for the current network status. We examined the effects of this self-optimizing algorithm in a realistic scenario setting, and the results show an improvement over static parameter settings.
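The core idea of picking the best hysteresis/time-to-trigger pair for the current network state can be sketched as a cost minimization over measured handover KPIs. This is a hypothetical illustration, not the paper's algorithm: the candidate grid, KPI table, and cost weights are all invented.

```python
# Hypothetical sketch: score each (hysteresis, time-to-trigger) pair from
# observed handover KPIs and pick the lowest-cost pair. Candidate values,
# KPI numbers, and weights are invented for illustration.

HYS_DB = [0, 1, 2, 3, 4, 5, 6]          # hysteresis candidates (dB)
TTT_MS = [40, 80, 160, 320, 640]        # time-to-trigger candidates (ms)

def ho_cost(kpi):
    """Weighted cost: HO failures and call drops hurt most, ping-pongs less."""
    return (2.0 * kpi["ho_failure_rate"]
            + 2.0 * kpi["call_drop_rate"]
            + 1.0 * kpi["ping_pong_rate"])

def best_setting(measured):
    """measured: {(hys_db, ttt_ms): kpi dict} for the current network state."""
    return min(measured, key=lambda pair: ho_cost(measured[pair]))

# Toy measurements: aggressive settings cause ping-pong handovers,
# sluggish settings cause late-HO failures and drops.
measured = {
    (0, 40):  {"ho_failure_rate": 0.01, "call_drop_rate": 0.01, "ping_pong_rate": 0.30},
    (3, 160): {"ho_failure_rate": 0.02, "call_drop_rate": 0.01, "ping_pong_rate": 0.05},
    (6, 640): {"ho_failure_rate": 0.15, "call_drop_rate": 0.08, "ping_pong_rate": 0.01},
}
print(best_setting(measured))
```

In practice a self-organizing network would refresh the KPI table continuously and re-run the selection as load and mobility patterns change, which is what lets the tuned setting beat any single static configuration.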
What Can Wireless Cellular Technologies Do about the Upcoming Smart Metering Traffic?
The introduction of smart electricity meters with cellular radio interface
puts an additional load on the wireless cellular networks. Currently, these
meters are designed for low duty cycle billing and occasional system check,
which generates a low-rate sporadic traffic. As the number of distributed
energy resources increases, the household power will become more variable and
thus unpredictable from the viewpoint of the Distribution System Operator
(DSO). It is therefore expected, in the near future, to have an increased
number of Wide Area Measurement System (WAMS) devices with Phasor Measurement
Unit (PMU)-like capabilities in the distribution grid, thus allowing the
utilities to monitor the low voltage grid quality while providing information
required for tighter grid control. From a communication standpoint, the traffic
profile will change drastically towards higher data volumes and higher rates
per device. In this paper, we characterize the current traffic generated by
smart electricity meters and supplement it with the potential traffic
requirements brought by introducing enhanced Smart Meters, i.e., meters with
PMU-like capabilities. Our study shows how GSM/GPRS and LTE cellular systems
perform under current and next-generation smart meter traffic,
where it is clearly seen that the PMU data will seriously challenge these
wireless systems. We conclude by highlighting the possible solutions for
upgrading the cellular standards, in order to cope with the upcoming smart
metering traffic.
Comment: Submitted; change: corrected location of eSM box in Fig. 1; May 22,
2015: major revision after review; v4: revised, accepted for publication
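The drastic shift in traffic profile can be made concrete with a back-of-the-envelope calculation. The numbers below (device count, report sizes, reporting rates) are invented for illustration and are not taken from the paper's measurements; the point is only the orders-of-magnitude gap between occasional billing reports and continuous PMU-style streaming.

```python
# Back-of-the-envelope (invented numbers): aggregate uplink traffic from
# conventional billing-oriented smart meters vs. PMU-capable enhanced meters.

def uplink_load_bps(n_devices, report_bytes, reports_per_day):
    """Average aggregate uplink rate in bits per second."""
    return n_devices * report_bytes * 8 * reports_per_day / 86400.0

N = 10_000                                   # meters in a cell's area (assumed)

# Billing/system-check traffic: a few small reports per day.
billing = uplink_load_bps(N, report_bytes=200, reports_per_day=4)

# PMU-like traffic: one measurement report every second, around the clock.
pmu = uplink_load_bps(N, report_bytes=100, reports_per_day=86400)

print(f"billing-only: {billing:.1f} bit/s, PMU-like: {pmu:.1f} bit/s")
```

Under these assumptions the PMU-style profile is roughly four orders of magnitude heavier in sustained uplink load, and each device also becomes an always-on sender, which stresses the random-access and signalling planes of GSM/GPRS and LTE rather than just their data capacity.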
Memory-full context-aware predictive mobility management in dual connectivity 5G networks
Network densification with small cell deployment is being considered as one of the dominant themes in the fifth generation (5G) cellular system. Despite the capacity gains, such deployment scenarios raise several challenges from a mobility management perspective. The small cell size, which implies a short cell residence time, will increase the handover (HO) rate dramatically. Consequently, the HO latency will become a critical consideration in the 5G era. The latter requires an intelligent, fast and light-weight HO procedure with minimal signalling overhead. In this direction, we propose a memory-full context-aware HO scheme with mobility prediction to achieve the aforementioned objectives. We consider a dual connectivity radio access network architecture with logical separation between control and data planes because it offers relaxed constraints in implementing the predictive approaches. The proposed scheme predicts future HO events along with the expected HO time by combining radio frequency performance with physical proximity and with the user context in terms of speed, direction and HO history. To minimise the processing and the storage requirements whilst improving the prediction performance, a user-specific prediction triggering threshold is proposed. The prediction outcome is utilised to perform advance HO signalling whilst suspending the periodic transmission of measurement reports. Analytical and simulation results show that the proposed scheme provides promising gains over the conventional approach.
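The idea of a user-specific prediction trigger can be sketched as follows. This is an illustrative guess at the mechanism, not the paper's scheme: the scoring function, the threshold rule, and every constant are invented, and the only premise taken from the abstract is that speed, direction, and per-user HO history gate whether prediction runs at all.

```python
# Illustrative sketch (not the paper's algorithm): run HO prediction only
# when a user's context score crosses a user-specific threshold, so slow or
# erratic users do not consume prediction compute and storage.
import math

def context_score(speed_mps, heading_rad, cell_bearing_rad,
                  ho_history_hits, ho_history_total):
    """Higher when the user moves fast toward a neighbour cell and past
    predictions for this user have been reliable. All weights assumed."""
    direction_align = max(math.cos(heading_rad - cell_bearing_rad), 0.0)
    reliability = (ho_history_hits / ho_history_total
                   if ho_history_total else 0.5)   # no history: neutral prior
    return speed_mps * direction_align * reliability

def user_threshold(base, ho_history_total):
    """User-specific threshold: relax it as HO history accumulates,
    trusting prediction more for well-observed users (assumed rule)."""
    return base / (1.0 + 0.1 * ho_history_total)

# A fast user heading almost straight at a neighbour cell, with 8/10
# past predictions correct, clears the trigger:
score = context_score(15.0, 0.1, 0.0, ho_history_hits=8, ho_history_total=10)
if score > user_threshold(base=5.0, ho_history_total=10):
    print("predict HO: send advance signalling, suspend periodic reports")
```

The payoff described in the abstract maps onto the final branch: once the trigger fires, advance HO signalling replaces the periodic measurement reports, trading a small prediction cost for lower HO latency and signalling overhead.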