
    Performance Evaluation of Power Efficient Mechanisms on Multimedia over LTE-A Networks

    Power optimization is a critical challenge for multimedia services over cellular communication systems. Long Term Evolution-Advanced (LTE-A) was developed to provide higher-bandwidth access and better performance for today's data-heavy applications. Idle mode allows a mobile station to manage power and resources by limiting its activity for discrete periods, which removes the need for handover and other routine operations while the station is idle. It also provides a periodic mechanism through which the base station can signal pending downlink traffic to the mobile station, thereby eliminating handover signalling traffic from essentially inactive stations. Discontinuous Reception (DRX) was introduced to reduce the power consumption of the user device while still supporting the transmission of large amounts of data: the mobile device and the network negotiate phases of data transfer, and during the remaining time the device turns its receiver off and enters a low-power state. This study considers the transmission of massive quantities of data and tunes the parameters of the two power-optimization modes, idle mode and DRX mode, to achieve the maximum possible power saving while maintaining high quality for multimedia services; it also evaluates the effect of DRX short cycles and DRX long cycles on multimedia services and overall performance. Using OPNET Simulator 17.5, the study concludes that the DRX mechanism is preferable to the idle mechanism and that DRX long cycles are a very good choice for all multimedia services and for overall network performance.
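
    To make the role of the DRX timers concrete, the sketch below (not the paper's OPNET model) estimates the fraction of time a UE receiver can sleep under a simplified DRX configuration; the on-duration and cycle lengths are illustrative values, not 3GPP defaults.

```python
# Minimal sketch, assuming a simplified LTE DRX model (not the paper's OPNET
# simulation): fraction of each DRX cycle the UE receiver can stay off.
# Timer values are illustrative, not 3GPP defaults.

def drx_sleep_ratio(on_duration_ms: float, cycle_ms: float) -> float:
    """Fraction of a DRX cycle spent with the receiver powered down."""
    return max(0.0, (cycle_ms - on_duration_ms) / cycle_ms)

on_duration = 10.0   # ms the UE monitors the control channel per cycle (assumed)
short_cycle = 80.0   # ms, illustrative short DRX cycle
long_cycle = 640.0   # ms, illustrative long DRX cycle

print(f"Short-cycle sleep ratio: {drx_sleep_ratio(on_duration, short_cycle):.1%}")
print(f"Long-cycle sleep ratio:  {drx_sleep_ratio(on_duration, long_cycle):.1%}")
```

    In this simplified model, longer cycles increase the sleep ratio, and therefore the power saving, at the cost of a longer worst-case wake-up delay.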

    Performance Comparison Between VoLTE and non-VoLTE Voice Calls During Mobility in Commercial Deployment: A Drive Test-Based Analysis

    The optimization of network performance is vital for the delivery of services using standard cellular technologies for mobile communications. Call setup delay and User Equipment (UE) battery savings significantly influence network performance, and improving these factors is vital for ensuring optimal service delivery. In comparison to traditional circuit-switched voice calls, VoLTE (Voice over LTE) technology offers faster call setup and better battery-saving performance. To validate these claims, a drive test was carried out using the XCAL drive test tool to collect real-time network parameters during VoLTE and non-VoLTE voice calls. The findings present an analysis of real-time network characteristics, including call setup delay, battery-saving performance, and the DRX mechanism. The study contributes to the understanding of network optimization strategies and provides insights for enhancing the quality of service (QoS) in mobile communication networks. Examining VoLTE and non-VoLTE operations, this research highlights the substantial energy savings obtained by VoLTE: approximately 60.76% of energy before the Service Request and approximately 38.97% after the Service Request. Moreover, VoLTE to VoLTE calls have a 72.6% shorter call setup delay than non-VoLTE-based LTE to LTE calls because fewer signaling messages are required. Furthermore, compared to non-VoLTE to non-VoLTE calls, VoLTE to non-VoLTE calls offer an 18.6% shorter call setup delay. These results showcase the performance advantages of VoLTE and reinforce its potential for offering better services in wireless communication networks.
    Comment: Accepted for presentation and publication at the IEEE 10th International Conference on Electrical Engineering, Computer Science and Informatics (EECSI 2023).
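
    As a side note on how such figures are obtained, the snippet below shows the straightforward relative-saving calculation behind percentages like the ~60.76% reported above; the energy values used are placeholders, not the drive-test measurements.

```python
# Illustrative only: the relative-saving calculation behind the reported
# percentages. The energy values are placeholders, not the measured data.

def saving_pct(reference_energy: float, improved_energy: float) -> float:
    """Relative saving of the improved case versus the reference, in percent."""
    return 100.0 * (reference_energy - improved_energy) / reference_energy

non_volte_mj = 100.0   # hypothetical UE energy for a non-VoLTE call phase
volte_mj = 40.0        # hypothetical UE energy for the same phase with VoLTE

print(f"VoLTE saving: {saving_pct(non_volte_mj, volte_mj):.1f}%")
```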

    Power Saving Techniques in 5G Technology for Multiple-Beam Communications

    The evolution of mobile technology and computation systems enables User Equipment (UE) to handle tremendous amounts of data transmission. With current 5G technology, several types of wireless traffic in millimeter-wave bands can be carried at high data rates with ultra-reliable, low-latency communications. 5G networks rely on directional beamforming to overcome the propagation and penetration losses of mmWave frequencies. To align the best beam pairs and achieve high data rates, 5G uses beam-search operations; combined with multibeam reception and high-order modulation, this drains the battery of the UE. In the earlier 4G radio system, Discontinuous Reception (DRX) techniques were successfully used to save energy. To reduce the energy consumption and latency of multiple-beam 5G radio communications, this paper proposes the DRX Beam Measurement technique (DRX-BM). Based on an analysis of the power-saving factor and response delay, we model DRX-BM as a semi-Markov process to reduce tracking time. MATLAB simulations are used to assess the effectiveness of the proposed model and to avoid unnecessary time spent on beam search. The simulations indicate that the proposed technique saves 14% of energy with minimal delay.
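
    To illustrate the kind of semi-Markov analysis the abstract refers to, the sketch below computes the long-run fraction of time spent in low-power states for a simple three-state chain using the standard time-fraction formula; the states, transition probabilities, and holding times are illustrative assumptions, not the DRX-BM model itself.

```python
# Minimal sketch, assuming a three-state semi-Markov chain (active, light
# sleep, deep sleep) rather than the paper's DRX-BM model. The long-run
# fraction of time in state i is  pi_i * E[H_i] / sum_j pi_j * E[H_j],
# with pi the stationary distribution of the embedded chain and E[H_i]
# the mean holding time of state i. All numbers are illustrative.
import numpy as np

def stationary_embedded(P: np.ndarray) -> np.ndarray:
    """Stationary distribution of the embedded discrete-time chain."""
    eigvals, eigvecs = np.linalg.eig(P.T)
    v = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
    return v / v.sum()

# States: 0 = active (beam measurement / data), 1 = light sleep, 2 = deep sleep
P = np.array([[0.0, 0.7, 0.3],
              [0.6, 0.0, 0.4],
              [1.0, 0.0, 0.0]])
mean_holding_ms = np.array([8.0, 40.0, 320.0])   # assumed mean holding times

pi = stationary_embedded(P)
time_fraction = pi * mean_holding_ms / np.dot(pi, mean_holding_ms)
power_saving_factor = time_fraction[1] + time_fraction[2]
print(f"Estimated power-saving factor: {power_saving_factor:.1%}")
```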

    Analysis of M2M Capabilities in 4G

    M2M (Machine to Machine) communications enable many new applications that reduce the costs of maintenance and operation via remote monitoring and control. Forecasts for this type of communication predict that the traffic associated with these devices will grow by about 100% in the coming years. However, the behaviour of M2M devices differs from that of human users, which stresses the networks by overloading the signalling procedures. This paper reviews the literature on the current scenario, projections for the decade, and the improvements that LTE (Long Term Evolution) will offer for this segment of devices.
    Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech

    Quantifying Potential Energy Efficiency Gain in Green Cellular Wireless Networks

    Conventional cellular wireless networks were designed to provide high throughput for the user and high capacity for the service provider, without any provision for energy efficiency. As a result, these networks have an enormous carbon footprint. In this paper, we describe the sources of the inefficiencies in such networks. First, we present results of studies on the size of the carbon footprint such networks generate. We also discuss the expected growth in mobile traffic, which will increase this carbon footprint even further. We then discuss specific sources of inefficiency and potential sources of improvement at the physical layer as well as at higher layers of the communication protocol hierarchy. In particular, considering that most of the energy inefficiency in cellular wireless networks is at the base stations, we discuss multi-tier networks and point to the potential of exploiting mobility patterns in order to use base-station energy judiciously. We then investigate potential methods to reduce this inefficiency and quantify their individual contributions. By considering the combination of all potential gains, we conclude that an improvement in energy consumption in cellular wireless networks by two orders of magnitude, or even more, is possible.
    Comment: arXiv admin note: text overlap with arXiv:1210.843
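
    The closing claim rests on the observation that individual gains combine multiplicatively; the back-of-the-envelope sketch below illustrates that structure with placeholder factors (not the paper's quantified values).

```python
# Back-of-the-envelope sketch of the combined-gains argument: per-technique
# improvement factors multiply, so modest factors reach two orders of
# magnitude overall. The factors are placeholders, not the paper's figures.

gains = {
    "PHY / power-amplifier improvements": 3.0,
    "sleep modes and load-adaptive operation": 4.0,
    "multi-tier (small-cell) deployment": 5.0,
    "mobility-aware base-station management": 2.0,
}

total = 1.0
for technique, factor in gains.items():
    total *= factor
    print(f"{technique:45s} x{factor:.1f}")

print(f"Combined improvement: ~{total:.0f}x")
```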

    Energy and Spectral Efficiency Balancing Algorithm for Energy Saving in LTE Downlinks

    In wireless network communication environments, Spectral Efficiency (SE) and Energy Efficiency (EE) are among the major indicators used to evaluate network performance. However, given the high demand for data-rate services and the exponential growth of energy consumption, SE and EE continue to attract increasing attention in academia and industry. Consequently, a study of the trade-off between these metrics is imperative. In contrast with existing works, this study proposes an efficient SE-EE trade-off algorithm for saving energy in downlink Long Term Evolution (LTE) networks that optimizes SE and EE concurrently while considering battery life at the Base Station (BS). The scheme is formulated as a Multi-objective Optimization Problem (MOP) and its Pareto-optimal solution is examined. In contrast with other algorithms that prolong battery life by considering the idle state of a BS, thereby increasing average delay and energy consumption, the proposed algorithm prolongs battery life by adjusting the initial and final states of a BS to minimize average delay and energy consumption. Similarly, an omni-directional antenna that spreads radio signals to the user equipment in all directions causes high interference and low spatial reuse. We therefore propose using a directional antenna, which transmits signals in one direction, resulting in little or no interference and high spatial reuse. The proposed scheme has been extensively evaluated through simulation, and the results show that it decreases the average response delay, improves SE, and minimizes energy consumption.
    Comment: 19 pages
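
    As an illustration of what a scalarized SE-EE trade-off can look like, the sketch below sweeps transmit power for a single link, computes SE and EE, and picks the power maximizing a weighted sum of the normalized objectives; the model and all parameter values are illustrative assumptions, not the paper's MOP formulation.

```python
# Minimal sketch, assuming a single-link model rather than the paper's MOP:
# sweep transmit power, compute SE = log2(1 + g*p/N) and EE = B*SE/(p + P_circ),
# then pick the power maximizing a weighted sum of the normalized objectives.
# All parameter values are illustrative.
import numpy as np

B = 10e6        # bandwidth [Hz]
g = 1e-10       # channel power gain (assumed)
N = 1e-13       # noise power [W] (assumed)
P_circ = 1.0    # circuit power at the BS [W] (assumed)
w = 0.5         # weight between SE and EE in the scalarized objective

p = np.linspace(0.01, 10.0, 1000)       # transmit power grid [W]
se = np.log2(1.0 + g * p / N)           # spectral efficiency [bit/s/Hz]
ee = B * se / (p + P_circ)              # energy efficiency [bit/J]

score = w * se / se.max() + (1.0 - w) * ee / ee.max()
best = int(np.argmax(score))
print(f"p* = {p[best]:.2f} W, SE = {se[best]:.2f} bit/s/Hz, EE = {ee[best]:.2e} bit/J")
```

    Sweeping the weight w from 0 to 1 traces out (the convex portion of) the Pareto frontier between the two objectives.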