Effect of Noise Factor on System Performance in LTE Networks
The 3GPP Long Term Evolution (LTE) technology provides higher system throughput than earlier Broadband Wireless Access (BWA) telecommunication systems in order to meet the escalating demands of multimedia services. In such systems, a higher noise factor at the base station (eNB) degrades system throughput, since an increase in the noise factor at the eNB decreases the Signal-to-Noise Ratio (SNR) of the received signal. Thus, network deployments with a lower noise factor at the eNB support higher system throughput, which is essential for providing high Quality of Service (QoS) in LTE networks. Hence, in this paper an attempt has been made to study and evaluate the effect of various noise factors at the eNB on system performance in an uplink LTE network using the QualNet 7.1 network simulator. The performance metrics considered for the simulation studies are spectral efficiency, system throughput, total number of data bytes received, total number of transport blocks received with errors, delay and jitter.
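The noise-factor/SNR relationship this abstract relies on can be sketched numerically. The following is an illustrative computation (not from the paper): in decibel terms the receiver noise figure subtracts directly from the input SNR, and the Shannon bound then caps the achievable spectral efficiency. The specific NF and SNR values are assumptions for demonstration.

```python
# Illustrative sketch: how a higher eNB noise figure reduces output SNR
# and, via the Shannon bound, achievable spectral efficiency.
# All numeric values are hypothetical examples, not results from the paper.

import math

def output_snr_db(input_snr_db: float, noise_figure_db: float) -> float:
    """In dB, the noise figure subtracts from the input SNR:
    SNR_out(dB) = SNR_in(dB) - NF(dB)."""
    return input_snr_db - noise_figure_db

def shannon_spectral_efficiency(snr_db: float) -> float:
    """Shannon capacity bound: log2(1 + SNR_linear) in bit/s/Hz."""
    snr_lin = 10 ** (snr_db / 10)
    return math.log2(1 + snr_lin)

# Example: raising the eNB noise figure from 5 dB to 9 dB at 20 dB input SNR
for nf in (5.0, 9.0):
    snr = output_snr_db(20.0, nf)
    print(f"NF={nf} dB -> SNR={snr} dB, "
          f"capacity bound={shannon_spectral_efficiency(snr):.2f} bit/s/Hz")
```

Every 1 dB added to the noise figure costs 1 dB of SNR, which is what drives the throughput degradation the paper measures.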
An experimental analysis of the effects of noise on Wi-Fi video streaming
Wireless networks such as WiFi suffer communication performance issues in addition to those seen on wired networks, due to the characteristics of the radio communication channel used by their Physical Layers (PHY). Understanding these issues is a complex but necessary task given the importance of wireless networks for the transfer of wide-ranging packet streams, including video as well as traditional data. Simulators are not accurate enough to allow all the intricacies of such communication to be accurately understood, especially when complex interactions between the protocols of different layers occur. The paper suggests cross-layer measurement as a solution to the problem of understanding and analysing such complex communication issues, and proposes a framework in which appropriate performance measurements can be made from a WiFi network supporting a video streaming application. The framework has been used to collect these measurements at the PHY, MAC, Transport and Application layers. Analysis of the collected measurements has allowed the effects of noise interference at the PHY to be related to the perceived performance at the Application Layer for a video streaming application. This has allowed the effect of the SNR on the download time of a video sequence to be studied.
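The SNR-to-download-time relationship studied above can be sketched with a toy rate-adaptation model: lower SNR forces a lower PHY rate, which lengthens the download. The SNR thresholds, PHY rates and MAC-efficiency factor below are illustrative assumptions (loosely 802.11g-like), not measurements from the paper.

```python
# Hypothetical sketch: falling SNR lengthens a video download by forcing
# lower Wi-Fi PHY rates. Thresholds and rates are illustrative only.

RATE_TABLE = [  # (min SNR dB, PHY rate Mbit/s) -- assumed values
    (25, 54.0),
    (18, 36.0),
    (10, 18.0),
    (4, 6.0),
]

def phy_rate(snr_db: float) -> float:
    """Pick the highest PHY rate whose SNR threshold is met."""
    for min_snr, rate in RATE_TABLE:
        if snr_db >= min_snr:
            return rate
    return 1.0  # fall back to the lowest rate

def download_time_s(video_bytes: int, snr_db: float,
                    efficiency: float = 0.5) -> float:
    """Rough download time; `efficiency` lumps MAC/transport overhead."""
    throughput_bps = phy_rate(snr_db) * 1e6 * efficiency
    return video_bytes * 8 / throughput_bps

# A 50 MB video sequence at high vs. degraded SNR
for snr in (30, 12):
    print(f"SNR={snr} dB -> download in {download_time_s(50_000_000, snr):.1f} s")
```

The step-function behaviour of rate adaptation is one reason the cross-layer measurements in the paper are needed: application-layer download time does not degrade smoothly with PHY-layer SNR.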
Experimental assessment of the effects of cross-traffic on Wi-Fi video streaming
Wi-Fi networks are the first and sometimes only choice for video streaming in homes, airports, malls, public areas and museums. However, Wi-Fi networks are vulnerable to interference and noise, and have bandwidth limitations. Due to the intrinsic vulnerability of the communication channel and the large number of variables involved, simulation alone is not enough to evaluate the performance of wireless networks. Indeed, there is a tendency to give experimental tests a central role in the assessment of Wi-Fi network performance.
The paper presents an experimental analysis of the effects of cross-traffic on the performance of video streaming over Wi-Fi, based on cross-layer measurements. Experiments are carried out in a semi-anechoic chamber, to prevent the results from being influenced by external factors. The experimental results permit analysis of the influence of cross-traffic characteristics on cross-layer measures and on objective video quality metrics evaluated through a standardized approach.
Doppler Spread Estimation in MIMO Frequency-Selective Fading Channels
One of the main challenges in high-speed mobile communications is the presence of large Doppler spreads. Thus, accurate estimation of the maximum Doppler spread (MDS) plays an important role in improving the performance of the communication link. In this paper, we derive the data-aided (DA) and non-data-aided (NDA) Cramér-Rao lower bounds (CRLBs) and maximum likelihood estimators (MLEs) for the MDS in multiple-input multiple-output (MIMO) frequency-selective fading channels. Moreover, a low-complexity NDA moment-based estimator (MBE) is proposed. The proposed NDA-MBE relies on the second- and fourth-order moments of the received signal, which are employed to estimate the normalized squared autocorrelation function of the fading channel. The problem of MDS estimation is then formulated as a non-linear regression problem, and a least-squares curve-fitting optimization technique is applied to determine the MDS estimate. To the best of our knowledge, this is the first time that DA and NDA MDS estimation has been investigated for MIMO frequency-selective fading channels. Simulation results show that there is no significant performance gap between the derived NDA-MLE and the NDA-CRLB, even when the observation window is relatively small. Furthermore, the significantly reduced complexity of the NDA-MBE leads to low root-mean-square error over a wide range of MDSs when the observation window is selected large enough.
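The core idea of fitting a channel autocorrelation to estimate the Doppler spread can be sketched in simplified form. The snippet below is not the paper's MBE: it simulates a single-antenna flat Rayleigh fading process with a sum-of-sinusoids model, computes the sample autocorrelation, and least-squares-fits the Jakes model R(τ) = J0(2πf_dτ) to recover f_d. All simulation parameters (sampling rate, true Doppler, window length) are assumptions for demonstration.

```python
# Simplified illustration of Doppler spread estimation by curve-fitting
# the Jakes autocorrelation R(tau) = J0(2*pi*f_d*tau) to the sample
# autocorrelation of a simulated fading process. Not the paper's
# estimator; parameters are illustrative assumptions.

import numpy as np
from scipy.special import j0
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
fs = 1000.0        # sampling rate (Hz), assumed
f_d_true = 80.0    # true maximum Doppler spread (Hz), assumed
N = 20000          # observation window length (samples), assumed

# Clarke/Jakes-style sum-of-sinusoids Rayleigh fading generator
M = 64
theta = rng.uniform(0, 2 * np.pi, M)   # random arrival angles
phi = rng.uniform(0, 2 * np.pi, M)     # random initial phases
t = np.arange(N) / fs
h = np.sum(
    np.exp(1j * (2 * np.pi * f_d_true * np.cos(theta)[:, None] * t
                 + phi[:, None])),
    axis=0,
) / np.sqrt(M)

# Normalized sample autocorrelation at small lags
lags = np.arange(0, 30)
r = np.array([np.mean(h[:N - l] * np.conj(h[l:])).real for l in lags])
r /= r[0]

# Least-squares fit of J0(2*pi*f_d*tau) to the measured autocorrelation
tau = lags / fs
popt, _ = curve_fit(lambda tau, fd: j0(2 * np.pi * fd * tau),
                    tau, r, p0=[50.0])
f_d_hat = popt[0]
print(f"true f_d = {f_d_true} Hz, estimated f_d = {f_d_hat:.1f} Hz")
```

The paper's NDA-MBE differs in that it reconstructs the (squared) autocorrelation from second- and fourth-order moments of the noisy received signal rather than from the channel itself, but the final regression step has the same structure as this fit.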
Design of a Channel-Aware OFDM Transceiver
The transmission performance of an OFDM system can be significantly improved by exploiting the channel characteristics. In this paper, we consider the design of a channel-aware OFDM transceiver whose parameters are adjusted in response to changes in channel conditions. To this end, we first estimate channel state information (CSI) such as the signal-to-interference power ratio, low-order moments of the Doppler spectrum and the power-delay profile of the channel. The proposed CSI estimator can estimate these CSI parameters altogether in a unique manner by exploiting the autocorrelation properties of the channel impulse response (CIR). Then, we design a CIR estimator and an adaptive OFDM modulator that adjust their parameters according to the estimated CSI. Finally, we verify the performance of the proposed OFDM transceiver by computer simulation.
This work was in part supported by the Ministry of Information & Communications, Korea, under the Information Technology Research Center (ITRC) Support Program.
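One instance of the CSI-driven parameter adjustment described above can be sketched as follows: use an estimated power-delay profile (PDP) to size the OFDM cyclic prefix. This is an illustration of the general idea, not the paper's design; the exponential PDP, tap spacing, CP candidate set, and the "CP covers a few RMS delay spreads" rule of thumb are all assumptions.

```python
# Hedged illustration (not the paper's transceiver): sizing an OFDM
# cyclic prefix from an estimated power-delay profile. The PDP shape,
# tap spacing, and candidate CP durations are assumed values.

import numpy as np

def rms_delay_spread(pdp: np.ndarray, tap_spacing_s: float) -> float:
    """RMS delay spread of a discrete power-delay profile."""
    delays = np.arange(len(pdp)) * tap_spacing_s
    p = pdp / pdp.sum()                      # normalize to a distribution
    mean_delay = np.sum(p * delays)
    return np.sqrt(np.sum(p * (delays - mean_delay) ** 2))

# Exponentially decaying PDP with 50 ns tap spacing (illustrative channel)
pdp = np.exp(-np.arange(16) / 4.0)
tau_rms = rms_delay_spread(pdp, 50e-9)

# Choose the smallest CP from a candidate set covering ~4x the RMS spread
candidates_s = [0.4e-6, 0.8e-6, 1.6e-6, 3.2e-6]
cp = next(c for c in candidates_s if c >= 4 * tau_rms)
print(f"RMS delay spread = {tau_rms*1e9:.0f} ns, chosen CP = {cp*1e6:.1f} us")
```

The same pattern applies to the other estimated CSI parameters: Doppler-spectrum moments can set the CIR-estimator tracking rate, and the interference power ratio can drive the adaptive modulator's constellation choice.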