Traditionally, inter-cell interference in wireless cellular systems has been modeled as a zero-mean random variable with variance equal to the average interference power. While this model is appropriate for code division multiple access (CDMA)-type systems, it no longer holds in orthogonal frequency division multiple access (OFDMA)-based systems such as Long Term Evolution (LTE). This paper considers the problem of interference modeling for the downlink of LTE systems. It is shown, through analytical and numerical studies, that the interference can be modeled as a Gaussian random variate with random variance (i.e., the variance depends on the strength of the interference itself), which implies that the interference is in fact non-Gaussian.
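The key point, that a Gaussian variate with random variance (a Gaussian scale mixture) has a non-Gaussian marginal distribution, can be illustrated with a short simulation. The sketch below is not the paper's fitted model; the lognormal distribution for the random variance is an illustrative assumption standing in for fluctuating interferer strength. A pure Gaussian has excess kurtosis near zero, while the scale mixture shows positive excess kurtosis, i.e., heavier tails.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Baseline: zero-mean Gaussian with fixed variance (excess kurtosis ~ 0).
baseline = rng.normal(0.0, 1.0, n)

# Gaussian scale mixture: the variance itself is random.
# Lognormal variance is an illustrative assumption, not the paper's model.
sigma2 = rng.lognormal(mean=0.0, sigma=1.0, size=n)
mixture = rng.normal(0.0, np.sqrt(sigma2), n)

def excess_kurtosis(x):
    """Sample excess kurtosis: 0 for a Gaussian, > 0 for heavier tails."""
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2) ** 2 - 3.0

print(f"fixed variance : {excess_kurtosis(baseline):+.3f}")  # close to 0
print(f"random variance: {excess_kurtosis(mixture):+.3f}")   # clearly > 0
```

By Jensen's inequality, E[sigma^4] >= (E[sigma^2])^2, so any nondegenerate variance distribution pushes the mixture's kurtosis above the Gaussian value, which is the sense in which the interference is "actually non-Gaussian" despite being conditionally Gaussian.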