    Influence of random DC offsets on burst-mode receiver sensitivity

    Unequal bit error probability in coherent QPSK fiber-optic systems using phase modulator based transmitters

    We report on the occurrence of unequal bit error probability in a coherent quadrature phase shift keying (QPSK) fiber-optic system. The bit error rates (BERs) of the two QPSK bits are derived individually from the developed system model, and they turn out to differ by more than an order of magnitude for a phase-modulator-based transmitter. The phenomenon, previously unreported, arises because such a transmitter introduces a controlled form of inter-symbol interference (ISI), and the receiver low-pass filters affect this ISI differently for the two bits. The optimum bandwidth of the receiver low-pass filter, obtained from the analytic derivation, is about 0.7 times the symbol rate. We propose two simple system modifications, one in the transmitter and one in the receiver, that compensate for the phenomenon and equalize the two BERs. These modifications improve the system performance by about 2 dB without adding any extra hardware.
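    The per-bit BER comparison described in this abstract can be illustrated with a minimal Monte Carlo sketch. Note this models a generic, ISI-free QPSK link over AWGN with ideal hard decisions, not the paper's phase-modulator transmitter model; in this idealized setting the two bit streams come out with equal error rates, and the paper's contribution is showing how phase-modulator ISI combined with receiver filtering breaks that symmetry.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000
    ebn0_db = 6.0
    ebn0 = 10 ** (ebn0_db / 10)

    # Gray-mapped QPSK: bit pair (b_i, b_q) maps to one quadrant (Es = 1, Eb = 1/2)
    bi = rng.integers(0, 2, n)
    bq = rng.integers(0, 2, n)
    sym = ((1 - 2 * bi) + 1j * (1 - 2 * bq)) / np.sqrt(2)

    # Complex AWGN: Eb/N0 fixes N0; each real dimension has variance N0/2
    n0 = 0.5 / ebn0
    noise = np.sqrt(n0 / 2) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
    r = sym + noise

    # Separate hard decisions on the I and Q rails give the two per-bit BERs
    ber_i = np.mean((r.real < 0) != (bi == 1))
    ber_q = np.mean((r.imag < 0) != (bq == 1))
    print(ber_i, ber_q)
    ```

    At 6 dB Eb/N0 both estimates land near the theoretical Q(sqrt(2 Eb/N0)) value of roughly 2.4e-3; introducing a transmitter-dependent ISI mechanism and a receiver low-pass filter into such a simulation is what would separate the two rates.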

    Shuttle Communications and Tracking Systems Modeling and TDRSS Link Simulations Studies

    An analytical simulation package (LinCsim), which allows analytical verification of data transmission performance through TDRSS satellites, was modified. The work involved modeling the user transponder, the TDRS, the TDRS ground terminal, and the link dynamics for the forward and return links, based on the TDRSS performance specifications (4) and the critical design reviews. The scope of this effort has recently been expanded to include the effects of radio frequency interference (RFI) on the bit error rate (BER) performance of the S-band return links. The RFI environment and the modified TDRSS satellite and ground station hardware are being modeled in accordance with their description in the applicable documents.

    Importance Sampling Simulation of the Stack Algorithm with Application to Sequential Decoding

    Importance sampling is a Monte Carlo variance reduction technique that, in many applications, has significantly reduced the computational cost of obtaining accurate Monte Carlo estimates. The basic idea is to generate the random inputs using a biased simulation distribution, that is, one that differs from the true underlying probability model. The simulation data are then weighted by an appropriate likelihood ratio to obtain an unbiased estimate of the desired parameter. This thesis presents new importance sampling techniques for the simulation of systems that employ the stack algorithm. The stack algorithm is used primarily in digital communications to decode convolutional codes, but it has other applications as well; for example, sequential edge linking is a method of finding edges in images that employs the stack algorithm. In brief, the stack algorithm attempts to find the maximum-metric path through a large decision tree. Two quantities characterize its performance. The first is the probability of a branching error. The second is the distribution of computation: the number of tree nodes examined in order to make a specific branching decision is a random variable, and the distribution of computation is the distribution of this random variable. Estimating this distribution, and parameters derived from it, is the main goal of this work. We present two new importance sampling schemes (including some variations) for estimating the distribution of computation of the stack algorithm. The first general method, the reference path method, biases noise inputs using the weight distribution of the associated convolutional code. The second, the partitioning method, uses a stationary biasing of noise inputs that alters the drift of the node metric process in an ensemble-average sense. The biasing is applied only up to a certain point in time: the point where the correct-path node metric minimum occurs. This method is inspired by both information theory and large deviations theory. The thesis also presents two further importance sampling techniques. The first, the error events simulation method, is used to estimate the error probabilities of stack algorithm decoders. The second is a new importance sampling technique for simulating the sequential edge linking algorithm. The main goals here are to develop the basic theory relevant to this simulation problem and to discuss some of the key issues related to sequential edge linking simulation.
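    The biased-distribution-plus-likelihood-ratio idea summarized at the start of this abstract can be sketched on a toy problem. This is a generic illustration, estimating a Gaussian tail probability by mean shifting, and is not one of the thesis's stack-algorithm schemes; the shift-to-threshold biasing and the choice of toy target are assumptions for the example.

    ```python
    import numpy as np
    from math import erfc, sqrt

    rng = np.random.default_rng(1)
    t = 4.0        # estimate P(X > 4) for X ~ N(0, 1), about 3.2e-5
    n = 50_000

    # Biased simulation distribution: N(t, 1) instead of N(0, 1),
    # so the "rare" event x > t happens on about half the draws.
    x = rng.standard_normal(n) + t

    # Likelihood ratio f(x)/g(x) between N(0,1) and N(t,1)
    w = np.exp(-t * x + t * t / 2)

    # Unbiased importance sampling estimate: weighted indicator average
    p_hat = np.mean((x > t) * w)

    p_true = 0.5 * erfc(t / sqrt(2))
    print(p_hat, p_true)
    ```

    A plain Monte Carlo run would need on the order of 1/p, roughly 30,000 samples, just to see a single event at this threshold, while the tilted sampler sees one on about half its draws; the likelihood-ratio weights then undo the bias so the estimate stays unbiased.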