
    Cost-Function-Based Hypothesis Control Techniques for Multiple Hypothesis Tracking

    The problem of tracking targets in clutter naturally leads to a Gaussian mixture representation of the probability density function of the target state vector. Modern tracking methods maintain the mean, covariance, and probability weight corresponding to each hypothesis, yet they rely on simple merging and pruning rules to control the growth of hypotheses. This paper proposes a structured, cost-function-based approach to the hypothesis control problem, utilizing the Integral Square Error (ISE) cost measure. Track life performance versus computational cost is compared between the ISE-based filter and previously proposed approximations, including simple pruning, Singer's n-scan memory filter, Salmond's joining filter, and Chen and Liu's Mixture Kalman Filter (MKF). The results demonstrate that the ISE-based mixture reduction algorithm provides track life performance significantly better than the compared techniques when using similar numbers of mixture components, and performance competitive with the compared algorithms at similar mean computation times.
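    To make the mixture-reduction idea concrete, the following is a minimal one-dimensional sketch, not the paper's algorithm: it greedily merges the pair of Gaussian components whose moment-matched merge yields the smallest Integral Square Error with respect to the original mixture, using the closed-form Gaussian product integral. The function names and the greedy pairwise strategy are illustrative assumptions.

    # Minimal 1-D sketch of ISE-based Gaussian mixture reduction (illustrative only).
    import numpy as np
    from itertools import combinations

    def gauss_prod_integral(m1, v1, m2, v2):
        """Closed form: integral of N(x; m1, v1) * N(x; m2, v2) dx = N(m1; m2, v1 + v2)."""
        v = v1 + v2
        return np.exp(-0.5 * (m1 - m2) ** 2 / v) / np.sqrt(2.0 * np.pi * v)

    def ise(w1, m1, v1, w2, m2, v2):
        """Integral Square Error between two scalar Gaussian mixtures, in closed form."""
        def cross(wa, ma, va, wb, mb, vb):
            return sum(wi * wj * gauss_prod_integral(mi, vi, mj, vj)
                       for wi, mi, vi in zip(wa, ma, va)
                       for wj, mj, vj in zip(wb, mb, vb))
        return (cross(w1, m1, v1, w1, m1, v1)
                - 2.0 * cross(w1, m1, v1, w2, m2, v2)
                + cross(w2, m2, v2, w2, m2, v2))

    def merge_pair(w, m, v, i, j):
        """Moment-matched merge of components i and j into a single Gaussian."""
        wi, wj = w[i], w[j]
        ws = wi + wj
        mm = (wi * m[i] + wj * m[j]) / ws
        vv = (wi * (v[i] + (m[i] - mm) ** 2) + wj * (v[j] + (m[j] - mm) ** 2)) / ws
        keep = [k for k in range(len(w)) if k not in (i, j)]
        return np.append(w[keep], ws), np.append(m[keep], mm), np.append(v[keep], vv)

    def reduce_mixture(w, m, v, target):
        """Greedily merge the pair whose merge gives the smallest ISE to the original mixture."""
        w0, m0, v0 = w.copy(), m.copy(), v.copy()
        while len(w) > target:
            w, m, v = min((merge_pair(w, m, v, i, j)
                           for i, j in combinations(range(len(w)), 2)),
                          key=lambda cand: ise(w0, m0, v0, *cand))
        return w, m, v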

    Initial Data Truncation for Univariate Output of Discrete-Event Simulations Using the Kalman Filter

    Data truncation is a commonly accepted method of dealing with initialization bias in discrete-event simulation. An algorithm for determining the appropriate initial-data truncation point for univariate output is proposed. The technique entails averaging across independent replications and estimating a steady-state output model in a state-space framework. A Bayesian technique called Multiple Model Adaptive Estimation (MMAE) is applied to compute a time-varying estimate of the output's steady-state mean. This MMAE implementation uses, in parallel, a bank of three Kalman filters, each constructed under a different assumption about the output's steady-state mean. One filter assumes that the steady-state mean is accurately reflected by an estimate, called the "assumed steady-state mean," taken from the last half of the simulation data; this filter is called the reference filter. The remaining filters are calibrated with steady-state means corresponding to simple functions of the minimum and maximum data values, respectively. As the filters process the output through the effective transient, the reference filter becomes more likely (in a Bayesian sense) to be the best filter to represent the data, and the MMAE mean estimator is influenced increasingly towards the assumed steady-state mean. The estimated truncation point is selected when the MMAE mean estimate is within a small tolerance of the assumed steady-state mean. A Monte Carlo analysis using data generated from twelve simulation models is used to evaluate the technique. The evaluation criteria include the ability to estimate the mean of the response accurately and to construct reliable confidence intervals for it based on the truncated sequences.
    Keywords: output analysis, initial transient, truncation, Kalman filter, multiple model adaptive estimation
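    As a rough illustration of the mechanism described above, the sketch below implements a much-simplified scalar version: a bank of three one-state Kalman filters that differ only in the steady-state mean each assumes (last-half average, data minimum, data maximum), recursive Bayesian weighting of the filters, and a truncation point declared when the blended mean estimate comes within a tolerance of the assumed steady-state mean. The state-space model, noise settings, and tolerance rule here are illustrative assumptions, not the paper's calibration.

    # Simplified scalar MMAE truncation-point sketch (illustrative only).
    import numpy as np

    def mmae_truncation_point(y, tol=None, r=None):
        """Estimate an initial-data truncation point for a replication-averaged series y."""
        y = np.asarray(y, dtype=float)
        n = len(y)
        assumed = y[n // 2:].mean()                   # "assumed steady-state mean" (reference filter)
        hyps = np.array([assumed, y.min(), y.max()])  # three assumed steady-state means
        r = np.var(y[n // 2:]) if r is None else r    # measurement-noise variance (assumed)
        tol = 0.01 * (y.max() - y.min()) if tol is None else tol
        x = hyps.copy()                               # each filter's state starts at its assumed mean
        p = np.full(3, r)                             # initial state-error variances (assumed)
        probs = np.ones(3) / 3.0                      # prior filter probabilities
        for k, yk in enumerate(y):
            s = p + r                                 # innovation variances
            resid = yk - x                            # innovations for all three filters
            like = np.exp(-0.5 * resid ** 2 / s) / np.sqrt(2.0 * np.pi * s)
            probs = np.maximum(probs * like, 1e-12)   # Bayesian update with a floor to avoid lockout
            probs /= probs.sum()
            gain = p / s                              # scalar Kalman measurement updates
            x = x + gain * resid
            p = (1.0 - gain) * p
            blended = probs @ x                       # MMAE blended estimate of the mean
            if abs(blended - assumed) <= tol:
                return k                              # estimated truncation point
        return n - 1                                  # no point satisfied the tolerance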