Proc. SPIE. 6699, Signal and Data Processing of Small Targets 2007
KEYWORDS: Weapons, Statistical analysis, Detection and tracking algorithms, Matrices, Error analysis, Computer simulations, Probabilistic data association, Monte Carlo methods, Data processing, Expectation maximization algorithms
The Probabilistic Multi-Hypothesis Tracker (PMHT) has been demonstrated to be an effective multi-target
tracker while retaining computational complexity that is linear in the number of measurements and targets. However,
PMHT only provides a point estimate for target tracks. The "covariance" returned by the PMHT is a byproduct
of applying the Expectation-Maximization algorithm to maximize the PMHT likelihood function and
is not intended to be the track estimate covariance. In this paper we derive a consistent covariance estimator
for PMHT. By re-introducing the constraint that the PMHT weights (posterior probabilities that a
measurement is target-originated) sum to unity across measurements, a covariance based on Probabilistic Data
Association (PDA) principles is derived. We show through simulations that the resulting estimator provides a
consistent covariance for the PMHT track estimates.
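The PDA-principles covariance invoked above combines three standard terms: the predicted covariance weighted by the no-detection probability, the correct-association update, and a spread-of-innovations term. A minimal sketch of that standard PDA update (the function name and argument layout are my own, not from the paper):

```python
import numpy as np

def pda_covariance(P_pred, H, K, beta0, betas, innovations):
    """PDA-style updated state covariance.
    beta0: probability that no measurement is target-originated;
    betas[i]: probability that measurement i is target-originated;
    innovations[i]: z_i - z_hat for each validated measurement."""
    # covariance that would result if the association were known to be correct
    P_c = P_pred - K @ H @ P_pred
    # combined (probability-weighted) innovation
    nu_c = sum(b * nu for b, nu in zip(betas, innovations))
    # spread-of-innovations term accounting for association uncertainty
    spread = sum(b * np.outer(nu, nu) for b, nu in zip(betas, innovations)) \
             - np.outer(nu_c, nu_c)
    P_tilde = K @ spread @ K.T
    return beta0 * P_pred + (1 - beta0) * P_c + P_tilde
```

With a single measurement whose association probability is one, the spread term vanishes and the update reduces to the ordinary Kalman covariance.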
There has been some work both in the statistics and engineering literature that gives the posterior covariance
for ML Gaussian-mixture estimation, and the PMHT can be viewed as a tracker whose genesis is
MAP Gaussian-mixture estimation with a Gaussian prior. The expressions and calculations are, unfortunately,
complicated. Consequently we also report on a novel and intuitive way to derive these via calculus.
Proc. SPIE. 4380, Signal Processing, Sensor Fusion, and Target Recognition X
KEYWORDS: Signal to noise ratio, Detection and tracking algorithms, Data modeling, Sensors, Matrices, Digital filtering, Signal processing, Quantization, Optical simulations, Expectation maximization algorithms
Tracking energy on an intensity-modulated sensor output typically requires windowing, thresholding, and/or interpolation to arrive at point measurements to feed the tracking algorithm. Conventional trackers are point trackers, and point measurement estimation procedures pose problems for tracking signal energy that is distributed across many sensor cells. Such signals are sometimes termed over-resolved. Large arrays provide greater resolution with the potential for improved detection and classification performance, but higher resolution is in direct conflict with tracking over-resolved signals. The Histogram-Probabilistic Multi-Hypothesis Tracker (H-PMHT) algorithm addresses these issues and provides a means for modeling and tracking signals that are spread across several contiguous sensor cells. H-PMHT models the cell responses as a received energy histogram, and the probability density underlying this histogram is modeled by a mixture density. Elements of the H-PMHT signal model, theory, and algorithm are presented for linear Gauss-Markov targets. Tracking examples using simulated azimuth beam data are presented.
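The core H-PMHT idea of fitting a mixture density to a received-energy histogram can be illustrated with a toy EM fit of a Gaussian-plus-uniform mixture to binned one-dimensional sensor data. This is a simplified stand-in for the full H-PMHT, and every name and parameter choice here is illustrative:

```python
import numpy as np

def em_histogram_mixture(centers, counts, mu, var, w, floor, iters):
    """EM fit of a Gaussian (target) plus uniform (background) mixture
    to histogram data: `centers` are cell positions, `counts` the received
    energy per cell, `floor` the uniform background density."""
    n = counts.sum()
    for _ in range(iters):
        # E-step: posterior probability each cell's energy is target-originated
        g = w * np.exp(-(centers - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        post = g / (g + (1 - w) * floor)
        # M-step: re-estimate mixture weight and Gaussian parameters
        eff = counts * post          # effective target-originated counts
        w = eff.sum() / n
        mu = (eff * centers).sum() / eff.sum()
        var = (eff * (centers - mu) ** 2).sum() / eff.sum()
    return mu, var, w
```

In the full algorithm the Gaussian component's mean is tied to a Gauss-Markov target state rather than re-estimated freely, but the E-step structure is the same.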
Proc. SPIE. 3720, Signal Processing, Sensor Fusion, and Target Recognition VIII
KEYWORDS: Target detection, Detection and tracking algorithms, Data modeling, Computer simulations, Monte Carlo methods, Time metrology, Probability of detection, Algorithm development, Expectation maximization algorithms
The S-dimensional (S-D) assignment algorithm is a recently favored approach to multitarget tracking in which the data association is formulated as a generalized multidimensional matching problem and solved by a Lagrangian (dual) relaxation approach. The Probabilistic Multiple Hypothesis Tracking algorithm is a relatively new method that uses the EM algorithm and a modified probabilistic model to develop a 'soft' association tracker. In this paper, we apply the two algorithms (with S = 3 in the S-D assignment algorithm) to the multitarget tracking problem in the presence of false alarms and imperfect target detection. Simulation results for various scenarios are presented, and the performances of the two algorithms are compared in terms of computation time and percentage of lost tracks.
The PMHT is an attractive tracking algorithm for a number of implementation reasons. However, it relies on a modification of the usual data-association assumption: specifically, a target is allowed to generate more than one measurement in a given scan. In this paper we examine the ramifications of this from the point of view of theoretical estimation accuracy, via the Cramer-Rao lower bound (CRLB). We find that the CRLB behavior for the PMHT is much like that for the PDAF: there is a scalar 'information reduction factor' (IRF) quantifying the loss of accuracy due to measurement-origin uncertainty. This IRF is explored in a number of ways, and in particular it is found that the IRF for the PMHT is significantly degraded relative to that for the standard measurement model when clutter is heavy. Other topics include the effect of 'homothetic' measurements, data fusion, and non-Gaussian measurements.
Proc. SPIE. 3373, Signal and Data Processing of Small Targets 1998
KEYWORDS: Statistical analysis, Switching, Detection and tracking algorithms, Time metrology, Signal processing, Process control, Algorithm development, Optimization (mathematics), Process modeling, Expectation maximization algorithms
The Probabilistic Multi-Hypothesis Tracker (PMHT) of Streit and Luginbuhl uses the EM algorithm and a slight modification of the usual target-tracking assumptions to combine data association and filtering. The performance of the PMHT to date has been comparable to that of existing tracking algorithms; however, part of its appeal is a consistent and extensible statistical foundation, and it is the extension to the tracking of maneuvering targets that we explore in this paper. The basis, as with many algorithms designed for maneuvering targets, is an underlying and hidden 'model-switch' process controlled by a Markov probability structure. Performance of the modified PMHT is investigated for both maneuvering and non-maneuvering targets. The improved performance observed in the latter case is somewhat surprising.
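The hidden model-switch process amounts to propagating a discrete mode-probability vector through a Markov transition matrix and then reweighting by per-mode measurement likelihoods. A minimal sketch of those two steps (names are illustrative, not from the paper):

```python
import numpy as np

def predict_mode_probs(mu, Pi):
    """One Markov transition step for the mode probabilities.
    Pi[i, j] = P(model j at time k | model i at time k-1)."""
    return Pi.T @ mu

def update_mode_probs(mu_pred, likelihoods):
    """Posterior mode probabilities given per-mode measurement likelihoods."""
    post = mu_pred * likelihoods
    return post / post.sum()
```

In a maneuvering-target tracker each mode would carry its own motion model (e.g. constant velocity vs. constant turn), with the Markov structure controlling how often switches occur.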
Sensitivity to track initialization error is quantified as a function of clutter density for the nearest neighbor assignment function (NNAF). Two statistical distributions of track initialization error are derived under idealized hypotheses representing excellent and poor performance of the NNAF in clutter. A track initialization error region is obtained by constraining the mean initialization error (under the excellent NNAF performance hypothesis) to have significance level α (under the poor NNAF performance hypothesis). The initialization sensitivity region is derived for general nonlinear state equations and for measurement equations with additive Gaussian noise. Examples with constant-velocity linear models are given. There exists a critical clutter density at which the initialization error region is the empty set. Thus, even perfect track initialization is a poor initialization in clutter whose density exceeds the critical density. An explicit expression for the critical clutter density is derived.
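The nearest neighbor assignment function studied above simply selects, within a validation gate, the measurement closest to the predicted measurement in Mahalanobis distance; in dense clutter this choice increasingly favors a false measurement. A minimal sketch of that rule (the function and its signature are my own):

```python
import numpy as np

def nearest_neighbor_assign(z_pred, S_inv, measurements, gate):
    """Return the index of the gated measurement nearest (in Mahalanobis
    distance) to the predicted measurement z_pred, or None if no
    measurement falls inside the gate threshold."""
    best, best_d2 = None, gate
    for i, z in enumerate(measurements):
        nu = np.asarray(z) - z_pred
        d2 = float(nu @ S_inv @ nu)   # Mahalanobis distance squared
        if d2 < best_d2:
            best, best_d2 = i, d2
    return best
```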
In a multi-target multi-measurement environment, knowledge of the measurement-to-track assignments is typically unavailable to the tracking algorithm. In this paper, a strictly probabilistic approach to the measurement-to-track assignment problem is taken. Measurements are not assigned to tracks as in traditional multi-hypothesis tracking (MHT) algorithms; instead, the probability that each measurement belongs to each track is estimated using a maximum likelihood algorithm derived by the method of Expectation-Maximization. These measurement-to-track probability estimates are intrinsic to the multi-target tracker called the probabilistic multi-hypothesis tracking (PMHT) algorithm. Unlike MHT algorithms, the PMHT algorithm does not maintain explicit hypothesis lists. The PMHT algorithm is computationally practical because it requires neither enumeration of measurement-to-track assignments nor pruning.
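The measurement-to-track probabilities described above are the E-step weights of the EM iteration: for each measurement, a Gaussian likelihood is evaluated against each track's predicted measurement and normalized over tracks. A simplified sketch with fixed priors (a stand-in for the full PMHT E-step; the names are illustrative):

```python
import numpy as np

def pmht_weights(measurements, z_preds, S_list, priors):
    """E-step weights: w[r, m] is the posterior probability that
    measurement r originated from track m, normalized over tracks
    for each measurement. z_preds/S_list are per-track predicted
    measurements and innovation covariances; priors are mixing weights."""
    w = np.zeros((len(measurements), len(z_preds)))
    for r, z in enumerate(measurements):
        for m, (z_hat, S) in enumerate(zip(z_preds, S_list)):
            nu = np.asarray(z) - z_hat
            norm = np.sqrt((2 * np.pi) ** len(nu) * np.linalg.det(S))
            w[r, m] = priors[m] * np.exp(-0.5 * nu @ np.linalg.inv(S) @ nu) / norm
        w[r] /= w[r].sum()   # soft association: no hard assignment, no pruning
    return w
```

Because every measurement contributes a weight to every track, no hypothesis enumeration is needed, which is the source of the PMHT's linear complexity in measurements and targets.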