Paper
6 March 2002
Analyzing approximation algorithms in the theory of evidence
Anne-Laure Jousselme, Dominic Grenier, Eloi Bosse
Abstract
The major drawback of the Dempster-Shafer theory of evidence is its computational burden: Dempster's rule of combination involves a number of focal elements that can grow exponentially, which is unmanageable in many applications. To avoid this problem, approximation rules and algorithms have been explored that both reduce the number of focal elements and retain as much information as possible in the belief function to be combined next. Some studies comparing approximation algorithms have already been carried out, but the criteria they use always involve pignistic transformations, and thus a loss of information in both the original belief function and the approximated one. In this paper, we propose to analyze some approximation methods by computing the distance between the original belief function and the approximated one. This distance then quantifies the quality of the approximation. We also compare this criterion to other error criteria, which are often based on pignistic transformations. We show results of Monte Carlo simulations, as well as of an application to target identification.
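To make the abstract's central idea concrete, here is a minimal Python sketch of a distance-based quality criterion for an approximated belief function. The distance follows the metric d(m1, m2) = sqrt(0.5 (m1 - m2)^T D (m1 - m2)) with D indexed by the Jaccard similarity of focal elements, as introduced in the authors' earlier work on distances between bodies of evidence; everything else (the FRAME, the keep_k_largest helper, and the example mass values) is an illustrative assumption, and the toy truncation only loosely resembles k-l-x-style approximations from the literature, not necessarily the algorithms analyzed in the paper.

```python
import itertools
import math

# Illustrative frame of discernment (a small set of hypotheses).
FRAME = frozenset({"a", "b", "c"})

def powerset(frame):
    """All non-empty subsets (candidate focal elements) of the frame."""
    elems = sorted(frame)
    for r in range(1, len(elems) + 1):
        for combo in itertools.combinations(elems, r):
            yield frozenset(combo)

def jaccard(A, B):
    """Similarity |A n B| / |A u B| between two focal elements."""
    return len(A & B) / len(A | B)

def distance(m1, m2):
    """Distance between two mass functions over FRAME:
    d = sqrt(0.5 * (m1 - m2)^T D (m1 - m2)), D[A][B] = jaccard(A, B)."""
    subsets = list(powerset(FRAME))
    diff = [m1.get(A, 0.0) - m2.get(A, 0.0) for A in subsets]
    quad = sum(diff[i] * diff[j] * jaccard(A, B)
               for i, A in enumerate(subsets)
               for j, B in enumerate(subsets))
    return math.sqrt(0.5 * quad)

def keep_k_largest(m, k):
    """Toy approximation (hypothetical): keep the k focal elements
    with the largest masses and renormalize."""
    kept = dict(sorted(m.items(), key=lambda kv: kv[1], reverse=True)[:k])
    total = sum(kept.values())
    return {A: v / total for A, v in kept.items()}

# Example mass function (illustrative values, masses sum to 1).
m = {frozenset({"a"}): 0.4,
     frozenset({"a", "b"}): 0.3,
     frozenset({"b", "c"}): 0.2,
     FRAME: 0.1}
m_approx = keep_k_largest(m, 2)
print(f"d(m, approx) = {distance(m, m_approx):.4f}")
```

Because the distance weights each pair of focal elements by how much they overlap, it penalizes an approximation that moves mass onto dissimilar subsets more than one that shifts mass between overlapping ones, which is exactly what a pignistic-transformation criterion can blur.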
© 2002 Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Anne-Laure Jousselme, Dominic Grenier, and Eloi Bosse "Analyzing approximation algorithms in the theory of evidence", Proc. SPIE 4731, Sensor Fusion: Architectures, Algorithms, and Applications VI, (6 March 2002); https://doi.org/10.1117/12.458371
KEYWORDS
Detection and tracking algorithms
Monte Carlo methods
Radon
Computer simulations
Algorithms
Data fusion
Target recognition
