Dempster-Shafer evidential theory, a probability-based data fusion classification algorithm, is useful when the sensors (or, more generally, the information sources) contributing information cannot assign full certainty to their output decisions. The algorithm captures and combines whatever certainty exists in the object-discrimination capability of the sensors. Knowledge from multiple sensors about events (called propositions) is combined using Dempster's rule to find the intersection, or conjunction, of the propositions and their associated probabilities. When the intersection of the propositions reported by the sensors is an empty set, Dempster's rule redistributes the conflicting probability to the nonempty set elements. When the conflicting probability becomes large, application of Dempster's rule can lead to counterintuitive conclusions. Several modifications to the original Dempster-Shafer theory have been proposed to accommodate these situations.
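The combination and conflict-redistribution steps described above can be sketched in code. The following is a minimal illustration of Dempster's rule, in which mass functions are represented as dictionaries mapping sets of hypotheses to belief masses; the sensor names and the hypotheses ("tank", "truck") are invented for the example and are not drawn from the text.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozensets of
    hypotheses to belief mass) using Dempster's rule.  Mass that
    falls on empty intersections is the conflict K; it is
    redistributed by normalizing the nonempty intersections
    by 1 - K."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    # Redistribute the conflicting mass over the nonempty elements
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

# Two sensors report uncertain evidence over {tank, truck};
# mass on the full set {tank, truck} expresses "don't know".
sensor1 = {frozenset({"tank"}): 0.6,
           frozenset({"tank", "truck"}): 0.4}
sensor2 = {frozenset({"tank"}): 0.5,
           frozenset({"truck"}): 0.3,
           frozenset({"tank", "truck"}): 0.2}
fused = dempster_combine(sensor1, sensor2)
```

Here the pairing of sensor 1's mass on {tank} with sensor 2's mass on {truck} produces an empty intersection, contributing conflict K = 0.6 × 0.3 = 0.18; dividing the surviving masses by 1 − K = 0.82 renormalizes them to sum to one. As the abstract notes, when K approaches 1 this renormalization can magnify small residual agreements into counterintuitively strong conclusions.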