Paper
Comparison of information theoretic divergences for sensor management
5 May 2011
Abstract
In this paper, we compare the Kullback-Leibler (K-L) and Rényi (α) divergence formulations as information-theoretic metrics for sensor management. Information-theoretic metrics are well suited to sensor management because they afford comparisons between distributions resulting from different types of sensors under different actions. The difference between distributions can also be expressed through entropy formulations to discern the communication channel capacity (i.e., the Shannon limit). We formulate a sensor management scenario for target tracking and compare various metrics for performance evaluation as a function of the design parameter α, so as to determine which measures are appropriate for sensor management given the dynamics of the scenario and the design parameter.
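The paper itself does not include source code; the following is a minimal numerical sketch, assuming discretized prior and posterior target-state distributions, of how the K-L and Rényi divergences compared in the abstract can be evaluated as a function of the design parameter α. The function names and example distributions are illustrative, not taken from the paper.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_KL(p || q) for discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute nothing
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def renyi_divergence(p, q, alpha):
    """Renyi divergence of order alpha; reduces to K-L as alpha -> 1."""
    if np.isclose(alpha, 1.0):
        return kl_divergence(p, q)
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

# Hypothetical prior (predicted) and posterior (updated) distributions
# over a small set of discretized target states.
prior     = np.array([0.25, 0.25, 0.25, 0.25])
posterior = np.array([0.70, 0.15, 0.10, 0.05])

for alpha in [0.1, 0.5, 0.9999, 2.0, 5.0]:
    d = renyi_divergence(posterior, prior, alpha)
    print(f"alpha={alpha:>6}: D_alpha(posterior || prior) = {d:.4f}")

print(f"K-L divergence       : D_KL(posterior || prior)  = {kl_divergence(posterior, prior):.4f}")
```

Sweeping α in this way illustrates the kind of comparison the paper reports: the Rényi divergence approaches the K-L value as α → 1, while smaller or larger α weights the tails or the dominant modes of the distributions differently, which is the design trade-off the sensor-management evaluation examines.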
© (2011) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Chun Yang, Ivan Kadar, Erik Blasch, and Michael Bakich "Comparison of information theoretic divergences for sensor management", Proc. SPIE 8050, Signal Processing, Sensor Fusion, and Target Recognition XX, 80500C (5 May 2011); https://doi.org/10.1117/12.883745
CITATIONS
Cited by 5 scholarly publications.
KEYWORDS
Sensors, Target detection, Information theory, Probability theory, Target recognition, Distance measurement, Error analysis