16 December 1989 Sample Estimators For Entropic Measures Of Mutual Information
Abstract
A nonlinear, entropic measure of mutual information (statistical dependence),

$$I(X_1,\dots,X_n) = \int_{-\infty}^{+\infty}\!\cdots\!\int_{-\infty}^{+\infty} \log\!\left[\frac{f(x_1,\dots,x_n)}{\prod_{i=1}^{n} f_i(x_i)}\right] dF(x_1,\dots,x_n) \ge 0, \qquad n \ge 2,$$

was proposed in 1966 by Blachman for a set of continuous random variables $(X_1,\dots,X_n)$, $-\infty < X_i < +\infty$, $i = 1,\dots,n$, $n \ge 2$, with continuous joint distribution $F(x_1,\dots,x_n)$, joint density $f$, and marginal densities $f_i$.
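The abstract does not reproduce the paper's own sample estimators, but the integral above can be approximated from data by a generic histogram plug-in estimate: bin the samples, replace the densities with empirical probabilities, and evaluate the discrete sum. A minimal sketch for the bivariate case ($n = 2$), with the helper name `mi_histogram` and the bin count chosen here purely for illustration:

```python
import numpy as np

def mi_histogram(x, y, bins=16):
    """Plug-in estimate of I(X, Y) from samples via a 2-D histogram.

    Forms the empirical joint pmf p_xy and marginals p_x, p_y, then
    evaluates sum p_xy * log(p_xy / (p_x * p_y)), the discrete analogue
    of the integral defining the entropic mutual information.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()               # empirical joint pmf
    p_x = p_xy.sum(axis=1, keepdims=True)    # marginal pmf of X (column)
    p_y = p_xy.sum(axis=0, keepdims=True)    # marginal pmf of Y (row)
    nz = p_xy > 0                            # skip empty cells: 0*log(0) = 0
    return float(np.sum(p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz])))

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
y = x + 0.5 * rng.normal(size=10_000)        # strongly dependent on x
z = rng.normal(size=10_000)                  # independent of x
print(mi_histogram(x, y), mi_histogram(x, z))
```

Because the discrete sum is a Kullback-Leibler divergence, the estimate is nonnegative, mirroring the $\ge 0$ bound in the definition; for the dependent pair it is well above zero, while for the independent pair it is near zero (biased slightly upward by finite binning).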
© (1989) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
R. C. McCarty, "Sample Estimators For Entropic Measures Of Mutual Information", Proc. SPIE 0977, Real-Time Signal Processing XI (16 December 1989); doi: 10.1117/12.948566; https://doi.org/10.1117/12.948566