entropy.MillerMadow estimates the Shannon entropy H of the random variable Y from the corresponding observed counts y, using the Miller-Madow correction to the empirical entropy estimate.

entropy.MillerMadow(y, unit=c("log", "log2", "log10"))

Arguments

y

vector of counts.

unit

the unit in which entropy is measured. The default unit="log" measures entropy in nats (natural units). For computing entropy in bits set unit="log2", and for base-10 units set unit="log10".

Details

The Miller-Madow entropy estimator (1955) is the bias-corrected empirical entropy estimate: H.MM = H.ML + (m-1)/(2n), where H.ML is the empirical (maximum-likelihood) entropy, m is the number of non-empty bins, and n is the total number of counts.

Note that the Miller-Madow estimator is not a plug-in estimator: the correction is added directly to the entropy estimate, so there are no explicit underlying bin frequencies.

Value

entropy.MillerMadow returns an estimate of the Shannon entropy.

References

Miller, G. 1955. Note on the bias of information estimates. Info. Theory Psychol. Prob. Methods II-B:95-100.

Author

Korbinian Strimmer (https://strimmerlab.github.io).

Examples

# load entropy library 
library("entropy")

# observed counts for each bin
y = c(4, 2, 3, 0, 2, 4, 0, 0, 2, 1, 1)  

# estimate entropy using Miller-Madow method
entropy.MillerMadow(y)
#> [1] 2.152593

# compare to empirical estimate
entropy.empirical(y)
#> [1] 1.968382
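
The bias correction can also be verified by hand: the Miller-Madow estimate equals the empirical estimate plus (m-1)/(2n), where m is the number of non-empty bins and n is the total count. A minimal sketch reproducing the values above without the entropy package:

```r
# Miller-Madow correction computed by hand:
# H.MM = H.ML + (m - 1)/(2 n)
y = c(4, 2, 3, 0, 2, 4, 0, 0, 2, 1, 1)
n = sum(y)               # total number of counts (19)
m = sum(y > 0)           # number of non-empty bins (8)
p = y[y > 0]/n           # empirical bin frequencies
H.ML = -sum(p*log(p))    # empirical (maximum-likelihood) entropy
H.MM = H.ML + (m - 1)/(2*n)
H.MM
#> [1] 2.152593
```

Note that empty bins drop out of the empirical entropy sum but still reduce m, which is why the correction depends only on the number of occupied bins.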