entropy estimates the Shannon entropy H of the random variable Y
from the corresponding observed counts y.
freqs estimates bin frequencies from the counts y.
y: vector of counts.
method: the method employed to estimate entropy (see Details).
unit: the unit in which entropy is measured. The default is "nats" (natural units). For computing entropy in "bits" set unit="log2".
lambda.freqs: shrinkage intensity (for the "shrink" option).
verbose: verbose option (for the "shrink" option).
option passed on to entropy.NSB.
The entropy function allows entropy to be estimated from observed counts by a variety
of methods:
method="ML": maximum likelihood, see entropy.empirical
method="MM": bias-corrected maximum likelihood, see entropy.MillerMadow
method="Jeffreys": entropy.Dirichlet with a=1/2
method="Laplace": entropy.Dirichlet with a=1
method="SG": entropy.Dirichlet with a=1/length(y)
method="minimax": entropy.Dirichlet with a=sqrt(sum(y))/length(y)
method="CS": see entropy.ChaoShen
method="NSB": see entropy.NSB
method="shrink": see entropy.shrink
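For instance, the plug-in ("ML") estimate is simply H = -sum(p*log(p)) with empirical frequencies p = y/sum(y); a minimal base-R sketch (empty bins are dropped, since 0*log(0) is taken as 0):

```r
# observed counts, as in the example below
y = c(4, 2, 3, 0, 2, 4, 0, 0, 2, 1, 1)
p = y / sum(y)        # empirical bin frequencies
p = p[p > 0]          # drop empty bins: 0*log(0) contributes nothing
H = -sum(p * log(p))  # Shannon entropy in nats
H
#> [1] 1.968382
```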
The freqs function estimates the underlying bin frequencies. Note that
estimated frequencies are not available for method="MM", method="CS",
and method="NSB"; in these instances a vector containing NAs is returned.
entropy returns an estimate of the Shannon entropy.
freqs returns a vector with estimated bin frequencies (if available).
# load entropy library
library("entropy")
# observed counts for each bin
y = c(4, 2, 3, 0, 2, 4, 0, 0, 2, 1, 1)
entropy(y, method="ML")
#> [1] 1.968382
entropy(y, method="MM")
#> [1] 2.152593
entropy(y, method="Jeffreys")
#> [1] 2.17948
entropy(y, method="Laplace")
#> [1] 2.257876
entropy(y, method="SG")
#> [1] 2.036888
entropy(y, method="minimax")
#> [1] 2.154091
entropy(y, method="CS")
#> [1] 2.201137
#entropy(y, method="NSB")
entropy(y, method="shrink")
#> Estimating optimal shrinkage intensity lambda.freq (frequencies): 0.7664
#> [1] 2.379603
#> attr(,"lambda.freqs")
#> [1] 0.7663934
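The bin frequency estimates underlying the entropy estimates above can be inspected directly with freqs, which accepts the same method argument (a sketch; output not shown):

```r
freqs(y, method="ML")      # empirical frequencies y/sum(y)
freqs(y, method="shrink")  # regularized (shrinkage) frequencies
freqs(y, method="MM")      # no frequency estimate: returns NAs
```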