Compute information-based estimates and distances.

entropy(.data, .base = 2, .norm = FALSE, .do.norm = NA, .laplace = 1e-12)

kl_div(.alpha, .beta, .base = 2, .do.norm = NA, .laplace = 1e-12)

js_div(.alpha, .beta, .base = 2, .do.norm = NA, .laplace = 1e-12, .norm.entropy = FALSE)

cross_entropy(.alpha, .beta, .base = 2, .do.norm = NA,
              .laplace = 1e-12, .norm.entropy = FALSE)
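
Assuming these functions follow the standard information-theoretic definitions, with P and Q the (normalised) input distributions and logarithms taken in base b = `.base`, they compute:

entropy:        \( H(P) = -\sum_i p_i \log_b p_i \)
kl_div:         \( D_{\mathrm{KL}}(P \,\|\, Q) = \sum_i p_i \log_b (p_i / q_i) \)
js_div:         \( D_{\mathrm{JS}}(P \,\|\, Q) = \tfrac{1}{2} D_{\mathrm{KL}}(P \,\|\, M) + \tfrac{1}{2} D_{\mathrm{KL}}(Q \,\|\, M), \quad M = \tfrac{1}{2}(P + Q) \)
cross_entropy:  \( H(P, Q) = -\sum_i p_i \log_b q_i \)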

Arguments

.data

Numeric vector. A distribution of values (it does not need to sum to 1).

.base

Numeric. The base of the logarithm.

.norm

Logical. If TRUE, normalise the entropy by its maximum possible value (the entropy of the uniform distribution).

.do.norm

Logical or NA. If TRUE, normalise the input distributions so that they sum to 1; if FALSE, use the input as given.

.laplace

Numeric. The value of the Laplace correction, a pseudocount added to smooth zero probabilities (see the sketch after this argument list).

.alpha

Numeric vector. The distribution of a random variable.

.beta

Numeric vector. The distribution of a random variable.

.norm.entropy

Logical. If TRUE, normalise the resulting value by the average entropy of the input distributions.
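
A minimal sketch of what normalisation with Laplace correction is assumed to do; normalise_dist is a hypothetical helper for illustration, not part of the package API:

normalise_dist <- function(x, laplace = 1e-12) {
  x <- x + laplace  # Laplace correction: pseudocount smooths exact zeros
  x / sum(x)        # rescale so the result sums to 1
}
P <- normalise_dist(c(5, 3, 0, 2))
sum(P)
#> [1] 1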

Value

A numeric value.

Examples

P <- abs(rnorm(10))
Q <- abs(rnorm(10))
entropy(P)
#> [1] 2.710804
kl_div(P, Q)
#> [1] 0.6685579
js_div(P, Q)
#> [1] 0.1754736
cross_entropy(P, Q)
#> [1] 3.379361
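
Two standard properties are consistent with the output above: cross-entropy decomposes as entropy plus KL divergence (2.710804 + 0.6685579 ≈ 3.379361), and JS divergence, unlike KL divergence, is symmetric in its arguments:

entropy(P) + kl_div(P, Q)              # equals cross_entropy(P, Q)
all.equal(js_div(P, Q), js_div(Q, P))  # TRUE: JS is symmetric
kl_div(Q, P)                           # generally differs from kl_div(P, Q)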