
[Deprecated]

Compute information-theoretic estimates (entropy, cross-entropy) and divergences (Kullback-Leibler, Jensen-Shannon) for numeric distributions.

Usage

entropy(.data, .base = 2, .norm = FALSE, .do.norm = NA, .laplace = 1e-12)

kl_div(.alpha, .beta, .base = 2, .do.norm = NA, .laplace = 1e-12)

js_div(.alpha, .beta, .base = 2, .do.norm = NA, .laplace = 1e-12,
       .norm.entropy = FALSE)

cross_entropy(.alpha, .beta, .base = 2, .do.norm = NA,
              .laplace = 1e-12, .norm.entropy = FALSE)
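
Details

Let p and q denote the (normalised) input distributions and b the logarithm base. The four functions compute the standard information-theoretic quantities:

H(p) = -\sum_i p_i \log_b p_i                                      (entropy)
D_{KL}(p \| q) = \sum_i p_i \log_b (p_i / q_i)                     (kl_div)
D_{JS}(p, q) = \tfrac{1}{2} D_{KL}(p \| m) + \tfrac{1}{2} D_{KL}(q \| m),
    where m = \tfrac{1}{2}(p + q)                                  (js_div)
H(p, q) = -\sum_i p_i \log_b q_i                                   (cross_entropy)

The Laplace correction and the optional normalisation described under Arguments are applied to the inputs before these formulas are evaluated.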

Arguments

.data

Numeric vector. The input distribution (raw counts or proportions; see .do.norm).

.base

Numeric. The base of the logarithm.

.norm

Logical. If TRUE, normalises the entropy by its maximum possible value, log_b(n) for a distribution over n outcomes, so the result lies in [0, 1].

.do.norm

Logical or NA. If TRUE, normalises the input distributions so they sum to 1 (applying the Laplace correction first); if FALSE, uses them as given; if NA (the default), checks whether each input already sums to 1 and normalises it only if it does not.

.laplace

Numeric. The value of the Laplace correction: a pseudocount added to each element of the input distributions to avoid zero probabilities and undefined logarithms.

.alpha

Numeric vector. A distribution of some random variable.

.beta

Numeric vector. A distribution of some random variable.

.norm.entropy

Logical. If TRUE, normalises the resulting divergence by the average entropy of the two input distributions.
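
To make the .do.norm and .laplace behaviour concrete, the sketch below shows one way the pre-processing step can be read from the descriptions above. normalise_dist is a hypothetical helper written for this page, not the package's internal code:

normalise_dist <- function(x, do.norm = NA, laplace = 1e-12) {
  # NA means: normalise only if the vector does not already sum to 1
  if (is.na(do.norm)) do.norm <- !isTRUE(all.equal(sum(x), 1))
  if (do.norm) {
    x <- x + laplace  # Laplace correction: avoids zero probabilities
    x <- x / sum(x)   # rescale so the distribution sums to 1
  }
  x
}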

Value

A numeric value.

Examples

P <- abs(rnorm(10))  # unnormalised non-negative weights
Q <- abs(rnorm(10))  # the functions normalise them internally (.do.norm = NA)
entropy(P)
#> [1] 3.021958
kl_div(P, Q)
#> [1] 0.7606774
js_div(P, Q)
#> [1] 0.1411576
cross_entropy(P, Q)
#> [1] 3.782636
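
The usual identities hold up to the Laplace correction, and checking them is a quick sanity test; for instance, cross-entropy decomposes into entropy plus KL divergence, and the Jensen-Shannon divergence is symmetric while the KL divergence generally is not:

# cross-entropy = entropy + KL divergence (up to the Laplace correction)
all.equal(cross_entropy(P, Q), entropy(P) + kl_div(P, Q))
# JS divergence is symmetric in its arguments ...
all.equal(js_div(P, Q), js_div(Q, P))
# ... whereas KL divergence is not, except in degenerate cases
isTRUE(all.equal(kl_div(P, Q), kl_div(Q, P)))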