cupyx.scipy.stats.entropy
- cupyx.scipy.stats.entropy(pk, qk=None, base=None, axis=0)
Calculate the entropy of a distribution for given probability values.

If only probabilities `pk` are given, the entropy is calculated as `S = -sum(pk * log(pk), axis=axis)`.

If `qk` is not None, then compute the Kullback-Leibler divergence `S = sum(pk * log(pk / qk), axis=axis)`.

This routine will normalize `pk` and `qk` if they don’t sum to 1.

- Parameters:
  - pk (ndarray) – Defines the (discrete) distribution. `pk[i]` is the (possibly unnormalized) probability of event `i`.
  - qk (ndarray, optional) – Sequence against which the relative entropy is computed. Should be in the same format as `pk`.
  - base (float, optional) – The logarithmic base to use; defaults to `e` (natural logarithm).
  - axis (int, optional) – The axis along which the entropy is calculated. Default is 0.
- Returns:
  S – The calculated entropy.
- Return type:
  cupy.ndarray
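
A minimal usage sketch based on the signature above (it assumes a working CuPy installation with a CUDA device available); the commented values are the standard closed-form results for a fair coin:

```python
import cupy as cp
from cupyx.scipy.stats import entropy

# Shannon entropy of a fair coin in nats: -sum(pk * log(pk)) = ln(2).
pk = cp.asarray([0.5, 0.5])
print(entropy(pk))          # ~0.6931

# The same distribution measured in bits (base-2 logarithm).
print(entropy(pk, base=2))  # ~1.0

# Kullback-Leibler divergence of the fair coin from a biased coin:
# sum(pk * log(pk / qk)).
qk = cp.asarray([0.9, 0.1])
print(entropy(pk, qk=qk))   # ~0.5108

# Unnormalized inputs are normalized before the computation,
# so [2, 2] gives the same entropy as [0.5, 0.5].
print(entropy(cp.asarray([2.0, 2.0])))  # ~0.6931
```

As documented above, the result is a `cupy.ndarray` (0-d for 1-D input); call `.item()` on it if a Python scalar is needed.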