evaluate_class_gmm — Evaluate a feature vector by a Gaussian Mixture Model.
evaluate_class_gmm( : : GMMHandle, Features : ClassProb, Density, KSigmaProb)
evaluate_class_gmm computes three different probability
values for a feature vector Features with the Gaussian
Mixture Model (GMM) GMMHandle.
The a-posteriori probability of class i for the sample
Features(x) is computed as

  P(i|x) = \frac{\Pr(i)\, p(x|i)}{\sum_j \Pr(j)\, p(x|j)}

and returned for each class in ClassProb. The formulas for
the calculation of the center density function p(x|j) are described with create_class_gmm.
The probability density of the feature vector is computed as the sum of
the class densities weighted by the prior class probabilities,

  p(x) = \sum_i \Pr(i)\, p(x|i),

and is returned in Density. Here, Pr(i) are
the prior probabilities of the classes as computed by
train_class_gmm. Density can be used for novelty
detection, i.e., to reject feature vectors that do not belong to any
of the trained classes. However, since Density depends on
the scaling of the feature vectors and since Density is a
probability density, and consequently does not need to lie between 0
and 1, the novelty detection can typically be performed more easily
with KSigmaProb (see below).
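The two quantities above can be sketched in a few lines of NumPy. This is an illustrative reimplementation of the math, not the HALCON code; the function name `gmm_posteriors_and_density` and the use of full covariance matrices are assumptions made for the sketch.

```python
import numpy as np

def gmm_posteriors_and_density(x, priors, means, covs):
    """Per-class a-posteriori probabilities P(i|x) and mixture density p(x).

    priors: Pr(i) as estimated by training (cf. train_class_gmm);
    means/covs: per-class Gaussian center parameters (cf. create_class_gmm).
    """
    d = x.shape[0]
    densities = []
    for mu, cov in zip(means, covs):
        diff = x - mu
        norm = 1.0 / np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))
        densities.append(norm * np.exp(-0.5 * diff @ np.linalg.inv(cov) @ diff))
    densities = np.asarray(densities)
    density = float(np.sum(priors * densities))    # Density: p(x)
    class_prob = priors * densities / density      # ClassProb: P(i|x)
    return class_prob, density
```

Note that the returned posteriors always sum to 1, while the density itself is unbounded above, which is why it is awkward as a novelty score.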
A k-sigma error ellipsoid is defined as the locus of points x
for which

  (x - \mu)^T \Sigma^{-1} (x - \mu) = k^2.

In the one-dimensional case this is the interval [\mu - k\sigma, \mu + k\sigma]. For any 1D Gaussian
distribution, approximately 68% of the
occurrences of the random variable lie within this range for k=1,
approximately 95% for k=2, approximately 99.7%
for k=3, etc.
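For the univariate case, P[k] has a closed form via the error function, which reproduces the percentages above. A minimal sketch (the function name is ours, not part of the HALCON API):

```python
from math import erf, sqrt

def k_sigma_prob_1d(k):
    """P[k] for a 1D Gaussian: the probability that the random variable
    lies within [mu - k*sigma, mu + k*sigma]."""
    return erf(k / sqrt(2.0))

# k=1 -> ~0.683, k=2 -> ~0.954, k=3 -> ~0.997
```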
This probability is called the k-sigma probability and is denoted by P[k].
P[k] can be computed numerically for univariate as well as for
multivariate Gaussian distributions, where it should be noted that
for the same value of k,

  P^{(N)}[k] > P^{(N+1)}[k]

(here N and N+1 denote dimensions). For Gaussian
mixture models the k-sigma probability of each class i is computed as

  KSP_i = 1 - P[k_i(x)],  where  k_i(x) = \sqrt{(x - \mu_i)^T \Sigma_i^{-1} (x - \mu_i)}.

The KSP_i are weighted with the class priors Pr(i) and then
normalized.
The maximum value over all classes is returned in KSigmaProb, such
that

  KSigmaProb = \max_i \frac{\Pr(i)}{\max_j \Pr(j)}\, KSP_i.
KSigmaProb can be used for novelty detection, as it indicates how
well a feature vector fits into the distribution of the class it is assigned
to. Typically, feature vectors having values below 0.0001 should be rejected.
Note that the rejection threshold defined by the parameter
RejectionThreshold in classify_image_class_gmm
refers to the KSigmaProb values.
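Putting the pieces together, here is a hedged sketch of a KSigmaProb-style score for 2-D features, where P[k] = 1 - exp(-k²/2) has a closed form. The normalization by the largest prior is our assumption about how "normalized" is meant (so that a sample at the center of the most probable class scores exactly 1.0); it is not a documented detail of evaluate_class_gmm.

```python
import numpy as np
from math import exp

def k_sigma_prob(x, priors, means, covs):
    """Illustrative KSigmaProb-style score for 2-D features.

    k_i(x)^2 is the squared Mahalanobis distance of x to class i;
    for N = 2 dimensions, 1 - P[k] = exp(-k^2 / 2) in closed form.
    """
    vals = []
    for pr, mu, cov in zip(priors, means, covs):
        diff = np.asarray(x) - mu
        k_sq = float(diff @ np.linalg.inv(cov) @ diff)
        vals.append(pr * exp(-0.5 * k_sq))  # Pr(i) * (1 - P[k_i(x)])
    # Assumed normalization: divide by the largest prior, so a sample at
    # the center of the most probable class yields exactly 1.0.
    return max(vals) / max(priors)

# Novelty detection: reject feature vectors whose score falls below ~1e-4.
```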
Before calling evaluate_class_gmm, the GMM must be trained
with train_class_gmm.
The position of the maximum value of ClassProb is usually
interpreted as the class of the feature vector and the corresponding
value as the probability of the class. In this case,
classify_class_gmm should be used instead of
evaluate_class_gmm, because classify_class_gmm
directly returns the class and corresponding probability.
GMMHandle (input_control) class_gmm → (handle)
Handle of the GMM.
Features (input_control) real-array → (real)
Feature vector.
ClassProb (output_control) real-array → (real)
A-posteriori probability of the classes.
Density (output_control) real → (real)
Probability density of the feature vector.
KSigmaProb (output_control) real → (real)
Normalized k-sigma probability for the feature vector.
If the parameters are valid, the operator evaluate_class_gmm returns the value 2 (H_MSG_TRUE). If necessary, an exception is raised.
train_class_gmm,
read_class_gmm
Christopher M. Bishop: “Neural Networks for Pattern Recognition”;
Oxford University Press, Oxford; 1995.
Mario A.T. Figueiredo: “Unsupervised Learning of Finite Mixture
Models”; IEEE Transactions on Pattern Analysis and Machine
Intelligence, Vol. 24, No. 3; March 2002.
Foundation