gmm
- diffsptk.GMM: alias of GaussianMixtureModeling
- class diffsptk.GaussianMixtureModeling(order, n_mixture, n_iter=100, eps=1e-05, weight_floor=1e-05, var_floor=1e-06, var_type='diag', block_size=None, ubm=None, alpha=0, verbose=False)
See this page for details. This module is not differentiable.
- Parameters:
- order : int >= 0
Order of vector.
- n_mixture : int >= 1
Number of mixture components.
- n_iter : int >= 1
Number of iterations.
- eps : float >= 0
Convergence threshold.
- weight_floor : float >= 0
Floor value for mixture weights.
- var_floor : float >= 0
Floor value for variance.
- var_type : ['diag', 'full']
Type of covariance.
- block_size : list[int]
Block size of covariance matrix.
- ubm : tuple of Tensors [shape=((K,), (K, M+1), (K, M+1, M+1))]
Parameters of universal background model.
- alpha : float in [0, 1]
Smoothing parameter.
- verbose : bool
If True, print progress.
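As a rough construction sketch (not taken from the library's examples; the values below are arbitrary and only illustrate how the options above combine):

>>> import diffsptk
>>> # Illustrative values: 13-dimensional vectors (order=12), four mixture
>>> # components, full covariance matrices, and per-iteration progress printed.
>>> gmm = diffsptk.GMM(
...     12,               # order
...     4,                # n_mixture
...     n_iter=50,        # number of EM iterations
...     eps=1e-5,         # convergence threshold
...     var_type="full",  # estimate full covariance matrices
...     verbose=True,     # print progress
... )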
- forward(x)
Train Gaussian mixture models.
- Parameters:
- x : Tensor [shape=(…, M+1)]
Input vectors.
- Returns:
- params : tuple of Tensors [shape=((K,), (K, M+1), (K, M+1, M+1))]
GMM parameters.
- log_likelihood : Tensor [scalar]
Total log-likelihood.
Examples
>>> x = diffsptk.nrand(10, 1)
>>> gmm = diffsptk.GMM(1, 2)
>>> params, log_likelihood = gmm(x)
>>> w, mu, sigma = params
>>> w
tensor([0.1917, 0.8083])
>>> mu
tensor([[ 1.2321,  0.2058],
        [-0.1326, -0.7006]])
>>> sigma
tensor([[[3.4010e-01, 0.0000e+00],
         [0.0000e+00, 6.2351e-04]],
        [[3.0944e-01, 0.0000e+00],
         [0.0000e+00, 8.6096e-01]]])
>>> log_likelihood
tensor(-19.5235)
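A further, assumed usage sketch: the returned parameter tuple has the same shapes as the ubm argument, so a model trained on pooled data can plausibly be passed back in as a universal background model and smoothed toward with alpha > 0. The data sizes and alpha value below are arbitrary.

>>> import diffsptk
>>> x_all = diffsptk.nrand(1000, 4)  # pooled data for the background model
>>> x_tgt = diffsptk.nrand(50, 4)    # smaller target data set
>>> ubm_params, _ = diffsptk.GMM(4, 8)(x_all)
>>> # Assumed workflow: smooth the target model toward the background model.
>>> gmm = diffsptk.GMM(4, 8, ubm=ubm_params, alpha=0.3)
>>> params, log_likelihood = gmm(x_tgt)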