gmm#

diffsptk.GMM#

alias of GaussianMixtureModeling

class diffsptk.GaussianMixtureModeling(order, n_mixture, *, n_iter=100, eps=1e-05, weight_floor=1e-05, var_floor=1e-06, var_type='diag', block_size=None, ubm=None, alpha=0, batch_size=None, verbose=False)[source]#

See the SPTK gmm command for details. This module is not differentiable.

Parameters:
order : int >= 0

Order of vector, \(M\).

n_mixture : int >= 1

Number of mixture components, \(K\).

n_iter : int >= 1

Number of iterations.

eps : float >= 0

Convergence threshold.

weight_floor : float >= 0

Floor value for mixture weights.

var_floor : float >= 0

Floor value for variance.

var_type : ['diag', 'full']

Type of covariance.

block_size : list[int]

Block size of covariance matrix.

ubm : tuple of Tensors [shape=((K,), (K, M+1), (K, M+1, M+1))]

Parameters of universal background model.

alpha : float in [0, 1]

Smoothing parameter.

batch_size : int >= 1 or None

Batch size.

verbose : bool or int

If 1, show the total log-likelihood at each iteration; if 2, show a progress bar.
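
A minimal construction sketch, not part of the original documentation; the data and hyperparameter values are illustrative, and the UBM parameters are assumed to come from a previously trained model with the same order and number of mixtures.

>>> import diffsptk
>>> x = diffsptk.nrand(100, 1)
>>> # Train a background model (UBM) with the default diagonal covariance.
>>> ubm = diffsptk.GMM(1, 2, n_iter=50)
>>> ubm_params, _ = ubm(x)
>>> # Adapt a new model toward the UBM; alpha controls the smoothing.
>>> gmm = diffsptk.GMM(1, 2, ubm=ubm_params, alpha=0.1)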

forward(x, return_posterior=False)[source]#

Train Gaussian mixture models.

Parameters:
x : Tensor [shape=(T, M+1)] or DataLoader

Input vectors or a DataLoader yielding input vectors.

return_posterior : bool

If True, return posterior probabilities.

Returns:
params : tuple of Tensors [shape=((K,), (K, M+1), (K, M+1, M+1))]

GMM parameters.

posterior : Tensor [shape=(T, K)] (optional)

Posterior probabilities.

log_likelihood : Tensor [scalar]

Total log-likelihood.

Examples

>>> x = diffsptk.nrand(10, 1)
>>> gmm = diffsptk.GMM(1, 2)
>>> params, log_likelihood = gmm(x)
>>> w, mu, sigma = params
>>> w
tensor([0.1917, 0.8083])
>>> mu
tensor([[ 1.2321,  0.2058],
        [-0.1326, -0.7006]])
>>> sigma
tensor([[[3.4010e-01, 0.0000e+00],
         [0.0000e+00, 6.2351e-04]],
        [[3.0944e-01, 0.0000e+00],
         [0.0000e+00, 8.6096e-01]]])
>>> log_likelihood
tensor(-19.5235)
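
A follow-up sketch, not from the original documentation, showing the optional posterior output; it continues from the example above.

>>> params, posterior, log_likelihood = gmm(x, return_posterior=True)
>>> # posterior holds the per-frame mixture responsibilities, shape (T, K).
>>> w, mu, sigma = params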

set_params(params)[source]#

Set model parameters.

Parameters:
params : tuple of Tensors [shape=((K,), (K, M+1), (K, M+1, M+1))]

Parameters of Gaussian mixture model.
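
A short sketch, not from the original documentation, of carrying trained parameters over to a fresh instance.

>>> x = diffsptk.nrand(10, 1)
>>> gmm = diffsptk.GMM(1, 2)
>>> params, _ = gmm(x)
>>> # Reuse the trained weights, means, and covariances elsewhere.
>>> gmm2 = diffsptk.GMM(1, 2)
>>> gmm2.set_params(params)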

transform(x)[source]#

Transform input vectors based on a single mixture sequence.

Parameters:
x : Tensor [shape=(T, N+1)]

Input vectors.

Returns:
y : Tensor [shape=(T, M-N)]

Output vectors.

indices : Tensor [shape=(T,)]

Selected mixture indices.

log_prob : Tensor [shape=(T,)]

Log probabilities.
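
A hedged sketch, not from the original documentation, of GMM-based mapping: the model is assumed to be trained on joint source-target vectors of order M=3, and the leading N+1=2 dimensions of each joint vector are treated as the source to be mapped to the remaining M-N=2 dimensions.

>>> xy = diffsptk.nrand(100, 3)   # joint vectors, shape (T, M+1) with M=3
>>> gmm = diffsptk.GMM(3, 2)
>>> params, _ = gmm(xy)
>>> x = xy[..., :2]               # source part, shape (T, N+1) with N=1
>>> y, indices, log_prob = gmm.transform(x)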

warmup(x, **lbg_params)[source]#

Initialize model parameters by K-means clustering.

Parameters:
x : Tensor [shape=(T, M+1)] or DataLoader

Training data.

lbg_params : additional keyword arguments

Parameters for the Linde-Buzo-Gray algorithm.

Returns:
out : tuple of Tensors [shape=((K,), (K, M+1), (K, M+1, M+1))]

GMM parameters.
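
A typical warmup-then-train sketch, not from the original documentation; warmup performs the K-means (LBG) initialization and the returned parameters are installed with set_params before EM training.

>>> x = diffsptk.nrand(100, 1)
>>> gmm = diffsptk.GMM(1, 2)
>>> gmm.set_params(gmm.warmup(x))
>>> params, log_likelihood = gmm(x)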

See also

lbg