Session 4 - Advances in Probabilistic Modeling for Machine Learning: An Approach to Initializing the EM Algorithm in Gaussian Mixtures with an Unknown Number of Components

Presenter/Coauthor Information

Igor Melnykov, University of Minnesota - Duluth

Presentation Type

Invited

Track

Other

Abstract

The EM algorithm is a standard tool for computing maximum likelihood estimates of the parameters of finite mixture models. Because the algorithm is often sensitive to the choice of the initial parameter vector, effective initialization is an important preliminary step for ensuring convergence to the best local maximum of the likelihood. Currently, no initialization method outperforms all others in every practical setting. For Gaussian mixture models, we propose a procedure for initializing the mean vectors and covariance matrices. The proposed approach can be applied in a stepwise manner when the number of components is unknown.
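The abstract does not reproduce the proposed initialization procedure itself, but the sensitivity it describes is easy to demonstrate. The sketch below is a generic textbook EM for a one-dimensional two-component Gaussian mixture (not the authors' method); the helper `em_gmm_1d`, the data, and both starting points are illustrative assumptions. Running EM from two different initial mean vectors can yield different fitted parameters and log-likelihoods, which is exactly why initialization matters.

```python
import numpy as np

def em_gmm_1d(x, means, variances, weights, n_iter=200):
    """Plain EM for a 1-D Gaussian mixture, started from a user-supplied
    initialization.  Illustrative only -- not the procedure proposed in
    the talk.  Returns fitted parameters and the final log-likelihood."""
    means = np.asarray(means, dtype=float).copy()
    variances = np.asarray(variances, dtype=float).copy()
    weights = np.asarray(weights, dtype=float).copy()
    for _ in range(n_iter):
        # E-step: posterior responsibilities of each component for each point
        dens = (weights / np.sqrt(2 * np.pi * variances)
                * np.exp(-(x[:, None] - means) ** 2 / (2 * variances)))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted maximum-likelihood parameter updates
        nk = resp.sum(axis=0)
        weights = nk / len(x)
        means = (resp * x[:, None]).sum(axis=0) / nk
        variances = (resp * (x[:, None] - means) ** 2).sum(axis=0) / nk
    # Log-likelihood under the final parameters
    dens = (weights / np.sqrt(2 * np.pi * variances)
            * np.exp(-(x[:, None] - means) ** 2 / (2 * variances)))
    loglik = np.log(dens.sum(axis=1)).sum()
    return means, variances, weights, loglik

rng = np.random.default_rng(0)
# Two well-separated components centered at -4 and 4
x = np.concatenate([rng.normal(-4, 1, 300), rng.normal(4, 1, 300)])

# A spread-out initialization vs. one with both means inside one cluster
run_a = em_gmm_1d(x, [-1.0, 1.0], [1.0, 1.0], [0.5, 0.5])
run_b = em_gmm_1d(x, [3.5, 4.5], [1.0, 1.0], [0.5, 0.5])
print("init A log-likelihood:", round(run_a[3], 2))
print("init B log-likelihood:", round(run_b[3], 2))
```

Comparing the two log-likelihoods shows how the starting point can steer EM toward different local maxima; the stepwise strategy mentioned in the abstract would additionally vary the number of components.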

Start Date

2-11-2020 9:30 AM

End Date

2-11-2020 10:30 AM



Location

Campanile & Hobo Day Gallery (A & B)
