A note on EM algorithm for mixture models
dc.citation.doi | doi:10.1016/j.spl.2012.10.017 | en_US |
dc.citation.epage | 526 | en_US |
dc.citation.issue | 2 | en_US |
dc.citation.jtitle | Statistics and Probability Letters | en_US |
dc.citation.spage | 519 | en_US |
dc.citation.volume | 83 | en_US |
dc.contributor.author | Yao, Weixin | |
dc.contributor.authoreid | wxyao | en_US |
dc.date.accessioned | 2013-01-22T17:24:19Z | |
dc.date.available | 2013-01-22T17:24:19Z | |
dc.date.issued | 2013-01-22 | |
dc.date.published | 2013 | en_US |
dc.description.abstract | The expectation-maximization (EM) algorithm is used to maximize the likelihood function or the posterior when the model contains unobserved latent variables. One important application of the EM algorithm is finding the maximum likelihood estimator for mixture models. In this article, we propose an EM-type algorithm to maximize a class of mixture-type objective functions. In addition, we prove the monotone ascending property of the proposed algorithm and discuss some of its applications. | en_US |
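The abstract's setting can be illustrated with a minimal sketch of classical EM for a two-component univariate Gaussian mixture. This is an illustrative example of the standard algorithm only, not the generalized EM-type algorithm the article proposes; all function names and initialization choices here are the sketch's own assumptions.

```python
import math
import random

def em_two_gaussians(data, iters=50):
    """Fit a two-component 1-D Gaussian mixture by EM (illustrative sketch).

    Returns the fitted parameters and the log-likelihood trace, which EM
    guarantees to be non-decreasing -- the monotone ascent property that
    the article extends to a broader class of mixture-type objectives.
    """
    # Crude initialization from the data range (an assumption of this sketch).
    mu1, mu2 = min(data), max(data)
    s1 = s2 = (max(data) - min(data)) / 4 or 1.0
    pi = 0.5
    loglik_trace = []

    def norm_pdf(x, mu, s):
        return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))

    for _ in range(iters):
        # E-step: posterior responsibility of component 1 for each point,
        # accumulating the observed-data log-likelihood along the way.
        resp, ll = [], 0.0
        for x in data:
            p1 = pi * norm_pdf(x, mu1, s1)
            p2 = (1 - pi) * norm_pdf(x, mu2, s2)
            resp.append(p1 / (p1 + p2))
            ll += math.log(p1 + p2)
        loglik_trace.append(ll)

        # M-step: responsibility-weighted maximum likelihood updates.
        n1 = sum(resp)
        n2 = len(data) - n1
        pi = n1 / len(data)
        mu1 = sum(r * x for r, x in zip(resp, data)) / n1
        mu2 = sum((1 - r) * x for r, x in zip(resp, data)) / n2
        s1 = math.sqrt(sum(r * (x - mu1) ** 2 for r, x in zip(resp, data)) / n1) or 1e-6
        s2 = math.sqrt(sum((1 - r) * (x - mu2) ** 2 for r, x in zip(resp, data)) / n2) or 1e-6

    return (pi, mu1, s1, mu2, s2), loglik_trace

# Example usage on synthetic data from two well-separated components.
random.seed(0)
data = [random.gauss(0, 1) for _ in range(100)] + [random.gauss(5, 1) for _ in range(100)]
params, trace = em_two_gaussians(data, iters=30)
```

Checking that `trace` never decreases is a direct numerical verification of the ascent property for this special case.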
dc.identifier.uri | http://hdl.handle.net/2097/15224 | |
dc.language.iso | en_US | en_US |
dc.relation.uri | http://www.sciencedirect.com/science/article/pii/S0167715212003896 | en_US |
dc.subject | Adaptive regression | en_US |
dc.subject | EM algorithm | en_US |
dc.subject | Edge-preserving smoothers | en_US |
dc.subject | Mode | en_US |
dc.subject | Robust regression | en_US |
dc.title | A note on EM algorithm for mixture models | en_US |
dc.type | Article (author version) | en_US |