A note on EM algorithm for mixture models

DOI: 10.1016/j.spl.2012.10.017
Journal: Statistics and Probability Letters
Volume: 83
Issue: 2
Pages: 519–526
Author: Yao, Weixin
Author EID: wxyao
Date accessioned: 2013-01-22T17:24:19Z
Date available: 2013-01-22T17:24:19Z
Date issued: 2013-01-22
Date published: 2013
Abstract: The expectation-maximization (EM) algorithm is used to maximize the likelihood function or the posterior when the model contains unobserved latent variables. One important application of the EM algorithm is finding the maximum likelihood estimator for mixture models. In this article, we propose an EM-type algorithm to maximize a class of mixture-type objective functions. In addition, we prove the monotone ascent property of the proposed algorithm and discuss some of its applications.
URI: http://hdl.handle.net/2097/15224
Language: en_US
Related URI: http://www.sciencedirect.com/science/article/pii/S0167715212003896
Subjects: Adaptive regression; EM algorithm; Edge-preserving smoothers; Mode; Robust regression
Title: A note on EM algorithm for mixture models
Type: Article (author version)
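The monotone ascent property mentioned in the abstract — each EM iteration never decreases the objective — can be illustrated with the standard EM iteration for a two-component Gaussian mixture. This is a textbook sketch, not the paper's proposed EM-type algorithm for general mixture-type objective functions; the data and initialization below are purely illustrative:

```python
import math
import random

def norm_pdf(x, m, v):
    """Density of a normal distribution with mean m and variance v."""
    return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

def em_gmm(x, iters=50):
    """EM for a two-component 1-D Gaussian mixture.

    Returns (mixing proportion, means, variances, log-likelihood trace).
    """
    # Crude but deterministic initialization: components at the data extremes.
    pi, mu, var = 0.5, [min(x), max(x)], [1.0, 1.0]
    ll_trace = []
    for _ in range(iters):
        # E-step: posterior responsibility of component 1 for each point,
        # plus the current observed-data log-likelihood.
        r, ll = [], 0.0
        for xi in x:
            p1 = pi * norm_pdf(xi, mu[0], var[0])
            p2 = (1 - pi) * norm_pdf(xi, mu[1], var[1])
            r.append(p1 / (p1 + p2))
            ll += math.log(p1 + p2)
        ll_trace.append(ll)
        # M-step: responsibility-weighted means, variances, and proportion.
        n1 = sum(r)
        n2 = len(x) - n1
        mu[0] = sum(ri * xi for ri, xi in zip(r, x)) / n1
        mu[1] = sum((1 - ri) * xi for ri, xi in zip(r, x)) / n2
        var[0] = sum(ri * (xi - mu[0]) ** 2 for ri, xi in zip(r, x)) / n1 + 1e-9
        var[1] = sum((1 - ri) * (xi - mu[1]) ** 2 for ri, xi in zip(r, x)) / n2 + 1e-9
        pi = n1 / len(x)
    return pi, mu, var, ll_trace
```

Running this on data drawn from two well-separated normals, the log-likelihood trace is nondecreasing across iterations (up to floating-point rounding), which is exactly the monotone ascent guarantee that the article extends to a broader class of mixture-type objectives.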

Files:
- Yao StatsProbabLetts 2013.pdf (97.87 KB, Adobe Portable Document Format)
- license.txt (1.62 KB, item-specific license agreed upon at submission)