EM algorithm: how it works
Full lecture: http://bit.ly/EM-alg
Mixture models are a probabilistically sound way to do soft clustering. We assume our data is sampled from K different sources (probability distributions). The expectation maximisation (EM) algorithm lets us discover the parameters of these distributions and, at the same time, figure out which source each point comes from.
Video "EM algorithm: how it works" from the Victor Lavrenko channel