SuperLectures.com

MAXIMUM MARGINAL LIKELIHOOD ESTIMATION FOR NONNEGATIVE DICTIONARY LEARNING

Full Paper at IEEE Xplore

Non-negative Tensor Factorization and Blind Separation

Speaker: Onur Dikmen. Authors: Onur Dikmen, Cédric Févotte, CNRS LTCI / Télécom ParisTech, France

We describe an alternative to standard nonnegative matrix factorisation (NMF) for nonnegative dictionary learning. NMF with the Kullback-Leibler divergence can be seen as maximisation of the joint likelihood of the dictionary and the expansion coefficients under Poisson observation noise. This approach lacks optimality because the number of parameters (which include the expansion coefficients) grows with the number of observations. As such, we describe a variational EM algorithm for optimisation of the marginal likelihood, i.e., the likelihood of the dictionary where the expansion coefficients have been integrated out (given a Gamma conjugate prior). We compare the output of both maximum joint likelihood estimation (i.e., standard NMF) and maximum marginal likelihood estimation (MMLE) on real and synthetic data. The MMLE approach is shown to embed automatic model order selection, similar to automatic relevance determination.
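The baseline the abstract refers to — maximum joint likelihood estimation, i.e., standard KL-NMF — can be sketched with the classical multiplicative updates. This is only an illustrative implementation of that baseline, not of the paper's variational EM / MMLE algorithm; the matrix names `W` (dictionary) and `H` (expansion coefficients) and all parameter defaults are our own choices.

```python
import numpy as np

def kl_nmf(V, K, n_iter=200, eps=1e-9, rng=None):
    """Multiplicative-update NMF minimising the generalised KL divergence
    between V and WH, which is equivalent to maximising the joint Poisson
    likelihood of dictionary W and coefficients H (the standard-NMF baseline;
    NOT the paper's MMLE/variational-EM method)."""
    rng = np.random.default_rng(rng)
    F, N = V.shape
    W = rng.random((F, K)) + eps   # nonnegative dictionary, F x K
    H = rng.random((K, N)) + eps   # nonnegative coefficients, K x N
    for _ in range(n_iter):
        # Update H: H <- H * (W^T (V / WH)) / (W^T 1)
        R = V / (W @ H + eps)
        H *= (W.T @ R) / (W.T @ np.ones_like(V) + eps)
        # Update W: W <- W * ((V / WH) H^T) / (1 H^T)
        R = V / (W @ H + eps)
        W *= (R @ H.T) / (np.ones_like(V) @ H.T + eps)
    return W, H
```

Each multiplicative update is known not to increase the KL divergence. The contrast drawn in the abstract is that here both W and H are estimated jointly (so the parameter count grows with the number of observations N), whereas MMLE integrates H out under a Gamma prior and optimises the likelihood of W alone.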



  Slides


Slide 1 (0:00:16)
Slide 2 (0:00:37)
Slide 3 (0:01:46)
Slide 4 (0:04:49)
Slide 5 (0:05:51)
Slide 6 (0:08:24)
Slide 7 (0:09:40)
Slide 8 (0:10:59)
Slide 9 (0:13:34)
Slide 10 (0:14:24)
Slide 11 (0:15:04)
Slide 12 (0:15:52)
Slide 13 (0:17:03)
Slide 14 (0:18:23)
Slide 15 (0:19:01)


  Lecture Information

Recorded: 2011-05-26 17:35 - 17:55, Club B
Added: 18 June 2011, 03:35
Views: 16
Video resolution: 1024x576 px, 512x288 px
Video length: 0:23:08
Audio track: MP3 [7.84 MB], 0:23:08