SuperLectures.com

FAST DAMPED GAUSS-NEWTON ALGORITHM FOR SPARSE AND NONNEGATIVE TENSOR FACTORIZATION

Full Paper at IEEE Xplore

Non-negative Tensor Factorization and Blind Separation

Presenter: Anh-Huy Phan. Authors: Anh-Huy Phan, Brain Science Institute, Japan; Petr Tichavský, Institute of Information Theory and Automation, Czech Republic; Andrzej Cichocki, Brain Science Institute, Japan

Alternating optimization algorithms for canonical polyadic decomposition (with or without nonnegativity constraints) typically have update rules with low computational cost, but can suffer from swamps, bottlenecks, and slow convergence. All-at-once algorithms can cope with these problems, but usually demand significant temporary storage and a high computational cost. In this paper, we propose a low-complexity all-at-once algorithm for sparse and nonnegative tensor factorization based on the damped Gauss-Newton iteration. In particular, for low-rank approximations, the proposed algorithm avoids explicitly building the Hessian and gradient, which reduces the computational cost dramatically. Moreover, we propose selection strategies for the regularization parameters. The proposed algorithm is shown to substantially outperform state-of-the-art NTF algorithms on difficult benchmarks and in real-world applications such as clustering of the ORL face database.
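For illustration only, the sketch below shows the basic idea behind an all-at-once damped Gauss-Newton (Levenberg-Marquardt) step for a rank-R CP model of a 3-way tensor, with nonnegativity kept by a simple projection and the damping parameter adjusted by a standard accept/reject heuristic. It explicitly forms the Jacobian and the damped normal equations, which is exactly what the paper's fast algorithm avoids, so this is a naive reference implementation under my own assumptions, not the proposed method; all function names are illustrative.

```python
# Naive all-at-once damped Gauss-Newton step for rank-R CP decomposition of a
# 3-way tensor. The full Jacobian and (J^T J + mu*I) are formed explicitly,
# which the paper's fast algorithm deliberately avoids; nonnegativity is kept
# by projection here, which is only an illustrative simplification.
import numpy as np

def cp_reconstruct(A, B, C):
    """X_hat[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r]."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

def cp_dgn_step(X, A, B, C, mu):
    """One damped Gauss-Newton update of all factors (A, B, C) at once."""
    I, J, K = X.shape
    R = A.shape[1]
    r = (X - cp_reconstruct(A, B, C)).ravel()               # residual vector
    # Jacobian blocks of vec(X_hat) w.r.t. vec(A), vec(B), vec(C) (row-major).
    JA = np.einsum('ip,jr,kr->ijkpr', np.eye(I), B, C).reshape(I * J * K, I * R)
    JB = np.einsum('ir,jp,kr->ijkpr', A, np.eye(J), C).reshape(I * J * K, J * R)
    JC = np.einsum('ir,jr,kp->ijkpr', A, B, np.eye(K)).reshape(I * J * K, K * R)
    Jm = np.hstack([JA, JB, JC])
    # Damped normal equations: (J^T J + mu*I) delta = J^T r.
    n = Jm.shape[1]
    delta = np.linalg.solve(Jm.T @ Jm + mu * np.eye(n), Jm.T @ r)
    dA, dB, dC = np.split(delta, [I * R, I * R + J * R])
    # Nonnegativity via projection (a simplification, not the paper's scheme).
    A = np.maximum(A + dA.reshape(I, R), 0)
    B = np.maximum(B + dB.reshape(J, R), 0)
    C = np.maximum(C + dC.reshape(K, R), 0)
    return A, B, C

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    I, J, K, R = 6, 5, 4, 3
    A0, B0, C0 = rng.random((I, R)), rng.random((J, R)), rng.random((K, R))
    X = cp_reconstruct(A0, B0, C0)                           # noiseless test tensor
    A, B, C = rng.random((I, R)), rng.random((J, R)), rng.random((K, R))
    mu = 1.0
    for it in range(50):
        A1, B1, C1 = cp_dgn_step(X, A, B, C, mu)
        if np.linalg.norm(X - cp_reconstruct(A1, B1, C1)) < np.linalg.norm(X - cp_reconstruct(A, B, C)):
            A, B, C = A1, B1, C1
            mu *= 0.5        # step accepted: relax the damping
        else:
            mu *= 2.0        # step rejected: increase the damping
    err = np.linalg.norm(X - cp_reconstruct(A, B, C)) / np.linalg.norm(X)
    print("relative error:", err)
```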


Speech transcript

Slides


Slide 1: 0:00:22
Slide 2: 0:01:24
Slide 3: 0:04:10
Slide 4: 0:05:41
Slide 5: 0:06:21
Slide 6: 0:06:39
Slide 7: 0:07:41
Slide 8: 0:08:57
Slide 9: 0:09:37
Slide 10: 0:09:58
Slide 11: 0:10:46
Slide 12: 0:10:55
Slide 13: 0:11:23
Slide 14: 0:13:21


Lecture information

Recorded: 2011-05-26 17:15 - 17:35, Club B
Added: 22 June 2011, 03:50
Views: 37
Video resolution: 1024x576 px, 512x288 px
Video length: 0:17:44
Audio track: MP3 [5.98 MB], 0:17:44