SuperLectures.com

FAST DAMPED GAUSS-NEWTON ALGORITHM FOR SPARSE AND NONNEGATIVE TENSOR FACTORIZATION

Full Paper at IEEE Xplore

Non-negative Tensor Factorization and Blind Separation

Presented by: Anh-Huy Phan, Author(s): Anh-Huy Phan, Brain Science Institute, Japan; Petr Tichavský, Institute of Information Theory and Automation, Czech Republic; Andrzej Cichocki, Brain Science Institute, Japan

Alternating optimization algorithms for canonical polyadic decomposition (with or without nonnegativity constraints) have update rules with low computational cost, but can suffer from swamps, bottlenecks, and slow convergence. All-at-once algorithms avoid these problems, but typically demand significant temporary storage and high computational cost. In this paper, we propose a low-complexity all-at-once algorithm for sparse and nonnegative tensor factorization based on the damped Gauss-Newton iteration. In particular, for low-rank approximations, the proposed algorithm avoids explicitly building Hessians and gradients, reducing the computational cost dramatically. Moreover, we propose selection strategies for the regularization parameters. The proposed algorithm has been verified to outperform state-of-the-art NTF algorithms on difficult benchmarks and in real-world applications such as clustering of the ORL face database.
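The damped Gauss-Newton iteration mentioned in the abstract can be sketched on a tiny CP problem. The following is a minimal illustration only, not the paper's algorithm: it builds the Jacobian by finite differences (exactly the explicit construction the paper avoids), and it enforces nonnegativity by a crude projection rather than the regularization strategies the paper proposes. All function and parameter names here are hypothetical.

```python
import numpy as np

def cp_model(theta, dims, R):
    """Rank-R CP model from stacked factor matrices.

    theta packs A (I x R), B (J x R), C (K x R) as one flat vector.
    """
    I, J, K = dims
    A = theta[:I * R].reshape(I, R)
    B = theta[I * R:(I + J) * R].reshape(J, R)
    C = theta[(I + J) * R:].reshape(K, R)
    # T_hat[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r]
    return np.einsum('ir,jr,kr->ijk', A, B, C)

def damped_gn_ntf(T, R, iters=100, mu=1.0, eps=1e-6, seed=0):
    """Damped Gauss-Newton (Levenberg-Marquardt style) CP fit.

    Returns the final parameter vector and the history of squared errors.
    """
    rng = np.random.default_rng(seed)
    dims = T.shape
    n = sum(dims) * R
    theta = rng.random(n)            # nonnegative random start
    errs = []
    for _ in range(iters):
        f0 = cp_model(theta, dims, R).ravel()
        r = T.ravel() - f0
        err = r @ r
        errs.append(err)
        # Finite-difference Jacobian of the model, for illustration only;
        # the paper derives fast closed forms that never build J (or the
        # approximate Hessian J^T J) explicitly.
        Jac = np.empty((f0.size, n))
        for p in range(n):
            tp = theta.copy()
            tp[p] += eps
            Jac[:, p] = (cp_model(tp, dims, R).ravel() - f0) / eps
        # Damped Gauss-Newton step: (J^T J + mu I) delta = J^T r
        step = np.linalg.solve(Jac.T @ Jac + mu * np.eye(n), Jac.T @ r)
        # Crude projection keeps the factors nonnegative; the paper instead
        # handles nonnegativity and sparsity via regularization terms.
        cand = np.maximum(theta + step, 0.0)
        r_cand = T.ravel() - cp_model(cand, dims, R).ravel()
        if r_cand @ r_cand < err:    # accept the step, relax the damping
            theta, mu = cand, mu * 0.5
        else:                        # reject the step, increase the damping
            mu *= 2.0
    return theta, errs
```

On an exact low-rank nonnegative tensor (e.g. a 3x3x3 tensor built from random rank-2 factors), the accept/reject rule makes the squared error non-increasing, so the fit improves monotonically from a random start.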



  Slides


Slide 1: 0:00:22
Slide 2: 0:01:24
Slide 3: 0:04:10
Slide 4: 0:05:41
Slide 5: 0:06:21
Slide 6: 0:06:39
Slide 7: 0:07:41
Slide 8: 0:08:57
Slide 9: 0:09:37
Slide 10: 0:09:58
Slide 11: 0:10:46
Slide 12: 0:10:55
Slide 13: 0:11:23
Slide 14: 0:13:21


  Lecture Information

Recorded: 2011-05-26 17:15 - 17:35, Club B
Added: 2011-06-22 03:50
Number of views: 37
Video resolution: 1024x576 px, 512x288 px
Video length: 0:17:44
Audio track: MP3 [5.98 MB], 0:17:44