FAST DAMPED GAUSS-NEWTON ALGORITHM FOR SPARSE AND NONNEGATIVE TENSOR FACTORIZATION
Non-negative Tensor Factorization and Blind Separation
Speaker: Anh-Huy Phan. Authors: Anh-Huy Phan, Brain Science Institute, Japan; Petr Tichavský, Institute of Information Theory and Automation, Czech Republic; Andrzej Cichocki, Brain Science Institute, Japan
Alternating optimization algorithms for canonical polyadic decomposition (with or without nonnegativity constraints) typically have low per-iteration cost, but can suffer from swamps, bottlenecks, and slow convergence. All-at-once algorithms can avoid such problems, but usually demand significant temporary storage and high computational cost. In this paper, we propose a low-complexity all-at-once algorithm for sparse and nonnegative tensor factorization based on the damped Gauss-Newton iteration. In particular, for low-rank approximations, the proposed algorithm avoids explicitly building the Hessian and gradient, dramatically reducing the computational cost. Moreover, we propose selection strategies for the regularization parameters. The proposed algorithm has been verified to substantially outperform state-of-the-art NTF algorithms on difficult benchmarks and in real-world applications such as clustering of the ORL face database.
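To illustrate the damped Gauss-Newton idea at the heart of the abstract, the sketch below fits a tiny two-parameter nonlinear model with a Levenberg-Marquardt-style damped iteration: solve the damped normal equations, accept a step only if it reduces the squared residual, and adapt the damping parameter accordingly. This is a minimal toy illustration only, not the paper's algorithm: the paper applies the same principle to full CP/NTF factor matrices with structured, low-cost approximations of the Hessian and gradient; the model and parameter values here are invented for the example.

```python
import math

def damped_gauss_newton(xs, ys, a, b, lam=1e-2, iters=50):
    """Fit y = a*exp(b*x) by a damped Gauss-Newton (Levenberg-Marquardt) loop.

    Minimizes the squared residual sum ||y - f(a, b)||^2 by repeatedly
    solving (J^T J + lam*I) delta = J^T r and adapting the damping lam."""
    def residuals(a, b):
        return [y - a * math.exp(b * x) for x, y in zip(xs, ys)]

    def sse(r):
        return sum(ri * ri for ri in r)

    r = residuals(a, b)
    for _ in range(iters):
        # Jacobian of the model w.r.t. (a, b): columns d/da and d/db.
        J = [(math.exp(b * x), a * x * math.exp(b * x)) for x in xs]
        # Damped normal equations (J^T J + lam*I) delta = J^T r,
        # solved in closed form for this 2x2 system.
        g11 = sum(j1 * j1 for j1, _ in J) + lam
        g22 = sum(j2 * j2 for _, j2 in J) + lam
        g12 = sum(j1 * j2 for j1, j2 in J)
        h1 = sum(j1 * ri for (j1, _), ri in zip(J, r))
        h2 = sum(j2 * ri for (_, j2), ri in zip(J, r))
        det = g11 * g22 - g12 * g12
        da = (g22 * h1 - g12 * h2) / det
        db = (g11 * h2 - g12 * h1) / det
        r_new = residuals(a + da, b + db)
        if sse(r_new) < sse(r):   # step reduced the cost: accept, relax damping
            a, b, r = a + da, b + db, r_new
            lam /= 10.0
        else:                     # step failed: reject, increase damping
            lam *= 10.0
    return a, b

# Noiseless synthetic data from a = 2, b = 0.5 (illustrative values).
xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
ys = [2.0 * math.exp(0.5 * x) for x in xs]
a, b = damped_gauss_newton(xs, ys, a=1.0, b=0.1)
```

Large damping makes the step behave like scaled gradient descent (robust far from the solution); small damping recovers the fast, near-quadratic Gauss-Newton convergence near it, which is why the damped iteration sidesteps the swamps that plague alternating updates.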
Lecture information
Recorded: | 2011-05-26 17:15 - 17:35, Club B |
Added: | 22 Jun 2011 03:50 |
Views: | 37 |
Video resolution: | 1024x576 px, 512x288 px |
Video length: | 0:17:44 |
Audio track: | MP3 [5.98 MB], 0:17:44 |