
SCIENTIA SINICA Informationis, Volume 46, Issue 7: 819-833 (2016) https://doi.org/10.1360/N112016-00049

Multi-tensor completion with shared factors from multiple sources

  • Received: Mar 9, 2016
  • Accepted: Mar 28, 2016

Abstract


Funded by

National Natural Science Foundation of China, Key Program (61432011)

National Natural Science Foundation of China (61170019)

