This paper is concerned with structured simultaneous low-rank and sparse recovery, which can be formulated as a rank and zero-norm regularized least squares problem with the hard constraint diag(·) = 0. For this class of NP-hard problems, we propose a convex relaxation algorithm by applying the accelerated proximal gradient method to a convex relaxation model, namely the smoothed nuclear norm and weighted l1-norm regularized least squares problem. A theoretical guarantee is provided by establishing error bounds from the iterates to the true solution under mild restricted strong convexity conditions. To the best of our knowledge, this is the first work to characterize the error bound from the iterates of the algorithm to the true solution. Finally, numerical results are reported on random test problems and on synthetic subspace clustering data to verify the efficiency of the proposed convex relaxation algorithm.
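The algorithm described above can be illustrated with a minimal sketch. The function names (`svt`, `soft`, `apg`), the identity-operator test setting, and the treatment of the composite prox (singular value thresholding, then entrywise shrinkage, then zeroing the diagonal) are assumptions for illustration, not the paper's exact method; in particular the true prox of the sum of the nuclear norm and the weighted l1-norm is not separable, so this composite step is only an approximation:

```python
import numpy as np

def svt(X, tau):
    # singular value soft-thresholding: prox of tau * ||X||_*
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(X, tau):
    # entrywise soft-thresholding: prox of tau * ||X||_1
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def apg(A_op, At_op, b, n, lam, mu, step, iters=200):
    """FISTA-style accelerated proximal gradient sketch for
    min 0.5*||A(X) - b||^2 + lam*||X||_* + mu*||X||_1, diag(X) = 0.
    A_op / At_op are the sampling operator and its adjoint (here
    hypothetical placeholders); the composite prox is approximated
    by SVT, then shrinkage, then projecting the diagonal to zero."""
    X = np.zeros((n, n)); Y = X.copy(); t = 1.0
    for _ in range(iters):
        G = Y - step * At_op(A_op(Y) - b)       # gradient step on the smooth term
        Xn = soft(svt(G, step * lam), step * mu)
        np.fill_diagonal(Xn, 0.0)               # hard constraint diag(X) = 0
        tn = (1 + np.sqrt(1 + 4 * t * t)) / 2   # Nesterov momentum update
        Y = Xn + ((t - 1) / tn) * (Xn - X)
        X, t = Xn, tn
    return X
```

With the identity operator, `apg(lambda Z: Z, lambda R: R, B, n, lam, mu, 1.0)` denoises a matrix `B` toward a low-rank, sparse, zero-diagonal estimate.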
To handle tensor data, traditional learning algorithms often flatten tensors into vectors, which destroys the inherent higher-order structure and intrinsic correlations of the original data and causes information loss, or produces high-dimensional vectors that make the subsequent learning prone to overfitting, the curse of dimensionality, and small-sample problems. Many tensor-based classification algorithms have been proposed in recent years, among which the support high-order tensor machine (SHTM) is one of the most effective. Considering the high dimensionality and high redundancy of tensors, this paper proposes a support high-order tensor machine classification algorithm based on multilinear principal component analysis (Multilinear Principal Component Analysis Based Support High-Order Tensor Machine, MPCA+SHTM). The algorithm first applies multilinear principal component analysis to reduce the dimensionality of the tensors, and then trains a support high-order tensor machine on the reduced tensors. Experiments on 12 tensor datasets show that MPCA+SHTM effectively reduces the computation time of SHTM while maintaining test accuracy.
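The dimensionality-reduction step can be sketched as follows. This is a simplified one-pass, HOSVD-style variant of multilinear PCA (the full MPCA algorithm iterates the mode-wise updates until convergence), and the function names `unfold`, `mpca_fit`, and `mpca_transform` are illustrative assumptions, not an implementation of the paper's MPCA+SHTM:

```python
import numpy as np

def unfold(T, mode):
    # mode-n unfolding: move the chosen mode to the front and flatten the rest
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mpca_fit(tensors, ranks):
    """One-pass multilinear PCA sketch: for each mode, keep the leading
    eigenvectors of the mode-wise scatter matrix pooled over all samples."""
    mean = sum(tensors) / len(tensors)
    factors = []
    for mode in range(tensors[0].ndim):
        S = sum(unfold(T - mean, mode) @ unfold(T - mean, mode).T
                for T in tensors)
        w, V = np.linalg.eigh(S)                  # ascending eigenvalues
        factors.append(V[:, ::-1][:, :ranks[mode]])  # top-rank eigenvectors
    return mean, factors

def mpca_transform(T, mean, factors):
    # project a centered tensor onto the factor matrices, mode by mode
    Z = T - mean
    for mode, U in enumerate(factors):
        Z = np.moveaxis(np.tensordot(U.T, np.moveaxis(Z, mode, 0), axes=1),
                        0, mode)
    return Z
```

The reduced tensors returned by `mpca_transform` would then be fed to the SHTM classifier; here the classifier itself is omitted, since the abstract gives no detail on its formulation.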