Structured AutoEncoders for Subspace Clustering

Journal:
IEEE Transactions on Image Processing
Publication Date:
15 October 2018
X. Peng, J. Feng, S. Xiao, W. Yau, J. T. Zhou and S. Yang, "Structured AutoEncoders for Subspace Clustering," in IEEE Transactions on Image Processing, vol. 27, no. 10, pp. 5076-5086, Oct. 2018. doi: 10.1109/TIP.2018.2848470
Existing subspace clustering methods typically employ shallow models to estimate the underlying subspaces of unlabeled data points and cluster them into corresponding groups. However, due to the limited representative capacity of these shallow models, such methods may fail to handle realistic data that lack a linear subspace structure. To address this issue, we propose a novel subspace clustering approach by introducing a new deep model, the Structured AutoEncoder (StructAE). The StructAE learns a set of explicit transformations that progressively map input data points into nonlinear latent spaces while preserving both the local and the global subspace structure. In particular, to preserve local structure, the StructAE learns a representation for each data point by minimizing the reconstruction error w.r.t. that point itself. To preserve global structure, the StructAE incorporates prior structural information by encouraging the learned representations to preserve specified reconstruction patterns over the entire data set. To the best of our knowledge, StructAE is one of the first deep subspace clustering approaches. Extensive experiments show that the proposed StructAE significantly outperforms 15 state-of-the-art subspace clustering approaches in terms of five evaluation metrics.
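The abstract describes two loss terms: a per-point reconstruction error (local structure) and a term encouraging the latent representations to follow a prescribed reconstruction pattern over the whole data set (global structure). A minimal NumPy sketch of such a combined objective is given below; the function name `structae_loss`, the coefficient matrix `C`, and the weight `lam` are illustrative assumptions based on the abstract, not the authors' actual implementation.

```python
import numpy as np

def structae_loss(X, X_rec, H, C, lam=1.0):
    """Illustrative sketch of a StructAE-style objective (not the paper's code).

    X     : (n, d) input data points
    X_rec : (n, d) decoder reconstructions of X (local structure term)
    H     : (n, k) latent representations of the n points
    C     : (n, n) prior reconstruction-pattern matrix over the data set,
            e.g. a self-expression coefficient matrix (global structure term)
    lam   : assumed trade-off weight between the two terms
    """
    # Local structure: each point should reconstruct itself.
    local = np.sum((X - X_rec) ** 2)
    # Global structure: latent codes should follow the pattern encoded in C,
    # i.e. each representation is approximated by a combination of the others.
    global_ = np.sum((H - C @ H) ** 2)
    return local + lam * global_
```

When both terms vanish (perfect self-reconstruction and latent codes exactly satisfying the pattern in `C`), the loss is zero; training would minimize this quantity over the autoencoder parameters.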
Funding Info:
X. Peng was partially supported by the Fundamental Research Funds for the Central Universities under Grant YJ201748, by NSFC under Grant 61432012 and Grant U1435213, and by the Fund of Sichuan University - Tomorrow Advancing Life (TAL). J. Feng was partially supported by NUS startup R-263-000-C08-133, MOE Tier-I R-263-000-C21-112, NUS IDS R-263-000-C67-646, ECRA R-263-000-C87-133, and MOE Tier-II R-263-000-D17-112. W. Yau was supported by Singapore's RIE2020 AME-Programmatic Grant A1687b0033. S. Yang was supported by NSFC under Grant 61501312.
Files uploaded:

pengx-tip-2017-v4.pdf (1.02 MB, PDF)