Abstract
In recent years, multi-view subspace clustering has attracted extensive attention. To improve clustering performance, previous work explores the consistency and specificity among different views by making the common representation matrix as close as possible to the representation matrix learned in each view. However, elements that correspond to a similar degree of relationship strength often have very different magnitudes in the representation matrices learned from different views. In this situation, the above strategy causes the information of some views to be ignored or magnified. To overcome this limitation, we propose a novel multi-view subspace clustering method in this paper. By scaling the representation matrix learned in each view, the proposed method normalizes the degree of relationship strength in every view to a unified measure standard, so the consistency and specificity among different views can be mined more effectively. In addition, we provide a theoretical analysis of the convergence and computational complexity of our numerical algorithm. The experimental results on several benchmark data sets indicate that our proposed method is both effective and efficient for the problem of multi-view subspace clustering.
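To make the idea of a unified measure standard concrete, the following is a minimal sketch, not the exact formulation of the paper: it assumes each view's learned representation matrix is rescaled by its largest absolute entry so that relationship strengths from different views become comparable before a common representation is formed. The function names and the averaging step are illustrative assumptions only.

```python
import numpy as np

def scale_to_unified_standard(Z_views):
    """Rescale each view's representation matrix so that the strongest
    relationship in every view has the same magnitude (here, 1).
    Illustrative normalization only; the paper's scaling may differ."""
    scaled = []
    for Z in Z_views:
        max_abs = np.abs(Z).max()
        # Guard against an all-zero matrix.
        scaled.append(Z / max_abs if max_abs > 0 else Z)
    return scaled

def common_representation(Z_views):
    """A naive common representation: the average of the scaled
    per-view matrices (hypothetical, for illustration only)."""
    return np.mean(scale_to_unified_standard(Z_views), axis=0)

# Toy example: two views whose entries live on different magnitude levels.
Z1 = np.array([[0.0, 0.9], [0.9, 0.0]])      # strengths around 1
Z2 = np.array([[0.0, 0.009], [0.009, 0.0]])  # strengths around 0.01
C = common_representation([Z1, Z2])
print(C)  # after scaling, both views contribute comparably
```

Without such a rescaling, the second view above would be almost invisible in the averaged matrix, which is exactly the ignored-or-magnified effect described in the abstract.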
Data Availability
The data sets generated and/or analysed during the current study are available from the corresponding author on reasonable request.
Acknowledgements
This work was supported in part by the National Natural Science Foundation of China under Grant 62076115, in part by the LiaoNing Revitalization Talents Program under Grant XLYC1907169, and in part by the Program of Star of Dalian Youth Science and Technology under Grants 2019RQ033 and 2020RQ053.
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest regarding this work.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix A: Proof of Theorem 1
Suppose we have obtained \(Z^{(v)}_{k}\), \(\mu_{k}\) and \(C_{k}\) at the \(k\)-th iteration of our algorithm, and write the value of the objective function as \(J(Z^{(v)}, \mu, C)\) for brevity. In the next round of iteration, when updating \(Z^{(v)}\), \(\mu_{k}\) and \(C_{k}\) are fixed. Since Eq. (6) gives a closed-form solution, \(Z^{(v)}_{k+1}\) is the optimal solution of the corresponding subproblem. Therefore, it minimizes the objective function with \(\mu_{k}\) and \(C_{k}\) fixed, i.e.,
$$J\bigl(Z^{(v)}_{k+1}, \mu_{k}, C_{k}\bigr) \le J\bigl(Z^{(v)}_{k}, \mu_{k}, C_{k}\bigr).$$
When updating \(\mu\), \(Z^{(v)}_{k+1}\) and \(C_{k}\) are fixed. Since Eqs. (8) and (10) both give closed-form solutions, \(\mu_{k+1}\) is the optimal solution of the corresponding subproblem. Therefore, it minimizes the objective function with \(Z^{(v)}_{k+1}\) and \(C_{k}\) fixed, i.e.,
$$J\bigl(Z^{(v)}_{k+1}, \mu_{k+1}, C_{k}\bigr) \le J\bigl(Z^{(v)}_{k+1}, \mu_{k}, C_{k}\bigr).$$
When updating \(C\), \(Z^{(v)}_{k+1}\) and \(\mu_{k+1}\) are fixed. Since Eq. (12) also gives a closed-form solution, \(C_{k+1}\) is the optimal solution of the corresponding subproblem. Therefore, it minimizes the objective function with \(Z^{(v)}_{k+1}\) and \(\mu_{k+1}\) fixed, i.e.,
$$J\bigl(Z^{(v)}_{k+1}, \mu_{k+1}, C_{k+1}\bigr) \le J\bigl(Z^{(v)}_{k+1}, \mu_{k+1}, C_{k}\bigr).$$
Hence, after one round of iteration the objective function value does not increase, i.e.,
$$J\bigl(Z^{(v)}_{k+1}, \mu_{k+1}, C_{k+1}\bigr) \le J\bigl(Z^{(v)}_{k}, \mu_{k}, C_{k}\bigr).$$
Hence, Theorem 1 is proved.
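The monotone-decrease argument above can be checked numerically with a generic alternating-minimization loop. The sketch below uses a simple stand-in objective with two blocks, each solvable in closed form; it is not the objective function of the proposed model, only an illustration that solving each subproblem exactly with the other blocks fixed never increases the objective value.

```python
import numpy as np

# Stand-in objective with two blocks x and y (hypothetical, for illustration):
# f(x, y) = ||Ax - b||^2 + lam * ||x - y||^2 + mu * ||y||^2
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
lam, mu = 1.0, 0.5

def objective(x, y):
    return (np.sum((A @ x - b) ** 2)
            + lam * np.sum((x - y) ** 2)
            + mu * np.sum(y ** 2))

x, y = np.zeros(5), np.zeros(5)
prev = objective(x, y)
for k in range(10):
    # x-subproblem, closed form: (A^T A + lam I) x = A^T b + lam y
    x = np.linalg.solve(A.T @ A + lam * np.eye(5), A.T @ b + lam * y)
    # y-subproblem, closed form: y = lam x / (lam + mu)
    y = lam * x / (lam + mu)
    cur = objective(x, y)
    assert cur <= prev + 1e-12  # the objective value never increases
    prev = cur
```

Because each block update is the exact minimizer of its subproblem, the chain of inequalities in the proof holds at every iteration, which is what the assertion verifies on this toy problem.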
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.