Multiple Kernel Clustering With Neighbor-Kernel Subspace Segmentation
Multiple kernel clustering (MKC) has been intensively studied during the last few decades. Although existing MKC algorithms demonstrate promising clustering performance in various applications, they do not sufficiently consider the intrinsic neighborhood structure among base kernels, which can adversely affect clustering performance. In this paper, we propose a simple yet effective neighbor-kernel-based MKC algorithm to address this issue. Specifically, we first define a neighbor kernel, which preserves the block-diagonal structure of the base kernels and strengthens their robustness to noise and outliers. We then linearly combine these base neighbor kernels and extract a consensus affinity matrix through exact-rank-constrained subspace segmentation. The inherent block-diagonal structure of the neighbor kernels better serves the subsequent subspace segmentation, and in turn, the extracted shared structure is further refined through subspace segmentation on the combined neighbor kernels. In this manner, the two learning processes are seamlessly coupled and negotiate with each other to achieve better clustering. Furthermore, we carefully design an efficient iterative optimization algorithm with proven convergence to solve the resulting optimization problem. As a by-product, careful theoretical analysis reveals an interesting insight into the exact-rank constraint in ridge regression: it back-projects the solution of the unconstrained counterpart onto its principal components. Comprehensive experiments on several benchmark data sets demonstrate the effectiveness of the proposed algorithm.
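To make the ridge-regression by-product concrete, the sketch below assumes the standard self-expressive model min_Z ||X - XZ||_F^2 + lambda ||Z||_F^2 (the abstract does not spell out the exact formulation): given the SVD X = U diag(s) V^T, the unconstrained minimizer is Z* = V diag(s_i^2 / (s_i^2 + lambda)) V^T, and the exact-rank-r solution is obtained by keeping only the top-r principal components of Z*. The data matrix, lambda, and r below are hypothetical values chosen for illustration.

```python
import numpy as np

# Illustrative sketch, assuming the standard ridge self-expression model
#   min_Z ||X - X Z||_F^2 + lam * ||Z||_F^2.
# With X = U diag(s) V^T, the unconstrained minimizer is
#   Z* = V diag(s_i^2 / (s_i^2 + lam)) V^T,
# and the exact-rank-r solution truncates it to the top-r components,
# i.e., it "back-projects" Z* onto the leading principal subspace.

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 30))   # hypothetical data matrix
lam, r = 0.5, 5                     # hypothetical regularizer and target rank

U, s, Vt = np.linalg.svd(X, full_matrices=False)
w = s**2 / (s**2 + lam)             # spectral shrinkage weights

Z_full = Vt.T @ np.diag(w) @ Vt               # unconstrained ridge solution
Z_rank_r = Vt[:r].T @ np.diag(w[:r]) @ Vt[:r] # rank-r constrained solution

# Projecting the unconstrained solution onto the span of the top-r right
# singular vectors recovers the rank-constrained solution exactly.
P = Vt[:r].T @ Vt[:r]               # projector onto the principal subspace
assert np.allclose(P @ Z_full @ P, Z_rank_r)
```

Under these assumptions, the assertion holds because Z* is diagonalized by the right singular vectors of X, so projecting it onto the top-r principal subspace simply zeroes out the trailing shrinkage weights, matching the back-projection statement in the abstract.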