6 research outputs found
Reconsidering Representation Alignment for Multi-view Clustering
Aligning distributions of view representations is a core component of today's
state of the art models for deep multi-view clustering. However, we identify
several drawbacks with naïvely aligning representation distributions. We
demonstrate that these drawbacks both lead to less separable clusters in the
representation space, and inhibit the model's ability to prioritize views.
Based on these observations, we develop a simple baseline model for deep
multi-view clustering. Our baseline model avoids representation alignment
altogether, while performing similar to, or better than, the current state of
the art. We also expand our baseline model by adding a contrastive learning
component. This introduces a selective alignment procedure that preserves the
model's ability to prioritize views. Our experiments show that the contrastive
learning component enhances the baseline model, improving on the current state
of the art by a large margin on several datasets.
Comment: To appear in CVPR 2021. Code available at
https://github.com/DanielTrosten/mv
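The contrastive learning component described above can be sketched as an InfoNCE-style loss that aligns only matching cross-view pairs rather than whole representation distributions. This is a generic formulation under illustrative names (`contrastive_alignment_loss`, the `temperature` default), not the paper's exact selective-alignment objective:

```python
import numpy as np

def contrastive_alignment_loss(z1, z2, temperature=0.5):
    """InfoNCE-style contrastive loss between two views' representations.

    Representations of the same sample in the two views form positive
    pairs; every other cross-view pair is a negative. (Illustrative
    sketch only; the paper's selective alignment additionally preserves
    the model's ability to prioritize views.)
    """
    # L2-normalise so dot products become cosine similarities
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = (z1 @ z2.T) / temperature  # (n, n) cross-view similarity matrix
    # Cross-entropy with the diagonal (matching pairs) as targets
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))
```

Minimising such a loss pulls matching cross-view pairs together while pushing other pairs apart, which is weaker than forcing whole view distributions to coincide.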
One-Step Clustering with Adaptively Local Kernels and a Neighborhood Kernel
Among the methods of multiple kernel clustering (MKC), some adopt a neighborhood kernel as the optimal kernel, and some use local base kernels to generate an optimal kernel. However, these two approaches have not been combined to leverage their complementary advantages, which limits the quality of the optimal kernel. Furthermore, most existing MKC methods require a two-step strategy to cluster, i.e., first learn an indicator matrix, then execute clustering. This does not guarantee the optimality of the final results. To overcome the above drawbacks, a one-step clustering with adaptively local kernels and a neighborhood kernel (OSC-ALK-ONK) is proposed in this paper, where the two methods are combined to produce an optimal kernel. In particular, the neighborhood kernel improves the expression capability of the optimal kernel and enlarges its search range, while local base kernels avoid redundancy among base kernels and promote their variety. Accordingly, the quality of the optimal kernel is enhanced. Further, a soft block diagonal (BD) regularizer is utilized to encourage the indicator matrix to be BD, which helps to obtain explicit clustering results directly and achieve one-step clustering, thereby overcoming the disadvantage of the two-step strategy. In addition, extensive experiments on eight data sets and comparisons with six clustering methods show that OSC-ALK-ONK is effective.
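The kernel-combination idea can be illustrated with a minimal sketch: form a weighted sum of base kernels and augment it with a neighborhood kernel, here approximated as a simple k-NN sparsification. All function names are my own, and the paper learns these components jointly with the clustering objective rather than computing them in closed form:

```python
import numpy as np

def neighborhood_kernel(K, k):
    """k-NN sparsification of a kernel matrix: for every sample, keep only
    the similarities to its k most similar samples, then symmetrise.
    A rough stand-in for a neighborhood kernel; the paper's construction
    is optimised jointly with the clustering objective.
    """
    mask = np.zeros_like(K, dtype=bool)
    for i in range(K.shape[0]):
        nbrs = np.argsort(K[i])[-k:]   # indices of the k largest entries
        mask[i, nbrs] = True
    mask |= mask.T                     # keep (i, j) if either direction kept
    return np.where(mask, K, 0.0)

def combined_kernel(base_kernels, weights, k):
    """Weighted sum of base kernels, averaged with its neighborhood kernel."""
    K = sum(w * Kp for w, Kp in zip(weights, base_kernels))
    return 0.5 * (K + neighborhood_kernel(K, k))
```

The sparsification discards unreliable long-range similarities, while the weighted sum lets informative base kernels dominate; OSC-ALK-ONK additionally learns the weights and enforces a block-diagonal indicator matrix.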
Late Fusion Multi-view Clustering via Global and Local Alignment Maximization
Multi-view clustering (MVC) optimally integrates complementary information
from different views to improve clustering performance. Although demonstrating
promising performance in various applications, most existing approaches
directly fuse multiple pre-specified similarities to learn an optimal
similarity matrix for clustering, which could cause over-complicated
optimization and intensive computational cost. In this paper, we propose late
fusion MVC via alignment maximization to address these issues. To do so, we
first reveal the theoretical connection of existing k-means clustering and the
alignment between base partitions and the consensus one. Based on this
observation, we propose a simple but effective multi-view algorithm termed
LF-MVC-GAM. It optimally fuses information from each individual view at the
partition level, and maximally aligns the consensus partition with these
weighted base partitions. Such an alignment is beneficial to integrate
partition level information and significantly reduce the computational
complexity by sufficiently simplifying the optimization procedure. We then
design another variant, LF-MVC-LAM to further improve the clustering
performance by preserving the local intrinsic structure among multiple
partition spaces. After that, we develop two three-step iterative algorithms to
solve the resultant optimization problems with theoretically guaranteed
convergence. Further, we provide the generalization error bound analysis of the
proposed algorithms. Extensive experiments on eighteen multi-view benchmark
datasets demonstrate the effectiveness and efficiency of the proposed
LF-MVC-GAM and LF-MVC-LAM on datasets ranging from small to large scale. The
codes of the proposed algorithms are publicly available at
https://github.com/wangsiwei2010/latefusionalignment
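The alignment-maximization step has a clean closed form worth sketching: with per-view rotations and weights held fixed, the orthonormal consensus partition H maximising Tr(H^T B) for B = Σ_p w_p H_p is given by the orthogonal Procrustes solution H = U V^T, where B = U S V^T is the SVD. This is a simplified single step under an assumed function name, not the full three-step LF-MVC iteration:

```python
import numpy as np

def consensus_partition(base_partitions, weights):
    """One alignment-maximisation step: the orthonormal consensus H
    maximising Tr(H^T B), B = sum_p w_p H_p, is H = U V^T from the SVD
    B = U S V^T (orthogonal Procrustes argument).

    Simplified sketch; LF-MVC also learns per-view rotations and the
    view weights themselves, alternating between the subproblems.
    """
    B = sum(w * Hp for w, Hp in zip(weights, base_partitions))
    U, _, Vt = np.linalg.svd(B, full_matrices=False)
    return U @ Vt
```

Because each subproblem of the alternation has such a closed-form or eigen-decomposition solution, the overall optimization stays simple, which is the source of the computational savings the abstract mentions.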
Multiple kernel clustering with local kernel alignment maximization
Kernel alignment has recently been employed for multiple kernel clustering (MKC). However, we find that most existing works implement this alignment in a global manner, which: i) indiscriminately forces all sample pairs to be equally aligned with the same ideal similarity; and ii) is inconsistent with the well-established concept that the similarity evaluated for two farther samples in a high-dimensional space is less reliable. To address these issues, this paper proposes a novel MKC algorithm with local kernel alignment, which only requires that the similarity of a sample to its k-nearest neighbours be aligned with the ideal similarity matrix. Such an alignment helps the clustering algorithm focus on closer sample pairs that shall stay together and avoids involving unreliable similarity evaluation for farther sample pairs. We derive a new optimization problem to implement this idea, and design a two-step algorithm to solve it efficiently. As experimentally demonstrated on six challenging multiple kernel learning benchmark data sets, our algorithm significantly outperforms state-of-the-art comparable methods in the recent literature, verifying the effectiveness and superiority of maximizing local kernel alignment.
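The local-alignment criterion can be sketched as a score: for each sample, measure the cosine alignment between its similarities to its k nearest neighbours and the ideal similarity induced by cluster labels (1 for same cluster, 0 otherwise). This sketch only evaluates the criterion under an assumed function name; the paper optimises it jointly over kernel weights and the clustering indicator matrix:

```python
import numpy as np

def local_kernel_alignment(K, labels, k):
    """Mean cosine alignment between each sample's k-NN similarities and
    the ideal similarity induced by cluster labels.

    Illustrative evaluation of the local-alignment idea; the paper's
    algorithm maximises this quantity rather than merely measuring it.
    """
    ideal = (labels[:, None] == labels[None, :]).astype(float)
    scores = []
    for i in range(K.shape[0]):
        nbrs = np.argsort(K[i])[-k:]           # k most similar samples
        ki, hi = K[i, nbrs], ideal[i, nbrs]
        denom = np.linalg.norm(ki) * np.linalg.norm(hi)
        scores.append(ki @ hi / denom if denom > 0 else 0.0)
    return float(np.mean(scores))
```

Restricting the alignment to each sample's neighbourhood is exactly what excludes the unreliable similarities between far-apart samples that a global alignment would force to match the ideal kernel.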