
Neural Normalized Cut: A Differential and Generalizable Approach for Spectral Clustering

CoRR (2025)

Abstract
Spectral clustering, as a popular tool for data clustering, requires an eigen-decomposition step on a given affinity matrix to obtain the spectral embedding. Nevertheless, such a step suffers from a lack of generalizability and scalability. Moreover, the obtained spectral embeddings can hardly provide a good approximation to the ground-truth partition, and thus a k-means step is adopted to quantize the embedding. In this paper, we propose a simple yet effective, scalable, and generalizable approach, called Neural Normalized Cut (NeuNcut), to learn the clustering membership for spectral clustering directly. In NeuNcut, we properly reparameterize the unknown cluster membership via a neural network, and train the neural network via stochastic gradient descent with a properly relaxed normalized cut loss. As a result, our NeuNcut enjoys a desired generalization ability to directly infer clustering membership for out-of-sample unseen data and hence brings us an efficient way to handle clustering tasks with ultra large-scale data. We conduct extensive experiments on both synthetic data and benchmark datasets, and the experimental results validate the effectiveness and superiority of our approach. Our code is available at: https://github.com/hewei98/NeuNcut.
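
The abstract does not spell out the relaxed normalized cut loss. A common differentiable relaxation from the deep spectral clustering literature (which may differ in detail from NeuNcut's exact objective) replaces hard cluster indicators with soft memberships produced by a network $f_\theta$:

$$
P_{i,:} = \mathrm{softmax}\big(f_\theta(x_i)\big), \qquad
\mathcal{L}_{\text{soft-Ncut}}(P; W) \;=\; K - \sum_{k=1}^{K} \frac{P_{:,k}^{\top} W\, P_{:,k}}{P_{:,k}^{\top} d}, \qquad d = W\mathbf{1},
$$

where $W$ is the affinity matrix, $d$ the vector of node degrees, and $K$ the number of clusters. Minimizing such a loss over $\theta$ with stochastic gradient descent yields cluster memberships directly, avoiding both the eigen-decomposition and the subsequent k-means quantization.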

Key points: The paper proposes a new spectral clustering method, Neural Normalized Cut (NeuNcut), which learns cluster memberships directly with a neural network, making spectral clustering scalable and generalizable.

Method: NeuNcut reparameterizes the unknown cluster memberships with a neural network and trains the network via stochastic gradient descent using a relaxed normalized cut loss (a training sketch follows at the end of this summary).

Experiments: The authors conduct extensive experiments on synthetic data and standard benchmark datasets; the results show that NeuNcut is both effective and superior to competing methods. The specific dataset names are not given in the abstract, but the code is publicly available on GitHub, and the datasets may be included in the repository.
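
As a rough illustration of the training described in the Method point, the PyTorch sketch below trains an MLP to output softmax memberships on mini-batches under the soft normalized-cut relaxation shown above. The network architecture, Gaussian affinity, and hyperparameters are assumptions made for illustration, not the authors' implementation (see the linked GitHub repository for that).

```python
# Minimal sketch of NeuNcut-style training (illustrative; not the authors' code).
# Assumptions: features X, a per-mini-batch Gaussian affinity, and an MLP
# producing soft cluster memberships via softmax.
import torch
import torch.nn as nn

def soft_ncut_loss(P, W, eps=1e-8):
    """Differentiable soft normalized-cut loss.
    P: (b, K) soft memberships; W: (b, b) mini-batch affinity."""
    d = W.sum(dim=1)                                  # node degrees
    assoc = torch.einsum('ik,ij,jk->k', P, W, P)      # P_{:,k}^T W P_{:,k}
    vol = P.t() @ d                                   # P_{:,k}^T d
    K = P.shape[1]
    return K - (assoc / (vol + eps)).sum()

def gaussian_affinity(X, sigma=1.0):
    """Dense RBF affinity for a mini-batch (a simple stand-in for the paper's affinity)."""
    d2 = torch.cdist(X, X).pow(2)
    return torch.exp(-d2 / (2 * sigma ** 2))

class MembershipNet(nn.Module):
    """MLP that maps a sample directly to soft cluster memberships."""
    def __init__(self, in_dim, n_clusters, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_clusters),
        )

    def forward(self, x):
        return torch.softmax(self.net(x), dim=1)

# Toy training loop on random data (hyperparameters are placeholders).
X = torch.randn(2048, 32)
model = MembershipNet(in_dim=32, n_clusters=5)
opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
for step in range(200):
    idx = torch.randint(0, X.shape[0], (256,))        # random mini-batch
    xb = X[idx]
    P = model(xb)
    W = gaussian_affinity(xb)
    loss = soft_ncut_loss(P, W)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Out-of-sample inference: unseen points get memberships without any re-decomposition.
labels = model(torch.randn(10, 32)).argmax(dim=1)
```

Note that the plain relaxed loss can admit degenerate solutions (e.g., all points assigned to one cluster), so a practical implementation would likely add a balance or entropy regularizer; the sketch omits this for brevity.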