Author: Ortakci, Yasin
Date accessioned: 2024-09-29
Date available: 2024-09-29
Date issued: 2024
ISSN: 2215-0986
DOI: https://doi.org/10.1016/j.jestch.2024.101730
Handle: https://hdl.handle.net/20.500.14619/4923

Abstract: Large Language Models (LLMs), among the most advanced representatives of neural networks, have revolutionized the field of natural language processing. Among the many applications of these models, text clustering is attracting growing interest. In particular, because LLMs encode text with richer semantic and contextual information than existing methods in the literature, they produce more successful results when combined with clustering algorithms. However, since these models are not specifically designed for text clustering, their processing times can exceed acceptable runtime thresholds. To address this challenge, the Sentence-BERT (SBERT) model has been proposed as a solution, offering the ability to measure text similarity accurately by transforming entire texts into dense, fixed-size vectors. SBERT has been integrated into various LLMs, resulting in diverse SBERT model variants. This study assesses the transfer learning capabilities of SBERT models in the context of text clustering. Furthermore, it investigates the influence of CLS (classification token), mean, and max pooling techniques on the performance of these models. To this end, we applied these pooling techniques to DistilBERT-, DistilRoBERTa-, ALBERT-, and MPNET-based SBERT models and compared their performance on different corpora. The results show that there is no clear superiority among the SBERT models. However, mean pooling emerged as the most effective method in 13 out of 16 text clustering tasks. This finding underscores the high compatibility of the mean pooling technique with SBERT models.

Language: en
Rights: info:eu-repo/semantics/openAccess
Keywords: SBERT; Large language models; Sentence embeddings; Text clustering; Pooling techniques
Title: Revolutionary text clustering: Investigating transfer learning capacity of SBERT models through pooling techniques
Type: Article
Scopus ID: 2-s2.0-85195462434
Scopus quartile: Q1
Volume: 55
WOS ID: WOS:001252306300001
WOS quartile: N/A
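As context for the pooling comparison described in the abstract, the sketch below illustrates how the three strategies it names (CLS, mean, and max pooling) turn a transformer's token embeddings into fixed-size sentence vectors. This is not the paper's code: the checkpoint name `distilbert-base-uncased`, the toy sentences, and the use of the Hugging Face `transformers` API are illustrative assumptions, not details taken from the study.

```python
# A minimal sketch (not the paper's code) of CLS, mean, and max pooling
# over token embeddings. Checkpoint and sentences are illustrative choices.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModel.from_pretrained("distilbert-base-uncased")

sentences = ["Text clustering groups similar documents.",
             "SBERT maps sentences to dense vectors."]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    token_embeddings = model(**inputs).last_hidden_state  # (batch, seq_len, hidden)

# 1.0 for real tokens, 0.0 for padding; unsqueezed to broadcast over hidden dim.
mask = inputs["attention_mask"].unsqueeze(-1).float()

# CLS pooling: take the first token's embedding as the sentence vector.
cls_embedding = token_embeddings[:, 0, :]

# Mean pooling: average the real-token embeddings, ignoring padding.
mean_embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)

# Max pooling: element-wise max over real tokens (padding set to -inf first).
masked = token_embeddings.masked_fill(mask == 0, float("-inf"))
max_embedding = masked.max(dim=1).values

print(cls_embedding.shape, mean_embedding.shape, max_embedding.shape)
```

Each pooled tensor has shape (batch, hidden), i.e., one fixed-size vector per sentence; such vectors can then be passed to any standard clustering algorithm (e.g., k-means) to build the kind of text clustering pipeline the study evaluates.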