Publication Date

Spring 2025

Degree Type

Master's Project

Degree Name

Master of Science in Computer Science (MSCS)

Department

Computer Science

First Advisor

Katerina Potika

Second Advisor

Ching-seh (Mike) Wu

Third Advisor

Magdalini Eirinaki

Keywords

Graph neural networks, pretrained sentence transformers, large language models, recommender systems, user-item interaction, contrastive learning, fine-tuning

Abstract

Graph neural networks (GNNs) have emerged as a powerful paradigm for collaborative filtering. However, they often fall short of fully leveraging textual side information, resulting in suboptimal recommendations. To address this limitation, we explore the synergy between GNNs and deep contextual embeddings of item descriptions, aiming to enhance recommendation quality on the Amazon-Books dataset. We propose SemanticGraphRec, which combines GNNs with Large Language Models (LLMs) to exploit both collaborative signals and textual item content. Experimental results demonstrate that incorporating semantic item embeddings produced by fine-tuned LLMs consistently improves performance. Our approach enhances recommendation relevance in sparse-data scenarios by leveraging both textual content and graph structure, offering a promising direction for more context-aware and personalized recommender systems across diverse application domains.
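The following is a minimal, hypothetical sketch of the idea the abstract describes: fusing collaborative signals from a LightGCN-style GNN with semantic item embeddings obtained from a pretrained sentence transformer. The class name SemanticGraphFusion, the additive fusion strategy, and the choice of the all-MiniLM-L6-v2 encoder are illustrative assumptions, not the implementation evaluated in the project (which additionally fine-tunes the language model).

```python
# Sketch only: fuse GNN collaborative embeddings with text embeddings of
# item descriptions. All module and variable names here are illustrative.
import torch
import torch.nn as nn
from sentence_transformers import SentenceTransformer


class SemanticGraphFusion(nn.Module):
    def __init__(self, num_users, num_items, text_dim, emb_dim=64, n_layers=2):
        super().__init__()
        self.num_users, self.num_items = num_users, num_items
        self.user_emb = nn.Embedding(num_users, emb_dim)
        self.item_emb = nn.Embedding(num_items, emb_dim)
        # Project frozen text embeddings into the collaborative space.
        self.text_proj = nn.Linear(text_dim, emb_dim)
        self.n_layers = n_layers

    def forward(self, norm_adj, item_text_emb):
        # Inject semantic item content before graph propagation.
        items = self.item_emb.weight + self.text_proj(item_text_emb)
        x = torch.cat([self.user_emb.weight, items], dim=0)
        layer_outputs = [x]
        for _ in range(self.n_layers):
            # LightGCN-style propagation over the normalized user-item graph.
            x = torch.sparse.mm(norm_adj, x)
            layer_outputs.append(x)
        x = torch.stack(layer_outputs, dim=0).mean(dim=0)  # layer averaging
        return torch.split(x, [self.num_users, self.num_items])


# Encode item descriptions with a pretrained sentence transformer
# (kept frozen in this sketch).
encoder = SentenceTransformer("all-MiniLM-L6-v2")
descriptions = ["A mystery novel set in Victorian London.",
                "An introduction to graph neural networks."]
item_text_emb = torch.tensor(encoder.encode(descriptions))
```

Scoring a user-item pair would then reduce to a dot product between the propagated user and item embeddings, with the text projection letting sparsely interacted items borrow signal from their descriptions.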

Available for download on Monday, May 25, 2026
