Publication Date

Spring 2025

Degree Type

Master's Project

Degree Name

Master of Science in Computer Science (MSCS)

Department

Computer Science

First Advisor

Katerina Potika

Second Advisor

Saptarshi Sengupta

Third Advisor

Genya Ishigaki

Keywords

Graph Machine Learning, Graph Neural Networks, Large Language Models, Text Attributed Graphs, Low Rank Adaptation

Abstract

With the rapid rise of language models (LMs) and their ability to capture semantic relationships, large language models (LLMs) are being applied across a wide range of applications. Text-attributed graphs (TAGs) are one notable setting where LLMs can be combined with Graph Neural Networks (GNNs) to improve node classification. TAGs associate textual content with each node and arise in domains such as social networks, citation graphs, and recommendation systems. Effectively modeling TAGs enables deeper insight into the graph and supports better decision-making in these domains. We present GaLoRA, a parameter-efficient framework for integrating structural information into large language models. GaLoRA demonstrates strong performance on the node classification task for TAGs, performing on par with state-of-the-art models while requiring fewer trainable parameters. We experiment with three real-world datasets to showcase GaLoRA’s effectiveness in combining the structural and contextual information of TAGs.
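For readers unfamiliar with the "Low Rank Adaptation" keyword, the sketch below illustrates the general parameter-efficient idea: a pretrained weight matrix is frozen and only a small low-rank update is trained. This is a minimal, hypothetical illustration of standard LoRA, not the GaLoRA implementation described in the project; the class name, rank, and scaling values are assumptions chosen for the example.

```python
# Minimal LoRA sketch (illustrative only, not GaLoRA itself):
# freeze a pretrained linear layer W and learn a low-rank update B @ A,
# so only r * (d_in + d_out) parameters are trained instead of d_in * d_out.
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Wraps a frozen linear layer with a trainable low-rank correction."""

    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():  # freeze the pretrained weights
            p.requires_grad = False
        d_in, d_out = base.in_features, base.out_features
        self.lora_A = nn.Parameter(torch.randn(rank, d_in) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(d_out, rank))  # zero init: no change at start
        self.scaling = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen path plus the scaled low-rank update.
        return self.base(x) + self.scaling * (x @ self.lora_A.T @ self.lora_B.T)


if __name__ == "__main__":
    layer = LoRALinear(nn.Linear(768, 768), rank=8)
    trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
    total = sum(p.numel() for p in layer.parameters())
    print(f"trainable {trainable} / total {total} parameters")
```

Running the example prints roughly 12K trainable parameters against about 600K total, which is the kind of reduction parameter-efficient adaptation aims for.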

Available for download on Monday, May 25, 2026
