Publication Date

Fall 2024

Degree Type

Master's Project

Degree Name

Master of Science in Computer Science (MSCS)

Department

Computer Science

First Advisor

Katerina Potika

Second Advisor

Genya Ishigaki

Third Advisor

William Andreopoulos

Keywords

Distractor generation, MCQ creation, knowledge graphs, natural language processing, contrastive learning

Abstract

Knowledge-based tests are widely used to assess knowledge of a specific subject and have many applications in education and professional certification. These tests usually consist of Multiple Choice Questions (MCQs), in which a question is given with a few possible answers. Along with the correct answer, three or more incorrect answers, called distractors, are provided. MCQs are a popular format for these tests because they are easy to grade, and they can test comprehension at levels ranging from beginner to advanced by using distractors that may confuse unprepared test takers. This project proposes the Knowledge Graph Multiple Choice Generator (KGMCQ), which generates MCQs using knowledge graphs (KGs) and their embeddings. Our generator explores two approaches to MCQ generation: Entity Type Loss Regularization and Contrastive Learning. The first approach encourages the correct answer and the distractors to be of the same type; the second trains the model to distinguish between correct and incorrect entity types. When creating MCQs from a KG, finding questions and their correct answers is a well-studied topic, but finding distractors in the same KG is not well investigated. We conduct experiments on the MetaQA dataset, which provides a KG and question-answer pairs, and evaluate the results with proposed metrics based on the cosine similarity and Euclidean distance of the predicted distractor entity embeddings.
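The evaluation idea named in the abstract, comparing predicted distractor entity embeddings to the correct answer's embedding by cosine similarity and Euclidean distance, can be sketched as follows. This is a minimal illustration only: the embedding vectors and variable names are hypothetical, not taken from the project.

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def euclidean_distance(u, v):
    """Straight-line distance between two embedding vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

# Hypothetical 3-dimensional embeddings: a correct-answer entity and
# two candidate distractors, one near the answer and one far from it.
answer = [0.9, 0.1, 0.2]
distractor_close = [0.8, 0.2, 0.1]
distractor_far = [-0.5, 0.9, 0.7]

# A plausible distractor should be similar (high cosine, low distance)
# to the answer without being identical to it.
print(cosine_similarity(answer, distractor_close))
print(euclidean_distance(answer, distractor_close))
```

Under this kind of metric, a good distractor scores close to, but not equal to, the correct answer, so that it is confusable with it while remaining wrong.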

Available for download on Wednesday, December 31, 2025
