Publication Date
Spring 2025
Degree Type
Master's Project
Degree Name
Master of Science in Computer Science (MSCS)
Department
Computer Science
First Advisor
Chris Pollett
Second Advisor
Thomas Austin
Third Advisor
Robert Chun
Keywords
AI-driven legal case prediction, DistilBERT, parameter-efficient fine-tuning, legal text processing, SHAP explanations, DeBERTa-v3-small, XGBoost
Abstract
The large volume of legal cases faced by judicial professionals has made it
challenging to study cases and predict their results. With advances in research
methods and technology, predicting case outcomes more accurately has become an
important trend. AI-based prediction tools can help manage the large number of
legal texts and documents that cannot possibly be read in full, reduce the number
of cases to be reviewed, and give accurate forecasts of how cases may turn out.
However, current AI legal prediction tools in this domain mostly lack efficiency
and interpretability, as well as the ability to process large volumes of legal
text. Our AI-driven legal case prediction tool addresses these limitations,
advancing the development of systems that not only judge the likely winner of a
legal dispute but also explain the reasoning behind that decision. Using
DistilBERT-base-uncased, a pre-trained large language model that we fine-tune,
we can efficiently understand legal texts while making accurate predictions
about outcomes. The Supreme Court dataset from Kaggle provides the training
foundation for our system. To remain energy efficient, we rely primarily on
parameter-efficient fine-tuning methods such as LoRA, DoRA, and QLoRA. We also
explored other models, such as DeBERTa-v3-small, and classifiers, such as
XGBoost, and experimented with explainable AI (SHAP) to interpret predicted
case outcomes. Our final model achieves an accuracy of 95.6% in predicting
legal case outcomes.
Recommended Citation
Rath, Alisha, "AI Powered Legal Decision Support System" (2025). Master's Projects. 1466.
DOI: https://doi.org/10.31979/etd.877v-yfm2
https://scholarworks.sjsu.edu/etd_projects/1466