Publication Date

Fall 2025

Degree Type

Thesis - Campus Access Only

Degree Name

Master of Science (MS)

Department

Computer Engineering

Advisors

KaiKai Liu; Bernardo Flores; Mahima Agumbe Suresh

Abstract

Large Language Models (LLMs) exhibit remarkable reasoning capabilities on mathematical and logical tasks. However, their application in healthcare is constrained by the computational demands of models that often exceed 100 billion parameters. Small Language Models (<10B parameters) present a feasible alternative for resource-limited environments, but they exhibit reduced capacity for multi-step reasoning and logical coherence, both of which are critical for clinical decision-making. Although recent advances in prompt engineering and fine-tuning are promising, existing Chain-of-Thought methods still require large training datasets, often comprising tens of thousands of samples, coupled with complex fine-tuning procedures, limiting practical adoption. To address these challenges, this thesis investigates feedback-guided refinement strategies that enhance medical reasoning using minimal training data. We develop a multi-stage pipeline in which prompts function as navigational beacons for reasoning-path generation, actively guiding the reasoning process rather than merely verifying its results. Through iterative prompt refinement and preservation-aware feedback mechanisms, the approach identifies and retains valid reasoning steps while correcting logical inconsistencies, and in doing so curates high-quality training samples for fine-tuning. Experimental evaluation on standard medical benchmarks shows that the approach achieves performance comparable to methods requiring substantially larger datasets, demonstrating that feedback-guided refinement offers a practical framework for medical reasoning in resource-constrained settings.
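Since the thesis itself is campus-access only, the abstract is the only description of the pipeline available here. As a rough illustration of what a preservation-aware, feedback-guided refinement loop of the kind it describes might look like, the Python sketch below stubs out the model and verifier; every function name (query_model, verify_step, refine_prompt, refine) and all of the control logic are hypothetical reconstructions from the abstract, not the thesis's actual implementation.

# Hypothetical sketch of a feedback-guided refinement loop, reconstructed
# only from the abstract; names and logic are illustrative placeholders.

def query_model(prompt: str) -> list[str]:
    # Stub standing in for a small (<10B) language model. A real pipeline
    # would call the model here; we fake one weak step on the first pass
    # and a clean chain once feedback appears in the prompt.
    if "Avoid:" in prompt:
        return ["symptom suggests X", "lab value confirms X", "answer: X"]
    return ["symptom suggests X", "UNSUPPORTED leap", "answer: X"]

def verify_step(step: str) -> bool:
    # Stub verifier: flag logically inconsistent steps so feedback can
    # target them rather than discarding the whole reasoning chain.
    return "UNSUPPORTED" not in step

def refine_prompt(prompt: str, kept: list[str], failed: list[str]) -> str:
    # Preservation-aware feedback: valid steps are carried forward so the
    # next attempt extends them instead of starting from scratch.
    keep = "\n".join(f"Keep: {s}" for s in kept)
    avoid = "\n".join(f"Avoid: {s}" for s in failed)
    return f"{prompt}\n{keep}\n{avoid}"

def refine(question: str, max_rounds: int = 3) -> list[str] | None:
    prompt = question
    for _ in range(max_rounds):
        steps = query_model(prompt)
        failed = [s for s in steps if not verify_step(s)]
        if not failed:
            return steps  # fully verified chain -> candidate training sample
        kept = [s for s in steps if verify_step(s)]
        prompt = refine_prompt(prompt, kept, failed)
    return None  # round budget exhausted; discard rather than keep a bad chain

if __name__ == "__main__":
    print(refine("Patient presents with ..."))

In a sketch like this, chains that pass verification would be pooled as the small curated fine-tuning set the abstract mentions, which is how a few iterations of feedback could substitute for tens of thousands of raw training samples.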
