RAGStudentGPT: A Syllabus-Aligned Retrieval-Augmented Generation Framework for Educational AI Systems

Author(s): Kinshuk Dutta, Sabyasachi Paul, Ankit Anand

Publication #: 2601018

Date of Publication: 06.07.2023

Country: United States

Pages: 1-9

Published In: Volume 9 Issue 4 July-2023

DOI: https://doi.org/10.62970/IJIRCT.v9.i4.2601018

Abstract

Large language models (LLMs) are powerful text generators, but their adoption in formal educational settings is hindered by hallucination and by misalignment with curricula. In education, accuracy means not only factual correctness but also conformity to syllabus boundaries, instructional sequences, and pedagogical objectives.

This paper presents RAGStudentGPT, a syllabus-aligned retrieval-augmented generation framework that enforces curriculum constraints dynamically at inference time. The architecture decouples the parametric language competence acquired during pretraining from non-parametric curriculum control: retrieval is restricted to syllabus-designated units, and generation is constrained to the retrieved content. We formalize curriculum-bounded retrieval-augmented generation (CB-RAG) and show empirically that it improves pedagogical alignment without retraining the model. On syllabus-segmented evaluation datasets, the framework reduces hallucinations by 35% and curriculum violations by 42% relative to unconstrained baselines while preserving linguistic coherence.
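The retrieval-side constraint described above can be sketched in a few lines: candidate passages are filtered to the syllabus-designated units before any relevance scoring, so out-of-curriculum material can never reach the generator. This is a minimal illustrative sketch, not the paper's implementation; all names (`Passage`, `retrieve`, the unit labels) are hypothetical, and the keyword-overlap scorer stands in for whatever retriever the framework actually uses.

```python
# Minimal sketch of curriculum-bounded retrieval (CB-RAG): retrieval is
# confined to syllabus-designated units, and generation would then be
# constrained to the passages returned here. All identifiers are illustrative.
from dataclasses import dataclass


@dataclass
class Passage:
    unit: str   # syllabus unit this passage belongs to
    text: str


def retrieve(query: str, corpus: list[Passage], allowed_units: set[str], k: int = 2) -> list[Passage]:
    """Rank passages by keyword overlap, but only within allowed syllabus units."""
    query_terms = set(query.lower().split())
    # Curriculum bound: filter BEFORE scoring, so off-syllabus text is unreachable.
    in_scope = [p for p in corpus if p.unit in allowed_units]
    scored = sorted(
        in_scope,
        key=lambda p: len(query_terms & set(p.text.lower().split())),
        reverse=True,
    )
    return scored[:k]


corpus = [
    Passage("unit1", "Photosynthesis converts light energy into chemical energy."),
    Passage("unit2", "Cellular respiration releases energy from glucose."),
    Passage("unit9", "Quantum tunneling is not covered this term."),
]

# Only units 1 and 2 are on the current syllabus, so unit 9 is never retrieved
# no matter how well it might match the query.
hits = retrieve("how is energy converted from light", corpus, allowed_units={"unit1", "unit2"})
print([p.unit for p in hits])
```

Because the syllabus filter is applied at query time, updating the curriculum only means changing `allowed_units`; no retraining is involved, which is the property the abstract emphasizes.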

By enabling on-the-fly curriculum updates and reducing retraining demands, the framework supports responsible AI use in education.

Keywords: Educational Language Models, Curriculum Alignment, Retrieval-Augmented Generation, Pedagogical AI, Hallucination Mitigation, Trustworthy AI, Educational AI.
