AI Teaching Assistant

Learn how to use the AI Teaching Assistant in Kritik

Written by Support Team

The AI Teaching Assistant enhances the grading process by offering another grading perspective on student work. It serves as an additional reference point, providing a valuable comparison to human-assigned grades based on the same rubric. AI-generated grading recommendations help instructors fine-tune their grading for greater accuracy and consistency.

This is a premium feature that is currently only available upon request.

How It Works

When an activity reaches the evaluation stage, the AI Teaching Assistant automatically evaluates all student submissions according to the provided rubric. The AI considers each criterion carefully, mirroring the instructor's approach to grading.

After the AI has completed its evaluations, the results are made available to instructors under the "Evaluations Received" tab. Each score provided by the AI is accompanied by a detailed explanation of why that score was assigned to a particular criterion. This allows professors and collaborators to understand the reasoning behind the AI's assessments and provides an objective comparison against student evaluations.

How to Use

  • Viewing AI Scores: To access the AI-generated scores, click the "AI Score" button. This displays the AI's evaluation beneath the scores given by human graders. The AI score is then compared to the student peer-assigned grades, and any significant discrepancy (over 20%) between the AI and peer grades is flagged.
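The discrepancy check described above can be sketched roughly as follows. This is only an illustration of the 20% rule mentioned in this article; the function name, the exact comparison formula, and the use of the rubric's maximum score as the baseline are assumptions, not Kritik's actual implementation.

```python
# Hypothetical sketch of the >20% AI-vs-peer discrepancy flag.
# Only the 20% threshold comes from the article; everything else is assumed.

def is_flagged(ai_score: float, peer_score: float, max_score: float,
               threshold: float = 0.20) -> bool:
    """Flag a criterion when the AI and peer scores differ by more
    than 20% of the rubric's score range (an assumed baseline)."""
    return abs(ai_score - peer_score) / max_score > threshold

# Example: on a criterion scored out of 4, an AI score of 4 against a
# peer average of 2 is a 50% gap and would be flagged, while 3 vs 2.5
# is a 12.5% gap and would not be.
```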


  • Applying AI Scores: Instructors or course collaborators can apply the AI scores directly to a student’s creation. This can be done by selecting "Edit Score" and choosing either "Apply AI Score to All" or applying the AI score to individual criteria. This feature allows instructors to leverage AI insights while retaining control over the final grades.

  • Score Comparison View: To use the Score Comparison View, select the "Score Comparison" tab in the "View by" dropdown menu. This will allow professors to see a side-by-side display of the AI score, peer-assigned creation score, and any variances between the scores.
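The side-by-side comparison above amounts to lining up the two scores per criterion and taking their difference. A minimal sketch, assuming per-criterion dictionaries of scores; the criterion names and data shapes here are illustrative only and do not reflect Kritik's internals.

```python
# Illustrative sketch of a per-criterion score comparison table:
# AI score, peer-assigned score, and the variance between them.
# All names and structures here are assumptions for illustration.

def score_comparison(ai_scores: dict, peer_scores: dict) -> list:
    """Return (criterion, ai, peer, variance) rows for each criterion."""
    rows = []
    for criterion, ai in ai_scores.items():
        peer = peer_scores[criterion]
        rows.append((criterion, ai, peer, ai - peer))
    return rows

# Example usage with made-up rubric criteria:
for criterion, ai, peer, variance in score_comparison(
        {"Clarity": 3, "Depth": 4}, {"Clarity": 4, "Depth": 4}):
    print(f"{criterion}: AI={ai} Peer={peer} Variance={variance:+d}")
```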
