...
| Feature | Requirements |
|---|---|
| Stage 0: New XBlock Foundations | Create a new XBlock that can be added as a custom component to an edX site instance, leveraging the new React-based editor components for the configuration and Studio experience. |
| Stage 1: Single entry, basic feedback response | The MVP XBlock echoes the basic text problem type: it requires a prompt from the course designer, collects student input in a text box via a simple submission, and optionally shows feedback (a constant response, not AI-generated). A minimal skeleton sketch follows the table. |
| Stage 2: Multiple linked blocks and gated access | Building on the initial basic entry experience, authors would be able to link multiple short-response blocks in a course. Student submissions from earlier blocks could then be shown to students in other linked course blocks, and access to subsequent blocks could be gated to ensure students have completed earlier blocks or specific assignments if needed. (Visuals coming soon.) |
| Stage 3: Prompt + feedback vs. reflection modes, plus response length suggestions | In prompt + feedback mode, the block's supporting text tells the learner they are answering the prompt for submission, as in the Stage 1 and 2 iterations of the block. In reflection mode, the supporting text is oriented toward capturing a reflection and surfacing the learner's previous reflections throughout the course if the block is set up with multiple course entries. |
| Stage 4: Multiple submissions and submission history | The ability for each block to allow resubmission of student answers, perhaps with the ability to show grades associated with previous answers. This would be controlled by an attempts setting similar to other problem blocks, enabling mastery learning and improving on ORA2's single-submission-only model. (See the attempts sketch below the table.) |
| Stage 5: LLM Response Feedback & Grading | LLM configuration could be determined at the site, organization, or potentially course level. If multiple LLM configurations exist, the block level might allow selection of a model based on cost, model fit, or other reasons determined by the course author. All LLM options would be linked to this XBlock via API (with ChatGPT-3.5 as the initially supported LLM option). LLM-based grading would depend on an LLM prompt that guides the LLM in determining whether a student has completed or mastered a given prompt. Separate from the LLM prompt, authors can provide a scoring guide visible to students that describes how they will be assessed (especially useful if the LLM prompt scores students on a scale or range rather than only for completion of the requirements). (See the grading sketch below the table.) |
| Additional Details / Definition Coming Soon | |
| Stage 6: Supportive LLM Response Feedback | Beyond basic LLM response feedback, this update would support multiple rounds of LLM feedback, giving students guidance from the LLM on how to improve their submissions. |
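
As a rough sketch of what Stages 0–1 could look like in code (the field and handler names `prompt`, `feedback_text`, `student_answer`, and `submit_answer` are illustrative assumptions, not settled design):

```python
from web_fragments.fragment import Fragment
from xblock.core import XBlock
from xblock.fields import Scope, String


class ShortResponseXBlock(XBlock):
    """Stage 0-1 sketch: a prompt, a text box, and constant (non-AI) feedback."""

    display_name = String(default="Short Response", scope=Scope.settings)
    prompt = String(
        help="Prompt shown to the learner",
        default="Describe what you learned in this section.",
        scope=Scope.settings,
    )
    feedback_text = String(
        help="Constant (non-AI) feedback shown after submission",
        default="Thank you for your response.",
        scope=Scope.settings,
    )
    student_answer = String(default="", scope=Scope.user_state)

    def student_view(self, context=None):
        """Render the prompt, a text area, and a submit button."""
        html = f"""
            <div class="short-response">
              <p class="prompt">{self.prompt}</p>
              <textarea class="answer">{self.student_answer}</textarea>
              <button class="submit">Submit</button>
              <p class="feedback"></p>
            </div>
        """
        # The JavaScript (omitted) would POST the textarea value to the
        # submit_answer handler and show the returned feedback text.
        return Fragment(html)

    @XBlock.json_handler
    def submit_answer(self, data, suffix=""):
        """Store the learner's answer and return the constant feedback."""
        self.student_answer = data.get("answer", "")
        return {"feedback": self.feedback_text}
```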
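For Stage 4, the attempts setting could follow the pattern of other problem blocks. A minimal sketch, with assumed field names (`max_attempts`, `attempts`, `submission_history`):

```python
from xblock.core import XBlock
from xblock.fields import Integer, List, Scope, String


class ShortResponseWithAttempts(XBlock):
    """Stage 4 sketch: resubmission governed by an attempts setting."""

    max_attempts = Integer(
        help="Maximum submissions allowed; 0 means unlimited",
        default=0,
        scope=Scope.settings,
    )
    attempts = Integer(default=0, scope=Scope.user_state)
    student_answer = String(default="", scope=Scope.user_state)
    # Earlier answers (and, once grading exists, their scores) kept so a
    # submission history can be shown to the learner.
    submission_history = List(default=[], scope=Scope.user_state)

    @XBlock.json_handler
    def submit_answer(self, data, suffix=""):
        """Accept a resubmission only while attempts remain."""
        if self.max_attempts and self.attempts >= self.max_attempts:
            return {"error": "No attempts remaining."}
        if self.student_answer:
            self.submission_history.append(self.student_answer)
        self.student_answer = data.get("answer", "")
        self.attempts += 1
        return {"attempts_used": self.attempts, "max_attempts": self.max_attempts}
```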
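For Stage 5, the grading call might look something like the sketch below, assuming an OpenAI-compatible API and that the author's grading prompt is stored as a block setting. The function name `grade_with_llm` and the hard-coded model default are illustrative only; in practice the model and credentials would come from the site-, org-, or course-level LLM configuration.

```python
from openai import OpenAI


def grade_with_llm(grading_prompt: str, student_answer: str,
                   model: str = "gpt-3.5-turbo") -> str:
    """Return the LLM's judgment of the answer against the grading prompt.

    The grading prompt is the author-written instruction that tells the LLM
    how to decide completion or mastery; the scoring guide shown to students
    is a separate, human-readable setting.
    """
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": grading_prompt},
            {"role": "user", "content": student_answer},
        ],
    )
    return response.choices[0].message.content
```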
...