
Page Properties

Status: PROPOSAL / DRAFT

Contributing Team: Marco Morales - Schema Education

Earlier Discovery: (ORA v3 - Schema Discovery not yet published!)

Linked Initiatives: N/A

Overview

This proposed XBlock offers a lightweight solution for short-answer text responses with static and LLM-based student feedback options. It fills the gap between pattern-matching short text input problems and multi-step, rubric-graded ORAs. It integrates AI-powered instant feedback and coaching, and allows student responses to be re-displayed across multiple pages in a course. It is an ideal solution for student predictions, reflections, and section summaries, and allows all of these to later be built up into a larger project or essay.


Key Use Cases

  • Instant Feedback

    • As a course designer, I can quickly create short response questions and add an LLM pre-prompt for instant feedback, avoiding time-consuming grading by instructors.

    • As a learner, I can receive instant feedback on short responses throughout the learning process rather than waiting for delayed feedback on my writing after I have moved on.

  • Open-ended Predictions

    • As a course designer, I can ask learners to make open-ended predictions and later explain how their understanding has changed to help them expose and overcome misconceptions.

    • As a learner, I can record my open-ended predictions at a snapshot in time and later review and write how my understanding has changed to clearly demonstrate my growth.

  • Built-up Reflections

    • As a course designer, I can ask learners to reflect frequently throughout the course, and then have them review, synthesize, and build on these in broader reflections.

    • As a learner, I can see my short reflections, with feedback, when I’m reflecting on larger timescales of learning.

  • Iteration

    • As a course designer, I can set checkpoint questions in a course that provide sufficient time and feedback cycles so that most learners can reach a specific form of proficiency before moving on.

    • As a learner, I can use writing as a tool for formulating and building on thoughts with several cycles of feedback before moving past a question, allowing me to advance quickly through topics I understand and slow down through parts that I do not.

...

Based on the above use cases, we break down the high-level scope as follows:

| In Scope | Out of Scope |
| --- | --- |
| Evaluating student responses using an out-of-the-box LLM and a pre-prompt from the course designer. | Training a model grader with human-graded student responses, or fine-tuning an LLM. |
| Creating a new XBlock for optional integration by app providers. | Integrating this work into the existing ORA2 project. |
| Evaluating student responses with a single criterion statement. | Incorporating rubrics and multiple criteria. (Stage 10?) |
| Auto-grading student responses via LLM. | Self, peer, and instructor grading of student responses. |

MVP Specs

Features & Requirements

In order to realize this MVP, we believe the following features will be required, listed below by stage along with their requirements. Refer to the following flow chart for more details:

Stage 0: New XBlock Foundations

Create a new XBlock that is available for custom addition to an edX site instance, leveraging the new React-based editor components for the configuration and Studio experience.

Stage 1: Single entry, basic feedback response

The MVP XBlock echoes the basic text problem type, requiring a prompt from the course designer, collecting student input in a text box via a simple submission, and optionally showing feedback (a static, author-written response, not AI-generated).

The Studio editor should allow entry of the prompt text and of feedback text to be optionally displayed to all students after submission. Grading is based on submission/completion rather than any other criteria for now. Collected responses must be stored in a way that allows later retrieval from other pages using the same XBlock in the same course. This initial MVP echoes the basic text problem type, though much more will be added in future stages to situate this XBlock between the simplicity of a text input block and the complexity of an ORA2 block, with unique teaching and learning opportunities provided by future LLM / generative AI assessment and engagement modes.
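The Stage 1 data flow could be sketched roughly as below. This is a hypothetical illustration only, independent of the real XBlock API: the class name, field names, and return shape are assumptions, not the actual implementation. It shows the three moving parts this stage describes: an author-supplied prompt, optional static feedback, and completion-based grading.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ShortResponseBlock:
    """Hypothetical model of the Stage 1 block (names are illustrative)."""
    prompt: str                      # author-supplied question text
    feedback: str = ""               # optional static feedback shown after submit
    response: Optional[str] = None   # learner's stored submission

    def submit(self, text: str) -> dict:
        """Store the response and grade on completion, not content."""
        self.response = text.strip()
        return {
            # Completion-based score: any non-empty submission earns full credit.
            "score": 1.0 if self.response else 0.0,
            "feedback": self.feedback if self.response else "",
        }

block = ShortResponseBlock(
    prompt="Predict what will happen in the experiment.",
    feedback="Thanks! You will revisit this prediction in Section 3.",
)
result = block.submit("I think the reaction will speed up.")
```

Note that the stored `response` field is what later stages would retrieve for re-display elsewhere in the course.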

Stage 2: Multiple linked blocks, and gated access

Building on the initial basic entry experience, authors would be able to link multiple short response blocks in a course. This means that student submissions from previous blocks could be shown to students in other linked course blocks, and access to subsequent blocks could be gated to ensure students have completed earlier blocks or specific assignments if needed.

Visuals coming soon
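One minimal sketch of the Stage 2 linking and gating logic, under the assumption that submissions are stored per (user, block) pair; the storage shape and function names here are placeholders, not the platform's actual persistence layer:

```python
# Assumed storage shape: (user_id, block_id) -> submitted response text.
course_store = {}

def submit(user_id: str, block_id: str, text: str) -> None:
    """Record a learner's submission for one block."""
    course_store[(user_id, block_id)] = text

def earlier_responses(user_id: str, linked_block_ids: list) -> dict:
    """Collect prior submissions to re-display inside a downstream linked block."""
    return {b: course_store[(user_id, b)]
            for b in linked_block_ids if (user_id, b) in course_store}

def is_unlocked(user_id: str, required_block_ids: list) -> bool:
    """Gate access until every prerequisite block has a submission."""
    return all((user_id, b) in course_store for b in required_block_ids)

submit("alice", "reflect-week-1", "I expected X to happen.")
shown = earlier_responses("alice", ["reflect-week-1", "reflect-week-2"])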

Stage 3: Prompt + feedback vs. reflection modes, plus response length suggestions

In prompt + feedback mode, the supporting text of the block tells the learner that they are answering the prompt for submission, as in the Stage 1 and 2 iterations of the block.

In reflection mode, the supporting text focuses on capturing a reflection and on making previous personal reflections visible throughout the course if the block is set up with multiple course entries.

Stage 4: Multiple submissions and submission history

Each block would allow resubmission of student answers, perhaps with the ability to show grades associated with previous answers. This would be controlled by an attempts setting similar to other problem blocks, enabling mastery learning and improving on ORA2's single-submission model.

Stage 5: LLM Response Feedback & Grading

LLM configuration could be determined at the site, organization, or potentially course level. If multiple LLM configurations exist, the block level might allow selection of a model based on cost, model fit, or other reasons determined by the course author. All LLM options would be linked via API to this XBlock (with GPT-3.5 configured initially as a supported LLM option). LLM-based grading would be contingent on an LLM prompt that guides the LLM in determining whether a student has completed or mastered a given prompt. Separate from the LLM prompt, authors can provide a scoring guide visible to students that describes how they will be assessed (especially useful if the LLM prompt will score students on a scale or range, not just for completion of the requirements).

Additional Details / Definition Coming Soon
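The Stage 5 grading flow could be sketched as below. This is an assumption-laden illustration: the real provider call, its configuration, and the verdict format are all undecided, so the model reply here is stubbed rather than fetched from an actual API. It shows the author's pre-prompt framing the evaluation and the model's reply being parsed into a completion-style score.

```python
import json

def build_grading_messages(pre_prompt: str, question: str, response: str) -> list:
    """Compose a chat-style payload: the author's pre-prompt frames the grading."""
    return [
        {"role": "system", "content": pre_prompt +
         ' Reply with JSON: {"complete": true/false, "feedback": "..."}'},
        {"role": "user", "content": f"Question: {question}\nStudent answer: {response}"},
    ]

def grade(llm_reply_text: str) -> dict:
    """Parse the model's JSON verdict into a completion-style score."""
    verdict = json.loads(llm_reply_text)
    return {"score": 1.0 if verdict["complete"] else 0.0,
            "feedback": verdict["feedback"]}

messages = build_grading_messages(
    pre_prompt="You are grading a short prediction for completeness.",
    question="Predict what will happen in the experiment.",
    response="I think the reaction will speed up.",
)
# Stubbed model reply, since the real call depends on site-level configuration:
reply = '{"complete": true, "feedback": "Good prediction; now justify it."}'
result = grade(reply)
```

Asking the model for a structured (JSON) verdict, rather than free text, is one way to keep the score machine-readable while still surfacing feedback to the learner.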

Stage 6: Supportive LLM Response Feedback

Beyond basic LLM response feedback, this update would support multiple rounds of LLM feedback, letting students improve their submissions with guidance from the LLM on how to do so.

Technical Open Questions

We anticipate the following to be some of the key questions that we will need answered during technical discovery.

...

  •  Marco Morales create page and share in #wg-educators
  •  Marco Morales share this with #wg-product-core as part of the proposal process for open source product contributions
  •  Marco Morales link to video for early development proof of concept for input / visibility.

Product & UX/UI FAQ

The following represent our Product view of key questions. However, we look to the UX/UI and technical teams to validate these as needed.

...