Open Response Assessments - Reusable Rubrics

TL;DR: 

Course staff can now reuse a rubric from an existing Open Response Assessment (ORA) when creating a new ORA in the same course. This MVP is part of our broader effort to increase task efficiency around authoring and moderating staff-graded assignments.

Who you can contact:

Product Manager: @Sapana Thomas

Engineering Leads: @Mat Carter @Justin Lapierre

UX Lead: @Kevin McGrath

Aurora Engineering: #content-aurora Slack channel

What is it?

Using a Block ID, course staff can specify which ORA’s rubric they want to clone into another ORA within the same course.

In Studio, course staff navigate to the “Rubric” section of the editing modal for the published or unpublished ORA whose rubric they want to clone. After expanding the “Clone Rubric” section, they can copy that ORA’s Block ID.

Next, they can either create a new ORA or navigate to an existing ORA, and open the “Rubric” section of the editing modal. Here, they can either paste the full Block ID of the ORA whose rubric they want to clone or type in a few characters of that Block ID and select it from the dropdown.

Once they choose the correct Block ID, they can select “Clone,” and all of the existing rubric values will be replaced with those from the original ORA.
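For reference, an ORA’s Block ID is its usage key in Studio. In Open edX it typically takes a form like the following, where the organization, course, run, and block hash shown here are illustrative placeholders rather than real values:

    block-v1:YourOrg+CS101+2024_T1+type@openassessment+block@0123456789abcdef0123456789abcdef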

 

Key talking points for customers:

Why work on this area of the platform?

Open Response Assessments were flagged as a top-20 Teaching and Learning platform gap in the last platform map rating exercise about a year ago. The component is commonly cited as a time-intensive instructor process when it uses staff and peer assessments, and that burden was also called out as a reason some course teams haven’t switched over to self-paced courses. ORAs can provide rich learner feedback through rubric-based grading, and, more broadly, open-ended assignment submissions can drive higher-quality courses than our other basic problem submission options.

Why did we build it? What problem does it solve?

Inefficiencies in authoring and moderating ORAs were spotlighted by the Grading research study conducted with Master’s and MicroMasters partners in the spring, which provided both new insights and deeper context around partner feedback previously gathered by other groups at edX. Some of these courses use numerous ORAs and assess students against standard criteria across the ORAs within a course, and sometimes across multiple courses in a program. Having to recreate identical rubrics during course authoring was a major source of frustration for course staff.

Who will notice the change, and where?

This is a general release. All course staff will see the new functionality in the ORA editing modal in Studio.

What impact will it have on course development teams?

Being able to clone an existing rubric within a course should reduce the time course teams spend authoring ORAs and make the way students are assessed more consistent across a course.

Results

With this new functionality, we aim to reduce course authoring friction and improve the ORA authoring experience. We’ll be gathering feedback from partners to better understand the value delivered.

Credits / A Group Effort!

Thanks to the Aurora Squad, including Product Designer Kevin McGrath, for all their hard work making this a reality. Thanks also to Marco Morales for historical ORA knowledge and guidance.