Discovery: How long should different grading tasks usually take?
TL;DR - to build our own understanding (and to assist support/partners), gather order-of-magnitude measurements of how long different types of grading tasks take to propagate through our systems.
Effectively we want to answer, "As <actor>, given a <grade action> how long is a reasonable amount of time to wait to see my changes appear in <other places I would expect that data>?"
Discovery work may include investigation of task logs, rule-of-thumb estimates from support engineers, or deciding we have no way of answering this and would like to start recording task performance.
Provide order-of-magnitude time estimates for the following grade tasks to complete:
Student-level change (e.g. student attempts a problem, staff rescores a problem for one student)
Multi-student problem-level change (e.g. staff rescores a problem for all students, staff performs a bulk update)
Course-wide change (e.g. staff changes grading policy or subsection assignment types)
Determine pipeline/timing discrepancies between when grade info appears in progress page and gradebook views.
Determine order-of-magnitude effect for large events:
Very large course update/resource starvation due to active grading/enrollment period
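If we decide we have no way of answering this from existing logs and want to start recording task performance ourselves (per the discovery options above), the simplest instrumentation is a timing wrapper around each grading task. The sketch below is hypothetical — `record_duration` and the task-type labels are illustrative names, not existing platform identifiers:

```python
import logging
import time
from functools import wraps

logger = logging.getLogger(__name__)

def record_duration(task_type):
    """Log the wall-clock duration of a grading task, tagged by type.

    `task_type` is a free-form label, e.g. "student_level",
    "problem_level_bulk", or "course_wide" (illustrative labels only).
    """
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.monotonic()
            try:
                return fn(*args, **kwargs)
            finally:
                elapsed = time.monotonic() - start
                # One log line per task run; aggregate in a dashboard to
                # get the order-of-magnitude estimates this ticket asks for.
                logger.info(
                    "grade_task_duration task_type=%s seconds=%.2f",
                    task_type, elapsed,
                )
        return wrapper
    return decorator

@record_duration("student_level")
def rescore_problem_for_student(student_id, problem_id):
    # Placeholder for the real rescoring work.
    pass
```

Emitting a structured log line (rather than ad-hoc print statements) means the durations could be charted in whatever dashboards eSRE/SRE already maintain.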
Steps to Reproduce
Reason for Variance
User Impact Summary
Currently reaching out to eSRE/SRE and Gabe to understand which logs/dashboards are already available for generating the expected durations of grade-related tasks.
To add on to what Emilio wrote earlier, Stanford previously had a tool they created for offline grading. It let instructors see how much was done, when it was done, and who was updated successfully and who wasn't, for whatever reason. I forget the particulars, and the UI probably could have used a v2, but it was a very helpful tool, and the insight into progress was appreciated by our course teams:
A follow-up huddle has been scheduled for Thursday, March 24 at 1:30 pm.
: Thanks for the prompt and reminder! Justin, I can share some baselines and some context on what types of problems this work could address. Please feel free to use whatever info would be relevant for discovery!
Current ballpark expectations:
As background, I think current platform performance is along the following general lines. I'd be excited to hear other people's assumptions / the actual typical times!
learners expect their actions to update grades immediately: (upon page refresh)
per-student or per-problem actions by course staff (grade override): (upon page refresh - a couple minutes)
typical assignment-wide or course-wide grading updates in live courses: (~hour, variable)
course-wide grading updates in huge/complex courses (~hour - day, with grumbling)
Friction around reporting ‘slow’ grading
These issues might be worth reviewing to see what shared set of expectations, language, and ways of reporting grade update speed would be helpful
[Note: this ticket had an external root cause, but highlights some challenges of reporting ‘slower than usual’.]
Some instances where people used ‘wait 24 hours’ in troubleshooting to try to rule out any grade-delay-related complications. If we could set a better expectation around wait times (or were able to track tasks), we’d have faster troubleshooting.
Feedback from : Thanks so much for following up, Justin.
Upon reading the ticket, my first thought is about overall re-grading at the course level for all learners. For instance, a course team has made a mistake in the grading policy, and we help them fix it. It is then not clear how long the fix will take, or whether it has been applied correctly. There is an uncertain wait: 1 hour, 2 hours, or 1 day.
It would be awesome to have some sort of "24% re-grading complete" indicator in the instructor tab that helps visualize what is happening on the backend. It would bring some confidence to instructors/partners when there is a grading problem to be addressed.
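The progress indicator described above could be driven by nothing more than a completed/total count from the re-grading task. A minimal sketch, assuming a hypothetical `regrade_progress` helper (the real instructor dashboard would pull these counts from the task's stored state):

```python
def regrade_progress(completed, total):
    """Return a progress string like '24% re-grading complete'.

    Hypothetical helper: `completed` and `total` would come from the
    re-grading task's state (e.g. learners processed vs. learners enrolled).
    """
    if total <= 0:
        # Nothing to regrade (or counts unavailable): report completion.
        return "re-grading complete"
    pct = int(100 * completed / total)
    return f"{pct}% re-grading complete"
```

Usage: `regrade_progress(240, 1000)` would render as "24% re-grading complete" in the instructor tab.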
Grading a single student or a few learners is quite fast, almost instant. UI improvements would still be welcome, for example, "3 learners re-graded successfully" or "something went wrong, try again"...
Please set some time for the huddle. It will probably spark other use cases and remind us which CRs may relate to this.