In-Context Metrics for Studio: Product Requirements

Overview

Course delivery teams are busy and often don’t want to navigate to a separate full-page dashboard to view metrics for their courses or content. To give users engagement and performance metrics right where they edit and view their course content, we will add easy-to-access, easy-to-collapse in-context metrics directly in Studio.

Requirements

Developed as a frontend plugin

In-context metrics will be built with the frontend plugin framework: a frontend plugin that any instance can use, customize, or remove, rendered into frontend plugin slots in the appropriate places on the relevant page(s). The slots themselves can likewise be used, customized, or removed by any instance (even instances that don’t have Aspects).

  • For instances that have Aspects, in-context metrics are enabled by default. Operators can choose to remove or customize the plugin on their instance.

  • For instances that don’t have Aspects, what does this plugin look like? My initial instinct is that the slot is empty by default.

  • The plugin slot supports stacking, so multiple plugins can use it at once (for example, an operator could display Aspects in-context metrics alongside an LLM summarization plugin).
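
As an illustration of the plugin-slot approach, an operator’s env.config.jsx might look roughly like the sketch below. The slot id, package name, and component are hypothetical placeholders, not a shipped API:

```jsx
// env.config.jsx (operator-side MFE config) -- illustrative sketch only.
// The slot id 'in_context_metrics_slot' and the InContextMetrics component
// are hypothetical names, not part of any released plugin.
import { DIRECT_PLUGIN, PLUGIN_OPERATIONS } from '@openedx/frontend-plugin-framework';
import InContextMetrics from '@example/aspects-in-context-metrics'; // hypothetical package

const config = {
  pluginSlots: {
    in_context_metrics_slot: {
      keepDefault: false, // the slot renders nothing by default on non-Aspects instances
      plugins: [
        {
          // Slots stack: an operator could add a second Insert here to show
          // another plugin (e.g. an LLM summarizer) alongside the metrics.
          op: PLUGIN_OPERATIONS.Insert,
          widget: {
            id: 'aspects-in-context-metrics',
            type: DIRECT_PLUGIN,
            RenderWidget: InContextMetrics,
          },
        },
      ],
    },
  },
};

export default config;
```

Removing or swapping the plugin is then a matter of editing this config, which keeps the feature fully optional per instance.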

Overarching functionality

On instances where in-context metrics are enabled:

  • Course Admins can select which staff roles (including Course Admin) can view in-context metrics for their courses. By default, all course staff with Studio access can see in-context metrics on instances where the feature is enabled.

  • It must be clear in the UI to Studio users that in-context metrics are available to view (when enabled). The in-context metrics are hidden by default but can be easily shown and re-hidden by the user.

    • When visible, the user must be able to view the in-context metrics next to/by the content the metrics relate to.

    • When the metrics are disabled or hidden, no calls are made to Superset, and the UI for revealing them is unobtrusive (while still being clear).

  • Default metrics that are shown must be tailored to the type of content displayed (See specific requirements in the next section).

  • If a course is re-run, or exported and re-imported, its metrics are copied over from the source run. By default, the new course run or imported course does not display the copied metrics, but users can choose to view the historic (copied-run) in-context metrics.

  • The visuals can be viewed without zooming or horizontal scrolling (the sidebar is wide enough to display visualizations in full, with legible labels, etc.).

  • It should be possible to override the default in-context metrics with custom ones.

  • It should be possible to add new in-context metrics for custom XBlock types.

Requirements by content type

  • Per component, I want to see the following:

    • For units with discussions enabled:

      • We provide a link (opening in a new tab) to that unit’s discussion so the user can see what learners are saying about it.

        • Confirm this requirement with UX/UI or Design (users can already reach the discussion forums via the View Live button on this page and then the discussions icon, but that takes two steps instead of one).

      • When this unit has a higher-than-average number of discussion posts compared to other discussion-enabled units in the course, can we surface that, along with how far above average this unit’s post count is?
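
A minimal sketch of the above-average comparison, assuming a hypothetical mapping of unit id to post count for discussion-enabled units (the real data would come from Aspects):

```python
def posts_vs_average(posts_per_unit, unit_id):
    """Return how far above (or below) the course average a unit's
    discussion post count is, as a percentage.

    posts_per_unit: dict of unit id -> post count, covering only units
    with discussions enabled (hypothetical shape for illustration).
    """
    average = sum(posts_per_unit.values()) / len(posts_per_unit)
    delta_pct = (posts_per_unit[unit_id] - average) / average * 100
    return round(delta_pct, 1)

counts = {"unit-a": 4, "unit-b": 10, "unit-c": 4}  # course average = 6
posts_vs_average(counts, "unit-b")  # 10 posts is 66.7% above average
```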

    • For video components:

      • I want to see where (what timestamps) in the video my learners get to

      • I want to see where (what timestamps) in the video are being watched multiple times
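
Both video metrics above could come from one per-second aggregation. This sketch assumes play events arrive as (start, end) second ranges, one per continuous viewing segment, which is a hypothetical shape; the real data would come from Aspects’ video interaction events:

```python
from collections import Counter

def watch_counts(segments, duration):
    """Count how many times each whole second of a video was watched.

    segments: list of (start, end) second ranges, one per continuous
    play segment (hypothetical shape for illustration).
    Seconds with count 0 show where learners stop; counts > 1 show
    parts watched multiple times.
    """
    counts = Counter()
    for start, end in segments:
        for second in range(int(start), min(int(end), duration)):
            counts[second] += 1
    return [counts[s] for s in range(duration)]

# Two viewings of a 5-second clip: one full pass, one rewatch of seconds 1-3.
watch_counts([(0, 5), (1, 3)], 5)  # [1, 2, 2, 1, 1]
```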

    • For single problem components:

      • I want to see the % correct problem responses (out of all learner submissions, not first attempt or last attempt)

      • Initial problem response distribution

        • For single select, multi-select, dropdown, I would imagine this could be a bar graph showing the number of learners that submitted each response as their initial problem response

        • For text input, we’ll want to show the 7 most common responses (fewer if that looks too cramped) and indicate the number of learners that submitted each one as their initial response. If possible, it would be good to give users a way to see more than the top 7 responses, e.g., a “view more” button or a scrollable graph.

        • For numerical input, we’ll want to show the distribution of responses, indicating how many learners’ initial responses fell into each range (this could be a scatter plot, density plot, box plot, or even a histogram with ranged bars; not sure what’s possible with numerical inputs).
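
A minimal sketch of how the two problem metrics above could be computed, assuming a hypothetical flat record of submissions (the real source would be Aspects’ problem-response data):

```python
from collections import Counter

def problem_summary(submissions):
    """Summarize submissions for a single problem component.

    submissions: chronologically ordered list of dicts with keys
    'user', 'answer', 'correct' (hypothetical shape for illustration).
    Returns (% correct across ALL submissions, distribution of each
    learner's INITIAL response).
    """
    pct_correct = 100 * sum(s["correct"] for s in submissions) / len(submissions)
    first_answers = {}
    for s in submissions:
        first_answers.setdefault(s["user"], s["answer"])  # keep only the first
    return round(pct_correct, 1), Counter(first_answers.values())

subs = [
    {"user": "a", "answer": "B", "correct": False},
    {"user": "a", "answer": "C", "correct": True},
    {"user": "b", "answer": "C", "correct": True},
]
problem_summary(subs)  # 66.7% correct overall; initial answers: B once, C once
```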

    • For graded subsections:

      • I want to see the average grade on the whole graded subsection among those who submitted at least one response to a problem in the graded subsection.

        • Note: The grade is based on the learner’s last attempt for each problem. You can configure problems in graded subsections to accept a set number or even unlimited attempts.

      • I want to see a distribution of the number of learners who attempted each problem in the graded subsection. Can this distribution be a stacked bar to show breakdown of correct/incorrect final responses?
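
A minimal sketch of the last-attempt averaging described above, using a hypothetical (user, problem, score) attempt log in place of the real grading data:

```python
def subsection_average(attempts, problem_ids):
    """Average grade on a graded subsection among learners who submitted
    at least one response to any problem in it.

    attempts: chronologically ordered list of (user, problem_id, score)
    tuples (hypothetical shape for illustration).
    Each learner's grade uses their LAST attempt per problem, matching
    the last-attempt grading note; problems never attempted count as 0.
    """
    last = {}  # (user, problem) -> score from the most recent attempt
    for user, problem, score in attempts:
        last[(user, problem)] = score
    users = {user for user, _, _ in attempts}
    grades = [
        sum(last.get((user, p), 0.0) for p in problem_ids) / len(problem_ids)
        for user in users
    ]
    return sum(grades) / len(grades)

attempts = [
    ("a", "p1", 0.0), ("a", "p1", 1.0),  # only a's last attempt on p1 counts
    ("a", "p2", 1.0),
    ("b", "p1", 1.0),                    # b never attempted p2 -> scores 0 there
]
subsection_average(attempts, ["p1", "p2"])  # (1.0 + 0.5) / 2 = 0.75
```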