[DRAFT] Aspects V1: Product Requirements

Overview:

 

This document lays out the product goals and steps toward a full V1 release of Aspects. The goal of the V1 release is to deliver an MVP experience that gives administrators, instructional designers, and course teams in-platform access to a curated set of the most important reports they need for analyzing course success and instance health. 

V1 experience:

The V1 experience will include the following, though we will likely de-scope the dashboard experiences from the MVP.

  1. A basic set of reports with visualized data for administrators 

  2. A basic set of reports with visualized data for course authors and delivery teams

  3. A dashboard experience for administrators to access reports

  4. A dashboard experience for course teams to access reports

  5. An optional workflow for administrators to access the raw data.

Goals of the V1 release:

  1. Administrators at the instance or organization level have easy in-platform access to reports that present the most important metrics for analyzing the health of the instance, including individual course data and data aggregated across courses, such as enrollment trends and completion rates. The data is presented in a visual way that is quick to interpret.

  2. Course teams have easy in-platform access to reports that present the most important metrics for analyzing course and learner performance, including data at the course level and data at the level of the individual learner. The data is presented in a visual way that is quick to interpret.

  3. Administrators and course teams have a clear way to access the raw data should they so choose.

What reports are in scope for V1?

Method of acquiring market input:

  1. A survey to the Educator’s Working Group

  2. Individual inquiries/interviews with community members actively investing in this space

  3. Competitive research on Canvas and Moodle

We asked users what questions they would like to be able to answer using learner-level, course-level, or instance-level analytics, and why (with a particular focus on questions they are not able to answer today).

From this input, we synthesized over 120 questions that users would like to be able to answer. From this pool, we curated an MVP set of reports that covers the fundamental needs of administrators and course teams, and that achieves parity with competitors where relevant. The MVP reports focus on the analytics needs of large courses. (Future releases may include a bundle of reports that address the specific needs of residential, small-scale, and hybrid courses.)

Full market data here. 

Reports at the course level for authors, instructional designers and administrators

 

Course Enrollment: Analyze enrollment trends

  • How many learners are currently enrolled in this course?

  • How many new learners have enrolled in the last week/month/year?

 

Course Activity: Analyze how learners (in aggregate) are engaging with the course and specific course components

At the course level:

  • How many learners were active in a specified time period?

  • How many gradable problems were answered correctly?

  • How many non-gradable problems were answered correctly?

  • How many gradable ORAs were completed?

  • How many non-gradable ORAs were completed?

  • How many distinct users participated in discussions in a given time period (last week, last month)?

  • A report of forum responses and replies by student and by topic

    • [Format? And to what extent is this already covered in the new discussions upgrades?]

  • What percentage of overall sessions happen on mobile (vs non-mobile)?

At the component level:

  • How long, on average, are learners spending to complete this component/unit/subsection/section of my course?

  • How long, on average, are learners spending to complete this problem component?

  • How often was this graded problem answered correctly?

  • How often was this ungraded problem answered correctly?

  • What is the average number of attempts to get to a correct answer for a graded problem?

  • What is the average number of attempts to get to a correct answer for an ungraded problem?

  • How often was this gradable ORA completed?

  • How often was this non-gradable ORA completed?

  • How much of a video has been watched?

  • How many learners attempted a graded assignment (in the weighted grading schema)?

  • What grade are learners averaging for specific graded assignments?

Course Success Metrics: Analyze course success

  • What is the current average grade of learners in my course?

  • What is the average grade of learners in all previous course runs?

  • What is the average time learners spend on a course?

  • How many learners have earned a certificate and/or badge?

  • How many learners have completed this course?

  • What percentage of students complete a course in a specific time period?

  • How long are learners spending to complete the entire course on average?

  • How far into my course have the majority of learners reached and completed content?

  • What percentage of the course have learners completed?

 

Learner Progress: Track learner progress to interpret early signs of drop-out/disengagement

  • How much time has passed since a learner last accessed the course?

  • Which learners are taking significantly more or less time to complete content than their peers?

  • What is the current grade of an individual learner compared to the course average?

  • How long is a learner taking to complete a course?

  • Which learners are at risk of dropping out?

  • Proposed approach:

    • A report of at-risk learners with username and email address

      • Last sign in

      • Percent of course complete

      • Current grade

      • Current grade compared to course average
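The proposed at-risk report above could be assembled from a handful of per-learner signals. A minimal sketch in Python, assuming hypothetical field names and thresholds (the actual schema, cutoffs, and risk criteria are open product decisions):

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical per-learner record; real field names are TBD.
@dataclass
class LearnerSnapshot:
    username: str
    email: str
    last_sign_in: date
    percent_complete: float  # 0.0 - 1.0
    current_grade: float     # 0.0 - 1.0

def at_risk_report(learners, course_avg_grade, today,
                   max_idle_days=21, min_percent_complete=0.25):
    """Flag learners who look disengaged: long idle time, little
    progress, or a grade well below the course average."""
    rows = []
    for learner in learners:
        idle_days = (today - learner.last_sign_in).days
        reasons = []
        if idle_days > max_idle_days:
            reasons.append(f"inactive {idle_days} days")
        if learner.percent_complete < min_percent_complete:
            reasons.append("low completion")
        if learner.current_grade < course_avg_grade - 0.15:
            reasons.append("grade below course average")
        if reasons:
            rows.append({
                "username": learner.username,
                "email": learner.email,
                "last_sign_in": learner.last_sign_in.isoformat(),
                "percent_complete": learner.percent_complete,
                "current_grade": learner.current_grade,
                "grade_vs_average": round(
                    learner.current_grade - course_avg_grade, 2),
                "reasons": reasons,
            })
    return rows
```

The output rows carry exactly the fields listed above (last sign-in, percent complete, current grade, grade vs. course average); the "reasons" column is an extra illustration of how the risk criteria might be surfaced.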

Reports at the site level for administrators

 

Enrollment Trends: Analyze platform registration and course enrollment trends

  • How many learners are currently registered on the site?

  • How many new learners have registered on the site in the last week/month/year?

  • How many learners are enrolled in courses?

    • Across all courses

    • At the course level

  • How many new learners are enrolling in the last week/month/year?

    • Across all courses

    • At the course level

 

Platform Activity: Analyze data aggregated across courses

  • How many courses are there per organization? (for multitenants)

  • Which courses have the most/least active users in a time frame?

  • How many courses are published and active, vs published and not started, vs ended?

  • How many learners were active across all courses in the chosen time period?

  • What is the average time learners spend on the platform?

    • Same open question as the course-level metrics: "time spent" must be defined

 

Course Success: Analyze course completion data

  • What is the average current grade of learners in all active courses?

  • How many learners have completed courses?

  • How many learners have earned certificates?

 

Learner Engagement

  • Which courses have learners that are at risk of dropping out?

  • How many learners enrolled in courses but never answered a question?

 

Features & Requirements

In order to realize this MVP, we believe the following features will be required.

Feature                  | Requirements
-------------------------|--------------------------------------
Course-level reports     | Data visualization requirements TBD
Site-level reports       | Data visualization requirements TBD
Administrative Dashboard | Requirements TBD
Course Reports Dashboard | Requirements TBD

 

 

Open Product/UX Questions:

It’s possible to include comparative data from one course run to the next (learner grades, completion rates, etc.). This data is interesting to both course teams and admins. Where does it live? How do course teams access global data outside of the Instructor Dashboard, which is constrained to a single course run?

What is the best UI for accessing data at the component level? This should likely be accomplished in-course.

All questions that attempt to quantify “time spent” (time spent in the course, time spent on a problem, time spent on a unit, etc) need to be defined. (the “get up to make lunch” problem)
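One common way to bound the "get up to make lunch" problem is to sessionize event timestamps with an idle cutoff: gaps between consecutive events shorter than the cutoff count toward time spent, and longer gaps are discarded. A minimal sketch, assuming a 30-minute cutoff (the actual cutoff value is exactly the open decision described above):

```python
from datetime import datetime, timedelta

def time_spent(event_times, idle_cutoff=timedelta(minutes=30)):
    """Estimate active time from a learner's event timestamps.

    Gaps shorter than idle_cutoff count as engaged time; longer gaps
    (the learner wandered off) contribute nothing. A lone event
    counts as zero time.
    """
    events = sorted(event_times)
    total = timedelta()
    for prev, curr in zip(events, events[1:]):
        gap = curr - prev
        if gap <= idle_cutoff:
            total += gap
    return total
```

For example, events at 12:00, 12:10, 12:20, 14:00, and 14:05 yield 25 minutes of estimated time spent: the 100-minute lunch gap is dropped, while the three short gaps are summed.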

Default definition of “active”, with some config options for customization
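To illustrate what a configurable default might look like, here is a sketch where "active" means at least one counted event inside a trailing window. The event type names, window length, and config shape are all placeholders, not a committed definition:

```python
from datetime import datetime, timedelta

# Hypothetical site configuration; names and defaults are placeholders.
ACTIVE_WINDOW = timedelta(days=7)
ACTIVE_EVENT_TYPES = {"page_view", "problem_submit", "video_play", "forum_post"}

def is_active(events, now, window=ACTIVE_WINDOW,
              counted_types=ACTIVE_EVENT_TYPES):
    """A learner is 'active' if they emitted at least one counted
    event type inside the trailing window ending at `now`.

    `events` is an iterable of (event_type, timestamp) pairs.
    """
    return any(
        etype in counted_types and now - ts <= window
        for etype, ts in events
    )
```

Exposing `window` and `counted_types` as site configuration would give operators the customization options mentioned above without changing the report logic.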

Appropriate/sensitively designed UI for eliciting voluntary learner demographic data

Is there an MVP for component-level reports that can be released without in-platform/in-course UI?

Possible Future Directions:

Tie instructor actions for interventions to data. For example, send a custom email to all learners at risk of dropping out because they haven't accessed the course in three weeks.

Tie analytics to content tagging