Overview:

This document lays out the product goals and steps toward a full V1 release of Aspects. The goal of the V1 release is to deliver an MVP experience that lets administrators, instructional designers, and course teams access a curated set of the most important reports they need for analyzing course success and instance health, directly in-platform.

V1 experience:

The V1 experience will include the following. However, we will likely de-scope the dashboard experiences (items 3 and 4) from the MVP.

  1. A basic set of reports with visualized data for administrators 

  2. A basic set of reports with visualized data for course authors and delivery teams

  3. A dashboard experience for administrators to access reports

  4. A dashboard experience for course teams to access reports

  5. An optional workflow for administrators to access the raw data.

Goals of the V1 release:

  1. Administrators at the instance or organization level have easy access to reports in-platform that present the most important metrics for analyzing the health of the instance, including individual course data and data aggregated across courses, such as enrollment trends and completion rates. The data is presented in a visual way that is quick to interpret.

  2. Course teams have easy access to reports in-platform that present the most important metrics for analyzing course and learner performance, including data at the course level and data at the level of the individual learner. The data is presented in a visual way that is quick to interpret.

  3. Administrators and course teams have a clear way to access the raw data should they so choose.

What reports are in scope for V1?

Method of acquiring market input:

  1. A survey to the Educator’s Working Group

  2. Individual inquiries/interviews with community members actively investing in this space

  3. Competitive research on Canvas and Moodle

We asked users what questions they would like to be able to answer using learner-level, course-level, or instance-level analytics, and why (with a particular focus on questions they are not able to answer now).

From this input, we synthesized over 120 questions that users would like to be able to answer. From this pool, we curated an MVP set of reports that covers the fundamental needs of administrators and course teams and achieves parity with competitors where relevant. The MVP reports focus on the analytics needs of large courses. (Future releases may include a bundle of reports that address the specific needs of residential, small-scale, and hybrid courses.)

Full market data here. 

Reports at the course level for authors, instructional designers, and administrators

Course Enrollment: Analyze enrollment trends

Course Activity: Analyze how learners (in aggregate) are engaging with the course and specific course components

  At the course level:

  At the component level:

Course Success Metrics: Analyze course success

Learner Progress: Track learner progress to interpret early signs of drop-out/disengagement

Reports at the site level for administrators

Enrollment Trends: Analyze platform registration and course enrollment trends

Platform Activity: Analyze data aggregated across courses

Course Success: Analyze course completion data

Learner Engagement

Features & Requirements

In order to realize this MVP, we believe the following features will be required.

Feature                     Requirements
Course-level reports        Data visualization requirements TBD
Site-level reports          Data visualization requirements TBD
Administrative Dashboard    Requirements TBD
Course Reports Dashboard    Requirements TBD

Open Product/UX Questions:

It’s possible to include comparative data from one course run to the next (learner grades, completion rates, etc.). This data is interesting to both course teams and admins. Where does it live? How do course teams access global data outside of the Instructor Dashboard, which is constrained to a single run/course?

Best UI for accessing data at the component level; this should be accomplished in-course.

All questions that attempt to quantify “time spent” (time spent in the course, on a problem, on a unit, etc.) need a precise definition (the “get up to make lunch” problem).
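
For reference, one common convention is to sum the gaps between consecutive learner events and discard any gap longer than an inactivity cap, so a long break does not count as time on task. The sketch below is illustrative only; the 30-minute cap, function name, and data shape are assumptions, not a proposed definition.

    from datetime import datetime, timedelta

    # Illustrative only: the cap value and names are placeholders, not settled definitions.
    IDLE_CAP = timedelta(minutes=30)

    def estimated_time_spent(event_times: list[datetime]) -> timedelta:
        """Sum gaps between consecutive events, ignoring gaps longer than IDLE_CAP."""
        ordered = sorted(event_times)
        total = timedelta()
        for earlier, later in zip(ordered, ordered[1:]):
            gap = later - earlier
            if gap <= IDLE_CAP:  # a learner who walked away contributes nothing for that gap
                total += gap
        return total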

Default definition of “active”, with some config options for customization
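
To illustrate what a configurable definition of “active” could look like (a sketch under assumed placeholder values; the window, event threshold, and names are not proposed defaults):

    from dataclasses import dataclass
    from datetime import datetime, timedelta

    # Illustrative placeholders: a learner counts as "active" if they produced at least
    # min_events events within the last window_days days. Both values are configurable.
    @dataclass
    class ActivityConfig:
        window_days: int = 7
        min_events: int = 1

    def is_active(event_times: list[datetime], now: datetime,
                  config: ActivityConfig | None = None) -> bool:
        config = config or ActivityConfig()
        cutoff = now - timedelta(days=config.window_days)
        return sum(1 for t in event_times if t >= cutoff) >= config.min_events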

Appropriate/sensitively designed UI for eliciting voluntary learner demographic data

Is there an MVP for component-level reports that can be released without in-platform/in-course UI?

Possible Future Directions:

Tie instructor actions for interventions to data. For example, send a custom email to all learners at risk of dropping out because they haven’t accessed the course in 3 weeks.
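
As a rough illustration of that rule (the three-week threshold, names, and data shape are assumptions for the sketch, not a spec):

    from datetime import datetime, timedelta

    # Illustrative only: flag learners whose most recent course access is older than a
    # threshold, so an instructor could target them with a custom email.
    AT_RISK_AFTER = timedelta(weeks=3)

    def at_risk_learners(last_access: dict[str, datetime], now: datetime) -> list[str]:
        """Return learner IDs who have not accessed the course within AT_RISK_AFTER."""
        return [learner for learner, seen in last_access.items()
                if now - seen > AT_RISK_AFTER]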

Tie analytics to content tagging