
A space to aggregate and share various use cases associated with data, reporting and analytics needs. Examples might include things like:

  • User story/persona, e.g. “As a course operator I want to build a report that includes X, Y, Z…”

  • Problem/challenge to address, e.g. “As a faculty member teaching a blended learning course, I need predictive analytics to help me identify at-risk learners”

Matt Bunch at HMS shared the following data analysis questions/topics:

  • Analysis of all courses, and all learners, at the same time, is the goal.

  • What is the correctness delta between the first assessment attempt and the second attempt?

  • What was the learner page view path through their course?

    • Is there a theme, are pages frequently revisited, and which page has the longest dwell time?

  • How many total seconds were spent viewing a page?

  Broadly, any learner or administrative activity in the logs is within the scope of our research.
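The two most concrete questions above - the correctness delta between a learner's first and second assessment attempt, and total seconds spent viewing a page - can be sketched directly against tracking-log events. The snippet below is a minimal illustration only: the event shapes are simplified assumptions (real Open edX `problem_check` and page-view events carry more fields and nesting), and dwell time is approximated as the gap until the learner's next event.

```python
from datetime import datetime

def correctness_delta(events):
    """Mean score change from a learner's first to second attempt, per problem.

    Assumes simplified problem_check-style events (hypothetical shape):
    {"username": ..., "event_type": "problem_check",
     "event": {"problem_id": ..., "attempts": n, "grade": g, "max_grade": m}}
    """
    scores = {}  # (user, problem) -> {attempt number: fraction correct}
    for e in events:
        if e["event_type"] != "problem_check":
            continue
        body = e["event"]
        key = (e["username"], body["problem_id"])
        scores.setdefault(key, {})[body["attempts"]] = body["grade"] / body["max_grade"]
    # Only learner/problem pairs with both a first and second attempt count.
    deltas = [s[2] - s[1] for s in scores.values() if 1 in s and 2 in s]
    return sum(deltas) / len(deltas) if deltas else None

def seconds_per_page(views, session_cap=1800):
    """Total seconds spent on each page, approximating dwell time as the
    gap until the same learner's next event (capped to ignore idle gaps)."""
    views = sorted(views, key=lambda v: (v["user"], v["time"]))
    totals = {}
    for cur, nxt in zip(views, views[1:]):
        if cur["user"] != nxt["user"]:
            continue  # never pair events across different learners
        gap = (datetime.fromisoformat(nxt["time"])
               - datetime.fromisoformat(cur["time"])).total_seconds()
        if 0 <= gap <= session_cap:
            totals[cur["page"]] = totals.get(cur["page"], 0.0) + gap
    return totals
```

Note the inherent limitation: the last page a learner views in a session has no following event, so its dwell time is undercounted, and a real analysis would also need to handle session boundaries and tab-switching.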


Jenna’s compiled use cases (in no particular order)

Anonymized sources:

  • Real-time progress reports at individual level - both at progress-across-the-course level and at the individual problem/assessment level

  • Individual data - for in-person/live courses. An instructor/educator needs to understand which individual is doing what and their individual progress in real-time

  • Data and analysis to track progress of cohorts/groups, especially for blended contexts

  • “Quick views” for educators and learners. For educators: quick insights into number of enrollments, courses by enrollment, courses by completion, and quick stats per course. For learners: courses enrolled, progress by course, and completions

  • [Reporting] Better visualization of real-time data

  • [Reporting] Make it easy to export data into third-party business analytics tools - for example, integration with Google Analytics

  • [Reporting] An important need for enterprise instances is to be able to align their enterprise training to internal KPIs

Specific use cases:

  • UBx: Want easier ways to track individual student progress

  • UBx: Want easy integration of data - particularly grades/outcomes - to campuswide LMS (canvas): Use case - the way faculty use UBx on campus is to assign homework, assignments, etc with particular learning objectives like “learn python basics”. They want assessment results to flow seamlessly into their central LMS. “Can I take a quiz on Open edX for my Math 309 course, get the grades, and have the grades flow into Brightspace?” He sees this as a huge incentive for getting other departments using UBx more broadly for blended learning initiatives

  • Penn State: Page views and amount of time spent per page (total seconds). Also meaningful ways to aggregate and report. “We want to look at a course with 10k people and be able to say: In XYZ year, this is how many people engaged with this page and this is how much time they spent”

  • Penn State: Challenge is integration - getting accurate course completion data. Need a more elegant way to define course completion. The only workaround in their integration setup is to have students complete a yes/no quiz confirming they finished the course. They rely on course completions to report out CEUs or credentials, and they use the completion counts to measure the impact of programs; if no CEU is attached, there is little motivation to complete the quiz.

  • Penn State: Their need is to connect in-person and online experiences into one cohesive learning experience. Use case: Learner completes a 5-hour online training course on landscaping, then has their pruning skills assessed in person. Need 1) seamless purchase of a blended learning experience; 2) a way to tie the off-campus event to the platform

  • WGU: Need to know who is where and what they are doing at all times - better tracking of progress at individual student level for engagement

  • WGU: [RBAC/permissions] Need more nuanced permissions control and security levels. For example, they currently can’t give instructors access to analytics without also giving them access to Studio, but they don’t want all their instructors having access to Studio.

  • WGU: Data for adaptive learning contexts: Need both static and dynamic attributes to determine what content students see. Need deep event engagement data to answer questions like: Is this student taking a long time because they are struggling, or because they are distracted? How many videos have they watched? How long were they engaged with the video?

  • WHO: Adaptive learning branching - generating real-time learner experience pathways based on assessments and other feedback loops

  • MIT Blended: Predictive reports/analytics to identify at-risk students. Daily reports of student activity from the day prior.

  • MIT Blended: Content reuse data: descriptive and dynamic metadata (dynamic meaning attributes like time on task and the difficulty it presents to students). Dynamic data examples: atomic time - how much time a student spent on a problem; molecular time - additional time needed to reference textbooks. The ideal would be to measure learning per unit time.

  • Spanish consortium: Initiative for aggregated statistics - aggregated across multiple courses rather than course-level

  • Spanish consortium: Real-time, in-context analytics

  • Spanish consortium: Predictive analytics
