User group project: Aamir's understanding
Background: Community Needs Driving the Project
Multiple ways to group users exist today: cohorts, teams, and "track groups" (naming to confirm).
Need to expand grouping capabilities based on user attributes, performance, and behavior.
The objective is to create targeted, relevant interventions that may improve user outcomes.
Major strategic goals of the project are:
Enable platform/course admins to group users at the course, org, and platform levels.
Allow instance operators to add grouping criteria.
Enable admins to send tailored comms to user groups.
Centralize user grouping.
Enable users to assign themselves to groups (teams).
Enable random user group assignment (cohorts).
Value proposition
Enable instructors/admins to:
Perform analytics on
Deliver tailored comms to
Customize learning experience for
… specific groups of learners to improve learning outcomes.
Over time, simplify and expand the grouping capability so it is easier to use and used more often.
Solution
Platform users
Users have certain attributes, take certain actions, and may have performance metrics for the courses they are enrolled in.
Group creation and update mechanism
An experience that would allow admins to create and update a user group. They can create two types of groups:
Static groups (created one time, updated manually)
Dynamic groups (automatically updated periodically or based on triggers)
using:
Attributes of a user (language, etc.)
Actions by or related to a user (enrolled, problem submitted, grade assigned, forum visited etc.)
Performance of a user in a course (% completed, grade earned etc.)
that meet a set of criteria like:
Language IS NOT English
Problem submitted BEFORE 12PM Sunday
Grade earned > 70%
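To make the criteria model concrete, here is a minimal sketch of how criteria like the ones above might be evaluated against users to produce a group. This is purely illustrative: the `User` record, the `(category, key, predicate)` criterion shape, and all field names are assumptions, not the data model under review in the ADRs.

```python
from dataclasses import dataclass, field

@dataclass
class User:
    # Hypothetical flat user record (not the real platform schema).
    username: str
    attributes: dict = field(default_factory=dict)   # e.g. {"language": "es"}
    actions: dict = field(default_factory=dict)      # e.g. {"enrolled_at": ...}
    performance: dict = field(default_factory=dict)  # e.g. {"grade": 0.82}

def matches(user: User, criteria: list) -> bool:
    """A user is in the group when every (category, key, predicate) criterion holds."""
    return all(
        predicate(getattr(user, category).get(key))
        for category, key, predicate in criteria
    )

# "Language IS NOT English" AND "Grade earned > 70%"
criteria = [
    ("attributes", "language", lambda v: v is not None and v != "en"),
    ("performance", "grade", lambda v: v is not None and v > 0.70),
]

users = [
    User("ana", {"language": "es"}, {}, {"grade": 0.82}),
    User("bob", {"language": "en"}, {}, {"grade": 0.95}),
]
group = [u.username for u in users if matches(u, criteria)]
print(group)  # ['ana']
```

The same predicate list could back both group types: evaluated once for a static snapshot, or re-evaluated on a schedule or trigger for a dynamic group.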
User group
A list of users that meet a set of criteria; it can be created and updated manually (static) or automatically (dynamic).
Intervention logic
A manual or automatic process that performs a certain action on selected user group(s).
Examples of a manual process: downloading a user group as a CSV, sending an email to a user group via the bulk email tool, or notifying a group about a forum post.
An example of an automatic process would be configuring a notification or email that is automatically sent every week to a group of users who haven't engaged with a course in the last 10 days.
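The weekly "inactive for 10 days" example above can be sketched as a scheduled task: recompute the dynamic group from engagement data, then hand each member to a communication tool. The record shape, the scheduler, and the nudge-queueing function are all assumptions for illustration.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical (username, last_engagement) records; in practice a weekly
# scheduled job (e.g. a beat task) would query platform data instead.
now = datetime(2025, 1, 20, tzinfo=timezone.utc)
engagements = [
    ("ana", datetime(2025, 1, 18, tzinfo=timezone.utc)),
    ("bob", datetime(2025, 1, 5, tzinfo=timezone.utc)),
]

def inactive_group(engagements, now, days=10):
    """Dynamic group: users with no engagement in the last `days` days."""
    cutoff = now - timedelta(days=days)
    return [user for user, last_seen in engagements if last_seen < cutoff]

def weekly_intervention(engagements, now):
    # Stand-in for the platform's email/notification tooling.
    for user in inactive_group(engagements, now):
        print(f"queueing nudge email for {user}")

weekly_intervention(engagements, now)  # only "bob" has been inactive > 10 days
```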
Intervention
An intervention is an action taken to improve course performance and/or learning outcomes. It could be done manually, like emailing a group of learners outside of the platform, or inside the platform, like letting instructors select a user group as the audience of a bulk email.
Feedback
Where relevant, instructors can monitor the behavior of a user group by filtering them on Aspects dashboards.
Use Cases
What groups would I want to create?
Lots of examples. Where did these come from? User interviews?
What would I want to use these groups for?
Communicate with a subset of learners
Download list of users in a group
View course engagement and performance of a group
Reset problem attempts (may or may not be ideal for groups)
Tailor course content for a group
What We Propose to Build
Centralized grouping mechanism for creating and managing user subsets at the course, org, or platform level
Support multiple grouping methods: random assignment, enrollment track, username/email, self-assignment, and activity/characteristics-based
Extend existing grouping tools (cohorts, track groups, teams) with added flexibility
Enable targeted communications to specific user groups
Allow export of user group lists for offline use or analysis
Integrate with dashboards like Aspects for filtered insights based on groups
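Since CSV export is the main "offline use" path proposed above, here is a minimal sketch of serializing a group's members to CSV. The column names are assumptions, not the final export schema.

```python
import csv
import io

# Hypothetical member records for a static group.
members = [
    {"username": "ana", "email": "ana@example.org"},
    {"username": "bob", "email": "bob@example.org"},
]

def group_to_csv(members) -> str:
    """Serialize group members to CSV text for download or offline analysis."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["username", "email"])
    writer.writeheader()
    writer.writerows(members)
    return buf.getvalue()

print(group_to_csv(members))
```

A symmetric `csv.DictReader` pass would cover the "create static group via CSV upload" direction, with validation rejecting rows for unregistered or unenrolled users.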
Key Capabilities
| Capability | In MVP? | Clarity Notes / Comments |
| --- | --- | --- |
| Group Creation | | |
| Level: Create user groups at course level | ✅ | |
| Level: Create user groups at platform level | ❌ | |
| Type: Create static user groups | ✅ | |
| Type: Create dynamic user groups | ❌ | Auto-refreshing based on criteria |
| Mechanism: Create static group via CSV upload | ✅ | |
| Mechanism: Create static group via UI | | Mentioned as nice-to-have for MVP; may come in a later iteration |
| Validation: Inviting/adding unregistered or unenrolled users | ❌ | Users not in the platform or not enrolled won't be added; a message is provided |
| Grouping logic: Random assignment | ❌ | |
| Grouping logic: Enrollment track | ❌ | |
| Grouping logic: Platform activity or learner characteristics | ❌ | |
| Grouping logic: Users assigning themselves | ❌ | |
| Group Management | | |
| View list of all groups in a course | ✅ | |
| View group members in UI | ❌ | |
| Download group members as CSV | ✅ | |
| View group metadata in UI (size, method, last modified) | ✅ | Explicitly required |
| Edit static group members via CSV re-upload | ✅ | |
| Edit static group members via UI | ❌ | |
| Edit group name via UI | ✅ | |
| Delete group | ✅ | Needs clarification on prerequisites and aftermath |
| Track updates to groups (logs for admins) | ✅ | |
| Group persistence across course reruns | ❌ | |
| Utilization & Interventions | | |
| Filter Aspects dashboard by user group | ✅ | For Aspects-enabled instances only |
| Targeted communication via platform tools (email/push) | ❌ | Only manual communication via CSV supported |
| Learning content customization | ❌ | Manual usage only; platform-level intervention tooling is not part of MVP |
| Permissions, Visibility & Scaling | | |
| Define criteria via UI for static or dynamic groups | ❌ | |
| Define criteria via code for static or dynamic groups | ❌ | |
| Course role-based access control | ✅ | Groups only visible and accessible to Course Admin & Course Staff roles |
| Platform/org role-based access | ❌ | |
Open Questions
Chelsea added responses in BLUE.
What do we intend to GAIN by rolling out the MVP?
During the period from now until the Ulmo code cut in October, the user group delivery team has a few priorities: (1) obtain and address any feedback from the community on the 5 Architectural Decision Records (covering the user group data model, refresh mechanisms, and migration of legacy grouping mechanisms) that are currently in review; (2) load test the user group model, refresh mechanisms, and migration plans at scale; (3) deliver a very slim slice of user group creation, management, and usage for static user groups only.
What do we gain:
By allowing the technical delivery team time for (1) and (2), I'm hoping we gain
input, constructive feedback, and (hopefully) buy-in from the community around this user grouping project. This feature has the potential to make a big impact and to touch a bunch of legacy ways of grouping users on the platform, so we want to make sure that the community feels confident in the path forward for user grouping as a whole. Load testing our assumptions now will give us a better sense of how this feature will scale and what tweaks we may need to consider as we roll out more robust functionality in a future release.
Once we deliver static user groups, Aspects users will be able to filter the Aspects User Group dashboard by a list of specific learners. The user group dashboard shows information on enrollment, engagement, and performance at the course level. We know from talking with users that some instances (particularly instances that have some on-campus and some online learners) have a list of specific users that they want to keep track of (maybe these learners are at-risk, are part of a particular group or program, or have specific accessibility needs).
This very slim delivery allows us to chip away at what was initially a very large chunk of work that we were hoping to deliver for U (both static and dynamic user group creation, management, and usage). It builds out the core user group creation and management UI that we can build on further for the V release.
Pushing back the dynamic user group delivery buys us some time to really home in on and test what UI designs might look like for dynamic user grouping before we're in a crunch to deliver this work. I've requested some UI/UX time for the U funding contribution to start building out low/medium fidelity mockups for dynamic user group delivery, so that we can start to review, consider, and even test these designs with community members.
What do we intend to LEARN by rolling out the MVP? How do we intend to learn it?
Since we are almost entirely reliant on qualitative feedback (although I would love it if you come up with any creative ways to obtain quantitative feedback from instances!), I'd like to learn how and whether users are using this feature. When they use it, what are their friction points in creating, managing, and understanding how they can use the user group they created? We can start to learn this even before it's delivered by using prototypes and wireframes, but I think we'll get even more valuable feedback on friction once there is something delivered that we can watch users interact with during usability tests and interviews with course delivery team members (I often have luck seeking out feedback from members of the educators working group).
What are the success criteria of the MVP rollout? How do we measure them?
Success would be that a user understands how to create, manage, and use a user group as they navigate the interface. Beyond that, I would consider this rollout a success if users on instances that use Aspects create one or more user groups and filter their user group dashboard by a group they've created. This will involve partnering with instances we know are using Aspects and are upgrading their Open edX release on a regular basis. (eduNEXT will be a good partner in helping us find one or more of these instances.)
What would make the MVP rollout a failure?
Failure would be that a user cannot figure out how to create, manage, and use a user group as they navigate the interface during a usability test. I would consider this rollout a failure if users on instances that use Aspects and upgrade to Ulmo don't create any user groups or filter their user group dashboard by a group they've created. We would measure this with the same Aspects-enabled partner instances described above.
From where and how will we collect feedback?
eduNEXT is a provider that has a few instances running Aspects - they may be a good source of intel for us as we try to collect usage feedback.
I would set up a series of usability interviews with course delivery team members from a variety of different institutions. I usually find participants through the educators working group or by mentioning what usability tests/interviews I want to conduct to Axim colleagues and during all of my working group meeting appearances. I can usually get at least 10-15 names this way. I can also provide a list of individuals I've interviewed in the past if you'd like to reach out to any of them to participate (the caveat is that their instance may not have upgraded to Ulmo by the time you speak, so working with prototypes or with a sandbox may be necessary).
This statement does NOT make sense to me, because we want to get feedback to build the next iteration, and it is a pretty non-intrusive feature sitting quietly behind a tab in the instructor dashboard.
"Because this initial slice of work delivers an important, yet slim chunk of functionality, this feature will be delivered as defaulted to OFF and marked as a beta feature for the Ulmo release."
I think this is a VERY GOOD POINT. We tend to be extremely conservative about rolling out new features to the community, especially because sometimes an instance will upgrade to one release and continue to use that release for a long time (this increment may be all they see of user groups for a year or multiple years), but I very much hear your point here. The initial reasoning behind this decision was that the value being delivered this release is small and will mostly benefit Aspects users. The idea was to get this out of beta once dynamic user groups can be created and/or targeted interventions can be sent. However, I think this is an idea we can raise with Jenna when we meet 1-1-1 before I head out.