Getting Feedback from Users

This page outlines the different techniques we use to collect feedback from our users. The goal is to learn more about user preferences and pain points so that we can improve the usability of Open edX.

The first step is to find test users. We use this Google Form to recruit test users, and this spreadsheet to keep track of where/with whom we’ve shared the form.

Once we have a pool of test users, the next step is to define which part of the product development process we want to validate. Before embarking on any project, a Product Manager must carefully validate the following aspects:

  1. Value: Is the consumer willing to purchase or use our product?

  2. Business Viability: Do we have the resources necessary to develop this product, and does it make sense based on our goals?

  3. Technical Feasibility: Can we build it from a technical point of view?

  4. Usability: Will end customers be able to understand and use the product effectively?

The Value and Usability aspects can be validated through direct contact with end users, for which we can use the following techniques:

Value Validation

Technique: Remote user interviews
Description: A moderator conducts a one-on-one interview with a user to identify problems that, when solved, could have a high positive impact on a community objective.
Notes:
  • It is recommended to follow the guidelines of the book "The Mom Test".
Tools: Zoom / Google Meet

Technique: Surveys / Questionnaires
Description: A survey with a set of questions is created to understand the problems of the target user group. The survey is sent to users who match the relevant persona. Once a predetermined number of users have responded, the feedback is collated and evaluated.
Tools: Typeform, Google Forms

Technique: Data analytics
Description: Aspects is an analytics system for Open edX. It gives instructors and site operators actionable data about course and learner performance.
Tools: Open edX Aspects, forum feedback threads

Technique: Feedback widget in Studio and the LMS
Description: Add a feedback plugin to Studio and the LMS that lets users submit their feedback by filling out a simple form (see the sketch after this table).
Tools: Microsoft Clarity, Hotjar

Usability Validation

Technique: Remote usability testing (moderated)
Description: A moderator uses screen-sharing software to conduct a remote usability test with a test user. The moderator asks the user to share their screen while completing a number of tasks given to them by the moderator. Users are encouraged to “speak their thoughts out loud” while completing the tasks. The call is recorded for later evaluation.
Notes:
  • It is recommended to start with moderated tests, as they allow you to interact directly with the customer, ask specific questions about their perceptions, and obtain qualitative data more effectively.
  • Five interviews are usually enough to uncover most usability problems.
  • The book "Sprint" by Jake Knapp is a recommended read for refining these tests.
Tools: Zoom / Google Meet

Technique: Remote usability tests (unmoderated)
Description: A test user completes a remote usability test using specialized software. When the test is complete, the recording is shared with the test team to evaluate. Some testing services provide heat maps and analytics based on the test objectives.
Notes:
  • This option is recommended when you have a large number of users to test with.
  • Tools such as Maze can synthesize data collected during tests that users run independently.
Tools: Loom, PlaybookUX, Lookback, Maze
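
The feedback widget row above describes adding a feedback plugin to Studio and the LMS. The sketch below shows, in broad strokes, what the server side of such a plugin could look like as a Django app (Open edX extensions are typically packaged as Django app plugins). Everything here is hypothetical: FeedbackEntry, FeedbackForm, and submit_feedback are illustrative names, not part of any existing Open edX plugin.

```python
# Hypothetical sketch of the server side of a feedback widget, written
# as if it lived inside a Django app plugin for Open edX. None of these
# names exist in Open edX; they only illustrate the "simple form" idea
# from the table above.

from django import forms
from django.db import models
from django.http import JsonResponse
from django.views.decorators.http import require_POST


class FeedbackEntry(models.Model):
    """One piece of feedback submitted from Studio or the LMS."""

    page_url = models.URLField()                 # page the user was on
    rating = models.PositiveSmallIntegerField()  # e.g. 1 (poor) to 5 (great)
    comment = models.TextField(blank=True)       # free-form remarks
    created_at = models.DateTimeField(auto_now_add=True)


class FeedbackForm(forms.ModelForm):
    """The 'simple form' the widget renders and submits."""

    class Meta:
        model = FeedbackEntry
        fields = ["page_url", "rating", "comment"]


@require_POST
def submit_feedback(request):
    """Validate a widget submission and store it for later review."""
    form = FeedbackForm(request.POST)
    if not form.is_valid():
        return JsonResponse({"errors": form.errors.get_json_data()}, status=400)
    form.save()
    return JsonResponse({"status": "ok"})
```

In a real plugin, the app would be registered with the platform through the Open edX Django app plugin mechanism, the view wired into the URL configuration, and a small frontend snippet added to render the form; the collected entries could then be collated and evaluated the same way as survey responses.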


Learning Resources