Getting Feedback from Users
This page outlines the different techniques we use to collect feedback from our users. The goal is to learn more about user preferences and pain points so that we can improve the usability of Open edX.
The first step is to find test users. We use this Google Form to recruit test users, and this spreadsheet to keep track of where/with whom we’ve shared the form.
Now that we have a pool of test users, the next step is to define which part of the product development process we want to carry out. Before embarking on any project, a Product Manager must carefully validate the following aspects:
Value: Is the consumer willing to purchase or use our product?
Viability: Do we have the resources necessary to develop this product, and does it make sense given our goals?
Technical Feasibility: Can we build it from a technical point of view?
Usability: Will end customers be able to understand and use the product effectively?
The Value and Usability aspects can be validated through direct contact with end customers, using the following techniques:
| Validation | Technique | Description | Tools |
|---|---|---|---|
| Value Validation | Remote user interviews | A moderator conducts a one-on-one interview with a user to identify problems that, when solved, could have a high positive impact on some community objective. | Zoom / Google Meet |
| Value Validation | Surveys / Questionnaires | A survey with a number of questions is created to understand user problems in relation to the target user group. The survey is sent out to users who match the relevant persona. Once a predetermined number of users have responded, the feedback is collated and evaluated. | Typeform, Google Forms |
| Value Validation | Data analytics | Aspects is an analytics system for Open edX. It gives instructors and site operators actionable data about course and learner performance. | Aspects |
| Value Validation | Forum feedback threads | Collect and review feedback that users post in dedicated threads on the Open edX forums. | |
| Value Validation | Feedback widget in Studio and the LMS | Add a feedback plugin to Studio and the LMS that allows users to submit feedback by filling out a simple form (a sketch of such an endpoint follows this table). | Microsoft Clarity |
| Usability Validation | Remote usability testing (moderated) | A moderator uses screen-sharing software to conduct a remote usability test with a test user. The moderator asks the user to share their screen as they complete a number of tasks given to them by the moderator. Users are encouraged to “speak their thoughts out loud” whilst completing the tasks. The call is recorded for later evaluation. | Zoom / Google Meet |
| Usability Validation | Remote usability tests (unmoderated) | A test user completes a remote usability test using specialized software. When the test is complete, the recording is shared with the test team to evaluate. Some testing services provide heat maps and analytics based on test objectives. | Loom, PlaybookUX, Lookback, Maze |
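As a rough illustration of the feedback widget technique, below is a minimal sketch of a server-side endpoint that could receive the form's submissions. It is a hedged example only: the view, the `FeedbackForm` fields, and the response shape are assumptions for illustration, not an existing Open edX API; a real plugin would hook into the Open edX plugin framework. It uses plain Django, which Open edX is built on.

```python
# Hypothetical sketch of a feedback endpoint for Studio / the LMS.
# FeedbackForm, its fields, and submit_feedback are illustrative
# assumptions, not part of an existing Open edX plugin.
from django import forms
from django.http import JsonResponse
from django.views.decorators.http import require_POST


class FeedbackForm(forms.Form):
    rating = forms.IntegerField(min_value=1, max_value=5)  # 1 = poor, 5 = great
    comment = forms.CharField(max_length=2000, required=False)
    page_url = forms.URLField()  # page the user was on when leaving feedback


@require_POST
def submit_feedback(request):
    """Validate a widget submission and acknowledge it."""
    form = FeedbackForm(request.POST)
    if not form.is_valid():
        return JsonResponse({"errors": form.errors.get_json_data()}, status=400)
    # A real plugin would persist this to a model or forward it to an
    # analytics or ticketing tool; here we simply echo the cleaned data.
    return JsonResponse({"status": "ok", "received": form.cleaned_data})
```

The widget embedded in Studio or the LMS would then POST the rating, comment, and page URL to this endpoint.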
Learning Resources