
This page outlines the different techniques we use to collect feedback from our users. The goal is to learn more about user preferences and pain points, so that we can improve the usability of Open edX.

The first step is to find test users. We use this Google Form to recruit test users, and this spreadsheet to keep track of where/with whom we’ve shared the form.

Now that we have a pool of test users, we need to decide which research technique would be best for our particular use case. The table below outlines the various techniques we use.

Technique | When to use | Tools
--- | --- | ---
Remote user interviews |  | Zoom / Google Meet
Remote usability tests (moderated) |  | Zoom / Google Meet
Remote usability tests (unmoderated) |  | Loom, Playbook UX, LookBack, Maze
Surveys / Questionnaires |  | Typeform
Feedback widget in Studio and the LMS |  | Microsoft Clarity, Hotjar
Analytics |  | Open edX Aspects

Remote user interviews

  • A moderator conducts a one-on-one interview with a test user over virtual meeting software. The moderator asks the user a number of UX-related questions and encourages them to elaborate on their answers where needed. The call is recorded for later evaluation.

Remote usability tests (moderated)

  • A moderator conducts a remote usability test with a test user using screen-sharing software. The user shares their screen as they complete a set of tasks given to them by the moderator, and is encouraged to “speak their thoughts out loud” whilst completing them. The call is recorded for later evaluation.

Remote usability tests (unmoderated)

  • A test user completes a remote usability test on their own using screen-recording software. The user is given a set of instructions to follow and asked to record their screen as they complete the tasks outlined in them, and is encouraged to “speak their thoughts out loud” whilst doing so. When the test is complete, the recording is shared with the test team for evaluation. Some testing services also provide heat maps and analytics based on the test objectives.

Surveys / Questionnaires

  • A survey with a number of UX-related questions about a specific feature is created and sent out to users who match the relevant persona. Once a predetermined number of users have responded, the feedback is collated and evaluated (a rough sketch of the collation step follows below).
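
As a rough illustration of the collation step, the sketch below pulls responses from Typeform's Responses API and groups free-text answers by question. The form ID, access token, and the exact shape of the response payload are assumptions that should be checked against Typeform's documentation; this is not part of any existing tooling of ours.

```ts
// Hedged sketch: fetch survey responses from Typeform and group answers per
// question so they can be reviewed together. FORM_ID / TOKEN are placeholders.
const FORM_ID = process.env.TYPEFORM_FORM_ID ?? "your-form-id";
const TOKEN = process.env.TYPEFORM_TOKEN ?? "your-personal-access-token";

// Assumed (simplified) payload shape — verify against Typeform's API docs.
interface TypeformAnswer {
  field: { ref: string };
  text?: string;
  number?: number;
}

interface TypeformResponsePage {
  items: Array<{ submitted_at: string; answers?: TypeformAnswer[] }>;
}

async function fetchResponses(): Promise<TypeformResponsePage> {
  const res = await fetch(
    `https://api.typeform.com/forms/${FORM_ID}/responses?page_size=50`,
    { headers: { Authorization: `Bearer ${TOKEN}` } },
  );
  if (!res.ok) {
    throw new Error(`Typeform API returned ${res.status}`);
  }
  return (await res.json()) as TypeformResponsePage;
}

async function collate(): Promise<void> {
  const page = await fetchResponses();
  const byQuestion = new Map<string, string[]>();
  for (const item of page.items) {
    for (const answer of item.answers ?? []) {
      const existing = byQuestion.get(answer.field.ref) ?? [];
      existing.push(answer.text ?? String(answer.number ?? ""));
      byQuestion.set(answer.field.ref, existing);
    }
  }
  // Print a simple per-question summary as a starting point for evaluation.
  for (const [question, answers] of byQuestion) {
    console.log(`${question}: ${answers.length} answers`);
  }
}

collate().catch((err) => console.error(err));
```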

Feedback Widget in Studio and the LMS

  • Add a feedback plugin to Studio and the LMS that allows users to submit feedback by filling out a simple form (a rough sketch follows below).
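
The sketch below shows one possible shape for such a widget: a small form injected into the page that posts a comment to a placeholder endpoint. How the widget would actually be mounted into Studio / the LMS, and the `/api/feedback` endpoint, are assumptions for illustration only; the table above lists Microsoft Clarity and Hotjar as candidate off-the-shelf alternatives.

```ts
// Hedged sketch of a minimal feedback widget. The mount point and the
// /api/feedback endpoint are hypothetical, not an existing Open edX API.
function createFeedbackWidget(container: HTMLElement): void {
  const form = document.createElement("form");
  form.innerHTML = `
    <label>
      How was your experience on this page?
      <textarea name="comment" rows="3" required></textarea>
    </label>
    <button type="submit">Send feedback</button>
  `;

  form.addEventListener("submit", async (event) => {
    event.preventDefault();
    const comment = new FormData(form).get("comment");

    // Placeholder endpoint: replace with whatever service stores the feedback.
    await fetch("/api/feedback", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ page: window.location.pathname, comment }),
    });

    form.replaceChildren(document.createTextNode("Thanks for your feedback!"));
  });

  container.appendChild(form);
}

// Usage: mount the widget into a known element on the page.
const mountPoint = document.getElementById("feedback-widget");
if (mountPoint) {
  createFeedbackWidget(mountPoint);
}
```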

Analytics

  • Aspects is an analytics system for Open edX. It gives instructors and site operators actionable data about course and learner performance.


Learning Resources
