Overall Assessment Configuration Analysis

The purpose of these documents is to break down the stages of grading configuration in full detail, to identify feature areas for comparison and improvement, and to analyse the current state of Open edX against other market-leading LMSes that serve different use-cases, in order to make recommendations for future core platform improvements.

The following assumptions are made about grading in this document:

  1. A course’s assessment strategy revolves around the completion of graded content that is present within the LMS (not external exam systems or third-party grader software).

  2. Courses have two primary pacing modes: Self-Paced and Instructor-Paced. While other models exist, these are the two pacing modes explicitly supported by the current system.

  3. Graded content can either be graded manually by a user (via staff or peer assessment) or automatically upon submission (such as CAPA problems).

As such, the process of setting up and managing content in a graded course can roughly be broken down into three broad areas:

  1. Course-level assessment configuration - In which course staff define how a course should be graded as a whole (a concrete sketch of this configuration follows this list).

  2. Assessment setup - In which an individual assessment is created, such as a graded subsection or timed exam in Open edX.

  3. Content setup - In which the content for learners to complete (such as a quiz or submission) is created and configured within an assessment, including the integration of third-party tools and assessments, as well as question banks.
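
To make the first of these areas concrete: in Open edX, the course-level grading policy defines the assignment types a course uses, how many of each are expected, how many low scores may be dropped, each type’s weight in the final grade, and the grade cutoffs. The sketch below mirrors the shape of the grading_policy.json file found in an Open edX course export, expressed here as a Python dict; the specific assignment types, counts, and weights are illustrative only, not a recommended policy.

```python
# A minimal sketch of course-level assessment configuration, mirroring the
# shape of the grading_policy.json file in an Open edX course export.
# All values are illustrative.
GRADING_POLICY = {
    # Each entry defines an assignment type: how many of them the course
    # expects, how many low scores to drop, and its weight in the final grade.
    "GRADER": [
        {
            "type": "Homework",
            "short_label": "HW",
            "min_count": 10,
            "drop_count": 2,
            "weight": 0.4,
        },
        {
            "type": "Final Exam",
            "short_label": "Final",
            "min_count": 1,
            "drop_count": 0,
            "weight": 0.6,
        },
    ],
    # Grade cutoffs map a passing grade (or letter grades) to the minimum
    # fraction of the weighted total required to earn it.
    "GRADE_CUTOFFS": {"Pass": 0.5},
}

# Weights across assignment types are expected to sum to 1.0.
assert abs(sum(g["weight"] for g in GRADING_POLICY["GRADER"]) - 1.0) < 1e-9
```

In practice, course staff configure these values through the grading settings in Studio rather than by editing the exported file directly; the file is shown here only to make the underlying data model visible.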

The following areas will not be explicitly covered in this document, but may be mentioned in passing:

  1. Broader educational content setup and course creation - The focus of this document is explicitly on assessed materials only.

  2. Certification and credentials - This is a broad area and is one potential output of assessment, rather than assessment itself.

  3. Learner management and reporting - While some learner management is intrinsically linked to assessment, the focus of this document remains on configuring the learner assessment process, not on managing live courses.

  4. Grading - The process of awarding grades to learners is important, but it deserves its own summary and research rather than being bundled into content configuration.

Comparisons to other LMSes are based on reviews of their documentation and community resources such as forums, rather than hands-on experimentation with those platforms, due to limited access and limited time available to complete this analysis. By its nature, this approach will put a much stronger spotlight on Open edX’s shortcomings than on those of other platforms, as those platforms are unlikely to advertise their inadequacies, while we are very aware of our own platform’s.

The stated goal of Open edX Core is to provide a high-quality core experience that is not specific to any particular use-case, such as academic courses or customer training. Where possible, this analysis takes this into account when proposing recommendations and improvements. Greater improvements than those listed in the relevant sections may be possible, but unless they have value to a wide range of use-cases, they will not necessarily be good additions to Open edX Core.

Finally, this analysis is based on the current state of production-level Open edX as of June 2023, with the initial Palm release, and should be understood as such. Improvements to the platform’s shortcomings are always ongoing, and if something highlighted as lacking here is already being worked on, or even scheduled for release in a later version, that highlight should be read as validation of the need for the work rather than as a deliberate omission.