Simple Problem Editor (Olive)

In the Open edX platform, the distinction between common problems and advanced problems typically comes down to whether a problem must be configured with XML or an XBlock, or can be built with the simple (or “common”) problem editor. For the purposes of this analysis, our focus is on the latter: how Open edX enables authors to build problems without having to engage with OLX (the scope of OLX itself is not shaped by the editor feature). For the sake of simplicity, where these docs refer to the problem editor, we mean specifically the UI-led problem-building experience.

The problem editor is the feature that allows course authors to build problems using a simple set of syntax elements, giving learners a range of input types with which to answer the question. The problem types that can be built using the problem editor (sketched in the markup example after the following list) are:

  • Multiple Choice (aka single select problems)

  • Checkbox (aka multi-select problems)

  • Dropdown

  • Text input (with alternative answers available)

  • Numerical input (technically the same as text input, but with support for a numerical tolerance on the answer)
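
For reference, the editor’s cheat sheet sketches these input types with markup along the following lines (a rough sketch based on the Olive-era cheat sheet; exact syntax can vary between releases):

    Multiple choice (single select):
      ( ) an incorrect option
      (x) the correct option

    Checkbox (multi-select):
      [ ] an option that must be left unselected
      [x] an option that must be selected

    Dropdown:
      [[incorrect option, (correct option)]]

    Text input (with an alternative answer):
      = correct answer
      or= also acceptable

    Numerical input (with a tolerance of 5 either side):
      = 100 +- 5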

In addition to these input types (which can be mixed and matched as required), the problem editor also supports markup that provides the following features, illustrated in the sketch after this list:

  • Plain Text (to contextualise problems)

  • HTML (to allow images in both context and responses)

  • Headings (at H3 level)

  • Explanation (text that appears when a configurable condition is met, such as the problem being answered correctly)

  • Hints (text that can appear when requested by the learner)

  • Feedback (text that can appear when specific options or sets of options are selected or not selected by the learner)
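
The explanation, hint, and feedback features use markup of the same flavour, attached to the problem body or to individual options. Another rough sketch based on the cheat sheet (exact delimiters can vary between releases):

    (x) the correct option {{Feedback shown when this option is selected.}}
    ( ) an incorrect option {{Feedback explaining why this option is wrong.}}

    ||A hint the learner can reveal on request.||

    [explanation]
    Text revealed once the configured condition, such as answering correctly, is met.
    [explanation]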

Finally, the settings tab of the editor provides access to the following controls, which are persisted as OLX attributes as sketched after this list:

  • Problem display name (which is also shown as a heading)

  • MATLAB API key

  • Maximum attempts (which has a default that can be set at the course level under Advanced Settings)

  • Problem Weight (the points a problem is worth)

  • Randomization (which has no effect on common problems, as it only applies to problems that generate randomized values via a Python script)

  • Show Answer (the condition on which the answer can be shown to the learner, which can be set at the course level under Advanced Settings)

  • Number of Attempts (the number of attempts a learner must use before the answer can be shown, when Show Answer is set to appear after a set number of attempts)

  • Show Reset Button (allowing learners to clear all inputs)

  • Timer Between Attempts (in seconds, forcing a delay between submissions)
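
These settings are not part of the body markup; when the problem is saved they are persisted as attributes on the problem’s OLX element. A rough sketch of the result (attribute names follow the CAPA problem fields; the values here are illustrative):

    <problem display_name="Checkpoint quiz"
             max_attempts="3"
             weight="2"
             rerandomize="never"
             showanswer="finished"
             show_reset_button="true"
             submission_wait_seconds="60">
      <!-- problem body generated from the editor markup -->
    </problem>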

The problem editor also has a cheat sheet showing some of the available markup options (which, once opened, cannot be closed). A problem built in the simple editor can be converted to the advanced problem editor, but once converted to XML it cannot be converted back.

As a general trend, a featureful problem editor allows non-technical course authors, such as those typically found in arts subjects, to create better assessments more easily. More technical course authors can turn to more technical tools, such as PrairieLearn or Open edX’s advanced problem editor, to create highly bespoke questions that test hyper-specific skills and knowledge. In this sense, a problem editor that works well generally benefits less technical subjects more than technical ones. Regardless, a good problem editor with a variety of question types makes it faster for all authors to create effective interactive content without spending days on custom work that may be of variable quality and deliver a poor user experience.

The usage of problems in Open edX is generally split across three groups of users:

  • Basic users of the platform will only use the problem editor and its templated capabilities, and are unlikely to ever discover its more advanced features due to its UX issues and the under-documentation of certain features. This in turn shapes the way they build courses and assessments, never straying beyond what is provided to them in the UI.

  • Power users of the problem editor exist in smaller numbers, but use more of its capabilities, such as HTML insertion, randomization, hints, and adaptive feedback. The split between basic users and power users is indicative of problems with the editor that can absolutely be resolved.

  • Finally, deeply invested, long-term technical users of the platform, such as those at MIT, will regularly use the advanced problem editor instead, especially as it enables problem features and types that were never added to the simple editor. These users are typically frustrated by the lack of support for the more custom problem types the advanced editor enables, which rely on third-party scripts that are not maintained or supported.

Recommendations

  • Create a new problem editor from scratch that meets the needs of the core platform.
    OR

  • Work to extend 2U’s new problem editor solution to ensure it fits the needs of the core platform.

  • Create new tools for assessment creation and management in general, to ensure that the complete assessment creation and management workflow is prioritised and polished.

  • Improve tracking of Studio events in general to provide greater insight into Studio feature usage for platform providers and operators.

  • Investigate formal support for assessment subsections/quizzes beyond simply marking them as graded in order to meet feature parity with other platforms, while still enabling Open edX’s near-unique mixed-content capabilities.

  • Standardise the terminology used between the front-end, back-end, and documentation for different problem types to eliminate possible confusion.

Links: