Moodle - ORA

Staff-Graded Assessments

Staff-graded assessments in Moodle are handled by the Assignments feature, which is extremely broad and versatile. Learners can submit assignments in one of four possible formats:

  • File submission

  • Text

  • Audio

  • Video

This is configured on a per-assignment basis. Audio and video assignment submissions are handled by a recording feature present in the Atto editor, Moodle’s HTML and text editor.

Staff can grade submitted assignments, annotate PDF file uploads (if “Ghostscript” is enabled), and assign grades as a raw number (no rubric enforcement appears to be present). They can grant extensions, lock submissions, and generally exercise significant control over individual learners. Comments can be shared, and comment files can be uploaded, such as documents that contain more detailed, specific feedback.

Staff have the option of notifying learners as they grade via whatever default notification method is set at the Moodle instance level (such as email or in-app), or notifying all learners once grading is complete or a specific date is met. For bulk grading purposes, staff can also download Excel files, enter marks into a spreadsheet, and upload it back to Moodle to be processed as grades in the Moodle gradebook.
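The bulk-grading round trip described above can be sketched as follows. This is a minimal illustration assuming a CSV export with “Identifier” and “Grade” columns; the column names and file format here are assumptions for illustration, not Moodle’s exact worksheet layout.

```python
# Sketch of processing a downloaded grading worksheet. The column names
# ("Identifier", "Grade") and CSV format are assumptions for illustration.
import csv
import io

# Stand-in for a worksheet downloaded from Moodle with marks filled in.
worksheet = io.StringIO(
    "Identifier,Full name,Grade\n"
    "Participant 1,Alice,85\n"
    "Participant 2,Bob,72\n"
)

# Collect marks keyed by identifier, ready to be pushed to a gradebook.
grades = {row["Identifier"]: float(row["Grade"])
          for row in csv.DictReader(worksheet)}
print(grades)
```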

Learners can be required to hit the submit button, locking their submission from further edits, or allowed to continue editing until locked manually by staff. Once a submission is locked, staff must manually revert it to draft before the learner can edit it again.

Group submissions are possible based on Moodle’s Groups feature, discussed more at length in the Cohorts Feature Analysis. Grading of learner assignments can be allocated to different instructors, either by learner or by group, so that only the specified staff member may grade submissions by those learners.

In general, Moodle’s assessment feature is extremely robust for the very traditional university use-case, and has many tools geared towards high-touch mentoring and class-led teaching. The staff-grading features are reflective of this, and provide what appears to be a much better interface for staff-grading than is currently available in Open edX.

Above: Student view of the submission process for a file upload assignment.

 

Above: Staff grading view with tools for:

  1. Viewing and annotating the assignment

  2. Downloading the assignment

  3. Grade entry

  4. Individual feedback

  5. Save and show next learner

  6. Learner submission navigation

 

Links:

Peer Assessment

Moodle peer assessments are conducted through a separate activity named “Workshops”. Workshops are specifically peer-to-peer: learners are graded both on the quality of their own submission and on the quality of their assessments of peers, making peer grading an integral part of completing a workshop assignment.

There are four main models for peer assessment in workshops, called grading strategies. The first is accumulative grading, where each marking learner allocates a score on a scale or in a range for each of a set of criteria, such as answer correctness or writing clarity. These per-criterion scores are combined by a formula into an overall grade for the submission. Comments can be given for each criterion, but they do not affect the score.
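One plausible aggregation for the accumulative strategy is a weighted mean of per-criterion percentages. The sketch below is an assumption for illustration; Moodle’s actual formula is internal and may differ.

```python
# Hypothetical sketch of accumulative grading: each criterion has a score
# given by the marking learner, a maximum score, and a weight. The overall
# grade is assumed here to be the weighted mean of per-criterion percentages.
def accumulative_grade(criteria):
    """criteria: list of (score, max_score, weight) tuples."""
    weighted = sum(w * (score / max_score) for score, max_score, w in criteria)
    total_weight = sum(w for _, _, w in criteria)
    return 100 * weighted / total_weight

# Example: correctness 8/10 (weight 2), clarity 3/5 (weight 1).
print(round(accumulative_grade([(8, 10, 2), (3, 5, 1)]), 1))  # -> 73.3
```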

The second is comment-only. This works similarly to the first method, except there is no quantitative grade assigned, and all learners score 100% for completing the activity and having their work graded. In this case, the entire point of the exercise is simply to share and comment, making it more suited to formative exercises.

The third grading strategy, named “number of errors”, presents the marking learner with a series of yes/no questions posed by the assignment’s creator, such as “This assignment has creative ideas”. Each question has a response that indicates whether the submission passed or failed that criterion, and each criterion carries a weight used to calculate a weighted count of failed criteria and, from that, the final grade.
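The “number of errors” calculation can be sketched as a weighted error count looked up against a grade-mapping table. The mapping values below and the lookup rule are assumptions for illustration, not Moodle’s actual configuration.

```python
# Hypothetical sketch of the "number of errors" strategy: each yes/no
# criterion has a weight; the weights of failed criteria are summed, then a
# mapping table (assumed here) converts the error count into a grade.
def errors_grade(results, mapping):
    """results: list of (passed: bool, weight: int);
    mapping: {error_count_threshold: grade_percent}."""
    errors = sum(weight for passed, weight in results if not passed)
    # Apply the lowest grade among thresholds the error count has reached.
    applicable = [grade for threshold, grade in mapping.items()
                  if threshold <= errors]
    return min(applicable) if applicable else 100

mapping = {0: 100, 1: 80, 2: 50, 3: 0}
print(errors_grade([(True, 1), (False, 1), (True, 1)], mapping))  # one error -> 80
```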

The final grading strategy is to grade against a predefined rubric, in a similar manner to the Open edX ORA grading method. However, Moodle Workshops calculate the final grade from the rubric behind the scenes, using a considerably more involved formula than Open edX ORAs.

Learners can be assigned as reviewers for other learners directly by course staff, or auto-assigned so that either each submission receives a certain number of reviews or each learner must assess a certain number of submissions. If multiple learners assess a submission, the learner’s final score is the weighted mean of all grades received. The weight given to each grader is governed by their score as an assessor, and can be overridden by a member of staff.
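The weighted-mean aggregation of received grades can be sketched as below. The specific weights are assumptions for illustration; in Moodle they would derive from each grader’s assessor score.

```python
# Hypothetical sketch: a learner's submission grade as the weighted mean of
# all grades received, where each grader's weight reflects their standing
# as an assessor (weights here are assumed for illustration).
def submission_grade(grades):
    """grades: list of (grade, grader_weight) tuples."""
    return sum(g * w for g, w in grades) / sum(w for _, w in grades)

# Three graders at full weight give 100; a fourth, down-weighted to 0.5,
# gives 60. The outlier pulls the mean down only slightly.
print(round(submission_grade([(100, 1), (100, 1), (100, 1), (60, 0.5)]), 1))  # -> 94.3
```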

The assessor grade is calculated based on how closely the learner’s grade agrees with those given by other graders. If three learners all give a submission full marks, and a fourth learner gives the same submission a significantly lower mark, the fourth learner will receive a lower score as an assessor, and their assessment will be weighted lower in the overall average.
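This agreement-based scoring can be sketched with a simple falloff: the further an assessor’s grade sits from the group’s mean, the lower their assessor score. The linear falloff and the tolerance value are assumptions for illustration; Moodle uses its own, more complex comparison formula.

```python
# Hypothetical sketch of agreement-based assessor scoring: an assessor's
# score falls off linearly with the distance between their grade and the
# mean of all grades for the same submission. The falloff and tolerance
# are assumptions, not Moodle's actual formula.
def assessor_score(my_grade, all_grades, tolerance=50):
    mean = sum(all_grades) / len(all_grades)
    deviation = abs(my_grade - mean)
    return max(0.0, 100 * (1 - deviation / tolerance))

grades = [100, 100, 100, 40]
print(round(assessor_score(40, grades), 1))   # the outlier scores low
print(round(assessor_score(100, grades), 1))  # the majority score higher
```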

The tools available to instructors are broad: they can override grades, move learners backwards in the submission process, and generally control the peer grading experience, as well as publish selected submissions to highlight them to the wider group after the assessment period ends.

In general, Moodle’s Workshop grading is built around a number of complex formulae, standing in sharp contrast to Open edX’s simple median score. As with Assignments, this tool seems well tailored to smaller groups of learners, such as typical classes, and may not be ideal at very large scale. Notably, Workshops are clearly designed to be run on a short timescale in an instructor-paced environment, making them poorly suited to self-paced courses, though the effectiveness of peer assessment in a self-paced course without a time-bound cohort is questionable to begin with.

Links:

Self Assessment

There is no dedicated self-assessment tool in Moodle core, despite an apparently overwhelming desire for one in the community. The inventive workarounds documented by community members, such as creating a hundred copies of a single Workshop so that each learner gets their own manually-configured assignment, paint a picture of sorely lacking functionality, as well as widespread demand for this feature.

The closest available option is adding self-assessment as a step in a Workshop activity (alongside peer grading), and allowing a learner’s own submission to appear in the pool of Workshop responses they assess (the more responses are in the pool, the less likely their own is to appear).

Links: