Add Track Changes feature for peer assessments
We've been using this feature on our instance for several years and it works well.
Creates "TrackChanges" django model, which ties a (submission_uuid,
scorer_id) pair to a set of edits suggested by peers for a particular
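As a rough sketch of that relationship (plain Python standing in for the actual Django model; the field and class names here are illustrative, not the real schema):

```python
# Sketch of the TrackChanges schema: one row of suggested edits per
# (submission_uuid, scorer_id) pair. Plain Python stands in for the
# actual Django model; names are illustrative only.
from dataclasses import dataclass


@dataclass
class TrackChanges:
    submission_uuid: str   # the submission being assessed
    scorer_id: str         # the peer who suggested the edits
    edited_content: str    # marked-up copy of the submission text


class TrackChangesStore:
    """In-memory stand-in for the model's table."""

    def __init__(self):
        self._rows = {}

    def save_edits(self, submission_uuid, scorer_id, edited_content):
        # At most one set of edits per (submission_uuid, scorer_id)
        # pair; saving again replaces the earlier edits.
        self._rows[(submission_uuid, scorer_id)] = TrackChanges(
            submission_uuid, scorer_id, edited_content
        )

    def edits_for_submission(self, submission_uuid):
        # All peers' edits for one submission, e.g. to show the author.
        return [
            row for (uuid, _), row in self._rows.items()
            if uuid == submission_uuid
        ]
```

The key constraint is the uniqueness of the pair: each peer keeps exactly one current set of suggested edits per submission.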
Adds machinery for on-the-fly installation of the New York Times's ICE
library for collaborative editing, allowing peer assessors to create
marked-up versions of submissions, which can then be returned to the
author as feedback.
Adds machinery for displaying suggested edits to the creator of a submission alongside their scores and other feedback.
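A sketch of how the grade view might pair each peer assessment with that peer's edits (plain Python; the function and dict keys are hypothetical, not the real ORA2 handler):

```python
# Sketch: join each peer assessment with that peer's suggested edits
# so the grade view can render them together. Names are illustrative.
def combine_feedback(assessments, track_changes):
    """assessments: [{"scorer_id", "points", "feedback"}, ...]
    track_changes: [{"scorer_id", "edited_content"}, ...]"""
    edits_by_scorer = {
        tc["scorer_id"]: tc["edited_content"] for tc in track_changes
    }
    return [
        {
            **assessment,
            # Not every peer suggests edits; default to None.
            "edited_content": edits_by_scorer.get(assessment["scorer_id"]),
        }
        for assessment in assessments
    ]
```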
Configurable within Studio on a per-assessment basis by toggling the
"Enable Track Changes" setting.
Enabling in Studio:
<img width="920" alt="Screen Shot 2019-09-04 at 11 15 21 AM" src="https://user-images.githubusercontent.com/868615/64281103-415f5c00-cf07-11e9-94e1-b923885372f7.png">
Student assess view:
<img width="1080" alt="Screen Shot 2019-09-04 at 11 23 02 AM" src="https://user-images.githubusercontent.com/868615/64281231-8b484200-cf07-11e9-9775-733ace10b08f.png">
Student grade view:
<img width="1080" alt="Screen Shot 2019-09-04 at 11 27 46 AM" src="https://user-images.githubusercontent.com/868615/64281263-9bf8b800-cf07-11e9-8b4e-9effd1c7fb16.png">
will be taking a look at this with me
Let me have my team take a look to get a better sense of the functionality and I’ll get back to you
Do you think this is still relevant and of interest to us?