
We have specific pain points around newcomer onboarding and the pull request review process that we hope to address as part of this Summit. Establishing metrics for these issues is critical to understanding where we are and whether we've improved.

Author Note: This proposal has been updated to reflect the data that currently exists on openedx.biterg.io.

Solution

Reinstate the Open edX GrimoireLab instance (the Bitergia metrics-dashboard previously deployed by Axim at openedx.biterg.io). It provides a number of useful metrics we can use to measure the effect of the changes implemented as part of this Summit. Some useful existing metrics include:

Pull Request Efficiency

  • Lead time – shows the average time (in days) between the initiation and completion of a pull request.

  • Review Efficiency Index (REI) – the number of closed pull requests divided by the number of open ones in a given period of time (both metrics are sketched below).
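
As a concrete illustration, here is a minimal Python sketch of both calculations. It assumes pull request records shaped like GitHub's REST API output (ISO-8601 `created_at`/`closed_at` fields) and reads "open" as "opened during the period"; Bitergia's production definitions may differ in detail.

```python
from datetime import datetime

def pr_metrics(prs, period_start, period_end):
    """Mean lead time (in days) and REI for one reporting period.

    `prs` is a list of dicts with ISO-8601 `created_at` and (when
    closed) `closed_at` fields, the shape GitHub's REST API returns.
    `period_start`/`period_end` must be timezone-aware datetimes.
    """
    parse = lambda ts: datetime.fromisoformat(ts.replace("Z", "+00:00"))

    opened = [p for p in prs
              if period_start <= parse(p["created_at"]) < period_end]
    closed = [p for p in prs
              if p.get("closed_at")
              and period_start <= parse(p["closed_at"]) < period_end]

    # Lead time: mean days from creation to close, over PRs closed
    # during the period.
    days = [(parse(p["closed_at"]) - parse(p["created_at"])).total_seconds() / 86400
            for p in closed]
    lead_time = sum(days) / len(days) if days else None

    # REI: PRs closed in the period divided by PRs opened in it.
    rei = len(closed) / len(opened) if opened else None
    return lead_time, rei
```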

To report on the newcomer experience, we will need to update openedx.biterg.io to report additional participation metrics across Open edX repositories, Discourse, and Slack.
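
GrimoireLab is normally told which sources to collect via a projects.json file that maps each backend to its origins. Below is a hypothetical excerpt covering these three sources, written as Python for illustration; the repository URL and Slack channel ID are placeholders, and exact backend and origin formats should be checked against the GrimoireLab docs.

```python
import json

# Hypothetical projects.json excerpt: one project, with each
# GrimoireLab backend listing the origins to collect from.
projects = {
    "Open edX": {
        "github": ["https://github.com/openedx/edx-platform"],  # one entry per repo
        "discourse": ["https://discuss.openedx.org"],
        "slack": ["C0ABCDEF1"],  # placeholder channel ID
    }
}

with open("projects.json", "w") as fh:
    json.dump(projects, fh, indent=2)
```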

Some salient, out-of-the-box metrics from the CHAOSS list to track in the Contributors Updates and CC program annual report are:

  • Newcomer Experience – How well does an open source community attend to welcoming newcomers? 

  • Conversion Rate – What are the rates at which new contributors become more sustained contributors? 

  • Change Request Closure Ratio – Is the project keeping up with change/pull requests? (see the sketch after this list)

  • Issue Response Time – How much time passes between the opening of an issue and a response in the issue thread from another contributor?
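
Several of these can be computed directly from the GitHub API while the dashboards are being built. As one example, here is a minimal sketch of the Change Request Closure Ratio using GitHub's issue search endpoint; the repository name and date range are illustrative.

```python
import requests

def closure_ratio(repo, since, until, token=None):
    """Change Request Closure Ratio: PRs closed / PRs opened in a window.

    `since`/`until` are YYYY-MM-DD strings; uses GitHub's search API.
    """
    headers = {"Accept": "application/vnd.github+json"}
    if token:
        headers["Authorization"] = f"Bearer {token}"

    def count(qualifier):
        query = f"repo:{repo} type:pr {qualifier}:{since}..{until}"
        resp = requests.get("https://api.github.com/search/issues",
                            params={"q": query, "per_page": 1},
                            headers=headers)
        resp.raise_for_status()
        return resp.json()["total_count"]

    opened = count("created")
    return count("closed") / opened if opened else None

# Example (illustrative repo and window):
# print(closure_ratio("openedx/edx-platform", "2024-01-01", "2024-03-31"))
```

Issue Response Time can be computed analogously, by comparing each issue's `created_at` with the timestamp of the first comment from someone other than the author (via `/repos/{owner}/{repo}/issues/{number}/comments`).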

Wikimedia uses a system called Gerrit for managing change requests, and they have several newcomer-related dashboards we could model ours on.

Impact

This proposal aims to measure the health of Open edX in general, and the effectiveness of improvements and changes made to the program. We can gauge the value of adding GrimoireLab to our toolset by counting the number of places where the CC program uses openedx.biterg.io data.

Timeline

In 2022, OpenCraft estimated the effort needed to reliably deploy GrimoireLab at 115 hours. This estimate includes:

  • Upgrade metrics-dashboard to the latest version

  • Improve deployment using Helm charts and Terraform (discovery only)

  • Pull data from GitHub, Discourse, and Slack (see the sketch below)
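
For the data-pull step, GrimoireLab's collector component is Perceval. A minimal sketch of fetching pull request data with its GitHub backend follows; constructor arguments have changed across Perceval releases (e.g. `api_token` is a list of tokens in recent versions), so treat this as illustrative. The Discourse and Slack backends are used analogously.

```python
from perceval.backends.core.github import GitHub

# Fetch pull request items from one repository (token is a placeholder).
backend = GitHub(owner="openedx",
                 repository="edx-platform",
                 api_token=["<github-token>"])

# Each item wraps the raw GitHub JSON under the "data" key.
for item in backend.fetch(category="pull_request"):
    pr = item["data"]
    print(pr["number"], pr["created_at"], pr.get("closed_at"))
```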

The existing metrics can be used right away.

It’s difficult to estimate how much time it will take to build each metric, since I haven’t done this before. But once one is done, others will surely be quicker to add. And modeling our work on existing dashboards will also speed things up.

Additional concerns:

Deploying GrimoireLab was relatively straightforward, but the process for cleaning up the data was onerous.

In addition to operational resources to set up and maintain the data, this proposal requires someone to pay for hosting the service and its databases.

I have asked Edward Zarecor on Slack for advice on both points, as he did the initial cleanup.