
Topic

Indicate which topic your proposal falls under:

  • Enhance Core Contributor Onboarding
  • Improve Collaboration, Communication & Reporting
  • Improve Fulfilling Commitments and Planning Processes
  • Improve Review Processes

Overview

We have some specific pain points around newcomer onboarding and the pull request review process which we hope to address as part of this Summit. Establishing metrics for these issues is critical to understanding where we are, and whether we've improved.

Author Note: This proposal has been updated to reflect the data that currently exists on openedx.biterg.io.

Solution

The Open edX Bitergia instance (openedx.biterg.io) offers a number of useful metrics for measuring the effect of the changes implemented as part of this Summit. Some useful existing metrics include:

Github Pull Requests Efficiency

  • Lead time – shows the average time (in days) between the initiation and completion of a pull request.

  • Review Efficiency Index (REI) – the number of closed pull requests divided by the number of open ones in a given period of time.

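As a rough sketch of how these two metrics could be computed from exported pull request data (the record layout and field names here are illustrative assumptions, not Bitergia's actual schema):

```python
from datetime import datetime

# Hypothetical PR records for one reporting period; field names are
# illustrative, not the openedx.biterg.io data model.
prs = [
    {"opened": "2023-01-01", "closed": "2023-01-04"},
    {"opened": "2023-01-02", "closed": "2023-01-08"},
    {"opened": "2023-01-05", "closed": None},  # still open
]

def parse(date_str):
    return datetime.strptime(date_str, "%Y-%m-%d")

closed = [p for p in prs if p["closed"]]
still_open = [p for p in prs if not p["closed"]]

# Lead time: average days between opening and closing a pull request.
lead_time = sum(
    (parse(p["closed"]) - parse(p["opened"])).days for p in closed
) / len(closed)

# Review Efficiency Index: closed PRs divided by open PRs in the period.
rei = len(closed) / len(still_open)

print(lead_time)  # 4.5
print(rei)        # 2.0
```

Tracking these per quarter would let us compare review throughput before and after any process changes adopted at the Summit.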
To report on the Newcomer experience, we will need to update openedx.biterg.io to report additional participation metrics across Open edX repositories, Discourse, and Slack.

Some salient metrics from the CHAOSS list are:

  • Newcomer Experience – How well does an open source community attend to welcoming newcomers? 

  • Conversion Rate – What are the rates at which new contributors become more sustained contributors? 

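Conversion rate, in particular, reduces to a simple ratio. A minimal sketch, assuming we can count contributions per author and pick a cutoff for what counts as "sustained" (both the names and the threshold below are hypothetical):

```python
# Hypothetical contribution counts per new contributor in a period;
# author names and the threshold are illustrative assumptions.
contributions = {"alice": 1, "bob": 5, "carol": 2, "dan": 12}

SUSTAINED_THRESHOLD = 3  # assumed cutoff for a "sustained" contributor

newcomers = len(contributions)
sustained = sum(1 for n in contributions.values() if n >= SUSTAINED_THRESHOLD)

# Conversion rate: fraction of new contributors who became sustained ones.
conversion_rate = sustained / newcomers
print(conversion_rate)  # 0.5
```

The real work would be defining the threshold and time window with the community, then wiring the query into a dashboard.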
Wikimedia uses a system called Gerrit for managing change requests, and they maintain several newcomer-related dashboards we could model ours on.

Existing openedx.biterg.io metrics like Contributor Growth can also be used to measure newcomer retention.

Impact

This proposal aims to measure the health of Open edX in general, and the effectiveness of improvements and changes made to the program. We can measure the effectiveness of using openedx.biterg.io metrics by counting the number of places where the Core Contributor (CC) program uses this data.

Timeline

The existing metrics can be used right away.

It’s difficult to estimate how much time each new metric will take to build, since I haven’t done this before. Once the first one is done, subsequent metrics should be quicker to add, and modeling our work on existing dashboards will also speed things up.

Additional concerns:

Deploying GrimoireLab was relatively straightforward, but the process for cleaning up the data was onerous.

I have asked Edward Zarecor on Slack for advice on this, as he did the initial cleanup.
