Redwood Release Testing Strategy (Draft)

Background: The Axim Engineering Team got together and brainstormed ways we could improve the release testing strategy around named releases. Here are our recommendations; we hope to spark discussion within BTR about the ways in which the testing process can evolve to be faster, catch more bugs, and be more comprehensive. We also want to discuss how BTR can work with the Product team to ensure the right things are being tested with the right level of scrutiny.

Overview of the current community testing strategy

Links to existing documentation:

Recommendations for change with Redwood release

  1. The Product Core Working Group should take ownership of a “master” test spreadsheet. By “ownership”, we mean that the Product Core Working Group should be the group ultimately responsible for making sure that the way we test the release aligns with its definition of “core product”. That means defining the tests and their relative priorities. BTR is still responsible for building, testing, and shipping the release.

    1. The Product Team proposes to take on the task of writing coherent and sane tests that align with the highest priority features

      • This should help alleviate the testing burden that has previously been solely on BTR’s plate

      • The Product Team will prioritize defining test plans that cover the most important paths through the core product features.

      • Teams outside of BTR are welcome to make their own test plans for features outside of the core product that they care about; however, those tests should be executed by the team that has an interest in the feature, and any bugs found will not be considered release blockers.

    2. The master version of the spreadsheet should be ready by the testing cutoff date

    3. The test manager still instantiates the release-specific version of the spreadsheet

    4. Prototype here:

  2. Cut the official testing period in half

    1. Pros: more features land in Redwood

      • “If the test cycle was smaller, we’d increase the chance of features getting into each release”

    2. We (Axim) know BTR has taken the brunt of the testing burden, and we are willing to help coordinate efforts to get testing done faster (for example, by connecting the right community members for needed work items, such as more automation and more documentation)

    3. Note: Expected new features/updates and statuses will be known long before the testing cutoff (tracked in the Community Release Plan) and much documentation will be available before the cutoff (see Appendix below).

  3. Codify the process on - the Docs site is a semi-permanent record of how our software and community works. Making this official BTR documentation ensures that all stakeholders agree on the processes and makes it clear that this is an official Open edX community function.

    1. What the test manager needs to do: write these docs proactively, including docs such as “How we do Pull Request Workflows” covering PRs opened before the release has finished testing as well as backports after the release is out

  4. Move from spreadsheet to Github issues

    1. One issue for each test case

      • Tracks the whole lifecycle of the issue

    2. Produce a script to convert the product-owned master spreadsheet to issues (a sketch follows this recommendations list)

      • Benefit: the conversion can be repeated at every release

      • Requirement: Must be smart enough to create new issues if new rows are added

  5. Process suggestions

    1. Critical BTR roles should probably rotate on a mandatory basis every two terms, to avoid individual burnout, allow for process iteration with fresh eyes, and ensure we don’t “default” to the same set of people running the releases

    2. Critical roles should probably have a primary and secondary

  6. Make the test plan/state more visible/discoverable

    1. Collect all testing docs and get as many as possible into the site; aggregate all pieces of important information - whether in , the wiki, or elsewhere - in a top-level BTR wiki page.

      • Maintain a place where feature use experts can contribute docs easily - for now this may be the wiki, until we migrate all testing docs to

    2. Pinning the top-level wiki page to the Slack channel would be great

  7. Milestones (with dates) should be set

    1. The Product team will write all the tests and slot them into four priority buckets

    2. Highest-priority items must be completed by the end of Week 1, second-highest by the end of Week 2, and so on.

  8. Maintainers are responsible for fixing bugs or getting the authors of the feature in question to do so

    1. For Redwood specifically, we recognize that the maintainers program is still gathering steam. Certainly we'll be in a situation where we'll need some heroes (unfortunately).

    2. BTR should help maintainers that are new if they're having trouble, but in general, having a concrete thing to fix or understand in a codebase is a great way to get to know it better. BTR should be prepared to give maintainers extra support for Redwood (where a maintainer exists at all). Where a maintainer doesn't exist, we'll be in the same situation we've been in historically.

    3. The community needs to rally such that by Sumac, all repos have named maintainers who are ready to help with the release process.
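
To make recommendation 4.2 concrete, here is a minimal sketch of what the spreadsheet-to-issues conversion script could look like. It is only an illustration: the repository name, CSV column names (test_id, title, priority, steps), and the label are hypothetical placeholders, not decisions. The sketch reads a CSV export of the master spreadsheet, checks which test IDs already have GitHub issues, and creates issues only for new rows, so it can be re-run safely whenever rows are added.

    """Sketch: turn the product-owned master test spreadsheet (CSV export) into
    one GitHub issue per test case. Re-running only creates issues for new rows."""
    import csv
    import os

    import requests

    # Hypothetical names; the real repo, file, and columns are still to be decided.
    REPO = "openedx/release-testing"
    CSV_PATH = "master-test-plan.csv"
    API = f"https://api.github.com/repos/{REPO}/issues"
    HEADERS = {
        "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
        "Accept": "application/vnd.github+json",
    }


    def existing_test_ids():
        """Collect test IDs already present in issue titles, e.g. '[TC-012] ...'."""
        ids, page = set(), 1
        while True:
            resp = requests.get(API, headers=HEADERS,
                                params={"state": "all", "per_page": 100, "page": page})
            resp.raise_for_status()
            issues = resp.json()
            if not issues:
                return ids
            for issue in issues:
                if "pull_request" in issue:  # this endpoint also lists PRs; skip them
                    continue
                title = issue["title"]
                if title.startswith("[") and "]" in title:
                    ids.add(title[1:title.index("]")])
            page += 1


    def create_issue(row):
        """Create one issue for a spreadsheet row (assumes the label already exists)."""
        resp = requests.post(API, headers=HEADERS, json={
            "title": f"[{row['test_id']}] {row['title']}",
            "body": f"Priority: {row['priority']}\n\nSteps:\n{row['steps']}\n",
            "labels": ["release-testing"],
        })
        resp.raise_for_status()


    def main():
        done = existing_test_ids()
        with open(CSV_PATH, newline="") as f:
            for row in csv.DictReader(f):
                if row["test_id"] not in done:
                    create_issue(row)


    if __name__ == "__main__":
        main()

Putting the spreadsheet’s test ID in the issue title is just one way to make re-runs idempotent; a hidden marker in the issue body or a separate mapping file would work equally well.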

Appendix

Definition of “Done” for new features

We are implementing a definition of “Done”, described here. We hope it helps BTR know what to expect with all new features:

Product Documentation, to be completed by the Product Lead for each feature, and the Product Delivery Team:

Definition of done for a feature includes:

  1. Value statement/impact statement - Describing the value of the feature and the impact for users.

  2. How-to documentation - Describing what the feature is and how to use it and configure it

  3. Demo videos - From the product manager demonstrating how to use the feature

  4. Technical documentation - How to set the feature up on the technical end

  5. Case study - A real-life example of how someone is using the feature

  6. Testing instructions (Gherkin/Cucumber method) (product)

    1. Example Aspects user story written in the Gherkin method (a sketch of how such a scenario could be automated follows this example)

      • Scenario: A user with access to a course can view the total number of enrolled learners and the number and percentage of total enrollees for current enrollees and enrollees per enrollment track

      • Given I am a user that has access to course data

      • When I navigate to the instructor tab for one of my courses

      • Then I see the total number of learners enrolled in the course

      • And I see the number and percentage of total enrollees for current enrollees

      • And I see the number and percentage of total enrollees per enrollment track (of the enrollment tracks available on my Open edX instance)
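
As an illustration of how a scenario like the one above could eventually be exercised automatically, here is a minimal pytest-bdd sketch. Everything in it is hypothetical: the feature file name, the shortened scenario title, and the stubbed dashboard data stand in for a real test that would drive the actual Aspects instructor dashboard.

    # Sketch: binding the example Gherkin scenario to pytest with pytest-bdd.
    # Assumes a hypothetical enrollment_overview.feature file containing the
    # scenario text; the dashboard data below is a stand-in for live data.
    import pytest
    from pytest_bdd import scenario, given, when, then


    @scenario("enrollment_overview.feature",
              "A user with access to a course can view enrollment totals")
    def test_enrollment_totals():
        """Collected by pytest; the steps below run in the order the scenario defines."""


    @pytest.fixture
    def context():
        # Shared state handed from step to step.
        return {}


    @given("I am a user that has access to course data")
    def course_staff_user(context):
        context["user"] = "staff@example.com"  # stand-in for a real course-staff login


    @when("I navigate to the instructor tab for one of my courses")
    def open_instructor_tab(context):
        # Stand-in for loading the real dashboard via the UI or an API call.
        context["dashboard"] = {
            "total_enrolled": 120,
            "by_track": {"audit": {"count": 90, "percent": 75.0},
                         "verified": {"count": 30, "percent": 25.0}},
        }


    @then("I see the total number of learners enrolled in the course")
    def total_enrollment_shown(context):
        assert context["dashboard"]["total_enrolled"] > 0


    @then("I see the number and percentage of total enrollees per enrollment track")
    def per_track_breakdown_shown(context):
        percents = [t["percent"] for t in context["dashboard"]["by_track"].values()]
        assert sum(percents) == pytest.approx(100.0)

Keeping the step wording identical to the product-written Gherkin keeps the product test plan and any automated test in sync.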

Other Requirements:

  1. Features must work in Tutor

  2. Translations requirement must be defined

  3. The branding must be site-agnostic and configurable

  4. Accessibility Review

General testing categories

“Tests” are not all the same. As a reminder, when we say “test”, we may mean that one or more of the following types of tests are involved.

  • Feature evaluation/user acceptance testing

  • Feature validation

  • Testing at scale

    • Edge cases

    • Performance