I. Context, data, and highlights 🏅
The Product Working Group was closely involved in the process and led the implementation of several changes to it (see the Redwood Release Testing Strategy), including:
A monthly meeting to track new features and initiatives to be included in Redwood (https://docs.google.com/spreadsheets/d/1tWgp9LXNg4sfWYd_0ghNl6qfIZZ9851MtGBXPeSgzFs)
The Product Core Working Group should take ownership of a “master” test spreadsheet of new features
Maintainers are responsible for fixing bugs or getting the authors of the feature in question to do so
The Redwood master branch was cut on 09/05 as planned.
The official testing period was cut in half to increase the chance of landing more new features in Redwood. For this reason, the testing process was launched on 16/05 (https://discuss.openedx.org/t/join-the-redwood-testing-team/13017).
Due to a last-minute security vulnerability (https://discuss.openedx.org/t/security-upcoming-security-release-for-edx-platform-on-2024-06-17/13204/3), Redwood was delayed a week and a half, and was successfully released on 19/06.
To target their respective audiences more specifically, the release notes were divided into feature release notes (https://docs.openedx.org/en/latest/community/release_notes/redwood/feature_release_notes.html) and developer/operator release notes (https://docs.openedx.org/en/latest/community/release_notes/redwood/dev_op_release_notes.html).
II. Successes and achievements 🏅
[Peter]: Although it was tight, I was impressed by the community’s ability to complete the testing on schedule (perhaps we were helped by some features being pushed to the next release).
[Chelsea]: It seemed that the prioritization of discovered bugs was helpful for the triaging process.
[Sarina]: A lot of people really stepped up this release. Those most visible to me were Maria Grimaldi and Chris Patti, as well as Jorge Londoño and Maksim Sokolskiy.
[Sarina]: Largely, the new product processes worked, and they involved more people in the release than before.
III. We should keep ✅
[Jenna]: Propose that we keep Product ownership of the “master” test spreadsheet, and continue to flesh out regression tests
[Jenna]: Propose that we keep the monthly meeting to track features slated for Sumac (can start again in August)
[Chelsea]: Keep adding Cucumber-formatted tests to the test list? (Given I… When I… Then I…)
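For reference, the Given/When/Then convention mentioned above could look like the following sketch (the feature, page names, and steps here are purely illustrative, not actual entries from the test spreadsheet):

```gherkin
Feature: Course outline editing
  Scenario: Instructor adds a new section
    Given I am logged in as a course instructor
    And I am on the course outline page in Studio
    When I click "New Section" and enter a section name
    Then I see the new section at the bottom of the outline
```

Writing manual tests in this shape keeps the precondition, action, and expected result explicit, which also makes them easier to automate later.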
IV. What didn’t we do so well, and what could we have done to do better ❌
[Jenna]: How can we tighten the lead time between the code cutoff and having the testing sandboxes ready? For Redwood, there was about a week in between. How can Product help define which features should be enabled, etc.?
[Jenna]: Only having 3 weeks for testing ended up feeling tight. What would it look like to try a 6-week testing period between code cut and release, especially if we can shorten the lead time on the sandboxes?
[Jenna]: Redwood was feature-packed. How can we scale back and still have a meaningful set of capabilities/features in Sumac, without over-committing?
[Peter]: There was a lot of uncertainty and confusion around the security issue, particularly about what details could be discussed and with whom. It would be good to have a documented and, ideally, more transparent process before this happens again.
[Peter]: Some of the regression tests are poorly defined and therefore difficult to test confidently. For the next release, we should try to define the legacy tests as clearly as the new tests were defined.
[Chelsea]: I had claimed a test, and only once I sat down to do it did I realize how technical the tester needed to be to test the functionality. It made me wonder whether our test sheet should have a designated label distinguishing tests that need to be done by an engineer from tests anyone can do.
[Chelsea]: We could potentially firm up exactly how the testing sandbox should be configured and make sure that's communicated well to the team setting up the sandbox. (Example: it was a miss on my part not to make sure Aspects was enabled from the start.)
[Feanil]: Even if it’s not officially supported, should we allow the community to provide backport fixes to older releases at the release manager’s discretion? I feel like people are already making these fixes on their local forks, and we could just let them upstream them if they can be validated easily.
[Sarina]: How can we “fast fail” in the monthly release meetings run by Product (that is, “kick” something out of the release as soon as it’s slipping, and divert resources to getting the other things done)?
[Sarina]: Consider holding release meetings every month, always looking forward to the next release’s cut. Starting meetings in August for an October cut seems a bit late.
V. Action Items for Sumac release 📈
[Jenna]: Product to update https://docs.google.com/spreadsheets/d/1tWgp9LXNg4sfWYd_0ghNl6qfIZZ9851MtGBXPeSgzFs/edit?gid=0#gid=0 with current plans for Sumac, ahead of the August monthly planning
[Peter]: Review all the tests.
Remove ones that are no longer relevant.
Write better test descriptions for legacy tests, in the style of the new Redwood tests.
Look for opportunities to automate tests.
[Peter]: To discuss: should we invest in the test course repo?