Attendees
Adam Palay, Andy Armstrong, Brian Beggs, EdwardF, JesseZ, Renzo Lucioni, Robert Raposa, Scott Dunn, Nimisha Asthagiri
...
- Definition of Done (5 min)
    - Consensus on long-term goal
    - Today's Problems/Observations
        - Developer Velocity
            - Pull request complexity
                - i.e., excessive interdependencies
            - Pull request wall time for tests
        - Developer onboarding
            - Disorganization
            - Pull request complexity
        - Quality
            - Clear seams and contracts for tests
            - APIs
        - Extensibility/Modularity
            - Open source community
            - Incremental Upgrades (Python/Django) - Future
        - Time to value
            - Brittleness
            - Time to deployment
            - Developer Velocity
    - Question:
        - Will a containerized/modularized monolith within the same repo give us the benefits we're looking for, as measured by our metrics?
            - With the exception of:
                - Incremental upgrades
                - Test time, unless we come up with a smarter test runner
                - Independent scalability/failure recovery
    - Goal for this working group: get baseline metrics, begin this work, and validate/invalidate the approach.
- Metrics (30 min)
    - Before embarking on this project, what metrics can we use to validate our progress and success?
    - Collect metrics from the field:
        - Projects that extended around the monolith
            - Enterprise projects
            - Learner projects
        - Projects that needed to deal with the monolith
            - Proctoring
        - Projects that previously tried to modularize from within
            - Grades
            - Instructor Tasks
    - Which metrics?
        - Extensible Modularity - repo lines touched: how many lines within the monolith needed to be changed for new features (sketch below)
            - Especially for projects that extended around the monolith
        - What makes a "good metric":
            - Easy to measure
            - Actionable
            - Validates/invalidates the approach
            - Not too many other contributing factors, so the metric is reliable for this project
        - Hotspots within the codebase
            - Which modules of the code have a lot of contention?
            - Number of commits touching the same file (sketch below)
            - EdwardF to share Hotspot presentation
        - Code reviews (measuring modularity)
            - PRs that needed cross-team reviews
                - Number of reviewers that needed to be tagged
                    - ?? can tag teams as well as individuals
                - Number of approvals may be a better proxy, to measure how many blockers are on a PR (sketch below)
                - Number of comments (may be impacted more by other factors, such as the reviewer)
        - Pull Requests
            - Number of files touched
            - PRs that touched multiple modules (sketch below)
        - Pylint
            - Number of complexity-disabling pylint warnings (sketch below)
            - Number of total pylint errors (could potentially go down when code is touched)
        - Time-to-value (too many other contributing factors; may not be directly related to the monolith initiative)
            - Github branch age (sketch below)
                - Age of commits on the branch, in case developers worked on it locally for a long time?
            - Release pipeline metrics (already available)
                - Number of rollbacks
                - Disrupted deployments
        - Tests (may not be affected until tests are smarter or until code is actually pulled out)
            - Time (sketch below)
            - Number of flaky tests
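
A minimal sketch of the repo-lines-touched metric, assuming plain `git` plus the Python standard library; the `lms`/`cms`/`common` path filter and the branch name are placeholder assumptions, not agreed tooling:

```python
import subprocess

def monolith_lines_touched(rev_range, paths=("lms", "cms", "common"), repo="."):
    """Sum lines added/removed inside the given monolith paths for a rev range."""
    out = subprocess.check_output(
        ["git", "log", rev_range, "--numstat", "--pretty=format:", "--", *paths],
        cwd=repo, text=True,
    )
    added = removed = 0
    for line in out.splitlines():
        cols = line.split("\t")  # numstat rows look like: "<added>\t<removed>\t<path>"
        if len(cols) == 3 and cols[0].isdigit() and cols[1].isdigit():
            added, removed = added + int(cols[0]), removed + int(cols[1])
    return added, removed

print(monolith_lines_touched("master..my-feature-branch"))  # hypothetical branch name
```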
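For the hotspot metric, a rough sketch that counts how many commits touched each file over a window; the one-year window is an arbitrary assumption:

```python
import subprocess
from collections import Counter

def hotspots(repo=".", since="1 year ago"):
    """Count commits touching each file; high counts suggest contention."""
    out = subprocess.check_output(
        ["git", "log", f"--since={since}", "--name-only", "--pretty=format:"],
        cwd=repo, text=True,
    )
    return Counter(line for line in out.splitlines() if line.strip())

for path, commits in hotspots().most_common(20):
    print(f"{commits:5d}  {path}")
```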
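For reviewer and approval counts, a sketch against GitHub's reviews endpoint (`GET /repos/{owner}/{repo}/pulls/{number}/reviews`, which is a real API); the PR number is a placeholder, and pagination and auth are glossed over:

```python
import requests

def review_stats(owner, repo, pr_number, token=None):
    """Count distinct reviewers and approvals on one PR (first page only)."""
    headers = {"Authorization": f"token {token}"} if token else {}
    url = f"https://api.github.com/repos/{owner}/{repo}/pulls/{pr_number}/reviews"
    reviews = requests.get(url, headers=headers).json()
    return {
        "reviewers": len({r["user"]["login"] for r in reviews}),
        "approvals": sum(r["state"] == "APPROVED" for r in reviews),
    }

print(review_stats("edx", "edx-platform", 1))  # placeholder PR number
```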
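To flag PRs that touch multiple modules, one option is mapping each changed file to its top-level directory; this sketch assumes top-level directories approximate modules and inspects a single commit:

```python
import subprocess

def modules_touched(commit="HEAD", repo="."):
    """Top-level directories containing files changed by one commit.

    Note: merge commits may need `git diff --name-only base..head` instead.
    """
    out = subprocess.check_output(
        ["git", "show", "--name-only", "--pretty=format:", commit],
        cwd=repo, text=True,
    )
    return {line.split("/", 1)[0] for line in out.splitlines() if line.strip()}

mods = modules_touched()
print(f"{len(mods)} module(s) touched: {sorted(mods)}")
```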
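Counting complexity-disabling pylint pragmas can start as a simple grep; the message names below are real pylint checks, but which ones count as "complexity" is a judgment call:

```python
import pathlib
import re

COMPLEXITY = ("too-many-branches", "too-many-statements", "too-many-locals",
              "too-many-arguments", "too-many-instance-attributes")
PRAGMA = re.compile(r"#\s*pylint:\s*disable=([\w\-, ]+)")

count = 0
for path in pathlib.Path(".").rglob("*.py"):
    for match in PRAGMA.finditer(path.read_text(errors="ignore")):
        count += any(name in match.group(1) for name in COMPLEXITY)
print(f"{count} complexity-disabling pylint pragmas")
```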
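For branch age, a sketch measuring the age of the oldest commit on a branch that is not yet on the base branch; `master` as the base is an assumption:

```python
import subprocess
from datetime import datetime, timezone

def branch_age_days(branch, base="master", repo="."):
    """Days since the oldest unmerged commit on `branch` was authored."""
    out = subprocess.check_output(
        ["git", "log", f"{base}..{branch}", "--pretty=format:%at"],
        cwd=repo, text=True,
    )
    if not out.strip():
        return 0  # branch fully merged (or empty)
    oldest = datetime.fromtimestamp(min(int(s) for s in out.split()), tz=timezone.utc)
    return (datetime.now(timezone.utc) - oldest).days
```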
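For test wall time, a sketch that totals per-test durations from a JUnit-style XML report; `report.xml` is a placeholder path, and counting flaky tests would need rerun data this report format may not carry:

```python
import xml.etree.ElementTree as ET

cases = list(ET.parse("report.xml").getroot().iter("testcase"))
total = sum(float(c.get("time", 0)) for c in cases)
print(f"{len(cases)} test cases, {total:.1f}s total")
```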
- Next Steps (10 min)
    - Swarm or Divide-n-Conquer?
    - Review proposed Epics
...