Attendees

Adam Palay, Andy Armstrong, Brian Beggs, EdwardF, JesseZ, Renzo Lucioni, Robert Raposa, Scott Dunn, Nimisha Asthagiri

Meeting Notes

  • Definition of Done (5 mins)
    • Consensus on long-term goal
    • Today's Problems/Observations
      • Developer Velocity
        • Pull request complexity
          • e.g., due to interdependencies
        • Pull request wall time for tests
        • Developer onboarding
        • Disorganization
      • Quality
        • Clear seams and contracts for tests
        • APIs
      • Extensibility/Modularity
        • Open source community
      • Incremental Upgrades (Python/Django) - Future
      • Time to value
        • Brittleness
        • Time to deployment
    • Question:
      • Will a containerized/modularized monolith within the same repo give us the benefits we're looking for, as measured by our metrics?
        • With the exception of:
          • incremental upgrades
          • test time - unless we come up with a smarter test-runner
          • independent scalability/failure-recovery
        • Goal for this working group: get baseline metrics, begin this work and validate/invalidate the approach.
  • Metrics (30 mins)
    • Before embarking on this project, what metrics can we use to validate our progress and success?
      • Collect metrics from the field
        • That extended around the monolith
          • Enterprise projects
          • Learner projects
        • That needed to deal with the monolith
          • Proctoring
        • That tried to modularize before doing feature work
          • Grades
          • Instructor Tasks
        • Which metrics?
          • Extensible modularity: repo lines touched (how many lines within the monolith had to change for new features); see the sketch below
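One way to collect the lines-touched metric is to sum the diff stats of a feature branch against the monolith's paths. A minimal sketch, assuming a local clone of edx-platform; the branch name and module paths are hypothetical placeholders:

```python
"""Count lines changed inside the monolith for a feature branch.

Sketch only: BRANCH and MONOLITH_PATHS are hypothetical placeholders.
"""
import subprocess

BRANCH = "feature/my-feature"                  # hypothetical branch name
MONOLITH_PATHS = ("lms/", "cms/", "common/")   # assumed monolith roots

def lines_touched(branch, paths, base="master"):
    # --numstat prints "added<TAB>deleted<TAB>path" for each changed file.
    out = subprocess.check_output(
        ["git", "log", "--numstat", "--format=", f"{base}..{branch}"],
        text=True,
    )
    total = 0
    for line in out.splitlines():
        if not line.strip():
            continue
        added, deleted, path = line.split("\t", 2)
        if added != "-" and any(path.startswith(p) for p in paths):
            total += int(added) + int(deleted)  # "-" marks binary files
    return total

print(lines_touched(BRANCH, MONOLITH_PATHS))
```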
      • "Good metric"
        • Easy to measure
        • Actionable:
          • it validates or invalidates the approach
          • it has few other contributing factors, so the metric is reliable for this project
      • Hotspots within the codebase
        • Which modules of the code have a lot of contention?
        • Number of commits touching the same file (see the sketch below)
        • EdwardF to share Hotspot presentation
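A minimal sketch of the commits-per-file hotspot count, using only git log; the six-month analysis window is an assumption, and the script should be run from the repo root:

```python
"""Rank files by number of commits touching them (churn hotspots).

Sketch only: the analysis window is an assumption.
"""
import subprocess
from collections import Counter

SINCE = "6 months ago"  # assumed window

def hotspots(top=20):
    # --name-only with an empty format prints only the changed paths.
    out = subprocess.check_output(
        ["git", "log", f"--since={SINCE}", "--name-only", "--format="],
        text=True,
    )
    counts = Counter(path for path in out.splitlines() if path.strip())
    return counts.most_common(top)

for path, commits in hotspots():
    print(f"{commits:5d}  {path}")
```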
      • Code reviews (measuring modularity)
        • PRs that needed cross-team reviews
        • Number of reviewers that needed to be tagged
          • Open question: can teams be tagged as reviewers, as well as individuals?
          • Number of approvals may be a better proxy for how many blockers a PR has (see the sketch below)
        • Number of comments (may be influenced more by other factors, such as the reviewer)
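A minimal sketch of the approvals count, using the GitHub reviews endpoint (GET /repos/{owner}/{repo}/pulls/{number}/reviews); the PR number and token are placeholders, and pagination is ignored for brevity:

```python
"""Count approving reviews on a pull request via the GitHub REST API.

Sketch only: PR_NUMBER and TOKEN are placeholders; pagination is ignored.
"""
import requests

OWNER, REPO, PR_NUMBER = "edx", "edx-platform", 1234  # hypothetical PR
TOKEN = "<personal-access-token>"

def approvals(owner, repo, number, token):
    url = f"https://api.github.com/repos/{owner}/{repo}/pulls/{number}/reviews"
    resp = requests.get(url, headers={"Authorization": f"token {token}"})
    resp.raise_for_status()
    # Each review object has a state: APPROVED, CHANGES_REQUESTED, COMMENTED.
    return sum(1 for review in resp.json() if review["state"] == "APPROVED")

print(approvals(OWNER, REPO, PR_NUMBER, TOKEN))
```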
      • Pull Requests
        • Number of files touched
        • PRs that touched multiple modules (see the sketch below)
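A minimal sketch of the multi-module check, treating each top-level directory as a module; that mapping is an assumption and should be adjusted to the real module boundaries:

```python
"""Count top-level modules touched by a commit range (e.g., one PR's diff).

Sketch only: "module" here means top-level directory, which is an assumption.
"""
import subprocess

def modules_touched(rev_range):
    out = subprocess.check_output(
        ["git", "diff", "--name-only", rev_range], text=True
    )
    return {path.split("/", 1)[0] for path in out.splitlines() if path.strip()}

mods = modules_touched("master...feature/my-feature")  # hypothetical branch
print(len(mods), sorted(mods))
```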
      • Complexity
        • Number of pylint pragmas that disable complexity warnings (see the sketch below)
        • Trend line of a code-complexity metric over time
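A minimal sketch of the pragma count, scanning Python files for pylint disable comments; which message names count as "complexity" is an assumption:

```python
"""Count pylint pragmas that disable complexity-related checks.

Sketch only: the set of messages treated as "complexity" is an assumption.
"""
import pathlib
import re

COMPLEXITY_MSGS = {
    "too-many-branches", "too-many-statements",
    "too-many-locals", "too-many-arguments",
}
PRAGMA = re.compile(r"pylint:\s*disable=([\w\-, ]+)")

def count_pragmas(root="."):
    total = 0
    for path in pathlib.Path(root).rglob("*.py"):
        for match in PRAGMA.finditer(path.read_text(errors="ignore")):
            names = {name.strip() for name in match.group(1).split(",")}
            total += len(names & COMPLEXITY_MSGS)
    return total

print(count_pragmas())
```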
      • Time-to-value (too many other contributing factors; may not be directly attributable to the monolith initiative)
        • GitHub branch age
          • Age of the oldest commit on the branch, in case developers worked on it locally for a long time (see the sketch below)
        • Release pipeline metrics - these are already available
          • Number of rollbacks
          • Disrupted deployments
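A minimal sketch of the branch-age measurement, taking the committer date of the oldest commit unique to the branch; the branch name and base branch are placeholders:

```python
"""Age in days of the oldest commit unique to a branch.

Sketch only: the branch name and base branch are placeholders.
"""
import subprocess
import time

def branch_age_days(branch, base="master"):
    # %ct is the committer timestamp; git log lists newest first,
    # so the last line is the oldest commit on the branch.
    timestamps = subprocess.check_output(
        ["git", "log", "--format=%ct", f"{base}..{branch}"], text=True
    ).split()
    if not timestamps:
        return 0.0
    return (time.time() - int(timestamps[-1])) / 86400

print(branch_age_days("feature/my-feature"))  # hypothetical branch
```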
      • Tests (may not be affected until tests are smarter or until code is actually pulled out)
        • Time
        • Number of flaky tests (see the sketch below)
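A minimal sketch for pulling time and failure totals out of a JUnit XML report (e.g., from pytest --junitxml=report.xml); detecting flakiness still requires comparing these totals across repeated runs of the same code:

```python
"""Total test time and failure count from a JUnit XML report.

Sketch only: assumes the runner wrote report.xml in JUnit format.
"""
import xml.etree.ElementTree as ET

def summarize(report="report.xml"):
    root = ET.parse(report).getroot()
    total_time, failed = 0.0, 0
    # The root may be <testsuites> or a single <testsuite>.
    for suite in root.iter("testsuite"):
        total_time += float(suite.get("time", 0))
        failed += int(suite.get("failures", 0)) + int(suite.get("errors", 0))
    return total_time, failed

seconds, failed = summarize()
print(f"{seconds:.1f}s total, {failed} failed/errored")
```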
  • Next Steps (10 mins)

Action Items