Meeting Notes

  • Definition of Done (5 min)
    • Consensus on long-term goal
    • Today's Problems/Observations
      • Developer Velocity
        • Pull request complexity
          • i.e., complexity arising from interdependencies
        • Pull request wall time for tests
        • Developer onboarding
        • Disorganization
      • Quality
        • Clear seams and contracts for tests
        • APIs
      • Extensibility/Modularity
        • Open source community
      • Incremental Upgrades (Python/Django) - Future
      • Time to value
        • brittleness
        • time to deployment
    • Question:
      • Will a containerized/modularized monolith within the same repo give us the benefits we're looking for, as measured by our metrics?
        • With the exception of:
          • incremental upgrades
          • test time - unless we come up with a smarter test-runner
          • independent scalability/failure-recovery
        • Goal for this working group: get baseline metrics, begin this work, and validate or invalidate the approach.
  • Metrics (30 min)
    • Before embarking on this project, what metrics can we use to validate our progress and success?
      • Collect metrics from the field
        • That extended around the monolith
          • Enterprise projects
          • Learner projects
        • That needed to deal with the monolith
          • Proctoring
        • That tried to modularize before this effort
          • Grades
          • Instructor Tasks
        • Which metrics?
          • Extensible Modularity - Repo lines touched (how many lines within the monolith needed to be changed for new features)
      • "Good metric"
        • Easy to measure
        • Actionable: validates or invalidates the approach
      • Hotspots within the codebase (see the git-log sketch after these notes)
        • Which modules of the code have a lot of contention?
        • Number of commits touching the same file
        • EdwardF to share Hotspot presentation
      • Code reviews (measuring modularity)
        • PRs that needed cross-team reviews
        • Number of reviewers that needed to be tagged
        • Number of comments
      • Pull Requests (see the GitHub API sketch after these notes)
        • Number of files touched
        • PRs that touched multiple modules
      • Time-to-value
        • GitHub branch age (see the branch-age sketch after these notes)
        • Release pipeline metrics
          • Number of rollbacks
          • Disrupted deployments
      • Tests
        • Time
        • Number of flaky tests
  • Review proposed Epics (15 min)
  • Next Steps (10 min)
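
To make the hotspot metric concrete, here is a minimal sketch of counting the commits that touched each file, as referenced in the Hotspots item above. It assumes a local clone of the repository; the six-month window and top-20 cutoff are illustrative choices, not decisions from this meeting.

```python
# Hotspot sketch: count how many commits touched each file in a local clone.
# The repo path, time window, and cutoff below are illustrative assumptions.
import subprocess
from collections import Counter

def hotspots(repo_path: str, since: str = "6 months ago", top: int = 20):
    # --name-only lists the files changed by each commit; the empty
    # --pretty format suppresses commit headers so only paths remain.
    out = subprocess.run(
        ["git", "-C", repo_path, "log", f"--since={since}",
         "--name-only", "--pretty=format:"],
        capture_output=True, text=True, check=True,
    ).stdout
    counts = Counter(line for line in out.splitlines() if line.strip())
    return counts.most_common(top)

if __name__ == "__main__":
    for path, commits in hotspots("."):
        print(f"{commits:5d}  {path}")
```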
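
For the pull-request and code-review metrics (files touched, reviewers, comments), a sketch along these lines could pull the numbers from the GitHub REST API. The repository name is a placeholder, the GITHUB_TOKEN environment variable and the sample of recently closed PRs are assumptions, and counting distinct review authors is only one possible reading of "reviewers that needed to be tagged".

```python
# PR-metrics sketch: files touched, reviewer count, and comments per PR,
# pulled from the GitHub REST API. The repo name is a placeholder and the
# GITHUB_TOKEN environment variable is an assumption, not a team decision.
import os
import requests

API = "https://api.github.com"
REPO = "your-org/your-monolith"  # placeholder, not the actual repo name
HEADERS = {"Authorization": f"token {os.environ['GITHUB_TOKEN']}"}

def pr_metrics(repo: str = REPO, sample: int = 50):
    # One list call for recently closed PRs, then one detail call per PR:
    # the detail payload includes changed_files and the comment counts
    # that the list payload omits.
    prs = requests.get(
        f"{API}/repos/{repo}/pulls",
        params={"state": "closed", "sort": "updated",
                "direction": "desc", "per_page": sample},
        headers=HEADERS, timeout=30,
    ).json()
    for pr in prs:
        detail = requests.get(pr["url"], headers=HEADERS, timeout=30).json()
        reviews = requests.get(f"{pr['url']}/reviews",
                               headers=HEADERS, timeout=30).json()
        yield {
            "number": detail["number"],
            "files_touched": detail["changed_files"],
            "reviewers": len({r["user"]["login"] for r in reviews if r.get("user")}),
            "comments": detail["comments"] + detail["review_comments"],
        }

if __name__ == "__main__":
    for row in pr_metrics():
        print(row)
```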
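
For the time-to-value item, branch age can be read straight from a clone. This sketch assumes an "origin" remote and reports age in days; both are illustrative choices rather than agreed measurements.

```python
# Branch-age sketch: age in days of each branch on the "origin" remote,
# as a rough proxy for time-to-value. The remote name and the day unit
# are illustrative assumptions.
import subprocess
from datetime import datetime, timezone

def branch_ages(repo_path: str = "."):
    out = subprocess.run(
        ["git", "-C", repo_path, "for-each-ref",
         "--format=%(refname:short) %(committerdate:unix)",
         "refs/remotes/origin"],
        capture_output=True, text=True, check=True,
    ).stdout
    now = datetime.now(timezone.utc).timestamp()
    for line in out.splitlines():
        name, _, ts = line.rpartition(" ")
        if not ts.isdigit():
            continue  # skip any ref whose date did not parse
        yield name, (now - int(ts)) / 86400  # age in days

if __name__ == "__main__":
    for name, days in sorted(branch_ages(), key=lambda item: -item[1]):
        print(f"{days:7.1f} days  {name}")
```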

Action Items

...