Lightest Plausible LMS/Studio Configuration

Say we want to run an actual Open edX site on a Raspberry Pi 4. The smallest plausible thing I could see working:

  • Raspberry Pi OS (64-bit) – we want arm64 for library compatibility reasons

  • Python running directly (using pyenv or some such, but not through Docker)

  • SQLite for the database, with WAL (write-ahead logging) enabled so reads don’t block on writes (sketch after this list)

  • Redis, serving both as the cache and as the Celery broker/result backend (sketch after this list)

  • Port contentstore/SplitMongoKVS to use Django storages (or wait until Learning Core data models eat everything). Use local storage; we could get away with the definition-doc fetch latency if everything’s reading from an SSD on local disk (hypothetical sketch after this list)

  • nginx reverse proxy (or maybe Caddy?), serving:

    • edx-platform static assets

    • prebuilt MFE assets
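
For the SQLite piece, roughly what I’d expect the Django side to look like. This is a minimal sketch, assuming a made-up database path; it uses a `connection_created` signal so the WAL pragma is applied to every new connection:

```python
# Minimal sketch: SQLite + WAL from Django settings. The path is illustrative,
# not an actual edx-platform default.
from django.db.backends.signals import connection_created
from django.dispatch import receiver

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": "/var/edxapp/data/edxapp.db",  # hypothetical location on the Pi's SSD
        "OPTIONS": {
            # SQLite allows only one writer at a time, so wait on locks
            # instead of erroring out immediately.
            "timeout": 20,
        },
    }
}

@receiver(connection_created)
def enable_wal(sender, connection, **kwargs):
    """Turn on write-ahead logging so readers don't block behind the writer."""
    if connection.vendor == "sqlite":
        with connection.cursor() as cursor:
            cursor.execute("PRAGMA journal_mode=WAL;")
            cursor.execute("PRAGMA synchronous=NORMAL;")  # usual pairing with WAL
```

WAL is the part that makes SQLite plausible here: LMS/Studio traffic is read-heavy, and without it every write would block those reads.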
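For Redis doing double duty, something like the following, assuming Django 4+’s built-in Redis cache backend and generic Celery setting names (the exact names edx-platform uses may differ):

```python
# Sketch: one local Redis instance backing both the Django cache and Celery.
# DB numbers and setting names are illustrative.
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.redis.RedisCache",
        "LOCATION": "redis://127.0.0.1:6379/1",
    }
}

# Celery broker + result backend on the same Redis, different DB.
CELERY_BROKER_URL = "redis://127.0.0.1:6379/0"
CELERY_RESULT_BACKEND = "redis://127.0.0.1:6379/0"
```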
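To make the contentstore idea concrete, a purely hypothetical sketch of a definition-doc store sitting on Django’s storage API; the class name, paths, and key scheme are invented, not existing SplitMongoKVS code. Local `FileSystemStorage` keeps everything on the Pi’s SSD, and a bigger deployment could swap in an S3-style backend from django-storages without touching the callers:

```python
# Hypothetical sketch only: a storages-backed definition-document store.
import json

from django.core.files.base import ContentFile
from django.core.files.storage import FileSystemStorage


class DefinitionDocStore:
    """Read/write block definition documents as JSON files via Django storage."""

    def __init__(self, root="/var/edxapp/definitions"):  # illustrative path
        self.storage = FileSystemStorage(location=root)

    def _name(self, definition_id):
        return f"{definition_id}.json"

    def save(self, definition_id, doc):
        name = self._name(definition_id)
        if self.storage.exists(name):
            # FileSystemStorage won't overwrite; it would mangle the name instead.
            self.storage.delete(name)
        self.storage.save(name, ContentFile(json.dumps(doc).encode("utf-8")))

    def load(self, definition_id):
        with self.storage.open(self._name(definition_id)) as f:
            return json.load(f)
```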

This configuration would be optimized for low resource usage and easy setup, sacrificing scalability to get it. It could also be the default configuration when you’re running on bare metal, i.e. it’s self-contained and easy to get up and running.

Would this be feasible for Quince? Would anyone want it?