DevPI local server

As part of PLAT-1910 I set up a locally running DevPI instance and tested Devstack against it. The goal was to understand the benefits and limitations of the software as a package-caching solution and to compare it against the Artifactory local setup test.


Initially I tried a few Docker containers pre-configured with DevPI, but none of them was up to date and each had issues that prevented it from even starting for me. Eventually I just created a local virtualenv for these tests and ran the server from there. Getting a server running was trivial, using the generally good official documentation:

(devpi) C02RD35PG8WM:devpi brianmesick$ pip install -q -U devpi-server
(devpi) C02RD35PG8WM:devpi brianmesick$ devpi-server --start --init --host 0.0.0.0

NOTE: I had to add --host to allow Docker containers to connect via IP; by default devpi-server binds only to localhost.
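Since containers can't reach the server via localhost (that resolves to the container itself), the host machine's LAN IP is what goes in the container-side config. A small sketch for discovering it (this helper is not from the original writeup, just an illustration):

```python
# Hypothetical helper: find the host's LAN IP so Docker containers can
# reach the devpi server running on the host.
import socket

def host_lan_ip() -> str:
    """Return the IP of the interface used for outbound traffic."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # No packets are sent; connect() on a UDP socket just selects a route.
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    except OSError:
        return "127.0.0.1"  # fallback when no outbound route is available
    finally:
        s.close()

print(host_lan_ip())  # e.g. 192.168.1.23
```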


Configuring pip to use DevPI as its primary PyPI server and take advantage of local caching was much the same as for Artifactory. Create a ~/.pip/pip.conf like this:

[global]
# URLs assume devpi-server's default port 3141 and its built-in root/pypi mirror index
index-url = http://localhost:3141/root/pypi/+simple/
# Next line is necessary if you aren't using TLS on the DevPI host
trusted-host = localhost

[search]
# The search section is necessary for DevPI, but did not seem to be for Artifactory
index = http://localhost:3141/root/pypi/

NOTE: I had to replace localhost with my host machine IP due to connecting from a Docker container.
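To avoid hand-editing the file when switching between localhost and the host IP, the config can be generated. A sketch using the standard library (the port and index path assume devpi defaults):

```python
# Sketch: generate the pip.conf above with the DevPI host parameterized,
# so "localhost" can be swapped for the host machine's IP when pip runs
# inside a Docker container. Port 3141 is devpi-server's default, and
# root/pypi is devpi's built-in PyPI mirror index.
from configparser import ConfigParser
from io import StringIO

def make_pip_conf(host: str, port: int = 3141, index: str = "root/pypi") -> str:
    cfg = ConfigParser()
    cfg["global"] = {
        "index-url": f"http://{host}:{port}/{index}/+simple/",
        # Needed because the local DevPI host is plain HTTP rather than TLS.
        "trusted-host": host,
    }
    # The [search] section is needed for DevPI but not for Artifactory.
    cfg["search"] = {"index": f"http://{host}:{port}/{index}/"}
    buf = StringIO()
    cfg.write(buf)
    return buf.getvalue()

print(make_pip_conf("192.168.1.23"))
```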

Using pip install from the command line showed packages being pulled from PyPI on the first run, with subsequent runs served from the DevPI cache.

As with Artifactory, a user and an index need to be set up ahead of time in order to have permission to upload packages. Instead of the web interface (which DevPI has, but I did not try) I was able to do it with the command line tools:

# Point the devpi client at the server (assumes the default port)
devpi use http://localhost:3141

# Create a user
devpi user -c admin password=password

# Log in as that user
devpi login admin --password=password

# dev is the name of the index; it will be prefixed with the user name
# to make an index name of admin/dev
devpi index -c dev bases=root/pypi
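These bootstrap steps are easy to script for repeatable teardown/rebuild. A minimal sketch that assembles the same devpi-client commands (the dry-run path just prints them, so no server is needed; user/password/index names are the placeholder values from above):

```python
# Sketch: scripting the user/index bootstrap above with devpi-client.
import subprocess

def bootstrap_cmds(server: str, user: str = "admin",
                   password: str = "password",
                   index: str = "dev") -> list[list[str]]:
    """The same command sequence as the manual steps, as argv lists."""
    return [
        ["devpi", "use", server],
        ["devpi", "user", "-c", user, f"password={password}"],
        ["devpi", "login", user, f"--password={password}"],
        # Basing the index on root/pypi lets uncached packages fall
        # through to PyPI.
        ["devpi", "index", "-c", index, "bases=root/pypi"],
    ]

def run_bootstrap(server: str, dry_run: bool = False) -> None:
    for cmd in bootstrap_cmds(server):
        if dry_run:
            print(" ".join(cmd))
        else:
            subprocess.run(cmd, check=True)

run_bootstrap("http://localhost:3141", dry_run=True)
```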

Pushing Packages

Pushing packages to the local DevPI can be done in a couple of ways. I tried the same script I wrote for uploading to Artifactory, with moderate success, but more packages failed to upload out of the box than with Artifactory: only 10 of 35 succeeded immediately. The script used the setup.py sdist upload command and a ~/.pypirc config file, identical to the Artifactory one except for the port:

[distutils]
index-servers =
    devpi

[devpi]
# Repository URL assumes the default port and the admin/dev index created above
repository: http://localhost:3141/admin/dev/
username: admin
password: password

I had better luck using devpi upload from inside each package's directory, which got the same 29 packages uploaded to DevPI as were in Artifactory (5 still failed for various reasons).
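Running devpi upload across many package checkouts is easy to batch. A sketch that collects failures instead of stopping at the first one (the command is parameterized purely so the loop can be exercised without a running server; package paths are hypothetical):

```python
# Sketch: batch-upload by running "devpi upload" from inside each
# package directory, collecting the directories that failed.
import subprocess
from pathlib import Path

def upload_all(package_dirs: list[Path],
               cmd: tuple[str, ...] = ("devpi", "upload")) -> list[Path]:
    """Run cmd with cwd set to each package dir; return dirs that failed."""
    failed = []
    for pkg in package_dirs:
        result = subprocess.run(list(cmd), cwd=pkg,
                                capture_output=True, text=True)
        if result.returncode != 0:
            failed.append(pkg)
    return failed

# Usage (against a real server):
#   failed = upload_all([Path("pkgs/edx-foo"), Path("pkgs/edx-bar")])
```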


Timing

All tests measured the setup time of a clean Devstack Django 1.11 tox environment (.tox entirely removed between tests), triggered by running: tox -e py27-django111 -- pytest . Times were measured from hitting enter on the command to when the line "py27-django111 installed:" appeared. NOTE: These tests were run from my home connection, so uncached runs are likely a fair amount slower than they would be in the office due to lower bandwidth.

Using current github.txt

Default current setup: 10:33
Empty DevPI (caching all dependencies for the first time): 10:22
Populated DevPI (all dependencies cached that it can): 6:52

Using github.txt with our 29 non-pip packages built and uploaded to DevPI (5 still using github requirements)

Empty DevPI (github.txt packages cached, nothing else): 6:14
Populated DevPI: 4:13
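Converting the reported m:ss figures to seconds puts numbers on the improvement (a quick sketch using only the timings above):

```python
# The timings above, converted to seconds to quantify the speedup.
def to_seconds(t: str) -> int:
    m, s = t.split(":")
    return int(m) * 60 + int(s)

baseline = to_seconds("10:33")  # default current setup
cached = to_seconds("6:52")     # populated DevPI, current github.txt
built = to_seconds("4:13")      # populated DevPI + 29 built packages

print(f"cache only: {1 - cached / baseline:.0%} faster")            # ~35%
print(f"cache + built packages: {1 - built / baseline:.0%} faster")  # ~60%
```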


Conclusions

  • DevPI works, and the documentation is good
  • Clearing the PyPI cache and uploaded packages seemed unnecessarily difficult; eventually I just deleted and re-created the index to get to a clean state
    • There may be options / plugins to help with this that I didn't find in my quick searches
  • It seems easy enough to run and script, and has plugins to help with things like testing, Jenkins-automated package building, etc.
  • The other conclusions from the Artifactory writeup apply here as well