On Thu, Jun 16, 2016 at 8:47 PM, Nick Coghlan firstname.lastname@example.org wrote:
On 16 June 2016 at 05:01, Jim Fulton email@example.com wrote:
I'm a fan of docker, but it seems to me that build workflow is an unsolved problem if you need build tools that you don't want to be included at run time.
For OpenShift's source-to-image (which combines a builder image with a git repo to create your deployment image), they tackled that problem by setting up a multi-step build pipeline: a first builder image creates the binary artifact (they use a WAR file in their example, but it can be an arbitrary tarball, since you're not doing anything with it except handing it to the next stage of the pipeline), and then a second builder image provides the required runtime environment and unpacks the tarball into the right place.
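As a sketch of the first half of such a pipeline (the image name, paths, and base image here are hypothetical, not OpenShift's actual conventions):

```dockerfile
# builder.Dockerfile -- first stage: has the build tools, produces the artifact
FROM python:2.7
RUN apt-get update && apt-get install -y build-essential
COPY . /src
# Compile the app and its dependencies into wheels under /artifact
RUN pip wheel --wheel-dir /artifact /src
```

A driver script would then run this image, tar up /artifact (e.g. `docker run builder tar -cz -C /artifact . > artifact.tar.gz`), and hand the tarball to a second, runtime-only build.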
Or pip. I don't think this is really about buildout or pip. I agree that a multi-step process is needed if builds are required. I suggested that in my original response.
When I said this was an unsolved problem, I didn't mean that there weren't solutions, but that there didn't seem to be any widely accepted workflow for this within the docker community (or at least no widely shared documentation of one). The common assumption is that an image is defined by a single build file, to the point that images are often "published" as git repositories containing build files, rather than through a docker repository. <shrug>
Anyway, getting back to Reinout's original question, I don't think docker build should be used for development. A docker build should assemble (again, using buildout, pip, or whatever) existing resources without actually building anything.
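For example, a runtime image that only assembles prebuilt artifacts might look like this (hypothetical names; it assumes the wheels were built elsewhere, e.g. in a separate builder image, and packed into artifact.tar.gz):

```dockerfile
FROM python:2.7-slim
# ADD auto-extracts a local tarball of prebuilt wheels
ADD artifact.tar.gz /wheels/
# Install strictly from those wheels: no index, no compilers, nothing built here
RUN pip install --no-index --find-links=/wheels myapp \
    && rm -rf /wheels
CMD ["myapp"]
```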
(Of course, docker can be used for development, and docker build could and probably should be used to create build environments.)
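A minimal sketch of that last idea (package names are hypothetical): docker build produces an image holding only the toolchain, and docker run uses it to emit wheels onto the host, so compilers never enter the deployment image.

```dockerfile
# buildenv.Dockerfile -- a reusable build environment, never deployed
FROM python:2.7
RUN apt-get update && apt-get install -y build-essential libpq-dev
WORKDIR /src
# Usage (from the project checkout):
#   docker build -f buildenv.Dockerfile -t buildenv .
#   docker run --rm -v "$PWD":/src -v "$PWD"/wheels:/wheels buildenv \
#       pip wheel --wheel-dir /wheels .
```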