On Wed, Jan 29, 2014 at 9:11 AM, Vinay Sajip <vinay_sajip@yahoo.co.uk> wrote:
>> Does it mean that it actually makes sense to look into that
>> direction and make wheel usage closer to jar?
>
> There is a parallel discussion going on, with the title "Using Wheel with
> zipimport", which is relevant to this question and to other questions you
> raised (e.g. about supporting C extensions/pure-Python modules).

I read all of it and got a bit lost in between the distil API and the PEP
process discussion ;)
>> I have no knowledge about the C extensions scope, but I feel
>> like it might be of less importance than pure Python
>> packaging issues? Am I wrong?
>
> A lot of Python users depend on C extensions - and while it is a subset of
> all Python users, it is a large (and important) subset. Example: any usage
> of Python in numerical analysis or scientific applications involves use of
> C extensions.
>
> Regards,
>
> Vinay Sajip

I can see that it might be quite beneficial to have virtualenv and pip
install wheels locally for development needs, so here is what I have been
able to come up with so far:

I have one folder on NFS where everything we develop in Python is supposed
to be *deployed* - pythonlib. It is impossible to use pip or virtualenv
there, so I am bound to artifacts. The only way something can appear there
is through the "release" program, which knows how to put artifacts in
specified locations. Currently most of what lives there is plain .py
modules and a few eggs (some of them executable). But this setup allows
neither sane dependency management nor code reuse. I don't like the idea
of specifying dependencies in the code via sys.path; a sys.path resolved
from requirements.txt is a much better solution.
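
To make the contrast concrete, this is roughly what I mean (the package
names and paths below are made up):

    # what I want to get away from: dependencies wired into the code itself
    import sys
    sys.path.insert(0, "/nfs/pythonlib/somedep-1.2")  # hard-coded location
    import somedep

versus declaring the same dependency in a requirements.txt that a tool can
resolve for me:

    # requirements.txt (hypothetical pins)
    somedep==1.2
    anotherdep>=2.0,<3.0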

So I am looking for a solution that lets me use the same artifact for
everything (like a jar), so it can guarantee that the same code that was
tested goes to production and is used in development. Currently I am
leaning towards using pip's ability to work with flat folders via
--find-links, so I can deploy wheels to pythonlib and then reuse them in
the development environment.
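
On the development side I imagine it would then look roughly like this
(the environment path and the package name are just placeholders):

    # throwaway dev environment that installs only from the NFS wheel folder
    virtualenv ~/envs/myapp
    ~/envs/myapp/bin/pip install --no-index --find-links=/nfs/pythonlib myapp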

But in this setup, how do I make my program executable from the pythonlib
location? I think I should create some smart runner script that uses pip's
dependency resolution to build the necessary sys.path based on the wheel's
requirements, and my program wheel should then have an entry point like
__main__.py.
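
Something like this rough sketch is what I have in mind for the runner. It
is completely untested, the paths and names are made up, and it assumes the
distribution name matches the importable package and that the package ships
a __main__.py:

    #!/usr/bin/env python
    """Hypothetical runner: let pip resolve an app and its dependencies from
    the flat NFS wheel folder into a scratch directory, then execute the
    app's __main__ module from there."""

    import runpy
    import subprocess
    import sys
    import tempfile

    PYTHONLIB = "/nfs/pythonlib"  # flat folder full of wheels (made-up path)

    def run(app_name):
        # pip does the dependency resolution, looking only at the wheel folder
        target = tempfile.mkdtemp(prefix="runner-")
        subprocess.check_call([
            "pip", "install",
            "--no-index", "--find-links", PYTHONLIB,
            "--target", target,
            app_name,
        ])
        # put the resolved tree on sys.path and hand control to the app's
        # __main__ module, the same way "python -m app_name" would
        sys.path.insert(0, target)
        runpy.run_module(app_name, run_name="__main__", alter_sys=True)

    if __name__ == "__main__":
        run(sys.argv[1])  # e.g. runner.py myapp

Re-installing into a scratch directory on every run is obviously wasteful;
the alternative would be to put the wheel files themselves on sys.path, but
as far as I understand that only works for pure-Python wheels.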

As Nick pointed out, the wheel format is a superset of the egg format - so
I assume wheels can be executable, correct? How do I achieve that?

Thanks a lot!
Eugene