The ability to resolve dependencies with static metadata is the major one that comes to my mind that’s specific to pip. The ability to have better build systems besides distutils/setuptools is a more ecosystem-level one, but that’s something we’ll get to.
As far as shared libs… beyond what’s already possible (sticking a shared lib inside of a Python project and having libraries load that .dll explicitly) it’s not currently on the road map and may never be. I hesitate to say never because it’s obviously a problem that needs to be solved, and if the Python ecosystem solves it (specific to shared libraries, not whole runtimes or other languages or what have you) then that would be a useful thing. I think we have lower-hanging fruit that we need to deal with before something like that could even possibly be on the radar, though (if we ever put it on the radar).
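For concreteness, the "stick a shared lib inside a Python project and load it explicitly" approach mentioned above usually looks something like this. This is a minimal sketch, not anything pip does for you; the function names (``bundled_lib_path``, ``load_bundled``) and the ``foo`` library are hypothetical.

```python
import ctypes
import os
import sys

def bundled_lib_path(name, package_dir):
    """Build the path to a shared library shipped inside a package directory."""
    # Hypothetical helper: pick the platform-specific filename convention.
    if sys.platform == "win32":
        filename = name + ".dll"
    elif sys.platform == "darwin":
        filename = "lib" + name + ".dylib"
    else:
        filename = "lib" + name + ".so"
    return os.path.join(package_dir, filename)

def load_bundled(name, package_dir):
    """Explicitly load a shared library that ships alongside the Python code."""
    # Loading by absolute path sidesteps the platform's library search path
    # entirely -- which is the point of bundling the .dll/.so with the
    # project rather than relying on the surrounding environment.
    return ctypes.CDLL(bundled_lib_path(name, package_dir))
```

A package would typically call ``load_bundled("foo", os.path.dirname(__file__))`` at import time. It works, but it’s per-project plumbing; there’s no standard location or metadata for it, which is exactly the gap being discussed.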
I guess I’m confused about what the benefit of making pip able to install a conda package would be. If Python adds someplace for shared libs to go, then we could just add shared lib support to Wheels; it’s just another file type, so that’s not a big deal. The hardest part is dealing with ABI compatibility. However, given the current state of things, what’s the benefit of being able to do ``pip install conda-lxml``? Either it’s going to flat out break, or you’re going to have to do ``conda install libxml2`` first — and if you’re doing ``conda install libxml2`` first, then why not just do ``conda install lxml``?
I view conda the same way I view apt-get, yum, Chocolatey, etc. It provides an environment, and you can install a Python package into that environment, but pip shouldn’t know how to install a .deb or an .rpm or a conda package, because those packages rely on specifics of that environment in ways Python packages can’t.