On Windows and Mac OS, clearly these should go somewhere under sys.prefix: this is the directory for all things Python, including third-party module distributions. If Brian Hooper distributes a module "foo" that requires a data file containing character encoding data (yes, this is based on a true story), then the module belongs in (e.g.) C:\Python and the data file in (?) C:\Python\Data. (Maybe C:\Python\Data\foo, but that's a minor wrinkle.)
Any disagreement so far?
A little. I don't think we need a new dumping ground for arbitrary files that no one can associate with their application.
Why not put the data with the code? It is quite trivial for a Python package or module to find its own location, and this way we are not dependent on anything.
Why assume packages are installed _under_ Python? Why not just assume the package is _reachable_ by Python? Once our package/module is being executed by Python, we know exactly where we are.
On my machine, there is no "data" equivalent; the closest would be "python-cvs\pcbuild\data", and that certainly doesn't make sense. Why can't I just place it where I put all my other Python extensions, ensure it is on the PYTHONPATH, and have it "just work"?
It sounds a little complicated - do we provide an API for this magic location, or does everybody cut and paste a reference implementation for locating it? Either way sounds pretty bad - the API shouldn't be distutils-dependent (I may not have installed this package via distutils), and really Python itself shouldn't care about this...
So all in all, I don't think this is a problem we need to push up to this level - let each package author do whatever makes sense, and point out how trivial it would be if you assume code and data live in the same place/tree.
[If the data is considered read/write, then you need a better answer anyway, as you can't assume "c:\python\data" is writable (when actually running the code) any more than "c:\python\my_package" is.]
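For the read/write case, one hedged sketch: fall back to a per-user directory, which is writable even when the install tree is not. The location strategy and the name `user_data_dir` are assumptions for illustration, not an established API.

```python
import os

def user_data_dir(app_name):
    """Return a per-user writable directory for app_name, creating it
    if needed. Uses APPDATA on Windows, else the home directory."""
    base = os.environ.get("APPDATA") or os.path.expanduser("~")
    path = os.path.join(base, app_name)
    # The user's own area is writable even when C:\Python is not.
    os.makedirs(path, exist_ok=True)
    return path
```

The design choice here is that writable data is tied to the *user*, not to the install location, which sidesteps the whole question of where the package itself lives.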