Charles G Waldman
cgw at fnal.gov
Fri Jun 25 22:46:30 CEST 1999
I am building and testing software on Linux which will eventually be
deployed onto a variety of systems, including OS's with large
filesystem support. I need to be sure that all of the code (some of
which is C extension modules) is prepared to handle Python long
integer objects as file sizes and offsets - for instance, some of the C
code naively does a PyArg_ParseTuple(args, "i", &file_size), which
works fine only as long as the file size fits in a plain int.
For testing purposes, I've modified config.h after running
./configure, so that the symbol HAVE_LARGEFILE_SUPPORT is #defined.
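For reference, the hand edit amounts to adding a definition like the following to config.h after configure has run (the exact neighboring defines will vary by platform):

```c
/* Manually enabled for testing; normally set by ./configure
   on platforms with native large-file support. */
#define HAVE_LARGEFILE_SUPPORT 1
```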
This allows me to continue testing on Linux, and be fairly sure that
there will be no surprises when I run the code on an SGI or Solaris machine.
Since this feature is useful, I've started building Python this way
all the time. It seems to run fine, and I haven't run into any trouble.
I'm wondering if there's some hidden pitfall that I haven't thought of.