adalke at mindspring.com
Thu Sep 23 22:04:53 CEST 2004
> Do you mean fd.seek?
> I did manage to read and write a 4GB file on crcdocs using the script I
> posted above. Perhaps I don't understand what the "large" in large
> file really means. I assumed it was 2**31 approx equal 2GB. Has the
> default, non LFS limit, increased?
I think the answer is "it depends."
By default Python checks for large file support. From the configure script:
if test "$have_long_long" = yes -a \
        "$ac_cv_sizeof_off_t" -gt "$ac_cv_sizeof_long" -a \
        "$ac_cv_sizeof_long_long" -ge "$ac_cv_sizeof_off_t"; then
  cat >>confdefs.h <<\_ACEOF
#define HAVE_LARGEFILE_SUPPORT 1
In other words, the default is to support large files,
but only when the system has a wider-than-32-bit off_t.
CVS says that check was added in 1999.
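That sizeof comparison can be mimicked from Python itself -- a rough
sketch only, not Python's actual build machinery, and it assumes a
64-bit off_t (typical on systems with large file support):

```python
import struct

# Rough analogue of the configure test: HAVE_LARGEFILE_SUPPORT is
# defined when off_t is wider than long and long long is at least
# as wide as off_t.  struct gives the native C sizes for long ("l")
# and long long ("q"); 8 bytes stands in for an assumed 64-bit off_t.
sizeof_long = struct.calcsize("l")
sizeof_long_long = struct.calcsize("q")
sizeof_off_t = 8  # assumption: 64-bit off_t

would_define = (sizeof_off_t > sizeof_long and
                sizeof_long_long >= sizeof_off_t)
print("HAVE_LARGEFILE_SUPPORT would be", int(would_define))
```

Note that on a 64-bit platform the comparison fails, which is fine:
long is already big enough there, so the define isn't needed.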
The fileobject.c code for seek has

#if !defined(HAVE_LARGEFILE_SUPPORT)
        offset = PyInt_AsLong(offobj);
#else
        offset = PyLong_Check(offobj) ?
                PyLong_AsLongLong(offobj) : PyInt_AsLong(offobj);
#endif
so my (corrected) statement about seeking to >2**31
should work. Though I'm not sure now that sys.maxint
is the proper test, since it will be 2**63 on a
64-bit machine. Hardcoding the value should work.
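That hardcoded test might look like this -- my sketch, not the script
from the earlier post, and it assumes a filesystem that allows sparse
files so the seek doesn't actually allocate 2 GB:

```python
import tempfile

# Seek past the 32-bit boundary using a hardcoded 2**31, since
# sys.maxint (Python 2) is 2**63 on a 64-bit build and so is not
# a reliable marker of the old limit.
boundary = 2 ** 31

with tempfile.TemporaryFile() as f:
    f.seek(boundary + 10)
    pos = f.tell()          # past 2**31 if large files work
    f.write(b"x")           # creates a sparse file on most filesystems
    f.seek(boundary + 10)
    byte_back = f.read(1)   # read back the byte we wrote
```

If the build lacked large file support, the seek itself would raise
an error rather than silently wrap.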
> I did do the normal incantation with the CFLAGS when compiling python
> for LFS on crcdocs.
You're beyond my knowledge there. I thought that
Python did the check automatically and didn't need
the CFLAGS= ...
I compile from CVS source without special commands
and it Just Works.
dalke at dalkescientific.com