[New-bugs-announce] [issue22526] file iteration crashes for huge lines (2GiB+)
Jakub Mateusz Kowalski
report at bugs.python.org
Tue Sep 30 18:10:51 CEST 2014
New submission from Jakub Mateusz Kowalski:
The file /tmp/2147483648zeros contains 2^31 (2 GiB) zero bytes ('\0').
The readline method works fine:
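The report does not say how the test file was made; one way to reproduce it without writing 2 GiB of data is a sparse file, whose unwritten gap reads back as zero bytes (a sketch, the path and method are from this report's example, not a prescribed setup):

```python
import os

# Create a sparse file that reads back as 2**31 zero bytes ('\0').
# Seeking past the end and writing one byte extends the file;
# the unwritten gap is returned as zeros on read.
with open('/tmp/2147483648zeros', 'wb') as fh:
    fh.seek(2**31 - 1)
    fh.write(b'\0')

print(os.path.getsize('/tmp/2147483648zeros'))  # 2147483648
```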
>>> fh = open('/tmp/2147483648zeros', 'rb')
>>> line = fh.readline()
>>> len(line)
2147483648
However when I try to iterate over the file:
>>> fh = open('/tmp/2147483648zeros', 'rb')
>>> for line in fh:
...     print len(line)
---------------------------------------------------------------------------
SystemError                               Traceback (most recent call last)
/home/jkowalski/<ipython-input-55-aaa9ddb42aea> in <module>()
----> 1 for line in fh:
      2     print len(line)
      3
SystemError: Negative size passed to PyString_FromStringAndSize
The same happens for larger files (the issue was discovered with a 2243973120-byte file).
For a shorter file, iteration works as expected.
The file /tmp/2147483647zeros contains 2^31 - 1 (< 2 GiB) zero bytes.
>>> fh = open('/tmp/2147483647zeros', 'rb')
>>> for line in fh:
...     print len(line)
2147483647
I guess the variable used for the size is of a signed 32-bit type.
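The error message is consistent with that guess: a length of 2^31 does not fit in a signed 32-bit integer and wraps around to a negative value, which would then be passed to PyString_FromStringAndSize. A minimal sketch of the suspected wraparound, using ctypes to emulate a C 32-bit int (illustrative only, not the actual CPython code path):

```python
import ctypes

# 2**31 overflows a signed 32-bit int and wraps to a negative value,
# matching "Negative size passed to PyString_FromStringAndSize".
print(ctypes.c_int32(2**31).value)      # -2147483648

# 2**31 - 1 is the largest value a signed 32-bit int can hold,
# which is why the one-byte-shorter file iterates without error.
print(ctypes.c_int32(2**31 - 1).value)  # 2147483647
```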
I am using Python 2.7.3 (default, Feb 27 2014, 19:58:35) with IPython 0.12.1 on Ubuntu 12.04.5 LTS.
----------
components: IO
messages: 227949
nosy: Jakub.Mateusz.Kowalski
priority: normal
severity: normal
status: open
title: file iteration crashes for huge lines (2GiB+)
type: crash
versions: Python 2.7
_______________________________________
Python tracker <report at bugs.python.org>
<http://bugs.python.org/issue22526>
_______________________________________