[New-bugs-announce] [issue5011] issue4428 - make io.BufferedWriter observe max_buffer_size limits
report at bugs.python.org
Tue Jan 20 12:12:51 CET 2009
New submission from Antoine Pitrou <pitrou at free.fr>:
File Lib/io.py (right):
Line 1055: # b is an iterable of ints, it won't always support len().
There is no reason for write() to accept an arbitrary iterable of ints;
it should only accept bytes-like and buffer-like objects. That will also
make the code simpler.
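A minimal sketch of what this restriction could look like (the helper name is mine, not from io.py): wrapping the argument in a memoryview enforces the buffer protocol up front and guarantees that len() is available, removing the need to special-case iterables.

```python
# Hypothetical sketch, not the actual io.py code: require the buffer
# protocol instead of accepting arbitrary iterables of ints.
def as_buffer(b):
    """Return a memoryview over b, rejecting non-buffer objects."""
    try:
        return memoryview(b)  # fails for e.g. a generator of ints
    except TypeError:
        raise TypeError("a bytes-like object is required, not %s"
                        % type(b).__name__) from None

mv = as_buffer(b"abc")
print(len(mv))  # len() always works on a memoryview
```

With this check, the "iterable of ints, won't always support len()" branch disappears entirely.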
Line 1060: # No buffer API? Make intermediate slice copies instead.
Objects without the buffer API shouldn't be supported at all.
Line 1066: while chunk and len(self._write_buf) > self.buffer_size:
What if buffer_size == max_buffer_size? Is everything still written correctly?
Line 1070: written += e.characters_written
e.characters_written can include bytes which were already part of the
buffer before write() was called, but the newly raised BlockingIOError
should only count those bytes which were part of the object passed to
write().
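The requested accounting can be sketched as follows (a hypothetical helper, not code from the patch): subtract the bytes that were already buffered before this write() call, clamping at zero in case only old data made it out.

```python
# Hypothetical sketch of the characters_written adjustment being
# requested in this review comment.
def new_bytes_written(flushed_total, preexisting):
    """Of flushed_total bytes flushed when BlockingIOError was raised,
    only those beyond the preexisting buffered bytes belong to the
    object passed to this write() call."""
    return max(0, flushed_total - preexisting)

print(new_bytes_written(10, 7))  # 3 bytes of the caller's data went out
print(new_bytes_written(5, 7))   # 0: only previously buffered data went out
```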
File Lib/test/test_io.py (right):
Line 496: def testWriteNoLengthIterable(self):
This shouldn't work at all. If it works right now, it is only a
side-effect of the implementation.
(it won't work with FileIO, for example)
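The FileIO point is easy to demonstrate: the raw layer requires the buffer protocol, so an iterable of ints is rejected there, which is why accepting it in BufferedWriter is only an accident of the pure-Python implementation. A small illustration (using a temp file of my own choosing, not from the test suite):

```python
import io
import os
import tempfile

# Raw FileIO rejects objects without the buffer API.
fd, path = tempfile.mkstemp()
try:
    with io.FileIO(fd, "w") as f:
        try:
            f.write(i for i in range(3))  # generator of ints
        except TypeError:
            print("FileIO rejects non-buffer iterables")
        f.write(b"\x00\x01\x02")          # bytes-like objects work
finally:
    os.unlink(path)
```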
nosy: gregory.p.smith, pitrou
title: issue4428 - make io.BufferedWriter observe max_buffer_size limits
Python tracker <report at bugs.python.org>