[issue5011] issue4428 - make io.BufferedWriter observe max_buffer_size limits

Antoine Pitrou report at bugs.python.org
Tue Jan 20 12:12:51 CET 2009


New submission from Antoine Pitrou <pitrou at free.fr>:

http://codereview.appspot.com/12470/diff/1/2
File Lib/io.py (right):

http://codereview.appspot.com/12470/diff/1/2#newcode1055
Line 1055: # b is an iterable of ints, it won't always support len().
There is no reason for write() to accept an arbitrary iterable of ints;
it should only accept bytes-like and buffer-like objects. Restricting it
will make the code simpler.

http://codereview.appspot.com/12470/diff/1/2#newcode1060
Line 1060: # No buffer API?  Make intermediate slice copies instead.
Objects without the buffer API shouldn't be supported at all.
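
A minimal sketch (not the patch itself) of the restriction asked for in
the two comments above: push the argument through the buffer API and
reject everything else, so write() never needs to special-case iterables
of ints. The helper name is illustrative.

    def _coerce_to_buffer(b):
        # Anything exposing the buffer API works here (bytes, bytearray,
        # array.array, mmap, ...); plain iterables of ints do not.
        try:
            return memoryview(b)
        except TypeError:
            raise TypeError("a bytes-like object is required, not %r"
                            % type(b).__name__)

A memoryview supports len() and zero-copy slicing, which is all the
buffered writer needs for partial writes.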

http://codereview.appspot.com/12470/diff/1/2#newcode1066
Line 1066: while chunk and len(self._write_buf) > self.buffer_size:
What if buffer_size == max_buffer_size? Is everything still written ok?
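
A test-style sketch of the sort of boundary check this question suggests:
a hypothetical raw stream that accepts only a couple of bytes per call, so
the drain loop has to keep making progress until everything is written.
TrickleRawIO and the sizes are made up for illustration.

    import io

    class TrickleRawIO(io.RawIOBase):
        """Raw stream that accepts at most 2 bytes per write() call."""
        def __init__(self):
            self.data = bytearray()
        def writable(self):
            return True
        def write(self, b):
            chunk = bytes(b)[:2]
            self.data += chunk
            return len(chunk)

    raw = TrickleRawIO()
    w = io.BufferedWriter(raw, buffer_size=8)
    w.write(b"abcdefghij")   # more than buffer_size, forces repeated flushes
    w.flush()
    assert bytes(raw.data) == b"abcdefghij"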

http://codereview.appspot.com/12470/diff/1/2#newcode1070
Line 1070: written += e.characters_written
e.characters_written can include bytes that were already in the buffer
before write() was called, but the re-raised BlockingIOError should only
count the bytes that came from the object passed to write().
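
A hedged sketch of the accounting being asked for: before re-raising,
subtract whatever was already sitting in the buffer when write() was
entered, so characters_written reflects only bytes taken from the current
call. The helper and its previously_buffered argument are illustrative,
not taken from the patch (BlockingIOError is used here as the builtin it
later became; at the time it lived in the io module).

    import errno

    def adjusted_blocking_error(exc, previously_buffered):
        # Count only bytes that came from the object passed to write().
        written = max(exc.characters_written - previously_buffered, 0)
        return BlockingIOError(exc.errno, exc.strerror, written)

    # e.g. 5 bytes were flushed in total, but 3 of them predated this call:
    e = BlockingIOError(errno.EAGAIN, "write would block", 5)
    assert adjusted_blocking_error(e, 3).characters_written == 2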

http://codereview.appspot.com/12470/diff/1/3
File Lib/test/test_io.py (right):

http://codereview.appspot.com/12470/diff/1/3#newcode496
Line 496: def testWriteNoLengthIterable(self):
This shouldn't work at all. If it works right now, that is only a
side effect of the implementation.
(It won't work with FileIO, for example.)
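
For reference, a hedged sketch of the call the test exercises and of the
behaviour argued for here; the BytesIO target is just a stand-in for the
test's mock raw object.

    import io

    w = io.BufferedWriter(io.BytesIO())
    try:
        w.write(i for i in b"abc")   # an iterable of ints with no len()
    except TypeError:
        print("rejected up front, as FileIO already does")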

http://codereview.appspot.com/12470

----------
messages: 80242
nosy: gregory.p.smith, pitrou
severity: normal
status: open
title: issue4428 - make io.BufferedWriter observe max_buffer_size limits

_______________________________________
Python tracker <report at bugs.python.org>
<http://bugs.python.org/issue5011>
_______________________________________

