[issue719888] tokenize module w/ coding cookie
report at bugs.python.org
Tue Mar 18 19:01:23 CET 2008
Mark Dickinson <dickinsm at gmail.com> added the comment:
Is it worth keeping generate_tokens as an alias for tokenize, just
to avoid gratuitous 2-to-3 breakage? Maybe not---I guess they're
different beasts, in that one wants a string-valued readline and the
other wants a bytes-valued readline.
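To illustrate the split (a sketch against the Python 3 tokenize module: tokenize.tokenize takes a bytes-valued readline and emits an ENCODING token from the coding cookie, while generate_tokens takes a string-valued readline):

```python
import io
import tokenize

# tokenize.tokenize wants a *bytes*-valued readline, because it must
# see the raw bytes to honour a PEP 263 coding cookie.
source = b"# -*- coding: utf-8 -*-\nx = 1\n"
byte_tokens = list(tokenize.tokenize(io.BytesIO(source).readline))

# generate_tokens wants a *string*-valued readline; decoding has
# already happened, so there is no cookie left to detect.
str_tokens = list(tokenize.generate_tokens(io.StringIO("x = 1\n").readline))

# The bytes-based stream begins with an ENCODING token naming the
# detected encoding; the string-based stream starts with the code itself.
print(byte_tokens[0].string)   # the detected encoding
print(str_tokens[0].string)    # 'x'
```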
So if I understand correctly, the readline argument to tokenize
would have to return bytes instances. Would it be worth adding a check
for this, to catch possible misuse? You could put the check in
detect_encoding, so that it just checks that the first one or two yields
from readline have the correct type, and assumes the rest are okay.
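The proposed check could be sketched as a wrapper around readline (checked_readline is a hypothetical helper, not part of the tokenize module; it type-checks only the first two yields, as suggested):

```python
import io
import tokenize

def checked_readline(readline):
    """Hypothetical wrapper sketching the proposed check: verify that
    the first one or two yields are bytes, then trust the rest."""
    seen = 0
    def wrapper():
        nonlocal seen
        line = readline()
        if seen < 2:
            seen += 1
            if not isinstance(line, bytes):
                raise TypeError(
                    "readline must return bytes, not %s"
                    % type(line).__name__)
        return line
    return wrapper

source = b"# -*- coding: utf-8 -*-\nx = 1\n"
encoding, lines = tokenize.detect_encoding(
    checked_readline(io.BytesIO(source).readline))
print(encoding)  # encoding named by the cookie

# Misuse is caught on the very first yield instead of failing deeper
# inside detect_encoding:
try:
    tokenize.detect_encoding(
        checked_readline(io.StringIO("x = 1\n").readline))
except TypeError as exc:
    print(exc)  # readline must return bytes, not str
```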