[issue12486] tokenize module should have a unicode API

Serhiy Storchaka report at bugs.python.org
Fri May 18 03:59:47 EDT 2018


Serhiy Storchaka <storchaka+cpython at gmail.com> added the comment:

My concern is that we will have two functions with dissimilar names (tokenize() and generate_tokens()) that do virtually the same thing but accept different types of input (bytes or str), and a single function, untokenize(), that produces a different type of result depending on its input. This doesn't look like a good design to me.
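The asymmetry described above can be seen directly in the stdlib as it stands: a minimal sketch showing that tokenize() wants a bytes readline, generate_tokens() wants a str readline, and untokenize() returns bytes or str depending on which kind of token stream it is given (the bytes stream carries an ENCODING token):

```python
import io
import tokenize

# tokenize() consumes a readline that yields *bytes*;
# its first token is an ENCODING token.
byte_toks = list(tokenize.tokenize(io.BytesIO(b"x = 1\n").readline))

# generate_tokens() consumes a readline that yields *str*.
str_toks = list(tokenize.generate_tokens(io.StringIO("x = 1\n").readline))

# untokenize() silently changes its return type based on the input:
print(type(tokenize.untokenize(byte_toks)))  # <class 'bytes'>
print(type(tokenize.untokenize(str_toks)))   # <class 'str'>
```

This is the behavior the comment objects to: two differently named entry points for the same operation, and one exit point whose result type is data-dependent.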

----------

_______________________________________
Python tracker <report at bugs.python.org>
<https://bugs.python.org/issue12486>
_______________________________________
