[Python-bugs-list] [ python-Bugs-633560 ] tokenize.__all__ needs "generate_tokens"
noreply@sourceforge.net
Mon, 04 Nov 2002 22:09:34 -0800
Bugs item #633560, was opened at 2002-11-04 19:29
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=105470&aid=633560&group_id=5470
Category: Python Library
Group: Python 2.2.2
>Status: Closed
>Resolution: Fixed
Priority: 5
Submitted By: Patrick K. O'Brien (pobrien)
>Assigned to: Raymond Hettinger (rhettinger)
>Summary: tokenize.__all__ needs "generate_tokens"
Initial Comment:
The tokenize module needs to have its __all__ attribute
updated, like so:
__all__ = [x for x in dir(token) if x[0] != '_'] + \
          ["COMMENT", "tokenize", "NL", "generate_tokens"]
The current version is missing "generate_tokens":
__all__ = [x for x in dir(token) if x[0] != '_'] + \
          ["COMMENT", "tokenize", "NL"]
While "tokenize" is described as an "older entry point", it
still appears to be useful and is certainly used when the
tokenize module is run from the command line. So I'm not
sure why it is described in those terms, since the intent
doesn't appear to be to deprecate the tokenize function.
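For context, a name's presence in __all__ determines whether a star
import (from tokenize import *) exposes it, which is why the omission
matters. A minimal sketch of using generate_tokens, which takes a
readline callable and yields 5-tuples of
(type, string, start, end, line):

```python
import io
import tokenize

# Tokenize a one-line program; generate_tokens wants a readline
# callable, so we wrap the source string in a StringIO.
source = "x = 1\n"
tokens = list(tokenize.generate_tokens(io.StringIO(source).readline))

# Collect just the NAME tokens; "x" is the only identifier here.
names = [tok[1] for tok in tokens if tok[0] == tokenize.NAME]
print(names)  # ['x']
```

Without "generate_tokens" in __all__, the function is still reachable
as tokenize.generate_tokens, but a star import would silently omit it.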
Pat (looking for love in __all__ the wrong places)
----------------------------------------------------------------------
>Comment By: Raymond Hettinger (rhettinger)
Date: 2002-11-05 01:09
Message:
Logged In: YES
user_id=80475
Fixed. See tokenize 1.34 and 1.28.14.3
----------------------------------------------------------------------