[Python-bugs-list] [ python-Bugs-633560 ] tokenize.__all__ needs "generate_tokens"

noreply@sourceforge.net
Mon, 04 Nov 2002 16:29:07 -0800


Bugs item #633560, was opened at 2002-11-04 18:29
You can respond by visiting: 
https://sourceforge.net/tracker/?func=detail&atid=105470&aid=633560&group_id=5470

Category: Python Library
Group: Python 2.2.2
Status: Open
Resolution: None
Priority: 5
Submitted By: Patrick K. O'Brien (pobrien)
Assigned to: Nobody/Anonymous (nobody)
Summary: tokenize.__all__ needs "generate_tokens"

Initial Comment:
The tokenize module needs to have its __all__ attribute
updated, like so:
  
__all__ = [x for x in dir(token) if x[0] != '_'] + \
          ["COMMENT", "tokenize", "NL", "generate_tokens"]
  
The current version is missing "generate_tokens":  
  
__all__ = [x for x in dir(token) if x[0] != '_'] + \
          ["COMMENT", "tokenize", "NL"]
 
While "tokenize" is described as an "older entry point", it 
appears to still be useful and is certainly used when 
tokenize is run from the command line. So I'm not sure 
what the intent is for describing it in those terms, since it 
doesn't appear that the intent is to deprecate the tokenize 
function. 
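
For reference, a rough sketch of how the two entry points differ in
use (written against the 2.2-era API; the file-like source and the
printing are just illustration):

    import tokenize
    from StringIO import StringIO

    src = "x = 1 + 2\n"

    # Generator-style entry point -- the one missing from __all__:
    for tok in tokenize.generate_tokens(StringIO(src).readline):
        tok_type, tok_string = tok[0], tok[1]
        print tok_type, repr(tok_string)

    # Older callback-style entry point, still used by the command line driver:
    def eater(tok_type, tok_string, start, end, line):
        print tok_type, repr(tok_string)

    tokenize.tokenize(StringIO(src).readline, eater)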
 
Pat (looking for love in __all__ the wrong places) 

----------------------------------------------------------------------
