Which one to use: generate_tokens or tokenize?
andre.roberge at ns.sympatico.ca
Fri Sep 10 01:32:14 CEST 2004
According to the Python documentation:
18.5 tokenize -- Tokenizer for Python source
The primary entry point is a generator: generate_tokens(readline)
An older entry point is retained for backward compatibility: tokenize(readline[, tokeneater])
Does this mean that generate_tokens should be preferred? If so,
what are the advantages?
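For context, here is a minimal sketch of calling the generator entry point. It is written for a modern Python 3 tokenize module (where generate_tokens takes a readline callable returning str), but the generator-based interface works the same way as the one described in the documentation quoted above: tokens are produced lazily as 5-tuples of (type, string, start, end, line), instead of being pushed into a tokeneater callback.

```python
import io
import tokenize

source = "x = 1 + 2\n"

# generate_tokens is a generator: it yields one token at a time,
# so you can iterate, filter, or stop early without a callback.
tokens = list(tokenize.generate_tokens(io.StringIO(source).readline))

for tok_type, tok_string, start, end, line in tokens:
    print(tokenize.tok_name[tok_type], repr(tok_string))
```

The older tokenize(readline, tokeneater) style instead required you to supply a function that was called once per token, which made it harder to consume tokens incrementally from ordinary loop code.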