Tokenizer for python?
rcdailey at gmail.com
Tue Aug 21 23:05:50 CEST 2007
I am looking for a sort of "tokenizer" for Python. I've taken a look at the
tokenize module, but that seems to parse Python source code, from what I read. I
want a tokenizer that works a little like boost::tokenizer, but for
Python. Basically I want to be able to pass in an arbitrary string (or line
from readline()) and specify the delimiter characters that cause the string to
be separated into parts, much like the regular expression split() function.
Is there anything that already exists that does this, or do
I need to implement it myself with regular expressions?
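For what it's worth, a minimal sketch of the re.split() approach described above might look like this (the function name and the default separator set are just illustrative assumptions, not an existing library API):

```python
import re

def tokenize(text, separators=",; \t"):
    # Split on any run of the given separator characters,
    # roughly like boost::tokenizer with a char_separator.
    # re.escape makes the characters safe inside a character class.
    pattern = "[" + re.escape(separators) + "]+"
    # Drop empty strings that appear when the text starts or
    # ends with a separator.
    return [tok for tok in re.split(pattern, text) if tok]

print(tokenize("alpha, beta;gamma  delta"))
# ['alpha', 'beta', 'gamma', 'delta']
```

For whitespace-only splitting, the built-in str.split() with no arguments already does the same job without regular expressions.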