[Python-Dev] "".tokenize() ?

Greg Ward gward@python.net
Fri, 4 May 2001 14:15:51 -0400


On 04 May 2001, M.-A. Lemburg said:
> Gustavo Niemeyer submitted a patch which adds a tokenize-like
> method to strings and Unicode:
> 
> "one, two and three".tokenize([",", "and"])
> -> ["one", " two ", "three"]
> 
> I like this method -- should I review the code and then check it in?

I concur with /F: -1 because you can do it easily with re.split().
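For reference, here is a rough sketch of the re.split() approach I have
in mind -- the helper name and the exact pattern construction are my own
illustration, not anything from Gustavo's patch:

    import re

    def tokenize(s, separators):
        # Build an alternation pattern from the literal separator
        # strings and split on any of them.
        pattern = "|".join([re.escape(sep) for sep in separators])
        return re.split(pattern, s)

    print tokenize("one, two and three", [",", "and"])
    # -> ['one', ' two ', ' three']

Three lines of stdlib usage; hardly worth a new string method.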

        Greg
-- 
Greg Ward - Unix bigot                                  gward@python.net
http://starship.python.net/~gward/
I hope something GOOD came in the mail today so I have a REASON to live!!