[Python-Dev] "".tokenize() ?
Guido van Rossum
guido@digicool.com
Fri, 04 May 2001 14:38:06 -0400
> On 04 May 2001, M.-A. Lemburg said:
> > Gustavo Niemeyer submitted a patch that adds a tokenize-like
> > method to strings and Unicode:
> >
> > "one, two and three".tokenize([",", "and"])
> > -> ["one", " two ", "three"]
> >
> > I like this method -- should I review the code and then check it in?
>
> I concur with /F: -1 because you can do it easily with re.split().
-1 also.
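
For reference, a minimal sketch of the re.split() approach (the
tokenize() wrapper name is only illustrative, not the patched method,
and its whitespace handling may differ slightly from the proposal):

    import re

    def tokenize(s, delimiters):
        # Escape each delimiter and join them into one alternation
        # pattern, then split the string on any of them.
        pattern = "|".join(map(re.escape, delimiters))
        return re.split(pattern, s)

    print(tokenize("one, two and three", [",", "and"]))
    # -> ['one', ' two ', ' three']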
--Guido van Rossum (home page: http://www.python.org/~guido/)