[Python-checkins] python/dist/src/Doc/lib libtokenize.tex,1.5,1.6

rhettinger at users.sourceforge.net
Fri Jun 10 13:05:20 CEST 2005


Update of /cvsroot/python/python/dist/src/Doc/lib
In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv2829/Doc/lib

Modified Files:
	libtokenize.tex 
Log Message:
Add untokenize() function to allow full round-trip tokenization.

Should significantly enhance the utility of the module by supporting
the creation of tools that modify the token stream and write back the
modified result.



Index: libtokenize.tex
===================================================================
RCS file: /cvsroot/python/python/dist/src/Doc/lib/libtokenize.tex,v
retrieving revision 1.5
retrieving revision 1.6
diff -u -d -r1.5 -r1.6
--- libtokenize.tex	29 Jun 2001 23:51:07 -0000	1.5
+++ libtokenize.tex	10 Jun 2005 11:05:18 -0000	1.6
@@ -45,6 +45,9 @@
   provides the same interface as the \method{readline()} method of
   built-in file objects (see section~\ref{bltin-file-objects}).  Each
   call to the function should return one line of input as a string.
+  Alternatively, \var{readline} may be a callable object that signals
+  completion by raising \exception{StopIteration}.
+  \versionchanged[Added StopIteration support]{2.5}
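+
+  For example, the \method{next()} method of an iterator over the
+  source lines is such a callable.  A minimal sketch (the
+  \function{show} helper merely stands in for the \var{tokeneater}
+  argument described below):
+
+\begin{verbatim}
+from tokenize import tokenize
+
+def show(toknum, tokval, start, end, line):
+    # throwaway tokeneater for illustration: print each token
+    print toknum, repr(tokval)
+
+lines = iter(['x = 1\n', 'print x\n'])
+tokenize(lines.next, show)   # lines.next raises StopIteration when done
+\end{verbatim}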
 
   The second parameter, \var{tokeneater}, must also be a callable
   object.  It is called once for each token, with five arguments,
@@ -65,3 +68,52 @@
   are generated when a logical line of code is continued over multiple
   physical lines.
 \end{datadesc}
+
+Another function is provided to reverse the tokenization process.
+This is useful for creating tools that tokenize a script, modify
+the token stream, and write back the modified script.
+
+\begin{funcdesc}{untokenize}{iterable}
+  Converts tokens back into Python source code.  The \var{iterable}
+  must return sequences with at least two elements: the token type and
+  the token string.  Any additional sequence elements are ignored.
+
+  The reconstructed script is returned as a single string.  The
+  result is guaranteed to tokenize back to match the input, so the
+  conversion is lossless and round-trips are assured.  The guarantee
+  applies only to the token type and token string; the spacing between
+  tokens (column positions) may change.
+  \versionadded{2.5}
+\end{funcdesc}
+
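+The round-trip guarantee can be demonstrated with a minimal sketch:
+only token types and strings are compared, since the spacing of the
+reconstructed script may differ from the original.
+
+\begin{verbatim}
+from StringIO import StringIO
+from tokenize import generate_tokens, untokenize
+
+s = 'while x < 10 :\n    x = x + 1\n'
+pairs = [(toknum, tokval)
+         for toknum, tokval, _, _, _
+         in generate_tokens(StringIO(s).readline)]
+t = untokenize(pairs)   # spacing may differ from s
+assert pairs == [(toknum, tokval)
+                 for toknum, tokval, _, _, _
+                 in generate_tokens(StringIO(t).readline)]
+\end{verbatim}
+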
+Example of a script re-writer that transforms float literals into
+Decimal objects:
+\begin{verbatim}
+from StringIO import StringIO
+from tokenize import generate_tokens, untokenize, NUMBER, NAME, OP, STRING
+
+def decistmt(s):
+    """Substitute Decimals for floats in a string of statements.
+
+    >>> from decimal import Decimal
+    >>> s = 'print +21.3e-5*-.1234/81.7'
+    >>> decistmt(s)
+    "print +Decimal ('21.3e-5')*-Decimal ('.1234')/Decimal ('81.7')"
+
+    >>> exec(s)
+    -3.21716034272e-007
+    >>> exec(decistmt(s))
+    -3.217160342717258261933904529E-7
+
+    """
+    result = []
+    g = generate_tokens(StringIO(s).readline)   # tokenize the string
+    for toknum, tokval, _, _, _ in g:
+        if toknum == NUMBER and '.' in tokval:  # replace float literals
+            result.extend([
+                (NAME, 'Decimal'),
+                (OP, '('),
+                (STRING, repr(tokval)),
+                (OP, ')')
+            ])
+        else:
+            result.append((toknum, tokval))
+    return untokenize(result)
+\end{verbatim}


