[Python-checkins] cpython (3.2): Issue #2134: Clarify token.OP handling rationale in tokenize documentation.

meador.inge python-checkins at python.org
Thu Jan 19 07:46:55 CET 2012


http://hg.python.org/cpython/rev/dfd74d752b0e
changeset:   74517:dfd74d752b0e
branch:      3.2
parent:      74511:46b245f03f54
user:        Meador Inge <meadori at gmail.com>
date:        Thu Jan 19 00:22:22 2012 -0600
summary:
  Issue #2134: Clarify token.OP handling rationale in tokenize documentation.

files:
  Doc/library/tokenize.rst |  6 ++++++
  Misc/NEWS                |  3 +++
  2 files changed, 9 insertions(+), 0 deletions(-)


diff --git a/Doc/library/tokenize.rst b/Doc/library/tokenize.rst
--- a/Doc/library/tokenize.rst
+++ b/Doc/library/tokenize.rst
@@ -15,6 +15,12 @@
 as well, making it useful for implementing "pretty-printers," including
 colorizers for on-screen displays.
 
+To simplify token stream handling, all :ref:`operator <operators>` and
+:ref:`delimiter <delimiters>` tokens are returned using the generic
+:data:`token.OP` token type.  The exact type can be determined by checking
+the ``string`` field of the :term:`named tuple` returned from
+:func:`tokenize.tokenize` for the character sequence identifying the token.
+
 The primary entry point is a :term:`generator`:
 
 .. function:: tokenize(readline)
diff --git a/Misc/NEWS b/Misc/NEWS
--- a/Misc/NEWS
+++ b/Misc/NEWS
@@ -418,6 +418,9 @@
 Documentation
 -------------
 
+- Issue #2134: The tokenize documentation has been clarified to explain why
+  all operator and delimiter tokens are treated as token.OP tokens.
+
 - Issue #13513: Fix io.IOBase documentation to correctly link to the
   io.IOBase.readline method instead of the readline module.
 

-- 
Repository URL: http://hg.python.org/cpython
