IndentationError: too many levels of indentation

Alex Martelli aleax at
Fri Mar 7 10:16:18 CET 2003

Erik Max Francis wrote:
> It would indeed appear there's some sort of limit.  I did the following:
> IndentationError: too many levels of indentation
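For the record, the limit is easy to reproduce without typing 100 nested blocks by hand.  Here's a quick sketch (my own demo, not Erik's original test) that builds 200 nested "if 1:" blocks as a string and tries to compile them:

```python
# Build a source string with 200 nested "if 1:" blocks; compiling it
# should trip the tokenizer's indentation limit.
depth = 200
lines = ["    " * d + "if 1:" for d in range(depth)]
lines.append("    " * depth + "pass")
src = "\n".join(lines) + "\n"

try:
    compile(src, "<deep>", "exec")
    print("compiled fine")
except IndentationError as e:
    print("IndentationError:", e)
```

The same construction with, say, 50 levels compiles without complaint, so the failure really is the depth, not the generated source itself.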

Yes, Parser/tokenizer.h in the Python 2.2.2 sources, for example,
is quite clear about it (grepping in the file...):

#define MAXINDENT 100      /* Max indentation level */
   int indstack[MAXINDENT];        /* Stack of indents */
   int altindstack[MAXINDENT];     /* Stack of alternate indents */

and Parser/tokenizer.c checks for over-indentation:

   if (tok->indent+1 >= MAXINDENT) {

giving an E_TOODEEP error that gets turned into the above exception.
2.3a2 seems to be identical in this respect.

I suspect that editing tokenizer.h to use a MAXINDENT of 1000, or
whatever, would bump up the limit accordingly (at a small memory
cost), though I have not experimented with it.  Removing the limit
altogether would seem to require quite a different approach to
tokenization than the one currently used; alternatively, it might be
achieved by turning those fixed-size arrays into dynamically resized
ones, if somebody is keen enough on supporting code generators to
offer patches for the purpose...
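Meanwhile, a code generator that risks hitting the limit can often sidestep it by emitting flat code rather than nested code.  A hypothetical sketch (the generator and all its names are mine, purely for illustration): instead of emitting one nested "for" per dimension, which needs one indentation level per dimension, emit a single loop over itertools.product, so indentation depth stays constant no matter how many dimensions there are:

```python
def gen_flat_loops(n_dims, size):
    # Emit one flat loop over the cartesian product instead of
    # n_dims nested "for" statements (which would need n_dims
    # indentation levels and blow past MAXINDENT for large n_dims).
    ranges = ", ".join("range(%d)" % size for _ in range(n_dims))
    return ("import itertools\n"
            "total = 0\n"
            "for idx in itertools.product(%s):\n"
            "    total += 1\n") % ranges

# 150 nested loops would exceed the indentation limit; the flat
# equivalent compiles and runs fine.
src = gen_flat_loops(150, 1)
namespace = {}
exec(compile(src, "<flat>", "exec"), namespace)
print(namespace["total"])   # one iteration: 150 ranges of size 1
```

The same trick (accumulating conditions with "and", dispatching through a table, factoring inner blocks out into functions) applies to generated "if" chains as well.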
