[Python-Dev] Using defaultdict as globals/locals for eval()
Geert Jansen
geertj@boskant.nl
Fri, 25 Oct 2002 18:26:51 +0200
Dear Python developers,
I have a question about `dict' subclasses and eval() that I would like
to ask.
The problem I have is the following: for a web content system in Python, I
am using Python's eval() function to run code embedded in templates.
Currently, the code fails with a NameError if it references a variable
that is not defined in the locals or globals dictionary. Unfortunately,
this is a rather common situation: one often uses variables (like error
codes) that aren't set all the time. Initializing these variables by hand
is tedious, so a default value would suit me nicely.
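Roughly, the failing case looks like this (the expression and the
variable names are just made up for illustration):

    template_expr = "'Error %d: %s' % (error_code, error_message)"

    context = {"error_message": "file not found"}  # error_code never set

    # eval() raises NameError because error_code is missing:
    result = eval(template_expr, context)
    # NameError: name 'error_code' is not defined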
I created a derived class of the standard `dict' that fills in a default
value when a key is not found. This is exactly the same as Guido describes
in his "descrintro" paper. I tried to use this dictionary as the "globals"
parameter of eval(). As Guido already describes in his paper, this doesn't
work. Quoting Guido:
"The interpreter uses an internal function to access the dictionary, which
bypasses our __getitem__() override. I admit that this can be a problem
(although it is only a problem in this context, when a dict subclass is used
as a locals/globals dictionary); it remains to be seen if I can fix this
without compromising performance in the common case."
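For reference, the subclass is essentially the defaultdict from
descrintro, and this is roughly how I tried to use it (the default value
and the variable names are just for illustration):

    class defaultdict(dict):
        def __init__(self, default=None):
            dict.__init__(self)
            self.default = default

        def __getitem__(self, key):
            # Fall back to a default instead of raising KeyError.
            try:
                return dict.__getitem__(self, key)
            except KeyError:
                return self.default

    d = defaultdict("")
    d["title"] = "hello"
    d["missing"]                # -> "" via the override

    # But used as the globals of eval(), the override is bypassed and an
    # undefined name still raises NameError:
    eval("title + missing", d)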
Is there a solution to this problem in sight? Or alternatively, is there a
way I can find out which variables are used inside a compiled code block,
so that I can initialize the unspecified ones myself? I have a vague memory
that the nested scopes feature has to determine at compile time which
variables are used in a code block.
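Something along these lines is what I had in mind for the second option,
assuming the compiled code object's co_names attribute lists the names the
expression references (attribute names end up in co_names too, so this is
only a sketch):

    code = compile("error_code + 1", "<template>", "eval")

    namespace = {}
    # Pre-initialize every referenced name that the template system
    # did not set:
    for name in code.co_names:
        namespace.setdefault(name, 0)

    print(eval(code, namespace))   # prints 1 instead of raising NameError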
Thanks for your time!
Greetings,
Geert Jansen