Genetic programming: pygene, pygp, AST, or (gasp) Lisp?

John Ladasky ladasky at my-deja.com
Sun Jul 20 03:52:13 EDT 2008


Hi folks,

I've played around with neural nets for a while.  I wrote my own slow,
pure-Python NN package.  I knew that there were Python NN packages out
there -- but I couldn't really understand their features and
documentation at first, not without some hands-on experience.

I haven't yet solved any interesting problems with NN, but I learned a
lot about both NN and about Python along the way.  One of the
unpleasant things I learned about NN was their propensity for becoming
trapped in local minima.  I also learned about overfitting, wasting
many hours of CPU time in the process.  In short, simple neural nets
have disappointed me.

I have now asked myself: wouldn't it be cool if, when an optimization
algorithm stops making progress, you could divide your data set with an
"if" statement -- one that evolves a useful dividing line through the
data -- and then develop divergent solutions to explain each subset
better?

In mathematical terms:

Round 1 of evolutionary programming leads to a single but sub-optimal
solution, probably one stuck in a local minimum:

    b = f(a)

Round 2 would start by elaborating the solution of round 1, adding a
condition that is initially meaningless:

    if disc(a) < c:
        b = f1(a)
    else:
        b = f2(a)

Initially, f1() = f2() = the original f().  Therefore, the output of
the starting state of round 2 will be identical to the output of the
final state of round 1, regardless of the initial state of the
discriminant function, disc().  But then, f1(), f2(), and disc() will
all be permitted to evolve.  This may provide a chance to "pop out" of
the local minimum, without sacrificing any of the progress made in
defining the solution of round 1.
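
Here is a rough Python sketch of that round-2 construction -- the names
are mine, not from any GP package, and in a real GP system f() would be
a parse tree whose two branch copies could then mutate independently:

    def make_split(f, disc, c):
        """Wrap the round-1 solution f in a conditional on disc(a) < c."""
        f1 = f   # branch taken when disc(a) < c; later free to evolve
        f2 = f   # branch taken otherwise; later free to evolve
        def split_f(a):
            if disc(a) < c:
                return f1(a)
            else:
                return f2(a)
        return split_f

    # Because both branches start out as f, the new program reproduces
    # the round-1 output no matter what disc and c are.
    f = lambda a: 3.0 * a * a - 1.0
    disc = lambda a: a          # an initially meaningless discriminant
    g = make_split(f, disc, c=0.0)
    assert g(2.0) == f(2.0)
    assert g(-2.0) == f(-2.0)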

Of course, f(), f1(), f2(), and disc() could be complex functions.
They would best be described with parse trees, which I've just
discovered (I'm not formally trained in computer science).

Some reading has led me to conclude that what I'm proposing is
basically genetic programming.

http://en.wikipedia.org/wiki/Genetic_programming

I've tried my hand at this already in Python.  I have devised a code
tree class, which functions as a parse tree for functions and also
handles multiple-line statements. I've tried cobbling together program
strings from code trees, and then running them with exec.  This is all
very cool, but I'm finding it VERY cumbersome!
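
To make that concrete, here is a toy version of the code-tree-plus-exec
approach (the class and method names are made up for illustration, not
taken from my actual code):

    class Node(object):
        """A parse-tree node: an operator plus child nodes or leaf values."""
        def __init__(self, op, *children):
            self.op = op
            self.children = children

        def to_source(self):
            """Render the tree as a Python expression string."""
            parts = [c.to_source() if isinstance(c, Node) else repr(c)
                     for c in self.children]
            if self.op == 'var':
                return parts[0].strip("'\"")    # leaf: a variable name
            return '(%s)' % (' %s ' % self.op).join(parts)

    # Build (a * a) + 2.5 as a tree, render it, and run it.
    tree = Node('+', Node('*', Node('var', 'a'), Node('var', 'a')), 2.5)
    source = 'b = ' + tree.to_source()

    namespace = {'a': 3.0}
    exec(source, namespace)
    print(namespace['b'])       # -> 11.5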

As with neural nets, I know that there are genetic programming
packages offered for Python:

http://www.freenet.org.nz/python/pygene/
http://sourceforge.net/projects/pygp/

But I can't really understand what they do -- both of these packages
are, alas, minimally documented.  Is anyone out there using either of
these?  I can't tell whether they will implement algorithms of the
type I've described here.  Specifically, I don't know if conditional
statements are included in their repertoire.  Also, Pygene SEEMS to
have a bent toward Boolean logic. I'm working with analog data, and I
want complex, nonlinear functions.

So, as with neural nets, shall I once again proceed on my own? There
is apparently a layer in the Python interpreter at which code is
represented as an abstract syntax tree (AST).

http://docs.python.org/lib/ast.html

Why not do genetic programming directly on Python code?  Maybe my code
tree data structure is redundant with something which already exists?
But apparently Python's "_ast" module offers only one-way access -- it
will generate an AST from a piece of code, but you can't modify an AST
and turn it back into executable code?  I would definitely need this
latter feature.
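
For what it's worth, here is a minimal sketch of the round trip I'm
after, assuming a Python whose ast module lets compile() accept a
modified tree -- I don't know whether the 2.5-era _ast module can do
this:

    import ast

    source = "b = a * a + 2.5"
    tree = ast.parse(source)

    # Mutate the tree: flip the multiplication into an addition.
    for node in ast.walk(tree):
        if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Mult):
            node.op = ast.Add()

    ast.fix_missing_locations(tree)
    code = compile(tree, '<evolved>', 'exec')   # back to executable code
    namespace = {'a': 3.0}
    exec(code, namespace)
    print(namespace['b'])       # -> 8.5, i.e. a + a + 2.5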

ALTERNATIVELY -- and I don't mention this to start a flame war -- in
pondering this problem I've noticed the frightening fact that Lisp
seems set up to handle genetic programming by design.  The s-
expression syntax IS essentially a parse tree.  Now, I've spent a few
hours with Lisp so far, and I can't make it do much of anything -- but
I DO see how Lisp could be helpful.  Still, I'm reluctant to pursue a
language I don't know, and which I'm finding much harder to grasp than
any language I've tried before.

Is there a way to interface Lisp to Python, so that I can do all the
interface programming in the language I already know best -- and just
do the genetic parts in Lisp?  I haven't seen exception handling in
Lisp, a feature I've come to love in Python.  Since it is fairly easy
for a randomly-generated program to generate illegal output (I already
know this from my initial experiments in Python), I don't think I can
live without exception handling.
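
In Python, the sort of guard I have in mind looks something like this
(the function name and the list of exceptions are just for
illustration):

    def evaluate(source, a):
        """Run a generated program; return its output, or None if it fails."""
        namespace = {'a': a}
        try:
            exec(source, namespace)
            return namespace.get('b')
        except (SyntaxError, NameError, ZeroDivisionError,
                OverflowError, TypeError):
            return None   # discard or heavily penalize this individual

    print(evaluate('b = a / (a - a)', 2.0))   # -> None (division by zero)
    print(evaluate('b = a * a + 1.0', 2.0))   # -> 5.0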

I also think that Python has a higher-level understanding of a
variable's "type" or an object's "class" than Lisp does -- if I'm
hacking at a parse tree and I want to minimize syntax errors, I need
to know more than the fact that an element in an expression is an atom
or a list.

Producing human-readable code from my genetic programming search would
be a great bonus -- and for me, at this moment, this seems to mean
Algol-style syntax.  (Sigh.)  But it's not a requirement.

Thanks for your advice!
