Re: [SciPy-user] feed-forward neural network for python
On Wed, 06 Dec 2006 18:00:08 -0000, <scipy-user-request@scipy.org> wrote:
Hi! I released a feed-forward neural network for Python project at SourceForge (ffnet). I'm announcing it here because it depends on the numpy/scipy tandem, so you folks are potential users/testers.
If anyone is interested, please visit ffnet.sourceforge.net (and then post comments if any...)

Hi Marek,

thank you for your ANN implementation. I'm currently interested in recurrent neural networks, so I hope you'll add them too in the near future. Anyway, for those interested in ANNs, you can check the conx.py module of the PyRobotics project. It's just one Python file (182 KB of Python source!) inside a much bigger project. conx.py is self-contained but, unfortunately, it needs Numeric and not numpy :(( What about mixing ffnet and conx.py?
Compared to conx.py, ffnet is much, much faster and much, much simpler: faster thanks to scipy's optimization routines, and simpler because simplicity was the basic assumption when I created ffnet. To prove it, I trained an XOR network (2-2-1) in conx with:

    from pyrobot.brain import conx

    net = conx.Network()
    net.addThreeLayers(2, 2, 1)
    net.setInputs([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    net.setTargets([[1.], [0.], [0.], [1.]])
    net.setLearning(0.5)
    net.setMomentum(0.8)
    net.setTolerance(0.001)
    net.train()

This performed 5000 iterations in 34.6 s (and the tolerance was not reached). With ffnet it can be done with:

    from ffnet import ffnet, mlgraph

    conec = mlgraph((2, 2, 1))
    net = ffnet(conec)
    input = [[0., 0.], [0., 1.], [1., 0.], [1., 1.]]
    target = [[1.], [0.], [0.], [1.]]
    net.train_momentum(input, target, eta=0.5, momentum=0.8, maxiter=5000)

The above trains the network in 17.9 ms, and the fit is perfect. ffnet is almost 2000 times faster than conx in this example! And the code is simpler. However, there are some nice testing methods in conx which could be used in ffnet. Thanks for this tip.

-- Marek Wojciechowski
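(For readers following along: below is a minimal sketch of how the fit of the trained ffnet network could be checked. It assumes ffnet networks are callable on a single input pattern and provide a test() method returning outputs plus a regression summary; neither is shown in the message above, so treat both as assumptions.)

    # Sketch only: inspect the trained network's predictions.
    # net(inp) -> output is an assumed calling convention, not
    # something demonstrated in the message above.
    for inp, tgt in zip(input, target):
        print inp, '->', net(inp), '(target:', tgt, ')'

    # Assumed ffnet testing helper: prints the network outputs and a
    # regression analysis of outputs vs. targets.
    output, regression = net.test(input, target, iprint=1)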
Marek Wojciechowski wrote:
Compared to conx.py, ffnet is much, much faster and much, much simpler: faster thanks to scipy's optimization routines, and simpler because simplicity was the basic assumption when I created ffnet.
To prove it, I trained an XOR network (2-2-1) in conx with: ... This performed 5000 iterations in 34.6 s (and the tolerance was not reached).
With ffnet it can be done with: ... The above trains the network in 17.9 ms, and the fit is perfect.
ffnet is almost 2000 times faster than conx in this example! And the code is simpler.
However, there are some nice testing methods in conx which could be used in ffnet.
Hi Marek,

Thank you _very_ much for your test. When I need to use ANNs it's on huge datasets, so performance is a really big issue for me. I'll definitely try your ffnet very soon on my data. And I hope to see recurrent neural networks implemented in future releases ;)

Emanuele
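(A rough sketch of how the same ffnet calls might look on a bigger dataset follows. The random data, the network size, and the train_cg trainer, assumed to wrap scipy's conjugate-gradient optimizer, are all illustrative assumptions, not something taken from this thread.)

    import numpy as np
    from ffnet import ffnet, mlgraph

    # Hypothetical large problem: 10000 patterns, 10 inputs, 1 output.
    # The data is random noise, purely for illustration.
    inp = np.random.rand(10000, 10)
    trg = np.random.rand(10000, 1)

    net = ffnet(mlgraph((10, 5, 1)))
    # train_cg is assumed here to expose one of the "scipy optimization
    # routines" mentioned above (conjugate gradient) and to accept
    # fmin_cg-style keywords -- an assumption, not a documented fact.
    net.train_cg(inp, trg, maxiter=100, disp=1)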