segmentation fault in scipy?

conor.robinson at
Thu May 11 00:21:23 CEST 2006

I'm running operations on large arrays of floats, approx 25,000 x 80.
Python (scipy) does not seem to come close to using the 4 GB of wired
memory, but segfaults at around a gig. Everything works fine on smaller
batches of data, around 10,000 x 80, which use a max of ~600 MB of
memory. Any ideas? Is this just too much data for scipy?

Thanks, Conor

Traceback (most recent call last):
 File "C:\Temp\CR_2\", line 68, in ?
   net.rProp(1.2, .5, .000001, 50.0, input, output, 1)
 File "/Users/conorrob/Desktop/CR_2/", line 230, in rProp
   print scipy.trace(error*scipy.transpose(error))
 File "D:\Python24\Lib\site-packages\numpy\core\", line 149, in
   return, other)

More information about the Python-list mailing list