segmentation fault in scipy?
Travis E. Oliphant
oliphant.travis at ieee.org
Thu May 11 00:29:06 CEST 2006
conor.robinson at gmail.com wrote:
> I'm running operations on large arrays of floats, approx 25,000 x 80.
> Python (scipy) does not seem to come close to using 4GB of wired mem,
> but segments at around a gig. Everything works fine on smaller batches
> of data around 10,000 x 80 and uses a max of ~600mb of mem. Any Ideas?
> Is this just too much data for scipy?
> Thanks Conor
> Traceback (most recent call last):
> File "C:\Temp\CR_2\run.py", line 68, in ?
> net.rProp(1.2, .5, .000001, 50.0, input, output, 1)
> File "/Users/conorrob/Desktop/CR_2/Network.py", line 230, in rProp
> print scipy.trace(error*scipy.transpose(error))
> File "D:\Python24\Lib\site-packages\numpy\core\defmatrix.py", line 149, in
> return N.dot(self, other)
You should ask this question on the numpy-discussion list for better
feedback.
Does it actually segfault or give you this Memory Error?
Temporary arrays that need to be created could be the source of the
problem.
Generally, you should be able to use all the memory on your system
(unless you are on a 64-bit system and are not using Python 2.5).
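For what it's worth, the traceback points at `error*scipy.transpose(error)` with `error` being 25,000 x 80. That product materializes a 25,000 x 25,000 temporary (about 5 GB as float64) just so `trace` can read its diagonal, which alone would explain running out of memory near 1 GB on 32-bit Python. Since trace(E * E^T) equals the sum of the squares of E's entries, the same number can be computed without any large temporary. A minimal sketch (the array here is just a random stand-in for the poster's matrix):

```python
import numpy as np

# Stand-in for the 25,000 x 80 error matrix from the post (~16 MB, harmless)
error = np.random.rand(25000, 80)

# trace(E @ E.T) would build a 25,000 x 25,000 intermediate (~5 GB of float64).
# Mathematically trace(E @ E.T) == sum of squares of all entries of E,
# so compute that directly with no big temporary:
frobenius_sq = np.sum(error * error)
# equivalently: np.einsum('ij,ij->', error, error)
```

Either form stays within the memory of the original array, so the same computation that failed at 25,000 rows should run fine.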