[SciPy-User] memory errors when using savemat

Tyrel Newton newton at tethers.com
Sat Jul 24 13:37:33 EDT 2010


I'm trying to use scipy.io.savemat to export a very large dataset (around 20 million floats) to a .mat file. When I try the export, I get a MemoryError; the traceback ends with:

File "C:\Python26\lib\site-packages\scipy\io\matlab\miobase.py", line 557, in write_bytes
	self.file_stream.write(arr.tostring(order='F'))

I'm running this on Windows under Python 2.6.

Does anybody know of a way to deal with this type of memory error? Either increasing Python's available memory or telling scipy to break the export apart into chunks . . .
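One thing worth noting from the traceback: write_bytes calls arr.tostring(order='F'), which builds a full in-memory copy of the array's bytes before writing, so peak memory is roughly double the array size. A possible workaround (savemat has no built-in chunking, so this is just a sketch) is to split the array yourself and save each piece as its own MATLAB variable, so only one chunk's copy exists at a time; save_in_chunks below is a hypothetical helper name:

```python
import os
import tempfile

import numpy as np
from scipy.io import savemat, loadmat

def save_in_chunks(filename, name, arr, n_chunks=4):
    """Save `arr` as variables name_0 .. name_{n_chunks-1}, one per row chunk.

    np.array_split returns views, so no extra copies are made up front;
    savemat then serializes each (smaller) chunk separately.
    """
    chunks = np.array_split(arr, n_chunks)
    mdict = {"%s_%d" % (name, i): c for i, c in enumerate(chunks)}
    savemat(filename, mdict)

# small demo array standing in for the ~20-million-float dataset
data = np.arange(100, dtype=np.float64).reshape(10, 10)
path = os.path.join(tempfile.mkdtemp(), "big.mat")
save_in_chunks(path, "data", data)

# reassemble in Python (in MATLAB: data = [data_0; data_1; data_2; data_3];)
loaded = loadmat(path)
rebuilt = np.vstack([loaded["data_%d" % i] for i in range(4)])
assert np.array_equal(rebuilt, data)
```

If the extra precision isn't needed, casting to float32 before saving would also halve the memory footprint of both the array and the temporary copy.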

Thanks in advance for any suggestions.

Tyrel

More information about the SciPy-User mailing list