How to import input data to make ndarray for batch processing?
Hi All, I am new to Numpy (and also to Scipy).
I am trying to reshape my text data, which is in one single column (10,000 rows). I want the data to be in 100x100 array form.
I have many files to convert like this. All of them have file names like 0, 1, 2, ..., 500, without any extension. Actually, I renamed the original files so that I could import them into Matlab for batch processing. Since Matlab is also new to me, I thought I would try Numpy first.
Can anybody help me write a script to do this as batch processing?
Thanks in advance, Venkat
Do you want to save the files to disk as 100x100 matrices, or just read them into memory? Are the files in ASCII or binary format?
Nadav
D.Venkat
Research Scholar
Dept of Physics
IISc, Bangalore
India 560 012
Venkat <dvr002 <at> gmail.com> writes:
I am trying to reshape my text data which is in one single column (10,000 rows). I want the data to be in 100x100 array form. [...]
In [1]: import numpy as np

In [2]: dummy_data = np.random.randn(100, 100)

In [3]: dummy_data.shape
Out[3]: (100, 100)

In [4]: dummy_data.flatten().shape
Out[4]: (10000,)

In [5]: np.savetxt('dummy_data.txt', dummy_data.flatten())

In [6]: !less dummy_data.txt
2.571031186906808100e-01
1.566790681796508500e+00
6.846267829937818800e-01
3.271332705287631200e-01
...

In [7]: data = np.loadtxt('dummy_data.txt')

In [8]: data.shape
Out[8]: (10000,)

In [9]: data = data.reshape(100, 100)

In [10]: data.shape
Out[10]: (100, 100)

In [11]: np.allclose(dummy_data, data)
Out[11]: True
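The round-trip above can be wrapped in a small helper for reuse (a sketch; the function name load_column_as_square is made up for illustration):

```python
import numpy as np

def load_column_as_square(path, side=100):
    """Load a single-column text file and reshape it to (side, side)."""
    data = np.loadtxt(path)          # 1-D array of side*side values
    return data.reshape(side, side)  # row-major reshape, as in the session above

# demo with a throwaway file
np.savetxt('dummy_data.txt', np.random.randn(100 * 100))
arr = load_column_as_square('dummy_data.txt')
print(arr.shape)  # (100, 100)
```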
HTH, Dave
On 11/18/10 7:40 AM, Dave Hirschfeld wrote:
In [7]: data = np.loadtxt('dummy_data.txt')
or, faster:
data = np.fromfile('dummy_data.txt', dtype=np.float64, sep=' ')
fromfile() is not very flexible and doesn't have good error handling, but it's a lot faster than loadtxt() for simple cases like this.
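For instance (a sketch reusing the dummy file name from Dave's example; note that with sep=' ', fromfile treats any run of whitespace, including newlines, as a separator, so a single-column file parses fine):

```python
import numpy as np

# single-column test file, 10000 values
np.savetxt('dummy_data.txt', np.random.randn(10000))

# sep=' ' matches any whitespace, so the newlines between values are fine
data = np.fromfile('dummy_data.txt', dtype=np.float64, sep=' ')
data = data.reshape(100, 100)
print(data.shape)  # (100, 100)
```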
Chris
On Thu, 18-11-2010 at 20:19 +0530, Venkat wrote:
I have many files to convert like this. All of them have file names like 0, 1, 2, ..., 500, without any extension. [...]
One point that the others did not answer is the 'batch' part. If your files are named sequentially, you can template the argument you pass to the loader function. For example, if you use numpy.loadtxt to load data stored in files named 'mydata0', 'mydata1', ..., 'mydata511', your batch processing might look like this:
for ind in xrange(512):
    filename = 'mydata%d' % ind
    data = numpy.loadtxt(filename, ...)
    # ... your processing on a single file
adapting the range of indices, the string formatting, and the arguments to the loader function to your case (see the xrange and string formatting docs).
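Putting this together for files named plainly '0' through '500' (a sketch; it fabricates two small stand-in files, and the '_100x100.txt' output suffix is made up; use xrange instead of range on Python 2):

```python
import numpy as np

# stand-ins for the real files 0 ... 500: one column of 10000 values each
for ind in range(2):
    np.savetxt('%d' % ind, np.random.randn(10000))

# the batch loop: template the name, load, reshape, process, save
for ind in range(2):                 # range(501) for the real data
    filename = '%d' % ind
    data = np.loadtxt(filename).reshape(100, 100)
    # ... per-file processing here ...
    np.savetxt(filename + '_100x100.txt', data)
```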
On Nov 18, 2010, at 6:49 AM, Venkat wrote:
I am trying to reshape my text data which is in one single column (10,000 rows). I want the data to be in 100x100 array form.
If all you want to do is convert the actual files, and you are using a Unix-ish operating system, you don't even need Python:
paste - - - ... - < filename > newfilename

(with one '-' argument per output column, i.e. 100 of them here; the default delimiter is a tab, so add -d' ' if you want space-separated output)

should do the trick, without any assumptions about the type of data or changes in precision due to reading/writing.
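Since paste needs one '-' (read from standard input) per output column, typing 100 of them by hand is tedious; here is one way to generate them (a sketch assuming bash and GNU coreutils, with made-up file names):

```shell
# a single-column demo file of 10000 lines
seq 10000 > filename

# 100 "-" arguments make paste fold every 100 input lines into one row
paste -d' ' $(printf -- '- %.0s' $(seq 100)) < filename > newfilename

wc -l < newfilename   # 100 rows of 100 space-separated values
```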
Hope this helps,
Lutz
participants (6)
Christopher Barker
Dave Hirschfeld
Fabrice Silva
Lutz Maibaum
Nadav Horesh
Venkat