Python "compiler" is too slow for processing large data files???
Scott Gilbert
xscottgjunk at yahoo.com
Tue Sep 17 04:24:27 EDT 2002
"Ron Horn" <rchii at lycos.com> wrote:
>
> I'm writing a fairly simple app that loads (text) data files, lets you edit
> the data, and then saves them back out. Rather than parsing the data myself
> when loading a new file, and building data structures for the program's data
> that way, I decided to try to 'import' (or 'exec', actually) the data into
> the app. The idea was to be able to format my data files as python code
> (see below), and then let the python compiler do the parsing. In the
> future, I could actually put 'def' and 'class' statements right into the
> data file to capture some behavior along with the data.
>
> Simple example - I can import or exec this file to load my data (my real app
> has int, float, and string data):
> ------ try5a3.py --------
> list1 = [
> (323, 870, 46, ),
> (810, 336, 271, ),
> (572, 55, 596, ),
> (337, 256, 629, ),
> (31, 702, 16, ),
> ]
> print len(list1)
> ---------------------------
>
Try using this instead:
data1 = """323 870 46
810 336 271
572 55 596
337 256 629
31 702 16"""
list1 = [tuple([int(x) for x in y.split()])
         for y in data1.split('\n')]
del data1
It's not as pretty, but I was able to load more than 100,000 lines of
3 elements each in a few seconds with Python 2.2 on my aging laptop.
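The same idea generalizes beyond an inline string. Here is a minimal, self-contained sketch of the parsing step as a reusable helper; the function name `load_rows` is my own invention, and the syntax is modern Python rather than the 2.2 of the original post:

```python
def load_rows(text):
    """Parse lines of whitespace-separated integers into a list of
    tuples, skipping blank lines -- no exec/compile step involved."""
    return [tuple(int(x) for x in line.split())
            for line in text.splitlines()
            if line.strip()]

sample = """323 870 46
810 336 271
572 55 596
337 256 629
31 702 16"""

list1 = load_rows(sample)
print(len(list1))   # 5
print(list1[0])     # (323, 870, 46)
```

To read from a data file instead, pass `open(filename).read()` (or iterate the file object line by line to avoid holding the whole text in memory).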
Cheers,
-Scott