Need design advice. What's my best approach for storing this data?

J Correia correia_j at KILLTHIShotmail.com
Fri Mar 17 18:27:42 CET 2006


"Mudcat" <mnations at gmail.com> wrote in message 
news:1142615283.733346.74260 at i40g2000cwc.googlegroups.com...
> Hi,
>
> I am trying to build a tool that analyzes stock data. Therefore I am
> going to download and store quite a vast amount of it. Just for a
> general number - assuming there are about 7000 listed stocks on the two
> major markets plus some extras, 255 trading days a year for 20 years,
> that is about 36 million entries.
>
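A quick back-of-the-envelope check of that estimate:

```python
# Rough data-volume estimate from the figures quoted above
stocks = 7000        # listed stocks on the two major markets, plus extras
trading_days = 255   # trading days per year
years = 20

entries = stocks * trading_days * years
print(entries)  # 35,700,000 -- roughly 36 million entries
```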

On a different tack, to avoid thinking about any db issues, consider
subscribing to TC2000 (tc2000.com)... they already have all that data,
in a database which takes about 900Mb when fully installed.
They also have an API which gives you full access to the database
(including from Python via COM).  The API is pretty robust and lets
you do pre-filtering (e.g. give me the last 20 years of all stocks over $50
with average daily volume > 100k) at the db level, meaning you can focus
on using Python for analysis.  The database is also updated daily.

If you don't need daily updates, then subscribe (the first 30 days are free),
cancel, and you've got a snapshot db of all the data you need.

They also used to send out an evaluation CD which had all the
historical data barring the last 3 months or so, which is certainly
good enough for analysis and testing.  Not sure if they still do that.

HTH.
