simultaneous multiple requests to a very simple database
Eric S. Johansson
esj at harvee.org
Tue Jan 18 11:26:46 EST 2005
I have an application where I need a very simple database, effectively a
very large dictionary. The dictionary must be accessed from multiple
processes simultaneously, and I need to be able to lock individual
records while they are being written. The estimated number of records
will be in the ballpark of 50,000 to 100,000 in this early phase, and
ten times that in the future. Each record will run about 100 to 150
bytes.
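For scale, a quick back-of-envelope check (using the upper-end numbers
above, plus a guess of roughly 100 bytes of per-entry dictionary
overhead in CPython) says the whole thing should fit in memory:

records = 1000000           # ten times the early-phase estimate
payload = records * 150     # about 143 MB of record data
overhead = records * 100    # rough guess at per-entry dict overhead
print (payload + overhead) / 2.0 ** 20, "MB"   # roughly 240 MB

so an in-memory structure is at least plausible.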
Speed is not a huge concern, although I must complete processing in less
than 90 seconds. The longer the per-request delay, however, the more
processes must run in parallel to keep the throughput up. It's the
usual trade-off we have all come to know and love.
It is not necessary for the dictionary to persist beyond the life of the
parent process, although I have another project coming up in which
persistence would be a good idea.
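For that later project, the stdlib shelve module (a dictionary-like
object backed by a dbm file) might be all I need, though as far as I
know the dbm backends do not cope with concurrent writers, so it would
not solve the multi-process problem by itself. The filename here is
just an example:

import shelve

db = shelve.open('bigdict.db')
db['record-42'] = 'some 150-byte payload'
print db['record-42']
db.close()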
At this point, I know there will be some kind souls suggesting various
SQL solutions. While I appreciate the idea, unfortunately I do not have
time to puzzle out yet another component. Someday I will figure it out,
because I really like what I see in SQLite, but unfortunately today is
not that day (unless those kind souls will give me their work, home,
and cell phone numbers so I can call them when I get stuck ;-)
So the solutions that come to mind are either some form of dictionary
in shared memory with a semaphore scoreboard for locking, or a single
multithreaded process holding the database (a native Python dictionary,
Metakit, gdbm?) that all of my worker processes talk to via XML-RPC.
The latter leaves me with the question of how to make a multithreaded
server out of the stock xmlrpc machinery.
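For that second option, here is a rough, untested sketch of what I am
imagining (the DictServer class, its get/put methods, and the port
number are made up for illustration): the stock SimpleXMLRPCServer
handles one request at a time, but mixing in SocketServer.ThreadingMixIn
should give a thread per request, with a single lock serializing access
to the dictionary.

import threading
import SocketServer
from SimpleXMLRPCServer import SimpleXMLRPCServer

class ThreadedXMLRPCServer(SocketServer.ThreadingMixIn, SimpleXMLRPCServer):
    # stock SimpleXMLRPCServer serves one request at a time; the
    # mixin hands each incoming request its own thread
    pass

class DictServer:
    # made-up name: wraps the big dictionary behind get/put methods
    def __init__(self):
        self.data = {}
        self.lock = threading.Lock()   # one lock guarding the whole dict

    def get(self, key):
        self.lock.acquire()
        try:
            # XML-RPC cannot marshal None, so a miss returns ''
            return self.data.get(key, '')
        finally:
            self.lock.release()

    def put(self, key, value):
        self.lock.acquire()
        try:
            self.data[key] = value
        finally:
            self.lock.release()
        return True   # XML-RPC methods must return something

server = ThreadedXMLRPCServer(('localhost', 8000))
server.register_instance(DictServer())
server.serve_forever()

A worker process would then talk to it with xmlrpclib:

import xmlrpclib
proxy = xmlrpclib.ServerProxy('http://localhost:8000')
proxy.put('spam', 'eggs')
print proxy.get('spam')

The single lock is crude (it serializes readers too), but with records
this small the critical sections should be short; per-record locks
could come later if contention shows up.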
So feedback and pointers to information would be most welcome. I'm
still exploring the idea, so I am open to any and all suggestions
(except maybe SQL :-)
---eric