Pysqlite tables in RAM
steve at holdenweb.com
Fri Sep 29 15:40:01 CEST 2006
> Hi all,
> I'm relatively new to Python and am facing a problem with databases.
> I want to store my data in a database on the disk. I also want to be
> able to reload the tables into the RAM whenever I have a lot of disk
> accesses and commit the changes back to the database. There is an
> option of storing the data in the RAM where you connect to :memory:
> instead of a DB file. The problem with this is that the data is lost
> every time you close the connection to the database. Could somebody
> suggest a way to load the tables into the RAM as tables and not as some
> lists or dictionaries?
As long as your data isn't too voluminous it's quite practical to read
it from a SQLite database on disk into a SQLite database in memory,
given the same table structures on each.
But if the data isn't too voluminous then probably any
halfway-reasonable database will also cache it effectively. So you may
win less performance than you expect.
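As a rough sketch of the disk-to-memory round trip: Python's built-in
sqlite3 module grew a Connection.backup() method in Python 3.7 that copies an
entire database (schema and data) between connections. The file name
"example.db" and the items table below are just illustrative:

```python
import sqlite3

# An on-disk database with some sample data ("example.db" is illustrative).
disk = sqlite3.connect("example.db")
disk.execute("CREATE TABLE IF NOT EXISTS items (id INTEGER PRIMARY KEY, name TEXT)")
disk.execute("DELETE FROM items")
disk.executemany("INSERT INTO items (name) VALUES (?)", [("a",), ("b",)])
disk.commit()

# Load the whole database into RAM as real tables, not lists or dicts.
mem = sqlite3.connect(":memory:")
disk.backup(mem)  # copies every table and index into the in-memory DB

# Work against the fast in-memory copy...
mem.execute("INSERT INTO items (name) VALUES ('c')")
mem.commit()

# ...then write the accumulated changes back to the disk file.
mem.backup(disk)
mem.close()
disk.close()
```

On older Pythons without backup() you would instead iterate the source
schema and rows yourself (or use iterdump()), but the idea is the same.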
Steve Holden +44 150 684 7255 +1 800 494 3119
Holden Web LLC/Ltd http://www.holdenweb.com
Skype: holdenweb http://holdenweb.blogspot.com
Recent Ramblings http://del.icio.us/steve.holden