database persistence with mysql, sqlite
fakeaddress at nowhere.org
Sun Sep 23 21:28:08 CEST 2007
> I want to run a database query and then display the first 10 records
> on a web page. Then I want to be able to click the 'Next' link on the
> page to show the next 10 records, and so on.
> My question is how to implement paging, i.e. the 'Next/Prev' NN
> records without reestablishing a database connection every time I
> click Next/Prev? Is it at all possible with cgi/mod_python?
Caching database connections works in mod_python; not so
much with cgi.
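A rough sketch of what that caching looks like (file name and function
are made up for illustration) -- mod_python keeps the interpreter and
its loaded modules alive between requests, so a module-level connection
survives from one hit to the next:

```python
import sqlite3

# Module-level cache: created once per interpreter and reused across
# requests.  Under mod_python the module stays loaded between hits, so
# the connection survives; under plain CGI the process exits after
# every request and caching buys you nothing.
_conn = None

def get_connection(path="app.db"):
    """Return the cached connection, opening it on first use."""
    global _conn
    if _conn is None:
        _conn = sqlite3.connect(path)
    return _conn
```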
> For example, in a NON-web environment, with sqlite3 and most other
> modules, I can establish a database connection once, get a cursor
> object on which I run a single 'SELECT * FROM TABLE' statement and
> then use cursor.fetchmany(NN) as many times as there are still results
> left from the initial query.
> How do I do the same for the web?
Short answer: you don't. It would mean saving cursors with
partial query results, and arranging for incoming requests to
go to the right process. Web-apps avoid that kind of thing.
Many web toolkits offer session objects, but do not support
saving active objects such as cursors. That said, I've
never tried what you're proposing with the tools you name.
Depending on how your database handles transactions, an
open cursor can lock-out writers, and even other readers.
How long do you keep it around if the user doesn't return?
What should happen if the user re-loads a page from a few
minutes ago?
> I am not using any high-level
> framework. I am looking for a solution at the level of cgi or
> mod_python (Python Server Pages under Apache). To call
> cursor.fetchmany(NN) over and over I need to pass a handle to the
> database connection but how do I keep a reference to the cursor object
> across pages? I use mysql and sqlite3 as databases, and I am looking
> for an approach that would work with both database types (one at a
> time). So far I have successfully used the following modules for
> database access: sqlite3, MySQLdb, and pyodbc.
> So far, with mysql I use 'SELECT * FROM TABLE LIMIT L1, L2' where L1
> and L2 define the range for the 'Next' and 'Previous' commands. I
> have to run the query every time I click a 'Next/Prev' link.
You might want to run that query by a MySQL expert.
The basic method is nice in that it needs no server-side
state between requests. (It's a little squirrelly in that
it can show a set of records that was never the contents
of the table.)
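For reference, the stateless LIMIT approach boils down to something
like this (the table and column names are invented; the same SQL runs
under both sqlite3 and MySQLdb, modulo the placeholder style):

```python
import sqlite3

PAGE_SIZE = 10  # records per web page

def fetch_page(conn, page):
    """Re-run the query for each request; the page number alone
    encodes the position, so no cursor has to survive between
    page loads."""
    offset = page * PAGE_SIZE
    cur = conn.cursor()
    # sqlite3 uses '?' placeholders; MySQLdb wants '%s' instead.
    cur.execute(
        "SELECT id, name FROM items ORDER BY id LIMIT ? OFFSET ?",
        (PAGE_SIZE, offset))
    return cur.fetchall()
```

The 'Next' link then just points at page+1, and no connection or
cursor state has to be carried across requests.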
> But I am
> not sure that this is the best and most efficient way. I suppose using
> CURSOR.FETCHMANY(NN) would probably be faster and nicer but how do I
> pass an object reference across pages? Is it possible without any
> higher-level libraries?
Do you know that you have a performance problem? If so, do
you know that it is due to too many cursor.execute() calls?
Keeping partially-executed queries is way down on the list
of optimizations to try.
> What would be the proper way to do it on a non-enterprise scale?
With mod_python, you can cache connections, which may help.
If you use "ORDER BY" with "LIMIT", the right index can make
a big difference.
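By way of illustration (table and column names invented), an index
that matches the ORDER BY lets the engine walk the index and stop
after LIMIT rows, instead of sorting the whole table on every request:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE items (id INTEGER PRIMARY KEY, created TEXT, name TEXT)")
# Index on the ORDER BY column: the page query below can be satisfied
# by scanning the index in order, with no separate sort step.
conn.execute("CREATE INDEX idx_items_created ON items (created)")
rows = conn.execute(
    "SELECT name FROM items ORDER BY created LIMIT 10").fetchall()
```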
Have you considered implementing your 'Next/Prev' commands
on the client side? You'd get all the records in one query,
and the user would see point-in-time correct results.
Another possibility is to get all the query results and
save them in a session object, then deal them out a few at
a time. But as a rule of thumb, the less state on the
server, the better.
> Would SqlAlchemy or SqlObject make things easier with regard to
> database persistence?
Quite likely, but probably not in the way you propose.
The web frameworks that use those toolkits try to do
things in robust and portable ways.