[DB-SIG] Optional DB API Extensions

M.-A. Lemburg mal@lemburg.com
Fri, 26 Oct 2001 14:56:40 +0200

Stuart Bishop wrote:
> On Thursday, October 25, 2001, at 10:37  PM, Matthew T. Kromer wrote:
> >
> > On Thursday, October 25, 2001, at 06:52 AM, Stuart Bishop wrote:
> >
> >>
> >>
> >> I also would like the connection method:
> >>         quote(object)
> >>
> >>             Returns a SQL quoted version of the given variable as a
> >>             string, suitable for execution by the database backend.
> >>             For example:
> >>
> > Since the primary reason you want quoting is generally to handle cases
> > where you can't directly BIND values to your database, I'm of the
> > opinion that this should be heavily leaned against.  Instead, use of
> > parameter binding should be highly encouraged.
> Definitely encouraged, which is best done with documentation. The two
> things I do with this sort of method (using Perl in a past life, and
> now Zope) are:
>         - Generate logs of SQL commands sent to the RDBMS.
>         - Generate hairy single use SQL dynamically for data mining type
> reports.
> > I would be open to having extension methods which helped construct
> > statements using the API's preferred bind specification method, or a
> > mechanism whereby the application can negotiate with the adapter to
> > choose a quoting mechanism.
> A different method of getting this functionality would be:
>         sql(command,params)
>      >>> con.sql('SELECT * FROM users WHERE username = :p1',('zen',))
> This would be useful for the first case I mentioned, but more obtuse
> for the second case, since I would end up doing
> ''' con.sql(':p1',('zen',)) ''' to retrieve the quoted parameter.
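The binding style Matthew is encouraging can be sketched with the standard-library sqlite3 module standing in for a generic DB API adapter (an assumption for illustration; any adapter with qmark paramstyle looks the same):

```python
# Sketch: parameter binding instead of manual quoting, using the
# stdlib sqlite3 module as a stand-in DB API adapter.  The table and
# data are invented for the example.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE users (username TEXT)")
cur.execute("INSERT INTO users (username) VALUES (?)", ("zen",))

# Bound parameters: the adapter does the quoting/escaping itself, so
# the application never builds SQL out of raw strings.
cur.execute("SELECT * FROM users WHERE username = ?", ("zen",))
print(cur.fetchall())  # [('zen',)]
```

With binding, the quoted-literal form Stuart wants for logging never exists as a string on the application side, which is exactly why a separate quote/log helper comes up at all.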

mxODBC has a method called .nativesql() which pretty much implements
this feature (the ODBC driver is usually the part in the chain
which does the quoting, etc. before passing the data on to the
database), however, .nativesql() will only return data if you have
previously .prepare()ed it. Also, the methods are cursor methods,
not connection ones.
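The call order described above can be illustrated with a toy cursor. This is not mxODBC code: only the method names come from the description, and the naive quoting below is an invented stand-in for what the real ODBC driver does.

```python
# Toy illustration of the .prepare()/.nativesql() call order described
# above.  NOT mxODBC code; the rendering logic is an invented stand-in
# for the ODBC driver's own quoting.
class ToyCursor:
    def __init__(self):
        self._statement = None

    def prepare(self, operation):
        # A real driver would hand the statement to the ODBC layer here.
        self._statement = operation

    def nativesql(self, parameters):
        # Per the text above: only works after .prepare() has been called.
        if self._statement is None:
            raise RuntimeError("call .prepare() first")
        # Naive stand-in: splice quoted literals in place of '?' markers.
        parts = self._statement.split("?")
        out = parts[0]
        for tail, value in zip(parts[1:], parameters):
            out += "'" + str(value).replace("'", "''") + "'" + tail
        return out

cur = ToyCursor()
cur.prepare("SELECT * FROM users WHERE username = ?")
print(cur.nativesql(("zen",)))
# SELECT * FROM users WHERE username = 'zen'
```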

I'd suggest leaving this out of the spec and using a helper like
e.g. dbinfo.py (see Python Software below) for this.
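Such a helper, kept outside the DB API spec, might look like the minimal sketch below. The function name and escaping rules are assumptions for illustration (real backends differ in their literal syntax), aimed at Stuart's logging use case rather than at building live queries:

```python
# Minimal sketch of a standalone quoting helper living outside the
# DB API spec.  Name and escaping rules are illustrative assumptions;
# real backends have their own literal syntax.
def sql_literal(value):
    """Render a Python value as a SQL literal, e.g. for log output."""
    if value is None:
        return "NULL"
    if isinstance(value, (int, float)):
        return str(value)
    # String case: double any embedded single quotes.
    return "'" + str(value).replace("'", "''") + "'"

print(sql_literal("O'Reilly"))  # 'O''Reilly'
print(sql_literal(None))        # NULL
```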

> > Having said all of that, this makes me lean towards a bilevel (or
> > trilevel) approach where there is a standard module which all adapters
> > subclass, and an optional module to attempt to provide further adapter
> > abstractions (inter-adapter abstractions).  However, I'd leave the
> > current API spec largely untouched except for DRIVER level features;
> > APPLICATION level features (including application convenience
> > functions) belong elsewhere.
> This is similar to the approach the DB API 3.0 strawman I put out a
> few months ago was heading for. I think this sort of thing is beyond
> the scope of Marc-Andre's proposal as it stands (although he may
> decide to grow it).

I'm taking the stand here that the DB API itself should be minimal
and that DB authors should add as much to the interface as they
feel necessary.

The reason I started this discussion is to provide at least some
of these extensions with a common understanding.
Having a single implementation which then gets subclassed may become
possible in Python 2.2 with the new type mechanism in place,
but I think we should punt on it for at least a few years,
until subclassing C types has become stable enough to rely on.

Marc-Andre Lemburg
CEO eGenix.com Software GmbH
Consulting & Company:                           http://www.egenix.com/
Python Software:                        http://www.lemburg.com/python/