[DB-SIG] Remaining issues with DB API 1.1
Sun, 28 Mar 1999 06:19:30 -0800
M.-A. Lemburg wrote:
> The updated spec now says:
> Fetch the next set of rows of a query result, returning a
> sequence of sequences (e.g. a list of tuples). An empty
> sequence is returned when no more rows are available. The
> number of rows to fetch is specified by the parameter. If it is not
> given, the cursor's arraysize determines the number of rows to
> be fetched.
This is good, although we should probably elaborate a bit more: it is
possible that *fewer* than the requested number of rows (via the
parameter or cursor.arraysize) will be returned. Specifically, this will
almost always happen at the "end" of a query. IMO, it should also be
legal mid-query, but we don't need to state that explicitly. (Although
we should be clear that returning fewer rows does *not* mean
end-of-query... the only determining factor for that is an empty
sequence.)
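For illustration, a consumer loop under that reading might look like
this (a sketch only; process() stands in for whatever the application
does with each row):

    while 1:
        rows = cursor.fetchmany(100)   # may legally return fewer than 100
        if not rows:                   # empty sequence ==> end-of-query
            break
        for row in rows:
            process(row)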
> > No need to specify in the spec exactly _what_ the default should
> > be - it seems sufficient to describe the behaviour - ie, "if not
> > specified", rather than "the default value is xxx"
> Hmm, I don't quite follow you here. Why shouldn't the default
> be defined ? [After all, the 1.0 spec also defined the "default"
> to be cursor.arraysize.]
It should be defined. He's smoking something :-) Your current text is fine.
> If we were not to define the default value, then the definition
> of cursor.arraysize would be obsolete w/r to fetchmany():
> Should we drop the reference to fetchmany() in the above definition ?
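With the default defined as cursor.arraysize, a module writer can
implement it along these lines (a rough sketch, assuming a hypothetical
internal _fetch_row() helper that returns None when no rows remain):

    def fetchmany(self, size=None):
        # fall back to the cursor's arraysize when no size is given
        if size is None:
            size = self.arraysize
        result = []
        for i in range(size):
            row = self._fetch_row()
            if row is None:        # no more rows available
                break
            result.append(row)
        return result              # empty list signals end-of-query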
Greg Stein, http://www.lyra.org/