python-postgres-informix interaction problems

Steve Holden sholden at
Wed Oct 31 14:42:01 CET 2001

"J Dubal" <dubal at> wrote ...
> Hello good people!
Hello yourself. All the *good* people were busy, so they asked *me* to reply.

> We have developed Informix RDBMS based commercial applications in
> Informix-4gl. It is a good procedural language with embedded SQL. We
> are trying to rewrite some apps in Python with postgresql backend for
> licensing reasons. Python is good too. We are trying to do this with
> latest versions of RH linux (7.2), python 2.1 and postgres 7.1.3. We
> face following issues:
Excellent. Another company decides to embrace open source products. If only
Microsoft would learn this lesson ...

> 1. We have to read data from informix db, process it and insert into
> postgres. We are using informixdb module (DBAPI 1) which is a bit dated.
> In Informix we 'select custno, custname, custaddr1, ... into progvar1,
> progvar2, ... from custmf where ...' In python, we say fetchone() and
> we get a list. We are then moving the elements of the list to named
> program variables thus:
>         pcustno = list1[0]
>         pcustname = list1[1]
>         pcustaddr1 = list1[2]
>         ... as many as reqd
> This way we give meaningful names to data items read from db and
> process further.
> Is there a better, more elegant/efficient way to do this?
Yes. Not necessarily more efficient (probably less so), but much more usable
from a programmatic point of view is Greg Stein's dtuple module, which
allows you to address the tuple using numerical or field-name subscripts as
well as attributes (t.fieldname). A short example quickly shows the
advantages of this excellent module.
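If dtuple is not to hand, the DB API itself gets you most of the way: a
cursor's description attribute lists the column names, so a fetched row can
be zipped into a dictionary for access by name. A minimal sketch (using
sqlite3 purely as a stand-in for informixdb or PoPy, with the custmf
columns from the question):

```python
import sqlite3  # stands in here for any DB API module (informixdb, PoPy, ...)

def row_as_dict(cur, row):
    """Map one fetched row to {column_name: value} via cursor.description."""
    names = [d[0] for d in cur.description]  # item 0 of each entry is the column name
    return dict(zip(names, row))

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE custmf (custno INTEGER, custname TEXT, custaddr1 TEXT)")
cur.execute("INSERT INTO custmf VALUES (1, 'Acme', '12 High St')")
cur.execute("SELECT custno, custname, custaddr1 FROM custmf")
rec = row_as_dict(cur, cur.fetchone())
print(rec["custname"])  # access by name rather than by position
```

This avoids the block of pcustno = list1[0] assignments entirely; dtuple
adds attribute access (t.custname) on top of the same idea.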

> 2. To insert this data into postgres, we used to use pygres included
> in the postgres distribution (latest). We had to stop using this
> because we couldn't find a way to find out how many rows it had
> inserted, updated or deleted after such statements are executed, i.e. we
> couldn't find a way to check the status or return code. So we switched
> over to PoPy which is DBAPI2 and where we could check number of rows
> affected. Is this right thing to do?
This sounds like an excellent plan.
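For the record, the rows-affected count you were missing is the DB API 2
cursor.rowcount attribute, which any compliant module (PoPy included)
exposes after a DML statement. A sketch, again with sqlite3 standing in for
PoPy and the custmf table as an assumed example:

```python
import sqlite3  # stands in for PoPy; any DB API 2 module exposes cursor.rowcount

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE custmf (custno INTEGER, custname TEXT)")
cur.executemany("INSERT INTO custmf VALUES (?, ?)",
                [(1, "Acme"), (2, "Bogus"), (3, "Cogs")])
cur.execute("UPDATE custmf SET custname = 'Updated' WHERE custno > 1")
print(cur.rowcount)  # number of rows the UPDATE touched
```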

> 3. In order to insert data into postgres, we have to compose an insert
> statement thus:
>         cur.execute("insert into custmf (custno, custname,
> custaddr1,...)
>             values (%i,'%s','%s'...)" %(pcustno,pcustname,
> pcustaddr1....))
> Is there a better way of doing this?
Yes. Use parameterized statements. The DB API defines five placeholder
schemes, the most common of which uses question marks in the statement to
associate data values from a tuple provided to the cursor.execute() method
as a second argument. (Check your module's paramstyle attribute to see
which scheme it uses; PostgreSQL modules commonly use %s-style
placeholders.)

So you would write:

    cur.execute("""INSERT INTO custmf (custno, custname, custaddr1, ...)
                            VALUES (?, ?, ?, ...)""",
                        (pcustno, pcustname, pcustaddr1, ...))

The DB API documentation gives you more detail on this. Note that not only
does this save you from constructing individual statements, it also gives
the DB API module a chance to optimize successive uses of the same
statement with different values.
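A runnable sketch of the pattern, using sqlite3 (which happens to use the
question-mark scheme) in place of PoPy, and the custmf columns from your
example; executemany() runs the one prepared statement once per parameter
tuple:

```python
import sqlite3  # qmark-style placeholders, as in the statement sketched above

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE custmf (custno INTEGER, custname TEXT, custaddr1 TEXT)")

rows = [(1, "Acme", "12 High St"),
        (2, "Bogus", "9 Low Rd")]
# One statement, many parameter tuples: the module can prepare it once.
cur.executemany("INSERT INTO custmf (custno, custname, custaddr1) VALUES (?, ?, ?)",
                rows)
cur.execute("SELECT COUNT(*) FROM custmf")
n = cur.fetchone()[0]
print(n)  # 2
```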

> 4. The above string substitute converts program variables holding null
> value into 'None' which goes into target table as string 'None'. To
> avoid this we parse all columns to check if they are 'None' and
> replace them by ''. Is there a better way to do this?
Parameterization should solve this problem too!

> 5. The above check works only on string variables. We have not found
> yet a way to handle nulls in numeric columns. Please suggest a
> solution.
And this one!

> 6. If the character columns happen to contain quotes, the above
> composed insert statement breaks. Please suggest a solution.
And this one! The DB module will perform all necessary quoting!
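Points 4, 5 and 6 all disappear at once with parameter binding: a Python
None is stored as SQL NULL (not the string 'None'), numeric NULLs need no
special casing, and embedded quotes need no hand-escaping. A sketch, with
sqlite3 again standing in for PoPy:

```python
import sqlite3  # standing in for PoPy; parameter binding is DB API behaviour

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE custmf (custno INTEGER, custname TEXT)")

# None becomes SQL NULL (even in the numeric column), and the embedded
# quote is bound safely by the module -- no manual escaping needed.
cur.execute("INSERT INTO custmf VALUES (?, ?)", (None, "O'Reilly"))
cur.execute("SELECT custno, custname FROM custmf")
row = cur.fetchone()
print(row)  # (None, "O'Reilly")
```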

Hope this helps. Good luck with your project.


More information about the Python-list mailing list