Postgres Large Objects with Python/PyPgSQL

Rod MacNeil rmacneil at interactdirect.com
Tue Jul 31 19:45:46 CEST 2001


I have 2 computers with PostgreSQL running:
1. PostgreSQL 7.0 / Red Hat 7.0 / Python 2.1 / pyPgSQL 1.2
2. PostgreSQL 7.1.1 / Red Hat 7.1 / Python 2.1 / pyPgSQL 1.2

I am relatively new to both Python and Postgres, and I
was experimenting with large objects.
I wrote the simple test program shown below to insert
and extract a large object. It works
fine on computer #1 with Postgres 7.0
but fails with Postgres 7.1.1 on machine #2 with the following
error:

C:\Rod\Project\Active\00012-interact-db-build>python
blobtest.py
[['one ', 709448]]
Traceback (most recent call last):
File "blobtest.py", line 36, in ?
handle.open('r')
AttributeError: 'int' object has no attribute 'open'


Did something change in Postgres between 7.0 and 7.1.1 that
prevents accessing a large object through the pyPgSQL interface?
Or is there a problem with my code?

It appears that on computer #2 the f2 column comes back as a
plain integer (the OID) rather than a large-object handle.
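For what it's worth, the shape difference can be probed with a hasattr check like the sketch below; FakeLargeObject is a made-up stand-in (not part of pyPgSQL) so the sketch runs without a database connection:

```python
def extract_blob(value):
    """Read the blob whichever shape the oid column comes back in:
    a file-like large-object wrapper (as on machine #1) or a bare
    integer OID (as on machine #2)."""
    if hasattr(value, 'open'):   # wrapper exposes open()/read()
        value.open('r')
        return value.read()
    return None                  # bare OID: nothing to open directly

# Hypothetical stand-in so the sketch runs without a database.
class FakeLargeObject:
    def __init__(self, payload):
        self._payload = payload
    def open(self, mode):
        pass
    def read(self):
        return self._payload

print(extract_blob(FakeLargeObject(b'icon-bytes')))  # b'icon-bytes'
print(extract_blob(709448))                          # None
```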

Thanks in Advance
Rod MacNeil
rmacneil at interactdirect.com

Here's the code:

# blobtest.py

"""
This is me playing with blobs with PyPgSQL
"""



import PgSQL
conx = PgSQL.connect(host='192.168.1.102', database='junk', user='user', password='pass')

cur = conx.cursor()
cur.execute("create table test2 (f1 char(10),f2 oid)")
conx.commit()
cur.close()

data = {}
data['name'] = 'one'
cur = conx.cursor()
data['blob'] = conx.binary(open('client.ico','rb').read())

sql = "insert into test2 values (%(name)s,%(blob)s)"
cur.execute(sql,data)
conx.commit()
cur.close()

cur = conx.cursor()
cur.execute("select * from test2")
data=cur.fetchall()
print data

outfile=open('client2.ico','wb')

# the large object that comes back from a query can be opened as a file handle
handle = data[0][1]
handle.open('r')
outfile.write(handle.read())
outfile.close()

cur.close()
conx.close()





