large insert strings crash MySQL

jesse at spine.com
Sun May 30 16:10:52 EDT 1999


I'm trying to insert large strings (up to several megabytes) into a mysql
table via Python and the MySQL module.  My table defines item_data as a longblob:

mysql> show fields from item;
+-------------+------------------+------+-----+---------+----------------+
| Field       | Type             | Null | Key | Default | Extra          |
+-------------+------------------+------+-----+---------+----------------+
| item_id     | int(10) unsigned |      | PRI | 0       | auto_increment |
| item_data   | longblob         | YES  |     | NULL    |                |
...



and I'm able to insert large strings from the mysql client:

mysql> insert into item (item_id, item_name, item_data) 
       values (null, 'tmp', repeat('x', 80000));
Query OK, 1 row affected (0.88 sec)



but when I try the same thing from Python via the MySQL module, it fails with an error:

>>> db = MySQL.connect('localhost','user','pass')
>>> db.selectdb('kb')
>>> bs = 'x' * 40000
>>> db["insert into item (item_id, item_name, item_data) values (null, 'tmp', '%s')" % bs]
1
>>> bs = 'x' * 80000
>>> db["insert into item (item_id, item_name, item_data) values (null, 'tmp', '%s')" % bs]
Traceback (innermost last):
  File "<stdin>", line 1, in ?
MySQL.error: MySQL server has gone away
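(A side note on the transcript above: interpolating the raw string into the
statement with % only works because the data is all 'x' characters; real blob
data containing quotes or backslashes would corrupt the SQL.  A minimal sketch
of an escaper -- sql_escape here is a hypothetical helper, not part of the
MySQL module:)

```python
def sql_escape(s):
    # Hypothetical helper: backslash-escape the characters that would
    # terminate or corrupt a single-quoted MySQL string literal.
    out = []
    for ch in s:
        if ch in ("'", '"', '\\'):
            out.append('\\' + ch)
        elif ch == '\0':
            out.append('\\0')
        elif ch == '\n':
            out.append('\\n')
        elif ch == '\r':
            out.append('\\r')
        else:
            out.append(ch)
    return ''.join(out)

print(sql_escape("it's a 'blob'"))
```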


Any suggestions?  It looks like the interface is treating the column as a
normal blob with a 64k limit.
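(One way to sanity-check the 64k hypothesis: the whole INSERT statement, blob
included, has to fit into whatever buffer the client/server link uses -- in
MySQL this is governed by settings like max_allowed_packet.  A quick size
calculation, assuming the statement text shown above, is consistent with a
64KB ceiling:)

```python
# Fixed part of the statement, without the blob payload.
overhead = len(
    "insert into item (item_id, item_name, item_data) "
    "values (null, 'tmp', '')"
)

small = overhead + 40000   # the insert that succeeded
large = overhead + 80000   # the insert that failed

limit = 64 * 1024          # assumed 64KB buffer/packet ceiling

print(small, small > limit)   # fits under 64k
print(large, large > limit)   # exceeds 64k
```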





More information about the Python-list mailing list