resume upload wsgi script
Diez B. Roggisch
deets at nospam.web.de
Sun Aug 9 14:25:47 EDT 2009
gert wrote:
> On Aug 9, 4:42 pm, "Diez B. Roggisch" <de... at nospam.web.de> wrote:
>> gert wrote:
>>
>>> On Aug 9, 3:17 pm, "Diez B. Roggisch" <de... at nospam.web.de> wrote:
>>>> gert wrote:
>>>>> I'm working on a resume upload script and encountered the following
>>>>> problems:
>>>>> sql: Could not decode to UTF-8 column 'SUBSTR(picture,?)' with text
>>>>> '\ufffd\ufff
>>>>> d\ufffd\ufffd↑!Exif------------Ef1gL6KM7Ij5ae0gL6KM7cH2cH2GI3
>>>>> Content-Disposition: form-data; name="Filename"
>>>> You are treating a binary data column as if it were a string. That's
>>>> bogus, you need to use a blob column.
>>>> Also I wouldn't combine the uploaded chunks until the full upload is
>>>> finished - and even then only if I actually need the data.
>>>> Diez
>>> And the best solution would be to use TEXT instead or some sort of
>>> SUBBIN that i do not know of in sqlite ?
>> No, the best solution would be to use "BLOB", and no SUB*-stuff, but
>> instead a 1:n-relation of chunks, that when the upload is finished you
>> can easily combine in python to one large blob, storing that in the DB
>> again.
>>
>> Diez
>
> so one table of chunks
>
> CREATE TABLE temp (
> file_id VARCHAR(64),
> chunk_id INTEGER,
> chunk BLOB,
> PRIMARY KEY(file_id,chunk_id)
> );
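Under that schema, each arriving chunk can be stored directly as a BLOB row. A minimal sketch with Python's sqlite3 module (the in-memory database and the `store_chunk` helper are illustrative assumptions; the table matches the DDL above, with the `chunK_id` typo corrected to `chunk_id`):

```python
import sqlite3

# In-memory DB for illustration; a real app would pass a file path.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE temp (
    file_id VARCHAR(64),
    chunk_id INTEGER,
    chunk BLOB,
    PRIMARY KEY(file_id, chunk_id))""")

def store_chunk(conn, file_id, chunk_id, data):
    # Wrap the bytes so sqlite binds them as a BLOB, not as text --
    # binding raw binary as a string is what caused the UTF-8 error above.
    conn.execute("INSERT INTO temp VALUES (?, ?, ?)",
                 (file_id, chunk_id, sqlite3.Binary(data)))
    conn.commit()

store_chunk(conn, "file", 0, b"\xff\xd8\xff\xe0")  # e.g. some JPEG header bytes
```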
>
> SELECT chunk FROM temp WHERE file_id = 'file' ORDER BY chunk_id
> concatenating result in python
> update blob
> delete temp
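The four steps listed above could be sketched like this (the `files` table that receives the combined blob is an assumption not present in the thread, as is the sample data):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE temp (file_id VARCHAR(64), chunk_id INTEGER, chunk BLOB,
                   PRIMARY KEY(file_id, chunk_id));
CREATE TABLE files (file_id VARCHAR(64) PRIMARY KEY, data BLOB);
""")
# Two chunks, deliberately inserted out of order.
conn.execute("INSERT INTO temp VALUES ('file', 1, ?)",
             (sqlite3.Binary(b"world"),))
conn.execute("INSERT INTO temp VALUES ('file', 0, ?)",
             (sqlite3.Binary(b"hello "),))

def finish_upload(conn, file_id):
    # 1. SELECT the chunks in chunk_id order, 2. concatenate in Python,
    rows = conn.execute("SELECT chunk FROM temp WHERE file_id = ? "
                        "ORDER BY chunk_id", (file_id,))
    data = b"".join(bytes(r[0]) for r in rows)
    # 3. store the combined blob, 4. delete the temp rows.
    conn.execute("INSERT OR REPLACE INTO files VALUES (?, ?)",
                 (file_id, sqlite3.Binary(data)))
    conn.execute("DELETE FROM temp WHERE file_id = ?", (file_id,))
    conn.commit()

finish_upload(conn, "file")
```

Note the ORDER BY: without it sqlite may return the chunks in any order.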
>
> How do I concatenate results that do not fit into memory ?
By writing them into one file? If the files were too large for your memory,
all the substring stuff wouldn't help either - sqlite works in the
same memory space as your program...
But how many gigabytes of uploads do you expect?
Diez
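Streaming to a file as suggested here avoids building the whole blob in memory, since the cursor can be consumed one row at a time. A sketch under the same assumed schema (the tempfile destination and sample chunks are illustrative):

```python
import os
import sqlite3
import tempfile

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE temp (file_id VARCHAR(64), chunk_id INTEGER, "
             "chunk BLOB, PRIMARY KEY(file_id, chunk_id))")
for i, part in enumerate([b"abc", b"def", b"ghi"]):
    conn.execute("INSERT INTO temp VALUES ('file', ?, ?)",
                 (i, sqlite3.Binary(part)))

def spool_to_file(conn, file_id, path):
    # Iterate the cursor row by row: only one chunk is held in memory
    # at a time, regardless of the total upload size.
    with open(path, "wb") as out:
        for (chunk,) in conn.execute(
                "SELECT chunk FROM temp WHERE file_id = ? ORDER BY chunk_id",
                (file_id,)):
            out.write(bytes(chunk))

path = os.path.join(tempfile.mkdtemp(), "upload.bin")
spool_to_file(conn, "file", path)
```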