[python-uk] Handling large file uploads

Javier Llopis javier at llopis.me
Tue Apr 11 06:33:18 EDT 2017

I haven't done anything similar, but this upload-in-chunks-then-reassemble 
idea bears some similarity to peer-to-peer file-sharing software. I would 
look there for ideas if I got stuck on some problem.

 From: "Hansel Dunlop" <hansel at interpretthis.org>
Sent: 11 April 2017 11:22
To: "UK Python Users" <python-uk at python.org>
Subject: [python-uk] Handling large file uploads   
 Hello all  
 I'm working on an application that has to accept large uploads. Think ~ 
2GB+ size files getting uploaded over slowish connections. These files are 
eventually going to end up in S3.
 Uploading smallish files is not a problem. But things get a bit 
complicated when you're dealing with large files and load-balanced servers, 
servers that can be replaced at any time. Has anyone done something 
similar?
 My current plan is:
 1. Accept chunked uploads. So the app/browser sends individual POST 
requests with ~10 MB chunks. Once that upload is complete the server 
responds with a chunk id and the current offset.
 2. The server stores each intermediate chunk in a temporary S3 bucket
 3. Once the final chunk has been uploaded the server kicks off another 
process that stitches the pieces together and puts the whole file into its 
final location. And then deletes the intermediate pieces. 
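The three steps above can be sketched roughly like this. It's a minimal in-memory sketch, not a real implementation: a dict stands in for the temporary S3 bucket, chunk ids are assumed to be sequential integers, and the `UploadSession` class is a hypothetical name.

```python
# Hypothetical sketch of the chunk/offset bookkeeping in steps 1-3.
# A dict stands in for the temporary S3 bucket; chunk ids are assumed
# to be sequential integers starting at 0.
class UploadSession:
    def __init__(self, total_size, chunk_size=10 * 1024 * 1024):
        self.total_size = total_size
        self.chunk_size = chunk_size
        self.chunks = {}      # chunk_id -> bytes (stand-in for temp bucket)
        self.received = 0

    def accept_chunk(self, chunk_id, data):
        """Step 1: store one chunk, reply with (chunk id, current offset)."""
        self.chunks[chunk_id] = data
        self.received += len(data)
        return chunk_id, self.received

    def is_complete(self):
        return self.received >= self.total_size

    def stitch(self):
        """Step 3: reassemble chunks in order, then drop the pieces."""
        assert self.is_complete()
        data = b"".join(self.chunks[i] for i in sorted(self.chunks))
        self.chunks.clear()   # delete the intermediate pieces
        return data

# Usage: upload a 25-byte "file" in 10-byte chunks.
payload = b"abcdefghijklmnopqrstuvwxy"
session = UploadSession(total_size=len(payload), chunk_size=10)
for i in range(0, len(payload), 10):
    session.accept_chunk(i // 10, payload[i:i + 10])
assert session.stitch() == payload
```

In a real deployment the chunk store has to live outside any one server (hence the temporary S3 bucket), since with load balancing the final POST may land on a machine that never saw the earlier chunks.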
 I think I have to handle the file in chunks like this, but maybe there is 
some way to stream the files somewhere instead? 
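On the streaming point: even with chunked storage, the stitching process never has to hold the whole 2GB+ file in memory. A generator can yield the pieces in order and feed them to wherever the final object goes. This is a local-filesystem sketch under the assumption that chunks are saved as files named by their integer id; `stream_chunks` is a hypothetical helper.

```python
import os
import tempfile

def stream_chunks(chunk_dir):
    """Yield chunk contents in id order, so the final object can be
    written (or uploaded) as a stream rather than built in memory.
    Assumes chunk files are named "0", "1", "2", ... by chunk id."""
    for name in sorted(os.listdir(chunk_dir), key=int):
        with open(os.path.join(chunk_dir, name), "rb") as f:
            yield f.read()

# Usage: write three chunks to a temp dir, then stream-stitch them.
with tempfile.TemporaryDirectory() as d:
    for i, piece in enumerate([b"aaa", b"bbb", b"cc"]):
        with open(os.path.join(d, str(i)), "wb") as f:
            f.write(piece)
    assert b"".join(stream_chunks(d)) == b"aaabbbcc"
```

The `key=int` sort matters once there are ten or more chunks, where a plain lexicographic sort would put "10" before "2".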

