[Tutor] File compression

Carlos Daniel Ruvalcaba Valenzuela clsdaniel at gmail.com
Wed Jun 28 00:29:31 CEST 2006


I think this is something that can easily be done with some shell
scripting; I suppose you are using Linux or a Unix derivative.

In that case tar + bzip2 is your friend, although for that volume of
files I suggest looking at something like rsync or unison and coupling
it with cron to automate the online backup.
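For the cron side, an entry along these lines would run the sync every
night (the schedule, paths and server name here are only placeholders,
not anything from the original question):

```
# min hour dom mon dow  command
0 2 * * *  rsync -a /home/magnus/data/ backupserver:/backups/data/
```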

Otherwise, I suggest looking at the glob module for listing directories
and files, and at the bz2 module for compression; there is also the
zipfile module. Check the docs:

http://docs.python.org/lib/someos.html
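In fact, the standard library's tarfile module (built on top of bz2)
can do the whole tar + bzip2 step in pure Python, which matches your
wish to avoid external apps. A minimal sketch — the file and archive
names are just for illustration:

```python
import tarfile

# A small sample file standing in for the real backup tree.
with open("notes.txt", "w") as f:
    f.write("example data\n")

# Mode "w:bz2" writes a bzip2-compressed tar archive in pure Python,
# so no external tar or bzip2 binary is needed.
with tarfile.open("backup.tar.bz2", "w:bz2") as tar:
    tar.add("notes.txt")  # tar.add() recurses into directories too
```

Passing a directory to tar.add() archives everything beneath it, so a
single call can cover the whole backup tree.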

As for FTP, there is ftplib:

http://docs.python.org/lib/module-ftplib.html
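The upload step then looks roughly like this — the host, credentials
and filename below are placeholders you would replace with your backup
server's details:

```python
import ftplib

def upload(host, user, password, filename):
    """Upload one local file to an FTP server in binary mode."""
    ftp = ftplib.FTP(host)
    ftp.login(user, password)
    with open(filename, "rb") as f:
        # STOR streams the local file to the server under the same name.
        ftp.storbinary("STOR " + filename, f)
    ftp.quit()
```

Binary mode (storbinary) matters here: a compressed archive sent as
text would be corrupted in transfer.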


Good luck, regards
Carlos Daniel Ruvalcaba

On 6/27/06, Magnus Wirström <asdlinux at yahoo.se> wrote:
> Hi Everyone
>
> I'm starting out with Python and I need to write a program that can
> compress a large number of files and directories into a single file that
> is later going to be transferred with FTP to a backup storage server.
> The amount of data is quite big (over 1.5 GB in 40000 files and 1300
> directories), so I would like high compression like RAR or bz2.
> What is the best way to approach this? I have looked a bit at zlib but I
> didn't find a good example that uses it to compress files and
> directories into one file from Python. Or perhaps it's simpler to execute
> an external program to do the work. I would prefer to use Python as much
> as I can, without external apps.
>
> Does anyone have an idea how this can be done with Python?
>
> Thanks
> Magnus Wirström
> _______________________________________________
> Tutor maillist  -  Tutor at python.org
> http://mail.python.org/mailman/listinfo/tutor
>
