Urllib's urlopen and urlretrieve
davea at davea.name
Thu Feb 21 16:56:15 CET 2013
On 02/21/2013 07:12 AM, qoresucks at gmail.com wrote:
> I only just started Python and given that I know nothing about network programming or internet programming of any kind really, I thought it would be interesting to try write something that could create an archive of a website for myself.
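For reference, here is a minimal, self-contained sketch of the two urllib calls the subject line mentions. It uses a data: URL so it runs without network access; for a real page you would pass an http(s) address instead.

```python
import urllib.request

# A data: URL stands in for a real web address in this sketch.
url = "data:text/plain,Hello%2C%20world"

# urlopen returns a file-like response object you can read from.
with urllib.request.urlopen(url) as resp:
    body = resp.read()
print(body)  # b'Hello, world'

# urlretrieve saves the resource to a local file and returns
# the path along with the response headers.
path, headers = urllib.request.urlretrieve(url, "page.txt")
with open(path, "rb") as f:
    saved = f.read()
print(saved)  # b'Hello, world'
```

Note that these calls fetch only what the web server serves to a browser, which is part of why rsync is suggested below for archiving your own site.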
Please send your emails as plain text, not HTML; this is a text-based mailing list.
To archive your website, use the rsync command. No need to write any
code, as rsync will descend into all the directories as needed, and
it'll get the actual website data, not the stuff that the web server
feeds to the browsers.
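A typical invocation might look like the following; the hostname and paths are placeholders, and this assumes you have SSH access to the server:

```shell
# Mirror the remote document root into a local archive directory.
# -a  archive mode: recurse and preserve permissions, times, symlinks
# -v  verbose output
# -z  compress data in transit
# --delete  remove local files that no longer exist on the server
rsync -avz --delete user@example.com:/var/www/mysite/ ~/site-archive/
```

On later runs rsync transfers only the files that have changed, which is the advantage over scp mentioned below.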
If for some reason you don't have rsync, you could use scp. But it
doesn't seem to be able to preserve attributes. It's also not smart
enough to copy only the files that have changed when you want to update
the archive.
More information about the Python-list mailing list