tjreedy at udel.edu
Sat Aug 2 03:36:15 CEST 2008
> OK, that sounds stupid. Anyway, I've been learning Python for some
> time now, and am currently having fun with the urllib and urllib2
> modules, but have run into a problem(?) - is there any way to fetch
> (urllib.urlretrieve) files from a server without knowing the filenames?
> For instance, there is something like folder/spam.egg,
> folder/unpredictable.egg and so on. If not, perhaps some kind of glob
> to create a list of existing files? I'd really appreciate some help,
> since I'm really out of my (newb) depth here.
If you are asking whether servers will let you go fishing around their
file systems, the answer is that http is not designed for that (whereas
ftp is, as long as you stay under the main ftp directory). You can try
random file names, but the server may get unhappy and think you are
trying to break in through a back door or something. You are *expected*
to start at ..../index.html and proceed from the links given there, or
to use a valid filename that was retrieved by that method.
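To make that concrete, here is a minimal sketch of the "start at the
index page and follow its links" approach: fetch the page, pull out the
hrefs, then retrieve each one. The thread discusses the Python 2
urllib/urllib2 modules; this sketch uses the Python 3 stdlib names
(html.parser, urllib.parse, urllib.request), and the names BASE_URL and
extract_links are just illustrative.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkExtractor(HTMLParser):
    """Collects the href attribute of every <a> tag it sees."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html, base_url=""):
    """Return the absolute URLs of all links found in an HTML page."""
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(base_url, href) for href in parser.links]


# Against a real server you would first fetch the index page, e.g.:
#   import urllib.request
#   html = urllib.request.urlopen(BASE_URL + "index.html").read().decode()
#   for url in extract_links(html, BASE_URL):
#       urllib.request.urlretrieve(url, url.rsplit("/", 1)[-1])
```

This only finds files the server chooses to link to, which is exactly
the point of the reply above: there is no general way to glob a remote
http directory, only to follow what the index page exposes.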