automatically download pages which need a cookie

shagshag13 shagshag13 at yahooPLUSDESPAM.fr
Tue Jul 1 04:28:55 EDT 2003


Hello,

I would like to automatically download HTML pages that only become available
after a cookie has been set (in response to a form login), but I can't manage
to do it. Does anyone have a cut-and-paste example somewhere? Thanks for your
help.
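
What I'm after, in outline, is: POST the login form once so the server sets
its cookie, then fetch the protected pages through the same cookie-aware
opener. Something like this rough sketch, assuming ClientCookie.urlopen
accepts the same (url, data) arguments as urllib2.urlopen (the URLs and form
field names here are made up):

import urllib
import ClientCookie

# Step 1: POST the login form; the default ClientCookie opener should
# remember whatever cookie the server sets in its response.
login_data = urllib.urlencode({"login": "aaa", "password": "bbb"})
ClientCookie.urlopen("http://www.example.com/login", login_data).read()

# Step 2: later requests through ClientCookie.urlopen should send the
# cookie back automatically, so the protected page is now reachable.
page = ClientCookie.urlopen("http://www.example.com/private/page.html").read()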

I've tried ClientCookie but didn't manage to understand how it works.
I also tried PyCurl with the following code:

>>> import pycurl
>>> import urllib
>>> curl = pycurl.init()
>>> curl.setopt(pycurl.URL, mywebsite)
>>> dataread = "login=aaa&password=bbb"
>>> curl.setopt(pycurl.POST, 1)
>>> curl.setopt(pycurl.POSTFIELDS, dataread)
>>> curl.setopt(pycurl.COOKIEJAR, 'my.txt')
>>> curl.perform()

but I get an error that I don't really understand:

Traceback (most recent call last):
  File "<pyshell#21>", line 1, in ?
    curl.perform()
error: (23, 'Failed writing body')
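
From the libcurl docs, error 23 seems to be a "write error": if no write
callback is set, pycurl apparently tries to write the response body to
stdout, and that can fail under IDLE (hence the <pyshell#21> in the
traceback). My best guess at a fix is to supply a WRITEFUNCTION and collect
the body in a buffer instead; the COOKIEFILE line is my guess at how to get
the saved cookie sent back on later requests (mywebsite is the same
placeholder as above):

import pycurl
import StringIO

buf = StringIO.StringIO()

curl = pycurl.init()
curl.setopt(pycurl.URL, mywebsite)            # same placeholder URL as above
curl.setopt(pycurl.POST, 1)
curl.setopt(pycurl.POSTFIELDS, "login=aaa&password=bbb")
curl.setopt(pycurl.COOKIEJAR, 'my.txt')       # save cookies received here
curl.setopt(pycurl.COOKIEFILE, 'my.txt')      # send them back on later requests
curl.setopt(pycurl.WRITEFUNCTION, buf.write)  # collect the body instead of stdout
curl.perform()
curl.close()

print buf.getvalue()

Is that the right way to go, or am I missing something simpler?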