[Python-checkins] python/dist/src/Lib robotparser.py, 1.17, 1.17.10.1
rhettinger at users.sourceforge.net
Sat Mar 13 15:31:35 EST 2004
Update of /cvsroot/python/python/dist/src/Lib
In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv31240
Modified Files:
Tag: release23-maint
robotparser.py
Log Message:
SF patch #911431: robot.txt must be robots.txt
(Contributed by George Yoshida.)
Index: robotparser.py
===================================================================
RCS file: /cvsroot/python/python/dist/src/Lib/robotparser.py,v
retrieving revision 1.17
retrieving revision 1.17.10.1
diff -C2 -d -r1.17 -r1.17.10.1
*** robotparser.py 27 Feb 2003 20:14:40 -0000 1.17
--- robotparser.py 13 Mar 2004 20:31:33 -0000 1.17.10.1
***************
*** 84,88 ****
def parse(self, lines):
! """parse the input lines from a robot.txt file.
We allow that a user-agent: line is not preceded by
one or more blank lines."""
--- 84,88 ----
def parse(self, lines):
! """parse the input lines from a robots.txt file.
We allow that a user-agent: line is not preceded by
one or more blank lines."""
***************
*** 149,153 ****
def can_fetch(self, useragent, url):
"""using the parsed robots.txt decide if useragent can fetch url"""
! _debug("Checking robot.txt allowance for:\n user agent: %s\n url: %s" %
(useragent, url))
if self.disallow_all:
--- 149,153 ----
def can_fetch(self, useragent, url):
"""using the parsed robots.txt decide if useragent can fetch url"""
! _debug("Checking robots.txt allowance for:\n user agent: %s\n url: %s" %
(useragent, url))
if self.disallow_all:
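The two methods touched by this patch, `parse()` and `can_fetch()`, can be exercised without any network access by feeding robots.txt lines directly. A minimal sketch (using the Python 3 module path `urllib.robotparser`; in the Python 2.x tree this diff targets, the module was simply `robotparser`):

```python
from urllib.robotparser import RobotFileParser  # top-level "robotparser" in Python 2.x

# Feed robots.txt lines directly to parse() instead of fetching over HTTP.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# can_fetch(useragent, url) applies the parsed rules for the given agent.
print(rp.can_fetch("MyCrawler", "http://example.com/public/page.html"))   # True
print(rp.can_fetch("MyCrawler", "http://example.com/private/page.html"))  # False
```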