[Python-Dev] robots exclusion file on the buildbot pages?
exarkun at twistedmatrix.com
Sat May 15 19:57:28 CEST 2010
On 05:48 pm, solipsis at pitrou.net wrote:
>
>Hello,
>
>The buildbots are sometimes subject to a flood of "svn exception"
>errors. It has been conjectured that these errors are caused by Web
>crawlers pressing "force build" buttons without filling any of the
>fields (of course, the fact that we get such ugly errors in the
>buildbot results, rather than a clean error message when pressing
>the button, is a buildbot bug in itself). Couldn't we simply exclude
>all crawlers from the buildbot Web pages?
Most (all?) legitimate crawlers won't submit forms. Do you think
there's a non-form link to the force build URL (which _will_ accept a
GET request to mean the same thing as a POST)?
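Just to make the failure mode concrete, here is a rough sketch of what a
link-following client ends up doing.  The URL and the field names below
are made up for illustration; they are not taken from our actual buildbot
configuration, only meant to show that a bare GET with empty fields is
treated the same as submitting the form:

    import urllib.parse
    import urllib.request

    # Hypothetical force-build URL for some builder; a crawler that simply
    # follows this link sends a GET with empty fields, which the old
    # buildbot web UI treats like a form submission.
    url = "http://buildbot.example/builders/some-builder/force"
    query = urllib.parse.urlencode({"username": "", "comments": ""})
    urllib.request.urlopen(url + "?" + query)  # triggers a build with empty fields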
One thing I have noticed is that spammers find these forms and submit
them with garbage. We can probably suppose that such people are going
to ignore a robots.txt file.
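That said, shutting out the well-behaved crawlers is cheap.  A robots.txt
served from the root of the buildbot site would only need something like
the following (a blanket rule; one could instead list just the builder
pages, depending on how the site is laid out):

    User-agent: *
    Disallow: /

It won't stop the spammers, but it should at least keep search engines
from wandering onto the force build pages.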
Jean-Paul