Filtering web proxy

Oleg Broytmann phd at phd.russ.ru
Mon Apr 17 16:20:02 CEST 2000


Hello!

   I want a filtering web proxy. I can write one myself, but if such a
thing already exists... well, I don't want to reinvent the wheel. If there is
such a thing (free and open source, 'course), I'll extend it for my needs.

   Well, what are my needs? I am a brain-damaged idiot who does not want to
see banners, pop-ups and all that crap, so I run Navigator with graphics/Java/
JavaScript/cookies turned off 99% of the time. But for some sites I need
to go to the preferences dialog and turn them on manually, then turn them
back off before visiting another site. (I need frames, tables and other
features, so I am not using lynx; yes, I know there are "links" and "w3m",
and yes, I use them from time to time; anyway, I run Navigator.)
   It is tiresome. I want to automate the task.

   So I want to leave graphics/scripts/cookies permanently on, but filter
the crap in the proxy. For 99% of sites I'll just remove the junk. For some
sites (a list of which I'll maintain by hand) the proxy will pass HTML,
graphics and cookies unchanged (or a bit modified, 'cause in any case I
don't want to eat Doubleclick's cookies attached to their ads).
   I once tried Junkbuster, but found it inadequate. My need is simple:
just a short list of "white" sites. Never tried adzapper...
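   The decision logic I have in mind is tiny. A minimal sketch (the
WHITELIST and AD_HOSTS values below are made-up examples, not a real
configuration):

```python
# Hypothetical hand-maintained list of "white" sites passed unchanged,
# plus patterns for ad hosts whose cookies should be eaten by the proxy.
WHITELIST = {"phd.russ.ru", "www.python.org"}
AD_HOSTS = ("doubleclick.net", "ads.")

def pass_unfiltered(host):
    """True if the site is on the short 'white' list."""
    return host in WHITELIST

def strip_ad_cookies(set_cookie_headers, origin_host):
    """Drop Set-Cookie headers coming from known ad hosts."""
    if any(pattern in origin_host for pattern in AD_HOSTS):
        return []  # eat Doubleclick's cookies
    return set_cookie_headers
```

Everything else in the proxy is plumbing around these two checks.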

   I have written a dozen HTML parsers in Python, so I can write one more
and turn it into a proxy, but maybe I can start with some already-debugged
code?
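   The parser-as-filter part is straightforward. A sketch of the idea
using the standard library's HTML parser (today that is html.parser;
back then it would have been sgmllib/htmllib): re-emit the document,
dropping <img> tags and <script> blocks.

```python
from html.parser import HTMLParser

class JunkFilter(HTMLParser):
    """Re-emit HTML unchanged, except drop <img> tags and <script> blocks."""

    def __init__(self):
        super().__init__()
        self.out = []        # filtered output fragments
        self.in_script = 0   # nesting depth inside <script>

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script += 1
        elif tag != "img" and not self.in_script:
            attr_text = "".join(
                ' %s="%s"' % (k, v if v is not None else "") for k, v in attrs
            )
            self.out.append("<%s%s>" % (tag, attr_text))

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = max(0, self.in_script - 1)
        elif not self.in_script:
            self.out.append("</%s>" % tag)

    def handle_data(self, data):
        if not self.in_script:
            self.out.append(data)

def strip_junk(html):
    """Filter one HTML document, returning the cleaned text."""
    f = JunkFilter()
    f.feed(html)
    f.close()
    return "".join(f.out)
```

Bolting this onto proxy plumbing (and skipping it for whitelisted hosts)
is the part I'd rather take from existing, debugged code.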

Oleg.
---- 
     Oleg Broytmann          http://phd.russ.ru/~phd/          phd2 at mail.com
           Programmers don't die, they just GOSUB without RETURN.
