
Barry Warsaw wrote:
On Aug 31, 2009, at 1:15 PM, C Nulk wrote:
I am pretty sure allowing the raw email addresses to be available is going to go over like a lead balloon here. Anything, however minor, that helps protect the users'/clients' email addresses is worthwhile, whatever others think. It is fine if someone considers the obfuscation Mailman uses trivial; anything I can do to make getting the email address harder or more computationally expensive is better than giving it away. Sure, bots are out there, but if what I do slows down someone's system enough to make them look at it (and hopefully get rid of the bot), then great. At least give me the choice to do it.
Agreed.
I happened to like Barry's (?) earlier comment about the "send me this message" link. Or maybe a "send my message to the original poster" link, where you click the link, compose your message, and send it through Mailman, all without ever seeing the original sender's address. Mailman (or whatever process) can figure out the original sender and pass your message on. Yes, I know it is more work; that is why we have computers :)
The difficult part about the latter is that I hate web interfaces for reading/composing email (Gmail included). I want to use my mail reader for that!
Actually, I had more of a mailto-style link in mind that sends the message to the list (run by Mailman, naturally) and includes in the body/subject an encrypted form of the message id (provided it is unique). You would still use your mail client to read and compose. Maybe something similar to a list's listname-bounces address, but carrying the message id, could be done. Don't know. Mailman would receive your message, decrypt the message id, look up the original message, then forward your message to the original sender.
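Something along these lines, as a very rough Python sketch. The Fernet token (from the third-party cryptography package), the listname-reply address, and the archive lookup are all my assumptions here, not anything Mailman actually does:

    from email.utils import parseaddr
    from cryptography.fernet import Fernet

    SECRET_KEY = Fernet.generate_key()   # in practice, a persistent per-list key
    fernet = Fernet(SECRET_KEY)

    def reply_token(message_id):
        # Turn a Message-ID into an opaque, reversible token for the mailto link.
        return fernet.encrypt(message_id.encode()).decode()

    def mailto_link(listname, message_id):
        # Hypothetical listname-reply address; the token rides in the subject.
        token = reply_token(message_id)
        return "mailto:%s-reply@example.com?subject=reply-%s" % (listname, token)

    def forward_to_original_sender(token, archive):
        # Mailman-side step: decrypt the token, find the archived message,
        # and recover the original poster's address without ever exposing it.
        # `archive` stands in for whatever maps Message-IDs to stored messages.
        message_id = fernet.decrypt(token.encode()).decode()
        original = archive[message_id]
        return parseaddr(original["From"])[1]

The nice part of a reversible token like that is the archive never has to show the poster's address at all; the worst a harvester can scrape is an opaque blob it cannot decrypt without the list's key.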
I am not particularly fond of web interfaces for reading/composing email. Well, maybe when I travel overseas without a laptop they are minimally okay.
As for using robots.txt, hmm, it is not the legitimate search engines I care about; it is the crawlers that do not respect my robots.txt file. If I had an effective way to consistently identify those non-legitimate crawlers, I would add whatever I needed to drop them at my firewall as I recognized them.
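One rough way to do that (only a sketch, and the disallowed "honeypot" path, the combined-style access log, and the iptables call are my assumptions, not something from this thread): disallow a trap URL in robots.txt, then treat any client that fetches it anyway as a misbehaving crawler and block it.

    import re
    import subprocess

    TRAP_PATH = "/bot-trap/"          # listed under Disallow: in robots.txt
    LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+)')

    def offending_ips(logfile):
        # Collect the IPs that fetched a path robots.txt told them not to.
        ips = set()
        with open(logfile) as fh:
            for line in fh:
                m = LOG_LINE.match(line)
                if m and m.group(2).startswith(TRAP_PATH):
                    ips.add(m.group(1))
        return ips

    def block(ip):
        # "Drop them into my firewall" -- iptables is just one example.
        subprocess.run(["iptables", "-A", "INPUT", "-s", ip, "-j", "DROP"],
                       check=True)

    for ip in offending_ips("/var/log/apache2/access.log"):
        block(ip)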
Agreed. -Barry
Now, totally off-topic, does anyone have a recommendation for a book on learning Python, so I am no longer truly dangerous, just slightly?
Thanks, Chris