On Sat, Jul 14, 2001 at 08:49:34PM -0700, Chuq Von Rospach wrote:
What if I have a smart system on my desktop PC that is configured to prefetch any URLs it sees in my incoming email,
Given the number of viruses, spam with URLs and other garbage out there, I'd say you're foolish (bordering on stupid), but that's beside the point. It is an interesting issue, one I can't toss out easily; but I'm not convinced, either. I think it's something that needs to be hashed over, perhaps prototyped both ways so we can see the best way to tweak it.
Barry, what do you think about this instance? As the net moves more towards wireless, PDA, and mobile-phone stuff, could we be setting ourselves up for a later problem by ignoring this pre-cache issue Gerald's raised?
What *I* think is that it's a special case, and any such pre-fetch system ought to, by default, *not* pre-fetch anything with GET parameters in it.
*All* GETs have side effects by definition: you get something different depending on what the parameters are.
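To make that default concrete: a prefetcher built that way would simply skip anything carrying a query string. A rough Python sketch -- purely hypothetical, the function name and example URLs are mine, not any real client's:

    from urllib.parse import urlparse

    def safe_to_prefetch(url):
        # Conservative default: only prefetch plain http/https URLs
        # with no GET parameters, since a query string may mean the
        # fetch itself triggers something on the server.
        parts = urlparse(url)
        return parts.scheme in ("http", "https") and not parts.query

    # A confirmation link gets left alone...
    assert not safe_to_prefetch("http://lists.example.org/confirm?cookie=abc123")
    # ...but a plain document URL would still be prefetched.
    assert safe_to_prefetch("http://lists.example.org/listinfo")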
That would allow anyone in the world to sign me up for any Mailman-backed mailing list, whether or not I even see the email confirmation requests. And that would be Mailman's fault, for misusing HTTP GETs.
I disagree with this -- since, as you say, any number of places already misuse GET, that usage is fairly common. A user who sets themselves up for it should know (or be warned by their mail client) that it can trigger events. I'd say it's more a client issue than a Mailman issue.
Concur. And I base my opinion on 15 years of systems design experience, FWIW.
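And for what it's worth, the server-side version of "safe" is the one the HTTP spec intends anyway: GET only renders the confirmation page, and nothing changes until somebody deliberately POSTs the form on it. A minimal WSGI sketch of the idea -- hypothetical, and certainly not Mailman's actual handler:

    from wsgiref.simple_server import make_server

    def confirm_app(environ, start_response):
        # GET is harmless here: it only renders the form, so a
        # prefetcher or link checker that fetches the URL changes
        # nothing on the server.
        if environ["REQUEST_METHOD"] != "POST":
            start_response("200 OK", [("Content-Type", "text/html")])
            return [b'<form method="post">'
                    b'<button type="submit">Confirm subscription</button>'
                    b'</form>']
        # Only a deliberate POST performs the state change.
        # (Verifying the confirmation cookie is elided.)
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"Subscription confirmed.\n"]

    if __name__ == "__main__":
        make_server("", 8000, confirm_app).serve_forever()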
And, thinking about it, since GET *can* do this, it's probably wrong for the W3C to push for it not to be used that way, because if things like your pre-caching system come into common use, the dark side of the net will take advantage of it, the way early virus writers took advantage of mail clients with auto-exec of .EXEs turned on by default.

So aren't you setting yourself up for problems by having a technology that can do this, even if you deprecate that use, because it sets a user expectation that's going to be broken by those wanting to take advantage of it? I've got a bad feeling about this -- your example seems to set you up for abuse by those looking for ways to abuse you, and that's a bigger issue than Mailman using it -- because if all of the 'white side' programs cooperate, it just encourages creation of things (like the pre-caching) that the dark side will take advantage of.
Well put, young pilot.
(Of course, I only do that because I don't have that prefetching thing set up... yet.)
At this point, I'd never turn on pre-fetching, since its safety depends entirely on voluntary cooperation, and you aren't in a position to police it until after the fact. That's a Bad Thing in a big way.
Well, yeah, but you don't have a palmtop, either, Chuq, right? :-)
But I'd be making this argument whether or not that were the case, since it's clearly the Right Thing to do imho...
And the more I think about it, the more it's an interesting point -- but on more than one level. Has the W3C considered the implications of defining a standard that depends on voluntary acceptance here? Because the service you propose is unsafe unless you can guarantee everyone you talk to is compliant, and we know how likely that's going to be. That, to me, is a much bigger issue than whether or not Mailman complies, and in fact, I could make an argument that the standard isn't acceptable if it's going to be a basis for services that can cause harm but require voluntary acceptance on the server side. By the time you figure out the server isn't complying, they've burnt down the barn and run off with the horse. That's bad.
I can't find a thing to argue with here; let's see what he comes up with...
Cheers,
-- jra
Jay R. Ashworth                                   jra@baylink.com
Member of the Technical Staff    Baylink          RFC 2100
The Suncoast Freenet             The Things I Think
Tampa Bay, Florida               http://baylink.pitas.com    +1 727 804 5015
OS X: Because making Unix user-friendly was easier than debugging Windows -- Simon Slavin in a.f.c