
On 7/14/01 4:32 PM, "Gerald Oskoboiny" <gerald@impressive.net> wrote:
Regarding "I think they're wrong", I respect your opinion, but the HTTP spec is the result of a decade of work on and experience with HTTP by full-time web protocol geeks...
But web geeks are not necessarily user-interface or user-experience geeks. You can build a perfectly nice system that's technically correct, but not right for the users.
What if I have a smart system on my desktop PC that is configured to prefetch any URLs it sees in my incoming email,
Given the number of viruses, spam with URLs and other garbage out there, I'd say you're foolish (bordering on stupid), but that's beside the point. It is an interesting issue, one I can't toss out easily; but I'm not convinced, either. I think it's something that needs to be hashed over, perhaps prototyped both ways so we can see the best way to tweak it.
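Just to make concrete what we're arguing about: the prefetcher Gerald describes would look something like this (a hypothetical sketch -- the message and URL below are made up for illustration, this isn't anyone's actual code):

```python
import re

# A naive prefetching agent: every URL in an incoming message gets a
# GET request, no questions asked.  Crude URL matcher for the sketch.
URL_RE = re.compile(r'https?://\S+')

def urls_to_prefetch(message_body):
    """Return every URL a prefetching agent would issue a GET for."""
    return URL_RE.findall(message_body)

msg = """Someone requested a subscription for you.
To confirm, visit http://lists.example.org/confirm?cookie=abc123
"""

# The confirmation link is indistinguishable from any other URL, so a
# prefetcher "visits" it -- and if that GET has side effects, the
# subscription is confirmed without the user ever seeing the mail.
print(urls_to_prefetch(msg))
```

That's the whole exposure in a dozen lines, which is what makes me nervous about it.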
Barry, what do you think about this instance? As the net moves more towards wireless, PDA, and mobile-phone stuff, could we be setting ourselves up for a later problem by ignoring this pre-cache issue Gerald's raised?
Gerald, does W3 have sample pages for "right" and "wrong" that can be looked at, or are we going to have to develop some? The more I think about this, the more I think it's a case where we ought to see if we can develop a prototype that follows the standards that we like, rather than toss it out at first glance. But if we can't come up with a system that we agree is 'easy enough', then we should go with what we're currently thinking.
That would allow anyone in the world to sign me up for any mailman-backed mailing list, whether or not I even see the email confirmation requests. And that would be Mailman's fault, for misusing HTTP GETs.
I disagree with this -- since, as you say, any number of places already misuse GET, that usage is fairly common. A user who sets themselves up for it should know (or be warned by their mail client) that it has the possibility to trigger events. I'd say it's more a client issue than a Mailman issue.
And, thinking about it, since GET *can* do this, it's probably wrong for W3 to push for it not to be used that way, because if things like your pre-caching system come into common use, the dark side of the net will take advantage of it, the way early virus writers took advantage of mail clients with auto-exec of .EXE's being on by default. So aren't you setting yourself up for problems by having a technology that can do this even if you deprecate it, because it sets a user expectation that's going to be broken by those wanting to take advantage of it? I've got a bad feeling about this -- your example seems to set yourself up for abuse by those looking for ways to abuse you, and that's a bigger issue than Mailman using it -- because if all of the 'white side' programs cooperate, it just encourages creation of things (like the pre-caching) that the dark side will take advantage of.
As long as GET is capable of being used this way, I'd be very careful about creating stuff that depends on "we don't want it used this way, so it won't be" -- it seems to open up avenues for attack.
Which is not a reason for mailman to ignore the standard -- but a bigger issue about whether this standard creates a perception that could come back and bite people. If people start creating services (like that pre-cache) that get tripped up by this, even if the white hats follow your advice, you're still at risk from the black hats. Isn't it better to acknowledge the capability and not create services that depend on "well behaved" systems?
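For concreteness, the distinction the spec asks for is simply that the endpoint refuses to act on a GET (hypothetical handler sketched below -- this is not Mailman's actual code, just the shape of the argument):

```python
# If the confirmation endpoint only acts on POST, a prefetcher's GET
# is harmless: it gets back a form, and nothing changes server-side.
# 'pending' stands in for the server's set of unconfirmed requests.

def handle_confirm(method, cookie, pending):
    """Confirm a subscription only on POST; a GET just shows the form."""
    if method == "GET":
        # Safe per the spec: render a page with a submit button,
        # change nothing on the server.
        return "200 form", pending
    if method == "POST" and cookie in pending:
        # The side effect happens only on an explicit POST.
        return "200 confirmed", pending - {cookie}
    return "404 unknown cookie", pending

pending = {"abc123"}
status, pending = handle_confirm("GET", "abc123", pending)   # prefetcher hit
status, pending = handle_confirm("POST", "abc123", pending)  # real user submits
```

Which is fine as far as it goes -- my point is that this only protects you when the *server* cooperates, and the black hats run the servers.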
Of course, I only do that because I don't have that prefetching thing set up... yet.)
At this point, I'd never turn on pre-fetching, since its safety depends entirely on voluntary cooperation, and you aren't in a position to police until after the fact. That's a Bad Thing in a big way.
btw, part of the reason I care about this is that I work for W3C and am currently evaluating mailman for use on lists.w3.org (to replace smartlist) and we're pretty fussy about complying with our own specs, for obvious reasons.
As you should be.
But I'd be making this argument whether or not that were the case, since it's clearly the Right Thing to do imho...
And the more I think about it, the more it's an interesting point -- but on more than one level. Has W3C considered the implications of defining a standard that depends on voluntary acceptance here? Because the service you propose is unsafe unless you can guarantee everyone you talk to is compliant, and we know how likely that's going to be. That, to me, is a much bigger issue than whether or not Mailman complies, and in fact, I could make an argument that the standard isn't acceptable if it's going to be a basis for services that can cause harm but require voluntary acceptance on the server side. By the time you figure out the server isn't complying, they've burnt down the barn and run off with the horse. That's bad.
-- Chuq Von Rospach, Internet Gnome <http://www.chuqui.com> [<chuqui@plaidworks.com> = <me@chuqui.com> = <chuq@apple.com>] Yes, yes, I've finally finished my home page. Lucky you.
Someday, we'll look back on this, laugh nervously and change the subject.