On 17 Jul 2019, at 05:17, Nam Nguyen <bitsink@gmail.com> wrote:
On Tue, Jul 16, 2019 at 1:18 PM Barry <barry@barrys-emacs.org> wrote:
On 16 Jul 2019, at 04:47, Andrew Barnert via Python-ideas <python-ideas@python.org> wrote:
How often do you need to parse a million URLs in your inner loop?
As it happens, I work on code that would be impacted by such a slowdown. It runs in production 24x7 and parses URLs on its critical path.
It does not have to be either/or ;). In general, I would prefer security/correctness over performance.
Of course.
But if your use cases call for performance, it is perfectly fine to understand the tradeoffs, and opt in to the more appropriate solutions. And, of course, maybe there is a solution that could satisfy *both*.
Generally speaking, though, do you see 1 millisecond spent on parsing a URL as a deal breaker? I sense that some web frameworks might not like that very much, but I don't have any concrete use case to quote.
Yes, 1ms would be a serious issue. I guess what I'm concerned about is the use of a universal parser, for a benefit I'm not clear exists, having a terrible effect on the speed of code that is already secure and correct.
Barry
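
As a rough baseline for the numbers being discussed, here is an illustrative timeit sketch of the stdlib's urlsplit(); the sample URLs and loop counts are arbitrary, and the absolute figures will vary by machine and Python version:

    # Illustrative micro-benchmark of stdlib URL parsing (numbers vary by
    # machine and Python version; the URLs below are arbitrary examples).
    from timeit import timeit
    from urllib.parse import urlsplit

    # Use distinct URLs so urllib.parse's small internal result cache
    # doesn't make repeated parses artificially cheap.
    urls = [f"https://example.com/path/{i}?q={i}&lang=en#frag" for i in range(1000)]

    repeats = 100
    total = timeit(lambda: [urlsplit(u) for u in urls], number=repeats)
    per_call_us = total / (repeats * len(urls)) * 1e6
    print(f"~{per_call_us:.2f} microseconds per urlsplit() call")

On typical hardware this lands in the low single-digit microseconds per call, which gives a concrete sense of how large a regression a parser costing on the order of 1ms per URL would be for code with URL parsing on its critical path.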