This mail is the consequence of a true story, a story where CPython
got defeated by JavaScript, Java, C# and Go.
def filter(rule, whatever):
    if rule.x in whatever.x:
        return True
    return False

rules = get_rules()
whatevers = get_whatevers()
cnt = 0
for rule in rules:
    for whatever in whatevers:
        if filter(rule, whatever):
            cnt = cnt + 1
print(cnt)
It's true that they didn't optimize the code, but they didn't do so
for any of the languages, so all of them paid the same cost in terms
of iterations.
for rule in rules:
    x = rule.x
    for whatever in whatevers:
        if x in whatever.x:
            cnt += 1
CPython's performance improved 3x-4x just by doing these "silly" things.
The case of the rule cache is, IMHO, very striking: we have plenty of
examples in many repositories where caching non-local variables in
locals is a widely used pattern. Why hasn't a way to do this
implicitly and by default been considered?
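For reference, the caching pattern in question usually looks like the
following minimal sketch (the function names here are made up for
illustration; only math.sqrt comes from the stdlib):

```python
import math

def slow(n):
    total = 0.0
    for i in range(n):
        # math.sqrt is a non-local name: CPython resolves the global
        # "math" and the attribute "sqrt" on every single iteration.
        total += math.sqrt(i)
    return total

def fast(n):
    sqrt = math.sqrt  # cache the lookup once in a local variable
    total = 0.0
    for i in range(n):
        total += sqrt(i)  # plain local lookup, much cheaper
    return total
```

Both functions compute exactly the same result; the only difference is
where the name resolution happens, which is precisely the manual
optimization the pattern automates by hand.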
The slowness of function calls in CPython is a recurrent topic, and it
looks like an unsolved problem.
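A small sketch of the call overhead, in the spirit of the snippet
above (names are illustrative; timings can be taken with the stdlib
timeit module and vary by machine):

```python
def check(x, container):
    # Trivial helper: the call itself dominates the cost of the test.
    return x in container

def with_call(items):
    cnt = 0
    for item in items:
        if check(3, item):  # one function call per iteration
            cnt += 1
    return cnt

def inlined(items):
    cnt = 0
    for item in items:
        if 3 in item:  # same test, no function call
            cnt += 1
    return cnt

items = [(1, 2, 3)] * 1000
assert with_call(items) == inlined(items)

# To measure the difference on your machine:
# import timeit
# timeit.timeit(lambda: with_call(items), number=1000)
# timeit.timeit(lambda: inlined(items), number=1000)
```

On CPython the inlined version is typically measurably faster, which
is exactly the kind of manual inlining the original code benefited from.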
If the default code that you can write in a language is slow, and an
alternative exists to make it faster, the language is doing something
wrong.
BTW: PyPy looks like it is immune [1]
[1] https://gist.github.com/pfreixes/d60d00761093c3bdaf29da025a004582