On Thu, May 18, 2017 at 12:07:05AM -0400, tritium-list@sdamon.com wrote:
At the cost of a slight inefficiency, you could use the pure Python equivalent given in the docs:
https://docs.python.org/3/library/fnmatch.html#fnmatch.filter
fnmatch.filter(names, pattern)
Return the subset of the list of names that match pattern. It is the same as [n for n in names if fnmatch(n, pattern)], but implemented more efficiently.
So your filter_false is:
[n for n in names if not fnmatch(n, pattern)]
which avoids the need for the copy-and-paste anti-pattern.
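To make the pair concrete, here is a minimal sketch (the file names and pattern are made up for illustration) showing the positive filter next to the negated listcomp:

```python
from fnmatch import fnmatch

names = ["report.txt", "notes.md", "data.csv", "draft.txt"]
pattern = "*.txt"

# Equivalent to fnmatch.filter(names, pattern):
matching = [n for n in names if fnmatch(n, pattern)]

# The "filter_false" variant: names that do NOT match the pattern.
non_matching = [n for n in names if not fnmatch(n, pattern)]

print(matching)      # ['report.txt', 'draft.txt']
print(non_matching)  # ['notes.md', 'data.csv']
```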
I ran a test on the same dataset using listcomps. The modified version of filter is still 22.<some noise>, the unmodified version is still 19.<some noise>. However, the listcomp method is 41.<some noise>. That performance difference is important, at least to me.
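For anyone wanting to reproduce a comparison like this, a micro-benchmark along these lines could be used (the dataset here is invented; the original post's dataset, units, and file counts are not given):

```python
import timeit
from fnmatch import fnmatch

# Hypothetical dataset: 1000 matching and 1000 non-matching names.
names = ["file%d.txt" % i for i in range(1000)] + \
        ["file%d.log" % i for i in range(1000)]
pattern = "*.txt"

# Time the negated listcomp; repeat/number would be tuned in practice.
t = timeit.timeit(
    "[n for n in names if not fnmatch(n, pattern)]",
    globals={"fnmatch": fnmatch, "names": names, "pattern": pattern},
    number=10,
)
print("listcomp filter_false:", t, "seconds for 10 runs")
```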
41 what? Nanoseconds? Hours? For how many files? *wink*

In any case, are you running on Linux or Unix? If so, try replacing fnmatch with fnmatchcase, since that avoids calling os.path.normcase (a no-op on Linux) twice for each file.

If you want to reduce the number of function calls even more, try this untested code:

    # using an implementation detail is a bit naughty
    match = fnmatch._compile_pattern(pattern).match
    results = [n for n in names if match(n) is None]

-- 
Steve
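For reference, the same hoisting can be sketched without touching the private _compile_pattern helper, using the public fnmatch.translate to compile the glob to a regex once (names and pattern are again made up; note this skips the os.path.normcase normalisation, so it behaves like fnmatchcase):

```python
import re
from fnmatch import translate

names = ["report.txt", "notes.md", "data.csv"]
pattern = "*.txt"

# Compile the glob once, then reuse the bound match method in the loop.
match = re.compile(translate(pattern)).match

# Keep the names that do NOT match, as in the filter_false discussion.
results = [n for n in names if match(n) is None]
print(results)  # ['notes.md', 'data.csv']
```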