[Python-ideas] is that expensive?

Chris Angelico rosuav at gmail.com
Fri Feb 21 18:52:24 CET 2014


On Sat, Feb 22, 2014 at 4:37 AM, Liam Marsh <liam.marsh.home at gmail.com> wrote:
> Hello everyone,
> is it possible to create (or tell me its name) a command to evaluate the
> compute cost of a command/procedure?
> (in number of:   -processor operations (for 32- and 64-bit processors)
>                  -RAM read and write operations
>                  -hard disk read and write operations
>                  -eventual graphic operations)
> this may be difficult, but it would enable users to optimise their programs.

Here's a first-cut timing routine:

def how_fast_is_this(func):
    return "Fast enough."

Trust me, that's accurate enough for most cases. For anything else,
you need to be timing it in your actual code.

The forms of measurement you're asking for make no sense for most
Python functions. The best you could do would probably be to look at
the size of the compiled byte-code, but that's not going to be
particularly accurate anyway.
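For the curious, the compiled byte-code is easy to inspect; the functions
here are just illustrative, and as noted above the size tells you little
about actual run time:

```python
import dis

def short(x):
    return x + 1

def longer(x):
    total = 0
    for i in range(x):
        total += i
    return total

# Size of the compiled byte-code, in bytes.
print(len(short.__code__.co_code))
print(len(longer.__code__.co_code))

# dis shows the individual byte-code instructions.
dis.dis(short)
```

The loop in longer() compiles to more instructions than the one-liner, but
that says nothing about how many times each instruction executes.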

No, the only way to profile your code is to put timing points in your
actual code. You may want to try the timeit module for some help with
that, but the simplest is to just pepper your code with calls to
time.time().
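A minimal sketch of both approaches, with a placeholder function standing in
for whatever you actually want to measure:

```python
import time
import timeit

def work():
    # Placeholder for the code you actually want to measure.
    return sum(i * i for i in range(10000))

# Crude: wrap the call in time.time() timing points.
start = time.time()
work()
print("elapsed:", time.time() - start, "seconds")

# Better: timeit runs the call many times and reports the total,
# which smooths out one-off noise.
print("timeit:", timeit.timeit(work, number=100), "seconds for 100 runs")
```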

I hope there never is a function for counting processor instructions
required for a particular Python function. Apart from being nearly
impossible to calculate, it'd almost never be used correctly. People
would warp their code around using "the one with the smaller number",
when high level languages these days should be written primarily with
a view to being readable by a human. Make your code look right and act
right, and worry about how fast it is only when you have evidence that
it really isn't fast enough.

Incidentally, the newer Python versions will tend to be faster than
older ones, because the developers of Python itself care about
performance. A bit of time spent optimizing CPython will improve
execution time of every Python script, but a bit of time spent
optimizing your one script improves only that one script.

For further help with optimizing scripts, ask on python-list. We can
help with back-of-the-envelope calculations (if you're concerned that
your server can't handle X network requests a second, first ascertain
whether your server's network connection can feed it that many a
second - that exact question came up on the list a few months ago),
and also with tips and tricks when you come to the optimizing itself.
And who knows, maybe we can save you a huge amount of time...
programmer time, which is usually more expensive than processor time
:)
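A back-of-the-envelope check like the one above can be a few lines of
arithmetic; the bandwidth and request size here are made-up numbers, so
substitute your own:

```python
# Hypothetical figures - adjust for your own setup.
link_speed_bits_per_sec = 100 * 1000 * 1000  # 100 Mbit/s uplink
avg_request_bytes = 2000                     # request + response on the wire

# 8 bits per byte; this is a ceiling, ignoring protocol overhead.
max_requests_per_sec = link_speed_bits_per_sec / (avg_request_bytes * 8)
print("The link tops out around %d requests/second" % max_requests_per_sec)
```

If that ceiling is already below your target X, no amount of code
optimization will help; the bottleneck is the network.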

ChrisA
