Python and the need for speed
steve+python at pearwood.info
Tue Apr 11 10:56:33 EDT 2017
On Tue, 11 Apr 2017 07:56 pm, Brecht Machiels wrote:
> On 2017-04-11 08:19:31 +0000, Steven D'Aprano said:
>> If we're going to talk about speeding up Python, we ought to talk about
>> *serious* approaches to the problem, not the musing of random, ignorant
>> bloggers and other naive commentators.
> Hey! This random, ignorant blogger has feelings too! :-)
Hi, and welcome!
Sorry, I wasn't *specifically* referring to you, except in the sense that
you aren't a compiler expert.
The truth is, all of us in this discussion -- including me -- are "random,
ignorant commentators". I don't believe that any of us are experts at
writing high-performance language implementations.
Bart is a possible exception, for some definition of "expert" -- he claims
to have written a quite fast, moderately dynamic language, but nobody else
(that I know of) has used it.
And no offence to Bart, but from his comments and questions on the list, I
think it is fair to say that whatever knowledge he has of language design
was probably state of the art thirty years ago. Bart sometimes expresses
surprise and confusion over concepts which are common in dynamic languages
like Python, which itself is over 20 years old; Perl is even older. So I
suspect Bart's knowledge probably dates from the 70s or 80s?
That doesn't mean it is irrelevant. But it does mean that there's a lot he
is unfamiliar with.
> I don't know much about interpreter or compiler design, but I never
> claimed that speeding up CPython would simply be a matter of deleting
> some code.
No, that seems to be Chris' interpretation. My interpretation is that it
is "common sense" that it needs more than just pressing delete on a few
features to speed up an interpreter, and therefore for casual discussion
(as in a blog post) it goes without saying that removing features is only
the first step, not the only step.
> I merely suggested that approaches different from PyPy and
> other JIT compilers should be explored, since I do not feel that these
> projects are delivering satisfactory results. I am glad to see it got
> this discussion started, at the very least.
I agree! I think that it's wonderful that people and companies are willing to
invest time and money exploring the options.
But I also think that while making Python faster is a good goal to have, it
seems that for most people, and most companies, it isn't their priority.
For a company, if it costs $30,000 to build an experimental TurboPython
which may or may not solve your problems, and $40,000 to migrate to Go or
some other language that you know will solve your problem, why wouldn't
you do so?
Well, maybe because your development costs for using Go will be higher.
Maybe. But will they be higher than your maintenance costs for TurboPython?
>> some are actively maintained and
>> used in production by many people (cython, PyPy).
> Are there any statistics on PyPy usage? I'm not convinced it is being
> used widely. As far as I can tell, it really is only useful for server
> applications because of the long JIT warm-up time.
You are correct: PyPy is not designed for short scripts and other
short-lived applications. The JIT warm-up time is significant.
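To make the warm-up effect concrete, here is a minimal sketch (the function name and loop sizes are my own, for illustration): time the same pure-Python hot loop several times in a row. Under a JIT like PyPy's, the first runs include tracing and compilation, so later runs come out much faster; under CPython the times stay roughly flat.

```python
import time

def hot_loop(n):
    # Pure-Python arithmetic loop: exactly the kind of code a
    # tracing JIT speeds up dramatically -- once it has warmed up.
    total = 0
    for i in range(n):
        total += i * i
    return total

# Run the same workload repeatedly and watch the per-run time.
# Under PyPy, early runs pay the JIT compilation cost; under
# CPython, every run costs about the same.
for run in range(5):
    start = time.perf_counter()
    hot_loop(1_000_000)
    elapsed = time.perf_counter() - start
    print(f"run {run}: {elapsed:.4f}s")
```

For a script that only calls `hot_loop` once and exits, the JIT never pays for itself -- which is why PyPy shines on servers and long-running batch jobs rather than short scripts.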
>> The Python ecosystem is actually quite healthy, if you need to speed up
code there are lots of solutions, and some of them are even good.
> There seem to be no solutions for my use case (rinohtype).
Have you tried Nuitka?
> DropBox and
> Google seem to agree that there are no good solutions, since they are
> moving to Go.
That's a good solution! Maybe we should be writing extensions in Go, instead
of C. Or for maths-heavy work, using extensions written in Julia.
While there are advantages to using a single language, it is silly to
artificially limit yourself to a single language if you don't need to.
Python started life as a "glue language" for C and Fortran, and there are
implementations like Jython that exist specifically so people can call
Java libraries from Python.
A hybrid code base where the heavy lifting is done in a fast but annoying
language, and the glue and infrastructure is written in Python, is a good
solution for many problems.
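The standard mechanism for that kind of hybrid code base is loading a shared library with a C-compatible ABI. A minimal sketch using `ctypes` from the standard library -- here against the system C math library, purely for illustration, but the same mechanism works for any C-ABI shared library, including one built from Go with `go build -buildmode=c-shared`:

```python
import ctypes
import ctypes.util

# Load the system C math library.  On Windows find_library("m")
# may return None, since libm lives inside the C runtime there.
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature: double sqrt(double).  Without this,
# ctypes would default to passing and returning C ints.
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(2.0))
```

The glue stays in Python; only the hot inner routines need to live in the "fast but annoying" language.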
>> Nevertheless, it is interesting to discuss whether or not any
>> of these features will go mainstream or make it into CPython.
> Indeed! I initially wanted to include the following in the article, but
> decided it would be too controversial. But now that I've been exposed
> as an ignorant and naive blogger, I might as well share these thoughts.
> I have the feeling that a faster Python will never materialise unless
> the CPython core developers make performance a high priority.
I think you are both right and wrong.
You are right in the sense that none of the companies exploring Python
optimizers have wanted to carry the maintenance burden themselves. Their
aim is to prove the concept, then push it back into CPython and have the
core devs maintain it.
But you're also wrong, in the sense that you're focused on *revolutionary*
speed-ups rather than *evolutionary* optimizations. CPython is faster today
than it was in version 1.5, despite doing MUCH more. Each release continues
to shave off slow code and improve performance. No, this isn't likely to
make Python ten times faster, but its performance does incrementally
improve.
> I understand that high performance was never a goal in CPython
> development (and Python language design!), but recent events (DropBox,
> Google) might help to reconsider that standpoint.
It isn't as if high-performance is a requirement for all code. And it isn't
as if Python is in any serious risk of losing popularity.
No language can expect to be popular forever. Eventually Python will be as
obsolete or as niche as COBOL, Tcl or ABC. But that day is not now.
> Here's a wild idea: consider Python 3 feature-complete.
I think that will conflict with the many, many people who want Python to
have more features, and care more about them than speed.
“Cheer up,” they said, “things could be worse.” So I cheered up, and sure
enough, things got worse.