[Python-ideas] Type Hinting - Performance booster ?

Ron Adam ron3200 at gmail.com
Sat Dec 27 02:40:25 CET 2014

On 12/26/2014 02:37 PM, Antoine Pitrou wrote:
> On Fri, 26 Dec 2014 14:03:42 -0600
> Ron Adam <ron3200 at gmail.com> wrote:
>>
>> On 12/26/2014 11:13 AM, Antoine Pitrou wrote:
>>> On Wed, 24 Dec 2014 13:04:01 -0600
>>> Ron Adam <ron3200 at gmail.com> wrote:
>>>>
>>>> My thoughts is that making python easier to multi-process on multi-core
>>>> CPUs will be where the biggest performance gains will be.  Think of 100
>>>> core chips in as soon as 5 or 6 years.
>>
>>> Won't happen on mainstream computers
>>> (laptop/desktop/tablet/smartphone), as it's a totally silly thing to
>>> do there.
>>
>> Which is silly?, 100 cores,
> This :-)

Depends on what and how it's used.

>> The 5 or 6 years figure is my optimistic expectation for high end
>> workstations and servers.
> I don't see how that's optimistic. Most workloads are intrinsically
> serial, not parallel.

This is changing.  It's a feedback loop: as new hardware becomes available, 
software engineers learn to take advantage of it, and as they do, it drives 
the market... and then more hardware improvements are added by hardware 
vendors.

Hey, they need a pay check, and investors need dividends, and there are 
enough people interested to make everyone think it's worth doing. <shrug> 
It's a huge economic machine we are talking about, and it's moving forward. 
I doubt we could stop it if we wanted to.

So it's a process that is changing across the board.  If you think of one 
part of it staying the same, either the hardware or the software, the other 
parts may seem "silly", but if you think of it all changing at once, it 
starts to make more sense.

I wanted to find a good example, and the first thing that came to mind is 
the ever present web browser.  That is probably the piece of software run 
directly by more users than any other.  A quick search brought up an 
interesting talk by Jack Moffitt about Mozilla's Servo project and the Rust 
programming language.

It's a bit long but interesting.  At one point (the 18:30 to 19:45 minute 
marks) he mentions a tree design where you spawn threads for each child, 
and they spawn threads for each of their children.
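To make that pattern concrete, here is a minimal Python sketch of a 
thread-per-child tree walk (the Node class and the "double the value" work 
are my own stand-ins, nothing from the talk).  Note that in CPython the GIL 
keeps these threads from executing bytecode in parallel, which is exactly 
the limitation under discussion; in a language like Rust the same shape can 
use every core.

```python
import threading

class Node:
    def __init__(self, value, children=()):
        self.value = value
        self.children = list(children)

def process_tree(node, results, lock):
    # Spawn a thread for each child; each child spawns threads for its
    # own children, so the whole tree is walked concurrently.
    threads = [
        threading.Thread(target=process_tree, args=(child, results, lock))
        for child in node.children
    ]
    for t in threads:
        t.start()
    with lock:                          # results is shared across threads
        results.append(node.value * 2)  # stand-in for real per-node work
    for t in threads:
        t.join()

tree = Node(1, [Node(2, [Node(4), Node(5)]), Node(3)])
results = []
process_tree(tree, results, threading.Lock())
print(sorted(results))  # [2, 4, 6, 8, 10]
```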


That of course is the software engineering side of things, and then you 
also have the hardware engineering side, but both sides are actively being 
developed at the same time.

> Expecting to get a 100-core general purpose CPU
> is expecting to get something that's unfit for most daily tasks, which
> is rather pessimistic. If the industry had followed the enthusiastic
> predictions from 5 years ago, the average desktop CPU would probably
> have 16+ HW threads right now - which it doesn't: the average core count
> stagnates between 2 and 4.

Yes, I think it got off to a slower start than many expected.  And with 
only a few cores, it makes more sense to distribute processes rather than 
threads.  But I think this will change as the number of cores increases and 
the techniques to use them develop, until we have the kind of fine grained 
threading Jack Moffitt was describing.
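For example, distributing processes is already easy with the stdlib 
multiprocessing module; here is a small sketch (the cpu_bound function and 
the pool size of 4 are made up for illustration):

```python
from multiprocessing import Pool

def cpu_bound(n):
    # A stand-in for real CPU-bound work: the sum of squares below n.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    # Each worker is a separate process with its own interpreter and its
    # own GIL, so the four tasks can genuinely run on four cores.
    with Pool(processes=4) as pool:
        totals = pool.map(cpu_bound, [100_000] * 4)
    print(totals)
```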

> Sure, some specific workloads in scientific computing may benefit -
> but if I understand correctly you can already release the GIL using
> Cython, and perhaps soon using Numba.

A web browser is about as mainstream as you can get.  And its presence is 
big enough to drive the computer market wherever it goes.  ;-)

> Besides the serial nature of most workloads, there are other limits to
> multicore scalability, such as DRAM access latency and bandwidth.
> There's little point in having 100 CPU cores if they all compete for
> memory access as executing multiple threads simultaneously reduces the
> locality of accesses and therefore the efficiency of on-chip caches.

It will be interesting to see how this changes.  ;-)

