Re: [Python-checkins] r51492 - in python/branches/int_unification: Include/boolobject.h Include/intobject.h Include/longobject.h Modules/_sre.c Objects/abstract.c Objects/boolobject.c Objects/exceptions.c Objects/intobject.c Objects/listobject.c Obje
On 8/23/06, M.-A. Lemburg <mal@egenix.com> wrote:
I'm not questioning the merge itself, only its direction and that for a few reasons:
* PyInts are very fast
* They are the most used object type in Python
* Lots and lots of extensions rely on them
* PyLongs are, well, slow compared to PyInts (they are still fast compared to other arbitrary precision implementations)
* PyLongs are only rarely used in Python, mostly in the cases where the PyInt range doesn't suffice, and this use case is going to become even less important with 64-bit platforms becoming a commodity
* PyLongs are often not expected/supported by extensions
You are obviously not reading the python-3000 list. The plan is to keep all the PyInt_ APIs but make them work with the new internal representation instead (turning some of the macros into functions). So well-behaved extensions (that don't reach into the int guts, or only do so using the macros) only need to be recompiled (which they need to be anyway for Py3k). Then the only issue is speed, which we can address by adding optimizations to the long implementation similar to the optimizations we currently have in the int object.

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
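[A minimal sketch of such a compatibility layer, assuming the PyInt_ entry points simply forward to the long implementation; the bodies below are illustrative, not the actual Py3k code.]

#include "Python.h"

/* Illustrative only: PyInt_ entry points kept for source compatibility,
 * but backed by the arbitrary-precision long representation.  Extensions
 * that stick to these calls (or the corresponding macros) would only
 * need a recompile. */

PyObject *
PyInt_FromLong(long ival)
{
    return PyLong_FromLong(ival);       /* the long constructor does the work */
}

long
PyInt_AsLong(PyObject *op)
{
    /* PyLong_AsLong() raises OverflowError if the value does not fit
     * into a C long, which is the behavioural change extensions would see. */
    return PyLong_AsLong(op);
}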
Guido van Rossum wrote:
On 8/23/06, M.-A. Lemburg <mal@egenix.com> wrote:
I'm not questioning the merge itself, only its direction and that for a few reasons:
* PyInts are very fast
* They are the most used object type in Python
* Lots and lots of extensions rely on them
* PyLongs are, well, slow compared to PyInts (they are still fast compared to other arbitrary precision implementations)
* PyLongs are only rarely used in Python, mostly in the cases where the PyInt range doesn't suffice, and this use case is going to become even less important with 64-bit platforms becoming a commodity
* PyLongs are often not expected/supported by extensions
You are obviously not reading the python-3000 list.
True, though I did read the short thread that started this branch in the archives.
The plan is to keep all the PyInt_ APIs but make them work with the new internal representation instead (turning some of the macros into functions). So well-behaved extensions (that don't reach into the int guts, or only do so using the macros) only need to be recompiled (which they need to be anyway for Py3k). Then the only issue is speed, which we can address by adding optimizations to the long implementation similar to the optimizations we currently have in the int object.
I know and there's no problem with that, except for the performance issues associated with the direction you've chosen. PyInts already have all the optimizations you have on your agenda for PyLongs, so I wonder why you're trying to duplicate all this work, instead of building upon it and adding the PyLong features to PyInts. Maybe I'm missing something, but this looks like a much more natural approach to me.

The PyLong approach alone (using a digits array) will never be as fast as working with a C long directly.

--
Marc-Andre Lemburg
eGenix.com Professional Python Services directly from the Source (#1, Aug 23 2006)
On 8/23/06, M.-A. Lemburg <mal@egenix.com> wrote:
PyInts already have all the optimizations you have on your agenda for PyLongs, so I wonder why you're trying to duplicate all this work, instead of building upon it and adding the PyLong features to PyInts. Maybe I'm missing something, but this looks like a much more natural approach to me.
Since the PyLong implementation is nearly three times as large as the PyInt implementation, adding long functionality to ints would be the backwards thing to do.

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
Guido van Rossum wrote:
On 8/23/06, M.-A. Lemburg <mal@egenix.com> wrote:
PyInts already have all the optimizations you have on your agenda for PyLongs, so I wonder why you're trying to duplicate all this work, instead of building upon it and adding the PyLong features to PyInts. Maybe I'm missing something, but this looks like a much more natural approach to me.
Since the PyLong implementation is nearly three times as large as the PyInt implementation, adding long functionality to ints would be the backwards thing to do.
I was actually thinking of adding the PyLong implementation as a special case to PyInts, i.e. appending the digits array to the PyInt struct and making PyInts VarObjects. A size 0 PyInt would then be the classical PyInt; a size n (n>0) PyInt would work as a classical PyLong. One could even use a union to save sizeof(long) bytes in the PyLong situation.

--
Marc-Andre Lemburg
eGenix.com Professional Python Services directly from the Source (#1, Aug 23 2006)
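[A rough sketch of the layout proposed above, assuming the digit limb type from longobject.c and using ob_size to select the representation; the names and member layout are illustrative, not code from the branch.]

#include "Python.h"

typedef unsigned short digit;     /* assumption: the 15-bit limb type used by longs */

/* Illustrative layout only: a variable-sized int where ob_size == 0 means
 * "classical PyInt" (machine word) and ob_size == n > 0 means "classical
 * PyLong" (n digits).  The union overlays the two payloads, saving
 * sizeof(long) bytes in the long case. */
typedef struct {
    PyObject_VAR_HEAD
    union {
        long ival;                /* ob_size == 0: the plain C long value   */
        digit ob_digit[1];        /* ob_size >  0: arbitrary-precision data */
    } u;
} PyIntObject_sketch;

static long
small_value(PyIntObject_sketch *v)
{
    /* only valid in the machine-word case */
    assert(Py_SIZE(v) == 0);
    return v->u.ival;
}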
Zitat von "M.-A. Lemburg" <mal@egenix.com>:
I was actually thinking of adding the PyLong implementation as a special case to PyInts, i.e. appending the digits array to the PyInt struct and making PyInts VarObjects. A size 0 PyInt would then be the classical PyInt; a size n (n>0) PyInt would work as a classical PyLong.
That's my plan, except that the special case is the small integers, not the large ones. Before responding that you would like to see it the other way 'round, please analyze what the ACTUAL difference between your approach and mine would be.

Regards,
Martin
martin@v.loewis.de wrote:
Zitat von "M.-A. Lemburg" <mal@egenix.com>:
I was actually thinking of adding the PyLong implementation as a special case to PyInts, i.e. appending the digits array to the PyInt struct and making PyInts VarObjects. A size 0 PyInt would then be the classical PyInt; a size n (n>0) PyInt would work as a classical PyLong.
That's my plan,
Good. Looking at the checkin, it seemed to me that you were trying to use the long digit array as the basis for everything.
except that the special case is the small integers, not the large ones.
No need to fight over that detail :-)
Before responding that you would like to see it the other way 'round, please analyze what the ACTUAL difference between your approach and mine would be.
None, except less work: you now move/rename everything to the PyLong implementation, only to then rename everything back to PyInt later on.

--
Marc-Andre Lemburg
eGenix.com Professional Python Services directly from the Source (#1, Aug 23 2006)
Zitat von "M.-A. Lemburg" <mal@egenix.com>:
PyInts already have all the optimizations you have on your agenda for PyLongs, so I wonder why you're trying to duplicate all this work, instead of building upon it and adding the PyLong features to PyInts.
Because it doesn't matter. The code would be the same in the end, either way: it has to support two cases if two different internal representations are chosen.

If you worry about the efficiency of the arithmetic operations: these are TRIVIAL in intobject.c, except for the overflow handling.

What specific optimizations do you think the int implementation has?
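[For reference, the addition path in intobject.c boils down to roughly the following; this is a paraphrased sketch with the operands already unpacked to C longs and error handling abbreviated, not the literal source.]

#include "Python.h"

/* Paraphrase of int_add: the arithmetic itself is a single C operation;
 * the only real work is the overflow check and the fallback to longs. */
static PyObject *
int_add_sketch(long a, long b)
{
    long x = a + b;
    /* sign trick: if x agrees in sign with either operand, no overflow */
    if ((x ^ a) >= 0 || (x ^ b) >= 0)
        return PyInt_FromLong(x);

    /* overflow: redo the operation with arbitrary-precision longs */
    {
        PyObject *va = PyLong_FromLong(a);
        PyObject *vb = PyLong_FromLong(b);
        PyObject *res = (va && vb) ? PyNumber_Add(va, vb) : NULL;
        Py_XDECREF(va);
        Py_XDECREF(vb);
        return res;
    }
}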
Maybe I'm missing something, but this looks like a much more natural approach to me.
After reading your messages, it seems that you are missing the fact that there aren't really any optimizations in the int type, except for the custom allocator (which predates obmalloc) and the singleton cache for small ints (which is trivial to move to longs).

Regards,
Martin
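[A rough sketch of that small-int singleton cache, moved onto the long constructor; the bounds, names, and startup hook below are illustrative assumptions, not the actual implementation.]

#include "Python.h"

#define NSMALLNEGINTS   5       /* cache -5 .. -1  (illustrative bounds) */
#define NSMALLPOSINTS 257       /* cache  0 .. 256                       */

static PyObject *small_ints[NSMALLNEGINTS + NSMALLPOSINTS];

/* Assumed to be called once at interpreter startup to populate the table. */
static int
init_small_ints(void)
{
    long i;
    for (i = -NSMALLNEGINTS; i < NSMALLPOSINTS; i++) {
        PyObject *v = PyLong_FromLong(i);
        if (v == NULL)
            return -1;
        small_ints[i + NSMALLNEGINTS] = v;
    }
    return 0;
}

static PyObject *
long_from_long_cached(long ival)
{
    if (ival >= -NSMALLNEGINTS && ival < NSMALLPOSINTS) {
        PyObject *v = small_ints[ival + NSMALLNEGINTS];
        Py_INCREF(v);                   /* hand out the shared singleton */
        return v;
    }
    return PyLong_FromLong(ival);       /* everything else is allocated  */
}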
martin@v.loewis.de wrote:
Zitat von "M.-A. Lemburg" <mal@egenix.com>:
PyInts already have all the optimizations you have on your agenda for PyLongs, so I wonder why you're trying to duplicate all this work, instead of building upon it and adding the PyLong features to PyInts.
Because it doesn't matter. The code would be the same in the end, either way: it has to support two cases if two different internal representations are chosen.
If you worry about the efficiency of the arithmetic operations: these are TRIVIAL in intobject.c, except for the overflow handling.
What specific optimizations do you think the int implementation has?
Maybe I'm missing something, but this looks like a much more natural approach to me.
After reading your messages, it seems that you are missing the fact that there aren't really any optimizations in the int type, except for the custom allocator (which predates obmalloc) and the singleton cache for small ints (which is trivial to move to longs).
The optimizations are not in the int type itself, they are scattered throughout the interpreter. The key to these optimizations is that it's possible to access the C long directly via the access macro, thus allowing the compiler to generate more efficient code.

--
Marc-Andre Lemburg
eGenix.com Professional Python Services directly from the Source (#1, Aug 23 2006)
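[An example of the kind of scattered fast path meant here, paraphrased from the BINARY_ADD case in the 2.x eval loop; it is simplified to a standalone function, and v and w stand for the two stack operands.]

#include "Python.h"

/* Paraphrased BINARY_ADD fast path: the interpreter reads the C long
 * straight out of the struct via the PyInt_AS_LONG() macro, so the
 * common case needs no function call at all. */
static PyObject *
binary_add_fastpath(PyObject *v, PyObject *w)
{
    if (PyInt_CheckExact(v) && PyInt_CheckExact(w)) {
        long a = PyInt_AS_LONG(v);      /* direct field access, no call */
        long b = PyInt_AS_LONG(w);
        long x = a + b;
        /* overflow only if the result disagrees in sign with both operands */
        if (!((x ^ a) < 0 && (x ^ b) < 0))
            return PyInt_FromLong(x);
        /* otherwise fall through to the generic path below */
    }
    return PyNumber_Add(v, w);          /* type-dispatched slow path */
}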
participants (3)
- Guido van Rossum
- M.-A. Lemburg
- martin@v.loewis.de