IDLEfork, now in Lib/idlelib, uses a new C extension, the 'interrupt'
module. This is a truly minimal, and generally useful, module to
provide a way to cause a KeyboardInterrupt in the main thread from a
secondary thread. I propose to move this to Modules, and make it a
top-level module. That is the easiest way to get it compiled and built
for the various platforms.
--Guido van Rossum (home page: http://www.python.org/~guido/)
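A minimal sketch of the kind of use this enables, written against
thread.interrupt_main(), which provides this functionality in the standard
thread module (Python 2.3 and later); assume the proposed 'interrupt'
module exposes something equivalent:

    # A secondary thread asks the interpreter to raise KeyboardInterrupt
    # in the main thread (Python 2 spelling; in Python 3 the module is
    # named _thread).
    import thread, time

    def watchdog(timeout):
        # runs in a secondary thread
        time.sleep(timeout)
        thread.interrupt_main()     # main thread sees KeyboardInterrupt

    thread.start_new_thread(watchdog, (5.0,))
    try:
        while 1:                    # main thread doing its normal work
            time.sleep(0.1)
    except KeyboardInterrupt:
        print "interrupted by the watchdog thread"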
What is the current procedure for changing IDLE, for Python
committers? Can I just commit changes to Lib/idlelib, do I need to
post them to idle-fork, or do I need to do something else?
[assuming, of course, that I think the change is sensible, and
I would consider applying it to be within my competence]
Some considerations on PEP 280 (Optimizing access to globals) after
my experiment.
1) In the PEP:
(Note: Jason Orendorff writes: """I implemented this once, long
ago, for Python 1.5-ish, I believe. I got it to the point where
it was only 15% slower than ordinary Python, then abandoned it.
;) In my implementation, "cells" were real first-class objects,
and "celldict" was a copy-and-hack version of dictionary. I
forget how the rest worked.""" Reference:
At least now you know there is hope :) for a speedup of some 15% at minimum
when globals/builtins are heavily involved, and a wash when they are not used.
When a function object is created from a code object and a celldict,
the function object creates an array of cell pointers by asking the
celldict for cells corresponding to the names in the code object's
co_globals. If the celldict doesn't already have a cell for a
particular name, it creates an empty one. This array of cell
pointers is stored on the function object as func_cells. When a
function object is created from a regular dict instead of a
celldict, func_cells is a NULL pointer.
This is probably far from ideal for closures; OTOH, with the right
infrastructure it should be possible to store the created caches, e.g. in
code objects, and so reuse them.
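The mechanism described above can be mimicked in pure Python. A rough
sketch, with Cell and CellDict standing in for the C-level structures the
PEP proposes (names and details here are illustrative, not the PEP's
specification):

    class Cell(object):
        # holds one value; None stands in for the C-level NULL objptr
        def __init__(self, value=None):
            self.objptr = value

    class CellDict(dict):
        # a dict whose values live in Cells, so a function object can
        # cache direct pointers to them in func_cells
        def get_cell(self, name):
            # create an empty cell on first request, so the pointer
            # stays valid even if the global is only assigned later
            if name not in self:
                dict.__setitem__(self, name, Cell())
            return dict.__getitem__(self, name)
        def __setitem__(self, name, value):
            self.get_cell(name).objptr = value
        def __getitem__(self, name):
            cell = dict.__getitem__(self, name)
            if cell.objptr is None:
                raise KeyError(name)
            return cell.objptr

    # Function creation would then do, in effect:
    #     func_cells = [celldict.get_cell(name) for name in co_globals]
    # so each global load becomes an indexed fetch plus one dereference.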
One thing that prevents the Linux Standard Base from including Python (or
Perl, for that matter) is that there is no formal language standard with
test cases. (For the LSB a subset might be enough.)
Does a rudimentary standard exist anywhere, or are there plans to create one?
Oops. Disregard the last post, it got away too soon.
| On Fri, Jun 13, 2003 at 12:50:53AM -0400, Raymond Hettinger wrote:
| | I don't see any significant downside to 246. The code is simple
| | enough. It is useful in at least some cases and provides some
| | support for interoperability.
| | So, my question is whether there is any reason not to adopt 246
| | right away? AFAICT, there is no competing proposal and nothing
| | that would be harmed by adoption. What's all the fuss about?
| On Fri, Jun 13, 2003 at 08:19:59AM -0400, Phillip J. Eby wrote:
| | Guido has said (and I agree) that if Python includes adapt(), then the
| | Python standard library should use it.
| | PyProtocols was my attempt to show that a PEP 246 mechanism can
| | actually be pretty agnostic about what kind of interface objects
| | are used, just like the '+' operator is agnostic about what
| | object types it's used with.
| | Ah well. :) On the bright side, I think PyProtocols can alleviate
| | *one* of his concerns, which was that having a Python-included interface
| | type would make other interface types (e.g. Zope interfaces) "second-class
| | citizens". That is, I've demonstrated that it is possible to have a
| | "protocol protocol", thus allowing different protocol or interface
| | types to exist, even if they have no implementation in common (e.g.
| | Twisted, Zope, and PyProtocols).
Could this approach work:
- use regular class inheritance for static requirements
- use adapt() for dynamic or custom needs (as shown below)
- let specific use cases further refine the requirements
This whole 'Interface' issue has been in the works for over
three years now, and potentially great interoperability
between frameworks and components is being lost. For example,
why not just have an 'interfaces.py' in the standard library?
The interface for iterators could be something like...
# inside interfaces.py
class Iterable:
    def next(self):
        """ returns the next value in the iteration, or
            raises StopIteration to finish up """
    def __adapt__(cls, obj):
        if hasattr(obj, 'next'):
            return obj
        # perhaps return a wrapper object
        # for other types, like lists...
    __adapt__ = classmethod(__adapt__)
Then, iter() could be defined something like...
def iter(obj):
    return adapt(Iterable, obj)
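For concreteness, a rough sketch of the lookup adapt() itself might
perform; the argument order follows the example above (PEP 246 spells it
adapt(obj, protocol)), and the PEP's exact semantics and error handling
are glossed over:

    def adapt(protocol, obj, alternate=None):
        # 1. the object may already satisfy the protocol outright
        if isinstance(protocol, type) and isinstance(obj, protocol):
            return obj
        # 2. ask the object: can you conform to this protocol?
        conform = getattr(obj, '__conform__', None)
        if conform is not None:
            result = conform(protocol)
            if result is not None:
                return result
        # 3. ask the protocol: can you adapt this object?
        adapt_hook = getattr(protocol, '__adapt__', None)
        if adapt_hook is not None:
            result = adapt_hook(obj)
            if result is not None:
                return result
        # 4. give up
        if alternate is not None:
            return alternate
        raise TypeError("object cannot be adapted to the protocol")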
At 03:42 PM 6/12/03 +0100, Moore, Paul wrote:
>From: Phillip J. Eby [mailto:email@example.com]
> > Open protocols solve the chicken and egg problem by allowing one
> > to make declarations about third-party objects.
>OK, I've tried to put together a simple example of "what I expect" and
>I find I can't. I want to continue to write code which "just assumes"
>that it gets the types it needs - I don't want to explicitly state that
>I need specific interface types - that feels like type declaration and
>Java. Your IFile example reinforces that, both in terms of its naming
>convention, and in the assumption that there *is* a single, usable,
>"file" protocol. I know it was my example, but I pointed out later in
>the same message that I really sometimes wanted seekable readline-only
>files, and other times block read/write (potentially unseekable) files.
>Expecting library writers to declare interface "classes" for every
>subtle variation of requirements seems impractical. Expecting the
>requirements to be *documented* is fine - it's having a concrete class
>which encapsulates them that I don't see happening - no-one would ever
>look to see if there was already a "standard" interface which said "I
>have readline, seek, and tell" - they'd just write a new one. There
>goes any hope of reuse. (This may be what you see as a "chicken and egg"
>problem - if so, I see it as a "hopelessly unrealistic expectations"
>problem instead, because it's never going to happen...)
Even if it's as simple as saying this:
"""You must have readline, seek, and tell if you pass this to me"""
Or, even more briefly, using Samuele's way:
RST = subproto(IFile, ['readline', 'seek', 'tell'])
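subproto() isn't an existing API; a guess at what such a helper might look
like, building an anonymous protocol whose __adapt__ accepts any object
that exposes the listed attribute names:

    def subproto(base, names):
        class SubProtocol(object):
            parent = base
            required = tuple(names)
            def __adapt__(cls, obj):
                for attr in cls.required:
                    if not hasattr(obj, attr):
                        return None     # cannot adapt
                return obj              # duck-typing: obj already conforms
            __adapt__ = classmethod(__adapt__)
        return SubProtocol

    # RST = subproto(IFile, ['readline', 'seek', 'tell'])
    # f = adapt(RST, open('data.txt'))  # a real file passes straight through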
>On the other hand, expecting callers to write stuff to adapt existing
>classes to the requirements of library routines is (IMHO) a non-issue.
>I think that's what PEP 246 was getting at in the statement that "The
>typical Python programmer is an integrator" which you quote. It's
>common to write
> class wrap_source:
> def __init__(self, source):
> self.source = source
> def read(self, n = 1024):
> return self.source.get_data(n)
>So PEP 246 is trying to make writing that sort of boilerplate easier
>(in my view). The *caller* should be calling adapt(), not the callee.
That depends. It's common in both Zope and Twisted for framework code to
do the Zope or Twisted equivalents of 'adapt()'. But yes, it is also
common to state that some function in effect requires an already adapted
object.
>Which is appropriate is basically down to which came first, string or
>file. But both suffer from the problem of putting all the knowledge
>in one location (and using a type switch as well).
Right. That's the point of the open protocols system. The problem
is that in every Python interface system I know of (except PyProtocols),
you can't declare that package A, protocol X, implies package B, protocol
Y, unless you are the *author* of package B's protocol Y.
But if package A's author used an open protocol to define protocol X, then
you can use PyProtocols today to say that it implies package B's protocol
Y, as long as protocol Y can be adapted to the declaration API.
Translation: anybody who defines protocols for their packages today using
PyProtocols, will be able to have third parties define adapters to
interfaces provided by Zope or Twisted, or that are defined using Zope or
Twisted.
>The third option, which is the "external registry" allows a *user* of
>the string and file "libraries" (bear with me here...) to say how he
>wants to make strings look like files:
>So I see PEP 246 as more or less entirely a class based mechanism. It
>has very little to do with constraining (or even documenting) the
>types of function parameters.
PEP 246 specifically said it wasn't tackling that case, however, and I
personally don't feel that a singleton of such broad scope is practical; it
seems more appropriate to make protocols responsible for their own adapter
registries, which is what PyProtocols (and to a lesser extent Twisted) do.
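A minimal sketch of a protocol that carries its own adapter registry, in
the spirit described above; the names (Protocol, register) are
illustrative, not PyProtocols' actual API, and new-style classes are
assumed:

    class Protocol(object):
        def __init__(self, name):
            self.name = name
            self._adapters = {}              # concrete class -> adapter factory
        def register(self, cls, factory):
            self._adapters[cls] = factory
        def __adapt__(self, obj):
            for cls in type(obj).__mro__:    # most specific class wins
                factory = self._adapters.get(cls)
                if factory is not None:
                    return factory(obj)
            return None

    # An integrator can then teach the protocol about a third-party type
    # without touching either library:
    #     import StringIO
    #     IReadableFile = Protocol('IReadableFile')
    #     IReadableFile.register(str, StringIO.StringIO)
    #     f = adapt(IReadableFile, "some text")   # -> a StringIO instance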
>Of course, a library writer can define interface classes, and the
>adaptation mechanism will allow concrete classes to be made to
>"conform" to those interfaces, but it's not necessary. And given
>the general apathy of the Python community towards interfaces (at
>least as far as I see) I don't imagine this being a very popular use
When designing PyProtocols I did some research on the type-sig, and other
places... I got the distinct impression that lots of Pythonistas like
interfaces, as long as they're abstract base classes. PyProtocols allows
that. But, a protocol can also be a pure symbol, e.g.:
myInterface = protocols.Protocol()
Voila. There's your interface. Let the user register what they
will. Sure, it doesn't *document* anything that isn't stated by its name,
but that's your users' problem. ;) At least they can import
'yourpackage.myInterface' and register adapters now.
>And describing a PEP 246 style facility in terms of
>interfaces could be a big turn-off. (This applies strongly to your
>PyProtocols code - I looked for something like the pep246.register
>function I suggested above, but I couldn't manage to wade through all
>the IThis and IThat stuff to find it, if it was there...)
Unfortunately, I biased the reference documentation towards explaining the
architecture and what you need to do to integrate with the framework
itself, even though 90% of the actual intended audience of PyProtocols will
never need to do anything like that. In a sense, you could say that the
current reference manual is much more of an embedding and extending manual,
instead of being the Python tutorial.
>Once again, I apologise if I've missed your point. But I hope I've
>explained what I see as the point of PEP 246, and where your proposal
>is going off in a different direction.
I understand what you're saying, although I'm not sure how you interpreted
PEP 246 that way, when it explicitly states that it *doesn't* cover case
"e". And I didn't diverge from PEP 246 in that respect. What I
effectively said is, "well, let's make it so that case "e" is irrelevant,
because there are plenty of __adapt__-capable protocols, and because you
can register the relationships between them." Thus, the solution to case
"e" is effectively:
Don't use built-in types as protocols!
>But the fact remains, that neither PEP 246 nor PyProtocols has any
>need to be in the core, or even in the standard library (yet). Wide
>usage could change that. Usage in something that is destined for the
>standard library could, too, but that one could go the other way (the
>"something" gets rejected because it depends on PEP246/PyProtocols).
Right, that was why I was asking.
>PS This is *way* off-topic for python-dev by now. I suggest that we
> leave things here. I don't have anything else worth adding...
I suppose we could always resurrect the types-sig.... Although, as Tim
Peters has frequently suggested, nothing ends discussion more quickly than
having an implementation available. :)
> From: Guido van Rossum [mailto:firstname.lastname@example.org]
> Sent: Friday, 13 June 2003 01:36 p.m.
> > [...snippage...]
> > While Python is more stable than Perl in this respect (at
> > least I have that impression), the problem is that there is
> > no fixed python language: With any new release not only bugs
> > are fixed, but also new language features are added. While
> > this makes features-to-market faster, it probably creates
> > the problems that make it hard to "standardize" python.
> > This done when it is included in the LSB (kind of):
> > The programs have to behave _identically_, independent of
> > the Python version.
> This reveals a hopelessly naive view on standardization.
Actually, it's much, much worse than naive; it's fossilizing:
no program can continue evolving. It reminds me of Microsoft's
DirectX, where all recent versions contain all previous versions
in order to replicate past versions' behaviour, right down to
behaviour caused by bugs or implementation errors.
> > > There is of course a thorough standard test suite for Python
> > Hmm. It should be somehow possible to get python (and perl)
> > into the LSB, hmm.
> > > Other than that, I expect that including Python in LSB is more a
> > > matter of political will in the LSB committee than anything else.
> > I'm not that sure; at least for LSB 2.0, which is supposed to be
> > modularized, this might actually happen. (Though probably only if
> > Perl also gets included.)
> Why would Python only be included if Perl was also included?
> As I said, this is just politics.
Tobias: What if a certain [Python] program contains code that
depends on behaviour caused by a bug, or on an "interim" feature?
Should Python become fossilized?