Hi.
[Mark Hammond]
> The point isn't about my suffering as such. The point is more that
> python-dev owns a tiny amount of the code out there, and I don't believe we
> should put Python's users through this.
>
> Sure - I would be happy to "upgrade" all the win32all code, no problem. I
> am also happy to live in the bleeding edge and take some pain that will
> cause.
>
> The issue is simply the user base, and giving Python a reputation of not
> being able to painlessly upgrade even dot revisions.
I agree with all this.
[As I imagined, explicit syntax did not catch on and would require a lot of
discussion.]
[GvR]
> > Another way is to use special rules
> > (similar to those for class defs), e.g. having
> >
> > <frag>
> > y=3
> > def f():
> >     exec "y=2"
> >     def g():
> >         return y
> >     return g()
> >
> > print f()
> > </frag>
> >
> > # print 3.
> >
> > Is that confusing for users? Maybe they will more naturally expect 2
> > as the outcome (given nested scopes).
>
> This seems the best compromise to me. It will lead to the least
> broken code, because this is the behavior that we had before nested
> scopes! It is also quite easy to implement given the current
> implementation, I believe.
>
> Maybe we could introduce a warning rather than an error for this
> situation though, because even if this behavior is clearly documented,
> it will still be confusing to some, so it is better if we outlaw it in
> some future version.
>
Yes, this may be easy to implement, but more confusing situations can arise:
<frag>
y=3
def f():
    y=9
    exec "y=2"
    def g():
        return y
    return y,g()

print f()
</frag>
What should this print? Unlike class-def scopes, this situation does not
have a canonical solution.
Or:
<frag>
def f():
    from foo import *
    def g():
        return y
    return g()

print f()
</frag>
[Mark Hammond]
> > This probably won't be a very popular suggestion, but how about pulling
> > nested scopes (I assume they are at the root of the problem)
> > until this can be solved cleanly?
>
> Agreed. While I think nested scopes are kinda cool, I have lived without
> them, and really without missing them, for years. At the moment the cure
> appears worse than the symptoms in at least a few cases. If nothing else,
> it compromises the elegant simplicity of Python that drew me here in the
> first place!
>
> Assuming that people really _do_ want this feature, IMO the bar should be
> raised so there are _zero_ backward compatibility issues.
I'm not saying anything about pulling nested scopes (I don't think my
opinion can change things in this respect), but I do insist that, without
explicit syntax, raising the bar IMO has too high an implementation cost
(both performance and complexity) or creates confusion.
[Andrew Kuchling]
> >Assuming that people really _do_ want this feature, IMO the bar should be
> >raised so there are _zero_ backward compatibility issues.
>
> Even at the cost of additional implementation complexity? At the cost
> of having to learn "scopes are nested, unless you do these two things
> in which case they're not"?
>
> Let's not waffle. If nested scopes are worth doing, they're worth
> breaking code. Either leave exec and from..import illegal, or back
> out nested scopes, or think of some better solution, but let's not
> introduce complicated backward compatibility hacks.
IMO breaking code would be OK if we issue warnings today and have nested
scopes issue errors tomorrow. But this is simply a statement about
principles and the impression we give.
IMO import * in an inner scope should end up being an error; I'm not sure
about 'exec'.
We will need a final BDFL statement.
regards, Samuele Pedroni.
Looking at a bug report Fred forwarded, I realized that after
py-howto.sourceforge.net was set up, www.python.org/doc/howto was
never changed to redirect to the SF site instead. As of this
afternoon, that's now done; links on www.python.org have been updated,
and I've added the redirect.
Question: is it worth blowing away the doc/howto/ tree now, or should
it just be left there, inaccessible, until work on www.python.org
resumes?
--amk
Hi.
While writing nested scopes support for Jython (it now passes test_scope and
test_future <wink>), I have come across these further corner cases for
nested scopes mixed with global declarations. I have tried them with Python
2.1b1, and I wonder whether the results are consistent with the proposed
rule: a free variable is bound according to the nearest outer-scope binding
(assignment-like or global decl); class scopes are ignored for this (for
backward compatibility).
(I)

from __future__ import nested_scopes

x='top'
def ta():
    global x
    def tata():
        exec "x=1" in locals()
        return x  # LOAD_NAME
    return tata

print ta()()

This prints 1; I believed it should print 'top' and that a LOAD_GLOBAL
should have been produced. In this case the global binding is somehow
ignored. Note: putting a global decl in tata, or removing the exec (one or
the other, not both), makes tata deliver 'top' as I expected (LOAD_GLOBALs
are emitted). Is this a bug, or am I missing something?
(II)

from __future__ import nested_scopes

x='top'
def ta():
    x='ta'
    class A:
        global x
        def tata(self):
            return x  # LOAD_GLOBAL
    return A

print ta()().tata()  # -> 'top'

Should the global decl in class scope not be ignored, so that x is bound to
the x in ta, resulting in 'ta' as output? If one substitutes x='A' for the
global x, that's what happens.
Or should only local bindings in class scope be ignored, but not global
decls?
regards, Samuele Pedroni
I understand the issue of "default Unicode encoding" is a loaded one;
however, I believe that with the Windows file system we may be able to use
a default.
Windows provides 2 versions of many functions that accept "strings" - one
that uses "char *" arguments, and another using "wchar *" for Unicode.
Interestingly, the "char *" versions of these functions almost always
support "mbcs" encoded strings.
To make Python work nicely with the file system, we really should handle
Unicode characters somehow. It is not too uncommon to find that the
"program files" or the "user" directory has Unicode characters in
non-English versions of Win2k.
The way I see it, to fix this we have 2 basic choices when a Unicode object
is passed as a filename:
* we call the Unicode versions of the CRTL.
* we auto-encode using the "mbcs" encoding, and still call the non-Unicode
versions of the CRTL.
The first option has a problem in that determining what Unicode support
Windows 95/98 have may be more trouble than it is worth. Sticking to purely
ascii versions of the functions means that the worst thing that can happen
is we get a regular file-system error if an mbcs encoded string is passed on
a non-Unicode platform.
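For illustration only, here is a rough Python-level sketch of what option 2
amounts to (the real change would live in the C file-handling code; the
helper name and the sample path below are made up):

import sys

def _fs_name(name):
    # Hypothetical helper: on Windows, encode Unicode filenames with "mbcs"
    # before handing them to the narrow "char *" APIs; leave plain strings
    # alone.
    if sys.platform == "win32" and isinstance(name, unicode):
        return name.encode("mbcs")
    return name

# e.g.  open(_fs_name(u"C:\\Programme\\bl\u00e4h.txt"), "r")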
Does anyone have any objections to this scheme or see any drawbacks in it?
If not, I'll knock up a patch...
Mark.
I'm still trying to sort this out. Some concerns and questions:
I don't like the new MatchFilename, because it triggers on *all* platforms
that #define HAVE_DIRENT_H.
Anyone, doesn't that trigger on straight Linux systems too (all I know is
that it's part of the Single UNIX Specification)?
I don't like it because it implements a woefully inefficient algorithm: it
cycles through the entire directory looking for a case-sensitive match. But
there can be hundreds of .py files in a directory, and on average it will
need to look at half of them, while if this triggers on straight Linux
there's no need to look at *any* of them there. I also don't like it because
it apparently triggers on Cygwin too, but the code that calls it doesn't
cater to the possibility that Cygwin *should* be defining ALTSEP as well as
SEP.
Would rather dump MatchFilename and rewrite in terms of the old check_case
(which should run much quicker, and already comes in several appropriate
platform-aware versions -- and I clearly minimize the chance of breakage if I
stick to that time-tested code).
Steven, there is a "#ifdef macintosh" version of check_case already. Will
that or won't that work correctly on your variant of Mac? If not, would you
please supply a version that does (along with the #ifdef'ery needed to
recognize your Mac variant)?
Jason, I *assume* that the existing "#if defined(MS_WIN32) ||
defined(__CYGWIN__)" version of check_case works already for you. Scream if
that's wrong.
Steven and Jack, does getenv() work on both your flavors of Mac? I want to
make PYTHONCASEOK work for you too.
These are some half-baked ideas about getting classes and types to look
more similar. I would like to know whether they are workable or not and
so I present them to the people best equipped to tell me.
Many extension types have a __getattr__ that looks like this:
static PyObject *
Xxo_getattr(XxoObject *self, char *name)
{
    // try to do some work with known attribute names, else:
    return Py_FindMethod(Xxo_methods, (PyObject *)self, name);
}
Py_FindMethod can (despite its name) return any Python object, including
ordinary (non-function) attributes. It also has complete access to the
object's state and type through the self parameter. Here's what we do
today for __doc__:
    if (strcmp(name, "__doc__") == 0) {
        char *doc = self->ob_type->tp_doc;
        if (doc != NULL)
            return PyString_FromString(doc);
    }
Why can't we do this for all magic methods?
* __class__ would return the type object
* __add__, __len__, __call__, ... would return a method wrapper around
  the appropriate slot,
* __init__ might map to a no-op
I think that Py_FindMethod could even implement inheritance between
types if we wanted.
We already do this magic for __methods__ and __doc__. Why not for all of
the magic methods?
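To make the idea concrete, here is a rough Python rendering of what such a
getattr could do with the magic names (the real code would be C in each
type's tp_getattr; the slot-wrapper class below is a made-up stand-in, not
an existing API):

class _SlotWrapper:
    # Hypothetical stand-in: in C this would be a small object wrapping
    # the corresponding type slot (sq_length, nb_add, tp_call, ...).
    def __init__(self, obj, name):
        self.obj, self.name = obj, name
    def __call__(self, *args):
        raise NotImplementedError("no real slot behind " + self.name)

def xxo_getattr(self, name):
    # Purely illustrative dispatch on the magic names.
    if name == "__doc__":
        return type(self).__doc__          # tp_doc, as done today
    if name == "__class__":
        return type(self)                  # the type object itself
    if name in ("__len__", "__add__", "__call__"):
        return _SlotWrapper(self, name)    # wrapper around the C slot
    if name == "__init__":
        return lambda *args, **kw: None    # a no-op
    raise AttributeError, name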
Many other types implement no getattr at all (the slot is NULL). In that
case, I think that we have carte blanche to make their getattr behavior as
instance-like as possible.
Finally, there are the types with getattrs that do not dispatch to
Py_FindMethod. We can just change those over manually. Extension authors
will do the same when they realize that their types are not inheriting
the features that the other types are.
Benefits:
* objects based on extension types would "look more like" classes to
Python programmers so there is less confusion about how they are
different
* users could stop using the type() function to get concrete types and
instead use __class__. After a version or two, type() could be formally
deprecated in favor of isinstance and __class__.
* we will have started some momentum towards type/class unification
which we could continue on into __setattr__ and subclassing.
--
Take a recipe. Leave a recipe.
Python Cookbook! http://www.activestate.com/pythoncookbook
Here are some silly bits of code implementing single frame
coroutines and threads using my frame suspend/resume patch.
The coroutine example does not allow a value to be passed but
that would be simple to add. An updated version of the (very
experimental) patch is here:
http://arctrix.com/nas/generator3.diff
For me, thinking in terms of frames is quite natural and I didn't
have any trouble writing these examples. I'm hoping they will be
useful to other people who are trying to get their mind around
continuations. If you're sick of such postings on python-dev, flame
me privately and I will stop. Cheers,
Neil
#####################################################################
# Single frame threads (nano-threads?). Output should be:
#
# foo
# bar
# foo
# bar
# bar
import sys

def yield():
    f = sys._getframe(1)
    f.suspend(f)

def run_threads(threads):
    frame = {}
    for t in threads:
        frame[t] = t()
    while threads:
        for t in threads[:]:
            f = frame.get(t)
            if not f:
                threads.remove(t)
            else:
                frame[t] = f.resume()

def foo():
    for x in range(2):
        print "foo"
        yield()

def bar():
    for x in range(3):
        print "bar"
        yield()

def test():
    run_threads([foo, bar])

test()
#####################################################################
# Single frame coroutines. Should print:
#
# foo
# bar
# baz
# foo
# bar
# baz
# foo
# ...
import sys

def transfer(func):
    f = sys._getframe(1)
    f.suspend((f, func))

def run_coroutines(args):
    funcs = {}
    for f in args:
        funcs[f] = f
    current = args[0]
    while 1:
        rv = funcs[current]()
        if not rv:
            break
        (frame, next) = rv
        funcs[current] = frame.resume
        current = next

def foo():
    while 1:
        print "foo"
        transfer(bar)

def bar():
    while 1:
        print "bar"
        transfer(baz)

def baz():
    while 1:
        print "baz"
        transfer(foo)

def test():
    run_coroutines([foo, bar, baz])

test()
I'm a long-time listener/first-time caller and would like to know what I
should do to have my patch examined. I've included a description of the
patch below.
Cheers,
-chris
-----------------------------------------------------------------------------
[reference: python-Patches #412553]
Problem:
test_linuxaudiodev.py failed with a "Resource temporarily busy" message
(under the CVS version of Python).
Analysis:
The lad_write() method attempts to write continuously to /dev/dsp (or
equivalent); when the audio buffer fills, write() returns an error code and
errno is set to EAGAIN, indicating that the device buffer is full.
lad_write() interprets this as an error and, instead of trying to write
again, returns NULL.
Solution:
Use select() to check when the audio device becomes writable and test for
EAGAIN after doing a write(). I've submitted patch #412553 that implements
this solution. (use python21-lihnuxaudiodev.c-version2.diff). With this
patch, test_linuxaudiodev.py passes. This patch may also be relevant for
the python 2.0.1 bugfix release.
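For illustration, here is roughly what the fixed write loop does, sketched
in Python rather than in the C of linuxaudiodev.c (fd is assumed to be an
open non-blocking device descriptor and data the sample buffer; the
function name is made up):

import errno, os, select

def write_all(fd, data):
    # Keep writing until everything is out; when the device buffer is
    # full, wait with select() instead of treating EAGAIN as an error.
    while data:
        select.select([], [fd], [])          # block until fd is writable
        try:
            n = os.write(fd, data)
        except OSError, e:
            if e.errno == errno.EAGAIN:
                continue                     # buffer filled up again; retry
            raise
        data = data[n:]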
System configuration:
linux kernel 2.4.2 and 2.4.3 SMP on a dual processor i686 with the
soundblaster live value soundcard.
After the recent change disallowing assignments to __debug__, I noticed
that IDLE stops working (see the SF bug report), since it was assigning to
__debug__.
Simply commenting out the assignment (to zero) did no good: inside the
__debug__ blocks, IDLE would try to perform print statements, which would
write to the reassigned sys.stdout, which would invoke the code containing
the __debug__ blocks, which would give up thanks to infinite recursion. So
essentially, you either have to remove the __debug__ blocks, or rewrite
them to write to save_stdout - in which case all the ColorDelegator debug
messages appear in the terminal window.
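A minimal sketch of the recursion (illustrative names, not the actual IDLE
code): sys.stdout is replaced by an object whose write() itself runs code
containing a __debug__ block, so every debug print triggers another one:

import sys

class PseudoStdout:
    def write(self, s):
        colorize(s)                 # stand-in for IDLE's output machinery

def colorize(s):
    if __debug__:
        print "colorizing:", s      # goes to the replaced sys.stdout -> recursion

sys.stdout = PseudoStdout()
print "hello"                       # blows the recursion limit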
So anybody porting to Python 2.1 will essentially have to remove all
__debug__ blocks that were previously disabled by assigning 0 to
__debug__. I think this is undesirable.
As I recall, in the original description of __debug__, being able to
assign to it was reported as one of its main features, so that you
still had a run-time option (unless the interpreter was running with
-O, which eliminates the __debug__ blocks).
So in short, I think this change should be reverted.
Regards,
Martin
P.S. What was the motivation for that change, anyway?
It was suggested that I post this to python-dev, as well as python-list and
the distutils SIG. I apologise if this is being done backwards - should I
get a proper PEP number first, or is it appropriate to ask for initial
comments like this?
Paul
-----Original Message-----
From: Moore, Paul
Sent: 30 March 2001 13:32
To: distutils-sig@python.org
Cc: 'python-list@python.org'
Subject: [Distutils] PEP: Use site-packages on all platforms
Attached is a first draft of a proposal to use the "site-packages" directory
for locally installed modules, on all platforms instead of just on Unix. If
the consensus is that this is a worthwhile proposal, I'll submit it as a
formal PEP.
Any advice or suggestions welcomed - I've never written a PEP before - I
hope I've got the procedure right...
Paul Moore
PEP: TBA
Title: Install local packages in site-packages on all platforms
Version: $Revision$
Author: Paul Moore <gustav@morpheus.demon.co.uk>
Status: Draft
Type: Standards Track
Python-Version: 2.2
Created: 2001-03-30
Post-History: TBA
Abstract

The standard Python distribution includes a directory Lib/site-packages,
which is used on Unix platforms to hold locally-installed modules and
packages. The site.py module distributed with Python includes support for
locating modules in this directory.

This PEP proposes that the site-packages directory should be used uniformly
across all platforms for locally installed modules.
Motivation

On Windows platforms, the default setting for sys.path does not include a
directory suitable for users to install locally-developed modules. The
"expected" location appears to be the directory containing the Python
executable itself. Including locally developed code in the same directory
as installed executables is not good practice.

Clearly, users can manipulate sys.path, either in a locally modified
site.py, or in a suitable sitecustomize.py, or even via .pth files.
However, there should be a standard location for such files, rather than
relying on every individual site having to set their own policy.

In addition, with distutils becoming more prevalent as a means of
distributing modules, the need for a standard install location for
distributed modules will become more common. It would be better to define
such a standard now, rather than later when more distutils-based packages
exist which will need rebuilding.

It is relevant to note that prior to Python 2.1, the site-packages
directory was not included in sys.path for Macintosh platforms. This has
been changed in 2.1, so the Macintosh now includes site-packages in
sys.path, leaving Windows as the only major platform with no site-specific
modules directory.
Implementation

The implementation of this feature is fairly trivial. All that would be
required is a change to site.py, to change the section setting sitedirs.
The Python 2.1 version has

    if os.sep == '/':
        sitedirs = [makepath(prefix,
                             "lib",
                             "python" + sys.version[:3],
                             "site-packages"),
                    makepath(prefix, "lib", "site-python")]
    elif os.sep == ':':
        sitedirs = [makepath(prefix, "lib", "site-packages")]
    else:
        sitedirs = [prefix]

A suitable change would be to simply replace the last 4 lines with

    else:
        sitedirs = [makepath(prefix, "lib", "site-packages")]

Changes would also be required to distutils, in the sysconfig.py file. It
is worth noting that this file does not seem to have been updated in line
with the change of policy on the Macintosh, as of this writing.
Notes

1. It would be better if this change could be included in Python 2.1, as
   changing something of this nature is better done sooner, rather than
   later, to reduce the backward-compatibility burden. This is extremely
   unlikely to happen at this late stage in the release cycle, however.

2. This change does not preclude packages using the current location -
   the change only adds a directory to sys.path, it does not remove
   anything.

3. In the Windows distribution of Python 2.1 (beta 1), the
   Lib\site-packages directory has been removed. It would need to be
   reinstated.
Copyright

This document has been placed in the public domain.