Hi.
[Mark Hammond]
> The point isn't about my suffering as such. The point is more that
> python-dev owns a tiny amount of the code out there, and I don't believe we
> should put Python's users through this.
>
> Sure - I would be happy to "upgrade" all the win32all code, no problem. I
> am also happy to live in the bleeding edge and take some pain that will
> cause.
>
> The issue is simply the user base, and giving Python a reputation of not
> being able to painlessly upgrade even dot revisions.
I agree with all this.
[As I imagined, explicit syntax did not catch on and would require
a lot of discussion.]
[GvR]
> > Another way is to use special rules
> > (similar to those for class defs), e.g. having
> >
> > <frag>
> > y=3
> > def f():
> >     exec "y=2"
> >     def g():
> >         return y
> >     return g()
> >
> > print f()
> > </frag>
> >
> > # print 3.
> >
> > Is that confusing for users? Maybe they will more naturally expect 2
> > as the outcome (given nested scopes).
>
> This seems the best compromise to me. It will lead to the least
> broken code, because this is the behavior that we had before nested
> scopes! It is also quite easy to implement given the current
> implementation, I believe.
>
> Maybe we could introduce a warning rather than an error for this
> situation though, because even if this behavior is clearly documented,
> it will still be confusing to some, so it is better if we outlaw it in
> some future version.
>
Yes, this would be easy to implement, but more confusing situations can arise:
<frag>
y=3
def f():
    y=9
    exec "y=2"
    def g():
        return y
    return y,g()
print f()
</frag>
What should this print? The situation does not lead to a canonical solution
the way class def scopes do.
or
<frag>
def f():
    from foo import *
    def g():
        return y
    return g()
print f()
</frag>
[Mark Hammond]
> > This probably won't be a very popular suggestion, but how about pulling
> > nested scopes (I assume they are at the root of the problem)
> > until this can be solved cleanly?
>
> Agreed. While I think nested scopes are kinda cool, I have lived without
> them, and really without missing them, for years. At the moment the cure
> appears worse then the symptoms in at least a few cases. If nothing else,
> it compromises the elegant simplicity of Python that drew me here in the
> first place!
>
> Assuming that people really _do_ want this feature, IMO the bar should be
> raised so there are _zero_ backward compatibility issues.
I'm not saying anything about pulling nested scopes (I don't think my opinion
can change things in this respect),
but I do insist that without explicit syntax, IMO, raising the bar
has too high an implementation cost (both performance and complexity)
or creates confusion.
[Andrew Kuchling]
> >Assuming that people really _do_ want this feature, IMO the bar should be
> >raised so there are _zero_ backward compatibility issues.
>
> Even at the cost of additional implementation complexity? At the cost
> of having to learn "scopes are nested, unless you do these two things
> in which case they're not"?
>
> Let's not waffle. If nested scopes are worth doing, they're worth
> breaking code. Either leave exec and from..import illegal, or back
> out nested scopes, or think of some better solution, but let's not
> introduce complicated backward compatibility hacks.
IMO breaking code would be OK if we issue warnings today and implement
nested scopes, issuing errors, tomorrow. But this is simply a statement
about principles and the impression it creates.
IMO 'import *' in an inner scope should end up being an error;
I'm not sure about 'exec'.
We will need a final BDFL statement.
regards, Samuele Pedroni.
I have just had the experience of writing a bunch
of expressions of the form
"create index %(table)s_lid1_idx on %(table)s(%(lid1)s)" % params
and found myself getting quite confused by all the parentheses
and "s" suffixes. I would *really* like to be able to write
this as
"create index %{table}_lid1_idx on %{table}(%{lid1})" % params
which I find to be much easier on the eyes.
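For anyone who hasn't hit this, here's a minimal sketch (my own, with made-up
table/column values) of where the parentheses and 's' suffixes pile up, and
what the proposed spelling would look like:

params = {"table": "person", "lid1": "person_id"}   # made-up values
print "create index %(table)s_lid1_idx on %(table)s(%(lid1)s)" % params
# -> create index person_lid1_idx on person(person_id)
# The proposed spelling (not valid Python today) would be:
#   "create index %{table}_lid1_idx on %{table}(%{lid1})" % params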
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury, | A citizen of NewZealandCorp, a |
Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. |
greg(a)cosc.canterbury.ac.nz +--------------------------------------+
Today I got the wheels turning on my master's thesis by getting an
adviser. Now I just need a topic. =) The big goal is to do something
involving Python for a thesis to be finished by fall of next year (about
October) so as to have it done, hopefully published (getting into LL4
would be cool), and ready to be used for doctoral applications come
January 2005.
So, anyone have any ideas? The best one that I can think of is optional
type-checking. I am fairly open to ideas, though, in almost any area
involving language design.
There is no deadline on this, so if an idea strikes you a while from now,
still let me know. I suspect I won't settle on an idea any sooner than
December, and that is only if the idea just smacks me in the face and
says, "DO THIS!" Otherwise it might be a while since I don't want to
take up a topic that won't interest me or is not helpful in some way.
-Brett
Hi again, again!
After hours of investigating why my instance method __reduce__
doesn't work, I found out the following:
instancemethod_getattro
does this:
if (PyType_HasFeature(tp, Py_TPFLAGS_HAVE_CLASS)) {
    if (tp->tp_dict == NULL) {
        if (PyType_Ready(tp) < 0)
            return NULL;
    }
    descr = _PyType_Lookup(tp, name);
}
f = NULL;
if (descr != NULL) {
    f = TP_DESCR_GET(descr->ob_type);
    if (f != NULL && PyDescr_IsData(descr))
        return f(descr, obj, (PyObject *)obj->ob_type);
}
Why, please can someone explain, why does it ask for
PyDescr_IsData ???
I think this is wrong.
I'm defining a __reduce__ method, and it doesn't provide
a tp_descr_set, as defined in...
int
PyDescr_IsData(PyObject *d)
{
    return d->ob_type->tp_descr_set != NULL;
}
but for what reason is this required???
This thingie is going wrong both in Py 2.2.3 and in Py 2.3.2,
so I guess something very basic is going wrong.
I'd like to fix that, but I need to understand what the intent of
this code has been.
Can somebody, perhaps the author, explain why this is this way?
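For what it's worth, here is my own Python-level sketch of the data vs.
non-data descriptor rule that I assume this check is enforcing; it is only
an analogy on ordinary new-style instances, not the method-object code path
itself:

class DataDescr(object):
    # a "data descriptor": defines __set__ (tp_descr_set at the C level)
    def __get__(self, obj, objtype=None):
        return "DataDescr.__get__"
    def __set__(self, obj, value):
        pass

class NonDataDescr(object):
    # a "non-data descriptor": __get__ only, no __set__
    def __get__(self, obj, objtype=None):
        return "NonDataDescr.__get__"

class C(object):
    d = DataDescr()
    n = NonDataDescr()

c = C()
c.__dict__['d'] = 'instance value'
c.__dict__['n'] = 'instance value'
print c.d   # DataDescr.__get__ -- a data descriptor overrides the instance
print c.n   # instance value   -- a non-data descriptor does not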
thanks so much -- chris
--
Christian Tismer :^) <mailto:tismer@tismer.com>
Mission Impossible 5oftware : Have a break! Take a ride on Python's
Johannes-Niemeyer-Weg 9a : *Starship* http://starship.python.net/
14109 Berlin : PGP key -> http://wwwkeys.pgp.net/
work +49 30 89 09 53 34 home +49 30 802 86 56 mobile +49 173 24 18 776
PGP 0x57F3BF04 9064 F4E1 D754 C2FF 1619 305B C09C 5A3B 57F3 BF04
whom do you want to sponsor today? http://www.stackless.com/
There's a bunch of FutureWarnings e.g. about 0xffffffff<<1 that
promise they will disappear in Python 2.4. If anyone has time to fix
these, I'd appreciate it. (It's not just a matter of removing the
FutureWarnings -- you actually have to implement the promised future
behavior. :-) I may get to these myself, but they're not exactly
rocket science, so they might be a good thing for a beginning
developer (use SF please if you'd like someone to review the changes
first).
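For reference, my rough understanding of the promised behavior (please check
the warnings' wording and PEP 237 for the authoritative story):

# Python 2.3 on a 32-bit build: the constant triggers a FutureWarning and
# evaluates to -1, so the shift yields a small negative int.
# The promised 2.4 behavior: hex/oct constants too big for an int become
# positive longs, and shifts return longs instead of losing bits.
x = 0xffffffff
print x        # 2.3: -1 (with FutureWarning); 2.4: 4294967295
print x << 1   # 2.3: -2; 2.4: 8589934590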
Another -- much bigger -- TODO is to implement generator expressions
(PEP 289). Raymond asked for help but I don't think he got any,
unless it was offered through private email. Anyone interested?
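For reference, PEP 289's examples have the flavor of:

# a generator expression: computed lazily, no intermediate list (2.4 syntax)
total = sum(x*x for x in range(10))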
(Of course, I don't want any of this to interfere with the work to get
2.3.3 out in December.)
--Guido van Rossum (home page: http://www.python.org/~guido/)
Here's a reworking which returns iterators. I had to decide what to do if
the user tries to access things out of order; I raise an exception.
Anything else would complicate the code quite a lot I think.
def groupby(key, iterable):
    it = iter(iterable)
    value = it.next()  # If there are no items, this takes an early exit
    oldkey = [key(value)]
    cache = [value]
    lock = []
    def grouper():
        yield cache.pop()
        for value in it:
            newkey = key(value)
            if newkey == oldkey[0]:
                yield value
            else:
                oldkey[0] = newkey
                cache.append(value)
                break
        del lock[0]
    while 1:
        if lock:
            raise LookupError, "groups accessed out of order"
        if not cache:
            break
        lock.append(1)
        yield grouper()
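A quick usage sketch (my own illustration with a made-up key function),
assuming the groupby above:

# group consecutive characters case-insensitively; each group is an iterator
for group in groupby(lambda ch: ch.lower(), "AAbbCCC"):
    print list(group)
# ['A', 'A']
# ['b', 'b']
# ['C', 'C', 'C']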
--Greg Ball
I'm adding a section to the tutorial with a brief sampling of library
offerings and some short examples of how to use them.
My first draft included:
copy, glob, shelve, pickle, os, re, math/cmath, urllib, smtplib
Guido's thoughts:
- copy tends to be overused by beginners
- the shelve module has pitfalls for new users
- cmath is rarely needed and some folks are scared of complex numbers
- urllib2 would be a better choice than urllib
I'm interested to know what your experiences have been with teaching
python. Which modules are necessary to start doing real work (like
pickle and os), which are most easily grasped (like glob or random),
which have impressive examples only a few lines long (e.g., urllib), and
which might just be fun (turtle would be a candidate if it didn't have a
Tk dependency).
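As an illustration of the "few lines long" category, the sort of urllib
snippet I have in mind (just a sketch, not final tutorial text):

import urllib
# fetch a page and see how big it is -- two lines of real work
data = urllib.urlopen('http://www.python.org/').read()
print len(data)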
Note, re was included because everyone should know it's there and
everyone should get advice to not use it when string methods will
suffice.
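The kind of advice I mean, as a throwaway sketch:

# for simple substring tests, string methods beat re for clarity and speed
line = "Error: disk full"
if line.startswith("Error:"):   # clearer than re.match(r"Error:", line)
    print "found an error line"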
I'm especially interested in thoughts on whether shelve should be
included. When I first started out, I was very impressed with shelves
because they were the simplest way to add a form of persistence and
because they could be dropped in place of a dictionary in scripts that
were already built. Also, it was trivially easy to learn based on
existing knowledge of dictionaries. OTOH, that existing knowledge is
what makes the pitfalls so surprising.
Likewise, I was impressed with the substitutability of line lists, text
splits, file.readlines(), and urlopen().
While I think of copy() and deepcopy() as builtins that got tucked away
in a module, Guido is right about their rarity in well-crafted code.
Some other candidates (let's pick just two or three):
- csv (basic tool for sharing data with other applications)
- datetime (comes up frequently in real apps and admin tasks)
- ftplib (because the examples are so brief)
- getopt or optparse (because the task is common)
- operator (because otherwise, the functionals can be a PITA)
- pprint (because beauty counts)
- struct (because fixed record layouts are common)
- threading/Queue (because without direction people grab thread and
mutexes)
- timeit (because it answers most performance questions in a jiffy)
- unittest (because TDD folks like myself live by it)
I've avoided XML because it is a can of worms and because short examples
don't do it justice. OTOH, it *is* the hot topic of the day and seems
to be taking over the world one angle bracket at a time.
Ideally, the new section should be relatively short but leave a reader
with a reasonable foundation for crafting non-toy scripts. A secondary
goal is to show-off the included batteries -- I think it is common for
someone to download several languages and choose between them based on
their tutorial experiences (so, a little flash and sizzle might be
warranted).
Raymond
I'm going to be pretty much offline for a week or so - we got burgled
the other night while we were asleep and my laptop was stolen. The data's
backed up, but it'll be a few days til the replacement laptop arrives.
In the meantime, if someone wants to take on the "upgrade to autoconf 2.59"
task, I'd appreciate it very much.
thanks,
Anthony
>>> Guido van Rossum wrote
> Yes, backticks will be gone in 3.0. But I expect there's no hope of
> getting rid of them earlier -- they've been used too much. I suspect
Then let's kill all use of backticks in the standard library. There's
a lot of them.
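For anyone grepping: backticks are just the old spelling of repr(), so the
mechanical rewrite is straightforward. A sketch:

x = 3.14
s1 = `x`        # old spelling, gone in 3.0
s2 = repr(x)    # the spelling to standardize on
assert s1 == s2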
Anthony
--
Anthony Baxter <anthony(a)interlink.com.au>
It's never too late to have a happy childhood.
Hi developers,
It seems that the test.regrtest module has a possible backward
incompatibility with regard to pre-Python 2.3 releases. I have
a test suite implemented using the test.regrtest module. In this
test suite, my own tests are invoked by a script like this:
import os
from test import regrtest
regrtest.STDTESTS = []
regrtest.main(testdir=os.getcwd())
This script runs fine with 2.2 but does not with 2.3, since
regrtest.py in Python 2.3 has the following lines in runtest()
(introduced in Revision 1.87.2.1. See [1]):
if test.startswith('test.'):
    abstest = test
else:
    # Always import it from the test package
    abstest = 'test.' + test
the_package = __import__(abstest, globals(), locals(), [])
That is, tests must be in a package named "test". However, this
package name is already used by the standard library, and AFAIK
multiple packages with the same package name cannot exist. In
other words, any additional tests (i.e. my own tests) have to be
put into the test package in the standard library. Otherwise,
the additional tests won't be found. IMHO, this change in 2.3
is not reasonable.
Unless I miss something trivial (I hope so), I'd have to give up
using the test.regrtest module. I appreciate any comment.
Thanks,
--
KAJIYAMA, Tamito <kajiyama(a)grad.sccs.chukyo-u.ac.jp>
[1] http://cvs.sourceforge.net/viewcvs.py/python/python/dist/src/Lib/test/regrt…