Tim Peters writes:
> BTW, note that the spec has nothing to say about how an implementation
> spells anything, except for the results of the to-string operations. That
> is, there's no requirement to *call* the spec's quantize operation
> "quantize"; the requirement is to provide something with the semantics of
> the spec's quantize operation.
Without having given it a LOT of thought, I'm in favor of calling it
"quantize" just to match the name in the spec. Without deciding it yet,
there are good reasons to think we MIGHT want the behavior of "round"
to be somewhat different (eg: match the behavior of Python's round())
if that were supported in the future.
> For example, round_like() or rescale() might
> be friendlier names (in fact, the current quantize used to be called rescale
> in the spec).
Yes, but there are ALSO notes in the spec explaining that they changed
the name from rescale to quantize because people had misleading
preconceptions about what "rescale" would do. (Partly because it worked
differently in previous versions of the spec.)
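(For the uninitiated: quantize, as it later landed in Python's decimal
module, returns a value with the exponent of its second operand, rounding
the coefficient as needed -- which is exactly the "same quantum" idea
behind the rename. A quick sketch:)

```python
from decimal import Decimal

d = Decimal('12345.678')
# The second operand supplies the target exponent (the "quantum");
# the coefficient is rounded to fit (round-half-even by default).
cents = d.quantize(Decimal('0.01'))
print(cents)  # → 12345.68
```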
-- Michael Chermside
PEP 258 specifies that the __docformat__ variable should be used to
define the markup language used by docstrings in a module.
Recently, I got an RFE for epydoc requesting that a __docformat__ in a
package's __init__.py apply to the entire package. My first
reaction was that it was a good idea, but then I remembered a recent
discussion on python-dev, where there seemed to be a strong sentiment
that "from __future__ import absolute_import" should *not* have package
scope; and I wondered if some of the same arguments would go against
applying __docformat__ to the entire package.
So... Should __docformat__ have package scope (with submodules/
subpackages possibly overriding it), or should it only apply to the
actual module that defines it?
Either way, PEP 258 should be updated to reflect the decision.
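One way to picture package scope (a hypothetical sketch -- the
resolve_docformat helper and its arguments are invented for illustration,
not part of any tool): walk from the module up through its parent
packages and take the first __docformat__ found.

```python
def resolve_docformat(modname, docformats, default='plaintext'):
    """Return the __docformat__ in effect for modname.

    docformats maps dotted module names to their __docformat__ values;
    the nearest enclosing package wins, mirroring the proposed
    "submodules/subpackages may override" behaviour.
    """
    parts = modname.split('.')
    while parts:
        fmt = docformats.get('.'.join(parts))
        if fmt is not None:
            return fmt
        parts.pop()
    return default

formats = {'mypkg': 'epytext', 'mypkg.sub.special': 'restructuredtext'}
print(resolve_docformat('mypkg.sub.mod', formats))      # → epytext
print(resolve_docformat('mypkg.sub.special', formats))  # → restructuredtext
```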
 pep 258 (Docutils Design Specification)
 epydoc RFE: __docformat__ for entire packages
#- [Batista, Facundo]
#- > Well, I think we must decide how it works
#- > With:
#- > >>> d = Decimal('12345.678')
#- > >>> d
#- > Decimal( (0, (1, 2, 3, 4, 5, 6, 7, 8), -3) )
#- > >>> str(d)
#- > '12345.678'
#- > And being the syntax Decimal.round(n), we have the
#- > following options:
#- > a) n is the quantity of relevant digits of the
#- > final number (must be non negative).
#- > b) n has the same behaviour that in the built in round().
#- > What option do you all like more?
#- I like (c) best: drop round(), because it's redundant -- there are
#- other ways to do (a) and (b) already. Perhaps (b) isn't obvious, so
#- extending your example with what already exists:
#- >>> dimes = Decimal.Decimal('0.1')
#- >>> print d.quantize(dimes)
#- >>> print d.quantize(Decimal.Decimal('1e1'))
The round() method has the appeal that it could have the same
name/meaning/function as the built-in round().
But the PEP is for implementing the Spec, so I'm +1 on dropping round(),
at least at this stage.
#- In general, we should be very reluctant to extend the spec at the
#- start: it's complicated, subtle and exacting, and plenty of big
#- brains are working hard on getting the spec exactly right. Alas,
#- it's also still a bit of a moving target.
#- The second way of spelling it is fine, but there's no need to cater
#- to an optional context argument. Decimal.using_context(input) should
#- use the current context object. It's been a general rule so far that
#- all operations are available as methods *of* context objects too, so
#- in the goofy case (I say "goofy" because I predict it won't be used
#- in real life) of wanting to force use of a particular non-current
#- context object c, the natural spelling is (or should be)
You mean adding a method to the context, to create a Decimal using itself as
context and not the one from the thread?
If yes, what about c.create_decimal(number) ?
And with floats? c.create_decimal_from_float(number)? Or the same method
as before?
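For what it's worth, today's decimal module ended up growing both
spellings: Context.create_decimal() and, later, the more explicit
Context.create_decimal_from_float(). A sketch of the behaviour:

```python
from decimal import Context, ROUND_HALF_EVEN

c = Context(prec=4, rounding=ROUND_HALF_EVEN)
# create_decimal applies the context's precision, unlike the plain
# Decimal constructor, which is exact.
print(c.create_decimal('12345.678'))     # → 1.235E+4
# create_decimal_from_float starts from the float's exact binary value,
# then rounds to the context's precision.
print(c.create_decimal_from_float(0.1))  # → 0.1000
```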
> But, the PEP is for implementing the Spec, so I'm +1 to drop
> round(), at
> least in this stage.
Me too. But let's promise that the docs will include an example
of how to use quantize() to do rounding, since the name will
not be obvious to the uninitiated.
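Such an example might look like this (using the decimal API as it
eventually shipped; quantize stands in for both flavours of rounding):

```python
from decimal import Decimal

d = Decimal('12345.678')
# like round(x, 2): two digits after the decimal point
print(d.quantize(Decimal('0.01')))  # → 12345.68
# like round(x, -1): round to the nearest ten
print(d.quantize(Decimal('1e1')))   # → 1.235E+4
```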
-- Michael Chermside
>> note that import * _already_ imports names which
>> shadow builtins, so the only real change would be
>> when the imported module does *not* shadow a
>> builtin, but your own module does (and does so
>> before the import, perhaps because of an earlier
>> import *).
Phillip J. Eby:
> I think you're missing the part where builtins
> change from one release to the next. I might have
> had a global named 'zip' or 'enumerate' in a Python
> 1.5 program, and when I upgrade to 2.4 I am now
> "shadowing a builtin".
How did you get that global? You either defined it
or imported it (or had it added by another module).
Any of these actions will still work, and will still
replace the cached builtin.
The only problem is if you then do an import * _after_
that definition. Doing that was already risking a break;
this change just makes the break a bit more likely.
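A dict-level simulation of that failure mode, under the proposal's
assumptions (each module dict starts as a copy of builtins, and
import * copies across the source module's public names):

```python
import builtins

# My module: builtins copied in at creation, then I define my own zip.
my_globals = dict(vars(builtins))
my_globals['zip'] = 'my zip replacement'

# The imported module never touched zip, so its dict still holds the
# copied-in builtin under that name.
other_module = dict(vars(builtins))

# import * after my definition: public names come across wholesale,
# and the copied builtin zip clobbers my earlier definition.
my_globals.update((name, value) for name, value in other_module.items()
                  if not name.startswith('_'))
print(my_globals['zip'] is builtins.zip)  # → True
```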
At 12:22 PM 4/22/04 -0400, Jewett, Jim J wrote:
>At 04:39 PM 4/21/04 -0400, Jewett, Jim J wrote:
> >>>> If this is really only about globals and builtins,
> >>>> then you can just initialize each module's dictionary
> >>>> with a copy of builtins.
>(Note that Raymond's original proposal was much stronger;
>instead of just moving builtins to globals, it moved both
>builtins and globals into locals.)
But at least it was still backward-compatible with respect to what actually
is found in the module's globals.
> >>Phillip J. Eby:
> >>> There's only one problem with this idea, and it's a big
> >>> one: 'import *' would now include all the builtins,
> >> Why is this bad?
> > Because some modules are examined by software, and only
> > the expected names belong there. For example, I believe
> > if you run 'pydoc' on such a module, it will proceed to
> > document all the builtins.
>Fair enough. But that software could compare against
>builtins, and do a set-minus.
It *could*, but it's wasteful to break such programs without necessity.
>But note that import * _already_ imports names which
>shadow builtins, so the only real change would be when
>the imported module does *not* shadow a builtin, but
>your own module does (and does so before the import,
>perhaps because of an earlier import *).
>If you want to protect even this obscure case, import
>could be changed to special-case builtins.
I think you're missing the part where builtins change from one release to
the next. I might have had a global named 'zip' or 'enumerate' in a Python
1.5 program, and when I upgrade to 2.4 I am now "shadowing a builtin".
I have written a new module 'config.py' which can be used for similar
purposes as ConfigParser. The biggest difference is that this fetches user
configured python objects. The technique this module supports is well
thought out and provides a single solution for the simplest to the most
complex configuration problems without sacrificing ease of use and
simplicity. For these reasons I think it should be considered for inclusion
into the Python distribution. I work in the field of embedded software
development and find this module extremely useful for unit, integration, and
functional testing of our software, as well as for controlling many other
tasks.
To date I have:
1) provided documentation in the module docstring
2) made sure the module is well commented and follows the coding conventions
3) written a test suite to ensure functionality is correct (and remains so)
I am willing to do further development of this module, to support it, and to
take the steps necessary to see it through the PEP process. I am
looking for feedback on whether this module should be considered for
inclusion in the standard Python distribution as well as any technical
suggestions. I have placed the GNU license on the module but would be more
than happy (and legally able) to relabel it with the license of your choice.
Thanks in advance for your consideration!
At 04:39 PM 4/21/04 -0400, Jewett, Jim J wrote:
>>>> If this is really only about globals and builtins,
>>>> then you can just initialize each module's dictionary
>>>> with a copy of builtins.
(Note that Raymond's original proposal was much stronger;
instead of just moving builtins to globals, it moved both
builtins and globals into locals.)
>>Phillip J. Eby:
>>> There's only one problem with this idea, and it's a big
>>> one: 'import *' would now include all the builtins,
>> Why is this bad?
> Because some modules are examined by software, and only
> the expected names belong there. For example, I believe
> if you run 'pydoc' on such a module, it will proceed to
> document all the builtins.
Fair enough. But that software could compare against
builtins, and do a set-minus.
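The set-minus could be as simple as dropping any name whose value is just
the builtin of the same name (public_names is an invented helper, sketched
against the proposal's copied-builtins module dicts):

```python
import builtins
import types

def public_names(module):
    # Keep names whose values are NOT simply the builtin of the same
    # name; copied-in builtins are filtered out, while real definitions
    # and genuine shadows survive.
    return {name for name, value in vars(module).items()
            if getattr(builtins, name, None) is not value}

m = types.ModuleType('example')
m.__dict__.update(vars(builtins))  # simulate builtins copied into globals
m.helper = lambda x: x             # a genuine module-level definition
m.zip = list                       # shadows a builtin with something else
print(sorted(n for n in public_names(m) if not n.startswith('__')))
# → ['helper', 'zip']
```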
>> ... If the other module actually has modified a
>> builtin, you'll need to do the same ...
> ... just because one module shadows a builtin, doesn't
> mean you have to follow suit.
I did phrase that carelessly. The objects you import
will still use their original module's globals. You
only need to copy these when you want your own code to
act consistently with the other module.
But note that import * _already_ imports names which
shadow builtins, so the only real change would be when
the imported module does *not* shadow a builtin, but
your own module does (and does so before the import,
perhaps because of an earlier import *).
If you want to protect even this obscure case, import
could be changed to special-case builtins.
By the way, what are people benchmarking to decide that
builtin lookups are frequent enough to matter? It seems
intuitively true (and probably for the same handful of
builtins throughout an entire program), but pystone is
contrived enough not to show much difference compared
to its per-run variability.
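For the curious, a crude micro-benchmark sketch (timeit compiles the
statement, so the names below live in the timed code's globals; absolute
numbers vary wildly from run to run, which is exactly the variability
problem mentioned above, so no output is claimed here):

```python
import timeit

# 'len' misses the timed code's globals and falls through to builtins;
# 'f' is found directly in the timed code's own namespace.
t_builtin = timeit.timeit('len(s)', setup='s = "x" * 100', number=200_000)
t_bound = timeit.timeit('f(s)', setup='s = "x" * 100; f = len', number=200_000)
print(f'builtins lookup: {t_builtin:.4f}s  pre-bound: {t_bound:.4f}s')
```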
#- Sounds good. If you're going this route, it seems to me that
#- Decimal() would work better if it ends up being a proxy class for an
#- internal implementation that's more closely linked to context, and
#- that most of the Decimal methods simply delegate to the internal
#- implementation
I'm OK with that. But I'm kind of an OO newbie, so: do you have a link to a
"formal theory" about this? Some example of something you've done using this
design?
#- through context. (If you look at my original implementation, that's
#- similar to the direction I was going -- haven't actually looked at
#- your code to see what it's like currently.) Make sure that Decimal
#- has-a
Is that what WorkRep is for?
Actually, I didn't mess with Decimal.py much. I wrote a lot of code
in test_Decimal.py, and then added from_float to Decimal and fixed a lot of
small details: but just to comply with the test cases.
A fundamental redesign of Decimal will be delayed until all the test
cases are finished and working. But I think it will be needed.
#- internal implementation rather than is-a (i.e. use composition
#- instead of subclassing).
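To make the has-a idea concrete, a toy sketch (every name here is
invented for illustration; real Decimal arithmetic is far more involved):

```python
class Context:
    """Toy context: just a precision, standing in for the real thing."""
    def __init__(self, prec):
        self.prec = prec

_current = Context(prec=28)  # stand-in for a thread-local current context

class _Impl:
    """Internal representation; every operation takes an explicit context."""
    def __init__(self, value):
        self.value = value

    def add(self, other, context):
        return _Impl(round(self.value + other.value, context.prec))

class Decimal:
    """Public proxy: has-a _Impl and delegates to it (composition)."""
    def __init__(self, value):
        self._impl = value if isinstance(value, _Impl) else _Impl(value)

    def __add__(self, other):
        # The proxy, not the caller, decides which context applies.
        return Decimal(self._impl.add(other._impl, _current))

    def __repr__(self):
        return f'Decimal({self._impl.value})'

print(Decimal(1.25) + Decimal(2.5))  # → Decimal(3.75)
```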
#- > And with floats? c.create_decimal_from_float(number)? Or the same
#- > method that before?
#- ``from_float()`` probably is better.
Michael proposed to use the same method. This is different from Decimal
(with all that that implies), but can you trust that a user calling the
method on the context is aware of binary float traps?
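The trap in question, for reference (today's spelling is
Decimal.from_float; a string is exact, while a float carries its binary
representation along):

```python
from decimal import Decimal

print(Decimal('0.1'))  # → 0.1  (exact: the string means what it says)
# The float 0.1 is really the nearest binary double, and from_float
# preserves that value exactly:
print(Decimal.from_float(0.1))
# → 0.1000000000000000055511151231257827021181583404541015625
```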
You propose another method. Its name is not clear to me: