From python-checkins at python.org Sat Jul 1 11:42:21 2006
From: python-checkins at python.org (georg.brandl)
Date: Sat, 1 Jul 2006 11:42:21 +0200 (CEST)
Subject: [Python-checkins] r47186 - peps/trunk/pep-0356.txt
Message-ID: <20060701094221.B62091E4009@bag.python.org>
Author: georg.brandl
Date: Sat Jul 1 11:42:21 2006
New Revision: 47186
Modified:
peps/trunk/pep-0356.txt
Log:
Add stringobject bug.
Modified: peps/trunk/pep-0356.txt
==============================================================================
--- peps/trunk/pep-0356.txt (original)
+++ peps/trunk/pep-0356.txt Sat Jul 1 11:42:21 2006
@@ -158,6 +158,7 @@
http://python.org/sf/1508010
http://python.org/sf/1475523
http://python.org/sf/1494314
+ http://python.org/sf/1515471
- Should relative imports from __main__ work when feasible?
Bug report: http://python.org/sf/1510172
From python-checkins at python.org Sat Jul 1 11:48:01 2006
From: python-checkins at python.org (georg.brandl)
Date: Sat, 1 Jul 2006 11:48:01 +0200 (CEST)
Subject: [Python-checkins] r47187 - peps/trunk/pep-0356.txt
Message-ID: <20060701094801.6F73C1E4002@bag.python.org>
Author: georg.brandl
Date: Sat Jul 1 11:48:00 2006
New Revision: 47187
Modified:
peps/trunk/pep-0356.txt
Log:
Tab->Spaces.
Modified: peps/trunk/pep-0356.txt
==============================================================================
--- peps/trunk/pep-0356.txt (original)
+++ peps/trunk/pep-0356.txt Sat Jul 1 11:48:00 2006
@@ -158,7 +158,7 @@
http://python.org/sf/1508010
http://python.org/sf/1475523
http://python.org/sf/1494314
- http://python.org/sf/1515471
+ http://python.org/sf/1515471
- Should relative imports from __main__ work when feasible?
Bug report: http://python.org/sf/1510172
From python-checkins at python.org Sat Jul 1 12:45:21 2006
From: python-checkins at python.org (vinay.sajip)
Date: Sat, 1 Jul 2006 12:45:21 +0200 (CEST)
Subject: [Python-checkins] r47188 - python/trunk/Misc/NEWS
Message-ID: <20060701104521.A0F391E4002@bag.python.org>
Author: vinay.sajip
Date: Sat Jul 1 12:45:20 2006
New Revision: 47188
Modified:
python/trunk/Misc/NEWS
Log:
Added entry for fileConfig() bugfix.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Sat Jul 1 12:45:20 2006
@@ -39,6 +39,9 @@
- The wsgiref package is now installed properly on Unix.
+- A bug was fixed in logging.config.fileConfig() which caused a crash on
+ shutdown when fileConfig() was called multiple times.
+
Extension Modules
-----------------
@@ -81,7 +84,7 @@
- Fix the socket tests so they can be run concurrently.
- Removed 5 integers from C frame objects (PyFrameObject).
- f_nlocals, f_ncells, f_nfreevars, f_stack_size, f_restricted.
+ f_nlocals, f_ncells, f_nfreevars, f_stack_size, f_restricted.
- Bug #532646: object.__call__() will continue looking for the __call__
attribute on objects until one without one is found. This leads to recursion
@@ -220,11 +223,11 @@
Fixes SF python patch/bug #1408584.
- bsddb: multithreaded DB access using the simple bsddb module interface
- now works reliably. It has been updated to use automatic BerkeleyDB
+ now works reliably. It has been updated to use automatic BerkeleyDB
deadlock detection and the bsddb.dbutils.DeadlockWrap wrapper to retry
database calls that would previously deadlock. [SF python bug #775414]
-- Patch #1446489: add support for the ZIP64 extensions to zipfile.
+- Patch #1446489: add support for the ZIP64 extensions to zipfile.
- Patch #1506645: add Python wrappers for the curses functions
is_term_resized, resize_term and resizeterm.
@@ -237,7 +240,7 @@
- Patch #812986: Update turtle output even if not tracing.
-- Patch #1494750: Destroy master after deleting children in
+- Patch #1494750: Destroy master after deleting children in
Tkinter.BaseWidget.
- Patch #1096231: Add ``default`` argument to Tkinter.Wm.wm_iconbitmap.
From python-checkins at python.org Sat Jul 1 12:47:20 2006
From: python-checkins at python.org (vinay.sajip)
Date: Sat, 1 Jul 2006 12:47:20 +0200 (CEST)
Subject: [Python-checkins] r47189 - python/trunk/Lib/test/test_logging.py
Message-ID: <20060701104720.8C5081E4002@bag.python.org>
Author: vinay.sajip
Date: Sat Jul 1 12:47:20 2006
New Revision: 47189
Modified:
python/trunk/Lib/test/test_logging.py
Log:
Added duplicate call to fileConfig() to ensure that it cleans up after itself correctly.
Modified: python/trunk/Lib/test/test_logging.py
==============================================================================
--- python/trunk/Lib/test/test_logging.py (original)
+++ python/trunk/Lib/test/test_logging.py Sat Jul 1 12:47:20 2006
@@ -480,6 +480,8 @@
f.close()
try:
logging.config.fileConfig(fn)
+ #call again to make sure cleanup is correct
+ logging.config.fileConfig(fn)
except:
t = sys.exc_info()[0]
message(str(t))
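As a standalone illustration of what the test change above exercises, here is a minimal sketch: fileConfig() is called twice on the same file and logging is then shut down, which is the sequence that previously crashed. The config contents and temp-file handling are illustrative, not taken from test_logging.py:

    import logging, logging.config, os, tempfile

    # Assumed minimal config; built line-by-line to stay self-contained.
    conf = "\n".join([
        "[loggers]", "keys=root", "",
        "[handlers]", "keys=console", "",
        "[formatters]", "keys=plain", "",
        "[logger_root]", "level=DEBUG", "handlers=console", "",
        "[handler_console]", "class=StreamHandler", "level=DEBUG",
        "formatter=plain", "args=(sys.stdout,)", "",
        "[formatter_plain]", "format=%(levelname)s:%(message)s", "",
    ])

    fd, fn = tempfile.mkstemp(suffix=".ini")
    os.close(fd)
    f = open(fn, "w")
    f.write(conf)
    f.close()
    try:
        logging.config.fileConfig(fn)
        logging.config.fileConfig(fn)   # second call must clean up the first
        logging.getLogger().info("still alive")
    finally:
        os.unlink(fn)
        logging.shutdown()              # the reported crash happened at shutdown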
From kristjan at ccpgames.com Sat Jul 1 13:23:23 2006
From: kristjan at ccpgames.com (Kristján V. Jónsson)
Date: Sat, 1 Jul 2006 11:23:23 -0000
Subject: [Python-checkins] r46894 - in python/trunk:
Modules/timemodule.c Objects/exceptions.c Objects/fileobject.c
Message-ID: <129CEF95A523704B9D46959C922A280002F5360D@nemesis.central.ccp.cc>
Interesting problem.
(Yes, I subscribe to python-checkins but it is hard to weed out the chaff there)
Is this compiler possibly an interim solution before VC8 came out? Strange that it should call itself version 14.
Maybe we should bracket this with a _VC8 macro that is only defined
by the PCBuild8 directory?
I will fix this when I get to the office on Monday.
Kristján
-----Original Message-----
From: Tim Peters [mailto:tim.peters at gmail.com]
Sent: 30. júní 2006 18:38
To: Kristján V. Jónsson
Cc: python-checkins at python.org
Subject: Re: [Python-checkins] r46894 - in python/trunk: Modules/timemodule.c Objects/exceptions.c Objects/fileobject.c
Copying Kristján directly since he may not be subscribed to python-checkins.
On 6/30/06, Thomas Heller wrote:
> > +#if defined _MSC_VER && _MSC_VER >= 1400
> > + /* reset CRT error handling */
> > + _set_invalid_parameter_handler(prevCrtHandler);
> > + _CrtSetReportMode(_CRT_ASSERT, prevCrtReportMode);
> > +#endif
> > }
> ...
>
> These changes to Objects/exceptions.c break the build for Windows AMD64. Apparently the amd64
> compiler from the Server 2003 SP1 SDK has _MSC_VER >= 1400, but does not know about this new
> error handling.
>
> The compiler identifies itself in this way:
>
> C:\Program Files\Microsoft Platform SDK>cl
> Microsoft (R) C/C++ Optimizing Compiler Version 14.00.40310.41 for AMD64
> Copyright (C) Microsoft Corporation. All rights reserved.
>
> usage: cl [ option... ] filename... [ /link linkoption... ]
>
> C:\Program Files\Microsoft Platform SDK>
>
>
> Thomas
>
> _______________________________________________
> Python-checkins mailing list
> Python-checkins at python.org
> http://mail.python.org/mailman/listinfo/python-checkins
>
From martin at v.loewis.de Sat Jul 1 15:38:08 2006
From: martin at v.loewis.de ("Martin v. Löwis")
Date: Sat, 01 Jul 2006 15:38:08 +0200
Subject: [Python-checkins] r46894 - in
python/trunk: Modules/timemodule.c Objects/exceptions.c
Objects/fileobject.c
In-Reply-To: <129CEF95A523704B9D46959C922A280002F5360D@nemesis.central.ccp.cc>
References: <129CEF95A523704B9D46959C922A280002F5360D@nemesis.central.ccp.cc>
Message-ID: <44A67AC0.2040301@v.loewis.de>
Kristján V. Jónsson wrote:
> Interesting problem. (Yes, I subscribe to python-checkins but it is
> hard to weed out the chaff there)
>
> Is this compiler possibly an interim solution before VC8 came out?
> Strange that it should call itself version 14.
No, it's the compiler that ships with the Platform SDK.
Notice that the feature (runtime error handlers) is not a feature of
the compiler at all. Instead, it is a feature of the CRT, and
apparently, the SDK ships with a different version of the CRT than
VC8.
> Maybe we should bracket this with a _VC8 macro that is only defined
> by the PCBuild8 directory?
I would prefer a solution based on tool-and-library defines.
Regards,
Martin
From neal at metaslash.com Sat Jul 1 11:07:33 2006
From: neal at metaslash.com (Neal Norwitz)
Date: Sat, 1 Jul 2006 05:07:33 -0400
Subject: [Python-checkins] Python Regression Test Failures refleak (1)
Message-ID: <20060701090733.GA31557@python.psfb.org>
test_socket leaked [212, -212, 0] references
From python-checkins at python.org Sat Jul 1 17:33:38 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Sat, 1 Jul 2006 17:33:38 +0200 (CEST)
Subject: [Python-checkins] r47190 - in python/trunk: Lib/socket.py
Lib/test/test_socket.py Misc/NEWS
Message-ID: <20060701153338.5EEE11E4002@bag.python.org>
Author: martin.v.loewis
Date: Sat Jul 1 17:33:37 2006
New Revision: 47190
Modified:
python/trunk/Lib/socket.py
python/trunk/Lib/test/test_socket.py
python/trunk/Misc/NEWS
Log:
Release all forwarded functions in .close. Fixes #1513223.
Modified: python/trunk/Lib/socket.py
==============================================================================
--- python/trunk/Lib/socket.py (original)
+++ python/trunk/Lib/socket.py Sat Jul 1 17:33:37 2006
@@ -130,35 +130,37 @@
if sys.platform == "riscos":
_socketmethods = _socketmethods + ('sleeptaskw',)
+# All the method names that must be delegated to either the real socket
+# object or the _closedsocket object.
+_delegate_methods = ("recv", "recvfrom", "recv_into", "recvfrom_into",
+ "send", "sendto")
+
class _closedsocket(object):
__slots__ = []
def _dummy(*args):
raise error(EBADF, 'Bad file descriptor')
- send = recv = sendto = recvfrom = __getattr__ = _dummy
+ # All _delegate_methods must also be initialized here.
+ send = recv = recv_into = sendto = recvfrom = recvfrom_into = _dummy
+ __getattr__ = _dummy
class _socketobject(object):
__doc__ = _realsocket.__doc__
- __slots__ = ["_sock",
- "recv", "recv_into", "recvfrom_into",
- "send", "sendto", "recvfrom",
- "__weakref__"]
+ __slots__ = ["_sock", "__weakref__"] + list(_delegate_methods)
def __init__(self, family=AF_INET, type=SOCK_STREAM, proto=0, _sock=None):
if _sock is None:
_sock = _realsocket(family, type, proto)
self._sock = _sock
- self.send = self._sock.send
- self.recv = self._sock.recv
- self.recv_into = self._sock.recv_into
- self.sendto = self._sock.sendto
- self.recvfrom = self._sock.recvfrom
- self.recvfrom_into = self._sock.recvfrom_into
+ for method in _delegate_methods:
+ setattr(self, method, getattr(_sock, method))
def close(self):
self._sock = _closedsocket()
- self.send = self.recv = self.sendto = self.recvfrom = self._sock._dummy
+ dummy = self._sock._dummy
+ for method in _delegate_methods:
+ setattr(self, method, dummy)
close.__doc__ = _realsocket.close.__doc__
def accept(self):
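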
Modified: python/trunk/Lib/test/test_socket.py
==============================================================================
--- python/trunk/Lib/test/test_socket.py (original)
+++ python/trunk/Lib/test/test_socket.py Sat Jul 1 17:33:37 2006
@@ -582,6 +582,21 @@
def _testRecvFrom(self):
self.cli.sendto(MSG, 0, (HOST, PORT))
+class TCPCloserTest(ThreadedTCPSocketTest):
+
+ def testClose(self):
+ conn, addr = self.serv.accept()
+ conn.close()
+
+ sd = self.cli
+ read, write, err = select.select([sd], [], [], 1.0)
+ self.assertEqual(read, [sd])
+ self.assertEqual(sd.recv(1), '')
+
+ def _testClose(self):
+ self.cli.connect((HOST, PORT))
+ time.sleep(1.0)
+
class BasicSocketPairTest(SocketPairTest):
def __init__(self, methodName='runTest'):
@@ -890,8 +905,8 @@
self.serv_conn.send(buf)
def test_main():
- tests = [GeneralModuleTests, BasicTCPTest, TCPTimeoutTest, TestExceptions,
- BufferIOTest]
+ tests = [GeneralModuleTests, BasicTCPTest, TCPCloserTest, TCPTimeoutTest,
+ TestExceptions, BufferIOTest]
if sys.platform != 'mac':
tests.extend([ BasicUDPTest, UDPTimeoutTest ])
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Sat Jul 1 17:33:37 2006
@@ -19,6 +19,9 @@
Library
-------
+- Bug #1513223: .close() of a _socketobj now releases the underlying
+ socket again, which then gets closed as it becomes unreferenced.
+
- The '_ctypes' extension module now works when Python is configured
with the --without-threads option.
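For readers skimming the socket.py hunk above, the pattern is worth spelling out: the forwarded method names live in a single _delegate_methods tuple, __init__ binds each one straight to the real socket, and close() rebinds them all to a dummy so the last references to the real socket's bound methods are dropped and the underlying socket can be collected. A minimal standalone sketch of that pattern (class names here are illustrative, not the actual socket module internals):

    _delegate_methods = ("recv", "recvfrom", "recv_into", "recvfrom_into",
                         "send", "sendto")

    class _Closed(object):
        def _dummy(*args):
            raise IOError("Bad file descriptor")
        # Every delegated name, plus __getattr__, points at the dummy.
        recv = recvfrom = recv_into = recvfrom_into = send = sendto = _dummy
        __getattr__ = _dummy

    class Wrapper(object):
        def __init__(self, real):
            self._sock = real
            # Bind each forwarded method directly to the real object.
            for name in _delegate_methods:
                setattr(self, name, getattr(real, name))

        def close(self):
            # Swapping every forwarded name for the dummy drops the last
            # references to the real object's bound methods, so it can be
            # garbage-collected (and the OS socket closed) once unreferenced.
            self._sock = _Closed()
            dummy = self._sock._dummy
            for name in _delegate_methods:
                setattr(self, name, dummy)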
From buildbot at python.org Sat Jul 1 17:50:02 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sat, 01 Jul 2006 15:50:02 +0000
Subject: [Python-checkins] buildbot failure in x86 cygwin trunk
Message-ID: <20060701155002.C8EB11E4002@bag.python.org>
The Buildbot has detected a new failure of x86 cygwin trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520cygwin%2520trunk/builds/912
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: martin.v.loewis
BUILD FAILED: failed failed slave lost
sincerely,
-The Buildbot
From python-checkins at python.org Sat Jul 1 18:28:22 2006
From: python-checkins at python.org (fred.drake)
Date: Sat, 1 Jul 2006 18:28:22 +0200 (CEST)
Subject: [Python-checkins] r47191 - in python/trunk:
Lib/test/crashers/xml_parsers.py Lib/test/test_pyexpat.py
Modules/expat/xmlparse.c
Message-ID: <20060701162822.398661E4002@bag.python.org>
Author: fred.drake
Date: Sat Jul 1 18:28:20 2006
New Revision: 47191
Removed:
python/trunk/Lib/test/crashers/xml_parsers.py
Modified:
python/trunk/Lib/test/test_pyexpat.py
python/trunk/Modules/expat/xmlparse.c
Log:
SF bug #1296433 (Expat bug #1515266): Unchecked calls to character data
handler would cause a segfault. This merges in Expat's lib/xmlparse.c
revisions 1.154 and 1.155, which fix this and a closely related problem
(the latter does not affect Python).
Moved the crasher test to the tests for xml.parsers.expat.
Deleted: /python/trunk/Lib/test/crashers/xml_parsers.py
==============================================================================
--- /python/trunk/Lib/test/crashers/xml_parsers.py Sat Jul 1 18:28:20 2006
+++ (empty file)
@@ -1,56 +0,0 @@
-from xml.parsers import expat
-
-# http://python.org/sf/1296433
-
-def test_parse_only_xml_data():
- #
- xml = "%s" % ('a' * 1025)
- # this one doesn't crash
- #xml = "%s" % ('a' * 10000)
-
- def handler(text):
- raise Exception
-
- parser = expat.ParserCreate()
- parser.CharacterDataHandler = handler
-
- try:
- parser.Parse(xml)
- except:
- pass
-
-if __name__ == '__main__':
- test_parse_only_xml_data()
-
-# Invalid read of size 4
-# at 0x43F936: PyObject_Free (obmalloc.c:735)
-# by 0x45A7C7: unicode_dealloc (unicodeobject.c:246)
-# by 0x1299021D: PyUnknownEncodingHandler (pyexpat.c:1314)
-# by 0x12993A66: processXmlDecl (xmlparse.c:3330)
-# by 0x12999211: doProlog (xmlparse.c:3678)
-# by 0x1299C3F0: prologInitProcessor (xmlparse.c:3550)
-# by 0x12991EA3: XML_ParseBuffer (xmlparse.c:1562)
-# by 0x1298F8EC: xmlparse_Parse (pyexpat.c:895)
-# by 0x47B3A1: PyEval_EvalFrameEx (ceval.c:3565)
-# by 0x47CCAC: PyEval_EvalCodeEx (ceval.c:2739)
-# by 0x47CDE1: PyEval_EvalCode (ceval.c:490)
-# by 0x499820: PyRun_SimpleFileExFlags (pythonrun.c:1198)
-# by 0x4117F1: Py_Main (main.c:492)
-# by 0x12476D1F: __libc_start_main (in /lib/libc-2.3.5.so)
-# by 0x410DC9: (within /home/neal/build/python/svn/clean/python)
-# Address 0x12704020 is 264 bytes inside a block of size 592 free'd
-# at 0x11B1BA8A: free (vg_replace_malloc.c:235)
-# by 0x124B5F18: (within /lib/libc-2.3.5.so)
-# by 0x48DE43: find_module (import.c:1320)
-# by 0x48E997: import_submodule (import.c:2249)
-# by 0x48EC15: load_next (import.c:2083)
-# by 0x48F091: import_module_ex (import.c:1914)
-# by 0x48F385: PyImport_ImportModuleEx (import.c:1955)
-# by 0x46D070: builtin___import__ (bltinmodule.c:44)
-# by 0x4186CF: PyObject_Call (abstract.c:1777)
-# by 0x474E9B: PyEval_CallObjectWithKeywords (ceval.c:3432)
-# by 0x47928E: PyEval_EvalFrameEx (ceval.c:2038)
-# by 0x47CCAC: PyEval_EvalCodeEx (ceval.c:2739)
-# by 0x47CDE1: PyEval_EvalCode (ceval.c:490)
-# by 0x48D0F7: PyImport_ExecCodeModuleEx (import.c:635)
-# by 0x48D4F4: load_source_module (import.c:913)
Modified: python/trunk/Lib/test/test_pyexpat.py
==============================================================================
--- python/trunk/Lib/test/test_pyexpat.py (original)
+++ python/trunk/Lib/test/test_pyexpat.py Sat Jul 1 18:28:20 2006
@@ -365,3 +365,24 @@
''', 1)
+
+
+def test_parse_only_xml_data():
+ # http://python.org/sf/1296433
+ #
+ xml = "%s" % ('a' * 1025)
+ # this one doesn't crash
+ #xml = "%s" % ('a' * 10000)
+
+ def handler(text):
+ raise Exception
+
+ parser = expat.ParserCreate()
+ parser.CharacterDataHandler = handler
+
+ try:
+ parser.Parse(xml)
+ except:
+ pass
+
+test_parse_only_xml_data()
Modified: python/trunk/Modules/expat/xmlparse.c
==============================================================================
--- python/trunk/Modules/expat/xmlparse.c (original)
+++ python/trunk/Modules/expat/xmlparse.c Sat Jul 1 18:28:20 2006
@@ -2552,6 +2552,8 @@
(int)(dataPtr - (ICHAR *)dataBuf));
if (s == next)
break;
+ if (ps_parsing == XML_FINISHED || ps_parsing == XML_SUSPENDED)
+ break;
*eventPP = s;
}
}
From buildbot at python.org Sat Jul 1 19:21:22 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sat, 01 Jul 2006 17:21:22 +0000
Subject: [Python-checkins] buildbot warnings in alpha Debian trunk
Message-ID: <20060701172122.8D5171E4002@bag.python.org>
The Buildbot has detected a new failure of alpha Debian trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/alpha%2520Debian%2520trunk/builds/438
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: martin.v.loewis
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Sat Jul 1 19:34:34 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sat, 01 Jul 2006 17:34:34 +0000
Subject: [Python-checkins] buildbot warnings in x86 cygwin trunk
Message-ID: <20060701173434.60EFD1E4002@bag.python.org>
The Buildbot has detected a new failure of x86 cygwin trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520cygwin%2520trunk/builds/913
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: fred.drake
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Sat Jul 1 19:34:41 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Sat, 1 Jul 2006 19:34:41 +0200 (CEST)
Subject: [Python-checkins] r47192 - sandbox/trunk/Doc/functional.rst
Message-ID: <20060701173441.5A07A1E4002@bag.python.org>
Author: andrew.kuchling
Date: Sat Jul 1 19:34:40 2006
New Revision: 47192
Modified:
sandbox/trunk/Doc/functional.rst
Log:
Change example from .strip() to .split() -- I used split() a lot, it turns out; fix a typo
Modified: sandbox/trunk/Doc/functional.rst
==============================================================================
--- sandbox/trunk/Doc/functional.rst (original)
+++ sandbox/trunk/Doc/functional.rst Sat Jul 1 19:34:40 2006
@@ -436,8 +436,8 @@
::
- (line.strip() for line in line_list) =>
- 'line 1', 'line 2'
+ (line.split() for line in line_list) =>
+ ['line', '1'], ['line', '2']
Generator expressions always have to be written inside parentheses, as
in the above example. The parentheses signalling a function call also
@@ -812,7 +812,7 @@
If there's a Python built-in or a module function that's suitable, you
don't need to define a new function at all::
- stripped_lines = [line.strip for line in lines]
+ stripped_lines = [line.strip() for line in lines]
existing_files = filter(os.path.exists, file_list)
If the function you need doesn't exist, you need to write it. One way
@@ -1133,7 +1133,7 @@
The author would like to thank the following people for offering
suggestions, corrections and assistance with various drafts of this
-article: Raymond Hettinger, Jim Jewett.
+article: Ian Bicking, Raymond Hettinger, Jim Jewett.
Version 0.1: posted June 30 2006.
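A quick check of the corrected examples above (line_list here is an assumed sample, not taken from the HOWTO):

    line_list = ['  line 1 \n', ' line 2\n']

    # .strip() removes surrounding whitespace; .split() breaks on whitespace.
    assert [line.strip() for line in line_list] == ['line 1', 'line 2']
    assert [line.split() for line in line_list] == [['line', '1'], ['line', '2']]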
From python-checkins at python.org Sat Jul 1 19:44:21 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Sat, 1 Jul 2006 19:44:21 +0200 (CEST)
Subject: [Python-checkins] r47193 - sandbox/trunk/Doc/functional.rst
Message-ID: <20060701174421.717771E4002@bag.python.org>
Author: andrew.kuchling
Date: Sat Jul 1 19:44:20 2006
New Revision: 47193
Modified:
sandbox/trunk/Doc/functional.rst
Log:
More typo fixes and a few edits
Modified: sandbox/trunk/Doc/functional.rst
==============================================================================
--- sandbox/trunk/Doc/functional.rst (original)
+++ sandbox/trunk/Doc/functional.rst Sat Jul 1 19:44:20 2006
@@ -838,8 +838,8 @@
def adder(x,y):
return x + y
-Which alternative is preferable? That's a style question; my general
-view is to avoid it.
+Which alternative is preferable? That's a style question; my usual
+view is to avoid using ``lambda``.
``lambda`` is quite limited in the functions it can define. The
result has to be computable as a single expression, which means you
@@ -852,16 +852,16 @@
freq = reduce(lambda a, b: (0, a[1] + b[1]), items)[1]
-You can figure it out, but it takes time to disentangle the function
+You can figure it out, but it takes time to disentangle the expression
to figure out what's going on. Using a short nested
``def`` statements makes things a little bit better::
def combine (a, b):
return 0, a[1] + b[1]
- return reduce(combine_freq, items)[1]
+ return reduce(combine, items)[1]
-It would be best of all if I had simply used a ``for`` loop::
+But it would be best of all if I had simply used a ``for`` loop::
total = 0
for a, b in items:
@@ -877,7 +877,7 @@
4) Convert the lambda to a def statement, using that name.
5) Remove the comment.
-I really like these rules, but you're free todisagree that this style
+I really like these rules, but you're free to disagree that this style
is better.
@@ -1133,7 +1133,7 @@
The author would like to thank the following people for offering
suggestions, corrections and assistance with various drafts of this
-article: Ian Bicking, Raymond Hettinger, Jim Jewett.
+article: Ian Bicking, Raymond Hettinger, Jim Jewett, Leandro Lameiro.
Version 0.1: posted June 30 2006.
From python-checkins at python.org Sat Jul 1 19:45:50 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Sat, 1 Jul 2006 19:45:50 +0200 (CEST)
Subject: [Python-checkins] r47194 - sandbox/trunk/Doc/functional.rst
Message-ID: <20060701174550.A3DCA1E4014@bag.python.org>
Author: andrew.kuchling
Date: Sat Jul 1 19:45:50 2006
New Revision: 47194
Modified:
sandbox/trunk/Doc/functional.rst
Log:
Bump version; move commented-out stuff to the very end
Modified: sandbox/trunk/Doc/functional.rst
==============================================================================
--- sandbox/trunk/Doc/functional.rst (original)
+++ sandbox/trunk/Doc/functional.rst Sat Jul 1 19:45:50 2006
@@ -1,7 +1,7 @@
Functional Programming HOWTO
================================
-**Version 0.1**
+**Version 0.11**
(This is a first draft. Please send comments/error
reports/suggestions to amk at amk.ca. This URL is probably not going to
@@ -1137,6 +1137,39 @@
Version 0.1: posted June 30 2006.
+Version 0.11: posted July 1 2006.
+
+References
+--------------------
+
+General
+'''''''''''''''
+
+**Structure and Interpretation of Computer Programs**, by
+Harold Abelson and Gerald Jay Sussman with Julie Sussman.
+
+Full text at http://mitpress.mit.edu/sicp/.
+
+A classic textbook of computer science. Chapters 2 and 3 discuss the
+use of sequences and streams to organize the data flow inside a
+program. The book uses Scheme for its examples, but many of the
+design approaches described in these chapters are applicable to
+functional-style Python code.
+
+http://en.wikipedia.org/wiki/Functional_programming:
+General Wikipedia entry describing functional programming.
+
+
+Python documentation
+'''''''''''''''''''''''''''
+
+http://docs.python.org/lib/module-itertools.html:
+Documentation ``for the itertools`` module.
+
+http://docs.python.org/lib/module-operator.html:
+Documentation ``for the operator`` module.
+
+
.. comment
Topics to place
@@ -1197,33 +1230,3 @@
print elem[-1]
-References
---------------------
-
-General
-'''''''''''''''
-
-**Structure and Interpretation of Computer Programs**, by
-Harold Abelson and Gerald Jay Sussman with Julie Sussman.
-
-Full text at http://mitpress.mit.edu/sicp/.
-
-A classic textbook of computer science. Chapters 2 and 3 discuss the
-use of sequences and streams to organize the data flow inside a
-program. The book uses Scheme for its examples, but many of the
-design approaches described in these chapters are applicable to
-functional-style Python code.
-
-http://en.wikipedia.org/wiki/Functional_programming:
-General Wikipedia entry describing functional programming.
-
-
-Python documentation
-'''''''''''''''''''''''''''
-
-http://docs.python.org/lib/module-itertools.html:
-Documentation ``for the itertools`` module.
-
-http://docs.python.org/lib/module-operator.html:
-Documentation ``for the operator`` module.
-
From buildbot at python.org Sat Jul 1 22:39:10 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sat, 01 Jul 2006 20:39:10 +0000
Subject: [Python-checkins] buildbot warnings in x86 XP-2 trunk
Message-ID: <20060701203910.A75AF1E4002@bag.python.org>
The Buildbot has detected a new failure of x86 XP-2 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520XP-2%2520trunk/builds/720
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: fred.drake,georg.brandl,martin.v.loewis,vinay.sajip
Build Had Warnings: warnings clean
sincerely,
-The Buildbot
From python-checkins at python.org Sun Jul 2 19:48:31 2006
From: python-checkins at python.org (gerhard.haering)
Date: Sun, 2 Jul 2006 19:48:31 +0200 (CEST)
Subject: [Python-checkins] r47197 - in python/trunk:
Lib/sqlite3/test/types.py Misc/NEWS Modules/_sqlite/cursor.c
Modules/_sqlite/module.h
Message-ID: <20060702174831.0201A1E4008@bag.python.org>
Author: gerhard.haering
Date: Sun Jul 2 19:48:30 2006
New Revision: 47197
Modified:
python/trunk/Lib/sqlite3/test/types.py
python/trunk/Misc/NEWS
python/trunk/Modules/_sqlite/cursor.c
python/trunk/Modules/_sqlite/module.h
Log:
The sqlite3 module did cut off data from the SQLite database at the first null
character before sending it to a custom converter. This has been fixed now.
Modified: python/trunk/Lib/sqlite3/test/types.py
==============================================================================
--- python/trunk/Lib/sqlite3/test/types.py (original)
+++ python/trunk/Lib/sqlite3/test/types.py Sun Jul 2 19:48:30 2006
@@ -21,7 +21,7 @@
# misrepresented as being the original software.
# 3. This notice may not be removed or altered from any source distribution.
-import datetime
+import bz2, datetime
import unittest
import sqlite3 as sqlite
@@ -273,6 +273,23 @@
val = self.cur.fetchone()[0]
self.failUnlessEqual(type(val), float)
+class BinaryConverterTests(unittest.TestCase):
+ def convert(s):
+ return bz2.decompress(s)
+ convert = staticmethod(convert)
+
+ def setUp(self):
+ self.con = sqlite.connect(":memory:", detect_types=sqlite.PARSE_COLNAMES)
+ sqlite.register_converter("bin", BinaryConverterTests.convert)
+
+ def tearDown(self):
+ self.con.close()
+
+ def CheckBinaryInputForConverter(self):
+ testdata = "abcdefg" * 10
+ result = self.con.execute('select ? as "x [bin]"', (buffer(bz2.compress(testdata)),)).fetchone()[0]
+ self.failUnlessEqual(testdata, result)
+
class DateTimeTests(unittest.TestCase):
def setUp(self):
self.con = sqlite.connect(":memory:", detect_types=sqlite.PARSE_DECLTYPES)
@@ -322,8 +339,9 @@
decltypes_type_suite = unittest.makeSuite(DeclTypesTests, "Check")
colnames_type_suite = unittest.makeSuite(ColNamesTests, "Check")
adaptation_suite = unittest.makeSuite(ObjectAdaptationTests, "Check")
+ bin_suite = unittest.makeSuite(BinaryConverterTests, "Check")
date_suite = unittest.makeSuite(DateTimeTests, "Check")
- return unittest.TestSuite((sqlite_type_suite, decltypes_type_suite, colnames_type_suite, adaptation_suite, date_suite))
+ return unittest.TestSuite((sqlite_type_suite, decltypes_type_suite, colnames_type_suite, adaptation_suite, bin_suite, date_suite))
def test():
runner = unittest.TextTestRunner()
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Sun Jul 2 19:48:30 2006
@@ -45,6 +45,10 @@
- A bug was fixed in logging.config.fileConfig() which caused a crash on
shutdown when fileConfig() was called multiple times.
+- The sqlite3 module did cut off data from the SQLite database at the first
+ null character before sending it to a custom converter. This has been fixed
+ now.
+
Extension Modules
-----------------
Modified: python/trunk/Modules/_sqlite/cursor.c
==============================================================================
--- python/trunk/Modules/_sqlite/cursor.c (original)
+++ python/trunk/Modules/_sqlite/cursor.c Sun Jul 2 19:48:30 2006
@@ -321,12 +321,13 @@
}
if (converter != Py_None) {
- val_str = (const char*)sqlite3_column_text(self->statement->st, i);
+ nbytes = sqlite3_column_bytes(self->statement->st, i);
+ val_str = (const char*)sqlite3_column_blob(self->statement->st, i);
if (!val_str) {
Py_INCREF(Py_None);
converted = Py_None;
} else {
- item = PyString_FromString(val_str);
+ item = PyString_FromStringAndSize(val_str, nbytes);
if (!item) {
return NULL;
}
Modified: python/trunk/Modules/_sqlite/module.h
==============================================================================
--- python/trunk/Modules/_sqlite/module.h (original)
+++ python/trunk/Modules/_sqlite/module.h Sun Jul 2 19:48:30 2006
@@ -25,7 +25,7 @@
#define PYSQLITE_MODULE_H
#include "Python.h"
-#define PYSQLITE_VERSION "2.3.1"
+#define PYSQLITE_VERSION "2.3.2"
extern PyObject* Error;
extern PyObject* Warning;
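A minimal standalone sketch of the behaviour being fixed, using the same PARSE_COLNAMES mechanism as the new test; the converter name and sample data here are illustrative, and buffer() is the Python 2 way to bind a BLOB parameter:

    import sqlite3

    def identity(value):
        # With the fix the converter sees all 12 bytes, including the NUL.
        return value

    sqlite3.register_converter("bin", identity)
    con = sqlite3.connect(":memory:", detect_types=sqlite3.PARSE_COLNAMES)

    data = "before\x00after"
    row = con.execute('select ? as "x [bin]"', (buffer(data),)).fetchone()
    assert row[0] == data   # previously truncated to "before"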
From python-checkins at python.org Sun Jul 2 20:44:01 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Sun, 2 Jul 2006 20:44:01 +0200 (CEST)
Subject: [Python-checkins] r47198 - in python/trunk: Lib/test/test_os.py
Misc/NEWS Modules/posixmodule.c
Message-ID: <20060702184401.30A741E4008@bag.python.org>
Author: martin.v.loewis
Date: Sun Jul 2 20:44:00 2006
New Revision: 47198
Modified:
python/trunk/Lib/test/test_os.py
python/trunk/Misc/NEWS
python/trunk/Modules/posixmodule.c
Log:
Correct arithmetic in access on Win32. Fixes #1513646.
Modified: python/trunk/Lib/test/test_os.py
==============================================================================
--- python/trunk/Lib/test/test_os.py (original)
+++ python/trunk/Lib/test/test_os.py Sun Jul 2 20:44:00 2006
@@ -11,6 +11,19 @@
warnings.filterwarnings("ignore", "tempnam", RuntimeWarning, __name__)
warnings.filterwarnings("ignore", "tmpnam", RuntimeWarning, __name__)
+# Tests creating TESTFN
+class FileTests(unittest.TestCase):
+ def setUp(self):
+ if os.path.exists(test_support.TESTFN):
+ os.unlink(test_support.TESTFN)
+ tearDown = setUp
+
+ def test_access(self):
+ f = os.open(test_support.TESTFN, os.O_CREAT|os.O_RDWR)
+ os.close(f)
+ self.assert_(os.access(test_support.TESTFN, os.W_OK))
+
+
class TemporaryFileTests(unittest.TestCase):
def setUp(self):
self.files = []
@@ -393,6 +406,7 @@
def test_main():
test_support.run_unittest(
+ FileTests,
TemporaryFileTests,
StatAttributeTests,
EnvironTests,
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Sun Jul 2 20:44:00 2006
@@ -52,6 +52,9 @@
Extension Modules
-----------------
+- Bug #1513646: os.access on Windows now correctly determines write
+ access, again.
+
- Bug #1512695: cPickle.loads could crash if it was interrupted with
a KeyboardInterrupt.
Modified: python/trunk/Modules/posixmodule.c
==============================================================================
--- python/trunk/Modules/posixmodule.c (original)
+++ python/trunk/Modules/posixmodule.c Sun Jul 2 20:44:00 2006
@@ -1402,7 +1402,7 @@
return PyBool_FromLong(0);
/* Access is possible if either write access wasn't requested, or
the file isn't read-only. */
- return PyBool_FromLong(!(mode & 2) || !(attr && FILE_ATTRIBUTE_READONLY));
+ return PyBool_FromLong(!(mode & 2) || !(attr & FILE_ATTRIBUTE_READONLY));
#else
int res;
if (!PyArg_ParseTuple(args, "eti:access",
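The one-character change above replaces a logical AND with a bitwise AND, so the read-only attribute bit is actually tested. A small sketch of the behaviour this restores (run as an unprivileged user; the temp-file handling is illustrative):

    import os, stat, tempfile

    fd, path = tempfile.mkstemp()
    os.close(fd)
    try:
        assert os.access(path, os.W_OK)       # a freshly created file is writable
        os.chmod(path, stat.S_IREAD)          # read-only (FILE_ATTRIBUTE_READONLY on Windows)
        assert not os.access(path, os.W_OK)   # write access must now be denied
    finally:
        os.chmod(path, stat.S_IREAD | stat.S_IWRITE)
        os.unlink(path)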
From python-checkins at python.org Sun Jul 2 22:02:25 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Sun, 2 Jul 2006 22:02:25 +0200 (CEST)
Subject: [Python-checkins] r47199 - sandbox/trunk/Doc/functional.rst
Message-ID: <20060702200225.058641E4008@bag.python.org>
Author: andrew.kuchling
Date: Sun Jul 2 22:02:24 2006
New Revision: 47199
Modified:
sandbox/trunk/Doc/functional.rst
Log:
Correct explanation of 'if' clauses in listcomps; untabify
Modified: sandbox/trunk/Doc/functional.rst
==============================================================================
--- sandbox/trunk/Doc/functional.rst (original)
+++ sandbox/trunk/Doc/functional.rst Sun Jul 2 22:02:24 2006
@@ -1,7 +1,7 @@
Functional Programming HOWTO
================================
-**Version 0.11**
+**Version 0.12**
(This is a first draft. Please send comments/error
reports/suggestions to amk at amk.ca. This URL is probably not going to
@@ -373,14 +373,17 @@
List comprehensions have the form::
[ expression for expr in sequence1
+ if condition1
for expr2 in sequence2
+ if condition2
for expr3 in sequence3 ...
+ if condition3
for exprN in sequenceN
- if condition ]
+ if conditionN ]
The elements of the generated list will be the successive
-values of ``expression``. The final ``if`` clause is
-optional; if present, ``expression`` is only evaluated and added to
+values of ``expression``. The ``if`` clauses are
+all optional; if present, ``expression`` is only evaluated and added to
the result when ``condition`` is true.
The ``for...in`` clauses contain the sequences to be iterated over.
@@ -394,13 +397,18 @@
Python code::
for expr1 in sequence1:
+ if not (condition1):
+ continue
for expr2 in sequence2:
- ...
+ if not (condition2):
+ continue
+ ...
for exprN in sequenceN:
- if (condition):
- # Append the value of
- # the expression to the
- # resulting list.
+ if not (conditionN):
+ continue
+ # Append the value of
+ # the expression to the
+ # resulting list.
This means that when there are multiple ``for...in``
clauses, the resulting list will be equal to the product of the
@@ -444,7 +452,7 @@
count, so if you want to create an iterator that will be immediately
passed to a function you could write::
- obj_total = sum(obj.count for obj in list_all_objects())
+ obj_total = sum(obj.count for obj in list_all_objects())
Generators
@@ -782,7 +790,7 @@
[9878, 9828, 8442, 7953, 6431, 6213, 2207, 769]
(For a more detailed discussion of sorting, see the Sorting mini-HOWTO
-in the Python wiki at http://wiki.python.org/moin/SortingHowto.)
+in the Python wiki at http://wiki.python.org/moin/HowTo/Sorting.)
The ``any(iter)`` and ``all(iter)`` built-ins look at
the truth values of an iterable's contents. ``any()`` returns
@@ -857,7 +865,7 @@
``def`` statements makes things a little bit better::
def combine (a, b):
- return 0, a[1] + b[1]
+ return 0, a[1] + b[1]
return reduce(combine, items)[1]
@@ -1133,7 +1141,8 @@
The author would like to thank the following people for offering
suggestions, corrections and assistance with various drafts of this
-article: Ian Bicking, Raymond Hettinger, Jim Jewett, Leandro Lameiro.
+article: Ian Bicking, Nick Coghlan, Raymond Hettinger, Jim Jewett,
+Leandro Lameiro.
Version 0.1: posted June 30 2006.
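To make the corrected expansion above concrete, here is a small example with two for clauses and two if clauses next to its equivalent nested loops (the data is made up for illustration):

    pairs = [(x, y)
             for x in range(5) if x % 2 == 0
             for y in range(5) if y > x]

    expected = []
    for x in range(5):
        if not (x % 2 == 0):
            continue
        for y in range(5):
            if not (y > x):
                continue
            expected.append((x, y))

    assert pairs == expected   # both give [(0, 1), (0, 2), ..., (2, 3), (2, 4)]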
From python-checkins at python.org Sun Jul 2 22:33:27 2006
From: python-checkins at python.org (steven.bethard)
Date: Sun, 2 Jul 2006 22:33:27 +0200 (CEST)
Subject: [Python-checkins] r47200 - peps/trunk/pep-3099.txt
Message-ID: <20060702203327.460CC1E4008@bag.python.org>
Author: steven.bethard
Date: Sun Jul 2 22:33:26 2006
New Revision: 47200
Modified:
peps/trunk/pep-3099.txt
Log:
Add rejected "globals.foo" proposal.
Modified: peps/trunk/pep-3099.txt
==============================================================================
--- peps/trunk/pep-3099.txt (original)
+++ peps/trunk/pep-3099.txt Sun Jul 2 22:33:26 2006
@@ -110,6 +110,12 @@
list. Do ``from __future__ import braces`` to get a definitive
answer on this subject.
+* Referencing the global name ``foo`` will not be spelled ``globals.foo``.
+
+ Thread: "replace globals() and global statement with global builtin
+ object",
+ http://mail.python.org/pipermail/python-3000/2006-July/002485.html
+
Builtins
========
From python-checkins at python.org Sun Jul 2 23:27:17 2006
From: python-checkins at python.org (matt.fleming)
Date: Sun, 2 Jul 2006 23:27:17 +0200 (CEST)
Subject: [Python-checkins] r47201 - sandbox/trunk/pdb/mpdb.py
sandbox/trunk/pdb/mthread.py
Message-ID: <20060702212717.A824C1E4008@bag.python.org>
Author: matt.fleming
Date: Sun Jul 2 23:27:17 2006
New Revision: 47201
Modified:
sandbox/trunk/pdb/mpdb.py
sandbox/trunk/pdb/mthread.py
Log:
Move all the thread debugging code into mthread.py; arguments can now be passed to the script being debugged.
Modified: sandbox/trunk/pdb/mpdb.py
==============================================================================
--- sandbox/trunk/pdb/mpdb.py (original)
+++ sandbox/trunk/pdb/mpdb.py Sun Jul 2 23:27:17 2006
@@ -58,14 +58,6 @@
self.lastcmd = ''
self.connection = None
self.debug_thread = False
- # We don't trace the MainThread, so the tracer for the main thread is
- # None.
- self.tracers = [None]
- self.threads = [threading.currentThread()]
- self.current_thread = self.threads[0]
-
- self._info_cmds.append('target')
- self._info_cmds.append('thread')
def _rebind_input(self, new_input):
""" This method rebinds the debugger's input to the object specified
@@ -173,17 +165,6 @@
args = arg.split()
if 'target'.startswith(args[0]) and len(args[0]) > 2:
self.msg("target is %s" % self.target)
- elif 'thread'.startswith(args[0]) and len(args[0])> 2:
- if not self.debug_thread:
- self.errmsg('Thread debugging is not on.')
- return
- # We need some way to remove old thread instances
- for t in self.threads:
- if t == self.current_thread:
- self.msg('* %d %s' % (self.threads.index(t)+1, t))
- else:
- self.msg(' %d %s' % (self.threads.index(t)+1, t))
- return
else:
pydb.Pdb.do_info(self, arg)
@@ -199,8 +180,6 @@
self.msg_nocr("info %s --" % cmd)
if 'target'.startswith(cmd):
self.msg("Names of targets and files being debugged")
- elif 'thread'.startswith(cmd):
- self.msg('Information about active and inactive threads.')
else:
pydb.Pdb.info_helper(self, cmd)
@@ -212,29 +191,16 @@
args = arg.split()
if 'thread'.startswith(args[0]):
- threading.settrace(self.thread_trace_dispatch)
+ try:
+ import mthread
+ except ImportError:
+ self.errmsg('Thread debugging is not on')
+ return
+ mthread.init(self.msg, self.errmsg)
self.msg('Thread debugging on')
self.debug_thread = True
return
- def thread_trace_dispatch(self, frame, event, arg):
- """ Create an MTracer object so trace the thread. """
- # This method is called when a thread is being created with the
- # threading module. The _MainThread is no longer of primary concern,
- # this new thread is.
- try:
- from mthread import MTracer
- except ImportError:
- self.errmsg('Could not import mthread.MTracer')
- sys.settrace(None)
- return # Thread not being traced
-
- self.threads.append(threading.currentThread())
- self.msg('New thread: %s' % self.threads[-1])
- m = MTracer(self.breaks, self.filename, self.stdout)
- self.tracers.append(m)
- sys.settrace(m.trace_dispatch)
-
# Debugger commands
def do_attach(self, addr):
""" Attach to a process or file outside of Pdb.
@@ -443,84 +409,13 @@
self.msg("Re exec'ing\n\t%s" % self._sys_argv)
os.execvp(self._sys_argv[0], self._sys_argv)
-
-
- def do_thread(self, arg):
- """Use this command to switch between threads.
-The new thread ID must be currently known.
-
-List of thread subcommands:
-
-thread apply -- Apply a command to a thread
-
-Type "help thread" followed by thread subcommand name for full documentation.
-Command name abbreviations are allowed if unambiguous.
-"""
- if not self.debug_thread:
- self.errmsg('Thread debugging not on.')
- return
- args = arg.split()
- if len(args) == 0:
- self.msg('Current thread is %d (%s)' % \
- (self.threads.index(self.current_thread)+1,
- self.current_thread))
- return
- if len(args) < 2:
- if args[0].isdigit():
- # XXX Switch to a different thread, although this doesn't
- # actually do anything yet.
- t_num = int(args[0])-1
- if t_num > len(self.threads):
- self.errmsg('Thread ID %d not known.' % t_num+1)
- return
- self.current_thread = self.threads[t_num]
- self.msg('Switching to thread %d (%s)' % (t_num+1, \
- self.current_thread))
- return
- self.errmsg('Please specify a Thread ID')
- return
- if len(args) < 3:
- self.errmsg('Please specify a command following the thread' \
- + ' ID')
- return
- if len(self.threads) == 0:
- self.errmsg('No threads')
- return
- if len(self.threads) < int(args[1]):
- self.errmsg('Thread ID %d not known.' % int(args[1]))
- return
- # These should always be in sync
- t = self.threads[int(args[1])-1]
- t_tracer = self.tracers[int(args[1])-1]
- func = args[2]
- if len(args) > 2:
- str_params = ""
- for w in args[3:]:
- str_params += w
- str_params.rstrip()
- eval('t_tracer.do_' + func + '(str_params)')
- #except AttributeError:
- # self.errmsg('No such thread subcommand')
- # return
- else:
- try:
- eval('t_tracer.do_'+func+'()')
- except AttributeError:
- self.errmsg('Undefined thread apply subcommand "%s".' \
- % args[0])
- return
-
-def pdbserver(addr):
+def pdbserver(addr, m):
""" This method sets up a pdbserver debugger that allows debuggers
to connect to 'addr', which a protocol-specific address, i.e.
tcp = 'tcp mydomainname.com:9876'
serial = 'serial /dev/ttyC0'
"""
- m = MPdb()
- m._sys_argv = ['python']
- for i in sys.argv:
- m._sys_argv.append(i)
- m._program_sys_argv = sys.argv[1:]
+ m._program_sys_argv = list(m._sys_argv[2:])
m.mainpyfile = m._program_sys_argv[1]
m.do_pdbserver(addr)
while True:
@@ -549,37 +444,38 @@
def main():
""" Main entry point to this module. """
opts, args = parse_opts()
+
+ mpdb = MPdb()
+ mpdb._sys_argv = ['python']
+ for i in sys.argv:
+ mpdb._sys_argv.append(i)
+ mpdb._program_sys_argv = mpdb._sys_argv[4:]
+ sys.argv = list(args)
+ if not opts.scriptname:
+ if not args and not opts.target:
+ print 'Error: mpdb.py must be called with a script name if ' \
+ + '-p or -t switches are not specified.'
+ sys.exit(1)
+ elif not opts.target:
+ mainpyfile = args[0]
+ else:
+ mainpyfile = opts.scriptname
if opts.target:
target(opts.target)
sys.exit()
elif opts.pdbserver:
- pdbserver(opts.pdbserver)
+ pdbserver(opts.pdbserver, mpdb)
sys.exit()
- else:
- if not opts.scriptname:
- if not args:
- print 'Error: mpdb.py must be called with a script name if ' \
- + '-p or -t switches are not specified.'
- sys.exit(1)
- else:
- mainpyfile = args[0]
- if not os.path.exists(mainpyfile):
- print 'Error:', mainpyfile, 'does not exist'
- sys.exit(1)
- mpdb = MPdb()
+
while 1:
try:
- mpdb._sys_argv = ['python']
- for i in sys.argv:
- mpdb._sys_argv.append(i)
- mpdb._program_sys_argv = mpdb._sys_argv[1:]
mpdb._runscript(mainpyfile)
if mpdb._user_requested_quit:
break
mpdb.msg("The program finished and will be restarted")
except Restart:
sys.argv = list(mpdb._program_sys_argv)
- mpdb.msg('Restarting with %s with arguments:\n\t%s'
+ mpdb.msg('Restarting %s with arguments:\n\t%s'
% (mpdb.filename(mainpyfile),
' '.join(mpdb._program_sys_argv[1:])))
except SystemExit:
@@ -605,6 +501,7 @@
# Parse arguments
def parse_opts():
parser = OptionParser()
+ parser.disable_interspersed_args()
parser.add_option("-s", "--script", dest="scriptname",
help="The script to debug")
parser.add_option("-t", "--target", dest="target",
@@ -615,7 +512,7 @@
+ "command. The arguments should be of the form," \
+ " 'protocol address scriptname'.")
(options, args) = parser.parse_args()
- return (options,args)
+ return (options, args)
if __name__ == '__main__':
main()
Modified: sandbox/trunk/pdb/mthread.py
==============================================================================
--- sandbox/trunk/pdb/mthread.py (original)
+++ sandbox/trunk/pdb/mthread.py Sun Jul 2 23:27:17 2006
@@ -13,11 +13,10 @@
which is useful, for instance, if a breakpoint occurs inside
a thread's run() method.
"""
- def __init__(self, breaks={}, filename=None, stdout=None):
+ def __init__(self, msg, errmsg, breaks={}, filename=None):
self.thread = threading.currentThread()
- if stdout is None:
- stdout = sys.stdout
- self.out = stdout
+ self.msg = msg
+ self.errmsg = errmsg
# Each tracer instance must keep track of its own breakpoints
self.breaks = breaks
self.fncache = {}
@@ -115,10 +114,10 @@
except TypeError:
err = self.set_break(filename, line, temporary, cond)
- if err: print >> self.out, err
+ if err: self.msg, err
else:
bp = self.get_breaks(filename, line)[-1]
- print >> self.out, "Breakpoint %d set in file %s, line %d." \
+ self.msg, "Breakpoint %d set in file %s, line %d." \
% (bp.number, self.filename(bp.file), bp.line)
def __parse_filepos(self, arg):
@@ -132,8 +131,8 @@
filename = arg[:colon].rstrip()
f = self.lookupmodule(filename)
if not f:
- print >> self.out, "%s not found from sys.path" % \
- self._saferepr(filename)
+ self.msg("%s not found from sys.path" %
+ self._saferepr(filename))
return (None, None, None)
else:
filename = f
@@ -141,7 +140,7 @@
try:
lineno = int(arg)
except ValueError, msg:
- print >> self.out, 'Bad lineno: %s' % str(arg)
+ self.msg('Bad lineno: %s' % str(arg))
return (None, filename, None)
return (None, filename, lineno)
else:
@@ -177,10 +176,10 @@
# last thing to try
(ok, filename, ln) = self.lineinfo(arg)
if not ok:
- print >> self.out, 'The specified object %s is not' % \
- str(repr(arg)),
- print >> self.out, ' a function, or not found' \
- +' along sys.path or no line given.'
+ self.msg('The specified object %s is not ' \
+ ' a function, or not found' \
+ ' along sys.path or no line given.' %
+ str(repr(arg)))
return (None, None, None)
funcname = ok # ok contains a function name
@@ -237,29 +236,29 @@
"""
line = linecache.getline(filename, lineno)
if not line:
- print >>self.out, 'End of file'
+ self.errmsg('End of file')
return 0
line = line.strip()
# Don't allow setting breakpoint at a blank line
if (not line or (line[0] == '#') or
(line[:3] == '"""') or line[:3] == "'''"):
- print >>self.out, '*** Blank or comment'
+ self.errmsg('Blank or comment')
return 0
return lineno
def trace_dispatch(self, frame, event, arg):
self.curframe = frame
if event == 'line':
- print >> self.out, self.thread.getName(),'*** line'
- return self.trace_dispatch
+ self.msg('%s *** line' % self.thread.getName())
+ return self.dispatch_line
if event == 'call':
- print >> self.out, self.thread.getName(), '*** call'
+ self.msg('%s *** call' % self.thread.getName())
return self.trace_dispatch
if event == 'return':
- print >> self.out, self.thread.getName(), '*** return'
+ self.msg('%s *** return' % self.thread.getName())
return self.trace_dispatch
if event == 'exception':
- print >> self.out, '*** exception'
+ self.msg('%s *** exception' % self.thread.getName())
return self.trace_dispatch
if event == 'c_call':
print '*** c_call'
@@ -272,3 +271,57 @@
return self.trace_dispatch
print 'bdb.Bdb.dispatch: unknown debugging event:', repr(event)
return self.trace_dispatch
+
+ def dispatch_line(self, frame, event, arg):
+ print frame.f_code.co_filename, self.thread.getName()
+ return self.trace_dispatch
+
+class ThreadDebug(object):
+ def __init__(self, msg, errmsg):
+ self.msg = msg
+ self.errmsg = errmsg
+ self.threads = []
+ self.tracers = []
+ self.current_thread = None
+
+ def trace_dispatch_init(self, frame, event, arg):
+ t = threading.currentThread()
+ self.threads.append(t)
+ m = MTracer(self.msg, self.errmsg)
+ self.tracers.append(m)
+
+ sys.settrace(m.trace_dispatch)
+
+ def get_current_thread(self):
+ self.msg('Current thread is %d (%s)' % \
+ (self.threads.index(self.current_thread)+1,
+ self.current_thread))
+ return
+
+ def set_current_thread(self, args):
+ # XXX Switch to a different thread, although this doesn't
+ # actually do anything yet.
+ t_num = int(args)-1
+ if t_num > len(self.threads):
+ self.errmsg('Thread ID %d not known.' % t_num+1)
+ return
+ self.current_thread = self.threads[t_num]
+ self.msg('Switching to thread %d (%s)' % (t_num+1, \
+ self.current_thread))
+ return
+
+def init(msg, errmsg):
+ """ This method sets up thread debugging by creating a ThreadDebug
+ object that creates MTracer objects for every thread that is created.
+ 'msg' is a method to write standard output to, and 'errmsg' is a method
+ to write error output to.
+ """
+ t = ThreadDebug(msg, errmsg)
+ threading.settrace(t.trace_dispatch_init)
+ #sys.settrace(t.trace_dispatch_init)
+
+
+
+
+
+
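The key mechanism behind mthread.init() above is that threading.settrace() installs a hook which runs inside each newly started thread, and that hook can then call sys.settrace() to give the thread its own tracer. A minimal sketch of that pattern (the function and variable names here are illustrative, not mthread's):

    import sys
    import threading

    def make_init_hook(msg):
        def trace_dispatch_init(frame, event, arg):
            # Runs in the new thread on its first 'call' event.
            name = threading.currentThread().getName()
            def tracer(frame, event, arg):
                if event == 'call':
                    msg('%s *** call %s' % (name, frame.f_code.co_name))
                return tracer
            sys.settrace(tracer)        # this thread now uses its own tracer
            return tracer
        return trace_dispatch_init

    threading.settrace(make_init_hook(lambda s: sys.stdout.write(s + '\n')))

    def work():
        sum(range(10))

    t = threading.Thread(target=work)
    t.start()
    t.join()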
From python-checkins at python.org Mon Jul 3 01:14:31 2006
From: python-checkins at python.org (matt.fleming)
Date: Mon, 3 Jul 2006 01:14:31 +0200 (CEST)
Subject: [Python-checkins] r47202 - in sandbox/trunk/pdb: mconnection.py
test/Makefile test/test_mpdb.py test/test_mthread.py
Message-ID: <20060702231431.247FE1E4008@bag.python.org>
Author: matt.fleming
Date: Mon Jul 3 01:14:30 2006
New Revision: 47202
Added:
sandbox/trunk/pdb/test/test_mthread.py
Modified:
sandbox/trunk/pdb/mconnection.py
sandbox/trunk/pdb/test/Makefile
sandbox/trunk/pdb/test/test_mpdb.py
Log:
A bug in Python 2.5 meant we had to call shutdown on the socket, but it's
now been fixed. Also move the tests for the thread code into a separate file.
Modified: sandbox/trunk/pdb/mconnection.py
==============================================================================
--- sandbox/trunk/pdb/mconnection.py (original)
+++ sandbox/trunk/pdb/mconnection.py Mon Jul 3 01:14:30 2006
@@ -126,7 +126,7 @@
def disconnect(self):
if self.output is None or self._sock is None:
return
- self.output.shutdown(socket.SHUT_RDWR)
+ self.output.close()
self._sock.close()
self._sock = None
self.listening = False
Modified: sandbox/trunk/pdb/test/Makefile
==============================================================================
--- sandbox/trunk/pdb/test/Makefile (original)
+++ sandbox/trunk/pdb/test/Makefile Mon Jul 3 01:14:30 2006
@@ -8,13 +8,16 @@
PY = python
-.PHONY: all test test_mpdb test_mconnection
+.PHONY: all test test_mpdb test_mconnection test_mthread
all: test
-test: test_mpdb test_mconnection
+test: test_mpdb test_mconnection test_mthread
test_mpdb:
@$(PY) test_mpdb.py
test_mconnection:
@$(PY) test_mconnection.py
+
+test_mthread:
+ @$(PY) test_mthread.py
Modified: sandbox/trunk/pdb/test/test_mpdb.py
==============================================================================
--- sandbox/trunk/pdb/test/test_mpdb.py (original)
+++ sandbox/trunk/pdb/test/test_mpdb.py Mon Jul 3 01:14:30 2006
@@ -3,7 +3,9 @@
import os
import sys
import socket
+import time
import thread
+import threading
import unittest
from test import test_support
@@ -14,7 +16,7 @@
MAXTRIES = 100
sys.path.append("..")
-from mpdb import MPdb, pdbserver, target
+from mpdb import MPdb, pdbserver, target, Exit
from mconnection import (MConnectionClientTCP, MConnectionServerTCP,
ConnectionFailed)
@@ -25,7 +27,7 @@
if address is None:
address = __addr__
client.connection = MConnectionClientTCP()
-
+
while True:
try:
client.connection.connect(address)
@@ -44,6 +46,24 @@
def msg_nocr(self, msg):
self.lines.append(msg)
+
+class Pdbserver(threading.Thread, MPdb):
+ def __init__(self):
+ MPdb.__init__(self)
+ threading.Thread.__init__(self)
+ self.botframe = None
+ script = os.path.abspath('thread_script.py')
+ self._sys_argv = [script]
+
+
+ def run(self):
+ self.do_pdbserver('tcp localhost:8000')
+ while True:
+ try:
+ self.cmdloop()
+ except Exit:
+ break
+
class TestRemoteDebugging(unittest.TestCase):
""" Test Case to make sure debugging remotely works properly. """
@@ -100,6 +120,10 @@
self.assertEquals(errmsg, line)
+ server.disconnect()
+ while server._sock != None:
+ time.sleep(0.1)
+
def testRebindOutput(self):
""" Test rebinding output. """
self.server = MPdb()
@@ -128,46 +152,24 @@
f.close()
self.assertEquals(line, 'help', 'Could not rebind input.')
- def testThread(self):
- """ Test the thread command. """
- server = MConnectionServerTCP()
-
- thread.start_new_thread(server.connect, (__addr__,))
+ def testRestart(self):
+ """ Test the restart command. """
+ server = Pdbserver()
+ server.start()
self.client1 = MPdbTest()
- connect_to_target(self.client1)
+ self.client1.do_target('tcp localhost:8000')
- # Turn on thread debugging
- self.client1.onecmd('set thread')
- line = self.client1.lines[0]
- self.assertEquals('Thread debugging on\n', line)
-
- # Thread with no commands should return current thread
- self.client1.onecmd('thread')
- assert 'MainThread' in self.client1.lines[1]
-
- # 'thread apply' without thread ID should return an error message
- self.client1.onecmd('thread apply')
- line = self.client1.lines[2]
- errmsg = '*** Please specify a Thread ID\n'
- self.assertEquals(errmsg, line)
-
- # Need a command to actually apply to a thread
- self.client1.onecmd('thread apply 49843')
- line = self.client1.lines[3]
- errmsg = '*** Please specify a command following the thread ID\n'
- self.assertEquals(errmsg, line)
-
- # We've still not started any threads
- self.client1.onecmd('thread apply 2 info break')
- line = self.client1.lines[4]
- errmsg = '*** Thread ID 2 not known.\n'
- self.assertEquals(errmsg, line)
+ while 'Failed' in self.client1.lines[0]:
+ self.client1.lines = []
+ self.client1.do_target('tcp localhost:8000')
+
+ server.target = 'remote'
+ self.client1.onecmd('restart')
+ self.client1.connection.write('rquit\n')
- self.client1.onecmd('thread')
- line = self.client1.lines[5]
- msg = 'Current thread is 1 (<_MainThread(MainThread, started)>)\n'
- self.assertEquals(msg, line)
+ while server.connection != None:
+ time.sleep(0.1)
def test_main():
test_support.run_unittest(TestRemoteDebugging)
Added: sandbox/trunk/pdb/test/test_mthread.py
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/test/test_mthread.py Mon Jul 3 01:14:30 2006
@@ -0,0 +1,24 @@
+#!/usr/bin/env python
+
+# Unit tests for the thread debugging code.
+
+import sys
+import unittest
+from test import test_support
+
+sys.path.append('..')
+import mthread
+
+class TestThreadDebugging(unittest.TestCase):
+ def testMthreadInit(self):
+ """ Test the init method of the mthread file. """
+ m = sys.stdout.write
+ e = sys.stderr.write
+ mthread.init(m, e)
+
+def test_main():
+ test_support.run_unittest(TestThreadDebugging)
+
+if __name__ == '__main__':
+ test_main()
+
From python-checkins at python.org Mon Jul 3 09:58:09 2006
From: python-checkins at python.org (thomas.heller)
Date: Mon, 3 Jul 2006 09:58:09 +0200 (CEST)
Subject: [Python-checkins] r47203 -
python/trunk/Modules/_ctypes/libffi_msvc/ffi.c
Message-ID: <20060703075809.E06601E4008@bag.python.org>
Author: thomas.heller
Date: Mon Jul 3 09:58:09 2006
New Revision: 47203
Modified:
python/trunk/Modules/_ctypes/libffi_msvc/ffi.c
Log:
Cleanup: Remove commented out code.
Modified: python/trunk/Modules/_ctypes/libffi_msvc/ffi.c
==============================================================================
--- python/trunk/Modules/_ctypes/libffi_msvc/ffi.c (original)
+++ python/trunk/Modules/_ctypes/libffi_msvc/ffi.c Mon Jul 3 09:58:09 2006
@@ -227,11 +227,7 @@
void **arg_area;
unsigned short rtype;
void *resp = (void*)&res;
-//#ifdef _MSC_VER
void *args = &argp[1];
-//#else
-// void *args = __builtin_dwarf_cfa ();
-//#endif
cif = closure->cif;
arg_area = (void**) alloca (cif->nargs * sizeof (void*));
@@ -353,10 +349,6 @@
/* How to make a trampoline. Derived from gcc/config/i386/i386.c. */
-/* MOV EDX, ESP is 0x8b 0xd4 */
-
-//#ifdef _MSC_VER
-
#define FFI_INIT_TRAMPOLINE(TRAMP,FUN,CTX,BYTES) \
{ unsigned char *__tramp = (unsigned char*)(TRAMP); \
unsigned int __fun = (unsigned int)(FUN); \
@@ -365,26 +357,13 @@
*(unsigned char*) &__tramp[0] = 0xb9; \
*(unsigned int*) &__tramp[1] = __ctx; /* mov ecx, __ctx */ \
*(unsigned char*) &__tramp[5] = 0x8b; \
- *(unsigned char*) &__tramp[6] = 0xd4; \
+ *(unsigned char*) &__tramp[6] = 0xd4; /* mov edx, esp */ \
*(unsigned char*) &__tramp[7] = 0xe8; \
*(unsigned int*) &__tramp[8] = __dis; /* call __fun */ \
*(unsigned char*) &__tramp[12] = 0xC2; /* ret BYTES */ \
*(unsigned short*) &__tramp[13] = BYTES; \
}
-//#else
-//#define FFI_INIT_TRAMPOLINE(TRAMP,FUN,CTX,BYTES) \
-//({ unsigned char *__tramp = (unsigned char*)(TRAMP); \
-// unsigned int __fun = (unsigned int)(FUN); \
-// unsigned int __ctx = (unsigned int)(CTX); \
-// unsigned int __dis = __fun - ((unsigned int) __tramp + FFI_TRAMPOLINE_SIZE); \
-// *(unsigned char*) &__tramp[0] = 0xb8; \
-// *(unsigned int*) &__tramp[1] = __ctx; /* movl __ctx, %eax */ \
-// *(unsigned char *) &__tramp[5] = 0xe9; \
-// *(unsigned int*) &__tramp[6] = __dis; /* jmp __fun */ \
-// })
-//#endif
-
/* the cif must already be prep'ed */
ffi_status
From python-checkins at python.org Mon Jul 3 09:59:50 2006
From: python-checkins at python.org (thomas.heller)
Date: Mon, 3 Jul 2006 09:59:50 +0200 (CEST)
Subject: [Python-checkins] r47204 -
python/trunk/Lib/ctypes/test/test_objects.py
Message-ID: <20060703075950.57EEF1E4008@bag.python.org>
Author: thomas.heller
Date: Mon Jul 3 09:59:50 2006
New Revision: 47204
Modified:
python/trunk/Lib/ctypes/test/test_objects.py
Log:
Don't run the doctests with Python 2.3 because it doesn't have the ELLIPSIS flag.
Modified: python/trunk/Lib/ctypes/test/test_objects.py
==============================================================================
--- python/trunk/Lib/ctypes/test/test_objects.py (original)
+++ python/trunk/Lib/ctypes/test/test_objects.py Mon Jul 3 09:59:50 2006
@@ -54,13 +54,17 @@
'''
-import unittest, doctest
+import unittest, doctest, sys
import ctypes.test.test_objects
class TestCase(unittest.TestCase):
- def test(self):
- doctest.testmod(ctypes.test.test_objects)
+ if sys.hexversion > 0x02040000:
+ # Python 2.3 has no ELLIPSIS flag, so we don't test with this
+ # version:
+ def test(self):
+ doctest.testmod(ctypes.test.test_objects)
if __name__ == '__main__':
- doctest.testmod(ctypes.test.test_objects)
+ if sys.hexversion > 0x02040000:
+ doctest.testmod(ctypes.test.test_objects)
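[Editorial sketch, not part of the patch: how the sys.hexversion gate above works.]

    import sys
    # sys.hexversion packs major/minor/micro/releaselevel/serial into one int;
    # 0x02040000 sorts below every 2.4 release and above every 2.3 release, so
    # the check skips the ELLIPSIS-dependent doctests only on Python <= 2.3.
    print hex(sys.hexversion)
    if sys.hexversion > 0x02040000:
        print "doctests that rely on doctest.ELLIPSIS can run here"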
From python-checkins at python.org Mon Jul 3 10:04:05 2006
From: python-checkins at python.org (thomas.heller)
Date: Mon, 3 Jul 2006 10:04:05 +0200 (CEST)
Subject: [Python-checkins] r47205 -
python/trunk/Modules/_ctypes/libffi_msvc/fficonfig.h
python/trunk/Modules/_ctypes/libffi_msvc/ffitarget.h
Message-ID: <20060703080405.A0E3D1E4008@bag.python.org>
Author: thomas.heller
Date: Mon Jul 3 10:04:05 2006
New Revision: 47205
Modified:
python/trunk/Modules/_ctypes/libffi_msvc/fficonfig.h
python/trunk/Modules/_ctypes/libffi_msvc/ffitarget.h
Log:
Fixes so that _ctypes can be compiled with the MingW compiler.
It seems that the definition of '__attribute__(x)' was responsible for
the compiler ignoring the '__fastcall' attribute on the
ffi_closure_SYSV function in libffi_msvc/ffi.c; it took me quite some
time to figure this out.
Modified: python/trunk/Modules/_ctypes/libffi_msvc/fficonfig.h
==============================================================================
--- python/trunk/Modules/_ctypes/libffi_msvc/fficonfig.h (original)
+++ python/trunk/Modules/_ctypes/libffi_msvc/fficonfig.h Mon Jul 3 10:04:05 2006
@@ -3,8 +3,10 @@
/* fficonfig.h. Generated automatically by configure. */
/* fficonfig.h.in. Generated automatically from configure.in by autoheader. */
-/* Defines for MSVC */
+/* Define this for MSVC, but not for mingw32! */
+#ifdef _MSC_VER
#define __attribute__(x) /* */
+#endif
#define alloca _alloca
/*----------------------------------------------------------------*/
Modified: python/trunk/Modules/_ctypes/libffi_msvc/ffitarget.h
==============================================================================
--- python/trunk/Modules/_ctypes/libffi_msvc/ffitarget.h (original)
+++ python/trunk/Modules/_ctypes/libffi_msvc/ffitarget.h Mon Jul 3 10:04:05 2006
@@ -71,11 +71,7 @@
#define FFI_TRAMPOLINE_SIZE 24
#define FFI_NATIVE_RAW_API 0
#else
-#ifdef _MSC_VER
-# define FFI_TRAMPOLINE_SIZE 15
-#else
-# define FFI_TRAMPOLINE_SIZE 10
-#endif
+#define FFI_TRAMPOLINE_SIZE 15
#define FFI_NATIVE_RAW_API 1 /* x86 has native raw api support */
#endif
From python-checkins at python.org Mon Jul 3 10:08:15 2006
From: python-checkins at python.org (thomas.heller)
Date: Mon, 3 Jul 2006 10:08:15 +0200 (CEST)
Subject: [Python-checkins] r47206 - in python/trunk:
Lib/ctypes/test/test_win32.py Modules/_ctypes/callproc.c
Message-ID: <20060703080815.22FA41E4008@bag.python.org>
Author: thomas.heller
Date: Mon Jul 3 10:08:14 2006
New Revision: 47206
Modified:
python/trunk/Lib/ctypes/test/test_win32.py
python/trunk/Modules/_ctypes/callproc.c
Log:
Add a new function uses_seh() to the _ctypes extension module. This
will return True if Windows Structured Exception handling (SEH) is
used when calling functions, False otherwise.
Currently, only MSVC supports SEH.
Fix the test so that it doesn't crash when run with MingW-compiled
_ctypes.  Note that two tests still fail when MingW is used; I suspect
structure layout and function calling convention differences between
MSVC and MingW are the cause.
Modified: python/trunk/Lib/ctypes/test/test_win32.py
==============================================================================
--- python/trunk/Lib/ctypes/test/test_win32.py (original)
+++ python/trunk/Lib/ctypes/test/test_win32.py Mon Jul 3 10:08:14 2006
@@ -30,15 +30,11 @@
# or wrong calling convention
self.assertRaises(ValueError, IsWindow, None)
- def test_SEH(self):
- # Call functions with invalid arguments, and make sure that access violations
- # are trapped and raise an exception.
- #
- # Normally, in a debug build of the _ctypes extension
- # module, exceptions are not trapped, so we can only run
- # this test in a release build.
- import sys
- if not hasattr(sys, "getobjects"):
+ import _ctypes
+ if _ctypes.uses_seh():
+ def test_SEH(self):
+ # Call functions with invalid arguments, and make sure that access violations
+ # are trapped and raise an exception.
self.assertRaises(WindowsError, windll.kernel32.GetModuleHandleA, 32)
class Structures(unittest.TestCase):
Modified: python/trunk/Modules/_ctypes/callproc.c
==============================================================================
--- python/trunk/Modules/_ctypes/callproc.c (original)
+++ python/trunk/Modules/_ctypes/callproc.c Mon Jul 3 10:08:14 2006
@@ -1526,7 +1526,21 @@
return Py_None;
}
+static PyObject *
+uses_seh(PyObject *self, PyObject *args)
+{
+#if defined(DONT_USE_SEH) || !defined(MS_WIN32)
+ Py_INCREF(Py_False);
+ return Py_False;
+#else
+ Py_INCREF(Py_True);
+ return Py_True;
+#endif
+}
+
PyMethodDef module_methods[] = {
+ {"uses_seh", uses_seh, METH_NOARGS,
+ "Return whether ctypes uses Windows structured exception handling"},
{"resize", resize, METH_VARARGS, "Resize the memory buffer of a ctypes instance"},
#ifdef CTYPES_UNICODE
{"set_conversion_mode", set_conversion_mode, METH_VARARGS, set_conversion_mode_doc},
From python-checkins at python.org Mon Jul 3 10:23:21 2006
From: python-checkins at python.org (tim.peters)
Date: Mon, 3 Jul 2006 10:23:21 +0200 (CEST)
Subject: [Python-checkins] r47207 - python/trunk/Lib/test/test_os.py
Message-ID: <20060703082321.B53411E4011@bag.python.org>
Author: tim.peters
Date: Mon Jul 3 10:23:19 2006
New Revision: 47207
Modified:
python/trunk/Lib/test/test_os.py
Log:
Whitespace normalization.
Modified: python/trunk/Lib/test/test_os.py
==============================================================================
--- python/trunk/Lib/test/test_os.py (original)
+++ python/trunk/Lib/test/test_os.py Mon Jul 3 10:23:19 2006
@@ -22,7 +22,7 @@
f = os.open(test_support.TESTFN, os.O_CREAT|os.O_RDWR)
os.close(f)
self.assert_(os.access(test_support.TESTFN, os.W_OK))
-
+
class TemporaryFileTests(unittest.TestCase):
def setUp(self):
From buildbot at python.org Mon Jul 3 11:02:59 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 03 Jul 2006 09:02:59 +0000
Subject: [Python-checkins] buildbot failure in x86 XP trunk
Message-ID: <20060703090259.814C41E4008@bag.python.org>
The Buildbot has detected a new failure of x86 XP trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520XP%2520trunk/builds/1139
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: tim.peters
BUILD FAILED: failed failed slave lost
sincerely,
-The Buildbot
From python-checkins at python.org Mon Jul 3 11:44:01 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Mon, 3 Jul 2006 11:44:01 +0200 (CEST)
Subject: [Python-checkins] r47208 - in python/trunk: Lib/lib-tk/turtle.py
Misc/NEWS
Message-ID: <20060703094401.BB0E01E4025@bag.python.org>
Author: martin.v.loewis
Date: Mon Jul 3 11:44:00 2006
New Revision: 47208
Modified:
python/trunk/Lib/lib-tk/turtle.py
python/trunk/Misc/NEWS
Log:
Only setup canvas when it is first created.
Fixes #1514703
Modified: python/trunk/Lib/lib-tk/turtle.py
==============================================================================
--- python/trunk/Lib/lib-tk/turtle.py (original)
+++ python/trunk/Lib/lib-tk/turtle.py Mon Jul 3 11:44:00 2006
@@ -679,7 +679,7 @@
_canvas = Tkinter.Canvas(_root, background="white")
_canvas.pack(expand=1, fill="both")
- setup(width=_width, height= _height, startx=_startx, starty=_starty)
+ setup(width=_width, height= _height, startx=_startx, starty=_starty)
RawPen.__init__(self, _canvas)
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 3 11:44:00 2006
@@ -19,6 +19,9 @@
Library
-------
+- Bug #1514703: Only setup canvas window in turtle when the canvas
+ is created.
+
- Bug #1513223: .close() of a _socketobj now releases the underlying
socket again, which then gets closed as it becomes unreferenced.
From python-checkins at python.org Mon Jul 3 12:05:30 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Mon, 3 Jul 2006 12:05:30 +0200 (CEST)
Subject: [Python-checkins] r47209 - in python/trunk: Lib/lib-tk/turtle.py
Misc/NEWS
Message-ID: <20060703100530.AF9901E4016@bag.python.org>
Author: martin.v.loewis
Date: Mon Jul 3 12:05:30 2006
New Revision: 47209
Modified:
python/trunk/Lib/lib-tk/turtle.py
python/trunk/Misc/NEWS
Log:
Reimplement turtle.circle using a polyline, to allow correct
filling of arcs. Also fixes #1514693.
Modified: python/trunk/Lib/lib-tk/turtle.py
==============================================================================
--- python/trunk/Lib/lib-tk/turtle.py (original)
+++ python/trunk/Lib/lib-tk/turtle.py Mon Jul 3 12:05:30 2006
@@ -344,7 +344,7 @@
"""
self.fill(0)
- def circle(self, radius, extent=None):
+ def circle(self, radius, extent = None):
""" Draw a circle with given radius.
The center is radius units left of the turtle; extent
determines which part of the circle is drawn. If not given,
@@ -360,53 +360,19 @@
>>> turtle.circle(120, 180) # half a circle
"""
if extent is None:
- extent = self._fullcircle
- x0, y0 = self._position
- xc = x0 - radius * sin(self._angle * self._invradian)
- yc = y0 - radius * cos(self._angle * self._invradian)
- if radius >= 0.0:
- start = self._angle - (self._fullcircle / 4.0)
- else:
- start = self._angle + (self._fullcircle / 4.0)
- extent = -extent
- if self._filling:
- if abs(extent) >= self._fullcircle:
- item = self._canvas.create_oval(xc-radius, yc-radius,
- xc+radius, yc+radius,
- width=self._width,
- outline="")
- self._tofill.append(item)
- item = self._canvas.create_arc(xc-radius, yc-radius,
- xc+radius, yc+radius,
- style="chord",
- start=start,
- extent=extent,
- width=self._width,
- outline="")
- self._tofill.append(item)
- if self._drawing:
- if abs(extent) >= self._fullcircle:
- item = self._canvas.create_oval(xc-radius, yc-radius,
- xc+radius, yc+radius,
- width=self._width,
- outline=self._color)
- self._items.append(item)
- item = self._canvas.create_arc(xc-radius, yc-radius,
- xc+radius, yc+radius,
- style="arc",
- start=start,
- extent=extent,
- width=self._width,
- outline=self._color)
- self._items.append(item)
- angle = start + extent
- x1 = xc + abs(radius) * cos(angle * self._invradian)
- y1 = yc - abs(radius) * sin(angle * self._invradian)
- self._angle = (self._angle + extent) % self._fullcircle
- self._position = x1, y1
- if self._filling:
- self._path.append(self._position)
- self._draw_turtle()
+ extent = self._fullcircle
+ frac = abs(extent)/self._fullcircle
+ steps = 1+int(min(11+abs(radius)/6.0, 59.0)*frac)
+ w = 1.0 * extent / steps
+ w2 = 0.5 * w
+ l = 2.0 * radius * sin(w2*self._invradian)
+ if radius < 0:
+ l, w, w2 = -l, -w, -w2
+ self.left(w2)
+ for i in range(steps):
+ self.forward(l)
+ self.left(w)
+ self.right(w2)
def heading(self):
""" Return the turtle's current heading.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 3 12:05:30 2006
@@ -19,6 +19,9 @@
Library
-------
+- Reimplement turtle.circle using a polyline, to allow correct
+ filling of arcs. Also fixes #1514693.
+
- Bug #1514703: Only setup canvas window in turtle when the canvas
is created.
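[Editorial worked sketch of the polyline parameters computed by the new circle(); the numbers are illustrative, not from the patch.]

    from math import sin, pi
    radius, extent, fullcircle = 120, 180, 360.0   # half circle, as in the docstring
    frac = abs(extent) / fullcircle
    steps = 1 + int(min(11 + abs(radius) / 6.0, 59.0) * frac)
    w = 1.0 * extent / steps                  # degrees turned per segment
    w2 = 0.5 * w
    invradian = pi / (fullcircle * 0.5)
    l = 2.0 * radius * sin(w2 * invradian)    # chord length per segment
    print steps, w, l                         # 16 segments, 11.25 deg and ~23.5 units each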
From python-checkins at python.org Mon Jul 3 12:19:50 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Mon, 3 Jul 2006 12:19:50 +0200 (CEST)
Subject: [Python-checkins] r47210 - in python/trunk: Lib/lib-tk/turtle.py
Misc/NEWS
Message-ID: <20060703101950.66F1D1E4008@bag.python.org>
Author: martin.v.loewis
Date: Mon Jul 3 12:19:49 2006
New Revision: 47210
Modified:
python/trunk/Lib/lib-tk/turtle.py
python/trunk/Misc/NEWS
Log:
Bug #1514693: Update turtle's heading when switching between
degrees and radians.
Modified: python/trunk/Lib/lib-tk/turtle.py
==============================================================================
--- python/trunk/Lib/lib-tk/turtle.py (original)
+++ python/trunk/Lib/lib-tk/turtle.py Mon Jul 3 12:19:49 2006
@@ -30,6 +30,7 @@
self._tracing = 1
self._arrow = 0
self._delay = 10 # default delay for drawing
+ self._angle = 0.0
self.degrees()
self.reset()
@@ -39,6 +40,10 @@
Example:
>>> turtle.degrees()
"""
+ # Don't try to change _angle if it is 0, because
+ # _fullcircle might not be set, yet
+ if self._angle:
+ self._angle = (self._angle / self._fullcircle) * fullcircle
self._fullcircle = fullcircle
self._invradian = pi / (fullcircle * 0.5)
@@ -365,7 +370,7 @@
steps = 1+int(min(11+abs(radius)/6.0, 59.0)*frac)
w = 1.0 * extent / steps
w2 = 0.5 * w
- l = 2.0 * radius * sin(w2*self._invradian)
+ l = 2.0 * radius * sin(w2*self._invradian)
if radius < 0:
l, w, w2 = -l, -w, -w2
self.left(w2)
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 3 12:19:49 2006
@@ -19,8 +19,11 @@
Library
-------
+- Bug #1514693: Update turtle's heading when switching between
+ degrees and radians.
+
- Reimplement turtle.circle using a polyline, to allow correct
- filling of arcs. Also fixes #1514693.
+ filling of arcs.
- Bug #1514703: Only setup canvas window in turtle when the canvas
is created.
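[Editorial sketch of the heading rescaling that degrees()/radians() now performs; values are illustrative.]

    from math import pi
    fullcircle_old, fullcircle_new = 360.0, 2 * pi   # switching from degrees to radians
    angle = 90.0                                     # current heading in degrees
    angle = (angle / fullcircle_old) * fullcircle_new
    print angle                                      # ~1.5708, still a quarter turn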
From buildbot at python.org Mon Jul 3 13:08:22 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 03 Jul 2006 11:08:22 +0000
Subject: [Python-checkins] buildbot warnings in alpha Debian trunk
Message-ID: <20060703110822.E71411E4008@bag.python.org>
The Buildbot has detected a new failure of alpha Debian trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/alpha%2520Debian%2520trunk/builds/442
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: thomas.heller
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Mon Jul 3 13:12:07 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Mon, 3 Jul 2006 13:12:07 +0200 (CEST)
Subject: [Python-checkins] r47211 - python/trunk/Doc/lib/libturtle.tex
Message-ID: <20060703111207.77E201E4008@bag.python.org>
Author: martin.v.loewis
Date: Mon Jul 3 13:12:06 2006
New Revision: 47211
Modified:
python/trunk/Doc/lib/libturtle.tex
Log:
Document functions added in 2.3 and 2.5.
Modified: python/trunk/Doc/lib/libturtle.tex
==============================================================================
--- python/trunk/Doc/lib/libturtle.tex (original)
+++ python/trunk/Doc/lib/libturtle.tex Mon Jul 3 13:12:06 2006
@@ -42,6 +42,19 @@
line.
\end{funcdesc}
+\begin{funcdesc}{speed}{speed}
+Set the speed of the turtle. Valid values for the parameter
+\var{speed} are \code{'fastest'} (no delay), \code{'fast'}
+(delay 5ms), \code{'normal'} (delay 10ms), \code{'slow'}
+(delay 15ms), and \code{'slowest'} (delay 20ms).
+\versionadded{2.5}
+\end{funcdesc}
+
+\begin{funcdesc}{delay}{delay}
+Set the speed of the turtle to \var{delay}, which is given
+in ms. \versionadded{2.5}
+\end{funcdesc}
+
\begin{funcdesc}{forward}{distance}
Go forward \var{distance} steps.
\end{funcdesc}
@@ -94,6 +107,16 @@
and call \code{fill(0)} when you finish to draw the path.
\end{funcdesc}
+\begin{funcdesc}{begin\_fill}{}
+Switch turtle into filling mode; equivalent to \code{fill(1)}.
+\versionadded{2.5}
+\end{funcdesc}
+
+\begin{funcdesc}{end\_fill}{}
+End filling mode, and fill the shape; equivalent to \code{fill(0)}.
+\versionadded{2.5}
+\end{funcdesc}
+
\begin{funcdesc}{circle}{radius\optional{, extent}}
Draw a circle with radius \var{radius} whose center-point is
\var{radius} units left of the turtle.
@@ -113,6 +136,49 @@
specified either as two separate arguments or as a 2-tuple.
\end{funcdesc}
+\begin{funcdesc}{towards}{x, y}
+Return the angle of the line from the turtle's position
+to the point \var{x}, \var{y}. The co-ordinates may be
+specified either as two separate arguments, as a 2-tuple,
+or as another pen object.
+\versionadded{2.5}
+\end{funcdesc}
+
+\begin{funcdesc}{heading}{}
+Return the current orientation of the turtle.
+\versionadded{2.3}
+\end{funcdesc}
+
+\begin{funcdesc}{setheading}{angle}
+Set the orientation of the turtle to \var{angle}.
+\versionadded{2.3}
+\end{funcdesc}
+
+\begin{funcdesc}{position}{}
+Return the current location of the turtle as an \code{(x,y)} pair.
+\versionadded{2.3}
+\end{funcdesc}
+
+\begin{funcdesc}{setx}{x}
+Set the x coordinate of the turtle to \var{x}.
+\versionadded{2.3}
+\end{funcdesc}
+
+\begin{funcdesc}{sety}{y}
+Set the y coordinate of the turtle to \var{y}.
+\versionadded{2.3}
+\end{funcdesc}
+
+\begin{funcdesc}{window\_width}{}
+Return the width of the canvas window.
+\versionadded{2.3}
+\end{funcdesc}
+
+\begin{funcdesc}{window\_height}{}
+Return the height of the canvas window.
+\versionadded{2.3}
+\end{funcdesc}
+
This module also does \code{from math import *}, so see the
documentation for the \refmodule{math} module for additional constants
and functions useful for turtle graphics.
@@ -134,16 +200,21 @@
pen. The constructor automatically creates a canvas to be drawn on.
\end{classdesc}
+\begin{classdesc}{Turtle}{}
+Define a pen. This is essentially a synonym for \code{Pen()};
+\class{Turtle} is an empty subclass of \class{Pen}.
+\end{classdesc}
+
\begin{classdesc}{RawPen}{canvas}
Define a pen which draws on a canvas \var{canvas}. This is useful if
you want to use the module to create graphics in a ``real'' program.
\end{classdesc}
-\subsection{Pen and RawPen Objects \label{pen-rawpen-objects}}
+\subsection{Turtle, Pen and RawPen Objects \label{pen-rawpen-objects}}
-\class{Pen} and \class{RawPen} objects have all the global functions
-described above, except for \function{demo()} as methods, which
-manipulate the given pen.
+\class{Turtle}, \class{Pen} and \class{RawPen} objects have all the
+global functions described above, except for \function{demo()} as
+methods, which manipulate the given pen.
The only method which is more powerful as a method is
\function{degrees()}.
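[Editorial usage sketch of the 2.5 helpers documented above; opens a Tk window, purely illustrative.]

    import turtle
    turtle.speed('fast')      # one of 'fastest', 'fast', 'normal', 'slow', 'slowest'
    turtle.begin_fill()       # equivalent to fill(1)
    turtle.circle(50)
    turtle.end_fill()         # equivalent to fill(0): fills the circle just drawn
    print turtle.heading(), turtle.position(), turtle.window_width()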
From python-checkins at python.org Mon Jul 3 14:19:51 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Mon, 3 Jul 2006 14:19:51 +0200 (CEST)
Subject: [Python-checkins] r47212 - in python/trunk:
Lib/test/test__locale.py Lib/test/test_builtin.py Misc/NEWS
Python/pystrtod.c
Message-ID: <20060703121951.312941E4008@bag.python.org>
Author: martin.v.loewis
Date: Mon Jul 3 14:19:50 2006
New Revision: 47212
Modified:
python/trunk/Lib/test/test__locale.py
python/trunk/Lib/test/test_builtin.py
python/trunk/Misc/NEWS
python/trunk/Python/pystrtod.c
Log:
Bug #1417699: Reject locale-specific decimal point in float()
and atof().
Modified: python/trunk/Lib/test/test__locale.py
==============================================================================
--- python/trunk/Lib/test/test__locale.py (original)
+++ python/trunk/Lib/test/test__locale.py Mon Jul 3 14:19:50 2006
@@ -113,6 +113,9 @@
"using eval('3.14') failed for %s" % loc)
self.assertEquals(int(float('3.14') * 100), 314,
"using float('3.14') failed for %s" % loc)
+ if localeconv()['decimal_point'] != '.':
+ self.assertRaises(ValueError, float,
+ localeconv()['decimal_point'].join(['1', '23']))
def test_main():
run_unittest(_LocaleTests)
Modified: python/trunk/Lib/test/test_builtin.py
==============================================================================
--- python/trunk/Lib/test/test_builtin.py (original)
+++ python/trunk/Lib/test/test_builtin.py Mon Jul 3 14:19:50 2006
@@ -558,13 +558,24 @@
@run_with_locale('LC_NUMERIC', 'fr_FR', 'de_DE')
def test_float_with_comma(self):
# set locale to something that doesn't use '.' for the decimal point
+ # float must not accept the locale specific decimal point but
+        # it still has to accept the normal Python syntax
import locale
if not locale.localeconv()['decimal_point'] == ',':
return
- self.assertEqual(float(" 3,14 "), 3.14)
- self.assertEqual(float(" +3,14 "), 3.14)
- self.assertEqual(float(" -3,14 "), -3.14)
+ self.assertEqual(float(" 3.14 "), 3.14)
+ self.assertEqual(float("+3.14 "), 3.14)
+ self.assertEqual(float("-3.14 "), -3.14)
+ self.assertEqual(float(".14 "), .14)
+ self.assertEqual(float("3. "), 3.0)
+ self.assertEqual(float("3.e3 "), 3000.0)
+ self.assertEqual(float("3.2e3 "), 3200.0)
+ self.assertEqual(float("2.5e-1 "), 0.25)
+ self.assertEqual(float("5e-1"), 0.5)
+ self.assertRaises(ValueError, float, " 3,14 ")
+ self.assertRaises(ValueError, float, " +3,14 ")
+ self.assertRaises(ValueError, float, " -3,14 ")
self.assertRaises(ValueError, float, " 0x3.1 ")
self.assertRaises(ValueError, float, " -0x3.p-1 ")
self.assertEqual(float(" 25.e-1 "), 2.5)
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 3 14:19:50 2006
@@ -12,6 +12,9 @@
Core and builtins
-----------------
+- Bug #1417699: Reject locale-specific decimal point in float()
+ and atof().
+
- Bug #1511381: codec_getstreamcodec() in codec.c is corrected to
omit a default "error" argument for NULL pointer. This allows
the parser to take a codec from cjkcodecs again.
Modified: python/trunk/Python/pystrtod.c
==============================================================================
--- python/trunk/Python/pystrtod.c (original)
+++ python/trunk/Python/pystrtod.c Mon Jul 3 14:19:50 2006
@@ -90,6 +90,13 @@
p++;
end = p;
}
+ else if (strncmp(p, decimal_point, decimal_point_len) == 0)
+ {
+ /* Python bug #1417699 */
+ *endptr = (char*)nptr;
+ errno = EINVAL;
+ return val;
+ }
/* For the other cases, we need not convert the decimal point */
}
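[Editorial sketch of the new behaviour under a comma-decimal locale; the locale name is an assumption and may not exist on every system.]

    import locale
    locale.setlocale(locale.LC_NUMERIC, 'de_DE')   # a locale whose decimal point is ','
    print float(" 3.14 ")     # 3.14 -- normal Python syntax is still accepted
    try:
        float(" 3,14 ")       # the locale-specific decimal point is now rejected
    except ValueError:
        print "ValueError, as the patched pystrtod.c intends"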
From python-checkins at python.org Mon Jul 3 14:29:00 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Mon, 3 Jul 2006 14:29:00 +0200 (CEST)
Subject: [Python-checkins] r47213 - in python/trunk:
Lib/distutils/command/bdist_rpm.py Misc/NEWS
Message-ID: <20060703122900.2BE7B1E4008@bag.python.org>
Author: martin.v.loewis
Date: Mon Jul 3 14:28:58 2006
New Revision: 47213
Modified:
python/trunk/Lib/distutils/command/bdist_rpm.py
python/trunk/Misc/NEWS
Log:
Bug #1267547: Put proper recursive setup.py call into the
spec file generated by bdist_rpm.
Modified: python/trunk/Lib/distutils/command/bdist_rpm.py
==============================================================================
--- python/trunk/Lib/distutils/command/bdist_rpm.py (original)
+++ python/trunk/Lib/distutils/command/bdist_rpm.py Mon Jul 3 14:28:58 2006
@@ -467,7 +467,8 @@
# rpm scripts
# figure out default build script
- def_build = "%s setup.py build" % self.python
+ def_setup_call = "%s %s" % (self.python,os.path.basename(sys.argv[0]))
+ def_build = "%s build" % def_setup_call
if self.use_rpm_opt_flags:
def_build = 'env CFLAGS="$RPM_OPT_FLAGS" ' + def_build
@@ -481,9 +482,9 @@
('prep', 'prep_script', "%setup"),
('build', 'build_script', def_build),
('install', 'install_script',
- ("%s setup.py install "
+ ("%s install "
"--root=$RPM_BUILD_ROOT "
- "--record=INSTALLED_FILES") % self.python),
+ "--record=INSTALLED_FILES") % def_setup_call),
('clean', 'clean_script', "rm -rf $RPM_BUILD_ROOT"),
('verifyscript', 'verify_script', None),
('pre', 'pre_install', None),
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 3 14:28:58 2006
@@ -22,6 +22,9 @@
Library
-------
+- Bug #1267547: Put proper recursive setup.py call into the
+ spec file generated by bdist_rpm.
+
- Bug #1514693: Update turtle's heading when switching between
degrees and radians.
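[Editorial sketch of how the recursive call is now built from the invoking script name rather than a hard-coded "setup.py"; names are illustrative.]

    import os, sys
    python = "python"                                     # self.python in bdist_rpm
    def_setup_call = "%s %s" % (python, os.path.basename(sys.argv[0]))
    def_build = "%s build" % def_setup_call
    def_install = ("%s install --root=$RPM_BUILD_ROOT --record=INSTALLED_FILES"
                   % def_setup_call)
    print def_build       # e.g. "python setup.py build" when invoked as setup.py
    print def_install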
From python-checkins at python.org Mon Jul 3 14:29:54 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Mon, 3 Jul 2006 14:29:54 +0200 (CEST)
Subject: [Python-checkins] r47214 - in python/branches/release24-maint:
Lib/distutils/command/bdist_rpm.py Misc/NEWS
Message-ID: <20060703122954.7911A1E4008@bag.python.org>
Author: martin.v.loewis
Date: Mon Jul 3 14:29:51 2006
New Revision: 47214
Modified:
python/branches/release24-maint/Lib/distutils/command/bdist_rpm.py
python/branches/release24-maint/Misc/NEWS
Log:
Bug #1267547: Put proper recursive setup.py call into the
spec file generated by bdist_rpm.
Modified: python/branches/release24-maint/Lib/distutils/command/bdist_rpm.py
==============================================================================
--- python/branches/release24-maint/Lib/distutils/command/bdist_rpm.py (original)
+++ python/branches/release24-maint/Lib/distutils/command/bdist_rpm.py Mon Jul 3 14:29:51 2006
@@ -454,7 +454,8 @@
# rpm scripts
# figure out default build script
- def_build = "%s setup.py build" % self.python
+ def_setup_call = "%s %s" % (self.python,os.path.basename(sys.argv[0]))
+ def_build = "%s build" % def_setup_call
if self.use_rpm_opt_flags:
def_build = 'env CFLAGS="$RPM_OPT_FLAGS" ' + def_build
@@ -468,9 +469,9 @@
('prep', 'prep_script', "%setup"),
('build', 'build_script', def_build),
('install', 'install_script',
- ("%s setup.py install "
+ ("%s install "
"--root=$RPM_BUILD_ROOT "
- "--record=INSTALLED_FILES") % self.python),
+ "--record=INSTALLED_FILES") % def_setup_call),
('clean', 'clean_script', "rm -rf $RPM_BUILD_ROOT"),
('verifyscript', 'verify_script', None),
('pre', 'pre_install', None),
Modified: python/branches/release24-maint/Misc/NEWS
==============================================================================
--- python/branches/release24-maint/Misc/NEWS (original)
+++ python/branches/release24-maint/Misc/NEWS Mon Jul 3 14:29:51 2006
@@ -56,6 +56,9 @@
Library
-------
+- Bug #1267547: Put proper recursive setup.py call into the
+ spec file generated by bdist_rpm.
+
- Bug #1504333: Make sgmllib support angle brackets in quoted attribute
values.
From buildbot at python.org Mon Jul 3 14:34:55 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 03 Jul 2006 12:34:55 +0000
Subject: [Python-checkins] buildbot warnings in PPC64 Debian trunk
Message-ID: <20060703123455.495B21E4008@bag.python.org>
The Buildbot has detected a new failure of PPC64 Debian trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/PPC64%2520Debian%2520trunk/builds/263
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: thomas.heller
Build Had Warnings: warnings failed slave lost
sincerely,
-The Buildbot
From python-checkins at python.org Mon Jul 3 15:01:36 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Mon, 3 Jul 2006 15:01:36 +0200 (CEST)
Subject: [Python-checkins] r47215 - in python/trunk: Lib/telnetlib.py
Misc/NEWS
Message-ID: <20060703130136.177F81E4008@bag.python.org>
Author: martin.v.loewis
Date: Mon Jul 3 15:01:35 2006
New Revision: 47215
Modified:
python/trunk/Lib/telnetlib.py
python/trunk/Misc/NEWS
Log:
Patch #825417: Fix timeout processing in expect,
read_until. Will backport to 2.4.
Modified: python/trunk/Lib/telnetlib.py
==============================================================================
--- python/trunk/Lib/telnetlib.py (original)
+++ python/trunk/Lib/telnetlib.py Mon Jul 3 15:01:35 2006
@@ -311,6 +311,8 @@
s_args = s_reply
if timeout is not None:
s_args = s_args + (timeout,)
+ from time import time
+ time_start = time()
while not self.eof and select.select(*s_args) == s_reply:
i = max(0, len(self.cookedq)-n)
self.fill_rawq()
@@ -321,6 +323,11 @@
buf = self.cookedq[:i]
self.cookedq = self.cookedq[i:]
return buf
+ if timeout is not None:
+ elapsed = time() - time_start
+ if elapsed >= timeout:
+ break
+ s_args = s_reply + (timeout-elapsed,)
return self.read_very_lazy()
def read_all(self):
@@ -601,6 +608,9 @@
if not hasattr(list[i], "search"):
if not re: import re
list[i] = re.compile(list[i])
+ if timeout is not None:
+ from time import time
+ time_start = time()
while 1:
self.process_rawq()
for i in indices:
@@ -613,7 +623,11 @@
if self.eof:
break
if timeout is not None:
- r, w, x = select.select([self.fileno()], [], [], timeout)
+ elapsed = time() - time_start
+ if elapsed >= timeout:
+ break
+ s_args = ([self.fileno()], [], [], timeout-elapsed)
+ r, w, x = select.select(*s_args)
if not r:
break
self.fill_rawq()
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 3 15:01:35 2006
@@ -22,6 +22,9 @@
Library
-------
+- Bug #822974: Honor timeout in telnetlib.{expect,read_until}
+ even if some data are received.
+
- Bug #1267547: Put proper recursive setup.py call into the
spec file generated by bdist_rpm.
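[Editorial sketch of the deadline bookkeeping the patch introduces, lifted out of telnetlib for illustration; the helper name is made up.]

    import select
    from time import time

    def wait_readable(sock, timeout):
        # Re-arm select() with only the *remaining* time, so repeated wakeups
        # (e.g. partial data) can no longer stretch the overall timeout.
        time_start = time()
        while True:
            elapsed = time() - time_start
            if elapsed >= timeout:
                return False              # honour the deadline even after data arrived
            r, w, x = select.select([sock], [], [], timeout - elapsed)
            if r:
                return True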
From python-checkins at python.org Mon Jul 3 15:02:59 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Mon, 3 Jul 2006 15:02:59 +0200 (CEST)
Subject: [Python-checkins] r47216 - in python/branches/release24-maint:
Lib/telnetlib.py Misc/NEWS
Message-ID: <20060703130259.2C0891E4008@bag.python.org>
Author: martin.v.loewis
Date: Mon Jul 3 15:02:58 2006
New Revision: 47216
Modified:
python/branches/release24-maint/Lib/telnetlib.py
python/branches/release24-maint/Misc/NEWS
Log:
Patch #825417: Fix timeout processing in expect,
read_until.
Modified: python/branches/release24-maint/Lib/telnetlib.py
==============================================================================
--- python/branches/release24-maint/Lib/telnetlib.py (original)
+++ python/branches/release24-maint/Lib/telnetlib.py Mon Jul 3 15:02:58 2006
@@ -311,6 +311,8 @@
s_args = s_reply
if timeout is not None:
s_args = s_args + (timeout,)
+ from time import time
+ time_start = time()
while not self.eof and select.select(*s_args) == s_reply:
i = max(0, len(self.cookedq)-n)
self.fill_rawq()
@@ -321,6 +323,11 @@
buf = self.cookedq[:i]
self.cookedq = self.cookedq[i:]
return buf
+ if timeout is not None:
+ elapsed = time() - time_start
+ if elapsed >= timeout:
+ break
+ s_args = s_reply + (timeout-elapsed,)
return self.read_very_lazy()
def read_all(self):
@@ -601,6 +608,9 @@
if not hasattr(list[i], "search"):
if not re: import re
list[i] = re.compile(list[i])
+ if timeout is not None:
+ from time import time
+ time_start = time()
while 1:
self.process_rawq()
for i in indices:
@@ -613,7 +623,11 @@
if self.eof:
break
if timeout is not None:
- r, w, x = select.select([self.fileno()], [], [], timeout)
+ elapsed = time() - time_start
+ if elapsed >= timeout:
+ break
+ s_args = ([self.fileno()], [], [], timeout-elapsed)
+ r, w, x = select.select(*s_args)
if not r:
break
self.fill_rawq()
Modified: python/branches/release24-maint/Misc/NEWS
==============================================================================
--- python/branches/release24-maint/Misc/NEWS (original)
+++ python/branches/release24-maint/Misc/NEWS Mon Jul 3 15:02:58 2006
@@ -56,6 +56,9 @@
Library
-------
+- Bug #822974: Honor timeout in telnetlib.{expect,read_until}
+ even if some data are received.
+
- Bug #1267547: Put proper recursive setup.py call into the
spec file generated by bdist_rpm.
From buildbot at python.org Mon Jul 3 15:03:44 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 03 Jul 2006 13:03:44 +0000
Subject: [Python-checkins] buildbot warnings in x86 OpenBSD 2.4
Message-ID: <20060703130344.CCDB31E4008@bag.python.org>
The Buildbot has detected a new failure of x86 OpenBSD 2.4.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520OpenBSD%25202.4/builds/130
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch branches/release24-maint] HEAD
Blamelist: fred.drake,martin.v.loewis
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From kristjan at ccpgames.com Mon Jul 3 15:24:57 2006
From: kristjan at ccpgames.com (=?iso-8859-1?Q?Kristj=E1n_V=2E_J=F3nsson?=)
Date: Mon, 3 Jul 2006 13:24:57 -0000
Subject: [Python-checkins] r46894 - in python/trunk:Modules/timemodule.c
Objects/exceptions.c Objects/fileobject.c
Message-ID: <129CEF95A523704B9D46959C922A280002FE9832@nemesis.central.ccp.cc>
Thomas, can you check if testing for the macro __STDC_SECURE_LIB__ in addition to testing the _MSC_VER resolves the problem? I see no versioning macros for the CRT that we can use for this.
Kristján
> -----Original Message-----
> From: Tim Peters [mailto:tim.peters at gmail.com]
> Sent: 30. júní 2006 18:38
> To: Kristján V. Jónsson
> Cc: python-checkins at python.org
> Subject: Re: [Python-checkins] r46894 - in python/trunk:
> Modules/timemodule.c Objects/exceptions.c Objects/fileobject.c
>
> Copying Kristján directly since he may not be subscribed to
> python-checkins.
>
> On 6/30/06, Thomas Heller wrote:
> > > +#if defined _MSC_VER && _MSC_VER >= 1400
> > > + /* reset CRT error handling */
> > > + _set_invalid_parameter_handler(prevCrtHandler);
> > > + _CrtSetReportMode(_CRT_ASSERT, prevCrtReportMode); #endif
> > > }
> > ...
> >
> > These changes to Objects/exceptions.c break the build for Windows
> > AMD64. Apparently the amd64 compiler from the Server 2003
> SP1 SDK has
> > _MSC_VER >= 1400, but does not know about this new error handling.
> >
> > The compiler identifies itself in this way:
> >
> > C:\Program Files\Microsoft Platform SDK>cl Microsoft (R) C/C++
> > Optimizing Compiler Version 14.00.40310.41 for AMD64 Copyright (C)
> > Microsoft Corporation. All rights reserved.
> >
> > usage: cl [ option... ] filename... [ /link linkoption... ]
> >
> > C:\Program Files\Microsoft Platform SDK>
> >
> >
> > Thomas
> >
> > _______________________________________________
> > Python-checkins mailing list
> > Python-checkins at python.org
> > http://mail.python.org/mailman/listinfo/python-checkins
> >
> _______________________________________________
> Python-checkins mailing list
> Python-checkins at python.org
> http://mail.python.org/mailman/listinfo/python-checkins
>
From python-checkins at python.org Mon Jul 3 15:47:31 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Mon, 3 Jul 2006 15:47:31 +0200 (CEST)
Subject: [Python-checkins] r47217 - in python/branches/release24-maint:
Lib/test/test_descr.py Misc/NEWS Objects/descrobject.c
Message-ID: <20060703134731.ABEB51E4004@bag.python.org>
Author: martin.v.loewis
Date: Mon Jul 3 15:47:29 2006
New Revision: 47217
Modified:
python/branches/release24-maint/Lib/test/test_descr.py
python/branches/release24-maint/Misc/NEWS
python/branches/release24-maint/Objects/descrobject.c
Log:
Put method-wrappers into trashcan. Fixes #927248.
Modified: python/branches/release24-maint/Lib/test/test_descr.py
==============================================================================
--- python/branches/release24-maint/Lib/test/test_descr.py (original)
+++ python/branches/release24-maint/Lib/test/test_descr.py Mon Jul 3 15:47:29 2006
@@ -3936,6 +3936,13 @@
o.whatever = Provoker(o)
del o
+def wrapper_segfault():
+ # SF 927248: deeply nested wrappers could cause stack overflow
+ f = lambda:None
+ for i in xrange(1000000):
+ f = f.__call__
+ f = None
+
# Fix SF #762455, segfault when sys.stdout is changed in getattr
def filefault():
if verbose:
@@ -4075,6 +4082,7 @@
def test_main():
weakref_segfault() # Must be first, somehow
+ wrapper_segfault()
do_this_first()
class_docstrings()
lists()
Modified: python/branches/release24-maint/Misc/NEWS
==============================================================================
--- python/branches/release24-maint/Misc/NEWS (original)
+++ python/branches/release24-maint/Misc/NEWS Mon Jul 3 15:47:29 2006
@@ -12,6 +12,9 @@
Core and builtins
-----------------
+- Bug #927248: Recursive method-wrapper objects can now safely
+ be released.
+
- Bug #992017: A classic class that defined a __coerce__() method that returned
its arguments swapped would infinitely recurse and segfault the interpreter.
Modified: python/branches/release24-maint/Objects/descrobject.c
==============================================================================
--- python/branches/release24-maint/Objects/descrobject.c (original)
+++ python/branches/release24-maint/Objects/descrobject.c Mon Jul 3 15:47:29 2006
@@ -904,10 +904,12 @@
static void
wrapper_dealloc(wrapperobject *wp)
{
- _PyObject_GC_UNTRACK(wp);
+ PyObject_GC_UnTrack(wp);
+ Py_TRASHCAN_SAFE_BEGIN(wp)
Py_XDECREF(wp->descr);
Py_XDECREF(wp->self);
PyObject_GC_Del(wp);
+ Py_TRASHCAN_SAFE_END(wp)
}
static PyMethodDef wrapper_methods[] = {
From python-checkins at python.org Mon Jul 3 15:47:41 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Mon, 3 Jul 2006 15:47:41 +0200 (CEST)
Subject: [Python-checkins] r47218 - in python/trunk: Lib/test/test_descr.py
Misc/NEWS Objects/descrobject.c
Message-ID: <20060703134741.BBA9F1E4004@bag.python.org>
Author: martin.v.loewis
Date: Mon Jul 3 15:47:40 2006
New Revision: 47218
Modified:
python/trunk/Lib/test/test_descr.py
python/trunk/Misc/NEWS
python/trunk/Objects/descrobject.c
Log:
Put method-wrappers into trashcan. Fixes #927248.
Modified: python/trunk/Lib/test/test_descr.py
==============================================================================
--- python/trunk/Lib/test/test_descr.py (original)
+++ python/trunk/Lib/test/test_descr.py Mon Jul 3 15:47:40 2006
@@ -3966,6 +3966,13 @@
o.whatever = Provoker(o)
del o
+def wrapper_segfault():
+ # SF 927248: deeply nested wrappers could cause stack overflow
+ f = lambda:None
+ for i in xrange(1000000):
+ f = f.__call__
+ f = None
+
# Fix SF #762455, segfault when sys.stdout is changed in getattr
def filefault():
if verbose:
@@ -4121,6 +4128,7 @@
def test_main():
weakref_segfault() # Must be first, somehow
+ wrapper_segfault()
do_this_first()
class_docstrings()
lists()
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 3 15:47:40 2006
@@ -12,6 +12,9 @@
Core and builtins
-----------------
+- Bug #927248: Recursive method-wrapper objects can now safely
+ be released.
+
- Bug #1417699: Reject locale-specific decimal point in float()
and atof().
Modified: python/trunk/Objects/descrobject.c
==============================================================================
--- python/trunk/Objects/descrobject.c (original)
+++ python/trunk/Objects/descrobject.c Mon Jul 3 15:47:40 2006
@@ -892,10 +892,12 @@
static void
wrapper_dealloc(wrapperobject *wp)
{
- _PyObject_GC_UNTRACK(wp);
+ PyObject_GC_UnTrack(wp);
+ Py_TRASHCAN_SAFE_BEGIN(wp)
Py_XDECREF(wp->descr);
Py_XDECREF(wp->self);
PyObject_GC_Del(wp);
+ Py_TRASHCAN_SAFE_END(wp)
}
static int
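[Editorial reproduction sketch of the bug being fixed; it is essentially the new wrapper_segfault() test above.]

    # SF 927248: build ~1,000,000 nested method-wrapper objects, then drop them.
    f = lambda: None
    for i in xrange(1000000):
        f = f.__call__
    f = None   # with the trashcan-based dealloc, releasing the chain no longer crashes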
From python-checkins at python.org Mon Jul 3 16:07:31 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Mon, 3 Jul 2006 16:07:31 +0200 (CEST)
Subject: [Python-checkins] r47219 - python/trunk/Doc/lib/libstdtypes.tex
Message-ID: <20060703140731.DD1111E4004@bag.python.org>
Author: andrew.kuchling
Date: Mon Jul 3 16:07:30 2006
New Revision: 47219
Modified:
python/trunk/Doc/lib/libstdtypes.tex
Log:
[Bug #1515932] Clarify description of slice assignment
Modified: python/trunk/Doc/lib/libstdtypes.tex
==============================================================================
--- python/trunk/Doc/lib/libstdtypes.tex (original)
+++ python/trunk/Doc/lib/libstdtypes.tex Mon Jul 3 16:07:30 2006
@@ -1101,7 +1101,8 @@
\lineiii{\var{s}[\var{i}] = \var{x}}
{item \var{i} of \var{s} is replaced by \var{x}}{}
\lineiii{\var{s}[\var{i}:\var{j}] = \var{t}}
- {slice of \var{s} from \var{i} to \var{j} is replaced by \var{t}}{}
+ {slice of \var{s} from \var{i} to \var{j}
+ is replaced by the contents of the iterable \var{t}}{}
\lineiii{del \var{s}[\var{i}:\var{j}]}
{same as \code{\var{s}[\var{i}:\var{j}] = []}}{}
\lineiii{\var{s}[\var{i}:\var{j}:\var{k}] = \var{t}}
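[Editorial illustration of the clarified wording; not taken from the docs.]

    s = [0, 1, 2, 3, 4]
    s[1:3] = ('a', 'b', 'c')     # any iterable is acceptable, not just a list
    print s                      # [0, 'a', 'b', 'c', 3, 4]
    del s[1:4]                   # same as s[1:4] = []
    print s                      # [0, 3, 4]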
From theller at python.net Mon Jul 3 16:13:20 2006
From: theller at python.net (Thomas Heller)
Date: Mon, 03 Jul 2006 16:13:20 +0200
Subject: [Python-checkins] r46894 - in python/trunk:Modules/timemodule.c
Objects/exceptions.c Objects/fileobject.c
In-Reply-To: <129CEF95A523704B9D46959C922A280002FE9832@nemesis.central.ccp.cc>
References: <129CEF95A523704B9D46959C922A280002FE9832@nemesis.central.ccp.cc>
Message-ID: <44A92600.7050105@python.net>
Kristján V. Jónsson schrieb:
> Thomas, can you check if testing for the macro __STDC_SECURE_LIB__ in addition to testing the _MSC_VER resolves the problem? I see no versioning macros for the CRT that we can use for this.
Yes, this attached patch works for me.
Thomas
Index: exceptions.c
===================================================================
--- exceptions.c (revision 47207)
+++ exceptions.c (working copy)
@@ -1967,7 +1967,7 @@
if (PyDict_SetItemString(bdict, # TYPE, PyExc_ ## TYPE)) \
Py_FatalError("Module dictionary insertion problem.");
-#if defined _MSC_VER && _MSC_VER >= 1400
+#if defined _MSC_VER && _MSC_VER >= 1400 && defined(__STDC_SECURE_LIB__)
/* crt variable checking in VisualStudio .NET 2005 */
#include
@@ -2120,7 +2120,7 @@
Py_DECREF(bltinmod);
-#if defined _MSC_VER && _MSC_VER >= 1400
+#if defined _MSC_VER && _MSC_VER >= 1400 && defined(__STDC_SECURE_LIB__)
/* Set CRT argument error handler */
prevCrtHandler = _set_invalid_parameter_handler(InvalidParameterHandler);
/* turn off assertions in debug mode */
@@ -2133,7 +2133,7 @@
{
Py_XDECREF(PyExc_MemoryErrorInst);
PyExc_MemoryErrorInst = NULL;
-#if defined _MSC_VER && _MSC_VER >= 1400
+#if defined _MSC_VER && _MSC_VER >= 1400 && defined(__STDC_SECURE_LIB__)
/* reset CRT error handling */
_set_invalid_parameter_handler(prevCrtHandler);
_CrtSetReportMode(_CRT_ASSERT, prevCrtReportMode);
From python-checkins at python.org Mon Jul 3 16:16:09 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Mon, 3 Jul 2006 16:16:09 +0200 (CEST)
Subject: [Python-checkins] r47220 - python/trunk/Doc/lib/libfuncs.tex
Message-ID: <20060703141609.805CA1E4004@bag.python.org>
Author: andrew.kuchling
Date: Mon Jul 3 16:16:09 2006
New Revision: 47220
Modified:
python/trunk/Doc/lib/libfuncs.tex
Log:
[Bug #1511911] Clarify description of optional arguments to sorted()
by improving the xref to the section on lists, and by
copying the explanations of the arguments (with a slight modification).
Modified: python/trunk/Doc/lib/libfuncs.tex
==============================================================================
--- python/trunk/Doc/lib/libfuncs.tex (original)
+++ python/trunk/Doc/lib/libfuncs.tex Mon Jul 3 16:16:09 2006
@@ -1008,8 +1008,30 @@
\begin{funcdesc}{sorted}{iterable\optional{, cmp\optional{,
key\optional{, reverse}}}}
Return a new sorted list from the items in \var{iterable}.
- The optional arguments \var{cmp}, \var{key}, and \var{reverse}
- have the same meaning as those for the \method{list.sort()} method.
+
+ The optional arguments \var{cmp}, \var{key}, and \var{reverse} have
+ the same meaning as those for the \method{list.sort()} method
+ (described in section~\ref{typesseq-mutable}).
+
+ \var{cmp} specifies a custom comparison function of two arguments
+ (iterable elements) which should return a negative, zero or positive
+ number depending on whether the first argument is considered smaller
+ than, equal to, or larger than the second argument:
+ \samp{\var{cmp}=\keyword{lambda} \var{x},\var{y}:
+ \function{cmp}(x.lower(), y.lower())}
+
+ \var{key} specifies a function of one argument that is used to
+ extract a comparison key from each list element:
+ \samp{\var{key}=\function{str.lower}}
+
+ \var{reverse} is a boolean value. If set to \code{True}, then the
+ list elements are sorted as if each comparison were reversed.
+
+ In general, the \var{key} and \var{reverse} conversion processes are
+ much faster than specifying an equivalent \var{cmp} function. This is
+ because \var{cmp} is called multiple times for each list element while
+ \var{key} and \var{reverse} touch each element only once.
+
\versionadded{2.4}
\end{funcdesc}
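[Editorial usage sketch of the three arguments described above; the data is illustrative.]

    words = ['banana', 'Apple', 'cherry']
    print sorted(words, key=str.lower)                              # case-insensitive
    print sorted(words, cmp=lambda x, y: cmp(x.lower(), y.lower())) # same order, slower
    print sorted(words, key=str.lower, reverse=True)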
From python-checkins at python.org Mon Jul 3 16:18:14 2006
From: python-checkins at python.org (matt.fleming)
Date: Mon, 3 Jul 2006 16:18:14 +0200 (CEST)
Subject: [Python-checkins] r47221 - sandbox/trunk/pdb/README.txt
sandbox/trunk/pdb/mconnection.py sandbox/trunk/pdb/mpdb.py
Message-ID: <20060703141814.B23121E4004@bag.python.org>
Author: matt.fleming
Date: Mon Jul 3 16:18:13 2006
New Revision: 47221
Modified:
sandbox/trunk/pdb/README.txt
sandbox/trunk/pdb/mconnection.py
sandbox/trunk/pdb/mpdb.py
Log:
Remove the SO_REUSEADDR sockopt because it was causing problems with the unit tests on Windows and hasn't caused any other problems since being removed. Add things to do to README.txt, and make MPdb thread-ignorant so that Rocky (mentor) can track thread code changes more easily.
Modified: sandbox/trunk/pdb/README.txt
==============================================================================
--- sandbox/trunk/pdb/README.txt (original)
+++ sandbox/trunk/pdb/README.txt Mon Jul 3 16:18:13 2006
@@ -50,4 +50,7 @@
this currently has _no_ effect. We should at some point be able to switch
threads and from then on not have to use the 'thread apply '
syntax, but just '' which would be applied to the current thread.
+* Run command does not pass arguments across a remote connection, i.e.
+ `run 2' does not pass the argument 2 to the script.
+* Allow thread debugging to be turned on with a command line switch.
Modified: sandbox/trunk/pdb/mconnection.py
==============================================================================
--- sandbox/trunk/pdb/mconnection.py (original)
+++ sandbox/trunk/pdb/mconnection.py Mon Jul 3 16:18:13 2006
@@ -112,7 +112,6 @@
self.port = int(p)
if not self.listening:
self._sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
- self._sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
try:
self._sock.bind((self.host, self.port))
except socket.error, e:
Modified: sandbox/trunk/pdb/mpdb.py
==============================================================================
--- sandbox/trunk/pdb/mpdb.py (original)
+++ sandbox/trunk/pdb/mpdb.py Mon Jul 3 16:18:13 2006
@@ -57,7 +57,7 @@
self.target = 'local' # local connections by default
self.lastcmd = ''
self.connection = None
- self.debug_thread = False
+ self._info_cmds.append('target')
def _rebind_input(self, new_input):
""" This method rebinds the debugger's input to the object specified
@@ -168,11 +168,6 @@
else:
pydb.Pdb.do_info(self, arg)
- def help_info(self, *args):
- """Extends pydb help_info command. """
- self.subcommand_help('info', getattr(self,'help_info').__doc__,
- self._info_cmds, self.info_helper, args[0])
-
def info_helper(self, cmd, label=False):
"""Extends pydb info_helper() to give info about a single Mpdb
info extension."""
@@ -183,24 +178,6 @@
else:
pydb.Pdb.info_helper(self, cmd)
- def do_set(self, arg):
- """ Extends pydb do_set() to allow setting thread debugging. """
- if not arg:
- pydb.Pdb.do_set(self, arg)
- return
-
- args = arg.split()
- if 'thread'.startswith(args[0]):
- try:
- import mthread
- except ImportError:
- self.errmsg('Thread debugging is not on')
- return
- mthread.init(self.msg, self.errmsg)
- self.msg('Thread debugging on')
- self.debug_thread = True
- return
-
# Debugger commands
def do_attach(self, addr):
""" Attach to a process or file outside of Pdb.
From buildbot at python.org Mon Jul 3 16:18:59 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 03 Jul 2006 14:18:59 +0000
Subject: [Python-checkins] buildbot warnings in x86 OpenBSD 2.4
Message-ID: <20060703141900.15A011E4004@bag.python.org>
The Buildbot has detected a new failure of x86 OpenBSD 2.4.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520OpenBSD%25202.4/builds/132
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch branches/release24-maint] HEAD
Blamelist: martin.v.loewis
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Mon Jul 3 16:19:01 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Mon, 3 Jul 2006 16:19:01 +0200 (CEST)
Subject: [Python-checkins] r47222 -
python/branches/release24-maint/Doc/lib/libfuncs.tex
Message-ID: <20060703141901.9B0311E4004@bag.python.org>
Author: andrew.kuchling
Date: Mon Jul 3 16:19:01 2006
New Revision: 47222
Modified:
python/branches/release24-maint/Doc/lib/libfuncs.tex
Log:
[Bug #1511911] Backport: Clarify optional arguments to sorted() and improve xref
Modified: python/branches/release24-maint/Doc/lib/libfuncs.tex
==============================================================================
--- python/branches/release24-maint/Doc/lib/libfuncs.tex (original)
+++ python/branches/release24-maint/Doc/lib/libfuncs.tex Mon Jul 3 16:19:01 2006
@@ -934,8 +934,30 @@
\begin{funcdesc}{sorted}{iterable\optional{, cmp\optional{,
key\optional{, reverse}}}}
Return a new sorted list from the items in \var{iterable}.
- The optional arguments \var{cmp}, \var{key}, and \var{reverse}
- have the same meaning as those for the \method{list.sort()} method.
+
+ The optional arguments \var{cmp}, \var{key}, and \var{reverse} have
+ the same meaning as those for the \method{list.sort()} method
+ (described in section~\ref{typesseq-mutable}).
+
+ \var{cmp} specifies a custom comparison function of two arguments
+ (iterable elements) which should return a negative, zero or positive
+ number depending on whether the first argument is considered smaller
+ than, equal to, or larger than the second argument:
+ \samp{\var{cmp}=\keyword{lambda} \var{x},\var{y}:
+ \function{cmp}(x.lower(), y.lower())}
+
+ \var{key} specifies a function of one argument that is used to
+ extract a comparison key from each list element:
+ \samp{\var{key}=\function{str.lower}}
+
+ \var{reverse} is a boolean value. If set to \code{True}, then the
+ list elements are sorted as if each comparison were reversed.
+
+ In general, the \var{key} and \var{reverse} conversion processes are
+ much faster than specifying an equivalent \var{cmp} function. This is
+ because \var{cmp} is called multiple times for each list element while
+ \var{key} and \var{reverse} touch each element only once.
+
\versionadded{2.4}
\end{funcdesc}
From python-checkins at python.org Mon Jul 3 16:59:05 2006
From: python-checkins at python.org (kristjan.jonsson)
Date: Mon, 3 Jul 2006 16:59:05 +0200 (CEST)
Subject: [Python-checkins] r47223 - in python/trunk: Modules/timemodule.c
Objects/exceptions.c Objects/fileobject.c
Message-ID: <20060703145905.EB9A41E4009@bag.python.org>
Author: kristjan.jonsson
Date: Mon Jul 3 16:59:05 2006
New Revision: 47223
Modified:
python/trunk/Modules/timemodule.c
python/trunk/Objects/exceptions.c
python/trunk/Objects/fileobject.c
Log:
Fix build problems with the Platform SDK on Windows. It is not sufficient to test the C compiler version when determining whether we have Microsoft's secure CRT; the undocumented macro __STDC_SECURE_LIB__ must be tested as well.
Modified: python/trunk/Modules/timemodule.c
==============================================================================
--- python/trunk/Modules/timemodule.c (original)
+++ python/trunk/Modules/timemodule.c Mon Jul 3 16:59:05 2006
@@ -467,7 +467,7 @@
return ret;
}
free(outbuf);
-#if defined _MSC_VER && _MSC_VER >= 1400
+#if defined _MSC_VER && _MSC_VER >= 1400 && defined(__STDC_SECURE_LIB__)
/* VisualStudio .NET 2005 does this properly */
if (buflen == 0 && errno == EINVAL) {
PyErr_SetString(PyExc_ValueError, "Invalid format string");
Modified: python/trunk/Objects/exceptions.c
==============================================================================
--- python/trunk/Objects/exceptions.c (original)
+++ python/trunk/Objects/exceptions.c Mon Jul 3 16:59:05 2006
@@ -1967,7 +1967,7 @@
if (PyDict_SetItemString(bdict, # TYPE, PyExc_ ## TYPE)) \
Py_FatalError("Module dictionary insertion problem.");
-#if defined _MSC_VER && _MSC_VER >= 1400
+#if defined _MSC_VER && _MSC_VER >= 1400 && defined(__STDC_SECURE_LIB__)
/* crt variable checking in VisualStudio .NET 2005 */
#include
@@ -2120,7 +2120,7 @@
Py_DECREF(bltinmod);
-#if defined _MSC_VER && _MSC_VER >= 1400
+#if defined _MSC_VER && _MSC_VER >= 1400 && defined(__STDC_SECURE_LIB__)
/* Set CRT argument error handler */
prevCrtHandler = _set_invalid_parameter_handler(InvalidParameterHandler);
/* turn off assertions in debug mode */
@@ -2133,7 +2133,7 @@
{
Py_XDECREF(PyExc_MemoryErrorInst);
PyExc_MemoryErrorInst = NULL;
-#if defined _MSC_VER && _MSC_VER >= 1400
+#if defined _MSC_VER && _MSC_VER >= 1400 && defined(__STDC_SECURE_LIB__)
/* reset CRT error handling */
_set_invalid_parameter_handler(prevCrtHandler);
_CrtSetReportMode(_CRT_ASSERT, prevCrtReportMode);
Modified: python/trunk/Objects/fileobject.c
==============================================================================
--- python/trunk/Objects/fileobject.c (original)
+++ python/trunk/Objects/fileobject.c Mon Jul 3 16:59:05 2006
@@ -241,7 +241,7 @@
}
if (f->f_fp == NULL) {
-#if defined _MSC_VER && _MSC_VER < 1400
+#if defined _MSC_VER && (_MSC_VER < 1400 || !defined(__STDC_SECURE_LIB__))
/* MSVC 6 (Microsoft) leaves errno at 0 for bad mode strings,
* across all Windows flavors. When it sets EINVAL varies
* across Windows flavors, the exact conditions aren't
From buildbot at python.org Mon Jul 3 17:05:59 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 03 Jul 2006 15:05:59 +0000
Subject: [Python-checkins] buildbot warnings in sparc solaris10 gcc trunk
Message-ID: <20060703150559.D97781E4008@bag.python.org>
The Buildbot has detected a new failure of sparc solaris10 gcc trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/sparc%2520solaris10%2520gcc%2520trunk/builds/1124
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: martin.v.loewis
Build Had Warnings: warnings failed slave lost
sincerely,
-The Buildbot
From buildbot at python.org Mon Jul 3 17:14:48 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 03 Jul 2006 15:14:48 +0000
Subject: [Python-checkins] buildbot warnings in ppc Debian unstable trunk
Message-ID: <20060703151448.C4E991E4008@bag.python.org>
The Buildbot has detected a new failure of ppc Debian unstable trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/ppc%2520Debian%2520unstable%2520trunk/builds/873
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: martin.v.loewis
Build Had Warnings: warnings failed slave lost
sincerely,
-The Buildbot
From python-checkins at python.org Tue Jul 4 14:30:24 2006
From: python-checkins at python.org (ronald.oussoren)
Date: Tue, 4 Jul 2006 14:30:24 +0200 (CEST)
Subject: [Python-checkins] r47224 -
python/trunk/Modules/_ctypes/libffi/src/x86/darwin.S
python/trunk/Modules/_ctypes/libffi/src/x86/ffi_darwin.c
Message-ID: <20060704123024.5D5531E4008@bag.python.org>
Author: ronald.oussoren
Date: Tue Jul 4 14:30:22 2006
New Revision: 47224
Modified:
python/trunk/Modules/_ctypes/libffi/src/x86/darwin.S
python/trunk/Modules/_ctypes/libffi/src/x86/ffi_darwin.c
Log:
Sync the darwin/x86 libffi port with the copy in PyObjC. This fixes a number
of bugs in that port. The most annoying ones were due to some subtle differences
between the documented ABI and the actual implementation :-(
(There are no Python unittests that fail without this patch, but without it
some of libffi's own unittests fail.)
Modified: python/trunk/Modules/_ctypes/libffi/src/x86/darwin.S
==============================================================================
--- python/trunk/Modules/_ctypes/libffi/src/x86/darwin.S (original)
+++ python/trunk/Modules/_ctypes/libffi/src/x86/darwin.S Tue Jul 4 14:30:22 2006
@@ -35,6 +35,13 @@
#include
#include
+#ifdef PyObjC_STRICT_DEBUGGING
+ /* XXX: Debugging of stack alignment, to be removed */
+#define ASSERT_STACK_ALIGNED movdqa -16(%esp), %xmm0
+#else
+#define ASSERT_STACK_ALIGNED
+#endif
+
.text
.globl _ffi_prep_args
@@ -47,30 +54,41 @@
pushl %ebp
.LCFI0:
movl %esp,%ebp
+ subl $8,%esp
+ ASSERT_STACK_ALIGNED
.LCFI1:
/* Make room for all of the new args. */
movl 16(%ebp),%ecx
subl %ecx,%esp
+ ASSERT_STACK_ALIGNED
+
movl %esp,%eax
/* Place all of the ffi_prep_args in position */
+ subl $8,%esp
pushl 12(%ebp)
pushl %eax
call *8(%ebp)
+ ASSERT_STACK_ALIGNED
+
/* Return stack to previous state and call the function */
- addl $8,%esp
+ addl $16,%esp
- call *28(%ebp)
+ ASSERT_STACK_ALIGNED
- /* Remove the space we pushed for the args */
+ call *28(%ebp)
+
+ /* XXX: return returns return with 'ret $4', that upsets the stack! */
movl 16(%ebp),%ecx
addl %ecx,%esp
+
/* Load %ecx with the return type code */
movl 20(%ebp),%ecx
+
/* If the return value pointer is NULL, assume no return value. */
cmpl $0,24(%ebp)
jne retint
@@ -117,17 +135,47 @@
retint64:
cmpl $FFI_TYPE_SINT64,%ecx
- jne retstruct
+ jne retstruct1b
/* Load %ecx with the pointer to storage for the return value */
movl 24(%ebp),%ecx
movl %eax,0(%ecx)
movl %edx,4(%ecx)
+ jmp epilogue
+
+retstruct1b:
+ cmpl $FFI_TYPE_SINT8,%ecx
+ jne retstruct2b
+ movl 24(%ebp),%ecx
+ movb %al,0(%ecx)
+ jmp epilogue
+
+retstruct2b:
+ cmpl $FFI_TYPE_SINT16,%ecx
+ jne retstruct
+ movl 24(%ebp),%ecx
+ movw %ax,0(%ecx)
+ jmp epilogue
retstruct:
+ cmpl $FFI_TYPE_STRUCT,%ecx
+ jne noretval
/* Nothing to do! */
+ subl $4,%esp
+
+ ASSERT_STACK_ALIGNED
+
+ addl $8,%esp
+ movl %ebp, %esp
+ popl %ebp
+ ret
+
noretval:
epilogue:
+ ASSERT_STACK_ALIGNED
+ addl $8, %esp
+
+
movl %ebp,%esp
popl %ebp
ret
Modified: python/trunk/Modules/_ctypes/libffi/src/x86/ffi_darwin.c
==============================================================================
--- python/trunk/Modules/_ctypes/libffi/src/x86/ffi_darwin.c (original)
+++ python/trunk/Modules/_ctypes/libffi/src/x86/ffi_darwin.c Tue Jul 4 14:30:22 2006
@@ -140,7 +140,7 @@
switch (cif->rtype->type)
{
case FFI_TYPE_VOID:
-#if !defined(X86_WIN32)
+#if !defined(X86_WIN32) && !defined(X86_DARWIN)
case FFI_TYPE_STRUCT:
#endif
case FFI_TYPE_SINT64:
@@ -154,7 +154,7 @@
cif->flags = FFI_TYPE_SINT64;
break;
-#if defined X86_WIN32
+#if defined(X86_WIN32) || defined(X86_DARWIN)
case FFI_TYPE_STRUCT:
if (cif->rtype->size == 1)
@@ -186,10 +186,11 @@
}
/* Darwin: The stack needs to be aligned to a multiple of 16 bytes */
-#if 0
+#if 1
cif->bytes = (cif->bytes + 15) & ~0xF;
#endif
+
return FFI_OK;
}
@@ -221,7 +222,6 @@
/*@dependent@*/ void **avalue)
{
extended_cif ecif;
- int flags;
ecif.cif = cif;
ecif.avalue = avalue;
@@ -238,20 +238,6 @@
else
ecif.rvalue = rvalue;
- flags = cif->flags;
- if (flags == FFI_TYPE_STRUCT) {
- if (cif->rtype->size == 8) {
- flags = FFI_TYPE_SINT64;
- } else if (cif->rtype->size == 4) {
- flags = FFI_TYPE_INT;
- } else if (cif->rtype->size == 2) {
- flags = FFI_TYPE_INT;
- } else if (cif->rtype->size == 1) {
- flags = FFI_TYPE_INT;
- }
- }
-
-
switch (cif->abi)
{
case FFI_SYSV:
@@ -260,8 +246,8 @@
* block is a multiple of 16. Then add 8 to compensate for local variables
* in ffi_call_SYSV.
*/
- ffi_call_SYSV(ffi_prep_args, &ecif, ALIGN(cif->bytes, 16) +8,
- flags, ecif.rvalue, fn);
+ ffi_call_SYSV(ffi_prep_args, &ecif, cif->bytes,
+ cif->flags, ecif.rvalue, fn);
/*@=usedef@*/
break;
#ifdef X86_WIN32
@@ -281,8 +267,6 @@
/** private members **/
-static void ffi_prep_incoming_args_SYSV (char *stack, void **ret,
- void** args, ffi_cif* cif);
static void ffi_closure_SYSV (ffi_closure *)
__attribute__ ((regparm(1)));
#if !FFI_NO_RAW_API
@@ -290,6 +274,48 @@
__attribute__ ((regparm(1)));
#endif
+/*@-exportheader@*/
+static inline void
+ffi_prep_incoming_args_SYSV(char *stack, void **rvalue,
+ void **avalue, ffi_cif *cif)
+/*@=exportheader@*/
+{
+ register unsigned int i;
+ register void **p_argv;
+ register char *argp;
+ register ffi_type **p_arg;
+
+ argp = stack;
+
+ if (retval_on_stack(cif->rtype)) {
+ *rvalue = *(void **) argp;
+ argp += 4;
+ }
+
+ p_argv = avalue;
+
+ for (i = cif->nargs, p_arg = cif->arg_types; (i != 0); i--, p_arg++)
+ {
+ size_t z;
+
+ /* Align if necessary */
+ if ((sizeof(int) - 1) & (unsigned) argp) {
+ argp = (char *) ALIGN(argp, sizeof(int));
+ }
+
+ z = (*p_arg)->size;
+
+ /* because we're little endian, this is what it turns into. */
+
+ *p_argv = (void*) argp;
+
+ p_argv++;
+ argp += z;
+ }
+
+ return;
+}
+
/* This function is jumped to by the trampoline */
static void
@@ -302,10 +328,10 @@
// our various things...
ffi_cif *cif;
void **arg_area;
- unsigned short rtype;
void *resp = (void*)&res;
void *args = __builtin_dwarf_cfa ();
+
cif = closure->cif;
arg_area = (void**) alloca (cif->nargs * sizeof (void*));
@@ -319,94 +345,52 @@
(closure->fun) (cif, resp, arg_area, closure->user_data);
- rtype = cif->flags;
-
- if (!retval_on_stack(cif->rtype) && cif->flags == FFI_TYPE_STRUCT) {
- if (cif->rtype->size == 8) {
- rtype = FFI_TYPE_SINT64;
- } else {
- rtype = FFI_TYPE_INT;
- }
- }
-
/* now, do a generic return based on the value of rtype */
- if (rtype == FFI_TYPE_INT)
+ if (cif->flags == FFI_TYPE_INT)
{
asm ("movl (%0),%%eax" : : "r" (resp) : "eax");
}
- else if (rtype == FFI_TYPE_FLOAT)
+ else if (cif->flags == FFI_TYPE_FLOAT)
{
asm ("flds (%0)" : : "r" (resp) : "st" );
}
- else if (rtype == FFI_TYPE_DOUBLE)
+ else if (cif->flags == FFI_TYPE_DOUBLE)
{
asm ("fldl (%0)" : : "r" (resp) : "st", "st(1)" );
}
- else if (rtype == FFI_TYPE_LONGDOUBLE)
+ else if (cif->flags == FFI_TYPE_LONGDOUBLE)
{
asm ("fldt (%0)" : : "r" (resp) : "st", "st(1)" );
}
- else if (rtype == FFI_TYPE_SINT64)
+ else if (cif->flags == FFI_TYPE_SINT64)
{
asm ("movl 0(%0),%%eax;"
"movl 4(%0),%%edx"
: : "r"(resp)
: "eax", "edx");
}
-#ifdef X86_WIN32
- else if (rtype == FFI_TYPE_SINT8) /* 1-byte struct */
+#if defined(X86_WIN32) || defined(X86_DARWIN)
+ else if (cif->flags == FFI_TYPE_SINT8) /* 1-byte struct */
{
asm ("movsbl (%0),%%eax" : : "r" (resp) : "eax");
}
- else if (rtype == FFI_TYPE_SINT16) /* 2-bytes struct */
+ else if (cif->flags == FFI_TYPE_SINT16) /* 2-bytes struct */
{
asm ("movswl (%0),%%eax" : : "r" (resp) : "eax");
}
#endif
-}
-/*@-exportheader@*/
-static void
-ffi_prep_incoming_args_SYSV(char *stack, void **rvalue,
- void **avalue, ffi_cif *cif)
-/*@=exportheader@*/
-{
- register unsigned int i;
- register void **p_argv;
- register char *argp;
- register ffi_type **p_arg;
-
- argp = stack;
-
- if (retval_on_stack(cif->rtype)) {
- *rvalue = *(void **) argp;
- argp += 4;
- }
-
- p_argv = avalue;
-
- for (i = cif->nargs, p_arg = cif->arg_types; (i != 0); i--, p_arg++)
+ else if (cif->flags == FFI_TYPE_STRUCT)
{
- size_t z;
-
- /* Align if necessary */
- if ((sizeof(int) - 1) & (unsigned) argp) {
- argp = (char *) ALIGN(argp, sizeof(int));
- }
-
- z = (*p_arg)->size;
-
- /* because we're little endian, this is what it turns into. */
-
- *p_argv = (void*) argp;
-
- p_argv++;
- argp += z;
+ asm ("lea -8(%ebp),%esp;"
+ "pop %esi;"
+ "pop %edi;"
+ "pop %ebp;"
+ "ret $4");
}
-
- return;
}
+
/* How to make a trampoline. Derived from gcc/config/i386/i386.c. */
#define FFI_INIT_TRAMPOLINE(TRAMP,FUN,CTX) \
From mwh at python.net Sat Jul 1 13:37:01 2006
From: mwh at python.net (Michael Hudson)
Date: Sat, 01 Jul 2006 12:37:01 +0100
Subject: [Python-checkins] r46894 - in python/trunk:
Modules/timemodule.c Objects/exceptions.c Objects/fileobject.c
In-Reply-To: (Thomas Heller's message of "Fri,
30 Jun 2006 20:58:04 +0200")
References: <20060612154513.619FB1E4013@bag.python.org>
<1f7befae0606301137p714d43afhceffef125652a465@mail.gmail.com>
Message-ID: <2m4py17e02.fsf@starship.python.net>
Thomas Heller writes:
> Tim Peters schrieb:
>> Copying Kristján directly since he may not be subscribed to python-checkins.
>
> [off-topic, sorry]
>
> I would have done this myself, but I cannot log into the starship to send or receive mail.
> Does anyone know what's up?
The server moved. It should work now (unless your ISP has been fooled
by the overly long TTL that was on the python.net DNS record until too
recently; the new IP address is 87.106.17.236).
Cheers,
mwh
--
There are two ways of constructing a software design: one way is to
make it so simple that there are obviously no deficiencies and the
other way is to make it so complicated that there are no obvious
deficiencies. -- C. A. R. Hoare
From python-checkins at python.org Tue Jul 4 21:53:29 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Tue, 4 Jul 2006 21:53:29 +0200 (CEST)
Subject: [Python-checkins] r47226 - sandbox/trunk/Doc/functional.rst
Message-ID: <20060704195329.719491E4008@bag.python.org>
Author: andrew.kuchling
Date: Tue Jul 4 21:53:28 2006
New Revision: 47226
Modified:
sandbox/trunk/Doc/functional.rst
Log:
Markup fixes
Modified: sandbox/trunk/Doc/functional.rst
==============================================================================
--- sandbox/trunk/Doc/functional.rst (original)
+++ sandbox/trunk/Doc/functional.rst Tue Jul 4 21:53:28 2006
@@ -121,8 +121,8 @@
The technique used to prove programs correct is to write down
**invariants**, properties of the input data and of the program's
variables that are always true. For each line of code, you then show
-that if invariants X and Y are true ***before*** the line is executed,
-the slightly different invariants X' and Y' are true ***after***
+that if invariants X and Y are true **before** the line is executed,
+the slightly different invariants X' and Y' are true **after**
the line is executed. This continues until you reach the end of the
program, at which point the invariants should match the desired
conditions on the program's output.
@@ -139,7 +139,7 @@
you use daily (the Python interpreter, your XML parser, your web
browser) could be proven correct. Even if you wrote down or generated
a proof, there would then be the question of verifying the proof;
-maybe there's an error in it, and you only ***think*** you've proved
+maybe there's an error in it, and you only think you've proved
that the program correct.
Modularity
@@ -388,7 +388,7 @@
The ``for...in`` clauses contain the sequences to be iterated over.
The sequences do not have to be the same length, because they are
-iterated over from left to right, ***not*** in parallel. For each
+iterated over from left to right, **not** in parallel. For each
element in ``sequence1``, ``sequence2`` is looped over from the
beginning. ``sequence3`` is then looped over for each
resulting pair of elements from ``sequence1`` and ``sequence2``.
@@ -425,7 +425,7 @@
To avoid introducing an ambiguity into Python's grammar, if
``expression`` is creating a tuple, it must be surrounded with
parentheses. The first list comprehension below is a syntax error,
-while the second one is correct:
+while the second one is correct::
# Syntax error
[ x,y for x in seq1 for y in seq2]
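A quick interactive illustration of the left-to-right iteration order described
in the hunk above (the sequence names and values are made up; this is just
standard list comprehension behaviour):

    >>> seq1 = 'AB'
    >>> seq2 = [1, 2, 3]
    >>> # seq2 is looped over in full for each element of seq1 --
    >>> # and the tuple-valued expression needs its parentheses:
    >>> [(x, y) for x in seq1 for y in seq2]
    [('A', 1), ('A', 2), ('A', 3), ('B', 1), ('B', 2), ('B', 3)]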
From python-checkins at python.org Tue Jul 4 21:55:22 2006
From: python-checkins at python.org (georg.brandl)
Date: Tue, 4 Jul 2006 21:55:22 +0200 (CEST)
Subject: [Python-checkins] r47227 - peps/trunk/pep-3099.txt
Message-ID: <20060704195522.24E981E400D@bag.python.org>
Author: georg.brandl
Date: Tue Jul 4 21:55:21 2006
New Revision: 47227
Modified:
peps/trunk/pep-3099.txt
Log:
Add int[:100].
Modified: peps/trunk/pep-3099.txt
==============================================================================
--- peps/trunk/pep-3099.txt (original)
+++ peps/trunk/pep-3099.txt Tue Jul 4 21:55:21 2006
@@ -143,6 +143,11 @@
Thread: "Immutable lists",
http://mail.python.org/pipermail/python-3000/2006-May/002219.html
+* ``int`` will not support subscripts yielding a range.
+
+ Thread: "xrange vs. int.__getslice__",
+ http://mail.python.org/pipermail/python-3000/2006-June/002450.html
+
Coding style
============
From python-checkins at python.org Tue Jul 4 21:55:35 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Tue, 4 Jul 2006 21:55:35 +0200 (CEST)
Subject: [Python-checkins] r47228 - sandbox/trunk/Doc/functional.rst
Message-ID: <20060704195535.83D491E4008@bag.python.org>
Author: andrew.kuchling
Date: Tue Jul 4 21:55:35 2006
New Revision: 47228
Modified:
sandbox/trunk/Doc/functional.rst
Log:
Wording fix
Modified: sandbox/trunk/Doc/functional.rst
==============================================================================
--- sandbox/trunk/Doc/functional.rst (original)
+++ sandbox/trunk/Doc/functional.rst Tue Jul 4 21:55:35 2006
@@ -139,8 +139,8 @@
you use daily (the Python interpreter, your XML parser, your web
browser) could be proven correct. Even if you wrote down or generated
a proof, there would then be the question of verifying the proof;
-maybe there's an error in it, and you only think you've proved
-that the program correct.
+maybe there's an error in it, and you wrongly believe you've proved
+the program correct.
Modularity
''''''''''''''''''''''
@@ -1142,7 +1142,7 @@
The author would like to thank the following people for offering
suggestions, corrections and assistance with various drafts of this
article: Ian Bicking, Nick Coghlan, Raymond Hettinger, Jim Jewett,
-Leandro Lameiro.
+Leandro Lameiro, Blake Winton.
Version 0.1: posted June 30 2006.
From python-checkins at python.org Tue Jul 4 23:47:08 2006
From: python-checkins at python.org (matt.fleming)
Date: Tue, 4 Jul 2006 23:47:08 +0200 (CEST)
Subject: [Python-checkins] r47230 - in sandbox/trunk/pdb: mconnection.py
mpdb.py test/test_mpdb.py
Message-ID: <20060704214708.B078C1E4009@bag.python.org>
Author: matt.fleming
Date: Tue Jul 4 23:47:08 2006
New Revision: 47230
Modified:
sandbox/trunk/pdb/mconnection.py
sandbox/trunk/pdb/mpdb.py
sandbox/trunk/pdb/test/test_mpdb.py
Log:
Avoid infinite loops in test cases. Allow the REUSEADDR socket option to be set
when connect() is called. The override of Pydb's msg_nocr method is no longer
needed because Rocky applied a patch to CVS.
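The SO_REUSEADDR change follows the usual pattern of setting the option before
bind(); a rough stand-alone sketch of that pattern (make_listener is an invented
name, not part of mpdb):

    import socket

    def make_listener(host, port, reuseaddr=False):
        # Mirrors the new reuseaddr flag on connect() in mconnection.py:
        # SO_REUSEADDR must be set before bind() so a recently closed
        # address can be re-bound immediately, which helps when tests
        # restart the debugger server in quick succession.
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        if reuseaddr:
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        sock.bind((host, port))
        sock.listen(1)
        return sock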
Modified: sandbox/trunk/pdb/mconnection.py
==============================================================================
--- sandbox/trunk/pdb/mconnection.py (original)
+++ sandbox/trunk/pdb/mconnection.py Tue Jul 4 23:47:08 2006
@@ -100,7 +100,7 @@
self._sock = self.output = self.input = None
MConnectionInterface.__init__(self)
- def connect(self, addr):
+ def connect(self, addr, reuseaddr=False):
"""Set to allow a connection from a client. 'addr' specifies
the hostname and port combination of the server.
"""
@@ -112,6 +112,9 @@
self.port = int(p)
if not self.listening:
self._sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
+ if reuseaddr:
+ self._sock.setsockopt(socket.SOL_SOCKET,
+ socket.SO_REUSEADDR, 1)
try:
self._sock.bind((self.host, self.port))
except socket.error, e:
Modified: sandbox/trunk/pdb/mpdb.py
==============================================================================
--- sandbox/trunk/pdb/mpdb.py (original)
+++ sandbox/trunk/pdb/mpdb.py Tue Jul 4 23:47:08 2006
@@ -142,20 +142,6 @@
if hasattr(self, 'local_prompt'):
self.prompt = self.local_prompt
- def msg_nocr(self, msg, out=None):
- """Common routine for reporting messages. Derived classes may want
- to override this to capture output.
- """
- do_print = True
- if self.logging:
- if self.logging_fileobj is not None:
- print >> self.logging_fileobj, msg,
- do_print = not self.logging_redirect
- if do_print:
- if out is None:
- out = self.stdout
- print >> out, msg,
-
def do_info(self, arg):
"""Extends pydb do_info() to give info about the Mpdb extensions."""
if not arg:
Modified: sandbox/trunk/pdb/test/test_mpdb.py
==============================================================================
--- sandbox/trunk/pdb/test/test_mpdb.py (original)
+++ sandbox/trunk/pdb/test/test_mpdb.py Tue Jul 4 23:47:08 2006
@@ -28,7 +28,7 @@
address = __addr__
client.connection = MConnectionClientTCP()
- while True:
+ for i in range(MAXTRIES):
try:
client.connection.connect(address)
except ConnectionFailed, e:
@@ -52,8 +52,7 @@
MPdb.__init__(self)
threading.Thread.__init__(self)
self.botframe = None
- script = os.path.abspath('thread_script.py')
- self._sys_argv = [script]
+ self._sys_argv = ['python', '-c', '"pass"']
def run(self):
@@ -121,8 +120,10 @@
self.assertEquals(errmsg, line)
server.disconnect()
- while server._sock != None:
- time.sleep(0.1)
+
+ for i in range(MAXTRIES):
+ if server._sock != None: pass
+ else: break
def testRebindOutput(self):
""" Test rebinding output. """
@@ -168,8 +169,9 @@
self.client1.onecmd('restart')
self.client1.connection.write('rquit\n')
- while server.connection != None:
- time.sleep(0.1)
+ for i in range(MAXTRIES):
+ if server.connection != None: pass
+ else: break
def test_main():
test_support.run_unittest(TestRemoteDebugging)
From python-checkins at python.org Tue Jul 4 23:57:09 2006
From: python-checkins at python.org (neal.norwitz)
Date: Tue, 4 Jul 2006 23:57:09 +0200 (CEST)
Subject: [Python-checkins] r47231 - peps/trunk/pep-0356.txt
Message-ID: <20060704215709.D3A521E4008@bag.python.org>
Author: neal.norwitz
Date: Tue Jul 4 23:57:09 2006
New Revision: 47231
Modified:
peps/trunk/pep-0356.txt
Log:
Remove duplicate and closed bugs. Sort order and add some details.
Modified: peps/trunk/pep-0356.txt
==============================================================================
--- peps/trunk/pep-0356.txt (original)
+++ peps/trunk/pep-0356.txt Tue Jul 4 23:57:09 2006
@@ -147,18 +147,14 @@
Open issues
- Bugs that need resolving before release:
- http://python.org/sf/1513646
- http://python.org/sf/1513223
- http://python.org/sf/1512814
- http://python.org/sf/1512695
- http://python.org/sf/1504046
- http://python.org/sf/1501934
- http://python.org/sf/1333982
- http://python.org/sf/1513646
- http://python.org/sf/1508010
- http://python.org/sf/1475523
- http://python.org/sf/1494314
- http://python.org/sf/1515471
+ http://python.org/sf/1515471 - stringobject (char buffers)
+ http://python.org/sf/1512814 - AST incorrect lineno
+ http://python.org/sf/1508010 - msvccompiler using VC6
+ http://python.org/sf/1504046 - doc for ElementTree
+ http://python.org/sf/1501934 - AST incorrect LOAD/STORE_GLOBAL
+ http://python.org/sf/1494314 - can't use high sockets (regr in 2.4.3)
+ http://python.org/sf/1475523 - gettext.py bug
+ http://python.org/sf/1333982 - AST
- Should relative imports from __main__ work when feasible?
Bug report: http://python.org/sf/1510172
From python-checkins at python.org Wed Jul 5 02:51:40 2006
From: python-checkins at python.org (talin)
Date: Wed, 5 Jul 2006 02:51:40 +0200 (CEST)
Subject: [Python-checkins] r47233 - peps/trunk/pep-3101.txt
Message-ID: <20060705005140.BAC901E4008@bag.python.org>
Author: talin
Date: Wed Jul 5 02:51:40 2006
New Revision: 47233
Modified:
peps/trunk/pep-3101.txt
Log:
Updates to PEP 3101 as a result of discussion in Python-3000
Modified: peps/trunk/pep-3101.txt
==============================================================================
--- peps/trunk/pep-3101.txt (original)
+++ peps/trunk/pep-3101.txt Wed Jul 5 02:51:40 2006
@@ -169,12 +169,12 @@
"My name is {0:8}".format('Fred')
The meaning and syntax of the conversion specifiers depends on the
- type of object that is being formatted, however many of the
- built-in types will recognize a standard set of conversion
- specifiers.
+ type of object that is being formatted, however there is a
+ standard set of conversion specifiers used for any object that
+ does not override them.
Conversion specifiers can themselves contain replacement fields.
- For example, a field whose field width it itself a parameter
+ For example, a field whose field width is itself a parameter
could be specified via:
"{0:{1}}".format(a, b, c)
@@ -184,12 +184,12 @@
the '{{' and '}}' syntax for escapes is only applied when used
*outside* of a format field. Within a format field, the brace
characters always have their normal meaning.
-
- The syntax for conversion specifiers is open-ended, since except
- than doing field replacements, the format() method does not
- attempt to interpret them in any way; it merely passes all of the
- characters between the first colon and the matching brace to
- the various underlying formatter methods.
+
+ The syntax for conversion specifiers is open-ended, since a class
+ can override the standard conversion specifiers. In such cases,
+ the format() method merely passes all of the characters between
+ the first colon and the matching brace to the relevant underlying
+ formatting method.
Standard Conversion Specifiers
@@ -206,7 +206,7 @@
[[fill]align][sign][width][.precision][type]
- The brackets ([]) indicate an optional field.
+ The brackets ([]) indicate an optional element.
Then the optional align flag can be one of the following:
@@ -214,8 +214,8 @@
space (This is the default.)
'>' - Forces the field to be right-aligned within the
available space.
- '=' - Forces the padding to be placed between immediately
- after the sign, if any. This is used for printing fields
+ '=' - Forces the padding to be placed after the sign (if any)
+ but before the digits. This is used for printing fields
in the form '+000000120'.
Note that unless a minimum field width is defined, the field
@@ -229,7 +229,7 @@
specifier). A zero fill character without an alignment flag
implies an alignment type of '='.
- The 'sign' field can be one of the following:
+ The 'sign' element can be one of the following:
'+' - indicates that a sign should be used for both
positive as well as negative numbers
@@ -244,9 +244,12 @@
not specified, then the field width will be determined by the
content.
- The 'precision' field is a decimal number indicating how many
- digits should be displayed after the decimal point.
-
+ The 'precision' is a decimal number indicating how many digits
+ should be displayed after the decimal point in a floating point
+ conversion. In a string conversion the field indicates how many
+ characters will be used from the field content. The precision is
+ ignored for integer conversions.
+
Finally, the 'type' determines how the data should be presented.
If the type field is absent, an appropriate type will be assigned
based on the value to be formatted ('d' for integers and longs,
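The specifier mini-language revised above was still a draft at this point, but
later Python releases implement str.format() essentially as the PEP describes,
so a rough illustration of the '=' alignment and of a nested width field (using
that later behaviour, not anything available in 2006) looks like:

    >>> '{0:=+010d}'.format(120)     # padding sits after the sign, before the digits
    '+000000120'
    >>> '{0:{1}}'.format('Fred', 8)  # the field width is itself a parameter
    'Fred    '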
From python-checkins at python.org Wed Jul 5 10:21:01 2006
From: python-checkins at python.org (georg.brandl)
Date: Wed, 5 Jul 2006 10:21:01 +0200 (CEST)
Subject: [Python-checkins] r47234 - in python/trunk/Misc: Vim/python.vim
cheatsheet
Message-ID: <20060705082101.89B961E4008@bag.python.org>
Author: georg.brandl
Date: Wed Jul 5 10:21:00 2006
New Revision: 47234
Modified:
python/trunk/Misc/Vim/python.vim
python/trunk/Misc/cheatsheet
Log:
Remove remaining references to OverflowWarning.
Modified: python/trunk/Misc/Vim/python.vim
==============================================================================
--- python/trunk/Misc/Vim/python.vim (original)
+++ python/trunk/Misc/Vim/python.vim Wed Jul 5 10:21:00 2006
@@ -85,7 +85,7 @@
syn keyword pythonException LookupError OSError DeprecationWarning
syn keyword pythonException UnicodeError UnicodeEncodeError
syn keyword pythonException FloatingPointError ReferenceError NameError
- syn keyword pythonException OverflowWarning IOError SyntaxError
+ syn keyword pythonException IOError SyntaxError
syn keyword pythonException FutureWarning ImportWarning SystemExit
syn keyword pythonException Exception EOFError StandardError ValueError
syn keyword pythonException TabError KeyError ZeroDivisionError SystemError
Modified: python/trunk/Misc/cheatsheet
==============================================================================
--- python/trunk/Misc/cheatsheet (original)
+++ python/trunk/Misc/cheatsheet Wed Jul 5 10:21:00 2006
@@ -1135,7 +1135,6 @@
DeprecationWarning
PendingDeprecationWarning
SyntaxWarning
- OverflowWarning
RuntimeWarning
FutureWarning
From python-checkins at python.org Wed Jul 5 10:28:49 2006
From: python-checkins at python.org (georg.brandl)
Date: Wed, 5 Jul 2006 10:28:49 +0200 (CEST)
Subject: [Python-checkins] r47235 - peps/trunk/pep-3099.txt
Message-ID: <20060705082849.6548E1E400D@bag.python.org>
Author: georg.brandl
Date: Wed Jul 5 10:28:49 2006
New Revision: 47235
Modified:
peps/trunk/pep-3099.txt
Log:
Forbid ":=".
Modified: peps/trunk/pep-3099.txt
==============================================================================
--- peps/trunk/pep-3099.txt (original)
+++ peps/trunk/pep-3099.txt Wed Jul 5 10:28:49 2006
@@ -116,6 +116,11 @@
object",
http://mail.python.org/pipermail/python-3000/2006-July/002485.html
+* There will be no alternative binding operators such as ``:=``.
+
+ Thread: "Explicit Lexical Scoping (pre-PEP?)",
+ http://mail.python.org/pipermail/python-dev/2006-July/066995.html
+
Builtins
========
From python-checkins at python.org Wed Jul 5 11:13:57 2006
From: python-checkins at python.org (thomas.heller)
Date: Wed, 5 Jul 2006 11:13:57 +0200 (CEST)
Subject: [Python-checkins] r47236 - python/trunk/Modules/_ctypes/cfield.c
Message-ID: <20060705091357.2C2EB1E4008@bag.python.org>
Author: thomas.heller
Date: Wed Jul 5 11:13:56 2006
New Revision: 47236
Modified:
python/trunk/Modules/_ctypes/cfield.c
Log:
Fix the bitfield test when _ctypes is compiled with MinGW. Structures
containing bitfields may have a different layout under MSVC and MinGW.
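The layout difference comes down to how each compiler merges adjacent bit fields
whose declared base types have different sizes. A small stand-alone C
illustration (the sizes mentioned are what the compilers typically produce, not
something the C standard guarantees):

    #include <stdio.h>

    struct mixed {
        short a : 4;   /* declared against a 16-bit base type */
        int   b : 4;   /* declared against a 32-bit base type */
    };

    int main(void)
    {
        /* GCC/MinGW typically pack a and b into one storage unit (sizeof
         * is usually 4), while MSVC starts a new unit whenever the base
         * type size changes (sizeof is usually 8).  That is why cfield.c
         * now keys the "continue bit field" test on the compiler rather
         * than on MS_WIN32 alone.
         */
        printf("sizeof(struct mixed) = %u\n", (unsigned)sizeof(struct mixed));
        return 0;
    }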
Modified: python/trunk/Modules/_ctypes/cfield.c
==============================================================================
--- python/trunk/Modules/_ctypes/cfield.c (original)
+++ python/trunk/Modules/_ctypes/cfield.c Wed Jul 5 11:13:56 2006
@@ -65,10 +65,10 @@
}
if (bitsize /* this is a bitfield request */
&& *pfield_size /* we have a bitfield open */
-#ifdef MS_WIN32
+#if defined(MS_WIN32) && !defined(__MINGW32__)
&& dict->size * 8 == *pfield_size /* MSVC */
#else
- && dict->size * 8 <= *pfield_size /* GCC */
+ && dict->size * 8 <= *pfield_size /* GCC, MINGW */
#endif
&& (*pbitofs + bitsize) <= *pfield_size) {
/* continue bit field */
From python-checkins at python.org Wed Jul 5 13:03:50 2006
From: python-checkins at python.org (thomas.wouters)
Date: Wed, 5 Jul 2006 13:03:50 +0200 (CEST)
Subject: [Python-checkins] r47237 - in python/trunk: Lib/string.py
Lib/test/test_pep292.py Misc/NEWS
Message-ID: <20060705110350.350F31E4003@bag.python.org>
Author: thomas.wouters
Date: Wed Jul 5 13:03:49 2006
New Revision: 47237
Modified:
python/trunk/Lib/string.py
python/trunk/Lib/test/test_pep292.py
python/trunk/Misc/NEWS
Log:
Fix bug in passing tuples to string.Template. All other values (with working
str() or repr()) would work, just not multi-value tuples. Probably not a
backport candidate, since it changes the behaviour of passing a
single-element tuple:
>>> string.Template("$foo").substitute(dict(foo=(1,)))
'(1,)'
versus
'1'
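The idiom behind the fix is worth spelling out: with the %-operator a bare tuple
is taken as the whole argument list, so it must be wrapped in a one-element
tuple to be formatted as a single value. A rough interactive illustration
(independent of the string.py code itself):

    >>> '%s' % ('ham', 'kung pao')       # tuple consumed as two arguments
    Traceback (most recent call last):
      ...
    TypeError: not all arguments converted during string formatting
    >>> '%s' % (('ham', 'kung pao'),)    # wrapped: the tuple itself is formatted
    "('ham', 'kung pao')"
    >>> '%s' % ((1,),)                   # single-element tuples keep their parens now
    '(1,)'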
Modified: python/trunk/Lib/string.py
==============================================================================
--- python/trunk/Lib/string.py (original)
+++ python/trunk/Lib/string.py Wed Jul 5 13:03:49 2006
@@ -161,7 +161,7 @@
val = mapping[named]
# We use this idiom instead of str() because the latter will
# fail if val is a Unicode containing non-ASCII characters.
- return '%s' % val
+ return '%s' % (val,)
if mo.group('escaped') is not None:
return self.delimiter
if mo.group('invalid') is not None:
@@ -186,13 +186,13 @@
try:
# We use this idiom instead of str() because the latter
# will fail if val is a Unicode containing non-ASCII
- return '%s' % mapping[named]
+ return '%s' % (mapping[named],)
except KeyError:
return self.delimiter + named
braced = mo.group('braced')
if braced is not None:
try:
- return '%s' % mapping[braced]
+ return '%s' % (mapping[braced],)
except KeyError:
return self.delimiter + '{' + braced + '}'
if mo.group('escaped') is not None:
Modified: python/trunk/Lib/test/test_pep292.py
==============================================================================
--- python/trunk/Lib/test/test_pep292.py (original)
+++ python/trunk/Lib/test/test_pep292.py Wed Jul 5 13:03:49 2006
@@ -58,6 +58,13 @@
s = Template('tim has eaten ${count} bags of ham today')
eq(s.substitute(d), 'tim has eaten 7 bags of ham today')
+ def test_tupleargs(self):
+ eq = self.assertEqual
+ s = Template('$who ate ${meal}')
+ d = dict(who=('tim', 'fred'), meal=('ham', 'kung pao'))
+ eq(s.substitute(d), "('tim', 'fred') ate ('ham', 'kung pao')")
+ eq(s.safe_substitute(d), "('tim', 'fred') ate ('ham', 'kung pao')")
+
def test_SafeTemplate(self):
eq = self.assertEqual
s = Template('$who likes ${what} for ${meal}')
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Wed Jul 5 13:03:49 2006
@@ -25,6 +25,10 @@
Library
-------
+- string.Template() now correctly handles tuple-values. Previously,
+ multi-value tuples would raise an exception and single-value tuples would
+ be treated as the value they contain, instead.
+
- Bug #822974: Honor timeout in telnetlib.{expect,read_until}
even if some data are received.
From python-checkins at python.org Wed Jul 5 14:19:21 2006
From: python-checkins at python.org (matt.fleming)
Date: Wed, 5 Jul 2006 14:19:21 +0200 (CEST)
Subject: [Python-checkins] r47238 - in sandbox/trunk/pdb/Doc: Makefile
Makefile.deps commontex commontex/boilerplate.tex
commontex/copyright.tex commontex/patchlevel.tex
commontex/underscore.sty html html/icons html/icons/blank.gif
html/icons/blank.png html/icons/contents.gif
html/icons/contents.png html/icons/index.gif
html/icons/index.png html/icons/modules.gif
html/icons/modules.png html/icons/next.gif
html/icons/next.png html/icons/previous.gif
html/icons/previous.png html/icons/pyfav.gif
html/icons/pyfav.png html/icons/up.gif html/icons/up.png
html/stdabout.dat html/style.css info info/Makefile lib
lib/Makefile lib/lib-ds.tex lib/lib.tex lib/libmpdb.tex
paper-letter perl perl/SynopsisTable.pm perl/l2hinit.perl
perl/manual.perl perl/python.perl texinputs
texinputs/fncychap.sty texinputs/manual.cls
texinputs/pypaper.sty texinputs/python.ist
texinputs/python.sty texinputs/underscore.sty tools
tools/buildindex.py tools/checkargs.pm tools/getversioninfo
tools/indfix.py tools/mkhowto tools/mkinfo
tools/node2label.pl tools/patchlevel.h tools/py2texi.el
tools/toc2bkm.py
Message-ID: <20060705121921.1FBCB1E4003@bag.python.org>
Author: matt.fleming
Date: Wed Jul 5 14:19:15 2006
New Revision: 47238
Added:
sandbox/trunk/pdb/Doc/
- copied from r47223, sandbox/trunk/pdb/doc/
sandbox/trunk/pdb/Doc/Makefile
sandbox/trunk/pdb/Doc/Makefile.deps
sandbox/trunk/pdb/Doc/commontex/
sandbox/trunk/pdb/Doc/commontex/boilerplate.tex
sandbox/trunk/pdb/Doc/commontex/copyright.tex
sandbox/trunk/pdb/Doc/commontex/patchlevel.tex
sandbox/trunk/pdb/Doc/commontex/underscore.sty
sandbox/trunk/pdb/Doc/html/
sandbox/trunk/pdb/Doc/html/icons/
sandbox/trunk/pdb/Doc/html/icons/blank.gif (contents, props changed)
sandbox/trunk/pdb/Doc/html/icons/blank.png (contents, props changed)
sandbox/trunk/pdb/Doc/html/icons/contents.gif (contents, props changed)
sandbox/trunk/pdb/Doc/html/icons/contents.png (contents, props changed)
sandbox/trunk/pdb/Doc/html/icons/index.gif (contents, props changed)
sandbox/trunk/pdb/Doc/html/icons/index.png (contents, props changed)
sandbox/trunk/pdb/Doc/html/icons/modules.gif (contents, props changed)
sandbox/trunk/pdb/Doc/html/icons/modules.png (contents, props changed)
sandbox/trunk/pdb/Doc/html/icons/next.gif (contents, props changed)
sandbox/trunk/pdb/Doc/html/icons/next.png (contents, props changed)
sandbox/trunk/pdb/Doc/html/icons/previous.gif (contents, props changed)
sandbox/trunk/pdb/Doc/html/icons/previous.png (contents, props changed)
sandbox/trunk/pdb/Doc/html/icons/pyfav.gif (contents, props changed)
sandbox/trunk/pdb/Doc/html/icons/pyfav.png (contents, props changed)
sandbox/trunk/pdb/Doc/html/icons/up.gif (contents, props changed)
sandbox/trunk/pdb/Doc/html/icons/up.png (contents, props changed)
sandbox/trunk/pdb/Doc/html/stdabout.dat
sandbox/trunk/pdb/Doc/html/style.css
sandbox/trunk/pdb/Doc/info/
sandbox/trunk/pdb/Doc/info/Makefile
sandbox/trunk/pdb/Doc/lib/
sandbox/trunk/pdb/Doc/lib/Makefile
sandbox/trunk/pdb/Doc/lib/lib-ds.tex
sandbox/trunk/pdb/Doc/lib/lib.tex
sandbox/trunk/pdb/Doc/lib/libmpdb.tex
sandbox/trunk/pdb/Doc/paper-letter/
sandbox/trunk/pdb/Doc/perl/
sandbox/trunk/pdb/Doc/perl/SynopsisTable.pm
sandbox/trunk/pdb/Doc/perl/l2hinit.perl
sandbox/trunk/pdb/Doc/perl/manual.perl
sandbox/trunk/pdb/Doc/perl/python.perl
sandbox/trunk/pdb/Doc/texinputs/
sandbox/trunk/pdb/Doc/texinputs/fncychap.sty
sandbox/trunk/pdb/Doc/texinputs/manual.cls
sandbox/trunk/pdb/Doc/texinputs/pypaper.sty
sandbox/trunk/pdb/Doc/texinputs/python.ist
sandbox/trunk/pdb/Doc/texinputs/python.sty
sandbox/trunk/pdb/Doc/texinputs/underscore.sty
sandbox/trunk/pdb/Doc/tools/
sandbox/trunk/pdb/Doc/tools/buildindex.py (contents, props changed)
sandbox/trunk/pdb/Doc/tools/checkargs.pm
sandbox/trunk/pdb/Doc/tools/getversioninfo (contents, props changed)
sandbox/trunk/pdb/Doc/tools/indfix.py (contents, props changed)
sandbox/trunk/pdb/Doc/tools/mkhowto (contents, props changed)
sandbox/trunk/pdb/Doc/tools/mkinfo (contents, props changed)
sandbox/trunk/pdb/Doc/tools/node2label.pl (contents, props changed)
sandbox/trunk/pdb/Doc/tools/patchlevel.h
sandbox/trunk/pdb/Doc/tools/py2texi.el
sandbox/trunk/pdb/Doc/tools/toc2bkm.py (contents, props changed)
Log:
Added the beginning of documentation
Added: sandbox/trunk/pdb/Doc/Makefile
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/Makefile Wed Jul 5 14:19:15 2006
@@ -0,0 +1,610 @@
+# Makefile for Python documentation
+# ---------------------------------
+#
+# See also the README file.
+#
+# This is a bit of a mess. The documents are identified by short names:
+# api -- Python/C API Reference Manual
+# doc -- Documenting Python
+# ext -- Extending and Embedding the Python Interpreter
+# lib -- Library Reference Manual
+# mac -- Macintosh Library Modules
+# ref -- Python Reference Manual
+# tut -- Python Tutorial
+# inst -- Installing Python Modules
+# dist -- Distributing Python Modules
+#
+# The LaTeX sources for each of these documents are in subdirectories
+# with the three-letter designations above as the directory names.
+#
+# The main target creates HTML for each of the documents. You can
+# also do "make lib" (etc.) to create the HTML versions of individual
+# documents.
+#
+# The document classes and styles are in the texinputs/ directory.
+# These define a number of macros that are similar in name and intent
+# as macros in Texinfo (e.g. \code{...} and \emph{...}), as well as a
+# number of environments for formatting function and data definitions.
+# Documentation for the macros is included in "Documenting Python"; see
+# http://www.python.org/doc/current/doc/doc.html, or the sources for
+# this document in the doc/ directory.
+#
+# Everything is processed by LaTeX. See the file `README' for more
+# information on the tools needed for processing.
+#
+# There's a problem with generating the index which has been solved by
+# a sed command applied to the index file. The shell script fix_hack
+# does this (the Makefile takes care of calling it).
+#
+# Additional targets attempt to convert selected LaTeX sources to
+# various other formats. These are generally site specific because
+# the tools used are all but universal. These targets are:
+#
+# ps -- convert all documents from LaTeX to PostScript
+# pdf -- convert all documents from LaTeX to the
+# Portable Document Format
+#
+# See the README file for more information on these targets.
+#
+# The formatted output is located in subdirectories. For PDF and
+# PostScript, look in the paper-$(PAPER)/ directory. For HTML, look in
+# the html/ directory. If you want to fix the GNU info process, look
+# in the info/ directory; please send patches to docs at python.org.
+
+# This Makefile only includes information on how to perform builds; for
+# dependency information, see Makefile.deps.
+srcdir=.
+DISTFILES=lib/lib.tex lib/libmpdb.tex Makefile.in Makefile.deps \
+ html
+
+PYTHON_SRC:=$(shell cd .. && pwd)
+PYTHON_DOC:=$(shell cd . && pwd)
+
+# Customization -- you *may* have to edit this
+
+# You could set this to a4:
+PAPER=letter
+
+# Ideally, you shouldn't need to edit beyond this point
+.PHONY: realclean distclean clean clobber test check
+
+
+INFODIR= $(srcdir)/info
+TOOLSDIR= $(srcdir)/tools
+
+# This is the *documentation* release, and is used to construct the
+# file names of the downloadable tarballs. It is initialized by the
+# getversioninfo script to ensure that the right version number is
+# used; the script will also write commontex/patchlevel.tex if that
+# doesn't exist or needs to be changed. Documents which depend on the
+# version number should use \input{patchlevel} and include
+# commontex/patchlevel.tex in their dependencies.
+RELEASE:=$(shell $(PYTHON) $(PYTHON_DOC)/tools/getversioninfo)
+
+PYTHON:= python
+DVIPS:= dvips -N0 -t $(PAPER)
+
+# This is ugly! The issue here is that there are two different levels
+# in the directory tree at which we execute mkhowto, so we can't
+# define it just once using a relative path (at least not with the
+# current implementation and Makefile structure). We use the GNUish
+# $(shell) function here to work around that restriction by
+# identifying mkhowto and the commontex/ directory using absolute paths.
+#
+# If your doc build fails immediately, you may need to switch to GNU make.
+# (e.g. OpenBSD needs package gmake installed; use gmake instead of make)
+PWD=$(shell pwd)
+
+# (The trailing colon in the value is needed; TeX places its default
+# set of paths at the location of the empty string in the path list.)
+TEXINPUTS:=$(PWD)/commontex:
+
+# The mkhowto script can be run from the checkout using the first
+# version of this variable definition, or from a preferred version
+# using the second version. The standard documentation is typically
+# built using the second flavor, where the preferred version is from
+# the Python CVS trunk.
+MKHOWTO= TEXINPUTS=$(TEXINPUTS) $(PYTHON) $(PWD)/tools/mkhowto
+
+MKDVI= $(MKHOWTO) --paper=$(PAPER) --dvi
+MKHTML= $(MKHOWTO) --html --about html/stdabout.dat \
+ --iconserver ../icons --favicon ../icons/pyfav.png \
+ --address $(PYTHONDOCS) --up-link ../index.html \
+ --up-title "Python Documentation Index" \
+ --global-module-index "../modindex.html" --dvips-safe
+MKPDF= $(MKHOWTO) --paper=$(PAPER) --pdf
+MKPS= $(MKHOWTO) --paper=$(PAPER) --ps
+
+BUILDINDEX=$(TOOLSDIR)/buildindex.py
+
+PYTHONDOCS="See About this document... for information on suggesting changes."
+HTMLBASE= file:`pwd`
+
+# The emacs binary used to build the info docs. GNU Emacs 21 is required.
+EMACS= emacs
+
+# The end of this should reflect the major/minor version numbers of
+# the release:
+WHATSNEW=whatsnew25
+
+# what's what
+MANDVIFILES= paper-$(PAPER)/lib.dvi
+
+MANPDFFILES= paper-$(PAPER)/lib.pdf
+
+HOWTOPDFFILES=
+
+MANPSFILES= paper-$(PAPER)/lib.ps
+
+DVIFILES= $(MANDVIFILES) $(HOWTODVIFILES)
+PDFFILES= $(MANPDFFILES) $(HOWTOPDFFILES)
+PSFILES= $(MANPSFILES) $(HOWTOPSFILES)
+
+HTMLCSSFILES=html/lib/lib.css
+
+ISILOCSSFILES=
+
+ALLCSSFILES=$(HTMLCSSFILES) $(ISILOCSSFILES)
+
+INDEXFILES=html/lib/lib.html
+
+ALLHTMLFILES=$(INDEXFILES)
+
+COMMONPERL= $(PYTHON_DOC)/perl/manual.perl $(PYTHON_DOC)/perl/python.perl $(PYTHON_DOC)/perl/l2hinit.perl
+
+ANNOAPI=api/refcounts.dat tools/anno-api.py
+
+include ./Makefile.deps
+
+# These must be declared phony since there
+# are directories with matching names:
+.PHONY: api doc lib ref tut inst dist install
+.PHONY: html info
+
+
+# Main target
+default: html
+all: html info dvi ps pdf
+
+dvi: $(DVIFILES)
+pdf: $(PDFFILES)
+ps: $(PSFILES)
+
+# For now, we don't install anything.
+install:
+
+# For now, we don't test/check anything.
+test check:
+
+world: ps pdf html distfiles
+
+
+# Rules to build PostScript and PDF formats
+.SUFFIXES: .dvi .ps
+
+.dvi.ps:
+ $(DVIPS) -o $@ $<
+
+
+# Targets for each document:
+# Python/C API Reference Manual
+paper-$(PAPER)/api.dvi: $(ANNOAPIFILES)
+ cd paper-$(PAPER) && $(MKDVI) api.tex
+
+paper-$(PAPER)/api.pdf: $(ANNOAPIFILES)
+ cd paper-$(PAPER) && $(MKPDF) api.tex
+
+paper-$(PAPER)/api.tex: api/api.tex
+ cp api/api.tex $@
+
+paper-$(PAPER)/abstract.tex: api/abstract.tex $(ANNOAPI)
+ $(PYTHON) $(TOOLSDIR)/anno-api.py -o $@ api/abstract.tex
+
+paper-$(PAPER)/concrete.tex: api/concrete.tex $(ANNOAPI)
+ $(PYTHON) $(TOOLSDIR)/anno-api.py -o $@ api/concrete.tex
+
+paper-$(PAPER)/exceptions.tex: api/exceptions.tex $(ANNOAPI)
+ $(PYTHON) $(TOOLSDIR)/anno-api.py -o $@ api/exceptions.tex
+
+paper-$(PAPER)/init.tex: api/init.tex $(ANNOAPI)
+ $(PYTHON) $(TOOLSDIR)/anno-api.py -o $@ api/init.tex
+
+paper-$(PAPER)/intro.tex: api/intro.tex
+ cp api/intro.tex $@
+
+paper-$(PAPER)/memory.tex: api/memory.tex $(ANNOAPI)
+ $(PYTHON) $(TOOLSDIR)/anno-api.py -o $@ api/memory.tex
+
+paper-$(PAPER)/newtypes.tex: api/newtypes.tex $(ANNOAPI)
+ $(PYTHON) $(TOOLSDIR)/anno-api.py -o $@ api/newtypes.tex
+
+paper-$(PAPER)/refcounting.tex: api/refcounting.tex $(ANNOAPI)
+ $(PYTHON) $(TOOLSDIR)/anno-api.py -o $@ api/refcounting.tex
+
+paper-$(PAPER)/utilities.tex: api/utilities.tex $(ANNOAPI)
+ $(PYTHON) $(TOOLSDIR)/anno-api.py -o $@ api/utilities.tex
+
+paper-$(PAPER)/veryhigh.tex: api/veryhigh.tex $(ANNOAPI)
+ $(PYTHON) $(TOOLSDIR)/anno-api.py -o $@ api/veryhigh.tex
+
+# Distributing Python Modules
+paper-$(PAPER)/dist.dvi: $(DISTFILES)
+ cd paper-$(PAPER) && $(MKDVI) ../dist/dist.tex
+
+paper-$(PAPER)/dist.pdf: $(DISTFILES)
+ cd paper-$(PAPER) && $(MKPDF) ../dist/dist.tex
+
+# Documenting Python
+paper-$(PAPER)/doc.dvi: $(DOCFILES)
+ cd paper-$(PAPER) && $(MKDVI) ../doc/doc.tex
+
+paper-$(PAPER)/doc.pdf: $(DOCFILES)
+ cd paper-$(PAPER) && $(MKPDF) ../doc/doc.tex
+
+# Extending and Embedding the Python Interpreter
+paper-$(PAPER)/ext.dvi: $(EXTFILES)
+ cd paper-$(PAPER) && $(MKDVI) ../ext/ext.tex
+
+paper-$(PAPER)/ext.pdf: $(EXTFILES)
+ cd paper-$(PAPER) && $(MKPDF) ../ext/ext.tex
+
+# Installing Python Modules
+paper-$(PAPER)/inst.dvi: $(INSTFILES)
+ cd paper-$(PAPER) && $(MKDVI) ../inst/inst.tex
+
+paper-$(PAPER)/inst.pdf: $(INSTFILES)
+ cd paper-$(PAPER) && $(MKPDF) ../inst/inst.tex
+
+# Python Library Reference
+paper-$(PAPER)/lib.dvi: $(LIBFILES)
+ cd paper-$(PAPER) && $(MKDVI) ../lib/lib.tex
+
+paper-$(PAPER)/lib.pdf: $(LIBFILES)
+ cd paper-$(PAPER) && $(MKPDF) ../lib/lib.tex
+
+# Macintosh Library Modules
+paper-$(PAPER)/mac.dvi: $(MACFILES)
+ cd paper-$(PAPER) && $(MKDVI) ../mac/mac.tex
+
+paper-$(PAPER)/mac.pdf: $(MACFILES)
+ cd paper-$(PAPER) && $(MKPDF) ../mac/mac.tex
+
+# Python Reference Manual
+paper-$(PAPER)/ref.dvi: $(REFFILES)
+ cd paper-$(PAPER) && $(MKDVI) ../ref/ref.tex
+
+paper-$(PAPER)/ref.pdf: $(REFFILES)
+ cd paper-$(PAPER) && $(MKPDF) ../ref/ref.tex
+
+# Python Tutorial
+paper-$(PAPER)/tut.dvi: $(TUTFILES)
+ cd paper-$(PAPER) && $(MKDVI) ../tut/tut.tex
+
+paper-$(PAPER)/tut.pdf: $(TUTFILES)
+ cd paper-$(PAPER) && $(MKPDF) ../tut/tut.tex
+
+# What's New in Python X.Y
+paper-$(PAPER)/$(WHATSNEW).dvi: whatsnew/$(WHATSNEW).tex
+ cd paper-$(PAPER) && $(MKDVI) ../whatsnew/$(WHATSNEW).tex
+
+paper-$(PAPER)/$(WHATSNEW).pdf: whatsnew/$(WHATSNEW).tex
+ cd paper-$(PAPER) && $(MKPDF) ../whatsnew/$(WHATSNEW).tex
+
+# The remaining part of the Makefile is concerned with various
+# conversions, as described above. See also the README file.
+
+info:
+ cd $(INFODIR) && $(MAKE) EMACS=$(EMACS) WHATSNEW=$(WHATSNEW)
+
+# Targets to convert the manuals to HTML using Nikos Drakos' LaTeX to
+# HTML converter. For more info on this program, see
+# .
+
+# Note that LaTeX2HTML inserts references to an icons directory in
+# each page that it generates. I have placed a copy of this directory
+# in the distribution to simplify the process of creating a
+# self-contained HTML distribution; for this purpose I have also added
+# a (trivial) index.html. Change the definition of $ICONSERVER in
+# perl/l2hinit.perl to use a different location for the icons directory.
+
+# If you have the standard LaTeX2HTML icons installed, the versions shipped
+# with this documentation should be stored in a separate directory and used
+# instead. The standard set does *not* include all the icons used in the
+# Python documentation.
+
+$(ALLCSSFILES): html/style.css
+ cp $< $@
+
+$(INDEXFILES): $(COMMONPERL) \
+ $(PYTHON_DOC)/html/stdabout.dat $(PYTHON_DOC)/tools/node2label.pl
+
+html/acks.html: ACKS $(TOOLSDIR)/support.py $(TOOLSDIR)/mkackshtml
+ $(PYTHON) $(TOOLSDIR)/mkackshtml --address $(PYTHONDOCS) \
+ --favicon icons/pyfav.png \
+ --output html/acks.html $@
+
+html/modindex.html: $(TOOLSDIR)/support.py $(TOOLSDIR)/mkmodindex
+html/modindex.html: html/dist/dist.html
+html/modindex.html: html/lib/lib.html html/mac/mac.html
+ cd html && \
+ $(PYTHON) ../$(TOOLSDIR)/mkmodindex --columns 3 \
+ --output modindex.html --address $(PYTHONDOCS) \
+ --favicon icons/pyfav.png \
+ dist/modindex.html \
+ lib/modindex.html mac/modindex.html
+
+html: $(ALLHTMLFILES)
+
+doc: html/doc/doc.html html/doc/doc.css
+html/doc/doc.html: $(DOCFILES)
+ $(MKHTML) --dir html/doc doc/doc.tex
+
+lib: html/lib/lib.html html/lib/lib.css
+html/lib/lib.html: $(LIBFILES)
+ $(MKHTML) --dir html/lib lib/lib.tex
+
+tut: html/tut/tut.html html/tut/tut.css
+html/tut/tut.html: $(TUTFILES)
+ $(MKHTML) --dir html/tut --numeric --split 3 tut/tut.tex
+
+inst: html/inst/inst.html html/inst/inst.css
+html/inst/inst.html: $(INSTFILES) $(PYTHON_DOC)/perl/distutils.perl
+ $(MKHTML) --dir html/inst --split 4 inst/inst.tex
+
+whatsnew: html/whatsnew/$(WHATSNEW).html
+html/whatsnew/$(WHATSNEW).html: whatsnew/$(WHATSNEW).tex
+ $(MKHTML) --dir html/whatsnew --split 4 whatsnew/$(WHATSNEW).tex
+
+
+# The iSilo format is used by the iSilo document reader for PalmOS devices.
+
+ISILOINDEXFILES=
+
+$(ISILOINDEXFILES): $(COMMONPERL) html/stdabout.dat $(PYTHON_DOC)/perl/isilo.perl
+
+# webchecker needs an extra flag to process the huge index from the libref
+WEBCHECKER=$(PYTHON) ../Tools/webchecker/webchecker.py
+HTMLBASE= file:`pwd`/html
+
+webcheck: $(ALLHTMLFILES)
+ $(WEBCHECKER) $(HTMLBASE)/api/
+ $(WEBCHECKER) $(HTMLBASE)/doc/
+ $(WEBCHECKER) $(HTMLBASE)/ext/
+ $(WEBCHECKER) -m290000 $(HTMLBASE)/lib/
+ $(WEBCHECKER) $(HTMLBASE)/mac/
+ $(WEBCHECKER) $(HTMLBASE)/ref/
+ $(WEBCHECKER) $(HTMLBASE)/tut/
+ $(WEBCHECKER) $(HTMLBASE)/dist/
+ $(WEBCHECKER) $(HTMLBASE)/inst/
+ $(WEBCHECKER) $(HTMLBASE)/whatsnew/
+
+fastwebcheck: $(ALLHTMLFILES)
+ $(WEBCHECKER) -x $(HTMLBASE)/api/
+ $(WEBCHECKER) -x $(HTMLBASE)/doc/
+ $(WEBCHECKER) -x $(HTMLBASE)/ext/
+ $(WEBCHECKER) -x -m290000 $(HTMLBASE)/lib/
+ $(WEBCHECKER) -x $(HTMLBASE)/mac/
+ $(WEBCHECKER) -x $(HTMLBASE)/ref/
+ $(WEBCHECKER) -x $(HTMLBASE)/tut/
+ $(WEBCHECKER) -x $(HTMLBASE)/dist/
+ $(WEBCHECKER) -x $(HTMLBASE)/inst/
+ $(WEBCHECKER) -x $(HTMLBASE)/whatsnew/
+
+
+# Release packaging targets:
+
+paper-$(PAPER)/README: $(PSFILES) $(TOOLSDIR)/getpagecounts
+ cd paper-$(PAPER) && ../$(TOOLSDIR)/getpagecounts -r $(RELEASE) >../$@
+
+info-$(RELEASE).tgz: info
+ cd $(INFODIR) && tar cf - README python.dir python-*.info* \
+ | gzip -9 >../$@
+
+info-$(RELEASE).tar.bz2: info
+ cd $(INFODIR) && tar cf - README python.dir python-*.info* \
+ | bzip2 -9 >../$@
+
+latex-$(RELEASE).tgz:
+ $(PYTHON) $(TOOLSDIR)/mksourcepkg --gzip $(RELEASE)
+
+latex-$(RELEASE).tar.bz2:
+ $(PYTHON) $(TOOLSDIR)/mksourcepkg --bzip2 $(RELEASE)
+
+latex-$(RELEASE).zip:
+ rm -f $@
+ $(PYTHON) $(TOOLSDIR)/mksourcepkg --zip $(RELEASE)
+
+pdf-$(PAPER)-$(RELEASE).tar: $(PDFFILES)
+ rm -f $@
+ mkdir Python-Docs-$(RELEASE)
+ cp paper-$(PAPER)/*.pdf Python-Docs-$(RELEASE)
+ tar cf $@ Python-Docs-$(RELEASE)
+ rm -r Python-Docs-$(RELEASE)
+
+pdf-$(PAPER)-$(RELEASE).tgz: pdf-$(PAPER)-$(RELEASE).tar
+ gzip -9 <$? >$@
+
+pdf-$(PAPER)-$(RELEASE).tar.bz2: pdf-$(PAPER)-$(RELEASE).tar
+ bzip2 -9 <$? >$@
+
+pdf-$(PAPER)-$(RELEASE).zip: pdf
+ rm -f $@
+ mkdir Python-Docs-$(RELEASE)
+ cp paper-$(PAPER)/*.pdf Python-Docs-$(RELEASE)
+ zip -q -r -9 $@ Python-Docs-$(RELEASE)
+ rm -r Python-Docs-$(RELEASE)
+
+postscript-$(PAPER)-$(RELEASE).tar: $(PSFILES) paper-$(PAPER)/README
+ rm -f $@
+ mkdir Python-Docs-$(RELEASE)
+ cp paper-$(PAPER)/*.ps Python-Docs-$(RELEASE)
+ cp paper-$(PAPER)/README Python-Docs-$(RELEASE)
+ tar cf $@ Python-Docs-$(RELEASE)
+ rm -r Python-Docs-$(RELEASE)
+
+postscript-$(PAPER)-$(RELEASE).tar.bz2: postscript-$(PAPER)-$(RELEASE).tar
+ bzip2 -9 <$< >$@
+
+postscript-$(PAPER)-$(RELEASE).tgz: postscript-$(PAPER)-$(RELEASE).tar
+ gzip -9 <$< >$@
+
+postscript-$(PAPER)-$(RELEASE).zip: $(PSFILES) paper-$(PAPER)/README
+ rm -f $@
+ mkdir Python-Docs-$(RELEASE)
+ cp paper-$(PAPER)/*.ps Python-Docs-$(RELEASE)
+ cp paper-$(PAPER)/README Python-Docs-$(RELEASE)
+ zip -q -r -9 $@ Python-Docs-$(RELEASE)
+ rm -r Python-Docs-$(RELEASE)
+
+HTMLPKGFILES=*.html */*.css */*.html */*.gif */*.png
+
+html-$(RELEASE).tar: $(ALLHTMLFILES) $(HTMLCSSFILES)
+ mkdir Python-Docs-$(RELEASE)
+ -find html -name '*.gif' -size 0 | xargs rm -f
+ cd html && tar cf ../temp.tar $(HTMLPKGFILES)
+ cd Python-Docs-$(RELEASE) && tar xf ../temp.tar
+ rm temp.tar
+ tar cf html-$(RELEASE).tar Python-Docs-$(RELEASE)
+ rm -r Python-Docs-$(RELEASE)
+
+html-$(RELEASE).tgz: html-$(RELEASE).tar
+ gzip -9 <$? >$@
+
+html-$(RELEASE).tar.bz2: html-$(RELEASE).tar
+ bzip2 -9 <$? >$@
+
+html-$(RELEASE).zip: $(ALLHTMLFILES) $(HTMLCSSFILES)
+ rm -f $@
+ mkdir Python-Docs-$(RELEASE)
+ cd html && tar cf ../temp.tar $(HTMLPKGFILES)
+ cd Python-Docs-$(RELEASE) && tar xf ../temp.tar
+ rm temp.tar
+ zip -q -r -9 $@ Python-Docs-$(RELEASE)
+ rm -r Python-Docs-$(RELEASE)
+
+isilo-$(RELEASE).zip: isilo
+ rm -f $@
+ mkdir Python-Docs-$(RELEASE)
+ cp isilo/python-*.pdb Python-Docs-$(RELEASE)
+ zip -q -r -9 $@ Python-Docs-$(RELEASE)
+ rm -r Python-Docs-$(RELEASE)
+
+
+# convenience targets:
+
+tarhtml: html-$(RELEASE).tgz
+tarinfo: info-$(RELEASE).tgz
+tarps: postscript-$(PAPER)-$(RELEASE).tgz
+tarpdf: pdf-$(PAPER)-$(RELEASE).tgz
+tarlatex: latex-$(RELEASE).tgz
+
+tarballs: tarpdf tarps tarhtml
+
+ziphtml: html-$(RELEASE).zip
+zipps: postscript-$(PAPER)-$(RELEASE).zip
+zippdf: pdf-$(PAPER)-$(RELEASE).zip
+ziplatex: latex-$(RELEASE).zip
+
+zips: zippdf zipps ziphtml
+
+bziphtml: html-$(RELEASE).tar.bz2
+bzipinfo: info-$(RELEASE).tar.bz2
+bzipps: postscript-$(PAPER)-$(RELEASE).tar.bz2
+bzippdf: pdf-$(PAPER)-$(RELEASE).tar.bz2
+bziplatex: latex-$(RELEASE).tar.bz2
+
+bzips: bzippdf bzipps bziphtml
+
+disthtml: bziphtml ziphtml
+distinfo: bzipinfo
+distps: bzipps zipps
+distpdf: bzippdf zippdf
+distlatex: bziplatex ziplatex
+
+# We use the "pkglist" target at the end of these to ensure the
+# package list is updated after building either of these; this seems a
+# reasonable compromise between only building it for distfiles or
+# having to build it manually. Doing it here allows the packages for
+# distribution to be built using either of
+# make distfiles && make PAPER=a4 paperdist
+# make paperdist && make PAPER=a4 distfiles
+# The small amount of additional work is a small price to pay for not
+# having to remember which order to do it in. ;)
+paperdist: distpdf distps pkglist
+edist: disthtml distinfo zipisilo pkglist
+
+# The pkglist.html file is used as part of the download.html page on
+# python.org; it is not used as intermediate input here or as part of
+# the packages created.
+pkglist:
+ $(TOOLSDIR)/mkpkglist >pkglist.html
+
+distfiles: paperdist edist
+ $(TOOLSDIR)/mksourcepkg --bzip2 --zip $(RELEASE)
+ $(TOOLSDIR)/mkpkglist >pkglist.html
+
+
+# Housekeeping targets
+
+# Remove temporary files; all except the following:
+# - sources: .tex, .bib, .sty, *.cls
+# - useful results: .dvi, .pdf, .ps, .texi, .info
+clean:
+ -rm -f html-$(RELEASE).tar
+ -rm -f $(INDEXFILES)
+# cd $(INFODIR) && $(MAKE) clean
+
+# Remove temporaries as well as final products
+clobber: clean
+ -rm -f html-$(RELEASE).tgz info-$(RELEASE).tgz
+ -rm -f pdf-$(RELEASE).tgz postscript-$(RELEASE).tgz
+ -rm -f latex-$(RELEASE).tgz html-$(RELEASE).zip
+ -rm -f pdf-$(RELEASE).zip postscript-$(RELEASE).zip
+ -rm -f $(DVIFILES) $(PSFILES) $(PDFFILES)
+ cd $(INFODIR) && $(MAKE) clobber
+ -rm -f paper-$(PAPER)/*.tex paper-$(PAPER)/*.ind paper-$(PAPER)/*.idx
+ -rm -f paper-$(PAPER)/*.l2h paper-$(PAPER)/*.how paper-$(PAPER)/README
+ -rm -rf html/index.html html/lib/
+
+realclean distclean: clobber
+
+distdir: $(DISTFILES)
+ @srcdirstrip=`echo "$(srcdir)" | sed 's|.|.|g'`; \
+ topsrcdirstrip=`echo "$(top_srcdir)" | sed 's|.|.|g'`; \
+ list='$(DISTFILES)'; for file in $$list; do \
+ case $$file in \
+ $(srcdir)/*) file=`echo "$$file" | sed "s|^$$srcdirstrip/||"`;; \
+ $(top_srcdir)/*) file=`echo "$$file" | sed "s|^$$topsrcdirstrip/|$(top_builddir)/|"`;; \
+ esac; \
+ if test -f $$file || test -d $$file; then d=.; else d=$(srcdir); fi; \
+ dir=`echo "$$file" | sed -e 's,/[^/]*$$,,'`; \
+ if test "$$dir" != "$$file" && test "$$dir" != "."; then \
+ dir="/$$dir"; \
+ $(mkdir_p) "$(distdir)$$dir"; \
+ else \
+ dir=''; \
+ fi; \
+ if test -d $$d/$$file; then \
+ if test -d $(srcdir)/$$file && test $$d != $(srcdir); then \
+ cp -pR $(srcdir)/$$file $(distdir)$$dir || exit 1; \
+ fi; \
+ cp -pR $$d/$$file $(distdir)$$dir || exit 1; \
+ else \
+ test -f $(distdir)/$$file \
+ || cp -p $$d/$$file $(distdir)/$$file \
+ || exit 1; \
+ fi; \
+ done
Added: sandbox/trunk/pdb/Doc/Makefile.deps
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/Makefile.deps Wed Jul 5 14:19:15 2006
@@ -0,0 +1,48 @@
+# -*- Makefile -*-
+# LaTeX source dependencies.
+
+COMMONSTYLES= $(PYTHON_DOC)/texinputs/python.sty \
+ $(PYTHON_DOC)/texinputs/pypaper.sty
+
+INDEXSTYLES=$(PYTHON_DOC)/texinputs/python.ist
+
+COMMONTEX=$(PYTHON_DOC)/commontex/copyright.tex \
+ $(PYTHON_DOC)/commontex/license.tex \
+ $(PYTHON_DOC)/commontex/patchlevel.tex \
+ $(PYTHON_DOC)/commontex/boilerplate.tex
+
+MANSTYLES= $(PYTHON_DOC)/texinputs/fncychap.sty \
+ $(PYTHON_DOC)/texinputs/manual.cls \
+ $(COMMONSTYLES)
+
+HOWTOSTYLES=
+APIFILES=
+
+# These files are generated from those listed above, and are used to
+# generate the typeset versions of the manuals. The list is defined
+# here to make it easier to ensure parallelism.
+ANNOAPIFILES= $(MANSTYLES) $(INDEXSTYLES) $(COMMONTEX) \
+ $(PYTHON_DOC)/commontex/reportingbugs.tex
+
+DOCFILES= $(HOWTOSTYLES) \
+ $(PYTHON_DOC)/commontex/boilerplate.tex \
+ $(PYTHON_DOC)/texinputs/ltxmarkup.sty \
+ doc/doc.tex
+
+EXTFILES=
+
+TUTFILES=
+
+# LaTeX source files for the Python Reference Manual
+REFFILES=
+
+# LaTeX source files for the Python Library Reference
+LIBFILES= lib/libmpdb.tex
+
+
+# LaTeX source files for Macintosh Library Modules.
+MACFILES=
+
+INSTFILES =
+
+DISTFILES =
Added: sandbox/trunk/pdb/Doc/commontex/boilerplate.tex
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/commontex/boilerplate.tex Wed Jul 5 14:19:15 2006
@@ -0,0 +1,8 @@
+\author{Revised by Matt Fleming\\
+ }
+\authoraddress{
+ Email: \email{mattjfleming at googlemail.com}
+}
+
+\date{\today} % XXX update before final release!
+\input{patchlevel} % include Python version information
Added: sandbox/trunk/pdb/Doc/commontex/copyright.tex
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/commontex/copyright.tex Wed Jul 5 14:19:15 2006
@@ -0,0 +1,14 @@
+Copyright \copyright{} 2001-2004 Python Software Foundation.
+All rights reserved.
+
+Copyright \copyright{} 2000 BeOpen.com.
+All rights reserved.
+
+Copyright \copyright{} 1995-2000 Corporation for National Research Initiatives.
+All rights reserved.
+
+Copyright \copyright{} 1991-1995 Stichting Mathematisch Centrum.
+All rights reserved.
+
+See the end of this document for complete license and permissions
+information.
Added: sandbox/trunk/pdb/Doc/commontex/patchlevel.tex
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/commontex/patchlevel.tex Wed Jul 5 14:19:15 2006
@@ -0,0 +1,6 @@
+% This file is generated by ../tools/getversioninfo;
+% do not edit manually.
+
+\release{2.4.2}
+\setreleaseinfo{mpdb}
+\setshortversion{2.4}
Added: sandbox/trunk/pdb/Doc/commontex/underscore.sty
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/commontex/underscore.sty Wed Jul 5 14:19:15 2006
@@ -0,0 +1,232 @@
+% underscore.sty 12-Oct-2001 Donald Arseneau asnd at triumf.ca
+% Make the "_" character print as "\textunderscore" in text.
+% Copyright 1998,2001 Donald Arseneau; Distribute freely if unchanged.
+% Instructions follow after the definitions.
+
+\ProvidesPackage{underscore}[2001/10/12]
+
+\begingroup
+ \catcode`\_=\active
+ \gdef_{% \relax % No relax gives a small vulnerability in alignments
+ \ifx\if@safe@actives\iftrue % must be outermost test!
+ \string_%
+ \else
+ \ifx\protect\@typeset@protect
+ \ifmmode \sb \else \BreakableUnderscore \fi
+ \else
+ \ifx\protect\@unexpandable@protect \noexpand_%
+ \else \protect_%
+ \fi\fi
+ \fi}
+\endgroup
+
+% At begin: set catcode; fix \long \ttdefault so I can use it in comparisons;
+\AtBeginDocument{%
+ {\immediate\write\@auxout{\catcode\number\string`\_ \string\active}}%
+ \catcode\string`\_\string=\active
+ \edef\ttdefault{\ttdefault}%
+}
+
+\newcommand{\BreakableUnderscore}{\leavevmode\nobreak\hskip\z@skip
+ \ifx\f@family\ttdefault \string_\else \textunderscore\fi
+ \usc@dischyph\nobreak\hskip\z@skip}
+
+\DeclareRobustCommand{\_}{%
+ \ifmmode \nfss@text{\textunderscore}\else \BreakableUnderscore \fi}
+
+\let\usc@dischyph\@dischyph
+\DeclareOption{nohyphen}{\def\usc@dischyph{\discretionary{}{}{}}}
+\DeclareOption{strings}{\catcode`\_=\active}
+
+\ProcessOptions
+\ifnum\catcode`\_=\active\else \endinput \fi
+
+%%%%%%%% Redefine commands that use character strings %%%%%%%%
+
+\@ifundefined{UnderscoreCommands}{\let\UnderscoreCommands\@empty}{}
+\expandafter\def\expandafter\UnderscoreCommands\expandafter{%
+ \UnderscoreCommands
+ \do\include \do\includeonly
+ \do\@input \do\@iinput \do\InputIfFileExists
+ \do\ref \do\pageref \do\newlabel
+ \do\bibitem \do\@bibitem \do\cite \do\nocite \do\bibcite
+}
+
+% Macro to redefine a macro to pre-process its string argument
+% with \protect -> \string.
+\def\do#1{% Avoid double processing if user includes command twice!
+ \@ifundefined{US\string_\expandafter\@gobble\string#1}{%
+ \edef\@tempb{\meaning#1}% Check if macro is just a protection shell...
+ \def\@tempc{\protect}%
+ \edef\@tempc{\meaning\@tempc\string#1\space\space}%
+ \ifx\@tempb\@tempc % just a shell: hook into the protected inner command
+ \expandafter\do
+ \csname \expandafter\@gobble\string#1 \expandafter\endcsname
+ \else % Check if macro takes an optional argument
+ \def\@tempc{\@ifnextchar[}%
+ \edef\@tempa{\def\noexpand\@tempa####1\meaning\@tempc}%
+ \@tempa##2##3\@tempa{##2\relax}%
+ \edef\@tempb{\meaning#1\meaning\@tempc}%
+ \edef\@tempc{\noexpand\@tempd \csname
+ US\string_\expandafter\@gobble\string#1\endcsname}%
+ \if \expandafter\@tempa\@tempb \relax 12\@tempa % then no optional arg
+ \@tempc #1\US@prot
+ \else % There is optional arg
+ \@tempc #1\US@protopt
+ \fi
+ \fi
+ }{}}
+
+\def\@tempd#1#2#3{\let#1#2\def#2{#3#1}}
+
+\def\US@prot#1#2{\let\@@protect\protect \let\protect\string
+ \edef\US@temp##1{##1{#2}}\restore@protect\US@temp#1}
+\def\US@protopt#1{\@ifnextchar[{\US@protarg#1}{\US@prot#1}}
+\def\US@protarg #1[#2]{\US@prot{{#1[#2]}}}
+
+\UnderscoreCommands
+\let\do\relax \let\@tempd\relax % un-do
+
+%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
+
+\endinput
+
+underscore.sty 12-Oct-2001 Donald Arseneau
+
+Features:
+~~~~~~~~~
+\_ prints an underscore so that the hyphenation of constituent words
+is not affected and hyphenation is permitted after the underscore.
+For example, "compound\_fracture" hyphenates as com- pound_- frac- ture.
+If you prefer the underscore to break without a hyphen (but still with
+the same rules for explicit hyphen-breaks) then use the [nohyphen]
+package option.
+
+A simple _ acts just like \_ in text mode, but makes a subscript in
+math mode: activation_energy $E_a$
+
+Both forms use an underscore character if the font encoding contains
+one (e.g., "\usepackage[T1]{fontenc}" or typewriter fonts in any encoding),
+but they use a rule if there is no proper character.
+
+Deficiencies:
+~~~~~~~~~~~~~
+The skips and penalties ruin any kerning with the underscore character
+(when a character is used). However, there doesn't seem to be much, if
+any, such kerning in the ec fonts, and there is never any kerning with
+a rule.
+
+You must avoid "_" in file names and in cite or ref tags, or you must use
+the babel package, with its active-character controls, or you must give
+the [strings] option, which attempts to redefine several commands (and
+may not work perfectly). Even without the [strings] option or babel, you
+can use occasional underscores like: "\include{file\string_name}".
+
+Option: [strings]
+~~~~~~~~~~~~~~~~~
+The default operation is quite simple and needs no customization; but
+you must avoid using "_" in any place where LaTeX uses an argument as
+a string of characters for some control function or as a name. These
+include the tags for \cite and \ref, file names for \input, \include,
+and \includegraphics, environment names, counter names, and placement
+parameters (like "[t]"). The problem with these contexts is that they
+are `moving arguments' but LaTeX does not `switch on' the \protect
+mechanism for them.
+
+If you need to use the underscore character in these places, the package
+option [strings] is provided to redefine commands taking a string argument
+so that the argument is protected (with \protect -> \string). The list
+of commands is given in "\UnderscoreCommands", with "\do" before each,
+covering \cite, \ref, \input, and their variants. Not included are many
+commands regarding font names, everything with counter names, environment
+names, page styles, and versions of \ref and \cite defined by external
+packages (e.g. \vref and \citeyear).
+
+You can add to the list of supported commands by defining \UnderscoreCommands
+before loading this package; e.g.
+
+ \usepackage{chicago}
+ \newcommand{\UnderscoreCommands}{% (\cite already done)
+ \do\citeNP \do\citeA \do\citeANP \do\citeN \do\shortcite
+ \do\shortciteNP \do\shortciteA \do\shortciteANP \do\shortciteN
+ \do\citeyear \do\citeyearNP
+ }
+ \usepackage[strings]{underscore}
+
+Not all commands can be supported this way! Only commands that take a
+string argument *first* can be protected. One optional argument before
+the string argument is also permitted, as exemplified by \cite: both
+\cite{tags} and \cite[text]{tags} are allowed. A command like
+\@addtoreset which takes two counter names as arguments could not
+be protected by adding it to \UnderscoreCommands.
+
+!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+!! When you use the [strings] option, you must load this package !!
+!! last (or nearly last). !!
+!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+
+There are two reasons: 1) The redefinitions done for protection must come
+after other packages define their customized versions of those commands.
+2) The [strings] option requires the _ character to be activated immediately
+in order for the cite and ref tags to be read properly from the .aux file
+as plain strings, and this catcode setting might disrupt other packages.
+
+The babel package implements a protection mechanism for many commands,
+and will be a complete fix for most documents without the [strings] option.
+Many add-on packages are compatible with babel, so they will get the
+strings protection also. However, there are several commands that are
+not covered by babel, but can easily be supported by the [strings] and
+\UnderscoreCommands mechanism. Beware that using both [strings] and babel
+may lead to conflicts, but does appear to work (load babel last).
+
+Implementation Notes:
+~~~~~~~~~~~~~~~~~~~~~
+The first setting of "_" to be an active character is performed in a local
+group so as to not interfere with other packages. The catcode setting
+is repeated with \AtBeginDocument so the definition is in effect for the
+text. However, the catcode setting is repeated immediately when the
+[strings] option is detected.
+
+The definition of the active "_" is essentially:
+ \ifmmode \sb \else \BreakableUnderscore \fi
+where "\sb" retains the normal subscript meaning of "_" and where
+"\BreakableUnderscore" is essentially "\_". The rest of the definition
+handles the "\protect"ion without causing \relax to be inserted before
+the character.
+
+\BreakableUnderscore uses "\nobreak\hskip\z at skip" to separate the
+underscore from surrounding words, thus allowing TeX to hyphenate them,
+but preventing free breaks around the underscore. Next, it checks the
+current font family, and uses the underscore character from tt fonts or
+otherwise \textunderscore (which is a character or rule depending on
+the font encoding). After the underscore, it inserts a discretionary
+hyphenation point as "\usc@dischyph", which is usually just "\-"
+except that it still works in the tabbing environment, although it
+will give "\discretionary{}{}{}" under the [nohyphen] option. After
+that, another piece of non-breaking interword glue is inserted.
+Ordinarily, the comparison "\ifx\f@family\ttdefault" will always fail
+because \ttdefault is `long' where \f@family is not (boooo hisss), but
+\ttdefault is redefined to be non-long by "\AtBeginDocument".
+
+The "\_" command is then defined to use "\BreakableUnderscore".
+
+If the [strings] option is not given, then that is all!
+
+Under the [strings] option, the list of special commands is processed to:
+- retain the original command as \US_command (\US_ref)
+- redefine the command as \US@prot\US_command for ordinary commands
+ (\ref -> \US@prot\US_ref) or as \US@protopt\US_command when an optional
+ argument is possible (\bibitem -> \US@protopt\US_bibitem).
+- self-protecting commands (\cite) retain their self-protection.
+Diagnosing the state of the pre-existing command is done by painful
+contortions involving \meaning.
+
+\US@prot and \US@protopt read the argument, process it with \protect
+enabled, then invoke the saved \US_command.
+
+Modifications:
+~~~~~~~~~~~~~~
+12-Oct-2001 Babel (safe@actives) compatibility and [nohyphen] option.
+
+Test file integrity: ASCII 32-57, 58-126: !"#$%&'()*+,-./0123456789
+:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~
Added: sandbox/trunk/pdb/Doc/html/icons/blank.gif
==============================================================================
Binary file. No diff available.
Added: sandbox/trunk/pdb/Doc/html/icons/blank.png
==============================================================================
Binary file. No diff available.
Added: sandbox/trunk/pdb/Doc/html/icons/contents.gif
==============================================================================
Binary file. No diff available.
Added: sandbox/trunk/pdb/Doc/html/icons/contents.png
==============================================================================
Binary file. No diff available.
Added: sandbox/trunk/pdb/Doc/html/icons/index.gif
==============================================================================
Binary file. No diff available.
Added: sandbox/trunk/pdb/Doc/html/icons/index.png
==============================================================================
Binary file. No diff available.
Added: sandbox/trunk/pdb/Doc/html/icons/modules.gif
==============================================================================
Binary file. No diff available.
Added: sandbox/trunk/pdb/Doc/html/icons/modules.png
==============================================================================
Binary file. No diff available.
Added: sandbox/trunk/pdb/Doc/html/icons/next.gif
==============================================================================
Binary file. No diff available.
Added: sandbox/trunk/pdb/Doc/html/icons/next.png
==============================================================================
Binary file. No diff available.
Added: sandbox/trunk/pdb/Doc/html/icons/previous.gif
==============================================================================
Binary file. No diff available.
Added: sandbox/trunk/pdb/Doc/html/icons/previous.png
==============================================================================
Binary file. No diff available.
Added: sandbox/trunk/pdb/Doc/html/icons/pyfav.gif
==============================================================================
Binary file. No diff available.
Added: sandbox/trunk/pdb/Doc/html/icons/pyfav.png
==============================================================================
Binary file. No diff available.
Added: sandbox/trunk/pdb/Doc/html/icons/up.gif
==============================================================================
Binary file. No diff available.
Added: sandbox/trunk/pdb/Doc/html/icons/up.png
==============================================================================
Binary file. No diff available.
Added: sandbox/trunk/pdb/Doc/html/stdabout.dat
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/html/stdabout.dat Wed Jul 5 14:19:15 2006
@@ -0,0 +1,45 @@
+<-- -*- HTML -*->
+
This document was generated using the
+ LaTeX2HTML translator.
+
The application of
+ LaTeX2HTML to the Python
+ documentation has been heavily tailored by Fred L. Drake,
+ Jr. Original navigation icons were contributed by Christopher
+ Petrilli.
+
+ If you are able to provide suggested text, either to replace
+ existing incorrect or unclear material, or additional text to
+ supplement what's already available, we'd appreciate the
+ contribution. There's no need to worry about text markup; our
+ documentation team will gladly take care of that.
+
+
+
For any of these channels, please be sure not to send HTML email.
+ Thanks.
+
Added: sandbox/trunk/pdb/Doc/html/style.css
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/html/style.css Wed Jul 5 14:19:15 2006
@@ -0,0 +1,243 @@
+/*
+ * The first part of this is the standard CSS generated by LaTeX2HTML,
+ * with the "empty" declarations removed.
+ */
+
+/* Century Schoolbook font is very similar to Computer Modern Math: cmmi */
+.math { font-family: "Century Schoolbook", serif; }
+.math i { font-family: "Century Schoolbook", serif;
+ font-weight: bold }
+.boldmath { font-family: "Century Schoolbook", serif;
+ font-weight: bold }
+
+/*
+ * Implement both fixed-size and relative sizes.
+ *
+ * I think these can be safely removed, as it doesn't appear that
+ * LaTeX2HTML ever generates these, even though these are carried
+ * over from the LaTeX2HTML stylesheet.
+ */
+small.xtiny { font-size : xx-small; }
+small.tiny { font-size : x-small; }
+small.scriptsize { font-size : smaller; }
+small.footnotesize { font-size : small; }
+big.xlarge { font-size : large; }
+big.xxlarge { font-size : x-large; }
+big.huge { font-size : larger; }
+big.xhuge { font-size : xx-large; }
+
+/*
+ * Document-specific styles come next;
+ * these are added for the Python documentation.
+ *
+ * Note that the size specifications for the H* elements are because
+ * Netscape on Solaris otherwise doesn't get it right; they all end up
+ * the normal text size.
+ */
+
+body { color: #000000;
+ background-color: #ffffff; }
+
+a:link:active { color: #ff0000; }
+a:link:hover { background-color: #bbeeff; }
+a:visited:hover { background-color: #bbeeff; }
+a:visited { color: #551a8b; }
+a:link { color: #0000bb; }
+
+h1, h2, h3, h4, h5, h6 { font-family: avantgarde, sans-serif;
+ font-weight: bold; }
+h1 { font-size: 180%; }
+h2 { font-size: 150%; }
+h3, h4 { font-size: 120%; }
+
+/* These are section titles used in navigation links, so make sure we
+ * match the section header font here, even if not the weight.
+ */
+.sectref { font-family: avantgarde, sans-serif; }
+/* And the label before the titles in navigation: */
+.navlabel { font-size: 85%; }
+
+
+/* LaTeX2HTML insists on inserting <br> elements into headers which
+ * are marked with \label. This little bit of CSS magic ensures that
+ * these elements don't cause spurious whitespace to be added.
+ */
+h1>br, h2>br, h3>br,
+h4>br, h5>br, h6>br { display: none; }
+
+code, tt { font-family: "lucida typewriter", lucidatypewriter,
+ monospace; }
+var { font-family: times, serif;
+ font-style: italic;
+ font-weight: normal; }
+
+.Unix { font-variant: small-caps; }
+
+.typelabel { font-family: lucida, sans-serif; }
+
+.navigation td { background-color: #99ccff;
+ font-weight: bold;
+ font-family: avantgarde, sans-serif;
+ font-size: 110%; }
+
+div.warning { background-color: #fffaf0;
+ border: thin solid black;
+ padding: 1em;
+ margin-left: 2em;
+ margin-right: 2em; }
+
+div.warning .label { font-family: sans-serif;
+ font-size: 110%;
+ margin-right: 0.5em; }
+
+div.note { background-color: #fffaf0;
+ border: thin solid black;
+ padding: 1em;
+ margin-left: 2em;
+ margin-right: 2em; }
+
+div.note .label { margin-right: 0.5em;
+ font-family: sans-serif; }
+
+address { font-size: 80%; }
+.release-info { font-style: italic;
+ font-size: 80%; }
+
+.titlegraphic { vertical-align: top; }
+
+.verbatim pre { color: #00008b;
+ font-family: "lucida typewriter", lucidatypewriter,
+ monospace;
+ font-size: 90%; }
+.verbatim { margin-left: 2em; }
+.verbatim .footer { padding: 0.05in;
+ font-size: 85%;
+ background-color: #99ccff;
+ margin-right: 0.5in; }
+
+.grammar { background-color: #99ccff;
+ margin-right: 0.5in;
+ padding: 0.05in; }
+.grammar-footer { padding: 0.05in;
+ font-size: 85%; }
+.grammartoken { font-family: "lucida typewriter", lucidatypewriter,
+ monospace; }
+
+.productions { background-color: #bbeeff; }
+.productions a:active { color: #ff0000; }
+.productions a:link:hover { background-color: #99ccff; }
+.productions a:visited:hover { background-color: #99ccff; }
+.productions a:visited { color: #551a8b; }
+.productions a:link { color: #0000bb; }
+.productions table { vertical-align: baseline;
+ empty-cells: show; }
+.productions > table td,
+.productions > table th { padding: 2px; }
+.productions > table td:first-child,
+.productions > table td:last-child {
+ font-family: "lucida typewriter",
+ lucidatypewriter,
+ monospace;
+ }
+/* same as the second selector above, but expressed differently for Opera */
+.productions > table td:first-child + td + td {
+ font-family: "lucida typewriter",
+ lucidatypewriter,
+ monospace;
+ vertical-align: baseline;
+ }
+.productions > table td:first-child + td {
+ padding-left: 1em;
+ padding-right: 1em;
+ }
+.productions > table tr { vertical-align: baseline; }
+
+.email { font-family: avantgarde, sans-serif; }
+.mailheader { font-family: avantgarde, sans-serif; }
+.mimetype { font-family: avantgarde, sans-serif; }
+.newsgroup { font-family: avantgarde, sans-serif; }
+.url { font-family: avantgarde, sans-serif; }
+.file { font-family: avantgarde, sans-serif; }
+.guilabel { font-family: avantgarde, sans-serif; }
+
+.realtable { border-collapse: collapse;
+ border-color: black;
+ border-style: solid;
+ border-width: 0px 0px 2px 0px;
+ empty-cells: show;
+ margin-left: auto;
+ margin-right: auto;
+ padding-left: 0.4em;
+ padding-right: 0.4em;
+ }
+.realtable tbody { vertical-align: baseline; }
+.realtable tfoot { display: table-footer-group; }
+.realtable thead { background-color: #99ccff;
+ border-width: 0px 0px 2px 1px;
+ display: table-header-group;
+ font-family: avantgarde, sans-serif;
+ font-weight: bold;
+ vertical-align: baseline;
+ }
+.realtable thead :first-child {
+ border-width: 0px 0px 2px 0px;
+ }
+.realtable thead th { border-width: 0px 0px 2px 1px }
+.realtable td,
+.realtable th { border-color: black;
+ border-style: solid;
+ border-width: 0px 0px 1px 1px;
+ padding-left: 0.4em;
+ padding-right: 0.4em;
+ }
+.realtable td:first-child,
+.realtable th:first-child {
+ border-left-width: 0px;
+ vertical-align: baseline;
+ }
+.center { text-align: center; }
+.left { text-align: left; }
+.right { text-align: right; }
+
+.refcount-info { font-style: italic; }
+.refcount-info .value { font-weight: bold;
+ color: #006600; }
+
+/*
+ * Some decoration for the "See also:" blocks, in part inspired by some of
+ * the styling on Lars Marius Garshol's XSA pages.
+ * (The blue in the navigation bars is #99CCFF.)
+ */
+.seealso { background-color: #fffaf0;
+ border: thin solid black;
+ padding: 0pt 1em 4pt 1em; }
+
+.seealso > .heading { font-size: 110%;
+ font-weight: bold; }
+
+/*
+ * Class 'availability' is used for module availability statements at
+ * the top of modules.
+ */
+.availability .platform { font-weight: bold; }
+
+
+/*
+ * Additional styles for the distutils package.
+ */
+.du-command { font-family: monospace; }
+.du-option { font-family: avantgarde, sans-serif; }
+.du-filevar { font-family: avantgarde, sans-serif;
+ font-style: italic; }
+.du-xxx:before { content: "** ";
+ font-weight: bold; }
+.du-xxx:after { content: " **";
+ font-weight: bold; }
+
+
+/*
+ * Some specialization for printed output.
+ */
+ at media print {
+ .online-navigation { display: none; }
+ }
Added: sandbox/trunk/pdb/Doc/info/Makefile
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/info/Makefile Wed Jul 5 14:19:15 2006
@@ -0,0 +1,44 @@
+# Generate the Python "info" documentation.
+
+TOPDIR=..
+TOOLSDIR=$(TOPDIR)/tools
+HTMLDIR=$(TOPDIR)/html
+
+# The emacs binary used to build the info docs. GNU Emacs 21 is required.
+EMACS=emacs
+
+MKINFO=$(TOOLSDIR)/mkinfo
+SCRIPTS=$(TOOLSDIR)/checkargs.pm $(TOOLSDIR)/mkinfo $(TOOLSDIR)/py2texi.el
+
+# set VERSION to code the VERSION number into the info file name
+# allowing installation of more than one set of python info docs
+# into the same directory
+VERSION=
+
+all: check-emacs-version \
+ lib
+# doc inst
+
+lib: python$(VERSION)-lib.info
+
+whatsnew: $(WHATSNEW)
+$(WHATSNEW): python$(VERSION)-$(WHATSNEW).info
+
+check-emacs-version:
+ @v="`$(EMACS) --version 2>&1 | egrep '^(GNU |X)Emacs [12]*'`"; \
+ if `echo "$$v" | grep '^GNU Emacs 21' >/dev/null 2>&1`; then \
+ echo "Using $(EMACS) to build the info docs"; \
+ else \
+ echo "GNU Emacs 21 is required to build the info docs"; \
+ echo "Found $$v"; \
+ false; \
+ fi
+
+python$(VERSION)-lib.info: ../lib/lib.tex $(SCRIPTS)
+ EMACS=$(EMACS) $(MKINFO) $< $*.texi $@
+
+clean:
+ rm -f *.texi~ *.texi
+
+clobber: clean
+ rm -f *.texi python*-*.info python*-*.info-[0-9]*
Added: sandbox/trunk/pdb/Doc/lib/Makefile
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/lib/Makefile Wed Jul 5 14:19:15 2006
@@ -0,0 +1,7 @@
+#=============================================================
+# $Id: Makefile,v 1.1 2006/01/28 10:50:36 rockyb Exp $
+#=============================================================
+# Whatever it is you want to do, it should be handled by the
+# by the main (parent) Makefile. So reissue make from there.
+all %:
+ $(MAKE) -C .. $@
Added: sandbox/trunk/pdb/Doc/lib/lib-ds.tex
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/lib/lib-ds.tex Wed Jul 5 14:19:15 2006
@@ -0,0 +1,54 @@
+\documentclass{manual}
+
+% NOTE: this file controls which chapters/sections of the library
+% manual are actually printed. It is easy to customize your manual
+% by commenting out sections that you're not interested in.
+
+\title{Python Library Reference for the Improved Python Debugger}
+
+\input{boilerplate}
+
+\makeindex % tell \index to actually write the
+ % .idx file
+\makemodindex % ... and the module index as well.
+
+
+\begin{document}
+
+\maketitle
+
+\ifhtml
+\chapter*{Front Matter\label{front}}
+\fi
+
+\input{copyright}
+
+\begin{abstract}
+
+\noindent
+Python is an extensible, interpreted, object-oriented programming
+language. It supports a wide range of applications, from simple text
+processing scripts to interactive Web browsers.
+
+We describe here only the Improved Python Debugger. The rest of the
+\ulink{\module{Python Library Reference}}
+{http://docs.python.org/lib/lib.html} should be consulted for other
+standard Python modules, including the original \ulink{\module{Python
+Debugger}}{http://docs.python.org/lib/module-pdb.html} (\file{pdb.py}).
+
+\end{abstract}
+
+\tableofcontents
+
+ % Chapter title:
+
+% =============
+% DEVELOPMENT TOOLS
+% =============
+% % Software development support
+
+\renewcommand{\baselinestretch}{3.0}\normalsize
+\input{libmpdb} % The Python Debugger
+
+
+\end{document}
Added: sandbox/trunk/pdb/Doc/lib/lib.tex
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/lib/lib.tex Wed Jul 5 14:19:15 2006
@@ -0,0 +1,57 @@
+\documentclass{manual}
+
+% NOTE: this file controls which chapters/sections of the library
+% manual are actually printed. It is easy to customize your manual
+% by commenting out sections that you're not interested in.
+
+\title{Python Library Reference for The Improved Python Debugger}
+
+\input{boilerplate}
+
+\makeindex % tell \index to actually write the
+ % .idx file
+\makemodindex % ... and the module index as well.
+
+
+\begin{document}
+
+\maketitle
+
+\ifhtml
+\chapter*{Front Matter\label{front}}
+\fi
+
+\input{copyright}
+
+\begin{abstract}
+
+\noindent
+Python is an extensible, interpreted, object-oriented programming
+language. It supports a wide range of applications, from simple text
+processing scripts to interactive Web browsers.
+
+We describe here only the Improved Python Debugger. The rest of the
+\ulink{\module{Python Library Reference}}
+{http://docs.python.org/lib/lib.html} should be consulted for other
+standard Python modules, including the original \ulink{\module{Python
+Debugger}}{http://docs.python.org/lib/module-pdb.html} (\file{pdb.py}).
+
+\end{abstract}
+
+\tableofcontents
+
+ % Chapter title:
+
+% =============
+% DEVELOPMENT TOOLS
+% =============
+% % Software development support
+
+\input{libmpdb} % The Python Debugger
+
+%begin{latexonly}
+\renewcommand{\indexname}{Index}
+%end{latexonly}
+\input{lib.ind} % Index
+
+\end{document}
Added: sandbox/trunk/pdb/Doc/lib/libmpdb.tex
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/lib/libmpdb.tex Wed Jul 5 14:19:15 2006
@@ -0,0 +1,81 @@
+\chapter{The Improved Python Debugger \label{chapter-mpdb}}
+
+\declaremodule{}{mpdb} % not standard, in Python
+
+\platform{Unix, Windows}
+
+
+\moduleauthor{Matt Fleming}{mattjfleming at googlemail.com}
+
+\sectionauthor{Matt Fleming}{mattjfleming at googlemail.com}
+
+
+
+% Leave at least one blank line after this, to simplify ad-hoc tools
+% that are sometimes used to massage these files.
+\modulesynopsis{An Improved Python Debugger}
+
+
+This module is a Google Summer of Code project undertaken in the summer of
+2006 by Matt Fleming.
+The \module{mpdb} module defines operations for debugging programs remotely,
+debugging threaded programs and debugging separate processes. It builds on the
+work done by Rocky Bernstein in his Extended Python Debugger (\module{Pydb}), which,
+in turn, builds upon the Python Debugger that is part of the Python
+Standard Library. Rocky also kindly agreed to mentor this project.
+
+
+
+The \module{mpdb} module defines the following functions:
+
+
+\begin{funcdesc}{pdbserver}{address\optional{, mpdb}}
+Set up a pdbserver at \var{address} that allows debuggers to
+connect to this debugging session. \var{address} should be of the form
+\code{'protocol protocol-specific-address'}. For example, to use TCP at host
+'myhost' and port '8765', \var{address} would be \code{'tcp myhost:8765'}.
+To connect to a serial device, \var{address} might be \code{'serial /dev/ttyC0'}.
+The optional argument \var{mpdb} is an \class{MPdb} instance to use
+for this pdbserver. If \var{mpdb} is omitted, a new \class{MPdb} instance
+is created.
+\end{funcdesc}
+
+\begin{funcdesc}{target}{address}
+Connect to a pdbserver at \var{address}. \var{address} should use the same
+format as \function{pdbserver}.
+\end{funcdesc}
+
+\begin{excdesc}{Exit}
+Exception raised when the debugger is going to immediately exit.
+\end{excdesc}
+
+\section{MPdb Class}
+\label{mpdb-class}
+
+MPdb objects have the following methods:
+
+\begin{methoddesc}[mpdb]{remote_onecmd}{self, line}
+This method is used by an \class{MPdb} instance that is connected
+to a remote machine. Instead of the debugger instance interpreting commands,
+all commands are sent directly to the remote machine, where they are
+interpreted and executed, and the results are sent back to the client.
+\end{methoddesc}
+
+\begin{methoddesc}[mpdb]{do_pdbserver}{self, addr}
+Set up a pdbserver at \var{addr} and wait for incoming connections. \var{addr}
+must be a string containing a protocol to use and a protocol-specific address.
+\end{methoddesc}
+
+\section{Thread Debugging}
+\label{thread-debug}
+This section provides information on Python's thread debugging facilities and
+how \module{mpdb} makes use of them.
+
+\section{Remote Debugging}
+This section describes how \module{mpdb} handles debugging remotely.
+\label{remote-debug}
+
+\section{External Process Debugging}
+This section describes how \module{mpdb} debugs processes that are external
+to the process in which \module{mpdb} is being run.
+\label{proc-debug}
\ No newline at end of file
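To make the new API concrete, here is a minimal usage sketch based only on
the \funcdesc and \methoddesc entries in libmpdb.tex above. It is
illustrative, not part of the checkin: it assumes the sandbox mpdb package
is importable, uses only the documented pdbserver() and target() functions,
and the host name, port and serial device in the address strings are
made-up examples.

    # Hypothetical sketch based on the libmpdb.tex descriptions above.
    import mpdb

    # On the machine running the program being debugged: publish a
    # pdbserver using the 'protocol protocol-specific-address' form.
    mpdb.pdbserver('tcp myhost:8765')

    # On the machine running the front-end debugger: connect to it.
    # Commands entered here are interpreted and executed on the remote
    # side (see MPdb.remote_onecmd) and the results are sent back.
    mpdb.target('tcp myhost:8765')

A serial transport would use the same calls with an address such as
'serial /dev/ttyC0', as noted in the pdbserver() description.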
Added: sandbox/trunk/pdb/Doc/perl/SynopsisTable.pm
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/perl/SynopsisTable.pm Wed Jul 5 14:19:15 2006
@@ -0,0 +1,95 @@
+package SynopsisTable;
+
+sub new{
+ return bless {names=>'', info=>{}, file=>''};
+}
+
+sub declare{
+ my($self,$name,$key,$type) = @_;
+ if ($self->{names}) {
+ $self->{names} .= ",$name";
+ }
+ else {
+ $self->{names} .= "$name";
+ }
+ $self->{info}{$name} = "$key,$type,";
+}
+
+# The 'file' attribute is used to store the filename of the node in which
+# the table will be presented; this assumes that each table will be presented
+# only once, which works for the current use of this object.
+
+sub set_file{
+ my($self, $filename) = @_;
+ $self->{file} = "$filename";
+}
+
+sub get_file{
+ my $self = shift;
+ return $self->{file};
+}
+
+sub set_synopsis{
+ my($self,$name,$synopsis) = @_;
+ my($key,$type,$unused) = split ',', $self->{info}{$name}, 3;
+ $self->{info}{$name} = "$key,$type,$synopsis";
+}
+
+sub get{
+ my($self,$name) = @_;
+ return split /,/, $self->{info}{$name}, 3;
+}
+
+sub show{
+ my $self = shift;
+ my $name;
+ print "names: ", $self->{names}, "\n\n";
+ foreach $name (split /,/, $self->{names}) {
+ my($key,$type,$synopsis) = $self->get($name);
+ print "$name($key) is $type: $synopsis\n";
+ }
+}
+
+sub tohtml{
+ my $self = shift;
+ my $oddrow = 1;
+ my $data = "
\n";
+ $data;
+}
+
+
+package testSynopsisTable;
+
+sub test{
+ # this little test is mostly to debug the stuff above, since this is
+ # my first Perl "object".
+ my $st = SynopsisTable->new();
+ $st->declare("sample", "sample", "standard");
+ $st->set_synopsis("sample", "This is a little synopsis....");
+ $st->declare("copy_reg", "copyreg", "standard");
+ $st->set_synopsis("copy_reg", "pickle support stuff");
+ $st->show();
+
+ print "\n\n";
+
+ my $st2 = SynopsisTable->new();
+ $st2->declare("st2module", "st2module", "built-in");
+ $st2->set_synopsis("st2module", "silly little synopsis");
+ $st2->show();
+}
+
+1; # This must be the last line -- Perl is bogus!
Added: sandbox/trunk/pdb/Doc/perl/l2hinit.perl
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/perl/l2hinit.perl Wed Jul 5 14:19:15 2006
@@ -0,0 +1,784 @@
+# LaTeX2HTML support base for use with Python documentation.
+
+package main;
+
+use L2hos;
+
+$HTML_VERSION = 4.0;
+
+$MAX_LINK_DEPTH = 2;
+$ADDRESS = '';
+
+$NO_FOOTNODE = 1;
+$NUMBERED_FOOTNOTES = 1;
+
+# Python documentation uses section numbers to support references to match
+# in the printed and online versions.
+#
+$SHOW_SECTION_NUMBERS = 1;
+
+$ICONSERVER = '.';
+$IMAGE_TYPE = 'gif';
+
+# Control where the navigation bars should show up:
+$TOP_NAVIGATION = 1;
+$BOTTOM_NAVIGATION = 1;
+$AUTO_NAVIGATION = 0;
+
+$BODYTEXT = '';
+$CHILDLINE = "\n
\n"
+ . "\n"
+ . get_version_text()
+ . "\n";
+}
+
+sub add_link {
+ # Returns a pair (iconic link, textual link)
+ my($icon, $current_file, @link) = @_;
+ my($dummy, $file, $title) = split($delim,
+ $section_info{join(' ', at link)});
+ if ($icon =~ /\/) {
+ my $r = get_my_icon($1);
+ $icon =~ s/\/$r/;
+ }
+ if ($title && ($file ne $current_file)) {
+ $title = purify($title);
+ $title = get_first_words($title, $WORDS_IN_NAVIGATION_PANEL_TITLES);
+ return (make_href($file, $icon), make_href($file, "$title"))
+ }
+ elsif ($icon eq get_my_icon('up') && $EXTERNAL_UP_LINK) {
+ return (make_href($EXTERNAL_UP_LINK, $icon),
+ make_href($EXTERNAL_UP_LINK, "$EXTERNAL_UP_TITLE"))
+ }
+ elsif ($icon eq get_my_icon('previous')
+ && $EXTERNAL_PREV_LINK && $EXTERNAL_PREV_TITLE) {
+ return (make_href($EXTERNAL_PREV_LINK, $icon),
+ make_href($EXTERNAL_PREV_LINK, "$EXTERNAL_PREV_TITLE"))
+ }
+ elsif ($icon eq get_my_icon('next')
+ && $EXTERNAL_DOWN_LINK && $EXTERNAL_DOWN_TITLE) {
+ return (make_href($EXTERNAL_DOWN_LINK, $icon),
+ make_href($EXTERNAL_DOWN_LINK, "$EXTERNAL_DOWN_TITLE"))
+ }
+ return (&inactive_img($icon), "");
+}
+
+sub add_special_link($$$) {
+ my($icon, $file, $current_file) = @_;
+ if ($icon =~ /\/) {
+ my $r = get_my_icon($1);
+ $icon =~ s/\/$r/;
+ }
+ return (($file && ($file ne $current_file))
+ ? make_href($file, $icon)
+ : undef)
+}
+
+# The img_tag() function seems only to be called with the parameter
+# 'anchor_invisible_mark', which we want to turn into ''. Since
+# replace_icon_marks() is the only interesting caller, and all it really
+# does is call img_tag(), we can just define the hook alternative to be
+# a no-op instead.
+#
+sub replace_icons_hook {}
+
+sub do_cmd_arabic {
+ # get rid of that nasty ...
+ my($ctr, $val, $id, $text) = &read_counter_value($_[0]);
+ return ($val ? farabic($val) : "0") . $text;
+}
+
+
+sub gen_index_id($$) {
+ # this is used to ensure common index key generation and a stable sort
+ my($str, $extra) = @_;
+ sprintf('%s###%s%010d', $str, $extra, ++$global{'max_id'});
+}
+
+sub insert_index($$$$$) {
+ my($mark, $datafile, $columns, $letters, $prefix) = @_;
+ my $prog = "$myrootdir/tools/buildindex.py";
+ my $index;
+ if ($letters) {
+ $index = `$prog --columns $columns --letters $datafile`;
+ }
+ else {
+ $index = `$prog --columns $columns $datafile`;
+ }
+ if (!s/$mark/$prefix$index/) {
+ print "\nCould not locate index mark: $mark";
+ }
+}
+
+sub add_idx() {
+ print "\nBuilding HTML for the index ...";
+ close(IDXFILE);
+ insert_index($idx_mark, 'index.dat', $INDEX_COLUMNS, 1, '');
+}
+
+
+$idx_module_mark = '';
+$idx_module_title = 'Module Index';
+
+sub add_module_idx() {
+ print "\nBuilding HTML for the module index ...";
+ my $key;
+ my $first = 1;
+ my $prevplat = '';
+ my $allthesame = 1;
+ my $prefix = '';
+ foreach $key (keys %Modules) {
+ $key =~ s/([a-zA-Z0-9._]*)<\/tt>/$1/;
+ my $plat = "$ModulePlatforms{$key}";
+ $plat = ''
+ if ($plat eq $IGNORE_PLATFORM_ANNOTATION);
+ if (!$first) {
+ $allthesame = 0
+ if ($prevplat ne $plat);
+ }
+ else { $first = 0; }
+ $prevplat = $plat;
+ }
+ open(MODIDXFILE, '>modindex.dat') || die "\n$!\n";
+ foreach $key (keys %Modules) {
+ # dump the line in the data file; just use a dummy seqno field
+ my $nkey = $1;
+ my $moditem = "$Modules{$key}";
+ my $plat = '';
+ $key =~ s/([a-zA-Z0-9._]*)<\/tt>/$1/;
+ if ($ModulePlatforms{$key} && !$allthesame) {
+ $plat = (" ($ModulePlatforms{$key}"
+ . ')');
+ }
+ print MODIDXFILE $moditem . $IDXFILE_FIELD_SEP
+ . "$key$plat###\n";
+ }
+ close(MODIDXFILE);
+
+ if ($GLOBAL_MODULE_INDEX) {
+ $prefix = < This index only lists modules documented in this manual.
+ The Global Module
+ Index lists all modules that are documented in this set
+ of manuals.
+MODULE_INDEX_PREFIX
+ }
+ if (!$allthesame) {
+ $prefix .= < Some module names are followed by an annotation indicating what
+platform they are available on.
+
+PLAT_DISCUSS
+ }
+ insert_index($idx_module_mark, 'modindex.dat', $MODULE_INDEX_COLUMNS, 0,
+ $prefix);
+}
+
+# replace both indexes as needed:
+sub add_idx_hook {
+ add_idx() if (/$idx_mark/);
+ process_python_state();
+ if ($MODULE_INDEX_FILE) {
+ local ($_);
+ open(MYFILE, "<$MODULE_INDEX_FILE");
+ sysread(MYFILE, $_, 1024*1024);
+ close(MYFILE);
+ add_module_idx();
+ open(MYFILE,">$MODULE_INDEX_FILE");
+ print MYFILE $_;
+ close(MYFILE);
+ }
+}
+
+
+# In addition to the standard stuff, add label to allow named node files and
+# support suppression of the page complete (for HTML Help use).
+$MY_CONTENTS_PAGE = '';
+sub do_cmd_tableofcontents {
+ local($_) = @_;
+ $TITLE = $toc_title;
+ $tocfile = $CURRENT_FILE;
+ my($closures, $reopens) = preserve_open_tags();
+ anchor_label('contents', $CURRENT_FILE, $_); # this is added
+ $MY_CONTENTS_PAGE = "$CURRENT_FILE";
+ join('', "\\tableofchildlinks[off]", $closures
+ , make_section_heading($toc_title, 'h2'), $toc_mark
+ , $reopens, $_);
+}
+# In addition to the standard stuff, add label to allow named node files.
+sub do_cmd_listoffigures {
+ local($_) = @_;
+ $TITLE = $lof_title;
+ $loffile = $CURRENT_FILE;
+ my($closures, $reopens) = preserve_open_tags();
+ anchor_label('lof', $CURRENT_FILE, $_); # this is added
+ join('', " \n", $closures
+ , make_section_heading($lof_title, 'h2'), $lof_mark
+ , $reopens, $_);
+}
+# In addition to the standard stuff, add label to allow named node files.
+sub do_cmd_listoftables {
+ local($_) = @_;
+ $TITLE = $lot_title;
+ $lotfile = $CURRENT_FILE;
+ my($closures, $reopens) = preserve_open_tags();
+ anchor_label('lot', $CURRENT_FILE, $_); # this is added
+ join('', " \n", $closures
+ , make_section_heading($lot_title, 'h2'), $lot_mark
+ , $reopens, $_);
+}
+# In addition to the standard stuff, add label to allow named node files.
+sub do_cmd_textohtmlinfopage {
+ local($_) = @_;
+ if ($INFO) { #
+ anchor_label("about",$CURRENT_FILE,$_); # this is added
+ } #
+ my $the_version = ''; # and the rest is
+ if ($t_date) { # mostly ours
+ $the_version = ",\n$t_date";
+ if ($PACKAGE_VERSION) {
+ $the_version .= ", Release $PACKAGE_VERSION$RELEASE_INFO";
+ }
+ }
+ my $about;
+ open(ABOUT, "<$ABOUT_FILE") || die "\n$!\n";
+ sysread(ABOUT, $about, 1024*1024);
+ close(ABOUT);
+ $_ = (($INFO == 1)
+ ? join('',
+ $close_all,
+ "$t_title$the_version\n",
+ $about,
+ $open_all, $_)
+ : join('', $close_all, $INFO,"\n", $open_all, $_));
+ $_;
+}
+
+$GENERAL_INDEX_FILE = '';
+$MODULE_INDEX_FILE = '';
+
+# $idx_mark will be replaced with the real index at the end
+sub do_cmd_textohtmlindex {
+ local($_) = @_;
+ $TITLE = $idx_title;
+ $idxfile = $CURRENT_FILE;
+ $GENERAL_INDEX_FILE = "$CURRENT_FILE";
+ if (%index_labels) { make_index_labels(); }
+ if (($SHORT_INDEX) && (%index_segment)) { make_preindex(); }
+ else { $preindex = ''; }
+ my $heading = make_section_heading($idx_title, 'h2') . $idx_mark;
+ my($pre, $post) = minimize_open_tags($heading);
+ anchor_label('genindex',$CURRENT_FILE,$_); # this is added
+ return " \n" . $pre . $_;
+}
+
+# $idx_module_mark will be replaced with the real index at the end
+sub do_cmd_textohtmlmoduleindex {
+ local($_) = @_;
+ $TITLE = $idx_module_title;
+ anchor_label('modindex', $CURRENT_FILE, $_);
+ $MODULE_INDEX_FILE = "$CURRENT_FILE";
+ $_ = ('' . make_section_heading($idx_module_title, 'h2')
+ . $idx_module_mark . $_);
+ return $_;
+}
+
+# The bibliography and the index should be treated as separate
+# sections in their own HTML files. The \bibliography{} command acts
+# as a sectioning command that has the desired effect. But when the
+# bibliography is constructed manually using the thebibliography
+# environment, or when using the theindex environment it is not
+# possible to use the normal sectioning mechanism. This subroutine
+# inserts a \bibliography{} or a dummy \textohtmlindex command just
+# before the appropriate environments to force sectioning.
+
+# XXX This *assumes* that if there are two {theindex} environments,
+# the first is the module index and the second is the standard
+# index. This is sufficient for the current Python documentation,
+# but that's about it.
+
+sub add_bbl_and_idx_dummy_commands {
+ my $id = $global{'max_id'};
+
+ if (/[\\]tableofcontents/) {
+ $HAVE_TABLE_OF_CONTENTS = 1;
+ }
+ s/([\\]begin\s*$O\d+$C\s*thebibliography)/$bbl_cnt++; $1/eg;
+ s/([\\]begin\s*$O\d+$C\s*thebibliography)/$id++; "\\bibliography$O$id$C$O$id$C $1"/geo;
+ my(@parts) = split(/\\begin\s*$O\d+$C\s*theindex/);
+ if (scalar(@parts) == 3) {
+ # Be careful to re-write the string in place, since $_ is *not*
+ # returned explicity; *** nasty side-effect dependency! ***
+ print "\nadd_bbl_and_idx_dummy_commands ==> adding general index";
+ print "\nadd_bbl_and_idx_dummy_commands ==> adding module index";
+ my $rx = "([\\\\]begin\\s*$O\\d+$C\\s*theindex[\\s\\S]*)"
+ . "([\\\\]begin\\s*$O\\d+$C\\s*theindex)";
+ s/$rx/\\textohtmlmoduleindex $1 \\textohtmlindex $2/o;
+ # Add a button to the navigation areas:
+ $CUSTOM_BUTTONS .= (''
+ . get_my_icon('modules')
+ . '');
+ $HAVE_MODULE_INDEX = 1;
+ $HAVE_GENERAL_INDEX = 1;
+ }
+ elsif (scalar(@parts) == 2) {
+ print "\nadd_bbl_and_idx_dummy_commands ==> adding general index";
+ my $rx = "([\\\\]begin\\s*$O\\d+$C\\s*theindex)";
+ s/$rx/\\textohtmlindex $1/o;
+ $HAVE_GENERAL_INDEX = 1;
+ }
+ elsif (scalar(@parts) == 1) {
+ print "\nadd_bbl_and_idx_dummy_commands ==> no index found";
+ $CUSTOM_BUTTONS .= get_my_icon('blank');
+ $global{'max_id'} = $id; # not sure why....
+ s/([\\]begin\s*$O\d+$C\s*theindex)/\\textohtmlindex $1/o;
+ s/[\\]printindex/\\textohtmlindex /o;
+ }
+ else {
+ die "\n\nBad number of index environments!\n\n";
+ }
+ #----------------------------------------------------------------------
+ lib_add_bbl_and_idx_dummy_commands()
+ if defined(&lib_add_bbl_and_idx_dummy_commands);
+}
+
+# The bibliographic references, the appendices, the lists of figures
+# and tables etc. must appear in the contents table at the same level
+# as the outermost sectioning command. This subroutine finds what is
+# the outermost level and sets the above to the same level;
+
+sub set_depth_levels {
+ # Sets $outermost_level
+ my $level;
+ #RRM: do not alter user-set value for $MAX_SPLIT_DEPTH
+ foreach $level ("part", "chapter", "section", "subsection",
+ "subsubsection", "paragraph") {
+ last if (($outermost_level) = /\\($level)$delimiter_rx/);
+ }
+ $level = ($outermost_level ? $section_commands{$outermost_level} :
+ do {$outermost_level = 'section'; 3;});
+
+ #RRM: but calculate value for $MAX_SPLIT_DEPTH when a $REL_DEPTH was given
+ if ($REL_DEPTH && $MAX_SPLIT_DEPTH) {
+ $MAX_SPLIT_DEPTH = $level + $MAX_SPLIT_DEPTH;
+ } elsif (!($MAX_SPLIT_DEPTH)) { $MAX_SPLIT_DEPTH = 1 };
+
+ %unnumbered_section_commands = ('tableofcontents' => $level,
+ 'listoffigures' => $level,
+ 'listoftables' => $level,
+ 'bibliography' => $level,
+ 'textohtmlindex' => $level,
+ 'textohtmlmoduleindex' => $level);
+ $section_headings{'textohtmlmoduleindex'} = 'h1';
+
+ %section_commands = (%unnumbered_section_commands,
+ %section_commands);
+
+ make_sections_rx();
+}
+
+
+# This changes the markup used for {verbatim} environments, and is the
+# best way I've found that ensures the
goes on the outside of the
+#
...
.
+#
+# Note that this *must* be done in the init file, not the python.perl
+# style support file. The %declarations must be set before
+# initialize() is called in the main LaTeX2HTML script (which happens
+# before style files are loaded).
+#
+%declarations = ('preform' => '
',
+ %declarations);
+
+
+# This is a modified version of what's provided by LaTeX2HTML; see the
+# comment on the middle stanza for an explanation of why we keep our
+# own version.
+#
+# This routine must be called once on the text only,
+# else it will "eat up" sensitive constructs.
+sub text_cleanup {
+ # MRO: replaced $* with /m
+ s/(\s*\n){3,}/\n\n/gom; # Replace consecutive blank lines with one
+ s/<(\/?)P>\s*(\w)/<$1P>\n$2/gom; # clean up paragraph starts and ends
+ s/$O\d+$C//go; # Get rid of bracket id's
+ s/$OP\d+$CP//go; # Get rid of processed bracket id's
+ s/()?/(length($1) || length($2)) ? "$1--$2" : "-"/ge;
+ # Spacing commands
+ s/\\( |$)/ /go;
+ #JKR: There should be no more comments in the source now.
+ #s/([^\\]?)%/$1/go; # Remove the comment character
+ # Cannot treat \, as a command because , is a delimiter ...
+ s/\\,/ /go;
+ # Replace tilde's with non-breaking spaces
+ s/ *~/ /g;
+
+ # This is why we have this copy of this routine; the following
+ # isn't so desirable as the author/maintainers of LaTeX2HTML seem
+ # to think. It's not commented out in the main script, so we have
+ # to override the whole thing. In particular, we don't want empty
+ # table cells to disappear.
+
+ ### DANGEROUS ?? ###
+ # remove redundant (not ) empty tags, incl. with attributes
+ #s/\n?<([^PD >][^>]*)>\s*<\/\1>//g;
+ #s/\n?<([^PD >][^>]*)>\s*<\/\1>//g;
+ # remove redundant empty tags (not
or
or
)
+ #s/<\/(TT|[^PTH][A-Z]+)><\1>//g;
+ #s/<([^PD ]+)(\s[^>]*)?>\n*<\/\1>//g;
+
+ #JCL(jcl-hex)
+ # Replace ^^ special chars (according to p.47 of the TeX book)
+ # Useful when coming from the .aux file (german umlauts, etc.)
+ s/\^\^([^0-9a-f])/chr((64+ord($1))&127)/ge;
+ s/\^\^([0-9a-f][0-9a-f])/chr(hex($1))/ge;
+}
+
+# This is used to map the link rel attributes LaTeX2HTML uses to those
+# currently recommended by the W3C.
+sub custom_REL_hook {
+ my($rel,$junk) = @_;
+ return 'parent' if $rel eq 'up';
+ return 'prev' if $rel eq 'previous';
+ return $rel;
+}
+
+# This is added to get rid of the long comment that follows the
+# doctype declaration; MSIE5 on NT4 SP4 barfs on it and drops the
+# content of the page.
+$MY_PARTIAL_HEADER = '';
+sub make_head_and_body($$) {
+ my($title, $body) = @_;
+ $body = " $body" unless ($body eq '');
+ my $DTDcomment = '';
+ my($version, $isolanguage) = ($HTML_VERSION, 'EN');
+ my %isolanguages = ( 'english', 'EN' , 'USenglish', 'EN.US'
+ , 'original', 'EN' , 'german' , 'DE'
+ , 'austrian', 'DE.AT', 'french' , 'FR'
+ , 'spanish', 'ES');
+ $isolanguage = $isolanguages{$default_language};
+ $isolanguage = 'EN' unless $isolanguage;
+ $title = &purify($title,1);
+ eval("\$title = ". $default_title ) unless ($title);
+
+ # allow user-modification of the tag; thanks Dan Young
+ if (defined &custom_TITLE_hook) {
+ $title = &custom_TITLE_hook($title, $toc_sec_title);
+ }
+
+ if ($DOCTYPE =~ /\/\/[\w\.]+\s*$/) { # language spec included
+ $DTDcomment = "\n";
+ } else {
+ $DTDcomment = "\n";
+ }
+ if ($MY_PARTIAL_HEADER eq '') {
+ my $favicon = '';
+ if ($FAVORITES_ICON) {
+ my($myname, $mydir, $myext) = fileparse($FAVORITES_ICON, '\..*');
+ my $favtype = '';
+ if ($myext eq '.gif' || $myext eq '.png') {
+ $myext =~ s/^[.]//;
+ $favtype = " type=\"image/$myext\"";
+ }
+ $favicon = (
+ "\n");
+ }
+ $STYLESHEET = $FILE.".css" unless $STYLESHEET;
+ $MY_PARTIAL_HEADER = join('',
+ ($DOCTYPE ? $DTDcomment : ''),
+ "\n",
+ ($BASE ? "\n" : ''),
+ "\n",
+ $favicon,
+ ($EXTERNAL_UP_LINK
+ ? ("\n" : "' />"))
+ : ''),
+ "\n',
+ ($HAVE_TABLE_OF_CONTENTS
+ ? ("\n')
+ : ''),
+ ($HAVE_GENERAL_INDEX
+ ? ("\n")
+ : ''),
+ # disable for now -- Mozilla doesn't do well with multiple indexes
+ # ($HAVE_MODULE_INDEX
+ # ? ("\n")
+ # : ''),
+ ($INFO
+ # XXX We can do this with the Python tools since the About...
+ # page always gets copied to about.html, even when we use the
+ # generated node###.html page names. Won't work with the
+ # rest of the Python doc tools.
+ ? ("\n"
+ . "\n")
+ : ''),
+ $more_links_mark,
+ "\n",
+ ($CHARSET && $HTML_VERSION ge "2.1"
+ ? ('\n")
+ : ''),
+ ($AESOP_META_TYPE
+ ? "\n" : ''));
+ }
+ if (!$charset && $CHARSET) {
+ $charset = $CHARSET;
+ $charset =~ s/_/\-/go;
+ }
+ join('',
+ $MY_PARTIAL_HEADER,
+ "", $title, "\n\n");
+}
+
+sub replace_morelinks {
+ $more_links =~ s/ REL=/ rel=/g;
+ $more_links =~ s/ HREF=/ href=/g;
+ $more_links =~ s//" \/>/g;
+ $_ =~ s/$more_links_mark/$more_links/e;
+}
+
+1; # This must be the last line
Added: sandbox/trunk/pdb/Doc/perl/manual.perl
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/perl/manual.perl Wed Jul 5 14:19:15 2006
@@ -0,0 +1,15 @@
+# -*- perl -*-
+#
+# This implements the Python manual class. All it really needs to do it
+# load the "python" style. The style code is not moved into the class code
+# at this time, since we expect additional document class to be developed
+# for the Python documentation in the future. Appropriate relocations will
+# be made at that time.
+
+package main;
+
+do_require_package("report");
+do_require_package("alltt");
+do_require_package("python");
+
+1; # sheesh....
Added: sandbox/trunk/pdb/Doc/perl/python.perl
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/perl/python.perl Wed Jul 5 14:19:15 2006
@@ -0,0 +1,2173 @@
+# python.perl by Fred L. Drake, Jr. -*- perl -*-
+#
+# Heavily based on Guido van Rossum's myformat.perl (now obsolete).
+#
+# Extension to LaTeX2HTML for documents using myformat.sty.
+# Subroutines of the form do_cmd_ here define translations
+# for LaTeX commands \ defined in the corresponding .sty file.
+
+package main;
+
+use warnings;
+use File::Basename;
+
+
+sub next_argument{
+ my $param;
+ $param = missing_braces()
+ unless ((s/$next_pair_pr_rx/$param=$2;''/eo)
+ ||(s/$next_pair_rx/$param=$2;''/eo));
+ return $param;
+}
+
+sub next_optional_argument{
+ my($param, $rx) = ('', "^\\s*(\\[([^]]*)\\])?");
+ s/$rx/$param=$2;''/eo;
+ return $param;
+}
+
+sub make_icon_filename($){
+ my($myname, $mydir, $myext) = fileparse($_[0], '\..*');
+ chop $mydir;
+ if ($mydir eq '.') {
+ $mydir = $ICONSERVER;
+ }
+ $myext = ".$IMAGE_TYPE"
+ unless $myext;
+ return "$mydir$dd$myname$myext";
+}
+
+sub get_link_icon($){
+ my $url = $_[0];
+ if ($OFF_SITE_LINK_ICON && ($url =~ /^[-a-zA-Z0-9.]+:/)) {
+ # absolute URL; assume it points off-site
+ my $icon = make_icon_filename($OFF_SITE_LINK_ICON);
+ return (" ");
+ }
+ return '';
+}
+
+# This is a fairly simple hack; it supports \let when it is used to create
+# (or redefine) a macro to exactly be some other macro: \let\newname=\oldname.
+# Many possible uses of \let aren't supported or aren't supported correctly.
+#
+sub do_cmd_let{
+ local($_) = @_;
+ my $matched = 0;
+ s/[\\]([a-zA-Z]+)\s*(=\s*)?[\\]([a-zA-Z]*)/$matched=1; ''/e;
+ if ($matched) {
+ my($new, $old) = ($1, $3);
+ eval "sub do_cmd_$new { do_cmd_$old" . '(@_); }';
+ print "\ndefining handler for \\$new using \\$old\n";
+ }
+ else {
+ s/[\\]([a-zA-Z]+)\s*(=\s*)?([^\\])/$matched=1; ''/es;
+ if ($matched) {
+ my($new, $char) = ($1, $3);
+ eval "sub do_cmd_$new { \"\\$char\" . \$_[0]; }";
+ print "\ndefining handler for \\$new to insert '$char'\n";
+ }
+ else {
+ write_warnings("Could not interpret \\let construct...");
+ }
+ }
+ return $_;
+}
+
+
+# the older version of LaTeX2HTML we use doesn't support this, but we use it:
+
+sub do_cmd_textasciitilde{ '~' . $_[0]; }
+sub do_cmd_textasciicircum{ '^' . $_[0]; }
+sub do_cmd_textbar{ '|' . $_[0]; }
+sub do_cmd_texteuro { '€' . $_[0]; }
+sub do_cmd_textgreater{ '>' . $_[0]; }
+sub do_cmd_textless{ '<' . $_[0]; }
+sub do_cmd_textunderscore{ '_' . $_[0]; }
+sub do_cmd_infinity{ '∞' . $_[0]; }
+sub do_cmd_plusminus{ '±' . $_[0]; }
+sub do_cmd_guilabel{
+ return use_wrappers($_[0]. '', ''); }
+sub do_cmd_menuselection{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_sub{
+ return ' > ' . $_[0]; }
+
+
+# words typeset in a special way (not in HTML though)
+
+sub do_cmd_ABC{ 'ABC' . $_[0]; }
+sub do_cmd_UNIX{ 'Unix' . $_[0]; }
+sub do_cmd_LaTeX{ 'LaTeX' . $_[0]; }
+sub do_cmd_TeX{ 'TeX' . $_[0]; }
+sub do_cmd_ASCII{ 'ASCII' . $_[0]; }
+sub do_cmd_POSIX{ 'POSIX' . $_[0]; }
+sub do_cmd_C{ 'C' . $_[0]; }
+sub do_cmd_Cpp{ 'C++' . $_[0]; }
+sub do_cmd_EOF{ 'EOF' . $_[0]; }
+sub do_cmd_NULL{ 'NULL' . $_[0]; }
+
+sub do_cmd_e{ '&#92;' . $_[0]; }
+
+$DEVELOPER_ADDRESS = '';
+$SHORT_VERSION = '';
+$RELEASE_INFO = '';
+$PACKAGE_VERSION = '';
+
+sub do_cmd_version{ $PACKAGE_VERSION . $_[0]; }
+sub do_cmd_shortversion{ $SHORT_VERSION . $_[0]; }
+sub do_cmd_release{
+ local($_) = @_;
+ $PACKAGE_VERSION = next_argument();
+ return $_;
+}
+
+sub do_cmd_setreleaseinfo{
+ local($_) = @_;
+ $RELEASE_INFO = next_argument();
+ return $_;
+}
+
+sub do_cmd_setshortversion{
+ local($_) = @_;
+ $SHORT_VERSION = next_argument();
+ return $_;
+}
+
+sub do_cmd_authoraddress{
+ local($_) = @_;
+ $DEVELOPER_ADDRESS = next_argument();
+ return $_;
+}
+
+sub do_cmd_hackscore{
+ local($_) = @_;
+ next_argument();
+ return '_' . $_;
+}
+
+# Helper used in many places that arbitrary code-like text appears:
+
+sub codetext($){
+ my $text = "$_[0]";
+ # Make sure that "---" is not converted to "--" later when
+ # LaTeX2HTML tries converting em-dashes based on the conventional
+ # TeX font ligatures:
+ $text =~ s/--/-\-/go;
+ return $text;
+}
+
+sub use_wrappers($$$){
+ local($_,$before,$after) = @_;
+ my $stuff = next_argument();
+ return $before . $stuff . $after . $_;
+}
+
+sub use_code_wrappers($$$){
+ local($_,$before,$after) = @_;
+ my $stuff = codetext(next_argument());
+ return $before . $stuff . $after . $_;
+}
+
+$IN_DESC_HANDLER = 0;
+sub do_cmd_optional{
+ if ($IN_DESC_HANDLER) {
+ return use_wrappers($_[0], "\[",
+ "\]");
+ }
+ else {
+ return use_wrappers($_[0], "\[", "\]");
+ }
+}
+
+# Logical formatting (some based on texinfo), needs to be converted to
+# minimalist HTML. The "minimalist" is primarily to reduce the size of
+# output files for users that read them over the network rather than
+# from local repositories.
+
+sub do_cmd_pytype{ return $_[0]; }
+sub do_cmd_makevar{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_code{
+ return use_code_wrappers($_[0], '', ''); }
+sub do_cmd_module{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_keyword{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_exception{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_class{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_function{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_constant{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_member{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_method{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_cfunction{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_cdata{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_ctype{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_regexp{
+ return use_code_wrappers($_[0], '', ''); }
+sub do_cmd_character{
+ return use_code_wrappers($_[0], '"', '"'); }
+sub do_cmd_program{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_programopt{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_longprogramopt{
+ # note that the --- will be later converted to -- by LaTeX2HTML
+ return use_wrappers($_[0], '---', ''); }
+sub do_cmd_email{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_mailheader{
+ return use_wrappers($_[0], '', ':'); }
+sub do_cmd_mimetype{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_var{
+ return use_wrappers($_[0], "", ""); }
+sub do_cmd_dfn{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_emph{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_file{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_filenq{
+ return do_cmd_file($_[0]); }
+sub do_cmd_samp{
+ return use_code_wrappers($_[0], '"', '"'); }
+sub do_cmd_kbd{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_strong{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_textbf{
+ return use_wrappers($_[0], '', ''); }
+sub do_cmd_textit{
+ return use_wrappers($_[0], '', ''); }
+# This can be changed/overridden for translations:
+%NoticeNames = ('note' => 'Note:',
+ 'warning' => 'Warning:',
+ );
+
+sub do_cmd_note{
+ my $label = $NoticeNames{'note'};
+ return use_wrappers(
+ $_[0],
+ "$label\n",
+ ''); }
+sub do_cmd_warning{
+ my $label = $NoticeNames{'warning'};
+ return use_wrappers(
+ $_[0],
+ "$label\n",
+ ''); }
+
+sub do_env_notice{
+ local($_) = @_;
+ my $notice = next_optional_argument();
+ if (!$notice) {
+ $notice = 'note';
+ }
+ my $label = $NoticeNames{$notice};
+ return ("
$label\n"
+ . $_
+ . '
');
+}
+
+sub do_cmd_moreargs{
+ return '...' . $_[0]; }
+sub do_cmd_unspecified{
+ return '...' . $_[0]; }
+
+
+sub do_cmd_refmodule{
+ # Insert the right magic to jump to the module definition.
+ local($_) = @_;
+ my $key = next_optional_argument();
+ my $module = next_argument();
+ $key = $module
+ unless $key;
+ return "$module"
+ . $_;
+}
+
+sub do_cmd_newsgroup{
+ local($_) = @_;
+ my $newsgroup = next_argument();
+ my $icon = get_link_icon("news:$newsgroup");
+ my $stuff = (""
+ . "$newsgroup$icon");
+ return $stuff . $_;
+}
+
+sub do_cmd_envvar{
+ local($_) = @_;
+ my $envvar = next_argument();
+ my($name, $aname, $ahref) = new_link_info();
+ # The here is really to keep buildindex.py from making
+ # the variable name case-insensitive.
+ add_index_entry("environment variables!$envvar@$envvar",
+ $ahref);
+ add_index_entry("$envvar (environment variable)", $ahref);
+ $aname =~ s/" . $_;
+}
+
+sub do_cmd_url{
+ # use the URL as both text and hyperlink
+ local($_) = @_;
+ my $url = next_argument();
+ my $icon = get_link_icon($url);
+    $url =~ s/~/\&\#126;/g;
+ return "$url$icon" . $_;
+}
+
+sub do_cmd_manpage{
+ # two parameters: \manpage{name}{section}
+ local($_) = @_;
+ my $page = next_argument();
+ my $section = next_argument();
+ return "$page($section)" . $_;
+}
+
+$PEP_FORMAT = "http://www.python.org/peps/pep-%04d.html";
+#$RFC_FORMAT = "http://www.ietf.org/rfc/rfc%04d.txt";
+$RFC_FORMAT = "http://www.faqs.org/rfcs/rfc%d.html";
+
+sub get_rfc_url($$){
+ my($rfcnum, $format) = @_;
+ return sprintf($format, $rfcnum);
+}
+
+sub do_cmd_pep{
+ local($_) = @_;
+ my $rfcnumber = next_argument();
+ my $id = "rfcref-" . ++$global{'max_id'};
+ my $href = get_rfc_url($rfcnumber, $PEP_FORMAT);
+ my $icon = get_link_icon($href);
+ # Save the reference
+ my $nstr = gen_index_id("Python Enhancement Proposals!PEP $rfcnumber", '');
+ $index{$nstr} .= make_half_href("$CURRENT_FILE#$id");
+ return ("PEP $rfcnumber$icon" . $_);
+}
+
+sub do_cmd_rfc{
+ local($_) = @_;
+ my $rfcnumber = next_argument();
+ my $id = "rfcref-" . ++$global{'max_id'};
+ my $href = get_rfc_url($rfcnumber, $RFC_FORMAT);
+ my $icon = get_link_icon($href);
+ # Save the reference
+ my $nstr = gen_index_id("RFC!RFC $rfcnumber", '');
+ $index{$nstr} .= make_half_href("$CURRENT_FILE#$id");
+ return (""
+ . "RFC $rfcnumber$icon" . $_);
+}
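Both commands take a bare number and build the link target from $PEP_FORMAT or $RFC_FORMAT above; typical source markup is a one-line sketch such as:

    See \pep{8} for the style guide and \rfc{2822} for the message format.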
+
+sub do_cmd_ulink{
+ local($_) = @_;
+ my $text = next_argument();
+ my $url = next_argument();
+ return "$text" . $_;
+}
+
+sub do_cmd_citetitle{
+ local($_) = @_;
+ my $url = next_optional_argument();
+ my $title = next_argument();
+ my $icon = get_link_icon($url);
+ my $repl = '';
+ if ($url) {
+ my $titletext = strip_html_markup("$title");
+ $repl = ("$title$icon");
+ }
+ else {
+ $repl = "$title";
+ }
+ return $repl . $_;
+}
+
+sub do_cmd_deprecated{
+ # two parameters: \deprecated{version}{whattodo}
+ local($_) = @_;
+ my $release = next_argument();
+ my $reason = next_argument();
+    return (''
+            . "Deprecated since release $release."
+            . "\n$reason"
+            . $_);
+}
+
+sub versionnote($$){
+ # one or two parameters: \versionnote[explanation]{version}
+ my $type = $_[0];
+ local $_ = $_[1];
+ my $explanation = next_optional_argument();
+ my $release = next_argument();
+ my $text = "$type in version $release.";
+ if ($explanation) {
+ $text = "$type in version $release:\n$explanation.";
+ }
+ return "\n$text\n" . $_;
+}
+
+sub do_cmd_versionadded{
+ return versionnote('New', $_[0]);
+}
+
+sub do_cmd_versionchanged{
+ return versionnote('Changed', $_[0]);
+}
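In the LaTeX sources these notes look roughly like the following sketch (version numbers and wording are illustrative):

    \versionadded{2.5}
    \versionchanged[Support for keyword arguments was added]{2.5}
    \deprecated{2.6}{Use \function{newfunc()} instead.}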
+
+#
+# These function handle platform dependency tracking.
+#
+sub do_cmd_platform{
+ local($_) = @_;
+ my $platform = next_argument();
+ $ModulePlatforms{"$THIS_MODULE"} = $platform;
+ $platform = "Macintosh"
+ if $platform eq 'Mac';
+ return "\n
Availability: $platform.
\n" . $_;
+}
+
+$IGNORE_PLATFORM_ANNOTATION = '';
+sub do_cmd_ignorePlatformAnnotation{
+ local($_) = @_;
+ $IGNORE_PLATFORM_ANNOTATION = next_argument();
+ return $_;
+}
+
+
+# index commands
+
+$INDEX_SUBITEM = "";
+
+sub get_indexsubitem(){
+ return $INDEX_SUBITEM ? " $INDEX_SUBITEM" : '';
+}
+
+sub do_cmd_setindexsubitem{
+ local($_) = @_;
+ $INDEX_SUBITEM = next_argument();
+ return $_;
+}
+
+sub do_cmd_withsubitem{
+ # We can't really do the right thing, because LaTeX2HTML doesn't
+ # do things in the right order, but we need to at least strip this stuff
+ # out, and leave anything that the second argument expanded out to.
+ #
+ local($_) = @_;
+ my $oldsubitem = $INDEX_SUBITEM;
+ $INDEX_SUBITEM = next_argument();
+ my $stuff = next_argument();
+ my $br_id = ++$globals{'max_id'};
+ my $marker = "$O$br_id$C";
+ $stuff =~ s/^\s+//;
+ return
+ $stuff
+ . "\\setindexsubitem$marker$oldsubitem$marker"
+ . $_;
+}
+
+# This is the prologue macro which is required to start writing the
+# mod\jobname.idx file; we can just ignore it. (Defining this suppresses
+# a warning that \makemodindex is unknown.)
+#
+sub do_cmd_makemodindex{ return $_[0]; }
+
+# We're in the document subdirectory when this happens!
+#
+open(IDXFILE, '>index.dat') || die "\n$!\n";
+open(INTLABELS, '>intlabels.pl') || die "\n$!\n";
+print INTLABELS "%internal_labels = ();\n";
+print INTLABELS "1; # hack in case there are no entries\n\n";
+
+# Using \0 for this is bad because we can't use common tools to work with the
+# resulting files. Things like grep can be useful with this stuff!
+#
+$IDXFILE_FIELD_SEP = "\1";
+
+sub write_idxfile($$){
+ my($ahref, $str) = @_;
+ print IDXFILE $ahref, $IDXFILE_FIELD_SEP, $str, "\n";
+}
+
+
+sub gen_link($$){
+ my($node, $target) = @_;
+ print INTLABELS "\$internal_labels{\"$target\"} = \"/$node\";\n";
+ return "";
+}
+
+sub add_index_entry($$){
+ # add an entry to the index structures; ignore the return value
+ my($str, $ahref) = @_;
+ $str = gen_index_id($str, '');
+ $index{$str} .= $ahref;
+ write_idxfile($ahref, $str);
+}
+
+sub new_link_name_info(){
+ my $name = "l2h-" . ++$globals{'max_id'};
+ my $aname = "";
+ my $ahref = gen_link($CURRENT_FILE, $name);
+ return ($name, $ahref);
+}
+
+sub new_link_info(){
+ my($name, $ahref) = new_link_name_info();
+ my $aname = "";
+ return ($name, $aname, $ahref);
+}
+
+$IndexMacroPattern = '';
+sub define_indexing_macro(@){
+ my $count = @_;
+ my $i = 0;
+ for (; $i < $count; ++$i) {
+ my $name = $_[$i];
+ my $cmd = "idx_cmd_$name";
+ die "\nNo function $cmd() defined!\n"
+ if (!defined &$cmd);
+ eval ("sub do_cmd_$name { return process_index_macros("
+ . "\$_[0], '$name'); }");
+ if (length($IndexMacroPattern) == 0) {
+ $IndexMacroPattern = "$name";
+ }
+ else {
+ $IndexMacroPattern .= "|$name";
+ }
+ }
+}
+
+$DEBUG_INDEXING = 0;
+sub process_index_macros($$){
+ local($_) = @_;
+ my $cmdname = $_[1]; # This is what triggered us in the first place;
+ # we know it's real, so just process it.
+ my($name, $aname, $ahref) = new_link_info();
+ my $cmd = "idx_cmd_$cmdname";
+ print "\nIndexing: \\$cmdname"
+ if $DEBUG_INDEXING;
+ &$cmd($ahref); # modifies $_ and adds index entries
+    while (/^[\s\n]*\\($IndexMacroPattern)/) {
+ $cmdname = "$1";
+ print " \\$cmdname"
+ if $DEBUG_INDEXING;
+ $cmd = "idx_cmd_$cmdname";
+ if (!defined &$cmd) {
+ last;
+ }
+ else {
+ s/^[\s\n]*\\$cmdname//;
+ &$cmd($ahref);
+ }
+ }
+# XXX I don't remember why I added this to begin with.
+# if (/^[ \t\r\n]/) {
+# $_ = substr($_, 1);
+# }
+ return "$aname$anchor_invisible_mark" . $_;
+}
+
+define_indexing_macro('index');
+sub idx_cmd_index($){
+ my $str = next_argument();
+ add_index_entry("$str", $_[0]);
+}
+
+define_indexing_macro('kwindex');
+sub idx_cmd_kwindex($){
+ my $str = next_argument();
+ add_index_entry("$str!keyword", $_[0]);
+ add_index_entry("keyword!$str", $_[0]);
+}
+
+define_indexing_macro('indexii');
+sub idx_cmd_indexii($){
+ my $str1 = next_argument();
+ my $str2 = next_argument();
+ add_index_entry("$str1!$str2", $_[0]);
+ add_index_entry("$str2!$str1", $_[0]);
+}
+
+define_indexing_macro('indexiii');
+sub idx_cmd_indexiii($){
+ my $str1 = next_argument();
+ my $str2 = next_argument();
+ my $str3 = next_argument();
+ add_index_entry("$str1!$str2 $str3", $_[0]);
+ add_index_entry("$str2!$str3, $str1", $_[0]);
+ add_index_entry("$str3!$str1 $str2", $_[0]);
+}
+
+define_indexing_macro('indexiv');
+sub idx_cmd_indexiv($){
+ my $str1 = next_argument();
+ my $str2 = next_argument();
+ my $str3 = next_argument();
+ my $str4 = next_argument();
+ add_index_entry("$str1!$str2 $str3 $str4", $_[0]);
+ add_index_entry("$str2!$str3 $str4, $str1", $_[0]);
+ add_index_entry("$str3!$str4, $str1 $str2", $_[0]);
+ add_index_entry("$str4!$str1 $str2 $str3", $_[0]);
+}
+
+define_indexing_macro('ttindex');
+sub idx_cmd_ttindex($){
+ my $str = codetext(next_argument());
+ my $entry = $str . get_indexsubitem();
+ add_index_entry($entry, $_[0]);
+}
+
+sub my_typed_index_helper($$){
+ my($word, $ahref) = @_;
+ my $str = next_argument();
+ add_index_entry("$str $word", $ahref);
+ add_index_entry("$word!$str", $ahref);
+}
+
+define_indexing_macro('stindex', 'opindex', 'exindex', 'obindex');
+sub idx_cmd_stindex($){ my_typed_index_helper('statement', $_[0]); }
+sub idx_cmd_opindex($){ my_typed_index_helper('operator', $_[0]); }
+sub idx_cmd_exindex($){ my_typed_index_helper('exception', $_[0]); }
+sub idx_cmd_obindex($){ my_typed_index_helper('object', $_[0]); }
+
+define_indexing_macro('bifuncindex');
+sub idx_cmd_bifuncindex($){
+ my $str = next_argument();
+ add_index_entry("$str() (built-in function)",
+ $_[0]);
+}
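Typical uses of the indexing macros handled above, as they appear in the LaTeX sources (the entries are illustrative):

    \index{hashing}
    \indexii{file}{object}
    \stindex{import}
    \bifuncindex{len}
    \withsubitem{(in module os)}{\ttindex{getcwd()}}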
+
+
+sub make_mod_index_entry($$){
+ my($str, $define) = @_;
+ my($name, $aname, $ahref) = new_link_info();
+ # equivalent of add_index_entry() using $define instead of ''
+ $ahref =~ s/\#[-_a-zA-Z0-9]*\"/\"/
+ if ($define eq 'DEF');
+ $str = gen_index_id($str, $define);
+ $index{$str} .= $ahref;
+ write_idxfile($ahref, $str);
+
+ if ($define eq 'DEF') {
+ # add to the module index
+ $str =~ /()/;
+ my $nstr = $1;
+ $Modules{$nstr} .= $ahref;
+ }
+ return "$aname$anchor_invisible_mark2";
+}
+
+
+$THIS_MODULE = '';
+$THIS_CLASS = '';
+
+sub define_module($$){
+ my($word, $name) = @_;
+ my $section_tag = join('', @curr_sec_id);
+ if ($word ne "built-in" && $word ne "extension"
+ && $word ne "standard" && $word ne "") {
+ write_warnings("Bad module type '$word'"
+ . " for \\declaremodule (module $name)");
+ $word = "";
+ }
+ $word = "$word " if $word;
+ $THIS_MODULE = "$name";
+ $INDEX_SUBITEM = "(in module $name)";
+ print "[$name]";
+ return make_mod_index_entry(
+ "$name (${word}module)", 'DEF');
+}
+
+sub my_module_index_helper($$){
+ local($word, $_) = @_;
+ my $name = next_argument();
+ return define_module($word, $name) . $_;
+}
+
+sub do_cmd_modindex{ return my_module_index_helper('', $_[0]); }
+sub do_cmd_bimodindex{ return my_module_index_helper('built-in', $_[0]); }
+sub do_cmd_exmodindex{ return my_module_index_helper('extension', $_[0]); }
+sub do_cmd_stmodindex{ return my_module_index_helper('standard', $_[0]); }
+# local($_) = @_;
+# my $name = next_argument();
+# return define_module('standard', $name) . $_;
+# }
+
+sub ref_module_index_helper($$){
+ my($word, $ahref) = @_;
+ my $str = next_argument();
+ $word = "$word " if $word;
+ $str = "$str (${word}module)";
+ # can't use add_index_entry() since the 2nd arg to gen_index_id() is used;
+ # just inline it all here
+ $str = gen_index_id($str, 'REF');
+ $index{$str} .= $ahref;
+ write_idxfile($ahref, $str);
+}
+
+# these should be adjusted a bit....
+define_indexing_macro('refmodindex', 'refbimodindex',
+ 'refexmodindex', 'refstmodindex');
+sub idx_cmd_refmodindex($){
+ return ref_module_index_helper('', $_[0]); }
+sub idx_cmd_refbimodindex($){
+ return ref_module_index_helper('built-in', $_[0]); }
+sub idx_cmd_refexmodindex($){
+ return ref_module_index_helper('extension', $_[0]);}
+sub idx_cmd_refstmodindex($){
+ return ref_module_index_helper('standard', $_[0]); }
+
+sub do_cmd_nodename{ return do_cmd_label($_[0]); }
+
+sub init_myformat(){
+ # This depends on the override of text_cleanup() in l2hinit.perl;
+ # if that function cleans out empty tags, the first three of these
+ # variables must be set to comments.
+ #
+ # Thanks to Dave Kuhlman for figuring why some named anchors were
+ # being lost.
+ $anchor_invisible_mark = '';
+ $anchor_invisible_mark2 = '';
+ $anchor_mark = '';
+ $icons{'anchor_mark'} = '';
+}
+init_myformat();
+
+# Create an index entry, but include the string in the target anchor
+# instead of the dummy filler.
+#
+sub make_str_index_entry($){
+ my $str = $_[0];
+ my($name, $ahref) = new_link_name_info();
+ add_index_entry($str, $ahref);
+ if ($str =~ /^<[a-z]+\b/) {
+ my $s = "$str";
+ $s =~ s/^<([a-z]+)\b/<$1 id='$name' xml:id='$name'/;
+ return $s;
+ }
+ else {
+ return "$str";
+ }
+}
+
+
+%TokenToTargetMapping = (); # language:token -> link target
+%DefinedGrammars = (); # language -> full grammar text
+%BackpatchGrammarFiles = (); # file -> 1 (hash of files to fixup)
+
+sub do_cmd_token{
+ local($_) = @_;
+ my $token = next_argument();
+ my $target = $TokenToTargetMapping{"$CURRENT_GRAMMAR:$token"};
+ if ($token eq $CURRENT_TOKEN || $CURRENT_GRAMMAR eq '*') {
+ # recursive definition or display-only productionlist
+ return "$token";
+ }
+ if ($target eq '') {
+ $target = "<$CURRENT_GRAMMAR><$token>";
+ if (! $BackpatchGrammarFiles{"$CURRENT_FILE"}) {
+ print "Adding '$CURRENT_FILE' to back-patch list.\n";
+ }
+ $BackpatchGrammarFiles{"$CURRENT_FILE"} = 1;
+ }
+ return "$token" . $_;
+}
+
+sub do_cmd_grammartoken{
+ return do_cmd_token(@_);
+}
+
+sub do_env_productionlist{
+ local($_) = @_;
+ my $lang = next_optional_argument();
+ my $filename = "grammar-$lang.txt";
+ if ($lang eq '') {
+ $filename = 'grammar.txt';
+ }
+ local($CURRENT_GRAMMAR) = $lang;
+ $DefinedGrammars{$lang} .= $_;
+ return ("
" . $_;
+}
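A grammar block in the documents looks roughly like the sketch below; the optional [language] argument selects the grammar file, and the \production macro that emits each rule is assumed to be defined elsewhere in this file (it is not part of this hunk):

    \begin{productionlist}
      \production{identifier}{(\token{letter} | "_") (\token{letter} | \token{digit} | "_")*}
    \end{productionlist}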
+
+
+# For tables, we include a class on every cell to allow the CSS to set
+# the text-align property; this is because support for styling columns
+# via the HTML col element appears nearly non-existent on even the latest
+# browsers (Mozilla 1.7 is stable at the time of this writing).
+# Hopefully this can be improved as browsers evolve.
+
+@col_aligns = ('', '', '', '', '');
+
+%FontConversions = ('cdata' => 'tt class="cdata"',
+ 'character' => 'tt class="character"',
+ 'class' => 'tt class="class"',
+ 'command' => 'code',
+ 'constant' => 'tt class="constant"',
+ 'exception' => 'tt class="exception"',
+ 'file' => 'tt class="file"',
+ 'filenq' => 'tt class="file"',
+ 'kbd' => 'kbd',
+ 'member' => 'tt class="member"',
+ 'programopt' => 'b',
+ 'textrm' => '',
+ );
+
+sub fix_font($){
+ # do a little magic on a font name to get the right behavior in the first
+ # column of the output table
+ my $font = $_[0];
+ if (defined $FontConversions{$font}) {
+ $font = $FontConversions{$font};
+ }
+ return $font;
+}
+
+sub figure_column_alignment($){
+ my $a = $_[0];
+ if (!defined $a) {
+ return '';
+ }
+ my $mark = substr($a, 0, 1);
+ my $r = '';
+ if ($mark eq 'c')
+ { $r = ' class="center"'; }
+ elsif ($mark eq 'r')
+ { $r = ' class="right" '; }
+ elsif ($mark eq 'l')
+ { $r = ' class="left" '; }
+ elsif ($mark eq 'p')
+ { $r = ' class="left" '; }
+ return $r;
+}
+
+sub setup_column_alignments($){
+ local($_) = @_;
+ my($s1, $s2, $s3, $s4, $s5) = split(/[|]/,$_);
+ my $a1 = figure_column_alignment($s1);
+ my $a2 = figure_column_alignment($s2);
+ my $a3 = figure_column_alignment($s3);
+ my $a4 = figure_column_alignment($s4);
+ my $a5 = figure_column_alignment($s5);
+ $col_aligns[0] = "
"
+ . $_;
+}
+
+
+# These can be used to control the title page appearance;
+# they need a little bit of documentation.
+#
+# If $TITLE_PAGE_GRAPHIC is set, it should be the name of a file in the
+# $ICONSERVER directory, or include path information (other than "./"). The
+# default image type will be assumed if an extension is not provided.
+#
+# If specified, the "title page" will contain two colums: one containing the
+# title/author/etc., and the other containing the graphic. Use the other
+# four variables listed here to control specific details of the layout; all
+# are optional.
+#
+# $TITLE_PAGE_GRAPHIC = "my-company-logo";
+# $TITLE_PAGE_GRAPHIC_COLWIDTH = "30%";
+# $TITLE_PAGE_GRAPHIC_WIDTH = 150;
+# $TITLE_PAGE_GRAPHIC_HEIGHT = 150;
+# $TITLE_PAGE_GRAPHIC_ON_RIGHT = 0;
+
+sub make_my_titlepage(){
+ my $the_title = "";
+ if ($t_title) {
+ $the_title .= "\n
$t_title
";
+ }
+ else {
+ write_warnings("\nThis document has no title.");
+ }
+ if ($t_author) {
+ if ($t_authorURL) {
+ my $href = translate_commands($t_authorURL);
+ $href = make_named_href('author', $href,
+ "$t_author"
+ . '');
+ $the_title .= "\n
$href
";
+ }
+ else {
+ $the_title .= ("\n
$t_author"
+ . '
');
+ }
+ }
+ else {
+ write_warnings("\nThere is no author for this document.");
+ }
+ if ($t_institute) {
+ $the_title .= "\n
";
+ }
+ return $the_title;
+}
+
+sub make_my_titlegraphic(){
+ my $filename = make_icon_filename($TITLE_PAGE_GRAPHIC);
+ my $graphic = "
\n";
+ return $graphic;
+}
+
+sub do_cmd_maketitle{
+ local($_) = @_;
+ my $the_title = "\n";
+ if ($EXTERNAL_UP_LINK) {
+ # This generates a element in the wrong place (the
+ # body), but I don't see any other way to get this generated
+ # at all. Browsers like Mozilla, that support navigation
+ # links, can make use of this.
+ $the_title .= ("\n");
+ }
+ $the_title .= '
';
+ if ($TITLE_PAGE_GRAPHIC) {
+ if ($TITLE_PAGE_GRAPHIC_ON_RIGHT) {
+ $the_title .= ("\n
"
+ . "
\n
"
+ . make_my_titlepage()
+ . "
\n"
+ . make_my_titlegraphic()
+ . "
\n
");
+ }
+ else {
+ $the_title .= ("\n
\n"
+ . make_my_titlegraphic()
+ . "
"
+ . make_my_titlepage()
+ . "
\n
");
+ }
+ }
+ else {
+ $the_title .= ("\n
"
+ . make_my_titlepage()
+ . "\n
");
+ }
+ $the_title .= "\n
";
+ return $the_title . $_;
+}
+
+
+#
+# Module synopsis support
+#
+
+require SynopsisTable;
+
+sub get_chapter_id(){
+ my $id = do_cmd_thechapter('');
+ $id =~ s/(\d+)<\/SPAN>/$1/;
+ $id =~ s/\.//;
+ return $id;
+}
+
+# 'chapter' => 'SynopsisTable instance'
+%ModuleSynopses = ();
+
+sub get_synopsis_table($){
+ my $chap = $_[0];
+ my $key;
+ foreach $key (keys %ModuleSynopses) {
+ if ($key eq $chap) {
+ return $ModuleSynopses{$chap};
+ }
+ }
+ my $st = SynopsisTable->new();
+ $ModuleSynopses{$chap} = $st;
+ return $st;
+}
+
+sub do_cmd_moduleauthor{
+ local($_) = @_;
+ next_argument();
+ next_argument();
+ return $_;
+}
+
+sub do_cmd_sectionauthor{
+ local($_) = @_;
+ next_argument();
+ next_argument();
+ return $_;
+}
+
+sub do_cmd_declaremodule{
+ local($_) = @_;
+ my $key = next_optional_argument();
+ my $type = next_argument();
+ my $name = next_argument();
+ my $st = get_synopsis_table(get_chapter_id());
+ #
+ $key = $name unless $key;
+ $type = 'built-in' if $type eq 'builtin';
+ $st->declare($name, $key, $type);
+ define_module($type, $name);
+ return anchor_label("module-$key",$CURRENT_FILE,$_)
+}
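Combined with \modulesynopsis (handled just below) and the \moduleauthor handler above, a module section typically opens with markup along these lines (module name and text are illustrative):

    \declaremodule{standard}{shutil}
    \modulesynopsis{High-level file operations such as copying and removal.}
    \moduleauthor{A. N. Author}{author@example.org}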
+
+sub do_cmd_modulesynopsis{
+ local($_) = @_;
+ my $st = get_synopsis_table(get_chapter_id());
+ $st->set_synopsis($THIS_MODULE, translate_commands(next_argument()));
+ return $_;
+}
+
+sub do_cmd_localmoduletable{
+ local($_) = @_;
+ my $chap = get_chapter_id();
+ my $st = get_synopsis_table($chap);
+ $st->set_file("$CURRENT_FILE");
+ return "<$chap>\\tableofchildlinks[off]" . $_;
+}
+
+sub process_all_localmoduletables(){
+ my $key;
+ foreach $key (keys %ModuleSynopses) {
+ my $st = $ModuleSynopses{$key};
+ my $file = $st->get_file();
+ if ($file) {
+ process_localmoduletables_in_file($file);
+ }
+ else {
+ print "\nsynopsis table $key has no file association\n";
+ }
+ }
+}
+
+sub process_localmoduletables_in_file($){
+ my $file = $_[0];
+ open(MYFILE, "<$file");
+ local($_);
+ sysread(MYFILE, $_, 1024*1024);
+ close(MYFILE);
+ # need to get contents of file in $_
+ while (/<(\d+)>/) {
+ my $match = $&;
+ my $chap = $1;
+ my $st = get_synopsis_table($chap);
+ my $data = $st->tohtml();
+ s/$match/$data/;
+ }
+ open(MYFILE,">$file");
+ print MYFILE $_;
+ close(MYFILE);
+}
+sub process_python_state(){
+ process_all_localmoduletables();
+ process_grammar_files();
+}
+
+
+#
+# "See also:" -- references placed at the end of a \section
+#
+
+sub do_env_seealso{
+ return ("
\n "
+ . "
See Also:
\n"
+ . $_[0]
+ . '
');
+}
+
+sub do_env_seealsostar{
+ return ("
\n "
+ . $_[0]
+ . '
');
+}
+
+sub do_cmd_seemodule{
+ # Insert the right magic to jump to the module definition. This should
+ # work most of the time, at least for repeat builds....
+ local($_) = @_;
+ my $key = next_optional_argument();
+ my $module = next_argument();
+ my $text = next_argument();
+ my $period = '.';
+ $key = $module
+ unless $key;
+ if ($text =~ /\.$/) {
+ $period = '';
+ }
+    return ("Module $module$period\n$text"
+            . $_);
+}
+
+sub do_env_alltt{
+ local ($_) = @_;
+    local($closures,$reopens,@open_block_tags);
+
+ # get the tag-strings for all open tags
+ local(@keep_open_tags) = @$open_tags_R;
+ ($closures,$reopens) = &preserve_open_tags() if (@$open_tags_R);
+
+ # get the tags for text-level tags only
+ $open_tags_R = [ @keep_open_tags ];
+ local($local_closures, $local_reopens);
+    ($local_closures, $local_reopens,@open_block_tags)
+ = &preserve_open_block_tags
+ if (@$open_tags_R);
+
+ $open_tags_R = [ @open_block_tags ];
+
+ do {
+ local($open_tags_R) = [ @open_block_tags ];
+ local(@save_open_tags) = ();
+
+ local($cnt) = ++$global{'max_id'};
+ $_ = join('',"$O$cnt$C\\tt$O", ++$global{'max_id'}, $C
+ , $_ , $O, $global{'max_id'}, "$C$O$cnt$C");
+
+ $_ = &translate_environments($_);
+ $_ = &translate_commands($_) if (/\\/);
+
+        # remove spurious tags someone sticks in; not sure where they
+        # actually come from
+        # XXX the replacement space is there to accommodate something
+ # broken that inserts a space in front of the first line of
+ # the environment
+ s/ / /gi;
+
+ $_ = join('', $closures, $alltt_start, $local_reopens
+ , $_
+ , &balance_tags() #, $local_closures
+ , $alltt_end, $reopens);
+ undef $open_tags_R; undef @save_open_tags;
+ };
+ $open_tags_R = [ @keep_open_tags ];
+ return codetext($_);
+}
+
+# List of all filenames produced by do_cmd_verbatiminput()
+%VerbatimFiles = ();
+@VerbatimOutputs = ();
+
+sub get_verbatim_output_name($){
+ my $file = $_[0];
+ #
+ # Re-write the source filename to always use a .txt extension
+ # so that Web servers will present it as text/plain. This is
+ # needed since there is no other even moderately reliable way
+ # to get the right Content-Type header on text files for
+ # servers which we can't configure (like python.org mirrors).
+ #
+ if (defined $VerbatimFiles{$file}) {
+ # We've seen this one before; re-use the same output file.
+ return $VerbatimFiles{$file};
+ }
+ my($srcname, $srcdir, $srcext) = fileparse($file, '\..*');
+ $filename = "$srcname.txt";
+ #
+ # We need to determine if our default filename is already
+    # being used, and find a new one if it is.  If the name is in
+    # use, this algorithm will first attempt to include the
+ # source extension as part of the name, and if that is also in
+ # use (if the same file is included multiple times, or if
+ # another source file has that as the base name), a counter is
+ # used instead.
+ #
+ my $found = 1;
+ FIND:
+ while ($found) {
+ foreach $fn (@VerbatimOutputs) {
+ if ($fn eq $filename) {
+ if ($found == 1) {
+ $srcext =~ s/^[.]//; # Remove '.' from extension
+ $filename = "$srcname-$srcext.txt";
+ }
+ else {
+ $filename = "$srcname-$found.txt";
+ }
+ ++$found;
+ next FIND;
+ }
+ }
+ $found = 0;
+ }
+ push @VerbatimOutputs, $filename;
+ $VerbatimFiles{$file} = $filename;
+ return $filename;
+}
+
+sub do_cmd_verbatiminput{
+ local($_) = @_;
+ my $fname = next_argument();
+ my $file;
+ my $found = 0;
+ my $texpath;
+ # Search TEXINPUTS for the input file, the way we're supposed to:
+ foreach $texpath (split /$envkey/, $TEXINPUTS) {
+ $file = "$texpath$dd$fname";
+ last if ($found = (-f $file));
+ }
+ my $filename = '';
+ my $text;
+ if ($found) {
+ open(MYFILE, "<$file") || die "\n$!\n";
+ read(MYFILE, $text, 1024*1024);
+ close(MYFILE);
+ $filename = get_verbatim_output_name($file);
+ # Now that we have a filename, write it out.
+ open(MYFILE, ">$filename");
+ print MYFILE $text;
+ close(MYFILE);
+ #
+ # These rewrites convert the raw text to something that will
+ # be properly visible as HTML and also will pass through the
+ # vagaries of conversion through LaTeX2HTML. The order in
+ # which the specific rewrites are performed is significant.
+ #
+ $text =~ s/\&/\&/g;
+ # These need to happen before the normal < and > re-writes,
+ # since we need to avoid LaTeX2HTML's attempt to perform
+ # ligature processing without regard to context (since it
+ # doesn't have font information).
+ $text =~ s/--/-&\#45;/g;
+ $text =~ s/<\<\&\#60;/g;
+ $text =~ s/>>/\>\&\#62;/g;
+ # Just normal re-writes...
+ $text =~ s/\</g;
+ $text =~ s/>/\>/g;
+        # This last one isn't needed for the HTML, but is needed to get
+        # past LaTeX2HTML's processing of TeX macros.  We use the \&\#92;
+        # entity instead of the raw backslash in the output.
+ $text =~ s/\\/\&\#92;/g;
+ }
+ else {
+        return "Could not locate requested file $fname!\n";
+ }
+ my $note = 'Download as text.';
+ if ($file ne $filename) {
+ $note = ('Download as text (original file name: '
+ . $fname
+ . ').');
+ }
+ return ("
\n
"
+ . $text
+ . "
\n
"
+ . $_);
+}
+
+1; # This must be the last line
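As a usage note for the \verbatiminput handling above: a document line such as

    \verbatiminput{example.py}

copies the file next to the generated HTML under a .txt name (example.txt here, falling back to example-py.txt and then example-2.txt, example-3.txt and so on when a name is already taken) so that web servers deliver it as text/plain. The file name is illustrative.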
Added: sandbox/trunk/pdb/Doc/texinputs/fncychap.sty
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/texinputs/fncychap.sty Wed Jul 5 14:19:15 2006
@@ -0,0 +1,433 @@
+%%% Derived from the original fncychap.sty,
+%%% but changed ``TWELV'' to ``TWELVE''.
+
+%%% Copyright Ulf A. Lindgren
+%%% Department of Applied Electronics
+%%% Chalmers University of Technology
+%%% S-412 96 Gothenburg, Sweden
+%%% E-mail lindgren@ae.chalmers.se
+%%%
+%%% Note Permission is granted to modify this file under
+%%% the condition that it is saved using another
+%%% file and package name.
+%%%
+%%% Revision 1.1
+%%%
+%%% Jan. 8th Modified package name base date option
+%%% Jan. 22th Modified FmN and FmTi for error in book.cls
+%%% \MakeUppercase{#}->{\MakeUppercase#}
+%%% Apr. 6th Modified Lenny option to prevent undesired
+%%% skip of line.
+%%% Nov. 8th Fixed \@chapapp for AMS
+%%% Feb. 11th Fixed appendix problem related to Bjarne
+%%% Last modified Feb. 11th 1998
+
+\NeedsTeXFormat{LaTeX2e}[1995/12/01]
+\ProvidesPackage{fncychap}
+ [1997/04/06 v1.11
+ LaTeX package (Revised chapters)]
+
+%%%% DEFINITION OF Chapapp variables
+\newcommand{\CNV}{\huge\bfseries}
+\newcommand{\ChNameVar}[1]{\renewcommand{\CNV}{#1}}
+
+
+%%%% DEFINITION OF TheChapter variables
+\newcommand{\CNoV}{\huge\bfseries}
+\newcommand{\ChNumVar}[1]{\renewcommand{\CNoV}{#1}}
+
+\newif\ifUCN
+\UCNfalse
+\newif\ifLCN
+\LCNfalse
+\def\ChNameLowerCase{\LCNtrue\UCNfalse}
+\def\ChNameUpperCase{\UCNtrue\LCNfalse}
+\def\ChNameAsIs{\UCNfalse\LCNfalse}
+
+%%%%% Fix for AMSBook 971008
+
+\@ifundefined{@chapapp}{\let\@chapapp\chaptername}{}
+
+
+%%%%% Fix for Bjarne and appendix 980211
+
+\newif\ifinapp
+\inappfalse
+\renewcommand\appendix{\par
+ \setcounter{chapter}{0}%
+ \setcounter{section}{0}%
+ \inapptrue%
+ \renewcommand\@chapapp{\appendixname}%
+ \renewcommand\thechapter{\@Alph\c@chapter}}
+
+%%%%%
+
+\newcommand{\FmN}[1]{%
+\ifUCN
+ {\MakeUppercase#1}\LCNfalse
+\else
+ \ifLCN
+ {\MakeLowercase#1}\UCNfalse
+ \else #1
+ \fi
+\fi}
+
+
+%%%% DEFINITION OF Title variables
+\newcommand{\CTV}{\Huge\bfseries}
+\newcommand{\ChTitleVar}[1]{\renewcommand{\CTV}{#1}}
+
+%%%% DEFINITION OF the basic rule width
+\newlength{\RW}
+\setlength{\RW}{1pt}
+\newcommand{\ChRuleWidth}[1]{\setlength{\RW}{#1}}
+
+\newif\ifUCT
+\UCTfalse
+\newif\ifLCT
+\LCTfalse
+\def\ChTitleLowerCase{\LCTtrue\UCTfalse}
+\def\ChTitleUpperCase{\UCTtrue\LCTfalse}
+\def\ChTitleAsIs{\UCTfalse\LCTfalse}
+\newcommand{\FmTi}[1]{%
+\ifUCT
+
+ {\MakeUppercase#1}\LCTfalse
+\else
+ \ifLCT
+ {\MakeLowercase#1}\UCTfalse
+ \else #1
+ \fi
+\fi}
+
+
+
+\newlength{\mylen}
+\newlength{\myhi}
+\newlength{\px}
+\newlength{\py}
+\newlength{\pyy}
+\newlength{\pxx}
+
+
+\def\mghrulefill#1{\leavevmode\leaders\hrule\@height #1\hfill\kern\z@}
+
+\newcommand{\DOCH}{%
+ \CNV\FmN{\@chapapp}\space \CNoV\thechapter
+ \par\nobreak
+ \vskip 20\p@
+ }
+\newcommand{\DOTI}[1]{%
+ \CTV\FmTi{#1}\par\nobreak
+ \vskip 40\p@
+ }
+\newcommand{\DOTIS}[1]{%
+ \CTV\FmTi{#1}\par\nobreak
+ \vskip 40\p@
+ }
+
+%%%%%% SONNY DEF
+
+\DeclareOption{Sonny}{%
+ \ChNameVar{\Large\sf}
+ \ChNumVar{\Huge}
+ \ChTitleVar{\Large\sf}
+ \ChRuleWidth{0.5pt}
+ \ChNameUpperCase
+ \renewcommand{\DOCH}{%
+ \raggedleft
+ \CNV\FmN{\@chapapp}\space \CNoV\thechapter
+ \par\nobreak
+ \vskip 40\p@}
+ \renewcommand{\DOTI}[1]{%
+ \CTV\raggedleft\mghrulefill{\RW}\par\nobreak
+ \vskip 5\p@
+ \CTV\FmTi{#1}\par\nobreak
+ \mghrulefill{\RW}\par\nobreak
+ \vskip 40\p@}
+ \renewcommand{\DOTIS}[1]{%
+ \CTV\raggedleft\mghrulefill{\RW}\par\nobreak
+ \vskip 5\p@
+ \CTV\FmTi{#1}\par\nobreak
+ \mghrulefill{\RW}\par\nobreak
+ \vskip 40\p@}
+}
+
+%%%%%% LENNY DEF
+
+\DeclareOption{Lenny}{%
+
+ \ChNameVar{\fontsize{14}{16}\usefont{OT1}{phv}{m}{n}\selectfont}
+ \ChNumVar{\fontsize{60}{62}\usefont{OT1}{ptm}{m}{n}\selectfont}
+ \ChTitleVar{\Huge\bfseries\rm}
+ \ChRuleWidth{1pt}
+ \renewcommand{\DOCH}{%
+ \settowidth{\px}{\CNV\FmN{\@chapapp}}
+ \addtolength{\px}{2pt}
+ \settoheight{\py}{\CNV\FmN{\@chapapp}}
+ \addtolength{\py}{1pt}
+
+ \settowidth{\mylen}{\CNV\FmN{\@chapapp}\space\CNoV\thechapter}
+ \addtolength{\mylen}{1pt}
+ \settowidth{\pxx}{\CNoV\thechapter}
+ \addtolength{\pxx}{-1pt}
+
+ \settoheight{\pyy}{\CNoV\thechapter}
+ \addtolength{\pyy}{-2pt}
+ \setlength{\myhi}{\pyy}
+ \addtolength{\myhi}{-1\py}
+ \par
+ \parbox[b]{\textwidth}{%
+ \rule[\py]{\RW}{\myhi}%
+ \hskip -\RW%
+ \rule[\pyy]{\px}{\RW}%
+ \hskip -\px%
+ \raggedright%
+ \CNV\FmN{\@chapapp}\space\CNoV\thechapter%
+ \hskip1pt%
+ \mghrulefill{\RW}%
+ \rule{\RW}{\pyy}\par\nobreak%
+ \vskip -\baselineskip%
+ \vskip -\pyy%
+ \hskip \mylen%
+ \mghrulefill{\RW}\par\nobreak%
+ \vskip \pyy}%
+ \vskip 20\p@}
+
+
+ \renewcommand{\DOTI}[1]{%
+ \raggedright
+ \CTV\FmTi{#1}\par\nobreak
+ \vskip 40\p@}
+
+ \renewcommand{\DOTIS}[1]{%
+ \raggedright
+ \CTV\FmTi{#1}\par\nobreak
+ \vskip 40\p@}
+ }
+
+
+%%%%%%% GLENN DEF
+
+
+\DeclareOption{Glenn}{%
+ \ChNameVar{\bfseries\Large\sf}
+ \ChNumVar{\Huge}
+ \ChTitleVar{\bfseries\Large\rm}
+ \ChRuleWidth{1pt}
+ \ChNameUpperCase
+ \ChTitleUpperCase
+ \renewcommand{\DOCH}{%
+ \settoheight{\myhi}{\CTV\FmTi{Test}}
+ \setlength{\py}{\baselineskip}
+ \addtolength{\py}{\RW}
+ \addtolength{\py}{\myhi}
+ \setlength{\pyy}{\py}
+ \addtolength{\pyy}{-1\RW}
+
+ \raggedright
+ \CNV\FmN{\@chapapp}\space\CNoV\thechapter
+ \hskip 3pt\mghrulefill{\RW}\rule[-1\pyy]{2\RW}{\py}\par\nobreak}
+
+ \renewcommand{\DOTI}[1]{%
+ \addtolength{\pyy}{-4pt}
+ \settoheight{\myhi}{\CTV\FmTi{#1}}
+ \addtolength{\myhi}{\py}
+ \addtolength{\myhi}{-1\RW}
+ \vskip -1\pyy
+ \rule{2\RW}{\myhi}\mghrulefill{\RW}\hskip 2pt
+ \raggedleft\CTV\FmTi{#1}\par\nobreak
+ \vskip 80\p@}
+
+ \renewcommand{\DOTIS}[1]{%
+ \setlength{\py}{10pt}
+ \setlength{\pyy}{\py}
+ \addtolength{\pyy}{\RW}
+ \setlength{\myhi}{\baselineskip}
+ \addtolength{\myhi}{\pyy}
+ \mghrulefill{\RW}\rule[-1\py]{2\RW}{\pyy}\par\nobreak
+% \addtolength{}{}
+\vskip -1\baselineskip
+ \rule{2\RW}{\myhi}\mghrulefill{\RW}\hskip 2pt
+ \raggedleft\CTV\FmTi{#1}\par\nobreak
+ \vskip 60\p@}
+ }
+
+%%%%%%% CONNY DEF
+
+\DeclareOption{Conny}{%
+ \ChNameUpperCase
+ \ChTitleUpperCase
+ \ChNameVar{\centering\Huge\rm\bfseries}
+ \ChNumVar{\Huge}
+ \ChTitleVar{\centering\Huge\rm}
+ \ChRuleWidth{2pt}
+
+ \renewcommand{\DOCH}{%
+ \mghrulefill{3\RW}\par\nobreak
+ \vskip -0.5\baselineskip
+ \mghrulefill{\RW}\par\nobreak
+ \CNV\FmN{\@chapapp}\space \CNoV\thechapter
+ \par\nobreak
+ \vskip -0.5\baselineskip
+ }
+ \renewcommand{\DOTI}[1]{%
+ \mghrulefill{\RW}\par\nobreak
+ \CTV\FmTi{#1}\par\nobreak
+ \vskip 60\p@
+ }
+ \renewcommand{\DOTIS}[1]{%
+ \mghrulefill{\RW}\par\nobreak
+ \CTV\FmTi{#1}\par\nobreak
+ \vskip 60\p@
+ }
+ }
+
+%%%%%%% REJNE DEF
+
+\DeclareOption{Rejne}{%
+
+ \ChNameUpperCase
+ \ChTitleUpperCase
+ \ChNameVar{\centering\Large\rm}
+ \ChNumVar{\Huge}
+ \ChTitleVar{\centering\Huge\rm}
+ \ChRuleWidth{1pt}
+ \renewcommand{\DOCH}{%
+ \settoheight{\py}{\CNoV\thechapter}
+ \addtolength{\py}{-1pt}
+ \CNV\FmN{\@chapapp}\par\nobreak
+ \vskip 20\p@
+ \setlength{\myhi}{2\baselineskip}
+ \setlength{\px}{\myhi}
+ \addtolength{\px}{-1\RW}
+ \rule[-1\px]{\RW}{\myhi}\mghrulefill{\RW}\hskip
+ 10pt\raisebox{-0.5\py}{\CNoV\thechapter}\hskip
+10pt\mghrulefill{\RW}\rule[-1\px]{\RW}{\myhi}\par\nobreak
+ \vskip -1\p@
+ }
+ \renewcommand{\DOTI}[1]{%
+ \setlength{\mylen}{\textwidth}
+ \addtolength{\mylen}{-2\RW}
+ {\vrule width\RW}\parbox{\mylen}{\CTV\FmTi{#1}}{\vrule
+width\RW}\par\nobreak
+ \vskip
+-1pt\rule{\RW}{2\baselineskip}\mghrulefill{\RW}\rule{\RW}{2\baselineskip}
+ \vskip 60\p@
+ }
+ \renewcommand{\DOTIS}[1]{%
+ \setlength{\py}{\fboxrule}
+ \setlength{\fboxrule}{\RW}
+ \setlength{\mylen}{\textwidth}
+ \addtolength{\mylen}{-2\RW}
+ \fbox{\parbox{\mylen}{\vskip
+2\baselineskip\CTV\FmTi{#1}\par\nobreak\vskip \baselineskip}}
+ \setlength{\fboxrule}{\py}
+ \vskip 60\p@
+ }
+ }
+
+
+%%%%%%% BJARNE DEF
+
+\DeclareOption{Bjarne}{%
+ \ChNameUpperCase
+ \ChTitleUpperCase
+ \ChNameVar{\raggedleft\normalsize\rm}
+ \ChNumVar{\raggedleft \bfseries\Large}
+ \ChTitleVar{\raggedleft \Large\rm}
+ \ChRuleWidth{1pt}
+
+
+%% Note thechapter -> c@chapter fix appendix bug
+
+ \newcounter{AlphaCnt}
+ \newcounter{AlphaDecCnt}
+ \newcommand{\AlphaNo}{%
+ \ifcase\number\theAlphaCnt
+ \ifnum\c@chapter=0
+ ZERO\else{}\fi
+ \or ONE\or TWO\or THREE\or FOUR\or FIVE
+ \or SIX\or SEVEN\or EIGHT\or NINE\or TEN
+ \or ELEVEN\or TWELVE\or THIRTEEN\or FOURTEEN\or FIFTEEN
+ \or SIXTEEN\or SEVENTEEN\or EIGHTEEN\or NINETEEN\fi
+}
+
+ \newcommand{\AlphaDecNo}{%
+ \setcounter{AlphaDecCnt}{0}
+ \@whilenum\number\theAlphaCnt>0\do
+ {\addtocounter{AlphaCnt}{-10}
+ \addtocounter{AlphaDecCnt}{1}}
+ \ifnum\number\theAlphaCnt=0
+ \else
+ \addtocounter{AlphaDecCnt}{-1}
+ \addtocounter{AlphaCnt}{10}
+ \fi
+
+
+ \ifcase\number\theAlphaDecCnt\or TEN\or TWENTY\or THIRTY\or
+ FORTY\or FIFTY\or SIXTY\or SEVENTY\or EIGHTY\or NINETY\fi
+ }
+ \newcommand{\TheAlphaChapter}{%
+
+ \ifinapp
+ \thechapter
+ \else
+ \setcounter{AlphaCnt}{\c@chapter}
+ \ifnum\c at chapter<20
+ \AlphaNo
+ \else
+ \AlphaDecNo\AlphaNo
+ \fi
+ \fi
+ }
+ \renewcommand{\DOCH}{%
+ \mghrulefill{\RW}\par\nobreak
+ \CNV\FmN{\@chapapp}\par\nobreak
+ \CNoV\TheAlphaChapter\par\nobreak
+ \vskip -1\baselineskip\vskip 5pt\mghrulefill{\RW}\par\nobreak
+ \vskip 20\p@
+ }
+ \renewcommand{\DOTI}[1]{%
+ \CTV\FmTi{#1}\par\nobreak
+ \vskip 40\p@
+ }
+ \renewcommand{\DOTIS}[1]{%
+ \CTV\FmTi{#1}\par\nobreak
+ \vskip 40\p@
+ }
+}
+
+\DeclareOption*{%
+ \PackageWarning{fancychapter}{unknown style option}
+ }
+
+\ProcessOptions* \relax
+
+\def\@makechapterhead#1{%
+ \vspace*{50\p@}%
+ {\parindent \z@ \raggedright \normalfont
+ \ifnum \c@secnumdepth >\m@ne
+ \DOCH
+ \fi
+ \interlinepenalty\@M
+ \DOTI{#1}
+ }}
+\def\@schapter#1{\if@twocolumn
+ \@topnewpage[\@makeschapterhead{#1}]%
+ \else
+ \@makeschapterhead{#1}%
+ \@afterheading
+ \fi}
+\def\@makeschapterhead#1{%
+ \vspace*{50\p@}%
+ {\parindent \z@ \raggedright
+ \normalfont
+ \interlinepenalty\@M
+ \DOTIS{#1}
+ \vskip 40\p@
+ }}
+
+\endinput
+
+
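The chapter-heading styles declared above are selected as a package option; manual.cls below loads the Bjarne variant, and switching is a one-line change there, e.g.:

    \RequirePackage[Lenny]{fncychap}   % or Sonny, Glenn, Conny, Rejne, Bjarne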
Added: sandbox/trunk/pdb/Doc/texinputs/manual.cls
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/texinputs/manual.cls Wed Jul 5 14:19:15 2006
@@ -0,0 +1,155 @@
+%
+% manual.cls for the Python documentation
+%
+
+\NeedsTeXFormat{LaTeX2e}[1995/12/01]
+\ProvidesClass{manual}
+ [1998/03/03 Document class (Python manual)]
+
+\RequirePackage{pypaper}
+\RequirePackage{fancybox}
+
+% Change the options here to get a different set of basic options, but only
+% if you have to. Paper and font size should be adjusted in pypaper.sty.
+%
+\LoadClass[\py at paper,\py at ptsize,twoside,openright]{report}
+
+\setcounter{secnumdepth}{2}
+
+% Optional packages:
+%
+% If processing of these documents fails at your TeX installation,
+% these may be commented out (independently) to make things work.
+% These are both supplied with the current version of the teTeX
+% distribution.
+%
+% The "fancyhdr" package makes nicer page footers reasonable to
+% implement, and is used to put the chapter and section information in
+% the footers.
+%
+\RequirePackage{fancyhdr}\typeout{Using fancier footers than usual.}
+
+
+% Required packages:
+%
+% The "fncychap" package is used to get the nice chapter headers. The
+% .sty file is distributed with Python, so you should not need to disable
+% it. You'd also end up with a mixed page style; uglier than stock LaTeX!
+%
+\RequirePackage[Bjarne]{fncychap}\typeout{Using fancy chapter headings.}
+% Do horizontal rules this way to match:
+\newcommand{\py at doHorizontalRule}{\mghrulefill{\RW}}
+%
+%
+% This gives us all the Python-specific markup that we really want.
+% This should come last. Do not change this.
+%
+\RequirePackage{python}
+
+% support for module synopsis sections:
+\newcommand{\py at ModSynopsisFilename}{\jobname\thechapter.syn}
+\let\py at OldChapter=\chapter
+\renewcommand{\chapter}{
+ \py at ProcessModSynopsis
+ \py at closeModSynopsisFile
+ \py at OldChapter
+}
+
+
+% Change the title page to look a bit better, and fit in with the
+% fncychap ``Bjarne'' style a bit better.
+%
+\renewcommand{\maketitle}{%
+ \begin{titlepage}%
+ \let\footnotesize\small
+ \let\footnoterule\relax
+ \py at doHorizontalRule%
+ \ifpdf
+ \begingroup
+ % This \def is required to deal with multi-line authors; it
+ % changes \\ to ', ' (comma-space), making it pass muster for
+ % generating document info in the PDF file.
+ \def\\{, }
+ \pdfinfo{
+ /Author (\@author)
+ /Title (\@title)
+ }
+ \endgroup
+ \fi
+ \begin{flushright}%
+ {\rm\Huge\py at HeaderFamily \@title \par}%
+ {\em\LARGE\py at HeaderFamily \py at release\releaseinfo \par}
+ \vfill
+ {\LARGE\py at HeaderFamily \@author \par}
+ \vfill\vfill
+ {\large
+ \@date \par
+ \vfill
+ \py at authoraddress \par
+ }%
+ \end{flushright}%\par
+ \@thanks
+ \end{titlepage}%
+ \setcounter{footnote}{0}%
+ \let\thanks\relax\let\maketitle\relax
+ \gdef\@thanks{}\gdef\@author{}\gdef\@title{}
+}
+
+
+% Catch the end of the {abstract} environment, but here make sure the
+% abstract is followed by a blank page if the 'openright' option is used.
+%
+\let\py at OldEndAbstract=\endabstract
+\renewcommand{\endabstract}{
+ \if at openright
+ \ifodd\value{page}
+ \typeout{Adding blank page after the abstract.}
+ \vfil\pagebreak
+ \fi
+ \fi
+ \py at OldEndAbstract
+}
+
+% This wraps the \tableofcontents macro with all the magic to get the
+% spacing right and have the right number of pages if the 'openright'
+% option has been used. This eliminates a fair amount of crud in the
+% individual document files.
+%
+\let\py at OldTableofcontents=\tableofcontents
+\renewcommand{\tableofcontents}{%
+ \setcounter{page}{1}%
+ \pagebreak%
+ \pagestyle{plain}%
+ {%
+ \parskip = 0mm%
+ \py at OldTableofcontents%
+ \if at openright%
+ \ifodd\value{page}%
+ \typeout{Adding blank page after the table of contents.}%
+ \pagebreak\hspace{0pt}%
+ \fi%
+ \fi%
+ \cleardoublepage%
+ }%
+ \pagenumbering{arabic}%
+ \@ifundefined{fancyhf}{}{\pagestyle{normal}}%
+ \py at doing@page at targetstrue%
+}
+% This is needed to get the width of the section # area wide enough in the
+% library reference. Doing it here keeps it the same for all the manuals.
+%
+\renewcommand*\l at section{\@dottedtocline{1}{1.5em}{2.6em}}
+\renewcommand*\l at subsection{\@dottedtocline{2}{4.1em}{3.5em}}
+\setcounter{tocdepth}{1}
+
+
+% Fix the theindex environment to add an entry to the Table of
+% Contents; this is much nicer than just having to jump to the end of
+% the book and flip around, especially with multiple indexes.
+%
+\let\py at OldTheindex=\theindex
+\renewcommand{\theindex}{
+ \cleardoublepage
+ \py at OldTheindex
+ \addcontentsline{toc}{chapter}{\indexname}
+}
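A document built on this class starts out roughly as in the sketch below (title, author and release are illustrative; \release and \authoraddress are assumed to be provided by python.sty, feeding the \py@release and \py@authoraddress used by the \maketitle above):

    \documentclass{manual}
    \title{Python Debugger Notes}
    \author{A. N. Author}
    \authoraddress{author@example.org}
    \release{0.1}
    \begin{document}
    \maketitle
    \tableofcontents
    \input{intro}
    \end{document}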
Added: sandbox/trunk/pdb/Doc/texinputs/pypaper.sty
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/texinputs/pypaper.sty Wed Jul 5 14:19:15 2006
@@ -0,0 +1,18 @@
+%
+% Change this to say a4paper instead of letterpaper if you want A4. These
+% are the latex defaults.
+%
+\newcommand{\py@paper}{letterpaper}
+\newcommand{\py@ptsize}{10pt}
+
+% These set up the fonts for the documents.
+%
+% The "times" package makes the default font the PostScript Times
+% font, which makes for smaller PostScript and a font that more people
+% like.
+%
+% The "avant" package causes the AvantGarde font to be used for
+% sans-serif text, instead of the uglier Helvetica set up by the "times"
+% package.
+%
+\RequirePackage{times}\typeout{Using Times instead of Computer Modern.}
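Per the comment at the top of this file, switching the manuals to A4 is a matter of editing the first definition, e.g.:

    \newcommand{\py@paper}{a4paper}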
Added: sandbox/trunk/pdb/Doc/texinputs/python.ist
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/texinputs/python.ist Wed Jul 5 14:19:15 2006
@@ -0,0 +1,11 @@
+line_max 100
+headings_flag 1
+heading_prefix " \\bigletter "
+
+preamble "\\begin{theindex}
+\\def\\bigletter#1{{\\Large\\sffamily#1}\\nopagebreak\\vspace{1mm}}
+
+"
+
+symhead_positive "{Symbols}"
+numhead_positive "{Numbers}"
Added: sandbox/trunk/pdb/Doc/texinputs/python.sty
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/texinputs/python.sty Wed Jul 5 14:19:15 2006
@@ -0,0 +1,1309 @@
+%
+% python.sty for the Python documentation [works only with LaTeX2e]
+%
+
+\NeedsTeXFormat{LaTeX2e}[1995/12/01]
+\ProvidesPackage{python}
+ [1998/01/11 LaTeX package (Python markup)]
+
+\RequirePackage{longtable}
+\RequirePackage{underscore}
+
+% Uncomment these two lines to ignore the paper size and make the page
+% size more like a typical published manual.
+%\renewcommand{\paperheight}{9in}
+%\renewcommand{\paperwidth}{8.5in} % typical squarish manual
+%\renewcommand{\paperwidth}{7in} % O'Reilly ``Programming Python''
+
+% These packages can be used to add marginal annotations which indicate
+% index entries and labels; useful for reviewing this messy documentation!
+%
+%\RequirePackage{showkeys}
+%\RequirePackage{showidx}
+
+% If we ever want to indent paragraphs, this needs to be changed.
+% This is used inside the macros defined here instead of coding
+% \noindent directly.
+\let\py at parindent=\noindent
+
+% for PDF output, use maximal compression & a lot of other stuff
+% (test for PDF recommended by Tanmoy Bhattacharya)
+%
+\newif\ifpy at doing@page at targets
+\py at doing@page at targetsfalse
+
+\newif\ifpdf\pdffalse
+\ifx\pdfoutput\undefined\else\ifcase\pdfoutput
+\else
+ \pdftrue
+ \input{pdfcolor}
+ \let\py at LinkColor=\NavyBlue
+ \let\py at NormalColor=\Black
+ \pdfcompresslevel=9
+ \pdfpagewidth=\paperwidth % page width of PDF output
+ \pdfpageheight=\paperheight % page height of PDF output
+ %
+ % Pad the number with '0' to 3 digits wide so no page name is a prefix
+ % of any other.
+ %
+ \newcommand{\py at targetno}[1]{\ifnum#1<100 0\fi\ifnum#1<10 0\fi#1}
+ \newcommand{\py at pageno}{\py at targetno\thepage}
+ %
+ % This definition allows the entries in the page-view of the ToC to be
+ % active links. Some work, some don't.
+ %
+ \let\py at OldContentsline=\contentsline
+ %
+ % Backward compatibility hack: pdfTeX 0.13 defined \pdfannotlink,
+ % but it changed to \pdfstartlink in 0.14. This lets us use either
+ % version and still get useful behavior.
+ %
+ \@ifundefined{pdfstartlink}{
+ \let\pdfstartlink=\pdfannotlink
+ }{}
+ %
+ % The \py at parindent here is a hack -- we're forcing pdfTeX into
+ % horizontal mode since \pdfstartlink requires that.
+ \def\py at pdfstartlink{%
+ \ifvmode\py at parindent\fi%
+ \pdfstartlink%
+ }
+ %
+ % Macro that takes two args: the name to link to and the content of
+ % the link. This takes care of the PDF magic, getting the colors
+ % the same for each link, and avoids having lots of garbage all over
+ % this style file.
+ \newcommand{\py at linkToName}[2]{%
+ \py at pdfstartlink attr{/Border [0 0 0]} goto name{#1}%
+ \py at LinkColor#2\py at NormalColor%
+ \pdfendlink%
+ }
+ % Compute the padded page number separately since we end up with a pair of
+ % \relax tokens; this gets the right string computed and works.
+ \renewcommand{\contentsline}[3]{%
+ \def\my at pageno{\py at targetno{#3}}%
+ \py at OldContentsline{#1}{\py at linkToName{page\my at pageno}{#2}}{#3}%
+ }
+ \AtEndDocument{
+ \def\_{\string_}
+ \InputIfFileExists{\jobname.bkm}{\pdfcatalog{/PageMode /UseOutlines}}{}
+ }
+ \newcommand{\py at target}[1]{%
+ \ifpy at doing@page at targets%
+ {\pdfdest name{#1} xyz}%
+ \fi%
+ }
+ \let\py at OldLabel=\label
+ \renewcommand{\label}[1]{%
+ \py at OldLabel{#1}%
+ \py at target{label-#1}%
+ }
+ % This stuff adds a page# destination to every PDF page, where # is three
+ % digits wide, padded with leading zeros. This doesn't really help with
+ % the frontmatter, but does fine with the body.
+ %
+ % This is *heavily* based on the hyperref package.
+ %
+ \def\@begindvi{%
+ \unvbox \@begindvibox
+ \@hyperfixhead
+ }
+ \def\@hyperfixhead{%
+ \let\H at old@thehead\@thehead
+ \global\def\@foo{\py at target{page\py at pageno}}%
+ \expandafter\ifx\expandafter\@empty\H at old@thehead
+ \def\H at old@thehead{\hfil}\fi
+ \def\@thehead{\@foo\relax\H at old@thehead}%
+ }
+\fi\fi
+
+% Increase printable page size (copied from fullpage.sty)
+\topmargin 0pt
+\advance \topmargin by -\headheight
+\advance \topmargin by -\headsep
+
+% attempt to work a little better for A4 users
+\textheight \paperheight
+\advance\textheight by -2in
+
+\oddsidemargin 0pt
+\evensidemargin 0pt
+%\evensidemargin -.25in % for ``manual size'' documents
+\marginparwidth 0.5in
+
+\textwidth \paperwidth
+\advance\textwidth by -2in
+
+
+% Style parameters and macros used by most documents here
+\raggedbottom
+\sloppy
+\parindent = 0mm
+\parskip = 2mm
+\hbadness = 5000 % don't print trivial gripes
+
+\pagestyle{empty} % start this way; change for
+\pagenumbering{roman} % ToC & chapters
+
+% Use this to set the font family for headers and other decor:
+\newcommand{\py at HeaderFamily}{\sffamily}
+
+% Set up abstract ways to get the normal and smaller font sizes that
+% work even in footnote context.
+\newif\ifpy at infootnote \py at infootnotefalse
+\let\py at oldmakefntext\@makefntext
+\def\@makefntext#1{%
+ \bgroup%
+ \py at infootnotetrue
+ \py at oldmakefntext{#1}%
+ \egroup%
+}
+\def\py at defaultsize{%
+ \ifpy at infootnote\footnotesize\else\normalsize\fi%
+}
+\def\py at smallsize{%
+ \ifpy at infootnote\scriptsize\else\small\fi%
+}
+
+% Redefine the 'normal' header/footer style when using "fancyhdr" package:
+\@ifundefined{fancyhf}{}{
+ % Use \pagestyle{normal} as the primary pagestyle for text.
+ \fancypagestyle{normal}{
+ \fancyhf{}
+ \fancyfoot[LE,RO]{{\py at HeaderFamily\thepage}}
+ \fancyfoot[LO]{{\py at HeaderFamily\nouppercase{\rightmark}}}
+ \fancyfoot[RE]{{\py at HeaderFamily\nouppercase{\leftmark}}}
+ \renewcommand{\headrulewidth}{0pt}
+ \renewcommand{\footrulewidth}{0.4pt}
+ }
+ % Update the plain style so we get the page number & footer line,
+ % but not a chapter or section title. This is to keep the first
+ % page of a chapter and the blank page between chapters `clean.'
+ \fancypagestyle{plain}{
+ \fancyhf{}
+ \fancyfoot[LE,RO]{{\py at HeaderFamily\thepage}}
+ \renewcommand{\headrulewidth}{0pt}
+ \renewcommand{\footrulewidth}{0.4pt}
+ }
+ % Redefine \cleardoublepage so that the blank page between chapters
+ % gets the plain style and not the fancy style. This is described
+ % in the documentation for the fancyhdr package by Piet von Oostrum.
+ \@ifundefined{chapter}{}{
+ \renewcommand{\cleardoublepage}{
+ \clearpage\if at openright \ifodd\c at page\else
+ \hbox{}
+ \thispagestyle{plain}
+ \newpage
+ \if at twocolumn\hbox{}\newpage\fi\fi\fi
+ }
+ }
+}
+
+% This sets up the {verbatim} environment to be indented and a minipage,
+% and to have all the other mostly nice properties that we want for
+% code samples.
+
+\let\py at OldVerbatim=\verbatim
+\let\py at OldEndVerbatim=\endverbatim
+\RequirePackage{verbatim}
+\let\py at OldVerbatimInput=\verbatiminput
+
+% Variable used by begin code command
+\newlength{\py at codewidth}
+
+\renewcommand{\verbatim}{%
+ \setlength{\parindent}{1cm}%
+ % Calculate the text width for the minipage:
+ \setlength{\py at codewidth}{\linewidth}%
+ \addtolength{\py at codewidth}{-\parindent}%
+ %
+ \par\indent%
+ \begin{minipage}[t]{\py at codewidth}%
+ \small%
+ \py at OldVerbatim%
+}
+\renewcommand{\endverbatim}{%
+ \py at OldEndVerbatim%
+ \end{minipage}%
+}
+\renewcommand{\verbatiminput}[1]{%
+ {\setlength{\parindent}{1cm}%
+ % Calculate the text width for the minipage:
+ \setlength{\py at codewidth}{\linewidth}%
+ \addtolength{\py at codewidth}{-\parindent}%
+ %
+ \small%
+ \begin{list}{}{\setlength{\leftmargin}{1cm}}
+ \item%
+ \py at OldVerbatimInput{#1}%
+ \end{list}
+ }%
+}
+
+% This does a similar thing for the {alltt} environment:
+\RequirePackage{alltt}
+\let\py at OldAllTT=\alltt
+\let\py at OldEndAllTT=\endalltt
+
+\renewcommand{\alltt}{%
+ \setlength{\parindent}{1cm}%
+ % Calculate the text width for the minipage:
+ \setlength{\py at codewidth}{\linewidth}%
+ \addtolength{\py at codewidth}{-\parindent}%
+ \let\e=\textbackslash%
+ %
+ \par\indent%
+ \begin{minipage}[t]{\py at codewidth}%
+ \small%
+ \py at OldAllTT%
+}
+\renewcommand{\endalltt}{%
+ \py at OldEndAllTT%
+ \end{minipage}%
+}
+
+
+\newcommand{\py at modulebadkey}{{--just-some-junk--}}
+
+
+%% Lots of index-entry generation support.
+
+% Command to wrap around stuff that refers to function / module /
+% attribute names in the index. Default behavior: like \code{}. To
+% just keep the index entries in the roman font, uncomment the second
+% definition; it matches O'Reilly style more.
+%
+\newcommand{\py at idxcode}[1]{\texttt{#1}}
+%\renewcommand{\py at idxcode}[1]{#1}
+
+% Command to generate two index entries (using subentries)
+\newcommand{\indexii}[2]{\index{#1!#2}\index{#2!#1}}
+
+% And three entries (using only one level of subentries)
+\newcommand{\indexiii}[3]{\index{#1!#2 #3}\index{#2!#3, #1}\index{#3!#1 #2}}
+
+% And four (again, using only one level of subentries)
+\newcommand{\indexiv}[4]{
+\index{#1!#2 #3 #4}
+\index{#2!#3 #4, #1}
+\index{#3!#4, #1 #2}
+\index{#4!#1 #2 #3}
+}
+
+% Command to generate a reference to a function, statement, keyword,
+% operator.
+\newcommand{\kwindex}[1]{\indexii{keyword}{#1@{\py at idxcode{#1}}}}
+\newcommand{\stindex}[1]{\indexii{statement}{#1@{\py at idxcode{#1}}}}
+\newcommand{\opindex}[1]{\indexii{operator}{#1@{\py at idxcode{#1}}}}
+\newcommand{\exindex}[1]{\indexii{exception}{#1@{\py at idxcode{#1}}}}
+\newcommand{\obindex}[1]{\indexii{object}{#1}}
+\newcommand{\bifuncindex}[1]{%
+ \index{#1@{\py at idxcode{#1()}} (built-in function)}}
+
+% Add an index entry for a module
+\newcommand{\py at refmodule}[2]{\index{#1@{\py at idxcode{#1}} (#2module)}}
+\newcommand{\refmodindex}[1]{\py at refmodule{#1}{}}
+\newcommand{\refbimodindex}[1]{\py at refmodule{#1}{built-in }}
+\newcommand{\refexmodindex}[1]{\py at refmodule{#1}{extension }}
+\newcommand{\refstmodindex}[1]{\py at refmodule{#1}{standard }}
+
+% Refer to a module's documentation using a hyperlink of the module's
+% name, at least if we're building PDF:
+\ifpdf
+ \newcommand{\refmodule}[2][\py at modulebadkey]{%
+ \ifx\py at modulebadkey#1\def\py at modulekey{#2}\else\def\py at modulekey{#1}\fi%
+ \py at linkToName{label-module-\py at modulekey}{\module{#2}}%
+ }
+\else
+ \newcommand{\refmodule}[2][\py at modulebadkey]{\module{#2}}
+\fi
+
+% support for the module index
+\newif\ifpy at UseModuleIndex
+\py at UseModuleIndexfalse
+
+\newcommand{\makemodindex}{
+ \newwrite\modindexfile
+ \openout\modindexfile=mod\jobname.idx
+ \py at UseModuleIndextrue
+}
+
+% Add the defining entry for a module
+\newcommand{\py at modindex}[2]{%
+ \renewcommand{\py at thismodule}{#1}
+ \setindexsubitem{(in module #1)}%
+ \index{#1@{\py at idxcode{#1}} (#2module)|textbf}%
+ \ifpy at UseModuleIndex%
+ \@ifundefined{py at modplat@\py at thismodulekey}{
+ \write\modindexfile{\protect\indexentry{#1@{\texttt{#1}}}{\thepage}}%
+ }{\write\modindexfile{\protect\indexentry{#1@{\texttt{#1} %
+ \emph{(\py at platformof[\py at thismodulekey]{})}}}{\thepage}}%
+ }
+ \fi%
+}
+
+% *** XXX *** THE NEXT FOUR MACROS ARE NOW OBSOLETE !!! ***
+
+% built-in & Python modules in the main distribution
+\newcommand{\bimodindex}[1]{\py at modindex{#1}{built-in }%
+ \typeout{*** MACRO bimodindex IS OBSOLETE -- USE declaremodule INSTEAD!}}
+\newcommand{\stmodindex}[1]{\py at modindex{#1}{standard }%
+ \typeout{*** MACRO stmodindex IS OBSOLETE -- USE declaremodule INSTEAD!}}
+
+% Python & extension modules outside the main distribution
+\newcommand{\modindex}[1]{\py at modindex{#1}{}%
+ \typeout{*** MACRO modindex IS OBSOLETE -- USE declaremodule INSTEAD!}}
+\newcommand{\exmodindex}[1]{\py at modindex{#1}{extension }%
+ \typeout{*** MACRO exmodindex IS OBSOLETE -- USE declaremodule INSTEAD!}}
+
+% Additional string for an index entry
+\newif\ifpy at usingsubitem\py at usingsubitemfalse
+\newcommand{\py at indexsubitem}{}
+\newcommand{\setindexsubitem}[1]{\renewcommand{\py at indexsubitem}{ #1}%
+ \py at usingsubitemtrue}
+\newcommand{\ttindex}[1]{%
+ \ifpy at usingsubitem
+ \index{#1@{\py at idxcode{#1}}\py at indexsubitem}%
+ \else%
+ \index{#1@{\py at idxcode{#1}}}%
+ \fi%
+}
+\newcommand{\withsubitem}[2]{%
+ \begingroup%
+ \def\ttindex##1{\index{##1@{\py at idxcode{##1}} #1}}%
+ #2%
+ \endgroup%
+}
+
+
+% Module synopsis processing -----------------------------------------------
+%
+\newcommand{\py at thisclass}{}
+\newcommand{\py at thismodule}{}
+\newcommand{\py at thismodulekey}{}
+\newcommand{\py at thismoduletype}{}
+
+\newcommand{\py at standardIndexModule}[1]{\py at modindex{#1}{standard }}
+\newcommand{\py at builtinIndexModule}[1]{\py at modindex{#1}{built-in }}
+\newcommand{\py at extensionIndexModule}[1]{\py at modindex{#1}{extension }}
+\newcommand{\py at IndexModule}[1]{\py at modindex{#1}{}}
+
+\newif\ifpy at HaveModSynopsis \py at HaveModSynopsisfalse
+\newif\ifpy at ModSynopsisFileIsOpen \py at ModSynopsisFileIsOpenfalse
+\newif\ifpy at HaveModPlatform \py at HaveModPlatformfalse
+
+% \declaremodule[key]{type}{name}
+\newcommand{\declaremodule}[3][\py at modulebadkey]{
+ \py at openModSynopsisFile
+ \renewcommand{\py at thismoduletype}{#2}
+ \ifx\py at modulebadkey#1
+ \renewcommand{\py at thismodulekey}{#3}
+ \else
+ \renewcommand{\py at thismodulekey}{#1}
+ \fi
+ \@ifundefined{py@#2IndexModule}{%
+ \typeout{*** MACRO declaremodule called with unknown module type: `#2'}
+ \py at IndexModule{#3}%
+ }{%
+ \csname py@#2IndexModule\endcsname{#3}%
+ }
+ \label{module-\py at thismodulekey}
+}
+\newif\ifpy at ModPlatformFileIsOpen \py at ModPlatformFileIsOpenfalse
+\newcommand{\py at ModPlatformFilename}{\jobname.pla}
+\newcommand{\platform}[1]{
+ \ifpy at ModPlatformFileIsOpen\else
+ \newwrite\py at ModPlatformFile
+ \openout\py at ModPlatformFile=\py at ModPlatformFilename
+ \py at ModPlatformFileIsOpentrue
+ \fi
+}
+\InputIfFileExists{\jobname.pla}{}{}
+\newcommand{\py at platformof}[2][\py at modulebadkey]{%
+ \ifx\py at modulebadkey#1 \def\py at key{#2}%
+ \else \def\py at key{#1}%
+ \fi%
+ \csname py at modplat@\py at key\endcsname%
+}
+\newcommand{\ignorePlatformAnnotation}[1]{}
+
+% \moduleauthor{name}{email}
+\newcommand{\moduleauthor}[2]{}
+
+% \sectionauthor{name}{email}
+\newcommand{\sectionauthor}[2]{}
+
+
+\newcommand{\py at defsynopsis}{Module has no synopsis.}
+\newcommand{\py at modulesynopsis}{\py at defsynopsis}
+\newcommand{\modulesynopsis}[1]{
+ \py at HaveModSynopsistrue
+ \renewcommand{\py at modulesynopsis}{#1}
+}
+
+% define the file
+\newwrite\py at ModSynopsisFile
+
+% hacked from \addtocontents from latex.ltx:
+\long\def\py at writeModSynopsisFile#1{%
+ \protected at write\py at ModSynopsisFile%
+ {\let\label\@gobble \let\index\@gobble \let\glossary\@gobble}%
+ {\string#1}%
+}
+\newcommand{\py at closeModSynopsisFile}{
+ \ifpy at ModSynopsisFileIsOpen
+ \closeout\py at ModSynopsisFile
+ \py at ModSynopsisFileIsOpenfalse
+ \fi
+}
+\newcommand{\py at openModSynopsisFile}{
+ \ifpy at ModSynopsisFileIsOpen\else
+ \openout\py at ModSynopsisFile=\py at ModSynopsisFilename
+ \py at ModSynopsisFileIsOpentrue
+ \fi
+}
+
+\newcommand{\py at ProcessModSynopsis}{
+ \ifpy at HaveModSynopsis
+ \py at writeModSynopsisFile{\modulesynopsis%
+ {\py at thismodulekey}{\py at thismodule}%
+ {\py at thismoduletype}{\py at modulesynopsis}}%
+ \py at HaveModSynopsisfalse
+ \fi
+ \renewcommand{\py at modulesynopsis}{\py at defsynopsis}
+}
+\AtEndDocument{\py at ProcessModSynopsis\py at closeModSynopsisFile}
+
+
+\long\def\py at writeModPlatformFile#1{%
+ \protected at write\py at ModPlatformFile%
+ {\let\label\@gobble \let\index\@gobble \let\glossary\@gobble}%
+ {\string#1}%
+}
+
+
+\newcommand{\localmoduletable}{
+ \IfFileExists{\py at ModSynopsisFilename}{
+ \begin{synopsistable}
+ \input{\py at ModSynopsisFilename}
+ \end{synopsistable}
+ }{}
+}
+
+\ifpdf
+ \newcommand{\py at ModSynopsisSummary}[4]{%
+ \py at linkToName{label-module-#1}{\bfcode{#2}} & #4\\
+ }
+\else
+ \newcommand{\py at ModSynopsisSummary}[4]{\bfcode{#2} & #4\\}
+\fi
+\newenvironment{synopsistable}{
+ % key, name, type, synopsis
+ \let\modulesynopsis=\py at ModSynopsisSummary
+ \begin{tabular}{ll}
+}{
+ \end{tabular}
+}
+%
+% --------------------------------------------------------------------------
+
+
+\newcommand{\py at reset}{
+ \py at usingsubitemfalse
+ \py at ProcessModSynopsis
+ \renewcommand{\py at thisclass}{}
+ \renewcommand{\py at thismodule}{}
+ \renewcommand{\py at thismodulekey}{}
+ \renewcommand{\py at thismoduletype}{}
+}
+
+% Augment the sectioning commands used to get our own font family in place,
+% and reset some internal data items:
+\renewcommand{\section}{\py at reset%
+ \@startsection{section}{1}{\z@}%
+ {-3.5ex \@plus -1ex \@minus -.2ex}%
+ {2.3ex \@plus.2ex}%
+ {\reset at font\Large\py at HeaderFamily}}
+\renewcommand{\subsection}{\@startsection{subsection}{2}{\z@}%
+ {-3.25ex\@plus -1ex \@minus -.2ex}%
+ {1.5ex \@plus .2ex}%
+ {\reset at font\large\py at HeaderFamily}}
+\renewcommand{\subsubsection}{\@startsection{subsubsection}{3}{\z@}%
+ {-3.25ex\@plus -1ex \@minus -.2ex}%
+ {1.5ex \@plus .2ex}%
+ {\reset at font\normalsize\py at HeaderFamily}}
+\renewcommand{\paragraph}{\@startsection{paragraph}{4}{\z@}%
+ {3.25ex \@plus1ex \@minus.2ex}%
+ {-1em}%
+ {\reset at font\normalsize\py at HeaderFamily}}
+\renewcommand{\subparagraph}{\@startsection{subparagraph}{5}{\parindent}%
+ {3.25ex \@plus1ex \@minus .2ex}%
+ {-1em}%
+ {\reset at font\normalsize\py at HeaderFamily}}
+
+
+% Now for a lot of semantically-loaded environments that do a ton of magical
+% things to get the right formatting and index entries for the stuff in
+% Python modules and C API.
+
+
+% {fulllineitems} is used in one place in libregex.tex, but is really for
+% internal use in this file.
+%
+\newcommand{\py at itemnewline}[1]{%
+ \@tempdima\linewidth%
+ \advance\@tempdima \leftmargin\makebox[\@tempdima][l]{#1}%
+}
+
+\newenvironment{fulllineitems}{
+ \begin{list}{}{\labelwidth \leftmargin \labelsep 0pt
+ \rightmargin 0pt \topsep -\parskip \partopsep \parskip
+ \itemsep -\parsep
+ \let\makelabel=\py at itemnewline}
+}{\end{list}}
+
+% \optional is mostly for use in the arguments parameters to the various
+% {*desc} environments defined below, but may be used elsewhere. Known to
+% be used in the debugger chapter.
+%
+% Typical usage:
+%
+% \begin{funcdesc}{myfunc}{reqparm\optional{, optparm}}
+% ^^^ ^^^
+% No space here No space here
+%
+% When a function has multiple optional parameters, \optional should be
+% nested, not chained. This is right:
+%
+% \begin{funcdesc}{myfunc}{\optional{parm1\optional{, parm2}}}
+%
+\let\py at badkey=\@undefined
+
+\newcommand{\optional}[1]{%
+ {\textnormal{\Large[}}{#1}\hspace{0.5mm}{\textnormal{\Large]}}}
+
+% This can be used when a function or method accepts an varying number
+% of arguments, such as by using the *args syntax in the parameter list.
+\newcommand{\py at moreargs}{...}
+
+% This can be used when you don't want to document the parameters to a
+% function or method, but simply state that it's an alias for
+% something else.
+\newcommand{\py at unspecified}{...}
+
+
+\newlength{\py at argswidth}
+\newcommand{\py at sigparams}[1]{%
+ \parbox[t]{\py at argswidth}{\py at varvars{#1}\code{)}}}
+\newcommand{\py at sigline}[2]{%
+ \settowidth{\py at argswidth}{#1\code{(}}%
+ \addtolength{\py at argswidth}{-2\py at argswidth}%
+ \addtolength{\py at argswidth}{\textwidth}%
+ \item[#1\code{(}\py at sigparams{#2}]}
+
+% C functions ------------------------------------------------------------
+% \begin{cfuncdesc}[refcount]{type}{name}{arglist}
+% Note that the [refcount] slot should only be filled in by
+% tools/anno-api.py; it pulls the value from the refcounts database.
+\newcommand{\cfuncline}[3]{
+ \py at sigline{\code{#1 \bfcode{#2}}}{#3}%
+ \index{#2@{\py at idxcode{#2()}}}
+}
+\newenvironment{cfuncdesc}[4][\py at badkey]{
+ \begin{fulllineitems}
+ \cfuncline{#2}{#3}{#4}
+ \ifx#1\@undefined\else%
+ \emph{Return value: \textbf{#1}.}\\
+ \fi
+}{\end{fulllineitems}}
+
+% C variables ------------------------------------------------------------
+% \begin{cvardesc}{type}{name}
+\newenvironment{cvardesc}[2]{
+ \begin{fulllineitems}
+ \item[\code{#1 \bfcode{#2}}\index{#2@{\py at idxcode{#2}}}]
+}{\end{fulllineitems}}
+
+% C data types -----------------------------------------------------------
+% \begin{ctypedesc}[index name]{typedef name}
+\newenvironment{ctypedesc}[2][\py at badkey]{
+ \begin{fulllineitems}
+ \item[\bfcode{#2}%
+ \ifx#1\@undefined%
+ \index{#2@{\py at idxcode{#2}} (C type)}
+ \else%
+ \index{#2@{\py at idxcode{#1}} (C type)}
+ \fi]
+}{\end{fulllineitems}}
+
+% C type fields ----------------------------------------------------------
+% \begin{cmemberdesc}{container type}{ctype}{membername}
+\newcommand{\cmemberline}[3]{
+ \item[\code{#2 \bfcode{#3}}]
+ \index{#3@{\py at idxcode{#3}} (#1 member)}
+}
+\newenvironment{cmemberdesc}[3]{
+ \begin{fulllineitems}
+ \cmemberline{#1}{#2}{#3}
+}{\end{fulllineitems}}
+
+% Funky macros -----------------------------------------------------------
+% \begin{csimplemacrodesc}{name}
+% -- "simple" because it has no args; NOT for constant definitions!
+\newenvironment{csimplemacrodesc}[1]{
+ \begin{fulllineitems}
+ \item[\bfcode{#1}\index{#1@{\py at idxcode{#1}} (macro)}]
+}{\end{fulllineitems}}
+
+% simple functions (not methods) -----------------------------------------
+% \begin{funcdesc}{name}{args}
+\newcommand{\funcline}[2]{%
+ \funclineni{#1}{#2}%
+ \index{#1@{\py at idxcode{#1()}} (in module \py at thismodule)}}
+\newenvironment{funcdesc}[2]{
+ \begin{fulllineitems}
+ \funcline{#1}{#2}
+}{\end{fulllineitems}}
+
+% similar to {funcdesc}, but doesn't add to the index
+\newcommand{\funclineni}[2]{%
+ \py at sigline{\bfcode{#1}}{#2}}
+\newenvironment{funcdescni}[2]{
+ \begin{fulllineitems}
+ \funclineni{#1}{#2}
+}{\end{fulllineitems}}
+
+% classes ----------------------------------------------------------------
+% \begin{classdesc}{name}{constructor args}
+\newenvironment{classdesc}[2]{
+ % Using \renewcommand doesn't work for this, for unknown reasons:
+ \global\def\py at thisclass{#1}
+ \begin{fulllineitems}
+ \py at sigline{\strong{class }\bfcode{#1}}{#2}%
+ \index{#1@{\py at idxcode{#1}} (class in \py at thismodule)}
+}{\end{fulllineitems}}
+
+% \begin{classdesc*}{name}
+\newenvironment{classdesc*}[1]{
+ % Using \renewcommand doesn't work for this, for unknown reasons:
+ \global\def\py at thisclass{#1}
+ \begin{fulllineitems}
+ \item[\strong{class }\code{\bfcode{#1}}%
+ \index{#1@{\py at idxcode{#1}} (class in \py at thismodule)}]
+}{\end{fulllineitems}}
+
+% \begin{excclassdesc}{name}{constructor args}
+% but indexes as an exception
+\newenvironment{excclassdesc}[2]{
+ % Using \renewcommand doesn't work for this, for unknown reasons:
+ \global\def\py at thisclass{#1}
+ \begin{fulllineitems}
+ \py at sigline{\strong{exception }\bfcode{#1}}{#2}%
+ \index{#1@{\py at idxcode{#1}} (exception in \py at thismodule)}
+}{\end{fulllineitems}}
+
+% There is no corresponding {excclassdesc*} environment. To describe
+% a class exception without parameters, use the {excdesc} environment.
+
+
+\let\py at classbadkey=\@undefined
+
+% object method ----------------------------------------------------------
+% \begin{methoddesc}[classname]{methodname}{args}
+\newcommand{\methodline}[3][\@undefined]{
+ \methodlineni{#2}{#3}
+ \ifx#1\@undefined
+ \index{#2@{\py at idxcode{#2()}} (\py at thisclass\ method)}
+ \else
+ \index{#2@{\py at idxcode{#2()}} (#1 method)}
+ \fi
+}
+\newenvironment{methoddesc}[3][\@undefined]{
+ \begin{fulllineitems}
+ \ifx#1\@undefined
+ \methodline{#2}{#3}
+ \else
+ \def\py at thisclass{#1}
+ \methodline{#2}{#3}
+ \fi
+}{\end{fulllineitems}}
+
+% similar to {methoddesc}, but doesn't add to the index
+% (never actually uses the optional argument)
+\newcommand{\methodlineni}[3][\py at classbadkey]{%
+ \py at sigline{\bfcode{#2}}{#3}}
+\newenvironment{methoddescni}[3][\py at classbadkey]{
+ \begin{fulllineitems}
+ \methodlineni{#2}{#3}
+}{\end{fulllineitems}}
+
+% object data attribute --------------------------------------------------
+% \begin{memberdesc}[classname]{membername}
+\newcommand{\memberline}[2][\py at classbadkey]{%
+ \ifx#1\@undefined
+ \memberlineni{#2}
+ \index{#2@{\py at idxcode{#2}} (\py at thisclass\ attribute)}
+ \else
+ \memberlineni{#2}
+ \index{#2@{\py at idxcode{#2}} (#1 attribute)}
+ \fi
+}
+\newenvironment{memberdesc}[2][\py at classbadkey]{
+ \begin{fulllineitems}
+ \ifx#1\@undefined
+ \memberline{#2}
+ \else
+ \def\py at thisclass{#1}
+ \memberline{#2}
+ \fi
+}{\end{fulllineitems}}
+
+% similar to {memberdesc}, but doesn't add to the index
+% (never actually uses the optional argument)
+\newcommand{\memberlineni}[2][\py at classbadkey]{\item[\bfcode{#2}]}
+\newenvironment{memberdescni}[2][\py at classbadkey]{
+ \begin{fulllineitems}
+ \memberlineni{#2}
+}{\end{fulllineitems}}
+
+% For exceptions: --------------------------------------------------------
+% \begin{excdesc}{name}
+% -- for constructor information, use excclassdesc instead
+\newenvironment{excdesc}[1]{
+ \begin{fulllineitems}
+ \item[\strong{exception }\bfcode{#1}%
+ \index{#1@{\py at idxcode{#1}} (exception in \py at thismodule)}]
+}{\end{fulllineitems}}
+
+% Module data or constants: ----------------------------------------------
+% \begin{datadesc}{name}
+\newcommand{\dataline}[1]{%
+ \datalineni{#1}\index{#1@{\py at idxcode{#1}} (data in \py at thismodule)}}
+\newenvironment{datadesc}[1]{
+ \begin{fulllineitems}
+ \dataline{#1}
+}{\end{fulllineitems}}
+
+% similar to {datadesc}, but doesn't add to the index
+\newcommand{\datalineni}[1]{\item[\bfcode{#1}]\nopagebreak}
+\newenvironment{datadescni}[1]{
+ \begin{fulllineitems}
+ \datalineni{#1}
+}{\end{fulllineitems}}
+
+% bytecode instruction ---------------------------------------------------
+% \begin{opcodedesc}{name}{var}
+% -- {var} may be {}
+\newenvironment{opcodedesc}[2]{
+ \begin{fulllineitems}
+ \item[\bfcode{#1}\quad\var{#2}]
+}{\end{fulllineitems}}
+
+
+\newcommand{\nodename}[1]{\label{#1}}
+
+% For these commands, use \command{} to get the typography right, not
+% {\command}. This works better with the texinfo translation.
+\newcommand{\ABC}{{\sc abc}}
+\newcommand{\UNIX}{{\sc Unix}}
+\newcommand{\POSIX}{POSIX}
+\newcommand{\ASCII}{{\sc ascii}}
+\newcommand{\Cpp}{C\protect\raisebox{.18ex}{++}}
+\newcommand{\C}{C}
+\newcommand{\EOF}{{\sc eof}}
+\newcommand{\NULL}{\constant{NULL}}
+\newcommand{\infinity}{\ensuremath{\infty}}
+\newcommand{\plusminus}{\ensuremath{\pm}}
+
+% \guilabel{Start}
+\newcommand{\guilabel}[1]{\textsf{#1}}
+% \menuselection{Start \sub Programs \sub Python}
+\newcommand{\menuselection}[1]{\guilabel{{\def\sub{ \ensuremath{>} }#1}}}
+
+% Also for consistency: spell Python "Python", not "python"!
+
+% code is the most difficult one...
+\newcommand{\code}[1]{\textrm{\@vobeyspaces\@noligs\def\{{\char`\{}\def\}{\char`\}}\def\~{\char`\~}\def\^{\char`\^}\def\e{\char`\\}\def\${\char`\$}\def\#{\char`\#}\def\&{\char`\&}\def\%{\char`\%}%
+\texttt{#1}}}
+
+\newcommand{\bfcode}[1]{\code{\bfseries#1}} % bold-faced code font
+\newcommand{\csimplemacro}[1]{\code{#1}}
+\newcommand{\kbd}[1]{\code{#1}}
+\newcommand{\samp}[1]{`\code{#1}'}
+\newcommand{\var}[1]{%
+ \ifmmode%
+ \hbox{\py at defaultsize\textrm{\textit{#1\/}}}%
+ \else%
+ \py at defaultsize\textrm{\textit{#1\/}}%
+ \fi%
+}
+\renewcommand{\emph}[1]{{\em #1}}
+\newcommand{\dfn}[1]{\emph{#1}}
+\newcommand{\strong}[1]{{\bf #1}}
+% let's experiment with a new font:
+\newcommand{\file}[1]{`\filenq{#1}'}
+\newcommand{\filenq}[1]{{\py at smallsize\textsf{\let\e=\textbackslash#1}}}
+
+% Use this def/redef approach for \url{} since hyperref defined this already,
+% but only if we actually used hyperref:
+\ifpdf
+ \newcommand{\url}[1]{{%
+ \py at pdfstartlink attr{/Border [0 0 0]} user{/S /URI /URI (#1)}%
+ \py at LinkColor% color of the link text
+ \py at smallsize\sf #1%
+ \py at NormalColor% Turn it back off; these are declarative
+ \pdfendlink}% and don't appear bound to the current
+ }% formatting "box".
+\else
+ \newcommand{\url}[1]{\mbox{\py at smallsize\textsf{#1}}}
+\fi
+\newcommand{\email}[1]{{\py at smallsize\textsf{#1}}}
+\newcommand{\newsgroup}[1]{{\py at smallsize\textsf{#1}}}
+
+\newcommand{\py at varvars}[1]{{%
+ {\let\unspecified=\py at unspecified%
+ \let\moreargs=\py at moreargs%
+ \var{#1}}}}
+
+% I'd really like to get rid of this!
+\newif\iftexi\texifalse
+
+% This is used to get l2h to put the copyright and abstract on
+% a separate HTML page.
+\newif\ifhtml\htmlfalse
+
+
+% These should be used for all references to identifiers which are
+% used to refer to instances of specific language constructs. See the
+% names for specific semantic assignments.
+%
+% For now, don't do anything really fancy with them; just use them as
+% logical markup. This might change in the future.
+%
+\newcommand{\module}[1]{\texttt{#1}}
+\newcommand{\keyword}[1]{\texttt{#1}}
+\newcommand{\exception}[1]{\texttt{#1}}
+\newcommand{\class}[1]{\texttt{#1}}
+\newcommand{\function}[1]{\texttt{#1}}
+\newcommand{\member}[1]{\texttt{#1}}
+\newcommand{\method}[1]{\texttt{#1}}
+
+\newcommand{\pytype}[1]{#1} % built-in Python type
+
+\newcommand{\cfunction}[1]{\texttt{#1}}
+\newcommand{\ctype}[1]{\texttt{#1}} % C struct or typedef name
+\newcommand{\cdata}[1]{\texttt{#1}} % C variable, typically global
+
+\newcommand{\mailheader}[1]{{\py at smallsize\textsf{#1:}}}
+\newcommand{\mimetype}[1]{{\py at smallsize\textsf{#1}}}
+% The \! is a "negative thin space" in math mode.
+\newcommand{\regexp}[1]{%
+ {\tiny$^{^\lceil}\!\!$%
+ {\py at defaultsize\code{#1}}%
+ $\!\rfloor\!$%
+ }}
+\newcommand{\envvar}[1]{%
+ #1%
+ \index{#1}%
+ \index{environment variables!{#1}}%
+}
+\newcommand{\makevar}[1]{#1} % variable in a Makefile
+\newcommand{\character}[1]{\samp{#1}}
+
+% constants defined in Python modules or C headers, not language constants:
+\newcommand{\constant}[1]{\code{#1}} % manifest constant, not syntactic
+
+\newcommand{\manpage}[2]{{\emph{#1}(#2)}}
+\newcommand{\pep}[1]{PEP #1\index{Python Enhancement Proposals!PEP #1}}
+\newcommand{\rfc}[1]{RFC #1\index{RFC!RFC #1}}
+\newcommand{\program}[1]{\strong{#1}}
+\newcommand{\programopt}[1]{\strong{#1}}
+% Note that \longprogramopt provides the '--'!
+\newcommand{\longprogramopt}[1]{\strong{-{}-#1}}
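+% For instance, \longprogramopt{verbose} should come out as --verbose
+% (illustrative; the macro just supplies the two leading dashes).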
+
+% \ulink{link text}{URL}
+\ifpdf
+ \newcommand{\ulink}[2]{{%
+ % For PDF, we *should* only generate a link when the URL is absolute.
+ \py at pdfstartlink attr{/Border [0 0 0]} user{/S /URI /URI (#2)}%
+ \py at LinkColor% color of the link text
+ #1%
+ \py at NormalColor% Turn it back off; these are declarative
+ \pdfendlink}% and don't appear bound to the current
+ }% formatting "box".
+\else
+ \newcommand{\ulink}[2]{#1}
+\fi
+
+% cited titles: \citetitle{Title of Work}
+% online: \citetitle[url-to-resource]{Title of Work}
+\ifpdf
+ \newcommand{\citetitle}[2][\py at modulebadkey]{%
+ \ifx\py at modulebadkey#1\emph{#2}\else\ulink{\emph{#2}}{#1}\fi%
+ }
+\else
+ \newcommand{\citetitle}[2][URL]{\emph{#2}}
+\fi
+
+
+
+% This version is being checked in for the historical record; it shows
+% how I've managed to get some aspects of this to work. It will not
+% be used in practice, so a subsequent revision will change things
+% again. This version has problems, but shows how to do something
+% that proved more tedious than I'd expected, so I don't want to lose
+% the example completely.
+%
+\newcommand{\grammartoken}[1]{\texttt{#1}}
+\newenvironment{productionlist}[1][\py at badkey]{
+ \def\optional##1{{\Large[}##1{\Large]}}
+ \def\production##1##2{\code{##1}&::=&\code{##2}\\}
+ \def\productioncont##1{& &\code{##1}\\}
+ \def\token##1{##1}
+ \let\grammartoken=\token
+ \parindent=2em
+ \indent
+ \begin{tabular}{lcl}
+}{%
+ \end{tabular}
+}
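+% A sketch of the intended markup (the grammar rule is made up purely for
+% illustration):
+%   \begin{productionlist}
+%     \production{greeting}{"hello" \token{name}}
+%     \productioncont{| "hi" \token{name}}
+%   \end{productionlist}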
+
+\newlength{\py at noticelength}
+
+\newcommand{\py at heavybox}{
+ \setlength{\fboxrule}{2pt}
+ \setlength{\fboxsep}{7pt}
+ \setlength{\py at noticelength}{\linewidth}
+ \addtolength{\py at noticelength}{-2\fboxsep}
+ \addtolength{\py at noticelength}{-2\fboxrule}
+ \setlength{\shadowsize}{3pt}
+ \Sbox
+ \minipage{\py at noticelength}
+}
+\newcommand{\py at endheavybox}{
+ \endminipage
+ \endSbox
+ \fbox{\TheSbox}
+}
+
+% a 'note' is as plain as it gets:
+\newcommand{\py at noticelabel@note}{Note:}
+\newcommand{\py at noticestart@note}{}
+\newcommand{\py at noticeend@note}{}
+
+% a 'warning' gets more visible distinction:
+\newcommand{\py at noticelabel@warning}{Warning:}
+\newcommand{\py at noticestart@warning}{\py at heavybox}
+\newcommand{\py at noticeend@warning}{\py at endheavybox}
+
+\newenvironment{notice}[1][note]{
+ \def\py at noticetype{#1}
+ \csname py at noticestart@#1\endcsname
+ \par\strong{\csname py at noticelabel@#1\endcsname}
+}{\csname py at noticeend@\py at noticetype\endcsname}
+\newcommand{\note}[1]{\strong{\py at noticelabel@note} #1}
+\newcommand{\warning}[1]{\strong{\py at noticelabel@warning} #1}
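+% Typical use of the environment form (wording is illustrative only):
+%   \begin{notice}[warning]
+%     This interface is provisional and may change.
+%   \end{notice}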
+
+% Deprecation stuff.
+% Should be extended to allow an index / list of deprecated stuff. But
+% there's a lot of stuff that needs to be done to make that automatable.
+%
+% First parameter is the release number that deprecates the feature, the
+% second is the action that should be taken by users of the feature.
+%
+% Example:
+% \deprecated{1.5.1}{Use \method{frobnicate()} instead.}
+%
+\newcommand{\deprecated}[2]{%
+ \strong{Deprecated since release #1.} #2\par}
+
+% New stuff.
+% This should be used to mark things which have been added to the
+% development tree but that aren't in the release, but are documented.
+% This allows release of documentation that already includes updated
+% descriptions. Place at end of descriptor environment.
+%
+% Example:
+% \versionadded{1.5.2}
+% \versionchanged[short explanation]{2.0}
+%
+\newcommand{\versionadded}[2][\py at badkey]{%
+ \ifx#1\@undefined%
+ { New in version #2. }%
+ \else%
+ { New in version #2:\ #1. }%
+ \fi%
+}
+\newcommand{\versionchanged}[2][\py at badkey]{%
+ \ifx#1\@undefined%
+ { Changed in version #2. }%
+ \else%
+ { Changed in version #2:\ #1. }%
+ \fi%
+}
+
+
+% Tables.
+%
+\newenvironment{tableii}[4]{%
+ \begin{center}%
+ \def\lineii##1##2{\csname#2\endcsname{##1}&##2\\}%
+ \begin{tabular}{#1}\strong{#3}&\strong{#4} \\* \hline%
+}{%
+ \end{tabular}%
+ \end{center}%
+}
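+% The second argument names a one-argument markup macro (e.g. 'code') that is
+% applied to each first-column entry; a hypothetical instance:
+%   \begin{tableii}{l|l}{code}{Constant}{Meaning}
+%     \lineii{EGGS}{A made-up constant, for illustration only.}
+%   \end{tableii}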
+
+\newenvironment{longtableii}[4]{%
+ \begin{center}%
+ \def\lineii##1##2{\csname#2\endcsname{##1}&##2\\}%
+ \begin{longtable}[c]{#1}\strong{#3}&\strong{#4} \\* \hline\endhead%
+}{%
+ \end{longtable}%
+ \end{center}%
+}
+
+\newenvironment{tableiii}[5]{%
+ \begin{center}%
+ \def\lineiii##1##2##3{\csname#2\endcsname{##1}&##2&##3\\}%
+ \begin{tabular}{#1}\strong{#3}&\strong{#4}&\strong{#5} \\%
+ \hline%
+}{%
+ \end{tabular}%
+ \end{center}%
+}
+
+\newenvironment{longtableiii}[5]{%
+ \begin{center}%
+ \def\lineiii##1##2##3{\csname#2\endcsname{##1}&##2&##3\\}%
+ \begin{longtable}[c]{#1}\strong{#3}&\strong{#4}&\strong{#5} \\%
+ \hline\endhead%
+}{%
+ \end{longtable}%
+ \end{center}%
+}
+
+\newenvironment{tableiv}[6]{%
+ \begin{center}%
+ \def\lineiv##1##2##3##4{\csname#2\endcsname{##1}&##2&##3&##4\\}%
+ \begin{tabular}{#1}\strong{#3}&\strong{#4}&\strong{#5}&\strong{#6} \\%
+ \hline%
+}{%
+ \end{tabular}%
+ \end{center}%
+}
+
+\newenvironment{longtableiv}[6]{%
+ \begin{center}%
+ \def\lineiv##1##2##3##4{\csname#2\endcsname{##1}&##2&##3&##4\\}%
+ \begin{longtable}[c]{#1}\strong{#3}&\strong{#4}&\strong{#5}&\strong{#6}%
+ \\%
+ \hline\endhead%
+}{%
+ \end{longtable}%
+ \end{center}%
+}
+
+\newenvironment{tablev}[7]{%
+ \begin{center}%
+ \def\linev##1##2##3##4##5{\csname#2\endcsname{##1}&##2&##3&##4&##5\\}%
+ \begin{tabular}{#1}\strong{#3}&\strong{#4}&\strong{#5}&\strong{#6}&\strong{#7} \\%
+ \hline%
+}{%
+ \end{tabular}%
+ \end{center}%
+}
+
+\newenvironment{longtablev}[7]{%
+ \begin{center}%
+ \def\linev##1##2##3##4##5{\csname#2\endcsname{##1}&##2&##3&##4&##5\\}%
+ \begin{longtable}[c]{#1}\strong{#3}&\strong{#4}&\strong{#5}&\strong{#6}&\strong{#7}%
+ \\%
+ \hline\endhead%
+}{%
+ \end{longtable}%
+ \end{center}%
+}
+
+% XXX Don't think we can use this yet, though it cleans up some
+% tedious markup. There's no equivalent for the HTML transform yet,
+% and that needs to exist. I don't know how to write it.
+%
+% This should really have something that makes it easier to bind a
+% table's ``Notes'' column and an associated tablenotes environment,
+% and generates the right magic for getting the numbers right in the
+% table.
+%
+% So this is quite incomplete.
+%
+\newcounter{py at tablenotescounter}
+\newenvironment{tablenotes}{%
+ \noindent Notes:
+ \par
+ \setcounter{py at tablenotescounter}{0}
+ \begin{list}{(\arabic{py at tablenotescounter})}%
+ {\usecounter{py at tablenotescounter}}
+}{\end{list}}
+
+
+% Cross-referencing (AMK, new impl. FLD)
+% Sample usage:
+% \begin{seealso}
+% \seemodule{rand}{Uniform random number generator.}; % Module xref
+% \seetext{\emph{Encyclopedia Britannica}}. % Ref to a book
+%
+% % A funky case: module name contains '_'; have to supply an optional key
+% \seemodule[copyreg]{copy_reg}{Interface constructor registration for
+% \module{pickle}.}
+% \end{seealso}
+%
+% Note that the last parameter for \seemodule and \seetext should be complete
+% sentences and be terminated with the proper punctuation.
+
+\ifpdf
+ \newcommand{\py at seemodule}[3][\py at modulebadkey]{%
+ \par%
+ \ifx\py at modulebadkey#1\def\py at modulekey{#2}\else\def\py at modulekey{#1}\fi%
+ \begin{fulllineitems}
+ \item[\py at linkToName{label-module-\py at modulekey}{Module \module{#2}}
+ (section \ref{module-\py at modulekey}):]
+ #3
+ \end{fulllineitems}
+ }
+\else
+ \newcommand{\py at seemodule}[3][\py at modulebadkey]{%
+ \par%
+ \ifx\py at modulebadkey#1\def\py at modulekey{#2}\else\def\py at modulekey{#1}\fi%
+ \begin{fulllineitems}
+ \item[Module \module{#2} (section \ref{module-\py at modulekey}):]
+ #3
+ \end{fulllineitems}
+ }
+\fi
+
+% \seelink{url}{link text}{why it's interesting}
+\newcommand{\py at seelink}[3]{%
+ \par
+ \begin{fulllineitems}
+ \item[\ulink{#2}{#1}]
+ #3
+ \end{fulllineitems}
+}
+% \seetitle[url]{title}{why it's interesting}
+\newcommand{\py at seetitle}[3][\py at modulebadkey]{%
+ \par
+ \begin{fulllineitems}
+ \item[\citetitle{#2}]
+ \ifx\py at modulebadkey#1\else
+ \item[{\small{(\url{#1})}}]
+ \fi
+ #3
+ \end{fulllineitems}
+}
+% \seepep{number}{title}{why it's interesting}
+\newcommand{\py at seepep}[3]{%
+ \par%
+ \begin{fulllineitems}
+ \item[\pep{#1}, ``\emph{#2}'']
+ #3
+ \end{fulllineitems}
+}
+% \seerfc{number}{title}{why it's interesting}
+\newcommand{\py at seerfc}[3]{%
+ \par%
+ \begin{fulllineitems}
+ \item[\rfc{#1}, ``\emph{#2}'']
+ #3
+ \end{fulllineitems}
+}
+% \seeurl{url}{why it's interesting}
+\newcommand{\py at seeurl}[2]{%
+ \par%
+ \begin{fulllineitems}
+ \item[\url{#1}]
+ #2
+ \end{fulllineitems}
+}
+
+\newenvironment{seealso*}{
+ \par
+ \def\seetext##1{\par{##1}}
+ \let\seemodule=\py at seemodule
+ \let\seepep=\py at seepep
+ \let\seerfc=\py at seerfc
+ \let\seetitle=\py at seetitle
+ \let\seeurl=\py at seeurl
+ \let\seelink=\py at seelink
+}{\par}
+\newenvironment{seealso}{
+ \par
+ \strong{See Also:}
+ \par
+ \def\seetext##1{\par{##1}}
+ \let\seemodule=\py at seemodule
+ \let\seepep=\py at seepep
+ \let\seerfc=\py at seerfc
+ \let\seetitle=\py at seetitle
+ \let\seeurl=\py at seeurl
+ \let\seelink=\py at seelink
+}{\par}
+
+% Allow the Python release number to be specified independently of the
+% \date{}. This allows the date to reflect the document's date and
+% release to specify the Python release that is documented.
+%
+\newcommand{\py at release}{}
+\newcommand{\version}{}
+\newcommand{\shortversion}{}
+\newcommand{\releaseinfo}{}
+\newcommand{\releasename}{Release}
+\newcommand{\release}[1]{%
+ \renewcommand{\py at release}{\releasename\space\version}%
+ \renewcommand{\version}{#1}}
+\newcommand{\setshortversion}[1]{%
+ \renewcommand{\shortversion}{#1}}
+\newcommand{\setreleaseinfo}[1]{%
+ \renewcommand{\releaseinfo}{#1}}
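+% These are normally fed from a generated patchlevel file rather than typed
+% by hand; hypothetical values for illustration:
+%   \release{2.5}
+%   \setreleaseinfo{a1}
+%   \setshortversion{2.5}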
+
+% Allow specification of the author's address separately from the
+% author's name. This can be used to format them differently, which
+% is a good thing.
+%
+\newcommand{\py at authoraddress}{}
+\newcommand{\authoraddress}[1]{\renewcommand{\py at authoraddress}{#1}}
+\let\developersaddress=\authoraddress
+\let\developer=\author
+\let\developers=\author
+
+% This sets up the fancy chapter headings that make the documents look
+% at least a little better than the usual LaTeX output.
+%
+\@ifundefined{ChTitleVar}{}{
+ \ChNameVar{\raggedleft\normalsize\py at HeaderFamily}
+ \ChNumVar{\raggedleft \bfseries\Large\py at HeaderFamily}
+ \ChTitleVar{\raggedleft \rm\Huge\py at HeaderFamily}
+ % This creates chapter heads without the leading \vspace*{}:
+ \def\@makechapterhead#1{%
+ {\parindent \z@ \raggedright \normalfont
+ \ifnum \c at secnumdepth >\m at ne
+ \DOCH
+ \fi
+ \interlinepenalty\@M
+ \DOTI{#1}
+ }
+ }
+}
+
+
+% Definition lists; requested by AMK for HOWTO documents. Probably useful
+% elsewhere as well, so keep it in the general style support.
+%
+\newenvironment{definitions}{%
+ \begin{description}%
+ \def\term##1{\item[##1]\mbox{}\\*[0mm]}
+}{%
+ \end{description}%
+}
+
+% Tell TeX about pathological hyphenation cases:
+\hyphenation{Base-HTTP-Re-quest-Hand-ler}
Added: sandbox/trunk/pdb/Doc/texinputs/underscore.sty
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/texinputs/underscore.sty Wed Jul 5 14:19:15 2006
@@ -0,0 +1,232 @@
+% underscore.sty 12-Oct-2001 Donald Arseneau asnd at triumf.ca
+% Make the "_" character print as "\textunderscore" in text.
+% Copyright 1998,2001 Donald Arseneau; Distribute freely if unchanged.
+% Instructions follow after the definitions.
+
+\ProvidesPackage{underscore}[2001/10/12]
+
+\begingroup
+ \catcode`\_=\active
+ \gdef_{% \relax % No relax gives a small vulnerability in alignments
+ \ifx\if at safe@actives\iftrue % must be outermost test!
+ \string_%
+ \else
+ \ifx\protect\@typeset at protect
+ \ifmmode \sb \else \BreakableUnderscore \fi
+ \else
+ \ifx\protect\@unexpandable at protect \noexpand_%
+ \else \protect_%
+ \fi\fi
+ \fi}
+\endgroup
+
+% At begin: set catcode; fix \long \ttdefault so I can use it in comparisons;
+\AtBeginDocument{%
+ {\immediate\write\@auxout{\catcode\number\string`\_ \string\active}}%
+ \catcode\string`\_\string=\active
+ \edef\ttdefault{\ttdefault}%
+}
+
+\newcommand{\BreakableUnderscore}{\leavevmode\nobreak\hskip\z at skip
+ \ifx\f at family\ttdefault \string_\else \textunderscore\fi
+ \usc at dischyph\nobreak\hskip\z at skip}
+
+\DeclareRobustCommand{\_}{%
+ \ifmmode \nfss at text{\textunderscore}\else \BreakableUnderscore \fi}
+
+\let\usc at dischyph\@dischyph
+\DeclareOption{nohyphen}{\def\usc at dischyph{\discretionary{}{}{}}}
+\DeclareOption{strings}{\catcode`\_=\active}
+
+\ProcessOptions
+\ifnum\catcode`\_=\active\else \endinput \fi
+
+%%%%%%%% Redefine commands that use character strings %%%%%%%%
+
+\@ifundefined{UnderscoreCommands}{\let\UnderscoreCommands\@empty}{}
+\expandafter\def\expandafter\UnderscoreCommands\expandafter{%
+ \UnderscoreCommands
+ \do\include \do\includeonly
+ \do\@input \do\@iinput \do\InputIfFileExists
+ \do\ref \do\pageref \do\newlabel
+ \do\bibitem \do\@bibitem \do\cite \do\nocite \do\bibcite
+}
+
+% Macro to redefine a macro to pre-process its string argument
+% with \protect -> \string.
+\def\do#1{% Avoid double processing if user includes command twice!
+ \@ifundefined{US\string_\expandafter\@gobble\string#1}{%
+ \edef\@tempb{\meaning#1}% Check if macro is just a protection shell...
+ \def\@tempc{\protect}%
+ \edef\@tempc{\meaning\@tempc\string#1\space\space}%
+ \ifx\@tempb\@tempc % just a shell: hook into the protected inner command
+ \expandafter\do
+ \csname \expandafter\@gobble\string#1 \expandafter\endcsname
+ \else % Check if macro takes an optional argument
+ \def\@tempc{\@ifnextchar[}%
+ \edef\@tempa{\def\noexpand\@tempa####1\meaning\@tempc}%
+ \@tempa##2##3\@tempa{##2\relax}%
+ \edef\@tempb{\meaning#1\meaning\@tempc}%
+ \edef\@tempc{\noexpand\@tempd \csname
+ US\string_\expandafter\@gobble\string#1\endcsname}%
+ \if \expandafter\@tempa\@tempb \relax 12\@tempa % then no optional arg
+ \@tempc #1\US at prot
+ \else % There is optional arg
+ \@tempc #1\US at protopt
+ \fi
+ \fi
+ }{}}
+
+\def\@tempd#1#2#3{\let#1#2\def#2{#3#1}}
+
+\def\US at prot#1#2{\let\@@protect\protect \let\protect\string
+ \edef\US at temp##1{##1{#2}}\restore at protect\US at temp#1}
+\def\US at protopt#1{\@ifnextchar[{\US at protarg#1}{\US at prot#1}}
+\def\US at protarg #1[#2]{\US at prot{{#1[#2]}}}
+
+\UnderscoreCommands
+\let\do\relax \let\@tempd\relax % un-do
+
+%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
+
+\endinput
+
+underscore.sty 12-Oct-2001 Donald Arseneau
+
+Features:
+~~~~~~~~~
+\_ prints an underscore so that the hyphenation of constituent words
+is not affected and hyphenation is permitted after the underscore.
+For example, "compound\_fracture" hyphenates as com- pound_- frac- ture.
+If you prefer the underscore to break without a hyphen (but still with
+the same rules for explicit hyphen-breaks) then use the [nohyphen]
+package option.
+
+A simple _ acts just like \_ in text mode, but makes a subscript in
+math mode: activation_energy $E_a$
+
+Both forms use an underscore character if the font encoding contains
+one (e.g., "\usepackage[T1]{fontenc}" or typewriter fonts in any encoding),
+but they use a rule if there is no proper character.
+
+Deficiencies:
+~~~~~~~~~~~~~
+The skips and penalties ruin any kerning with the underscore character
+(when a character is used). However, there doesn't seem to be much, if
+any, such kerning in the ec fonts, and there is never any kerning with
+a rule.
+
+You must avoid "_" in file names and in cite or ref tags, or you must use
+the babel package, with its active-character controls, or you must give
+the [strings] option, which attempts to redefine several commands (and
+may not work perfectly). Even without the [strings] option or babel, you
+can use occasional underscores like: "\include{file\string_name}".
+
+Option: [strings]
+~~~~~~~~~~~~~~~~~
+The default operation is quite simple and needs no customization; but
+you must avoid using "_" in any place where LaTeX uses an argument as
+a string of characters for some control function or as a name. These
+include the tags for \cite and \ref, file names for \input, \include,
+and \includegraphics, environment names, counter names, and placement
+parameters (like "[t]"). The problem with these contexts is that they
+are `moving arguments' but LaTeX does not `switch on' the \protect
+mechanism for them.
+
+If you need to use the underscore character in these places, the package
+option [strings] is provided to redefine commands taking a string argument
+so that the argument is protected (with \protect -> \string). The list
+of commands is given in "\UnderscoreCommands", with "\do" before each,
+covering \cite, \ref, \input, and their variants. Not included are many
+commands regarding font names, everything with counter names, environment
+names, page styles, and versions of \ref and \cite defined by external
+packages (e.g. \vref and \citeyear).
+
+You can add to the list of supported commands by defining \UnderscoreCommands
+before loading this package; e.g.
+
+ \usepackage{chicago}
+ \newcommand{\UnderscoreCommands}{% (\cite already done)
+ \do\citeNP \do\citeA \do\citeANP \do\citeN \do\shortcite
+ \do\shortciteNP \do\shortciteA \do\shortciteANP \do\shortciteN
+ \do\citeyear \do\citeyearNP
+ }
+ \usepackage[strings]{underscore}
+
+Not all commands can be supported this way! Only commands that take a
+string argument *first* can be protected. One optional argument before
+the string argument is also permitted, as exemplified by \cite: both
+\cite{tags} and \cite[text]{tags} are allowed. A command like
+\@addtoreset which takes two counter names as arguments could not
+be protected by adding it to \UnderscoreCommands.
+
+!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+!! When you use the [strings] option, you must load this package !!
+!! last (or nearly last). !!
+!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+
+There are two reasons: 1) The redefinitions done for protection must come
+after other packages define their customized versions of those commands.
+2) The [strings] option requires the _ character to be activated immediately
+in order for the cite and ref tags to be read properly from the .aux file
+as plain strings, and this catcode setting might disrupt other packages.
+
+The babel package implements a protection mechanism for many commands,
+and will be a complete fix for most documents without the [strings] option.
+Many add-on packages are compatible with babel, so they will get the
+strings protection also. However, there are several commands that are
+not covered by babel, but can easily be supported by the [strings] and
+\UnderscoreCommands mechanism. Beware that using both [strings] and babel
+may lead to conflicts, but does appear to work (load babel last).
+
+Implementation Notes:
+~~~~~~~~~~~~~~~~~~~~~
+The first setting of "_" to be an active character is performed in a local
+group so as to not interfere with other packages. The catcode setting
+is repeated with \AtBeginDocument so the definition is in effect for the
+text. However, the catcode setting is repeated immediately when the
+[strings] option is detected.
+
+The definition of the active "_" is essentially:
+ \ifmmode \sb \else \BreakableUnderscore \fi
+where "\sb" retains the normal subscript meaning of "_" and where
+"\BreakableUnderscore" is essentially "\_". The rest of the definition
+handles the "\protect"ion without causing \relax to be inserted before
+the character.
+
+\BreakableUnderscore uses "\nobreak\hskip\z at skip" to separate the
+underscore from surrounding words, thus allowing TeX to hyphenate them,
+but preventing free breaks around the underscore. Next, it checks the
+current font family, and uses the underscore character from tt fonts or
+otherwise \textunderscore (which is a character or rule depending on
+the font encoding). After the underscore, it inserts a discretionary
+hyphenation point as "\usc at dischyph", which is usually just "\-"
+except that it still works in the tabbing environment, although it
+will give "\discretionary{}{}{}" under the [nohyphen] option. After
+that, another piece of non-breaking interword glue is inserted.
+Ordinarily, the comparison "\ifx\f at family\ttdefault" will always fail
+because \ttdefault is `long' where \f at family is not (boooo hisss), but
+\ttdefault is redefined to be non-long by "\AtBeginDocument".
+
+The "\_" command is then defined to use "\BreakableUnderscore".
+
+If the [strings] option is not given, then that is all!
+
+Under the [strings] option, the list of special commands is processed to:
+- retain the original command as \US_command (\US_ref)
+- redefine the command as \US at prot\US_command for ordinary commands
+ (\ref -> \US at prot\US_ref) or as \US at protopt\US_command when an optional
+ argument is possible (\bibitem -> \US at protopt\US_bibitem).
+- self-protecting commands (\cite) retain their self-protection.
+Diagnosing the state of the pre-existing command is done by painful
+contortions involving \meaning.
+
+\US at prot and \US at protopt read the argument, process it with \protect
+enabled, then invoke the saved \US_command.
+
+Modifications:
+~~~~~~~~~~~~~~
+12-Oct-2001 Babel (safe at actives) compatibility and [nohyphen] option.
+
+Test file integrity: ASCII 32-57, 58-126: !"#$%&'()*+,-./0123456789
+:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~
Added: sandbox/trunk/pdb/Doc/tools/buildindex.py
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/tools/buildindex.py Wed Jul 5 14:19:15 2006
@@ -0,0 +1,388 @@
+#! /usr/bin/env python
+
+__version__ = '$Revision: 1.1 $'
+
+import os.path
+import re
+import string
+import sys
+
+from xml.sax.saxutils import quoteattr
+
+
+bang_join = "!".join
+null_join = "".join
+
+REPLACEMENTS = [
+ # Hackish way to deal with macros replaced with simple text
+ (re.compile(r"\\ABC\b"), "ABC"),
+ (re.compile(r"\\ASCII\b"), "ASCII"),
+ (re.compile(r"\\Cpp\b"), "C++"),
+ (re.compile(r"\\EOF\b"), "EOF"),
+ (re.compile(r"\\NULL\b"), "NULL"),
+ (re.compile(r"\\POSIX\b"), "POSIX"),
+ (re.compile(r"\\UNIX\b"), "Unix"),
+ # deal with turds left over from LaTeX2HTML
+ (re.compile(r"<#\d+#>"), ""),
+ ]
+
+class Node:
+ continuation = 0
+
+ def __init__(self, link, str, seqno):
+ self.links = [link]
+ self.seqno = seqno
+ for pattern, replacement in REPLACEMENTS:
+ str = pattern.sub(replacement, str)
+ # build up the text
+ self.text = split_entry_text(str)
+ self.key = split_entry_key(str)
+
+ def __cmp__(self, other):
+ """Comparison operator includes sequence number, for use with
+ list.sort()."""
+ return self.cmp_entry(other) or cmp(self.seqno, other.seqno)
+
+ def cmp_entry(self, other):
+ """Comparison 'operator' that ignores sequence number."""
+ c = 0
+ for i in range(min(len(self.key), len(other.key))):
+ c = (cmp_part(self.key[i], other.key[i])
+ or cmp_part(self.text[i], other.text[i]))
+ if c:
+ break
+ return c or cmp(self.key, other.key) or cmp(self.text, other.text)
+
+ def __repr__(self):
+        return "<Node for %s (%s)>" % (bang_join(self.text), self.seqno)
+
+ def __str__(self):
+ return bang_join(self.key)
+
+ def dump(self):
+ return "%s\1%s###%s\n" \
+ % ("\1".join(self.links),
+ bang_join(self.text),
+ self.seqno)
+
+
+def cmp_part(s1, s2):
+ result = cmp(s1, s2)
+ if result == 0:
+ return 0
+ l1 = s1.lower()
+ l2 = s2.lower()
+ minlen = min(len(s1), len(s2))
+ if len(s1) < len(s2) and l1 == l2[:len(s1)]:
+ result = -1
+ elif len(s2) < len(s1) and l2 == l1[:len(s2)]:
+ result = 1
+ else:
+ result = cmp(l1, l2) or cmp(s1, s2)
+ return result
+
+
+def split_entry(str, which):
+ stuff = []
+ parts = str.split('!')
+ parts = [part.split('@') for part in parts]
+ for entry in parts:
+ if len(entry) != 1:
+ key = entry[which]
+ else:
+ key = entry[0]
+ stuff.append(key)
+ return stuff
+
+
+_rmtt = re.compile(r"""(.*)<tt(?: class=[a-z0-9]+)?>(.*)</tt>(.*)$""",
+                   re.IGNORECASE)
+_rmparens = re.compile(r"\(\)")
+
+def split_entry_key(str):
+ parts = split_entry(str, 1)
+ for i in range(len(parts)):
+ m = _rmtt.match(parts[i])
+ if m:
+ parts[i] = null_join(m.group(1, 2, 3))
+ else:
+ parts[i] = parts[i].lower()
+ # remove '()' from the key:
+ parts[i] = _rmparens.sub('', parts[i])
+ return map(trim_ignored_letters, parts)
+
+
+def split_entry_text(str):
+ if '<' in str:
+ m = _rmtt.match(str)
+ if m:
+ str = null_join(m.group(1, 2, 3))
+ return split_entry(str, 1)
+
+
+def load(fp):
+ nodes = []
+ rx = re.compile("(.*)\1(.*)###(.*)$")
+ while 1:
+ line = fp.readline()
+ if not line:
+ break
+ m = rx.match(line)
+ if m:
+ link, str, seqno = m.group(1, 2, 3)
+ nodes.append(Node(link, str, seqno))
+ return nodes
+
+
+def trim_ignored_letters(s):
+ # ignore $ to keep environment variables with the
+ # leading letter from the name
+ if s.startswith("$"):
+ return s[1:].lower()
+ else:
+ return s.lower()
+
+def get_first_letter(s):
+    if s.startswith("<tex2html_percent_mark>"):
+ return "%"
+ else:
+ return trim_ignored_letters(s)[0]
+
+
+def split_letters(nodes):
+ letter_groups = []
+ if nodes:
+ group = []
+ append = group.append
+ letter = get_first_letter(nodes[0].text[0])
+ letter_groups.append((letter, group))
+ for node in nodes:
+ nletter = get_first_letter(node.text[0])
+ if letter != nletter:
+ letter = nletter
+ group = []
+ letter_groups.append((letter, group))
+ append = group.append
+ append(node)
+ return letter_groups
+
+
+def group_symbols(groups):
+ entries = []
+ ident_letters = string.ascii_letters + "_"
+ while groups[0][0] not in ident_letters:
+ entries += groups[0][1]
+ del groups[0]
+ if entries:
+ groups.insert(0, ("Symbols", entries))
+
+
+# need a function to separate the nodes into columns...
+def split_columns(nodes, columns=1):
+ if columns <= 1:
+ return [nodes]
+ # This is a rough height; we may have to increase to avoid breaks before
+ # a subitem.
+ colheight = int(len(nodes) / columns)
+ numlong = int(len(nodes) % columns)
+ if numlong:
+ colheight = colheight + 1
+ else:
+ numlong = columns
+ cols = []
+ for i in range(numlong):
+ start = i * colheight
+ end = start + colheight
+ cols.append(nodes[start:end])
+ del nodes[:end]
+ colheight = colheight - 1
+ try:
+ numshort = int(len(nodes) / colheight)
+ except ZeroDivisionError:
+ cols = cols + (columns - len(cols)) * [[]]
+ else:
+ for i in range(numshort):
+ start = i * colheight
+ end = start + colheight
+ cols.append(nodes[start:end])
+ #
+ # If items continue across columns, make sure they are marked
+ # as continuations so the user knows to look at the previous column.
+ #
+ for i in range(len(cols) - 1):
+ try:
+ prev = cols[i][-1]
+ next = cols[i + 1][0]
+ except IndexError:
+ return cols
+ else:
+ n = min(len(prev.key), len(next.key))
+ for j in range(n):
+ if prev.key[j] != next.key[j]:
+ break
+ next.continuation = j + 1
+ return cols
+
+
+DL_LEVEL_INDENT = " "
+
+def format_column(nodes):
+    strings = ["<dl compact>"]
+ append = strings.append
+ level = 0
+ previous = []
+ for node in nodes:
+ current = node.text
+ count = 0
+ for i in range(min(len(current), len(previous))):
+ if previous[i] != current[i]:
+ break
+ count = i + 1
+ if count > level:
+ append("
\n\n" \
+ % (quoteattr("letter-" + letter), lettername)
+
+
+def format_html_letters(nodes, columns, group_symbol_nodes):
+ letter_groups = split_letters(nodes)
+ if group_symbol_nodes:
+ group_symbols(letter_groups)
+ items = []
+ for letter, nodes in letter_groups:
+        s = "<a href=\"#letter-%s\">%s</a>" % (letter, letter)
+ items.append(s)
+    s = ["<center>\n%s\n</center>\n" % " |\n".join(items)]
+ for letter, nodes in letter_groups:
+ s.append(format_letter(letter))
+ s.append(format_nodes(nodes, columns))
+ return null_join(s)
+
+def format_html(nodes, columns):
+ return format_nodes(nodes, columns)
+
+
+def collapse(nodes):
+ """Collapse sequences of nodes with matching keys into a single node.
+ Destructive."""
+ if len(nodes) < 2:
+ return
+ prev = nodes[0]
+ i = 1
+ while i < len(nodes):
+ node = nodes[i]
+ if not node.cmp_entry(prev):
+ prev.links.append(node.links[0])
+ del nodes[i]
+ else:
+ i = i + 1
+ prev = node
+
+
+def dump(nodes, fp):
+ for node in nodes:
+ fp.write(node.dump())
+
+
+def process_nodes(nodes, columns, letters=0, group_symbol_nodes=0):
+ nodes.sort()
+ collapse(nodes)
+ if letters:
+ return format_html_letters(nodes, columns, group_symbol_nodes)
+ else:
+ return format_html(nodes, columns)
+
+
+def main():
+ import getopt
+ ifn = "-"
+ ofn = "-"
+ columns = 1
+ letters = 0
+ group_symbol_nodes = 1
+ opts, args = getopt.getopt(sys.argv[1:], "c:lo:",
+ ["columns=", "dont-group-symbols",
+ "group-symbols", "letters", "output="])
+ for opt, val in opts:
+ if opt in ("-o", "--output"):
+ ofn = val
+ elif opt in ("-c", "--columns"):
+ columns = int(val, 10)
+ elif opt in ("-l", "--letters"):
+ letters = 1
+ elif opt == "--group-symbols":
+ group_symbol_nodes = 1
+ elif opt == "--dont-group-symbols":
+ group_symbol_nodes = 0
+ if not args:
+ args = [ifn]
+ nodes = []
+ for fn in args:
+ nodes = nodes + load(open(fn))
+ num_nodes = len(nodes)
+ html = process_nodes(nodes, columns, letters, group_symbol_nodes)
+ program = os.path.basename(sys.argv[0])
+ if ofn == "-":
+ sys.stdout.write(html)
+ sys.stderr.write("\n%s: %d index nodes" % (program, num_nodes))
+ else:
+ open(ofn, "w").write(html)
+ print
+ print "%s: %d index nodes" % (program, num_nodes)
+
+
+if __name__ == "__main__":
+ main()
Added: sandbox/trunk/pdb/Doc/tools/checkargs.pm
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/tools/checkargs.pm Wed Jul 5 14:19:15 2006
@@ -0,0 +1,112 @@
+#! /usr/bin/perl
+
+package checkargs;
+require 5.004; # uses "for my $var"
+require Exporter;
+ at ISA = qw(Exporter);
+ at EXPORT = qw(check_args check_args_range check_args_at_least);
+use strict;
+use Carp;
+
+=head1 NAME
+
+checkargs -- Provide rudimentary argument checking for perl5 functions
+
+=head1 SYNOPSIS
+
+ check_args(cArgsExpected, @_)
+ check_args_range(cArgsMin, cArgsMax, @_)
+ check_args_at_least(cArgsMin, @_)
+where "@_" should be supplied literally.
+
+=head1 DESCRIPTION
+
+As the first line of user-written subroutine foo, do one of the following:
+
+ my ($arg1, $arg2) = check_args(2, @_);
+ my ($arg1, @rest) = check_args_range(1, 4, @_);
+ my ($arg1, @rest) = check_args_at_least(1, @_);
+ my @args = check_args_at_least(0, @_);
+
+These functions may also be called for side effect (put a call to one
+of the functions near the beginning of the subroutine), but using the
+argument checkers to set the argument list is the recommended usage.
+
+The number of arguments and their definedness are checked; if the wrong
+number are received, the program exits with an error message.
+
+=head1 AUTHOR
+
+Michael D. Ernst
+
+=cut
+
+## Need to check that use of caller(1) really gives desired results.
+## Need to give input chunk information.
+## Is this obviated by Perl 5.003's declarations? Not entirely, I think.
+
+sub check_args ( $@ )
+{
+ my ($num_formals, @args) = @_;
+ my ($pack, $file_arg, $line_arg, $subname, $hasargs, $wantarr) = caller(1);
+ if (@_ < 1) { croak "check_args needs at least 7 args, got ", scalar(@_), ": @_\n "; }
+ if ((!wantarray) && ($num_formals != 0))
+ { croak "check_args called in scalar context"; }
+ # Can't use croak below here: it would only go out to caller, not its caller
+ my $num_actuals = @args;
+ if ($num_actuals != $num_formals)
+ { die "$file_arg:$line_arg: function $subname expected $num_formals argument",
+ (($num_formals == 1) ? "" : "s"),
+ ", got $num_actuals",
+ (($num_actuals == 0) ? "" : ": @args"),
+ "\n"; }
+ for my $index (0..$#args)
+ { if (!defined($args[$index]))
+ { die "$file_arg:$line_arg: function $subname undefined argument ", $index+1, ": @args[0..$index-1]\n"; } }
+ return @args;
+}
+
+sub check_args_range ( $$@ )
+{
+ my ($min_formals, $max_formals, @args) = @_;
+ my ($pack, $file_arg, $line_arg, $subname, $hasargs, $wantarr) = caller(1);
+ if (@_ < 2) { croak "check_args_range needs at least 8 args, got ", scalar(@_), ": @_"; }
+ if ((!wantarray) && ($max_formals != 0) && ($min_formals !=0) )
+ { croak "check_args_range called in scalar context"; }
+ # Can't use croak below here: it would only go out to caller, not its caller
+ my $num_actuals = @args;
+ if (($num_actuals < $min_formals) || ($num_actuals > $max_formals))
+ { die "$file_arg:$line_arg: function $subname expected $min_formals-$max_formals arguments, got $num_actuals",
+ ($num_actuals == 0) ? "" : ": @args", "\n"; }
+ for my $index (0..$#args)
+ { if (!defined($args[$index]))
+ { die "$file_arg:$line_arg: function $subname undefined argument ", $index+1, ": @args[0..$index-1]\n"; } }
+ return @args;
+}
+
+sub check_args_at_least ( $@ )
+{
+ my ($min_formals, @args) = @_;
+ my ($pack, $file_arg, $line_arg, $subname, $hasargs, $wantarr) = caller(1);
+ # Don't do this, because we want every sub to start with a call to check_args*
+ # if ($min_formals == 0)
+ # { die "Isn't it pointless to check for at least zero args to $subname?\n"; }
+ if (scalar(@_) < 1)
+ { croak "check_args_at_least needs at least 1 arg, got ", scalar(@_), ": @_"; }
+ if ((!wantarray) && ($min_formals != 0))
+ { croak "check_args_at_least called in scalar context"; }
+ # Can't use croak below here: it would only go out to caller, not its caller
+ my $num_actuals = @args;
+ if ($num_actuals < $min_formals)
+ { die "$file_arg:$line_arg: function $subname expected at least $min_formals argument",
+ ($min_formals == 1) ? "" : "s",
+ ", got $num_actuals",
+ ($num_actuals == 0) ? "" : ": @args", "\n"; }
+ for my $index (0..$#args)
+ { if (!defined($args[$index]))
+ { warn "$file_arg:$line_arg: function $subname undefined argument ", $index+1, ": @args[0..$index-1]\n"; last; } }
+ return @args;
+}
+
+1; # successful import
+__END__
Added: sandbox/trunk/pdb/Doc/tools/getversioninfo
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/tools/getversioninfo Wed Jul 5 14:19:15 2006
@@ -0,0 +1,63 @@
+#! /usr/bin/env python
+
+import os
+import re
+import sys
+
+try:
+ __file__
+except NameError:
+ __file__ = sys.argv[0]
+
+tools = os.path.dirname(os.path.abspath(__file__))
+Doc = os.path.dirname(tools)
+src = os.path.dirname(Doc)
+patchlevel_h = os.path.join(Doc, "tools", "patchlevel.h")
+
+# This won't pick out all #defines, but it will pick up the ones we
+# care about.
+rx = re.compile(r"\s*#define\s+([a-zA-Z][a-zA-Z_0-9]*)\s+([a-zA-Z_0-9]+)")
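+# For example, a line like '#define PY_MAJOR_VERSION 2' (illustrative)
+# yields name='PY_MAJOR_VERSION' and value='2'.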
+
+d = {}
+f = open(patchlevel_h)
+for line in f:
+ m = rx.match(line)
+ if m is not None:
+ name, value = m.group(1, 2)
+ d[name] = value
+f.close()
+
+release = "%s.%s" % (d["PY_MAJOR_VERSION"], d["PY_MINOR_VERSION"])
+micro = int(d["PY_MICRO_VERSION"])
+shortversion = release
+if micro != 0:
+ release += "." + str(micro)
+level = d["PY_RELEASE_LEVEL"]
+
+releaseinfo = "mpdb"
+
+def write_file(name, text):
+ """Write text to a file if the file doesn't exist or if text
+ differs from any existing content."""
+ if os.path.exists(name):
+ f = open(name, "r")
+ s = f.read()
+ f.close()
+ if s == text:
+ return
+ f = open(name, "w")
+ f.write(text)
+ f.close()
+
+patchlevel_tex = os.path.join(Doc, "commontex", "patchlevel.tex")
+
+write_file(patchlevel_tex,
+ "%% This file is generated by ../tools/getversioninfo;\n"
+ "%% do not edit manually.\n"
+ "\n"
+ "\\release{%s}\n"
+ "\\setreleaseinfo{%s}\n"
+ "\\setshortversion{%s}\n"
+ % (release, releaseinfo, shortversion))
+
+print release + releaseinfo
Added: sandbox/trunk/pdb/Doc/tools/indfix.py
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/tools/indfix.py Wed Jul 5 14:19:15 2006
@@ -0,0 +1,100 @@
+#! /usr/bin/env python
+
+"""Combine similar index entries into an entry and subentries.
+
+For example:
+
+ \item {foobar} (in module flotz), 23
+ \item {foobar} (in module whackit), 4323
+
+becomes
+
+ \item {foobar}
+ \subitem in module flotz, 23
+ \subitem in module whackit, 4323
+
+Note that an item which matches the format of a collapsable item but which
+isn't part of a group of similar items is not modified.
+"""
+__version__ = '$Revision: 1.1 $'
+
+import re
+import StringIO
+import sys
+
+
+def cmp_entries(e1, e2):
+ return cmp(e1[1].lower(), e2[1].lower()) or cmp(e1, e2)
+
+
+def dump_entries(write, entries):
+ if len(entries) == 1:
+ write(" \\item %s (%s)%s\n" % entries[0])
+ return
+ write(" \item %s\n" % entries[0][0])
+ # now sort these in a case insensitive manner:
+ if len(entries) > 0:
+ entries.sort(cmp_entries)
+ for xxx, subitem, pages in entries:
+ write(" \subitem %s%s\n" % (subitem, pages))
+
+
+breakable_re = re.compile(
+ r" \\item (.*) [(](.*)[)]((?:(?:, \d+)|(?:, \\[a-z]*\{\d+\}))+)")
+
+
+def process(ifn, ofn=None):
+ if ifn == "-":
+ ifp = sys.stdin
+ else:
+ ifp = open(ifn)
+ if ofn is None:
+ ofn = ifn
+ ofp = StringIO.StringIO()
+ entries = []
+ match = breakable_re.match
+ write = ofp.write
+ while 1:
+ line = ifp.readline()
+ if not line:
+ break
+ m = match(line)
+ if m:
+ entry = m.group(1, 2, 3)
+ if entries and entries[-1][0] != entry[0]:
+ dump_entries(write, entries)
+ entries = []
+ entries.append(entry)
+ elif entries:
+ dump_entries(write, entries)
+ entries = []
+ write(line)
+ else:
+ write(line)
+ del write
+ del match
+ ifp.close()
+ data = ofp.getvalue()
+ ofp.close()
+ if ofn == "-":
+ ofp = sys.stdout
+ else:
+ ofp = open(ofn, "w")
+ ofp.write(data)
+ ofp.close()
+
+
+def main():
+ import getopt
+ outfile = None
+ opts, args = getopt.getopt(sys.argv[1:], "o:")
+ for opt, val in opts:
+ if opt in ("-o", "--output"):
+ outfile = val
+ filename = args[0]
+ outfile = outfile or filename
+ process(filename, outfile)
+
+
+if __name__ == "__main__":
+ main()
Added: sandbox/trunk/pdb/Doc/tools/mkhowto
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/tools/mkhowto Wed Jul 5 14:19:15 2006
@@ -0,0 +1,659 @@
+#! /usr/bin/env python
+# -*- Python -*-
+"""usage: %(program)s [options...] file ...
+
+Options specifying formats to build:
+ --html HyperText Markup Language (default)
+ --pdf Portable Document Format
+ --ps PostScript
+ --dvi 'DeVice Independent' format from TeX
+ --text ASCII text (requires lynx)
+
+ More than one output format may be specified, or --all.
+
+HTML options:
+ --address, -a Specify an address for page footers.
+ --dir Specify the directory for HTML output.
+ --link Specify the number of levels to include on each page.
+ --split, -s Specify a section level for page splitting, default: %(max_split_depth)s.
+ --iconserver, -i Specify location of icons (default: ./).
+ --image-type Specify the image type to use in HTML output;
+ values: gif, png (default).
+ --numeric Don't rename the HTML files; just keep node#.html for
+ the filenames.
+ --style Specify the CSS file to use for the output (filename,
+ not a URL).
+ --up-link URL to a parent document.
+ --up-title Title of a parent document.
+ --favicon Icon to display in the browser's location bar.
+
+Other options:
+ --a4 Format for A4 paper.
+ --letter Format for US letter paper (the default).
+ --help, -H Show this text.
+ --logging, -l Log stdout and stderr to a file (*.how).
+ --debugging, -D Echo commands as they are executed.
+ --keep, -k Keep temporary files around.
+ --quiet, -q Do not print command output to stdout.
+ (stderr is also lost, sorry; see *.how for errors)
+"""
+
+import getopt
+import glob
+import os
+import re
+import shutil
+import sys
+
+
+MYDIR = os.path.abspath(sys.path[0])
+TOPDIR = os.path.dirname(MYDIR)
+
+ISTFILE = os.path.join(TOPDIR, "texinputs", "python.ist")
+NODE2LABEL_SCRIPT = os.path.join(MYDIR, "node2label.pl")
+L2H_INIT_FILE = os.path.join(TOPDIR, "perl", "l2hinit.perl")
+
+BIBTEX_BINARY = "bibtex"
+DVIPS_BINARY = "dvips"
+LATEX_BINARY = "latex"
+LATEX2HTML_BINARY = "latex2html"
+LYNX_BINARY = "lynx"
+MAKEINDEX_BINARY = "makeindex"
+PDFLATEX_BINARY = "pdflatex"
+PERL_BINARY = "perl"
+PYTHON_BINARY = "python"
+
+
+def usage(options, file):
+ print >>file, __doc__ % options
+
+def error(options, message, err=2):
+ print >>sys.stderr, message
+ print >>sys.stderr
+ usage(options, sys.stderr)
+ sys.exit(2)
+
+
+class Options:
+ program = os.path.basename(sys.argv[0])
+ #
+ address = ''
+ builddir = None
+ debugging = 0
+ discard_temps = 1
+ have_temps = 0
+ icon_server = "."
+ image_type = "png"
+ logging = 0
+ max_link_depth = 3
+ max_split_depth = 6
+ paper = "letter"
+ quiet = 0
+ runs = 0
+ numeric = 0
+ global_module_index = None
+ style_file = os.path.join(TOPDIR, "html", "style.css")
+ about_file = os.path.join(TOPDIR, "html", "about.dat")
+ up_link = None
+ up_title = None
+ favicon = None
+ #
+ # 'dvips_safe' is a weird option. It is used mostly to make
+ # LaTeX2HTML not try to be too smart about protecting the user
+ # from a bad version of dvips -- some versions would core dump if
+    # the path to the source DVI contained a dot, and it's apparently
+ # difficult to determine if the version available has that bug.
+ # This option gets set when PostScript output is requested
+ # (because we're going to run dvips regardless, and we'll either
+ # know it succeeds before LaTeX2HTML is run, or we'll have
+ # detected the failure and bailed), or the user asserts that it's
+ # safe from the command line.
+ #
+ # So, why does LaTeX2HTML think it appropriate to protect the user
+ # from a dvips that's only potentially going to core dump? Only
+ # because they want to avoid doing a lot of work just to have to
+ # bail later with no useful intermediates. Unfortunately, they
+ # bail *before* they know whether dvips will be needed at all.
+ # I've gone around the bush a few times with the LaTeX2HTML
+ # developers over whether this is appropriate behavior, and they
+ # don't seem interested in changing their position.
+ #
+ dvips_safe = 0
+ #
+ DEFAULT_FORMATS = ("html",)
+ ALL_FORMATS = ("dvi", "html", "pdf", "ps", "text")
+
+ def __init__(self):
+ self.formats = []
+ self.l2h_init_files = []
+
+ def __getitem__(self, key):
+ # This is used when formatting the usage message.
+ try:
+ return getattr(self, key)
+ except AttributeError:
+ raise KeyError, key
+
+ def parse(self, args):
+ opts, args = getopt.getopt(args, "Hi:a:s:lDkqr:",
+ ["all", "postscript", "help", "iconserver=",
+ "address=", "a4", "letter", "l2h-init=",
+ "link=", "split=", "logging", "debugging",
+ "keep", "quiet", "runs=", "image-type=",
+ "about=", "numeric", "style=", "paper=",
+ "up-link=", "up-title=", "dir=",
+ "global-module-index=", "dvips-safe",
+ "favicon="]
+ + list(self.ALL_FORMATS))
+ for opt, arg in opts:
+ if opt == "--all":
+ self.formats = list(self.ALL_FORMATS)
+ self.dvips_safe = "ps" in self.formats
+ elif opt in ("-H", "--help"):
+ usage(self, sys.stdout)
+ sys.exit()
+ elif opt == "--iconserver":
+ self.icon_server = arg
+ elif opt in ("-a", "--address"):
+ self.address = arg
+ elif opt == "--a4":
+ self.paper = "a4"
+ elif opt == "--letter":
+ self.paper = "letter"
+ elif opt == "--link":
+ self.max_link_depth = int(arg)
+ elif opt in ("-s", "--split"):
+ self.max_split_depth = int(arg)
+ elif opt in ("-l", "--logging"):
+ self.logging = self.logging + 1
+ elif opt in ("-D", "--debugging"):
+ self.debugging = self.debugging + 1
+ elif opt in ("-k", "--keep"):
+ self.discard_temps = 0
+ elif opt in ("-q", "--quiet"):
+ self.quiet = 1
+ elif opt in ("-r", "--runs"):
+ self.runs = int(arg)
+ elif opt == "--image-type":
+ self.image_type = arg
+ elif opt == "--about":
+ # always make this absolute:
+ self.about_file = os.path.normpath(
+ os.path.abspath(arg))
+ elif opt == "--numeric":
+ self.numeric = 1
+ elif opt == "--style":
+ self.style_file = os.path.abspath(arg)
+ elif opt == "--l2h-init":
+ self.l2h_init_files.append(os.path.abspath(arg))
+ elif opt == "--favicon":
+ self.favicon = arg
+ elif opt == "--up-link":
+ self.up_link = arg
+ elif opt == "--up-title":
+ self.up_title = arg
+ elif opt == "--global-module-index":
+ self.global_module_index = arg
+ elif opt == "--dir":
+ if os.sep == "\\":
+ arg = re.sub("/", "\\\\", arg)
+ self.builddir = os.path.expanduser(arg)
+ elif opt == "--paper":
+ self.paper = arg
+ elif opt == "--dvips-safe":
+ self.dvips_safe = 1
+ #
+ # Format specifiers:
+ #
+ elif opt[2:] in self.ALL_FORMATS:
+ self.add_format(opt[2:])
+ elif opt == "--postscript":
+ # synonym for --ps
+ self.add_format("ps")
+ self.initialize()
+ #
+ # return the args to allow the caller access:
+ #
+ return args
+
+ def add_format(self, format):
+ """Add a format to the formats list if not present."""
+ if not format in self.formats:
+ if format == "ps":
+ # assume this is safe since we're going to run it anyway
+ self.dvips_safe = 1
+ self.formats.append(format)
+
+ def initialize(self):
+ """Complete initialization. This is needed if parse() isn't used."""
+ # add the default format if no formats were specified:
+ if not self.formats:
+ self.formats = self.DEFAULT_FORMATS
+ # determine the base set of texinputs directories:
+ texinputs = os.environ.get("TEXINPUTS", "").split(os.pathsep)
+ if not texinputs:
+ texinputs = ['']
+ mydirs = [os.path.join(TOPDIR, "paper-" + self.paper),
+ os.path.join(TOPDIR, "texinputs"),
+ ]
+ if '' in texinputs:
+ i = texinputs.index('')
+ texinputs[i:i] = mydirs
+ else:
+ texinputs += mydirs
+ self.base_texinputs = texinputs
+ if self.builddir:
+ self.builddir = os.path.abspath(self.builddir)
+
+
+class Job:
+ latex_runs = 0
+
+ def __init__(self, options, path):
+ self.options = options
+ self.doctype = get_doctype(path)
+ self.filedir, self.doc = split_pathname(path)
+ self.builddir = os.path.abspath(options.builddir or self.doc)
+ if ("html" in options.formats or "text" in options.formats):
+ if not os.path.exists(self.builddir):
+ os.mkdir(self.builddir)
+ self.log_filename = os.path.join(self.builddir, self.doc + ".how")
+ else:
+ self.log_filename = os.path.abspath(self.doc + ".how")
+ if os.path.exists(self.log_filename):
+ os.unlink(self.log_filename)
+ l2hconf = self.doc + ".l2h"
+ if os.path.exists(l2hconf):
+ if os.path.exists(l2hconf + "~"):
+ os.unlink(l2hconf + "~")
+ os.rename(l2hconf, l2hconf + "~")
+ self.l2h_aux_init_file = self.doc + ".l2h"
+ self.write_l2h_aux_init_file()
+
+ def build(self):
+ self.setup_texinputs()
+ formats = self.options.formats
+ if "dvi" in formats or "ps" in formats:
+ self.build_dvi()
+ if "pdf" in formats:
+ self.build_pdf()
+ if "ps" in formats:
+ self.build_ps()
+ if "html" in formats:
+ self.require_temps()
+ self.build_html(self.builddir)
+ if self.options.icon_server == ".":
+ pattern = os.path.join(TOPDIR, "html", "icons",
+ "*." + self.options.image_type)
+ imgs = glob.glob(pattern)
+ if not imgs:
+ self.warning(
+ "Could not locate support images of type %s."
+ % `self.options.image_type`)
+ for fn in imgs:
+ new_fn = os.path.join(self.builddir, os.path.basename(fn))
+ shutil.copyfile(fn, new_fn)
+ if "text" in formats:
+ self.require_temps()
+ tempdir = self.doc
+ need_html = "html" not in formats
+ if self.options.max_split_depth != 1:
+ fp = open(self.l2h_aux_init_file, "a")
+ fp.write("# re-hack this file for --text:\n")
+ l2hoption(fp, "MAX_SPLIT_DEPTH", "1")
+ fp.write("1;\n")
+ fp.close()
+ tempdir = self.doc + "-temp-html"
+ need_html = 1
+ if need_html:
+ self.build_html(tempdir, max_split_depth=1)
+ self.build_text(tempdir)
+ if self.options.discard_temps:
+ self.cleanup()
+
+ def setup_texinputs(self):
+ texinputs = [self.filedir] + self.options.base_texinputs
+ os.environ["TEXINPUTS"] = os.pathsep.join(texinputs)
+ self.message("TEXINPUTS=" + os.environ["TEXINPUTS"])
+
+ def build_aux(self, binary=None):
+ if binary is None:
+ binary = LATEX_BINARY
+ new_index( "%s.ind" % self.doc, "genindex")
+ new_index("mod%s.ind" % self.doc, "modindex")
+ self.run("%s %s" % (binary, self.doc))
+ self.use_bibtex = check_for_bibtex(self.doc + ".aux")
+ self.latex_runs = 1
+
+ def build_dvi(self):
+ self.use_latex(LATEX_BINARY)
+
+ def build_pdf(self):
+ self.use_latex(PDFLATEX_BINARY)
+
+ def use_latex(self, binary):
+ self.require_temps(binary=binary)
+ if self.latex_runs < 2:
+ if os.path.isfile("mod%s.idx" % self.doc):
+ self.run("%s mod%s.idx" % (MAKEINDEX_BINARY, self.doc))
+ use_indfix = 0
+ if os.path.isfile(self.doc + ".idx"):
+ use_indfix = 1
+ # call to Doc/tools/fix_hack omitted; doesn't appear necessary
+ self.run("%s %s.idx" % (MAKEINDEX_BINARY, self.doc))
+ import indfix
+ indfix.process(self.doc + ".ind")
+ if self.use_bibtex:
+ self.run("%s %s" % (BIBTEX_BINARY, self.doc))
+ self.process_synopsis_files()
+ self.run("%s %s" % (binary, self.doc))
+ self.latex_runs = self.latex_runs + 1
+ if os.path.isfile("mod%s.idx" % self.doc):
+ self.run("%s -s %s mod%s.idx"
+ % (MAKEINDEX_BINARY, ISTFILE, self.doc))
+ if use_indfix:
+ self.run("%s -s %s %s.idx"
+ % (MAKEINDEX_BINARY, ISTFILE, self.doc))
+ indfix.process(self.doc + ".ind")
+ self.process_synopsis_files()
+ #
+ # and now finish it off:
+ #
+ if os.path.isfile(self.doc + ".toc") and binary == PDFLATEX_BINARY:
+ import toc2bkm
+ if self.doctype == "manual":
+ bigpart = "chapter"
+ else:
+ bigpart = "section"
+ toc2bkm.process(self.doc + ".toc", self.doc + ".bkm", bigpart)
+ if self.use_bibtex:
+ self.run("%s %s" % (BIBTEX_BINARY, self.doc))
+ self.run("%s %s" % (binary, self.doc))
+ self.latex_runs = self.latex_runs + 1
+
+ def process_synopsis_files(self):
+ synopsis_files = glob.glob(self.doc + "*.syn")
+ for path in synopsis_files:
+ uniqify_module_table(path)
+
+ def build_ps(self):
+ self.run("%s -N0 -o %s.ps %s" % (DVIPS_BINARY, self.doc, self.doc))
+
+ def build_html(self, builddir, max_split_depth=None):
+ if max_split_depth is None:
+ max_split_depth = self.options.max_split_depth
+ texfile = None
+ for p in os.environ["TEXINPUTS"].split(os.pathsep):
+ fn = os.path.join(p, self.doc + ".tex")
+ if os.path.isfile(fn):
+ texfile = fn
+ break
+ if not texfile:
+ self.warning("Could not locate %s.tex; aborting." % self.doc)
+ sys.exit(1)
+ # remove leading ./ (or equiv.); might avoid problems w/ dvips
+ if texfile[:2] == os.curdir + os.sep:
+ texfile = texfile[2:]
+ # build the command line and run LaTeX2HTML:
+ if not os.path.isdir(builddir):
+ os.mkdir(builddir)
+ else:
+ for fname in glob.glob(os.path.join(builddir, "*.html")):
+ os.unlink(fname)
+ args = [LATEX2HTML_BINARY,
+ "-init_file", self.l2h_aux_init_file,
+ "-dir", builddir,
+ texfile
+ ]
+ self.run(" ".join(args)) # XXX need quoting!
+ # ... postprocess
+ shutil.copyfile(self.options.style_file,
+ os.path.join(builddir, self.doc + ".css"))
+ shutil.copyfile(os.path.join(builddir, self.doc + ".html"),
+ os.path.join(builddir, "index.html"))
+ if max_split_depth != 1:
+ label_file = os.path.join(builddir, "labels.pl")
+ fp = open(label_file)
+ about_node = None
+ target = " = q/about/;\n"
+ x = len(target)
+ while 1:
+ line = fp.readline()
+ if not line:
+ break
+ if line[-x:] == target:
+ line = fp.readline()
+ m = re.search(r"\|(node\d+\.[a-z]+)\|", line)
+ about_node = m.group(1)
+ shutil.copyfile(os.path.join(builddir, about_node),
+ os.path.join(builddir, "about.html"))
+ break
+ if not self.options.numeric:
+ pwd = os.getcwd()
+ try:
+ os.chdir(builddir)
+ self.run("%s %s *.html" % (PERL_BINARY, NODE2LABEL_SCRIPT))
+ finally:
+ os.chdir(pwd)
+ # These files need to be cleaned up here since there can be
+ # more than one builddir; we clean each of them.
+ if self.options.discard_temps:
+ for fn in ("images.tex", "images.log", "images.aux"):
+ safe_unlink(os.path.join(builddir, fn))
+
+ def build_text(self, tempdir=None):
+ if tempdir is None:
+ tempdir = self.doc
+ indexfile = os.path.join(tempdir, "index.html")
+ self.run("%s -nolist -dump %s >%s.txt"
+ % (LYNX_BINARY, indexfile, self.doc))
+
+ def require_temps(self, binary=None):
+ if not self.latex_runs:
+ self.build_aux(binary=binary)
+
+ def write_l2h_aux_init_file(self):
+ options = self.options
+ fp = open(self.l2h_aux_init_file, "w")
+ d = string_to_perl(os.path.dirname(L2H_INIT_FILE))
+ fp.write("package main;\n"
+ "push (@INC, '%s');\n"
+ "$mydir = '%s';\n"
+ % (d, d))
+ fp.write(open(L2H_INIT_FILE).read())
+ for filename in options.l2h_init_files:
+ fp.write("\n# initialization code incorporated from:\n# ")
+ fp.write(filename)
+ fp.write("\n")
+ fp.write(open(filename).read())
+ fp.write("\n"
+ "# auxiliary init file for latex2html\n"
+ "# generated by mkhowto\n"
+ "$NO_AUTO_LINK = 1;\n"
+ )
+ l2hoption(fp, "ABOUT_FILE", options.about_file)
+ l2hoption(fp, "ICONSERVER", options.icon_server)
+ l2hoption(fp, "IMAGE_TYPE", options.image_type)
+ l2hoption(fp, "ADDRESS", options.address)
+ l2hoption(fp, "MAX_LINK_DEPTH", options.max_link_depth)
+ l2hoption(fp, "MAX_SPLIT_DEPTH", options.max_split_depth)
+ l2hoption(fp, "EXTERNAL_UP_LINK", options.up_link)
+ l2hoption(fp, "EXTERNAL_UP_TITLE", options.up_title)
+ l2hoption(fp, "FAVORITES_ICON", options.favicon)
+ l2hoption(fp, "GLOBAL_MODULE_INDEX", options.global_module_index)
+ l2hoption(fp, "DVIPS_SAFE", options.dvips_safe)
+ fp.write("1;\n")
+ fp.close()
+
+ def cleanup(self):
+ self.__have_temps = 0
+ for pattern in ("%s.aux", "%s.log", "%s.out", "%s.toc", "%s.bkm",
+ "%s.idx", "%s.ilg", "%s.ind", "%s.pla",
+ "%s.bbl", "%s.blg",
+ "mod%s.idx", "mod%s.ind", "mod%s.ilg",
+ ):
+ safe_unlink(pattern % self.doc)
+ map(safe_unlink, glob.glob(self.doc + "*.syn"))
+ for spec in ("IMG*", "*.pl", "WARNINGS", "index.dat", "modindex.dat"):
+ pattern = os.path.join(self.doc, spec)
+ map(safe_unlink, glob.glob(pattern))
+ if "dvi" not in self.options.formats:
+ safe_unlink(self.doc + ".dvi")
+ if os.path.isdir(self.doc + "-temp-html"):
+ shutil.rmtree(self.doc + "-temp-html", ignore_errors=1)
+ if not self.options.logging:
+ os.unlink(self.log_filename)
+ if not self.options.debugging:
+ os.unlink(self.l2h_aux_init_file)
+
+ def run(self, command):
+ self.message(command)
+ if sys.platform.startswith("win"):
+ rc = os.system(command)
+ else:
+ rc = os.system("(%s) >%s 2>&1"
+ % (command, self.log_filename))
+ if rc:
+ self.warning(
+ "Session transcript and error messages are in %s."
+ % self.log_filename)
+ result = 1
+ if hasattr(os, "WIFEXITED"):
+ if os.WIFEXITED(rc):
+ result = os.WEXITSTATUS(rc)
+ self.warning("Exited with status %s." % result)
+ else:
+ self.warning("Killed by signal %s." % os.WSTOPSIG(rc))
+ else:
+ self.warning("Return code: %s" % rc)
+ sys.stderr.write("The relevant lines from the transcript are:\n")
+ sys.stderr.write("-" * 72 + "\n")
+ sys.stderr.writelines(get_run_transcript(self.log_filename))
+ sys.exit(result)
+
+ def message(self, msg):
+ msg = "+++ " + msg
+ if not self.options.quiet:
+ print msg
+ self.log(msg + "\n")
+
+ def warning(self, msg):
+ msg = "*** %s\n" % msg
+ sys.stderr.write(msg)
+ self.log(msg)
+
+ def log(self, msg):
+ fp = open(self.log_filename, "a")
+ fp.write(msg)
+ fp.close()
+
+
+def get_run_transcript(filename):
+ """Return lines from the transcript file for the most recent run() call."""
+ fp = open(filename)
+ lines = fp.readlines()
+ fp.close()
+ lines.reverse()
+ L = []
+ for line in lines:
+ L.append(line)
+ if line[:4] == "+++ ":
+ break
+ L.reverse()
+ return L
+
+
+def safe_unlink(path):
+ """Unlink a file without raising an error if it doesn't exist."""
+ try:
+ os.unlink(path)
+ except os.error:
+ pass
+
+
+def split_pathname(path):
+ path = os.path.abspath(path)
+ dirname, basename = os.path.split(path)
+ if basename[-4:] == ".tex":
+ basename = basename[:-4]
+ return dirname, basename
+
+
+_doctype_rx = re.compile(r"\\documentclass(?:\[[^]]*\])?{([a-zA-Z]*)}")
+def get_doctype(path):
+ fp = open(path)
+ doctype = None
+ while 1:
+ line = fp.readline()
+ if not line:
+ break
+ m = _doctype_rx.match(line)
+ if m:
+ doctype = m.group(1)
+ break
+ fp.close()
+ return doctype
+
+
+def main():
+ options = Options()
+ try:
+ args = options.parse(sys.argv[1:])
+ except getopt.error, msg:
+ error(options, msg)
+ if not args:
+ # attempt to locate single .tex file in current directory:
+ args = glob.glob("*.tex")
+ if not args:
+ error(options, "No file to process.")
+ if len(args) > 1:
+ error(options, "Could not deduce which files should be processed.")
+ #
+ # parameters are processed, let's go!
+ #
+ for path in args:
+ Job(options, path).build()
+
+
+def l2hoption(fp, option, value):
+ if value:
+ fp.write('$%s = "%s";\n' % (option, string_to_perl(str(value))))
+
+
+_to_perl = {}
+for c in map(chr, range(1, 256)):
+ _to_perl[c] = c
+_to_perl["@"] = "\\@"
+_to_perl["$"] = "\\$"
+_to_perl['"'] = '\\"'
+
+def string_to_perl(s):
+ return ''.join(map(_to_perl.get, s))
+
+
+def check_for_bibtex(filename):
+ fp = open(filename)
+ pos = fp.read().find(r"\bibdata{")
+ fp.close()
+ return pos >= 0
+
+def uniqify_module_table(filename):
+ lines = open(filename).readlines()
+ if len(lines) > 1:
+ if lines[-1] == lines[-2]:
+ del lines[-1]
+ open(filename, "w").writelines(lines)
+
+
+def new_index(filename, label="genindex"):
+ fp = open(filename, "w")
+ fp.write(r"""\
+\begin{theindex}
+\label{%s}
+\end{theindex}
+""" % label)
+ fp.close()
+
+
+if __name__ == "__main__":
+ main()
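A minimal sketch of how the l2h init options above end up in the generated Perl file, assuming the mkhowto script is importable as a module named mkhowto (the ADDRESS value is made up):

    import StringIO
    import mkhowto

    fp = StringIO.StringIO()
    # string_to_perl() escapes @, $ and " before the value is interpolated
    mkhowto.l2hoption(fp, "ADDRESS", 'webmaster@example.org')
    mkhowto.l2hoption(fp, "MAX_SPLIT_DEPTH", 6)
    print fp.getvalue()
    # $ADDRESS = "webmaster\@example.org";
    # $MAX_SPLIT_DEPTH = "6";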
Added: sandbox/trunk/pdb/Doc/tools/mkinfo
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/tools/mkinfo Wed Jul 5 14:19:15 2006
@@ -0,0 +1,65 @@
+#! /bin/sh
+# -*- Ksh -*-
+
+# Script to drive the HTML-info conversion process.
+# Pass in up to three parameters:
+# - the name of the main tex file
+# - the name of the output file in texi format (optional)
+# - the name of the output file in info format (optional)
+#
+# Written by Fred L. Drake, Jr.
+
+EMACS=${EMACS:-emacs}
+MAKEINFO=${MAKEINFO:-makeinfo}
+
+
+# Normalize file name since something called by html2texi.pl seems to
+# screw up with relative path names.
+FILENAME="$1"
+DOCDIR=`dirname "$FILENAME"`
+DOCFILE=`basename "$FILENAME"`
+DOCNAME=`basename "$FILENAME" .tex`
+if [ $# -gt 1 ]; then
+ TEXINAME="$2"
+else
+ TEXINAME="python-$DOCNAME.texi"
+fi
+if [ $# -gt 2 ]; then
+ INFONAME="$3"
+else
+ INFONAME="python-$DOCNAME.info"
+fi
+
+# Now build the real directory names, and locate our support stuff:
+WORKDIR=`pwd`
+cd `dirname $0`
+TOOLSDIR=`pwd`
+cd $DOCDIR
+DOCDIR=`pwd`
+cd $WORKDIR
+
+COMMONDIR="`dirname $DOCDIR`/commontex"
+
+
+run() {
+ # show what we're doing, like make does:
+ echo "$*"
+ "$@" || exit $?
+}
+
+
+# generate the Texinfo file:
+
+run $EMACS -batch -q --no-site-file -l $TOOLSDIR/py2texi.el \
+ --eval "(setq py2texi-dirs '(\"$DOCDIR\" \"$COMMONDIR\" \"../texinputs\"))" \
+ --eval "(setq py2texi-texi-file-name \"$TEXINAME\")" \
+ --eval "(setq py2texi-info-file-name \"$INFONAME\")" \
+ --eval "(py2texi \"$DOCDIR/$DOCFILE\")" \
+ -f kill-emacs
+echo Done
+
+
+# generate the .info files:
+
+run $MAKEINFO --footnote-style end --fill-column 72 \
+ --paragraph-indent 0 --output=$INFONAME $TEXINAME
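For reference, the default output names the script derives when only the .tex file is passed, sketched in Python (the input path is hypothetical):

    import os
    docname = os.path.splitext(os.path.basename("ref/ref.tex"))[0]
    print "python-%s.texi" % docname   # python-ref.texi
    print "python-%s.info" % docname   # python-ref.info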
Added: sandbox/trunk/pdb/Doc/tools/node2label.pl
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/tools/node2label.pl Wed Jul 5 14:19:15 2006
@@ -0,0 +1,71 @@
+#! /usr/bin/env perl
+
+# On Cygwin, we actually have to generate a temporary file when doing
+# the inplace edit, or we'll get permission errors. Not sure whose
+# bug this is, except that it isn't ours. To deal with this, we
+# generate backups during the edit phase and remove them at the end.
+#
+use English;
+$INPLACE_EDIT = '.bak';
+
+# read the labels, then reverse the mappings
+require "labels.pl";
+
+%nodes = ();
+my $key;
+# sort so that we get a consistent assignment for nodes with multiple labels
+foreach $label (sort keys %external_labels) {
+ #
+ # If the label can't be used as a filename on non-Unix platforms,
+ # skip it. Such labels may be used internally within the documentation,
+ # but will never be used for filename generation.
+ #
+ if ($label =~ /^([-.a-zA-Z0-9]+)$/) {
+ $key = $external_labels{$label};
+ $key =~ s|^/||;
+ $nodes{$key} = $label;
+ }
+}
+
+# This adds the "internal" labels added for indexing. These labels will not
+# be used for file names.
+require "intlabels.pl";
+foreach $label (keys %internal_labels) {
+ $key = $internal_labels{$label};
+ $key =~ s|^/||;
+ if (defined($nodes{$key})) {
+ $nodes{$label} = $nodes{$key};
+ }
+}
+
+# collect labels that have been used
+%newnames = ();
+
+while (<>) {
+ # don't want to do one s/// per line per node
+ # so look for lines with hrefs, then do s/// on nodes present
+ if (/(HREF|href)=[\"\']node\d+\.html[\#\"\']/) {
+ @parts = split(/(HREF|href)\=[\"\']/);
+ shift @parts;
+ for $node (@parts) {
+ $node =~ s/[\#\"\'].*$//g;
+ chomp($node);
+ if (defined($nodes{$node})) {
+ $label = $nodes{$node};
+ if (s/(HREF|href)=([\"\'])$node([\#\"\'])/href=$2$label.html$3/g) {
+ s/(HREF|href)=([\"\'])$label.html/href=$2$label.html/g;
+ $newnames{$node} = "$label.html";
+ }
+ }
+ }
+ }
+ print;
+}
+
+foreach $oldname (keys %newnames) {
+ rename($oldname, $newnames{$oldname});
+}
+
+foreach $filename (glob('*.bak')) {
+ unlink($filename);
+}
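The href rewrite above is the heart of the script; a rough Python rendering of the same idea, for illustration only (the nodes mapping and the sample line are made up; the build runs the Perl version):

    import re

    nodes = {"node12.html": "module-os"}   # hypothetical node -> label mapping

    def relabel(line):
        # rewrite href="nodeNNN.html..." to href="<label>.html..."
        def fix(m):
            label = nodes.get(m.group(2))
            if label is None:
                return m.group(0)
            return 'href=%s%s.html%s' % (m.group(1), label, m.group(3))
        return re.sub(r'(?i)href=(["\'])(node\d+\.html)([#"\'])', fix, line)

    print relabel('<a href="node12.html#sec">')   # <a href="module-os.html#sec">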
Added: sandbox/trunk/pdb/Doc/tools/patchlevel.h
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/tools/patchlevel.h Wed Jul 5 14:19:15 2006
@@ -0,0 +1,40 @@
+
+/* Newfangled version identification scheme.
+
+ This scheme was added in Python 1.5.2b2; before that time, only PATCHLEVEL
+ was available. To test for presence of the scheme, test for
+ defined(PY_MAJOR_VERSION).
+
+ When the major or minor version changes, the VERSION variable in
+ configure.in must also be changed.
+
+ There is also (independent) API version information in modsupport.h.
+*/
+
+/* Values for PY_RELEASE_LEVEL */
+#define PY_RELEASE_LEVEL_ALPHA 0xA
+#define PY_RELEASE_LEVEL_BETA 0xB
+#define PY_RELEASE_LEVEL_GAMMA 0xC /* For release candidates */
+#define PY_RELEASE_LEVEL_FINAL 0xF /* Serial should be 0 here */
+ /* Higher for patch releases */
+
+/* Version parsed out into numeric values */
+#define PY_MAJOR_VERSION 2
+#define PY_MINOR_VERSION 4
+#define PY_MICRO_VERSION 2
+#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_ALPHA
+#define PY_RELEASE_SERIAL 0
+
+/* Version as a string */
+#define PY_VERSION "2.4.3mpdb"
+
+/* Subversion Revision number of this file (not of the repository) */
+#define PY_PATCHLEVEL_REVISION "$Revision: 1.1 $"
+
+/* Version as a single 4-byte hex number, e.g. 0x010502B2 == 1.5.2b2.
+ Use this for numeric comparisons, e.g. #if PY_VERSION_HEX >= ... */
+#define PY_VERSION_HEX ((PY_MAJOR_VERSION << 24) | \
+ (PY_MINOR_VERSION << 16) | \
+ (PY_MICRO_VERSION << 8) | \
+ (PY_RELEASE_LEVEL << 4) | \
+ (PY_RELEASE_SERIAL << 0))
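To make the packing concrete, the same computation for the values in this header (2.4.2 alpha 0), sketched in Python:

    major, minor, micro, level, serial = 2, 4, 2, 0xA, 0
    version_hex = (major << 24) | (minor << 16) | (micro << 8) | (level << 4) | serial
    print hex(version_hex)   # 0x20402a0, i.e. 0x020402A0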
Added: sandbox/trunk/pdb/Doc/tools/py2texi.el
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/tools/py2texi.el Wed Jul 5 14:19:15 2006
@@ -0,0 +1,951 @@
+;;; py2texi.el -- Conversion of Python LaTeX documentation to Texinfo
+
+;; Copyright (C) 1998, 1999, 2001, 2002 Milan Zamazal
+
+;; Author: Milan Zamazal
+;; Version: $Id: py2texi.el,v 1.1 2006/01/30 11:54:58 rockyb Exp $
+;; Keywords: python
+
+;; COPYRIGHT NOTICE
+;;
+;; This program is free software; you can redistribute it and/or modify it
+;; under the terms of the GNU General Public License as published by the Free
+;; Software Foundation; either version 2, or (at your option) any later
+;; version.
+;;
+;; This program is distributed in the hope that it will be useful, but
+;; WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY
+;; or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
+;; for more details.
+;;
+;; You can find the GNU General Public License at
+;; http://www.gnu.org/copyleft/gpl.html
+;; or you can write to the Free Software Foundation, Inc., 59 Temple Place,
+;; Suite 330, Boston, MA 02111-1307, USA.
+
+;;; Commentary:
+
+;; This is a Q&D hack for conversion of Python manuals to on-line help format.
+;; I desperately needed usable online documentation for Python, so I wrote this.
+;; The resulting code is ugly and may not contain complete information from
+;; the Python manuals. I apologize for my ignorance, especially my ignorance of
+;; python.sty. Improvements to this converter are welcome.
+
+;; How to use it:
+;; Load this file and apply `M-x py2texi'. You will be asked for name of a
+;; file to be converted.
+
+;; Where to find it:
+;; New versions of this code might be found at
+;; http://www.zamazal.org/software/python/py2texi/ .
+
+;;; Code:
+
+
+(require 'texinfo)
+(eval-when-compile
+ (require 'cl))
+
+
+(defvar py2texi-python-version "2.2"
+ "What to substitute for the \\version macro.")
+
+(defvar py2texi-python-short-version
+ (progn
+ (string-match "[0-9]+\\.[0-9]+" py2texi-python-version)
+ (match-string 0 py2texi-python-version))
+ "Short version number, usually set by the LaTeX commands.")
+
+(defvar py2texi-texi-file-name nil
+ "If non-nil, that string is used as the name of the Texinfo file.
+Otherwise a generated Texinfo file name is used.")
+
+(defvar py2texi-info-file-name nil
+ "If non-nil, that string is used as the name of the Info file.
+Otherwise a generated Info file name is used.")
+
+(defvar py2texi-stop-on-problems nil
+ "*If non-nil, stop when you encounter a soft problem.")
+
+(defconst py2texi-environments
+ '(("abstract" 0 "@quotation" "@end quotation\n")
+ ("center" 0 "" "")
+ ("cfuncdesc" 3
+ (progn (setq findex t)
+ "\n@table @code\n@item \\1 \\2(\\3)\n@findex \\2\n")
+ "@end table\n")
+ ("cmemberdesc" 3
+ "\n@table @code\n@item \\2 \\3\n"
+ "@end table\n")
+ ("classdesc" 2
+ (progn (setq obindex t)
+ "\n@table @code\n@item \\1(\\2)\n@obindex \\1\n")
+ "@end table\n")
+ ("classdesc*" 1
+ (progn (setq obindex t)
+ "\n@table @code\n@item \\1\n@obindex \\1\n")
+ "@end table\n")
+ ("comment" 0 "\n@ignore\n" "\n@end ignore\n")
+ ("csimplemacrodesc" 1
+ (progn (setq cindex t)
+ "\n@table @code\n@item \\1\n@cindex \\1\n")
+ "@end table\n")
+ ("ctypedesc" 1
+ (progn (setq cindex t)
+ "\n@table @code\n@item \\1\n@cindex \\1\n")
+ "@end table\n")
+ ("cvardesc" 2
+ (progn (setq findex t)
+ "\n@table @code\n@item \\1 \\2\n@findex \\2\n")
+ "@end table\n")
+ ("datadesc" 1
+ (progn (setq findex t)
+ "\n@table @code\n@item \\1\n@findex \\1\n")
+ "@end table\n")
+ ("datadescni" 1 "\n@table @code\n@item \\1\n" "@end table\n")
+ ("definitions" 0 "@table @dfn" "@end table\n")
+ ("description" 0 "@table @samp" "@end table\n")
+ ("displaymath" 0 "" "")
+ ("document" 0
+ (concat "@defcodeindex mo\n"
+ "@defcodeindex ob\n"
+ "@titlepage\n"
+ (format "@title " title "\n")
+ (format "@author " author "\n")
+ "@page\n"
+ author-address
+ "@end titlepage\n"
+ "@node Top, , , (dir)\n")
+ (concat "@indices\n"
+ "@contents\n"
+ "@bye\n"))
+ ("enumerate" 0 "@enumerate" "@end enumerate")
+ ("envdesc" 2 (concat "\n@table @code"
+ "\n@item @backslash{}begin@{\\1@}\\2")
+ "@end table\n")
+ ("excdesc" 1
+ (progn (setq obindex t)
+ "\n@table @code\n@item \\1\n@obindex \\1\n")
+ "@end table\n")
+ ("excclassdesc" 2
+ (progn (setq obindex t)
+ "\n@table @code\n@item \\1(\\2)\n@obindex \\1\n")
+ "@end table\n")
+ ("flushleft" 0 "" "")
+ ("fulllineitems" 0 "\n@table @code\n" "@end table\n")
+ ("funcdesc" 2
+ (progn (setq findex t)
+ "\n@table @code\n@item \\1(\\2)\n@findex \\1\n")
+ "@end table\n")
+ ("funcdescni" 2 "\n@table @code\n@item \\1(\\2)\n" "@end table\n")
+ ("itemize" 0 "@itemize @bullet" "@end itemize\n")
+ ("list" 2 "\n@table @code\n" "@end table\n")
+ ("longtableii" 4 (concat "@multitable @columnfractions .5 .5\n"
+ "@item \\3 @tab \\4\n"
+ "@item ------- @tab ------ \n")
+ "@end multitable\n")
+ ("longtableiii" 5 (concat "@multitable @columnfractions .33 .33 .33\n"
+ "@item \\3 @tab \\4 @tab \\5\n"
+ "@item ------- @tab ------ @tab ------\n")
+ "@end multitable\n")
+ ("macrodesc" 2 (concat "\n@table @code"
+ "\n@item \\1@{\\2@}")
+ "@end table\n")
+ ("memberdesc" 1
+ (progn (setq findex t)
+ "\n@table @code\n@item \\1\n@findex \\1\n")
+ "@end table\n")
+ ("memberdescni" 1 "\n@table @code\n@item \\1\n" "@end table\n")
+ ("methoddesc" 2
+ (progn (setq findex t)
+ "\n@table @code\n@item \\1(\\2)\n@findex \\1\n")
+ "@end table\n")
+ ("methoddescni" 2 "\n@table @code\n@item \\1(\\2)\n" "@end table\n")
+ ("notice" 0 "@emph{Notice:} " "")
+ ("opcodedesc" 2
+ (progn (setq findex t)
+ "\n@table @code\n@item \\1 \\2\n@findex \\1\n")
+ "@end table\n")
+ ("productionlist" 0 "\n@table @code\n" "@end table\n")
+ ("quotation" 0 "@quotation" "@end quotation")
+ ("seealso" 0 "See also:\n@table @emph\n" "@end table\n")
+ ("seealso*" 0 "@table @emph\n" "@end table\n")
+ ("sloppypar" 0 "" "")
+ ("small" 0 "" "")
+ ("tableii" 4 (concat "@multitable @columnfractions .5 .5\n"
+ "@item \\3 @tab \\4\n"
+ "@item ------- @tab ------ \n")
+ "@end multitable\n")
+ ("tableiii" 5 (concat "@multitable @columnfractions .33 .33 .33\n"
+ "@item \\3 @tab \\4 @tab \\5\n"
+ "@item ------- @tab ------ @tab ------\n")
+ "@end multitable\n")
+ ("tableiv" 6 (concat
+ "@multitable @columnfractions .25 .25 .25 .25\n"
+ "@item \\3 @tab \\4 @tab \\5 @tab \\6\n"
+ "@item ------- @tab ------- @tab ------- @tab -------\n")
+ "@end multitable\n")
+ ("tablev" 7 (concat
+ "@multitable @columnfractions .20 .20 .20 .20 .20\n"
+ "@item \\3 @tab \\4 @tab \\5 @tab \\6 @tab \\7\n"
+ "@item ------- @tab ------- @tab ------- @tab ------- @tab -------\n")
+ "@end multitable\n")
+ ("alltt" 0 "@example" "@end example")
+ )
+ "Associative list defining substitutions for environments.
+Each list item is of the form (ENVIRONMENT ARGNUM BEGIN END) where:
+- ENVIRONMENT is LaTeX environment name
+- ARGNUM is number of (required) macro arguments
+- BEGIN is substitution for \begin{ENVIRONMENT}
+- END is substitution for \end{ENVIRONMENT}
+Both BEGIN and END are evaled. Moreover, you can reference arguments through
+\N regular expression notation in strings of BEGIN.")
+
+(defconst py2texi-commands
+ '(("AA" 0 "@AA{}")
+ ("aa" 0 "@aa{}")
+ ("ABC" 0 "ABC")
+ ("appendix" 0 (progn (setq appendix t) ""))
+ ("ASCII" 0 "ASCII")
+ ("author" 1 (progn (setq author (match-string 1 string)) ""))
+ ("authoraddress" 1
+ (progn (setq author-address (match-string 1 string)) ""))
+ ("b" 1 "@w{\\1}")
+ ("backslash" 0 "@backslash{}")
+ ("bf" 0 "@destroy")
+ ("bifuncindex" 1 (progn (setq findex t) "@findex{\\1}"))
+ ("C" 0 "C")
+ ("c" 0 "@,")
+ ("catcode" 0 "")
+ ("cdata" 1 "@code{\\1}")
+ ("centerline" 1 "@center \\1")
+ ("cfuncline" 3 "@itemx \\1 \\2(\\3)\n@findex \\2")
+ ("cfunction" 1 "@code{\\1}")
+ ("chapter" 1 (format "@node \\1\n@%s \\1\n"
+ (if appendix "appendix" "chapter")))
+ ("chapter*" 1 "@node \\1\n@unnumbered \\1\n")
+ ("character" 1 "@samp{\\1}")
+ ("citetitle" 1 "@ref{Top,,,\\1}")
+ ("class" 1 "@code{\\1}")
+ ("cmemberline" 3 "@itemx \\2 \\3\n")
+ ("code" 1 "@code{\\1}")
+ ("command" 1 "@command{\\1}")
+ ("constant" 1 "@code{\\1}")
+ ("copyright" 1 "@copyright{}")
+ ("Cpp" 0 "C++")
+ ("csimplemacro" 1 "@code{\\1}")
+ ("ctype" 1 "@code{\\1}")
+ ("dataline" 1 (progn (setq findex t) "@item \\1\n@findex \\1\n"))
+ ("date" 1 "\\1")
+ ("declaremodule" 2 (progn (setq cindex t) "@label{\\2}@cindex{\\2}"))
+ ("deprecated" 2 "@emph{This is deprecated in Python \\1. \\2}\n\n")
+ ("dfn" 1 "@dfn{\\1}")
+ ("documentclass" 1 py2texi-magic)
+ ("e" 0 "@backslash{}")
+ ("else" 0 (concat "@end ifinfo\n@" (setq last-if "iftex")))
+ ("env" 1 "@code{\\1}")
+ ("EOF" 0 "@code{EOF}")
+ ("email" 1 "@email{\\1}")
+ ("emph" 1 "@emph{\\1}")
+ ("envvar" 1 "@env{\\1}")
+ ("exception" 1 "@code{\\1}")
+ ("exindex" 1 (progn (setq obindex t) "@obindex{\\1}"))
+ ("fi" 0 (concat "@end " last-if))
+ ("file" 1 "@file{\\1}")
+ ("filenq" 1 "@file{\\1}")
+ ("filevar" 1 "@file{@var{\\1}}")
+ ("footnote" 1 "@footnote{\\1}")
+ ("frac" 0 "")
+ ("funcline" 2 (progn (setq findex t) "@item \\1 \\2\n@findex \\1"))
+ ("funclineni" 2 "@item \\1 \\2")
+ ("function" 1 "@code{\\1}")
+ ("grammartoken" 1 "@code{\\1}")
+ ("guilabel" 1 "@strong{\\1}")
+ ("hline" 0 "")
+ ("ifhtml" 0 (concat "@" (setq last-if "ifinfo")))
+ ("iftexi" 0 (concat "@" (setq last-if "ifinfo")))
+ ("index" 1 (progn (setq cindex t) "@cindex{\\1}"))
+ ("indexii" 2 (progn (setq cindex t) "@cindex{\\1 \\2}"))
+ ("indexiii" 3 (progn (setq cindex t) "@cindex{\\1 \\2 \\3}"))
+ ("indexiv" 3 (progn (setq cindex t) "@cindex{\\1 \\2 \\3 \\4}"))
+ ("infinity" 0 "@emph{infinity}")
+ ("it" 0 "@destroy")
+ ("kbd" 1 "@key{\\1}")
+ ("keyword" 1 "@code{\\1}")
+ ("kwindex" 1 (progn (setq cindex t) "@cindex{\\1}"))
+ ("label" 1 "@label{\\1}")
+ ("Large" 0 "")
+ ("LaTeX" 0 "La@TeX{}")
+ ("large" 0 "")
+ ("ldots" 0 "@dots{}")
+ ("leftline" 1 "\\1")
+ ("leq" 0 "<=")
+ ("lineii" 2 "@item \\1 @tab \\2")
+ ("lineiii" 3 "@item \\1 @tab \\2 @tab \\3")
+ ("lineiv" 4 "@item \\1 @tab \\2 @tab \\3 @tab \\4")
+ ("linev" 5 "@item \\1 @tab \\2 @tab \\3 @tab \\4 @tab \\5")
+ ("localmoduletable" 0 "")
+ ("longprogramopt" 1 "@option{--\\1}")
+ ("macro" 1 "@code{@backslash{}\\1}")
+ ("mailheader" 1 "@code{\\1}")
+ ("makeindex" 0 "")
+ ("makemodindex" 0 "")
+ ("maketitle" 0 (concat "@top " title "\n"))
+ ("makevar" 1 "@code{\\1}")
+ ("manpage" 2 "@samp{\\1(\\2)}")
+ ("mbox" 1 "@w{\\1}")
+ ("member" 1 "@code{\\1}")
+ ("memberline" 1 "@item \\1\n@findex \\1\n")
+ ("menuselection" 1 "@samp{\\1}")
+ ("method" 1 "@code{\\1}")
+ ("methodline" 2 (progn (setq moindex t) "@item \\1(\\2)\n@moindex \\1\n"))
+ ("methodlineni" 2 "@item \\1(\\2)\n")
+ ("mimetype" 1 "@samp{\\1}")
+ ("module" 1 "@samp{\\1}")
+ ("moduleauthor" 2 "")
+ ("modulesynopsis" 1 "\\1")
+ ("moreargs" 0 "@dots{}")
+ ("n" 0 "@backslash{}n")
+ ("newcommand" 2 "")
+ ("newsgroup" 1 "@samp{\\1}")
+ ("nodename" 1
+ (save-excursion
+ (save-match-data
+ (re-search-backward "^@node "))
+ (delete-region (point) (save-excursion (end-of-line) (point)))
+ (insert "@node " (match-string 1 string))
+ ""))
+ ("noindent" 0 "@noindent ")
+ ("note" 1 "@emph{Note:} \\1")
+ ("NULL" 0 "@code{NULL}")
+ ("obindex" 1 (progn (setq obindex t) "@obindex{\\1}"))
+ ("opindex" 1 (progn (setq cindex t) "@cindex{\\1}"))
+ ("option" 1 "@option{\\1}")
+ ("optional" 1 "[\\1]")
+ ("pep" 1 (progn (setq cindex t) "PEP@ \\1@cindex PEP \\1\n"))
+ ("pi" 0 "pi")
+ ("platform" 1 "")
+ ("plusminus" 0 "+-")
+ ("POSIX" 0 "POSIX")
+ ("production" 2 "@item \\1 \\2")
+ ("productioncont" 1 "@item @w{} \\1")
+ ("program" 1 "@command{\\1}")
+ ("programopt" 1 "@option{\\1}")
+ ("protect" 0 "")
+ ("pytype" 1 "@code{\\1}")
+ ("ref" 1 "@ref{\\1}")
+ ("refbimodindex" 1 (progn (setq moindex t) "@moindex{\\1}"))
+ ("refmodindex" 1 (progn (setq moindex t) "@moindex{\\1}"))
+ ("refmodule" 1 "@samp{\\1}")
+ ("refstmodindex" 1 (progn (setq moindex t) "@moindex{\\1}"))
+ ("regexp" 1 "\"\\1\"")
+ ("release" 1
+ (progn (setq py2texi-python-version (match-string 1 string)) ""))
+ ("renewcommand" 2 "")
+ ("rfc" 1 (progn (setq cindex t) "RFC@ \\1@cindex RFC \\1\n"))
+ ("rm" 0 "@destroy")
+ ("samp" 1 "@samp{\\1}")
+ ("section" 1 (let ((str (match-string 1 string)))
+ (save-match-data
+ (if (string-match "\\(.*\\)[ \t\n]*---[ \t\n]*\\(.*\\)"
+ str)
+ (format
+ "@node %s\n@section %s\n"
+ (py2texi-backslash-quote (match-string 1 str))
+ (py2texi-backslash-quote (match-string 2 str)))
+ "@node \\1\n@section \\1\n"))))
+ ("sectionauthor" 2 "")
+ ("seelink" 3 "\n@table @url\n@item @strong{\\1}\n(\\2)\n\\3\n@end table\n")
+ ("seemodule" 2 "@ref{\\1} \\2")
+ ("seepep" 3 "\n@table @strong\n@item PEP\\1 \\2\n\\3\n@end table\n")
+ ("seerfc" 3 "\n@table @strong\n@item RFC\\1 \\2\n\\3\n@end table\n")
+ ("seetext" 1 "\\1")
+ ("seetitle" 1 "@cite{\\1}")
+ ("seeurl" 2 "\n@table @url\n@item \\1\n\\2\n@end table\n")
+ ("setindexsubitem" 1 (progn (setq cindex t) "@cindex \\1"))
+ ("setreleaseinfo" 1 (progn (setq py2texi-releaseinfo "")))
+ ("setshortversion" 1
+ (progn (setq py2texi-python-short-version (match-string 1 string)) ""))
+ ("shortversion" 0 py2texi-python-short-version)
+ ("sqrt" 0 "")
+ ("stindex" 1 (progn (setq cindex t) "@cindex{\\1}"))
+ ("stmodindex" 1 (progn (setq moindex t) "@moindex{\\1}"))
+ ("strong" 1 "@strong{\\1}")
+ ("sub" 0 "/")
+ ("subsection" 1 "@node \\1\n@subsection \\1\n")
+ ("subsubsection" 1 "@node \\1\n@subsubsection \\1\n")
+ ("sum" 0 "")
+ ("tableofcontents" 0 "")
+ ("term" 1 "@item \\1")
+ ("TeX" 0 "@TeX{}")
+ ("textasciitilde" 0 "~")
+ ("textasciicircum" 0 "^")
+ ("textbackslash" 0 "@backslash{}")
+ ("textbar" 0 "|")
+ ; Some common versions of Texinfo don't support @euro yet:
+ ; ("texteuro" 0 "@euro{}")
+ ; Unfortunately, this alternate spelling doesn't actually apply to
+ ; the usage found in Python Tutorial, which actually requires a
+ ; Euro symbol to make sense, so this is commented out as well.
+ ; ("texteuro" 0 "Euro ")
+ ("textgreater" 0 ">")
+ ("textit" 1 "@i{\\1}")
+ ("textless" 0 "<")
+ ("textrm" 1 "\\1")
+ ("texttt" 1 "@code{\\1}")
+ ("textunderscore" 0 "_")
+ ("title" 1 (progn (setq title (match-string 1 string)) "@settitle \\1"))
+ ("today" 0 "@today{}")
+ ("token" 1 "@code{\\1}")
+ ("tt" 0 "@destroy")
+ ("ttindex" 1 (progn (setq cindex t) "@cindex{\\1}"))
+ ("u" 0 "@backslash{}u")
+ ("ulink" 2 "\\1")
+ ("UNIX" 0 "UNIX")
+ ("unspecified" 0 "@dots{}")
+ ("url" 1 "@url{\\1}")
+ ("usepackage" 1 "")
+ ("var" 1 "@var{\\1}")
+ ("verbatiminput" 1 "@code{\\1}")
+ ("version" 0 py2texi-python-version)
+ ("versionadded" 1 "@emph{Added in Python version \\1}")
+ ("versionchanged" 1 "@emph{Changed in Python version \\1}")
+ ("vskip" 1 "")
+ ("vspace" 1 "")
+ ("warning" 1 "@emph{\\1}")
+ ("withsubitem" 2 "\\2")
+ ("XXX" 1 "@strong{\\1}"))
+ "Associative list of command substitutions.
+Each list item is of the form (COMMAND ARGNUM SUBSTITUTION) where:
+- COMMAND is LaTeX command name
+- ARGNUM is number of (required) command arguments
+- SUBSTITUTION substitution for the command. It is evaled and you can
+ reference command arguments through the \\N regexp notation in strings.")
+
+(defvar py2texi-magic "@documentclass\n"
+ "\"Magic\" string for auxiliary insertion at the beginning of document.")
+
+(defvar py2texi-dirs '("./" "../texinputs/")
+ "Where to search LaTeX input files.")
+
+(defvar py2texi-buffer "*py2texi*"
+ "The name of a buffer where Texinfo is generated.")
+
+(defconst py2texi-xemacs (string-match "^XEmacs" (emacs-version))
+ "Running under XEmacs?")
+
+
+(defmacro py2texi-search (regexp &rest body)
+ `(progn
+ (goto-char (point-min))
+ (while (re-search-forward ,regexp nil t)
+ ,@body)))
+
+(defmacro py2texi-search-safe (regexp &rest body)
+ `(py2texi-search ,regexp
+ (unless (py2texi-protected)
+ ,@body)))
+
+
+(defun py2texi-message (message)
+ "Report message and stop if `py2texi-stop-on-problems' is non-nil."
+ (if py2texi-stop-on-problems
+ (error message)
+ (message message)))
+
+
+(defun py2texi-backslash-quote (string)
+ "Double backslashes in STRING."
+ (let ((i 0))
+ (save-match-data
+ (while (setq i (string-match "\\\\" string i))
+ (setq string (replace-match "\\\\\\\\" t nil string))
+ (setq i (+ i 2))))
+ string))
+
+
+(defun py2texi (file)
+ "Convert Python LaTeX documentation FILE to Texinfo."
+ (interactive "fFile to convert: ")
+ (switch-to-buffer (get-buffer-create py2texi-buffer))
+ (erase-buffer)
+ (insert-file file)
+ (let ((case-fold-search nil)
+ (title "")
+ (author "")
+ (author-address "")
+ (appendix nil)
+ (findex nil)
+ (obindex nil)
+ (cindex nil)
+ (moindex nil)
+ last-if)
+ (py2texi-process-verbatims)
+ (py2texi-process-comments)
+ (py2texi-process-includes)
+ (py2texi-process-funnyas)
+ (py2texi-process-environments)
+ (py2texi-process-commands)
+ (py2texi-fix-indentation)
+ (py2texi-fix-nodes)
+ (py2texi-fix-references)
+ (py2texi-fix-indices)
+ (py2texi-process-simple-commands)
+ (py2texi-fix-fonts)
+ (py2texi-fix-braces)
+ (py2texi-fix-backslashes)
+ (py2texi-destroy-empties)
+ (py2texi-fix-newlines)
+ (py2texi-adjust-level))
+ (let* ((texi-file-name (or py2texi-texi-file-name
+ (py2texi-texi-file-name file)))
+ (info-file-name (or py2texi-info-file-name
+ (py2texi-info-file-name texi-file-name))))
+ (goto-char (point-min))
+ (when (looking-at py2texi-magic)
+ (delete-region (point) (progn (beginning-of-line 2) (point)))
+ (insert "\\input texinfo @c -*-texinfo-*-\n")
+ (insert "@setfilename " info-file-name))
+ (when (re-search-forward "@chapter" nil t)
+ (texinfo-all-menus-update t))
+ (goto-char (point-min))
+ (write-file texi-file-name)
+ (message (format "You can apply `makeinfo %s' now." texi-file-name))))
+
+
+(defun py2texi-texi-file-name (filename)
+ "Generate name of Texinfo file from original file name FILENAME."
+ (concat filename
+ (if (string-match "\\.tex$" filename) "i" ".texi")))
+
+
+(defun py2texi-info-file-name (filename)
+ "Generate name of info file from original file name FILENAME."
+ (setq filename (expand-file-name filename))
+ (let ((directory (file-name-directory filename))
+ (basename (file-name-nondirectory filename)))
+ (concat directory "python-"
+ (substring basename 0 (- (length basename) 4)) "info")))
+
+
+(defun py2texi-process-verbatims ()
+ "Process and protect verbatim environments."
+ (let (delimiter
+ beg
+ end)
+ (py2texi-search-safe "\\\\begin{\\(verbatim\\|displaymath\\)}"
+ (replace-match "@example")
+ (setq beg (copy-marker (point) nil))
+ (re-search-forward "\\\\end{\\(verbatim\\|displaymath\\)}")
+ (setq end (copy-marker (match-beginning 0) nil))
+ (replace-match "@end example")
+ (py2texi-texinfo-escape beg end)
+ (put-text-property (- beg (length "@example"))
+ (+ end (length "@end example"))
+ 'py2texi-protected t))
+ (py2texi-search-safe "\\\\verb\\([^a-z]\\)"
+ (setq delimiter (match-string 1))
+ (replace-match "@code{")
+ (setq beg (copy-marker (point) nil))
+ (re-search-forward (regexp-quote delimiter))
+ (setq end (copy-marker (match-beginning 0) nil))
+ (replace-match "}")
+ (put-text-property (- beg (length "@code{")) (+ end (length "}"))
+ 'py2texi-protected t)
+ (py2texi-texinfo-escape beg end))))
+
+
+(defun py2texi-process-comments ()
+ "Remove comments."
+ (let (point)
+ (py2texi-search-safe "%"
+ (setq point (point))
+ (when (save-excursion
+ (re-search-backward "\\(^\\|[^\\]\\(\\\\\\\\\\)*\\)%\\=" nil t))
+ (delete-region (1- point)
+ (save-excursion (beginning-of-line 2) (point)))))))
+
+
+(defun py2texi-process-includes ()
+ "Include LaTeX input files.
+Do not include .ind files."
+ (let ((path (file-name-directory file))
+ filename
+ dirs
+ includefile)
+ (py2texi-search-safe "\\\\input{\\([^}]+\\)}"
+ (setq filename (match-string 1))
+ (unless (save-match-data (string-match "\\.tex$" filename))
+ (setq filename (concat filename ".tex")))
+ (setq includefile (save-match-data
+ (string-match "\\.ind\\.tex$" filename)))
+ (setq dirs py2texi-dirs)
+ (while (and (not includefile) dirs)
+ (setq includefile
+ (concat (file-name-as-directory (car dirs)) filename))
+ (if (not (file-name-absolute-p includefile))
+ (setq includefile
+ (concat (file-name-as-directory path) includefile)))
+ (unless (file-exists-p includefile)
+ (setq includefile nil)
+ (setq dirs (cdr dirs))))
+ (if includefile
+ (save-restriction
+ (narrow-to-region (match-beginning 0) (match-end 0))
+ (delete-region (point-min) (point-max))
+ (when (stringp includefile)
+ (insert-file-contents includefile)
+ (goto-char (point-min))
+ (insert "\n")
+ (py2texi-process-verbatims)
+ (py2texi-process-comments)
+ (py2texi-process-includes)))
+ (replace-match (format "\\\\emph{Included file %s}" filename))
+ (py2texi-message (format "Input file %s not found" filename))))))
+
+
+(defun py2texi-process-funnyas ()
+ "Convert @s."
+ (py2texi-search-safe "@"
+ (replace-match "@@")))
+
+
+(defun py2texi-process-environments ()
+ "Process LaTeX environments."
+ (let ((stack ())
+ kind
+ environment
+ parameter
+ arguments
+ n
+ string
+ description)
+ (py2texi-search-safe (concat "\\\\\\(begin\\|end\\|item\\)"
+ "\\({\\([^}]*\\)}\\|[[]\\([^]]*\\)[]]\\|\\)")
+ (setq kind (match-string 1)
+ environment (match-string 3)
+ parameter (match-string 4))
+ (replace-match "")
+ (cond
+ ((string= kind "begin")
+ (setq description (assoc environment py2texi-environments))
+ (if description
+ (progn
+ (setq n (cadr description))
+ (setq description (cddr description))
+ (setq string (py2texi-tex-arguments n))
+ (string-match (py2texi-regexp n) string)
+ ; incorrect but sufficient
+ (insert (replace-match (eval (car description))
+ t nil string))
+ (setq stack (cons (cadr description) stack)))
+ (py2texi-message (format "Unknown environment: %s" environment))
+ (setq stack (cons "" stack))))
+ ((string= kind "end")
+ (insert (eval (car stack)))
+ (setq stack (cdr stack)))
+ ((string= kind "item")
+ (insert "\n@item " (or parameter "") "\n"))))
+ (when stack
+ (py2texi-message (format "Unclosed environment: %s" (car stack))))))
+
+
+(defun py2texi-process-commands ()
+ "Process LaTeX commands."
+ (let (done
+ command
+ command-info
+ string
+ n)
+ (while (not done)
+ (setq done t)
+ (py2texi-search-safe "\\\\\\([a-zA-Z*]+\\)\\(\\[[^]]*\\]\\)?"
+ (setq command (match-string 1))
+ (setq command-info (assoc command py2texi-commands))
+ (if command-info
+ (progn
+ (setq done nil)
+ (replace-match "")
+ (setq command-info (cdr command-info))
+ (setq n (car command-info))
+ (setq string (py2texi-tex-arguments n))
+ (string-match (py2texi-regexp n) string)
+ ; incorrect but sufficient
+ (insert (replace-match (eval (cadr command-info))
+ t nil string)))
+ (py2texi-message (format "Unknown command: %s (not processed)"
+ command)))))))
+
+
+(defun py2texi-argument-pattern (count)
+ (let ((filler "\\(?:[^{}]\\|\\\\{\\|\\\\}\\)*"))
+ (if (<= count 0)
+ filler
+ (concat filler "\\(?:{"
+ (py2texi-argument-pattern (1- count))
+ "}" filler "\\)*" filler))))
+(defconst py2texi-tex-argument
+ (concat
+ "{\\("
+ (py2texi-argument-pattern 10) ;really at least 10!
+ "\\)}[ \t%@c\n]*")
+ "Regexp describing LaTeX command argument including argument separators.")
+
+
+(defun py2texi-regexp (n)
+ "Make regexp matching N LaTeX command arguments."
+ (if (= n 0)
+ ""
+ (let ((regexp "^[^{]*"))
+ (while (> n 0)
+ (setq regexp (concat regexp py2texi-tex-argument))
+ (setq n (1- n)))
+ regexp)))
+
+
+(defun py2texi-tex-arguments (n)
+ "Remove N LaTeX command arguments and return them as a string."
+ (let ((point (point))
+ (i 0)
+ result
+ match)
+ (if (= n 0)
+ (progn
+ (when (re-search-forward "\\=\\({}\\| *\\)" nil t)
+ (replace-match ""))
+ "")
+ (while (> n 0)
+ (unless (re-search-forward
+ "\\(\\=\\|[^\\\\]\\)\\(\\\\\\\\\\)*\\([{}]\\)" nil t)
+ (debug))
+ (if (string= (match-string 3) "{")
+ (setq i (1+ i))
+ (setq i (1- i))
+ (when (<= i 0)
+ (setq n (1- n)))))
+ (setq result (buffer-substring-no-properties point (point)))
+ (while (string-match "\n[ \t]*" result)
+ (setq result (replace-match " " t nil result)))
+ (delete-region point (point))
+ result)))
+
+
+(defun py2texi-process-simple-commands ()
+ "Replace single character LaTeX commands."
+ (let (char)
+ (py2texi-search-safe "\\\\\\([^a-z]\\)"
+ (setq char (match-string 1))
+ (replace-match (format "%s%s"
+ (if (or (string= char "{")
+ (string= char "}")
+ (string= char " "))
+ "@"
+ "")
+ (if (string= char "\\")
+ "\\\\"
+ char))))))
+
+
+(defun py2texi-fix-indentation ()
+ "Remove white space at the beginning of lines."
+ (py2texi-search-safe "^[ \t]+"
+ (replace-match "")))
+
+
+(defun py2texi-fix-nodes ()
+ "Remove unwanted characters from nodes and make nodes unique."
+ (let ((nodes (make-hash-table :test 'equal))
+ id
+ counter
+ string
+ label
+ index)
+ (py2texi-search "^@node +\\(.*\\)$"
+ (setq string (match-string 1))
+ (if py2texi-xemacs
+ (replace-match "@node " t)
+ (replace-match "" t nil nil 1))
+ (while (string-match "@label{[^}]*}" string)
+ (setq label (match-string 0 string))
+ (setq string (replace-match "" t nil string)))
+ (while (string-match "@..?index{[^}]*}" string)
+ (setq index (match-string 0 string))
+ (setq string (replace-match "" t nil string)))
+ (while (string-match "@[a-zA-Z]+\\|[{}():]\\|``\\|''" string)
+ (setq string (replace-match "" t nil string)))
+ (while (string-match " -- " string)
+ (setq string (replace-match " - " t nil string)))
+ (while (string-match "\\." string)
+ (setq string (replace-match "" t nil string)))
+ (when (string-match " +$" string)
+ (setq string (replace-match "" t nil string)))
+ (when (string-match "^\\(Built-in\\|Standard\\) Module \\|The " string)
+ (setq string (replace-match "" t nil string)))
+ (string-match "^[^,]+" string)
+ (setq id (match-string 0 string))
+ (setq counter (gethash id nodes))
+ (if counter
+ (progn
+ (setq counter (1+ counter))
+ (setq string (replace-match (format "\\& %d" counter)
+ t nil string)))
+ (setq counter 1))
+ (setf (gethash id nodes) counter)
+ (insert string)
+ (beginning-of-line 3)
+ (when label
+ (insert label "\n"))
+ (when index
+ (insert index "\n")))))
+
+
+(defun py2texi-fix-references ()
+ "Process labels and make references to point to appropriate nodes."
+ (let ((labels ())
+ node)
+ (py2texi-search-safe "@label{\\([^}]*\\)}"
+ (setq node (save-excursion
+ (save-match-data
+ (and (re-search-backward "@node +\\([^,\n]+\\)" nil t)
+ (match-string 1)))))
+ (when node
+ (setq labels (cons (cons (match-string 1) node) labels)))
+ (replace-match ""))
+ (py2texi-search-safe "@ref{\\([^}]*\\)}"
+ (setq node (assoc (match-string 1) labels))
+ (replace-match "")
+ (when node
+ (insert (format "@ref{%s}" (cdr node)))))))
+
+
+(defun py2texi-fix-indices ()
+ "Remove unwanted characters from @*index commands and create final indices."
+ (py2texi-search-safe "@..?index\\>[^\n]*\\(\\)\n"
+ (replace-match "" t nil nil 1))
+ (py2texi-search-safe "@..?index\\>[^\n]*\\(\\)"
+ (replace-match "\n" t nil nil 1))
+ (py2texi-search-safe "@..?index\\({\\)\\([^}]+\\)\\(}+\\)"
+ (replace-match " " t nil nil 1)
+ (replace-match "" t nil nil 3)
+ (let ((string (match-string 2)))
+ (save-match-data
+ (while (string-match "@[a-z]+{" string)
+ (setq string (replace-match "" nil nil string)))
+ (while (string-match "{" string)
+ (setq string (replace-match "" nil nil string))))
+ (replace-match string t t nil 2)))
+ (py2texi-search-safe "@..?index\\>.*\\([{}]\\|@[a-z]*\\)"
+ (replace-match "" t nil nil 1)
+ (goto-char (match-beginning 0)))
+ (py2texi-search-safe "[^\n]\\(\\)@..?index\\>"
+ (replace-match "\n" t nil nil 1))
+ (goto-char (point-max))
+ (re-search-backward "@indices")
+ (replace-match "")
+ (insert (if moindex
+ (concat "@node Module Index\n"
+ "@unnumbered Module Index\n"
+ "@printindex mo\n")
+ "")
+ (if obindex
+ (concat "@node Class-Exception-Object Index\n"
+ "@unnumbered Class, Exception, and Object Index\n"
+ "@printindex ob\n")
+ "")
+ (if findex
+ (concat "@node Function-Method-Variable Index\n"
+ "@unnumbered Function, Method, and Variable Index\n"
+ "@printindex fn\n")
+ "")
+ (if cindex
+ (concat "@node Miscellaneous Index\n"
+ "@unnumbered Miscellaneous Index\n"
+ "@printindex cp\n")
+ "")))
+
+
+(defun py2texi-fix-backslashes ()
+ "Make backslashes from auxiliary commands."
+ (py2texi-search-safe "@backslash{}"
+ (replace-match "\\\\")))
+
+
+(defun py2texi-fix-fonts ()
+ "Remove garbage after unstructured font commands."
+ (let (string)
+ (py2texi-search-safe "@destroy"
+ (replace-match "")
+ (when (eq (preceding-char) ?{)
+ (forward-char -1)
+ (setq string (py2texi-tex-arguments 1))
+ (insert (substring string 1 (1- (length string))))))))
+
+
+(defun py2texi-fix-braces ()
+ "Escape braces for Texinfo."
+ (let (string)
+ (py2texi-search "{"
+ (unless (or (py2texi-protected)
+ (save-excursion
+ (re-search-backward
+ "@\\([a-zA-Z]*\\|multitable.*\\){\\=" nil t)))
+ (forward-char -1)
+ (setq string (py2texi-tex-arguments 1))
+ (insert "@" (substring string 0 (1- (length string))) "@}")))))
+
+
+(defun py2texi-fix-newlines ()
+ "Remove extra newlines."
+ (py2texi-search "\n\n\n+"
+ (replace-match "\n\n"))
+ (py2texi-search-safe "@item.*\n\n"
+ (delete-backward-char 1))
+ (py2texi-search "@end example"
+ (unless (looking-at "\n\n")
+ (insert "\n"))))
+
+
+(defun py2texi-destroy-empties ()
+ "Remove all comments.
+This avoids some makeinfo errors."
+ (py2texi-search "@c\\>"
+ (unless (eq (py2texi-protected) t)
+ (delete-region (- (point) 2) (save-excursion (end-of-line) (point)))
+ (cond
+ ((looking-at "\n\n")
+ (delete-char 1))
+ ((save-excursion (re-search-backward "^[ \t]*\\=" nil t))
+ (delete-region (save-excursion (beginning-of-line) (point))
+ (1+ (point))))))))
+
+
+(defun py2texi-adjust-level ()
+ "Increase heading level to @chapter, if needed.
+This is only needed for distutils, so it has a very simple form only."
+ (goto-char (point-min))
+ (unless (re-search-forward "@chapter\\>" nil t)
+ (py2texi-search-safe "@section\\>"
+ (replace-match "@chapter" t))
+ (py2texi-search-safe "@\\(sub\\)\\(sub\\)?section\\>"
+ (replace-match "" nil nil nil 1))))
+
+
+(defun py2texi-texinfo-escape (beg end)
+ "Escape Texinfo special characters in region."
+ (save-excursion
+ (goto-char beg)
+ (while (re-search-forward "[@{}]" end t)
+ (replace-match "@\\&"))))
+
+
+(defun py2texi-protected ()
+ "Return protection status of the point before current point."
+ (get-text-property (1- (point)) 'py2texi-protected))
+
+
+;;; Announce
+
+(provide 'py2texi)
+
+
+;;; py2texi.el ends here
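The two big tables above amount to a mapping from each LaTeX construct to a Texinfo begin/end (or substitution) pair; a toy Python illustration of that idea, not the elisp the build actually runs:

    environments = {
        "itemize":   ("@itemize @bullet", "@end itemize"),
        "enumerate": ("@enumerate", "@end enumerate"),
        "alltt":     ("@example", "@end example"),
    }

    begin, end = environments["itemize"]
    print begin
    print "@item first point"
    print end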
Added: sandbox/trunk/pdb/Doc/tools/toc2bkm.py
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/Doc/tools/toc2bkm.py Wed Jul 5 14:19:15 2006
@@ -0,0 +1,143 @@
+#! /usr/bin/env python
+
+"""Convert a LaTeX .toc file to some PDFTeX magic to create that neat outline.
+
+The output file has an extension of '.bkm' instead of '.out', since hyperref
+already uses that extension.
+"""
+
+import getopt
+import os
+import re
+import string
+import sys
+
+
+# Each item in an entry is a tuple of:
+#
+# Section #, Title String, Page #, List of Sub-entries
+#
+# The return value of parse_toc() is such a tuple.
+
+cline_re = r"""^
+\\contentsline\ \{([a-z]*)} # type of section in $1
+\{(?:\\numberline\ \{([0-9.A-Z]+)})? # section number
+(.*)} # title string
+\{(\d+)}$""" # page number
+
+cline_rx = re.compile(cline_re, re.VERBOSE)
+
+OUTER_TO_INNER = -1
+
+_transition_map = {
+ ('chapter', 'section'): OUTER_TO_INNER,
+ ('section', 'subsection'): OUTER_TO_INNER,
+ ('subsection', 'subsubsection'): OUTER_TO_INNER,
+ ('subsubsection', 'subsection'): 1,
+ ('subsection', 'section'): 1,
+ ('section', 'chapter'): 1,
+ ('subsection', 'chapter'): 2,
+ ('subsubsection', 'section'): 2,
+ ('subsubsection', 'chapter'): 3,
+ }
+
+INCLUDED_LEVELS = ("chapter", "section", "subsection", "subsubsection")
+
+
+def parse_toc(fp, bigpart=None):
+ toc = top = []
+ stack = [toc]
+ level = bigpart or 'chapter'
+ lineno = 0
+ while 1:
+ line = fp.readline()
+ if not line:
+ break
+ lineno = lineno + 1
+ m = cline_rx.match(line)
+ if m:
+ stype, snum, title, pageno = m.group(1, 2, 3, 4)
+ title = clean_title(title)
+ entry = (stype, snum, title, int(pageno), [])
+ if stype == level:
+ toc.append(entry)
+ else:
+ if stype not in INCLUDED_LEVELS:
+ # we don't want paragraphs & subparagraphs
+ continue
+ direction = _transition_map[(level, stype)]
+ if direction == OUTER_TO_INNER:
+ toc = toc[-1][-1]
+ stack.insert(0, toc)
+ toc.append(entry)
+ else:
+ for i in range(direction):
+ del stack[0]
+ toc = stack[0]
+ toc.append(entry)
+ level = stype
+ else:
+ sys.stderr.write("l.%s: " + line)
+ return top
+
+
+hackscore_rx = re.compile(r"\\hackscore\s*{[^}]*}")
+raisebox_rx = re.compile(r"\\raisebox\s*{[^}]*}")
+title_rx = re.compile(r"\\([a-zA-Z])+\s+")
+title_trans = string.maketrans("", "")
+
+def clean_title(title):
+ title = raisebox_rx.sub("", title)
+ title = hackscore_rx.sub(r"\\_", title)
+ pos = 0
+ while 1:
+ m = title_rx.search(title, pos)
+ if m:
+ start = m.start()
+ if title[start:start+15] != "\\textunderscore":
+ title = title[:start] + title[m.end():]
+ pos = start + 1
+ else:
+ break
+ title = title.translate(title_trans, "{}")
+ return title
+
+
+def write_toc(toc, fp):
+ for entry in toc:
+ write_toc_entry(entry, fp, 0)
+
+def write_toc_entry(entry, fp, layer):
+ stype, snum, title, pageno, toc = entry
+ s = "\\pdfoutline goto name{page%03d}" % pageno
+ if toc:
+ s = "%s count -%d" % (s, len(toc))
+ if snum:
+ title = "%s %s" % (snum, title)
+ s = "%s {%s}\n" % (s, title)
+ fp.write(s)
+ for entry in toc:
+ write_toc_entry(entry, fp, layer + 1)
+
+
+def process(ifn, ofn, bigpart=None):
+ toc = parse_toc(open(ifn), bigpart)
+ write_toc(toc, open(ofn, "w"))
+
+
+def main():
+ bigpart = None
+ opts, args = getopt.getopt(sys.argv[1:], "c:")
+ if opts:
+ bigpart = opts[0][1]
+ if not args:
+ usage()
+ sys.exit(2)
+ for filename in args:
+ base, ext = os.path.splitext(filename)
+ ext = ext or ".toc"
+ process(base + ext, base + ".bkm", bigpart)
+
+
+if __name__ == "__main__":
+ main()
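A quick check of what the cline_rx pattern above captures from a typical \contentsline record (a sketch, assuming toc2bkm.py is importable; the sample line is made up):

    from toc2bkm import cline_rx

    line = r"\contentsline {section}{\numberline {18.7}Creating the outline}{123}"
    m = cline_rx.match(line)
    print m.group(1, 2, 3, 4)
    # ('section', '18.7', 'Creating the outline', '123')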
From python-checkins at python.org Wed Jul 5 14:27:01 2006
From: python-checkins at python.org (matt.fleming)
Date: Wed, 5 Jul 2006 14:27:01 +0200 (CEST)
Subject: [Python-checkins] r47239 - sandbox/trunk/pdb/doc
Message-ID: <20060705122701.527981E4017@bag.python.org>
Author: matt.fleming
Date: Wed Jul 5 14:27:01 2006
New Revision: 47239
Removed:
sandbox/trunk/pdb/doc/
Log:
Remove old doc directory.
From python-checkins at python.org Wed Jul 5 16:18:46 2006
From: python-checkins at python.org (georg.brandl)
Date: Wed, 5 Jul 2006 16:18:46 +0200 (CEST)
Subject: [Python-checkins] r47241 - python/trunk/Doc/lib/libfuncs.tex
Message-ID: <20060705141846.600971E4005@bag.python.org>
Author: georg.brandl
Date: Wed Jul 5 16:18:45 2006
New Revision: 47241
Modified:
python/trunk/Doc/lib/libfuncs.tex
Log:
Patch #1517490: fix glitches in filter() docs.
Modified: python/trunk/Doc/lib/libfuncs.tex
==============================================================================
--- python/trunk/Doc/lib/libfuncs.tex (original)
+++ python/trunk/Doc/lib/libfuncs.tex Wed Jul 5 16:18:45 2006
@@ -468,10 +468,10 @@
Construct a list from those elements of \var{list} for which
\var{function} returns true. \var{list} may be either a sequence, a
container which supports iteration, or an iterator, If \var{list}
- is a string or a tuple, the result also has that type; otherwise it
- is always a list. If \var{function} is \code{None}, the identity
- function is assumed, that is, all elements of \var{list} that are false
- (zero or empty) are removed.
+ is a string (either \code{str} or \code{unicode}) or a tuple, the result
+ also has that type; otherwise it is always a list. If \var{function} is
+ \code{None}, the identity function is assumed, that is, all elements of
+ \var{list} that are false are removed.
Note that \code{filter(function, \var{list})} is equivalent to
\code{[item for item in \var{list} if function(item)]} if function is
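A short illustration of the behaviour the reworded paragraph describes, under Python 2 semantics (inputs made up):

    print filter(str.isdigit, "py2.5b1")      # '251'  (str in, str out)
    print filter(unicode.isalpha, u"a1b2")    # u'ab'  (unicode in, unicode out)
    print filter(None, (0, 1, '', 'spam'))    # (1, 'spam')  (false elements removed)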
From python-checkins at python.org Wed Jul 5 16:18:49 2006
From: python-checkins at python.org (georg.brandl)
Date: Wed, 5 Jul 2006 16:18:49 +0200 (CEST)
Subject: [Python-checkins] r47242 -
python/branches/release24-maint/Doc/lib/libfuncs.tex
Message-ID: <20060705141849.9AEA51E4003@bag.python.org>
Author: georg.brandl
Date: Wed Jul 5 16:18:49 2006
New Revision: 47242
Modified:
python/branches/release24-maint/Doc/lib/libfuncs.tex
Log:
Patch #1517490: fix glitches in filter() docs.
(backport from rev. 47241)
Modified: python/branches/release24-maint/Doc/lib/libfuncs.tex
==============================================================================
--- python/branches/release24-maint/Doc/lib/libfuncs.tex (original)
+++ python/branches/release24-maint/Doc/lib/libfuncs.tex Wed Jul 5 16:18:49 2006
@@ -432,10 +432,10 @@
Construct a list from those elements of \var{list} for which
\var{function} returns true. \var{list} may be either a sequence, a
container which supports iteration, or an iterator, If \var{list}
- is a string or a tuple, the result also has that type; otherwise it
- is always a list. If \var{function} is \code{None}, the identity
- function is assumed, that is, all elements of \var{list} that are false
- (zero or empty) are removed.
+ is a string (either \code{str} or \code{unicode}) or a tuple, the result
+ also has that type; otherwise it is always a list. If \var{function} is
+ \code{None}, the identity function is assumed, that is, all elements of
+ \var{list} that are false are removed.
Note that \code{filter(function, \var{list})} is equivalent to
\code{[item for item in \var{list} if function(item)]} if function is
From python-checkins at python.org Wed Jul 5 16:49:26 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Wed, 5 Jul 2006 16:49:26 +0200 (CEST)
Subject: [Python-checkins] r47243 - sandbox/trunk/Doc/functional.rst
Message-ID: <20060705144926.3313C1E4015@bag.python.org>
Author: andrew.kuchling
Date: Wed Jul 5 16:49:25 2006
New Revision: 47243
Modified:
sandbox/trunk/Doc/functional.rst
Log:
Fix example; add paragraph completing a thought; link to functional package
Modified: sandbox/trunk/Doc/functional.rst
==============================================================================
--- sandbox/trunk/Doc/functional.rst (original)
+++ sandbox/trunk/Doc/functional.rst Wed Jul 5 16:49:25 2006
@@ -253,7 +253,7 @@
>>> L = [1,2,3]
>>> iterator = iter(L)
- >>> a,b,c = i
+ >>> a,b,c = iterator
>>> a,b,c
(1, 2, 3)
@@ -875,6 +875,8 @@
for a, b in items:
total += b
+Many uses of ``reduce()`` are clearer when written as ``for`` loops.
+
Fredrik Lundh once suggested the following set of rules for refactoring
uses of ``lambda``:
@@ -1135,14 +1137,18 @@
server_log = functools.partial(log, subsystem='server')
server_log('Unable to open socket')
+There are also third-party modules, such as Collin Winter's
+`functional package `__,
+that are intended for use in functional-style programs.
+
Revision History and Acknowledgements
------------------------------------------------
The author would like to thank the following people for offering
suggestions, corrections and assistance with various drafts of this
-article: Ian Bicking, Nick Coghlan, Raymond Hettinger, Jim Jewett,
-Leandro Lameiro, Blake Winton.
+article: Ian Bicking, Nick Coghlan, Nick Efford, Raymond Hettinger,
+Jim Jewett, Leandro Lameiro, Collin Winter, Blake Winton.
Version 0.1: posted June 30 2006.
From python at rcn.com Wed Jul 5 17:36:09 2006
From: python at rcn.com (Raymond Hettinger)
Date: Wed, 05 Jul 2006 15:36:09 -0000
Subject: [Python-checkins] r47241 - python/trunk/Doc/lib/libfuncs.tex
References: <20060705141846.600971E4005@bag.python.org>
Message-ID: <00a901c688b5$bfd20cb0$884310ac@RaymondLaptop1>
> Patch #1517490: fix glitches in filter() docs.
. . .
> - is a string or a tuple, the result also has that type; otherwise it
> - is always a list. If \var{function} is \code{None}, the identity
> - function is assumed, that is, all elements of \var{list} that are false
> - (zero or empty) are removed.
> + is a string (either \code{str} or \code{unicode}) or a tuple, the result
> + also has that type; otherwise it is always a list. If \var{function} is
> + \code{None}, the identity function is assumed, that is, all elements of
> + \var{list} that are false are removed.
I don't think there is a benefit to going through the docs and elaborating on
"string" as meaning either "str" or "unicode".
Also, in this particular case, the object can also be a subclass of str or
unicode.
Raymond
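A quick illustration of the behavior under discussion (Python 2 semantics;
the values below are only examples, not part of the patch):

    >>> filter(str.isalpha, "a1b2c3")    # str in, str out
    'abc'
    >>> filter(None, (0, 1, '', 'x'))    # tuple in, tuple out
    (1, 'x')
    >>> filter(None, iter([0, 1, 2]))    # any other iterable -> list
    [1, 2]

As noted above, instances of str or unicode subclasses are accepted as well.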
From python-checkins at python.org Wed Jul 5 17:50:06 2006
From: python-checkins at python.org (georg.brandl)
Date: Wed, 5 Jul 2006 17:50:06 +0200 (CEST)
Subject: [Python-checkins] r47244 - python/trunk/Doc/lib/libfuncs.tex
Message-ID: <20060705155006.101161E4003@bag.python.org>
Author: georg.brandl
Date: Wed Jul 5 17:50:05 2006
New Revision: 47244
Modified:
python/trunk/Doc/lib/libfuncs.tex
Log:
no need to elaborate "string".
Modified: python/trunk/Doc/lib/libfuncs.tex
==============================================================================
--- python/trunk/Doc/lib/libfuncs.tex (original)
+++ python/trunk/Doc/lib/libfuncs.tex Wed Jul 5 17:50:05 2006
@@ -468,7 +468,7 @@
Construct a list from those elements of \var{list} for which
\var{function} returns true. \var{list} may be either a sequence, a
container which supports iteration, or an iterator, If \var{list}
- is a string (either \code{str} or \code{unicode}) or a tuple, the result
+ is a string or a tuple, the result
also has that type; otherwise it is always a list. If \var{function} is
\code{None}, the identity function is assumed, that is, all elements of
\var{list} that are false are removed.
From python-checkins at python.org Wed Jul 5 17:50:08 2006
From: python-checkins at python.org (georg.brandl)
Date: Wed, 5 Jul 2006 17:50:08 +0200 (CEST)
Subject: [Python-checkins] r47245 -
python/branches/release24-maint/Doc/lib/libfuncs.tex
Message-ID: <20060705155008.CD4DE1E4009@bag.python.org>
Author: georg.brandl
Date: Wed Jul 5 17:50:08 2006
New Revision: 47245
Modified:
python/branches/release24-maint/Doc/lib/libfuncs.tex
Log:
no need to elaborate "string".
(backport from rev. 47244)
Modified: python/branches/release24-maint/Doc/lib/libfuncs.tex
==============================================================================
--- python/branches/release24-maint/Doc/lib/libfuncs.tex (original)
+++ python/branches/release24-maint/Doc/lib/libfuncs.tex Wed Jul 5 17:50:08 2006
@@ -432,7 +432,7 @@
Construct a list from those elements of \var{list} for which
\var{function} returns true. \var{list} may be either a sequence, a
container which supports iteration, or an iterator, If \var{list}
- is a string (either \code{str} or \code{unicode}) or a tuple, the result
+ is a string or a tuple, the result
also has that type; otherwise it is always a list. If \var{function} is
\code{None}, the identity function is assumed, that is, all elements of
\var{list} that are false are removed.
From buildbot at python.org Wed Jul 5 18:04:20 2006
From: buildbot at python.org (buildbot at python.org)
Date: Wed, 05 Jul 2006 16:04:20 +0000
Subject: [Python-checkins] buildbot warnings in alpha Debian trunk
Message-ID: <20060705160420.CA7991E4003@bag.python.org>
The Buildbot has detected a new failure of alpha Debian trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/alpha%2520Debian%2520trunk/builds/448
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: thomas.wouters
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Wed Jul 5 22:54:26 2006
From: python-checkins at python.org (matt.fleming)
Date: Wed, 5 Jul 2006 22:54:26 +0200 (CEST)
Subject: [Python-checkins] r47247 - sandbox/trunk/pdb/Doc/lib/libmpdb.tex
Message-ID: <20060705205426.41CC31E4003@bag.python.org>
Author: matt.fleming
Date: Wed Jul 5 22:54:24 2006
New Revision: 47247
Modified:
sandbox/trunk/pdb/Doc/lib/libmpdb.tex
Log:
Added more documentation.
Modified: sandbox/trunk/pdb/Doc/lib/libmpdb.tex
==============================================================================
--- sandbox/trunk/pdb/Doc/lib/libmpdb.tex (original)
+++ sandbox/trunk/pdb/Doc/lib/libmpdb.tex Wed Jul 5 22:54:24 2006
@@ -10,9 +10,6 @@
\sectionauthor{Matt Fleming}{mattjfleming at googlemail.com}
-
-% Leave at least one blank line after this, to simplify ad-hoc tools
-% that are sometimes used to massage these files.
\modulesynopsis{An Improved Python Debugger}
@@ -49,21 +46,976 @@
Exception raised when the debugger is going to immediately exit.
\end{excdesc}
-\section{MPdb Class}
-\label{mpdb-class}
+\section{Debugger commands}
+\label{mpdb-commands}
+
+This section describes the commands that are available to the debugger when
+it is run as a standalone program.
+
+Most commands can be abbreviated to one or two letters;
+e.g., \samp{h(elp)} means that either \samp{h} or \samp{help} can be
+used to enter the help command (but not \samp{he}, \samp{hel},
+\samp{H}, \samp{Help}, or \samp{HELP}). Arguments to commands must
+be separated by whitespace (spaces or tabs). Optional arguments are
+enclosed in square brackets (\samp{[]}) in the command syntax; the
+square brackets must not be typed. Alternatives in the command syntax
+are separated by a vertical bar (\samp{|}).
+
+Entering a blank line repeats the last command entered. There are two
+exceptions: if the last command was a \samp{list} command, the next 11
+lines are listed; and if the debugger has just connected to a pdbserver,
+the debugger exits.
+
+Commands that the debugger doesn't recognize are assumed to be Python
+statements and are executed in the context of the program being
+debugged. Python statements can also be prefixed with an exclamation
+point (\samp{!}). This may be a good way to inspect the program being
+debugged; it is even possible to change a variable or call a function.
+When an exception occurs in such a statement, the exception name is
+printed but the debugger's state is not changed.
+
+The debugger supports aliases. Aliases can have parameters which
+allow a certain level of adaptability to the context under
+examination. See \ref{command:aliases}.
+
+{\bf Debugger Prompt}:\label{debugger:prompt}\\
+By default the debugger's prompt string is \samp{(MPdb) } with
+a trailing blank. Recursive invocations using the
+\samp{debug} command strip off the trailing blanks, add a layer of
+parentheses around the string, and add a trailing blank. For example,
+for the default prompt the first debug invocation will be
+\samp{((MPdb)) }.
+
+\emph{There's currently a bug in the code where specified trailing
+blanks are chopped. Furthermore the prompt may change in the future to
+add a history number. It is generally not advisable to change the
+prompt.}
+
+If you do need to change the prompt see \ref{command:prompt}.
+
+Multiple commands may be entered on a single line, separated by
+\samp{;;}. (A single \samp{;} is not used because it is
+the separator for multiple commands in a line that is passed to
+the Python parser.)
+No intelligence is applied to separating the commands;
+the input is split at the first \samp{;;} pair, even if it is in
+the middle of a quoted string.
+
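+For example, the following line prints the value of a variable \code{n}
+and then single-steps:
+
+\begin{verbatim}
+(MPdb) p n ;; step
+\end{verbatim}
+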
+\subsection{Status and Debugger Settings ({\tt info}, {\tt set}, {\tt show})\label{subsection-status}}
+
+An {\tt info} command shows things about the program being debugged. A
+{\tt set} command modifies parts of the debugger environment. You can
+see these environment settings with the {\tt show} command.
+
+In all of the set options that take ``on'' or ``off'' parameters, you
+can also use 1 for ``on'' and 0 for ``off.''
+
+Each {\tt set} command has a corresponding {\tt show} command to show the current
+value. See \ref{subsubsection-show} for these counterparts.
+
+If a
+\ulink{\module{readline}}{http://docs.python.org/lib/module-readline.html}
+module is available, \code{mpdb} can keep track of the commands you
+type during your debugging sessions, so that you can be certain of
+precisely what happened. The \code{set history} commands manage
+the command history facility.
+
+POSIX-style line tracing is available and the \code{set linetrace}
+commands can be used to control that.
+
+You may want to save the output of \code{mpdb} commands to a file.
+See the \code{set logging} commands to control \code{mpdb}'s logging.
+
+\subsubsection{Info ({\tt info})\label{subsubsection-info}}
+
+Running this command without parameters will print the list of
+available info commands. Below is a description of the individual
+commands.
+
+\begin{description}
+
+\item[info args]
+
+Show function/method parameters. See \ref{command:info-args}.
+
+\item[info break]
+
+Show the list of breakpoints.
+
+\item[info globals]
+
+Show the global variables. See \ref{command:info-globals}.
+
+\item[info line]
+
+Show the current line number in the source file. If a function name is
+given, the starting line of the function is reported.
+
+\item[info locals]
+
+Show the local variables. See \ref{command:info-locals}.
+
+\item[info program]
+
+Show the execution status of the program. The possible statuses are that
+the program is not running (e.g. in a post-mortem dump) or that the program
+is ``stopped''; if it is stopped at a breakpoint, that is shown as well.
+
+\item[info source]
+
+Information about the current Python file.
+
+\item[info target]
+
+Show the target of the debugger, which will either be local or remote.
+
+\end{description}
+
+\subsubsection{Set ({\tt set})\label{subsubsection-set}}
+
+\begin{description}
+
+\item[set basename on\code{\Large{|}}off]\label{command:basename}
+
+When showing filenames print only the basename. This option is useful
+in regression testing where the base file names are the same
+on different installations even though the directory path may
+be different. You may want to use this in other
+situations as well, like showing a debugger session in a manual
+such as this one.
+
+\item[set cmdtrace on\code{\Large{|}}off]\label{command:cmdtrace}
+
+Show lines as they are read from the debugger command file (or
+\samp{source} debugger command). This is useful in running
+regression tests, but it may also be helpful in tracking down a problem in
+your \code{.mpdbrc} file.
+
+\item[set history filename \var{filename}]\label{command:hist-filename}
+
+Set the filename in which to record the command history
+(the list of previous commands of which a record is kept). The default
+file is \verb|~/.mpdbhist|.
+
+\item[set history save]
+
+Set saving of the history record on exit. Use ``on'' to enable the
+saving, and ``off'' to disable it. Without an argument, saving is
+enabled.
+
+\item[set history size]
+
+Set the size of the command history, i.e., the number of previous
+commands to keep a record of. The default is 256.
+
+\item[set linetrace on\code{\Large{|}}off]\label{command:linetrace}
+
+If this is set on, the position (file and line number) is shown before
+executing a statement. By default this is off. Using the command-line
+option \samp{--trace} when invoking \code{mpdb} implicitly sets this
+on. For information on \samp{--trace}, see \ref{switch:trace}.
+
+Using this option will slow down your
+program. Unless single stepping through a program, normally the
+debugger is called only at breakpoints or at the call and return of a
+function or method. However when line tracing is turned on, the
+debugger is called on execution of every statement.
+
+That said, execution may still be pretty fast. If you want to slow
+down execution further, see the following option.
+
+\item[set linetrace delay \var{time}]\label{command:linetrace-delay}
+
+One of the useful things you can do with this debugger if you run it
+via a front-end GUI is watch your program as it executes. To do this,
+use \samp{set linetrace on} which prints the location before each
+Python statement is run. Many front-end GUIs like the one in GNU Emacs
+and \code{ddd} will read the location and update the display
+accordingly.
+
+There is however one catch---Python runs too fast. So by using this
+option you can set a delay after each statement is run, in order for
+the GUI and your eyes to catch up with Python. Specify a floating-point
+number indicating the number of seconds to wait. For example:
+
+\begin{verbatim}
+set linetrace delay 0.5 # 1/2 a second
+\end{verbatim}
+
+In my experience half a second is about right.
+
+\item[set listsize \var{lines}]\label{command:listsize}
+
+Sets how many lines are shown by the \code{list} command. See
+\ref{command:list}.
+
+\item[set logging]\label{command:logging}
+
+Prints \code{set logging} usage.
+
+\item[set logging on\code{\Large{|}}off]\label{command:logging}
+
+Enable or disable logging.
+
+\item[set logging file \var{filename}]\label{command:log-file}
+
+Set \var{filename} as the file to which \code{mpdb} logging output is
+written. See also \code{show logging file}.
+
+\item[set logging overwrite on\code{\Large{|}}off]\label{command:log-overwrite}
+
+By default, \code{mpdb} will append to the logfile. Set
+\code{overwrite} if you want \code{set logging on} to overwrite the
+logfile instead.
+
+\item[set logging redirect on\code{\Large{|}}off]\label{command:log-redirect}
+
+By default, \code{mpdb} output will go to both the terminal and the
+logfile. Set \code{redirect} if you want output to go only to the log
+file.
+
+\item[set prompt \var{prompt-string}]\label{command:prompt}
+
+Set debugger's prompt string. By default it is \samp{(MPdb) } with
+a trailing space. For information on how the prompt
+changes, see \ref{debugger:prompt}.
+
+\emph{There's currently a bug in the code where specified trailing
+blanks are chopped. Furthermore the prompt may change in the future to
+add a history number. It is generally not advisable to change the
+prompt.}
+
+\end{description}
+
+\subsubsection{Show ({\tt show}) \label{subsubsection-show}}
+
+All of the ``show'' commands report some sort of status and all have a
+corresponding ``set'' command to change the value. See
+\ref{subsubsection-set} for the ``set'' counterparts.
+
+\begin{description}
+
+\item[show args]
+
+Show the argument list that was given to the program being debugged and
+that will be used when it is restarted.
+
+\item[show basename]
+
+Show whether only short (base) filenames are shown.
+
+\item[show cmdtrace]
+
+Show whether debugger command lines are displayed as they are read.
+
+\item[show commands]
+
+Show the history of commands you typed. You can supply a command
+number to start with, or a `+' to start after the previous command
+number shown. A negative number starts from the end.
+
+This command is available only if a
+\ulink{\module{readline}}{http://docs.python.org/lib/module-readline.html}
+module is available and supports the history saving.
+
+\item[show history]
+
+Generic command for showing command history parameters. The command
+history filename, whether history is saved on exit, and the size of the
+history file are shown.
+
+\item[show linetrace]
+
+Show the line tracing status.
+
+\item[show linetrace delay]
+
+Show the delay after tracing each line.
+
+\item[show listsize]
+
+Show the number of source lines \code{mpdb} will list by default.
+
+\item[show logging]
+
+Show summary information of logging variables which can be set via
+\code{set logging}.
+
+\item[show logging file]
+
+Show the current logging file.
+
+\item[show logging overwrite]
+
+Show whether logging overwrites or appends to the log file.
+
+\item[show prompt]
+
+Show the current debugger prompt.
+
+\item[show version]
+
+Show the debugger version number.
+
+\end{description}
+
+\subsection{Breakpoints ({\tt break}, {\tt tbreak}, {\tt clear},
+ {\tt commands}, {\tt delete}, {\tt disable}, {\tt
+ ignore})\label{subsubsection-brkpts}}
+
+\index{breakpoints} A breakpoint makes your program stop at a given
+point. You can add conditions for each breakpoint. You can set
+breakpoints with the \code{break} command and its variants. You can specify
+the place where your program should stop by file and line number or by
+function name.
+
+The debugger assigns a number to each breakpoint when you create it;
+these numbers are successive integers starting with 1. In many of the
+commands for controlling various features of breakpoints you use this
+number. Each breakpoint may be enabled or disabled; if disabled, it
+has no effect on your program until you enable it again.
+
+The debugger allows you to set any number of breakpoints at the same
+place in your program. There is nothing unusual about this because
+different breakpoints can have different conditions associated with them.
+
+\begin{description}
+\item[b(reak) \optional{\optional{\var{filename}:}\var{lineno}\code{\Large{|}}\var{function}\optional{, \var{condition}}}]
+
+With a \var{lineno} argument, set a break at that line number in the
+current file. With a \var{function} argument, set a break at the
+first executable statement within that function. The line number may
+be prefixed with a filename and a colon to specify a breakpoint in
+another file (probably one that hasn't been loaded yet). The file is
+searched on \code{sys.path}. Note that each breakpoint is assigned a
+number to which all the other breakpoint commands refer.
+
+If a second argument is present, it is an expression which must
+evaluate to true before the breakpoint is honored.
+
+Without an argument, list all breaks, including for each breakpoint:
+the number of times that breakpoint has been hit, the current
+ignore count, and the associated condition if any.
+
+\item[tbreak \optional{\optional{\var{filename}:}\var{lineno}\code{\Large{|}}\var{function}\optional{, \var{condition}}}]
+
+Temporary breakpoint, which is removed automatically when it is
+first hit. The arguments are the same as those for \code{break}.
+
+\item[cl(ear) \optional{\optional{\var{filename}:}\var{lineno}\code{\Large{|}}\var{function}}]
+
+Clear breakpoint at specified line or function. Argument may be line
+number, function name, or `*' and an address. If a line number is
+specified, all breakpoints in that line are cleared. If a function is
+specified, the breakpoints at the beginning of the function are cleared. If an
+address is specified, breakpoints at that address are cleared.
+
+With no argument, clears all breakpoints in the line where the selected
+frame is executing.
+
+See also the \code{delete} command below which clears breakpoints by
+number.
+
+\item[commands \optional{\optional{\var{bpnumber}}}]
+
+Set commands to be executed when a breakpoint is hit.
+Give breakpoint number as the argument after "commands".
+With no bpnumber argument, commands refers to the last one set.
+The commands themselves follow starting on the next line.
+Type a line containing "end" to terminate the commands.
+
+To remove all commands from a breakpoint, type commands and
+follow it immediately with \code{end}; that is, give no commands.
+
+Specifying any command resuming execution (currently \code{continue},
+\code{step}, \code{next}, \code{return}, \code{jump}, and \code{quit})
+terminates the command list as if that command was immediately
+followed by \code{end}. This is because any time you resume execution
+(even with a simple next or step), you may encounter another
+breakpoint---which could have its own command list, leading to
+ambiguities about which list to execute.
+
+If you use the \code{silent} command in the command list, the
+usual message about stopping at a breakpoint is not printed.
+This may be desirable for breakpoints that are to print a
+specific message and then continue. If none of the other
+commands print anything, you see no sign that the breakpoint
+was reached.
+
+\item[delete \optional{\var{bpnumber} \optional{\var{bpnumber \ldots}}}]
+
+With a space-separated list of breakpoint numbers, clear those
+breakpoints. Without argument, clear all breaks (but first
+ask confirmation).
+
+\item[disable \optional{\var{bpnumber} \optional{\var{bpnumber \ldots}}}]
+
+Disable the breakpoints given as a space-separated list of
+breakpoint numbers. Disabling a breakpoint means it cannot cause
+the program to stop execution, but unlike clearing a breakpoint, it
+remains in the list of breakpoints and can be (re-)enabled.
+
+\item[enable \optional{\var{bpnumber} \optional{\var{bpnumber \ldots}}}]
+
+Enable the breakpoints specified.
+
+\item[ignore \var{bpnumber} \optional{\var{count}}]
+
+Set the ignore count for the given breakpoint number. If count is
+omitted, the ignore count is set to 0. A breakpoint becomes active
+when the ignore count is zero. When non-zero, the count is
+decremented each time the breakpoint is reached, the breakpoint is not
+disabled, and any associated condition evaluates to true.
+
+\item[condition \var{bpnumber} \optional{\var{condition}}]
+
+Condition is an expression which must evaluate to true before
+the breakpoint is honored. If condition is absent, any existing
+condition is removed; i.e., the breakpoint is made unconditional.
+
+\end{description}
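+
+As an illustrative example (the session and its output are only a sketch),
+the following sets a conditional breakpoint on the \code{hanoi} function
+used elsewhere in this chapter and then lists all breakpoints:
+
+\begin{verbatim}
+(MPdb) break hanoi, n > 2
+(MPdb) info break
+\end{verbatim}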
+
+\subsection{Resuming Execution ({\tt step}, {\tt next}, {\tt
+ finish}, {\tt return}, {\tt continue}, {\tt jump})\label{subsubsection-resume}}
+
+``Continuing'' means resuming program execution until the program
+completes normally. In contrast, ``stepping'' means executing just
+one statement of the program. When continuing or stepping, the program may stop even sooner, due to a breakpoint or an
+exception.
+
+\begin{description}
+
+\item[s(tep) \optional{\var{count}}]\label{command:step}
+
+Execute the current line, stop at the first possible occasion
+(either in a function that is called or on the next line in the
+current function).
+
+\item[n(ext) \optional{\var{count}}]\label{command:next}
+
+Continue execution until the next line in the current function
+is reached or the function returns. The difference between \samp{next} and
+\samp{step} is that \samp{step} stops inside a called function, while
+\samp{next} executes called functions at (nearly) full speed,
+stopping only at the next line in the current function.
+
+\item[finish]\label{command:finish}
+
+Continue execution until the current function returns. At
+that point the \samp{retval} command can be used to show the
+return value; its short command name is \samp{rv}.
+
+See also \ref{command:retval} and \ref{command:return}.
+
+\item[return]\label{command:return}
+
+Make selected stack frame return to its caller. Control remains in the
+debugger, but when you continue execution will resume at the return
+statement found inside the subroutine or method. At present we are
+only able to perform this if we are in a subroutine that has a
+\code{return} statement in it. See also \ref{command:retval} and
+\ref{command:finish}
-MPdb objects have the following methods:
+\item[c(ontinue)]
-\begin{methoddesc}[mpdb]{remote_onecmd}{self, line}
+Continue execution, stop only when a breakpoint is encountered.
+
+\item[jump \var{lineno}]
+
+Set the next line that will be executed; this is available only in the
+bottom-most frame. It lets you jump back and execute code
+again, or jump forward to skip code that you don't want to run.
+
+Not all jumps are allowed---for instance it
+is not possible to jump into the middle of a \keyword{for} loop or out
+of a \keyword{finally} clause.
+
+One common use for the \code{jump} command is to get out of a
+loop. Sometimes the bounds of loops are computed in advance, so you
+can't leave a loop early by, say, setting the value of the loop variable.
+
+Here's an example demonstrating this:
+
+\begin{verbatim}
+mpdb ptest.py
+(ptest.py:2):
+(MPdb) list
+ 1 #!/bin/python
+ 2 -> for i in range(1,10):
+ 3 print i
+ 4 print "tired of this"
+[EOF]
+(MPdb) step
+(ptest.py:3):
+(MPdb) i=1000
+(MPdb) step
+1000
+(ptest.py:2):
+(MPdb) jump 4
+(ptest.py:4):
+(MPdb) step
+tired of this
+--Return--
+--Return--
+The program finished and will be restarted
+(ptest.py:2):
+(MPdb)
+\end{verbatim}
+
+Note that the assignment of 1,000 to \code{i} took effect, although it
+had no effect on terminating the \code{for} loop; \code{jump} was
+needed to get out of the loop early.
+
+\end{description}
+
+\subsection{Examining Call Frames ({\tt info args}, {\tt info
+ locals}, {\tt down}, {\tt frame}, {\tt up})\label{subsection-frames}}
+
+Each line in the backtrace shows the frame number, the function
+name (if one exists), and the place in the file where the statement is
+located.
+
+Here is an example of a backtrace from a sample Towers of Hanoi
+program that is used in regression testing:
+
+\begin{verbatim}
+## 0 hanoi() called from file '/tmp/pydb/test/hanoi.py' at line 5
+-> 1 hanoi() called from file '/tmp/pydb/test/hanoi.py' at line 6
+## 2 in file '/tmp/pydb/test/hanoi.py' at line 29
+## 3 in file '' at line 1
+## 4 run() called from file '/usr/lib/python2.4/bdb.py' at line 366
+\end{verbatim}
+
+The {\tt->} arrow indicates the focus. In the example, I issued an
+\samp{up} command, which is why the focus is on 1 rather than 0 as it
+would normally be after a stop.
+
+There are two ``hanoi'' frames listed because hanoi called
+itself recursively. In frames 2 and 3 we don't have a function name
+listed. That's because there is none. Furthermore, in frame 3 there is
+a funny ``in file {\tt ''} at line 1.'' That's because there
+isn't even a file associated with the command. The command issued was:
+
+\begin{verbatim}
+ exec cmd in globals, locals
+\end{verbatim}
+
+This statement can be seen in frame 4. This is a bug which Rocky hopes to
+fix in \module{Pydb} with a more informative message.
+
+Finally, note that frames 2 and 3 really are not part of the
+program to be debugged but are part of the internal workings of the
+debugger. It's possible to hide this, but in the open spirit of Python
+for now it hasn't been hidden.
+
+\begin{description}
+
+\item[info args]\label{command:info-args}
+
+Show the method or function parameters and their values.
+
+Here is an example of the output for the backtrace of the hanoi
+program shown at the beginning of this section:
+
+\begin{verbatim}
+(MPdb) info args
+n= 3
+a= a
+b= b
+c= c
+(MPdb)
+\end{verbatim}
+
+\item[info locals]\label{command:info-locals}
+
+Show all local variables for the given stack frame. This will include
+the variables that would be shown by \samp{info args}.
+
+\item[where\code{\Large{|}}T\code{\Large{|}}bt \optional{\var{count}}]
+
+Print a backtrace, with the most recent frame at the top. An
+arrow indicates the current frame, which determines the context of
+most commands.
+
+With a positive number \var{count}, print at most that many entries.
+
+An example of a backtrace is given at the beginning of this section.
+
+\item[retval\code{\Large{|}}rv]\label{command:retval}
+
+Show the value that will be returned by the current function. This
+command is meaningful only just before a return (such as you'd get
+using the \code{finish} or \code{return} commands) or when stepping after a
+return statement.
+
+To change the value, make an assignment to the variable
+\code{__return__}.
+
+See also \ref{command:finish}.
+
+\item[down \optional{\var{count}}]
+
+Move the current frame one level down in the stack trace
+(to a newer frame). With a count, which can be positive
+or negative, move that many positions.
+
+\item[up \optional{\var{count}}]
+
+Move the current frame one level up in the stack trace (to an older
+frame). With a count, which can be positive or negative,
+move that many positions.
+
+\item[frame \optional{\var{position}}]
+
+Move the current frame to the specified frame number. A negative
+number indicates position from the end, so \code{frame -1} moves to
+the newest frame, and \code{frame 0} moves to the oldest frame.
+
+\end{description}
+
+\subsection{Examining Data ({\tt print}, {\tt pprint}, {\tt examine}, {\tt info globals})\label{subsection-data}}
+
+\begin{description}
+
+\item[display \optional{\var{format}} \var{expression}]
+
+Print the value of \var{expression} each time the program
+stops. \var{format} may be used before \var{expression} as in the
+"print" command. \var{format} "i" or "s" or including a size-letter
+is allowed, and then \var{expression} is used to get the address to
+examine.
+
+With no argument, display all currently requested auto-display
+expressions. Use "undisplay" to cancel display requests previously
+made.
+
+\item[undisplay \optional{\var{format}} \var{expression}]
+
+Cancel an auto-display request previously made with the \samp{display}
+command for \var{expression}.
+
+
+\item[p \var{expression}]
+
+Evaluate the \var{expression} in the current context and print its
+value. One can also often have an expression printed by just typing
+the expression: if the first token doesn't conflict with a debugger
+built-in command, Python will, by default, print the result just as if
+you had typed it inside a Python interpreter shell. To make things even
+more confusing, a special case of running an arbitrary Python command
+is the \samp{print} command. Note, however, that the debugger command is
+just \samp{p}.
+
+So what's the difference? The debugger's \samp{p} command encloses
+everything in a \samp{repr()}, to ensure the resulting output is not
+too long. \emph{\note{Should add info as to how to customize what
+``too long'' means}}. So if you want abbreviated output, or are not
+sure if the expression may have an arbitrarily long (or infinite)
+representation, then use \samp{p}. If you want the output as Python
+would print it, just give the expression or possibly use Python's
+\samp{print} command.
+
+\item[pp \var{expression}]
+
+Like the \samp{p} command, except the value of the expression is
+pretty-printed using the \module{pprint} module.
+
+\item[examine \var{expression}]
+
+Print the type of the expression and pretty-print its value. For
+functions, methods, classes, and modules, the documentation string, if
+any, is printed. For functions, the argument list is also shown.
+
+The examine debugger command in Perl is the model here; however, much
+more work is needed. Note that \samp{x} is not a short name for
+``expression'' (as it is in Perl's debugger), although you could
+easily make it one via an alias.
+
+\item[info globals]\label{command:info-globals}
+
+Show all global variables. These are not just the variables
+that a program sees via a \code{global} statement, but all of the
+globals that are accessible.
+
+\end{description}
+
+\subsection{Running Arbitrary Python Commands ({\tt debug}, {\tt !})\label{subsection-commands}}
+
+\begin{description}
+
+\item[\optional{!}\var{statement}]
+
+Execute the (one-line) \var{statement} in the context of
+the current stack frame.
+The exclamation point can be omitted unless the first word
+of the statement resembles a debugger command.
+To set a global variable, you can prefix the assignment
+command with a \samp{global} command on the same line, e.g.:
+
+\begin{verbatim}
+(MPdb) global list_options; list_options = ['-l']
+(MPdb)
+\end{verbatim}
+
+\item[debug \var{statement}]
+
+Enter a recursive debugger that steps through the code argument (which
+is an arbitrary expression or statement to be executed in the current
+environment). The prompt is changed to indicate nested behavior. See
+\ref{debugger:prompt}.
+
+\end{description}
+
+\subsection{Restarting a Python Script ({\tt restart}, {\tt run})\label{subsection-restart}}
+
+\begin{description}
+
+\item[restart \var{args...}]\label{command:restart}
+
+Restart the debugger and program via an \code{exec} call. All state
+is lost, and a new copy of the debugger is used.\footnote{The \code{restart}
+command may not function properly on the Windows platform because of
+the way that Windows handles \code{execvp} calls.}
+
+Sometimes in debugging it is necessary to modify module code when one
+finds a bug in it. Python will not notice dynamically that a module
+has changed and thus will not reimport it (which also means that module
+initialization code is not rerun either). So in such a situation one
+must use \code{restart} rather than \code{run}.\footnote{It may be
+possible to unimport a module by removing it from a namespace, but if
+there are shared dynamically loaded objects, those don't get unloaded.}
+
+\item[run \var{args...}]
+
+Run or ``soft'' restart the debugged Python program. If a string is
+supplied, it becomes the new command arguments. History,
+breakpoints, actions and debugger options are preserved. \code{R} is
+a short command alias for \code{run}.
+
+You may notice that sometimes you can \code{step} into modules
+included via an \code{import} statement, but after a \code{run} this
+stepping skips over the import rather than going into it. A similar
+situation is that you may have a breakpoint set inside a class's
+\code{__init__} code, but after issuing \code{run} this doesn't seem
+to get called---and in fact it isn't run again!
+
+That's because in Python the \code{import} occurs only once. In fact,
+if the module was imported \emph{before\/} invoking the program, you
+might not be able to step inside an \code{import} the first time as
+well.
+
+In such a situation or other situations where \code{run} doesn't seem
+to have the effect of getting module initialization code executed,
+you might try using \code{restart} rather than \code{run}.
+
+\end{description}
+
+\subsection{Interfacing to the OS ({\tt cd}, {\tt pwd},
+ {\tt shell})\label{subsection-os}}
+
+\begin{description}
+
+\item[cd \var{directory}]\label{command:cd}
+
+Set the working directory to \var{directory} for the debugger and the
+program being debugged.
+
+\item[pwd]\label{command:pwd}
+
+Print working directory.
+
+\item[shell \var{statement}]\label{command:shell}
+
+Execute the rest of the line as a shell command.
+
+
+\end{description}
+
+\subsection{Listing Program Code ({\tt list}, {\tt disassemble})\label{subsection-listing}}
+
+\begin{description}
+
+\item[disassemble \optional{\var{arg}}]
+
+With no argument, disassemble at the current frame location. With a
+numeric argument, disassemble at the frame location at that line
+number. With a class, method, function, code or string argument,
+disassemble that.
+
+\item[l(ist) \optional{- \Large{|} \var{first}\optional{, \var{last}}}]\label{command:list}
+
+List source code. Without arguments, list
+\emph{n} lines centered around the current line or continue the previous
+listing, where \emph{n} is the value set by \samp{set listsize} or
+shown by \samp{show listsize}. The default value is 10.
+
+\samp{list -} lists \emph{n} lines before a previous listing. With one
+argument other than `-', list \emph{n} lines centered around the
+specified position. With two arguments, list the given range; if the
+second argument is less than the first, it is a count. \var{first} and
+\var{last} can be either a function name, a line number, or
+\var{filename}:\var{line-number}.
+
+\end{description}
+
+\subsection{Interfacing to the debugger ({\tt alias}, {\tt complete},
+{\tt help}, {\tt quit}, {\tt source}, {\tt unalias})\label{subsection-misc}}
+
+\begin{description}
+
+\item[alias \optional{\var{name} \optional{command}}]\label{command:aliases}
+
+Create an alias called \var{name} that executes \var{command}. The
+command must not be enclosed in quotes. Replaceable parameters
+can be indicated by \samp{\%1}, \samp{\%2}, and so on, while \samp{\%*} is
+replaced by all the parameters. If no command is given, the current
+alias for \var{name} is shown. If no arguments are given, all
+aliases are listed.
+
+Aliases may be nested and can contain anything that can be legally
+typed at the \code{mpdb} prompt. Note that internal \code{mpdb}
+commands can be overridden by aliases. Such a command is then hidden
+until the alias is removed. Aliasing is recursively applied to the
+first word of the command line; all other words in the line are left
+alone.
+
+As an example, here are two useful aliases (especially when placed
+in the \code{.mpdbrc} file):
+
+\begin{verbatim}
+#Print instance variables (usage "pi classInst")
+alias pi for k in %1.__dict__.keys(): print "%1.",k,"=",%1.__dict__[k]
+#Print instance variables in self
+alias ps pi self
+\end{verbatim}
+
+\item[complete \var{command-prefix}]\label{command:complete}
+
+If
+\ulink{\module{readline}}{http://docs.python.org/lib/module-readline.html}
+or one of the readline-compatible interfaces such as
+\ulink{\module{pyreadline}}{http://projects.scipy.org/ipython/ipython/wiki/PyReadline/Intro}
+is available on your OS, the \code{complete} command will print a
+list of command names that start with \var{command-prefix}.
+
+\code{complete} will also work on \code{info}, \code{set}, and
+\code{show} sub-commands.
+
+In addition the command-completion key (usually the tab key) can be
+used to complete command names, or \code{info}, \code{set}, and
+\code{show} subcommands.
+
+\item[h(elp) \optional{\var{command} \optional{subcommand}}]
+
+Without argument, print the list of available commands. With
+\var{command} as argument, print help about that command. \samp{help
+mpdb} displays the full documentation file; if the environment
+variable \envvar{PAGER} is defined, the file is piped through that
+command. Since the \var{command} argument must be an identifier,
+\samp{help exec} must be entered to get help on the \samp{!} command.
+
+Some commands (\code{info}, \code{set}, and \code{show}) can accept an
+additional subcommand to give help just about that particular
+subcommand. For example, \code{help info line} gives help about the
+\code{info line} command.
+
+\item[q(uit)]\label{command:quit}
+
+Quit the debugger. The program being executed is aborted. For now,
+\code{kill} is a synonym for \code{quit}.
+
+\item[source \var{filename}]\label{command:source}
+
+Read commands from a file named \var{filename}.
+Note that the file \code{.mpdbrc} is read automatically
+this way when \code{mpdb} is started.
+
+An error in any command terminates execution of the command file, and
+control is returned to the console.
+
+For tracking down problems with command files, see the \samp{set
+cmdtrace on} debugger command, \ref{command:cmdtrace}.
+
+\item[unalias \var{name}]\label{command:unalias}
+
+Delete the specified alias.
+
+\end{description}
+
+
+\section{MPdb Objects}
+\label{mpdb-objects}
+
+MPdb objects have the following methods. These methods should usually never
+be called directly; they are documented here to serve as a programmer's
+reference to the inner workings of mpdb. Refer to \ref{module-mpdb} for
+routines that are considered ``safe'' to be called from user code.
+
+\begin{methoddesc}[mpdb]{remote_onecmd}{line}
This method is used by an \class{MPdb} instance that is connected
-to a remote machine. Instead of the debugger instance interpreting commands,
-all commands are sent directly to the remote machine where they are
-interpreted, executed and the results are sent back to the client.
+to a remote machine. It replaces the \code{onecmd} method inherited from the
+\class{Cmd} class in the \module{cmd} module. \var{line} contains the
+command to be executed on the remote machine. Instead of the debugger
+instance interpreting commands, all commands are sent directly to the
+remote machine, where they are interpreted and executed, and the results are
+sent back to the client. The client then writes the results to the \class{MPdb}
+object's stdout file object.
\end{methoddesc}
-\begin{methoddesc}[mpdb]{do_pdbserver}{self, addr}
+\begin{methoddesc}[mpdb]{do_pdbserver}{addr}
+Set up a pdbserver at \var{addr} and wait for incoming connections. \var{addr}
+must be a string containing a protocol to use and a protocol-specific address.
+Refer to \ref{module-mpdb} for more information.
+\end{methoddesc}
+
+\begin{methoddesc}[mpdb]{do_target}{args}
+This method is called to connect to a remote pdbserver. \var{args} should
+contain a protocol and an address to connect to.
+\end{methoddesc}
+
+\section{MTracer Objects}
+\label{mtracer-objects}
+
+MTracer objects are used to trace stack frames in threads. They
+have the following methods:
+
+\begin{methoddesc}[mtracer]{trace_dispatch}{frame, event, arg}
+This method is the global trace function for this thread. It delegates
+work to other methods according to the value of \var{event}, which may be
+\code{'line'}, \code{'call'}, \code{'exception'}, \code{'return'},
+\code{'c_call'}, \code{'c_return'} or \code{'c_exception'}.
+\end{methoddesc}
+
+\begin{methoddesc}[mtracer]{dispatch_line}{frame}
+This method is called when the event of the current frame object is
+\code{'line'}. This method checks to see whether we have been instructed
+to stop at this frame or if we have a breakpoint set at this frame. If
+either of these conditions are true, control is passed to the main
+debugger via it's \function{user_line()} method.
+\footnote{The main debugger is the \class{MPdb} instance. It is
+the debugger that is tracing the \class{_MainThread} object (the Python
+Interpreter).}
+\end{methoddesc}
+
+\begin{methoddesc}[mtracer]{dispatch_call}{frame, arg}
+This method is called when the event of the current frame object is
+\code{'call'}. If the debugger has not been instructed to stop or break
+inside this function, this method returns and the function is not traced.
+Otherwise the main debugger's \function{user_call()} method is called,
+passing \var{frame} and \var{arg} as arguments.
+\end{methoddesc}
+
+\begin{methoddesc}[mtracer]{dispatch_return}{frame, arg}
+This method is called when the event of the current frame object is
+\code{'return'}. If the debugger has been instructed to stop at this
+frame the main debugger's \function{user_return()} method is called.
+\end{methoddesc}
+
+\begin{methoddesc}[mtracer]{dispatch_exception}{frame, arg}
+This method is called when the event of the current frame object is
+\code{'exception'}. If the debugger has been instructed to stop here
+the main debugger's \function{user_exception()} method is called.
\end{methoddesc}
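+
+As a rough, self-contained sketch (this is illustrative Python, not the
+actual \module{mpdb} implementation), a tracer of this shape dispatches on
+the event name and returns the trace function in order to keep tracing:
+
+\begin{verbatim}
+import sys
+
+class Tracer(object):
+    """Schematic stand-in for MTracer; prints instead of calling MPdb."""
+
+    def trace_dispatch(self, frame, event, arg):
+        if event == 'line':
+            return self.dispatch_line(frame)
+        if event == 'call':
+            return self.dispatch_call(frame, arg)
+        if event == 'return':
+            return self.dispatch_return(frame, arg)
+        if event == 'exception':
+            return self.dispatch_exception(frame, arg)
+        return self.trace_dispatch     # ignore c_call/c_return/c_exception
+
+    def dispatch_line(self, frame):
+        print "line", frame.f_lineno
+        return self.trace_dispatch
+
+    def dispatch_call(self, frame, arg):
+        print "call", frame.f_code.co_name
+        return self.trace_dispatch
+
+    def dispatch_return(self, frame, arg):
+        print "return", arg
+        return self.trace_dispatch
+
+    def dispatch_exception(self, frame, arg):
+        print "exception", arg[0]
+        return self.trace_dispatch
+
+def demo():
+    return 40 + 2
+
+tracer = Tracer()
+sys.settrace(tracer.trace_dispatch)
+demo()
+sys.settrace(None)
+\end{verbatim}
+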
\section{Thread Debugging}
@@ -71,6 +1023,41 @@
This section provides information on Python's thread debugging facilities and
how \module{mpdb} makes use of them.
+% How does Python facilitate debugging threads
+\subsection{Python's Thread Debugging Features}
+
+This section aims to provide the reader with information on how Python
+allows threads to be debugged.
+
+There are two modules in the Python Standard Library that provide thread
+functionality, \ulink{\module{thread}}{http://docs.python.org/lib/module-thread.html} and \ulink{\module{threading}}{http://docs.python.org/lib/module-threading.html}. Threads created with the \module{threading} module can be traced by
+calling \code{threading.settrace()}.
+
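+As a minimal sketch (again illustrative, not \module{mpdb} code), a trace
+function installed with \code{threading.settrace()} is applied to every
+thread subsequently started through the \module{threading} module:
+
+\begin{verbatim}
+import threading
+
+def trace(frame, event, arg):
+    # called for 'call', 'line', 'return' and 'exception' events
+    print event, frame.f_code.co_filename, frame.f_lineno
+    return trace               # keep tracing inside this frame
+
+threading.settrace(trace)      # affects threads started after this call
+
+def worker():
+    x = 1 + 1
+
+t = threading.Thread(target=worker)
+t.start()
+t.join()
+\end{verbatim}
+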
+\subsection{\module{mpdb}'s Thread Debugging Facilities}
+
+\module{mpdb} does not have thread debugging on by default. It can be turned
+on whilst inside the debugger by issuing the command \code{set thread}.
+
+\emph{All the information below can be considered the internals of how mpdb
+works. In other words, it is not needed by someone simply wishing to use
+the debugger for thread debugging.}
+
+When thread debugging is activated in \module{mpdb}, it imports the
+\module{mthread} module and calls that module's \function{init()} function,
+passing as an argument the instance of \class{MPdb} that is currently
+being used. This debugger argument is stored as a global variable in the
+\module{mthread} module. Whenever a thread is created, a new \class{MTracer}
+object is instantiated and the trace function for the thread is set to this
+object's \function{trace_dispatch} function. Refer to \ulink{Section 9.2 - How It Works}{http://docs.python.org/lib/debugger-hooks.html} of the \module{Pdb}
+documentation in the Python Standard Library for more information on how
+debuggers interact with trace functions.
+
+An \class{MTracer} object provides dispatch methods for dealing with the
+different possible events that occur in a frame object.
+
+
+
+
\section{Remote Debugging}
This section describes how \module{mpdb} handles debugging remotely.
\label{remote-debug}
From python-checkins at python.org Wed Jul 5 23:19:27 2006
From: python-checkins at python.org (brett.cannon)
Date: Wed, 5 Jul 2006 23:19:27 +0200 (CEST)
Subject: [Python-checkins] r47248 - python/branches/bcannon-sandboxing
Message-ID: <20060705211927.E714C1E4003@bag.python.org>
Author: brett.cannon
Date: Wed Jul 5 23:19:27 2006
New Revision: 47248
Added:
python/branches/bcannon-sandboxing/
- copied from r47247, python/trunk/
Log:
Initial copy of HEAD for work on adding new sandboxing to the interpreter.
From python-checkins at python.org Thu Jul 6 00:09:00 2006
From: python-checkins at python.org (brett.cannon)
Date: Thu, 6 Jul 2006 00:09:00 +0200 (CEST)
Subject: [Python-checkins] r47249 -
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Message-ID: <20060705220900.6BF8A1E4003@bag.python.org>
Author: brett.cannon
Date: Thu Jul 6 00:08:59 2006
New Revision: 47249
Added:
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt (contents, props changed)
Log:
Add initial draft of design doc (same as one initially sent to python-dev).
Added: python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
==============================================================================
--- (empty file)
+++ python/branches/bcannon-sandboxing/sandboxing_design_doc.txt Thu Jul 6 00:08:59 2006
@@ -0,0 +1,998 @@
+Restricted Execution for Python
+#######################################
+
+About This Document
+=============================
+
+This document is meant to lay out the general design for re-introducing a
+restricted execution model for Python. This document should provide one with
+enough information to understand the goals for restricted execution, what
+considerations were made for the design, and the actual design itself. Design
+decisions should be clear and explain not only why they were chosen but also
+the possible drawbacks of taking that approach.
+
+
+Goal
+=============================
+
+A good restricted execution model provides enough protection to prevent
+malicious harm to come to the system, and no more. Barriers should be
+minimized so as to allow most code that does not do anything that would be
+regarded as harmful to run unmodified.
+
+An important point to take into consideration when reading this document is to
+realize it is part of my (Brett Cannon's) Ph.D. dissertation. This means it is
+heavily geared toward the restricted execution when the interpreter is working
+with Python code embedded in a web page. While great strides have been taken
+to keep the design general enough so as to allow all previous uses of the
+'rexec' module [#rexec]_ to be able to use the new design, it is not the
+focused goal. This means if a design decision must be made for the embedded
+use case compared to sandboxing Python code in a Python application, the former
+will win out.
+
+Throughout this document, the term "resource" is used to represent anything that
+deserves possible protection. This ranges from things that have a physical
+representation (e.g., memory) to things that are more abstract and specific to
+the interpreter (e.g., sys.path).
+
+When referring to the state of an interpreter, it is either "trusted" or
+"untrusted". A trusted interpreter has no restrictions imposed upon any
+resource. An untrusted interpreter has at least one, possibly more, resource
+with a restriction placed upon it.
+
+
+.. contents::
+
+
+Use Cases
+/////////////////////////////
+
+All use cases are based on how many untrusted or trusted interpreters are
+running in a single process.
+
+
+When the Interpreter Is Embedded
+================================
+
+Single Untrusted Interpreter
+----------------------------
+
+This use case is when an application embeds the interpreter and never has more
+than one interpreter running.
+
+The main security issue to watch out for is not having default abilities be
+provided to the interpreter by accident. There must also be protection from
+leaking resources that the interpreter needs for general use underneath the
+covers into the untrusted interpreter.
+
+
+Multiple Untrusted Interpreters
+-------------------------------
+
+When multiple interpreters, all untrusted at varying levels, need to be running
+within a single application. This is the key use case that this proposed
+design is targeted for.
+
+On top of the security issues from a single untrusted interpreter, there is one
+additional worry: resources must not end up being leaked into other interpreters
+where they are given escalated rights.
+
+
+Stand-Alone Python
+==================
+
+When someone has written a Python program that wants to execute Python code in
+an untrusted interpreter(s). This is the use case that 'rexec' attempted to
+fulfill.
+
+The added security issue for this use case (on top of the ones for the other
+use cases) is preventing something from the trusted interpreter leaking into an
+untrusted interpreter and having elevated permissions. With the multiple
+untrusted interpreters one did not have to worry about preventing actions from
+occurring that are disallowed for all untrusted interpreters. With this use
+case you do have to worry about the binary distinction between trusted and
+untrusted interpreters running in the same process.
+
+
+Resources to Protect
+/////////////////////////////
+
+XXX Threading?
+XXX CPU?
+
+Filesystem
+===================
+
+The most obvious facet of a filesystem to protect is reading from it. One does
+not want what is stored in ``/etc/passwd`` to get out. And one also does not
+want writing to the disk unless explicitly allowed for basically the same
+reason; if someone can write to ``/etc/passwd`` then they can set the password for
+the root account.
+
+But one must also protect information about the filesystem. This includes both
+the filesystem layout and permissions on files. This means pathnames need to
+be properly hidden from an untrusted interpreter.
+
+
+Physical Resources
+===================
+
+Memory should be protected. It is a limited resource on the system that can
+have an impact on other running programs if it is exhausted. Being able to
+restrict the use of memory would help alleviate issues from denial-of-service
+(DoS) attacks.
+
+
+Networking
+===================
+
+Networking is somewhat like the filesystem in terms of wanting similar
+protections. You do not want to let untrusted code make tons of socket
+connections or accept them to do possibly nefarious things (e.g., acting as a
+zombie).
+
+You also want to prevent finding out information about the network you are
+connected to. This includes doing DNS resolution since that allows one to find
+out what addresses your intranet has or what subnets you use.
+
+
+Interpreter
+===================
+
+One must make sure that the interpreter is not harmed in any way. There are
+several ways to possibly do this. One is generating hostile bytecode. Another
+is some buffer overflow. In general any ability to crash the interpreter is
+unacceptable.
+
+There is also the issue of taking it over. If one is able to gain control of
+the overall process through the interpreter then heightened abilities could be
+gained.
+
+
+Types of Security
+///////////////////////////////////////
+
+As with most things, there are multiple approaches one can take to tackle a
+problem. Security is no exception. In general there seem to be two approaches
+to protecting resources.
+
+
+Resource Hiding
+=============================
+
+By never giving code a chance to access a resource, you prevent it from being
+(ab)used. This is the idea behind resource hiding. This can help minimize
+security checks by only checking if someone should be given a resource. By
+having possession of a resource be what determines if one should be allowed to
+use it, you minimize the checks to the moment a resource is handed out.
+
+This can be viewed as a passive system for security. Once a resource has been
+given to code there are no more checks to make sure the security model is being
+violated.
+
+The most common implementation of resource hiding is capabilities. In this
+type of system a resource's reference acts as a ticket that represents the right
+to use the resource. Once code has a reference it is considered to have full
+use of that resource it represents and no further security checks are
+performed.
+
+To allow customizable restrictions one can pass references to wrappers of
+resources. This allows one to provide custom security to resources instead of
+requiring an all-or-nothing approach.
+
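+As a rough illustration (this is not part of the proposed design, and the
+names are made up), such a wrapper simply exposes only the operations one
+wants to grant; possession of the wrapper is the only check::
+
+    class ReadOnlyFile(object):
+        """Hand this out instead of the real file object."""
+
+        def __init__(self, path):
+            self._file = open(path, 'r')   # the real resource stays private
+
+        def read(self, size=-1):
+            return self._file.read(size)
+
+        def close(self):
+            self._file.close()
+
+    # Untrusted code only ever receives the wrapper, never the real file,
+    # so it cannot write or reopen the file:
+    # cap = ReadOnlyFile('some_data.txt')
+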
+The problem with capabilities is that it requires a way to control access to
+references. In languages such as Java that use a capability-based security
+system, namespaces provide the protection. By having private attributes and
+compartmentalized namespaces, references cannot be reached without explicit
+permission.
+
+For instance, Java has a ClassLoader class that one can call to have it return a
+desired reference. The class does a security check to make sure the
+code should be allowed to access the resource, and then returns a reference as
+appropriate. And with private attributes in objects and packages not providing
+global attributes you can effectively hide references to prevent security
+breaches.
+
+To use an analogy, imagine you are providing security for your home. With
+capabilities, security comes from not having any way to know where your house is
+without being told where it is, that is, without a reference to its location.
+You might be able to ask a guard (e.g., Java's ClassLoader) for a map, but if
+they refuse there is no way for you to guess its location without being told.
+But once you know where it is, you have complete use of the house.
+
+And that complete access is an issue with a capability system. If someone
+plays a little loose with a reference for a resource then you run the risk of
+it getting out. Once a reference leaves your hands it becomes difficult to
+revoke the right to use that resource. A capability system can be designed to
+do a check every time a reference is handed to a new object, but that can be
+difficult to do properly when grafting a new way to handle resources onto an
+existing system such as Python, since the check is no longer only at the point
+of requesting a reference but also at plain assignment time.
+
+
+Resource Crippling
+=============================
+
+Another approach to security is to provide constant, proactive security
+checking of rights to use a resource. One can have a resource perform a
+security check every time someone tries to use a method on that resource. This
+pushes the security check to a lower level; from a reference level to the
+method level.
+
+By performing the security check every time a resource's method is called,
+the worry of a resource's reference leaking out to insecure code is
+alleviated: the resource cannot be used without authorization, regardless of
+how the reference was obtained. This does add extra overhead, though, since
+many more security checks must be performed.
+
+FreeBSD's jail system works in a similar way. Various system calls allow for
+basic usage, but knowing of a system call is not enough to be granted its use.
+Every invocation of a system call requires checking that the proper rights
+have been granted to the user before the system call is allowed to perform
+its action.
+
+An even better example in FreeBSD's jail system is its protection of sockets.
+Only a single IP address may be bound to a jail. Any attempt to bind more, or
+to use an address other than the one granted, is prevented. The check is
+performed at every call involving the one granted IP address.
+
+Using our home analogy, everyone in the world can know where your home is. But
+to access any door in your home, you have to pass a security check. The
+overhead is higher and slows down your movement in your home, but not caring if
+perfect strangers know where your home is prevents the worry of your address
+leaking out to the world.
+
+
+The 'rexec' Module
+///////////////////////////////////////
+
+The 'rexec' module [#rexec]_ was based on the design used by Safe-Tcl
+[#safe-tcl]_. The design was essentially a capability system. Safe-Tcl
+allowed you to launch a separate interpreter where its global functions were
+specified at creation time. This prevented one from having any abilities that
+were not explicitly provided.
+
+For 'rexec', the Safe-Tcl model was tweaked to better match Python's situation.
+An RExec object represented a restricted environment. Imports were checked
+against a whitelist of modules. You could also restrict the type of modules to
+import based on whether they were Python source, bytecode, or C extensions.
+Built-ins were allowed except for those on a blacklist that were not provided.
+Several other protections were provided; see the module's documentation for
+the complete list.
+
+With an RExec object created, one could pass in strings of code to be
+executed and have the result returned. Separate execution methods allowed
+code to be run either with or without redirected stdin, stdout, and stderr.
+
+The ultimate undoing of the 'rexec' module was how it handled access to
+objects that, in normal Python, require no direct action to reach. Importing
+a module requires a direct action, and thus can be protected against directly
+in the import machinery. Built-ins, however, are accessible by default and
+require no direct action to access in normal Python; you just use their name
+since they are provided in all namespaces.
+
+For instance, in a restricted interpreter, one only had to do
+``del __builtins__`` to gain access to the full set of built-ins. Another way
+is through the gc module:
+``gc.get_referrers(''.__class__.__bases__[0])[6]['file']``. While both of
+these could be fixed (the former being a bug in 'rexec' and the latter by not
+allowing gc to be imported), they are examples of how things that require no
+proactive action on the part of the programmer to reach in normal Python tend
+to leak out. This is an unfortunate side-effect of having all of that
+wonderful reflection in Python.
+
+There is also the issue that 'rexec' was written in Python, which presents its
+own problems.
+
+Much has been learned since 'rexec' was written about how Python tends to be
+used and where security issues tend to appear. Essentially, Python's dynamic
+nature does not lend itself well to passive security measures, since the
+reflection abilities in the language make it easy to get around
+non-proactive security checks.
+
+
+The Proposed Approach
+///////////////////////////////////////
+
+In light of where 'rexec' succeeded and failed along with what is known about
+the two main types of security and how Python tends to operate, the following
+is a proposal on how to secure Python for restricted execution.
+
+First, security will be provided at the C level. By taking advantage of the
+fact that C code cannot be reached from Python without being explicitly
+exposed (i.e., ignoring ctypes [#ctypes]_), direct manipulation of the various
+security checks can be substantially reduced and controlled.
+
+Second, all proactive actions that code can do to gain access to resources will
+be protected through resource hiding. By having to go through Python to get to
+something (e.g., modules), a security check can be put in place to deny access
+as appropriate (this also ties into the separation between interpreters,
+discussed below).
+
+Third, any resource that is usually accessible by default will use resource
+crippling. Instead of worrying about hiding a resource that is available by
+default (e.g., 'file' type), security checks within the resource will prevent
+misuse. Crippling can also be used for resources where an object could be
+desired, but not at its full capacity (e.g., sockets).
+
+Performance should not be too much of an issue for resource crippling. Its
+main use is for I/O types: files and sockets. Since operations on these types
+are I/O bound and not CPU bound, the overhead of doing the security check
+should be a wash overall.
+
+Fourth, the restrictions separating multiple interpreters within a single
+process will be utilized. This helps prevent the leaking of objects into
+different interpreters with escalated privileges. Python source code
+modules are reloaded for each interpreter, preventing an object that does not
+have resource crippling from being leaked into another interpreter unless
+explicitly allowed. C extension modules are shared between interpreters
+rather than being reloaded, but this is taken into account in the security
+design.
+
+Fifth, Python source code is always trusted. Damage to a system is considered
+to be done from either hostile bytecode or at the C level. Thus protecting the
+interpreter and extension modules is the great worry, not Python source code.
+Python bytecode files, on the other hand, are considered inherently unsafe and
+will never be imported directly.
+
+Attempts to perform an action that is not allowed by the security policy will
+raise an XXX exception (or subclass thereof) as appropriate.
+
+
+Implementation Details
+===============================
+
+XXX prefix/module name; Restrict, Secure, Sandbox? Different tense?
+XXX C APIs use abstract names (e.g., string, integer) since have not decided if
+Python objects or C types (e.g., PyStringObject vs. char *) will be used
+
+Support for untrusted interpreters will be a compilation flag. This allows
+the more common case, where no protections are wanted, to avoid any
+performance penalty. And even when Python is compiled with support for
+untrusted interpreter restrictions, a running interpreter that *is* trusted
+will never accidentally trigger the protections. This means that developers
+can be liberal with the security protections without worrying about causing
+issues for interpreters that do not need/want the protection.
+
+At the Python level, the __restricted__ built-in will be set based on whether
+the interpreter is untrusted or not. This will be set for *all* interpreters,
+regardless of whether untrusted interpreter support was compiled in or not.
+
+For setting what is to be protected, the XXX for the
+untrusted interpreter must be passed in. This makes the protection very
+explicit and helps make sure you set protections for the exact interpreter you
+mean to.
+
+The functions for checking permissions are actually macros that take in at
+least an error return value for the function calling the macro. This allows
+the macro to return on behalf of the caller if the check failed and cause the
+XXX exception to be propagated. It helps eliminate coding errors that come
+from incorrectly checking the return value of a rights-checking function
+call. For the rare case where this behaviour is undesired, simply make the
+check in a utility function and check that function's return value (but this
+is strongly discouraged!).
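+
+To illustrate the intent, here is a minimal sketch of what such a
+rights-checking macro might expand to. This is not part of the proposal: the
+``PyXXX_`` prefix is still undecided, the exception is the XXX placeholder
+above (PyExc_RuntimeError is only a stand-in), and
+``_PyXXX_CurrentInterpreterAllows()`` is a purely hypothetical helper::
+
+    /* Hypothetical sketch only; every name here is a placeholder. */
+    #define PyXXX_CheckSomething(error_return) \
+        do { \
+            if (!_PyXXX_CurrentInterpreterAllows()) { \
+                /* the XXX exception; RuntimeError is just a stand-in */ \
+                PyErr_SetString(PyExc_RuntimeError, \
+                                "operation not allowed here"); \
+                return (error_return);  /* returns for the caller */ \
+            } \
+        } while (0)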
+
+
+API
+--------------
+
+* interpreter PyXXX_NewInterpreter()
+ Return a new interpreter that is considered untrusted. There is no
+ corresponding PyXXX_EndInterpreter() as Py_EndInterpreter() will be taught
+ how to handle untrusted interpreters.
+
+* PyXXX_Trusted(error_return)
+ Macro that has the caller return with 'error_return' if the interpreter is
+ not a trusted one.
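+
+As a purely illustrative sketch of the embedding side, assuming the two calls
+above and that the "interpreter" value is the ``PyThreadState *`` returned
+for the new interpreter::
+
+    #include "Python.h"
+
+    /* Run a string of code in a freshly created untrusted interpreter. */
+    static void
+    run_untrusted(const char *code)
+    {
+        PyThreadState *interp = PyXXX_NewInterpreter();  /* untrusted */
+        if (interp == NULL)
+            return;
+        /* ... per-interpreter protections would be set here using the
+           functions described in the following sections ... */
+        PyRun_SimpleString(code);
+        Py_EndInterpreter(interp);  /* taught to handle untrusted ones */
+    }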
+
+
+Memory
+=============================
+
+Protection
+--------------
+
+A memory cap can be set for an untrusted interpreter.
+
+Modifications to pymalloc will be needed to properly keep track of the
+allocation and freeing of memory. The same goes for the macros that wrap the
+system malloc/free calls. This provides a platform-independent system for
+protection instead of relying on the operating system to provide a service
+for capping the memory usage of a process. It also allows the protection to
+be at the interpreter level instead of at the process level.
+
+
+Why
+--------------
+
+Protecting excessive memory usage allows one to make sure that a DoS attack
+against the system's memory is prevented.
+
+
+Possible Security Flaws
+-----------------------
+
+If code makes direct calls to malloc/free instead of using the proper PyMem_*()
+macros then the security check will be circumvented. But C code is *supposed*
+to use the proper macros or pymalloc and thus this issue is not with the
+security model but with code not following Python coding standards.
+
+
+API
+--------------
+
+* int PyXXX_SetMemoryCap(interpreter, integer)
+ Set the memory cap for an untrusted interpreter. If the interpreter is not
+ an untrusted interpreter, return NULL.
+
+* PyXXX_MemoryAlloc(integer, error_return)
+ Macro to increase the amount of memory the running untrusted interpreter is
+ reported to be using. If the increase puts the total count past the set
+ limit, raise an XXX exception and cause the calling function to return with
+ the value of error_return. For trusted interpreters, or untrusted
+ interpreters where a cap has not been set, the macro does nothing.
+
+* int PyXXX_MemoryFree(integer)
+ Decrease the current running interpreter's reported memory usage. If this
+ puts the count below 0, raise an XXX exception and return NULL. For trusted
+ interpreters, or untrusted interpreters where there is no memory cap, the
+ macro does nothing.
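+
+A sketch of how an allocation helper inside the interpreter might consult
+these macros (purely illustrative; the names and exact semantics are the
+placeholders described above)::
+
+    /* Allocate a buffer while keeping the untrusted interpreter's
+       memory accounting up to date. */
+    static char *
+    alloc_tracked_buffer(size_t nbytes)
+    {
+        char *buf;
+        PyXXX_MemoryAlloc(nbytes, NULL);  /* returns NULL here if over cap */
+        buf = PyMem_MALLOC(nbytes);
+        if (buf == NULL) {
+            PyXXX_MemoryFree(nbytes);     /* give the accounted memory back */
+            PyErr_NoMemory();
+            return NULL;
+        }
+        return buf;
+    }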
+
+
+CPU
+=============================
+XXX Needed? Difficult to get right for all platforms. Would have to be very
+platform-specific.
+
+
+Reading/Writing Files
+=============================
+
+Protection
+--------------
+
+The 'file' type will be resource crippled. The user may specify files or
+directories that are acceptable to be opened for reading, writing, or both.
+
+All operations that read from, write to, or provide information about a file
+will require a security check to make sure the operation is allowed for the
+file that the 'file' object represents. This includes the 'file' type's
+constructor raising XXX instead of an IOError stating that a file does not
+exist, so that information about the filesystem is not improperly provided.
+
+The security check will be done for all 'file' objects regardless of where the
+'file' object originated. This prevents issues if the 'file' type or an
+instance of it was accidentally made available to an untrusted interpreter.
+
+
+Why
+--------------
+
+Allowing anyone to arbitrarily read, write, or learn about the layout of your
+filesystem is extremely dangerous. It can lead to loss of data or to data
+being exposed to people who should not have access to it.
+
+
+Possible Security Flaws
+-----------------------
+
+Assuming that the method-level checks are correct and that control over which
+files/directories are allowed is not exposed, 'file' object protection is
+secure, even when a 'file' object is leaked from a trusted interpreter to an
+untrusted one.
+
+
+API
+--------------
+
+* int PyXXX_AllowFile(interpreter, path, mode)
+ Add a file that is allowed to be opened in 'mode' by the 'file' object. If
+ the interpreter is not untrusted then return NULL.
+
+* int PyXXX_AllowDirectory(interpreter, path, mode)
+ Add a directory that is allowed to have files opened in 'mode' by the
+ 'file' object. This includes both pre-existing files and any new files
+ created by the 'file' object.
+ XXX allow for creating/reading subdirectories?
+
+* PyXXX_CheckPath(path, mode, error_return)
+ Macro that causes the caller to return with 'error_return' and XXX as the
+ exception if the specified path with 'mode' is not allowed. For trusted
+ interpreters, the macro does nothing.
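+
+A sketch of how a crippled 'file' constructor path might use this macro
+(illustrative only; ``file_open_checked`` is a hypothetical helper and the
+real check would live inside the 'file' type itself)::
+
+    /* Refuse to open the file, and leak no filesystem information,
+       unless the path/mode pair was explicitly allowed. */
+    static PyObject *
+    file_open_checked(const char *path, const char *mode)
+    {
+        PyXXX_CheckPath(path, mode, NULL);  /* returns NULL if not allowed */
+        return PyObject_CallFunction((PyObject *)&PyFile_Type,
+                                     "ss", path, mode);
+    }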
+
+
+Extension Module Importation
+============================
+
+Protection
+--------------
+
+A whitelist of extension modules that may be imported must be provided. A
+default set is given for stdlib modules known to be safe.
+
+A check in the import machinery will verify that a specified module name is
+allowed based on the type of module (Python source, Python bytecode, or
+extension module). Python bytecode files are never directly imported because
+of the possibility of hostile bytecode being present. Python source is always
+trusted, based on the assumption that all resource harm is eventually done at
+the C level, so Python code itself cannot directly cause harm. Thus only C
+extension modules need to be checked against the whitelist.
+
+If the requested module is a C extension module, its name is checked to make
+sure that it is on the whitelist. If the name is not on the whitelist, an XXX
+exception is raised. Otherwise the import is allowed.
+
+Even if a Python source code module imports a C extension module in a trusted
+interpreter it is not a problem since the Python source code module is reloaded
+in the untrusted interpreter. When that Python source module is freshly
+imported the normal import check will be triggered to prevent the C extension
+module from becoming available to the untrusted interpreter.
+
+For the 'os' module, a special restricted version will be used if the proper
+C extension module providing the full abilities is not allowed. It will
+default to '/' as the path separator and provide as much reasonable
+functionality as possible from a pure Python module.
+
+The 'sys' module is specially addressed in
+`Changing the Behaviour of the Interpreter`_.
+
+By default, the whitelisted modules are:
+
+* XXX work off of rexec whitelist?
+
+
+Why
+--------------
+
+Because C code is considered unsafe, its use should be regulated. Using a
+whitelist allows one to explicitly decide which C extension modules should be
+considered safe.
+
+
+Possible Security Flaws
+-----------------------
+
+If a trusted C extension module imports an untrusted C extension module and
+makes it an attribute of the trusted module, there will be a breach in
+security. Luckily this is a rarity in extension modules.
+
+There is also the issue of a C extension module calling the C API of an
+untrusted C extension module.
+
+Lastly, if a trusted C extension module is loaded in a trusted interpreter and
+then loaded into an untrusted interpreter, there is no chance for any security
+checks in its init*() function to run again; resources opened during module
+initialization therefore escape whatever checks exist there.
+
+All of these issues can be handled by never blindly whitelisting a C extension
+module. Added support for dealing with C extension modules comes in the form
+of `Extension Module Crippling`_.
+
+API
+--------------
+
+* int PyXXX_AllowModule(interpreter, module_name)
+ Allow the untrusted interpreter to import 'module_name'. If the
+ interpreter is not untrusted, return NULL.
+ XXX sub-modules in packages allowed implicitly? Or have to list all
+ modules explicitly?
+
+* int PyXXX_BlockModule(interpreter, module_name)
+ Remove the specified module from the whitelist. Used to remove modules
+ that are allowed by default. If called on a trusted interpreter, returns
+ NULL.
+
+* PyXXX_CheckModule(module_name, error_return)
+ Macro that causes the caller to return with 'error_return' and sets the
+ exception XXX if the specified module cannot be imported. For trusted
+ interpreters the macro does nothing.
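+
+A sketch of where such a check could sit in the import machinery
+(illustrative only; ``_PyImport_LoadDynamicModule()`` is the existing internal
+loader for C extension modules and is used here purely as an example call
+site)::
+
+    /* Only load a C extension module if it is on the whitelist. */
+    static PyObject *
+    load_extension_checked(char *name, char *pathname, FILE *fp)
+    {
+        PyXXX_CheckModule(name, NULL);  /* returns NULL if not whitelisted */
+        return _PyImport_LoadDynamicModule(name, pathname, fp);
+    }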
+
+
+Extension Module Crippling
+==========================
+
+Protection
+--------------
+
+By providing a C API for checking which abilities are allowed, a module that
+has some useful functionality can do proper security checks in just those
+functions that could provide insecure abilities, while still allowing its safe
+code to be used (and thus not fully denying importation).
+
+
+Why
+--------------
+
+Consider a module that provides a string processing ability. If that module
+provides a single convenience function that reads its input string from a file
+(with a specified path), the whole module should not be blocked from being
+used, just that convenience function. By whitelisting the module but having a
+security check on the one problem function, the user can still gain access to
+the safe functions. Even better, the unsafe function can be allowed if the
+security checks pass.
+
+
+Possible Security Flaws
+-----------------------
+
+If a C extension module developer incorrectly implements the security checks
+for the unsafe functions it could lead to undesired abilities.
+
+
+API
+--------------
+
+Use PyXXX_Trusted() to protect unsafe code from being executed.
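+
+Following the string-processing example above, a sketch of a crippled
+convenience function (illustrative only; the module and function names are
+made up)::
+
+    /* The unsafe convenience function is guarded, while the rest of the
+       module's safe functions carry no check at all. */
+    static PyObject *
+    strproc_process_file(PyObject *self, PyObject *args)
+    {
+        char *path;
+        PyXXX_Trusted(NULL);  /* untrusted: raise XXX and return NULL */
+        if (!PyArg_ParseTuple(args, "s", &path))
+            return NULL;
+        /* ... read the file at 'path' and run the string processing ... */
+        Py_RETURN_NONE;
+    }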
+
+
+Hostile Bytecode
+=============================
+
+Protection
+--------------
+
+The code object's constructor is not callable from Python. Importation of .pyc
+and .pyo files is also prohibited.
+
+
+Why
+--------------
+
+Without implementing a bytecode verification tool, there is no way of making
+sure that bytecode does not jump outside its bounds, thus possibly executing
+malicious code. It also presents the possibility of crashing the interpreter.
+
+
+Possible Security Flaws
+-----------------------
+
+None known.
+
+
+API
+--------------
+
+None.
+
+
+Changing the Behaviour of the Interpreter
+=========================================
+
+Protection
+--------------
+
+Only a subset of the 'sys' module will be made available to untrusted
+interpreters. Things to allow from the sys module:
+
+* byteorder
+* subversion
+* copyright
+* displayhook
+* excepthook
+* __displayhook__
+* __excepthook__
+* exc_info
+* exc_clear
+* exit
+* getdefaultencoding
+* _getframe
+* hexversion
+* last_type
+* last_value
+* last_traceback
+* maxint
+* maxunicode
+* modules
+* stdin # See `Stdin, Stdout, and Stderr`_.
+* stdout
+* stderr
+* __stdin__ # See `Stdin, Stdout, and Stderr`_ XXX Perhaps not needed?
+* __stdout__
+* __stderr__
+* version
+* api_version
+
+
+Why
+--------------
+
+Filesystem information must be removed. Any settings that could
+possibly lead to a DoS attack (e.g., sys.setrecursionlimit()) or risk crashing
+the interpreter must also be removed.
+
+
+Possible Security Flaws
+-----------------------
+
+Exposing something that could lead to future security problems (e.g., a way to
+crash the interpreter).
+
+
+API
+--------------
+
+None.
+
+
+Socket Usage
+=============================
+
+Protection
+--------------
+
+Allow sending and receiving data to/from specific IP addresses on specific
+ports.
+
+
+Why
+--------------
+
+Allowing arbitrary sending of data over sockets can lead to DoS attacks on the
+network and other machines. Limiting accepting data prevents your machine from
+being attacked by accepting malicious network connections. It also allows you
+to know exactly where communication is going to and coming from.
+
+
+Possible Security Flaws
+-----------------------
+
+If someone managed to compromise the DNS server in use so as to influence
+which IP addresses a DNS lookup returns.
+
+
+API
+--------------
+
+* int PyXXX_AllowIPAddress(interpreter, IP, port)
+ Allow the untrusted interpreter to send/receive to the specified IP
+ address on the specified port. If the interpreter is not untrusted,
+ return NULL.
+
+* PyXXX_CheckIPAddress(IP, port, error_return)
+ Macro to verify that the specified IP address on the specified port is
+ allowed to be communicated with. If not, cause the caller to return with
+ 'error_return' and XXX exception set. If the interpreter is trusted then
+ do nothing.
+
+* int PyXXX_AllowHost(interpreter, host, port)
+ Allow the untrusted interpreter to send/receive to the specified host on
+ the specified port. If the interpreter is not untrusted, return NULL.
+ XXX resolve to IP at call time to prevent DNS man-in-the-middle attacks?
+
+* PyXXX_CheckHost(host, port, error_return)
+ Check that the specified host on the specified port is allowed to be
+ communicated with. If not, set an XXX exception and cause the caller to
+ return 'error_return'. If the interpreter is trusted then do nothing.
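+
+A sketch of how a crippled socket operation might apply the check
+(illustrative only; ``connect_checked`` is a hypothetical wrapper)::
+
+    /* Refuse to connect unless the destination was explicitly allowed. */
+    static PyObject *
+    connect_checked(PyObject *sock, const char *ip, int port)
+    {
+        PyXXX_CheckIPAddress(ip, port, NULL);  /* NULL if not allowed */
+        return PyObject_CallMethod(sock, "connect", "((si))", ip, port);
+    }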
+
+
+Network Information
+=============================
+
+Protection
+--------------
+
+Limit what information can be gleaned about the network the system is running
+on. This does not include restricting information on IP addresses and hosts
+that have been explicitly allowed for the untrusted interpreter to
+communicate with.
+
+
+Why
+--------------
+
+With enough information from the network several things could occur. One is
+that someone could possibly figure out where your machine is on the Internet.
+Another is that enough information about the network you are connected to could
+be used against it in an attack.
+
+
+Possible Security Flaws
+-----------------------
+
+As long as usage is restricted to only what is needed to work with allowed
+addresses, there are no security issues to speak of.
+
+
+API
+--------------
+
+* int PyXXX_AllowNetworkInfo(interpreter)
+ Allow the untrusted interpreter to get network information regardless of
+ whether the IP or host address is explicitly allowed. If the interpreter
+ is not untrusted, return NULL.
+
+* PyXXX_CheckNetworkInfo(error_return)
+ Macro that will return 'error_return' for the caller and set XXX exception
+ if the untrusted interpreter does not allow checking for arbitrary network
+ information. For a trusted interpreter this does nothing.
+
+
+Filesystem Information
+=============================
+
+Protection
+--------------
+
+Do not allow information about the filesystem layout from various parts of
+Python to be exposed. This means blocking exposure at the Python level to:
+
+* __file__ attribute on modules
+* __path__ attribute on packages
+* co_filename attribute on code objects
+
+
+Why
+--------------
+
+Exposing information about the filesystem layout is dangerous. From file
+paths one can figure out what operating system is being used, which can lead
+to vulnerabilities specific to that operating system being exploited.
+
+
+Possible Security Flaws
+-----------------------
+
+Not finding every single place where a file path is exposed.
+
+
+API
+--------------
+
+* int PyXXX_AllowFilesystemInfo(interpreter)
+ Allow the untrusted interpreter to expose filesystem information. If the
+ passed-in interpreter is not untrusted, return NULL.
+
+* PyXXX_CheckFilesystemInfo(error_return)
+ Macro that checks if exposing filesystem information is allowed. If it is
+ not, cause the caller to return with the value of 'error_return' and raise
+ XXX.
+
+
+Threading
+=============================
+
+XXX Needed?
+
+
+Stdin, Stdout, and Stderr
+=============================
+
+Protection
+--------------
+
+By default, sys.__stdin__, sys.__stdout__, and sys.__stderr__ will be set to
+instances of cStringIO. Use of the true stdin, stdout, and stderr can be
+explicitly allowed.
+XXX Or perhaps __stdin__ and friends should just be blocked and all you get is
+sys.stdin and friends set to cStringIO.
+
+
+Why
+--------------
+
+Interference with stdin, stdout, or stderr should not be allowed unless
+desired.
+
+
+Possible Security Flaws
+-----------------------
+
+Unless cStringIO instances can be used maliciously, none to speak of.
+XXX Use StringIO instances instead for even better security?
+
+
+API
+--------------
+
+* int PyXXX_UseTrueStdin(interpreter)
+ int PyXXX_UseTrueStdout(interpreter)
+ int PyXXX_UseTrueStderr(interpreter)
+ Set the specific stream for the interpreter to the true version of the
+ stream and not to the default instance of cStringIO. If the interpreter is
+ not untrusted, return NULL.
+
+
+Adding New Protections
+=============================
+
+Protection
+--------------
+
+Allow for extensibility in the security model by making it possible to add new
+types of checks. This not only allows Python to add new security protections
+in a backwards-compatible fashion, but also lets extension modules add their
+own.
+
+An extension module can introduce a group for its various values to check, with
+a type being a specific value within a group. The "Python" group is
+specifically reserved for use by the Python core itself.
+
+
+Why
+--------------
+
+We are all human. The need for a new type of protection for the interpreter
+may well present itself and thus need support. Providing an extensible way to
+add new protections helps to future-proof the system.
+
+It also allows extension modules to present their own set of security
+protections. That way one extension module can use the protection scheme
+presented by another that it is dependent upon.
+
+
+Possible Security Flaws
+------------------------
+
+Poor definitions by extension module authors of how their protections should
+be used would allow for possible exploitation.
+
+
+API
+--------------
+
+XXX Could also have PyXXXExtended prefix instead for the following functions
+
++ Bool
+ * int PyXXX_ExtendedSetTrue(interpreter, group, type)
+ Set a group-type to be true. Expected use is when a binary on/off
+ setting is needed and the default is to not allow use of the resource
+ (e.g., network information). Returns NULL if the interpreter is not
+ untrusted.
+
+ * PyXXX_ExtendedCheckTrue(group, type, error_return)
+ Macro that, if the group-type is not set to true, causes the caller to
+ return 'error_return' with the XXX exception raised. For trusted
+ interpreters the check does nothing.
+
++ Numeric Range
+ * int PyXXX_ExtendedValueCap(interpreter, group, type, cap)
+ Set a group-type to a capped value, with the initial value set to 0.
+ Expected use is when a resource has a capped amount of use (e.g.,
+ memory). Returns NULL if the interpreter is not untrusted.
+
+ * PyXXX_ExtendedValueAlloc(increase, error_return)
+ Macro to raise the amount of a resource used by 'increase'. If the
+ increase pushes the resource allocation past the set cap, then have the
+ caller return 'error_return' and set XXX as the exception.
+
+ * PyXXX_ExtendedValueFree(decrease, error_return)
+ Macro to lower the amount of a resource used by 'decrease'. If the
+ decrease pushes the allotment below 0 then have the caller return
+ 'error_return' and set XXX as the exception.
+
+
++ Membership
+ * int PyXXX_ExtendedAddMembership(interpreter, group, type, string)
+ Add a string to be considered a member of a group-type (e.g., allowed
+ file paths). If the interpreter is not an untrusted interpreter,
+ return NULL.
+
+ * PyXXX_ExtendedCheckMembership(group, type, string, error_return)
+ Macro that checks that 'string' is a member of the values set for the
+ group-type. If it is not, then have the caller return 'error_return'
+ and set an exception for XXX. For trusted interpreters the call does
+ nothing.
+
++ Specific Value
+ * int PyXXX_ExtendedSetValue(interpreter, group, type, string)
+ Set a group-type to a specific string. If the interpreter is not
+ untrusted, return NULL.
+
+ * PyXXX_ExtendedCheckValue(group, type, string, error_return)
+ Macro to check that the group-type is set to 'string'. If it is not,
+ then have the caller return 'error_return' and set an exception for
+ XXX. If the interpreter is trusted then nothing is done.
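+
+A sketch of how an extension module might use the membership variant
+(illustrative only; the "mymodule" group, its "devices" type, and the
+function are made up)::
+
+    /* Only open a device that was previously added via
+       PyXXX_ExtendedAddMembership(interp, "mymodule", "devices", ...). */
+    static PyObject *
+    mymodule_open_device(PyObject *self, PyObject *args)
+    {
+        char *device;
+        if (!PyArg_ParseTuple(args, "s", &device))
+            return NULL;
+        PyXXX_ExtendedCheckMembership("mymodule", "devices", device, NULL);
+        /* ... open and return the device ... */
+        Py_RETURN_NONE;
+    }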
+
+
+References
+///////////////////////////////////////
+
+.. [#rexec] The 'rexec' module
+ (http://docs.python.org/lib/module-rexec.html)
+
+.. [#safe-tcl] The Safe-Tcl Security Model
+ (http://research.sun.com/technical-reports/1997/abstract-60.html)
+
+.. [#ctypes] 'ctypes' module
+ (http://docs.python.org/dev/lib/module-ctypes.html)
From python-checkins at python.org Thu Jul 6 03:06:40 2006
From: python-checkins at python.org (mateusz.rukowicz)
Date: Thu, 6 Jul 2006 03:06:40 +0200 (CEST)
Subject: [Python-checkins] r47250 - sandbox/trunk/decimal-c/_decimal.c
sandbox/trunk/decimal-c/decimal.h
sandbox/trunk/decimal-c/test_decimal.py
Message-ID: <20060706010640.CC0E51E4003@bag.python.org>
Author: mateusz.rukowicz
Date: Thu Jul 6 03:06:35 2006
New Revision: 47250
Modified:
sandbox/trunk/decimal-c/_decimal.c
sandbox/trunk/decimal-c/decimal.h
sandbox/trunk/decimal-c/test_decimal.py
Log:
Made things ready to switch to big exponents, fixed pickling of Context. Changed test_decimal.py so it doesn't need __dict__ attribute of Context instance.
Modified: sandbox/trunk/decimal-c/_decimal.c
==============================================================================
--- sandbox/trunk/decimal-c/_decimal.c (original)
+++ sandbox/trunk/decimal-c/_decimal.c Thu Jul 6 03:06:35 2006
@@ -521,6 +521,43 @@
return new_pos;
}
+
+#define exp_add_i(exp, a) ((exp) + (a))
+#define exp_add(a, b) ((a) + (b))
+#define exp_inc(a) ((*a)++)
+#define exp_dec(a) ((*a)--)
+#define exp_is_zero(a) ((a) == 0)
+#define exp_is_neg(a) ((a) < 0)
+#define exp_is_pos(a) ((a) > 0)
+#define exp_inp_add(a,b) ((*(a)) += (b))
+#define exp_to_int(a) (a)
+#define exp_to_i(a) (a)
+#define exp_from_i(a) (a)
+#define exp_sub_i(exp, a) ((exp) - (a))
+#define exp_sub(a, b) ((a) - (b))
+#define exp_g(a, b) ((a) > (b))
+#define exp_l(a, b) ((a) < (b))
+#define exp_ge(a, b) ((a) >= (b))
+#define exp_eq(a, b) ((a) == (b))
+#define exp_ne(a, b) ((a) != (b))
+#define exp_g_i(a, b) ((a) > (b))
+#define exp_l_i(a, b) ((a) < (b))
+#define exp_ge_i(a, b) ((a) >= (b))
+#define exp_le_i(a, b) ((a) <= (b))
+#define exp_mod_i(a, b) ((a) % (b))
+#define exp_ge_i(a, b) ((a) >= (b))
+#define exp_floordiv_i(a,b) ((a) / (b) - ((a) % (b) && (a) < 0))
+#define exp_inp_sub(a, b) ((*(a)) -= (b))
+#define exp_inp_sub_i(a, b) ((*(a)) -= (b))
+#define exp_sprintf(a, e) (sprintf(a, "%d", e))
+#define exp_sscanf(a, e) (sscanf(a, "%d", e))
+#define exp_min(a, b) ((a) < (b) ? (a) : (b))
+#define exp_max(a, b) ((a) > (b) ? (a) : (b))
+#define exp_neg(a) (-(a))
+#define exp_mul_i(a, b) ((a) * (b))
+
+
+
/* helpful macros ************************************************************/
/* if we later decide to call this module "squeaky" */
@@ -545,9 +582,10 @@
/* Exponent calculations */
/* XXX: overflow checking? */
-#define ETINY(ctx) ((ctx)->Emin - (ctx)->prec + 1)
-#define ETOP(ctx) ((ctx)->Emax - (ctx)->prec + 1)
-#define ADJUSTED(dec) ((dec)->exp + (dec)->ob_size - 1)
+#define ETINY(ctx) (exp_sub_i((ctx)->Emin, (ctx)->prec - 1))
+#define ETOP(ctx) (exp_sub_i((ctx)->Emax, (ctx)->prec - 1))
+/*#define ADJUSTED(dec) ((dec)->exp + (dec)->ob_size - 1) */
+#define ADJUSTED(dec) (exp_add_i((dec)->exp, (dec)->ob_size - 1))
/* constant values ***********************************************************/
@@ -897,7 +935,7 @@
*/
static decimalobject *
-_new_decimalobj(PyTypeObject *type, long ndigits, char sign, long exp)
+_new_decimalobj(PyTypeObject *type, long ndigits, char sign, exp_t exp)
{
decimalobject *new;
char *arr = NULL;
@@ -1034,10 +1072,10 @@
/* Round towards 0, that is, truncate digits. */
static decimalobject *
-_round_down(decimalobject *self, long prec, long expdiff, contextobject *ctx)
+_round_down(decimalobject *self, long prec, exp_t expdiff, contextobject *ctx)
{
long i;
- decimalobject *new = _NEW_decimalobj(prec, self->sign, self->exp + expdiff);
+ decimalobject *new = _NEW_decimalobj(prec, self->sign, exp_add(self->exp, expdiff));
if (!new) return NULL;
for (i = 0; i < prec; i++)
new->digits[i] = self->digits[i];
@@ -1048,10 +1086,10 @@
/* Round away from 0. */
static decimalobject *
-_round_up(decimalobject *self, long prec, long expdiff, contextobject *ctx)
+_round_up(decimalobject *self, long prec, exp_t expdiff, contextobject *ctx)
{
long i;
- decimalobject *new = _NEW_decimalobj(prec, self->sign, self->exp + expdiff);
+ decimalobject *new = _NEW_decimalobj(prec, self->sign, exp_add(self->exp, expdiff));
decimalobject *new2 = NULL;
if (!new) return NULL;
for (i = 0; i < prec; i++)
@@ -1068,7 +1106,7 @@
_limb_cut_one_digit(new2->limbs,new2->ob_size);
new2->ob_size--;
new2->limb_count = (new2->ob_size + LOG -1)/LOG;
- new2->exp++;
+ exp_inc(&(new2->exp));
}
return new2;
}
@@ -1081,7 +1119,7 @@
* without DECREFing tmp afterwards.
*/
static decimalobject *
-_do_round_half_up(decimalobject *self, long prec, long expdiff,
+_do_round_half_up(decimalobject *self, long prec, exp_t expdiff,
contextobject *ctx, decimalobject *tmp)
{
decimalobject *new;
@@ -1094,7 +1132,7 @@
_limb_cut_one_digit(new->limbs,new->ob_size);
new->ob_size--;
new->limb_count = (new->ob_size + LOG - 1)/LOG;
- new->exp++;
+ exp_inc(&(new->exp));
}
return new;
} else {
@@ -1104,12 +1142,12 @@
/* Round 5 down. */
static decimalobject *
-_round_half_down(decimalobject *self, long prec, long expdiff, contextobject *ctx)
+_round_half_down(decimalobject *self, long prec, exp_t expdiff, contextobject *ctx)
{
long i, last;
decimalobject *tmp;
assert(expdiff > 0);
- tmp = _NEW_decimalobj(prec, self->sign, self->exp + expdiff);
+ tmp = _NEW_decimalobj(prec, self->sign, exp_add(self->exp, expdiff));
if (!tmp) return NULL;
for (i = 0; i < prec; i++)
tmp->digits[i] = self->digits[i];
@@ -1128,12 +1166,12 @@
/* Round 5 to even, rest to nearest. */
static decimalobject *
-_round_half_even(decimalobject *self, long prec, long expdiff, contextobject *ctx)
+_round_half_even(decimalobject *self, long prec, exp_t expdiff, contextobject *ctx)
{
decimalobject *tmp;
long i, last;
assert(expdiff > 0);
- tmp = _NEW_decimalobj(prec, self->sign, self->exp + expdiff);
+ tmp = _NEW_decimalobj(prec, self->sign, exp_add(self->exp, expdiff));
if (!tmp) return NULL;
for (i = 0; i < prec; i++)
tmp->digits[i] = self->digits[i];
@@ -1151,11 +1189,11 @@
/* Round 5 up (away from 0). */
static decimalobject *
-_round_half_up(decimalobject *self, long prec, long expdiff, contextobject *ctx)
+_round_half_up(decimalobject *self, long prec, exp_t expdiff, contextobject *ctx)
{
decimalobject *tmp;
long i;
- tmp = _NEW_decimalobj(prec, self->sign, self->exp + expdiff);
+ tmp = _NEW_decimalobj(prec, self->sign, exp_add(self->exp, expdiff));
if (!tmp) return NULL;
for (i = 0; i < prec; i++)
tmp->digits[i] = self->digits[i];
@@ -1165,7 +1203,7 @@
/* Round up (regardless of sign) */
static decimalobject *
-_round_ceiling(decimalobject *self, long prec, long expdiff, contextobject *ctx)
+_round_ceiling(decimalobject *self, long prec, exp_t expdiff, contextobject *ctx)
{
assert(self->sign <= 1);
if (self->sign > 0)
@@ -1176,7 +1214,7 @@
/* Round down (regardless of sign) */
static decimalobject *
-_round_floor(decimalobject *self, long prec, long expdiff, contextobject *ctx)
+_round_floor(decimalobject *self, long prec, exp_t expdiff, contextobject *ctx)
{
assert(self->sign <= 1);
if (self->sign > 0)
@@ -1199,7 +1237,8 @@
{
decimalobject *new, *new2 = NULL;
contextobject *ctx2 = NULL;
- long i, expdiff;
+ long i;
+ exp_t expdiff;
round_func rnd_func;
if (ISSPECIAL(self)) {
@@ -1226,7 +1265,7 @@
i = prec;
new = _NEW_decimalobj(i, self->sign,
- self->ob_size - prec + self->exp);
+ exp_add_i(self->exp, self->ob_size - prec));
if (!new) return NULL;
while (i--)
new->digits[i] = 0;
@@ -1249,7 +1288,7 @@
prec = 1;
} else if (prec < 0) {
new = _NEW_decimalobj(2, self->sign,
- self->exp + self->ob_size - prec - 1);
+ exp_add_i(self->exp, self->ob_size - prec - 1));
if (!new) return NULL;
new->digits[0] = 0;
new->digits[1] = 1;
@@ -1260,12 +1299,14 @@
if (!new) return NULL;
}
- expdiff = new->ob_size - prec;
- if (expdiff == 0)
+ expdiff = exp_from_i(new->ob_size - prec);
+/* if (expdiff == 0) */
+ if(exp_is_zero(expdiff))
return new;
- else if (expdiff < 0) {
+
+ else if (exp_is_neg(expdiff)) {
/* we need to extend precision */
- new2 = _NEW_decimalobj(prec, new->sign, new->exp + expdiff);
+ new2 = _NEW_decimalobj(prec, new->sign, exp_add(new->exp, expdiff));
if (!new2) {
Py_DECREF(new);
return NULL;
@@ -1283,7 +1324,7 @@
}
/* Maybe all the lost digits are 0. */
- for (i = self->ob_size - expdiff; i < self->ob_size; i++) {
+ for (i = self->ob_size - exp_to_int(expdiff); i < self->ob_size; i++) {
if(_limb_get_digit(self->limbs, self->ob_size, i) > 0)
goto no_way;
}
@@ -1295,7 +1336,8 @@
new->ob_size --;
}
new->limb_count = (new->ob_size + LOG - 1)/LOG;
- new->exp += expdiff;
+/* new->exp += expdiff; */
+ exp_inp_add(&(new->exp), expdiff);
if (handle_Rounded(ctx, NULL) != 0) {
Py_DECREF(new);
return NULL;
@@ -1339,12 +1381,13 @@
/* Default values: rounding=-1 (use context), watchexp=1 */
static decimalobject *
-_decimal_rescale(decimalobject *self, long exp, contextobject *ctx,
+_decimal_rescale(decimalobject *self, exp_t exp, contextobject *ctx,
int rounding, int watchexp)
{
decimalobject *ans = NULL, *tmp;
int ret;
- long diff, adj;
+ exp_t diff;
+ long adj;
long digits, i;
if (ISSPECIAL(self)) {
@@ -1357,7 +1400,7 @@
return ans;
}
- if (watchexp && (exp > ctx->Emax || exp < ETINY(ctx)))
+ if (watchexp && (exp_g(exp, ctx->Emax) || exp_l(exp, ETINY(ctx))))
return handle_InvalidOperation(self->ob_type, ctx,
"rescale(a, INF)", NULL);
@@ -1370,8 +1413,8 @@
return ans;
}
- diff = self->exp - exp;
- digits = self->ob_size + diff;
+ diff = exp_sub(self->exp, exp);
+ digits = exp_to_int(exp_add_i(diff, self->ob_size));
if (watchexp && ctx->prec < digits)
return handle_InvalidOperation(self->ob_type, ctx,
@@ -1379,7 +1422,7 @@
digits += 1;
if (digits < 0) {
- ans = _NEW_decimalobj(2, self->sign, self->exp - digits);
+ ans = _NEW_decimalobj(2, self->sign, exp_sub_i(self->exp, digits));
if (!ans)
return NULL;
ans->digits[0] = 0;
@@ -1432,13 +1475,16 @@
{
static char *kwlist[] = {"exp", "rounding", "context", "watchexp", 0};
contextobject *ctx = NULL;
- long exp;
+ exp_t exp;
+ long tmp_exp;
int rounding = -1, watchexp = 1;
+ /* TODO big exp*/
if(!PyArg_ParseTupleAndKeywords(args,kwds,"l|iOi:_rescale",kwlist,
- &exp, &rounding, &ctx, &watchexp))
+ &tmp_exp, &rounding, &ctx, &watchexp))
return NULL;
+ exp = exp_from_i(tmp_exp);
if(ctx == NULL)
if(!(ctx = getcontext()))
return NULL;
@@ -1457,13 +1503,13 @@
_fixexponents(decimalobject *self, contextobject *ctx)
{
decimalobject *ans = NULL;
- long adj;
+ exp_t adj;
assert(!ISSPECIAL(self));
adj = ADJUSTED(self);
- if (adj < ctx->Emin) {
- long Etiny = ETINY(ctx);
- if (self->exp < Etiny) {
+ if (exp_l(adj, ctx->Emin)) {
+ exp_t Etiny = ETINY(ctx);
+ if (exp_l(self->exp, Etiny)) {
if (!decimal_nonzero(self)) {
ans = _decimal_get_copy(self);
if (!ans)
@@ -1491,8 +1537,8 @@
/* this returns self, below */
}
} else {
- long Etop = ETOP(ctx);
- if (ctx->clamp && self->exp > Etop) {
+ exp_t Etop = ETOP(ctx);
+ if (ctx->clamp && exp_g(self->exp, Etop)) {
if (handle_Clamped(ctx, NULL) != 0)
return NULL;
ans = _decimal_rescale(self, Etop, ctx, -1, 1);
@@ -1500,7 +1546,7 @@
return NULL;
return ans;
} else {
- if (adj > ctx->Emax) {
+ if (exp_g(adj, ctx->Emax)) {
if (!decimal_nonzero(self)) {
ans = _decimal_get_copy(self);
if (!ans)
@@ -1671,7 +1717,7 @@
_do_real_decimal_compare(decimalobject *self, decimalobject *other,
contextobject *ctx)
{
- long adj1, adj2;
+ exp_t adj1, adj2;
long i, minsize;
decimalobject *longer, *ans;
contextobject *ctx2;
@@ -1707,7 +1753,7 @@
adj1 = ADJUSTED(self);
adj2 = ADJUSTED(other);
- if (adj1 == adj2) {
+ if (exp_eq(adj1, adj2)) {
if (self->ob_size <= other->ob_size) {
minsize = self->ob_size;
longer = other;
@@ -1733,9 +1779,9 @@
next:
/* Adjusted sizes are not equal. */
- if(adj1 > adj2 && _limb_get_digit(self->limbs, self->ob_size, 0) != 0)
+ if(exp_g(adj1, adj2) && _limb_get_digit(self->limbs, self->ob_size, 0) != 0)
return 1 - 2*(self->sign & 1); /* 0 -> 1, 1 -> -1 */
- else if(adj1 < adj2 && _limb_get_digit(other->limbs, other->ob_size, 0) != 0)
+ else if(exp_l(adj1, adj2) && _limb_get_digit(other->limbs, other->ob_size, 0) != 0)
return -1 + 2*(other->sign & 1);
ctx2 = context_copy(ctx);
@@ -1882,9 +1928,9 @@
}
else
{
- if(self->exp < other->exp && !self->sign)
+ if (exp_l(self->exp, other->exp) && !self->sign)
ans = other;
- else if(self->exp > other->exp && self->sign)
+ else if (exp_g(self->exp, other->exp) && self->sign)
ans = other;
}
}
@@ -1954,9 +2000,9 @@
ans = other;
}
else{
- if(self->exp > other->exp && !self->sign)
+ if (exp_g(self->exp, other->exp) && !self->sign)
ans = other;
- else if (self->exp < other->exp && self->sign)
+ else if (exp_l(self->exp, other->exp) && self->sign)
ans = other;
}
}
@@ -2001,7 +2047,7 @@
while(_limb_get_digit(dup->limbs, dup->ob_size, dup->ob_size -1) == 0){
_limb_cut_one_digit(dup->limbs,dup->ob_size);
dup->ob_size --;
- dup->exp ++;
+ exp_inc(&(dup->exp));
}
return dup;
}
@@ -2353,7 +2399,7 @@
}
}
- if (self->exp == other->exp)
+ if (exp_eq(self->exp, other->exp))
Py_RETURN_TRUE;
else
Py_RETURN_FALSE;
@@ -2392,14 +2438,14 @@
contextobject *ctx2 = 0;
decimalobject *half = 0;
PyObject *flags = 0;
- long expadd;
+ exp_t expadd;
long firstprec;
long i;
- long Emax;
- long Emin;
+ exp_t Emax;
+ exp_t Emin;
long maxp;
long rounding;
- long prevexp;
+ exp_t prevexp;
if (ISSPECIAL(self)) {
decimalobject *nan;
@@ -2416,9 +2462,10 @@
return NULL;
ret->limbs[0] = 0;
- ret->exp = self->exp / 2;
- if (self->exp < 0 && self->exp%2)
- ret->exp --;
+ ret->exp = exp_floordiv_i(self->exp, 2);
+// ret->exp = self->exp / 2;
+// if (self->exp < 0 && self->exp%2)
+// ret->exp --;
ret->sign = self->sign & 1;
return ret;
@@ -2433,11 +2480,12 @@
if (!tmp)
return NULL;
- expadd = tmp->exp/2;
- if (tmp->exp < 0 && tmp->exp % 2)
- expadd --;
+ expadd = exp_floordiv_i(tmp->exp, 2);
+// expadd = tmp->exp/2;
+// if (tmp->exp < 0 && tmp->exp % 2)
+// expadd --;
- if (tmp->exp &1) {
+ if (exp_mod_i(tmp->exp, 2)) {
_limb_first_n_digits(self->limbs, self->ob_size, 0, tmp->limbs, tmp->ob_size);
}
else {
@@ -2447,7 +2495,7 @@
tmp->limbs[i] = self->limbs[i];
}
- tmp->exp = 0;
+ tmp->exp = exp_from_i(0);
ctx2 = context_copy(ctx);
@@ -2464,7 +2512,7 @@
return NULL;
}
- ans = _NEW_decimalobj(3, 0, 0);
+ ans = _NEW_decimalobj(3, 0, exp_from_i(0));
if (!ans) {
Py_DECREF(tmp);
@@ -2472,7 +2520,7 @@
Py_DECREF(flags);
}
- tmp2 = _NEW_decimalobj(3, 0, 0);
+ tmp2 = _NEW_decimalobj(3, 0, exp_from_i(0));
if (!tmp2) {
Py_DECREF(tmp);
@@ -2481,17 +2529,19 @@
Py_DECREF(ans);
}
- if ((ADJUSTED(tmp) & 1) == 0) {
+ if (exp_mod_i(ADJUSTED(tmp), 2) == 0) {
ans->limbs[0] = 819;
- ans->exp = ADJUSTED(tmp) - 2;
+ ans->exp = exp_sub_i(ADJUSTED(tmp), 2);
tmp2->limbs[0] = 259;
- tmp2->exp = -2;
+// tmp2->exp = -2;
+ exp_inp_sub_i(&(tmp2->exp), 2);
}
else {
ans->limbs[0] = 259;
- ans->exp = tmp->exp + tmp->ob_size - 3;
+ ans->exp = exp_add_i(tmp->exp, tmp->ob_size - 3);
tmp2->limbs[0] = 819;
- tmp2->exp = -3;
+// tmp2->exp = -3;
+ exp_inp_sub_i(&(tmp2->exp), 3);
}
firstprec = ctx2->prec;
@@ -2514,10 +2564,11 @@
Py_DECREF(ans);
ans = tmp2;
tmp2 = 0;
- ans->exp -= 1 + ADJUSTED(tmp)/2;
+/* ans->exp -= 1 + ADJUSTED(tmp)/2;
if (1 + ADJUSTED(tmp) < 0 && (1 + ADJUSTED(tmp)) % 2)
ans->exp --;
-
+ */
+ exp_inp_sub(&(ans->exp),exp_add_i(exp_floordiv_i(ADJUSTED(tmp), 2), 1));
Emax = ctx2->Emax;
Emin = ctx2->Emin;
@@ -2571,9 +2622,10 @@
goto err;
ctx2->prec = firstprec + 1;
- if (prevexp != ADJUSTED(tmp2)) {
+ if (exp_ne(prevexp, ADJUSTED(tmp2))) {
_limb_first_n_digits(tmp2->limbs, tmp2->ob_size, 0, ans->limbs, ans->ob_size);
- ans->exp --;
+// ans->exp --;
+ exp_dec(&(ans->exp));
}
else {
ans->limb_count = tmp2->limb_count;
@@ -2588,7 +2640,7 @@
{
int cmp;
decimalobject *lower;
- half->exp = ans->exp - 1;
+ half->exp = exp_sub_i(ans->exp, 1);
half->limbs[0] = 5;
lower = _do_decimal_subtract(ans, half, ctx2);
if (!lower)
@@ -2622,7 +2674,7 @@
}
else {
decimalobject *upper;
- half->exp = ans->exp-1;
+ half->exp = exp_sub_i(ans->exp, 1);
half->limbs[0] = 5;
upper = _do_decimal_add(ans, half, ctx2);
if (!upper)
@@ -2655,7 +2707,8 @@
}
}
- ans->exp += expadd;
+// ans->exp += expadd;
+ exp_inp_add(&(ans->exp), expadd);
ctx2->rounding = rounding;
tmp2 = _decimal_fix(ans, ctx2);
@@ -2691,10 +2744,11 @@
}
else {
- long exp = self->exp / 2;
+/* long exp = self->exp / 2;
if (self->exp < 0 && self->exp % 2)
- exp --;
- ctx2->prec += ans->exp - exp;
+ exp --;*/
+ exp_t exp = exp_floordiv_i(self->exp, 2);
+ ctx2->prec += exp_to_i(exp_sub(ans->exp, exp));
tmp2 = _decimal_rescale(ans, exp, ctx2, -1, 1);
if (!tmp2)
@@ -2753,7 +2807,8 @@
if (_check_nans(self, NULL, ctx, &nan) != 0)
return nan;
}
- if (self->exp >= 0) {
+// if (self->exp >= 0) {
+ if (exp_is_pos(self->exp) || exp_is_zero(self->exp)) {
Py_INCREF(self);
return self;
}
@@ -2933,7 +2988,8 @@
{
char outbuf[FORMAT_SIZE];
char *p, *end;
- int roundexp, i;
+ int i;
+ exp_t roundexp;
end = &outbuf[FORMAT_SIZE-1];
p = outbuf;
@@ -2941,39 +2997,40 @@
*p++ = '-';
SANITY_CHECK(p);
}
- if (d->exp < 0 && d->exp >= -6) {
-
+// if (d->exp < 0 && d->exp >= -6) {
+ if(exp_is_neg(d->exp) && exp_ge_i(d->exp, -6)) {
i = 0;
*p++ = '0';
*p++ = '.';
- for (;i<(-d->exp);i++)
+ for (;i<exp_to_i(exp_neg(d->exp));i++)
*p++ = '0';
SANITY_CHECK(p);
} else {
roundexp = d->exp;
- while(roundexp%3)
- roundexp ++;
- if (roundexp != d->exp) {
+ while(exp_mod_i(roundexp, 3))
+ exp_inc (&roundexp);
+ if (exp_ne(roundexp, d->exp)) {
*p++ = '0';
*p++ = '.';
- for (i = 0; i < (roundexp - d->exp); i++)
+ for (i = 0; i < (exp_to_i(exp_sub(roundexp, d->exp))); i++)
*p++ = '0';
SANITY_CHECK(p);
} else {
*p++ = '0';
}
- if (roundexp != 0) {
+ if (!exp_is_zero(roundexp)) {
if (context->capitals)
*p++ = 'E';
else
*p++ = 'e';
SANITY_CHECK(p);
- if (roundexp > 0)
+ if (exp_is_pos(roundexp))
*p++ = '+';
SANITY_CHECK(p);
- p += sprintf(p, "%d", roundexp);
+// p += sprintf(p, "%d", roundexp);
+ p += exp_sprintf(p, roundexp);
SANITY_CHECK(p);
}
}
@@ -3002,7 +3059,8 @@
{
char *outbuf;
int buflen, i;
- int dotplace, adjexp;
+ int dotplace;
+ exp_t adjexp;
int append_E = 0, append_adjexp = 0;
long extra_zeros=0;
char *p, *end;
@@ -3042,7 +3100,7 @@
/* (1) dotplace = -adjexp + d->exp + d->ob_size */
/* why is that like? well, d->exp moves dot right, d->ob_size moves dot right
* and adjexp moves dot left */
- adjexp = 0;
+ adjexp = exp_from_i(0);
dotplace = d->exp + d->ob_size;
/* dotplace must be at most d->ob_size (no dot at all) and at last -5 (6 pre-zeros)*/
if(dotplace >d->ob_size || dotplace <-5)
@@ -3050,13 +3108,14 @@
/* ok, we have to put dot after 1 digit, that is dotplace = 1
* we compute adjexp from equation (1) */
dotplace = 1;
- adjexp = -dotplace + d->exp + d->ob_size;
+// adjexp = -dotplace + d->exp + d->ob_size;
+ adjexp = exp_add_i(d->exp, d->ob_size - dotplace);
}
if(engineering) /* we have to be sure, adjexp%3 == 0 */
- while(adjexp%3)
+ while(exp_mod_i(adjexp,3))
{
- adjexp --;
+ exp_dec(&adjexp);
dotplace ++;
}
@@ -3086,7 +3145,7 @@
/* that was a way easier way =] */
- if(adjexp)
+ if(!exp_is_zero(adjexp))
{
append_E = 1;
append_adjexp = 1;
@@ -3095,7 +3154,7 @@
if (context->capitals) {
*p++ = 'E';
SANITY_CHECK(p);
- if (adjexp > 0) {
+ if (exp_is_pos(adjexp)) {
*p++ = '+';
SANITY_CHECK(p);
}
@@ -3106,7 +3165,7 @@
}
if (append_adjexp) {
- p += sprintf(p, "%d", adjexp);
+ p += exp_sprintf(p, adjexp);
}
SANITY_CHECK(p);
*p++ = 0;
@@ -3295,15 +3354,15 @@
}
if (self_is_zero) {
- long exp;
+ exp_t exp;
if (divmod) {
decimalobject *ans1, *ans2;
ans2 = _decimal_get_copy(self);
if (!ans2)
return NULL;
- ans2->exp = self->exp < other->exp ? self->exp : other->exp;
-
+ //ans2->exp = self->exp < other->exp ? self->exp : other->exp;
+ ans2->exp = exp_min(self->exp, other->exp);
ans1 = _NEW_decimalobj(1, sign, 0);
if (!ans1) {
Py_DECREF(ans2);
@@ -3317,15 +3376,15 @@
return ans;
}
- exp = self->exp - other->exp;
+ exp = exp_sub(self->exp, other->exp);
- if (exp < ETINY(ctx)) {
+ if (exp_l(exp, ETINY(ctx))) {
exp = ETINY(ctx);
if (handle_Clamped(ctx, "0e-x / y"))
return NULL;
}
- if (exp > ctx->Emax) {
+ if (exp_g(exp, ctx->Emax)) {
exp = ctx->Emax;
if (handle_Clamped(ctx, "0e+x / y"))
return NULL;
@@ -3387,8 +3446,9 @@
if (divmod == 1 || divmod == 3) {
decimalobject *tmp;
- long exp;
- exp = self->exp < other->exp ? self->exp : other->exp;
+ exp_t exp;
+// exp = self->exp < other->exp ? self->exp : other->exp;
+ exp = exp_min(self->exp, other->exp);
tmp = _decimal_rescale(self, exp, ctx, -1, 0);
if (!tmp) {
@@ -3434,12 +3494,12 @@
/* if divmod, then op1->exp - op2->exp must be divisible by LOG */
if (divmod) {
- long expdiff = op1->exp - op2->exp;
- long tmp_exp = op1->exp;
+ exp_t expdiff = exp_sub(op1->exp, op2->exp);
+ exp_t tmp_exp = op1->exp;
/* we will do in loop, because of some issues with x % a, when x < 0 */
- while (expdiff % LOG) {
- tmp_exp --;
- expdiff = tmp_exp - op2->exp;
+ while (exp_mod_i(expdiff, LOG)) {
+ exp_dec(&tmp_exp);
+ expdiff = exp_sub(tmp_exp, op2->exp);
}
op1 = _decimal_rescale(self, tmp_exp, ctx, -1, 0);
@@ -3451,14 +3511,14 @@
decimalobject *result;
decimalobject *remainder_ret;
long *remainder;
- long exp = op1->exp - op2->exp;
- long expdiff;
+ exp_t exp = exp_sub(op1->exp, op2->exp);
+ exp_t expdiff;
long rlimbs;
long old_size;
- long adj1, adj2, adjusted = 0;
+ exp_t adj1, adj2, adjusted = exp_from_i(0);
long i;
long max_size;
- long min_expdiff; /* used when divmod */
+ exp_t min_expdiff; /* used when divmod */
long remainder_limbs = op2->limb_count+1 > op1->limb_count ?
op2->limb_count + 1 : op1->limb_count;
@@ -3501,25 +3561,28 @@
* expdiff * LOG >= -exp - LOG
* expdiff >= (-exp - LOG)/ LOG */
if (divmod) {
- assert(!(exp % LOG));
- min_expdiff = (-exp - LOG)/ LOG;
-
+ assert(!(exp_mod_i(exp, LOG)));
+// min_expdiff = (-exp - LOG)/ LOG;
+ min_expdiff = exp_sub_i(exp_neg(exp), LOG);
+ min_expdiff = exp_floordiv_i(min_expdiff, LOG);
+
}
else
- min_expdiff = op1->limb_count;
+ min_expdiff = exp_from_i(op1->limb_count);
- expdiff = _limb_divide(op1->limbs, op1->limb_count,
+ expdiff = exp_from_i(_limb_divide(op1->limbs, op1->limb_count,
op2->limbs, op2->limb_count, result->limbs,
- significant_limbs, remainder, min_expdiff);
+ significant_limbs, remainder, min_expdiff));
result->limbs[significant_limbs] = 0;
- exp += expdiff * LOG + LOG;
-
+// exp += expdiff * LOG + LOG;
+ exp_inp_add(&exp, exp_add_i(exp_mul_i(expdiff, LOG), LOG));
+
rlimbs = _limb_size(remainder, remainder_limbs);
/* remainder non-zero */
if (!(rlimbs == 1 && remainder[0] == 0)) {
/* we have not enough precision to do exact integer division */
- if (divmod && exp < 0) {
+ if (divmod && exp_is_neg(exp)) {
Py_DECREF(op1);
Py_DECREF(result);
PyMem_FREE(remainder);
@@ -3527,7 +3590,8 @@
}
if (!divmod) {
- exp -= LOG;
+// exp -= LOG;
+ exp_inp_sub_i(&exp, LOG);
result->limb_count ++;
result->ob_size += LOG;
_limb_move_left(result->limbs, result->limb_count, 1);
@@ -3545,8 +3609,8 @@
}
for (i = 0; i< rlimbs; i++)
remainder_ret->limbs[i] = remainder[i];
- if (expdiff <= 0)
- remainder_ret->exp = op1->exp + expdiff * LOG;
+ if (exp_le_i(expdiff, 0))
+ remainder_ret->exp = exp_add(op1->exp, exp_mul_i( expdiff, LOG));
else
remainder_ret->exp = op1->exp;
remainder_ret->ob_size = _limb_size_s(remainder_ret->limbs, remainder_ret->ob_size);
@@ -3581,16 +3645,17 @@
while (result->ob_size > adjusted &&
_limb_get_digit(result->limbs, result->ob_size, result->ob_size -1) == 0) {
/* when int dividing, exp should be 0 */
- if (result->exp >= 0 && divmod)
+ if (exp_ge_i(result->exp, 0) && divmod)
break;
_limb_cut_one_digit(result->limbs, result->ob_size);
result->ob_size --;
- result->exp ++;
+// result->exp ++;
+ exp_inc(&(result->exp));
}
result->limb_count = (result->ob_size + LOG -1)/LOG;
- if (result->ob_size + result->exp > ctx->prec && shouldround && divmod) {
+ if (exp_add_i(result->exp, result->ob_size) > ctx->prec && shouldround && divmod) {
Py_DECREF(remainder_ret);
Py_DECREF(result);
Py_DECREF(op1);
@@ -3604,7 +3669,7 @@
/* we need to rescale, to be compatible with python implementation */
PyObject *flags = 0;
contextobject *ctx2 = 0;
- long remainder_exp = self->exp < other->exp ? self->exp : other->exp;
+ exp_t remainder_exp = exp_min(self->exp, other->exp);
decimalobject *rescaled_rem = 0;
decimalobject *rescaled = _decimal_rescale(result, 0, ctx, -1, 0);
if (!rescaled) {
@@ -3834,7 +3899,7 @@
assert(PyDecimal_Check(other));
assert(PyDecimal_Check(self));
int shouldround, sign, negativezero = 0;
- long exp, oexp;
+ exp_t exp, oexp;
long prec,cmp;
decimalobject *res, *res2, *ret, *o1, *o2;
prec = ctx->prec;
@@ -3857,8 +3922,8 @@
}
shouldround = (ctx->rounding_dec == ALWAYS_ROUND);
- exp = (self->exp < other->exp ? self->exp : other->exp);
-
+// exp = (self->exp < other->exp ? self->exp : other->exp);
+ exp = exp_min(self->exp, other->exp);
if (ctx->rounding == ROUND_FLOOR &&
(self->sign & 1) != (other->sign & 1))
negativezero = 1;
@@ -3877,8 +3942,10 @@
return res;
}
if (!decimal_nonzero(self)) {
- oexp = other->exp - ctx->prec - 1;
- exp = (exp > oexp ? exp : oexp);
+// oexp = other->exp - ctx->prec - 1;
+ oexp = exp_sub(other->exp, ctx->prec + 1);
+// exp = (exp > oexp ? exp : oexp);
+ exp = exp_max(exp, oexp);
res = _decimal_rescale(other, exp, ctx, 0, 0);
if (!res) return NULL;
if (shouldround) {
@@ -3890,8 +3957,10 @@
}
}
if (!decimal_nonzero(other)) {
- oexp = self->exp - ctx->prec - 1;
- exp = (exp > oexp ? exp : oexp);
+// oexp = self->exp - ctx->prec - 1;
+ oexp = exp_sub(self->exp, ctx->prec + 1);
+// exp = (exp > oexp ? exp : oexp);
+ exp = exp_max(exp, oexp);
res = _decimal_rescale(self, exp, ctx, 0, 0);
if (!res) return NULL;
if (shouldround) {
@@ -3906,7 +3975,7 @@
decimalobject *tmp; /* we borrow refference */
decimalobject *oother;
- long numdigits = self->exp - other->exp;
+ long numdigits = exp_to_i(exp_sub(self->exp, other->exp));
if(numdigits < 0)
{
@@ -3928,7 +3997,7 @@
long extend = prec + 2 - tmp->ob_size;
if(extend <= 0)
extend = 1;
- o1 = _NEW_decimalobj(tmp->ob_size + extend, tmp->sign, tmp->exp - extend);
+ o1 = _NEW_decimalobj(tmp->ob_size + extend, tmp->sign, exp_sub_i(tmp->exp, extend));
if(!o1)
return NULL;
@@ -3944,7 +4013,7 @@
}
}
- o1 = _NEW_decimalobj(tmp->ob_size + numdigits, tmp->sign, tmp->exp - numdigits);
+ o1 = _NEW_decimalobj(tmp->ob_size + numdigits, tmp->sign, exp_sub_i(tmp->exp, numdigits));
_limb_first_n_digits(tmp->limbs, tmp->ob_size, 0, o1->limbs, o1->ob_size);
if(!o1)
@@ -3957,7 +4026,7 @@
calc:
- assert(o1->exp == o2->exp);
+ assert(exp_eq(o1->exp, o2->exp));
exp = o1->exp;
cmp = _limb_compare(o1->limbs, o1->limb_count, o2->limbs, o2->limb_count);
@@ -3971,7 +4040,7 @@
if(!ret)
return NULL;
ret->limbs[0] = 0;
- if(ret->exp < ETINY(ctx))
+ if(exp_l(ret->exp, ETINY(ctx)))
{
ret->exp = ETINY(ctx);
if(handle_Clamped(ctx, NULL) != 0)
@@ -4096,7 +4165,7 @@
contextobject *ctx)
{
long resultsign;
- long resultexp;
+ exp_t resultexp;
long shouldround;
long i;
long max_limbs;
@@ -4132,7 +4201,7 @@
}
}
- resultexp = self->exp + other->exp;
+ resultexp = exp_add(self->exp, other->exp);
shouldround = ctx->rounding_dec == ALWAYS_ROUND;
if(!decimal_nonzero(self) || !decimal_nonzero(other))
@@ -4622,7 +4691,7 @@
char *buf;
PyObject *res;
- max = self->ob_size + self->exp;
+ max = exp_to_i(self->ob_size) + self->exp;
buf = PyMem_MALLOC(max + 2); /* with sign */
if (!buf) return NULL;
@@ -4660,7 +4729,7 @@
}
/* try converting to int, if it's too big, convert to long */
- max = self->ob_size + self->exp;
+ max = self->ob_size + exp_to_i(self->exp);
#if SIZEOF_LONG == 4
if (max > 9) return decimal_long(self);
@@ -4756,17 +4825,20 @@
int
_decimal_isint(decimalobject *d)
{
- long i;
+ exp_t i;
+ exp_t cmp;
- if (d->exp >= 0)
+ if (exp_ge_i(d->exp, 0))
/* if the exponent is positive, there's no
* fractional part at all. */
return 1;
/* Here comes the tricky part. We have to find out
* whether the fractional part consists only of zeroes. */
- for (i = d->ob_size-1; i >= d->ob_size + d->exp; --i) {
+ i = exp_from_i(d->ob_size-1);
+ cmp = exp_add_i(d->exp, d->ob_size);
+ for (; exp_ge(i, cmp); --i) {
/* if (d->digits[i] > 0)*/
- if(_limb_get_digit(d->limbs, d->ob_size, i) > 0)
+ if(_limb_get_digit(d->limbs, d->ob_size, exp_to_i(i)) > 0)
return 0;
}
return 1;
@@ -4834,7 +4906,7 @@
if (size > ctx->prec)
return handle_ConversionSyntax(type, ctx, "diagnostic info too long");
- new = _new_decimalobj(type, size, sign, 0);
+ new = _new_decimalobj(type, size, sign, exp_from_i(0));
if (!new)
return NULL;
for (i = 0; i < size; i++)
@@ -4884,7 +4956,7 @@
}
} while (*++p);
if (sign) {
- new = _new_decimalobj(type, 1, sign, 0);
+ new = _new_decimalobj(type, 1, sign, exp_from_i(0));
if (!new)
return NULL;
new->digits[0] = 0;
@@ -4899,7 +4971,7 @@
_decimal_fromliteral(PyTypeObject *type, char *str, long len, contextobject *ctx)
{
char sign = 0;
- long exp = 0;
+ exp_t exp = exp_from_i(0);
long ndigits;
long zero = 0; /* if zero */
long digits_after_dot = 0;
@@ -5002,14 +5074,15 @@
if(e) /* pretty obvious */
{
int ok;
- ok = sscanf(e+1,"%d", &exp);
+// ok = sscanf(e+1,"%d", &exp);
+ ok = exp_sscanf(e+1, &exp);
if(ok!=1)
{
goto err;
}
}
- exp -= digits_after_dot;
-
+// exp -= digits_after_dot;
+ exp_inp_sub_i(&exp, digits_after_dot);
new = _new_decimalobj(type, ndigits, sign, exp);
if(!new)
return NULL;
@@ -5082,7 +5155,7 @@
int ndigits = 0, neg = 0, i = 0;
if (value == 0) {
- new = _new_decimalobj(type, 1, 0, 0);
+ new = _new_decimalobj(type, 1, 0, exp_from_i(0));
if (!new) return NULL;
new->digits[0] = 0;
new->limbs[0] = 0;
@@ -5132,7 +5205,9 @@
decimalobject *new = NULL;
PyObject *tup, *digits, *digtup = NULL, *item;
int sign;
- long i, exp;
+ long i;
+ exp_t exp;
+ long tmp_exp;
if (PyTuple_Check(seq)) {
tup = seq;
@@ -5147,8 +5222,11 @@
PyErr_SetString(PyExc_ValueError, "Invalid arguments");
goto err;
}
- if (!PyArg_ParseTuple(tup, "iOl", &sign, &digits, &exp))
+
+ /* TODO big exponents reading */
+ if (!PyArg_ParseTuple(tup, "iOl", &sign, &digits, &tmp_exp))
goto err;
+ exp = exp_from_i(tmp_exp);
if (sign < 0 || sign > 7) {
PyErr_SetString(PyExc_ValueError, "Invalid sign");
goto err;
@@ -5654,7 +5732,7 @@
static contextobject *
_new_contextobj(long prec, int rounding, int rounding_dec, PyObject *traps,
- PyObject *flags, long Emin, long Emax, int capitals,
+ PyObject *flags, exp_t Emin, exp_t Emax, int capitals,
int clamp, PyObject *ignored, int copy_dicts)
{
contextobject *new;
@@ -6510,7 +6588,7 @@
PyObject *_ignored = NULL;
int i, j;
- if (!PyArg_ParseTupleAndKeywords(args, kwds, "|nbbOOnnbbO:Context", kwlist,
+ if (!PyArg_ParseTupleAndKeywords(args, kwds, "|liiOOlliiO:Context", kwlist,
&prec, &rounding, &rounding_dec, &pytraps,
&pyflags, &Emin, &Emax, &capitals, &clamp,
&pyignored))
Modified: sandbox/trunk/decimal-c/decimal.h
==============================================================================
--- sandbox/trunk/decimal-c/decimal.h (original)
+++ sandbox/trunk/decimal-c/decimal.h Thu Jul 6 03:06:35 2006
@@ -7,6 +7,7 @@
extern "C" {
#endif
+#define exp_t long
#define BASE 1000 /* biggest value of limb power of 10 */
#define LOG 3 /* number of digits per limb LOG = log10(BASE) */
@@ -16,7 +17,7 @@
PyObject_HEAD
long ob_size; /* number of digits */
unsigned int sign; /* see sign constants above */
- long exp;
+ exp_t exp;
signed char *digits; /* digits are stored as the actual numbers 0-9 */
long limb_count; /* number of limbs */
long *limbs;
@@ -37,8 +38,8 @@
int clamp; /* change exponents if they are too high */
/* min/max exponents */
- long Emin;
- long Emax;
+ exp_t Emin;
+ exp_t Emax;
/* signal handling -- these are dicts */
PyObject *traps; /* if a signal is trapped */
Modified: sandbox/trunk/decimal-c/test_decimal.py
==============================================================================
--- sandbox/trunk/decimal-c/test_decimal.py (original)
+++ sandbox/trunk/decimal-c/test_decimal.py Thu Jul 6 03:06:35 2006
@@ -28,7 +28,7 @@
import glob
import os, sys
import pickle, copy
-from decimal import *
+from _decimal import *
from test.test_support import (TestSkipped, run_unittest, run_doctest,
is_resource_enabled)
import random
@@ -1047,9 +1047,11 @@
def test_pickle(self):
c = Context()
e = pickle.loads(pickle.dumps(c))
- for k in vars(c):
- v1 = vars(c)[k]
- v2 = vars(e)[k]
+ for k in dir(c):
+ #v1 = vars(c)[k]
+ v1 = c.__getattribute__(k)
+ #v2 = vars(e)[k]
+ v2 = e.__getattribute__(k)
self.assertEqual(v1, v2)
def test_equality_with_other_types(self):
@@ -1079,7 +1081,7 @@
DecimalExplicitConstructionTest,
DecimalImplicitConstructionTest,
DecimalArithmeticOperatorsTest,
- #DecimalUseOfContextTest,
+ DecimalUseOfContextTest,
DecimalUsabilityTest,
DecimalPythonAPItests,
ContextAPItests,
@@ -1088,7 +1090,7 @@
try:
run_unittest(*test_classes)
- import decimal as DecimalModule
+ import _decimal as DecimalModule
run_doctest(DecimalModule, verbose)
finally:
setcontext(ORIGINAL_CONTEXT)
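The exp_* helpers that this commit starts calling (exp_min, exp_max, exp_sub, exp_from_i, exp_sscanf and friends) are defined elsewhere in _decimal.c and do not show up in the hunks above. With exp_t still the plain long that decimal.h defines, they can be little more than named wrappers around ordinary long arithmetic; a rough sketch of what such helpers might look like (illustrative only, not the committed code):

    /* Sketch of the exp_t helper layer, assuming exp_t is a plain long.
       The real definitions live in _decimal.c and may differ. */
    #include <stdio.h>

    #define exp_t long

    static exp_t exp_from_i(long v)          { return (exp_t)v; }
    static long  exp_to_i(exp_t e)           { return (long)e; }
    static exp_t exp_add(exp_t a, exp_t b)   { return a + b; }
    static exp_t exp_sub(exp_t a, exp_t b)   { return a - b; }
    static exp_t exp_add_i(exp_t a, long b)  { return a + b; }
    static exp_t exp_sub_i(exp_t a, long b)  { return a - b; }
    static exp_t exp_min(exp_t a, exp_t b)   { return a < b ? a : b; }
    static exp_t exp_max(exp_t a, exp_t b)   { return a > b ? a : b; }
    static int   exp_eq(exp_t a, exp_t b)    { return a == b; }
    static int   exp_ge(exp_t a, exp_t b)    { return a >= b; }
    static int   exp_ge_i(exp_t a, long b)   { return a >= b; }
    static int   exp_l(exp_t a, exp_t b)     { return a < b; }
    static void  exp_inp_sub_i(exp_t *a, long b)      { *a -= b; }
    static int   exp_sscanf(const char *s, exp_t *o)  { return sscanf(s, "%ld", o); }

The value of routing every exponent operation through such a layer is that exp_t can later be widened to hold arbitrarily large exponents (note the "TODO big exponents reading" comment above) without having to revisit each call site.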
From nnorwitz at gmail.com Thu Jul 6 05:24:38 2006
From: nnorwitz at gmail.com (Neal Norwitz)
Date: Wed, 5 Jul 2006 20:24:38 -0700
Subject: [Python-checkins] r47250 - sandbox/trunk/decimal-c/_decimal.c
sandbox/trunk/decimal-c/decimal.h
sandbox/trunk/decimal-c/test_decimal.py
In-Reply-To: <20060706010640.CC0E51E4003@bag.python.org>
References: <20060706010640.CC0E51E4003@bag.python.org>
Message-ID:
// comments should not be used since Python conforms to C89. See PEP 7.
n
--
On 7/5/06, mateusz.rukowicz wrote:
> Author: mateusz.rukowicz
> Date: Thu Jul 6 03:06:35 2006
> New Revision: 47250
>
> Modified:
> sandbox/trunk/decimal-c/_decimal.c
> sandbox/trunk/decimal-c/decimal.h
> sandbox/trunk/decimal-c/test_decimal.py
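For reference, PEP 7 asks for C89-compatible source, and C89 has no //-style comments; old code that has to stay visible belongs in /* */ blocks, or should simply be deleted rather than commented out. Rewriting one of the hunks above that way would look roughly like:

    /* exp = (self->exp < other->exp ? self->exp : other->exp); */
    exp = exp_min(self->exp, other->exp);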
From nnorwitz at gmail.com Thu Jul 6 05:56:49 2006
From: nnorwitz at gmail.com (Neal Norwitz)
Date: Wed, 5 Jul 2006 20:56:49 -0700
Subject: [Python-checkins] r47206 - in python/trunk:
Lib/ctypes/test/test_win32.py Modules/_ctypes/callproc.c
In-Reply-To: <20060703080815.22FA41E4008@bag.python.org>
References: <20060703080815.22FA41E4008@bag.python.org>
Message-ID:
On 7/3/06, thomas.heller wrote:
> Author: thomas.heller
> Date: Mon Jul 3 10:08:14 2006
> New Revision: 47206
>
> Modified:
> python/trunk/Lib/ctypes/test/test_win32.py
> python/trunk/Modules/_ctypes/callproc.c
> Log:
> Add a new function uses_seh() to the _ctypes extension module. This
> will return True if Windows Structured Exception handling (SEH) is
> used when calling functions, False otherwise.
>
> Currently, only MSVC supports SEH.
>
> Fix the test so that it doesn't crash when run with MingW compiled
> _ctypes. Note that two tests are still failing when mingw is used, I
> suspect structure layout differences and function calling conventions
> between MSVC and MingW.
This seems like a really minor/limited feature. Can't this wait for
2.6? I would prefer this change was reverted.
n
From python-checkins at python.org Thu Jul 6 06:29:04 2006
From: python-checkins at python.org (neal.norwitz)
Date: Thu, 6 Jul 2006 06:29:04 +0200 (CEST)
Subject: [Python-checkins] r47251 - python/trunk/Python/import.c
Message-ID: <20060706042904.F2A0F1E4008@bag.python.org>
Author: neal.norwitz
Date: Thu Jul 6 06:28:59 2006
New Revision: 47251
Modified:
python/trunk/Python/import.c
Log:
Fix refleaks reported by Shane Hathaway in SF patch #1515361. This change
contains only the changes related to leaking the copy variable.
Modified: python/trunk/Python/import.c
==============================================================================
--- python/trunk/Python/import.c (original)
+++ python/trunk/Python/import.c Thu Jul 6 06:28:59 2006
@@ -1237,8 +1237,10 @@
importer = get_path_importer(path_importer_cache,
path_hooks, v);
- if (importer == NULL)
+ if (importer == NULL) {
+ Py_XDECREF(copy);
return NULL;
+ }
/* Note: importer is a borrowed reference */
if (importer == Py_False) {
/* Cached as not being a valid dir. */
@@ -1273,6 +1275,7 @@
loader = PyObject_CallMethod(importer,
"find_module",
"s", fullname);
+ Py_XDECREF(copy);
if (loader == NULL)
return NULL; /* error */
if (loader != Py_None) {
@@ -1281,7 +1284,6 @@
return &importhookdescr;
}
Py_DECREF(loader);
- Py_XDECREF(copy);
continue;
}
}
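The shape of the fix is the usual CPython ownership rule: copy holds an owned (possibly NULL) reference inside the importer loop, so every early exit has to release it exactly once, and the normal path can drop it as soon as the find_module() call no longer needs it. A minimal sketch of the pattern, with hypothetical names rather than the actual import.c code:

    #include <Python.h>

    /* Illustrative sketch only.  The caller hands over an owned reference
       'copy' (which may be NULL); it must be released on every path out of
       this helper, exactly once. */
    static PyObject *
    find_loader_releasing_copy(PyObject *importer, PyObject *copy,
                               const char *fullname)
    {
        PyObject *loader;

        if (importer == NULL) {
            Py_XDECREF(copy);       /* early exit: release before returning */
            return NULL;
        }
        loader = PyObject_CallMethod(importer, "find_module", "s", fullname);
        Py_XDECREF(copy);           /* copy is not needed past this point */
        if (loader == NULL)
            return NULL;            /* exception set by find_module */
        return loader;              /* new reference, owned by the caller */
    }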
From python-checkins at python.org Thu Jul 6 06:33:23 2006
From: python-checkins at python.org (neal.norwitz)
Date: Thu, 6 Jul 2006 06:33:23 +0200 (CEST)
Subject: [Python-checkins] r47252 - peps/trunk/pep-0361.txt
Message-ID: <20060706043323.B9E641E4003@bag.python.org>
Author: neal.norwitz
Date: Thu Jul 6 06:33:23 2006
New Revision: 47252
Modified:
peps/trunk/pep-0361.txt
Log:
Add some things I'd like to see done for 2.6, plus the outstanding issues that are being carried over from 2.5
Modified: peps/trunk/pep-0361.txt
==============================================================================
--- peps/trunk/pep-0361.txt (original)
+++ peps/trunk/pep-0361.txt Thu Jul 6 06:33:23 2006
@@ -85,6 +85,14 @@
http://python.org/sf/1505257
(Owner: MAL)
+ Start removing deprecated features and generally moving towards Py3k
+
+ Replace all old style tests (operate on import) with unittest or doctest
+
+ All tests for all untested modules
+
+ Document undocumented modules/features
+
Deferred until 2.7
@@ -93,7 +101,14 @@
Open issues
- None
+ How should import warnings be handled?
+ http://mail.python.org/pipermail/python-dev/2006-June/066345.html
+ http://python.org/sf/1515609
+ http://python.org/sf/1515361
+
+ How should -m work with packages?
+ How should -m work with relative imports?
+ http://mail.python.org/pipermail/python-dev/2006-June/066161.html
Copyright
From buildbot at python.org Thu Jul 6 06:34:49 2006
From: buildbot at python.org (buildbot at python.org)
Date: Thu, 06 Jul 2006 04:34:49 +0000
Subject: [Python-checkins] buildbot failure in ia64 Debian unstable trunk
Message-ID: <20060706043449.6BCFD1E4003@bag.python.org>
The Buildbot has detected a new failure of ia64 Debian unstable trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/ia64%2520Debian%2520unstable%2520trunk/builds/825
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl,neal.norwitz
BUILD FAILED: failed svn
sincerely,
-The Buildbot
From python-checkins at python.org Thu Jul 6 07:13:23 2006
From: python-checkins at python.org (fred.drake)
Date: Thu, 6 Jul 2006 07:13:23 +0200 (CEST)
Subject: [Python-checkins] r47253 - in python/trunk/Modules:
expat/xmlparse.c pyexpat.c
Message-ID: <20060706051323.20D8E1E400A@bag.python.org>
Author: fred.drake
Date: Thu Jul 6 07:13:22 2006
New Revision: 47253
Modified:
python/trunk/Modules/expat/xmlparse.c
python/trunk/Modules/pyexpat.c
Log:
- back out Expat change; the final fix to Expat will be different
- change the pyexpat wrapper to not be so sensitive to this detail of the
Expat implementation (the ex-crasher test still passes)
Modified: python/trunk/Modules/expat/xmlparse.c
==============================================================================
--- python/trunk/Modules/expat/xmlparse.c (original)
+++ python/trunk/Modules/expat/xmlparse.c Thu Jul 6 07:13:22 2006
@@ -2552,8 +2552,6 @@
(int)(dataPtr - (ICHAR *)dataBuf));
if (s == next)
break;
- if (ps_parsing == XML_FINISHED || ps_parsing == XML_SUSPENDED)
- break;
*eventPP = s;
}
}
Modified: python/trunk/Modules/pyexpat.c
==============================================================================
--- python/trunk/Modules/pyexpat.c (original)
+++ python/trunk/Modules/pyexpat.c Thu Jul 6 07:13:22 2006
@@ -238,6 +238,18 @@
return 0;
}
+/* Dummy character data handler used when an error (exception) has
+ been detected, and the actual parsing can be terminated early.
+ This is needed since character data handler can't be safely removed
+ from within the character data handler, but can be replaced. It is
+ used only from the character data handler trampoline, and must be
+ used right after `flag_error()` is called. */
+static void
+noop_character_data_handler(void *userData, const XML_Char *data, int len)
+{
+ /* Do nothing. */
+}
+
static void
flag_error(xmlparseobject *self)
{
@@ -457,6 +469,8 @@
if (temp == NULL) {
Py_DECREF(args);
flag_error(self);
+ XML_SetCharacterDataHandler(self->itself,
+ noop_character_data_handler);
return -1;
}
PyTuple_SET_ITEM(args, 0, temp);
@@ -469,6 +483,8 @@
Py_DECREF(args);
if (temp == NULL) {
flag_error(self);
+ XML_SetCharacterDataHandler(self->itself,
+ noop_character_data_handler);
return -1;
}
Py_DECREF(temp);
@@ -1542,8 +1558,22 @@
xmlhandler c_handler = NULL;
PyObject *temp = self->handlers[handlernum];
- if (v == Py_None)
+ if (v == Py_None) {
+ /* If this is the character data handler, and a character
+ data handler is already active, we need to be more
+ careful. What we can safely do is replace the existing
+ character data handler callback function with a no-op
+ function that will refuse to call Python. The downside
+ is that this doesn't completely remove the character
+ data handler from the C layer if there's any callback
+ active, so Expat does a little more work than it
+ otherwise would, but that's really an odd case. A more
+ elaborate system of handlers and state could remove the
+ C handler more effectively. */
+ if (handlernum == CharacterData && self->in_callback)
+ c_handler = noop_character_data_handler;
v = NULL;
+ }
else if (v != NULL) {
Py_INCREF(v);
c_handler = handler_info[handlernum].handler;
From python-checkins at python.org Thu Jul 6 07:26:29 2006
From: python-checkins at python.org (jackilyn.hoxworth)
Date: Thu, 6 Jul 2006 07:26:29 +0200 (CEST)
Subject: [Python-checkins] r47254 -
python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py
Message-ID: <20060706052629.8E17F1E4003@bag.python.org>
Author: jackilyn.hoxworth
Date: Thu Jul 6 07:26:29 2006
New Revision: 47254
Added:
python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py
Log:
Added items remotely
test_soc_httplib.py
httplib.py
Added: python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py
==============================================================================
--- (empty file)
+++ python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py Thu Jul 6 07:26:29 2006
@@ -0,0 +1,33 @@
+#!/usr/bin/env python
+
+""""
+Test harness for the standard library logging module.
+
+"""
+
+import logging
+import httplib
+from cStringIO import StringIO
+
+log=logging.getLogger("py.httplib")
+stringLog = StringIO()
+
+# define the handler and level
+handler = logging.StreamHandler(stringLog)
+log.setLevel(logging.INFO)
+
+# set a format for the output
+formatter = logging.Formatter('%(name)s: %(levelname)s %(message)s')
+handler.setFormatter(formatter)
+
+# add the handler to the logger
+log.addHandler(handler)
+
+httplib.HTTPResponse(sock).log("message 1")
+httplib.HTTPConnection().log("message 2")
+print stringLog.getvalue() # For testing purposes
+
+if stringLog.getvalue() != "Error: It worked":
+ print "it worked"
+else:
+ print "it didn't work"
From python-checkins at python.org Thu Jul 6 07:40:25 2006
From: python-checkins at python.org (jackilyn.hoxworth)
Date: Thu, 6 Jul 2006 07:40:25 +0200 (CEST)
Subject: [Python-checkins] r47255 -
python/branches/hoxworth-stdlib_logging-soc/httplib.py
Message-ID: <20060706054025.E4F5C1E4003@bag.python.org>
Author: jackilyn.hoxworth
Date: Thu Jul 6 07:40:23 2006
New Revision: 47255
Modified:
python/branches/hoxworth-stdlib_logging-soc/httplib.py
Log:
Modified: python/branches/hoxworth-stdlib_logging-soc/httplib.py
==============================================================================
--- python/branches/hoxworth-stdlib_logging-soc/httplib.py (original)
+++ python/branches/hoxworth-stdlib_logging-soc/httplib.py Thu Jul 6 07:40:23 2006
@@ -1,1421 +1,1436 @@
-"""HTTP/1.1 client library
-
-
-
-
-HTTPConnection goes through a number of "states", which define when a client
-may legally make another request or fetch the response for a particular
-request. This diagram details these state transitions:
-
- (null)
- |
- | HTTPConnection()
- v
- Idle
- |
- | putrequest()
- v
- Request-started
- |
- | ( putheader() )* endheaders()
- v
- Request-sent
- |
- | response = getresponse()
- v
- Unread-response [Response-headers-read]
- |\____________________
- | |
- | response.read() | putrequest()
- v v
- Idle Req-started-unread-response
- ______/|
- / |
- response.read() | | ( putheader() )* endheaders()
- v v
- Request-started Req-sent-unread-response
- |
- | response.read()
- v
- Request-sent
-
-This diagram presents the following rules:
- -- a second request may not be started until {response-headers-read}
- -- a response [object] cannot be retrieved until {request-sent}
- -- there is no differentiation between an unread response body and a
- partially read response body
-
-Note: this enforcement is applied by the HTTPConnection class. The
- HTTPResponse class does not enforce this state machine, which
- implies sophisticated clients may accelerate the request/response
- pipeline. Caution should be taken, though: accelerating the states
- beyond the above pattern may imply knowledge of the server's
- connection-close behavior for certain requests. For example, it
- is impossible to tell whether the server will close the connection
- UNTIL the response headers have been read; this means that further
- requests cannot be placed into the pipeline until it is known that
- the server will NOT be closing the connection.
-
-Logical State __state __response
-------------- ------- ----------
-Idle _CS_IDLE None
-Request-started _CS_REQ_STARTED None
-Request-sent _CS_REQ_SENT None
-Unread-response _CS_IDLE
-Req-started-unread-response _CS_REQ_STARTED
-Req-sent-unread-response _CS_REQ_SENT
-"""
-
-import errno
-import mimetools
-import socket
-from urlparse import urlsplit
-
-import logging
-_log = logging.getLogger('py.httplib')
-
-try:
- from cStringIO import StringIO
-except ImportError:
- from StringIO import StringIO
-
-__all__ = ["HTTP", "HTTPResponse", "HTTPConnection", "HTTPSConnection",
- "HTTPException", "NotConnected", "UnknownProtocol",
- "UnknownTransferEncoding", "UnimplementedFileMode",
- "IncompleteRead", "InvalidURL", "ImproperConnectionState",
- "CannotSendRequest", "CannotSendHeader", "ResponseNotReady",
- "BadStatusLine", "error", "responses"]
-
-HTTP_PORT = 80
-HTTPS_PORT = 443
-
-_UNKNOWN = 'UNKNOWN'
-
-# connection states
-_CS_IDLE = 'Idle'
-_CS_REQ_STARTED = 'Request-started'
-_CS_REQ_SENT = 'Request-sent'
-
-# status codes
-# informational
-CONTINUE = 100
-SWITCHING_PROTOCOLS = 101
-PROCESSING = 102
-
-# successful
-OK = 200
-CREATED = 201
-ACCEPTED = 202
-NON_AUTHORITATIVE_INFORMATION = 203
-NO_CONTENT = 204
-RESET_CONTENT = 205
-PARTIAL_CONTENT = 206
-MULTI_STATUS = 207
-IM_USED = 226
-
-# redirection
-MULTIPLE_CHOICES = 300
-MOVED_PERMANENTLY = 301
-FOUND = 302
-SEE_OTHER = 303
-NOT_MODIFIED = 304
-USE_PROXY = 305
-TEMPORARY_REDIRECT = 307
-
-# client error
-BAD_REQUEST = 400
-UNAUTHORIZED = 401
-PAYMENT_REQUIRED = 402
-FORBIDDEN = 403
-NOT_FOUND = 404
-METHOD_NOT_ALLOWED = 405
-NOT_ACCEPTABLE = 406
-PROXY_AUTHENTICATION_REQUIRED = 407
-REQUEST_TIMEOUT = 408
-CONFLICT = 409
-GONE = 410
-LENGTH_REQUIRED = 411
-PRECONDITION_FAILED = 412
-REQUEST_ENTITY_TOO_LARGE = 413
-REQUEST_URI_TOO_LONG = 414
-UNSUPPORTED_MEDIA_TYPE = 415
-REQUESTED_RANGE_NOT_SATISFIABLE = 416
-EXPECTATION_FAILED = 417
-UNPROCESSABLE_ENTITY = 422
-LOCKED = 423
-FAILED_DEPENDENCY = 424
-UPGRADE_REQUIRED = 426
-
-# server error
-INTERNAL_SERVER_ERROR = 500
-NOT_IMPLEMENTED = 501
-BAD_GATEWAY = 502
-SERVICE_UNAVAILABLE = 503
-GATEWAY_TIMEOUT = 504
-HTTP_VERSION_NOT_SUPPORTED = 505
-INSUFFICIENT_STORAGE = 507
-NOT_EXTENDED = 510
-
-# Mapping status codes to official W3C names
-responses = {
- 100: 'Continue',
- 101: 'Switching Protocols',
-
- 200: 'OK',
- 201: 'Created',
- 202: 'Accepted',
- 203: 'Non-Authoritative Information',
- 204: 'No Content',
- 205: 'Reset Content',
- 206: 'Partial Content',
-
- 300: 'Multiple Choices',
- 301: 'Moved Permanently',
- 302: 'Found',
- 303: 'See Other',
- 304: 'Not Modified',
- 305: 'Use Proxy',
- 306: '(Unused)',
- 307: 'Temporary Redirect',
-
- 400: 'Bad Request',
- 401: 'Unauthorized',
- 402: 'Payment Required',
- 403: 'Forbidden',
- 404: 'Not Found',
- 405: 'Method Not Allowed',
- 406: 'Not Acceptable',
- 407: 'Proxy Authentication Required',
- 408: 'Request Timeout',
- 409: 'Conflict',
- 410: 'Gone',
- 411: 'Length Required',
- 412: 'Precondition Failed',
- 413: 'Request Entity Too Large',
- 414: 'Request-URI Too Long',
- 415: 'Unsupported Media Type',
- 416: 'Requested Range Not Satisfiable',
- 417: 'Expectation Failed',
-
- 500: 'Internal Server Error',
- 501: 'Not Implemented',
- 502: 'Bad Gateway',
- 503: 'Service Unavailable',
- 504: 'Gateway Timeout',
- 505: 'HTTP Version Not Supported',
-}
-
-# maximal amount of data to read at one time in _safe_read
-MAXAMOUNT = 1048576
-
-class HTTPMessage(mimetools.Message):
-
- def addheader(self, key, value):
- """Add header for field key handling repeats."""
- prev = self.dict.get(key)
- if prev is None:
- self.dict[key] = value
- else:
- combined = ", ".join((prev, value))
- self.dict[key] = combined
-
- def addcontinue(self, key, more):
- """Add more field data from a continuation line."""
- prev = self.dict[key]
- self.dict[key] = prev + "\n " + more
-
- def readheaders(self):
- """Read header lines.
-
- Read header lines up to the entirely blank line that terminates them.
- The (normally blank) line that ends the headers is skipped, but not
- included in the returned list. If a non-header line ends the headers,
- (which is an error), an attempt is made to backspace over it; it is
- never included in the returned list.
-
- The variable self.status is set to the empty string if all went well,
- otherwise it is an error message. The variable self.headers is a
- completely uninterpreted list of lines contained in the header (so
- printing them will reproduce the header exactly as it appears in the
- file).
-
- If multiple header fields with the same name occur, they are combined
- according to the rules in RFC 2616 sec 4.2:
-
- Appending each subsequent field-value to the first, each separated
- by a comma. The order in which header fields with the same field-name
- are received is significant to the interpretation of the combined
- field value.
- """
- # XXX The implementation overrides the readheaders() method of
- # rfc822.Message. The base class design isn't amenable to
- # customized behavior here so the method here is a copy of the
- # base class code with a few small changes.
-
- self.dict = {}
- self.unixfrom = ''
- self.headers = hlist = []
- self.status = ''
- headerseen = ""
- firstline = 1
- startofline = unread = tell = None
- if hasattr(self.fp, 'unread'):
- unread = self.fp.unread
- elif self.seekable:
- tell = self.fp.tell
- while True:
- if tell:
- try:
- startofline = tell()
- except IOError:
- startofline = tell = None
- self.seekable = 0
- line = self.fp.readline()
- if not line:
- self.status = 'EOF in headers'
- break
- # Skip unix From name time lines
- if firstline and line.startswith('From '):
- self.unixfrom = self.unixfrom + line
- continue
- firstline = 0
- if headerseen and line[0] in ' \t':
- # XXX Not sure if continuation lines are handled properly
- # for http and/or for repeating headers
- # It's a continuation line.
- hlist.append(line)
- self.addcontinue(headerseen, line.strip())
- continue
- elif self.iscomment(line):
- # It's a comment. Ignore it.
- continue
- elif self.islast(line):
- # Note! No pushback here! The delimiter line gets eaten.
- break
- headerseen = self.isheader(line)
- if headerseen:
- # It's a legal header line, save it.
- hlist.append(line)
- self.addheader(headerseen, line[len(headerseen)+1:].strip())
- continue
- else:
- # It's not a header line; throw it back and stop here.
- if not self.dict:
- self.status = 'No headers'
- else:
- self.status = 'Non-header line where header expected'
- # Try to undo the read.
- if unread:
- unread(line)
- elif tell:
- self.fp.seek(startofline)
- else:
- self.status = self.status + '; bad seek'
- break
-
-class HTTPResponse:
- # strict: If true, raise BadStatusLine if the status line can't be
- # parsed as a valid HTTP/1.0 or 1.1 status line. By default it is
- # false because it prevents clients from talking to HTTP/0.9
- # servers. Note that a response with a sufficiently corrupted
- # status line will look like an HTTP/0.9 response.
-
- # See RFC 2616 sec 19.6 and RFC 1945 sec 6 for details.
-
- def __init__(self, sock, debuglevel=0, strict=0, method=None):
- self.fp = sock.makefile('rb', 0)
- self.debuglevel = debuglevel
- self.strict = strict
- self._method = method
-
- self.msg = None
-
- # from the Status-Line of the response
- self.version = _UNKNOWN # HTTP-Version
- self.status = _UNKNOWN # Status-Code
- self.reason = _UNKNOWN # Reason-Phrase
-
- self.chunked = _UNKNOWN # is "chunked" being used?
- self.chunk_left = _UNKNOWN # bytes left to read in current chunk
- self.length = _UNKNOWN # number of bytes left in response
- self.will_close = _UNKNOWN # conn will close at end of response
-
- def _read_status(self):
- # Initialize with Simple-Response defaults
- line = self.fp.readline()
- if self.debuglevel > 0:
- print "reply:", repr(line)
- if not line:
- # Presumably, the server closed the connection before
- # sending a valid response.
- raise BadStatusLine(line)
- try:
- [version, status, reason] = line.split(None, 2)
- except ValueError:
- try:
- [version, status] = line.split(None, 1)
- reason = ""
- except ValueError:
- # empty version will cause next test to fail and status
- # will be treated as 0.9 response.
- version = ""
- if not version.startswith('HTTP/'):
- if self.strict:
- self.close()
- raise BadStatusLine(line)
- else:
- # assume it's a Simple-Response from an 0.9 server
- self.fp = LineAndFileWrapper(line, self.fp)
- return "HTTP/0.9", 200, ""
-
- # The status code is a three-digit number
- try:
- status = int(status)
- if status < 100 or status > 999:
- raise BadStatusLine(line)
- except ValueError:
- raise BadStatusLine(line)
- return version, status, reason
-
- def begin(self):
- if self.msg is not None:
- # we've already started reading the response
- return
-
- # read until we get a non-100 response
- while True:
- version, status, reason = self._read_status()
- if status != CONTINUE:
- break
- # skip the header from the 100 response
- while True:
- skip = self.fp.readline().strip()
- if not skip:
- break
- if self.debuglevel > 0:
- print "header:", skip
-
- self.status = status
- self.reason = reason.strip()
- if version == 'HTTP/1.0':
- self.version = 10
- elif version.startswith('HTTP/1.'):
- self.version = 11 # use HTTP/1.1 code for HTTP/1.x where x>=1
- elif version == 'HTTP/0.9':
- self.version = 9
- else:
- raise UnknownProtocol(version)
-
- if self.version == 9:
- self.length = None
- self.chunked = 0
- self.will_close = 1
- self.msg = HTTPMessage(StringIO())
- return
-
- self.msg = HTTPMessage(self.fp, 0)
- if self.debuglevel > 0:
- for hdr in self.msg.headers:
- _log.info( "header:", hdr,)
-
- # don't let the msg keep an fp
- self.msg.fp = None
-
- # are we using the chunked-style of transfer encoding?
- tr_enc = self.msg.getheader('transfer-encoding')
- if tr_enc and tr_enc.lower() == "chunked":
- self.chunked = 1
- self.chunk_left = None
- else:
- self.chunked = 0
-
- # will the connection close at the end of the response?
- self.will_close = self._check_close()
-
- # do we have a Content-Length?
- # NOTE: RFC 2616, S4.4, #3 says we ignore this if tr_enc is "chunked"
- length = self.msg.getheader('content-length')
- if length and not self.chunked:
- try:
- self.length = int(length)
- except ValueError:
- self.length = None
- else:
- self.length = None
-
- # does the body have a fixed length? (of zero)
- if (status == NO_CONTENT or status == NOT_MODIFIED or
- 100 <= status < 200 or # 1xx codes
- self._method == 'HEAD'):
- self.length = 0
-
- # if the connection remains open, and we aren't using chunked, and
- # a content-length was not provided, then assume that the connection
- # WILL close.
- if not self.will_close and \
- not self.chunked and \
- self.length is None:
- self.will_close = 1
-
- def _check_close(self):
- conn = self.msg.getheader('connection')
- if self.version == 11:
- # An HTTP/1.1 proxy is assumed to stay open unless
- # explicitly closed.
- conn = self.msg.getheader('connection')
- if conn and "close" in conn.lower():
- return True
- return False
-
- # Some HTTP/1.0 implementations have support for persistent
- # connections, using rules different than HTTP/1.1.
-
- # For older HTTP, Keep-Alive indicates persistent connection.
- if self.msg.getheader('keep-alive'):
- return False
-
- # At least Akamai returns a "Connection: Keep-Alive" header,
- # which was supposed to be sent by the client.
- if conn and "keep-alive" in conn.lower():
- return False
-
- # Proxy-Connection is a netscape hack.
- pconn = self.msg.getheader('proxy-connection')
- if pconn and "keep-alive" in pconn.lower():
- return False
-
- # otherwise, assume it will close
- return True
-
- def close(self):
- if self.fp:
- self.fp.close()
- self.fp = None
-
- def isclosed(self):
- # NOTE: it is possible that we will not ever call self.close(). This
- # case occurs when will_close is TRUE, length is None, and we
- # read up to the last byte, but NOT past it.
- #
- # IMPLIES: if will_close is FALSE, then self.close() will ALWAYS be
- # called, meaning self.isclosed() is meaningful.
- return self.fp is None
-
- # XXX It would be nice to have readline and __iter__ for this, too.
-
- def read(self, amt=None):
- if self.fp is None:
- return ''
-
- if self.chunked:
- return self._read_chunked(amt)
-
- if amt is None:
- # unbounded read
- if self.length is None:
- s = self.fp.read()
- else:
- s = self._safe_read(self.length)
- self.length = 0
- self.close() # we read everything
- return s
-
- if self.length is not None:
- if amt > self.length:
- # clip the read to the "end of response"
- amt = self.length
-
- # we do not use _safe_read() here because this may be a .will_close
- # connection, and the user is reading more bytes than will be provided
- # (for example, reading in 1k chunks)
- s = self.fp.read(amt)
- if self.length is not None:
- self.length -= len(s)
-
- return s
-
- def _read_chunked(self, amt):
- assert self.chunked != _UNKNOWN
- chunk_left = self.chunk_left
- value = ''
-
- # XXX This accumulates chunks by repeated string concatenation,
- # which is not efficient as the number or size of chunks gets big.
- while True:
- if chunk_left is None:
- line = self.fp.readline()
- i = line.find(';')
- if i >= 0:
- line = line[:i] # strip chunk-extensions
- chunk_left = int(line, 16)
- if chunk_left == 0:
- break
- if amt is None:
- value += self._safe_read(chunk_left)
- elif amt < chunk_left:
- value += self._safe_read(amt)
- self.chunk_left = chunk_left - amt
- return value
- elif amt == chunk_left:
- value += self._safe_read(amt)
- self._safe_read(2) # toss the CRLF at the end of the chunk
- self.chunk_left = None
- return value
- else:
- value += self._safe_read(chunk_left)
- amt -= chunk_left
-
- # we read the whole chunk, get another
- self._safe_read(2) # toss the CRLF at the end of the chunk
- chunk_left = None
-
- # read and discard trailer up to the CRLF terminator
- ### note: we shouldn't have any trailers!
- while True:
- line = self.fp.readline()
- if line == '\r\n':
- break
-
- # we read everything; close the "file"
- self.close()
-
- return value
-
- def _safe_read(self, amt):
- """Read the number of bytes requested, compensating for partial reads.
-
- Normally, we have a blocking socket, but a read() can be interrupted
- by a signal (resulting in a partial read).
-
- Note that we cannot distinguish between EOF and an interrupt when zero
- bytes have been read. IncompleteRead() will be raised in this
- situation.
-
- This function should be used when bytes "should" be present for
- reading. If the bytes are truly not available (due to EOF), then the
- IncompleteRead exception can be used to detect the problem.
- """
- s = []
- while amt > 0:
- chunk = self.fp.read(min(amt, MAXAMOUNT))
- if not chunk:
- raise IncompleteRead(s)
- s.append(chunk)
- amt -= len(chunk)
- return ''.join(s)
-
- def getheader(self, name, default=None):
- if self.msg is None:
- raise ResponseNotReady()
- return self.msg.getheader(name, default)
-
- def getheaders(self):
- """Return list of (header, value) tuples."""
- if self.msg is None:
- raise ResponseNotReady()
- return self.msg.items()
-
-
-class HTTPConnection:
-
- _http_vsn = 11
- _http_vsn_str = 'HTTP/1.1'
-
- response_class = HTTPResponse
- default_port = HTTP_PORT
- auto_open = 1
- debuglevel = 0
- strict = 0
-
- def __init__(self, host, port=None, strict=None):
- self.sock = None
- self._buffer = []
- self.__response = None
- self.__state = _CS_IDLE
- self._method = None
-
- self._set_hostport(host, port)
- if strict is not None:
- self.strict = strict
-
- def _set_hostport(self, host, port):
- if port is None:
- i = host.rfind(':')
- j = host.rfind(']') # ipv6 addresses have [...]
- if i > j:
- try:
- port = int(host[i+1:])
- except ValueError:
- raise InvalidURL("nonnumeric port: '%s'" % host[i+1:])
- host = host[:i]
- else:
- port = self.default_port
- if host and host[0] == '[' and host[-1] == ']':
- host = host[1:-1]
- self.host = host
- self.port = port
-
- def set_debuglevel(self, level):
- self.debuglevel = level
-
- def connect(self):
- """Connect to the host and port specified in __init__."""
- msg = "getaddrinfo returns an empty list"
- for res in socket.getaddrinfo(self.host, self.port, 0,
- socket.SOCK_STREAM):
- af, socktype, proto, canonname, sa = res
- try:
- self.sock = socket.socket(af, socktype, proto)
- if self.debuglevel > 0:
- print "connect: (%s, %s)" % (self.host, self.port)
- self.sock.connect(sa)
- except socket.error, msg:
- if self.debuglevel > 0:
- print 'connect fail:', (self.host, self.port)
- if self.sock:
- self.sock.close()
- self.sock = None
- continue
- break
- if not self.sock:
- raise socket.error, msg
-
- def close(self):
- """Close the connection to the HTTP server."""
- if self.sock:
- self.sock.close() # close it manually... there may be other refs
- self.sock = None
- if self.__response:
- self.__response.close()
- self.__response = None
- self.__state = _CS_IDLE
-
- def send(self, str):
- """Send `str' to the server."""
- if self.sock is None:
- if self.auto_open:
- self.connect()
- else:
- raise NotConnected()
-
- # send the data to the server. if we get a broken pipe, then close
- # the socket. we want to reconnect when somebody tries to send again.
- #
- # NOTE: we DO propagate the error, though, because we cannot simply
- # ignore the error... the caller will know if they can retry.
- if self.debuglevel > 0:
- print "send:", repr(str)
- try:
- self.sock.sendall(str)
- except socket.error, v:
- if v[0] == 32: # Broken pipe
- self.close()
- raise
-
- def _output(self, s):
- """Add a line of output to the current request buffer.
-
- Assumes that the line does *not* end with \\r\\n.
- """
- self._buffer.append(s)
-
- def _send_output(self):
- """Send the currently buffered request and clear the buffer.
-
- Appends an extra \\r\\n to the buffer.
- """
- self._buffer.extend(("", ""))
- msg = "\r\n".join(self._buffer)
- del self._buffer[:]
- self.send(msg)
-
- def putrequest(self, method, url, skip_host=0, skip_accept_encoding=0):
- """Send a request to the server.
-
- `method' specifies an HTTP request method, e.g. 'GET'.
- `url' specifies the object being requested, e.g. '/index.html'.
- `skip_host' if True does not add automatically a 'Host:' header
- `skip_accept_encoding' if True does not add automatically an
- 'Accept-Encoding:' header
- """
-
- # if a prior response has been completed, then forget about it.
- if self.__response and self.__response.isclosed():
- self.__response = None
-
-
- # in certain cases, we cannot issue another request on this connection.
- # this occurs when:
- # 1) we are in the process of sending a request. (_CS_REQ_STARTED)
- # 2) a response to a previous request has signalled that it is going
- # to close the connection upon completion.
- # 3) the headers for the previous response have not been read, thus
- # we cannot determine whether point (2) is true. (_CS_REQ_SENT)
- #
- # if there is no prior response, then we can request at will.
- #
- # if point (2) is true, then we will have passed the socket to the
- # response (effectively meaning, "there is no prior response"), and
- # will open a new one when a new request is made.
- #
- # Note: if a prior response exists, then we *can* start a new request.
- # We are not allowed to begin fetching the response to this new
- # request, however, until that prior response is complete.
- #
- if self.__state == _CS_IDLE:
- self.__state = _CS_REQ_STARTED
- else:
- raise CannotSendRequest()
-
- # Save the method we use, we need it later in the response phase
- self._method = method
- if not url:
- url = '/'
- str = '%s %s %s' % (method, url, self._http_vsn_str)
-
- self._output(str)
-
- if self._http_vsn == 11:
- # Issue some standard headers for better HTTP/1.1 compliance
-
- if not skip_host:
- # this header is issued *only* for HTTP/1.1
- # connections. more specifically, this means it is
- # only issued when the client uses the new
- # HTTPConnection() class. backwards-compat clients
- # will be using HTTP/1.0 and those clients may be
- # issuing this header themselves. we should NOT issue
- # it twice; some web servers (such as Apache) barf
- # when they see two Host: headers
-
- # If we need a non-standard port, include it in the
- # header. If the request is going through a proxy,
- # but the host of the actual URL, not the host of the
- # proxy.
-
- netloc = ''
- if url.startswith('http'):
- nil, netloc, nil, nil, nil = urlsplit(url)
-
- if netloc:
- self.putheader('Host', netloc.encode("idna"))
- elif self.port == HTTP_PORT:
- self.putheader('Host', self.host.encode("idna"))
- else:
- self.putheader('Host', "%s:%s" % (self.host.encode("idna"), self.port))
-
- # note: we are assuming that clients will not attempt to set these
- # headers since *this* library must deal with the
- # consequences. this also means that when the supporting
- # libraries are updated to recognize other forms, then this
- # code should be changed (removed or updated).
-
- # we only want a Content-Encoding of "identity" since we don't
- # support encodings such as x-gzip or x-deflate.
- if not skip_accept_encoding:
- self.putheader('Accept-Encoding', 'identity')
-
- # we can accept "chunked" Transfer-Encodings, but no others
- # NOTE: no TE header implies *only* "chunked"
- #self.putheader('TE', 'chunked')
-
- # if TE is supplied in the header, then it must appear in a
- # Connection header.
- #self.putheader('Connection', 'TE')
-
- else:
- # For HTTP/1.0, the server will assume "not chunked"
- pass
-
- def putheader(self, header, value):
- """Send a request header line to the server.
-
- For example: h.putheader('Accept', 'text/html')
- """
- if self.__state != _CS_REQ_STARTED:
- raise CannotSendHeader()
-
- str = '%s: %s' % (header, value)
- self._output(str)
-
- def endheaders(self):
- """Indicate that the last header line has been sent to the server."""
-
- if self.__state == _CS_REQ_STARTED:
- self.__state = _CS_REQ_SENT
- else:
- raise CannotSendHeader()
-
- self._send_output()
-
- def request(self, method, url, body=None, headers={}):
- """Send a complete request to the server."""
-
- try:
- self._send_request(method, url, body, headers)
- except socket.error, v:
- # trap 'Broken pipe' if we're allowed to automatically reconnect
- if v[0] != 32 or not self.auto_open:
- raise
- # try one more time
- self._send_request(method, url, body, headers)
-
- def _send_request(self, method, url, body, headers):
- # honour explicitly requested Host: and Accept-Encoding headers
- header_names = dict.fromkeys([k.lower() for k in headers])
- skips = {}
- if 'host' in header_names:
- skips['skip_host'] = 1
- if 'accept-encoding' in header_names:
- skips['skip_accept_encoding'] = 1
-
- self.putrequest(method, url, **skips)
-
- if body and ('content-length' not in header_names):
- self.putheader('Content-Length', str(len(body)))
- for hdr, value in headers.iteritems():
- self.putheader(hdr, value)
- self.endheaders()
-
- if body:
- self.send(body)
-
- def getresponse(self):
- "Get the response from the server."
-
- # if a prior response has been completed, then forget about it.
- if self.__response and self.__response.isclosed():
- self.__response = None
-
- #
- # if a prior response exists, then it must be completed (otherwise, we
- # cannot read this response's header to determine the connection-close
- # behavior)
- #
- # note: if a prior response existed, but was connection-close, then the
- # socket and response were made independent of this HTTPConnection
- # object since a new request requires that we open a whole new
- # connection
- #
- # this means the prior response had one of two states:
- # 1) will_close: this connection was reset and the prior socket and
- # response operate independently
- # 2) persistent: the response was retained and we await its
- # isclosed() status to become true.
- #
- if self.__state != _CS_REQ_SENT or self.__response:
- raise ResponseNotReady()
-
- if self.debuglevel > 0:
- response = self.response_class(self.sock, self.debuglevel,
- strict=self.strict,
- method=self._method)
- else:
- response = self.response_class(self.sock, strict=self.strict,
- method=self._method)
-
- response.begin()
- assert response.will_close != _UNKNOWN
- self.__state = _CS_IDLE
-
- if response.will_close:
- # this effectively passes the connection to the response
- self.close()
- else:
- # remember this, so we can tell when it is complete
- self.__response = response
-
- return response
-
-# The next several classes are used to define FakeSocket,a socket-like
-# interface to an SSL connection.
-
-# The primary complexity comes from faking a makefile() method. The
-# standard socket makefile() implementation calls dup() on the socket
-# file descriptor. As a consequence, clients can call close() on the
-# parent socket and its makefile children in any order. The underlying
-# socket isn't closed until they are all closed.
-
-# The implementation uses reference counting to keep the socket open
-# until the last client calls close(). SharedSocket keeps track of
-# the reference counting and SharedSocketClient provides a constructor
-# and close() method that call incref() and decref() correctly.
-
-class SharedSocket:
-
- def __init__(self, sock):
- self.sock = sock
- self._refcnt = 0
-
- def incref(self):
- self._refcnt += 1
-
- def decref(self):
- self._refcnt -= 1
- assert self._refcnt >= 0
- if self._refcnt == 0:
- self.sock.close()
-
- def __del__(self):
- self.sock.close()
-
-class SharedSocketClient:
-
- def __init__(self, shared):
- self._closed = 0
- self._shared = shared
- self._shared.incref()
- self._sock = shared.sock
-
- def close(self):
- if not self._closed:
- self._shared.decref()
- self._closed = 1
- self._shared = None
-
-class SSLFile(SharedSocketClient):
- """File-like object wrapping an SSL socket."""
-
- BUFSIZE = 8192
-
- def __init__(self, sock, ssl, bufsize=None):
- SharedSocketClient.__init__(self, sock)
- self._ssl = ssl
- self._buf = ''
- self._bufsize = bufsize or self.__class__.BUFSIZE
-
- def _read(self):
- buf = ''
- # put in a loop so that we retry on transient errors
- while True:
- try:
- buf = self._ssl.read(self._bufsize)
- except socket.sslerror, err:
- if (err[0] == socket.SSL_ERROR_WANT_READ
- or err[0] == socket.SSL_ERROR_WANT_WRITE):
- continue
- if (err[0] == socket.SSL_ERROR_ZERO_RETURN
- or err[0] == socket.SSL_ERROR_EOF):
- break
- raise
- except socket.error, err:
- if err[0] == errno.EINTR:
- continue
- if err[0] == errno.EBADF:
- # XXX socket was closed?
- break
- raise
- else:
- break
- return buf
-
- def read(self, size=None):
- L = [self._buf]
- avail = len(self._buf)
- while size is None or avail < size:
- s = self._read()
- if s == '':
- break
- L.append(s)
- avail += len(s)
- all = "".join(L)
- if size is None:
- self._buf = ''
- return all
- else:
- self._buf = all[size:]
- return all[:size]
-
- def readline(self):
- L = [self._buf]
- self._buf = ''
- while 1:
- i = L[-1].find("\n")
- if i >= 0:
- break
- s = self._read()
- if s == '':
- break
- L.append(s)
- if i == -1:
- # loop exited because there is no more data
- return "".join(L)
- else:
- all = "".join(L)
- # XXX could do enough bookkeeping not to do a 2nd search
- i = all.find("\n") + 1
- line = all[:i]
- self._buf = all[i:]
- return line
-
- def readlines(self, sizehint=0):
- total = 0
- list = []
- while True:
- line = self.readline()
- if not line:
- break
- list.append(line)
- total += len(line)
- if sizehint and total >= sizehint:
- break
- return list
-
- def fileno(self):
- return self._sock.fileno()
-
- def __iter__(self):
- return self
-
- def next(self):
- line = self.readline()
- if not line:
- raise StopIteration
- return line
-
-class FakeSocket(SharedSocketClient):
-
- class _closedsocket:
- def __getattr__(self, name):
- raise error(9, 'Bad file descriptor')
-
- def __init__(self, sock, ssl):
- sock = SharedSocket(sock)
- SharedSocketClient.__init__(self, sock)
- self._ssl = ssl
-
- def close(self):
- SharedSocketClient.close(self)
- self._sock = self.__class__._closedsocket()
-
- def makefile(self, mode, bufsize=None):
- if mode != 'r' and mode != 'rb':
- raise UnimplementedFileMode()
- return SSLFile(self._shared, self._ssl, bufsize)
-
- def send(self, stuff, flags = 0):
- return self._ssl.write(stuff)
-
- sendall = send
-
- def recv(self, len = 1024, flags = 0):
- return self._ssl.read(len)
-
- def __getattr__(self, attr):
- return getattr(self._sock, attr)
-
-
-class HTTPSConnection(HTTPConnection):
- "This class allows communication via SSL."
-
- default_port = HTTPS_PORT
-
- def __init__(self, host, port=None, key_file=None, cert_file=None,
- strict=None):
- HTTPConnection.__init__(self, host, port, strict)
- self.key_file = key_file
- self.cert_file = cert_file
-
- def connect(self):
- "Connect to a host on a given (SSL) port."
-
- sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
- sock.connect((self.host, self.port))
- ssl = socket.ssl(sock, self.key_file, self.cert_file)
- self.sock = FakeSocket(sock, ssl)
-
-
-class HTTP:
- "Compatibility class with httplib.py from 1.5."
-
- _http_vsn = 10
- _http_vsn_str = 'HTTP/1.0'
-
- debuglevel = 0
-
- _connection_class = HTTPConnection
-
- def __init__(self, host='', port=None, strict=None):
- "Provide a default host, since the superclass requires one."
-
- # some joker passed 0 explicitly, meaning default port
- if port == 0:
- port = None
-
- # Note that we may pass an empty string as the host; this will throw
- # an error when we attempt to connect. Presumably, the client code
- # will call connect before then, with a proper host.
- self._setup(self._connection_class(host, port, strict))
-
- def _setup(self, conn):
- self._conn = conn
-
- # set up delegation to flesh out interface
- self.send = conn.send
- self.putrequest = conn.putrequest
- self.endheaders = conn.endheaders
- self.set_debuglevel = conn.set_debuglevel
-
- conn._http_vsn = self._http_vsn
- conn._http_vsn_str = self._http_vsn_str
-
- self.file = None
-
- def connect(self, host=None, port=None):
- "Accept arguments to set the host/port, since the superclass doesn't."
-
- if host is not None:
- self._conn._set_hostport(host, port)
- self._conn.connect()
-
- def getfile(self):
- "Provide a getfile, since the superclass' does not use this concept."
- return self.file
-
- def putheader(self, header, *values):
- "The superclass allows only one value argument."
- self._conn.putheader(header, '\r\n\t'.join(values))
-
- def getreply(self):
- """Compat definition since superclass does not define it.
-
- Returns a tuple consisting of:
- - server status code (e.g. '200' if all goes well)
- - server "reason" corresponding to status code
- - any RFC822 headers in the response from the server
- """
- try:
- response = self._conn.getresponse()
- except BadStatusLine, e:
- ### hmm. if getresponse() ever closes the socket on a bad request,
- ### then we are going to have problems with self.sock
-
- ### should we keep this behavior? do people use it?
- # keep the socket open (as a file), and return it
- self.file = self._conn.sock.makefile('rb', 0)
-
- # close our socket -- we want to restart after any protocol error
- self.close()
-
- self.headers = None
- return -1, e.line, None
-
- self.headers = response.msg
- self.file = response.fp
- return response.status, response.reason, response.msg
-
- def close(self):
- self._conn.close()
-
- # note that self.file == response.fp, which gets closed by the
- # superclass. just clear the object ref here.
- ### hmm. messy. if status==-1, then self.file is owned by us.
- ### well... we aren't explicitly closing, but losing this ref will
- ### do it
- self.file = None
-
-if hasattr(socket, 'ssl'):
- class HTTPS(HTTP):
- """Compatibility with 1.5 httplib interface
-
- Python 1.5.2 did not have an HTTPS class, but it defined an
- interface for sending http requests that is also useful for
- https.
- """
-
- _connection_class = HTTPSConnection
-
- def __init__(self, host='', port=None, key_file=None, cert_file=None,
- strict=None):
- # provide a default host, pass the X509 cert info
-
- # urf. compensate for bad input.
- if port == 0:
- port = None
- self._setup(self._connection_class(host, port, key_file,
- cert_file, strict))
-
- # we never actually use these for anything, but we keep them
- # here for compatibility with post-1.5.2 CVS.
- self.key_file = key_file
- self.cert_file = cert_file
-
-
-class HTTPException(Exception):
- # Subclasses that define an __init__ must call Exception.__init__
- # or define self.args. Otherwise, str() will fail.
- pass
-
-class NotConnected(HTTPException):
- pass
-
-class InvalidURL(HTTPException):
- pass
-
-class UnknownProtocol(HTTPException):
- def __init__(self, version):
- self.args = version,
- self.version = version
-
-class UnknownTransferEncoding(HTTPException):
- pass
-
-class UnimplementedFileMode(HTTPException):
- pass
-
-class IncompleteRead(HTTPException):
- def __init__(self, partial):
- self.args = partial,
- self.partial = partial
-
-class ImproperConnectionState(HTTPException):
- pass
-
-class CannotSendRequest(ImproperConnectionState):
- pass
-
-class CannotSendHeader(ImproperConnectionState):
- pass
-
-class ResponseNotReady(ImproperConnectionState):
- pass
-
-class BadStatusLine(HTTPException):
- def __init__(self, line):
- self.args = line,
- self.line = line
-
-# for backwards compatibility
-error = HTTPException
-
-class LineAndFileWrapper:
- """A limited file-like object for HTTP/0.9 responses."""
-
- # The status-line parsing code calls readline(), which normally
- # get the HTTP status line. For a 0.9 response, however, this is
- # actually the first line of the body! Clients need to get a
- # readable file object that contains that line.
-
- def __init__(self, line, file):
- self._line = line
- self._file = file
- self._line_consumed = 0
- self._line_offset = 0
- self._line_left = len(line)
-
- def __getattr__(self, attr):
- return getattr(self._file, attr)
-
- def _done(self):
- # called when the last byte is read from the line. After the
- # call, all read methods are delegated to the underlying file
- # object.
- self._line_consumed = 1
- self.read = self._file.read
- self.readline = self._file.readline
- self.readlines = self._file.readlines
-
- def read(self, amt=None):
- if self._line_consumed:
- return self._file.read(amt)
- assert self._line_left
- if amt is None or amt > self._line_left:
- s = self._line[self._line_offset:]
- self._done()
- if amt is None:
- return s + self._file.read()
- else:
- return s + self._file.read(amt - len(s))
- else:
- assert amt <= self._line_left
- i = self._line_offset
- j = i + amt
- s = self._line[i:j]
- self._line_offset = j
- self._line_left -= amt
- if self._line_left == 0:
- self._done()
- return s
-
- def readline(self):
- if self._line_consumed:
- return self._file.readline()
- assert self._line_left
- s = self._line[self._line_offset:]
- self._done()
- return s
-
- def readlines(self, size=None):
- if self._line_consumed:
- return self._file.readlines(size)
- assert self._line_left
- L = [self._line[self._line_offset:]]
- self._done()
- if size is None:
- return L + self._file.readlines()
- else:
- return L + self._file.readlines(size)
-
-def test():
- """Test this module.
-
- A hodge podge of tests collected here, because they have too many
- external dependencies for the regular test suite.
- """
-
- import sys
- import getopt
- opts, args = getopt.getopt(sys.argv[1:], 'd')
- dl = 0
- for o, a in opts:
- if o == '-d': dl = dl + 1
- host = 'www.python.org'
- selector = '/'
- if args[0:]: host = args[0]
- if args[1:]: selector = args[1]
- h = HTTP()
- h.set_debuglevel(dl)
- h.connect(host)
- h.putrequest('GET', selector)
- h.endheaders()
- status, reason, headers = h.getreply()
- print 'status =', status
- print 'reason =', reason
- print "read", len(h.getfile().read())
- print
- if headers:
- for header in headers.headers: print header.strip()
- print
-
- # minimal test that code to extract host from url works
- class HTTP11(HTTP):
- _http_vsn = 11
- _http_vsn_str = 'HTTP/1.1'
-
- h = HTTP11('www.python.org')
- h.putrequest('GET', 'http://www.python.org/~jeremy/')
- h.endheaders()
- h.getreply()
- h.close()
-
- if hasattr(socket, 'ssl'):
-
- for host, selector in (('sourceforge.net', '/projects/python'),
- ):
- print "https://%s%s" % (host, selector)
- hs = HTTPS()
- hs.set_debuglevel(dl)
- hs.connect(host)
- hs.putrequest('GET', selector)
- hs.endheaders()
- status, reason, headers = hs.getreply()
- print 'status =', status
- print 'reason =', reason
- print "read", len(hs.getfile().read())
- print
- if headers:
- for header in headers.headers: print header.strip()
- print
-
-if __name__ == '__main__':
- test()
+"""HTTP/1.1 client library
+
+
+
+
+HTTPConnection goes through a number of "states", which define when a client
+may legally make another request or fetch the response for a particular
+request. This diagram details these state transitions:
+
+ (null)
+ |
+ | HTTPConnection()
+ v
+ Idle
+ |
+ | putrequest()
+ v
+ Request-started
+ |
+ | ( putheader() )* endheaders()
+ v
+ Request-sent
+ |
+ | response = getresponse()
+ v
+ Unread-response [Response-headers-read]
+ |\____________________
+ | |
+ | response.read() | putrequest()
+ v v
+ Idle Req-started-unread-response
+ ______/|
+ / |
+ response.read() | | ( putheader() )* endheaders()
+ v v
+ Request-started Req-sent-unread-response
+ |
+ | response.read()
+ v
+ Request-sent
+
+This diagram presents the following rules:
+ -- a second request may not be started until {response-headers-read}
+ -- a response [object] cannot be retrieved until {request-sent}
+ -- there is no differentiation between an unread response body and a
+ partially read response body
+
+Note: this enforcement is applied by the HTTPConnection class. The
+ HTTPResponse class does not enforce this state machine, which
+ implies sophisticated clients may accelerate the request/response
+ pipeline. Caution should be taken, though: accelerating the states
+ beyond the above pattern may imply knowledge of the server's
+ connection-close behavior for certain requests. For example, it
+ is impossible to tell whether the server will close the connection
+ UNTIL the response headers have been read; this means that further
+ requests cannot be placed into the pipeline until it is known that
+ the server will NOT be closing the connection.
+
+Logical State __state __response
+------------- ------- ----------
+Idle _CS_IDLE None
+Request-started _CS_REQ_STARTED None
+Request-sent _CS_REQ_SENT None
+Unread-response _CS_IDLE
+Req-started-unread-response _CS_REQ_STARTED
+Req-sent-unread-response _CS_REQ_SENT
+"""
+
+import errno
+import mimetools
+import socket
+from urlparse import urlsplit
+import logging
+_log = logging.getLogger('py.httplib')
+
+try:
+ from cStringIO import StringIO
+except ImportError:
+ from StringIO import StringIO
+
+__all__ = ["HTTP", "HTTPResponse", "HTTPConnection", "HTTPSConnection",
+ "HTTPException", "NotConnected", "UnknownProtocol",
+ "UnknownTransferEncoding", "UnimplementedFileMode",
+ "IncompleteRead", "InvalidURL", "ImproperConnectionState",
+ "CannotSendRequest", "CannotSendHeader", "ResponseNotReady",
+ "BadStatusLine", "error", "responses"]
+
+HTTP_PORT = 80
+HTTPS_PORT = 443
+
+_UNKNOWN = 'UNKNOWN'
+
+# connection states
+_CS_IDLE = 'Idle'
+_CS_REQ_STARTED = 'Request-started'
+_CS_REQ_SENT = 'Request-sent'
+
+# status codes
+# informational
+CONTINUE = 100
+SWITCHING_PROTOCOLS = 101
+PROCESSING = 102
+
+# successful
+OK = 200
+CREATED = 201
+ACCEPTED = 202
+NON_AUTHORITATIVE_INFORMATION = 203
+NO_CONTENT = 204
+RESET_CONTENT = 205
+PARTIAL_CONTENT = 206
+MULTI_STATUS = 207
+IM_USED = 226
+
+# redirection
+MULTIPLE_CHOICES = 300
+MOVED_PERMANENTLY = 301
+FOUND = 302
+SEE_OTHER = 303
+NOT_MODIFIED = 304
+USE_PROXY = 305
+TEMPORARY_REDIRECT = 307
+
+# client error
+BAD_REQUEST = 400
+UNAUTHORIZED = 401
+PAYMENT_REQUIRED = 402
+FORBIDDEN = 403
+NOT_FOUND = 404
+METHOD_NOT_ALLOWED = 405
+NOT_ACCEPTABLE = 406
+PROXY_AUTHENTICATION_REQUIRED = 407
+REQUEST_TIMEOUT = 408
+CONFLICT = 409
+GONE = 410
+LENGTH_REQUIRED = 411
+PRECONDITION_FAILED = 412
+REQUEST_ENTITY_TOO_LARGE = 413
+REQUEST_URI_TOO_LONG = 414
+UNSUPPORTED_MEDIA_TYPE = 415
+REQUESTED_RANGE_NOT_SATISFIABLE = 416
+EXPECTATION_FAILED = 417
+UNPROCESSABLE_ENTITY = 422
+LOCKED = 423
+FAILED_DEPENDENCY = 424
+UPGRADE_REQUIRED = 426
+
+# server error
+INTERNAL_SERVER_ERROR = 500
+NOT_IMPLEMENTED = 501
+BAD_GATEWAY = 502
+SERVICE_UNAVAILABLE = 503
+GATEWAY_TIMEOUT = 504
+HTTP_VERSION_NOT_SUPPORTED = 505
+INSUFFICIENT_STORAGE = 507
+NOT_EXTENDED = 510
+
+# Mapping status codes to official W3C names
+responses = {
+ 100: 'Continue',
+ 101: 'Switching Protocols',
+
+ 200: 'OK',
+ 201: 'Created',
+ 202: 'Accepted',
+ 203: 'Non-Authoritative Information',
+ 204: 'No Content',
+ 205: 'Reset Content',
+ 206: 'Partial Content',
+
+ 300: 'Multiple Choices',
+ 301: 'Moved Permanently',
+ 302: 'Found',
+ 303: 'See Other',
+ 304: 'Not Modified',
+ 305: 'Use Proxy',
+ 306: '(Unused)',
+ 307: 'Temporary Redirect',
+
+ 400: 'Bad Request',
+ 401: 'Unauthorized',
+ 402: 'Payment Required',
+ 403: 'Forbidden',
+ 404: 'Not Found',
+ 405: 'Method Not Allowed',
+ 406: 'Not Acceptable',
+ 407: 'Proxy Authentication Required',
+ 408: 'Request Timeout',
+ 409: 'Conflict',
+ 410: 'Gone',
+ 411: 'Length Required',
+ 412: 'Precondition Failed',
+ 413: 'Request Entity Too Large',
+ 414: 'Request-URI Too Long',
+ 415: 'Unsupported Media Type',
+ 416: 'Requested Range Not Satisfiable',
+ 417: 'Expectation Failed',
+
+ 500: 'Internal Server Error',
+ 501: 'Not Implemented',
+ 502: 'Bad Gateway',
+ 503: 'Service Unavailable',
+ 504: 'Gateway Timeout',
+ 505: 'HTTP Version Not Supported',
+}
+
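For example, the table turns a numeric status code back into its reason
phrase:

    >>> responses[NOT_FOUND]
    'Not Found'
    >>> responses.get(599, 'Unknown')
    'Unknown'
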
+# maximal amount of data to read at one time in _safe_read
+MAXAMOUNT = 1048576
+
+class HTTPMessage(mimetools.Message):
+
+ def addheader(self, key, value):
+ """Add header for field key handling repeats."""
+ prev = self.dict.get(key)
+ if prev is None:
+ self.dict[key] = value
+ else:
+ combined = ", ".join((prev, value))
+ self.dict[key] = combined
+
+ def addcontinue(self, key, more):
+ """Add more field data from a continuation line."""
+ prev = self.dict[key]
+ self.dict[key] = prev + "\n " + more
+
+ def readheaders(self):
+ """Read header lines.
+
+ Read header lines up to the entirely blank line that terminates them.
+ The (normally blank) line that ends the headers is skipped, but not
+        included in the returned list.  If a non-header line ends the headers
+ (which is an error), an attempt is made to backspace over it; it is
+ never included in the returned list.
+
+ The variable self.status is set to the empty string if all went well,
+ otherwise it is an error message. The variable self.headers is a
+ completely uninterpreted list of lines contained in the header (so
+ printing them will reproduce the header exactly as it appears in the
+ file).
+
+ If multiple header fields with the same name occur, they are combined
+ according to the rules in RFC 2616 sec 4.2:
+
+ Appending each subsequent field-value to the first, each separated
+ by a comma. The order in which header fields with the same field-name
+ are received is significant to the interpretation of the combined
+ field value.
+ """
+ # XXX The implementation overrides the readheaders() method of
+ # rfc822.Message. The base class design isn't amenable to
+ # customized behavior here so the method here is a copy of the
+ # base class code with a few small changes.
+
+ self.dict = {}
+ self.unixfrom = ''
+ self.headers = hlist = []
+ self.status = ''
+ headerseen = ""
+ firstline = 1
+ startofline = unread = tell = None
+ if hasattr(self.fp, 'unread'):
+ unread = self.fp.unread
+ elif self.seekable:
+ tell = self.fp.tell
+ while True:
+ if tell:
+ try:
+ startofline = tell()
+ except IOError:
+ startofline = tell = None
+ self.seekable = 0
+ line = self.fp.readline()
+ if not line:
+ self.status = 'EOF in headers'
+ break
+ # Skip unix From name time lines
+ if firstline and line.startswith('From '):
+ self.unixfrom = self.unixfrom + line
+ continue
+ firstline = 0
+ if headerseen and line[0] in ' \t':
+ # XXX Not sure if continuation lines are handled properly
+ # for http and/or for repeating headers
+ # It's a continuation line.
+ hlist.append(line)
+ self.addcontinue(headerseen, line.strip())
+ continue
+ elif self.iscomment(line):
+ # It's a comment. Ignore it.
+ continue
+ elif self.islast(line):
+ # Note! No pushback here! The delimiter line gets eaten.
+ break
+ headerseen = self.isheader(line)
+ if headerseen:
+ # It's a legal header line, save it.
+ hlist.append(line)
+ self.addheader(headerseen, line[len(headerseen)+1:].strip())
+ continue
+ else:
+ # It's not a header line; throw it back and stop here.
+ if not self.dict:
+ self.status = 'No headers'
+ else:
+ self.status = 'Non-header line where header expected'
+ # Try to undo the read.
+ if unread:
+ unread(line)
+ elif tell:
+ self.fp.seek(startofline)
+ else:
+ self.status = self.status + '; bad seek'
+ break
+
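As a small illustration of the header-combining rule described in
readheaders() above (StringIO stands in for a socket file here, and the
header values are made up):

    from StringIO import StringIO
    raw = 'Accept: text/html\r\nAccept: text/plain\r\n\r\n'
    msg = HTTPMessage(StringIO(raw))
    # the two Accept lines are folded into one comma-separated value
    assert msg.getheader('accept') == 'text/html, text/plain'
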
+class HTTPResponse:
+
+ # strict: If true, raise BadStatusLine if the status line can't be
+ # parsed as a valid HTTP/1.0 or 1.1 status line. By default it is
+ # false because it prevents clients from talking to HTTP/0.9
+ # servers. Note that a response with a sufficiently corrupted
+ # status line will look like an HTTP/0.9 response.
+
+ # See RFC 2616 sec 19.6 and RFC 1945 sec 6 for details.
+
+ def __init__(self, sock, debuglevel=0, strict=0, method=None):
+ self.fp = sock.makefile('rb', 0)
+ self.debuglevel = debuglevel
+ self.strict = strict
+ self._method = method
+
+ self.msg = None
+
+ # from the Status-Line of the response
+ self.version = _UNKNOWN # HTTP-Version
+ self.status = _UNKNOWN # Status-Code
+ self.reason = _UNKNOWN # Reason-Phrase
+
+ self.chunked = _UNKNOWN # is "chunked" being used?
+ self.chunk_left = _UNKNOWN # bytes left to read in current chunk
+ self.length = _UNKNOWN # number of bytes left in response
+ self.will_close = _UNKNOWN # conn will close at end of response
+
+ def _read_status(self):
+ # Initialize with Simple-Response defaults
+ line = self.fp.readline()
+ if self.debuglevel > 0:
+ _log.info("reply:", repr(line))
+ if not line:
+ # Presumably, the server closed the connection before
+ # sending a valid response.
+ raise BadStatusLine(line)
+ try:
+ [version, status, reason] = line.split(None, 2)
+ except ValueError:
+ try:
+ [version, status] = line.split(None, 1)
+ reason = ""
+ except ValueError:
+ # empty version will cause next test to fail and status
+ # will be treated as 0.9 response.
+ version = ""
+ if not version.startswith('HTTP/'):
+ if self.strict:
+ self.close()
+ raise BadStatusLine(line)
+ else:
+ # assume it's a Simple-Response from an 0.9 server
+ self.fp = LineAndFileWrapper(line, self.fp)
+ return "HTTP/0.9", 200, ""
+
+ # The status code is a three-digit number
+ try:
+ status = int(status)
+ if status < 100 or status > 999:
+ raise BadStatusLine(line)
+ except ValueError:
+ raise BadStatusLine(line)
+ return version, status, reason
+
+ def begin(self):
+ if self.msg is not None:
+ # we've already started reading the response
+ return
+
+ # read until we get a non-100 response
+ while True:
+ version, status, reason = self._read_status()
+ if status != CONTINUE:
+ break
+ # skip the header from the 100 response
+ while True:
+ skip = self.fp.readline().strip()
+ if not skip:
+ break
+                if self.debuglevel > 0:
+                    # TO DO: log an error before raising an exception so
+                    # users can see what went wrong instead of the program
+                    # just recovering silently
+                    _log.info("header: %s", skip)
+
+ self.status = status
+ self.reason = reason.strip()
+ if version == 'HTTP/1.0':
+ self.version = 10
+ elif version.startswith('HTTP/1.'):
+ self.version = 11 # use HTTP/1.1 code for HTTP/1.x where x>=1
+ elif version == 'HTTP/0.9':
+ self.version = 9
+ else:
+ raise UnknownProtocol(version)
+
+ if self.version == 9:
+ self.length = None
+ self.chunked = 0
+ self.will_close = 1
+ self.msg = HTTPMessage(StringIO())
+ return
+
+ self.msg = HTTPMessage(self.fp, 0)
+ if self.debuglevel > 0:
+            for hdr in self.msg.headers:
+                # TO DO: log an error before raising an exception so users
+                # can see what went wrong instead of the program just
+                # recovering silently
+                _log.info("header: %s", hdr)
+
+ # don't let the msg keep an fp
+ self.msg.fp = None
+
+ # are we using the chunked-style of transfer encoding?
+ tr_enc = self.msg.getheader('transfer-encoding')
+ if tr_enc and tr_enc.lower() == "chunked":
+ self.chunked = 1
+ self.chunk_left = None
+ else:
+ self.chunked = 0
+
+ # will the connection close at the end of the response?
+ self.will_close = self._check_close()
+
+ # do we have a Content-Length?
+ # NOTE: RFC 2616, S4.4, #3 says we ignore this if tr_enc is "chunked"
+ length = self.msg.getheader('content-length')
+ if length and not self.chunked:
+ try:
+ self.length = int(length)
+ except ValueError:
+ self.length = None
+ else:
+ self.length = None
+
+ # does the body have a fixed length? (of zero)
+ if (status == NO_CONTENT or status == NOT_MODIFIED or
+ 100 <= status < 200 or # 1xx codes
+ self._method == 'HEAD'):
+ self.length = 0
+
+ # if the connection remains open, and we aren't using chunked, and
+ # a content-length was not provided, then assume that the connection
+ # WILL close.
+ if not self.will_close and \
+ not self.chunked and \
+ self.length is None:
+ self.will_close = 1
+
+ def _check_close(self):
+ conn = self.msg.getheader('connection')
+ if self.version == 11:
+ # An HTTP/1.1 proxy is assumed to stay open unless
+ # explicitly closed.
+ conn = self.msg.getheader('connection')
+ if conn and "close" in conn.lower():
+ return True
+ return False
+
+ # Some HTTP/1.0 implementations have support for persistent
+ # connections, using rules different than HTTP/1.1.
+
+        # For older HTTP, Keep-Alive indicates a persistent connection.
+ if self.msg.getheader('keep-alive'):
+ return False
+
+ # At least Akamai returns a "Connection: Keep-Alive" header,
+ # which was supposed to be sent by the client.
+ if conn and "keep-alive" in conn.lower():
+ return False
+
+ # Proxy-Connection is a netscape hack.
+ pconn = self.msg.getheader('proxy-connection')
+ if pconn and "keep-alive" in pconn.lower():
+ return False
+
+ # otherwise, assume it will close
+ return True
+
+ def close(self):
+ if self.fp:
+ self.fp.close()
+ self.fp = None
+
+ def isclosed(self):
+ # NOTE: it is possible that we will not ever call self.close(). This
+ # case occurs when will_close is TRUE, length is None, and we
+ # read up to the last byte, but NOT past it.
+ #
+ # IMPLIES: if will_close is FALSE, then self.close() will ALWAYS be
+ # called, meaning self.isclosed() is meaningful.
+ return self.fp is None
+
+ # XXX It would be nice to have readline and __iter__ for this, too.
+
+ def read(self, amt=None):
+ if self.fp is None:
+ return ''
+
+ if self.chunked:
+ return self._read_chunked(amt)
+
+ if amt is None:
+ # unbounded read
+ if self.length is None:
+ s = self.fp.read()
+ else:
+ s = self._safe_read(self.length)
+ self.length = 0
+ self.close() # we read everything
+ return s
+
+ if self.length is not None:
+ if amt > self.length:
+ # clip the read to the "end of response"
+ amt = self.length
+
+ # we do not use _safe_read() here because this may be a .will_close
+ # connection, and the user is reading more bytes than will be provided
+ # (for example, reading in 1k chunks)
+ s = self.fp.read(amt)
+ if self.length is not None:
+ self.length -= len(s)
+
+ return s
+
+ def _read_chunked(self, amt):
+ assert self.chunked != _UNKNOWN
+ chunk_left = self.chunk_left
+ value = ''
+
+ # XXX This accumulates chunks by repeated string concatenation,
+ # which is not efficient as the number or size of chunks gets big.
+ while True:
+ if chunk_left is None:
+ line = self.fp.readline()
+ i = line.find(';')
+ if i >= 0:
+ line = line[:i] # strip chunk-extensions
+ chunk_left = int(line, 16)
+ if chunk_left == 0:
+ break
+ if amt is None:
+ value += self._safe_read(chunk_left)
+ elif amt < chunk_left:
+ value += self._safe_read(amt)
+ self.chunk_left = chunk_left - amt
+ return value
+ elif amt == chunk_left:
+ value += self._safe_read(amt)
+ self._safe_read(2) # toss the CRLF at the end of the chunk
+ self.chunk_left = None
+ return value
+ else:
+ value += self._safe_read(chunk_left)
+ amt -= chunk_left
+
+ # we read the whole chunk, get another
+ self._safe_read(2) # toss the CRLF at the end of the chunk
+ chunk_left = None
+
+ # read and discard trailer up to the CRLF terminator
+ ### note: we shouldn't have any trailers!
+ while True:
+ line = self.fp.readline()
+ if line == '\r\n':
+ break
+
+ # we read everything; close the "file"
+ self.close()
+
+ return value
+
+ def _safe_read(self, amt):
+ """Read the number of bytes requested, compensating for partial reads.
+
+ Normally, we have a blocking socket, but a read() can be interrupted
+ by a signal (resulting in a partial read).
+
+ Note that we cannot distinguish between EOF and an interrupt when zero
+ bytes have been read. IncompleteRead() will be raised in this
+ situation.
+
+ This function should be used when bytes "should" be present for
+ reading. If the bytes are truly not available (due to EOF), then the
+ IncompleteRead exception can be used to detect the problem.
+ """
+ s = []
+ while amt > 0:
+ chunk = self.fp.read(min(amt, MAXAMOUNT))
+ if not chunk:
+ raise IncompleteRead(s)
+ s.append(chunk)
+ amt -= len(chunk)
+ return ''.join(s)
+
+ def getheader(self, name, default=None):
+ if self.msg is None:
+ raise ResponseNotReady()
+ return self.msg.getheader(name, default)
+
+ def getheaders(self):
+ """Return list of (header, value) tuples."""
+ if self.msg is None:
+ raise ResponseNotReady()
+ return self.msg.items()
+
+
+class HTTPConnection:
+
+ _http_vsn = 11
+ _http_vsn_str = 'HTTP/1.1'
+
+ response_class = HTTPResponse
+ default_port = HTTP_PORT
+ auto_open = 1
+ debuglevel = 0
+ strict = 0
+
+ def __init__(self, host, port=None, strict=None):
+ self.sock = None
+ self._buffer = []
+ self.__response = None
+ self.__state = _CS_IDLE
+ self._method = None
+
+ self._set_hostport(host, port)
+ if strict is not None:
+ self.strict = strict
+
+ def _set_hostport(self, host, port):
+ if port is None:
+ i = host.rfind(':')
+ j = host.rfind(']') # ipv6 addresses have [...]
+ if i > j:
+ try:
+ port = int(host[i+1:])
+ except ValueError:
+ raise InvalidURL("nonnumeric port: '%s'" % host[i+1:])
+ host = host[:i]
+ else:
+ port = self.default_port
+ if host and host[0] == '[' and host[-1] == ']':
+ host = host[1:-1]
+ self.host = host
+ self.port = port
+
+ def set_debuglevel(self, level):
+ self.debuglevel = level
+
+ def connect(self):
+ """Connect to the host and port specified in __init__."""
+ msg = "getaddrinfo returns an empty list"
+ for res in socket.getaddrinfo(self.host, self.port, 0,
+ socket.SOCK_STREAM):
+ af, socktype, proto, canonname, sa = res
+ try:
+ self.sock = socket.socket(af, socktype, proto)
+ if self.debuglevel > 0:
+ _log.info("connect: (%s, %s)" % (self.host, self.port)) # SoC
+ self.sock.connect(sa)
+ except socket.error, msg:
+ if self.debuglevel > 0:
+                    _log.info("connect fail: (%s, %s)", self.host, self.port)  # SoC
+ if self.sock:
+ self.sock.close()
+ self.sock = None
+ continue
+ break
+ if not self.sock:
+ raise socket.error, msg
+
+ def close(self):
+ """Close the connection to the HTTP server."""
+ if self.sock:
+ self.sock.close() # close it manually... there may be other refs
+ self.sock = None
+ if self.__response:
+ self.__response.close()
+ self.__response = None
+ self.__state = _CS_IDLE
+
+ def send(self, str):
+ """Send `str' to the server."""
+ if self.sock is None:
+ if self.auto_open:
+ self.connect()
+ else:
+ raise NotConnected()
+
+ # send the data to the server. if we get a broken pipe, then close
+ # the socket. we want to reconnect when somebody tries to send again.
+ #
+ # NOTE: we DO propagate the error, though, because we cannot simply
+ # ignore the error... the caller will know if they can retry.
+ if self.debuglevel > 0:
+ _log("send:", repr(str)) # SoC
+ try:
+ self.sock.sendall(str)
+ except socket.error, v:
+ if v[0] == 32: # Broken pipe
+ self.close()
+ raise
+
+ def _output(self, s):
+ """Add a line of output to the current request buffer.
+
+ Assumes that the line does *not* end with \\r\\n.
+ """
+ self._buffer.append(s)
+
+ def _send_output(self):
+ """Send the currently buffered request and clear the buffer.
+
+ Appends an extra \\r\\n to the buffer.
+ """
+ self._buffer.extend(("", ""))
+ msg = "\r\n".join(self._buffer)
+ del self._buffer[:]
+ self.send(msg)
+
+ def putrequest(self, method, url, skip_host=0, skip_accept_encoding=0):
+ """Send a request to the server.
+
+ `method' specifies an HTTP request method, e.g. 'GET'.
+ `url' specifies the object being requested, e.g. '/index.html'.
+ `skip_host' if True does not add automatically a 'Host:' header
+ `skip_accept_encoding' if True does not add automatically an
+ 'Accept-Encoding:' header
+ """
+
+ # if a prior response has been completed, then forget about it.
+ if self.__response and self.__response.isclosed():
+ self.__response = None
+
+
+ # in certain cases, we cannot issue another request on this connection.
+ # this occurs when:
+ # 1) we are in the process of sending a request. (_CS_REQ_STARTED)
+ # 2) a response to a previous request has signalled that it is going
+ # to close the connection upon completion.
+ # 3) the headers for the previous response have not been read, thus
+ # we cannot determine whether point (2) is true. (_CS_REQ_SENT)
+ #
+ # if there is no prior response, then we can request at will.
+ #
+ # if point (2) is true, then we will have passed the socket to the
+ # response (effectively meaning, "there is no prior response"), and
+ # will open a new one when a new request is made.
+ #
+ # Note: if a prior response exists, then we *can* start a new request.
+ # We are not allowed to begin fetching the response to this new
+ # request, however, until that prior response is complete.
+ #
+ if self.__state == _CS_IDLE:
+ self.__state = _CS_REQ_STARTED
+ else:
+ raise CannotSendRequest()
+
+ # Save the method we use, we need it later in the response phase
+ self._method = method
+ if not url:
+ url = '/'
+ str = '%s %s %s' % (method, url, self._http_vsn_str)
+
+ self._output(str)
+
+ if self._http_vsn == 11:
+ # Issue some standard headers for better HTTP/1.1 compliance
+
+ if not skip_host:
+ # this header is issued *only* for HTTP/1.1
+ # connections. more specifically, this means it is
+ # only issued when the client uses the new
+ # HTTPConnection() class. backwards-compat clients
+ # will be using HTTP/1.0 and those clients may be
+ # issuing this header themselves. we should NOT issue
+ # it twice; some web servers (such as Apache) barf
+ # when they see two Host: headers
+
+                # If we need a non-standard port, include it in the
+                # header.  If the request is going through a proxy,
+                # use the host of the actual URL, not the host of the
+                # proxy.
+
+ netloc = ''
+ if url.startswith('http'):
+ nil, netloc, nil, nil, nil = urlsplit(url)
+
+ if netloc:
+ try:
+ netloc_enc = netloc.encode("ascii")
+ except UnicodeEncodeError:
+ netloc_enc = netloc.encode("idna")
+ self.putheader('Host', netloc_enc)
+ else:
+ try:
+ host_enc = self.host.encode("ascii")
+ except UnicodeEncodeError:
+ host_enc = self.host.encode("idna")
+ if self.port == HTTP_PORT:
+ self.putheader('Host', host_enc)
+ else:
+ self.putheader('Host', "%s:%s" % (host_enc, self.port))
+
+ # note: we are assuming that clients will not attempt to set these
+ # headers since *this* library must deal with the
+ # consequences. this also means that when the supporting
+ # libraries are updated to recognize other forms, then this
+ # code should be changed (removed or updated).
+
+ # we only want a Content-Encoding of "identity" since we don't
+ # support encodings such as x-gzip or x-deflate.
+ if not skip_accept_encoding:
+ self.putheader('Accept-Encoding', 'identity')
+
+ # we can accept "chunked" Transfer-Encodings, but no others
+ # NOTE: no TE header implies *only* "chunked"
+ #self.putheader('TE', 'chunked')
+
+ # if TE is supplied in the header, then it must appear in a
+ # Connection header.
+ #self.putheader('Connection', 'TE')
+
+ else:
+ # For HTTP/1.0, the server will assume "not chunked"
+ pass
+
+ def putheader(self, header, value):
+ """Send a request header line to the server.
+
+ For example: h.putheader('Accept', 'text/html')
+ """
+ if self.__state != _CS_REQ_STARTED:
+ raise CannotSendHeader()
+
+ str = '%s: %s' % (header, value)
+ self._output(str)
+
+ def endheaders(self):
+ """Indicate that the last header line has been sent to the server."""
+
+ if self.__state == _CS_REQ_STARTED:
+ self.__state = _CS_REQ_SENT
+ else:
+ raise CannotSendHeader()
+
+ self._send_output()
+
+ def request(self, method, url, body=None, headers={}):
+ """Send a complete request to the server."""
+
+ try:
+ self._send_request(method, url, body, headers)
+ except socket.error, v:
+ # trap 'Broken pipe' if we're allowed to automatically reconnect
+ if v[0] != 32 or not self.auto_open:
+ raise
+ # try one more time
+ self._send_request(method, url, body, headers)
+
+ def _send_request(self, method, url, body, headers):
+ # honour explicitly requested Host: and Accept-Encoding headers
+ header_names = dict.fromkeys([k.lower() for k in headers])
+ skips = {}
+ if 'host' in header_names:
+ skips['skip_host'] = 1
+ if 'accept-encoding' in header_names:
+ skips['skip_accept_encoding'] = 1
+
+ self.putrequest(method, url, **skips)
+
+ if body and ('content-length' not in header_names):
+ self.putheader('Content-Length', str(len(body)))
+ for hdr, value in headers.iteritems():
+ self.putheader(hdr, value)
+ self.endheaders()
+
+ if body:
+ self.send(body)
+
+ def getresponse(self):
+ "Get the response from the server."
+
+ # if a prior response has been completed, then forget about it.
+ if self.__response and self.__response.isclosed():
+ self.__response = None
+
+ #
+ # if a prior response exists, then it must be completed (otherwise, we
+ # cannot read this response's header to determine the connection-close
+ # behavior)
+ #
+ # note: if a prior response existed, but was connection-close, then the
+ # socket and response were made independent of this HTTPConnection
+ # object since a new request requires that we open a whole new
+ # connection
+ #
+ # this means the prior response had one of two states:
+ # 1) will_close: this connection was reset and the prior socket and
+ # response operate independently
+ # 2) persistent: the response was retained and we await its
+ # isclosed() status to become true.
+ #
+ if self.__state != _CS_REQ_SENT or self.__response:
+ raise ResponseNotReady()
+
+ if self.debuglevel > 0:
+ response = self.response_class(self.sock, self.debuglevel,
+ strict=self.strict,
+ method=self._method)
+ else:
+ response = self.response_class(self.sock, strict=self.strict,
+ method=self._method)
+
+ response.begin()
+ assert response.will_close != _UNKNOWN
+ self.__state = _CS_IDLE
+
+ if response.will_close:
+ # this effectively passes the connection to the response
+ self.close()
+ else:
+ # remember this, so we can tell when it is complete
+ self.__response = response
+
+ return response
+
+# The next several classes are used to define FakeSocket, a socket-like
+# interface to an SSL connection.
+
+# The primary complexity comes from faking a makefile() method. The
+# standard socket makefile() implementation calls dup() on the socket
+# file descriptor. As a consequence, clients can call close() on the
+# parent socket and its makefile children in any order. The underlying
+# socket isn't closed until they are all closed.
+
+# The implementation uses reference counting to keep the socket open
+# until the last client calls close(). SharedSocket keeps track of
+# the reference counting and SharedSocketClient provides a constructor
+# and close() method that call incref() and decref() correctly.
+
+class SharedSocket:
+
+ def __init__(self, sock):
+ self.sock = sock
+ self._refcnt = 0
+
+ def incref(self):
+ self._refcnt += 1
+
+ def decref(self):
+ self._refcnt -= 1
+ assert self._refcnt >= 0
+ if self._refcnt == 0:
+ self.sock.close()
+
+ def __del__(self):
+ self.sock.close()
+
+class SharedSocketClient:
+
+ def __init__(self, shared):
+ self._closed = 0
+ self._shared = shared
+ self._shared.incref()
+ self._sock = shared.sock
+
+ def close(self):
+ if not self._closed:
+ self._shared.decref()
+ self._closed = 1
+ self._shared = None
+
+class SSLFile(SharedSocketClient):
+ """File-like object wrapping an SSL socket."""
+
+ BUFSIZE = 8192
+
+ def __init__(self, sock, ssl, bufsize=None):
+ SharedSocketClient.__init__(self, sock)
+ self._ssl = ssl
+ self._buf = ''
+ self._bufsize = bufsize or self.__class__.BUFSIZE
+
+ def _read(self):
+ buf = ''
+ # put in a loop so that we retry on transient errors
+ while True:
+ try:
+ buf = self._ssl.read(self._bufsize)
+ except socket.sslerror, err:
+ if (err[0] == socket.SSL_ERROR_WANT_READ
+ or err[0] == socket.SSL_ERROR_WANT_WRITE):
+ continue
+ if (err[0] == socket.SSL_ERROR_ZERO_RETURN
+ or err[0] == socket.SSL_ERROR_EOF):
+ break
+ raise
+ except socket.error, err:
+ if err[0] == errno.EINTR:
+ continue
+ if err[0] == errno.EBADF:
+ # XXX socket was closed?
+ break
+ raise
+ else:
+ break
+ return buf
+
+ def read(self, size=None):
+ L = [self._buf]
+ avail = len(self._buf)
+ while size is None or avail < size:
+ s = self._read()
+ if s == '':
+ break
+ L.append(s)
+ avail += len(s)
+ all = "".join(L)
+ if size is None:
+ self._buf = ''
+ return all
+ else:
+ self._buf = all[size:]
+ return all[:size]
+
+ def readline(self):
+ L = [self._buf]
+ self._buf = ''
+ while 1:
+ i = L[-1].find("\n")
+ if i >= 0:
+ break
+ s = self._read()
+ if s == '':
+ break
+ L.append(s)
+ if i == -1:
+ # loop exited because there is no more data
+ return "".join(L)
+ else:
+ all = "".join(L)
+ # XXX could do enough bookkeeping not to do a 2nd search
+ i = all.find("\n") + 1
+ line = all[:i]
+ self._buf = all[i:]
+ return line
+
+ def readlines(self, sizehint=0):
+ total = 0
+ list = []
+ while True:
+ line = self.readline()
+ if not line:
+ break
+ list.append(line)
+ total += len(line)
+ if sizehint and total >= sizehint:
+ break
+ return list
+
+ def fileno(self):
+ return self._sock.fileno()
+
+ def __iter__(self):
+ return self
+
+ def next(self):
+ line = self.readline()
+ if not line:
+ raise StopIteration
+ return line
+
+class FakeSocket(SharedSocketClient):
+
+ class _closedsocket:
+ def __getattr__(self, name):
+ raise error(9, 'Bad file descriptor')
+
+ def __init__(self, sock, ssl):
+ sock = SharedSocket(sock)
+ SharedSocketClient.__init__(self, sock)
+ self._ssl = ssl
+
+ def close(self):
+ SharedSocketClient.close(self)
+ self._sock = self.__class__._closedsocket()
+
+ def makefile(self, mode, bufsize=None):
+ if mode != 'r' and mode != 'rb':
+ raise UnimplementedFileMode()
+ return SSLFile(self._shared, self._ssl, bufsize)
+
+ def send(self, stuff, flags = 0):
+ return self._ssl.write(stuff)
+
+ sendall = send
+
+ def recv(self, len = 1024, flags = 0):
+ return self._ssl.read(len)
+
+ def __getattr__(self, attr):
+ return getattr(self._sock, attr)
+
+
+class HTTPSConnection(HTTPConnection):
+ "This class allows communication via SSL."
+
+ default_port = HTTPS_PORT
+
+ def __init__(self, host, port=None, key_file=None, cert_file=None,
+ strict=None):
+ HTTPConnection.__init__(self, host, port, strict)
+ self.key_file = key_file
+ self.cert_file = cert_file
+
+ def connect(self):
+ "Connect to a host on a given (SSL) port."
+
+ sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
+ sock.connect((self.host, self.port))
+ ssl = socket.ssl(sock, self.key_file, self.cert_file)
+ self.sock = FakeSocket(sock, ssl)
+
+
+class HTTP:
+ "Compatibility class with httplib.py from 1.5."
+
+ _http_vsn = 10
+ _http_vsn_str = 'HTTP/1.0'
+
+ debuglevel = 0
+
+ _connection_class = HTTPConnection
+
+ def __init__(self, host='', port=None, strict=None):
+ "Provide a default host, since the superclass requires one."
+
+ # some joker passed 0 explicitly, meaning default port
+ if port == 0:
+ port = None
+
+ # Note that we may pass an empty string as the host; this will throw
+ # an error when we attempt to connect. Presumably, the client code
+ # will call connect before then, with a proper host.
+ self._setup(self._connection_class(host, port, strict))
+
+ def _setup(self, conn):
+ self._conn = conn
+
+ # set up delegation to flesh out interface
+ self.send = conn.send
+ self.putrequest = conn.putrequest
+ self.endheaders = conn.endheaders
+ self.set_debuglevel = conn.set_debuglevel
+
+ conn._http_vsn = self._http_vsn
+ conn._http_vsn_str = self._http_vsn_str
+
+ self.file = None
+
+ def connect(self, host=None, port=None):
+ "Accept arguments to set the host/port, since the superclass doesn't."
+
+ if host is not None:
+ self._conn._set_hostport(host, port)
+ self._conn.connect()
+
+ def getfile(self):
+ "Provide a getfile, since the superclass' does not use this concept."
+ return self.file
+
+ def putheader(self, header, *values):
+ "The superclass allows only one value argument."
+ self._conn.putheader(header, '\r\n\t'.join(values))
+
+ def getreply(self):
+ """Compat definition since superclass does not define it.
+
+ Returns a tuple consisting of:
+ - server status code (e.g. '200' if all goes well)
+ - server "reason" corresponding to status code
+ - any RFC822 headers in the response from the server
+ """
+ try:
+ response = self._conn.getresponse()
+ except BadStatusLine, e:
+ ### hmm. if getresponse() ever closes the socket on a bad request,
+ ### then we are going to have problems with self.sock
+
+ ### should we keep this behavior? do people use it?
+ # keep the socket open (as a file), and return it
+ self.file = self._conn.sock.makefile('rb', 0)
+
+ # close our socket -- we want to restart after any protocol error
+ self.close()
+
+ self.headers = None
+ return -1, e.line, None
+
+ self.headers = response.msg
+ self.file = response.fp
+ return response.status, response.reason, response.msg
+
+ def close(self):
+ self._conn.close()
+
+ # note that self.file == response.fp, which gets closed by the
+ # superclass. just clear the object ref here.
+ ### hmm. messy. if status==-1, then self.file is owned by us.
+ ### well... we aren't explicitly closing, but losing this ref will
+ ### do it
+ self.file = None
+
+if hasattr(socket, 'ssl'):
+ class HTTPS(HTTP):
+ """Compatibility with 1.5 httplib interface
+
+ Python 1.5.2 did not have an HTTPS class, but it defined an
+ interface for sending http requests that is also useful for
+ https.
+ """
+
+ _connection_class = HTTPSConnection
+
+ def __init__(self, host='', port=None, key_file=None, cert_file=None,
+ strict=None):
+ # provide a default host, pass the X509 cert info
+
+ # urf. compensate for bad input.
+ if port == 0:
+ port = None
+ self._setup(self._connection_class(host, port, key_file,
+ cert_file, strict))
+
+ # we never actually use these for anything, but we keep them
+ # here for compatibility with post-1.5.2 CVS.
+ self.key_file = key_file
+ self.cert_file = cert_file
+
+
+class HTTPException(Exception):
+ # Subclasses that define an __init__ must call Exception.__init__
+ # or define self.args. Otherwise, str() will fail.
+ pass
+
+class NotConnected(HTTPException):
+ pass
+
+class InvalidURL(HTTPException):
+ pass
+
+class UnknownProtocol(HTTPException):
+ def __init__(self, version):
+ self.args = version,
+ self.version = version
+
+class UnknownTransferEncoding(HTTPException):
+ pass
+
+class UnimplementedFileMode(HTTPException):
+ pass
+
+class IncompleteRead(HTTPException):
+ def __init__(self, partial):
+ self.args = partial,
+ self.partial = partial
+
+class ImproperConnectionState(HTTPException):
+ pass
+
+class CannotSendRequest(ImproperConnectionState):
+ pass
+
+class CannotSendHeader(ImproperConnectionState):
+ pass
+
+class ResponseNotReady(ImproperConnectionState):
+ pass
+
+class BadStatusLine(HTTPException):
+ def __init__(self, line):
+ self.args = line,
+ self.line = line
+
+# for backwards compatibility
+error = HTTPException
+
+class LineAndFileWrapper:
+ """A limited file-like object for HTTP/0.9 responses."""
+
+ # The status-line parsing code calls readline(), which normally
+    # gets the HTTP status line.  For a 0.9 response, however, this is
+ # actually the first line of the body! Clients need to get a
+ # readable file object that contains that line.
+
+ def __init__(self, line, file):
+ self._line = line
+ self._file = file
+ self._line_consumed = 0
+ self._line_offset = 0
+ self._line_left = len(line)
+
+ def __getattr__(self, attr):
+ return getattr(self._file, attr)
+
+ def _done(self):
+ # called when the last byte is read from the line. After the
+ # call, all read methods are delegated to the underlying file
+ # object.
+ self._line_consumed = 1
+ self.read = self._file.read
+ self.readline = self._file.readline
+ self.readlines = self._file.readlines
+
+ def read(self, amt=None):
+ if self._line_consumed:
+ return self._file.read(amt)
+ assert self._line_left
+ if amt is None or amt > self._line_left:
+ s = self._line[self._line_offset:]
+ self._done()
+ if amt is None:
+ return s + self._file.read()
+ else:
+ return s + self._file.read(amt - len(s))
+ else:
+ assert amt <= self._line_left
+ i = self._line_offset
+ j = i + amt
+ s = self._line[i:j]
+ self._line_offset = j
+ self._line_left -= amt
+ if self._line_left == 0:
+ self._done()
+ return s
+
+ def readline(self):
+ if self._line_consumed:
+ return self._file.readline()
+ assert self._line_left
+ s = self._line[self._line_offset:]
+ self._done()
+ return s
+
+ def readlines(self, size=None):
+ if self._line_consumed:
+ return self._file.readlines(size)
+ assert self._line_left
+ L = [self._line[self._line_offset:]]
+ self._done()
+ if size is None:
+ return L + self._file.readlines()
+ else:
+ return L + self._file.readlines(size)
+
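To make the HTTP/0.9 hand-off described above concrete, here is a small
sketch of how LineAndFileWrapper behaves (the strings are made up):

    from StringIO import StringIO
    f = LineAndFileWrapper('first line of the 0.9 body\r\n',
                           StringIO('rest of the body'))
    # readline() first returns the line that _read_status() already consumed
    f.readline()   # 'first line of the 0.9 body\r\n'
    # after that, all reads are delegated to the underlying file object
    f.read()       # 'rest of the body'
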
+def test():
+ """Test this module.
+
+ A hodge podge of tests collected here, because they have too many
+ external dependencies for the regular test suite.
+ """
+
+ import sys
+ import getopt
+ opts, args = getopt.getopt(sys.argv[1:], 'd')
+ dl = 0
+ for o, a in opts:
+ if o == '-d': dl = dl + 1
+ host = 'www.python.org'
+ selector = '/'
+ if args[0:]: host = args[0]
+ if args[1:]: selector = args[1]
+ h = HTTP()
+ h.set_debuglevel(dl)
+ h.connect(host)
+ h.putrequest('GET', selector)
+ h.endheaders()
+ status, reason, headers = h.getreply()
+ print 'status =', status
+ print 'reason =', reason
+ print "read", len(h.getfile().read())
+ print
+ if headers:
+ for header in headers.headers: print header.strip()
+ print
+
+ # minimal test that code to extract host from url works
+ class HTTP11(HTTP):
+ _http_vsn = 11
+ _http_vsn_str = 'HTTP/1.1'
+
+ h = HTTP11('www.python.org')
+ h.putrequest('GET', 'http://www.python.org/~jeremy/')
+ h.endheaders()
+ h.getreply()
+ h.close()
+
+ if hasattr(socket, 'ssl'):
+
+ for host, selector in (('sourceforge.net', '/projects/python'),
+ ):
+ print "https://%s%s" % (host, selector)
+ hs = HTTPS()
+ hs.set_debuglevel(dl)
+ hs.connect(host)
+ hs.putrequest('GET', selector)
+ hs.endheaders()
+ status, reason, headers = hs.getreply()
+ print 'status =', status
+ print 'reason =', reason
+ print "read", len(hs.getfile().read())
+ print
+ if headers:
+ for header in headers.headers: print header.strip()
+ print
+
+if __name__ == '__main__':
+ test()
From buildbot at python.org Thu Jul 6 08:07:02 2006
From: buildbot at python.org (buildbot at python.org)
Date: Thu, 06 Jul 2006 06:07:02 +0000
Subject: [Python-checkins] buildbot warnings in ia64 Debian unstable trunk
Message-ID: <20060706060702.4F4931E4003@bag.python.org>
The Buildbot has detected a new failure of ia64 Debian unstable trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/ia64%2520Debian%2520unstable%2520trunk/builds/826
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: fred.drake
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Thu Jul 6 08:14:30 2006
From: python-checkins at python.org (neal.norwitz)
Date: Thu, 6 Jul 2006 08:14:30 +0200 (CEST)
Subject: [Python-checkins] r47256 - peps/trunk/pep-0361.txt
Message-ID: <20060706061430.3837C1E4003@bag.python.org>
Author: neal.norwitz
Date: Thu Jul 6 08:14:29 2006
New Revision: 47256
Modified:
peps/trunk/pep-0361.txt
Log:
Add some carry overs from 2.5 (add dashes in front of all items).
Modified: peps/trunk/pep-0361.txt
==============================================================================
--- peps/trunk/pep-0361.txt (original)
+++ peps/trunk/pep-0361.txt Thu Jul 6 08:14:29 2006
@@ -85,13 +85,20 @@
http://python.org/sf/1505257
(Owner: MAL)
- Start removing deprecated features and generally moving towards Py3k
+ - Check the various bits of code in Demo/ and Tools/ all still work,
+ update or remove the ones that don't.
- Replace all old style tests (operate on import) with unittest or doctest
+ - All modules in Modules/ should be updated to be ssize_t clean.
- All tests for all untested modules
+ - All of Python (including Modules/) should compile cleanly with g++
- Document undocumented modules/features
+ - Start removing deprecated features and generally moving towards Py3k
+
+ - Replace all old style tests (operate on import) with unittest or doctest
+
+ - All tests for all untested modules
+
+ - Document undocumented modules/features
Deferred until 2.7
From python-checkins at python.org Thu Jul 6 08:45:08 2006
From: python-checkins at python.org (neal.norwitz)
Date: Thu, 6 Jul 2006 08:45:08 +0200 (CEST)
Subject: [Python-checkins] r47257 - python/trunk/Misc/NEWS
Message-ID: <20060706064508.B812F1E4003@bag.python.org>
Author: neal.norwitz
Date: Thu Jul 6 08:45:08 2006
New Revision: 47257
Modified:
python/trunk/Misc/NEWS
Log:
Add a NEWS entry for a recent pyexpat fix
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Thu Jul 6 08:45:08 2006
@@ -83,6 +83,9 @@
- Bug #1512695: cPickle.loads could crash if it was interrupted with
a KeyboardInterrupt.
+- Bug #1296433: parsing XML with a non-default encoding and
+ a CharacterDataHandler could crash the interpreter in pyexpat.
+
Build
-----
From nnorwitz at gmail.com Thu Jul 6 08:51:40 2006
From: nnorwitz at gmail.com (Neal Norwitz)
Date: Wed, 5 Jul 2006 23:51:40 -0700
Subject: [Python-checkins] r47086 - in python/trunk: Misc/NEWS
Python/codecs.c
In-Reply-To: <20060623211619.1DA121E400D@bag.python.org>
References: <20060623211619.1DA121E400D@bag.python.org>
Message-ID:
On 6/23/06, hyeshik.chang wrote:
> Author: hyeshik.chang
> Date: Fri Jun 23 23:16:18 2006
> New Revision: 47086
>
> Modified:
> python/trunk/Misc/NEWS
> python/trunk/Python/codecs.c
> Log:
> Bug #1511381: codec_getstreamcodec() in codec.c is corrected to
> omit a default "error" argument for NULL pointer. This allows
> the parser to take a codec from cjkcodecs again.
> (Reported by Taewook Kang and reviewed by Walter Doerwald)
Can we get a test case for this?
n
From python-checkins at python.org Thu Jul 6 08:55:58 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Thu, 6 Jul 2006 08:55:58 +0200 (CEST)
Subject: [Python-checkins] r47258 - in python/trunk: Misc/NEWS
Tools/msi/msi.py
Message-ID: <20060706065558.BF5CE1E4003@bag.python.org>
Author: martin.v.loewis
Date: Thu Jul 6 08:55:58 2006
New Revision: 47258
Modified:
python/trunk/Misc/NEWS
python/trunk/Tools/msi/msi.py
Log:
Add sqlite3.dll to the DLLs component, not to the TkDLLs component.
Fixes #1517388.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Thu Jul 6 08:55:58 2006
@@ -89,6 +89,9 @@
Build
-----
+- Bug #1517388: sqlite3.dll is now installed on Windows independent
+ of Tcl/Tk.
+
- Bug #1513032: 'make install' failed on FreeBSD 5.3 due to lib-old
trying to be installed even though it's empty.
Modified: python/trunk/Tools/msi/msi.py
==============================================================================
--- python/trunk/Tools/msi/msi.py (original)
+++ python/trunk/Tools/msi/msi.py Thu Jul 6 08:55:58 2006
@@ -962,6 +962,14 @@
continue
dlls.append(f)
lib.add_file(f)
+ # Add sqlite
+ if msilib.msi_type=="Intel64;1033":
+ sqlite_arch = "/ia64"
+ elif msilib.msi_type=="x64;1033":
+ sqlite_arch = "/amd64"
+ else:
+ sqlite_arch = ""
+ lib.add_file(srcdir+"/"+sqlite_dir+sqlite_arch+"/sqlite3.dll")
if have_tcl:
if not os.path.exists(srcdir+"/PCBuild/_tkinter.pyd"):
print "WARNING: Missing _tkinter.pyd"
@@ -972,14 +980,6 @@
tcldir = os.path.normpath(srcdir+"/../tcltk/bin")
for f in glob.glob1(tcldir, "*.dll"):
lib.add_file(f, src=os.path.join(tcldir, f))
- # Add sqlite
- if msilib.msi_type=="Intel64;1033":
- sqlite_arch = "/ia64"
- elif msilib.msi_type=="x64;1033":
- sqlite_arch = "/amd64"
- else:
- sqlite_arch = ""
- lib.add_file(srcdir+"/"+sqlite_dir+sqlite_arch+"/sqlite3.dll")
# check whether there are any unknown extensions
for f in glob.glob1(srcdir+"/PCBuild", "*.pyd"):
if f.endswith("_d.pyd"): continue # debug version
From buildbot at python.org Thu Jul 6 09:01:43 2006
From: buildbot at python.org (buildbot at python.org)
Date: Thu, 06 Jul 2006 07:01:43 +0000
Subject: [Python-checkins] buildbot failure in ia64 Debian unstable trunk
Message-ID: <20060706070143.DADD61E4003@bag.python.org>
The Buildbot has detected a new failure of ia64 Debian unstable trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/ia64%2520Debian%2520unstable%2520trunk/builds/827
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: martin.v.loewis,neal.norwitz
BUILD FAILED: failed svn
sincerely,
-The Buildbot
From python-checkins at python.org Thu Jul 6 09:05:22 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Thu, 6 Jul 2006 09:05:22 +0200 (CEST)
Subject: [Python-checkins] r47259 - in python/trunk: Misc/NEWS
Tools/msi/msi.py
Message-ID: <20060706070522.B88C21E4003@bag.python.org>
Author: martin.v.loewis
Date: Thu Jul 6 09:05:21 2006
New Revision: 47259
Modified:
python/trunk/Misc/NEWS
python/trunk/Tools/msi/msi.py
Log:
Properly quote compileall and Lib paths in case TARGETDIR has a space.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Thu Jul 6 09:05:21 2006
@@ -89,6 +89,9 @@
Build
-----
+- The MSI compileall step was fixed to also support a TARGETDIR
+ with spaces in it.
+
- Bug #1517388: sqlite3.dll is now installed on Windows independent
of Tcl/Tk.
Modified: python/trunk/Tools/msi/msi.py
==============================================================================
--- python/trunk/Tools/msi/msi.py (original)
+++ python/trunk/Tools/msi/msi.py Thu Jul 6 09:05:21 2006
@@ -367,7 +367,7 @@
("VerdanaRed9", "Verdana", 9, 255, 0),
])
- compileargs = r"-Wi [TARGETDIR]Lib\compileall.py -f -x bad_coding|badsyntax|site-packages [TARGETDIR]Lib"
+ compileargs = r'-Wi "[TARGETDIR]Lib\compileall.py" -f -x bad_coding|badsyntax|site-packages "[TARGETDIR]Lib"'
# See "CustomAction Table"
add_data(db, "CustomAction", [
# msidbCustomActionTypeFirstSequence + msidbCustomActionTypeTextData + msidbCustomActionTypeProperty
From theller at python.net Thu Jul 6 09:26:41 2006
From: theller at python.net (Thomas Heller)
Date: Thu, 06 Jul 2006 09:26:41 +0200
Subject: [Python-checkins] r47206 - in python/trunk:
Lib/ctypes/test/test_win32.py Modules/_ctypes/callproc.c
In-Reply-To:
References: <20060703080815.22FA41E4008@bag.python.org>
Message-ID: <44ACBB31.9030902@python.net>
Neal Norwitz wrote:
> On 7/3/06, thomas.heller wrote:
>> Author: thomas.heller
>> Date: Mon Jul 3 10:08:14 2006
>> New Revision: 47206
>>
>> Modified:
>> python/trunk/Lib/ctypes/test/test_win32.py
>> python/trunk/Modules/_ctypes/callproc.c
>> Log:
>> Add a new function uses_seh() to the _ctypes extension module. This
>> will return True if Windows Structured Exception handling (SEH) is
>> used when calling functions, False otherwise.
>>
>> Currently, only MSVC supports SEH.
>>
>> Fix the test so that it doesn't crash when run with MingW compiled
>> _ctypes. Note that two tests are still failing when mingw is used, I
>> suspect structure layout differences and function calling conventions
>> between MSVC and MingW.
>
> This seems like a really minor/limited feature. Can't this wait for
> 2.6? I would prefer this change was reverted.
This feature only provides Python-level access to the value of the #define
DONT_USE_SEH symbol used when compiling, which is set depending on whether
mingw or msvc is used.  I can revert the patch, but then I would like to
change Lib/ctypes/test/test_win32.py so that the test_SEH test case is skipped
by default:
Index: test_win32.py
===================================================================
--- test_win32.py (Revision 47206)
+++ test_win32.py (Arbeitskopie)
@@ -30,8 +30,7 @@
# or wrong calling convention
self.assertRaises(ValueError, IsWindow, None)
- import _ctypes
- if _ctypes.uses_seh():
+ if is_resource_enabled("SEH"):
def test_SEH(self):
# Call functions with invalid arguments, and make sure that access violations
# are trapped and raise an exception.
Thomas
From nnorwitz at gmail.com Thu Jul 6 09:36:10 2006
From: nnorwitz at gmail.com (Neal Norwitz)
Date: Thu, 6 Jul 2006 00:36:10 -0700
Subject: [Python-checkins] r47206 - in python/trunk:
Lib/ctypes/test/test_win32.py Modules/_ctypes/callproc.c
In-Reply-To: <44ACBB31.9030902@python.net>
References: <20060703080815.22FA41E4008@bag.python.org>
<44ACBB31.9030902@python.net>
Message-ID:
On 7/6/06, Thomas Heller wrote:
> Neal Norwitz wrote:
> > On 7/3/06, thomas.heller wrote:
> >> Author: thomas.heller
> >> Date: Mon Jul 3 10:08:14 2006
> >> New Revision: 47206
> >>
> >> Modified:
> >> python/trunk/Lib/ctypes/test/test_win32.py
> >> python/trunk/Modules/_ctypes/callproc.c
> >> Log:
> >> Add a new function uses_seh() to the _ctypes extension module. This
> >> will return True if Windows Structured Exception handling (SEH) is
> >> used when calling functions, False otherwise.
> >>
> >> Currently, only MSVC supports SEH.
> >>
> >> Fix the test so that it doesn't crash when run with MingW compiled
> >> _ctypes. Note that two tests are still failing when mingw is used, I
> >> suspect structure layout differences and function calling conventions
> >> between MSVC and MingW.
> >
> > This seems like a really minor/limited feature. Can't this wait for
> > 2.6? I would prefer this change was reverted.
>
> This feature only provides a way to have access from Python to the value of
> the #define DONT_USE_SEH symbol used when compiling, which gets set depending
> on whether mingw or msvc is used. I can revert the patch, but then I would
> like to change Lib/test/test_win32.py so that the test_SEH testcase is skipped
> by default:
That's fine with me. Thanks.
n
From python-checkins at python.org Thu Jul 6 09:50:19 2006
From: python-checkins at python.org (thomas.heller)
Date: Thu, 6 Jul 2006 09:50:19 +0200 (CEST)
Subject: [Python-checkins] r47260 - in python/trunk:
Lib/ctypes/test/test_win32.py Modules/_ctypes/callproc.c
Message-ID: <20060706075019.45A3E1E4006@bag.python.org>
Author: thomas.heller
Date: Thu Jul 6 09:50:18 2006
New Revision: 47260
Modified:
python/trunk/Lib/ctypes/test/test_win32.py
python/trunk/Modules/_ctypes/callproc.c
Log:
Revert the change done in svn revision 47206:
Add a new function uses_seh() to the _ctypes extension module. This
will return True if Windows Structured Exception handling (SEH) is
used when calling functions, False otherwise.
Modified: python/trunk/Lib/ctypes/test/test_win32.py
==============================================================================
--- python/trunk/Lib/ctypes/test/test_win32.py (original)
+++ python/trunk/Lib/ctypes/test/test_win32.py Thu Jul 6 09:50:18 2006
@@ -30,11 +30,15 @@
# or wrong calling convention
self.assertRaises(ValueError, IsWindow, None)
- import _ctypes
- if _ctypes.uses_seh():
- def test_SEH(self):
- # Call functions with invalid arguments, and make sure that access violations
- # are trapped and raise an exception.
+ def test_SEH(self):
+ # Call functions with invalid arguments, and make sure that access violations
+ # are trapped and raise an exception.
+ #
+ # Normally, in a debug build of the _ctypes extension
+ # module, exceptions are not trapped, so we can only run
+ # this test in a release build.
+ import sys
+ if not hasattr(sys, "getobjects"):
self.assertRaises(WindowsError, windll.kernel32.GetModuleHandleA, 32)
class Structures(unittest.TestCase):
Modified: python/trunk/Modules/_ctypes/callproc.c
==============================================================================
--- python/trunk/Modules/_ctypes/callproc.c (original)
+++ python/trunk/Modules/_ctypes/callproc.c Thu Jul 6 09:50:18 2006
@@ -1526,21 +1526,7 @@
return Py_None;
}
-static PyObject *
-uses_seh(PyObject *self, PyObject *args)
-{
-#if defined(DONT_USE_SEH) || !defined(MS_WIN32)
- Py_INCREF(Py_False);
- return Py_False;
-#else
- Py_INCREF(Py_True);
- return Py_True;
-#endif
-}
-
PyMethodDef module_methods[] = {
- {"uses_seh", uses_seh, METH_NOARGS,
- "Return whether ctypes uses Windows structured exception handling"},
{"resize", resize, METH_VARARGS, "Resize the memory buffer of a ctypes instance"},
#ifdef CTYPES_UNICODE
{"set_conversion_mode", set_conversion_mode, METH_VARARGS, set_conversion_mode_doc},
From python-checkins at python.org Thu Jul 6 09:58:19 2006
From: python-checkins at python.org (armin.rigo)
Date: Thu, 6 Jul 2006 09:58:19 +0200 (CEST)
Subject: [Python-checkins] r47261 -
python/trunk/Lib/test/crashers/borrowed_ref_1.py
python/trunk/Lib/test/crashers/borrowed_ref_2.py
Message-ID: <20060706075819.7332D1E4003@bag.python.org>
Author: armin.rigo
Date: Thu Jul 6 09:58:18 2006
New Revision: 47261
Added:
python/trunk/Lib/test/crashers/borrowed_ref_1.py (contents, props changed)
python/trunk/Lib/test/crashers/borrowed_ref_2.py (contents, props changed)
Log:
A couple of examples about how to attack the fact that _PyType_Lookup()
returns a borrowed ref. Many of the calls are open to attack.
Added: python/trunk/Lib/test/crashers/borrowed_ref_1.py
==============================================================================
--- (empty file)
+++ python/trunk/Lib/test/crashers/borrowed_ref_1.py Thu Jul 6 09:58:18 2006
@@ -0,0 +1,29 @@
+"""
+_PyType_Lookup() returns a borrowed reference.
+This attacks the call in dictobject.c.
+"""
+
+class A(object):
+ pass
+
+class B(object):
+ def __del__(self):
+ print 'hi'
+ del D.__missing__
+
+class D(dict):
+ class __missing__:
+ def __init__(self, *args):
+ pass
+
+
+d = D()
+a = A()
+a.cycle = a
+a.other = B()
+del a
+
+prev = None
+while 1:
+ d[5]
+ prev = (prev,)
Added: python/trunk/Lib/test/crashers/borrowed_ref_2.py
==============================================================================
--- (empty file)
+++ python/trunk/Lib/test/crashers/borrowed_ref_2.py Thu Jul 6 09:58:18 2006
@@ -0,0 +1,38 @@
+"""
+_PyType_Lookup() returns a borrowed reference.
+This attacks PyObject_GenericSetAttr().
+
+NB. on my machine this crashes in 2.5 debug but not release.
+"""
+
+class A(object):
+ pass
+
+class B(object):
+ def __del__(self):
+ print "hi"
+ del C.d
+
+class D(object):
+ def __set__(self, obj, value):
+ self.hello = 42
+
+class C(object):
+ d = D()
+
+ def g():
+ pass
+
+
+c = C()
+a = A()
+a.cycle = a
+a.other = B()
+
+lst = [None] * 1000000
+i = 0
+del a
+while 1:
+ c.d = 42 # segfaults in PyMethod_New(im_func=D.__set__, im_self=d)
+ lst[i] = c.g # consume the free list of instancemethod objects
+ i += 1
From buildbot at python.org Thu Jul 6 10:23:12 2006
From: buildbot at python.org (buildbot at python.org)
Date: Thu, 06 Jul 2006 08:23:12 +0000
Subject: [Python-checkins] buildbot warnings in alpha Debian trunk
Message-ID: <20060706082313.0234E1E4003@bag.python.org>
The Buildbot has detected a new failure of alpha Debian trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/alpha%2520Debian%2520trunk/builds/450
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: fred.drake
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Thu Jul 6 10:28:14 2006
From: python-checkins at python.org (thomas.heller)
Date: Thu, 6 Jul 2006 10:28:14 +0200 (CEST)
Subject: [Python-checkins] r47262 -
python/trunk/Lib/ctypes/test/test_win32.py
Message-ID: <20060706082814.8FA7C1E4003@bag.python.org>
Author: thomas.heller
Date: Thu Jul 6 10:28:14 2006
New Revision: 47262
Modified:
python/trunk/Lib/ctypes/test/test_win32.py
Log:
The test that calls a function with invalid arguments and catches the
resulting Windows access violation will not be run by default.
Modified: python/trunk/Lib/ctypes/test/test_win32.py
==============================================================================
--- python/trunk/Lib/ctypes/test/test_win32.py (original)
+++ python/trunk/Lib/ctypes/test/test_win32.py Thu Jul 6 10:28:14 2006
@@ -1,6 +1,7 @@
# Windows specific tests
from ctypes import *
+from ctypes.test import is_resource_enabled
import unittest, sys
import _ctypes_test
@@ -30,15 +31,10 @@
# or wrong calling convention
self.assertRaises(ValueError, IsWindow, None)
- def test_SEH(self):
- # Call functions with invalid arguments, and make sure that access violations
- # are trapped and raise an exception.
- #
- # Normally, in a debug build of the _ctypes extension
- # module, exceptions are not trapped, so we can only run
- # this test in a release build.
- import sys
- if not hasattr(sys, "getobjects"):
+ if is_resource_enabled("SEH"):
+ def test_SEH(self):
+ # Call functions with invalid arguments, and make sure that access violations
+ # are trapped and raise an exception.
self.assertRaises(WindowsError, windll.kernel32.GetModuleHandleA, 32)
class Structures(unittest.TestCase):
From python-checkins at python.org Thu Jul 6 10:48:37 2006
From: python-checkins at python.org (thomas.heller)
Date: Thu, 6 Jul 2006 10:48:37 +0200 (CEST)
Subject: [Python-checkins] r47263 - in python/trunk:
Lib/ctypes/test/test_parameters.py Misc/NEWS
Modules/_ctypes/_ctypes.c
Message-ID: <20060706084837.755601E4003@bag.python.org>
Author: thomas.heller
Date: Thu Jul 6 10:48:35 2006
New Revision: 47263
Modified:
python/trunk/Lib/ctypes/test/test_parameters.py
python/trunk/Misc/NEWS
python/trunk/Modules/_ctypes/_ctypes.c
Log:
Patch #1517790: It is now possible to use custom objects in the ctypes
foreign function argtypes sequence as long as they provide a
from_param method; it is no longer required that the object be a
ctypes type.
Modified: python/trunk/Lib/ctypes/test/test_parameters.py
==============================================================================
--- python/trunk/Lib/ctypes/test/test_parameters.py (original)
+++ python/trunk/Lib/ctypes/test/test_parameters.py Thu Jul 6 10:48:35 2006
@@ -147,6 +147,41 @@
## def test_performance(self):
## check_perf()
+ def test_noctypes_argtype(self):
+ import _ctypes_test
+ from ctypes import CDLL, c_void_p, ArgumentError
+
+ func = CDLL(_ctypes_test.__file__)._testfunc_p_p
+ func.restype = c_void_p
+ # TypeError: has no from_param method
+ self.assertRaises(TypeError, setattr, func, "argtypes", (object,))
+
+ class Adapter(object):
+ def from_param(cls, obj):
+ return None
+
+ func.argtypes = (Adapter(),)
+ self.failUnlessEqual(func(None), None)
+ self.failUnlessEqual(func(object()), None)
+
+ class Adapter(object):
+ def from_param(cls, obj):
+ return obj
+
+ func.argtypes = (Adapter(),)
+ # don't know how to convert parameter 1
+ self.assertRaises(ArgumentError, func, object())
+ self.failUnlessEqual(func(c_void_p(42)), 42)
+
+ class Adapter(object):
+ def from_param(cls, obj):
+ raise ValueError(obj)
+
+ func.argtypes = (Adapter(),)
+ # ArgumentError: argument 1: ValueError: 99
+ self.assertRaises(ArgumentError, func, 99)
+
+
################################################################
if __name__ == '__main__':
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Thu Jul 6 10:48:35 2006
@@ -25,6 +25,10 @@
Library
-------
+- Patch #1517790: It is now possible to use custom objects in the ctypes
+ foreign function argtypes sequence as long as they provide a from_param
+ method, no longer is it required that the object is a ctypes type.
+
- string.Template() now correctly handles tuple-values. Previously,
multi-value tuples would raise an exception and single-value tuples would
be treated as the value they contain, instead.
Modified: python/trunk/Modules/_ctypes/_ctypes.c
==============================================================================
--- python/trunk/Modules/_ctypes/_ctypes.c (original)
+++ python/trunk/Modules/_ctypes/_ctypes.c Thu Jul 6 10:48:35 2006
@@ -1633,9 +1633,8 @@
for (i = 0; i < nArgs; ++i) {
PyObject *tp = PyTuple_GET_ITEM(ob, i);
- StgDictObject *dict = PyType_stgdict(tp);
PyObject *cnv = PyObject_GetAttrString(tp, "from_param");
- if (!dict || !cnv)
+ if (!cnv)
goto argtypes_error_1;
PyTuple_SET_ITEM(converters, i, cnv);
}
@@ -1646,7 +1645,7 @@
Py_XDECREF(converters);
Py_DECREF(ob);
PyErr_Format(PyExc_TypeError,
- "item %d in _argtypes_ is not a valid C type", i+1);
+ "item %d in _argtypes_ has no from_param method", i+1);
return NULL;
}
From python-checkins at python.org Thu Jul 6 10:58:40 2006
From: python-checkins at python.org (thomas.heller)
Date: Thu, 6 Jul 2006 10:58:40 +0200 (CEST)
Subject: [Python-checkins] r47264 - python/trunk/Doc/lib/libctypes.tex
Message-ID: <20060706085840.CE7511E4003@bag.python.org>
Author: thomas.heller
Date: Thu Jul 6 10:58:40 2006
New Revision: 47264
Modified:
python/trunk/Doc/lib/libctypes.tex
Log:
Document the Structure and Union constructors.
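A minimal illustration of the documented behaviour (the POINT structure is
a made-up example, not part of the checkin):

    from ctypes import Structure, c_int

    class POINT(Structure):
        _fields_ = [("x", c_int), ("y", c_int)]

    a = POINT(1, 2)      # positional: fields filled in _fields_ order
    b = POINT(y=5)       # keyword: assigns the field of that name (x stays 0)
    c = POINT(1, y=2)    # positional and keyword arguments may be mixed
    print a.x, a.y, b.x, b.y, c.x, c.y    # 1 2 0 5 1 2
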
Modified: python/trunk/Doc/lib/libctypes.tex
==============================================================================
--- python/trunk/Doc/lib/libctypes.tex (original)
+++ python/trunk/Doc/lib/libctypes.tex Thu Jul 6 10:58:40 2006
@@ -2389,6 +2389,13 @@
separate \member{{\_}fields{\_}} variable, the fields specified in this are
appended to the fields of the base class.
+Structure and union constructors accept both positional and
+keyword arguments. Positional arguments are used to initialize member
+fields in the same order as they appear in \member{{\_}fields{\_}}. Keyword
+arguments in the constructor are interpreted as attribute assignments,
+so they will initialize \member{{\_}fields{\_}} with the same name, or create new
+attributes for names not present in \member{{\_}fields{\_}}.
+
\subsubsection{Arrays and pointers\label{ctypes-arrays-pointers}}
From python-checkins at python.org Thu Jul 6 11:11:23 2006
From: python-checkins at python.org (thomas.heller)
Date: Thu, 6 Jul 2006 11:11:23 +0200 (CEST)
Subject: [Python-checkins] r47265 - python/trunk/Doc/lib/libctypes.tex
Message-ID: <20060706091123.3C5C81E4005@bag.python.org>
Author: thomas.heller
Date: Thu Jul 6 11:11:22 2006
New Revision: 47265
Modified:
python/trunk/Doc/lib/libctypes.tex
Log:
Document the changes in svn revision 47263, from patch #1517790.
Modified: python/trunk/Doc/lib/libctypes.tex
==============================================================================
--- python/trunk/Doc/lib/libctypes.tex (original)
+++ python/trunk/Doc/lib/libctypes.tex Thu Jul 6 11:11:22 2006
@@ -1648,6 +1648,12 @@
example, a \class{c{\_}char{\_}p} item in the \member{argtypes} tuple will
convert a unicode string passed as argument into a byte string
using ctypes conversion rules.
+
+New: It is now possible to put items in argtypes which are not
+ctypes types, but each item must have a \method{from{\_}param} method
+which returns a value usable as argument (integer, string, ctypes
+instance). This makes it possible to define adapters that adapt custom
+objects as function parameters.
\end{memberdesc}
\begin{memberdesc}{errcheck}
From python-checkins at python.org Thu Jul 6 12:11:03 2006
From: python-checkins at python.org (matt.fleming)
Date: Thu, 6 Jul 2006 12:11:03 +0200 (CEST)
Subject: [Python-checkins] r47266 - sandbox/trunk/pdb/README.txt
sandbox/trunk/pdb/mpdb.py sandbox/trunk/pdb/mthread.py
Message-ID: <20060706101103.C90211E4003@bag.python.org>
Author: matt.fleming
Date: Thu Jul 6 12:11:02 2006
New Revision: 47266
Modified:
sandbox/trunk/pdb/README.txt
sandbox/trunk/pdb/mpdb.py
sandbox/trunk/pdb/mthread.py
Log:
Some changes to the thread code.
Modified: sandbox/trunk/pdb/README.txt
==============================================================================
--- sandbox/trunk/pdb/README.txt (original)
+++ sandbox/trunk/pdb/README.txt Thu Jul 6 12:11:02 2006
@@ -53,4 +53,5 @@
* Run command does not pass arguments across a remote connection, i.e.
`run 2' does not pass the argument 2 to the script.
* Allow thread debugging to be turned on with a command line switch.
-
+* Allow reading commands from .mpdbrc file
+* Changed the name of the default history file from ~/.pydbhist to ~/.mpdbhist
\ No newline at end of file
Modified: sandbox/trunk/pdb/mpdb.py
==============================================================================
--- sandbox/trunk/pdb/mpdb.py (original)
+++ sandbox/trunk/pdb/mpdb.py Thu Jul 6 12:11:02 2006
@@ -164,6 +164,9 @@
else:
pydb.Pdb.info_helper(self, cmd)
+ def help_mpdb(self, *arg):
+ help()
+
# Debugger commands
def do_attach(self, addr):
""" Attach to a process or file outside of Pdb.
Modified: sandbox/trunk/pdb/mthread.py
==============================================================================
--- sandbox/trunk/pdb/mthread.py (original)
+++ sandbox/trunk/pdb/mthread.py Thu Jul 6 12:11:02 2006
@@ -7,249 +7,38 @@
import sys
import threading
-class MTracer(object):
+from pydb.gdb import Gdb
+
+# Global variables
+tracers = [] # The MTracer objects that are tracing threads
+threads = [] # List of threads we're tracing
+
+# This global variable keeps track of the 'main' debugger which, when one
+# of the MTracer objects encounters a breakpoint in its thread, is used to
+# place a call to that main debugger asking it to stop and examine the frame
+# said thread stopped at.
+_main_debugger = None
+
+class MTracer(Gdb):
""" A class to trace a thread. Breakpoints can be passed from
a main debugger to this debugger through the constructor
which is useful, for instance, if a breakpoint occurs inside
a thread's run() method.
"""
- def __init__(self, msg, errmsg, breaks={}, filename=None):
+ def __init__(self):
+ Gdb.__init__(self)
self.thread = threading.currentThread()
- self.msg = msg
- self.errmsg = errmsg
# Each tracer instance must keep track of its own breakpoints
- self.breaks = breaks
self.fncache = {}
- self.filename = filename
self.lineno = 0
self.curframe = None
+ self.stopframe = None
+ self.botframe = None
+ self.quitting = False
- def canonic(self, filename):
- if filename == "<" + filename[1:-1] + ">":
- return filename
- canonic = self.fncache.get(filename)
- if not canonic:
- canonic = os.path.abspath(filename)
- canonic = os.path.normcase(canonic)
- self.fncache[filename] = canonic
- return canonic
-
- def info(self, arg):
- args = arg.split()
- if 'break'.startswith(args[0]):
- return self.breaks
-
- def get_break(self, filename, lineno):
- """ Return the breakpoint at [filename:]lineno. """
- filename = self.canonic(filename)
- return filename in self.breaks and \
- lineno in self.breaks[filename]
-
- def get_breaks(self, filename, lineno):
- """ Return all the breakpoints set in this thread. """
- filename = self.canonic(filename)
- return filename in self.breaks and \
- lineno in self.breaks[filename] and \
- Breakpoint.bplist[filename, lineno] or []
-
- def set_break(self, filename, lineno, temporary=0, cond = None,
- funcname=None):
- """ Set a breakpoint in this thread. """
-
- # Derived classes and clients can call the following methods
- # to manipulate breakpoints. These methods return an
- # error message is something went wrong, None if all is well.
- # Set_break prints out the breakpoint line and file:lineno.
- # Call self.get_*break*() to see the breakpoints or better
- # for bp in Breakpoint.bpbynumber: if bp: bp.bpprint().
-
- filename = self.canonic(filename)
- import linecache # Import as late as possible
- line = linecache.getline(filename, lineno)
- if not line:
- return 'Line %s:%d does not exist' % (filename,
- lineno)
- if not filename in self.breaks:
- self.breaks[filename] = []
- list = self.breaks[filename]
- if not lineno in list:
- list.append(lineno)
- bp = Breakpoint(filename, lineno, temporary, cond, funcname)
-
- def do_break(self, arg=None):
- """ thread-specific breakpoint information. """
- # XXX For now we don't support temporary breakpoints in threads
- temporary = cond = False
- if arg is None:
- if self.lineno is None:
- lineno = max(1, inspect.lineno(self.curframe))
- else:
- lineno = self.lineno + 1
- filename = self.curframe.f_code.co_filename
- else:
- filename = None
- lineno = None
- comma = arg.find(',')
- if comma > 0:
- # parse stuff after comma: "condition"
- cond = arg[comma+1:].lstrip()
- arg = arg[:comma].rstrip()
- (funcname, filename, lineno) = self.__parse_filepos(arg)
- if lineno is None: return
-
- # FIXME This default setting doesn't match that used in
- # do_clear. Perhaps one is non-optimial.
- if not filename:
- filename = self.defaultFile()
-
- # Check for reasonable breakpoint
- line = self.checkline(filename, lineno)
- if line:
- # now set the break point
- # Python 2.3.5 takes 5 args rather than 6.
- # There is another way in configure to test for the version,
- # but this works too.
- try:
- err = self.set_break(filename, line, temporary, cond, funcname)
- except TypeError:
- err = self.set_break(filename, line, temporary, cond)
-
- if err: self.msg, err
- else:
- bp = self.get_breaks(filename, line)[-1]
- self.msg, "Breakpoint %d set in file %s, line %d." \
- % (bp.number, self.filename(bp.file), bp.line)
-
- def __parse_filepos(self, arg):
- """__parse_filepos(self,arg)->(fn, filename, lineno)
-
- Parse arg as [filename:]lineno | function
- Make sure it works for C:\foo\bar.py:12
- """
- colon = arg.rfind(':')
- if colon >= 0:
- filename = arg[:colon].rstrip()
- f = self.lookupmodule(filename)
- if not f:
- self.msg("%s not found from sys.path" %
- self._saferepr(filename))
- return (None, None, None)
- else:
- filename = f
- arg = arg[colon+1:].lstrip()
- try:
- lineno = int(arg)
- except ValueError, msg:
- self.msg('Bad lineno: %s' % str(arg))
- return (None, filename, None)
- return (None, filename, lineno)
- else:
- # no colon; can be lineno or function
- return self.__get_brkpt_lineno(arg)
-
- def __get_brkpt_lineno(self, arg):
- """__get_brkpt_lineno(self,arg)->(filename, file, lineno)
-
- See if arg is a line number or a function name. Return what
- we've found. None can be returned as a value in the triple."""
- funcname, filename = (None, None)
- try:
- # First try as an integer
- lineno = int(arg)
- filename = self.curframe.f_code.co_filename
- except ValueError:
- try:
- func = eval(arg, self.curframe.f_globals,
- self.curframe.f_locals)
- except:
- func = arg
- try:
- if hasattr(func, 'im_func'):
- func = func.im_func
- code = func.func_code
- #use co_name to identify the bkpt (function names
- #could be aliased, but co_name is invariant)
- funcname = code.co_name
- lineno = code.co_firstlineno
- filename = code.co_filename
- except:
- # last thing to try
- (ok, filename, ln) = self.lineinfo(arg)
- if not ok:
- self.msg('The specified object %s is not ' \
- ' a function, or not found' \
- ' along sys.path or no line given.' %
- str(repr(arg)))
-
- return (None, None, None)
- funcname = ok # ok contains a function name
- lineno = int(ln)
- return (funcname, filename, lineno)
-
- def lineinfo(self, identifier):
- failed = (None, None, None)
- # Input is identifier, may be in single quotes
- idstring = identifier.split("'")
- if len(idstring) == 1:
- # not in single quotes
- id = idstring[0].strip()
- elif len(idstring) == 3:
- # quoted
- id = idstring[1].strip()
- else:
- return failed
- if id == '': return failed
- parts = id.split('.')
- # Protection for derived debuggers
- if parts[0] == 'self':
- del parts[0]
- if len(parts) == 0:
- return failed
- # Best first guess at file to look at
- fname = self.defaultFile()
- if len(parts) == 1:
- item = parts[0]
- else:
- # More than one part.
- # First is module, second is method/class
- f = self.lookupmodule(parts[0])
- if f:
- fname = f
- item = parts[1]
- answer = find_function(item, fname)
- return answer or failed
-
- def defaultFile(self):
- """Produce a reasonable default."""
- filename = self.curframe.f_code.co_filename
- # Consider using is_exec_stmt(). I just don't understand
- # the conditions under which the below test is true.
- if filename == '' and self.mainpyfile:
- filename = self.mainpyfile
- return filename
-
- def checkline(self, filename, lineno):
- """Check whether specified line seems to be executable.
-
- Return `lineno` if it is, 0 if not (e.g. a docstring, comment, blank
- line or EOF). Warning: testing is not comprehensive.
- """
- line = linecache.getline(filename, lineno)
- if not line:
- self.errmsg('End of file')
- return 0
- line = line.strip()
- # Don't allow setting breakpoint at a blank line
- if (not line or (line[0] == '#') or
- (line[:3] == '"""') or line[:3] == "'''"):
- self.errmsg('Blank or comment')
- return 0
- return lineno
-
def trace_dispatch(self, frame, event, arg):
- self.curframe = frame
+ self.currentframe = frame
if event == 'line':
- self.msg('%s *** line' % self.thread.getName())
return self.dispatch_line
if event == 'call':
self.msg('%s *** call' % self.thread.getName())
@@ -272,53 +61,66 @@
print 'bdb.Bdb.dispatch: unknown debugging event:', repr(event)
return self.trace_dispatch
- def dispatch_line(self, frame, event, arg):
- print frame.f_code.co_filename, self.thread.getName()
+ def dispatch_line(self, frame, arg):
+ if self.stop_here(frame) or self.break_here(frame):
+ # Signal to the main debugger that we've hit a breakpoint
+ print 'bang'
+ _main_debugger.user_line(frame)
+ if self.quitting: raise BdbQuit
+ return self.trace_dispatch
+
+ def dispatch_call(self, frame, arg):
+ # XXX 'arg' is no longer used
+ if self.botframe is None:
+ # First call of dispatch since reset()
+ self.botframe = frame.f_back # (CT) Note that this may also be None!
+ return self.trace_dispatch
+ if not (self.stop_here(frame) or self.break_anywhere(frame)):
+ # No need to trace this function
+ return # None
+ _main_debugger.user_call(frame, arg)
+ if self.quitting: raise BdbQuit
+ return self.trace_dispatch
+
+ def dispatch_return(self, frame, arg):
+ if self.stop_here(frame) or frame == self.returnframe:
+ _main_debugger.user_return(frame, arg)
+ if self.quitting: raise BdbQuit
return self.trace_dispatch
-class ThreadDebug(object):
- def __init__(self, msg, errmsg):
- self.msg = msg
- self.errmsg = errmsg
- self.threads = []
- self.tracers = []
- self.current_thread = None
-
- def trace_dispatch_init(self, frame, event, arg):
- t = threading.currentThread()
- self.threads.append(t)
- m = MTracer(self.msg, self.errmsg)
- self.tracers.append(m)
-
- sys.settrace(m.trace_dispatch)
-
- def get_current_thread(self):
- self.msg('Current thread is %d (%s)' % \
- (self.threads.index(self.current_thread)+1,
- self.current_thread))
- return
-
- def set_current_thread(self, args):
- # XXX Switch to a different thread, although this doesn't
- # actually do anything yet.
- t_num = int(args)-1
- if t_num > len(self.threads):
- self.errmsg('Thread ID %d not known.' % t_num+1)
- return
- self.current_thread = self.threads[t_num]
- self.msg('Switching to thread %d (%s)' % (t_num+1, \
- self.current_thread))
- return
-
-def init(msg, errmsg):
- """ This method sets up thread debugging by creating a ThreadDebug
- object that creates MTracer objects for every thread that is created.
- 'msg' is a method to write standard output to, and 'errmsg' is a method
- to write error output to.
+ def dispatch_exception(self, frame, arg):
+ if self.stop_here(frame):
+ _main_debugger.user_exception(frame, arg)
+ if self.quitting: raise BdbQuit
+ return self.trace_dispatch
+
+def trace_dispatch_init(frame, event, arg):
+    """ This function is called via sys.settrace when a thread runs
+    for the first time. Set up this thread with a tracer object and
+ set this thread's tracing function to that object's trace_dispatch
+ method.
"""
- t = ThreadDebug(msg, errmsg)
- threading.settrace(t.trace_dispatch_init)
- #sys.settrace(t.trace_dispatch_init)
+ global threads, tracers
+ threads.append(threading.currentThread())
+ t = MTracer()
+
+ tracers.append(t)
+ threading.settrace(t.trace_dispatch)
+ sys.settrace(t.trace_dispatch)
+
+
+def init(debugger):
+    """ This function initialises thread debugging. It sets up a tracing
+    function for newly created threads so that they call trace_dispatch_init,
+    which hooks them up with an MTracer object. The argument 'debugger' is
+ the debugger that is debugging the MainThread, i.e. the Python
+ interpreter.
+ """
+ global _main_debugger
+ _main_debugger = debugger
+ threading.settrace(trace_dispatch_init)
+
+
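For reference, a stripped-down sketch of the hook mechanism used above:
threading.settrace() installs a function that runs once in each newly
started thread, and that function installs the real per-thread tracer via
sys.settrace(). Everything except the threading/sys calls is illustrative.

    import sys
    import threading

    def per_thread_trace(frame, event, arg):
        # Stand-in for MTracer.trace_dispatch: receives every event in the thread.
        print '%s *** %s in %s' % (threading.currentThread().getName(),
                                   event, frame.f_code.co_name)
        return per_thread_trace

    def trace_dispatch_init(frame, event, arg):
        # Runs once when a new thread starts executing; from then on the
        # thread uses the real tracing function.
        sys.settrace(per_thread_trace)
        return per_thread_trace(frame, event, arg)

    threading.settrace(trace_dispatch_init)

    def worker():
        total = 1 + 1
        return total

    t = threading.Thread(target=worker)
    t.start()
    t.join()
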
From python-checkins at python.org Thu Jul 6 12:13:36 2006
From: python-checkins at python.org (ronald.oussoren)
Date: Thu, 6 Jul 2006 12:13:36 +0200 (CEST)
Subject: [Python-checkins] r47267 - python/trunk/configure
python/trunk/configure.in
Message-ID: <20060706101336.E4A6E1E4003@bag.python.org>
Author: ronald.oussoren
Date: Thu Jul 6 12:13:35 2006
New Revision: 47267
Modified:
python/trunk/configure
python/trunk/configure.in
Log:
This patch solves the problem Skip was seeing with zlib: it ensures that
configure uses compiler flags similar to setup.py's when doing the zlib test.
Without this patch configure would use the first shared library on the linker
path; with this patch it uses the first shared or static library on that path,
just like setup.py.
Modified: python/trunk/configure
==============================================================================
--- python/trunk/configure (original)
+++ python/trunk/configure Thu Jul 6 12:13:35 2006
@@ -1,5 +1,5 @@
#! /bin/sh
-# From configure.in Revision: .
+# From configure.in Revision: 47023 .
# Guess values for system-dependent variables and create Makefiles.
# Generated by GNU Autoconf 2.59 for python 2.5.
#
@@ -722,13 +722,13 @@
/^X\(\/\).*/{ s//\1/; q; }
s/.*/./; q'`
srcdir=$ac_confdir
- if test ! -r "$srcdir/$ac_unique_file"; then
+ if test ! -r $srcdir/$ac_unique_file; then
srcdir=..
fi
else
ac_srcdir_defaulted=no
fi
-if test ! -r "$srcdir/$ac_unique_file"; then
+if test ! -r $srcdir/$ac_unique_file; then
if test "$ac_srcdir_defaulted" = yes; then
{ echo "$as_me: error: cannot find sources ($ac_unique_file) in $ac_confdir or .." >&2
{ (exit 1); exit 1; }; }
@@ -737,7 +737,7 @@
{ (exit 1); exit 1; }; }
fi
fi
-(cd $srcdir && test -r "./$ac_unique_file") 2>/dev/null ||
+(cd $srcdir && test -r ./$ac_unique_file) 2>/dev/null ||
{ echo "$as_me: error: sources are in $srcdir, but \`cd $srcdir' does not work" >&2
{ (exit 1); exit 1; }; }
srcdir=`echo "$srcdir" | sed 's%\([^\\/]\)[\\/]*$%\1%'`
@@ -14980,6 +14980,15 @@
fi
+case $ac_sys_system/$ac_sys_release in
+Darwin/*)
+ _CUR_CFLAGS="${CFLAGS}"
+ _CUR_LDFLAGS="${LDFLAGS}"
+ CFLAGS="${CFLAGS} -Wl,-search_paths_first"
+ LDFLAGS="${LDFLAGS} -Wl,-search_paths_first -L/usr/local/lib"
+ ;;
+esac
+
echo "$as_me:$LINENO: checking for inflateCopy in -lz" >&5
echo $ECHO_N "checking for inflateCopy in -lz... $ECHO_C" >&6
if test "${ac_cv_lib_z_inflateCopy+set}" = set; then
@@ -15053,6 +15062,13 @@
fi
+case $ac_sys_system/$ac_sys_release in
+Darwin/*)
+ CFLAGS="${_CUR_CFLAGS}"
+ LDFLAGS="${_CUR_LDFLAGS}"
+ ;;
+esac
+
echo "$as_me:$LINENO: checking for hstrerror" >&5
echo $ECHO_N "checking for hstrerror... $ECHO_C" >&6
cat >conftest.$ac_ext <<_ACEOF
Modified: python/trunk/configure.in
==============================================================================
--- python/trunk/configure.in (original)
+++ python/trunk/configure.in Thu Jul 6 12:13:35 2006
@@ -2352,8 +2352,34 @@
)
dnl Check if system zlib has *Copy() functions
+dnl
+dnl On MacOSX the linker will search for dylibs on the entire linker path
+dnl before searching for static libraries. setup.py adds -Wl,-search_paths_first
+dnl to revert to a more traditional unix behaviour and make it possible to
+dnl override the system libz with a local static library of libz. Temporarily
+dnl add that flag to our CFLAGS as well to ensure that we check the version
+dnl of libz that will be used by setup.py.
+dnl The -L/usr/local/lib is needed as well to get the same compilation
+dnl environment as setup.py (and leaving it out can cause configure to use the
+dnl wrong version of the library)
+case $ac_sys_system/$ac_sys_release in
+Darwin/*)
+ _CUR_CFLAGS="${CFLAGS}"
+ _CUR_LDFLAGS="${LDFLAGS}"
+ CFLAGS="${CFLAGS} -Wl,-search_paths_first"
+ LDFLAGS="${LDFLAGS} -Wl,-search_paths_first -L/usr/local/lib"
+ ;;
+esac
+
AC_CHECK_LIB(z, inflateCopy, AC_DEFINE(HAVE_ZLIB_COPY, 1, Define if the zlib library has inflateCopy))
+case $ac_sys_system/$ac_sys_release in
+Darwin/*)
+ CFLAGS="${_CUR_CFLAGS}"
+ LDFLAGS="${_CUR_LDFLAGS}"
+ ;;
+esac
+
AC_MSG_CHECKING(for hstrerror)
AC_TRY_LINK([
#include "confdefs.h"
From python-checkins at python.org Thu Jul 6 12:48:28 2006
From: python-checkins at python.org (thomas.wouters)
Date: Thu, 6 Jul 2006 12:48:28 +0200 (CEST)
Subject: [Python-checkins] r47268 - python/trunk/Misc/NEWS
Message-ID: <20060706104828.B94B51E4003@bag.python.org>
Author: thomas.wouters
Date: Thu Jul 6 12:48:28 2006
New Revision: 47268
Modified:
python/trunk/Misc/NEWS
Log:
NEWS entry for r47267: fixing configure's zlib probing.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Thu Jul 6 12:48:28 2006
@@ -93,6 +93,10 @@
Build
-----
+- 'configure' now detects the zlib library the same way as distutils.
+ Previously, the slight difference could cause compilation errors of the
+ 'zlib' module on systems with more than one version of zlib.
+
- The MSI compileall step was fixed to also support a TARGETDIR
with spaces in it.
From buildbot at python.org Thu Jul 6 14:05:18 2006
From: buildbot at python.org (buildbot at python.org)
Date: Thu, 06 Jul 2006 12:05:18 +0000
Subject: [Python-checkins] buildbot warnings in alpha Debian trunk
Message-ID: <20060706120518.9C7CF1E4006@bag.python.org>
The Buildbot has detected a new failure of alpha Debian trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/alpha%2520Debian%2520trunk/builds/452
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: ronald.oussoren,thomas.heller
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Thu Jul 6 14:29:24 2006
From: python-checkins at python.org (fredrik.lundh)
Date: Thu, 6 Jul 2006 14:29:24 +0200 (CEST)
Subject: [Python-checkins] r47269 -
python/trunk/Lib/xmlcore/etree/ElementTree.py
Message-ID: <20060706122924.BF71C1E4003@bag.python.org>
Author: fredrik.lundh
Date: Thu Jul 6 14:29:24 2006
New Revision: 47269
Modified:
python/trunk/Lib/xmlcore/etree/ElementTree.py
Log:
added XMLParser alias for cElementTree compatibility
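For reference, a small usage sketch of the alias (the import path follows the
trunk layout in the paths above; released Pythons expose the same module as
xml.etree.ElementTree, and the XML snippet is made up):

    from xmlcore.etree.ElementTree import XMLParser, XMLTreeBuilder

    assert XMLParser is XMLTreeBuilder      # the new compatibility alias

    parser = XMLParser()
    parser.feed("<root><child>hello</child></root>")
    root = parser.close()
    print root.tag, root[0].text            # root hello
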
Modified: python/trunk/Lib/xmlcore/etree/ElementTree.py
==============================================================================
--- python/trunk/Lib/xmlcore/etree/ElementTree.py (original)
+++ python/trunk/Lib/xmlcore/etree/ElementTree.py Thu Jul 6 14:29:24 2006
@@ -84,7 +84,7 @@
"tostring",
"TreeBuilder",
"VERSION", "XML",
- "XMLTreeBuilder",
+ "XMLParser", "XMLTreeBuilder",
]
##
@@ -1255,3 +1255,6 @@
tree = self._target.close()
del self._target, self._parser # get rid of circular references
return tree
+
+# compatibility
+XMLParser = XMLTreeBuilder
From python-checkins at python.org Thu Jul 6 14:36:24 2006
From: python-checkins at python.org (nick.coghlan)
Date: Thu, 6 Jul 2006 14:36:24 +0200 (CEST)
Subject: [Python-checkins] r47270 - peps/trunk/pep-0338.txt
Message-ID: <20060706123624.E38A01E4003@bag.python.org>
Author: nick.coghlan
Date: Thu Jul 6 14:36:24 2006
New Revision: 47270
Modified:
peps/trunk/pep-0338.txt
Log:
Updated writeup about import statements to strongly recommend absolute imports from main modules
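As a concrete sketch of that recommendation (names taken from the example
layout in the updated PEP text below), a test module meant to be run with
``python -m pkg.test.test_A`` from the ``devel`` directory would spell its
imports absolutely:

    # devel/pkg/test/test_A.py  (illustrative only)
    import unittest
    from pkg import moduleA          # absolute import: safe in a main module

    class ModuleATest(unittest.TestCase):
        def test_module_is_importable(self):
            self.failUnless(moduleA is not None)

    if __name__ == "__main__":
        unittest.main()
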
Modified: peps/trunk/pep-0338.txt
==============================================================================
--- peps/trunk/pep-0338.txt (original)
+++ peps/trunk/pep-0338.txt Thu Jul 6 14:36:24 2006
@@ -180,39 +180,67 @@
and then invokes ``run_module(sys.argv[0], run_name="__main__",
alter_sys=True)``.
-Relative Imports
-================
-
-2.5b1 showed an annoying interaction between this PEP and PEP 328 -
-explicit relative imports don't work from a main module, because
-relative imports rely on ``__name__`` to determine the current module's
-position in the package hierarchy.
-
-Accordingly, the operation of ``run_module()`` was modified so that
-another special variable ``__module_name__`` was defined in the
-namespace of the module being executed. This variable always holds
-the true module name, even if ``__name__`` is set to something else
-(like ``'__main__'``)
-
-Modules that don't rely on relative imports can be used from a
-package as a main module without any changes. In order to both use
-relative imports and also be usable as a main module, a module in a
-package will currently need to use a structure similar to the
-following::
-
- # Docstring and any future statements
- _is_main = False
- if __name__ == "__main__":
- _is_main = True
- __name__ = __module_name__
-
- # Support module section, including relative imports
-
- if _is_main:
- # Main module section
+Import Statements and the Main Module
+=====================================
-That said, Guido's recommended solution is to avoid using relative
-imports in the first place.
+The release of 2.5b1 showed a surprising (although obvious in
+retrospect) interaction between this PEP and PEP 328 - explicit
+relative imports don't work from a main module. This is due to
+the fact that relative imports rely on ``__name__`` to determine
+the current module's position in the package hierarchy. In a main
+module, the value of ``__name__`` is always ``'__main__'``, so
+explicit relative imports will always fail (as they only work for
+a module inside a package).
+
+Investigation into why implicit relative imports *appear* to work when
+a main module is executed directly but fail when executed using -m
+showed that such imports are actually always treated as absolute
+imports. Because of the way direct execution works, the package
+containing the executed module is added to sys.path, so its sibling
+modules are actually imported as top level modules. This can easily
+lead to multiple copies of the sibling modules in the application if
+implicit relative imports are used in modules that may be directly
+executed (e.g. test modules or utility scripts).
+
+For the 2.5 release, the recommendation is to always use absolute
+imports in any module that is intended to be used as a main module.
+The -m switch provides a benefit here, as it inserts the current
+directory into sys.path, instead of the directory containing the main
+module. This means that it is possible to run a module from inside a
+package using -m so long as the current directory contains the top
+level directory for the package. Absolute imports will work correctly
+even if the package isn't installed anywhere else on sys.path. If the
+module is executed directly and uses absolute imports to retrieve its
+sibling modules, then the top level package directory needs to be
+installed somewhere on sys.path (since the current directory won't be
+added automatically).
+
+Here's an example file layout::
+
+ devel/
+ pkg/
+ __init__.py
+ moduleA.py
+ moduleB.py
+ test/
+ __init__.py
+ test_A.py
+ test_B.py
+
+So long as the current directory is ``devel``, or ``devel`` is already
+on ``sys.path``, and the test modules use absolute imports (such as
+``import pkg.moduleA`` to retrieve the module under test), PEP 338
+allows the tests to be run as::
+
+ python -m pkg.test.test_A
+ python -m pkg.test.test_B
+
+The question of whether or not relative imports should be supported
+when a main module is executed with -m is something that will be
+revisited for Python 2.6. Permitting it would require changes to
+either Python's import semantics or the semantics used to indicate
+when a module is the main module, so it is not a decision to be made
+hastily.
Resolved Issues
================
From python-checkins at python.org Thu Jul 6 14:53:05 2006
From: python-checkins at python.org (nick.coghlan)
Date: Thu, 6 Jul 2006 14:53:05 +0200 (CEST)
Subject: [Python-checkins] r47271 - in python/trunk: Doc/lib/librunpy.tex
Lib/runpy.py Lib/test/test_runpy.py
Message-ID: <20060706125305.811F11E4003@bag.python.org>
Author: nick.coghlan
Date: Thu Jul 6 14:53:04 2006
New Revision: 47271
Modified:
python/trunk/Doc/lib/librunpy.tex
python/trunk/Lib/runpy.py
python/trunk/Lib/test/test_runpy.py
Log:
Revert the __module_name__ changes made in rev 47142. We'll revisit this in Python 2.6
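For context, a minimal usage sketch of run_module() as documented after this
revert (the module name is hypothetical):

    import runpy

    # Run 'pkg.mod' as if it were the main module.  With alter_sys=True,
    # sys.argv[0] and the sys.modules entry are swapped in while the code
    # runs and restored afterwards.
    result = runpy.run_module("pkg.mod", run_name="__main__", alter_sys=True)
    print result["__name__"]        # __main__
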
Modified: python/trunk/Doc/lib/librunpy.tex
==============================================================================
--- python/trunk/Doc/lib/librunpy.tex (original)
+++ python/trunk/Doc/lib/librunpy.tex Thu Jul 6 14:53:04 2006
@@ -35,19 +35,16 @@
global variables below are defined in the supplied dictionary, those
definitions are overridden by the \code{run_module} function.
-The special global variables \code{__name__}, \code{__module_name__},
-\code{__file__}, \code{__loader__} and \code{__builtins__} are
-set in the globals dictionary before the module code is executed.
+The special global variables \code{__name__}, \code{__file__},
+\code{__loader__} and \code{__builtins__} are set in the globals
+dictionary before the module code is executed.
\code{__name__} is set to \var{run_name} if this optional argument is
supplied, and the \var{mod_name} argument otherwise.
-\code{__module_name__} is always set to \var{mod_name} (this allows
-modules to use imports relative to their package name).
-
\code{__loader__} is set to the PEP 302 module loader used to retrieve
-the code for the module (This will not be defined if the module was
-found using the standard import mechanism).
+the code for the module (This loader may be a wrapper around the
+standard import mechanism).
\code{__file__} is set to the name provided by the module loader. If
the loader does not make filename information available, this
@@ -58,12 +55,10 @@
If the argument \var{alter_sys} is supplied and evaluates to
\code{True}, then \code{sys.argv[0]} is updated with the value of
-\code{__file__} and \code{sys.modules[mod_name]} is updated with a
+\code{__file__} and \code{sys.modules[__name__]} is updated with a
temporary module object for the module being executed. Both
-\code{sys.argv[0]} and \code{sys.modules[mod_name]} are restored to
-their original values before the function returns. If \var{run_name}
-differs from \var{mod_name} entries are made in \code{sys.modules}
-for both names.
+\code{sys.argv[0]} and \code{sys.modules[__name__]} are restored to
+their original values before the function returns.
Note that this manipulation of \module{sys} is not thread-safe. Other
threads may see the partially initialised module, as well as the
Modified: python/trunk/Lib/runpy.py
==============================================================================
--- python/trunk/Lib/runpy.py (original)
+++ python/trunk/Lib/runpy.py Thu Jul 6 14:53:04 2006
@@ -21,19 +21,18 @@
]
-def _run_code(code, run_globals, init_globals, run_name,
+def _run_code(code, run_globals, init_globals,
mod_name, mod_fname, mod_loader):
"""Helper for _run_module_code"""
if init_globals is not None:
run_globals.update(init_globals)
- run_globals.update(__name__ = run_name,
- __module_name__ = mod_name,
+ run_globals.update(__name__ = mod_name,
__file__ = mod_fname,
__loader__ = mod_loader)
exec code in run_globals
return run_globals
-def _run_module_code(code, init_globals=None, run_name=None,
+def _run_module_code(code, init_globals=None,
mod_name=None, mod_fname=None,
mod_loader=None, alter_sys=False):
"""Helper for run_module"""
@@ -43,33 +42,26 @@
temp_module = imp.new_module(mod_name)
mod_globals = temp_module.__dict__
saved_argv0 = sys.argv[0]
- sentinel = object()
- module_mod_name = sys.modules.get(mod_name, sentinel)
- module_run_name = sys.modules.get(run_name, sentinel)
+ restore_module = mod_name in sys.modules
+ if restore_module:
+ saved_module = sys.modules[mod_name]
sys.argv[0] = mod_fname
sys.modules[mod_name] = temp_module
- if run_name != mod_name:
- sys.modules[run_name] = temp_module
try:
- _run_code(code, mod_globals, init_globals, run_name,
+ _run_code(code, mod_globals, init_globals,
mod_name, mod_fname, mod_loader)
finally:
sys.argv[0] = saved_argv0
- if module_mod_name is not sentinel:
- sys.modules[mod_name] = module_mod_name
- else:
- del sys.modules[mod_name]
- if run_name != mod_name:
- if module_run_name is not sentinel:
- sys.modules[run_name] = module_run_name
- else:
- del sys.modules[run_name]
+ if restore_module:
+ sys.modules[mod_name] = saved_module
+ else:
+ del sys.modules[mod_name]
# Copy the globals of the temporary module, as they
# may be cleared when the temporary module goes away
return mod_globals.copy()
else:
# Leave the sys module alone
- return _run_code(code, {}, init_globals, run_name,
+ return _run_code(code, {}, init_globals,
mod_name, mod_fname, mod_loader)
@@ -100,7 +92,7 @@
if run_name is None:
run_name = mod_name
return _run_module_code(code, init_globals, run_name,
- mod_name, filename, loader, alter_sys)
+ filename, loader, alter_sys)
if __name__ == "__main__":
Modified: python/trunk/Lib/test/test_runpy.py
==============================================================================
--- python/trunk/Lib/test/test_runpy.py (original)
+++ python/trunk/Lib/test/test_runpy.py Thu Jul 6 14:53:04 2006
@@ -23,8 +23,6 @@
"run_argv0 = sys.argv[0]\n"
"if __name__ in sys.modules:\n"
" run_name = sys.modules[__name__].__name__\n"
- "if __module_name__ in sys.modules:\n"
- " mod_name = sys.modules[__module_name__].__module_name__\n"
"# Check nested operation\n"
"import runpy\n"
"nested = runpy._run_module_code('x=1\\n', mod_name='',\n"
@@ -34,16 +32,14 @@
def test_run_module_code(self):
initial = object()
- run_name = ""
- mod_name = ""
+ name = ""
file = "Some other nonsense"
loader = "Now you're just being silly"
d1 = dict(initial=initial)
saved_argv0 = sys.argv[0]
d2 = _run_module_code(self.test_source,
d1,
- run_name,
- mod_name,
+ name,
file,
loader,
True)
@@ -51,23 +47,19 @@
self.failUnless(d2["initial"] is initial)
self.failUnless(d2["result"] == self.expected_result)
self.failUnless(d2["nested"]["x"] == 1)
- self.failUnless(d2["__name__"] is run_name)
- self.failUnless(d2["run_name"] is run_name)
- self.failUnless(d2["__module_name__"] is mod_name)
- self.failUnless(d2["mod_name"] is mod_name)
+ self.failUnless(d2["__name__"] is name)
+ self.failUnless(d2["run_name"] is name)
self.failUnless(d2["__file__"] is file)
self.failUnless(d2["run_argv0"] is file)
self.failUnless(d2["__loader__"] is loader)
self.failUnless(sys.argv[0] is saved_argv0)
- self.failUnless(mod_name not in sys.modules)
- self.failUnless(run_name not in sys.modules)
+ self.failUnless(name not in sys.modules)
def test_run_module_code_defaults(self):
saved_argv0 = sys.argv[0]
d = _run_module_code(self.test_source)
self.failUnless(d["result"] == self.expected_result)
self.failUnless(d["__name__"] is None)
- self.failUnless(d["__module_name__"] is None)
self.failUnless(d["__file__"] is None)
self.failUnless(d["__loader__"] is None)
self.failUnless(d["run_argv0"] is saved_argv0)
From python-checkins at python.org Thu Jul 6 15:04:58 2006
From: python-checkins at python.org (nick.coghlan)
Date: Thu, 6 Jul 2006 15:04:58 +0200 (CEST)
Subject: [Python-checkins] r47272 - python/trunk/Doc/tut/tut.tex
Message-ID: <20060706130458.3FD641E4005@bag.python.org>
Author: nick.coghlan
Date: Thu Jul 6 15:04:56 2006
New Revision: 47272
Modified:
python/trunk/Doc/tut/tut.tex
Log:
Update the tutorial section on relative imports
Modified: python/trunk/Doc/tut/tut.tex
==============================================================================
--- python/trunk/Doc/tut/tut.tex (original)
+++ python/trunk/Doc/tut/tut.tex Thu Jul 6 15:04:56 2006
@@ -2919,14 +2919,13 @@
The submodules often need to refer to each other. For example, the
\module{surround} module might use the \module{echo} module. In fact,
-such references
-are so common that the \keyword{import} statement first looks in the
-containing package before looking in the standard module search path.
-Thus, the surround module can simply use \code{import echo} or
-\code{from echo import echofilter}. If the imported module is not
-found in the current package (the package of which the current module
-is a submodule), the \keyword{import} statement looks for a top-level
-module with the given name.
+such references are so common that the \keyword{import} statement
+first looks in the containing package before looking in the standard
+module search path. Thus, the \module{surround} module can simply use
+\code{import echo} or \code{from echo import echofilter}. If the
+imported module is not found in the current package (the package of
+which the current module is a submodule), the \keyword{import}
+statement looks for a top-level module with the given name.
When packages are structured into subpackages (as with the
\module{Sound} package in the example), there's no shortcut to refer
@@ -2936,6 +2935,24 @@
in the \module{Sound.Effects} package, it can use \code{from
Sound.Effects import echo}.
+Starting with Python 2.5, in addition to the implicit relative imports
+described above, you can write explicit relative imports with the
+\code{from module import name} form of import statement. These explicit
+relative imports use leading dots to indicate the current and parent
+packages involved in the relative import. From the \module{surround}
+module for example, you might use:
+
+\begin{verbatim}
+from . import echo
+from .. import Formats
+from ..Filters import equalizer
+\end{verbatim}
+
+Note that both explicit and implicit relative imports are based on the
+name of the current module. Since the name of the main module is always
+\code{"__main__"}, modules intended for use as the main module of a
+Python application should always use absolute imports.
+
\subsection{Packages in Multiple Directories}
Packages support one more special attribute, \member{__path__}. This
From python-checkins at python.org Thu Jul 6 15:35:28 2006
From: python-checkins at python.org (nick.coghlan)
Date: Thu, 6 Jul 2006 15:35:28 +0200 (CEST)
Subject: [Python-checkins] r47273 - python/trunk/Lib/warnings.py
Message-ID: <20060706133528.5AE711E4003@bag.python.org>
Author: nick.coghlan
Date: Thu Jul 6 15:35:27 2006
New Revision: 47273
Modified:
python/trunk/Lib/warnings.py
Log:
Ignore ImportWarning by default
Modified: python/trunk/Lib/warnings.py
==============================================================================
--- python/trunk/Lib/warnings.py (original)
+++ python/trunk/Lib/warnings.py Thu Jul 6 15:35:27 2006
@@ -261,3 +261,4 @@
# Module initialization
_processoptions(sys.warnoptions)
simplefilter("ignore", category=PendingDeprecationWarning, append=1)
+simplefilter("ignore", category=ImportWarning, append=1)
From python-checkins at python.org Thu Jul 6 15:41:34 2006
From: python-checkins at python.org (nick.coghlan)
Date: Thu, 6 Jul 2006 15:41:34 +0200 (CEST)
Subject: [Python-checkins] r47274 - python/trunk/Doc/lib/libwarnings.tex
Message-ID: <20060706134134.B4B261E4003@bag.python.org>
Author: nick.coghlan
Date: Thu Jul 6 15:41:34 2006
New Revision: 47274
Modified:
python/trunk/Doc/lib/libwarnings.tex
Log:
Cover ImportWarning, PendingDeprecationWarning and simplefilter() in the warnings module docs
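A small sketch of the difference being documented, using only the standard
warnings API:

    import warnings

    # filterwarnings() matches warnings by regular expressions on the message
    # and module; simplefilter() needs only an action and, optionally, a category.
    warnings.filterwarnings("ignore", message="Not importing directory",
                            category=ImportWarning)
    warnings.simplefilter("default", ImportWarning)   # re-enable, no regexes

    warnings.resetwarnings()   # drops both kinds of entries (and -W options)
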
Modified: python/trunk/Doc/lib/libwarnings.tex
==============================================================================
--- python/trunk/Doc/lib/libwarnings.tex (original)
+++ python/trunk/Doc/lib/libwarnings.tex Thu Jul 6 15:41:34 2006
@@ -71,6 +71,11 @@
\lineii{FutureWarning}{Base category for warnings about constructs
that will change semantically in the future.}
+\lineii{PendingDeprecationWarning}{Base category for warnings about
+features that will be deprecated in the future (ignored by default).}
+
+\lineii{ImportWarning}{Base category for warnings triggered during the
+process of importing a module (ignored by default).}
\end{tableii}
While these are technically built-in exceptions, they are documented
@@ -143,6 +148,17 @@
it is first imported (invalid options are ignored, after printing a
message to \code{sys.stderr}).
+The warnings that are ignored by default may be enabled by passing
+ \programopt{-Wd} to the interpreter. This enables default handling
+for all warnings, including those that are normally ignored by
+default. This is particularly useful for enabling ImportWarning when
+debugging problems importing a developed package. ImportWarning can
+also be enabled explicitly in Python code using:
+
+\begin{verbatim}
+ warnings.simplefilter('default', ImportWarning)
+\end{verbatim}
+
\subsection{Available Functions \label{warning-functions}}
@@ -209,14 +225,26 @@
inserted at the front by default; if \var{append} is true, it is
inserted at the end.
This checks the types of the arguments, compiles the message and
-module regular expressions, and inserts them as a tuple in front
-of the warnings filter. Entries inserted later override entries
-inserted earlier, if both match a particular warning. Omitted
-arguments default to a value that matches everything.
+module regular expressions, and inserts them as a tuple in the
+list of warnings filters. Entries closer to the front of the list
+override entries later in the list, if both match a particular
+warning. Omitted arguments default to a value that matches
+everything.
+\end{funcdesc}
+
+\begin{funcdesc}{simplefilter}{action\optional{,
+ category\optional{,
+ lineno\optional{, append}}}}
+Insert a simple entry into the list of warnings filters. The meaning
+of the function parameters is as for \function{filterwarnings()}, but
+regular expressions are not needed as the filter inserted always
+matches any message in any module as long as the category and line
+number match.
\end{funcdesc}
\begin{funcdesc}{resetwarnings}{}
Reset the warnings filter. This discards the effect of all previous
calls to \function{filterwarnings()}, including that of the
-\programopt{-W} command line options.
+\programopt{-W} command line options and calls to
+\function{simplefilter()}.
\end{funcdesc}
From python-checkins at python.org Thu Jul 6 15:47:19 2006
From: python-checkins at python.org (nick.coghlan)
Date: Thu, 6 Jul 2006 15:47:19 +0200 (CEST)
Subject: [Python-checkins] r47275 - python/trunk/Misc/NEWS
Message-ID: <20060706134719.0DD861E4003@bag.python.org>
Author: nick.coghlan
Date: Thu Jul 6 15:47:18 2006
New Revision: 47275
Modified:
python/trunk/Misc/NEWS
Log:
Add NEWS entries for the ImportWarning change and documentation update
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Thu Jul 6 15:47:18 2006
@@ -25,6 +25,8 @@
Library
-------
+- warnings.py now ignores ImportWarning by default
+
- Patch #1517790: It is now possible to use custom objects in the ctypes
foreign function argtypes sequence as long as they provide a from_param
method, no longer is it required that the object is a ctypes type.
@@ -115,6 +117,9 @@
Documentation
-------------
+- Cover ImportWarning, PendingDeprecationWarning and simplefilter() in the
+ documentation for the warnings module.
+
- Patch #1509163: MS Toolkit Compiler no longer available.
From python-checkins at python.org Thu Jul 6 15:57:29 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Thu, 6 Jul 2006 15:57:29 +0200 (CEST)
Subject: [Python-checkins] r47276 - python/trunk/Doc/whatsnew/whatsnew25.tex
Message-ID: <20060706135729.4DBE41E4005@bag.python.org>
Author: andrew.kuchling
Date: Thu Jul 6 15:57:28 2006
New Revision: 47276
Modified:
python/trunk/Doc/whatsnew/whatsnew25.tex
Log:
ImportWarning is now silent by default
Modified: python/trunk/Doc/whatsnew/whatsnew25.tex
==============================================================================
--- python/trunk/Doc/whatsnew/whatsnew25.tex (original)
+++ python/trunk/Doc/whatsnew/whatsnew25.tex Thu Jul 6 15:57:28 2006
@@ -1170,19 +1170,12 @@
to include an \file{__init__.py} module in a package directory.
Debugging this mistake can be confusing, and usually requires running
Python with the \programopt{-v} switch to log all the paths searched.
-In Python 2.5, a new \exception{ImportWarning} warning is raised when
+In Python 2.5, a new \exception{ImportWarning} warning is triggered when
an import would have picked up a directory as a package but no
-\file{__init__.py} was found. (Implemented by Thomas Wouters.)
-
-To suppress these warnings, you can either supply
-\code{\programopt{-W}'ignore:Not importing directory'} when running the Python
-interpreter, or use the \module{warnings} module to suppress the
-message:
-
-\begin{verbatim}
-warnings.filterwarnings('ignore', 'Not importing directory',
- ImportWarning)
-\end{verbatim}
+\file{__init__.py} was found. This warning is silently ignored by default;
+provide the \programopt{-Wd} option when running the Python executable
+to display the warning message.
+(Implemented by Thomas Wouters.)
\item The list of base classes in a class definition can now be empty.
As an example, this is now legal:
From buildbot at python.org Thu Jul 6 15:58:13 2006
From: buildbot at python.org (buildbot at python.org)
Date: Thu, 06 Jul 2006 13:58:13 +0000
Subject: [Python-checkins] buildbot warnings in x86 W2k trunk
Message-ID: <20060706135813.C9DE01E4005@bag.python.org>
The Buildbot has detected a new failure of x86 W2k trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520W2k%2520trunk/builds/1200
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: nick.coghlan
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Thu Jul 6 17:06:07 2006
From: python-checkins at python.org (thomas.heller)
Date: Thu, 6 Jul 2006 17:06:07 +0200 (CEST)
Subject: [Python-checkins] r47277 - python/trunk/Doc/api/concrete.tex
Message-ID: <20060706150607.178DF1E4017@bag.python.org>
Author: thomas.heller
Date: Thu Jul 6 17:06:05 2006
New Revision: 47277
Modified:
python/trunk/Doc/api/concrete.tex
Log:
Document the correct return type of PyLong_AsUnsignedLongLongMask.
Modified: python/trunk/Doc/api/concrete.tex
==============================================================================
--- python/trunk/Doc/api/concrete.tex (original)
+++ python/trunk/Doc/api/concrete.tex Thu Jul 6 17:06:05 2006
@@ -376,7 +376,7 @@
\versionadded{2.3}
\end{cfuncdesc}
-\begin{cfuncdesc}{unsigned long}{PyLong_AsUnsignedLongLongMask}{PyObject *io}
+\begin{cfuncdesc}{unsigned PY_LONG_LONG}{PyLong_AsUnsignedLongLongMask}{PyObject *io}
Return a C \ctype{unsigned long long} from a Python long integer, without
checking for overflow.
\versionadded{2.3}
From python-checkins at python.org Thu Jul 6 17:21:53 2006
From: python-checkins at python.org (hyeshik.chang)
Date: Thu, 6 Jul 2006 17:21:53 +0200 (CEST)
Subject: [Python-checkins] r47278 -
python/trunk/Lib/test/test_multibytecodec.py
Message-ID: <20060706152153.68B4E1E4003@bag.python.org>
Author: hyeshik.chang
Date: Thu Jul 6 17:21:52 2006
New Revision: 47278
Modified:
python/trunk/Lib/test/test_multibytecodec.py
Log:
Add a testcase for r47086 which fixed a bug in codec_getstreamcodec().
Modified: python/trunk/Lib/test/test_multibytecodec.py
==============================================================================
--- python/trunk/Lib/test/test_multibytecodec.py (original)
+++ python/trunk/Lib/test/test_multibytecodec.py Thu Jul 6 17:21:52 2006
@@ -6,7 +6,8 @@
from test import test_support
from test import test_multibytecodec_support
-import unittest, StringIO, codecs, sys
+from test.test_support import TESTFN
+import unittest, StringIO, codecs, sys, os
class Test_MultibyteCodec(unittest.TestCase):
@@ -25,6 +26,13 @@
self.assertRaises(IndexError, dec,
'apple\x92ham\x93spam', 'test.cjktest')
+ def test_codingspec(self):
+ print >> open(TESTFN, 'w'), '# coding: euc-kr'
+ try:
+ exec open(TESTFN)
+ finally:
+ os.unlink(TESTFN)
+
class Test_IncrementalEncoder(unittest.TestCase):
def test_stateless(self):
From hyeshik at gmail.com Thu Jul 6 17:23:11 2006
From: hyeshik at gmail.com (Hye-Shik Chang)
Date: Fri, 7 Jul 2006 00:23:11 +0900
Subject: [Python-checkins] r47086 - in python/trunk: Misc/NEWS
Python/codecs.c
In-Reply-To:
References: <20060623211619.1DA121E400D@bag.python.org>
Message-ID: <4f0b69dc0607060823we85ae4fmafe9c77ea263f1a9@mail.gmail.com>
On 7/6/06, Neal Norwitz wrote:
> On 6/23/06, hyeshik.chang wrote:
> > Author: hyeshik.chang
> > Date: Fri Jun 23 23:16:18 2006
> > New Revision: 47086
> >
> > Modified:
> > python/trunk/Misc/NEWS
> > python/trunk/Python/codecs.c
> > Log:
> > Bug #1511381: codec_getstreamcodec() in codec.c is corrected to
> > omit a default "error" argument for NULL pointer. This allows
> > the parser to take a codec from cjkcodecs again.
> > (Reported by Taewook Kang and reviewed by Walter Doerwald)
>
> Can we get a test case for this?
>
Added. Thank you for reminding me. :)
Hye-Shik
From python-checkins at python.org Thu Jul 6 17:39:25 2006
From: python-checkins at python.org (hyeshik.chang)
Date: Thu, 6 Jul 2006 17:39:25 +0200 (CEST)
Subject: [Python-checkins] r47279 -
python/trunk/Lib/test/test_multibytecodec.py
Message-ID: <20060706153925.2CDD91E4003@bag.python.org>
Author: hyeshik.chang
Date: Thu Jul 6 17:39:24 2006
New Revision: 47279
Modified:
python/trunk/Lib/test/test_multibytecodec.py
Log:
Test using all CJK encodings for the testcases which don't require
specific encodings.
Modified: python/trunk/Lib/test/test_multibytecodec.py
==============================================================================
--- python/trunk/Lib/test/test_multibytecodec.py (original)
+++ python/trunk/Lib/test/test_multibytecodec.py Thu Jul 6 17:39:24 2006
@@ -9,15 +9,34 @@
from test.test_support import TESTFN
import unittest, StringIO, codecs, sys, os
+ALL_CJKENCODINGS = [
+# _codecs_cn
+ 'gb2312', 'gbk', 'gb18030', 'hz',
+# _codecs_hk
+ 'big5hkscs',
+# _codecs_jp
+ 'cp932', 'shift_jis', 'euc_jp', 'euc_jisx0213', 'shift_jisx0213',
+ 'euc_jis_2004', 'shift_jis_2004',
+# _codecs_kr
+ 'cp949', 'euc_kr', 'johab',
+# _codecs_tw
+ 'big5', 'cp950',
+# _codecs_iso2022
+ 'iso2022_jp', 'iso2022_jp_1', 'iso2022_jp_2', 'iso2022_jp_2004',
+ 'iso2022_jp_3', 'iso2022_jp_ext', 'iso2022_kr',
+]
+
class Test_MultibyteCodec(unittest.TestCase):
def test_nullcoding(self):
- self.assertEqual(''.decode('gb18030'), u'')
- self.assertEqual(unicode('', 'gb18030'), u'')
- self.assertEqual(u''.encode('gb18030'), '')
+ for enc in ALL_CJKENCODINGS:
+ self.assertEqual(''.decode(enc), u'')
+ self.assertEqual(unicode('', enc), u'')
+ self.assertEqual(u''.encode(enc), '')
def test_str_decode(self):
- self.assertEqual('abcd'.encode('gb18030'), 'abcd')
+ for enc in ALL_CJKENCODINGS:
+ self.assertEqual('abcd'.encode(enc), 'abcd')
def test_errorcallback_longindex(self):
dec = codecs.getdecoder('euc-kr')
@@ -27,9 +46,10 @@
'apple\x92ham\x93spam', 'test.cjktest')
def test_codingspec(self):
- print >> open(TESTFN, 'w'), '# coding: euc-kr'
try:
- exec open(TESTFN)
+ for enc in ALL_CJKENCODINGS:
+ print >> open(TESTFN, 'w'), '# coding:', enc
+ exec open(TESTFN)
finally:
os.unlink(TESTFN)
From python-checkins at python.org Thu Jul 6 21:28:04 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Thu, 6 Jul 2006 21:28:04 +0200 (CEST)
Subject: [Python-checkins] r47280 - in python/trunk: Lib/msilib/__init__.py
Misc/NEWS
Message-ID: <20060706192804.F3B781E4003@bag.python.org>
Author: martin.v.loewis
Date: Thu Jul 6 21:28:03 2006
New Revision: 47280
Modified:
python/trunk/Lib/msilib/__init__.py
python/trunk/Misc/NEWS
Log:
Properly generate logical file ids. Fixes #1515998.
Also correct typo in Control.mapping.
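A rough sketch of the uniqueness scheme gen_id() implements in the diff below:
derive a logical id from the file name and append a counter until the id is
unused within the CAB. The make_id() stand-in and the exact suffix format are
assumptions here, since the loop body is not visible in the hunk.

    def gen_id(filenames, file):
        make_id = lambda name: name.replace('.', '_').replace('-', '_')
        logical = _logical = make_id(file)
        pos = 1
        while logical in filenames:
            logical = "%s%d" % (_logical, pos)   # assumed suffix format
            pos += 1
        filenames.add(logical)
        return logical

    seen = set()
    print gen_id(seen, "README.txt")     # README_txt
    print gen_id(seen, "README.txt")     # README_txt1
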
Modified: python/trunk/Lib/msilib/__init__.py
==============================================================================
--- python/trunk/Lib/msilib/__init__.py (original)
+++ python/trunk/Lib/msilib/__init__.py Thu Jul 6 21:28:03 2006
@@ -187,7 +187,7 @@
self.filenames = sets.Set()
self.index = 0
- def gen_id(self, dir, file):
+ def gen_id(self, file):
logical = _logical = make_id(file)
pos = 1
while logical in self.filenames:
@@ -196,9 +196,11 @@
self.filenames.add(logical)
return logical
- def append(self, full, logical):
+ def append(self, full, file, logical):
if os.path.isdir(full):
return
+ if not logical:
+ logical = self.gen_id(file)
self.index += 1
self.files.append((full, logical))
return self.index, logical
@@ -328,7 +330,7 @@
logical = self.keyfiles[file]
else:
logical = None
- sequence, logical = self.cab.append(absolute, logical)
+ sequence, logical = self.cab.append(absolute, file, logical)
assert logical not in self.ids
self.ids.add(logical)
short = self.make_short(file)
@@ -403,7 +405,7 @@
[(self.dlg.name, self.name, event, argument,
condition, ordering)])
- def mapping(self, mapping, attribute):
+ def mapping(self, event, attribute):
add_data(self.dlg.db, "EventMapping",
[(self.dlg.name, self.name, event, attribute)])
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Thu Jul 6 21:28:03 2006
@@ -25,6 +25,8 @@
Library
-------
+- Bug #1515998: Properly generate logical ids for files in bdist_msi.
+
- warnings.py now ignores ImportWarning by default
- Patch #1517790: It is now possible to use custom objects in the ctypes
From python-checkins at python.org Thu Jul 6 23:13:24 2006
From: python-checkins at python.org (mateusz.rukowicz)
Date: Thu, 6 Jul 2006 23:13:24 +0200 (CEST)
Subject: [Python-checkins] r47281 - sandbox/trunk/decimal-c/_decimal.c
sandbox/trunk/decimal-c/decimal.h
Message-ID: <20060706211324.E702E1E4003@bag.python.org>
Author: mateusz.rukowicz
Date: Thu Jul 6 23:13:23 2006
New Revision: 47281
Modified:
sandbox/trunk/decimal-c/_decimal.c
sandbox/trunk/decimal-c/decimal.h
Log:
Long exps now work, and every test now passes. Some fixes and polishing are still needed. The '//' comments are for my own information; they will disappear.
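The limb representation used throughout the C code below (BASE-sized digits
stored least-significant first) can be sketched in Python. This illustrates
the normalization and conversion idea only; it is not a translation of the
patch, and the BASE value is assumed.

    BASE = 10000   # illustrative limb base; the C code derives it from LOG

    def normalize(limbs):
        # Propagate carries and drop leading zero limbs, mirroring the
        # role of _limb_normalize() in the diff below.
        for i in range(len(limbs) - 1):
            limbs[i + 1] += limbs[i] // BASE
            limbs[i] %= BASE
        while len(limbs) > 1 and limbs[-1] == 0:
            limbs.pop()
        return limbs

    def to_int(limbs):
        # Least-significant limb first, so the weight grows by BASE per position.
        value, weight = 0, 1
        for limb in limbs:
            value += limb * weight
            weight *= BASE
        return value

    print normalize([12345, 0, 2])   # [2345, 1, 2]
    print to_int([2345, 1, 2])       # 200012345
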
Modified: sandbox/trunk/decimal-c/_decimal.c
==============================================================================
--- sandbox/trunk/decimal-c/_decimal.c (original)
+++ sandbox/trunk/decimal-c/_decimal.c Thu Jul 6 23:13:23 2006
@@ -521,41 +521,448 @@
return new_pos;
}
+static long
+_limb_normalize(long *first, long size) {
+ long i;
+ long new_size;
+
+ for (i = 0; i < size - 1; i++) {
+ first[i+1] += first[i] / BASE;
+ first[i] %= BASE;
+ }
+
+ new_size = size;
+
+ while (!first[new_size - 1] && new_size > 1) new_size --;
+
+ return new_size;
+}
+#ifdef BIG_EXP
+
+
+exp_t exp_from_i(long a) {
+ exp_t ret;
+
+ memset(ret.limbs, 0, sizeof(long) * EXP_LIMB_COUNT);
+ ret.limbs[0] = a;
+ ret.sign = 0;
+
+ if (a < 0) {
+ ret.sign = 1;
+ ret.limbs[0] *= -1;
+ }
+
+ ret.size = _limb_normalize(ret.limbs, EXP_LIMB_COUNT);
+
+ return ret;
+}
+
+int
+exp_sscanf(char *buf, exp_t *exp) {
+ int len;
+ long mul;
+ long limb;
+ long i;
+ exp->sign = 0;
+ exp->size = 1;
+ memset(exp->limbs, 0, sizeof(long) * EXP_LIMB_COUNT);
+
+ if (buf[0] == '-') {
+ exp->sign = 1;
+ buf ++;
+ }
+ else if (buf[0] == '+')
+ buf ++;
+
+ len = strlen(buf);
+ if (!len)
+ return 0;
+
+ while (buf[0] == '0') {
+ buf ++;
+ }
+ len = strlen(buf);
+
+ mul = 1;
+ limb = 0;
+
+ for (i = len-1; i>=0 ;i--) {
+        if (buf[i] < '0' || buf[i] > '9')
+ return 0;
+
+ exp->size = limb + 1;
+ if (limb >= EXP_LIMB_COUNT)
+ return 0;
+
+ exp->limbs[limb] += mul * (buf[i] - '0');
+
+ mul *= 10;
+ if (mul == BASE) {
+ limb ++;
+ mul = 1;
+ }
+ }
+
+ return 1;
+}
+
+int
+exp_sprintf(char *buf, exp_t exp) {
+ int written = 0;
+ int tmp;
+ long i;
+
+ if (exp.sign) {
+ buf[0] = '-';
+ buf ++;
+ written ++;
+ }
+
+ tmp = sprintf(buf, "%i", exp.limbs[exp.size-1]);
+ buf += tmp;
+ written += tmp;
+ for (i = exp.size - 2; i>=0; i--) {
+ tmp = sprintf(buf, "%."LOG_STR"i", exp.limbs[i]);
+ buf += tmp;
+ written += tmp;
+ }
+ buf[0] = '\0';
+
+ return written;
+}
+
+/* there is no overflow checking !*/
+long
+exp_to_i(exp_t exp) {
+ long mult;
+ long i;
+ long ret = 0;
+ long mul = 1;
+ mult = 1;
+ for (i=0; i<exp.size; i++) {
+ ret += mult * exp.limbs[i];
+ mult *= BASE;
+ }
+
+ if (exp.sign)
+ ret = -ret;
+
+ return ret;
+}
+
+exp_t
+exp_inp_add(exp_t *a, exp_t b) {
+ if (a->sign == b.sign) {
+ a->size = _limb_add(a->limbs, a->size * LOG, b.limbs, b.size * LOG, a->limbs);
+ a->size = (a->size + LOG - 1) / LOG;
+ }
+ else {
+ int cmp;
+ cmp = _limb_compare(a->limbs, a->size, b.limbs, b.size);
+
+ if (cmp == 0) {
+ memset(a->limbs, 0, sizeof(long) * a->size);
+ a->size = 1;
+ a->sign = 0;
+ }
+
+ if (cmp == 1) {
+ a->size = _limb_sub_sl(a->limbs, a->size, b.limbs, b.size);
+ }
+
+ if (cmp == -1) {
+ exp_t tmp;
+ memset(tmp.limbs, 0, sizeof(long) * EXP_LIMB_COUNT);
+ tmp.size = _limb_sub(b.limbs, b.size * LOG, a->limbs, a->size * LOG, tmp.limbs);
+ tmp.size = (tmp.size + LOG -1) / LOG;
+ *a = tmp;
+ a->sign = b.sign;
+ }
+
+ }
+
+ return *a;
+}
+
+exp_t
+exp_inp_sub(exp_t *a, exp_t b) {
+ exp_t tmp_b = b;
+ tmp_b.sign ^= 1;
+ if (tmp_b.limbs[0] == 0 && tmp_b.size == 1)
+ tmp_b.sign =0;
+
+ return exp_inp_add(a, tmp_b);
+}
+
+exp_t
+exp_add(exp_t a, exp_t b) {
+ return exp_inp_add(&a, b);
+}
+
+exp_t
+exp_sub(exp_t a, exp_t b) {
+ return exp_inp_sub(&a, b);
+}
+
+exp_t
+exp_add_i(exp_t a, long b) {
+ return exp_add(a, exp_from_i(b));
+}
+
+exp_t
+exp_sub_i(exp_t a, long b) {
+ return exp_sub(a, exp_from_i(b));
+}
+
+exp_t
+exp_inp_add_i(exp_t *a, long b) {
+ return exp_inp_add(a, exp_from_i(b));
+}
+
+exp_t
+exp_inp_sub_i(exp_t *a, long b) {
+ return exp_inp_sub(a, exp_from_i(b));
+}
+
+exp_t
+exp_inc(exp_t *a) {
+ return exp_inp_add(a, exp_from_i(1));
+}
+
+exp_t
+exp_dec(exp_t *a) {
+ return exp_inp_sub(a, exp_from_i(1));
+}
+
+exp_t
+exp_mul(exp_t a, exp_t b) {
+ exp_t ret;
+ memset(ret.limbs, 0, sizeof(long) *EXP_LIMB_COUNT);
+ ret.size = _limb_multiply_core(a.limbs, a.size, b.limbs, b.size, ret.limbs);
+ ret.sign = a.sign ^ b.sign;
+ if (ret.size == 1 && ret.limbs[0] == 0)
+ ret.sign = 0;
+ return ret;
+}
+
+exp_t
+exp_mul_i(exp_t a, long b) {
+ return exp_mul(a, exp_from_i(b));
+}
+
+exp_t
+exp_div_i(exp_t a, long b, long *remainder) {
+ exp_t ret;
+ long i;
+ long mult = 1;
+ for (i=0 ; i < LOG; i++)
+ mult *= 10;
+
+ *remainder = 0;
+ for (i = EXP_LIMB_COUNT - 1; i>=0; i--) {
+ *remainder *= mult;
+ *remainder += a.limbs[i];
+ ret.limbs[i] = *remainder / b;
+ *remainder %= b;
+ }
+
+ ret.size = EXP_LIMB_COUNT;
+ while (!ret.limbs[ret.size-1] && ret.size > 1) ret.size --;
+ ret.sign = a.sign ^ (b < 0);
+
+ if (ret.limbs[0] == 0 && ret.size == 1)
+ ret.sign = 0;
+
+ return ret;
+}
+
+/* TODO */
+exp_t
+exp_floordiv_i(exp_t a, long b) {
+ long remainder;
+ exp_t ret;
+ ret = exp_div_i(a, b, &remainder);
+ if (remainder && exp_is_neg(a))
+ exp_dec(&ret);
+
+ return ret;
+}
+
+/* TODO */
+int
+exp_mod_i(exp_t a, long b) {
+ long remainder;
+ exp_div_i(a, b, &remainder);
+ return remainder;
+}
+
+int
+exp_cmp(exp_t a, exp_t b) {
+ int cmp;
+ if (a.sign != b.sign) {
+ if (a.sign)
+ return -1;
+ else
+ return 1;
+ }
+
+ cmp = _limb_compare(a.limbs, a.size, b.limbs, b.size);
+ if (a.sign)
+ return -cmp;
+ else
+ return cmp;
+}
+
+int
+exp_cmp_i(exp_t a, long b) {
+ return exp_cmp(a, exp_from_i(b));
+}
+
+int
+exp_g(exp_t a, exp_t b) {
+ int cmp = exp_cmp(a, b);
+ return cmp == 1;
+}
+
+int
+exp_g_i(exp_t a, long b) {
+ return exp_g(a, exp_from_i(b));
+}
+
+int
+exp_l(exp_t a, exp_t b) {
+ int cmp = exp_cmp(a, b);
+ return cmp == -1;
+}
+
+int
+exp_l_i(exp_t a, long b) {
+ return exp_l(a, exp_from_i(b));
+}
+
+int
+exp_eq(exp_t a, exp_t b) {
+ int cmp = exp_cmp(a, b);
+ return cmp == 0;
+}
+
+int
+exp_eq_i(exp_t a, long b) {
+ return exp_eq(a, exp_from_i(b));
+}
-#define exp_add_i(exp, a) ((exp) + (a))
-#define exp_add(a, b) ((a) + (b))
-#define exp_inc(a) ((*a)++)
-#define exp_dec(a) ((*a)--)
-#define exp_is_zero(a) ((a) == 0)
-#define exp_is_neg(a) ((a) < 0)
-#define exp_is_pos(a) ((a) > 0)
-#define exp_inp_add(a,b) ((*(a)) += (b))
-#define exp_to_int(a) (a)
-#define exp_to_i(a) (a)
-#define exp_from_i(a) (a)
-#define exp_sub_i(exp, a) ((exp) - (a))
-#define exp_sub(a, b) ((a) - (b))
-#define exp_g(a, b) ((a) > (b))
-#define exp_l(a, b) ((a) < (b))
-#define exp_ge(a, b) ((a) >= (b))
-#define exp_eq(a, b) ((a) == (b))
-#define exp_ne(a, b) ((a) != (b))
-#define exp_g_i(a, b) ((a) > (b))
-#define exp_l_i(a, b) ((a) < (b))
-#define exp_ge_i(a, b) ((a) >= (b))
-#define exp_le_i(a, b) ((a) <= (b))
+int
+exp_ne(exp_t a, exp_t b) {
+ int cmp = exp_cmp(a, b);
+ return cmp != 0;
+}
+
+int
+exp_ne_i(exp_t a, long b) {
+ return exp_ne(a, exp_from_i(b));
+}
+
+int
+exp_ge(exp_t a, exp_t b) {
+ int cmp = exp_cmp(a, b);
+ return (cmp == 1) || (cmp == 0);
+}
+
+int
+exp_ge_i(exp_t a, long b) {
+ return exp_ge(a, exp_from_i(b));
+}
+
+int
+exp_le(exp_t a, exp_t b) {
+ int cmp = exp_cmp(a, b);
+ return (cmp == -1) || (cmp == 0);
+}
+
+int exp_le_i(exp_t a, long b) {
+ return exp_le(a, exp_from_i(b));
+}
+
+int exp_is_zero(exp_t a) {
+ return a.limbs[0] == 0 && a.size == 1;
+}
+
+int exp_is_neg(exp_t a) {
+ return !exp_is_zero(a) && a.sign == 1;
+}
+
+int exp_is_pos(exp_t a) {
+ return !exp_is_zero(a) && a.sign == 0;
+}
+
+exp_t
+exp_neg(exp_t a) {
+ a.sign ^= 1;
+ if (a.limbs[0] == 0 && a.size == 1)
+ a.sign = 0;
+ return a;
+}
+
+exp_t
+exp_min(exp_t a, exp_t b) {
+ int cmp = exp_cmp(a, b);
+ if (cmp == 1)
+ return b;
+ else
+ return a;
+}
+
+exp_t
+exp_max(exp_t a, exp_t b) {
+ int cmp = exp_cmp(a, b);
+ if(cmp == 1)
+ return a;
+ else
+ return b;
+}
+
+#else
+#define exp_add_i(exp, a) ((exp) + (a)) //
+#define exp_add(a, b) ((a) + (b)) //
+#define exp_inc(a) ((*a)++) //
+#define exp_dec(a) ((*a)--) //
+#define exp_is_zero(a) ((a) == 0) //
+#define exp_is_neg(a) ((a) < 0) //
+#define exp_is_pos(a) ((a) > 0) //
+#define exp_inp_add(a,b) ((*(a)) += (b)) //
+#define exp_to_int(a) (a) //
+#define exp_to_i(a) (a) //
+#define exp_from_i(a) (a) //
+#define exp_sub_i(exp, a) ((exp) - (a)) //
+#define exp_sub(a, b) ((a) - (b)) //
+#define exp_g(a, b) ((a) > (b)) //
+#define exp_l(a, b) ((a) < (b)) //
+#define exp_ge(a, b) ((a) >= (b)) //
+#define exp_eq(a, b) ((a) == (b)) //
+#define exp_eq_i(a, b) ((a) == (b)) //
+#define exp_ne(a, b) ((a) != (b)) //
+#define exp_g_i(a, b) ((a) > (b)) //
+#define exp_l_i(a, b) ((a) < (b)) //
+#define exp_ge_i(a, b) ((a) >= (b)) //
+#define exp_le_i(a, b) ((a) <= (b)) //
#define exp_mod_i(a, b) ((a) % (b))
-#define exp_ge_i(a, b) ((a) >= (b))
+#define exp_ge_i(a, b) ((a) >= (b)) //
#define exp_floordiv_i(a,b) ((a) / (b) - ((a) % (b) && (a) < 0))
-#define exp_inp_sub(a, b) ((*(a)) -= (b))
-#define exp_inp_sub_i(a, b) ((*(a)) -= (b))
-#define exp_sprintf(a, e) (sprintf(a, "%d", e))
-#define exp_sscanf(a, e) (sscanf(a, "%d", e))
-#define exp_min(a, b) ((a) < (b) ? (a) : (b))
-#define exp_max(a, b) ((a) > (b) ? (a) : (b))
-#define exp_neg(a) (-(a))
-#define exp_mul_i(a, b) ((a) * (b))
-
+#define exp_inp_sub(a, b) ((*(a)) -= (b)) //
+#define exp_inp_sub_i(a, b) ((*(a)) -= (b)) //
+#define exp_sprintf(a, e) (sprintf(a, "%d", e)) //
+#define exp_sscanf(a, e) (sscanf(a, "%d", e)) //
+#define exp_min(a, b) ((a) < (b) ? (a) : (b)) //
+#define exp_max(a, b) ((a) > (b) ? (a) : (b)) //
+#define exp_neg(a) (-(a)) //
+#define exp_mul_i(a, b) ((a) * (b)) //
+#endif
/* helpful macros ************************************************************/
@@ -720,7 +1127,7 @@
static PyObject *decimal_from_long(PyTypeObject *, long);
static PyObject *decimal_str(decimalobject *);
static PyObject *_do_decimal_str(decimalobject *, contextobject *, int);
-static decimalobject *_new_decimalobj(PyTypeObject *, long, char, long);
+static decimalobject *_new_decimalobj(PyTypeObject *, long, char, exp_t);
static decimalobject *_do_decimal_subtract(decimalobject *, decimalobject *, contextobject *);
static contextobject *context_shallow_copy(contextobject *);
static PyObject *context_ignore_all_flags(contextobject *);
@@ -776,7 +1183,7 @@
else
sign = SIGN_POSNAN;
- res = _new_decimalobj(type, thing->ob_size, sign, 0);
+ res = _new_decimalobj(type, thing->ob_size, sign, exp_from_i(0));
if (!res) return NULL;
for (i = 0; i < thing->ob_size; i++)
res->digits[i] = thing->digits[i]; /* DELETE */
@@ -890,7 +1297,7 @@
res = (decimalobject *)PyDecimal_Inf;
else {
res = _new_decimalobj(type, ctx->prec,
- lsign, ctx->Emax - ctx->prec + 1);
+ lsign, exp_sub_i(ctx->Emax, ctx->prec - 1));
if (res) {
for (i = 0; i < ctx->prec; i++)
res->digits[i] = 9;
@@ -903,7 +1310,7 @@
res = (decimalobject *)PyDecimal_NegInf;
else {
res = _new_decimalobj(type, ctx->prec,
- lsign, ctx->Emax - ctx->prec + 1);
+ lsign, exp_sub_i(ctx->Emax, ctx->prec - 1));
if (res) {
for (i = 0; i < ctx->prec; i++)
res->digits[i] = 9;
@@ -1123,7 +1530,7 @@
contextobject *ctx, decimalobject *tmp)
{
decimalobject *new;
- assert(expdiff > 0);
+ assert(exp_g_i(expdiff, 0));
if(self->ob_size > prec && _limb_get_digit(self->limbs, self->ob_size, prec) >= 5){ /* SLOW */
new = _decimal_increment(tmp, 1, ctx);
Py_DECREF(tmp);
@@ -1146,7 +1553,7 @@
{
long i, last;
decimalobject *tmp;
- assert(expdiff > 0);
+ assert(exp_g_i(expdiff, 0));
tmp = _NEW_decimalobj(prec, self->sign, exp_add(self->exp, expdiff));
if (!tmp) return NULL;
for (i = 0; i < prec; i++)
@@ -1170,7 +1577,7 @@
{
decimalobject *tmp;
long i, last;
- assert(expdiff > 0);
+ assert(exp_g_i(expdiff, 0));
tmp = _NEW_decimalobj(prec, self->sign, exp_add(self->exp, expdiff));
if (!tmp) return NULL;
for (i = 0; i < prec; i++)
@@ -1225,7 +1632,7 @@
/* Mapping rounding constants to functions. Since in C this can be indexed with
any value, it's important to check the rounding constants before! */
-typedef decimalobject*(*round_func)(decimalobject *, long, long, contextobject *);
+typedef decimalobject*(*round_func)(decimalobject *, long, exp_t, contextobject *);
static round_func round_funcs[] = {
_round_down, _round_up, _round_half_down, _round_half_even,
_round_half_up, _round_floor, _round_ceiling
@@ -1324,7 +1731,7 @@
}
/* Maybe all the lost digits are 0. */
- for (i = self->ob_size - exp_to_int(expdiff); i < self->ob_size; i++) {
+ for (i = self->ob_size - exp_to_i(expdiff); i < self->ob_size; i++) {
if(_limb_get_digit(self->limbs, self->ob_size, i) > 0)
goto no_way;
}
@@ -1387,7 +1794,7 @@
decimalobject *ans = NULL, *tmp;
int ret;
exp_t diff;
- long adj;
+ exp_t adj;
long digits, i;
if (ISSPECIAL(self)) {
@@ -1414,7 +1821,7 @@
}
diff = exp_sub(self->exp, exp);
- digits = exp_to_int(exp_add_i(diff, self->ob_size));
+ digits = exp_to_i(exp_add_i(diff, self->ob_size));
if (watchexp && ctx->prec < digits)
return handle_InvalidOperation(self->ob_type, ctx,
@@ -1457,10 +1864,10 @@
adj = ADJUSTED(tmp);
if (decimal_nonzero(tmp)) {
- if (adj < ctx->Emin) {
+ if (exp_l(adj, ctx->Emin)) {
if (handle_Subnormal(ctx, NULL) != 0)
return NULL;
- } else if (adj > ctx->Emax) {
+ } else if (exp_g(adj, ctx->Emax)) {
ans = handle_InvalidOperation(self->ob_type, ctx,
"rescale(a, INF)", NULL);
Py_DECREF(tmp);
@@ -2036,7 +2443,7 @@
return dup;
if (!decimal_nonzero(dup)) {
- new = _NEW_decimalobj(1, dup->sign, 0);
+ new = _NEW_decimalobj(1, dup->sign, exp_from_i(0));
Py_DECREF(dup);
if (!new) return NULL;
new->limbs[0] = 0;
@@ -2183,7 +2590,7 @@
{
- decimalobject *two = _NEW_decimalobj(1, 0, 0);
+ decimalobject *two = _NEW_decimalobj(1, 0, exp_from_i(0));
if (!two) {
PyObject *r = context_regard_flags(ctx2, flags);
Py_XDECREF(r);
@@ -2293,7 +2700,7 @@
{
decimalobject *one;
decimalobject *tmp;
- one = _NEW_decimalobj(1,0,0);
+ one = _NEW_decimalobj(1,0,exp_from_i(0));
if (!one)
goto err;
@@ -2457,7 +2864,7 @@
}
if (!decimal_nonzero(self)) {
- ret = _NEW_decimalobj(1, 0, 0);
+ ret = _NEW_decimalobj(1, 0, exp_from_i(0));
if (!ret)
return NULL;
@@ -2815,7 +3222,7 @@
/* XXX: slow? */
rnd = _set_flag(ctx->ignored, S_ROUNDED, 1);
inex =_set_flag(ctx->ignored, S_INEXACT, 1);
- ans = _decimal_rescale(self, 0, ctx, rounding, 1);
+ ans = _decimal_rescale(self, exp_from_i(0), ctx, rounding, 1);
if (!rnd) _set_flag(ctx->ignored, S_ROUNDED, rnd);
if (!inex) _set_flag(ctx->ignored, S_INEXACT, inex);
return ans;
@@ -2900,7 +3307,8 @@
return PyInt_FromLong(0);
/* XXX: Overflow? */
- return PyInt_FromSsize_t(ADJUSTED(self));
+ /* EXP TODO */
+ return PyInt_FromSsize_t(exp_to_i(ADJUSTED(self)));
}
@@ -2920,8 +3328,9 @@
PyTuple_SET_ITEM(digits, i, d);
}
+ /* TODO EXP */
if(!ISINF(self))
- res = Py_BuildValue("(bOn)", self->sign % 2, digits, self->exp);
+ res = Py_BuildValue("(bOn)", self->sign % 2, digits, exp_to_i(self->exp));
else {
inf = PyString_FromString("F");
if (!inf) {
@@ -3059,7 +3468,7 @@
{
char *outbuf;
int buflen, i;
- int dotplace;
+ exp_t dotplace;
exp_t adjexp;
int append_E = 0, append_adjexp = 0;
long extra_zeros=0;
@@ -3101,41 +3510,49 @@
/* why is that like? well, d->exp moves dot right, d->ob_size moves dot right
* and adjexp moves dot left */
adjexp = exp_from_i(0);
- dotplace = d->exp + d->ob_size;
+ dotplace = exp_add_i(d->exp, d->ob_size);
/* dotplace must be at most d->ob_size (no dot at all) and at last -5 (6 pre-zeros)*/
- if(dotplace >d->ob_size || dotplace <-5)
+// if(dotplace >d->ob_size || dotplace <-5)
+ if (exp_g_i(dotplace, d->ob_size) || exp_l_i(dotplace, -5))
{
/* ok, we have to put dot after 1 digit, that is dotplace = 1
* we compute adjexp from equation (1) */
- dotplace = 1;
+ dotplace = exp_from_i(1);
// adjexp = -dotplace + d->exp + d->ob_size;
- adjexp = exp_add_i(d->exp, d->ob_size - dotplace);
+ adjexp = exp_add_i(exp_sub(d->exp,dotplace), d->ob_size);
}
if(engineering) /* we have to be sure, adjexp%3 == 0 */
while(exp_mod_i(adjexp,3))
{
exp_dec(&adjexp);
- dotplace ++;
+ exp_inc(&dotplace);
}
/* now all we have to do, is to put it to the string =] */
- if(dotplace > d->ob_size)
- extra_zeros = dotplace - d->ob_size;
+// if(dotplace > d->ob_size)
+ if (exp_g_i(dotplace, d->ob_size))
+ extra_zeros = exp_to_i(exp_sub_i(dotplace, d->ob_size));
- if(dotplace <= 0)
+// if(dotplace <= 0)
+ if (exp_le_i(dotplace, 0))
{
*p++ = '0';
*p++ = '.';
- while(dotplace++ < 0)
+// while(dotplace++ < 0)
+// *p++ = '0';
+ while (exp_l_i(dotplace, 0)) {
+ exp_inc(&dotplace);
*p++ = '0';
- dotplace = -1;
+ }
+
+ dotplace = exp_from_i(-1);
}
for(i = 0;i<d->ob_size;i++)
{
- if(dotplace == i)
+ if(exp_eq_i(dotplace, i))
*p++ = '.';
p += sprintf(p, "%d", _limb_get_digit(d->limbs,d->ob_size,i));
}
@@ -3264,7 +3681,7 @@
int inf_sign;
inf_sign = sign ? SIGN_NEGINF : SIGN_POSINF;
- inf = _NEW_decimalobj(1, inf_sign, 0);
+ inf = _NEW_decimalobj(1, inf_sign, exp_from_i(0));
if (!inf)
return NULL;
@@ -3288,7 +3705,7 @@
else if (divmod == 2) {
decimalobject *nan;
- nan = _NEW_decimalobj(1, SIGN_POSNAN, 0);
+ nan = _NEW_decimalobj(1, SIGN_POSNAN, exp_from_i(0));
if (!nan) {
Py_DECREF(inf);
return NULL;
@@ -3310,7 +3727,7 @@
if (ISINF(other)) {
if (divmod) {
decimalobject *ans1, *ans2;
- ans1 = _NEW_decimalobj(1, sign, 0);
+ ans1 = _NEW_decimalobj(1, sign, exp_from_i(0));
if (!ans1)
return NULL;
@@ -3363,7 +3780,7 @@
return NULL;
//ans2->exp = self->exp < other->exp ? self->exp : other->exp;
ans2->exp = exp_min(self->exp, other->exp);
- ans1 = _NEW_decimalobj(1, sign, 0);
+ ans1 = _NEW_decimalobj(1, sign, exp_from_i(0));
if (!ans1) {
Py_DECREF(ans2);
return NULL;
@@ -3439,7 +3856,7 @@
if (cmp == -1) {
decimalobject *ans1, *ans2;
- ans1 = _NEW_decimalobj(1, sign, 0);
+ ans1 = _NEW_decimalobj(1, sign, exp_from_i(0));
if (!ans1)
return NULL;
ans1->limbs[0] = 0;
@@ -3546,7 +3963,7 @@
/* we create room for additional limb, in case remainder != 0
* then, we will just move results->limbs one left, and set first
* limb 1, to make rounding working properly */
- result = _NEW_decimalobj(significant_limbs * LOG + LOG, sign, 0);
+ result = _NEW_decimalobj(significant_limbs * LOG + LOG, sign, exp_from_i(0));
for (i = 0; i < result->limb_count; i++)
result->limbs[i] = 0;
@@ -3570,14 +3987,14 @@
else
min_expdiff = exp_from_i(op1->limb_count);
+ /* EXP TODO ?? (min_expdiff) */
expdiff = exp_from_i(_limb_divide(op1->limbs, op1->limb_count,
op2->limbs, op2->limb_count, result->limbs,
- significant_limbs, remainder, min_expdiff));
+ significant_limbs, remainder, exp_to_i(min_expdiff)));
result->limbs[significant_limbs] = 0;
// exp += expdiff * LOG + LOG;
exp_inp_add(&exp, exp_add_i(exp_mul_i(expdiff, LOG), LOG));
-
rlimbs = _limb_size(remainder, remainder_limbs);
/* remainder non-zero */
if (!(rlimbs == 1 && remainder[0] == 0)) {
@@ -3600,7 +4017,7 @@
}
if (divmod) {
- remainder_ret = _NEW_decimalobj(rlimbs * LOG, self->sign, 0);
+ remainder_ret = _NEW_decimalobj(rlimbs * LOG, self->sign, exp_from_i(0));
if (!remainder_ret) {
Py_DECREF(op1);
Py_DECREF(result);
@@ -3626,13 +4043,13 @@
max_size = op1->ob_size > op2->ob_size ? op1->ob_size : op2->ob_size;
/* adjusted is computed just like in the python code */
- adjusted = self->ob_size - other->ob_size + 1;
+ adjusted = exp_from_i(self->ob_size - other->ob_size + 1);
for (i = 0;i < max_size ;i++) {
long x1,x2;
x1 = _limb_get_digit(self->limbs, self->ob_size, i);
x2 = _limb_get_digit(other->limbs, other->ob_size, i);
if (x2 > x1) {
- adjusted --;
+ exp_dec(&adjusted);
break;
}
}
@@ -3642,7 +4059,7 @@
* have more than adjusted digits =] */
- while (result->ob_size > adjusted &&
+ while (exp_l_i(adjusted, result->ob_size) &&
_limb_get_digit(result->limbs, result->ob_size, result->ob_size -1) == 0) {
/* when int dividing, exp should be 0 */
if (exp_ge_i(result->exp, 0) && divmod)
@@ -3655,7 +4072,7 @@
result->limb_count = (result->ob_size + LOG -1)/LOG;
- if (exp_add_i(result->exp, result->ob_size) > ctx->prec && shouldround && divmod) {
+ if (exp_g_i(exp_add_i(result->exp, result->ob_size), ctx->prec) && shouldround && divmod) {
Py_DECREF(remainder_ret);
Py_DECREF(result);
Py_DECREF(op1);
@@ -3671,7 +4088,7 @@
contextobject *ctx2 = 0;
exp_t remainder_exp = exp_min(self->exp, other->exp);
decimalobject *rescaled_rem = 0;
- decimalobject *rescaled = _decimal_rescale(result, 0, ctx, -1, 0);
+ decimalobject *rescaled = _decimal_rescale(result, exp_from_i(0), ctx, -1, 0);
if (!rescaled) {
Py_DECREF(result);
Py_DECREF(remainder_ret);
@@ -3943,7 +4360,7 @@
}
if (!decimal_nonzero(self)) {
// oexp = other->exp - ctx->prec - 1;
- oexp = exp_sub(other->exp, ctx->prec + 1);
+ oexp = exp_sub_i(other->exp, ctx->prec + 1);
// exp = (exp > oexp ? exp : oexp);
exp = exp_max(exp, oexp);
res = _decimal_rescale(other, exp, ctx, 0, 0);
@@ -3958,7 +4375,7 @@
}
if (!decimal_nonzero(other)) {
// oexp = self->exp - ctx->prec - 1;
- oexp = exp_sub(self->exp, ctx->prec + 1);
+ oexp = exp_sub_i(self->exp, ctx->prec + 1);
// exp = (exp > oexp ? exp : oexp);
exp = exp_max(exp, oexp);
res = _decimal_rescale(self, exp, ctx, 0, 0);
@@ -4184,7 +4601,7 @@
{
if(decimal_nonzero(other))
{
- ans = _NEW_decimalobj(1, resultsign ? SIGN_NEGINF : SIGN_POSINF, 0);
+ ans = _NEW_decimalobj(1, resultsign ? SIGN_NEGINF : SIGN_POSINF, exp_from_i(0));
return ans;
}
return handle_InvalidOperation(self->ob_type, ctx, "(+-)INF * 0", NULL);
@@ -4194,7 +4611,7 @@
{
if(decimal_nonzero(self))
{
- ans = _NEW_decimalobj(1, resultsign ? SIGN_NEGINF : SIGN_POSINF, 0);
+ ans = _NEW_decimalobj(1, resultsign ? SIGN_NEGINF : SIGN_POSINF, exp_from_i(0));
return ans;
}
return handle_InvalidOperation(self->ob_type, ctx, "0 * (+-)INF", NULL);
@@ -4349,9 +4766,9 @@
long spot;
int mod = modulo != Py_None;
long firstprec = ctx->prec;
- double a, b;
+ int cmp;
- if (ISINF(other) || ADJUSTED(other) > 8) {
+ if (ISINF(other) || exp_g_i(ADJUSTED(other), 8)) {
return handle_InvalidOperation(self->ob_type, ctx, "x ** INF", NULL);
}
@@ -4371,7 +4788,7 @@
}
if (!decimal_nonzero(other)) {
- ret = _NEW_decimalobj(1, 0, 0);
+ ret = _NEW_decimalobj(1, 0, exp_from_i(0));
ret->limbs[0] = 1;
return ret;
}
@@ -4398,7 +4815,7 @@
"INF % x", NULL);
}
- ret = _NEW_decimalobj(1, sign, 0);
+ ret = _NEW_decimalobj(1, sign, exp_from_i(0));
ret->limbs[0] = 0;
if (n > 0)
@@ -4408,11 +4825,23 @@
}
/* XXX temporary solution */
- a = ((double) self->exp + (double) self->ob_size - (double) 1) * (double) n;
- b = ctx->Emax;
+ {
+#ifdef BIG_EXP
+ exp_t b;
+ exp_t a = exp_add_i(self->exp, self->ob_size - 1);
+ b = exp_mul_i(a, n);
+ cmp = exp_g(a, ctx->Emax);
+
+#else
+ double a,b;
+ a = ((double) self->exp + (double) self->ob_size - (double) 1) * (double) n;
+ b = ctx->Emax;
+ cmp = a > b;
+#endif
+ }
if (!mod && n > 0 && /* (self->exp + self->ob_size -1) * n > ctx->Emax && */
- a > b &&
+ cmp &&
decimal_nonzero(self)) {
if (handle_Rounded(ctx, NULL)) {
@@ -4442,12 +4871,12 @@
}
}
- if (!mod && ctx->prec > PyDecimal_DefaultContext->Emax) {
+ if (!mod && exp_l_i(PyDecimal_DefaultContext->Emax, ctx->prec)) {
ctx->prec = firstprec;
return handle_Overflow(self->ob_type, ctx, "Too much precision", sign);
}
mul = _decimal_get_copy(self);
- val = _NEW_decimalobj(1,0,0);
+ val = _NEW_decimalobj(1,0,exp_from_i(0));
val->limbs[0] = 1;
if(!val || !mul) {
@@ -4691,7 +5120,7 @@
char *buf;
PyObject *res;
- max = exp_to_i(self->ob_size) + self->exp;
+ max = exp_to_i(self->exp) + self->ob_size;
buf = PyMem_MALLOC(max + 2); /* with sign */
if (!buf) return NULL;
@@ -4836,7 +5265,7 @@
* whether the fractional part consists only of zeroes. */
i = exp_from_i(d->ob_size-1);
cmp = exp_add_i(d->exp, d->ob_size);
- for (; exp_ge(i, cmp); --i) {
+ for (; exp_ge(i, cmp); exp_dec(&i)) {
/* if (d->digits[i] > 0)*/
if(_limb_get_digit(d->limbs, d->ob_size, exp_to_i(i)) > 0)
return 0;
@@ -5173,7 +5602,7 @@
v = value;
- new = _new_decimalobj(type, ndigits, (neg ? SIGN_NEG : SIGN_POS), 0);
+ new = _new_decimalobj(type, ndigits, (neg ? SIGN_NEG : SIGN_POS), exp_from_i(0));
if (!new) return NULL;
while (value) {
new->digits[ndigits-i-1] = value % 10;
@@ -5361,7 +5790,7 @@
return NULL;
if (value == NULL) {
- decimalobject *new = _new_decimalobj(type, 1, 0, 0);
+ decimalobject *new = _new_decimalobj(type, 1, 0, exp_from_i(0));
if (!new) return NULL;
new->digits[0] = 0;
new->limbs[0] = 0;
@@ -5868,14 +6297,16 @@
static PyObject *
context_Etiny(contextobject *self)
{
- return PyInt_FromSsize_t(ETINY(self));
+ /* TODO EXP */
+ return PyInt_FromSsize_t(exp_to_i(ETINY(self)));
}
static PyObject *
context_Etop(contextobject *self)
{
- return PyInt_FromSsize_t(ETOP(self));
+ /* TODO EXP */
+ return PyInt_FromSsize_t(exp_to_i(ETOP(self)));
}
@@ -6105,6 +6536,7 @@
}
+/* TODO EXP */
static PyObject *
context_reduce(contextobject *self)
{
@@ -6112,7 +6544,7 @@
return Py_BuildValue("(O(liiOOlliiO))", self->ob_type,
self->prec, self->rounding,
self->rounding_dec, self->traps, self->flags,
- self->Emin, self->Emax, self->capitals,
+ exp_to_i(self->Emin), exp_to_i(self->Emax), self->capitals,
self->clamp, self->ignored);
}
@@ -6579,7 +7011,9 @@
"traps", "flags", "Emin", "Emax",
"capitals", "_clamp", "_ignored_flags", 0};
long prec = LONG_MIN;
- long Emin = LONG_MAX, Emax = LONG_MIN;
+// long Emin = LONG_MAX, Emax = LONG_MIN;
+ long TEmin, TEmax;
+ exp_t Emin, Emax;
int rounding = -1, rounding_dec = -1, capitals = -1, clamp = 0;
PyObject *pytraps = NULL, *pyflags = NULL, *pyignored = NULL;
PyObject *tmp, *res = NULL;
@@ -6588,12 +7022,17 @@
PyObject *_ignored = NULL;
int i, j;
+ TEmin = LONG_MAX;
+ TEmax = LONG_MIN;
+ /* TODO EXP */
if (!PyArg_ParseTupleAndKeywords(args, kwds, "|liiOOlliiO:Context", kwlist,
&prec, &rounding, &rounding_dec, &pytraps,
- &pyflags, &Emin, &Emax, &capitals, &clamp,
+ &pyflags, &TEmin, &TEmax, &capitals, &clamp,
&pyignored))
return NULL;
+ Emin = exp_from_i(TEmin);
+ Emax = exp_from_i(TEmax);
if (pytraps == NULL) {
_traps = PyDict_Copy(PyDecimal_DefaultContext->traps);
@@ -6609,6 +7048,7 @@
goto err;
else if (j == 1)
_set_flag(_traps, i, 1);
+ else _set_flag(_traps, i, 0);
}
} else {
for (i = 0; i < NUMSIGNALS; i++) {
@@ -6625,6 +7065,7 @@
goto err;
else if (j == 1)
_set_flag(_traps, i, 1);
+ else _set_flag(_traps, i, 0);
}
}
}
@@ -6695,10 +7136,10 @@
if (prec == LONG_MIN)
prec = PyDecimal_DefaultContext->prec;
- if (Emin == LONG_MAX)
+ if (exp_eq_i(Emin, LONG_MAX))
Emin = PyDecimal_DefaultContext->Emin;
- if (Emax == LONG_MIN)
+ if (exp_eq_i(Emax, LONG_MIN))
Emax = PyDecimal_DefaultContext->Emax;
if (capitals == -1)
@@ -6736,8 +7177,8 @@
static PyMemberDef context_members[] = {
{"prec", T_LONG, OFF(prec), 0},
- {"Emin", T_LONG, OFF(Emin), 0},
- {"Emax", T_LONG, OFF(Emax), 0},
+// {"Emin", T_LONG, OFF(Emin), 0},
+// {"Emax", T_LONG, OFF(Emax), 0},
{"capitals", T_INT, OFF(capitals), 0},
{"rounding", T_INT, OFF(rounding), 0},
{"_rounding_decision", T_INT, OFF(rounding_dec), 0},
@@ -6748,6 +7189,42 @@
{NULL}
};
+static PyObject *
+context_get_emax(contextobject *self) {
+ return PyInt_FromLong(exp_to_i(self->Emax));
+}
+
+
+/* TODO EXP */
+static int
+context_set_emax(contextobject *self, PyObject *value) {
+ long var = PyInt_AsLong(value);
+ if (PyErr_Occurred())
+ return -1;
+ self->Emax = exp_from_i(var);
+ return 0;
+}
+
+static PyObject *
+context_get_emin(contextobject *self) {
+ return PyInt_FromLong(exp_to_i(self->Emin));
+}
+
+/* TODO EXP */
+static int
+context_set_emin(contextobject *self, PyObject *value) {
+ long var = PyInt_AsLong(value);
+ if (PyErr_Occurred())
+ return -1;
+ self->Emin = exp_from_i(var);
+ return 0;
+}
+
+static PyGetSetDef context_getset[] = {
+ {"Emax", (getter) context_get_emax, (setter)context_set_emax},
+ {"Emin", (getter) context_get_emin, (setter)context_set_emin},
+ {NULL}
+};
/* XXX: If the dicts stay, Context might need to become a GC type */
@@ -6783,7 +7260,7 @@
0, /* tp_iternext */
context_methods, /* tp_methods */
context_members, /* tp_members */
- 0, /* tp_getset */
+ context_getset, /* tp_getset */
0, /* tp_base */
0, /* tp_dict */
0, /* tp_descr_get */
@@ -6941,7 +7418,8 @@
_set_flag(traps, S_OVERFLOW, 1);
_set_flag(traps, S_INV_OPERATION, 1);
PyDecimal_DefaultContext = _new_contextobj(28, ROUND_HALF_EVEN, ALWAYS_ROUND,
- traps, NULL, -999999999L, 999999999L,
+ traps, NULL, exp_from_i(-999999999L),
+ exp_from_i(999999999L),
1, 0, NULL, 1);
if (!PyDecimal_DefaultContext)
return;
@@ -6954,7 +7432,8 @@
_set_flag(traps, S_UNDERFLOW, 1);
PyDecimal_BasicContext = _new_contextobj(9, ROUND_HALF_UP, ALWAYS_ROUND,
- traps, NULL, -999999999L, 999999999L,
+ traps, NULL, exp_from_i(-999999999L),
+ exp_from_i(999999999L),
1, 0, NULL, 1);
if (!PyDecimal_BasicContext)
return;
@@ -6963,7 +7442,8 @@
return;
PyDecimal_ExtendedContext = _new_contextobj(9, ROUND_HALF_EVEN, ALWAYS_ROUND,
- NULL, NULL, -999999999L, 999999999L,
+ NULL, NULL, exp_from_i(-999999999L),
+ exp_from_i(999999999L),
1, 0, NULL, 1);
if (!PyDecimal_ExtendedContext)
return;
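The exp_cmp() added above does a sign-magnitude comparison: differing signs
decide the result immediately, otherwise the limb magnitudes are compared and
the outcome is flipped for negative values. The same logic as a small Python
sketch (assuming normalized little-endian limb lists; illustrative only, not
the C implementation):

    def exp_cmp(a_sign, a_limbs, b_sign, b_limbs):
        """Return -1, 0 or 1; sign is 0 for non-negative, 1 for negative."""
        if a_sign != b_sign:
            return -1 if a_sign else 1
        def cmp_mag(x, y):
            # Longer (normalized) limb list wins, then most significant limb.
            if len(x) != len(y):
                return 1 if len(x) > len(y) else -1
            for xd, yd in zip(reversed(x), reversed(y)):
                if xd != yd:
                    return 1 if xd > yd else -1
            return 0
        c = cmp_mag(a_limbs, b_limbs)
        return -c if a_sign else c

    assert exp_cmp(0, [5], 1, [7]) == 1    # +5 > -7
    assert exp_cmp(1, [5], 1, [7]) == 1    # -5 > -7 (smaller magnitude wins)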
Modified: sandbox/trunk/decimal-c/decimal.h
==============================================================================
--- sandbox/trunk/decimal-c/decimal.h (original)
+++ sandbox/trunk/decimal-c/decimal.h Thu Jul 6 23:13:23 2006
@@ -7,10 +7,23 @@
extern "C" {
#endif
+#define BIG_EXP
+
+
+#ifdef BIG_EXP
+#define EXP_LIMB_COUNT 10 /* maximal number of limbs per exp */
+typedef struct {
+ long limbs[EXP_LIMB_COUNT];
+ int sign;
+ int size;
+} exp_t;
+#else
#define exp_t long
+#endif
#define BASE 1000 /* biggest value of limb power of 10 */
#define LOG 3 /* number of digits per limb LOG = log10(BASE) */
+#define LOG_STR "3" /* any ideas how to avoid that? :P */
/* decimalobject struct ******************************************************/
typedef struct {
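With BIG_EXP enabled, an exponent is stored as up to EXP_LIMB_COUNT limbs in
base BASE = 1000, little-endian, with LOG = 3 decimal digits per limb and a
separate sign flag. How a plain integer maps onto that representation, as a
rough Python sketch (mirroring the intent of exp_from_i(), not the C code):

    BASE = 1000   # largest power of ten a limb can hold, as in decimal.h
    LOG = 3       # decimal digits per limb

    def to_limbs(n):
        """Split n into (sign, little-endian base-1000 limbs)."""
        sign = 1 if n < 0 else 0
        n = abs(n)
        limbs = []
        while True:
            limbs.append(n % BASE)
            n //= BASE
            if n == 0:
                break
        return sign, limbs

    print(to_limbs(1234567))   # (0, [567, 234, 1])
    print(to_limbs(-42))       # (1, [42])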
From python-checkins at python.org Fri Jul 7 00:11:58 2006
From: python-checkins at python.org (mateusz.rukowicz)
Date: Fri, 7 Jul 2006 00:11:58 +0200 (CEST)
Subject: [Python-checkins] r47282 - sandbox/trunk/decimal-c/test_decimal.py
Message-ID: <20060706221158.795C91E4003@bag.python.org>
Author: mateusz.rukowicz
Date: Fri Jul 7 00:11:57 2006
New Revision: 47282
Modified:
sandbox/trunk/decimal-c/test_decimal.py
Log:
Fixed some brain-damaged code made by me ;P.
Modified: sandbox/trunk/decimal-c/test_decimal.py
==============================================================================
--- sandbox/trunk/decimal-c/test_decimal.py (original)
+++ sandbox/trunk/decimal-c/test_decimal.py Fri Jul 7 00:11:57 2006
@@ -1047,11 +1047,12 @@
def test_pickle(self):
c = Context()
e = pickle.loads(pickle.dumps(c))
- for k in dir(c):
+ for k in ("prec", "Emin", "Emax", "capitals", "rounding",\
+ "_rounding_decision", "_clamp", "flags", "traps", "_ignored"):
#v1 = vars(c)[k]
v1 = c.__getattribute__(k)
#v2 = vars(e)[k]
- v2 = c.__getattribute__(k)
+ v2 = e.__getattribute__(k)
self.assertEqual(v1, v2)
def test_equality_with_other_types(self):
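The point of the test fix is that the old loop compared c against itself, so a
broken pickle round-trip could never fail, and dir(c) pulled in names that are
not meaningful to compare. The pattern, reduced to a toy class (Box is made up
purely for illustration):

    import pickle

    class Box(object):
        def __init__(self, prec=28, capitals=1):
            self.prec = prec
            self.capitals = capitals
        def __reduce__(self):
            return (Box, (self.prec, self.capitals))

    c = Box()
    e = pickle.loads(pickle.dumps(c))
    for k in ("prec", "capitals"):              # explicit attribute list
        # compare the original against the round-tripped copy, not itself
        assert getattr(c, k) == getattr(e, k)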
From python-checkins at python.org Fri Jul 7 00:27:50 2006
From: python-checkins at python.org (matt.fleming)
Date: Fri, 7 Jul 2006 00:27:50 +0200 (CEST)
Subject: [Python-checkins] r47283 - in sandbox/trunk/pdb:
Doc/lib/libmpdb.tex README.txt mconnection.py mpdb.py
mthread.py test/test_mconnection.py test/test_mpdb.py
test/test_mthread.py test/thread_script.py
Message-ID: <20060706222750.A9BEE1E4003@bag.python.org>
Author: matt.fleming
Date: Fri Jul 7 00:27:49 2006
New Revision: 47283
Modified:
sandbox/trunk/pdb/Doc/lib/libmpdb.tex
sandbox/trunk/pdb/README.txt
sandbox/trunk/pdb/mconnection.py
sandbox/trunk/pdb/mpdb.py
sandbox/trunk/pdb/mthread.py
sandbox/trunk/pdb/test/test_mconnection.py
sandbox/trunk/pdb/test/test_mpdb.py
sandbox/trunk/pdb/test/test_mthread.py
sandbox/trunk/pdb/test/thread_script.py
Log:
Fix test cases. When using TCP as a pdbserver connection, setsockopt's
REUSEADDR is on by default. Add some documentation regarding threading and
a little bit on remote debugging. Update README with some things to finish.
Change the thread_script.py test file so it raises an exception.
Modified: sandbox/trunk/pdb/Doc/lib/libmpdb.tex
==============================================================================
--- sandbox/trunk/pdb/Doc/lib/libmpdb.tex (original)
+++ sandbox/trunk/pdb/Doc/lib/libmpdb.tex Fri Jul 7 00:27:49 2006
@@ -1023,6 +1023,15 @@
This section provides information on Python's thread debugging facilities and
how \module{mpdb} makes use of them.
+\module{mpdb} uses a master-slave system for handling thread debugging. The
+main debugger (master) is the instance of \class{MPdb} that is
+used by the Python
+Interpreter. The thread that runs the Python Interpreter is a special class
+called \class{_MainThread}. Whenever we refer to the main debugger, this is
+what we are referring to. The \class{MTracer} objects (slaves)
+are responsible for tracing threads that are created in the script being
+debugged.
+
% How does Python facilitate debugging threads
\subsection{Python's Thread Debugging Features}
@@ -1030,8 +1039,26 @@
allows threads to be debugged.
There are two modules in the Python Standard Library that provide thread
-functionality, \ulink{\module{thread}}{http://docs.python.org/lib/module-thread.html} and \ulink{\module{threading}}{http://docs.python.org/lib/module-threading.html}. Threads created with the \module{threading} module can be traced by
-calling \code{threading.settrace()}.
+functionality, \ulink{\module{thread}}{http://docs.python.org/lib/module-thread.html} and \ulink{\module{threading}}{http://docs.python.org/lib/module-threading.html}.
+There are three types of threads in Python,
+
+- threads created with the \module{thread} module. These are low-level threads.
+
+- threads created with the \module{threading} module. These are high-level
+ threads.
+
+- the \class{_MainThread}. This thread is the main thread, it is created
+when the Python interpreter is started. All signals go to this thread.
+
+
+The global trace function for threads created with the \module{threading}
+module can be set by calling \function{threading.settrace()}. We can use
+this method to inspect the frame objects on the stack, for example, to find
+the filename and line number of the code object currently executing. See
+\ulink{The Python Library Reference}
+{http://docs.python.org/lib/debugger-hooks.html} for documentation on how
+debuggers use the global trace function.
+
\subsection{\module{mpdb}'s Thread Debugging Facilities}
@@ -1039,8 +1066,7 @@
on whilst inside the debugger by issuing the command \code{set thread}.
\emph{All the information below can be considered the internals of how mpdb
-works. In other words, this information doesn't provide useful information
-for someone simply wishing to use the debugger for thread debugging.}
+works. It is a programmer's reference.}
When thread debugging is activated in \module{mpdb}, it imports the
\module{mthread} module and calls that modules \function{init()} function,
@@ -1052,17 +1078,30 @@
documentation in the Python Standard Library for more information on how
debuggers interact with trace functions.
-An \class{MTracer} object provides three methods for dealing with the different
-possible events in that occur in the frame object.
-
+An \class{MTracer} object provides methods for dealing with the different
+possible events that occur in the frame object \ref{mtracer-objects}.
+The main debugger is responsible for debugging the \class{_MainThread} object
+and the \class{MTracer} objects are responsible for debugging all other
+threads. When a thread needs to stop execution, it acquires the main debugger's
+\member{lock} variable, which is a \class{Lock} object. The \function{acquire}
+call will block until the lock is obtained. When this lock is obtained
+execution transfers to the thread the \class{MTracer} is running in. This
+means that this thread is now the current thread.
\section{Remote Debugging}
-This section describes how \module{mpdb} handles debugging remotely.
\label{remote-debug}
+This section describes how \module{mpdb} handles debugging remotely.
+
+Remote debugging in \module{mpdb} takes place between a \emph{pdbserver}
+and a client. The pdbserver is analogous to GDB's gdbserver. A pdbserver
+is run on the same machine as the program being debugged. This pdbserver
+then allows clients to connect to it, which begins a debugging session. In
+this session all commands come from the client and are executed on the
+pdbserver.
\section{External Process Debugging}
+\label{proc-debug}
This section describes how \module{mpdb} debugs processes that are external
to the process in which \module{mpdb} is being run.
-\label{proc-debug}
\ No newline at end of file
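The documentation added above leans on threading.settrace(), which installs a
trace function that every threading.Thread picks up when its run() method
starts. A minimal standalone example of that hook (not mpdb code):

    import threading

    def trace(frame, event, arg):
        # Report which thread is executing what, then keep tracing this frame.
        print('%s %s:%d %s' % (threading.currentThread().getName(),
                               frame.f_code.co_filename, frame.f_lineno, event))
        return trace

    threading.settrace(trace)       # affects threads started after this call

    def work():
        total = 0
        for i in range(3):
            total += i

    t = threading.Thread(target=work)
    t.start()
    t.join()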
Modified: sandbox/trunk/pdb/README.txt
==============================================================================
--- sandbox/trunk/pdb/README.txt (original)
+++ sandbox/trunk/pdb/README.txt Fri Jul 7 00:27:49 2006
@@ -54,4 +54,8 @@
`run 2' does not pass the argument 2 to the script.
* Allow thread debugging to be turned on with a command line switch.
* Allow reading commands from .mpdbrc file
-* Changed the name of the default history file from ~/.pydbhist to ~/.mpdbhist
\ No newline at end of file
+* Changed the name of the default history file from ~/.pydbhist to ~/.mpdbhist
+* do_return inherited from pydb.gdb.Gdb doesn't use self.stdin for reading
+ input from the user and doesn't work remotely.
+* pdbserver using a TCP connection uses setsockopt() REUSEADDR by default.
+ Need some way to make this configurable. `set reuseaddr' ?
\ No newline at end of file
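The README item above and the mconnection.py change below both concern
SO_REUSEADDR, which lets the pdbserver rebind its port immediately even if an
earlier connection is still lingering in TIME_WAIT. A generic sketch of that
option on a listening socket (plain socket code, not the MConnectionServerTCP
implementation; host and port are arbitrary):

    import socket

    def make_server_socket(host, port, reuseaddr=True):
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        if reuseaddr:
            # Allow rebinding while old connections linger in TIME_WAIT.
            s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        s.bind((host, port))
        s.listen(1)
        return s

    srv = make_server_socket('127.0.0.1', 8000)
    srv.close()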
Modified: sandbox/trunk/pdb/mconnection.py
==============================================================================
--- sandbox/trunk/pdb/mconnection.py (original)
+++ sandbox/trunk/pdb/mconnection.py Fri Jul 7 00:27:49 2006
@@ -100,7 +100,7 @@
self._sock = self.output = self.input = None
MConnectionInterface.__init__(self)
- def connect(self, addr, reuseaddr=False):
+ def connect(self, addr, reuseaddr=True):
"""Set to allow a connection from a client. 'addr' specifies
the hostname and port combination of the server.
"""
Modified: sandbox/trunk/pdb/mpdb.py
==============================================================================
--- sandbox/trunk/pdb/mpdb.py (original)
+++ sandbox/trunk/pdb/mpdb.py Fri Jul 7 00:27:49 2006
@@ -18,10 +18,8 @@
from optparse import OptionParser
import pydb
from pydb.gdb import Restart
-import socket
import sys
import time
-import threading
import traceback
__all__ = ["MPdb", "pdbserver", "target"]
@@ -407,6 +405,26 @@
except:
break
+def thread_debugging(m):
+ """ Setup this debugger to handle threaded applications."""
+ import mthread
+ mthread.init(m)
+ m.user_line = mthread.user_line
+ m.user_call = mthread.user_call
+ m.user_exception = mthread.user_exception
+ m.user_return = mthread.user_return
+ m._program_sys_argv = list(m._sys_argv[2:])
+ m.mainpyfile = m._program_sys_argv[1]
+ while True:
+ try:
+ m._runscript(m.mainpyfile)
+ if m._user_requested_quit: break
+ except Restart:
+ m.msg('Restarting')
+ except:
+ m.msg(traceback.format_exc())
+ break
+
def main():
""" Main entry point to this module. """
opts, args = parse_opts()
@@ -432,7 +450,9 @@
elif opts.pdbserver:
pdbserver(opts.pdbserver, mpdb)
sys.exit()
-
+ elif opts.debug_thread:
+ thread_debugging(mpdb)
+ sys.exit()
while 1:
try:
mpdb._runscript(mainpyfile)
@@ -477,6 +497,8 @@
help="Start the debugger and execute the pdbserver " \
+ "command. The arguments should be of the form," \
+ " 'protocol address scriptname'.")
+ parser.add_option("-d", "--debug-thread", action="store_true",
+ help="Turn on thread debugging.")
(options, args) = parser.parse_args()
return (options, args)
Modified: sandbox/trunk/pdb/mthread.py
==============================================================================
--- sandbox/trunk/pdb/mthread.py (original)
+++ sandbox/trunk/pdb/mthread.py Fri Jul 7 00:27:49 2006
@@ -10,8 +10,7 @@
from pydb.gdb import Gdb
# Global variables
-tracers = [] # The MTracer objects that are tracing threads
-threads = [] # List of threads we're tracing
+tt_dict = {} # Thread,tracer dictionary
# This global variable keeps track of the 'main' debugger which, when one
# of the MTracer objects encounters a breakpoint in its thread, is used to
@@ -26,73 +25,37 @@
a thread's run() method.
"""
def __init__(self):
- Gdb.__init__(self)
+ Gdb.__init__(self, stdout=_main_debugger.stdout)
self.thread = threading.currentThread()
- # Each tracer instance must keep track of its own breakpoints
- self.fncache = {}
- self.lineno = 0
- self.curframe = None
- self.stopframe = None
- self.botframe = None
- self.quitting = False
-
- def trace_dispatch(self, frame, event, arg):
- self.currentframe = frame
- if event == 'line':
- return self.dispatch_line
- if event == 'call':
- self.msg('%s *** call' % self.thread.getName())
- return self.trace_dispatch
- if event == 'return':
- self.msg('%s *** return' % self.thread.getName())
- return self.trace_dispatch
- if event == 'exception':
- self.msg('%s *** exception' % self.thread.getName())
- return self.trace_dispatch
- if event == 'c_call':
- print '*** c_call'
- return self.trace_dispatch
- if event == 'c_exception':
- print '*** c_exception'
- return self.trace_dispatch
- if event == 'c_return':
- print '*** c_return'
- return self.trace_dispatch
- print 'bdb.Bdb.dispatch: unknown debugging event:', repr(event)
- return self.trace_dispatch
-
- def dispatch_line(self, frame, arg):
- if self.stop_here(frame) or self.break_here(frame):
- # Signal to the main debugger that we've hit a breakpoint
- print 'bang'
- _main_debugger.user_line(frame)
- if self.quitting: raise BdbQuit
- return self.trace_dispatch
-
- def dispatch_call(self, frame, arg):
- # XXX 'arg' is no longer used
- if self.botframe is None:
- # First call of dispatch since reset()
- self.botframe = frame.f_back # (CT) Note that this may also be None!
- return self.trace_dispatch
- if not (self.stop_here(frame) or self.break_anywhere(frame)):
- # No need to trace this function
- return # None
- _main_debugger.user_call(frame, arg)
- if self.quitting: raise BdbQuit
- return self.trace_dispatch
-
- def dispatch_return(self, frame, arg):
- if self.stop_here(frame) or frame == self.returnframe:
- _main_debugger.user_return(frame, arg)
- if self.quitting: raise BdbQuit
- return self.trace_dispatch
-
- def dispatch_exception(self, frame, arg):
- if self.stop_here(frame):
- _main_debugger.user_exception(frame, arg)
- if self.quitting: raise BdbQuit
- return self.trace_dispatch
+ self.prompt = '(MPdb: %s)' % self.thread.getName()
+ self.running = True
+ self.reset()
+
+ def user_line(self, frame):
+ """ Override this method from pydb.pydbbdb.Bdb to make
+ it thread-safe.
+ """
+ _main_debugger.lock.acquire()
+ Gdb.user_line(self, frame)
+ _main_debugger.lock.release()
+
+ def user_call(self, frame, args):
+ """ Override pydb.pydbbdb.Bdb.user_call and make it
+ thread-safe.
+ """
+ _main_debugger.lock.acquire()
+ Gdb.user_call(self, frame, args)
+ _main_debugger.lock.release()
+
+ def user_exception(self, frame, (type, value, traceback)):
+ """ Override this method from pydb.pydbbdb.Bdb to make
+ it thread-safe.
+ """
+ _main_debugger.lock.acquire()
+ Gdb.user_exception(self, frame, (type, value,
+ traceback))
+ _main_debugger.lock.release()
+
def trace_dispatch_init(frame, event, arg):
""" This method is called by a sys.settrace when a thread is running
@@ -100,13 +63,14 @@
set this thread's tracing function to that object's trace_dispatch
method.
"""
- global threads, tracers
- threads.append(threading.currentThread())
- t = MTracer()
-
- tracers.append(t)
- threading.settrace(t.trace_dispatch)
- sys.settrace(t.trace_dispatch)
+ global tt_dict
+ tr = MTracer()
+ th = threading.currentThread()
+
+ tt_dict[th] = tr
+
+ threading.settrace(tr.trace_dispatch)
+ sys.settrace(tr.trace_dispatch)
def init(debugger):
@@ -117,11 +81,39 @@
interpreter.
"""
global _main_debugger
- _main_debugger = debugger
- threading.settrace(trace_dispatch_init)
+ if _main_debugger == None:
+ _main_debugger = debugger
+
+ # This lock must be acquired when a MTracer object
+ # places a call to _main_debugger.user_*
+ _main_debugger.lock = threading.Lock()
+ threading.settrace(trace_dispatch_init)
-
+# All the methods below override the methods from MPdb so
+# that they are thread-safe. Every thread must contend for
+# the Lock object on the MPdb instance.
+
+def user_line(frame):
+ _main_debugger.lock.acquire()
+ Gdb.user_line(_main_debugger, frame)
+ _main_debugger.lock.release()
+
+def user_call(frame, arg):
+ _main_debugger.lock.acquire()
+ Gdb.user_call(_main_debugger, frame, arg)
+ _main_debugger.lock.release()
+
+def user_return(frame, return_value):
+ _main_debugger.lock.acquire()
+ Gdb.user_return(_main_debugger, frame, return_value)
+ _main_debugger.lock.release()
+
+def user_exception(frame, (exc_type, exc_value, exc_traceback)):
+ _main_debugger.lock.acquire()
+ Gdb.user_exception(_main_debugger, frame, (exc_type, exc_value,
+ exc_traceback))
+ _main_debugger.lock.release()
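Each of these module-level wrappers, like the MTracer methods above them,
funnels every stop through the main debugger's single Lock, so only one thread
at a time reports to it. The pattern with the pydb machinery stripped away
(Reporter is a stand-in for the main debugger, not mpdb code):

    import threading

    class Reporter(object):
        def __init__(self):
            self.lock = threading.Lock()
        def user_line(self, name, lineno):
            print('%s stopped at line %d' % (name, lineno))

    _main = Reporter()

    def user_line(name, lineno):
        # Only one thread at a time may talk to the main debugger.
        _main.lock.acquire()
        try:
            _main.user_line(name, lineno)
        finally:
            _main.lock.release()

    threads = [threading.Thread(target=user_line, args=('t%d' % i, i))
               for i in range(3)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()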
Modified: sandbox/trunk/pdb/test/test_mconnection.py
==============================================================================
--- sandbox/trunk/pdb/test/test_mconnection.py (original)
+++ sandbox/trunk/pdb/test/test_mconnection.py Fri Jul 7 00:27:49 2006
@@ -42,6 +42,8 @@
thread.start_new_thread(repeatedConnect, (self.client, __addr__))
self.server.connect(__addr__)
+ self.server.disconnect()
+
def testClientConnectAndRead(self):
"""(tcp) Connect to server and read/write. """
thread.start_new_thread(repeatedConnect, (self.client,__addr__))
@@ -70,6 +72,8 @@
line = self.server.readline()
self.assertEquals('good\n', line, 'Could not read first line.')
+ self.server.disconnect()
+
def testErrorAddressAlreadyInUse(self):
"""(tcp) Test address already in use error. """
thread.start_new_thread(repeatedConnect, (self.client, __addr__))
Modified: sandbox/trunk/pdb/test/test_mpdb.py
==============================================================================
--- sandbox/trunk/pdb/test/test_mpdb.py (original)
+++ sandbox/trunk/pdb/test/test_mpdb.py Fri Jul 7 00:27:49 2006
@@ -93,7 +93,7 @@
def testTarget(self):
""" Test the target command. """
server = MConnectionServerTCP()
- thread.start_new_thread(server.connect, (__addr__,))
+ thread.start_new_thread(server.connect, (__addr__,True))
self.client1 = MPdbTest()
connect_to_target(self.client1)
@@ -168,6 +168,7 @@
server.target = 'remote'
self.client1.onecmd('restart')
self.client1.connection.write('rquit\n')
+ server.disconnect()
for i in range(MAXTRIES):
if server.connection != None: pass
Modified: sandbox/trunk/pdb/test/test_mthread.py
==============================================================================
--- sandbox/trunk/pdb/test/test_mthread.py (original)
+++ sandbox/trunk/pdb/test/test_mthread.py Fri Jul 7 00:27:49 2006
@@ -7,14 +7,13 @@
from test import test_support
sys.path.append('..')
-import mthread
+import mthread, mpdb
class TestThreadDebugging(unittest.TestCase):
def testMthreadInit(self):
""" Test the init method of the mthread file. """
- m = sys.stdout.write
- e = sys.stderr.write
- mthread.init(m, e)
+ m = mpdb.MPdb()
+ mthread.init(m)
def test_main():
test_support.run_unittest(TestThreadDebugging)
Modified: sandbox/trunk/pdb/test/thread_script.py
==============================================================================
--- sandbox/trunk/pdb/test/thread_script.py (original)
+++ sandbox/trunk/pdb/test/thread_script.py Fri Jul 7 00:27:49 2006
@@ -5,14 +5,19 @@
import threading
+def foo():
+ l = [i for i in range(10)]
+ for n in l:
+ print l[n+1]
+
class MyThread(threading.Thread):
def run(self):
- for i in range(10):
- print i
-
+ foo()
+
def func():
- t = MyThread()
- t.start()
+ for i in range(2):
+ t = MyThread()
+ t.start()
t.join()
if __name__ == '__main__':
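The rewritten foo() is broken on purpose: l has indices 0 through 9, so on the
last iteration l[n + 1] reaches index 10 and each worker thread raises
IndexError for the debugger to observe. Outside the threads, the failure
reduces to:

    def foo():
        l = [i for i in range(10)]
        for n in l:
            print(l[n + 1])     # fails when n == 9: index 10 is out of range

    try:
        foo()
    except IndexError:
        print('raised as intended')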
From python-checkins at python.org Fri Jul 7 03:38:43 2006
From: python-checkins at python.org (brett.cannon)
Date: Fri, 7 Jul 2006 03:38:43 +0200 (CEST)
Subject: [Python-checkins] r47284 -
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Message-ID: <20060707013843.2AE911E4005@bag.python.org>
Author: brett.cannon
Date: Fri Jul 7 03:38:41 2006
New Revision: 47284
Modified:
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Log:
Second draft:
* Renamed "trusted" and "untrusted" to "unprotected" and "sandboxed"
* Added a "Threat Model" section
* Clarified the discussion on resource hiding and crippling
* Made API prefix be PySandbox
* Made some arguments to the API concrete types
* Added a todo list of what still needs to be considered
* probably other stuff I can't remember
Modified: python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
==============================================================================
--- python/branches/bcannon-sandboxing/sandboxing_design_doc.txt (original)
+++ python/branches/bcannon-sandboxing/sandboxing_design_doc.txt Fri Jul 7 03:38:41 2006
@@ -5,40 +5,75 @@
=============================
This document is meant to lay out the general design for re-introducing a
-restriced execution model for Python. This document should provide one with
-enough information to understand the goals for restricted execution, what
+sandboxing model for Python. This document should provide one with
+enough information to understand the goals for sandboxing, what
considerations were made for the design, and the actual design itself. Design
decisions should be clear and explain not only why they were chosen but
possible drawbacks from taking that approach.
+If any of the above is found not to be true, please email me at
+brett at python.org and let me know what problems you are having with the
+document.
+
+
+XXX TO DO
+=============================
+
+* Use a callback for paths?
+* threading needs protection?
+* python-dev convince me that hiding 'file' possible?
+ + based on that, handle code objects
+ + also decide how to handle sockets
+* resolve to IP at call time to prevent DNS man-in-the-middle attacks when
+ allowing a specific host name?
+* what network info functions are allowed by default?
+* does the object.__subclasses__() trick work across interpreters, or is it
+ unique per interpreter?
+* don't use abstract types in API spec
+* in the Threat Model, reference the section of the implementation that
+* addresses that concern
+* PySandbox_*Extended*()
+* figure out default whitelist of extension modules
+* check default accessible objects for file path exposure
+* helper functions to get at StringIO instances for stdin, stdout, and friends?
+* decide on what type of objects (e.g., PyStringObject or const char *) are to
+ be passed into PySandbox_*Extended*() functions
+
Goal
=============================
-A good restricted execution model provides enough protection to prevent
+A good sandboxing model provides enough protection to prevent
malicious harm to come to the system, and no more. Barriers should be
minimized so as to allow most code that does not do anything that would be
-regarded as harmful to run unmodified.
+regarded as harmful to run unmodified. But the protections need to be thorough
+enough to prevent any unintended changes or information of the system to come
+about.
An important point to take into consideration when reading this document is to
realize it is part of my (Brett Cannon's) Ph.D. dissertation. This means it is
-heavily geared toward the restricted execution when the interpreter is working
-with Python code embedded in a web page. While great strides have been taken
+heavily geared toward sandboxing when the interpreter is working
+with Python code embedded in a web page as viewed in Firefox. While great strides have been taken
to keep the design general enough so as to allow all previous uses of the
'rexec' module [#rexec]_ to be able to use the new design, it is not the
focused goal. This means if a design decision must be made for the embedded
-use case compared to sandboxing Python code in a Python application, the former
-will win out.
+use case compared to sandboxing Python code in a pure Python application, the former
+will win out over the latter.
-Throughout this document, the term "resource" is to represent anything that
+Throughout this document, the term "resource" is used to represent anything that
deserves possible protection. This includes things that have a physical
representation (e.g., memory) to things that are more abstract and specific to
the interpreter (e.g., sys.path).
-When referring to the state of an interpreter, it is either "trusted" or
-"untrusted". A trusted interpreter has no restrictions imposed upon any
-resource. An untrusted interpreter has at least one, possibly more, resource
-with a restriction placed upon it.
+Throughout this document, the term "sandboxing" will be used. It can also be
+substituted to mean "restricted execution" if one prefers.
+
+When referring to the state of an interpreter, it is either "unprotected" or
+"sandboxed". An unprotected interpreter has no restrictions imposed upon any
+resource. A sandboxed interpreter has at least one, possibly more, resources
+with restrictions placed upon them to prevent unsafe code that is running
+within the interpreter from causing harm to the system (the interpreter
+itself is never considered unsafe).
.. contents::
@@ -47,71 +82,74 @@
Use Cases
/////////////////////////////
-All use cases are based on how many untrusted or trusted interpreters are
-running in a single process.
+All use cases are based on how many sandboxed interpreters are
+running in a single process and whether an unprotected interpreter is also running
+or not. They can be broken down into two categories: when the interpreter is
+embedded and only using sandboxed interpreters, and when pure Python code is
+running in an unprotected interpreter and uses sandboxed interpreters.
When the Interpreter Is Embedded
================================
-Single Untrusted Interpreter
+Single Sandboxed Interpreter
----------------------------
This use case is when an application embeds the interpreter and never has more
-than one interpreter running.
-
-The main security issue to watch out for is not having default abilities be
-provided to the interpreter by accident. There must also be protection from
-leaking resources that the interpreter needs for general use underneath the
-covers into the untrusted interpreter.
+than one interpreter running which happens to be sandboxed.
-Multiple Untrusted Interpreters
+Multiple Sandboxed Interpreters
-------------------------------
-When multiple interpreters, all untrusted at varying levels, need to be running
+When multiple interpreters, all sandboxed at varying levels, need to be running
within a single application. This is the key use case that this proposed
design is targeted for.
-On top of the security issues from a single untrusted interpreter, there is one
-additional worry. Resources cannot end up being leaked into other interpreters
-where they are given escalated rights.
-
Stand-Alone Python
-==================
+=============================
When someone has written a Python program that wants to execute Python code in
-an untrusted interpreter(s). This is the use case that 'rexec' attempted to
+a sandboxed interpreter(s). This is the use case that 'rexec' attempted to
fulfill.
-The added security issues for this use case (on top of the ones for the other
-use cases) is preventing something from the trusted interpreter leaking into an
-untrusted interpreter and having elevated permissions. With the multiple
-untrusted interpreters one did not have to worry about preventing actions from
-occurring that are disallowed for all untrusted interpreters. With this use
-case you do have to worry about the binary distinction between trusted and
-untrusted interpreters running in the same process.
+
+Issues to Consider
+=============================
+
+Common to all use cases, resources that the interpreter requires at a level
+below user code to be used unhindered cannot be exposed to a sandboxed
+interpreter. For instance, the interpreter might need to stat a file to see if
+it is possible to import. If the ability to stat a file is not allowed to a
+sandboxed interpreter, it should not be allowed to perform that action,
+regardless of whether the interpreter at a level below user code needs that
+ability.
+
+When multiple interpreters are involved (sandboxed or not), not allowing an interpreter
+to gain access to resources available in other interpreters without explicit
+permission must be enforced. It would be a security violation, for instance,
+if a sandboxed interpreter manages to gain access to an unprotected instance of
+the 'file' object from an unprotected interpreter without being given that object.
Resources to Protect
/////////////////////////////
-XXX Threading?
-XXX CPU?
+It is important to make sure that the proper resources are protected from a
+sandboxed interpreter. If you don't, there is no point to sandboxing.
Filesystem
===================
-The most obvious facet of a filesystem to protect is reading from it. One does
-not want what is stored in ``/etc/passwd`` to get out. And one also does not
-want writing to the disk unless explicitly allowed for basically the same
-reason; if someone can write ``/etc/passwd`` then they can set the password for
-the root account.
-
-But one must also protect information about the filesystem. This includes both
-the filesystem layout and permissions on files. This means pathnames need to
-be properly hidden from an untrusted interpreter.
+All facets of the filesystem must be protected. This means restricting
+reading and writing to the filesystem (e.g., files, directories, etc.). It
+should be allowed in controlled situations where allowing access to the
+filesystem is desirable, but that should be an explicit allowance.
+
+There must also be protection to prevent revealing any information about the
+filesystem. Disclosing information on the filesystem could allow one to infer
+what OS the interpreter is running on, for instance.
Physical Resources
@@ -120,33 +158,30 @@
Memory should be protected. It is a limited resource on the system that can
have an impact on other running programs if it is exhausted. Being able to
restrict the use of memory would help alleviate issues from denial-of-service
-(DoS) attacks.
+(DoS) attacks on the system.
Networking
===================
Networking is somewhat like the filesystem in terms of wanting similar
-protections. You do not want to let untrusted code make tons of socket
-connections or accept them to do possibly nefarious things (e.g., acting as a
-zombie).
-
+protections. You do not want to let unsafe code make socket
+connections unhindered or accept them to do possibly nefarious things.
You also want to prevent finding out information about the network you are
-connected to. This includes doing DNS resolution since that allows one to find
-out what addresses your intranet has or what subnets you use.
+connected to.
Interpreter
===================
-One must make sure that the interpreter is not harmed in any way. There are
-several ways to possibly do this. One is generating hostile bytecode. Another
-is some buffer overflow. In general any ability to crash the interpreter is
-unacceptable.
-
-There is also the issue of taking it over. If one is able to gain control of
-the overall process through the interpreter then heightened abilities could be
-gained.
+One must make sure that the interpreter is not harmed in any way by sandboxed
+code. Harm usually takes the form of crashing the program that the interpreter
+is embedded in or the unprotected interpreter that started the sandboxed
+interpreter. Executing hostile bytecode that leads to undesirable effects is
+another possible issue.
+
+There is also the issue of taking the interpreter over. Gaining escalated
+privileges in any way without explicit permission is a problem.
Types of Security
@@ -160,74 +195,64 @@
Resource Hiding
=============================
-By never giving code a chance to access a resource, you prevent it from be
-(ab)used. This is the idea behind resource hiding. This can help minimize
-security checks by only checking if someone should be given a resource. By
-having possession of a resource be what determines if one should be allowed to
-use it you minimize the checks to only when a resource is handed out.
-
-This can be viewed as a passive system for security. Once a resource has been
-given to code there are no more checks to make sure the security model is being
-violated.
+By never giving code a chance to access a resource, you prevent it from being
+(ab)used. This is the idea behind resource hiding; you can't misuse something
+you don't have in the first place.
The most common implementation of resource hiding is capabilities. In this
type of system a resource's reference acts as a ticket that represents the right
to use the resource. Once code has a reference it is considered to have full
-use of that resource it represents and no further security checks are
-performed.
-
-To allow customizable restrictions one can pass references to wrappers of
-resources. This allows one to provide custom security to resources instead of
-requiring an all-or-nothing approach.
-
-The problem with capabilities is that it requires a way to control access to
-references. In languages such as Java that use a capability-based security
-system, namespaces provide the protection. By having private attributes and
-compartmentalized namespaces, references cannot be reached without explicit
-permission.
-
-For instance, Java has a ClassLoader class that one can call to have return a
-reference that is desired. The class does a security check to make sure the
-code should be allowed to access the resource, and then returns a reference as
-appropriate. And with private attributes in objects and packages not providing
-global attributes you can effectively hide references to prevent security
-breaches.
-
-To use an analogy, imagine you are providing security for your home. With
-capabilities, security came from not having any way to know where your house is
-without being told where it was; a reference to its location. You might be
-able to ask a guard (e.g., Java's ClassLoader) for a map, but if they refuse
-there is no way for you to guess its location without being told. But once you
-knew where it was, you had complete use of the house.
-
-And that complete access is an issue with a capability system. If someone
-played a little loose with a reference for a resource then you run the risk of
-it getting out. Once a reference leaves your hands it becomes difficult to
-revoke the right to use that resource. A capability system can be designed to
-do a check every time a reference is handed to a new object, but that can be
-difficult to do properly when grafting a new way to handle resources on to an
-existing system such as Python since the check is no longer at a point for
-requesting a reference but also at plain assignment time.
+use of the resource that reference represents and no further security checks
+are directly performed (using delegates and other structured approaches one
+can actually perform a security check on each access of a resource, but this
+is not the default behaviour).
+
+As an example, consider the 'file' type as a resource we want to protect. That
+means we do not want a reference to the 'file' type to ever be accessible
+without explicit permission. If one wanted to provide read-only access to a
+temp file, open() could perform a check on the permissions of the current
+interpreter and, if allowed, return a proxy object for the file that only
+allows reading from it. The 'file' instance behind the proxy would need to be
+properly hidden so that the reference was not reachable from the outside and
+'file' access could still be controlled.
+
+Python, as it stands now, unfortunately does not work well for a pure
+capabilities system. Capabilities require the prohibition of certain
+abilities, such as "direct access to another's private state"
+[#paradigm regained]_. This obviously is not possible in Python since, at
+least at the Python level, there is no such thing as persistent private state
+(one could argue that local variables that are not cell variables for lexical
+scopes are private, but since they do not survive after a function call they
+are not usable for keeping persistent state). One can hide a reference at the
+C level by storing it in the struct for the instance of a type and not
+providing a function to access that attribute.
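
A minimal sketch of that C-level hiding technique might look like the
following; the type and field names are purely illustrative and not part of
any proposed API::

    #include "Python.h"

    /* The type keeps its reference to a protected resource in its C struct
       and exposes no member or getset entry for it, so Python code has no
       direct way to reach the reference. */
    typedef struct {
        PyObject_HEAD
        PyObject *hidden_resource;
    } ProxyObject;

    static PyObject *
    proxy_read(ProxyObject *self, PyObject *args)
    {
        /* The hidden reference is only used internally; just the result of
           the operation is handed back to Python code. */
        return PyObject_CallMethod(self->hidden_resource, "read", NULL);
    }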
+
+Python's introspection abilities also do not make implementing capabilities
+any easier. Consider accessing 'file' even after it has been deleted from
+__builtin__: you can still get to the reference for 'file' through the
+sequence returned by ``object.__subclasses__()``.
Resource Crippling
=============================
-Another approach to security is to provide constant, proactive security
-checking of rights to use a resource. One can have a resource perform a
+Another approach to security is to not worry about controlling access to the
+reference of a resource.
+One can have a resource perform a
security check every time someone tries to use a method on that resource. This
pushes the security check to a lower level; from a reference level to the
method level.
By performing the security check every time a resource's method is called the
-worry of a resource's reference leaking out to insecure code is alleviated
-since the resource cannot be used without authorizing it regardless of whether
-even having the reference was granted. This does add extra overhead, though,
-by having to do so many security checks.
-
-FreeBSD's jail system provides a system similar to this. Various system calls
-allow for basic usage, but knowing of the system call is not enough to grant
-usage. Every call of a system call requires checking that the proper rights
+worry of a specific resource's reference leaking out to insecure code is
+alleviated since the resource cannot be used without authorization on every
+method call. This does add extra overhead, though, by having to perform so
+many security checks. It also does not handle the situation where a type that
+has not been properly crippled is unexpectedly exposed.
+
+FreeBSD's jail system provides a protection scheme similar to this. Various
+system calls allow for basic usage, but knowing of or having access to a
+system call is not enough to grant usage. Every call to a system call
+requires checking that the proper rights
have been granted to the user in order to allow the system call to perform
its action.
@@ -236,24 +261,54 @@
perform uses with the one IP address that is granted is prevented. The check
is performed at every call involving the one granted IP address.
-Using our home analogy, everyone in the world can know where your home is. But
-to access any door in your home, you have to pass a security check. The
-overhead is higher and slows down your movement in your home, but not caring if
-perfect strangers know where your home is prevents the worry of your address
-leaking out to the world.
+Using 'file' as the example again, one could cripple the type so that it
+cannot be instantiated from Python. One could also provide a permission check
+on each call to an unsafe method and thus allow the type to be used in normal
+situations (such as type checking), while still being safe in the knowledge
+that illegal operations cannot be performed.
+Regardless of which approach you take, you do not need to worry about a
+reference to the type being exposed unexpectedly, since the security check is
+not on the reference but on the actual method calls.
+
+
+Comparison of the Two Approaches
+================================
+
+From the perspective of Python, the two approaches differ on what would be the
+most difficult thing to analyze from a security standpoint: finding all of the
+ways to gain access to various types from a sandboxed interpreter with no
+imports, or finding all of the types that can lead to possibly dangerous
+actions and thus need to be crippled.
+
+Some Python developers, such as Armin Rigo, feel that truly hiding objects in
+Python is "quite hard" [#armin-hiding]_. This sentiment means that making a
+pure capabilities system in Python that is secure is not possible as people
+would continue to find new ways to get a hold of the reference to a protected
+resource.
+
+Others feel that by not going the capabilities route we will be constantly
+chasing down new types that require crippling. The thinking is that if we
+cannot control the references for 'file', how are we to know what other types
+might become exposed later on and thus require more crippling?
+
+It essentially comes down to what is harder to do: finding all the ways to
+access the types in Python in a sandboxed interpreter with no imported
+modules, or going through the Python code base and finding all the types that
+should be crippled?
The 'rexec' Module
///////////////////////////////////////
-The 'rexec' module [#rexec]_ was based on the design used by Safe-Tcl
-[#safe-tcl]_. The design was essentially a capability system. Safe-Tcl
+The 'rexec' module [#rexec]_ was the original attempt at providing a sandbox
+environment for Python code to run in. Its design was based on Safe-Tcl
+[#safe-tcl]_, which was essentially a capabilities system. Safe-Tcl
allowed you to launch a separate interpreter where its global functions were
specified at creation time. This prevented one from having any abilities that
were not explicitly provided.
For 'rexec', the Safe-Tcl model was tweaked to better match Python's situation.
-An RExec object represented a restricted environment. Imports were checked
+An RExec object represented a sandboxed environment. Imports were checked
against a whitelist of modules. You could also restrict the type of modules to
import based on whether they were Python source, bytecode, or C extensions.
Built-ins were allowed except for a blacklist of built-ins to not provide.
@@ -261,128 +316,131 @@
list.
With an RExec object created, one could pass in strings of code to be executed
-and have the result returned. One could execute code based on whether stdin,
-stdout, and stderr were provided or not.
+and have the result returned. One could restrict whether stdin,
+stdout, and stderr were provided or not on a per-RExec basis.
The ultimate undoing of the 'rexec' module was how access to objects that in
-normal Python require no direct action to reach was handled. Importing modules
+normal Python require no imports to reach was handled. Importing modules
requires a direct action, and thus can be protected against directly in the
import machinery. But for built-ins, they are accessible by default and
require no direct action to access in normal Python; you just use their name
since they are provided in all namespaces.
-For instance, in a restricted interpreter, one only had to do
+For instance, in a sandboxed interpreter, one only had to
``del __builtins__`` to gain access to the full set of built-ins. Another way
is through using the gc module:
``gc.get_referrers(''.__class__.__bases__[0])[6]['file']``. While both of
-these could be fixed (the former a bug in 'rexec' and the latter not allowing
-gc to be imported), they are examples of things that do not require proactive
-actions on the part of the programmer in normal Python to gain access to
-tends to leak out. An unfortunate side-effect of having all of that wonderful
+these could be fixed (the former was a bug in 'rexec' that was eventually
+fixed and the latter could be handled by not allowing 'gc' to be imported),
+they are examples of resources that can be reached in normal Python without
+any proactive action on the part of the programmer. This was an unfortunate
+side-effect of having all of that wonderful
reflection in Python.
There is also the issue that 'rexec' was written in Python which provides its
-own problems.
+own problems based on reflection and the ability to modify the code at
+run-time without security protection.
Much has been learned since 'rexec' was written about how Python tends to be
used and where security issues tend to appear. Essentially Python's dynamic
-nature does not lend itself very well to passive security measures since the
-reflection abilities in the language lend themselves to getting around
-non-proactive security checks.
+nature does not lend itself very well to a security implementation that does
+not require constant checking of permissions.
-The Proposed Approach
+Threat Model
///////////////////////////////////////
-In light of where 'rexec' succeeded and failed along with what is known about
-the two main types of security and how Python tends to operate, the following
-is a proposal on how to secure Python for restricted execution.
+Below is a list of what the security implementation should allow, prevent, or
+assume, along with the section of this document that addresses that part of
+the security model (if applicable). The term "bare", when applied to an
+interpreter, means an interpreter that has not performed a single import of a
+module. Also, all comments refer to a sandboxed interpreter unless otherwise
+explicitly stated.
+
+* The Python interpreter cannot be crashed by valid Python source code in a
+ bare interpreter.
+* Python source code is always considered safe.
+* Python bytecode is always considered dangerous.
+* C extension modules are inherently considered dangerous.
+ + Explicit trust of a C extension module is possible.
+* Sandboxed interpreters running in the same process inherently cannot communicate with
+ each other.
+ + Communication through C extension modules is possible.
+* Sandboxed interpreters running in the same process inherently cannot share
+ objects.
+ + Sharing objects through C extension modules is possible.
+* When starting a sandboxed interpreter, it starts with a fresh built-in and
+ global namespace that is not shared with the interpreter that started it.
+* I/O through stdin, stdout, and stderr is inherently not sent to the process'
+ own version of these file descriptors
+ + Should be aliased to safe replacements.
+ + Allowing use of the process' actual version should be possible.
+* Objects in the built-in namespace should be safe to use.
+ + Either hide the dangerous ones or cripple them so they can cause no harm.
+
+There are also some features that might be desirable, but are not being
+addressed by this security model.
+
+* Communication between an unprotected interpreter and a sandboxed interpreter
+ it created in any direction.
-First, security will be provided at the C level. By taking advantage of the
-language barrier of accessing C code from Python without explicit allowance
-(i.e., ignoring ctypes [#ctypes]_), direct manipulation of the various security
-checks can be substantially reduced and controlled.
-
-Second, all proactive actions that code can do to gain access to resources will
-be protected through resource hiding. By having to go through Python to get to
-something (e.g., modules), a security check can be put in place to deny access
-as appropriate (this also ties into the separation between interpreters,
-discussed below).
-
-Third, any resource that is usually accessible by default will use resource
-crippling. Instead of worrying about hiding a resource that is available by
-default (e.g., 'file' type), security checks within the resource will prevent
-misuse. Crippling can also be used for resources where an object could be
-desired, but not at its full capacity (e.g., sockets).
-
-Performance should not be too much of an issue for resource crippling. It's
-main use if for I/O types; files and sockets. Since operations on these types
-are I/O bound and not CPU bound, the overhead for doing the security check
-should be a wash overall.
-
-Fourth, the restrictions separating multiple interpreters within a single
-process will be utilized. This helps prevent the leaking of objects into
-different interpreters with escalated privileges. Python source code
-modules are reloaded for each interpreter, preventing an object that does not
-have resource crippling from being leaked into another interpreter unless
-explicitly allowed. C extension modules are shared by not reloading them
-between interpreters, but this is considered in the security design.
-
-Fifth, Python source code is always trusted. Damage to a system is considered
-to be done from either hostile bytecode or at the C level. Thus protecting the
-interpreter and extension modules is the great worry, not Python source code.
-Python bytecode files, on the other hand, are considered inherently unsafe and
-will never be imported directly.
-Attempts to perform an action that is not allowed by the security policy will
-raise an XXX exception (or subclass thereof) as appropriate.
+The Proposed Approach
+///////////////////////////////////////
+
+In light of where 'rexec' succeeded and failed along with what is known about
+the two main approaches to security and how Python tends to operate, the following
+is a proposal on how to secure Python for sandboxing.
Implementation Details
===============================
-XXX prefix/module name; Restrict, Secure, Sandbox? Different tense?
-XXX C APIs use abstract names (e.g., string, integer) since have not decided if
-Python objects or C types (e.g., PyStringObject vs. char *) will be used
-
-Support for untrusted interpreters will be a compilation flag. This allows the
-more common case of people not caring about protections to not have a
-performance hindrance when not desired. And even when Python is compiled for
-untrusted interpreter restrictions, when the running interpreter *is* trusted,
+Support for sandboxed interpreters will be a compilation option. This allows
+the more common case, where people do not care about protections, to avoid a
+performance hit. And even when Python is compiled with support for
+sandboxed interpreter restrictions, when the running interpreter *is*
+unprotected,
there will be no accidental triggers of protections. This means that
developers should be liberal with the security protections without worrying
about there being issues for interpreters that do not need/want the protection.
-At the Python level, the __restricted__ built-in will be set based on whether
-the interpreter is untrusted or not. This will be set for *all* interpreters,
-regardless of whether untrusted interpreter support was compiled in or not.
+At the Python level, the __sandboxed__ built-in will be set based on whether
+the interpreter is sandboxed or not. This will be set for *all* interpreters,
+regardless of whether sandboxed interpreter support was compiled in or not.
-For setting what is to be protected, the XXX for the
-untrusted interpreter must be passed in. This makes the protection very
+For setting what is to be protected, the PyThreadState for the
+sandboxed interpreter must be passed in. This makes the protection very
explicit and helps make sure you set protections for the exact interpreter you
-mean to.
+mean to. All functions that set protections begin with the prefix
+``PySandbox_Set*()``. These functions are meant to work only with sandboxed
+interpreters that have not yet been used to execute any Python code.
The functions for checking for permissions are actually macros that take
in at least an error return value for the function calling the macro. This
-allows the macro to return for the caller if the check failed and cause the XXX
+allows the macro to return for the caller if the check failed and cause the
+SandboxError
exception to be propagated. This helps eliminate any coding errors from
incorrectly checking a return value on a rights-checking function call. For
the rare case where this functionality is disliked, just make the check in a
utility function and check that function's return value (but this is strongly
-discouraged!).
+discouraged!). Functions that check that an operation is allowed implicitly
+operate on the currently running interpreter as returned by
+``PyInterpreter_Get()`` and are to be used by any code (the interpreter,
+extension modules, etc.) that needs to check for permission to execute.
API
--------------
-* interpreter PyXXX_NewInterpreter()
- Return a new interpreter that is considered untrusted. There is no
- corresponding PyXXX_EndInterpreter() as Py_EndInterpreter() will be taught
- how to handle untrusted interpreters.
+* PyThreadState* PySandbox_NewInterpreter()
+ Return a new interpreter that is considered sandboxed. There is no
+ corresponding ``PySandbox_EndInterpreter()`` as ``Py_EndInterpreter()`` will be taught
+ how to handle sandboxed interpreters. ``NULL`` is returned on error.
-* PyXXX_Trusted(error_return)
+* PySandbox_Allowed(error_return)
Macro that has the caller return with 'error_return' if the interpreter is
- not a trusted one.
+ sandboxed, otherwise do nothing.
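
As a rough sketch, an embedding application might create and configure a
sandboxed interpreter like this (the functions are the proposed ones described
in this document, not an existing CPython API)::

    #include "Python.h"

    static PyThreadState *
    start_sandbox(void)
    {
        /* Create the sandboxed interpreter; NULL signals an error. */
        PyThreadState *sandbox = PySandbox_NewInterpreter();
        if (sandbox == NULL)
            return NULL;
        /* Protections must now be set with the PySandbox_Set*() functions
           described in the sections below, before the new interpreter is
           used to execute any Python code. */
        return sandbox;
    }
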
Memory
@@ -391,13 +449,13 @@
Protection
--------------
-An memory cap will be allowed.
+A memory cap will be allowed.
Modification to pymalloc will be needed to properly keep track of the
allocation and freeing of memory. Same goes for the macros around the system
malloc/free system calls. This provides a platform-independent system for
-protection instead of relying on the operating system providing a service for
-capping memory usage of a process. Also allows the protection to be at the
+protection of memory instead of relying on the operating system to provide a service for
+capping memory usage of a process. It also allows the protection to be at the
interpreter level instead of at the process level.
@@ -420,29 +478,20 @@
API
--------------
-* int PyXXX_SetMemoryCap(interpreter, integer)
- Set the memory cap for an untrusted interpreter. If the interpreter is not
- running an untrusted interpreter, return NULL.
+* int PySandbox_SetMemoryCap(PyThreadState *, Py_ssize_t)
+ Set the memory cap for a sandboxed interpreter. If the interpreter is not
+ a sandboxed one, return a false value.
-* PyXXX_MemoryAlloc(integer, error_return)
+* PySandbox_AllowedMemoryAlloc(Py_ssize_t, error_return)
Macro to increase the amount of memory that is reported that the running
- untrusted interpreter is running. If the increase puts the total count
- passed the set limit, raise an XXX exception and cause the calling function
- to return with the value of error_return. For trusted interpreters or
- untrusted interpreters where a cap has not been set, the macro does
- nothing.
-
-* int PyXXX_MemoryFree(integer)
- Decrease the current running interpreter's allocated memory. If this puts
- the memory returned to below 0, raise an XXX exception and return NULL.
- For trusted interpreters or untrusted interpreters where there is no memory
- cap, the macro does nothing.
-
-
-CPU
-=============================
-XXX Needed? Difficult to get right for all platforms. Would have to be very
-platform-specific.
+ sandboxed interpreter is using. If the increase puts the total count
+ past the set limit, raise a SandboxError exception and cause the calling
+ function to return with the value of error_return, otherwise do nothing.
+
+* PySandbox_AllowedMemoryFree(Py_ssize_t, error_return)
+ Macro to decrease the current running interpreter's allocated memory. If
+ this puts the memory used below 0, raise a SandboxError exception and
+ return error_return, otherwise do nothing.
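
A sketch of how an extension module's allocation path might charge memory
against the cap, assuming the proposed macro above (the helper function itself
is hypothetical)::

    #include <stdlib.h>
    #include "Python.h"

    /* Allocate a buffer on behalf of the running sandboxed interpreter,
       accounting for it against the interpreter's memory cap first. */
    static void *
    sandbox_tracked_malloc(Py_ssize_t nbytes)
    {
        /* If the increase would push the interpreter past its cap, the
           proposed macro sets SandboxError and makes this function return
           NULL right here. */
        PySandbox_AllowedMemoryAlloc(nbytes, NULL);
        return malloc((size_t)nbytes);
    }

Setting the cap itself would be a single call by the embedding application,
e.g. ``PySandbox_SetMemoryCap(sandbox, 16 * 1024 * 1024)`` for a 16 MB cap.
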
Reading/Writing Files
@@ -451,19 +500,7 @@
Protection
--------------
-The 'file' type will be resource crippled. The user may specify files or
-directories that are acceptable to be opened for reading/writing, or both.
-
-All operations that either read, write, or provide info on a file will require
-a security check to make sure that it is allowed for the file that the 'file'
-object represents. This includes the 'file' type's constructor not raising an
-IOError stating a file does not exist but XXX instead so that information about
-the filesystem is not improperly provided.
-
-The security check will be done for all 'file' objects regardless of where the
-'file' object originated. This prevents issues if the 'file' type or an
-instance of it was accidentally made available to an untrusted interpreter.
-
+XXX
Why
--------------
@@ -476,28 +513,21 @@
Possible Security Flaws
-----------------------
-Assuming that the method-level checks are correct and control of what
-files/directories is not exposed, 'file' object protection is secure, even when
-a 'file' object is leaked from a trusted interpreter to an untrusted one.
+XXX
API
--------------
-* int PyXXX_AllowFile(interpreter, path, mode)
+* int PySandbox_SetAllowedFile(PyThreadState *, const char *path, const char *mode)
Add a file that is allowed to be opened in 'mode' by the 'file' object. If
- the interpreter is not untrusted then return NULL.
+ the interpreter is not sandboxed then return a false value.
-* int PyXXX_AllowDirectory(interpreter, path, mode)
- Add a directory that is allowed to have files opened in 'mode' by the
- 'file' object. This includes both pre-existing files and any new files
- created by the 'file' object.
- XXX allow for creating/reading subdirectories?
-
-* PyXXX_CheckPath(path, mode, error_return)
- Macro that causes the caller to return with 'error_return' and XXX as the
- exception if the specified path with 'mode' is not allowed. For trusted
- interpreters, the macro does nothing.
+* PySandbox_AllowedPath(path, mode, error_return)
+ Macro that causes the caller to return with 'error_return' and raise
+ SandboxError as the exception if the specified path with 'mode' is not
+ allowed, otherwise do nothing.
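
A sketch of both sides of this API, again assuming the proposed names above;
``guarded_open_for_read()`` and the file path are purely examples::

    #include "Python.h"

    /* Embedding side: grant read access to one specific file. */
    static int
    grant_scratch_file(PyThreadState *sandbox)
    {
        return PySandbox_SetAllowedFile(sandbox, "/tmp/scratch.txt", "r");
    }

    /* Extension side: verify the path before touching the filesystem. */
    static PyObject *
    guarded_open_for_read(PyObject *self, PyObject *args)
    {
        const char *path;

        if (!PyArg_ParseTuple(args, "s", &path))
            return NULL;
        /* Returns NULL from this function with SandboxError set if the
           sandboxed interpreter may not read 'path'. */
        PySandbox_AllowedPath(path, "r", NULL);
        /* ... at this point it is safe to open 'path' for reading ... */
        Py_RETURN_NONE;
    }
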
Extension Module Importation
@@ -513,21 +543,22 @@
allowed based on the type of module (Python source, Python bytecode, or
extension module). Python bytecode files are never directly imported because
of the possibility of hostile bytecode being present. Python source is always
-trusted based on the assumption that all resource harm is eventually done at
+considered safe based on the assumption that all resource harm is eventually done at
the C level, thus Python code directly cannot cause harm. Thus only C
extension modules need to be checked against the whitelist.
The requested extension module name is checked in order to make sure that it
is on the whitelist if it is a C extension module. If the name is not correct
-an XXX exception is raised. Otherwise the import is allowed.
+a SandboxError exception is raised. Otherwise the import is allowed.
-Even if a Python source code module imports a C extension module in a trusted
+Even if a Python source code module imports a C extension module in an
+unprotected
interpreter it is not a problem since the Python source code module is reloaded
-in the untrusted interpreter. When that Python source module is freshly
+in the sandboxed interpreter. When that Python source module is freshly
imported the normal import check will be triggered to prevent the C extension
-module from becoming available to the untrusted interpreter.
+module from becoming available to the sandboxed interpreter.
-For the 'os' module, a special restricted version will be used if the proper
+For the 'os' module, a special sandboxed version will be used if the proper
C extension module providing the correct abilities is not allowed. This will
default to '/' as the path separator and provide as many reasonable abilities
as possible from a pure Python module.
@@ -537,32 +568,31 @@
By default, the whitelisted modules are:
-* XXX work off of rexec whitelist?
+* XXX
Why
--------------
Because C code is considered unsafe, its use should be regulated. By using a
-whitelist it allows one to explicitly decide that a C extension module should
-be considered safe.
+whitelist, one can explicitly decide that a C extension module is considered
+safe.
Possible Security Flaws
-----------------------
-If a trusted C extension module imports an untrusted C extension module and
-make it an attribute of the trust module there will be a breach in security.
+If a whitelisted C extension module imports a non-whitelisted C extension
+module and makes it an attribute of the whitelisted module, there will be a
+breach in security.
Luckily this is a rarity in extension modules.
-There is also the issue of a C extension module calling the C API of an
-untrusted C extension module.
+There is also the issue of a C extension module calling the C API of a
+non-whitelisted C extension module.
-Lastly, if a trusted C extension module is loaded in a trusted interpreter and
-then loaded into an untrusted interpreter then there is no possible checks
-during module initialization for possible security issues for resources opened
-during initialization of the module if such checks exist in the init*()
-function.
+Lastly, if a whitelisted C extension module is loaded in an unprotected
+interpreter and then loaded into a sandboxed interpreter, there are no checks
+during module initialization for possible security issues in the sandboxed
+interpreter that would have occurred had the sandboxed interpreter done the
+initial import.
All of these issues can be handled by never blindly whitelisting a C extension
module. Added support for dealing with C extension modules comes in the form
@@ -571,21 +601,20 @@
API
--------------
-* int PyXXX_AllowModule(interpreter, module_name)
- Allow the untrusted interpreter to import 'module_name'. If the
- interpreter is not untrusted, return NULL.
- XXX sub-modules in packages allowed implicitly? Or have to list all
- modules explicitly?
+* int PySandbox_SetModule(PyThreadState *, const char *module_name)
+ Allow the sandboxed interpreter to import 'module_name'. If the
+ interpreter is not sandboxed, return a false value. Absolute import paths must be
+ specified.
-* int PyXXX_BlockModule(interpreter, module_name)
+* int PySandbox_BlockModule(PyThreadState *, const char *module_name)
Remove the specified module from the whitelist. Used to remove modules
- that are allowed by default. If called on a trusted interpreter, returns
- NULL.
+ that are allowed by default. Return a false value if called on an
+ unprotected interpreter.
-* PyXXX_CheckModule(module_Name, error_return)
+* PySandbox_AllowedModule(const char *module_name, error_return)
Macro that causes the caller to return with 'error_return' and sets the
- exception XXX if the specified module cannot be imported. For trusted
- interpreters the macro does nothing.
+ exception SandboxError if the specified module cannot be imported,
+ otherwise does nothing.
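
For instance, an embedding application could adjust the whitelist for a
sandboxed interpreter along these lines (the module names are examples only)::

    #include "Python.h"

    static int
    configure_sandbox_modules(PyThreadState *sandbox)
    {
        /* Explicitly allow one C extension module and remove another that
           is assumed to be on the default whitelist. */
        if (!PySandbox_SetModule(sandbox, "math"))
            return 0;
        return PySandbox_BlockModule(sandbox, "somemodule");
    }

The import machinery would then call ``PySandbox_AllowedModule()`` with the
requested name whenever a C extension module import is attempted.
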
Extension Module Crippling
@@ -595,7 +624,7 @@
--------------
By providing a C API for checking for allowed abilities, modules that have some
-useful functionality can do proper security checks for those functions that
+useful functionality can do proper security checks for those functions that
could provide insecure abilities while allowing safe code to be used (and thus
not fully deny importation).
@@ -622,7 +651,7 @@
API
--------------
-Use PyXXX_Trusted() to protect unsafe code from being executed.
+Use PySandbox_Allowed() to protect unsafe code from being executed.
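
In practice that could look like the following hypothetical module function,
which refuses to run its unsafe branch in a sandboxed interpreter::

    #include "Python.h"

    static PyObject *
    spam_unsafe_operation(PyObject *self, PyObject *args)
    {
        /* In a sandboxed interpreter the proposed macro makes this function
           return NULL with SandboxError set; an unprotected interpreter
           falls through to the real work. */
        PySandbox_Allowed(NULL);
        /* ... code that is unsafe for sandboxed interpreters ... */
        Py_RETURN_NONE;
    }
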
Hostile Bytecode
@@ -631,8 +660,7 @@
Protection
--------------
-The code object's constructor is not callable from Python. Importation of .pyc
-and .pyo files is also prohibited.
+XXX
Why
@@ -661,7 +689,7 @@
Protection
--------------
-Only a subset of the 'sys' module will be made available to untrusted
+Only a subset of the 'sys' module will be made available to sandboxed
interpreters. Things to allow from the sys module:
* byteorder
@@ -686,9 +714,6 @@
* stdin # See `Stdin, Stdout, and Stderr`_.
* stdout
* stderr
-* __stdin__ # See `Stdin, Stdout, and Stderr`_ XXX Perhaps not needed?
-* __stdout__
-* __stderr__
* version
* api_version
@@ -743,26 +768,25 @@
API
--------------
-* int PyXXX_AllowIPAddress(interpreter, IP, port)
- Allow the untrusted interpreter to send/receive to the specified IP
- address on the specified port. If the interpreter is not untrusted,
- return NULL.
+* int PySandbox_SetIPAddress(PyThreadState *, const char *IP, int port)
+ Allow the sandboxed interpreter to send/receive to the specified IP
+ address on the specified port. If the interpreter is not sandboxed,
+ return a false value.
-* PyXXX_CheckIPAddress(IP, port, error_return)
+* PySandbox_AllowedIPAddress(const char *IP, int port, error_return)
Macro to verify that the specified IP address on the specified port is
allowed to be communicated with. If not, cause the caller to return with
- 'error_return' and XXX exception set. If the interpreter is trusted then
- do nothing.
+ 'error_return' and SandboxError exception set, otherwise do nothing.
-* PyXXX_AllowHost(interpreter, host, port)
- Allow the untrusted interpreter to send/receive to the specified host on
- the specified port. If the interpreter is not untrusted, return NULL.
- XXX resolve to IP at call time to prevent DNS man-in-the-middle attacks?
+* int PySandbox_SetHost(PyThreadState *, const char *host, int port)
+ Allow the sandboxed interpreter to send/receive to the specified host on
+ the specified port. If the interpreter is not sandboxed, return a false
+ value.
-* PyXXX_CheckHost(host, port, error_return)
+* PySandbox_AllowedHost(const char *host, int port, error_return)
Check that the specified host on the specified port is allowed to be
- communicated with. If not, set an XXX exception and cause the caller to
- return 'error_return'. If the interpreter is trusted then do nothing.
+ communicated with. If not, set a SandboxError exception and cause the caller to
+ return 'error_return', otherwise do nothing.
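
A short sketch of the granting side, with the host, IP address, and ports
being examples only::

    #include "Python.h"

    static int
    grant_network_access(PyThreadState *sandbox)
    {
        /* The corresponding checks would be made by the proposed
           PySandbox_AllowedHost() and PySandbox_AllowedIPAddress() macros
           inside the socket implementation. */
        if (!PySandbox_SetHost(sandbox, "www.example.com", 80))
            return 0;
        return PySandbox_SetIPAddress(sandbox, "192.0.2.10", 8080);
    }
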
Network Information
@@ -773,9 +797,11 @@
Limit what information can be gleaned about the network the system is running
on. This does not include restricting information on IP addresses and hosts
-that are have been explicitly allowed for the untrusted interpreter to
+that have been explicitly allowed for the sandboxed interpreter to
communicate with.
+XXX
+
Why
--------------
@@ -796,15 +822,15 @@
API
--------------
-* int PyXXX_AllowNetworkInfo(interpreter)
- Allow the untrusted interpreter to get network information regardless of
+* int PySandbox_SetNetworkInfo(PyThreadState *)
+ Allow the sandboxed interpreter to get network information regardless of
whether the IP or host address is explicitly allowed. If the interpreter
- is not untrusted, return NULL.
+ is not sandboxed, return a false value.
-* PyXXX_CheckNetworkInfo(error_return)
- Macro that will return 'error_return' for the caller and set XXX exception
- if the untrusted interpreter does not allow checking for arbitrary network
- information. For a trusted interpreter this does nothing.
+* PySandbox_CheckNetworkInfo(error_return)
+ Macro that will return 'error_return' for the caller and set a SandboxError exception
+ if the sandboxed interpreter does not allow checking for arbitrary network
+ information, otherwise do nothing.
Filesystem Information
@@ -819,6 +845,7 @@
* __file__ attribute on modules
* __path__ attribute on packages
* co_filename attribute on code objects
+* XXX
Why
@@ -838,20 +865,14 @@
API
--------------
-* int PyXXX_AllowFilesystemInfo(interpreter)
- Allow the untrusted interpreter to expose filesystem information. If the
- passed-in interpreter is not untrusted, return NULL.
+* int PySandbox_SetFilesystemInfo(PyThreadState *)
+ Allow the sandboxed interpreter to expose filesystem information. If the
+ passed-in interpreter is not sandboxed, return a false value.
-* PyXXX_CheckFilesystemInfo(error_return)
+* PySandbox_AllowedFilesystemInfo(error_return)
Macro that checks if exposing filesystem information is allowed. If it is
not, cause the caller to return with the value of 'error_return' and raise
- XXX.
-
-
-Threading
-=============================
-
-XXX Needed?
+ SandboxError, otherwise do nothing.
Stdin, Stdout, and Stderr
@@ -861,10 +882,8 @@
--------------
By default, sys.__stdin__, sys.__stdout__, and sys.__stderr__ will be set to
-instances of cStringIO. Allowing use of the normal stdin, stdout, and stderr
-will be allowed.
-XXX Or perhaps __stdin__ and friends should just be blocked and all you get is
-sys.stdin and friends set to cStringIO.
+instances of cStringIO. Explicit allowance of the process' stdin, stdout, and
+stderr is possible.
Why
@@ -877,19 +896,18 @@
Possible Security Flaws
-----------------------
-Unless cStringIO instances can be used maliciously, none to speak of.
-XXX Use StringIO instances instead for even better security?
+Unless StringIO instances can be used maliciously, none to speak of.
API
--------------
-* int PyXXX_UseTrueStdin(interpreter)
- int PyXXX_UseTrueStdout(interpreter)
- int PyXXX_UseTrueStderr(interpreter)
+* int PySandbox_SetTrueStdin(PyThreadState *)
+ int PySandbox_SetTrueStdout(PyThreadState *)
+ int PySandbox_SetTrueStderr(PyThreadState *)
Set the specific stream for the interpreter to the true version of the
stream and not to the default instance of cStringIO. If the interpreter is
- not untrusted, return NULL.
+ not sandboxed, return a false value.
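
Granting one of the real streams would then be a per-stream decision by the
embedding application, for example::

    #include "Python.h"

    static int
    allow_real_stdout(PyThreadState *sandbox)
    {
        /* stdin and stderr stay aliased to their cStringIO replacements;
           only stdout is switched to the process' real stream. */
        return PySandbox_SetTrueStdout(sandbox);
    }
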
Adding New Protections
@@ -931,58 +949,57 @@
API
--------------
-XXX Could also have PyXXXExtended prefix instead for the following functions
-
+ Bool
- * int PyXXX_ExtendedSetTrue(interpreter, group, type)
+ * int PySandbox_SetExtendedFlag(PyThreadState *, group, type)
Set a group-type to be true. Expected use is for when a binary
possibility of something is needed and that the default is to not allow
- use of the resource (e.g., network information). Returns NULL if the
- interpreter is not untrusted.
+ use of the resource (e.g., network information). Returns a false value
+ if used on an unprotected interpreter.
- * PyXXX_ExtendedCheckTrue(group, type, error_return)
+ * PySandbox_ExtendedAllowedFlag(group, type, error_return)
Macro that if the group-type is not set to true, cause the caller to
- return with 'error_return' with XXX exception raised. For trusted
+ return with 'error_return' with SandboxError exception raised. For unprotected
interpreters the check does nothing.
+ Numeric Range
- * int PyXXX_ExtendedValueCap(interpreter, group, type, cap)
+ * int PySandbox_SetExtendedCap(PyThreadState *, group, type, cap)
Set a group-type to a capped value, with the initial value set to 0.
Expected use is when a resource has a capped amount of use (e.g.,
- memory). Returns NULL if the interpreter is not untrusted.
+ memory). Returns a false value if the interpreter is not sandboxed.
- * PyXXX_ExtendedValueAlloc(increase, error_return)
+ * PySandbox_AllowedExtendedAlloc(increase, error_return)
Macro to raise the amount of a resource is used by 'increase'. If the
increase pushes the resource allocation past the set cap, then return
- 'error_return' and set XXX as the exception.
+ 'error_return' and set SandboxError as the exception, otherwise do
+ nothing.
- * PyXXX_ExtendedValueFree(decrease, error_return)
+ * PySandbox_AllowedExtendedFree(decrease, error_return)
Macro to lower the amount a resource is used by 'decrease'. If the
decrease pushes the allotment to below 0 then have the caller return
- 'error_return' and set XXX as the exception.
+ 'error_return' and set SandboxError as the exception, otherwise do
+ nothing.
+ Membership
- * int PyXXX_ExtendedAddMembership(interpreter, group, type, string)
+ * int PySandbox_ExtendedSetMembership(PyThreadState *, group, type, string)
Add a string to be considered a member of a group-type (e.g., allowed
- file paths). If the interpreter is not an untrusted interpreter,
- return NULL.
+ file paths). If the interpreter is not a sandboxed interpreter,
+ return a false value.
- * PyXXX_ExtendedCheckMembership(group, type, string, error_return)
+ * PySandbox_ExtendedCheckMembership(group, type, string, error_return)
Macro that checks 'string' is a member of the values set for the
group-type. If it is not, then have the caller return 'error_return'
- and set an exception for XXX. For trusted interpreters the call does
- nothing.
+ and set an exception for SandboxError, otherwise does nothing.
+ Specific Value
- * int PyXXX_ExtendedSetValue(interpreter, group, type, string)
+ * int PySandbox_SetExtendedValue(PyThreadState *, group, type, string)
Set a group-type to a specific string. If the interpreter is not
- untrusted, return NULL.
+ sandboxed, return a false value.
- * PyXXX_ExtendedCheckValue(group, type, string, error_return)
+ * PySandbox_AllowedExtendedValue(group, type, string, error_return)
Macro to check that the group-type is set to 'string'. If it is not,
then have the caller return 'error_return' and set an exception for
- XXX. If the interpreter is trusted then nothing is done.
+ SandboxError, otherwise do nothing.
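
Putting the pieces together, a C extension module adding its own protection
might do something like the following sketch; the 'group' and 'type' arguments
are left abstract in this document, so strings are assumed here purely for
illustration and all names are hypothetical::

    #include "Python.h"

    /* Configuration side: register a binary flag and a membership list
       under a module-specific group. */
    static int
    spam_configure_sandbox(PyThreadState *sandbox)
    {
        if (!PySandbox_SetExtendedFlag(sandbox, "spam", "allow_frobbing"))
            return 0;
        return PySandbox_ExtendedSetMembership(sandbox, "spam",
                                               "allowed_targets", "localhost");
    }

    /* Checking side, inside one of the module's functions. */
    static PyObject *
    spam_frob(PyObject *self, PyObject *args)
    {
        PySandbox_ExtendedAllowedFlag("spam", "allow_frobbing", NULL);
        PySandbox_ExtendedCheckMembership("spam", "allowed_targets",
                                          "localhost", NULL);
        /* ... the actual work happens here ... */
        Py_RETURN_NONE;
    }
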
References
@@ -996,3 +1013,10 @@
.. [#ctypes] 'ctypes' module
(http://docs.python.org/dev/lib/module-ctypes.html)
+
+.. [#paradigm regained] "Paradigm Regained:
+ Abstraction Mechanisms for Access Control"
+ (http://erights.org/talks/asian03/paradigm-revised.pdf)
+
+.. [#armin-hiding] [Python-Dev] what can we do to hide the 'file' type?
+ (http://mail.python.org/pipermail/python-dev/2006-July/067076.html)
From python-checkins at python.org Fri Jul 7 03:41:47 2006
From: python-checkins at python.org (brett.cannon)
Date: Fri, 7 Jul 2006 03:41:47 +0200 (CEST)
Subject: [Python-checkins] r47285 -
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Message-ID: <20060707014147.053C71E4005@bag.python.org>
Author: brett.cannon
Date: Fri Jul 7 03:41:46 2006
New Revision: 47285
Modified:
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Log:
Cleanup of todo list.
Modified: python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
==============================================================================
--- python/branches/bcannon-sandboxing/sandboxing_design_doc.txt (original)
+++ python/branches/bcannon-sandboxing/sandboxing_design_doc.txt Fri Jul 7 03:41:46 2006
@@ -19,7 +19,7 @@
XXX TO DO
=============================
-* Use a callback for paths?
+* Use a callback for paths to create wrapper?
* threading needs protection?
* python-dev convince me that hiding 'file' possible?
+ based on that, handle code objects
@@ -29,10 +29,8 @@
* what network info functions are allowed by default?
* does the object.__subclasses__() trick work across interpreters, or is it
unique per interpreter?
-* don't use abstract types in API spec
-* in the Threat Model, reference the section of the implementation that
-* addresses that concern
-* PySandbox_*Extended*()
+* in the "Threat Model", reference the section of the implementation that
+ addresses that concern
* figure out default whitelist of extension modules
* check default accessible objects for file path exposure
* helper functions to get at StringIO instances for stdin, stdout, and friends?
@@ -357,6 +355,10 @@
of a module. Also, all comments refer to a sandboxed interpreter unless
otherwise explicitly stated.
+This list does not address specifics such as how 'file' will be protected or
+whether memory should be protected. It is meant to make clear, at a more
+basic level, what the security model assumes to be true.
+
* The Python interpreter cannot be crashed by valid Python source code in a
bare interpreter.
* Python source code is always considered safe.
@@ -371,11 +373,7 @@
+ Sharing objects through C extension modules is possible.
* When starting a sandboxed interpreter, it starts with a fresh built-in and
global namespace that is not shared with the interpreter that started it.
-* I/O through stdin, stdout, and stderr is inherently not sent to the process'
- own version of these file descriptors
- + Should be aliased to safe replacements.
- + Allowing use of the process' actual version should be possible.
-* Objects in the built-in namespace should be safe to use.
+ Objects in the built-in namespace should be safe to use.
+ Either hide the dangerous ones or cripple them so they can cause no harm.
There are also some features that might be desirable, but are not being
From python-checkins at python.org Fri Jul 7 05:33:40 2006
From: python-checkins at python.org (brett.cannon)
Date: Fri, 7 Jul 2006 05:33:40 +0200 (CEST)
Subject: [Python-checkins] r47286 - peps/trunk/pep-3100.txt
Message-ID: <20060707033340.7E5BC1E4005@bag.python.org>
Author: brett.cannon
Date: Fri Jul 7 05:33:39 2006
New Revision: 47286
Modified:
peps/trunk/pep-3100.txt
Log:
Reference cleanup and reformatting of modules to be removed.
Modified: peps/trunk/pep-3100.txt
==============================================================================
--- peps/trunk/pep-3100.txt (original)
+++ peps/trunk/pep-3100.txt Fri Jul 7 05:33:39 2006
@@ -63,7 +63,7 @@
Core language
=============
-* True division becomes default behavior [10]_
+* True division becomes default behavior [#pep238]_
* ``exec`` as a statement is not worth it -- make it a function
* Add optional declarations for static typing [11]_
* Support only new-style classes; classic classes will be gone [1]_
@@ -84,7 +84,7 @@
* floats will not be acceptable as arguments in place of ints for operations
where floats are inadvertently accepted (PyArg_ParseTuple() i & l formats)
* Imports will be absolute by default.
- Relative imports must be explicitly specified [19]_
+ Relative imports must be explicitly specified [#pep328]_
* __init__.py will be optional in sub-packages. __init__.py will still
be required for top-level packages.
* Cleanup the Py_InitModule() variants {,3,4} (also import and parser APIs)
@@ -148,7 +148,7 @@
* ``basestring.find()`` and ``basestring.rfind()``; use ``basestring.index()``
or ``basestring.rindex()`` in a try/except block [15]_
-* ``file.xreadlines()`` method [17]_
+* ``file.xreadlines()`` method [#file-object]_
* ``dict.setdefault()`` [22]_
* ``dict.has_key()`` method
@@ -163,7 +163,7 @@
* Introduce ``trunc()``, which would call the ``__trunc__()`` method on its
argument; suggested use is for objects like float where calling ``__int__()``
has data loss, but an integral representation is still desired [8]_
-* Exception hierarchy changes [20]_
+* Exception hierarchy changes [#pep352]_
To be removed:
@@ -191,29 +191,26 @@
To be removed:
-* Deprecated modules, methods, parameters, attributes, etc. [1]_ [17]_ [18]_
- There may be other modules, the most common are listed below.
-
- stdlib modules to be removed (see docstrings and comments in the source):
- * ``macfs``, ``new``, ``reconvert``, ``stringold``, ``xmllib``
- * ``pcre``, ``pypcre``, ``strop``
- stdlib modules to be removed (see PEP 4): [18]_
- * ``posixfile``, ``pre``, ``regsub``, ``rfc822``,
- * ``statcache``, ``string``, ``TERMIOS``
- * ``mimetools``, ``MimeWriter``, ``mimify``,
- * ``mpz``, ``rgbimage``
- * Everything in lib-old: [18]_
- * Para.py, addpack.py, cmp.py, cmpcache.py, codehack.py,
- * dircmp.py, dump.py, find.py, fmt.py, grep.py, lockfile.py,
- * newdir.py, ni.py, packmail.py, poly.py, rand.py, statcache.py,
- * tb.py, tzparse.py, util.py, whatsound.py, whrandom.py, zmod.py
-
-* ``sys.exitfunc``: use atexit module instead [17]_
+* stdlib modules to be removed
+ + see docstrings and comments in the source
+ - ``macfs``, ``new``, ``reconvert``, ``stringold``, ``xmllib``,
+ ``pcre``, ``pypcre``, ``strop``
+ + see PEP 4 [18]_
+ - ``posixfile``, ``pre``, ``regsub``, ``rfc822``,
+ ``statcache``, ``string``, ``TERMIOS`` ``mimetools``,
+ ``MimeWriter``, ``mimify``, ``mpz``, ``rgbimage``
+ + Everything in lib-old [18]_
+ - ``Para``, ``addpack``, ``cmp``, ``cmpcache``, ``codehack``,
+ ``dircmp``, ``dump``, ``find``, ``fmt``, ``grep``,
+ ``lockfile``, ``newdir``, ``ni``, ``packmail``, ``poly``,
+ ``rand``, ``statcache``, ``tb``, ``tzparse``, ``util``,
+ ``whatsound``, ``whrandom``, ``zmod``
+* ``sys.exitfunc``: use atexit module instead [#sys-module]_
* ``sys.exc_type``, ``sys.exc_values``, ``sys.exc_traceback``:
not thread-safe; use ``sys.exc_info()`` or an attribute
- of the exception [2]_ [13]_ [17]_
-* ``array.read``, ``array.write`` [17]_
-* ``operator.isCallable``, ``operator.sequenceIncludes`` [17]_
+ of the exception [2]_ [13]_ [#sys-module]_
+* ``array.read``, ``array.write`` [#array-module]_
+* ``operator.isCallable``, ``operator.sequenceIncludes`` [#operator-module]_
* In the thread module, the acquire_lock() and release_lock() aliases
for the acquire() and release() methods on lock objects.
(Probably also just remove the thread module as a public API,
@@ -239,6 +236,12 @@
.. [2] Python Regrets:
http://www.python.org/doc/essays/ppt/regrets/PythonRegrets.pdf
+.. [9] Guido's blog ("The fate of reduce() in Python 3000")
+ http://www.artima.com/weblogs/viewpost.jsp?thread=98196
+
+.. [11] Guido's blog ("Python Optional Typechecking Redux")
+ http://www.artima.com/weblogs/viewpost.jsp?thread=89161
+
.. [3] Python Wiki:
http://www.python.org/moin/Python3.0
@@ -258,18 +261,6 @@
objects can be used")
http://mail.python.org/pipermail/python-dev/2005-February/051674.html
-.. [9] Guido's blog ("The fate of reduce() in Python 3000")
- http://www.artima.com/weblogs/viewpost.jsp?thread=98196
-
-.. [10] PEP 238 ("Changing the Division Operator")
- http://www.python.org/dev/peps/pep-0238
-
-.. [11] Guido's blog ("Python Optional Typechecking Redux")
- http://www.artima.com/weblogs/viewpost.jsp?thread=89161
-
-.. [12] PEP 289 ("Generator Expressions")
- http://www.python.org/dev/peps/pep-0289
-
.. [13] python-dev email ("anonymous blocks")
http://mail.python.org/pipermail/python-dev/2005-April/053060.html
@@ -282,31 +273,12 @@
.. [16] python-dev email (Replacement for print in Python 3.0)
http://mail.python.org/pipermail/python-dev/2005-September/056154.html
-.. [17] Python docs
- http://docs.python.org/ref/sequence-methods.html
- http://docs.python.org/lib/module-sys.html
- http://docs.python.org/lib/module-operator.html
- http://docs.python.org/lib/module-array.html
- http://docs.python.org/lib/bltin-file-objects.html
-
-.. [18] PEP 4 ("Deprecation of Standard Modules")
- http://www.python.org/dev/peps/pep-0004
-
-.. [19] PEP 328 ("Imports: Multi-Line and Absolute/Relative")
- http://www.python.org/dev/peps/pep-0328
-
-.. [20] PEP 352 ("Required Superclass for Exceptions")
- http://www.python.org/dev/peps/pep-0352
-
.. [21] python-dev email
http://mail.python.org/pipermail/python-dev/2006-February/061169.html
.. [22] python-dev email ("defaultdict")
http://mail.python.org/pipermail/python-dev/2006-February/061261.html
-.. [23] PEP 308 ("Conditional Expressions")
- http://www.python.org/dev/peps/pep-0308
-
.. [24] python-3000 email
http://mail.python.org/pipermail/python-3000/2006-April/000996.html
@@ -325,12 +297,36 @@
.. [29] python-3000 email ("bug in modulus?")
http://mail.python.org/pipermail/python-3000/2006-May/001735.html
-.. [30] PEP 299 ("Special __main__() function in modules")
- http://www.python.org/dev/peps/pep-0299
+.. [17] Python docs (Additional methods for emulation of sequence types)
+ http://docs.python.org/ref/sequence-methods.html
+
+.. [#sys-module] Python docs (sys -- System-specific parameters and functions)
+ http://docs.python.org/lib/module-sys.html
+
+.. [#operator-module] Python docs (operator -- Standard operators as functions)
+ http://docs.python.org/lib/module-operator.html
+
+.. [#array-module] Python docs (array -- Efficient arrays of numeric values)
+ http://docs.python.org/lib/module-array.html
+
+.. [#file-object] Python docs (File objects)
+ http://docs.python.org/lib/bltin-file-objects.html
+
+.. [18] PEP 4 ("Deprecation of Standard Modules")
+ http://www.python.org/dev/peps/pep-0004
.. [#pep238] PEP 238 (Changing the Division Operator)
http://www.python.org/dev/peps/pep-0238
+.. [12] PEP 289 ("Generator Expressions")
+ http://www.python.org/dev/peps/pep-0289
+
+.. [30] PEP 299 ("Special __main__() function in modules")
+ http://www.python.org/dev/peps/pep-0299
+
+.. [23] PEP 308 ("Conditional Expressions")
+ http://www.python.org/dev/peps/pep-0308
+
.. [#pep328] PEP 328 (Imports: Multi-Line and Absolute/Relative)
http://www.python.org/dev/peps/pep-0328
@@ -347,6 +343,7 @@
http://www.python.org/dev/peps/pep-3099
+
Copyright
=========
From python-checkins at python.org Fri Jul 7 08:03:17 2006
From: python-checkins at python.org (neal.norwitz)
Date: Fri, 7 Jul 2006 08:03:17 +0200 (CEST)
Subject: [Python-checkins] r47287 - python/trunk/Lib/test/fork_wait.py
python/trunk/Lib/test/test_fork1.py
python/trunk/Lib/test/test_wait3.py
python/trunk/Lib/test/test_wait4.py
Message-ID: <20060707060317.402F61E4006@bag.python.org>
Author: neal.norwitz
Date: Fri Jul 7 08:03:15 2006
New Revision: 47287
Modified:
python/trunk/Lib/test/fork_wait.py
python/trunk/Lib/test/test_fork1.py
python/trunk/Lib/test/test_wait3.py
python/trunk/Lib/test/test_wait4.py
Log:
Restore rev 47014:
The hppa ubuntu box sometimes hangs forever in these tests. My guess
is that the wait is failing for some reason. Use WNOHANG, so we won't
wait until the buildbot kills the test suite.
I haven't been able to reproduce the failure, so I'm not sure if
this will help or not. Hopefully, this change will cause the test
to fail, rather than hang. That will be better since we will get
the rest of the test results. It may also help us debug the real problem.
*** The reason this originally failed was because there were many
zombie children outstanding before rev 47158 cleaned them up.
There are still hangs in test_subprocess that need to be addressed,
but that will take more work. This should close some holes.
Modified: python/trunk/Lib/test/fork_wait.py
==============================================================================
--- python/trunk/Lib/test/fork_wait.py (original)
+++ python/trunk/Lib/test/fork_wait.py Fri Jul 7 08:03:15 2006
@@ -34,7 +34,14 @@
pass
def wait_impl(self, cpid):
- spid, status = os.waitpid(cpid, 0)
+ for i in range(10):
+ # waitpid() shouldn't hang, but some of the buildbots seem to hang
+ # in the forking tests. This is an attempt to fix the problem.
+ spid, status = os.waitpid(cpid, os.WNOHANG)
+ if spid == cpid:
+ break
+ time.sleep(2 * SHORTSLEEP)
+
self.assertEquals(spid, cpid)
self.assertEquals(status, 0, "cause = %d, exit = %d" % (status&0xff, status>>8))
Modified: python/trunk/Lib/test/test_fork1.py
==============================================================================
--- python/trunk/Lib/test/test_fork1.py (original)
+++ python/trunk/Lib/test/test_fork1.py Fri Jul 7 08:03:15 2006
@@ -2,6 +2,7 @@
"""
import os
+import time
from test.fork_wait import ForkWait
from test.test_support import TestSkipped, run_unittest, reap_children
@@ -12,7 +13,14 @@
class ForkTest(ForkWait):
def wait_impl(self, cpid):
- spid, status = os.waitpid(cpid, 0)
+ for i in range(10):
+ # waitpid() shouldn't hang, but some of the buildbots seem to hang
+ # in the forking tests. This is an attempt to fix the problem.
+ spid, status = os.waitpid(cpid, os.WNOHANG)
+ if spid == cpid:
+ break
+ time.sleep(1.0)
+
self.assertEqual(spid, cpid)
self.assertEqual(status, 0, "cause = %d, exit = %d" % (status&0xff, status>>8))
Modified: python/trunk/Lib/test/test_wait3.py
==============================================================================
--- python/trunk/Lib/test/test_wait3.py (original)
+++ python/trunk/Lib/test/test_wait3.py Fri Jul 7 08:03:15 2006
@@ -2,6 +2,7 @@
"""
import os
+import time
from test.fork_wait import ForkWait
from test.test_support import TestSkipped, run_unittest, reap_children
@@ -17,10 +18,14 @@
class Wait3Test(ForkWait):
def wait_impl(self, cpid):
- while 1:
- spid, status, rusage = os.wait3(0)
+ for i in range(10):
+ # wait3() shouldn't hang, but some of the buildbots seem to hang
+ # in the forking tests. This is an attempt to fix the problem.
+ spid, status, rusage = os.wait3(os.WNOHANG)
if spid == cpid:
break
+ time.sleep(1.0)
+
self.assertEqual(spid, cpid)
self.assertEqual(status, 0, "cause = %d, exit = %d" % (status&0xff, status>>8))
self.assertTrue(rusage)
Modified: python/trunk/Lib/test/test_wait4.py
==============================================================================
--- python/trunk/Lib/test/test_wait4.py (original)
+++ python/trunk/Lib/test/test_wait4.py Fri Jul 7 08:03:15 2006
@@ -2,6 +2,7 @@
"""
import os
+import time
from test.fork_wait import ForkWait
from test.test_support import TestSkipped, run_unittest, reap_children
@@ -17,7 +18,13 @@
class Wait4Test(ForkWait):
def wait_impl(self, cpid):
- spid, status, rusage = os.wait4(cpid, 0)
+ for i in range(10):
+ # wait4() shouldn't hang, but some of the buildbots seem to hang
+ # in the forking tests. This is an attempt to fix the problem.
+ spid, status, rusage = os.wait4(cpid, os.WNOHANG)
+ if spid == cpid:
+ break
+ time.sleep(1.0)
self.assertEqual(spid, cpid)
self.assertEqual(status, 0, "cause = %d, exit = %d" % (status&0xff, status>>8))
self.assertTrue(rusage)
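All four tests apply the same pattern: instead of blocking in waitpid()/wait3()/wait4(),
poll with os.WNOHANG and sleep between attempts, so a wedged child makes the test fail
instead of hanging the whole run. A minimal stand-alone sketch of that pattern, assuming
a POSIX system (the helper name is illustrative, not from the test suite):

    import os
    import time

    def wait_for_child(cpid, retries=10, delay=1.0):
        # Poll instead of blocking: with os.WNOHANG, waitpid() returns
        # (0, 0) immediately if the child has not exited yet.
        for i in range(retries):
            spid, status = os.waitpid(cpid, os.WNOHANG)
            if spid == cpid:
                return status          # child reaped successfully
            time.sleep(delay)          # give the child a little more time
        raise RuntimeError("child %d not reaped after %d polls" % (cpid, retries))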
From python-checkins at python.org Fri Jul 7 09:50:40 2006
From: python-checkins at python.org (georg.brandl)
Date: Fri, 7 Jul 2006 09:50:40 +0200 (CEST)
Subject: [Python-checkins] r47288 - peps/trunk/pep-3099.txt
Message-ID: <20060707075040.68F4E1E4005@bag.python.org>
Author: georg.brandl
Date: Fri Jul 7 09:50:39 2006
New Revision: 47288
Modified:
peps/trunk/pep-3099.txt
Log:
Add "global" decision.
Modified: peps/trunk/pep-3099.txt
==============================================================================
--- peps/trunk/pep-3099.txt (original)
+++ peps/trunk/pep-3099.txt Fri Jul 7 09:50:39 2006
@@ -111,10 +111,13 @@
answer on this subject.
* Referencing the global name ``foo`` will not be spelled ``globals.foo``.
+ The ``global`` statement will stay.
- Thread: "replace globals() and global statement with global builtin
+ Threads: "replace globals() and global statement with global builtin
object",
- http://mail.python.org/pipermail/python-3000/2006-July/002485.html
+ http://mail.python.org/pipermail/python-3000/2006-July/002485.html,
+ "Explicit Lexical Scoping (pre-PEP?)",
+ http://mail.python.org/pipermail/python-dev/2006-July/067111.html
* There will be no alternative binding operators such as ``:=``.
From python-checkins at python.org Fri Jul 7 10:15:14 2006
From: python-checkins at python.org (georg.brandl)
Date: Fri, 7 Jul 2006 10:15:14 +0200 (CEST)
Subject: [Python-checkins] r47289 - python/trunk/Doc/lib/libcookielib.tex
Message-ID: <20060707081514.BFA371E4005@bag.python.org>
Author: georg.brandl
Date: Fri Jul 7 10:15:12 2006
New Revision: 47289
Modified:
python/trunk/Doc/lib/libcookielib.tex
Log:
Fix RFC number.
Modified: python/trunk/Doc/lib/libcookielib.tex
==============================================================================
--- python/trunk/Doc/lib/libcookielib.tex (original)
+++ python/trunk/Doc/lib/libcookielib.tex Fri Jul 7 10:15:12 2006
@@ -24,7 +24,7 @@
the de-facto Netscape cookie protocol (which differs substantially
from that set out in the original Netscape specification), including
taking note of the \code{max-age} and \code{port} cookie-attributes
-introduced with RFC 2109. \note{The various named parameters found in
+introduced with RFC 2965. \note{The various named parameters found in
\mailheader{Set-Cookie} and \mailheader{Set-Cookie2} headers
(eg. \code{domain} and \code{expires}) are conventionally referred to
as \dfn{attributes}. To distinguish them from Python attributes, the
From python-checkins at python.org Fri Jul 7 17:24:59 2006
From: python-checkins at python.org (pythondev)
Date: Fri, 7 Jul 2006 17:24:59 +0200 (CEST)
Subject: [Python-checkins] r47290 - ctypes
Message-ID: <20060707152459.9F0751E4006@bag.python.org>
Author: pythondev
Date: Fri Jul 7 17:24:58 2006
New Revision: 47290
Added:
ctypes/
Log:
Import ctypes
From buildbot at python.org Fri Jul 7 18:45:17 2006
From: buildbot at python.org (buildbot at python.org)
Date: Fri, 07 Jul 2006 16:45:17 +0000
Subject: [Python-checkins] buildbot warnings in x86 cygwin trunk
Message-ID: <20060707164517.955531E400C@bag.python.org>
The Buildbot has detected a new failure of x86 cygwin trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520cygwin%2520trunk/builds/919
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: andrew.kuchling,armin.rigo,fred.drake,fredrik.lundh,georg.brandl,hyeshik.chang,martin.v.loewis,neal.norwitz,nick.coghlan,ronald.oussoren,thomas.heller,thomas.wouters
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Fri Jul 7 20:10:38 2006
From: python-checkins at python.org (brett.cannon)
Date: Fri, 7 Jul 2006 20:10:38 +0200 (CEST)
Subject: [Python-checkins] r50477 -
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Message-ID: <20060707181038.0F8191E4011@bag.python.org>
Author: brett.cannon
Date: Fri Jul 7 20:10:37 2006
New Revision: 50477
Modified:
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Log:
Add mention of use of open() for files. Also add references in "Threat Model" to sections handling assumptions that are not true by default in Python.
Modified: python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
==============================================================================
--- python/branches/bcannon-sandboxing/sandboxing_design_doc.txt (original)
+++ python/branches/bcannon-sandboxing/sandboxing_design_doc.txt Fri Jul 7 20:10:37 2006
@@ -350,7 +350,8 @@
Below is a list of what the security implementation
should allow/prevent or assumes, along with what section of this document that addresses
-that part of the security model (if applicable). The term "bare" when in terms
+that part of the security model (if not already true in Python by default).
+The term "bare" when in terms
of an interpreter means an interpreter that has not performed a single import
of a module. Also, all comments refer to a sandboxed interpreter unless
otherwise explicitly stated.
@@ -362,18 +363,22 @@
* The Python interpreter cannot be crashed by valid Python source code in a
bare interpreter.
* Python source code is always considered safe.
-* Python bytecode is always considered dangerous.
-* C extension modules are inherently considered dangerous.
+* Python bytecode is always considered dangerous [`Hostile Bytecode`_].
+* C extension modules are inherently considered dangerous
+ [`Extension Module Importation`_].
+ Explicit trust of a C extension module is possible.
* Sandboxed interpreters running in the same process inherently cannot communicate with
each other.
- + Communication through C extension modules is possible.
+ + Communication through C extension modules is possible because of the
+ technical need to share extension module instances between interpreters.
* Sandboxed interpreters running in the same process inherently cannot share
objects.
- + Sharing objects through C extension modules is possible.
+ + Sharing objects through C extension modules is possible because of the
+ technical need to share extension module instances between interpreters.
* When starting a sandboxed interpreter, it starts with a fresh built-in and
global namespace that is not shared with the interpreter that started it.
- Objects in the built-in namespace should be safe to use.
+ Objects in the built-in namespace should be safe to use
+ [``Reading/Writing Files`_].
+ Either hide the dangerous ones or cripple them so they can cause no harm.
There are also some features that might be desirable, but are not being
@@ -500,6 +505,24 @@
XXX
+To open a file, one will have to use open(). This will make open() a factory
+function that controls reference access to the 'file' type in terms of creating
+new instances. When an attempted file opening fails, SandboxError will be
+raised.
+
+What open() returns may not specifically be an instance of 'file' but a proxy
+that provides the security measures needed. While this might break code that
+uses type checking to make sure a 'file' object is used, taking a duck typing
+approach would be better. This is not only more Pythonic but would also allow
+the code to use a StringIO instance.
+
+It has been suggested to allow for a passed-in callback to be called when a
+specific path is to be opened. While this provides good flexibility in terms
+of allowing custom proxies with more fine-grained security (e.g., capping the
+amount of disk write), this has been deemed unneeded in the initial security
+model and thus is not being considered at this time.
+
+
Why
--------------
@@ -911,6 +934,9 @@
Adding New Protections
=============================
+This feature has the lowest priority and thus will be the last feature
+implemented (if ever).
+
Protection
--------------
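At the Python level, the open()-as-factory idea described in this change might look
roughly like the following. This is only an illustration of the duck-typing argument;
SandboxError, the whitelist, and ReadOnlyFile are hypothetical names, not anything
defined in the branch:

    class SandboxError(Exception):
        pass

    ALLOWED_READ_PREFIXES = ('/tmp/sandbox/',)      # hypothetical whitelist

    class ReadOnlyFile(object):
        """Duck-typed stand-in for 'file' that only permits reading."""
        def __init__(self, path):
            self._fp = open(path, 'rb')
        def read(self, size=-1):
            return self._fp.read(size)
        def close(self):
            self._fp.close()

    def sandboxed_open(path, mode='r'):
        allowed = any(path.startswith(p) for p in ALLOWED_READ_PREFIXES)
        if mode not in ('r', 'rb') or not allowed:
            raise SandboxError("cannot open %r" % (path,))
        try:
            return ReadOnlyFile(path)
        except IOError:
            # Per the design text, any failure surfaces as SandboxError.
            raise SandboxError("cannot open %r" % (path,))

Code that checks hasattr(obj, 'read') instead of isinstance(obj, file) keeps working with
such a proxy, which is the duck-typing point the text makes; a StringIO instance passes
the same test.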
From python-checkins at python.org Fri Jul 7 20:11:40 2006
From: python-checkins at python.org (brett.cannon)
Date: Fri, 7 Jul 2006 20:11:40 +0200 (CEST)
Subject: [Python-checkins] r50478 -
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Message-ID: <20060707181140.048AF1E4008@bag.python.org>
Author: brett.cannon
Date: Fri Jul 7 20:11:39 2006
New Revision: 50478
Modified:
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Log:
Remove some points from the todo list.
Modified: python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
==============================================================================
--- python/branches/bcannon-sandboxing/sandboxing_design_doc.txt (original)
+++ python/branches/bcannon-sandboxing/sandboxing_design_doc.txt Fri Jul 7 20:11:39 2006
@@ -19,7 +19,6 @@
XXX TO DO
=============================
-* Use a callback for paths to create wrapper?
* threading needs protection?
* python-dev convince me that hiding 'file' possible?
+ based on that, handle code objects
@@ -29,8 +28,6 @@
* what network inko functions are allowed by default?
* does the object.__subclasses__() trick work across interpreters, or is it
unique per interpreter?
-* in the "Threat Model", reference the section of the implementation that
- addresses that concern
* figure out default whitelist of extension modules
* check default accessible objects for file path exposure
* helper functions to get at StringIO instances for stdin, stdout, and friends?
From python-checkins at python.org Fri Jul 7 20:16:23 2006
From: python-checkins at python.org (brett.cannon)
Date: Fri, 7 Jul 2006 20:16:23 +0200 (CEST)
Subject: [Python-checkins] r50479 -
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Message-ID: <20060707181623.EEFFF1E4007@bag.python.org>
Author: brett.cannon
Date: Fri Jul 7 20:16:23 2006
New Revision: 50479
Modified:
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Log:
Clarify protection of 'print', input(), and raw_input().
Modified: python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
==============================================================================
--- python/branches/bcannon-sandboxing/sandboxing_design_doc.txt (original)
+++ python/branches/bcannon-sandboxing/sandboxing_design_doc.txt Fri Jul 7 20:16:23 2006
@@ -33,7 +33,7 @@
* helper functions to get at StringIO instances for stdin, stdout, and friends?
* decide on what type of objects (e.g., PyStringObject or const char *) are to
be passed into PySandbox_*Extended*() functions
-
+* all built-ins properly protected?
Goal
=============================
@@ -345,8 +345,7 @@
Threat Model
///////////////////////////////////////
-Below is a list of what the security implementation
-should allow/prevent or assumes, along with what section of this document that addresses
+Below is a list of what the security implementation assumes, along with what section of this document that addresses
that part of the security model (if not already true in Python by default).
The term "bare" when in terms
of an interpreter means an interpreter that has not performed a single import
@@ -375,7 +374,7 @@
* When starting a sandboxed interpreter, it starts with a fresh built-in and
global namespace that is not shared with the interpreter that started it.
Objects in the built-in namespace should be safe to use
- [``Reading/Writing Files`_].
+ [`Reading/Writing Files`_, `Stdin, Stdout, and Stderr`_].
+ Either hide the dangerous ones or cripple them so they can cause no harm.
There are also some features that might be desirable, but are not being
@@ -903,12 +902,15 @@
instances of cStringIO. Explicit allowance of the process' stdin, stdout, and
stderr is possible.
+This will protect the 'print' statement, and the built-ins input() and
+raw_input().
+
Why
--------------
Interference with stdin, stdout, or stderr should not be allowed unless
-desired.
+desired. No one wants uncontrolled output sent to their screen.
Possible Security Flaws
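The protection described here amounts to handing the sandboxed code StringIO objects in
place of the real standard streams, so 'print', input(), and raw_input() never reach the
terminal unless explicitly allowed. A rough stand-alone sketch of that idea; the helper
function and its behaviour are assumptions, not the branch's API:

    import sys
    from cStringIO import StringIO

    def run_contained(code_string, stdin_data=''):
        fake_in = StringIO(stdin_data)
        fake_out, fake_err = StringIO(), StringIO()
        saved = sys.stdin, sys.stdout, sys.stderr
        sys.stdin, sys.stdout, sys.stderr = fake_in, fake_out, fake_err
        try:
            exec code_string in {}       # 'print'/raw_input() hit the fakes
        finally:
            sys.stdin, sys.stdout, sys.stderr = saved
        return fake_out.getvalue(), fake_err.getvalue()

    # run_contained("print raw_input()", "42\n") returns ('42\n', '')
    # without writing anything to the process' real stdout.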
From python-checkins at python.org Fri Jul 7 20:30:05 2006
From: python-checkins at python.org (brett.cannon)
Date: Fri, 7 Jul 2006 20:30:05 +0200 (CEST)
Subject: [Python-checkins] r50480 -
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Message-ID: <20060707183005.CAD231E4007@bag.python.org>
Author: brett.cannon
Date: Fri Jul 7 20:30:05 2006
New Revision: 50480
Modified:
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Log:
Make PySandbox_Allowed*() the common prefix for macros that check for permission rights.
Modified: python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
==============================================================================
--- python/branches/bcannon-sandboxing/sandboxing_design_doc.txt (original)
+++ python/branches/bcannon-sandboxing/sandboxing_design_doc.txt Fri Jul 7 20:30:05 2006
@@ -34,6 +34,7 @@
* decide on what type of objects (e.g., PyStringObject or const char *) are to
be passed into PySandbox_*Extended*() functions
* all built-ins properly protected?
+* exactly how to tell whether argument to open() is a path, IP, or host name.
Goal
=============================
@@ -423,10 +424,12 @@
incorrectly checking a return value on a rights-checking function call. For
the rare case where this functionality is disliked, just make the check in a
utility function and check that function's return value (but this is strongly
-discouraged!). Functions that check that an operation is allowed implicitly operate on the currently running interpreter as
+discouraged!).
+
+Functions that check that an operation is allowed implicitly operate on the currently running interpreter as
returned by ``PyInterpreter_Get()`` and are to be used by any code (the
interpreter, extension modules, etc.) that needs to check for permission to
-execute.
+execute. They have the common prefix of ``PySandbox_Allowed*()``.
API
@@ -503,8 +506,11 @@
To open a file, one will have to use open(). This will make open() a factory
function that controls reference access to the 'file' type in terms of creating
-new instances. When an attempted file opening fails, SandboxError will be
-raised.
+new instances. When an attempted file opening fails (either because the path
+does not exist or of security reasons), SandboxError will be
+raised. The same exception must be raised to prevent filesystem information
+being gleaned from the type of exception returned (i.e., returning IOError if a
+path does not exist tells the user something about that file path).
What open() may not specifically be an instance of 'file' but a proxy
that provides the security measures needed. While this might break code that
@@ -765,6 +771,15 @@
Allow sending and receiving data to/from specific IP addresses on specific
ports.
+open() is to be used as a factory function to open a network connection. If
+the connection is not possible (either because of an invalid address or
+security reasons), SandboxError is raised.
+
+A socket object may not be returned by the call. A proxy to handle security
+might be returned instead.
+
+XXX
+
Why
--------------
@@ -844,7 +859,7 @@
whether the IP or host address is explicitly allowed. If the interpreter
is not sandboxed, return a false value.
-* PySandbox_CheckNetworkInfo(error_return)
+* PySandbox_AllowedNetworkInfo(error_return)
Macro that will return 'error_return' for the caller and set a SandboxError exception
if the sandboxed interpreter does not allow checking for arbitrary network
information, otherwise do nothing.
@@ -979,7 +994,7 @@
use of the resource (e.g., network information). Returns a false value
if used on an unprotected interpreter.
- * PySandbox_ExtendedAllowedFlag(group, type, error_return)
+ * PySandbox_AllowedExtendedFlag(group, type, error_return)
Macro that if the group-type is not set to true, cause the caller to
return with 'error_return' with SandboxError exception raised. For unprotected
interpreters the check does nothing.
@@ -1004,12 +1019,12 @@
+ Membership
- * int PySandbox_ExtendedSetMembership(PyThreadState *, group, type, string)
+ * int PySandbox_SetExtendedMembership(PyThreadState *, group, type, string)
Add a string to be considered a member of a group-type (e.g., allowed
file paths). If the interpreter is not an sandboxed interpreter,
return a false value.
- * PySandbox_ExtendedCheckMembership(group, type, string, error_return)
+ * PySandbox_AllowedExtendedMembership(group, type, string, error_return)
Macro that checks 'string' is a member of the values set for the
group-type. If it is not, then have the caller return 'error_return'
and set an exception for SandboxError, otherwise does nothing.
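For the networking half, the text added here suggests the same factory shape: the open()
call hands back a proxy around a socket only for explicitly allowed endpoints and raises
SandboxError for everything else, so a refused connection looks the same as an impossible
one. A speculative Python-level sketch; none of these names come from the design doc:

    import socket

    class SandboxError(Exception):
        pass

    ALLOWED_ENDPOINTS = set([('127.0.0.1', 8080)])   # hypothetical whitelist

    class SocketProxy(object):
        """Limited, duck-typed wrapper exposing only send() and recv()."""
        def __init__(self, sock):
            self._sock = sock
        def send(self, data):
            return self._sock.send(data)
        def recv(self, bufsize):
            return self._sock.recv(bufsize)

    def sandboxed_connect(host, port):
        if (host, port) not in ALLOWED_ENDPOINTS:
            raise SandboxError("connection not allowed")
        try:
            s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            s.connect((host, port))
        except socket.error:
            # An invalid address and a forbidden one raise the same thing,
            # so the sandbox learns nothing about the outside network.
            raise SandboxError("connection not allowed")
        return SocketProxy(s)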
From python-checkins at python.org Fri Jul 7 21:45:47 2006
From: python-checkins at python.org (mateusz.rukowicz)
Date: Fri, 7 Jul 2006 21:45:47 +0200 (CEST)
Subject: [Python-checkins] r50481 - sandbox/trunk/decimal-c/_decimal.c
Message-ID: <20060707194547.AAC131E4007@bag.python.org>
Author: mateusz.rukowicz
Date: Fri Jul 7 21:45:46 2006
New Revision: 50481
Modified:
sandbox/trunk/decimal-c/_decimal.c
Log:
Exponents work OK now everywhere.
Modified: sandbox/trunk/decimal-c/_decimal.c
==============================================================================
--- sandbox/trunk/decimal-c/_decimal.c (original)
+++ sandbox/trunk/decimal-c/_decimal.c Fri Jul 7 21:45:46 2006
@@ -539,8 +539,12 @@
}
#ifdef BIG_EXP
+/* TODO this implementation assumes that every limb, which
+ * is not used is 0. Since it's not really fast, this will
+ * change */
-exp_t exp_from_i(long a) {
+static exp_t
+exp_from_i(long a) {
exp_t ret;
memset(ret.limbs, 0, sizeof(long) * EXP_LIMB_COUNT);
@@ -557,7 +561,7 @@
return ret;
}
-int
+static int
exp_sscanf(char *buf, exp_t *exp) {
int len;
long mul;
@@ -606,7 +610,7 @@
return 1;
}
-int
+static int
exp_sprintf(char *buf, exp_t exp) {
int written = 0;
int tmp;
@@ -631,8 +635,47 @@
return written;
}
+/* must check errors with PyErr_Occurred() */
+static exp_t
+exp_from_pyobj(PyObject *a)
+{
+ if (PyInt_Check(a)) {
+ return exp_from_i(PyInt_AS_LONG(a));
+ }
+ else if (PyLong_Check(a)) {
+ char *bufer;
+ PyObject *strval;
+ exp_t ret;
+ Py_ssize_t buf_len = 0;
+ strval = PyObject_Str(a);
+
+ if (!strval)
+ return ret;
+
+ if (PyObject_AsCharBuffer(strval, (const char **)&bufer, &buf_len) == -1) {
+ Py_DECREF(strval);
+ return ret;
+ }
+
+ if (exp_sscanf(bufer, &ret) != 1) {
+ Py_DECREF(strval);
+ PyErr_SetString(PyExc_TypeError, "exponent must be integer value.");
+ return ret;
+ }
+
+ Py_DECREF(strval);
+ return ret;
+ }
+
+ else {
+ exp_t ret;
+ PyErr_SetString(PyExc_TypeError, "exponent must be integer value.");
+ return ret;
+ }
+}
+
/* there is no overflow checking !*/
-long
+static long
exp_to_i(exp_t exp) {
long mult;
long i;
@@ -654,7 +697,16 @@
return ret;
}
-exp_t
+static PyObject*
+exp_to_pyobj(exp_t exp)
+{
+ char bufer[LOG * EXP_LIMB_COUNT + 5];
+ exp_sprintf(bufer, exp);
+
+ return PyLong_FromString(bufer, 0, 10);
+}
+
+static exp_t
exp_inp_add(exp_t *a, exp_t b) {
if (a->sign == b.sign) {
a->size = _limb_add(a->limbs, a->size * LOG, b.limbs, b.size * LOG, a->limbs);
@@ -688,7 +740,7 @@
return *a;
}
-exp_t
+static exp_t
exp_inp_sub(exp_t *a, exp_t b) {
exp_t tmp_b = b;
tmp_b.sign ^= 1;
@@ -698,47 +750,47 @@
return exp_inp_add(a, tmp_b);
}
-exp_t
+static exp_t
exp_add(exp_t a, exp_t b) {
return exp_inp_add(&a, b);
}
-exp_t
+static exp_t
exp_sub(exp_t a, exp_t b) {
return exp_inp_sub(&a, b);
}
-exp_t
+static exp_t
exp_add_i(exp_t a, long b) {
return exp_add(a, exp_from_i(b));
}
-exp_t
+static exp_t
exp_sub_i(exp_t a, long b) {
return exp_sub(a, exp_from_i(b));
}
-exp_t
+static exp_t
exp_inp_add_i(exp_t *a, long b) {
return exp_inp_add(a, exp_from_i(b));
}
-exp_t
+static exp_t
exp_inp_sub_i(exp_t *a, long b) {
return exp_inp_sub(a, exp_from_i(b));
}
-exp_t
+static exp_t
exp_inc(exp_t *a) {
return exp_inp_add(a, exp_from_i(1));
}
-exp_t
+static exp_t
exp_dec(exp_t *a) {
return exp_inp_sub(a, exp_from_i(1));
}
-exp_t
+static exp_t
exp_mul(exp_t a, exp_t b) {
exp_t ret;
memset(ret.limbs, 0, sizeof(long) *EXP_LIMB_COUNT);
@@ -749,12 +801,12 @@
return ret;
}
-exp_t
+static exp_t
exp_mul_i(exp_t a, long b) {
return exp_mul(a, exp_from_i(b));
}
-exp_t
+static exp_t
exp_div_i(exp_t a, long b, long *remainder) {
exp_t ret;
long i;
@@ -782,8 +834,7 @@
return ret;
}
-/* TODO */
-exp_t
+static exp_t
exp_floordiv_i(exp_t a, long b) {
long remainder;
exp_t ret;
@@ -794,15 +845,14 @@
return ret;
}
-/* TODO */
-int
+static int
exp_mod_i(exp_t a, long b) {
long remainder;
exp_div_i(a, b, &remainder);
return remainder;
}
-int
+static int
exp_cmp(exp_t a, exp_t b) {
int cmp;
if (a.sign != b.sign) {
@@ -819,89 +869,93 @@
return cmp;
}
-int
+static int
exp_cmp_i(exp_t a, long b) {
return exp_cmp(a, exp_from_i(b));
}
-int
+static int
exp_g(exp_t a, exp_t b) {
int cmp = exp_cmp(a, b);
return cmp == 1;
}
-int
+static int
exp_g_i(exp_t a, long b) {
return exp_g(a, exp_from_i(b));
}
-int
+static int
exp_l(exp_t a, exp_t b) {
int cmp = exp_cmp(a, b);
return cmp == -1;
}
-int
+static int
exp_l_i(exp_t a, long b) {
return exp_l(a, exp_from_i(b));
}
-int
+static int
exp_eq(exp_t a, exp_t b) {
int cmp = exp_cmp(a, b);
return cmp == 0;
}
-int
+static int
exp_eq_i(exp_t a, long b) {
return exp_eq(a, exp_from_i(b));
}
-int
+static int
exp_ne(exp_t a, exp_t b) {
int cmp = exp_cmp(a, b);
return cmp != 0;
}
-int
+static int
exp_ne_i(exp_t a, long b) {
return exp_ne(a, exp_from_i(b));
}
-int
+static int
exp_ge(exp_t a, exp_t b) {
int cmp = exp_cmp(a, b);
return (cmp == 1) || (cmp == 0);
}
-int
+static int
exp_ge_i(exp_t a, long b) {
return exp_ge(a, exp_from_i(b));
}
-int
+static int
exp_le(exp_t a, exp_t b) {
int cmp = exp_cmp(a, b);
return (cmp == -1) || (cmp == 0);
}
-int exp_le_i(exp_t a, long b) {
+static int
+exp_le_i(exp_t a, long b) {
return exp_le(a, exp_from_i(b));
}
-int exp_is_zero(exp_t a) {
+static int
+exp_is_zero(exp_t a) {
return a.limbs[0] == 0 && a.size == 1;
}
-int exp_is_neg(exp_t a) {
+static int
+exp_is_neg(exp_t a) {
return !exp_is_zero(a) && a.sign == 1;
}
-int exp_is_pos(exp_t a) {
+static int
+exp_is_pos(exp_t a) {
return !exp_is_zero(a) && a.sign == 0;
}
-exp_t
+static exp_t
exp_neg(exp_t a) {
a.sign ^= 1;
if (a.limbs[0] == 0 && a.size == 1)
@@ -909,7 +963,7 @@
return a;
}
-exp_t
+static exp_t
exp_min(exp_t a, exp_t b) {
int cmp = exp_cmp(a, b);
if (cmp == 1)
@@ -918,7 +972,7 @@
return a;
}
-exp_t
+static exp_t
exp_max(exp_t a, exp_t b) {
int cmp = exp_cmp(a, b);
if(cmp == 1)
@@ -928,40 +982,41 @@
}
#else
-#define exp_add_i(exp, a) ((exp) + (a)) //
-#define exp_add(a, b) ((a) + (b)) //
-#define exp_inc(a) ((*a)++) //
-#define exp_dec(a) ((*a)--) //
-#define exp_is_zero(a) ((a) == 0) //
-#define exp_is_neg(a) ((a) < 0) //
-#define exp_is_pos(a) ((a) > 0) //
-#define exp_inp_add(a,b) ((*(a)) += (b)) //
-#define exp_to_int(a) (a) //
-#define exp_to_i(a) (a) //
-#define exp_from_i(a) (a) //
-#define exp_sub_i(exp, a) ((exp) - (a)) //
-#define exp_sub(a, b) ((a) - (b)) //
-#define exp_g(a, b) ((a) > (b)) //
-#define exp_l(a, b) ((a) < (b)) //
-#define exp_ge(a, b) ((a) >= (b)) //
-#define exp_eq(a, b) ((a) == (b)) //
-#define exp_eq_i(a, b) ((a) == (b)) //
-#define exp_ne(a, b) ((a) != (b)) //
-#define exp_g_i(a, b) ((a) > (b)) //
-#define exp_l_i(a, b) ((a) < (b)) //
-#define exp_ge_i(a, b) ((a) >= (b)) //
-#define exp_le_i(a, b) ((a) <= (b)) //
+/* TODO a little mess here ;P */
+#define exp_add_i(exp, a) ((exp) + (a))
+#define exp_add(a, b) ((a) + (b))
+#define exp_inc(a) ((*a)++)
+#define exp_dec(a) ((*a)--)
+#define exp_is_zero(a) ((a) == 0)
+#define exp_is_neg(a) ((a) < 0)
+#define exp_is_pos(a) ((a) > 0)
+#define exp_inp_add(a,b) ((*(a)) += (b))
+#define exp_to_int(a) (a)
+#define exp_to_i(a) (a)
+#define exp_from_i(a) (a)
+#define exp_sub_i(exp, a) ((exp) - (a))
+#define exp_sub(a, b) ((a) - (b))
+#define exp_g(a, b) ((a) > (b))
+#define exp_l(a, b) ((a) < (b))
+#define exp_ge(a, b) ((a) >= (b))
+#define exp_eq(a, b) ((a) == (b))
+#define exp_eq_i(a, b) ((a) == (b))
+#define exp_ne(a, b) ((a) != (b))
+#define exp_g_i(a, b) ((a) > (b))
+#define exp_l_i(a, b) ((a) < (b))
+#define exp_ge_i(a, b) ((a) >= (b))
+#define exp_le_i(a, b) ((a) <= (b))
#define exp_mod_i(a, b) ((a) % (b))
-#define exp_ge_i(a, b) ((a) >= (b)) //
+#define exp_ge_i(a, b) ((a) >= (b))
#define exp_floordiv_i(a,b) ((a) / (b) - ((a) % (b) && (a) < 0))
-#define exp_inp_sub(a, b) ((*(a)) -= (b)) //
-#define exp_inp_sub_i(a, b) ((*(a)) -= (b)) //
-#define exp_sprintf(a, e) (sprintf(a, "%d", e)) //
-#define exp_sscanf(a, e) (sscanf(a, "%d", e)) //
-#define exp_min(a, b) ((a) < (b) ? (a) : (b)) //
-#define exp_max(a, b) ((a) > (b) ? (a) : (b)) //
-#define exp_neg(a) (-(a)) //
-#define exp_mul_i(a, b) ((a) * (b)) //
+#define exp_inp_sub(a, b) ((*(a)) -= (b))
+#define exp_inp_sub_i(a, b) ((*(a)) -= (b))
+#define exp_sprintf(a, e) (sprintf(a, "%d", e))
+#define exp_sscanf(a, e) (sscanf(a, "%d", e))
+#define exp_min(a, b) ((a) < (b) ? (a) : (b))
+#define exp_max(a, b) ((a) > (b) ? (a) : (b))
+#define exp_neg(a) (-(a))
+#define exp_mul_i(a, b) ((a) * (b))
#endif
@@ -2870,9 +2925,6 @@
ret->limbs[0] = 0;
ret->exp = exp_floordiv_i(self->exp, 2);
-// ret->exp = self->exp / 2;
-// if (self->exp < 0 && self->exp%2)
-// ret->exp --;
ret->sign = self->sign & 1;
return ret;
@@ -2888,9 +2940,6 @@
return NULL;
expadd = exp_floordiv_i(tmp->exp, 2);
-// expadd = tmp->exp/2;
-// if (tmp->exp < 0 && tmp->exp % 2)
-// expadd --;
if (exp_mod_i(tmp->exp, 2)) {
_limb_first_n_digits(self->limbs, self->ob_size, 0, tmp->limbs, tmp->ob_size);
@@ -2940,14 +2989,12 @@
ans->limbs[0] = 819;
ans->exp = exp_sub_i(ADJUSTED(tmp), 2);
tmp2->limbs[0] = 259;
-// tmp2->exp = -2;
exp_inp_sub_i(&(tmp2->exp), 2);
}
else {
ans->limbs[0] = 259;
ans->exp = exp_add_i(tmp->exp, tmp->ob_size - 3);
tmp2->limbs[0] = 819;
-// tmp2->exp = -3;
exp_inp_sub_i(&(tmp2->exp), 3);
}
@@ -3031,7 +3078,6 @@
ctx2->prec = firstprec + 1;
if (exp_ne(prevexp, ADJUSTED(tmp2))) {
_limb_first_n_digits(tmp2->limbs, tmp2->ob_size, 0, ans->limbs, ans->ob_size);
-// ans->exp --;
exp_dec(&(ans->exp));
}
else {
@@ -3114,7 +3160,6 @@
}
}
-// ans->exp += expadd;
exp_inp_add(&(ans->exp), expadd);
ctx2->rounding = rounding;
@@ -3214,7 +3259,6 @@
if (_check_nans(self, NULL, ctx, &nan) != 0)
return nan;
}
-// if (self->exp >= 0) {
if (exp_is_pos(self->exp) || exp_is_zero(self->exp)) {
Py_INCREF(self);
return self;
@@ -3307,8 +3351,8 @@
return PyInt_FromLong(0);
/* XXX: Overflow? */
- /* EXP TODO */
- return PyInt_FromSsize_t(exp_to_i(ADJUSTED(self)));
+/* return PyInt_FromSsize_t(exp_to_i(ADJUSTED(self))); */
+ return exp_to_pyobj(ADJUSTED(self));
}
@@ -3328,9 +3372,16 @@
PyTuple_SET_ITEM(digits, i, d);
}
- /* TODO EXP */
- if(!ISINF(self))
- res = Py_BuildValue("(bOn)", self->sign % 2, digits, exp_to_i(self->exp));
+ if(!ISINF(self)) {
+ PyObject *exp = exp_to_pyobj(self->exp);
+
+ if (!exp) {
+ Py_DECREF(digits);
+ return NULL;
+ }
+ res = Py_BuildValue("(bOO)", self->sign % 2, digits, exp);
+ Py_DECREF(exp);
+ }
else {
inf = PyString_FromString("F");
if (!inf) {
@@ -3406,7 +3457,7 @@
*p++ = '-';
SANITY_CHECK(p);
}
-// if (d->exp < 0 && d->exp >= -6) {
+
if(exp_is_neg(d->exp) && exp_ge_i(d->exp, -6)) {
i = 0;
*p++ = '0';
@@ -3438,7 +3489,6 @@
*p++ = '+';
SANITY_CHECK(p);
-// p += sprintf(p, "%d", roundexp);
p += exp_sprintf(p, roundexp);
SANITY_CHECK(p);
}
@@ -3512,13 +3562,11 @@
adjexp = exp_from_i(0);
dotplace = exp_add_i(d->exp, d->ob_size);
/* dotplace must be at most d->ob_size (no dot at all) and at last -5 (6 pre-zeros)*/
-// if(dotplace >d->ob_size || dotplace <-5)
if (exp_g_i(dotplace, d->ob_size) || exp_l_i(dotplace, -5))
{
/* ok, we have to put dot after 1 digit, that is dotplace = 1
* we compute adjexp from equation (1) */
dotplace = exp_from_i(1);
-// adjexp = -dotplace + d->exp + d->ob_size;
adjexp = exp_add_i(exp_sub(d->exp,dotplace), d->ob_size);
}
@@ -3531,17 +3579,13 @@
/* now all we have to do, is to put it to the string =] */
-// if(dotplace > d->ob_size)
if (exp_g_i(dotplace, d->ob_size))
extra_zeros = exp_to_i(exp_sub_i(dotplace, d->ob_size));
-// if(dotplace <= 0)
if (exp_le_i(dotplace, 0))
{
*p++ = '0';
*p++ = '.';
-// while(dotplace++ < 0)
-// *p++ = '0';
while (exp_l_i(dotplace, 0)) {
exp_inc(&dotplace);
*p++ = '0';
@@ -3778,7 +3822,6 @@
ans2 = _decimal_get_copy(self);
if (!ans2)
return NULL;
- //ans2->exp = self->exp < other->exp ? self->exp : other->exp;
ans2->exp = exp_min(self->exp, other->exp);
ans1 = _NEW_decimalobj(1, sign, exp_from_i(0));
if (!ans1) {
@@ -3864,7 +3907,6 @@
if (divmod == 1 || divmod == 3) {
decimalobject *tmp;
exp_t exp;
-// exp = self->exp < other->exp ? self->exp : other->exp;
exp = exp_min(self->exp, other->exp);
tmp = _decimal_rescale(self, exp, ctx, -1, 0);
@@ -3979,7 +4021,6 @@
* expdiff >= (-exp - LOG)/ LOG */
if (divmod) {
assert(!(exp_mod_i(exp, LOG)));
-// min_expdiff = (-exp - LOG)/ LOG;
min_expdiff = exp_sub_i(exp_neg(exp), LOG);
min_expdiff = exp_floordiv_i(min_expdiff, LOG);
@@ -3993,7 +4034,6 @@
significant_limbs, remainder, exp_to_i(min_expdiff)));
result->limbs[significant_limbs] = 0;
-// exp += expdiff * LOG + LOG;
exp_inp_add(&exp, exp_add_i(exp_mul_i(expdiff, LOG), LOG));
rlimbs = _limb_size(remainder, remainder_limbs);
/* remainder non-zero */
@@ -4007,7 +4047,6 @@
}
if (!divmod) {
-// exp -= LOG;
exp_inp_sub_i(&exp, LOG);
result->limb_count ++;
result->ob_size += LOG;
@@ -4066,7 +4105,6 @@
break;
_limb_cut_one_digit(result->limbs, result->ob_size);
result->ob_size --;
-// result->exp ++;
exp_inc(&(result->exp));
}
@@ -4339,7 +4377,6 @@
}
shouldround = (ctx->rounding_dec == ALWAYS_ROUND);
-// exp = (self->exp < other->exp ? self->exp : other->exp);
exp = exp_min(self->exp, other->exp);
if (ctx->rounding == ROUND_FLOOR &&
(self->sign & 1) != (other->sign & 1))
@@ -4359,9 +4396,7 @@
return res;
}
if (!decimal_nonzero(self)) {
-// oexp = other->exp - ctx->prec - 1;
oexp = exp_sub_i(other->exp, ctx->prec + 1);
-// exp = (exp > oexp ? exp : oexp);
exp = exp_max(exp, oexp);
res = _decimal_rescale(other, exp, ctx, 0, 0);
if (!res) return NULL;
@@ -4374,9 +4409,7 @@
}
}
if (!decimal_nonzero(other)) {
-// oexp = self->exp - ctx->prec - 1;
oexp = exp_sub_i(self->exp, ctx->prec + 1);
-// exp = (exp > oexp ? exp : oexp);
exp = exp_max(exp, oexp);
res = _decimal_rescale(self, exp, ctx, 0, 0);
if (!res) return NULL;
@@ -5503,14 +5536,12 @@
if(e) /* pretty obvious */
{
int ok;
-// ok = sscanf(e+1,"%d", &exp);
ok = exp_sscanf(e+1, &exp);
if(ok!=1)
{
goto err;
}
}
-// exp -= digits_after_dot;
exp_inp_sub_i(&exp, digits_after_dot);
new = _new_decimalobj(type, ndigits, sign, exp);
if(!new)
@@ -6297,16 +6328,14 @@
static PyObject *
context_Etiny(contextobject *self)
{
- /* TODO EXP */
- return PyInt_FromSsize_t(exp_to_i(ETINY(self)));
+ return exp_to_pyobj(ETINY(self));
}
static PyObject *
context_Etop(contextobject *self)
{
- /* TODO EXP */
- return PyInt_FromSsize_t(exp_to_i(ETOP(self)));
+ return exp_to_pyobj(ETOP(self));
}
@@ -6536,16 +6565,27 @@
}
-/* TODO EXP */
static PyObject *
context_reduce(contextobject *self)
{
+ PyObject *Emin, *Emax, *ret;
+ Emin = exp_to_pyobj(self->Emin);
+ if (!Emin)
+ return NULL;
+ Emax = exp_to_pyobj(self->Emax);
+ if (!Emax) {
+ Py_DECREF(Emin);
+ return NULL;
+ }
/* Yes, it seems to be as simple as that. */
- return Py_BuildValue("(O(liiOOlliiO))", self->ob_type,
+ ret = Py_BuildValue("(O(liiOOOOiiO))", self->ob_type,
self->prec, self->rounding,
self->rounding_dec, self->traps, self->flags,
- exp_to_i(self->Emin), exp_to_i(self->Emax), self->capitals,
+ Emin, Emax, self->capitals,
self->clamp, self->ignored);
+ Py_DECREF(Emin);
+ Py_DECREF(Emax);
+ return ret;
}
@@ -6940,6 +6980,8 @@
char roundstr[20] = "ROUND_";
char flags[250] = "["; /* 250 is enough for 12 error names of max. 17 chars */
char traps[250] = "["; /* and commas inbetween. */
+ char Emin[LOG * EXP_LIMB_COUNT + 1];
+ char Emax[LOG * EXP_LIMB_COUNT + 1];
long flaglen = 1, traplen = 1;
switch (self->rounding) {
@@ -6987,9 +7029,12 @@
traps[traplen-2] = ']';
traps[traplen-1] = 0;
- return PyString_FromFormat("Context(prec=%ld, rounding=%s, Emin=%ld, "
- "Emax=%ld, capitals=%d, flags=%s, traps=%s)",
- self->prec, roundstr, self->Emin, self->Emax,
+ exp_sprintf(Emax, self->Emax);
+ exp_sprintf(Emin, self->Emin);
+
+ return PyString_FromFormat("Context(prec=%ld, rounding=%s, Emin=%s, "
+ "Emax=%s, capitals=%d, flags=%s, traps=%s)",
+ self->prec, roundstr, Emin, Emax,
self->capitals, flags, traps);
}
@@ -7004,6 +7049,7 @@
}
+/* TODO take a closer look at refs when error */
static PyObject *
context_new(PyTypeObject *type, PyObject *args, PyObject *kwds)
{
@@ -7011,8 +7057,7 @@
"traps", "flags", "Emin", "Emax",
"capitals", "_clamp", "_ignored_flags", 0};
long prec = LONG_MIN;
-// long Emin = LONG_MAX, Emax = LONG_MIN;
- long TEmin, TEmax;
+ PyObject *TEmin = NULL, *TEmax = NULL;
exp_t Emin, Emax;
int rounding = -1, rounding_dec = -1, capitals = -1, clamp = 0;
PyObject *pytraps = NULL, *pyflags = NULL, *pyignored = NULL;
@@ -7022,17 +7067,30 @@
PyObject *_ignored = NULL;
int i, j;
- TEmin = LONG_MAX;
- TEmax = LONG_MIN;
- /* TODO EXP */
- if (!PyArg_ParseTupleAndKeywords(args, kwds, "|liiOOlliiO:Context", kwlist,
+ if (!PyArg_ParseTupleAndKeywords(args, kwds, "|liiOOOOiiO:Context", kwlist,
&prec, &rounding, &rounding_dec, &pytraps,
&pyflags, &TEmin, &TEmax, &capitals, &clamp,
&pyignored))
return NULL;
- Emin = exp_from_i(TEmin);
- Emax = exp_from_i(TEmax);
+
+ if (!TEmin)
+ Emin = exp_from_i(LONG_MAX);
+ else {
+ Emin = exp_from_pyobj(TEmin);
+ if (PyErr_Occurred()) {
+ goto err;
+ }
+ }
+
+ if (!TEmax)
+ Emax = exp_from_i(LONG_MIN);
+ else {
+ Emax = exp_from_pyobj(TEmax);
+ if (PyErr_Occurred()) {
+ goto err;
+ }
+ }
if (pytraps == NULL) {
_traps = PyDict_Copy(PyDecimal_DefaultContext->traps);
@@ -7177,8 +7235,6 @@
static PyMemberDef context_members[] = {
{"prec", T_LONG, OFF(prec), 0},
-// {"Emin", T_LONG, OFF(Emin), 0},
-// {"Emax", T_LONG, OFF(Emax), 0},
{"capitals", T_INT, OFF(capitals), 0},
{"rounding", T_INT, OFF(rounding), 0},
{"_rounding_decision", T_INT, OFF(rounding_dec), 0},
@@ -7191,32 +7247,30 @@
static PyObject *
context_get_emax(contextobject *self) {
- return PyInt_FromLong(exp_to_i(self->Emax));
+ return exp_to_pyobj(self->Emax);
}
-/* TODO EXP */
static int
context_set_emax(contextobject *self, PyObject *value) {
- long var = PyInt_AsLong(value);
+ exp_t tmp = exp_from_pyobj(value);
if (PyErr_Occurred())
return -1;
- self->Emax = exp_from_i(var);
+ self->Emax = tmp;
return 0;
}
static PyObject *
context_get_emin(contextobject *self) {
- return PyInt_FromLong(exp_to_i(self->Emin));
+ return exp_to_pyobj(self->Emin);
}
-/* TODO EXP */
static int
context_set_emin(contextobject *self, PyObject *value) {
- long var = PyInt_AsLong(value);
+ exp_t tmp = exp_from_pyobj(value);
if (PyErr_Occurred())
return -1;
- self->Emin = exp_from_i(var);
+ self->Emin = tmp;
return 0;
}
From python-checkins at python.org Fri Jul 7 21:47:25 2006
From: python-checkins at python.org (matt.fleming)
Date: Fri, 7 Jul 2006 21:47:25 +0200 (CEST)
Subject: [Python-checkins] r50482 - in sandbox/trunk/pdb: mconnection.py
test/test_mconnection.py
Message-ID: <20060707194725.CDF341E4007@bag.python.org>
Author: matt.fleming
Date: Fri Jul 7 21:47:25 2006
New Revision: 50482
Modified:
sandbox/trunk/pdb/mconnection.py
sandbox/trunk/pdb/test/test_mconnection.py
Log:
Raise exception on pdbserver-side if client connection closes unexpectedly.
Modified: sandbox/trunk/pdb/mconnection.py
==============================================================================
--- sandbox/trunk/pdb/mconnection.py (original)
+++ sandbox/trunk/pdb/mconnection.py Fri Jul 7 21:47:25 2006
@@ -138,6 +138,8 @@
line = self.input.recv(bufsize)
except socket.error, e:
raise ReadError, e[1]
+ if len(line) == 0:
+ raise ReadError, 'Connection closed'
if line[-1] != '\n': line += '\n'
return line
@@ -155,6 +157,7 @@
""" Specify the address to connection to. """
MConnectionInterface.__init__(self)
self._sock = self.output = self.input = None
+ self.connected = True
def connect(self, addr):
"""Connect to the server. 'input' reads data from the
@@ -171,6 +174,7 @@
self._sock.connect((self.host, self.port))
except socket.error, e:
raise ConnectionFailed, e[1]
+ self.connected = True
def write(self, msg):
try:
@@ -183,6 +187,8 @@
line = self._sock.recv(bufsize)
except socket.error, e:
raise ReadError, e[1]
+ if len(line) == 0:
+ raise ReadError, 'Connection closed'
return line
def disconnect(self):
@@ -192,5 +198,7 @@
return
else:
self._sock.close()
+ self._sock = None
+ self.connected = False
Modified: sandbox/trunk/pdb/test/test_mconnection.py
==============================================================================
--- sandbox/trunk/pdb/test/test_mconnection.py (original)
+++ sandbox/trunk/pdb/test/test_mconnection.py Fri Jul 7 21:47:25 2006
@@ -19,7 +19,7 @@
sys.path.append("..")
from mconnection import (MConnectionServerTCP, MConnectionClientTCP,
- MConnectionSerial, ConnectionFailed)
+ MConnectionSerial, ConnectionFailed, ReadError)
# Try to connect the client to addr either until we've tried MAXTRIES
# times or until it succeeds.
@@ -27,8 +27,7 @@
for i in range(MAXTRIES):
try:
client.connect(addr)
- # The _sock variable appears when there's a connection
- if client._sock: break
+ if client.connected: break
except ConnectionFailed:
pass
@@ -96,7 +95,26 @@
"""(tcp) Test invald hostname, port pair. """
addr = 'localhost 8000'
self.assertRaises(ConnectionFailed, self.server.connect, addr)
-
+
+ def testServerReadError(self):
+ """(tcp) Test the ReadError exception."""
+ thread.start_new_thread(self.server.connect, (__addr__,))
+
+ while self.server._sock == None:
+ pass
+
+ repeatedConnect(self.client, __addr__)
+ self.client.disconnect()
+ self.assertRaises(ReadError, self.server.readline)
+
+ self.server.disconnect()
+
+ thread.start_new_thread(self.client.connect, (__addr__,))
+ self.server.connect(__addr__)
+
+ self.server.disconnect()
+ self.assertRaises(ReadError, self.client.readline)
+
def tearDown(self):
self.server.disconnect()
self.client.disconnect()
From nnorwitz at gmail.com Fri Jul 7 22:08:31 2006
From: nnorwitz at gmail.com (Neal Norwitz)
Date: Fri, 7 Jul 2006 13:08:31 -0700
Subject: [Python-checkins] r50482 - in sandbox/trunk/pdb: mconnection.py
test/test_mconnection.py
In-Reply-To: <20060707194725.CDF341E4007@bag.python.org>
References: <20060707194725.CDF341E4007@bag.python.org>
Message-ID:
On 7/7/06, matt.fleming wrote:
> Author: matt.fleming
> Date: Fri Jul 7 21:47:25 2006
> New Revision: 50482
>
> Modified:
> sandbox/trunk/pdb/mconnection.py
> sandbox/trunk/pdb/test/test_mconnection.py
> Log:
> Raise exception on pdbserver-side if client connection closes unexpectedly.
>
>
> Modified: sandbox/trunk/pdb/mconnection.py
> ==============================================================================
> --- sandbox/trunk/pdb/mconnection.py (original)
> +++ sandbox/trunk/pdb/mconnection.py Fri Jul 7 21:47:25 2006
> @@ -138,6 +138,8 @@
> line = self.input.recv(bufsize)
> except socket.error, e:
> raise ReadError, e[1]
> + if len(line) == 0:
> + raise ReadError, 'Connection closed'
if not line: # is shorter and easier to read.
n
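The check under discussion relies on standard socket behaviour: recv() returns an empty
string once the peer has closed its end, so an empty read means the connection is gone.
A small free-standing illustration using Neal's suggested spelling (the helper is
hypothetical, not mconnection.py's code):

    import socket

    class ReadError(IOError):
        pass

    def read_once(sock, bufsize=1024):
        try:
            data = sock.recv(bufsize)
        except socket.error, e:
            raise ReadError(e[1])
        if not data:                 # empty read: peer closed the connection
            raise ReadError('Connection closed')
        return data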
From python-checkins at python.org Fri Jul 7 22:16:43 2006
From: python-checkins at python.org (guido.van.rossum)
Date: Fri, 7 Jul 2006 22:16:43 +0200 (CEST)
Subject: [Python-checkins] r50483 - peps/trunk/pep-3103.txt
Message-ID: <20060707201643.20EE81E4007@bag.python.org>
Author: guido.van.rossum
Date: Fri Jul 7 22:16:41 2006
New Revision: 50483
Modified:
peps/trunk/pep-3103.txt
Log:
Switch my preference to syntax alternative 1.
Modified: peps/trunk/pep-3103.txt
==============================================================================
--- peps/trunk/pep-3103.txt (original)
+++ peps/trunk/pep-3103.txt Fri Jul 7 22:16:41 2006
@@ -65,7 +65,7 @@
275 here. There are lots of other possibilities, but I don't see that
they add anything.
-My current preference is alternative 2.
+I've recently been converted to alternative 1.
I should note that all alternatives here have the "implicit break"
property: at the end of the suite for a particular case, the control
@@ -95,7 +95,8 @@
SUITE
The main downside is that the suites where all the action is are
-indented two levels deep.
+indented two levels deep; this can be remedied by indenting the cases
+"half a level" (e.g. 2 spaces if the general indentation level is 4).
Alternative 2
-------------
@@ -112,6 +113,11 @@
else:
SUITE
+Some reasons not to choose this include expected difficulties for
+auto-indenting editors, folding editors, and the like; and confused
+users. There are no situations currently in Python where a line
+ending in a colon is followed by an unindented line.
+
Alternative 3
-------------
From python-checkins at python.org Fri Jul 7 22:18:14 2006
From: python-checkins at python.org (mateusz.rukowicz)
Date: Fri, 7 Jul 2006 22:18:14 +0200 (CEST)
Subject: [Python-checkins] r50484 - sandbox/trunk/decimal-c/_decimal.c
sandbox/trunk/decimal-c/decimal.h
Message-ID: <20060707201814.A66AF1E4007@bag.python.org>
Author: mateusz.rukowicz
Date: Fri Jul 7 22:18:13 2006
New Revision: 50484
Modified:
sandbox/trunk/decimal-c/_decimal.c
sandbox/trunk/decimal-c/decimal.h
Log:
Some minor speed-ups with exp. Fixed a compile error when BIG_EXP is not defined.
Modified: sandbox/trunk/decimal-c/_decimal.c
==============================================================================
--- sandbox/trunk/decimal-c/_decimal.c (original)
+++ sandbox/trunk/decimal-c/_decimal.c Fri Jul 7 22:18:13 2006
@@ -546,8 +546,8 @@
static exp_t
exp_from_i(long a) {
exp_t ret;
+ long i;
- memset(ret.limbs, 0, sizeof(long) * EXP_LIMB_COUNT);
ret.limbs[0] = a;
ret.sign = 0;
@@ -556,7 +556,14 @@
ret.limbs[0] *= -1;
}
- ret.size = _limb_normalize(ret.limbs, EXP_LIMB_COUNT);
+ for (i=0 ; isign = 0;
exp->size = 1;
- memset(exp->limbs, 0, sizeof(long) * EXP_LIMB_COUNT);
if (buf[0] == '-') {
exp->sign = 1;
@@ -598,7 +604,10 @@
if (limb >= EXP_LIMB_COUNT)
return 0;
- exp->limbs[limb] += mul * (buf[i] - '0');
+ if (mul == 1)
+ exp->limbs[limb] = buf[i] - '0';
+ else
+ exp->limbs[limb] += mul * (buf[i] - '0');
mul *= 10;
if (mul == BASE) {
@@ -717,7 +726,7 @@
cmp = _limb_compare(a->limbs, a->size, b.limbs, b.size);
if (cmp == 0) {
- memset(a->limbs, 0, sizeof(long) * a->size);
+ a->limbs[0] = 0;
a->size = 1;
a->sign = 0;
}
@@ -728,7 +737,6 @@
if (cmp == -1) {
exp_t tmp;
- memset(tmp.limbs, 0, sizeof(long) * EXP_LIMB_COUNT);
tmp.size = _limb_sub(b.limbs, b.size * LOG, a->limbs, a->size * LOG, tmp.limbs);
tmp.size = (tmp.size + LOG -1) / LOG;
*a = tmp;
@@ -793,7 +801,7 @@
static exp_t
exp_mul(exp_t a, exp_t b) {
exp_t ret;
- memset(ret.limbs, 0, sizeof(long) *EXP_LIMB_COUNT);
+/* memset(ret.limbs, 0, sizeof(long) *EXP_LIMB_COUNT); */
ret.size = _limb_multiply_core(a.limbs, a.size, b.limbs, b.size, ret.limbs);
ret.sign = a.sign ^ b.sign;
if (ret.size == 1 && ret.limbs[0] == 0)
@@ -814,7 +822,7 @@
for (i=0 ; i=0; i--) {
@@ -824,7 +832,7 @@
*remainder %= b;
}
- ret.size = EXP_LIMB_COUNT;
+ ret.size = a.size;
while (!ret.limbs[ret.size-1] && ret.size > 1) ret.size --;
ret.sign = a.sign ^ (b < 0);
@@ -1017,6 +1025,8 @@
#define exp_max(a, b) ((a) > (b) ? (a) : (b))
#define exp_neg(a) (-(a))
#define exp_mul_i(a, b) ((a) * (b))
+#define exp_to_pyobj(a) (PyInt_FromLong(a))
+#define exp_from_pyobj(a) (PyInt_AsLong(a))
#endif
@@ -6980,8 +6990,13 @@
char roundstr[20] = "ROUND_";
char flags[250] = "["; /* 250 is enough for 12 error names of max. 17 chars */
char traps[250] = "["; /* and commas inbetween. */
+#ifdef BIG_EXP
char Emin[LOG * EXP_LIMB_COUNT + 1];
char Emax[LOG * EXP_LIMB_COUNT + 1];
+#else
+ char Emin[20];
+ char Emax[20];
+#endif
long flaglen = 1, traplen = 1;
switch (self->rounding) {
Modified: sandbox/trunk/decimal-c/decimal.h
==============================================================================
--- sandbox/trunk/decimal-c/decimal.h (original)
+++ sandbox/trunk/decimal-c/decimal.h Fri Jul 7 22:18:13 2006
@@ -11,7 +11,9 @@
#ifdef BIG_EXP
-#define EXP_LIMB_COUNT 10 /* maximal number of limbs per exp */
+/* At the moment this has significant speed effect, since
+ * there is many exp_t copied */
+#define EXP_LIMB_COUNT 10 /* maximal number of limbs per exp */
typedef struct {
long limbs[EXP_LIMB_COUNT];
int sign;
From python-checkins at python.org Fri Jul 7 22:28:03 2006
From: python-checkins at python.org (matt.fleming)
Date: Fri, 7 Jul 2006 22:28:03 +0200 (CEST)
Subject: [Python-checkins] r50485 - in sandbox/trunk/pdb: mconnection.py
test/test_mconnection.py
Message-ID: <20060707202803.C8D101E4007@bag.python.org>
Author: matt.fleming
Date: Fri Jul 7 22:28:02 2006
New Revision: 50485
Modified:
sandbox/trunk/pdb/mconnection.py
sandbox/trunk/pdb/test/test_mconnection.py
Log:
Make sure we don't reuse the server address: tests passed on Linux but were hanging on Windows. Explicitly tell the test not to reuse the addr.
Modified: sandbox/trunk/pdb/mconnection.py
==============================================================================
--- sandbox/trunk/pdb/mconnection.py (original)
+++ sandbox/trunk/pdb/mconnection.py Fri Jul 7 22:28:02 2006
@@ -138,7 +138,7 @@
line = self.input.recv(bufsize)
except socket.error, e:
raise ReadError, e[1]
- if len(line) == 0:
+ if not line:
raise ReadError, 'Connection closed'
if line[-1] != '\n': line += '\n'
return line
@@ -187,7 +187,7 @@
line = self._sock.recv(bufsize)
except socket.error, e:
raise ReadError, e[1]
- if len(line) == 0:
+ if not line:
raise ReadError, 'Connection closed'
return line
Modified: sandbox/trunk/pdb/test/test_mconnection.py
==============================================================================
--- sandbox/trunk/pdb/test/test_mconnection.py (original)
+++ sandbox/trunk/pdb/test/test_mconnection.py Fri Jul 7 22:28:02 2006
@@ -78,9 +78,9 @@
thread.start_new_thread(repeatedConnect, (self.client, __addr__))
self.server.connect(__addr__)
- # Set up second server on same port
+ # Set up second server on same port and do not reuse the addr
s = MConnectionServerTCP()
- self.assertRaises(ConnectionFailed, s.connect, __addr__)
+ self.assertRaises(ConnectionFailed, s.connect, __addr__, False)
def testInvalidServerAddress(self):
"""(tcp) Connect to an invalid hostname. """
@@ -102,8 +102,12 @@
while self.server._sock == None:
pass
-
+
repeatedConnect(self.client, __addr__)
+
+ # Wait to make _absolutely_ sure that the client has connected
+ while self.server.output == None:
+ pass
self.client.disconnect()
self.assertRaises(ReadError, self.server.readline)
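The log message is about SO_REUSEADDR: a listening socket created with that option can
rebind an address another socket is still holding (and on Windows the semantics are even
looser than on Linux), which is presumably why the second server was not failing the way
the test expects. A sketch of what making reuse optional might look like; the function
and its reuse_addr parameter are illustrative, not MConnectionServerTCP's actual
signature:

    import socket

    def make_listener(host, port, reuse_addr=True):
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        if reuse_addr:
            # Allows rebinding an address in TIME_WAIT (and, on Windows,
            # even one that is actively bound), so tests that expect a
            # second bind to fail must turn this off.
            s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        s.bind((host, port))
        s.listen(1)
        return s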
From python-checkins at python.org Sat Jul 8 00:02:54 2006
From: python-checkins at python.org (brett.cannon)
Date: Sat, 8 Jul 2006 00:02:54 +0200 (CEST)
Subject: [Python-checkins] r50486 -
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Message-ID: <20060707220254.D717E1E4018@bag.python.org>
Author: brett.cannon
Date: Sat Jul 8 00:02:51 2006
New Revision: 50486
Modified:
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Log:
Spellcheck and rewording pass.
Modified: python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
==============================================================================
--- python/branches/bcannon-sandboxing/sandboxing_design_doc.txt (original)
+++ python/branches/bcannon-sandboxing/sandboxing_design_doc.txt Sat Jul 8 00:02:51 2006
@@ -9,7 +9,7 @@
enough information to understand the goals for sandboxing, what
considerations were made for the design, and the actual design itself. Design
decisions should be clear and explain not only why they were chosen but
-possible drawbacks from taking that approach.
+possible drawbacks from taking a specific approach.
If any of the above is found not to be true, please email me at
brett at python.org and let me know what problems you are having with the
@@ -23,18 +23,25 @@
* python-dev convince me that hiding 'file' possible?
+ based on that, handle code objects
+ also decide how to handle sockets
+ + perhaps go with crippling but try best effort on hiding reference and if
+ best effort holds up eventually shift over to capabilities system
* resolve to IP at call time to prevent DNS man-in-the-middle attacks when
allowing a specific host name?
-* what network inko functions are allowed by default?
+* what network info functions are allowed by default?
* does the object.__subclasses__() trick work across interpreters, or is it
unique per interpreter?
* figure out default whitelist of extension modules
* check default accessible objects for file path exposure
* helper functions to get at StringIO instances for stdin, stdout, and friends?
* decide on what type of objects (e.g., PyStringObject or const char *) are to
- be passed into PySandbox_*Extended*() functions
+ be passed in
* all built-ins properly protected?
-* exactly how to tell whether argument to open() is a path, IP, or host name.
+* exactly how to tell whether argument to open() is a path, IP, or host name
+ (third argument, 'n' prefix for networking, format of path, ...)
+* API at the Python level
+* for extension module protection, allow for wildcard allowance
+ (e.g., ``xml.*``)
+
Goal
=============================
@@ -61,15 +68,11 @@
representation (e.g., memory) to things that are more abstract and specific to
the interpreter (e.g., sys.path).
-Throughout this document, the term "sandoxing" will be used. It can also be
-substituted to mean "restricted execution" if one prefers.
-
When referring to the state of an interpreter, it is either "unprotected" or
"sandboxed". A unprotected interpreter has no restrictions imposed upon any
-resource. A sandboxed interpreter has at least one, possibly more, resources
+resource. A sandboxed interpreter has at least one, possibly more, resource
with restrictions placed upon it to prevent unsafe code that is running
-within the interpreter to cause harm to the system (the interpreter
-itself is never considered unsafe).
+within the interpreter to cause harm to the system.
.. contents::
@@ -79,8 +82,8 @@
/////////////////////////////
All use cases are based on how many sandboxed interpreters are
-running in a single process and whether an unprotected interpreter is also running
-or not. They can be broken down into two categories: when the interpreter is
+running in a single process and whether an unprotected interpreter is also running. The
+use cases can be broken down into two categories: when the interpreter is
embedded and only using sandboxed interpreters, and when pure Python code is
running in an unprotected interpreter and uses sandboxed interpreters.
@@ -100,7 +103,7 @@
When multiple interpreters, all sandboxed at varying levels, need to be running
within a single application. This is the key use case that this proposed
-design is targetted for.
+design is targeted for.
Stand-Alone Python
@@ -114,19 +117,17 @@
Issues to Consider
=============================
-Common to all use cases, resources that the interpreter requires at a level
-below user code to be used unhindered cannot be exposed to a sandboxed
+Common to all use cases, resources that the interpreter requires to function at a level
+below user code cannot be exposed to a sandboxed
interpreter. For instance, the interpreter might need to stat a file to see if
-it is possible to import. If the abililty to stat a file is not allowed to a
+it is possible to import. If the ability to stat a file is not allowed to a
sandboxed interpreter, it should not be allowed to perform that action,
regardless of whether the interpreter at a level below user code needs that
ability.
When multiple interpreters are involved (sandboxed or not), not allowing an interpreter
to gain access to resources available in other interpreters without explicit
-permission must be enforced. It would be a security violation, for instance,
-if a sandboxed interpreter manages to gain access to an unprotected instance of
-the 'file' object from an unprotected interpreter without being given that object.
+permission must be enforced.
Resources to Protect
@@ -148,7 +149,7 @@
what OS the interpreter is running on, for instance.
-Physical Resources
+Memory
===================
Memory should be protected. It is a limited resource on the system that can
@@ -176,8 +177,8 @@
Executing hostile bytecode that might lead to undesirable effects is another
possible issue.
-There is also the issue of taking it over. If one is able to gain escalated
-privileges in any way without explicit permission is an issue.
+There is also the issue of taking it over. One should not be able to gain escalated
+privileges in any way without explicit permission.
Types of Security
@@ -210,9 +211,9 @@
of the current interpreter, and if it is allowed to, return a proxy object for
the file that only allows reading from it. The 'file' instance for the proxy
would need to be properly hidden so that the reference was not reachable from
-outside so that 'file' access could stil be controlled.
+outside so that 'file' access could still be controlled.
-Python, as it stands now, unfortunately does not work well for a pure capabilities sytem.
+Python, as it stands now, unfortunately does not work well for a pure capabilities system.
Capabilities require the prohibition of certain abilities, such as
"direct access to another's private state" [#paradigm regained]_. This
obviously is not possible in Python since, at least at the Python level, there
@@ -224,7 +225,7 @@
attribute.
Python's introspection abilities also do not help make implementing
-capabilities that much easier. Consider accessing 'file' even when it is deleted from
+capabilities that much easier. Consider how one could access 'file' even when it is deleted from
__builtin__. You can still get to the reference for 'file' through the sequence returned by
``object.__subclasses__()``.
@@ -240,8 +241,7 @@
method level.
By performing the security check every time a resource's method is called the
-worry of a specific resource's reference leaking out to insecure code is alleviated
-since the resource cannot be used without authorizing it upon every method call. This does add extra overhead, though,
+worry of a specific resource's reference leaking out to insecure code is alleviated. This does add extra overhead, though,
by having to do so many security checks. It also does not handle the situation
where an unexpected exposure of a type occurs that has not been properly
crippled.
@@ -295,7 +295,7 @@
The 'rexec' Module
///////////////////////////////////////
-The 'rexec' module [#rexec]_ was original attempt at providing a sandbox
+The 'rexec' module [#rexec]_ was the original attempt at providing a sandbox
environment for Python code to run in. It's design was based on Safe-Tcl which
was essentially a capabilities system
[#safe-tcl]_. Safe-Tcl
@@ -308,13 +308,11 @@
against a whitelist of modules. You could also restrict the type of modules to
import based on whether they were Python source, bytecode, or C extensions.
Built-ins were allowed except for a blacklist of built-ins to not provide.
+One could restrict whether stdin,
+stdout, and stderr were provided or not on a per-RExec basis.
Several other protections were provided; see documentation for the complete
list.
-With an RExec object created, one could pass in strings of code to be executed
-and have the result returned. One could restrict whether stdin,
-stdout, and stderr were provided or not on a per-RExec basis.
-
The ultimate undoing of the 'rexec' module was how access to objects that in
normal Python require no imports to reach was handled. Importing modules
requires a direct action, and thus can be protected against directly in the
@@ -348,8 +346,8 @@
Below is a list of what the security implementation assumes, along with what section of this document that addresses
that part of the security model (if not already true in Python by default).
-The term "bare" when in terms
-of an interpreter means an interpreter that has not performed a single import
+The term "bare" when in regards to
+an interpreter means an interpreter that has not performed a single import
of a module. Also, all comments refer to a sandboxed interpreter unless
otherwise explicitly stated.
@@ -357,6 +355,7 @@
whether memory should be protected. This list is meant to make clear at a more
basic level what the security model is assuming is true.
+* The Python interpreter itself is always trusted.
* The Python interpreter cannot be crashed by valid Python source code in a
bare interpreter.
* Python source code is always considered safe.
@@ -374,15 +373,15 @@
technical need to share extension module instances between interpreters.
* When starting a sandboxed interpreter, it starts with a fresh built-in and
global namespace that is not shared with the interpreter that started it.
- Objects in the built-in namespace should be safe to use
- [`Reading/Writing Files`_, `Stdin, Stdout, and Stderr`_].
+* Objects in the default built-in namespace should be safe to use
+ [`Reading/Writing Files`_, `Stdin, Stdout, and Stderr`_].
+ Either hide the dangerous ones or cripple them so they can cause no harm.
There are also some features that might be desirable, but are not being
addressed by this security model.
-* Communication between an unprotected interpreter and a sandboxed interpreter
- it created in any direction.
+* Communication in any direction between an unprotected interpreter and a sandboxed interpreter
+ it created.
The Proposed Approach
@@ -396,7 +395,7 @@
Implementation Details
===============================
-Support for sandboxed interpreters will be a compilation option. This allows the
+Support for sandboxed interpreters will require a compilation flag. This allows the
more common case of people not caring about protections to not take a
performance hit. And even when Python is compiled for
sandboxed interpreter restrictions, when the running interpreter *is*
@@ -414,13 +413,15 @@
explicit and helps make sure you set protections for the exact interpreter you
mean to. All functions that set protections begin with the prefix
``PySandbox_Set*()``. These functions are meant to only work with sandboxed interpreters
-that have not been used yet to execute any Python code.
+that have not been used yet to execute any Python code. The calls must be made
+by the code creating and handling the sandboxed interpreter *before* the
+sandboxed interpreter is used to execute any Python code.
The functions for checking for permissions are actually macros that take
in at least an error return value for the function calling the macro. This
-allows the macro to return for the caller if the check failed and cause the
+allows the macro to return on behalf of the caller if the check fails and cause the
SandboxError
-exception to be propagated. This helps eliminate any coding errors from
+exception to be propagated automatically. This helps eliminate any coding errors from
incorrectly checking a return value on a rights-checking function call. For
the rare case where this functionality is disliked, just make the check in a
utility function and check that function's return value (but this is strongly
@@ -471,7 +472,8 @@
Possible Security Flaws
-----------------------
-If code makes direct calls to malloc/free instead of using the proper PyMem_*()
+If code makes direct calls to malloc/free instead of using the proper
+``PyMem_*()``
macros then the security check will be circumvented. But C code is *supposed*
to use the proper macros or pymalloc and thus this issue is not with the
security model but with code not following Python coding standards.
@@ -480,20 +482,20 @@
API
--------------
-* int PySandbox_SetMemoryCap(PyThreadState *, Py_ssize_t)
+* int PySandbox_SetMemoryCap(PyThreadState *, integer)
Set the memory cap for a sandboxed interpreter. If the interpreter is not
running a sandboxed interpreter, return a false value.
-* PySandbox_AllowedMemoryAlloc(Py_ssize_t, error_return)
+* PySandbox_AllowedMemoryAlloc(integer, error_return)
Macro to increase the reported amount of memory that the running
sandboxed interpreter is using. If the increase puts the total count
past the set limit, raise a SandboxError exception and cause the calling function
- to return with the value of error_return, otherwise do nothing.
+ to return with the value of 'error_return', otherwise do nothing.
-* PySandbox_AllowedMemoryFree(Py_ssize_t, error_return)
+* PySandbox_AllowedMemoryFree(integer, error_return)
Macro to decrease the current running interpreter's allocated memory. If this puts
the memory used to below 0, raise a SandboxError exception and return
- error_return, otherwise do nothing.
+ 'error_return', otherwise do nothing.
Reading/Writing Files
@@ -512,7 +514,7 @@
being gleaned from the type of exception returned (i.e., returning IOError if a
path does not exist tells the user something about that file path).
-What open() may not specifically be an instance of 'file' but a proxy
+What open() returns may not be an instance of 'file' but a proxy
that provides the security measures needed. While this might break code that
uses type checking to make sure a 'file' object is used, taking a duck typing
approach would be better. This is not only more Pythonic but would also allow
@@ -542,11 +544,11 @@
API
--------------
-* int PySandbox_SetAllowedFile(PyThreadState *, const char *path, const char *mode)
+* int PySandbox_SetAllowedFile(PyThreadState *, string path, string mode)
Add a file that is allowed to be opened in 'mode' by the 'file' object. If
the interpreter is not sandboxed then return a false value.
-* PySandbox_AllowedPath(path, mode, error_return)
+* PySandbox_AllowedPath(string path, string mode, error_return)
Macro that causes the caller to return with 'error_return' and raise
SandboxError as the
exception if the specified path with 'mode' is not allowed, otherwise do
@@ -567,7 +569,8 @@
extension module). Python bytecode files are never directly imported because
of the possibility of hostile bytecode being present. Python source is always
considered safe based on the assumption that all resource harm is eventually done at
-the C level, thus Python code directly cannot cause harm. Thus only C
+the C level, thus Python source code cannot directly cause harm without the help of
+C extension modules. Thus only C
extension modules need to be checked against the whitelist.
The requested extension module name is checked in order to make sure that it
@@ -624,17 +627,17 @@
API
--------------
-* int PySandbox_SetModule(PyThreadState *, const char *module_name)
+* int PySandbox_SetModule(PyThreadState *, string module_name)
Allow the sandboxed interpreter to import 'module_name'. If the
interpreter is not sandboxed, return a false value. Absolute import paths must be
specified.
-* int PySandbox_BlockModule(PyThreadState *, const char *module_name)
+* int PySandbox_BlockModule(PyThreadState *, string module_name)
Remove the specified module from the whitelist. Used to remove modules
that are allowed by default. Return a false value if called on an
unprotected interpreter.
-* PySandbox_AllowedModule(const char *module_name, error_return)
+* PySandbox_AllowedModule(string module_name, error_return)
Macro that causes the caller to return with 'error_return' and sets the
exception SandboxError if the specified module cannot be imported,
otherwise does nothing.
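
As a rough sketch of how these calls might fit together (the function
names other than the ``PySandbox_*`` ones are hypothetical, and the
module names are only examples)::

    #include "Python.h"

    /* Embedding code: adjust the import whitelist before the sandboxed
       interpreter executes any Python code. */
    static int
    configure_imports(PyThreadState *sandbox_tstate)
    {
        /* Allow one extra C extension module by absolute name. */
        if (!PySandbox_SetModule(sandbox_tstate, "math"))
            return 0;   /* not a sandboxed interpreter */
        /* Remove a module that the default whitelist might allow. */
        if (!PySandbox_BlockModule(sandbox_tstate, "binascii"))
            return 0;
        return 1;
    }

    /* Inside the import machinery: the macro sets SandboxError and makes
       this function return NULL if 'name' is not on the whitelist;
       otherwise it does nothing and the import proceeds. */
    static PyObject *
    import_c_extension(const char *name)
    {
        PySandbox_AllowedModule(name, NULL);
        /* ... continue with the normal C extension import ... */
        Py_RETURN_NONE;
    }
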
@@ -703,7 +706,7 @@
API
--------------
-None.
+N/A
Changing the Behaviour of the Interpreter
@@ -715,9 +718,8 @@
Only a subset of the 'sys' module will be made available to sandboxed
interpreters. Things to allow from the sys module:
-* byteorder
-* subversion
-* copyright
+* byteorder (?)
+* copyright
* displayhook
* excepthook
* __displayhook__
@@ -726,19 +728,18 @@
* exc_clear
* exit
* getdefaultencoding
-* _getframe
+* _getframe (?)
* hexversion
* last_type
* last_value
* last_traceback
-* maxint
-* maxunicode
+* maxint (?)
+* maxunicode (?)
* modules
* stdin # See `Stdin, Stdout, and Stderr`_.
* stdout
* stderr
* version
-* api_version
Why
@@ -800,23 +801,23 @@
API
--------------
-* int PySandboxed_SetIPAddress(PyThreadState *, const char *IP, int port)
- Allow the sandboxed interpreter to send/receive to the specified IP
- address on the specified port. If the interpreter is not sandboxed,
+* int PySandbox_SetIPAddress(PyThreadState *, string IP, integer port)
+ Allow the sandboxed interpreter to send/receive to the specified 'IP'
+ address on the specified 'port'. If the interpreter is not sandboxed,
return a false value.
-* PySandbox_AllowedIPAddress(const char *IP, int port, error_return)
- Macro to verify that the specified IP address on the specified port is
+* PySandbox_AllowedIPAddress(string IP, integer port, error_return)
+ Macro to verify that the specified 'IP' address on the specified 'port' is
allowed to be communicated with. If not, cause the caller to return with
'error_return' and SandboxError exception set, otherwise do nothing.
-* int PySandbox_SetHost(PyThreadState *, const char *host, int port)
- Allow the sandboxed interpreter to send/receive to the specified host on
- the specified port. If the interpreter is not sandboxed, return a false
+* int PySandbox_SetHost(PyThreadState *, string host, integer port)
+ Allow the sandboxed interpreter to send/receive to the specified 'host' on
+ the specified 'port'. If the interpreter is not sandboxed, return a false
value.
-* PySandbox_AllowedHost(const char *host, int port, error_return)
- Check that the specified host on the specified port is allowed to be
+* PySandbox_AllowedHost(string host, integer port, error_return)
+ Check that the specified 'host' on the specified 'port' is allowed to be
communicated with. If not, set a SandboxError exception and cause the caller to
return 'error_return', otherwise do nothing.
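
A rough sketch of the intended call pattern (the surrounding function
names and the address are made up; only the ``PySandbox_*`` calls are
the proposed API)::

    #include "Python.h"

    /* Embedding code: grant exactly one address/port pair before any
       sandboxed code runs. */
    static int
    allow_one_peer(PyThreadState *sandbox_tstate)
    {
        return PySandbox_SetIPAddress(sandbox_tstate, "192.0.2.10", 80);
    }

    /* Inside the socket implementation: if the pair was never granted,
       the macro sets SandboxError and makes this function return NULL. */
    static PyObject *
    checked_connect(const char *ip, int port)
    {
        PySandbox_AllowedIPAddress(ip, port, NULL);
        /* ... perform the actual connection ... */
        Py_RETURN_NONE;
    }
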
@@ -854,7 +855,7 @@
API
--------------
-* int PySandbox_SetNetworkInfo(interpreter)
+* int PySandbox_SetNetworkInfo(PyThreadState *)
Allow the sandboxed interpreter to get network information regardless of
whether the IP or host address is explicitly allowed. If the interpreter
is not sandboxed, return a false value.
@@ -914,7 +915,7 @@
--------------
By default, sys.__stdin__, sys.__stdout__, and sys.__stderr__ will be set to
-instances of cStringIO. Explicit allowance of the process' stdin, stdout, and
+instances of StringIO. Explicit allowance of the process' stdin, stdout, and
stderr is possible.
This will protect the 'print' statement, and the built-ins input() and
@@ -941,15 +942,15 @@
int PySandbox_SetTrueStdout(PyThreadState *)
int PySandbox_SetTrueStderr(PyThreadState *)
Set the specific stream for the interpreter to the true version of the
- stream and not to the default instance of cStringIO. If the interpreter is
+ stream and not to the default instance of StringIO. If the interpreter is
not sandboxed, return a false value.
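
A minimal sketch, assuming an embedding application wants to grant only
the real process stdout (the helper name is hypothetical)::

    #include "Python.h"

    static int
    grant_real_stdout(PyThreadState *sandbox_tstate)
    {
        /* stdin and stderr stay as the default StringIO instances;
           returns a false value if the interpreter is not sandboxed. */
        return PySandbox_SetTrueStdout(sandbox_tstate);
    }
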
Adding New Protections
=============================
-This feature has the lowest priority and thus will be the last feature
-implemented (if ever).
+.. note:: This feature has the lowest priority and thus will be the last feature
+ implemented (if ever).
Protection
--------------
@@ -988,30 +989,31 @@
--------------
+ Bool
- * int PySandbox_SetExtendedFlag(PyThreadState *, group, type)
+ * int PySandbox_SetExtendedFlag(PyThreadState *, string group, string type)
Set a group-type to be true. Expected use is for when a binary
decision about something is needed and the default is to not allow
use of the resource (e.g., network information). Returns a false value
if used on an unprotected interpreter.
- * PySandbox_AllowedExtendedFlag(group, type, error_return)
+ * PySandbox_AllowedExtendedFlag(string group, string type, error_return)
Macro that, if the group-type is not set to true, causes the caller to
return with 'error_return' and a SandboxError exception raised. For unprotected
interpreters the check does nothing.
+ Numeric Range
- * int PySandbox_SetExtendedCap(PyThreadState *, group, type, cap)
- Set a group-type to a capped value, with the initial value set to 0.
+ * int PySandbox_SetExtendedCap(PyThreadState *, string group, string type,
+ integer cap)
+ Set a group-type to a capped value, 'cap', with the initial allocated value set to 0.
Expected use is when a resource has a capped amount of use (e.g.,
memory). Returns a false value if the interpreter is not sandboxed.
- * PySandbox_AllowedExtendedAlloc(increase, error_return)
+ * PySandbox_AllowedExtendedAlloc(integer increase, error_return)
Macro to raise the amount of a resource that is used by 'increase'. If the
increase pushes the resource allocation past the set cap, then return
'error_return' and set SandboxError as the exception, otherwise do
nothing.
- * PySandbox_AllowedExtendedFree(decrease, error_return)
+ * PySandbox_AllowedExtendedFree(integer decrease, error_return)
Macro to lower the amount that a resource is used by 'decrease'. If the
decrease pushes the allotment below 0, then have the caller return
'error_return' and set SandboxError as the exception, otherwise do
@@ -1019,27 +1021,46 @@
+ Membership
- * int PySandbox_SetExtendedMembership(PyThreadState *, group, type, string)
- Add a string to be considered a member of a group-type (e.g., allowed
+ * int PySandbox_SetExtendedMembership(PyThreadState *, string group,
+ string type, string member)
+ Add a string, 'member', to be considered a member of a group-type (e.g., allowed
file paths). If the interpreter is not a sandboxed interpreter,
return a false value.
- * PySandbox_AllowedExtendedMembership(group, type, string, error_return)
- Macro that checks 'string' is a member of the values set for the
+ * PySandbox_AllowedExtendedMembership(string group, string type,
+ string member, error_return)
+ Macro that checks 'member' is a member of the values set for the
group-type. If it is not, then have the caller return 'error_return'
and set an exception for SandboxError, otherwise does nothing.
+ Specific Value
- * int PySandbox_SetExtendedValue(PyThreadState *, group, type, string)
- Set a group-type to a specific string. If the interpreter is not
+ * int PySandbox_SetExtendedValue(PyThreadState *, string group,
+ string type, string value)
+ Set a group-type to 'value'. If the interpreter is not
sandboxed, return NULL.
- * PySandbox_AllowedExtendedValue(group, type, string, error_return)
- Macro to check that the group-type is set to 'string'. If it is not,
+ * PySandbox_AllowedExtendedValue(string group, string type, string value, error_return)
+ Macro to check that the group-type is set to 'value'. If it is not,
then have the caller return 'error_return' and set an exception for
SandboxError, otherwise do nothing.
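
A rough sketch of how the extended hooks might be exercised (all of the
group/type strings and helper names here are invented for illustration;
only the ``PySandbox_*`` calls follow the API above)::

    #include "Python.h"

    /* Embedding code: one flag, one capped resource, one membership. */
    static int
    configure_extended(PyThreadState *sandbox_tstate)
    {
        if (!PySandbox_SetExtendedFlag(sandbox_tstate, "network", "info"))
            return 0;
        if (!PySandbox_SetExtendedCap(sandbox_tstate, "disk", "write",
                                      1024 * 1024))
            return 0;
        if (!PySandbox_SetExtendedMembership(sandbox_tstate, "env", "read",
                                             "LANG"))
            return 0;
        return 1;
    }

    /* Resource code: account for a capped use.  The macro sets
       SandboxError and makes this function return NULL if the increase
       goes past the configured cap. */
    static PyObject *
    checked_write(const char *data, long nbytes)
    {
        PySandbox_AllowedExtendedAlloc(nbytes, NULL);
        /* ... perform the write of 'data' ... */
        Py_RETURN_NONE;
    }
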
+Python API
+=============================
+
+__sandboxed__
+--------------
+
+A built-in that flags whether the interpreter currently running is sandboxed or
+not. Set to a 'bool' value that is read-only. It mimics the workings of __debug__.
+
+
+sandbox module
+--------------
+
+XXX
+
+
References
///////////////////////////////////////
From python-checkins at python.org Sat Jul 8 00:43:27 2006
From: python-checkins at python.org (brett.cannon)
Date: Sat, 8 Jul 2006 00:43:27 +0200 (CEST)
Subject: [Python-checkins] r50487 -
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Message-ID: <20060707224327.EEF2A1E4007@bag.python.org>
Author: brett.cannon
Date: Sat Jul 8 00:43:27 2006
New Revision: 50487
Modified:
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Log:
Rewrap to 72 columns.
Modified: python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
==============================================================================
--- python/branches/bcannon-sandboxing/sandboxing_design_doc.txt (original)
+++ python/branches/bcannon-sandboxing/sandboxing_design_doc.txt Sat Jul 8 00:43:27 2006
@@ -4,12 +4,12 @@
About This Document
=============================
-This document is meant to lay out the general design for re-introducing a
-sandboxing model for Python. This document should provide one with
+This document is meant to lay out the general design for re-introducing
+a sandboxing model for Python. This document should provide one with
enough information to understand the goals for sandboxing, what
-considerations were made for the design, and the actual design itself. Design
-decisions should be clear and explain not only why they were chosen but
-possible drawbacks from taking a specific approach.
+considerations were made for the design, and the actual design itself.
+Design decisions should be clear and explain not only why they were
+chosen but possible drawbacks from taking a specific approach.
If any of the above is found not to be true, please email me at
brett at python.org and let me know what problems you are having with the
@@ -46,33 +46,35 @@
Goal
=============================
-A good sandboxing model provides enough protection to prevent
-malicious harm to come to the system, and no more. Barriers should be
-minimized so as to allow most code that does not do anything that would be
-regarded as harmful to run unmodified. But the protections need to be thorough
-enough to prevent any unintended changes or information of the system to come
-about.
-
-An important point to take into consideration when reading this document is to
-realize it is part of my (Brett Cannon's) Ph.D. dissertation. This means it is
-heavily geared toward sandboxing when the interpreter is working
-with Python code embedded in a web page as viewed in Firefox. While great strides have been taken
-to keep the design general enough so as to allow all previous uses of the
-'rexec' module [#rexec]_ to be able to use the new design, it is not the
-focused goal. This means if a design decision must be made for the embedded
-use case compared to sandboxing Python code in a pure Python application, the former
-will win out over the latter.
-
-Throughout this document, the term "resource" is used to represent anything that
-deserves possible protection. This includes things that have a physical
-representation (e.g., memory) to things that are more abstract and specific to
-the interpreter (e.g., sys.path).
-
-When referring to the state of an interpreter, it is either "unprotected" or
-"sandboxed". A unprotected interpreter has no restrictions imposed upon any
-resource. A sandboxed interpreter has at least one, possibly more, resource
-with restrictions placed upon it to prevent unsafe code that is running
-within the interpreter to cause harm to the system.
+A good sandboxing model provides enough protection to prevent malicious
+harm to come to the system, and no more. Barriers should be minimized
+so as to allow most code that does not do anything that would be
+regarded as harmful to run unmodified. But the protections need to be
+thorough enough to prevent any unintended changes to the system or any
+unintended disclosure of information about it.
+
+An important point to take into consideration when reading this
+document is to realize it is part of my (Brett Cannon's) Ph.D.
+dissertation. This means it is heavily geared toward sandboxing when
+the interpreter is working with Python code embedded in a web page as
+viewed in Firefox. While great strides have been taken to keep the
+design general enough so as to allow all previous uses of the 'rexec'
+module [#rexec]_ to be able to use the new design, it is not the
+focused goal. This means if a design decision must be made for the
+embedded use case compared to sandboxing Python code in a pure Python
+application, the former will win out over the latter.
+
+Throughout this document, the term "resource" is used to represent
+anything that deserves possible protection. This includes things that
+have a physical representation (e.g., memory) to things that are more
+abstract and specific to the interpreter (e.g., sys.path).
+
+When referring to the state of an interpreter, it is either
+"unprotected" or "sandboxed". A unprotected interpreter has no
+restrictions imposed upon any resource. A sandboxed interpreter has at
+least one, possibly more, resource with restrictions placed upon it to
+prevent unsafe code that is running within the interpreter from
+causing harm to the system.
.. contents::
@@ -81,11 +83,12 @@
Use Cases
/////////////////////////////
-All use cases are based on how many sandboxed interpreters are
-running in a single process and whether an unprotected interpreter is also running. The
-use cases can be broken down into two categories: when the interpreter is
-embedded and only using sandboxed interpreters, and when pure Python code is
-running in an unprotected interpreter and uses sandboxed interpreters.
+All use cases are based on how many sandboxed interpreters are running
+in a single process and whether an unprotected interpreter is also
+running. The use cases can be broken down into two categories: when
+the interpreter is embedded and only using sandboxed interpreters, and
+when pure Python code is running in an unprotected interpreter and uses
+sandboxed interpreters.
When the Interpreter Is Embedded
@@ -94,68 +97,68 @@
Single Sandboxed Interpreter
----------------------------
-This use case is when an application embeds the interpreter and never has more
-than one interpreter running which happens to be sandboxed.
+This use case is when an application embeds the interpreter and never
+has more than one interpreter running which happens to be sandboxed.
Multiple Sandboxed Interpreters
-------------------------------
-When multiple interpreters, all sandboxed at varying levels, need to be running
-within a single application. This is the key use case that this proposed
-design is targeted for.
+When multiple interpreters, all sandboxed at varying levels, need to be
+running within a single application. This is the key use case that
+this proposed design is targeted for.
Stand-Alone Python
=============================
-When someone has written a Python program that wants to execute Python code in
-an sandboxed interpreter(s). This is the use case that 'rexec' attempted to
-fulfill.
+When someone has written a Python program that wants to execute Python
+code in a sandboxed interpreter(s).  This is the use case that 'rexec'
+attempted to fulfill.
Issues to Consider
=============================
-Common to all use cases, resources that the interpreter requires to function at a level
-below user code cannot be exposed to a sandboxed
-interpreter. For instance, the interpreter might need to stat a file to see if
-it is possible to import. If the ability to stat a file is not allowed to a
-sandboxed interpreter, it should not be allowed to perform that action,
-regardless of whether the interpreter at a level below user code needs that
-ability.
-
-When multiple interpreters are involved (sandboxed or not), not allowing an interpreter
-to gain access to resources available in other interpreters without explicit
-permission must be enforced.
+Common to all use cases, resources that the interpreter requires to
+function at a level below user code cannot be exposed to a sandboxed
+interpreter. For instance, the interpreter might need to stat a file
+to see if it is possible to import. If the ability to stat a file is
+not allowed to a sandboxed interpreter, it should not be allowed to
+perform that action, regardless of whether the interpreter at a level
+below user code needs that ability.
+
+When multiple interpreters are involved (sandboxed or not), not
+allowing an interpreter to gain access to resources available in other
+interpreters without explicit permission must be enforced.
Resources to Protect
/////////////////////////////
-It is important to make sure that the proper resources are protected from a
-sandboxed interpreter. If you don't there is no point to sandboxing.
+It is important to make sure that the proper resources are protected
+from a sandboxed interpreter.  If you don't, there is no point to sandboxing.
Filesystem
===================
All facets of the filesystem must be protected. This means restricting
-reading and writing to the filesystem (e.g., files, directories, etc.). It
-should be allowed in controlled situations where allowing access to the
-filesystem is desirable, but that should be an explicit allowance.
-
-There must also be protection to prevent revealing any information about the
-filesystem. Disclosing information on the filesystem could allow one to infer
-what OS the interpreter is running on, for instance.
+reading and writing to the filesystem (e.g., files, directories, etc.).
+It should be allowed in controlled situations where allowing access to
+the filesystem is desirable, but that should be an explicit allowance.
+
+There must also be protection to prevent revealing any information
+about the filesystem. Disclosing information on the filesystem could
+allow one to infer what OS the interpreter is running on, for instance.
Memory
===================
-Memory should be protected. It is a limited resource on the system that can
-have an impact on other running programs if it is exhausted. Being able to
-restrict the use of memory would help alleviate issues from denial-of-service
-(DoS) attacks on the system.
+Memory should be protected. It is a limited resource on the system
+that can have an impact on other running programs if it is exhausted.
+Being able to restrict the use of memory would help alleviate issues
+from denial-of-service (DoS) attacks on the system.
Networking
@@ -164,273 +167,291 @@
Networking is somewhat like the filesystem in terms of wanting similar
protections. You do not want to let unsafe code make socket
connections unhindered or accept them to do possibly nefarious things.
-You also want to prevent finding out information about the network you are
-connected to.
+You also want to prevent finding out information about the network you
+are connected to.
Interpreter
===================
-One must make sure that the interpreter is not harmed in any way from sandboxed
-code. This usually takes the form of crashing the program that the interpreter
-is embedded in or the unprotected interpreter that started the sandbox interpreter.
-Executing hostile bytecode that might lead to undesirable effects is another
-possible issue.
+One must make sure that the interpreter is not harmed in any way from
+sandboxed code. This usually takes the form of crashing the program
+that the interpreter is embedded in or the unprotected interpreter that
+started the sandbox interpreter. Executing hostile bytecode that might
+lead to undesirable effects is another possible issue.
-There is also the issue of taking it over. One should not able to gain escalated
-privileges in any way without explicit permission.
+There is also the issue of taking it over.  One should not be able to
+gain escalated privileges in any way without explicit permission.
Types of Security
///////////////////////////////////////
-As with most things, there are multiple approaches one can take to tackle a
-problem. Security is no exception. In general there seem to be two approaches
-to protecting resources.
+As with most things, there are multiple approaches one can take to
+tackle a problem. Security is no exception. In general there seem to
+be two approaches to protecting resources.
Resource Hiding
=============================
-By never giving code a chance to access a resource, you prevent it from being
-(ab)used. This is the idea behind resource hiding; you can't misuse something
-you don't have in the first place.
-
-The most common implementation of resource hiding is capabilities. In this
-type of system a resource's reference acts as a ticket that represents the right
-to use the resource. Once code has a reference it is considered to have full
-use of resource that reference represents and no further security checks are
-directly performed (using delegates and other structured ways one can actually
-have a security check for each access of a resource, but this is not a default
-behaviour).
-
-As an example, consider the 'file' type as a resource we want to protect. That
-would mean that we did not want a reference to the 'file' type to ever be
-accessible without explicit permission. If one wanted to provide read-only
-access to a temp file, you could have open() perform a check on the permissions
-of the current interpreter, and if it is allowed to, return a proxy object for
-the file that only allows reading from it. The 'file' instance for the proxy
-would need to be properly hidden so that the reference was not reachable from
+By never giving code a chance to access a resource, you prevent it from
+being (ab)used. This is the idea behind resource hiding; you can't
+misuse something you don't have in the first place.
+
+The most common implementation of resource hiding is capabilities. In
+this type of system a resource's reference acts as a ticket that
+represents the right to use the resource. Once code has a reference it
+is considered to have full use of the resource that reference represents
+and no further security checks are directly performed (using delegates
+and other structured ways one can actually have a security check for
+each access of a resource, but this is not a default behaviour).
+
+As an example, consider the 'file' type as a resource we want to
+protect. That would mean that we did not want a reference to the
+'file' type to ever be accessible without explicit permission. If one
+wanted to provide read-only access to a temp file, you could have
+open() perform a check on the permissions of the current interpreter,
+and if it is allowed to, return a proxy object for the file that only
+allows reading from it. The 'file' instance for the proxy would need
+to be properly hidden so that the reference was not reachable from
outside so that 'file' access could still be controlled.
-Python, as it stands now, unfortunately does not work well for a pure capabilities system.
-Capabilities require the prohibition of certain abilities, such as
-"direct access to another's private state" [#paradigm regained]_. This
-obviously is not possible in Python since, at least at the Python level, there
-is no such thing as private state that is persistent (one could argue that
-local variables that are not cell variables for lexical scopes are private, but
-since they do not survive after a function call they are not usable for keeping
-persistent state). One can hide references at the C level by storing it in the
-struct for the instance of a type and not providing a function to access that
-attribute.
+Python, as it stands now, unfortunately does not work well for a pure
+capabilities system. Capabilities require the prohibition of certain
+abilities, such as "direct access to another's private state"
+[#paradigm regained]_. This obviously is not possible in Python since,
+at least at the Python level, there is no such thing as private state
+that is persistent (one could argue that local variables that are not
+cell variables for lexical scopes are private, but since they do not
+survive after a function call they are not usable for keeping
+persistent state). One can hide references at the C level by storing
+them in the struct for the instance of a type and not providing a
+function to access that attribute.
Python's introspection abilities also do not help make implementing
-capabilities that much easier. Consider how one could access 'file' even when it is deleted from
-__builtin__. You can still get to the reference for 'file' through the sequence returned by
+capabilities that much easier. Consider how one could access 'file'
+even when it is deleted from __builtin__. You can still get to the
+reference for 'file' through the sequence returned by
``object.__subclasses__()``.
Resource Crippling
=============================
-Another approach to security is to not worry about controlling access to the
-reference of a resource.
-One can have a resource perform a
-security check every time someone tries to use a method on that resource. This
-pushes the security check to a lower level; from a reference level to the
-method level.
-
-By performing the security check every time a resource's method is called the
-worry of a specific resource's reference leaking out to insecure code is alleviated. This does add extra overhead, though,
-by having to do so many security checks. It also does not handle the situation
-where an unexpected exposure of a type occurs that has not been properly
-crippled.
-
-FreeBSD's jail system provides a protection scheme similar to this. Various system calls
-allow for basic usage, but knowing or having access to the system call is not enough to grant
-usage. Every call to a system call requires checking that the proper rights
-have been granted to the use in order to allow for the system call to perform
+Another approach to security is to not worry about controlling access
+to the reference of a resource. One can have a resource perform a
+security check every time someone tries to use a method on that
+resource. This pushes the security check to a lower level; from a
+reference level to the method level.
+
+By performing the security check every time a resource's method is
+called the worry of a specific resource's reference leaking out to
+insecure code is alleviated. This does add extra overhead, though, by
+having to do so many security checks. It also does not handle the
+situation where an unexpected exposure of a type occurs that has not
+been properly crippled.
+
+FreeBSD's jail system provides a protection scheme similar to this.
+Various system calls allow for basic usage, but knowing or having
+access to the system call is not enough to grant usage. Every call to
+a system call requires checking that the proper rights have been
+granted to the user in order to allow for the system call to perform
its action.
-An even better example in FreeBSD's jail system is its protection of sockets.
-One can only bind a single IP address to a jail. Any attempt to do more or
-perform uses with the one IP address that is granted is prevented. The check
-is performed at every call involving the one granted IP address.
+An even better example in FreeBSD's jail system is its protection of
+sockets. One can only bind a single IP address to a jail. Any attempt
+to do more or perform uses with the one IP address that is granted is
+prevented. The check is performed at every call involving the one
+granted IP address.
Using 'file' as the example again, one could cripple the type so that
instantiation is not possible for the type in Python. One could also
-provide a permission check on each call to a unsafe method call and thus allow
-the type to be used in normal situations (such as type checking), but still
-feel safe that illegal operations are not performed.
-Regardless of which approach you take, you do not need
-to worry about a reference to the type being exposed unexpectedly since the
-reference is not the security check but the actual method calls.
+provide a permission check on each call to an unsafe method and
+thus allow the type to be used in normal situations (such as type
+checking), but still feel safe that illegal operations are not
+performed. Regardless of which approach you take, you do not need to
+worry about a reference to the type being exposed unexpectedly since
+the reference is not the security check but the actual method calls.
Comparison of the Two Approaches
================================
-From the perspective of Python, the two approaches differ on what would be the
-most difficult thing to analyze from a security standpoint: all of the ways to
-gain access to various types from a sandboxed interpreter with no imports, or
-finding all of the types that can lead to possibly dangerous actions and thus
-need to be crippled.
-
-Some Python developers, such as Armin Rigo, feel that truly hiding objects in
-Python is "quite hard" [#armin-hiding]_. This sentiment means that making a
-pure capabilities system in Python that is secure is not possible as people
-would continue to find new ways to get a hold of the reference to a protected
-resource.
-
-Others feel that by not going the capabilities route we will be constantly
-chasing down new types that require crippling. The thinking is that if we
-cannot control the references for 'file', how are we to know what other types
-might become exposed later on and thus require more crippling?
-
-It essentially comes down to what is harder to do: find all the ways to access
-the types in Python in a sandboxed interpreter with no imported modules, or to
-go through the Python code base and find all types that should be crippled?
+From the perspective of Python, the two approaches differ on what would
+be the most difficult thing to analyze from a security standpoint: all
+of the ways to gain access to various types from a sandboxed
+interpreter with no imports, or finding all of the types that can lead
+to possibly dangerous actions and thus need to be crippled.
+
+Some Python developers, such as Armin Rigo, feel that truly hiding
+objects in Python is "quite hard" [#armin-hiding]_. This sentiment
+means that making a pure capabilities system in Python that is secure
+is not possible as people would continue to find new ways to get a hold
+of the reference to a protected resource.
+
+Others feel that by not going the capabilities route we will be
+constantly chasing down new types that require crippling. The thinking
+is that if we cannot control the references for 'file', how are we to
+know what other types might become exposed later on and thus require
+more crippling?
+
+It essentially comes down to what is harder to do: find all the ways to
+access the types in Python in a sandboxed interpreter with no imported
+modules, or to go through the Python code base and find all types that
+should be crippled?
The 'rexec' Module
///////////////////////////////////////
-The 'rexec' module [#rexec]_ was the original attempt at providing a sandbox
-environment for Python code to run in. It's design was based on Safe-Tcl which
-was essentially a capabilities system
-[#safe-tcl]_. Safe-Tcl
-allowed you to launch a separate interpreter where its global functions were
-specified at creation time. This prevented one from having any abilities that
-were not explicitly provided.
-
-For 'rexec', the Safe-Tcl model was tweaked to better match Python's situation.
-An RExec object represented a sandboxed environment. Imports were checked
-against a whitelist of modules. You could also restrict the type of modules to
-import based on whether they were Python source, bytecode, or C extensions.
-Built-ins were allowed except for a blacklist of built-ins to not provide.
-One could restrict whether stdin,
-stdout, and stderr were provided or not on a per-RExec basis.
-Several other protections were provided; see documentation for the complete
-list.
-
-The ultimate undoing of the 'rexec' module was how access to objects that in
-normal Python require no imports to reach was handled. Importing modules
-requires a direct action, and thus can be protected against directly in the
-import machinery. But for built-ins, they are accessible by default and
-require no direct action to access in normal Python; you just use their name
-since they are provided in all namespaces.
+The 'rexec' module [#rexec]_ was the original attempt at providing a
+sandbox environment for Python code to run in.  Its design was based
+on Safe-Tcl which was essentially a capabilities system [#safe-tcl]_.
+Safe-Tcl allowed you to launch a separate interpreter where its global
+functions were specified at creation time. This prevented one from
+having any abilities that were not explicitly provided.
+
+For 'rexec', the Safe-Tcl model was tweaked to better match Python's
+situation. An RExec object represented a sandboxed environment.
+Imports were checked against a whitelist of modules. You could also
+restrict the type of modules to import based on whether they were
+Python source, bytecode, or C extensions. Built-ins were allowed
+except for a blacklist of built-ins to not provide. One could restrict
+whether stdin, stdout, and stderr were provided or not on a per-RExec
+basis. Several other protections were provided; see documentation for
+the complete list.
+
+The ultimate undoing of the 'rexec' module was how access to objects
+that in normal Python require no imports to reach was handled.
+Importing modules requires a direct action, and thus can be protected
+against directly in the import machinery. But for built-ins, they are
+accessible by default and require no direct action to access in normal
+Python; you just use their name since they are provided in all
+namespaces.
For instance, in a sandboxed interpreter, one only had to
-``del __builtins__`` to gain access to the full set of built-ins. Another way
-is through using the gc module:
-``gc.get_referrers(''.__class__.__bases__[0])[6]['file']``. While both of
-these could be fixed (the former was a bug in 'rexec' that was fixed and the
-latter could be handled by not allowing
-'gc' to be imported), they are examples of things that do not require proactive
-actions on the part of the programmer in normal Python to gain access to a
-resource. This was an unfortunate side-effect of having all of that wonderful
-reflection in Python.
-
-There is also the issue that 'rexec' was written in Python which provides its
-own problems based on reflection and the ability to modify the code at
-run-time without security protection.
-
-Much has been learned since 'rexec' was written about how Python tends to be
-used and where security issues tend to appear. Essentially Python's dynamic
-nature does not lend itself very well to a security implementation that does
-not require a constant checking of permissions.
+``del __builtins__`` to gain access to the full set of built-ins.
+Another way is through using the gc module:
+``gc.get_referrers(''.__class__.__bases__[0])[6]['file']``. While both
+of these could be fixed (the former was a bug in 'rexec' that was fixed
+and the latter could be handled by not allowing 'gc' to be imported),
+they are examples of things that do not require proactive actions on
+the part of the programmer in normal Python to gain access to a
+resource. This was an unfortunate side-effect of having all of that
+wonderful reflection in Python.
+
+There is also the issue that 'rexec' was written in Python which
+provides its own problems based on reflection and the ability to modify
+the code at run-time without security protection.
+
+Much has been learned since 'rexec' was written about how Python tends
+to be used and where security issues tend to appear. Essentially
+Python's dynamic nature does not lend itself very well to a security
+implementation that does not require a constant checking of
+permissions.
Threat Model
///////////////////////////////////////
-Below is a list of what the security implementation assumes, along with what section of this document that addresses
-that part of the security model (if not already true in Python by default).
-The term "bare" when in regards to
-an interpreter means an interpreter that has not performed a single import
-of a module. Also, all comments refer to a sandboxed interpreter unless
-otherwise explicitly stated.
-
-This list does not address specifics such as how 'file' will be protected or
-whether memory should be protected. This list is meant to make clear at a more
-basic level what the security model is assuming is true.
+Below is a list of what the security implementation assumes, along with
+what section of this document that addresses that part of the security
+model (if not already true in Python by default).  The term "bare", in
+regard to an interpreter, means an interpreter that has not
+performed a single import of a module. Also, all comments refer to a
+sandboxed interpreter unless otherwise explicitly stated.
+
+This list does not address specifics such as how 'file' will be
+protected or whether memory should be protected. This list is meant to
+make clear at a more basic level what the security model is assuming is
+true.
* The Python interpreter itself is always trusted.
-* The Python interpreter cannot be crashed by valid Python source code in a
- bare interpreter.
+* The Python interpreter cannot be crashed by valid Python source code
+ in a bare interpreter.
* Python source code is always considered safe.
* Python bytecode is always considered dangerous [`Hostile Bytecode`_].
* C extension modules are inherently considered dangerous
[`Extension Module Importation`_].
+ Explicit trust of a C extension module is possible.
-* Sandboxed interpreters running in the same process inherently cannot communicate with
- each other.
- + Communication through C extension modules is possible because of the
- technical need to share extension module instances between interpreters.
-* Sandboxed interpreters running in the same process inherently cannot share
- objects.
- + Sharing objects through C extension modules is possible because of the
- technical need to share extension module instances between interpreters.
-* When starting a sandboxed interpreter, it starts with a fresh built-in and
- global namespace that is not shared with the interpreter that started it.
+* Sandboxed interpreters running in the same process inherently cannot
+ communicate with each other.
+ + Communication through C extension modules is possible because of
+ the technical need to share extension module instances between
+ interpreters.
+* Sandboxed interpreters running in the same process inherently cannot
+ share objects.
+ + Sharing objects through C extension modules is possible because
+ of the technical need to share extension module instances between
+ interpreters.
+* When starting a sandboxed interpreter, it starts with a fresh
+ built-in and global namespace that is not shared with the interpreter
+ that started it.
* Objects in the default built-in namespace should be safe to use
[`Reading/Writing Files`_, `Stdin, Stdout, and Stderr`_].
- + Either hide the dangerous ones or cripple them so they can cause no harm.
+ + Either hide the dangerous ones or cripple them so they can cause
+ no harm.
There are also some features that might be desirable, but are not being
addressed by this security model.
-* Communication in any direction between an unprotected interpreter and a sandboxed interpreter
- it created.
+* Communication in any direction between an unprotected interpreter and
+ a sandboxed interpreter it created.
The Proposed Approach
///////////////////////////////////////
-In light of where 'rexec' succeeded and failed along with what is known about
-the two main approaches to security and how Python tends to operate, the following
-is a proposal on how to secure Python for sandboxing.
+In light of where 'rexec' succeeded and failed along with what is known
+about the two main approaches to security and how Python tends to
+operate, the following is a proposal on how to secure Python for
+sandboxing.
Implementation Details
===============================
-Support for sandboxed interpreters will require a compilation flag. This allows the
-more common case of people not caring about protections to not take a
-performance hit. And even when Python is compiled for
+Support for sandboxed interpreters will require a compilation flag.
+This allows the more common case of people not caring about protections
+to not take a performance hit. And even when Python is compiled for
sandboxed interpreter restrictions, when the running interpreter *is*
-unprotected,
-there will be no accidental triggers of protections. This means that
-developers should be liberal with the security protections without worrying
-about there being issues for interpreters that do not need/want the protection.
-
-At the Python level, the __sandboxed__ built-in will be set based on whether
-the interpreter is sandboxed or not. This will be set for *all* interpreters,
-regardless of whether sandboxed interpreter support was compiled in or not.
+unprotected, there will be no accidental triggers of protections. This
+means that developers should be liberal with the security protections
+without worrying about there being issues for interpreters that do not
+need/want the protection.
+
+At the Python level, the __sandboxed__ built-in will be set based on
+whether the interpreter is sandboxed or not. This will be set for
+*all* interpreters, regardless of whether sandboxed interpreter support
+was compiled in or not.
For setting what is to be protected, the PyThreadState for the
-sandboxed interpreter must be passed in. This makes the protection very
-explicit and helps make sure you set protections for the exact interpreter you
-mean to. All functions that set protections begin with the prefix
-``PySandbox_Set*()``. These functions are meant to only work with sandboxed interpreters
-that have not been used yet to execute any Python code. The calls must be made
-by the code creating and handling the sandboxed interpreter *before* the
-sandboxed interpreter is used to execute any Python code.
-
-The functions for checking for permissions are actually macros that take
-in at least an error return value for the function calling the macro. This
-allows the macro to return on behalf of the caller if the check fails and cause the
-SandboxError
-exception to be propagated automatically. This helps eliminate any coding errors from
-incorrectly checking a return value on a rights-checking function call. For
-the rare case where this functionality is disliked, just make the check in a
-utility function and check that function's return value (but this is strongly
-discouraged!).
-
-Functions that check that an operation is allowed implicitly operate on the currently running interpreter as
-returned by ``PyInterpreter_Get()`` and are to be used by any code (the
-interpreter, extension modules, etc.) that needs to check for permission to
-execute. They have the common prefix of ``PySandbox_Allowed*()``.
+sandboxed interpreter must be passed in. This makes the protection
+very explicit and helps make sure you set protections for the exact
+interpreter you mean to. All functions that set protections begin with
+the prefix ``PySandbox_Set*()``. These functions are meant to only
+work with sandboxed interpreters that have not been used yet to execute
+any Python code. The calls must be made by the code creating and
+handling the sandboxed interpreter *before* the sandboxed interpreter
+is used to execute any Python code.
+
+The functions for checking for permissions are actually macros that
+take in at least an error return value for the function calling the
+macro. This allows the macro to return on behalf of the caller if the
+check fails and cause the SandboxError exception to be propagated
+automatically. This helps eliminate any coding errors from incorrectly
+checking a return value on a rights-checking function call. For the
+rare case where this functionality is disliked, just make the check in
+a utility function and check that function's return value (but this is
+strongly discouraged!).
+
+Functions that check that an operation is allowed implicitly operate on
+the currently running interpreter as returned by
+``PyInterpreter_Get()`` and are to be used by any code (the
+interpreter, extension modules, etc.) that needs to check for
+permission to execute. They have the common prefix of
+``PySandbox_Allowed*()``.
API
@@ -438,12 +459,13 @@
* PyThreadState* PySandbox_NewInterpreter()
Return a new interpreter that is considered sandboxed. There is no
- corresponding ``PySandbox_EndInterpreter()`` as ``Py_EndInterpreter()`` will be taught
- how to handle sandboxed interpreters. ``NULL`` is returned on error.
+ corresponding ``PySandbox_EndInterpreter()`` as
+ ``Py_EndInterpreter()`` will be taught how to handle sandboxed
+ interpreters. ``NULL`` is returned on error.
* PySandbox_Allowed(error_return)
- Macro that has the caller return with 'error_return' if the interpreter is
- unprotected, otherwise do nothing.
+ Macro that has the caller return with 'error_return' if the
+ interpreter is unprotected, otherwise do nothing.
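
A minimal sketch of the expected ordering, assuming an embedding
application (the helper name is hypothetical; the ``PySandbox_*`` calls
are the proposed API)::

    #include "Python.h"

    static PyThreadState *
    start_sandbox(void)
    {
        PyThreadState *sandbox_tstate = PySandbox_NewInterpreter();
        if (sandbox_tstate == NULL)
            return NULL;                    /* creation failed */

        /* All PySandbox_Set*() protections must be applied here, before
           the sandboxed interpreter executes any Python code. */
        if (!PySandbox_SetMemoryCap(sandbox_tstate, 8 * 1024 * 1024))
            return NULL;

        /* Ordinary execution of Python code may follow from this point. */
        return sandbox_tstate;
    }
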
Memory
@@ -455,18 +477,19 @@
A memory cap will be allowed.
Modification to pymalloc will be needed to properly keep track of the
-allocation and freeing of memory. Same goes for the macros around the system
-malloc/free system calls. This provides a platform-independent system for
-protection of memory instead of relying on the operating system to provide a service for
-capping memory usage of a process. It also allows the protection to be at the
-interpreter level instead of at the process level.
+allocation and freeing of memory. Same goes for the macros around the
+system malloc/free system calls. This provides a platform-independent
+system for protection of memory instead of relying on the operating
+system to provide a service for capping memory usage of a process. It
+also allows the protection to be at the interpreter level instead of at
+the process level.
Why
--------------
-Protecting excessive memory usage allows one to make sure that a DoS attack
-against the system's memory is prevented.
+Protecting excessive memory usage allows one to make sure that a DoS
+attack against the system's memory is prevented.
Possible Security Flaws
@@ -474,28 +497,32 @@
If code makes direct calls to malloc/free instead of using the proper
``PyMem_*()``
-macros then the security check will be circumvented. But C code is *supposed*
-to use the proper macros or pymalloc and thus this issue is not with the
-security model but with code not following Python coding standards.
+macros then the security check will be circumvented. But C code is
+*supposed* to use the proper macros or pymalloc and thus this issue is
+not with the security model but with code not following Python coding
+standards.
API
--------------
* int PySandbox_SetMemoryCap(PyThreadState *, integer)
- Set the memory cap for an sandboxed interpreter. If the interpreter is not
- running an sandboxed interpreter, return a false value.
+ Set the memory cap for a sandboxed interpreter.  If the
+ interpreter is not running a sandboxed interpreter, return a false
+ value.
* PySandbox_AllowedMemoryAlloc(integer, error_return)
- Macro to increase the amount of memory that is reported that the running
- sandboxed interpreter is using. If the increase puts the total count
- passed the set limit, raise an SandboxError exception and cause the calling function
- to return with the value of 'error_return', otherwise do nothing.
+ Macro to increase the reported amount of memory that the
+ running sandboxed interpreter is using.  If the increase puts the
+ total count past the set limit, raise a SandboxError exception
+ and cause the calling function to return with the value of
+ 'error_return', otherwise do nothing.
* PySandbox_AllowedMemoryFree(integer, error_return)
- Macro to decrease the current running interpreter's allocated memory. If this puts
- the memory used to below 0, raise a SandboxError exception and return
- 'error_return', otherwise do nothing.
+ Macro to decrease the currently running interpreter's allocated
+ memory.  If this puts the memory used below 0, raise a
+ SandboxError exception and return 'error_return', otherwise do
+ nothing.
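
A rough sketch of the accounting, assuming a hypothetical allocator
wrapper (real code would go through the ``PyMem_*()`` macros or
pymalloc rather than call malloc directly)::

    #include <stdlib.h>
    #include "Python.h"

    static void *
    sandbox_malloc(size_t nbytes)
    {
        /* Sets SandboxError and makes this function return NULL if the
           new total would go past the configured cap. */
        PySandbox_AllowedMemoryAlloc(nbytes, NULL);
        return malloc(nbytes);
    }

    static int
    sandbox_free(void *ptr, size_t nbytes)
    {
        free(ptr);
        /* Sets SandboxError and makes this function return -1 if the
           recorded usage would drop below zero. */
        PySandbox_AllowedMemoryFree(nbytes, -1);
        return 0;
    }
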
Reading/Writing Files
@@ -506,33 +533,35 @@
XXX
-To open a file, one will have to use open(). This will make open() a factory
-function that controls reference access to the 'file' type in terms of creating
-new instances. When an attempted file opening fails (either because the path
-does not exist or of security reasons), SandboxError will be
-raised. The same exception must be raised to prevent filesystem information
-being gleaned from the type of exception returned (i.e., returning IOError if a
-path does not exist tells the user something about that file path).
-
-What open() returns may not be an instance of 'file' but a proxy
-that provides the security measures needed. While this might break code that
-uses type checking to make sure a 'file' object is used, taking a duck typing
-approach would be better. This is not only more Pythonic but would also allow
-the code to use a StringIO instance.
-
-It has been suggested to allow for a passed-in callback to be called when a
-specific path is to be opened. While this provides good flexibility in terms
-of allowing custom proxies with more fine-grained security (e.g., capping the
-amount of disk write), this has been deemed unneeded in the initial security
-model and thus is not being considered at this time.
-
+To open a file, one will have to use open(). This will make open() a
+factory function that controls reference access to the 'file' type in
+terms of creating new instances. When an attempted file opening fails
+(either because the path does not exist or for security reasons),
+SandboxError will be raised. The same exception must be raised to
+prevent filesystem information being gleaned from the type of exception
+returned (i.e., returning IOError if a path does not exist tells the
+user something about that file path).
+
+What open() returns may not be an instance of 'file' but a proxy that
+provides the security measures needed. While this might break code
+that uses type checking to make sure a 'file' object is used, taking a
+duck typing approach would be better. This is not only more Pythonic
+but would also allow the code to use a StringIO instance.
+
+It has been suggested to allow for a passed-in callback to be called
+when a specific path is to be opened. While this provides good
+flexibility in terms of allowing custom proxies with more fine-grained
+security (e.g., capping the amount of disk write), this has been deemed
+unneeded in the initial security model and thus is not being considered
+at this time.
Why
--------------
-Allowing anyone to be able to arbitrarily read, write, or learn about the
-layout of your filesystem is extremely dangerous. It can lead to loss of data
-or data being exposed to people whom should not have access.
+Allowing anyone to arbitrarily read, write, or learn about the layout
+of your filesystem is extremely dangerous. It can lead to loss of
+data or data being exposed to people who should not have access.
Possible Security Flaws
@@ -544,15 +573,16 @@
API
--------------
-* int PySandbox_SetAllowedFile(PyThreadState *, string path, string mode)
- Add a file that is allowed to be opened in 'mode' by the 'file' object. If
- the interpreter is not sandboxed then return a false value.
+* int PySandbox_SetAllowedFile(PyThreadState *, string path,
+ string mode)
+ Add a file that is allowed to be opened in 'mode' by the 'file'
+ object. If the interpreter is not sandboxed then return a false
+ value.
* PySandbox_AllowedPath(string path, string mode, error_return)
- Macro that causes the caller to return with 'error_return' and raise
- SandboxError as the
- exception if the specified path with 'mode' is not allowed, otherwise do
- nothing.
+ Macro that causes the caller to return with 'error_return' and
+ raise SandboxError as the exception if the specified path with
+ 'mode' is not allowed, otherwise do nothing.
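As a sketch (not spelled out in the proposal itself), a C-level open
routine could guard itself with the macro before touching the
filesystem; sandbox_open is a hypothetical helper:

    static PyObject *
    sandbox_open(const char *path, const char *mode)
    {
        /* Causes this function to return NULL with SandboxError set
           if 'path' may not be opened in 'mode'. */
        PySandbox_AllowedPath(path, mode, NULL);

        /* Only reached when the path/mode pair is on the allowed
           list. */
        return PyFile_FromString((char *)path, (char *)mode);
    }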
Extension Module Importation
@@ -561,33 +591,34 @@
Protection
--------------
-A whitelist of extension modules that may be imported must be provided. A
-default set is given for stdlib modules known to be safe.
+A whitelist of extension modules that may be imported must be provided.
+A default set is given for stdlib modules known to be safe.
-A check in the import machinery will check that a specified module name is
-allowed based on the type of module (Python source, Python bytecode, or
-extension module). Python bytecode files are never directly imported because
-of the possibility of hostile bytecode being present. Python source is always
-considered safe based on the assumption that all resource harm is eventually done at
-the C level, thus Python source code directly cannot cause harm without help of
-C extension modules. Thus only C
-extension modules need to be checked against the whitelist.
-
-The requested extension module name is checked in order to make sure that it
-is on the whitelist if it is a C extension module. If the name is not correct
-a SandboxError exception is raised. Otherwise the import is allowed.
+The import machinery will check that a specified module name is
+allowed based on the type of module (Python source, Python bytecode,
+or extension module). Python bytecode files are never directly
+imported because of the possibility of hostile bytecode being present.
+Python source is always considered safe based on the assumption that
+all resource harm is eventually done at the C level, so Python source
+code cannot directly cause harm without the help of C extension
+modules. Thus only C extension modules need to be checked against the
+whitelist.
+
+The requested extension module name is checked to make sure that it is
+on the whitelist if it is a C extension module. If the name is not on
+the whitelist, a SandboxError exception is raised; otherwise the
+import is allowed.
Even if a Python source code module imports a C extension module in an
-unprotected
-interpreter it is not a problem since the Python source code module is reloaded
-in the sandboxed interpreter. When that Python source module is freshly
-imported the normal import check will be triggered to prevent the C extension
-module from becoming available to the sandboxed interpreter.
-
-For the 'os' module, a special sandboxed version will be used if the proper
-C extension module providing the correct abilities is not allowed. This will
-default to '/' as the path separator and provide as much reasonable abilities
-as possible from a pure Python module.
+unprotected interpreter, it is not a problem since the Python source
+code module is reloaded in the sandboxed interpreter. When that Python
+source module is freshly imported, the normal import check will be
+triggered to prevent the C extension module from becoming available to
+the sandboxed interpreter.
+
+For the 'os' module, a special sandboxed version will be used if the
+proper C extension module providing the correct abilities is not
+allowed. This will default to '/' as the path separator and provide
+as much reasonable functionality as possible from a pure Python
+module.
The 'sys' module is specially addressed in
`Changing the Behaviour of the Interpreter`_.
@@ -600,47 +631,50 @@
Why
--------------
-Because C code is considered unsafe, its use should be regulated. By using a
-whitelist it allows one to explicitly decide that a C extension module is considered safe.
+Because C code is considered unsafe, its use should be regulated.
+Using a whitelist allows one to explicitly decide that a C extension
+module is considered safe.
Possible Security Flaws
-----------------------
-If a whitelisted C extension module imports a non-whitelisted C extension module and
-makes it an attribute of the whitelisted module there will be a breach in security.
-Luckily this a rarity in extension modules.
+If a whitelisted C extension module imports a non-whitelisted C
+extension module and makes it an attribute of the whitelisted module,
+there will be a breach in security. Luckily this is a rarity in
+extension modules.
There is also the issue of a C extension module calling the C API of a
non-whitelisted C extension module.
-Lastly, if a whitelisted C extension module is loaded in an unprotected interpreter and
-then loaded into a sandboxed interpreter then there is no checks
-during module initialization for possible security issues in the sandboxed
-interpreter that would have occurred had the sandboxed interpreter done the
-initial import.
-
-All of these issues can be handled by never blindly whitelisting a C extension
-module. Added support for dealing with C extension modules comes in the form
-of `Extension Module Crippling`_.
+Lastly, if a whitelisted C extension module is loaded in an unprotected
+interpreter and then loaded into a sandboxed interpreter, there are no
+checks during module initialization for possible security issues in
+the sandboxed interpreter that would have occurred had the sandboxed
+interpreter done the initial import.
+
+All of these issues can be handled by never blindly whitelisting a C
+extension module. Added support for dealing with C extension modules
+comes in the form of `Extension Module Crippling`_.
+
API
--------------
* int PySandbox_SetModule(PyThreadState *, string module_name)
Allow the sandboxed interpreter to import 'module_name'. If the
- interpreter is not sandboxed, return a false value. Absolute import paths must be
- specified.
+ interpreter is not sandboxed, return a false value. Absolute
+ import paths must be specified.
* int PySandbox_BlockModule(PyThreadState *, string module_name)
- Remove the specified module from the whitelist. Used to remove modules
- that are allowed by default. Return a false value if called on an
- unprotected interpreter.
+ Remove the specified module from the whitelist. Used to remove
+ modules that are allowed by default. Return a false value if
+ called on an unprotected interpreter.
* PySandbox_AllowedModule(string module_name, error_return)
- Macro that causes the caller to return with 'error_return' and sets the
- exception SandboxError if the specified module cannot be imported,
- otherwise does nothing.
+ Macro that causes the caller to return with 'error_return' and sets
+ the exception SandboxError if the specified module cannot be
+ imported, otherwise does nothing.
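A sketch of the embedding side, assuming a sandboxed thread state has
already been created; the variable sandbox_tstate and the particular
module names are only examples, not part of the proposal:

    static int
    setup_allowed_modules(PyThreadState *sandbox_tstate)
    {
        /* Explicitly allow two extension modules... */
        if (!PySandbox_SetModule(sandbox_tstate, "math"))
            return 0;   /* not a sandboxed interpreter */
        if (!PySandbox_SetModule(sandbox_tstate, "cStringIO"))
            return 0;
        /* ...and drop one assumed to be allowed by default. */
        return PySandbox_BlockModule(sandbox_tstate, "binascii");
    }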
Extension Module Crippling
@@ -649,29 +683,29 @@
Protection
--------------
-By providing a C API for checking for allowed abilities, modules that have some
-useful functionality can do proper security checks for those functions that
-could provide insecure abilities while allowing safe code to be used (and thus
-not fully deny importation).
+By providing a C API for checking for allowed abilities, modules that
+have some useful functionality can do proper security checks for those
+functions that could provide insecure abilities while allowing safe
+code to be used (and thus not fully denying importation).
Why
--------------
-Consider a module that provides a string processing ability. If that module
-provides a single convenience function that reads its input string from a file
-(with a specified path), the whole module should not be blocked from being
-used, just that convenience function. By whitelisting the module but having a
-security check on the one problem function, the user can still gain access to
-the safe functions. Even better, the unsafe function can be allowed if the
-security checks pass.
+Consider a module that provides a string processing ability. If that
+module provides a single convenience function that reads its input
+string from a file (with a specified path), the whole module should not
+be blocked from being used, just that convenience function. By
+whitelisting the module but having a security check on the one problem
+function, the user can still gain access to the safe functions. Even
+better, the unsafe function can be allowed if the security checks pass.
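To make the scenario concrete, a sketch of such a crippled convenience
function, reusing the file macro introduced earlier; the function name
and the elided processing step are invented for illustration:

    static PyObject *
    strproc_process_file(PyObject *self, PyObject *args)
    {
        char *path;

        if (!PyArg_ParseTuple(args, "s", &path))
            return NULL;

        /* Only this unsafe entry point is guarded; if the path is not
           allowed this returns NULL with SandboxError set, while the
           rest of the module keeps working in a sandboxed
           interpreter. */
        PySandbox_AllowedPath(path, "r", NULL);

        /* ... read the file and run the normal string processing ... */
        Py_RETURN_NONE;
    }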
Possible Security Flaws
-----------------------
-If a C extension module developer incorrectly implements the security checks
-for the unsafe functions it could lead to undesired abilities.
+If a C extension module developer incorrectly implements the security
+checks for the unsafe functions, it could lead to undesired abilities.
API
@@ -692,9 +726,10 @@
Why
--------------
-Without implementing a bytecode verification tool, there is no way of making
-sure that bytecode does not jump outside its bounds, thus possibly executing
-malicious code. It also presents the possibility of crashing the interpreter.
+Without implementing a bytecode verification tool, there is no way of
+making sure that bytecode does not jump outside its bounds, thus
+possibly executing malicious code. It also presents the possibility of
+crashing the interpreter.
Possible Security Flaws
@@ -746,15 +781,15 @@
--------------
Filesystem information must be removed. Any settings that could
-possibly lead to a DoS attack (e.g., sys.setrecursionlimit()) or risk crashing
-the interpreter must also be removed.
+possibly lead to a DoS attack (e.g., sys.setrecursionlimit()) or risk
+crashing the interpreter must also be removed.
Possible Security Flaws
-----------------------
-Exposing something that could lead to future security problems (e.g., a way to
-crash the interpreter).
+Exposing something that could lead to future security problems (e.g., a
+way to crash the interpreter).
API
@@ -769,15 +804,15 @@
Protection
--------------
-Allow sending and receiving data to/from specific IP addresses on specific
-ports.
+Allow sending and receiving data to/from specific IP addresses on
+specific ports.
-open() is to be used as a factory function to open a network connection. If
-the connection is not possible (either because of an invalid address or
-security reasons), SandboxError is raised.
+open() is to be used as a factory function to open a network
+connection. If the connection is not possible (either because of an
+invalid address or security reasons), SandboxError is raised.
-A socket object may not be returned by the call. A proxy to handle security
-might be returned instead.
+A socket object may not be returned by the call. A proxy to handle
+security might be returned instead.
XXX
@@ -785,41 +820,43 @@
Why
--------------
-Allowing arbitrary sending of data over sockets can lead to DoS attacks on the
-network and other machines. Limiting accepting data prevents your machine from
-being attacked by accepting malicious network connections. It also allows you
-to know exactly where communication is going to and coming from.
+Allowing arbitrary sending of data over sockets can lead to DoS attacks
+on the network and other machines. Limiting the acceptance of data
+prevents your machine from being attacked via malicious network
+connections. It also allows you to know exactly where communication is
+going to and coming from.
Possible Security Flaws
-----------------------
-If someone managed to influence the used DNS server to influence what IP
-addresses were used after a DNS lookup.
+If someone managed to influence the DNS server used, they could
+control what IP addresses are returned by a DNS lookup.
API
--------------
* int PySandbox_SetIPAddress(PyThreadState *, string IP, integer port)
- Allow the sandboxed interpreter to send/receive to the specified 'IP'
- address on the specified 'port'. If the interpreter is not sandboxed,
- return a false value.
+ Allow the sandboxed interpreter to send/receive to the specified
+ 'IP' address on the specified 'port'. If the interpreter is not
+ sandboxed, return a false value.
* PySandbox_AllowedIPAddress(string IP, integer port, error_return)
- Macro to verify that the specified 'IP' address on the specified 'port' is
- allowed to be communicated with. If not, cause the caller to return with
- 'error_return' and SandboxError exception set, otherwise do nothing.
+ Macro to verify that the specified 'IP' address on the specified
+ 'port' is allowed to be communicated with. If not, cause the
+ caller to return with 'error_return' and SandboxError exception
+ set, otherwise do nothing.
* int PySandbox_SetHost(PyThreadState *, string host, integer port)
- Allow the sandboxed interpreter to send/receive to the specified 'host' on
- the specified 'port'. If the interpreter is not sandboxed, return a false
- value.
+ Allow the sandboxed interpreter to send/receive to the specified
+ 'host' on the specified 'port'. If the interpreter is not
+ sandboxed, return a false value.
* PySandbox_AllowedHost(string host, integer port, error_return)
- Check that the specified 'host' on the specified 'port' is allowed to be
- communicated with. If not, set a SandboxError exception and cause the caller to
- return 'error_return', otherwise do nothing.
+ Check that the specified 'host' on the specified 'port' is allowed
+ to be communicated with. If not, set a SandboxError exception and
+ cause the caller to return 'error_return', otherwise do nothing.
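For illustration, assuming the embedding application has already
called PySandbox_SetHost(tstate, "www.python.org", 80), a C-level
connect helper could be guarded like this (sandbox_connect is a
hypothetical name):

    static int
    sandbox_connect(const char *host, int port)
    {
        /* Causes this function to return -1 with SandboxError set if
           the host/port pair has not been explicitly allowed. */
        PySandbox_AllowedHost(host, port, -1);

        /* ... perform the real connect() here and return its
           result ... */
        return 0;
    }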
Network Information
@@ -828,10 +865,10 @@
Protection
--------------
-Limit what information can be gleaned about the network the system is running
-on. This does not include restricting information on IP addresses and hosts
-that are have been explicitly allowed for the sandboxed interpreter to
-communicate with.
+Limit what information can be gleaned about the network the system is
+running on. This does not include restricting information on IP
+addresses and hosts that have been explicitly allowed for the
+sandboxed interpreter to communicate with.
XXX
@@ -839,31 +876,31 @@
Why
--------------
-With enough information from the network several things could occur. One is
-that someone could possibly figure out where your machine is on the Internet.
-Another is that enough information about the network you are connected to could
-be used against it in an attack.
+With enough information from the network several things could occur.
+One is that someone could possibly figure out where your machine is on
+the Internet. Another is that enough information about the network you
+are connected to could be used against it in an attack.
Possible Security Flaws
-----------------------
-As long as usage is restricted to only what is needed to work with allowed
-addresses, there are no security issues to speak of.
+As long as usage is restricted to only what is needed to work with
+allowed addresses, there are no security issues to speak of.
API
--------------
* int PySandbox_SetNetworkInfo(PyThreadState *)
- Allow the sandboxed interpreter to get network information regardless of
- whether the IP or host address is explicitly allowed. If the interpreter
- is not sandboxed, return a false value.
+ Allow the sandboxed interpreter to get network information
+ regardless of whether the IP or host address is explicitly allowed.
+ If the interpreter is not sandboxed, return a false value.
* PySandbox_AllowedNetworkInfo(error_return)
- Macro that will return 'error_return' for the caller and set a SandboxError exception
- if the sandboxed interpreter does not allow checking for arbitrary network
- information, otherwise do nothing.
+ Macro that will return 'error_return' for the caller and set a
+ SandboxError exception if the sandboxed interpreter does not allow
+ checking for arbitrary network information, otherwise do nothing.
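A sketch of a guarded hostname lookup, assuming the usual C library
gethostname() is available; sandbox_gethostname is a hypothetical
helper:

    #include <unistd.h>   /* gethostname() */

    static PyObject *
    sandbox_gethostname(PyObject *self)
    {
        char name[256];

        /* Returns NULL with SandboxError set unless arbitrary network
           information has been allowed for this interpreter. */
        PySandbox_AllowedNetworkInfo(NULL);

        if (gethostname(name, sizeof(name)) < 0)
            return PyErr_SetFromErrno(PyExc_OSError);
        return PyString_FromString(name);
    }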
Filesystem Information
@@ -872,8 +909,9 @@
Protection
--------------
-Do not allow information about the filesystem layout from various parts of
-Python to be exposed. This means blocking exposure at the Python level to:
+Do not allow information about the filesystem layout from various parts
+of Python to be exposed. This means blocking exposure at the Python
+level to:
* __file__ attribute on modules
* __path__ attribute on packages
@@ -884,9 +922,9 @@
Why
--------------
-Exposing information about the filesystem is not allowed. You can figure out
-what operating system one is on which can lead to vulnerabilities specific to
-that operating system being exploited.
+Exposing information about the filesystem lets one figure out what
+operating system is in use, which can lead to vulnerabilities specific
+to that operating system being exploited.
Possible Security Flaws
@@ -899,13 +937,13 @@
--------------
* int PySandbox_SetFilesystemInfo(PyThreadState *)
- Allow the sandboxed interpreter to expose filesystem information. If the
- passed-in interpreter is not sandboxed, return NULL.
+ Allow the sandboxed interpreter to expose filesystem information.
+ If the passed-in interpreter is not sandboxed, return NULL.
* PySandbox_AllowedFilesystemInfo(error_return)
- Macro that checks if exposing filesystem information is allowed. If it is
- not, cause the caller to return with the value of 'error_return' and raise
- SandboxError, otherwise do nothing.
+ Macro that checks if exposing filesystem information is allowed.
+ If it is not, cause the caller to return with the value of
+ 'error_return' and raise SandboxError, otherwise do nothing.
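As an illustrative sketch, code that would normally publish a module's
file name could consult the check first; set_module_file is a made-up
helper, not part of the proposal:

    static int
    set_module_file(PyObject *module, const char *filename)
    {
        /* Causes this function to return -1 with SandboxError set if
           exposing filesystem information is not allowed. */
        PySandbox_AllowedFilesystemInfo(-1);

        return PyModule_AddStringConstant(module, "__file__", filename);
    }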
Stdin, Stdout, and Stderr
@@ -914,9 +952,9 @@
Protection
--------------
-By default, sys.__stdin__, sys.__stdout__, and sys.__stderr__ will be set to
-instances of StringIO. Explicit allowance of the process' stdin, stdout, and
-stderr is possible.
+By default, sys.__stdin__, sys.__stdout__, and sys.__stderr__ will be
+set to instances of StringIO. Explicit allowance of the process'
+stdin, stdout, and stderr is possible.
This will protect the 'print' statement, and the built-ins input() and
raw_input().
@@ -941,97 +979,104 @@
* int PySandbox_SetTrueStdin(PyThreadState *)
int PySandbox_SetTrueStdout(PyThreadState *)
int PySandbox_SetTrueStderr(PyThreadState *)
- Set the specific stream for the interpreter to the true version of the
- stream and not to the default instance of StringIO. If the interpreter is
- not sandboxed, return a false value.
+ Set the specific stream for the interpreter to the true version of
+ the stream and not to the default instance of StringIO. If the
+ interpreter is not sandboxed, return a false value.
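A sketch of an embedding application that decides to trust the
sandboxed code with the real output streams; allow_real_output and
sandbox_tstate are illustrative names only:

    static int
    allow_real_output(PyThreadState *sandbox_tstate)
    {
        /* Each call returns a false value if the interpreter behind
           'sandbox_tstate' is not sandboxed. */
        if (!PySandbox_SetTrueStdout(sandbox_tstate))
            return 0;
        return PySandbox_SetTrueStderr(sandbox_tstate);
    }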
Adding New Protections
=============================
-.. note:: This feature has the lowest priority and thus will be the last feature
- implemented (if ever).
+.. note:: This feature has the lowest priority and thus will be the
+ last feature implemented (if ever).
Protection
--------------
-Allow for extensibility in the security model by being able to add new types of
-checks. This allows not only for Python to add new security protections in a
-backwards-compatible fashion, but to also have extension modules add their own
-as well.
-
-An extension module can introduce a group for its various values to check, with
-a type being a specific value within a group. The "Python" group is
-specifically reserved for use by the Python core itself.
+Allow for extensibility in the security model by being able to add new
+types of checks. This allows not only for Python to add new security
+protections in a backwards-compatible fashion, but also for extension
+modules to add their own.
+
+An extension module can introduce a group for its various values to
+check, with a type being a specific value within a group. The "Python"
+group is specifically reserved for use by the Python core itself.
Why
--------------
-We are all human. There is the possibility that a need for a new type of
-protection for the interpreter will present itself and thus need support. By
-providing an extensible way to add new protections it helps to future-proof the
-system.
+We are all human. There is the possibility that a need for a new type
+of protection for the interpreter will present itself and thus need
+support. By providing an extensible way to add new protections it
+helps to future-proof the system.
It also allows extension modules to present their own set of security
-protections. That way one extension module can use the protection scheme
-presented by another that it is dependent upon.
+protections. That way one extension module can use the protection
+scheme presented by another that it is dependent upon.
Possible Security Flaws
------------------------
-Poor definitions by extension module users of how their protections should be
-used would allow for possible exploitation.
+Poor definitions by extension module authors of how their protections
+should be used would allow for possible exploitation.
API
--------------
+ Bool
- * int PySandbox_SetExtendedFlag(PyThreadState *, string group, string type)
+ * int PySandbox_SetExtendedFlag(PyThreadState *, string group,
+ string type)
Set a group-type to be true. Expected use is for when a binary
- possibility of something is needed and that the default is to not allow
- use of the resource (e.g., network information). Returns a false value
- if used on an unprotected interpreter.
-
- * PySandbox_AllowedExtendedFlag(string group, string type, error_return)
- Macro that if the group-type is not set to true, cause the caller to
- return with 'error_return' with SandboxError exception raised. For unprotected
- interpreters the check does nothing.
+ possibility of something is needed and the default is to
+ not allow use of the resource (e.g., network information).
+ Returns a false value if used on an unprotected interpreter.
+
+ * PySandbox_AllowedExtendedFlag(string group, string type,
+ error_return)
+ Macro that, if the group-type is not set to true, causes the
+ caller to return with 'error_return' and a SandboxError
+ exception raised. For unprotected interpreters the check does
+ nothing.
+ Numeric Range
- * int PySandbox_SetExtendedCap(PyThreadState *, string group, string type,
- integer cap)
- Set a group-type to a capped value, 'cap', with the initial allocated value set to 0.
- Expected use is when a resource has a capped amount of use (e.g.,
- memory). Returns a false value if the interpreter is not sandboxed.
+ * int PySandbox_SetExtendedCap(PyThreadState *, string group,
+ string type, integer cap)
+ Set a group-type to a capped value, 'cap', with the initial
+ allocated value set to 0. Expected use is when a resource has
+ a capped amount of use (e.g., memory). Returns a false value
+ if the interpreter is not sandboxed.
* PySandbox_AllowedExtendedAlloc(integer increase, error_return)
- Macro to raise the amount of a resource is used by 'increase'. If the
- increase pushes the resource allocation past the set cap, then return
- 'error_return' and set SandboxError as the exception, otherwise do
- nothing.
+ Macro to raise the amount of a resource used by 'increase'.
+ If the increase pushes the resource allocation past the set
+ cap, then return 'error_return' and set SandboxError as the
+ exception, otherwise do nothing.
* PySandbox_AllowedExtendedFree(integer decrease, error_return)
- Macro to lower the amount a resource is used by 'decrease'. If the
- decrease pushes the allotment to below 0 then have the caller return
- 'error_return' and set SandboxError as the exception, otherwise do
- nothing.
+ Macro to lower the amount of a resource used by 'decrease'.
+ If the decrease pushes the allotment below 0, then have the
+ caller return 'error_return' and set SandboxError as the
+ exception, otherwise do nothing.
+ Membership
- * int PySandbox_SetExtendedMembership(PyThreadState *, string group,
- string type, string member)
- Add a string, 'member', to be considered a member of a group-type (e.g., allowed
- file paths). If the interpreter is not an sandboxed interpreter,
- return a false value.
+ * int PySandbox_SetExtendedMembership(PyThreadState *,
+ string group, string type,
+ string member)
+ Add a string, 'member', to be considered a member of a
+ group-type (e.g., allowed file paths). If the interpreter is not
+ a sandboxed interpreter, return a false value.
* PySandbox_AllowedExtendedMembership(string group, string type,
- string member, error_return)
- Macro that checks 'member' is a member of the values set for the
- group-type. If it is not, then have the caller return 'error_return'
- and set an exception for SandboxError, otherwise does nothing.
+ string member,
+ error_return)
+ Macro that checks that 'member' is a member of the values set
+ for the group-type. If it is not, then have the caller return
+ 'error_return' and set an exception for SandboxError, otherwise
+ do nothing.
+ Specific Value
* int PySandbox_SetExtendedValue(PyThreadState *, string group,
@@ -1039,10 +1084,11 @@
Set a group-type to 'value'. If the interpreter is not
sandboxed, return NULL.
- * PySandbox_AllowedExtendedValue(string group, string type, string value, error_return)
- Macro to check that the group-type is set to 'value'. If it is not,
- then have the caller return 'error_return' and set an exception for
- SandboxError, otherwise do nothing.
+ * PySandbox_AllowedExtendedValue(string group, string type,
+ string value, error_return)
+ Macro to check that the group-type is set to 'value'. If it is
+ not, then have the caller return 'error_return' and set an
+ exception for SandboxError, otherwise do nothing.
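To illustrate the intended division of labour, a sketch of a
third-party extension module that defines its own group and checks a
boolean protection before doing something sensitive; the "spam" and
"launch" names and the spam_launch function are invented:

    static PyObject *
    spam_launch(PyObject *self)
    {
        /* Returns NULL with SandboxError raised unless the embedding
           application called
           PySandbox_SetExtendedFlag(tstate, "spam", "launch"). */
        PySandbox_AllowedExtendedFlag("spam", "launch", NULL);

        /* ... the sensitive operation would go here ... */
        Py_RETURN_NONE;
    }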
Python API
@@ -1051,8 +1097,9 @@
__sandboxed__
--------------
-A built-in that flags whether the interpreter currently running is sandboxed or
-not. Set to a 'bool' value that is read-only. To mimic working of __debug__.
+A built-in that flags whether the interpreter currently running is
+sandboxed or not. Set to a read-only 'bool' value, mimicking the
+workings of __debug__.
sandbox module
From python-checkins at python.org Sat Jul 8 03:50:31 2006
From: python-checkins at python.org (nick.coghlan)
Date: Sat, 8 Jul 2006 03:50:31 +0200 (CEST)
Subject: [Python-checkins] r50488 - peps/trunk/pep-0361.txt
Message-ID: <20060708015031.AFCE91E400A@bag.python.org>
Author: nick.coghlan
Date: Sat Jul 8 03:50:31 2006
New Revision: 50488
Modified:
peps/trunk/pep-0361.txt
Log:
Make note about __dir__() in the Python 2.6 PEP
Modified: peps/trunk/pep-0361.txt
==============================================================================
--- peps/trunk/pep-0361.txt (original)
+++ peps/trunk/pep-0361.txt Sat Jul 8 03:50:31 2006
@@ -78,6 +78,7 @@
- distutils replacement (requires a PEP)
- turtle.py replacement or enhancements
+ - adding a __dir__() magic method to control dir() [1]
New modules in the standard library:
@@ -113,11 +114,16 @@
http://python.org/sf/1515609
http://python.org/sf/1515361
- How should -m work with packages?
- How should -m work with relative imports?
+ How should -m (and __main__ in general) work with relative imports?
http://mail.python.org/pipermail/python-dev/2006-June/066161.html
+ (also see the section on main modules in PEP 338)
+References
+
+ [1] Adding a __dir__() magic method
+ http://mail.python.org/pipermail/python-dev/2006-July/067139.html
+
Copyright
This document has been placed in the public domain.
From python-checkins at python.org Sat Jul 8 07:31:38 2006
From: python-checkins at python.org (neal.norwitz)
Date: Sat, 8 Jul 2006 07:31:38 +0200 (CEST)
Subject: [Python-checkins] r50489 - in python/trunk:
Lib/test/test_compile.py Misc/NEWS Python/ast.c
Message-ID: <20060708053138.B96801E4008@bag.python.org>
Author: neal.norwitz
Date: Sat Jul 8 07:31:37 2006
New Revision: 50489
Modified:
python/trunk/Lib/test/test_compile.py
python/trunk/Misc/NEWS
python/trunk/Python/ast.c
Log:
Fix SF bug #1519018: 'as' is now validated properly in import statements
Modified: python/trunk/Lib/test/test_compile.py
==============================================================================
--- python/trunk/Lib/test/test_compile.py (original)
+++ python/trunk/Lib/test/test_compile.py Sat Jul 8 07:31:37 2006
@@ -238,6 +238,8 @@
succeed = [
'import sys',
'import os, sys',
+ 'import os as bar',
+ 'import os.path as bar',
'from __future__ import nested_scopes, generators',
'from __future__ import (nested_scopes,\ngenerators)',
'from __future__ import (nested_scopes,\ngenerators,)',
@@ -257,6 +259,8 @@
'import (sys',
'import sys)',
'import (os,)',
+ 'import os As bar',
+ 'import os.path a bar',
'from (sys) import stdin',
'from __future__ import (nested_scopes',
'from __future__ import nested_scopes)',
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Sat Jul 8 07:31:37 2006
@@ -22,6 +22,8 @@
omit a default "error" argument for NULL pointer. This allows
the parser to take a codec from cjkcodecs again.
+- Bug #1519018: 'as' is now validated properly in import statements.
+
Library
-------
Modified: python/trunk/Python/ast.c
==============================================================================
--- python/trunk/Python/ast.c (original)
+++ python/trunk/Python/ast.c Sat Jul 8 07:31:37 2006
@@ -2142,7 +2142,14 @@
loop:
switch (TYPE(n)) {
case import_as_name:
- str = (NCH(n) == 3) ? NEW_IDENTIFIER(CHILD(n, 2)) : NULL;
+ str = NULL;
+ if (NCH(n) == 3) {
+ if (strcmp(STR(CHILD(n, 1)), "as") != 0) {
+ ast_error(n, "must use 'as' in import");
+ return NULL;
+ }
+ str = NEW_IDENTIFIER(CHILD(n, 2));
+ }
return alias(NEW_IDENTIFIER(CHILD(n, 0)), str, c->c_arena);
case dotted_as_name:
if (NCH(n) == 1) {
@@ -2151,6 +2158,10 @@
}
else {
alias_ty a = alias_for_import_name(c, CHILD(n, 0));
+ if (strcmp(STR(CHILD(n, 1)), "as") != 0) {
+ ast_error(n, "must use 'as' in import");
+ return NULL;
+ }
assert(!a->asname);
a->asname = NEW_IDENTIFIER(CHILD(n, 2));
return a;
From python-checkins at python.org Sat Jul 8 14:15:27 2006
From: python-checkins at python.org (georg.brandl)
Date: Sat, 8 Jul 2006 14:15:27 +0200 (CEST)
Subject: [Python-checkins] r50490 - python/trunk/Lib/test/test_compile.py
Message-ID: <20060708121527.A54D41E4008@bag.python.org>
Author: georg.brandl
Date: Sat Jul 8 14:15:27 2006
New Revision: 50490
Modified:
python/trunk/Lib/test/test_compile.py
Log:
Add an additional test for bug #1519018.
Modified: python/trunk/Lib/test/test_compile.py
==============================================================================
--- python/trunk/Lib/test/test_compile.py (original)
+++ python/trunk/Lib/test/test_compile.py Sat Jul 8 14:15:27 2006
@@ -261,6 +261,8 @@
'import (os,)',
'import os As bar',
'import os.path a bar',
+ 'from sys import stdin As stdout',
+ 'from sys import stdin a stdout',
'from (sys) import stdin',
'from __future__ import (nested_scopes',
'from __future__ import nested_scopes)',
From g.brandl at gmx.net Sat Jul 8 14:17:34 2006
From: g.brandl at gmx.net (Georg Brandl)
Date: Sat, 08 Jul 2006 14:17:34 +0200
Subject: [Python-checkins] r50489 - in python/trunk:
Lib/test/test_compile.py Misc/NEWS Python/ast.c
In-Reply-To: <20060708053138.B96801E4008@bag.python.org>
References: <20060708053138.B96801E4008@bag.python.org>
Message-ID:
neal.norwitz wrote:
> Author: neal.norwitz
> Date: Sat Jul 8 07:31:37 2006
> New Revision: 50489
> Modified: python/trunk/Python/ast.c
> ==============================================================================
> --- python/trunk/Python/ast.c (original)
> +++ python/trunk/Python/ast.c Sat Jul 8 07:31:37 2006
> @@ -2142,7 +2142,14 @@
> loop:
> switch (TYPE(n)) {
> case import_as_name:
> - str = (NCH(n) == 3) ? NEW_IDENTIFIER(CHILD(n, 2)) : NULL;
> + str = NULL;
> + if (NCH(n) == 3) {
> + if (strcmp(STR(CHILD(n, 1)), "as") != 0) {
> + ast_error(n, "must use 'as' in import");
Note that 2.4 had just "invalid syntax" here (since "as" is supposed to be
a quasi-keyword in this place, that makes sense).
Georg
From buildbot at python.org Sat Jul 8 21:29:27 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sat, 08 Jul 2006 19:29:27 +0000
Subject: [Python-checkins] buildbot failure in x86 XP-2 trunk
Message-ID: <20060708192927.ADD771E4004@bag.python.org>
The Buildbot has detected a new failure of x86 XP-2 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520XP-2%2520trunk/builds/743
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl,neal.norwitz
BUILD FAILED: failed failed slave lost
sincerely,
-The Buildbot
From python-checkins at python.org Sat Jul 8 21:55:05 2006
From: python-checkins at python.org (tim.peters)
Date: Sat, 8 Jul 2006 21:55:05 +0200 (CEST)
Subject: [Python-checkins] r50491 - python/trunk/Lib/lib-tk/turtle.py
Message-ID: <20060708195505.A73501E4004@bag.python.org>
Author: tim.peters
Date: Sat Jul 8 21:55:05 2006
New Revision: 50491
Modified:
python/trunk/Lib/lib-tk/turtle.py
Log:
Whitespace normalization.
Modified: python/trunk/Lib/lib-tk/turtle.py
==============================================================================
--- python/trunk/Lib/lib-tk/turtle.py (original)
+++ python/trunk/Lib/lib-tk/turtle.py Sat Jul 8 21:55:05 2006
@@ -365,8 +365,8 @@
>>> turtle.circle(120, 180) # half a circle
"""
if extent is None:
- extent = self._fullcircle
- frac = abs(extent)/self._fullcircle
+ extent = self._fullcircle
+ frac = abs(extent)/self._fullcircle
steps = 1+int(min(11+abs(radius)/6.0, 59.0)*frac)
w = 1.0 * extent / steps
w2 = 0.5 * w
From python-checkins at python.org Sun Jul 9 04:58:47 2006
From: python-checkins at python.org (mateusz.rukowicz)
Date: Sun, 9 Jul 2006 04:58:47 +0200 (CEST)
Subject: [Python-checkins] r50492 - sandbox/trunk/decimal-c/_decimal.c
Message-ID: <20060709025847.52ECB1E4008@bag.python.org>
Author: mateusz.rukowicz
Date: Sun Jul 9 04:58:46 2006
New Revision: 50492
Modified:
sandbox/trunk/decimal-c/_decimal.c
Log:
Some minor cleanups. Comments updated a little bit.
Modified: sandbox/trunk/decimal-c/_decimal.c
==============================================================================
--- sandbox/trunk/decimal-c/_decimal.c (original)
+++ sandbox/trunk/decimal-c/_decimal.c Sun Jul 9 04:58:46 2006
@@ -11,13 +11,10 @@
- Rounding constants are integers.
- There's no handle method on the exceptions.
- Context methods don't accept keyword arguments (they're all
- called a and b anyways).
+ called a and b anyways). This probably will be changed.
- Special values are represented by special sign values, so
the results of as_tuple() differ.
- Many internal "underscore" methods are not accessible on the object.
- - In principle, exponents and the number of digits are unbounded in
- Python decimals, since they could use long integers. While that's
- praiseworthy, it's also not very speedy ;)
- The Python version sometimes gets the context in the middle of
a method which leads to some calls succeeding even if there is
no valid context.
@@ -32,14 +29,11 @@
I guess that this causes slowdown here and there, and the proper way
to do this is to subclass dict to call back some method on assignment.
(This is better done in Python code, I guess.)
- - Some internal methods (names starting with underscores) have been
- added as a method to the C objects to allow the overlay Python
- module to use them. They should be removed from that list, along
- with the wrapper functions parsing the Python arguments, once every-
- thing is coded in C.
- Some special (__xxx__) methods in the Python code took additional
arguments used internally. Since that's not possible, there's a
_do_decimal_xxx along with a decimal_xxx, see below.
+ - Arithmetic functions (like __add__, __mul__) have only one operand,
+ so we cannot use a custom context.
This is not yet optimized C code, but mostly translated Python that
@@ -61,6 +55,12 @@
We were undecided whether to offer a public C API (PyDecimal_FromString etc.)
If you want to do this, look at the datetime module.
+
+ This code still needs some clean-ups and improvements, i.e. Context isn't
+ safe for subclassing. Context should become a gc object. 'digits' will
+ be removed from decimalobject. Also, I am not really sure if the code is
+ readable, especially after switching from longs to exp_t - this will
+ need a little work too.
*/
#include "Python.h"
@@ -539,10 +539,10 @@
}
#ifdef BIG_EXP
-/* TODO this implementation assumes that every limb, which
- * is not used is 0. Since it's not really fast, this will
- * change */
-
+/* TODO I have to *limit* passing arguments by value,
+ * because it really slows down the implementation. Most probably
+ * it will also dynamically switch between C longs/bigint arithmetic
+ * to make things go faster */
static exp_t
exp_from_i(long a) {
exp_t ret;
@@ -801,7 +801,6 @@
static exp_t
exp_mul(exp_t a, exp_t b) {
exp_t ret;
-/* memset(ret.limbs, 0, sizeof(long) *EXP_LIMB_COUNT); */
ret.size = _limb_multiply_core(a.limbs, a.size, b.limbs, b.size, ret.limbs);
ret.sign = a.sign ^ b.sign;
if (ret.size == 1 && ret.limbs[0] == 0)
@@ -822,7 +821,6 @@
for (i=0 ; i=0; i--) {
@@ -3360,8 +3358,6 @@
if (ISSPECIAL(self))
return PyInt_FromLong(0);
- /* XXX: Overflow? */
-/* return PyInt_FromSsize_t(exp_to_i(ADJUSTED(self))); */
return exp_to_pyobj(ADJUSTED(self));
}
@@ -4749,7 +4745,7 @@
}
DECIMAL_SPECIAL_2FUNC(decimal_floor_div)
-
+/* XXX what's the difference between div and truediv? =] */
static decimalobject *
_do_decimal_true_div(decimalobject *self, decimalobject *other,
contextobject *ctx)
@@ -4899,8 +4895,6 @@
}
firstprec = ctx->prec;
- /* XXX I am not really sure is this ok, in python implementation it is
- * prec + 1 + len(str(n))*/
ctx->prec += 1;
{
long t = n;
@@ -5001,8 +4995,6 @@
Py_DECREF(mul);
Py_XDECREF(ret);
return NULL;
- /* XXX */
- Py_RETURN_NONE;
}
From python-checkins at python.org Sun Jul 9 18:16:35 2006
From: python-checkins at python.org (neil.schemenauer)
Date: Sun, 9 Jul 2006 18:16:35 +0200 (CEST)
Subject: [Python-checkins] r50493 - in python/trunk: Lib/test/test_ast.py
Lib/test/test_scope.py Misc/NEWS Python/ast.c Python/compile.c
Message-ID: <20060709161635.ED8D41E4017@bag.python.org>
Author: neil.schemenauer
Date: Sun Jul 9 18:16:34 2006
New Revision: 50493
Modified:
python/trunk/Lib/test/test_ast.py
python/trunk/Lib/test/test_scope.py
python/trunk/Misc/NEWS
python/trunk/Python/ast.c
python/trunk/Python/compile.c
Log:
Fix AST compiler bug #1501934: incorrect LOAD/STORE_GLOBAL generation.
Modified: python/trunk/Lib/test/test_ast.py
==============================================================================
--- python/trunk/Lib/test/test_ast.py (original)
+++ python/trunk/Lib/test/test_ast.py Sun Jul 9 18:16:34 2006
@@ -160,7 +160,7 @@
('Module', [('FunctionDef', (1, 0), 'f', ('arguments', [], None, None, []), [('Return', (1, 8), ('Num', (1, 15), 1))], [])]),
('Module', [('Delete', (1, 0), [('Name', (1, 4), 'v', ('Del',))])]),
('Module', [('Assign', (1, 0), [('Name', (1, 0), 'v', ('Store',))], ('Num', (1, 4), 1))]),
-('Module', [('AugAssign', (1, 0), ('Name', (1, 0), 'v', ('Load',)), ('Add',), ('Num', (1, 5), 1))]),
+('Module', [('AugAssign', (1, 0), ('Name', (1, 0), 'v', ('Store',)), ('Add',), ('Num', (1, 5), 1))]),
('Module', [('Print', (1, 0), ('Name', (1, 8), 'f', ('Load',)), [('Num', (1, 11), 1)], False)]),
('Module', [('For', (1, 0), ('Name', (1, 4), 'v', ('Store',)), ('Name', (1, 9), 'v', ('Load',)), [('Pass', (1, 11))], [])]),
('Module', [('While', (1, 0), ('Name', (1, 6), 'v', ('Load',)), [('Pass', (1, 8))], [])]),
Modified: python/trunk/Lib/test/test_scope.py
==============================================================================
--- python/trunk/Lib/test/test_scope.py (original)
+++ python/trunk/Lib/test/test_scope.py Sun Jul 9 18:16:34 2006
@@ -299,6 +299,17 @@
else:
raise TestFailed
+# test for bug #1501934: incorrect LOAD/STORE_GLOBAL generation
+global_x = 1
+def f():
+ global_x += 1
+try:
+ f()
+except UnboundLocalError:
+ pass
+else:
+ raise TestFailed, 'scope of global_x not correctly determined'
+
print "14. complex definitions"
def makeReturner(*lst):
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Sun Jul 9 18:16:34 2006
@@ -12,6 +12,9 @@
Core and builtins
-----------------
+- Bug #1501934: The scope of global variables that are locally assigned
+ using augmented assignment is now correctly determined.
+
- Bug #927248: Recursive method-wrapper objects can now safely
be released.
Modified: python/trunk/Python/ast.c
==============================================================================
--- python/trunk/Python/ast.c (original)
+++ python/trunk/Python/ast.c Sun Jul 9 18:16:34 2006
@@ -339,7 +339,7 @@
/* The ast defines augmented store and load contexts, but the
implementation here doesn't actually use them. The code may be
a little more complex than necessary as a result. It also means
- that expressions in an augmented assignment have no context.
+ that expressions in an augmented assignment have a Store context.
Consider restructuring so that augmented assignment uses
set_context(), too.
*/
@@ -1901,7 +1901,7 @@
if (!expr1)
return NULL;
- /* TODO(jhylton): Figure out why set_context() can't be used here. */
+ /* TODO(nas): Remove duplicated error checks (set_context does it) */
switch (expr1->kind) {
case GeneratorExp_kind:
ast_error(ch, "augmented assignment to generator "
@@ -1923,6 +1923,7 @@
"assignment");
return NULL;
}
+ set_context(expr1, Store, ch);
ch = CHILD(n, 2);
if (TYPE(ch) == testlist)
Modified: python/trunk/Python/compile.c
==============================================================================
--- python/trunk/Python/compile.c (original)
+++ python/trunk/Python/compile.c Sun Jul 9 18:16:34 2006
@@ -3688,7 +3688,8 @@
VISIT(c, expr, auge);
break;
case Name_kind:
- VISIT(c, expr, s->v.AugAssign.target);
+ if (!compiler_nameop(c, e->v.Name.id, Load))
+ return 0;
VISIT(c, expr, s->v.AugAssign.value);
ADDOP(c, inplace_binop(c, s->v.AugAssign.op));
return compiler_nameop(c, e->v.Name.id, Store);
From buildbot at python.org Sun Jul 9 19:29:39 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sun, 09 Jul 2006 17:29:39 +0000
Subject: [Python-checkins] buildbot warnings in ppc Debian unstable trunk
Message-ID: <20060709172939.EF46F1E4009@bag.python.org>
The Buildbot has detected a new failure of ppc Debian unstable trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/ppc%2520Debian%2520unstable%2520trunk/builds/896
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: neil.schemenauer
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Sun Jul 9 21:48:58 2006
From: python-checkins at python.org (tim.peters)
Date: Sun, 9 Jul 2006 21:48:58 +0200 (CEST)
Subject: [Python-checkins] r50494 - python/branches/tim-current_frames
Message-ID: <20060709194858.1C7011E4009@bag.python.org>
Author: tim.peters
Date: Sun Jul 9 21:48:57 2006
New Revision: 50494
Added:
python/branches/tim-current_frames/
- copied from r50493, python/trunk/
Log:
Branch to develop sys._current_frames.
From python-checkins at python.org Sun Jul 9 23:19:30 2006
From: python-checkins at python.org (neil.schemenauer)
Date: Sun, 9 Jul 2006 23:19:30 +0200 (CEST)
Subject: [Python-checkins] r50495 - in python/trunk:
Lib/test/test_compile.py Misc/NEWS Python/ast.c
Message-ID: <20060709211930.31A791E4009@bag.python.org>
Author: neil.schemenauer
Date: Sun Jul 9 23:19:29 2006
New Revision: 50495
Modified:
python/trunk/Lib/test/test_compile.py
python/trunk/Misc/NEWS
python/trunk/Python/ast.c
Log:
Fix SF bug 1441486: bad unary minus folding in compiler.
Modified: python/trunk/Lib/test/test_compile.py
==============================================================================
--- python/trunk/Lib/test/test_compile.py (original)
+++ python/trunk/Lib/test/test_compile.py Sun Jul 9 23:19:29 2006
@@ -211,6 +211,10 @@
self.assertEqual(eval("-" + all_one_bits), -18446744073709551615L)
else:
self.fail("How many bits *does* this machine have???")
+ # Verify treatment of constant folding on -(sys.maxint+1)
+ # i.e. -2147483648 on 32 bit platforms. Should return int, not long.
+ self.assertTrue(isinstance(eval("%s" % (-sys.maxint - 1)), int))
+ self.assertTrue(isinstance(eval("%s" % (-sys.maxint - 2)), long))
def test_sequence_unpacking_error(self):
# Verify sequence packing/unpacking with "or". SF bug #757818
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Sun Jul 9 23:19:29 2006
@@ -12,6 +12,9 @@
Core and builtins
-----------------
+- Bug #1441486: The literal representation of -(sys.maxint - 1)
+ again evaluates to a int object, not a long.
+
- Bug #1501934: The scope of global variables that are locally assigned
using augmented assignment is now correctly determined.
Modified: python/trunk/Python/ast.c
==============================================================================
--- python/trunk/Python/ast.c (original)
+++ python/trunk/Python/ast.c Sun Jul 9 23:19:29 2006
@@ -1484,6 +1484,57 @@
}
static expr_ty
+ast_for_factor(struct compiling *c, const node *n)
+{
+ node *pfactor, *ppower, *patom, *pnum;
+ expr_ty expression;
+
+ /* If the unary - operator is applied to a constant, don't generate
+ a UNARY_NEGATIVE opcode. Just store the appropriate value as a
+ constant. The peephole optimizer already does something like
+ this but it doesn't handle the case where the constant is
+ (sys.maxint - 1). In that case, we want a PyIntObject, not a
+ PyLongObject.
+ */
+ if (TYPE(CHILD(n, 0)) == MINUS
+ && NCH(n) == 2
+ && TYPE((pfactor = CHILD(n, 1))) == factor
+ && NCH(pfactor) == 1
+ && TYPE((ppower = CHILD(pfactor, 0))) == power
+ && NCH(ppower) == 1
+ && TYPE((patom = CHILD(ppower, 0))) == atom
+ && TYPE((pnum = CHILD(patom, 0))) == NUMBER) {
+ char *s = PyObject_MALLOC(strlen(STR(pnum)) + 2);
+ if (s == NULL)
+ return NULL;
+ s[0] = '-';
+ strcpy(s + 1, STR(pnum));
+ PyObject_FREE(STR(pnum));
+ STR(pnum) = s;
+ return ast_for_atom(c, patom);
+ }
+
+ expression = ast_for_expr(c, CHILD(n, 1));
+ if (!expression)
+ return NULL;
+
+ switch (TYPE(CHILD(n, 0))) {
+ case PLUS:
+ return UnaryOp(UAdd, expression, LINENO(n), n->n_col_offset,
+ c->c_arena);
+ case MINUS:
+ return UnaryOp(USub, expression, LINENO(n), n->n_col_offset,
+ c->c_arena);
+ case TILDE:
+ return UnaryOp(Invert, expression, LINENO(n),
+ n->n_col_offset, c->c_arena);
+ }
+ PyErr_Format(PyExc_SystemError, "unhandled factor: %d",
+ TYPE(CHILD(n, 0)));
+ return NULL;
+}
+
+static expr_ty
ast_for_power(struct compiling *c, const node *n)
{
/* power: atom trailer* ('**' factor)*
@@ -1662,30 +1713,12 @@
}
return Yield(exp, LINENO(n), n->n_col_offset, c->c_arena);
}
- case factor: {
- expr_ty expression;
-
+ case factor:
if (NCH(n) == 1) {
n = CHILD(n, 0);
goto loop;
}
-
- expression = ast_for_expr(c, CHILD(n, 1));
- if (!expression)
- return NULL;
-
- switch (TYPE(CHILD(n, 0))) {
- case PLUS:
- return UnaryOp(UAdd, expression, LINENO(n), n->n_col_offset, c->c_arena);
- case MINUS:
- return UnaryOp(USub, expression, LINENO(n), n->n_col_offset, c->c_arena);
- case TILDE:
- return UnaryOp(Invert, expression, LINENO(n), n->n_col_offset, c->c_arena);
- }
- PyErr_Format(PyExc_SystemError, "unhandled factor: %d",
- TYPE(CHILD(n, 0)));
- break;
- }
+ return ast_for_factor(c, n);
case power:
return ast_for_power(c, n);
default:
From python-checkins at python.org Sun Jul 9 23:29:48 2006
From: python-checkins at python.org (guido.van.rossum)
Date: Sun, 9 Jul 2006 23:29:48 +0200 (CEST)
Subject: [Python-checkins] r50496 - peps/trunk/pep-3099.txt
Message-ID: <20060709212948.A5C031E4009@bag.python.org>
Author: guido.van.rossum
Date: Sun Jul 9 23:29:48 2006
New Revision: 50496
Modified:
peps/trunk/pep-3099.txt
Log:
Container literals stay.
Modified: peps/trunk/pep-3099.txt
==============================================================================
--- peps/trunk/pep-3099.txt (original)
+++ peps/trunk/pep-3099.txt Sun Jul 9 23:29:48 2006
@@ -124,6 +124,11 @@
Thread: "Explicit Lexical Scoping (pre-PEP?)",
http://mail.python.org/pipermail/python-dev/2006-July/066995.html
+* We won't be removing container literals.
+ That is, {expr: expr, ...}, [expr, ...] and (expr, ...) will stay.
+
+ Thread: "No Container Literals",
+ http://mail.python.org/pipermail/python-3000/2006-July/002550.html
Builtins
========
From buildbot at python.org Sun Jul 9 23:42:57 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sun, 09 Jul 2006 21:42:57 +0000
Subject: [Python-checkins] buildbot warnings in amd64 gentoo trunk
Message-ID: <20060709214257.9FF6D1E4009@bag.python.org>
The Buildbot has detected a new failure of amd64 gentoo trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/amd64%2520gentoo%2520trunk/builds/1206
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: neil.schemenauer
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Mon Jul 10 00:09:48 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sun, 09 Jul 2006 22:09:48 +0000
Subject: [Python-checkins] buildbot warnings in alpha Tru64 5.1 trunk
Message-ID: <20060709220948.923E41E4009@bag.python.org>
The Buildbot has detected a new failure of alpha Tru64 5.1 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/alpha%2520Tru64%25205.1%2520trunk/builds/880
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: neil.schemenauer
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Mon Jul 10 00:14:43 2006
From: python-checkins at python.org (neal.norwitz)
Date: Mon, 10 Jul 2006 00:14:43 +0200 (CEST)
Subject: [Python-checkins] r50497 - in python/trunk:
Lib/test/test_compile.py Misc/NEWS Python/mystrtoul.c
Message-ID: <20060709221443.4ACEE1E4009@bag.python.org>
Author: neal.norwitz
Date: Mon Jul 10 00:14:42 2006
New Revision: 50497
Modified:
python/trunk/Lib/test/test_compile.py
python/trunk/Misc/NEWS
python/trunk/Python/mystrtoul.c
Log:
On 64 bit systems, int literals that use less than 64 bits are now ints
rather than longs. This also fixes the test for eval(-sys.maxint - 1).
Modified: python/trunk/Lib/test/test_compile.py
==============================================================================
--- python/trunk/Lib/test/test_compile.py (original)
+++ python/trunk/Lib/test/test_compile.py Mon Jul 10 00:14:42 2006
@@ -216,6 +216,21 @@
self.assertTrue(isinstance(eval("%s" % (-sys.maxint - 1)), int))
self.assertTrue(isinstance(eval("%s" % (-sys.maxint - 2)), long))
+ if sys.maxint == 9223372036854775807:
+ def test_32_63_bit_values(self):
+ a = +4294967296 # 1 << 32
+ b = -4294967296 # 1 << 32
+ c = +281474976710656 # 1 << 48
+ d = -281474976710656 # 1 << 48
+ e = +4611686018427387904 # 1 << 62
+ f = -4611686018427387904 # 1 << 62
+ g = +9223372036854775807 # 1 << 63 - 1
+ h = -9223372036854775807 # 1 << 63 - 1
+
+ for variable in self.test_32_63_bit_values.func_code.co_consts:
+ if variable is not None:
+ self.assertTrue(isinstance(variable, int))
+
def test_sequence_unpacking_error(self):
# Verify sequence packing/unpacking with "or". SF bug #757818
i,j = (1, -1) or (-1, 1)
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 10 00:14:42 2006
@@ -30,6 +30,9 @@
- Bug #1519018: 'as' is now validated properly in import statements.
+- On 64 bit systems, int literals that use less than 64 bits are
+ now ints rather than longs.
+
Library
-------
Modified: python/trunk/Python/mystrtoul.c
==============================================================================
--- python/trunk/Python/mystrtoul.c (original)
+++ python/trunk/Python/mystrtoul.c Mon Jul 10 00:14:42 2006
@@ -69,11 +69,22 @@
* calculated by [int(math.floor(math.log(2**32, i))) for i in range(2, 37)].
* Note that this is pessimistic if sizeof(long) > 4.
*/
+#if SIZEOF_LONG == 4
static int digitlimit[] = {
0, 0, 32, 20, 16, 13, 12, 11, 10, 10, /* 0 - 9 */
9, 9, 8, 8, 8, 8, 8, 7, 7, 7, /* 10 - 19 */
7, 7, 7, 7, 6, 6, 6, 6, 6, 6, /* 20 - 29 */
6, 6, 6, 6, 6, 6, 6}; /* 30 - 36 */
+#elif SIZEOF_LONG == 8
+/* [int(math.floor(math.log(2**64, i))) for i in range(2, 37)] */
+static int digitlimit[] = {
+ 0, 0, 64, 40, 32, 27, 24, 22, 21, 20, /* 0 - 9 */
+ 19, 18, 17, 17, 16, 16, 16, 15, 15, 15, /* 10 - 19 */
+ 14, 14, 14, 14, 13, 13, 13, 13, 13, 13, /* 20 - 29 */
+ 13, 12, 12, 12, 12, 12, 12}; /* 30 - 36 */
+#else
+#error "Need table for SIZEOF_LONG"
+#endif
/*
** strtoul
From python-checkins at python.org Mon Jul 10 00:34:56 2006
From: python-checkins at python.org (tim.peters)
Date: Mon, 10 Jul 2006 00:34:56 +0200 (CEST)
Subject: [Python-checkins] r50498 - in python/branches/tim-current_frames:
Doc/lib/libsys.tex Include/pystate.h Lib/test/test_sys.py
Python/pystate.c Python/sysmodule.c
Message-ID: <20060709223456.070231E4009@bag.python.org>
Author: tim.peters
Date: Mon Jul 10 00:34:55 2006
New Revision: 50498
Modified:
python/branches/tim-current_frames/Doc/lib/libsys.tex
python/branches/tim-current_frames/Include/pystate.h
python/branches/tim-current_frames/Lib/test/test_sys.py
python/branches/tim-current_frames/Python/pystate.c
python/branches/tim-current_frames/Python/sysmodule.c
Log:
Code, docs, and tests for new sys._current_frames() function.
Modified: python/branches/tim-current_frames/Doc/lib/libsys.tex
==============================================================================
--- python/branches/tim-current_frames/Doc/lib/libsys.tex (original)
+++ python/branches/tim-current_frames/Doc/lib/libsys.tex Mon Jul 10 00:34:55 2006
@@ -41,7 +41,7 @@
\code{Include/patchlevel.h} if the branch is a tag. Otherwise,
it is \code{None}.
\versionadded{2.5}
-\end{datadesc}
+\end{datadesc}
\begin{datadesc}{builtin_module_names}
A tuple of strings giving the names of all modules that are compiled
@@ -55,6 +55,23 @@
interpreter.
\end{datadesc}
+\begin{funcdesc}{_current_frames}{}
+ Return a dictionary mapping each thread's identifier to the topmost stack
+ frame currently active in that thread at the time the function is called.
+ Note that functions in the \refmodule{traceback} module can build the
+ call stack given such a frame.
+
+ This is most useful for debugging deadlock: this function does not
+ require the deadlocked threads' cooperation, and such threads' call stacks
+ are frozen for as long as they remain deadlocked. The frame returned
+ for a non-deadlocked thread may bear no relationship to that thread's
+ current activity by the time calling code examines the frame.
+
+ This function should be used for internal and specialized purposes
+ only.
+ \versionadded{2.5}
+\end{funcdesc}
+
\begin{datadesc}{dllhandle}
Integer specifying the handle of the Python DLL.
Availability: Windows.
@@ -142,7 +159,7 @@
function, \function{exc_info()} will return three \code{None} values until
another exception is raised in the current thread or the execution stack
returns to a frame where another exception is being handled.
-
+
This function is only needed in only a few obscure situations. These
include logging and error handling systems that report information on the
last or current exception. This function can also be used to try to free
@@ -241,7 +258,7 @@
\begin{itemize}
\item On Windows 9x, the encoding is ``mbcs''.
\item On Mac OS X, the encoding is ``utf-8''.
-\item On Unix, the encoding is the user's preference
+\item On Unix, the encoding is the user's preference
according to the result of nl_langinfo(CODESET), or None if
the nl_langinfo(CODESET) failed.
\item On Windows NT+, file names are Unicode natively, so no conversion
@@ -279,8 +296,8 @@
\end{funcdesc}
\begin{funcdesc}{getwindowsversion}{}
- Return a tuple containing five components, describing the Windows
- version currently running. The elements are \var{major}, \var{minor},
+ Return a tuple containing five components, describing the Windows
+ version currently running. The elements are \var{major}, \var{minor},
\var{build}, \var{platform}, and \var{text}. \var{text} contains
a string while all other values are integers.
@@ -491,7 +508,7 @@
be registered using \function{settrace()} for each thread being
debugged. \note{The \function{settrace()} function is intended only
for implementing debuggers, profilers, coverage tools and the like.
- Its behavior is part of the implementation platform, rather than
+ Its behavior is part of the implementation platform, rather than
part of the language definition, and thus may not be available in
all Python implementations.}
\end{funcdesc}
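The deadlock-debugging scenario described in the new funcdesc amounts to dumping every thread's stack from whichever thread is still responsive. A minimal sketch of that usage (the helper name is illustrative, not part of the patch):

    import sys
    import traceback

    def dump_all_thread_stacks(out=sys.stderr):
        # Snapshot each thread's topmost frame and print its call stack.
        # Frames of deadlocked threads stay frozen while those threads
        # remain blocked, so the output is meaningful for them.
        for thread_id, frame in sys._current_frames().items():
            print >> out, "Thread %d:" % thread_id
            traceback.print_stack(frame, file=out)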
Modified: python/branches/tim-current_frames/Include/pystate.h
==============================================================================
--- python/branches/tim-current_frames/Include/pystate.h (original)
+++ python/branches/tim-current_frames/Include/pystate.h Mon Jul 10 00:34:55 2006
@@ -171,6 +171,11 @@
*/
PyAPI_FUNC(PyThreadState *) PyGILState_GetThisThreadState(void);
+/* The implementation of sys._current_frames(). Returns a dict mapping
+ thread id to that thread's current frame.
+*/
+PyAPI_FUNC(PyObject *) _PyThread_CurrentFrames(void);
+
/* Routines for advanced debuggers, requested by David Beazley.
Don't use unless you know what you are doing! */
PyAPI_FUNC(PyInterpreterState *) PyInterpreterState_Head(void);
Modified: python/branches/tim-current_frames/Lib/test/test_sys.py
==============================================================================
--- python/branches/tim-current_frames/Lib/test/test_sys.py (original)
+++ python/branches/tim-current_frames/Lib/test/test_sys.py Mon Jul 10 00:34:55 2006
@@ -237,6 +237,67 @@
is sys._getframe().f_code
)
+ # sys._current_frames() is a CPython-only gimmick.
+ def test_current_frames(self):
+ import threading, thread
+ import traceback
+
+ # Spawn a thread that blocks at a known place. Then the main
+ # thread does sys._current_frames(), and verifies that the frames
+ # returned make sense.
+ entered_g = threading.Event()
+ leave_g = threading.Event()
+ thread_info = [] # the thread's id
+
+ def f123():
+ g456()
+
+ def g456():
+ thread_info.append(thread.get_ident())
+ entered_g.set()
+ leave_g.wait()
+
+ t = threading.Thread(target=f123)
+ t.start()
+ entered_g.wait()
+
+ # At this point, t has finished its entered_g.set(), and is blocked
+ # in its leave_g.wait().
+ self.assertEqual(len(thread_info), 1)
+ thread_id = thread_info[0]
+
+ d = sys._current_frames()
+
+ main_id = thread.get_ident()
+ self.assert_(main_id in d)
+ self.assert_(thread_id in d)
+
+ # Verify that the captured main-thread frame is _this_ frame.
+ frame = d.pop(main_id)
+ self.assert_(frame is sys._getframe())
+
+ # Verify that the captured thread frame is blocked in g456, called
+ # from f123. This is a little tricky, since various bits of
+ # threading.py are also in the thread's call stack.
+ frame = d.pop(thread_id)
+ stack = traceback.extract_stack(frame)
+ for i, (filename, lineno, funcname, sourceline) in enumerate(stack):
+ if funcname == "f123":
+ break
+ else:
+ self.fail("didn't find f123() on thread's call stack")
+
+ self.assertEqual(sourceline, "g456()")
+
+ # And the next record must be for g456().
+ filename, lineno, funcname, sourceline = stack[i+1]
+ self.assertEqual(funcname, "g456")
+ self.assertEqual(sourceline, "leave_g.wait()")
+
+ # Reap the spawned thread.
+ leave_g.set()
+ t.join()
+
def test_attributes(self):
self.assert_(isinstance(sys.api_version, int))
self.assert_(isinstance(sys.argv, list))
Modified: python/branches/tim-current_frames/Python/pystate.c
==============================================================================
--- python/branches/tim-current_frames/Python/pystate.c (original)
+++ python/branches/tim-current_frames/Python/pystate.c Mon Jul 10 00:34:55 2006
@@ -444,15 +444,15 @@
/* If autoTLSkey is 0, this must be the very first threadstate created
in Py_Initialize(). Don't do anything for now (we'll be back here
when _PyGILState_Init is called). */
- if (!autoTLSkey)
+ if (!autoTLSkey)
return;
-
+
/* Stick the thread state for this thread in thread local storage.
The only situation where you can legitimately have more than one
thread state for an OS level thread is when there are multiple
interpreters, when:
-
+
a) You shouldn't really be using the PyGILState_ APIs anyway,
and:
@@ -550,6 +550,54 @@
PyEval_SaveThread();
}
+/* The implementation of sys._current_frames(). This is intended to be
+ called with the GIL held, as it will be when called via
+ sys._current_frames(). It's possible it would work fine even without
+ the GIL held, but haven't thought enough about that.
+*/
+PyObject *
+_PyThread_CurrentFrames(void)
+{
+ PyObject *result;
+ PyInterpreterState *i;
+
+ result = PyDict_New();
+ if (result == NULL)
+ return NULL;
+
+ /* for i in all interpreters:
+ * for t in all of i's thread states:
+ * if t's frame isn't NULL, map t's id to its frame
+ * Because these lists can mutate even when the GIL isn't held, we
+ * need to grab head_mutex for the duration.
+ */
+ HEAD_LOCK();
+ for (i = interp_head; i != NULL; i = i->next) {
+ PyThreadState *t;
+ for (t = i->tstate_head; t != NULL; t = t->next) {
+ PyObject *id;
+ int stat;
+ struct _frame *frame = t->frame;
+ if (frame == NULL)
+ continue;
+ id = PyInt_FromLong(t->thread_id);
+ if (id == NULL)
+ goto Fail;
+ stat = PyDict_SetItem(result, id, (PyObject *)frame);
+ Py_DECREF(id);
+ if (stat < 0)
+ goto Fail;
+ }
+ }
+ HEAD_UNLOCK();
+ return result;
+
+ Fail:
+ HEAD_UNLOCK();
+ Py_DECREF(result);
+ return NULL;
+}
+
#ifdef __cplusplus
}
#endif
Modified: python/branches/tim-current_frames/Python/sysmodule.c
==============================================================================
--- python/branches/tim-current_frames/Python/sysmodule.c (original)
+++ python/branches/tim-current_frames/Python/sysmodule.c Mon Jul 10 00:34:55 2006
@@ -660,6 +660,21 @@
return (PyObject*)f;
}
+PyDoc_STRVAR(current_frames_doc,
+"_current_frames() -> dictionary\n\
+\n\
+Return a dictionary mapping each current thread T's thread id to T's\n\
+current stack frame.\n\
+\n\
+This function should be used for specialized purposes only."
+);
+
+static PyObject *
+sys_current_frames(PyObject *self, PyObject *noargs)
+{
+ return _PyThread_CurrentFrames();
+}
+
PyDoc_STRVAR(call_tracing_doc,
"call_tracing(func, args) -> object\n\
\n\
@@ -722,6 +737,8 @@
/* Might as well keep this in alphabetic order */
{"callstats", (PyCFunction)PyEval_GetCallStats, METH_NOARGS,
callstats_doc},
+ {"_current_frames", sys_current_frames, METH_NOARGS,
+ current_frames_doc},
{"displayhook", sys_displayhook, METH_O, displayhook_doc},
{"exc_info", sys_exc_info, METH_NOARGS, exc_info_doc},
{"exc_clear", sys_exc_clear, METH_NOARGS, exc_clear_doc},
From python-checkins at python.org Mon Jul 10 00:43:45 2006
From: python-checkins at python.org (neal.norwitz)
Date: Mon, 10 Jul 2006 00:43:45 +0200 (CEST)
Subject: [Python-checkins] r50499 - peps/trunk/pep-0356.txt
Message-ID: <20060709224345.6C2041E4009@bag.python.org>
Author: neal.norwitz
Date: Mon Jul 10 00:43:45 2006
New Revision: 50499
Modified:
peps/trunk/pep-0356.txt
Log:
There were a couple of AST fixes and one request for a new crasher bug to track
Modified: peps/trunk/pep-0356.txt
==============================================================================
--- peps/trunk/pep-0356.txt (original)
+++ peps/trunk/pep-0356.txt Mon Jul 10 00:43:45 2006
@@ -147,11 +147,11 @@
Open issues
- Bugs that need resolving before release:
+ http://python.org/sf/1517042 - Fix crashers/gc_inspection.py
http://python.org/sf/1515471 - stringobject (char buffers)
http://python.org/sf/1512814 - AST incorrect lineno
http://python.org/sf/1508010 - msvccompiler using VC6
http://python.org/sf/1504046 - doc for ElementTree
- http://python.org/sf/1501934 - AST incorrect LOAD/STORE_GLOBAL
http://python.org/sf/1494314 - can't use high sockets (regr in 2.4.3)
http://python.org/sf/1475523 - gettext.py bug
http://python.org/sf/1333982 - AST
@@ -169,9 +169,6 @@
- AST compiler problems
(Owner: Jeremy Hylton)
- * eval(str(-sys.maxint - 1)) should produce an int, not long.
- when a constant it should not be stored in co_consts
-
* string constants that are not assigned are in byte code
def foo():
"docstring"
From python-checkins at python.org Mon Jul 10 02:04:45 2006
From: python-checkins at python.org (neal.norwitz)
Date: Mon, 10 Jul 2006 02:04:45 +0200 (CEST)
Subject: [Python-checkins] r50500 - in python/trunk:
Lib/test/test_compile.py Misc/NEWS Python/compile.c
Message-ID: <20060710000445.C65A51E4009@bag.python.org>
Author: neal.norwitz
Date: Mon Jul 10 02:04:44 2006
New Revision: 50500
Modified:
python/trunk/Lib/test/test_compile.py
python/trunk/Misc/NEWS
python/trunk/Python/compile.c
Log:
Bug #1512814, Fix incorrect lineno's when code at module scope
started after line 256.
Modified: python/trunk/Lib/test/test_compile.py
==============================================================================
--- python/trunk/Lib/test/test_compile.py (original)
+++ python/trunk/Lib/test/test_compile.py Mon Jul 10 02:04:44 2006
@@ -166,6 +166,16 @@
pass"""
compile(s, "", "exec")
+ # This test is probably specific to CPython and may not generalize
+ # to other implementations. We are trying to ensure that when
+ # the first line of code starts after line 256, correct line numbers
+ # in tracebacks are still produced.
+ def test_leading_newlines(self):
+ s256 = "".join(["\n"] * 256 + ["spam"])
+ co = compile(s256, 'fn', 'exec')
+ self.assertEqual(co.co_firstlineno, 257)
+ self.assertEqual(co.co_lnotab, '')
+
def test_literals_with_leading_zeroes(self):
for arg in ["077787", "0xj", "0x.", "0e", "090000000000000",
"080000000000000", "000000000000009", "000000000000008"]:
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 10 02:04:44 2006
@@ -33,6 +33,9 @@
- On 64 bit systems, int literals that use less than 64 bits are
now ints rather than longs.
+- Bug #1512814, Fix incorrect lineno's when code at module scope
+ started after line 256.
+
Library
-------
Modified: python/trunk/Python/compile.c
==============================================================================
--- python/trunk/Python/compile.c (original)
+++ python/trunk/Python/compile.c Mon Jul 10 02:04:44 2006
@@ -1776,7 +1776,8 @@
if (!module)
return NULL;
}
- if (!compiler_enter_scope(c, module, mod, 1))
+ /* Use 0 for firstlineno initially, will fixup in assemble(). */
+ if (!compiler_enter_scope(c, module, mod, 0))
return NULL;
switch (mod->kind) {
case Module_kind:
@@ -4446,6 +4447,13 @@
entryblock = b;
}
+ /* Set firstlineno if it wasn't explicitly set. */
+ if (!c->u->u_firstlineno) {
+ if (entryblock && entryblock->b_instr)
+ c->u->u_firstlineno = entryblock->b_instr->i_lineno;
+ else
+ c->u->u_firstlineno = 1;
+ }
if (!assemble_init(&a, nblocks, c->u->u_firstlineno))
goto error;
dfs(c, entryblock, &a);
From python-checkins at python.org Mon Jul 10 02:05:34 2006
From: python-checkins at python.org (neal.norwitz)
Date: Mon, 10 Jul 2006 02:05:34 +0200 (CEST)
Subject: [Python-checkins] r50501 - python/trunk/Lib/subprocess.py
Message-ID: <20060710000534.99F1D1E4009@bag.python.org>
Author: neal.norwitz
Date: Mon Jul 10 02:05:34 2006
New Revision: 50501
Modified:
python/trunk/Lib/subprocess.py
Log:
Fix doco. Backport candidate.
Modified: python/trunk/Lib/subprocess.py
==============================================================================
--- python/trunk/Lib/subprocess.py (original)
+++ python/trunk/Lib/subprocess.py Mon Jul 10 02:05:34 2006
@@ -234,7 +234,7 @@
sts = os.system("mycmd" + " myarg")
==>
p = Popen("mycmd" + " myarg", shell=True)
-sts = os.waitpid(p.pid, 0)
+pid, sts = os.waitpid(p.pid, 0)
Note:
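The change reflects that os.waitpid() returns a (pid, status) tuple rather than a bare status, so the old one-liner silently bound a tuple to sts. A minimal sketch of the corrected replacement for os.system() ("mycmd myarg" is the same placeholder command used in the docstring):

    import os
    from subprocess import Popen

    # Replacement for: sts = os.system("mycmd" + " myarg")
    p = Popen("mycmd" + " myarg", shell=True)
    pid, sts = os.waitpid(p.pid, 0)   # waitpid returns (pid, status)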
From python-checkins at python.org Mon Jul 10 02:07:11 2006
From: python-checkins at python.org (neal.norwitz)
Date: Mon, 10 Jul 2006 02:07:11 +0200 (CEST)
Subject: [Python-checkins] r50502 - peps/trunk/pep-0356.txt
Message-ID: <20060710000711.363451E4015@bag.python.org>
Author: neal.norwitz
Date: Mon Jul 10 02:07:10 2006
New Revision: 50502
Modified:
peps/trunk/pep-0356.txt
Log:
Another one bites the dust.
Modified: peps/trunk/pep-0356.txt
==============================================================================
--- peps/trunk/pep-0356.txt (original)
+++ peps/trunk/pep-0356.txt Mon Jul 10 02:07:10 2006
@@ -149,7 +149,6 @@
- Bugs that need resolving before release:
http://python.org/sf/1517042 - Fix crashers/gc_inspection.py
http://python.org/sf/1515471 - stringobject (char buffers)
- http://python.org/sf/1512814 - AST incorrect lineno
http://python.org/sf/1508010 - msvccompiler using VC6
http://python.org/sf/1504046 - doc for ElementTree
http://python.org/sf/1494314 - can't use high sockets (regr in 2.4.3)
From python-checkins at python.org Mon Jul 10 02:23:18 2006
From: python-checkins at python.org (neal.norwitz)
Date: Mon, 10 Jul 2006 02:23:18 +0200 (CEST)
Subject: [Python-checkins] r50503 - python/trunk/Lib/tarfile.py
Message-ID: <20060710002318.A9E111E400A@bag.python.org>
Author: neal.norwitz
Date: Mon Jul 10 02:23:17 2006
New Revision: 50503
Modified:
python/trunk/Lib/tarfile.py
Log:
Part of SF patch #1484695. This removes dead code. The chksum was
already verified in .frombuf() on the lines above. If there was
a problem, an exception was raised, so there was no way this condition
could have been true.
Modified: python/trunk/Lib/tarfile.py
==============================================================================
--- python/trunk/Lib/tarfile.py (original)
+++ python/trunk/Lib/tarfile.py Mon Jul 10 02:23:17 2006
@@ -1750,13 +1750,6 @@
try:
tarinfo = TarInfo.frombuf(buf)
- # We shouldn't rely on this checksum, because some tar programs
- # calculate it differently and it is merely validating the
- # header block. We could just as well skip this part, which would
- # have a slight effect on performance...
- if tarinfo.chksum not in calc_chksums(buf):
- self._dbg(1, "tarfile: Bad Checksum %r" % tarinfo.name)
-
# Set the TarInfo object's offset to the current position of the
# TarFile and set self.offset to the position where the data blocks
# should begin.
From python-checkins at python.org Mon Jul 10 03:19:00 2006
From: python-checkins at python.org (neal.norwitz)
Date: Mon, 10 Jul 2006 03:19:00 +0200 (CEST)
Subject: [Python-checkins] r50504 - in python/trunk: Misc/NEWS
Modules/bz2module.c Modules/cryptmodule.c Modules/dlmodule.c
Modules/fpectlmodule.c Modules/getpath.c
Modules/posixmodule.c Modules/selectmodule.c Modules/socketmodule.c
Message-ID: <20060710011900.166711E4009@bag.python.org>
Author: neal.norwitz
Date: Mon Jul 10 03:18:57 2006
New Revision: 50504
Modified:
python/trunk/Misc/NEWS
python/trunk/Modules/bz2module.c
python/trunk/Modules/cryptmodule.c
python/trunk/Modules/dlmodule.c
python/trunk/Modules/fpectlmodule.c
python/trunk/Modules/getpath.c
python/trunk/Modules/posixmodule.c
python/trunk/Modules/selectmodule.c
python/trunk/Modules/socketmodule.c
Log:
Patch #1516912: improve Modules support for OpenVMS.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 10 03:18:57 2006
@@ -108,6 +108,8 @@
- Bug #1296433: parsing XML with a non-default encoding and
a CharacterDataHandler could crash the interpreter in pyexpat.
+- Patch #1516912: improve Modules support for OpenVMS.
+
Build
-----
Modified: python/trunk/Modules/bz2module.c
==============================================================================
--- python/trunk/Modules/bz2module.c (original)
+++ python/trunk/Modules/bz2module.c Mon Jul 10 03:18:57 2006
@@ -1311,7 +1311,11 @@
break;
case 'U':
+#ifdef __VMS
+ self->f_univ_newline = 0;
+#else
self->f_univ_newline = 1;
+#endif
break;
default:
Modified: python/trunk/Modules/cryptmodule.c
==============================================================================
--- python/trunk/Modules/cryptmodule.c (original)
+++ python/trunk/Modules/cryptmodule.c Mon Jul 10 03:18:57 2006
@@ -5,6 +5,9 @@
#include
+#ifdef __VMS
+#include
+#endif
/* Module crypt */
@@ -12,7 +15,9 @@
static PyObject *crypt_crypt(PyObject *self, PyObject *args)
{
char *word, *salt;
+#ifndef __VMS
extern char * crypt(const char *, const char *);
+#endif
if (!PyArg_ParseTuple(args, "ss:crypt", &word, &salt)) {
return NULL;
Modified: python/trunk/Modules/dlmodule.c
==============================================================================
--- python/trunk/Modules/dlmodule.c (original)
+++ python/trunk/Modules/dlmodule.c Mon Jul 10 03:18:57 2006
@@ -5,6 +5,10 @@
#include
+#ifdef __VMS
+#include
+#endif
+
#ifndef RTLD_LAZY
#define RTLD_LAZY 1
#endif
@@ -186,6 +190,24 @@
PyErr_SetString(Dlerror, dlerror());
return NULL;
}
+#ifdef __VMS
+ /* Under OpenVMS dlopen doesn't do any checking; it just saves the name
+ * for later use. So we have to check whether the file is readable;
+ * the name can be a logical name or a file from SYS$SHARE.
+ */
+ if (access(name, R_OK)) {
+ char fname[strlen(name) + 20];
+ strcpy(fname, "SYS$SHARE:");
+ strcat(fname, name);
+ strcat(fname, ".EXE");
+ if (access(fname, R_OK)) {
+ dlclose(handle);
+ PyErr_SetString(Dlerror,
+ "File not found or protection violation");
+ return NULL;
+ }
+ }
+#endif
return newdlobject(handle);
}
Modified: python/trunk/Modules/fpectlmodule.c
==============================================================================
--- python/trunk/Modules/fpectlmodule.c (original)
+++ python/trunk/Modules/fpectlmodule.c Mon Jul 10 03:18:57 2006
@@ -70,6 +70,10 @@
#if defined(__FreeBSD__)
# include
+#elif defined(__VMS)
+#define __NEW_STARLET
+#include
+#include
#endif
#ifndef WANT_SIGFPE_HANDLER
@@ -190,6 +194,19 @@
/*-- DEC ALPHA VMS --------------------------------------------------------*/
#elif defined(__ALPHA) && defined(__VMS)
+ IEEE clrmsk;
+ IEEE setmsk;
+ clrmsk.ieee$q_flags =
+ IEEE$M_TRAP_ENABLE_UNF | IEEE$M_TRAP_ENABLE_INE |
+ IEEE$M_MAP_UMZ;
+ setmsk.ieee$q_flags =
+ IEEE$M_TRAP_ENABLE_INV | IEEE$M_TRAP_ENABLE_DZE |
+ IEEE$M_TRAP_ENABLE_OVF;
+ sys$ieee_set_fp_control(&clrmsk, &setmsk, 0);
+ PyOS_setsig(SIGFPE, handler);
+
+/*-- HP IA64 VMS --------------------------------------------------------*/
+#elif defined(__ia64) && defined(__VMS)
PyOS_setsig(SIGFPE, handler);
/*-- Cray Unicos ----------------------------------------------------------*/
@@ -244,6 +261,14 @@
#ifdef __FreeBSD__
fpresetsticky(fpgetsticky());
fpsetmask(0);
+#elif defined(__VMS)
+ IEEE clrmsk;
+ clrmsk.ieee$q_flags =
+ IEEE$M_TRAP_ENABLE_UNF | IEEE$M_TRAP_ENABLE_INE |
+ IEEE$M_MAP_UMZ | IEEE$M_TRAP_ENABLE_INV |
+ IEEE$M_TRAP_ENABLE_DZE | IEEE$M_TRAP_ENABLE_OVF |
+ IEEE$M_INHERIT;
+ sys$ieee_set_fp_control(&clrmsk, 0, 0);
#else
fputs("Operation not implemented\n", stderr);
#endif
Modified: python/trunk/Modules/getpath.c
==============================================================================
--- python/trunk/Modules/getpath.c (original)
+++ python/trunk/Modules/getpath.c Mon Jul 10 03:18:57 2006
@@ -97,19 +97,19 @@
#ifndef VERSION
-#if defined(__VMS)
-#define VERSION "2_1"
-#else
#define VERSION "2.1"
#endif
-#endif
#ifndef VPATH
#define VPATH "."
#endif
#ifndef PREFIX
-#define PREFIX "/usr/local"
+# ifdef __VMS
+# define PREFIX ""
+# else
+# define PREFIX "/usr/local"
+# endif
#endif
#ifndef EXEC_PREFIX
Modified: python/trunk/Modules/posixmodule.c
==============================================================================
--- python/trunk/Modules/posixmodule.c (original)
+++ python/trunk/Modules/posixmodule.c Mon Jul 10 03:18:57 2006
@@ -7882,6 +7882,42 @@
}
#endif
+#ifdef __VMS
+/* Use openssl random routine */
+#include
+PyDoc_STRVAR(vms_urandom__doc__,
+"urandom(n) -> str\n\n\
+Return a string of n random bytes suitable for cryptographic use.");
+
+static PyObject*
+vms_urandom(PyObject *self, PyObject *args)
+{
+ int howMany;
+ PyObject* result;
+
+ /* Read arguments */
+ if (! PyArg_ParseTuple(args, "i:urandom", &howMany))
+ return NULL;
+ if (howMany < 0)
+ return PyErr_Format(PyExc_ValueError,
+ "negative argument not allowed");
+
+ /* Allocate bytes */
+ result = PyString_FromStringAndSize(NULL, howMany);
+ if (result != NULL) {
+ /* Get random data */
+ if (RAND_pseudo_bytes((unsigned char*)
+ PyString_AS_STRING(result),
+ howMany) < 0) {
+ Py_DECREF(result);
+ return PyErr_Format(PyExc_ValueError,
+ "RAND_pseudo_bytes");
+ }
+ }
+ return result;
+}
+#endif
+
static PyMethodDef posix_methods[] = {
{"access", posix_access, METH_VARARGS, posix_access__doc__},
#ifdef HAVE_TTYNAME
@@ -8175,6 +8211,9 @@
#ifdef MS_WINDOWS
{"urandom", win32_urandom, METH_VARARGS, win32_urandom__doc__},
#endif
+ #ifdef __VMS
+ {"urandom", vms_urandom, METH_VARARGS, vms_urandom__doc__},
+ #endif
{NULL, NULL} /* Sentinel */
};
Modified: python/trunk/Modules/selectmodule.c
==============================================================================
--- python/trunk/Modules/selectmodule.c (original)
+++ python/trunk/Modules/selectmodule.c Mon Jul 10 03:18:57 2006
@@ -46,14 +46,14 @@
#endif
#ifdef MS_WINDOWS
-#include
+# include
#else
-#ifdef __BEOS__
-#include
-#define SOCKET int
-#else
-#define SOCKET int
-#endif
+# define SOCKET int
+# ifdef __BEOS__
+# include
+# elif defined(__VMS)
+# include
+# endif
#endif
@@ -668,7 +668,7 @@
that are ready.\n\
\n\
*** IMPORTANT NOTICE ***\n\
-On Windows, only sockets are supported; on Unix, all file descriptors.");
+On Windows and OpenVMS, only sockets are supported; on Unix, all file descriptors.");
static PyMethodDef select_methods[] = {
{"select", select_select, METH_VARARGS, select_doc},
@@ -682,7 +682,7 @@
"This module supports asynchronous I/O on multiple file descriptors.\n\
\n\
*** IMPORTANT NOTICE ***\n\
-On Windows, only sockets are supported; on Unix, all file descriptors.");
+On Windows and OpenVMS, only sockets are supported; on Unix, all file descriptors.");
PyMODINIT_FUNC
initselect(void)
Modified: python/trunk/Modules/socketmodule.c
==============================================================================
--- python/trunk/Modules/socketmodule.c (original)
+++ python/trunk/Modules/socketmodule.c Mon Jul 10 03:18:57 2006
@@ -161,7 +161,8 @@
(this includes the getaddrinfo emulation) protect access with a lock. */
#if defined(WITH_THREAD) && (defined(__APPLE__) || \
(defined(__FreeBSD__) && __FreeBSD_version+0 < 503000) || \
- defined(__OpenBSD__) || defined(__NetBSD__) || !defined(HAVE_GETADDRINFO))
+ defined(__OpenBSD__) || defined(__NetBSD__) || \
+ defined(__VMS) || !defined(HAVE_GETADDRINFO))
#define USE_GETADDRINFO_LOCK
#endif
@@ -186,15 +187,8 @@
#endif
#if defined(__VMS)
-#if ! defined(_SOCKADDR_LEN)
-# ifdef getaddrinfo
-# undef getaddrinfo
-# endif
-# include "TCPIP_IOCTL_ROUTINE"
-#else
# include
#endif
-#endif
#if defined(PYOS_OS2)
# define INCL_DOS
@@ -363,11 +357,6 @@
#define SOCKETCLOSE close
#endif
-#ifdef __VMS
-/* TCP/IP Services for VMS uses a maximum send/revc buffer length of 65535 */
-#define SEGMENT_SIZE 65535
-#endif
-
#if defined(HAVE_BLUETOOTH_H) || defined(HAVE_BLUETOOTH_BLUETOOTH_H)
#define USE_BLUETOOTH 1
#if defined(__FreeBSD__)
@@ -386,6 +375,11 @@
#endif
#endif
+#ifdef __VMS
+/* TCP/IP Services for VMS uses a maximum send/recv buffer length */
+#define SEGMENT_SIZE (32 * 1024 -1)
+#endif
+
/*
* Constants for getnameinfo()
*/
@@ -620,6 +614,30 @@
return NULL;
}
+#ifdef __VMS
+/* Function to send in segments */
+static int
+sendsegmented(int sock_fd, char *buf, int len, int flags)
+{
+ int n = 0;
+ int remaining = len;
+
+ while (remaining > 0) {
+ unsigned int segment;
+
+ segment = (remaining >= SEGMENT_SIZE ? SEGMENT_SIZE : remaining);
+ n = send(sock_fd, buf, segment, flags);
+ if (n < 0) {
+ return n;
+ }
+ remaining -= segment;
+ buf += segment;
+ } /* end while */
+
+ return len;
+}
+#endif
+
/* Function to perform the setting of socket blocking mode
internally. block = (1 | 0). */
static int
@@ -644,8 +662,8 @@
ioctl(s->sock_fd, FIONBIO, (caddr_t)&block, sizeof(block));
#elif defined(__VMS)
block = !block;
- ioctl(s->sock_fd, FIONBIO, (char *)&block);
-#else /* !PYOS_OS2 && !_VMS */
+ ioctl(s->sock_fd, FIONBIO, (unsigned int *)&block);
+#else /* !PYOS_OS2 && !__VMS */
delay_flag = fcntl(s->sock_fd, F_GETFL, 0);
if (block)
delay_flag &= (~O_NONBLOCK);
@@ -1725,6 +1743,8 @@
return PyInt_FromLong(flag);
}
#ifdef __VMS
+ /* socklen_t is unsigned so no negative test is needed;
+ the buflen == 0 check was already done above */
if (buflen > 1024) {
#else
if (buflen <= 0 || buflen > 1024) {
@@ -2498,9 +2518,6 @@
{
char *buf;
int len, n = 0, flags = 0, timeout;
-#ifdef __VMS
- int send_length;
-#endif
if (!PyArg_ParseTuple(args, "s#|i:send", &buf, &len, &flags))
return NULL;
@@ -2508,11 +2525,14 @@
if (!IS_SELECTABLE(s))
return select_error();
-#ifndef __VMS
Py_BEGIN_ALLOW_THREADS
timeout = internal_select(s, 1);
if (!timeout)
+#ifdef __VMS
+ n = sendsegmented(s->sock_fd, buf, len, flags);
+#else
n = send(s->sock_fd, buf, len, flags);
+#endif
Py_END_ALLOW_THREADS
if (timeout) {
@@ -2521,36 +2541,6 @@
}
if (n < 0)
return s->errorhandler();
-#else
- /* Divide packet into smaller segments for */
- /* TCP/IP Services for OpenVMS */
- send_length = len;
- while (send_length != 0) {
- unsigned int segment;
-
- segment = send_length / SEGMENT_SIZE;
- if (segment != 0) {
- segment = SEGMENT_SIZE;
- }
- else {
- segment = send_length;
- }
- Py_BEGIN_ALLOW_THREADS
- timeout = internal_select(s, 1);
- if (!timeout)
- n = send(s->sock_fd, buf, segment, flags);
- Py_END_ALLOW_THREADS
- if (timeout) {
- PyErr_SetString(socket_timeout, "timed out");
- return NULL;
- }
- if (n < 0) {
- return s->errorhandler();
- }
- send_length -= segment;
- buf += segment;
- } /* end while */
-#endif /* !__VMS */
return PyInt_FromLong((long)n);
}
@@ -2581,7 +2571,11 @@
timeout = internal_select(s, 1);
if (timeout)
break;
+#ifdef __VMS
+ n = sendsegmented(s->sock_fd, buf, len, flags);
+#else
n = send(s->sock_fd, buf, len, flags);
+#endif
if (n < 0)
break;
buf += n;
From python-checkins at python.org Mon Jul 10 03:53:39 2006
From: python-checkins at python.org (matt.fleming)
Date: Mon, 10 Jul 2006 03:53:39 +0200 (CEST)
Subject: [Python-checkins] r50505 - in sandbox/trunk/pdb: README.txt
mconnection.py mpdb.py mthread.py test/test_mconnection.py
Message-ID: <20060710015339.3CBAE1E4009@bag.python.org>
Author: matt.fleming
Date: Mon Jul 10 03:53:38 2006
New Revision: 50505
Modified:
sandbox/trunk/pdb/README.txt
sandbox/trunk/pdb/mconnection.py
sandbox/trunk/pdb/mpdb.py
sandbox/trunk/pdb/mthread.py
sandbox/trunk/pdb/test/test_mconnection.py
Log:
Added FIFOs and unit tests. Some changes to the threading code.
Modified: sandbox/trunk/pdb/README.txt
==============================================================================
--- sandbox/trunk/pdb/README.txt (original)
+++ sandbox/trunk/pdb/README.txt Mon Jul 10 03:53:38 2006
@@ -33,7 +33,7 @@
* Provide a proper top-level methods including, set_trace(), post_mortem(),
run(), remote_sighandler() (for allowing a signal to start
remote debugging)
-* Extend mconnection FIFO's
+* Clean up FIFO output (too many newlines)
* mconnection should use the exceptions that have been introduced and mpdb
should check for these exceptions being raised.
* Write documentation
@@ -58,4 +58,4 @@
* do_return inherited from pydb.gdb.Gdb doesn't use self.stdin for reading
input from the user and doesn't work remotely.
* pdbserver using a TCP connection uses setsockopt() REUSEADDR by default.
- Need some way to make this configurable. `set reuseaddr' ?
\ No newline at end of file
+ Need some way to make this configurable. `set reuseaddr' ?
Modified: sandbox/trunk/pdb/mconnection.py
==============================================================================
--- sandbox/trunk/pdb/mconnection.py (original)
+++ sandbox/trunk/pdb/mconnection.py Mon Jul 10 03:53:38 2006
@@ -201,4 +201,101 @@
self._sock = None
self.connected = False
+import os
+class MConnectionServerFIFO(MConnectionInterface):
+ """ This class implements a named pipe for communication between
+ a pdbserver and client.
+ """
+ def __init__(self):
+ MConnectionInterface.__init__(self)
+ self.input = self.output = self._filename = self._mode = None
+ def connect(self, name, mode=0644):
+ self._filename = name
+ self._file_in = self._filename+'0'
+ self._file_out = self._filename+'1'
+ self._mode = mode
+ try:
+ os.mkfifo(self._file_in, self._mode)
+ os.mkfifo(self._file_out, self._mode)
+ except OSError, e:
+ raise ConnectionFailed, e[1]
+ self.input = open(self._file_in, 'r')
+ self.output = open(self._file_out, 'w')
+
+ def disconnect(self):
+ """ Disconnect from the named pipe. """
+ if not self.input or not self.output:
+ return
+ self.output.close()
+ self.input.close()
+ self.input = self.output = None
+ os.unlink(self._file_in)
+ os.unlink(self._file_out)
+
+
+ def readline(self):
+ """ Read a line from the named pipe. """
+ try:
+ # Using readline allows the data to be read more quickly; it's
+ # not clear why.
+ line = self.input.readline()
+ except IOError, e:
+ raise ReadError, e[1]
+ if not line:
+ raise ReadError, 'Connection closed'
+ return line
+
+ def write(self, msg):
+ if msg[-1] != '\n': msg += '\n'
+ try:
+ self.output.write(msg)
+ self.output.flush()
+ except IOError, e:
+ raise WriteError, e[1]
+
+class MConnectionClientFIFO(MConnectionInterface):
+ """ This class is the client class for accessing a named pipe
+ used for communication between client and pdbserver.
+ """
+ def __init__(self):
+ MConnectionInterface.__init__(self)
+ self.input = self.output = self._filename = self._mode = None
+
+ def connect(self, name, mode=0644):
+ self._filename = name
+ self._file_in = self._filename+'1'
+ self._file_out = self._filename+'0'
+ self._mode = mode
+ try:
+ self.output = open(self._file_out, 'w')
+ self.input = open(self._file_in, 'r')
+ except IOError, e:
+ raise ConnectionFailed, e[1]
+
+ def disconnect(self):
+ if not self.input or not self.output:
+ return
+ self.output.close()
+ self.input.close()
+ self.input = self.output = None
+
+ def readline(self):
+ try:
+ line = self.input.readline()
+ except IOError, e:
+ raise ReadError, e[1]
+ if not line:
+ raise ReadError, 'Connection closed'
+ return line
+
+ def write(self, msg):
+ if msg[-1] != '\n': msg += '\n'
+ try:
+ self.output.write(msg)
+ self.output.flush()
+ except IOError, e:
+ raise WriteError, e[1]
+
+
+
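A rough sketch of how the two new FIFO classes are meant to be paired, mirroring the unit tests added in test_mconnection.py below (the 'test_file' path and the helper thread are illustrative):

    import thread
    from mconnection import MConnectionServerFIFO, MConnectionClientFIFO

    server = MConnectionServerFIFO()
    client = MConnectionClientFIFO()

    # The open() calls on the named pipes block until both ends show up,
    # so connect the client from a second thread, as the tests do.
    thread.start_new_thread(client.connect, ('test_file',))
    server.connect('test_file')

    server.write('ping')              # write() appends '\n' if missing
    while not client.input:           # wait for the client thread to finish
        pass
    print client.readline()           # -> 'ping\n'

    client.disconnect()
    server.disconnect()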
Modified: sandbox/trunk/pdb/mpdb.py
==============================================================================
--- sandbox/trunk/pdb/mpdb.py (original)
+++ sandbox/trunk/pdb/mpdb.py Mon Jul 10 03:53:38 2006
@@ -55,6 +55,7 @@
self.target = 'local' # local connections by default
self.lastcmd = ''
self.connection = None
+ self.debugger_name = 'mpdb'
self._info_cmds.append('target')
def _rebind_input(self, new_input):
@@ -221,24 +222,10 @@
self.errmsg('Could not import MConnectionSerial')
return
else:
- if '.' in target:
- if self.connection: self.connection.disconnect()
- # We dynamically load the class for the connection
- base = target[:target.rfind('.')]
- cls = target[target.rfind('.')+1:]
- try:
- exec 'from ' + base + ' import (' + cls + ', \
- ConnectionFailed)'
- except ImportError:
- self.errmsg('Unknown target type')
- return
- else:
- try:
- __import__(target)
- except ImportError:
- self.errmsg('Unknown target type')
- return
- self.connection = eval(target+'()')
+ cls = target[target.rfind('.')+1:]
+ path = target[:target.rfind('.')]
+ exec 'from ' + path + ' import ' + cls + ', ConnectionFailed'
+ self.connection = eval(cls+'()')
try:
self.connection.connect(addr)
except ConnectionFailed, err:
@@ -409,10 +396,7 @@
""" Setup this debugger to handle threaded applications."""
import mthread
mthread.init(m)
- m.user_line = mthread.user_line
- m.user_call = mthread.user_call
- m.user_exception = mthread.user_exception
- m.user_return = mthread.user_return
+
m._program_sys_argv = list(m._sys_argv[2:])
m.mainpyfile = m._program_sys_argv[1]
while True:
Modified: sandbox/trunk/pdb/mthread.py
==============================================================================
--- sandbox/trunk/pdb/mthread.py (original)
+++ sandbox/trunk/pdb/mthread.py Mon Jul 10 03:53:38 2006
@@ -18,6 +18,16 @@
# said thread stopped at.
_main_debugger = None
+# These global variables keep track of 'global state' in the
+# debugger.
+g_args = None
+g_breaks = {}
+g_commands = {} # commands to execute at breakpoints
+g_main_dirname = None
+g_program_sys_argv = None
+g_search_path = None
+g_sys_argv = None
+
class MTracer(Gdb):
""" A class to trace a thread. Breakpoints can be passed from
a main debugger to this debugger through the constructor
@@ -30,7 +40,17 @@
self.prompt = '(MPdb: %s)' % self.thread.getName()
self.running = True
self.reset()
-
+
+ # Copy some state
+ self.breaks = dict(g_breaks)
+ self._program_sys_argv = g_program_sys_argv
+ self._sys_argv = g_sys_argv
+
+ # XXX This needs fixing, hack so that when a new MTracer
+ # is created for this thread we don't stop at the bottom
+ # frame.
+ self.botframe = 1
+
def user_line(self, frame):
""" Override this method from pydb.pydbbdb.Bdb to make
it thread-safe.
@@ -47,6 +67,7 @@
Gdb.user_call(self, frame, args)
_main_debugger.lock.release()
+
def user_exception(self, frame, (type, value, traceback)):
""" Override this method from pydb.pydbbdb.Bdb to make
it thread-safe.
@@ -56,6 +77,9 @@
traceback))
_main_debugger.lock.release()
+ def do_thread(self, arg):
+ do_thread(arg)
+
def trace_dispatch_init(frame, event, arg):
""" This method is called by a sys.settrace when a thread is running
@@ -65,11 +89,10 @@
"""
global tt_dict
tr = MTracer()
- th = threading.currentThread()
+ th = threading.currentThread().getName()
tt_dict[th] = tr
- threading.settrace(tr.trace_dispatch)
sys.settrace(tr.trace_dispatch)
@@ -80,14 +103,34 @@
the debugger that is debugging the MainThread, i.e. the Python
interpreter.
"""
- global _main_debugger
+ global _main_debugger, g_breaks
if _main_debugger == None:
_main_debugger = debugger
# This lock must be acquired when a MTracer object
# places a call to _main_debugger.user_*
_main_debugger.lock = threading.Lock()
-
+
+ # Copy some state from the main debugger so that
+ # newly created debuggers can have the same.
+ g_breaks = _main_debugger.breaks
+ g_commands = _main_debugger.commands
+ g_dirname = _main_debugger.main_dirname
+ g_program_sys_argv = _main_debugger._program_sys_argv
+ g_search_path = _main_debugger.search_path
+ g_sys_argv = _main_debugger._sys_argv
+
+ # Replace some of the mpdb methods with thread-safe ones
+ _main_debugger.user_line = user_line
+ _main_debugger.user_call = user_call
+ _main_debugger.user_exception = user_exception
+ _main_debugger.user_return = user_return
+
+ _main_debugger.do_break = do_break
+ _main_debugger.do_thread = do_thread
+
+ global tt_dict
+ tt_dict[threading.currentThread().getName()] = _main_debugger
threading.settrace(trace_dispatch_init)
# All the methods below override the methods from MPdb so
@@ -115,7 +158,53 @@
exc_traceback))
_main_debugger.lock.release()
+def do_break(arg, temporary=0):
+ """Override the break command so that, by default,
+ breakpoints are set in all threads.
+ """
+ global tt_dict
+ if 'thread' in arg:
+ # We're setting a thread-specific breakpoint
+ # syntax: break linespec thread threadID
+ args, threadcmd, threadID = arg.split()
+ try:
+ t = tt_dict[threadID]
+ except KeyError:
+ Gdb.errmsg(_main_debugger, 'Invalid thread ID')
+ return
+
+ Gdb.do_break(_main_debugger, arg, temporary)
-
-
+ if len(tt_dict) > 1:
+ for tracer in tt_dict.values():
+ tracer.do_break(arg, temporary)
+
+def do_thread(arg):
+ """Use this command to switch between threads.
+The new thread ID must be currently known.
+
+List of thread subcommands:
+
+thread apply -- Apply a command to a list of threads
+
+Type "help thread" followed by thread subcommand name for full documentation.
+Command name abbreviations are allowed if unambiguous.
+"""
+ if not arg:
+ global tt_dict
+ for t in tt_dict.keys():
+ Gdb.msg(_main_debugger, t)
+
+ if 'apply' in arg:
+ threadID = arg[6:arg.find(' ', 6)]
+ cmd = arg[7+len(threadID):]
+ global tt_dict
+ try:
+ tr = tt_dict[threadID]
+ # XXX Maybe this wants to be, tr.do_cmd(params)?
+ tr.onecmd(cmd)
+ except KeyError:
+ return
+
+
Modified: sandbox/trunk/pdb/test/test_mconnection.py
==============================================================================
--- sandbox/trunk/pdb/test/test_mconnection.py (original)
+++ sandbox/trunk/pdb/test/test_mconnection.py Mon Jul 10 03:53:38 2006
@@ -19,7 +19,9 @@
sys.path.append("..")
from mconnection import (MConnectionServerTCP, MConnectionClientTCP,
- MConnectionSerial, ConnectionFailed, ReadError)
+ MConnectionSerial, MConnectionServerFIFO,
+ MConnectionClientFIFO, ConnectionFailed,
+ ReadError, WriteError)
# Try to connect the client to addr either until we've tried MAXTRIES
# times or until it succeeds.
@@ -100,13 +102,13 @@
"""(tcp) Test the ReadError exception."""
thread.start_new_thread(self.server.connect, (__addr__,))
- while self.server._sock == None:
+ while not self.server._sock:
pass
repeatedConnect(self.client, __addr__)
# Wait to make _absolutely_ sure that the client has connected
- while self.server.output == None:
+ while not self.server.output:
pass
self.client.disconnect()
self.assertRaises(ReadError, self.server.readline)
@@ -180,9 +182,84 @@
self.client.disconnect()
os.remove(TESTFN)
+class TestFIFOConnections(unittest.TestCase):
+ def setUp(self):
+ self.server = MConnectionServerFIFO()
+ self.client = MConnectionClientFIFO()
+
+ def testConnect(self):
+ """(FIFO) Connect a client to a server. """
+ thread.start_new_thread(self.client.connect, ('test_file',))
+ self.server.connect('test_file')
+
+ def testReadWrite(self):
+ """(FIFO) Test reading and writing to and from server/client."""
+ thread.start_new_thread(self.client.connect, ('test_file',))
+ self.server.connect('test_file')
+
+ # Server write, client read
+ self.server.write('Tim The Enchanter!\n')
+
+ # Wait for the thread to catch up
+ while not self.client.input:
+ pass
+ line = self.client.readline()
+ self.assertEquals('Tim The Enchanter!\n', line)
+
+ # Client write, server read
+ self.client.write('received\n')
+ line = self.server.readline()
+ self.assertEquals('received\n', line)
+
+ def testMultipleDisconnect(self):
+ """(FIFO) Disconnect disconnected connections."""
+ self.client.disconnect()
+ self.server.disconnect()
+
+ def testReadError(self):
+ """(FIFO) Test ReadError."""
+ thread.start_new_thread(self.client.connect, ('test_file',))
+ self.server.connect('test_file')
+
+ while not self.client.input:
+ pass
+
+ self.client.disconnect()
+ self.assertRaises(ReadError, self.server.readline)
+
+ self.client.connect('test_file')
+
+ self.server.disconnect()
+ self.assertRaises(ReadError, self.client.readline)
+
+ def testWriteError(self):
+ """(FIFO) Test WriteError."""
+ thread.start_new_thread(self.client.connect, ('test_file',))
+ self.server.connect('test_file')
+
+ while not self.client.input:
+ pass
+
+ self.client.disconnect()
+ self.assertRaises(WriteError, self.server.write, 'spam\n')
+
+ self.client.connect('test_file')
+
+ self.server.disconnect()
+ self.assertRaises(WriteError, self.client.write, 'Ni!\n')
+
+ def testInvalidPipe(self):
+ """(FIFO) Connect to an invalid named pipe."""
+ self.assertRaises(ConnectionFailed,self.client.connect, 'invalid')
+
+ def tearDown(self):
+ self.client.disconnect()
+ self.server.disconnect()
+
def test_main():
- test_support.run_unittest(TestTCPConnections, TestSerialConnections)
+ test_support.run_unittest(TestTCPConnections, TestSerialConnections,
+ TestFIFOConnections)
if __name__ == '__main__':
test_main()
From python-checkins at python.org Mon Jul 10 04:36:42 2006
From: python-checkins at python.org (neal.norwitz)
Date: Mon, 10 Jul 2006 04:36:42 +0200 (CEST)
Subject: [Python-checkins] r50506 - in python/trunk: Doc/Makefile.deps
Doc/lib/lib.tex Doc/lib/libetree.tex Misc/NEWS
Message-ID: <20060710023642.CD25B1E4009@bag.python.org>
Author: neal.norwitz
Date: Mon Jul 10 04:36:41 2006
New Revision: 50506
Added:
python/trunk/Doc/lib/libetree.tex (contents, props changed)
Modified:
python/trunk/Doc/Makefile.deps
python/trunk/Doc/lib/lib.tex
python/trunk/Misc/NEWS
Log:
Patch #1504046: Add documentation for xml.etree.
/F wrote the text docs, Englebert Gruber massaged it to latex and I
did some more massaging to try and improve the consistency and
fix some name mismatches between the declaration and text.
Modified: python/trunk/Doc/Makefile.deps
==============================================================================
--- python/trunk/Doc/Makefile.deps (original)
+++ python/trunk/Doc/Makefile.deps Mon Jul 10 04:36:41 2006
@@ -270,6 +270,7 @@
lib/xmlsaxhandler.tex \
lib/xmlsaxutils.tex \
lib/xmlsaxreader.tex \
+ lib/libetree.tex \
lib/libqueue.tex \
lib/liblocale.tex \
lib/libgettext.tex \
Modified: python/trunk/Doc/lib/lib.tex
==============================================================================
--- python/trunk/Doc/lib/lib.tex (original)
+++ python/trunk/Doc/lib/lib.tex Mon Jul 10 04:36:41 2006
@@ -171,6 +171,7 @@
\input{xmlsaxhandler}
\input{xmlsaxutils}
\input{xmlsaxreader}
+\input{libetree}
% \input{libxmllib}
\input{fileformats} % Miscellaneous file formats
Added: python/trunk/Doc/lib/libetree.tex
==============================================================================
--- (empty file)
+++ python/trunk/Doc/lib/libetree.tex Mon Jul 10 04:36:41 2006
@@ -0,0 +1,367 @@
+\section{\module{elementtree} --- The xml.etree.ElementTree Module}
+\declaremodule{standard}{elementtree}
+\moduleauthor{Fredrik Lundh}{fredrik at pythonware.com}
+\modulesynopsis{This module provides implementations
+of the Element and ElementTree types, plus support classes.
+
+A C version of this API is available as xml.etree.cElementTree.}
+\versionadded{2.5}
+
+
+\subsection{Overview\label{elementtree-overview}}
+
+The Element type is a flexible container object, designed to store
+hierarchical data structures in memory. The type can be described as a
+cross between a list and a dictionary.
+
+Each element has a number of properties associated with it:
+\begin{itemize}
+\item {}
+a tag which is a string identifying what kind of data
+this element represents (the element type, in other words).
+
+\item {}
+a number of attributes, stored in a Python dictionary.
+
+\item {}
+a text string.
+
+\item {}
+an optional tail string.
+
+\item {}
+a number of child elements, stored in a Python sequence
+
+\end{itemize}
+
+To create an element instance, use the Element or SubElement factory
+functions.
+
+The ElementTree class can be used to wrap an element
+structure, and convert it from and to XML.
+
+
+\subsection{Functions\label{elementtree-functions}}
+
+\begin{funcdesc}{Comment}{\optional{text}}
+Comment element factory. This factory function creates a special
+element that will be serialized as an XML comment.
+The comment string can be either an 8-bit ASCII string or a Unicode
+string.
+\var{text} is a string containing the comment string.
+
+\begin{datadescni}{Returns:}
+An element instance, representing a comment.
+\end{datadescni}
+\end{funcdesc}
+
+\begin{funcdesc}{dump}{elem}
+Writes an element tree or element structure to sys.stdout. This
+function should be used for debugging only.
+
+The exact output format is implementation dependent. In this
+version, it's written as an ordinary XML file.
+
+\var{elem} is an element tree or an individual element.
+\end{funcdesc}
+
+\begin{funcdesc}{Element}{tag\optional{, attrib}\optional{, **extra}}
+Element factory. This function returns an object implementing the
+standard Element interface. The exact class or type of that object
+is implementation dependent, but it will always be compatible with
+the {\_}ElementInterface class in this module.
+
+The element name, attribute names, and attribute values can be
+either 8-bit ASCII strings or Unicode strings.
+\var{tag} is the element name.
+\var{attrib} is an optional dictionary, containing element attributes.
+\var{extra} contains additional attributes, given as keyword arguments.
+
+\begin{datadescni}{Returns:}
+An element instance.
+\end{datadescni}
+\end{funcdesc}
+
+\begin{funcdesc}{fromstring}{text}
+Parses an XML section from a string constant. Same as XML.
+\var{text} is a string containing XML data.
+
+\begin{datadescni}{Returns:}
+An Element instance.
+\end{datadescni}
+\end{funcdesc}
+
+\begin{funcdesc}{iselement}{element}
+Checks if an object appears to be a valid element object.
+\var{element} is an element instance.
+
+\begin{datadescni}{Returns:}
+A true value if this is an element object.
+\end{datadescni}
+\end{funcdesc}
+
+\begin{funcdesc}{iterparse}{source\optional{, events}}
+Parses an XML section into an element tree incrementally, and reports
+what's going on to the user.
+\var{source} is a filename or file object containing XML data.
+\var{events} is a list of events to report back. If omitted, only ``end''
+events are reported.
+
+\begin{datadescni}{Returns:}
+A (event, elem) iterator.
+\end{datadescni}
+\end{funcdesc}
+
+\begin{funcdesc}{parse}{source\optional{, parser}}
+Parses an XML section into an element tree.
+\var{source} is a filename or file object containing XML data.
+\var{parser} is an optional parser instance. If not given, the
+standard XMLTreeBuilder parser is used.
+
+\begin{datadescni}{Returns:}
+An ElementTree instance
+\end{datadescni}
+\end{funcdesc}
+
+\begin{funcdesc}{ProcessingInstruction}{target\optional{, text}}
+PI element factory. This factory function creates a special element
+that will be serialized as an XML processing instruction.
+\var{target} is a string containing the PI target.
+\var{text} is a string containing the PI contents, if given.
+
+\begin{datadescni}{Returns:}
+An element instance, representing a PI.
+\end{datadescni}
+\end{funcdesc}
+
+\begin{funcdesc}{SubElement}{parent, tag\optional{, attrib} \optional{, **extra}}
+Subelement factory. This function creates an element instance, and
+appends it to an existing element.
+
+The element name, attribute names, and attribute values can be
+either 8-bit ASCII strings or Unicode strings.
+\var{parent} is the parent element.
+\var{tag} is the subelement name.
+\var{attrib} is an optional dictionary, containing element attributes.
+\var{extra} contains additional attributes, given as keyword arguments.
+
+\begin{datadescni}{Returns:}
+An element instance.
+\end{datadescni}
+\end{funcdesc}
+
+\begin{funcdesc}{tostring}{element\optional{, encoding}}
+Generates a string representation of an XML element, including all
+subelements.
+\var{element} is an Element instance.
+\var{encoding} is the output encoding (default is US-ASCII).
+
+\begin{datadescni}{Returns:}
+An encoded string containing the XML data.
+\end{datadescni}
+\end{funcdesc}
+
+\begin{funcdesc}{XML}{text}
+Parses an XML section from a string constant. This function can
+be used to embed ``XML literals'' in Python code.
+\var{text} is a string containing XML data.
+
+\begin{datadescni}{Returns:}
+An Element instance.
+\end{datadescni}
+\end{funcdesc}
+
+\begin{funcdesc}{XMLID}{text}
+Parses an XML section from a string constant, and also returns
+a dictionary which maps element ids to elements.
+\var{text} is a string containing XML data.
+
+\begin{datadescni}{Returns:}
+A tuple containing an Element instance and a dictionary.
+\end{datadescni}
+\end{funcdesc}
+
+
+\subsection{ElementTree Objects\label{elementtree-elementtree-objects}}
+
+\begin{classdesc}{ElementTree}{\optional{element,} \optional{file}}
+ElementTree wrapper class. This class represents an entire element
+hierarchy, and adds some extra support for serialization to and from
+standard XML.
+
+\var{element} is the root element.
+The tree is initialized with the contents of the XML \var{file} if given.
+\end{classdesc}
+
+\begin{methoddesc}{_setroot}{element}
+Replaces the root element for this tree. This discards the
+current contents of the tree, and replaces it with the given
+element. Use with care.
+\var{element} is an element instance.
+\end{methoddesc}
+
+\begin{methoddesc}{find}{path}
+Finds the first toplevel element with given tag.
+Same as getroot().find(path).
+\var{path} is the element to look for.
+
+\begin{datadescni}{Returns:}
+The first matching element, or None if no element was found.
+\end{datadescni}
+\end{methoddesc}
+
+\begin{methoddesc}{findall}{path}
+Finds all toplevel elements with the given tag.
+Same as getroot().findall(path).
+\var{path} is the element to look for.
+
+\begin{datadescni}{Returns:}
+A list or iterator containing all matching elements,
+in section order.
+\end{datadescni}
+\end{methoddesc}
+
+\begin{methoddesc}{findtext}{path\optional{, default}}
+Finds the element text for the first toplevel element with given
+tag. Same as getroot().findtext(path).
+\var{path} is the toplevel element to look for.
+\var{default} is the value to return if the element was not found.
+
+\begin{datadescni}{Returns:}
+The text content of the first matching element, or the
+default value if no element was found. Note that if the element
+is found but has no text content, this method returns an
+empty string.
+\end{datadescni}
+\end{methoddesc}
+
+\begin{methoddesc}{getiterator}{\optional{tag}}
+Creates a tree iterator for the root element. The iterator loops
+over all elements in this tree, in section order.
+\var{tag} is the tag to look for (default is to return all elements)
+
+\begin{datadescni}{Returns:}
+An iterator.
+\end{datadescni}
+\end{methoddesc}
+
+\begin{methoddesc}{getroot}{}
+Gets the root element for this tree.
+
+\begin{datadescni}{Returns:}
+An element instance.
+\end{datadescni}
+\end{methoddesc}
+
+\begin{methoddesc}{parse}{source\optional{, parser}}
+Loads an external XML section into this element tree.
+\var{source} is a file name or file object.
+\var{parser} is an optional parser instance. If not given, the
+standard XMLTreeBuilder parser is used.
+
+\begin{datadescni}{Returns:}
+The section root element.
+\end{datadescni}
+\end{methoddesc}
+
+\begin{methoddesc}{write}{file\optional{, encoding}}
+Writes the element tree to a file, as XML.
+\var{file} is a file name, or a file object opened for writing.
+\var{encoding} is the output encoding (default is US-ASCII).
+\end{methoddesc}
+
+
+\subsection{QName Objects\label{elementtree-qname-objects}}
+
+\begin{classdesc}{QName}{text_or_uri\optional{, tag}}
+QName wrapper. This can be used to wrap a QName attribute value, in
+order to get proper namespace handling on output.
+\var{text_or_uri} is a string containing the QName value,
+in the form {\{}uri{\}}local, or, if the tag argument is given,
+the URI part of a QName.
+If \var{tag} is given, the first argument is interpreted as
+a URI, and this argument is interpreted as a local name.
+
+\begin{datadescni}{Returns:}
+An opaque object, representing the QName.
+\end{datadescni}
+\end{classdesc}
+
+
+\subsection{TreeBuilder Objects\label{elementtree-treebuilder-objects}}
+
+\begin{classdesc}{TreeBuilder}{\optional{element_factory}}
+Generic element structure builder. This builder converts a sequence
+of start, data, and end method calls to a well-formed element structure.
+You can use this class to build an element structure using a custom XML
+parser, or a parser for some other XML-like format.
+The \var{element_factory} is called to create new Element instances when
+given.
+\end{classdesc}
+
+\begin{methoddesc}{close}{}
+Flushes the parser buffers, and returns the toplevel document
+element.
+
+\begin{datadescni}{Returns:}
+An Element instance.
+\end{datadescni}
+\end{methoddesc}
+
+\begin{methoddesc}{data}{data}
+Adds text to the current element.
+\var{data} is a string. This should be either an 8-bit string
+containing ASCII text, or a Unicode string.
+\end{methoddesc}
+
+\begin{methoddesc}{end}{tag}
+Closes the current element.
+\var{tag} is the element name.
+
+\begin{datadescni}{Returns:}
+The closed element.
+\end{datadescni}
+\end{methoddesc}
+
+\begin{methoddesc}{start}{tag, attrs}
+Opens a new element.
+\var{tag} is the element name.
+\var{attrs} is a dictionary containing element attributes.
+
+\begin{datadescni}{Returns:}
+The opened element.
+\end{datadescni}
+\end{methoddesc}
+
+
+\subsection{XMLTreeBuilder Objects\label{elementtree-xmltreebuilder-objects}}
+
+\begin{classdesc}{XMLTreeBuilder}{\optional{html,} \optional{target}}
+Element structure builder for XML source data, based on the
+expat parser.
+\var{html} is a flag for predefined HTML entities. It is not supported
+by the current implementation.
+\var{target} is the target object. If omitted, the builder uses an
+instance of the standard TreeBuilder class.
+\end{classdesc}
+
+\begin{methoddesc}{close}{}
+Finishes feeding data to the parser.
+
+\begin{datadescni}{Returns:}
+An element structure.
+\end{datadescni}
+\end{methoddesc}
+
+\begin{methoddesc}{doctype}{name, pubid, system}
+Handles a doctype declaration.
+\var{name} is the doctype name.
+\var{pubid} is the public identifier.
+\var{system} is the system identifier.
+\end{methoddesc}
+
+\begin{methoddesc}{feed}{data}
+Feeds data to the parser.
+
+\var{data} is encoded data.
+\end{methoddesc}
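A short, self-contained example of the API documented above (this is ordinary xml.etree usage in 2.5, not something introduced by the patch):

    from xml.etree import ElementTree

    # Build a small tree with the Element/SubElement factories.
    root = ElementTree.Element("catalog")
    book = ElementTree.SubElement(root, "book", {"id": "1"})
    title = ElementTree.SubElement(book, "title")
    title.text = "Example"

    # Serialize it, parse it back, and query it.
    data = ElementTree.tostring(root)
    tree = ElementTree.ElementTree(ElementTree.fromstring(data))
    print tree.findtext("book/title")          # -> 'Example'
    for elem in tree.getiterator("book"):
        print elem.tag, elem.get("id")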
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 10 04:36:41 2006
@@ -140,6 +140,7 @@
- Patch #1509163: MS Toolkit Compiler no longer available.
+- Patch #1504046: Add documentation for xml.etree.
What's New in Python 2.5 beta 1?
================================
From python-checkins at python.org Mon Jul 10 04:36:52 2006
From: python-checkins at python.org (neal.norwitz)
Date: Mon, 10 Jul 2006 04:36:52 +0200 (CEST)
Subject: [Python-checkins] r50507 - peps/trunk/pep-0356.txt
Message-ID: <20060710023652.DDDDC1E4009@bag.python.org>
Author: neal.norwitz
Date: Mon Jul 10 04:36:52 2006
New Revision: 50507
Modified:
peps/trunk/pep-0356.txt
Log:
Some doc is in, good enough to start. We punted on the relative imports from main. Deferred until 2.6
Modified: peps/trunk/pep-0356.txt
==============================================================================
--- peps/trunk/pep-0356.txt (original)
+++ peps/trunk/pep-0356.txt Mon Jul 10 04:36:52 2006
@@ -150,22 +150,10 @@
http://python.org/sf/1517042 - Fix crashers/gc_inspection.py
http://python.org/sf/1515471 - stringobject (char buffers)
http://python.org/sf/1508010 - msvccompiler using VC6
- http://python.org/sf/1504046 - doc for ElementTree
http://python.org/sf/1494314 - can't use high sockets (regr in 2.4.3)
http://python.org/sf/1475523 - gettext.py bug
http://python.org/sf/1333982 - AST
- - Should relative imports from __main__ work when feasible?
- Bug report: http://python.org/sf/1510172
- Rev 47142 updated -m switch to always set a __module_name__ attribute
- Patch attached to bug report makes import.c aware of that attribute.
- Needs a call from the release managers as to whether to apply the
- second half of the fix, or else roll back the first half.
-
- - Missing documentation
- * ElementTree/cElementTree (Fredrik Lundh)
- Needs latex-ifcation, patch at http://python.org/sf/1504046
-
- AST compiler problems
(Owner: Jeremy Hylton)
* string constants that are not assigned are in byte code
From python-checkins at python.org Mon Jul 10 06:06:55 2006
From: python-checkins at python.org (neal.norwitz)
Date: Mon, 10 Jul 2006 06:06:55 +0200 (CEST)
Subject: [Python-checkins] r50508 - peps/trunk/pep-0356.txt
Message-ID: <20060710040655.4E1561E4009@bag.python.org>
Author: neal.norwitz
Date: Mon Jul 10 06:06:54 2006
New Revision: 50508
Modified:
peps/trunk/pep-0356.txt
Log:
Add some XML differences. Remove ref to setuptools which is not in 2.5
Modified: peps/trunk/pep-0356.txt
==============================================================================
--- peps/trunk/pep-0356.txt (original)
+++ peps/trunk/pep-0356.txt Mon Jul 10 06:06:54 2006
@@ -136,7 +136,6 @@
http://mail.python.org/pipermail/python-dev/2006-February/060926.html
- bdist_egg in distutils package
- (though bdist_egg is supported in setuptools)
- pure python pgen module
(Owner: Guido)
@@ -149,6 +148,8 @@
- Bugs that need resolving before release:
http://python.org/sf/1517042 - Fix crashers/gc_inspection.py
http://python.org/sf/1515471 - stringobject (char buffers)
+ http://python.org/sf/1513611 - XML: xml.sax.expatreader missing
+ http://python.org/sf/1511497 - XML: xml.sax.ParseException issue
http://python.org/sf/1508010 - msvccompiler using VC6
http://python.org/sf/1494314 - can't use high sockets (regr in 2.4.3)
http://python.org/sf/1475523 - gettext.py bug
From python-checkins at python.org Mon Jul 10 09:23:49 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Mon, 10 Jul 2006 09:23:49 +0200 (CEST)
Subject: [Python-checkins] r50509 - in python/trunk: Doc/dist/dist.tex
Lib/distutils/msvccompiler.py Misc/NEWS
Message-ID: <20060710072349.A00BB1E4008@bag.python.org>
Author: martin.v.loewis
Date: Mon Jul 10 09:23:48 2006
New Revision: 50509
Modified:
python/trunk/Doc/dist/dist.tex
python/trunk/Lib/distutils/msvccompiler.py
python/trunk/Misc/NEWS
Log:
Introduce DISTUTILS_USE_SDK as a flag to determine whether the
SDK environment should be used. Fixes #1508010.
Modified: python/trunk/Doc/dist/dist.tex
==============================================================================
--- python/trunk/Doc/dist/dist.tex (original)
+++ python/trunk/Doc/dist/dist.tex Mon Jul 10 09:23:48 2006
@@ -2873,9 +2873,20 @@
\modulesynopsis{Microsoft Compiler}
This module provides \class{MSVCCompiler}, an implementation of the abstract
-\class{CCompiler} class for Microsoft Visual Studio. It should also work using
-the freely available compiler provided as part of the .Net SDK download. XXX
-download link.
+\class{CCompiler} class for Microsoft Visual Studio. Typically, extension
+modules need to be compiled with the same compiler that was used to compile
+Python. For Python 2.3 and earlier, the compiler was Visual Studio 6. For
+Python 2.4 and 2.5, the compiler is Visual Studio .NET 2003. The AMD64
+and Itanium binaries are created using the Platform SDK.
+
+\class{MSVCCompiler} will normally choose the right compiler, linker, etc.
+on its own. To override this choice, the environment variables
+\var{DISTUTILS\_USE\_SDK} and \var{MSSdk} must both be set. \var{MSSdk}
+indicates that the current environment has been set up by the SDK's
+\code{SetEnv.Cmd} script, or that the environment variables had been
+registered when the SDK was installed; \var{DISTUTILS\_USE\_SDK} indicates
+that the distutils user has made an explicit choice to override the
+compiler selection by \class{MSVCCompiler}.
\section{\module{distutils.bcppcompiler} --- Borland Compiler}
\declaremodule{standard}{distutils.bcppcompiler}
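A rough sketch (illustration only, not from the patch; the helper name is
made up) of the selection rule documented above: the SDK environment is only
trusted when both variables are set, otherwise MSVCCompiler locates the
Visual Studio 2003 toolchain itself::

    import os

    def prefers_sdk_environment():
        # Both variables must be present, per the documentation above.
        return ("DISTUTILS_USE_SDK" in os.environ and "MSSdk" in os.environ)

    if prefers_sdk_environment():
        print "trusting the SetEnv.Cmd environment (cl.exe already on PATH)"
    else:
        print "letting MSVCCompiler pick the VS2003 compiler and linker itself"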
Modified: python/trunk/Lib/distutils/msvccompiler.py
==============================================================================
--- python/trunk/Lib/distutils/msvccompiler.py (original)
+++ python/trunk/Lib/distutils/msvccompiler.py Mon Jul 10 09:23:48 2006
@@ -237,7 +237,7 @@
def initialize(self):
self.__paths = []
- if os.environ.has_key("MSSdk") and self.find_exe("cl.exe"):
+ if os.environ.has_key("DISTUTILS_USE_SDK") and os.environ.has_key("MSSdk") and self.find_exe("cl.exe"):
# Assume that the SDK set up everything alright; don't try to be
# smarter
self.cc = "cl.exe"
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 10 09:23:48 2006
@@ -39,6 +39,10 @@
Library
-------
+- Bug #1508010: msvccompiler now requires the DISTUTILS_USE_SDK
+ environment variable to be set in order to use the SDK environment
+ for finding the compiler, include files, etc.
+
- Bug #1515998: Properly generate logical ids for files in bdist_msi.
- warnings.py now ignores ImportWarning by default
From python-checkins at python.org Mon Jul 10 09:26:41 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Mon, 10 Jul 2006 09:26:41 +0200 (CEST)
Subject: [Python-checkins] r50510 -
python/trunk/Lib/distutils/msvccompiler.py
Message-ID: <20060710072641.B38D01E4002@bag.python.org>
Author: martin.v.loewis
Date: Mon Jul 10 09:26:41 2006
New Revision: 50510
Modified:
python/trunk/Lib/distutils/msvccompiler.py
Log:
Change error message to indicate that VS2003 is necessary to build extension modules, not the .NET SDK.
Modified: python/trunk/Lib/distutils/msvccompiler.py
==============================================================================
--- python/trunk/Lib/distutils/msvccompiler.py (original)
+++ python/trunk/Lib/distutils/msvccompiler.py Mon Jul 10 09:26:41 2006
@@ -131,7 +131,7 @@
self.set_macro("FrameworkSDKDir", net, "sdkinstallroot")
except KeyError, exc: #
raise DistutilsPlatformError, \
- ("The .NET Framework SDK needs to be installed before "
+ ("Visual Studio 2003 needs to be installed before "
"building extensions for Python.")
p = r"Software\Microsoft\NET Framework Setup\Product"
From python-checkins at python.org Mon Jul 10 09:29:41 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Mon, 10 Jul 2006 09:29:41 +0200 (CEST)
Subject: [Python-checkins] r50511 - python/trunk/Tools/msi
Message-ID: <20060710072941.0437D1E4002@bag.python.org>
Author: martin.v.loewis
Date: Mon Jul 10 09:29:41 2006
New Revision: 50511
Modified:
python/trunk/Tools/msi/ (props changed)
Log:
Add svn:ignore.
From anthony at interlink.com.au Mon Jul 10 09:32:26 2006
From: anthony at interlink.com.au (Anthony Baxter)
Date: Mon, 10 Jul 2006 17:32:26 +1000
Subject: [Python-checkins] TRUNK FREEZE for 2.5b2, tomorrow Tuesday 11th,
00:00 UTC
Message-ID: <200607101732.28316.anthony@interlink.com.au>
The trunk is frozen from 00:00 UTC Tuesday the 11th of July. This is
in about 16 hours' time. As usual, unless you're a member of the
release team, please don't check in past that time until I send an
email that the release is done.
Thanks,
Anthony
--
Anthony Baxter
It's never too late to have a happy childhood.
From python-checkins at python.org Mon Jul 10 09:41:04 2006
From: python-checkins at python.org (anthony.baxter)
Date: Mon, 10 Jul 2006 09:41:04 +0200 (CEST)
Subject: [Python-checkins] r50512 - in python/trunk:
Doc/commontex/boilerplate.tex Include/patchlevel.h
Lib/idlelib/NEWS.txt Lib/idlelib/idlever.py Misc/NEWS
Misc/RPM/python-2.5.spec README
Message-ID: <20060710074104.F40121E4002@bag.python.org>
Author: anthony.baxter
Date: Mon Jul 10 09:41:04 2006
New Revision: 50512
Modified:
python/trunk/Doc/commontex/boilerplate.tex
python/trunk/Include/patchlevel.h
python/trunk/Lib/idlelib/NEWS.txt
python/trunk/Lib/idlelib/idlever.py
python/trunk/Misc/NEWS
python/trunk/Misc/RPM/python-2.5.spec
python/trunk/README
Log:
preparing for 2.5b2
Modified: python/trunk/Doc/commontex/boilerplate.tex
==============================================================================
--- python/trunk/Doc/commontex/boilerplate.tex (original)
+++ python/trunk/Doc/commontex/boilerplate.tex Mon Jul 10 09:41:04 2006
@@ -5,5 +5,5 @@
Email: \email{docs at python.org}
}
-\date{\today} % XXX update before final release!
+\date{11th July, 2006} % XXX update before final release!
\input{patchlevel} % include Python version information
Modified: python/trunk/Include/patchlevel.h
==============================================================================
--- python/trunk/Include/patchlevel.h (original)
+++ python/trunk/Include/patchlevel.h Mon Jul 10 09:41:04 2006
@@ -23,10 +23,10 @@
#define PY_MINOR_VERSION 5
#define PY_MICRO_VERSION 0
#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_BETA
-#define PY_RELEASE_SERIAL 1
+#define PY_RELEASE_SERIAL 2
/* Version as a string */
-#define PY_VERSION "2.5b1"
+#define PY_VERSION "2.5b2"
/* Subversion Revision number of this file (not of the repository) */
#define PY_PATCHLEVEL_REVISION "$Revision$"
Modified: python/trunk/Lib/idlelib/NEWS.txt
==============================================================================
--- python/trunk/Lib/idlelib/NEWS.txt (original)
+++ python/trunk/Lib/idlelib/NEWS.txt Mon Jul 10 09:41:04 2006
@@ -1,3 +1,8 @@
+What's New in IDLE 1.2b2?
+=========================
+
+*Release date: 11-JUL-2006*
+
What's New in IDLE 1.2b1?
=========================
Modified: python/trunk/Lib/idlelib/idlever.py
==============================================================================
--- python/trunk/Lib/idlelib/idlever.py (original)
+++ python/trunk/Lib/idlelib/idlever.py Mon Jul 10 09:41:04 2006
@@ -1 +1 @@
-IDLE_VERSION = "1.2b1"
+IDLE_VERSION = "1.2b2"
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 10 09:41:04 2006
@@ -7,7 +7,7 @@
What's New in Python 2.5 beta 2?
================================
-*Release date: XX-JUL-2006*
+*Release date: 11-JUL-2006*
Core and builtins
-----------------
Modified: python/trunk/Misc/RPM/python-2.5.spec
==============================================================================
--- python/trunk/Misc/RPM/python-2.5.spec (original)
+++ python/trunk/Misc/RPM/python-2.5.spec Mon Jul 10 09:41:04 2006
@@ -33,7 +33,7 @@
#################################
%define name python
-%define version 2.5a2
+%define version 2.5b2
%define libvers 2.5
%define release 1pydotorg
%define __prefix /usr
Modified: python/trunk/README
==============================================================================
--- python/trunk/README (original)
+++ python/trunk/README Mon Jul 10 09:41:04 2006
@@ -1,4 +1,4 @@
-This is Python version 2.5 alpha 1
+This is Python version 2.5 beta 2
==================================
Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006 Python Software Foundation.
From python-checkins at python.org Mon Jul 10 11:10:28 2006
From: python-checkins at python.org (thomas.heller)
Date: Mon, 10 Jul 2006 11:10:28 +0200 (CEST)
Subject: [Python-checkins] r50513 - in python/trunk:
Lib/ctypes/test/test_pointers.py Misc/NEWS Modules/_ctypes/cfield.c
Message-ID: <20060710091028.EB7671E4002@bag.python.org>
Author: thomas.heller
Date: Mon Jul 10 11:10:28 2006
New Revision: 50513
Modified:
python/trunk/Lib/ctypes/test/test_pointers.py
python/trunk/Misc/NEWS
python/trunk/Modules/_ctypes/cfield.c
Log:
Fix bug #1518190: accept any integer or long value in the
ctypes.c_void_p constructor.
Modified: python/trunk/Lib/ctypes/test/test_pointers.py
==============================================================================
--- python/trunk/Lib/ctypes/test/test_pointers.py (original)
+++ python/trunk/Lib/ctypes/test/test_pointers.py Mon Jul 10 11:10:28 2006
@@ -157,6 +157,23 @@
q = pointer(y)
pp[0] = q # <==
self.failUnlessEqual(p[0], 6)
+ def test_c_void_p(self):
+ # http://sourceforge.net/tracker/?func=detail&aid=1518190&group_id=5470&atid=105470
+ if sizeof(c_void_p) == 4:
+ self.failUnlessEqual(c_void_p(0xFFFFFFFFL).value,
+ c_void_p(-1).value)
+ self.failUnlessEqual(c_void_p(0xFFFFFFFFFFFFFFFFL).value,
+ c_void_p(-1).value)
+ elif sizeof(c_void_p) == 8:
+ self.failUnlessEqual(c_void_p(0xFFFFFFFFL).value,
+ 0xFFFFFFFFL)
+ self.failUnlessEqual(c_void_p(0xFFFFFFFFFFFFFFFFL).value,
+ c_void_p(-1).value)
+ self.failUnlessEqual(c_void_p(0xFFFFFFFFFFFFFFFFFFFFFFFFL).value,
+ c_void_p(-1).value)
+
+ self.assertRaises(TypeError, c_void_p, 3.14) # make sure floats are NOT accepted
+ self.assertRaises(TypeError, c_void_p, object()) # nor other objects
if __name__ == '__main__':
unittest.main()
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 10 11:10:28 2006
@@ -39,6 +39,9 @@
Library
-------
+- Bug #1518190: The ctypes.c_void_p constructor now accepts any
+ integer or long, without range checking.
+
- Bug #1508010: msvccompiler now requires the DISTUTILS_USE_SDK
environment variable to be set in order to use the SDK environment
for finding the compiler, include files, etc.
Modified: python/trunk/Modules/_ctypes/cfield.c
==============================================================================
--- python/trunk/Modules/_ctypes/cfield.c (original)
+++ python/trunk/Modules/_ctypes/cfield.c Mon Jul 10 11:10:28 2006
@@ -1486,16 +1486,27 @@
*(void **)ptr = NULL;
_RET(value);
}
-
- v = PyLong_AsVoidPtr(value);
- if (PyErr_Occurred()) {
- /* prevent the SystemError: bad argument to internal function */
- if (!PyInt_Check(value) && !PyLong_Check(value)) {
- PyErr_SetString(PyExc_TypeError,
- "cannot be converted to pointer");
- }
+
+ if (!PyInt_Check(value) && !PyLong_Check(value)) {
+ PyErr_SetString(PyExc_TypeError,
+ "cannot be converted to pointer");
return NULL;
}
+
+#if SIZEOF_VOID_P <= SIZEOF_LONG
+ v = (void *)PyInt_AsUnsignedLongMask(value);
+#else
+#ifndef HAVE_LONG_LONG
+# error "PyLong_AsVoidPtr: sizeof(void*) > sizeof(long), but no long long"
+#elif SIZEOF_LONG_LONG < SIZEOF_VOID_P
+# error "PyLong_AsVoidPtr: sizeof(PY_LONG_LONG) < sizeof(void*)"
+#endif
+ v = (void *)PyInt_AsUnsignedLongLongMask(value);
+#endif
+
+ if (PyErr_Occurred())
+ return NULL;
+
*(void **)ptr = v;
_RET(value);
}
From python-checkins at python.org Mon Jul 10 11:31:07 2006
From: python-checkins at python.org (thomas.heller)
Date: Mon, 10 Jul 2006 11:31:07 +0200 (CEST)
Subject: [Python-checkins] r50514 - in python/trunk: Misc/NEWS
Modules/_ctypes/_ctypes.c
Message-ID: <20060710093107.819261E401A@bag.python.org>
Author: thomas.heller
Date: Mon Jul 10 11:31:06 2006
New Revision: 50514
Modified:
python/trunk/Misc/NEWS
python/trunk/Modules/_ctypes/_ctypes.c
Log:
Fixed a segfault when ctypes.wintypes were imported on
non-Windows machines.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 10 11:31:06 2006
@@ -39,6 +39,9 @@
Library
-------
+- Fixed a segfault in _ctypes when ctypes.wintypes were imported
+ on non-Windows platforms.
+
- Bug #1518190: The ctypes.c_void_p constructor now accepts any
integer or long, without range checking.
Modified: python/trunk/Modules/_ctypes/_ctypes.c
==============================================================================
--- python/trunk/Modules/_ctypes/_ctypes.c (original)
+++ python/trunk/Modules/_ctypes/_ctypes.c Mon Jul 10 11:31:06 2006
@@ -1384,13 +1384,20 @@
Py_DECREF(result);
return NULL;
}
+ fmt = getentry(PyString_AS_STRING(proto));
+ if (fmt == NULL) {
+ Py_DECREF(result);
+ PyErr_Format(PyExc_ValueError,
+ "_type_ '%s' not supported",
+ PyString_AS_STRING(proto));
+ return NULL;
+ }
+
stgdict = (StgDictObject *)PyObject_CallObject(
(PyObject *)&StgDict_Type, NULL);
if (!stgdict)
return NULL;
- fmt = getentry(PyString_AS_STRING(proto));
-
stgdict->ffi_type_pointer = *fmt->pffi_type;
stgdict->align = fmt->pffi_type->alignment;
stgdict->length = 0;
From python-checkins at python.org Mon Jul 10 13:11:11 2006
From: python-checkins at python.org (thomas.heller)
Date: Mon, 10 Jul 2006 13:11:11 +0200 (CEST)
Subject: [Python-checkins] r50516 - in python/trunk:
Lib/ctypes/test/test_structures.py Misc/NEWS
Modules/_ctypes/_ctypes.c
Message-ID: <20060710111111.232E51E4002@bag.python.org>
Author: thomas.heller
Date: Mon Jul 10 13:11:10 2006
New Revision: 50516
Modified:
python/trunk/Lib/ctypes/test/test_structures.py
python/trunk/Misc/NEWS
python/trunk/Modules/_ctypes/_ctypes.c
Log:
Assigning None to pointer type structure fields possibly overwrote the
wrong fields.
Modified: python/trunk/Lib/ctypes/test/test_structures.py
==============================================================================
--- python/trunk/Lib/ctypes/test/test_structures.py (original)
+++ python/trunk/Lib/ctypes/test/test_structures.py Mon Jul 10 13:11:10 2006
@@ -371,5 +371,15 @@
items = [s.array[i] for i in range(3)]
self.failUnlessEqual(items, [1, 2, 3])
+ def test_none_to_pointer_fields(self):
+ class S(Structure):
+ _fields_ = [("x", c_int),
+ ("p", POINTER(c_int))]
+
+ s = S()
+ s.x = 12345678
+ s.p = None
+ self.failUnlessEqual(s.x, 12345678)
+
if __name__ == '__main__':
unittest.main()
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 10 13:11:10 2006
@@ -39,6 +39,9 @@
Library
-------
+- Assigning None to pointer type fields in ctypes structures possibly
+ overwrote the wrong fields; this is now fixed.
+
- Fixed a segfault in _ctypes when ctypes.wintypes were imported
on non-Windows platforms.
Modified: python/trunk/Modules/_ctypes/_ctypes.c
==============================================================================
--- python/trunk/Modules/_ctypes/_ctypes.c (original)
+++ python/trunk/Modules/_ctypes/_ctypes.c Mon Jul 10 13:11:10 2006
@@ -2187,7 +2187,7 @@
Py_DECREF(ob);
return result;
} else if (value == Py_None && PointerTypeObject_Check(type)) {
- *(void **)dst->b_ptr = NULL;
+ *(void **)ptr = NULL;
Py_INCREF(Py_None);
return Py_None;
} else {
From python-checkins at python.org Mon Jul 10 13:17:37 2006
From: python-checkins at python.org (thomas.heller)
Date: Mon, 10 Jul 2006 13:17:37 +0200 (CEST)
Subject: [Python-checkins] r50517 - python/trunk/Misc/NEWS
Message-ID: <20060710111737.CFBC91E4008@bag.python.org>
Author: thomas.heller
Date: Mon Jul 10 13:17:37 2006
New Revision: 50517
Modified:
python/trunk/Misc/NEWS
Log:
Moved the ctypes news entries from the 'Library' section into the
'Extension Modules' section where they belong, probably.
This destroys the original order of the news entries; I don't know
if that is important or not.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 10 13:17:37 2006
@@ -39,15 +39,6 @@
Library
-------
-- Assigning None to pointer type fields in ctypes structures possibly
- overwrote the wrong fields; this is now fixed.
-
-- Fixed a segfault in _ctypes when ctypes.wintypes were imported
- on non-Windows platforms.
-
-- Bug #1518190: The ctypes.c_void_p constructor now accepts any
- integer or long, without range checking.
-
- Bug #1508010: msvccompiler now requires the DISTUTILS_USE_SDK
environment variable to be set in order to use the SDK environment
for finding the compiler, include files, etc.
@@ -56,10 +47,6 @@
- warnings.py now ignores ImportWarning by default
-- Patch #1517790: It is now possible to use custom objects in the ctypes
- foreign function argtypes sequence as long as they provide a from_param
- method, no longer is it required that the object is a ctypes type.
-
- string.Template() now correctly handles tuple-values. Previously,
multi-value tuples would raise an exception and single-value tuples would
be treated as the value they contain, instead.
@@ -82,9 +69,6 @@
- Bug #1513223: .close() of a _socketobj now releases the underlying
socket again, which then gets closed as it becomes unreferenced.
-- The '_ctypes' extension module now works when Python is configured
- with the --without-threads option.
-
- Bug #1504333: Make sgmllib support angle brackets in quoted
attribute values.
@@ -112,6 +96,22 @@
Extension Modules
-----------------
+- Assigning None to pointer type fields in ctypes structures possibly
+ overwrote the wrong fields; this is now fixed.
+
+- Fixed a segfault in _ctypes when ctypes.wintypes were imported
+ on non-Windows platforms.
+
+- Bug #1518190: The ctypes.c_void_p constructor now accepts any
+ integer or long, without range checking.
+
+- Patch #1517790: It is now possible to use custom objects in the ctypes
+ foreign function argtypes sequence as long as they provide a from_param
+ method, no longer is it required that the object is a ctypes type.
+
+- The '_ctypes' extension module now works when Python is configured
+ with the --without-threads option.
+
- Bug #1513646: os.access on Windows now correctly determines write
access, again.
From python-checkins at python.org Mon Jul 10 14:32:08 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Mon, 10 Jul 2006 14:32:08 +0200 (CEST)
Subject: [Python-checkins] r50520 - sandbox/trunk/Doc/functional.rst
Message-ID: <20060710123208.C46B31E4002@bag.python.org>
Author: andrew.kuchling
Date: Mon Jul 10 14:32:08 2006
New Revision: 50520
Modified:
sandbox/trunk/Doc/functional.rst
Log:
Add a name
Modified: sandbox/trunk/Doc/functional.rst
==============================================================================
--- sandbox/trunk/Doc/functional.rst (original)
+++ sandbox/trunk/Doc/functional.rst Mon Jul 10 14:32:08 2006
@@ -1148,7 +1148,7 @@
The author would like to thank the following people for offering
suggestions, corrections and assistance with various drafts of this
article: Ian Bicking, Nick Coghlan, Nick Efford, Raymond Hettinger,
-Jim Jewett, Leandro Lameiro, Collin Winter, Blake Winton.
+Jim Jewett, Mike Krell, Leandro Lameiro, Collin Winter, Blake Winton.
Version 0.1: posted June 30 2006.
From python-checkins at python.org Mon Jul 10 15:13:50 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Mon, 10 Jul 2006 15:13:50 +0200 (CEST)
Subject: [Python-checkins] r50521 - sandbox/trunk/Doc/functional.rst
Message-ID: <20060710131350.C9FA61E4002@bag.python.org>
Author: andrew.kuchling
Date: Mon Jul 10 15:13:50 2006
New Revision: 50521
Modified:
sandbox/trunk/Doc/functional.rst
Log:
Merge genexp and listcomp sections into one.
Add PEP references.
Some minor edits.
Bump version to 0.2
Modified: sandbox/trunk/Doc/functional.rst
==============================================================================
--- sandbox/trunk/Doc/functional.rst (original)
+++ sandbox/trunk/Doc/functional.rst Mon Jul 10 15:13:50 2006
@@ -1,7 +1,7 @@
Functional Programming HOWTO
================================
-**Version 0.12**
+**Version 0.2**
(This is a first draft. Please send comments/error
reports/suggestions to amk at amk.ca. This URL is probably not going to
@@ -336,26 +336,27 @@
-List comprehensions
------------------------
-
-.. comment
-
- Maybe gencomps should be described and emphasized, and listcomps
- be mentioned as an afterthought.
+Generator expressions and list comprehensions
+----------------------------------------------------
-Two common operations on a stream are performing some operation for
-every element, or selecting a subset of elements that meet some
+Two common operations on a stream are 1) performing some operation for
+every element, 2) selecting a subset of elements that meet some
condition. For example, given a list of strings, you might want to
-pull out all the strings containing a given substring or strip off
-trailing whitespace from each line.
+strip off trailing whitespace from each line or extract all the
+strings containing a given substring.
-List comprehensions (or "listcomps") are a concise notation for such
-operations, borrowed from the functional programming language Haskell
-(http://www.haskell.org). You can strip all the whitespace from
-a stream of strings with the following list comprehension::
+List comprehensions and generator expressions (short form: "listcomps"
+and "genexps") are a concise notation for such operations, borrowed
+from the functional programming language Haskell
+(http://www.haskell.org). You can strip all the whitespace from a
+stream of strings with the following code::
line_list = [' line 1\n', 'line 2 \n', ...]
+
+ # Generator expression -- returns iterator
+ stripped_iter = (line.strip() for line in line_list)
+
+ # List comprehension -- returns list
stripped_list = [line.strip() for line in line_list]
You can select only certain elements by adding an ``"if"`` condition::
@@ -363,28 +364,42 @@
stripped_list = [line.strip() for line in line_list
if line != ""]
-Note that in all cases the resulting ``stripped_list`` is a Python list
-containing the resulting lines, not an iterator. This means that list
-comprehensions aren't very useful if you're working with iterators
-that return an infinite or a very large stream of data. Later we'll
-discuss generator expressions, which have the capabilities of list
-comprehensions but can be used with infinite iterators.
+With a list comprehension, you get back a Python list;
+``stripped_list`` is a list containing the resulting lines, not an
+iterator. Generator expressions return an iterator that computes the
+values as necessary, not needing to materialize all the values at
+once. This means that list comprehensions aren't useful if you're
+working with iterators that return an infinite stream or a very large
+amount of data. Generator expressions are preferable in these
+situations.
+
+Generator expressions are surrounded by parentheses ("()") and list
+comprehensions are surrounded by square brackets ("[]"). Generator
+expressions have the form::
-List comprehensions have the form::
-
- [ expression for expr in sequence1
+ ( expression for expr in sequence1
if condition1
for expr2 in sequence2
if condition2
for expr3 in sequence3 ...
if condition3
for exprN in sequenceN
- if conditionN ]
+ if conditionN )
+
+Again, for a list comprehension only the outside brackets are
+different (square brackets instead of parentheses).
-The elements of the generated list will be the successive
-values of ``expression``. The ``if`` clauses are
-all optional; if present, ``expression`` is only evaluated and added to
-the result when ``condition`` is true.
+The elements of the generated output will be the successive values of
+``expression``. The ``if`` clauses are all optional; if present,
+``expression`` is only evaluated and added to the result when
+``condition`` is true.
+
+Generator expressions always have to be written inside parentheses,
+but the parentheses signalling a function call also count. If you
+want to create an iterator that will be immediately passed to a
+function you can write::
+
+ obj_total = sum(obj.count for obj in list_all_objects())
The ``for...in`` clauses contain the sequences to be iterated over.
The sequences do not have to be the same length, because they are
@@ -393,27 +408,27 @@
beginning. ``sequence3`` is then looped over for each
resulting pair of elements from ``sequence1`` and ``sequence2``.
-Put another way, a list comprehension is equivalent to the following
-Python code::
+To put it another way, a list comprehension or generator expression is
+equivalent to the following Python code::
for expr1 in sequence1:
if not (condition1):
- continue
+ continue # Skip this element
for expr2 in sequence2:
if not (condition2):
- continue
+ continue # Skip this element
...
for exprN in sequenceN:
if not (conditionN):
- continue
- # Append the value of
- # the expression to the
- # resulting list.
-
-This means that when there are multiple ``for...in``
-clauses, the resulting list will be equal to the product of the
-lengths of all the sequences. If you have two lists of length 3, the
-output list is 9 elements long::
+ continue # Skip this element
+
+ # Output the value of
+ # the expression.
+
+This means that when there are multiple ``for...in`` clauses but no
+``if`` clauses, the length of the resulting output will be equal to
+the product of the lengths of all the sequences. If you have two
+lists of length 3, the output list is 9 elements long::
seq1 = 'abc'
seq2 = (1,2,3)
@@ -433,28 +448,6 @@
[ (x,y) for x in seq1 for y in seq2]
-Generator Expressions
------------------------
-
-Generator expressions are written like list comprehensions, but are
-surrounded by parentheses ("()") and not square brackets
-("[]"). The result of a generator expression
-is an iterator that returns the computed elements without
-materializing a list containing them all.
-
-::
-
- (line.split() for line in line_list) =>
- ['line', '1'], ['line', '2']
-
-Generator expressions always have to be written inside parentheses, as
-in the above example. The parentheses signalling a function call also
-count, so if you want to create an iterator that will be immediately
-passed to a function you could write::
-
- obj_total = sum(obj.count for obj in list_all_objects())
-
-
Generators
-----------------------
@@ -542,7 +535,9 @@
if t:
for x in inorder(t.left):
yield x
+
yield t.label
+
for x in inorder(t.right):
yield x
@@ -558,15 +553,15 @@
''''''''''''''''''''''''''''''''''''''''''''''
In Python 2.4 and earlier, generators only produced output. Once a
-generator's code was invoked to create an iterator, there's no way to
+generator's code was invoked to create an iterator, there was no way to
pass any new information into the function when its execution is
-resumed. (You could hack together this ability by making the
-generator look at a global variable or passing in some mutable object
-that callers then modify, but these approaches are messy.)
-In Python 2.5 there's a simple way to pass values into a generator.
+resumed. You could hack together this ability by making the
+generator look at a global variable or by passing in some mutable object
+that callers then modify, but these approaches are messy.
-In Python 2.5 ``yield`` became an expression, returning a value that
-can be assigned to a variable or otherwise operated on::
+In Python 2.5 there's a simple way to pass values into a generator.
+``yield`` became an expression, returning a value that can be assigned
+to a variable or otherwise operated on::
val = (yield i)
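A short runnable sketch (illustration only; the ``counter`` generator is just
an example, not taken from the HOWTO diff) of driving a generator through the
``yield`` expression with ``send()``::

    def counter(maximum):
        i = 0
        while i < maximum:
            val = (yield i)      # None when resumed by next(), else the sent value
            if val is not None:
                i = val          # jump to the value passed in via send()
            else:
                i += 1

    it = counter(10)
    print it.next()    # -> 0
    print it.send(8)   # -> 8
    print it.next()    # -> 9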
@@ -1152,7 +1147,11 @@
Version 0.1: posted June 30 2006.
-Version 0.11: posted July 1 2006.
+Version 0.11: posted July 1 2006. Typo fixes.
+
+Version 0.2: posted July 10 2006. Merged genexp and listcomp
+sections into one. Typo fixes.
+
References
--------------------
@@ -1162,10 +1161,8 @@
**Structure and Interpretation of Computer Programs**, by
Harold Abelson and Gerald Jay Sussman with Julie Sussman.
-
Full text at http://mitpress.mit.edu/sicp/.
-
-A classic textbook of computer science. Chapters 2 and 3 discuss the
+In this classic textbook of computer science, chapters 2 and 3 discuss the
use of sequences and streams to organize the data flow inside a
program. The book uses Scheme for its examples, but many of the
design approaches described in these chapters are applicable to
@@ -1174,6 +1171,8 @@
http://en.wikipedia.org/wiki/Functional_programming:
General Wikipedia entry describing functional programming.
+http://en.wikipedia.org/wiki/Coroutine:
+Entry for coroutines.
Python documentation
'''''''''''''''''''''''''''
@@ -1184,6 +1183,12 @@
http://docs.python.org/lib/module-operator.html:
Documentation for the ``operator`` module.
+http://www.python.org/dev/peps/pep-0289/:
+PEP 289: "Generator Expressions"
+
+http://www.python.org/dev/peps/pep-0342/:
+PEP 342: "Coroutines via Enhanced Generators" describes the new generator
+features in Python 2.5.
.. comment
From python-checkins at python.org Mon Jul 10 19:13:47 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Mon, 10 Jul 2006 19:13:47 +0200 (CEST)
Subject: [Python-checkins] r50523 - sandbox/trunk/Doc/functional.rst
Message-ID: <20060710171347.C9D721E4002@bag.python.org>
Author: andrew.kuchling
Date: Mon Jul 10 19:13:47 2006
New Revision: 50523
Modified:
sandbox/trunk/Doc/functional.rst
Log:
Change variable names for consistency; bump version again
Modified: sandbox/trunk/Doc/functional.rst
==============================================================================
--- sandbox/trunk/Doc/functional.rst (original)
+++ sandbox/trunk/Doc/functional.rst Mon Jul 10 19:13:47 2006
@@ -1,7 +1,7 @@
Functional Programming HOWTO
================================
-**Version 0.2**
+**Version 0.21**
(This is a first draft. Please send comments/error
reports/suggestions to amk at amk.ca. This URL is probably not going to
@@ -853,7 +853,7 @@
::
- freq = reduce(lambda a, b: (0, a[1] + b[1]), items)[1]
+ total = reduce(lambda a, b: (0, a[1] + b[1]), items)[1]
You can figure it out, but it takes time to disentangle the expression
to figure out what's going on. Using a short nested
@@ -862,7 +862,7 @@
def combine (a, b):
return 0, a[1] + b[1]
- return reduce(combine, items)[1]
+ total = reduce(combine, items)[1]
But it would be best of all if I had simply used a ``for`` loop::
From python-checkins at python.org Mon Jul 10 19:14:29 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Mon, 10 Jul 2006 19:14:29 +0200 (CEST)
Subject: [Python-checkins] r50524 - sandbox/trunk/Doc/functional.rst
Message-ID: <20060710171429.9D8931E4002@bag.python.org>
Author: andrew.kuchling
Date: Mon Jul 10 19:14:29 2006
New Revision: 50524
Modified:
sandbox/trunk/Doc/functional.rst
Log:
Add a name
Modified: sandbox/trunk/Doc/functional.rst
==============================================================================
--- sandbox/trunk/Doc/functional.rst (original)
+++ sandbox/trunk/Doc/functional.rst Mon Jul 10 19:14:29 2006
@@ -1143,7 +1143,8 @@
The author would like to thank the following people for offering
suggestions, corrections and assistance with various drafts of this
article: Ian Bicking, Nick Coghlan, Nick Efford, Raymond Hettinger,
-Jim Jewett, Mike Krell, Leandro Lameiro, Collin Winter, Blake Winton.
+Jim Jewett, Mike Krell, Leandro Lameiro, Jussi Salmela,
+Collin Winter, Blake Winton.
Version 0.1: posted June 30 2006.
From python-checkins at python.org Mon Jul 10 19:16:53 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Mon, 10 Jul 2006 19:16:53 +0200 (CEST)
Subject: [Python-checkins] r50525 - sandbox/trunk/Doc/functional.rst
Message-ID: <20060710171653.E12CB1E4002@bag.python.org>
Author: andrew.kuchling
Date: Mon Jul 10 19:16:53 2006
New Revision: 50525
Modified:
sandbox/trunk/Doc/functional.rst
Log:
Add further development using sum
Modified: sandbox/trunk/Doc/functional.rst
==============================================================================
--- sandbox/trunk/Doc/functional.rst (original)
+++ sandbox/trunk/Doc/functional.rst Mon Jul 10 19:16:53 2006
@@ -870,6 +870,10 @@
for a, b in items:
total += b
+Or the ``sum()`` built-in and a generator expression::
+
+ total = sum(b for a,b in items)
+
Many uses of ``reduce()`` are clearer when written as ``for`` loops.
Fredrik Lundh once suggested the following set of rules for refactoring
From python-checkins at python.org Mon Jul 10 21:03:30 2006
From: python-checkins at python.org (phillip.eby)
Date: Mon, 10 Jul 2006 21:03:30 +0200 (CEST)
Subject: [Python-checkins] r50526 - in python/trunk/Lib: inspect.py
test/test_inspect.py
Message-ID: <20060710190330.2346D1E4002@bag.python.org>
Author: phillip.eby
Date: Mon Jul 10 21:03:29 2006
New Revision: 50526
Modified:
python/trunk/Lib/inspect.py
python/trunk/Lib/test/test_inspect.py
Log:
Fix SF#1516184 and add a test to prevent regression.
Modified: python/trunk/Lib/inspect.py
==============================================================================
--- python/trunk/Lib/inspect.py (original)
+++ python/trunk/Lib/inspect.py Mon Jul 10 21:03:29 2006
@@ -355,40 +355,37 @@
return None
if os.path.exists(filename):
return filename
- # Ugly but necessary - '' and '' mean that getmodule()
- # would infinitely recurse, because they're not real files nor loadable
- # Note that this means that writing a PEP 302 loader that uses '<'
- # at the start of a filename is now not a good idea. :(
- if filename[:1]!='<' and hasattr(getmodule(object), '__loader__'):
+ # only return a non-existent filename if the module has a PEP 302 loader
+ if hasattr(getmodule(object, filename), '__loader__'):
return filename
-def getabsfile(object):
+def getabsfile(object, _filename=None):
"""Return an absolute path to the source or compiled file for an object.
The idea is for each object to have a unique origin, so this routine
normalizes the result as much as possible."""
return os.path.normcase(
- os.path.abspath(getsourcefile(object) or getfile(object)))
+ os.path.abspath(_filename or getsourcefile(object) or getfile(object)))
modulesbyfile = {}
-def getmodule(object):
+def getmodule(object, _filename=None):
"""Return the module an object was defined in, or None if not found."""
if ismodule(object):
return object
if hasattr(object, '__module__'):
return sys.modules.get(object.__module__)
try:
- file = getabsfile(object)
+ file = getabsfile(object, _filename)
except TypeError:
return None
if file in modulesbyfile:
return sys.modules.get(modulesbyfile[file])
for module in sys.modules.values():
if ismodule(module) and hasattr(module, '__file__'):
- modulesbyfile[
- os.path.realpath(
- getabsfile(module))] = module.__name__
+ f = getabsfile(module)
+ modulesbyfile[f] = modulesbyfile[
+ os.path.realpath(f)] = module.__name__
if file in modulesbyfile:
return sys.modules.get(modulesbyfile[file])
main = sys.modules['__main__']
Modified: python/trunk/Lib/test/test_inspect.py
==============================================================================
--- python/trunk/Lib/test/test_inspect.py (original)
+++ python/trunk/Lib/test/test_inspect.py Mon Jul 10 21:03:29 2006
@@ -178,6 +178,16 @@
def test_getfile(self):
self.assertEqual(inspect.getfile(mod.StupidGit), mod.__file__)
+ def test_getmodule_recursion(self):
+ from new import module
+ name = '__inspect_dummy'
+ m = sys.modules[name] = module(name)
+ m.__file__ = "" # hopefully not a real filename...
+ m.__loader__ = "dummy" # pretend the filename is understood by a loader
+ exec "def x(): pass" in m.__dict__
+ self.assertEqual(inspect.getsourcefile(m.x.func_code), '')
+ del sys.modules[name]
+
class TestDecorators(GetSourceBase):
fodderFile = mod2
From python-checkins at python.org Mon Jul 10 21:09:59 2006
From: python-checkins at python.org (jackilyn.hoxworth)
Date: Mon, 10 Jul 2006 21:09:59 +0200 (CEST)
Subject: [Python-checkins] r50527 -
python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py
Message-ID: <20060710190959.B1C731E4002@bag.python.org>
Author: jackilyn.hoxworth
Date: Mon Jul 10 21:09:59 2006
New Revision: 50527
Modified:
python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py
Log:
Doesn't work yet
Modified: python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py
==============================================================================
--- python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py (original)
+++ python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py Mon Jul 10 21:09:59 2006
@@ -7,6 +7,7 @@
import logging
import httplib
+import socket
from cStringIO import StringIO
log=logging.getLogger("py.httplib")
@@ -23,6 +24,9 @@
# add the handler to the logger
log.addHandler(handler)
+# create socket
+sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
+
httplib.HTTPResponse(sock).log("message 1")
httplib.HTTPConnection().log("message 2")
print stringLog.getvalue() # For testing purposes
From buildbot at python.org Mon Jul 10 21:14:35 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 10 Jul 2006 19:14:35 +0000
Subject: [Python-checkins] buildbot warnings in x86 XP-2 trunk
Message-ID: <20060710191435.4D9A91E400E@bag.python.org>
The Buildbot has detected a new failure of x86 XP-2 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520XP-2%2520trunk/builds/752
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: phillip.eby,thomas.heller
Build Had Warnings: warnings failed slave lost
sincerely,
-The Buildbot
From python-checkins at python.org Mon Jul 10 21:18:35 2006
From: python-checkins at python.org (phillip.eby)
Date: Mon, 10 Jul 2006 21:18:35 +0200 (CEST)
Subject: [Python-checkins] r50528 -
python/trunk/Lib/distutils/command/upload.py
Message-ID: <20060710191835.843891E4002@bag.python.org>
Author: phillip.eby
Date: Mon Jul 10 21:18:35 2006
New Revision: 50528
Modified:
python/trunk/Lib/distutils/command/upload.py
Log:
Fix SF#1457312: bad socket error handling in distutils "upload" command.
Modified: python/trunk/Lib/distutils/command/upload.py
==============================================================================
--- python/trunk/Lib/distutils/command/upload.py (original)
+++ python/trunk/Lib/distutils/command/upload.py Mon Jul 10 21:18:35 2006
@@ -185,7 +185,7 @@
http.endheaders()
http.send(body)
except socket.error, e:
- self.announce(e.msg, log.ERROR)
+ self.announce(str(e), log.ERROR)
return
r = http.getresponse()
From buildbot at python.org Mon Jul 10 21:24:32 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 10 Jul 2006 19:24:32 +0000
Subject: [Python-checkins] buildbot failure in x86 XP-2 trunk
Message-ID: <20060710192433.1585E1E4002@bag.python.org>
The Buildbot has detected a new failure of x86 XP-2 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520XP-2%2520trunk/builds/753
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: phillip.eby
BUILD FAILED: failed compile
sincerely,
-The Buildbot
From python-checkins at python.org Mon Jul 10 21:48:36 2006
From: python-checkins at python.org (phillip.eby)
Date: Mon, 10 Jul 2006 21:48:36 +0200 (CEST)
Subject: [Python-checkins] r50529 -
sandbox/trunk/setuptools/setuptools/command/upload.py
Message-ID: <20060710194836.AFD661E4002@bag.python.org>
Author: phillip.eby
Date: Mon Jul 10 21:48:35 2006
New Revision: 50529
Modified:
sandbox/trunk/setuptools/setuptools/command/upload.py
Log:
Fix broken error message for socket error during upload.
Modified: sandbox/trunk/setuptools/setuptools/command/upload.py
==============================================================================
--- sandbox/trunk/setuptools/setuptools/command/upload.py (original)
+++ sandbox/trunk/setuptools/setuptools/command/upload.py Mon Jul 10 21:48:35 2006
@@ -164,7 +164,7 @@
http.endheaders()
http.send(body)
except socket.error, e:
- self.announce(e.msg, log.ERROR)
+ self.announce(str(e), log.ERROR)
return
r = http.getresponse()
From python-checkins at python.org Mon Jul 10 21:51:20 2006
From: python-checkins at python.org (phillip.eby)
Date: Mon, 10 Jul 2006 21:51:20 +0200 (CEST)
Subject: [Python-checkins] r50530 -
sandbox/branches/setuptools-0.6/setuptools/command/upload.py
Message-ID: <20060710195120.C6CD71E4002@bag.python.org>
Author: phillip.eby
Date: Mon Jul 10 21:51:20 2006
New Revision: 50530
Modified:
sandbox/branches/setuptools-0.6/setuptools/command/upload.py
Log:
Fix broken error message for socket error during upload.
(backport from trunk)
Modified: sandbox/branches/setuptools-0.6/setuptools/command/upload.py
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools/command/upload.py (original)
+++ sandbox/branches/setuptools-0.6/setuptools/command/upload.py Mon Jul 10 21:51:20 2006
@@ -164,7 +164,7 @@
http.endheaders()
http.send(body)
except socket.error, e:
- self.announce(e.msg, log.ERROR)
+ self.announce(str(e), log.ERROR)
return
r = http.getresponse()
From python-checkins at python.org Mon Jul 10 21:54:07 2006
From: python-checkins at python.org (phillip.eby)
Date: Mon, 10 Jul 2006 21:54:07 +0200 (CEST)
Subject: [Python-checkins] r50531 -
sandbox/trunk/setuptools/pkg_resources.txt
sandbox/trunk/setuptools/setuptools.txt
Message-ID: <20060710195407.A9AFD1E4002@bag.python.org>
Author: phillip.eby
Date: Mon Jul 10 21:54:06 2006
New Revision: 50531
Modified:
sandbox/trunk/setuptools/pkg_resources.txt
sandbox/trunk/setuptools/setuptools.txt
Log:
Include more detailed version ranges spec, and make Requirement.specs a
public/documented attribute.
Modified: sandbox/trunk/setuptools/pkg_resources.txt
==============================================================================
--- sandbox/trunk/setuptools/pkg_resources.txt (original)
+++ sandbox/trunk/setuptools/pkg_resources.txt Mon Jul 10 21:54:06 2006
@@ -616,6 +616,17 @@
parsed using the ``parse_version()`` utility function. Otherwise, it is
assumed to be an already-parsed version.
+ The ``Requirement`` object's version specifiers (``.specs``) are internally
+ sorted into ascending version order, and used to establish what ranges of
+ versions are acceptable. Adjacent redundant conditions are effectively
+ consolidated (e.g. ``">1, >2"`` produces the same results as ``">1"``, and
+ ``"<2,<3"`` produces the same results as ``"<3"``). ``"!="`` versions are
+ excised from the ranges they fall within. The version being tested for
+ acceptability is then checked for membership in the resulting ranges.
+ (Note that providing conflicting conditions for the same version (e.g.
+ ``"<2,>=2"`` or ``"==2,!=2"``) is meaningless and may therefore produce
+ bizarre results when compared with actual version number(s).)
+
``__eq__(other_requirement)``
A requirement compares equal to another requirement if they have
case-insensitively equal project names, version specifiers, and "extras".
@@ -640,6 +651,14 @@
function, so they may not exactly equal the extras the requirement was
created with.)
+``specs``
+ A list of ``(op,version)`` tuples, sorted in ascending parsed-version
+ order. The `op` in each tuple is a comparison operator, represented as
+ a string. The `version` is the (unparsed) version number. The relative
+ order of tuples containing the same version numbers is undefined, since
+ having more than one operator for a given version is either redundant or
+ self-contradictory.
+
Entry Points
============
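A small illustration of the ``specs`` attribute documented in the hunk above
(it assumes ``pkg_resources`` is importable and that string versions can be
tested for membership with ``in``, as described; the tuples shown are what
the sorting rule should produce)::

    from pkg_resources import Requirement

    req = Requirement.parse("docutils >=0.3, !=0.5, <1.0")
    print req.project_name   # -> docutils
    print req.specs          # e.g. [('>=', '0.3'), ('!=', '0.5'), ('<', '1.0')]
    print "0.4" in req       # -> True, inside the accepted ranges
    print "0.5" in req       # -> False, excised by the != specifier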
Modified: sandbox/trunk/setuptools/setuptools.txt
==============================================================================
--- sandbox/trunk/setuptools/setuptools.txt (original)
+++ sandbox/trunk/setuptools/setuptools.txt Mon Jul 10 21:54:06 2006
@@ -349,7 +349,7 @@
has an ``additional_tests()`` function, it is called and the results are
added to the tests to be run. If the named suite is a package, any
submodules and subpackages are recursively added to the overall test suite.
-
+
Specifying this argument enables use of the `test`_ command to run the
specified test suite, e.g. via ``setup.py test``. See the section on the
`test`_ command below for more details.
@@ -514,6 +514,15 @@
separated by whitespace, but any whitespace or nonstandard characters within a
project name or version identifier must be replaced with ``-``.
+Version specifiers for a given project are internally sorted into ascending
+version order, and used to establish what ranges of versions are acceptable.
+Adjacent redundant conditions are also consolidated (e.g. ``">1, >2"`` becomes
+``">1"``, and ``"<2,<3"`` becomes ``"<3"``). ``"!="`` versions are excised from
+the ranges they fall within. A project's version is then checked for
+membership in the resulting ranges. (Note that providing conflicting conditions
+for the same version (e.g. "<2,>=2" or "==2,!=2") is meaningless and may
+therefore produce bizarre results.)
+
Here are some example requirement specifiers::
docutils >= 0.3
@@ -1041,7 +1050,7 @@
Note that you can also apply setuptools commands to non-setuptools projects,
using commands like this::
-
+
python -c "import setuptools; execfile('setup.py')" develop
That is, you can simply list the normal setup commands and options following
@@ -2432,7 +2441,7 @@
with the absence of needed programs (i.e., ones belonging to the revision
control system itself). It *may*, however, use ``distutils.log.warn()`` to
inform the user of the missing program(s).
-
+
Subclassing ``Command``
-----------------------
From python-checkins at python.org Mon Jul 10 21:55:32 2006
From: python-checkins at python.org (phillip.eby)
Date: Mon, 10 Jul 2006 21:55:32 +0200 (CEST)
Subject: [Python-checkins] r50532 -
sandbox/branches/setuptools-0.6/pkg_resources.txt
sandbox/branches/setuptools-0.6/setuptools.txt
Message-ID: <20060710195532.A25741E4002@bag.python.org>
Author: phillip.eby
Date: Mon Jul 10 21:55:31 2006
New Revision: 50532
Modified:
sandbox/branches/setuptools-0.6/pkg_resources.txt
sandbox/branches/setuptools-0.6/setuptools.txt
Log:
Include more detailed version ranges spec, and make Requirement.specs a
public/documented attribute. (backport from trunk)
Modified: sandbox/branches/setuptools-0.6/pkg_resources.txt
==============================================================================
--- sandbox/branches/setuptools-0.6/pkg_resources.txt (original)
+++ sandbox/branches/setuptools-0.6/pkg_resources.txt Mon Jul 10 21:55:31 2006
@@ -610,6 +610,17 @@
parsed using the ``parse_version()`` utility function. Otherwise, it is
assumed to be an already-parsed version.
+ The ``Requirement`` object's version specifiers (``.specs``) are internally
+ sorted into ascending version order, and used to establish what ranges of
+ versions are acceptable. Adjacent redundant conditions are effectively
+ consolidated (e.g. ``">1, >2"`` produces the same results as ``">1"``, and
+ ``"<2,<3"`` produces the same results as ``"<3"``). ``"!="`` versions are
+ excised from the ranges they fall within. The version being tested for
+ acceptability is then checked for membership in the resulting ranges.
+ (Note that providing conflicting conditions for the same version (e.g.
+ ``"<2,>=2"`` or ``"==2,!=2"``) is meaningless and may therefore produce
+ bizarre results when compared with actual version number(s).)
+
``__eq__(other_requirement)``
A requirement compares equal to another requirement if they have
case-insensitively equal project names, version specifiers, and "extras".
@@ -634,6 +645,14 @@
function, so they may not exactly equal the extras the requirement was
created with.)
+``specs``
+ A list of ``(op,version)`` tuples, sorted in ascending parsed-version
+ order. The `op` in each tuple is a comparison operator, represented as
+ a string. The `version` is the (unparsed) version number. The relative
+ order of tuples containing the same version numbers is undefined, since
+ having more than one operator for a given version is either redundant or
+ self-contradictory.
+
Entry Points
============
Modified: sandbox/branches/setuptools-0.6/setuptools.txt
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools.txt (original)
+++ sandbox/branches/setuptools-0.6/setuptools.txt Mon Jul 10 21:55:31 2006
@@ -514,6 +514,15 @@
separated by whitespace, but any whitespace or nonstandard characters within a
project name or version identifier must be replaced with ``-``.
+Version specifiers for a given project are internally sorted into ascending
+version order, and used to establish what ranges of versions are acceptable.
+Adjacent redundant conditions are also consolidated (e.g. ``">1, >2"`` becomes
+``">1"``, and ``"<2,<3"`` becomes ``"<3"``). ``"!="`` versions are excised from
+the ranges they fall within. A project's version is then checked for
+membership in the resulting ranges. (Note that providing conflicting conditions
+for the same version (e.g. "<2,>=2" or "==2,!=2") is meaningless and may
+therefore produce bizarre results.)
+
Here are some example requirement specifiers::
docutils >= 0.3
From python-checkins at python.org Mon Jul 10 22:04:49 2006
From: python-checkins at python.org (phillip.eby)
Date: Mon, 10 Jul 2006 22:04:49 +0200 (CEST)
Subject: [Python-checkins] r50533 - in sandbox/trunk/setuptools:
setuptools.egg-info/entry_points.txt
setuptools/command/__init__.py setuptools/command/register.py
Message-ID: <20060710200449.C997A1E4002@bag.python.org>
Author: phillip.eby
Date: Mon Jul 10 22:04:48 2006
New Revision: 50533
Added:
sandbox/trunk/setuptools/setuptools/command/register.py (contents, props changed)
Modified:
sandbox/trunk/setuptools/setuptools.egg-info/entry_points.txt
sandbox/trunk/setuptools/setuptools/command/__init__.py
Log:
Fix "register" command not necessarily reflecting build tags.
Modified: sandbox/trunk/setuptools/setuptools.egg-info/entry_points.txt
==============================================================================
--- sandbox/trunk/setuptools/setuptools.egg-info/entry_points.txt (original)
+++ sandbox/trunk/setuptools/setuptools.egg-info/entry_points.txt Mon Jul 10 22:04:48 2006
@@ -38,6 +38,7 @@
build_py = setuptools.command.build_py:build_py
saveopts = setuptools.command.saveopts:saveopts
egg_info = setuptools.command.egg_info:egg_info
+register = setuptools.command.register:register
upload = setuptools.command.upload:upload
install_egg_info = setuptools.command.install_egg_info:install_egg_info
alias = setuptools.command.alias:alias
Modified: sandbox/trunk/setuptools/setuptools/command/__init__.py
==============================================================================
--- sandbox/trunk/setuptools/setuptools/command/__init__.py (original)
+++ sandbox/trunk/setuptools/setuptools/command/__init__.py Mon Jul 10 22:04:48 2006
@@ -2,6 +2,7 @@
'alias', 'bdist_egg', 'bdist_rpm', 'build_ext', 'build_py', 'develop',
'easy_install', 'egg_info', 'install', 'install_lib', 'rotate', 'saveopts',
'sdist', 'setopt', 'test', 'upload', 'install_egg_info', 'install_scripts',
+ 'register',
]
import sys
Added: sandbox/trunk/setuptools/setuptools/command/register.py
==============================================================================
--- (empty file)
+++ sandbox/trunk/setuptools/setuptools/command/register.py Mon Jul 10 22:04:48 2006
@@ -0,0 +1,10 @@
+from distutils.command.register import register as _register
+
+class register(_register):
+ __doc__ = _register.__doc__
+
+ def run(self):
+ # Make sure that we are using valid current name/version info
+ self.run_command('egg_info')
+ _register.run(self)
+
From python-checkins at python.org Mon Jul 10 22:08:21 2006
From: python-checkins at python.org (phillip.eby)
Date: Mon, 10 Jul 2006 22:08:21 +0200 (CEST)
Subject: [Python-checkins] r50534 - in sandbox/branches/setuptools-0.6:
setuptools.egg-info/entry_points.txt setuptools.txt
setuptools/command/__init__.py setuptools/command/register.py
Message-ID: <20060710200821.4046C1E4002@bag.python.org>
Author: phillip.eby
Date: Mon Jul 10 22:08:20 2006
New Revision: 50534
Added:
sandbox/branches/setuptools-0.6/setuptools/command/register.py
- copied unchanged from r50533, sandbox/trunk/setuptools/setuptools/command/register.py
Modified:
sandbox/branches/setuptools-0.6/setuptools.egg-info/entry_points.txt
sandbox/branches/setuptools-0.6/setuptools.txt
sandbox/branches/setuptools-0.6/setuptools/command/__init__.py
Log:
Fix ``register`` not obeying name/version set by ``egg_info`` command, if
``egg_info`` wasn't explicitly run first on the same command line.
(backport from trunk)
Modified: sandbox/branches/setuptools-0.6/setuptools.egg-info/entry_points.txt
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools.egg-info/entry_points.txt (original)
+++ sandbox/branches/setuptools-0.6/setuptools.egg-info/entry_points.txt Mon Jul 10 22:08:20 2006
@@ -38,6 +38,7 @@
build_py = setuptools.command.build_py:build_py
saveopts = setuptools.command.saveopts:saveopts
egg_info = setuptools.command.egg_info:egg_info
+register = setuptools.command.register:register
upload = setuptools.command.upload:upload
install_egg_info = setuptools.command.install_egg_info:install_egg_info
alias = setuptools.command.alias:alias
Modified: sandbox/branches/setuptools-0.6/setuptools.txt
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools.txt (original)
+++ sandbox/branches/setuptools-0.6/setuptools.txt Mon Jul 10 22:08:20 2006
@@ -2507,6 +2507,10 @@
Release Notes/Change History
----------------------------
+0.6b4
+ * Fix ``register`` not obeying name/version set by ``egg_info`` command, if
+ ``egg_info`` wasn't explicitly run first on the same command line.
+
0.6b3
* Fix ``bdist_egg`` not including files in subdirectories of ``.egg-info``.
Modified: sandbox/branches/setuptools-0.6/setuptools/command/__init__.py
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools/command/__init__.py (original)
+++ sandbox/branches/setuptools-0.6/setuptools/command/__init__.py Mon Jul 10 22:08:20 2006
@@ -2,6 +2,7 @@
'alias', 'bdist_egg', 'bdist_rpm', 'build_ext', 'build_py', 'develop',
'easy_install', 'egg_info', 'install', 'install_lib', 'rotate', 'saveopts',
'sdist', 'setopt', 'test', 'upload', 'install_egg_info', 'install_scripts',
+ 'register',
]
From python-checkins at python.org Mon Jul 10 22:33:46 2006
From: python-checkins at python.org (phillip.eby)
Date: Mon, 10 Jul 2006 22:33:46 +0200 (CEST)
Subject: [Python-checkins] r50535 - in sandbox/trunk/setuptools:
setuptools.txt setuptools/command/egg_info.py
Message-ID: <20060710203346.25B791E4002@bag.python.org>
Author: phillip.eby
Date: Mon Jul 10 22:33:45 2006
New Revision: 50535
Modified:
sandbox/trunk/setuptools/setuptools.txt
sandbox/trunk/setuptools/setuptools/command/egg_info.py
Log:
Add --no-date and --no-svn-revision options to make creating release
snapshots easier.
Modified: sandbox/trunk/setuptools/setuptools.txt
==============================================================================
--- sandbox/trunk/setuptools/setuptools.txt (original)
+++ sandbox/trunk/setuptools/setuptools.txt Mon Jul 10 22:33:45 2006
@@ -1512,17 +1512,9 @@
will know that ``1.0a1`` supersedes any interim snapshots from Subversion, and
handle upgrades accordingly.
-Note, by the way, that this means that you need to remove these settings from
-``setup.cfg`` when you make an official release. This is easy to do if you
-are developing on the trunk and using tags or branches for your releases - just
-make the change after branching or tagging the release, so the trunk will still
-produce development snapshots.
-
-Also notice that this procedure means that the project version number you
-specify in ``setup.py`` should always be the *next* version of your software,
-not the last released version.
-
-(Alternately, you can leave out the ``tag_build=.dev``, and always use the
+(Note: the project version number you specify in ``setup.py`` should always be
+the *next* version of your software, not the last released version.
+Alternately, you can leave out the ``tag_build=.dev``, and always use the
*last* release as a version number, so that your post-1.0 builds are labelled
``1.0-r1263``, indicating a post-1.0 patchlevel. Most projects so far,
however, seem to prefer to think of their project as being a future version
@@ -1531,6 +1523,22 @@
post-release numbering on release branches, and pre-release numbering on the
trunk. But you don't have to make things this complex if you don't want to.)
+When you make an official release, creating source or binary distributions,
+you will need to override the tag settings from ``setup.cfg``. This is easy to
+do if you are developing on the trunk and using tags or branches for your
+releases - just make the change after branching or tagging the release, so the
+trunk will still produce development snapshots.
+
+Alternately, you can override the options on the command line, using
+something like::
+
+ python setup.py egg_info -RDb "" sdist bdist_egg register upload
+
+The first part of this command (``egg_info -RDb ""``) will override the
+configured tag information, before creating source and binary eggs, registering
+the project with PyPI, and uploading the files. (See the section below on the
+`egg_info`_ command for more details on the command line options in use here.)
+
Commonly, projects releasing code from Subversion will include a PyPI link to
their checkout URL (as described in the previous section) with an
``#egg=projectname-dev`` suffix. This allows users to request EasyInstall
@@ -1859,19 +1867,19 @@
metadata directory (used by the ``bdist_egg``, ``develop``, and ``test``
commands), and it allows you to temporarily change a project's version string,
to support "daily builds" or "snapshot" releases. It is run automatically by
-the ``sdist``, ``bdist_egg``, ``develop``, and ``test`` commands in order to
-update the project's metadata, but you can also specify it explicitly in order
-to temporarily change the project's version string. (It also generates the
-``.egg-info/SOURCES.txt`` manifest file, which is used when you are building
-source distributions.)
+the ``sdist``, ``bdist_egg``, ``develop``, ``register``, and ``test`` commands
+in order to update the project's metadata, but you can also specify it
+explicitly in order to temporarily change the project's version string while
+executing other commands. (It also generates the ``.egg-info/SOURCES.txt``
+manifest file, which is used when you are building source distributions.)
-(In addition to writing the core egg metadata defined by ``setuptools`` and
+In addition to writing the core egg metadata defined by ``setuptools`` and
required by ``pkg_resources``, this command can be extended to write other
metadata files as well, by defining entry points in the ``egg_info.writers``
group. See the section on `Adding new EGG-INFO Files`_ below for more details.
Note that using additional metadata writers may require you to include a
``setup_requires`` argument to ``setup()`` in order to ensure that the desired
-writers are available on ``sys.path``.)
+writers are available on ``sys.path``.
Release Tagging Options
@@ -1892,6 +1900,10 @@
always leave off --tag-build and then use one or both of the following
options.)
+ If you have a default build tag set in your ``setup.cfg``, you can suppress
+ it on the command line using ``-b ""`` or ``--tag-build=""`` as an argument
+ to the ``egg_info`` command.
+
``--tag-svn-revision, -r``
If the current directory is a Subversion checkout (i.e. has a ``.svn``
subdirectory), this appends a string of the form "-rNNNN" to the project's
@@ -1909,10 +1921,19 @@
If there is no ``PKG-INFO`` file, or the version number contained therein
does not end with ``-r`` and a number, then ``-r0`` is used.
+``--no-svn-revision, -R``
+ Don't include the Subversion revision in the version number. This option
+ is included so you can override a default setting put in ``setup.cfg``.
+
``--tag-date, -d``
Add a date stamp of the form "-YYYYMMDD" (e.g. "-20050528") to the
project's version number.
+``--no-date, -D``
+ Don't include a date stamp in the version number. This option is included
+ so you can override a default setting in ``setup.cfg``.
+
+
(Note: Because these options modify the version number used for source and
binary distributions of your project, you should first make sure that you know
how the resulting version numbers will be interpreted by automated tools
@@ -1938,6 +1959,22 @@
no ``package_dir`` set, this option defaults to the current directory.
+``egg_info`` Examples
+---------------------
+
+Creating a dated "nightly build" snapshot egg::
+
+ python setup.py egg_info --tag-date --tag-build=DEV bdist_egg
+
+Creating and uploading a release with no version tags, even if some default
+tags are specified in ``setup.cfg``::
+
+ python setup.py egg_info -RDb "" sdist bdist_egg register upload
+
+(Notice that ``egg_info`` must always appear on the command line *before* any
+commands that you want the version changes to apply to.)
+
+
.. _install command:
``install`` - Run ``easy_install`` or old-style installation
Modified: sandbox/trunk/setuptools/setuptools/command/egg_info.py
==============================================================================
--- sandbox/trunk/setuptools/setuptools/command/egg_info.py (original)
+++ sandbox/trunk/setuptools/setuptools/command/egg_info.py Mon Jul 10 22:33:45 2006
@@ -25,9 +25,19 @@
"Add subversion revision ID to version number"),
('tag-date', 'd', "Add date stamp (e.g. 20050528) to version number"),
('tag-build=', 'b', "Specify explicit tag to add to version number"),
+ ('no-svn-revision', 'R',
+ "Don't add subversion revision ID [default]"),
+ ('no-date', 'D', "Don't include date stamp [default]"),
+ ('tag-build=', 'b', "Specify explicit tag to add to version number"),
]
- boolean_options = ['tag-date','tag-svn-revision']
+ boolean_options = ['tag-date', 'tag-svn-revision']
+ negative_opt = {'no-svn-revision': 'tag-svn-revision',
+ 'no-date': 'tag-date'}
+
+
+
+
def initialize_options (self):
self.egg_name = None
@@ -39,6 +49,37 @@
self.tag_date = 0
self.broken_egg_info = False
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
def finalize_options (self):
self.egg_name = safe_name(self.distribution.get_name())
self.egg_version = self.tagged_version()
@@ -363,3 +404,7 @@
if match:
return int(match.group(1))
return 0
+
+
+
+#
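
The new ``--no-svn-revision``/``--no-date`` switches above lean entirely on the standard distutils option machinery: ``boolean_options`` marks flags, and ``negative_opt`` maps each "no-" switch to the boolean it clears. A minimal, self-contained sketch of that mechanism follows; the ``tagdemo`` command and its options are hypothetical names used only for illustration, not part of setuptools::

    # Sketch of distutils' boolean_options / negative_opt handling, the same
    # mechanism the egg_info diff above uses for --no-date / --no-svn-revision.
    from distutils.cmd import Command
    from distutils.core import setup

    class tagdemo(Command):
        description = "demonstrate boolean_options / negative_opt"
        user_options = [
            ('tag-date', 'd', "add a date stamp to the version"),
            ('no-date',  'D', "don't add a date stamp [default]"),
        ]
        boolean_options = ['tag-date']
        negative_opt = {'no-date': 'tag-date'}   # --no-date clears tag_date

        def initialize_options(self):
            self.tag_date = 0

        def finalize_options(self):
            pass

        def run(self):
            print("tag_date = %r" % self.tag_date)

    # ``python setup.py tagdemo -d`` prints tag_date = 1,
    # ``python setup.py tagdemo -D`` prints tag_date = 0.
    setup(name="tagdemo-example", version="0", cmdclass={'tagdemo': tagdemo})
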
From python-checkins at python.org Mon Jul 10 22:38:59 2006
From: python-checkins at python.org (phillip.eby)
Date: Mon, 10 Jul 2006 22:38:59 +0200 (CEST)
Subject: [Python-checkins] r50536 - sandbox/trunk/setuptools/setup.cfg
Message-ID: <20060710203859.2DF991E4002@bag.python.org>
Author: phillip.eby
Date: Mon Jul 10 22:38:58 2006
New Revision: 50536
Modified:
sandbox/trunk/setuptools/setup.cfg
Log:
Update aliases for easier release using new options
Modified: sandbox/trunk/setuptools/setup.cfg
==============================================================================
--- sandbox/trunk/setuptools/setup.cfg (original)
+++ sandbox/trunk/setuptools/setup.cfg Mon Jul 10 22:38:58 2006
@@ -1,8 +1,9 @@
-[aliases]
-binary = bdist_egg upload --show-response
-develop = develop
-source = register sdist binary
-
[egg_info]
tag_build = dev
tag_svn_revision = 1
+
+[aliases]
+release = egg_info -RDb ''
+source = register sdist binary
+binary = bdist_egg upload --show-response
+
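
The resulting ``setup.cfg`` keeps the default dev/revision tags under ``[egg_info]`` while the ``release`` alias strips them again on demand. A small sketch (standard library only, assuming ``setup.cfg`` sits in the current directory) of reading those aliases back::

    try:
        from configparser import ConfigParser   # Python 3
    except ImportError:
        from ConfigParser import ConfigParser   # Python 2

    cp = ConfigParser()
    cp.read("setup.cfg")
    # Each alias name maps to the command line it expands to,
    # e.g. release -> egg_info -RDb ''
    for name, expansion in cp.items("aliases"):
        print("%s expands to: %s" % (name, expansion))
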
From python-checkins at python.org Mon Jul 10 22:39:49 2006
From: python-checkins at python.org (peter.astrand)
Date: Mon, 10 Jul 2006 22:39:49 +0200 (CEST)
Subject: [Python-checkins] r50537 - python/trunk/Lib/test/test_subprocess.py
Message-ID: <20060710203949.B90F71E400C@bag.python.org>
Author: peter.astrand
Date: Mon Jul 10 22:39:49 2006
New Revision: 50537
Modified:
python/trunk/Lib/test/test_subprocess.py
Log:
Make it possible to run test_subprocess.py with Python 2.2, which lacks test_support.reap_children().
Modified: python/trunk/Lib/test/test_subprocess.py
==============================================================================
--- python/trunk/Lib/test/test_subprocess.py (original)
+++ python/trunk/Lib/test/test_subprocess.py Mon Jul 10 22:39:49 2006
@@ -30,12 +30,14 @@
def setUp(self):
# Try to minimize the number of children we have so this test
# doesn't crash on some buildbots (Alphas in particular).
- test_support.reap_children()
+ if hasattr(test_support, "reap_children"):
+ test_support.reap_children()
def tearDown(self):
# Try to minimize the number of children we have so this test
# doesn't crash on some buildbots (Alphas in particular).
- test_support.reap_children()
+ if hasattr(test_support, "reap_children"):
+ test_support.reap_children()
def mkstemp(self):
"""wrapper for mkstemp, calling mktemp if mkstemp is not available"""
@@ -610,7 +612,8 @@
def test_main():
test_support.run_unittest(ProcessTestCase)
- test_support.reap_children()
+ if hasattr(test_support, "reap_children"):
+ test_support.reap_children()
if __name__ == "__main__":
test_main()
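
The ``hasattr`` guard above is the whole compatibility story: on Python 2.2 ``test_support`` simply has no ``reap_children()``, so the call is skipped. If the same guard were needed in more places it could be folded into one helper, roughly like this (``_reap_children`` is a hypothetical local name, not part of ``test_support``)::

    from test import test_support   # Python 2-era module layout

    def _reap_children():
        # Python 2.2's test_support lacks reap_children(); newer versions
        # provide it, so only call it when it exists.
        if hasattr(test_support, "reap_children"):
            test_support.reap_children()
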
From python-checkins at python.org Mon Jul 10 22:45:34 2006
From: python-checkins at python.org (phillip.eby)
Date: Mon, 10 Jul 2006 22:45:34 +0200 (CEST)
Subject: [Python-checkins] r50538 - sandbox/trunk/setuptools/setuptools.txt
Message-ID: <20060710204534.37C301E4011@bag.python.org>
Author: phillip.eby
Date: Mon Jul 10 22:45:33 2006
New Revision: 50538
Modified:
sandbox/trunk/setuptools/setuptools.txt
Log:
Tweak docs for subversion release stuff to make better use of new
options.
Modified: sandbox/trunk/setuptools/setuptools.txt
==============================================================================
--- sandbox/trunk/setuptools/setuptools.txt (original)
+++ sandbox/trunk/setuptools/setuptools.txt Mon Jul 10 22:45:33 2006
@@ -1523,22 +1523,6 @@
post-release numbering on release branches, and pre-release numbering on the
trunk. But you don't have to make things this complex if you don't want to.)
-When you make an official release, creating source or binary distributions,
-you will need to override the tag settings from ``setup.cfg``. This is easy to
-do if you are developing on the trunk and using tags or branches for your
-releases - just make the change after branching or tagging the release, so the
-trunk will still produce development snapshots.
-
-Alternately, you can override the options on the command line, using
-something like::
-
- python setup.py egg_info -RDb "" sdist bdist_egg register upload
-
-The first part of this command (``egg_info -RDb ""``) will override the
-configured tag information, before creating source and binary eggs, registering
-the project with PyPI, and uploading the files. (See the section below on the
-`egg_info`_ command for more details on the command line options in use here.)
-
Commonly, projects releasing code from Subversion will include a PyPI link to
their checkout URL (as described in the previous section) with an
``#egg=projectname-dev`` suffix. This allows users to request EasyInstall
@@ -1567,6 +1551,41 @@
in order to keep their checkout completely in sync.
+Making "Official" (Non-Snapshot) Releases
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+When you make an official release, creating source or binary distributions,
+you will need to override the tag settings from ``setup.cfg``, so that you
+don't end up registering versions like ``foobar-0.7a1.dev-r34832``. This is
+easy to do if you are developing on the trunk and using tags or branches for
+your releases - just make the change to ``setup.cfg`` after branching or
+tagging the release, so the trunk will still produce development snapshots.
+
+Alternately, if you are not branching for releases, you can override the
+default version options on the command line, using something like::
+
+ python setup.py egg_info -RDb "" sdist bdist_egg register upload
+
+The first part of this command (``egg_info -RDb ""``) will override the
+configured tag information, before creating source and binary eggs, registering
+the project with PyPI, and uploading the files. Thus, these commands will use
+the plain version from your ``setup.py``, without adding the Subversion
+revision number or build designation string.
+
+Of course, if you will be doing this a lot, you may wish to create a personal
+alias for this operation, e.g.::
+
+ python setup.py alias -u release egg_info -RDb ""
+
+You can then use it like this::
+
+ python setup.py release sdist bdist_egg register upload
+
+Or of course you can create more elaborate aliases that do all of the above.
+See the sections below on the `egg_info`_ and `alias`_ commands for more ideas.
+
+
+
Distributing Extensions compiled with Pyrex
-------------------------------------------
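
The version strings the section above talks about are just the base version from ``setup.py`` with the optional build tag, Subversion revision and date stamp appended. A purely illustrative sketch (not setuptools' actual ``tagged_version()`` implementation) of how those pieces combine, and what ``egg_info -RDb ""`` leaves behind::

    import time

    def tagged_version(base, tag_build="", tag_svn_revision=0, tag_date=0,
                       svn_revision=None):
        # Compose base + build tag + "-rNNNN" + "-YYYYMMDD", mirroring the
        # tagging options documented above (illustration only).
        version = base + tag_build
        if tag_svn_revision and svn_revision is not None:
            version += "-r%d" % svn_revision
        if tag_date:
            version += time.strftime("-%Y%m%d")
        return version

    print(tagged_version("0.7a1", tag_build=".dev",
                         tag_svn_revision=1, svn_revision=34832))
    # -> 0.7a1.dev-r34832 (a development snapshot)
    print(tagged_version("0.7a1"))
    # -> 0.7a1 (what you register after ``egg_info -RDb ""``)
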
From python-checkins at python.org Mon Jul 10 22:47:59 2006
From: python-checkins at python.org (phillip.eby)
Date: Mon, 10 Jul 2006 22:47:59 +0200 (CEST)
Subject: [Python-checkins] r50539 - in sandbox/branches/setuptools-0.6:
setup.cfg setuptools.txt setuptools/command/egg_info.py
Message-ID: <20060710204759.CFFF11E4002@bag.python.org>
Author: phillip.eby
Date: Mon Jul 10 22:47:59 2006
New Revision: 50539
Modified:
sandbox/branches/setuptools-0.6/setup.cfg
sandbox/branches/setuptools-0.6/setuptools.txt
sandbox/branches/setuptools-0.6/setuptools/command/egg_info.py
Log:
Added ``--no-date`` and ``--no-svn-revision`` options to ``egg_info``
command, to allow suppressing tags configured in ``setup.cfg``.
(backports from trunk)
Modified: sandbox/branches/setuptools-0.6/setup.cfg
==============================================================================
--- sandbox/branches/setuptools-0.6/setup.cfg (original)
+++ sandbox/branches/setuptools-0.6/setup.cfg Mon Jul 10 22:47:59 2006
@@ -1,8 +1,9 @@
-[aliases]
-binary = bdist_egg upload --show-response
-develop = develop
-source = register sdist binary
-
[egg_info]
tag_build = dev
tag_svn_revision = 1
+
+[aliases]
+release = egg_info -RDb ''
+source = register sdist binary
+binary = bdist_egg upload --show-response
+
Modified: sandbox/branches/setuptools-0.6/setuptools.txt
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools.txt (original)
+++ sandbox/branches/setuptools-0.6/setuptools.txt Mon Jul 10 22:47:59 2006
@@ -1534,17 +1534,9 @@
will know that ``1.0a1`` supersedes any interim snapshots from Subversion, and
handle upgrades accordingly.
-Note, by the way, that this means that you need to remove these settings from
-``setup.cfg`` when you make an official release. This is easy to do if you
-are developing on the trunk and using tags or branches for your releases - just
-make the change after branching or tagging the release, so the trunk will still
-produce development snapshots.
-
-Also notice that this procedure means that the project version number you
-specify in ``setup.py`` should always be the *next* version of your software,
-not the last released version.
-
-(Alternately, you can leave out the ``tag_build=.dev``, and always use the
+(Note: the project version number you specify in ``setup.py`` should always be
+the *next* version of your software, not the last released version.
+Alternately, you can leave out the ``tag_build=.dev``, and always use the
*last* release as a version number, so that your post-1.0 builds are labelled
``1.0-r1263``, indicating a post-1.0 patchlevel. Most projects so far,
however, seem to prefer to think of their project as being a future version
@@ -1581,6 +1573,41 @@
in order to keep their checkout completely in sync.
+Making "Official" (Non-Snapshot) Releases
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+When you make an official release, creating source or binary distributions,
+you will need to override the tag settings from ``setup.cfg``, so that you
+don't end up registering versions like ``foobar-0.7a1.dev-r34832``. This is
+easy to do if you are developing on the trunk and using tags or branches for
+your releases - just make the change to ``setup.cfg`` after branching or
+tagging the release, so the trunk will still produce development snapshots.
+
+Alternately, if you are not branching for releases, you can override the
+default version options on the command line, using something like::
+
+ python setup.py egg_info -RDb "" sdist bdist_egg register upload
+
+The first part of this command (``egg_info -RDb ""``) will override the
+configured tag information, before creating source and binary eggs, registering
+the project with PyPI, and uploading the files. Thus, these commands will use
+the plain version from your ``setup.py``, without adding the Subversion
+revision number or build designation string.
+
+Of course, if you will be doing this a lot, you may wish to create a personal
+alias for this operation, e.g.::
+
+ python setup.py alias -u release egg_info -RDb ""
+
+You can then use it like this::
+
+ python setup.py release sdist bdist_egg register upload
+
+Or of course you can create more elaborate aliases that do all of the above.
+See the sections below on the `egg_info`_ and `alias`_ commands for more ideas.
+
+
+
Distributing Extensions compiled with Pyrex
-------------------------------------------
@@ -1881,19 +1908,19 @@
metadata directory (used by the ``bdist_egg``, ``develop``, and ``test``
commands), and it allows you to temporarily change a project's version string,
to support "daily builds" or "snapshot" releases. It is run automatically by
-the ``sdist``, ``bdist_egg``, ``develop``, and ``test`` commands in order to
-update the project's metadata, but you can also specify it explicitly in order
-to temporarily change the project's version string. (It also generates the
-``.egg-info/SOURCES.txt`` manifest file, which is used when you are building
-source distributions.)
+the ``sdist``, ``bdist_egg``, ``develop``, ``register``, and ``test`` commands
+in order to update the project's metadata, but you can also specify it
+explicitly in order to temporarily change the project's version string while
+executing other commands. (It also generates the ``.egg-info/SOURCES.txt``
+manifest file, which is used when you are building source distributions.)
-(In addition to writing the core egg metadata defined by ``setuptools`` and
+In addition to writing the core egg metadata defined by ``setuptools`` and
required by ``pkg_resources``, this command can be extended to write other
metadata files as well, by defining entry points in the ``egg_info.writers``
group. See the section on `Adding new EGG-INFO Files`_ below for more details.
Note that using additional metadata writers may require you to include a
``setup_requires`` argument to ``setup()`` in order to ensure that the desired
-writers are available on ``sys.path``.)
+writers are available on ``sys.path``.
Release Tagging Options
@@ -1914,6 +1941,10 @@
always leave off --tag-build and then use one or both of the following
options.)
+ If you have a default build tag set in your ``setup.cfg``, you can suppress
+ it on the command line using ``-b ""`` or ``--tag-build=""`` as an argument
+ to the ``egg_info`` command.
+
``--tag-svn-revision, -r``
If the current directory is a Subversion checkout (i.e. has a ``.svn``
subdirectory), this appends a string of the form "-rNNNN" to the project's
@@ -1931,10 +1962,19 @@
If there is no ``PKG-INFO`` file, or the version number contained therein
does not end with ``-r`` and a number, then ``-r0`` is used.
+``--no-svn-revision, -R``
+ Don't include the Subversion revision in the version number. This option
+ is included so you can override a default setting put in ``setup.cfg``.
+
``--tag-date, -d``
Add a date stamp of the form "-YYYYMMDD" (e.g. "-20050528") to the
project's version number.
+``--no-date, -D``
+ Don't include a date stamp in the version number. This option is included
+ so you can override a default setting in ``setup.cfg``.
+
+
(Note: Because these options modify the version number used for source and
binary distributions of your project, you should first make sure that you know
how the resulting version numbers will be interpreted by automated tools
@@ -1960,6 +2000,22 @@
no ``package_dir`` set, this option defaults to the current directory.
+``egg_info`` Examples
+---------------------
+
+Creating a dated "nightly build" snapshot egg::
+
+ python setup.py egg_info --tag-date --tag-build=DEV bdist_egg
+
+Creating and uploading a release with no version tags, even if some default
+tags are specified in ``setup.cfg``::
+
+ python setup.py egg_info -RDb "" sdist bdist_egg register upload
+
+(Notice that ``egg_info`` must always appear on the command line *before* any
+commands that you want the version changes to apply to.)
+
+
.. _install command:
``install`` - Run ``easy_install`` or old-style installation
@@ -2511,6 +2567,9 @@
* Fix ``register`` not obeying name/version set by ``egg_info`` command, if
``egg_info`` wasn't explicitly run first on the same command line.
+ * Added ``--no-date`` and ``--no-svn-revision`` options to ``egg_info``
+ command, to allow suppressing tags configured in ``setup.cfg``.
+
0.6b3
* Fix ``bdist_egg`` not including files in subdirectories of ``.egg-info``.
Modified: sandbox/branches/setuptools-0.6/setuptools/command/egg_info.py
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools/command/egg_info.py (original)
+++ sandbox/branches/setuptools-0.6/setuptools/command/egg_info.py Mon Jul 10 22:47:59 2006
@@ -25,9 +25,19 @@
"Add subversion revision ID to version number"),
('tag-date', 'd', "Add date stamp (e.g. 20050528) to version number"),
('tag-build=', 'b', "Specify explicit tag to add to version number"),
+ ('no-svn-revision', 'R',
+ "Don't add subversion revision ID [default]"),
+ ('no-date', 'D', "Don't include date stamp [default]"),
+ ('tag-build=', 'b', "Specify explicit tag to add to version number"),
]
- boolean_options = ['tag-date','tag-svn-revision']
+ boolean_options = ['tag-date', 'tag-svn-revision']
+ negative_opt = {'no-svn-revision': 'tag-svn-revision',
+ 'no-date': 'tag-date'}
+
+
+
+
def initialize_options (self):
self.egg_name = None
@@ -39,6 +49,37 @@
self.tag_date = 0
self.broken_egg_info = False
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
def finalize_options (self):
self.egg_name = safe_name(self.distribution.get_name())
self.egg_version = self.tagged_version()
@@ -366,4 +407,4 @@
-
+#
From python-checkins at python.org Mon Jul 10 22:59:45 2006
From: python-checkins at python.org (brett.cannon)
Date: Mon, 10 Jul 2006 22:59:45 +0200 (CEST)
Subject: [Python-checkins] r50540 - in python/branches/bcannon-sandboxing:
Include/pystate.h Include/sandbox.h configure configure.in
pyconfig.h.in
Message-ID: <20060710205945.B01C71E4002@bag.python.org>
Author: brett.cannon
Date: Mon Jul 10 22:59:44 2006
New Revision: 50540
Added:
python/branches/bcannon-sandboxing/Include/sandbox.h
Modified:
python/branches/bcannon-sandboxing/Include/pystate.h
python/branches/bcannon-sandboxing/configure
python/branches/bcannon-sandboxing/configure.in
python/branches/bcannon-sandboxing/pyconfig.h.in
Log:
Add support for the ``--with-sandboxing`` configure option and introduce Include/sandbox.h.
Modified: python/branches/bcannon-sandboxing/Include/pystate.h
==============================================================================
--- python/branches/bcannon-sandboxing/Include/pystate.h (original)
+++ python/branches/bcannon-sandboxing/Include/pystate.h Mon Jul 10 22:59:44 2006
@@ -32,6 +32,9 @@
#ifdef WITH_TSC
int tscdump;
#endif
+#ifdef PySandbox_SUPPORTED
+ PySandboxState *sandbox_state;
+#endif
} PyInterpreterState;
Added: python/branches/bcannon-sandboxing/Include/sandbox.h
==============================================================================
--- (empty file)
+++ python/branches/bcannon-sandboxing/Include/sandbox.h Mon Jul 10 22:59:44 2006
@@ -0,0 +1,17 @@
+#ifdef PySandbox_SUPPORTED
+
+#ifndef Py_SANDBOX_H
+#define Py_SANDBOX_H
+#ifdef __cplusplus
+extern "C" {
+#endif
+
+struct _sandbox_state; /* Forward */
+
+typedef struct _sandbox_state {
+
+} PySandboxState;
+
+#endif /* Py_SANDBOX_H */
+
+#endif /* PySandbox_SUPPORTED */
Modified: python/branches/bcannon-sandboxing/configure
==============================================================================
--- python/branches/bcannon-sandboxing/configure (original)
+++ python/branches/bcannon-sandboxing/configure Mon Jul 10 22:59:44 2006
@@ -1,5 +1,5 @@
#! /bin/sh
-# From configure.in Revision: .
+# From configure.in Revision: 47023 .
# Guess values for system-dependent variables and create Makefiles.
# Generated by GNU Autoconf 2.59 for python 2.5.
#
@@ -722,13 +722,13 @@
/^X\(\/\).*/{ s//\1/; q; }
s/.*/./; q'`
srcdir=$ac_confdir
- if test ! -r "$srcdir/$ac_unique_file"; then
+ if test ! -r $srcdir/$ac_unique_file; then
srcdir=..
fi
else
ac_srcdir_defaulted=no
fi
-if test ! -r "$srcdir/$ac_unique_file"; then
+if test ! -r $srcdir/$ac_unique_file; then
if test "$ac_srcdir_defaulted" = yes; then
{ echo "$as_me: error: cannot find sources ($ac_unique_file) in $ac_confdir or .." >&2
{ (exit 1); exit 1; }; }
@@ -737,7 +737,7 @@
{ (exit 1); exit 1; }; }
fi
fi
-(cd $srcdir && test -r "./$ac_unique_file") 2>/dev/null ||
+(cd $srcdir && test -r ./$ac_unique_file) 2>/dev/null ||
{ echo "$as_me: error: sources are in $srcdir, but \`cd $srcdir' does not work" >&2
{ (exit 1); exit 1; }; }
srcdir=`echo "$srcdir" | sed 's%\([^\\/]\)[\\/]*$%\1%'`
@@ -866,6 +866,7 @@
compiler
--with-suffix=.exe set executable suffix
--with-pydebug build with Py_DEBUG defined
+ --with-sandboxing build with PySandbox_SUPPORTED defined
--with-libs='lib1 ...' link against additional libs
--with-system-ffi build _ctypes module using an installed ffi library
--with-signal-module disable/enable signal module
@@ -984,7 +985,7 @@
else
echo "$as_me: WARNING: no configuration information is in $ac_dir" >&2
fi
- cd $ac_popdir
+ cd "$ac_popdir"
done
fi
@@ -2331,8 +2332,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -2390,8 +2390,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -2507,8 +2506,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -2562,8 +2560,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -2608,8 +2605,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -2653,8 +2649,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -3770,6 +3765,33 @@
echo "${ECHO_T}no" >&6
fi;
+# Check for --with-sandboxing
+echo "$as_me:$LINENO: checking for --with-sandboxing" >&5
+echo $ECHO_N "checking for --with-sandboxing... $ECHO_C" >&6
+
+# Check whether --with-sandboxing or --without-sandboxing was given.
+if test "${with_sandboxing+set}" = set; then
+ withval="$with_sandboxing"
+
+if test "$withval" != no
+then
+
+cat >>confdefs.h <<\_ACEOF
+#define PySandbox_SUPPORTED 1
+_ACEOF
+
+ echo "$as_me:$LINENO: result: yes" >&5
+echo "${ECHO_T}yes" >&6;
+ PySandbox_SUPPORTED='true'
+else echo "$as_me:$LINENO: result: no" >&5
+echo "${ECHO_T}no" >&6; PySandbox_SUPPORTED='false'
+fi
+else
+ echo "$as_me:$LINENO: result: no" >&5
+echo "${ECHO_T}no" >&6
+fi;
+
+
# XXX Shouldn't the code above that fiddles with BASECFLAGS and OPT be
# merged with this chunk of code?
@@ -4378,8 +4400,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -4549,8 +4570,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -4675,8 +4695,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -4831,8 +4850,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -4904,8 +4922,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -4959,8 +4976,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -5031,8 +5047,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -5086,8 +5101,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -5150,8 +5164,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -5208,8 +5221,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -5356,8 +5368,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -5509,8 +5520,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -5579,8 +5589,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -5670,8 +5679,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -5721,8 +5729,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -5799,8 +5806,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -5881,8 +5887,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -5947,8 +5952,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -6013,8 +6017,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -6086,8 +6089,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -6148,8 +6150,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -6252,8 +6253,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -6319,8 +6319,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -6382,8 +6381,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -6423,8 +6421,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -6480,8 +6477,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -6521,8 +6517,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -6586,8 +6581,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -6618,10 +6612,8 @@
esac
else
if test "$cross_compiling" = yes; then
- { { echo "$as_me:$LINENO: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&5
-echo "$as_me: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&2;}
+ { { echo "$as_me:$LINENO: error: internal error: not reached in cross-compile" >&5
+echo "$as_me: error: internal error: not reached in cross-compile" >&2;}
{ (exit 1); exit 1; }; }
else
cat >conftest.$ac_ext <<_ACEOF
@@ -6733,8 +6725,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -6796,8 +6787,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -6837,8 +6827,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -6894,8 +6883,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -6935,8 +6923,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -7000,8 +6987,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -7032,10 +7018,8 @@
esac
else
if test "$cross_compiling" = yes; then
- { { echo "$as_me:$LINENO: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&5
-echo "$as_me: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&2;}
+ { { echo "$as_me:$LINENO: error: internal error: not reached in cross-compile" >&5
+echo "$as_me: error: internal error: not reached in cross-compile" >&2;}
{ (exit 1); exit 1; }; }
else
cat >conftest.$ac_ext <<_ACEOF
@@ -7147,8 +7131,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -7210,8 +7193,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -7251,8 +7233,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -7308,8 +7289,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -7349,8 +7329,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -7414,8 +7393,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -7446,10 +7424,8 @@
esac
else
if test "$cross_compiling" = yes; then
- { { echo "$as_me:$LINENO: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&5
-echo "$as_me: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&2;}
+ { { echo "$as_me:$LINENO: error: internal error: not reached in cross-compile" >&5
+echo "$as_me: error: internal error: not reached in cross-compile" >&2;}
{ (exit 1); exit 1; }; }
else
cat >conftest.$ac_ext <<_ACEOF
@@ -7561,8 +7537,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -7624,8 +7599,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -7665,8 +7639,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -7722,8 +7695,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -7763,8 +7735,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -7828,8 +7799,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -7860,10 +7830,8 @@
esac
else
if test "$cross_compiling" = yes; then
- { { echo "$as_me:$LINENO: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&5
-echo "$as_me: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&2;}
+ { { echo "$as_me:$LINENO: error: internal error: not reached in cross-compile" >&5
+echo "$as_me: error: internal error: not reached in cross-compile" >&2;}
{ (exit 1); exit 1; }; }
else
cat >conftest.$ac_ext <<_ACEOF
@@ -7975,8 +7943,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -8038,8 +8005,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -8079,8 +8045,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -8136,8 +8101,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -8177,8 +8141,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -8242,8 +8205,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -8274,10 +8236,8 @@
esac
else
if test "$cross_compiling" = yes; then
- { { echo "$as_me:$LINENO: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&5
-echo "$as_me: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&2;}
+ { { echo "$as_me:$LINENO: error: internal error: not reached in cross-compile" >&5
+echo "$as_me: error: internal error: not reached in cross-compile" >&2;}
{ (exit 1); exit 1; }; }
else
cat >conftest.$ac_ext <<_ACEOF
@@ -8389,8 +8349,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -8452,8 +8411,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -8493,8 +8451,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -8550,8 +8507,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -8591,8 +8547,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -8656,8 +8611,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -8688,10 +8642,8 @@
esac
else
if test "$cross_compiling" = yes; then
- { { echo "$as_me:$LINENO: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&5
-echo "$as_me: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&2;}
+ { { echo "$as_me:$LINENO: error: internal error: not reached in cross-compile" >&5
+echo "$as_me: error: internal error: not reached in cross-compile" >&2;}
{ (exit 1); exit 1; }; }
else
cat >conftest.$ac_ext <<_ACEOF
@@ -8803,8 +8755,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -8866,8 +8817,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -8907,8 +8857,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -8964,8 +8913,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -9005,8 +8953,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -9070,8 +9017,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -9102,10 +9048,8 @@
esac
else
if test "$cross_compiling" = yes; then
- { { echo "$as_me:$LINENO: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&5
-echo "$as_me: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&2;}
+ { { echo "$as_me:$LINENO: error: internal error: not reached in cross-compile" >&5
+echo "$as_me: error: internal error: not reached in cross-compile" >&2;}
{ (exit 1); exit 1; }; }
else
cat >conftest.$ac_ext <<_ACEOF
@@ -9217,8 +9161,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -9280,8 +9223,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -9321,8 +9263,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -9378,8 +9319,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -9419,8 +9359,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -9484,8 +9423,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -9516,10 +9454,8 @@
esac
else
if test "$cross_compiling" = yes; then
- { { echo "$as_me:$LINENO: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&5
-echo "$as_me: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&2;}
+ { { echo "$as_me:$LINENO: error: internal error: not reached in cross-compile" >&5
+echo "$as_me: error: internal error: not reached in cross-compile" >&2;}
{ (exit 1); exit 1; }; }
else
cat >conftest.$ac_ext <<_ACEOF
@@ -9627,8 +9563,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -9689,8 +9624,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -9752,8 +9686,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -9793,8 +9726,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -9850,8 +9782,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -9891,8 +9822,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -9956,8 +9886,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -9988,10 +9917,8 @@
esac
else
if test "$cross_compiling" = yes; then
- { { echo "$as_me:$LINENO: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&5
-echo "$as_me: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&2;}
+ { { echo "$as_me:$LINENO: error: internal error: not reached in cross-compile" >&5
+echo "$as_me: error: internal error: not reached in cross-compile" >&2;}
{ (exit 1); exit 1; }; }
else
cat >conftest.$ac_ext <<_ACEOF
@@ -10100,8 +10027,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -10162,8 +10088,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -10225,8 +10150,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -10266,8 +10190,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -10323,8 +10246,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -10364,8 +10286,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -10429,8 +10350,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -10461,10 +10381,8 @@
esac
else
if test "$cross_compiling" = yes; then
- { { echo "$as_me:$LINENO: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&5
-echo "$as_me: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&2;}
+ { { echo "$as_me:$LINENO: error: internal error: not reached in cross-compile" >&5
+echo "$as_me: error: internal error: not reached in cross-compile" >&2;}
{ (exit 1); exit 1; }; }
else
cat >conftest.$ac_ext <<_ACEOF
@@ -10716,8 +10634,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -11229,8 +11146,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -11303,8 +11219,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -11379,8 +11294,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -11434,8 +11348,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -11510,8 +11423,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -11573,8 +11485,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -11645,8 +11556,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -11713,8 +11623,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -11784,8 +11693,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -12054,8 +11962,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -12209,8 +12116,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -12390,8 +12296,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -12483,8 +12388,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -12547,8 +12451,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -12697,8 +12600,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -12855,8 +12757,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -12929,8 +12830,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -13003,8 +12903,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -13077,8 +12976,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -13180,8 +13078,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -13256,8 +13153,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -13459,8 +13355,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -13608,8 +13503,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -13999,8 +13893,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -14243,8 +14136,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -14305,8 +14197,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -14360,8 +14251,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -14415,8 +14305,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -14470,8 +14359,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -14525,8 +14413,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -14580,8 +14467,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -14645,8 +14531,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -14704,8 +14589,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -14763,8 +14647,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -14873,8 +14756,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -14941,8 +14823,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -15018,8 +14899,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -15082,8 +14962,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -15145,8 +15024,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -15208,8 +15086,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -15271,8 +15148,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -15372,8 +15248,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -15441,8 +15316,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -15511,8 +15385,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -15620,8 +15493,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -15689,8 +15561,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -15759,8 +15630,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -15875,8 +15745,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -15982,8 +15851,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -16094,8 +15962,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -16147,8 +16014,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -16245,8 +16111,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -16298,8 +16163,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -16396,8 +16260,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -16449,8 +16312,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -16516,8 +16378,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -16585,8 +16446,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -16823,8 +16683,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -16891,8 +16750,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -16954,8 +16812,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -17020,8 +16877,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -17067,8 +16923,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -17142,8 +16997,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -17207,8 +17061,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -17251,8 +17104,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -17317,8 +17169,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -17361,8 +17212,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -17427,8 +17277,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -17471,8 +17320,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -17537,8 +17385,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -17581,8 +17428,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -17647,8 +17493,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -17691,8 +17536,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -17757,8 +17601,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -17801,8 +17644,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -17880,8 +17722,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -17944,8 +17785,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -18004,8 +17844,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -18068,8 +17907,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -18135,8 +17973,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -18241,8 +18078,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -18301,8 +18137,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -18357,8 +18192,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -18413,8 +18247,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -18480,8 +18313,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -18540,8 +18372,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -18599,8 +18430,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -18661,8 +18491,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -18761,8 +18590,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -18830,8 +18658,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -18899,8 +18726,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -18966,8 +18792,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -19081,8 +18906,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -19192,8 +19016,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -19259,8 +19082,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -19452,8 +19274,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -19528,8 +19349,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -19687,8 +19507,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -19751,8 +19570,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -19793,8 +19611,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -19851,8 +19668,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -19893,8 +19709,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -19959,8 +19774,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -19991,10 +19805,8 @@
esac
else
if test "$cross_compiling" = yes; then
- { { echo "$as_me:$LINENO: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&5
-echo "$as_me: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&2;}
+ { { echo "$as_me:$LINENO: error: internal error: not reached in cross-compile" >&5
+echo "$as_me: error: internal error: not reached in cross-compile" >&2;}
{ (exit 1); exit 1; }; }
else
cat >conftest.$ac_ext <<_ACEOF
@@ -20108,8 +19920,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -20320,8 +20131,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -20363,8 +20173,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -20421,8 +20230,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -20613,8 +20421,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -20690,8 +20497,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -20766,8 +20572,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -20842,8 +20647,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -20974,8 +20778,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -21048,8 +20851,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -21311,8 +21113,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -21358,8 +21159,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -21433,8 +21233,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -21617,8 +21416,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -21684,8 +21482,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -21751,8 +21548,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -21818,8 +21614,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -21880,8 +21675,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -21936,8 +21730,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -21992,8 +21785,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -22158,8 +21950,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -23075,11 +22866,6 @@
*) ac_INSTALL=$ac_top_builddir$INSTALL ;;
esac
- if test x"$ac_file" != x-; then
- { echo "$as_me:$LINENO: creating $ac_file" >&5
-echo "$as_me: creating $ac_file" >&6;}
- rm -f "$ac_file"
- fi
# Let's still pretend it is `configure' which instantiates (i.e., don't
# use $as_me), people would be surprised to read:
# /* config.h. Generated by config.status. */
@@ -23118,6 +22904,12 @@
fi;;
esac
done` || { (exit 1); exit 1; }
+
+ if test x"$ac_file" != x-; then
+ { echo "$as_me:$LINENO: creating $ac_file" >&5
+echo "$as_me: creating $ac_file" >&6;}
+ rm -f "$ac_file"
+ fi
_ACEOF
cat >>$CONFIG_STATUS <<_ACEOF
sed "$ac_vpsub
Modified: python/branches/bcannon-sandboxing/configure.in
==============================================================================
--- python/branches/bcannon-sandboxing/configure.in (original)
+++ python/branches/bcannon-sandboxing/configure.in Mon Jul 10 22:59:44 2006
@@ -725,6 +725,22 @@
fi],
[AC_MSG_RESULT(no)])
+# Check for --with-sandboxing
+AC_MSG_CHECKING(for --with-sandboxing)
+AC_ARG_WITH(sandboxing,
+ AC_HELP_STRING(--with-sandboxing, build with PySandbox_SUPPORTED defined),
+[
+if test "$withval" != no
+then
+ AC_DEFINE(PySandbox_SUPPORTED, 1,
+ [Define if you want to build an interpreter with sandboxing support.])
+ AC_MSG_RESULT(yes);
+ PySandbox_SUPPORTED='true'
+else AC_MSG_RESULT(no); PySandbox_SUPPORTED='false'
+fi],
+[AC_MSG_RESULT(no)])
+
+
# XXX Shouldn't the code above that fiddles with BASECFLAGS and OPT be
# merged with this chunk of code?
Modified: python/branches/bcannon-sandboxing/pyconfig.h.in
==============================================================================
--- python/branches/bcannon-sandboxing/pyconfig.h.in (original)
+++ python/branches/bcannon-sandboxing/pyconfig.h.in Mon Jul 10 22:59:44 2006
@@ -85,6 +85,15 @@
/* Define to 1 if you have the <curses.h> header file. */
#undef HAVE_CURSES_H
+/* Define if you have the 'is_term_resized' function. */
+#undef HAVE_CURSES_IS_TERM_RESIZED
+
+/* Define if you have the 'resizeterm' function. */
+#undef HAVE_CURSES_RESIZETERM
+
+/* Define if you have the 'resize_term' function. */
+#undef HAVE_CURSES_RESIZE_TERM
+
/* Define to 1 if you have the device macros. */
#undef HAVE_DEVICE_MACROS
@@ -398,15 +407,6 @@
/* Define to 1 if you have the `realpath' function. */
#undef HAVE_REALPATH
-/* Define to 1 if you have the `is_term_resized' function. */
-#undef HAVE_CURSES_IS_TERM_RESIZED
-
-/* Define to 1 if you have the `resize_term' function. */
-#undef HAVE_CURSES_RESIZE_TERM
-
-/* Define to 1 if you have the `resizeterm' function. */
-#undef HAVE_CURSES_RESIZETERM
-
/* Define if you have readline 2.1 */
#undef HAVE_RL_CALLBACK
@@ -766,6 +766,9 @@
/* Define as the integral type used for Unicode representation. */
#undef PY_UNICODE_TYPE
+/* Define if you want to build an interpreter with sandboxing support. */
+#undef PySandbox_SUPPORTED
+
/* Define if you want to build an interpreter with many run-time checks. */
#undef Py_DEBUG
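
As a hedged aside (not part of this checkin), the HAVE_CURSES_IS_TERM_RESIZED,
HAVE_CURSES_RESIZE_TERM and HAVE_CURSES_RESIZETERM entries reordered above are
the build-time probes that decide whether the _curses extension exposes the
corresponding optional functions; a quick way to see what a given interpreter
build ended up with:

    import curses

    # Each attribute exists only if the matching HAVE_CURSES_* macro was
    # defined when the _curses extension was compiled (an assumption about
    # the build, not something the diff above states directly).
    for name in ("is_term_resized", "resize_term", "resizeterm"):
        print("%s: %s" % (name, hasattr(curses, name)))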
From python-checkins at python.org Mon Jul 10 23:08:27 2006
From: python-checkins at python.org (tim.peters)
Date: Mon, 10 Jul 2006 23:08:27 +0200 (CEST)
Subject: [Python-checkins] r50541 - in python/trunk: Doc/lib/libsys.tex
Include/pystate.h Lib/test/test_sys.py Misc/NEWS
Python/pystate.c Python/sysmodule.c
Message-ID: <20060710210827.4C0271E4008@bag.python.org>
Author: tim.peters
Date: Mon Jul 10 23:08:24 2006
New Revision: 50541
Modified:
python/trunk/Doc/lib/libsys.tex
python/trunk/Include/pystate.h
python/trunk/Lib/test/test_sys.py
python/trunk/Misc/NEWS
python/trunk/Python/pystate.c
python/trunk/Python/sysmodule.c
Log:
After approval from Anthony, merge the tim-current_frames
branch into the trunk. This adds a new sys._current_frames()
function, which returns a dict mapping thread id to topmost
thread stack frame.
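
As a rough usage sketch of the new API (not part of the patch itself; the
dump_all_stacks helper name and the output format are illustrative
assumptions), the dict returned by sys._current_frames() can be handed to the
traceback module to print every live thread's call stack:

    import sys
    import threading
    import traceback

    def dump_all_stacks(out=sys.stderr):
        # Snapshot the topmost frame of each thread and render its call
        # stack; handy when a process appears deadlocked, since blocked
        # threads' frames stay frozen while we look at them.
        for thread_id, frame in sys._current_frames().items():
            out.write("Thread %s:\n" % thread_id)
            out.write("".join(traceback.format_stack(frame)))
            out.write("\n")

    if __name__ == "__main__":
        # Park one thread on an Event, then dump everything, including
        # the main thread's own frame.
        block = threading.Event()
        t = threading.Thread(target=block.wait)
        t.start()
        dump_all_stacks()
        block.set()
        t.join()

The test added to Lib/test/test_sys.py below exercises the same idea with
threading.Event and traceback.extract_stack.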
Modified: python/trunk/Doc/lib/libsys.tex
==============================================================================
--- python/trunk/Doc/lib/libsys.tex (original)
+++ python/trunk/Doc/lib/libsys.tex Mon Jul 10 23:08:24 2006
@@ -41,7 +41,7 @@
\code{Include/patchlevel.h} if the branch is a tag. Otherwise,
it is \code{None}.
\versionadded{2.5}
-\end{datadesc}
+\end{datadesc}
\begin{datadesc}{builtin_module_names}
A tuple of strings giving the names of all modules that are compiled
@@ -55,6 +55,23 @@
interpreter.
\end{datadesc}
+\begin{funcdesc}{_current_frames}{}
+ Return a dictionary mapping each thread's identifier to the topmost stack
+ frame currently active in that thread at the time the function is called.
+ Note that functions in the \refmodule{traceback} module can build the
+ call stack given such a frame.
+
+ This is most useful for debugging deadlock: this function does not
+ require the deadlocked threads' cooperation, and such threads' call stacks
+ are frozen for as long as they remain deadlocked. The frame returned
+ for a non-deadlocked thread may bear no relationship to that thread's
+ current activity by the time calling code examines the frame.
+
+ This function should be used for internal and specialized purposes
+ only.
+ \versionadded{2.5}
+\end{funcdesc}
+
\begin{datadesc}{dllhandle}
Integer specifying the handle of the Python DLL.
Availability: Windows.
@@ -142,7 +159,7 @@
function, \function{exc_info()} will return three \code{None} values until
another exception is raised in the current thread or the execution stack
returns to a frame where another exception is being handled.
-
+
This function is needed in only a few obscure situations. These
include logging and error handling systems that report information on the
last or current exception. This function can also be used to try to free
@@ -241,7 +258,7 @@
\begin{itemize}
\item On Windows 9x, the encoding is ``mbcs''.
\item On Mac OS X, the encoding is ``utf-8''.
-\item On Unix, the encoding is the user's preference
+\item On Unix, the encoding is the user's preference
according to the result of nl_langinfo(CODESET), or None if
the nl_langinfo(CODESET) failed.
\item On Windows NT+, file names are Unicode natively, so no conversion
@@ -279,8 +296,8 @@
\end{funcdesc}
\begin{funcdesc}{getwindowsversion}{}
- Return a tuple containing five components, describing the Windows
- version currently running. The elements are \var{major}, \var{minor},
+ Return a tuple containing five components, describing the Windows
+ version currently running. The elements are \var{major}, \var{minor},
\var{build}, \var{platform}, and \var{text}. \var{text} contains
a string while all other values are integers.
@@ -491,7 +508,7 @@
be registered using \function{settrace()} for each thread being
debugged. \note{The \function{settrace()} function is intended only
for implementing debuggers, profilers, coverage tools and the like.
- Its behavior is part of the implementation platform, rather than
+ Its behavior is part of the implementation platform, rather than
part of the language definition, and thus may not be available in
all Python implementations.}
\end{funcdesc}
Modified: python/trunk/Include/pystate.h
==============================================================================
--- python/trunk/Include/pystate.h (original)
+++ python/trunk/Include/pystate.h Mon Jul 10 23:08:24 2006
@@ -171,6 +171,11 @@
*/
PyAPI_FUNC(PyThreadState *) PyGILState_GetThisThreadState(void);
+/* The implementation of sys._current_frames(). Returns a dict mapping
+ thread id to that thread's current frame.
+*/
+PyAPI_FUNC(PyObject *) _PyThread_CurrentFrames(void);
+
/* Routines for advanced debuggers, requested by David Beazley.
Don't use unless you know what you are doing! */
PyAPI_FUNC(PyInterpreterState *) PyInterpreterState_Head(void);
Modified: python/trunk/Lib/test/test_sys.py
==============================================================================
--- python/trunk/Lib/test/test_sys.py (original)
+++ python/trunk/Lib/test/test_sys.py Mon Jul 10 23:08:24 2006
@@ -237,6 +237,67 @@
is sys._getframe().f_code
)
+ # sys._current_frames() is a CPython-only gimmick.
+ def test_current_frames(self):
+ import threading, thread
+ import traceback
+
+ # Spawn a thread that blocks at a known place. Then the main
+ # thread does sys._current_frames(), and verifies that the frames
+ # returned make sense.
+ entered_g = threading.Event()
+ leave_g = threading.Event()
+ thread_info = [] # the thread's id
+
+ def f123():
+ g456()
+
+ def g456():
+ thread_info.append(thread.get_ident())
+ entered_g.set()
+ leave_g.wait()
+
+ t = threading.Thread(target=f123)
+ t.start()
+ entered_g.wait()
+
+ # At this point, t has finished its entered_g.set(), and is blocked
+ # in its leave_g.wait().
+ self.assertEqual(len(thread_info), 1)
+ thread_id = thread_info[0]
+
+ d = sys._current_frames()
+
+ main_id = thread.get_ident()
+ self.assert_(main_id in d)
+ self.assert_(thread_id in d)
+
+ # Verify that the captured main-thread frame is _this_ frame.
+ frame = d.pop(main_id)
+ self.assert_(frame is sys._getframe())
+
+ # Verify that the captured thread frame is blocked in g456, called
+ # from f123. This is a little tricky, since various bits of
+ # threading.py are also in the thread's call stack.
+ frame = d.pop(thread_id)
+ stack = traceback.extract_stack(frame)
+ for i, (filename, lineno, funcname, sourceline) in enumerate(stack):
+ if funcname == "f123":
+ break
+ else:
+ self.fail("didn't find f123() on thread's call stack")
+
+ self.assertEqual(sourceline, "g456()")
+
+ # And the next record must be for g456().
+ filename, lineno, funcname, sourceline = stack[i+1]
+ self.assertEqual(funcname, "g456")
+ self.assertEqual(sourceline, "leave_g.wait()")
+
+ # Reap the spawned thread.
+ leave_g.set()
+ t.join()
+
def test_attributes(self):
self.assert_(isinstance(sys.api_version, int))
self.assert_(isinstance(sys.argv, list))
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 10 23:08:24 2006
@@ -36,10 +36,17 @@
- Bug #1512814, Fix incorrect lineno's when code at module scope
started after line 256.
+- New function ``sys._current_frames()`` returns a dict mapping thread
+ id to topmost thread stack frame. This is for expert use, and is
+ especially useful for debugging application deadlocks. The functionality
+ was previously available in Fazal Majid's ``threadframe`` extension
+ module, but it wasn't possible to do this in a wholly threadsafe way from
+ an extension.
+
Library
-------
-- Bug #1508010: msvccompiler now requires the DISTUTILS_USE_SDK
+- Bug #1508010: msvccompiler now requires the DISTUTILS_USE_SDK
environment variable to be set in order to the SDK environment
for finding the compiler, include files, etc.
@@ -126,7 +133,7 @@
Build
-----
-- 'configure' now detects the zlib library the same way as distutils.
+- 'configure' now detects the zlib library the same way as distutils.
Previously, the slight difference could cause compilation errors of the
'zlib' module on systems with more than one version of zlib.
Modified: python/trunk/Python/pystate.c
==============================================================================
--- python/trunk/Python/pystate.c (original)
+++ python/trunk/Python/pystate.c Mon Jul 10 23:08:24 2006
@@ -444,15 +444,15 @@
/* If autoTLSkey is 0, this must be the very first threadstate created
in Py_Initialize(). Don't do anything for now (we'll be back here
when _PyGILState_Init is called). */
- if (!autoTLSkey)
+ if (!autoTLSkey)
return;
-
+
/* Stick the thread state for this thread in thread local storage.
The only situation where you can legitimately have more than one
thread state for an OS level thread is when there are multiple
interpreters, when:
-
+
a) You shouldn't really be using the PyGILState_ APIs anyway,
and:
@@ -550,6 +550,54 @@
PyEval_SaveThread();
}
+/* The implementation of sys._current_frames(). This is intended to be
+ called with the GIL held, as it will be when called via
+ sys._current_frames(). It's possible it would work fine even without
+ the GIL held, but haven't thought enough about that.
+*/
+PyObject *
+_PyThread_CurrentFrames(void)
+{
+ PyObject *result;
+ PyInterpreterState *i;
+
+ result = PyDict_New();
+ if (result == NULL)
+ return NULL;
+
+ /* for i in all interpreters:
+ * for t in all of i's thread states:
+ * if t's frame isn't NULL, map t's id to its frame
+ * Because these lists can mutute even when the GIL is held, we
+ * need to grab head_mutex for the duration.
+ */
+ HEAD_LOCK();
+ for (i = interp_head; i != NULL; i = i->next) {
+ PyThreadState *t;
+ for (t = i->tstate_head; t != NULL; t = t->next) {
+ PyObject *id;
+ int stat;
+ struct _frame *frame = t->frame;
+ if (frame == NULL)
+ continue;
+ id = PyInt_FromLong(t->thread_id);
+ if (id == NULL)
+ goto Fail;
+ stat = PyDict_SetItem(result, id, (PyObject *)frame);
+ Py_DECREF(id);
+ if (stat < 0)
+ goto Fail;
+ }
+ }
+ HEAD_UNLOCK();
+ return result;
+
+ Fail:
+ HEAD_UNLOCK();
+ Py_DECREF(result);
+ return NULL;
+}
+
#ifdef __cplusplus
}
#endif
Modified: python/trunk/Python/sysmodule.c
==============================================================================
--- python/trunk/Python/sysmodule.c (original)
+++ python/trunk/Python/sysmodule.c Mon Jul 10 23:08:24 2006
@@ -660,6 +660,21 @@
return (PyObject*)f;
}
+PyDoc_STRVAR(current_frames_doc,
+"_current_frames() -> dictionary\n\
+\n\
+Return a dictionary mapping each current thread T's thread id to T's\n\
+current stack frame.\n\
+\n\
+This function should be used for specialized purposes only."
+);
+
+static PyObject *
+sys_current_frames(PyObject *self, PyObject *noargs)
+{
+ return _PyThread_CurrentFrames();
+}
+
PyDoc_STRVAR(call_tracing_doc,
"call_tracing(func, args) -> object\n\
\n\
@@ -722,6 +737,8 @@
/* Might as well keep this in alphabetic order */
{"callstats", (PyCFunction)PyEval_GetCallStats, METH_NOARGS,
callstats_doc},
+ {"_current_frames", sys_current_frames, METH_NOARGS,
+ current_frames_doc},
{"displayhook", sys_displayhook, METH_O, displayhook_doc},
{"exc_info", sys_exc_info, METH_NOARGS, exc_info_doc},
{"exc_clear", sys_exc_clear, METH_NOARGS, exc_clear_doc},
From python-checkins at python.org Mon Jul 10 23:11:50 2006
From: python-checkins at python.org (tim.peters)
Date: Mon, 10 Jul 2006 23:11:50 +0200 (CEST)
Subject: [Python-checkins] r50542 - python/trunk/Lib/test/test_inspect.py
Message-ID: <20060710211150.443E01E4008@bag.python.org>
Author: tim.peters
Date: Mon Jul 10 23:11:49 2006
New Revision: 50542
Modified:
python/trunk/Lib/test/test_inspect.py
Log:
Whitespace normalization.
Modified: python/trunk/Lib/test/test_inspect.py
==============================================================================
--- python/trunk/Lib/test/test_inspect.py (original)
+++ python/trunk/Lib/test/test_inspect.py Mon Jul 10 23:11:49 2006
@@ -182,7 +182,7 @@
from new import module
name = '__inspect_dummy'
m = sys.modules[name] = module(name)
- m.__file__ = "" # hopefully not a real filename...
+ m.__file__ = "" # hopefully not a real filename...
m.__loader__ = "dummy" # pretend the filename is understood by a loader
exec "def x(): pass" in m.__dict__
self.assertEqual(inspect.getsourcefile(m.x.func_code), '')
From python-checkins at python.org Mon Jul 10 23:14:28 2006
From: python-checkins at python.org (phillip.eby)
Date: Mon, 10 Jul 2006 23:14:28 +0200 (CEST)
Subject: [Python-checkins] r50543 -
sandbox/trunk/setuptools/setuptools/command/egg_info.py
sandbox/trunk/setuptools/setuptools/command/sdist.py
Message-ID: <20060710211428.164EE1E4002@bag.python.org>
Author: phillip.eby
Date: Mon Jul 10 23:14:27 2006
New Revision: 50543
Modified:
sandbox/trunk/setuptools/setuptools/command/egg_info.py
sandbox/trunk/setuptools/setuptools/command/sdist.py
Log:
Don't warn about missing README(.txt) unless creating an sdist
Modified: sandbox/trunk/setuptools/setuptools/command/egg_info.py
==============================================================================
--- sandbox/trunk/setuptools/setuptools/command/egg_info.py (original)
+++ sandbox/trunk/setuptools/setuptools/command/egg_info.py Mon Jul 10 23:14:27 2006
@@ -281,9 +281,9 @@
self.execute(file_util.write_file, (self.manifest, files),
"writing manifest file '%s'" % self.manifest)
-
-
-
+ def warn(self, msg): # suppress missing-file warnings from sdist
+ if not msg.startswith("standard file not found:"):
+ sdist.warn(self, msg)
def add_defaults(self):
sdist.add_defaults(self)
Modified: sandbox/trunk/setuptools/setuptools/command/sdist.py
==============================================================================
--- sandbox/trunk/setuptools/setuptools/command/sdist.py (original)
+++ sandbox/trunk/setuptools/setuptools/command/sdist.py Mon Jul 10 23:14:27 2006
@@ -142,7 +142,7 @@
ei_cmd = self.get_finalized_command('egg_info')
self.filelist = ei_cmd.filelist
self.filelist.append(os.path.join(ei_cmd.egg_info,'SOURCES.txt'))
-
+ self.check_readme()
self.check_metadata()
self.make_distribution()
@@ -161,3 +161,45 @@
# dying and thus masking the real error
sys.exc_info()[2].tb_next.tb_frame.f_locals['template'].close()
raise
+
+ def check_readme(self):
+ alts = ("README", "README.txt")
+ for f in alts:
+ if os.path.exists(f):
+ return
+ else:
+ self.warn(
+ "standard file not found: should have one of " +', '.join(alts)
+ )
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+#
From python-checkins at python.org Mon Jul 10 23:17:17 2006
From: python-checkins at python.org (phillip.eby)
Date: Mon, 10 Jul 2006 23:17:17 +0200 (CEST)
Subject: [Python-checkins] r50544 - in sandbox/branches/setuptools-0.6:
setuptools.txt setuptools/command/egg_info.py
setuptools/command/sdist.py
Message-ID: <20060710211717.2624E1E4002@bag.python.org>
Author: phillip.eby
Date: Mon Jul 10 23:17:16 2006
New Revision: 50544
Modified:
sandbox/branches/setuptools-0.6/setuptools.txt
sandbox/branches/setuptools-0.6/setuptools/command/egg_info.py
sandbox/branches/setuptools-0.6/setuptools/command/sdist.py
Log:
Fixed redundant warnings about missing ``README`` file(s); it should now
appear only if you are actually a source distribution.
(backport from trunk)
Modified: sandbox/branches/setuptools-0.6/setuptools.txt
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools.txt (original)
+++ sandbox/branches/setuptools-0.6/setuptools.txt Mon Jul 10 23:17:16 2006
@@ -2570,6 +2570,9 @@
* Added ``--no-date`` and ``--no-svn-revision`` options to ``egg_info``
command, to allow suppressing tags configured in ``setup.cfg``.
+ * Fixed redundant warnings about missing ``README`` file(s); it should now
+ appear only if you are actually a source distribution.
+
0.6b3
* Fix ``bdist_egg`` not including files in subdirectories of ``.egg-info``.
Modified: sandbox/branches/setuptools-0.6/setuptools/command/egg_info.py
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools/command/egg_info.py (original)
+++ sandbox/branches/setuptools-0.6/setuptools/command/egg_info.py Mon Jul 10 23:17:16 2006
@@ -281,9 +281,9 @@
self.execute(file_util.write_file, (self.manifest, files),
"writing manifest file '%s'" % self.manifest)
-
-
-
+ def warn(self, msg): # suppress missing-file warnings from sdist
+ if not msg.startswith("standard file not found:"):
+ sdist.warn(self, msg)
def add_defaults(self):
sdist.add_defaults(self)
Modified: sandbox/branches/setuptools-0.6/setuptools/command/sdist.py
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools/command/sdist.py (original)
+++ sandbox/branches/setuptools-0.6/setuptools/command/sdist.py Mon Jul 10 23:17:16 2006
@@ -142,9 +142,9 @@
ei_cmd = self.get_finalized_command('egg_info')
self.filelist = ei_cmd.filelist
self.filelist.append(os.path.join(ei_cmd.egg_info,'SOURCES.txt'))
-
+ self.check_readme()
self.check_metadata()
- self.make_distribution()
+ self.make_distribution()
dist_files = getattr(self.distribution,'dist_files',[])
for file in self.archive_files:
@@ -162,3 +162,44 @@
sys.exc_info()[2].tb_next.tb_frame.f_locals['template'].close()
raise
+ def check_readme(self):
+ alts = ("README", "README.txt")
+ for f in alts:
+ if os.path.exists(f):
+ return
+ else:
+ self.warn(
+ "standard file not found: should have one of " +', '.join(alts)
+ )
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+#
From python-checkins at python.org Mon Jul 10 23:25:14 2006
From: python-checkins at python.org (phillip.eby)
Date: Mon, 10 Jul 2006 23:25:14 +0200 (CEST)
Subject: [Python-checkins] r50545 -
sandbox/trunk/setuptools/setuptools/command/easy_install.py
Message-ID: <20060710212514.961031E4009@bag.python.org>
Author: phillip.eby
Date: Mon Jul 10 23:25:14 2006
New Revision: 50545
Modified:
sandbox/trunk/setuptools/setuptools/command/easy_install.py
Log:
Ensure that sys.path_importer_cache is updated when an existing zipfile
or directory is deleted.
Modified: sandbox/trunk/setuptools/setuptools/command/easy_install.py
==============================================================================
--- sandbox/trunk/setuptools/setuptools/command/easy_install.py (original)
+++ sandbox/trunk/setuptools/setuptools/command/easy_install.py Mon Jul 10 23:25:14 2006
@@ -1410,7 +1410,6 @@
options = ' '+options
return "#!%(executable)s%(options)s\n" % locals()
-
def auto_chmod(func, arg, exc):
if func is os.remove and os.name=='nt':
os.chmod(arg, stat.S_IWRITE)
@@ -1418,21 +1417,22 @@
exc = sys.exc_info()
raise exc[0], (exc[1][0], exc[1][1] + (" %s %s" % (func,arg)))
-
def uncache_zipdir(path):
- """Ensure that the zip directory cache doesn't have stale info for path"""
+ """Ensure that the importer caches dont have stale info for `path`"""
from zipimport import _zip_directory_cache as zdc
- if path in zdc:
- del zdc[path]
+ _uncache(path, zdc)
+ _uncache(path, sys.path_importer_cache)
+
+def _uncache(path, cache):
+ if path in cache:
+ del cache[path]
else:
path = normalize_path(path)
- for p in zdc:
+ for p in cache:
if normalize_path(p)==path:
- del zdc[p]
+ del cache[p]
return
-
-
def is_python(text, filename=''):
"Is this string a valid Python script?"
try:
From python-checkins at python.org Mon Jul 10 23:30:13 2006
From: python-checkins at python.org (phillip.eby)
Date: Mon, 10 Jul 2006 23:30:13 +0200 (CEST)
Subject: [Python-checkins] r50546 - in sandbox/branches/setuptools-0.6:
EasyInstall.txt setuptools/command/easy_install.py
Message-ID: <20060710213013.244691E400B@bag.python.org>
Author: phillip.eby
Date: Mon Jul 10 23:30:12 2006
New Revision: 50546
Modified:
sandbox/branches/setuptools-0.6/EasyInstall.txt
sandbox/branches/setuptools-0.6/setuptools/command/easy_install.py
Log:
Fix ``sys.path_importer_cache`` not being updated when an existing zipfile
or directory is deleted/overwritten.
Modified: sandbox/branches/setuptools-0.6/EasyInstall.txt
==============================================================================
--- sandbox/branches/setuptools-0.6/EasyInstall.txt (original)
+++ sandbox/branches/setuptools-0.6/EasyInstall.txt Mon Jul 10 23:30:12 2006
@@ -1101,6 +1101,9 @@
* Fix ``ftp://`` directory listing URLs from causing a crash when used in the
"Home page" or "Download URL" slots on PyPI.
+ * Fix ``sys.path_importer_cache`` not being updated when an existing zipfile
+ or directory is deleted/overwritten.
+
0.6b3
* Fix local ``--find-links`` eggs not being copied except with
``--always-copy``.
Modified: sandbox/branches/setuptools-0.6/setuptools/command/easy_install.py
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools/command/easy_install.py (original)
+++ sandbox/branches/setuptools-0.6/setuptools/command/easy_install.py Mon Jul 10 23:30:12 2006
@@ -1410,7 +1410,6 @@
options = ' '+options
return "#!%(executable)s%(options)s\n" % locals()
-
def auto_chmod(func, arg, exc):
if func is os.remove and os.name=='nt':
os.chmod(arg, stat.S_IWRITE)
@@ -1418,21 +1417,22 @@
exc = sys.exc_info()
raise exc[0], (exc[1][0], exc[1][1] + (" %s %s" % (func,arg)))
-
def uncache_zipdir(path):
- """Ensure that the zip directory cache doesn't have stale info for path"""
+ """Ensure that the importer caches dont have stale info for `path`"""
from zipimport import _zip_directory_cache as zdc
- if path in zdc:
- del zdc[path]
+ _uncache(path, zdc)
+ _uncache(path, sys.path_importer_cache)
+
+def _uncache(path, cache):
+ if path in cache:
+ del cache[path]
else:
path = normalize_path(path)
- for p in zdc:
+ for p in cache:
if normalize_path(p)==path:
- del zdc[p]
+ del cache[p]
return
-
-
def is_python(text, filename=''):
"Is this string a valid Python script?"
try:
From python-checkins at python.org Mon Jul 10 23:37:04 2006
From: python-checkins at python.org (tim.peters)
Date: Mon, 10 Jul 2006 23:37:04 +0200 (CEST)
Subject: [Python-checkins] r50547 - python/branches/tim-current_frames
Message-ID: <20060710213704.7D4E11E4002@bag.python.org>
Author: tim.peters
Date: Mon Jul 10 23:37:04 2006
New Revision: 50547
Removed:
python/branches/tim-current_frames/
Log:
This branch was merged to the trunk.
From python-checkins at python.org Mon Jul 10 23:52:42 2006
From: python-checkins at python.org (brett.cannon)
Date: Mon, 10 Jul 2006 23:52:42 +0200 (CEST)
Subject: [Python-checkins] r50548 -
python/branches/bcannon-sandboxing/Include/Python.h
Message-ID: <20060710215242.9E0AC1E4002@bag.python.org>
Author: brett.cannon
Date: Mon Jul 10 23:52:42 2006
New Revision: 50548
Modified:
python/branches/bcannon-sandboxing/Include/Python.h
Log:
Add sandbox.h .
Modified: python/branches/bcannon-sandboxing/Include/Python.h
==============================================================================
--- python/branches/bcannon-sandboxing/Include/Python.h (original)
+++ python/branches/bcannon-sandboxing/Include/Python.h Mon Jul 10 23:52:42 2006
@@ -113,6 +113,8 @@
#include "codecs.h"
#include "pyerrors.h"
+#include "sandbox.h"
+
#include "pystate.h"
#include "pyarena.h"
From python-checkins at python.org Mon Jul 10 23:53:12 2006
From: python-checkins at python.org (brett.cannon)
Date: Mon, 10 Jul 2006 23:53:12 +0200 (CEST)
Subject: [Python-checkins] r50549 - in python/branches/bcannon-sandboxing:
Makefile.pre.in Python/sandbox.c
Message-ID: <20060710215312.6834F1E4002@bag.python.org>
Author: brett.cannon
Date: Mon Jul 10 23:53:12 2006
New Revision: 50549
Added:
python/branches/bcannon-sandboxing/Python/sandbox.c
Modified:
python/branches/bcannon-sandboxing/Makefile.pre.in
Log:
Add Python/sandbox.c .
Modified: python/branches/bcannon-sandboxing/Makefile.pre.in
==============================================================================
--- python/branches/bcannon-sandboxing/Makefile.pre.in (original)
+++ python/branches/bcannon-sandboxing/Makefile.pre.in Mon Jul 10 23:53:12 2006
@@ -262,6 +262,7 @@
Python/mysnprintf.o \
Python/pyarena.o \
Python/pyfpe.o \
+ Python/sandbox.o \
Python/pystate.o \
Python/pythonrun.o \
Python/structmember.o \
@@ -539,6 +540,7 @@
Include/pyfpe.h \
Include/pymem.h \
Include/pyport.h \
+ Include/sandbox.h \
Include/pystate.h \
Include/pythonrun.h \
Include/rangeobject.h \
Added: python/branches/bcannon-sandboxing/Python/sandbox.c
==============================================================================
--- (empty file)
+++ python/branches/bcannon-sandboxing/Python/sandbox.c Mon Jul 10 23:53:12 2006
@@ -0,0 +1,13 @@
+#include "Python.h" /* Must be defined before PySandbox_SUPPORTED check */
+
+#ifdef PySandbox_SUPPORTED
+
+#ifdef __cplusplus
+extern "C" {
+#endif
+
+#ifdef __cplusplus
+}
+#endif
+
+#endif /* PySandbox_SUPPORTED */
From python-checkins at python.org Mon Jul 10 23:54:13 2006
From: python-checkins at python.org (phillip.eby)
Date: Mon, 10 Jul 2006 23:54:13 +0200 (CEST)
Subject: [Python-checkins] r50550 -
sandbox/trunk/setuptools/setuptools/package_index.py
Message-ID: <20060710215413.4D7AE1E4002@bag.python.org>
Author: phillip.eby
Date: Mon Jul 10 23:54:12 2006
New Revision: 50550
Modified:
sandbox/trunk/setuptools/setuptools/package_index.py
Log:
Fix not recognizing HTML 404 pages from package indexes.
Modified: sandbox/trunk/setuptools/setuptools/package_index.py
==============================================================================
--- sandbox/trunk/setuptools/setuptools/package_index.py (original)
+++ sandbox/trunk/setuptools/setuptools/package_index.py Mon Jul 10 23:54:12 2006
@@ -169,7 +169,7 @@
base = f.url # handle redirects
page = f.read()
f.close()
- if url.startswith(self.index_url):
+ if url.startswith(self.index_url) and getattr(f,'code',None)!=404:
page = self.process_index(url, page)
for match in HREF.finditer(page):
@@ -253,7 +253,7 @@
def scan_all(self, msg=None, *args):
if self.index_url not in self.fetched_urls:
if msg: self.warn(msg,*args)
- self.warn(
+ self.info(
"Scanning index of all packages (this may take a while)"
)
self.scan_url(self.index_url)
From python-checkins at python.org Mon Jul 10 23:54:47 2006
From: python-checkins at python.org (brett.cannon)
Date: Mon, 10 Jul 2006 23:54:47 +0200 (CEST)
Subject: [Python-checkins] r50551 - in python/branches/bcannon-sandboxing:
Include/sandbox.h Python/pythonrun.c
Message-ID: <20060710215447.BBB2A1E401B@bag.python.org>
Author: brett.cannon
Date: Mon Jul 10 23:54:47 2006
New Revision: 50551
Modified:
python/branches/bcannon-sandboxing/Include/sandbox.h
python/branches/bcannon-sandboxing/Python/pythonrun.c
Log:
Initialize PyInterpreterState->sandbox_state to NULL in Py_NewInterpreter(). Also add _PySandbox_Protected() macro to check if the current interpreter is sandboxed or not.
Modified: python/branches/bcannon-sandboxing/Include/sandbox.h
==============================================================================
--- python/branches/bcannon-sandboxing/Include/sandbox.h (original)
+++ python/branches/bcannon-sandboxing/Include/sandbox.h Mon Jul 10 23:54:47 2006
@@ -9,9 +9,16 @@
struct _sandbox_state; /* Forward */
typedef struct _sandbox_state {
+ PY_LONG_LONG mem_cap;
} PySandboxState;
-#endif /* Py_SANDBOX_H */
+/* Return true if sandboxing is turn on for the current interpreter. */
+#define _PySandbox_Protected() (PyThreadState_GET()->interp->sandbox_state != NULL)
+
+#ifdef __cplusplus
+}
+#endif
+#endif /* Py_SANDBOX_H */
#endif /* PySandbox_SUPPORTED */
Modified: python/branches/bcannon-sandboxing/Python/pythonrun.c
==============================================================================
--- python/branches/bcannon-sandboxing/Python/pythonrun.c (original)
+++ python/branches/bcannon-sandboxing/Python/pythonrun.c Mon Jul 10 23:54:47 2006
@@ -512,6 +512,13 @@
if (interp == NULL)
return NULL;
+#ifdef PySandbox_SUPPORTED
+ /* Must set sandbox_state to NULL to flag that the interpreter is
+ unprotected. It if is to be protected, the field is set by
+ PySandbox_NewInterpreter(). */
+ interp->sandbox_state = NULL;
+#endif
+
tstate = PyThreadState_New(interp);
if (tstate == NULL) {
PyInterpreterState_Delete(interp);
From python-checkins at python.org Mon Jul 10 23:55:25 2006
From: python-checkins at python.org (phillip.eby)
Date: Mon, 10 Jul 2006 23:55:25 +0200 (CEST)
Subject: [Python-checkins] r50552 - in sandbox/branches/setuptools-0.6:
EasyInstall.txt setuptools/package_index.py
Message-ID: <20060710215525.7DC131E4008@bag.python.org>
Author: phillip.eby
Date: Mon Jul 10 23:55:25 2006
New Revision: 50552
Modified:
sandbox/branches/setuptools-0.6/EasyInstall.txt
sandbox/branches/setuptools-0.6/setuptools/package_index.py
Log:
Fix not recognizing HTML 404 pages from package indexes.
(backport from trunk)
Modified: sandbox/branches/setuptools-0.6/EasyInstall.txt
==============================================================================
--- sandbox/branches/setuptools-0.6/EasyInstall.txt (original)
+++ sandbox/branches/setuptools-0.6/EasyInstall.txt Mon Jul 10 23:55:25 2006
@@ -1104,6 +1104,8 @@
* Fix ``sys.path_importer_cache`` not being updated when an existing zipfile
or directory is deleted/overwritten.
+ * Fix not recognizing HTML 404 pages from package indexes.
+
0.6b3
* Fix local ``--find-links`` eggs not being copied except with
``--always-copy``.
Modified: sandbox/branches/setuptools-0.6/setuptools/package_index.py
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools/package_index.py (original)
+++ sandbox/branches/setuptools-0.6/setuptools/package_index.py Mon Jul 10 23:55:25 2006
@@ -169,7 +169,7 @@
base = f.url # handle redirects
page = f.read()
f.close()
- if url.startswith(self.index_url):
+ if url.startswith(self.index_url) and getattr(f,'code',None)!=404:
page = self.process_index(url, page)
for match in HREF.finditer(page):
@@ -253,7 +253,7 @@
def scan_all(self, msg=None, *args):
if self.index_url not in self.fetched_urls:
if msg: self.warn(msg,*args)
- self.warn(
+ self.info(
"Scanning index of all packages (this may take a while)"
)
self.scan_url(self.index_url)
From python-checkins at python.org Tue Jul 11 00:11:28 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Tue, 11 Jul 2006 00:11:28 +0200 (CEST)
Subject: [Python-checkins] r50553 - in python/trunk: Doc/lib/libturtle.tex
Lib/lib-tk/turtle.py Misc/NEWS
Message-ID: <20060710221128.E697A1E4002@bag.python.org>
Author: martin.v.loewis
Date: Tue Jul 11 00:11:28 2006
New Revision: 50553
Modified:
python/trunk/Doc/lib/libturtle.tex
python/trunk/Lib/lib-tk/turtle.py
python/trunk/Misc/NEWS
Log:
Patch #1519566: Remove unused _tofill member.
Make begin_fill idempotent.
Update demo2 to demonstrate filling of concave shapes.
Modified: python/trunk/Doc/lib/libturtle.tex
==============================================================================
--- python/trunk/Doc/lib/libturtle.tex (original)
+++ python/trunk/Doc/lib/libturtle.tex Tue Jul 11 00:11:28 2006
@@ -108,7 +108,9 @@
\end{funcdesc}
\begin{funcdesc}{begin\_fill}{}
-Switch turtle into filling mode; equivalent to \code{fill(1)}.
+Switch turtle into filling mode;
+Must eventually be followed by a corresponding end_fill() call.
+Otherwise it will be ignored.
\versionadded{2.5}
\end{funcdesc}
Modified: python/trunk/Lib/lib-tk/turtle.py
==============================================================================
--- python/trunk/Lib/lib-tk/turtle.py (original)
+++ python/trunk/Lib/lib-tk/turtle.py Tue Jul 11 00:11:28 2006
@@ -86,7 +86,6 @@
self._color = "black"
self._filling = 0
self._path = []
- self._tofill = []
self.clear()
canvas._root().tkraise()
@@ -306,19 +305,15 @@
{'fill': self._color,
'smooth': smooth})
self._items.append(item)
- if self._tofill:
- for item in self._tofill:
- self._canvas.itemconfigure(item, fill=self._color)
- self._items.append(item)
self._path = []
- self._tofill = []
self._filling = flag
if flag:
self._path.append(self._position)
- self.forward(0)
def begin_fill(self):
""" Called just before drawing a shape to be filled.
+ Must eventually be followed by a corresponding end_fill() call.
+ Otherwise it will be ignored.
Example:
>>> turtle.begin_fill()
@@ -331,7 +326,8 @@
>>> turtle.forward(100)
>>> turtle.end_fill()
"""
- self.fill(1)
+ self._path = [self._position]
+ self._filling = 1
def end_fill(self):
""" Called after drawing a shape to be filled.
@@ -901,15 +897,30 @@
speed(speeds[sp])
color(0.25,0,0.75)
fill(0)
- color("green")
- left(130)
+ # draw and fill a concave shape
+ left(120)
up()
- forward(90)
+ forward(70)
+ right(30)
+ down()
color("red")
- speed('fastest')
+ speed("fastest")
+ fill(1)
+ for i in range(4):
+ circle(50,90)
+ right(90)
+ forward(30)
+ right(90)
+ color("yellow")
+ fill(0)
+ left(90)
+ up()
+ forward(30)
down();
+ color("red")
+
# create a second turtle and make the original pursue and catch it
turtle=Turtle()
turtle.reset()
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Tue Jul 11 00:11:28 2006
@@ -46,6 +46,8 @@
Library
-------
+- Patch #1519566: Update turtle demo, make begin_fill idempotent.
+
- Bug #1508010: msvccompiler now requires the DISTUTILS_USE_SDK
environment variable to be set in order to the SDK environment
for finding the compiler, include files, etc.
From python-checkins at python.org Tue Jul 11 00:54:04 2006
From: python-checkins at python.org (phillip.eby)
Date: Tue, 11 Jul 2006 00:54:04 +0200 (CEST)
Subject: [Python-checkins] r50554 -
sandbox/trunk/setuptools/setuptools/package_index.py
Message-ID: <20060710225404.210631E4002@bag.python.org>
Author: phillip.eby
Date: Tue Jul 11 00:54:03 2006
New Revision: 50554
Modified:
sandbox/trunk/setuptools/setuptools/package_index.py
Log:
Allow use of file:// URLs for --index-url.
Modified: sandbox/trunk/setuptools/setuptools/package_index.py
==============================================================================
--- sandbox/trunk/setuptools/setuptools/package_index.py (original)
+++ sandbox/trunk/setuptools/setuptools/package_index.py Tue Jul 11 00:54:03 2006
@@ -1,6 +1,6 @@
"""PyPI and direct package downloading"""
-import sys, os.path, re, urlparse, urllib2, shutil, random, socket
+import sys, os.path, re, urlparse, urllib2, shutil, random, socket, cStringIO
from pkg_resources import *
from distutils import log
from distutils.errors import DistutilsError
@@ -573,6 +573,8 @@
def open_url(self, url):
+ if url.startswith('file:'):
+ return local_open(url)
try:
return urllib2.urlopen(url)
except urllib2.HTTPError, v:
@@ -610,6 +612,7 @@
else:
return filename
+
def scan_url(self, url):
self.process_url(url, True)
@@ -643,17 +646,6 @@
def warn(self, msg, *args):
log.warn(msg, *args)
-
-
-
-
-
-
-
-
-
-
-
def fix_sf_url(url):
scheme, server, path, param, query, frag = urlparse.urlparse(url)
if server!='prdownloads.sourceforge.net':
@@ -674,22 +666,30 @@
return random.choice(_sf_mirrors)
+def local_open(url):
+ """Read a local path, with special support for directories"""
+ scheme, server, path, param, query, frag = urlparse.urlparse(url)
+ filename = urllib2.url2pathname(path)
+ if os.path.isfile(filename):
+ return urllib2.urlopen(url)
+ elif path.endswith('/') and os.path.isdir(filename):
+ files = []
+ for f in os.listdir(filename):
+ if f=='index.html':
+ body = open(os.path.join(filename,f),'rb').read()
+ break
+ elif os.path.isdir(os.path.join(filename,f)):
+ f+='/'
+ files.append("%s" % (f,f))
+ else:
+ body = ("%s" % url) + \
+ "%s" % '\n'.join(files)
+ status, message = 200, "OK"
+ else:
+ status, message, body = 404, "Path not found", "Not found"
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
+ return urllib2.HTTPError(url, status, message,
+ {'content-type':'text/html'}, cStringIO.StringIO(body))
From python-checkins at python.org Tue Jul 11 01:03:21 2006
From: python-checkins at python.org (phillip.eby)
Date: Tue, 11 Jul 2006 01:03:21 +0200 (CEST)
Subject: [Python-checkins] r50555 - in sandbox/branches/setuptools-0.6:
EasyInstall.txt setuptools/package_index.py
Message-ID: <20060710230321.600551E4002@bag.python.org>
Author: phillip.eby
Date: Tue Jul 11 01:03:20 2006
New Revision: 50555
Modified:
sandbox/branches/setuptools-0.6/EasyInstall.txt
sandbox/branches/setuptools-0.6/setuptools/package_index.py
Log:
Allow ``file://`` URLs to be used as a package index. URLs that refer to
directories will use an internally-generated directory listing if there is
no ``index.html`` file in the directory.
(backport from trunk)
Modified: sandbox/branches/setuptools-0.6/EasyInstall.txt
==============================================================================
--- sandbox/branches/setuptools-0.6/EasyInstall.txt (original)
+++ sandbox/branches/setuptools-0.6/EasyInstall.txt Tue Jul 11 01:03:20 2006
@@ -1106,6 +1106,10 @@
* Fix not recognizing HTML 404 pages from package indexes.
+ * Allow ``file://`` URLs to be used as a package index. URLs that refer to
+ directories will use an internally-generated directory listing if there is
+ no ``index.html`` file in the directory.
+
0.6b3
* Fix local ``--find-links`` eggs not being copied except with
``--always-copy``.
Modified: sandbox/branches/setuptools-0.6/setuptools/package_index.py
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools/package_index.py (original)
+++ sandbox/branches/setuptools-0.6/setuptools/package_index.py Tue Jul 11 01:03:20 2006
@@ -1,6 +1,6 @@
"""PyPI and direct package downloading"""
-import sys, os.path, re, urlparse, urllib2, shutil, random, socket
+import sys, os.path, re, urlparse, urllib2, shutil, random, socket, cStringIO
from pkg_resources import *
from distutils import log
from distutils.errors import DistutilsError
@@ -573,6 +573,8 @@
def open_url(self, url):
+ if url.startswith('file:'):
+ return local_open(url)
try:
return urllib2.urlopen(url)
except urllib2.HTTPError, v:
@@ -610,6 +612,7 @@
else:
return filename
+
def scan_url(self, url):
self.process_url(url, True)
@@ -643,17 +646,6 @@
def warn(self, msg, *args):
log.warn(msg, *args)
-
-
-
-
-
-
-
-
-
-
-
def fix_sf_url(url):
scheme, server, path, param, query, frag = urlparse.urlparse(url)
if server!='prdownloads.sourceforge.net':
@@ -674,22 +666,30 @@
return random.choice(_sf_mirrors)
+def local_open(url):
+ """Read a local path, with special support for directories"""
+ scheme, server, path, param, query, frag = urlparse.urlparse(url)
+ filename = urllib2.url2pathname(path)
+ if os.path.isfile(filename):
+ return urllib2.urlopen(url)
+ elif path.endswith('/') and os.path.isdir(filename):
+ files = []
+ for f in os.listdir(filename):
+ if f=='index.html':
+ body = open(os.path.join(filename,f),'rb').read()
+ break
+ elif os.path.isdir(os.path.join(filename,f)):
+ f+='/'
+ files.append("%s" % (f,f))
+ else:
+ body = ("%s" % url) + \
+ "%s" % '\n'.join(files)
+ status, message = 200, "OK"
+ else:
+ status, message, body = 404, "Path not found", "Not found"
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
+ return urllib2.HTTPError(url, status, message,
+ {'content-type':'text/html'}, cStringIO.StringIO(body))
From python-checkins at python.org Tue Jul 11 02:06:18 2006
From: python-checkins at python.org (phillip.eby)
Date: Tue, 11 Jul 2006 02:06:18 +0200 (CEST)
Subject: [Python-checkins] r50556 -
sandbox/trunk/setuptools/setuptools/package_index.py
Message-ID: <20060711000618.157291E4002@bag.python.org>
Author: phillip.eby
Date: Tue Jul 11 02:06:16 2006
New Revision: 50556
Modified:
sandbox/trunk/setuptools/setuptools/package_index.py
Log:
Reduce screenscraping required for a package index. Homepage and
download URLs can now be marked with 'rel="download"' and
'rel="homepage"' respectively, and the 'Index of Packages' string is no
longer required. Since PyPI doesn't yet support rel="" attributes, the
old "
"-matching code remains, as does the MD5 scraping.
Modified: sandbox/trunk/setuptools/setuptools/package_index.py
==============================================================================
--- sandbox/trunk/setuptools/setuptools/package_index.py (original)
+++ sandbox/trunk/setuptools/setuptools/package_index.py Tue Jul 11 02:06:16 2006
@@ -121,6 +121,47 @@
platform = platform
)
+REL = re.compile("""<([^>]*\srel\s*=\s*['"]?([^'">]+)[^>]*)>""", re.I)
+# this line is here to fix emacs' cruddy broken syntax highlighting
+
+def find_external_links(url, page):
+ """Find rel="homepage" and rel="download" links in `page`, yielding URLs"""
+
+ for match in REL.finditer(page):
+ tag, rel = match.groups()
+ rels = map(str.strip, rel.lower().split(','))
+ if 'homepage' in rels or 'download' in rels:
+ for match in HREF.finditer(tag):
+ yield urlparse.urljoin(url, match.group(1))
+
+ for tag in ("
Home Page", "
Download URL"):
+ pos = page.find(tag)
+ if pos!=-1:
+ match = HREF.search(page,pos)
+ if match:
+ yield urlparse.urljoin(url, match.group(1))
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
class PackageIndex(Environment):
"""A distribution index that scans web pages for download URLs"""
@@ -211,7 +252,7 @@
parts = map(
urllib2.unquote, link[len(self.index_url):].split('/')
)
- if len(parts)==2:
+ if len(parts)==2 and '#' not in parts[1]:
# it's a package page, sanitize and index it
pkg = safe_name(parts[0])
ver = safe_version(parts[1])
@@ -219,30 +260,30 @@
return to_filename(pkg), to_filename(ver)
return None, None
- if url==self.index_url or 'Index of Packages' in page:
- # process an index page into the package-page index
- for match in HREF.finditer(page):
- scan( urlparse.urljoin(url, match.group(1)) )
- else:
- pkg,ver = scan(url) # ensure this page is in the page index
+ # process an index page into the package-page index
+ for match in HREF.finditer(page):
+ scan( urlparse.urljoin(url, match.group(1)) )
+
+ pkg, ver = scan(url) # ensure this page is in the page index
+ if pkg:
# process individual package page
- for tag in ("
Home Page", "
Download URL"):
- pos = page.find(tag)
- if pos!=-1:
- match = HREF.search(page,pos)
- if match:
- # Process the found URL
- new_url = urlparse.urljoin(url, match.group(1))
- base, frag = egg_info_for_url(new_url)
- if base.endswith('.py') and not frag:
- if pkg and ver:
- new_url+='#egg=%s-%s' % (pkg,ver)
- else:
- self.need_version_info(url)
- self.scan_url(new_url)
- return PYPI_MD5.sub(
- lambda m: '%s' % m.group(1,3,2), page
- )
+ for new_url in find_external_links(url, page):
+ # Process the found URL
+ base, frag = egg_info_for_url(new_url)
+ if base.endswith('.py') and not frag:
+ if ver:
+ new_url+='#egg=%s-%s' % (pkg,ver)
+ else:
+ self.need_version_info(url)
+ self.scan_url(new_url)
+
+ return PYPI_MD5.sub(
+ lambda m: '%s' % m.group(1,3,2), page
+ )
+ else:
+ return "" # no sense double-scanning non-package pages
+
+
def need_version_info(self, url):
self.scan_all(
@@ -273,7 +314,7 @@
)
self.scan_all()
- for url in self.package_pages.get(requirement.key,()):
+ for url in list(self.package_pages.get(requirement.key,())):
# scan each page that might be related to the desired package
self.scan_url(url)
From python-checkins at python.org Tue Jul 11 02:14:37 2006
From: python-checkins at python.org (phillip.eby)
Date: Tue, 11 Jul 2006 02:14:37 +0200 (CEST)
Subject: [Python-checkins] r50557 - in sandbox/branches/setuptools-0.6:
EasyInstall.txt setuptools/package_index.py
Message-ID: <20060711001437.3A2F71E4002@bag.python.org>
Author: phillip.eby
Date: Tue Jul 11 02:14:36 2006
New Revision: 50557
Modified:
sandbox/branches/setuptools-0.6/EasyInstall.txt
sandbox/branches/setuptools-0.6/setuptools/package_index.py
Log:
Reduce screenscraping required for a package index. Homepage and
download URLs can now be marked with 'rel="download"' and
'rel="homepage"' respectively, and the 'Index of Packages' string is no
longer required. Since PyPI doesn't yet support rel="" attributes, the
old "
"-matching code remains, as does the MD5 scraping.
(backport from trunk)
Modified: sandbox/branches/setuptools-0.6/EasyInstall.txt
==============================================================================
--- sandbox/branches/setuptools-0.6/EasyInstall.txt (original)
+++ sandbox/branches/setuptools-0.6/EasyInstall.txt Tue Jul 11 02:14:36 2006
@@ -1110,6 +1110,10 @@
directories will use an internally-generated directory listing if there is
no ``index.html`` file in the directory.
+ * Allow external links in a package index to be specified using
+ ``rel="homepage"`` or ``rel="download"``, without needing the old
+ PyPI-specific visible markup.
+
0.6b3
* Fix local ``--find-links`` eggs not being copied except with
``--always-copy``.
Modified: sandbox/branches/setuptools-0.6/setuptools/package_index.py
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools/package_index.py (original)
+++ sandbox/branches/setuptools-0.6/setuptools/package_index.py Tue Jul 11 02:14:36 2006
@@ -121,6 +121,47 @@
platform = platform
)
+REL = re.compile("""<([^>]*\srel\s*=\s*['"]?([^'">]+)[^>]*)>""", re.I)
+# this line is here to fix emacs' cruddy broken syntax highlighting
+
+def find_external_links(url, page):
+ """Find rel="homepage" and rel="download" links in `page`, yielding URLs"""
+
+ for match in REL.finditer(page):
+ tag, rel = match.groups()
+ rels = map(str.strip, rel.lower().split(','))
+ if 'homepage' in rels or 'download' in rels:
+ for match in HREF.finditer(tag):
+ yield urlparse.urljoin(url, match.group(1))
+
+ for tag in ("
Home Page", "
Download URL"):
+ pos = page.find(tag)
+ if pos!=-1:
+ match = HREF.search(page,pos)
+ if match:
+ yield urlparse.urljoin(url, match.group(1))
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
class PackageIndex(Environment):
"""A distribution index that scans web pages for download URLs"""
@@ -211,7 +252,7 @@
parts = map(
urllib2.unquote, link[len(self.index_url):].split('/')
)
- if len(parts)==2:
+ if len(parts)==2 and '#' not in parts[1]:
# it's a package page, sanitize and index it
pkg = safe_name(parts[0])
ver = safe_version(parts[1])
@@ -219,30 +260,30 @@
return to_filename(pkg), to_filename(ver)
return None, None
- if url==self.index_url or 'Index of Packages' in page:
- # process an index page into the package-page index
- for match in HREF.finditer(page):
- scan( urlparse.urljoin(url, match.group(1)) )
- else:
- pkg,ver = scan(url) # ensure this page is in the page index
+ # process an index page into the package-page index
+ for match in HREF.finditer(page):
+ scan( urlparse.urljoin(url, match.group(1)) )
+
+ pkg, ver = scan(url) # ensure this page is in the page index
+ if pkg:
# process individual package page
- for tag in ("
Home Page", "
Download URL"):
- pos = page.find(tag)
- if pos!=-1:
- match = HREF.search(page,pos)
- if match:
- # Process the found URL
- new_url = urlparse.urljoin(url, match.group(1))
- base, frag = egg_info_for_url(new_url)
- if base.endswith('.py') and not frag:
- if pkg and ver:
- new_url+='#egg=%s-%s' % (pkg,ver)
- else:
- self.need_version_info(url)
- self.scan_url(new_url)
- return PYPI_MD5.sub(
- lambda m: '%s' % m.group(1,3,2), page
- )
+ for new_url in find_external_links(url, page):
+ # Process the found URL
+ base, frag = egg_info_for_url(new_url)
+ if base.endswith('.py') and not frag:
+ if ver:
+ new_url+='#egg=%s-%s' % (pkg,ver)
+ else:
+ self.need_version_info(url)
+ self.scan_url(new_url)
+
+ return PYPI_MD5.sub(
+ lambda m: '%s' % m.group(1,3,2), page
+ )
+ else:
+ return "" # no sense double-scanning non-package pages
+
+
def need_version_info(self, url):
self.scan_all(
@@ -273,7 +314,7 @@
)
self.scan_all()
- for url in self.package_pages.get(requirement.key,()):
+ for url in list(self.package_pages.get(requirement.key,())):
# scan each page that might be related to the desired package
self.scan_url(url)
From python-checkins at python.org Tue Jul 11 02:21:42 2006
From: python-checkins at python.org (jackilyn.hoxworth)
Date: Tue, 11 Jul 2006 02:21:42 +0200 (CEST)
Subject: [Python-checkins] r50558 -
python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py
Message-ID: <20060711002142.299241E4002@bag.python.org>
Author: jackilyn.hoxworth
Date: Tue Jul 11 02:21:41 2006
New Revision: 50558
Modified:
python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py
Log:
doesn?t work.
Modified: python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py
==============================================================================
--- python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py (original)
+++ python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py Tue Jul 11 02:21:41 2006
@@ -27,11 +27,12 @@
# create socket
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
-httplib.HTTPResponse(sock).log("message 1")
-httplib.HTTPConnection().log("message 2")
+httplib.HTTPResponse(sock).log("message 1") # says there is no attribute for "log", wtf
+httplib.HTTPConnection(sock.connect).log("message 2")
+
print stringLog.getvalue() # For testing purposes
if stringLog.getvalue() != "Error: It worked":
print "it worked"
else:
- print "it didn't work"
+ print "it didn't work"
\ No newline at end of file
From python-checkins at python.org Tue Jul 11 02:46:47 2006
From: python-checkins at python.org (brett.cannon)
Date: Tue, 11 Jul 2006 02:46:47 +0200 (CEST)
Subject: [Python-checkins] r50559 -
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Message-ID: <20060711004647.EAAF01E4002@bag.python.org>
Author: brett.cannon
Date: Tue Jul 11 02:46:47 2006
New Revision: 50559
Modified:
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Log:
Change API for PySandbox_AllowedMemoryFree() to be a void call and just cap the lowest that memory usage can go to 0.
Also break todo list into design and implementation.
Modified: python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
==============================================================================
--- python/branches/bcannon-sandboxing/sandboxing_design_doc.txt (original)
+++ python/branches/bcannon-sandboxing/sandboxing_design_doc.txt Tue Jul 11 02:46:47 2006
@@ -19,6 +19,9 @@
XXX TO DO
=============================
+Design
+--------------
+
* threading needs protection?
* python-dev convince me that hiding 'file' possible?
+ based on that, handle code objects
@@ -43,6 +46,12 @@
(e.g., ``xml.*``)
+Implementation
+--------------
+
+* note in SpecialBuilds.txt
+
+
Goal
=============================
@@ -518,11 +527,9 @@
and cause the calling function to return with the value of
'error_return', otherwise do nothing.
-* PySandbox_AllowedMemoryFree(integer, error_return)
- Macro to decrease the current running interpreter's allocated
- memory. If this puts the memory used to below 0, raise a
- SandboxError exception and return 'error_return', otherwise do
- nothing.
+* void PySandbox_AllowedMemoryFree(integer, error_return)
+ Decrease the current running interpreter's allocated
+ memory. If this puts the memory used to below 0, re-set it to 0.
Reading/Writing Files
From python-checkins at python.org Tue Jul 11 02:48:29 2006
From: python-checkins at python.org (brett.cannon)
Date: Tue, 11 Jul 2006 02:48:29 +0200 (CEST)
Subject: [Python-checkins] r50560 -
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Message-ID: <20060711004829.2CB131E4002@bag.python.org>
Author: brett.cannon
Date: Tue Jul 11 02:48:28 2006
New Revision: 50560
Modified:
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Log:
Remove error return argument from PySandbox_AllowedMemoryFree().
Modified: python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
==============================================================================
--- python/branches/bcannon-sandboxing/sandboxing_design_doc.txt (original)
+++ python/branches/bcannon-sandboxing/sandboxing_design_doc.txt Tue Jul 11 02:48:28 2006
@@ -527,7 +527,7 @@
and cause the calling function to return with the value of
'error_return', otherwise do nothing.
-* void PySandbox_AllowedMemoryFree(integer, error_return)
+* void PySandbox_AllowedMemoryFree(integer)
Decrease the current running interpreter's allocated
memory. If this puts the memory used to below 0, re-set it to 0.
From python-checkins at python.org Tue Jul 11 03:12:17 2006
From: python-checkins at python.org (phillip.eby)
Date: Tue, 11 Jul 2006 03:12:17 +0200 (CEST)
Subject: [Python-checkins] r50561 - sandbox/trunk/setuptools/EasyInstall.txt
Message-ID: <20060711011217.24F081E4002@bag.python.org>
Author: phillip.eby
Date: Tue Jul 11 03:12:15 2006
New Revision: 50561
Modified:
sandbox/trunk/setuptools/EasyInstall.txt
Log:
Add documentation for package index "API" (layout/content rules)
Modified: sandbox/trunk/setuptools/EasyInstall.txt
==============================================================================
--- sandbox/trunk/setuptools/EasyInstall.txt (original)
+++ sandbox/trunk/setuptools/EasyInstall.txt Tue Jul 11 03:12:15 2006
@@ -1095,6 +1095,98 @@
set, if you haven't already got this set up on your machine.
+Package Index "API"
+-------------------
+
+Custom package indexes (and PyPI) must follow the following rules for
+EasyInstall to be able to look up and download packages:
+
+1. Except where stated otherwise, "pages" are HTML or XHTML, and "links"
+ refer to ``href`` attributes.
+
+2. Individual project version pages' URLs must be of the form
+ ``base/projectname/version``, where ``base`` is the package index's base URL.
+
+3. Omitting the ``/version`` part of a project page's URL (but keeping the
+ trailing ``/``) should result in a page that is either:
+
+ a) The single active version of that project, as though the version had been
+ explicitly included, OR
+
+ b) A page with links to all of the active version pages for that project.
+
+4. Individual project version pages should contain direct links to downloadable
+ distributions where possible. It is explicitly permitted for a project's
+ "long_description" to include URLs, and these should be formatted as HTML
+ links by the package index, as EasyInstall does no special processing to
+ identify what parts of a page are index-specific and which are part of the
+ project's supplied description.
+
+5. Where available, MD5 information should be added to download URLs by
+ appending a fragment identifier of the form ``#md5=...``, where ``...`` is
+ the 32-character hex MD5 digest. EasyInstall will verify that the
+ downloaded file's MD5 digest matches the given value.
+
+6. Individual project version pages should identify any "homepage" or
+ "download" URLs using ``rel="homepage"`` and ``rel="download"`` attributes
+ on the HTML elements linking to those URLs. Use of these attributes will
+ cause EasyInstall to always follow the provided links, unless it can be
+ determined by inspection that they are downloadable distributions. If the
+ links are not to downloadable distributions, they are retrieved, and if they
+ are HTML, they are scanned for download links. They are *not* scanned for
+ additional "homepage" or "download" links, as these are only processed for
+ pages that are part of a package index site.
+
+7. The root URL of the index, if retrieved with a trailing ``/``, must result
+ in a page containing links to *all* projects' active version pages.
+
+ (Note: This requirement is a workaround for the absence of case-insensitive
+ ``safe_name()`` matching of project names in URL paths. If project names are
+ matched in this fashion (e.g. via the PyPI server, mod_rewrite, or a similar
+ mechanism), then it is not necessary to include this all-packages listing
+ page.)
+
+8. If a package index is accessed via a ``file://`` URL, then EasyInstall will
+ automatically use ``index.html`` files, if present, when trying to read a
+ directory with a trailing ``/`` on the URL.
+
+
+Backward Compatibility
+~~~~~~~~~~~~~~~~~~~~~~
+
+Package indexes that wish to support setuptools versions prior to 0.6b4 should
+also follow these rules:
+
+* Homepage and download links must be preceded with ``"
Home Page"`` or
+ ``"
Download URL"``, in addition to (or instead of) the ``rel=""``
+ attributes on the actual links. These marker strings do not need to be
+ visible, or uncommented, however! For example, the following is a valid
+ homepage link that will work with any version of setuptools::
+
+
+
+ Even though the marker string is in an HTML comment, older versions of
+ EasyInstall will still "see" it and know that the link that follows is the
+ project's home page URL.
+
+* The pages described by paragraph 3(b) of the preceding section *must*
+ contain the string ``"Index of Packages"`` somewhere in their text.
+ This can be inside of an HTML comment, if desired, and it can be anywhere
+ in the page. (Note: this string MUST NOT appear on normal project pages, as
+ described in paragraphs 2 and 3(a)!)
+
+In addition, for compatibility with PyPI versions that do not use ``#md5=``
+fragment IDs, EasyInstall uses the following regular expression to match PyPI's
+displayed MD5 info (broken onto two lines for readability)::
+
+ ([^<]+)\n\s+\(md5\)
+
+
Release Notes/Change History
============================
From python-checkins at python.org Tue Jul 11 03:14:51 2006
From: python-checkins at python.org (phillip.eby)
Date: Tue, 11 Jul 2006 03:14:51 +0200 (CEST)
Subject: [Python-checkins] r50562 - sandbox/trunk/setuptools/EasyInstall.txt
Message-ID: <20060711011451.B71631E4002@bag.python.org>
Author: phillip.eby
Date: Tue Jul 11 03:14:51 2006
New Revision: 50562
Modified:
sandbox/trunk/setuptools/EasyInstall.txt
Log:
Crosslink "Creating your own package index" to the new docs.
Modified: sandbox/trunk/setuptools/EasyInstall.txt
==============================================================================
--- sandbox/trunk/setuptools/EasyInstall.txt (original)
+++ sandbox/trunk/setuptools/EasyInstall.txt Tue Jul 11 03:14:51 2006
@@ -430,6 +430,10 @@
As you can see, you can list multiple URLs separated by whitespace, continuing
on multiple lines if necessary (as long as the subsequent lines are indented.
+If you are more ambitious, you can also create an entirely custom package index
+or PyPI mirror. See the ``--index-url`` option under `Command-Line Options`_,
+below, and also the section on `The Package Index API`_.
+
Controlling Build Options
~~~~~~~~~~~~~~~~~~~~~~~~~
From python-checkins at python.org Tue Jul 11 03:16:16 2006
From: python-checkins at python.org (phillip.eby)
Date: Tue, 11 Jul 2006 03:16:16 +0200 (CEST)
Subject: [Python-checkins] r50563 -
sandbox/branches/setuptools-0.6/EasyInstall.txt
Message-ID: <20060711011616.80B301E400A@bag.python.org>
Author: phillip.eby
Date: Tue Jul 11 03:16:16 2006
New Revision: 50563
Modified:
sandbox/branches/setuptools-0.6/EasyInstall.txt
Log:
Merge updated index api docs from the trunk.
Modified: sandbox/branches/setuptools-0.6/EasyInstall.txt
==============================================================================
--- sandbox/branches/setuptools-0.6/EasyInstall.txt (original)
+++ sandbox/branches/setuptools-0.6/EasyInstall.txt Tue Jul 11 03:16:16 2006
@@ -426,6 +426,10 @@
As you can see, you can list multiple URLs separated by whitespace, continuing
on multiple lines if necessary (as long as the subsequent lines are indented.
+If you are more ambitious, you can also create an entirely custom package index
+or PyPI mirror. See the ``--index-url`` option under `Command-Line Options`_,
+below, and also the section on `The Package Index API`_.
+
Controlling Build Options
~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -1091,6 +1095,97 @@
set, if you haven't already got this set up on your machine.
+Package Index "API"
+-------------------
+
+Custom package indexes (and PyPI) must follow the following rules for
+EasyInstall to be able to look up and download packages:
+
+1. Except where stated otherwise, "pages" are HTML or XHTML, and "links"
+ refer to ``href`` attributes.
+
+2. Individual project version pages' URLs must be of the form
+ ``base/projectname/version``, where ``base`` is the package index's base URL.
+
+3. Omitting the ``/version`` part of a project page's URL (but keeping the
+ trailing ``/``) should result in a page that is either:
+
+ a) The single active version of that project, as though the version had been
+ explicitly included, OR
+
+ b) A page with links to all of the active version pages for that project.
+
+4. Individual project version pages should contain direct links to downloadable
+ distributions where possible. It is explicitly permitted for a project's
+ "long_description" to include URLs, and these should be formatted as HTML
+ links by the package index, as EasyInstall does no special processing to
+ identify what parts of a page are index-specific and which are part of the
+ project's supplied description.
+
+5. Where available, MD5 information should be added to download URLs by
+ appending a fragment identifier of the form ``#md5=...``, where ``...`` is
+ the 32-character hex MD5 digest. EasyInstall will verify that the
+ downloaded file's MD5 digest matches the given value.
+
+6. Individual project version pages should identify any "homepage" or
+ "download" URLs using ``rel="homepage"`` and ``rel="download"`` attributes
+ on the HTML elements linking to those URLs. Use of these attributes will
+ cause EasyInstall to always follow the provided links, unless it can be
+ determined by inspection that they are downloadable distributions. If the
+ links are not to downloadable distributions, they are retrieved, and if they
+ are HTML, they are scanned for download links. They are *not* scanned for
+ additional "homepage" or "download" links, as these are only processed for
+ pages that are part of a package index site.
+
+7. The root URL of the index, if retrieved with a trailing ``/``, must result
+ in a page containing links to *all* projects' active version pages.
+
+ (Note: This requirement is a workaround for the absence of case-insensitive
+ ``safe_name()`` matching of project names in URL paths. If project names are
+ matched in this fashion (e.g. via the PyPI server, mod_rewrite, or a similar
+ mechanism), then it is not necessary to include this all-packages listing
+ page.)
+
+8. If a package index is accessed via a ``file://`` URL, then EasyInstall will
+ automatically use ``index.html`` files, if present, when trying to read a
+ directory with a trailing ``/`` on the URL.
+
+
+Backward Compatibility
+~~~~~~~~~~~~~~~~~~~~~~
+
+Package indexes that wish to support setuptools versions prior to 0.6b4 should
+also follow these rules:
+
+* Homepage and download links must be preceded with ``"
Home Page"`` or
+ ``"
Download URL"``, in addition to (or instead of) the ``rel=""``
+ attributes on the actual links. These marker strings do not need to be
+ visible, or uncommented, however! For example, the following is a valid
+ homepage link that will work with any version of setuptools::
+
+
\n''' % base)
f.close()
# Write .yml files
@@ -73,12 +74,54 @@
breadcrumb: !breadcrumb nav.yml nav
text: !htmlfile question.html""")
f.close()
-
def convert_index (filename):
root = htmlload.load(filename)
+ title = root.findtext('*/title')
+ i = title.find(':')
+ category = title[i+1:].strip()
+
+ # XXX can there be subsections within a category?
+ entries = []
+ for para in root.findall('*/p'):
+ a = para.find('a')
+ if a is not None:
+ entries.append(a)
+
+ idir = os.path.join(DESTDIR, category)
+
+ # Write body of question
+ f = open(os.path.join(idir, 'listing.html'), 'w')
+ f.write('<ul>\n')
+ for a in entries:
+ f.write('<li>%s</li>\n' % ET.tostring(a, 'utf-8'))
+ f.write('</ul>\n')
+ f.close()
+
+ # Write .yml files
+ f = open(os.path.join(idir, 'index.yml'), 'w')
+ f.write("""--- !fragment
+template: index.html
+# The data to pass to the template
+local:
+ title: %s
+ content: !fragment content.yml
+ """ % title)
+ f.close()
+
+ f = open(os.path.join(idir, 'content.yml'), 'w')
+ f.write("""--- !fragment
+# Type of template to use
+template: content.html
+# The data to pass to the template
+local:
+ content:
+ breadcrumb: !breadcrumb nav.yml nav
+ text: !htmlfile listing.html""")
+ f.close()
+
def write_master_index (index_files):
pass
From python-checkins at python.org Sat Jul 15 08:20:21 2006
From: python-checkins at python.org (mateusz.rukowicz)
Date: Sat, 15 Jul 2006 08:20:21 +0200 (CEST)
Subject: [Python-checkins] r50663 - sandbox/trunk/decimal-c/_decimal.c
Message-ID: <20060715062021.2F71F1E4004@bag.python.org>
Author: mateusz.rukowicz
Date: Sat Jul 15 08:20:19 2006
New Revision: 50663
Modified:
sandbox/trunk/decimal-c/_decimal.c
Log:
Added ln function and x^y with non-integer y (this still needs some fixing). Improved efficiency of dividing a little bit.
Modified: sandbox/trunk/decimal-c/_decimal.c
==============================================================================
--- sandbox/trunk/decimal-c/_decimal.c (original)
+++ sandbox/trunk/decimal-c/_decimal.c Sat Jul 15 08:20:19 2006
@@ -241,7 +241,7 @@
static long
_limb_get_digit(long *self, long ndigits, long i)
{
- if(i >= ndigits)
+ if(i >= ndigits || i < 0)
return 0;
long pos = ndigits - i - 1;
long limb = pos / LOG;
@@ -397,6 +397,27 @@
}
+
+static long
+_limb_multiply_int(long *first, long flimbs, long second, long *out) {
+ long i;
+ for (i=0;i<flimbs;i++)
+ out[i] = first[i] * second;
+ out[flimbs] = 0;
+ for (i=0;i<flimbs;i++) {
+ if (out[i] >= BASE) {
+ assert (i+1 <=flimbs);
+ out[i+1] += out[i]/BASE;
+ out[i] %= BASE;
+ }
+ }
+
+ if (out[flimbs])
+ return flimbs+1;
+ return flimbs;
+
+}
/* temporary solution, will be speeded up */
static long
_limb_multiply_core(long *first, long flimbs, long *second, long slimbs, long *out)
@@ -473,12 +494,45 @@
while(1) {
long candidate = 0;
long cmp;
+ long up, down;
+ up = BASE;
+ down = 0;
+
+ while(1) {
+ long tmp[slimbs+1];
+ long tmp_limbs;
+ long diff;
+ long mid;
+ int bla;
+ diff = up - down;
+ mid = down + diff/2;
+
+ tmp_limbs = _limb_multiply_int(second, slimbs, mid, tmp);
+ cmp = _limb_compare_un(rest, rlimbs, tmp, tmp_limbs);
+ if (cmp == 0) {
+ up = mid + 1;
+ down = mid;
+ }
+ if (cmp == -1) {
+ up = mid;
+ }
+ if (cmp == 1) {
+ down = mid;
+ }
+
+ if (down == up-1) {
+ tmp_limbs = _limb_multiply_int(second, slimbs, down, tmp);
+ rlimbs = _limb_sub_sl(rest, rlimbs, tmp, tmp_limbs);
+ candidate = down;
+ break;
+ }
+ }
- while ((cmp = _limb_compare_un(rest, rlimbs, second, slimbs)) >= 0) {
+ /* while ((cmp = _limb_compare_un(rest, rlimbs, second, slimbs)) >= 0) {
candidate ++;
rlimbs = _limb_sub_sl(rest, rlimbs, second, slimbs);
}
-
+*/
if (candidate)
is_significant = 1;
@@ -1074,6 +1128,8 @@
#define SIGN_POSSNAN 6
#define SIGN_NEGSNAN 7
+/* Biggest exponent in transcendental functions */
+#define MAX_MATH 999999
/* Rounding constants */
/* for context->rounding */
@@ -1900,7 +1956,7 @@
ans->digits[1] = 1;
ans->limbs[0] = 1;
digits = 1;
- } else {
+ } else { /* SLOW */
ans = _NEW_decimalobj(self->ob_size+1, self->sign, self->exp);
if (!ans)
return NULL;
@@ -4507,12 +4563,12 @@
ans = tmp;
}
+ ctx->rounding = rounding;
{
decimalobject *fixed = _decimal_fix(ans, ctx);
Py_DECREF(ans);
ans = fixed;
}
- ctx->rounding = rounding;
return ans;
}
@@ -4525,14 +4581,261 @@
if (!ctx)
return NULL;
- if (ctx->prec > 999999 ||
- exp_g_i(ctx->Emax, 999999) ||
- exp_l_i(ctx->Emin,-999999))
+ if (ctx->prec > MAX_MATH ||
+ exp_g_i(ctx->Emax, MAX_MATH) ||
+ exp_l_i(ctx->Emin,-1 * MAX_MATH))
handle_InvalidContext(self->ob_type, ctx, NULL);
return _do_decimal_exponent(self, ctx);
}
+int ln_lookup[] = {9016, 8652, 8316, 8008, 7724, 7456, 7208,
+ 6972, 6748, 6540, 6340, 6148, 5968, 5792, 5628, 5464, 5312,
+ 5164, 5020, 4884, 4748, 4620, 4496, 4376, 4256, 4144, 4032,
+ 39233, 38181, 37157, 36157, 35181, 34229, 33297, 32389, 31501, 30629,
+ 29777, 28945, 28129, 27329, 26545, 25777, 25021, 24281, 23553, 22837,
+ 22137, 21445, 20769, 20101, 19445, 18801, 18165, 17541, 16925, 16321,
+ 15721, 15133, 14553, 13985, 13421, 12865, 12317, 11777, 11241, 10717,
+ 10197, 9685, 9177, 8677, 8185, 7697, 7213, 6737, 6269, 5801,
+ 5341, 4889, 4437, 39930, 35534, 31186, 26886, 22630, 18418, 14254,
+ 10130, 6046, 20055};
+
+static PyObject *
+_do_decimal__ln(decimalobject *self, contextobject *ctx) {
+
+ decimalobject *ans = 0, *b = 0, *tmp, *one = 0;
+ contextobject *ctx1 = 0, *ctx2 = 0;
+ PyObject *flags;
+ long precision;
+ int rounding;
+ int clamped;
+ long prec, cp;
+ int t;
+ if (!ctx)
+ ctx = getcontext();
+ if (!ctx)
+ return NULL;
+
+ if (ISSPECIAL(self)) {
+ decimalobject *nan;
+ if (_check_nans(self, NULL, ctx, &nan) != 0)
+ return nan;
+
+ /*-inf -> error
+ * inf -> self */
+ if (ISINF(self)) {
+ if (self->sign &1)
+ return handle_InvalidOperation(self->ob_type, ctx, "ln(-inf)", NULL);
+ else
+ return _decimal_get_copy(self);
+ }
+ }
+
+ if (!decimal_nonzero(self)) {
+ ans = _NEW_decimalobj(1, SIGN_NEGINF, exp_from_i(0));
+ return ans;
+ }
+
+ if (self->sign &1) {
+ return handle_InvalidOperation(self->ob_type, ctx, "ln(-x)", NULL);
+ }
+
+ one = _NEW_decimalobj(1, 0, exp_from_i(0));
+ if (!one)
+ return NULL;
+ one->limbs[0] = 1;
+
+ ctx1 = context_copy(ctx);
+ if (!ctx1)
+ goto err;
+
+ flags = context_ignore_all_flags(ctx1);
+ if (!flags)
+ goto err;
+ Py_DECREF(flags);
+
+
+
+ ctx1->prec = 16;
+
+ prec = (self->ob_size > ctx->prec ? self->ob_size : ctx->prec) + 2;
+ prec = prec > 9 ? prec : 9;
+
+ ans = decimal_from_long(self->ob_type, exp_to_i(ADJUSTED(self)) + 1);
+ if (!ans)
+ goto err;
+ b = decimal_from_long(self->ob_type, 2302585);
+ if (!b)
+ goto err;
+ b->exp = exp_from_i(-6);
+
+
+ tmp = _do_decimal_multiply(ans, b, ctx1);
+ if (!tmp)
+ goto err;
+
+ Py_DECREF(ans);
+ ans = tmp;
+
+ t = _limb_get_digit(self->limbs, self->ob_size, 0);
+ if (self->ob_size > 1) {
+ t *= 10;
+ t += _limb_get_digit(self->limbs, self->ob_size, 1);
+ }
+ if (t<10) t *= 10;
+
+ t = ln_lookup[t-10];
+
+ Py_DECREF(b);
+ b = decimal_from_long(self->ob_type, t >> 2);
+ if (!b)
+ goto err;
+ b->exp = exp_from_i(-(t&3) - 3);
+ b->sign = 1;
+
+ rounding = ctx1->rounding;
+ ctx1->rounding = ROUND_HALF_EVEN;
+ tmp = _do_decimal_add(ans, b, ctx1);
+ if (!tmp)
+ goto err;
+ Py_DECREF(ans);
+ ans = tmp;
+
+ ctx1->prec = ctx->prec;
+ ctx1->clamp = 0;
+ ctx2 = context_shallow_copy(ctx1);
+ if (!ctx2)
+ goto err;
+ ctx2->Emax = exp_from_i(2 * MAX_MATH);
+ ctx2->Emin = exp_from_i(-2 * MAX_MATH);
+
+ cp = 9; /* precision we'll get after every iteration */
+ ctx1->prec = cp;
+ ctx2->prec = cp + self->ob_size;
+
+
+ while (1) {
+ ans->sign ^= 1;
+ Py_DECREF(b);
+ b = _do_decimal_exponent(ans, ctx2);
+ if (!b)
+ goto err;
+ ans->sign ^= 1;
+
+ tmp = _do_decimal_multiply(b, self, ctx2);
+ if (!tmp)
+ goto err;
+ Py_DECREF(b);
+ b = tmp;
+
+ tmp = _do_decimal_subtract(b, one, ctx2);
+ if (!tmp)
+ goto err;
+ Py_DECREF(b);
+ b = tmp;
+
+ /* ADJUSTED(ans) >= ADJUSTED(b) + ctx->prec + 1 */
+ if (!decimal_nonzero(b) || exp_ge(ADJUSTED(ans),
+ exp_add_i(ADJUSTED(b), ctx->prec + 1))) {
+ if (ans->ob_size == prec)
+ break;
+
+ if (!decimal_nonzero(ans)) {
+ if (!_do_real_decimal_compare(self, one, ctx1)) {
+ ans->exp = exp_from_i(0);
+ break;
+ }
+ else {
+ if (handle_Rounded(ctx, NULL))
+ goto err;
+ if (handle_Inexact(ctx, NULL))
+ goto err;
+
+ }
+ }
+
+ /* to make addition easier */
+ if (!decimal_nonzero(b))
+ b->exp = exp_sub_i(ans->exp, prec);
+ }
+
+ tmp = _do_decimal_add(ans, b, ctx1);
+ if (!tmp)
+ goto err;
+ Py_DECREF(ans);
+ ans = tmp;
+
+ /* we're done */
+ if (cp == prec)
+ continue;
+
+ cp *= 2;
+ if (cp > prec)
+ cp = prec;
+ ctx1->prec = cp;
+ ctx2->prec = cp + self->ob_size;
+ }
+
+ rounding = ctx->rounding;
+ ctx->rounding = ROUND_HALF_EVEN;
+ /* we add extra 1 at the end to make proper rounding */
+ if (decimal_nonzero(ans) && !ISSPECIAL(ans)) {
+ int i;
+
+
+ tmp = _NEW_decimalobj(ans->ob_size + LOG, ans->sign, exp_sub_i(ans->exp, LOG));
+ if (!tmp) {
+ ctx->rounding = rounding;
+ goto err;
+ }
+
+ for (i = 0; i < ans->limb_count ; i++){
+ tmp->limbs[i+1] = ans->limbs[i];
+ }
+ tmp->limbs[0] = 1;
+
+ Py_DECREF(ans);
+ ans = tmp;
+ }
+
+ ctx->rounding = rounding;
+ tmp = _decimal_fix(ans, ctx);
+
+ if (!tmp)
+ goto err;
+ Py_DECREF(ans);
+ ans = tmp;
+
+ Py_DECREF(b);
+ Py_DECREF(ctx1);
+ Py_DECREF(ctx2);
+ Py_DECREF(one);
+ return ans;
+err:
+ Py_DECREF(one);
+ Py_XDECREF(ans);
+ Py_XDECREF(b);
+ Py_XDECREF(ctx1);
+ Py_XDECREF(ctx2);
+
+ return NULL;
+}
+
+DECIMAL_UNARY_FUNC(_ln);
+
+static PyObject *
+_do_decimal_ln(decimalobject *self, contextobject *ctx) {
+ return _do_decimal__ln(self, ctx);
+}
+
+DECIMAL_UNARY_FUNC(ln);
+
static PyMethodDef decimal_methods[] = {
+ {"ln", (PyCFunction)decimal_ln,
+ METH_VARARGS | METH_KEYWORDS,
+ PyDoc_STR("TODO")},
+ {"_ln", (PyCFunction)decimal__ln,
+ METH_VARARGS | METH_KEYWORDS,
+ PyDoc_STR("TODO")},
{"exp", (PyCFunction)decimal_exponent,
METH_VARARGS | METH_KEYWORDS,
PyDoc_STR("TODO")},
@@ -5121,20 +5424,126 @@
int mod = modulo != Py_None;
long firstprec = ctx->prec;
int cmp;
+ int use_exp = 0; /* when we should use exp/ln method */
+
+ if (!ctx)
+ ctx = getcontext();
+ if (!ctx)
+ return NULL;
+
+// if ((ISINF(other) || exp_g_i(ADJUSTED(other), 8)) && decimal_nonzero(other) ) {
+// return handle_InvalidOperation(self->ob_type, ctx, "x ** INF", NULL);
+// }
+
+ if ((exp_g_i(ADJUSTED(other), 8)) && decimal_nonzero(other)) {
+ use_exp = 1;
+ }
+ if(0)
+ if (exp_eq_i(ADJUSTED(other), 9) && _limb_get_digit(other->limbs, other->ob_size, 0) >=2)
+ use_exp = 1;
- if (ISINF(other) || exp_g_i(ADJUSTED(other), 8)) {
- return handle_InvalidOperation(self->ob_type, ctx, "x ** INF", NULL);
- }
-
if (ISSPECIAL(self) || ISSPECIAL(other)) {
decimalobject *nan;
if (_check_nans(self, other, ctx, &nan))
return nan;
}
+ if (ISINF(other)) {
+ int cmp;
+
+ if (!decimal_nonzero(self)) {
+ if (other->sign &1) {
+ return _NEW_decimalobj(1, SIGN_POSINF, exp_from_i(0));
+ }
+ else {
+ ret = _NEW_decimalobj(1, 0, exp_from_i(0));
+ if (!ret)
+ return NULL;
+ ret->limbs[0] = 0;
+ return ret;
+ }
+ }
+
+ if (self->sign &1)
+ return handle_InvalidOperation(self->ob_type, ctx, "-x ** [+-]INF", NULL);
+
+ {
+ contextobject *ctx2;
+ decimalobject *one;
+ ctx2 = context_copy(ctx);
+ if (!ctx2)
+ return NULL;
+
+
+ PyObject *flags = context_ignore_all_flags(ctx2);
+ if (!flags){
+ Py_DECREF(ctx2);
+ return NULL;
+ }
+ Py_DECREF(flags);
+
+ one = _NEW_decimalobj(1, 0, exp_from_i(0));
+ if (!one) {
+ Py_DECREF(ctx2);
+ return NULL;
+ }
+ one->limbs[0] = 1;
+
+ cmp = _do_real_decimal_compare(self, one, ctx2);
+ Py_DECREF(ctx2);
+ Py_DECREF(one);
+ if (PyErr_Occurred())
+ return NULL;
+ }
+
+ if (cmp == 0) {
+ int i;
+ if (handle_Inexact(ctx, NULL))
+ return NULL;
+ if (handle_Rounded(ctx, NULL))
+ return NULL;
+
+ ret = _NEW_decimalobj(ctx->prec, 0, exp_from_i(-ctx->prec+1));
+ if (!ret)
+ return NULL;
+ for (i=0;i<ret->limb_count;i++)
+ ret->limbs[i] = 0;
+
+ {
+ long mult = 1;
+ long where;
+ long limb = (ctx->prec-1) / LOG;
+ where = (ctx->prec-1) % LOG;
+ while(where --) mult *= 10;
+ ret->limbs[limb] = mult;
+ }
+
+ return ret;
+ }
+ /* cmp == 1 for self > 1
+ * cmp == 0 for self < 1*/
+ cmp += 1;
+ cmp /= 2;
+
+ /* if self > 1 self^inf = inf, self^-inf = 0
+ * if self < 1 self^inf = 0, self^-inf = inf */
+
+ /* inf */
+ if (cmp ^ (other->sign &1)) {
+ return _NEW_decimalobj(1, SIGN_POSINF, exp_from_i(0));
+ }
+ /* 0 */
+ else {
+ ret = _NEW_decimalobj(1, 0, exp_from_i(0));
+ if (!ret)
+ return NULL;
+ ret->limbs[0] = 0;
+ return ret;
+ }
+ }
+
if (!_decimal_isint(other)) {
- return handle_InvalidOperation(self->ob_type, ctx,
- "x ** (non-integer)", NULL);
+ use_exp = 1;
}
if (!decimal_nonzero(self) && !decimal_nonzero(other)) {
@@ -5147,6 +5556,7 @@
return ret;
}
+ if (!use_exp)
{
PyObject *tmp = decimal_int(other);
if (!tmp)
@@ -5160,24 +5570,141 @@
}
}
+ /* all we really need here is n%2 - for special cases */
+ else {
+ /* where = other->ob_size - other->exp - 1 */
+ exp_t where = exp_from_i(other->ob_size);
+ exp_inp_sub(&where, exp_add_i(other->exp, 1));
+
+ n = _limb_get_digit(other->limbs, other->ob_size, exp_to_i(where));
+ }
sign = (self->sign&1) && n&1;
+ if (!decimal_nonzero(self)) {
+ if (other->sign &1) {
+ ret = _NEW_decimalobj(1, 0, exp_from_i(0));
+ if (!ret)
+ return NULL;
+ ret->sign = sign ? SIGN_NEGINF : SIGN_POSINF;
+ return ret;
+ }
+ else {
+ ret = _NEW_decimalobj(1, sign, exp_from_i(0));
+ if (!ret)
+ return NULL;
+ ret->limbs[0] = 0;
+ return ret;
+ }
+ }
if (ISINF(self)) {
if (mod) {
return handle_InvalidOperation(self->ob_type, ctx,
"INF % x", NULL);
}
+ if (!_decimal_isint(other) && self->sign&1)
+ return handle_InvalidOperation(self->ob_type, ctx, "INF ** -(non-int)", NULL);
ret = _NEW_decimalobj(1, sign, exp_from_i(0));
ret->limbs[0] = 0;
- if (n > 0)
+ if (decimal_nonzero(other) && !(other->sign &1))
ret->sign = sign ? SIGN_NEGINF : SIGN_POSINF;
return ret;
}
+ /* non-integer case */
+ /* we calculate it using exp(ln(self) * other) */
+ if (use_exp) {
+ decimalobject *tmp;
+ contextobject *ctx2;
+ ctx2 = context_shallow_copy(ctx);
+ if (!ctx2)
+ return NULL;
+
+ ctx2->Emax = exp_from_i(MAX_MATH);
+ ctx2->Emin = exp_from_i(-MAX_MATH);
+ ctx2->clamp = 0;
+ ctx2->rounding = ctx->rounding;
+ ctx2->rounding_dec = ALWAYS_ROUND;
+ /* we take context precision or size of self if greater
+ * and add some constant */
+ ctx2->prec = ctx->prec > self->ob_size ? ctx->prec : self->ob_size;
+ ctx2->prec += 10;
+
+ ret = _do_decimal__ln(self, ctx2);
+ if (!ret) {
+ Py_DECREF(ctx2);
+ return NULL;
+ }
+
+ if (!decimal_nonzero(ret)) {
+ int i;
+ Py_DECREF(ret);
+ Py_DECREF(ctx2);
+ if (!_decimal_isint(other))
+ {
+ ret = _NEW_decimalobj(ctx->prec, 0, exp_from_i(-ctx->prec + 1));
+ if (!ret)
+ return NULL;
+
+ if (handle_Inexact(ctx, NULL))
+ return NULL;
+ if (handle_Rounded(ctx, NULL))
+ return NULL;
+
+ for (i = 0; i < ret -> limb_count; i++) {
+ ret->limbs[i] = 0;
+ }
+ {
+ long mult = 1;
+ long limb;
+ long where = (ctx->prec - 1)% LOG;
+ limb = (ctx->prec - 1)/LOG;
+ while(where--) mult *= 10;
+ ret->limbs[limb] = mult;
+ }
+
+ return ret;
+ }
+ else {
+ ret = _NEW_decimalobj(1, 0, exp_from_i(0));
+ if (!ret)
+ return NULL;
+ ret->limbs[0] = 1;
+ return ret;
+ }
+ }
+
+ tmp = _do_decimal_multiply(ret, other, ctx2);
+ Py_DECREF(ret);
+ if (!tmp) {
+ Py_DECREF(ctx2);
+ return NULL;
+ }
+ ret = tmp;
+
+ tmp = _do_decimal_exponent(ret, ctx2);
+ Py_DECREF(ret);
+ Py_DECREF(ctx2);
+
+ if (!tmp)
+ return NULL;
+
+ ret = tmp;
+
+ if (ctx->rounding_dec == ALWAYS_ROUND) {
+ tmp = _decimal_fix(ret, ctx);
+ Py_DECREF(ret);
+ if (!tmp)
+ return NULL;
+ ret = tmp;
+ }
+
+ return ret;
+ }
+
/* XXX temporary solution */
{
#ifdef BIG_EXP
@@ -5222,7 +5749,11 @@
t/=10;
}
}
+ /* when n < 0 we need to extend precision - then every test passes =] */
+ if (n<0)
+ ctx->prec += 1;
+ /* TODO shouldn't it be Invalid_Context ? */
if (!mod && exp_l_i(PyDecimal_DefaultContext->Emax, ctx->prec)) {
ctx->prec = firstprec;
return handle_Overflow(self->ob_type, ctx, "Too much precision", sign);
@@ -5234,6 +5765,7 @@
if(!val || !mul) {
Py_XDECREF(mul);
Py_XDECREF(val);
+ ctx->prec = firstprec;
return NULL;
}
@@ -5244,6 +5776,7 @@
if (!tmp) {
Py_DECREF(mul);
Py_DECREF(val);
+ ctx->prec = firstprec;
return NULL;
}
@@ -5615,6 +6148,7 @@
* whether the fractional part consists only of zeroes. */
i = exp_from_i(d->ob_size-1);
cmp = exp_add_i(d->exp, d->ob_size);
+ /* TODO bail out when we go out of bound */
for (; exp_ge(i, cmp); exp_dec(&i)) {
/* if (d->digits[i] > 0)*/
if(_limb_get_digit(d->limbs, d->ob_size, exp_to_i(i)) > 0)
@@ -6714,6 +7248,7 @@
CONTEXT_UNARY_FUNC(sqrt, sqrt)
CONTEXT_UNARY_FUNC(to_eng_string, to_eng_string)
CONTEXT_UNARY_FUNC(exp, exp)
+CONTEXT_UNARY_FUNC(ln, ln);
/* Unfortunately, the following methods are non-standard and can't
@@ -7238,6 +7773,8 @@
METH_O},
{"exp", (PyCFunction)context_exp,
METH_O},
+ {"ln", (PyCFunction)context_ln,
+ METH_O},
{"subtract", (PyCFunction)context_subtract,
METH_VARARGS},
{"to_eng_string", (PyCFunction)context_to_eng_string,
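The non-integer power path in the commit above follows the comment in the diff:
compute x ** y as exp(y * ln(x)) at a higher working precision, then round only
once at the end. A rough sketch of the same idea with the pure-Python decimal
module (illustrative only, not the C code verbatim; Decimal.ln() and
Decimal.exp() exist only in later Python releases):

    from decimal import Decimal, getcontext

    def power_via_exp_ln(x, y, extra=10):
        # Illustrative only: assumes x > 0; zeros, infinities, NaNs and
        # pure-integer exponents are handled by separate branches in the C code.
        ctx = getcontext()
        saved = ctx.prec
        ctx.prec += extra                 # work with guard digits ...
        try:
            result = (y * x.ln()).exp()
        finally:
            ctx.prec = saved
        return +result                    # ... then round once at the target precision

    # power_via_exp_ln(Decimal(2), Decimal('0.5'))
    #   -> Decimal('1.414213562373095048801688724')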
From python-checkins at python.org Sat Jul 15 18:03:50 2006
From: python-checkins at python.org (george.yoshida)
Date: Sat, 15 Jul 2006 18:03:50 +0200 (CEST)
Subject: [Python-checkins] r50664 - python/trunk/Doc/ext/windows.tex
Message-ID: <20060715160350.4C24F1E4004@bag.python.org>
Author: george.yoshida
Date: Sat Jul 15 18:03:49 2006
New Revision: 50664
Modified:
python/trunk/Doc/ext/windows.tex
Log:
Bug #1518702: ext/win-cookbook.html has a broken link to distutils
Modified: python/trunk/Doc/ext/windows.tex
==============================================================================
--- python/trunk/Doc/ext/windows.tex (original)
+++ python/trunk/Doc/ext/windows.tex Sat Jul 15 18:03:49 2006
@@ -28,13 +28,15 @@
\section{A Cookbook Approach \label{win-cookbook}}
There are two approaches to building extension modules on Windows,
-just as there are on \UNIX: use the \refmodule{distutils} package to
+just as there are on \UNIX: use the
+\citetitle[../lib/module-distutils.html]{distutils} package to
control the build process, or do things manually. The distutils
approach works well for most extensions; documentation on using
-\refmodule{distutils} to build and package extension modules is
-available in \citetitle[../dist/dist.html]{Distributing Python
-Modules}. This section describes the manual approach to building
-Python extensions written in C or \Cpp.
+\citetitle[../lib/module-distutils.html]{distutils} to build and
+package extension modules is available in
+\citetitle[../dist/dist.html]{Distributing Python Modules}. This
+section describes the manual approach to building Python extensions
+written in C or \Cpp.
To build extensions using these instructions, you need to have a copy
of the Python sources of the same version as your installed Python.
From python-checkins at python.org Sat Jul 15 18:12:49 2006
From: python-checkins at python.org (george.yoshida)
Date: Sat, 15 Jul 2006 18:12:49 +0200 (CEST)
Subject: [Python-checkins] r50665 -
python/branches/release24-maint/Doc/ext/windows.tex
Message-ID: <20060715161249.C2D471E4004@bag.python.org>
Author: george.yoshida
Date: Sat Jul 15 18:12:49 2006
New Revision: 50665
Modified:
python/branches/release24-maint/Doc/ext/windows.tex
Log:
Backport revision 50664
Bug #1518702: ext/win-cookbook.html has a broken link to distutils
Modified: python/branches/release24-maint/Doc/ext/windows.tex
==============================================================================
--- python/branches/release24-maint/Doc/ext/windows.tex (original)
+++ python/branches/release24-maint/Doc/ext/windows.tex Sat Jul 15 18:12:49 2006
@@ -28,13 +28,15 @@
\section{A Cookbook Approach \label{win-cookbook}}
There are two approaches to building extension modules on Windows,
-just as there are on \UNIX: use the \refmodule{distutils} package to
+just as there are on \UNIX: use the
+\citetitle[../lib/module-distutils.html]{distutils} package to
control the build process, or do things manually. The distutils
approach works well for most extensions; documentation on using
-\refmodule{distutils} to build and package extension modules is
-available in \citetitle[../dist/dist.html]{Distributing Python
-Modules}. This section describes the manual approach to building
-Python extensions written in C or \Cpp.
+\citetitle[../lib/module-distutils.html]{distutils} to build and
+package extension modules is available in
+\citetitle[../dist/dist.html]{Distributing Python Modules}. This
+section describes the manual approach to building Python extensions
+written in C or \Cpp.
To build extensions using these instructions, you need to have a copy
of the Python sources of the same version as your installed Python.
From python-checkins at python.org Sat Jul 15 18:19:24 2006
From: python-checkins at python.org (matt.fleming)
Date: Sat, 15 Jul 2006 18:19:24 +0200 (CEST)
Subject: [Python-checkins] r50666 - sandbox/trunk/pdb/mpdb.py
Message-ID: <20060715161924.DA1E91E4004@bag.python.org>
Author: matt.fleming
Date: Sat Jul 15 18:19:24 2006
New Revision: 50666
Modified:
sandbox/trunk/pdb/mpdb.py
Log:
Begin using 'queues' for communication.
Modified: sandbox/trunk/pdb/mpdb.py
==============================================================================
--- sandbox/trunk/pdb/mpdb.py (original)
+++ sandbox/trunk/pdb/mpdb.py Sat Jul 15 18:19:24 2006
@@ -19,8 +19,10 @@
from optparse import OptionParser
import pydb
from pydb.gdb import Restart
+from Queue import Queue
import sys
import time
+import thread
import traceback
__all__ = ["MPdb", "pdbserver", "target", "thread_debugging"]
@@ -40,19 +42,15 @@
- debugging applications on remote machines
- debugging threaded applications
"""
- def __init__(self, completekey='tab', stdin=None, stdout=None):
+ def __init__(self, completekey='tab'):
""" Instantiate a debugger.
The optional argument 'completekey' is the readline name of a
completion key; it defaults to the Tab key. If completekey is
not None and the readline module is available, command completion
- is done automatically. The optional arguments stdin and stdout
- specify alternate input and output file objects; if not specified,
- sys.stdin and sys.stdout are used.
+ is done automatically.
"""
- pydb.Pdb.__init__(self, completekey, stdin, stdout)
- self.orig_stdout = self.stdout
- self.orig_stdin = self.stdin
+ pydb.Pdb.__init__(self, completekey)
self.prompt = '(MPdb)'
self.target = 'local' # local connections by default
self.lastcmd = ''
@@ -60,26 +58,42 @@
self.debugger_name = 'mpdb'
self._info_cmds.append('target')
- def _rebind_input(self, new_input):
- """ This method rebinds the debugger's input to the object specified
- by 'new_input'.
+ thread.start_new_thread(self.input, ())
+ self.inputqueue = Queue()
+ self.outputqueue = Queue()
+ thread.start_new_thread(self.output, ())
+
+ def cmdloop(self, intro=None):
+ """ Override the cmdloop from cmd.Cmd so that we only use this
+ object's cmdqueue variable for reading commands.
"""
- self.use_rawinput = False
- self.stdin = new_input
+ self.preloop()
- def _rebind_output(self, new_output):
- """ This method rebinds the debugger's output to the object specified
- by 'new_output'.
+ if self.intro is not None:
+ self.intro = intro
+ if self.intro:
+ self.outputqueue.put(str(self.intro) + "\n")
+ stop = None
+ while not stop:
+ try:
+ line = self.cmdqueue.pop()
+ except IndexError:
+ time.sleep(0.1)
+ continue
+
+ line = self.precmd(line)
+ stop = self.onecmd(line)
+ stop = self.postcmd(stop, line)
+ self.postloop()
+
+ def msg_nocr(self, msg):
+ """ Override this method from pydb so that instead of writing
+ to a file object, we place the output on to an output queue. Output
+ is picked up by a thread.
"""
- self.stdout.flush()
- self.stdout = new_output
- if not hasattr(self.stdout, 'flush'):
- # Add a dummy flush method because cmdloop() in cmd.py
- # uses this code:
- # self.stdout.write(self.prompt)
- # self.stdout.flush()
- # line = self.readline()
- self.stdout.flush = lambda: None
+ # XXX This should probably be removed
+ if msg[-1] == '\n': msg = msg[:-1]
+ self.outputqueue.put(msg)
def remote_onecmd(self, line):
""" All commands in 'line' are sent across this object's connection
From python-checkins at python.org Sat Jul 15 18:53:17 2006
From: python-checkins at python.org (bob.ippolito)
Date: Sat, 15 Jul 2006 18:53:17 +0200 (CEST)
Subject: [Python-checkins] r50667 - in python/trunk: Lib/binhex.py Misc/NEWS
Message-ID: <20060715165317.0149E1E4004@bag.python.org>
Author: bob.ippolito
Date: Sat Jul 15 18:53:15 2006
New Revision: 50667
Modified:
python/trunk/Lib/binhex.py
python/trunk/Misc/NEWS
Log:
Patch #1220874: Update the binhex module for Mach-O.
Modified: python/trunk/Lib/binhex.py
==============================================================================
--- python/trunk/Lib/binhex.py (original)
+++ python/trunk/Lib/binhex.py Sat Jul 15 18:53:15 2006
@@ -44,22 +44,14 @@
#
# Workarounds for non-mac machines.
-if os.name == 'mac':
- import macfs
- import MacOS
- try:
- openrf = MacOS.openrf
- except AttributeError:
- # Backward compatibility
- openrf = open
-
- def FInfo():
- return macfs.FInfo()
+try:
+ from Carbon.File import FSSpec, FInfo
+ from MacOS import openrf
def getfileinfo(name):
- finfo = macfs.FSSpec(name).GetFInfo()
+ finfo = FSSpec(name).FSpGetFInfo()
dir, file = os.path.split(name)
- # XXXX Get resource/data sizes
+ # XXX Get resource/data sizes
fp = open(name, 'rb')
fp.seek(0, 2)
dlen = fp.tell()
@@ -75,7 +67,7 @@
mode = '*' + mode[0]
return openrf(name, mode)
-else:
+except ImportError:
#
# Glue code for non-macintosh usage
#
@@ -183,7 +175,7 @@
ofname = ofp
ofp = open(ofname, 'w')
if os.name == 'mac':
- fss = macfs.FSSpec(ofname)
+ fss = FSSpec(ofname)
fss.SetCreatorType('BnHq', 'TEXT')
ofp.write('(This file must be converted with BinHex 4.0)\n\n:')
hqxer = _Hqxcoderengine(ofp)
@@ -486,7 +478,7 @@
if not out:
out = ifp.FName
if os.name == 'mac':
- ofss = macfs.FSSpec(out)
+ ofss = FSSpec(out)
out = ofss.as_pathname()
ofp = open(out, 'wb')
@@ -519,6 +511,7 @@
def _test():
if os.name == 'mac':
+ import macfs
fss, ok = macfs.PromptGetFile('File to convert:')
if not ok:
sys.exit(0)
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Sat Jul 15 18:53:15 2006
@@ -22,6 +22,8 @@
Library
-------
+- Patch #1220874: Update the binhex module for Mach-O.
+
Extension Modules
-----------------
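The binhex change above swaps an os.name test for a feature test: attempt the
Mac-only imports and fall back when they are unavailable. The shape of that
pattern, reduced to a sketch (the fallback body is a made-up placeholder, not
the module's real glue code):

    try:
        from MacOS import openrf                    # importable only on Mac builds

        def open_resource_fork(name, mode='r'):
            return openrf(name, '*' + mode[0])      # resource-fork open, as in the diff
    except ImportError:
        def open_resource_fork(name, mode='r'):     # hypothetical portable stand-in
            return open(name, mode)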
From buildbot at python.org Sat Jul 15 19:30:40 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sat, 15 Jul 2006 17:30:40 +0000
Subject: [Python-checkins] buildbot warnings in x86 XP trunk
Message-ID: <20060715173040.9A3421E401B@bag.python.org>
The Buildbot has detected a new failure of x86 XP trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520XP%2520trunk/builds/1180
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: bob.ippolito,fredrik.lundh,george.yoshida,thomas.heller
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Sat Jul 15 23:53:15 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Sat, 15 Jul 2006 23:53:15 +0200 (CEST)
Subject: [Python-checkins] r50669 -
sandbox/trunk/seealso/convert-python-faqs.py
Message-ID: <20060715215315.D3D4A1E4004@bag.python.org>
Author: andrew.kuchling
Date: Sat Jul 15 23:53:15 2006
New Revision: 50669
Modified:
sandbox/trunk/seealso/convert-python-faqs.py
Log:
Create various random files that Pyramid requires in order to run
Modified: sandbox/trunk/seealso/convert-python-faqs.py
==============================================================================
--- sandbox/trunk/seealso/convert-python-faqs.py (original)
+++ sandbox/trunk/seealso/convert-python-faqs.py Sat Jul 15 23:53:15 2006
@@ -57,8 +57,10 @@
f.write("""--- !fragment
template: index.html
# The data to pass to the template
+global:
+ metadata:
local:
- title: %s
+ title: "%s"
content: !fragment content.yml
""" % title)
f.close()
@@ -75,6 +77,18 @@
text: !htmlfile question.html""")
f.close()
+ f = open(os.path.join(qdir, 'nav.yml'), 'w')
+ f.write("""--- !fragment
+# Type of template to use
+template: nav.html
+
+# Contents of the template
+global:
+ nav : !sectionnav |
+ xxx index.html
+""")
+ f.close()
+
def convert_index (filename):
root = htmlload.load(filename)
@@ -105,7 +119,7 @@
template: index.html
# The data to pass to the template
local:
- title: %s
+ title: "%s"
content: !fragment content.yml
""" % title)
f.close()
@@ -123,8 +137,50 @@
f.close()
def write_master_index (index_files):
- pass
+ f = open(os.path.join(DESTDIR, 'index.yml'), 'w')
+ f.write("""--- !fragment
+template: index.html
+# The data to pass to the template
+local:
+ title: FAQs
+ content: !fragment content.yml
+""")
+ f.close()
+
+ f = open(os.path.join(DESTDIR, 'content.yml'), 'w')
+ f.write("""--- !fragment
+# Type of template to use
+template: content.html
+# The data to pass to the template
+local:
+ content:
+ text: !htmlfile listing.html
+ breadcrumb: !breadcrumb nav.yml nav
+""")
+ f.close()
+
+ f = open(os.path.join(DESTDIR, 'listing.html'), 'w')
+ f.write('<p>XXX put content here later</p>')
+ f.close()
+
+ f = open(os.path.join(DESTDIR, 'nav.yml'), 'w')
+ f.write("""--- !fragment
+# Type of template to use
+template: nav.html
+
+# Contents of the template
+global:
+ nav : !sectionnav |
+ xxx index.html
+""")
+ f.close()
+
+ f = open(os.path.join(DESTDIR, 'content.html'), 'w')
+ f.write("""
+
+""")
+ f.close()
if os.path.exists(DESTDIR):
shutil.rmtree(DESTDIR)
From python-checkins at python.org Sun Jul 16 00:05:18 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Sun, 16 Jul 2006 00:05:18 +0200 (CEST)
Subject: [Python-checkins] r50670 -
sandbox/trunk/seealso/convert-python-faqs.py
Message-ID: <20060715220518.1160D1E4004@bag.python.org>
Author: andrew.kuchling
Date: Sun Jul 16 00:05:17 2006
New Revision: 50670
Modified:
sandbox/trunk/seealso/convert-python-faqs.py
Log:
Write .ht files instead of .html files, because I think Pyramid wants a valid XHTML file. Still doesn't work, though.
Modified: sandbox/trunk/seealso/convert-python-faqs.py
==============================================================================
--- sandbox/trunk/seealso/convert-python-faqs.py (original)
+++ sandbox/trunk/seealso/convert-python-faqs.py Sun Jul 16 00:05:17 2006
@@ -3,6 +3,8 @@
# subdirectory) and write out a pyramid-formatting tree to pyramid-faq/
#
# Warning: erases the existing pyramid-faq/ directory on starting!
+#
+# $Id$
import os, shutil, glob
import htmlload
@@ -42,7 +44,9 @@
os.makedirs(qdir)
# Write body of question
- f = open(os.path.join(qdir, 'question.html'), 'w')
+ f = open(os.path.join(qdir, 'question.ht'), 'w')
+ f.write('Title: XXX\n')
+ f.write('\n')
body = root.find('body')
for child in body.getchildren():
s = ET.tostring(child, 'utf-8')
@@ -74,7 +78,7 @@
local:
content:
breadcrumb: !breadcrumb nav.yml nav
- text: !htmlfile question.html""")
+ text: !htfile question.ht""")
f.close()
f = open(os.path.join(qdir, 'nav.yml'), 'w')
@@ -106,7 +110,7 @@
idir = os.path.join(DESTDIR, category)
# Write body of question
- f = open(os.path.join(idir, 'listing.html'), 'w')
+ f = open(os.path.join(idir, 'listing.ht'), 'w')
f.write('<ul>\n')
for a in entries:
f.write('<li>%s</li>\n' % ET.tostring(a, 'utf-8'))
@@ -133,7 +137,7 @@
local:
content:
breadcrumb: !breadcrumb nav.yml nav
- text: !htmlfile listing.html""")
+ text: !htfile listing.ht""")
f.close()
def write_master_index (index_files):
@@ -155,13 +159,15 @@
# The data to pass to the template
local:
content:
- text: !htmlfile listing.html
+ text: !htfile listing.ht
breadcrumb: !breadcrumb nav.yml nav
""")
f.close()
- f = open(os.path.join(DESTDIR, 'listing.html'), 'w')
- f.write('
\n')
f.close()
f = open(os.path.join(DESTDIR, 'nav.yml'), 'w')
From python-checkins at python.org Sun Jul 16 03:21:21 2006
From: python-checkins at python.org (fred.drake)
Date: Sun, 16 Jul 2006 03:21:21 +0200 (CEST)
Subject: [Python-checkins] r50671 - python/trunk/Doc/ext/windows.tex
Message-ID: <20060716012121.24EC11E4004@bag.python.org>
Author: fred.drake
Date: Sun Jul 16 03:21:20 2006
New Revision: 50671
Modified:
python/trunk/Doc/ext/windows.tex
Log:
clean up some link markup
Modified: python/trunk/Doc/ext/windows.tex
==============================================================================
--- python/trunk/Doc/ext/windows.tex (original)
+++ python/trunk/Doc/ext/windows.tex Sun Jul 16 03:21:20 2006
@@ -29,10 +29,10 @@
There are two approaches to building extension modules on Windows,
just as there are on \UNIX: use the
-\citetitle[../lib/module-distutils.html]{distutils} package to
+\ulink{\module{distutils}}{../lib/module-distutils.html} package to
control the build process, or do things manually. The distutils
approach works well for most extensions; documentation on using
-\citetitle[../lib/module-distutils.html]{distutils} to build and
+\ulink{\module{distutils}}{../lib/module-distutils.html} to build and
package extension modules is available in
\citetitle[../dist/dist.html]{Distributing Python Modules}. This
section describes the manual approach to building Python extensions
From python-checkins at python.org Sun Jul 16 03:21:47 2006
From: python-checkins at python.org (fred.drake)
Date: Sun, 16 Jul 2006 03:21:47 +0200 (CEST)
Subject: [Python-checkins] r50672 -
python/branches/release24-maint/Doc/ext/windows.tex
Message-ID: <20060716012147.BBEB01E4004@bag.python.org>
Author: fred.drake
Date: Sun Jul 16 03:21:47 2006
New Revision: 50672
Modified:
python/branches/release24-maint/Doc/ext/windows.tex
Log:
clean up some link markup
Modified: python/branches/release24-maint/Doc/ext/windows.tex
==============================================================================
--- python/branches/release24-maint/Doc/ext/windows.tex (original)
+++ python/branches/release24-maint/Doc/ext/windows.tex Sun Jul 16 03:21:47 2006
@@ -29,10 +29,10 @@
There are two approaches to building extension modules on Windows,
just as there are on \UNIX: use the
-\citetitle[../lib/module-distutils.html]{distutils} package to
+\ulink{\module{distutils}}{../lib/module-distutils.html} package to
control the build process, or do things manually. The distutils
approach works well for most extensions; documentation on using
-\citetitle[../lib/module-distutils.html]{distutils} to build and
+\ulink{\module{distutils}}{../lib/module-distutils.html} to build and
package extension modules is available in
\citetitle[../dist/dist.html]{Distributing Python Modules}. This
section describes the manual approach to building Python extensions
From python-checkins at python.org Sun Jul 16 03:50:40 2006
From: python-checkins at python.org (neal.norwitz)
Date: Sun, 16 Jul 2006 03:50:40 +0200 (CEST)
Subject: [Python-checkins] r50673 - in python/trunk: Lib/test/test_dis.py
Misc/NEWS Python/compile.c
Message-ID: <20060716015040.0D51A1E4004@bag.python.org>
Author: neal.norwitz
Date: Sun Jul 16 03:50:38 2006
New Revision: 50673
Modified:
python/trunk/Lib/test/test_dis.py
python/trunk/Misc/NEWS
python/trunk/Python/compile.c
Log:
Bug #1512814, Fix incorrect lineno's when code within a function
had more than 255 blank lines. Byte codes need to go first, line #s second.
Modified: python/trunk/Lib/test/test_dis.py
==============================================================================
--- python/trunk/Lib/test/test_dis.py (original)
+++ python/trunk/Lib/test/test_dis.py Sun Jul 16 03:50:38 2006
@@ -81,6 +81,13 @@
bug1333982.func_code.co_firstlineno + 2,
bug1333982.func_code.co_firstlineno + 3)
+_BIG_LINENO_FORMAT = """\
+%3d 0 LOAD_GLOBAL 0 (spam)
+ 3 POP_TOP
+ 4 LOAD_CONST 0 (None)
+ 7 RETURN_VALUE
+"""
+
class DisTests(unittest.TestCase):
def do_disassembly_test(self, func, expected):
s = StringIO.StringIO()
@@ -124,6 +131,23 @@
if __debug__:
self.do_disassembly_test(bug1333982, dis_bug1333982)
+ def test_big_linenos(self):
+ def func(count):
+ namespace = {}
+ func = "def foo():\n " + "".join(["\n "] * count + ["spam\n"])
+ exec func in namespace
+ return namespace['foo']
+
+ # Test all small ranges
+ for i in xrange(1, 300):
+ expected = _BIG_LINENO_FORMAT % (i + 2)
+ self.do_disassembly_test(func(i), expected)
+
+ # Test some larger ranges too
+ for i in xrange(300, 5000, 10):
+ expected = _BIG_LINENO_FORMAT % (i + 2)
+ self.do_disassembly_test(func(i), expected)
+
def test_main():
run_unittest(DisTests)
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Sun Jul 16 03:50:38 2006
@@ -12,6 +12,9 @@
Core and builtins
-----------------
+- Bug #1512814, Fix incorrect lineno's when code within a function
+ had more than 255 blank lines.
+
- Patch #1521179: Python now accepts the standard options ``--help`` and
``--version`` as well as ``/?`` on Windows.
Modified: python/trunk/Python/compile.c
==============================================================================
--- python/trunk/Python/compile.c (original)
+++ python/trunk/Python/compile.c Sun Jul 16 03:50:38 2006
@@ -4098,9 +4098,10 @@
In order for this to work, when the addr field increments by more than 255,
the line # increment in each pair generated must be 0 until the remaining addr
-increment is < 256. So, in the example above, com_set_lineno should not (as
-was actually done until 2.2) expand 300, 300 to 255, 255, 45, 45, but to
-255, 0, 45, 255, 0, 45.
+increment is < 256. So, in the example above, assemble_lnotab (it used
+to be called com_set_lineno) should not (as was actually done until 2.2)
+expand 300, 300 to 255, 255, 45, 45,
+ but to 255, 0, 45, 255, 0, 45.
*/
static int
@@ -4155,12 +4156,12 @@
}
lnotab = (unsigned char *)
PyString_AS_STRING(a->a_lnotab) + a->a_lnotab_off;
- *lnotab++ = 255;
*lnotab++ = d_bytecode;
+ *lnotab++ = 255;
d_bytecode = 0;
for (j = 1; j < ncodes; j++) {
- *lnotab++ = 255;
*lnotab++ = 0;
+ *lnotab++ = 255;
}
d_lineno -= ncodes * 255;
a->a_lnotab_off += ncodes * 2;
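The comment corrected above is easier to follow with concrete numbers: each
entry of co_lnotab is a (bytecode-offset delta, line delta) byte pair, and any
delta over 255 must be split across pairs whose other member is zero. A small
standalone sketch of that encoding step (not compile.c's code):

    def lnotab_pairs(d_bytecode, d_lineno):
        # Split one (offset delta, line delta) step into co_lnotab byte pairs.
        pairs = []
        while d_bytecode > 255:
            pairs.append((255, 0))        # advance the offset, leave the line alone
            d_bytecode -= 255
        while d_lineno > 255:
            pairs.append((d_bytecode, 255))
            d_bytecode = 0                # only the first pair carries the offset delta
            d_lineno -= 255
        pairs.append((d_bytecode, d_lineno))
        return pairs

    # The example from the comment: lnotab_pairs(300, 300) == [(255, 0), (45, 255), (0, 45)],
    # i.e. the byte sequence 255, 0, 45, 255, 0, 45 -- bytecode delta first, line delta second.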
From python-checkins at python.org Sun Jul 16 04:00:37 2006
From: python-checkins at python.org (neal.norwitz)
Date: Sun, 16 Jul 2006 04:00:37 +0200 (CEST)
Subject: [Python-checkins] r50674 - python/trunk/Objects/longobject.c
Message-ID: <20060716020037.988511E4004@bag.python.org>
Author: neal.norwitz
Date: Sun Jul 16 04:00:32 2006
New Revision: 50674
Modified:
python/trunk/Objects/longobject.c
Log:
a & b were dereffed above, so they are known to be valid pointers.
z is known to be NULL, nothing to DECREF.
Reported by Klocwork, #107.
Modified: python/trunk/Objects/longobject.c
==============================================================================
--- python/trunk/Objects/longobject.c (original)
+++ python/trunk/Objects/longobject.c Sun Jul 16 04:00:32 2006
@@ -3149,9 +3149,8 @@
: MAX(size_a, size_b);
z = _PyLong_New(size_z);
if (z == NULL) {
- Py_XDECREF(a);
- Py_XDECREF(b);
- Py_XDECREF(z);
+ Py_DECREF(a);
+ Py_DECREF(b);
return NULL;
}
From python-checkins at python.org Sun Jul 16 04:02:57 2006
From: python-checkins at python.org (neal.norwitz)
Date: Sun, 16 Jul 2006 04:02:57 +0200 (CEST)
Subject: [Python-checkins] r50675 - python/trunk/Modules/_tkinter.c
Message-ID: <20060716020257.CBA5F1E4004@bag.python.org>
Author: neal.norwitz
Date: Sun Jul 16 04:02:57 2006
New Revision: 50675
Modified:
python/trunk/Modules/_tkinter.c
Log:
self is dereffed (and passed as first arg), so it's known to be good.
func is returned from PyArg_ParseTuple and also dereffed.
Reported by Klocwork, #30 (self one at least).
Modified: python/trunk/Modules/_tkinter.c
==============================================================================
--- python/trunk/Modules/_tkinter.c (original)
+++ python/trunk/Modules/_tkinter.c Sun Jul 16 04:02:57 2006
@@ -2104,8 +2104,8 @@
data = PyMem_NEW(PythonCmd_ClientData, 1);
if (!data)
return PyErr_NoMemory();
- Py_XINCREF(self);
- Py_XINCREF(func);
+ Py_INCREF(self);
+ Py_INCREF(func);
data->self = selfptr;
data->func = func;
From python-checkins at python.org Sun Jul 16 04:05:36 2006
From: python-checkins at python.org (neal.norwitz)
Date: Sun, 16 Jul 2006 04:05:36 +0200 (CEST)
Subject: [Python-checkins] r50676 - python/trunk/Modules/_ctypes/cfield.c
Message-ID: <20060716020536.4C61E1E4004@bag.python.org>
Author: neal.norwitz
Date: Sun Jul 16 04:05:35 2006
New Revision: 50676
Modified:
python/trunk/Modules/_ctypes/cfield.c
Log:
proto was dereffed above and is known to be good. No need for X.
Reported by Klocwork, #39.
Modified: python/trunk/Modules/_ctypes/cfield.c
==============================================================================
--- python/trunk/Modules/_ctypes/cfield.c (original)
+++ python/trunk/Modules/_ctypes/cfield.c Sun Jul 16 04:05:35 2006
@@ -124,7 +124,7 @@
self->getfunc = getfunc;
self->index = index;
- Py_XINCREF(proto);
+ Py_INCREF(proto);
self->proto = proto;
switch (fieldtype) {
From python-checkins at python.org Sun Jul 16 04:15:28 2006
From: python-checkins at python.org (neal.norwitz)
Date: Sun, 16 Jul 2006 04:15:28 +0200 (CEST)
Subject: [Python-checkins] r50677 - python/trunk/Modules/readline.c
Message-ID: <20060716021528.510FE1E4004@bag.python.org>
Author: neal.norwitz
Date: Sun Jul 16 04:15:27 2006
New Revision: 50677
Modified:
python/trunk/Modules/readline.c
Log:
Fix memory leaks in some conditions.
Reported by Klocwork #152.
Modified: python/trunk/Modules/readline.c
==============================================================================
--- python/trunk/Modules/readline.c (original)
+++ python/trunk/Modules/readline.c Sun Jul 16 04:15:27 2006
@@ -20,6 +20,12 @@
#include <locale.h>
#endif
+#ifdef SAVE_LOCALE
+# define RESTORE_LOCALE(sl) { setlocale(LC_CTYPE, sl); free(sl); }
+#else
+# define RESTORE_LOCALE(sl)
+#endif
+
/* GNU readline definitions */
#undef HAVE_CONFIG_H /* Else readline/chardefs.h includes strings.h */
#include <readline/readline.h>
@@ -723,10 +729,7 @@
*/
rl_initialize();
-#ifdef SAVE_LOCALE
- setlocale(LC_CTYPE, saved_locale); /* Restore locale */
- free(saved_locale);
-#endif
+ RESTORE_LOCALE(saved_locale)
}
/* Wrapper around GNU readline that handles signals differently. */
@@ -864,7 +867,8 @@
p = readline_until_enter_or_signal(prompt, &signal);
/* we got an interrupt signal */
- if(signal) {
+ if (signal) {
+ RESTORE_LOCALE(saved_locale)
return NULL;
}
@@ -873,6 +877,7 @@
p = PyMem_Malloc(1);
if (p != NULL)
*p = '\0';
+ RESTORE_LOCALE(saved_locale)
return p;
}
@@ -905,10 +910,7 @@
p[n+1] = '\0';
}
free(q);
-#ifdef SAVE_LOCALE
- setlocale(LC_CTYPE, saved_locale); /* Restore locale */
- free(saved_locale);
-#endif
+ RESTORE_LOCALE(saved_locale)
return p;
}
From python-checkins at python.org Sun Jul 16 04:17:37 2006
From: python-checkins at python.org (neal.norwitz)
Date: Sun, 16 Jul 2006 04:17:37 +0200 (CEST)
Subject: [Python-checkins] r50678 - python/trunk/Python/import.c
Message-ID: <20060716021737.662AE1E4004@bag.python.org>
Author: neal.norwitz
Date: Sun Jul 16 04:17:36 2006
New Revision: 50678
Modified:
python/trunk/Python/import.c
Log:
Fix memory leak under some conditions.
Reported by Klocwork, #98.
Modified: python/trunk/Python/import.c
==============================================================================
--- python/trunk/Python/import.c (original)
+++ python/trunk/Python/import.c Sun Jul 16 04:17:36 2006
@@ -1906,11 +1906,10 @@
if (co == NULL)
return -1;
if (!PyCode_Check(co)) {
- Py_DECREF(co);
PyErr_Format(PyExc_TypeError,
"frozen object %.200s is not a code object",
name);
- return -1;
+ goto err_return;
}
if (ispackage) {
/* Set __path__ to the package name */
@@ -1918,22 +1917,25 @@
int err;
m = PyImport_AddModule(name);
if (m == NULL)
- return -1;
+ goto err_return;
d = PyModule_GetDict(m);
s = PyString_InternFromString(name);
if (s == NULL)
- return -1;
+ goto err_return;
err = PyDict_SetItemString(d, "__path__", s);
Py_DECREF(s);
if (err != 0)
- return err;
+ goto err_return;
}
m = PyImport_ExecCodeModuleEx(name, co, "<frozen>");
- Py_DECREF(co);
if (m == NULL)
- return -1;
+ goto err_return;
+ Py_DECREF(co);
Py_DECREF(m);
return 1;
+err_return:
+ Py_DECREF(co);
+ return -1;
}
From python-checkins at python.org Sun Jul 16 04:22:32 2006
From: python-checkins at python.org (neal.norwitz)
Date: Sun, 16 Jul 2006 04:22:32 +0200 (CEST)
Subject: [Python-checkins] r50679 - python/trunk/Objects/complexobject.c
Message-ID: <20060716022232.0F7521E4004@bag.python.org>
Author: neal.norwitz
Date: Sun Jul 16 04:22:30 2006
New Revision: 50679
Modified:
python/trunk/Objects/complexobject.c
Log:
Use sizeof(buffer) instead of duplicating the constants to ensure they won't
be wrong.
The real change is to pass (bufsz - 1) to PyOS_ascii_formatd and 1
to strncat. strncat copies n+1 bytes from src (not dest).
Reported by Klocwork #58.
Modified: python/trunk/Objects/complexobject.c
==============================================================================
--- python/trunk/Objects/complexobject.c (original)
+++ python/trunk/Objects/complexobject.c Sun Jul 16 04:22:30 2006
@@ -274,16 +274,16 @@
{
char format[32];
if (v->cval.real == 0.) {
- PyOS_snprintf(format, 32, "%%.%ig", precision);
- PyOS_ascii_formatd(buf, bufsz, format, v->cval.imag);
- strncat(buf, "j", bufsz);
+ PyOS_snprintf(format, sizeof(format), "%%.%ig", precision);
+ PyOS_ascii_formatd(buf, bufsz - 1, format, v->cval.imag);
+ strncat(buf, "j", 1);
} else {
char re[64], im[64];
/* Format imaginary part with sign, real part without */
- PyOS_snprintf(format, 32, "%%.%ig", precision);
- PyOS_ascii_formatd(re, 64, format, v->cval.real);
- PyOS_snprintf(format, 32, "%%+.%ig", precision);
- PyOS_ascii_formatd(im, 64, format, v->cval.imag);
+ PyOS_snprintf(format, sizeof(format), "%%.%ig", precision);
+ PyOS_ascii_formatd(re, sizeof(re), format, v->cval.real);
+ PyOS_snprintf(format, sizeof(format), "%%+.%ig", precision);
+ PyOS_ascii_formatd(im, sizeof(im), format, v->cval.imag);
PyOS_snprintf(buf, bufsz, "(%s%sj)", re, im);
}
}
From python-checkins at python.org Sun Jul 16 04:32:04 2006
From: python-checkins at python.org (neal.norwitz)
Date: Sun, 16 Jul 2006 04:32:04 +0200 (CEST)
Subject: [Python-checkins] r50680 - python/trunk/Objects/fileobject.c
Message-ID: <20060716023204.B09381E4004@bag.python.org>
Author: neal.norwitz
Date: Sun Jul 16 04:32:03 2006
New Revision: 50680
Modified:
python/trunk/Objects/fileobject.c
Log:
Handle a NULL name properly.
Reported by Klocwork #67
Modified: python/trunk/Objects/fileobject.c
==============================================================================
--- python/trunk/Objects/fileobject.c (original)
+++ python/trunk/Objects/fileobject.c Sun Jul 16 04:32:03 2006
@@ -411,11 +411,11 @@
if (PyUnicode_Check(f->f_name)) {
#ifdef Py_USING_UNICODE
PyObject *ret = NULL;
- PyObject *name;
- name = PyUnicode_AsUnicodeEscapeString(f->f_name);
+ PyObject *name = PyUnicode_AsUnicodeEscapeString(f->f_name);
+ const char *name_str = name ? PyString_AsString(name) : "?";
ret = PyString_FromFormat("<%s file u'%s', mode '%s' at %p>",
f->f_fp == NULL ? "closed" : "open",
- PyString_AsString(name),
+ name_str,
PyString_AsString(f->f_mode),
f);
Py_XDECREF(name);
From python-checkins at python.org Sun Jul 16 04:35:47 2006
From: python-checkins at python.org (neal.norwitz)
Date: Sun, 16 Jul 2006 04:35:47 +0200 (CEST)
Subject: [Python-checkins] r50681 - python/trunk/Objects/funcobject.c
Message-ID: <20060716023547.C643A1E4004@bag.python.org>
Author: neal.norwitz
Date: Sun Jul 16 04:35:47 2006
New Revision: 50681
Modified:
python/trunk/Objects/funcobject.c
Log:
PyFunction_SetDefaults() is documented as taking None or a tuple.
A NULL would crash the PyTuple_Check(). Now make NULL return a SystemError.
Reported by Klocwork #73.
Modified: python/trunk/Objects/funcobject.c
==============================================================================
--- python/trunk/Objects/funcobject.c (original)
+++ python/trunk/Objects/funcobject.c Sun Jul 16 04:35:47 2006
@@ -109,8 +109,8 @@
}
if (defaults == Py_None)
defaults = NULL;
- else if (PyTuple_Check(defaults)) {
- Py_XINCREF(defaults);
+ else if (defaults && PyTuple_Check(defaults)) {
+ Py_INCREF(defaults);
}
else {
PyErr_SetString(PyExc_SystemError, "non-tuple default args");
From python-checkins at python.org Sun Jul 16 14:52:40 2006
From: python-checkins at python.org (andrew.macintyre)
Date: Sun, 16 Jul 2006 14:52:40 +0200 (CEST)
Subject: [Python-checkins] r50682 - python/branches/aimacintyre-sf1454481
Message-ID: <20060716125240.ED3261E4007@bag.python.org>
Author: andrew.macintyre
Date: Sun Jul 16 14:52:40 2006
New Revision: 50682
Removed:
python/branches/aimacintyre-sf1454481/
Log:
This branch was merged to the trunk
From buildbot at python.org Sun Jul 16 15:42:47 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sun, 16 Jul 2006 13:42:47 +0000
Subject: [Python-checkins] buildbot warnings in sparc Ubuntu dapper trunk
Message-ID: <20060716134247.AA4601E4008@bag.python.org>
The Buildbot has detected a new failure of sparc Ubuntu dapper trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/sparc%2520Ubuntu%2520dapper%2520trunk/builds/522
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: neal.norwitz
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Sun Jul 16 17:22:30 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sun, 16 Jul 2006 15:22:30 +0000
Subject: [Python-checkins] buildbot failure in MIPS Debian trunk
Message-ID: <20060716152230.34DA01E4007@bag.python.org>
The Buildbot has detected a new failure of MIPS Debian trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/MIPS%2520Debian%2520trunk/builds/288
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason: The web-page 'force build' button was pressed by 'doko': test
Build Source Stamp: [branch trunk] HEAD
Blamelist:
BUILD FAILED: failed configure
sincerely,
-The Buildbot
From python-checkins at python.org Mon Jul 17 02:55:46 2006
From: python-checkins at python.org (neal.norwitz)
Date: Mon, 17 Jul 2006 02:55:46 +0200 (CEST)
Subject: [Python-checkins] r50683 - python/trunk/Objects/fileobject.c
Message-ID: <20060717005546.A4E1D1E4007@bag.python.org>
Author: neal.norwitz
Date: Mon Jul 17 02:55:45 2006
New Revision: 50683
Modified:
python/trunk/Objects/fileobject.c
Log:
Stop INCREFing name, then checking if it's NULL. name (f_name) should never
be NULL so assert it. Fix one place where we could have passed NULL.
Reported by Klocwork #66.
Modified: python/trunk/Objects/fileobject.c
==============================================================================
--- python/trunk/Objects/fileobject.c (original)
+++ python/trunk/Objects/fileobject.c Mon Jul 17 02:55:45 2006
@@ -103,6 +103,7 @@
fill_file_fields(PyFileObject *f, FILE *fp, PyObject *name, char *mode,
int (*close)(FILE *))
{
+ assert(name != NULL);
assert(f != NULL);
assert(PyFile_Check(f));
assert(f->f_fp == NULL);
@@ -111,7 +112,7 @@
Py_DECREF(f->f_mode);
Py_DECREF(f->f_encoding);
- Py_INCREF (name);
+ Py_INCREF(name);
f->f_name = name;
f->f_mode = PyString_FromString(mode);
@@ -126,7 +127,7 @@
Py_INCREF(Py_None);
f->f_encoding = Py_None;
- if (f->f_name == NULL || f->f_mode == NULL)
+ if (f->f_mode == NULL)
return NULL;
f->f_fp = fp;
f = dircheck(f);
@@ -278,7 +279,9 @@
PyFileObject *f = (PyFileObject *)PyFile_Type.tp_new(&PyFile_Type,
NULL, NULL);
if (f != NULL) {
- PyObject *o_name = PyString_FromString(name);
+ PyObject *o_name = PyString_FromString(name);
+ if (o_name == NULL)
+ return NULL;
if (fill_file_fields(f, fp, o_name, mode, close) == NULL) {
Py_DECREF(f);
f = NULL;
From python-checkins at python.org Mon Jul 17 02:57:20 2006
From: python-checkins at python.org (neal.norwitz)
Date: Mon, 17 Jul 2006 02:57:20 +0200 (CEST)
Subject: [Python-checkins] r50684 - python/trunk/Objects/setobject.c
Message-ID: <20060717005720.9F6811E4002@bag.python.org>
Author: neal.norwitz
Date: Mon Jul 17 02:57:15 2006
New Revision: 50684
Modified:
python/trunk/Objects/setobject.c
Log:
otherset is known to be non-NULL based on checks before and DECREF after.
DECREF otherset rather than XDECREF in error conditions too.
Reported by Klocwork #154.
Modified: python/trunk/Objects/setobject.c
==============================================================================
--- python/trunk/Objects/setobject.c (original)
+++ python/trunk/Objects/setobject.c Mon Jul 17 02:57:15 2006
@@ -1380,12 +1380,12 @@
while (set_next(otherset, &pos, &entry)) {
int rv = set_discard_entry(so, entry);
if (rv == -1) {
- Py_XDECREF(otherset);
+ Py_DECREF(otherset);
return NULL;
}
if (rv == DISCARD_NOTFOUND) {
if (set_add_entry(so, entry) == -1) {
- Py_XDECREF(otherset);
+ Py_DECREF(otherset);
return NULL;
}
}
From python-checkins at python.org Mon Jul 17 02:59:04 2006
From: python-checkins at python.org (neal.norwitz)
Date: Mon, 17 Jul 2006 02:59:04 +0200 (CEST)
Subject: [Python-checkins] r50685 - python/trunk/Python/pythonrun.c
Message-ID: <20060717005904.B98B51E4002@bag.python.org>
Author: neal.norwitz
Date: Mon Jul 17 02:59:04 2006
New Revision: 50685
Modified:
python/trunk/Python/pythonrun.c
Log:
Reported by Klocwork #151.
v2 can be NULL if exception2 is NULL. I don't think that condition can happen,
but I'm not sure it can't either. Now the code will protect against either
being NULL.
Modified: python/trunk/Python/pythonrun.c
==============================================================================
--- python/trunk/Python/pythonrun.c (original)
+++ python/trunk/Python/pythonrun.c Mon Jul 17 02:59:04 2006
@@ -1064,6 +1064,17 @@
}
PyErr_Fetch(&exception2, &v2, &tb2);
PyErr_NormalizeException(&exception2, &v2, &tb2);
+ /* It should not be possible for exception2 or v2
+ to be NULL. However PyErr_Display() can't
+ tolerate NULLs, so just be safe. */
+ if (exception2 == NULL) {
+ exception2 = Py_None;
+ Py_INCREF(exception2);
+ }
+ if (v2 == NULL) {
+ v2 = Py_None;
+ Py_INCREF(v2);
+ }
if (Py_FlushLine())
PyErr_Clear();
fflush(stdout);
@@ -1071,8 +1082,8 @@
PyErr_Display(exception2, v2, tb2);
PySys_WriteStderr("\nOriginal exception was:\n");
PyErr_Display(exception, v, tb);
- Py_XDECREF(exception2);
- Py_XDECREF(v2);
+ Py_DECREF(exception2);
+ Py_DECREF(v2);
Py_XDECREF(tb2);
}
Py_XDECREF(result);
From python-checkins at python.org Mon Jul 17 03:00:18 2006
From: python-checkins at python.org (neal.norwitz)
Date: Mon, 17 Jul 2006 03:00:18 +0200 (CEST)
Subject: [Python-checkins] r50686 - python/trunk/Misc/NEWS
Message-ID: <20060717010018.29F151E4023@bag.python.org>
Author: neal.norwitz
Date: Mon Jul 17 03:00:16 2006
New Revision: 50686
Modified:
python/trunk/Misc/NEWS
Log:
Add NEWS entry for a bunch of fixes due to warnings produced by Klocworks static analysis tool.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 17 03:00:16 2006
@@ -12,6 +12,8 @@
Core and builtins
-----------------
+- Fix warnings reported by Klocwork's static analysis tool.
+
- Bug #1512814, Fix incorrect lineno's when code within a function
had more than 255 blank lines.
From buildbot at python.org Mon Jul 17 03:41:37 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 17 Jul 2006 01:41:37 +0000
Subject: [Python-checkins] buildbot warnings in g4 osx.4 trunk
Message-ID: <20060717014137.2560B1E4002@bag.python.org>
The Buildbot has detected a new failure of g4 osx.4 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/g4%2520osx.4%2520trunk/builds/1176
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: neal.norwitz
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Mon Jul 17 07:47:53 2006
From: python-checkins at python.org (fred.drake)
Date: Mon, 17 Jul 2006 07:47:53 +0200 (CEST)
Subject: [Python-checkins] r50687 - python/trunk/Doc/lib/markup.tex
Message-ID: <20060717054753.722FD1E4002@bag.python.org>
Author: fred.drake
Date: Mon Jul 17 07:47:52 2006
New Revision: 50687
Modified:
python/trunk/Doc/lib/markup.tex
Log:
document xmlcore (still minimal; needs mention in each of the xml.* modules)
SF bug #1504456 (partial)
Modified: python/trunk/Doc/lib/markup.tex
==============================================================================
--- python/trunk/Doc/lib/markup.tex (original)
+++ python/trunk/Doc/lib/markup.tex Mon Jul 17 07:47:52 2006
@@ -15,6 +15,17 @@
package}{http://pyxml.sourceforge.net/}; that package provides an
extended set of XML libraries for Python.
+Python 2.5 introduces the \module{xmlcore} package; this package
+provides the implementation of the \module{xml} package as distributed
+with the standard library. The \module{xml} package, as in earlier
+versions, provides an interface that will use the PyXML
+implementation of the interfaces when available, and the standard
+library implementation if not. Applications that can use either the
+PyXML implementation or the standard library's implementation may
+continue to make imports from the \module{xml} package; applications
+that want to only import the standard library's implementation can now
+use the \module{xmlcore} package.
+
The documentation for the \module{xml.dom} and \module{xml.sax}
packages are the definition of the Python bindings for the DOM and SAX
interfaces.
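In concrete terms, the distinction documented above is which package an import
goes through: xml may hand back PyXML's implementation when that package is
installed, while xmlcore always gives the copy bundled with Python 2.5. For
example:

    # May resolve to PyXML's implementation if that package is installed.
    from xml.dom import minidom

    # Always the implementation bundled with the standard library (new in 2.5).
    from xmlcore.dom import minidom as core_minidom

    doc = core_minidom.parseString('<doc/>')
    assert doc.documentElement.tagName == 'doc'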
From python-checkins at python.org Mon Jul 17 15:23:47 2006
From: python-checkins at python.org (georg.brandl)
Date: Mon, 17 Jul 2006 15:23:47 +0200 (CEST)
Subject: [Python-checkins] r50688 - in python/trunk/Lib:
idlelib/CodeContext.py test/test_bsddb.py
test/test_mimetools.py test/test_mimetypes.py
Message-ID: <20060717132347.735661E4008@bag.python.org>
Author: georg.brandl
Date: Mon Jul 17 15:23:46 2006
New Revision: 50688
Modified:
python/trunk/Lib/idlelib/CodeContext.py
python/trunk/Lib/test/test_bsddb.py
python/trunk/Lib/test/test_mimetools.py
python/trunk/Lib/test/test_mimetypes.py
Log:
Remove usage of sets module (patch #1500609).
Modified: python/trunk/Lib/idlelib/CodeContext.py
==============================================================================
--- python/trunk/Lib/idlelib/CodeContext.py (original)
+++ python/trunk/Lib/idlelib/CodeContext.py Mon Jul 17 15:23:46 2006
@@ -11,11 +11,10 @@
"""
import Tkinter
from configHandler import idleConf
-from sets import Set
import re
from sys import maxint as INFINITY
-BLOCKOPENERS = Set(["class", "def", "elif", "else", "except", "finally", "for",
+BLOCKOPENERS = set(["class", "def", "elif", "else", "except", "finally", "for",
"if", "try", "while"])
UPDATEINTERVAL = 100 # millisec
FONTUPDATEINTERVAL = 1000 # millisec
Modified: python/trunk/Lib/test/test_bsddb.py
==============================================================================
--- python/trunk/Lib/test/test_bsddb.py (original)
+++ python/trunk/Lib/test/test_bsddb.py Mon Jul 17 15:23:46 2006
@@ -8,7 +8,6 @@
import dbhash # Just so we know it's imported
import unittest
from test import test_support
-from sets import Set
class TestBSDDB(unittest.TestCase):
openflag = 'c'
@@ -53,7 +52,7 @@
self.assertEqual(self.f[k], v)
def assertSetEquals(self, seqn1, seqn2):
- self.assertEqual(Set(seqn1), Set(seqn2))
+ self.assertEqual(set(seqn1), set(seqn2))
def test_mapping_iteration_methods(self):
f = self.f
Modified: python/trunk/Lib/test/test_mimetools.py
==============================================================================
--- python/trunk/Lib/test/test_mimetools.py (original)
+++ python/trunk/Lib/test/test_mimetools.py Mon Jul 17 15:23:46 2006
@@ -1,7 +1,7 @@
import unittest
from test import test_support
-import string, StringIO, mimetools, sets
+import string, StringIO, mimetools
msgtext1 = mimetools.Message(StringIO.StringIO(
"""Content-Type: text/plain; charset=iso-8859-1; format=flowed
@@ -25,7 +25,7 @@
self.assertEqual(o.getvalue(), start)
def test_boundary(self):
- s = sets.Set([""])
+ s = set([""])
for i in xrange(100):
nb = mimetools.choose_boundary()
self.assert_(nb not in s)
Modified: python/trunk/Lib/test/test_mimetypes.py
==============================================================================
--- python/trunk/Lib/test/test_mimetypes.py (original)
+++ python/trunk/Lib/test/test_mimetypes.py Mon Jul 17 15:23:46 2006
@@ -1,7 +1,6 @@
import mimetypes
import StringIO
import unittest
-from sets import Set
from test import test_support
@@ -52,8 +51,8 @@
# First try strict. Use a set here for testing the results because if
# test_urllib2 is run before test_mimetypes, global state is modified
# such that the 'all' set will have more items in it.
- all = Set(self.db.guess_all_extensions('text/plain', strict=True))
- unless(all >= Set(['.bat', '.c', '.h', '.ksh', '.pl', '.txt']))
+ all = set(self.db.guess_all_extensions('text/plain', strict=True))
+ unless(all >= set(['.bat', '.c', '.h', '.ksh', '.pl', '.txt']))
# And now non-strict
all = self.db.guess_all_extensions('image/jpg', strict=False)
all.sort()
From python-checkins at python.org Mon Jul 17 15:26:33 2006
From: python-checkins at python.org (georg.brandl)
Date: Mon, 17 Jul 2006 15:26:33 +0200 (CEST)
Subject: [Python-checkins] r50689 - python/trunk/Misc/NEWS
Message-ID: <20060717132633.807721E401C@bag.python.org>
Author: georg.brandl
Date: Mon Jul 17 15:26:33 2006
New Revision: 50689
Modified:
python/trunk/Misc/NEWS
Log:
Add missing NEWS item (#1522771)
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 17 15:26:33 2006
@@ -1208,6 +1208,10 @@
Library
-------
+
+- Patch #1388073: Numerous __-prefixed attributes of unittest.TestCase have
+ been renamed to have only a single underscore prefix. This was done to
+ make subclassing easier.
- PEP 338: new module runpy defines a run_module function to support
executing modules which provide access to source code or a code object
From buildbot at python.org Mon Jul 17 16:38:20 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 17 Jul 2006 14:38:20 +0000
Subject: [Python-checkins] buildbot warnings in x86 Ubuntu dapper (icc) trunk
Message-ID: <20060717143820.C43D91E4008@bag.python.org>
The Buildbot has detected a new failure of x86 Ubuntu dapper (icc) trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520Ubuntu%2520dapper%2520%2528icc%2529%2520trunk/builds/757
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: fred.drake,georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Mon Jul 17 18:47:55 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Mon, 17 Jul 2006 18:47:55 +0200 (CEST)
Subject: [Python-checkins] r50690 - python/trunk/Doc/whatsnew/whatsnew25.tex
Message-ID: <20060717164755.142761E4008@bag.python.org>
Author: andrew.kuchling
Date: Mon Jul 17 18:47:54 2006
New Revision: 50690
Modified:
python/trunk/Doc/whatsnew/whatsnew25.tex
Log:
Attribute more features
Modified: python/trunk/Doc/whatsnew/whatsnew25.tex
==============================================================================
--- python/trunk/Doc/whatsnew/whatsnew25.tex (original)
+++ python/trunk/Doc/whatsnew/whatsnew25.tex Mon Jul 17 18:47:54 2006
@@ -74,7 +74,7 @@
Candidates included C's \code{cond ? true_v : false_v},
\code{if cond then true_v else false_v}, and 16 other variations.
-GvR eventually chose a surprising syntax:
+Guido van~Rossum eventually chose a surprising syntax:
\begin{verbatim}
x = true_value if condition else false_value
@@ -407,7 +407,7 @@
combined version was complicated and it wasn't clear what the
semantics of the combined should be.
-GvR spent some time working with Java, which does support the
+Guido van~Rossum spent some time working with Java, which does support the
equivalent of combining \keyword{except} blocks and a
\keyword{finally} block, and this clarified what the statement should
mean. In Python 2.5, you can now write:
@@ -600,7 +600,11 @@
\seepep{342}{Coroutines via Enhanced Generators}{PEP written by
Guido van~Rossum and Phillip J. Eby;
implemented by Phillip J. Eby. Includes examples of
-some fancier uses of generators as coroutines.}
+some fancier uses of generators as coroutines.
+
+Earlier versions of these features were proposed in
+\pep{288} by Raymond Hettinger and \pep{325} by Samuele Pedroni.
+}
\seeurl{http://en.wikipedia.org/wiki/Coroutine}{The Wikipedia entry for
coroutines.}
@@ -1152,8 +1156,8 @@
false values. \function{any()} returns \constant{True} if any value
returned by the iterator is true; otherwise it will return
\constant{False}. \function{all()} returns \constant{True} only if
-all of the values returned by the iterator evaluate as being true.
-(Suggested by GvR, and implemented by Raymond Hettinger.)
+all of the values returned by the iterator evaluate as true.
+(Suggested by Guido van~Rossum, and implemented by Raymond Hettinger.)
\item ASCII is now the default encoding for modules. It's now
a syntax error if a module contains string literals with 8-bit
@@ -1259,7 +1263,8 @@
\item The code generator's peephole optimizer now performs
simple constant folding in expressions. If you write something like
\code{a = 2+3}, the code generator will do the arithmetic and produce
-code corresponding to \code{a = 5}.
+code corresponding to \code{a = 5}. (Proposed and implemented
+by Raymond Hettinger.)
\item Function calls are now faster because code objects now keep
the most recently finished frame (a ``zombie frame'') in an internal
@@ -1353,10 +1358,13 @@
'r': ['ritrovai'], 'u': ['una'], 'v': ['vita', 'via']}
\end{verbatim}
-The \class{deque} double-ended queue type supplied by the
+(Contributed by Guido van~Rossum.)
+
+\item The \class{deque} double-ended queue type supplied by the
\module{collections} module now has a \method{remove(\var{value})}
method that removes the first occurrence of \var{value} in the queue,
raising \exception{ValueError} if the value isn't found.
+(Contributed by Raymond Hettinger.)
\item New module: The \module{contextlib} module contains helper functions for use
with the new '\keyword{with}' statement. See
@@ -2197,6 +2205,11 @@
\begin{itemize}
+\item The Python source tree was converted from CVS to Subversion,
+in a complex migration procedure that was supervised and flawlessly
+carried out by Martin von~L\"owis. The procedure was developed as
+\pep{347}.
+
\item The largest change to the C API came from \pep{353},
which modifies the interpreter to use a \ctype{Py_ssize_t} type
definition instead of \ctype{int}. See the earlier
@@ -2417,8 +2430,9 @@
The author would like to thank the following people for offering
suggestions, corrections and assistance with various drafts of this
-article: Nick Coghlan, Phillip J. Eby, Ralf W. Grosse-Kunstleve, Kent
-Johnson, Martin von~L\"owis, Fredrik Lundh, Gustavo Niemeyer, James
-Pryor, Mike Rovner, Scott Weikart, Barry Warsaw, Thomas Wouters.
+article: Nick Coghlan, Phillip J. Eby, Raymond Hettinger, Ralf
+W. Grosse-Kunstleve, Kent Johnson, Martin von~L\"owis, Fredrik Lundh,
+Gustavo Niemeyer, James Pryor, Mike Rovner, Scott Weikart, Barry
+Warsaw, Thomas Wouters.
\end{document}
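The remove() method described for \class{deque} in the hunk above can be exercised as follows; a hedged sketch, with the sample values invented for the example:

    from collections import deque

    d = deque(['a', 'b', 'c', 'b'])
    d.remove('b')                 # removes only the first occurrence
    print d                       # -> deque(['a', 'c', 'b'])

    try:
        d.remove('z')             # value not present
    except ValueError:
        print 'ValueError raised as documented'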
From python-checkins at python.org Mon Jul 17 21:57:32 2006
From: python-checkins at python.org (matt.fleming)
Date: Mon, 17 Jul 2006 21:57:32 +0200 (CEST)
Subject: [Python-checkins] r50691 - sandbox/trunk/pdb/Doc/lib/libmpdb.tex
Message-ID: <20060717195732.2BE9B1E4008@bag.python.org>
Author: matt.fleming
Date: Mon Jul 17 21:57:31 2006
New Revision: 50691
Modified:
sandbox/trunk/pdb/Doc/lib/libmpdb.tex
Log:
Minor changes to the threading documentation.
Modified: sandbox/trunk/pdb/Doc/lib/libmpdb.tex
==============================================================================
--- sandbox/trunk/pdb/Doc/lib/libmpdb.tex (original)
+++ sandbox/trunk/pdb/Doc/lib/libmpdb.tex Mon Jul 17 21:57:31 2006
@@ -1030,7 +1030,9 @@
called \class{_MainThread}. Whenever we refer to the main debugger, this is
what we are referring to. The \class{MTracer} objects (slaves)
are responsible for tracing threads that are created in the script being
-debugged.
+debugged. Any messages that need to be output to the user are placed
+on the main debugger's \code{outputqueue} queue. These messages are
+periodically written to the main debugger's \code{stdout} variable.
% How does Python facilitate debugging threads
\subsection{Python's Thread Debugging Features}
@@ -1083,12 +1085,22 @@
The main debugger is responsible for debugging the \class{_MainThread} object
and the \class{MTracer} objects are responsible for debugging all other
-threads. When a thread needs to stop exection, it acquires the main debugger's
-\member{lock} variable, which is a \class{Lock} object. The \function{acquire}
+threads. When a thread needs to stop execution, it acquires the global lock
+\member{global_lock}, which is a \class{Lock} object. The \function{acquire}
call will block until the lock is obtained. When this lock is obtained
execution transfers to the thread the \class{MTracer} is running in. This
-means that this thread is now the current thread.
+means that this thread is now the current thread. This method of halting
+the execution of the entire program is taken from how GDB handles threads.
+\emph{The GDB thread debugging facility allows you to observe all threads while your program runs--but whenever GDB takes control, one thread in particular is always the focus of debugging.} - \citetitle[http://sources.redhat.com/gdb/current/onlinedocs/gdb_5.html\#SEC27]{Debugging with GDB: Debugging programs with multiple threads}
+
+and..
+
+\emph{Whenever your program stops under GDB for any reason, all threads of execution stop, not just the current thread. This allows you to examine the overall state of the program, including switching between threads, without worrying that things may change underfoot.} - \citetitle [http://sources.redhat.com/gdb/current/onlinedocs/gdb_6.html\#SEC45]{Debugging with GDB: Stopping and starting multi-thread programs}
+
+When the main thread exits from debugging, what happens to the other
+threads is system-defined. On most systems, they are killed without
+executing \code{try/finally} clauses or running object destructors.
\section{Remote Debugging}
\label{remote-debug}
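A generic sketch of the output-queue pattern the threading paragraphs above describe; this is not the sandbox pdb code itself, and the tracer function, queue name and message text are invented for the example:

    import sys
    import Queue
    import threading

    outputqueue = Queue.Queue()

    def tracer(name):
        # a slave tracer reports through the shared queue instead of printing
        outputqueue.put('%s: hit breakpoint' % name)

    t = threading.Thread(target=tracer, args=('MTracer-1',))
    t.start()
    t.join()

    # the main debugger periodically drains the queue to its stdout
    while not outputqueue.empty():
        sys.stdout.write(outputqueue.get() + '\n')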
From python-checkins at python.org Mon Jul 17 23:59:28 2006
From: python-checkins at python.org (kurt.kaiser)
Date: Mon, 17 Jul 2006 23:59:28 +0200 (CEST)
Subject: [Python-checkins] r50692 -
python/trunk/Lib/idlelib/ColorDelegator.py
Message-ID: <20060717215928.5C3061E4008@bag.python.org>
Author: kurt.kaiser
Date: Mon Jul 17 23:59:27 2006
New Revision: 50692
Modified:
python/trunk/Lib/idlelib/ColorDelegator.py
Log:
Patch 1479219 - Tal Einat
1. 'as' was highlighted as a builtin in a comment string on an import line.
2. Comments such as "#False identity", which start with a keyword immediately
after the '#' character, were not colored as comments.
3. A u or U prefix beginning a unicode string was not correctly highlighted.
Closes bug 1325071
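A small sketch of the third fix, reusing only the single-quoted string pattern from the hunk below; the sample literals are invented for the example:

    import re

    # sqstring as updated by the patch: the prefix class now includes u/U
    sqstring = r"(\b[rRuU])?'[^'\\\n]*(\\.[^'\\\n]*)*'?"
    print bool(re.match(sqstring, "u'unicode literal'"))   # True (only r/R matched before)
    print bool(re.match(sqstring, "r'raw literal'"))       # True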
Modified: python/trunk/Lib/idlelib/ColorDelegator.py
==============================================================================
--- python/trunk/Lib/idlelib/ColorDelegator.py (original)
+++ python/trunk/Lib/idlelib/ColorDelegator.py Mon Jul 17 23:59:27 2006
@@ -8,28 +8,29 @@
DEBUG = False
-def any(name, list):
- return "(?P<%s>" % name + "|".join(list) + ")"
+def any(name, alternates):
+ "Return a named group pattern matching list of alternates."
+ return "(?P<%s>" % name + "|".join(alternates) + ")"
def make_pat():
kw = r"\b" + any("KEYWORD", keyword.kwlist) + r"\b"
builtinlist = [str(name) for name in dir(__builtin__)
if not name.startswith('_')]
# self.file = file("file") :
- # 1st 'file' colorized normal, 2nd as builtin, 3rd as comment
- builtin = r"([^.'\"\\]\b|^)" + any("BUILTIN", builtinlist) + r"\b"
+ # 1st 'file' colorized normal, 2nd as builtin, 3rd as string
+ builtin = r"([^.'\"\\#]\b|^)" + any("BUILTIN", builtinlist) + r"\b"
comment = any("COMMENT", [r"#[^\n]*"])
- sqstring = r"(\b[rR])?'[^'\\\n]*(\\.[^'\\\n]*)*'?"
- dqstring = r'(\b[rR])?"[^"\\\n]*(\\.[^"\\\n]*)*"?'
- sq3string = r"(\b[rR])?'''[^'\\]*((\\.|'(?!''))[^'\\]*)*(''')?"
- dq3string = r'(\b[rR])?"""[^"\\]*((\\.|"(?!""))[^"\\]*)*(""")?'
+ sqstring = r"(\b[rRuU])?'[^'\\\n]*(\\.[^'\\\n]*)*'?"
+ dqstring = r'(\b[rRuU])?"[^"\\\n]*(\\.[^"\\\n]*)*"?'
+ sq3string = r"(\b[rRuU])?'''[^'\\]*((\\.|'(?!''))[^'\\]*)*(''')?"
+ dq3string = r'(\b[rRuU])?"""[^"\\]*((\\.|"(?!""))[^"\\]*)*(""")?'
string = any("STRING", [sq3string, dq3string, sqstring, dqstring])
return kw + "|" + builtin + "|" + comment + "|" + string +\
"|" + any("SYNC", [r"\n"])
prog = re.compile(make_pat(), re.S)
idprog = re.compile(r"\s+(\w+)", re.S)
-asprog = re.compile(r".*?\b(as)\b", re.S)
+asprog = re.compile(r".*?\b(as)\b")
class ColorDelegator(Delegator):
@@ -208,10 +209,15 @@
head + "+%dc" % a,
head + "+%dc" % b)
elif value == "import":
- # color all the "as" words on same line;
- # cheap approximation to the truth
+ # color all the "as" words on same line, except
+ # if in a comment; cheap approximation to the
+ # truth
+ if '#' in chars:
+ endpos = chars.index('#')
+ else:
+ endpos = len(chars)
while True:
- m1 = self.asprog.match(chars, b)
+ m1 = self.asprog.match(chars, b, endpos)
if not m1:
break
a, b = m1.span(1)
From python-checkins at python.org Tue Jul 18 01:07:55 2006
From: python-checkins at python.org (barry.warsaw)
Date: Tue, 18 Jul 2006 01:07:55 +0200 (CEST)
Subject: [Python-checkins] r50693 - in python/trunk/Lib/email: __init__.py
test/test_email_renamed.py utils.py
Message-ID: <20060717230755.7271D1E4008@bag.python.org>
Author: barry.warsaw
Date: Tue Jul 18 01:07:51 2006
New Revision: 50693
Modified:
python/trunk/Lib/email/__init__.py
python/trunk/Lib/email/test/test_email_renamed.py
python/trunk/Lib/email/utils.py
Log:
decode_rfc2231(): Be more robust against buggy RFC 2231 encodings.
Specifically, instead of raising a ValueError when there is a single tick in
the parameter, simply return the entire string unquoted, with None for
both the charset and the language. Also, if there are more than 2 ticks in
the parameter, interpret the first three parts as the standard RFC 2231 parts,
then the rest of the parts as the encoded string.
Test cases added.
Original fewer-than-3-parts fix by Tokio Kikuchi.
Resolves SF bug # 1218081. I will back port the fix and tests to Python 2.4
(email 3.0) and Python 2.3 (email 2.5).
Also, bump the version number to email 4.0.1, removing the 'alpha' moniker.
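A minimal sketch of the new behaviour, based on the test cases below (Python 2.x with the email 4.0.1 package from this commit); the parameter values are invented for the example:

    from email.utils import decode_rfc2231

    # A stray tick inside the value no longer raises ValueError; the whole
    # string comes back unquoted, with None for charset and language.
    print decode_rfc2231("Frank's Document")
    # -> (None, None, "Frank's Document")

    # A well-formed RFC 2231 value is still split into charset, language
    # and value, with %-escapes unquoted.
    print decode_rfc2231("us-ascii'en-us'A%20Document")
    # -> ('us-ascii', 'en-us', 'A Document')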
Modified: python/trunk/Lib/email/__init__.py
==============================================================================
--- python/trunk/Lib/email/__init__.py (original)
+++ python/trunk/Lib/email/__init__.py Tue Jul 18 01:07:51 2006
@@ -4,7 +4,7 @@
"""A package for parsing, handling, and generating email messages."""
-__version__ = '4.0a2'
+__version__ = '4.0.1'
__all__ = [
# Old names
Modified: python/trunk/Lib/email/test/test_email_renamed.py
==============================================================================
--- python/trunk/Lib/email/test/test_email_renamed.py (original)
+++ python/trunk/Lib/email/test/test_email_renamed.py Tue Jul 18 01:07:51 2006
@@ -3060,6 +3060,40 @@
msg = email.message_from_string(m)
self.assertEqual(msg.get_filename(), 'myfile.txt')
+ def test_rfc2231_single_tick_in_filename(self):
+ eq = self.assertEqual
+ m = """\
+Content-Type: application/x-foo; name*0=\"Frank's\"; name*1=\" Document\"
+
+"""
+ msg = email.message_from_string(m)
+ charset, language, s = msg.get_param('name')
+ eq(charset, None)
+ eq(language, None)
+ eq(s, "Frank's Document")
+
+ def test_rfc2231_tick_attack(self):
+ eq = self.assertEqual
+ m = """\
+Content-Type: application/x-foo;
+\tname*0=\"us-ascii'en-us'Frank's\"; name*1=\" Document\"
+
+"""
+ msg = email.message_from_string(m)
+ charset, language, s = msg.get_param('name')
+ eq(charset, 'us-ascii')
+ eq(language, 'en-us')
+ eq(s, "Frank's Document")
+
+ def test_rfc2231_no_extended_values(self):
+ eq = self.assertEqual
+ m = """\
+Content-Type: application/x-foo; name=\"Frank's Document\"
+
+"""
+ msg = email.message_from_string(m)
+ eq(msg.get_param('name'), "Frank's Document")
+
def _testclasses():
Modified: python/trunk/Lib/email/utils.py
==============================================================================
--- python/trunk/Lib/email/utils.py (original)
+++ python/trunk/Lib/email/utils.py Tue Jul 18 01:07:51 2006
@@ -45,6 +45,7 @@
EMPTYSTRING = ''
UEMPTYSTRING = u''
CRLF = '\r\n'
+TICK = "'"
specialsre = re.compile(r'[][\\()<>@,:;".]')
escapesre = re.compile(r'[][\\()"]')
@@ -231,10 +232,14 @@
def decode_rfc2231(s):
"""Decode string according to RFC 2231"""
import urllib
- parts = s.split("'", 2)
- if len(parts) == 1:
+ parts = s.split(TICK, 2)
+ if len(parts) <= 2:
return None, None, urllib.unquote(s)
- charset, language, s = parts
+ if len(parts) > 3:
+ charset, language = parts[:2]
+ s = TICK.join(parts[2:])
+ else:
+ charset, language, s = parts
return charset, language, urllib.unquote(s)
From buildbot at python.org Tue Jul 18 02:06:05 2006
From: buildbot at python.org (buildbot at python.org)
Date: Tue, 18 Jul 2006 00:06:05 +0000
Subject: [Python-checkins] buildbot warnings in S-390 Debian trunk
Message-ID: <20060718000605.37C981E4005@bag.python.org>
The Buildbot has detected a new failure of S-390 Debian trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/S-390%2520Debian%2520trunk/builds/257
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: barry.warsaw
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Tue Jul 18 02:12:21 2006
From: python-checkins at python.org (jackilyn.hoxworth)
Date: Tue, 18 Jul 2006 02:12:21 +0200 (CEST)
Subject: [Python-checkins] r50694 -
python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py
Message-ID: <20060718001221.573A91E4005@bag.python.org>
Author: jackilyn.hoxworth
Date: Tue Jul 18 02:12:19 2006
New Revision: 50694
Modified:
python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py
Log:
Still not working.
Modified: python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py
==============================================================================
--- python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py (original)
+++ python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py Tue Jul 18 02:12:19 2006
@@ -32,9 +32,15 @@
# create socket
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
-# httplib.HTTPResponse(sock).log("message 1") # says there is no attribute for "log"
-httplib._log.info("message 1")
-# httplib.HTTPConnection(sock.connect).log("message 2")
+httplib._log.info("message 1") # first stage of testing
+
+r = httplib.HTTPResponse(sock) # second stage of testing
+r.begin() # call the begin method
+
+"""self.msg == None
+self._read_status == "message 1" == CONTINUE
+skip != True
+self.debuglevel > 0"""
print stringLog.getvalue() # For testing purposes
From python-checkins at python.org Tue Jul 18 06:03:19 2006
From: python-checkins at python.org (kurt.kaiser)
Date: Tue, 18 Jul 2006 06:03:19 +0200 (CEST)
Subject: [Python-checkins] r50695 - python/trunk/Lib/idlelib/NEWS.txt
python/trunk/Lib/idlelib/keybindingDialog.py
Message-ID: <20060718040319.BBD791E4005@bag.python.org>
Author: kurt.kaiser
Date: Tue Jul 18 06:03:16 2006
New Revision: 50695
Modified:
python/trunk/Lib/idlelib/NEWS.txt
python/trunk/Lib/idlelib/keybindingDialog.py
Log:
Rebinding Tab key was inserting 'tab' instead of 'Tab'. Bug 1179168.
Modified: python/trunk/Lib/idlelib/NEWS.txt
==============================================================================
--- python/trunk/Lib/idlelib/NEWS.txt (original)
+++ python/trunk/Lib/idlelib/NEWS.txt Tue Jul 18 06:03:16 2006
@@ -1,3 +1,14 @@
+What's New in IDLE 1.2c1?
+=========================
+
+*Release date: XX-XXX-2006*
+
+- Rebinding Tab key was inserting 'tab' instead of 'Tab'. Bug 1179168.
+
+- Colorizer now handles # correctly, also unicode strings and
+ 'as' keyword in comment directly following import command. Closes 1325071.
+ Patch 1479219 Tal Einat
+
What's New in IDLE 1.2b2?
=========================
Modified: python/trunk/Lib/idlelib/keybindingDialog.py
==============================================================================
--- python/trunk/Lib/idlelib/keybindingDialog.py (original)
+++ python/trunk/Lib/idlelib/keybindingDialog.py Tue Jul 18 06:03:16 2006
@@ -202,7 +202,7 @@
':':'colon',',':'comma','.':'period','<':'less','>':'greater',
'/':'slash','?':'question','Page Up':'Prior','Page Down':'Next',
'Left Arrow':'Left','Right Arrow':'Right','Up Arrow':'Up',
- 'Down Arrow': 'Down', 'Tab':'tab'}
+ 'Down Arrow': 'Down', 'Tab':'Tab'}
if key in translateDict.keys():
key = translateDict[key]
if 'Shift' in modifiers and key in string.ascii_lowercase:
From buildbot at python.org Tue Jul 18 06:25:56 2006
From: buildbot at python.org (buildbot at python.org)
Date: Tue, 18 Jul 2006 04:25:56 +0000
Subject: [Python-checkins] buildbot warnings in x86 W2k trunk
Message-ID: <20060718042556.BD5771E4005@bag.python.org>
The Buildbot has detected a new failure of x86 W2k trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520W2k%2520trunk/builds/1237
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: kurt.kaiser
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Tue Jul 18 06:41:38 2006
From: python-checkins at python.org (brett.cannon)
Date: Tue, 18 Jul 2006 06:41:38 +0200 (CEST)
Subject: [Python-checkins] r50696 - in python/trunk: Doc/lib/libtime.tex
Lib/test/test_time.py Misc/NEWS Modules/timemodule.c
Message-ID: <20060718044138.D412E1E4005@bag.python.org>
Author: brett.cannon
Date: Tue Jul 18 06:41:36 2006
New Revision: 50696
Modified:
python/trunk/Doc/lib/libtime.tex
python/trunk/Lib/test/test_time.py
python/trunk/Misc/NEWS
python/trunk/Modules/timemodule.c
Log:
Fix bug #1520914. Starting in 2.4, time.strftime() began to check the bounds
of values in the time tuple passed in. Unfortunately people came to rely on
undocumented behaviour of setting unneeded values to 0, regardless of whether that
was within the valid range. Now, when 0 is passed in for such a value, it is
internally forced to the minimum valid value.
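A minimal sketch of the restored behaviour, using the format string from the new test below; it assumes time.accept2dyear is enabled (the default), so year 0 maps to 2000:

    import time

    # Illegal zeros are forced to the minimum legal value
    # (month 0 -> January, day 0 -> 1).
    print time.strftime("%Y %m %d %H %M %S %w %j", (0,) * 9)
    # -> 2000 01 01 00 00 00 1 001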
Modified: python/trunk/Doc/lib/libtime.tex
==============================================================================
--- python/trunk/Doc/lib/libtime.tex (original)
+++ python/trunk/Doc/lib/libtime.tex Tue Jul 18 06:41:36 2006
@@ -226,6 +226,8 @@
\versionchanged[Allowed \var{t} to be omitted]{2.1}
\versionchanged[\exception{ValueError} raised if a field in \var{t} is
out of range]{2.4}
+\versionchanged[0 is now a legal argument for any position in the time tuple;
+if it is normally illegal the value is forced to a correct one.]{2.5}
The following directives can be embedded in the \var{format} string.
Modified: python/trunk/Lib/test/test_time.py
==============================================================================
--- python/trunk/Lib/test/test_time.py (original)
+++ python/trunk/Lib/test/test_time.py Tue Jul 18 06:41:36 2006
@@ -39,9 +39,9 @@
def test_strftime_bounds_checking(self):
# Make sure that strftime() checks the bounds of the various parts
- #of the time tuple.
+ #of the time tuple (0 is valid for *all* values).
- # Check year
+ # Check year [1900, max(int)]
self.assertRaises(ValueError, time.strftime, '',
(1899, 1, 1, 0, 0, 0, 0, 1, -1))
if time.accept2dyear:
@@ -49,27 +49,27 @@
(-1, 1, 1, 0, 0, 0, 0, 1, -1))
self.assertRaises(ValueError, time.strftime, '',
(100, 1, 1, 0, 0, 0, 0, 1, -1))
- # Check month
+ # Check month [1, 12] + zero support
self.assertRaises(ValueError, time.strftime, '',
- (1900, 0, 1, 0, 0, 0, 0, 1, -1))
+ (1900, -1, 1, 0, 0, 0, 0, 1, -1))
self.assertRaises(ValueError, time.strftime, '',
(1900, 13, 1, 0, 0, 0, 0, 1, -1))
- # Check day of month
+ # Check day of month [1, 31] + zero support
self.assertRaises(ValueError, time.strftime, '',
- (1900, 1, 0, 0, 0, 0, 0, 1, -1))
+ (1900, 1, -1, 0, 0, 0, 0, 1, -1))
self.assertRaises(ValueError, time.strftime, '',
(1900, 1, 32, 0, 0, 0, 0, 1, -1))
- # Check hour
+ # Check hour [0, 23]
self.assertRaises(ValueError, time.strftime, '',
(1900, 1, 1, -1, 0, 0, 0, 1, -1))
self.assertRaises(ValueError, time.strftime, '',
(1900, 1, 1, 24, 0, 0, 0, 1, -1))
- # Check minute
+ # Check minute [0, 59]
self.assertRaises(ValueError, time.strftime, '',
(1900, 1, 1, 0, -1, 0, 0, 1, -1))
self.assertRaises(ValueError, time.strftime, '',
(1900, 1, 1, 0, 60, 0, 0, 1, -1))
- # Check second
+ # Check second [0, 61]
self.assertRaises(ValueError, time.strftime, '',
(1900, 1, 1, 0, 0, -1, 0, 1, -1))
# C99 only requires allowing for one leap second, but Python's docs say
@@ -82,17 +82,25 @@
# modulo.
self.assertRaises(ValueError, time.strftime, '',
(1900, 1, 1, 0, 0, 0, -2, 1, -1))
- # Check day of the year
+ # Check day of the year [1, 366] + zero support
self.assertRaises(ValueError, time.strftime, '',
- (1900, 1, 1, 0, 0, 0, 0, 0, -1))
+ (1900, 1, 1, 0, 0, 0, 0, -1, -1))
self.assertRaises(ValueError, time.strftime, '',
(1900, 1, 1, 0, 0, 0, 0, 367, -1))
- # Check daylight savings flag
+ # Check daylight savings flag [-1, 1]
self.assertRaises(ValueError, time.strftime, '',
(1900, 1, 1, 0, 0, 0, 0, 1, -2))
self.assertRaises(ValueError, time.strftime, '',
(1900, 1, 1, 0, 0, 0, 0, 1, 2))
+ def test_default_values_for_zero(self):
+ # Make sure that using all zeros uses the proper default values.
+ # No test for daylight savings since strftime() does not change output
+ # based on its value.
+ expected = "2000 01 01 00 00 00 1 001"
+ result = time.strftime("%Y %m %d %H %M %S %w %j", (0,)*9)
+ self.assertEquals(expected, result)
+
def test_strptime(self):
tt = time.gmtime(self.t)
for directive in ('a', 'A', 'b', 'B', 'c', 'd', 'H', 'I',
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Tue Jul 18 06:41:36 2006
@@ -27,6 +27,12 @@
Library
-------
+- Bug #1520914: Change time.strftime() to accept a zero for any position in its
+ argument tuple. For arguments where zero is illegal, the value is forced to
+ the minimum value that is correct. This is to support an undocumented but
+ common way people used to fill in inconsequential information in the time
+ tuple pre-2.4.
+
- Patch #1220874: Update the binhex module for Mach-O.
Extension Modules
Modified: python/trunk/Modules/timemodule.c
==============================================================================
--- python/trunk/Modules/timemodule.c (original)
+++ python/trunk/Modules/timemodule.c Tue Jul 18 06:41:36 2006
@@ -406,13 +406,35 @@
indexing blindly into some array for a textual representation
by some bad index (fixes bug #897625).
- No check for year since handled in gettmarg().
+ Also support values of zero from Python code for arguments in which
+ that is out of range by forcing that value to the lowest value that
+ is valid (fixed bug #XXX).
+
+ Valid ranges based on what is allowed in struct tm:
+
+ - tm_year: [0, max(int)] (1)
+ - tm_mon: [0, 11] (2)
+ - tm_mday: [1, 31]
+ - tm_hour: [0, 23]
+ - tm_min: [0, 59]
+ - tm_sec: [0, 60]
+ - tm_wday: [0, 6] (1)
+ - tm_yday: [0, 365] (2)
+ - tm_isdst: [-max(int), max(int)]
+
+ (1) gettmarg() handles bounds-checking.
+ (2) Python's acceptable range is one greater than the range in C,
+ thus need to check against automatic decrement by gettmarg().
*/
- if (buf.tm_mon < 0 || buf.tm_mon > 11) {
+ if (buf.tm_mon == -1)
+ buf.tm_mon = 0;
+ else if (buf.tm_mon < 0 || buf.tm_mon > 11) {
PyErr_SetString(PyExc_ValueError, "month out of range");
return NULL;
}
- if (buf.tm_mday < 1 || buf.tm_mday > 31) {
+ if (buf.tm_mday == 0)
+ buf.tm_mday = 1;
+ else if (buf.tm_mday < 0 || buf.tm_mday > 31) {
PyErr_SetString(PyExc_ValueError, "day of month out of range");
return NULL;
}
@@ -434,7 +456,9 @@
PyErr_SetString(PyExc_ValueError, "day of week out of range");
return NULL;
}
- if (buf.tm_yday < 0 || buf.tm_yday > 365) {
+ if (buf.tm_yday == -1)
+ buf.tm_yday = 0;
+ else if (buf.tm_yday < 0 || buf.tm_yday > 365) {
PyErr_SetString(PyExc_ValueError, "day of year out of range");
return NULL;
}
From python-checkins at python.org Tue Jul 18 14:16:14 2006
From: python-checkins at python.org (facundo.batista)
Date: Tue, 18 Jul 2006 14:16:14 +0200 (CEST)
Subject: [Python-checkins] r50697 - python/trunk/Lib/decimal.py
Message-ID: <20060718121614.CA0601E4015@bag.python.org>
Author: facundo.batista
Date: Tue Jul 18 14:16:13 2006
New Revision: 50697
Modified:
python/trunk/Lib/decimal.py
Log:
Comments and docs cleanups, and some little fixes, provided by Santiago Peresón
Modified: python/trunk/Lib/decimal.py
==============================================================================
--- python/trunk/Lib/decimal.py (original)
+++ python/trunk/Lib/decimal.py Tue Jul 18 14:16:13 2006
@@ -29,8 +29,8 @@
Decimal floating point has finite precision with arbitrarily large bounds.
-The purpose of the module is to support arithmetic using familiar
-"schoolhouse" rules and to avoid the some of tricky representation
+The purpose of this module is to support arithmetic using familiar
+"schoolhouse" rules and to avoid some of the tricky representation
issues associated with binary floating point. The package is especially
useful for financial applications or for contexts where users have
expectations that are at odds with binary floating point (for instance,
@@ -136,7 +136,7 @@
import copy as _copy
-#Rounding
+# Rounding
ROUND_DOWN = 'ROUND_DOWN'
ROUND_HALF_UP = 'ROUND_HALF_UP'
ROUND_HALF_EVEN = 'ROUND_HALF_EVEN'
@@ -145,11 +145,11 @@
ROUND_UP = 'ROUND_UP'
ROUND_HALF_DOWN = 'ROUND_HALF_DOWN'
-#Rounding decision (not part of the public API)
+# Rounding decision (not part of the public API)
NEVER_ROUND = 'NEVER_ROUND' # Round in division (non-divmod), sqrt ONLY
ALWAYS_ROUND = 'ALWAYS_ROUND' # Every operation rounds at end.
-#Errors
+# Errors
class DecimalException(ArithmeticError):
"""Base exception class.
@@ -160,17 +160,17 @@
called if the others are present. This isn't actually used for
anything, though.
- handle -- Called when context._raise_error is called and the
- trap_enabler is set. First argument is self, second is the
- context. More arguments can be given, those being after
- the explanation in _raise_error (For example,
- context._raise_error(NewError, '(-x)!', self._sign) would
- call NewError().handle(context, self._sign).)
-
To define a new exception, it should be sufficient to have it derive
from DecimalException.
"""
def handle(self, context, *args):
+ """Called when context._raise_error is called and trap_enabler is set.
+
+ First argument is self, second is the context. More arguments can
+ be given, those being after the explanation in _raise_error (For
+ example, context._raise_error(NewError, '(-x)!', self._sign) would
+ call NewError().handle(context, self._sign).)
+ """
pass
@@ -179,12 +179,13 @@
This occurs and signals clamped if the exponent of a result has been
altered in order to fit the constraints of a specific concrete
- representation. This may occur when the exponent of a zero result would
- be outside the bounds of a representation, or when a large normal
- number would have an encoded exponent that cannot be represented. In
+ representation. This may occur when the exponent of a zero result would
+ be outside the bounds of a representation, or when a large normal
+ number would have an encoded exponent that cannot be represented. In
this latter case, the exponent is reduced to fit and the corresponding
number of zero digits are appended to the coefficient ("fold-down").
"""
+ pass
class InvalidOperation(DecimalException):
@@ -194,8 +195,8 @@
Something creates a signaling NaN
-INF + INF
- 0 * (+-)INF
- (+-)INF / (+-)INF
+ 0 * (+-)INF
+ (+-)INF / (+-)INF
x % 0
(+-)INF % x
x._rescale( non-integer )
@@ -207,20 +208,21 @@
"""
def handle(self, context, *args):
if args:
- if args[0] == 1: #sNaN, must drop 's' but keep diagnostics
+ if args[0] == 1: # sNaN, must drop 's' but keep diagnostics
return Decimal( (args[1]._sign, args[1]._int, 'n') )
return NaN
+
class ConversionSyntax(InvalidOperation):
"""Trying to convert badly formed string.
This occurs and signals invalid-operation if a string is being
converted to a number and it does not conform to the numeric string
- syntax. The result is [0,qNaN].
+ syntax. The result is [0,qNaN].
"""
-
def handle(self, context, *args):
- return (0, (0,), 'n') #Passed to something which uses a tuple.
+ return (0, (0,), 'n') # Passed to something which uses a tuple.
+
class DivisionByZero(DecimalException, ZeroDivisionError):
"""Division by 0.
@@ -234,42 +236,42 @@
or of the signs of the operands for divide, or is 1 for an odd power of
-0, for power.
"""
-
def handle(self, context, sign, double = None, *args):
if double is not None:
return (Infsign[sign],)*2
return Infsign[sign]
+
class DivisionImpossible(InvalidOperation):
"""Cannot perform the division adequately.
This occurs and signals invalid-operation if the integer result of a
divide-integer or remainder operation had too many digits (would be
- longer than precision). The result is [0,qNaN].
+ longer than precision). The result is [0,qNaN].
"""
-
def handle(self, context, *args):
return (NaN, NaN)
+
class DivisionUndefined(InvalidOperation, ZeroDivisionError):
"""Undefined result of division.
This occurs and signals invalid-operation if division by zero was
attempted (during a divide-integer, divide, or remainder operation), and
- the dividend is also zero. The result is [0,qNaN].
+ the dividend is also zero. The result is [0,qNaN].
"""
-
def handle(self, context, tup=None, *args):
if tup is not None:
- return (NaN, NaN) #for 0 %0, 0 // 0
+ return (NaN, NaN) # For 0 %0, 0 // 0
return NaN
+
class Inexact(DecimalException):
"""Had to round, losing information.
This occurs and signals inexact whenever the result of an operation is
not exact (that is, it needed to be rounded and any discarded digits
- were non-zero), or if an overflow or underflow condition occurs. The
+ were non-zero), or if an overflow or underflow condition occurs. The
result in all cases is unchanged.
The inexact signal may be tested (or trapped) to determine if a given
@@ -277,26 +279,27 @@
"""
pass
+
class InvalidContext(InvalidOperation):
"""Invalid context. Unknown rounding, for example.
This occurs and signals invalid-operation if an invalid context was
- detected during an operation. This can occur if contexts are not checked
+ detected during an operation. This can occur if contexts are not checked
on creation and either the precision exceeds the capability of the
underlying concrete representation or an unknown or unsupported rounding
- was specified. These aspects of the context need only be checked when
- the values are required to be used. The result is [0,qNaN].
+ was specified. These aspects of the context need only be checked when
+ the values are required to be used. The result is [0,qNaN].
"""
-
def handle(self, context, *args):
return NaN
+
class Rounded(DecimalException):
"""Number got rounded (not necessarily changed during rounding).
This occurs and signals rounded whenever the result of an operation is
rounded (that is, some zero or non-zero digits were discarded from the
- coefficient), or if an overflow or underflow condition occurs. The
+ coefficient), or if an overflow or underflow condition occurs. The
result in all cases is unchanged.
The rounded signal may be tested (or trapped) to determine if a given
@@ -304,18 +307,20 @@
"""
pass
+
class Subnormal(DecimalException):
"""Exponent < Emin before rounding.
This occurs and signals subnormal whenever the result of a conversion or
operation is subnormal (that is, its adjusted exponent is less than
- Emin, before any rounding). The result in all cases is unchanged.
+ Emin, before any rounding). The result in all cases is unchanged.
The subnormal signal may be tested (or trapped) to determine if a given
or operation (or sequence of operations) yielded a subnormal result.
"""
pass
+
class Overflow(Inexact, Rounded):
"""Numerical overflow.
@@ -328,16 +333,15 @@
For round-half-up and round-half-even (and for round-half-down and
round-up, if implemented), the result of the operation is [sign,inf],
- where sign is the sign of the intermediate result. For round-down, the
+ where sign is the sign of the intermediate result. For round-down, the
result is the largest finite number that can be represented in the
- current precision, with the sign of the intermediate result. For
+ current precision, with the sign of the intermediate result. For
round-ceiling, the result is the same as for round-down if the sign of
- the intermediate result is 1, or is [0,inf] otherwise. For round-floor,
+ the intermediate result is 1, or is [0,inf] otherwise. For round-floor,
the result is the same as for round-down if the sign of the intermediate
- result is 0, or is [1,inf] otherwise. In all cases, Inexact and Rounded
+ result is 0, or is [1,inf] otherwise. In all cases, Inexact and Rounded
will also be raised.
- """
-
+ """
def handle(self, context, sign, *args):
if context.rounding in (ROUND_HALF_UP, ROUND_HALF_EVEN,
ROUND_HALF_DOWN, ROUND_UP):
@@ -360,18 +364,20 @@
This occurs and signals underflow if a result is inexact and the
adjusted exponent of the result would be smaller (more negative) than
the smallest value that can be handled by the implementation (the value
- Emin). That is, the result is both inexact and subnormal.
+ Emin). That is, the result is both inexact and subnormal.
The result after an underflow will be a subnormal number rounded, if
- necessary, so that its exponent is not less than Etiny. This may result
+ necessary, so that its exponent is not less than Etiny. This may result
in 0 with the sign of the intermediate result and an exponent of Etiny.
In all cases, Inexact, Rounded, and Subnormal will also be raised.
"""
+ pass
+
# List of public traps and flags
_signals = [Clamped, DivisionByZero, Inexact, Overflow, Rounded,
- Underflow, InvalidOperation, Subnormal]
+ Underflow, InvalidOperation, Subnormal]
# Map conditions (per the spec) to signals
_condition_map = {ConversionSyntax:InvalidOperation,
@@ -379,32 +385,34 @@
DivisionUndefined:InvalidOperation,
InvalidContext:InvalidOperation}
-##### Context Functions #######################################
+##### Context Functions #####################################################
# The getcontext() and setcontext() function manage access to a thread-local
# current context. Py2.4 offers direct support for thread locals. If that
# is not available, use threading.currentThread() which is slower but will
# work for older Pythons. If threads are not part of the build, create a
-# mock threading object with threading.local() returning the module namespace.
+# mock threading object with threading.local() returning the module
+# namespace.
try:
import threading
except ImportError:
# Python was compiled without threads; create a mock object instead
import sys
- class MockThreading:
+ class MockThreading(object):
def local(self, sys=sys):
return sys.modules[__name__]
threading = MockThreading()
del sys, MockThreading
+
try:
threading.local
except AttributeError:
- #To fix reloading, force it to create a new context
- #Old contexts have different exceptions in their dicts, making problems.
+ # To fix reloading, force it to create a new context
+ # Old contexts have different exceptions in their dicts, making problems.
if hasattr(threading.currentThread(), '__decimal_context__'):
del threading.currentThread().__decimal_context__
@@ -456,10 +464,10 @@
context.clear_flags()
_local.__decimal_context__ = context
- del threading, local # Don't contaminate the namespace
+ del threading, local # Don't contaminate the namespace
-##### Decimal class ###########################################
+##### Decimal class ##########################################################
class Decimal(object):
"""Floating point class for decimal arithmetic."""
@@ -475,7 +483,7 @@
>>> Decimal('3.14') # string input
Decimal("3.14")
- >>> Decimal((0, (3, 1, 4), -2)) # tuple input (sign, digit_tuple, exponent)
+ >>> Decimal((0, (3, 1, 4), -2)) # tuple (sign, digit_tuple, exponent)
Decimal("3.14")
>>> Decimal(314) # int or long
Decimal("314")
@@ -514,12 +522,13 @@
# tuple/list conversion (possibly from as_tuple())
if isinstance(value, (list,tuple)):
if len(value) != 3:
- raise ValueError, 'Invalid arguments'
+ raise ValueError('Invalid arguments')
if value[0] not in (0,1):
- raise ValueError, 'Invalid sign'
+ raise ValueError('Invalid sign')
for digit in value[1]:
if not isinstance(digit, (int,long)) or digit < 0:
- raise ValueError, "The second value in the tuple must be composed of non negative integer elements."
+ raise ValueError("The second value in the tuple must be "+
+ "composed of non negative integer elements.")
self._sign = value[0]
self._int = tuple(value[1])
@@ -553,22 +562,23 @@
if _isnan(value):
sig, sign, diag = _isnan(value)
self._is_special = True
- if len(diag) > context.prec: #Diagnostic info too long
+ if len(diag) > context.prec: # Diagnostic info too long
self._sign, self._int, self._exp = \
context._raise_error(ConversionSyntax)
return self
if sig == 1:
self._exp = 'n' #qNaN
- else: #sig == 2
+ else: # sig == 2
self._exp = 'N' #sNaN
self._sign = sign
- self._int = tuple(map(int, diag)) #Diagnostic info
+ self._int = tuple(map(int, diag)) # Diagnostic info
return self
try:
self._sign, self._int, self._exp = _string2exact(value)
except ValueError:
self._is_special = True
- self._sign, self._int, self._exp = context._raise_error(ConversionSyntax)
+ self._sign, self._int, self._exp = \
+ context._raise_error(ConversionSyntax)
return self
raise TypeError("Cannot convert %r to Decimal" % value)
@@ -651,15 +661,15 @@
if self._is_special or other._is_special:
ans = self._check_nans(other, context)
if ans:
- return 1 # Comparison involving NaN's always reports self > other
+ return 1 # Comparison involving NaN's always reports self > other
# INF = INF
return cmp(self._isinfinity(), other._isinfinity())
if not self and not other:
- return 0 #If both 0, sign comparison isn't certain.
+ return 0 # If both 0, sign comparison isn't certain.
- #If different signs, neg one is less
+ # If different signs, neg one is less
if other._sign < self._sign:
return -1
if self._sign < other._sign:
@@ -670,7 +680,7 @@
if self_adjusted == other_adjusted and \
self._int + (0,)*(self._exp - other._exp) == \
other._int + (0,)*(other._exp - self._exp):
- return 0 #equal, except in precision. ([0]*(-x) = [])
+ return 0 # Equal, except in precision. ([0]*(-x) = [])
elif self_adjusted > other_adjusted and self._int[0] != 0:
return (-1)**self._sign
elif self_adjusted < other_adjusted and other._int[0] != 0:
@@ -681,7 +691,7 @@
context = getcontext()
context = context._shallow_copy()
- rounding = context._set_rounding(ROUND_UP) #round away from 0
+ rounding = context._set_rounding(ROUND_UP) # Round away from 0
flags = context._ignore_all_flags()
res = self.__sub__(other, context=context)
@@ -719,7 +729,7 @@
if other is NotImplemented:
return other
- #compare(NaN, NaN) = NaN
+ # Compare(NaN, NaN) = NaN
if (self._is_special or other and other._is_special):
ans = self._check_nans(other, context)
if ans:
@@ -780,11 +790,11 @@
tmp = map(str, self._int)
numdigits = len(self._int)
leftdigits = self._exp + numdigits
- if eng and not self: #self = 0eX wants 0[.0[0]]eY, not [[0]0]0eY
- if self._exp < 0 and self._exp >= -6: #short, no need for e/E
+ if eng and not self: # self = 0eX wants 0[.0[0]]eY, not [[0]0]0eY
+ if self._exp < 0 and self._exp >= -6: # short, no need for e/E
s = '-'*self._sign + '0.' + '0'*(abs(self._exp))
return s
- #exp is closest mult. of 3 >= self._exp
+ # exp is closest mult. of 3 >= self._exp
exp = ((self._exp - 1)// 3 + 1) * 3
if exp != self._exp:
s = '0.'+'0'*(exp - self._exp)
@@ -796,7 +806,7 @@
else:
s += 'e'
if exp > 0:
- s += '+' #0.0e+3, not 0.0e3
+ s += '+' # 0.0e+3, not 0.0e3
s += str(exp)
s = '-'*self._sign + s
return s
@@ -936,19 +946,19 @@
return ans
if self._isinfinity():
- #If both INF, same sign => same as both, opposite => error.
+ # If both INF, same sign => same as both, opposite => error.
if self._sign != other._sign and other._isinfinity():
return context._raise_error(InvalidOperation, '-INF + INF')
return Decimal(self)
if other._isinfinity():
- return Decimal(other) #Can't both be infinity here
+ return Decimal(other) # Can't both be infinity here
shouldround = context._rounding_decision == ALWAYS_ROUND
exp = min(self._exp, other._exp)
negativezero = 0
if context.rounding == ROUND_FLOOR and self._sign != other._sign:
- #If the answer is 0, the sign should be negative, in this case.
+ # If the answer is 0, the sign should be negative, in this case.
negativezero = 1
if not self and not other:
@@ -983,19 +993,19 @@
return Decimal((negativezero, (0,), exp))
if op1.int < op2.int:
op1, op2 = op2, op1
- #OK, now abs(op1) > abs(op2)
+ # OK, now abs(op1) > abs(op2)
if op1.sign == 1:
result.sign = 1
op1.sign, op2.sign = op2.sign, op1.sign
else:
result.sign = 0
- #So we know the sign, and op1 > 0.
+ # So we know the sign, and op1 > 0.
elif op1.sign == 1:
result.sign = 1
op1.sign, op2.sign = (0, 0)
else:
result.sign = 0
- #Now, op1 > abs(op2) > 0
+ # Now, op1 > abs(op2) > 0
if op2.sign == 0:
result.int = op1.int + op2.int
@@ -1052,8 +1062,8 @@
ans = self._check_nans(context=context)
if ans:
return ans
-
- return Decimal(self) # Must be infinite, and incrementing makes no difference
+ # Must be infinite, and incrementing makes no difference
+ return Decimal(self)
L = list(self._int)
L[-1] += 1
@@ -1109,7 +1119,7 @@
if not self or not other:
ans = Decimal((resultsign, (0,), resultexp))
if shouldround:
- #Fixing in case the exponent is out of bounds
+ # Fixing in case the exponent is out of bounds
ans = ans._fix(context)
return ans
@@ -1128,7 +1138,7 @@
op1 = _WorkRep(self)
op2 = _WorkRep(other)
- ans = Decimal( (resultsign, map(int, str(op1.int * op2.int)), resultexp))
+ ans = Decimal((resultsign, map(int, str(op1.int * op2.int)), resultexp))
if shouldround:
ans = ans._fix(context)
@@ -1221,12 +1231,11 @@
sign, 1)
return context._raise_error(DivisionByZero, 'x / 0', sign)
- #OK, so neither = 0, INF or NaN
-
+ # OK, so neither = 0, INF or NaN
shouldround = context._rounding_decision == ALWAYS_ROUND
- #If we're dividing into ints, and self < other, stop.
- #self.__abs__(0) does not round.
+ # If we're dividing into ints, and self < other, stop.
+ # self.__abs__(0) does not round.
if divmod and (self.__abs__(0, context) < other.__abs__(0, context)):
if divmod == 1 or divmod == 3:
@@ -1238,7 +1247,7 @@
ans2)
elif divmod == 2:
- #Don't round the mod part, if we don't need it.
+ # Don't round the mod part, if we don't need it.
return (Decimal( (sign, (0,), 0) ), Decimal(self))
op1 = _WorkRep(self)
@@ -1287,7 +1296,7 @@
op1.exp -= 1
if res.exp == 0 and divmod and op2.int > op1.int:
- #Solves an error in precision. Same as a previous block.
+ # Solves an error in precision. Same as a previous block.
if res.int >= prec_limit and shouldround:
return context._raise_error(DivisionImpossible)
@@ -1373,7 +1382,7 @@
# ignored in the calling function.
context = context._shallow_copy()
flags = context._ignore_flags(Rounded, Inexact)
- #keep DivisionImpossible flags
+ # Keep DivisionImpossible flags
(side, r) = self.__divmod__(other, context=context)
if r._isnan():
@@ -1396,7 +1405,7 @@
if r < comparison:
r._sign, comparison._sign = s1, s2
- #Get flags now
+ # Get flags now
self.__divmod__(other, context=context)
return r._fix(context)
r._sign, comparison._sign = s1, s2
@@ -1418,7 +1427,8 @@
if r > comparison or decrease and r == comparison:
r._sign, comparison._sign = s1, s2
context.prec += 1
- if len(side.__add__(Decimal(1), context=context)._int) >= context.prec:
+ numbsquant = len(side.__add__(Decimal(1), context=context)._int)
+ if numbsquant >= context.prec:
context.prec -= 1
return context._raise_error(DivisionImpossible)[1]
context.prec -= 1
@@ -1453,7 +1463,7 @@
context = getcontext()
return context._raise_error(InvalidContext)
elif self._isinfinity():
- raise OverflowError, "Cannot convert infinity to long"
+ raise OverflowError("Cannot convert infinity to long")
if self._exp >= 0:
s = ''.join(map(str, self._int)) + '0'*self._exp
else:
@@ -1507,13 +1517,13 @@
context._raise_error(Clamped)
return ans
ans = ans._rescale(Etiny, context=context)
- #It isn't zero, and exp < Emin => subnormal
+ # It isn't zero, and exp < Emin => subnormal
context._raise_error(Subnormal)
if context.flags[Inexact]:
context._raise_error(Underflow)
else:
if ans:
- #Only raise subnormal if non-zero.
+ # Only raise subnormal if non-zero.
context._raise_error(Subnormal)
else:
Etop = context.Etop()
@@ -1530,7 +1540,8 @@
return ans
context._raise_error(Inexact)
context._raise_error(Rounded)
- return context._raise_error(Overflow, 'above Emax', ans._sign)
+ c = context._raise_error(Overflow, 'above Emax', ans._sign)
+ return c
return ans
def _round(self, prec=None, rounding=None, context=None):
@@ -1590,18 +1601,18 @@
ans = Decimal( (temp._sign, tmp, temp._exp - expdiff))
return ans
- #OK, but maybe all the lost digits are 0.
+ # OK, but maybe all the lost digits are 0.
lostdigits = self._int[expdiff:]
if lostdigits == (0,) * len(lostdigits):
ans = Decimal( (temp._sign, temp._int[:prec], temp._exp - expdiff))
- #Rounded, but not Inexact
+ # Rounded, but not Inexact
context._raise_error(Rounded)
return ans
# Okay, let's round and lose data
this_function = getattr(temp, self._pick_rounding_function[rounding])
- #Now we've got the rounding function
+ # Now we've got the rounding function
if prec != context.prec:
context = context._shallow_copy()
@@ -1697,7 +1708,7 @@
context = getcontext()
if self._is_special or n._is_special or n.adjusted() > 8:
- #Because the spot << doesn't work with really big exponents
+ # Because the spot << doesn't work with really big exponents
if n._isinfinity() or n.adjusted() > 8:
return context._raise_error(InvalidOperation, 'x ** INF')
@@ -1727,10 +1738,9 @@
return Infsign[sign]
return Decimal( (sign, (0,), 0) )
- #with ludicrously large exponent, just raise an overflow and return inf.
- if not modulo and n > 0 and (self._exp + len(self._int) - 1) * n > context.Emax \
- and self:
-
+ # With ludicrously large exponent, just raise an overflow and return inf.
+ if not modulo and n > 0 \
+ and (self._exp + len(self._int) - 1) * n > context.Emax and self:
tmp = Decimal('inf')
tmp._sign = sign
context._raise_error(Rounded)
@@ -1749,7 +1759,7 @@
context = context._shallow_copy()
context.prec = firstprec + elength + 1
if n < 0:
- #n is a long now, not Decimal instance
+ # n is a long now, not Decimal instance
n = -n
mul = Decimal(1).__div__(mul, context=context)
@@ -1758,7 +1768,7 @@
spot <<= 1
spot >>= 1
- #Spot is the highest power of 2 less than n
+ # Spot is the highest power of 2 less than n
while spot:
val = val.__mul__(val, context=context)
if val._isinfinity():
@@ -1816,7 +1826,7 @@
if exp._isinfinity() or self._isinfinity():
if exp._isinfinity() and self._isinfinity():
- return self #if both are inf, it is OK
+ return self # If both are inf, it is OK
if context is None:
context = getcontext()
return context._raise_error(InvalidOperation,
@@ -1848,7 +1858,8 @@
if self._is_special:
if self._isinfinity():
- return context._raise_error(InvalidOperation, 'rescale with an INF')
+ return context._raise_error(InvalidOperation,
+ 'rescale with an INF')
ans = self._check_nans(context=context)
if ans:
@@ -1920,13 +1931,13 @@
return Decimal(self)
if not self:
- #exponent = self._exp / 2, using round_down.
- #if self._exp < 0:
- # exp = (self._exp+1) // 2
- #else:
+ # exponent = self._exp / 2, using round_down.
+ # if self._exp < 0:
+ # exp = (self._exp+1) // 2
+ # else:
exp = (self._exp) // 2
if self._sign == 1:
- #sqrt(-0) = -0
+ # sqrt(-0) = -0
return Decimal( (1, (0,), exp))
else:
return Decimal( (0, (0,), exp))
@@ -1960,8 +1971,7 @@
ans = ans.__add__(tmp.__mul__(Decimal((0, (8,1,9), -3)),
context=context), context=context)
ans._exp -= 1 + tmp.adjusted() // 2
-
- #ans is now a linear approximation.
+ # ans is now a linear approximation.
Emax, Emin = context.Emax, context.Emin
context.Emax, context.Emin = DefaultContext.Emax, DefaultContext.Emin
@@ -1977,12 +1987,12 @@
if context.prec == maxp:
break
- #round to the answer's precision-- the only error can be 1 ulp.
+ # Round to the answer's precision-- the only error can be 1 ulp.
context.prec = firstprec
prevexp = ans.adjusted()
ans = ans._round(context=context)
- #Now, check if the other last digits are better.
+ # Now, check if the other last digits are better.
context.prec = firstprec + 1
# In case we rounded up another digit and we should actually go lower.
if prevexp != ans.adjusted():
@@ -2014,10 +2024,10 @@
context._raise_error(Rounded)
context._raise_error(Inexact)
else:
- #Exact answer, so let's set the exponent right.
- #if self._exp < 0:
- # exp = (self._exp +1)// 2
- #else:
+ # Exact answer, so let's set the exponent right.
+ # if self._exp < 0:
+ # exp = (self._exp +1)// 2
+ # else:
exp = self._exp // 2
context.prec += ans._exp - exp
ans = ans._rescale(exp, context=context)
@@ -2052,13 +2062,13 @@
ans = self
c = self.__cmp__(other)
if c == 0:
- # if both operands are finite and equal in numerical value
+ # If both operands are finite and equal in numerical value
# then an ordering is applied:
#
- # if the signs differ then max returns the operand with the
+ # If the signs differ then max returns the operand with the
# positive sign and min returns the operand with the negative sign
#
- # if the signs are the same then the exponent is used to select
+ # If the signs are the same then the exponent is used to select
# the result.
if self._sign != other._sign:
if self._sign:
@@ -2079,7 +2089,7 @@
def min(self, other, context=None):
"""Returns the smaller value.
- like min(self, other) except if one is not a number, returns
+ Like min(self, other) except if one is not a number, returns
NaN (and signals if one is sNaN). Also rounds.
"""
other = _convert_other(other)
@@ -2087,7 +2097,7 @@
return other
if self._is_special or other._is_special:
- # if one operand is a quiet NaN and the other is number, then the
+ # If one operand is a quiet NaN and the other is number, then the
# number is always returned
sn = self._isnan()
on = other._isnan()
@@ -2101,13 +2111,13 @@
ans = self
c = self.__cmp__(other)
if c == 0:
- # if both operands are finite and equal in numerical value
+ # If both operands are finite and equal in numerical value
# then an ordering is applied:
#
- # if the signs differ then max returns the operand with the
+ # If the signs differ then max returns the operand with the
# positive sign and min returns the operand with the negative sign
#
- # if the signs are the same then the exponent is used to select
+ # If the signs are the same then the exponent is used to select
# the result.
if self._sign != other._sign:
if other._sign:
@@ -2142,37 +2152,38 @@
"""Return the adjusted exponent of self"""
try:
return self._exp + len(self._int) - 1
- #If NaN or Infinity, self._exp is string
+ # If NaN or Infinity, self._exp is string
except TypeError:
return 0
- # support for pickling, copy, and deepcopy
+ # Support for pickling, copy, and deepcopy
def __reduce__(self):
return (self.__class__, (str(self),))
def __copy__(self):
if type(self) == Decimal:
- return self # I'm immutable; therefore I am my own clone
+ return self # I'm immutable; therefore I am my own clone
return self.__class__(str(self))
def __deepcopy__(self, memo):
if type(self) == Decimal:
- return self # My components are also immutable
+ return self # My components are also immutable
return self.__class__(str(self))
-##### Context class ###########################################
-
+##### Context class ##########################################################
-# get rounding method function:
-rounding_functions = [name for name in Decimal.__dict__.keys() if name.startswith('_round_')]
+# Get rounding method function:
+rounding_functions = [name for name in Decimal.__dict__.keys()
+ if name.startswith('_round_')]
for name in rounding_functions:
- #name is like _round_half_even, goes to the global ROUND_HALF_EVEN value.
+ # Name is like _round_half_even, goes to the global ROUND_HALF_EVEN value.
globalname = name[1:].upper()
val = globals()[globalname]
Decimal._pick_rounding_function[val] = name
del name, val, globalname, rounding_functions
+
class ContextManager(object):
"""Helper class to simplify Context management.
@@ -2197,12 +2208,13 @@
def __exit__(self, t, v, tb):
setcontext(self.saved_context)
+
class Context(object):
"""Contains the context for a Decimal instance.
Contains:
prec - precision (for use in rounding, division, square roots..)
- rounding - rounding type. (how you round)
+ rounding - rounding type (how you round).
_rounding_decision - ALWAYS_ROUND, NEVER_ROUND -- do you round?
traps - If traps[exception] = 1, then the exception is
raised when it is caused. Otherwise, a value is
@@ -2243,9 +2255,13 @@
def __repr__(self):
"""Show the current context."""
s = []
- s.append('Context(prec=%(prec)d, rounding=%(rounding)s, Emin=%(Emin)d, Emax=%(Emax)d, capitals=%(capitals)d' % vars(self))
- s.append('flags=[' + ', '.join([f.__name__ for f, v in self.flags.items() if v]) + ']')
- s.append('traps=[' + ', '.join([t.__name__ for t, v in self.traps.items() if v]) + ']')
+ s.append(
+ 'Context(prec=%(prec)d, rounding=%(rounding)s, Emin=%(Emin)d, Emax=%(Emax)d, capitals=%(capitals)d'
+ % vars(self))
+ s.append('flags=[' + ', '.join([f.__name__ for f, v
+ in self.flags.items() if v]) + ']')
+ s.append('traps=[' + ', '.join([t.__name__ for t, v
+ in self.traps.items() if v]) + ']')
return ', '.join(s) + ')'
def get_manager(self):
@@ -2265,9 +2281,10 @@
def copy(self):
"""Returns a deep copy from self."""
- nc = Context(self.prec, self.rounding, self.traps.copy(), self.flags.copy(),
- self._rounding_decision, self.Emin, self.Emax,
- self.capitals, self._clamp, self._ignored_flags)
+ nc = Context(self.prec, self.rounding, self.traps.copy(),
+ self.flags.copy(), self._rounding_decision,
+ self.Emin, self.Emax, self.capitals,
+ self._clamp, self._ignored_flags)
return nc
__copy__ = copy
@@ -2281,16 +2298,16 @@
"""
error = _condition_map.get(condition, condition)
if error in self._ignored_flags:
- #Don't touch the flag
+ # Don't touch the flag
return error().handle(self, *args)
self.flags[error] += 1
if not self.traps[error]:
- #The errors define how to handle themselves.
+ # The errors define how to handle themselves.
return condition().handle(self, *args)
# Errors should only be risked on copies of the context
- #self._ignored_flags = []
+ # self._ignored_flags = []
raise error, explanation
def _ignore_all_flags(self):
@@ -2314,7 +2331,7 @@
def __hash__(self):
"""A Context cannot be hashed."""
# We inherit object.__hash__, so we must deny this explicitly
- raise TypeError, "Cannot hash a Context."
+ raise TypeError("Cannot hash a Context.")
def Etiny(self):
"""Returns Etiny (= Emin - prec + 1)"""
@@ -2340,7 +2357,6 @@
This will make it not round for that operation.
"""
-
rounding = self._rounding_decision
self._rounding_decision = type
return rounding
@@ -2369,12 +2385,12 @@
d = Decimal(num, context=self)
return d._fix(self)
- #Methods
+ # Methods
def abs(self, a):
"""Returns the absolute value of the operand.
If the operand is negative, the result is the same as using the minus
- operation on the operand. Otherwise, the result is the same as using
+ operation on the operand. Otherwise, the result is the same as using
the plus operation on the operand.
>>> ExtendedContext.abs(Decimal('2.1'))
@@ -2476,8 +2492,8 @@
If either operand is a NaN then the general rules apply.
Otherwise, the operands are compared as though by the compare
- operation. If they are numerically equal then the left-hand operand
- is chosen as the result. Otherwise the maximum (closer to positive
+ operation. If they are numerically equal then the left-hand operand
+ is chosen as the result. Otherwise the maximum (closer to positive
infinity) of the two operands is chosen as the result.
>>> ExtendedContext.max(Decimal('3'), Decimal('2'))
@@ -2496,8 +2512,8 @@
If either operand is a NaN then the general rules apply.
Otherwise, the operands are compared as though by the compare
- operation. If they are numerically equal then the left-hand operand
- is chosen as the result. Otherwise the minimum (closer to negative
+ operation. If they are numerically equal then the left-hand operand
+ is chosen as the result. Otherwise the minimum (closer to negative
infinity) of the two operands is chosen as the result.
>>> ExtendedContext.min(Decimal('3'), Decimal('2'))
@@ -2528,10 +2544,10 @@
def multiply(self, a, b):
"""multiply multiplies two operands.
- If either operand is a special value then the general rules apply.
- Otherwise, the operands are multiplied together ('long multiplication'),
- resulting in a number which may be as long as the sum of the lengths
- of the two operands.
+ If either operand is a special value then the general rules
+ apply. Otherwise, the operands are multiplied together
+ ('long multiplication'), resulting in a number which may be
+ as long as the sum of the lengths of the two operands.
>>> ExtendedContext.multiply(Decimal('1.20'), Decimal('3'))
Decimal("3.60")
@@ -2586,14 +2602,14 @@
The right-hand operand must be a whole number whose integer part (after
any exponent has been applied) has no more than 9 digits and whose
- fractional part (if any) is all zeros before any rounding. The operand
+ fractional part (if any) is all zeros before any rounding. The operand
may be positive, negative, or zero; if negative, the absolute value of
the power is used, and the left-hand operand is inverted (divided into
1) before use.
If the increased precision needed for the intermediate calculations
- exceeds the capabilities of the implementation then an Invalid operation
- condition is raised.
+ exceeds the capabilities of the implementation then an Invalid
+ operation condition is raised.
If, when raising to a negative power, an underflow occurs during the
division into 1, the operation is not halted at that point but
@@ -2631,18 +2647,18 @@
return a.__pow__(b, modulo, context=self)
def quantize(self, a, b):
- """Returns a value equal to 'a' (rounded) and having the exponent of 'b'.
+ """Returns a value equal to 'a' (rounded), having the exponent of 'b'.
The coefficient of the result is derived from that of the left-hand
- operand. It may be rounded using the current rounding setting (if the
+ operand. It may be rounded using the current rounding setting (if the
exponent is being increased), multiplied by a positive power of ten (if
the exponent is being decreased), or is unchanged (if the exponent is
already equal to that of the right-hand operand).
Unlike other operations, if the length of the coefficient after the
quantize operation would be greater than precision then an Invalid
- operation condition is raised. This guarantees that, unless there is an
- error condition, the exponent of the result of a quantize is always
+ operation condition is raised. This guarantees that, unless there is
+ an error condition, the exponent of the result of a quantize is always
equal to that of the right-hand operand.
Also unlike other operations, quantize will never raise Underflow, even
@@ -2685,9 +2701,9 @@
"""Returns the remainder from integer division.
The result is the residue of the dividend after the operation of
- calculating integer division as described for divide-integer, rounded to
- precision digits if necessary. The sign of the result, if non-zero, is
- the same as that of the original dividend.
+ calculating integer division as described for divide-integer, rounded
+ to precision digits if necessary. The sign of the result, if non-zero,
+ is the same as that of the original dividend.
This operation will fail under the same conditions as integer division
(that is, if integer division on the same two operands would fail, the
@@ -2711,7 +2727,7 @@
def remainder_near(self, a, b):
"""Returns to be "a - b * n", where n is the integer nearest the exact
value of "x / b" (if two integers are equally near then the even one
- is chosen). If the result is equal to 0 then its sign will be the
+ is chosen). If the result is equal to 0 then its sign will be the
sign of a.
This operation will fail under the same conditions as integer division
@@ -2753,7 +2769,7 @@
return a.same_quantum(b)
def sqrt(self, a):
- """Returns the square root of a non-negative number to context precision.
+ """Square root of a non-negative number to context precision.
If the result must be inexact, it is rounded using the round-half-even
algorithm.
@@ -2814,7 +2830,7 @@
as using the quantize() operation using the given operand as the
left-hand-operand, 1E+0 as the right-hand-operand, and the precision
of the operand as the precision setting, except that no flags will
- be set. The rounding mode is taken from the context.
+ be set. The rounding mode is taken from the context.
>>> ExtendedContext.to_integral(Decimal('2.1'))
Decimal("2")
@@ -2835,6 +2851,7 @@
"""
return a.to_integral(context=self)
+
class _WorkRep(object):
__slots__ = ('sign','int','exp')
# sign: 0 or 1
@@ -2889,9 +2906,9 @@
other_len = len(str(other.int))
if numdigits > (other_len + prec + 1 - tmp_len):
# If the difference in adjusted exps is > prec+1, we know
- # other is insignificant, so might as well put a 1 after the precision.
- # (since this is only for addition.) Also stops use of massive longs.
-
+ # other is insignificant, so might as well put a 1 after the
+ # precision (since this is only for addition). Also stops
+ # use of massive longs.
extend = prec + 2 - tmp_len
if extend <= 0:
extend = 1
@@ -2913,13 +2930,13 @@
Used on _WorkRep instances during division.
"""
adjust = 0
- #If op1 is smaller, make it larger
+ # If op1 is smaller, make it larger
while op2.int > op1.int:
op1.int *= 10
op1.exp -= 1
adjust += 1
- #If op2 is too small, make it larger
+ # If op2 is too small, make it larger
while op1.int >= (10 * op2.int):
op2.int *= 10
op2.exp -= 1
@@ -2927,7 +2944,8 @@
return op1, op2, adjust
-##### Helper Functions ########################################
+
+##### Helper Functions #######################################################
def _convert_other(other):
"""Convert other to Decimal.
@@ -2968,16 +2986,16 @@
if not num:
return 0
- #get the sign, get rid of trailing [+-]
+ # get the sign, get rid of trailing [+-]
sign = 0
if num[0] == '+':
num = num[1:]
- elif num[0] == '-': #elif avoids '+-nan'
+ elif num[0] == '-': # elif avoids '+-nan'
num = num[1:]
sign = 1
if num.startswith('nan'):
- if len(num) > 3 and not num[3:].isdigit(): #diagnostic info
+ if len(num) > 3 and not num[3:].isdigit(): # diagnostic info
return 0
return (1, sign, num[3:].lstrip('0'))
if num.startswith('snan'):
@@ -2987,7 +3005,7 @@
return 0
-##### Setup Specific Contexts ################################
+##### Setup Specific Contexts ################################################
# The default context prototype used by Context()
# Is mutable, so that new contexts can have different default values
@@ -3020,19 +3038,19 @@
)
-##### Useful Constants (internal use only) ####################
+##### Useful Constants (internal use only) ###################################
-#Reusable defaults
+# Reusable defaults
Inf = Decimal('Inf')
negInf = Decimal('-Inf')
-#Infsign[sign] is infinity w/ that sign
+# Infsign[sign] is infinity w/ that sign
Infsign = (Inf, negInf)
NaN = Decimal('NaN')
-##### crud for parsing strings #################################
+##### crud for parsing strings ################################################
import re
# There's an optional sign at the start, and an optional exponent
@@ -3052,13 +3070,16 @@
([eE](?P<exp>[-+]? \d+))?
# \s*
$
-""", re.VERBOSE).match #Uncomment the \s* to allow leading or trailing spaces.
+""", re.VERBOSE).match # Uncomment the \s* to allow leading/trailing spaces
del re
-# return sign, n, p s.t. float string value == -1**sign * n * 10**p exactly
def _string2exact(s):
+ """Return sign, n, p s.t.
+
+ Float string value == -1**sign * n * 10**p exactly
+ """
m = _parser(s)
if m is None:
raise ValueError("invalid literal for Decimal: %r" % s)
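As a minimal illustrative sketch (not part of the patch above; the concrete
string and values are made up), the (sign, n, p) contract that the new
_string2exact docstring describes works out like this:

    from decimal import Decimal

    # Decomposition of the float string "-1.23": sign=1, n=123, p=-2,
    # so that (-1)**sign * n * 10**p reproduces the value exactly.
    sign, n, p = 1, 123, -2
    value = Decimal(-1) ** sign * n * Decimal(10) ** p
    assert value == Decimal("-1.23")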
From rhettinger at ewtllc.com Tue Jul 18 17:36:27 2006
From: rhettinger at ewtllc.com (Raymond Hettinger)
Date: Tue, 18 Jul 2006 08:36:27 -0700
Subject: [Python-checkins] r50697 - python/trunk/Lib/decimal.py
In-Reply-To: <20060718121614.CA0601E4015@bag.python.org>
References: <20060718121614.CA0601E4015@bag.python.org>
Message-ID: <44BCFFFB.4090107@ewtllc.com>
facundo.batista wrote:
>Python/trunk/Lib/decimal.py
>Log:
>Comments and docs cleanups, and some little fixes, provided by Santi?go Peres?n
>
>
This patch is somewhat difficult to review. It would be nice if the
minor formatting adjustments were submitted separately from changes in
functionality. There was no need or value in most of the changes
(adding two spaces after a period instead of one; adding one space after
the start of a comment; re-line-wrapping block comments; etc.) --
please don't do this to any other modules. This goes double for
reformatting code (changing "raise ValueError, str" to "raise
ValueError(str)"). IMO, it is somewhat foolish to make this kind of
change after a beta-release. It's up to Anthony, but we should
probably revert this patch.
- class MockThreading:
+ class MockThreading(object):
Why did MockThreading become a new-style class?
- self._sign, self._int, self._exp = context._raise_error(ConversionSyntax)
+ self._sign, self._int, self._exp = \
+ context._raise_error(ConversionSyntax)
wtf?
- if len(side.__add__(Decimal(1), context=context)._int) >= context.prec:
+ numbsquant = len(side.__add__(Decimal(1), context=context)._int)
+ if numbsquant >= context.prec:
Introducing an intermediate variable with an inscrutable name is not an improvement.
- return context._raise_error(Overflow, 'above Emax', ans._sign)
+ c = context._raise_error(Overflow, 'above Emax', ans._sign)
+ return c
Of what possible benefit is this?
Raymond
From python-checkins at python.org Tue Jul 18 17:59:47 2006
From: python-checkins at python.org (phillip.eby)
Date: Tue, 18 Jul 2006 17:59:47 +0200 (CEST)
Subject: [Python-checkins] r50699 -
sandbox/trunk/setuptools/setuptools/command/install.py
Message-ID: <20060718155947.033B61E4008@bag.python.org>
Author: phillip.eby
Date: Tue Jul 18 17:59:46 2006
New Revision: 50699
Modified:
sandbox/trunk/setuptools/setuptools/command/install.py
Log:
Restored support for extra_path when using backward compatibility mode.
Modified: sandbox/trunk/setuptools/setuptools/command/install.py
==============================================================================
--- sandbox/trunk/setuptools/setuptools/command/install.py (original)
+++ sandbox/trunk/setuptools/setuptools/command/install.py Tue Jul 18 17:59:46 2006
@@ -40,10 +40,16 @@
)
def handle_extra_path(self):
- # We always ignore extra_path, because we install as .egg or .egg-info
+ if self.root or self.single_version_externally_managed:
+ # explicit backward-compatibility mode, allow extra_path to work
+ return _install.handle_extra_path(self)
+
+ # Ignore extra_path when installing an egg (or being run by another
+ # command without --root or --single-version-externally-managed
self.path_file = None
self.extra_dirs = ''
+
def run(self):
# Explicit request for old-style install? Just do it
if self.old_and_unmanageable or self.single_version_externally_managed:
@@ -74,12 +80,6 @@
-
-
-
-
-
-
def do_egg_install(self):
from setuptools.command.easy_install import easy_install
@@ -99,3 +99,25 @@
cmd.args = args
cmd.run()
setuptools.bootstrap_install_from = None
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+#
From python-checkins at python.org Tue Jul 18 18:03:46 2006
From: python-checkins at python.org (phillip.eby)
Date: Tue, 18 Jul 2006 18:03:46 +0200 (CEST)
Subject: [Python-checkins] r50700 - in sandbox/branches/setuptools-0.6:
setuptools.txt setuptools/command/install.py
Message-ID: <20060718160346.5FCDD1E4008@bag.python.org>
Author: phillip.eby
Date: Tue Jul 18 18:03:45 2006
New Revision: 50700
Modified:
sandbox/branches/setuptools-0.6/setuptools.txt
sandbox/branches/setuptools-0.6/setuptools/command/install.py
Log:
Support ``extra_path`` option to ``setup()`` when ``install`` is run in
backward-compatibility mode. (backport from trunk)
Modified: sandbox/branches/setuptools-0.6/setuptools.txt
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools.txt (original)
+++ sandbox/branches/setuptools-0.6/setuptools.txt Tue Jul 18 18:03:45 2006
@@ -2574,6 +2574,9 @@
* Made ``setup.py develop`` respect the ``--no-deps`` option, which it
previously was ignoring.
+ * Support ``extra_path`` option to ``setup()`` when ``install`` is run in
+ backward-compatibility mode.
+
0.6b4
* Fix ``register`` not obeying name/version set by ``egg_info`` command, if
``egg_info`` wasn't explicitly run first on the same command line.
Modified: sandbox/branches/setuptools-0.6/setuptools/command/install.py
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools/command/install.py (original)
+++ sandbox/branches/setuptools-0.6/setuptools/command/install.py Tue Jul 18 18:03:45 2006
@@ -40,10 +40,16 @@
)
def handle_extra_path(self):
- # We always ignore extra_path, because we install as .egg or .egg-info
+ if self.root or self.single_version_externally_managed:
+ # explicit backward-compatibility mode, allow extra_path to work
+ return _install.handle_extra_path(self)
+
+ # Ignore extra_path when installing an egg (or being run by another
+ # command without --root or --single-version-externally-managed
self.path_file = None
self.extra_dirs = ''
+
def run(self):
# Explicit request for old-style install? Just do it
if self.old_and_unmanageable or self.single_version_externally_managed:
@@ -60,7 +66,7 @@
caller = sys._getframe(2)
caller_module = caller.f_globals.get('__name__','')
caller_name = caller.f_code.co_name
-
+
if caller_module != 'distutils.dist' or caller_name!='run_commands':
# We weren't called from the command line or setup(), so we
# should run in backward-compatibility mode to support bdist_*
@@ -68,12 +74,6 @@
_install.run(self)
else:
self.do_egg_install()
-
-
-
-
-
-
@@ -120,4 +120,4 @@
-
+#
From python-checkins at python.org Tue Jul 18 18:32:08 2006
From: python-checkins at python.org (phillip.eby)
Date: Tue, 18 Jul 2006 18:32:08 +0200 (CEST)
Subject: [Python-checkins] r50701 -
sandbox/trunk/setuptools/setuptools/command/sdist.py
Message-ID: <20060718163208.283D81E4008@bag.python.org>
Author: phillip.eby
Date: Tue Jul 18 18:32:07 2006
New Revision: 50701
Modified:
sandbox/trunk/setuptools/setuptools/command/sdist.py
Log:
Edit an sdist's setup.cfg to include any egg_info options that were used
to build it.
Modified: sandbox/trunk/setuptools/setuptools/command/sdist.py
==============================================================================
--- sandbox/trunk/setuptools/setuptools/command/sdist.py (original)
+++ sandbox/trunk/setuptools/setuptools/command/sdist.py Tue Jul 18 18:32:07 2006
@@ -173,28 +173,28 @@
)
+ def make_release_tree (self, base_dir, files):
+ _sdist.make_release_tree(self, base_dir, files)
+ # Save any egg_info command line options used to create this sdist
+ settings = {}
+ dist = self.distribution
+ for opt,(src,val) in dist.get_option_dict('egg_info').items():
+ if src=="command line":
+ settings[opt] = val
+
+ if not settings:
+ return # nothing to change
+
+ dest = os.path.join(base_dir, 'setup.cfg')
+ if hasattr(os,'link') and os.path.exists(dest):
+ # unlink and re-copy, since it might be hard-linked, and
+ # we don't want to change the source version
+ os.unlink(dest)
+ self.copy_file('setup.cfg', dest)
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
+ from setopt import edit_config
+ edit_config(dest, {'egg_info':settings})
From python-checkins at python.org Tue Jul 18 18:49:43 2006
From: python-checkins at python.org (phillip.eby)
Date: Tue, 18 Jul 2006 18:49:43 +0200 (CEST)
Subject: [Python-checkins] r50702 -
sandbox/trunk/setuptools/setuptools/command/egg_info.py
sandbox/trunk/setuptools/setuptools/command/sdist.py
Message-ID: <20060718164943.3695B1E4008@bag.python.org>
Author: phillip.eby
Date: Tue Jul 18 18:49:42 2006
New Revision: 50702
Modified:
sandbox/trunk/setuptools/setuptools/command/egg_info.py
sandbox/trunk/setuptools/setuptools/command/sdist.py
Log:
Make sdist-from-sdist builds keep the exact same version number, even if
--tag-date was used to build the original sdist.
Modified: sandbox/trunk/setuptools/setuptools/command/egg_info.py
==============================================================================
--- sandbox/trunk/setuptools/setuptools/command/egg_info.py (original)
+++ sandbox/trunk/setuptools/setuptools/command/egg_info.py Tue Jul 18 18:49:42 2006
@@ -39,7 +39,7 @@
- def initialize_options (self):
+ def initialize_options(self):
self.egg_name = None
self.egg_version = None
self.egg_base = None
@@ -48,16 +48,16 @@
self.tag_svn_revision = 0
self.tag_date = 0
self.broken_egg_info = False
+ self.vtags = None
-
-
-
-
-
-
-
-
-
+ def save_version_info(self, filename):
+ from setopt import edit_config
+ edit_config(
+ filename,
+ {'egg_info':
+ {'tag_svn_revision':0, 'tag_date': 0, 'tag_build': self.tags()}
+ }
+ )
@@ -82,6 +82,7 @@
def finalize_options (self):
self.egg_name = safe_name(self.distribution.get_name())
+ self.vtags = self.tags()
self.egg_version = self.tagged_version()
try:
@@ -120,7 +121,6 @@
self.distribution._patched_dist = None
-
def write_or_delete_file(self, what, filename, data, force=False):
"""Write `data` to `filename` or delete if empty
@@ -159,8 +159,8 @@
if not self.dry_run:
os.unlink(filename)
-
-
+ def tagged_version(self):
+ return safe_version(self.distribution.get_version() + self.vtags)
def run(self):
self.mkpath(self.egg_info)
@@ -170,8 +170,8 @@
writer(self, ep.name, os.path.join(self.egg_info,ep.name))
self.find_sources()
- def tagged_version(self):
- version = self.distribution.get_version()
+ def tags(self):
+ version = ''
if self.tag_build:
version+=self.tag_build
if self.tag_svn_revision and (
@@ -179,7 +179,7 @@
): version += '-r%s' % self.get_svn_revision()
if self.tag_date:
import time; version += time.strftime("-%Y%m%d")
- return safe_version(version)
+ return version
def get_svn_revision(self):
revision = 0
Modified: sandbox/trunk/setuptools/setuptools/command/sdist.py
==============================================================================
--- sandbox/trunk/setuptools/setuptools/command/sdist.py (original)
+++ sandbox/trunk/setuptools/setuptools/command/sdist.py Tue Jul 18 18:49:42 2006
@@ -173,19 +173,10 @@
)
- def make_release_tree (self, base_dir, files):
+ def make_release_tree(self, base_dir, files):
_sdist.make_release_tree(self, base_dir, files)
# Save any egg_info command line options used to create this sdist
- settings = {}
- dist = self.distribution
- for opt,(src,val) in dist.get_option_dict('egg_info').items():
- if src=="command line":
- settings[opt] = val
-
- if not settings:
- return # nothing to change
-
dest = os.path.join(base_dir, 'setup.cfg')
if hasattr(os,'link') and os.path.exists(dest):
# unlink and re-copy, since it might be hard-linked, and
@@ -193,8 +184,17 @@
os.unlink(dest)
self.copy_file('setup.cfg', dest)
- from setopt import edit_config
- edit_config(dest, {'egg_info':settings})
+ self.get_finalized_command('egg_info').save_version_info(dest)
+
+
+
+
+
+
+
+
+
+
From python-checkins at python.org Tue Jul 18 18:55:23 2006
From: python-checkins at python.org (phillip.eby)
Date: Tue, 18 Jul 2006 18:55:23 +0200 (CEST)
Subject: [Python-checkins] r50703 - in sandbox/branches/setuptools-0.6:
setuptools.txt setuptools/command/egg_info.py
setuptools/command/sdist.py
Message-ID: <20060718165523.BF7901E4008@bag.python.org>
Author: phillip.eby
Date: Tue Jul 18 18:55:23 2006
New Revision: 50703
Modified:
sandbox/branches/setuptools-0.6/setuptools.txt
sandbox/branches/setuptools-0.6/setuptools/command/egg_info.py
sandbox/branches/setuptools-0.6/setuptools/command/sdist.py
Log:
Source distributions now always include a ``setup.cfg`` file that explicitly
sets ``egg_info`` options such that they produce an identical version number
to the source distribution's version number. (Previously, the default
version number could be different due to the use of ``--tag-date``, or if
the version was overridden on the command line that built the source
distribution.)
(backport from trunk)
Modified: sandbox/branches/setuptools-0.6/setuptools.txt
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools.txt (original)
+++ sandbox/branches/setuptools-0.6/setuptools.txt Tue Jul 18 18:55:23 2006
@@ -2577,6 +2577,11 @@
* Support ``extra_path`` option to ``setup()`` when ``install`` is run in
backward-compatibility mode.
+ * Rewrite ``setup.cfg`` in ``sdist`` distributions to include any ``egg_info``
+ options that were set on the command line when ``sdist`` was run, so that
+ the building from the source distribution will create a package with an
+ identical version number (unless ``--tag-date`` is in effect).
+
0.6b4
* Fix ``register`` not obeying name/version set by ``egg_info`` command, if
``egg_info`` wasn't explicitly run first on the same command line.
Modified: sandbox/branches/setuptools-0.6/setuptools/command/egg_info.py
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools/command/egg_info.py (original)
+++ sandbox/branches/setuptools-0.6/setuptools/command/egg_info.py Tue Jul 18 18:55:23 2006
@@ -39,7 +39,7 @@
- def initialize_options (self):
+ def initialize_options(self):
self.egg_name = None
self.egg_version = None
self.egg_base = None
@@ -48,16 +48,16 @@
self.tag_svn_revision = 0
self.tag_date = 0
self.broken_egg_info = False
+ self.vtags = None
-
-
-
-
-
-
-
-
-
+ def save_version_info(self, filename):
+ from setopt import edit_config
+ edit_config(
+ filename,
+ {'egg_info':
+ {'tag_svn_revision':0, 'tag_date': 0, 'tag_build': self.tags()}
+ }
+ )
@@ -82,6 +82,7 @@
def finalize_options (self):
self.egg_name = safe_name(self.distribution.get_name())
+ self.vtags = self.tags()
self.egg_version = self.tagged_version()
try:
@@ -120,7 +121,6 @@
self.distribution._patched_dist = None
-
def write_or_delete_file(self, what, filename, data, force=False):
"""Write `data` to `filename` or delete if empty
@@ -159,8 +159,8 @@
if not self.dry_run:
os.unlink(filename)
-
-
+ def tagged_version(self):
+ return safe_version(self.distribution.get_version() + self.vtags)
def run(self):
self.mkpath(self.egg_info)
@@ -170,8 +170,8 @@
writer(self, ep.name, os.path.join(self.egg_info,ep.name))
self.find_sources()
- def tagged_version(self):
- version = self.distribution.get_version()
+ def tags(self):
+ version = ''
if self.tag_build:
version+=self.tag_build
if self.tag_svn_revision and (
@@ -179,7 +179,7 @@
): version += '-r%s' % self.get_svn_revision()
if self.tag_date:
import time; version += time.strftime("-%Y%m%d")
- return safe_version(version)
+ return version
def get_svn_revision(self):
revision = 0
Modified: sandbox/branches/setuptools-0.6/setuptools/command/sdist.py
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools/command/sdist.py (original)
+++ sandbox/branches/setuptools-0.6/setuptools/command/sdist.py Tue Jul 18 18:55:23 2006
@@ -173,18 +173,18 @@
)
+ def make_release_tree(self, base_dir, files):
+ _sdist.make_release_tree(self, base_dir, files)
+ # Save any egg_info command line options used to create this sdist
+ dest = os.path.join(base_dir, 'setup.cfg')
+ if hasattr(os,'link') and os.path.exists(dest):
+ # unlink and re-copy, since it might be hard-linked, and
+ # we don't want to change the source version
+ os.unlink(dest)
+ self.copy_file('setup.cfg', dest)
-
-
-
-
-
-
-
-
-
-
+ self.get_finalized_command('egg_info').save_version_info(dest)
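As an illustrative sketch (not part of the patch above; the tag values shown
are hypothetical), the effect is that an unpacked sdist carries a setup.cfg
whose [egg_info] section pins the tags, which can be inspected like any other
config file:

    import ConfigParser

    # Read back the setup.cfg that make_release_tree() rewrote in the sdist.
    cfg = ConfigParser.RawConfigParser()
    cfg.read('setup.cfg')
    print cfg.items('egg_info')
    # e.g. [('tag_build', '-r50703'), ('tag_date', '0'), ('tag_svn_revision', '0')]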
From python-checkins at python.org Tue Jul 18 19:46:33 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Tue, 18 Jul 2006 19:46:33 +0200 (CEST)
Subject: [Python-checkins] r50704 - in python/trunk: Lib/lib-tk/Tkinter.py
Misc/NEWS
Message-ID: <20060718174633.0A8361E4014@bag.python.org>
Author: martin.v.loewis
Date: Tue Jul 18 19:46:31 2006
New Revision: 50704
Modified:
python/trunk/Lib/lib-tk/Tkinter.py
python/trunk/Misc/NEWS
Log:
Patch #1524429: Use repr instead of backticks again.
Modified: python/trunk/Lib/lib-tk/Tkinter.py
==============================================================================
--- python/trunk/Lib/lib-tk/Tkinter.py (original)
+++ python/trunk/Lib/lib-tk/Tkinter.py Tue Jul 18 19:46:31 2006
@@ -186,7 +186,7 @@
if name:
self._name = name
else:
- self._name = 'PY_VAR' + `_varnum`
+ self._name = 'PY_VAR' + repr(_varnum)
_varnum += 1
if value != None:
self.set(value)
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Tue Jul 18 19:46:31 2006
@@ -27,6 +27,8 @@
Library
-------
+- Patch #1524429: Use repr() instead of backticks in Tkinter again.
+
- Bug #1520914: Change time.strftime() to accept a zero for any position in its
argument tuple. For arguments where zero is illegal, the value is forced to
the minimum value that is correct. This is to support an undocumented but
From python-checkins at python.org Tue Jul 18 23:55:16 2006
From: python-checkins at python.org (tim.peters)
Date: Tue, 18 Jul 2006 23:55:16 +0200 (CEST)
Subject: [Python-checkins] r50706 - in python/trunk/Lib: decimal.py
subprocess.py test/test_dis.py
Message-ID: <20060718215516.6AF741E4012@bag.python.org>
Author: tim.peters
Date: Tue Jul 18 23:55:15 2006
New Revision: 50706
Modified:
python/trunk/Lib/decimal.py
python/trunk/Lib/subprocess.py
python/trunk/Lib/test/test_dis.py
Log:
Whitespace normalization.
Modified: python/trunk/Lib/decimal.py
==============================================================================
--- python/trunk/Lib/decimal.py (original)
+++ python/trunk/Lib/decimal.py Tue Jul 18 23:55:15 2006
@@ -165,7 +165,7 @@
"""
def handle(self, context, *args):
"""Called when context._raise_error is called and trap_enabler is set.
-
+
First argument is self, second is the context. More arguments can
be given, those being after the explanation in _raise_error (For
example, context._raise_error(NewError, '(-x)!', self._sign) would
@@ -2258,9 +2258,9 @@
s.append(
'Context(prec=%(prec)d, rounding=%(rounding)s, Emin=%(Emin)d, Emax=%(Emax)d, capitals=%(capitals)d'
% vars(self))
- s.append('flags=[' + ', '.join([f.__name__ for f, v
+ s.append('flags=[' + ', '.join([f.__name__ for f, v
in self.flags.items() if v]) + ']')
- s.append('traps=[' + ', '.join([t.__name__ for t, v
+ s.append('traps=[' + ', '.join([t.__name__ for t, v
in self.traps.items() if v]) + ']')
return ', '.join(s) + ')'
@@ -2281,7 +2281,7 @@
def copy(self):
"""Returns a deep copy from self."""
- nc = Context(self.prec, self.rounding, self.traps.copy(),
+ nc = Context(self.prec, self.rounding, self.traps.copy(),
self.flags.copy(), self._rounding_decision,
self.Emin, self.Emax, self.capitals,
self._clamp, self._ignored_flags)
@@ -2701,7 +2701,7 @@
"""Returns the remainder from integer division.
The result is the residue of the dividend after the operation of
- calculating integer division as described for divide-integer, rounded
+ calculating integer division as described for divide-integer, rounded
to precision digits if necessary. The sign of the result, if non-zero,
is the same as that of the original dividend.
@@ -3077,7 +3077,7 @@
def _string2exact(s):
"""Return sign, n, p s.t.
-
+
Float string value == -1**sign * n * 10**p exactly
"""
m = _parser(s)
Modified: python/trunk/Lib/subprocess.py
==============================================================================
--- python/trunk/Lib/subprocess.py (original)
+++ python/trunk/Lib/subprocess.py Tue Jul 18 23:55:15 2006
@@ -369,7 +369,7 @@
self.cmd = cmd
def __str__(self):
return "Command '%s' returned non-zero exit status %d" % (self.cmd, self.returncode)
-
+
if mswindows:
import threading
Modified: python/trunk/Lib/test/test_dis.py
==============================================================================
--- python/trunk/Lib/test/test_dis.py (original)
+++ python/trunk/Lib/test/test_dis.py Tue Jul 18 23:55:15 2006
@@ -133,10 +133,10 @@
def test_big_linenos(self):
def func(count):
- namespace = {}
- func = "def foo():\n " + "".join(["\n "] * count + ["spam\n"])
- exec func in namespace
- return namespace['foo']
+ namespace = {}
+ func = "def foo():\n " + "".join(["\n "] * count + ["spam\n"])
+ exec func in namespace
+ return namespace['foo']
# Test all small ranges
for i in xrange(1, 300):
From python-checkins at python.org Wed Jul 19 00:14:01 2006
From: python-checkins at python.org (phillip.eby)
Date: Wed, 19 Jul 2006 00:14:01 +0200 (CEST)
Subject: [Python-checkins] r50707 -
sandbox/branches/setuptools-0.6/setuptools.txt
Message-ID: <20060718221401.418661E4008@bag.python.org>
Author: phillip.eby
Date: Wed Jul 19 00:14:00 2006
New Revision: 50707
Modified:
sandbox/branches/setuptools-0.6/setuptools.txt
Log:
Fix incorrect release note regarding the egg_info/sdist fix.
Modified: sandbox/branches/setuptools-0.6/setuptools.txt
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools.txt (original)
+++ sandbox/branches/setuptools-0.6/setuptools.txt Wed Jul 19 00:14:00 2006
@@ -2577,10 +2577,12 @@
* Support ``extra_path`` option to ``setup()`` when ``install`` is run in
backward-compatibility mode.
- * Rewrite ``setup.cfg`` in ``sdist`` distributions to include any ``egg_info``
- options that were set on the command line when ``sdist`` was run, so that
- the building from the source distribution will create a package with an
- identical version number (unless ``--tag-date`` is in effect).
+ * Source distributions now always include a ``setup.cfg`` file that explicitly
+ sets ``egg_info`` options such that they produce an identical version number
+ to the source distribution's version number. (Previously, the default
+ version number could be different due to the use of ``--tag-date``, or if
+ the version was overridden on the command line that built the source
+ distribution.)
0.6b4
* Fix ``register`` not obeying name/version set by ``egg_info`` command, if
From python-checkins at python.org Wed Jul 19 02:03:21 2006
From: python-checkins at python.org (tim.peters)
Date: Wed, 19 Jul 2006 02:03:21 +0200 (CEST)
Subject: [Python-checkins] r50708 - in python/trunk: Lib/test/test_sys.py
Misc/NEWS Python/pystate.c
Message-ID: <20060719000321.36AF31E401E@bag.python.org>
Author: tim.peters
Date: Wed Jul 19 02:03:19 2006
New Revision: 50708
Modified:
python/trunk/Lib/test/test_sys.py
python/trunk/Misc/NEWS
python/trunk/Python/pystate.c
Log:
SF bug 1524317: configure --without-threads fails to build
Moved the code for _PyThread_CurrentFrames() up, so it's no longer
in a huge "#ifdef WITH_THREAD" block (I didn't realize it /was/ in
one).
Changed test_sys's test_current_frames() so it passes with or without
thread support compiled in.
Note that test_sys fails when Python is compiled without threads,
but for an unrelated reason (the old test_exit() fails with an
indirect ImportError on the `thread` module). There are also
other unrelated compilation failures without threads, in extension
modules (like ctypes); at least the core compiles again.
Do we really support --without-threads? If so, there are several
problems remaining.
Modified: python/trunk/Lib/test/test_sys.py
==============================================================================
--- python/trunk/Lib/test/test_sys.py (original)
+++ python/trunk/Lib/test/test_sys.py Wed Jul 19 02:03:19 2006
@@ -239,6 +239,19 @@
# sys._current_frames() is a CPython-only gimmick.
def test_current_frames(self):
+ have_threads = True
+ try:
+ import thread
+ except ImportError:
+ have_threads = False
+
+ if have_threads:
+ self.current_frames_with_threads()
+ else:
+ self.current_frames_without_threads()
+
+ # Test sys._current_frames() in a WITH_THREADS build.
+ def current_frames_with_threads(self):
import threading, thread
import traceback
@@ -298,6 +311,15 @@
leave_g.set()
t.join()
+ # Test sys._current_frames() when thread support doesn't exist.
+ def current_frames_without_threads(self):
+ # Not much happens here: there is only one thread, with artificial
+ # "thread id" 0.
+ d = sys._current_frames()
+ self.assertEqual(len(d), 1)
+ self.assert_(0 in d)
+ self.assert_(d[0] is sys._getframe())
+
def test_attributes(self):
self.assert_(isinstance(sys.api_version, int))
self.assert_(isinstance(sys.argv, list))
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Wed Jul 19 02:03:19 2006
@@ -24,6 +24,11 @@
again. Fixing this problem required changing the .pyc magic number.
This means that .pyc files generated before 2.5c1 will be regenerated.
+- Bug #1524317: Compiling Python ``--without-threads`` failed.
+ The Python core compiles again then, and, in a build without threads, the
+ new ``sys._current_frames()`` returns a dictionary with one entry,
+ mapping the faux "thread id" 0 to the current frame.
+
Library
-------
@@ -155,9 +160,9 @@
Extension Modules
-----------------
-- #1494314: Fix a regression with high-numbered sockets in 2.4.3. This
- means that select() on sockets > FD_SETSIZE (typically 1024) work again.
- The patch makes sockets use poll() internally where available.
+- #1494314: Fix a regression with high-numbered sockets in 2.4.3. This
+ means that select() on sockets > FD_SETSIZE (typically 1024) work again.
+ The patch makes sockets use poll() internally where available.
- Assigning None to pointer type fields in ctypes structures possibly
overwrote the wrong fields; this is fixed now.
@@ -1216,7 +1221,7 @@
Library
-------
-
+
- Patch #1388073: Numerous __-prefixed attributes of unittest.TestCase have
been renamed to have only a single underscore prefix. This was done to
make subclassing easier.
Modified: python/trunk/Python/pystate.c
==============================================================================
--- python/trunk/Python/pystate.c (original)
+++ python/trunk/Python/pystate.c Wed Jul 19 02:03:19 2006
@@ -387,6 +387,53 @@
return tstate->next;
}
+/* The implementation of sys._current_frames(). This is intended to be
+ called with the GIL held, as it will be when called via
+ sys._current_frames(). It's possible it would work fine even without
+ the GIL held, but haven't thought enough about that.
+*/
+PyObject *
+_PyThread_CurrentFrames(void)
+{
+ PyObject *result;
+ PyInterpreterState *i;
+
+ result = PyDict_New();
+ if (result == NULL)
+ return NULL;
+
+ /* for i in all interpreters:
+ * for t in all of i's thread states:
+ * if t's frame isn't NULL, map t's id to its frame
+ * Because these lists can mutate even when the GIL is held, we
+ * need to grab head_mutex for the duration.
+ */
+ HEAD_LOCK();
+ for (i = interp_head; i != NULL; i = i->next) {
+ PyThreadState *t;
+ for (t = i->tstate_head; t != NULL; t = t->next) {
+ PyObject *id;
+ int stat;
+ struct _frame *frame = t->frame;
+ if (frame == NULL)
+ continue;
+ id = PyInt_FromLong(t->thread_id);
+ if (id == NULL)
+ goto Fail;
+ stat = PyDict_SetItem(result, id, (PyObject *)frame);
+ Py_DECREF(id);
+ if (stat < 0)
+ goto Fail;
+ }
+ }
+ HEAD_UNLOCK();
+ return result;
+
+ Fail:
+ HEAD_UNLOCK();
+ Py_DECREF(result);
+ return NULL;
+}
/* Python "auto thread state" API. */
#ifdef WITH_THREAD
@@ -550,54 +597,6 @@
PyEval_SaveThread();
}
-/* The implementation of sys._current_frames(). This is intended to be
- called with the GIL held, as it will be when called via
- sys._current_frames(). It's possible it would work fine even without
- the GIL held, but haven't thought enough about that.
-*/
-PyObject *
-_PyThread_CurrentFrames(void)
-{
- PyObject *result;
- PyInterpreterState *i;
-
- result = PyDict_New();
- if (result == NULL)
- return NULL;
-
- /* for i in all interpreters:
- * for t in all of i's thread states:
- * if t's frame isn't NULL, map t's id to its frame
- * Because these lists can mutate even when the GIL is held, we
- * need to grab head_mutex for the duration.
- */
- HEAD_LOCK();
- for (i = interp_head; i != NULL; i = i->next) {
- PyThreadState *t;
- for (t = i->tstate_head; t != NULL; t = t->next) {
- PyObject *id;
- int stat;
- struct _frame *frame = t->frame;
- if (frame == NULL)
- continue;
- id = PyInt_FromLong(t->thread_id);
- if (id == NULL)
- goto Fail;
- stat = PyDict_SetItem(result, id, (PyObject *)frame);
- Py_DECREF(id);
- if (stat < 0)
- goto Fail;
- }
- }
- HEAD_UNLOCK();
- return result;
-
- Fail:
- HEAD_UNLOCK();
- Py_DECREF(result);
- return NULL;
-}
-
#ifdef __cplusplus
}
#endif
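For reference, a small usage sketch (not part of the patch; purely
illustrative) of the sys._current_frames() API whose implementation is moved
above -- it returns a dict mapping thread id to that thread's topmost frame,
and in a --without-threads build it maps the faux id 0 to the current frame:

    import sys
    import traceback

    # Dump the current stack of every known thread.
    for thread_id, frame in sys._current_frames().items():
        print "Thread %s:" % thread_id
        traceback.print_stack(frame)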
From python-checkins at python.org Wed Jul 19 03:06:50 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Wed, 19 Jul 2006 03:06:50 +0200 (CEST)
Subject: [Python-checkins] r50709 - sandbox/trunk/Doc/threading.rst
Message-ID: <20060719010650.D24E41E4009@bag.python.org>
Author: andrew.kuchling
Date: Wed Jul 19 03:06:50 2006
New Revision: 50709
Added:
sandbox/trunk/Doc/threading.rst
Log:
Start document
Added: sandbox/trunk/Doc/threading.rst
==============================================================================
--- (empty file)
+++ sandbox/trunk/Doc/threading.rst Wed Jul 19 03:06:50 2006
@@ -0,0 +1,43 @@
+Thread Programming HOWTO
+================================
+
+**Version 0.01**
+
+(This is a first draft. Please send comments/error
+reports/suggestions to amk at amk.ca. This URL is probably not going to
+be the final location of the document, so be careful about linking to
+it -- you may want to add a disclaimer.)
+
+Introduction
+----------------------
+
+XXX
+
+Revision History and Acknowledgements
+------------------------------------------------
+
+The author would like to thank the following people for offering
+suggestions, corrections and assistance with various drafts of this
+article: no one yet.
+
+References
+--------------------
+
+General
+'''''''''''''''
+
+XXX
+
+Thread programming in Python
+''''''''''''''''''''''''''''''''
+
+http://www.python.net/crew/aahz/OSCON2001/:
+Aahz's notes from a 2001 OSCON tutorial.
+
+http://www.pyzine.com/Issue001/Section_Articles/article_ThreadingGlobalInterpreter.html:
+"Threading the Global Interpreter Lock", by Aahz, in PyZine #1.
+
+http://heather.cs.ucdavis.edu/~matloff/Python/PyThreads.pdf:
+A general tutorial on threaded programming in Python, by Norman Matloff and
+Francis Hsu, University of California, Davis.
+
From jackdied at jackdied.com Wed Jul 19 03:08:40 2006
From: jackdied at jackdied.com (Jack Diederich)
Date: Tue, 18 Jul 2006 21:08:40 -0400
Subject: [Python-checkins] r50708 - in python/trunk:
Lib/test/test_sys.py Misc/NEWS Python/pystate.c
In-Reply-To: <20060719000321.36AF31E401E@bag.python.org>
References: <20060719000321.36AF31E401E@bag.python.org>
Message-ID: <20060719010840.GD2540@performancedrivers.com>
On Wed, Jul 19, 2006 at 02:03:21AM +0200, tim.peters wrote:
>
> Do we really support --without-threads? If so, there are several
> problems remaining.
>
mod_python for Apache 1.x recommends building without threads but doesn't
require it. I tried googling for people who use --without-threads and most
were pre-2003 and talking about mod_python. HURD and FreeBSD came up a
couple of times. Do we need to add more *BSD buildbots? As for HURD, I think
we can safely say getting Python running is the least of their problems.
-Jack
From python-checkins at python.org Wed Jul 19 03:52:11 2006
From: python-checkins at python.org (brett.cannon)
Date: Wed, 19 Jul 2006 03:52:11 +0200 (CEST)
Subject: [Python-checkins] r50710 -
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
python/branches/bcannon-sandboxing/securing_python.txt
Message-ID: <20060719015211.530401E4009@bag.python.org>
Author: brett.cannon
Date: Wed Jul 19 03:52:10 2006
New Revision: 50710
Added:
python/branches/bcannon-sandboxing/securing_python.txt
- copied, changed from r50656, python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Removed:
python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
Log:
Redesign the security model to use object-capabilities. Aim was to explain
more design plans than API. This allows more flexibility in the future and
hopefully makes it easier for people to see if the design is sound and makes
sense.
Also renamed the file securing_python.txt (original name was rather long and
this one just "feels" better).
Have two (known) sections left to write before this new draft is finished.
Deleted: /python/branches/bcannon-sandboxing/sandboxing_design_doc.txt
==============================================================================
--- /python/branches/bcannon-sandboxing/sandboxing_design_doc.txt Wed Jul 19 03:52:10 2006
+++ (empty file)
@@ -1,1195 +0,0 @@
-Restricted Execution for Python
-#######################################
-
-About This Document
-=============================
-
-This document is meant to lay out the general design for re-introducing
-a sandboxing model for Python. This document should provide one with
-enough information to understand the goals for sandboxing, what
-considerations were made for the design, and the actual design itself.
-Design decisions should be clear and explain not only why they were
-chosen but possible drawbacks from taking a specific approach.
-
-If any of the above is found not to be true, please email me at
-brett at python.org and let me know what problems you are having with the
-document.
-
-
-XXX TO DO
-=============================
-
-Design
---------------
-
-* threading needs protection?
-* python-dev convince me that hiding 'file' possible?
- + based on that, handle code objects
- + also decide how to handle sockets
- + perhaps go with crippling but try best effort on hiding reference and if
- best effort holds up eventually shift over to capabilities system
-* resolve to IP at call time to prevent DNS man-in-the-middle attacks when
- allowing a specific host name?
-* what network info functions are allowed by default?
-* does the object.__subclasses__() trick work across interpreters, or is it
- unique per interpreter?
-* figure out default whitelist of extension modules
-* check default accessible objects for file path exposure
-* helper functions to get at StringIO instances for stdin, stdout, and friends?
-* decide on what type of objects (e.g., PyStringObject or const char *) are to
- be passed in
-* all built-ins properly protected?
-* exactly how to tell whether argument to open() is a path, IP, or host name
- (third argument, 'n' prefix for networking, format of path, ...)
-* API at the Python level
-* for extension module protection, allow for wildcard allowance
- (e.g., ``xml.*``)
-
-
-Implementation
---------------
-
-* add __sandbox__
-* merge from HEAD
- + last merge on rev. 47248
-* remove bare malloc()/realloc()/free() uses
- + also watch out for PyObject_Malloc()/PyObject_MALLOC() calls
-* note in SpecialBuilds.txt
-
-
-Goal
-=============================
-
-A good sandboxing model provides enough protection to prevent malicious
-harm from coming to the system, and no more. Barriers should be minimized
-so as to allow most code that does not do anything that would be
-regarded as harmful to run unmodified. But the protections need to be
-thorough enough to prevent any unintended changes to the system, or
-disclosure of information about it, from coming about.
-
-An important point to take into consideration when reading this
-document is to realize it is part of my (Brett Cannon's) Ph.D.
-dissertation. This means it is heavily geared toward sandboxing when
-the interpreter is working with Python code embedded in a web page as
-viewed in Firefox. While great strides have been taken to keep the
-design general enough so as to allow all previous uses of the 'rexec'
-module [#rexec]_ to be able to use the new design, it is not the
-focused goal. This means if a design decision must be made for the
-embedded use case compared to sandboxing Python code in a pure Python
-application, the former will win out over the latter.
-
-Throughout this document, the term "resource" is used to represent
-anything that deserves possible protection. This includes things that
-have a physical representation (e.g., memory) to things that are more
-abstract and specific to the interpreter (e.g., sys.path).
-
-When referring to the state of an interpreter, it is either
-"unprotected" or "sandboxed". A unprotected interpreter has no
-restrictions imposed upon any resource. A sandboxed interpreter has at
-least one, possibly more, resource with restrictions placed upon it to
-prevent unsafe code that is running within the interpreter to cause
-harm to the system.
-
-
-.. contents::
-
-
-Use Cases
-/////////////////////////////
-
-All use cases are based on how many sandboxed interpreters are running
-in a single process and whether an unprotected interpreter is also
-running. The use cases can be broken down into two categories: when
-the interpreter is embedded and only using sandboxed interpreters, and
-when pure Python code is running in an unprotected interpreter and uses
-sandboxed interpreters.
-
-
-When the Interpreter Is Embedded
-================================
-
-Single Sandboxed Interpreter
-----------------------------
-
-This use case is when an application embeds the interpreter and never
-has more than one interpreter running which happens to be sandboxed.
-
-
-Multiple Sandboxed Interpreters
--------------------------------
-
-When multiple interpreters, all sandboxed at varying levels, need to be
-running within a single application. This is the key use case that
-this proposed design is targeted for.
-
-
-Stand-Alone Python
-=============================
-
-When someone has written a Python program that wants to execute Python
-code in an sandboxed interpreter(s). This is the use case that 'rexec'
-attempted to fulfill.
-
-
-Issues to Consider
-=============================
-
-Common to all use cases, resources that the interpreter requires to
-function at a level below user code cannot be exposed to a sandboxed
-interpreter. For instance, the interpreter might need to stat a file
-to see if it is possible to import. If the ability to stat a file is
-not allowed to a sandboxed interpreter, it should not be allowed to
-perform that action, regardless of whether the interpreter at a level
-below user code needs that ability.
-
-When multiple interpreters are involved (sandboxed or not), not
-allowing an interpreter to gain access to resources available in other
-interpreters without explicit permission must be enforced.
-
-
-Resources to Protect
-/////////////////////////////
-
-It is important to make sure that the proper resources are protected
-from a sandboxed interpreter. If you don't there is no point to sandboxing.
-
-Filesystem
-===================
-
-All facets of the filesystem must be protected. This means restricting
-reading and writing to the filesystem (e.g., files, directories, etc.).
-It should be allowed in controlled situations where allowing access to
-the filesystem is desirable, but that should be an explicit allowance.
-
-There must also be protection to prevent revealing any information
-about the filesystem. Disclosing information on the filesystem could
-allow one to infer what OS the interpreter is running on, for instance.
-
-
-Memory
-===================
-
-Memory should be protected. It is a limited resource on the system
-that can have an impact on other running programs if it is exhausted.
-Being able to restrict the use of memory would help alleviate issues
-from denial-of-service (DoS) attacks on the system.
-
-
-Networking
-===================
-
-Networking is somewhat like the filesystem in terms of wanting similar
-protections. You do not want to let unsafe code make socket
-connections unhindered or accept them to do possibly nefarious things.
-You also want to prevent finding out information about the network you
-are connected to.
-
-
-Interpreter
-===================
-
-One must make sure that the interpreter is not harmed in any way from
-sandboxed code. This usually takes the form of crashing the program
-that the interpreter is embedded in or the unprotected interpreter that
-started the sandbox interpreter. Executing hostile bytecode that might
-lead to undesirable effects is another possible issue.
-
-There is also the issue of taking it over. One should not be able to gain
-escalated privileges in any way without explicit permission.
-
-
-Types of Security
-///////////////////////////////////////
-
-As with most things, there are multiple approaches one can take to
-tackle a problem. Security is no exception. In general there seem to
-be two approaches to protecting resources.
-
-
-Resource Hiding
-=============================
-
-By never giving code a chance to access a resource, you prevent it from
-being (ab)used. This is the idea behind resource hiding; you can't
-misuse something you don't have in the first place.
-
-The most common implementation of resource hiding is capabilities. In
-this type of system a resource's reference acts as a ticket that
-represents the right to use the resource. Once code has a reference it
-is considered to have full use of resource that reference represents
-and no further security checks are directly performed (using delegates
-and other structured ways one can actually have a security check for
-each access of a resource, but this is not a default behaviour).
-
-As an example, consider the 'file' type as a resource we want to
-protect. That would mean that we did not want a reference to the
-'file' type to ever be accessible without explicit permission. If one
-wanted to provide read-only access to a temp file, you could have
-open() perform a check on the permissions of the current interpreter,
-and if it is allowed to, return a proxy object for the file that only
-allows reading from it. The 'file' instance for the proxy would need
-to be properly hidden so that the reference was not reachable from
-outside so that 'file' access could still be controlled.
-
-Python, as it stands now, unfortunately does not work well for a pure
-capabilities system. Capabilities require the prohibition of certain
-abilities, such as "direct access to another's private state"
-[#paradigm regained]_. This obviously is not possible in Python since,
-at least at the Python level, there is no such thing as private state
-that is persistent (one could argue that local variables that are not
-cell variables for lexical scopes are private, but since they do not
-survive after a function call they are not usable for keeping
-persistent state). One can hide references at the C level by storing
-them in the struct for the instance of a type and not providing a
-function to access that attribute.
-
-Python's introspection abilities also do not help make implementing
-capabilities that much easier. Consider how one could access 'file'
-even when it is deleted from __builtin__. You can still get to the
-reference for 'file' through the sequence returned by
-``object.__subclasses__()``.
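-
-To illustrate (a Python 2.x sketch run in an ordinary interpreter; the
-type is looked up by name since its position in the list varies)::
-
-    import __builtin__
-
-    del __builtin__.file        # "hiding" the type by deleting its name
-
-    # The type object still exists and is reachable through introspection.
-    hidden = [cls for cls in object.__subclasses__()
-              if cls.__name__ == 'file'][0]
-    f = hidden('/etc/passwd')   # full 'file' access regained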
-
-
-Resource Crippling
-=============================
-
-Another approach to security is to not worry about controlling access
-to the reference of a resource. One can have a resource perform a
-security check every time someone tries to use a method on that
-resource. This pushes the security check to a lower level; from a
-reference level to the method level.
-
-By performing the security check every time a resource's method is
-called the worry of a specific resource's reference leaking out to
-insecure code is alleviated. This does add extra overhead, though, by
-having to do so many security checks. It also does not handle the
-situation where an unexpected exposure of a type occurs that has not
-been properly crippled.
-
-FreeBSD's jail system provides a protection scheme similar to this.
-Various system calls allow for basic usage, but knowing or having
-access to the system call is not enough to grant usage. Every call to
-a system call requires checking that the proper rights have been
-granted to the user in order to allow for the system call to perform
-its action.
-
-An even better example in FreeBSD's jail system is its protection of
-sockets.  One can only bind a single IP address to a jail.  Any attempt
-to bind more addresses, or to use an address other than the one
-granted, is prevented.  The check is performed at every call involving
-the granted IP address.
-
-Using 'file' as the example again, one could cripple the type so that
-instantiation is not possible for the type in Python. One could also
-provide a permission check on each call to an unsafe method and
-thus allow the type to be used in normal situations (such as type
-checking), but still feel safe that illegal operations are not
-performed. Regardless of which approach you take, you do not need to
-worry about a reference to the type being exposed unexpectedly since
-the reference is not the security check but the actual method calls.
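-
-A rough Python-level sketch of what a crippled 'file' might look like
-(``_operation_allowed()`` and ``SandboxError`` are illustrative
-stand-ins for the real C-level machinery)::
-
-    class SandboxError(Exception):
-        pass
-
-    def _operation_allowed(operation):       # hypothetical per-call check
-        return operation in ('open', 'read')
-
-    class CrippledFile(object):
-        """Usable for type checks, but every unsafe call is checked."""
-
-        def __init__(self, path, mode='r'):
-            if not _operation_allowed('open'):
-                raise SandboxError("may not open files")
-            self._file = open(path, mode)
-
-        def read(self, size=-1):
-            if not _operation_allowed('read'):   # checked on *every* call
-                raise SandboxError("may not read")
-            return self._file.read(size)
-
-        def write(self, data):
-            if not _operation_allowed('write'):
-                raise SandboxError("may not write")
-            self._file.write(data)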
-
-
-Comparison of the Two Approaches
-================================
-
-From the perspective of Python, the two approaches differ on what would
-be the most difficult thing to analyze from a security standpoint: all
-of the ways to gain access to various types from a sandboxed
-interpreter with no imports, or finding all of the types that can lead
-to possibly dangerous actions and thus need to be crippled.
-
-Some Python developers, such as Armin Rigo, feel that truly hiding
-objects in Python is "quite hard" [#armin-hiding]_. This sentiment
-means that making a pure capabilities system in Python that is secure
-is not possible as people would continue to find new ways to get a hold
-of the reference to a protected resource.
-
-Others feel that by not going the capabilities route we will be
-constantly chasing down new types that require crippling. The thinking
-is that if we cannot control the references for 'file', how are we to
-know what other types might become exposed later on and thus require
-more crippling?
-
-It essentially comes down to what is harder to do: find all the ways to
-access the types in Python in a sandboxed interpreter with no imported
-modules, or to go through the Python code base and find all types that
-should be crippled?
-
-
-The 'rexec' Module
-///////////////////////////////////////
-
-The 'rexec' module [#rexec]_ was the original attempt at providing a
-sandbox environment for Python code to run in.  Its design was based
-on Safe-Tcl which was essentially a capabilities system [#safe-tcl]_.
-Safe-Tcl allowed you to launch a separate interpreter where its global
-functions were specified at creation time. This prevented one from
-having any abilities that were not explicitly provided.
-
-For 'rexec', the Safe-Tcl model was tweaked to better match Python's
-situation. An RExec object represented a sandboxed environment.
-Imports were checked against a whitelist of modules. You could also
-restrict the type of modules to import based on whether they were
-Python source, bytecode, or C extensions. Built-ins were allowed
-except for a blacklist of built-ins to not provide. One could restrict
-whether stdin, stdout, and stderr were provided or not on a per-RExec
-basis. Several other protections were provided; see documentation for
-the complete list.
-
-The ultimate undoing of the 'rexec' module was how it handled access
-to objects that require no imports to reach in normal Python.
-Importing modules requires a direct action, and thus can be protected
-against directly in the import machinery. But for built-ins, they are
-accessible by default and require no direct action to access in normal
-Python; you just use their name since they are provided in all
-namespaces.
-
-For instance, in a sandboxed interpreter, one only had to
-``del __builtins__`` to gain access to the full set of built-ins.
-Another way is through using the gc module:
-``gc.get_referrers(''.__class__.__bases__[0])[6]['file']``. While both
-of these could be fixed (the former was a bug in 'rexec' that was fixed
-and the latter could be handled by not allowing 'gc' to be imported),
-they are examples of things that do not require proactive actions on
-the part of the programmer in normal Python to gain access to a
-resource. This was an unfortunate side-effect of having all of that
-wonderful reflection in Python.
-
-There is also the issue that 'rexec' was written in Python which
-provides its own problems based on reflection and the ability to modify
-the code at run-time without security protection.
-
-Much has been learned since 'rexec' was written about how Python tends
-to be used and where security issues tend to appear. Essentially
-Python's dynamic nature does not lend itself very well to a security
-implementation that does not require a constant checking of
-permissions.
-
-
-Threat Model
-///////////////////////////////////////
-
-Below is a list of what the security implementation assumes, along with
-which section of this document addresses that part of the security
-model (if not already true in Python by default).  The term "bare",
-when used in regard to an interpreter, means an interpreter that has not
-performed a single import of a module. Also, all comments refer to a
-sandboxed interpreter unless otherwise explicitly stated.
-
-This list does not address specifics such as how 'file' will be
-protected or whether memory should be protected. This list is meant to
-make clear at a more basic level what the security model is assuming is
-true.
-
-* The Python interpreter itself is always trusted.
- + Implemented by code that runs at the process level performing any
- necessary security checks.
-* The Python interpreter cannot be crashed by valid Python source code
- in a bare interpreter.
-* Python source code is always considered safe.
-* Python bytecode is always considered dangerous [`Hostile Bytecode`_].
-* C extension modules are inherently considered dangerous.
- [`Extension Module Importation`_].
- + Explicit trust of a C extension module is possible.
-* Built-in modules are considered dangerous.
- + Explicit trust of a built-in module is possible.
-* Sandboxed interpreters running in the same process inherently cannot
- communicate with each other.
- + Communication through C extension modules is possible because of
- the technical need to share extension module instances between
- interpreters.
-* Sandboxed interpreters running in the same process inherently cannot
- share objects.
- + Sharing objects through C extension modules is possible because
- of the technical need to share extension module instances between
- interpreters.
-* When starting a sandboxed interpreter, it starts with a fresh
- built-in and global namespace that is not shared with the interpreter
- that started it.
-* Objects in the default built-in namespace should be safe to use
- [`Reading/Writing Files`_, `Stdin, Stdout, and Stderr`_].
- + Either hide the dangerous ones or cripple them so they can cause
- no harm.
-
-There are also some features that might be desirable, but are not being
-addressed by this security model.
-
-* Communication in any direction between an unprotected interpreter and
- a sandboxed interpreter it created.
-
-
-The Proposed Approach
-///////////////////////////////////////
-
-In light of where 'rexec' succeeded and failed along with what is known
-about the two main approaches to security and how Python tends to
-operate, the following is a proposal on how to secure Python for
-sandboxing.
-
-
-Implementation Details
-===============================
-
-Support for sandboxed interpreters will require a compilation flag.
-This allows the more common case of people not caring about protections
-to not take a performance hit. And even when Python is compiled for
-sandboxed interpreter restrictions, when the running interpreter *is*
-unprotected, there will be no accidental triggers of protections. This
-means that developers should be liberal with the security protections
-without worrying about there being issues for interpreters that do not
-need/want the protection.
-
-At the Python level, the __sandboxed__ built-in will be set based on
-whether the interpreter is sandboxed or not. This will be set for
-*all* interpreters, regardless of whether sandboxed interpreter support
-was compiled in or not.
-
-For setting what is to be protected, the PyThreadState for the
-sandboxed interpreter must be passed in. This makes the protection
-very explicit and helps make sure you set protections for the exact
-interpreter you mean to. All functions that set protections begin with
-the prefix ``PySandbox_Set*()``. These functions are meant to only
-work with sandboxed interpreters that have not been used yet to execute
-any Python code. The calls must be made by the code creating and
-handling the sandboxed interpreter *before* the sandboxed interpreter
-is used to execute any Python code.
-
-The functions for checking for permissions are actually macros that
-take in at least an error return value for the function calling the
-macro. This allows the macro to return on behalf of the caller if the
-check fails and cause the SandboxError exception to be propagated
-automatically. This helps eliminate any coding errors from incorrectly
-checking a return value on a rights-checking function call. For the
-rare case where this functionality is disliked, just make the check in
-a utility function and check that function's return value (but this is
-strongly discouraged!).
-
-Functions that check that an operation is allowed implicitly operate on
-the currently running interpreter as returned by
-``PyInterpreter_Get()`` and are to be used by any code (the
-interpreter, extension modules, etc.) that needs to check for
-permission to execute. They have the common prefix of
-``PySandbox_Allowed*()``.
-
-
-API
---------------
-
-* PyThreadState* PySandbox_NewInterpreter()
- Return a new interpreter that is considered sandboxed. There is no
- corresponding ``PySandbox_EndInterpreter()`` as
- ``Py_EndInterpreter()`` will be taught how to handle sandboxed
- interpreters. ``NULL`` is returned on error.
-
-* PySandbox_Allowed(error_return)
- Macro that has the caller return with 'error_return' if the
- interpreter is sandboxed, otherwise do nothing.
-
-
-Memory
-=============================
-
-Protection
---------------
-
-A memory cap will be allowed.
-
-Modification to pymalloc will be needed to properly keep track of the
-allocation and freeing of memory. Same goes for the macros around the
-system malloc/free system calls. This provides a platform-independent
-system for protection of memory instead of relying on the operating
-system to provide a service for capping memory usage of a process. It
-also allows the protection to be at the interpreter level instead of at
-the process level.
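-
-Presumably, from the sandboxed code's point of view, exceeding the cap
-set by the embedder would surface as the SandboxError raised by the
-allocation check described in the API below; a rough sketch of the
-observable behaviour (assuming SandboxError is reachable by that
-name)::
-
-    data = []
-    try:
-        while True:
-            data.append('x' * (1 << 20))   # keep allocating 1 MiB strings
-    except SandboxError:
-        data = None                        # the cap was hit; give it back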
-
-Existing APIs to protect:
-- _PyObject_New()
- protected directly
-- _PyObject_NewVar()
- protected directly
-- _PyObject_Del()
- remove macro that uses PyObject_Free() and protect directly
-- PyObject_New()
- implicitly by macro using _PyObject_New()
-- PyObject_NewVar()
- implicitly by macro using _PyObject_NewVar()
-- PyObject_Del()
- redefine macro to use _PyObject_Del() instead of PyObject_Free()
-- PyMem_Malloc()
- protected directly
-- PyMem_Realloc()
- protected directly
-- PyMem_Free()
- protected directly
-- PyMem_New()
- implicitly protected by macro using PyMem_Malloc()
-- PyMem_Resize()
- implicitly protected by macro using PyMem_Realloc()
-- PyMem_Del()
- implicitly protected by macro using PyMem_Free()
-- PyMem_MALLOC()
- redefine macro to use PyMem_Malloc()
-- PyMem_REALLOC()
- redefine macro to use PyMem_Realloc()
-- PyMem_FREE()
- redefine macro to use PyMem_Free()
-- PyMem_NEW()
- implicitly protected by macro using PyMem_MALLOC()
-- PyMem_RESIZE()
- implicitly protected by macro using PyMem_REALLOC()
-- PyMem_DEL()
- implicitly protected by macro using PyMem_FREE()
-- PyObject_Malloc()
- XXX
-- PyObject_Realloc()
- XXX
-- PyObject_Free()
- XXX
-- PyObject_MALLOC()
- XXX
-- PyObject_REALLOC()
- XXX
-- PyObject_FREE()
- XXX
-
-
-Why
---------------
-
-Protecting excessive memory usage allows one to make sure that a DoS
-attack against the system's memory is prevented.
-
-
-Possible Security Flaws
------------------------
-
-If code makes direct calls to malloc/free instead of using the proper
-``PyMem_*()``
-macros then the security check will be circumvented. But C code is
-*supposed* to use the proper macros or pymalloc and thus this issue is
-not with the security model but with code not following Python coding
-standards.
-
-
-API
---------------
-
-* int PySandbox_SetMemoryCap(PyThreadState *, integer)
- Set the memory cap for a sandboxed interpreter.  If the
- interpreter is not a sandboxed interpreter, return a false
- value.
-
-* PySandbox_AllowedMemoryAlloc(integer, error_return)
- Macro to increase the amount of memory reported as used by the
- running sandboxed interpreter.  If the increase puts the
- total count past the set limit or leads to integer overflow in
- the allocation count, raise a SandboxError exception
- and cause the calling function to return with the value of
- 'error_return', otherwise do nothing.
-
-* void PySandbox_AllowedMemoryFree(integer)
- Decrease the current running interpreter's allocated
- memory.  If this puts the memory used below 0, reset it to 0.
-
-
-Reading/Writing Files
-=============================
-
-Protection
---------------
-
-XXX
-
-To open a file, one will have to use open(). This will make open() a
-factory function that controls reference access to the 'file' type in
-terms of creating new instances. When an attempted file opening fails
-(either because the path does not exist or for security reasons),
-SandboxError will be raised.  The same exception must be raised in both
-cases to prevent filesystem information from being gleaned from the
-type of exception returned (e.g., returning IOError only when a path
-does not exist tells the user something about that file path).
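-
-For instance, in a sandboxed interpreter that was granted read access
-only to a single, purely illustrative path, the behaviour would look
-roughly like this::
-
-    f = open('/tmp/allowed.txt')    # explicitly allowed: succeeds
-    open('/etc/passwd')             # not allowed: raises SandboxError
-    open('/does/not/exist')         # nonexistent: also SandboxError, never
-                                    # IOError, so no filesystem layout leaks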
-
-What open() returns may not be an instance of 'file' but a proxy that
-provides the security measures needed. While this might break code
-that uses type checking to make sure a 'file' object is used, taking a
-duck typing approach would be better. This is not only more Pythonic
-but would also allow the code to use a StringIO instance.
-
-It has been suggested to allow for a passed-in callback to be called
-when a specific path is to be opened. While this provides good
-flexibility in terms of allowing custom proxies with more fine-grained
-security (e.g., capping the amount of disk write), this has been deemed
-unneeded in the initial security model and thus is not being considered
-at this time.
-
-Why
---------------
-
-Allowing anyone to be able to arbitrarily read, write, or learn about
-the layout of your filesystem is extremely dangerous. It can lead to
-loss of data or data being exposed to people who should not have
-access.
-
-
-Possible Security Flaws
------------------------
-
-XXX
-
-
-API
---------------
-
-* int PySandbox_SetAllowedFile(PyThreadState *, string path,
- string mode)
- Add a file that is allowed to be opened in 'mode' by the 'file'
- object. If the interpreter is not sandboxed then return a false
- value.
-
-* PySandbox_AllowedPath(string path, string mode, error_return)
- Macro that causes the caller to return with 'error_return' and
- raise SandboxError as the exception if the specified path with
- 'mode' is not allowed, otherwise do nothing.
-
-
-Extension Module Importation
-============================
-
-Protection
---------------
-
-A whitelist of extension modules that may be imported must be provided.
-A default set is given for stdlib modules known to be safe.
-
-A check in the import machinery will check that a specified module name
-is allowed based on the type of module (Python source, Python bytecode,
-or extension module). Python bytecode files are never directly
-imported because of the possibility of hostile bytecode being present.
-Python source is always considered safe based on the assumption that
-all resource harm is eventually done at the C level, thus Python source
-code directly cannot cause harm without help of C extension modules.
-Thus only C extension modules need to be checked against the whitelist.
-
-If the requested module is a C extension module, its name is checked
-against the whitelist.  If the module is not on the whitelist, a
-SandboxError exception is raised.  Otherwise the
-import is allowed.
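-
-A rough sketch of the intended behaviour (the extension module names
-here are purely hypothetical)::
-
-    import textwrap          # Python source: always considered safe
-    import _trustedcrypto    # C extension on the whitelist: allowed
-    import _fastparser       # C extension not on the whitelist: SandboxError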
-
-Even if a Python source code module imports a C extension module in an
-unprotected interpreter it is not a problem since the Python source
-code module is reloaded in the sandboxed interpreter. When that Python
-source module is freshly imported the normal import check will be
-triggered to prevent the C extension module from becoming available to
-the sandboxed interpreter.
-
-For the 'os' module, a special sandboxed version will be used if the
-proper C extension module providing the correct abilities is not
-allowed.  This will default to '/' as the path separator and provide as
-much reasonable functionality as possible from a pure Python module.
-
-The 'sys' module is specially addressed in
-`Changing the Behaviour of the Interpreter`_.
-
-By default, the whitelisted modules are:
-
-* XXX
-
-
-Why
---------------
-
-Because C code is considered unsafe, its use should be regulated. By
-using a whitelist it allows one to explicitly decide that a C extension
-module is considered safe.
-
-
-Possible Security Flaws
------------------------
-
-If a whitelisted C extension module imports a non-whitelisted C
-extension module and makes it an attribute of the whitelisted module
-there will be a breach in security.  Luckily this is a rarity in
-extension modules.
-
-There is also the issue of a C extension module calling the C API of a
-non-whitelisted C extension module.
-
-Lastly, if a whitelisted C extension module is loaded in an unprotected
-interpreter and then loaded into a sandboxed interpreter, there are
-no checks during module initialization for possible security issues in
-the sandboxed interpreter that would have occurred had the sandboxed
-interpreter done the initial import.
-
-All of these issues can be handled by never blindly whitelisting a C
-extension module. Added support for dealing with C extension modules
-comes in the form of `Extension Module Crippling`_.
-
-
-API
---------------
-
-* int PySandbox_SetModule(PyThreadState *, string module_name)
- Allow the sandboxed interpreter to import 'module_name'. If the
- interpreter is not sandboxed, return a false value. Absolute
- import paths must be specified.
-
-* int PySandbox_BlockModule(PyThreadState *, string module_name)
- Remove the specified module from the whitelist. Used to remove
- modules that are allowed by default. Return a false value if
- called on an unprotected interpreter.
-
-* PySandbox_AllowedModule(string module_name, error_return)
- Macro that causes the caller to return with 'error_return' and sets
- the exception SandboxError if the specified module cannot be
- imported, otherwise does nothing.
-
-
-Extension Module Crippling
-==========================
-
-Protection
---------------
-
-By providing a C API for checking for allowed abilities, modules that
-have some useful functionality can do proper security checks for those
-functions that could provide insecure abilities, while still allowing
-the safe code to be used (and thus not be denied importation entirely).
-
-
-Why
---------------
-
-Consider a module that provides a string processing ability. If that
-module provides a single convenience function that reads its input
-string from a file (with a specified path), the whole module should not
-be blocked from being used, just that convenience function. By
-whitelisting the module but having a security check on the one problem
-function, the user can still gain access to the safe functions. Even
-better, the unsafe function can be allowed if the security checks pass.
-
-
-Possible Security Flaws
------------------------
-
-If a C extension module developer incorrectly implements the security
-checks for the unsafe functions it could lead to undesired abilities.
-
-
-API
---------------
-
-Use PySandbox_Allowed() to protect unsafe code from being executed.
-
-
-Hostile Bytecode
-=============================
-
-Protection
---------------
-
-XXX
-
-
-Why
---------------
-
-Without implementing a bytecode verification tool, there is no way of
-making sure that bytecode does not jump outside its bounds, thus
-possibly executing malicious code. It also presents the possibility of
-crashing the interpreter.
-
-
-Possible Security Flaws
------------------------
-
-None known.
-
-
-API
---------------
-
-N/A
-
-
-Changing the Behaviour of the Interpreter
-=========================================
-
-Protection
---------------
-
-Only a subset of the 'sys' module will be made available to sandboxed
-interpreters. Things to allow from the sys module:
-
-* byteorder (?)
-* copyright
-* displayhook
-* excepthook
-* __displayhook__
-* __excepthook__
-* exc_info
-* exc_clear
-* exit
-* getdefaultencoding
-* _getframe (?)
-* hexversion
-* last_type
-* last_value
-* last_traceback
-* maxint (?)
-* maxunicode (?)
-* modules
-* stdin # See `Stdin, Stdout, and Stderr`_.
-* stdout
-* stderr
-* version
-
-
-Why
---------------
-
-Filesystem information must be removed. Any settings that could
-possibly lead to a DoS attack (e.g., sys.setrecursionlimit()) or risk
-crashing the interpreter must also be removed.
-
-
-Possible Security Flaws
------------------------
-
-Exposing something that could lead to future security problems (e.g., a
-way to crash the interpreter).
-
-
-API
---------------
-
-None.
-
-
-Socket Usage
-=============================
-
-Protection
---------------
-
-Allow sending and receiving data to/from specific IP addresses on
-specific ports.
-
-open() is to be used as a factory function to open a network
-connection. If the connection is not possible (either because of an
-invalid address or security reasons), SandboxError is raised.
-
-A socket object may not be returned by the call. A proxy to handle
-security might be returned instead.
-
-XXX
-
-
-Why
---------------
-
-Allowing arbitrary sending of data over sockets can lead to DoS attacks
-on the network and other machines. Limiting accepting data prevents
-your machine from being attacked by accepting malicious network
-connections. It also allows you to know exactly where communication is
-going to and coming from.
-
-
-Possible Security Flaws
------------------------
-
-Someone could compromise the DNS server in use so as to control what
-IP addresses are returned by a DNS lookup for an allowed host.
-
-
-API
---------------
-
-* int PySandbox_SetIPAddress(PyThreadState *, string IP, integer port)
- Allow the sandboxed interpreter to send/receive to the specified
- 'IP' address on the specified 'port'. If the interpreter is not
- sandboxed, return a false value.
-
-* PySandbox_AllowedIPAddress(string IP, integer port, error_return)
- Macro to verify that the specified 'IP' address on the specified
- 'port' is allowed to be communicated with. If not, cause the
- caller to return with 'error_return' and SandboxError exception
- set, otherwise do nothing.
-
-* int PySandbox_SetHost(PyThreadState *, string host, integer port)
- Allow the sandboxed interpreter to send/receive to the specified
- 'host' on the specified 'port'. If the interpreter is not
- sandboxed, return a false value.
-
-* PySandbox_AllowedHost(string host, integer port, error_return)
- Check that the specified 'host' on the specified 'port' is allowed
- to be communicated with. If not, set a SandboxError exception and
- cause the caller to return 'error_return', otherwise do nothing.
-
-
-Network Information
-=============================
-
-Protection
---------------
-
-Limit what information can be gleaned about the network the system is
-running on. This does not include restricting information on IP
-addresses and hosts that have been explicitly allowed for the
-sandboxed interpreter to communicate with.
-
-XXX
-
-
-Why
---------------
-
-With enough information from the network several things could occur.
-One is that someone could possibly figure out where your machine is on
-the Internet. Another is that enough information about the network you
-are connected to could be used against it in an attack.
-
-
-Possible Security Flaws
------------------------
-
-As long as usage is restricted to only what is needed to work with
-allowed addresses, there are no security issues to speak of.
-
-
-API
---------------
-
-* int PySandbox_SetNetworkInfo(PyThreadState *)
- Allow the sandboxed interpreter to get network information
- regardless of whether the IP or host address is explicitly allowed.
- If the interpreter is not sandboxed, return a false value.
-
-* PySandbox_AllowedNetworkInfo(error_return)
- Macro that will return 'error_return' for the caller and set a
- SandboxError exception if the sandboxed interpreter does not allow
- checking for arbitrary network information, otherwise do nothing.
-
-
-Filesystem Information
-=============================
-
-Protection
---------------
-
-Do not allow information about the filesystem layout from various parts
-of Python to be exposed. This means blocking exposure at the Python
-level to:
-
-* __file__ attribute on modules
-* __path__ attribute on packages
-* co_filename attribute on code objects
-* XXX
-
-
-Why
---------------
-
-Exposing information about the filesystem can reveal what operating
-system the interpreter is running on, which can lead to
-vulnerabilities specific to that operating system being exploited.
-
-
-Possible Security Flaws
------------------------
-
-Not finding every single place where a file path is exposed.
-
-
-API
---------------
-
-* int PySandbox_SetFilesystemInfo(PyThreadState *)
- Allow the sandboxed interpreter to expose filesystem information.
- If the passed-in interpreter is not sandboxed, return a false value.
-
-* PySandbox_AllowedFilesystemInfo(error_return)
- Macro that checks if exposing filesystem information is allowed.
- If it is not, cause the caller to return with the value of
- 'error_return' and raise SandboxError, otherwise do nothing.
-
-
-Stdin, Stdout, and Stderr
-=============================
-
-Protection
---------------
-
-By default, sys.__stdin__, sys.__stdout__, and sys.__stderr__ will be
-set to instances of StringIO. Explicit allowance of the process'
-stdin, stdout, and stderr is possible.
-
-This will protect the 'print' statement, and the built-ins input() and
-raw_input().
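-
-For instance, assuming sys.stdout refers to the same StringIO instance,
-sandboxed code that prints would only ever write into that buffer::
-
-    import sys
-
-    print "Is anybody out there?"
-    # Nothing reached the real terminal; the embedding application can
-    # inspect the buffer if it chooses to.
-    captured = sys.stdout.getvalue()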
-
-
-Why
---------------
-
-Interference with stdin, stdout, or stderr should not be allowed unless
-desired. No one wants uncontrolled output sent to their screen.
-
-
-Possible Security Flaws
------------------------
-
-Unless StringIO instances can be used maliciously, none to speak of.
-
-
-API
---------------
-
-* int PySandbox_SetTrueStdin(PyThreadState *)
- int PySandbox_SetTrueStdout(PyThreadState *)
- int PySandbox_SetTrueStderr(PyThreadState *)
- Set the specific stream for the interpreter to the true version of
- the stream and not to the default instance of StringIO. If the
- interpreter is not sandboxed, return a false value.
-
-
-Adding New Protections
-=============================
-
-.. note:: This feature has the lowest priority and thus will be the
- last feature implemented (if ever).
-
-Protection
---------------
-
-Allow for extensibility in the security model by being able to add new
-types of checks. This allows not only for Python to add new security
-protections in a backwards-compatible fashion, but to also have
-extension modules add their own as well.
-
-An extension module can introduce a group for its various values to
-check, with a type being a specific value within a group. The "Python"
-group is specifically reserved for use by the Python core itself.
-
-
-Why
---------------
-
-We are all human. There is the possibility that a need for a new type
-of protection for the interpreter will present itself and thus need
-support. By providing an extensible way to add new protections it
-helps to future-proof the system.
-
-It also allows extension modules to present their own set of security
-protections. That way one extension module can use the protection
-scheme presented by another that it is dependent upon.
-
-
-Possible Security Flaws
-------------------------
-
-Poor definitions by extension module authors of how their protections
-should be used would allow for possible exploitation.
-
-
-API
---------------
-
-+ Bool
- * int PySandbox_SetExtendedFlag(PyThreadState *, string group,
- string type)
- Set a group-type to be true. Expected use is for when a binary
- possibility of something is needed and the default is to
- not allow use of the resource (e.g., network information).
- Returns a false value if used on an unprotected interpreter.
-
- * PySandbox_AllowedExtendedFlag(string group, string type,
- error_return)
- Macro that, if the group-type is not set to true, causes the
- caller to return with 'error_return' and a SandboxError
- exception raised.  For unprotected interpreters the check does
- nothing.
-
-+ Numeric Range
- * int PySandbox_SetExtendedCap(PyThreadState *, string group,
- string type, integer cap)
- Set a group-type to a capped value, 'cap', with the initial
- allocated value set to 0. Expected use is when a resource has
- a capped amount of use (e.g., memory). Returns a false value
- if the interpreter is not sandboxed.
-
- * PySandbox_AllowedExtendedAlloc(integer increase, error_return)
- Macro to raise the amount of a resource that is used by 'increase'.
- If the increase pushes the resource allocation past the set
- cap, then return 'error_return' and set SandboxError as the
- exception, otherwise do nothing.
-
- * PySandbox_AllowedExtendedFree(integer decrease, error_return)
- Macro to lower the amount that a resource is used by 'decrease'.  If
- the decrease pushes the allotment to below 0 then have the
- caller return 'error_return' and set SandboxError as the
- exception, otherwise do nothing.
-
-
-+ Membership
- * int PySandbox_SetExtendedMembership(PyThreadState *,
- string group, string type,
- string member)
- Add a string, 'member', to be considered a member of a
- group-type (e.g., allowed file paths). If the interpreter is not
- a sandboxed interpreter, return a false value.
-
- * PySandbox_AllowedExtendedMembership(string group, string type,
- string member,
- error_return)
- Macro that checks 'member' is a member of the values set for
- the group-type. If it is not, then have the caller return
- 'error_return' and set an exception for SandboxError, otherwise
- does nothing.
-
-+ Specific Value
- * int PySandbox_SetExtendedValue(PyThreadState *, string group,
- string type, string value)
- Set a group-type to 'value'. If the interpreter is not
- sandboxed, return a false value.
-
- * PySandbox_AllowedExtendedValue(string group, string type,
- string value, error_return)
- Macro to check that the group-type is set to 'value'. If it is
- not, then have the caller return 'error_return' and set an
- exception for SandboxError, otherwise do nothing.
-
-
-Python API
-=============================
-
-__sandboxed__
---------------
-
-A built-in that flags whether the currently running interpreter is
-sandboxed or not.  It is set to a read-only 'bool' value, mimicking
-the working of __debug__.
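-
-For instance, code that wants to degrade gracefully could check it
-directly (a purely illustrative sketch)::
-
-    if __sandboxed__:
-        log_file = None                 # stay quiet under restrictions
-    else:
-        log_file = open('run.log', 'w')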
-
-
-sandbox module
---------------
-
-XXX
-
-
-References
-///////////////////////////////////////
-
-.. [#rexec] The 'rexec' module
- (http://docs.python.org/lib/module-rexec.html)
-
-.. [#safe-tcl] The Safe-Tcl Security Model
- (http://research.sun.com/technical-reports/1997/abstract-60.html)
-
-.. [#ctypes] 'ctypes' module
- (http://docs.python.org/dev/lib/module-ctypes.html)
-
-.. [#paradigm regained] "Paradigm Regained:
- Abstraction Mechanisms for Access Control"
- (http://erights.org/talks/asian03/paradigm-revised.pdf)
-
-.. [#armin-hiding] [Python-Dev] what can we do to hide the 'file' type?
- (http://mail.python.org/pipermail/python-dev/2006-July/067076.html)
Copied: python/branches/bcannon-sandboxing/securing_python.txt (from r50656, python/branches/bcannon-sandboxing/sandboxing_design_doc.txt)
==============================================================================
--- python/branches/bcannon-sandboxing/sandboxing_design_doc.txt (original)
+++ python/branches/bcannon-sandboxing/securing_python.txt Wed Jul 19 03:52:10 2006
@@ -1,1195 +1,467 @@
-Restricted Execution for Python
-#######################################
+Securing Python
+#####################################################################
-About This Document
-=============================
-
-This document is meant to lay out the general design for re-introducing
-a sandboxing model for Python. This document should provide one with
-enough information to understand the goals for sandboxing, what
-considerations were made for the design, and the actual design itself.
-Design decisions should be clear and explain not only why they were
-chosen but possible drawbacks from taking a specific approach.
-
-If any of the above is found not to be true, please email me at
-brett at python.org and let me know what problems you are having with the
-document.
-
-
-XXX TO DO
-=============================
-
-Design
---------------
-
-* threading needs protection?
-* python-dev convince me that hiding 'file' possible?
- + based on that, handle code objects
- + also decide how to handle sockets
- + perhaps go with crippling but try best effort on hiding reference and if
- best effort holds up eventually shift over to capabilities system
-* resolve to IP at call time to prevent DNS man-in-the-middle attacks when
- allowing a specific host name?
-* what network info functions are allowed by default?
-* does the object.__subclasses__() trick work across interpreters, or is it
- unique per interpreter?
-* figure out default whitelist of extension modules
-* check default accessible objects for file path exposure
-* helper functions to get at StringIO instances for stdin, stdout, and friends?
-* decide on what type of objects (e.g., PyStringObject or const char *) are to
- be passed in
-* all built-ins properly protected?
-* exactly how to tell whether argument to open() is a path, IP, or host name
- (third argument, 'n' prefix for networking, format of path, ...)
-* API at the Python level
-* for extension module protection, allow for wildcard allowance
- (e.g., ``xml.*``)
-
-
-Implementation
---------------
-
-* add __sandbox__
-* merge from HEAD
- + last merge on rev. 47248
-* remove bare malloc()/realloc()/free() uses
- + also watch out for PyObject_Malloc()/PyObject_MALLOC() calls
-* note in SpecialBuilds.txt
-
-
-Goal
-=============================
-
-A good sandboxing model provides enough protection to prevent
-malicious harm from coming to the system, and no more.  Barriers
-should be minimized so as to allow most code that does not do anything
-that would be regarded as harmful to run unmodified.  But the
-protections need to be thorough enough to prevent any unintended
-changes to the system or disclosure of information about it.
-
-An important point to take into consideration when reading this
-document is to realize it is part of my (Brett Cannon's) Ph.D.
-dissertation. This means it is heavily geared toward sandboxing when
-the interpreter is working with Python code embedded in a web page as
-viewed in Firefox. While great strides have been taken to keep the
-design general enough so as to allow all previous uses of the 'rexec'
-module [#rexec]_ to be able to use the new design, it is not the
-focused goal. This means if a design decision must be made for the
-embedded use case compared to sandboxing Python code in a pure Python
-application, the former will win out over the latter.
-
-Throughout this document, the term "resource" is used to represent
-anything that deserves possible protection. This includes things that
-have a physical representation (e.g., memory) to things that are more
-abstract and specific to the interpreter (e.g., sys.path).
-
-When referring to the state of an interpreter, it is either
-"unprotected" or "sandboxed". A unprotected interpreter has no
-restrictions imposed upon any resource. A sandboxed interpreter has at
-least one, possibly more, resource with restrictions placed upon it to
-prevent unsafe code that is running within the interpreter from
-causing harm to the system.
-
-
-.. contents::
-
-
-Use Cases
-/////////////////////////////
-
-All use cases are based on how many sandboxed interpreters are running
-in a single process and whether an unprotected interpreter is also
-running. The use cases can be broken down into two categories: when
-the interpreter is embedded and only using sandboxed interpreters, and
-when pure Python code is running in an unprotected interpreter and uses
-sandboxed interpreters.
-
-
-When the Interpreter Is Embedded
-================================
-
-Single Sandboxed Interpreter
-----------------------------
-
-This use case is when an application embeds the interpreter and never
-has more than one interpreter running which happens to be sandboxed.
-
-
-Multiple Sandboxed Interpreters
--------------------------------
-
-When multiple interpreters, all sandboxed at varying levels, need to be
-running within a single application. This is the key use case that
-this proposed design is targeted for.
-
-
-Stand-Alone Python
-=============================
-
-When someone has written a Python program that wants to execute Python
-code in a sandboxed interpreter(s).  This is the use case that 'rexec'
-attempted to fulfill.
-
-
-Issues to Consider
-=============================
-
-Common to all use cases, resources that the interpreter requires to
-function at a level below user code cannot be exposed to a sandboxed
-interpreter. For instance, the interpreter might need to stat a file
-to see if it is possible to import. If the ability to stat a file is
-not allowed to a sandboxed interpreter, it should not be allowed to
-perform that action, regardless of whether the interpreter at a level
-below user code needs that ability.
-
-When multiple interpreters are involved (sandboxed or not), not
-allowing an interpreter to gain access to resources available in other
-interpreters without explicit permission must be enforced.
-
-
-Resources to Protect
-/////////////////////////////
-
-It is important to make sure that the proper resources are protected
-from a sandboxed interpreter. If you don't, there is no point to sandboxing.
-
-Filesystem
-===================
-
-All facets of the filesystem must be protected. This means restricting
-reading and writing to the filesystem (e.g., files, directories, etc.).
-Such access should be allowed in controlled situations where it is
-desirable, but that should be an explicit allowance.
-
-There must also be protection to prevent revealing any information
-about the filesystem. Disclosing information on the filesystem could
-allow one to infer what OS the interpreter is running on, for instance.
-
-
-Memory
-===================
-
-Memory should be protected. It is a limited resource on the system
-that can have an impact on other running programs if it is exhausted.
-Being able to restrict the use of memory would help alleviate issues
-from denial-of-service (DoS) attacks on the system.
-
-
-Networking
-===================
-
-Networking is somewhat like the filesystem in terms of wanting similar
-protections. You do not want to let unsafe code make socket
-connections unhindered or accept them to do possibly nefarious things.
-You also want to prevent finding out information about the network you
-are connected to.
-
-
-Interpreter
-===================
-
-One must make sure that the interpreter is not harmed in any way from
-sandboxed code. This usually takes the form of crashing the program
-that the interpreter is embedded in or the unprotected interpreter that
-started the sandboxed interpreter.  Executing hostile bytecode that might
-lead to undesirable effects is another possible issue.
-
-There is also the issue of taking it over.  One should not be able to gain
-escalated privileges in any way without explicit permission.
-
-
-Types of Security
+Introduction
///////////////////////////////////////
-As with most things, there are multiple approaches one can take to
-tackle a problem. Security is no exception. In general there seem to
-be two approaches to protecting resources.
-
-
-Resource Hiding
-=============================
-
-By never giving code a chance to access a resource, you prevent it from
-being (ab)used. This is the idea behind resource hiding; you can't
-misuse something you don't have in the first place.
-
-The most common implementation of resource hiding is capabilities. In
-this type of system a resource's reference acts as a ticket that
-represents the right to use the resource. Once code has a reference it
-is considered to have full use of the resource that reference represents
-and no further security checks are directly performed (using delegates
-and other structured ways one can actually have a security check for
-each access of a resource, but this is not a default behaviour).
-
-As an example, consider the 'file' type as a resource we want to
-protect. That would mean that we did not want a reference to the
-'file' type to ever be accessible without explicit permission. If one
-wanted to provide read-only access to a temp file, you could have
-open() perform a check on the permissions of the current interpreter,
-and if it is allowed to, return a proxy object for the file that only
-allows reading from it. The 'file' instance for the proxy would need
-to be properly hidden so that the reference was not reachable from the
-outside, keeping 'file' access under control.
-
-Python, as it stands now, unfortunately does not work well for a pure
-capabilities system. Capabilities require the prohibition of certain
-abilities, such as "direct access to another's private state"
-[#paradigm regained]_. This obviously is not possible in Python since,
-at least at the Python level, there is no such thing as private state
-that is persistent (one could argue that local variables that are not
-cell variables for lexical scopes are private, but since they do not
-survive after a function call they are not usable for keeping
-persistent state). One can hide references at the C level by storing
-them in the struct for the instance of a type and not providing a
-function to access that attribute.
-
-Python's introspection abilities also do not help make implementing
-capabilities that much easier. Consider how one could access 'file'
-even when it is deleted from __builtin__. You can still get to the
-reference for 'file' through the sequence returned by
-``object.__subclasses__()``.
-
-
-Resource Crippling
-=============================
-
-Another approach to security is to not worry about controlling access
-to the reference of a resource. One can have a resource perform a
-security check every time someone tries to use a method on that
-resource. This pushes the security check to a lower level; from a
-reference level to the method level.
-
-By performing the security check every time a resource's method is
-called the worry of a specific resource's reference leaking out to
-insecure code is alleviated. This does add extra overhead, though, by
-having to do so many security checks. It also does not handle the
-situation where an unexpected exposure of a type occurs that has not
-been properly crippled.
-
-FreeBSD's jail system provides a protection scheme similar to this.
-Various system calls allow for basic usage, but knowing or having
-access to the system call is not enough to grant usage. Every call to
-a system call requires checking that the proper rights have been
-granted to the user in order to allow for the system call to perform
-its action.
-
-An even better example in FreeBSD's jail system is its protection of
-sockets.  One can only bind a single IP address to a jail.  Any attempt
-to bind more addresses, or to use an address other than the one
-granted, is prevented.  The check is performed at every call involving
-the granted IP address.
-
-Using 'file' as the example again, one could cripple the type so that
-instantiation is not possible for the type in Python. One could also
-provide a permission check on each call to an unsafe method and
-thus allow the type to be used in normal situations (such as type
-checking), but still feel safe that illegal operations are not
-performed. Regardless of which approach you take, you do not need to
-worry about a reference to the type being exposed unexpectedly since
-the reference is not the security check but the actual method calls.
-
-
-Comparison of the Two Approaches
-================================
-
-From the perspective of Python, the two approaches differ on what would
-be the most difficult thing to analyze from a security standpoint: all
-of the ways to gain access to various types from a sandboxed
-interpreter with no imports, or finding all of the types that can lead
-to possibly dangerous actions and thus need to be crippled.
-
-Some Python developers, such as Armin Rigo, feel that truly hiding
-objects in Python is "quite hard" [#armin-hiding]_. This sentiment
-means that making a pure capabilities system in Python that is secure
-is not possible as people would continue to find new ways to get a hold
-of the reference to a protected resource.
-
-Others feel that by not going the capabilities route we will be
-constantly chasing down new types that require crippling. The thinking
-is that if we cannot control the references for 'file', how are we to
-know what other types might become exposed later on and thus require
-more crippling?
-
-It essentially comes down to what is harder to do: find all the ways to
-access the types in Python in a sandboxed interpreter with no imported
-modules, or to go through the Python code base and find all types that
-should be crippled?
+As of Python 2.5, Python does not support any form of security model
+for executing arbitrary Python code in some form of protected
+interpreter.  While one can use such things as ``exec`` and ``eval``
+to garner a very weak form of sandboxing, they do not provide any
+thorough protection from malicious code.
+
+This should be rectified. This document attempts to lay out what
+would be needed to secure Python in such a way as to allow arbitrary
+Python code to execute in a sandboxed interpreter without worries of
+that interpreter providing access to any resource of the operating
+system without being given explicit authority to do so.
+
+Throughout this document several terms are going to be used. A
+"sandboxed interpreter" is one where the built-in namespace is not the
+same as that of an interpreter whose built-ins were unaltered, which
+is called an "unprotected interpreter".
+
+A "bare interpreter" is one where the built-in namespace has been
+stripped down to the bare minimum needed to run any form of basic Python
+program. This means that all atomic types (i.e., syntactically
+supported types), ``object``, and the exceptions provided by the
+``exceptions`` module are considered in the built-in namespace. There
+have also been no imports executed in the interpreter.
-The 'rexec' Module
+Rationale
///////////////////////////////////////
-The 'rexec' module [#rexec]_ was the original attempt at providing a
-sandbox environment for Python code to run in. It's design was based
-on Safe-Tcl which was essentially a capabilities system [#safe-tcl]_.
-Safe-Tcl allowed you to launch a separate interpreter where its global
-functions were specified at creation time. This prevented one from
-having any abilities that were not explicitly provided.
-
-For 'rexec', the Safe-Tcl model was tweaked to better match Python's
-situation. An RExec object represented a sandboxed environment.
-Imports were checked against a whitelist of modules. You could also
-restrict the type of modules to import based on whether they were
-Python source, bytecode, or C extensions. Built-ins were allowed
-except for a blacklist of built-ins to not provide. One could restrict
-whether stdin, stdout, and stderr were provided or not on a per-RExec
-basis. Several other protections were provided; see documentation for
-the complete list.
-
-The ultimate undoing of the 'rexec' module was how it handled access
-to objects that require no imports to reach in normal Python.
-Importing modules requires a direct action, and thus can be protected
-against directly in the import machinery. But for built-ins, they are
-accessible by default and require no direct action to access in normal
-Python; you just use their name since they are provided in all
-namespaces.
-
-For instance, in a sandboxed interpreter, one only had to
-``del __builtins__`` to gain access to the full set of built-ins.
-Another way is through using the gc module:
-``gc.get_referrers(''.__class__.__bases__[0])[6]['file']``. While both
-of these could be fixed (the former was a bug in 'rexec' that was fixed
-and the latter could be handled by not allowing 'gc' to be imported),
-they are examples of things that do not require proactive actions on
-the part of the programmer in normal Python to gain access to a
-resource. This was an unfortunate side-effect of having all of that
-wonderful reflection in Python.
-
-There is also the issue that 'rexec' was written in Python which
-provides its own problems based on reflection and the ability to modify
-the code at run-time without security protection.
-
-Much has been learned since 'rexec' was written about how Python tends
-to be used and where security issues tend to appear. Essentially
-Python's dynamic nature does not lend itself very well to a security
-implementation that does not require a constant checking of
-permissions.
+Python is used extensively as an embedded language within existing
+programs.  These applications oftentimes need to provide the ability
+for users to run Python code written by someone else while trusting
+that no unintended harm will come to their system, regardless of how
+much they trust the code they are executing.
+
+For instance, think of an application that supports a plug-in system
+with Python as the language used for writing plug-ins. You do not
+want to have to examine every plug-in you download to make sure that
+it does not alter your filesystem if you can help it. With a proper
+security model and implementation in place, this hindrance of having
+to examine all code you execute should be alleviated.
-Threat Model
+Approaches to Security
///////////////////////////////////////
-Below is a list of what the security implementation assumes, along with
-which section of this document addresses that part of the security
-model (if not already true in Python by default).  The term "bare",
-when used in regard to an interpreter, means an interpreter that has not
-performed a single import of a module. Also, all comments refer to a
-sandboxed interpreter unless otherwise explicitly stated.
-
-This list does not address specifics such as how 'file' will be
-protected or whether memory should be protected. This list is meant to
-make clear at a more basic level what the security model is assuming is
-true.
-
-* The Python interpreter itself is always trusted.
- + Implemented by code that runs at the process level performing any
- necessary security checks.
-* The Python interpreter cannot be crashed by valid Python source code
- in a bare interpreter.
-* Python source code is always considered safe.
-* Python bytecode is always considered dangerous [`Hostile Bytecode`_].
-* C extension modules are inherently considered dangerous.
- [`Extension Module Importation`_].
- + Explicit trust of a C extension module is possible.
-* Built-in modules are considered dangerous.
- + Explicit trust of a built-in module is possible.
-* Sandboxed interpreters running in the same process inherently cannot
- communicate with each other.
- + Communication through C extension modules is possible because of
- the technical need to share extension module instances between
- interpreters.
-* Sandboxed interpreters running in the same process inherently cannot
- share objects.
- + Sharing objects through C extension modules is possible because
- of the technical need to share extension module instances between
- interpreters.
-* When starting a sandboxed interpreter, it starts with a fresh
- built-in and global namespace that is not shared with the interpreter
- that started it.
-* Objects in the default built-in namespace should be safe to use
- [`Reading/Writing Files`_, `Stdin, Stdout, and Stderr`_].
- + Either hide the dangerous ones or cripple them so they can cause
- no harm.
+There are essentially two types of security: who-I-am
+(permissions-based) security and what-I-have (authority-based)
+security.
+
+Who-I-Am Security
+========================
+
+With who-I-am security (a.k.a., permissions-based security), the
+ability to use a resource requires providing who you are, validating
+you are allowed to access the resource you are requesting, and then
+performing the requested action on the resource.
+
+The ACL security system on most UNIX filesystems is who-I-am security.
+When you want to open a file, say ``/etc/passwd``, you make the
+function call to open the file.  Within that function, it fetches
+the ACL for the file, finds out who the caller is, checks to see if
+the caller is on the ACL for opening the file, and then proceeds to
+either deny access or return an open file object.
+
+
+What-I-Have Security
+========================
+
+In contrast to who-I-am security, what-I-have security never requires
+knowing who is requesting a resource. By never providing a function
+to access a resource or by creating a proxy that wraps the function to
+access a resource with argument checking, you can skip the need to
+know who is making a call.
+
+Using our file example, the program trying to open a file is given a
+proxy that checks whether a path passed into the function matches the
+paths allowed at the creation time of the proxy, before using the
+full-featured open function to open the file.
+
+This illustrates a subtle, but key difference between who-I-am and
+what-I-have security. For who-I-am, you must know who the caller is
+and check that the arguments are valid for the person calling. For
+what-I-have security, you only have to validate the arguments.
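+
+A small Python sketch of the difference (``current_user()``,
+``acl_allows()``, and ``_real_open`` are hypothetical placeholders)::
+
+    # who-I-am: the function itself decides based on the caller's identity.
+    def acl_open(path, mode='r'):
+        user = current_user()
+        if not acl_allows(user, path, mode):
+            raise IOError("permission denied")
+        return _real_open(path, mode)
+
+    # what-I-have: no identity involved; holding the proxy *is* the right.
+    def make_open_proxy(allowed_paths):
+        def proxy_open(path, mode='r'):
+            if path not in allowed_paths:
+                raise IOError("path not granted to this proxy")
+            return _real_open(path, mode)
+        return proxy_open
+
+    open_scratch = make_open_proxy(['/tmp/scratch.txt'])
+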
-There are also some features that might be desirable, but are not being
-addressed by this security model.
-* Communication in any direction between an unprotected interpreter and
- a sandboxed interpreter it created.
-
-
-The Proposed Approach
+Object-Capabilities
///////////////////////////////////////
-In light of where 'rexec' succeeded and failed along with what is known
-about the two main approaches to security and how Python tends to
-operate, the following is a proposal on how to secure Python for
-sandboxing.
+What-I-have security is more often called the object-capabilities
+security model. The belief here is in POLA (Principle Of Least
+Authority): you give a program exactly what it needs, and no more. By
+providing a function that can open any file that relies on identity to
+decide if to open something, you are still providing a fully capable
+function that just requires faking one's identity to circumvent
+security. It also means that if you accidentally run code that
+performs actions that you did not expect (e.g., deleting all your
+files), there is no way to stop it since it operates with *your*
+permissions.
+Using POLA and object-capabilities, you only give access to resources
+to the extent that someone needs. This means if a program only needs
+access to a single file, you only give them a function that can open
+that single file. If you accidentally run code that tries to delete
+all of your files, it can only delete the one file you authorized the
+program to open.
+
+Object-capabilities use the reference graph of objects to provide the
+security of accessing resources. If you do not have a reference to a
+resource (or a reference to an object that can references a resource),
+you cannot access it, period. You can provide conditional access by
+using a proxy between code and a resource, but that still requires a
+reference to the resource by the proxy.
+
+This leads to a much cleaner implementation of security. By not
+having to change internal code in the interpreter to perform identity
+checks, you can instead shift the burden of security to proxies
+which are much more flexible and have less of an adverse affect on the
+interpreter directly (assuming you have the basic requirements for
+object-capabilities met).
+
+
+Difficulties in Python for Object-Capabilities
+//////////////////////////////////////////////
+
+In order to provide the proper protection of references that
+object-capabilities require, you must set up a secure perimeter
+defense around your security domain. The domain can be anthing:
+objects, interpreters, processes, etc. The point is that the domain
+is where you draw the line for allowing arbitrary access to resources.
+This means that with the interpreter is the security domain, then
+anything within an interpreter can be expected to be freely shared,
+but beyond that, reference access is strictly controlled.
+
+Three key requirements for providing a proper perimeter defence is
+private namespaces, immutable shared state across domains, and
+unforgeable references. Unfortunately Python only has one of the
+three requirements by default (you cannot forge a reference in Python
+code).
-Implementation Details
+
+Problem of No Private Namespace
===============================
-Support for sandboxed interpreters will require a compilation flag.
-This allows the more common case of people not caring about protections
-to not take a performance hit. And even when Python is compiled for
-sandboxed interpreter restrictions, when the running interpreter *is*
-unprotected, there will be no accidental triggers of protections. This
-means that developers should be liberal with the security protections
-without worrying about there being issues for interpreters that do not
-need/want the protection.
-
-At the Python level, the __sandboxed__ built-in will be set based on
-whether the interpreter is sandboxed or not. This will be set for
-*all* interpreters, regardless of whether sandboxed interpreter support
-was compiled in or not.
-
-For setting what is to be protected, the PyThreadState for the
-sandboxed interpreter must be passed in. This makes the protection
-very explicit and helps make sure you set protections for the exact
-interpreter you mean to. All functions that set protections begin with
-the prefix ``PySandbox_Set*()``. These functions are meant to only
-work with sandboxed interpreters that have not been used yet to execute
-any Python code. The calls must be made by the code creating and
-handling the sandboxed interpreter *before* the sandboxed interpreter
-is used to execute any Python code.
-
-The functions for checking for permissions are actually macros that
-take in at least an error return value for the function calling the
-macro. This allows the macro to return on behalf of the caller if the
-check fails and cause the SandboxError exception to be propagated
-automatically. This helps eliminate any coding errors from incorrectly
-checking a return value on a rights-checking function call. For the
-rare case where this functionality is disliked, just make the check in
-a utility function and check that function's return value (but this is
-strongly discouraged!).
-
-Functions that check that an operation is allowed implicitly operate on
-the currently running interpreter as returned by
-``PyInterpreter_Get()`` and are to be used by any code (the
-interpreter, extension modules, etc.) that needs to check for
-permission to execute. They have the common prefix of
-`PySandbox_Allowed*()``.
-
-
-API
---------------
-
-* PyThreadState* PySandbox_NewInterpreter()
- Return a new interpreter that is considered sandboxed. There is no
- corresponding ``PySandbox_EndInterpreter()`` as
- ``Py_EndInterpreter()`` will be taught how to handle sandboxed
- interpreters. ``NULL`` is returned on error.
-
-* PySandbox_Allowed(error_return)
- Macro that has the caller return with 'error_return' if the
- interpreter is unprotected, otherwise do nothing.
-
-
-Memory
-=============================
-
-Protection
---------------
-
-A memory cap will be allowed.
-
-Modification to pymalloc will be needed to properly keep track of the
-allocation and freeing of memory. Same goes for the macros around the
-system malloc/free system calls. This provides a platform-independent
-system for protection of memory instead of relying on the operating
-system to provide a service for capping memory usage of a process. It
-also allows the protection to be at the interpreter level instead of at
-the process level.
-
-Existing APIs to protect:
-- _PyObject_New()
- protected directly
-- _PyObject_NewVar()
- protected directly
-- _PyObject_Del()
- remove macro that uses PyObject_Free() and protect directly
-- PyObject_New()
- implicitly by macro using _PyObject_New()
-- PyObject_NewVar()
- implicitly by macro using _PyObject_NewVar()
-- PyObject_Del()
- redefine macro to use _PyObject_Del() instead of PyObject_Free()
-- PyMem_Malloc()
- protected directly
-- PyMem_Realloc()
- protected directly
-- PyMem_Free()
- protected directly
-- PyMem_New()
- implicitly protected by macro using PyMem_Malloc()
-- PyMem_Resize()
- implicitly protected by macro using PyMem_Realloc()
-- PyMem_Del()
- implicitly protected by macro using PyMem_Free()
-- PyMem_MALLOC()
- redefine macro to use PyMem_Malloc()
-- PyMem_REALLOC()
- redefine macro to use PyMem_Realloc()
-- PyMem_FREE()
- redefine macro to use PyMem_Free()
-- PyMem_NEW()
- implicitly protected by macro using PyMem_MALLOC()
-- PyMem_RESIZE()
- implicitly protected by macro using PyMem_REALLOC()
-- PyMem_DEL()
- implicitly protected by macro using PyMem_FREE()
-- PyObject_Malloc()
- XXX
-- PyObject_Realloc()
- XXX
-- PyObject_Free()
- XXX
-- PyObject_MALLOC()
- XXX
-- PyObject_REALLOC()
- XXX
-- PyObject_FREE()
- XXX
-
-
-Why
---------------
-
-Protecting excessive memory usage allows one to make sure that a DoS
-attack against the system's memory is prevented.
-
-
-Possible Security Flaws
------------------------
-
-If code makes direct calls to malloc/free instead of using the proper
-``PyMem_*()``
-macros then the security check will be circumvented. But C code is
-*supposed* to use the proper macros or pymalloc and thus this issue is
-not with the security model but with code not following Python coding
-standards.
-
-
-API
---------------
-
-* int PySandbox_SetMemoryCap(PyThreadState *, integer)
- Set the memory cap for an sandboxed interpreter. If the
- interpreter is not running an sandboxed interpreter, return a false
- value.
-
-* PySandbox_AllowedMemoryAlloc(integer, error_return)
- Macro to increase the amount of memory that is reported that the
- running sandboxed interpreter is using. If the increase puts the
- total count passed the set limit or leads to integer overflow in
- the allocation count, raise an SandboxError exception
- and cause the calling function to return with the value of
- 'error_return', otherwise do nothing.
-
-* void PySandbox_AllowedMemoryFree(integer)
- Decrease the current running interpreter's allocated
- memory. If this puts the memory used to below 0, re-set it to 0.
+Typically, in languages that are statically typed (like C++), you have
+public and private attributes on objects. Those private attributes
+provide a private namespace for the class and instances that are not
+accessible by other objects.
+
+The Python language has no such thing as a private namespace. The
+language has the philosophy that if exposing something to the
+programmer could provide some use, then it is exposed. This has led
+to Python having a wonderful amount of introspection abilities.
+Unfortunately this makes the possibility of a private namespace
+non-existent. This poses an issue for providing proxies for resources
+since there is no way in Python code to hide the reference to a
+resource.
+
+Luckily, the Python virtual machine *does* provide a private namespace,
+albeit not for pure Python source code. If you use the Python/C
+language barrier in extension modules, you can provide a private
+namespace by using the struct allocated for each instance of an
+object. This provides a way to create proxies, written in C, that can
+protect resources properly. Throughout this document, when mentioning
+proxies, it is assumed they have been implemented in C.
-Reading/Writing Files
-=============================
+Problem of Mutable Shared State
+===============================
-Protection
---------------
+Another problem that Python's introspection abilties cause is that of
+mutable shared state. At the interpreter level, there has never been
+a concerted effort to isolate state shared between all interpreters
+running in the same Python process. Sometimes this is for performance
+reasons, sometimes because it is just easier to implement this way.
+Regardless, sharing of state that can be influenced by another
+interpreter is not safe for object-capabilities.
+
+To rectify the situation, some changes will be needed to some built-in
+objects in Python. It should mostly consist of abstracting or
+refactoring certain abilities out to an extension module so that
+access can be protected using import guards.
-XXX
-To open a file, one will have to use open(). This will make open() a
-factory function that controls reference access to the 'file' type in
-terms of creating new instances. When an attempted file opening fails
-(either because the path does not exist or of security reasons),
-SandboxError will be raised. The same exception must be raised to
-prevent filesystem information being gleaned from the type of exception
-returned (i.e., returning IOError if a path does not exist tells the
-user something about that file path).
-
-What open() returns may not be an instance of 'file' but a proxy that
-provides the security measures needed. While this might break code
-that uses type checking to make sure a 'file' object is used, taking a
-duck typing approach would be better. This is not only more Pythonic
-but would also allow the code to use a StringIO instance.
-
-It has been suggested to allow for a passed-in callback to be called
-when a specific path is to be opened. While this provides good
-flexibility in terms of allowing custom proxies with more fine-grained
-security (e.g., capping the amount of disk write), this has been deemed
-unneeded in the initial security model and thus is not being considered
-at this time.
-
-Why
---------------
-
-Allowing anyone to be able to arbitrarily read, write, or learn about
-the layout of your filesystem is extremely dangerous. It can lead to
-loss of data or data being exposed to people whom should not have
-access.
+Threat Model
+///////////////////////////////////////
+The threat that this security model is attempting to handle is the
+execution of arbitrary Python code in a sandboxed interpreter such
+that the code in that interpreter is not able to harm anything outside
+of itself. This means that:
+
+* An interpreter cannot influence another interpreter directly at the
+ Python level without explicitly allowing it.
+ + This includes preventing communicating with another interpreter.
+ + Mutable objects cannot be shared between interpreters without
+ explicit allowance for it.
+ + "Explicit allowance" includes the importation of C extension
+ modules because a technical detail requires that these modules
+ not be re-initialized per interpreter, meaning that all
+ interpreters in a single Python process share the same C
+ extension modules.
+* An interpreter cannot use operating system resources without being
+ explicitly given those resources.
+ + This includes importing modules since that requires the ability
+ to use the resource of the filesystem.
+
+In order to accomplish these goals, certain things must be made true.
+
+* The Python process is the "powerbox".
+ + It controls the initial granting of abilties to interpreters.
+* A bare Python interpreter is always trusted.
+ + Python source code that can be created in a bare interpreter is
+ always trusted.
+ + Python source code created within a bare interpreter cannot
+ crash the interpreter.
+* Python bytecode is always distrusted.
+ + Malicious bytecode can bring down an interpreter.
+* Pure Python source code is always safe on its own.
+ + Malicious abilities are derived from C extension modules,
+ built-in modules, and unsafe types implemented in C, not from
+ pure Python source.
+* A sub-interpreter started by another interpreter does not inherit
+ any state.
+ + The sub-interpreter starts out with a fresh global namespace and
+ whatever built-ins it was initially given.
-Possible Security Flaws
------------------------
-
-XXX
+Implementation
+///////////////////////////////////////
-API
---------------
+Guiding Principles
+========================
-* int PySandbox_SetAllowedFile(PyThreadState *, string path,
- string mode)
- Add a file that is allowed to be opened in 'mode' by the 'file'
- object. If the interpreter is not sandboxed then return a false
- value.
-
-* PySandbox_AllowedPath(string path, string mode, error_return)
- Macro that causes the caller to return with 'error_return' and
- raise SandboxError as the exception if the specified path with
- 'mode' is not allowed, otherwise do nothing.
-
-
-Extension Module Importation
-============================
-
-Protection
---------------
-
-A whitelist of extension modules that may be imported must be provided.
-A default set is given for stdlib modules known to be safe.
-
-A check in the import machinery will check that a specified module name
-is allowed based on the type of module (Python source, Python bytecode,
-or extension module). Python bytecode files are never directly
-imported because of the possibility of hostile bytecode being present.
-Python source is always considered safe based on the assumption that
-all resource harm is eventually done at the C level, thus Python source
-code directly cannot cause harm without help of C extension modules.
-Thus only C extension modules need to be checked against the whitelist.
-
-The requested extension module name is checked in order to make sure
-that it is on the whitelist if it is a C extension module. If the name
-is not correct a SandboxError exception is raised. Otherwise the
-import is allowed.
-
-Even if a Python source code module imports a C extension module in an
-unprotected interpreter it is not a problem since the Python source
-code module is reloaded in the sandboxed interpreter. When that Python
-source module is freshly imported the normal import check will be
-triggered to prevent the C extension module from becoming available to
-the sandboxed interpreter.
-
-For the 'os' module, a special sandboxed version will be used if the
-proper C extension module providing the correct abilities is not
-allowed. This will default to '/' as the path separator and provide as
-much reasonable abilities as possible from a pure Python module.
+To begin, the Python process garners all power as the powerbox. It is
+up to the process to initially hand out access to resources and
+abilities to interpreters. This might take the form of an interpreter
+with all abilities granted (i.e., a standard interpreter as launched
+when you execute Python), which then creates sub-interpreters with
+sandboxed abilities. Another alternative is only creating
+interpreters with sandboxed abilities (i.e., Python being embedded in
+an application that only uses sandboxed interpreters).
+
+All security measures should never have to ask who an interpreter is.
+This means that what abilities an interpreter has should not be stored
+at the interpreter level when the security can use a proxy to protect
+a resource. This means that while supporting a memory cap can
+have a per-interpreter setting that is checked (because access to the
+operating system's memory allocator is not supported at the program
+level), protecting files and imports should not such a per-interpreter
+protection at such a low level (because those can have extension
+module proxies to provide the security).
+
+For common case security measures, the Python standard library
+(stdlib) should provide a simple way to provide those measures. Most
+commonly this will take the form of providing factory functions that
+create instances of proxies for providing protection of key resources.
+
+Backwards-compatibility will not be a hindrance upon the design or
+implementation of the security model. Because the security model will
+inherently remove resources and abilities that existing code expects,
+it is not reasonable to expect existing code to work in a sandboxed
+interpreter.
+
+Keeping Python "pythonic" is required for all design decisions. If
+removing an ability leads to something being unpythonic, it will not
+be done. This does not mean existing pythonic code must continue to
+work, but the spirit of being pythonic will not be compromised in the
+name of the security model. While this might lead to a weaker
+security model, this is a price that must be paid in order for Python
+to continue to be the language that it is.
+
+Restricting what is in the built-in namespace and the safe-guarding
+the interpreter (which includes safe-guarding the built-in types) is
+where security will come from. Imports and the ``file`` type are
+both part of the standard namespace and must be restricted in order
+for any security implementation to be effective.
+The built-in types which are needed for basic Python usage (e.g.,
+``object`` code objects, etc.) must be made safe to use in a sandboxed
+interpreter since they are easily accessbile and yet required for
+Python to function.
+
+
+Abilities of a Standard Sandboxed Interpreter
+=============================================
+
+In the end, a standard sandboxed interpreter should (not)
+allow certain things to be doable by code running within itself.
+Below is a list of abilities that will (not) be allowed in the default
+instance of a sandboxed interpreter comparative to an unprotected
+interpreter that has not imported any modules. These protections can
+be tweaked by using proxies to allow for certain extended abilities to
+be accessible.
+
+* You cannot open any files directly.
+* Importation
+ + You can import any pure Python module.
+ + You cannot import any Python bytecode module.
+ + You cannot import any C extension module.
+ + You cannot import any built-in module.
+* You cannot find out any information about the operating system you
+ are running on.
+* Only safe built-ins are provided.
-The 'sys' module is specially addressed in
-`Changing the Behaviour of the Interpreter`_.
-By default, the whitelisted modules are:
+Implementation Details
+========================
+An important point to keep in mind when reading about the
+implementation details for the security model is that these are
+general changes and are not special to any type of interpreter,
+sandboxed or otherwise. That means if a change to a built-in type is
+suggested and it does not involve a proxy, that change is meant
+Python-wide for *all* interpreters.
+
+
+Imports
+-------
+
+A proxy for protecting imports will be provided. This is done by
+setting the ``__import__()`` function in the built-in namespace of the
+sandboxed interpreter to a proxied version of the function.
+
+The planned proxy will take in a passed-in function to use for the
+import and a whitelist of C extension modules and built-in modules to
+allow importation of. If an import would lead to loading an extension
+or built-in module, it is checked against the whitelist and allowed
+to be imported based on that list. All .pyc and .pyo file will not
+be imported. All .py files will be imported.
+
+XXX perhaps augment 'sys' so that you list the extension of files that
+can be used for importing? Thought this was controlled somewhere
+already but can't find it.
+
+It must be warned that importing any C extension module is dangerous.
+Not only are they able to circumvent security measures by executing C
+code, but they share state across interpreters. Because an extension
+module's init function is only called once for the Python *process*,
+its initial state is set only once. This means that if some mutable
+object is exposed at the module level, a sandboxed interpreter could
+mutate that object, return, and then if the creating interpreter
+accesses that mutated object it is essentially communicating and/or
+acting on behalf of the sandboxed interpreter. This violates the
+perimeter defence. No one should import extension modules blindly.
+
+
+Sanitizing Built-In Types
+-------------------------
+
+Python contains a wealth of bulit-in types. These are used at a basic
+level so that they are easily accessible to any Python code. They are
+also shared amongst all interpreters in a Python process. This means
+all built-in types need to be made safe (e.g., immutable shared
+state) so that they can be used by any and all interpreters in a
+single Python process. Several aspects of built-in types need to be
+examined.
+
+
+Constructors
+++++++++++++
+
+Almost all of Python's built-in types
+contain a constructor that allows code to create a new instance of a
+type as long as you have the type itself. Unfortunately this does not
+work in an object-capabilities system without either providing a proxy
+to the constructor or just turning it off.
+
+The plan is to turn off the constructors that are currently supplied
+directly by the types that are dangerous. Their constructors will
+then either be shifted over to factory functions that will be stored
+in a C extension module or to built-ins that will be
+provided to use to create instances. The former approach will allow
+for protections to be enforced by import proxy; just don't allow the
+extension module to be imported. The latter approach would allow
+either a unique constructor per type, or more generic built-in(s) for
+construction (e.g., introducing a ``construct()`` function that takes
+in a type and any arguments desired to be passed in for constructing
+an instance of the type) and allowing using proxies to provide
+security.
+
+Some might consider this unpythonic. Python very rarely separates the
+constructor of an object from the class/type and require that you go
+through a function. But there is some precedent for not using a
+type's constructor to get an instance of a type. The ``file`` type,
+for instance, typically has its instances created through the
+``open()`` function. This slight shift for certain types to have their
+(dangerous) constructor not on the type but in a function is
+considered an acceptable compromise.
+
+Types whose constructors are considered dangerous are:
+
+* ``file``
+ + Will definitely use the ``open()`` built-in.
+* code objects
+* XXX sockets?
+* XXX type?
* XXX
-Why
---------------
-
-Because C code is considered unsafe, its use should be regulated. By
-using a whitelist it allows one to explicitly decide that a C extension
-module is considered safe.
-
-
-Possible Security Flaws
------------------------
-
-If a whitelisted C extension module imports a non-whitelisted C
-extension module and makes it an attribute of the whitelisted module
-there will be a breach in security. Luckily this a rarity in
-extension modules.
-
-There is also the issue of a C extension module calling the C API of a
-non-whitelisted C extension module.
-
-Lastly, if a whitelisted C extension module is loaded in an unprotected
-interpreter and then loaded into a sandboxed interpreter then there is
-no checks during module initialization for possible security issues in
-the sandboxed interpreter that would have occurred had the sandboxed
-interpreter done the initial import.
-
-All of these issues can be handled by never blindly whitelisting a C
-extension module. Added support for dealing with C extension modules
-comes in the form of `Extension Module Crippling`_.
-
-
-API
---------------
-
-* int PySandbox_SetModule(PyThreadState *, string module_name)
- Allow the sandboxed interpreter to import 'module_name'. If the
- interpreter is not sandboxed, return a false value. Absolute
- import paths must be specified.
-
-* int PySandbox_BlockModule(PyThreadState *, string module_name)
- Remove the specified module from the whitelist. Used to remove
- modules that are allowed by default. Return a false value if
- called on an unprotected interpreter.
-
-* PySandbox_AllowedModule(string module_name, error_return)
- Macro that causes the caller to return with 'error_return' and sets
- the exception SandboxError if the specified module cannot be
- imported, otherwise does nothing.
-
-
-Extension Module Crippling
-==========================
-
-Protection
---------------
-
-By providing a C API for checking for allowed abilities, modules that
-have some useful functionality can do proper security checks for those
-functions that could provide insecure abilities while allowing safe
-code to be used (and thus not fully deny importation).
-
-
-Why
---------------
-
-Consider a module that provides a string processing ability. If that
-module provides a single convenience function that reads its input
-string from a file (with a specified path), the whole module should not
-be blocked from being used, just that convenience function. By
-whitelisting the module but having a security check on the one problem
-function, the user can still gain access to the safe functions. Even
-better, the unsafe function can be allowed if the security checks pass.
-
-
-Possible Security Flaws
------------------------
-
-If a C extension module developer incorrectly implements the security
-checks for the unsafe functions it could lead to undesired abilities.
-
-
-API
---------------
-
-Use PySandbox_Allowed() to protect unsafe code from being executed.
-
-
-Hostile Bytecode
-=============================
-
-Protection
---------------
-
-XXX
-
-
-Why
---------------
-
-Without implementing a bytecode verification tool, there is no way of
-making sure that bytecode does not jump outside its bounds, thus
-possibly executing malicious code. It also presents the possibility of
-crashing the interpreter.
-
-
-Possible Security Flaws
------------------------
-
-None known.
-
-
-API
---------------
-
-N/A
-
-
-Changing the Behaviour of the Interpreter
-=========================================
-
-Protection
---------------
-
-Only a subset of the 'sys' module will be made available to sandboxed
-interpreters. Things to allow from the sys module:
-
-* byteorder (?)
-* copyright
-* displayhook
-* excepthook
-* __displayhook__
-* __excepthook__
-* exc_info
-* exc_clear
-* exit
-* getdefaultencoding
-* _getframe (?)
-* hexversion
-* last_type
-* last_value
-* last_traceback
-* maxint (?)
-* maxunicode (?)
-* modules
-* stdin # See `Stdin, Stdout, and Stderr`_.
-* stdout
-* stderr
-* version
-
-
-Why
---------------
-
-Filesystem information must be removed. Any settings that could
-possibly lead to a DoS attack (e.g., sys.setrecursionlimit()) or risk
-crashing the interpreter must also be removed.
-
-
-Possible Security Flaws
------------------------
-
-Exposing something that could lead to future security problems (e.g., a
-way to crash the interpreter).
-
-
-API
---------------
-
-None.
-
-
-Socket Usage
-=============================
-
-Protection
---------------
-
-Allow sending and receiving data to/from specific IP addresses on
-specific ports.
-
-open() is to be used as a factory function to open a network
-connection. If the connection is not possible (either because of an
-invalid address or security reasons), SandboxError is raised.
-
-A socket object may not be returned by the call. A proxy to handle
-security might be returned instead.
-
-XXX
-
-
-Why
---------------
-
-Allowing arbitrary sending of data over sockets can lead to DoS attacks
-on the network and other machines. Limiting accepting data prevents
-your machine from being attacked by accepting malicious network
-connections. It also allows you to know exactly where communication is
-going to and coming from.
-
-
-Possible Security Flaws
------------------------
-
-If someone managed to influence the used DNS server to influence what
-IP addresses were used after a DNS lookup.
-
-
-API
---------------
-
-* int PySandbox_SetIPAddress(PyThreadState *, string IP, integer port)
- Allow the sandboxed interpreter to send/receive to the specified
- 'IP' address on the specified 'port'. If the interpreter is not
- sandboxed, return a false value.
-
-* PySandbox_AllowedIPAddress(string IP, integer port, error_return)
- Macro to verify that the specified 'IP' address on the specified
- 'port' is allowed to be communicated with. If not, cause the
- caller to return with 'error_return' and SandboxError exception
- set, otherwise do nothing.
-
-* int PySandbox_SetHost(PyThreadState *, string host, integer port)
- Allow the sandboxed interpreter to send/receive to the specified
- 'host' on the specified 'port'. If the interpreter is not
- sandboxed, return a false value.
-
-* PySandbox_AllowedHost(string host, integer port, error_return)
- Check that the specified 'host' on the specified 'port' is allowed
- to be communicated with. If not, set a SandboxError exception and
- cause the caller to return 'error_return', otherwise do nothing.
-
-
-Network Information
-=============================
-
-Protection
---------------
-
-Limit what information can be gleaned about the network the system is
-running on. This does not include restricting information on IP
-addresses and hosts that are have been explicitly allowed for the
-sandboxed interpreter to communicate with.
-
-XXX
-
-
-Why
---------------
-
-With enough information from the network several things could occur.
-One is that someone could possibly figure out where your machine is on
-the Internet. Another is that enough information about the network you
-are connected to could be used against it in an attack.
-
-
-Possible Security Flaws
------------------------
-
-As long as usage is restricted to only what is needed to work with
-allowed addresses, there are no security issues to speak of.
-
-
-API
---------------
-
-* int PySandbox_SetNetworkInfo(PyThreadState *)
- Allow the sandboxed interpreter to get network information
- regardless of whether the IP or host address is explicitly allowed.
- If the interpreter is not sandboxed, return a false value.
-
-* PySandbox_AllowedNetworkInfo(error_return)
- Macro that will return 'error_return' for the caller and set a
- SandboxError exception if the sandboxed interpreter does not allow
- checking for arbitrary network information, otherwise do nothing.
-
-
Filesystem Information
-=============================
-
-Protection
---------------
+++++++++++++++++++++++
-Do not allow information about the filesystem layout from various parts
-of Python to be exposed. This means blocking exposure at the Python
-level to:
-
-* __file__ attribute on modules
-* __path__ attribute on packages
-* co_filename attribute on code objects
+When running code in a sandboxed interpreter, POLA suggests that you
+do not want to expose information about your environment on top of
+protecting its use. This means that filesystem paths typically should
+not be exposed. Unfortunately, Python exposes file paths all over the
+place:
+
+* Modules
+ + ``__file__`` attribute
+* Code objects
+ + ``co_filename`` attribute
+* Packages
+ + ``__path__`` attribute
* XXX
+XXX how to expose safely?
-Why
---------------
-
-Exposing information about the filesystem is not allowed. You can
-figure out what operating system one is on which can lead to
-vulnerabilities specific to that operating system being exploited.
+Mutable Shared State
+++++++++++++++++++++
-Possible Security Flaws
------------------------
-
-Not finding every single place where a file path is exposed.
-
-
-API
---------------
-
-* int PySandbox_SetFilesystemInfo(PyThreadState *)
- Allow the sandboxed interpreter to expose filesystem information.
- If the passed-in interpreter is not sandboxed, return NULL.
-
-* PySandbox_AllowedFilesystemInfo(error_return)
- Macro that checks if exposing filesystem information is allowed.
- If it is not, cause the caller to return with the value of
- 'error_return' and raise SandboxError, otherwise do nothing.
-
-
-Stdin, Stdout, and Stderr
-=============================
-
-Protection
---------------
-
-By default, sys.__stdin__, sys.__stdout__, and sys.__stderr__ will be
-set to instances of StringIO. Explicit allowance of the process'
-stdin, stdout, and stderr is possible.
-
-This will protect the 'print' statement, and the built-ins input() and
-raw_input().
-
-
-Why
---------------
-
-Interference with stdin, stdout, or stderr should not be allowed unless
-desired. No one wants uncontrolled output sent to their screen.
-
-
-Possible Security Flaws
------------------------
-
-Unless StringIO instances can be used maliciously, none to speak of.
-
-
-API
---------------
-
-* int PySandbox_SetTrueStdin(PyThreadState *)
- int PySandbox_SetTrueStdout(PyThreadState *)
- int PySandbox_SetTrueStderr(PyThreadState *)
- Set the specific stream for the interpreter to the true version of
- the stream and not to the default instance of StringIO. If the
- interpreter is not sandboxed, return a false value.
-
-
-Adding New Protections
-=============================
-
-.. note:: This feature has the lowest priority and thus will be the
- last feature implemented (if ever).
-
-Protection
---------------
-
-Allow for extensibility in the security model by being able to add new
-types of checks. This allows not only for Python to add new security
-protections in a backwards-compatible fashion, but to also have
-extension modules add their own as well.
-
-An extension module can introduce a group for its various values to
-check, with a type being a specific value within a group. The "Python"
-group is specifically reserved for use by the Python core itself.
-
-
-Why
---------------
-
-We are all human. There is the possibility that a need for a new type
-of protection for the interpreter will present itself and thus need
-support. By providing an extensible way to add new protections it
-helps to future-proof the system.
-
-It also allows extension modules to present their own set of security
-protections. That way one extension module can use the protection
-scheme presented by another that it is dependent upon.
-
-
-Possible Security Flaws
-------------------------
-
-Poor definitions by extension module users of how their protections
-should be used would allow for possible exploitation.
-
-
-API
---------------
-
-+ Bool
- * int PySandbox_SetExtendedFlag(PyThreadState *, string group,
- string type)
- Set a group-type to be true. Expected use is for when a binary
- possibility of something is needed and that the default is to
- not allow use of the resource (e.g., network information).
- Returns a false value if used on an unprotected interpreter.
-
- * PySandbox_AllowedExtendedFlag(string group, string type,
- error_return)
- Macro that if the group-type is not set to true, cause the
- caller to return with 'error_return' with SandboxError
- exception raised. For unprotected interpreters the check does
- nothing.
-
-+ Numeric Range
- * int PySandbox_SetExtendedCap(PyThreadState *, string group,
- string type, integer cap)
- Set a group-type to a capped value, 'cap', with the initial
- allocated value set to 0. Expected use is when a resource has
- a capped amount of use (e.g., memory). Returns a false value
- if the interpreter is not sandboxed.
-
- * PySandbox_AllowedExtendedAlloc(integer increase, error_return)
- Macro to raise the amount of a resource is used by 'increase'.
- If the increase pushes the resource allocation past the set
- cap, then return 'error_return' and set SandboxError as the
- exception, otherwise do nothing.
-
- * PySandbox_AllowedExtendedFree(integer decrease, error_return)
- Macro to lower the amount a resource is used by 'decrease'. If
- the decrease pushes the allotment to below 0 then have the
- caller return 'error_return' and set SandboxError as the
- exception, otherwise do nothing.
-
-
-+ Membership
- * int PySandbox_SetExtendedMembership(PyThreadState *,
- string group, string type,
- string member)
- Add a string, 'member', to be considered a member of a
- group-type (e.g., allowed file paths). If the interpreter is not
- an sandboxed interpreter, return a false value.
-
- * PySandbox_AllowedExtendedMembership(string group, string type,
- string member,
- error_return)
- Macro that checks 'member' is a member of the values set for
- the group-type. If it is not, then have the caller return
- 'error_return' and set an exception for SandboxError, otherwise
- does nothing.
-
-+ Specific Value
- * int PySandbox_SetExtendedValue(PyThreadState *, string group,
- string type, string value)
- Set a group-type to 'value'. If the interpreter is not
- sandboxed, return NULL.
-
- * PySandbox_AllowedExtendedValue(string group, string type,
- string value, error_return)
- Macro to check that the group-type is set to 'value'. If it is
- not, then have the caller return 'error_return' and set an
- exception for SandboxError, otherwise do nothing.
+Because built-in types are shared between interpreters, they cannot
+expose any mutable shared state. Unfortunately, as it stands, some
+do. Below is a list of types that share some form of dangerous state,
+how they share it, and how to fix the problem:
+* ``object``
+ + ``__subclasses__()`` function
+ - Remove the function; never seen used in real-world code.
+* XXX
-Python API
-=============================
-__sandboxed__
---------------
+Perimeter Defences Between a Created Interpreter and Its Creator
+----------------------------------------------------------------
-A built-in that flags whether the interpreter currently running is
-sandboxed or not. Set to a 'bool' value that is read-only. To mimic
-working of __debug__.
+The plan is to allow interpreters to instantiate sandboxed
+interpreters safely. By using the creating interpreter's abilities to
+provide abilities to the created interpreter, you make sure there is
+no escalation in abilities.
+
+But by creating a sandboxed interpreter and passing in any code into
+it, you open up the chance of possible ways of getting back to the
+creating interpreter or escalating privileges. Those ways are:
+
+* ``__del__`` created in sandboxed interpreter but object is cleaned
+ up in unprotected interpreter.
+* Using frames to walk the frame stack back to another interpreter.
+* XXX
-sandbox module
---------------
+Making the ``sys`` Module Safe
+------------------------------
XXX
-References
-///////////////////////////////////////
-
-.. [#rexec] The 'rexec' module
- (http://docs.python.org/lib/module-rexec.html)
-
-.. [#safe-tcl] The Safe-Tcl Security Model
- (http://research.sun.com/technical-reports/1997/abstract-60.html)
+Safe Networking
+---------------
-.. [#ctypes] 'ctypes' module
- (http://docs.python.org/dev/lib/module-ctypes.html)
-
-.. [#paradigm regained] "Paradigm Regained:
- Abstraction Mechanisms for Access Control"
- (http://erights.org/talks/asian03/paradigm-revised.pdf)
-
-.. [#armin-hiding] [Python-Dev] what can we do to hide the 'file' type?
- (http://mail.python.org/pipermail/python-dev/2006-July/067076.html)
+XXX
From nnorwitz at gmail.com Wed Jul 19 04:36:34 2006
From: nnorwitz at gmail.com (Neal Norwitz)
Date: Tue, 18 Jul 2006 19:36:34 -0700
Subject: [Python-checkins] r50708 - in python/trunk:
Lib/test/test_sys.py Misc/NEWS Python/pystate.c
In-Reply-To: <20060719010840.GD2540@performancedrivers.com>
References: <20060719000321.36AF31E401E@bag.python.org>
<20060719010840.GD2540@performancedrivers.com>
Message-ID:
On 7/18/06, Jack Diederich wrote:
>
> were pre-2003 and talking about mod_python. HURD and FreeBSD came up a
> couple times. Do we need to add more *BSD buildbots?
Yes. We only have OpenBSD now. It would be nice to have {Free,Net}BSD too..
n
From buildbot at python.org Wed Jul 19 06:27:27 2006
From: buildbot at python.org (buildbot at python.org)
Date: Wed, 19 Jul 2006 04:27:27 +0000
Subject: [Python-checkins] buildbot warnings in sparc Ubuntu dapper trunk
Message-ID: <20060719042728.B0E581E4009@bag.python.org>
The Buildbot has detected a new failure of sparc Ubuntu dapper trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/sparc%2520Ubuntu%2520dapper%2520trunk/builds/532
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: tim.peters
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From theller at python.net Wed Jul 19 10:35:06 2006
From: theller at python.net (Thomas Heller)
Date: Wed, 19 Jul 2006 10:35:06 +0200
Subject: [Python-checkins] r50708 - in python/trunk:
Lib/test/test_sys.py Misc/NEWS Python/pystate.c
In-Reply-To:
References: <20060719000321.36AF31E401E@bag.python.org> <20060719010840.GD2540@performancedrivers.com>
Message-ID:
Neal Norwitz schrieb:
> On 7/18/06, Jack Diederich wrote:
>>
>> were pre-2003 and talking about mod_python. HURD and FreeBSD came up a
>> couple times. Do we need to add more *BSD buildbots?
>
> Yes. We only have OpenBSD now. It would be nice to have {Free,Net}BSD too..
Maybe some of the buildbots should (in addition to the normal build?)
configure Python with --without-threads?
Thomas
From python-checkins at python.org Wed Jul 19 11:09:33 2006
From: python-checkins at python.org (thomas.heller)
Date: Wed, 19 Jul 2006 11:09:33 +0200 (CEST)
Subject: [Python-checkins] r50713 - python/trunk/Modules/_ctypes/callbacks.c
python/trunk/Modules/_ctypes/callproc.c
Message-ID: <20060719090933.331A11E4009@bag.python.org>
Author: thomas.heller
Date: Wed Jul 19 11:09:32 2006
New Revision: 50713
Modified:
python/trunk/Modules/_ctypes/callbacks.c
python/trunk/Modules/_ctypes/callproc.c
Log:
Make sure the _ctypes extension can be compiled when WITH_THREAD is
not defined on Windows, even if that configuration is probably not
supported at all.
Modified: python/trunk/Modules/_ctypes/callbacks.c
==============================================================================
--- python/trunk/Modules/_ctypes/callbacks.c (original)
+++ python/trunk/Modules/_ctypes/callbacks.c Wed Jul 19 11:09:32 2006
@@ -348,7 +348,9 @@
static void LoadPython(void)
{
if (!Py_IsInitialized()) {
+#ifdef WITH_THREAD
PyEval_InitThreads();
+#endif
Py_Initialize();
}
}
Modified: python/trunk/Modules/_ctypes/callproc.c
==============================================================================
--- python/trunk/Modules/_ctypes/callproc.c (original)
+++ python/trunk/Modules/_ctypes/callproc.c Wed Jul 19 11:09:32 2006
@@ -818,7 +818,9 @@
/* We absolutely have to release the GIL during COM method calls,
otherwise we may get a deadlock!
*/
+#ifdef WITH_THREAD
Py_BEGIN_ALLOW_THREADS
+#endif
hr = pIunk->lpVtbl->QueryInterface(pIunk, &IID_ISupportErrorInfo, (void **)&psei);
if (FAILED(hr))
@@ -842,7 +844,9 @@
pei->lpVtbl->Release(pei);
failed:
+#ifdef WITH_THREAD
Py_END_ALLOW_THREADS
+#endif
progid = NULL;
ProgIDFromCLSID(&guid, &progid);
From anthony at interlink.com.au Wed Jul 19 14:41:27 2006
From: anthony at interlink.com.au (Anthony Baxter)
Date: Wed, 19 Jul 2006 22:41:27 +1000
Subject: [Python-checkins] r50708 - in python/trunk:
Lib/test/test_sys.py Misc/NEWS Python/pystate.c
In-Reply-To:
References: <20060719000321.36AF31E401E@bag.python.org>
Message-ID: <200607192241.30043.anthony@interlink.com.au>
On Wednesday 19 July 2006 18:35, Thomas Heller wrote:
> Neal Norwitz schrieb:
> > On 7/18/06, Jack Diederich wrote:
> >> were pre-2003 and talking about mod_python. HURD and FreeBSD
> >> came up a couple times. Do we need to add more *BSD buildbots?
> >
> > Yes. We only have OpenBSD now. It would be nice to have
> > {Free,Net}BSD too..
>
> Maybe some of the buildbots should (in addition to the normal
> build?) configure Python with --without-threads?
Or the daily build.sh script could do it?
Anthony
--
Anthony Baxter
It's never too late to have a happy childhood.
From andymac at bullseye.apana.org.au Wed Jul 19 12:18:03 2006
From: andymac at bullseye.apana.org.au (Andrew MacIntyre)
Date: Wed, 19 Jul 2006 21:18:03 +1100
Subject: [Python-checkins] r50713 -
python/trunk/Modules/_ctypes/callbacks.c
python/trunk/Modules/_ctypes/callproc.c
In-Reply-To: <20060719090933.331A11E4009@bag.python.org>
References: <20060719090933.331A11E4009@bag.python.org>
Message-ID: <44BE06DB.1090102@bullseye.apana.org.au>
thomas.heller wrote:
> Author: thomas.heller
> Date: Wed Jul 19 11:09:32 2006
> New Revision: 50713
>
> Modified:
> python/trunk/Modules/_ctypes/callbacks.c
> python/trunk/Modules/_ctypes/callproc.c
> Log:
> Make sure the _ctypes extension can be compiled when WITH_THREAD is
> not defined on Windows, even if that configuration is probably not
> supported at all.
>
>
> Modified: python/trunk/Modules/_ctypes/callbacks.c
> ==============================================================================
> --- python/trunk/Modules/_ctypes/callbacks.c (original)
> +++ python/trunk/Modules/_ctypes/callbacks.c Wed Jul 19 11:09:32 2006
> @@ -348,7 +348,9 @@
> static void LoadPython(void)
> {
> if (!Py_IsInitialized()) {
> +#ifdef WITH_THREAD
> PyEval_InitThreads();
> +#endif
> Py_Initialize();
> }
> }
>
> Modified: python/trunk/Modules/_ctypes/callproc.c
> ==============================================================================
> --- python/trunk/Modules/_ctypes/callproc.c (original)
> +++ python/trunk/Modules/_ctypes/callproc.c Wed Jul 19 11:09:32 2006
> @@ -818,7 +818,9 @@
> /* We absolutely have to release the GIL during COM method calls,
> otherwise we may get a deadlock!
> */
> +#ifdef WITH_THREAD
> Py_BEGIN_ALLOW_THREADS
> +#endif
>
> hr = pIunk->lpVtbl->QueryInterface(pIunk, &IID_ISupportErrorInfo, (void **)&psei);
> if (FAILED(hr))
> @@ -842,7 +844,9 @@
> pei->lpVtbl->Release(pei);
>
> failed:
> +#ifdef WITH_THREAD
> Py_END_ALLOW_THREADS
> +#endif
>
> progid = NULL;
> ProgIDFromCLSID(&guid, &progid);
Umm... the Py_[BEGIN|END]_ALLOW_THREADS macros shouldn't need to be
#ifdef'ed like this surely? There's already an #ifdef WITH_THREAD in
Include/ceval.h to (hopefully) correctly redefine them appropriately.
--
-------------------------------------------------------------------------
Andrew I MacIntyre "These thoughts are mine alone..."
E-mail: andymac at bullseye.apana.org.au (pref) | Snail: PO Box 370
andymac at pcug.org.au (alt) | Belconnen ACT 2616
Web: http://www.andymac.org/ | Australia
From andymac at bullseye.apana.org.au Wed Jul 19 12:10:29 2006
From: andymac at bullseye.apana.org.au (Andrew MacIntyre)
Date: Wed, 19 Jul 2006 21:10:29 +1100
Subject: [Python-checkins] r50708 - in
python/trunk: Lib/test/test_sys.py Misc/NEWS Python/pystate.c
In-Reply-To:
References: <20060719000321.36AF31E401E@bag.python.org> <20060719010840.GD2540@performancedrivers.com>
Message-ID: <44BE0515.7070505@bullseye.apana.org.au>
Neal Norwitz wrote:
> On 7/18/06, Jack Diederich wrote:
>> were pre-2003 and talking about mod_python. HURD and FreeBSD came up a
>> couple times. Do we need to add more *BSD buildbots?
>
> Yes. We only have OpenBSD now. It would be nice to have {Free,Net}BSD too..
>
> n
I have plans for a FreeBSD 6 buildbot, which are dependant on me getting
an ADSL connection (a month or so with luck) and the time to upgrade my
FreeBSD 4 box.
Cheers,
Andrew.
--
-------------------------------------------------------------------------
Andrew I MacIntyre "These thoughts are mine alone..."
E-mail: andymac at bullseye.apana.org.au (pref) | Snail: PO Box 370
andymac at pcug.org.au (alt) | Belconnen ACT 2616
Web: http://www.andymac.org/ | Australia
From theller at python.net Wed Jul 19 15:38:17 2006
From: theller at python.net (Thomas Heller)
Date: Wed, 19 Jul 2006 15:38:17 +0200
Subject: [Python-checkins] r50713 -
python/trunk/Modules/_ctypes/callbacks.c
python/trunk/Modules/_ctypes/callproc.c
In-Reply-To: <44BE06DB.1090102@bullseye.apana.org.au>
References: <20060719090933.331A11E4009@bag.python.org>
<44BE06DB.1090102@bullseye.apana.org.au>
Message-ID:
Andrew MacIntyre schrieb:
> thomas.heller wrote:
[...]
>> Modified: python/trunk/Modules/_ctypes/callproc.c
>> ==============================================================================
>> --- python/trunk/Modules/_ctypes/callproc.c (original)
>> +++ python/trunk/Modules/_ctypes/callproc.c Wed Jul 19 11:09:32 2006
>> @@ -818,7 +818,9 @@
>> /* We absolutely have to release the GIL during COM method calls,
>> otherwise we may get a deadlock!
>> */
>> +#ifdef WITH_THREAD
>> Py_BEGIN_ALLOW_THREADS
>> +#endif
>>
>> hr = pIunk->lpVtbl->QueryInterface(pIunk, &IID_ISupportErrorInfo, (void **)&psei);
>> if (FAILED(hr))
>> @@ -842,7 +844,9 @@
>> pei->lpVtbl->Release(pei);
>>
>> failed:
>> +#ifdef WITH_THREAD
>> Py_END_ALLOW_THREADS
>> +#endif
>>
>> progid = NULL;
>> ProgIDFromCLSID(&guid, &progid);
>
> Umm... the Py_[BEGIN|END]_ALLOW_THREADS macros shouldn't need to be
> #ifdef'ed like this surely? There's already an #ifdef WITH_THREAD in
> Include/ceval.h to (hopefully) correctly redefine them appropriately.
>
Ideally, yes. However, in this case, MSVC complains about a syntax error:
missing ';' before '}', or something like that. Py_END_ALLOW_THREADS
expands to '}' when WITH_THREAD is not defined. Maybe the label immediately
before the end of a block is not valid.
I haven't tried if inserting a ';' after the failed: label will repair this.
Thomas
From rhettinger at ewtllc.com Wed Jul 19 17:52:53 2006
From: rhettinger at ewtllc.com (Raymond Hettinger)
Date: Wed, 19 Jul 2006 08:52:53 -0700
Subject: [Python-checkins] r50697 - python/trunk/Lib/decimal.py
In-Reply-To: <20060718121614.CA0601E4015@bag.python.org>
References: <20060718121614.CA0601E4015@bag.python.org>
Message-ID: <44BE5555.4000500@ewtllc.com>
I propose that this patch be reverted.
Raymond
facundo.batista wrote:
>Author: facundo.batista
>Date: Tue Jul 18 14:16:13 2006
>New Revision: 50697
>
>Modified:
> python/trunk/Lib/decimal.py
>Log:
>Comments and docs cleanups, and some little fixes, provided by Santi?go Peres?n
>
>
From martin at v.loewis.de Wed Jul 19 19:06:33 2006
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Wed, 19 Jul 2006 19:06:33 +0200
Subject: [Python-checkins] r50708 - in
python/trunk: Lib/test/test_sys.py Misc/NEWS Python/pystate.c
In-Reply-To:
References: <20060719000321.36AF31E401E@bag.python.org> <20060719010840.GD2540@performancedrivers.com>
Message-ID: <44BE6699.7080603@v.loewis.de>
Thomas Heller wrote:
>>> were pre-2003 and talking about mod_python. HURD and FreeBSD came up a
>>> couple times. Do we need to add more *BSD buildbots?
>> Yes. We only have OpenBSD now. It would be nice to have {Free,Net}BSD too..
>
> Maybe some of the buildbots should (in addition to the normal build?)
> configure Python with --without-threads?
I don't have the time to work something out here (as I don't consider it
that important), so if anybody else wants to work on it, please let me
know.
Regards,
Martin
From python-checkins at python.org Wed Jul 19 19:17:51 2006
From: python-checkins at python.org (matt.fleming)
Date: Wed, 19 Jul 2006 19:17:51 +0200 (CEST)
Subject: [Python-checkins] r50714 - in sandbox/trunk/pdb:
Doc/lib/libmpdb.tex mpdb.py mqueue.py
Message-ID: <20060719171751.920C01E4003@bag.python.org>
Author: matt.fleming
Date: Wed Jul 19 19:17:50 2006
New Revision: 50714
Added:
sandbox/trunk/pdb/mqueue.py
Modified:
sandbox/trunk/pdb/Doc/lib/libmpdb.tex
sandbox/trunk/pdb/mpdb.py
Log:
The beginnings of being able to debug a runniny process.
Modified: sandbox/trunk/pdb/Doc/lib/libmpdb.tex
==============================================================================
--- sandbox/trunk/pdb/Doc/lib/libmpdb.tex (original)
+++ sandbox/trunk/pdb/Doc/lib/libmpdb.tex Wed Jul 19 19:17:50 2006
@@ -286,6 +286,11 @@
add a history number. It is generally not advisable to change the
prompt.}
+\item[set target-address \var{target_addr}]\label{command:target}
+
+Set the debugger's target. This sets the current address to connect to
+for commands such as \code{attach} and \code{target}.
+
\end{description}
\subsubsection{Show ({\tt show}) \label{subsubsection-show}}
@@ -354,6 +359,10 @@
Show the current debugger prompt.
+\item[show target-address]
+
+Show the current target address.
+
\item[show version]
Show the debugger version number.
@@ -1029,7 +1038,7 @@
Interpreter. The thread that runs the Python Interpreter is a special class
called \class{_MainThread}. Whenever we refer to the main debugger, this is
what we are refering to. The \class{MTracer} objects (slaves)
-are responsible for tracing threads that are created in the script being
+are responsible for handling threads that are created in the script being
debugged. Any messages that need to be output to the user are placed
on the main debugger's \code{outputqueue} queue. This messages are
periodically written to the main debugger's \code{stdout} variable.
@@ -1117,3 +1126,32 @@
\label{proc-debug}
This section describes how \module{mpdb} debugs processes that are external
to the process in which \module{mpdb} is being run.
+
+If a program wishes to allow debugging from another process it must import
+and call the \code{process_debugging} function from the \module{mpdb} module.
+This function sets up a signal handler for \module{mpdb}'s debugging signal
+(defaults to \code{USR1}).
+
+\begin{verbatim}
+# Import and initialise debugging from another process
+import mpdb.process_debugging
+mpdb.process_debugging()
+
+# Regular program code
+\end{verbatim}
+
+From the debugger console a user must issue the 'attach' command, the
+\code{target_addr} variable must be set \ref{command::set}.
+
+\begin{verbatim}
+(MPdb) set target [protcol] [address]
+
+(MPdb) attach PID [USR1]
+\end{verbatim}
+
+Protocol can be any of the protocols the 'pdbserver' and 'target' commands
+can accept. Note however, that if the protcol is \code{tcp} you need not
+specify the hostname, as it can only be 'localhost'.
+
+
+
Modified: sandbox/trunk/pdb/mpdb.py
==============================================================================
--- sandbox/trunk/pdb/mpdb.py (original)
+++ sandbox/trunk/pdb/mpdb.py Wed Jul 19 19:17:50 2006
@@ -19,7 +19,6 @@
from optparse import OptionParser
import pydb
from pydb.gdb import Restart
-from Queue import Queue
import sys
import time
import thread
@@ -42,58 +41,41 @@
- debugging applications on remote machines
- debugging threaded applications
"""
- def __init__(self, completekey='tab'):
+ def __init__(self, completekey='tab', stdin=None, stdout=None):
""" Instantiate a debugger.
The optional argument 'completekey' is the readline name of a
completion key; it defaults to the Tab key. If completekey is
not None and the readline module is available, command completion
- is done automatically.
+ is done automatically. The optional arguments stdin and stdout
+ are the objects that data is read from and written to, respectively.
"""
- pydb.Pdb.__init__(self, completekey)
+ pydb.Pdb.__init__(self, completekey, stdin, stdout)
+ self.orig_stdin = self.stdin
+ self.orig_stdout = self.stdout
self.prompt = '(MPdb)'
self.target = 'local' # local connections by default
self.lastcmd = ''
self.connection = None
self.debugger_name = 'mpdb'
+ self._show_cmds.append('target-address')
+ self._show_cmds.sort()
self._info_cmds.append('target')
+ self._info_cmds.sort()
+ self.target_addr = "" # target address used by 'attach'
- thread.start_new_thread(self.input, ())
- self.inputqueue = Queue()
- self.outputqueue = Queue()
- thread.start_new_thread(self.output, ())
-
- def cmdloop(self, intro=None):
- """ Override the cmdloop from cmd.Cmd so that we only use this
- object's cmdqueue variable for reading commands.
- """
- self.preloop()
+ # We need to be in control of self.stdin
+ self.use_rawinput = False
- if self.intro is not None:
- self.intro = intro
- if self.intro:
- self.outputqueue.put(str(self.intro) + "\n")
- stop = None
- while not stop:
- try:
- line = self.cmdqueue.pop()
- except IndexError:
- time.sleep(0.1)
- continue
-
- line = self.precmd(line)
- stop = self.onecmd(line)
- stop = self.postcmd(stop, line)
- self.postloop()
-
- def msg_nocr(self, msg):
- """ Override this method from pydb so that instead of writing
- to a file object, we place the output on to an output queue. Output
- is picked up by a thread.
- """
- # XXX This should probably be removed
- if msg[-1] == '\n': msg = msg[:-1]
- self.outputqueue.put(msg)
+ def _rebind_input(self, new_input):
+ self.stdin.flush()
+ self.stdin = new_input
+
+ def _rebind_output(self, new_output):
+ self.stdout.flush()
+ self.stdout = new_output
+ if not hasattr(self.stdout, 'flush'):
+ self.stdout.flush = lambda: None
def remote_onecmd(self, line):
""" All commands in 'line' are sent across this object's connection
@@ -143,8 +125,11 @@
# process is started, so we have to connect all over again.
self._disconnect()
time.sleep(3.0)
- self.do_target(self.target_addr)
- return
+ if not self.do_target(self.target_addr):
+ # We cannot trust these variables below to be in a
+ # stable state. i.e. if the pdbserver doesn't come back up.
+ self.onecmd = lambda x: pydb.Pdb.onecmd(self, x)
+ return
self.msg_nocr(ret)
self.lastcmd = line
return
@@ -156,6 +141,7 @@
self.target = 'local'
if hasattr(self, 'local_prompt'):
self.prompt = self.local_prompt
+ self.onecmd = lambda x: pydb.Pdb.onecmd(self, x)
def do_info(self, arg):
"""Extends pydb do_info() to give info about the Mpdb extensions."""
@@ -182,6 +168,33 @@
def help_mpdb(self, *arg):
help()
+ def do_set(self, arg):
+ """ Extends pydb.do_set() to allow setting of mpdb extensions.
+ """
+ if not arg:
+ pydb.Pdb.do_set(self, arg)
+ return
+
+ args = arg.split()
+ if 'target-address'.startswith(args[0]):
+ self.target_addr = "".join(["%s " % a for a in args[1:]])
+ self.target_addr = self.target_addr.strip()
+
+ self.msg('target address set to: %s' % self.target_addr)
+ return
+
+ def do_show(self, arg):
+ """Extends pydb.do_show() to show Mpdb extension settings. """
+ if not arg:
+ pydb.Pdb.do_show(self, arg)
+ return
+
+ args = arg.split()
+ if 'target-address'.startswith(args[0]):
+ self.msg('target address is %s.' % self.target_addr.__repr__())
+ else:
+ pydb.Pdb.do_show(self, arg)
+
# Debugger commands
def do_attach(self, addr):
""" Attach to a process or file outside of Pdb.
@@ -195,6 +208,30 @@
(see the "directory" command). You can also use the "file" command
to specify the program, and to load its symbol table.
"""
+ if not self.target_addr:
+ self.errmsg('No target address is set')
+ return
+ try:
+ pid = int(addr)
+ except ValueError:
+ # no PID
+ self.errmsg('You must specify a process ID to attach to.')
+ return
+ from os import kill
+ from signal import SIGUSR1
+ try:
+ kill(pid, SIGUSR1)
+ except OSError, err:
+ self.errmsg(err)
+ return
+
+ # Will remove this
+ time.sleep(3.0)
+
+ # At the moment by default we'll use named pipes for communication
+ conn_str = 'mconnection.MConnectionClientFIFO p'
+ self.do_target(conn_str)
+
def do_target(self, args):
""" Connect to a target machine or process.
The first argument is the type or protocol of the target machine
@@ -209,14 +246,19 @@
target serial device-name -- Use a remote computer via a serial line
target tcp hostname:port -- Use a remote computer via a socket connection
"""
+ if not args:
+ args = self.target_addr
try:
target, addr = args.split(' ')
except ValueError:
self.errmsg('Invalid arguments')
- return
+ return False
+ # If addr is ':PORT' fill in 'localhost' as the hostname
+ if addr[0] == ':':
+ addr = 'localhost'+addr[:]
if 'remote' in self.target:
self.errmsg('Already connected to a remote machine.')
- return
+ return False
if target == 'tcp':
# TODO: need to save state of current debug session
if self.connection: self.connection.disconnect()
@@ -227,7 +269,7 @@
except ImportError:
self.errmsg('Could not import MConnectionClientTCP')
- return
+ return False
elif target == 'serial':
if self.connection: self.connection.disconnect()
try:
@@ -236,7 +278,7 @@
self.connection = MConnectionSerial()
except ImportError:
self.errmsg('Could not import MConnectionSerial')
- return
+ return False
else:
cls = target[target.rfind('.')+1:]
path = target[:target.rfind('.')]
@@ -246,7 +288,7 @@
self.connection.connect(addr)
except ConnectionFailed, err:
self.errmsg("Failed to connect to %s: (%s)" % (addr, err))
- return
+ return False
# This interpreter no longer interprets commands but sends
# them straight across this object's connection to a server.
# XXX: In the remote_onecmd method we use the local_prompt string
@@ -265,7 +307,7 @@
self.msg_nocr(line)
self.onecmd = self.remote_onecmd
self.target = 'remote-client'
- return
+ return True
def do_detach(self, args):
""" Detach a process or file previously attached.
@@ -376,20 +418,6 @@
self.msg("Re exec'ing\n\t%s" % self._sys_argv)
os.execvp(self._sys_argv[0], self._sys_argv)
- def do_disassemble(self, arg):
- """disassemble [arg]
- With no argument disassemble at the current frame location.
- With a numeric argument, disassemble at the frame location at that
- line number. With a class, method, function, code or string argument
- disassemble that."""
- # XXX Override this method so that we can take control of where
- # the output is written to. This seems generally 'wrong'. Rocky,
- # what's your opinion?
- orig_stdout = sys.stdout
- sys.stdout = self.stdout
- pydb.Pdb.do_disassemble(self, arg)
- sys.stdout = orig_stdout
-
def pdbserver(addr, m):
""" This method sets up a pdbserver debugger that allows debuggers
to connect to 'addr', which is a protocol-specific address, i.e.
@@ -413,11 +441,13 @@
serial = 'serial /dev/ttyC0'
"""
m = MPdb()
+ m.reset()
# Look Ma, no script!
m.do_target(addr)
while True:
try:
m.cmdloop()
+ if m._user_requested_quit: break
except:
break
@@ -437,6 +467,44 @@
m.msg(traceback.format_exc())
break
+# Moving this out of this file makes things 'awkward'; for instance,
+# the outer file has to know how to set up a pdbserver.
+def process_debugging():
+ """ Allow debugging of other processes. This routine should
+ be imported and called near the top of the program file.
+ It sets up signal handlers that are used to create a pdbserver
+ that a debugging client can attach to.
+ """
+ import signal
+ signal.signal(signal.SIGUSR1, signal_handler)
+
+def signal_handler(signum, frame):
+ """ This signal handler replaces the programs signal handler
+ for the SIGUSR1 signal. When a program receives this signal,
+ it creates a pdbserver with a FIFO connection. Debugger
+ clients can then attach to this pdbserver via its PID.
+ """
+ m = MPdb()
+ m.reset()
+ m.running = True
+ m.currentframe = frame
+ m._sys_argv = ['python']
+ for i in sys.argv:
+ m._sys_argv.append(i)
+ m.do_pdbserver('mconnection.MConnectionServerFIFO p')
+ m.set_trace(m.curframe)
+
+def attach(pid, mpdb):
+ """ Attach to running process 'pid'. """
+ mpdb.reset() # botframe, etc should be None
+ mpdb.do_attach(pid)
+ while True:
+ try:
+ mpdb.cmdloop()
+ if mpdb._user_requested_quit: break
+ except Exit:
+ break
+
def main():
""" Main entry point to this module. """
mpdb = MPdb()
@@ -448,12 +516,13 @@
make_option("-t", "--target", dest="target",
help="Specify a target to connect to. The arguments" \
+ " should be of form, 'protocol address'."),
- make_option("-p", "--pdbserver", dest="pdbserver",
+ make_option("--pdbserver", dest="pdbserver",
help="Start the debugger and execute the pdbserver " \
+ "command. The arguments should be of the form," \
+ " 'protocol address scriptname'."),
make_option("-d", "--debug-thread", action="store_true",
- help="Turn on thread debugging.")
+ help="Turn on thread debugging."),
+ make_option("--pid", dest="pid", help="Attach to running process PID.")
]
opts = process_options(mpdb, "mpdb", os.path.basename(sys.argv[0])
@@ -489,6 +558,9 @@
elif opts.debug_thread:
thread_debugging(mpdb)
sys.exit()
+ elif opts.pid:
+ attach(opts.pid, mpdb)
+ sys.exit()
while 1:
try:
@@ -527,26 +599,6 @@
mpdb.msg("Post mortem debugger finished. The " + \
mpdb.mainpyfile + " will be restarted")
-# Utility functions
-
-# Parse arguments
-def parse_opts():
- parser = OptionParser()
- parser.disable_interspersed_args()
- parser.add_option("-s", "--script", dest="scriptname",
- help="The script to debug")
- parser.add_option("-t", "--target", dest="target",
- help="Specify a target to connect to. The arguments" \
- + " should be of form, 'protocol address'.")
- parser.add_option("-p", "--pdbserver", dest="pdbserver",
- help="Start the debugger and execute the pdbserver " \
- + "command. The arguments should be of the form," \
- + " 'protocol address scriptname'.")
- parser.add_option("-d", "--debug-thread", action="store_true",
- help="Turn on thread debugging.")
- (options, args) = parser.parse_args()
- return (options, args)
-
if __name__ == '__main__':
main()
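For orientation, here is a short, hedged sketch of how the reworked commands above fit together when driven programmatically rather than through cmdloop(); the address and failure handling are illustrative and not taken from the checkin:

    from mpdb import MPdb

    m = MPdb()
    m.do_set('target-address tcp :8000')   # stored in m.target_addr
    m.do_show('target-address')            # prints: target address is 'tcp :8000'.

    # do_target() fills in 'localhost' for a bare ':8000' address and now
    # returns True/False, so callers such as remote_onecmd() can detect a
    # pdbserver that never came back up.
    if not m.do_target(''):                # an empty argument falls back to target_addr
        print "connection to the pdbserver failed"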
Added: sandbox/trunk/pdb/mqueue.py
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/mqueue.py Wed Jul 19 19:17:50 2006
@@ -0,0 +1,26 @@
+""" This file contains all code for the queue mechanisms used for
+communication inside the debugger.
+"""
+
+from Queue import Queue, Empty
+
+class MQueue(Queue):
+ """ This class represents a 'communcation' queue. """
+ def __init__(self):
+ Queue.__init__(self)
+
+ def append(self, msg):
+ """ This command simply calls this object's put method. This
+ method allows us to immitate a list object.
+ """
+ self.put_nowait(msg)
+
+ def pop(self, index=None):
+ """ Immitate a list object so that we can use cmdloop from cmd.Cmd.
+ Note: the 'index' argument is never used.
+ """
+ return self.get()
+
+
+
+
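A small, hedged sketch of why MQueue mimics list methods: cmd.Cmd.cmdloop() reads pending commands with cmdqueue.pop(0), so a thread-safe queue exposing append() and pop() can stand in for the plain list without modifying cmd.Cmd. This assumes the mqueue.py added above is importable; it is not an example from the checkin itself:

    from mqueue import MQueue

    q = MQueue()
    q.append('where')    # put_nowait() under the hood
    q.append('quit')
    print q.pop(0)       # 'where' -- the index argument is accepted but ignored
    print q.pop()        # 'quit'  -- blocks like Queue.get() when empty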
From python-checkins at python.org Wed Jul 19 19:18:33 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Wed, 19 Jul 2006 19:18:33 +0200 (CEST)
Subject: [Python-checkins] r50715 - python/trunk/Lib/decimal.py
Message-ID: <20060719171833.B21B01E4003@bag.python.org>
Author: martin.v.loewis
Date: Wed Jul 19 19:18:32 2006
New Revision: 50715
Modified:
python/trunk/Lib/decimal.py
Log:
Revert r50706 (Whitespace normalization) and
r50697: Comments and docs cleanups, and some little fixes
per recommendation from Raymond Hettinger.
Modified: python/trunk/Lib/decimal.py
==============================================================================
--- python/trunk/Lib/decimal.py (original)
+++ python/trunk/Lib/decimal.py Wed Jul 19 19:18:32 2006
@@ -29,8 +29,8 @@
Decimal floating point has finite precision with arbitrarily large bounds.
-The purpose of this module is to support arithmetic using familiar
-"schoolhouse" rules and to avoid some of the tricky representation
+The purpose of the module is to support arithmetic using familiar
+"schoolhouse" rules and to avoid the some of tricky representation
issues associated with binary floating point. The package is especially
useful for financial applications or for contexts where users have
expectations that are at odds with binary floating point (for instance,
@@ -136,7 +136,7 @@
import copy as _copy
-# Rounding
+#Rounding
ROUND_DOWN = 'ROUND_DOWN'
ROUND_HALF_UP = 'ROUND_HALF_UP'
ROUND_HALF_EVEN = 'ROUND_HALF_EVEN'
@@ -145,11 +145,11 @@
ROUND_UP = 'ROUND_UP'
ROUND_HALF_DOWN = 'ROUND_HALF_DOWN'
-# Rounding decision (not part of the public API)
+#Rounding decision (not part of the public API)
NEVER_ROUND = 'NEVER_ROUND' # Round in division (non-divmod), sqrt ONLY
ALWAYS_ROUND = 'ALWAYS_ROUND' # Every operation rounds at end.
-# Errors
+#Errors
class DecimalException(ArithmeticError):
"""Base exception class.
@@ -160,17 +160,17 @@
called if the others are present. This isn't actually used for
anything, though.
+ handle -- Called when context._raise_error is called and the
+ trap_enabler is set. First argument is self, second is the
+ context. More arguments can be given, those being after
+ the explanation in _raise_error (For example,
+ context._raise_error(NewError, '(-x)!', self._sign) would
+ call NewError().handle(context, self._sign).)
+
To define a new exception, it should be sufficient to have it derive
from DecimalException.
"""
def handle(self, context, *args):
- """Called when context._raise_error is called and trap_enabler is set.
-
- First argument is self, second is the context. More arguments can
- be given, those being after the explanation in _raise_error (For
- example, context._raise_error(NewError, '(-x)!', self._sign) would
- call NewError().handle(context, self._sign).)
- """
pass
@@ -179,13 +179,12 @@
This occurs and signals clamped if the exponent of a result has been
altered in order to fit the constraints of a specific concrete
- representation. This may occur when the exponent of a zero result would
- be outside the bounds of a representation, or when a large normal
- number would have an encoded exponent that cannot be represented. In
+ representation. This may occur when the exponent of a zero result would
+ be outside the bounds of a representation, or when a large normal
+ number would have an encoded exponent that cannot be represented. In
this latter case, the exponent is reduced to fit and the corresponding
number of zero digits are appended to the coefficient ("fold-down").
"""
- pass
class InvalidOperation(DecimalException):
@@ -195,8 +194,8 @@
Something creates a signaling NaN
-INF + INF
- 0 * (+-)INF
- (+-)INF / (+-)INF
+ 0 * (+-)INF
+ (+-)INF / (+-)INF
x % 0
(+-)INF % x
x._rescale( non-integer )
@@ -208,21 +207,20 @@
"""
def handle(self, context, *args):
if args:
- if args[0] == 1: # sNaN, must drop 's' but keep diagnostics
+ if args[0] == 1: #sNaN, must drop 's' but keep diagnostics
return Decimal( (args[1]._sign, args[1]._int, 'n') )
return NaN
-
class ConversionSyntax(InvalidOperation):
"""Trying to convert badly formed string.
This occurs and signals invalid-operation if an string is being
converted to a number and it does not conform to the numeric string
- syntax. The result is [0,qNaN].
+ syntax. The result is [0,qNaN].
"""
- def handle(self, context, *args):
- return (0, (0,), 'n') # Passed to something which uses a tuple.
+ def handle(self, context, *args):
+ return (0, (0,), 'n') #Passed to something which uses a tuple.
class DivisionByZero(DecimalException, ZeroDivisionError):
"""Division by 0.
@@ -236,42 +234,42 @@
or of the signs of the operands for divide, or is 1 for an odd power of
-0, for power.
"""
+
def handle(self, context, sign, double = None, *args):
if double is not None:
return (Infsign[sign],)*2
return Infsign[sign]
-
class DivisionImpossible(InvalidOperation):
"""Cannot perform the division adequately.
This occurs and signals invalid-operation if the integer result of a
divide-integer or remainder operation had too many digits (would be
- longer than precision). The result is [0,qNaN].
+ longer than precision). The result is [0,qNaN].
"""
+
def handle(self, context, *args):
return (NaN, NaN)
-
class DivisionUndefined(InvalidOperation, ZeroDivisionError):
"""Undefined result of division.
This occurs and signals invalid-operation if division by zero was
attempted (during a divide-integer, divide, or remainder operation), and
- the dividend is also zero. The result is [0,qNaN].
+ the dividend is also zero. The result is [0,qNaN].
"""
+
def handle(self, context, tup=None, *args):
if tup is not None:
- return (NaN, NaN) # For 0 %0, 0 // 0
+ return (NaN, NaN) #for 0 %0, 0 // 0
return NaN
-
class Inexact(DecimalException):
"""Had to round, losing information.
This occurs and signals inexact whenever the result of an operation is
not exact (that is, it needed to be rounded and any discarded digits
- were non-zero), or if an overflow or underflow condition occurs. The
+ were non-zero), or if an overflow or underflow condition occurs. The
result in all cases is unchanged.
The inexact signal may be tested (or trapped) to determine if a given
@@ -279,27 +277,26 @@
"""
pass
-
class InvalidContext(InvalidOperation):
"""Invalid context. Unknown rounding, for example.
This occurs and signals invalid-operation if an invalid context was
- detected during an operation. This can occur if contexts are not checked
+ detected during an operation. This can occur if contexts are not checked
on creation and either the precision exceeds the capability of the
underlying concrete representation or an unknown or unsupported rounding
- was specified. These aspects of the context need only be checked when
- the values are required to be used. The result is [0,qNaN].
+ was specified. These aspects of the context need only be checked when
+ the values are required to be used. The result is [0,qNaN].
"""
+
def handle(self, context, *args):
return NaN
-
class Rounded(DecimalException):
"""Number got rounded (not necessarily changed during rounding).
This occurs and signals rounded whenever the result of an operation is
rounded (that is, some zero or non-zero digits were discarded from the
- coefficient), or if an overflow or underflow condition occurs. The
+ coefficient), or if an overflow or underflow condition occurs. The
result in all cases is unchanged.
The rounded signal may be tested (or trapped) to determine if a given
@@ -307,20 +304,18 @@
"""
pass
-
class Subnormal(DecimalException):
"""Exponent < Emin before rounding.
This occurs and signals subnormal whenever the result of a conversion or
operation is subnormal (that is, its adjusted exponent is less than
- Emin, before any rounding). The result in all cases is unchanged.
+ Emin, before any rounding). The result in all cases is unchanged.
The subnormal signal may be tested (or trapped) to determine if a given
or operation (or sequence of operations) yielded a subnormal result.
"""
pass
-
class Overflow(Inexact, Rounded):
"""Numerical overflow.
@@ -333,15 +328,16 @@
For round-half-up and round-half-even (and for round-half-down and
round-up, if implemented), the result of the operation is [sign,inf],
- where sign is the sign of the intermediate result. For round-down, the
+ where sign is the sign of the intermediate result. For round-down, the
result is the largest finite number that can be represented in the
- current precision, with the sign of the intermediate result. For
+ current precision, with the sign of the intermediate result. For
round-ceiling, the result is the same as for round-down if the sign of
- the intermediate result is 1, or is [0,inf] otherwise. For round-floor,
+ the intermediate result is 1, or is [0,inf] otherwise. For round-floor,
the result is the same as for round-down if the sign of the intermediate
- result is 0, or is [1,inf] otherwise. In all cases, Inexact and Rounded
+ result is 0, or is [1,inf] otherwise. In all cases, Inexact and Rounded
will also be raised.
- """
+ """
+
def handle(self, context, sign, *args):
if context.rounding in (ROUND_HALF_UP, ROUND_HALF_EVEN,
ROUND_HALF_DOWN, ROUND_UP):
@@ -364,20 +360,18 @@
This occurs and signals underflow if a result is inexact and the
adjusted exponent of the result would be smaller (more negative) than
the smallest value that can be handled by the implementation (the value
- Emin). That is, the result is both inexact and subnormal.
+ Emin). That is, the result is both inexact and subnormal.
The result after an underflow will be a subnormal number rounded, if
- necessary, so that its exponent is not less than Etiny. This may result
+ necessary, so that its exponent is not less than Etiny. This may result
in 0 with the sign of the intermediate result and an exponent of Etiny.
In all cases, Inexact, Rounded, and Subnormal will also be raised.
"""
- pass
-
# List of public traps and flags
_signals = [Clamped, DivisionByZero, Inexact, Overflow, Rounded,
- Underflow, InvalidOperation, Subnormal]
+ Underflow, InvalidOperation, Subnormal]
# Map conditions (per the spec) to signals
_condition_map = {ConversionSyntax:InvalidOperation,
@@ -385,34 +379,32 @@
DivisionUndefined:InvalidOperation,
InvalidContext:InvalidOperation}
-##### Context Functions #####################################################
+##### Context Functions #######################################
# The getcontext() and setcontext() function manage access to a thread-local
# current context. Py2.4 offers direct support for thread locals. If that
# is not available, use threading.currentThread() which is slower but will
# work for older Pythons. If threads are not part of the build, create a
-# mock threading object with threading.local() returning the module
-# namespace.
+# mock threading object with threading.local() returning the module namespace.
try:
import threading
except ImportError:
# Python was compiled without threads; create a mock object instead
import sys
- class MockThreading(object):
+ class MockThreading:
def local(self, sys=sys):
return sys.modules[__name__]
threading = MockThreading()
del sys, MockThreading
-
try:
threading.local
except AttributeError:
- # To fix reloading, force it to create a new context
- # Old contexts have different exceptions in their dicts, making problems.
+ #To fix reloading, force it to create a new context
+ #Old contexts have different exceptions in their dicts, making problems.
if hasattr(threading.currentThread(), '__decimal_context__'):
del threading.currentThread().__decimal_context__
@@ -464,10 +456,10 @@
context.clear_flags()
_local.__decimal_context__ = context
- del threading, local # Don't contaminate the namespace
+ del threading, local # Don't contaminate the namespace
-##### Decimal class ##########################################################
+##### Decimal class ###########################################
class Decimal(object):
"""Floating point class for decimal arithmetic."""
@@ -483,7 +475,7 @@
>>> Decimal('3.14') # string input
Decimal("3.14")
- >>> Decimal((0, (3, 1, 4), -2)) # tuple (sign, digit_tuple, exponent)
+ >>> Decimal((0, (3, 1, 4), -2)) # tuple input (sign, digit_tuple, exponent)
Decimal("3.14")
>>> Decimal(314) # int or long
Decimal("314")
@@ -522,13 +514,12 @@
# tuple/list conversion (possibly from as_tuple())
if isinstance(value, (list,tuple)):
if len(value) != 3:
- raise ValueError('Invalid arguments')
+ raise ValueError, 'Invalid arguments'
if value[0] not in (0,1):
- raise ValueError('Invalid sign')
+ raise ValueError, 'Invalid sign'
for digit in value[1]:
if not isinstance(digit, (int,long)) or digit < 0:
- raise ValueError("The second value in the tuple must be "+
- "composed of non negative integer elements.")
+ raise ValueError, "The second value in the tuple must be composed of non negative integer elements."
self._sign = value[0]
self._int = tuple(value[1])
@@ -562,23 +553,22 @@
if _isnan(value):
sig, sign, diag = _isnan(value)
self._is_special = True
- if len(diag) > context.prec: # Diagnostic info too long
+ if len(diag) > context.prec: #Diagnostic info too long
self._sign, self._int, self._exp = \
context._raise_error(ConversionSyntax)
return self
if sig == 1:
self._exp = 'n' #qNaN
- else: # sig == 2
+ else: #sig == 2
self._exp = 'N' #sNaN
self._sign = sign
- self._int = tuple(map(int, diag)) # Diagnostic info
+ self._int = tuple(map(int, diag)) #Diagnostic info
return self
try:
self._sign, self._int, self._exp = _string2exact(value)
except ValueError:
self._is_special = True
- self._sign, self._int, self._exp = \
- context._raise_error(ConversionSyntax)
+ self._sign, self._int, self._exp = context._raise_error(ConversionSyntax)
return self
raise TypeError("Cannot convert %r to Decimal" % value)
@@ -661,15 +651,15 @@
if self._is_special or other._is_special:
ans = self._check_nans(other, context)
if ans:
- return 1 # Comparison involving NaN's always reports self > other
+ return 1 # Comparison involving NaN's always reports self > other
# INF = INF
return cmp(self._isinfinity(), other._isinfinity())
if not self and not other:
- return 0 # If both 0, sign comparison isn't certain.
+ return 0 #If both 0, sign comparison isn't certain.
- # If different signs, neg one is less
+ #If different signs, neg one is less
if other._sign < self._sign:
return -1
if self._sign < other._sign:
@@ -680,7 +670,7 @@
if self_adjusted == other_adjusted and \
self._int + (0,)*(self._exp - other._exp) == \
other._int + (0,)*(other._exp - self._exp):
- return 0 # Equal, except in precision. ([0]*(-x) = [])
+ return 0 #equal, except in precision. ([0]*(-x) = [])
elif self_adjusted > other_adjusted and self._int[0] != 0:
return (-1)**self._sign
elif self_adjusted < other_adjusted and other._int[0] != 0:
@@ -691,7 +681,7 @@
context = getcontext()
context = context._shallow_copy()
- rounding = context._set_rounding(ROUND_UP) # Round away from 0
+ rounding = context._set_rounding(ROUND_UP) #round away from 0
flags = context._ignore_all_flags()
res = self.__sub__(other, context=context)
@@ -729,7 +719,7 @@
if other is NotImplemented:
return other
- # Compare(NaN, NaN) = NaN
+ #compare(NaN, NaN) = NaN
if (self._is_special or other and other._is_special):
ans = self._check_nans(other, context)
if ans:
@@ -790,11 +780,11 @@
tmp = map(str, self._int)
numdigits = len(self._int)
leftdigits = self._exp + numdigits
- if eng and not self: # self = 0eX wants 0[.0[0]]eY, not [[0]0]0eY
- if self._exp < 0 and self._exp >= -6: # short, no need for e/E
+ if eng and not self: #self = 0eX wants 0[.0[0]]eY, not [[0]0]0eY
+ if self._exp < 0 and self._exp >= -6: #short, no need for e/E
s = '-'*self._sign + '0.' + '0'*(abs(self._exp))
return s
- # exp is closest mult. of 3 >= self._exp
+ #exp is closest mult. of 3 >= self._exp
exp = ((self._exp - 1)// 3 + 1) * 3
if exp != self._exp:
s = '0.'+'0'*(exp - self._exp)
@@ -806,7 +796,7 @@
else:
s += 'e'
if exp > 0:
- s += '+' # 0.0e+3, not 0.0e3
+ s += '+' #0.0e+3, not 0.0e3
s += str(exp)
s = '-'*self._sign + s
return s
@@ -946,19 +936,19 @@
return ans
if self._isinfinity():
- # If both INF, same sign => same as both, opposite => error.
+ #If both INF, same sign => same as both, opposite => error.
if self._sign != other._sign and other._isinfinity():
return context._raise_error(InvalidOperation, '-INF + INF')
return Decimal(self)
if other._isinfinity():
- return Decimal(other) # Can't both be infinity here
+ return Decimal(other) #Can't both be infinity here
shouldround = context._rounding_decision == ALWAYS_ROUND
exp = min(self._exp, other._exp)
negativezero = 0
if context.rounding == ROUND_FLOOR and self._sign != other._sign:
- # If the answer is 0, the sign should be negative, in this case.
+ #If the answer is 0, the sign should be negative, in this case.
negativezero = 1
if not self and not other:
@@ -993,19 +983,19 @@
return Decimal((negativezero, (0,), exp))
if op1.int < op2.int:
op1, op2 = op2, op1
- # OK, now abs(op1) > abs(op2)
+ #OK, now abs(op1) > abs(op2)
if op1.sign == 1:
result.sign = 1
op1.sign, op2.sign = op2.sign, op1.sign
else:
result.sign = 0
- # So we know the sign, and op1 > 0.
+ #So we know the sign, and op1 > 0.
elif op1.sign == 1:
result.sign = 1
op1.sign, op2.sign = (0, 0)
else:
result.sign = 0
- # Now, op1 > abs(op2) > 0
+ #Now, op1 > abs(op2) > 0
if op2.sign == 0:
result.int = op1.int + op2.int
@@ -1062,8 +1052,8 @@
ans = self._check_nans(context=context)
if ans:
return ans
- # Must be infinite, and incrementing makes no difference
- return Decimal(self)
+
+ return Decimal(self) # Must be infinite, and incrementing makes no difference
L = list(self._int)
L[-1] += 1
@@ -1119,7 +1109,7 @@
if not self or not other:
ans = Decimal((resultsign, (0,), resultexp))
if shouldround:
- # Fixing in case the exponent is out of bounds
+ #Fixing in case the exponent is out of bounds
ans = ans._fix(context)
return ans
@@ -1138,7 +1128,7 @@
op1 = _WorkRep(self)
op2 = _WorkRep(other)
- ans = Decimal((resultsign, map(int, str(op1.int * op2.int)), resultexp))
+ ans = Decimal( (resultsign, map(int, str(op1.int * op2.int)), resultexp))
if shouldround:
ans = ans._fix(context)
@@ -1231,11 +1221,12 @@
sign, 1)
return context._raise_error(DivisionByZero, 'x / 0', sign)
- # OK, so neither = 0, INF or NaN
+ #OK, so neither = 0, INF or NaN
+
shouldround = context._rounding_decision == ALWAYS_ROUND
- # If we're dividing into ints, and self < other, stop.
- # self.__abs__(0) does not round.
+ #If we're dividing into ints, and self < other, stop.
+ #self.__abs__(0) does not round.
if divmod and (self.__abs__(0, context) < other.__abs__(0, context)):
if divmod == 1 or divmod == 3:
@@ -1247,7 +1238,7 @@
ans2)
elif divmod == 2:
- # Don't round the mod part, if we don't need it.
+ #Don't round the mod part, if we don't need it.
return (Decimal( (sign, (0,), 0) ), Decimal(self))
op1 = _WorkRep(self)
@@ -1296,7 +1287,7 @@
op1.exp -= 1
if res.exp == 0 and divmod and op2.int > op1.int:
- # Solves an error in precision. Same as a previous block.
+ #Solves an error in precision. Same as a previous block.
if res.int >= prec_limit and shouldround:
return context._raise_error(DivisionImpossible)
@@ -1382,7 +1373,7 @@
# ignored in the calling function.
context = context._shallow_copy()
flags = context._ignore_flags(Rounded, Inexact)
- # Keep DivisionImpossible flags
+ #keep DivisionImpossible flags
(side, r) = self.__divmod__(other, context=context)
if r._isnan():
@@ -1405,7 +1396,7 @@
if r < comparison:
r._sign, comparison._sign = s1, s2
- # Get flags now
+ #Get flags now
self.__divmod__(other, context=context)
return r._fix(context)
r._sign, comparison._sign = s1, s2
@@ -1427,8 +1418,7 @@
if r > comparison or decrease and r == comparison:
r._sign, comparison._sign = s1, s2
context.prec += 1
- numbsquant = len(side.__add__(Decimal(1), context=context)._int)
- if numbsquant >= context.prec:
+ if len(side.__add__(Decimal(1), context=context)._int) >= context.prec:
context.prec -= 1
return context._raise_error(DivisionImpossible)[1]
context.prec -= 1
@@ -1463,7 +1453,7 @@
context = getcontext()
return context._raise_error(InvalidContext)
elif self._isinfinity():
- raise OverflowError("Cannot convert infinity to long")
+ raise OverflowError, "Cannot convert infinity to long"
if self._exp >= 0:
s = ''.join(map(str, self._int)) + '0'*self._exp
else:
@@ -1517,13 +1507,13 @@
context._raise_error(Clamped)
return ans
ans = ans._rescale(Etiny, context=context)
- # It isn't zero, and exp < Emin => subnormal
+ #It isn't zero, and exp < Emin => subnormal
context._raise_error(Subnormal)
if context.flags[Inexact]:
context._raise_error(Underflow)
else:
if ans:
- # Only raise subnormal if non-zero.
+ #Only raise subnormal if non-zero.
context._raise_error(Subnormal)
else:
Etop = context.Etop()
@@ -1540,8 +1530,7 @@
return ans
context._raise_error(Inexact)
context._raise_error(Rounded)
- c = context._raise_error(Overflow, 'above Emax', ans._sign)
- return c
+ return context._raise_error(Overflow, 'above Emax', ans._sign)
return ans
def _round(self, prec=None, rounding=None, context=None):
@@ -1601,18 +1590,18 @@
ans = Decimal( (temp._sign, tmp, temp._exp - expdiff))
return ans
- # OK, but maybe all the lost digits are 0.
+ #OK, but maybe all the lost digits are 0.
lostdigits = self._int[expdiff:]
if lostdigits == (0,) * len(lostdigits):
ans = Decimal( (temp._sign, temp._int[:prec], temp._exp - expdiff))
- # Rounded, but not Inexact
+ #Rounded, but not Inexact
context._raise_error(Rounded)
return ans
# Okay, let's round and lose data
this_function = getattr(temp, self._pick_rounding_function[rounding])
- # Now we've got the rounding function
+ #Now we've got the rounding function
if prec != context.prec:
context = context._shallow_copy()
@@ -1708,7 +1697,7 @@
context = getcontext()
if self._is_special or n._is_special or n.adjusted() > 8:
- # Because the spot << doesn't work with really big exponents
+ #Because the spot << doesn't work with really big exponents
if n._isinfinity() or n.adjusted() > 8:
return context._raise_error(InvalidOperation, 'x ** INF')
@@ -1738,9 +1727,10 @@
return Infsign[sign]
return Decimal( (sign, (0,), 0) )
- # With ludicrously large exponent, just raise an overflow and return inf.
- if not modulo and n > 0 \
- and (self._exp + len(self._int) - 1) * n > context.Emax and self:
+ #with ludicrously large exponent, just raise an overflow and return inf.
+ if not modulo and n > 0 and (self._exp + len(self._int) - 1) * n > context.Emax \
+ and self:
+
tmp = Decimal('inf')
tmp._sign = sign
context._raise_error(Rounded)
@@ -1759,7 +1749,7 @@
context = context._shallow_copy()
context.prec = firstprec + elength + 1
if n < 0:
- # n is a long now, not Decimal instance
+ #n is a long now, not Decimal instance
n = -n
mul = Decimal(1).__div__(mul, context=context)
@@ -1768,7 +1758,7 @@
spot <<= 1
spot >>= 1
- # Spot is the highest power of 2 less than n
+ #Spot is the highest power of 2 less than n
while spot:
val = val.__mul__(val, context=context)
if val._isinfinity():
@@ -1826,7 +1816,7 @@
if exp._isinfinity() or self._isinfinity():
if exp._isinfinity() and self._isinfinity():
- return self # If both are inf, it is OK
+ return self #if both are inf, it is OK
if context is None:
context = getcontext()
return context._raise_error(InvalidOperation,
@@ -1858,8 +1848,7 @@
if self._is_special:
if self._isinfinity():
- return context._raise_error(InvalidOperation,
- 'rescale with an INF')
+ return context._raise_error(InvalidOperation, 'rescale with an INF')
ans = self._check_nans(context=context)
if ans:
@@ -1931,13 +1920,13 @@
return Decimal(self)
if not self:
- # exponent = self._exp / 2, using round_down.
- # if self._exp < 0:
- # exp = (self._exp+1) // 2
- # else:
+ #exponent = self._exp / 2, using round_down.
+ #if self._exp < 0:
+ # exp = (self._exp+1) // 2
+ #else:
exp = (self._exp) // 2
if self._sign == 1:
- # sqrt(-0) = -0
+ #sqrt(-0) = -0
return Decimal( (1, (0,), exp))
else:
return Decimal( (0, (0,), exp))
@@ -1971,7 +1960,8 @@
ans = ans.__add__(tmp.__mul__(Decimal((0, (8,1,9), -3)),
context=context), context=context)
ans._exp -= 1 + tmp.adjusted() // 2
- # ans is now a linear approximation.
+
+ #ans is now a linear approximation.
Emax, Emin = context.Emax, context.Emin
context.Emax, context.Emin = DefaultContext.Emax, DefaultContext.Emin
@@ -1987,12 +1977,12 @@
if context.prec == maxp:
break
- # Round to the answer's precision-- the only error can be 1 ulp.
+ #round to the answer's precision-- the only error can be 1 ulp.
context.prec = firstprec
prevexp = ans.adjusted()
ans = ans._round(context=context)
- # Now, check if the other last digits are better.
+ #Now, check if the other last digits are better.
context.prec = firstprec + 1
# In case we rounded up another digit and we should actually go lower.
if prevexp != ans.adjusted():
@@ -2024,10 +2014,10 @@
context._raise_error(Rounded)
context._raise_error(Inexact)
else:
- # Exact answer, so let's set the exponent right.
- # if self._exp < 0:
- # exp = (self._exp +1)// 2
- # else:
+ #Exact answer, so let's set the exponent right.
+ #if self._exp < 0:
+ # exp = (self._exp +1)// 2
+ #else:
exp = self._exp // 2
context.prec += ans._exp - exp
ans = ans._rescale(exp, context=context)
@@ -2062,13 +2052,13 @@
ans = self
c = self.__cmp__(other)
if c == 0:
- # If both operands are finite and equal in numerical value
+ # if both operands are finite and equal in numerical value
# then an ordering is applied:
#
- # If the signs differ then max returns the operand with the
+ # if the signs differ then max returns the operand with the
# positive sign and min returns the operand with the negative sign
#
- # If the signs are the same then the exponent is used to select
+ # if the signs are the same then the exponent is used to select
# the result.
if self._sign != other._sign:
if self._sign:
@@ -2089,7 +2079,7 @@
def min(self, other, context=None):
"""Returns the smaller value.
- Like min(self, other) except if one is not a number, returns
+ like min(self, other) except if one is not a number, returns
NaN (and signals if one is sNaN). Also rounds.
"""
other = _convert_other(other)
@@ -2097,7 +2087,7 @@
return other
if self._is_special or other._is_special:
- # If one operand is a quiet NaN and the other is number, then the
+ # if one operand is a quiet NaN and the other is number, then the
# number is always returned
sn = self._isnan()
on = other._isnan()
@@ -2111,13 +2101,13 @@
ans = self
c = self.__cmp__(other)
if c == 0:
- # If both operands are finite and equal in numerical value
+ # if both operands are finite and equal in numerical value
# then an ordering is applied:
#
- # If the signs differ then max returns the operand with the
+ # if the signs differ then max returns the operand with the
# positive sign and min returns the operand with the negative sign
#
- # If the signs are the same then the exponent is used to select
+ # if the signs are the same then the exponent is used to select
# the result.
if self._sign != other._sign:
if other._sign:
@@ -2152,38 +2142,37 @@
"""Return the adjusted exponent of self"""
try:
return self._exp + len(self._int) - 1
- # If NaN or Infinity, self._exp is string
+ #If NaN or Infinity, self._exp is string
except TypeError:
return 0
- # Support for pickling, copy, and deepcopy
+ # support for pickling, copy, and deepcopy
def __reduce__(self):
return (self.__class__, (str(self),))
def __copy__(self):
if type(self) == Decimal:
- return self # I'm immutable; therefore I am my own clone
+ return self # I'm immutable; therefore I am my own clone
return self.__class__(str(self))
def __deepcopy__(self, memo):
if type(self) == Decimal:
- return self # My components are also immutable
+ return self # My components are also immutable
return self.__class__(str(self))
-##### Context class ##########################################################
+##### Context class ###########################################
+
-# Get rounding method function:
-rounding_functions = [name for name in Decimal.__dict__.keys()
- if name.startswith('_round_')]
+# get rounding method function:
+rounding_functions = [name for name in Decimal.__dict__.keys() if name.startswith('_round_')]
for name in rounding_functions:
- # Name is like _round_half_even, goes to the global ROUND_HALF_EVEN value.
+ #name is like _round_half_even, goes to the global ROUND_HALF_EVEN value.
globalname = name[1:].upper()
val = globals()[globalname]
Decimal._pick_rounding_function[val] = name
del name, val, globalname, rounding_functions
-
class ContextManager(object):
"""Helper class to simplify Context management.
@@ -2208,13 +2197,12 @@
def __exit__(self, t, v, tb):
setcontext(self.saved_context)
-
class Context(object):
"""Contains the context for a Decimal instance.
Contains:
prec - precision (for use in rounding, division, square roots..)
- rounding - rounding type (how you round).
+ rounding - rounding type. (how you round)
_rounding_decision - ALWAYS_ROUND, NEVER_ROUND -- do you round?
traps - If traps[exception] = 1, then the exception is
raised when it is caused. Otherwise, a value is
@@ -2255,13 +2243,9 @@
def __repr__(self):
"""Show the current context."""
s = []
- s.append(
- 'Context(prec=%(prec)d, rounding=%(rounding)s, Emin=%(Emin)d, Emax=%(Emax)d, capitals=%(capitals)d'
- % vars(self))
- s.append('flags=[' + ', '.join([f.__name__ for f, v
- in self.flags.items() if v]) + ']')
- s.append('traps=[' + ', '.join([t.__name__ for t, v
- in self.traps.items() if v]) + ']')
+ s.append('Context(prec=%(prec)d, rounding=%(rounding)s, Emin=%(Emin)d, Emax=%(Emax)d, capitals=%(capitals)d' % vars(self))
+ s.append('flags=[' + ', '.join([f.__name__ for f, v in self.flags.items() if v]) + ']')
+ s.append('traps=[' + ', '.join([t.__name__ for t, v in self.traps.items() if v]) + ']')
return ', '.join(s) + ')'
def get_manager(self):
@@ -2281,10 +2265,9 @@
def copy(self):
"""Returns a deep copy from self."""
- nc = Context(self.prec, self.rounding, self.traps.copy(),
- self.flags.copy(), self._rounding_decision,
- self.Emin, self.Emax, self.capitals,
- self._clamp, self._ignored_flags)
+ nc = Context(self.prec, self.rounding, self.traps.copy(), self.flags.copy(),
+ self._rounding_decision, self.Emin, self.Emax,
+ self.capitals, self._clamp, self._ignored_flags)
return nc
__copy__ = copy
@@ -2298,16 +2281,16 @@
"""
error = _condition_map.get(condition, condition)
if error in self._ignored_flags:
- # Don't touch the flag
+ #Don't touch the flag
return error().handle(self, *args)
self.flags[error] += 1
if not self.traps[error]:
- # The errors define how to handle themselves.
+ #The errors define how to handle themselves.
return condition().handle(self, *args)
# Errors should only be risked on copies of the context
- # self._ignored_flags = []
+ #self._ignored_flags = []
raise error, explanation
def _ignore_all_flags(self):
@@ -2331,7 +2314,7 @@
def __hash__(self):
"""A Context cannot be hashed."""
# We inherit object.__hash__, so we must deny this explicitly
- raise TypeError("Cannot hash a Context.")
+ raise TypeError, "Cannot hash a Context."
def Etiny(self):
"""Returns Etiny (= Emin - prec + 1)"""
@@ -2357,6 +2340,7 @@
This will make it not round for that operation.
"""
+
rounding = self._rounding_decision
self._rounding_decision = type
return rounding
@@ -2385,12 +2369,12 @@
d = Decimal(num, context=self)
return d._fix(self)
- # Methods
+ #Methods
def abs(self, a):
"""Returns the absolute value of the operand.
If the operand is negative, the result is the same as using the minus
- operation on the operand. Otherwise, the result is the same as using
+ operation on the operand. Otherwise, the result is the same as using
the plus operation on the operand.
>>> ExtendedContext.abs(Decimal('2.1'))
@@ -2492,8 +2476,8 @@
If either operand is a NaN then the general rules apply.
Otherwise, the operands are compared as as though by the compare
- operation. If they are numerically equal then the left-hand operand
- is chosen as the result. Otherwise the maximum (closer to positive
+ operation. If they are numerically equal then the left-hand operand
+ is chosen as the result. Otherwise the maximum (closer to positive
infinity) of the two operands is chosen as the result.
>>> ExtendedContext.max(Decimal('3'), Decimal('2'))
@@ -2512,8 +2496,8 @@
If either operand is a NaN then the general rules apply.
Otherwise, the operands are compared as as though by the compare
- operation. If they are numerically equal then the left-hand operand
- is chosen as the result. Otherwise the minimum (closer to negative
+ operation. If they are numerically equal then the left-hand operand
+ is chosen as the result. Otherwise the minimum (closer to negative
infinity) of the two operands is chosen as the result.
>>> ExtendedContext.min(Decimal('3'), Decimal('2'))
@@ -2544,10 +2528,10 @@
def multiply(self, a, b):
"""multiply multiplies two operands.
- If either operand is a special value then the general rules
- apply. Otherwise, the operands are multiplied together
- ('long multiplication'), resulting in a number which may be
- as long as the sum of the lengths of the two operands.
+ If either operand is a special value then the general rules apply.
+ Otherwise, the operands are multiplied together ('long multiplication'),
+ resulting in a number which may be as long as the sum of the lengths
+ of the two operands.
>>> ExtendedContext.multiply(Decimal('1.20'), Decimal('3'))
Decimal("3.60")
@@ -2602,14 +2586,14 @@
The right-hand operand must be a whole number whose integer part (after
any exponent has been applied) has no more than 9 digits and whose
- fractional part (if any) is all zeros before any rounding. The operand
+ fractional part (if any) is all zeros before any rounding. The operand
may be positive, negative, or zero; if negative, the absolute value of
the power is used, and the left-hand operand is inverted (divided into
1) before use.
If the increased precision needed for the intermediate calculations
- exceeds the capabilities of the implementation then an Invalid
- operation condition is raised.
+ exceeds the capabilities of the implementation then an Invalid operation
+ condition is raised.
If, when raising to a negative power, an underflow occurs during the
division into 1, the operation is not halted at that point but
@@ -2647,18 +2631,18 @@
return a.__pow__(b, modulo, context=self)
def quantize(self, a, b):
- """Returns a value equal to 'a' (rounded), having the exponent of 'b'.
+ """Returns a value equal to 'a' (rounded) and having the exponent of 'b'.
The coefficient of the result is derived from that of the left-hand
- operand. It may be rounded using the current rounding setting (if the
+ operand. It may be rounded using the current rounding setting (if the
exponent is being increased), multiplied by a positive power of ten (if
the exponent is being decreased), or is unchanged (if the exponent is
already equal to that of the right-hand operand).
Unlike other operations, if the length of the coefficient after the
quantize operation would be greater than precision then an Invalid
- operation condition is raised. This guarantees that, unless there is
- an error condition, the exponent of the result of a quantize is always
+ operation condition is raised. This guarantees that, unless there is an
+ error condition, the exponent of the result of a quantize is always
equal to that of the right-hand operand.
Also unlike other operations, quantize will never raise Underflow, even
@@ -2701,9 +2685,9 @@
"""Returns the remainder from integer division.
The result is the residue of the dividend after the operation of
- calculating integer division as described for divide-integer, rounded
- to precision digits if necessary. The sign of the result, if non-zero,
- is the same as that of the original dividend.
+ calculating integer division as described for divide-integer, rounded to
+ precision digits if necessary. The sign of the result, if non-zero, is
+ the same as that of the original dividend.
This operation will fail under the same conditions as integer division
(that is, if integer division on the same two operands would fail, the
@@ -2727,7 +2711,7 @@
def remainder_near(self, a, b):
"""Returns to be "a - b * n", where n is the integer nearest the exact
value of "x / b" (if two integers are equally near then the even one
- is chosen). If the result is equal to 0 then its sign will be the
+ is chosen). If the result is equal to 0 then its sign will be the
sign of a.
This operation will fail under the same conditions as integer division
@@ -2769,7 +2753,7 @@
return a.same_quantum(b)
def sqrt(self, a):
- """Square root of a non-negative number to context precision.
+ """Returns the square root of a non-negative number to context precision.
If the result must be inexact, it is rounded using the round-half-even
algorithm.
@@ -2830,7 +2814,7 @@
as using the quantize() operation using the given operand as the
left-hand-operand, 1E+0 as the right-hand-operand, and the precision
of the operand as the precision setting, except that no flags will
- be set. The rounding mode is taken from the context.
+ be set. The rounding mode is taken from the context.
>>> ExtendedContext.to_integral(Decimal('2.1'))
Decimal("2")
@@ -2851,7 +2835,6 @@
"""
return a.to_integral(context=self)
-
class _WorkRep(object):
__slots__ = ('sign','int','exp')
# sign: 0 or 1
@@ -2906,9 +2889,9 @@
other_len = len(str(other.int))
if numdigits > (other_len + prec + 1 - tmp_len):
# If the difference in adjusted exps is > prec+1, we know
- # other is insignificant, so might as well put a 1 after the
- # precision (since this is only for addition). Also stops
- # use of massive longs.
+ # other is insignificant, so might as well put a 1 after the precision.
+ # (since this is only for addition.) Also stops use of massive longs.
+
extend = prec + 2 - tmp_len
if extend <= 0:
extend = 1
@@ -2930,13 +2913,13 @@
Used on _WorkRep instances during division.
"""
adjust = 0
- # If op1 is smaller, make it larger
+ #If op1 is smaller, make it larger
while op2.int > op1.int:
op1.int *= 10
op1.exp -= 1
adjust += 1
- # If op2 is too small, make it larger
+ #If op2 is too small, make it larger
while op1.int >= (10 * op2.int):
op2.int *= 10
op2.exp -= 1
@@ -2944,8 +2927,7 @@
return op1, op2, adjust
-
-##### Helper Functions #######################################################
+##### Helper Functions ########################################
def _convert_other(other):
"""Convert other to Decimal.
@@ -2986,16 +2968,16 @@
if not num:
return 0
- # get the sign, get rid of trailing [+-]
+ #get the sign, get rid of trailing [+-]
sign = 0
if num[0] == '+':
num = num[1:]
- elif num[0] == '-': # elif avoids '+-nan'
+ elif num[0] == '-': #elif avoids '+-nan'
num = num[1:]
sign = 1
if num.startswith('nan'):
- if len(num) > 3 and not num[3:].isdigit(): # diagnostic info
+ if len(num) > 3 and not num[3:].isdigit(): #diagnostic info
return 0
return (1, sign, num[3:].lstrip('0'))
if num.startswith('snan'):
@@ -3005,7 +2987,7 @@
return 0
-##### Setup Specific Contexts ################################################
+##### Setup Specific Contexts ################################
# The default context prototype used by Context()
# Is mutable, so that new contexts can have different default values
@@ -3038,19 +3020,19 @@
)
-##### Useful Constants (internal use only) ###################################
+##### Useful Constants (internal use only) ####################
-# Reusable defaults
+#Reusable defaults
Inf = Decimal('Inf')
negInf = Decimal('-Inf')
-# Infsign[sign] is infinity w/ that sign
+#Infsign[sign] is infinity w/ that sign
Infsign = (Inf, negInf)
NaN = Decimal('NaN')
-##### crud for parsing strings ################################################
+##### crud for parsing strings #################################
import re
# There's an optional sign at the start, and an optional exponent
@@ -3070,16 +3052,13 @@
([eE](?P<exp>[-+]? \d+))?
# \s*
$
-""", re.VERBOSE).match # Uncomment the \s* to allow leading/trailing spaces
+""", re.VERBOSE).match #Uncomment the \s* to allow leading or trailing spaces.
del re
+# return sign, n, p s.t. float string value == -1**sign * n * 10**p exactly
def _string2exact(s):
- """Return sign, n, p s.t.
-
- Float string value == -1**sign * n * 10**p exactly
- """
m = _parser(s)
if m is None:
raise ValueError("invalid literal for Decimal: %r" % s)
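As a side note on the docstring restored above ("To define a new exception, it should be sufficient to have it derive from DecimalException"), here is a minimal, hedged sketch of the handle() hook; the condition class below is hypothetical and not part of decimal.py:

    from decimal import Decimal, DecimalException, getcontext

    class NegativeArgument(DecimalException):
        """Hypothetical condition, defined only for illustration."""
        def handle(self, context, *args):
            return Decimal(0)      # result used when the trap is not enabled

    ctx = getcontext()
    # ctx._raise_error(NegativeArgument, 'sqrt(-x)') would raise
    # NegativeArgument if its trap were enabled, or return handle()'s
    # Decimal(0) otherwise (after the signal is registered in ctx.traps
    # and ctx.flags).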
From martin at v.loewis.de Wed Jul 19 19:18:52 2006
From: martin at v.loewis.de (=?UTF-8?B?Ik1hcnRpbiB2LiBMw7Z3aXMi?=)
Date: Wed, 19 Jul 2006 19:18:52 +0200
Subject: [Python-checkins] r50697 - python/trunk/Lib/decimal.py
In-Reply-To: <44BE5555.4000500@ewtllc.com>
References: <20060718121614.CA0601E4015@bag.python.org>
<44BE5555.4000500@ewtllc.com>
Message-ID: <44BE697C.8070607@v.loewis.de>
Raymond Hettinger wrote:
> I propose that this patch be reverted.
Done!
Martin
From python-checkins at python.org Wed Jul 19 22:00:41 2006
From: python-checkins at python.org (jackilyn.hoxworth)
Date: Wed, 19 Jul 2006 22:00:41 +0200 (CEST)
Subject: [Python-checkins] r50716 -
python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py
Message-ID: <20060719200041.425E31E4003@bag.python.org>
Author: jackilyn.hoxworth
Date: Wed Jul 19 22:00:40 2006
New Revision: 50716
Modified:
python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py
Log:
I'm still not finished yet.
Modified: python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py
==============================================================================
--- python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py (original)
+++ python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py Wed Jul 19 22:00:40 2006
@@ -30,17 +30,30 @@
log.addHandler(handler)
# create socket
-sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
+class FakeSocket:
+ def __init__(self, text, fileclass=StringIO.StringIO):
+ self.text = text
+ self.fileclass = fileclass
+
+ def makefile(self, mode, bufsize=None):
+ if mode != 'r' and mode != 'rb':
+ raise httplib.UnimplementedFileMode()
+ return self.fileclass(self.text)
+
+sock = FakeSocket("socket")
httplib._log.info("message 1") # first stage of testing
r = httplib.HTTPResponse(sock) # second stage of testing
r.begin() # call the begin method
-"""self.msg == None
-self._read_status == "message 1" == CONTINUE
-skip != True
-self.debuglevel > 0"""
+# class test:
+# def someTest:
+# self.msg == None
+# self._read_status == "message 1" == CONTINUE
+# skip != True
+# self.debuglevel > 0
+
print stringLog.getvalue() # For testing purposes
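For context, a self-contained, hedged sketch of the capture-and-assert pattern this test file appears to be building toward, using only the standard logging module (the logger name and message are illustrative, not taken from the branch):

    import logging
    import StringIO

    stringLog = StringIO.StringIO()             # buffer that captures log output
    handler = logging.StreamHandler(stringLog)
    log = logging.getLogger("example")          # stand-in for httplib's logger
    log.setLevel(logging.INFO)
    log.addHandler(handler)

    log.info("message 1")
    assert "message 1" in stringLog.getvalue()  # what a finished test would check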
From python-checkins at python.org Thu Jul 20 00:15:34 2006
From: python-checkins at python.org (brett.cannon)
Date: Thu, 20 Jul 2006 00:15:34 +0200 (CEST)
Subject: [Python-checkins] r50717 -
python/branches/bcannon-sandboxing/securing_python.txt
Message-ID: <20060719221534.8F9711E400C@bag.python.org>
Author: brett.cannon
Date: Thu Jul 20 00:15:33 2006
New Revision: 50717
Modified:
python/branches/bcannon-sandboxing/securing_python.txt
Log:
Initial draft of doc for using object-capabilities.
Modified: python/branches/bcannon-sandboxing/securing_python.txt
==============================================================================
--- python/branches/bcannon-sandboxing/securing_python.txt (original)
+++ python/branches/bcannon-sandboxing/securing_python.txt Thu Jul 20 00:15:33 2006
@@ -29,6 +29,18 @@
``exceptions`` module are considered in the built-in namespace. There
have also been no imports executed in the interpreter.
+The "security domain" is the boundary at which security is cared
+about. For this dicussion, it is the interpreter. Anything that
+happens within a security domain is considered open and unprotected.
+But any action that tries to cross the boundary of the security domain
+is where the security model and protection comes in.
+
+The "powerbox" is the thing that possesses the ultimate power in the
+system. In our case it is the Python process. No interpreter can
+possess any ability that the overall process does not have. This
+means that we care about interpreter<->interpreter interaction along
+with interpreter<->process interactions.
+
Rationale
///////////////////////////////////////
@@ -116,7 +128,11 @@
resource (or a reference to an object that can reference a resource),
you cannot access it, period. You can provide conditional access by
using a proxy between code and a resource, but that still requires a
-reference to the resource by the proxy.
+reference to the resource by the proxy. This means that your security
+model can be viewed simply by using a whiteboard to draw out the
+interactions between your security domains, whereby any connection
+between domains is a possible security issue if you do not put in a
+proxy to mediate between the two domains.
This leads to a much cleaner implementation of security. By not
having to change internal code in the interpreter to perform identity
@@ -160,7 +176,8 @@
Unfortunately this makes the possibility of a private namespace
non-existent. This poses an issue for providing proxies for resources
since there is no way in Python code to hide the reference to a
-resource.
+resource. It also makes it impossible to provide security at the object
+level using object-capabilities in pure Python code.
Luckily, the Python virtual machine *does* provide a private namespace,
albeit not for pure Python source code. If you use the Python/C
@@ -194,8 +211,13 @@
The threat that this security model is attempting to handle is the
execution of arbitrary Python code in a sandboxed interpreter such
that the code in that interpreter is not able to harm anything outside
-of itself. This means that:
+of itself unless explicitly allowed to. This means that:
+* An interpreter cannot gain abilities the Python process possesses
+ without explicitly being given those abilities.
+ + With the Python process being the powerbox, if an interpreter
+ could gain whatever abilities it wanted to then the security
+ domain would be completely breached.
* An interpreter cannot influence another interpreter directly at the
Python level without explicitly allowing it.
+ This includes preventing communicating with another interpreter.
@@ -210,10 +232,12 @@
explicitly given those resources.
+ This includes importing modules since that requires the ability
to use the resource of the filesystem.
+ + This is mediated by having to go through the process to gain the
+ abilities in the OS that the process possesses.
In order to accomplish these goals, certain things must be made true.
-* The Python process is the "powerbox".
+* The Python process is the powerbox.
+ It controls the initial granting of abilities to interpreters.
* A bare Python interpreter is always trusted.
+ Python source code that can be created in a bare interpreter is
@@ -255,7 +279,12 @@
operating system's memory allocator is not supported at the program
level), protecting files and imports should not need such per-interpreter
protection at such a low level (because those can have extension
-module proxies to provide the security).
+module proxies to provide the security). This means that security is
+based on possessing the authority to do something through a reference
+to an object that can perform the action. That object will most
+likely decide whether to carry out its action based on the arguments
+passed in (whether that is an opaque token, a file path that is
+allowed to be opened, etc.).
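To make the opaque-token case concrete, here is a small, purely
illustrative sketch (the class and method names are invented for this
example and are not part of the proposal)::

    class GuardedAction(object):
        # Holds the authority to perform an action and only exercises it
        # for callers that present the matching opaque token.
        def __init__(self, token, action):
            self._token = token    # opaque token handed out by the powerbox
            self._action = action  # the privileged operation itself

        def perform(self, token, *args):
            if token != self._token:
                raise ValueError('caller does not hold the required token')
            return self._action(*args)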
For common case security measures, the Python standard library
(stdlib) should provide a simple way to provide those measures. Most
@@ -336,7 +365,8 @@
XXX perhaps augment 'sys' so that you list the extensions of files that
can be used for importing? Thought this was controlled somewhere
-already but can't find it.
+already but can't find it. It is returned by ``imp.get_suffixes()``,
+but I can't find where to set it from Python code.
It must be warned that importing any C extension module is dangerous.
Not only are they able to circumvent security measures by executing C
@@ -349,6 +379,31 @@
acting on behalf of the sandboxed interpreter. This violates the
perimeter defence. No one should import extension modules blindly.
+Implementing Import in Python
++++++++++++++++++++++++++++++
+
+To help expose more of what importation requires (and thus make
+implementing a proxy easier), the import machinery should be
+rewritten in Python. This will require some bootstrapping so that
+the code can be loaded into the process without itself requiring
+importation, but that should be doable. Some care must also be taken
+to avoid circular dependencies among the modules needed to handle
+importing (e.g. importing sys, but having that import itself go
+through the import machinery, etc.).
+
+Interaction with another interpreter that might provide an import
+function must also be dealt with. One cannot expose the importation
+of a needed module for the import machinery as it might not be allowed
+by a proxy. This can be handled by allowing the powerbox's import
+function to have modules directly injected into its global namespace.
+But there is also the issue of using the proper ``sys.modules`` for
+storing the modules already imported. You do not want to inject the
+``sys`` module of the powerbox and have all imports end up in its
+``sys.modules`` rather than in that of the interpreter making the
+call. This must be dealt with in some fashion (injecting per-call,
+having a factory function create a new import function based on an
+interpreter passed in, etc.).
+
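One purely illustrative shape of the factory option (all names here
are hypothetical, not a proposed API): build an import function bound
to a per-interpreter module cache, so nothing ever lands in the
powerbox's own ``sys.modules``::

    def make_import(loader, module_cache):
        # 'loader' is whatever proxy actually locates and executes
        # modules; 'module_cache' plays the role of that interpreter's
        # own sys.modules.
        def sandbox_import(name):
            if name in module_cache:
                return module_cache[name]
            module = loader(name)      # the proxy may refuse and raise here
            module_cache[name] = module
            return module
        return sandbox_import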
Sanitizing Built-In Types
-------------------------
@@ -458,10 +513,183 @@
Making the ``sys`` Module Safe
------------------------------
-XXX
+The ``sys`` module is an odd mix of both information and settings for
+the interpreter. Because of this dichotomy, some very useful, but
+innocuous information is stored in the module along with things that
+should not be exposed to sandboxed interpreters.
+
+This means that the ``sys`` module needs to have its safe information
+separated out from the unsafe settings. This will allow an import
+proxy to let through safe information but block out the ability to set
+values.
+
+XXX separate modules, ``sys.settings`` and ``sys.info``, or strip
+``sys`` to settings and put info somewhere else? Or provide a method
+that will create a faked sys module that has the safe values copied
+into it?
+
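A minimal sketch of the faked-module option (the whitelist below is
only a sample drawn from the safe values listed next, and the helper
name is invented)::

    import imp, sys

    SAFE_NAMES = ['byteorder', 'copyright', 'hexversion', 'maxint',
                  'version', 'version_info', 'api_version']

    def make_fake_sys():
        # Copy only whitelisted, read-only information into a fresh
        # module object; none of the dangerous settings are reachable
        # from it.
        fake = imp.new_module('sys')
        for name in SAFE_NAMES:
            setattr(fake, name, getattr(sys, name))
        return fake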
+The safe information values are:
+
+* builtin_module_names
+ Information about what might be blocked from importation.
+* byteorder
+ Needed for networking.
+* copyright
+ Set to a string about the interpreter.
+* displayhook (?)
+* excepthook (?)
+* __displayhook__ (?)
+* __excepthook__ (?)
+* exc_info() (?)
+* exc_clear()
+* exit()
+* exitfunc
+* getcheckinterval()
+ Returns an int.
+* getdefaultencoding()
+ Returns a string about the interpreter.
+* getrefcount()
+ Returns an int about the passed-in object.
+* getrecursionlimit()
+ Returns an int about the interpreter.
+* hexversion
+ Set to an int about the interpreter.
+* last_type
+* last_value
+* last_traceback (?)
+* maxint
+ Set to an int that exposes ambiguous information about the
+ computer.
+* maxunicode
+ Set to an int about the interpreter.
+* meta_path (?)
+* path_hooks (?)
+* path_importer_cache (?)
+* ps1
+* ps2
+* stdin
+* stdout
+* stderr
+* traceback (?)
+* version
+* api_version
+* version_info
+* warnoptions (?)
+
+The dangerous settings are:
+
+* argv
+* subversion
+* _current_frames()
+* dllhandle
+* exc_type
+ Deprecated since 1.5.
+* exc_value
+ Deprecated since 1.5.
+* exc_traceback
+ Deprecated since 1.5.
+* exec_prefix
+ Exposes filesystem information.
+* executable
+ Exposes filesystem information.
+* _getframe()
+* getwindowsversion()
+ Exposes OS information.
+* modules
+* path
+* platform
+ Exposes OS information.
+* prefix
+ Exposes filesystem information.
+* setcheckinterval()
+* setdefaultencoding()
+* setdlopenflags()
+* setprofile()
+* setrecursionlimit()
+* settrace()
+* settscdump()
+* __stdin__
+* __stdout__
+* __stderr__
+* winver
+ Exposes OS information.
+
+
+Protecting I/O
+++++++++++++++
+
+The ``print`` keyword and the built-ins ``raw_input()`` and
+``input()`` use the values stored in ``sys.stdout`` and ``sys.stdin``.
+By exposing these attributes to the creating interpreter, one can set
+them to safe objects, such as instances of ``StringIO``.
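For example (a minimal sketch; the ``StringIO`` objects here stand in
for whatever safe objects the creating interpreter chooses)::

    import sys, StringIO

    fake_in = StringIO.StringIO('canned input\n')
    fake_out = StringIO.StringIO()
    # Done by the creating interpreter before running sandboxed code:
    sys.stdin = fake_in
    sys.stdout = fake_out
    # ... sandboxed code runs; its print statements and raw_input()
    # calls only ever touch the in-memory buffers ...
    captured = fake_out.getvalue()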
Safe Networking
---------------
-XXX
+XXX proxy on socket module, modify open() to be the constructor, etc.
+
+
+Protecting Memory Usage
+-----------------------
+
+To protect memory, low-level hooks into the memory allocator for
+Python is needed. By hooking into the C API for memory allocation and
+deallocation a very rough running count of used memory can kept. This
+can be used to prevent sandboxed interpreters from using so much
+memory that it impacts the overall performance of the system.
+
+Because this has no direct connection with object-capabilities or has
+any form of exposure at the Python level, this feature can be safely
+implemented separately from the rest of the security model.
+
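The intended bookkeeping can be modelled in a few lines of Python
(this is only a model of the logic, not the implementation; the real
hooks live in the C allocator functions, and Python integers do not
overflow the way size_t does)::

    class MemoryAccount(object):
        def __init__(self, cap=0):
            self.cap = cap      # 0 means "no cap set"
            self.usage = 0

        def raise_usage(self, nbytes):
            # Refuse the allocation if it would push usage past the cap.
            if not self.cap:
                return True
            if self.usage + nbytes > self.cap:
                return False    # caller should fail the allocation
            self.usage += nbytes
            return True

        def lower_usage(self, nbytes):
            # Never let the running count go below zero.
            self.usage = max(0, self.usage - nbytes)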
+Existing APIs to protect are:
+
+- _PyObject_New()
+ protected directly
+- _PyObject_NewVar()
+ protected directly
+- _PyObject_Del()
+ remove macro that uses PyObject_Free() and protect directly
+- PyObject_New()
+ implicitly by macro using _PyObject_New()
+- PyObject_NewVar()
+ implicitly by macro using _PyObject_NewVar()
+- PyObject_Del()
+ redefine macro to use _PyObject_Del() instead of PyObject_Free()
+- PyMem_Malloc()
+ protected directly
+- PyMem_Realloc()
+ protected directly
+- PyMem_Free()
+ protected directly
+- PyMem_New()
+ implicitly protected by macro using PyMem_Malloc()
+- PyMem_Resize()
+ implicitly protected by macro using PyMem_Realloc()
+- PyMem_Del()
+ implicitly protected by macro using PyMem_Free()
+- PyMem_MALLOC()
+ redefine macro to use PyMem_Malloc()
+- PyMem_REALLOC()
+ redefine macro to use PyMem_Realloc()
+- PyMem_FREE()
+ redefine macro to use PyMem_Free()
+- PyMem_NEW()
+ implicitly protected by macro using PyMem_MALLOC()
+- PyMem_RESIZE()
+ implicitly protected by macro using PyMem_REALLOC()
+- PyMem_DEL()
+ implicitly protected by macro using PyMem_FREE()
+- PyObject_Malloc()
+ XXX
+- PyObject_Realloc()
+ XXX
+- PyObject_Free()
+ XXX
+- PyObject_MALLOC()
+ XXX
+- PyObject_REALLOC()
+ XXX
+- PyObject_FREE()
+ XXX
From python-checkins at python.org Thu Jul 20 03:37:03 2006
From: python-checkins at python.org (brett.cannon)
Date: Thu, 20 Jul 2006 03:37:03 +0200 (CEST)
Subject: [Python-checkins] r50718 -
python/branches/bcannon-sandboxing/securing_python.txt
Message-ID: <20060720013703.2588D1E4003@bag.python.org>
Author: brett.cannon
Date: Thu Jul 20 03:37:02 2006
New Revision: 50718
Modified:
python/branches/bcannon-sandboxing/securing_python.txt
Log:
Add mention in the what-I-have security explanation that you can pass in
resources directly and skip the accessor functions being made available.
Also mentioned that object-capabilities is a subset of what-I-have security.
All based on comments from Alan Karp.
Modified: python/branches/bcannon-sandboxing/securing_python.txt
==============================================================================
--- python/branches/bcannon-sandboxing/securing_python.txt (original)
+++ python/branches/bcannon-sandboxing/securing_python.txt Thu Jul 20 03:37:02 2006
@@ -86,15 +86,27 @@
========================
A contrast to who-I-am security, what-I-have security never requires
-knowing who is requesting a resource. By never providing a function
-to access a resource or by creating a proxy that wraps the function to
-access a resource with argument checking, you can skip the need to
-know who is making a call.
-
-Using our file example, the program trying to open a file is given a
-proxy that checks whether paths passed into the function match allowed
-based at the creation time of the proxy before using the full-featured
-open function to open the file.
+knowing who is requesting a resource. If you know what resources are
+allowed or needed when you begin a security domain, you can just have
+the powerbox pass in those resources to begin with and not provide a
+function to retrieve them. This alleviates the worry of providing a
+function that can wield more power than the security domain should
+ever have if security is breached on that function.
+
+But if you don't know the exact resources needed ahead of time, you
+pass in a proxy to the resource that checks its arguments to make sure
+they are valid in terms of allowed usage of the protected resource.
+With this approach, you are only doing argument validation, where the
+validation happens to be related to security. No identity check is
+needed at any point.
+
+Using our file example, the program trying to open a file is given the
+open file object directly at time of creation that it will need to
+work with. A proxy to the full-powered open function can be used if
+you need wildcard-style support for opening files. But
+it works just as well, if not better, to pass in all needed file
+objects at the beginning, when the allowed files to work with are
+known, so as to not even risk exposing the file-opening function.
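A minimal sketch of the pass-it-in-directly approach (the path and
variable names here are arbitrary examples)::

    # Done by the powerbox, outside the sandbox:
    log_file = open('/tmp/sandbox.log', 'w')
    sandbox_namespace = {'log_file': log_file}
    # The sandboxed code only ever sees 'log_file'; it can write to
    # that one file but has no function with which to open any other.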
This illustrates a subtle, but key difference between who-I-am and
what-I-have security. For who-I-am, you must know who the caller is
@@ -105,7 +117,7 @@
Object-Capabilities
///////////////////////////////////////
-What-I-have security is more often called the object-capabilities
+What-I-have security is a super-set of the object-capabilities
security model. The belief here is in POLA (Principle Of Least
Authority): you give a program exactly what it needs, and no more. By
providing a function that can open any file that relies on identity to
From anthony at interlink.com.au Thu Jul 20 03:38:38 2006
From: anthony at interlink.com.au (Anthony Baxter)
Date: Thu, 20 Jul 2006 11:38:38 +1000
Subject: [Python-checkins] r50697 - python/trunk/Lib/decimal.py
In-Reply-To: <44BE5555.4000500@ewtllc.com>
References: <20060718121614.CA0601E4015@bag.python.org>
<44BE5555.4000500@ewtllc.com>
Message-ID: <200607201138.42337.anthony@interlink.com.au>
On Thursday 20 July 2006 01:52, Raymond Hettinger wrote:
> I propose that this patch be reverted.
Martin's already done it, but I agree. Sorry I didn't get to this
sooner - been off the air the last two days with a bad case of
can't-type-from-RSI :-/
Anthony
--
Anthony Baxter
It's never too late to have a happy childhood.
From python-checkins at python.org Thu Jul 20 17:54:16 2006
From: python-checkins at python.org (phillip.eby)
Date: Thu, 20 Jul 2006 17:54:16 +0200 (CEST)
Subject: [Python-checkins] r50719 - in python/trunk/Lib: inspect.py
test/test_inspect.py
Message-ID: <20060720155416.9253A1E4003@bag.python.org>
Author: phillip.eby
Date: Thu Jul 20 17:54:16 2006
New Revision: 50719
Modified:
python/trunk/Lib/inspect.py
python/trunk/Lib/test/test_inspect.py
Log:
Fix SF#1516184 (again) and add a test to prevent regression.
(There was a problem with empty filenames still causing recursion)
Modified: python/trunk/Lib/inspect.py
==============================================================================
--- python/trunk/Lib/inspect.py (original)
+++ python/trunk/Lib/inspect.py Thu Jul 20 17:54:16 2006
@@ -364,8 +364,9 @@
The idea is for each object to have a unique origin, so this routine
normalizes the result as much as possible."""
- return os.path.normcase(
- os.path.abspath(_filename or getsourcefile(object) or getfile(object)))
+ if _filename is None:
+ _filename = getsourcefile(object) or getfile(object)
+ return os.path.normcase(os.path.abspath(_filename))
modulesbyfile = {}
Modified: python/trunk/Lib/test/test_inspect.py
==============================================================================
--- python/trunk/Lib/test/test_inspect.py (original)
+++ python/trunk/Lib/test/test_inspect.py Thu Jul 20 17:54:16 2006
@@ -187,6 +187,7 @@
exec "def x(): pass" in m.__dict__
self.assertEqual(inspect.getsourcefile(m.x.func_code), '')
del sys.modules[name]
+ inspect.getmodule(compile('a=10','','single'))
class TestDecorators(GetSourceBase):
fodderFile = mod2
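An illustrative use of the behaviour the new test pins down (assuming
the fix above is applied): a code object compiled from a string has an
empty filename, and inspect.getmodule() simply fails to find a module
for it rather than recursing::

    import inspect

    code = compile('a = 10', '', 'single')
    module = inspect.getmodule(code)   # expected: None, no recursion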
From python-checkins at python.org Thu Jul 20 18:28:40 2006
From: python-checkins at python.org (vinay.sajip)
Date: Thu, 20 Jul 2006 18:28:40 +0200 (CEST)
Subject: [Python-checkins] r50721 - python/trunk/Doc/lib/liblogging.tex
Message-ID: <20060720162840.8E96D1E4003@bag.python.org>
Author: vinay.sajip
Date: Thu Jul 20 18:28:39 2006
New Revision: 50721
Modified:
python/trunk/Doc/lib/liblogging.tex
Log:
Updated documentation for TimedRotatingFileHandler relating to how rollover files are named. The previous documentation was wrongly the same as for RotatingFileHandler.
Modified: python/trunk/Doc/lib/liblogging.tex
==============================================================================
--- python/trunk/Doc/lib/liblogging.tex (original)
+++ python/trunk/Doc/lib/liblogging.tex Thu Jul 20 18:28:39 2006
@@ -1068,13 +1068,11 @@
\end{tableii}
If \var{backupCount} is non-zero, the system will save old log files by
-appending the extensions ".1", ".2" etc., to the filename. For example,
-with a \var{backupCount} of 5 and a base file name of \file{app.log},
-you would get \file{app.log}, \file{app.log.1}, \file{app.log.2}, up to
-\file{app.log.5}. The file being written to is always \file{app.log}.
-When this file is filled, it is closed and renamed to \file{app.log.1},
-and if files \file{app.log.1}, \file{app.log.2}, etc. exist, then they
-are renamed to \file{app.log.2}, \file{app.log.3} etc. respectively.
+appending extensions to the filename. The extensions are date-and-time
+based, using the strftime format \code{%Y-%m-%d_%H-%M-%S} or a leading
+portion thereof, depending on the rollover interval. At most \var{backupCount}
+files will be kept, and if more would be created when rollover occurs, the
+oldest one is deleted.
\end{classdesc}
\begin{methoddesc}{doRollover}{}
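A small usage sketch matching the corrected description (the file name
and rollover settings are arbitrary examples)::

    import logging
    import logging.handlers

    handler = logging.handlers.TimedRotatingFileHandler(
        'app.log', when='midnight', interval=1, backupCount=5)
    logging.getLogger().addHandler(handler)
    # After a few rollovers the directory holds app.log plus files such
    # as app.log.2006-07-20, app.log.2006-07-21, ...; only the newest
    # backupCount of them are kept.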
From python-checkins at python.org Thu Jul 20 18:28:40 2006
From: python-checkins at python.org (georg.brandl)
Date: Thu, 20 Jul 2006 18:28:40 +0200 (CEST)
Subject: [Python-checkins] r50720 - python/trunk/Lib/subprocess.py
Message-ID: <20060720162840.95B461E4009@bag.python.org>
Author: georg.brandl
Date: Thu Jul 20 18:28:39 2006
New Revision: 50720
Modified:
python/trunk/Lib/subprocess.py
Log:
Guard for _active being None in __del__ method.
Modified: python/trunk/Lib/subprocess.py
==============================================================================
--- python/trunk/Lib/subprocess.py (original)
+++ python/trunk/Lib/subprocess.py Thu Jul 20 18:28:39 2006
@@ -618,7 +618,7 @@
return
# In case the child hasn't been waited on, check if it's done.
self.poll(_deadstate=sys.maxint)
- if self.returncode is None:
+ if self.returncode is None and _active is not None:
# Child is still running, keep us alive until we can wait on it.
_active.append(self)
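The general pattern being guarded against, as an illustrative sketch
(not the stdlib code itself): during interpreter shutdown, module
globals such as _active may already have been set to None by the time
__del__ runs, so they must be checked first::

    _active = []

    class _Tracked(object):
        def __del__(self):
            # At shutdown _active may have been cleared to None; only
            # append while the module state is still intact.
            if _active is not None:
                _active.append(self)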
From python-checkins at python.org Thu Jul 20 19:30:50 2006
From: python-checkins at python.org (brett.cannon)
Date: Thu, 20 Jul 2006 19:30:50 +0200 (CEST)
Subject: [Python-checkins] r50723 -
python/branches/bcannon-sandboxing/securing_python.txt
Message-ID: <20060720173050.731E21E400A@bag.python.org>
Author: brett.cannon
Date: Thu Jul 20 19:30:50 2006
New Revision: 50723
Modified:
python/branches/bcannon-sandboxing/securing_python.txt
Log:
Clarified certain parts that may not have been clear to non-python-devers.
Generated by comments from Lawrence Oluyede.
Modified: python/branches/bcannon-sandboxing/securing_python.txt
==============================================================================
--- python/branches/bcannon-sandboxing/securing_python.txt (original)
+++ python/branches/bcannon-sandboxing/securing_python.txt Thu Jul 20 19:30:50 2006
@@ -36,10 +36,13 @@
is where the security model and protection comes in.
The "powerbox" is the thing that possesses the ultimate power in the
-system. In our case it is the Python process. No interpreter can
-possess any ability that the overall process does not have. This
-means that we care about interpreter<->interpreter interaction along
-with interpreter<->process interactions.
+system for giving out abilities. In our case it is the Python
+process. No interpreter can possess any ability that the overall
+process does not have. It is up to the Python process to initially
+hand out abilities to interpreters to use either for themselves or to
+give to interpreters they create themselves. This means that we care
+about interpreter<->interpreter interaction along with
+interpreter<->process interactions.
Rationale
@@ -214,7 +217,10 @@
To rectify the situation, some changes will be needed to some built-in
objects in Python. It should mostly consist of abstracting or
refactoring certain abilities out to an extension module so that
-access can be protected using import guards.
+access can be protected using import guards. For instance, as it
+stands now, ``object.__subclasses__()`` will return a list of its
+subclasses, regardless of what interpreter a given subclass was
+defined in.
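A quick illustration of the leak (purely for demonstration)::

    class Hidden(object):
        pass

    # Any code holding 'object' can now reach Hidden, no matter which
    # interpreter or namespace it was defined in.
    assert Hidden in object.__subclasses__()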
Threat Model
@@ -309,13 +315,15 @@
it is not reasonable to expect existing code to work in a sandboxed
interpreter.
-Keeping Python "pythonic" is required for all design decisions. If
-removing an ability leads to something being unpythonic, it will not
-be done. This does not mean existing pythonic code must continue to
-work, but the spirit of being pythonic will not be compromised in the
-name of the security model. While this might lead to a weaker
-security model, this is a price that must be paid in order for Python
-to continue to be the language that it is.
+Keeping Python "pythonic" is required for all design decisions.
+In general, being pythonic means that something fits the general
+design guidelines (run ``import this`` from a Python interpreter to
+see the basic ones). If removing an ability leads to something being
+unpythonic, it will not be done. This does not mean existing pythonic
+code must continue to work, but the spirit of being pythonic will not
+be compromised in the name of the security model. While this might
+lead to a weaker security model, this is a price that must be paid in
+order for Python to continue to be the language that it is.
Restricting what is in the built-in namespace and the safe-guarding
the interpreter (which includes safe-guarding the built-in types) is
@@ -477,7 +485,7 @@
do not want to expose information about your environment on top of
protecting its use. This means that filesystem paths typically should
not be exposed. Unfortunately, Python exposes file paths all over the
-place:
+place that will need to be hidden:
* Modules
+ ``__file__`` attribute
@@ -487,7 +495,7 @@
+ ``__path__`` attribute
* XXX
-XXX how to expose safely?
+XXX how to expose safely? ``path()`` built-in?
Mutable Shared State
@@ -647,9 +655,10 @@
To protect memory, low-level hooks into the memory allocator for
Python is needed. By hooking into the C API for memory allocation and
-deallocation a very rough running count of used memory can kept. This
-can be used to prevent sandboxed interpreters from using so much
-memory that it impacts the overall performance of the system.
+deallocation a *very* rough running count of used memory can be kept.
+This can be used to compare against the set memory cap to prevent
+sandboxed interpreters from using so much memory that it impacts the
+overall performance of the system.
Because this has no direct connection with object-capabilities or has
any form of exposure at the Python level, this feature can be safely
From python-checkins at python.org Thu Jul 20 19:49:09 2006
From: python-checkins at python.org (brett.cannon)
Date: Thu, 20 Jul 2006 19:49:09 +0200 (CEST)
Subject: [Python-checkins] r50725 -
python/branches/bcannon-sandboxing/securing_python.txt
Message-ID: <20060720174909.71BC31E4003@bag.python.org>
Author: brett.cannon
Date: Thu Jul 20 19:49:09 2006
New Revision: 50725
Modified:
python/branches/bcannon-sandboxing/securing_python.txt
Log:
Remove permission and authority labels for who-I-am security compared to what-I-have security. Both types of security have both permissions and authority in them.
Based on comments from Mark S. Miller.
Modified: python/branches/bcannon-sandboxing/securing_python.txt
==============================================================================
--- python/branches/bcannon-sandboxing/securing_python.txt (original)
+++ python/branches/bcannon-sandboxing/securing_python.txt Thu Jul 20 19:49:09 2006
@@ -66,8 +66,7 @@
///////////////////////////////////////
There are essentially two types of security: who-I-am
-(permissions-based) security and what-I-have (authority-based)
-security.
+security and what-I-have security.
Who-I-Am Security
========================
From buildbot at python.org Thu Jul 20 20:19:47 2006
From: buildbot at python.org (buildbot at python.org)
Date: Thu, 20 Jul 2006 18:19:47 +0000
Subject: [Python-checkins] buildbot warnings in MIPS Debian trunk
Message-ID: <20060720181947.A28FC1E4003@bag.python.org>
The Buildbot has detected a new failure of MIPS Debian trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/MIPS%2520Debian%2520trunk/builds/302
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: phillip.eby
Build Had Warnings: warnings failed slave lost
sincerely,
-The Buildbot
From python-checkins at python.org Thu Jul 20 21:42:41 2006
From: python-checkins at python.org (brett.cannon)
Date: Thu, 20 Jul 2006 21:42:41 +0200 (CEST)
Subject: [Python-checkins] r50730 - in python/branches/bcannon-sandboxing:
Include/Python.h Include/objimpl.h Include/pyerrors.h
Include/pymem.h Include/pystate.h Include/sandbox.h
Lib/test/exception_hierarchy.txt Makefile.pre.in
Modules/sandboxmodule.c Objects/exceptions.c Objects/object.c
Python/pystate.c Python/pythonrun.c Python/sandbox.c
configure configure.in pyconfig.h.in setup.py
Message-ID: <20060720194241.31EBB1E4003@bag.python.org>
Author: brett.cannon
Date: Thu Jul 20 21:42:38 2006
New Revision: 50730
Removed:
python/branches/bcannon-sandboxing/Include/sandbox.h
python/branches/bcannon-sandboxing/Modules/sandboxmodule.c
python/branches/bcannon-sandboxing/Python/sandbox.c
Modified:
python/branches/bcannon-sandboxing/Include/Python.h
python/branches/bcannon-sandboxing/Include/objimpl.h
python/branches/bcannon-sandboxing/Include/pyerrors.h
python/branches/bcannon-sandboxing/Include/pymem.h
python/branches/bcannon-sandboxing/Include/pystate.h
python/branches/bcannon-sandboxing/Lib/test/exception_hierarchy.txt
python/branches/bcannon-sandboxing/Makefile.pre.in
python/branches/bcannon-sandboxing/Objects/exceptions.c
python/branches/bcannon-sandboxing/Objects/object.c
python/branches/bcannon-sandboxing/Python/pystate.c
python/branches/bcannon-sandboxing/Python/pythonrun.c
python/branches/bcannon-sandboxing/configure
python/branches/bcannon-sandboxing/configure.in
python/branches/bcannon-sandboxing/pyconfig.h.in
python/branches/bcannon-sandboxing/setup.py
Log:
Initial stab at introducing a special build for capping memory usage in an
interpreter. Build with Py_MEMORY_CAP defined (can be done with
--with-memory-cap) to activate features.
Have not tested actual memory cap usage; have only built, did not set a
memory cap, and then ran the testing suite (which all passes). Still need to
figure out how to cap the memory usage. Maybe put something in sys or gc?
Also still need to go through and change places that use a bare malloc()/free()
instead of PyMem_Malloc()/PyMem_Free(). Also catch places that inline
PyObject_New() and call PyObject_Malloc() directly.
Modified: python/branches/bcannon-sandboxing/Include/Python.h
==============================================================================
--- python/branches/bcannon-sandboxing/Include/Python.h (original)
+++ python/branches/bcannon-sandboxing/Include/Python.h Thu Jul 20 21:42:38 2006
@@ -114,7 +114,6 @@
#include "pyerrors.h"
#include "pystate.h"
-#include "sandbox.h"
#include "pyarena.h"
#include "modsupport.h"
Modified: python/branches/bcannon-sandboxing/Include/objimpl.h
==============================================================================
--- python/branches/bcannon-sandboxing/Include/objimpl.h (original)
+++ python/branches/bcannon-sandboxing/Include/objimpl.h Thu Jul 20 21:42:38 2006
@@ -128,11 +128,15 @@
#endif /* WITH_PYMALLOC */
+#ifndef Py_MEMORY_CAP
#define PyObject_Del PyObject_Free
-#define PyObject_DEL PyObject_FREE
-
/* for source compatibility with 2.2 */
#define _PyObject_Del PyObject_Free
+#else /* Py_MEMORY_CAP */
+PyAPI_FUNC(void) _PyObject_Del(void *);
+#define PyObject_Del _PyObject_Del
+#endif /* !Py_MEMORY_CAP */
+#define PyObject_DEL PyObject_FREE
/*
* Generic object allocator interface
Modified: python/branches/bcannon-sandboxing/Include/pyerrors.h
==============================================================================
--- python/branches/bcannon-sandboxing/Include/pyerrors.h (original)
+++ python/branches/bcannon-sandboxing/Include/pyerrors.h Thu Jul 20 21:42:38 2006
@@ -140,7 +140,6 @@
PyAPI_DATA(PyObject *) PyExc_NameError;
PyAPI_DATA(PyObject *) PyExc_OverflowError;
PyAPI_DATA(PyObject *) PyExc_RuntimeError;
-PyAPI_DATA(PyObject *) PyExc_SandboxError;
PyAPI_DATA(PyObject *) PyExc_NotImplementedError;
PyAPI_DATA(PyObject *) PyExc_SyntaxError;
PyAPI_DATA(PyObject *) PyExc_IndentationError;
Modified: python/branches/bcannon-sandboxing/Include/pymem.h
==============================================================================
--- python/branches/bcannon-sandboxing/Include/pymem.h (original)
+++ python/branches/bcannon-sandboxing/Include/pymem.h Thu Jul 20 21:42:38 2006
@@ -55,12 +55,18 @@
no longer supported. They used to call PyErr_NoMemory() on failure. */
/* Macros. */
-#ifdef PYMALLOC_DEBUG
+#if defined (PYMALLOC_DEBUG)
/* Redirect all memory operations to Python's debugging allocator. */
#define PyMem_MALLOC PyObject_MALLOC
#define PyMem_REALLOC PyObject_REALLOC
#define PyMem_FREE PyObject_FREE
+#elif defined (Py_MEMORY_CAP)
+/* Redirect all memory allocations through memory tracking functions. */
+#define PyMem_MALLOC PyMem_Malloc
+#define PyMem_REALLOC PyMem_Realloc
+#define PyMem_FREE PyMem_Free
+
#else /* ! PYMALLOC_DEBUG */
/* PyMem_MALLOC(0) means malloc(1). Some systems would return NULL
Modified: python/branches/bcannon-sandboxing/Include/pystate.h
==============================================================================
--- python/branches/bcannon-sandboxing/Include/pystate.h (original)
+++ python/branches/bcannon-sandboxing/Include/pystate.h Thu Jul 20 21:42:38 2006
@@ -32,8 +32,9 @@
#ifdef WITH_TSC
int tscdump;
#endif
-#ifdef PySandbox_SUPPORTED
- PySandboxState *sandbox_state;
+#ifdef Py_MEMORY_CAP
+ size_t mem_cap;
+ size_t mem_usage;
#endif
} PyInterpreterState;
@@ -106,6 +107,13 @@
PyAPI_FUNC(void) PyInterpreterState_Clear(PyInterpreterState *);
PyAPI_FUNC(void) PyInterpreterState_Delete(PyInterpreterState *);
+#ifdef Py_MEMORY_CAP
+#define PyInterpreterState_SET_MEMORY_CAP(interp, cap) (interp->mem_cap = cap)
+PyAPI_FUNC(PyInterpreterState *) PyInterpreterState_SafeGet(void);
+PyAPI_FUNC(int) PyInterpreterState_RaiseMemoryUsage(PyInterpreterState *, size_t);
+PyAPI_FUNC(void) PyInterpreterState_LowerMemoryUsage(PyInterpreterState *, size_t);
+#endif /* Py_MEMORY_CAP */
+
PyAPI_FUNC(PyThreadState *) PyThreadState_New(PyInterpreterState *);
PyAPI_FUNC(void) PyThreadState_Clear(PyThreadState *);
PyAPI_FUNC(void) PyThreadState_Delete(PyThreadState *);
Deleted: /python/branches/bcannon-sandboxing/Include/sandbox.h
==============================================================================
--- /python/branches/bcannon-sandboxing/Include/sandbox.h Thu Jul 20 21:42:38 2006
+++ (empty file)
@@ -1,56 +0,0 @@
-#ifdef PySandbox_SUPPORTED
-
-#ifndef Py_SANDBOX_H
-#define Py_SANDBOX_H
-
-#ifdef __cplusplus
-extern "C" {
-#endif
-
-struct _sandbox_state; /* Forward */
-
-typedef struct _sandbox_state {
- /* The memory cap and current usage. */
- size_t mem_cap;
- size_t mem_usage;
-
-} PySandboxState;
-
-typedef struct {
- PyObject_HEAD
- PyThreadState *tstate;
-} PySandboxObject;
-
-PyAPI_DATA(PyTypeObject) PySandbox_Type;
-
-/* Return the sandbox state struct. */
-#define _PySandbox_GET() (PyThreadState_GET()->interp->sandbox_state)
-
-/* Return true if sandboxing is turn on for the current interpreter. */
-#define _PySandbox_Check() (_PySandbox_GET() != NULL)
-
-/* Return true if memory caps are to be used.
- Assumes sandboxing is turned on. */
-#define _PySandbox_IsMemCapped() (_PySandbox_GET()->mem_cap > 0)
-
-
-/*
- Memory
-*/
-
-PyAPI_FUNC(int) PySandbox_SetMemoryCap(PyThreadState *, size_t);
-
-PyAPI_FUNC(int) _PySandbox_AllowedMemoryAlloc(size_t);
-/* Return for caller if memory allocation would exceed memory cap. */
-#define PySandbox_AllowedMemoryAlloc(alloc, err_return) \
- if (!_PySandbox_AllowedMemoryAlloc(alloc)) return err_return
-
-/* Lower memory usage. */
-PyAPI_FUNC(void) PySandbox_AllowedMemoryFree(size_t);
-
-#ifdef __cplusplus
-}
-#endif
-
-#endif /* Py_SANDBOX_H */
-#endif /* PySandbox_SUPPORTED */
Modified: python/branches/bcannon-sandboxing/Lib/test/exception_hierarchy.txt
==============================================================================
--- python/branches/bcannon-sandboxing/Lib/test/exception_hierarchy.txt (original)
+++ python/branches/bcannon-sandboxing/Lib/test/exception_hierarchy.txt Thu Jul 20 21:42:38 2006
@@ -27,7 +27,6 @@
| +-- ReferenceError
| +-- RuntimeError
| | +-- NotImplementedError
- | +-- SandboxError
| +-- SyntaxError
| | +-- IndentationError
| | +-- TabError
Modified: python/branches/bcannon-sandboxing/Makefile.pre.in
==============================================================================
--- python/branches/bcannon-sandboxing/Makefile.pre.in (original)
+++ python/branches/bcannon-sandboxing/Makefile.pre.in Thu Jul 20 21:42:38 2006
@@ -262,7 +262,6 @@
Python/mysnprintf.o \
Python/pyarena.o \
Python/pyfpe.o \
- Python/sandbox.o \
Python/pystate.o \
Python/pythonrun.o \
Python/structmember.o \
@@ -540,7 +539,6 @@
Include/pyfpe.h \
Include/pymem.h \
Include/pyport.h \
- Include/sandbox.h \
Include/pystate.h \
Include/pythonrun.h \
Include/rangeobject.h \
Deleted: /python/branches/bcannon-sandboxing/Modules/sandboxmodule.c
==============================================================================
--- /python/branches/bcannon-sandboxing/Modules/sandboxmodule.c Thu Jul 20 21:42:38 2006
+++ (empty file)
@@ -1,227 +0,0 @@
-#include "Python.h"
-#include "structmember.h"
-
-#ifdef __cplusplus
-extern "C" {
-#endif
-
-/*
- Destroy the sandboxed interpreter and dealloc memory.
-*/
-static void
-sandbox_dealloc(PyObject *self)
-{
- PyThreadState *sand_tstate = NULL;
- PyThreadState *cur_tstate = NULL;
-
- /* To destroy an interpreter using Py_EndInterpreter() it must be the
- currently running interpreter. This means you must temporarily make the
- sandboxed interpreter the running interpreter again, destroy it, and then
- swap back to the interpreter that created the interpreter in the first
- place. */
- sand_tstate = ((PySandboxObject *)self)->tstate;
- cur_tstate = PyThreadState_Swap(sand_tstate);
-
- Py_EndInterpreter(sand_tstate);
- PyEval_RestoreThread(cur_tstate);
-
- /* XXX need to do any special memory dealloc for sandboxed interpreter? */
- self->ob_type->tp_free(self);
-}
-
-/*
- Create new sandboxed interpreter.
- XXX Might be better to wait until actual execution occurs to create the interpreter. Would make these objects one-offs, but could then change this to a function instead where you pass in the needed arguments to handle everything (which is fine since object-capabilities puts the security into __builtins__ and thus only need that)).
- XXX Could also create thread/interpreter from scratch and avoid all of this swapping of thread states (would need to do initialization that Py_NewInterpreter() did, though). */
-static PyObject *
-sandbox_new(PyTypeObject *type, PyObject *args, PyObject *kwds)
-{
- PySandboxObject *self;
- PyThreadState *cur_tstate;
-
- self = (PySandboxObject *)type->tp_alloc(type, 0);
- if (self == NULL)
- return NULL;
-
- /* Creating a new interpreter swaps out the current one. */
- cur_tstate = PyThreadState_GET();
-
- /* XXX change to call PySandbox_NewInterpreter() */
- if (Py_NewInterpreter() == NULL) {
- Py_DECREF(self);
- /* XXX SandboxError best exception to use here? */
- PyErr_SetString(PyExc_SandboxError, "sub-interpreter creation failed");
- return NULL;
- }
-
- self->tstate = PyThreadState_Swap(cur_tstate);
- if (self->tstate == NULL) {
- Py_DECREF(self);
- PyErr_SetString(PyExc_SandboxError, "sub-interpreter swap failed");
- return NULL;
- }
-
- return (PyObject *)self;
-}
-
-static PyObject *
-sandbox_run(PyObject *self, PyObject *arg)
-{
- PySandboxObject *sandbox_self = (PySandboxObject *)self;
- const char *str_arg = NULL;
- PyThreadState* cur_tstate = NULL;
- int result = 0;
-
- if (!PyString_Check(arg)) {
- PyErr_SetString(PyExc_TypeError, "argument must be a string");
- return NULL;
- }
-
- str_arg = PyString_AsString(arg);
- if (!str_arg)
- return NULL;
-
- cur_tstate = PyThreadState_Swap(sandbox_self->tstate);
-
- result = PyRun_SimpleString(str_arg);
- if (result < 0) {
- PyErr_Clear();
- }
-
- PyThreadState_Swap(cur_tstate);
-
- if (result < 0) {
- PyErr_SetString(PyExc_SandboxError,
- "exception during execution in sandbox");
- return NULL;
- }
-
- Py_RETURN_NONE;
-}
-
-static PyMethodDef sandbox_methods[] = {
- {"run", sandbox_run, METH_O,
- "Run the passed-in string in the sandboxed interpreter"},
- {NULL}
-};
-
-
-static PyObject *
-sandbox_run_fxn(PyObject *self, PyObject *arg)
-{
- PyThreadState *sandbox_tstate = NULL;
- PyThreadState *cur_tstate = NULL;
- int result = 0;
- const char *arg_str = NULL;
-
- if (!PyString_Check(arg)) {
- PyErr_SetString(PyExc_TypeError, "argument must be a string");
- return NULL;
- }
-
- arg_str = PyString_AsString(arg);
- if (!arg_str)
- return NULL;
-
- cur_tstate = PyThreadState_GET();
-
- sandbox_tstate = Py_NewInterpreter();
- if (sandbox_tstate == NULL) {
- PyErr_SetString(PyExc_SandboxError,
- "could not instantiate a new sandboxed interpreter");
- return NULL;
- }
-
- result = PyRun_SimpleString(arg_str);
-
- Py_EndInterpreter(sandbox_tstate);
- PyEval_RestoreThread(cur_tstate);
-
- if (result < 0) {
- PyErr_SetString(PyExc_SandboxError,
- "exception raised in sandboxed interpreter");
- return NULL;
- }
-
- Py_RETURN_NONE;
-}
-
-static PyMethodDef sandbox_fxns[] = {
- {"run", sandbox_run_fxn, METH_O,
- "Run the passed-in string in a new sandboxed interpreter"},
- {NULL}
-};
-
-
-PyDoc_STRVAR(sandbox_type_doc,
-"XXX\n\
-\n\
-XXX");
-
-PyTypeObject PySandbox_Type = {
- PyObject_HEAD_INIT(NULL)
- 0, /* ob_size */
- "sandbox.Sandbox", /* tp_name */
- sizeof(PySandboxObject), /* tp_basicsize */
- 0, /* tp_itemsize */
- sandbox_dealloc, /* tp_dealloc */
- 0, /* tp_print */
- 0, /* tp_getattr */
- 0, /* tp_setattr */
- 0, /* tp_compare */
- 0, /* tp_repr */
- 0, /* tp_as_number */
- 0, /* tp_as_sequence */
- 0, /* tp_as_mapping */
- 0, /* tp_hash */
- 0, /* tp_call */
- 0, /* tp_str */
- 0, /* tp_getattro */
- 0, /* tp_setattro */
- 0, /* tp_as_buffer */
- Py_TPFLAGS_DEFAULT |
- Py_TPFLAGS_BASETYPE, /* tp_flags */
- sandbox_type_doc, /* tp_doc */
- 0, /* tp_traverse */
- 0, /* tp_clear */
- 0, /* tp_richcompare */
- 0, /* tp_weaklistoffset */
- 0, /* tp_iter */
- 0, /* tp_iternext */
- sandbox_methods, /* tp_methods */
- 0, /* tp_members */
- 0, /* tp_getset */
- 0, /* tp_base */
- 0, /* tp_dict */
- 0, /* tp_descr_get */
- 0, /* tp_descr_set */
- 0, /* tp_dictoffset */
- 0, /* tp_init */
- 0, /* tp_alloc */
- sandbox_new, /* tp_new */
- 0, /* tp_free */
- 0, /* tp_is_gc */
-};
-
-
-PyMODINIT_FUNC
-initsandbox(void)
-{
- PyObject *module;
-
- module = Py_InitModule3("sandbox", sandbox_fxns,
- "Provide a sandbox to safely execute Python code in.");
- if (module == NULL)
- return;
-
- Py_INCREF(&PySandbox_Type);
- if (PyType_Ready(&PySandbox_Type) < 0)
- return;
-
- if (PyModule_AddObject(module, "Sandbox", (PyObject *)&PySandbox_Type) < 0)
- return;
-}
-
-#ifdef __cplusplus
-}
-#endif
Modified: python/branches/bcannon-sandboxing/Objects/exceptions.c
==============================================================================
--- python/branches/bcannon-sandboxing/Objects/exceptions.c (original)
+++ python/branches/bcannon-sandboxing/Objects/exceptions.c Thu Jul 20 21:42:38 2006
@@ -944,12 +944,6 @@
"Unspecified run-time error.");
/*
- SandboxError extends StandardError
-*/
-SimpleExtendsException(PyExc_StandardError, SandboxError,
- "Attempt to exceed privileges under sandboxing.");
-
-/*
* NotImplementedError extends RuntimeError
*/
SimpleExtendsException(PyExc_RuntimeError, NotImplementedError,
@@ -2020,7 +2014,6 @@
#endif
PRE_INIT(EOFError)
PRE_INIT(RuntimeError)
- PRE_INIT(SandboxError)
PRE_INIT(NotImplementedError)
PRE_INIT(NameError)
PRE_INIT(UnboundLocalError)
@@ -2086,7 +2079,6 @@
#endif
POST_INIT(EOFError)
POST_INIT(RuntimeError)
- POST_INIT(SandboxError)
POST_INIT(NotImplementedError)
POST_INIT(NameError)
POST_INIT(UnboundLocalError)
Modified: python/branches/bcannon-sandboxing/Objects/object.c
==============================================================================
--- python/branches/bcannon-sandboxing/Objects/object.c (original)
+++ python/branches/bcannon-sandboxing/Objects/object.c Thu Jul 20 21:42:38 2006
@@ -234,6 +234,13 @@
_PyObject_New(PyTypeObject *tp)
{
PyObject *op;
+#ifdef Py_MEMORY_CAP
+ PyInterpreterState *interp = PyInterpreterState_SafeGet();
+ if (interp) {
+ if (!PyInterpreterState_RaiseMemoryUsage(interp, _PyObject_SIZE(tp)))
+ return NULL;
+ }
+#endif
op = (PyObject *) PyObject_MALLOC(_PyObject_SIZE(tp));
if (op == NULL)
return PyErr_NoMemory();
@@ -245,19 +252,47 @@
{
PyVarObject *op;
const size_t size = _PyObject_VAR_SIZE(tp, nitems);
+#ifdef Py_MEMORY_CAP
+ PyInterpreterState *interp = PyInterpreterState_SafeGet();
+ if (interp) {
+ if (!PyInterpreterState_RaiseMemoryUsage(interp, size))
+ return NULL;
+ }
+#endif
op = (PyVarObject *) PyObject_MALLOC(size);
if (op == NULL)
return (PyVarObject *)PyErr_NoMemory();
return PyObject_INIT_VAR(op, tp, nitems);
}
-/* for binary compatibility with 2.2 */
-#undef _PyObject_Del
+#ifdef Py_MEMORY_CAP
+void
+_PyObject_Del(void *op)
+{
+ PyObject *obj = (PyObject *)op;
+ size_t to_free = obj->ob_type->tp_basicsize;
+ PyInterpreterState *interp = PyInterpreterState_SafeGet();
+
+ if (obj->ob_type->tp_itemsize) {
+ Py_ssize_t obj_size = ((PyVarObject *)obj)->ob_size;
+
+ if (obj_size > 0)
+ to_free += obj_size * obj->ob_type->tp_itemsize;
+ }
+
+ if (interp)
+ PyInterpreterState_LowerMemoryUsage(interp, to_free);
+
+ PyObject_Free(op);
+}
+#else /* !Py_MEMORY_CAP */
+/* for binary compatibility with 2.2 and sandboxing. */
void
_PyObject_Del(PyObject *op)
{
PyObject_FREE(op);
}
+#endif /* Py_MEMORY_CAP */
/* Implementation of PyObject_Print with recursion checking */
static int
@@ -2019,18 +2054,41 @@
void *
PyMem_Malloc(size_t nbytes)
{
+#ifdef Py_MEMORY_CAP
+ PyInterpreterState *interp = PyInterpreterState_SafeGet();
+
+ if (interp) {
+ if (!PyInterpreterState_RaiseMemoryUsage(interp, nbytes))
+ return NULL;
+ }
+#endif
return PyMem_MALLOC(nbytes);
}
void *
PyMem_Realloc(void *p, size_t nbytes)
{
+#ifdef Py_MEMORY_CAP
+ size_t mem_diff = (p ? nbytes - sizeof(p) : nbytes);
+ PyInterpreterState *interp = PyInterpreterState_SafeGet();
+
+ if (interp) {
+ if (!PyInterpreterState_RaiseMemoryUsage(interp, mem_diff))
+ return NULL;
+ }
+#endif
return PyMem_REALLOC(p, nbytes);
}
void
PyMem_Free(void *p)
{
+#ifdef Py_MEMORY_CAP
+ PyInterpreterState *interp = PyInterpreterState_SafeGet();
+
+ if (interp)
+ PyInterpreterState_LowerMemoryUsage(interp, sizeof(p));
+#endif
PyMem_FREE(p);
}
Modified: python/branches/bcannon-sandboxing/Python/pystate.c
==============================================================================
--- python/branches/bcannon-sandboxing/Python/pystate.c (original)
+++ python/branches/bcannon-sandboxing/Python/pystate.c Thu Jul 20 21:42:38 2006
@@ -80,6 +80,10 @@
#ifdef WITH_TSC
interp->tscdump = 0;
#endif
+#ifdef Py_MEMORY_CAP
+ interp->mem_cap = 0;
+ interp->mem_usage = 0;
+#endif
HEAD_LOCK();
interp->next = interp_head;
@@ -140,6 +144,75 @@
free(interp);
}
+#ifdef Py_MEMORY_CAP
+/*
+ Get the interpreter state from a PyThreadState after checking to make sure
+ it is safe to do so based on initialization of the interpreter.
+*/
+PyInterpreterState *
+PyInterpreterState_SafeGet(void)
+{
+ PyThreadState *tstate = NULL;
+
+ if (!Py_IsInitialized() || !PyEval_ThreadsInitialized())
+ return NULL;
+
+ tstate = PyThreadState_GET();
+ if (!tstate)
+ return NULL;
+
+ return tstate->interp;
+}
+
+/*
+ Raise the current allocation of memory on the interpreter by 'increase'.
+ If it the allocation pushes the total memory usage past the memory cap,
+ return a false value.
+*/
+int
+PyInterpreterState_RaiseMemoryUsage(PyInterpreterState *interp, size_t increase)
+{
+ size_t original_mem_usage = 0;
+
+ if (increase < 0)
+ Py_FatalError("can only increase memory usage by a positive value");
+
+ if (!interp->mem_cap)
+ return 1;
+
+ /* Watch out for integer overflow. */
+ original_mem_usage = interp->mem_usage;
+ interp->mem_usage += increase;
+ if (interp->mem_usage < original_mem_usage) {
+ interp->mem_usage = original_mem_usage;
+ PyErr_SetString(PyExc_MemoryError, "integer overflow in memory usage");
+ return 0;
+ }
+
+ if (interp->mem_usage > interp->mem_cap) {
+ interp->mem_usage = original_mem_usage;
+ PyErr_SetString(PyExc_MemoryError, "exceeded memory usage");
+ return 0;
+ }
+
+ return 1;
+}
+
+/*
+ Lower the current memory allocation.
+ If lowered to below zero, push back up to zero.
+*/
+void
+PyInterpreterState_LowerMemoryUsage(PyInterpreterState *interp, size_t decrease)
+{
+ if (decrease < 0)
+ Py_FatalError("must specify memory usage reduction by a positive number");
+
+ interp->mem_usage -= decrease;
+ if (interp->mem_usage < 0)
+ interp->mem_usage = 0;
+}
+#endif /* Py_MEMORY_CAP */
/* Default implementation for _PyThreadState_GetFrame */
static struct _frame *
Modified: python/branches/bcannon-sandboxing/Python/pythonrun.c
==============================================================================
--- python/branches/bcannon-sandboxing/Python/pythonrun.c (original)
+++ python/branches/bcannon-sandboxing/Python/pythonrun.c Thu Jul 20 21:42:38 2006
@@ -512,13 +512,6 @@
if (interp == NULL)
return NULL;
-#ifdef PySandbox_SUPPORTED
- /* Must set sandbox_state to NULL to flag that the interpreter is
- unprotected. It if is to be protected, the field is set by
- PySandbox_NewInterpreter(). */
- interp->sandbox_state = NULL;
-#endif
-
tstate = PyThreadState_New(interp);
if (tstate == NULL) {
PyInterpreterState_Delete(interp);
Deleted: /python/branches/bcannon-sandboxing/Python/sandbox.c
==============================================================================
--- /python/branches/bcannon-sandboxing/Python/sandbox.c Thu Jul 20 21:42:38 2006
+++ (empty file)
@@ -1,79 +0,0 @@
-#include "Python.h" /* Must be defined before PySandbox_SUPPORTED check */
-
-#ifdef PySandbox_SUPPORTED
-
-#ifdef __cplusplus
-extern "C" {
-#endif
-
-/*
- Set the memory cap for a sandboxed interpreter.
-*/
-int
-PySandbox_SetMemoryCap(PyThreadState *s_tstate, size_t mem_cap)
-{
- PySandboxState *sandbox_state = s_tstate->interp->sandbox_state;
-
- if (!sandbox_state)
- return 0;
-
- sandbox_state->mem_cap = mem_cap;
-
- return 1;
-}
-
-/*
- Verify that memory allocation is allowed.
-*/
-int
-_PySandbox_AllowedMemoryAlloc(size_t allocate)
-{
- PySandboxState *sandbox_state = NULL;
-
- /* If can't track yet, just assume it worked. */
- if (!(Py_IsInitialized() && PyEval_ThreadsInitialized()))
- return 1;
-
- sandbox_state = _PySandbox_GET();
-
- if (_PySandbox_Check() && _PySandbox_IsMemCapped()) {
- size_t orig_mem_usage = sandbox_state->mem_usage;
-
- sandbox_state->mem_usage += allocate;
- /* Watch out for integer overflow. */
- if ((sandbox_state->mem_cap < sandbox_state->mem_usage) ||
- (orig_mem_usage > sandbox_state->mem_usage)) {
- sandbox_state -= allocate;
- PyErr_SetString(PyExc_SandboxError, "memory allocation exceeded");
- return 0;
- }
- }
-
- return 1;
-}
-
-/*
- Verify that freeing memory does not go past what was already used.
-*/
-void
-PySandbox_AllowedMemoryFree(size_t deallocate)
-{
- PySandboxState *sandbox_state = NULL;
-
- /* If interpreter not up yet, then don't worry about memory. */
- if (!(Py_IsInitialized() && PyEval_ThreadsInitialized()))
- return;
-
- sandbox_state = _PySandbox_GET();
- if (_PySandbox_Check() && _PySandbox_IsMemCapped()) {
- sandbox_state->mem_usage -= deallocate;
- if (sandbox_state->mem_usage < 0)
- sandbox_state->mem_usage = 0;
- }
-}
-
-#ifdef __cplusplus
-}
-#endif
-
-#endif /* PySandbox_SUPPORTED */
Modified: python/branches/bcannon-sandboxing/configure
==============================================================================
--- python/branches/bcannon-sandboxing/configure (original)
+++ python/branches/bcannon-sandboxing/configure Thu Jul 20 21:42:38 2006
@@ -1,5 +1,5 @@
#! /bin/sh
-# From configure.in Revision: 47023 .
+# From configure.in Revision: 50540 .
# Guess values for system-dependent variables and create Makefiles.
# Generated by GNU Autoconf 2.59 for python 2.5.
#
@@ -866,7 +866,7 @@
compiler
--with-suffix=.exe set executable suffix
--with-pydebug build with Py_DEBUG defined
- --with-sandboxing build with PySandbox_SUPPORTED defined
+ --with-memory-cap build with Py_MEMORY_CAP defined
--with-libs='lib1 ...' link against additional libs
--with-system-ffi build _ctypes module using an installed ffi library
--with-signal-module disable/enable signal module
@@ -3765,26 +3765,26 @@
echo "${ECHO_T}no" >&6
fi;
-# Check for --with-sandboxing
-echo "$as_me:$LINENO: checking for --with-sandboxing" >&5
-echo $ECHO_N "checking for --with-sandboxing... $ECHO_C" >&6
-
-# Check whether --with-sandboxing or --without-sandboxing was given.
-if test "${with_sandboxing+set}" = set; then
- withval="$with_sandboxing"
+# Check for --with-memory-cap
+echo "$as_me:$LINENO: checking for --with-memory-cap" >&5
+echo $ECHO_N "checking for --with-memory-cap... $ECHO_C" >&6
+
+# Check whether --with-memory-cap or --without-memory-cap was given.
+if test "${with_memory_cap+set}" = set; then
+ withval="$with_memory_cap"
if test "$withval" != no
then
cat >>confdefs.h <<\_ACEOF
-#define PySandbox_SUPPORTED 1
+#define Py_MEMORY_CAP 1
_ACEOF
echo "$as_me:$LINENO: result: yes" >&5
echo "${ECHO_T}yes" >&6;
- PySandbox_SUPPORTED='true'
+ Py_MEMORY_CAP='true'
else echo "$as_me:$LINENO: result: no" >&5
-echo "${ECHO_T}no" >&6; PySandbox_SUPPORTED='false'
+echo "${ECHO_T}no" >&6; Py_MEMORY_CAP='false'
fi
else
echo "$as_me:$LINENO: result: no" >&5
Modified: python/branches/bcannon-sandboxing/configure.in
==============================================================================
--- python/branches/bcannon-sandboxing/configure.in (original)
+++ python/branches/bcannon-sandboxing/configure.in Thu Jul 20 21:42:38 2006
@@ -725,18 +725,18 @@
fi],
[AC_MSG_RESULT(no)])
-# Check for --with-sandboxing
-AC_MSG_CHECKING(for --with-sandboxing)
-AC_ARG_WITH(sandboxing,
- AC_HELP_STRING(--with-sandboxing, build with PySandbox_SUPPORTED defined),
+# Check for --with-memory-cap
+AC_MSG_CHECKING(for --with-memory-cap)
+AC_ARG_WITH(memory-cap,
+ AC_HELP_STRING(--with-memory-cap, build with Py_MEMORY_CAP defined),
[
if test "$withval" != no
then
- AC_DEFINE(PySandbox_SUPPORTED, 1,
- [Define if you want to build an interpreter with sandboxing support.])
+ AC_DEFINE(Py_MEMORY_CAP, 1,
+ [Define if you want to build an interpreter that can cap memory usage.])
AC_MSG_RESULT(yes);
- PySandbox_SUPPORTED='true'
-else AC_MSG_RESULT(no); PySandbox_SUPPORTED='false'
+ Py_MEMORY_CAP='true'
+else AC_MSG_RESULT(no); Py_MEMORY_CAP='false'
fi],
[AC_MSG_RESULT(no)])
Modified: python/branches/bcannon-sandboxing/pyconfig.h.in
==============================================================================
--- python/branches/bcannon-sandboxing/pyconfig.h.in (original)
+++ python/branches/bcannon-sandboxing/pyconfig.h.in Thu Jul 20 21:42:38 2006
@@ -766,15 +766,15 @@
/* Define as the integral type used for Unicode representation. */
#undef PY_UNICODE_TYPE
-/* Define if you want to build an interpreter with sandboxing support. */
-#undef PySandbox_SUPPORTED
-
/* Define if you want to build an interpreter with many run-time checks. */
#undef Py_DEBUG
/* Defined if Python is built as a shared library. */
#undef Py_ENABLE_SHARED
+/* Define if you want to build an interpreter that can cap memory usage. */
+#undef Py_MEMORY_CAP
+
/* Define as the size of the unicode type. */
#undef Py_UNICODE_SIZE
Modified: python/branches/bcannon-sandboxing/setup.py
==============================================================================
--- python/branches/bcannon-sandboxing/setup.py (original)
+++ python/branches/bcannon-sandboxing/setup.py Thu Jul 20 21:42:38 2006
@@ -515,10 +515,6 @@
# CSV files
exts.append( Extension('_csv', ['_csv.c']) )
- # Sandboxing
- if config_h_vars.get("PySandbox_SUPPORTED", False):
- exts.append(Extension('sandbox', ['sandboxmodule.c']) )
-
# socket(2)
exts.append( Extension('_socket', ['socketmodule.c'],
depends = ['socketmodule.h']) )
From neal at metaslash.com Thu Jul 20 22:01:14 2006
From: neal at metaslash.com (Neal Norwitz)
Date: Thu, 20 Jul 2006 16:01:14 -0400
Subject: [Python-checkins] Python Regression Test Failures doc (1)
Message-ID: <20060720200114.GA2248@python.psfb.org>
TEXINPUTS=/home/neal/python/trunk/Doc/commontex: python /home/neal/python/trunk/Doc/tools/mkhowto --html --about html/stdabout.dat --iconserver ../icons --favicon ../icons/pyfav.png --address "See About this document... for information on suggesting changes." --up-link ../index.html --up-title "Python Documentation Index" --global-module-index "../modindex.html" --dvips-safe --dir html/lib lib/lib.tex
*** Session transcript and error messages are in /home/neal/python/trunk/Doc/html/lib/lib.how.
*** Exited with status 1.
The relevant lines from the transcript are:
------------------------------------------------------------------------
+++ latex lib
This is TeX, Version 3.14159 (Web2C 7.4.5)
(/home/neal/python/trunk/Doc/lib/lib.tex
LaTeX2e <2001/06/01>
Babel and hyphenation patterns for american, french, german, ngerman, n
ohyphenation, loaded.
(/home/neal/python/trunk/Doc/texinputs/manual.cls
Document Class: manual 1998/03/03 Document class (Python manual)
(/home/neal/python/trunk/Doc/texinputs/pypaper.sty
(/usr/share/texmf/tex/latex/psnfss/times.sty)
Using Times instead of Computer Modern.
) (/usr/share/texmf/tex/latex/misc/fancybox.sty
Style option: `fancybox' v1.3 <2000/09/19> (tvz)
) (/usr/share/texmf/tex/latex/base/report.cls
Document Class: report 2001/04/21 v1.4e Standard LaTeX document class
(/usr/share/texmf/tex/latex/base/size10.clo))
(/home/neal/python/trunk/Doc/texinputs/fancyhdr.sty)
Using fancier footers than usual.
(/home/neal/python/trunk/Doc/texinputs/fncychap.sty)
Using fancy chapter headings.
(/home/neal/python/trunk/Doc/texinputs/python.sty
(/usr/share/texmf/tex/latex/tools/longtable.sty)
(/home/neal/python/trunk/Doc/texinputs/underscore.sty)
(/usr/share/texmf/tex/latex/tools/verbatim.sty)
(/usr/share/texmf/tex/latex/base/alltt.sty)))
(/home/neal/python/trunk/Doc/commontex/boilerplate.tex
(/home/neal/python/trunk/Doc/commontex/patchlevel.tex))
Writing index file lib.idx
No file lib.aux.
(/usr/share/texmf/tex/latex/psnfss/ot1ptm.fd)
(/usr/share/texmf/tex/latex/psnfss/ot1phv.fd) [1]
(/home/neal/python/trunk/Doc/commontex/copyright.tex
(/usr/share/texmf/tex/latex/psnfss/omsptm.fd)) [2]
Adding blank page after the abstract.
[1] [2]
No file lib.toc.
Adding blank page after the table of contents.
[1] [2] (/home/neal/python/trunk/Doc/lib/libintro.tex
Chapter 1.
(/usr/share/texmf/tex/latex/psnfss/ot1pcr.fd)
LaTeX Warning: Reference `builtin' on page 1 undefined on input line 49.
) (/home/neal/python/trunk/Doc/lib/libobjs.tex [1] [2]
Chapter 2.
) (/home/neal/python/trunk/Doc/lib/libfuncs.tex [3] [4] [5] [6]
LaTeX Warning: Reference `bltin-file-objects' on page 7 undefined on input line
405.
[7]
Underfull \hbox (badness 10000) in paragraph at lines 476--480
[]\OT1/ptm/m/n/10 Note that \OT1/pcr/m/n/10 filter(function, \OT1/ptm/m/it/10 l
ist\OT1/pcr/m/n/10 ) \OT1/ptm/m/n/10 is equiv-a-lent to \OT1/pcr/m/n/10 [item f
or item in \OT1/ptm/m/it/10 list \OT1/pcr/m/n/10 if
[8] [9] [10] [11] [12] [13]
LaTeX Warning: Reference `typesseq-mutable' on page 14 undefined on input line
1014.
[14] [15] [16]) (/home/neal/python/trunk/Doc/lib/libstdtypes.tex [17] [18]
LaTeX Warning: Reference `built-in-funcs' on page 19 undefined on input line 29
5.
[19] [20] [21] [22]
LaTeX Warning: Reference `codec-base-classes' on page 23 undefined on input lin
e 600.
LaTeX Warning: Reference `codec-base-classes' on page 23 undefined on input lin
e 613.
LaTeX Warning: Reference `standard-encodings' on page 23 undefined on input lin
e 614.
[23] [24] [25] [26]
Overfull \hbox (8.48134pt too wide) in paragraph at lines 974--984
[]
Overfull \hbox (141.78873pt too wide) in paragraph at lines 991--1016
[]
[27]
Overfull \hbox (44.61931pt too wide) in paragraph at lines 1100--1132
[]
[28] [29] [30]
Overfull \hbox (98.60141pt too wide) in paragraph at lines 1312--1340
[]
[31] [32]
LaTeX Warning: Reference `built-in-funcs' on page 33 undefined on input line 15
27.
LaTeX Warning: Reference `context-closing' on page 33 undefined on input line 1
579.
[33] [34] [35]
Underfull \hbox (badness 10000) in paragraph at lines 1828--1835
[]\OT1/ptm/m/n/10 An ex-am-ple of a con-text man-ager that re-turns a re-lated
ob-ject is the one re-turned by
[36] [37] [38]) (/home/neal/python/trunk/Doc/lib/libexcs.tex [39]
Underfull \hbox (badness 10000) in paragraph at lines 94--98
\OT1/ptm/m/n/10 The base class for all built-in ex-cep-tions ex-cept \OT1/pcr/m
/n/10 StopIteration\OT1/ptm/m/n/10 , \OT1/pcr/m/n/10 GeneratorExit\OT1/ptm/m/n/
10 ,
[40] [41] [42]
No file ../../Lib/test/exception_hierarchy.txt.
) (/home/neal/python/trunk/Doc/lib/libconsts.tex [43])
(/home/neal/python/trunk/Doc/lib/libstrings.tex [44]
Chapter 3.
LaTeX Warning: Reference `string-methods' on page 45 undefined on input line 10
.
) (/home/neal/python/trunk/Doc/lib/libstring.tex [45] [46] [47]
LaTeX Warning: Reference `string-methods' on page 48 undefined on input line 23
7.
[48] [49]) (/home/neal/python/trunk/Doc/lib/libre.tex [50] [51] [52] [53]
[54] [55] [56] [57] [58] [59]) (/home/neal/python/trunk/Doc/lib/libstruct.tex
[60] [61]
LaTeX Warning: Reference `module-array' on page 62 undefined on input line 195.
LaTeX Warning: Reference `module-xdrlib' on page 62 undefined on input line 196
.
) (/home/neal/python/trunk/Doc/lib/libdifflib.tex [62] [63] [64] [65] [66]
[67] [68] [69]) (/home/neal/python/trunk/Doc/lib/libstringio.tex [70]
LaTeX Warning: Reference `bltin-file-objects' on page 71 undefined on input lin
e 11.
[71]) (/home/neal/python/trunk/Doc/lib/libtextwrap.tex
(/usr/share/texmf/tex/latex/psnfss/omspcr.fd) [72] [73])
(/home/neal/python/trunk/Doc/lib/libcodecs.tex
Underfull \hbox (badness 5161) in paragraph at lines 52--56
[]\OT1/ptm/m/n/10 The fac-tory func-tions must re-turn ob-jects pro-vid-ing the
in-ter-faces de-fined by the base classes
[74] [75] [76]
Overfull \hbox (405.07822pt too wide) in paragraph at lines 288--300
[]
[77] [78] [79] [80] [81] [82]
Overfull \hbox (11.18082pt too wide) in alignment at lines 820--901
[] [] []
Overfull \hbox (11.18082pt too wide) in alignment at lines 901--981
[] [] []
Overfull \hbox (254.7505pt too wide) in alignment at lines 981--1062
[] [] []
[83]
Overfull \hbox (254.7505pt too wide) in alignment at lines 1062--1142
[] [] []
Overfull \hbox (254.7505pt too wide) in alignment at lines 1142--1167
[] [] []
Package longtable Warning: Column widths have changed
(longtable) in table 3.1 on input line 1167.
[84]
Overfull \hbox (465.03658pt too wide) in paragraph at lines 1179--1270
[]
[85]) (/home/neal/python/trunk/Doc/lib/libunicodedata.tex [86])
(/home/neal/python/trunk/Doc/lib/libstringprep.tex [87] [88])
(/home/neal/python/trunk/Doc/lib/libfpformat.tex)
Underfull \hbox (badness 10000) in paragraph at lines 50--98
(/home/neal/python/trunk/Doc/lib/datatypes.tex [89] [90]
Chapter 4.
) (/home/neal/python/trunk/Doc/lib/libdatetime.tex
LaTeX Warning: Reference `module-calendar' on page 91 undefined on input line 5
6.
LaTeX Warning: Reference `module-time' on page 91 undefined on input line 57.
[91] [92]
Underfull \hbox (badness 10000) in paragraph at lines 185--187
\OT1/ptm/m/n/10 The small-est pos-si-ble dif-fer-ence be-tween non-equal \OT1/p
cr/m/n/10 timedelta \OT1/ptm/m/n/10 ob-jects,
(/usr/share/texmf/tex/latex/psnfss/omlptm.fd)
Overfull \hbox (44.65097pt too wide) in paragraph at lines 204--235
[]
[93] [94]
Underfull \hbox (badness 10000) in paragraph at lines 430--439
\OT1/ptm/m/it/10 d\OT1/pcr/m/n/10 .month, \OT1/ptm/m/it/10 d\OT1/pcr/m/n/10 .da
y, 0, 0, 0, \OT1/ptm/m/it/10 d\OT1/pcr/m/n/10 .weekday(), \OT1/ptm/m/it/10 d\OT
1/pcr/m/n/10 .toordinal() - date(\OT1/ptm/m/it/10 d\OT1/pcr/m/n/10 .year, 1,
[95]
Underfull \hbox (badness 10000) in paragraph at lines 464--467
[]\OT1/ptm/m/n/10 The ISO cal-en-dar is a widely used vari-ant of the Gre-go-ri
an cal-en-dar. See
LaTeX Warning: Reference `strftime-behavior' on page 96 undefined on input line
507.
[96]
Underfull \hbox (badness 10000) in paragraph at lines 547--551
\OT1/ptm/m/n/10 Return the cur-rent lo-cal date-time, with \OT1/pcr/m/n/10 tzin
fo None\OT1/ptm/m/n/10 . This is equiv-a-lent to
Underfull \hbox (badness 10000) in paragraph at lines 561--566
[]\OT1/ptm/m/n/10 Else \OT1/ptm/m/it/10 tz \OT1/ptm/m/n/10 must be an in-stance
of a class \OT1/pcr/m/n/10 tzinfo \OT1/ptm/m/n/10 sub-class, and the cur-rent
date
Underfull \hbox (badness 10000) in paragraph at lines 561--566
\OT1/ptm/m/n/10 and time are con-verted to \OT1/ptm/m/it/10 tz\OT1/ptm/m/n/10 '
s time zone. In this case the re-sult is equiv-a-lent to
Underfull \hbox (badness 10000) in paragraph at lines 582--586
[]\OT1/ptm/m/n/10 Else \OT1/ptm/m/it/10 tz \OT1/ptm/m/n/10 must be an in-stance
of a class \OT1/pcr/m/n/10 tzinfo \OT1/ptm/m/n/10 sub-class, and the times-
Underfull \hbox (badness 10000) in paragraph at lines 582--586
\OT1/ptm/m/n/10 tamp is con-verted to \OT1/ptm/m/it/10 tz\OT1/ptm/m/n/10 's tim
e zone. In this case the re-sult is equiv-a-lent to
[97]
Underfull \hbox (badness 5519) in paragraph at lines 646--648
\OT1/ptm/m/n/10 The lat-est rep-re-sentable \OT1/pcr/m/n/10 datetime\OT1/ptm/m/
n/10 , \OT1/pcr/m/n/10 datetime(MAXYEAR, 12, 31, 23, 59, 59, 999999,
[98] [99]
Underfull \hbox (badness 10000) in paragraph at lines 874--888
\OT1/ptm/m/it/10 d\OT1/pcr/m/n/10 .weekday(), \OT1/ptm/m/it/10 d\OT1/pcr/m/n/10
.toordinal() - date(\OT1/ptm/m/it/10 d\OT1/pcr/m/n/10 .year, 1, 1).toordinal()
+ 1, dst)) \OT1/ptm/m/n/10 The
Underfull \hbox (badness 10000) in paragraph at lines 923--925
\OT1/ptm/m/n/10 Return a 3-tuple, (ISO year, ISO week num-ber, ISO week-day). T
he same as
[100]
Underfull \hbox (badness 5064) in paragraph at lines 960--969
\OT1/ptm/m/n/10 Return a string rep-re-sent-ing the date and time, for ex-am-pl
e \OT1/pcr/m/n/10 datetime(2002, 12, 4, 20,
Underfull \hbox (badness 10000) in paragraph at lines 960--969
\OT1/pcr/m/n/10 30, 40).ctime() == 'Wed Dec 4 20:30:40 2002'\OT1/ptm/m/n/10 . \
OT1/pcr/m/n/10 d.ctime() \OT1/ptm/m/n/10 is equiv-a-lent to
LaTeX Warning: Reference `strftime-behavior' on page 101 undefined on input lin
e 973.
[101]
LaTeX Warning: Reference `strftime-behavior' on page 102 undefined on input lin
e 1103.
[102] [103] [104] [105] [106] [107])
(/home/neal/python/trunk/Doc/lib/libcalendar.tex [108] [109] [110] [111]
LaTeX Warning: Reference `module-datetime' on page 112 undefined on input line
302.
LaTeX Warning: Reference `module-time' on page 112 undefined on input line 303.
) (/home/neal/python/trunk/Doc/lib/libcollections.tex [112] [113] [114]
[115] [116]
Overfull \hbox (6.89723pt too wide) in paragraph at lines 337--337
[]\OT1/pcr/m/n/9 >>> s = [('red', 1), ('blue', 2), ('red', 3), ('blue', 4), ('r
ed', 1), ('blue', 4)][]
) (/home/neal/python/trunk/Doc/lib/libheapq.tex [117] [118] [119])
(/home/neal/python/trunk/Doc/lib/libbisect.tex)
(/home/neal/python/trunk/Doc/lib/libarray.tex [120] [121] [122]
LaTeX Warning: Reference `module-struct' on page 123 undefined on input line 23
0.
LaTeX Warning: Reference `module-xdrlib' on page 123 undefined on input line 23
3.
[123]) (/home/neal/python/trunk/Doc/lib/libsets.tex
LaTeX Warning: Reference `immutable-transforms' on page 124 undefined on input
line 49.
LaTeX Warning: Reference `immutable-transforms' on page 124 undefined on input
line 57.
[124]
Overfull \hbox (98.60141pt too wide) in paragraph at lines 135--163
[]
[125] [126]) (/home/neal/python/trunk/Doc/lib/libsched.tex [127])
(/home/neal/python/trunk/Doc/lib/libmutex.tex [128])
(/home/neal/python/trunk/Doc/lib/libqueue.tex [129])
(/home/neal/python/trunk/Doc/lib/libweakref.tex [130]
LaTeX Warning: Reference `weakref-extension' on page 131 undefined on input lin
e 69.
[131] [132] [133] [134]) (/home/neal/python/trunk/Doc/lib/libuserdict.tex
[135]
LaTeX Warning: Reference `typesmapping' on page 136 undefined on input line 40.
LaTeX Warning: Reference `typesseq' on page 136 undefined on input line 97.
[136]
LaTeX Warning: Reference `string-methods' on page 137 undefined on input line 1
74.
) (/home/neal/python/trunk/Doc/lib/libtypes.tex [137] [138] [139])
(/home/neal/python/trunk/Doc/lib/libnew.tex)
(/home/neal/python/trunk/Doc/lib/libcopy.tex [140]
LaTeX Warning: Reference `module-pickle' on page 141 undefined on input line 96
.
) (/home/neal/python/trunk/Doc/lib/libpprint.tex [141] [142] [143])
(/home/neal/python/trunk/Doc/lib/librepr.tex [144])
Underfull \hbox (badness 10000) in paragraph at lines 120--121
(/home/neal/python/trunk/Doc/lib/numeric.tex [145] [146]
Chapter 5.
) (/home/neal/python/trunk/Doc/lib/libmath.tex [147] [148]
LaTeX Warning: Reference `module-cmath' on page 149 undefined on input line 208
.
) (/home/neal/python/trunk/Doc/lib/libcmath.tex [149])
(/home/neal/python/trunk/Doc/lib/libdecimal.tex [150] [151] [152] [153]
[154]
Overfull \hbox (21.09727pt too wide) in paragraph at lines 305--305
[] \OT1/pcr/m/n/9 digit ::= '0' | '1' | '2' | '3' | '4' | '5' | '6
' | '7' | '8' | '9'[]
[155] [156] [157] [158] [159] [160] [161] [162] [163]
Overfull \vbox (959.57pt too high) has occurred while \output is active
[164] [165] [166]) (/home/neal/python/trunk/Doc/lib/librandom.tex [167]
Underfull \hbox (badness 6575) in paragraph at lines 119--124
\OT1/ptm/m/n/10 Return a ran-domly se-lected el-e-ment from \OT1/pcr/m/n/10 ran
ge(\OT1/ptm/m/it/10 start\OT1/pcr/m/n/10 , \OT1/ptm/m/it/10 stop\OT1/pcr/m/n/10
, \OT1/ptm/m/it/10 step\OT1/pcr/m/n/10 )\OT1/ptm/m/n/10 . This is equiv-a-lent
to
[168] [169]) (/home/neal/python/trunk/Doc/lib/libitertools.tex [170] [171]
[172] [173] [174] [175] [176])
(/home/neal/python/trunk/Doc/lib/libfunctools.tex [177]
Overfull \vbox (255.57pt too high) has occurred while \output is active
[178] [179]) (/home/neal/python/trunk/Doc/lib/liboperator.tex [180] [181]
[182] [183] [184] [185]) (/home/neal/python/trunk/Doc/lib/netdata.tex [186]
Chapter 6.
) (/home/neal/python/trunk/Doc/lib/email.tex
LaTeX Warning: Reference `module-smtplib' on page 187 undefined on input line 5
8.
LaTeX Warning: Reference `module-nntplib' on page 187 undefined on input line 5
9.
(/home/neal/python/trunk/Doc/lib/emailmessage.tex [187] [188] [189] [190]
[191] [192]) (/home/neal/python/trunk/Doc/lib/emailparser.tex [193] [194]
[195]) (/home/neal/python/trunk/Doc/lib/emailgenerator.tex [196])
(/home/neal/python/trunk/Doc/lib/emailmimebase.tex [197] [198])
(/home/neal/python/trunk/Doc/lib/emailheaders.tex [199] [200])
(/home/neal/python/trunk/Doc/lib/emailcharsets.tex [201] [202])
(/home/neal/python/trunk/Doc/lib/emailencoders.tex [203])
(/home/neal/python/trunk/Doc/lib/emailexc.tex [204])
(/home/neal/python/trunk/Doc/lib/emailutil.tex [205] [206])
(/home/neal/python/trunk/Doc/lib/emailiter.tex [207]) [208] [209] [210]
[211] [212] [213] [214]) (/home/neal/python/trunk/Doc/lib/libmailcap.tex
[215]) (/home/neal/python/trunk/Doc/lib/libmailbox.tex [216]
LaTeX Warning: Reference `module-email' on page 217 undefined on input line 18.
[217] [218] [219] [220] [221] [222] [223] [224] [225] [226] [227] [228]
[229] [230] [231] [232] [233]) (/home/neal/python/trunk/Doc/lib/libmhlib.tex
[234] [235]) (/home/neal/python/trunk/Doc/lib/libmimetools.tex [236]
LaTeX Warning: Reference `module-email' on page 237 undefined on input line 61.
LaTeX Warning: Reference `module-rfc822' on page 237 undefined on input line 63
.
LaTeX Warning: Reference `module-multifile' on page 237 undefined on input line
65.
) (/home/neal/python/trunk/Doc/lib/libmimetypes.tex [237] [238])
(/home/neal/python/trunk/Doc/lib/libmimewriter.tex [239])
(/home/neal/python/trunk/Doc/lib/libmimify.tex [240]
LaTeX Warning: Reference `module-quopri' on page 241 undefined on input line 93
.
) (/home/neal/python/trunk/Doc/lib/libmultifile.tex [241]
LaTeX Warning: Reference `module-email' on page 242 undefined on input line 43.
[242]) (/home/neal/python/trunk/Doc/lib/librfc822.tex [243] [244]
LaTeX Warning: Reference `module-email' on page 245 undefined on input line 133
.
LaTeX Warning: Reference `module-mailbox' on page 245 undefined on input line 1
35.
[245]
LaTeX Warning: Reference `module-mimetools' on page 246 undefined on input line
137.
[246]
Underfull \hbox (badness 7379) in paragraph at lines 254--270
[]\OT1/pcr/m/n/10 Message \OT1/ptm/m/n/10 in-stances also sup-port a lim-ited m
ap-ping in-ter-face. In par-tic-u-lar: \OT1/ptm/m/it/10 m\OT1/pcr/m/n/10 [name]
\OT1/ptm/m/n/10 is like
[247]) (/home/neal/python/trunk/Doc/lib/libbase64.tex [248]
LaTeX Warning: Reference `module-binascii' on page 249 undefined on input line
151.
) (/home/neal/python/trunk/Doc/lib/libbinascii.tex [249] [250]
LaTeX Warning: Reference `module-base64' on page 251 undefined on input line 13
9.
LaTeX Warning: Reference `module-binhex' on page 251 undefined on input line 14
1.
LaTeX Warning: Reference `module-uu' on page 251 undefined on input line 143.
LaTeX Warning: Reference `module-quopri' on page 251 undefined on input line 14
5.
) (/home/neal/python/trunk/Doc/lib/libbinhex.tex
LaTeX Warning: Reference `module-binascii' on page 251 undefined on input line
41.
[251]) (/home/neal/python/trunk/Doc/lib/libquopri.tex
LaTeX Warning: Reference `module-mimify' on page 252 undefined on input line 59
.
LaTeX Warning: Reference `module-base64' on page 252 undefined on input line 60
.
) (/home/neal/python/trunk/Doc/lib/libuu.tex [252]
LaTeX Warning: Reference `module-binascii' on page 253 undefined on input line
57.
) (/home/neal/python/trunk/Doc/lib/markup.tex [253] [254]
Chapter 7.
) (/home/neal/python/trunk/Doc/lib/libhtmlparser.tex [255] [256])
(/home/neal/python/trunk/Doc/lib/libsgmllib.tex [257] [258] [259])
(/home/neal/python/trunk/Doc/lib/libhtmllib.tex [260]
LaTeX Warning: Reference `module-formatter' on page 261 undefined on input line
81.
LaTeX Warning: Reference `module-HTMLParser' on page 261 undefined on input lin
e 87.
LaTeX Warning: Reference `module-htmlentitydefs' on page 261 undefined on input
line 89.
LaTeX Warning: Reference `module-sgmllib' on page 261 undefined on input line 9
0.
Underfull \hbox (badness 7168) in paragraph at lines 158--165
\OT1/ptm/m/n/10 This mod-ule de-fines three dic-tio-nar-ies, \OT1/pcr/m/n/10 na
me2codepoint\OT1/ptm/m/n/10 , \OT1/pcr/m/n/10 codepoint2name\OT1/ptm/m/n/10 , a
nd \OT1/pcr/m/n/10 entitydefs\OT1/ptm/m/n/10 .
[261]) (/home/neal/python/trunk/Doc/lib/libpyexpat.tex
LaTeX Warning: Reference `expaterror-objects' on page 262 undefined on input li
ne 36.
[262]
Underfull \hbox (badness 10000) in paragraph at lines 160--167
\OT1/ptm/m/n/10 Calling this with a true value for \OT1/ptm/m/it/10 flag \OT1/p
tm/m/n/10 (the de-fault) will cause Ex-pat to call the
[263] [264] [265] [266] [267] [268] [269])
(/home/neal/python/trunk/Doc/lib/xmldom.tex [270]
LaTeX Warning: Reference `dom-conformance' on page 271 undefined on input line
71.
[271]
LaTeX Warning: Reference `dom-implementation-objects' on page 272 undefined on
input line 181.
LaTeX Warning: Reference `dom-node-objects' on page 272 undefined on input line
183.
LaTeX Warning: Reference `dom-nodelist-objects' on page 272 undefined on input
line 185.
LaTeX Warning: Reference `dom-documenttype-objects' on page 272 undefined on in
put line 187.
LaTeX Warning: Reference `dom-document-objects' on page 272 undefined on input
line 189.
LaTeX Warning: Reference `dom-element-objects' on page 272 undefined on input l
ine 191.
LaTeX Warning: Reference `dom-attr-objects' on page 272 undefined on input line
193.
LaTeX Warning: Reference `dom-comment-objects' on page 272 undefined on input l
ine 195.
LaTeX Warning: Reference `dom-text-objects' on page 272 undefined on input line
197.
LaTeX Warning: Reference `dom-pi-objects' on page 272 undefined on input line 1
99.
[272] [273] [274] [275] [276] [277] [278]
Underfull \hbox (badness 10000) in paragraph at lines 807--810
\OT1/ptm/m/n/10 Exception when a node does not ex-ist in the ref-er-enced con-t
ext. For ex-am-ple,
[279]) (/home/neal/python/trunk/Doc/lib/xmldomminidom.tex [280] [281] [282]
[283]
Underfull \hbox (badness 10000) in paragraph at lines 242--246
[]\OT1/pcr/m/n/10 const \OT1/ptm/m/n/10 dec-la-ra-tions map to vari-ables in th
eir re-spec-tive scope (e.g.
[284]) (/home/neal/python/trunk/Doc/lib/xmldompulldom.tex)
(/home/neal/python/trunk/Doc/lib/xmlsax.tex [285] [286]
LaTeX Warning: Reference `module-xml.sax.handler' on page 287 undefined on inpu
t line 122.
LaTeX Warning: Reference `module-xml.sax.saxutils' on page 287 undefined on inp
ut line 125.
LaTeX Warning: Reference `module-xml.sax.xmlreader' on page 287 undefined on in
put line 128.
) (/home/neal/python/trunk/Doc/lib/xmlsaxhandler.tex [287] [288] [289] [290])
(/home/neal/python/trunk/Doc/lib/xmlsaxutils.tex [291])
(/home/neal/python/trunk/Doc/lib/xmlsaxreader.tex [292]
LaTeX Warning: Reference `attributes-objects' on page 293 undefined on input li
ne 74.
LaTeX Warning: Reference `attributes-ns-objects' on page 293 undefined on input
line 92.
[293]
Underfull \hbox (badness 10000) in paragraph at lines 159--163
\OT1/ptm/m/n/10 Return the cur-rent set-ting for fea-ture \OT1/ptm/m/it/10 fea-
ture-name\OT1/ptm/m/n/10 . If the fea-ture is not rec-og-nized,
Underfull \hbox (badness 6188) in paragraph at lines 173--177
\OT1/ptm/m/n/10 Return the cur-rent set-ting for prop-erty \OT1/ptm/m/it/10 pro
p-er-ty-name\OT1/ptm/m/n/10 . If the prop-erty is not rec-og-nized, a
[294] [295]
LaTeX Warning: Reference `attributes-objects' on page 296 undefined on input li
ne 330.
) (/home/neal/python/trunk/Doc/lib/libetree.tex [296] [297] [298] [299])
(/home/neal/python/trunk/Doc/lib/fileformats.tex [300]
Chapter 8.
) (/home/neal/python/trunk/Doc/lib/libcsv.tex
LaTeX Warning: Reference `csv-examples' on page 301 undefined on input line 37.
LaTeX Warning: Reference `csv-fmt-params' on page 301 undefined on input line 6
8.
[301]
LaTeX Warning: Reference `csv-fmt-params' on page 302 undefined on input line 8
9.
LaTeX Warning: Reference `csv-fmt-params' on page 302 undefined on input line 1
05.
[302] [303]
LaTeX Warning: Reference `csv-contents' on page 304 undefined on input line 308
.
[304] [305]) (/home/neal/python/trunk/Doc/lib/libcfgparser.tex [306]
Overfull \vbox (35.57pt too high) has occurred while \output is active [307]
Underfull \hbox (badness 10000) in paragraph at lines 22--23
[308]
LaTeX Warning: Reference `module-shlex' on page 309 undefined on input line 141
.
[309] [310]) (/home/neal/python/trunk/Doc/lib/librobotparser.tex
Overfull \hbox (1.49724pt too wide) in paragraph at lines 66--66
[]\OT1/pcr/m/n/9 >>> rp.can_fetch("*", "http://www.musi-cal.com/cgi-bin/search?
city=San+Francisco")[]
) (/home/neal/python/trunk/Doc/lib/libnetrc.tex [311])
(/home/neal/python/trunk/Doc/lib/libxdrlib.tex [312] [313] [314])
Underfull \hbox (badness 10000) in paragraph at lines 244--183
(/home/neal/python/trunk/Doc/lib/libcrypto.tex [315] [316]
Chapter 9.
) (/home/neal/python/trunk/Doc/lib/libhashlib.tex [317]
LaTeX Warning: Reference `module-hmac' on page 318 undefined on input line 107.
LaTeX Warning: Reference `module-base64' on page 318 undefined on input line 10
8.
) (/home/neal/python/trunk/Doc/lib/libhmac.tex [318]
LaTeX Warning: Reference `module-hashlib' on page 319 undefined on input line 5
3.
) (/home/neal/python/trunk/Doc/lib/libmd5.tex [319]
LaTeX Warning: Reference `module-sha' on page 320 undefined on input line 91.
) (/home/neal/python/trunk/Doc/lib/libsha.tex [320])
(/home/neal/python/trunk/Doc/lib/filesys.tex [321] [322]
Chapter 10.
LaTeX Warning: Reference `bltin-file-objects' on page 323 undefined on input li
ne 12.
LaTeX Warning: Reference `module-os' on page 323 undefined on input line 17.
) (/home/neal/python/trunk/Doc/lib/libposixpath.tex [323]
(/usr/share/texmf/tex/latex/psnfss/omsphv.fd) [324] [325])
(/home/neal/python/trunk/Doc/lib/libfileinput.tex [326]
Underfull \hbox (badness 10000) in paragraph at lines 185--187
[]\OT1/ptm/m/n/10 Usage ex-am-ple: `\OT1/pcr/m/n/10 fi = fileinput.FileInput(op
enhook=fileinput.hook_-
[327]) (/home/neal/python/trunk/Doc/lib/libstat.tex [328])
(/home/neal/python/trunk/Doc/lib/libstatvfs.tex [329])
(/home/neal/python/trunk/Doc/lib/libfilecmp.tex [330] [331])
(/home/neal/python/trunk/Doc/lib/libtempfile.tex [332])
(/home/neal/python/trunk/Doc/lib/libglob.tex [333]
LaTeX Warning: Reference `module-fnmatch' on page 334 undefined on input line 5
0.
) (/home/neal/python/trunk/Doc/lib/libfnmatch.tex [334]
LaTeX Warning: Reference `module-glob' on page 335 undefined on input line 54.
) (/home/neal/python/trunk/Doc/lib/liblinecache.tex)
(/home/neal/python/trunk/Doc/lib/libshutil.tex [335])
(/home/neal/python/trunk/Doc/lib/libdircache.tex [336])
Underfull \hbox (badness 10000) in paragraph at lines 37--206
(/home/neal/python/trunk/Doc/lib/archiving.tex [337] [338]
Chapter 11.
) (/home/neal/python/trunk/Doc/lib/libzlib.tex [339] [340]
LaTeX Warning: Reference `module-gzip' on page 341 undefined on input line 193.
) (/home/neal/python/trunk/Doc/lib/libgzip.tex [341]
LaTeX Warning: Reference `module-zlib' on page 342 undefined on input line 69.
) (/home/neal/python/trunk/Doc/lib/libbz2.tex
Underfull \hbox (badness 5460) in paragraph at lines 19--22
[]\OT1/pcr/m/n/10 BZ2File \OT1/ptm/m/n/10 class im-ple-ments a com-plete file i
n-ter-face, in-clud-ing \OT1/pcr/m/n/10 readline()\OT1/ptm/m/n/10 , \OT1/pcr/m/
n/10 readlines()\OT1/ptm/m/n/10 ,
[342] [343]) (/home/neal/python/trunk/Doc/lib/libzipfile.tex
LaTeX Warning: Reference `zipfile-objects' on page 344 undefined on input line
36.
LaTeX Warning: Reference `zipinfo-objects' on page 344 undefined on input line
53.
[344] [345] [346]) (/home/neal/python/trunk/Doc/lib/libtarfile.tex
LaTeX Warning: Reference `tarfile-objects' on page 347 undefined on input line
29.
[347]
LaTeX Warning: Reference `tar-examples' on page 348 undefined on input line 65.
LaTeX Warning: Reference `tarfile-objects' on page 348 undefined on input line
81.
[348]
LaTeX Warning: Reference `module-zipfile' on page 349 undefined on input line 1
29.
LaTeX Warning: Reference `tarinfo-objects' on page 349 undefined on input line
147.
LaTeX Warning: Reference `module-tarfile' on page 349 undefined on input line 1
65.
[349] [350] [351] [352])
Underfull \hbox (badness 10000) in paragraph at lines 489--214
(/home/neal/python/trunk/Doc/lib/persistence.tex [353] [354]
Chapter 12.
) (/home/neal/python/trunk/Doc/lib/libpickle.tex [355]
Underfull \hbox (badness 10000) in paragraph at lines 94--95
[356]
LaTeX Warning: Reference `pickle-sub' on page 357 undefined on input line 255.
[357]
LaTeX Warning: Reference `pickle-protocol' on page 358 undefined on input line
347.
[358]
LaTeX Warning: Reference `pickle-protocol' on page 359 undefined on input line
377.
LaTeX Warning: Reference `pickle-sub' on page 359 undefined on input line 433.
[359]
Underfull \hbox (badness 10000) in paragraph at lines 495--496
[360]
LaTeX Warning: Reference `pickle-inst' on page 361 undefined on input line 536.
[361] [362] [363]
LaTeX Warning: Reference `module-copyreg' on page 364 undefined on input line 7
97.
[364]
LaTeX Warning: Reference `module-shelve' on page 365 undefined on input line 79
9.
LaTeX Warning: Reference `module-copy' on page 365 undefined on input line 801.
LaTeX Warning: Reference `module-marshal' on page 365 undefined on input line 8
03.
) (/home/neal/python/trunk/Doc/lib/libcopyreg.tex)
(/home/neal/python/trunk/Doc/lib/libshelve.tex [365] [366]
LaTeX Warning: Reference `module-anydbm' on page 367 undefined on input line 16
3.
LaTeX Warning: Reference `module-bsddb' on page 367 undefined on input line 164
.
LaTeX Warning: Reference `module-dbhash' on page 367 undefined on input line 16
6.
LaTeX Warning: Reference `module-dbm' on page 367 undefined on input line 167.
[367]
LaTeX Warning: Reference `module-dumbdbm' on page 368 undefined on input line 1
68.
LaTeX Warning: Reference `module-gdbm' on page 368 undefined on input line 169.
LaTeX Warning: Reference `module-pickle' on page 368 undefined on input line 17
0.
LaTeX Warning: Reference `module-cPickle' on page 368 undefined on input line 1
71.
) (/home/neal/python/trunk/Doc/lib/libmarshal.tex
Underfull \hbox (badness 10000) in paragraph at lines 38--39
[368]) (/home/neal/python/trunk/Doc/lib/libanydbm.tex
LaTeX Warning: Reference `module-dbhash' on page 369 undefined on input line 51
.
LaTeX Warning: Reference `module-dbm' on page 369 undefined on input line 52.
LaTeX Warning: Reference `module-dumbdbm' on page 369 undefined on input line 5
3.
LaTeX Warning: Reference `module-gdbm' on page 369 undefined on input line 54.
[369]
LaTeX Warning: Reference `module-shelve' on page 370 undefined on input line 56
.
LaTeX Warning: Reference `module-whichdb' on page 370 undefined on input line 5
8.
) (/home/neal/python/trunk/Doc/lib/libwhichdb.tex)
(/home/neal/python/trunk/Doc/lib/libdbm.tex
LaTeX Warning: Reference `module-anydbm' on page 370 undefined on input line 57
.
LaTeX Warning: Reference `module-gdbm' on page 370 undefined on input line 58.
[370]
LaTeX Warning: Reference `module-whichdb' on page 371 undefined on input line 6
0.
) (/home/neal/python/trunk/Doc/lib/libgdbm.tex [371]
LaTeX Warning: Reference `module-anydbm' on page 372 undefined on input line 97
.
LaTeX Warning: Reference `module-whichdb' on page 372 undefined on input line 9
9.
) (/home/neal/python/trunk/Doc/lib/libdbhash.tex
LaTeX Warning: Reference `module-anydbm' on page 372 undefined on input line 44
.
LaTeX Warning: Reference `module-bsddb' on page 372 undefined on input line 45.
LaTeX Warning: Reference `module-whichdb' on page 372 undefined on input line 4
7.
[372]) (/home/neal/python/trunk/Doc/lib/libbsddb.tex [373]
LaTeX Warning: Reference `module-dbhash' on page 374 undefined on input line 10
5.
) (/home/neal/python/trunk/Doc/lib/libdumbdbm.tex [374] [375]
LaTeX Warning: Reference `module-anydbm' on page 376 undefined on input line 46
.
LaTeX Warning: Reference `module-dbm' on page 376 undefined on input line 47.
LaTeX Warning: Reference `module-gdbm' on page 376 undefined on input line 48.
LaTeX Warning: Reference `module-shelve' on page 376 undefined on input line 49
.
LaTeX Warning: Reference `module-whichdb' on page 376 undefined on input line 5
1.
) (/home/neal/python/trunk/Doc/lib/libsqlite3.tex [376] [377]
LaTeX Warning: Reference `sqlite3-Connection-IsolationLevel' on page 378 undefi
ned on input line 148.
LaTeX Warning: Reference `sqlite3-Types' on page 378 undefined on input line 16
4.
[378]
No file sqlite3/complete_statement.py.
LaTeX Warning: Reference `sqlite3-Controlling-Transactions' on page 379 undefin
ed on input line 214.
[379]
No file sqlite3/collation_reverse.py.
[380]
No file sqlite3/row_factory.py.
No file sqlite3/text_factory.py.
No file sqlite3/execute_1.py.
No file sqlite3/execute_2.py.
[381]
No file sqlite3/executemany_1.py.
No file sqlite3/executemany_2.py.
[382]
No file sqlite3/adapter_point_1.py.
[383]
No file sqlite3/adapter_point_2.py.
No file sqlite3/adapter_datetime.py.
LaTeX Warning: Reference `sqlite3-Module-Contents' on page 384 undefined on inp
ut line 559.
No file sqlite3/converter_point.py.
No file sqlite3/pysqlite_datetime.py.
[384]
Underfull \hbox (badness 10000) in paragraph at lines 585--589
\OT1/ptm/m/n/10 By de-fault, the \OT1/pcr/m/n/10 sqlite3 \OT1/ptm/m/n/10 mod-ul
e opens trans-ac-tions im-plic-itly be-fore a DML state-ment (IN-
No file sqlite3/shortcut_methods.py.
) (/home/neal/python/trunk/Doc/lib/liballos.tex [385] [386]
Chapter 13.
) (/home/neal/python/trunk/Doc/lib/libos.tex [387]
LaTeX Warning: Reference `os-file-dir' on page 388 undefined on input line 124.
[388] [389]
LaTeX Warning: Reference `popen2-flow-control' on page 390 undefined on input l
ine 400.
[390] [391] [392] [393] [394] [395] [396] [397]
LaTeX Warning: Reference `os-newstreams' on page 398 undefined on input line 11
49.
LaTeX Warning: Reference `os-newstreams' on page 398 undefined on input line 11
61.
[398] [399] [400] [401]
LaTeX Warning: Reference `os-newstreams' on page 402 undefined on input line 15
37.
[402] [403] [404] [405]) (/home/neal/python/trunk/Doc/lib/libtime.tex [406]
[407] [408] [409] [410]
LaTeX Warning: Reference `module-datetime' on page 411 undefined on input line
454.
LaTeX Warning: Reference `module-locale' on page 411 undefined on input line 45
7.
LaTeX Warning: Reference `module-calendar' on page 411 undefined on input line
460.
) (/home/neal/python/trunk/Doc/lib/liboptparse.tex [411] [412] [413] [414]
LaTeX Warning: Reference `optparse-extending-optparse' on page 415 undefined on
input line 311.
[415]
LaTeX Warning: Reference `optparse-extending-optparse' on page 416 undefined on
input line 376.
[416]
LaTeX Warning: Reference `optparse-reference-guide' on page 417 undefined on in
put line 413.
LaTeX Warning: Reference `optparse-option-callbacks' on page 417 undefined on i
nput line 413.
[417] [418]
Underfull \hbox (badness 5726) in paragraph at lines 516--519
[]\OT1/pcr/m/n/10 optparse \OT1/ptm/m/n/10 ex-pands \OT1/pcr/m/n/10 "%prog" \OT
1/ptm/m/n/10 in the us-age string to the name of the cur-rent pro-gram, i.e.
[419] [420]
LaTeX Warning: Reference `optparse-conflicts-between-options' on page 421 undef
ined on input line 704.
[421]
LaTeX Warning: Reference `optparse-tutorial' on page 422 undefined on input lin
e 728.
[422] [423] [424] [425]
LaTeX Warning: Reference `optparse-option-callbacks' on page 426 undefined on i
nput line 1008.
[426]
LaTeX Warning: Reference `optparse-option-callbacks' on page 427 undefined on i
nput line 1121.
LaTeX Warning: Reference `optparse-tutorial' on page 427 undefined on input lin
e 1142.
LaTeX Warning: Reference `optparse-extending-optparse' on page 427 undefined on
input line 1151.
[427] [428] [429] [430] [431] [432] [433] [434] [435] [436] [437])
(/home/neal/python/trunk/Doc/lib/libgetopt.tex [438] [439]
LaTeX Warning: Reference `module-optparse' on page 440 undefined on input line
150.
) (/home/neal/python/trunk/Doc/lib/liblogging.tex [440] [441]
Overfull \hbox (10.29729pt too wide) in paragraph at lines 213--213
[]\OT1/pcr/m/n/9 2006-02-08 22:20:02,165 192.168.0.1 fbloggs Protocol problem:
connection reset[]
[442] [443]
Overfull \hbox (240.35738pt too wide) in paragraph at lines 317--328
[]
[444]
Overfull \hbox (10.29729pt too wide) in paragraph at lines 441--441
[]\OT1/pcr/m/n/9 2006-02-08 22:20:02,165 192.168.0.1 fbloggs Protocol problem:
connection reset[]
[445] [446]
LaTeX Warning: Reference `typesseq-strings' on page 447 undefined on input line
597.
Overfull \hbox (367.90819pt too wide) in paragraph at lines 601--612
[]
[447] [448] [449] [450]
Overfull \vbox (288.57pt too high) has occurred while \output is active
[451] [452] [453])
Runaway argument?
{portion thereof, depending on the rollover interval. At most \var {b\ETC.
! File ended while scanning use of \code.
\par
l.241 \input{liblogging}
?
! Emergency stop.
\par
l.241 \input{liblogging}
Output written on lib.dvi (459 pages, 1629824 bytes).
Transcript written on lib.log.
*** Session transcript and error messages are in /home/neal/python/trunk/Doc/html/lib/lib.how.
*** Exited with status 1.
+++ TEXINPUTS=/home/neal/python/trunk/Doc/lib:/home/neal/python/trunk/Doc/commontex:/home/neal/python/trunk/Doc/paper-letter:/home/neal/python/trunk/Doc/texinputs:
+++ latex lib
make: *** [html/lib/lib.html] Error 1
From python-checkins at python.org Thu Jul 20 22:11:57 2006
From: python-checkins at python.org (fred.drake)
Date: Thu, 20 Jul 2006 22:11:57 +0200 (CEST)
Subject: [Python-checkins] r50731 - python/trunk/Doc/lib/liblogging.tex
Message-ID: <20060720201157.7E9ED1E4003@bag.python.org>
Author: fred.drake
Date: Thu Jul 20 22:11:57 2006
New Revision: 50731
Modified:
python/trunk/Doc/lib/liblogging.tex
Log:
markup fix
Modified: python/trunk/Doc/lib/liblogging.tex
==============================================================================
--- python/trunk/Doc/lib/liblogging.tex (original)
+++ python/trunk/Doc/lib/liblogging.tex Thu Jul 20 22:11:57 2006
@@ -1069,7 +1069,7 @@
If \var{backupCount} is non-zero, the system will save old log files by
appending extensions to the filename. The extensions are date-and-time
-based, using the strftime format \code{%Y-%m-%d_%H-%M-%S} or a leading
+based, using the strftime format \code{\%Y-\%m-\%d_\%H-\%M-\%S} or a leading
portion thereof, depending on the rollover interval. At most \var{backupCount}
files will be kept, and if more would be created when rollover occurs, the
oldest one is deleted.
@@ -1537,7 +1537,7 @@
To stop the server, call \function{stopListening()}. To send a configuration
to the socket, read in the configuration file and send it to the socket
as a string of bytes preceded by a four-byte length packed in binary using
-struct.\code{pack(">L", n)}.
+struct.\code{pack('>L', n)}.
\end{funcdesc}
\begin{funcdesc}{stopListening}{}
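The second hunk above documents the wire protocol used by the logging
configuration listener: a four-byte big-endian length packed with
struct.pack('>L', n), followed by the configuration text itself. A minimal
client sketch following that description (Python 2 era, to match the
surrounding code; the config file name and host are assumptions, not part of
the checkin):

    import socket
    import struct

    conf = open('logconf.ini', 'rb').read()   # hypothetical config file
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # default listener port (logging.config.DEFAULT_LOGGING_CONFIG_PORT)
    sock.connect(('localhost', 9030))
    sock.sendall(struct.pack('>L', len(conf)))  # four-byte big-endian length prefix
    sock.sendall(conf)                          # then the configuration bytes
    sock.close()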
From python-checkins at python.org Thu Jul 20 22:37:27 2006
From: python-checkins at python.org (phillip.eby)
Date: Thu, 20 Jul 2006 22:37:27 +0200 (CEST)
Subject: [Python-checkins] r50732 -
sandbox/trunk/setuptools/setuptools/package_index.py
Message-ID: <20060720203727.7E3F51E4003@bag.python.org>
Author: phillip.eby
Date: Thu Jul 20 22:37:27 2006
New Revision: 50732
Modified:
sandbox/trunk/setuptools/setuptools/package_index.py
Log:
Identify the setuptools version as part of the User-Agent string when
spidering pages or downloading files.
Modified: sandbox/trunk/setuptools/setuptools/package_index.py
==============================================================================
--- sandbox/trunk/setuptools/setuptools/package_index.py (original)
+++ sandbox/trunk/setuptools/setuptools/package_index.py Thu Jul 20 22:37:27 2006
@@ -141,9 +141,9 @@
if match:
yield urlparse.urljoin(url, match.group(1))
-
-
-
+user_agent = "Python-urllib/%s setuptools/%s" % (
+ urllib2.__version__, require('setuptools')[0].version
+)
@@ -617,13 +617,14 @@
if url.startswith('file:'):
return local_open(url)
try:
- return urllib2.urlopen(url)
+ request = urllib2.Request(url)
+ request.add_header('User-Agent', user_agent)
+ return urllib2.urlopen(request)
except urllib2.HTTPError, v:
return v
except urllib2.URLError, v:
raise DistutilsError("Download error: %s" % v.reason)
-
def _download_url(self, scheme, url, tmpdir):
# Determine download filename
#
@@ -653,7 +654,6 @@
else:
return filename
-
def scan_url(self, url):
self.process_url(url, True)
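Outside of the package_index context, the pattern this checkin uses reduces to
a few lines of urllib2: build a Request, attach the User-Agent header, then
open the request instead of the bare URL. A standalone sketch (the tool name
and URL are made-up placeholders, not setuptools code):

    import urllib2

    user_agent = "Python-urllib/%s example-tool/0.1" % urllib2.__version__
    request = urllib2.Request('http://www.example.org/simple/')
    request.add_header('User-Agent', user_agent)
    page = urllib2.urlopen(request).read()  # the server now sees the custom User-Agent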
From python-checkins at python.org Thu Jul 20 22:41:02 2006
From: python-checkins at python.org (phillip.eby)
Date: Thu, 20 Jul 2006 22:41:02 +0200 (CEST)
Subject: [Python-checkins] r50733 - in sandbox/branches/setuptools-0.6:
EasyInstall.txt setuptools/package_index.py
Message-ID: <20060720204102.101EF1E4003@bag.python.org>
Author: phillip.eby
Date: Thu Jul 20 22:41:01 2006
New Revision: 50733
Modified:
sandbox/branches/setuptools-0.6/EasyInstall.txt
sandbox/branches/setuptools-0.6/setuptools/package_index.py
Log:
EasyInstall now includes setuptools version information in the
``User-Agent`` string sent to websites it visits. (backport from trunk)
Modified: sandbox/branches/setuptools-0.6/EasyInstall.txt
==============================================================================
--- sandbox/branches/setuptools-0.6/EasyInstall.txt (original)
+++ sandbox/branches/setuptools-0.6/EasyInstall.txt Thu Jul 20 22:41:01 2006
@@ -1190,6 +1190,10 @@
Release Notes/Change History
============================
+0.6c1
+ * EasyInstall now includes setuptools version information in the
+ ``User-Agent`` string sent to websites it visits.
+
0.6b4
* Fix creating Python wrappers for non-Python scripts
Modified: sandbox/branches/setuptools-0.6/setuptools/package_index.py
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools/package_index.py (original)
+++ sandbox/branches/setuptools-0.6/setuptools/package_index.py Thu Jul 20 22:41:01 2006
@@ -141,9 +141,9 @@
if match:
yield urlparse.urljoin(url, match.group(1))
-
-
-
+user_agent = "Python-urllib/%s setuptools/%s" % (
+ urllib2.__version__, require('setuptools')[0].version
+)
@@ -617,13 +617,14 @@
if url.startswith('file:'):
return local_open(url)
try:
- return urllib2.urlopen(url)
+ request = urllib2.Request(url)
+ request.add_header('User-Agent', user_agent)
+ return urllib2.urlopen(request)
except urllib2.HTTPError, v:
return v
except urllib2.URLError, v:
raise DistutilsError("Download error: %s" % v.reason)
-
def _download_url(self, scheme, url, tmpdir):
# Determine download filename
#
@@ -653,7 +654,6 @@
else:
return filename
-
def scan_url(self, url):
self.process_url(url, True)
From python-checkins at python.org Thu Jul 20 22:46:49 2006
From: python-checkins at python.org (phillip.eby)
Date: Thu, 20 Jul 2006 22:46:49 +0200 (CEST)
Subject: [Python-checkins] r50734 -
sandbox/trunk/setuptools/setuptools/package_index.py
Message-ID: <20060720204649.CFACD1E4003@bag.python.org>
Author: phillip.eby
Date: Thu Jul 20 22:46:49 2006
New Revision: 50734
Modified:
sandbox/trunk/setuptools/setuptools/package_index.py
Log:
Update MD5 matching for current PyPI code. :(
Modified: sandbox/trunk/setuptools/setuptools/package_index.py
==============================================================================
--- sandbox/trunk/setuptools/setuptools/package_index.py (original)
+++ sandbox/trunk/setuptools/setuptools/package_index.py Thu Jul 20 22:46:49 2006
@@ -11,8 +11,8 @@
HREF = re.compile("""href\\s*=\\s*['"]?([^'"> ]+)""", re.I)
# this is here to fix emacs' cruddy broken syntax highlighting
PYPI_MD5 = re.compile(
- '([^<]+)\n\s+\\(md5\\)'
+ '([^<]+)\n\s+\\(md5\\)'
)
URL_SCHEME = re.compile('([-+.a-z0-9]{2,}):',re.I).match
From python-checkins at python.org Thu Jul 20 22:47:41 2006
From: python-checkins at python.org (phillip.eby)
Date: Thu, 20 Jul 2006 22:47:41 +0200 (CEST)
Subject: [Python-checkins] r50735 -
sandbox/branches/setuptools-0.6/setuptools/package_index.py
Message-ID: <20060720204741.873AF1E4003@bag.python.org>
Author: phillip.eby
Date: Thu Jul 20 22:47:41 2006
New Revision: 50735
Modified:
sandbox/branches/setuptools-0.6/setuptools/package_index.py
Log:
Backport PyPI regex change.
Modified: sandbox/branches/setuptools-0.6/setuptools/package_index.py
==============================================================================
--- sandbox/branches/setuptools-0.6/setuptools/package_index.py (original)
+++ sandbox/branches/setuptools-0.6/setuptools/package_index.py Thu Jul 20 22:47:41 2006
@@ -11,8 +11,8 @@
HREF = re.compile("""href\\s*=\\s*['"]?([^'"> ]+)""", re.I)
# this is here to fix emacs' cruddy broken syntax highlighting
PYPI_MD5 = re.compile(
- '([^<]+)\n\s+\\(md5\\)'
+ '([^<]+)\n\s+\\(md5\\)'
)
URL_SCHEME = re.compile('([-+.a-z0-9]{2,}):',re.I).match
From python-checkins at python.org Thu Jul 20 23:30:43 2006
From: python-checkins at python.org (matt.fleming)
Date: Thu, 20 Jul 2006 23:30:43 +0200 (CEST)
Subject: [Python-checkins] r50736 - in sandbox/trunk/pdb:
Doc/lib/libmpdb.tex mpdb.py test/files test/files/proc.py
test/files/thread_script.py test/test_mpdb.py test/thread_script.py
Message-ID: <20060720213043.675CF1E400F@bag.python.org>
Author: matt.fleming
Date: Thu Jul 20 23:30:41 2006
New Revision: 50736
Added:
sandbox/trunk/pdb/test/files/
sandbox/trunk/pdb/test/files/proc.py (contents, props changed)
sandbox/trunk/pdb/test/files/thread_script.py
- copied, changed from r50718, sandbox/trunk/pdb/test/thread_script.py
Removed:
sandbox/trunk/pdb/test/thread_script.py
Modified:
sandbox/trunk/pdb/Doc/lib/libmpdb.tex
sandbox/trunk/pdb/mpdb.py
sandbox/trunk/pdb/test/test_mpdb.py
Log:
Moved some things around; added a test program for debugging running applications.
Modified: sandbox/trunk/pdb/Doc/lib/libmpdb.tex
==============================================================================
--- sandbox/trunk/pdb/Doc/lib/libmpdb.tex (original)
+++ sandbox/trunk/pdb/Doc/lib/libmpdb.tex Thu Jul 20 23:30:41 2006
@@ -952,6 +952,33 @@
\end{description}
+\section{Debugger Architecture}
+\label{debugger-arch}
+This section describes the organisation of the debugger and its
+architecture.
+
+XXX Images coming soon..
+\begin{center}
+1. MPdb instance
+
+2. Input/Output MQueues
+
+3. MConnection
+\end{center}
+
+
+\subsection{The Message Queue}
+Communication inside the debugger and also between debuggers (i.e. on a remote
+connection) takes place via \emph{message queues}. The idea behind the message
+queue is that it should provide a level of abstraction from exactly \emph{how}
+messages are communicated. An object within \module{mpdb} need only know which
+queue to place commands onto in order to communicate with another object.
+
+\subsection{Instruction format}
+The instructions that are sent on the queue are made up of the following
+components,
+
+Process/ThreadID SequenceID PacketLength Data
\section{MPdb Objects}
\label{mpdb-objects}
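The instruction layout sketched in the documentation above is not pinned down
further in the patch; one way to read it, purely as an illustration with
assumed 32-bit fields, is a struct-packed header followed by the payload:

    import struct

    def pack_instruction(thread_id, seq_id, data):
        # Process/ThreadID, SequenceID and PacketLength as unsigned 32-bit
        # big-endian integers, followed by the Data payload (assumed layout).
        return struct.pack('>LLL', thread_id, seq_id, len(data)) + data

    def unpack_instruction(packet):
        thread_id, seq_id, length = struct.unpack('>LLL', packet[:12])
        return thread_id, seq_id, packet[12:12 + length]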
Modified: sandbox/trunk/pdb/mpdb.py
==============================================================================
--- sandbox/trunk/pdb/mpdb.py (original)
+++ sandbox/trunk/pdb/mpdb.py Thu Jul 20 23:30:41 2006
@@ -1,605 +1,647 @@
-#!/usr/bin/env python
-# Pdb Improvements
-#
-# This is a Google Summer of Code project
-# Student: Matthew J. Fleming
-# Mentor: Robert L. Bernstein
-"""
-This module provides improvements over the Python Debugger (Pdb) by building
-on the work done by Rocky Bernstein in The Extended Python Debugger.
-This module allows,
-
-- debugging of applications running in a separate process to the debugger
-- debugging of applications on a remote machine
-- debugging of threaded applications.
-
-"""
-
-import os
-from optparse import OptionParser
-import pydb
-from pydb.gdb import Restart
-import sys
-import time
-import thread
-import traceback
-
-__all__ = ["MPdb", "pdbserver", "target", "thread_debugging"]
-__version__ = "0.1alpha"
-
-line_prefix = '\n-> '
-
-class Exit(Exception):
- """ Causes a debugger to exit immediately. """
- pass
-
-class MPdb(pydb.Pdb):
- """ This class extends the command set and functionality of the
- Python debugger and provides support for,
-
- - debugging separate processes
- - debugging applications on remote machines
- - debugging threaded applications
- """
- def __init__(self, completekey='tab', stdin=None, stdout=None):
- """ Instantiate a debugger.
-
- The optional argument 'completekey' is the readline name of a
- completion key; it defaults to the Tab key. If completekey is
- not None and the readline module is available, command completion
- is done automatically. The optional arguments stdin and stdout
- are the objects that data is read from and written to, respectively.
- """
- pydb.Pdb.__init__(self, completekey, stdin, stdout)
- self.orig_stdin = self.stdin
- self.orig_stdout = self.stdout
- self.prompt = '(MPdb)'
- self.target = 'local' # local connections by default
- self.lastcmd = ''
- self.connection = None
- self.debugger_name = 'mpdb'
- self._show_cmds.append('target-address')
- self._show_cmds.sort()
- self._info_cmds.append('target')
- self._info_cmds.sort()
- self.target_addr = "" # target address used by 'attach'
-
- # We need to be in control of self.stdin
- self.use_rawinput = False
-
- def _rebind_input(self, new_input):
- self.stdin.flush()
- self.stdin = new_input
-
- def _rebind_output(self, new_output):
- self.stdout.flush()
- self.stdout = new_output
- if not hasattr(self.stdout, 'flush'):
- self.stdout.flush = lambda: None
-
- def remote_onecmd(self, line):
- """ All commands in 'line' are sent across this object's connection
- instance variable.
- """
- if not line:
- # Execute the previous command
- line = self.lastcmd
- # This is the simplest way I could think of to do this without
- # breaking any of the inherited code from pydb/pdb. If we're a
- # remote client, always call 'rquit' (remote quit) when connected to
- # a pdbserver. This executes extra code to allow the client and server
- # to quit cleanly.
- if 'quit'.startswith(line):
- line = 'rquit'
- self.connection.write(line)
- # Reset the onecmd method
- self.onecmd = lambda x: pydb.Pdb.onecmd(self, x)
- self.do_rquit(None)
- return
- self.connection.write(line)
- ret = self.connection.readline()
- if ret == '':
- self.errmsg('Connection closed unexpectedly')
- self.onecmd = lambda x: pydb.Pdb.onecmd(self, x)
- self.do_rquit(None)
- # The output from the command that we've just sent to the server
- # is returned along with the prompt of that server. So we keep reading
- # until we find our prompt.
- i = 1
- while self.local_prompt not in ret:
- if i == 100:
- # We're probably _never_ going to get that data and that
- # connection is probably dead.
- self.errmsg('Connection died unexpectedly')
- self.onecmd = lambda x: pydb.Pdb.onecmd(self, x)
- self.do_rquit(None)
- else:
- ret += self.connection.readline()
- i += 1
-
- # Some 'special' actions must be taken depending on the data returned
- if 'restart_now' in ret:
- self.connection.write('ACK:restart_now')
- self.errmsg('Pdbserver restarting..')
- # We've acknowledged a restart, which means that a new pdbserver
- # process is started, so we have to connect all over again.
- self._disconnect()
- time.sleep(3.0)
- if not self.do_target(self.target_addr):
- # We cannot trust these variables below to be in a
- # stable state. i.e. if the pdbserver doesn't come back up.
- self.onecmd = lambda x: pydb.Pdb.onecmd(self, x)
- return
- self.msg_nocr(ret)
- self.lastcmd = line
- return
-
- def _disconnect(self):
- """ Disconnect a connection. """
- self.connection.disconnect()
- self.connection = None
- self.target = 'local'
- if hasattr(self, 'local_prompt'):
- self.prompt = self.local_prompt
- self.onecmd = lambda x: pydb.Pdb.onecmd(self, x)
-
- def do_info(self, arg):
- """Extends pydb do_info() to give info about the Mpdb extensions."""
- if not arg:
- pydb.Pdb.do_info(self, arg)
- return
-
- args = arg.split()
- if 'target'.startswith(args[0]) and len(args[0]) > 2:
- self.msg("target is %s" % self.target)
- else:
- pydb.Pdb.do_info(self, arg)
-
- def info_helper(self, cmd, label=False):
- """Extends pydb info_helper() to give info about a single Mpdb
- info extension."""
- if label:
- self.msg_nocr("info %s --" % cmd)
- if 'target'.startswith(cmd):
- self.msg("Names of targets and files being debugged")
- else:
- pydb.Pdb.info_helper(self, cmd)
-
- def help_mpdb(self, *arg):
- help()
-
- def do_set(self, arg):
- """ Extends pydb.do_set() to allow setting of mpdb extensions.
- """
- if not arg:
- pydb.Pdb.do_set(self, arg)
- return
-
- args = arg.split()
- if 'target-address'.startswith(args[0]):
- self.target_addr = "".join(["%s " % a for a in args[1:]])
- self.target_addr = self.target_addr.strip()
-
- self.msg('target address set to: %s' % self.target_addr)
- return
-
- def do_show(self, arg):
- """Extends pydb.do_show() to show Mpdb extension settings. """
- if not arg:
- pydb.Pdb.do_show(self, arg)
- return
-
- args = arg.split()
- if 'target-address'.startswith(args[0]):
- self.msg('target address is %s.' % self.target_addr.__repr__())
- else:
- pydb.Pdb.do_show(self, arg)
-
- # Debugger commands
- def do_attach(self, addr):
- """ Attach to a process or file outside of Pdb.
-This command attaches to another target, of the same type as your last
-"target" command. The command may take as argument a process id or a
-device file. For a process id, you must have permission to send the
-process a signal, and it must have the same effective uid as the debugger.
-When using "attach" with a process id, the debugger finds the
-program running in the process, looking first in the current working
-directory, or (if not found there) using the source file search path
-(see the "directory" command). You can also use the "file" command
-to specify the program, and to load its symbol table.
-"""
- if not self.target_addr:
- self.errmsg('No target address is set')
- return
- try:
- pid = int(addr)
- except ValueError:
- # no PID
- self.errmsg('You must specify a process ID to attach to.')
- return
- from os import kill
- from signal import SIGUSR1
- try:
- kill(pid, SIGUSR1)
- except OSError, err:
- self.errmsg(err)
- return
-
- # Will remove this
- time.sleep(3.0)
-
- # At the moment by default we'll use named pipes for communication
- conn_str = 'mconnection.MConnectionClientFIFO p'
- self.do_target(conn_str)
-
- def do_target(self, args):
- """ Connect to a target machine or process.
-The first argument is the type or protocol of the target machine
-(which can be the name of a class that is available either in the current
-working directory or in Python's PYTHONPATH environment variable).
-Remaining arguments are interpreted by the target protocol. For more
-information on the arguments for a particular protocol, type
-`help target ' followed by the protocol name.
-
-List of target subcommands:
-
-target serial device-name -- Use a remote computer via a serial line
-target tcp hostname:port -- Use a remote computer via a socket connection
-"""
- if not args:
- args = self.target_addr
- try:
- target, addr = args.split(' ')
- except ValueError:
- self.errmsg('Invalid arguments')
- return False
- # If addr is ':PORT' fill in 'localhost' as the hostname
- if addr[0] == ':':
- addr = 'localhost'+addr[:]
- if 'remote' in self.target:
- self.errmsg('Already connected to a remote machine.')
- return False
- if target == 'tcp':
- # TODO: need to save state of current debug session
- if self.connection: self.connection.disconnect()
- try:
- from mconnection import (MConnectionClientTCP,
- ConnectionFailed)
- self.connection = MConnectionClientTCP()
-
- except ImportError:
- self.errmsg('Could not import MConnectionClientTCP')
- return False
- elif target == 'serial':
- if self.connection: self.connection.disconnect()
- try:
- from mconnection import (MConnectionSerial,
- ConnectionFailed)
- self.connection = MConnectionSerial()
- except ImportError:
- self.errmsg('Could not import MConnectionSerial')
- return False
- else:
- cls = target[target.rfind('.')+1:]
- path = target[:target.rfind('.')]
- exec 'from ' + path + ' import ' + cls + ', ConnectionFailed'
- self.connection = eval(cls+'()')
- try:
- self.connection.connect(addr)
- except ConnectionFailed, err:
- self.errmsg("Failed to connect to %s: (%s)" % (addr, err))
- return False
- # This interpreter no longer interprets commands but sends
- # them straight across this object's connection to a server.
- # XXX: In the remote_onecmd method we use the local_prompt string
- # to find where the end of the message from the server is. We
- # really need a way to get the prompt from the server for checking
- # in remote_onecmd, because it may be different to this client's.
- self.local_prompt = self.prompt
- self.prompt = ""
- self.target_addr = target + " " + addr
- line = self.connection.readline()
- if line == '':
- self.errmsg('Connection closed unexpectedly')
- raise Exit
- while '(MPdb)' not in line:
- line = self.connection.readline()
- self.msg_nocr(line)
- self.onecmd = self.remote_onecmd
- self.target = 'remote-client'
- return True
-
- def do_detach(self, args):
- """ Detach a process or file previously attached.
-If a process, it is no longer traced, and it continues its execution. If
-you were debugging a file, the file is closed and Pdb no longer accesses it.
-"""
- pass
-
- def do_pdbserver(self, args):
- """ Allow a debugger to connect to this session.
-The first argument is the type or protocol that is used for this connection
-(which can be the name of a class that is avaible either in the current
-working directory or in Python's PYTHONPATH environtment variable).
-The next argument is protocol specific arguments (e.g. hostname and
-port number for a TCP connection, or a serial device for a serial line
-connection). The next argument is the filename of the script that is
-being debugged. The rest of the arguments are passed as arguments to the
-script file and are optional. For more information on the arguments for a
-particular protocol, type `help pdbserver ' followed by the protocol name.
-The syntax for this command is,
-
-`pdbserver ConnectionClass comm scriptfile [args ...]'
-
-"""
- try:
- target, comm = args.split(' ')
- except ValueError:
- self.errmsg('Invalid arguments')
- return
- if 'remote' in self.target:
- self.errmsg('Already connected remotely')
- return
- if target == 'tcp':
- try:
- from mconnection import (MConnectionServerTCP,
- ConnectionFailed)
- self.connection = MConnectionServerTCP()
- except ImportError:
- self.errmsg('Could not load MConnectionServerTCP class')
- return
- elif target == 'serial':
- try:
- from mconnection import (MConnectionSerial,
- ConnectionFailed)
- self.connection = MConnectionSerial()
- except ImportError:
- self.errmsg('Could not load MConnectionSerial class')
- return
- else:
- if '.' in target:
- base = target[:target.rfind('.')]
- target = target[target.rfind('.')+1:]
- try:
- exec 'from ' + base + ' import (' + target + \
- ', ConnectionFailed)'
- except ImportError:
- self.errmsg('Unknown protocol')
- return
- else:
- try:
- __import__(target)
- except ImportError:
- self.errmsg('Unknown protocol')
- return
- self.connection = eval(target+'()')
- try:
- self.msg('Listening on: %s' % comm)
- self.connection.connect(comm)
- except ConnectionFailed, err:
- self.errmsg("Failed to connect to %s: (%s)" % (comm, err))
- return
- self.pdbserver_addr = comm
- self.target = 'remote-pdbserver'
- self._rebind_input(self.connection)
- self._rebind_output(self.connection)
-
- def do_rquit(self, arg):
- """ Quit a remote debugging session. The program being executed
-is aborted.
-"""
- if self.target == 'local':
- self.errmsg('Connected locally; cannot remotely quit')
- return
- self._rebind_output(self.orig_stdout)
- self._rebind_input(self.orig_stdin)
- self._disconnect()
- self.target = 'local'
- self.do_quit(None)
- raise Exit
-
- def do_restart(self, arg):
- """ Extend pydb.do_restart to signal to any clients connected on
- a debugger's connection that this debugger is going to be restarted.
- All state is lost, and a new copy of the debugger is used.
- """
- # We don't proceed with the restart until the action has been
- # ACK'd by any connected clients
- if self.connection != None:
- self.msg('restart_now\n(MPdb)')
- line = ""
- while not 'ACK:restart_now' in line:
- line = self.connection.readline()
- try:
- self.do_rquit(None)
- except Exit:
- pass
- else:
- self.msg("Re exec'ing\n\t%s" % self._sys_argv)
- os.execvp(self._sys_argv[0], self._sys_argv)
-
-def pdbserver(addr, m):
- """ This method sets up a pdbserver debugger that allows debuggers
- to connect to 'addr', which a protocol-specific address, i.e.
- tcp = 'tcp mydomainname.com:9876'
- serial = 'serial /dev/ttyC0'
- """
- m.do_pdbserver(addr)
- while True:
- try:
- m._runscript(m.mainpyfile)
- except Restart:
- sys.argv = list(m._program_sys_argv)
- m.msg('Restarting')
- except Exit:
- break
-
-def target(addr):
- """ Connect this debugger to a pdbserver at 'addr'. 'addr' is
- a protocol-specific address. i.e.
- tcp = 'tcp mydomainname.com:9876'
- serial = 'serial /dev/ttyC0'
- """
- m = MPdb()
- m.reset()
- # Look Ma, no script!
- m.do_target(addr)
- while True:
- try:
- m.cmdloop()
- if m._user_requested_quit: break
- except:
- break
-
-def thread_debugging(m):
- """ Setup this debugger to handle threaded applications."""
- sys.path.append(os.path.dirname(m._sys_argv[1]))
- import mthread
- mthread.init(m)
- while True:
- try:
- m._runscript(m.mainpyfile)
- if m._user_requested_quit: break
- except Restart:
- sys.argv = list(m._program_sys_argv)
- m.msg('Restarting')
- except:
- m.msg(traceback.format_exc())
- break
-
-# Moving this out of this file makes things 'awkward', for instance
-# the outter file has to know about how to setup a pdbserver.
-def process_debugging():
- """ Allow debugging of other processes. This routine should
- be imported and called near the top of the program file.
- It sets up signal handlers that are used to create a pdbserver
- that a debugging client can attach to.
- """
- import signal
- signal.signal(signal.SIGUSR1, signal_handler)
-
-def signal_handler(signum, frame):
- """ This signal handler replaces the programs signal handler
- for the SIGUSR1 signal. When a program receives this signal,
- it creates a pdbserver with a FIFO connection. Debugger
- clients can then attach to this pdbserver via it's pid.
- """
- m = MPdb()
- m.reset()
- m.running = True
- m.currentframe = frame
- m._sys_argv = ['python']
- for i in sys.argv:
- m._sys_argv.append(i)
- m.do_pdbserver('mconnection.MConnectionServerFIFO p')
- m.set_trace(m.curframe)
-
-def attach(pid, mpdb):
- """ Attach to running process 'pid'. """
- mpdb.reset() # botframe, etc should be None
- mpdb.do_attach(pid)
- while True:
- try:
- mpdb.cmdloop()
- if mpdb._user_requested_quit: break
- except Exit:
- break
-
-def main():
- """ Main entry point to this module. """
- mpdb = MPdb()
-
- from pydb.fns import process_options
- from optparse import make_option
-
- opt_list = [
- make_option("-t", "--target", dest="target",
- help="Specify a target to connect to. The arguments" \
- + " should be of form, 'protocol address'."),
- make_option("--pdbserver", dest="pdbserver",
- help="Start the debugger and execute the pdbserver " \
- + "command. The arguments should be of the form," \
- + " 'protocol address scriptname'."),
- make_option("-d", "--debug-thread", action="store_true",
- help="Turn on thread debugging."),
- make_option("--pid", dest="pid", help="Attach to running process PID.")
- ]
-
- opts = process_options(mpdb, "mpdb", os.path.basename(sys.argv[0])
- , __version__, opt_list)
-
- tmp_argv = ['python']
- for arg in mpdb._sys_argv:
- tmp_argv.append(arg)
-
- mpdb._sys_argv = list(tmp_argv)
-
- if not sys.argv:
- # No program to debug
- mpdb.mainpyfile = None
- else:
- mpdb._program_sys_argv = list(sys.argv)
-
- mpdb.mainpyfile = mpdb._program_sys_argv[0]
-
- if not os.path.exists(mpdb.mainpyfile):
- print 'Error:', mpdb.mainpyfile, 'does not exist'
-
- # Replace mpdb's dir with script's dir in front of
- # module search path.
- sys.path[0] = mpdb.main_dirname = os.path.dirname(mpdb.mainpyfile)
-
- if opts.target:
- target(opts.target)
- sys.exit()
- elif opts.pdbserver:
- pdbserver(opts.pdbserver, mpdb)
- sys.exit()
- elif opts.debug_thread:
- thread_debugging(mpdb)
- sys.exit()
- elif opts.pid:
- attach(opts.pid, mpdb)
- sys.exit()
-
- while 1:
- try:
- if mpdb.mainpyfile:
- mpdb._runscript(mpdb.mainpyfile)
- else:
- mpdb._wait_for_mainpyfile = True
- mpdb.interaction(None, None)
-
- if mpdb._user_requested_quit:
- break
- mpdb.msg("The program finished and will be restarted")
- except Restart:
- if mpdb._program_sys_argv:
- sys.argv = list(mpdb._program_sys_argv)
- mpdb.msg('Restarting %s with arguments:\n\t%s'
- % (mpdb.filename(mpdb.mainpyfile),
- ' '.join(mpdb._program_sys_argv[1:])))
- else: break
-
- except SystemExit:
- # In most cases SystemExit does not warrant a post-mortem session.
- mpdb.msg("The program exited via sys.exit(). " + \
- "Exit status:",sys.exc_info()[1])
- except Exit:
- # This exception raised when we disconnect from a remote session
- pass
- except:
- mpdb.msg(traceback.format_exc())
- mpdb.msg("Uncaught exception. Entering post mortem debugging")
- mpdb.msg("Running 'cont' or 'step' will restart the program")
- t = sys.exc_info()[2]
- while t.tb_next != None:
- t = t.tb_next
- mpdb.interaction(t.tb_frame,t)
- mpdb.msg("Post mortem debugger finished. The " + \
- mpdb.mainpyfile + " will be restarted")
-
-if __name__ == '__main__':
- main()
-
-
+#!/usr/bin/env python
+
+# Pdb Improvements
+#
+# This is a Google Summer of Code project
+# Student: Matthew J. Fleming
+# Mentor: Robert L. Bernstein
+"""
+This module provides improvements over the Python Debugger (Pdb) by building
+on the work done by Rocky Bernstein in The Extended Python Debugger.
+This module allows:
+
+- debugging of applications running in a separate process from the debugger
+- debugging of applications on a remote machine
+- debugging of threaded applications.
+"""
+
+import os
+from optparse import OptionParser
+import pydb
+from pydb.gdb import Restart
+import sys
+import time
+import thread
+import traceback
+
+__all__ = ["MPdb", "pdbserver", "target", "thread_debugging"]
+__version__ = "0.1alpha"
+
+line_prefix = '\n-> '
+
+class MPdb(pydb.Pdb):
+ """ This class extends the command set and functionality of the
+ Python debugger and provides support for,
+
+ - debugging separate processes
+ - debugging applications on remote machines
+ - debugging threaded applications
+ """
+ def __init__(self, completekey='tab', stdin=None, stdout=None):
+ """ Instantiate a debugger.
+
+ The optional argument 'completekey' is the readline name of a
+ completion key; it defaults to the Tab key. If completekey is
+ not None and the readline module is available, command completion
+ is done automatically. The optional arguments stdin and stdout
+ are the objects that data is read from and written to, respectively.
+ """
+ pydb.Pdb.__init__(self, completekey, stdin, stdout)
+ self.orig_stdin = self.stdin
+ self.orig_stdout = self.stdout
+ self.prompt = '(MPdb)'
+ self.target = 'local' # local connections by default
+ self.lastcmd = ''
+ self.connection = None
+ self.debugger_name = 'mpdb'
+ self._show_cmds.append('target-address')
+ self._show_cmds.sort()
+ self._info_cmds.append('target')
+ self._info_cmds.sort()
+ self.target_addr = "" # target address used by 'attach'
+ self.debug_signal = None # The signal used by 'attach'
+
+ # We need to be in control of self.stdin
+ self.use_rawinput = False
+
+ def _rebind_input(self, new_input):
+ self.stdin = new_input
+
+ def _rebind_output(self, new_output):
+ self.stdout.flush()
+ self.stdout = new_output
+ if not hasattr(self.stdout, 'flush'):
+ self.stdout.flush = lambda: None
+
+ def remote_onecmd(self, line):
+ """ All commands in 'line' are sent across this object's connection
+ instance variable.
+ """
+ if not line:
+ # Execute the previous command
+ line = self.lastcmd
+ # This is the simplest way I could think of to do this without
+ # breaking any of the inherited code from pydb/pdb. If we're a
+ # remote client, always call 'rquit' (remote quit) when connected to
+ # a pdbserver. This executes extra code to allow the client and server
+ # to quit cleanly.
+ if 'quit'.startswith(line):
+ line = 'rquit'
+ self.connection.write(line)
+ # Reset the onecmd method
+ self.onecmd = lambda x: pydb.Pdb.onecmd(self, x)
+ self.do_rquit(None)
+ return
+ self.connection.write(line)
+ ret = self.connection.readline()
+ if ret == '':
+ self.errmsg('Connection closed unexpectedly')
+ self.onecmd = lambda x: pydb.Pdb.onecmd(self, x)
+ self.do_rquit(None)
+ # The output from the command that we've just sent to the server
+ # is returned along with the prompt of that server. So we keep reading
+ # until we find our prompt.
+ i = 1
+ while self.local_prompt not in ret:
+ if i == 100:
+ # We're probably _never_ going to get that data and that
+ # connection is probably dead.
+ self.errmsg('Connection died unexpectedly')
+ self.onecmd = lambda x: pydb.Pdb.onecmd(self, x)
+ self.do_rquit(None)
+ else:
+ ret += self.connection.readline()
+ i += 1
+
+ # Some 'special' actions must be taken depending on the data returned
+ if 'restart_now' in ret:
+ self.connection.write('ACK:restart_now')
+ self.errmsg('Pdbserver restarting..')
+ # We've acknowledged a restart, which means that a new pdbserver
+ # process is started, so we have to connect all over again.
+ self._disconnect()
+ time.sleep(3.0)
+ if not self.do_target(self.target_addr):
+ # We cannot trust these variables below to be in a
+ # stable state. i.e. if the pdbserver doesn't come back up.
+ self.onecmd = lambda x: pydb.Pdb.onecmd(self, x)
+ return
+ self.msg_nocr(ret)
+ self.lastcmd = line
+ return
+
+ def _disconnect(self):
+ """ Disconnect a connection. """
+ self.connection.disconnect()
+ self.connection = None
+ self.target = 'local'
+ if hasattr(self, 'local_prompt'):
+ self.prompt = self.local_prompt
+ self.onecmd = lambda x: pydb.Pdb.onecmd(self, x)
+
+ def do_info(self, arg):
+ """Extends pydb do_info() to give info about the Mpdb extensions."""
+ if not arg:
+ pydb.Pdb.do_info(self, arg)
+ return
+
+ args = arg.split()
+ if 'target'.startswith(args[0]) and len(args[0]) > 2:
+ self.msg("target is %s" % self.target)
+ else:
+ pydb.Pdb.do_info(self, arg)
+
+ def info_helper(self, cmd, label=False):
+ """Extends pydb info_helper() to give info about a single Mpdb
+ info extension."""
+ if label:
+ self.msg_nocr("info %s --" % cmd)
+ if 'target'.startswith(cmd):
+ self.msg("Names of targets and files being debugged")
+ else:
+ pydb.Pdb.info_helper(self, cmd)
+
+ def help_mpdb(self, *arg):
+ help()
+
+ def do_set(self, arg):
+ """ Extends pydb.do_set() to allow setting of mpdb extensions.
+ """
+ if not arg:
+ pydb.Pdb.do_set(self, arg)
+ return
+
+ args = arg.split()
+ if 'debug-signal'.startswith(args[0]):
+ self.debug_signal = args[1]
+ self.msg('debug-signal set to: %s' % self.debug_signal)
+ elif 'target-address'.startswith(args[0]):
+ self.target_addr = "".join(["%s " % a for a in args[1:]])
+ self.target_addr = self.target_addr.strip()
+ self.msg('target address set to: %s' % self.target_addr)
+
+ def do_show(self, arg):
+ """Extends pydb.do_show() to show Mpdb extension settings. """
+ if not arg:
+ pydb.Pdb.do_show(self, arg)
+ return
+
+ args = arg.split()
+ if 'debug-signal'.startswith(args[0]):
+ if not self.debug_signal:
+ self.msg('debug-signal is not set.')
+ else:
+ self.msg('debug-signal is %s.' % self.debug_signal)
+ elif 'target-address'.startswith(args[0]):
+ self.msg('target address is %s.' % self.target_addr.__repr__())
+ else:
+ pydb.Pdb.do_show(self, arg)
+
+ # Debugger commands
+ def do_attach(self, addr):
+ """ Attach to a process or file outside of Pdb.
+This command attaches to another target, of the same type as your last
+"target" command. The command may take as argument a process id or a
+device file. For a process id, you must have permission to send the
+process a signal, and it must have the same effective uid as the debugger.
+When using "attach" with a process id, the debugger finds the
+program running in the process, looking first in the current working
+directory, or (if not found there) using the source file search path
+(see the "directory" command). You can also use the "file" command
+to specify the program, and to load its symbol table.
+"""
+ if not self.target_addr:
+ self.errmsg('No target address is set')
+ return
+ try:
+ pid = int(addr)
+ except ValueError:
+ # no PID
+ self.errmsg('You must specify a process ID to attach to.')
+ return
+ if not self.debug_signal:
+ from signal import SIGUSR1
+ self.debug_signal = SIGUSR1
+ else:
+ # Because self.debug_signal may be a string
+ self.debug_signal = eval(self.debug_signal)
+ try:
+ os.kill(pid, self.debug_signal)
+ except OSError, err:
+ self.errmsg(err)
+ return
+
+ # Will remove this
+ time.sleep(3.0)
+
+ # At the moment by default we'll use named pipes for communication
+ self.do_target(self.target_addr)
+
+ def do_target(self, args):
+ """ Connect to a target machine or process.
+The first argument is the type or protocol of the target machine
+(which can be the name of a class that is available either in the current
+working directory or in Python's PYTHONPATH environment variable).
+Remaining arguments are interpreted by the target protocol. For more
+information on the arguments for a particular protocol, type
+`help target ' followed by the protocol name.
+
+List of target subcommands:
+
+target serial device-name -- Use a remote computer via a serial line
+target tcp hostname:port -- Use a remote computer via a socket connection
+"""
+ if not args:
+ args = self.target_addr
+ try:
+ target, addr = args.split(' ')
+ except ValueError:
+ self.errmsg('Invalid arguments')
+ return False
+ # If addr is ':PORT' fill in 'localhost' as the hostname
+ if addr[0] == ':':
+ addr = 'localhost'+addr[:]
+ if 'remote' in self.target:
+ self.errmsg('Already connected to a remote machine.')
+ return False
+ if target == 'tcp':
+ # TODO: need to save state of current debug session
+ if self.connection: self.connection.disconnect()
+ try:
+ from mconnection import (MConnectionClientTCP,
+ ConnectionFailed)
+ self.connection = MConnectionClientTCP()
+
+ except ImportError:
+ self.errmsg('Could not import MConnectionClientTCP')
+ return False
+ elif target == 'serial':
+ if self.connection: self.connection.disconnect()
+ try:
+ from mconnection import (MConnectionSerial,
+ ConnectionFailed)
+ self.connection = MConnectionSerial()
+ except ImportError:
+ self.errmsg('Could not import MConnectionSerial')
+ return False
+ elif target == 'fifo':
+ if self.connection: self.connection.disconnect()
+ try:
+ from mconnection import (MConnectionClientFIFO,
+ ConnectionFailed)
+ self.connection = MConnectionClientFIFO()
+ except ImportError:
+ self.errmsg('Could not import MConnectionClientFIFO')
+ return False
+ else:
+ cls = target[target.rfind('.')+1:]
+ path = target[:target.rfind('.')]
+ exec 'from ' + path + ' import ' + cls + ', ConnectionFailed'
+ self.connection = eval(cls+'()')
+ try:
+ self.connection.connect(addr)
+ except ConnectionFailed, err:
+ self.errmsg("Failed to connect to %s: (%s)" % (addr, err))
+ return False
+ # This interpreter no longer interprets commands but sends
+ # them straight across this object's connection to a server.
+ # XXX: In the remote_onecmd method we use the local_prompt string
+ # to find where the end of the message from the server is. We
+ # really need a way to get the prompt from the server for checking
+ # in remote_onecmd, because it may be different to this client's.
+ self.local_prompt = self.prompt
+ self.prompt = ""
+ self.target_addr = target + " " + addr
+ line = self.connection.readline()
+ if line == '':
+ self.errmsg('Connection closed unexpectedly')
+ self.do_quit(None)
+ while '(MPdb)' not in line:
+ line = self.connection.readline()
+ self.msg_nocr(line)
+ self.onecmd = self.remote_onecmd
+ self.target = 'remote-client'
+ return True
+
+ def do_detach(self, args):
+ """ Detach a process or file previously attached.
+If a process, it is no longer traced, and it continues its execution. If
+you were debugging a file, the file is closed and Pdb no longer accesses it.
+"""
+ pass
+
+ def do_pdbserver(self, args):
+ """ Allow a debugger to connect to this session.
+The first argument is the type or protocol that is used for this connection
+(which can be the name of a class that is available either in the current
+working directory or in Python's PYTHONPATH environment variable).
+The next argument is protocol specific arguments (e.g. hostname and
+port number for a TCP connection, or a serial device for a serial line
+connection). The next argument is the filename of the script that is
+being debugged. The rest of the arguments are passed as arguments to the
+script file and are optional. For more information on the arguments for a
+particular protocol, type `help pdbserver ' followed by the protocol name.
+The syntax for this command is,
+
+`pdbserver ConnectionClass comm scriptfile [args ...]'
+
+"""
+ try:
+ target, comm = args.split(' ')
+ except ValueError:
+ self.errmsg('Invalid arguments')
+ return
+ if 'remote' in self.target:
+ self.errmsg('Already connected remotely')
+ return
+ if target == 'tcp':
+ try:
+ from mconnection import (MConnectionServerTCP,
+ ConnectionFailed)
+ self.connection = MConnectionServerTCP()
+ except ImportError:
+ self.errmsg('Could not load MConnectionServerTCP class')
+ return
+ elif target == 'serial':
+ try:
+ from mconnection import (MConnectionSerial,
+ ConnectionFailed)
+ self.connection = MConnectionSerial()
+ except ImportError:
+ self.errmsg('Could not load MConnectionSerial class')
+ return
+ else:
+ if '.' in target:
+ base = target[:target.rfind('.')]
+ target = target[target.rfind('.')+1:]
+ try:
+ exec 'from ' + base + ' import (' + target + \
+ ', ConnectionFailed)'
+ except ImportError:
+ self.errmsg('Unknown protocol')
+ return
+ else:
+ try:
+ __import__(target)
+ except ImportError:
+ self.errmsg('Unknown protocol')
+ return
+ self.connection = eval(target+'()')
+ try:
+ self.msg('Listening on: %s' % comm)
+ self.connection.connect(comm)
+ except ConnectionFailed, err:
+ self.errmsg("Failed to connect to %s: (%s)" % (comm, err))
+ return
+ self.pdbserver_addr = comm
+ self.target = 'remote-pdbserver'
+ self._rebind_input(self.connection)
+ self._rebind_output(self.connection)
+
+ def do_rquit(self, arg):
+ """ Quit a remote debugging session. The program being executed
+is aborted.
+"""
+ if self.target == 'local':
+ self.errmsg('Connected locally; cannot remotely quit')
+ return
+ self._rebind_output(self.orig_stdout)
+ self._rebind_input(self.orig_stdin)
+ self._disconnect()
+ self.target = 'local'
+ self.do_quit(None)
+
+ def do_restart(self, arg):
+ """ Extend pydb.do_restart to signal to any clients connected on
+ a debugger's connection that this debugger is going to be restarted.
+ All state is lost, and a new copy of the debugger is used.
+ """
+ # We don't proceed with the restart until the action has been
+ # ACK'd by any connected clients
+ if self.connection != None:
+ self.msg('restart_now\n(MPdb)')
+ line = ""
+ while not 'ACK:restart_now' in line:
+ line = self.connection.readline()
+ self.do_rquit(None)
+ else:
+ self.msg("Re exec'ing\n\t%s" % self._sys_argv)
+ os.execvp(self._sys_argv[0], self._sys_argv)
+
+def pdbserver(addr, m):
+ """ This method sets up a pdbserver debugger that allows debuggers
+ to connect to 'addr', which is a protocol-specific address, e.g.
+ tcp = 'tcp mydomainname.com:9876'
+ serial = 'serial /dev/ttyC0'
+ """
+ m.do_pdbserver(addr)
+ while True:
+ try:
+ m._runscript(m.mainpyfile)
+ if m._user_requested_quit: break
+ except Restart:
+ sys.argv = list(m._program_sys_argv)
+ m.msg('Restarting')
+
+# The value of 'opts' dictates whether we call do_target or do_attach. There
+# used to be two separate top-level routines for these options, but apart from
+# choosing which do_* to call, the code was the same, so it made sense to merge them.
+def target(addr, opts, mpdb):
+ """ Connect this debugger to a pdbserver at 'addr'. 'addr' is
+ a protocol-specific address, e.g.
+ tcp = 'tcp mydomainname.com:9876'
+ serial = 'serial /dev/ttyC0'
+
+ 'opts' is the OptionParser options object. If opts.target is set, call do_target.
+ If opts.pid is set, call do_attach.
+ """
+ mpdb.reset()
+ if opts.target:
+ mpdb.do_target(addr)
+ elif opts.pid:
+ pid = addr[:addr.find(' ')]
+ addr = addr[addr.find(' ')+1:]
+ mpdb.do_set('target ' + addr)
+ mpdb.do_attach(pid)
+ while True:
+ try:
+ mpdb.cmdloop()
+ if mpdb._user_requested_quit: break
+ except:
+ break
+
+def thread_debugging(m):
+ """ Setup this debugger to handle threaded applications."""
+ sys.path.append(os.path.dirname(m._sys_argv[1]))
+ import mthread
+ mthread.init(m)
+ while True:
+ try:
+ m._runscript(m.mainpyfile)
+ if m._user_requested_quit: break
+ except Restart:
+ sys.argv = list(m._program_sys_argv)
+ m.msg('Restarting')
+ except:
+ m.msg(traceback.format_exc())
+ break
+
+# Moving this out of this file makes things 'awkward', for instance
+# the outer file has to know about how to set up a pdbserver.
+def process_debugging(sig=None, protocol=None, addr=None):
+ """ Allow debugging of other processes. This routine should
+ be imported and called near the top of the program file.
+ It sets up signal handlers that are used to create a pdbserver
+ that a debugging client can attach to.
+
+ The optional argument 'sig' specifies which signal will be
+ used for running process debugging. If 'sig' is not specified,
+ the SIGUSR1 signal is used. The optional argument 'protocol' is
+ the protocol used to create a pdbserver. The optional argument 'addr'
+ is the address used to create a pdbserver. The argument must
+ contain a protocol and protocol-specific address. See the
+ docstrings for pdbserver and target for more details.
+ """
+ import signal
+ if not sig:
+ sig = signal.SIGUSR1
+
+ # Save the old signal handler
+ global old_handler
+ old_handler = signal.getsignal(sig)
+
+ # Set up the new signal handler
+ signal.signal(sig, signal_handler)
+
+ global pdbserver_addr
+
+ if protocol is not None:
+ proto = protocol
+ else:
+ proto = 'mconnection.MConnectionServerFIFO'
+
+ if addr is not None:
+ pdbserver_addr = addr
+ else:
+ # XXX I don't think a successful symlink attack can be made here,
+ # because pdbserver bails if the file for a FIFO already exists.
+ tmp = os.tempnam(None,'mpdb') # use 'mpdb' as a prefix
+ pdbserver_addr = proto + " " + tmp
+ print pdbserver_addr
+
+def signal_handler(signum, frame):
+ """ This signal handler replaces the programs signal handler
+ for the 'signum' signal (by default SIGUSR1). When a program
+ receives this signal, it creates a pdbserver.
+ Debugger clients can then attach to this pdbserver via its pid.
+ """
+ m = MPdb()
+ m._sys_argv = list(sys.argv)
+ m.reset()
+ m.running = True
+ m.currentframe = frame
+
+ # Clear up namespace
+ del frame.f_globals['mpdb']
+
+ m.do_pdbserver(pdbserver_addr)
+
+ try:
+ m.set_trace(m.curframe)
+ finally:
+ m.do_quit(None)
+ import signal
+ signal.signal(signum, old_handler)
+
+
+def main():
+ """ Main entry point to this module. """
+ mpdb = MPdb()
+
+ from pydb.fns import process_options
+ from optparse import make_option
+
+ opt_list = [
+ make_option("-t", "--target", dest="target",
+ help="Specify a target to connect to. The arguments" \
+ + " should be of form, 'protocol address'."),
+ make_option("--pdbserver", dest="pdbserver",
+ help="Start the debugger and execute the pdbserver " \
+ + "command. The arguments should be of the form," \
+ + " 'protocol address scriptname'."),
+ make_option("-d", "--debug-thread", action="store_true",
+ help="Turn on thread debugging."),
+ make_option("--pid", dest="pid", help="Attach to running process PID.")
+ ]
+
+ opts = process_options(mpdb, "mpdb", os.path.basename(sys.argv[0])
+ , __version__, opt_list)
+
+ mpdb._sys_argv = list(sys.argv)
+
+ if not sys.argv:
+ # No program to debug
+ mpdb.mainpyfile = None
+ else:
+ mpdb._program_sys_argv = list(sys.argv)
+
+ mpdb.mainpyfile = mpdb._program_sys_argv[0]
+
+ if not os.path.exists(mpdb.mainpyfile):
+ print 'Error:', mpdb.mainpyfile, 'does not exist'
+
+ # Replace mpdb's dir with script's dir in front of
+ # module search path.
+ sys.path[0] = mpdb.main_dirname = os.path.dirname(mpdb.mainpyfile)
+
+ if opts.target:
+ target(opts.target, opts, mpdb)
+ sys.exit()
+ elif opts.pdbserver:
+ pdbserver(opts.pdbserver, mpdb)
+ sys.exit()
+ elif opts.debug_thread:
+ thread_debugging(mpdb)
+ sys.exit()
+ elif opts.pid:
+ target(opts.pid, opts, mpdb)
+ sys.exit()
+
+ while 1:
+ try:
+ if mpdb.mainpyfile:
+ mpdb._runscript(mpdb.mainpyfile)
+ else:
+ mpdb._wait_for_mainpyfile = True
+ mpdb.interaction(None, None)
+
+ if mpdb._user_requested_quit:
+ break
+ mpdb.msg("The program finished and will be restarted")
+ except Restart:
+ if mpdb._program_sys_argv:
+ sys.argv = list(mpdb._program_sys_argv)
+ mpdb.msg('Restarting %s with arguments:\n\t%s'
+ % (mpdb.filename(mpdb.mainpyfile),
+ ' '.join(mpdb._program_sys_argv[1:])))
+ else: break
+
+ except SystemExit:
+ # In most cases SystemExit does not warrant a post-mortem session.
+ mpdb.msg("The program exited via sys.exit(). " + \
+ "Exit status:",sys.exc_info()[1])
+ except:
+ mpdb.msg(traceback.format_exc())
+ mpdb.msg("Uncaught exception. Entering post mortem debugging")
+ mpdb.msg("Running 'cont' or 'step' will restart the program")
+ t = sys.exc_info()[2]
+ while t.tb_next != None:
+ t = t.tb_next
+ mpdb.interaction(t.tb_frame,t)
+ mpdb.msg("Post mortem debugger finished. The " + \
+ mpdb.mainpyfile + " will be restarted")
+
+if __name__ == '__main__':
+ main()
+
+
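The do_pdbserver and do_target docstrings above describe the intended split
between a serving debugger and a connecting client. As a rough sketch of how
the two module-level helpers are meant to fit together (not part of the
checkin; the host, port and script name are placeholders, with the port
borrowed from test_mpdb.py below):

    # Server side: publish a pdbserver for a script over TCP.
    import mpdb
    server = mpdb.MPdb()
    server.mainpyfile = 'myscript.py'             # placeholder script to debug
    server._program_sys_argv = ['myscript.py']
    mpdb.pdbserver('tcp localhost:8002', server)  # runs the script under the server

    # Client side, in a separate process: connect and drive the remote session.
    import mpdb
    client = mpdb.MPdb()
    client.reset()
    if client.do_target('tcp localhost:8002'):
        client.cmdloop()                          # commands are forwarded to the server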
Added: sandbox/trunk/pdb/test/files/proc.py
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/test/files/proc.py Thu Jul 20 23:30:41 2006
@@ -0,0 +1,17 @@
+#!/usr/bin/env python
+
+# This file demonstrates mpdb's ability to debug a running process.
+# Run this file, making note of its pid, and connect to it with:
+# mpdb.py --pid=PID
+
+import sys
+sys.path.append('../..')
+
+import mpdb
+mpdb.process_debugging()
+
+while True:
+ for i in range(10):
+ x = i
+
+
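proc.py above relies on process_debugging() plus do_attach(); the matching
client side is only sketched here, since the exact protocol and address must
be taken from whatever the pdbserver in the target process reports (the FIFO
path printed by process_debugging() is a made-up placeholder below, as is the
pid):

    import mpdb
    client = mpdb.MPdb()
    client.reset()
    client.do_set('target-address fifo /tmp/mpdbXXXXXX')  # placeholder address
    client.do_attach('12345')   # placeholder pid; sends SIGUSR1, then do_target()s it
    client.cmdloop()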
Copied: sandbox/trunk/pdb/test/files/thread_script.py (from r50718, sandbox/trunk/pdb/test/thread_script.py)
==============================================================================
--- sandbox/trunk/pdb/test/thread_script.py (original)
+++ sandbox/trunk/pdb/test/files/thread_script.py Thu Jul 20 23:30:41 2006
@@ -1,26 +1,25 @@
-#!/usr/bin/env python
-""" This is a script that is being used to debug whilst trying to get
-the thread debugging features working.
-"""
-
-import threading
-
-def foo():
- l = [i for i in range(10)]
- for n in l:
- print l[n+1]
-
-class MyThread(threading.Thread):
- def run(self):
- foo()
-
-def func():
- for i in range(2):
- t = MyThread()
- t.start()
- t.join()
-
-if __name__ == '__main__':
- func()
-
-
+#!/usr/bin/env python
+""" This is a script that is being used to debug whilst trying to get
+the thread debugging features working.
+"""
+
+import threading
+
+def foo():
+ l = [i for i in range(10)]
+ for n in l:
+ print l[n+1]
+
+class MyThread(threading.Thread):
+ def run(self):
+ foo()
+
+def func():
+ for i in range(2):
+ t = MyThread()
+ t.start()
+ t.join()
+
+if __name__ == '__main__':
+ func()
+
Modified: sandbox/trunk/pdb/test/test_mpdb.py
==============================================================================
--- sandbox/trunk/pdb/test/test_mpdb.py (original)
+++ sandbox/trunk/pdb/test/test_mpdb.py Thu Jul 20 23:30:41 2006
@@ -12,11 +12,11 @@
# Global vars
__addr__ = 'localhost:8002'
-script = 'thread_script.py'
+script = 'files/thread_script.py'
MAXTRIES = 100
sys.path.append("..")
-from mpdb import MPdb, pdbserver, target, Exit
+from mpdb import MPdb, pdbserver, target
from mconnection import (MConnectionClientTCP, MConnectionServerTCP,
ConnectionFailed)
Deleted: /sandbox/trunk/pdb/test/thread_script.py
==============================================================================
--- /sandbox/trunk/pdb/test/thread_script.py Thu Jul 20 23:30:41 2006
+++ (empty file)
@@ -1,26 +0,0 @@
-#!/usr/bin/env python
-""" This is a script that is being used to debug whilst trying to get
-the thread debugging features working.
-"""
-
-import threading
-
-def foo():
- l = [i for i in range(10)]
- for n in l:
- print l[n+1]
-
-class MyThread(threading.Thread):
- def run(self):
- foo()
-
-def func():
- for i in range(2):
- t = MyThread()
- t.start()
- t.join()
-
-if __name__ == '__main__':
- func()
-
-
From python-checkins at python.org Thu Jul 20 23:34:06 2006
From: python-checkins at python.org (brett.cannon)
Date: Thu, 20 Jul 2006 23:34:06 +0200 (CEST)
Subject: [Python-checkins] r50737 - in python/branches/bcannon-sandboxing:
Include/pystate.h Python/pystate.c
Message-ID: <20060720213406.EA9F51E4003@bag.python.org>
Author: brett.cannon
Date: Thu Jul 20 23:34:05 2006
New Revision: 50737
Modified:
python/branches/bcannon-sandboxing/Include/pystate.h
python/branches/bcannon-sandboxing/Python/pystate.c
Log:
Keep track of memory usage and cap using PY_LONG_LONG (size_t is way too small for total memory count).
Modified: python/branches/bcannon-sandboxing/Include/pystate.h
==============================================================================
--- python/branches/bcannon-sandboxing/Include/pystate.h (original)
+++ python/branches/bcannon-sandboxing/Include/pystate.h Thu Jul 20 23:34:05 2006
@@ -33,8 +33,8 @@
int tscdump;
#endif
#ifdef Py_MEMORY_CAP
- size_t mem_cap;
- size_t mem_usage;
+ PY_LONG_LONG mem_cap;
+ PY_LONG_LONG mem_usage;
#endif
} PyInterpreterState;
@@ -108,8 +108,8 @@
PyAPI_FUNC(void) PyInterpreterState_Delete(PyInterpreterState *);
#ifdef Py_MEMORY_CAP
-#define PyInterpreterState_SET_MEMORY_CAP(interp, cap) (interp->mem_cap = cap)
PyAPI_FUNC(PyInterpreterState *) PyInterpreterState_SafeGet(void);
+PyAPI_FUNC(int) PyInterpreterState_SetMemoryCap(PyInterpreterState *, PY_LONG_LONG);
PyAPI_FUNC(int) PyInterpreterState_RaiseMemoryUsage(PyInterpreterState *, size_t);
PyAPI_FUNC(void) PyInterpreterState_LowerMemoryUsage(PyInterpreterState *, size_t);
#endif /* Py_MEMORY_CAP */
Modified: python/branches/bcannon-sandboxing/Python/pystate.c
==============================================================================
--- python/branches/bcannon-sandboxing/Python/pystate.c (original)
+++ python/branches/bcannon-sandboxing/Python/pystate.c Thu Jul 20 23:34:05 2006
@@ -154,7 +154,7 @@
{
PyThreadState *tstate = NULL;
- if (!Py_IsInitialized() || !PyEval_ThreadsInitialized())
+ if (!Py_IsInitialized() || !_PyThreadState_Current)
return NULL;
tstate = PyThreadState_GET();
@@ -164,6 +164,25 @@
return tstate->interp;
}
+int
+PyInterpreterState_SetMemoryCap(PyInterpreterState *interp, PY_LONG_LONG cap)
+{
+ if (cap < 0) {
+ PyErr_SetString(PyExc_ValueError, "memory cap must be >= 0");
+ return 0;
+ }
+
+ if (cap < interp->mem_usage) {
+ PyErr_SetString(PyExc_ValueError, "new memory cap too small for "
+ "current memory usage");
+ return 0;
+ }
+
+ interp->mem_cap = cap;
+
+ return 1;
+}
+
/*
Raise the current allocation of memory on the interpreter by 'increase'.
If it the allocation pushes the total memory usage past the memory cap,
@@ -182,16 +201,16 @@
/* Watch out for integer overflow. */
original_mem_usage = interp->mem_usage;
- interp->mem_usage += increase;
+ interp->mem_usage += (PY_LONG_LONG)increase;
if (interp->mem_usage < original_mem_usage) {
interp->mem_usage = original_mem_usage;
- PyErr_SetString(PyExc_MemoryError, "integer overflow in memory usage");
+ PyErr_NoMemory();
return 0;
}
if (interp->mem_usage > interp->mem_cap) {
interp->mem_usage = original_mem_usage;
- PyErr_SetString(PyExc_MemoryError, "exceeded memory usage");
+ PyErr_NoMemory();
return 0;
}
@@ -208,7 +227,7 @@
if (decrease < 0)
Py_FatalError("must specify memory usage reduction by a positive number");
- interp->mem_usage -= decrease;
+ interp->mem_usage -= (PY_LONG_LONG)decrease;
if (interp->mem_usage < 0)
interp->mem_usage = 0;
}
From python-checkins at python.org Thu Jul 20 23:35:06 2006
From: python-checkins at python.org (brett.cannon)
Date: Thu, 20 Jul 2006 23:35:06 +0200 (CEST)
Subject: [Python-checkins] r50738 - in python/branches/bcannon-sandboxing:
Lib/test/test_sys.py Python/sysmodule.c
Message-ID: <20060720213506.371DF1E4003@bag.python.org>
Author: brett.cannon
Date: Thu Jul 20 23:35:05 2006
New Revision: 50738
Modified:
python/branches/bcannon-sandboxing/Lib/test/test_sys.py
python/branches/bcannon-sandboxing/Python/sysmodule.c
Log:
Add support to set/get memory cap and get memory usage in the sys module.
Modified: python/branches/bcannon-sandboxing/Lib/test/test_sys.py
==============================================================================
--- python/branches/bcannon-sandboxing/Lib/test/test_sys.py (original)
+++ python/branches/bcannon-sandboxing/Lib/test/test_sys.py Thu Jul 20 23:35:05 2006
@@ -266,6 +266,25 @@
# the test runs under regrtest.
self.assert_(sys.__stdout__.encoding == sys.__stderr__.encoding)
+ def test_memorycap(self):
+ # Test both getmemorycap() and setmemorycap().
+ if not hasattr(sys, 'getmemorycap'):
+ return
+ original_cap = sys.getmemorycap()
+ self.failUnless(isinstance(original_cap, (int, long)))
+ new_cap = int(original_cap + 10000)
+ assert isinstance(new_cap, int)
+ sys.setmemorycap(new_cap)
+ try: # Make sure don't mess up interpreter.
+ self.failUnlessEqual(new_cap, sys.getmemorycap())
+ sys.setmemorycap(long(new_cap))
+ self.failUnlessEqual(new_cap, sys.getmemorycap())
+ finally:
+ try: # setmemorycap() could be broken.
+ sys.setmemorycap(original_cap)
+ except Exception:
+ pass
+
def test_main():
test.test_support.run_unittest(SysModuleTest)
Modified: python/branches/bcannon-sandboxing/Python/sysmodule.c
==============================================================================
--- python/branches/bcannon-sandboxing/Python/sysmodule.c (original)
+++ python/branches/bcannon-sandboxing/Python/sysmodule.c Thu Jul 20 23:35:05 2006
@@ -700,6 +700,64 @@
10. Number of stack pops performed by call_function()"
);
+#ifdef Py_MEMORY_CAP
+static PyObject *
+sys_setmemorycap(PyObject *self, PyObject *arg)
+{
+ PyInterpreterState *interp = PyInterpreterState_SafeGet();
+ PY_LONG_LONG new_memory_cap;
+ PyObject *arg_as_long = PyNumber_Long(arg);
+
+ if (!arg_as_long)
+ return NULL;
+
+ new_memory_cap = PyLong_AsLongLong(arg_as_long);
+ Py_DECREF(arg_as_long); /* DEAD: arg_as_long */
+
+ if (!interp)
+ Py_FatalError("interpreter not available");
+
+ if (!PyInterpreterState_SetMemoryCap(interp, new_memory_cap))
+ return NULL;
+
+ Py_RETURN_NONE;
+}
+
+PyDoc_STRVAR(setmemorycap_doc,
+"XXX"
+);
+
+static PyObject *
+sys_getmemorycap(PyObject *self, PyObject *ignore)
+{
+ PyInterpreterState *interp = PyInterpreterState_SafeGet();
+
+ if (!interp)
+ Py_FatalError("interpreter not available");
+
+ return PyLong_FromLongLong(interp->mem_cap);
+}
+
+PyDoc_STRVAR(getmemorycap_doc,
+"XXX"
+);
+
+static PyObject *
+sys_getmemoryused(PyObject *self, PyObject *ignore)
+{
+ PyInterpreterState *interp = PyInterpreterState_SafeGet();
+
+ if (!interp)
+ Py_FatalError("interpreter not available");
+
+ return PyLong_FromLongLong(interp->mem_usage);
+}
+
+PyDoc_STRVAR(getmemoryused_doc,
+"XXX"
+);
+#endif /* Py_MEMORY_CAP */
+
#ifdef __cplusplus
extern "C" {
#endif
@@ -782,6 +840,11 @@
#endif
{"settrace", sys_settrace, METH_O, settrace_doc},
{"call_tracing", sys_call_tracing, METH_VARARGS, call_tracing_doc},
+#ifdef Py_MEMORY_CAP
+ {"setmemorycap", sys_setmemorycap, METH_O, setmemorycap_doc},
+ {"getmemorycap", sys_getmemorycap, METH_NOARGS, getmemorycap_doc},
+ {"getmemoryused", sys_getmemoryused, METH_NOARGS, getmemoryused_doc},
+#endif
{NULL, NULL} /* sentinel */
};
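Seen from Python code, the new hooks added above behave roughly as follows (a
sketch; they exist only on this sandboxing branch when the interpreter is
built with Py_MEMORY_CAP, and the numbers are illustrative):

    import sys

    if hasattr(sys, 'getmemorycap'):
        cap = sys.getmemorycap()      # current cap
        used = sys.getmemoryused()    # memory the interpreter has accounted for
        print 'cap=%d used=%d' % (cap, used)
        # Raising the cap is fine; setmemorycap() raises ValueError for a
        # negative cap or one below current usage (PyInterpreterState_SetMemoryCap).
        sys.setmemorycap(cap + 10000)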
From python-checkins at python.org Fri Jul 21 01:20:12 2006
From: python-checkins at python.org (vinay.sajip)
Date: Fri, 21 Jul 2006 01:20:12 +0200 (CEST)
Subject: [Python-checkins] r50740 - python/trunk/Lib/logging/handlers.py
Message-ID: <20060720232012.CECD21E4003@bag.python.org>
Author: vinay.sajip
Date: Fri Jul 21 01:20:12 2006
New Revision: 50740
Modified:
python/trunk/Lib/logging/handlers.py
Log:
Addressed SF#1524081 by using a dictionary to map level names to syslog priority names, rather than a string.lower().
Modified: python/trunk/Lib/logging/handlers.py
==============================================================================
--- python/trunk/Lib/logging/handlers.py (original)
+++ python/trunk/Lib/logging/handlers.py Fri Jul 21 01:20:12 2006
@@ -562,6 +562,18 @@
"local7": LOG_LOCAL7,
}
+ #The map below appears to be trivially lowercasing the key. However,
+ #there's more to it than meets the eye - in some locales, lowercasing
+ #gives unexpected results. See SF #1524081: in the Turkish locale,
+ #"INFO".lower() != "info"
+ priority_map = {
+ "DEBUG" : "debug",
+ "INFO" : "info",
+ "WARNING" : "warning",
+ "ERROR" : "error",
+ "CRITICAL" : "critical"
+ }
+
def __init__(self, address=('localhost', SYSLOG_UDP_PORT), facility=LOG_USER):
"""
Initialize a handler.
@@ -598,7 +610,7 @@
# necessary.
log_format_string = '<%d>%s\000'
- def encodePriority (self, facility, priority):
+ def encodePriority(self, facility, priority):
"""
Encode the facility and priority. You can pass in strings or
integers - if strings are passed, the facility_names and
@@ -619,6 +631,16 @@
self.socket.close()
logging.Handler.close(self)
+ def mapPriority(self, levelName):
+ """
+ Map a logging level name to a key in the priority_names map.
+ This is useful in two scenarios: when custom levels are being
+ used, and in the case where you can't do a straightforward
+ mapping by lowercasing the logging level name because of locale-
+ specific issues (see SF #1524081).
+ """
+ return self.priority_map.get(levelName, "warning")
+
def emit(self, record):
"""
Emit a record.
@@ -633,8 +655,8 @@
"""
msg = self.log_format_string % (
self.encodePriority(self.facility,
- string.lower(record.levelname)),
- msg)
+ self.mapPriority(record.levelname)),
+ msg)
try:
if self.unixsocket:
try:
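As a quick illustration of what the new priority_map/mapPriority pair gives (a
sketch assuming a logging package at r50740 or later; the handler's default
UDP address means no syslog daemon is needed just to call the method):

    import logging.handlers

    h = logging.handlers.SysLogHandler()
    print h.mapPriority('INFO')      # 'info', independent of the current locale
    print h.mapPriority('MYLEVEL')   # 'warning', the fallback for custom levels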
From nnorwitz at gmail.com Fri Jul 21 05:37:11 2006
From: nnorwitz at gmail.com (Neal Norwitz)
Date: Thu, 20 Jul 2006 20:37:11 -0700
Subject: [Python-checkins] r50738 - in
python/branches/bcannon-sandboxing: Lib/test/test_sys.py
Python/sysmodule.c
In-Reply-To: <20060720213506.371DF1E4003@bag.python.org>
References: <20060720213506.371DF1E4003@bag.python.org>
Message-ID:
> Modified: python/branches/bcannon-sandboxing/Python/sysmodule.c
> ==============================================================================
> --- python/branches/bcannon-sandboxing/Python/sysmodule.c (original)
> +++ python/branches/bcannon-sandboxing/Python/sysmodule.c Thu Jul 20 23:35:05 2006
> @@ -700,6 +700,64 @@
> 10. Number of stack pops performed by call_function()"
> );
>
> +#ifdef Py_MEMORY_CAP
> +static PyObject *
> +sys_setmemorycap(PyObject *self, PyObject *arg)
> +{
> + PyInterpreterState *interp = PyInterpreterState_SafeGet();
> + PY_LONG_LONG new_memory_cap;
> + PyObject *arg_as_long = PyNumber_Long(arg);
> +
> + if (!arg_as_long)
> + return NULL;
> +
> + new_memory_cap = PyLong_AsLongLong(arg_as_long);
> + Py_DECREF(arg_as_long); /* DEAD: arg_as_long */
if (new_memory_cap == -1)
return NULL;
> +
> + if (!interp)
> + Py_FatalError("interpreter not available");
> +
> + if (!PyInterpreterState_SetMemoryCap(interp, new_memory_cap))
> + return NULL;
> +
> + Py_RETURN_NONE;
> +}
From python-checkins at python.org Fri Jul 21 07:29:58 2006
From: python-checkins at python.org (neal.norwitz)
Date: Fri, 21 Jul 2006 07:29:58 +0200 (CEST)
Subject: [Python-checkins] r50741 - python/trunk/Objects/dictobject.c
Message-ID: <20060721052958.023AF1E4003@bag.python.org>
Author: neal.norwitz
Date: Fri Jul 21 07:29:58 2006
New Revision: 50741
Modified:
python/trunk/Objects/dictobject.c
Log:
Add some asserts that we got good params passed
Modified: python/trunk/Objects/dictobject.c
==============================================================================
--- python/trunk/Objects/dictobject.c (original)
+++ python/trunk/Objects/dictobject.c Fri Jul 21 07:29:58 2006
@@ -599,6 +599,8 @@
PyErr_BadInternalCall();
return -1;
}
+ assert(key);
+ assert(value);
mp = (dictobject *)op;
if (PyString_CheckExact(key)) {
hash = ((PyStringObject *)key)->ob_shash;
@@ -647,6 +649,7 @@
PyErr_BadInternalCall();
return -1;
}
+ assert(key);
if (!PyString_CheckExact(key) ||
(hash = ((PyStringObject *) key)->ob_shash) == -1) {
hash = PyObject_Hash(key);
From python-checkins at python.org Fri Jul 21 07:31:02 2006
From: python-checkins at python.org (neal.norwitz)
Date: Fri, 21 Jul 2006 07:31:02 +0200 (CEST)
Subject: [Python-checkins] r50742 - python/trunk/Objects/frameobject.c
Message-ID: <20060721053102.EE5131E4017@bag.python.org>
Author: neal.norwitz
Date: Fri Jul 21 07:31:02 2006
New Revision: 50742
Modified:
python/trunk/Objects/frameobject.c
Log:
Move the initialization of some pointers earlier. The problem is
that if we call Py_DECREF(frame) like we do if allocating locals fails,
frame_dealloc() will try to use these bogus values and crash.
Modified: python/trunk/Objects/frameobject.c
==============================================================================
--- python/trunk/Objects/frameobject.c (original)
+++ python/trunk/Objects/frameobject.c Fri Jul 21 07:31:02 2006
@@ -642,6 +642,7 @@
f->f_trace = NULL;
f->f_exc_type = f->f_exc_value = f->f_exc_traceback = NULL;
}
+ f->f_stacktop = f->f_valuestack;
f->f_builtins = builtins;
Py_XINCREF(back);
f->f_back = back;
@@ -672,7 +673,6 @@
f->f_lineno = code->co_firstlineno;
f->f_iblock = 0;
- f->f_stacktop = f->f_valuestack;
_PyObject_GC_TRACK(f);
return f;
}
From python-checkins at python.org Fri Jul 21 07:32:29 2006
From: python-checkins at python.org (neal.norwitz)
Date: Fri, 21 Jul 2006 07:32:29 +0200 (CEST)
Subject: [Python-checkins] r50743 - in python/trunk: Misc/NEWS
Objects/typeobject.c Objects/unicodeobject.c Python/import.c
Python/pystate.c Python/sysmodule.c
Message-ID: <20060721053229.E87231E4009@bag.python.org>
Author: neal.norwitz
Date: Fri Jul 21 07:32:28 2006
New Revision: 50743
Modified:
python/trunk/Misc/NEWS
python/trunk/Objects/typeobject.c
python/trunk/Objects/unicodeobject.c
python/trunk/Python/import.c
python/trunk/Python/pystate.c
python/trunk/Python/sysmodule.c
Log:
Handle allocation failures gracefully. Found with failmalloc.
Many (all?) of these could be backported.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Fri Jul 21 07:32:28 2006
@@ -12,6 +12,8 @@
Core and builtins
-----------------
+- Fix some potential crashes found with failmalloc.
+
- Fix warnings reported by Klocwork's static analysis tool.
- Bug #1512814, Fix incorrect lineno's when code within a function
Modified: python/trunk/Objects/typeobject.c
==============================================================================
--- python/trunk/Objects/typeobject.c (original)
+++ python/trunk/Objects/typeobject.c Fri Jul 21 07:32:28 2006
@@ -3260,6 +3260,8 @@
if (PyDict_GetItemString(type->tp_dict, "__doc__") == NULL) {
if (type->tp_doc != NULL) {
PyObject *doc = PyString_FromString(type->tp_doc);
+ if (doc == NULL)
+ goto error;
PyDict_SetItemString(type->tp_dict, "__doc__", doc);
Py_DECREF(doc);
} else {
Modified: python/trunk/Objects/unicodeobject.c
==============================================================================
--- python/trunk/Objects/unicodeobject.c (original)
+++ python/trunk/Objects/unicodeobject.c Fri Jul 21 07:32:28 2006
@@ -7918,6 +7918,9 @@
unicode_freelist = NULL;
unicode_freelist_size = 0;
unicode_empty = _PyUnicode_New(0);
+ if (!unicode_empty)
+ return;
+
strcpy(unicode_default_encoding, "ascii");
for (i = 0; i < 256; i++)
unicode_latin1[i] = NULL;
Modified: python/trunk/Python/import.c
==============================================================================
--- python/trunk/Python/import.c (original)
+++ python/trunk/Python/import.c Fri Jul 21 07:32:28 2006
@@ -116,6 +116,8 @@
for (scan = _PyImport_StandardFiletab; scan->suffix != NULL; ++scan)
++countS;
filetab = PyMem_NEW(struct filedescr, countD + countS + 1);
+ if (filetab == NULL)
+ Py_FatalError("Can't intiialize import file table.");
memcpy(filetab, _PyImport_DynLoadFiletab,
countD * sizeof(struct filedescr));
memcpy(filetab + countD, _PyImport_StandardFiletab,
@@ -239,8 +241,11 @@
long me = PyThread_get_thread_ident();
if (me == -1)
return; /* Too bad */
- if (import_lock == NULL)
+ if (import_lock == NULL) {
import_lock = PyThread_allocate_lock();
+ if (import_lock == NULL)
+ return; /* Nothing much we can do. */
+ }
if (import_lock_thread == me) {
import_lock_level++;
return;
@@ -259,7 +264,7 @@
unlock_import(void)
{
long me = PyThread_get_thread_ident();
- if (me == -1)
+ if (me == -1 || import_lock == NULL)
return 0; /* Too bad */
if (import_lock_thread != me)
return -1;
Modified: python/trunk/Python/pystate.c
==============================================================================
--- python/trunk/Python/pystate.c (original)
+++ python/trunk/Python/pystate.c Fri Jul 21 07:32:28 2006
@@ -63,6 +63,10 @@
if (interp != NULL) {
HEAD_INIT();
+#ifdef WITH_THREAD
+ if (head_mutex == NULL)
+ Py_FatalError("Can't initialize threads for interpreter");
+#endif
interp->modules = NULL;
interp->sysdict = NULL;
interp->builtins = NULL;
Modified: python/trunk/Python/sysmodule.c
==============================================================================
--- python/trunk/Python/sysmodule.c (original)
+++ python/trunk/Python/sysmodule.c Fri Jul 21 07:32:28 2006
@@ -1137,41 +1137,38 @@
#elif PY_RELEASE_LEVEL == PY_RELEASE_LEVEL_FINAL
s = "final";
#endif
- PyDict_SetItemString(sysdict, "version_info",
- v = Py_BuildValue("iiisi", PY_MAJOR_VERSION,
+
+#define SET_SYS_FROM_STRING(key, value) \
+ v = value; \
+ if (v != NULL) \
+ PyDict_SetItemString(sysdict, key, v); \
+ Py_XDECREF(v)
+
+ SET_SYS_FROM_STRING("version_info",
+ Py_BuildValue("iiisi", PY_MAJOR_VERSION,
PY_MINOR_VERSION,
PY_MICRO_VERSION, s,
PY_RELEASE_SERIAL));
- Py_XDECREF(v);
- PyDict_SetItemString(sysdict, "api_version",
- v = PyInt_FromLong(PYTHON_API_VERSION));
- Py_XDECREF(v);
- PyDict_SetItemString(sysdict, "copyright",
- v = PyString_FromString(Py_GetCopyright()));
- Py_XDECREF(v);
- PyDict_SetItemString(sysdict, "platform",
- v = PyString_FromString(Py_GetPlatform()));
- Py_XDECREF(v);
- PyDict_SetItemString(sysdict, "executable",
- v = PyString_FromString(Py_GetProgramFullPath()));
- Py_XDECREF(v);
- PyDict_SetItemString(sysdict, "prefix",
- v = PyString_FromString(Py_GetPrefix()));
- Py_XDECREF(v);
- PyDict_SetItemString(sysdict, "exec_prefix",
- v = PyString_FromString(Py_GetExecPrefix()));
- Py_XDECREF(v);
- PyDict_SetItemString(sysdict, "maxint",
- v = PyInt_FromLong(PyInt_GetMax()));
- Py_XDECREF(v);
+ SET_SYS_FROM_STRING("api_version",
+ PyInt_FromLong(PYTHON_API_VERSION));
+ SET_SYS_FROM_STRING("copyright",
+ PyString_FromString(Py_GetCopyright()));
+ SET_SYS_FROM_STRING("platform",
+ PyString_FromString(Py_GetPlatform()));
+ SET_SYS_FROM_STRING("executable",
+ PyString_FromString(Py_GetProgramFullPath()));
+ SET_SYS_FROM_STRING("prefix",
+ PyString_FromString(Py_GetPrefix()));
+ SET_SYS_FROM_STRING("exec_prefix",
+ PyString_FromString(Py_GetExecPrefix()));
+ SET_SYS_FROM_STRING("maxint",
+ PyInt_FromLong(PyInt_GetMax()));
#ifdef Py_USING_UNICODE
- PyDict_SetItemString(sysdict, "maxunicode",
- v = PyInt_FromLong(PyUnicode_GetMax()));
- Py_XDECREF(v);
+ SET_SYS_FROM_STRING("maxunicode",
+ PyInt_FromLong(PyUnicode_GetMax()));
#endif
- PyDict_SetItemString(sysdict, "builtin_module_names",
- v = list_builtin_module_names());
- Py_XDECREF(v);
+ SET_SYS_FROM_STRING("builtin_module_names",
+ list_builtin_module_names());
{
/* Assumes that longs are at least 2 bytes long.
Should be safe! */
@@ -1183,18 +1180,16 @@
value = "big";
else
value = "little";
- PyDict_SetItemString(sysdict, "byteorder",
- v = PyString_FromString(value));
- Py_XDECREF(v);
+ SET_SYS_FROM_STRING("byteorder",
+ PyString_FromString(value));
}
#ifdef MS_COREDLL
- PyDict_SetItemString(sysdict, "dllhandle",
- v = PyLong_FromVoidPtr(PyWin_DLLhModule));
- Py_XDECREF(v);
- PyDict_SetItemString(sysdict, "winver",
- v = PyString_FromString(PyWin_DLLVersionString));
- Py_XDECREF(v);
+ SET_SYS_FROM_STRING("dllhandle",
+ PyLong_FromVoidPtr(PyWin_DLLhModule));
+ SET_SYS_FROM_STRING("winver",
+ PyString_FromString(PyWin_DLLVersionString));
#endif
+#undef SET_SYS_FROM_STRING
if (warnoptions == NULL) {
warnoptions = PyList_New(0);
}
From python-checkins at python.org Fri Jul 21 09:59:02 2006
From: python-checkins at python.org (neal.norwitz)
Date: Fri, 21 Jul 2006 09:59:02 +0200 (CEST)
Subject: [Python-checkins] r50745 - python/trunk/Python/import.c
Message-ID: <20060721075902.C51921E4003@bag.python.org>
Author: neal.norwitz
Date: Fri Jul 21 09:59:02 2006
New Revision: 50745
Modified:
python/trunk/Python/import.c
Log:
Speel initialise write. Tanks Anthony.
Modified: python/trunk/Python/import.c
==============================================================================
--- python/trunk/Python/import.c (original)
+++ python/trunk/Python/import.c Fri Jul 21 09:59:02 2006
@@ -117,7 +117,7 @@
++countS;
filetab = PyMem_NEW(struct filedescr, countD + countS + 1);
if (filetab == NULL)
- Py_FatalError("Can't intiialize import file table.");
+ Py_FatalError("Can't initialize import file table.");
memcpy(filetab, _PyImport_DynLoadFiletab,
countD * sizeof(struct filedescr));
memcpy(filetab + countD, _PyImport_StandardFiletab,
From python-checkins at python.org Fri Jul 21 09:59:48 2006
From: python-checkins at python.org (neal.norwitz)
Date: Fri, 21 Jul 2006 09:59:48 +0200 (CEST)
Subject: [Python-checkins] r50746 - python/trunk/Python/ast.c
python/trunk/Python/compile.c python/trunk/Python/pythonrun.c
python/trunk/Python/symtable.c python/trunk/Python/thread.c
Message-ID: <20060721075948.B71041E4003@bag.python.org>
Author: neal.norwitz
Date: Fri Jul 21 09:59:47 2006
New Revision: 50746
Modified:
python/trunk/Python/ast.c
python/trunk/Python/compile.c
python/trunk/Python/pythonrun.c
python/trunk/Python/symtable.c
python/trunk/Python/thread.c
Log:
Handle more memory allocation failures without crashing.
Modified: python/trunk/Python/ast.c
==============================================================================
--- python/trunk/Python/ast.c (original)
+++ python/trunk/Python/ast.c Fri Jul 21 09:59:47 2006
@@ -638,8 +638,10 @@
anything other than EQUAL or a comma? */
/* XXX Should NCH(n) check be made a separate check? */
if (i + 1 < NCH(n) && TYPE(CHILD(n, i + 1)) == EQUAL) {
- asdl_seq_SET(defaults, j++,
- ast_for_expr(c, CHILD(n, i + 2)));
+ expr_ty expression = ast_for_expr(c, CHILD(n, i + 2));
+ if (!expression)
+ goto error;
+ asdl_seq_SET(defaults, j++, expression);
i += 2;
found_default = 1;
}
Modified: python/trunk/Python/compile.c
==============================================================================
--- python/trunk/Python/compile.c (original)
+++ python/trunk/Python/compile.c Fri Jul 21 09:59:47 2006
@@ -1105,8 +1105,17 @@
u->u_name = name;
u->u_varnames = list2dict(u->u_ste->ste_varnames);
u->u_cellvars = dictbytype(u->u_ste->ste_symbols, CELL, 0, 0);
+ if (!u->u_varnames || !u->u_cellvars) {
+ compiler_unit_free(u);
+ return 0;
+ }
+
u->u_freevars = dictbytype(u->u_ste->ste_symbols, FREE, DEF_FREE_CLASS,
PyDict_Size(u->u_cellvars));
+ if (!u->u_freevars) {
+ compiler_unit_free(u);
+ return 0;
+ }
u->u_blocks = NULL;
u->u_tmpname = 0;
Modified: python/trunk/Python/pythonrun.c
==============================================================================
--- python/trunk/Python/pythonrun.c (original)
+++ python/trunk/Python/pythonrun.c Fri Jul 21 09:59:47 2006
@@ -1204,8 +1204,12 @@
{
PyObject *ret = NULL;
PyArena *arena = PyArena_New();
- mod_ty mod = PyParser_ASTFromString(str, "", start, flags,
- arena);
+ mod_ty mod;
+
+ if (arena == NULL)
+ return NULL;
+
+ mod = PyParser_ASTFromString(str, "", start, flags, arena);
if (mod != NULL)
ret = run_mod(mod, "", globals, locals, flags, arena);
PyArena_Free(arena);
@@ -1218,8 +1222,13 @@
{
PyObject *ret;
PyArena *arena = PyArena_New();
- mod_ty mod = PyParser_ASTFromFile(fp, filename, start, 0, 0,
- flags, NULL, arena);
+ mod_ty mod;
+
+ if (arena == NULL)
+ return NULL;
+
+ mod = PyParser_ASTFromFile(fp, filename, start, 0, 0,
+ flags, NULL, arena);
if (mod == NULL) {
PyArena_Free(arena);
return NULL;
Modified: python/trunk/Python/symtable.c
==============================================================================
--- python/trunk/Python/symtable.c (original)
+++ python/trunk/Python/symtable.c Fri Jul 21 09:59:47 2006
@@ -221,8 +221,12 @@
return st;
st->st_filename = filename;
st->st_future = future;
- symtable_enter_block(st, GET_IDENTIFIER(top), ModuleBlock,
- (void *)mod, 0);
+ if (!symtable_enter_block(st, GET_IDENTIFIER(top), ModuleBlock,
+ (void *)mod, 0)) {
+ PySymtable_Free(st);
+ return NULL;
+ }
+
st->st_top = st->st_cur;
st->st_cur->ste_unoptimized = OPT_TOPLEVEL;
/* Any other top-level initialization? */
@@ -728,6 +732,8 @@
if (end >= 0) {
st->st_cur = (PySTEntryObject *)PyList_GET_ITEM(st->st_stack,
end);
+ if (st->st_cur == NULL)
+ return 0;
Py_INCREF(st->st_cur);
if (PySequence_DelItem(st->st_stack, end) < 0)
return 0;
@@ -749,6 +755,8 @@
Py_DECREF(st->st_cur);
}
st->st_cur = PySTEntry_New(st, name, block, ast, lineno);
+ if (st->st_cur == NULL)
+ return 0;
if (name == GET_IDENTIFIER(top))
st->st_global = st->st_cur->ste_symbols;
if (prev) {
Modified: python/trunk/Python/thread.c
==============================================================================
--- python/trunk/Python/thread.c (original)
+++ python/trunk/Python/thread.c Fri Jul 21 09:59:47 2006
@@ -267,6 +267,8 @@
struct key *p;
long id = PyThread_get_thread_ident();
+ if (!keymutex)
+ return NULL;
PyThread_acquire_lock(keymutex, 1);
for (p = keyhead; p != NULL; p = p->next) {
if (p->id == id && p->key == key)
From buildbot at python.org Fri Jul 21 10:57:13 2006
From: buildbot at python.org (buildbot at python.org)
Date: Fri, 21 Jul 2006 08:57:13 +0000
Subject: [Python-checkins] buildbot warnings in S-390 Debian trunk
Message-ID: <20060721085713.7BE3E1E4003@bag.python.org>
The Buildbot has detected a new failure of S-390 Debian trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/S-390%2520Debian%2520trunk/builds/271
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: neal.norwitz
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Fri Jul 21 14:40:34 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Fri, 21 Jul 2006 14:40:34 +0200 (CEST)
Subject: [Python-checkins] r50748 -
sandbox/trunk/seealso/convert-python-faqs.py
Message-ID: <20060721124034.6017F1E400A@bag.python.org>
Author: andrew.kuchling
Date: Fri Jul 21 14:40:33 2006
New Revision: 50748
Modified:
sandbox/trunk/seealso/convert-python-faqs.py
Log:
Fix bugs in YAML output
Modified: sandbox/trunk/seealso/convert-python-faqs.py
==============================================================================
--- sandbox/trunk/seealso/convert-python-faqs.py (original)
+++ sandbox/trunk/seealso/convert-python-faqs.py Fri Jul 21 14:40:33 2006
@@ -13,6 +13,13 @@
DESTDIR = 'pyramid-faq'
+def normalize_whitespace (S):
+ """Replace all whitespace by single spaces.
+ >>> normalize_whitespace('a b\tc\nd')
+ 'a b c d'
+ """
+ return ' '.join(S.split())
+
def get_setting (root, setting_name, default=None):
"""Look for a paragraph containing a setting, like 'CATEGORY: general',
and returns the result of the setting.
@@ -31,6 +38,7 @@
# root is the 'html' element
root = htmlload.load(filename)
title = root.findtext('head/title')
+ title = normalize_whitespace(title)
# Find category and name
category = get_setting(root, 'category')
@@ -62,7 +70,7 @@
template: index.html
# The data to pass to the template
global:
- metadata:
+ metadata: {}
local:
title: "%s"
content: !fragment content.yml
@@ -97,6 +105,7 @@
def convert_index (filename):
root = htmlload.load(filename)
title = root.findtext('*/title')
+ title = normalize_whitespace(title)
i = title.find(':')
category = title[i+1:].strip()
From python-checkins at python.org Fri Jul 21 14:42:05 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Fri, 21 Jul 2006 14:42:05 +0200 (CEST)
Subject: [Python-checkins] r50749 -
sandbox/trunk/seealso/convert-python-faqs.py
Message-ID: <20060721124205.6984D1E4004@bag.python.org>
Author: andrew.kuchling
Date: Fri Jul 21 14:42:05 2006
New Revision: 50749
Modified:
sandbox/trunk/seealso/convert-python-faqs.py (props changed)
Log:
Make executable
From python-checkins at python.org Fri Jul 21 15:04:52 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Fri, 21 Jul 2006 15:04:52 +0200 (CEST)
Subject: [Python-checkins] r50750 -
sandbox/trunk/seealso/convert-python-faqs.py
Message-ID: <20060721130452.EE1F81E4004@bag.python.org>
Author: andrew.kuchling
Date: Fri Jul 21 15:04:52 2006
New Revision: 50750
Modified:
sandbox/trunk/seealso/convert-python-faqs.py
Log:
Do YAML escapes somewhat properly; skip the 'test' category
Modified: sandbox/trunk/seealso/convert-python-faqs.py
==============================================================================
--- sandbox/trunk/seealso/convert-python-faqs.py (original)
+++ sandbox/trunk/seealso/convert-python-faqs.py Fri Jul 21 15:04:52 2006
@@ -13,6 +13,14 @@
DESTDIR = 'pyramid-faq'
+def yaml_escape (S):
+ "Escape S as a YAML string"
+ S = normalize_whitespace(S)
+ S = S.replace("\\", "\\\\")
+ S = S.replace('"', '\\"')
+ S = '"' + S + '"'
+ return S
+
def normalize_whitespace (S):
"""Replace all whitespace by single spaces.
>>> normalize_whitespace('a b\tc\nd')
@@ -45,6 +53,8 @@
if category is None:
##print 'Filename without category:', filename
return
+ if category in ('test',):
+ return
name = get_setting(root, 'name', base)
@@ -72,9 +82,9 @@
global:
metadata: {}
local:
- title: "%s"
+ title: %s
content: !fragment content.yml
- """ % title)
+ """ % yaml_escape(title))
f.close()
f = open(os.path.join(qdir, 'content.yml'), 'w')
@@ -108,15 +118,10 @@
title = normalize_whitespace(title)
i = title.find(':')
category = title[i+1:].strip()
+ idir = os.path.join(DESTDIR, category)
# XXX can there be subsections within a category?
- entries = []
- for para in root.findall('*/p'):
- a = para.find('a')
- if a is not None:
- entries.append(a)
-
- idir = os.path.join(DESTDIR, category)
+ entries = list(root.findall('*/p/a'))
# Write body of question
f = open(os.path.join(idir, 'listing.ht'), 'w')
@@ -132,9 +137,9 @@
template: index.html
# The data to pass to the template
local:
- title: "%s"
+ title: %s
content: !fragment content.yml
- """ % title)
+ """ % yaml_escape(title))
f.close()
f = open(os.path.join(idir, 'content.yml'), 'w')
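To make the effect of the new escaping concrete, here is a tiny standalone
sketch; the two helpers are copied from the hunks above:

    def normalize_whitespace(S):
        return ' '.join(S.split())

    def yaml_escape(S):
        S = normalize_whitespace(S)
        S = S.replace("\\", "\\\\")
        S = S.replace('"', '\\"')
        S = '"' + S + '"'
        return S

    print yaml_escape('A  "quoted"\ttitle')   # prints: "A \"quoted\" title"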
From python-checkins at python.org Fri Jul 21 15:14:35 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Fri, 21 Jul 2006 15:14:35 +0200 (CEST)
Subject: [Python-checkins] r50751 -
sandbox/trunk/seealso/convert-python-faqs.py
Message-ID: <20060721131435.0774B1E4003@bag.python.org>
Author: andrew.kuchling
Date: Fri Jul 21 15:14:34 2006
New Revision: 50751
Modified:
sandbox/trunk/seealso/convert-python-faqs.py
Log:
Don't need a nav.html, apparently
Modified: sandbox/trunk/seealso/convert-python-faqs.py
==============================================================================
--- sandbox/trunk/seealso/convert-python-faqs.py (original)
+++ sandbox/trunk/seealso/convert-python-faqs.py Fri Jul 21 15:14:34 2006
@@ -95,21 +95,8 @@
# The data to pass to the template
local:
content:
- breadcrumb: !breadcrumb nav.yml nav
text: !htfile question.ht""")
f.close()
-
- f = open(os.path.join(qdir, 'nav.yml'), 'w')
- f.write("""--- !fragment
-# Type of template to use
-template: nav.html
-
-# Contents of the template
-global:
- nav : !sectionnav |
- xxx index.html
-""")
- f.close()
def convert_index (filename):
From python-checkins at python.org Fri Jul 21 15:49:57 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Fri, 21 Jul 2006 15:49:57 +0200 (CEST)
Subject: [Python-checkins] r50752 -
sandbox/trunk/seealso/convert-python-faqs.py
Message-ID: <20060721134957.087351E4004@bag.python.org>
Author: andrew.kuchling
Date: Fri Jul 21 15:49:56 2006
New Revision: 50752
Modified:
sandbox/trunk/seealso/convert-python-faqs.py
Log:
Write master index of categories
Modified: sandbox/trunk/seealso/convert-python-faqs.py
==============================================================================
--- sandbox/trunk/seealso/convert-python-faqs.py (original)
+++ sandbox/trunk/seealso/convert-python-faqs.py Fri Jul 21 15:49:56 2006
@@ -13,6 +13,9 @@
DESTDIR = 'pyramid-faq'
+# Master collection of categories
+categories = {}
+
def yaml_escape (S):
"Escape S as a YAML string"
S = normalize_whitespace(S)
@@ -55,7 +58,9 @@
return
if category in ('test',):
return
-
+ if category not in categories:
+ categories[category] = None
+
name = get_setting(root, 'name', base)
qdir = os.path.join(DESTDIR, category, name)
@@ -63,7 +68,7 @@
# Write body of question
f = open(os.path.join(qdir, 'question.ht'), 'w')
- f.write('Title: XXX\n')
+ f.write('Title: Unused title\n')
f.write('\n')
body = root.find('body')
for child in body.getchildren():
@@ -166,9 +171,12 @@
f.close()
f = open(os.path.join(DESTDIR, 'listing.ht'), 'w')
- f.write('Title: XXX\n')
+ f.write('Title: Unused title\n')
f.write('\n')
- f.write('
XXX put content here later
\n')
+ f.write('
\n')
+ for cat in sorted(categories.keys()):
+ f.write('
\n')
f.close()
- f = open(os.path.join(DESTDIR, 'nav.yml'), 'w')
- f.write("""--- !fragment
-# Type of template to use
-template: nav.html
-
-# Contents of the template
-global:
- nav : !sectionnav |
- xxx index.html
-""")
- f.close()
+ # Write navigation info
+ urllist = []
+ for cat in sorted(categories.keys()):
+ urllist.append((cat, cat.capitalize()))
+ write_nav(DESTDIR, urllist)
f = open(os.path.join(DESTDIR, 'content.html'), 'w')
f.write("""
+
""")
f.close()
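The shape of this change is a collect-then-emit pass: every converted question registers its category in the module-level categories dict, and the index step later walks sorted(categories.keys()) to write both the master listing and the navigation entries handed to write_nav(). A self-contained sketch of that second step, assuming write_nav() is the helper defined elsewhere in the script (not shown in this diff, so it is left out here):

    # Collect-then-emit sketch mirroring r50752; no files are written.
    categories = {}

    def note_category(cat):
        # called once per converted question
        if cat not in categories:
            categories[cat] = None

    def master_index_lines():
        # one listing entry per category, in sorted order
        return ['Title: Unused title', ''] + sorted(categories.keys())

    def nav_urllist():
        # (url, display title) pairs later passed to the script's write_nav()
        return [(cat, cat.capitalize()) for cat in sorted(categories.keys())]

    if __name__ == '__main__':
        for c in ('general', 'library', 'windows'):   # hypothetical sample data
            note_category(c)
        print '\n'.join(master_index_lines())
        print nav_urllist()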
From python-checkins at python.org Mon Jul 24 15:28:57 2006
From: python-checkins at python.org (georg.brandl)
Date: Mon, 24 Jul 2006 15:28:57 +0200 (CEST)
Subject: [Python-checkins] r50800 - in python/trunk: Makefile.pre.in
Misc/python-config.in
Message-ID: <20060724132857.B93BE1E4003@bag.python.org>
Author: georg.brandl
Date: Mon Jul 24 15:28:57 2006
New Revision: 50800
Modified:
python/trunk/Makefile.pre.in
python/trunk/Misc/python-config.in
Log:
Patch #1523356: fix determining include dirs in python-config.
Also don't install "python-config" when doing altinstall, but
always install "python-config2.x" and make a link to it like
with the main executable.
Modified: python/trunk/Makefile.pre.in
==============================================================================
--- python/trunk/Makefile.pre.in (original)
+++ python/trunk/Makefile.pre.in Mon Jul 24 15:28:57 2006
@@ -649,6 +649,7 @@
else true; \
fi
(cd $(DESTDIR)$(BINDIR); $(LN) python$(VERSION)$(EXE) $(PYTHON))
+ (cd $(DESTDIR)$(BINDIR); $(LN) -sf python-config$(VERSION)$(EXE) python-config$(EXE))
# Install the interpreter with $(VERSION) affixed
# This goes into $(exec_prefix)
@@ -849,8 +850,8 @@
$(INSTALL_SCRIPT) $(srcdir)/install-sh $(DESTDIR)$(LIBPL)/install-sh
# Substitution happens here, as the completely-expanded BINDIR
# is not available in configure
- sed -e "s, at BINDIR@,$(BINDIR)," < $(srcdir)/Misc/python-config.in >python-config
- $(INSTALL_SCRIPT) python-config $(DESTDIR)$(BINDIR)/python-config
+ sed -e "s, at EXENAME@,$(BINDIR)/python$(VERSION)$(EXE)," < $(srcdir)/Misc/python-config.in >python-config
+ $(INSTALL_SCRIPT) python-config $(DESTDIR)$(BINDIR)/python-config$(VERSION)$(EXE)
rm python-config
@if [ -s Modules/python.exp -a \
"`echo $(MACHDEP) | sed 's/^\(...\).*/\1/'`" = "aix" ]; then \
Modified: python/trunk/Misc/python-config.in
==============================================================================
--- python/trunk/Misc/python-config.in (original)
+++ python/trunk/Misc/python-config.in Mon Jul 24 15:28:57 2006
@@ -1,4 +1,4 @@
-#!@BINDIR@/python
+#!@EXENAME@
import sys
import os
@@ -36,13 +36,14 @@
print sysconfig.EXEC_PREFIX
elif opt in ('--includes', '--cflags'):
- flags = ['-I'+dir for dir in getvar('INCLDIRSTOMAKE').split()]
+ flags = ['-I' + sysconfig.get_python_inc(),
+ '-I' + sysconfig.get_python_inc(plat_specific=True)]
if opt == '--cflags':
flags.extend(getvar('CFLAGS').split())
print ' '.join(flags)
elif opt in ('--libs', '--ldflags'):
- libs = sysconfig.get_config_var('LIBS').split()
+ libs = getvar('LIBS').split() + getvar('SYSLIBS').split()
libs.append('-lpython'+pyver)
if opt == '--ldflags':
libs.insert(0, '-L' + getvar('LIBPL'))
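The net effect of this patch is easiest to see in isolation: include directories now come from distutils.sysconfig.get_python_inc() (plain and platform-specific) instead of the Makefile's INCLDIRSTOMAKE, and --libs folds SYSLIBS in alongside LIBS. A standalone sketch of that logic -- not the installed python-config script itself, and assuming a Unix build where these config variables are populated:

    # Sketch of the post-r50800 --includes / --libs behaviour of python-config.
    from distutils import sysconfig

    getvar = sysconfig.get_config_var
    pyver = getvar('VERSION')

    def includes():
        # plain and platform-specific include directories
        return ['-I' + sysconfig.get_python_inc(),
                '-I' + sysconfig.get_python_inc(plat_specific=True)]

    def libs():
        # LIBS + SYSLIBS + the python library itself, as in the new script
        out = getvar('LIBS').split() + getvar('SYSLIBS').split()
        out.append('-lpython' + pyver)
        return out

    print ' '.join(includes())
    print ' '.join(libs())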
From buildbot at python.org Mon Jul 24 15:34:42 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 24 Jul 2006 13:34:42 +0000
Subject: [Python-checkins] buildbot warnings in MIPS Debian trunk
Message-ID: <20060724133442.7A97A1E4003@bag.python.org>
The Buildbot has detected a new failure of MIPS Debian trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/MIPS%2520Debian%2520trunk/builds/313
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: martin.v.loewis
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Mon Jul 24 15:38:09 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Mon, 24 Jul 2006 15:38:09 +0200 (CEST)
Subject: [Python-checkins] r50801 -
sandbox/trunk/seealso/convert-python-faqs.py
Message-ID: <20060724133809.773671E4005@bag.python.org>
Author: andrew.kuchling
Date: Mon Jul 24 15:38:09 2006
New Revision: 50801
Modified:
sandbox/trunk/seealso/convert-python-faqs.py
Log:
Improve page title for questions (not tested yet)
Modified: sandbox/trunk/seealso/convert-python-faqs.py
==============================================================================
--- sandbox/trunk/seealso/convert-python-faqs.py (original)
+++ sandbox/trunk/seealso/convert-python-faqs.py Mon Jul 24 15:38:09 2006
@@ -95,6 +95,7 @@
f.close()
# Write .yml files
+ page_title = '%s FAQ: %s' % (category.capitalize(), title)
f = open(os.path.join(qdir, 'index.yml'), 'w')
f.write("""--- !fragment
template: index.html
@@ -104,7 +105,7 @@
local:
title: %s
content: !fragment content.yml
- """ % yaml_escape(title))
+ """ % yaml_escape(page_title))
f.close()
f = open(os.path.join(qdir, 'content.yml'), 'w')
From python-checkins at python.org Mon Jul 24 15:46:48 2006
From: python-checkins at python.org (georg.brandl)
Date: Mon, 24 Jul 2006 15:46:48 +0200 (CEST)
Subject: [Python-checkins] r50802 - python/trunk/PC/winsound.c
Message-ID: <20060724134648.314411E4003@bag.python.org>
Author: georg.brandl
Date: Mon Jul 24 15:46:47 2006
New Revision: 50802
Modified:
python/trunk/PC/winsound.c
Log:
Patch #1527744: right order of includes in order to have HAVE_CONIO_H defined properly.
Modified: python/trunk/PC/winsound.c
==============================================================================
--- python/trunk/PC/winsound.c (original)
+++ python/trunk/PC/winsound.c Mon Jul 24 15:46:47 2006
@@ -37,10 +37,10 @@
#include
#include
+#include
#ifdef HAVE_CONIO_H
#include /* port functions on Win9x */
#endif
-#include
PyDoc_STRVAR(sound_playsound_doc,
"PlaySound(sound, flags) - a wrapper around the Windows PlaySound API\n"
@@ -147,7 +147,7 @@
return NULL;
}
}
-#ifdef _M_IX86
+#if defined(_M_IX86) && defined(HAVE_CONIO_H)
else if (whichOS == Win9X) {
int speaker_state;
/* Force timer into oscillator mode via timer control port. */
@@ -172,7 +172,7 @@
/* Restore speaker control to original state. */
_outp(0x61, speaker_state);
}
-#endif /* _M_IX86 */
+#endif /* _M_IX86 && HAVE_CONIO_H */
else {
assert(!"winsound's whichOS has insane value");
}
From buildbot at python.org Mon Jul 24 16:03:00 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 24 Jul 2006 14:03:00 +0000
Subject: [Python-checkins] buildbot warnings in x86 Ubuntu dapper (icc) trunk
Message-ID: <20060724140300.A2BD61E4003@bag.python.org>
The Buildbot has detected a new failure of x86 Ubuntu dapper (icc) trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520Ubuntu%2520dapper%2520%2528icc%2529%2520trunk/builds/789
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: martin.v.loewis
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Mon Jul 24 16:09:57 2006
From: python-checkins at python.org (georg.brandl)
Date: Mon, 24 Jul 2006 16:09:57 +0200 (CEST)
Subject: [Python-checkins] r50803 - in python/trunk:
Lib/test/test_traceback.py Lib/traceback.py Misc/NEWS
Message-ID: <20060724140957.2352E1E4003@bag.python.org>
Author: georg.brandl
Date: Mon Jul 24 16:09:56 2006
New Revision: 50803
Modified:
python/trunk/Lib/test/test_traceback.py
python/trunk/Lib/traceback.py
python/trunk/Misc/NEWS
Log:
Patch #1515343: Fix printing of deprecated string exceptions with a
value in the traceback module.
Modified: python/trunk/Lib/test/test_traceback.py
==============================================================================
--- python/trunk/Lib/test/test_traceback.py (original)
+++ python/trunk/Lib/test/test_traceback.py Mon Jul 24 16:09:56 2006
@@ -31,8 +31,9 @@
err = self.get_exception_format(self.syntax_error_with_caret,
SyntaxError)
self.assert_(len(err) == 4)
- self.assert_("^" in err[2]) # third line has caret
self.assert_(err[1].strip() == "return x!")
+ self.assert_("^" in err[2]) # third line has caret
+ self.assert_(err[1].find("!") == err[2].find("^")) # in the right place
def test_nocaret(self):
if is_jython:
@@ -47,8 +48,9 @@
err = self.get_exception_format(self.syntax_error_bad_indentation,
IndentationError)
self.assert_(len(err) == 4)
- self.assert_("^" in err[2])
self.assert_(err[1].strip() == "print 2")
+ self.assert_("^" in err[2])
+ self.assert_(err[1].find("2") == err[2].find("^"))
def test_bug737473(self):
import sys, os, tempfile, time
@@ -109,6 +111,36 @@
lst = traceback.format_exception_only(e.__class__, e)
self.assertEqual(lst, ['KeyboardInterrupt\n'])
+ # String exceptions are deprecated, but legal. The quirky form with
+ # separate "type" and "value" tends to break things, because
+ # not isinstance(value, type)
+ # and a string cannot be the first argument to issubclass.
+ #
+ # Note that sys.last_type and sys.last_value do not get set if an
+ # exception is caught, so we sort of cheat and just emulate them.
+ #
+ # test_string_exception1 is equivalent to
+ #
+ # >>> raise "String Exception"
+ #
+ # test_string_exception2 is equivalent to
+ #
+ # >>> raise "String Exception", "String Value"
+ #
+ def test_string_exception1(self):
+ str_type = "String Exception"
+ err = traceback.format_exception_only(str_type, None)
+ self.assert_(len(err) == 1)
+ self.assert_(err[0] == str_type + '\n')
+
+ def test_string_exception2(self):
+ str_type = "String Exception"
+ str_value = "String Value"
+ err = traceback.format_exception_only(str_type, str_value)
+ self.assert_(len(err) == 1)
+ self.assert_(err[0] == str_type + ': ' + str_value + '\n')
+
+
def test_main():
run_unittest(TracebackCases)
Modified: python/trunk/Lib/traceback.py
==============================================================================
--- python/trunk/Lib/traceback.py (original)
+++ python/trunk/Lib/traceback.py Mon Jul 24 16:09:56 2006
@@ -150,51 +150,63 @@
The arguments are the exception type and value such as given by
sys.last_type and sys.last_value. The return value is a list of
- strings, each ending in a newline. Normally, the list contains a
- single string; however, for SyntaxError exceptions, it contains
- several lines that (when printed) display detailed information
- about where the syntax error occurred. The message indicating
- which exception occurred is the always last string in the list.
+ strings, each ending in a newline.
+
+ Normally, the list contains a single string; however, for
+ SyntaxError exceptions, it contains several lines that (when
+ printed) display detailed information about where the syntax
+ error occurred.
+
+ The message indicating which exception occurred is always the last
+ string in the list.
+
"""
- list = []
- if (type(etype) == types.ClassType
- or (isinstance(etype, type) and issubclass(etype, BaseException))):
- stype = etype.__name__
+
+ # An instance should not have a meaningful value parameter, but
+ # sometimes does, particularly for string exceptions, such as
+ # >>> raise string1, string2 # deprecated
+ #
+ # Clear these out first because issubtype(string1, SyntaxError)
+ # would throw another exception and mask the original problem.
+ if (isinstance(etype, BaseException) or
+ isinstance(etype, types.InstanceType) or
+ type(etype) is str):
+ return [_format_final_exc_line(etype, value)]
+
+ stype = etype.__name__
+
+ if not issubclass(etype, SyntaxError):
+ return [_format_final_exc_line(stype, value)]
+
+ # It was a syntax error; show exactly where the problem was found.
+ try:
+ msg, (filename, lineno, offset, badline) = value
+ except Exception:
+ pass
else:
- stype = etype
- if value is None:
- list.append(str(stype) + '\n')
+ filename = filename or ""
+ lines = [(' File "%s", line %d\n' % (filename, lineno))]
+ if badline is not None:
+ lines.append(' %s\n' % badline.strip())
+ if offset is not None:
+ caretspace = badline[:offset].lstrip()
+ # non-space whitespace (likes tabs) must be kept for alignment
+ caretspace = ((c.isspace() and c or ' ') for c in caretspace)
+ # only three spaces to account for offset1 == pos 0
+ lines.append(' %s^\n' % ''.join(caretspace))
+ value = msg
+
+ lines.append(_format_final_exc_line(stype, value))
+ return lines
+
+def _format_final_exc_line(etype, value):
+ """Return a list of a single line -- normal case for format_exception_only"""
+ if value is None or not str(value):
+ line = "%s\n" % etype
else:
- if issubclass(etype, SyntaxError):
- try:
- msg, (filename, lineno, offset, line) = value
- except:
- pass
- else:
- if not filename: filename = ""
- list.append(' File "%s", line %d\n' %
- (filename, lineno))
- if line is not None:
- i = 0
- while i < len(line) and line[i].isspace():
- i = i+1
- list.append(' %s\n' % line.strip())
- if offset is not None:
- s = ' '
- for c in line[i:offset-1]:
- if c.isspace():
- s = s + c
- else:
- s = s + ' '
- list.append('%s^\n' % s)
- value = msg
- s = _some_str(value)
- if s:
- list.append('%s: %s\n' % (str(stype), s))
- else:
- list.append('%s\n' % str(stype))
- return list
-
+ line = "%s: %s\n" % (etype, _some_str(value))
+ return line
+
def _some_str(value):
try:
return str(value)
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Mon Jul 24 16:09:56 2006
@@ -39,6 +39,9 @@
Library
-------
+- Patch #1515343: Fix printing of deprecated string exceptions with a
+ value in the traceback module.
+
- Resync optparse with Optik 1.5.3: minor tweaks for/to tests.
- Patch #1524429: Use repr() instead of backticks in Tkinter again.
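The new tests above pin the expected behaviour down: for the deprecated string-exception form, format_exception_only() returns exactly one line, built from the string "type" and the optional value. Run against a trunk checkout that includes this change, the calls look like this (a sketch, not part of the test suite):

    # Mirrors test_string_exception1/2: one-line output for string exceptions.
    import traceback

    print traceback.format_exception_only("String Exception", None)
    # expected: ['String Exception\n']
    print traceback.format_exception_only("String Exception", "String Value")
    # expected: ['String Exception: String Value\n']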
From buildbot at python.org Mon Jul 24 16:31:19 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 24 Jul 2006 14:31:19 +0000
Subject: [Python-checkins] buildbot warnings in x86 W2k trunk
Message-ID: <20060724143119.3C30B1E4003@bag.python.org>
The Buildbot has detected a new failure of x86 W2k trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520W2k%2520trunk/builds/1269
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Mon Jul 24 16:32:33 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 24 Jul 2006 14:32:33 +0000
Subject: [Python-checkins] buildbot warnings in x86 gentoo trunk
Message-ID: <20060724143233.C70D21E4003@bag.python.org>
The Buildbot has detected a new failure of x86 gentoo trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520gentoo%2520trunk/builds/1362
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Mon Jul 24 16:41:00 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 24 Jul 2006 14:41:00 +0000
Subject: [Python-checkins] buildbot warnings in x86 OpenBSD trunk
Message-ID: <20060724144100.2CA251E4003@bag.python.org>
The Buildbot has detected a new failure of x86 OpenBSD trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520OpenBSD%2520trunk/builds/1042
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Mon Jul 24 16:49:16 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 24 Jul 2006 14:49:16 +0000
Subject: [Python-checkins] buildbot warnings in x86 Ubuntu dapper (icc) 2.4
Message-ID: <20060724144916.50E891E4006@bag.python.org>
The Buildbot has detected a new failure of x86 Ubuntu dapper (icc) 2.4.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520Ubuntu%2520dapper%2520%2528icc%2529%25202.4/builds/118
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch branches/release24-maint] HEAD
Blamelist: martin.v.loewis
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Mon Jul 24 17:00:48 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 24 Jul 2006 15:00:48 +0000
Subject: [Python-checkins] buildbot warnings in ppc Debian unstable trunk
Message-ID: <20060724150048.DAE811E4003@bag.python.org>
The Buildbot has detected a new failure of ppc Debian unstable trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/ppc%2520Debian%2520unstable%2520trunk/builds/967
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Mon Jul 24 17:01:29 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 24 Jul 2006 15:01:29 +0000
Subject: [Python-checkins] buildbot warnings in amd64 gentoo trunk
Message-ID: <20060724150130.1A20B1E4003@bag.python.org>
The Buildbot has detected a new failure of amd64 gentoo trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/amd64%2520gentoo%2520trunk/builds/1278
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings failed slave lost
sincerely,
-The Buildbot
From buildbot at python.org Mon Jul 24 17:02:17 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 24 Jul 2006 15:02:17 +0000
Subject: [Python-checkins] buildbot warnings in x86 XP trunk
Message-ID: <20060724150218.0D8BB1E4003@bag.python.org>
The Buildbot has detected a new failure of x86 XP trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520XP%2520trunk/builds/1222
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Mon Jul 24 17:08:18 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 24 Jul 2006 15:08:18 +0000
Subject: [Python-checkins] buildbot warnings in PPC64 Debian trunk
Message-ID: <20060724150818.6FBAB1E4003@bag.python.org>
The Buildbot has detected a new failure of PPC64 Debian trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/PPC64%2520Debian%2520trunk/builds/271
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Mon Jul 24 17:15:20 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 24 Jul 2006 15:15:20 +0000
Subject: [Python-checkins] buildbot warnings in sparc solaris10 gcc trunk
Message-ID: <20060724151520.010B21E4003@bag.python.org>
The Buildbot has detected a new failure of sparc solaris10 gcc trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/sparc%2520solaris10%2520gcc%2520trunk/builds/1222
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Mon Jul 24 17:21:01 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 24 Jul 2006 15:21:01 +0000
Subject: [Python-checkins] buildbot warnings in g4 osx.4 trunk
Message-ID: <20060724152101.D994F1E4017@bag.python.org>
The Buildbot has detected a new failure of g4 osx.4 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/g4%2520osx.4%2520trunk/builds/1211
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Mon Jul 24 18:11:29 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 24 Jul 2006 16:11:29 +0000
Subject: [Python-checkins] buildbot warnings in S-390 Debian trunk
Message-ID: <20060724161130.117761E4003@bag.python.org>
The Buildbot has detected a new failure of S-390 Debian trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/S-390%2520Debian%2520trunk/builds/289
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Mon Jul 24 18:18:26 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 24 Jul 2006 16:18:26 +0000
Subject: [Python-checkins] buildbot warnings in alpha Tru64 5.1 trunk
Message-ID: <20060724161826.745641E4003@bag.python.org>
The Buildbot has detected a new failure of alpha Tru64 5.1 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/alpha%2520Tru64%25205.1%2520trunk/builds/949
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Mon Jul 24 19:13:24 2006
From: python-checkins at python.org (kurt.kaiser)
Date: Mon, 24 Jul 2006 19:13:24 +0200 (CEST)
Subject: [Python-checkins] r50804 - python/trunk/Lib/idlelib/EditorWindow.py
python/trunk/Lib/idlelib/NEWS.txt
python/trunk/Lib/idlelib/PyShell.py
Message-ID: <20060724171324.4E0931E4003@bag.python.org>
Author: kurt.kaiser
Date: Mon Jul 24 19:13:23 2006
New Revision: 50804
Modified:
python/trunk/Lib/idlelib/EditorWindow.py
python/trunk/Lib/idlelib/NEWS.txt
python/trunk/Lib/idlelib/PyShell.py
Log:
EditorWindow failed when used stand-alone if sys.ps1 not set.
Bug 1010370 Dave Florek
M EditorWindow.py
M PyShell.py
M NEWS.txt
Modified: python/trunk/Lib/idlelib/EditorWindow.py
==============================================================================
--- python/trunk/Lib/idlelib/EditorWindow.py (original)
+++ python/trunk/Lib/idlelib/EditorWindow.py Mon Jul 24 19:13:23 2006
@@ -85,6 +85,10 @@
self.flist = flist
root = root or flist.root
self.root = root
+ try:
+ sys.ps1
+ except AttributeError:
+ sys.ps1 = '>>> '
self.menubar = Menu(root)
self.top = top = WindowList.ListedToplevel(root, menu=self.menubar)
if flist:
Modified: python/trunk/Lib/idlelib/NEWS.txt
==============================================================================
--- python/trunk/Lib/idlelib/NEWS.txt (original)
+++ python/trunk/Lib/idlelib/NEWS.txt Mon Jul 24 19:13:23 2006
@@ -3,6 +3,9 @@
*Release date: XX-XXX-2006*
+- EditorWindow failed when used stand-alone if sys.ps1 not set.
+ Bug 1010370 Dave Florek
+
- Tooltips failed on new-syle class __init__ args. Bug 1027566 Loren Guthrie
- Avoid occasional failure to detect closing paren properly.
Modified: python/trunk/Lib/idlelib/PyShell.py
==============================================================================
--- python/trunk/Lib/idlelib/PyShell.py (original)
+++ python/trunk/Lib/idlelib/PyShell.py Mon Jul 24 19:13:23 2006
@@ -1306,10 +1306,6 @@
script = None
startup = False
try:
- sys.ps1
- except AttributeError:
- sys.ps1 = '>>> '
- try:
opts, args = getopt.getopt(sys.argv[1:], "c:deihnr:st:")
except getopt.error, msg:
sys.stderr.write("Error: %s\n" % str(msg))
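The fix itself is small enough to restate on its own: the try/except AttributeError guard that used to live in PyShell's command-line handling now runs in EditorWindow.__init__, so a stand-alone EditorWindow gets a sensible prompt even though nothing interactive has set sys.ps1. The pattern in isolation:

    # sys.ps1 is normally unset in a non-interactive process; default it,
    # as EditorWindow.__init__ does after r50804.
    import sys

    try:
        sys.ps1
    except AttributeError:
        sys.ps1 = '>>> '

    print repr(sys.ps1)   # '>>> ' when run as a plain script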
From python-checkins at python.org Mon Jul 24 20:05:52 2006
From: python-checkins at python.org (kurt.kaiser)
Date: Mon, 24 Jul 2006 20:05:52 +0200 (CEST)
Subject: [Python-checkins] r50805 - python/trunk/Lib/idlelib/EditorWindow.py
python/trunk/Lib/idlelib/NEWS.txt
python/trunk/Lib/idlelib/ScriptBinding.py
Message-ID: <20060724180552.EB7F21E4005@bag.python.org>
Author: kurt.kaiser
Date: Mon Jul 24 20:05:51 2006
New Revision: 50805
Modified:
python/trunk/Lib/idlelib/EditorWindow.py
python/trunk/Lib/idlelib/NEWS.txt
python/trunk/Lib/idlelib/ScriptBinding.py
Log:
- EditorWindow.test() was failing. Bug 1417598
M EditorWindow.py
M ScriptBinding.py
M NEWS.txt
Modified: python/trunk/Lib/idlelib/EditorWindow.py
==============================================================================
--- python/trunk/Lib/idlelib/EditorWindow.py (original)
+++ python/trunk/Lib/idlelib/EditorWindow.py Mon Jul 24 20:05:51 2006
@@ -95,11 +95,12 @@
self.tkinter_vars = flist.vars
#self.top.instance_dict makes flist.inversedict avalable to
#configDialog.py so it can access all EditorWindow instaces
- self.top.instance_dict=flist.inversedict
+ self.top.instance_dict = flist.inversedict
else:
self.tkinter_vars = {} # keys: Tkinter event names
# values: Tkinter variable instances
- self.recent_files_path=os.path.join(idleConf.GetUserCfgDir(),
+ self.top.instance_dict = {}
+ self.recent_files_path = os.path.join(idleConf.GetUserCfgDir(),
'recent-files.lst')
self.vbar = vbar = Scrollbar(top, name='vbar')
self.text_frame = text_frame = Frame(top)
Modified: python/trunk/Lib/idlelib/NEWS.txt
==============================================================================
--- python/trunk/Lib/idlelib/NEWS.txt (original)
+++ python/trunk/Lib/idlelib/NEWS.txt Mon Jul 24 20:05:51 2006
@@ -3,6 +3,8 @@
*Release date: XX-XXX-2006*
+- EditorWindow.test() was failing. Bug 1417598
+
- EditorWindow failed when used stand-alone if sys.ps1 not set.
Bug 1010370 Dave Florek
Modified: python/trunk/Lib/idlelib/ScriptBinding.py
==============================================================================
--- python/trunk/Lib/idlelib/ScriptBinding.py (original)
+++ python/trunk/Lib/idlelib/ScriptBinding.py Mon Jul 24 20:05:51 2006
@@ -51,7 +51,7 @@
# Provide instance variables referenced by Debugger
# XXX This should be done differently
self.flist = self.editwin.flist
- self.root = self.flist.root
+ self.root = self.editwin.root
def check_module_event(self, event):
filename = self.getfilename()
From python-checkins at python.org Mon Jul 24 20:15:05 2006
From: python-checkins at python.org (jackilyn.hoxworth)
Date: Mon, 24 Jul 2006 20:15:05 +0200 (CEST)
Subject: [Python-checkins] r50806 -
python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py
Message-ID: <20060724181505.D10961E4003@bag.python.org>
Author: jackilyn.hoxworth
Date: Mon Jul 24 20:15:05 2006
New Revision: 50806
Modified:
python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py
Log:
Modified: python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py
==============================================================================
--- python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py (original)
+++ python/branches/hoxworth-stdlib_logging-soc/test_soc_httplib.py Mon Jul 24 20:15:05 2006
@@ -29,30 +29,33 @@
# add the handler to the logger
log.addHandler(handler)
+myconn = httplib.HTTPConnection('www.google.com')
+myconn.set_debuglevel(43)
# create socket
-class FakeSocket:
- def __init__(self, text, fileclass=StringIO.StringIO):
- self.text = text
- self.fileclass = fileclass
-
- def makefile(self, mode, bufsize=None):
- if mode != 'r' and mode != 'rb':
- raise httplib.UnimplementedFileMode()
- return self.fileclass(self.text)
+# class FakeSocket:
+# def __init__(self, text, fileclass=StringIO.StringIO):
+# self.text = text
+# self.fileclass = fileclass
+#
+# def makefile(self, mode, bufsize=None):
+# if mode != 'r' and mode != 'rb':
+# raise httplib.UnimplementedFileMode()
+# return self.fileclass(self.text)
-sock = FakeSocket("socket")
+#sock = FakeSocket("socket")
httplib._log.info("message 1") # first stage of testing
-r = httplib.HTTPResponse(sock) # second stage of testing
-r.begin() # call the begin method
+#r = httplib.HTTPResponse(sock) # second stage of testing
+#r.begin() # call the begin method
# class test:
# def someTest:
# self.msg == None
# self._read_status == "message 1" == CONTINUE
# skip != True
-# self.debuglevel > 0
+if myconn.debuglevel > 0:
+ print "something"
print stringLog.getvalue() # For testing purposes
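The capture mechanism in this test, separated from the httplib-specific parts, is the standard logging idiom of pointing a StreamHandler at a StringIO and reading getvalue() afterwards. A minimal sketch, using the same "py.httplib" logger name as the branch (the real test logs through httplib._log rather than directly):

    # Capture a named logger's output in memory for later assertions.
    import logging
    import StringIO

    stringLog = StringIO.StringIO()
    log = logging.getLogger("py.httplib")
    log.setLevel(logging.INFO)
    log.addHandler(logging.StreamHandler(stringLog))

    log.info("message 1")
    print stringLog.getvalue()   # -> "message 1\n"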
From python-checkins at python.org Mon Jul 24 20:15:55 2006
From: python-checkins at python.org (jackilyn.hoxworth)
Date: Mon, 24 Jul 2006 20:15:55 +0200 (CEST)
Subject: [Python-checkins] r50807 -
python/branches/hoxworth-stdlib_logging-soc/fakesocket.py
python/branches/hoxworth-stdlib_logging-soc/new_soc_logging_test.py
Message-ID: <20060724181555.570B81E4003@bag.python.org>
Author: jackilyn.hoxworth
Date: Mon Jul 24 20:15:55 2006
New Revision: 50807
Added:
python/branches/hoxworth-stdlib_logging-soc/fakesocket.py
python/branches/hoxworth-stdlib_logging-soc/new_soc_logging_test.py
Log:
Added: python/branches/hoxworth-stdlib_logging-soc/fakesocket.py
==============================================================================
--- (empty file)
+++ python/branches/hoxworth-stdlib_logging-soc/fakesocket.py Mon Jul 24 20:15:55 2006
@@ -0,0 +1,33 @@
+import socket
+
+def getaddrinfo(host,port, *args):
+ """no real addr info -- but the test data is copied from 'port' to 'sa'
+ for a bit of control over the resulting mock socket.
+ >>> getaddrinfo("MOCK", "Raise: connection prohibited")
+ ("af", "socktype", "proto", "cannonname", "Raise: connection prohibited")
+ """
+
+ if host != "MOCK":
+ raise ValueError("Faked Socket Module for testing only")
+ # Use port for sa, so we can fake a raise
+ return ("af", "socktype", "proto", "cannonname", port)
+
+# Bad name, but it matches the real module
+class error(Exception): pass
+
+class socket(object):
+ """Mock socket object"""
+ def __init__(self, af, socktype, proto): pass
+ def connect(self, sa):
+ """Raise if the argument says raise, otherwise sa is treated as the response.
+ Wouldn't hurt to put doctests in here, too...
+ """
+ if sa.startswith("Raise: "):
+ raise error(sa)
+ else:
+ self.incoming_msg = sa
+ def close(self): pass
+ def sendall(self, msg):
+ self.gotsent = msg
+ def makefile(self, mode, bufsize):
+ return cStringIO(self.incoming_msg)
\ No newline at end of file
Added: python/branches/hoxworth-stdlib_logging-soc/new_soc_logging_test.py
==============================================================================
--- (empty file)
+++ python/branches/hoxworth-stdlib_logging-soc/new_soc_logging_test.py Mon Jul 24 20:15:55 2006
@@ -0,0 +1,40 @@
+import httplib
+import fakesocket
+import logging
+from cStringIO import StringIO
+
+origsocket=httplib.socket
+# monkeypatch -- replace httplib.socket with our own fake module
+httplib.socket=fakesocket
+
+# ... run the tests ...
+log=logging.getLogger("py.httplib")
+stringLog = StringIO()
+
+# define the handler and level
+handler = logging.StreamHandler(stringLog)
+log.setLevel(logging.INFO)
+
+# add the handler to the logger
+log.addHandler(handler)
+
+httplib._log.info("message 1") # 1st test
+
+myconn = httplib.HTTPConnection('www.google.com')
+myconn.set_debuglevel(43)
+if myconn.debuglevel > 0:
+ print "Debug level is > 0"
+
+myconn.connect()
+myconn.putrequest("GET", "/search?q=Jackilyn+Hoxworth")
+myconn.getresponse()
+
+print stringLog.getvalue() # For testing purposes
+
+if stringLog.getvalue() != "Error: It worked":
+ print "it worked"
+else:
+ print "it didn't work"
+
+# restore it to working order, so other tests won't fail
+httplib.socket=origsocket
\ No newline at end of file
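new_soc_logging_test.py relies on module-level monkeypatching: save httplib.socket, swap in the fake module, run the test, and put the original back so later tests see the real network code. The same idiom, restated with a try/finally so the restore happens even if the test raises (fakesocket here is the mock module added in this commit and must be importable):

    # Monkeypatch httplib's socket module for the duration of a test.
    import httplib
    import fakesocket      # the mock module added in r50807

    origsocket = httplib.socket
    httplib.socket = fakesocket
    try:
        pass               # ... exercise httplib against the fake socket here ...
    finally:
        # restore it to working order, so other tests won't fail
        httplib.socket = origsocket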
From neal at metaslash.com Mon Jul 24 22:06:49 2006
From: neal at metaslash.com (Neal Norwitz)
Date: Mon, 24 Jul 2006 16:06:49 -0400
Subject: [Python-checkins] Python Regression Test Failures basics (1)
Message-ID: <20060724200649.GA29934@python.psfb.org>
test_grammar
test_opcodes
test_operations
test_builtin
test_exceptions
test_types
test_MimeWriter
test_StringIO
test___all__
test___future__
test__locale
test_aepack
test_aepack skipped -- No module named aepack
test_al
test_al skipped -- No module named al
test_anydbm
test_applesingle
test_applesingle skipped -- No module named macostools
test_array
test_ast
test_asynchat
test_atexit
test_audioop
test_augassign
test_base64
test_bastion
test_bigmem
test_binascii
test_binhex
test_binop
test_bisect
test_bool
test_bsddb
test_bsddb185
test_bsddb185 skipped -- No module named bsddb185
test_bsddb3
test_bsddb3 skipped -- Use of the `bsddb' resource not enabled
test_bufio
test_bz2
test_cProfile
test_calendar
test_call
test_capi
test_cd
test_cd skipped -- No module named cd
test_cfgparser
test_cgi
test_charmapcodec
test_cl
test_cl skipped -- No module named cl
test_class
test_cmath
test_cmd_line
test_code
test_codeccallbacks
test_codecencodings_cn
test_codecencodings_hk
test_codecencodings_jp
test_codecencodings_kr
test_codecencodings_tw
test_codecmaps_cn
test_codecmaps_hk
test_codecmaps_jp
test_codecmaps_kr
test_codecmaps_tw
test_codecs
test_codeop
test_coding
test_coercion
test_colorsys
test_commands
test_compare
test_compile
test_compiler
test_complex
test_contains
test_contextlib
test_cookie
test_cookielib
test_copy
test_copy_reg
test_cpickle
test_crypt
test_csv
test_ctypes
test_curses
test_curses skipped -- Use of the `curses' resource not enabled
test_datetime
test_dbm
test_decimal
test_decorators
test_defaultdict
test_deque
test_descr
test_descrtut
test_dict
test_difflib
test_dircache
test_dis
test_distutils
test_dl
test_doctest
test_doctest2
test_dumbdbm
test_dummy_thread
test_dummy_threading
test_email
test_email_codecs
test_email_renamed
test_enumerate
test_eof
test_errno
test_exception_variations
test_extcall
test_fcntl
test_file
test_filecmp
test_fileinput
test_float
test_fnmatch
test_fork1
test_format
test_fpformat
test_frozen
test_funcattrs
test_functools
test_future
test_gc
test_gdbm
test_generators
test test_generators crashed -- : local variable 'lines' referenced before assignment
test_genexps
test_getargs
test_getargs2
test_getopt
test_gettext
test_gl
test_gl skipped -- No module named gl
test_glob
test_global
test_grp
test_gzip
test_hash
test_hashlib
test_heapq
test_hexoct
test_hmac
test_hotshot
test_htmllib
test_htmlparser
test_httplib
test_imageop
test_imaplib
test_imgfile
test_imgfile skipped -- No module named imgfile
test_imp
test_import
test_importhooks
test_index
test_inspect
test_ioctl
test_ioctl skipped -- Unable to open /dev/tty
test_isinstance
test_iter
test_iterlen
test_itertools
test_largefile
test_linuxaudiodev
test_linuxaudiodev skipped -- Use of the `audio' resource not enabled
test_list
test_locale
test_logging
test_long
test_long_future
test_longexp
test_macfs
test_macfs skipped -- No module named macfs
test_macostools
test_macostools skipped -- No module named macostools
test_macpath
test_mailbox
test_marshal
test_math
test_md5
test_mhlib
test_mimetools
test_mimetypes
test_minidom
test_mmap
test_module
test_multibytecodec
test_multibytecodec_support
test_multifile
test_mutants
test_netrc
test_new
test_nis
test_nis skipped -- Local domain name not set
test_normalization
test_ntpath
test_old_mailbox
test_openpty
test_operator
test_optparse
test_os
test_ossaudiodev
test_ossaudiodev skipped -- Use of the `audio' resource not enabled
test_parser
test_peepholer
test_pep247
test_pep263
test_pep277
test_pep277 skipped -- test works only on NT+
test_pep292
test_pep352
test_pickle
test_pickletools
test_pkg
test_pkgimport
test_platform
test_plistlib
test_plistlib skipped -- No module named plistlib
test_poll
test_popen
[7146 refs]
[7146 refs]
[7146 refs]
test_popen2
test_posix
test_posixpath
test_pow
test_pprint
test_profile
test_profilehooks
test_pty
test_pwd
test_pyclbr
test_pyexpat
test_queue
test_quopri
[7521 refs]
[7521 refs]
test_random
test_re
test_repr
test_resource
test_rfc822
test_rgbimg
test_richcmp
test_robotparser
test_runpy
test_sax
test_scope
test_scriptpackages
test_scriptpackages skipped -- No module named aetools
test_select
test_set
test_sets
test_sgmllib
test_sha
test_shelve
test_shlex
test_shutil
test_signal
test_site
test_slice
test_socket
test_socket_ssl
test_socket_ssl skipped -- Use of the `network' resource not enabled
test_socketserver
test_socketserver skipped -- Use of the `network' resource not enabled
test_softspace
test_sort
test_sqlite
test_startfile
test_startfile skipped -- cannot import name startfile
test_str
test_strftime
test_string
test_stringprep
test_strop
test_strptime
test_struct
test_structseq
test_subprocess
[7141 refs]
[7142 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7142 refs]
[8690 refs]
[7357 refs]
[7142 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
this bit of output is from a test of stdout in a different process ...
[7141 refs]
[7141 refs]
[7357 refs]
test_sunaudiodev
test_sunaudiodev skipped -- No module named sunaudiodev
test_sundry
test_symtable
test_syntax
test test_syntax crashed -- : local variable 'lines' referenced before assignment
test_sys
[7141 refs]
[7141 refs]
test_tarfile
test_tcl
test_tcl skipped -- No module named _tkinter
test_tempfile
[7148 refs]
test_textwrap
test_thread
test_threaded_import
test_threadedtempfile
test_threading
test_threading_local
test_threadsignals
test_time
test_timeout
test_timeout skipped -- Use of the `network' resource not enabled
test_tokenize
test_trace
test_traceback
test_transformer
test_tuple
test_ucn
test_unary
test_unicode
test_unicode_file
test_unicode_file skipped -- No Unicode filesystem semantics on this platform.
test_unicodedata
test_unittest
test_univnewlines
test_unpack
test_urllib
test_urllib2
test_urllib2net
test_urllib2net skipped -- Use of the `network' resource not enabled
test_urllibnet
test_urllibnet skipped -- Use of the `network' resource not enabled
test_urlparse
test_userdict
test_userlist
test_userstring
test_uu
test_uuid
test_wait3
test_wait4
test_warnings
test_wave
test_weakref
test_whichdb
test_winreg
test_winreg skipped -- No module named _winreg
test_winsound
test_winsound skipped -- No module named winsound
test_with
test_wsgiref
test_xdrlib
test_xml_etree
test_xml_etree_c
test_xmllib
test_xmlrpc
test_xpickle
test_xrange
test_zipfile
/tmp/python-test/local/lib/python2.5/struct.py:63: DeprecationWarning: struct integer overflow masking is deprecated
return o.pack(*args)
test_zipfile64
test_zipfile64 skipped -- test requires loads of disk-space bytes and a long time to run
test_zipimport
test_zlib
285 tests OK.
2 tests failed:
test_generators test_syntax
31 tests skipped:
test_aepack test_al test_applesingle test_bsddb185 test_bsddb3
test_cd test_cl test_curses test_gl test_imgfile test_ioctl
test_linuxaudiodev test_macfs test_macostools test_nis
test_ossaudiodev test_pep277 test_plistlib test_scriptpackages
test_socket_ssl test_socketserver test_startfile test_sunaudiodev
test_tcl test_timeout test_unicode_file test_urllib2net
test_urllibnet test_winreg test_winsound test_zipfile64
1 skip unexpected on linux2:
test_ioctl
[435300 refs]
From python-checkins at python.org Mon Jul 24 22:11:36 2006
From: python-checkins at python.org (georg.brandl)
Date: Mon, 24 Jul 2006 22:11:36 +0200 (CEST)
Subject: [Python-checkins] r50808 - python/trunk/Lib/traceback.py
Message-ID: <20060724201136.2EA731E400E@bag.python.org>
Author: georg.brandl
Date: Mon Jul 24 22:11:35 2006
New Revision: 50808
Modified:
python/trunk/Lib/traceback.py
Log:
Repair accidental NameError.
Modified: python/trunk/Lib/traceback.py
==============================================================================
--- python/trunk/Lib/traceback.py (original)
+++ python/trunk/Lib/traceback.py Mon Jul 24 22:11:35 2006
@@ -179,13 +179,14 @@
return [_format_final_exc_line(stype, value)]
# It was a syntax error; show exactly where the problem was found.
+ lines = []
try:
msg, (filename, lineno, offset, badline) = value
except Exception:
pass
else:
filename = filename or ""
- lines = [(' File "%s", line %d\n' % (filename, lineno))]
+ lines.append(' File "%s", line %d\n' % (filename, lineno))
if badline is not None:
lines.append(' %s\n' % badline.strip())
if offset is not None:
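The NameError being repaired is the usual bind-in-else pitfall: when unpacking value fails, the else: branch never runs, lines is never bound, and the later lines.append() blows up. A minimal reproduction of the broken shape and the fixed one (stand-alone, not the traceback module itself):

    def buggy(value):
        try:
            msg, rest = value      # may fail for odd SyntaxError values
        except Exception:
            pass
        else:
            lines = ['header']     # only bound when the unpack succeeds
        lines.append('final')      # NameError if it did not
        return lines

    def fixed(value):
        lines = []                 # bound up front, as in r50808
        try:
            msg, rest = value
        except Exception:
            pass
        else:
            lines.append('header')
        lines.append('final')
        return lines

    print fixed(None)              # ['final']
    print fixed(('msg', 'rest'))   # ['header', 'final']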
From neal at metaslash.com Mon Jul 24 22:12:17 2006
From: neal at metaslash.com (Neal Norwitz)
Date: Mon, 24 Jul 2006 16:12:17 -0400
Subject: [Python-checkins] Python Regression Test Failures opt (1)
Message-ID: <20060724201217.GA30601@python.psfb.org>
test_grammar
test_opcodes
test_operations
test_builtin
test_exceptions
test_types
test_MimeWriter
test_StringIO
test___all__
test___future__
test__locale
test_aepack
test_aepack skipped -- No module named aepack
test_al
test_al skipped -- No module named al
test_anydbm
test_applesingle
test_applesingle skipped -- No module named macostools
test_array
test_ast
test_asynchat
test_atexit
test_audioop
test_augassign
test_base64
test_bastion
test_bigmem
test_binascii
test_binhex
test_binop
test_bisect
test_bool
test_bsddb
test_bsddb185
test_bsddb185 skipped -- No module named bsddb185
test_bsddb3
test_bsddb3 skipped -- Use of the `bsddb' resource not enabled
test_bufio
test_bz2
test_cProfile
test_calendar
test_call
test_capi
test_cd
test_cd skipped -- No module named cd
test_cfgparser
test_cgi
test_charmapcodec
test_cl
test_cl skipped -- No module named cl
test_class
test_cmath
test_cmd_line
test_code
test_codeccallbacks
test_codecencodings_cn
test_codecencodings_hk
test_codecencodings_jp
test_codecencodings_kr
test_codecencodings_tw
test_codecmaps_cn
test_codecmaps_hk
test_codecmaps_jp
test_codecmaps_kr
test_codecmaps_tw
test_codecs
test_codeop
test_coding
test_coercion
test_colorsys
test_commands
test_compare
test_compile
test_compiler
test_complex
test_contains
test_contextlib
test_cookie
test_cookielib
test_copy
test_copy_reg
test_cpickle
test_crypt
test_csv
test_ctypes
test_curses
test_curses skipped -- Use of the `curses' resource not enabled
test_datetime
test_dbm
test_decimal
test_decorators
test_defaultdict
test_deque
test_descr
test_descrtut
test_dict
test_difflib
test_dircache
test_dis
test_distutils
[8827 refs]
test_dl
test_doctest
test_doctest2
test_dumbdbm
test_dummy_thread
test_dummy_threading
test_email
test_email_codecs
test_email_renamed
test_enumerate
test_eof
test_errno
test_exception_variations
test_extcall
test_fcntl
test_file
test_filecmp
test_fileinput
test_float
test_fnmatch
test_fork1
test_format
test_fpformat
test_frozen
test_funcattrs
test_functools
test_future
test_gc
test_gdbm
test_generators
test test_generators crashed -- : local variable 'lines' referenced before assignment
test_genexps
test_getargs
test_getargs2
test_getopt
test_gettext
test_gl
test_gl skipped -- No module named gl
test_glob
test_global
test_grp
test_gzip
test_hash
test_hashlib
test_heapq
test_hexoct
test_hmac
test_hotshot
test_htmllib
test_htmlparser
test_httplib
test_imageop
test_imaplib
test_imgfile
test_imgfile skipped -- No module named imgfile
test_imp
test_import
test_importhooks
test_index
test_inspect
test_ioctl
test_ioctl skipped -- Unable to open /dev/tty
test_isinstance
test_iter
test_iterlen
test_itertools
test_largefile
test_linuxaudiodev
test_linuxaudiodev skipped -- Use of the `audio' resource not enabled
test_list
test_locale
test_logging
test_long
test_long_future
test_longexp
test_macfs
test_macfs skipped -- No module named macfs
test_macostools
test_macostools skipped -- No module named macostools
test_macpath
test_mailbox
test_marshal
test_math
test_md5
test_mhlib
test_mimetools
test_mimetypes
test_minidom
test_mmap
test_module
test_multibytecodec
test_multibytecodec_support
test_multifile
test_mutants
test_netrc
test_new
test_nis
test_nis skipped -- Local domain name not set
test_normalization
test_ntpath
test_old_mailbox
test_openpty
test_operator
test_optparse
test_os
test_ossaudiodev
test_ossaudiodev skipped -- Use of the `audio' resource not enabled
test_parser
test_peepholer
test_pep247
test_pep263
test_pep277
test_pep277 skipped -- test works only on NT+
test_pep292
test_pep352
test_pickle
test_pickletools
test_pkg
test_pkgimport
test_platform
test_plistlib
test_plistlib skipped -- No module named plistlib
test_poll
test_popen
[7146 refs]
[7146 refs]
[7146 refs]
test_popen2
test_posix
test_posixpath
test_pow
test_pprint
test_profile
test_profilehooks
test_pty
test_pwd
test_pyclbr
test_pyexpat
test_queue
test_quopri
[7521 refs]
[7521 refs]
test_random
test_re
test_repr
test_resource
test_rfc822
test_rgbimg
test_richcmp
test_robotparser
test_runpy
test_sax
test_scope
test_scriptpackages
test_scriptpackages skipped -- No module named aetools
test_select
test_set
test_sets
test_sgmllib
test_sha
test_shelve
test_shlex
test_shutil
test_signal
test_site
test_slice
test_socket
test_socket_ssl
test_socket_ssl skipped -- Use of the `network' resource not enabled
test_socketserver
test_socketserver skipped -- Use of the `network' resource not enabled
test_softspace
test_sort
test_sqlite
test_startfile
test_startfile skipped -- cannot import name startfile
test_str
test_strftime
test_string
test_stringprep
test_strop
test_strptime
test_struct
test_structseq
test_subprocess
[7141 refs]
[7142 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7142 refs]
[8690 refs]
[7357 refs]
[7142 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
this bit of output is from a test of stdout in a different process ...
[7141 refs]
[7141 refs]
[7357 refs]
test_sunaudiodev
test_sunaudiodev skipped -- No module named sunaudiodev
test_sundry
test_symtable
test_syntax
test test_syntax crashed -- : local variable 'lines' referenced before assignment
test_sys
[7141 refs]
[7141 refs]
test_tarfile
test_tcl
test_tcl skipped -- No module named _tkinter
test_tempfile
[7148 refs]
test_textwrap
test_thread
test_threaded_import
test_threadedtempfile
test_threading
test_threading_local
test_threadsignals
test_time
test_timeout
test_timeout skipped -- Use of the `network' resource not enabled
test_tokenize
test_trace
test_traceback
test_transformer
test_tuple
test_ucn
test_unary
test_unicode
test_unicode_file
test_unicode_file skipped -- No Unicode filesystem semantics on this platform.
test_unicodedata
test_unittest
test_univnewlines
test_unpack
test_urllib
test_urllib2
test_urllib2net
test_urllib2net skipped -- Use of the `network' resource not enabled
test_urllibnet
test_urllibnet skipped -- Use of the `network' resource not enabled
test_urlparse
test_userdict
test_userlist
test_userstring
test_uu
test_uuid
test_wait3
test_wait4
test_warnings
test_wave
test_weakref
test_whichdb
test_winreg
test_winreg skipped -- No module named _winreg
test_winsound
test_winsound skipped -- No module named winsound
test_with
test_wsgiref
test_xdrlib
test_xml_etree
test_xml_etree_c
test_xmllib
test_xmlrpc
test_xpickle
test_xrange
test_zipfile
/tmp/python-test/local/lib/python2.5/struct.py:63: DeprecationWarning: struct integer overflow masking is deprecated
return o.pack(*args)
test_zipfile64
test_zipfile64 skipped -- test requires loads of disk-space bytes and a long time to run
test_zipimport
test_zlib
285 tests OK.
2 tests failed:
test_generators test_syntax
31 tests skipped:
test_aepack test_al test_applesingle test_bsddb185 test_bsddb3
test_cd test_cl test_curses test_gl test_imgfile test_ioctl
test_linuxaudiodev test_macfs test_macostools test_nis
test_ossaudiodev test_pep277 test_plistlib test_scriptpackages
test_socket_ssl test_socketserver test_startfile test_sunaudiodev
test_tcl test_timeout test_unicode_file test_urllib2net
test_urllibnet test_winreg test_winsound test_zipfile64
1 skip unexpected on linux2:
test_ioctl
[434841 refs]
From python-checkins at python.org Mon Jul 24 23:02:19 2006
From: python-checkins at python.org (tim.peters)
Date: Mon, 24 Jul 2006 23:02:19 +0200 (CEST)
Subject: [Python-checkins] r50809 - in python/trunk/Lib:
idlelib/macosxSupport.py test/test_traceback.py traceback.py
Message-ID: <20060724210219.307F61E400F@bag.python.org>
Author: tim.peters
Date: Mon Jul 24 23:02:15 2006
New Revision: 50809
Modified:
python/trunk/Lib/idlelib/macosxSupport.py
python/trunk/Lib/test/test_traceback.py
python/trunk/Lib/traceback.py
Log:
Whitespace normalization.
Modified: python/trunk/Lib/idlelib/macosxSupport.py
==============================================================================
--- python/trunk/Lib/idlelib/macosxSupport.py (original)
+++ python/trunk/Lib/idlelib/macosxSupport.py Mon Jul 24 23:02:15 2006
@@ -30,7 +30,7 @@
Replace the Tk root menu by something that's more appropriate for
IDLE.
"""
- # The menu that is attached to the Tk root (".") is also used by AquaTk for
+ # The menu that is attached to the Tk root (".") is also used by AquaTk for
# all windows that don't specify a menu of their own. The default menubar
# contains a number of menus, none of which are appropriate for IDLE. The
# Most annoying of those is an 'About Tck/Tk...' menu in the application
@@ -82,7 +82,7 @@
for mname, entrylist in Bindings.menudefs:
menu = menudict.get(mname)
- if not menu:
+ if not menu:
continue
for entry in entrylist:
if not entry:
@@ -90,14 +90,14 @@
else:
label, eventname = entry
underline, label = prepstr(label)
- accelerator = get_accelerator(Bindings.default_keydefs,
+ accelerator = get_accelerator(Bindings.default_keydefs,
eventname)
def command(text=root, eventname=eventname):
text.event_generate(eventname)
menu.add_command(label=label, underline=underline,
command=command, accelerator=accelerator)
-
+
Modified: python/trunk/Lib/test/test_traceback.py
==============================================================================
--- python/trunk/Lib/test/test_traceback.py (original)
+++ python/trunk/Lib/test/test_traceback.py Mon Jul 24 23:02:15 2006
@@ -112,7 +112,7 @@
self.assertEqual(lst, ['KeyboardInterrupt\n'])
# String exceptions are deprecated, but legal. The quirky form with
- # separate "type" and "value" tends to break things, because
+ # separate "type" and "value" tends to break things, because
# not isinstance(value, type)
# and a string cannot be the first argument to issubclass.
#
@@ -139,7 +139,7 @@
err = traceback.format_exception_only(str_type, str_value)
self.assert_(len(err) == 1)
self.assert_(err[0] == str_type + ': ' + str_value + '\n')
-
+
def test_main():
run_unittest(TracebackCases)
Modified: python/trunk/Lib/traceback.py
==============================================================================
--- python/trunk/Lib/traceback.py (original)
+++ python/trunk/Lib/traceback.py Mon Jul 24 23:02:15 2006
@@ -172,7 +172,7 @@
isinstance(etype, types.InstanceType) or
type(etype) is str):
return [_format_final_exc_line(etype, value)]
-
+
stype = etype.__name__
if not issubclass(etype, SyntaxError):
@@ -196,18 +196,18 @@
# only three spaces to account for offset1 == pos 0
lines.append(' %s^\n' % ''.join(caretspace))
value = msg
-
+
lines.append(_format_final_exc_line(stype, value))
return lines
def _format_final_exc_line(etype, value):
"""Return a list of a single line -- normal case for format_exception_only"""
- if value is None or not str(value):
+ if value is None or not str(value):
line = "%s\n" % etype
else:
line = "%s: %s\n" % (etype, _some_str(value))
return line
-
+
def _some_str(value):
try:
return str(value)
From buildbot at python.org Mon Jul 24 23:11:18 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 24 Jul 2006 21:11:18 +0000
Subject: [Python-checkins] buildbot warnings in sparc Ubuntu dapper trunk
Message-ID: <20060724211118.70D231E4003@bag.python.org>
The Buildbot has detected a new failure of sparc Ubuntu dapper trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/sparc%2520Ubuntu%2520dapper%2520trunk/builds/555
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl,kurt.kaiser
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From neal at metaslash.com Mon Jul 24 23:23:48 2006
From: neal at metaslash.com (Neal Norwitz)
Date: Mon, 24 Jul 2006 17:23:48 -0400
Subject: [Python-checkins] Python Regression Test Failures refleak (1)
Message-ID: <20060724212348.GA29890@python.psfb.org>
test_cmd_line leaked [0, -17, 17] references
From neal at metaslash.com Mon Jul 24 23:40:03 2006
From: neal at metaslash.com (Neal Norwitz)
Date: Mon, 24 Jul 2006 17:40:03 -0400
Subject: [Python-checkins] Python Regression Test Failures all (1)
Message-ID: <20060724214003.GA437@python.psfb.org>
test_grammar
test_opcodes
test_operations
test_builtin
test_exceptions
test_types
test_MimeWriter
test_StringIO
test___all__
test___future__
test__locale
test_aepack
test_aepack skipped -- No module named aepack
test_al
test_al skipped -- No module named al
test_anydbm
test_applesingle
test_applesingle skipped -- No module named macostools
test_array
test_ast
test_asynchat
test_atexit
test_audioop
test_augassign
test_base64
test_bastion
test_bigmem
test_binascii
test_binhex
test_binop
test_bisect
test_bool
test_bsddb
test_bsddb185
test_bsddb185 skipped -- No module named bsddb185
test_bsddb3
Exception in thread reader 2:
Traceback (most recent call last):
File "/tmp/python-test/local/lib/python2.5/threading.py", line 460, in __bootstrap
self.run()
File "/tmp/python-test/local/lib/python2.5/threading.py", line 440, in run
self.__target(*self.__args, **self.__kwargs)
File "/tmp/python-test/local/lib/python2.5/bsddb/test/test_thread.py", line 281, in readerThread
rec = dbutils.DeadlockWrap(c.next, max_retries=10)
File "/tmp/python-test/local/lib/python2.5/bsddb/dbutils.py", line 62, in DeadlockWrap
return function(*_args, **_kwargs)
DBLockDeadlockError: (-30996, 'DB_LOCK_DEADLOCK: Locker killed to resolve a deadlock')
Exception in thread reader 4:
Traceback (most recent call last):
File "/tmp/python-test/local/lib/python2.5/threading.py", line 460, in __bootstrap
self.run()
File "/tmp/python-test/local/lib/python2.5/threading.py", line 440, in run
self.__target(*self.__args, **self.__kwargs)
File "/tmp/python-test/local/lib/python2.5/bsddb/test/test_thread.py", line 281, in readerThread
rec = dbutils.DeadlockWrap(c.next, max_retries=10)
File "/tmp/python-test/local/lib/python2.5/bsddb/dbutils.py", line 62, in DeadlockWrap
return function(*_args, **_kwargs)
DBLockDeadlockError: (-30996, 'DB_LOCK_DEADLOCK: Locker killed to resolve a deadlock')
Exception in thread reader 3:
Traceback (most recent call last):
File "/tmp/python-test/local/lib/python2.5/threading.py", line 460, in __bootstrap
self.run()
File "/tmp/python-test/local/lib/python2.5/threading.py", line 440, in run
self.__target(*self.__args, **self.__kwargs)
File "/tmp/python-test/local/lib/python2.5/bsddb/test/test_thread.py", line 281, in readerThread
rec = dbutils.DeadlockWrap(c.next, max_retries=10)
File "/tmp/python-test/local/lib/python2.5/bsddb/dbutils.py", line 62, in DeadlockWrap
return function(*_args, **_kwargs)
DBLockDeadlockError: (-30996, 'DB_LOCK_DEADLOCK: Locker killed to resolve a deadlock')
Exception in thread reader 0:
Traceback (most recent call last):
File "/tmp/python-test/local/lib/python2.5/threading.py", line 460, in __bootstrap
self.run()
File "/tmp/python-test/local/lib/python2.5/threading.py", line 440, in run
self.__target(*self.__args, **self.__kwargs)
File "/tmp/python-test/local/lib/python2.5/bsddb/test/test_thread.py", line 281, in readerThread
rec = dbutils.DeadlockWrap(c.next, max_retries=10)
File "/tmp/python-test/local/lib/python2.5/bsddb/dbutils.py", line 62, in DeadlockWrap
return function(*_args, **_kwargs)
DBLockDeadlockError: (-30996, 'DB_LOCK_DEADLOCK: Locker killed to resolve a deadlock')
Exception in thread writer 0:
Traceback (most recent call last):
File "/tmp/python-test/local/lib/python2.5/threading.py", line 460, in __bootstrap
self.run()
File "/tmp/python-test/local/lib/python2.5/threading.py", line 440, in run
self.__target(*self.__args, **self.__kwargs)
File "/tmp/python-test/local/lib/python2.5/bsddb/test/test_thread.py", line 260, in writerThread
self.assertEqual(data, self.makeData(key))
File "/tmp/python-test/local/lib/python2.5/unittest.py", line 334, in failUnlessEqual
(msg or '%r != %r' % (first, second))
AssertionError: None != '0004-0004-0004-0004-0004'
Exception in thread writer 1:
Traceback (most recent call last):
File "/tmp/python-test/local/lib/python2.5/threading.py", line 460, in __bootstrap
self.run()
File "/tmp/python-test/local/lib/python2.5/threading.py", line 440, in run
self.__target(*self.__args, **self.__kwargs)
File "/tmp/python-test/local/lib/python2.5/bsddb/test/test_thread.py", line 260, in writerThread
self.assertEqual(data, self.makeData(key))
File "/tmp/python-test/local/lib/python2.5/unittest.py", line 334, in failUnlessEqual
(msg or '%r != %r' % (first, second))
AssertionError: None != '1007-1007-1007-1007-1007'
Exception in thread writer 2:
Traceback (most recent call last):
File "/tmp/python-test/local/lib/python2.5/threading.py", line 460, in __bootstrap
self.run()
File "/tmp/python-test/local/lib/python2.5/threading.py", line 440, in run
self.__target(*self.__args, **self.__kwargs)
File "/tmp/python-test/local/lib/python2.5/bsddb/test/test_thread.py", line 260, in writerThread
self.assertEqual(data, self.makeData(key))
File "/tmp/python-test/local/lib/python2.5/unittest.py", line 334, in failUnlessEqual
(msg or '%r != %r' % (first, second))
AssertionError: None != '2002-2002-2002-2002-2002'
test_bufio
test_bz2
test_cProfile
test_calendar
test_call
test_capi
test_cd
test_cd skipped -- No module named cd
test_cfgparser
test_cgi
test_charmapcodec
test_cl
test_cl skipped -- No module named cl
test_class
test_cmath
test_cmd_line
test_code
test_codeccallbacks
test_codecencodings_cn
test_codecencodings_hk
test_codecencodings_jp
test_codecencodings_kr
test_codecencodings_tw
test_codecmaps_cn
test_codecmaps_hk
test_codecmaps_jp
test_codecmaps_kr
test_codecmaps_tw
test_codecs
test_codeop
test_coding
test_coercion
test_colorsys
test_commands
test_compare
test_compile
test_compiler
testCompileLibrary still working, be patient...
test_complex
test_contains
test_contextlib
test_cookie
test_cookielib
test_copy
test_copy_reg
test_cpickle
test_crypt
test_csv
test_ctypes
test_datetime
test_dbm
test_decimal
test_decorators
test_defaultdict
test_deque
test_descr
test_descrtut
test_dict
test_difflib
test_dircache
test_dis
test_distutils
test_dl
test_doctest
test_doctest2
test_dumbdbm
test_dummy_thread
test_dummy_threading
test_email
test_email_codecs
test_email_renamed
test_enumerate
test_eof
test_errno
test_exception_variations
test_extcall
test_fcntl
test_file
test_filecmp
test_fileinput
test_float
test_fnmatch
test_fork1
test_format
test_fpformat
test_frozen
test_funcattrs
test_functools
test_future
test_gc
test_gdbm
test_generators
test test_generators crashed -- : local variable 'lines' referenced before assignment
test_genexps
test_getargs
test_getargs2
test_getopt
test_gettext
test_gl
test_gl skipped -- No module named gl
test_glob
test_global
test_grp
test_gzip
test_hash
test_hashlib
test_heapq
test_hexoct
test_hmac
test_hotshot
test_htmllib
test_htmlparser
test_httplib
test_imageop
test_imaplib
test_imgfile
test_imgfile skipped -- No module named imgfile
test_imp
test_import
test_importhooks
test_index
test_inspect
test_ioctl
test_ioctl skipped -- Unable to open /dev/tty
test_isinstance
test_iter
test_iterlen
test_itertools
test_largefile
test_list
test_locale
test_logging
test_long
test_long_future
test_longexp
test_macfs
test_macfs skipped -- No module named macfs
test_macostools
test_macostools skipped -- No module named macostools
test_macpath
test_mailbox
test_marshal
test_math
test_md5
test_mhlib
test_mimetools
test_mimetypes
test_minidom
test_mmap
test_module
test_multibytecodec
test_multibytecodec_support
test_multifile
test_mutants
test_netrc
test_new
test_nis
test_nis skipped -- Local domain name not set
test_normalization
test_ntpath
test_old_mailbox
test_openpty
test_operator
test_optparse
test_os
test_parser
test_peepholer
test_pep247
test_pep263
test_pep277
test_pep277 skipped -- test works only on NT+
test_pep292
test_pep352
test_pickle
test_pickletools
test_pkg
test_pkgimport
test_platform
test_plistlib
test_plistlib skipped -- No module named plistlib
test_poll
test_popen
[7146 refs]
[7146 refs]
[7146 refs]
test_popen2
test_posix
test_posixpath
test_pow
test_pprint
test_profile
test_profilehooks
test_pty
test_pwd
test_pyclbr
test_pyexpat
test_queue
test_quopri
[7521 refs]
[7521 refs]
test_random
test_re
test_repr
test_resource
test_rfc822
test_rgbimg
test_richcmp
test_robotparser
test_runpy
test_sax
test_scope
test_scriptpackages
test_scriptpackages skipped -- No module named aetools
test_select
test_set
test_sets
test_sgmllib
test_sha
test_shelve
test_shlex
test_shutil
test_signal
test_site
test_slice
test_socket
test_socket_ssl
test_socketserver
test_softspace
test_sort
test_sqlite
test_startfile
test_startfile skipped -- cannot import name startfile
test_str
test_strftime
test_string
test_stringprep
test_strop
test_strptime
test_struct
test_structseq
test_subprocess
[7141 refs]
[7142 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7142 refs]
[8690 refs]
[7357 refs]
[7142 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
[7141 refs]
this bit of output is from a test of stdout in a different process ...
[7141 refs]
[7141 refs]
[7357 refs]
test_sunaudiodev
test_sunaudiodev skipped -- No module named sunaudiodev
test_sundry
test_symtable
test_syntax
test test_syntax crashed -- : local variable 'lines' referenced before assignment
test_sys
[7141 refs]
[7141 refs]
test_tarfile
test_tcl
test_tcl skipped -- No module named _tkinter
test_tempfile
[7148 refs]
test_textwrap
test_thread
test_threaded_import
test_threadedtempfile
test_threading
test_threading_local
test_threadsignals
test_time
test_timeout
test_tokenize
test_trace
test_traceback
test_transformer
test_tuple
test_ucn
test_unary
test_unicode
test_unicode_file
test_unicode_file skipped -- No Unicode filesystem semantics on this platform.
test_unicodedata
test_unittest
test_univnewlines
test_unpack
test_urllib
test_urllib2
test_urllib2net
test_urllibnet
test_urlparse
test_userdict
test_userlist
test_userstring
test_uu
test_uuid
test_wait3
test_wait4
test_warnings
test_wave
test_weakref
test_whichdb
test_winreg
test_winreg skipped -- No module named _winreg
test_winsound
test_winsound skipped -- No module named winsound
test_with
test_wsgiref
test_xdrlib
test_xml_etree
test_xml_etree_c
test_xmllib
test_xmlrpc
test_xpickle
test_xrange
test_zipfile
/tmp/python-test/local/lib/python2.5/struct.py:63: DeprecationWarning: struct integer overflow masking is deprecated
return o.pack(*args)
test_zipfile64
test_zipfile64 skipped -- test requires loads of disk-space bytes and a long time to run
test_zipimport
test_zlib
291 tests OK.
2 tests failed:
test_generators test_syntax
22 tests skipped:
test_aepack test_al test_applesingle test_bsddb185 test_cd test_cl
test_gl test_imgfile test_ioctl test_macfs test_macostools
test_nis test_pep277 test_plistlib test_scriptpackages
test_startfile test_sunaudiodev test_tcl test_unicode_file
test_winreg test_winsound test_zipfile64
1 skip unexpected on linux2:
test_ioctl
[443362 refs]
From buildbot at python.org Mon Jul 24 23:56:51 2006
From: buildbot at python.org (buildbot at python.org)
Date: Mon, 24 Jul 2006 21:56:51 +0000
Subject: [Python-checkins] buildbot warnings in MIPS Debian trunk
Message-ID: <20060724215651.CC6D01E400B@bag.python.org>
The Buildbot has detected a new failure of MIPS Debian trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/MIPS%2520Debian%2520trunk/builds/315
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl,kurt.kaiser
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From gward-1337f07a94b43060ff5c1ea922ed93d6 at python.net Tue Jul 25 04:06:41 2006
From: gward-1337f07a94b43060ff5c1ea922ed93d6 at python.net (Greg Ward)
Date: Mon, 24 Jul 2006 22:06:41 -0400
Subject: [Python-checkins] r50775 - in python/trunk/Lib/test:
output/test_ossaudiodev test_ossaudiodev.py
In-Reply-To:
References: <20060723022554.D1C6A1E4002@bag.python.org>
Message-ID: <20060725020641.GA12238@cthulhu.gerg.ca>
On 7/22/06, greg.ward wrote:
> + percent_diff = (abs(elapsed_time - expected_time) / expected_time) * 100
> + #print ("actual running time was %.2f sec (%.1f%% difference)"
> + # % (elapsed_time, percent_diff))
> + assert percent_diff <= 10.0, \
> + ("elapsed time (%.2f sec) > 10%% off of expected time (%.2f sec)"
> + % (elapsed_time, expected_time))
[Neal Norwitz keeps me honest]:
> This assert is a no-op if running with -O.
D'ohh! Good catch. Unfortunately there are several other assert
statements in test_ossaudiodev.py too. What the heck, might as well fix
'em all. Coming soon to a checkin near you.
> Do we really need the old code commented out?
Nope. I'll remove it.
Greg
--
Greg Ward http://www.gerg.ca/
I'd rather have a bottle in front of me than have to have a frontal lobotomy.
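For readers not familiar with -O: a bare assert statement is compiled away entirely when Python runs with optimization, so a failing check is silently skipped. A minimal sketch (mirroring the helper added in the checkin below, not part of the original message):

    # $ python -O -c "assert False, 'never raised'"   # exits quietly under -O
    #
    # The replacement keeps the check alive regardless of -O:
    def _assert(expr, message=None):
        if not expr:
            raise AssertionError(message or "assertion failed")

    _assert(isinstance(2 + 2, int), "arithmetic sanity check")  # passes; raises on failure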
From python-checkins at python.org Tue Jul 25 04:11:13 2006
From: python-checkins at python.org (greg.ward)
Date: Tue, 25 Jul 2006 04:11:13 +0200 (CEST)
Subject: [Python-checkins] r50810 - python/trunk/Lib/test/test_ossaudiodev.py
Message-ID: <20060725021113.44A621E400E@bag.python.org>
Author: greg.ward
Date: Tue Jul 25 04:11:12 2006
New Revision: 50810
Modified:
python/trunk/Lib/test/test_ossaudiodev.py
Log:
Don't use standard assert: want tests to fail even when run with -O.
Delete cruft.
Modified: python/trunk/Lib/test/test_ossaudiodev.py
==============================================================================
--- python/trunk/Lib/test/test_ossaudiodev.py (original)
+++ python/trunk/Lib/test/test_ossaudiodev.py Tue Jul 25 04:11:12 2006
@@ -40,6 +40,10 @@
data = audioop.ulaw2lin(data, 2)
return (data, rate, 16, nchannels)
+# version of assert that still works with -O
+def _assert(expr, message=None):
+ if not expr:
+ raise AssertionError(message or "assertion failed")
def play_sound_file(data, rate, ssize, nchannels):
try:
@@ -57,9 +61,9 @@
dsp.fileno()
# Make sure the read-only attributes work.
- assert dsp.closed is False, "dsp.closed is not False"
- assert dsp.name == "/dev/dsp"
- assert dsp.mode == 'w', "bad dsp.mode: %r" % dsp.mode
+ _assert(dsp.closed is False, "dsp.closed is not False")
+ _assert(dsp.name == "/dev/dsp")
+ _assert(dsp.mode == 'w', "bad dsp.mode: %r" % dsp.mode)
# And make sure they're really read-only.
for attr in ('closed', 'name', 'mode'):
@@ -83,11 +87,9 @@
elapsed_time = t2 - t1
percent_diff = (abs(elapsed_time - expected_time) / expected_time) * 100
- #print ("actual running time was %.2f sec (%.1f%% difference)"
- # % (elapsed_time, percent_diff))
- assert percent_diff <= 10.0, \
- ("elapsed time (%.2f sec) > 10%% off of expected time (%.2f sec)"
- % (elapsed_time, expected_time))
+ _assert(percent_diff <= 10.0, \
+ ("elapsed time (%.2f sec) > 10%% off of expected time (%.2f sec)"
+ % (elapsed_time, expected_time)))
def test_setparameters(dsp):
# Two configurations for testing:
@@ -112,11 +114,11 @@
# setparameters() should be able to set this configuration in
# either strict or non-strict mode.
result = dsp.setparameters(fmt, channels, rate, False)
- assert result == (fmt, channels, rate), \
- "setparameters%r: returned %r" % (config + result)
+ _assert(result == (fmt, channels, rate),
+ "setparameters%r: returned %r" % (config + result))
result = dsp.setparameters(fmt, channels, rate, True)
- assert result == (fmt, channels, rate), \
- "setparameters%r: returned %r" % (config + result)
+ _assert(result == (fmt, channels, rate),
+ "setparameters%r: returned %r" % (config + result))
def test_bad_setparameters(dsp):
@@ -134,8 +136,8 @@
]:
(fmt, channels, rate) = config
result = dsp.setparameters(fmt, channels, rate, False)
- assert result != config, \
- "setparameters: unexpectedly got requested configuration"
+ _assert(result != config,
+ "setparameters: unexpectedly got requested configuration")
try:
result = dsp.setparameters(fmt, channels, rate, True)
@@ -156,6 +158,6 @@
#test_bad_setparameters(dsp)
finally:
dsp.close()
- assert dsp.closed is True, "dsp.closed is not True"
+ _assert(dsp.closed is True, "dsp.closed is not True")
test()
From buildbot at python.org Tue Jul 25 04:59:55 2006
From: buildbot at python.org (buildbot at python.org)
Date: Tue, 25 Jul 2006 02:59:55 +0000
Subject: [Python-checkins] buildbot warnings in MIPS Debian trunk
Message-ID: <20060725025955.44B3E1E4003@bag.python.org>
The Buildbot has detected a new failure of MIPS Debian trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/MIPS%2520Debian%2520trunk/builds/317
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: greg.ward
Build Had Warnings: warnings failed slave lost
sincerely,
-The Buildbot
From buildbot at python.org Tue Jul 25 05:46:04 2006
From: buildbot at python.org (buildbot at python.org)
Date: Tue, 25 Jul 2006 03:46:04 +0000
Subject: [Python-checkins] buildbot warnings in x86 Ubuntu dapper (icc) trunk
Message-ID: <20060725034604.E774B1E4003@bag.python.org>
The Buildbot has detected a new failure of x86 Ubuntu dapper (icc) trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520Ubuntu%2520dapper%2520%2528icc%2529%2520trunk/builds/796
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: greg.ward
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Tue Jul 25 06:07:23 2006
From: python-checkins at python.org (tim.peters)
Date: Tue, 25 Jul 2006 06:07:23 +0200 (CEST)
Subject: [Python-checkins] r50811 - python/trunk/Lib/test/test_sys.py
Message-ID: <20060725040723.422111E4003@bag.python.org>
Author: tim.peters
Date: Tue Jul 25 06:07:22 2006
New Revision: 50811
Modified:
python/trunk/Lib/test/test_sys.py
Log:
current_frames_with_threads(): There's actually no way
to guess /which/ line the spawned thread is in at the time
sys._current_frames() is called: we know it finished
entered_g.set(), but can't know whether the instruction
counter has advanced to the following leave_g.wait().
The latter is overwhelmingly likely, but not guaranteed,
and I see that the "x86 Ubuntu dapper (icc) trunk" buildbot
found it on the other line once. Changed the test so it
passes in either case.
Modified: python/trunk/Lib/test/test_sys.py
==============================================================================
--- python/trunk/Lib/test/test_sys.py (original)
+++ python/trunk/Lib/test/test_sys.py Tue Jul 25 06:07:22 2006
@@ -274,8 +274,9 @@
t.start()
entered_g.wait()
- # At this point, t has finished its entered_g.set(), and is blocked
- # in its leave_g.wait().
+ # At this point, t has finished its entered_g.set(), although it's
+ # impossible to guess whether it's still on that line or has moved on
+ # to its leave_g.wait().
self.assertEqual(len(thread_info), 1)
thread_id = thread_info[0]
@@ -305,7 +306,7 @@
# And the next record must be for g456().
filename, lineno, funcname, sourceline = stack[i+1]
self.assertEqual(funcname, "g456")
- self.assertEqual(sourceline, "leave_g.wait()")
+ self.assert_(sourceline in ["leave_g.wait()", "entered_g.set()"])
# Reap the spawned thread.
leave_g.set()
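As a minimal illustration of the behaviour the comment describes (a sketch with a hypothetical worker function, not the test's own helpers): sys._current_frames() maps each thread id to its topmost frame, and the exact line a freshly spawned thread has reached cannot be relied upon:

    import sys, threading, time

    def worker(evt):
        evt.wait()                      # the spawned thread normally parks here

    evt = threading.Event()
    t = threading.Thread(target=worker, args=(evt,))
    t.start()
    time.sleep(0.5)                     # usually long enough, but not guaranteed

    # Walk each thread's stack to find the worker() frame and report its line.
    for tid, frame in sys._current_frames().items():
        while frame is not None:
            if frame.f_code.co_name == 'worker':
                print 'thread %d is in worker() at line %d' % (tid, frame.f_lineno)
            frame = frame.f_back

    evt.set()
    t.join()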
From python-checkins at python.org Tue Jul 25 06:13:13 2006
From: python-checkins at python.org (neal.norwitz)
Date: Tue, 25 Jul 2006 06:13:13 +0200 (CEST)
Subject: [Python-checkins] r50812 - peps/trunk/pep-0356.txt
Message-ID: <20060725041313.A2B781E4003@bag.python.org>
Author: neal.norwitz
Date: Tue Jul 25 06:13:13 2006
New Revision: 50812
Modified:
peps/trunk/pep-0356.txt
Log:
Update with some new bugs. Most are also issues in 2.4, but should be fixed.
Only one bug was fixed. :-(
Modified: peps/trunk/pep-0356.txt
==============================================================================
--- peps/trunk/pep-0356.txt (original)
+++ peps/trunk/pep-0356.txt Tue Jul 25 06:13:13 2006
@@ -147,13 +147,20 @@
Open issues
- Bugs that need resolving before release:
- http://python.org/sf/1517042 - Fix crashers/gc_inspection.py
+ http://python.org/sf/1526585 - SystemError concat long strings (2.4)
+ http://python.org/sf/1523610 - PyArg_ParseTupleAndKeywords potential
+ core dump (2.4)
+ http://python.org/sf/1521947 - mystrtol.c fails with gcc 4.1 (2.4?)
+ test_compile.test_unary_minus
+ http://python.org/sf/1519025 - socket timeout crash when receive
+ signal (2.4)
+ http://python.org/sf/1517042 - Fix crashers/gc_inspection.py (2.4)
http://python.org/sf/1515471 - stringobject (char buffers)
http://python.org/sf/1513611 - XML: xml.sax.expatreader missing
http://python.org/sf/1511497 - XML: xml.sax.ParseException issue
- http://python.org/sf/1508010 - msvccompiler using VC6
http://python.org/sf/1475523 - gettext.py bug
http://python.org/sf/1333982 - AST
+ http://python.org/sf/1191458 - AST (test_trace issue mentioned below)
- AST compiler problems
(Owner: Jeremy Hylton)
From python-checkins at python.org Tue Jul 25 06:15:22 2006
From: python-checkins at python.org (neal.norwitz)
Date: Tue, 25 Jul 2006 06:15:22 +0200 (CEST)
Subject: [Python-checkins] r50813 - peps/trunk/pep-0356.txt
Message-ID: <20060725041522.5EEE91E4013@bag.python.org>
Author: neal.norwitz
Date: Tue Jul 25 06:15:22 2006
New Revision: 50813
Modified:
peps/trunk/pep-0356.txt
Log:
Forgot one bug also from 2.4
Modified: peps/trunk/pep-0356.txt
==============================================================================
--- peps/trunk/pep-0356.txt (original)
+++ peps/trunk/pep-0356.txt Tue Jul 25 06:15:22 2006
@@ -159,6 +159,7 @@
http://python.org/sf/1513611 - XML: xml.sax.expatreader missing
http://python.org/sf/1511497 - XML: xml.sax.ParseException issue
http://python.org/sf/1475523 - gettext.py bug
+ http://python.org/sf/1467929 - %-formatting and dicts (2.4)
http://python.org/sf/1333982 - AST
http://python.org/sf/1191458 - AST (test_trace issue mentioned below)
From python-checkins at python.org Tue Jul 25 08:28:19 2006
From: python-checkins at python.org (neal.norwitz)
Date: Tue, 25 Jul 2006 08:28:19 +0200 (CEST)
Subject: [Python-checkins] r50814 - peps/trunk/pep-0356.txt
Message-ID: <20060725062819.831B51E4003@bag.python.org>
Author: neal.norwitz
Date: Tue Jul 25 08:28:18 2006
New Revision: 50814
Modified:
peps/trunk/pep-0356.txt
Log:
Organize the bugs by priority (tentative blockers, pending Anthony's review). Add warning issue.
Modified: peps/trunk/pep-0356.txt
==============================================================================
--- peps/trunk/pep-0356.txt (original)
+++ peps/trunk/pep-0356.txt Tue Jul 25 08:28:18 2006
@@ -143,26 +143,30 @@
- Make everything in Modules/ build cleanly with g++
-
Open issues
- - Bugs that need resolving before release:
- http://python.org/sf/1526585 - SystemError concat long strings (2.4)
- http://python.org/sf/1523610 - PyArg_ParseTupleAndKeywords potential
- core dump (2.4)
+ - Bugs that need resolving before release, i.e., they block release:
http://python.org/sf/1521947 - mystrtol.c fails with gcc 4.1 (2.4?)
test_compile.test_unary_minus
- http://python.org/sf/1519025 - socket timeout crash when receive
- signal (2.4)
- http://python.org/sf/1517042 - Fix crashers/gc_inspection.py (2.4)
http://python.org/sf/1515471 - stringobject (char buffers)
http://python.org/sf/1513611 - XML: xml.sax.expatreader missing
http://python.org/sf/1511497 - XML: xml.sax.ParseException issue
- http://python.org/sf/1475523 - gettext.py bug
- http://python.org/sf/1467929 - %-formatting and dicts (2.4)
+ http://python.org/sf/1519796 says it fixes both XML problems.
http://python.org/sf/1333982 - AST
http://python.org/sf/1191458 - AST (test_trace issue mentioned below)
+ http://mail.python.org/pipermail/python-dev/2006-May/065478.html
+ Add PyErr_WarnEx() to address warnings in test suite.
+
+ - Bugs that ought to be resolved before release (all exist in 2.4):
+ http://python.org/sf/1526585 - SystemError concat long strings
+ http://python.org/sf/1523610 - PyArg_ParseTupleAndKeywords potential
+ core dump
+ http://python.org/sf/1519025 - socket timeout crash when receive signal
+ http://python.org/sf/1517042 - Fix crashers/gc_inspection.py
+ http://python.org/sf/1475523 - gettext.py bug
+ http://python.org/sf/1467929 - %-formatting and dicts
+
- AST compiler problems
(Owner: Jeremy Hylton)
* string constants that are not assigned are in byte code
From neal at metaslash.com Tue Jul 25 11:08:41 2006
From: neal at metaslash.com (Neal Norwitz)
Date: Tue, 25 Jul 2006 05:08:41 -0400
Subject: [Python-checkins] Python Regression Test Failures refleak (1)
Message-ID: <20060725090841.GA21916@python.psfb.org>
test_cmd_line leaked [-17, 17, -17] references
From g.brandl at gmx.net Tue Jul 25 11:52:48 2006
From: g.brandl at gmx.net (Georg Brandl)
Date: Tue, 25 Jul 2006 11:52:48 +0200
Subject: [Python-checkins] r50814 - peps/trunk/pep-0356.txt
In-Reply-To: <20060725062819.831B51E4003@bag.python.org>
References: <20060725062819.831B51E4003@bag.python.org>
Message-ID:
neal.norwitz wrote:
> + http://python.org/sf/1523610 - PyArg_ParseTupleAndKeywords potential
> + core dump
With respect to this bug (which is about stack issues in Python/getargs.c
involving misuse of the "levels" array), I think that we can drop the
"levels" thing completely. It's only there to tell the user which exact item
passed as part of a tuple argument cannot be accepted (and only if that
function is implemented in C code). As tuple arguments
are very rare, "argument x" should be enough to tell the user that
something's wrong with that tuple.
Georg
From python-checkins at python.org Tue Jul 25 11:53:13 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Tue, 25 Jul 2006 11:53:13 +0200 (CEST)
Subject: [Python-checkins] r50815 - in python/trunk:
Lib/idlelib/CallTipWindow.py Misc/NEWS
Message-ID: <20060725095313.59AA41E4003@bag.python.org>
Author: martin.v.loewis
Date: Tue Jul 25 11:53:12 2006
New Revision: 50815
Modified:
python/trunk/Lib/idlelib/CallTipWindow.py
python/trunk/Misc/NEWS
Log:
Bug #1525817: Don't truncate short lines in IDLE's tool tips.
Modified: python/trunk/Lib/idlelib/CallTipWindow.py
==============================================================================
--- python/trunk/Lib/idlelib/CallTipWindow.py (original)
+++ python/trunk/Lib/idlelib/CallTipWindow.py Tue Jul 25 11:53:12 2006
@@ -49,7 +49,11 @@
"""
# truncate overly long calltip
if len(text) >= 79:
- text = text[:75] + ' ...'
+ textlines = text.splitlines()
+ for i, line in enumerate(textlines):
+ if len(line) > 79:
+ textlines[i] = line[:75] + ' ...'
+ text = '\n'.join(textlines)
self.text = text
if self.tipwindow or not self.text:
return
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Tue Jul 25 11:53:12 2006
@@ -39,6 +39,8 @@
Library
-------
+- Bug #1525817: Don't truncate short lines in IDLE's tool tips.
+
- Patch #1515343: Fix printing of deprecated string exceptions with a
value in the traceback module.
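The change is easy to reproduce outside IDLE; a standalone sketch (hypothetical helper name, mirroring the CallTipWindow logic above) of the new per-line truncation, which leaves short lines of a multi-line calltip intact:

    def truncate_calltip(text):
        # Only individual lines longer than 79 characters are shortened;
        # previously the whole calltip was cut at 75 characters.
        if len(text) >= 79:
            textlines = text.splitlines()
            for i, line in enumerate(textlines):
                if len(line) > 79:
                    textlines[i] = line[:75] + ' ...'
            text = '\n'.join(textlines)
        return text

    print truncate_calltip('short signature line\n' + 'x' * 120)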
From python-checkins at python.org Tue Jul 25 12:05:48 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Tue, 25 Jul 2006 12:05:48 +0200 (CEST)
Subject: [Python-checkins] r50816 - python/trunk/Misc/NEWS
Message-ID: <20060725100548.5B2C11E4003@bag.python.org>
Author: martin.v.loewis
Date: Tue Jul 25 12:05:47 2006
New Revision: 50816
Modified:
python/trunk/Misc/NEWS
Log:
Bug #978833: Really close underlying socket in _socketobject.close.
Will backport to 2.4.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Tue Jul 25 12:05:47 2006
@@ -39,6 +39,8 @@
Library
-------
+- Bug #978833: Really close underlying socket in _socketobject.close.
+
- Bug #1525817: Don't truncate short lines in IDLE's tool tips.
- Patch #1515343: Fix printing of deprecated string exceptions with a
From python-checkins at python.org Tue Jul 25 12:11:15 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Tue, 25 Jul 2006 12:11:15 +0200 (CEST)
Subject: [Python-checkins] r50817 - python/trunk/Misc/NEWS
Message-ID: <20060725101115.3104C1E4003@bag.python.org>
Author: martin.v.loewis
Date: Tue Jul 25 12:11:14 2006
New Revision: 50817
Modified:
python/trunk/Misc/NEWS
Log:
Revert incomplete checkin.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Tue Jul 25 12:11:14 2006
@@ -39,8 +39,6 @@
Library
-------
-- Bug #978833: Really close underlying socket in _socketobject.close.
-
- Bug #1525817: Don't truncate short lines in IDLE's tool tips.
- Patch #1515343: Fix printing of deprecated string exceptions with a
From python-checkins at python.org Tue Jul 25 12:20:43 2006
From: python-checkins at python.org (georg.brandl)
Date: Tue, 25 Jul 2006 12:20:43 +0200 (CEST)
Subject: [Python-checkins] r50818 - peps/trunk/pep-0356.txt
Message-ID: <20060725102043.6234F1E4003@bag.python.org>
Author: georg.brandl
Date: Tue Jul 25 12:20:43 2006
New Revision: 50818
Modified:
peps/trunk/pep-0356.txt
Log:
Add item: docs of pkgutil changes.
Modified: peps/trunk/pep-0356.txt
==============================================================================
--- peps/trunk/pep-0356.txt (original)
+++ peps/trunk/pep-0356.txt Tue Jul 25 12:20:43 2006
@@ -180,6 +180,9 @@
(Owner: Jeremy Hylton)
http://python.org/sf/1191458
+ - The PEP 302 changes to (at least) pkgutil, runpy and pydoc must be
+ documented.
+
- test_zipfile64 takes too long and too much disk space for
most of the buildbots. How should this be handled?
It is currently disabled.
From python-checkins at python.org Tue Jul 25 12:22:35 2006
From: python-checkins at python.org (georg.brandl)
Date: Tue, 25 Jul 2006 12:22:35 +0200 (CEST)
Subject: [Python-checkins] r50819 - python/trunk/Lib/pkgutil.py
Message-ID: <20060725102235.3B6991E4003@bag.python.org>
Author: georg.brandl
Date: Tue Jul 25 12:22:34 2006
New Revision: 50819
Modified:
python/trunk/Lib/pkgutil.py
Log:
Patch #1525766: correctly pass onerror arg to recursive calls
of pkg.walk_packages. Also improve the docstrings.
Modified: python/trunk/Lib/pkgutil.py
==============================================================================
--- python/trunk/Lib/pkgutil.py (original)
+++ python/trunk/Lib/pkgutil.py Tue Jul 25 12:22:34 2006
@@ -69,7 +69,28 @@
def walk_packages(path=None, prefix='', onerror=None):
- """Yield submodule names+loaders recursively, for path or sys.path"""
+ """Yields (module_loader, name, ispkg) for all modules recursively
+ on path, or, if path is None, all accessible modules.
+
+ 'path' should be either None or a list of paths to look for
+ modules in.
+
+ 'prefix' is a string to output on the front of every module name
+ on output.
+
+ Note that this function must import all *packages* (NOT all
+ modules!) on the given path, in order to access the __path__
+ attribute to find submodules.
+
+ 'onerror' is a function which gets called with one argument (the
+ name of the package which was being imported) if an ImportError
+ occurs trying to import a package. By default the ImportError is
+ caught and ignored.
+
+ Examples:
+ walk_packages() : list all modules python can access
+ walk_packages(ctypes.__path__, ctypes.__name__+'.') : list all submodules of ctypes
+ """
def seen(p, m={}):
if p in m:
@@ -84,19 +105,28 @@
__import__(name)
except ImportError:
if onerror is not None:
- onerror()
+ onerror(name)
else:
path = getattr(sys.modules[name], '__path__', None) or []
# don't traverse path items we've seen before
path = [p for p in path if not seen(p)]
- for item in walk_packages(path, name+'.'):
+ for item in walk_packages(path, name+'.', onerror):
yield item
def iter_modules(path=None, prefix=''):
- """Yield submodule names+loaders for path or sys.path"""
+ """Yields (module_loader, name, ispkg) for all submodules on path,
+ or, if path is None, all top-level modules on sys.path.
+
+ 'path' should be either None or a list of paths to look for
+ modules in.
+
+ 'prefix' is a string to output on the front of every module name
+ on output.
+ """
+
if path is None:
importers = iter_importers()
else:
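A small usage sketch of the documented behaviour (modelled on the docstring's ctypes example), showing the onerror callback that this patch now forwards to the recursive calls:

    import ctypes
    import pkgutil

    def log_failure(name):
        # Called with the package name when importing it raises ImportError;
        # with this patch the callback also fires during the recursive walk.
        print 'could not import %s' % name

    for loader, name, ispkg in pkgutil.walk_packages(ctypes.__path__,
                                                     ctypes.__name__ + '.',
                                                     onerror=log_failure):
        print name, ispkg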
From python-checkins at python.org Tue Jul 25 13:55:32 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Tue, 25 Jul 2006 13:55:32 +0200 (CEST)
Subject: [Python-checkins] r50820 -
sandbox/trunk/seealso/convert-python-faqs.py
Message-ID: <20060725115532.5746E1E4003@bag.python.org>
Author: andrew.kuchling
Date: Tue Jul 25 13:55:31 2006
New Revision: 50820
Modified:
sandbox/trunk/seealso/convert-python-faqs.py
Log:
Fix the quotes in the nav.yml; add page title
Modified: sandbox/trunk/seealso/convert-python-faqs.py
==============================================================================
--- sandbox/trunk/seealso/convert-python-faqs.py (original)
+++ sandbox/trunk/seealso/convert-python-faqs.py Tue Jul 25 13:55:31 2006
@@ -6,7 +6,7 @@
#
# $Id$
-import os, shutil, glob
+import os, shutil, glob, cgi
import htmlload
ET = htmlload.ET
@@ -55,7 +55,7 @@
nav : !sectionnav |
""")
for dirname, title in urls:
- title = yaml_escape(title)
+ title = normalize_whitespace(title)
f.write(' '*6 + title + ' ' + dirname + '\n')
f.close()
@@ -85,6 +85,8 @@
f = open(os.path.join(qdir, 'question.ht'), 'w')
f.write('Title: Unused title\n')
f.write('\n')
+
+ f.write('
%s
\n\n' % cgi.escape(title))
body = root.find('body')
for child in body.getchildren():
s = ET.tostring(child, 'utf-8')
From python-checkins at python.org Tue Jul 25 14:05:09 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Tue, 25 Jul 2006 14:05:09 +0200 (CEST)
Subject: [Python-checkins] r50821 -
sandbox/trunk/seealso/convert-python-faqs.py
Message-ID: <20060725120509.B37031E4019@bag.python.org>
Author: andrew.kuchling
Date: Tue Jul 25 14:05:08 2006
New Revision: 50821
Modified:
sandbox/trunk/seealso/convert-python-faqs.py
Log:
Fix URLs in nav.yml
Modified: sandbox/trunk/seealso/convert-python-faqs.py
==============================================================================
--- sandbox/trunk/seealso/convert-python-faqs.py (original)
+++ sandbox/trunk/seealso/convert-python-faqs.py Tue Jul 25 14:05:08 2006
@@ -140,7 +140,8 @@
urllist = []
for a in entries:
href = a.get("href")
- a.set("href", href.lstrip('/'))
+ href = href.lstrip('/')
+ a.set("href", href)
f.write('
%s
\n' % ET.tostring(a, 'utf-8'))
urllist.append((href, a.text))
From python-checkins at python.org Tue Jul 25 15:07:00 2006
From: python-checkins at python.org (barry.warsaw)
Date: Tue, 25 Jul 2006 15:07:00 +0200 (CEST)
Subject: [Python-checkins] r50822 - in python/branches/release23-maint:
Lib/email/Utils.py Lib/email/__init__.py
Lib/email/test/test_email.py Misc/NEWS
Message-ID: <20060725130700.792D91E4003@bag.python.org>
Author: barry.warsaw
Date: Tue Jul 25 15:06:56 2006
New Revision: 50822
Modified:
python/branches/release23-maint/Lib/email/Utils.py
python/branches/release23-maint/Lib/email/__init__.py
python/branches/release23-maint/Lib/email/test/test_email.py
python/branches/release23-maint/Misc/NEWS
Log:
Back port r50693 and r50754 from the trunk (and 2.4 branch):
decode_rfc2231(): Be more robust against buggy RFC 2231 encodings.
Specifically, instead of raising a ValueError when there is a single
tick in the parameter, simply return the entire string unquoted, with
None for both the charset and the language. Also, if there are more than 2
ticks in the parameter, interpret the first three parts as the standard RFC
2231 parts, then the rest of the parts as the encoded string.
More RFC 2231 improvements for the email 4.0 package. As Mark Sapiro
rightly points out there are really two types of continued headers
defined in this RFC (i.e. "encoded" parameters with the form
"name*0*=" and unencoded parameters with the form "name*0="), but we
were handling them both the same way, and that isn't correct.
This patch should be much more RFC compliant in that only encoded
params are %-decoded and the charset/language information is only
extracted if there are any encoded params in the segments. If there are
no encoded params then the RFC says that there will be no
charset/language parts.
Note however that this will change the return value for
Message.get_param() in some cases. For example, whereas before if you
had all unencoded param continuations you would have still gotten a
3-tuple back from this method (with charset and language == None), you
will now get just a string. I don't believe this is a backward
incompatible change though because the documentation for this method
already indicates that either return value is possible and that you
must do an isinstance(val, tuple) check to discriminate between the
two. (Yeah that API kind of sucks but we can't change /that/ without
breaking code.)
Test cases, some documentation updates, and a NEWS item accompany this
patch.
Original fewer-than-3-parts fix by Tokio Kikuchi.
Resolves SF bug # 1218081.
Also, bump the package version number to 2.5.8 for release.
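The behaviour change is easiest to see with the two kinds of continuation side by side; a minimal sketch modelled on the test cases added below:

    import email

    # Encoded continuations (trailing '*'): %-escapes are decoded and the
    # charset/language prefix is split off, so get_param() returns a 3-tuple.
    m = ("Content-Type: application/x-foo;\n"
         "\tname*0*=\"us-ascii'en-us'Frank's\"; name*1*=\" Document\"\n\n")
    msg = email.message_from_string(m)
    print msg.get_param('name')    # ('us-ascii', 'en-us', "Frank's Document")

    # Unencoded continuations (no trailing '*'): the segments are simply
    # concatenated and returned as a plain string, with no charset/language.
    m = ("Content-Type: application/x-foo;\n"
         "\tname*0=\"Frank's\"; name*1=\" Document\"\n\n")
    msg = email.message_from_string(m)
    print msg.get_param('name')    # "Frank's Document"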
Modified: python/branches/release23-maint/Lib/email/Utils.py
==============================================================================
--- python/branches/release23-maint/Lib/email/Utils.py (original)
+++ python/branches/release23-maint/Lib/email/Utils.py Tue Jul 25 15:06:56 2006
@@ -1,14 +1,15 @@
-# Copyright (C) 2001,2002 Python Software Foundation
-# Author: barry at zope.com (Barry Warsaw)
+# Copyright (C) 2001-2006 Python Software Foundation
+# Author: Barry Warsaw
+# Contact: email-sig at python.org
-"""Miscellaneous utilities.
-"""
+"""Miscellaneous utilities."""
import time
import socket
import re
import random
import os
+import urllib
import warnings
from cStringIO import StringIO
from types import ListType
@@ -53,6 +54,7 @@
EMPTYSTRING = ''
UEMPTYSTRING = u''
CRLF = '\r\n'
+TICK = "'"
specialsre = re.compile(r'[][\\()<>@,:;".]')
escapesre = re.compile(r'[][\\()"]')
@@ -277,12 +279,14 @@
# RFC2231-related functions - parameter encoding and decoding
def decode_rfc2231(s):
"""Decode string according to RFC 2231"""
- import urllib
- parts = s.split("'", 2)
- if len(parts) == 1:
+ parts = s.split(TICK, 2)
+ if len(parts) <= 2:
return None, None, urllib.unquote(s)
- charset, language, s = parts
- return charset, language, urllib.unquote(s)
+ if len(parts) > 3:
+ charset, language = pars[:2]
+ s = TICK.join(parts[2:])
+ return charset, language, s
+ return parts
def encode_rfc2231(s, charset=None, language=None):
@@ -306,35 +310,52 @@
def decode_params(params):
"""Decode parameters list according to RFC 2231.
- params is a sequence of 2-tuples containing (content type, string value).
+ params is a sequence of 2-tuples containing (param name, string value).
"""
+ # Copy params so we don't mess with the original
+ params = params[:]
new_params = []
- # maps parameter's name to a list of continuations
+ # Map parameter's name to a list of continuations. The values are a
+ # 3-tuple of the continuation number, the string value, and a flag
+ # specifying whether a particular segment is %-encoded.
rfc2231_params = {}
- # params is a sequence of 2-tuples containing (content_type, string value)
- name, value = params[0]
+ name, value = params.pop(0)
new_params.append((name, value))
- # Cycle through each of the rest of the parameters.
- for name, value in params[1:]:
+ while params:
+ name, value = params.pop(0)
+ if name.endswith('*'):
+ encoded = True
+ else:
+ encoded = False
value = unquote(value)
mo = rfc2231_continuation.match(name)
if mo:
name, num = mo.group('name', 'num')
if num is not None:
num = int(num)
- rfc2231_param1 = rfc2231_params.setdefault(name, [])
- rfc2231_param1.append((num, value))
+ rfc2231_params.setdefault(name, []).append((num, value, encoded))
else:
new_params.append((name, '"%s"' % quote(value)))
if rfc2231_params:
for name, continuations in rfc2231_params.items():
value = []
+ extended = False
# Sort by number
continuations.sort()
- # And now append all values in num order
- for num, continuation in continuations:
- value.append(continuation)
- charset, language, value = decode_rfc2231(EMPTYSTRING.join(value))
- new_params.append(
- (name, (charset, language, '"%s"' % quote(value))))
+ # And now append all values in numerical order, converting
+ # %-encodings for the encoded segments. If any of the
+ # continuation names ends in a *, then the entire string, after
+ # decoding segments and concatenating, must have the charset and
+ # language specifiers at the beginning of the string.
+ for num, s, encoded in continuations:
+ if encoded:
+ s = urllib.unquote(s)
+ extended = True
+ value.append(s)
+ value = quote(EMPTYSTRING.join(value))
+ if extended:
+ charset, language, value = decode_rfc2231(value)
+ new_params.append((name, (charset, language, '"%s"' % value)))
+ else:
+ new_params.append((name, '"%s"' % value))
return new_params
Modified: python/branches/release23-maint/Lib/email/__init__.py
==============================================================================
--- python/branches/release23-maint/Lib/email/__init__.py (original)
+++ python/branches/release23-maint/Lib/email/__init__.py Tue Jul 25 15:06:56 2006
@@ -3,7 +3,7 @@
"""A package for parsing, handling, and generating email messages."""
-__version__ = '2.5.7'
+__version__ = '2.5.8'
__all__ = [
'base64MIME',
Modified: python/branches/release23-maint/Lib/email/test/test_email.py
==============================================================================
--- python/branches/release23-maint/Lib/email/test/test_email.py (original)
+++ python/branches/release23-maint/Lib/email/test/test_email.py Tue Jul 25 15:06:56 2006
@@ -2756,14 +2756,17 @@
'''
msg = email.message_from_string(m)
- self.assertEqual(msg.get_param('NAME'),
- (None, None, 'file____C__DOCUMENTS_20AND_20SETTINGS_FABIEN_LOCAL_20SETTINGS_TEMP_nsmail.htm'))
+ param = msg.get_param('NAME')
+ self.failIf(isinstance(param, tuple))
+ self.assertEqual(
+ param,
+ 'file____C__DOCUMENTS_20AND_20SETTINGS_FABIEN_LOCAL_20SETTINGS_TEMP_nsmail.htm')
def test_rfc2231_no_language_or_charset_in_filename(self):
m = '''\
Content-Disposition: inline;
-\tfilename*0="This%20is%20even%20more%20";
-\tfilename*1="%2A%2A%2Afun%2A%2A%2A%20";
+\tfilename*0*="This%20is%20even%20more%20";
+\tfilename*1*="%2A%2A%2Afun%2A%2A%2A%20";
\tfilename*2="is it not.pdf"
'''
@@ -2774,8 +2777,8 @@
def test_rfc2231_no_language_or_charset_in_boundary(self):
m = '''\
Content-Type: multipart/alternative;
-\tboundary*0="This%20is%20even%20more%20";
-\tboundary*1="%2A%2A%2Afun%2A%2A%2A%20";
+\tboundary*0*="This%20is%20even%20more%20";
+\tboundary*1*="%2A%2A%2Afun%2A%2A%2A%20";
\tboundary*2="is it not.pdf"
'''
@@ -2783,12 +2786,38 @@
self.assertEqual(msg.get_boundary(),
'This is even more ***fun*** is it not.pdf')
+ def test_rfc2231_partly_encoded(self):
+ m = '''\
+Content-Disposition: inline;
+\tfilename*0="''This%20is%20even%20more%20";
+\tfilename*1*="%2A%2A%2Afun%2A%2A%2A%20";
+\tfilename*2="is it not.pdf"
+
+'''
+ msg = email.message_from_string(m)
+ self.assertEqual(
+ msg.get_filename(),
+ 'This%20is%20even%20more%20***fun*** is it not.pdf')
+
+ def test_rfc2231_partly_nonencoded(self):
+ m = '''\
+Content-Disposition: inline;
+\tfilename*0="This%20is%20even%20more%20";
+\tfilename*1="%2A%2A%2Afun%2A%2A%2A%20";
+\tfilename*2="is it not.pdf"
+
+'''
+ msg = email.message_from_string(m)
+ self.assertEqual(
+ msg.get_filename(),
+ 'This%20is%20even%20more%20%2A%2A%2Afun%2A%2A%2A%20is it not.pdf')
+
def test_rfc2231_no_language_or_charset_in_charset(self):
# This is a nonsensical charset value, but tests the code anyway
m = '''\
Content-Type: text/plain;
-\tcharset*0="This%20is%20even%20more%20";
-\tcharset*1="%2A%2A%2Afun%2A%2A%2A%20";
+\tcharset*0*="This%20is%20even%20more%20";
+\tcharset*1*="%2A%2A%2Afun%2A%2A%2A%20";
\tcharset*2="is it not.pdf"
'''
@@ -2799,8 +2828,8 @@
def test_rfc2231_bad_encoding_in_filename(self):
m = '''\
Content-Disposition: inline;
-\tfilename*0="bogus'xx'This%20is%20even%20more%20";
-\tfilename*1="%2A%2A%2Afun%2A%2A%2A%20";
+\tfilename*0*="bogus'xx'This%20is%20even%20more%20";
+\tfilename*1*="%2A%2A%2Afun%2A%2A%2A%20";
\tfilename*2="is it not.pdf"
'''
@@ -2831,9 +2860,9 @@
def test_rfc2231_bad_character_in_filename(self):
m = '''\
Content-Disposition: inline;
-\tfilename*0="ascii'xx'This%20is%20even%20more%20";
-\tfilename*1="%2A%2A%2Afun%2A%2A%2A%20";
-\tfilename*2="is it not.pdf%E2"
+\tfilename*0*="ascii'xx'This%20is%20even%20more%20";
+\tfilename*1*="%2A%2A%2Afun%2A%2A%2A%20";
+\tfilename*2*="is it not.pdf%E2"
'''
msg = email.message_from_string(m)
@@ -2841,6 +2870,102 @@
'This is even more ***fun*** is it not.pdf\xe2')
+ def test_rfc2231_unknown_encoding(self):
+ m = """\
+Content-Transfer-Encoding: 8bit
+Content-Disposition: inline; filename*=X-UNKNOWN''myfile.txt
+
+"""
+ msg = email.message_from_string(m)
+ self.assertEqual(msg.get_filename(), 'myfile.txt')
+
+ def test_rfc2231_single_tick_in_filename_extended(self):
+ eq = self.assertEqual
+ m = """\
+Content-Type: application/x-foo;
+\tname*0*=\"Frank's\"; name*1*=\" Document\"
+
+"""
+ msg = email.message_from_string(m)
+ charset, language, s = msg.get_param('name')
+ eq(charset, None)
+ eq(language, None)
+ eq(s, "Frank's Document")
+
+ def test_rfc2231_single_tick_in_filename(self):
+ m = """\
+Content-Type: application/x-foo; name*0=\"Frank's\"; name*1=\" Document\"
+
+"""
+ msg = email.message_from_string(m)
+ param = msg.get_param('name')
+ self.failIf(isinstance(param, tuple))
+ self.assertEqual(param, "Frank's Document")
+
+ def test_rfc2231_tick_attack_extended(self):
+ eq = self.assertEqual
+ m = """\
+Content-Type: application/x-foo;
+\tname*0*=\"us-ascii'en-us'Frank's\"; name*1*=\" Document\"
+
+"""
+ msg = email.message_from_string(m)
+ charset, language, s = msg.get_param('name')
+ eq(charset, 'us-ascii')
+ eq(language, 'en-us')
+ eq(s, "Frank's Document")
+
+ def test_rfc2231_tick_attack(self):
+ m = """\
+Content-Type: application/x-foo;
+\tname*0=\"us-ascii'en-us'Frank's\"; name*1=\" Document\"
+
+"""
+ msg = email.message_from_string(m)
+ param = msg.get_param('name')
+ self.failIf(isinstance(param, tuple))
+ self.assertEqual(param, "us-ascii'en-us'Frank's Document")
+
+ def test_rfc2231_no_extended_values(self):
+ eq = self.assertEqual
+ m = """\
+Content-Type: application/x-foo; name=\"Frank's Document\"
+
+"""
+ msg = email.message_from_string(m)
+ eq(msg.get_param('name'), "Frank's Document")
+
+ def test_rfc2231_encoded_then_unencoded_segments(self):
+ eq = self.assertEqual
+ m = """\
+Content-Type: application/x-foo;
+\tname*0*=\"us-ascii'en-us'My\";
+\tname*1=\" Document\";
+\tname*2*=\" For You\"
+
+"""
+ msg = email.message_from_string(m)
+ charset, language, s = msg.get_param('name')
+ eq(charset, 'us-ascii')
+ eq(language, 'en-us')
+ eq(s, 'My Document For You')
+
+ def test_rfc2231_unencoded_then_encoded_segments(self):
+ eq = self.assertEqual
+ m = """\
+Content-Type: application/x-foo;
+\tname*0=\"us-ascii'en-us'My\";
+\tname*1*=\" Document\";
+\tname*2*=\" For You\"
+
+"""
+ msg = email.message_from_string(m)
+ charset, language, s = msg.get_param('name')
+ eq(charset, 'us-ascii')
+ eq(language, 'en-us')
+ eq(s, 'My Document For You')
+
+
def _testclasses():
mod = sys.modules[__name__]
Modified: python/branches/release23-maint/Misc/NEWS
==============================================================================
--- python/branches/release23-maint/Misc/NEWS (original)
+++ python/branches/release23-maint/Misc/NEWS Tue Jul 25 15:06:56 2006
@@ -28,6 +28,18 @@
Library
-------
+- The email package has improved RFC 2231 support, specifically for
+ recognizing the difference between encoded (name*0*=) and non-encoded
+ (name*0=) parameter continuations. This may change the types of
+ values returned from email.message.Message.get_param() and friends.
+ Specifically in some cases where non-encoded continuations were used,
+ get_param() used to return a 3-tuple of (None, None, string) whereas now it
+ will just return the string (since non-encoded continuations don't have
+ charset and language parts).
+
+ Also, whereas % values were decoded in all parameter continuations, they are
+ now only decoded in encoded parameter parts.
+
- Applied a security fix to SimpleXMLRPCserver (PSF-2005-001). This
disables recursive traversal through instance attributes, which can
be exploited in various ways.
From ncoghlan at gmail.com Tue Jul 25 15:24:41 2006
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 25 Jul 2006 23:24:41 +1000
Subject: [Python-checkins] r50818 - peps/trunk/pep-0356.txt
In-Reply-To: <20060725102043.6234F1E4003@bag.python.org>
References: <20060725102043.6234F1E4003@bag.python.org>
Message-ID: <44C61B99.6040208@gmail.com>
georg.brandl wrote:
> Author: georg.brandl
> Date: Tue Jul 25 12:20:43 2006
> New Revision: 50818
>
> Modified:
> peps/trunk/pep-0356.txt
> Log:
> Add item: docs of pkgutil changes.
I'm not sure about the intent of this one. What did you feel needs documenting?
Many of the changes to pkgutil were a matter of consolidating two different
PEP 302 compliant emulations of the standard import machinery (one in runpy,
one in test_import) into a single copy.
Leaving this part undocumented makes sense to me, as it is really only
intended for standard library internal use - we don't currently want to
promise that pkgutil will continue to export pkgutil.getloader() after the
real imp.getloader() is implemented.
runpy.run_module definitely has some corner cases where it can't set certain
special variables correctly (e.g. it doesn't know how to set __file__
correctly when running a module from a zip file), but the runpy.run_module
docs state explicitly that __file__ may sometimes be set to None instead of
the proper string value.
The issues with relative imports are a fair bit more esoteric, which is why I
relegated them to a section in PEP 338 for the time being.
For the pydoc changes, I thought those were simply a matter of making pydoc
work as advertised for modules in more locations - the public interface didn't
actually change, did it? (PJE is no doubt in a better position to comment on
that one)
Cheers,
Nick.
--
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia
---------------------------------------------------------------
http://www.boredomandlaziness.org
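For context, a minimal sketch of the runpy.run_module() behaviour under discussion: the returned globals show which special variables were set, and __file__ may legitimately be None (e.g. when running from a zip file), as noted above:

    import runpy

    # Execute a stdlib module in a fresh namespace and inspect the special
    # variables runpy provided for it.
    globs = runpy.run_module('textwrap')
    print globs['__name__']        # 'textwrap'
    print globs.get('__file__')    # a filesystem path here; may be None elsewhere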
From python-checkins at python.org Tue Jul 25 16:10:37 2006
From: python-checkins at python.org (barry.warsaw)
Date: Tue, 25 Jul 2006 16:10:37 +0200 (CEST)
Subject: [Python-checkins] r50823 - sandbox/trunk/emailpkg/tags
Message-ID: <20060725141037.9B0B81E4013@bag.python.org>
Author: barry.warsaw
Date: Tue Jul 25 16:10:37 2006
New Revision: 50823
Added:
sandbox/trunk/emailpkg/tags/
Log:
a directory to tag email package releases
From g.brandl at gmx.net Tue Jul 25 16:13:15 2006
From: g.brandl at gmx.net (Georg Brandl)
Date: Tue, 25 Jul 2006 16:13:15 +0200
Subject: [Python-checkins] r50818 - peps/trunk/pep-0356.txt
In-Reply-To: <44C61B99.6040208@gmail.com>
References: <20060725102043.6234F1E4003@bag.python.org>
<44C61B99.6040208@gmail.com>
Message-ID:
Nick Coghlan wrote:
> georg.brandl wrote:
>> Author: georg.brandl
>> Date: Tue Jul 25 12:20:43 2006
>> New Revision: 50818
>>
>> Modified:
>> peps/trunk/pep-0356.txt
>> Log:
>> Add item: docs of pkgutil changes.
>
> I'm not sure about the intent of this one. What did you feel needs documenting?
Well, I saw that pkgutil.py is full of new non-private functions, while the docs
state that it "exports just one function".
PJE himself writes in a checkin msg:
"""
Tasks remaining: write docs and Misc/NEWS for pkgutil/pydoc changes,
and update setuptools to use pkgutil wherever possible, then add it
to the stdlib.
"""
Georg
From python-checkins at python.org Tue Jul 25 19:11:14 2006
From: python-checkins at python.org (matt.fleming)
Date: Tue, 25 Jul 2006 19:11:14 +0200 (CEST)
Subject: [Python-checkins] r50824 - in sandbox/trunk/pdb:
Doc/lib/libmpdb.tex mpdb.py test/Makefile test/files/proc.py
test/support.py test/test_mconnection.py test/test_mpdb.py
test/test_mthread.py test/test_process.py
Message-ID: <20060725171114.653FA1E4006@bag.python.org>
Author: matt.fleming
Date: Tue Jul 25 19:11:12 2006
New Revision: 50824
Added:
sandbox/trunk/pdb/test/support.py
sandbox/trunk/pdb/test/test_process.py
Modified:
sandbox/trunk/pdb/Doc/lib/libmpdb.tex
sandbox/trunk/pdb/mpdb.py
sandbox/trunk/pdb/test/Makefile
sandbox/trunk/pdb/test/files/proc.py
sandbox/trunk/pdb/test/test_mconnection.py
sandbox/trunk/pdb/test/test_mpdb.py
sandbox/trunk/pdb/test/test_mthread.py
Log:
Change the way we do tests: instead of including docstrings, let the name
of the class and test be the description. Also move some common code into
a 'support' file. Documentation update and unit tests for debugging already
running programs. Also added an '-e' option to mpdb to allow executing
commands specified on the cmdline.
Modified: sandbox/trunk/pdb/Doc/lib/libmpdb.tex
==============================================================================
--- sandbox/trunk/pdb/Doc/lib/libmpdb.tex (original)
+++ sandbox/trunk/pdb/Doc/lib/libmpdb.tex Tue Jul 25 19:11:12 2006
@@ -1149,11 +1149,44 @@
this session all commands come from the client and are executed on the
pdbserver.
-\section{External Process Debugging}
+\section{Debugging an already running process}
\label{proc-debug}
This section describes how \module{mpdb} debugs processes that are external
-to the process in which \module{mpdb} is being run.
+to the process in which \module{mpdb} is being run. A user can call the
+\code{mpdb.process_debugging} function inside their code to enable a debugger
+running as a separate process to debug their program. The way this works
+is that \module{mpdb} sets the handler for a specified signal to
+\module{mpdb}'s own signal handler. If there was a signal handler for this signal
+previously, it is saved. When this signal handler traps the signal it
+starts a \code{pdbserver} that a debugger can connect to. Once the debugger
+has 'detached' from the \code{pdbserver}, \module{mpdb}'s signal handler
+is removed and the old signal handler (if any) is restored.
+
+\begin{funcdesc}{process_debugging}{sig, protocol, addr}
+This function enables debugging by another process by setting the signal
+handler for signal \code{sig} to \module{mpdb}'s signal handler. When a client
+wishes to connect to this process, it uses the \code{attach} command which
+sends \module{mpdb}'s \code{debug_signal} to a process. It is up to the user
+of \module{mpdb} to ensure that the program they are debugging and the
+signal sent during the \code{attach} command are the same, \ref{command::set:}.
+When the signal is received by \module{mpdb}'s signal handler a \code{pdbserver}
+is started on \code{addr} using protocol \code{protocol}.
+\code{sig}, \code{protocol} and \code{addr} are all optional arguments that
+specify the signal to use for debugging, the protocol to use for \code{pdbserver}
+and the address for the \code{pdbserver}, respectively.
+
+
+This function returns a string which is the address used for the
+\code{pdbserver}. If an address is not specified, it defaults to
+a path consisting of the system's temporary
+directory, as returned by \code{tempfile.gettempdir()}, and a filename
+made up from the process's pid and the word 'mpdb', e.g.
+\code{'/tmp/22987mpdb'}. If \code{protocol} is not specified a FIFO is used.
+The default debug signal is \code{SIGUSR1}.
+\end{funcdesc}
+
+\subsection{Example}
If a program wishes to allow debugging from another process it must import
and call the \code{process_debugging} function from the \module{mpdb} module.
This function sets up a signal handler for \module{mpdb}'s debugging signal
@@ -1168,7 +1201,9 @@
\end{verbatim}
From the debugger console a user must issue the 'attach' command, the
-\code{target_addr} variable must be set \ref{command::set}.
+\code{target_addr} variable must be set \ref{command::set}. The user can
+also select a signal to send to the process by calling the
+\code{set debug-signal} command, \ref{command::set}.
\begin{verbatim}
(MPdb) set target [protcol] [address]
@@ -1181,4 +1216,3 @@
specify the hostname, as it can only be 'localhost'.
-
Modified: sandbox/trunk/pdb/mpdb.py
==============================================================================
--- sandbox/trunk/pdb/mpdb.py (original)
+++ sandbox/trunk/pdb/mpdb.py Tue Jul 25 19:11:12 2006
@@ -54,10 +54,13 @@
self.lastcmd = ''
self.connection = None
self.debugger_name = 'mpdb'
- self._show_cmds.append('target-address')
- self._show_cmds.sort()
- self._info_cmds.append('target')
- self._info_cmds.sort()
+
+ self.setcmds.add('debug-signal', self.set_debug_signal)
+ self.setcmds.add('target-address', self.set_target_address)
+ self.showcmds.add('debug-signal', self.show_debug_signal)
+ self.showcmds.add('target-address', self.show_target_address)
+ self.infocmds.add('target', self.info_target)
+
self.target_addr = "" # target address used by 'attach'
self.debug_signal = None # The signal used by 'attach'
@@ -92,6 +95,9 @@
self.onecmd = lambda x: pydb.Pdb.onecmd(self, x)
self.do_rquit(None)
return
+ if 'detach'.startswith(line):
+ self.connection.write('rdetach')
+ self.do_detach(None)
self.connection.write(line)
ret = self.connection.readline()
if ret == '':
@@ -139,63 +145,41 @@
self.prompt = self.local_prompt
self.onecmd = lambda x: pydb.Pdb.onecmd(self, x)
- def do_info(self, arg):
- """Extends pydb do_info() to give info about the Mpdb extensions."""
- if not arg:
- pydb.Pdb.do_info(self, arg)
- return
-
- args = arg.split()
- if 'target'.startswith(args[0]) and len(args[0]) > 2:
- self.msg("target is %s" % self.target)
- else:
- pydb.Pdb.do_info(self, arg)
-
- def info_helper(self, cmd, label=False):
- """Extends pydb info_helper() to give info about a single Mpdb
- info extension."""
- if label:
- self.msg_nocr("info %s --" % cmd)
- if 'target'.startswith(cmd):
- self.msg("Names of targets and files being debugged")
- else:
- pydb.Pdb.info_helper(self, cmd)
-
def help_mpdb(self, *arg):
help()
- def do_set(self, arg):
- """ Extends pydb.do_set() to allow setting of mpdb extensions.
+ def set_debug_signal(self, args):
+ """Set the signal sent to a process to trigger debugging."""
+ try:
+ exec 'from signal import %s' % args[1]
+ except ImportError:
+ self.errmsg('Invalid signal')
+ return
+ self.debug_signal = args[1]
+ self.msg('debug-signal set to: %s' % self.debug_signal)
+
+ def set_target_address(self, args):
+ """Set the address of a target."""
+ self.target_addr = "".join(["%s " % a for a in args[1:]])
+ self.target_addr = self.target_addr.strip()
+ self.msg('target address set to %s' % self.target_addr)
+
+ def show_debug_signal(self, arg):
+ """Show the signal currently used for triggering debugging
+ of an already running process.
"""
- if not arg:
- pydb.Pdb.do_set(self, arg)
+ if not self.debug_signal:
+ self.msg('debug-signal not set.')
return
+ self.msg('debug-signal is %s' % self.debug_signal)
- args = arg.split()
- if 'debug-signal'.startswith(args[0]):
- self.debug_signal = args[1]
- self.msg('debug-signal set to: %s' % self.debug_signal)
- elif 'target-address'.startswith(args[0]):
- self.target_addr = "".join(["%s " % a for a in args[1:]])
- self.target_addr = self.target_addr.strip()
- self.msg('target address set to: %s' % self.target_addr)
-
- def do_show(self, arg):
- """Extends pydb.do_show() to show Mpdb extension settings. """
- if not arg:
- pydb.Pdb.do_show(self, arg)
- return
+ def show_target_address(self, arg):
+ """Show the address of the current target."""
+ self.msg('target-address is %s.' % self.target_addr.__repr__())
- args = arg.split()
- if 'debug-signal'.startswith(args[0]):
- if not self.debug_signal:
- self.msg('debug-signal is not set.')
- else:
- self.msg('debug-signal is %s.' % self.debug_signal)
- elif 'target-address'.startswith(args[0]):
- self.msg('target address is %s.' % self.target_addr.__repr__())
- else:
- pydb.Pdb.do_show(self, arg)
+ def info_target(self, args):
+ """Display information about the current target."""
+ self.msg('target is %s' % self.target)
# Debugger commands
def do_attach(self, addr):
@@ -222,21 +206,17 @@
if not self.debug_signal:
from signal import SIGUSR1
self.debug_signal = SIGUSR1
- else:
- # Because self.debug_signal may be a string
- self.debug_signal = eval(self.debug_signal)
try:
os.kill(pid, self.debug_signal)
except OSError, err:
self.errmsg(err)
return
- # Will remove this
- time.sleep(3.0)
-
- # At the moment by default we'll use named pipes for communication
+ # XXX this still needs removing
+ time.sleep(1.0)
+
self.do_target(self.target_addr)
-
+
def do_target(self, args):
""" Connect to a target machine or process.
The first argument is the type or protocol of the target machine
@@ -328,7 +308,7 @@
If a process, it is no longer traced, and it continues its execution. If
you were debugging a file, the file is closed and Pdb no longer accesses it.
"""
- pass
+ raise KeyboardInterrupt
def do_pdbserver(self, args):
""" Allow a debugger to connect to this session.
@@ -409,6 +389,7 @@
self._rebind_input(self.orig_stdin)
self._disconnect()
self.target = 'local'
+ sys.settrace(None)
self.do_quit(None)
def do_restart(self, arg):
@@ -427,6 +408,17 @@
else:
self.msg("Re exec'ing\n\t%s" % self._sys_argv)
os.execvp(self._sys_argv[0], self._sys_argv)
+
+ def do_rdetach(self, arg):
+ """ The rdetach command is performed on the pdbserver, it cleans
+ things up when the client has detached from this process.
+ Control returns to the file being debugged and execution of that
+ file continues.
+ """
+ self._rebind_input(self.orig_stdin)
+ self._rebind_output(self.orig_stdout)
+
+ self.cmdqueue.append('continue') # Continue execution
def pdbserver(addr, m):
""" This method sets up a pdbserver debugger that allows debuggers
@@ -472,7 +464,6 @@
def thread_debugging(m):
""" Setup this debugger to handle threaded applications."""
- sys.path.append(os.path.dirname(m._sys_argv[1]))
import mthread
mthread.init(m)
while True:
@@ -492,7 +483,8 @@
""" Allow debugging of other processes. This routine should
be imported and called near the top of the program file.
It sets up signal handlers that are used to create a pdbserver
- that a debugging client can attach to.
+ that a debugging client can attach to. The address of the pdbserver
+ is returned a string.
The optional argument 'sig', specifies which signal will be
used for running process debugging. If 'sig' is not specified
@@ -521,13 +513,12 @@
proto = 'mconnection.MConnectionServerFIFO'
if addr is not None:
- pdbserver_addr = addr
+ pdbserver_addr = proto + " " + addr
else:
- # XXX I don't think a successful symlink attack can be made here,
- # because pdbserver bails if the file for a FIFO already exists.
- tmp = os.tempnam(None,'mpdb') # use 'mpdb' as a prefix
+ from tempfile import gettempdir
+ tmp = gettempdir() + "/" + str(os.getpid()) + "mpdb"
pdbserver_addr = proto + " " + tmp
- print pdbserver_addr
+ return pdbserver_addr
def signal_handler(signum, frame):
""" This signal handler replaces the programs signal handler
@@ -545,13 +536,10 @@
del frame.f_globals['mpdb']
m.do_pdbserver(pdbserver_addr)
+ m.set_trace(frame)
- try:
- m.set_trace(m.curframe)
- finally:
- m.do_quit(None)
- import signal
- signal.signal(signum, old_handler)
+ import signal
+ signal.signal(signum, old_handler)
def main():
@@ -571,7 +559,9 @@
+ " 'protocol address scriptname'."),
make_option("-d", "--debug-thread", action="store_true",
help="Turn on thread debugging."),
- make_option("--pid", dest="pid", help="Attach to running process PID.")
+ make_option("--pid", dest="pid", help="Attach to running process PID."),
+ make_option("-e", "--exec", dest="commands",
+ help="Specify commands to execute.")
]
opts = process_options(mpdb, "mpdb", os.path.basename(sys.argv[0])
@@ -594,6 +584,10 @@
# module search path.
sys.path[0] = mpdb.main_dirname = os.path.dirname(mpdb.mainpyfile)
+ if opts.commands:
+ cmds = opts.commands.split(',')
+ mpdb.cmdqueue = cmds
+
if opts.target:
target(opts.target, opts, mpdb)
sys.exit()
Modified: sandbox/trunk/pdb/test/Makefile
==============================================================================
--- sandbox/trunk/pdb/test/Makefile (original)
+++ sandbox/trunk/pdb/test/Makefile Tue Jul 25 19:11:12 2006
@@ -8,10 +8,10 @@
PY = python
-.PHONY: all test test_mpdb test_mconnection test_mthread
+.PHONY: all test test_mpdb test_mconnection test_mthread test_process
all: test
-test: test_mpdb test_mconnection test_mthread
+test: test_mpdb test_mconnection test_mthread test_process
test_mpdb:
@$(PY) test_mpdb.py
@@ -21,3 +21,6 @@
test_mthread:
@$(PY) test_mthread.py
+
+test_process:
+ @$(PY) test_process.py
Modified: sandbox/trunk/pdb/test/files/proc.py
==============================================================================
--- sandbox/trunk/pdb/test/files/proc.py (original)
+++ sandbox/trunk/pdb/test/files/proc.py Tue Jul 25 19:11:12 2006
@@ -8,10 +8,14 @@
sys.path.append('../..')
import mpdb
-mpdb.process_debugging()
+mpdb.process_debugging(protocol='tcp', addr=':9000')
+
+try:
+ while True:
+ for i in range(10):
+ x = i
+except KeyboardInterrupt:
+ pass
+
-while True:
- for i in range(10):
- x = i
-
Added: sandbox/trunk/pdb/test/support.py
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/test/support.py Tue Jul 25 19:11:12 2006
@@ -0,0 +1,73 @@
+# This file contains classes that can be imported by any of the unit tests.
+
+import sys ; sys.path.append('..')
+import time
+import threading
+
+class DummyStdout(object):
+ """ This class is a replacement for sys.stdout. All output is
+ stored in this object's 'lines' instance variable.
+ """
+ def __init__(self):
+ self.lines = []
+
+ def flush(self):
+ pass
+
+ def write(self, msg):
+ pass
+
+from mpdb import MPdb
+
+class MPdbTest(MPdb):
+ """ This class provides a version of the MPdb class that is
+ suitable for use in unit testing. All output is captured and
+ stored in this object's 'lines' instance variable.
+ """
+ def __init__(self, cmds=[]):
+ """ The optional argument 'cmds' is a list specifying commands
+ that this instance should interpret in it's 'cmdloop' method.
+ """
+ MPdb.__init__(self)
+ self.lines = []
+ self.cmdqueue = cmds
+ self.botframe = None
+
+ def msg_nocr(self, msg):
+ self.lines.append(msg)
+
+
+class Pdbserver(threading.Thread, MPdb):
+ """ This class provides a fully functional pdbserver that runs
+ in a separate thread. The optional argument 'addr' specifies a
+ protocol and an address to use for pdbserver.
+ """
+ def __init__(self, addr=None):
+ MPdb.__init__(self)
+ threading.Thread.__init__(self)
+ self._sys_argv = ['python', '-c', '"pass"']
+ self.botframe = None
+
+ if not addr:
+ self.addr = 'tcp localhost:8000'
+
+ def run(self):
+ self.do_pdbserver(self.addr)
+ while True:
+ self.cmdloop()
+ if self._user_requested_quit:
+ break
+
+class MPdbTestThread(threading.Thread, MPdbTest):
+ """ This class provides a MPdbTest object that runs in a separate
+ thread.
+ """
+ def __init__(self, cmds=[]):
+ threading.Thread.__init__(self)
+ MPdbTest.__init__(self, cmds)
+
+
+
+
+
+
Modified: sandbox/trunk/pdb/test/test_mconnection.py
==============================================================================
--- sandbox/trunk/pdb/test/test_mconnection.py (original)
+++ sandbox/trunk/pdb/test/test_mconnection.py Tue Jul 25 19:11:12 2006
@@ -6,6 +6,7 @@
import os
import sys
import socket
+import time
import thread
import unittest
@@ -39,14 +40,12 @@
self.client = MConnectionClientTCP()
def testClientConnectToServer(self):
- """(tcp) Connect client to server. """
thread.start_new_thread(repeatedConnect, (self.client, __addr__))
self.server.connect(__addr__)
self.server.disconnect()
def testClientConnectAndRead(self):
- """(tcp) Connect to server and read/write. """
thread.start_new_thread(repeatedConnect, (self.client,__addr__))
self.server.connect(__addr__)
@@ -58,14 +57,12 @@
self.assertEqual('success\n', line, 'Could not read from client')
def testDisconnectDisconnected(self):
- """(tcp) Disconnect a disconnected session. """
s = MConnectionServerTCP()
s.disconnect()
s.disconnect()
def testReadline(self):
- """(tcp) Make sure readline method works. """
thread.start_new_thread(repeatedConnect, (self.client,__addr__))
self.server.connect(__addr__)
@@ -76,7 +73,6 @@
self.server.disconnect()
def testErrorAddressAlreadyInUse(self):
- """(tcp) Test address already in use error. """
thread.start_new_thread(repeatedConnect, (self.client, __addr__))
self.server.connect(__addr__)
@@ -85,31 +81,27 @@
self.assertRaises(ConnectionFailed, s.connect, __addr__, False)
def testInvalidServerAddress(self):
- """(tcp) Connect to an invalid hostname. """
addr = 'fff.209320909xcmnm2iu3-=0-0-z.,x.,091209:2990'
self.assertRaises(ConnectionFailed, self.server.connect, addr)
def testConnectionRefused(self):
- """(tcp) Test connection refused error. """
self.assertRaises(ConnectionFailed, self.client.connect, __addr__)
def testInvalidAddressPortPair(self):
- """(tcp) Test invald hostname, port pair. """
addr = 'localhost 8000'
self.assertRaises(ConnectionFailed, self.server.connect, addr)
def testServerReadError(self):
- """(tcp) Test the ReadError exception."""
thread.start_new_thread(self.server.connect, (__addr__,))
while not self.server._sock:
- pass
+ time.sleep(0.1)
repeatedConnect(self.client, __addr__)
# Wait to make _absolutely_ sure that the client has connected
while not self.server.output:
- pass
+ time.sleep(0.1)
self.client.disconnect()
self.assertRaises(ReadError, self.server.readline)
@@ -138,12 +130,10 @@
self.client.connect(TESTFN)
def testClientToServerConnect(self):
- """(serial) Connect client to server. """
self.client.disconnect()
self.server.disconnect()
def testClientWriteRead(self):
- """(serial) Connect client to server and read/write. """
self.client.write('success!')
line = self.server.readline()
self.assertEquals('success!\n', line, 'Could not read from client.')
@@ -157,11 +147,9 @@
self.assertEquals('great!\n', line, 'Could not read from server.')
def testDisconnectDisconnected(self):
- """(serial) Disconnect a disconnected session. """
self.server.disconnect()
def testReadline(self):
- """(serial) Make sure readline method works. """
self.client.write('success!\nNext line.')
self.client.disconnect()
line = self.server.readline()
@@ -172,7 +160,6 @@
self.assertEquals('', line, 'Could not read third line.')
def testInvalidFilename(self):
- """(serial) Connect to an invalid server. """
client = MConnectionSerial()
self.assertRaises(ConnectionFailed, client.connect,
'/dev/pleasepleasepleasedontexit')
@@ -188,12 +175,10 @@
self.client = MConnectionClientFIFO()
def testConnect(self):
- """(FIFO) Connect a client to a server. """
thread.start_new_thread(self.client.connect, ('test_file',))
self.server.connect('test_file')
def testReadWrite(self):
- """(FIFO) Test reading and writing to and from server/client."""
thread.start_new_thread(self.client.connect, ('test_file',))
self.server.connect('test_file')
@@ -212,12 +197,10 @@
self.assertEquals('received\n', line)
def testMultipleDisconnect(self):
- """(FIFO) Disconnect disconnected connections."""
self.client.disconnect()
self.server.disconnect()
def testReadError(self):
- """(FIFO) Test ReadError."""
thread.start_new_thread(self.client.connect, ('test_file',))
self.server.connect('test_file')
@@ -233,7 +216,6 @@
self.assertRaises(ReadError, self.client.readline)
def testWriteError(self):
- """(FIFO) Test WriteError."""
thread.start_new_thread(self.client.connect, ('test_file',))
self.server.connect('test_file')
@@ -249,7 +231,6 @@
self.assertRaises(WriteError, self.client.write, 'Ni!\n')
def testInvalidPipe(self):
- """(FIFO) Connect to an invalid named pipe."""
self.assertRaises(ConnectionFailed,self.client.connect, 'invalid')
os.unlink('invalid0')
Modified: sandbox/trunk/pdb/test/test_mpdb.py
==============================================================================
--- sandbox/trunk/pdb/test/test_mpdb.py (original)
+++ sandbox/trunk/pdb/test/test_mpdb.py Tue Jul 25 19:11:12 2006
@@ -19,7 +19,7 @@
from mpdb import MPdb, pdbserver, target
from mconnection import (MConnectionClientTCP, MConnectionServerTCP,
ConnectionFailed)
-
+from support import MPdbTest, Pdbserver
TESTFN = 'tester'
# This provides us with a fine-grain way of connecting to a server
@@ -39,32 +39,6 @@
else: raise ConnectionFailed, e
break
-class MPdbTest(MPdb):
- def __init__(self):
- MPdb.__init__(self)
- self.lines = []
- self.botframe = None
-
- def msg_nocr(self, msg):
- self.lines.append(msg)
-
-class Pdbserver(threading.Thread, MPdb):
- def __init__(self):
- MPdb.__init__(self)
- threading.Thread.__init__(self)
- self.botframe = None
- self._sys_argv = ['python', '-c', '"pass"']
-
-
- def run(self):
- self.do_pdbserver('tcp localhost:8000')
- while True:
- try:
- self.cmdloop()
- except Exit:
- break
-
-
class TestRemoteDebugging(unittest.TestCase):
""" Test Case to make sure debugging remotely works properly. """
def tearDown(self):
@@ -77,7 +51,6 @@
# and vice versa.
def testPdbserver(self):
- """ Test the pdbserver. """
client = MPdbTest()
thread.start_new_thread(connect_to_target, (client,))
@@ -92,7 +65,6 @@
self.assertEquals('*** Unknown protocol\n', line)
def testTarget(self):
- """ Test the target command. """
server = MConnectionServerTCP()
thread.start_new_thread(server.connect, (__addr__,True))
@@ -127,7 +99,6 @@
else: break
def testRebindOutput(self):
- """ Test rebinding output. """
self.server = MPdb()
f = open(TESTFN, 'w+')
self.server._rebind_output(f)
@@ -140,7 +111,6 @@
self.assertEquals('some text\n', line, 'Could not rebind output')
def testRebindInput(self):
- """ Test rebinding input. """
self.server = MPdb()
f = open(TESTFN, 'w+')
@@ -155,7 +125,6 @@
self.assertEquals(line, 'help', 'Could not rebind input.')
def testRestart(self):
- """ Test the restart command. """
server = Pdbserver()
server.start()
@@ -175,8 +144,71 @@
if server.connection != None: pass
else: break
+
+class TestMpdbDoc(unittest.TestCase):
+ """ Test the expected output from help commands against actual
+ output to ensure that documentation constantly stays the same.
+ """
+ def setUp(self):
+ self.m = MPdbTest()
+
+ def tearDown(self):
+ del self.m
+
+ def testHelpInfo(self):
+ self.m.onecmd('help info')
+
+ exp = ['Generic command for showing things about the program being debugged.\n',
+ '\nList of info subcommands:\n\n',
+ 'info args --', 'Argument variables of current stack frame\n',
+ 'info breakpoints --', 'Status of user-settable breakpoints\n',
+ 'info display --', 'Expressions to display when program stops, with code numbers\n',
+ 'info globals --', 'Global variables of current stack frame\n',
+ 'info line --', 'Current line number in source file\n',
+ 'info locals --', 'Local variables of current stack frame\n',
+ 'info program --', 'Execution status of the program\n',
+ 'info source --', 'Information about the current Python file\n',
+ 'info target --', 'Display information about the current target.\n']
+ self.assertEquals(self.m.lines, exp)
+
+
+ def testInfoCmd(self):
+ self.m.reset()
+ self.m.onecmd('info')
+ exp = ['args: ', 'No stack.\n',
+ 'breakpoints: ', 'No breakpoints.\n',
+ 'display: ', 'There are no auto-display expressions now.\n',
+ 'globals: ', 'No frame selected.\n',
+ 'line: ', 'No line number information available.\n',
+ 'locals: ', 'No frame selected.\n',
+ 'program: ', 'The program being debugged is not being run.\n',
+ 'source: ', 'No current source file.\n',
+ 'target: ', 'target is local\n']
+ self.assertEquals(self.m.lines, exp)
+
+ def testShowCmd(self):
+ self.m.reset()
+ self.m.onecmd('show')
+
+ exp = ['args: ', 'Argument list to give program being debugged when it is started is \n', '"".\n',
+ 'basename: ', 'basename is off.\n',
+ 'cmdtrace: ', 'cmdtrace is off.\n',
+ # Readline is not always available
+ 'debug-signal: ', 'debug-signal not set.\n',
+ 'history: ',
+ 'interactive: ', 'interactive is on.\n',
+ 'linetrace: ', 'line tracing is off.\n',
+ 'listsize: ', 'Number of source lines pydb will list by default is 10.\n',
+ 'logging: ', 'Future logs will be written to pydb.txt.\n', 'Logs will be appended to the log file.\n', 'Output will be logged and displayed.\n',
+ 'prompt: ', 'pydb\'s prompt is "(MPdb)".\n',
+ 'target-address: ', "target-address is ''.\n",
+ 'version: ', 'pydb version 1.17cvs.\n']
+
+ self.assertEquals(self.m.lines, exp)
+
+
def test_main():
- test_support.run_unittest(TestRemoteDebugging)
+ test_support.run_unittest(TestMpdbDoc, TestRemoteDebugging)
if __name__ == '__main__':
test_main()
Modified: sandbox/trunk/pdb/test/test_mthread.py
==============================================================================
--- sandbox/trunk/pdb/test/test_mthread.py (original)
+++ sandbox/trunk/pdb/test/test_mthread.py Tue Jul 25 19:11:12 2006
@@ -11,7 +11,6 @@
class TestThreadDebugging(unittest.TestCase):
def testMthreadInit(self):
- """ Test the init method of the mthread file. """
m = mpdb.MPdb()
mthread.init(m)
Added: sandbox/trunk/pdb/test/test_process.py
==============================================================================
--- (empty file)
+++ sandbox/trunk/pdb/test/test_process.py Tue Jul 25 19:11:12 2006
@@ -0,0 +1,103 @@
+#!/usr/bin/env python
+
+# Unit test for debugging running processes
+
+import os
+import signal
+import sys
+import time
+import threading
+import unittest
+
+from test import test_support
+from support import DummyStdout, MPdbTest, MPdbTestThread
+
+sys.path.append('..')
+
+import mpdb
+
+PROTOCOL = 'tcp'
+ADDRESS = ':9000'
+TESTFN = os.path.abspath(os.curdir+os.path.sep+'.mpdb')
+
+class TestProcessDebugging(unittest.TestCase):
+ def child(self):
+ os.chdir('./files')
+ pid = os.spawnlp(os.P_NOWAIT, './proc.py', './proc.py')
+ os.chdir('..')
+ return pid
+
+ def testTopLevelRoutine(self):
+ sys.stdout = DummyStdout()
+
+ mpdb.process_debugging()
+ self.assertRaises(ValueError, mpdb.process_debugging, -1)
+ self.assertRaises(TypeError, mpdb.process_debugging, 'invalid')
+ self.assertRaises(ValueError, mpdb.process_debugging, signal.NSIG)
+
+ # XXX We can't currently check the validity of the connection
+ # params.
+
+ sys.stdout = sys.__stdout__
+
+ def testSignalHandler(self):
+ pid = self.child()
+ client = MPdbTest()
+ # Allow the child process to catch up
+ time.sleep(0.1)
+ client.onecmd('set target-address tcp :9000')
+ client.onecmd('attach %s' % str(pid))
+ client.onecmd('where')
+ line = client.lines[1]
+ self.assertEquals(line, ' \n(MPdb)')
+ try:
+ client.onecmd('detach')
+ except KeyboardInterrupt:
+ pass
+ os.kill(pid, signal.SIGINT)
+
+ def testCmdLineOption(self):
+ sys.stdout = DummyStdout()
+
+ pid = self.child()
+ os.chdir('..')
+ time.sleep(0.1)
+ os.system("./mpdb.py --exec='set target-address tcp :9000," + \
+ "attach %s,where,detach' -o %s" % (str(pid), TESTFN))
+ os.chdir('./test')
+ os.kill(pid, signal.SIGINT)
+
+ os.waitpid(pid, 0)
+
+ pid = self.child()
+ os.chdir('..')
+ time.sleep(0.1)
+ os.system("./mpdb.py --pid='%s tcp :9000' --exec='where,detach' -o %s" \
+ % (str(pid), TESTFN))
+ os.chdir('./test')
+ os.kill(pid, signal.SIGINT)
+
+ sys.stdout = sys.__stdout__
+
+ def testFifoFilename(self):
+ cls, addr = mpdb.process_debugging().split()
+
+ pid = os.getpid()
+ import tempfile
+ from tempfile import gettempdir
+ tmp = gettempdir()
+
+ path = tmp + os.path.sep + str(pid) + 'mpdb'
+ self.assertEquals(path, addr)
+
+ def tearDown(self):
+ try:
+ os.unlink(TESTFN)
+ except OSError:
+ pass
+
+def test_main():
+ test_support.run_unittest(TestProcessDebugging)
+
+if __name__ == '__main__':
+ test_main()
From python-checkins at python.org Tue Jul 25 19:32:21 2006
From: python-checkins at python.org (brett.cannon)
Date: Tue, 25 Jul 2006 19:32:21 +0200 (CEST)
Subject: [Python-checkins] r50825 - python/trunk/Misc/NEWS
Message-ID: <20060725173221.1D11D1E400C@bag.python.org>
Author: brett.cannon
Date: Tue Jul 25 19:32:20 2006
New Revision: 50825
Modified:
python/trunk/Misc/NEWS
Log:
Add comment for changes to test_ossaudiodev.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Tue Jul 25 19:32:20 2006
@@ -87,6 +87,14 @@
- Because of a misspelled preprocessor symbol, ctypes was always
compiled without thread support; this is now fixed.
+Tests
+-----
+
+- Bug #1501330: Change test_ossaudiodev to be much more tolerant in terms of
+ how long the test file should take to play. Now accepts taking 2.93 secs
+ (exact time) +/- 10% instead of the hard-coded 3.1 sec.
+
+
What's New in Python 2.5 beta 2?
================================
From python-checkins at python.org Tue Jul 25 19:34:37 2006
From: python-checkins at python.org (brett.cannon)
Date: Tue, 25 Jul 2006 19:34:37 +0200 (CEST)
Subject: [Python-checkins] r50826 - python/trunk/Lib/test/test_ossaudiodev.py
Message-ID: <20060725173437.45D081E4006@bag.python.org>
Author: brett.cannon
Date: Tue Jul 25 19:34:36 2006
New Revision: 50826
Modified:
python/trunk/Lib/test/test_ossaudiodev.py
Log:
Fix a bug in the messages for an assert failure where not enough arguments to a string
were being converted in the format.
Modified: python/trunk/Lib/test/test_ossaudiodev.py
==============================================================================
--- python/trunk/Lib/test/test_ossaudiodev.py (original)
+++ python/trunk/Lib/test/test_ossaudiodev.py Tue Jul 25 19:34:36 2006
@@ -115,10 +115,10 @@
# either strict or non-strict mode.
result = dsp.setparameters(fmt, channels, rate, False)
_assert(result == (fmt, channels, rate),
- "setparameters%r: returned %r" % (config + result))
+ "setparameters%r: returned %r" % (config, result))
result = dsp.setparameters(fmt, channels, rate, True)
_assert(result == (fmt, channels, rate),
- "setparameters%r: returned %r" % (config + result))
+ "setparameters%r: returned %r" % (config, result))
def test_bad_setparameters(dsp):
From python-checkins at python.org Tue Jul 25 19:35:36 2006
From: python-checkins at python.org (brett.cannon)
Date: Tue, 25 Jul 2006 19:35:36 +0200 (CEST)
Subject: [Python-checkins] r50827 -
python/branches/bcannon-sandboxing/securing_python.txt
Message-ID: <20060725173536.99CA01E4026@bag.python.org>
Author: brett.cannon
Date: Tue Jul 25 19:35:36 2006
New Revision: 50827
Modified:
python/branches/bcannon-sandboxing/securing_python.txt
Log:
Clarify the extent of constructors that might need to be moved to factory functions.
Also clarify what physical resources are to be protected.
Modified: python/branches/bcannon-sandboxing/securing_python.txt
==============================================================================
--- python/branches/bcannon-sandboxing/securing_python.txt (original)
+++ python/branches/bcannon-sandboxing/securing_python.txt Tue Jul 25 19:35:36 2006
@@ -290,8 +290,8 @@
All security measures should never have to ask who an interpreter is.
This means that what abilities an interpreter has should not be stored
-at the interpreter level when the security can use a proxy to protect
-a resource. This means that while supporting a memory cap can
+at the interpreter level when the security can be provided at the
+Python level. This means that while supporting a memory cap can
have a per-interpreter setting that is checked (because access to the
operating system's memory allocator is not supported at the program
level), protecting files and imports should not such a per-interpreter
@@ -316,24 +316,32 @@
Keeping Python "pythonic" is required for all design decisions.
In general, being pythonic means that something fits the general
-design guidelines (run ``import this`` from a Python interpreter to
-see the basic ones). If removing an ability leads to something being
-unpythonic, it will not be done. This does not mean existing pythonic
-code must continue to work, but the spirit of being pythonic will not
-be compromised in the name of the security model. While this might
-lead to a weaker security model, this is a price that must be paid in
-order for Python to continue to be the language that it is.
+design guidelines of the Python programming language (run
+``import this`` from a Python interpreter to see the basic ones).
+If removing an ability leads to something being unpythonic, it will not
+be done unless there is an extremely compelling reason to do so.
+This does not mean existing pythonic code must continue to work, but
+the spirit of being pythonic will not be compromised in the name of the
+security model. While this might lead to a weaker security model, this
+is a price that must be paid in order for Python to continue to be the
+language that it is.
Restricting what is in the built-in namespace and the safe-guarding
the interpreter (which includes safe-guarding the built-in types) is
-where security will come from. Imports and the ``file`` type are
-both part of the standard namespace and must be restricted in order
-for any security implementation to be effective.
+where the majority of security will come from. Imports and the
+``file`` type are both part of the standard namespace and must be
+restricted in order for any security implementation to be effective.
The built-in types which are needed for basic Python usage (e.g.,
``object`` code objects, etc.) must be made safe to use in a sandboxed
interpreter since they are easily accessbile and yet required for
Python to function.
+The rest of the security for Python will come in the form of
+protecting physical resources. For those resources that can be denied
+in a Denial of Service (DoS) attack but protected in a
+platform-agnositc fashion, they should. This means, for instance,
+that memory should be protected but CPU usage can't.
+
Abilities of a Standard Sandboxed Interpreter
=============================================
@@ -439,24 +447,23 @@
Constructors
++++++++++++
-Almost all of Python's built-in types
-contain a constructor that allows code to create a new instance of a
-type as long as you have the type itself. Unfortunately this does not
-work in an object-capabilities system without either providing a proxy
-to the constructor or just turning it off.
-
-The plan is to turn off the constructors that are currently supplied
-directly by the types that are dangerous. Their constructors will
-then either be shifted over to factory functions that will be stored
-in a C extension module or to built-ins that will be
-provided to use to create instances. The former approach will allow
-for protections to be enforced by import proxy; just don't allow the
-extension module to be imported. The latter approach would allow
-either a unique constructor per type, or more generic built-in(s) for
-construction (e.g., introducing a ``construct()`` function that takes
-in a type and any arguments desired to be passed in for constructing
-an instance of the type) and allowing using proxies to provide
-security.
+Almost all of Python's built-in types contain a constructor that allows
+code to create a new instance of a type as long as you have the type
+itself. Unfortunately this does not work well in an object-capabilities
+system without either providing a proxy to the constructor or just
+removing it when access to such a constructor should be controlled.
+
+The plan is to remove select constructors of the types that are
+dangerous and either relocate them to an extension module as factory
+functions or create a new built-in that acts a generic factory
+function for all types, missing constructor or not. The former approach
+will allow for protections to be enforced by import proxy; just don't
+allow the extension module to be imported. The latter approach would
+allow either a unique constructor per type, or more generic built-in(s)
+for construction (e.g., introducing a ``construct()`` function that
+takes in a type and any arguments desired to be passed in for
+constructing an instance of the type) and allowing using proxies to
+provide security.
Some might consider this unpythonic. Python very rarely separates the
constructor of an object from the class/type and require that you go
From buildbot at python.org Tue Jul 25 19:57:45 2006
From: buildbot at python.org (buildbot at python.org)
Date: Tue, 25 Jul 2006 17:57:45 +0000
Subject: [Python-checkins] buildbot warnings in amd64 gentoo trunk
Message-ID: <20060725175745.223261E4006@bag.python.org>
The Buildbot has detected a new failure of amd64 gentoo trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/amd64%2520gentoo%2520trunk/builds/1287
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: brett.cannon
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Tue Jul 25 20:09:58 2006
From: python-checkins at python.org (armin.rigo)
Date: Tue, 25 Jul 2006 20:09:58 +0200 (CEST)
Subject: [Python-checkins] r50828 -
python/trunk/Lib/test/crashers/gc_inspection.py
Message-ID: <20060725180958.288701E4008@bag.python.org>
Author: armin.rigo
Date: Tue Jul 25 20:09:57 2006
New Revision: 50828
Modified:
python/trunk/Lib/test/crashers/gc_inspection.py
Log:
Document why is and is not a good way to fix the gc_inspection crasher.
Modified: python/trunk/Lib/test/crashers/gc_inspection.py
==============================================================================
--- python/trunk/Lib/test/crashers/gc_inspection.py (original)
+++ python/trunk/Lib/test/crashers/gc_inspection.py Tue Jul 25 20:09:57 2006
@@ -1,5 +1,20 @@
"""
gc.get_referrers() can be used to see objects before they are fully built.
+
+Note that this is only an example. There are many ways to crash Python
+by using gc.get_referrers(), as well as many extension modules (even
+when they are using perfectly documented patterns to build objects).
+
+Identifying and removing all places that expose to the GC a
+partially-built object is a long-term project. A patch was proposed on
+SF specifically for this example but I consider fixing just this single
+example a bit pointless (#1517042).
+
+A fix would include a whole-scale code review, possibly with an API
+change to decouple object creation and GC registration, and according
+fixes to the documentation for extension module writers. It's unlikely
+to happen, though. So this is currently classified as
+"gc.get_referrers() is dangerous, use only for debugging".
"""
import gc
From python-checkins at python.org Tue Jul 25 20:11:07 2006
From: python-checkins at python.org (armin.rigo)
Date: Tue, 25 Jul 2006 20:11:07 +0200 (CEST)
Subject: [Python-checkins] r50829 -
python/trunk/Lib/test/crashers/recursion_limit_too_high.py
Message-ID: <20060725181107.DC2341E4016@bag.python.org>
Author: armin.rigo
Date: Tue Jul 25 20:11:07 2006
New Revision: 50829
Added:
python/trunk/Lib/test/crashers/recursion_limit_too_high.py (contents, props changed)
Log:
Added another crasher, which hit me today (I was not intentionally
writing such code, of course, but it took some gdb time to figure out
what my bug was).
Added: python/trunk/Lib/test/crashers/recursion_limit_too_high.py
==============================================================================
--- (empty file)
+++ python/trunk/Lib/test/crashers/recursion_limit_too_high.py Tue Jul 25 20:11:07 2006
@@ -0,0 +1,16 @@
+# The following example may crash or not depending on the platform.
+# E.g. on 32-bit Intel Linux in a "standard" configuration it seems to
+# crash on Python 2.5 (but not 2.4 nor 2.3). On Windows the import
+# eventually fails to find the module, possibly because we run out of
+# file handles.
+
+# The point of this example is to show that sys.setrecursionlimit() is a
+# hack, and not a robust solution. This example simply exercices a path
+# where it takes many C-level recursions, consuming a lot of stack
+# space, for each Python-level recursion. So 1000 times this amount of
+# stack space may be too much for standard platforms already.
+
+import sys
+if 'recursion_limit_too_high' in sys.modules:
+ del sys.modules['recursion_limit_too_high']
+import recursion_limit_too_high
From buildbot at python.org Tue Jul 25 20:30:44 2006
From: buildbot at python.org (buildbot at python.org)
Date: Tue, 25 Jul 2006 18:30:44 +0000
Subject: [Python-checkins] buildbot warnings in S-390 Debian trunk
Message-ID: <20060725183044.720C91E4011@bag.python.org>
The Buildbot has detected a new failure of S-390 Debian trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/S-390%2520Debian%2520trunk/builds/298
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: brett.cannon
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Tue Jul 25 20:38:42 2006
From: python-checkins at python.org (armin.rigo)
Date: Tue, 25 Jul 2006 20:38:42 +0200 (CEST)
Subject: [Python-checkins] r50830 -
python/trunk/Lib/test/crashers/bogus_code_obj.py
python/trunk/Lib/test/crashers/recursive_call.py
Message-ID: <20060725183842.50C121E4006@bag.python.org>
Author: armin.rigo
Date: Tue Jul 25 20:38:39 2006
New Revision: 50830
Modified:
python/trunk/Lib/test/crashers/bogus_code_obj.py
python/trunk/Lib/test/crashers/recursive_call.py
Log:
Document the crashers that will not go away soon as "won't fix",
and explain why.
Modified: python/trunk/Lib/test/crashers/bogus_code_obj.py
==============================================================================
--- python/trunk/Lib/test/crashers/bogus_code_obj.py (original)
+++ python/trunk/Lib/test/crashers/bogus_code_obj.py Tue Jul 25 20:38:39 2006
@@ -1,5 +1,15 @@
"""
Broken bytecode objects can easily crash the interpreter.
+
+This is not going to be fixed. It is generally agreed that there is no
+point in writing a bytecode verifier and putting it in CPython just for
+this. Moreover, a verifier is bound to accept only a subset of all safe
+bytecodes, so it could lead to unnecessary breakage.
+
+For security purposes, "restricted" interpreters are not going to let
+the user build or load random bytecodes anyway. Otherwise, this is a
+"won't fix" case.
+
"""
import types
Modified: python/trunk/Lib/test/crashers/recursive_call.py
==============================================================================
--- python/trunk/Lib/test/crashers/recursive_call.py (original)
+++ python/trunk/Lib/test/crashers/recursive_call.py Tue Jul 25 20:38:39 2006
@@ -1,6 +1,11 @@
#!/usr/bin/env python
# No bug report AFAIK, mail on python-dev on 2006-01-10
+
+# This is a "won't fix" case. It is known that setting a high enough
+# recursion limit crashes by overflowing the stack. Unless this is
+# redesigned somehow, it won't go away.
+
import sys
sys.setrecursionlimit(1 << 30)
From buildbot at python.org Tue Jul 25 20:49:04 2006
From: buildbot at python.org (buildbot at python.org)
Date: Tue, 25 Jul 2006 18:49:04 +0000
Subject: [Python-checkins] buildbot warnings in x86 Ubuntu dapper (icc) trunk
Message-ID: <20060725184904.BB79D1E4006@bag.python.org>
The Buildbot has detected a new failure of x86 Ubuntu dapper (icc) trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520Ubuntu%2520dapper%2520%2528icc%2529%2520trunk/builds/800
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: brett.cannon
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Tue Jul 25 21:13:35 2006
From: python-checkins at python.org (ronald.oussoren)
Date: Tue, 25 Jul 2006 21:13:35 +0200 (CEST)
Subject: [Python-checkins] r50831 - python/trunk/Makefile.pre.in
Message-ID: <20060725191335.B8F691E4006@bag.python.org>
Author: ronald.oussoren
Date: Tue Jul 25 21:13:35 2006
New Revision: 50831
Modified:
python/trunk/Makefile.pre.in
Log:
Install the compatibility symlink to libpython.a on OSX using 'ln -sf' instead
of 'ln -s', this avoid problems when reinstalling python.
Modified: python/trunk/Makefile.pre.in
==============================================================================
--- python/trunk/Makefile.pre.in (original)
+++ python/trunk/Makefile.pre.in Tue Jul 25 21:13:35 2006
@@ -935,7 +935,7 @@
# Install a number of symlinks to keep software that expects a normal unix
# install (which includes python-config) happy.
frameworkinstallmaclib:
- ln -s "../../../Python" "$(DESTDIR)$(prefix)/lib/python$(VERSION)/config/libpython$(VERSION).a"
+ ln -fs "../../../Python" "$(DESTDIR)$(prefix)/lib/python$(VERSION)/config/libpython$(VERSION).a"
cd Mac && $(MAKE) installmacsubtree DESTDIR="$(DESTDIR)"
# This installs the IDE, the Launcher and other apps into /Applications
From python-checkins at python.org Tue Jul 25 21:20:54 2006
From: python-checkins at python.org (ronald.oussoren)
Date: Tue, 25 Jul 2006 21:20:54 +0200 (CEST)
Subject: [Python-checkins] r50832 - python/trunk/Mac/Modules/MacOS.c
python/trunk/Mac/Modules/macosmodule.c
Message-ID: <20060725192054.AB0DB1E4006@bag.python.org>
Author: ronald.oussoren
Date: Tue Jul 25 21:20:54 2006
New Revision: 50832
Added:
python/trunk/Mac/Modules/MacOS.c
- copied unchanged from r50783, python/trunk/Mac/Modules/macosmodule.c
Removed:
python/trunk/Mac/Modules/macosmodule.c
Log:
Fix for bug #1525447 (renaming to MacOSmodule.c would also work, but not
without causing problems for anyone that is on a case-insensitive filesystem).
Setup.py tries to compile the MacOS extension from MacOSmodule.c, while the
actual file is named macosmodule.c. This is no problem on the (default)
case-insensitive filesystem, but doesn't work on case-sensitive filesystems.
Deleted: /python/trunk/Mac/Modules/macosmodule.c
==============================================================================
--- /python/trunk/Mac/Modules/macosmodule.c Tue Jul 25 21:20:54 2006
+++ (empty file)
@@ -1,644 +0,0 @@
-/***********************************************************
-Copyright 1991-1997 by Stichting Mathematisch Centrum, Amsterdam,
-The Netherlands.
-
- All Rights Reserved
-
-Permission to use, copy, modify, and distribute this software and its
-documentation for any purpose and without fee is hereby granted,
-provided that the above copyright notice appear in all copies and that
-both that copyright notice and this permission notice appear in
-supporting documentation, and that the names of Stichting Mathematisch
-Centrum or CWI not be used in advertising or publicity pertaining to
-distribution of the software without specific, written prior permission.
-
-STICHTING MATHEMATISCH CENTRUM DISCLAIMS ALL WARRANTIES WITH REGARD TO
-THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
-FITNESS, IN NO EVENT SHALL STICHTING MATHEMATISCH CENTRUM BE LIABLE
-FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
-WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
-ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT
-OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
-
-******************************************************************/
-
-/* Macintosh OS-specific interface */
-
-#include "Python.h"
-#include "pymactoolbox.h"
-
-#include
-#include
-
-static PyObject *MacOS_Error; /* Exception MacOS.Error */
-
-#define PATHNAMELEN 1024
-
-/* ----------------------------------------------------- */
-
-/* Declarations for objects of type Resource fork */
-
-typedef struct {
- PyObject_HEAD
- short fRefNum;
- int isclosed;
-} rfobject;
-
-static PyTypeObject Rftype;
-
-
-
-/* ---------------------------------------------------------------- */
-
-static void
-do_close(rfobject *self)
-{
- if (self->isclosed ) return;
- (void)FSClose(self->fRefNum);
- self->isclosed = 1;
-}
-
-static char rf_read__doc__[] =
-"Read data from resource fork"
-;
-
-static PyObject *
-rf_read(rfobject *self, PyObject *args)
-{
- long n;
- PyObject *v;
- OSErr err;
-
- if (self->isclosed) {
- PyErr_SetString(PyExc_ValueError, "Operation on closed file");
- return NULL;
- }
-
- if (!PyArg_ParseTuple(args, "l", &n))
- return NULL;
-
- v = PyString_FromStringAndSize((char *)NULL, n);
- if (v == NULL)
- return NULL;
-
- err = FSRead(self->fRefNum, &n, PyString_AsString(v));
- if (err && err != eofErr) {
- PyMac_Error(err);
- Py_DECREF(v);
- return NULL;
- }
- _PyString_Resize(&v, n);
- return v;
-}
-
-
-static char rf_write__doc__[] =
-"Write to resource fork"
-;
-
-static PyObject *
-rf_write(rfobject *self, PyObject *args)
-{
- char *buffer;
- long size;
- OSErr err;
-
- if (self->isclosed) {
- PyErr_SetString(PyExc_ValueError, "Operation on closed file");
- return NULL;
- }
- if (!PyArg_ParseTuple(args, "s#", &buffer, &size))
- return NULL;
- err = FSWrite(self->fRefNum, &size, buffer);
- if (err) {
- PyMac_Error(err);
- return NULL;
- }
- Py_INCREF(Py_None);
- return Py_None;
-}
-
-
-static char rf_seek__doc__[] =
-"Set file position"
-;
-
-static PyObject *
-rf_seek(rfobject *self, PyObject *args)
-{
- long amount, pos;
- int whence = SEEK_SET;
- long eof;
- OSErr err;
-
- if (self->isclosed) {
- PyErr_SetString(PyExc_ValueError, "Operation on closed file");
- return NULL;
- }
- if (!PyArg_ParseTuple(args, "l|i", &amount, &whence))
- return NULL;
-
- if ((err = GetEOF(self->fRefNum, &eof)))
- goto ioerr;
-
- switch (whence) {
- case SEEK_CUR:
- if ((err = GetFPos(self->fRefNum, &pos)))
- goto ioerr;
- break;
- case SEEK_END:
- pos = eof;
- break;
- case SEEK_SET:
- pos = 0;
- break;
- default:
- PyErr_BadArgument();
- return NULL;
- }
-
- pos += amount;
-
- /* Don't bother implementing seek past EOF */
- if (pos > eof || pos < 0) {
- PyErr_BadArgument();
- return NULL;
- }
-
- if ((err = SetFPos(self->fRefNum, fsFromStart, pos)) ) {
-ioerr:
- PyMac_Error(err);
- return NULL;
- }
- Py_INCREF(Py_None);
- return Py_None;
-}
-
-
-static char rf_tell__doc__[] =
-"Get file position"
-;
-
-static PyObject *
-rf_tell(rfobject *self, PyObject *args)
-{
- long where;
- OSErr err;
-
- if (self->isclosed) {
- PyErr_SetString(PyExc_ValueError, "Operation on closed file");
- return NULL;
- }
- if (!PyArg_ParseTuple(args, ""))
- return NULL;
- if ((err = GetFPos(self->fRefNum, &where)) ) {
- PyMac_Error(err);
- return NULL;
- }
- return PyInt_FromLong(where);
-}
-
-static char rf_close__doc__[] =
-"Close resource fork"
-;
-
-static PyObject *
-rf_close(rfobject *self, PyObject *args)
-{
- if (!PyArg_ParseTuple(args, ""))
- return NULL;
- do_close(self);
- Py_INCREF(Py_None);
- return Py_None;
-}
-
-
-static struct PyMethodDef rf_methods[] = {
- {"read", (PyCFunction)rf_read, 1, rf_read__doc__},
- {"write", (PyCFunction)rf_write, 1, rf_write__doc__},
- {"seek", (PyCFunction)rf_seek, 1, rf_seek__doc__},
- {"tell", (PyCFunction)rf_tell, 1, rf_tell__doc__},
- {"close", (PyCFunction)rf_close, 1, rf_close__doc__},
-
- {NULL, NULL} /* sentinel */
-};
-
-/* ---------- */
-
-
-static rfobject *
-newrfobject(void)
-{
- rfobject *self;
-
- self = PyObject_NEW(rfobject, &Rftype);
- if (self == NULL)
- return NULL;
- self->isclosed = 1;
- return self;
-}
-
-
-static void
-rf_dealloc(rfobject *self)
-{
- do_close(self);
- PyObject_DEL(self);
-}
-
-static PyObject *
-rf_getattr(rfobject *self, char *name)
-{
- return Py_FindMethod(rf_methods, (PyObject *)self, name);
-}
-
-static char Rftype__doc__[] =
-"Resource fork file object"
-;
-
-static PyTypeObject Rftype = {
- PyObject_HEAD_INIT(&PyType_Type)
- 0, /*ob_size*/
- "MacOS.ResourceFork", /*tp_name*/
- sizeof(rfobject), /*tp_basicsize*/
- 0, /*tp_itemsize*/
- /* methods */
- (destructor)rf_dealloc, /*tp_dealloc*/
- (printfunc)0, /*tp_print*/
- (getattrfunc)rf_getattr, /*tp_getattr*/
- (setattrfunc)0, /*tp_setattr*/
- (cmpfunc)0, /*tp_compare*/
- (reprfunc)0, /*tp_repr*/
- 0, /*tp_as_number*/
- 0, /*tp_as_sequence*/
- 0, /*tp_as_mapping*/
- (hashfunc)0, /*tp_hash*/
- (ternaryfunc)0, /*tp_call*/
- (reprfunc)0, /*tp_str*/
-
- /* Space for future expansion */
- 0L,0L,0L,0L,
- Rftype__doc__ /* Documentation string */
-};
-
-/* End of code for Resource fork objects */
-/* -------------------------------------------------------- */
-
-/*----------------------------------------------------------------------*/
-/* Miscellaneous File System Operations */
-
-static char getcrtp_doc[] = "Get MacOS 4-char creator and type for a file";
-
-static PyObject *
-MacOS_GetCreatorAndType(PyObject *self, PyObject *args)
-{
- FSSpec fss;
- FInfo info;
- PyObject *creator, *type, *res;
- OSErr err;
-
- if (!PyArg_ParseTuple(args, "O&", PyMac_GetFSSpec, &fss))
- return NULL;
- if ((err = FSpGetFInfo(&fss, &info)) != noErr)
- return PyErr_Mac(MacOS_Error, err);
- creator = PyString_FromStringAndSize((char *)&info.fdCreator, 4);
- type = PyString_FromStringAndSize((char *)&info.fdType, 4);
- res = Py_BuildValue("OO", creator, type);
- Py_DECREF(creator);
- Py_DECREF(type);
- return res;
-}
-
-static char setcrtp_doc[] = "Set MacOS 4-char creator and type for a file";
-
-static PyObject *
-MacOS_SetCreatorAndType(PyObject *self, PyObject *args)
-{
- FSSpec fss;
- ResType creator, type;
- FInfo info;
- OSErr err;
-
- if (!PyArg_ParseTuple(args, "O&O&O&",
- PyMac_GetFSSpec, &fss, PyMac_GetOSType, &creator, PyMac_GetOSType, &type))
- return NULL;
- if ((err = FSpGetFInfo(&fss, &info)) != noErr)
- return PyErr_Mac(MacOS_Error, err);
- info.fdCreator = creator;
- info.fdType = type;
- if ((err = FSpSetFInfo(&fss, &info)) != noErr)
- return PyErr_Mac(MacOS_Error, err);
- Py_INCREF(Py_None);
- return Py_None;
-}
-
-
-static char geterr_doc[] = "Convert OSErr number to string";
-
-static PyObject *
-MacOS_GetErrorString(PyObject *self, PyObject *args)
-{
- int err;
- char buf[256];
- Handle h;
- char *str;
- static int errors_loaded;
-
- if (!PyArg_ParseTuple(args, "i", &err))
- return NULL;
-
- h = GetResource('Estr', err);
- if (!h && !errors_loaded) {
- /*
- ** Attempt to open the resource file containing the
- ** Estr resources. We ignore all errors. We also try
- ** this only once.
- */
- PyObject *m, *rv;
- errors_loaded = 1;
-
- m = PyImport_ImportModule("macresource");
- if (!m) {
- if (Py_VerboseFlag)
- PyErr_Print();
- PyErr_Clear();
- }
- else {
- rv = PyObject_CallMethod(m, "open_error_resource", "");
- if (!rv) {
- if (Py_VerboseFlag)
- PyErr_Print();
- PyErr_Clear();
- }
- else {
- Py_DECREF(rv);
- /* And try again... */
- h = GetResource('Estr', err);
- }
- Py_DECREF(m);
- }
- }
- /*
- ** Whether the code above succeeded or not, we won't try
- ** again.
- */
- errors_loaded = 1;
-
- if (h) {
- HLock(h);
- str = (char *)*h;
- memcpy(buf, str+1, (unsigned char)str[0]);
- buf[(unsigned char)str[0]] = '\0';
- HUnlock(h);
- ReleaseResource(h);
- }
- else {
- PyOS_snprintf(buf, sizeof(buf), "Mac OS error code %d", err);
- }
-
- return Py_BuildValue("s", buf);
-}
-
-static char splash_doc[] = "Open a splash-screen dialog by resource-id (0=close)";
-
-static PyObject *
-MacOS_splash(PyObject *self, PyObject *args)
-{
- int resid = -1;
- static DialogPtr curdialog = NULL;
- DialogPtr olddialog;
- WindowRef theWindow;
- CGrafPtr thePort;
-#if 0
- short xpos, ypos, width, height, swidth, sheight;
-#endif
-
- if (!PyArg_ParseTuple(args, "|i", &resid))
- return NULL;
- olddialog = curdialog;
- curdialog = NULL;
-
- if ( resid != -1 ) {
- curdialog = GetNewDialog(resid, NULL, (WindowPtr)-1);
- if ( curdialog ) {
- theWindow = GetDialogWindow(curdialog);
- thePort = GetWindowPort(theWindow);
-#if 0
- width = thePort->portRect.right - thePort->portRect.left;
- height = thePort->portRect.bottom - thePort->portRect.top;
- swidth = qd.screenBits.bounds.right - qd.screenBits.bounds.left;
- sheight = qd.screenBits.bounds.bottom - qd.screenBits.bounds.top - LMGetMBarHeight();
- xpos = (swidth-width)/2;
- ypos = (sheight-height)/5 + LMGetMBarHeight();
- MoveWindow(theWindow, xpos, ypos, 0);
- ShowWindow(theWindow);
-#endif
- DrawDialog(curdialog);
- }
- }
- if (olddialog)
- DisposeDialog(olddialog);
- Py_INCREF(Py_None);
- return Py_None;
-}
-
-static char DebugStr_doc[] = "Switch to low-level debugger with a message";
-
-static PyObject *
-MacOS_DebugStr(PyObject *self, PyObject *args)
-{
- Str255 message;
- PyObject *object = 0;
-
- if (!PyArg_ParseTuple(args, "O&|O", PyMac_GetStr255, message, &object))
- return NULL;
- DebugStr(message);
- Py_INCREF(Py_None);
- return Py_None;
-}
-
-static char SysBeep_doc[] = "BEEEEEP!!!";
-
-static PyObject *
-MacOS_SysBeep(PyObject *self, PyObject *args)
-{
- int duration = 6;
-
- if (!PyArg_ParseTuple(args, "|i", &duration))
- return NULL;
- SysBeep(duration);
- Py_INCREF(Py_None);
- return Py_None;
-}
-
-static char WMAvailable_doc[] =
- "True if this process can interact with the display."
- "Will foreground the application on the first call as a side-effect."
- ;
-
-static PyObject *
-MacOS_WMAvailable(PyObject *self, PyObject *args)
-{
- static PyObject *rv = NULL;
-
- if (!PyArg_ParseTuple(args, ""))
- return NULL;
- if (!rv) {
- ProcessSerialNumber psn;
-
- /*
- ** This is a fairly innocuous call to make if we don't have a window
- ** manager, or if we have no permission to talk to it. It will print
- ** a message on stderr, but at least it won't abort the process.
- ** It appears the function caches the result itself, and it's cheap, so
- ** no need for us to cache.
- */
-#ifdef kCGNullDirectDisplay
- /* On 10.1 CGMainDisplayID() isn't available, and
- ** kCGNullDirectDisplay isn't defined.
- */
- if (CGMainDisplayID() == 0) {
- rv = Py_False;
- } else {
-#else
- {
-#endif
- if (GetCurrentProcess(&psn) < 0 ||
- SetFrontProcess(&psn) < 0) {
- rv = Py_False;
- } else {
- rv = Py_True;
- }
- }
- }
- Py_INCREF(rv);
- return rv;
-}
-
-static char GetTicks_doc[] = "Return number of ticks since bootup";
-
-static PyObject *
-MacOS_GetTicks(PyObject *self, PyObject *args)
-{
- return Py_BuildValue("i", (int)TickCount());
-}
-
-static char openrf_doc[] = "Open resource fork of a file";
-
-static PyObject *
-MacOS_openrf(PyObject *self, PyObject *args)
-{
- OSErr err;
- char *mode = "r";
- FSSpec fss;
- SignedByte permission = 1;
- rfobject *fp;
-
- if (!PyArg_ParseTuple(args, "O&|s", PyMac_GetFSSpec, &fss, &mode))
- return NULL;
- while (*mode) {
- switch (*mode++) {
- case '*': break;
- case 'r': permission = 1; break;
- case 'w': permission = 2; break;
- case 'b': break;
- default:
- PyErr_BadArgument();
- return NULL;
- }
- }
-
- if ( (fp = newrfobject()) == NULL )
- return NULL;
-
- err = HOpenRF(fss.vRefNum, fss.parID, fss.name, permission, &fp->fRefNum);
-
- if ( err == fnfErr ) {
- /* In stead of doing complicated things here to get creator/type
- ** correct we let the standard i/o library handle it
- */
- FILE *tfp;
- char pathname[PATHNAMELEN];
-
- if ( (err=PyMac_GetFullPathname(&fss, pathname, PATHNAMELEN)) ) {
- PyMac_Error(err);
- Py_DECREF(fp);
- return NULL;
- }
-
- if ( (tfp = fopen(pathname, "w")) == NULL ) {
- PyMac_Error(fnfErr); /* What else... */
- Py_DECREF(fp);
- return NULL;
- }
- fclose(tfp);
- err = HOpenRF(fss.vRefNum, fss.parID, fss.name, permission, &fp->fRefNum);
- }
- if ( err ) {
- Py_DECREF(fp);
- PyMac_Error(err);
- return NULL;
- }
- fp->isclosed = 0;
- return (PyObject *)fp;
-}
-
-
-static PyMethodDef MacOS_Methods[] = {
- {"GetCreatorAndType", MacOS_GetCreatorAndType, 1, getcrtp_doc},
- {"SetCreatorAndType", MacOS_SetCreatorAndType, 1, setcrtp_doc},
- {"GetErrorString", MacOS_GetErrorString, 1, geterr_doc},
- {"openrf", MacOS_openrf, 1, openrf_doc},
- {"splash", MacOS_splash, 1, splash_doc},
- {"DebugStr", MacOS_DebugStr, 1, DebugStr_doc},
- {"GetTicks", MacOS_GetTicks, 1, GetTicks_doc},
- {"SysBeep", MacOS_SysBeep, 1, SysBeep_doc},
- {"WMAvailable", MacOS_WMAvailable, 1, WMAvailable_doc},
- {NULL, NULL} /* Sentinel */
-};
-
-
-void
-initMacOS(void)
-{
- PyObject *m, *d;
-
- m = Py_InitModule("MacOS", MacOS_Methods);
- d = PyModule_GetDict(m);
-
- /* Initialize MacOS.Error exception */
- MacOS_Error = PyMac_GetOSErrException();
- if (MacOS_Error == NULL || PyDict_SetItemString(d, "Error", MacOS_Error) != 0)
- return;
- Rftype.ob_type = &PyType_Type;
- Py_INCREF(&Rftype);
- if (PyDict_SetItemString(d, "ResourceForkType", (PyObject *)&Rftype) != 0)
- return;
- /*
- ** This is a hack: the following constant added to the id() of a string
- ** object gives you the address of the data. Unfortunately, it is needed for
- ** some of the image and sound processing interfaces on the mac:-(
- */
- {
- PyStringObject *p = 0;
- long off = (long)&(p->ob_sval[0]);
-
- if( PyDict_SetItemString(d, "string_id_to_buffer", Py_BuildValue("i", off)) != 0)
- return;
- }
-#define PY_RUNTIMEMODEL "macho"
- if (PyDict_SetItemString(d, "runtimemodel",
- Py_BuildValue("s", PY_RUNTIMEMODEL)) != 0)
- return;
-#if defined(WITH_NEXT_FRAMEWORK)
-#define PY_LINKMODEL "framework"
-#elif defined(Py_ENABLE_SHARED)
-#define PY_LINKMODEL "shared"
-#else
-#define PY_LINKMODEL "static"
-#endif
- if (PyDict_SetItemString(d, "linkmodel",
- Py_BuildValue("s", PY_LINKMODEL)) != 0)
- return;
-
-}
From python-checkins at python.org Tue Jul 25 22:28:56 2006
From: python-checkins at python.org (ronald.oussoren)
Date: Tue, 25 Jul 2006 22:28:56 +0200 (CEST)
Subject: [Python-checkins] r50833 - in python/trunk:
Lib/idlelib/EditorWindow.py Lib/idlelib/config-keys.def
Mac/IDLE/config-main.def Misc/NEWS
Message-ID: <20060725202856.82F2E1E4006@bag.python.org>
Author: ronald.oussoren
Date: Tue Jul 25 22:28:55 2006
New Revision: 50833
Modified:
python/trunk/Lib/idlelib/EditorWindow.py
python/trunk/Lib/idlelib/config-keys.def
python/trunk/Mac/IDLE/config-main.def
python/trunk/Misc/NEWS
Log:
Fix bug #1517990: IDLE keybindings on OSX
This adds a new key definition for OSX, which is slightly different from the
classic mac definition.
Also add NEWS item for a couple of bugfixes I added recently.
Modified: python/trunk/Lib/idlelib/EditorWindow.py
==============================================================================
--- python/trunk/Lib/idlelib/EditorWindow.py (original)
+++ python/trunk/Lib/idlelib/EditorWindow.py Tue Jul 25 22:28:55 2006
@@ -128,7 +128,7 @@
self.top.bind("<>", self.close_event)
if macosxSupport.runningAsOSXApp():
# Command-W on editorwindows doesn't work without this.
- text.bind('<>', self.close_event)
text.bind("<>", self.cut)
text.bind("<>", self.copy)
text.bind("<>", self.paste)
Modified: python/trunk/Lib/idlelib/config-keys.def
==============================================================================
--- python/trunk/Lib/idlelib/config-keys.def (original)
+++ python/trunk/Lib/idlelib/config-keys.def Tue Jul 25 22:28:55 2006
@@ -159,3 +159,56 @@
change-indentwidth=
del-word-left=
del-word-right=
+
+[IDLE Classic OSX]
+toggle-tabs =
+interrupt-execution =
+untabify-region =
+remove-selection =
+print-window =
+replace =
+goto-line =
+plain-newline-and-indent =
+history-previous =
+beginning-of-line =
+end-of-line =
+comment-region =
+redo =
+close-window =
+restart-shell =
+save-window-as-file =
+close-all-windows =
+view-restart =
+tabify-region =
+find-again =
+find =
+toggle-auto-coloring =
+select-all =
+smart-backspace =
+change-indentwidth =
+do-nothing =
+smart-indent =
+center-insert =
+history-next =
+del-word-right =
+undo =
+save-window =
+uncomment-region =
+cut =
+find-in-files =
+dedent-region =
+copy =
+paste =
+indent-region =
+del-word-left =
+newline-and-indent =
+end-of-file =
+open-class-browser =
+open-new-window =
+open-module =
+find-selection =
+python-context-help =
+save-copy-of-window-as-file =
+open-window-from-file =
+python-docs =
+
Modified: python/trunk/Mac/IDLE/config-main.def
==============================================================================
--- python/trunk/Mac/IDLE/config-main.def (original)
+++ python/trunk/Mac/IDLE/config-main.def Tue Jul 25 22:28:55 2006
@@ -71,7 +71,7 @@
[Keys]
default= 1
-name= IDLE Classic Mac
+name= IDLE Classic OSX
[History]
cyclic=1
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Tue Jul 25 22:28:55 2006
@@ -36,6 +36,9 @@
new ``sys._current_frames()`` returns a dictionary with one entry,
mapping the faux "thread id" 0 to the current frame.
+- Bug #1525447: build on MacOS X on a case-sensitive filesystem.
+
+
Library
-------
@@ -68,6 +71,12 @@
Also, whereas % values were decoded in all parameter continuations, they are
now only decoded in encoded parameter parts.
+- Bug #1517990: IDLE keybindings on MacOS X now work correctly
+
+- Bug #1517996: IDLE now longer shows the default Tk menu when a
+ path browser, class browser or debugger is the frontmost window on MacOS X
+
+
Extension Modules
-----------------
From buildbot at python.org Tue Jul 25 22:33:39 2006
From: buildbot at python.org (buildbot at python.org)
Date: Tue, 25 Jul 2006 20:33:39 +0000
Subject: [Python-checkins] buildbot warnings in sparc Ubuntu dapper trunk
Message-ID: <20060725203339.240581E400A@bag.python.org>
The Buildbot has detected a new failure of sparc Ubuntu dapper trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/sparc%2520Ubuntu%2520dapper%2520trunk/builds/561
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: brett.cannon
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From neal at metaslash.com Tue Jul 25 23:11:29 2006
From: neal at metaslash.com (Neal Norwitz)
Date: Tue, 25 Jul 2006 17:11:29 -0400
Subject: [Python-checkins] Python Regression Test Failures refleak (1)
Message-ID: <20060725211129.GA25773@python.psfb.org>
test_cmd_line leaked [17, 0, 0] references
From python-checkins at python.org Wed Jul 26 00:30:24 2006
From: python-checkins at python.org (tim.peters)
Date: Wed, 26 Jul 2006 00:30:24 +0200 (CEST)
Subject: [Python-checkins] r50834 - python/trunk/Lib/pkgutil.py
Message-ID: <20060725223024.F06841E4006@bag.python.org>
Author: tim.peters
Date: Wed Jul 26 00:30:24 2006
New Revision: 50834
Modified:
python/trunk/Lib/pkgutil.py
Log:
Whitespace normalization.
Modified: python/trunk/Lib/pkgutil.py
==============================================================================
--- python/trunk/Lib/pkgutil.py (original)
+++ python/trunk/Lib/pkgutil.py Wed Jul 26 00:30:24 2006
@@ -71,7 +71,7 @@
def walk_packages(path=None, prefix='', onerror=None):
"""Yields (module_loader, name, ispkg) for all modules recursively
on path, or, if path is None, all accessible modules.
-
+
'path' should be either None or a list of paths to look for
modules in.
@@ -81,7 +81,7 @@
Note that this function must import all *packages* (NOT all
modules!) on the given path, in order to access the __path__
attribute to find submodules.
-
+
'onerror' is a function which gets called with one argument (the
name of the package which was being imported) if an ImportError
occurs trying to import a package. By default the ImportError is
@@ -126,7 +126,7 @@
'prefix' is a string to output on the front of every module name
on output.
"""
-
+
if path is None:
importers = iter_importers()
else:
From buildbot at python.org Wed Jul 26 00:57:15 2006
From: buildbot at python.org (buildbot at python.org)
Date: Tue, 25 Jul 2006 22:57:15 +0000
Subject: [Python-checkins] buildbot warnings in x86 XP-2 trunk
Message-ID: <20060725225715.3FCE61E4006@bag.python.org>
The Buildbot has detected a new failure of x86 XP-2 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520XP-2%2520trunk/builds/783
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: tim.peters
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Wed Jul 26 01:09:03 2006
From: buildbot at python.org (buildbot at python.org)
Date: Tue, 25 Jul 2006 23:09:03 +0000
Subject: [Python-checkins] buildbot warnings in x86 OpenBSD trunk
Message-ID: <20060725230903.63E461E4006@bag.python.org>
The Buildbot has detected a new failure of x86 OpenBSD trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520OpenBSD%2520trunk/builds/1057
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: tim.peters
Build Had Warnings: warnings failed slave lost
sincerely,
-The Buildbot
From python-checkins at python.org Wed Jul 26 05:18:43 2006
From: python-checkins at python.org (barry.warsaw)
Date: Wed, 26 Jul 2006 05:18:43 +0200 (CEST)
Subject: [Python-checkins] r50835 - sandbox/trunk/emailpkg/tags/2_5_8
Message-ID: <20060726031843.7F84B1E4006@bag.python.org>
Author: barry.warsaw
Date: Wed Jul 26 05:18:41 2006
New Revision: 50835
Added:
sandbox/trunk/emailpkg/tags/2_5_8/
- copied from r50834, python/branches/release23-maint/Lib/email/
Log:
Tagging email 2.5.8
From nnorwitz at gmail.com Wed Jul 26 05:41:09 2006
From: nnorwitz at gmail.com (Neal Norwitz)
Date: Tue, 25 Jul 2006 20:41:09 -0700
Subject: [Python-checkins] r50819 - python/trunk/Lib/pkgutil.py
In-Reply-To: <20060725102235.3B6991E4003@bag.python.org>
References: <20060725102235.3B6991E4003@bag.python.org>
Message-ID:
Can we get a Misc/NEWS entry for this puppy?
It would be great to start a test for pkgutil too. Anyone? It's a
good way to start learning about the Python core, and everyone(*) will
thank you.
n
(*) where everyone is defined as me and whoever else wants to join me
in thanks ;-)
--
On 7/25/06, georg.brandl wrote:
> Author: georg.brandl
> Date: Tue Jul 25 12:22:34 2006
> New Revision: 50819
>
> Modified:
> python/trunk/Lib/pkgutil.py
> Log:
> Patch #1525766: correctly pass onerror arg to recursive calls
> of pkg.walk_packages. Also improve the docstrings.
>
>
>
> Modified: python/trunk/Lib/pkgutil.py
> ==============================================================================
> --- python/trunk/Lib/pkgutil.py (original)
> +++ python/trunk/Lib/pkgutil.py Tue Jul 25 12:22:34 2006
> @@ -69,7 +69,28 @@
>
>
> def walk_packages(path=None, prefix='', onerror=None):
> - """Yield submodule names+loaders recursively, for path or sys.path"""
> + """Yields (module_loader, name, ispkg) for all modules recursively
> + on path, or, if path is None, all accessible modules.
> +
> + 'path' should be either None or a list of paths to look for
> + modules in.
> +
> + 'prefix' is a string to output on the front of every module name
> + on output.
> +
> + Note that this function must import all *packages* (NOT all
> + modules!) on the given path, in order to access the __path__
> + attribute to find submodules.
> +
> + 'onerror' is a function which gets called with one argument (the
> + name of the package which was being imported) if an ImportError
> + occurs trying to import a package. By default the ImportError is
> + caught and ignored.
> +
> + Examples:
> + walk_packages() : list all modules python can access
> + walk_packages(ctypes.__path__, ctypes.__name__+'.') : list all submodules of ctypes
> + """
>
> def seen(p, m={}):
> if p in m:
> @@ -84,19 +105,28 @@
> __import__(name)
> except ImportError:
> if onerror is not None:
> - onerror()
> + onerror(name)
> else:
> path = getattr(sys.modules[name], '__path__', None) or []
>
> # don't traverse path items we've seen before
> path = [p for p in path if not seen(p)]
>
> - for item in walk_packages(path, name+'.'):
> + for item in walk_packages(path, name+'.', onerror):
> yield item
>
>
> def iter_modules(path=None, prefix=''):
> - """Yield submodule names+loaders for path or sys.path"""
> + """Yields (module_loader, name, ispkg) for all submodules on path,
> + or, if path is None, all top-level modules on sys.path.
> +
> + 'path' should be either None or a list of paths to look for
> + modules in.
> +
> + 'prefix' is a string to output on the front of every module name
> + on output.
> + """
> +
> if path is None:
> importers = iter_importers()
> else:
> _______________________________________________
> Python-checkins mailing list
> Python-checkins at python.org
> http://mail.python.org/mailman/listinfo/python-checkins
>
From python-checkins at python.org Wed Jul 26 05:55:11 2006
From: python-checkins at python.org (barry.warsaw)
Date: Wed, 26 Jul 2006 05:55:11 +0200 (CEST)
Subject: [Python-checkins] r50836 -
python/branches/release23-maint/Lib/email/test/test_email.py
Message-ID: <20060726035511.80BF61E4006@bag.python.org>
Author: barry.warsaw
Date: Wed Jul 26 05:55:09 2006
New Revision: 50836
Modified:
python/branches/release23-maint/Lib/email/test/test_email.py
Log:
Fix the tests to work with Python 2.1, which email 2.5 must do.
Modified: python/branches/release23-maint/Lib/email/test/test_email.py
==============================================================================
--- python/branches/release23-maint/Lib/email/test/test_email.py (original)
+++ python/branches/release23-maint/Lib/email/test/test_email.py Wed Jul 26 05:55:09 2006
@@ -9,7 +9,7 @@
import unittest
import warnings
from cStringIO import StringIO
-from types import StringType, ListType
+from types import StringType, ListType, TupleType
import email
@@ -2757,7 +2757,7 @@
'''
msg = email.message_from_string(m)
param = msg.get_param('NAME')
- self.failIf(isinstance(param, tuple))
+ self.failIf(isinstance(param, TupleType))
self.assertEqual(
param,
'file____C__DOCUMENTS_20AND_20SETTINGS_FABIEN_LOCAL_20SETTINGS_TEMP_nsmail.htm')
@@ -2899,7 +2899,7 @@
"""
msg = email.message_from_string(m)
param = msg.get_param('name')
- self.failIf(isinstance(param, tuple))
+ self.failIf(isinstance(param, TupleType))
self.assertEqual(param, "Frank's Document")
def test_rfc2231_tick_attack_extended(self):
@@ -2923,7 +2923,7 @@
"""
msg = email.message_from_string(m)
param = msg.get_param('name')
- self.failIf(isinstance(param, tuple))
+ self.failIf(isinstance(param, TupleType))
self.assertEqual(param, "us-ascii'en-us'Frank's Document")
def test_rfc2231_no_extended_values(self):
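A minimal sketch of the compatibility idiom used in the change above, assuming a
Python 2.2-or-later interpreter (where types.TupleType is the same object as the
built-in tuple type; on 2.1 only the types-module spelling works with isinstance).
The sample triple below is illustrative, not taken from a real message:

    from types import TupleType

    # an illustrative RFC 2231 (charset, language, value) triple
    param = ('us-ascii', 'en-us', "Frank's Document")
    assert isinstance(param, TupleType)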
From python-checkins at python.org Wed Jul 26 05:56:11 2006
From: python-checkins at python.org (barry.warsaw)
Date: Wed, 26 Jul 2006 05:56:11 +0200 (CEST)
Subject: [Python-checkins] r50837 - sandbox/trunk/emailpkg/tags/2_5_8
Message-ID: <20060726035611.F194C1E4006@bag.python.org>
Author: barry.warsaw
Date: Wed Jul 26 05:56:11 2006
New Revision: 50837
Removed:
sandbox/trunk/emailpkg/tags/2_5_8/
Log:
Need to retag 2_5_8; this version doesn't work with Python 2.1.
From python-checkins at python.org Wed Jul 26 05:57:22 2006
From: python-checkins at python.org (barry.warsaw)
Date: Wed, 26 Jul 2006 05:57:22 +0200 (CEST)
Subject: [Python-checkins] r50838 - sandbox/trunk/emailpkg/tags/2_5_8
Message-ID: <20060726035722.BF6B91E4006@bag.python.org>
Author: barry.warsaw
Date: Wed Jul 26 05:57:22 2006
New Revision: 50838
Added:
sandbox/trunk/emailpkg/tags/2_5_8/
- copied from r50837, python/branches/release23-maint/Lib/email/
Log:
Retagging email 2.5.8
From python-checkins at python.org Wed Jul 26 06:00:20 2006
From: python-checkins at python.org (neal.norwitz)
Date: Wed, 26 Jul 2006 06:00:20 +0200 (CEST)
Subject: [Python-checkins] r50839 - python/trunk/Misc/build.sh
Message-ID: <20060726040020.521411E401A@bag.python.org>
Author: neal.norwitz
Date: Wed Jul 26 06:00:18 2006
New Revision: 50839
Modified:
python/trunk/Misc/build.sh
Log:
Hmm, only python2.x is installed, not plain python. Did that change recently?
Modified: python/trunk/Misc/build.sh
==============================================================================
--- python/trunk/Misc/build.sh (original)
+++ python/trunk/Misc/build.sh Wed Jul 26 06:00:18 2006
@@ -161,6 +161,10 @@
make install >& build/$F
update_status "Installing" "$F" $start
+ if [ ! -x $PYTHON ]; then
+ ln -s ${PYTHON}2.* $PYTHON
+ fi
+
## make and run basic tests
F=make-test.out
start=`current_time`
From python-checkins at python.org Wed Jul 26 07:54:47 2006
From: python-checkins at python.org (barry.warsaw)
Date: Wed, 26 Jul 2006 07:54:47 +0200 (CEST)
Subject: [Python-checkins] r50840 - in python/trunk/Lib/email: message.py
test/test_email.py test/test_email_renamed.py
Message-ID: <20060726055447.2AC9F1E4006@bag.python.org>
Author: barry.warsaw
Date: Wed Jul 26 07:54:46 2006
New Revision: 50840
Modified:
python/trunk/Lib/email/message.py
python/trunk/Lib/email/test/test_email.py
python/trunk/Lib/email/test/test_email_renamed.py
Log:
Forward port some fixes that were in email 2.5 but for some reason didn't make
it into email 4.0. Specifically, in Message.get_content_charset(), handle RFC
2231 headers that contain an encoding not known to Python, or a character in
the data that isn't in the charset encoding. Also forward port the
appropriate unit tests.
Modified: python/trunk/Lib/email/message.py
==============================================================================
--- python/trunk/Lib/email/message.py (original)
+++ python/trunk/Lib/email/message.py Wed Jul 26 07:54:46 2006
@@ -747,7 +747,18 @@
if isinstance(charset, tuple):
# RFC 2231 encoded, so decode it, and it better end up as ascii.
pcharset = charset[0] or 'us-ascii'
- charset = unicode(charset[2], pcharset).encode('us-ascii')
+ try:
+ # LookupError will be raised if the charset isn't known to
+ # Python. UnicodeError will be raised if the encoded text
+ # contains a character not in the charset.
+ charset = unicode(charset[2], pcharset).encode('us-ascii')
+ except (LookupError, UnicodeError):
+ charset = charset[2]
+ # charset character must be in us-ascii range
+ try:
+ charset = unicode(charset, 'us-ascii').encode('us-ascii')
+ except UnicodeError:
+ return failobj
# RFC 2046, $4.1.2 says charsets are not case sensitive
return charset.lower()
Modified: python/trunk/Lib/email/test/test_email.py
==============================================================================
--- python/trunk/Lib/email/test/test_email.py (original)
+++ python/trunk/Lib/email/test/test_email.py Wed Jul 26 07:54:46 2006
@@ -3086,6 +3086,50 @@
self.assertEqual(msg.get_content_charset(),
'this is even more ***fun*** is it not.pdf')
+ def test_rfc2231_bad_encoding_in_filename(self):
+ m = '''\
+Content-Disposition: inline;
+\tfilename*0*="bogus'xx'This%20is%20even%20more%20";
+\tfilename*1*="%2A%2A%2Afun%2A%2A%2A%20";
+\tfilename*2="is it not.pdf"
+
+'''
+ msg = email.message_from_string(m)
+ self.assertEqual(msg.get_filename(),
+ 'This is even more ***fun*** is it not.pdf')
+
+ def test_rfc2231_bad_encoding_in_charset(self):
+ m = """\
+Content-Type: text/plain; charset*=bogus''utf-8%E2%80%9D
+
+"""
+ msg = email.message_from_string(m)
+ # This should return None because non-ascii characters in the charset
+ # are not allowed.
+ self.assertEqual(msg.get_content_charset(), None)
+
+ def test_rfc2231_bad_character_in_charset(self):
+ m = """\
+Content-Type: text/plain; charset*=ascii''utf-8%E2%80%9D
+
+"""
+ msg = email.message_from_string(m)
+ # This should return None because non-ascii characters in the charset
+ # are not allowed.
+ self.assertEqual(msg.get_content_charset(), None)
+
+ def test_rfc2231_bad_character_in_filename(self):
+ m = '''\
+Content-Disposition: inline;
+\tfilename*0*="ascii'xx'This%20is%20even%20more%20";
+\tfilename*1*="%2A%2A%2Afun%2A%2A%2A%20";
+\tfilename*2*="is it not.pdf%E2"
+
+'''
+ msg = email.message_from_string(m)
+ self.assertEqual(msg.get_filename(),
+ u'This is even more ***fun*** is it not.pdf\ufffd')
+
def test_rfc2231_unknown_encoding(self):
m = """\
Content-Transfer-Encoding: 8bit
Modified: python/trunk/Lib/email/test/test_email_renamed.py
==============================================================================
--- python/trunk/Lib/email/test/test_email_renamed.py (original)
+++ python/trunk/Lib/email/test/test_email_renamed.py Wed Jul 26 07:54:46 2006
@@ -3092,6 +3092,50 @@
self.assertEqual(msg.get_content_charset(),
'this is even more ***fun*** is it not.pdf')
+ def test_rfc2231_bad_encoding_in_filename(self):
+ m = '''\
+Content-Disposition: inline;
+\tfilename*0*="bogus'xx'This%20is%20even%20more%20";
+\tfilename*1*="%2A%2A%2Afun%2A%2A%2A%20";
+\tfilename*2="is it not.pdf"
+
+'''
+ msg = email.message_from_string(m)
+ self.assertEqual(msg.get_filename(),
+ 'This is even more ***fun*** is it not.pdf')
+
+ def test_rfc2231_bad_encoding_in_charset(self):
+ m = """\
+Content-Type: text/plain; charset*=bogus''utf-8%E2%80%9D
+
+"""
+ msg = email.message_from_string(m)
+ # This should return None because non-ascii characters in the charset
+ # are not allowed.
+ self.assertEqual(msg.get_content_charset(), None)
+
+ def test_rfc2231_bad_character_in_charset(self):
+ m = """\
+Content-Type: text/plain; charset*=ascii''utf-8%E2%80%9D
+
+"""
+ msg = email.message_from_string(m)
+ # This should return None because non-ascii characters in the charset
+ # are not allowed.
+ self.assertEqual(msg.get_content_charset(), None)
+
+ def test_rfc2231_bad_character_in_filename(self):
+ m = '''\
+Content-Disposition: inline;
+\tfilename*0*="ascii'xx'This%20is%20even%20more%20";
+\tfilename*1*="%2A%2A%2Afun%2A%2A%2A%20";
+\tfilename*2*="is it not.pdf%E2"
+
+'''
+ msg = email.message_from_string(m)
+ self.assertEqual(msg.get_filename(),
+ u'This is even more ***fun*** is it not.pdf\ufffd')
+
def test_rfc2231_unknown_encoding(self):
m = """\
Content-Transfer-Encoding: 8bit
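A short usage sketch of the behavior the new tests above pin down, assuming the
patched Python 2.5 email package (the header value is the same bogus charset*=
parameter the tests use):

    import email

    m = "Content-Type: text/plain; charset*=bogus''utf-8%E2%80%9D\n\n"
    msg = email.message_from_string(m)
    # the RFC 2231 charset cannot be decoded to us-ascii, so the accessor
    # now falls back to its failobj (None by default) instead of raising
    assert msg.get_content_charset() is None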
From python-checkins at python.org Wed Jul 26 09:23:33 2006
From: python-checkins at python.org (georg.brandl)
Date: Wed, 26 Jul 2006 09:23:33 +0200 (CEST)
Subject: [Python-checkins] r50841 - python/trunk/Misc/NEWS
Message-ID: <20060726072333.78A151E400F@bag.python.org>
Author: georg.brandl
Date: Wed Jul 26 09:23:32 2006
New Revision: 50841
Modified:
python/trunk/Misc/NEWS
Log:
NEWS entry for #1525766.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Wed Jul 26 09:23:32 2006
@@ -42,6 +42,9 @@
Library
-------
+- Patch #1525766: In pkgutil.walk_packages, correctly pass the onerror callback
+ to recursive calls and call it with the failing package name.
+
- Bug #1525817: Don't truncate short lines in IDLE's tool tips.
- Patch #1515343: Fix printing of deprecated string exceptions with a
From python-checkins at python.org Wed Jul 26 09:40:19 2006
From: python-checkins at python.org (georg.brandl)
Date: Wed, 26 Jul 2006 09:40:19 +0200 (CEST)
Subject: [Python-checkins] r50842 - in python/trunk:
Lib/test/test_urllib2.py Lib/urllib.py Lib/urllib2.py Misc/NEWS
Message-ID: <20060726074019.11C051E4011@bag.python.org>
Author: georg.brandl
Date: Wed Jul 26 09:40:17 2006
New Revision: 50842
Modified:
python/trunk/Lib/test/test_urllib2.py
python/trunk/Lib/urllib.py
python/trunk/Lib/urllib2.py
python/trunk/Misc/NEWS
Log:
Bug #1459963: properly capitalize HTTP header names.
Modified: python/trunk/Lib/test/test_urllib2.py
==============================================================================
--- python/trunk/Lib/test/test_urllib2.py (original)
+++ python/trunk/Lib/test/test_urllib2.py Wed Jul 26 09:40:17 2006
@@ -676,11 +676,11 @@
r = MockResponse(200, "OK", {}, "")
newreq = h.do_request_(req)
if data is None: # GET
- self.assert_("Content-length" not in req.unredirected_hdrs)
- self.assert_("Content-type" not in req.unredirected_hdrs)
+ self.assert_("Content-Length" not in req.unredirected_hdrs)
+ self.assert_("Content-Type" not in req.unredirected_hdrs)
else: # POST
- self.assertEqual(req.unredirected_hdrs["Content-length"], "0")
- self.assertEqual(req.unredirected_hdrs["Content-type"],
+ self.assertEqual(req.unredirected_hdrs["Content-Length"], "0")
+ self.assertEqual(req.unredirected_hdrs["Content-Type"],
"application/x-www-form-urlencoded")
# XXX the details of Host could be better tested
self.assertEqual(req.unredirected_hdrs["Host"], "example.com")
@@ -692,8 +692,8 @@
req.add_unredirected_header("Host", "baz")
req.add_unredirected_header("Spam", "foo")
newreq = h.do_request_(req)
- self.assertEqual(req.unredirected_hdrs["Content-length"], "foo")
- self.assertEqual(req.unredirected_hdrs["Content-type"], "bar")
+ self.assertEqual(req.unredirected_hdrs["Content-Length"], "foo")
+ self.assertEqual(req.unredirected_hdrs["Content-Type"], "bar")
self.assertEqual(req.unredirected_hdrs["Host"], "baz")
self.assertEqual(req.unredirected_hdrs["Spam"], "foo")
@@ -847,7 +847,7 @@
407, 'Proxy-Authenticate: Basic realm="%s"\r\n\r\n' % realm)
opener.add_handler(auth_handler)
opener.add_handler(http_handler)
- self._test_basic_auth(opener, auth_handler, "Proxy-authorization",
+ self._test_basic_auth(opener, auth_handler, "Proxy-Authorization",
realm, http_handler, password_manager,
"http://acme.example.com:3128/protected",
"proxy.example.com:3128",
Modified: python/trunk/Lib/urllib.py
==============================================================================
--- python/trunk/Lib/urllib.py (original)
+++ python/trunk/Lib/urllib.py Wed Jul 26 09:40:17 2006
@@ -118,7 +118,7 @@
self.proxies = proxies
self.key_file = x509.get('key_file')
self.cert_file = x509.get('cert_file')
- self.addheaders = [('User-agent', self.version)]
+ self.addheaders = [('User-Agent', self.version)]
self.__tempfiles = []
self.__unlink = os.unlink # See cleanup()
self.tempcache = None
@@ -314,8 +314,8 @@
h = httplib.HTTP(host)
if data is not None:
h.putrequest('POST', selector)
- h.putheader('Content-type', 'application/x-www-form-urlencoded')
- h.putheader('Content-length', '%d' % len(data))
+ h.putheader('Content-Type', 'application/x-www-form-urlencoded')
+ h.putheader('Content-Length', '%d' % len(data))
else:
h.putrequest('GET', selector)
if proxy_auth: h.putheader('Proxy-Authorization', 'Basic %s' % proxy_auth)
@@ -400,9 +400,9 @@
cert_file=self.cert_file)
if data is not None:
h.putrequest('POST', selector)
- h.putheader('Content-type',
+ h.putheader('Content-Type',
'application/x-www-form-urlencoded')
- h.putheader('Content-length', '%d' % len(data))
+ h.putheader('Content-Length', '%d' % len(data))
else:
h.putrequest('GET', selector)
if proxy_auth: h.putheader('Proxy-Authorization: Basic %s' % proxy_auth)
@@ -584,7 +584,7 @@
data = base64.decodestring(data)
else:
data = unquote(data)
- msg.append('Content-length: %d' % len(data))
+ msg.append('Content-Length: %d' % len(data))
msg.append('')
msg.append(data)
msg = '\n'.join(msg)
Modified: python/trunk/Lib/urllib2.py
==============================================================================
--- python/trunk/Lib/urllib2.py (original)
+++ python/trunk/Lib/urllib2.py Wed Jul 26 09:40:17 2006
@@ -263,11 +263,11 @@
def add_header(self, key, val):
# useful for something like authentication
- self.headers[key.capitalize()] = val
+ self.headers[key.title()] = val
def add_unredirected_header(self, key, val):
# will not be added to a redirected request
- self.unredirected_hdrs[key.capitalize()] = val
+ self.unredirected_hdrs[key.title()] = val
def has_header(self, header_name):
return (header_name in self.headers or
@@ -286,7 +286,7 @@
class OpenerDirector:
def __init__(self):
client_version = "Python-urllib/%s" % __version__
- self.addheaders = [('User-agent', client_version)]
+ self.addheaders = [('User-Agent', client_version)]
# manage the individual handlers
self.handlers = []
self.handle_open = {}
@@ -675,7 +675,7 @@
if user and password:
user_pass = '%s:%s' % (unquote(user), unquote(password))
creds = base64.encodestring(user_pass).strip()
- req.add_header('Proxy-authorization', 'Basic ' + creds)
+ req.add_header('Proxy-Authorization', 'Basic ' + creds)
hostport = unquote(hostport)
req.set_proxy(hostport, proxy_type)
if orig_type == proxy_type:
@@ -819,7 +819,7 @@
class ProxyBasicAuthHandler(AbstractBasicAuthHandler, BaseHandler):
- auth_header = 'Proxy-authorization'
+ auth_header = 'Proxy-Authorization'
def http_error_407(self, req, fp, code, msg, headers):
# http_error_auth_reqed requires that there is no userinfo component in
@@ -1022,20 +1022,20 @@
if request.has_data(): # POST
data = request.get_data()
- if not request.has_header('Content-type'):
+ if not request.has_header('Content-Type'):
request.add_unredirected_header(
- 'Content-type',
+ 'Content-Type',
'application/x-www-form-urlencoded')
- if not request.has_header('Content-length'):
+ if not request.has_header('Content-Length'):
request.add_unredirected_header(
- 'Content-length', '%d' % len(data))
+ 'Content-Length', '%d' % len(data))
scheme, sel = splittype(request.get_selector())
sel_host, sel_path = splithost(sel)
if not request.has_header('Host'):
request.add_unredirected_header('Host', sel_host or host)
for name, value in self.parent.addheaders:
- name = name.capitalize()
+ name = name.title()
if not request.has_header(name):
request.add_unredirected_header(name, value)
@@ -1217,7 +1217,7 @@
modified = email.Utils.formatdate(stats.st_mtime, usegmt=True)
mtype = mimetypes.guess_type(file)[0]
headers = mimetools.Message(StringIO(
- 'Content-type: %s\nContent-length: %d\nLast-modified: %s\n' %
+ 'Content-Type: %s\nContent-Length: %d\nLast-Modified: %s\n' %
(mtype or 'text/plain', size, modified)))
if host:
host, port = splitport(host)
@@ -1272,9 +1272,9 @@
headers = ""
mtype = mimetypes.guess_type(req.get_full_url())[0]
if mtype:
- headers += "Content-type: %s\n" % mtype
+ headers += "Content-Type: %s\n" % mtype
if retrlen is not None and retrlen >= 0:
- headers += "Content-length: %d\n" % retrlen
+ headers += "Content-Length: %d\n" % retrlen
sf = StringIO(headers)
headers = mimetools.Message(sf)
return addinfourl(fp, headers, req.get_full_url())
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Wed Jul 26 09:40:17 2006
@@ -42,6 +42,9 @@
Library
-------
+- Bug #1459963: urllib and urllib2 now normalize HTTP header names correctly
+ with title().
+
- Patch #1525766: In pkgutil.walk_packages, correctly pass the onerror callback
to recursive calls and call it with the failing package name.
From python-checkins at python.org Wed Jul 26 10:03:11 2006
From: python-checkins at python.org (georg.brandl)
Date: Wed, 26 Jul 2006 10:03:11 +0200 (CEST)
Subject: [Python-checkins] r50843 - in python/trunk:
Lib/test/test_getargs2.py Modules/_testcapimodule.c Python/getargs.c
Message-ID: <20060726080311.87BB01E4006@bag.python.org>
Author: georg.brandl
Date: Wed Jul 26 10:03:10 2006
New Revision: 50843
Modified:
python/trunk/Lib/test/test_getargs2.py
python/trunk/Modules/_testcapimodule.c
python/trunk/Python/getargs.c
Log:
Part of bug #1523610: fix miscalculation of buffer length.
Also add a guard against NULL in converttuple and add a test case
(that previously would have crashed).
Modified: python/trunk/Lib/test/test_getargs2.py
==============================================================================
--- python/trunk/Lib/test/test_getargs2.py (original)
+++ python/trunk/Lib/test/test_getargs2.py Wed Jul 26 10:03:10 2006
@@ -233,8 +233,25 @@
self.failUnlessEqual(VERY_LARGE & ULLONG_MAX, getargs_K(VERY_LARGE))
+
+class Tuple_TestCase(unittest.TestCase):
+ def test_tuple(self):
+ from _testcapi import getargs_tuple
+
+ ret = getargs_tuple(1, (2, 3))
+ self.assertEquals(ret, (1,2,3))
+
+ # make sure invalid tuple arguments are handled correctly
+ class seq:
+ def __len__(self):
+ return 2
+ def __getitem__(self, n):
+ raise ValueError
+ self.assertRaises(TypeError, getargs_tuple, 1, seq())
+
+
def test_main():
- tests = [Signed_TestCase, Unsigned_TestCase]
+ tests = [Signed_TestCase, Unsigned_TestCase, Tuple_TestCase]
try:
from _testcapi import getargs_L, getargs_K
except ImportError:
Modified: python/trunk/Modules/_testcapimodule.c
==============================================================================
--- python/trunk/Modules/_testcapimodule.c (original)
+++ python/trunk/Modules/_testcapimodule.c Wed Jul 26 10:03:10 2006
@@ -294,6 +294,16 @@
#endif /* ifdef HAVE_LONG_LONG */
+/* Test tuple argument processing */
+static PyObject *
+getargs_tuple(PyObject *self, PyObject *args)
+{
+ int a, b, c;
+ if (!PyArg_ParseTuple(args, "i(ii)", &a, &b, &c))
+ return NULL;
+ return Py_BuildValue("iii", a, b, c);
+}
+
/* Functions to call PyArg_ParseTuple with integer format codes,
and return the result.
*/
@@ -707,6 +717,7 @@
{"test_null_strings", (PyCFunction)test_null_strings, METH_NOARGS},
{"test_string_from_format", (PyCFunction)test_string_from_format, METH_NOARGS},
+ {"getargs_tuple", getargs_tuple, METH_VARARGS},
{"getargs_b", getargs_b, METH_VARARGS},
{"getargs_B", getargs_B, METH_VARARGS},
{"getargs_H", getargs_H, METH_VARARGS},
Modified: python/trunk/Python/getargs.c
==============================================================================
--- python/trunk/Python/getargs.c (original)
+++ python/trunk/Python/getargs.c Wed Jul 26 10:03:10 2006
@@ -351,8 +351,8 @@
"argument %d", iarg);
i = 0;
p += strlen(p);
- while (levels[i] > 0 && (int)(p-buf) < 220) {
- PyOS_snprintf(p, sizeof(buf) - (buf - p),
+ while (levels[i] > 0 && i < 32 && (int)(p-buf) < 220) {
+ PyOS_snprintf(p, sizeof(buf) - (p - buf),
", item %d", levels[i]-1);
p += strlen(p);
i++;
@@ -439,6 +439,13 @@
char *msg;
PyObject *item;
item = PySequence_GetItem(arg, i);
+ if (item == NULL) {
+ PyErr_Clear();
+ levels[0] = i+1;
+ levels[1] = 0;
+ strncpy(msgbuf, "is not retrievable", bufsize);
+ return msgbuf;
+ }
msg = convertitem(item, &format, p_va, flags, levels+1,
msgbuf, bufsize, freelist);
/* PySequence_GetItem calls tp->sq_item, which INCREFs */
@@ -1509,6 +1516,7 @@
else {
msg = skipitem(&format, p_va, flags);
if (msg) {
+ levels[0] = 0;
seterror(i+1, msg, levels, fname, message);
return cleanreturn(0, freelist);
}
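A hedged reproduction of the case the new guard covers, written against the
getargs_tuple helper added to _testcapi in this commit (so it assumes a CPython
build that ships _testcapi):

    from _testcapi import getargs_tuple

    # the normal case: "i(ii)" unpacks a nested tuple into three ints
    assert getargs_tuple(1, (2, 3)) == (1, 2, 3)

    class BadSeq:
        # a "sequence" whose item access fails: PySequence_GetItem returns
        # NULL, which previously crashed converttuple and now raises TypeError
        def __len__(self):
            return 2
        def __getitem__(self, n):
            raise ValueError

    try:
        getargs_tuple(1, BadSeq())
    except TypeError:
        pass  # expected after the fix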
From buildbot at python.org Wed Jul 26 10:55:56 2006
From: buildbot at python.org (buildbot at python.org)
Date: Wed, 26 Jul 2006 08:55:56 +0000
Subject: [Python-checkins] buildbot warnings in PPC64 Debian trunk
Message-ID: <20060726085556.5F9E61E4006@bag.python.org>
The Buildbot has detected a new failure of PPC64 Debian trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/PPC64%2520Debian%2520trunk/builds/288
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Wed Jul 26 14:12:57 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Wed, 26 Jul 2006 14:12:57 +0200 (CEST)
Subject: [Python-checkins] r50844 - in python/trunk: Lib/httplib.py
Lib/socket.py Misc/NEWS
Message-ID: <20060726121257.179601E4009@bag.python.org>
Author: martin.v.loewis
Date: Wed Jul 26 14:12:56 2006
New Revision: 50844
Modified:
python/trunk/Lib/httplib.py
python/trunk/Lib/socket.py
python/trunk/Misc/NEWS
Log:
Bug #978833: Really close underlying socket in _socketobject.close.
Fix httplib.HTTPConnection.getresponse to not close the
socket if it is still needed for the response.
Modified: python/trunk/Lib/httplib.py
==============================================================================
--- python/trunk/Lib/httplib.py (original)
+++ python/trunk/Lib/httplib.py Wed Jul 26 14:12:56 2006
@@ -926,8 +926,8 @@
self.__state = _CS_IDLE
if response.will_close:
- # this effectively passes the connection to the response
- self.close()
+ # Pass the socket to the response
+ self.sock = None
else:
# remember this, so we can tell when it is complete
self.__response = response
Modified: python/trunk/Lib/socket.py
==============================================================================
--- python/trunk/Lib/socket.py (original)
+++ python/trunk/Lib/socket.py Wed Jul 26 14:12:56 2006
@@ -139,6 +139,8 @@
__slots__ = []
def _dummy(*args):
raise error(EBADF, 'Bad file descriptor')
+ def close(self):
+ pass
# All _delegate_methods must also be initialized here.
send = recv = recv_into = sendto = recvfrom = recvfrom_into = _dummy
__getattr__ = _dummy
@@ -157,6 +159,7 @@
setattr(self, method, getattr(_sock, method))
def close(self):
+ self._sock.close()
self._sock = _closedsocket()
dummy = self._sock._dummy
for method in _delegate_methods:
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Wed Jul 26 14:12:56 2006
@@ -42,6 +42,8 @@
Library
-------
+- Bug #978833: Really close underlying socket in _socketobject.close.
+
- Bug #1459963: urllib and urllib2 now normalize HTTP header names correctly
with title().
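A hedged sketch of what the httplib half of this fix means for callers; the host
below is only illustrative, and this assumes the Python 2 httplib API with a server
that closes the connection after the response:

    import httplib

    conn = httplib.HTTPConnection('www.example.com')
    conn.request('GET', '/')
    resp = conn.getresponse()
    # for a will-close response the connection now just drops its reference
    # (self.sock = None) instead of closing the socket, so the body is still
    # readable here; the response closes the socket once it has been consumed
    body = resp.read()
    conn.close()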
From jackdied at jackdied.com Wed Jul 26 17:27:05 2006
From: jackdied at jackdied.com (Jack Diederich)
Date: Wed, 26 Jul 2006 11:27:05 -0400
Subject: [Python-checkins] r50842 - in python/trunk:
Lib/test/test_urllib2.py Lib/urllib.py Lib/urllib2.py Misc/NEWS
In-Reply-To: <20060726074019.11C051E4011@bag.python.org>
References: <20060726074019.11C051E4011@bag.python.org>
Message-ID: <20060726152705.GE31035@performancedrivers.com>
On Wed, Jul 26, 2006 at 09:40:19AM +0200, georg.brandl wrote:
> Author: georg.brandl
> Date: Wed Jul 26 09:40:17 2006
> New Revision: 50842
>
> Modified:
> python/trunk/Lib/test/test_urllib2.py
> python/trunk/Lib/urllib.py
> python/trunk/Lib/urllib2.py
> python/trunk/Misc/NEWS
> Log:
> Bug #1459963: properly capitalize HTTP header names.
http://www.w3.org/Protocols/rfc2616/rfc2616-sec4.html
Section 4.2 "Field names are case-insensitive"
So "properly" might not be the best word. That said this does seem to be
the most popular way to case them. The bug submitter doesn't say what
broken server he was having trouble with that demanded them in this format.
I expect this change will break someone's [equally broken] tests though
not as many as it fixes.
-Jack
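For reference, a minimal sketch of the difference the patch turns on, using nothing
beyond built-in string methods:

    # capitalize() only uppercases the first character of the whole string,
    # while title() uppercases the first letter of each word, which is the
    # conventional casing for multi-word HTTP header names.
    assert 'content-length'.capitalize() == 'Content-length'
    assert 'content-length'.title() == 'Content-Length'
    assert 'proxy-authorization'.title() == 'Proxy-Authorization'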
From python-checkins at python.org Wed Jul 26 19:16:53 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Wed, 26 Jul 2006 19:16:53 +0200 (CEST)
Subject: [Python-checkins] r50845 - in python/trunk: Misc/NEWS
Modules/_cursesmodule.c
Message-ID: <20060726171653.E553F1E4007@bag.python.org>
Author: andrew.kuchling
Date: Wed Jul 26 19:16:52 2006
New Revision: 50845
Modified:
python/trunk/Misc/NEWS
python/trunk/Modules/_cursesmodule.c
Log:
[Bug #1471938] Fix build problem on Solaris 8 by conditionalizing the use of mvwgetnstr(); it was conditionalized a few lines below. Fix from Paul Eggert. I also tried out the STRICT_SYSV_CURSES case and am therefore removing the 'untested' comment.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Wed Jul 26 19:16:52 2006
@@ -42,6 +42,9 @@
Library
-------
+- Bug #1471938: Fix curses module build problem on Solaris 8; patch by
+ Paul Eggert.
+
- Bug #978833: Really close underlying socket in _socketobject.close.
- Bug #1459963: urllib and urllib2 now normalize HTTP header names correctly
Modified: python/trunk/Modules/_cursesmodule.c
==============================================================================
--- python/trunk/Modules/_cursesmodule.c (original)
+++ python/trunk/Modules/_cursesmodule.c Wed Jul 26 19:16:52 2006
@@ -43,7 +43,7 @@
del_curterm delscreen dupwin inchnstr inchstr innstr keyok
mcprint mvaddchnstr mvaddchstr mvchgat mvcur mvinchnstr
mvinchstr mvinnstr mmvwaddchnstr mvwaddchstr mvwchgat
- mvwgetnstr mvwinchnstr mvwinchstr mvwinnstr newterm
+ mvwinchnstr mvwinchstr mvwinnstr newterm
restartterm ripoffline scr_dump
scr_init scr_restore scr_set scrl set_curterm set_term setterm
tgetent tgetflag tgetnum tgetstr tgoto timeout tputs
@@ -819,14 +819,17 @@
if (!PyArg_ParseTuple(args,"ii;y,x",&y,&x))
return NULL;
Py_BEGIN_ALLOW_THREADS
+#ifdef STRICT_SYSV_CURSES
+ rtn2 = wmove(self->win,y,x)==ERR ? ERR : wgetnstr(self->win, rtn, 1023);
+#else
rtn2 = mvwgetnstr(self->win,y,x,rtn, 1023);
+#endif
Py_END_ALLOW_THREADS
break;
case 3:
if (!PyArg_ParseTuple(args,"iii;y,x,n", &y, &x, &n))
return NULL;
#ifdef STRICT_SYSV_CURSES
- /* Untested */
Py_BEGIN_ALLOW_THREADS
rtn2 = wmove(self->win,y,x)==ERR ? ERR :
wgetnstr(self->win, rtn, MIN(n, 1023));
From python-checkins at python.org Wed Jul 26 19:18:01 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Wed, 26 Jul 2006 19:18:01 +0200 (CEST)
Subject: [Python-checkins] r50846 - python/trunk/Modules/_cursesmodule.c
Message-ID: <20060726171801.CEE361E4007@bag.python.org>
Author: andrew.kuchling
Date: Wed Jul 26 19:18:01 2006
New Revision: 50846
Modified:
python/trunk/Modules/_cursesmodule.c
Log:
Correct error message
Modified: python/trunk/Modules/_cursesmodule.c
==============================================================================
--- python/trunk/Modules/_cursesmodule.c (original)
+++ python/trunk/Modules/_cursesmodule.c Wed Jul 26 19:18:01 2006
@@ -841,7 +841,7 @@
#endif
break;
default:
- PyErr_SetString(PyExc_TypeError, "getstr requires 0 to 2 arguments");
+ PyErr_SetString(PyExc_TypeError, "getstr requires 0 to 3 arguments");
return NULL;
}
if (rtn2 == ERR)
From python-checkins at python.org Wed Jul 26 19:19:40 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Wed, 26 Jul 2006 19:19:40 +0200 (CEST)
Subject: [Python-checkins] r50847 - python/trunk/Misc/NEWS
Message-ID: <20060726171940.0C3D61E4007@bag.python.org>
Author: andrew.kuchling
Date: Wed Jul 26 19:19:39 2006
New Revision: 50847
Modified:
python/trunk/Misc/NEWS
Log:
Minor grammar fix
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Wed Jul 26 19:19:39 2006
@@ -27,7 +27,7 @@
- Patch #1521179: Python now accepts the standard options ``--help`` and
``--version`` as well as ``/?`` on Windows.
-- Bug #1520864: unpacking singleton tuples in for loop (for x, in) work
+- Bug #1520864: unpacking singleton tuples in a 'for' loop (for x, in) works
again. Fixing this problem required changing the .pyc magic number.
This means that .pyc files generated before 2.5c1 will be regenerated.
From python-checkins at python.org Wed Jul 26 19:22:21 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Wed, 26 Jul 2006 19:22:21 +0200 (CEST)
Subject: [Python-checkins] r50848 - python/trunk/Misc/NEWS
Message-ID: <20060726172221.9B7341E4007@bag.python.org>
Author: andrew.kuchling
Date: Wed Jul 26 19:22:21 2006
New Revision: 50848
Modified:
python/trunk/Misc/NEWS
Log:
Put news item in right section
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Wed Jul 26 19:22:21 2006
@@ -42,9 +42,6 @@
Library
-------
-- Bug #1471938: Fix curses module build problem on Solaris 8; patch by
- Paul Eggert.
-
- Bug #978833: Really close underlying socket in _socketobject.close.
- Bug #1459963: urllib and urllib2 now normalize HTTP header names correctly
@@ -91,6 +88,9 @@
Extension Modules
-----------------
+- Bug #1471938: Fix curses module build problem on Solaris 8; patch by
+ Paul Eggert.
+
- Patch #1448199: Release interpreter lock in _winreg.ConnectRegistry.
- Patch #1521817: Index range checking on ctypes arrays containing
From python-checkins at python.org Wed Jul 26 19:25:54 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Wed, 26 Jul 2006 19:25:54 +0200 (CEST)
Subject: [Python-checkins] r50849 - in python/branches/release24-maint:
Misc/NEWS Modules/_cursesmodule.c
Message-ID: <20060726172554.E35891E4011@bag.python.org>
Author: andrew.kuchling
Date: Wed Jul 26 19:25:53 2006
New Revision: 50849
Modified:
python/branches/release24-maint/Misc/NEWS
python/branches/release24-maint/Modules/_cursesmodule.c
Log:
[Bug #1471938] Fix build problem on Solaris 8 by conditionalizing the use of mvwgetnstr(); it was conditionalized a few lines below. Fix from Paul Eggert. I also tried out the STRICT_SYSV_CURSES case and am therefore removing the 'untested' comment.
Modified: python/branches/release24-maint/Misc/NEWS
==============================================================================
--- python/branches/release24-maint/Misc/NEWS (original)
+++ python/branches/release24-maint/Misc/NEWS Wed Jul 26 19:25:53 2006
@@ -31,6 +31,9 @@
Extension Modules
-----------------
+- Bug #1471938: Fix curses module build problem on Solaris 8; patch by
+ Paul Eggert.
+
- Bug #1512695: cPickle.loads could crash if it was interrupted with
a KeyboardInterrupt.
Modified: python/branches/release24-maint/Modules/_cursesmodule.c
==============================================================================
--- python/branches/release24-maint/Modules/_cursesmodule.c (original)
+++ python/branches/release24-maint/Modules/_cursesmodule.c Wed Jul 26 19:25:53 2006
@@ -43,7 +43,7 @@
del_curterm delscreen dupwin inchnstr inchstr innstr keyok
mcprint mvaddchnstr mvaddchstr mvchgat mvcur mvinchnstr
mvinchstr mvinnstr mmvwaddchnstr mvwaddchstr mvwchgat
- mvwgetnstr mvwinchnstr mvwinchstr mvwinnstr newterm
+ mvwinchnstr mvwinchstr mvwinnstr newterm
resizeterm restartterm ripoffline scr_dump
scr_init scr_restore scr_set scrl set_curterm set_term setterm
tgetent tgetflag tgetnum tgetstr tgoto timeout tputs
@@ -819,14 +819,17 @@
if (!PyArg_ParseTuple(args,"ii;y,x",&y,&x))
return NULL;
Py_BEGIN_ALLOW_THREADS
+#ifdef STRICT_SYSV_CURSES
+ rtn2 = wmove(self->win,y,x)==ERR ? ERR : wgetnstr(self->win, rtn, 1023);
+#else
rtn2 = mvwgetnstr(self->win,y,x,rtn, 1023);
+#endif
Py_END_ALLOW_THREADS
break;
case 3:
if (!PyArg_ParseTuple(args,"iii;y,x,n", &y, &x, &n))
return NULL;
#ifdef STRICT_SYSV_CURSES
- /* Untested */
Py_BEGIN_ALLOW_THREADS
rtn2 = wmove(self->win,y,x)==ERR ? ERR :
wgetnstr(self->win, rtn, MIN(n, 1023));
@@ -838,7 +841,7 @@
#endif
break;
default:
- PyErr_SetString(PyExc_TypeError, "getstr requires 0 to 2 arguments");
+ PyErr_SetString(PyExc_TypeError, "getstr requires 0 to 3 arguments");
return NULL;
}
if (rtn2 == ERR)
From buildbot at python.org Wed Jul 26 19:30:55 2006
From: buildbot at python.org (buildbot at python.org)
Date: Wed, 26 Jul 2006 17:30:55 +0000
Subject: [Python-checkins] buildbot warnings in alpha Debian trunk
Message-ID: <20060726173055.6C3111E4013@bag.python.org>
The Buildbot has detected a new failure of alpha Debian trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/alpha%2520Debian%2520trunk/builds/461
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: barry.warsaw
Build Had Warnings: warnings failed slave lost
sincerely,
-The Buildbot
From buildbot at python.org Wed Jul 26 19:42:43 2006
From: buildbot at python.org (buildbot at python.org)
Date: Wed, 26 Jul 2006 17:42:43 +0000
Subject: [Python-checkins] buildbot warnings in x86 W2k 2.4
Message-ID: <20060726174243.7456C1E4007@bag.python.org>
The Buildbot has detected a new failure of x86 W2k 2.4.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520W2k%25202.4/builds/179
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch branches/release24-maint] HEAD
Blamelist: andrew.kuchling
Build Had Warnings: warnings failed slave lost
sincerely,
-The Buildbot
From python-checkins at python.org Wed Jul 26 20:03:13 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Wed, 26 Jul 2006 20:03:13 +0200 (CEST)
Subject: [Python-checkins] r50850 - python/trunk/Tools/faqwiz/faqw.py
Message-ID: <20060726180313.35B991E4007@bag.python.org>
Author: andrew.kuchling
Date: Wed Jul 26 20:03:12 2006
New Revision: 50850
Modified:
python/trunk/Tools/faqwiz/faqw.py
Log:
Use sys.exc_info()
Modified: python/trunk/Tools/faqwiz/faqw.py
==============================================================================
--- python/trunk/Tools/faqwiz/faqw.py (original)
+++ python/trunk/Tools/faqwiz/faqw.py Wed Jul 26 20:03:12 2006
@@ -27,7 +27,7 @@
except SystemExit, n:
sys.exit(n)
except:
- t, v, tb = sys.exc_type, sys.exc_value, sys.exc_traceback
+ t, v, tb = sys.exc_info()
print
import cgi
cgi.print_exception(t, v, tb)
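A short sketch of the idiom being adopted here (plain Python 2, no further
assumptions): sys.exc_info() returns the (type, value, traceback) triple for the
exception currently being handled, replacing the deprecated and non-thread-safe
sys.exc_type/exc_value/exc_traceback globals.

    import sys

    try:
        1 / 0
    except:
        t, v, tb = sys.exc_info()
        # t is the exception class, v the instance, tb the traceback object
        print t.__name__, v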
From buildbot at python.org Wed Jul 26 20:03:22 2006
From: buildbot at python.org (buildbot at python.org)
Date: Wed, 26 Jul 2006 18:03:22 +0000
Subject: [Python-checkins] buildbot warnings in g4 osx.4 trunk
Message-ID: <20060726180322.EDC251E4007@bag.python.org>
The Buildbot has detected a new failure of g4 osx.4 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/g4%2520osx.4%2520trunk/builds/1230
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: andrew.kuchling
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Wed Jul 26 20:08:24 2006
From: buildbot at python.org (buildbot at python.org)
Date: Wed, 26 Jul 2006 18:08:24 +0000
Subject: [Python-checkins] buildbot warnings in x86 OpenBSD 2.4
Message-ID: <20060726180824.66B231E401C@bag.python.org>
The Buildbot has detected a new failure of x86 OpenBSD 2.4.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520OpenBSD%25202.4/builds/137
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch branches/release24-maint] HEAD
Blamelist: andrew.kuchling
Build Had Warnings: warnings failed slave lost
sincerely,
-The Buildbot
From python-checkins at python.org Wed Jul 26 20:15:46 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Wed, 26 Jul 2006 20:15:46 +0200 (CEST)
Subject: [Python-checkins] r50851 -
python/trunk/Tools/webchecker/webchecker.py
Message-ID: <20060726181546.138581E4007@bag.python.org>
Author: andrew.kuchling
Date: Wed Jul 26 20:15:45 2006
New Revision: 50851
Modified:
python/trunk/Tools/webchecker/webchecker.py
Log:
Use sys.exc_info()
Modified: python/trunk/Tools/webchecker/webchecker.py
==============================================================================
--- python/trunk/Tools/webchecker/webchecker.py (original)
+++ python/trunk/Tools/webchecker/webchecker.py Wed Jul 26 20:15:45 2006
@@ -760,7 +760,8 @@
try:
names = os.listdir(path)
except os.error, msg:
- raise IOError, msg, sys.exc_traceback
+ exc_type, exc_value, exc_tb = sys.exc_info()
+ raise IOError, msg, exc_tb
names.sort()
s = MyStringIO("file:"+url, {'content-type': 'text/html'})
s.write('\n' %
From buildbot at python.org Wed Jul 26 20:26:07 2006
From: buildbot at python.org (buildbot at python.org)
Date: Wed, 26 Jul 2006 18:26:07 +0000
Subject: [Python-checkins] buildbot warnings in x86 Ubuntu dapper (icc) trunk
Message-ID: <20060726182607.759091E4007@bag.python.org>
The Buildbot has detected a new failure of x86 Ubuntu dapper (icc) trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520Ubuntu%2520dapper%2520%2528icc%2529%2520trunk/builds/809
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: andrew.kuchling
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Wed Jul 26 20:26:18 2006
From: buildbot at python.org (buildbot at python.org)
Date: Wed, 26 Jul 2006 18:26:18 +0000
Subject: [Python-checkins] buildbot warnings in x86 XP 2.4
Message-ID: <20060726182618.AC28F1E4007@bag.python.org>
The Buildbot has detected a new failure of x86 XP 2.4.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520XP%25202.4/builds/181
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch branches/release24-maint] HEAD
Blamelist: andrew.kuchling
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Wed Jul 26 20:27:58 2006
From: buildbot at python.org (buildbot at python.org)
Date: Wed, 26 Jul 2006 18:27:58 +0000
Subject: [Python-checkins] buildbot warnings in x86 XP-2 trunk
Message-ID: <20060726182758.42EC71E4007@bag.python.org>
The Buildbot has detected a new failure of x86 XP-2 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520XP-2%2520trunk/builds/786
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: andrew.kuchling
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Wed Jul 26 20:28:03 2006
From: buildbot at python.org (buildbot at python.org)
Date: Wed, 26 Jul 2006 18:28:03 +0000
Subject: [Python-checkins] buildbot warnings in x86 W2k trunk
Message-ID: <20060726182803.8A9101E4007@bag.python.org>
The Buildbot has detected a new failure of x86 W2k trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520W2k%2520trunk/builds/1290
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: andrew.kuchling
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Wed Jul 26 20:28:49 2006
From: buildbot at python.org (buildbot at python.org)
Date: Wed, 26 Jul 2006 18:28:49 +0000
Subject: [Python-checkins] buildbot warnings in x86 gentoo trunk
Message-ID: <20060726182849.9CA471E4007@bag.python.org>
The Buildbot has detected a new failure of x86 gentoo trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520gentoo%2520trunk/builds/1383
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: andrew.kuchling
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Wed Jul 26 20:29:45 2006
From: buildbot at python.org (buildbot at python.org)
Date: Wed, 26 Jul 2006 18:29:45 +0000
Subject: [Python-checkins] buildbot warnings in amd64 gentoo trunk
Message-ID: <20060726182945.51CFE1E4007@bag.python.org>
The Buildbot has detected a new failure of amd64 gentoo trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/amd64%2520gentoo%2520trunk/builds/1299
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: andrew.kuchling
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Wed Jul 26 20:33:20 2006
From: buildbot at python.org (buildbot at python.org)
Date: Wed, 26 Jul 2006 18:33:20 +0000
Subject: [Python-checkins] buildbot warnings in g4 osx.4 2.4
Message-ID: <20060726183320.004C41E4007@bag.python.org>
The Buildbot has detected a new failure of g4 osx.4 2.4.
Full details are available at:
http://www.python.org/dev/buildbot/all/g4%2520osx.4%25202.4/builds/185
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch branches/release24-maint] HEAD
Blamelist: andrew.kuchling
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Wed Jul 26 20:41:49 2006
From: buildbot at python.org (buildbot at python.org)
Date: Wed, 26 Jul 2006 18:41:49 +0000
Subject: [Python-checkins] buildbot warnings in x86 OpenBSD trunk
Message-ID: <20060726184149.B899F1E4007@bag.python.org>
The Buildbot has detected a new failure of x86 OpenBSD trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520OpenBSD%2520trunk/builds/1063
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: andrew.kuchling
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Wed Jul 26 21:22:54 2006
From: buildbot at python.org (buildbot at python.org)
Date: Wed, 26 Jul 2006 19:22:54 +0000
Subject: [Python-checkins] buildbot warnings in ppc Debian unstable trunk
Message-ID: <20060726192254.5F5221E4008@bag.python.org>
The Buildbot has detected a new failure of ppc Debian unstable trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/ppc%2520Debian%2520unstable%2520trunk/builds/988
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: andrew.kuchling
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Wed Jul 26 21:48:27 2006
From: python-checkins at python.org (phillip.eby)
Date: Wed, 26 Jul 2006 21:48:27 +0200 (CEST)
Subject: [Python-checkins] r50852 - python/trunk/Lib/pkgutil.py
Message-ID: <20060726194827.A4EC21E4007@bag.python.org>
Author: phillip.eby
Date: Wed Jul 26 21:48:27 2006
New Revision: 50852
Modified:
python/trunk/Lib/pkgutil.py
Log:
Allow the 'onerror' argument to walk_packages() to catch any Exception, not
just ImportError. This allows documentation tools to better skip unimportable
packages.
Modified: python/trunk/Lib/pkgutil.py
==============================================================================
--- python/trunk/Lib/pkgutil.py (original)
+++ python/trunk/Lib/pkgutil.py Wed Jul 26 21:48:27 2006
@@ -83,13 +83,18 @@
attribute to find submodules.
'onerror' is a function which gets called with one argument (the
- name of the package which was being imported) if an ImportError
- occurs trying to import a package. By default the ImportError is
- caught and ignored.
+ name of the package which was being imported) if any exception
+ occurs while trying to import a package. If no onerror function is
+ supplied, ImportErrors are caught and ignored, while all other
+ exceptions are propagated, terminating the search.
Examples:
- walk_packages() : list all modules python can access
- walk_packages(ctypes.__path__, ctypes.__name__+'.') : list all submodules of ctypes
+
+ # list all modules python can access
+ walk_packages()
+
+ # list all submodules of ctypes
+ walk_packages(ctypes.__path__, ctypes.__name__+'.')
"""
def seen(p, m={}):
@@ -106,6 +111,11 @@
except ImportError:
if onerror is not None:
onerror(name)
+ except Exception:
+ if onerror is not None:
+ onerror(name)
+ else:
+ raise
else:
path = getattr(sys.modules[name], '__path__', None) or []
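A hedged usage sketch of the onerror contract after this change, assuming Python
2.5's pkgutil; the callback name and the list it appends to are illustrative:

    import pkgutil

    failed = []

    def onerror(name):
        # now invoked for *any* exception raised while importing a package,
        # not just ImportError, so a documentation tool can record the
        # failure and keep walking instead of aborting
        failed.append(name)

    for loader, name, ispkg in pkgutil.walk_packages(onerror=onerror):
        pass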
From buildbot at python.org Wed Jul 26 23:01:41 2006
From: buildbot at python.org (buildbot at python.org)
Date: Wed, 26 Jul 2006 21:01:41 +0000
Subject: [Python-checkins] buildbot warnings in g4 osx.4 trunk
Message-ID: <20060726210141.A90B51E4007@bag.python.org>
The Buildbot has detected a new failure of g4 osx.4 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/g4%2520osx.4%2520trunk/builds/1233
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: phillip.eby
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Thu Jul 27 01:23:16 2006
From: python-checkins at python.org (tim.peters)
Date: Thu, 27 Jul 2006 01:23:16 +0200 (CEST)
Subject: [Python-checkins] r50854 - in python/trunk/Lib: pkgutil.py
test/test_getargs2.py
Message-ID: <20060726232316.364B61E4008@bag.python.org>
Author: tim.peters
Date: Thu Jul 27 01:23:15 2006
New Revision: 50854
Modified:
python/trunk/Lib/pkgutil.py
python/trunk/Lib/test/test_getargs2.py
Log:
Whitespace normalization.
Modified: python/trunk/Lib/pkgutil.py
==============================================================================
--- python/trunk/Lib/pkgutil.py (original)
+++ python/trunk/Lib/pkgutil.py Thu Jul 27 01:23:15 2006
@@ -91,7 +91,7 @@
Examples:
# list all modules python can access
- walk_packages()
+ walk_packages()
# list all submodules of ctypes
walk_packages(ctypes.__path__, ctypes.__name__+'.')
Modified: python/trunk/Lib/test/test_getargs2.py
==============================================================================
--- python/trunk/Lib/test/test_getargs2.py (original)
+++ python/trunk/Lib/test/test_getargs2.py Thu Jul 27 01:23:15 2006
@@ -237,7 +237,7 @@
class Tuple_TestCase(unittest.TestCase):
def test_tuple(self):
from _testcapi import getargs_tuple
-
+
ret = getargs_tuple(1, (2, 3))
self.assertEquals(ret, (1,2,3))
From python-checkins at python.org Thu Jul 27 03:14:54 2006
From: python-checkins at python.org (tim.peters)
Date: Thu, 27 Jul 2006 03:14:54 +0200 (CEST)
Subject: [Python-checkins] r50855 - in python/trunk: Misc/NEWS
Python/mystrtoul.c
Message-ID: <20060727011454.4E1B71E4007@bag.python.org>
Author: tim.peters
Date: Thu Jul 27 03:14:53 2006
New Revision: 50855
Modified:
python/trunk/Misc/NEWS
python/trunk/Python/mystrtoul.c
Log:
Bug #1521947: possible bug in mystrtol.c with recent gcc.
In general, C doesn't define anything about what happens when
an operation on a signed integral type overflows, and PyOS_strtol()
did several formally undefined things of that nature on signed
longs. Some version of gcc apparently tries to exploit that now,
and PyOS_strtol() could fail to detect overflow then.
Tried to repair all that, although it seems at least as likely to me
that we'll get screwed by bad platform definitions for LONG_MIN
and/or LONG_MAX now. For that reason, I don't recommend backporting
this.
Note that I have no box on which this makes a lick of difference --
can't really test it, except to note that it didn't break anything
on my boxes.
Silent change: PyOS_strtol() used to return the hard-coded 0x7fffffff
in case of overflow. Now it returns LONG_MAX. They're the same only on
32-bit boxes (although C doesn't guarantee that either ...).
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Thu Jul 27 03:14:53 2006
@@ -12,6 +12,11 @@
Core and builtins
-----------------
+- Bug #1521947: When checking for overflow, ``PyOS_strtol()`` used some
+ operations on signed longs that are formally undefined by C.
+ Unfortunately, at least one compiler now cares about that, so complicated
+ the code to make that compiler happy again.
+
- Bug #1524310: Properly report errors from FindNextFile in os.listdir.
- Patch #1232023: Stop including current directory in search
@@ -37,7 +42,7 @@
mapping the faux "thread id" 0 to the current frame.
- Bug #1525447: build on MacOS X on a case-sensitive filesystem.
-
+
Library
-------
@@ -88,7 +93,7 @@
Extension Modules
-----------------
-- Bug #1471938: Fix curses module build problem on Solaris 8; patch by
+- Bug #1471938: Fix curses module build problem on Solaris 8; patch by
Paul Eggert.
- Patch #1448199: Release interpreter lock in _winreg.ConnectRegistry.
Modified: python/trunk/Python/mystrtoul.c
==============================================================================
--- python/trunk/Python/mystrtoul.c (original)
+++ python/trunk/Python/mystrtoul.c Thu Jul 27 03:14:53 2006
@@ -195,10 +195,19 @@
return (unsigned long)-1;
}
+/* Checking for overflow in PyOS_strtol is a PITA since C doesn't define
+ * anything about what happens when a signed integer operation overflows,
+ * and some compilers think they're doing you a favor by being "clever"
+ * then. Python assumes a 2's-complement representation, so that the bit
+ * pattern for the largest positive signed long is LONG_MAX, and for
+ * the smallest negative signed long is LONG_MAX + 1.
+ */
+
long
PyOS_strtol(char *str, char **ptr, int base)
{
long result;
+ unsigned long uresult;
char sign;
while (*str && isspace(Py_CHARMASK(*str)))
@@ -208,17 +217,20 @@
if (sign == '+' || sign == '-')
str++;
- result = (long) PyOS_strtoul(str, ptr, base);
+ uresult = PyOS_strtoul(str, ptr, base);
- /* Signal overflow if the result appears negative,
- except for the largest negative integer */
- if (result < 0 && !(sign == '-' && result == -result)) {
+ if (uresult <= (unsigned long)LONG_MAX) {
+ result = (long)uresult;
+ if (sign == '-')
+ result = -result;
+ }
+ else if (sign == '-' && uresult == (unsigned long)LONG_MAX + 1) {
+ assert(LONG_MIN == -LONG_MAX-1);
+ result = LONG_MIN;
+ }
+ else {
errno = ERANGE;
- result = 0x7fffffff;
+ result = LONG_MAX;
}
-
- if (sign == '-')
- result = -result;
-
return result;
}
From nnorwitz at gmail.com Thu Jul 27 05:16:22 2006
From: nnorwitz at gmail.com (Neal Norwitz)
Date: Wed, 26 Jul 2006 20:16:22 -0700
Subject: [Python-checkins] r50855 - in python/trunk: Misc/NEWS
Python/mystrtoul.c
In-Reply-To: <20060727011454.4E1B71E4007@bag.python.org>
References: <20060727011454.4E1B71E4007@bag.python.org>
Message-ID:
This fixed the problem on amd64 linux with gcc 4.1.1.
On 7/26/06, tim.peters wrote:
> Author: tim.peters
> Date: Thu Jul 27 03:14:53 2006
> New Revision: 50855
>
> Modified:
> python/trunk/Misc/NEWS
> python/trunk/Python/mystrtoul.c
> Log:
> Bug #1521947: possible bug in mystrtol.c with recent gcc.
>
> In general, C doesn't define anything about what happens when
> an operation on a signed integral type overflows, and PyOS_strtol()
> did several formally undefined things of that nature on signed
> longs. Some version of gcc apparently tries to exploit that now,
> and PyOS_strtol() could fail to detect overflow then.
>
> Tried to repair all that, although it seems at least as likely to me
> that we'll get screwed by bad platform definitions for LONG_MIN
> and/or LONG_MAX now. For that reason, I don't recommend backporting
> this.
>
> Note that I have no box on which this makes a lick of difference --
> can't really test it, except to note that it didn't break anything
> on my boxes.
>
> Silent change: PyOS_strtol() used to return the hard-coded 0x7fffffff
> in case of overflow. Now it returns LONG_MAX. They're the same only on
> 32-bit boxes (although C doesn't guarantee that either ...).
>
>
> Modified: python/trunk/Misc/NEWS
> ==============================================================================
> --- python/trunk/Misc/NEWS (original)
> +++ python/trunk/Misc/NEWS Thu Jul 27 03:14:53 2006
> @@ -12,6 +12,11 @@
> Core and builtins
> -----------------
>
> +- Bug #1521947: When checking for overflow, ``PyOS_strtol()`` used some
> + operations on signed longs that are formally undefined by C.
> + Unfortunately, at least one compiler now cares about that, so complicated
> + the code to make that compiler happy again.
> +
> - Bug #1524310: Properly report errors from FindNextFile in os.listdir.
>
> - Patch #1232023: Stop including current directory in search
> @@ -37,7 +42,7 @@
> mapping the faux "thread id" 0 to the current frame.
>
> - Bug #1525447: build on MacOS X on a case-sensitive filesystem.
> -
> +
>
> Library
> -------
> @@ -88,7 +93,7 @@
> Extension Modules
> -----------------
>
> -- Bug #1471938: Fix curses module build problem on Solaris 8; patch by
> +- Bug #1471938: Fix curses module build problem on Solaris 8; patch by
> Paul Eggert.
>
> - Patch #1448199: Release interpreter lock in _winreg.ConnectRegistry.
>
> Modified: python/trunk/Python/mystrtoul.c
> ==============================================================================
> --- python/trunk/Python/mystrtoul.c (original)
> +++ python/trunk/Python/mystrtoul.c Thu Jul 27 03:14:53 2006
> @@ -195,10 +195,19 @@
> return (unsigned long)-1;
> }
>
> +/* Checking for overflow in PyOS_strtol is a PITA since C doesn't define
> + * anything about what happens when a signed integer operation overflows,
> + * and some compilers think they're doing you a favor by being "clever"
> + * then. Python assumes a 2's-complement representation, so that the bit
> + * pattern for the largest postive signed long is LONG_MAX, and for
> + * the smallest negative signed long is LONG_MAX + 1.
> + */
> +
> long
> PyOS_strtol(char *str, char **ptr, int base)
> {
> long result;
> + unsigned long uresult;
> char sign;
>
> while (*str && isspace(Py_CHARMASK(*str)))
> @@ -208,17 +217,20 @@
> if (sign == '+' || sign == '-')
> str++;
>
> - result = (long) PyOS_strtoul(str, ptr, base);
> + uresult = PyOS_strtoul(str, ptr, base);
>
> - /* Signal overflow if the result appears negative,
> - except for the largest negative integer */
> - if (result < 0 && !(sign == '-' && result == -result)) {
> + if (uresult <= (unsigned long)LONG_MAX) {
> + result = (long)uresult;
> + if (sign == '-')
> + result = -result;
> + }
> + else if (sign == '-' && uresult == (unsigned long)LONG_MAX + 1) {
> + assert(LONG_MIN == -LONG_MAX-1);
> + result = LONG_MIN;
> + }
> + else {
> errno = ERANGE;
> - result = 0x7fffffff;
> + result = LONG_MAX;
> }
> -
> - if (sign == '-')
> - result = -result;
> -
> return result;
> }
> _______________________________________________
> Python-checkins mailing list
> Python-checkins at python.org
> http://mail.python.org/mailman/listinfo/python-checkins
>
From python-checkins at python.org Thu Jul 27 05:51:59 2006
From: python-checkins at python.org (neal.norwitz)
Date: Thu, 27 Jul 2006 05:51:59 +0200 (CEST)
Subject: [Python-checkins] r50856 - python/trunk/Tools/buildbot/kill_python.c
Message-ID: <20060727035159.4A0BF1E4004@bag.python.org>
Author: neal.norwitz
Date: Thu Jul 27 05:51:58 2006
New Revision: 50856
Modified:
python/trunk/Tools/buildbot/kill_python.c
Log:
Don't kill a normal instance of python running on windows when checking
to kill a cygwin instance. build\\python.exe was matching a normal windows
instance. Prefix that with a \\ to ensure build is a directory and not
PCbuild. As discussed on python-dev.
Modified: python/trunk/Tools/buildbot/kill_python.c
==============================================================================
--- python/trunk/Tools/buildbot/kill_python.c (original)
+++ python/trunk/Tools/buildbot/kill_python.c Thu Jul 27 05:51:58 2006
@@ -42,8 +42,19 @@
_strlwr(path);
/* printf("%s\n", path); */
+
+ /* Check if we are running a buildbot version of Python.
+
+ On Windows, this will always be a debug build from the
+ PCbuild directory. build\\PCbuild\\python_d.exe
+
+ On Cygwin, the pathname is similar to other Unixes.
+ Use \\build\\python.exe to ensure we don't match
+ PCbuild\\python.exe which could be a normal instance
+ of Python running on vanilla Windows.
+ */
if ((strstr(path, "build\\pcbuild\\python_d.exe") != NULL) ||
- (strstr(path, "build\\python.exe") != NULL)) {
+ (strstr(path, "\\build\\python.exe") != NULL)) {
printf("Terminating %s (pid %d)\n", path, pids[i]);
if (!TerminateProcess(hProcess, 1)) {
printf("Termination failed: %d\n", GetLastError());
From python-checkins at python.org Thu Jul 27 05:55:39 2006
From: python-checkins at python.org (neal.norwitz)
Date: Thu, 27 Jul 2006 05:55:39 +0200 (CEST)
Subject: [Python-checkins] r50857 - python/trunk/Objects/funcobject.c
Message-ID: <20060727035539.70BCC1E4004@bag.python.org>
Author: neal.norwitz
Date: Thu Jul 27 05:55:39 2006
New Revision: 50857
Modified:
python/trunk/Objects/funcobject.c
Log:
Closure can't be NULL at this point since we know it's a tuple.
Reported by Klocwork # 74.
Modified: python/trunk/Objects/funcobject.c
==============================================================================
--- python/trunk/Objects/funcobject.c (original)
+++ python/trunk/Objects/funcobject.c Thu Jul 27 05:55:39 2006
@@ -141,7 +141,7 @@
if (closure == Py_None)
closure = NULL;
else if (PyTuple_Check(closure)) {
- Py_XINCREF(closure);
+ Py_INCREF(closure);
}
else {
PyErr_Format(PyExc_SystemError,
From python-checkins at python.org Thu Jul 27 06:04:51 2006
From: python-checkins at python.org (neal.norwitz)
Date: Thu, 27 Jul 2006 06:04:51 +0200 (CEST)
Subject: [Python-checkins] r50858 - python/trunk/Modules/unicodedata.c
Message-ID: <20060727040451.3A44E1E4004@bag.python.org>
Author: neal.norwitz
Date: Thu Jul 27 06:04:50 2006
New Revision: 50858
Modified:
python/trunk/Modules/unicodedata.c
Log:
No functional change. Add comment and assert to describe why there cannot be overflow which was reported by Klocwork. Discussed on python-dev
Modified: python/trunk/Modules/unicodedata.c
==============================================================================
--- python/trunk/Modules/unicodedata.c (original)
+++ python/trunk/Modules/unicodedata.c Thu Jul 27 06:04:50 2006
@@ -395,6 +395,7 @@
PyUnicodeObject *v;
char decomp[256];
int code, index, count, i;
+ unsigned int prefix_index;
if (!PyArg_ParseTuple(args, "O!:decomposition",
&PyUnicode_Type, &v))
@@ -428,9 +429,15 @@
/* XXX: could allocate the PyString up front instead
(strlen(prefix) + 5 * count + 1 bytes) */
+ /* Based on how index is calculated above and decomp_data is generated
+ from Tools/unicode/makeunicodedata.py, it should not be possible
+ to overflow decomp_prefix. */
+ prefix_index = decomp_data[index] & 255;
+ assert(prefix_index < (sizeof(decomp_prefix)/sizeof(*decomp_prefix)));
+
/* copy prefix */
- i = strlen(decomp_prefix[decomp_data[index] & 255]);
- memcpy(decomp, decomp_prefix[decomp_data[index] & 255], i);
+ i = strlen(decomp_prefix[prefix_index]);
+ memcpy(decomp, decomp_prefix[prefix_index], i);
while (count-- > 0) {
if (i)
From buildbot at python.org Thu Jul 27 06:10:09 2006
From: buildbot at python.org (buildbot at python.org)
Date: Thu, 27 Jul 2006 04:10:09 +0000
Subject: [Python-checkins] buildbot warnings in x86 XP trunk
Message-ID: <20060727041009.DA8901E4004@bag.python.org>
The Buildbot has detected a new failure of x86 XP trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520XP%2520trunk/builds/1248
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: neal.norwitz
Build Had Warnings: warnings failed slave lost
sincerely,
-The Buildbot
From buildbot at python.org Thu Jul 27 06:58:26 2006
From: buildbot at python.org (buildbot at python.org)
Date: Thu, 27 Jul 2006 04:58:26 +0000
Subject: [Python-checkins] buildbot warnings in x86 Ubuntu dapper (icc) trunk
Message-ID: <20060727045826.3EE2E1E4004@bag.python.org>
The Buildbot has detected a new failure of x86 Ubuntu dapper (icc) trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520Ubuntu%2520dapper%2520%2528icc%2529%2520trunk/builds/814
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: neal.norwitz
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Thu Jul 27 08:38:17 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Thu, 27 Jul 2006 08:38:17 +0200 (CEST)
Subject: [Python-checkins] r50859 - in python/trunk:
Lib/distutils/__init__.py Misc/NEWS
Message-ID: <20060727063817.8A4C31E4004@bag.python.org>
Author: martin.v.loewis
Date: Thu Jul 27 08:38:16 2006
New Revision: 50859
Modified:
python/trunk/Lib/distutils/__init__.py
python/trunk/Misc/NEWS
Log:
Bump distutils version to 2.5, as several new features
have been introduced since 2.4.
Modified: python/trunk/Lib/distutils/__init__.py
==============================================================================
--- python/trunk/Lib/distutils/__init__.py (original)
+++ python/trunk/Lib/distutils/__init__.py Thu Jul 27 08:38:16 2006
@@ -12,4 +12,4 @@
__revision__ = "$Id$"
-__version__ = "2.4.0"
+__version__ = "2.5.0"
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Thu Jul 27 08:38:16 2006
@@ -47,6 +47,8 @@
Library
-------
+- The distutils version has been changed to 2.5.0.
+
- Bug #978833: Really close underlying socket in _socketobject.close.
- Bug #1459963: urllib and urllib2 now normalize HTTP header names correctly
From neal at metaslash.com Thu Jul 27 11:08:38 2006
From: neal at metaslash.com (Neal Norwitz)
Date: Thu, 27 Jul 2006 05:08:38 -0400
Subject: [Python-checkins] Python Regression Test Failures refleak (1)
Message-ID: <20060727090838.GA17298@python.psfb.org>
test_cmd_line leaked [17, 0, -17] references
test_sys leaked [0, 0, 132] references
From python-checkins at python.org Thu Jul 27 14:18:20 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Thu, 27 Jul 2006 14:18:20 +0200 (CEST)
Subject: [Python-checkins] r50860 - python/trunk/Lib/pdb.py
Message-ID: <20060727121820.AA3FC1E4004@bag.python.org>
Author: andrew.kuchling
Date: Thu Jul 27 14:18:20 2006
New Revision: 50860
Modified:
python/trunk/Lib/pdb.py
Log:
Reformat docstring; fix typo
Modified: python/trunk/Lib/pdb.py
==============================================================================
--- python/trunk/Lib/pdb.py (original)
+++ python/trunk/Lib/pdb.py Thu Jul 27 14:18:20 2006
@@ -230,7 +230,8 @@
"""Interpret the argument as though it had been typed in response
to the prompt.
- Checks wether this line is typed in the normal prompt or in a breakpoint command list definition
+ Checks whether this line is typed at the normal prompt or in
+ a breakpoint command list definition.
"""
if not self.commands_defining:
return cmd.Cmd.onecmd(self, line)
From python-checkins at python.org Thu Jul 27 17:05:37 2006
From: python-checkins at python.org (georg.brandl)
Date: Thu, 27 Jul 2006 17:05:37 +0200 (CEST)
Subject: [Python-checkins] r50861 - python/trunk/Lib/test/test_defaultdict.py
python/trunk/Lib/test/test_iterlen.py
python/trunk/Lib/test/test_uuid.py
Message-ID: <20060727150537.4DF091E40C5@bag.python.org>
Author: georg.brandl
Date: Thu Jul 27 17:05:36 2006
New Revision: 50861
Modified:
python/trunk/Lib/test/test_defaultdict.py
python/trunk/Lib/test/test_iterlen.py
python/trunk/Lib/test/test_uuid.py
Log:
Add test_main() methods. These three tests were never run
by regrtest.py.
We really need a simpler testing framework.
Modified: python/trunk/Lib/test/test_defaultdict.py
==============================================================================
--- python/trunk/Lib/test/test_defaultdict.py (original)
+++ python/trunk/Lib/test/test_defaultdict.py Thu Jul 27 17:05:36 2006
@@ -4,6 +4,7 @@
import copy
import tempfile
import unittest
+from test import test_support
from collections import defaultdict
@@ -131,5 +132,8 @@
self.assertEqual(d2, d1)
+def test_main():
+ test_support.run_unittest(TestDefaultDict)
+
if __name__ == "__main__":
- unittest.main()
+ test_main()
Modified: python/trunk/Lib/test/test_iterlen.py
==============================================================================
--- python/trunk/Lib/test/test_iterlen.py (original)
+++ python/trunk/Lib/test/test_iterlen.py Thu Jul 27 17:05:36 2006
@@ -235,9 +235,7 @@
self.assertEqual(len(it), 0)
-
-if __name__ == "__main__":
-
+def test_main():
unittests = [
TestRepeat,
TestXrange,
@@ -255,3 +253,7 @@
TestSeqIterReversed,
]
test_support.run_unittest(*unittests)
+
+if __name__ == "__main__":
+ test_main()
+
Modified: python/trunk/Lib/test/test_uuid.py
==============================================================================
--- python/trunk/Lib/test/test_uuid.py (original)
+++ python/trunk/Lib/test/test_uuid.py Thu Jul 27 17:05:36 2006
@@ -1,4 +1,5 @@
-from unittest import TestCase, main
+from unittest import TestCase
+from test import test_support
import uuid
def importable(name):
@@ -392,5 +393,9 @@
equal(u, uuid.UUID(v))
equal(str(u), v)
+
+def test_main():
+ test_support.run_unittest(TestUUID)
+
if __name__ == '__main__':
- main()
+ test_main()
From python-checkins at python.org Thu Jul 27 17:09:21 2006
From: python-checkins at python.org (tim.peters)
Date: Thu, 27 Jul 2006 17:09:21 +0200 (CEST)
Subject: [Python-checkins] r50862 - python/trunk/Misc/NEWS
Message-ID: <20060727150921.0C3351E4004@bag.python.org>
Author: tim.peters
Date: Thu Jul 27 17:09:20 2006
New Revision: 50862
Modified:
python/trunk/Misc/NEWS
Log:
News for patch #1529686.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Thu Jul 27 17:09:20 2006
@@ -121,6 +121,9 @@
how long the test file should take to play. Now accepts taking 2.93 secs
(exact time) +/- 10% instead of the hard-coded 3.1 sec.
+- Patch #1529686: The standard ``test_defaultdict`` test didn't actually
+ run any tests when run via ``regrtest.py``. Now it does.
+
What's New in Python 2.5 beta 2?
================================
From python-checkins at python.org Thu Jul 27 17:11:01 2006
From: python-checkins at python.org (tim.peters)
Date: Thu, 27 Jul 2006 17:11:01 +0200 (CEST)
Subject: [Python-checkins] r50863 - in python/trunk/Lib: pdb.py
test/test_iterlen.py
Message-ID: <20060727151101.83BDF1E401F@bag.python.org>
Author: tim.peters
Date: Thu Jul 27 17:11:00 2006
New Revision: 50863
Modified:
python/trunk/Lib/pdb.py
python/trunk/Lib/test/test_iterlen.py
Log:
Whitespace normalization.
Modified: python/trunk/Lib/pdb.py
==============================================================================
--- python/trunk/Lib/pdb.py (original)
+++ python/trunk/Lib/pdb.py Thu Jul 27 17:11:00 2006
@@ -231,7 +231,7 @@
to the prompt.
Checks whether this line is typed at the normal prompt or in
- a breakpoint command list definition.
+ a breakpoint command list definition.
"""
if not self.commands_defining:
return cmd.Cmd.onecmd(self, line)
Modified: python/trunk/Lib/test/test_iterlen.py
==============================================================================
--- python/trunk/Lib/test/test_iterlen.py (original)
+++ python/trunk/Lib/test/test_iterlen.py Thu Jul 27 17:11:00 2006
@@ -256,4 +256,3 @@
if __name__ == "__main__":
test_main()
-
From buildbot at python.org Thu Jul 27 17:28:59 2006
From: buildbot at python.org (buildbot at python.org)
Date: Thu, 27 Jul 2006 15:28:59 +0000
Subject: [Python-checkins] buildbot warnings in x86 XP-2 trunk
Message-ID: <20060727152859.7646E1E4004@bag.python.org>
The Buildbot has detected a new failure of x86 XP-2 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520XP-2%2520trunk/builds/792
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl,tim.peters
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Thu Jul 27 17:33:06 2006
From: buildbot at python.org (buildbot at python.org)
Date: Thu, 27 Jul 2006 15:33:06 +0000
Subject: [Python-checkins] buildbot warnings in x86 OpenBSD trunk
Message-ID: <20060727153306.91DD21E4004@bag.python.org>
The Buildbot has detected a new failure of x86 OpenBSD trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520OpenBSD%2520trunk/builds/1072
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl,tim.peters
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Thu Jul 27 17:38:33 2006
From: python-checkins at python.org (georg.brandl)
Date: Thu, 27 Jul 2006 17:38:33 +0200 (CEST)
Subject: [Python-checkins] r50864 - python/trunk/Misc/NEWS
Message-ID: <20060727153833.954AB1E4004@bag.python.org>
Author: georg.brandl
Date: Thu Jul 27 17:38:33 2006
New Revision: 50864
Modified:
python/trunk/Misc/NEWS
Log:
Amend news entry.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Thu Jul 27 17:38:33 2006
@@ -121,8 +121,9 @@
how long the test file should take to play. Now accepts taking 2.93 secs
(exact time) +/- 10% instead of the hard-coded 3.1 sec.
-- Patch #1529686: The standard ``test_defaultdict`` test didn't actually
- run any tests when run via ``regrtest.py``. Now it does.
+- Patch #1529686: The standard tests ``test_defaultdict``, ``test_iterlen``
+ and ``test_uuid`` didn't actually run any tests when run via ``regrtest.py``.
+ Now they do.
What's New in Python 2.5 beta 2?
From buildbot at python.org Thu Jul 27 17:40:43 2006
From: buildbot at python.org (buildbot at python.org)
Date: Thu, 27 Jul 2006 15:40:43 +0000
Subject: [Python-checkins] buildbot warnings in ppc Debian unstable trunk
Message-ID: <20060727154044.025FD1E4018@bag.python.org>
The Buildbot has detected a new failure of ppc Debian unstable trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/ppc%2520Debian%2520unstable%2520trunk/builds/997
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl,tim.peters
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Thu Jul 27 17:43:50 2006
From: buildbot at python.org (buildbot at python.org)
Date: Thu, 27 Jul 2006 15:43:50 +0000
Subject: [Python-checkins] buildbot warnings in PPC64 Debian trunk
Message-ID: <20060727154350.5A5041E4004@bag.python.org>
The Buildbot has detected a new failure of PPC64 Debian trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/PPC64%2520Debian%2520trunk/builds/300
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl,tim.peters
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Thu Jul 27 17:46:29 2006
From: buildbot at python.org (buildbot at python.org)
Date: Thu, 27 Jul 2006 15:46:29 +0000
Subject: [Python-checkins] buildbot warnings in sparc solaris10 gcc trunk
Message-ID: <20060727154629.A7DDB1E4008@bag.python.org>
The Buildbot has detected a new failure of sparc solaris10 gcc trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/sparc%2520solaris10%2520gcc%2520trunk/builds/1251
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl,tim.peters
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Thu Jul 27 18:00:40 2006
From: buildbot at python.org (buildbot at python.org)
Date: Thu, 27 Jul 2006 16:00:40 +0000
Subject: [Python-checkins] buildbot warnings in alpha Tru64 5.1 trunk
Message-ID: <20060727160040.3E79A1E4004@bag.python.org>
The Buildbot has detected a new failure of alpha Tru64 5.1 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/alpha%2520Tru64%25205.1%2520trunk/builds/975
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl,tim.peters
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Thu Jul 27 18:08:15 2006
From: python-checkins at python.org (georg.brandl)
Date: Thu, 27 Jul 2006 18:08:15 +0200 (CEST)
Subject: [Python-checkins] r50865 - in python/trunk/Lib: test/test_uuid.py
uuid.py
Message-ID: <20060727160815.CE2091E4004@bag.python.org>
Author: georg.brandl
Date: Thu Jul 27 18:08:15 2006
New Revision: 50865
Modified:
python/trunk/Lib/test/test_uuid.py
python/trunk/Lib/uuid.py
Log:
Make uuid test suite pass on this box by requesting output with LC_ALL=C.
Modified: python/trunk/Lib/test/test_uuid.py
==============================================================================
--- python/trunk/Lib/test/test_uuid.py (original)
+++ python/trunk/Lib/test/test_uuid.py Thu Jul 27 18:08:15 2006
@@ -288,12 +288,16 @@
def test_ifconfig_getnode(self):
import os
if os.name == 'posix':
- self.check_node(uuid._ifconfig_getnode(), 'ifconfig')
+ node = uuid._ifconfig_getnode()
+ if node is not None:
+ self.check_node(node, 'ifconfig')
def test_ipconfig_getnode(self):
import os
if os.name == 'nt':
- self.check_node(uuid._ipconfig_getnode(), 'ipconfig')
+ node = uuid._ipconfig_getnode()
+ if node is not None:
+ self.check_node(node, 'ipconfig')
def test_netbios_getnode(self):
if importable('win32wnet') and importable('netbios'):
Modified: python/trunk/Lib/uuid.py
==============================================================================
--- python/trunk/Lib/uuid.py (original)
+++ python/trunk/Lib/uuid.py Thu Jul 27 18:08:15 2006
@@ -276,7 +276,10 @@
import os
for dir in ['', '/sbin/', '/usr/sbin']:
try:
- pipe = os.popen(os.path.join(dir, 'ifconfig'))
+ # LC_ALL to get English output, 2>/dev/null to
+ # prevent output on stderr
+ cmd = 'LC_ALL=C %s 2>/dev/null' % os.path.join(dir, 'ifconfig')
+ pipe = os.popen(cmd)
except IOError:
continue
for line in pipe:
From python-checkins at python.org Thu Jul 27 20:37:33 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Thu, 27 Jul 2006 20:37:33 +0200 (CEST)
Subject: [Python-checkins] r50866 - python/trunk/Doc/lib/libstringio.tex
Message-ID: <20060727183733.EC69E1E4004@bag.python.org>
Author: andrew.kuchling
Date: Thu Jul 27 20:37:33 2006
New Revision: 50866
Modified:
python/trunk/Doc/lib/libstringio.tex
Log:
Add example
Modified: python/trunk/Doc/lib/libstringio.tex
==============================================================================
--- python/trunk/Doc/lib/libstringio.tex (original)
+++ python/trunk/Doc/lib/libstringio.tex Thu Jul 27 20:37:33 2006
@@ -37,6 +37,24 @@
Free the memory buffer.
\end{methoddesc}
+Example usage:
+
+\begin{verbatim}
+import StringIO
+
+output = StringIO.StringIO()
+output.write('First line.\n')
+print >>output, 'Second line.'
+
+# Retrieve file contents -- this will be
+# 'First line.\nSecond line.\n'
+contents = output.getvalue()
+
+# Close object and discard memory buffer --
+# .getvalue() will now raise an exception.
+output.close()
+\end{verbatim}
+
\section{\module{cStringIO} ---
Faster version of \module{StringIO}}
@@ -82,3 +100,22 @@
There is a C API to the module as well; refer to the module source for
more information.
+
+Example usage:
+
+\begin{verbatim}
+import cStringIO
+
+output = cStringIO.StringIO()
+output.write('First line.\n')
+print >>output, 'Second line.'
+
+# Retrieve file contents -- this will be
+# 'First line.\nSecond line.\n'
+contents = output.getvalue()
+
+# Close object and discard memory buffer --
+# .getvalue() will now raise an exception.
+output.close()
+\end{verbatim}
+
From python-checkins at python.org Thu Jul 27 20:39:55 2006
From: python-checkins at python.org (thomas.heller)
Date: Thu, 27 Jul 2006 20:39:55 +0200 (CEST)
Subject: [Python-checkins] r50867 - python/trunk/Lib/ctypes/__init__.py
Message-ID: <20060727183955.E14041E4004@bag.python.org>
Author: thomas.heller
Date: Thu Jul 27 20:39:55 2006
New Revision: 50867
Modified:
python/trunk/Lib/ctypes/__init__.py
Log:
Remove code that is no longer used (ctypes.com).
Fix the DllGetClassObject and DllCanUnloadNow so that they forward the
call to the comtypes.server.inprocserver module.
The latter was never documented, never used by published code, and
didn't work anyway, so I think it does not deserve a NEWS entry (but I
might be wrong).
Modified: python/trunk/Lib/ctypes/__init__.py
==============================================================================
--- python/trunk/Lib/ctypes/__init__.py (original)
+++ python/trunk/Lib/ctypes/__init__.py Thu Jul 27 20:39:55 2006
@@ -464,52 +464,21 @@
return _wstring_at(ptr, size)
-if _os.name == "nt": # COM stuff
+if _os.name in ("nt", "ce"): # COM stuff
def DllGetClassObject(rclsid, riid, ppv):
- # First ask ctypes.com.server than comtypes.server for the
- # class object.
-
- # trick py2exe by doing dynamic imports
- result = -2147221231 # CLASS_E_CLASSNOTAVAILABLE
try:
- ctcom = __import__("ctypes.com.server", globals(), locals(), ['*'])
+ ccom = __import__("comtypes.server.inprocserver", globals(), locals(), ['*'])
except ImportError:
- pass
+ return -2147221231 # CLASS_E_CLASSNOTAVAILABLE
else:
- result = ctcom.DllGetClassObject(rclsid, riid, ppv)
-
- if result == -2147221231: # CLASS_E_CLASSNOTAVAILABLE
- try:
- ccom = __import__("comtypes.server", globals(), locals(), ['*'])
- except ImportError:
- pass
- else:
- result = ccom.DllGetClassObject(rclsid, riid, ppv)
-
- return result
+ return ccom.DllGetClassObject(rclsid, riid, ppv)
def DllCanUnloadNow():
- # First ask ctypes.com.server than comtypes.server if we can unload or not.
- # trick py2exe by doing dynamic imports
- result = 0 # S_OK
- try:
- ctcom = __import__("ctypes.com.server", globals(), locals(), ['*'])
- except ImportError:
- pass
- else:
- result = ctcom.DllCanUnloadNow()
- if result != 0: # != S_OK
- return result
-
try:
- ccom = __import__("comtypes.server", globals(), locals(), ['*'])
+ ccom = __import__("comtypes.server.inprocserver", globals(), locals(), ['*'])
except ImportError:
- return result
- try:
- return ccom.DllCanUnloadNow()
- except AttributeError:
- pass
- return result
+ return 0 # S_OK
+ return ccom.DllCanUnloadNow()
from ctypes._endian import BigEndianStructure, LittleEndianStructure
From python-checkins at python.org Thu Jul 27 20:41:21 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Thu, 27 Jul 2006 20:41:21 +0200 (CEST)
Subject: [Python-checkins] r50868 - python/trunk/Doc/lib/libunicodedata.tex
Message-ID: <20060727184121.6D36E1E4004@bag.python.org>
Author: andrew.kuchling
Date: Thu Jul 27 20:41:21 2006
New Revision: 50868
Modified:
python/trunk/Doc/lib/libunicodedata.tex
Log:
Typo fix ('publically' is rare, poss. non-standard)
Modified: python/trunk/Doc/lib/libunicodedata.tex
==============================================================================
--- python/trunk/Doc/lib/libunicodedata.tex (original)
+++ python/trunk/Doc/lib/libunicodedata.tex Thu Jul 27 20:41:21 2006
@@ -14,7 +14,7 @@
This module provides access to the Unicode Character Database which
defines character properties for all Unicode characters. The data in
this database is based on the \file{UnicodeData.txt} file version
-4.1.0 which is publically available from \url{ftp://ftp.unicode.org/}.
+4.1.0 which is publicly available from \url{ftp://ftp.unicode.org/}.
The module uses the same names and symbols as defined by the
UnicodeData File Format 4.1.0 (see
From python-checkins at python.org Thu Jul 27 20:42:41 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Thu, 27 Jul 2006 20:42:41 +0200 (CEST)
Subject: [Python-checkins] r50869 - python/trunk/Doc/lib/libunicodedata.tex
Message-ID: <20060727184241.A07241E4004@bag.python.org>
Author: andrew.kuchling
Date: Thu Jul 27 20:42:41 2006
New Revision: 50869
Modified:
python/trunk/Doc/lib/libunicodedata.tex
Log:
Add missing word
Modified: python/trunk/Doc/lib/libunicodedata.tex
==============================================================================
--- python/trunk/Doc/lib/libunicodedata.tex (original)
+++ python/trunk/Doc/lib/libunicodedata.tex Thu Jul 27 20:42:41 2006
@@ -108,7 +108,7 @@
Normal form C (NFC) first applies a canonical decomposition, then
composes pre-combined characters again.
-In addition to these two forms, there two additional normal forms
+In addition to these two forms, there are two additional normal forms
based on compatibility equivalence. In Unicode, certain characters are
supported which normally would be unified with other characters. For
example, U+2160 (ROMAN NUMERAL ONE) is really the same thing as U+0049
@@ -139,3 +139,4 @@
\versionadded{2.5}
\end{datadesc}
+
From python-checkins at python.org Thu Jul 27 20:44:10 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Thu, 27 Jul 2006 20:44:10 +0200 (CEST)
Subject: [Python-checkins] r50870 - python/trunk/Doc/whatsnew/whatsnew20.tex
Message-ID: <20060727184410.E72411E4004@bag.python.org>
Author: andrew.kuchling
Date: Thu Jul 27 20:44:10 2006
New Revision: 50870
Modified:
python/trunk/Doc/whatsnew/whatsnew20.tex
Log:
Repair typos
Modified: python/trunk/Doc/whatsnew/whatsnew20.tex
==============================================================================
--- python/trunk/Doc/whatsnew/whatsnew20.tex (original)
+++ python/trunk/Doc/whatsnew/whatsnew20.tex Thu Jul 27 20:44:10 2006
@@ -216,7 +216,7 @@
character properties. For example, \code{unicodedata.category(u'A')}
returns the 2-character string 'Lu', the 'L' denoting it's a letter,
and 'u' meaning that it's uppercase.
-\code{u.bidirectional(u'\e x0660')} returns 'AN', meaning that U+0660 is
+\code{unicodedata.bidirectional(u'\e u0660')} returns 'AN', meaning that U+0660 is
an Arabic number.
The \module{codecs} module contains functions to look up existing encodings
From python-checkins at python.org Thu Jul 27 20:48:47 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Thu, 27 Jul 2006 20:48:47 +0200 (CEST)
Subject: [Python-checkins] r50871 -
python/branches/release24-maint/Doc/lib/libunicodedata.tex
Message-ID: <20060727184847.8F0F81E4004@bag.python.org>
Author: andrew.kuchling
Date: Thu Jul 27 20:48:47 2006
New Revision: 50871
Modified:
python/branches/release24-maint/Doc/lib/libunicodedata.tex
Log:
Update URL
Modified: python/branches/release24-maint/Doc/lib/libunicodedata.tex
==============================================================================
--- python/branches/release24-maint/Doc/lib/libunicodedata.tex (original)
+++ python/branches/release24-maint/Doc/lib/libunicodedata.tex Thu Jul 27 20:48:47 2006
@@ -18,7 +18,7 @@
The module uses the same names and symbols as defined by the
UnicodeData File Format 3.2.0 (see
-\url{http://www.unicode.org/Public/UNIDATA/UnicodeData.html}). It
+\url{http://www.unicode.org/Public/3.2-Update/UnicodeData-3.2.0.html}). It
defines the following functions:
\begin{funcdesc}{lookup}{name}
From python-checkins at python.org Thu Jul 27 20:53:33 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Thu, 27 Jul 2006 20:53:33 +0200 (CEST)
Subject: [Python-checkins] r50872 - python/trunk/Doc/lib/libunicodedata.tex
Message-ID: <20060727185333.4E3821E4004@bag.python.org>
Author: andrew.kuchling
Date: Thu Jul 27 20:53:33 2006
New Revision: 50872
Modified:
python/trunk/Doc/lib/libunicodedata.tex
Log:
Update URL; add example
Modified: python/trunk/Doc/lib/libunicodedata.tex
==============================================================================
--- python/trunk/Doc/lib/libunicodedata.tex (original)
+++ python/trunk/Doc/lib/libunicodedata.tex Thu Jul 27 20:53:33 2006
@@ -18,7 +18,7 @@
The module uses the same names and symbols as defined by the
UnicodeData File Format 4.1.0 (see
-\url{http://www.unicode.org/Public/4.1-Update/UnicodeData-4.1.0.html}). It
+\url{http://www.unicode.org/Public/4.1.0/ucd/UCD.html}). It
defines the following functions:
\begin{funcdesc}{lookup}{name}
@@ -140,3 +140,21 @@
\versionadded{2.5}
\end{datadesc}
+Examples:
+
+\begin{verbatim}
+>>> unicodedata.lookup('LEFT CURLY BRACKET')
+u'{'
+>>> unicodedata.name(u'/')
+'SOLIDUS'
+>>> unicodedata.decimal(u'9')
+9
+>>> unicodedata.decimal(u'a')
+Traceback (most recent call last):
+ File "", line 1, in ?
+ValueError: not a decimal
+>>> unicodedata.category(u'A') # 'L'etter, 'u'ppercase
+'Lu'
+>>> unicodedata.bidirectional(u'\u0660') # 'A'rabic, 'N'umber
+'AN'
+\end{verbatim}
From python-checkins at python.org Thu Jul 27 21:07:30 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Thu, 27 Jul 2006 21:07:30 +0200 (CEST)
Subject: [Python-checkins] r50873 - python/trunk/Doc/lib/librandom.tex
Message-ID: <20060727190730.70AB51E4004@bag.python.org>
Author: andrew.kuchling
Date: Thu Jul 27 21:07:29 2006
New Revision: 50873
Modified:
python/trunk/Doc/lib/librandom.tex
Log:
Add punctuation mark; add some examples
Modified: python/trunk/Doc/lib/librandom.tex
==============================================================================
--- python/trunk/Doc/lib/librandom.tex (original)
+++ python/trunk/Doc/lib/librandom.tex Thu Jul 27 21:07:29 2006
@@ -236,7 +236,7 @@
\var{beta} is the shape parameter.
\end{funcdesc}
-Alternative Generators
+Alternative Generators:
\begin{classdesc}{WichmannHill}{\optional{seed}}
Class that implements the Wichmann-Hill algorithm as the core generator.
@@ -267,6 +267,30 @@
\versionadded{2.4}
\end{classdesc}
+Examples of basic usage:
+
+\begin{verbatim}
+>>> random.random() # Random float x, 0.0 <= x < 1.0
+0.37444887175646646
+>>> random.uniform(1, 10) # Random float x, 1.0 <= x < 10.0
+1.1800146073117523
+>>> random.randint(1, 10) # Integer from 1 to 10, endpoints included
+7
+>>> random.randrange(0, 101, 2) # Even integer from 0 to 100
+26
+>>> random.choice('abcdefghij') # Choose a random element
+'c'
+
+>>> items = [1, 2, 3, 4, 5, 6, 7]
+>>> random.shuffle(items)
+>>> items
+[7, 3, 2, 5, 6, 4, 1]
+
+>>> random.sample([1, 2, 3, 4, 5], 3) # Choose 3 elements
+[4, 1, 5]
+
+\end{verbatim}
+
\begin{seealso}
\seetext{M. Matsumoto and T. Nishimura, ``Mersenne Twister: A
623-dimensionally equidistributed uniform pseudorandom
From python-checkins at python.org Thu Jul 27 21:11:07 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Thu, 27 Jul 2006 21:11:07 +0200 (CEST)
Subject: [Python-checkins] r50874 - python/trunk/Doc/lib/libbinascii.tex
Message-ID: <20060727191107.A2B161E4004@bag.python.org>
Author: andrew.kuchling
Date: Thu Jul 27 21:11:07 2006
New Revision: 50874
Modified:
python/trunk/Doc/lib/libbinascii.tex
Log:
Mention base64 module; rewrite last sentence to be more positive
Modified: python/trunk/Doc/lib/libbinascii.tex
==============================================================================
--- python/trunk/Doc/lib/libbinascii.tex (original)
+++ python/trunk/Doc/lib/libbinascii.tex Thu Jul 27 21:11:07 2006
@@ -9,10 +9,11 @@
The \module{binascii} module contains a number of methods to convert
between binary and various \ASCII-encoded binary
representations. Normally, you will not use these functions directly
-but use wrapper modules like \refmodule{uu}\refstmodindex{uu} or
-\refmodule{binhex}\refstmodindex{binhex} instead, this module solely
-exists because bit-manipulation of large amounts of data is slow in
-Python.
+but use wrapper modules like \refmodule{uu}\refstmodindex{uu},
+\refmodule{base64}\refstmodindex{base64}, or
+\refmodule{binhex}\refstmodindex{binhex} instead. The \module{binascii} module
+contains low-level functions written in C for greater speed
+that are used by the higher-level modules.
The \module{binascii} module defines the following functions:
From python-checkins at python.org Thu Jul 27 21:12:50 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Thu, 27 Jul 2006 21:12:50 +0200 (CEST)
Subject: [Python-checkins] r50875 - python/trunk/Doc/lib/lib.tex
Message-ID: <20060727191250.4EA511E4004@bag.python.org>
Author: andrew.kuchling
Date: Thu Jul 27 21:12:49 2006
New Revision: 50875
Modified:
python/trunk/Doc/lib/lib.tex
Log:
If binhex is higher-level than binascii, it should come first in the chapter
Modified: python/trunk/Doc/lib/lib.tex
==============================================================================
--- python/trunk/Doc/lib/lib.tex (original)
+++ python/trunk/Doc/lib/lib.tex Thu Jul 27 21:12:49 2006
@@ -154,8 +154,8 @@
% encoding stuff
\input{libbase64}
-\input{libbinascii}
\input{libbinhex}
+\input{libbinascii}
\input{libquopri}
\input{libuu}
From buildbot at python.org Thu Jul 27 21:31:39 2006
From: buildbot at python.org (buildbot at python.org)
Date: Thu, 27 Jul 2006 19:31:39 +0000
Subject: [Python-checkins] buildbot warnings in alpha Tru64 5.1 trunk
Message-ID: <20060727193139.366281E4004@bag.python.org>
The Buildbot has detected a new failure of alpha Tru64 5.1 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/alpha%2520Tru64%25205.1%2520trunk/builds/978
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: andrew.kuchling,thomas.heller
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Thu Jul 27 22:47:25 2006
From: python-checkins at python.org (tim.peters)
Date: Thu, 27 Jul 2006 22:47:25 +0200 (CEST)
Subject: [Python-checkins] r50876 - python/trunk/Lib/test/test_uuid.py
Message-ID: <20060727204725.0723E1E4013@bag.python.org>
Author: tim.peters
Date: Thu Jul 27 22:47:24 2006
New Revision: 50876
Modified:
python/trunk/Lib/test/test_uuid.py
Log:
check_node(): stop spraying mystery output to stderr.
When a node number disagrees, keep track of all sources & the
node numbers they reported, and stick all that in the error message.
Changed all callers to supply a non-empty "source" argument; made
the "source" argument non-optional.
On my box, test_uuid still fails, but with the less confusing output:
AssertionError: different sources disagree on node:
from source 'getnode1', node was 00038a000015
from source 'getnode2', node was 00038a000015
from source 'ipconfig', node was 001111b2b7bf
Only the last one appears to be correct; e.g.,
C:\Code\python\PCbuild>getmac
Physical Address Transport Name
=================== ==========================================================
00-11-11-B2-B7-BF \Device\Tcpip_{190FB163-5AFD-4483-86A1-2FE16AC61FF1}
62-A1-AC-6C-FD-BE \Device\Tcpip_{8F77DF5A-EA3D-4F1D-975E-D472CEE6438A}
E2-1F-01-C6-5D-88 \Device\Tcpip_{CD18F76B-2EF3-409F-9B8A-6481EE70A1E4}
I can't find anything on my box with MAC 00-03-8a-00-00-15, and am
not clear on where that comes from.
Modified: python/trunk/Lib/test/test_uuid.py
==============================================================================
--- python/trunk/Lib/test/test_uuid.py (original)
+++ python/trunk/Lib/test/test_uuid.py Thu Jul 27 22:47:24 2006
@@ -11,6 +11,7 @@
class TestUUID(TestCase):
last_node = None
+ source2node = {}
def test_UUID(self):
equal = self.assertEqual
@@ -266,7 +267,7 @@
badtype(lambda: setattr(u, 'fields', f))
badtype(lambda: setattr(u, 'int', i))
- def check_node(self, node, source=''):
+ def check_node(self, node, source):
individual_group_bit = (node >> 40L) & 1
universal_local_bit = (node >> 40L) & 2
message = "%012x doesn't look like a real MAC address" % node
@@ -275,13 +276,15 @@
self.assertNotEqual(node, 0, message)
self.assertNotEqual(node, 0xffffffffffffL, message)
self.assert_(0 <= node, message)
- self.assert_(node < 1<<48L, message)
+ self.assert_(node < (1L << 48), message)
- import sys
- if source:
- sys.stderr.write('(%s: %012x)' % (source, node))
+ TestUUID.source2node[source] = node
if TestUUID.last_node:
- self.assertEqual(TestUUID.last_node, node, 'inconsistent node IDs')
+ if TestUUID.last_node != node:
+ msg = "different sources disagree on node:\n"
+ for s, n in TestUUID.source2node.iteritems():
+ msg += " from source %r, node was %012x\n" % (s, n)
+ self.fail(msg)
else:
TestUUID.last_node = node
@@ -319,10 +322,10 @@
self.check_node(uuid._windll_getnode(), 'windll')
def test_getnode(self):
- self.check_node(uuid.getnode())
+ self.check_node(uuid.getnode(), "getnode1")
# Test it again to ensure consistency.
- self.check_node(uuid.getnode())
+ self.check_node(uuid.getnode(), "getnode2")
def test_uuid1(self):
equal = self.assertEqual
From neal at metaslash.com Thu Jul 27 23:11:51 2006
From: neal at metaslash.com (Neal Norwitz)
Date: Thu, 27 Jul 2006 17:11:51 -0400
Subject: [Python-checkins] Python Regression Test Failures refleak (1)
Message-ID: <20060727211151.GA24667@python.psfb.org>
test_cmd_line leaked [0, 17, 0] references
test_defaultdict leaked [9, 9, 9] references
From buildbot at python.org Thu Jul 27 23:42:58 2006
From: buildbot at python.org (buildbot at python.org)
Date: Thu, 27 Jul 2006 21:42:58 +0000
Subject: [Python-checkins] buildbot warnings in x86 gentoo trunk
Message-ID: <20060727214259.133191E4004@bag.python.org>
The Buildbot has detected a new failure of x86 gentoo trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520gentoo%2520trunk/builds/1396
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: andrew.kuchling,tim.peters
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Fri Jul 28 00:40:06 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Fri, 28 Jul 2006 00:40:06 +0200 (CEST)
Subject: [Python-checkins] r50878 - python/trunk/Doc/lib/libmailbox.tex
Message-ID: <20060727224006.4DC111E400F@bag.python.org>
Author: andrew.kuchling
Date: Fri Jul 28 00:40:05 2006
New Revision: 50878
Modified:
python/trunk/Doc/lib/libmailbox.tex
Log:
Reword paragraph
Modified: python/trunk/Doc/lib/libmailbox.tex
==============================================================================
--- python/trunk/Doc/lib/libmailbox.tex (original)
+++ python/trunk/Doc/lib/libmailbox.tex Fri Jul 28 00:40:05 2006
@@ -1367,9 +1367,8 @@
print subject
\end{verbatim}
-A (surprisingly) simple example of copying all mail from a Babyl mailbox to an
-MH mailbox, converting all of the format-specific information that can be
-converted:
+To copy all mail from a Babyl mailbox to an MH mailbox, converting all
+of the format-specific information that can be converted:
\begin{verbatim}
import mailbox
From python-checkins at python.org Fri Jul 28 00:49:41 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Fri, 28 Jul 2006 00:49:41 +0200 (CEST)
Subject: [Python-checkins] r50879 - python/trunk/Doc/lib/libmimetypes.tex
Message-ID: <20060727224941.06D8C1E4004@bag.python.org>
Author: andrew.kuchling
Date: Fri Jul 28 00:49:38 2006
New Revision: 50879
Modified:
python/trunk/Doc/lib/libmimetypes.tex
Log:
Add example
Modified: python/trunk/Doc/lib/libmimetypes.tex
==============================================================================
--- python/trunk/Doc/lib/libmimetypes.tex (original)
+++ python/trunk/Doc/lib/libmimetypes.tex Fri Jul 28 00:49:38 2006
@@ -158,6 +158,20 @@
\versionadded{2.2}
\end{classdesc}
+An example usage of the module:
+
+\begin{verbatim}
+>>> import mimetypes
+>>> mimetypes.init()
+>>> mimetypes.knownfiles
+['/etc/mime.types', '/etc/httpd/mime.types', ... ]
+>>> mimetypes.suffix_map['.tgz']
+'.tar.gz'
+>>> mimetypes.encodings_map['.gz']
+'gzip'
+>>> mimetypes.types_map['.tgz']
+'application/x-tar-gz'
+\end{verbatim}
\subsection{MimeTypes Objects \label{mimetypes-objects}}
From python-checkins at python.org Fri Jul 28 00:49:56 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Fri, 28 Jul 2006 00:49:56 +0200 (CEST)
Subject: [Python-checkins] r50880 - python/trunk/Doc/lib/libbase64.tex
Message-ID: <20060727224956.0F69B1E4004@bag.python.org>
Author: andrew.kuchling
Date: Fri Jul 28 00:49:54 2006
New Revision: 50880
Modified:
python/trunk/Doc/lib/libbase64.tex
Log:
Add example
Modified: python/trunk/Doc/lib/libbase64.tex
==============================================================================
--- python/trunk/Doc/lib/libbase64.tex (original)
+++ python/trunk/Doc/lib/libbase64.tex Fri Jul 28 00:49:54 2006
@@ -146,6 +146,18 @@
always including an extra trailing newline (\code{'\e n'}).
\end{funcdesc}
+An example usage of the module:
+
+\begin{verbatim}
+>>> import base64
+>>> encoded = base64.b64encode('data to be encoded')
+>>> encoded
+'ZGF0YSB0byBiZSBlbmNvZGVk'
+>>> data = base64.b64decode(encoded)
+>>> data
+'data to be encoded'
+\end{verbatim}
+
\begin{seealso}
\seemodule{binascii}{Support module containing \ASCII-to-binary
and binary-to-\ASCII{} conversions.}
From python-checkins at python.org Fri Jul 28 01:43:18 2006
From: python-checkins at python.org (barry.warsaw)
Date: Fri, 28 Jul 2006 01:43:18 +0200 (CEST)
Subject: [Python-checkins] r50881 - in python/trunk: Doc/lib/libinspect.tex
Doc/lib/libtypes.tex Lib/inspect.py Lib/pydoc.py
Lib/test/test_inspect.py Lib/types.py Makefile.pre.in
Misc/NEWS Modules/_typesmodule.c Modules/config.c.in
Message-ID: <20060727234318.A70121E4004@bag.python.org>
Author: barry.warsaw
Date: Fri Jul 28 01:43:15 2006
New Revision: 50881
Added:
python/trunk/Modules/_typesmodule.c
Modified:
python/trunk/Doc/lib/libinspect.tex
python/trunk/Doc/lib/libtypes.tex
python/trunk/Lib/inspect.py
python/trunk/Lib/pydoc.py
python/trunk/Lib/test/test_inspect.py
python/trunk/Lib/types.py
python/trunk/Makefile.pre.in
python/trunk/Misc/NEWS
python/trunk/Modules/config.c.in
Log:
Patch #1520294: Support for getset and member descriptors in types.py,
inspect.py, and pydoc.py. Specifically, this allows for querying the type of
an object against these built-in C types and more importantly, for getting
their docstrings printed in the interactive interpreter's help() function.
This patch includes a new built-in module called _types which provides
definitions of getset and member descriptors for use by the types.py module.
These types are exposed as types.GetSetDescriptorType and
types.MemberDescriptorType. Query functions are provided as
inspect.isgetsetdescriptor() and inspect.ismemberdescriptor(). The
implementations of these are robust enough to work with Python implementations
other than CPython, which may not have these fundamental types.
The patch also includes documentation and test suite updates.
I commit these changes now under these guiding principles:
1. Silence is assent. The release manager has not said "no", and of the few
people that cared enough to respond to the thread, the worst vote was "0".
2. It's easier to ask for forgiveness than permission.
3. It's so dang easy to revert stuff in svn, that you could view this as a
forcing function. :)
Windows build patches will follow.
Modified: python/trunk/Doc/lib/libinspect.tex
==============================================================================
--- python/trunk/Doc/lib/libinspect.tex (original)
+++ python/trunk/Doc/lib/libinspect.tex Fri Jul 28 01:43:15 2006
@@ -180,13 +180,32 @@
Return true if the object is a data descriptor.
Data descriptors have both a __get__ and a __set__ attribute. Examples are
- properties (defined in Python) and getsets and members (defined in C).
- Typically, data descriptors will also have __name__ and __doc__ attributes
- (properties, getsets, and members have both of these attributes), but this
- is not guaranteed.
+ properties (defined in Python), getsets, and members. The latter two are
+ defined in C and there are more specific tests available for those types,
+ which is robust across Python implementations. Typically, data descriptors
+ will also have __name__ and __doc__ attributes (properties, getsets, and
+ members have both of these attributes), but this is not guaranteed.
\versionadded{2.3}
\end{funcdesc}
+\begin{funcdesc}{isgetsetdescriptor}{object}
+ Return true if the object is a getset descriptor.
+
+ getsets are attributes defined in extension modules via \code{PyGetSetDef}
+ structures. For Python implementations without such types, this method will
+ always return \code{False}.
+\versionadded{2.5}
+\end{funcdesc}
+
+\begin{funcdesc}{ismemberdescriptor}{object}
+ Return true if the object is a member descriptor.
+
+ Member descriptors are attributes defined in extension modules via
+ \code{PyMemberDef} structures. For Python implementations without such
+ types, this method will always return \code{False}.
+\versionadded{2.5}
+\end{funcdesc}
+
\subsection{Retrieving source code
\label{inspect-source}}
Modified: python/trunk/Doc/lib/libtypes.tex
==============================================================================
--- python/trunk/Doc/lib/libtypes.tex (original)
+++ python/trunk/Doc/lib/libtypes.tex Fri Jul 28 01:43:15 2006
@@ -180,6 +180,30 @@
\function{buffer()}\bifuncindex{buffer} function.
\end{datadesc}
+\begin{datadesc}{DictProxyType}
+The type of dict proxies, such as \code{TypeType.__dict__}.
+\end{datadesc}
+
+\begin{datadesc}{NotImplementedType}
+The type of \code{NotImplemented}
+\end{datadesc}
+
+\begin{datadesc}{GetSetDescriptorType}
+The type of objects defined in extension modules with \code{PyGetSetDef}, such
+as \code{FrameType.f_locals} or \code{array.array.typecode}. This constant is
+not defined in implementations of Python that do not have such extension
+types, so for portable code use \code{hasattr(types, 'GetSetDescriptorType')}.
+\versionadded{2.5}
+\end{datadesc}
+
+\begin{datadesc}{MemberDescriptorType}
+The type of objects defined in extension modules with \code{PyMemberDef}, such
+as \code {datetime.timedelta.days}. This constant is not defined in
+implementations of Python that do not have such extension types, so for
+portable code use \code{hasattr(types, 'MemberDescriptorType')}.
+\versionadded{2.5}
+\end{datadesc}
+
\begin{datadesc}{StringTypes}
A sequence containing \code{StringType} and \code{UnicodeType} used to
facilitate easier checking for any string object. Using this is more
Modified: python/trunk/Lib/inspect.py
==============================================================================
--- python/trunk/Lib/inspect.py (original)
+++ python/trunk/Lib/inspect.py Fri Jul 28 01:43:15 2006
@@ -89,6 +89,40 @@
is not guaranteed."""
return (hasattr(object, "__set__") and hasattr(object, "__get__"))
+if hasattr(types, 'MemberDescriptorType'):
+ # CPython and equivalent
+ def ismemberdescriptor(object):
+ """Return true if the object is a member descriptor.
+
+ Member descriptors are specialized descriptors defined in extension
+ modules."""
+ return isinstance(object, types.MemberDescriptorType)
+else:
+ # Other implementations
+ def ismemberdescriptor(object):
+ """Return true if the object is a member descriptor.
+
+ Member descriptors are specialized descriptors defined in extension
+ modules."""
+ return False
+
+if hasattr(types, 'GetSetDescriptorType'):
+ # CPython and equivalent
+ def isgetsetdescriptor(object):
+ """Return true if the object is a getset descriptor.
+
+ getset descriptors are specialized descriptors defined in extension
+ modules."""
+ return isinstance(object, types.GetSetDescriptorType)
+else:
+ # Other implementations
+ def isgetsetdescriptor(object):
+ """Return true if the object is a getset descriptor.
+
+ getset descriptors are specialized descriptors defined in extension
+ modules."""
+ return False
+
def isfunction(object):
"""Return true if the object is a user-defined function.
Modified: python/trunk/Lib/pydoc.py
==============================================================================
--- python/trunk/Lib/pydoc.py (original)
+++ python/trunk/Lib/pydoc.py Fri Jul 28 01:43:15 2006
@@ -318,6 +318,8 @@
# identifies something in a way that pydoc itself has issues handling;
# think 'super' and how it is a descriptor (which raises the exception
# by lacking a __name__ attribute) and an instance.
+ if inspect.isgetsetdescriptor(object): return self.docdata(*args)
+ if inspect.ismemberdescriptor(object): return self.docdata(*args)
try:
if inspect.ismodule(object): return self.docmodule(*args)
if inspect.isclass(object): return self.docclass(*args)
@@ -333,7 +335,7 @@
name and ' ' + repr(name), type(object).__name__)
raise TypeError, message
- docmodule = docclass = docroutine = docother = fail
+ docmodule = docclass = docroutine = docother = docproperty = docdata = fail
def getdocloc(self, object):
"""Return the location of module docs or None"""
@@ -915,6 +917,10 @@
lhs = name and '%s = ' % name or ''
return lhs + self.repr(object)
+ def docdata(self, object, name=None, mod=None, cl=None):
+ """Produce html documentation for a data descriptor."""
+ return self._docdescriptor(name, object, mod)
+
def index(self, dir, shadowed=None):
"""Generate an HTML index for a directory of modules."""
modpkgs = []
@@ -1268,6 +1274,10 @@
"""Produce text documentation for a property."""
return self._docdescriptor(name, object, mod)
+ def docdata(self, object, name=None, mod=None, cl=None):
+ """Produce text documentation for a data descriptor."""
+ return self._docdescriptor(name, object, mod)
+
def docother(self, object, name=None, mod=None, parent=None, maxlen=None, doc=None):
"""Produce text documentation for a data object."""
repr = self.repr(object)
@@ -1397,6 +1407,14 @@
return 'module ' + thing.__name__
if inspect.isbuiltin(thing):
return 'built-in function ' + thing.__name__
+ if inspect.isgetsetdescriptor(thing):
+ return 'getset descriptor %s.%s.%s' % (
+ thing.__objclass__.__module__, thing.__objclass__.__name__,
+ thing.__name__)
+ if inspect.ismemberdescriptor(thing):
+ return 'member descriptor %s.%s.%s' % (
+ thing.__objclass__.__module__, thing.__objclass__.__name__,
+ thing.__name__)
if inspect.isclass(thing):
return 'class ' + thing.__name__
if inspect.isfunction(thing):
@@ -1453,6 +1471,8 @@
if not (inspect.ismodule(object) or
inspect.isclass(object) or
inspect.isroutine(object) or
+ inspect.isgetsetdescriptor(object) or
+ inspect.ismemberdescriptor(object) or
isinstance(object, property)):
# If the passed object is a piece of data or an instance,
# document its available methods instead of its value.
Modified: python/trunk/Lib/test/test_inspect.py
==============================================================================
--- python/trunk/Lib/test/test_inspect.py (original)
+++ python/trunk/Lib/test/test_inspect.py Fri Jul 28 01:43:15 2006
@@ -1,6 +1,8 @@
import sys
+import types
import unittest
import inspect
+import datetime
from test.test_support import TESTFN, run_unittest
@@ -40,10 +42,11 @@
self.failIf(other(obj), 'not %s(%s)' % (other.__name__, exp))
class TestPredicates(IsTestBase):
- def test_eleven(self):
- # Doc/lib/libinspect.tex claims there are 11 such functions
+ def test_thirteen(self):
+ # Doc/lib/libinspect.tex claims there are 13 such functions
count = len(filter(lambda x:x.startswith('is'), dir(inspect)))
- self.assertEqual(count, 11, "There are %d (not 11) is* functions" % count)
+ self.assertEqual(count, 13,
+ "There are %d (not 12) is* functions" % count)
def test_excluding_predicates(self):
self.istest(inspect.isbuiltin, 'sys.exit')
@@ -58,6 +61,15 @@
self.istest(inspect.istraceback, 'tb')
self.istest(inspect.isdatadescriptor, '__builtin__.file.closed')
self.istest(inspect.isdatadescriptor, '__builtin__.file.softspace')
+ if hasattr(types, 'GetSetDescriptorType'):
+ self.istest(inspect.isgetsetdescriptor,
+ 'type(tb.tb_frame).f_locals')
+ else:
+ self.failIf(inspect.isgetsetdescriptor(type(tb.tb_frame).f_locals))
+ if hasattr(types, 'MemberDescriptorType'):
+ self.istest(inspect.ismemberdescriptor, 'datetime.timedelta.days')
+ else:
+ self.failIf(inspect.ismemberdescriptor(datetime.timedelta.days))
def test_isroutine(self):
self.assert_(inspect.isroutine(mod.spam))
Modified: python/trunk/Lib/types.py
==============================================================================
--- python/trunk/Lib/types.py (original)
+++ python/trunk/Lib/types.py Fri Jul 28 01:43:15 2006
@@ -86,4 +86,16 @@
DictProxyType = type(TypeType.__dict__)
NotImplementedType = type(NotImplemented)
-del sys, _f, _g, _C, _x # Not for export
+# Extension types defined in a C helper module. XXX There may be no
+# equivalent in implementations other than CPython, so it seems better to
+# leave them undefined than to set them to e.g. None.
+try:
+ import _types
+except ImportError:
+ pass
+else:
+ GetSetDescriptorType = type(_types.Helper.getter)
+ MemberDescriptorType = type(_types.Helper.member)
+ del _types
+
+del sys, _f, _g, _C, _x # Not for export
Modified: python/trunk/Makefile.pre.in
==============================================================================
--- python/trunk/Makefile.pre.in (original)
+++ python/trunk/Makefile.pre.in Fri Jul 28 01:43:15 2006
@@ -317,6 +317,7 @@
##########################################################################
# objects that get linked into the Python library
LIBRARY_OBJS= \
+ Modules/_typesmodule.o \
Modules/getbuildinfo.o \
$(PARSER_OBJS) \
$(OBJECT_OBJS) \
@@ -353,6 +354,7 @@
$(LIBRARY): $(LIBRARY_OBJS)
-rm -f $@
$(AR) cr $@ Modules/getbuildinfo.o
+ $(AR) cr $@ Modules/_typesmodule.o
$(AR) cr $@ $(PARSER_OBJS)
$(AR) cr $@ $(OBJECT_OBJS)
$(AR) cr $@ $(PYTHON_OBJS)
@@ -485,7 +487,7 @@
$(AST_C): $(AST_ASDL) $(ASDLGEN_FILES)
$(ASDLGEN) -c $(AST_C_DIR) $(AST_ASDL)
-
+
Python/compile.o Python/symtable.o: $(GRAMMAR_H) $(AST_H)
Python/getplatform.o: $(srcdir)/Python/getplatform.c
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Fri Jul 28 01:43:15 2006
@@ -91,6 +91,11 @@
- Bug #1517996: IDLE no longer shows the default Tk menu when a
path browser, class browser or debugger is the frontmost window on MacOS X
+- Patch #1520294: Support for getset and member descriptors in types.py,
+ inspect.py, and pydoc.py. Specifically, this allows for querying the type
+ of an object against these built-in types and more importantly, for getting
+ their docstrings printed in the interactive interpreter's help() function.
+
Extension Modules
-----------------
Added: python/trunk/Modules/_typesmodule.c
==============================================================================
--- (empty file)
+++ python/trunk/Modules/_typesmodule.c Fri Jul 28 01:43:15 2006
@@ -0,0 +1,94 @@
+/* This extension module exposes some types that are only available at the
+ * C level. It should not be used directly, but instead through the Python
+ * level types module, which imports this.
+ */
+
+#include "Python.h"
+#include "structmember.h"
+
+typedef struct
+{
+ PyObject_HEAD
+ int member;
+} Helper;
+
+static PyMemberDef helper_members[] = {
+ { "member", T_INT, offsetof(Helper, member), READONLY,
+ PyDoc_STR("A member descriptor")
+ },
+ { NULL }
+};
+
+static PyObject *
+helper_getter(Helper *self, void *unused)
+{
+ Py_RETURN_NONE;
+}
+
+static PyGetSetDef helper_getset[] = {
+ { "getter", (getter)helper_getter, NULL,
+ PyDoc_STR("A getset descriptor"),
+ },
+ { NULL }
+};
+
+static PyTypeObject HelperType = {
+ PyObject_HEAD_INIT(NULL)
+ 0, /* ob_size */
+ "_types.Helper", /* tp_name */
+ sizeof(Helper), /* tp_basicsize */
+ 0, /* tp_itemsize */
+ 0, /* tp_dealloc */
+ 0, /* tp_print */
+ 0, /* tp_getattr */
+ 0, /* tp_setattr */
+ 0, /* tp_compare */
+ 0, /* tp_repr */
+ 0, /* tp_as_number */
+ 0, /* tp_as_sequence */
+ 0, /* tp_as_mapping */
+ 0, /* tp_hash */
+ 0, /* tp_call */
+ 0, /* tp_str */
+ 0, /* tp_getattro */
+ 0, /* tp_setattro */
+ 0, /* tp_as_buffer */
+ Py_TPFLAGS_DEFAULT, /* tp_flags */
+ 0, /* tp_doc */
+ 0, /* tp_traverse */
+ 0, /* tp_clear */
+ 0, /* tp_richcompare */
+ 0, /* tp_weaklistoffset */
+ 0, /* tp_iter */
+ 0, /* tp_iternext */
+ 0, /* tp_methods */
+ helper_members, /* tp_members */
+ helper_getset, /* tp_getset */
+ 0, /* tp_base */
+ 0, /* tp_dict */
+ 0, /* tp_descr_get */
+ 0, /* tp_descr_set */
+ 0, /* tp_dictoffset */
+ 0, /* tp_init */
+ 0, /* tp_alloc */
+ 0, /* tp_new */
+ 0, /* tp_free */
+};
+
+PyMODINIT_FUNC
+init_types(void)
+{
+ PyObject *m;
+
+ m = Py_InitModule3("_types", NULL, "A types module helper");
+ if (!m)
+ return;
+
+ if (PyType_Ready(&HelperType) < 0)
+ return;
+
+ Py_INCREF(&HelperType);
+ PyModule_AddObject(m, "Helper", (PyObject *)&HelperType);
+}
+
+
Modified: python/trunk/Modules/config.c.in
==============================================================================
--- python/trunk/Modules/config.c.in (original)
+++ python/trunk/Modules/config.c.in Fri Jul 28 01:43:15 2006
@@ -28,6 +28,7 @@
extern void initimp(void);
extern void initgc(void);
extern void init_ast(void);
+extern void init_types(void);
struct _inittab _PyImport_Inittab[] = {
@@ -42,6 +43,9 @@
/* This lives in Python/Python-ast.c */
{"_ast", init_ast},
+ /* This lives in Python/_types.c */
+ {"_types", init_types},
+
/* These entries are here for sys.builtin_module_names */
{"__main__", NULL},
{"__builtin__", NULL},
From python-checkins at python.org Fri Jul 28 01:44:38 2006
From: python-checkins at python.org (tim.peters)
Date: Fri, 28 Jul 2006 01:44:38 +0200 (CEST)
Subject: [Python-checkins] r50882 - in python/trunk: Lib/doctest.py
Lib/test/test_doctest.py Misc/NEWS
Message-ID: <20060727234438.86FF51E4004@bag.python.org>
Author: tim.peters
Date: Fri Jul 28 01:44:37 2006
New Revision: 50882
Modified:
python/trunk/Lib/doctest.py
python/trunk/Lib/test/test_doctest.py
python/trunk/Misc/NEWS
Log:
Bug #1529297: The rewrite of doctest for Python 2.4 unintentionally
lost that tests are sorted by name before being run. ``DocTestFinder``
has been changed to sort the list of tests it returns.
Modified: python/trunk/Lib/doctest.py
==============================================================================
--- python/trunk/Lib/doctest.py (original)
+++ python/trunk/Lib/doctest.py Fri Jul 28 01:44:37 2006
@@ -821,6 +821,11 @@
# Recursively explore `obj`, extracting DocTests.
tests = []
self._find(tests, obj, name, module, source_lines, globs, {})
+ # Sort the tests by alpha order of names, for consistency in
+ # verbose-mode output. This was a feature of doctest in Pythons
+ # <= 2.3 that got lost by accident in 2.4. It was repaired in
+ # 2.4.4 and 2.5.
+ tests.sort()
return tests
def _from_module(self, module, object):
Modified: python/trunk/Lib/test/test_doctest.py
==============================================================================
--- python/trunk/Lib/test/test_doctest.py (original)
+++ python/trunk/Lib/test/test_doctest.py Fri Jul 28 01:44:37 2006
@@ -419,7 +419,6 @@
>>> finder = doctest.DocTestFinder()
>>> tests = finder.find(SampleClass)
- >>> tests.sort()
>>> for t in tests:
... print '%2s %s' % (len(t.examples), t.name)
3 SampleClass
@@ -435,7 +434,6 @@
New-style classes are also supported:
>>> tests = finder.find(SampleNewStyleClass)
- >>> tests.sort()
>>> for t in tests:
... print '%2s %s' % (len(t.examples), t.name)
1 SampleNewStyleClass
@@ -475,7 +473,6 @@
>>> # ignoring the objects since they weren't defined in m.
>>> import test.test_doctest
>>> tests = finder.find(m, module=test.test_doctest)
- >>> tests.sort()
>>> for t in tests:
... print '%2s %s' % (len(t.examples), t.name)
1 some_module
@@ -499,7 +496,6 @@
>>> from test import doctest_aliases
>>> tests = excl_empty_finder.find(doctest_aliases)
- >>> tests.sort()
>>> print len(tests)
2
>>> print tests[0].name
@@ -517,7 +513,6 @@
By default, an object with no doctests doesn't create any tests:
>>> tests = doctest.DocTestFinder().find(SampleClass)
- >>> tests.sort()
>>> for t in tests:
... print '%2s %s' % (len(t.examples), t.name)
3 SampleClass
@@ -536,7 +531,6 @@
displays.
>>> tests = doctest.DocTestFinder(exclude_empty=False).find(SampleClass)
- >>> tests.sort()
>>> for t in tests:
... print '%2s %s' % (len(t.examples), t.name)
3 SampleClass
@@ -557,7 +551,6 @@
using the `recurse` flag:
>>> tests = doctest.DocTestFinder(recurse=False).find(SampleClass)
- >>> tests.sort()
>>> for t in tests:
... print '%2s %s' % (len(t.examples), t.name)
3 SampleClass
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Fri Jul 28 01:44:37 2006
@@ -47,6 +47,12 @@
Library
-------
+- Bug #1529297: The rewrite of doctest for Python 2.4 unintentionally
+ lost that tests are sorted by name before being run. This rarely
+ matters for well-written tests, but can create baffling symptoms if
+ side effects from one test to the next affect outcomes. ``DocTestFinder``
+ has been changed to sort the list of tests it returns.
+
- The distutils version has been changed to 2.5.0.
- Bug #978833: Really close underlying socket in _socketobject.close.
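
Because DocTestFinder.find() now sorts the tests it returns, the explicit
tests.sort() calls deleted above are redundant. A minimal sketch of the new
behaviour; SampleClass here is a stand-in defined for the example, not the one
in test_doctest.py:

    import doctest

    class SampleClass(object):
        """
        >>> 1 + 1
        2
        """
        def b(self):
            """
            >>> str('b')
            'b'
            """
        def a(self):
            """
            >>> str('a')
            'a'
            """

    tests = doctest.DocTestFinder().find(SampleClass)
    # With the fix the names already come back in alphabetical order, so no
    # tests.sort() is needed: SampleClass, SampleClass.a, SampleClass.b.
    for t in tests:
        print '%2s %s' % (len(t.examples), t.name)
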
From python-checkins at python.org Fri Jul 28 01:45:49 2006
From: python-checkins at python.org (tim.peters)
Date: Fri, 28 Jul 2006 01:45:49 +0200 (CEST)
Subject: [Python-checkins] r50883 - python/trunk/Lib/inspect.py
Message-ID: <20060727234549.1E37D1E4004@bag.python.org>
Author: tim.peters
Date: Fri Jul 28 01:45:48 2006
New Revision: 50883
Modified:
python/trunk/Lib/inspect.py
Log:
Whitespace normalization.
Modified: python/trunk/Lib/inspect.py
==============================================================================
--- python/trunk/Lib/inspect.py (original)
+++ python/trunk/Lib/inspect.py Fri Jul 28 01:45:48 2006
@@ -122,7 +122,7 @@
getset descriptors are specialized descriptors defined in extension
modules."""
return False
-
+
def isfunction(object):
"""Return true if the object is a user-defined function.
From python-checkins at python.org Fri Jul 28 01:46:37 2006
From: python-checkins at python.org (tim.peters)
Date: Fri, 28 Jul 2006 01:46:37 +0200 (CEST)
Subject: [Python-checkins] r50884 - python/trunk/Modules/_typesmodule.c
Message-ID: <20060727234637.43B561E4004@bag.python.org>
Author: tim.peters
Date: Fri Jul 28 01:46:36 2006
New Revision: 50884
Modified:
python/trunk/Modules/_typesmodule.c (props changed)
Log:
Add missing svn:eol-style property to text files.
From python-checkins at python.org Fri Jul 28 01:50:41 2006
From: python-checkins at python.org (barry.warsaw)
Date: Fri, 28 Jul 2006 01:50:41 +0200 (CEST)
Subject: [Python-checkins] r50885 - in python/trunk: PC/config.c
PCbuild/pythoncore.vcproj
Message-ID: <20060727235041.54E151E4004@bag.python.org>
Author: barry.warsaw
Date: Fri Jul 28 01:50:40 2006
New Revision: 50885
Modified:
python/trunk/PC/config.c
python/trunk/PCbuild/pythoncore.vcproj
Log:
Enable the building of the _types module on Windows.
Note that this has only been tested for VS 2003 since that's all I have.
Modified: python/trunk/PC/config.c
==============================================================================
--- python/trunk/PC/config.c (original)
+++ python/trunk/PC/config.c Fri Jul 28 01:50:40 2006
@@ -67,6 +67,7 @@
extern void init_subprocess(void);
extern void init_lsprof(void);
extern void init_ast(void);
+extern void init_types(void);
/* tools/freeze/makeconfig.py marker for additional "extern" */
/* -- ADDMODULE MARKER 1 -- */
@@ -161,6 +162,8 @@
{"__builtin__", NULL},
{"sys", NULL},
{"exceptions", NULL},
+
+ {"_types", init_types},
/* Sentinel */
{0, 0}
Modified: python/trunk/PCbuild/pythoncore.vcproj
==============================================================================
--- python/trunk/PCbuild/pythoncore.vcproj (original)
+++ python/trunk/PCbuild/pythoncore.vcproj Fri Jul 28 01:50:40 2006
@@ -344,9 +344,6 @@
RelativePath="..\Modules\_bisectmodule.c">
Author: tim.peters
Date: Fri Jul 28 01:56:12 2006
New Revision: 50886
Modified:
python/branches/release24-maint/Lib/doctest.py
python/branches/release24-maint/Lib/test/test_doctest.py
python/branches/release24-maint/Misc/NEWS
Log:
Merge rev 50882 from the trunk.
Bug #1529297: The rewrite of doctest for Python 2.4 unintentionally
lost that tests are sorted by name before being run. ``DocTestFinder``
has been changed to sort the list of tests it returns.
Modified: python/branches/release24-maint/Lib/doctest.py
==============================================================================
--- python/branches/release24-maint/Lib/doctest.py (original)
+++ python/branches/release24-maint/Lib/doctest.py Fri Jul 28 01:56:12 2006
@@ -848,6 +848,11 @@
# Recursively explore `obj`, extracting DocTests.
tests = []
self._find(tests, obj, name, module, source_lines, globs, {})
+ # Sort the tests by alpha order of names, for consistency in
+ # verbose-mode output. This was a feature of doctest in Pythons
+ # <= 2.3 that got lost by accident in 2.4. It was repaired in
+ # 2.4.4 and 2.5.
+ tests.sort()
return tests
def _filter(self, obj, prefix, base):
Modified: python/branches/release24-maint/Lib/test/test_doctest.py
==============================================================================
--- python/branches/release24-maint/Lib/test/test_doctest.py (original)
+++ python/branches/release24-maint/Lib/test/test_doctest.py Fri Jul 28 01:56:12 2006
@@ -419,7 +419,6 @@
>>> finder = doctest.DocTestFinder()
>>> tests = finder.find(SampleClass)
- >>> tests.sort()
>>> for t in tests:
... print '%2s %s' % (len(t.examples), t.name)
3 SampleClass
@@ -435,7 +434,6 @@
New-style classes are also supported:
>>> tests = finder.find(SampleNewStyleClass)
- >>> tests.sort()
>>> for t in tests:
... print '%2s %s' % (len(t.examples), t.name)
1 SampleNewStyleClass
@@ -475,7 +473,6 @@
>>> # ignoring the objects since they weren't defined in m.
>>> import test.test_doctest
>>> tests = finder.find(m, module=test.test_doctest)
- >>> tests.sort()
>>> for t in tests:
... print '%2s %s' % (len(t.examples), t.name)
1 some_module
@@ -499,7 +496,6 @@
>>> from test import doctest_aliases
>>> tests = excl_empty_finder.find(doctest_aliases)
- >>> tests.sort()
>>> print len(tests)
2
>>> print tests[0].name
@@ -521,7 +517,6 @@
>>> def namefilter(prefix, base):
... return base.startswith('a_')
>>> tests = doctest.DocTestFinder(_namefilter=namefilter).find(SampleClass)
- >>> tests.sort()
>>> for t in tests:
... print '%2s %s' % (len(t.examples), t.name)
3 SampleClass
@@ -538,7 +533,6 @@
>>> tests = doctest.DocTestFinder(_namefilter=namefilter,
... exclude_empty=False).find(SampleClass)
- >>> tests.sort()
>>> for t in tests:
... print '%2s %s' % (len(t.examples), t.name)
3 SampleClass
@@ -582,7 +576,6 @@
using the `recurse` flag:
>>> tests = doctest.DocTestFinder(recurse=False).find(SampleClass)
- >>> tests.sort()
>>> for t in tests:
... print '%2s %s' % (len(t.examples), t.name)
3 SampleClass
Modified: python/branches/release24-maint/Misc/NEWS
==============================================================================
--- python/branches/release24-maint/Misc/NEWS (original)
+++ python/branches/release24-maint/Misc/NEWS Fri Jul 28 01:56:12 2006
@@ -31,9 +31,9 @@
Extension Modules
-----------------
-- Bug #1471938: Fix curses module build problem on Solaris 8; patch by
+- Bug #1471938: Fix curses module build problem on Solaris 8; patch by
Paul Eggert.
-
+
- Bug #1512695: cPickle.loads could crash if it was interrupted with
a KeyboardInterrupt.
@@ -64,6 +64,12 @@
Library
-------
+- Bug #1529297: The rewrite of doctest for Python 2.4 unintentionally
+ lost that tests are sorted by name before being run. This rarely
+ matters for well-written tests, but can create baffling symptoms if
+ side effects from one test to the next affect outcomes. ``DocTestFinder``
+ has been changed to sort the list of tests it returns.
+
- The email package has improved RFC 2231 support, specifically for
recognizing the difference between encoded (name*0*=) and non-encoded
(name*0=) parameter continuations. This may change the types of
From barry at python.org Fri Jul 28 01:56:39 2006
From: barry at python.org (Barry Warsaw)
Date: Thu, 27 Jul 2006 19:56:39 -0400
Subject: [Python-checkins] r50884 - python/trunk/Modules/_typesmodule.c
In-Reply-To: <20060727234637.43B561E4004@bag.python.org>
References: <20060727234637.43B561E4004@bag.python.org>
Message-ID:
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
On Jul 27, 2006, at 7:46 PM, tim.peters wrote:
> Author: tim.peters
> Date: Fri Jul 28 01:46:36 2006
> New Revision: 50884
>
> Modified:
> python/trunk/Modules/_typesmodule.c (props changed)
> Log:
> Add missing svn:eol-style property to text files.
Wow Tim, you're fast! :)
- -Barry
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.4.3 (Darwin)
iQCVAwUBRMlSt3EjvBPtnXfVAQJLPQP/Yv8M0RrZMxkemNd5io2qoaFPxBrQe7YI
aSPIIpT3a1ytS9k6KLFO8lKDUjQ9J9WUIbEJ8HtCpUq8Abr8XXU/8SBzUZF+UG5G
14RWEo+8tX8AmMpPr9evgeOBa2neGehyDAPUUaHBHxE/aldftUMUpLeY66hzZ07F
7NQ7eIwwIBs=
=Qufm
-----END PGP SIGNATURE-----
From tim.peters at gmail.com Fri Jul 28 01:58:48 2006
From: tim.peters at gmail.com (Tim Peters)
Date: Thu, 27 Jul 2006 19:58:48 -0400
Subject: [Python-checkins] r50884 - python/trunk/Modules/_typesmodule.c
In-Reply-To:
References: <20060727234637.43B561E4004@bag.python.org>
Message-ID: <1f7befae0607271658t3ea544ecr24b4c3bcc29dd0f5@mail.gmail.com>
>> Author: tim.peters
>> Date: Fri Jul 28 01:46:36 2006
>> New Revision: 50884
>>
>> Modified:
>> python/trunk/Modules/_typesmodule.c (props changed)
>> Log:
>> Add missing svn:eol-style property to text files.
[Barry]
> Wow Tim, you're fast! :)
And the sad thing was that really was me :-)
From python-checkins at python.org Fri Jul 28 02:23:17 2006
From: python-checkins at python.org (tim.peters)
Date: Fri, 28 Jul 2006 02:23:17 +0200 (CEST)
Subject: [Python-checkins] r50887 - python/trunk/Modules/collectionsmodule.c
Message-ID: <20060728002317.05F3F1E4004@bag.python.org>
Author: tim.peters
Date: Fri Jul 28 02:23:15 2006
New Revision: 50887
Modified:
python/trunk/Modules/collectionsmodule.c
Log:
defdict_reduce(): Plug leaks.
We didn't notice these before because test_defaultdict didn't
actually do anything before Georg fixed that earlier today.
Neal's next refleak run then showed test_defaultdict leaking
9 references on each run. That's repaired by this checkin.
Modified: python/trunk/Modules/collectionsmodule.c
==============================================================================
--- python/trunk/Modules/collectionsmodule.c (original)
+++ python/trunk/Modules/collectionsmodule.c Fri Jul 28 02:23:15 2006
@@ -10,7 +10,7 @@
/* The block length may be set to any number over 1. Larger numbers
* reduce the number of calls to the memory allocator but take more
* memory. Ideally, BLOCKLEN should be set with an eye to the
- * length of a cache line.
+ * length of a cache line.
*/
#define BLOCKLEN 62
@@ -22,9 +22,9 @@
* element is at d.leftblock[leftindex] and its last element is at
* d.rightblock[rightindex]; note that, unlike as for Python slice
* indices, these indices are inclusive on both ends. By being inclusive
- * on both ends, algorithms for left and right operations become
+ * on both ends, algorithms for left and right operations become
* symmetrical which simplifies the design.
- *
+ *
* The list of blocks is never empty, so d.leftblock and d.rightblock
* are never equal to NULL.
*
@@ -37,11 +37,11 @@
* d.leftindex == CENTER+1; and d.rightindex == CENTER.
* Checking for d.len == 0 is the intended way to see whether d is empty.
*
- * Whenever d.leftblock == d.rightblock,
+ * Whenever d.leftblock == d.rightblock,
* d.leftindex + d.len - 1 == d.rightindex.
- *
+ *
* However, when d.leftblock != d.rightblock, d.leftindex and d.rightindex
- * become indices into distinct blocks and either may be larger than the
+ * become indices into distinct blocks and either may be larger than the
* other.
*/
@@ -381,7 +381,7 @@
int cmp = PyObject_RichCompareBool(item, value, Py_EQ);
if (deque->len != n) {
- PyErr_SetString(PyExc_IndexError,
+ PyErr_SetString(PyExc_IndexError,
"deque mutated during remove().");
return NULL;
}
@@ -920,7 +920,7 @@
"deque mutated during iteration");
return NULL;
}
- assert (!(it->b == it->deque->rightblock &&
+ assert (!(it->b == it->deque->rightblock &&
it->index > it->deque->rightindex));
item = it->b->data[it->index];
@@ -1016,7 +1016,7 @@
"deque mutated during iteration");
return NULL;
}
- assert (!(it->b == it->deque->leftblock &&
+ assert (!(it->b == it->deque->leftblock &&
it->index < it->deque->leftindex));
item = it->b->data[it->index];
@@ -1117,7 +1117,7 @@
static PyObject *
defdict_reduce(defdictobject *dd)
{
- /* __reduce__ must returns a 5-tuple as follows:
+ /* __reduce__ must return a 5-tuple as follows:
- factory function
- tuple of args for the factory function
@@ -1155,6 +1155,7 @@
}
result = PyTuple_Pack(5, dd->dict.ob_type, args,
Py_None, Py_None, items);
+ Py_DECREF(items);
Py_DECREF(args);
return result;
}
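
The leaked references were created inside ``__reduce__()``, which both pickle and
copy.deepcopy go through, so the leak hit any pickling or deep copy of a
defaultdict. A minimal sketch of that code path on Python 2.5; the output noted
in comments is approximate:

    import copy
    import pickle
    from collections import defaultdict

    dd = defaultdict(list)
    dd['spam'].append(1)

    # __reduce__ returns the 5-tuple described in the comment above; the items
    # iterator built along the way is now released by the added Py_DECREF(items).
    factory, args = dd.__reduce__()[:2]
    print factory, args        # <type 'collections.defaultdict'> (<type 'list'>,)

    print pickle.loads(pickle.dumps(dd))   # defaultdict(<type 'list'>, {'spam': [1]})
    print copy.deepcopy(dd)['spam']        # [1]
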
From python-checkins at python.org Fri Jul 28 02:30:01 2006
From: python-checkins at python.org (tim.peters)
Date: Fri, 28 Jul 2006 02:30:01 +0200 (CEST)
Subject: [Python-checkins] r50888 - python/trunk/Misc/NEWS
Message-ID: <20060728003001.66BD11E4004@bag.python.org>
Author: tim.peters
Date: Fri Jul 28 02:30:00 2006
New Revision: 50888
Modified:
python/trunk/Misc/NEWS
Log:
News about the repaired memory leak in defaultdict.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Fri Jul 28 02:30:00 2006
@@ -106,6 +106,9 @@
Extension Modules
-----------------
+- The ``__reduce__()`` method of the new ``collections.defaultdict`` had
+ a memory leak, affecting pickles and deep copies.
+
- Bug #1471938: Fix curses module build problem on Solaris 8; patch by
Paul Eggert.
From python-checkins at python.org Fri Jul 28 03:35:28 2006
From: python-checkins at python.org (gregory.p.smith)
Date: Fri, 28 Jul 2006 03:35:28 +0200 (CEST)
Subject: [Python-checkins] r50889 - in python/trunk:
Lib/bsddb/test/test_basics.py Misc/NEWS Modules/_bsddb.c
Message-ID: <20060728013528.216F51E4004@bag.python.org>
Author: gregory.p.smith
Date: Fri Jul 28 03:35:25 2006
New Revision: 50889
Modified:
python/trunk/Lib/bsddb/test/test_basics.py
python/trunk/Misc/NEWS
python/trunk/Modules/_bsddb.c
Log:
- pybsddb Bug #1527939: bsddb module DBEnv dbremove and dbrename
methods now allow their database parameter to be None as the
sleepycat API allows.
Also adds an appropriate test case for DBEnv.dbrename and dbremove.
Modified: python/trunk/Lib/bsddb/test/test_basics.py
==============================================================================
--- python/trunk/Lib/bsddb/test/test_basics.py (original)
+++ python/trunk/Lib/bsddb/test/test_basics.py Fri Jul 28 03:35:25 2006
@@ -562,6 +562,9 @@
num = d.truncate()
assert num == 0, "truncate on empty DB returned nonzero (%r)" % (num,)
+ #----------------------------------------
+
+
#----------------------------------------------------------------------
@@ -583,18 +586,40 @@
dbopenflags = db.DB_THREAD
-class BasicBTreeWithEnvTestCase(BasicTestCase):
- dbtype = db.DB_BTREE
+class BasicWithEnvTestCase(BasicTestCase):
dbopenflags = db.DB_THREAD
useEnv = 1
envflags = db.DB_THREAD | db.DB_INIT_MPOOL | db.DB_INIT_LOCK
+ #----------------------------------------
+
+ def test07_EnvRemoveAndRename(self):
+ if not self.env:
+ return
+
+ if verbose:
+ print '\n', '-=' * 30
+ print "Running %s.test07_EnvRemoveAndRename..." % self.__class__.__name__
+
+ # can't rename or remove an open DB
+ self.d.close()
+
+ newname = self.filename + '.renamed'
+ self.env.dbrename(self.filename, None, newname)
+ self.env.dbremove(newname)
+
+ # dbremove and dbrename are in 4.1 and later
+ if db.version() < (4,1):
+ del test07_EnvRemoveAndRename
+
+ #----------------------------------------
+
+class BasicBTreeWithEnvTestCase(BasicWithEnvTestCase):
+ dbtype = db.DB_BTREE
+
-class BasicHashWithEnvTestCase(BasicTestCase):
+class BasicHashWithEnvTestCase(BasicWithEnvTestCase):
dbtype = db.DB_HASH
- dbopenflags = db.DB_THREAD
- useEnv = 1
- envflags = db.DB_THREAD | db.DB_INIT_MPOOL | db.DB_INIT_LOCK
#----------------------------------------------------------------------
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Fri Jul 28 03:35:25 2006
@@ -128,6 +128,10 @@
- Because of a misspelled preprocessor symbol, ctypes was always
compiled without thread support; this is now fixed.
+- pybsddb Bug #1527939: bsddb module DBEnv dbremove and dbrename
+ methods now allow their database parameter to be None as the
+ sleepycat API allows.
+
Tests
-----
Modified: python/trunk/Modules/_bsddb.c
==============================================================================
--- python/trunk/Modules/_bsddb.c (original)
+++ python/trunk/Modules/_bsddb.c Fri Jul 28 03:35:25 2006
@@ -98,7 +98,7 @@
#error "eek! DBVER can't handle minor versions > 9"
#endif
-#define PY_BSDDB_VERSION "4.4.4"
+#define PY_BSDDB_VERSION "4.4.5"
static char *rcs_id = "$Id$";
@@ -3876,7 +3876,7 @@
static char* kwnames[] = { "file", "database", "txn", "flags",
NULL };
- if (!PyArg_ParseTupleAndKeywords(args, kwargs, "ss|Oi:dbremove", kwnames,
+ if (!PyArg_ParseTupleAndKeywords(args, kwargs, "s|zOi:dbremove", kwnames,
&file, &database, &txnobj, &flags)) {
return NULL;
}
@@ -3904,7 +3904,7 @@
static char* kwnames[] = { "file", "database", "newname", "txn",
"flags", NULL };
- if (!PyArg_ParseTupleAndKeywords(args, kwargs, "sss|Oi:dbrename", kwnames,
+ if (!PyArg_ParseTupleAndKeywords(args, kwargs, "szs|Oi:dbrename", kwnames,
&file, &database, &newname, &txnobj, &flags)) {
return NULL;
}
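
The format-string change from "ss|Oi" to "s|zOi" (and "sss|Oi" to "szs|Oi") is
what allows the database argument to be None. A minimal sketch mirroring
test07_EnvRemoveAndRename above; the 'envdir' and 'mydb.db' names are made up,
and BerkeleyDB 4.1 or later is assumed:

    from bsddb import db

    env = db.DBEnv()
    env.open('envdir', db.DB_CREATE | db.DB_INIT_MPOOL)

    d = db.DB(env)
    d.open('mydb.db', dbtype=db.DB_BTREE, flags=db.DB_CREATE)
    d.close()                  # can't rename or remove an open DB

    # database=None now means "operate on the whole file", as the Sleepycat
    # API allows; previously None was rejected with a TypeError.
    env.dbrename('mydb.db', None, 'mydb.db.renamed')
    env.dbremove('mydb.db.renamed')
    env.close()
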
From buildbot at python.org Fri Jul 28 04:28:51 2006
From: buildbot at python.org (buildbot at python.org)
Date: Fri, 28 Jul 2006 02:28:51 +0000
Subject: [Python-checkins] buildbot warnings in alpha Tru64 5.1 trunk
Message-ID: <20060728022851.5E2131E4004@bag.python.org>
The Buildbot has detected a new failure of alpha Tru64 5.1 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/alpha%2520Tru64%25205.1%2520trunk/builds/981
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: tim.peters
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Fri Jul 28 04:34:38 2006
From: python-checkins at python.org (barry.warsaw)
Date: Fri, 28 Jul 2006 04:34:38 +0200 (CEST)
Subject: [Python-checkins] r50890 -
python/branches/release24-maint/Doc/lib/emailmimebase.tex
Message-ID: <20060728023438.283921E4004@bag.python.org>
Author: barry.warsaw
Date: Fri Jul 28 04:34:35 2006
New Revision: 50890
Modified:
python/branches/release24-maint/Doc/lib/emailmimebase.tex
Log:
Delete a misleading example. SF bug #1481650.
Modified: python/branches/release24-maint/Doc/lib/emailmimebase.tex
==============================================================================
--- python/branches/release24-maint/Doc/lib/emailmimebase.tex (original)
+++ python/branches/release24-maint/Doc/lib/emailmimebase.tex Fri Jul 28 04:34:35 2006
@@ -14,12 +14,6 @@
within the \module{email} package. E.g.:
\begin{verbatim}
-import email.MIMEImage.MIMEImage
-\end{verbatim}
-
-or
-
-\begin{verbatim}
from email.MIMEText import MIMEText
\end{verbatim}
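
For reference, the import style the documentation keeps is the one shown in the
remaining verbatim block; the deleted ``import email.MIMEImage.MIMEImage`` form
does not work as written. A minimal sketch using the retained style; the message
contents are made up:

    from email.MIMEText import MIMEText

    msg = MIMEText('Hello, world!')
    msg['Subject'] = 'Greetings'
    print msg.as_string()
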
From python-checkins at python.org Fri Jul 28 04:38:45 2006
From: python-checkins at python.org (brett.cannon)
Date: Fri, 28 Jul 2006 04:38:45 +0200 (CEST)
Subject: [Python-checkins] r50891 - in python/branches/bcannon-sandboxing:
Include/Python.h Include/objimpl.h Include/pystate.h
Makefile.pre.in Modules/_sre.c Modules/datetimemodule.c
Modules/gcmodule.c Modules/parsermodule.c
Objects/bufferobject.c Objects/complexobject.c
Objects/object.c Objects/obmalloc.c Objects/stringobject.c
Objects/typeobject.c Objects/unicodeobject.c
Python/pythonrun.c Python/sysmodule.c configure configure.in
pyconfig.h.in
Message-ID: <20060728023845.0F9941E4004@bag.python.org>
Author: brett.cannon
Date: Fri Jul 28 04:38:28 2006
New Revision: 50891
Modified:
python/branches/bcannon-sandboxing/Include/Python.h
python/branches/bcannon-sandboxing/Include/objimpl.h
python/branches/bcannon-sandboxing/Include/pystate.h
python/branches/bcannon-sandboxing/Makefile.pre.in
python/branches/bcannon-sandboxing/Modules/_sre.c
python/branches/bcannon-sandboxing/Modules/datetimemodule.c
python/branches/bcannon-sandboxing/Modules/gcmodule.c
python/branches/bcannon-sandboxing/Modules/parsermodule.c
python/branches/bcannon-sandboxing/Objects/bufferobject.c
python/branches/bcannon-sandboxing/Objects/complexobject.c
python/branches/bcannon-sandboxing/Objects/object.c
python/branches/bcannon-sandboxing/Objects/obmalloc.c
python/branches/bcannon-sandboxing/Objects/stringobject.c
python/branches/bcannon-sandboxing/Objects/typeobject.c
python/branches/bcannon-sandboxing/Objects/unicodeobject.c
python/branches/bcannon-sandboxing/Python/pythonrun.c
python/branches/bcannon-sandboxing/Python/sysmodule.c
python/branches/bcannon-sandboxing/configure
python/branches/bcannon-sandboxing/configure.in
python/branches/bcannon-sandboxing/pyconfig.h.in
Log:
Redo memory cap by starting from scratch by just trying to track memory usage
for the overall Python process. Introduce PyObject_T_MALLOC() and friends that
take a const char * specifying what the memory is for. Also convert
malloc()-like uses of PyObject_MALLOC() over to PyObject_Malloc() (for now).
Still have to add in hooks for several more memory APIs used by Python, but as
of now, no extra memory is magically tracked by hitting Enter at the interpreter
prompt. Strings also seem to be tracking properly with interned strings and
new ones.
Modified: python/branches/bcannon-sandboxing/Include/Python.h
==============================================================================
--- python/branches/bcannon-sandboxing/Include/Python.h (original)
+++ python/branches/bcannon-sandboxing/Include/Python.h Fri Jul 28 04:38:28 2006
@@ -67,7 +67,7 @@
/* Debug-mode build with pymalloc implies PYMALLOC_DEBUG.
* PYMALLOC_DEBUG is in error if pymalloc is not in use.
*/
-#if defined(Py_DEBUG) && defined(WITH_PYMALLOC) && !defined(PYMALLOC_DEBUG) && !defined(Py_MEMORY_CAP)
+#if defined(Py_DEBUG) && defined(WITH_PYMALLOC) && !defined(PYMALLOC_DEBUG)
#define PYMALLOC_DEBUG
#endif
#if defined(PYMALLOC_DEBUG) && !defined(WITH_PYMALLOC)
Modified: python/branches/bcannon-sandboxing/Include/objimpl.h
==============================================================================
--- python/branches/bcannon-sandboxing/Include/objimpl.h (original)
+++ python/branches/bcannon-sandboxing/Include/objimpl.h Fri Jul 28 04:38:28 2006
@@ -94,10 +94,17 @@
the object gets initialized via PyObject_{Init, InitVar} after obtaining
the raw memory.
*/
+PyAPI_DATA(unsigned long) Py_ProcessMemUsage;
+PyAPI_FUNC(int) PyMalloc_ManagesMemory(void *);
+PyAPI_FUNC(size_t) PyMalloc_AllocatedSize(void *);
PyAPI_FUNC(void *) PyObject_Malloc(size_t);
PyAPI_FUNC(void *) PyObject_Realloc(void *, size_t);
PyAPI_FUNC(void) PyObject_Free(void *);
+PyAPI_FUNC(void *) PyObject_TrackedMalloc(const char *, size_t);
+PyAPI_FUNC(void *) PyObject_TrackedRealloc(const char *, void *, size_t);
+PyAPI_FUNC(void) PyObject_TrackedFree(const char *, void *);
+
/* Macros */
#ifdef WITH_PYMALLOC
@@ -128,15 +135,22 @@
#endif /* WITH_PYMALLOC */
-#ifdef Py_MEMORY_CAP
+#ifdef Py_TRACK_MEMORY
PyAPI_FUNC(void) _PyObject_Del(void *);
#define PyObject_Del _PyObject_Del
#define PyObject_DEL _PyObject_Del
-#else /* !Py_MEMORY_CAP */
+#define PyObject_T_MALLOC PyObject_TrackedMalloc
+#define PyObject_T_REALLOC PyObject_TrackedRealloc
+#define PyObject_T_FREE PyObject_TrackedFree
+#else /* !Py_TRACK_MEMORY */
#define PyObject_Del PyObject_Free
#define _PyObject_Del PyObject_Free
#define PyObject_DEL PyObject_FREE
-#endif /* Py_MEMORY_CAP */
+#define PyObject_T_MALLOC(what, size) PyObject_MALLOC(size)
+#define PyObject_T_REALLOC(what, who, size) PyObject_REALLOC(who, size)
+#define PyObject_T_FREE(what, who) PyObject_FREE(who)
+
+#endif /* Py_TRACK_MEMORY */
/* for source compatibility with 2.2 */
/*
@@ -187,6 +201,21 @@
) & ~(SIZEOF_VOID_P - 1) \
)
+#ifdef Py_TRACK_MEMORY
+
+#define PyObject_NEW(type, typeobj) \
+( (type *) PyObject_Init( \
+ (PyObject *) PyObject_T_MALLOC((typeobj)->tp_name, \
+ _PyObject_SIZE(typeobj) ), (typeobj)) )
+
+#define PyObject_NEW_VAR(type, typeobj, n) \
+( (type *) PyObject_InitVar( \
+ (PyVarObject *) PyObject_T_MALLOC((typeobj)->tp_name, \
+ _PyObject_VAR_SIZE((typeobj),(n)) ),\
+ (typeobj), (n)) )
+
+#else
+
#define PyObject_NEW(type, typeobj) \
( (type *) PyObject_Init( \
(PyObject *) PyObject_MALLOC( _PyObject_SIZE(typeobj) ), (typeobj)) )
@@ -195,6 +224,7 @@
( (type *) PyObject_InitVar( \
(PyVarObject *) PyObject_MALLOC(_PyObject_VAR_SIZE((typeobj),(n)) ),\
(typeobj), (n)) )
+#endif /* Py_TRACK_MEMORY */
/* This example code implements an object constructor with a custom
allocator, where PyObject_New is inlined, and shows the important
Modified: python/branches/bcannon-sandboxing/Include/pystate.h
==============================================================================
--- python/branches/bcannon-sandboxing/Include/pystate.h (original)
+++ python/branches/bcannon-sandboxing/Include/pystate.h Fri Jul 28 04:38:28 2006
@@ -32,11 +32,6 @@
#ifdef WITH_TSC
int tscdump;
#endif
-#ifdef Py_MEMORY_CAP
- PY_LONG_LONG mem_cap;
- PY_LONG_LONG mem_usage;
-#endif
-
} PyInterpreterState;
Modified: python/branches/bcannon-sandboxing/Makefile.pre.in
==============================================================================
--- python/branches/bcannon-sandboxing/Makefile.pre.in (original)
+++ python/branches/bcannon-sandboxing/Makefile.pre.in Fri Jul 28 04:38:28 2006
@@ -308,6 +308,7 @@
Objects/sliceobject.o \
Objects/stringobject.o \
Objects/structseq.o \
+ Objects/trackedmalloc.o \
Objects/tupleobject.o \
Objects/typeobject.o \
Objects/weakrefobject.o \
Modified: python/branches/bcannon-sandboxing/Modules/_sre.c
==============================================================================
--- python/branches/bcannon-sandboxing/Modules/_sre.c (original)
+++ python/branches/bcannon-sandboxing/Modules/_sre.c Fri Jul 28 04:38:28 2006
@@ -1165,7 +1165,7 @@
ctx->pattern[1], ctx->pattern[2]));
/* install new repeat context */
- ctx->u.rep = (SRE_REPEAT*) PyObject_MALLOC(sizeof(*ctx->u.rep));
+ ctx->u.rep = (SRE_REPEAT*) PyObject_T_MALLOC("sre.*", sizeof(*ctx->u.rep));
ctx->u.rep->count = -1;
ctx->u.rep->pattern = ctx->pattern;
ctx->u.rep->prev = state->repeat;
@@ -1175,7 +1175,7 @@
state->ptr = ctx->ptr;
DO_JUMP(JUMP_REPEAT, jump_repeat, ctx->pattern+ctx->pattern[0]);
state->repeat = ctx->u.rep->prev;
- PyObject_FREE(ctx->u.rep);
+ PyObject_T_FREE("sre.*", ctx->u.rep);
if (ret) {
RETURN_ON_ERROR(ret);
Modified: python/branches/bcannon-sandboxing/Modules/datetimemodule.c
==============================================================================
--- python/branches/bcannon-sandboxing/Modules/datetimemodule.c (original)
+++ python/branches/bcannon-sandboxing/Modules/datetimemodule.c Fri Jul 28 04:38:28 2006
@@ -601,7 +601,7 @@
PyObject *self;
self = (PyObject *)
- PyObject_MALLOC(aware ?
+ PyObject_T_MALLOC("datetime.time", aware ?
sizeof(PyDateTime_Time) :
sizeof(_PyDateTime_BaseTime));
if (self == NULL)
@@ -616,7 +616,7 @@
PyObject *self;
self = (PyObject *)
- PyObject_MALLOC(aware ?
+ PyObject_T_MALLOC("datetime.datetime", aware ?
sizeof(PyDateTime_DateTime) :
sizeof(_PyDateTime_BaseDateTime));
if (self == NULL)
Modified: python/branches/bcannon-sandboxing/Modules/gcmodule.c
==============================================================================
--- python/branches/bcannon-sandboxing/Modules/gcmodule.c (original)
+++ python/branches/bcannon-sandboxing/Modules/gcmodule.c Fri Jul 28 04:38:28 2006
@@ -1341,12 +1341,9 @@
_PyObject_GC_New(PyTypeObject *tp)
{
PyObject *op = NULL;
+ size_t obj_size = _PyObject_SIZE(tp);
-#ifdef Py_MEMORY_CAP
- if (!PyInterpreterState_AddObjectMem(tp))
- return NULL;
-#endif
- op = _PyObject_GC_Malloc(_PyObject_SIZE(tp));
+ op = _PyObject_GC_Malloc(obj_size);
if (op != NULL)
op = PyObject_INIT(op, tp);
return op;
@@ -1358,10 +1355,6 @@
const size_t size = _PyObject_VAR_SIZE(tp, nitems);
PyVarObject *op = NULL;
-#ifdef Py_MEMORY_CAP
- if (!PyInterpreterState_AddVarObjectMem(tp, nitems))
- return NULL;
-#endif
op = (PyVarObject *) _PyObject_GC_Malloc(size);
if (op != NULL)
op = PyObject_INIT_VAR(op, tp, nitems);
@@ -1390,9 +1383,6 @@
if (generations[0].count > 0) {
generations[0].count--;
}
-#ifdef Py_MEMORY_CAP
- PyInterpreterState_RemoveObjectMem((PyObject *)op);
-#endif
PyObject_FREE(g);
}
Modified: python/branches/bcannon-sandboxing/Modules/parsermodule.c
==============================================================================
--- python/branches/bcannon-sandboxing/Modules/parsermodule.c (original)
+++ python/branches/bcannon-sandboxing/Modules/parsermodule.c Fri Jul 28 04:38:28 2006
@@ -701,7 +701,7 @@
}
}
len = PyString_GET_SIZE(temp) + 1;
- strn = (char *)PyObject_MALLOC(len);
+ strn = (char *)PyObject_Malloc(len);
if (strn != NULL)
(void) memcpy(strn, PyString_AS_STRING(temp), len);
Py_DECREF(temp);
@@ -719,11 +719,11 @@
}
err = PyNode_AddChild(root, type, strn, *line_num, 0);
if (err == E_NOMEM) {
- PyObject_FREE(strn);
+ PyObject_T_FREE("str", strn);
return (node *) PyErr_NoMemory();
}
if (err == E_OVERFLOW) {
- PyObject_FREE(strn);
+ PyObject_Free(strn);
PyErr_SetString(PyExc_ValueError,
"unsupported number of child nodes");
return NULL;
@@ -787,7 +787,7 @@
if (res && encoding) {
Py_ssize_t len;
len = PyString_GET_SIZE(encoding) + 1;
- res->n_str = (char *)PyObject_MALLOC(len);
+ res->n_str = (char *)PyObject_Malloc(len);
if (res->n_str != NULL)
(void) memcpy(res->n_str, PyString_AS_STRING(encoding), len);
Py_DECREF(encoding);
Modified: python/branches/bcannon-sandboxing/Objects/bufferobject.c
==============================================================================
--- python/branches/bcannon-sandboxing/Objects/bufferobject.c (original)
+++ python/branches/bcannon-sandboxing/Objects/bufferobject.c Fri Jul 28 04:38:28 2006
@@ -209,12 +209,7 @@
}
/* XXX: check for overflow in multiply */
/* Inline PyObject_New */
-#ifdef Py_MEMORY_CAP
- if (!PyInterpreterState_AddRawMem("buffer",
- sizeof(*b) + size))
- return PyErr_NoMemory();
-#endif
- o = (PyObject *)PyObject_MALLOC(sizeof(*b) + size);
+ o = (PyObject *)PyObject_T_MALLOC("buffer", sizeof(*b) + size);
if ( o == NULL )
return PyErr_NoMemory();
b = (PyBufferObject *) PyObject_INIT(o, &PyBuffer_Type);
Modified: python/branches/bcannon-sandboxing/Objects/complexobject.c
==============================================================================
--- python/branches/bcannon-sandboxing/Objects/complexobject.c (original)
+++ python/branches/bcannon-sandboxing/Objects/complexobject.c Fri Jul 28 04:38:28 2006
@@ -200,11 +200,8 @@
register PyComplexObject *op;
/* Inline PyObject_New */
-#ifdef Py_MEMORY_CAP
- if (!PyInterpreterState_AddRawMem("complex number", sizeof(PyComplexObject)))
- return PyErr_NoMemory();
-#endif
- op = (PyComplexObject *) PyObject_MALLOC(sizeof(PyComplexObject));
+ op = (PyComplexObject *) PyObject_T_MALLOC("complex",
+ sizeof(PyComplexObject));
if (op == NULL)
return PyErr_NoMemory();
PyObject_INIT(op, &PyComplex_Type);
Modified: python/branches/bcannon-sandboxing/Objects/object.c
==============================================================================
--- python/branches/bcannon-sandboxing/Objects/object.c (original)
+++ python/branches/bcannon-sandboxing/Objects/object.c Fri Jul 28 04:38:28 2006
@@ -236,12 +236,7 @@
PyObject *op;
size_t tp_size = _PyObject_SIZE(tp);
-#ifdef Py_MEMORY_CAP
- if (!PyInterpreterState_AddObjectMem(tp))
- return PyErr_NoMemory();
-#endif
-
- op = (PyObject *) PyObject_MALLOC(tp_size);
+ op = (PyObject *) PyObject_T_MALLOC(tp->tp_name, tp_size);
if (op == NULL)
return PyErr_NoMemory();
return PyObject_INIT(op, tp);
@@ -253,12 +248,7 @@
PyVarObject *op;
const size_t size = _PyObject_VAR_SIZE(tp, nitems);
-#ifdef Py_MEMORY_CAP
- if (!PyInterpreterState_AddVarObjectMem(tp, nitems))
- return (PyVarObject *)PyErr_NoMemory();
-#endif
-
- op = (PyVarObject *) PyObject_MALLOC(size);
+ op = (PyVarObject *) PyObject_T_MALLOC(tp->tp_name, size);
if (op == NULL)
return (PyVarObject *)PyErr_NoMemory();
return PyObject_INIT_VAR(op, tp, nitems);
@@ -266,13 +256,15 @@
/* for binary compatibility with 2.2. */
#undef _PyObject_Del
+/*
+ Assume that argument is PyObject *!!!
+*/
void
_PyObject_Del(void *op)
{
-#ifdef Py_MEMORY_CAP
- PyInterpreterState_RemoveObjectMem((PyObject *)op);
-#endif
- PyObject_FREE(op);
+ PyObject *obj_ptr = (PyObject *)op;
+
+ PyObject_T_FREE(obj_ptr->ob_type->tp_name, op);
}
/* Implementation of PyObject_Print with recursion checking */
Modified: python/branches/bcannon-sandboxing/Objects/obmalloc.c
==============================================================================
--- python/branches/bcannon-sandboxing/Objects/obmalloc.c (original)
+++ python/branches/bcannon-sandboxing/Objects/obmalloc.c Fri Jul 28 04:38:28 2006
@@ -702,6 +702,35 @@
#undef Py_NO_INLINE
#endif
+/*
+  Return 1 if the pointer was allocated by pymalloc, else return 0.
+*/
+int
+PyMalloc_ManagesMemory(void *ptr)
+{
+ poolp pool = NULL;
+
+ if (!ptr)
+ return 0;
+
+ pool = POOL_ADDR(ptr);
+
+ if (Py_ADDRESS_IN_RANGE(ptr, pool))
+ return 1;
+ else
+ return 0;
+}
+
+size_t
+PyMalloc_AllocatedSize(void *ptr)
+{
+ poolp pool = POOL_ADDR(ptr);
+
+ return INDEX2SIZE(pool->szidx);
+}
+
+
/*==========================================================================*/
/* malloc. Note that nbytes==0 tries to return a non-NULL pointer, distinct
Modified: python/branches/bcannon-sandboxing/Objects/stringobject.c
==============================================================================
--- python/branches/bcannon-sandboxing/Objects/stringobject.c (original)
+++ python/branches/bcannon-sandboxing/Objects/stringobject.c Fri Jul 28 04:38:28 2006
@@ -72,11 +72,7 @@
}
/* Inline PyObject_NewVar */
-#ifdef Py_MEMORY_CAP
- if (!PyInterpreterState_AddRawMem("str", sizeof(PyStringObject) + size))
- return PyErr_NoMemory();
-#endif
- op = (PyStringObject *)PyObject_MALLOC(sizeof(PyStringObject) + size);
+ op = (PyStringObject *)PyObject_T_MALLOC("str", sizeof(PyStringObject) + size);
if (op == NULL)
return PyErr_NoMemory();
PyObject_INIT_VAR(op, &PyString_Type, size);
@@ -131,11 +127,7 @@
}
/* Inline PyObject_NewVar */
-#ifdef Py_MEMORY_CAP
- if (!PyInterpreterState_AddRawMem("str", sizeof(PyStringObject) + size))
- return PyErr_NoMemory();
-#endif
- op = (PyStringObject *)PyObject_MALLOC(sizeof(PyStringObject) + size);
+ op = (PyStringObject *)PyObject_T_MALLOC("str", sizeof(PyStringObject) + size);
if (op == NULL)
return PyErr_NoMemory();
PyObject_INIT_VAR(op, &PyString_Type, size);
@@ -968,11 +960,7 @@
}
/* Inline PyObject_NewVar */
-#ifdef Py_MEMORY_CAP
- if (!PyInterpreterState_AddRawMem("str", sizeof(PyStringObject) + size))
- return PyErr_NoMemory();
-#endif
- op = (PyStringObject *)PyObject_MALLOC(sizeof(PyStringObject) + size);
+ op = (PyStringObject *)PyObject_T_MALLOC("str", sizeof(PyStringObject) + size);
if (op == NULL)
return PyErr_NoMemory();
PyObject_INIT_VAR(op, &PyString_Type, size);
@@ -1014,12 +1002,8 @@
"repeated string is too long");
return NULL;
}
-#ifdef Py_MEMORY_CAP
- if (!PyInterpreterState_AddRawMem("str", sizeof(PyStringObject) + nbytes))
- return PyErr_NoMemory();
-#endif
op = (PyStringObject *)
- PyObject_MALLOC(sizeof(PyStringObject) + nbytes);
+ PyObject_T_MALLOC("str", sizeof(PyStringObject) + nbytes);
if (op == NULL)
return PyErr_NoMemory();
PyObject_INIT_VAR(op, &PyString_Type, size);
@@ -4123,7 +4107,8 @@
_Py_DEC_REFTOTAL;
_Py_ForgetReference(v);
*pv = (PyObject *)
- PyObject_REALLOC((char *)v, sizeof(PyStringObject) + newsize);
+ PyObject_T_REALLOC("str", (char *)v,
+ sizeof(PyStringObject) + newsize);
if (*pv == NULL) {
PyObject_Del(v);
PyErr_NoMemory();
Modified: python/branches/bcannon-sandboxing/Objects/typeobject.c
==============================================================================
--- python/branches/bcannon-sandboxing/Objects/typeobject.c (original)
+++ python/branches/bcannon-sandboxing/Objects/typeobject.c Fri Jul 28 04:38:28 2006
@@ -450,14 +450,10 @@
const size_t size = _PyObject_VAR_SIZE(type, nitems+1);
/* note that we need to add one, for the sentinel */
-#ifdef Py_MEMORY_CAP
- if (!PyInterpreterState_AddVarObjectMem(type, nitems))
- return PyErr_NoMemory();
-#endif
if (PyType_IS_GC(type))
obj = _PyObject_GC_Malloc(size);
else
- obj = (PyObject *)PyObject_MALLOC(size);
+ obj = (PyObject *)PyObject_Malloc(size);
if (obj == NULL)
return PyErr_NoMemory();
@@ -1896,13 +1892,7 @@
PyObject *doc = PyDict_GetItemString(dict, "__doc__");
if (doc != NULL && PyString_Check(doc)) {
const size_t n = (size_t)PyString_GET_SIZE(doc);
-#ifdef Py_MEMORY_CAP
- if (!PyInterpreterState_AddRawMem("str", n)) {
- Py_DECREF(type);
- return NULL;
- }
-#endif
- char *tp_doc = (char *)PyObject_MALLOC(n+1);
+ char *tp_doc = (char *)PyObject_Malloc(n+1);
if (tp_doc == NULL) {
Py_DECREF(type);
return NULL;
@@ -2155,7 +2145,7 @@
/* A type's tp_doc is heap allocated, unlike the tp_doc slots
* of most other objects. It's okay to cast it to char *.
*/
- PyObject_FREE((char *)type->tp_doc);
+ PyObject_Free((char *)type->tp_doc);
Py_XDECREF(et->ht_name);
Py_XDECREF(et->ht_slots);
type->ob_type->tp_free((PyObject *)type);
Modified: python/branches/bcannon-sandboxing/Objects/unicodeobject.c
==============================================================================
--- python/branches/bcannon-sandboxing/Objects/unicodeobject.c (original)
+++ python/branches/bcannon-sandboxing/Objects/unicodeobject.c Fri Jul 28 04:38:28 2006
@@ -3219,7 +3219,7 @@
static void
encoding_map_dealloc(PyObject* o)
{
- PyObject_FREE(o);
+ PyObject_T_FREE("", o);
}
static PyTypeObject EncodingMapType = {
@@ -3342,7 +3342,7 @@
}
/* Create a three-level trie */
- result = PyObject_MALLOC(sizeof(struct encoding_map) +
+ result = PyObject_T_MALLOC("", sizeof(struct encoding_map) +
16*count2 + 128*count3 - 1);
if (!result)
return PyErr_NoMemory();
Modified: python/branches/bcannon-sandboxing/Python/pythonrun.c
==============================================================================
--- python/branches/bcannon-sandboxing/Python/pythonrun.c (original)
+++ python/branches/bcannon-sandboxing/Python/pythonrun.c Fri Jul 28 04:38:28 2006
@@ -35,10 +35,20 @@
#define PRINT_TOTAL_REFS()
#else /* Py_REF_DEBUG */
#define PRINT_TOTAL_REFS() fprintf(stderr, \
- "[%" PY_FORMAT_SIZE_T "d refs, %lld memory]\n", \
- _Py_GetRefTotal(), PyThreadState_Get()->interp->mem_usage)
+ "[%" PY_FORMAT_SIZE_T "d refs]\n", \
+ _Py_GetRefTotal())
#endif
+#ifdef Py_TRACK_MEMORY
+#define PRINT_TOTAL_MEM() fprintf(stderr, \
+ "[%lu bytes used]\n", \
+ Py_ProcessMemUsage)
+#else
+#define PRINT_TOTAL_MEM()
+#endif /* Py_TRACK_MEMORY */
+
+#define PRINT_STATE_DATA() PRINT_TOTAL_REFS(); PRINT_TOTAL_MEM()
+
#ifdef __cplusplus
extern "C" {
#endif
@@ -413,7 +423,7 @@
dump_counts(stdout);
#endif
- PRINT_TOTAL_REFS();
+ PRINT_STATE_DATA();
#ifdef Py_TRACE_REFS
/* Display all objects still alive -- this can invoke arbitrary
@@ -703,7 +713,7 @@
}
for (;;) {
ret = PyRun_InteractiveOneFlags(fp, filename, flags);
- PRINT_TOTAL_REFS();
+ PRINT_STATE_DATA();
if (ret == E_EOF)
return 0;
/*
@@ -1483,7 +1493,7 @@
v = Py_BuildValue("(ziiz)", err->filename,
err->lineno, err->offset, err->text);
if (err->text != NULL) {
- PyObject_FREE(err->text);
+ PyObject_Free(err->text);
err->text = NULL;
}
w = NULL;
Modified: python/branches/bcannon-sandboxing/Python/sysmodule.c
==============================================================================
--- python/branches/bcannon-sandboxing/Python/sysmodule.c (original)
+++ python/branches/bcannon-sandboxing/Python/sysmodule.c Fri Jul 28 04:38:28 2006
@@ -700,64 +700,6 @@
10. Number of stack pops performed by call_function()"
);
-#ifdef Py_MEMORY_CAP
-static PyObject *
-sys_setmemorycap(PyObject *self, PyObject *arg)
-{
- PyInterpreterState *interp = PyInterpreterState_SafeGet();
- PY_LONG_LONG new_memory_cap;
- PyObject *arg_as_long = PyNumber_Long(arg);
-
- if (!arg_as_long)
- return NULL;
-
- new_memory_cap = PyLong_AsLongLong(arg_as_long);
- Py_DECREF(arg_as_long); /* DEAD: arg_as_long */
-
- if (!interp)
- Py_FatalError("interpreter not available");
-
- if (!PyInterpreterState_SetMemoryCap(interp, new_memory_cap))
- return NULL;
-
- Py_RETURN_NONE;
-}
-
-PyDoc_STRVAR(setmemorycap_doc,
-"XXX"
-);
-
-static PyObject *
-sys_getmemorycap(PyObject *self, PyObject *ignore)
-{
- PyInterpreterState *interp = PyInterpreterState_SafeGet();
-
- if (!interp)
- Py_FatalError("interpreter not available");
-
- return PyLong_FromLongLong(interp->mem_cap);
-}
-
-PyDoc_STRVAR(getmemorycap_doc,
-"XXX"
-);
-
-static PyObject *
-sys_getmemoryused(PyObject *self, PyObject *ignore)
-{
- PyInterpreterState *interp = PyInterpreterState_SafeGet();
-
- if (!interp)
- Py_FatalError("interpreter not available");
-
- return PyLong_FromLongLong(interp->mem_usage);
-}
-
-PyDoc_STRVAR(getmemoryused_doc,
-"XXX"
-);
-#endif /* Py_MEMORY_CAP */
-
#ifdef __cplusplus
extern "C" {
#endif
@@ -840,11 +782,6 @@
#endif
{"settrace", sys_settrace, METH_O, settrace_doc},
{"call_tracing", sys_call_tracing, METH_VARARGS, call_tracing_doc},
-#ifdef Py_MEMORY_CAP
- {"setmemorycap", sys_setmemorycap, METH_O, setmemorycap_doc},
- {"getmemorycap", sys_getmemorycap, METH_NOARGS, getmemorycap_doc},
- {"getmemoryused", sys_getmemoryused, METH_NOARGS, getmemoryused_doc},
-#endif
{NULL, NULL} /* sentinel */
};
Modified: python/branches/bcannon-sandboxing/configure
==============================================================================
--- python/branches/bcannon-sandboxing/configure (original)
+++ python/branches/bcannon-sandboxing/configure Fri Jul 28 04:38:28 2006
@@ -1,5 +1,5 @@
#! /bin/sh
-# From configure.in Revision: 50540 .
+# From configure.in Revision: 50730 .
# Guess values for system-dependent variables and create Makefiles.
# Generated by GNU Autoconf 2.59 for python 2.5.
#
@@ -866,7 +866,7 @@
compiler
--with-suffix=.exe set executable suffix
--with-pydebug build with Py_DEBUG defined
- --with-memory-cap build with Py_MEMORY_CAP defined
+ --with-memory-tracking build with Py_TRACK_MEMORY defined
--with-libs='lib1 ...' link against additional libs
--with-system-ffi build _ctypes module using an installed ffi library
--with-signal-module disable/enable signal module
@@ -3765,26 +3765,26 @@
echo "${ECHO_T}no" >&6
fi;
-# Check for --with-memory-cap
-echo "$as_me:$LINENO: checking for --with-memory-cap" >&5
-echo $ECHO_N "checking for --with-memory-cap... $ECHO_C" >&6
-
-# Check whether --with-memory-cap or --without-memory-cap was given.
-if test "${with_memory_cap+set}" = set; then
- withval="$with_memory_cap"
+# Check for --with-memory-tracking
+echo "$as_me:$LINENO: checking for --with-memory-tracking" >&5
+echo $ECHO_N "checking for --with-memory-tracking... $ECHO_C" >&6
+
+# Check whether --with-memory-tracking or --without-memory-tracking was given.
+if test "${with_memory_tracking+set}" = set; then
+ withval="$with_memory_tracking"
if test "$withval" != no
then
cat >>confdefs.h <<\_ACEOF
-#define Py_MEMORY_CAP 1
+#define Py_TRACK_MEMORY 1
_ACEOF
echo "$as_me:$LINENO: result: yes" >&5
echo "${ECHO_T}yes" >&6;
- Py_MEMORY_CAP='true'
+ Py_TRACK_MEMORY='true'
else echo "$as_me:$LINENO: result: no" >&5
-echo "${ECHO_T}no" >&6; Py_MEMORY_CAP='false'
+echo "${ECHO_T}no" >&6; Py_TRACK_MEMORY='false'
fi
else
echo "$as_me:$LINENO: result: no" >&5
@@ -20455,6 +20455,70 @@
fi
+echo "$as_me:$LINENO: checking for good malloc_usable_size()" >&5
+echo $ECHO_N "checking for good malloc_usable_size()... $ECHO_C" >&6
+if test "${ac_cv_good_malloc_usable_size+set}" = set; then
+ echo $ECHO_N "(cached) $ECHO_C" >&6
+else
+
+if test "$cross_compiling" = yes; then
+ ac_cv_good_malloc_usable_size=no
+else
+ cat >conftest.$ac_ext <<_ACEOF
+/* confdefs.h. */
+_ACEOF
+cat confdefs.h >>conftest.$ac_ext
+cat >>conftest.$ac_ext <<_ACEOF
+/* end confdefs.h. */
+
+#include
+#include
+int
+main()
+{
+ void *p = malloc(8);
+ if (malloc_usable_size(p) <= 16)
+ exit(0);
+ exit(1);
+}
+
+_ACEOF
+rm -f conftest$ac_exeext
+if { (eval echo "$as_me:$LINENO: \"$ac_link\"") >&5
+ (eval $ac_link) 2>&5
+ ac_status=$?
+ echo "$as_me:$LINENO: \$? = $ac_status" >&5
+ (exit $ac_status); } && { ac_try='./conftest$ac_exeext'
+ { (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
+ (eval $ac_try) 2>&5
+ ac_status=$?
+ echo "$as_me:$LINENO: \$? = $ac_status" >&5
+ (exit $ac_status); }; }; then
+ ac_cv_good_malloc_usable_size=yes
+else
+ echo "$as_me: program exited with status $ac_status" >&5
+echo "$as_me: failed program was:" >&5
+sed 's/^/| /' conftest.$ac_ext >&5
+
+( exit $ac_status )
+ac_cv_good_malloc_usable_size=no
+fi
+rm -f core *.core gmon.out bb.out conftest$ac_exeext conftest.$ac_objext conftest.$ac_ext
+fi
+fi
+
+echo "$as_me:$LINENO: result: $ac_cv_good_malloc_usable_size" >&5
+echo "${ECHO_T}$ac_cv_good_malloc_usable_size" >&6
+if test "$ac_cv_good_malloc_usable_size" = yes
+then
+
+cat >>confdefs.h <<\_ACEOF
+#define HAVE_MALLOC_USABLE_SIZE 1
+_ACEOF
+
+fi
+
+
# check where readline lives
# save the value of LIBS so we don't actually link Python with readline
LIBS_no_readline=$LIBS
Modified: python/branches/bcannon-sandboxing/configure.in
==============================================================================
--- python/branches/bcannon-sandboxing/configure.in (original)
+++ python/branches/bcannon-sandboxing/configure.in Fri Jul 28 04:38:28 2006
@@ -725,18 +725,18 @@
fi],
[AC_MSG_RESULT(no)])
-# Check for --with-memory-cap
-AC_MSG_CHECKING(for --with-memory-cap)
-AC_ARG_WITH(memory-cap,
- AC_HELP_STRING(--with-memory-cap, build with Py_MEMORY_CAP defined),
+# Check for --with-memory-tracking
+AC_MSG_CHECKING(for --with-memory-tracking)
+AC_ARG_WITH(memory-tracking,
+ AC_HELP_STRING(--with-memory-tracking, build with Py_TRACK_MEMORY defined),
[
if test "$withval" != no
then
- AC_DEFINE(Py_MEMORY_CAP, 1,
- [Define if you want to build an interpreter that can cap memory usage.])
+ AC_DEFINE(Py_TRACK_MEMORY, 1,
+ [Define if you want to build an interpreter that tracks memory usage.])
AC_MSG_RESULT(yes);
- Py_MEMORY_CAP='true'
-else AC_MSG_RESULT(no); Py_MEMORY_CAP='false'
+ Py_TRACK_MEMORY='true'
+else AC_MSG_RESULT(no); Py_TRACK_MEMORY='false'
fi],
[AC_MSG_RESULT(no)])
@@ -3055,6 +3055,31 @@
[Define this if you have flockfile(), getc_unlocked(), and funlockfile()])
fi
+AC_MSG_CHECKING(for good malloc_usable_size())
+AC_CACHE_VAL(ac_cv_good_malloc_usable_size, [
+AC_TRY_RUN([
+#include
+#include
+int
+main()
+{
+ void *p = malloc(8);
+ if (malloc_usable_size(p) <= 16)
+ exit(0);
+ exit(1);
+}
+],
+ac_cv_good_malloc_usable_size=yes,
+ac_cv_good_malloc_usable_size=no,
+ac_cv_good_malloc_usable_size=no)])
+AC_MSG_RESULT($ac_cv_good_malloc_usable_size)
+if test "$ac_cv_good_malloc_usable_size" = yes
+then
+ AC_DEFINE(HAVE_MALLOC_USABLE_SIZE, 1,
+ [Define if malloc_usable_size() exists and works.])
+fi
+
+
# check where readline lives
# save the value of LIBS so we don't actually link Python with readline
LIBS_no_readline=$LIBS
Modified: python/branches/bcannon-sandboxing/pyconfig.h.in
==============================================================================
--- python/branches/bcannon-sandboxing/pyconfig.h.in (original)
+++ python/branches/bcannon-sandboxing/pyconfig.h.in Fri Jul 28 04:38:28 2006
@@ -326,6 +326,9 @@
/* Define this if you have the makedev macro. */
#undef HAVE_MAKEDEV
+/* Define if malloc_usable_size() exists and works. */
+#undef HAVE_MALLOC_USABLE_SIZE
+
/* Define to 1 if you have the `memmove' function. */
#undef HAVE_MEMMOVE
@@ -772,8 +775,8 @@
/* Defined if Python is built as a shared library. */
#undef Py_ENABLE_SHARED
-/* Define if you want to build an interpreter that can cap memory usage. */
-#undef Py_MEMORY_CAP
+/* Define if you want to build an interpreter that tracks memory usage. */
+#undef Py_TRACK_MEMORY
/* Define as the size of the unicode type. */
#undef Py_UNICODE_SIZE
From python-checkins at python.org Fri Jul 28 04:40:10 2006
From: python-checkins at python.org (brett.cannon)
Date: Fri, 28 Jul 2006 04:40:10 +0200 (CEST)
Subject: [Python-checkins] r50892 -
python/branches/bcannon-sandboxing/configure
python/branches/bcannon-sandboxing/configure.in
python/branches/bcannon-sandboxing/pyconfig.h.in
Message-ID: <20060728024010.2CFC31E4004@bag.python.org>
Author: brett.cannon
Date: Fri Jul 28 04:40:07 2006
New Revision: 50892
Modified:
python/branches/bcannon-sandboxing/configure
python/branches/bcannon-sandboxing/configure.in
python/branches/bcannon-sandboxing/pyconfig.h.in
Log:
Remove malloc_usable_size() check. The function didn't work properly.
Modified: python/branches/bcannon-sandboxing/configure
==============================================================================
--- python/branches/bcannon-sandboxing/configure (original)
+++ python/branches/bcannon-sandboxing/configure Fri Jul 28 04:40:07 2006
@@ -1,5 +1,5 @@
#! /bin/sh
-# From configure.in Revision: 50730 .
+# From configure.in Revision: 50891 .
# Guess values for system-dependent variables and create Makefiles.
# Generated by GNU Autoconf 2.59 for python 2.5.
#
@@ -20455,70 +20455,6 @@
fi
-echo "$as_me:$LINENO: checking for good malloc_usable_size()" >&5
-echo $ECHO_N "checking for good malloc_usable_size()... $ECHO_C" >&6
-if test "${ac_cv_good_malloc_usable_size+set}" = set; then
- echo $ECHO_N "(cached) $ECHO_C" >&6
-else
-
-if test "$cross_compiling" = yes; then
- ac_cv_good_malloc_usable_size=no
-else
- cat >conftest.$ac_ext <<_ACEOF
-/* confdefs.h. */
-_ACEOF
-cat confdefs.h >>conftest.$ac_ext
-cat >>conftest.$ac_ext <<_ACEOF
-/* end confdefs.h. */
-
-#include
-#include
-int
-main()
-{
- void *p = malloc(8);
- if (malloc_usable_size(p) <= 16)
- exit(0);
- exit(1);
-}
-
-_ACEOF
-rm -f conftest$ac_exeext
-if { (eval echo "$as_me:$LINENO: \"$ac_link\"") >&5
- (eval $ac_link) 2>&5
- ac_status=$?
- echo "$as_me:$LINENO: \$? = $ac_status" >&5
- (exit $ac_status); } && { ac_try='./conftest$ac_exeext'
- { (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
- (eval $ac_try) 2>&5
- ac_status=$?
- echo "$as_me:$LINENO: \$? = $ac_status" >&5
- (exit $ac_status); }; }; then
- ac_cv_good_malloc_usable_size=yes
-else
- echo "$as_me: program exited with status $ac_status" >&5
-echo "$as_me: failed program was:" >&5
-sed 's/^/| /' conftest.$ac_ext >&5
-
-( exit $ac_status )
-ac_cv_good_malloc_usable_size=no
-fi
-rm -f core *.core gmon.out bb.out conftest$ac_exeext conftest.$ac_objext conftest.$ac_ext
-fi
-fi
-
-echo "$as_me:$LINENO: result: $ac_cv_good_malloc_usable_size" >&5
-echo "${ECHO_T}$ac_cv_good_malloc_usable_size" >&6
-if test "$ac_cv_good_malloc_usable_size" = yes
-then
-
-cat >>confdefs.h <<\_ACEOF
-#define HAVE_MALLOC_USABLE_SIZE 1
-_ACEOF
-
-fi
-
-
# check where readline lives
# save the value of LIBS so we don't actually link Python with readline
LIBS_no_readline=$LIBS
Modified: python/branches/bcannon-sandboxing/configure.in
==============================================================================
--- python/branches/bcannon-sandboxing/configure.in (original)
+++ python/branches/bcannon-sandboxing/configure.in Fri Jul 28 04:40:07 2006
@@ -3055,31 +3055,6 @@
[Define this if you have flockfile(), getc_unlocked(), and funlockfile()])
fi
-AC_MSG_CHECKING(for good malloc_usable_size())
-AC_CACHE_VAL(ac_cv_good_malloc_usable_size, [
-AC_TRY_RUN([
-#include
-#include
-int
-main()
-{
- void *p = malloc(8);
- if (malloc_usable_size(p) <= 16)
- exit(0);
- exit(1);
-}
-],
-ac_cv_good_malloc_usable_size=yes,
-ac_cv_good_malloc_usable_size=no,
-ac_cv_good_malloc_usable_size=no)])
-AC_MSG_RESULT($ac_cv_good_malloc_usable_size)
-if test "$ac_cv_good_malloc_usable_size" = yes
-then
- AC_DEFINE(HAVE_MALLOC_USABLE_SIZE, 1,
- [Define if malloc_usable_size() exists and works.])
-fi
-
-
# check where readline lives
# save the value of LIBS so we don't actually link Python with readline
LIBS_no_readline=$LIBS
Modified: python/branches/bcannon-sandboxing/pyconfig.h.in
==============================================================================
--- python/branches/bcannon-sandboxing/pyconfig.h.in (original)
+++ python/branches/bcannon-sandboxing/pyconfig.h.in Fri Jul 28 04:40:07 2006
@@ -326,9 +326,6 @@
/* Define this if you have the makedev macro. */
#undef HAVE_MAKEDEV
-/* Define if malloc_usable_size() exists and works. */
-#undef HAVE_MALLOC_USABLE_SIZE
-
/* Define to 1 if you have the `memmove' function. */
#undef HAVE_MEMMOVE
From python-checkins at python.org Fri Jul 28 05:16:58 2006
From: python-checkins at python.org (gregory.p.smith)
Date: Fri, 28 Jul 2006 05:16:58 +0200 (CEST)
Subject: [Python-checkins] r50893 - in python/branches/release24-maint:
Lib/bsddb/test/test_basics.py Misc/NEWS Modules/_bsddb.c
Message-ID: <20060728031658.B615E1E4004@bag.python.org>
Author: gregory.p.smith
Date: Fri Jul 28 05:16:53 2006
New Revision: 50893
Modified:
python/branches/release24-maint/Lib/bsddb/test/test_basics.py
python/branches/release24-maint/Misc/NEWS
python/branches/release24-maint/Modules/_bsddb.c
Log:
- pybsddb Bug #1527939: bsddb module DBEnv dbremove and dbrename
methods now allow their database parameter to be None as the
sleepycat API allows.
also adds a testcase.
backport of trunk commit 50889 to 2.4.
Modified: python/branches/release24-maint/Lib/bsddb/test/test_basics.py
==============================================================================
--- python/branches/release24-maint/Lib/bsddb/test/test_basics.py (original)
+++ python/branches/release24-maint/Lib/bsddb/test/test_basics.py Fri Jul 28 05:16:53 2006
@@ -555,6 +555,9 @@
num = d.truncate()
assert num == 0, "truncate on empty DB returned nonzero (%r)" % (num,)
+ #----------------------------------------
+
+
#----------------------------------------------------------------------
@@ -576,18 +579,40 @@
dbopenflags = db.DB_THREAD
-class BasicBTreeWithEnvTestCase(BasicTestCase):
- dbtype = db.DB_BTREE
+class BasicWithEnvTestCase(BasicTestCase):
dbopenflags = db.DB_THREAD
useEnv = 1
envflags = db.DB_THREAD | db.DB_INIT_MPOOL | db.DB_INIT_LOCK
+ #----------------------------------------
+
+ def test07_EnvRemoveAndRename(self):
+ if not self.env:
+ return
+
+ if verbose:
+ print '\n', '-=' * 30
+ print "Running %s.test07_EnvRemoveAndRename..." % self.__class__.__name__
+
+ # can't rename or remove an open DB
+ self.d.close()
+
+ newname = self.filename + '.renamed'
+ self.env.dbrename(self.filename, None, newname)
+ self.env.dbremove(newname)
+
+ # dbremove and dbrename are in 4.1 and later
+ if db.version() < (4,1):
+ del test07_EnvRemoveAndRename
+
+ #----------------------------------------
+
+class BasicBTreeWithEnvTestCase(BasicWithEnvTestCase):
+ dbtype = db.DB_BTREE
+
-class BasicHashWithEnvTestCase(BasicTestCase):
+class BasicHashWithEnvTestCase(BasicWithEnvTestCase):
dbtype = db.DB_HASH
- dbopenflags = db.DB_THREAD
- useEnv = 1
- envflags = db.DB_THREAD | db.DB_INIT_MPOOL | db.DB_INIT_LOCK
#----------------------------------------------------------------------
Modified: python/branches/release24-maint/Misc/NEWS
==============================================================================
--- python/branches/release24-maint/Misc/NEWS (original)
+++ python/branches/release24-maint/Misc/NEWS Fri Jul 28 05:16:53 2006
@@ -60,6 +60,9 @@
return correct results. It could previously incorrectly return 0 in some
cases. Fixes SF bug 1493322 (pybsddb bug 1184012).
+- pybsddb Bug #1527939: bsddb module DBEnv dbremove and dbrename
+ methods now allow their database parameter to be None as the
+ sleepycat API allows.
Library
-------
Modified: python/branches/release24-maint/Modules/_bsddb.c
==============================================================================
--- python/branches/release24-maint/Modules/_bsddb.c (original)
+++ python/branches/release24-maint/Modules/_bsddb.c Fri Jul 28 05:16:53 2006
@@ -97,7 +97,7 @@
#error "eek! DBVER can't handle minor versions > 9"
#endif
-#define PY_BSDDB_VERSION "4.3.0.2"
+#define PY_BSDDB_VERSION "4.3.0.3"
static char *rcs_id = "$Id$";
@@ -3587,7 +3587,7 @@
DB_TXN *txn = NULL;
char* kwnames[] = { "file", "database", "txn", "flags", NULL };
- if (!PyArg_ParseTupleAndKeywords(args, kwargs, "ss|Oi:dbremove", kwnames,
+ if (!PyArg_ParseTupleAndKeywords(args, kwargs, "s|zOi:dbremove", kwnames,
&file, &database, &txnobj, &flags)) {
return NULL;
}
@@ -3614,7 +3614,7 @@
DB_TXN *txn = NULL;
char* kwnames[] = { "file", "database", "newname", "txn", "flags", NULL };
- if (!PyArg_ParseTupleAndKeywords(args, kwargs, "sss|Oi:dbrename", kwnames,
+ if (!PyArg_ParseTupleAndKeywords(args, kwargs, "szs|Oi:dbrename", kwnames,
&file, &database, &newname, &txnobj, &flags)) {
return NULL;
}
From python-checkins at python.org Fri Jul 28 05:19:01 2006
From: python-checkins at python.org (barry.warsaw)
Date: Fri, 28 Jul 2006 05:19:01 +0200 (CEST)
Subject: [Python-checkins] r50894 - in
python/branches/release24-maint/Lib/email: Message.py
test/test_email.py
Message-ID: <20060728031901.316C51E4004@bag.python.org>
Author: barry.warsaw
Date: Fri Jul 28 05:18:56 2006
New Revision: 50894
Modified:
python/branches/release24-maint/Lib/email/Message.py
python/branches/release24-maint/Lib/email/test/test_email.py
Log:
Backport r50840 to Python 2.4:
Forward port some fixes that were in email 2.5 but for some reason didn't
make it into email 4.0. Specifically, in Message.get_content_charset(),
handle RFC 2231 headers that contain an encoding not known to Python, or a
character in the data that isn't in the charset encoding. Also forward
port the appropriate unit tests.
Also, this resolves SF bug #1414018.
Modified: python/branches/release24-maint/Lib/email/Message.py
==============================================================================
--- python/branches/release24-maint/Lib/email/Message.py (original)
+++ python/branches/release24-maint/Lib/email/Message.py Fri Jul 28 05:18:56 2006
@@ -788,7 +788,18 @@
if isinstance(charset, tuple):
# RFC 2231 encoded, so decode it, and it better end up as ascii.
pcharset = charset[0] or 'us-ascii'
- charset = unicode(charset[2], pcharset).encode('us-ascii')
+ try:
+ # LookupError will be raised if the charset isn't known to
+ # Python. UnicodeError will be raised if the encoded text
+ # contains a character not in the charset.
+ charset = unicode(charset[2], pcharset).encode('us-ascii')
+ except (LookupError, UnicodeError):
+ charset = charset[2]
+ # charset character must be in us-ascii range
+ try:
+ charset = unicode(charset, 'us-ascii').encode('us-ascii')
+ except UnicodeError:
+ return failobj
# RFC 2046, $4.1.2 says charsets are not case sensitive
return charset.lower()
Modified: python/branches/release24-maint/Lib/email/test/test_email.py
==============================================================================
--- python/branches/release24-maint/Lib/email/test/test_email.py (original)
+++ python/branches/release24-maint/Lib/email/test/test_email.py Fri Jul 28 05:18:56 2006
@@ -3078,6 +3078,50 @@
self.assertEqual(msg.get_content_charset(),
'this is even more ***fun*** is it not.pdf')
+ def test_rfc2231_bad_encoding_in_filename(self):
+ m = '''\
+Content-Disposition: inline;
+\tfilename*0*="bogus'xx'This%20is%20even%20more%20";
+\tfilename*1*="%2A%2A%2Afun%2A%2A%2A%20";
+\tfilename*2="is it not.pdf"
+
+'''
+ msg = email.message_from_string(m)
+ self.assertEqual(msg.get_filename(),
+ 'This is even more ***fun*** is it not.pdf')
+
+ def test_rfc2231_bad_encoding_in_charset(self):
+ m = """\
+Content-Type: text/plain; charset*=bogus''utf-8%E2%80%9D
+
+"""
+ msg = email.message_from_string(m)
+ # This should return None because non-ascii characters in the charset
+ # are not allowed.
+ self.assertEqual(msg.get_content_charset(), None)
+
+ def test_rfc2231_bad_character_in_charset(self):
+ m = """\
+Content-Type: text/plain; charset*=ascii''utf-8%E2%80%9D
+
+"""
+ msg = email.message_from_string(m)
+ # This should return None because non-ascii characters in the charset
+ # are not allowed.
+ self.assertEqual(msg.get_content_charset(), None)
+
+ def test_rfc2231_bad_character_in_filename(self):
+ m = '''\
+Content-Disposition: inline;
+\tfilename*0*="ascii'xx'This%20is%20even%20more%20";
+\tfilename*1*="%2A%2A%2Afun%2A%2A%2A%20";
+\tfilename*2*="is it not.pdf%E2"
+
+'''
+ msg = email.message_from_string(m)
+ self.assertEqual(msg.get_filename(),
+ u'This is even more ***fun*** is it not.pdf\ufffd')
+
def test_rfc2231_unknown_encoding(self):
m = """\
Content-Transfer-Encoding: 8bit
From python-checkins at python.org Fri Jul 28 06:22:36 2006
From: python-checkins at python.org (neal.norwitz)
Date: Fri, 28 Jul 2006 06:22:36 +0200 (CEST)
Subject: [Python-checkins] r50895 - python/trunk/Lib/test/test_inspect.py
Message-ID: <20060728042236.04AF91E4004@bag.python.org>
Author: neal.norwitz
Date: Fri Jul 28 06:22:34 2006
New Revision: 50895
Modified:
python/trunk/Lib/test/test_inspect.py
Log:
Ensure the actual number matches the expected count
Modified: python/trunk/Lib/test/test_inspect.py
==============================================================================
--- python/trunk/Lib/test/test_inspect.py (original)
+++ python/trunk/Lib/test/test_inspect.py Fri Jul 28 06:22:34 2006
@@ -43,10 +43,11 @@
class TestPredicates(IsTestBase):
def test_thirteen(self):
- # Doc/lib/libinspect.tex claims there are 13 such functions
count = len(filter(lambda x:x.startswith('is'), dir(inspect)))
- self.assertEqual(count, 13,
- "There are %d (not 12) is* functions" % count)
+ # Doc/lib/libinspect.tex claims there are 13 such functions
+ expected = 13
+ err_msg = "There are %d (not %d) is* functions" % (count, expected)
+ self.assertEqual(count, expected, err_msg)
def test_excluding_predicates(self):
self.istest(inspect.isbuiltin, 'sys.exit')
From python-checkins at python.org Fri Jul 28 06:51:59 2006
From: python-checkins at python.org (tim.peters)
Date: Fri, 28 Jul 2006 06:51:59 +0200 (CEST)
Subject: [Python-checkins] r50896 - in python/trunk: Doc/lib/libuuid.tex
Lib/test/test_uuid.py Lib/uuid.py
Message-ID: <20060728045159.F01A41E4007@bag.python.org>
Author: tim.peters
Date: Fri Jul 28 06:51:59 2006
New Revision: 50896
Modified:
python/trunk/Doc/lib/libuuid.tex
python/trunk/Lib/test/test_uuid.py
python/trunk/Lib/uuid.py
Log:
Live with that "the hardware address" is an ill-defined
concept, and that different ways of trying to find "the
hardware address" may return different results. Certainly
true on both of my Windows boxes, and in different ways
(see whining on python-dev).
Modified: python/trunk/Doc/lib/libuuid.tex
==============================================================================
--- python/trunk/Doc/lib/libuuid.tex (original)
+++ python/trunk/Doc/lib/libuuid.tex Fri Jul 28 06:51:59 2006
@@ -32,7 +32,7 @@
Create a UUID from either a string of 32 hexadecimal digits,
a string of 16 bytes as the \var{bytes} argument, a tuple of six
-integers (32-bit \var{time_low}, 16-bit \var{time_mid},
+integers (32-bit \var{time_low}, 16-bit \var{time_mid},
16-bit \var{time_hi_version},
8-bit \var{clock_seq_hi_variant}, 8-bit \var{clock_seq_low}, 48-bit \var{node})
as the \var{fields} argument, or a single 128-bit integer as the \var{int}
@@ -109,10 +109,13 @@
The \module{uuid} module defines the following functions
\begin{funcdesc}{getnode}{}
-Get the hardware address as a 48-bit integer. The first time this runs,
-it may launch a separate program, which could be quite slow. If all
+Get the hardware address as a 48-bit positive integer. The first time this
+runs, it may launch a separate program, which could be quite slow. If all
attempts to obtain the hardware address fail, we choose a random 48-bit
-number with its eighth bit set to 1 as recommended in RFC 4122.
+number with its eighth bit set to 1 as recommended in RFC 4122. "Hardware
+address" means the MAC address of a network interface, and on a machine
+with multiple network interfaces the MAC address of any one of them may
+be returned.
\end{funcdesc}
\index{getnode}
@@ -126,10 +129,10 @@
\index{uuid1}
\begin{funcdesc}{uuid3}{namespace, name}
-Generate a UUID based upon a MD5 hash of the \var{name} string value
-drawn from a specified namespace. \var{namespace}
+Generate a UUID based upon a MD5 hash of the \var{name} string value
+drawn from a specified namespace. \var{namespace}
must be one of \constant{NAMESPACE_DNS},
-\constant{NAMESPACE_URL}, \constant{NAMESPACE_OID},
+\constant{NAMESPACE_URL}, \constant{NAMESPACE_OID},
or \constant{NAMESPACE_X500}.
\end{funcdesc}
\index{uuid3}
@@ -140,15 +143,15 @@
\index{uuid4}
\begin{funcdesc}{uuid5}{namespace, name}
-Generate a UUID based upon a SHA-1 hash of the \var{name} string value
-drawn from a specified namespace. \var{namespace}
+Generate a UUID based upon a SHA-1 hash of the \var{name} string value
+drawn from a specified namespace. \var{namespace}
must be one of \constant{NAMESPACE_DNS},
-\constant{NAMESPACE_URL}, \constant{NAMESPACE_OID},
+\constant{NAMESPACE_URL}, \constant{NAMESPACE_OID},
or \constant{NAMESPACE_X500}.
\end{funcdesc}
\index{uuid5}
-The \module{uuid} module defines the following namespace constants
+The \module{uuid} module defines the following namespace constants
for use with \function{uuid3()} or \function{uuid5()}.
\begin{datadesc}{NAMESPACE_DNS}
@@ -167,7 +170,7 @@
X.500 DN namespace UUID.
\end{datadesc}
-The \module{uuid} module defines the following constants
+The \module{uuid} module defines the following constants
for the possible values of the \member{variant} attribute:
\begin{datadesc}{RESERVED_NCS}
Modified: python/trunk/Lib/test/test_uuid.py
==============================================================================
--- python/trunk/Lib/test/test_uuid.py (original)
+++ python/trunk/Lib/test/test_uuid.py Fri Jul 28 06:51:59 2006
@@ -284,7 +284,11 @@
msg = "different sources disagree on node:\n"
for s, n in TestUUID.source2node.iteritems():
msg += " from source %r, node was %012x\n" % (s, n)
- self.fail(msg)
+ # There's actually no reason to expect the MAC addresses
+ # to agree across various methods -- e.g., a box may have
+ # multiple network interfaces, and different ways of getting
+ # a MAC address may favor different HW.
+ ##self.fail(msg)
else:
TestUUID.last_node = node
@@ -309,7 +313,7 @@
def test_random_getnode(self):
node = uuid._random_getnode()
self.assert_(0 <= node)
- self.assert_(node < 1<<48L)
+ self.assert_(node < (1L <<48))
def test_unixdll_getnode(self):
import os
@@ -322,10 +326,14 @@
self.check_node(uuid._windll_getnode(), 'windll')
def test_getnode(self):
- self.check_node(uuid.getnode(), "getnode1")
+ node1 = uuid.getnode()
+ self.check_node(node1, "getnode1")
# Test it again to ensure consistency.
- self.check_node(uuid.getnode(), "getnode2")
+ node2 = uuid.getnode()
+ self.check_node(node2, "getnode2")
+
+ self.assertEqual(node1, node2)
def test_uuid1(self):
equal = self.assertEqual
Modified: python/trunk/Lib/uuid.py
==============================================================================
--- python/trunk/Lib/uuid.py (original)
+++ python/trunk/Lib/uuid.py Fri Jul 28 06:51:59 2006
@@ -362,6 +362,10 @@
# hardware address. On Windows 2000 and later, UuidCreate makes a
# random UUID and UuidCreateSequential gives a UUID containing the
# hardware address. These routines are provided by the RPC runtime.
+ # NOTE: at least on Tim's WinXP Pro SP2 desktop box, while the last
+ # 6 bytes returned by UuidCreateSequential are fixed, they don't appear
+ # to bear any relationship to the MAC address of any network device
+ # on the box.
try:
lib = ctypes.windll.rpcrt4
except:
@@ -389,10 +393,13 @@
_node = None
def getnode():
- """Get the hardware address as a 48-bit integer. The first time this
- runs, it may launch a separate program, which could be quite slow. If
- all attempts to obtain the hardware address fail, we choose a random
- 48-bit number with its eighth bit set to 1 as recommended in RFC 4122."""
+ """Get the hardware address as a 48-bit positive integer.
+
+ The first time this runs, it may launch a separate program, which could
+ be quite slow. If all attempts to obtain the hardware address fail, we
+ choose a random 48-bit number with its eighth bit set to 1 as recommended
+ in RFC 4122.
+ """
global _node
if _node is not None:
From buildbot at python.org Fri Jul 28 07:21:23 2006
From: buildbot at python.org (buildbot at python.org)
Date: Fri, 28 Jul 2006 05:21:23 +0000
Subject: [Python-checkins] buildbot warnings in alpha Tru64 5.1 trunk
Message-ID: <20060728052123.E885C1E4007@bag.python.org>
The Buildbot has detected a new failure of alpha Tru64 5.1 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/alpha%2520Tru64%25205.1%2520trunk/builds/983
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: neal.norwitz
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Fri Jul 28 09:17:03 2006
From: buildbot at python.org (buildbot at python.org)
Date: Fri, 28 Jul 2006 07:17:03 +0000
Subject: [Python-checkins] buildbot warnings in sparc Ubuntu dapper 2.4
Message-ID: <20060728071703.362C61E4007@bag.python.org>
The Buildbot has detected a new failure of sparc Ubuntu dapper 2.4.
Full details are available at:
http://www.python.org/dev/buildbot/all/sparc%2520Ubuntu%2520dapper%25202.4/builds/113
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch branches/release24-maint] HEAD
Blamelist: barry.warsaw,gregory.p.smith
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Fri Jul 28 09:21:27 2006
From: python-checkins at python.org (neal.norwitz)
Date: Fri, 28 Jul 2006 09:21:27 +0200 (CEST)
Subject: [Python-checkins] r50897 - python/trunk/Lib/uuid.py
Message-ID: <20060728072127.E10F21E4014@bag.python.org>
Author: neal.norwitz
Date: Fri Jul 28 09:21:27 2006
New Revision: 50897
Modified:
python/trunk/Lib/uuid.py
Log:
Try to find the MAC addr on various flavours of Unix. This seems hopeless.
The reduces the test_uuid failures, but there's still another method failing.
Modified: python/trunk/Lib/uuid.py
==============================================================================
--- python/trunk/Lib/uuid.py (original)
+++ python/trunk/Lib/uuid.py Fri Jul 28 09:21:27 2006
@@ -271,22 +271,48 @@
version = property(get_version)
-def _ifconfig_getnode():
- """Get the hardware address on Unix by running ifconfig."""
+def _find_mac(command, args, hw_identifiers, get_index):
import os
for dir in ['', '/sbin/', '/usr/sbin']:
try:
# LC_ALL to get English output, 2>/dev/null to
# prevent output on stderr
- cmd = 'LC_ALL=C %s 2>/dev/null' % os.path.join(dir, 'ifconfig')
+ executable = os.path.join(dir, command)
+ cmd = 'LC_ALL=C %s %s 2>/dev/null' % (executable, args)
pipe = os.popen(cmd)
except IOError:
continue
+
for line in pipe:
words = line.lower().split()
for i in range(len(words)):
- if words[i] in ['hwaddr', 'ether']:
- return int(words[i + 1].replace(':', ''), 16)
+ if words[i] in hw_identifiers:
+ return int(words[get_index(i)].replace(':', ''), 16)
+ return None
+
+def _ifconfig_getnode():
+ """Get the hardware address on Unix by running ifconfig."""
+
+ # This works on Linux ('' or '-a'), Tru64 ('-av'), but not all Unixes.
+ for args in ('', '-a', '-av'):
+ mac = _find_mac('ifconfig', args, ['hwaddr', 'ether'], lambda i: i+1)
+ if mac:
+ return mac
+
+ import socket
+ ip_addr = socket.gethostbyname(socket.gethostname())
+
+ # Try getting the MAC addr from arp based on our IP address (Solaris).
+ mac = _find_mac('arp', '-an', [ip_addr], lambda i: -1)
+ if mac:
+ return mac
+
+ # This might work on HP-UX.
+ mac = _find_mac('lanscan', '-ai', ['lan0'], lambda i: 0)
+ if mac:
+ return mac
+
+ return None
def _ipconfig_getnode():
"""Get the hardware address on Windows by running ipconfig.exe."""
From python-checkins at python.org Fri Jul 28 09:45:49 2006
From: python-checkins at python.org (martin.v.loewis)
Date: Fri, 28 Jul 2006 09:45:49 +0200 (CEST)
Subject: [Python-checkins] r50898 - python/trunk/Tools/msi/uuids.py
Message-ID: <20060728074549.C67DD1E4017@bag.python.org>
Author: martin.v.loewis
Date: Fri Jul 28 09:45:49 2006
New Revision: 50898
Modified:
python/trunk/Tools/msi/uuids.py
Log:
Add UUID for upcoming 2.5b3.
Modified: python/trunk/Tools/msi/uuids.py
==============================================================================
--- python/trunk/Tools/msi/uuids.py (original)
+++ python/trunk/Tools/msi/uuids.py Fri Jul 28 09:45:49 2006
@@ -27,6 +27,7 @@
'2.5.103': '{73dcd966-ffec-415f-bb39-8342c1f47017}', # 2.5a3
'2.5.111': '{c797ecf8-a8e6-4fec-bb99-526b65f28626}', # 2.5b1
'2.5.112': '{32beb774-f625-439d-b587-7187487baf15}', # 2.5b2
+ '2.5.113': '{89f23918-11cf-4f08-be13-b9b2e6463fd9}', # 2.5b3
'2.5.121': '{8e9321bc-6b24-48a3-8fd4-c95f8e531e5f}', # 2.5c1
'2.5.122': '{a6cd508d-9599-45da-a441-cbffa9f7e070}', # 2.5c2
'2.5.150': '{0a2c5854-557e-48c8-835a-3b9f074bdcaa}', # 2.5.0
From buildbot at python.org Fri Jul 28 11:41:24 2006
From: buildbot at python.org (buildbot at python.org)
Date: Fri, 28 Jul 2006 09:41:24 +0000
Subject: [Python-checkins] buildbot warnings in alpha Tru64 5.1 trunk
Message-ID: <20060728094124.503EC1E4008@bag.python.org>
The Buildbot has detected a new failure of alpha Tru64 5.1 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/alpha%2520Tru64%25205.1%2520trunk/builds/986
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: martin.v.loewis
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Fri Jul 28 13:27:32 2006
From: python-checkins at python.org (matt.fleming)
Date: Fri, 28 Jul 2006 13:27:32 +0200 (CEST)
Subject: [Python-checkins] r50899 - in python/trunk: Misc/ACKS Misc/NEWS
Modules/socketmodule.c
Message-ID: <20060728112732.67FD11E4002@bag.python.org>
Author: matt.fleming
Date: Fri Jul 28 13:27:27 2006
New Revision: 50899
Modified:
python/trunk/Misc/ACKS
python/trunk/Misc/NEWS
python/trunk/Modules/socketmodule.c
Log:
Allow socketmodule to compile on NetBSD -current, whose bluetooth API
differs from both Linux and FreeBSD. Accepted by Neal Norwitz.
Modified: python/trunk/Misc/ACKS
==============================================================================
--- python/trunk/Misc/ACKS (original)
+++ python/trunk/Misc/ACKS Fri Jul 28 13:27:27 2006
@@ -203,6 +203,7 @@
Russell Finn
Nils Fischbeck
Frederik Fix
+Matt Fleming
Hernán Martínez Foffani
Doug Fort
John Fouhy
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Fri Jul 28 13:27:27 2006
@@ -132,6 +132,9 @@
methods now allow their database parameter to be None as the
sleepycat API allows.
+- Bug #1526460: Fix socketmodule compile on NetBSD as it has a different
+ bluetooth API compared with Linux and FreeBSD.
+
Tests
-----
Modified: python/trunk/Modules/socketmodule.c
==============================================================================
--- python/trunk/Modules/socketmodule.c (original)
+++ python/trunk/Modules/socketmodule.c Fri Jul 28 13:27:27 2006
@@ -367,6 +367,14 @@
#define _BT_SOCKADDR_MEMB(s, proto) &((s)->sock_addr)
#define _BT_L2_MEMB(sa, memb) ((sa)->l2cap_##memb)
#define _BT_RC_MEMB(sa, memb) ((sa)->rfcomm_##memb)
+#elif defined(__NetBSD__)
+#define sockaddr_l2 sockaddr_bt
+#define sockaddr_rc sockaddr_bt
+#define sockaddr_sco sockaddr_bt
+#define _BT_SOCKADDR_MEMB(s, proto) &((s)->sock_addr)
+#define _BT_L2_MEMB(sa, memb) ((sa)->bt_##memb)
+#define _BT_RC_MEMB(sa, memb) ((sa)->bt_##memb)
+#define _BT_SCO_MEMB(sa, memb) ((sa)->bt_##memb)
#else
#define _BT_SOCKADDR_MEMB(s, proto) (&((s)->sock_addr).bt_##proto)
#define _BT_L2_MEMB(sa, memb) ((sa)->l2_##memb)
From python-checkins at python.org Fri Jul 28 14:07:12 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Fri, 28 Jul 2006 14:07:12 +0200 (CEST)
Subject: [Python-checkins] r50900 - python/trunk/Doc/whatsnew/whatsnew25.tex
Message-ID: <20060728120712.DBE0D1E4002@bag.python.org>
Author: andrew.kuchling
Date: Fri Jul 28 14:07:12 2006
New Revision: 50900
Modified:
python/trunk/Doc/whatsnew/whatsnew25.tex
Log:
[Patch #1529811] Correction to description of r|* mode
Modified: python/trunk/Doc/whatsnew/whatsnew25.tex
==============================================================================
--- python/trunk/Doc/whatsnew/whatsnew25.tex (original)
+++ python/trunk/Doc/whatsnew/whatsnew25.tex Fri Jul 28 14:07:12 2006
@@ -1687,8 +1687,8 @@
a different directory as the extraction target, and to unpack only a
subset of the archive's members.
-A tarfile's compression can be autodetected by
-using the mode \code{'r|*'}.
+The compression used for a tarfile opened in stream mode can now be
+autodetected using the mode \code{'r|*'}.
% patch 918101
(Contributed by Lars Gust\"abel.)
@@ -2430,7 +2430,7 @@
The author would like to thank the following people for offering
suggestions, corrections and assistance with various drafts of this
-article: Nick Coghlan, Phillip J. Eby, Raymond Hettinger, Ralf
+article: Nick Coghlan, Phillip J. Eby, Lars Gust\"abel, Raymond Hettinger, Ralf
W. Grosse-Kunstleve, Kent Johnson, Martin von~L\"owis, Fredrik Lundh,
Gustavo Niemeyer, James Pryor, Mike Rovner, Scott Weikart, Barry
Warsaw, Thomas Wouters.
From python-checkins at python.org Fri Jul 28 14:18:22 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Fri, 28 Jul 2006 14:18:22 +0200 (CEST)
Subject: [Python-checkins] r50901 -
python/trunk/Doc/lib/sqlite3/complete_statement.py
Message-ID: <20060728121822.55F7B1E4002@bag.python.org>
Author: andrew.kuchling
Date: Fri Jul 28 14:18:22 2006
New Revision: 50901
Modified:
python/trunk/Doc/lib/sqlite3/complete_statement.py
Log:
Typo fix
Modified: python/trunk/Doc/lib/sqlite3/complete_statement.py
==============================================================================
--- python/trunk/Doc/lib/sqlite3/complete_statement.py (original)
+++ python/trunk/Doc/lib/sqlite3/complete_statement.py Fri Jul 28 14:18:22 2006
@@ -24,7 +24,7 @@
if buffer.lstrip().upper().startswith("SELECT"):
print cur.fetchall()
except sqlite3.Error, e:
- print "An error occured:", e.args[0]
+ print "An error occurred:", e.args[0]
buffer = ""
con.close()
From python-checkins at python.org Fri Jul 28 14:32:44 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Fri, 28 Jul 2006 14:32:44 +0200 (CEST)
Subject: [Python-checkins] r50902 - python/trunk/Doc/lib/libwebbrowser.tex
Message-ID: <20060728123244.040861E4002@bag.python.org>
Author: andrew.kuchling
Date: Fri Jul 28 14:32:43 2006
New Revision: 50902
Modified:
python/trunk/Doc/lib/libwebbrowser.tex
Log:
Add example
Modified: python/trunk/Doc/lib/libwebbrowser.tex
==============================================================================
--- python/trunk/Doc/lib/libwebbrowser.tex (original)
+++ python/trunk/Doc/lib/libwebbrowser.tex Fri Jul 28 14:32:43 2006
@@ -136,6 +136,18 @@
Only on MacOS X platform.
\end{description}
+Here are some simple examples:
+
+\begin{verbatim}
+url = 'http://www.python.org'
+
+# Open URL in a new tab, if a browser window is already open.
+webbrowser.open_new_tab(url + '/doc')
+
+# Open URL in new window, raising the window if possible.
+webbrowser.open_new(url)
+\end{verbatim}
+
\subsection{Browser Controller Objects \label{browser-controllers}}
From python-checkins at python.org Fri Jul 28 14:33:19 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Fri, 28 Jul 2006 14:33:19 +0200 (CEST)
Subject: [Python-checkins] r50903 - python/trunk/Doc/lib/libpickle.tex
Message-ID: <20060728123319.B6B601E4002@bag.python.org>
Author: andrew.kuchling
Date: Fri Jul 28 14:33:19 2006
New Revision: 50903
Modified:
python/trunk/Doc/lib/libpickle.tex
Log:
Add example
Modified: python/trunk/Doc/lib/libpickle.tex
==============================================================================
--- python/trunk/Doc/lib/libpickle.tex (original)
+++ python/trunk/Doc/lib/libpickle.tex Fri Jul 28 14:33:19 2006
@@ -725,7 +725,50 @@
\subsection{Example \label{pickle-example}}
-Here's a simple example of how to modify pickling behavior for a
+For the simplest code, use the \function{dump()} and \function{load()}
+functions. Note that a self-referencing list is pickled and restored
+correctly.
+
+\begin{verbatim}
+import pickle
+
+data1 = {'a': [1, 2.0, 3, 4+6j],
+ 'b': ('string', u'Unicode string'),
+ 'c': None}
+
+selfref_list = [1, 2, 3]
+selfref_list.append(selfref_list)
+
+output = open('data.pkl', 'wb')
+
+# Pickle dictionary using protocol 0.
+pickle.dump(data1, output)
+
+# Pickle the list using the highest protocol available.
+pickle.dump(selfref_list, output, -1)
+
+output.close()
+\end{verbatim}
+
+The following example reads the resulting pickled data. When reading
+a pickle-containing file, you should open the file in binary mode
+because you can't be sure if the ASCII or binary format was used.
+
+\begin{verbatim}
+import pprint, pickle
+
+pkl_file = open('data.pkl', 'rb')
+
+data1 = pickle.load(pkl_file)
+pprint.pprint(data1)
+
+data2 = pickle.load(pkl_file)
+pprint.pprint(data2)
+
+pkl_file.close()
+\end{verbatim}
+
+Here's a larger example that shows how to modify pickling behavior for a
class. The \class{TextReader} class opens a text file, and returns
the line number and line contents each time its \method{readline()}
method is called. If a \class{TextReader} instance is pickled, all
From python-checkins at python.org Fri Jul 28 14:45:55 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Fri, 28 Jul 2006 14:45:55 +0200 (CEST)
Subject: [Python-checkins] r50904 - python/trunk/Doc/lib/libshelve.tex
Message-ID: <20060728124555.A41FB1E4002@bag.python.org>
Author: andrew.kuchling
Date: Fri Jul 28 14:45:55 2006
New Revision: 50904
Modified:
python/trunk/Doc/lib/libshelve.tex
Log:
Don't overwrite built-in name; add some blank lines for readability
Modified: python/trunk/Doc/lib/libshelve.tex
==============================================================================
--- python/trunk/Doc/lib/libshelve.tex (original)
+++ python/trunk/Doc/lib/libshelve.tex Fri Jul 28 14:45:55 2006
@@ -143,15 +143,17 @@
del d[key] # delete data stored at key (raises KeyError
# if no such key)
flag = d.has_key(key) # true if the key exists
-list = d.keys() # a list of all existing keys (slow!)
+klist = d.keys() # a list of all existing keys (slow!)
# as d was opened WITHOUT writeback=True, beware:
d['xx'] = range(4) # this works as expected, but...
d['xx'].append(5) # *this doesn't!* -- d['xx'] is STILL range(4)!!!
+
# having opened d without writeback=True, you need to code carefully:
temp = d['xx'] # extracts the copy
temp.append(5) # mutates the copy
d['xx'] = temp # stores the copy right back, to persist it
+
# or, d=shelve.open(filename,writeback=True) would let you just code
# d['xx'].append(5) and have it work as expected, BUT it would also
# consume more memory and make the d.close() operation slower.
From python-checkins at python.org Fri Jul 28 14:48:07 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Fri, 28 Jul 2006 14:48:07 +0200 (CEST)
Subject: [Python-checkins] r50905 - python/trunk/Doc/lib/libanydbm.tex
Message-ID: <20060728124807.83EC41E4002@bag.python.org>
Author: andrew.kuchling
Date: Fri Jul 28 14:48:07 2006
New Revision: 50905
Modified:
python/trunk/Doc/lib/libanydbm.tex
Log:
Add example. Should I propagate this example to all the other DBM-ish modules, too?
Modified: python/trunk/Doc/lib/libanydbm.tex
==============================================================================
--- python/trunk/Doc/lib/libanydbm.tex (original)
+++ python/trunk/Doc/lib/libanydbm.tex Fri Jul 28 14:48:07 2006
@@ -46,6 +46,32 @@
\method{keys()} methods are available. Keys and values must always be
strings.
+The following example records some hostnames and a corresponding title,
+and then prints out the contents of the database:
+
+\begin{verbatim}
+import anydbm
+
+# Open database, creating it if necessary.
+db = anydbm.open('cache', 'c')
+
+# Record some values
+db['www.python.org'] = 'Python Website'
+db['www.cnn.com'] = 'Cable News Network'
+
+# Loop through contents. Other dictionary methods
+# such as .keys(), .values() also work.
+for k, v in db.iteritems():
+ print k, '\t', v
+
+# Storing a non-string key or value will raise an exception (most
+# likely a TypeError).
+db['www.yahoo.com'] = 4
+
+# Close when done.
+db.close()
+\end{verbatim}
+
\begin{seealso}
\seemodule{dbhash}{BSD \code{db} database interface.}
From buildbot at python.org Fri Jul 28 15:45:32 2006
From: buildbot at python.org (buildbot at python.org)
Date: Fri, 28 Jul 2006 13:45:32 +0000
Subject: [Python-checkins] buildbot failure in sparc Ubuntu dapper trunk
Message-ID: <20060728134533.19A191E4014@bag.python.org>
The Buildbot has detected a new failure of sparc Ubuntu dapper trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/sparc%2520Ubuntu%2520dapper%2520trunk/builds/579
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: matt.fleming
BUILD FAILED: failed svn
sincerely,
-The Buildbot
From python-checkins at python.org Fri Jul 28 17:45:56 2006
From: python-checkins at python.org (barry.warsaw)
Date: Fri, 28 Jul 2006 17:45:56 +0200 (CEST)
Subject: [Python-checkins] r50906 - sandbox/trunk/emailpkg/tags/4_0_1
Message-ID: <20060728154556.464B51E4002@bag.python.org>
Author: barry.warsaw
Date: Fri Jul 28 17:45:55 2006
New Revision: 50906
Added:
sandbox/trunk/emailpkg/tags/4_0_1/
- copied from r50905, python/trunk/Lib/email/
Log:
Tagging email 4.0.1
From python-checkins at python.org Fri Jul 28 18:37:33 2006
From: python-checkins at python.org (brett.cannon)
Date: Fri, 28 Jul 2006 18:37:33 +0200 (CEST)
Subject: [Python-checkins] r50907 -
python/branches/bcannon-sandboxing/Objects/obmalloc.c
Message-ID: <20060728163733.1B4871E4002@bag.python.org>
Author: brett.cannon
Date: Fri Jul 28 18:37:32 2006
New Revision: 50907
Modified:
python/branches/bcannon-sandboxing/Objects/obmalloc.c
Log:
Simplify the return of PyMalloc_ManagesMemory() to please Neal's optimizing heart.
Modified: python/branches/bcannon-sandboxing/Objects/obmalloc.c
==============================================================================
--- python/branches/bcannon-sandboxing/Objects/obmalloc.c (original)
+++ python/branches/bcannon-sandboxing/Objects/obmalloc.c Fri Jul 28 18:37:32 2006
@@ -716,10 +716,7 @@
pool = POOL_ADDR(ptr);
- if (Py_ADDRESS_IN_RANGE(ptr, pool))
- return 1;
- else
- return 0;
+ return Py_ADDRESS_IN_RANGE(ptr, pool);
}
size_t
From python-checkins at python.org Fri Jul 28 19:21:15 2006
From: python-checkins at python.org (brett.cannon)
Date: Fri, 28 Jul 2006 19:21:15 +0200 (CEST)
Subject: [Python-checkins] r50908 -
python/branches/bcannon-sandboxing/Objects/trackedmalloc.c
Message-ID: <20060728172115.03D861E4002@bag.python.org>
Author: brett.cannon
Date: Fri Jul 28 19:21:14 2006
New Revision: 50908
Added:
python/branches/bcannon-sandboxing/Objects/trackedmalloc.c
Log:
Major oversight of forgetting to check in the tracking malloc functions to begin with. =)
Added: python/branches/bcannon-sandboxing/Objects/trackedmalloc.c
==============================================================================
--- (empty file)
+++ python/branches/bcannon-sandboxing/Objects/trackedmalloc.c Fri Jul 28 19:21:14 2006
@@ -0,0 +1,155 @@
+#include "Python.h"
+#include
+
+/*
+ Add accountability to memory allocation.
+
+ The goal of these functions is to allow for all memory to be tracked based
+ on how much is being (roughly) used, and for what.
+
+ The APIs that need to be covered are PyObject_New(), Pyobject_Malloc(),
+ PyMem_Malloc(), the realloc/free mates, the macro variants, and the GC
+ variants.
+
+ * PyObject_New()/PyObject_NewVar()/PyObject_Del()
+ Uses PyObject_T_*()
+ * PyObject_NEW()/PyObject_NEW_VAR()/PyObject_DEL()
+ Uses PyObject_T_*()
+ * PyObject_MALLOC()/PyObject_REALLOC()/PyObject_FREE()
+ Change over to PyObject_T_*()
+ * PyObject_Malloc()/PyObject_Realloc()/PyObject_Free()
+ XXX
+ * PyObject_GC_New()/PyObject_GC_NewVar()/PyObject_GC_Del()
+ Uses _PyObject_GC_TrackedMalloc()
+ * _PyObject_GC_Malloc()
+ Changed to _PyObject_GC_TrackedMalloc()
+ * PyObject_GC_Resize()
+ Uses PyObject_T_MALLOC()
+ * PyMem_Malloc()/PyMem_Realloc()/PyMem_Free()
+ XXX
+ * PyMem_MALLOC()/PyMem_REALLOC()/PyMem_FREE()
+ XXX
+ * malloc()/realloc()/free()
+ XXX
+
+ In order to properly track memory usage, we must handle both memory handed
+ out by pymalloc as well as memory from malloc(). For pymaloc, we need to
+ first find out if pymalloc is managing the memory, and if that is true then
+ how big of a chunk of memory was given for the pointer.
+ For malloc(), we need to either use functions provided by the C library
+ (for glibc, see
+ http://www.gnu.org/software/libc/manual/html_node/Summary-of-Malloc.html).
+*/
+
+unsigned long Py_ProcessMemUsage = 0;
+
+static const char *UNKNOWN_WHAT = "";
+
+
+/*
+ Track an anonymous chunk of memory.
+*/
+int
+PyObject_TrackMemory(const char *what, size_t nbytes)
+{
+ what = what ? what : UNKNOWN_WHAT;
+
+ Py_ProcessMemUsage += nbytes;
+
+ return 1;
+}
+
+/*
+ Stop tracking an anonymous chunk of memory.
+*/
+int
+PyObject_UntrackMemory(const char *what, size_t nbytes)
+{
+ what = what ? what : UNKNOWN_WHAT;
+
+ Py_ProcessMemUsage -= nbytes;
+
+ return 1;
+}
+
+/*
+ Track the memory created by PyObject_Maloc().
+*/
+void *
+PyObject_TrackedMalloc(const char *what, size_t nbytes)
+{
+ struct mallinfo before = mallinfo();
+ void *allocated = NULL;
+ size_t used = 0;
+
+ allocated = PyObject_Malloc(nbytes);
+
+ if (!allocated)
+ return NULL;
+
+ if (PyMalloc_ManagesMemory(allocated)) {
+ used = PyMalloc_AllocatedSize(allocated);
+ }
+ else {
+ used = mallinfo().uordblks - before.uordblks;
+ }
+
+ PyObject_TrackMemory(what, used);
+
+ return allocated;
+}
+
+/*
+ Track the realloc of memory cretaed by PyObject_Malloc().
+
+ XXX really need to have reason string passed in?
+*/
+void *
+PyObject_TrackedRealloc(const char *what, void *to_resize, size_t new_size)
+{
+ struct mallinfo before = mallinfo();
+ void *allocated = NULL;
+ size_t size_delta = 0;
+
+ if (PyMalloc_ManagesMemory(to_resize)) {
+ size_delta = PyMalloc_AllocatedSize(to_resize);
+ }
+
+ allocated = PyObject_Realloc(to_resize, new_size);
+
+ if (!allocated)
+ return NULL;
+
+ if (PyMalloc_ManagesMemory(allocated)) {
+ size_delta = PyMalloc_AllocatedSize(allocated) - size_delta;
+ }
+ else {
+ size_delta = mallinfo().uordblks - before.uordblks;
+ }
+
+ PyObject_TrackMemory(what, size_delta);
+
+ return allocated;
+}
+
+/*
+ Untrack memory created by PyObject_Malloc().
+*/
+void
+PyObject_TrackedFree(const char *what, void *to_free)
+{
+ struct mallinfo before = mallinfo();
+ size_t freed = 0;
+
+ if (PyMalloc_ManagesMemory(to_free)) {
+ freed = PyMalloc_AllocatedSize(to_free);
+ }
+
+ PyObject_Free(to_free);
+
+ if (!freed) {
+ freed = before.uordblks - mallinfo().uordblks;
+ }
+
+ PyObject_UntrackMemory(what, freed);
+}
From python-checkins at python.org Fri Jul 28 19:22:32 2006
From: python-checkins at python.org (brett.cannon)
Date: Fri, 28 Jul 2006 19:22:32 +0200 (CEST)
Subject: [Python-checkins] r50909 - in python/branches/bcannon-sandboxing:
Include/objimpl.h Modules/gcmodule.c Objects/typeobject.c
Message-ID: <20060728172232.8E89A1E4002@bag.python.org>
Author: brett.cannon
Date: Fri Jul 28 19:22:31 2006
New Revision: 50909
Modified:
python/branches/bcannon-sandboxing/Include/objimpl.h
python/branches/bcannon-sandboxing/Modules/gcmodule.c
python/branches/bcannon-sandboxing/Objects/typeobject.c
Log:
Change _PyObject_GC_Malloc() to _PyObject_GC_TrackedMalloc() and take in a
const char * to denote what the memory is for. Also switch over
PyObject_GC_New() & friends to be tracked.
Modified: python/branches/bcannon-sandboxing/Include/objimpl.h
==============================================================================
--- python/branches/bcannon-sandboxing/Include/objimpl.h (original)
+++ python/branches/bcannon-sandboxing/Include/objimpl.h Fri Jul 28 19:22:31 2006
@@ -320,7 +320,7 @@
g->gc.gc_next = NULL; \
} while (0);
-PyAPI_FUNC(PyObject *) _PyObject_GC_Malloc(size_t);
+PyAPI_FUNC(PyObject *) _PyObject_GC_TrackedMalloc(const char *, size_t);
PyAPI_FUNC(PyObject *) _PyObject_GC_New(PyTypeObject *);
PyAPI_FUNC(PyVarObject *) _PyObject_GC_NewVar(PyTypeObject *, Py_ssize_t);
PyAPI_FUNC(void) PyObject_GC_Track(void *);
Modified: python/branches/bcannon-sandboxing/Modules/gcmodule.c
==============================================================================
--- python/branches/bcannon-sandboxing/Modules/gcmodule.c (original)
+++ python/branches/bcannon-sandboxing/Modules/gcmodule.c Fri Jul 28 19:22:31 2006
@@ -1281,7 +1281,7 @@
#undef PyObject_GC_Track
#undef PyObject_GC_UnTrack
#undef PyObject_GC_Del
-#undef _PyObject_GC_Malloc
+#undef _PyObject_GC_TrackedMalloc
void
PyObject_GC_Track(void *op)
@@ -1314,12 +1314,12 @@
}
PyObject *
-_PyObject_GC_Malloc(size_t basicsize)
+_PyObject_GC_TrackedMalloc(const char* what, size_t basicsize)
{
PyObject *op;
PyGC_Head *g = NULL;
- g = (PyGC_Head *)PyObject_MALLOC(sizeof(PyGC_Head) + basicsize);
+ g = (PyGC_Head *)PyObject_T_MALLOC(what, sizeof(PyGC_Head) + basicsize);
if (g == NULL)
return PyErr_NoMemory();
g->gc.gc_refs = GC_UNTRACKED;
@@ -1343,7 +1343,7 @@
PyObject *op = NULL;
size_t obj_size = _PyObject_SIZE(tp);
- op = _PyObject_GC_Malloc(obj_size);
+ op = _PyObject_GC_TrackedMalloc(tp->tp_name, obj_size);
if (op != NULL)
op = PyObject_INIT(op, tp);
return op;
@@ -1355,7 +1355,7 @@
const size_t size = _PyObject_VAR_SIZE(tp, nitems);
PyVarObject *op = NULL;
- op = (PyVarObject *) _PyObject_GC_Malloc(size);
+ op = (PyVarObject *) _PyObject_GC_TrackedMalloc(tp->tp_name, size);
if (op != NULL)
op = PyObject_INIT_VAR(op, tp, nitems);
return op;
@@ -1366,7 +1366,8 @@
{
const size_t basicsize = _PyObject_VAR_SIZE(op->ob_type, nitems);
PyGC_Head *g = AS_GC(op);
- g = (PyGC_Head *)PyObject_REALLOC(g, sizeof(PyGC_Head) + basicsize);
+ g = (PyGC_Head *)PyObject_T_REALLOC(((PyObject *)op)->ob_type->tp_name,
+ g, sizeof(PyGC_Head) + basicsize);
if (g == NULL)
return (PyVarObject *)PyErr_NoMemory();
op = (PyVarObject *) FROM_GC(g);
@@ -1383,7 +1384,7 @@
if (generations[0].count > 0) {
generations[0].count--;
}
- PyObject_FREE(g);
+ PyObject_T_FREE(((PyObject *)op)->ob_type->tp_name, g);
}
/* for binary compatibility with 2.2 */
Modified: python/branches/bcannon-sandboxing/Objects/typeobject.c
==============================================================================
--- python/branches/bcannon-sandboxing/Objects/typeobject.c (original)
+++ python/branches/bcannon-sandboxing/Objects/typeobject.c Fri Jul 28 19:22:31 2006
@@ -451,7 +451,7 @@
/* note that we need to add one, for the sentinel */
if (PyType_IS_GC(type))
- obj = _PyObject_GC_Malloc(size);
+ obj = _PyObject_GC_TrackedMalloc(type->tp_name, size);
else
obj = (PyObject *)PyObject_Malloc(size);
From python-checkins at python.org Fri Jul 28 19:26:55 2006
From: python-checkins at python.org (brett.cannon)
Date: Fri, 28 Jul 2006 19:26:55 +0200 (CEST)
Subject: [Python-checkins] r50910 -
python/branches/bcannon-sandboxing/Objects/trackedmalloc.c
Message-ID: <20060728172655.9113B1E4002@bag.python.org>
Author: brett.cannon
Date: Fri Jul 28 19:26:55 2006
New Revision: 50910
Modified:
python/branches/bcannon-sandboxing/Objects/trackedmalloc.c
Log:
Migrate todo list to file doc instead of personal Writely doc.
Modified: python/branches/bcannon-sandboxing/Objects/trackedmalloc.c
==============================================================================
--- python/branches/bcannon-sandboxing/Objects/trackedmalloc.c (original)
+++ python/branches/bcannon-sandboxing/Objects/trackedmalloc.c Fri Jul 28 19:26:55 2006
@@ -39,6 +39,12 @@
For malloc(), we need to either use functions provided by the C library
(for glibc, see
http://www.gnu.org/software/libc/manual/html_node/Summary-of-Malloc.html).
+
+XXX:
++ convert over all APIs.
++ add proper Py_TRACK_MEMORY #ifdef protections.
++ Raise an error during compilation if required functionality (e.g. mallinfo())
+ not available for tracking memory, even if requested.
*/
unsigned long Py_ProcessMemUsage = 0;
From python-checkins at python.org Fri Jul 28 20:30:51 2006
From: python-checkins at python.org (georg.brandl)
Date: Fri, 28 Jul 2006 20:30:51 +0200 (CEST)
Subject: [Python-checkins] r50911 - in python/branches/release24-maint:
Lib/test/test_email_codecs.py Lib/test/test_iterlen.py Misc/NEWS
Message-ID: <20060728183051.C6CA21E4014@bag.python.org>
Author: georg.brandl
Date: Fri Jul 28 20:30:50 2006
New Revision: 50911
Modified:
python/branches/release24-maint/Lib/test/test_email_codecs.py
python/branches/release24-maint/Lib/test/test_iterlen.py
python/branches/release24-maint/Misc/NEWS
Log:
Patch #1529686: run test_iterlen and test_email_codecs in 2.4.
Modified: python/branches/release24-maint/Lib/test/test_email_codecs.py
==============================================================================
--- python/branches/release24-maint/Lib/test/test_email_codecs.py (original)
+++ python/branches/release24-maint/Lib/test/test_email_codecs.py Fri Jul 28 20:30:50 2006
@@ -1,11 +1,12 @@
# Copyright (C) 2002 Python Software Foundation
# email package unit tests for (optional) Asian codecs
-import unittest
# The specific tests now live in Lib/email/test
-from email.test.test_email_codecs import suite
+from email.test import test_email_codecs
+from test import test_support
+def test_main():
+ test_support.run_suite(test_email_codecs.suite())
-
if __name__ == '__main__':
- unittest.main(defaultTest='suite')
+ test_main()
Modified: python/branches/release24-maint/Lib/test/test_iterlen.py
==============================================================================
--- python/branches/release24-maint/Lib/test/test_iterlen.py (original)
+++ python/branches/release24-maint/Lib/test/test_iterlen.py Fri Jul 28 20:30:50 2006
@@ -223,9 +223,7 @@
self.assertEqual(len(it), 0)
-
-if __name__ == "__main__":
-
+def test_main():
unittests = [
TestRepeat,
TestXrange,
@@ -243,3 +241,6 @@
TestSeqIterReversed,
]
test_support.run_unittest(*unittests)
+
+if __name__ == "__main__":
+ test_main()
Modified: python/branches/release24-maint/Misc/NEWS
==============================================================================
--- python/branches/release24-maint/Misc/NEWS (original)
+++ python/branches/release24-maint/Misc/NEWS Fri Jul 28 20:30:50 2006
@@ -153,6 +153,12 @@
- Bug #1337990: clarified that ``doctest`` does not support examples
requiring both expected output and an exception.
+Tests
+-----
+
+- Patch #1529686: test_iterlen and test_email_codecs are now actually
+ run by regrtest.py.
+
What's New in Python 2.4.3?
===========================
From python-checkins at python.org Fri Jul 28 20:31:39 2006
From: python-checkins at python.org (georg.brandl)
Date: Fri, 28 Jul 2006 20:31:39 +0200 (CEST)
Subject: [Python-checkins] r50912 - in python/trunk:
Lib/test/test_email_codecs.py Misc/NEWS
Message-ID: <20060728183139.CBEC21E4011@bag.python.org>
Author: georg.brandl
Date: Fri Jul 28 20:31:39 2006
New Revision: 50912
Modified:
python/trunk/Lib/test/test_email_codecs.py
python/trunk/Misc/NEWS
Log:
Patch #1529686: also run test_email_codecs with regrtest.py.
Modified: python/trunk/Lib/test/test_email_codecs.py
==============================================================================
--- python/trunk/Lib/test/test_email_codecs.py (original)
+++ python/trunk/Lib/test/test_email_codecs.py Fri Jul 28 20:31:39 2006
@@ -1,11 +1,15 @@
# Copyright (C) 2002 Python Software Foundation
# email package unit tests for (optional) Asian codecs
-import unittest
# The specific tests now live in Lib/email/test
-from email.test.test_email_codecs import suite
+from email.test import test_email_codecs
+from email.test import test_email_codecs_renamed
+from test import test_support
+def test_main():
+ suite = test_email_codecs.suite()
+ suite.addTest(test_email_codecs_renamed.suite())
+ test_support.run_suite(suite)
-
if __name__ == '__main__':
- unittest.main(defaultTest='suite')
+ test_main()
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Fri Jul 28 20:31:39 2006
@@ -142,9 +142,9 @@
how long the test file should take to play. Now accepts taking 2.93 secs
(exact time) +/- 10% instead of the hard-coded 3.1 sec.
-- Patch #1529686: The standard tests ``test_defaultdict``, ``test_iterlen``
- and ``test_uuid`` didn't actually run any tests when run via ``regrtest.py``.
- Now they do.
+- Patch #1529686: The standard tests ``test_defaultdict``, ``test_iterlen``,
+ ``test_uuid`` and ``test_email_codecs`` didn't actually run any tests when
+ run via ``regrtest.py``. Now they do.
What's New in Python 2.5 beta 2?
From python-checkins at python.org Fri Jul 28 20:36:02 2006
From: python-checkins at python.org (georg.brandl)
Date: Fri, 28 Jul 2006 20:36:02 +0200 (CEST)
Subject: [Python-checkins] r50913 - in python/trunk/Modules:
_sqlite/cursor.c _sqlite/util.c _sqlite/util.h socketmodule.c
Message-ID: <20060728183602.ACD0E1E401A@bag.python.org>
Author: georg.brandl
Date: Fri Jul 28 20:36:01 2006
New Revision: 50913
Modified:
python/trunk/Modules/_sqlite/cursor.c
python/trunk/Modules/_sqlite/util.c
python/trunk/Modules/_sqlite/util.h
python/trunk/Modules/socketmodule.c
Log:
Fix spelling.
Modified: python/trunk/Modules/_sqlite/cursor.c
==============================================================================
--- python/trunk/Modules/_sqlite/cursor.c (original)
+++ python/trunk/Modules/_sqlite/cursor.c Fri Jul 28 20:36:01 2006
@@ -621,7 +621,7 @@
}
} else {
if (PyErr_Occurred()) {
- /* there was an error that occured in a user-defined callback */
+ /* there was an error that occurred in a user-defined callback */
if (_enable_callback_tracebacks) {
PyErr_Print();
} else {
Modified: python/trunk/Modules/_sqlite/util.c
==============================================================================
--- python/trunk/Modules/_sqlite/util.c (original)
+++ python/trunk/Modules/_sqlite/util.c Fri Jul 28 20:36:01 2006
@@ -38,7 +38,7 @@
/**
* Checks the SQLite error code and sets the appropriate DB-API exception.
- * Returns the error code (0 means no error occured).
+ * Returns the error code (0 means no error occurred).
*/
int _seterror(sqlite3* db)
{
Modified: python/trunk/Modules/_sqlite/util.h
==============================================================================
--- python/trunk/Modules/_sqlite/util.h (original)
+++ python/trunk/Modules/_sqlite/util.h Fri Jul 28 20:36:01 2006
@@ -32,7 +32,7 @@
/**
* Checks the SQLite error code and sets the appropriate DB-API exception.
- * Returns the error code (0 means no error occured).
+ * Returns the error code (0 means no error occurred).
*/
int _seterror(sqlite3* db);
#endif
Modified: python/trunk/Modules/socketmodule.c
==============================================================================
--- python/trunk/Modules/socketmodule.c (original)
+++ python/trunk/Modules/socketmodule.c Fri Jul 28 20:36:01 2006
@@ -2307,7 +2307,7 @@
/* Call the guts */
outlen = sock_recv_guts(s, PyString_AS_STRING(buf), recvlen, flags);
if (outlen < 0) {
- /* An error occured, release the string and return an
+ /* An error occurred, release the string and return an
error. */
Py_DECREF(buf);
return NULL;
From python-checkins at python.org Fri Jul 28 20:50:35 2006
From: python-checkins at python.org (brett.cannon)
Date: Fri, 28 Jul 2006 20:50:35 +0200 (CEST)
Subject: [Python-checkins] r50914 - in python/branches/bcannon-sandboxing:
Include/pymem.h Modules/_elementtree.c Modules/parsermodule.c
Objects/trackedmalloc.c Objects/typeobject.c Python/compile.c
Python/future.c Python/pythonrun.c
Message-ID: <20060728185035.ECF1C1E4015@bag.python.org>
Author: brett.cannon
Date: Fri Jul 28 20:50:34 2006
New Revision: 50914
Modified:
python/branches/bcannon-sandboxing/Include/pymem.h
python/branches/bcannon-sandboxing/Modules/_elementtree.c
python/branches/bcannon-sandboxing/Modules/parsermodule.c
python/branches/bcannon-sandboxing/Objects/trackedmalloc.c
python/branches/bcannon-sandboxing/Objects/typeobject.c
python/branches/bcannon-sandboxing/Python/compile.c
python/branches/bcannon-sandboxing/Python/future.c
python/branches/bcannon-sandboxing/Python/pythonrun.c
Log:
Track PyMem_*() calls as unknown.
Modified: python/branches/bcannon-sandboxing/Include/pymem.h
==============================================================================
--- python/branches/bcannon-sandboxing/Include/pymem.h (original)
+++ python/branches/bcannon-sandboxing/Include/pymem.h Fri Jul 28 20:50:34 2006
@@ -61,6 +61,12 @@
#define PyMem_REALLOC PyObject_REALLOC
#define PyMem_FREE PyObject_FREE
+#elif define (Py_TRACK_MEMORY)
+
+#define PyMem_MALLOC(size) PyObject_T_MALLOC("", size)
+#define PyMem_REALLOC(size) PyObject_T_REALLOC("", size)
+#define PyMem_FREE(size) PyObject_T_FREE("", size)
+
#else /* ! PYMALLOC_DEBUG */
/* PyMem_MALLOC(0) means malloc(1). Some systems would return NULL
Modified: python/branches/bcannon-sandboxing/Modules/_elementtree.c
==============================================================================
--- python/branches/bcannon-sandboxing/Modules/_elementtree.c (original)
+++ python/branches/bcannon-sandboxing/Modules/_elementtree.c Fri Jul 28 20:50:34 2006
@@ -277,7 +277,7 @@
LOCAL(int)
element_new_extra(ElementObject* self, PyObject* attrib)
{
- self->extra = PyObject_Malloc(sizeof(ElementObjectExtra));
+ self->extra = PyObject_T_MALLOC("ElementTree", sizeof(ElementObjectExtra));
if (!self->extra)
return -1;
@@ -305,9 +305,9 @@
Py_DECREF(self->extra->children[i]);
if (self->extra->children != self->extra->_children)
- PyObject_Free(self->extra->children);
+ PyObject_T_FREE("ElementTree", self->extra->children);
- PyObject_Free(self->extra);
+ PyObject_T_FREE("ElementTree", self->extra);
}
LOCAL(PyObject*)
@@ -370,12 +370,12 @@
/* use Python 2.4's list growth strategy */
size = (size >> 3) + (size < 9 ? 3 : 6) + size;
if (self->extra->children != self->extra->_children) {
- children = PyObject_Realloc(self->extra->children,
+ children = PyObject_T_REALLOC("ElementTree", self->extra->children,
size * sizeof(PyObject*));
if (!children)
goto nomemory;
} else {
- children = PyObject_Malloc(size * sizeof(PyObject*));
+ children = PyObject_T_MALLOC("ElementTree", size * sizeof(PyObject*));
if (!children)
goto nomemory;
/* copy existing children from static area to malloc buffer */
Modified: python/branches/bcannon-sandboxing/Modules/parsermodule.c
==============================================================================
--- python/branches/bcannon-sandboxing/Modules/parsermodule.c (original)
+++ python/branches/bcannon-sandboxing/Modules/parsermodule.c Fri Jul 28 20:50:34 2006
@@ -701,7 +701,7 @@
}
}
len = PyString_GET_SIZE(temp) + 1;
- strn = (char *)PyObject_Malloc(len);
+ strn = (char *)PyObject_T_MALLOC("char", len);
if (strn != NULL)
(void) memcpy(strn, PyString_AS_STRING(temp), len);
Py_DECREF(temp);
@@ -723,7 +723,7 @@
return (node *) PyErr_NoMemory();
}
if (err == E_OVERFLOW) {
- PyObject_Free(strn);
+ PyObject_T_FREE("char", strn);
PyErr_SetString(PyExc_ValueError,
"unsupported number of child nodes");
return NULL;
@@ -787,7 +787,7 @@
if (res && encoding) {
Py_ssize_t len;
len = PyString_GET_SIZE(encoding) + 1;
- res->n_str = (char *)PyObject_Malloc(len);
+ res->n_str = (char *)PyObject_T_MALLOC("char", len);
if (res->n_str != NULL)
(void) memcpy(res->n_str, PyString_AS_STRING(encoding), len);
Py_DECREF(encoding);
Modified: python/branches/bcannon-sandboxing/Objects/trackedmalloc.c
==============================================================================
--- python/branches/bcannon-sandboxing/Objects/trackedmalloc.c (original)
+++ python/branches/bcannon-sandboxing/Objects/trackedmalloc.c Fri Jul 28 20:50:34 2006
@@ -18,7 +18,7 @@
* PyObject_MALLOC()/PyObject_REALLOC()/PyObject_FREE()
Change over to PyObject_T_*()
* PyObject_Malloc()/PyObject_Realloc()/PyObject_Free()
- XXX
+ Change over to PyObject_T_*()
* PyObject_GC_New()/PyObject_GC_NewVar()/PyObject_GC_Del()
Uses _PyObject_GC_TrackedMalloc()
* _PyObject_GC_Malloc()
@@ -26,9 +26,9 @@
* PyObject_GC_Resize()
Uses PyObject_T_MALLOC()
* PyMem_Malloc()/PyMem_Realloc()/PyMem_Free()
- XXX
+ Uses PyMem_MALLOC(), etc.
* PyMem_MALLOC()/PyMem_REALLOC()/PyMem_FREE()
- XXX
+ Change to PyObject_T_*("", size)
* malloc()/realloc()/free()
XXX
@@ -45,6 +45,7 @@
+ add proper Py_TRACK_MEMORY #ifdef protections.
+ Raise an error during compilation if required functionality (e.g. mallinfo())
not available for tracking memory, even if requested.
++ Completely convert ElementTree.
*/
unsigned long Py_ProcessMemUsage = 0;
Modified: python/branches/bcannon-sandboxing/Objects/typeobject.c
==============================================================================
--- python/branches/bcannon-sandboxing/Objects/typeobject.c (original)
+++ python/branches/bcannon-sandboxing/Objects/typeobject.c Fri Jul 28 20:50:34 2006
@@ -453,7 +453,7 @@
if (PyType_IS_GC(type))
obj = _PyObject_GC_TrackedMalloc(type->tp_name, size);
else
- obj = (PyObject *)PyObject_Malloc(size);
+ obj = (PyObject *)PyObject_T_MALLOC(type->tp_name, size);
if (obj == NULL)
return PyErr_NoMemory();
@@ -1892,7 +1892,7 @@
PyObject *doc = PyDict_GetItemString(dict, "__doc__");
if (doc != NULL && PyString_Check(doc)) {
const size_t n = (size_t)PyString_GET_SIZE(doc);
- char *tp_doc = (char *)PyObject_Malloc(n+1);
+ char *tp_doc = (char *)PyObject_T_MALLOC("char", n+1);
if (tp_doc == NULL) {
Py_DECREF(type);
return NULL;
@@ -2145,7 +2145,7 @@
/* A type's tp_doc is heap allocated, unlike the tp_doc slots
* of most other objects. It's okay to cast it to char *.
*/
- PyObject_Free((char *)type->tp_doc);
+ PyObject_T_FREE("char", (char *)type->tp_doc);
Py_XDECREF(et->ht_name);
Py_XDECREF(et->ht_slots);
type->ob_type->tp_free((PyObject *)type);
Modified: python/branches/bcannon-sandboxing/Python/compile.c
==============================================================================
--- python/branches/bcannon-sandboxing/Python/compile.c (original)
+++ python/branches/bcannon-sandboxing/Python/compile.c Fri Jul 28 20:50:34 2006
@@ -314,7 +314,7 @@
if (c->c_st)
PySymtable_Free(c->c_st);
if (c->c_future)
- PyObject_Free(c->c_future);
+ PyObject_T_FREE("compiler", c->c_future);
Py_DECREF(c->c_stack);
}
@@ -1066,9 +1066,9 @@
b = u->u_blocks;
while (b != NULL) {
if (b->b_instr)
- PyObject_Free((void *)b->b_instr);
+ PyObject_T_FREE("compiler", (void *)b->b_instr);
next = b->b_list;
- PyObject_Free((void *)b);
+ PyObject_T_FREE("compiler", (void *)b);
b = next;
}
Py_XDECREF(u->u_ste);
@@ -1079,7 +1079,7 @@
Py_XDECREF(u->u_freevars);
Py_XDECREF(u->u_cellvars);
Py_XDECREF(u->u_private);
- PyObject_Free(u);
+ PyObject_T_FREE("compiler", u);
}
static int
@@ -1088,7 +1088,7 @@
{
struct compiler_unit *u;
- u = (struct compiler_unit *)PyObject_Malloc(sizeof(
+ u = (struct compiler_unit *)PyObject_T_MALLOC("compiler", sizeof(
struct compiler_unit));
if (!u) {
PyErr_NoMemory();
@@ -1193,7 +1193,7 @@
struct compiler_unit *u;
u = c->u;
- b = (basicblock *)PyObject_Malloc(sizeof(basicblock));
+ b = (basicblock *)PyObject_T_MALLOC("compiler", sizeof(basicblock));
if (b == NULL) {
PyErr_NoMemory();
return NULL;
@@ -1245,7 +1245,7 @@
{
assert(b != NULL);
if (b->b_instr == NULL) {
- b->b_instr = (struct instr *)PyObject_Malloc(
+ b->b_instr = (struct instr *)PyObject_T_MALLOC("compiler",
sizeof(struct instr) * DEFAULT_BLOCK_SIZE);
if (b->b_instr == NULL) {
PyErr_NoMemory();
@@ -1264,7 +1264,7 @@
return -1;
}
b->b_ialloc <<= 1;
- b->b_instr = (struct instr *)PyObject_Realloc(
+ b->b_instr = (struct instr *)PyObject_T_REALLOC("compiler",
(void *)b->b_instr, newsize);
if (b->b_instr == NULL)
return -1;
@@ -4015,7 +4015,7 @@
a->a_lnotab = PyString_FromStringAndSize(NULL, DEFAULT_LNOTAB_SIZE);
if (!a->a_lnotab)
return 0;
- a->a_postorder = (basicblock **)PyObject_Malloc(
+ a->a_postorder = (basicblock **)PyObject_T_MALLOC("compiler",
sizeof(basicblock *) * nblocks);
if (!a->a_postorder) {
PyErr_NoMemory();
@@ -4030,7 +4030,7 @@
Py_XDECREF(a->a_bytecode);
Py_XDECREF(a->a_lnotab);
if (a->a_postorder)
- PyObject_Free(a->a_postorder);
+ PyObject_T_FREE("compiler", a->a_postorder);
}
/* Return the size of a basic block in bytes. */
Modified: python/branches/bcannon-sandboxing/Python/future.c
==============================================================================
--- python/branches/bcannon-sandboxing/Python/future.c (original)
+++ python/branches/bcannon-sandboxing/Python/future.c Fri Jul 28 20:50:34 2006
@@ -120,14 +120,15 @@
{
PyFutureFeatures *ff;
- ff = (PyFutureFeatures *)PyObject_Malloc(sizeof(PyFutureFeatures));
+ ff = (PyFutureFeatures *)PyObject_T_MALLOC("compiler",
+ sizeof(PyFutureFeatures));
if (ff == NULL)
return NULL;
ff->ff_features = 0;
ff->ff_lineno = -1;
if (!future_parse(ff, mod, filename)) {
- PyObject_Free(ff);
+ PyObject_T_FREE("compiler", ff);
return NULL;
}
return ff;
Modified: python/branches/bcannon-sandboxing/Python/pythonrun.c
==============================================================================
--- python/branches/bcannon-sandboxing/Python/pythonrun.c (original)
+++ python/branches/bcannon-sandboxing/Python/pythonrun.c Fri Jul 28 20:50:34 2006
@@ -1493,7 +1493,7 @@
v = Py_BuildValue("(ziiz)", err->filename,
err->lineno, err->offset, err->text);
if (err->text != NULL) {
- PyObject_Free(err->text);
+ PyObject_T_FREE("char", err->text);
err->text = NULL;
}
w = NULL;
From python-checkins at python.org Fri Jul 28 21:42:41 2006
From: python-checkins at python.org (thomas.heller)
Date: Fri, 28 Jul 2006 21:42:41 +0200 (CEST)
Subject: [Python-checkins] r50915 - python/trunk/Modules/_ctypes/callproc.c
Message-ID: <20060728194241.51E1F1E4002@bag.python.org>
Author: thomas.heller
Date: Fri Jul 28 21:42:40 2006
New Revision: 50915
Modified:
python/trunk/Modules/_ctypes/callproc.c
Log:
Remove a useless XXX comment.
Cosmetic changes to the code so that the #ifdef _UNICODE block
doesn't mess up Emacs' code formatting.
Modified: python/trunk/Modules/_ctypes/callproc.c
==============================================================================
--- python/trunk/Modules/_ctypes/callproc.c (original)
+++ python/trunk/Modules/_ctypes/callproc.c Fri Jul 28 21:42:40 2006
@@ -851,17 +851,17 @@
progid = NULL;
ProgIDFromCLSID(&guid, &progid);
-/* XXX Is COMError derived from WindowsError or not? */
text = FormatError(errcode);
+ obj = Py_BuildValue(
#ifdef _UNICODE
- obj = Py_BuildValue("iu(uuuiu)",
+ "iu(uuuiu)",
#else
- obj = Py_BuildValue("is(uuuiu)",
+ "is(uuuiu)",
#endif
- errcode,
- text,
- descr, source, helpfile, helpcontext,
- progid);
+ errcode,
+ text,
+ descr, source, helpfile, helpcontext,
+ progid);
if (obj) {
PyErr_SetObject(ComError, obj);
Py_DECREF(obj);
From python-checkins at python.org Fri Jul 28 23:12:09 2006
From: python-checkins at python.org (phillip.eby)
Date: Fri, 28 Jul 2006 23:12:09 +0200 (CEST)
Subject: [Python-checkins] r50916 - in python/trunk: Doc/lib/libimp.tex
Lib/pkgutil.py Misc/NEWS Python/import.c
Message-ID: <20060728211209.C8CAE1E4008@bag.python.org>
Author: phillip.eby
Date: Fri Jul 28 23:12:07 2006
New Revision: 50916
Modified:
python/trunk/Doc/lib/libimp.tex
python/trunk/Lib/pkgutil.py
python/trunk/Misc/NEWS
python/trunk/Python/import.c
Log:
Bug #1529871: The speed enhancement patch #921466 broke Python's compliance
with PEP 302. This was fixed by adding an ``imp.NullImporter`` type that is
used in ``sys.path_importer_cache`` to cache non-directory paths and avoid
excessive filesystem operations during imports.
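A minimal sketch of the caching behaviour this adds, assuming a trunk build with
this revision applied (the path entry and module name below are made up):

    import imp, sys

    # A sys.path entry that is neither an existing directory nor claimed by
    # any hook on sys.path_hooks -- e.g. a nonexistent file name.
    sys.path.insert(0, 'no-such-entry')
    try:
        import module_that_does_not_exist   # hypothetical name; forces a scan of sys.path
    except ImportError:
        pass

    cached = sys.path_importer_cache['no-such-entry']
    print isinstance(cached, imp.NullImporter)   # True: non-directory entries are cached
    print cached.find_module('anything')         # None: NullImporter never finds modules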
Modified: python/trunk/Doc/lib/libimp.tex
==============================================================================
--- python/trunk/Doc/lib/libimp.tex (original)
+++ python/trunk/Doc/lib/libimp.tex Fri Jul 28 23:12:07 2006
@@ -232,6 +232,24 @@
source file.
\end{funcdesc}
+\begin{classdesc}{NullImporter}{path_string}
+The \class{NullImporter} type is a \pep{302} import hook that handles
+non-directory path strings by failing to find any modules. Calling this
+type with an existing directory or empty string raises
+\exception{ImportError}. Otherwise, a \class{NullImporter} instance is
+returned.
+
+Python adds instances of this type to \code{sys.path_importer_cache} for
+any path entries that are not directories and are not handled by any other
+path hooks on \code{sys.path_hooks}. Instances have only one method:
+
+\begin{methoddesc}{find_module}{fullname \optional{, path}}
+This method always returns \code{None}, indicating that the requested
+module could not be found.
+\end{methoddesc}
+
+\versionadded{2.5}
+\end{classdesc}
\subsection{Examples}
\label{examples-imp}
@@ -257,7 +275,7 @@
# there's a problem we can't handle -- let the caller handle it.
fp, pathname, description = imp.find_module(name)
-
+
try:
return imp.load_module(name, fp, pathname, description)
finally:
Modified: python/trunk/Lib/pkgutil.py
==============================================================================
--- python/trunk/Lib/pkgutil.py (original)
+++ python/trunk/Lib/pkgutil.py Fri Jul 28 23:12:07 2006
@@ -381,9 +381,7 @@
importer = None
sys.path_importer_cache.setdefault(path_item, importer)
- # The boolean values are used for caching valid and invalid
- # file paths for the built-in import machinery
- if importer in (None, True, False):
+ if importer is None:
try:
importer = ImpImporter(path_item)
except ImportError:
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Fri Jul 28 23:12:07 2006
@@ -12,6 +12,11 @@
Core and builtins
-----------------
+- Bug #1529871: The speed enhancement patch #921466 broke Python's compliance
+ with PEP 302. This was fixed by adding an ``imp.NullImporter`` type that is
+ used in ``sys.path_importer_cache`` to cache non-directory paths and avoid
+ excessive filesystem operations during imports.
+
- Bug #1521947: When checking for overflow, ``PyOS_strtol()`` used some
operations on signed longs that are formally undefined by C.
Unfortunately, at least one compiler now cares about that, so complicated
@@ -106,10 +111,14 @@
Extension Modules
-----------------
+<<<<<<< .mine
+- Bug #1471938: Fix curses module build problem on Solaris 8; patch by
+=======
- The ``__reduce__()`` method of the new ``collections.defaultdict`` had
a memory leak, affecting pickles and deep copies.
- Bug #1471938: Fix curses module build problem on Solaris 8; patch by
+>>>>>>> .r50915
Paul Eggert.
- Patch #1448199: Release interpreter lock in _winreg.ConnectRegistry.
Modified: python/trunk/Python/import.c
==============================================================================
--- python/trunk/Python/import.c (original)
+++ python/trunk/Python/import.c Fri Jul 28 23:12:07 2006
@@ -98,6 +98,8 @@
};
#endif
+static PyTypeObject NullImporterType; /* Forward reference */
+
/* Initialize things */
void
@@ -155,6 +157,8 @@
/* adding sys.path_hooks and sys.path_importer_cache, setting up
zipimport */
+ if (PyType_Ready(&NullImporterType) < 0)
+ goto error;
if (Py_VerboseFlag)
PySys_WriteStderr("# installing zipimport hook\n");
@@ -180,9 +184,11 @@
if (err) {
error:
PyErr_Print();
- Py_FatalError("initializing sys.meta_path, sys.path_hooks or "
- "path_importer_cache failed");
+ Py_FatalError("initializing sys.meta_path, sys.path_hooks, "
+ "path_importer_cache, or NullImporter failed"
+ );
}
+
zimpimport = PyImport_ImportModule("zipimport");
if (zimpimport == NULL) {
PyErr_Clear(); /* No zip import module -- okay */
@@ -1058,9 +1064,18 @@
}
PyErr_Clear();
}
- if (importer == NULL)
- importer = Py_None;
- else if (importer != Py_None) {
+ if (importer == NULL) {
+ importer = PyObject_CallFunctionObjArgs(
+ (PyObject *)&NullImporterType, p, NULL
+ );
+ if (importer == NULL) {
+ if (PyErr_ExceptionMatches(PyExc_ImportError)) {
+ PyErr_Clear();
+ return Py_None;
+ }
+ }
+ }
+ if (importer != NULL) {
int err = PyDict_SetItem(path_importer_cache, p, importer);
Py_DECREF(importer);
if (err != 0)
@@ -1248,35 +1263,7 @@
return NULL;
}
/* Note: importer is a borrowed reference */
- if (importer == Py_False) {
- /* Cached as not being a valid dir. */
- Py_XDECREF(copy);
- continue;
- }
- else if (importer == Py_True) {
- /* Cached as being a valid dir, so just
- * continue below. */
- }
- else if (importer == Py_None) {
- /* No importer was found, so it has to be a file.
- * Check if the directory is valid.
- * Note that the empty string is a valid path, but
- * not stat'able, hence the check for len. */
-#ifdef HAVE_STAT
- if (len && stat(buf, &statbuf) != 0) {
- /* Directory does not exist. */
- PyDict_SetItem(path_importer_cache,
- v, Py_False);
- Py_XDECREF(copy);
- continue;
- } else {
- PyDict_SetItem(path_importer_cache,
- v, Py_True);
- }
-#endif
- }
- else {
- /* A real import hook importer was found. */
+ if (importer != Py_None) {
PyObject *loader;
loader = PyObject_CallMethod(importer,
"find_module",
@@ -2935,11 +2922,120 @@
return err;
}
+typedef struct {
+ PyObject_HEAD
+} NullImporter;
+
+static int
+NullImporter_init(NullImporter *self, PyObject *args, PyObject *kwds)
+{
+ char *path;
+
+ if (!_PyArg_NoKeywords("NullImporter()", kwds))
+ return -1;
+
+ if (!PyArg_ParseTuple(args, "s:NullImporter",
+ &path))
+ return -1;
+
+ if (strlen(path) == 0) {
+ PyErr_SetString(PyExc_ImportError, "empty pathname");
+ return -1;
+ } else {
+#ifndef RISCOS
+ struct stat statbuf;
+ int rv;
+
+ rv = stat(path, &statbuf);
+ if (rv == 0) {
+ /* it exists */
+ if (S_ISDIR(statbuf.st_mode)) {
+ /* it's a directory */
+ PyErr_SetString(PyExc_ImportError,
+ "existing directory");
+ return -1;
+ }
+ }
+#else
+ if (object_exists(path)) {
+ /* it exists */
+ if (isdir(path)) {
+ /* it's a directory */
+ PyErr_SetString(PyExc_ImportError,
+ "existing directory");
+ return -1;
+ }
+ }
+#endif
+ }
+ return 0;
+}
+
+static PyObject *
+NullImporter_find_module(NullImporter *self, PyObject *args)
+{
+ Py_RETURN_NONE;
+}
+
+static PyMethodDef NullImporter_methods[] = {
+ {"find_module", (PyCFunction)NullImporter_find_module, METH_VARARGS,
+ "Always return None"
+ },
+ {NULL} /* Sentinel */
+};
+
+
+static PyTypeObject NullImporterType = {
+ PyObject_HEAD_INIT(NULL)
+ 0, /*ob_size*/
+ "imp.NullImporter", /*tp_name*/
+ sizeof(NullImporter), /*tp_basicsize*/
+ 0, /*tp_itemsize*/
+ 0, /*tp_dealloc*/
+ 0, /*tp_print*/
+ 0, /*tp_getattr*/
+ 0, /*tp_setattr*/
+ 0, /*tp_compare*/
+ 0, /*tp_repr*/
+ 0, /*tp_as_number*/
+ 0, /*tp_as_sequence*/
+ 0, /*tp_as_mapping*/
+ 0, /*tp_hash */
+ 0, /*tp_call*/
+ 0, /*tp_str*/
+ 0, /*tp_getattro*/
+ 0, /*tp_setattro*/
+ 0, /*tp_as_buffer*/
+ Py_TPFLAGS_DEFAULT, /*tp_flags*/
+ "Null importer object", /* tp_doc */
+ 0, /* tp_traverse */
+ 0, /* tp_clear */
+ 0, /* tp_richcompare */
+ 0, /* tp_weaklistoffset */
+ 0, /* tp_iter */
+ 0, /* tp_iternext */
+ NullImporter_methods, /* tp_methods */
+ 0, /* tp_members */
+ 0, /* tp_getset */
+ 0, /* tp_base */
+ 0, /* tp_dict */
+ 0, /* tp_descr_get */
+ 0, /* tp_descr_set */
+ 0, /* tp_dictoffset */
+ (initproc)NullImporter_init, /* tp_init */
+ 0, /* tp_alloc */
+ PyType_GenericNew /* tp_new */
+};
+
+
PyMODINIT_FUNC
initimp(void)
{
PyObject *m, *d;
+ if (PyType_Ready(&NullImporterType) < 0)
+ goto failure;
+
m = Py_InitModule4("imp", imp_methods, doc_imp,
NULL, PYTHON_API_VERSION);
if (m == NULL)
@@ -2957,6 +3053,8 @@
if (setint(d, "PY_CODERESOURCE", PY_CODERESOURCE) < 0) goto failure;
if (setint(d, "IMP_HOOK", IMP_HOOK) < 0) goto failure;
+ Py_INCREF(&NullImporterType);
+ PyModule_AddObject(m, "NullImporter", (PyObject *)&NullImporterType);
failure:
;
}
From theller at python.net Fri Jul 28 23:20:57 2006
From: theller at python.net (Thomas Heller)
Date: Fri, 28 Jul 2006 23:20:57 +0200
Subject: [Python-checkins] r50916 - in python/trunk: Doc/lib/libimp.tex
Lib/pkgutil.py Misc/NEWS Python/import.c
In-Reply-To: <20060728211209.C8CAE1E4008@bag.python.org>
References: <20060728211209.C8CAE1E4008@bag.python.org>
Message-ID:
phillip.eby schrieb:
> --- python/trunk/Misc/NEWS (original)
> +++ python/trunk/Misc/NEWS Fri Jul 28 23:12:07 2006
> @@ -12,6 +12,11 @@
> Core and builtins
> -----------------
>
> +- Bug #1529871: The speed enhancement patch #921466 broke Python's compliance
> + with PEP 302. This was fixed by adding an ``imp.NullImporter`` type that is
> + used in ``sys.path_importer_cache`` to cache non-directory paths and avoid
> + excessive filesystem operations during imports.
> +
> - Bug #1521947: When checking for overflow, ``PyOS_strtol()`` used some
> operations on signed longs that are formally undefined by C.
> Unfortunately, at least one compiler now cares about that, so complicated
> @@ -106,10 +111,14 @@
> Extension Modules
> -----------------
>
> +<<<<<<< .mine
> +- Bug #1471938: Fix curses module build problem on Solaris 8; patch by
> +=======
> - The ``__reduce__()`` method of the new ``collections.defaultdict`` had
> a memory leak, affecting pickles and deep copies.
>
> - Bug #1471938: Fix curses module build problem on Solaris 8; patch by
> +>>>>>>> .r50915
> Paul Eggert.
>
> - Patch #1448199: Release interpreter lock in _winreg.ConnectRegistry.
>
This doesn't look correct. ;-)
Thomas
From python-checkins at python.org Fri Jul 28 23:31:55 2006
From: python-checkins at python.org (phillip.eby)
Date: Fri, 28 Jul 2006 23:31:55 +0200 (CEST)
Subject: [Python-checkins] r50917 - python/trunk/Misc/NEWS
Message-ID: <20060728213155.827E71E4002@bag.python.org>
Author: phillip.eby
Date: Fri Jul 28 23:31:54 2006
New Revision: 50917
Modified:
python/trunk/Misc/NEWS
Log:
Fix svn merge spew.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Fri Jul 28 23:31:54 2006
@@ -111,14 +111,10 @@
Extension Modules
-----------------
-<<<<<<< .mine
-- Bug #1471938: Fix curses module build problem on Solaris 8; patch by
-=======
- The ``__reduce__()`` method of the new ``collections.defaultdict`` had
a memory leak, affecting pickles and deep copies.
- Bug #1471938: Fix curses module build problem on Solaris 8; patch by
->>>>>>> .r50915
Paul Eggert.
- Patch #1448199: Release interpreter lock in _winreg.ConnectRegistry.
From python-checkins at python.org Fri Jul 28 23:43:25 2006
From: python-checkins at python.org (thomas.heller)
Date: Fri, 28 Jul 2006 23:43:25 +0200 (CEST)
Subject: [Python-checkins] r50918 - in python/trunk: Misc/NEWS
Modules/_ctypes/libffi/configure Modules/_ctypes/libffi/configure.ac
Message-ID: <20060728214325.1F7A71E4008@bag.python.org>
Author: thomas.heller
Date: Fri Jul 28 23:43:20 2006
New Revision: 50918
Modified:
python/trunk/Misc/NEWS
python/trunk/Modules/_ctypes/libffi/configure
python/trunk/Modules/_ctypes/libffi/configure.ac
Log:
Patch #1529514: More openbsd platforms for ctypes.
Regenerated Modules/_ctypes/libffi/configure with autoconf 2.59.
Approved by Neal.
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Fri Jul 28 23:43:20 2006
@@ -111,6 +111,9 @@
Extension Modules
-----------------
+- Patch #1529514: The _ctypes extension is now compiled on more
+ openbsd target platforms.
+
- The ``__reduce__()`` method of the new ``collections.defaultdict`` had
a memory leak, affecting pickles and deep copies.
Modified: python/trunk/Modules/_ctypes/libffi/configure
==============================================================================
--- python/trunk/Modules/_ctypes/libffi/configure (original)
+++ python/trunk/Modules/_ctypes/libffi/configure Fri Jul 28 23:43:20 2006
@@ -934,7 +934,7 @@
else
echo "$as_me: WARNING: no configuration information is in $ac_dir" >&2
fi
- cd $ac_popdir
+ cd "$ac_popdir"
done
fi
@@ -1973,8 +1973,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -2032,8 +2031,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -2149,8 +2147,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -2204,8 +2201,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -2250,8 +2246,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -2295,8 +2290,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -2623,8 +2617,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -2794,8 +2787,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -2862,8 +2854,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -3047,8 +3038,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -3111,8 +3101,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -3290,8 +3279,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -3408,8 +3396,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -3483,6 +3470,12 @@
TARGETDIR="unknown"
case "$host" in
+mips*-*-openbsd*) TARGET=MIPS; TARGETDIR=mips;;
+sparc-*-openbsd*) TARGET=SPARC; TARGETDIR=sparc;;
+sparc64-*-openbsd*) TARGET=SPARC; TARGETDIR=sparc;;
+alpha*-*-openbsd*) TARGET=ALPHA; TARGETDIR=alpha;;
+m68k-*-openbsd*) TARGET=M68K; TARGETDIR=m68k;;
+powerpc-*-openbsd*) TARGET=POWERPC; TARGETDIR=powerpc;;
i*86-*-darwin*) TARGET=X86_DARWIN; TARGETDIR=x86;;
i*86-*-linux*) TARGET=X86; TARGETDIR=x86;;
i*86-*-gnu*) TARGET=X86; TARGETDIR=x86;;
@@ -3575,8 +3568,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -3777,8 +3769,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -3841,8 +3832,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -3923,8 +3913,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -4065,8 +4054,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -4202,8 +4190,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -4265,8 +4252,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -4306,8 +4292,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -4363,8 +4348,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -4404,8 +4388,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -4469,8 +4452,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -4501,10 +4483,8 @@
esac
else
if test "$cross_compiling" = yes; then
- { { echo "$as_me:$LINENO: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&5
-echo "$as_me: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&2;}
+ { { echo "$as_me:$LINENO: error: internal error: not reached in cross-compile" >&5
+echo "$as_me: error: internal error: not reached in cross-compile" >&2;}
{ (exit 1); exit 1; }; }
else
cat >conftest.$ac_ext <<_ACEOF
@@ -4616,8 +4596,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -4679,8 +4658,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -4720,8 +4698,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -4777,8 +4754,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -4818,8 +4794,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -4883,8 +4858,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -4915,10 +4889,8 @@
esac
else
if test "$cross_compiling" = yes; then
- { { echo "$as_me:$LINENO: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&5
-echo "$as_me: error: cannot run test program while cross compiling
-See \`config.log' for more details." >&2;}
+ { { echo "$as_me:$LINENO: error: internal error: not reached in cross-compile" >&5
+echo "$as_me: error: internal error: not reached in cross-compile" >&2;}
{ (exit 1); exit 1; }; }
else
cat >conftest.$ac_ext <<_ACEOF
@@ -5048,8 +5020,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -5091,8 +5062,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -5149,8 +5119,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -5282,8 +5251,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -5349,8 +5317,7 @@
cat conftest.err >&5
echo "$as_me:$LINENO: \$? = $ac_status" >&5
(exit $ac_status); } &&
- { ac_try='test -z "$ac_c_werror_flag"
- || test ! -s conftest.err'
+ { ac_try='test -z "$ac_c_werror_flag" || test ! -s conftest.err'
{ (eval echo "$as_me:$LINENO: \"$ac_try\"") >&5
(eval $ac_try) 2>&5
ac_status=$?
@@ -6309,11 +6276,6 @@
- if test x"$ac_file" != x-; then
- { echo "$as_me:$LINENO: creating $ac_file" >&5
-echo "$as_me: creating $ac_file" >&6;}
- rm -f "$ac_file"
- fi
# Let's still pretend it is `configure' which instantiates (i.e., don't
# use $as_me), people would be surprised to read:
# /* config.h. Generated by config.status. */
@@ -6352,6 +6314,12 @@
fi;;
esac
done` || { (exit 1); exit 1; }
+
+ if test x"$ac_file" != x-; then
+ { echo "$as_me:$LINENO: creating $ac_file" >&5
+echo "$as_me: creating $ac_file" >&6;}
+ rm -f "$ac_file"
+ fi
_ACEOF
cat >>$CONFIG_STATUS <<_ACEOF
sed "$ac_vpsub
Modified: python/trunk/Modules/_ctypes/libffi/configure.ac
==============================================================================
--- python/trunk/Modules/_ctypes/libffi/configure.ac (original)
+++ python/trunk/Modules/_ctypes/libffi/configure.ac Fri Jul 28 23:43:20 2006
@@ -21,6 +21,12 @@
TARGETDIR="unknown"
case "$host" in
+mips*-*-openbsd*) TARGET=MIPS; TARGETDIR=mips;;
+sparc-*-openbsd*) TARGET=SPARC; TARGETDIR=sparc;;
+sparc64-*-openbsd*) TARGET=SPARC; TARGETDIR=sparc;;
+alpha*-*-openbsd*) TARGET=ALPHA; TARGETDIR=alpha;;
+m68k-*-openbsd*) TARGET=M68K; TARGETDIR=m68k;;
+powerpc-*-openbsd*) TARGET=POWERPC; TARGETDIR=powerpc;;
i*86-*-darwin*) TARGET=X86_DARWIN; TARGETDIR=x86;;
i*86-*-linux*) TARGET=X86; TARGETDIR=x86;;
i*86-*-gnu*) TARGET=X86; TARGETDIR=x86;;
From python-checkins at python.org Sat Jul 29 01:43:46 2006
From: python-checkins at python.org (brett.cannon)
Date: Sat, 29 Jul 2006 01:43:46 +0200 (CEST)
Subject: [Python-checkins] r50919 -
python/branches/bcannon-sandboxing/Objects/stringobject.c
Message-ID: <20060728234346.B007C1E4002@bag.python.org>
Author: brett.cannon
Date: Sat Jul 29 01:43:45 2006
New Revision: 50919
Modified:
python/branches/bcannon-sandboxing/Objects/stringobject.c
Log:
Minor line break.
Modified: python/branches/bcannon-sandboxing/Objects/stringobject.c
==============================================================================
--- python/branches/bcannon-sandboxing/Objects/stringobject.c (original)
+++ python/branches/bcannon-sandboxing/Objects/stringobject.c Sat Jul 29 01:43:45 2006
@@ -127,7 +127,8 @@
}
/* Inline PyObject_NewVar */
- op = (PyStringObject *)PyObject_T_MALLOC("str", sizeof(PyStringObject) + size);
+ op = (PyStringObject *)PyObject_T_MALLOC("str",
+ sizeof(PyStringObject) + size);
if (op == NULL)
return PyErr_NoMemory();
PyObject_INIT_VAR(op, &PyString_Type, size);
From python-checkins at python.org Sat Jul 29 01:45:00 2006
From: python-checkins at python.org (brett.cannon)
Date: Sat, 29 Jul 2006 01:45:00 +0200 (CEST)
Subject: [Python-checkins] r50920 - in python/branches/bcannon-sandboxing:
Include/objimpl.h Include/pymem.h Objects/trackedmalloc.c
Python/sysmodule.c
Message-ID: <20060728234500.3731B1E4002@bag.python.org>
Author: brett.cannon
Date: Sat Jul 29 01:44:59 2006
New Revision: 50920
Modified:
python/branches/bcannon-sandboxing/Include/objimpl.h
python/branches/bcannon-sandboxing/Include/pymem.h
python/branches/bcannon-sandboxing/Objects/trackedmalloc.c
python/branches/bcannon-sandboxing/Python/sysmodule.c
Log:
Add tracking at the type level.
Uses a linked list to store types, since that is easy to do in pure C and thus
minimizes the bootstrapping issues of using Python objects when tracking memory
allocation.  Plus, the number of types created during the lifetime of a Python
process is actually rather small.
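A hypothetical usage sketch of the hook wired up in the diff below (names taken
from the patch; only meaningful on a bcannon-sandboxing build with memory
tracking enabled):

    import sys

    # sys.memoryusage() returns a dict mapping each tracked tag/type name
    # (e.g. "str", "compiler", "" for unknown allocations) to the number of
    # bytes currently attributed to it.
    usage = sys.memoryusage()
    for tag, nbytes in sorted(usage.items()):
        print '%-20s %d' % (tag, nbytes)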
Modified: python/branches/bcannon-sandboxing/Include/objimpl.h
==============================================================================
--- python/branches/bcannon-sandboxing/Include/objimpl.h (original)
+++ python/branches/bcannon-sandboxing/Include/objimpl.h Sat Jul 29 01:44:59 2006
@@ -95,6 +95,7 @@
the raw memory.
*/
PyAPI_DATA(unsigned long) Py_ProcessMemUsage;
+PyAPI_FUNC(PyObject *) Py_MemoryUsage(PyObject *, PyObject *);
PyAPI_FUNC(int) PyMalloc_ManagesMemory(void *);
PyAPI_FUNC(size_t) PyMalloc_AllocatedSize(void *);
PyAPI_FUNC(void *) PyObject_Malloc(size_t);
Modified: python/branches/bcannon-sandboxing/Include/pymem.h
==============================================================================
--- python/branches/bcannon-sandboxing/Include/pymem.h (original)
+++ python/branches/bcannon-sandboxing/Include/pymem.h Sat Jul 29 01:44:59 2006
@@ -61,12 +61,6 @@
#define PyMem_REALLOC PyObject_REALLOC
#define PyMem_FREE PyObject_FREE
-#elif define (Py_TRACK_MEMORY)
-
-#define PyMem_MALLOC(size) PyObject_T_MALLOC("", size)
-#define PyMem_REALLOC(size) PyObject_T_REALLOC("", size)
-#define PyMem_FREE(size) PyObject_T_FREE("", size)
-
#else /* ! PYMALLOC_DEBUG */
/* PyMem_MALLOC(0) means malloc(1). Some systems would return NULL
Modified: python/branches/bcannon-sandboxing/Objects/trackedmalloc.c
==============================================================================
--- python/branches/bcannon-sandboxing/Objects/trackedmalloc.c (original)
+++ python/branches/bcannon-sandboxing/Objects/trackedmalloc.c Sat Jul 29 01:44:59 2006
@@ -26,9 +26,9 @@
* PyObject_GC_Resize()
Uses PyObject_T_MALLOC()
* PyMem_Malloc()/PyMem_Realloc()/PyMem_Free()
- Uses PyMem_MALLOC(), etc.
+ XXX
* PyMem_MALLOC()/PyMem_REALLOC()/PyMem_FREE()
- Change to PyObject_T_*("", size)
+ XXX
* malloc()/realloc()/free()
XXX
@@ -52,6 +52,72 @@
static const char *UNKNOWN_WHAT = "";
+struct mem_item {
+ struct mem_item *next;
+ const char *type;
+ unsigned long using;
+};
+
+static struct mem_item mem_sentinel = {NULL, NULL, 0};
+static struct mem_item *mem_head = &mem_sentinel;
+static Py_ssize_t mem_item_count = 0;
+
+
+PyObject *
+Py_MemoryUsage(PyObject *self, PyObject *ignore)
+{
+ struct mem_item *cur_mem = mem_head;
+ PyObject *mem_dict = PyDict_New();
+ Py_ssize_t x = 0;
+ int int_result = 0;
+
+ if (!mem_dict)
+ return NULL;
+
+ for (x=0; x < mem_item_count; x+=1) {
+ cur_mem = cur_mem->next;
+ PyObject *long_obj = PyLong_FromUnsignedLong(cur_mem->using);
+ if (!long_obj)
+ return NULL;
+ int_result = PyDict_SetItemString(mem_dict, cur_mem->type, long_obj);
+ Py_DECREF(long_obj);
+
+ if (int_result < 0)
+ return NULL;
+ }
+
+ return mem_dict;
+}
+
+/* XXX remove entries where memory usage is zero? */
+static struct mem_item *
+find_mem_entry(const char *what)
+{
+ struct mem_item *cur_mem = mem_head;
+
+ what = what ? what : UNKNOWN_WHAT;
+
+ while (cur_mem->next) {
+ cur_mem = cur_mem->next;
+
+ if (strcmp(what, cur_mem->type) == 0)
+ return cur_mem;
+ }
+
+ cur_mem->next = malloc(sizeof(struct mem_item));
+ cur_mem = cur_mem->next;
+
+ if (!cur_mem)
+ return NULL;
+
+ mem_item_count += 1;
+
+ cur_mem->next = NULL;
+ cur_mem->type = what; /* XXX memcpy? */
+ cur_mem->using = 0;
+
+ return cur_mem;
+}
/*
Track an anonymous chunk of memory.
@@ -59,8 +125,13 @@
int
PyObject_TrackMemory(const char *what, size_t nbytes)
{
- what = what ? what : UNKNOWN_WHAT;
+ struct mem_item *mem_entry = find_mem_entry(what);
+ if (!mem_entry)
+ return 0;
+
+ /* XXX check for overflow. */
+ mem_entry->using += nbytes;
Py_ProcessMemUsage += nbytes;
return 1;
@@ -72,8 +143,13 @@
int
PyObject_UntrackMemory(const char *what, size_t nbytes)
{
- what = what ? what : UNKNOWN_WHAT;
+ struct mem_item *mem_entry = find_mem_entry(what);
+
+ if (!mem_entry)
+ return 0;
+ /* XXX check for hitting < 0. */
+ mem_entry->using -= nbytes;
Py_ProcessMemUsage -= nbytes;
return 1;
Modified: python/branches/bcannon-sandboxing/Python/sysmodule.c
==============================================================================
--- python/branches/bcannon-sandboxing/Python/sysmodule.c (original)
+++ python/branches/bcannon-sandboxing/Python/sysmodule.c Sat Jul 29 01:44:59 2006
@@ -782,6 +782,7 @@
#endif
{"settrace", sys_settrace, METH_O, settrace_doc},
{"call_tracing", sys_call_tracing, METH_VARARGS, call_tracing_doc},
+ {"memoryusage", Py_MemoryUsage, METH_NOARGS, "XXX"},
{NULL, NULL} /* sentinel */
};
From python-checkins at python.org Sat Jul 29 01:58:15 2006
From: python-checkins at python.org (phillip.eby)
Date: Sat, 29 Jul 2006 01:58:15 +0200 (CEST)
Subject: [Python-checkins] r50921 - sandbox/trunk/setuptools/_pkgutil.py
Message-ID: <20060728235815.1ACD11E4008@bag.python.org>
Author: phillip.eby
Date: Sat Jul 29 01:58:14 2006
New Revision: 50921
Modified:
sandbox/trunk/setuptools/_pkgutil.py
Log:
Sync pkgutil from trunk
Modified: sandbox/trunk/setuptools/_pkgutil.py
==============================================================================
--- sandbox/trunk/setuptools/_pkgutil.py (original)
+++ sandbox/trunk/setuptools/_pkgutil.py Sat Jul 29 01:58:14 2006
@@ -31,7 +31,7 @@
def simplegeneric(func):
"""Make a trivial single-dispatch generic function"""
registry = {}
- def wrapper(*args,**kw):
+ def wrapper(*args, **kw):
ob = args[0]
try:
cls = ob.__class__
@@ -41,18 +41,19 @@
mro = cls.__mro__
except AttributeError:
try:
- class cls(cls,object): pass
+ class cls(cls, object):
+ pass
mro = cls.__mro__[1:]
except TypeError:
mro = object, # must be an ExtensionClass or some such :(
for t in mro:
if t in registry:
- return registry[t](*args,**kw)
+ return registry[t](*args, **kw)
else:
- return func(*args,**kw)
+ return func(*args, **kw)
try:
wrapper.__name__ = func.__name__
- except (TypeError,AttributeError):
+ except (TypeError, AttributeError):
pass # Python 2.3 doesn't allow functions to be renamed
def register(typ, func=None):
@@ -68,10 +69,37 @@
def walk_packages(path=None, prefix='', onerror=None):
- """Yield submodule names+loaders recursively, for path or sys.path"""
+ """Yields (module_loader, name, ispkg) for all modules recursively
+ on path, or, if path is None, all accessible modules.
- def seen(p,m={}):
- if p in m: return True
+ 'path' should be either None or a list of paths to look for
+ modules in.
+
+ 'prefix' is a string to output on the front of every module name
+ on output.
+
+ Note that this function must import all *packages* (NOT all
+ modules!) on the given path, in order to access the __path__
+ attribute to find submodules.
+
+ 'onerror' is a function which gets called with one argument (the
+ name of the package which was being imported) if any exception
+ occurs while trying to import a package. If no onerror function is
+ supplied, ImportErrors are caught and ignored, while all other
+ exceptions are propagated, terminating the search.
+
+ Examples:
+
+ # list all modules python can access
+ walk_packages()
+
+ # list all submodules of ctypes
+ walk_packages(ctypes.__path__, ctypes.__name__+'.')
+ """
+
+ def seen(p, m={}):
+ if p in m:
+ return True
m[p] = True
for importer, name, ispkg in iter_modules(path, prefix):
@@ -82,19 +110,33 @@
__import__(name)
except ImportError:
if onerror is not None:
- onerror()
+ onerror(name)
+ except Exception:
+ if onerror is not None:
+ onerror(name)
+ else:
+ raise
else:
path = getattr(sys.modules[name], '__path__', None) or []
# don't traverse path items we've seen before
path = [p for p in path if not seen(p)]
- for item in walk_packages(path, name+'.'):
+ for item in walk_packages(path, name+'.', onerror):
yield item
def iter_modules(path=None, prefix=''):
- """Yield submodule names+loaders for path or sys.path"""
+ """Yields (module_loader, name, ispkg) for all submodules on path,
+ or, if path is None, all top-level modules on sys.path.
+
+ 'path' should be either None or a list of paths to look for
+ modules in.
+
+ 'prefix' is a string to output on the front of every module name
+ on output.
+ """
+
if path is None:
importers = iter_importers()
else:
@@ -110,7 +152,7 @@
#@simplegeneric
def iter_importer_modules(importer, prefix=''):
- if not hasattr(importer,'iter_modules'):
+ if not hasattr(importer, 'iter_modules'):
return []
return importer.iter_modules(prefix)
@@ -206,6 +248,7 @@
def _reopen(self):
if self.file and self.file.closed:
+ mod_type = self.etc[2]
if mod_type==imp.PY_SOURCE:
self.file = open(self.filename, 'rU')
elif mod_type in (imp.PY_COMPILED, imp.C_EXTENSION):
@@ -336,13 +379,13 @@
pass
else:
importer = None
- sys.path_importer_cache.setdefault(path_item,importer)
+ sys.path_importer_cache.setdefault(path_item, importer)
if importer is None:
try:
importer = ImpImporter(path_item)
except ImportError:
- pass
+ importer = None
return importer
@@ -377,7 +420,7 @@
pkg = '.'.join(fullname.split('.')[:-1])
if pkg not in sys.modules:
__import__(pkg)
- path = getattr(sys.modules[pkg],'__path__',None) or []
+ path = getattr(sys.modules[pkg], '__path__', None) or []
else:
for importer in sys.meta_path:
yield importer
@@ -404,7 +447,7 @@
module_or_name = sys.modules[module_or_name]
if isinstance(module_or_name, ModuleType):
module = module_or_name
- loader = getattr(module,'__loader__',None)
+ loader = getattr(module, '__loader__', None)
if loader is not None:
return loader
fullname = module.__name__
From nnorwitz at gmail.com Sat Jul 29 02:53:52 2006
From: nnorwitz at gmail.com (Neal Norwitz)
Date: Fri, 28 Jul 2006 17:53:52 -0700
Subject: [Python-checkins] r50916 - in python/trunk: Doc/lib/libimp.tex
Lib/pkgutil.py Misc/NEWS Python/import.c
In-Reply-To: <20060728211209.C8CAE1E4008@bag.python.org>
References: <20060728211209.C8CAE1E4008@bag.python.org>
Message-ID:
On 7/28/06, phillip.eby wrote:
> Modified: python/trunk/Python/import.c
> ==============================================================================
> --- python/trunk/Python/import.c (original)
> +++ python/trunk/Python/import.c Fri Jul 28 23:12:07 2006
> @@ -155,6 +157,8 @@
>
> /* adding sys.path_hooks and sys.path_importer_cache, setting up
> zipimport */
> + if (PyType_Ready(&NullImporterType) < 0)
> + goto error;
Isn't the PyType_Ready above redundant with the one below?
> PyMODINIT_FUNC
> initimp(void)
> {
> PyObject *m, *d;
>
> + if (PyType_Ready(&NullImporterType) < 0)
> + goto failure;
> +
> m = Py_InitModule4("imp", imp_methods, doc_imp,
> NULL, PYTHON_API_VERSION);
> if (m == NULL)
From g.brandl at gmx.net Sat Jul 29 09:39:50 2006
From: g.brandl at gmx.net (Georg Brandl)
Date: Sat, 29 Jul 2006 09:39:50 +0200
Subject: [Python-checkins] r50916 - in python/trunk: Doc/lib/libimp.tex
Lib/pkgutil.py Misc/NEWS Python/import.c
In-Reply-To: <20060728211209.C8CAE1E4008@bag.python.org>
References: <20060728211209.C8CAE1E4008@bag.python.org>
Message-ID:
phillip.eby wrote:
> Author: phillip.eby
> Date: Fri Jul 28 23:12:07 2006
> New Revision: 50916
>
> Modified:
> python/trunk/Doc/lib/libimp.tex
> python/trunk/Lib/pkgutil.py
> python/trunk/Misc/NEWS
> python/trunk/Python/import.c
> Log:
> Bug #1529871: The speed enhancement patch #921466 broke Python's compliance
> with PEP 302. This was fixed by adding an ``imp.NullImporter`` type that is
> used in ``sys.path_importer_cache`` to cache non-directory paths and avoid
> excessive filesystem operations during imports.
Thanks for the work, Phillip!
Georg
From python-checkins at python.org Sat Jul 29 10:51:22 2006
From: python-checkins at python.org (georg.brandl)
Date: Sat, 29 Jul 2006 10:51:22 +0200 (CEST)
Subject: [Python-checkins] r50922 - python/trunk/Doc/lib/libnew.tex
Message-ID: <20060729085122.7C5E41E4002@bag.python.org>
Author: georg.brandl
Date: Sat Jul 29 10:51:21 2006
New Revision: 50922
Modified:
python/trunk/Doc/lib/libnew.tex
Log:
Bug #835255: The "closure" argument to new.function() is now documented.
Modified: python/trunk/Doc/lib/libnew.tex
==============================================================================
--- python/trunk/Doc/lib/libnew.tex (original)
+++ python/trunk/Doc/lib/libnew.tex Sat Jul 29 10:51:21 2006
@@ -30,13 +30,16 @@
callable.
\end{funcdesc}
-\begin{funcdesc}{function}{code, globals\optional{, name\optional{, argdefs}}}
+\begin{funcdesc}{function}{code, globals\optional{, name\optional{,
+ argdefs\optional{, closure}}}}
Returns a (Python) function with the given code and globals. If
\var{name} is given, it must be a string or \code{None}. If it is a
string, the function will have the given name, otherwise the function
name will be taken from \code{\var{code}.co_name}. If
\var{argdefs} is given, it must be a tuple and will be used to
-determine the default values of parameters.
+determine the default values of parameters. If \var{closure} is given,
+it must be \code{None} or a tuple of cell objects containing objects
+to bind to the names in \code{\var{code}.co_freevars}.
\end{funcdesc}
\begin{funcdesc}{code}{argcount, nlocals, stacksize, flags, codestring,
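A small example of the newly documented argument, offered as a sketch only
(Python 2.x; the helper names are made up):

    import new

    def outer():
        x = 42
        def inner():
            return x
        return inner

    f = outer()
    # Rebuild an equivalent function, reusing f's closure cells so the free
    # variable 'x' listed in f.func_code.co_freevars stays bound to 42.
    g = new.function(f.func_code, globals(), 'g', None, f.func_closure)
    print g()    # expected output: 42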
From python-checkins at python.org Sat Jul 29 10:51:25 2006
From: python-checkins at python.org (georg.brandl)
Date: Sat, 29 Jul 2006 10:51:25 +0200 (CEST)
Subject: [Python-checkins] r50923 -
python/branches/release24-maint/Doc/lib/libnew.tex
Message-ID: <20060729085125.AFB381E400E@bag.python.org>
Author: georg.brandl
Date: Sat Jul 29 10:51:25 2006
New Revision: 50923
Modified:
python/branches/release24-maint/Doc/lib/libnew.tex
Log:
Bug #835255: The "closure" argument to new.function() is now documented.
(backport from rev. 50922)
Modified: python/branches/release24-maint/Doc/lib/libnew.tex
==============================================================================
--- python/branches/release24-maint/Doc/lib/libnew.tex (original)
+++ python/branches/release24-maint/Doc/lib/libnew.tex Sat Jul 29 10:51:25 2006
@@ -30,13 +30,16 @@
callable.
\end{funcdesc}
-\begin{funcdesc}{function}{code, globals\optional{, name\optional{, argdefs}}}
+\begin{funcdesc}{function}{code, globals\optional{, name\optional{,
+ argdefs\optional{, closure}}}}
Returns a (Python) function with the given code and globals. If
\var{name} is given, it must be a string or \code{None}. If it is a
string, the function will have the given name, otherwise the function
name will be taken from \code{\var{code}.co_name}. If
\var{argdefs} is given, it must be a tuple and will be used to
-determine the default values of parameters.
+determine the default values of parameters. If \var{closure} is given,
+it must be \code{None} or a tuple of cell objects containing objects
+to bind to the names in \code{\var{code}.co_freevars}.
\end{funcdesc}
\begin{funcdesc}{code}{argcount, nlocals, stacksize, flags, codestring,
From python-checkins at python.org Sat Jul 29 11:33:28 2006
From: python-checkins at python.org (georg.brandl)
Date: Sat, 29 Jul 2006 11:33:28 +0200 (CEST)
Subject: [Python-checkins] r50924 - in python/trunk:
Lib/compiler/transformer.py Lib/test/test_compiler.py Misc/NEWS
Message-ID: <20060729093328.19CF01E4002@bag.python.org>
Author: georg.brandl
Date: Sat Jul 29 11:33:26 2006
New Revision: 50924
Modified:
python/trunk/Lib/compiler/transformer.py
python/trunk/Lib/test/test_compiler.py
python/trunk/Misc/NEWS
Log:
Bug #1441397: The compiler module now recognizes module and function
docstrings correctly as it did in Python 2.4.
Modified: python/trunk/Lib/compiler/transformer.py
==============================================================================
--- python/trunk/Lib/compiler/transformer.py (original)
+++ python/trunk/Lib/compiler/transformer.py Sat Jul 29 11:33:26 2006
@@ -1382,6 +1382,7 @@
symbol.testlist,
symbol.testlist_safe,
symbol.test,
+ symbol.or_test,
symbol.and_test,
symbol.not_test,
symbol.comparison,
Modified: python/trunk/Lib/test/test_compiler.py
==============================================================================
--- python/trunk/Lib/test/test_compiler.py (original)
+++ python/trunk/Lib/test/test_compiler.py Sat Jul 29 11:33:26 2006
@@ -68,6 +68,14 @@
def testDefaultArgs(self):
self.assertRaises(SyntaxError, compiler.parse, "def foo(a=1, b): pass")
+ def testDocstrings(self):
+ c = compiler.compile('"doc"', '', 'exec')
+ self.assert_('__doc__' in c.co_names)
+ c = compiler.compile('def f():\n "doc"', '', 'exec')
+ g = {}
+ exec c in g
+ self.assertEquals(g['f'].__doc__, "doc")
+
def testLineNo(self):
# Test that all nodes except Module have a correct lineno attribute.
filename = __file__
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Sat Jul 29 11:33:26 2006
@@ -52,6 +52,9 @@
Library
-------
+- Bug #1441397: The compiler module now recognizes module and function
+ docstrings correctly as it did in Python 2.4.
+
- Bug #1529297: The rewrite of doctest for Python 2.4 unintentionally
lost that tests are sorted by name before being run. This rarely
matters for well-written tests, but can create baffling symptoms if
From buildbot at python.org Sat Jul 29 11:54:30 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sat, 29 Jul 2006 09:54:30 +0000
Subject: [Python-checkins] buildbot warnings in x86 W2k trunk
Message-ID: <20060729095430.D38B41E4002@bag.python.org>
The Buildbot has detected a new failure of x86 W2k trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520W2k%2520trunk/builds/1316
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Sat Jul 29 11:59:42 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sat, 29 Jul 2006 09:59:42 +0000
Subject: [Python-checkins] buildbot warnings in amd64 gentoo trunk
Message-ID: <20060729095942.9BB171E4002@bag.python.org>
The Buildbot has detected a new failure of amd64 gentoo trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/amd64%2520gentoo%2520trunk/builds/1325
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Sat Jul 29 11:59:42 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sat, 29 Jul 2006 09:59:42 +0000
Subject: [Python-checkins] buildbot warnings in x86 gentoo trunk
Message-ID: <20060729095942.C58251E4008@bag.python.org>
The Buildbot has detected a new failure of x86 gentoo trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520gentoo%2520trunk/builds/1409
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Sat Jul 29 12:15:06 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sat, 29 Jul 2006 10:15:06 +0000
Subject: [Python-checkins] buildbot warnings in g4 osx.4 trunk
Message-ID: <20060729101506.43BC11E4008@bag.python.org>
The Buildbot has detected a new failure of g4 osx.4 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/g4%2520osx.4%2520trunk/builds/1257
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Sat Jul 29 12:16:35 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sat, 29 Jul 2006 10:16:35 +0000
Subject: [Python-checkins] buildbot warnings in x86 XP trunk
Message-ID: <20060729101635.6A33B1E4008@bag.python.org>
The Buildbot has detected a new failure of x86 XP trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520XP%2520trunk/builds/1269
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Sat Jul 29 12:25:47 2006
From: python-checkins at python.org (georg.brandl)
Date: Sat, 29 Jul 2006 12:25:47 +0200 (CEST)
Subject: [Python-checkins] r50925 - python/trunk/Lib/compiler/future.py
Message-ID: <20060729102547.83B851E4008@bag.python.org>
Author: georg.brandl
Date: Sat Jul 29 12:25:46 2006
New Revision: 50925
Modified:
python/trunk/Lib/compiler/future.py
Log:
Revert rev 42617, it was introduced to work around bug #1441397.
test_compiler now passes again.
Modified: python/trunk/Lib/compiler/future.py
==============================================================================
--- python/trunk/Lib/compiler/future.py (original)
+++ python/trunk/Lib/compiler/future.py Sat Jul 29 12:25:46 2006
@@ -23,14 +23,7 @@
def visitModule(self, node):
stmt = node.node
- found_docstring = False
for s in stmt.nodes:
- # Skip over docstrings
- if not found_docstring and isinstance(s, ast.Discard) \
- and isinstance(s.expr, ast.Const) \
- and isinstance(s.expr.value, str):
- found_docstring = True
- continue
if not self.check_stmt(s):
break
From buildbot at python.org Sat Jul 29 12:40:17 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sat, 29 Jul 2006 10:40:17 +0000
Subject: [Python-checkins] buildbot warnings in alpha Tru64 5.1 trunk
Message-ID: <20060729104018.1091E1E4009@bag.python.org>
The Buildbot has detected a new failure of alpha Tru64 5.1 trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/alpha%2520Tru64%25205.1%2520trunk/builds/992
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From buildbot at python.org Sat Jul 29 12:53:47 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sat, 29 Jul 2006 10:53:47 +0000
Subject: [Python-checkins] buildbot warnings in x86 Ubuntu dapper (icc) trunk
Message-ID: <20060729105347.492281E4008@bag.python.org>
The Buildbot has detected a new failure of x86 Ubuntu dapper (icc) trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520Ubuntu%2520dapper%2520%2528icc%2529%2520trunk/builds/835
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Sat Jul 29 15:22:49 2006
From: python-checkins at python.org (fred.drake)
Date: Sat, 29 Jul 2006 15:22:49 +0200 (CEST)
Subject: [Python-checkins] r50926 - python/trunk/Misc/NEWS
Message-ID: <20060729132249.8B74B1E4008@bag.python.org>
Author: fred.drake
Date: Sat Jul 29 15:22:49 2006
New Revision: 50926
Modified:
python/trunk/Misc/NEWS
Log:
update target version number
Modified: python/trunk/Misc/NEWS
==============================================================================
--- python/trunk/Misc/NEWS (original)
+++ python/trunk/Misc/NEWS Sat Jul 29 15:22:49 2006
@@ -4,8 +4,8 @@
(editors: check NEWS.help for information about editing NEWS using ReST.)
-What's New in Python 2.5 release candidate 1?
-=============================================
+What's New in Python 2.5 beta 3?
+================================
*Release date: XX-AUG-2006*
From buildbot at python.org Sat Jul 29 15:36:29 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sat, 29 Jul 2006 13:36:29 +0000
Subject: [Python-checkins] buildbot warnings in sparc Ubuntu dapper trunk
Message-ID: <20060729133630.15C621E4008@bag.python.org>
The Buildbot has detected a new failure of sparc Ubuntu dapper trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/sparc%2520Ubuntu%2520dapper%2520trunk/builds/582
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: georg.brandl
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Sat Jul 29 15:56:48 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Sat, 29 Jul 2006 15:56:48 +0200 (CEST)
Subject: [Python-checkins] r50927 - python/trunk/Doc/lib/libcompileall.tex
Message-ID: <20060729135648.CC6641E4008@bag.python.org>
Author: andrew.kuchling
Date: Sat Jul 29 15:56:48 2006
New Revision: 50927
Modified:
python/trunk/Doc/lib/libcompileall.tex
Log:
Add example
Modified: python/trunk/Doc/lib/libcompileall.tex
==============================================================================
--- python/trunk/Doc/lib/libcompileall.tex (original)
+++ python/trunk/Doc/lib/libcompileall.tex Sat Jul 29 15:56:48 2006
@@ -44,6 +44,19 @@
\function{compile_dir()} function.
\end{funcdesc}
+To force a recompile of all the \file{.py} files in the \file{Lib/}
+subdirectory and all its subdirectories:
+
+\begin{verbatim}
+import compileall
+
+compileall.compile_dir('Lib/', force=True)
+
+# Perform the same compilation, excluding files in .svn directories.
+import re
+compileall.compile_dir('Lib/', rx=re.compile('/[.]svn'), force=True)
+\end{verbatim}
+
\begin{seealso}
\seemodule[pycompile]{py_compile}{Byte-compile a single source file.}
From python-checkins at python.org Sat Jul 29 16:04:48 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Sat, 29 Jul 2006 16:04:48 +0200 (CEST)
Subject: [Python-checkins] r50928 - python/trunk/Doc/lib/libgettext.tex
Message-ID: <20060729140448.213881E4008@bag.python.org>
Author: andrew.kuchling
Date: Sat Jul 29 16:04:47 2006
New Revision: 50928
Modified:
python/trunk/Doc/lib/libgettext.tex
Log:
Update URL
Modified: python/trunk/Doc/lib/libgettext.tex
==============================================================================
--- python/trunk/Doc/lib/libgettext.tex (original)
+++ python/trunk/Doc/lib/libgettext.tex Sat Jul 29 16:04:47 2006
@@ -549,7 +549,7 @@
written a program called
\program{xpot} which does a similar job. It is available as part of
his \program{po-utils} package at
-\url{http://www.iro.umontreal.ca/contrib/po-utils/HTML/}.} program
+\url{http://po-utils.progiciels-bpi.ca/}.} program
scans all your Python source code looking for the strings you
previously marked as translatable. It is similar to the GNU
\program{gettext} program except that it understands all the
From python-checkins at python.org Sat Jul 29 16:05:18 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Sat, 29 Jul 2006 16:05:18 +0200 (CEST)
Subject: [Python-checkins] r50929 -
python/branches/release24-maint/Doc/lib/libgettext.tex
Message-ID: <20060729140518.206CF1E4008@bag.python.org>
Author: andrew.kuchling
Date: Sat Jul 29 16:05:17 2006
New Revision: 50929
Modified:
python/branches/release24-maint/Doc/lib/libgettext.tex
Log:
Update URL
Modified: python/branches/release24-maint/Doc/lib/libgettext.tex
==============================================================================
--- python/branches/release24-maint/Doc/lib/libgettext.tex (original)
+++ python/branches/release24-maint/Doc/lib/libgettext.tex Sat Jul 29 16:05:17 2006
@@ -534,7 +534,7 @@
written a program called
\program{xpot} which does a similar job. It is available as part of
his \program{po-utils} package at
-\url{http://www.iro.umontreal.ca/contrib/po-utils/HTML/}.} program
+\url{http://po-utils.progiciels-bpi.ca/}.} program
scans all your Python source code looking for the strings you
previously marked as translatable. It is similar to the GNU
\program{gettext} program except that it understands all the
From python-checkins at python.org Sat Jul 29 16:08:16 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Sat, 29 Jul 2006 16:08:16 +0200 (CEST)
Subject: [Python-checkins] r50930 - python/trunk/Doc/lib/libgettext.tex
Message-ID: <20060729140816.16A931E4008@bag.python.org>
Author: andrew.kuchling
Date: Sat Jul 29 16:08:15 2006
New Revision: 50930
Modified:
python/trunk/Doc/lib/libgettext.tex
Log:
Reword paragraph to match the order of the subsequent sections
Modified: python/trunk/Doc/lib/libgettext.tex
==============================================================================
--- python/trunk/Doc/lib/libgettext.tex (original)
+++ python/trunk/Doc/lib/libgettext.tex Sat Jul 29 16:08:15 2006
@@ -585,8 +585,8 @@
translation processing during run-time.
How you use the \module{gettext} module in your code depends on
-whether you are internationalizing your entire application or a single
-module.
+whether you are internationalizing a single module or your entire application.
+The next two sections will discuss each case.
\subsubsection{Localizing your module}
From python-checkins at python.org Sat Jul 29 16:21:15 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Sat, 29 Jul 2006 16:21:15 +0200 (CEST)
Subject: [Python-checkins] r50931 - python/trunk/Doc/lib/libreadline.tex
Message-ID: <20060729142115.E17C11E4009@bag.python.org>
Author: andrew.kuchling
Date: Sat Jul 29 16:21:15 2006
New Revision: 50931
Modified:
python/trunk/Doc/lib/libreadline.tex
Log:
[Bug #1529157] Mention raw_input() and input(); while I'm at it, reword the description a bit
Modified: python/trunk/Doc/lib/libreadline.tex
==============================================================================
--- python/trunk/Doc/lib/libreadline.tex (original)
+++ python/trunk/Doc/lib/libreadline.tex Sat Jul 29 16:21:15 2006
@@ -7,10 +7,13 @@
\modulesynopsis{GNU readline support for Python.}
-The \module{readline} module defines a number of functions used either
-directly or from the \refmodule{rlcompleter} module to facilitate
-completion and history file read and write from the Python
-interpreter.
+The \module{readline} module defines a number of functions to
+facilitate completion and reading/writing of history files from the
+Python interpreter. This module can be used directly or via the
+\refmodule{rlcompleter} module. Settings made using
+this module affect the behaviour of both the interpreter's interactive prompt
+and the prompts offered by the \function{raw_input()} and \function{input()}
+built-in functions.
The \module{readline} module defines the following functions:
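A small sketch of that interaction (the prompt string and key binding are arbitrary; parse_and_bind() takes standard readline init syntax):

import readline   # importing the module already enables history/editing for raw_input()

readline.parse_and_bind("tab: complete")    # turn on tab completion
line = raw_input("demo> ")                  # editing and completion now apply here
print "you typed:", line
print "history length:", readline.get_current_history_length()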
From python-checkins at python.org Sat Jul 29 16:42:48 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Sat, 29 Jul 2006 16:42:48 +0200 (CEST)
Subject: [Python-checkins] r50932 - python/trunk/Doc/lib/libturtle.tex
Message-ID: <20060729144248.8DC671E4008@bag.python.org>
Author: andrew.kuchling
Date: Sat Jul 29 16:42:48 2006
New Revision: 50932
Modified:
python/trunk/Doc/lib/libturtle.tex
Log:
[Bug #1519571] Document some missing functions: setup(), title(), done()
Modified: python/trunk/Doc/lib/libturtle.tex
==============================================================================
--- python/trunk/Doc/lib/libturtle.tex (original)
+++ python/trunk/Doc/lib/libturtle.tex Sat Jul 29 16:42:48 2006
@@ -27,6 +27,45 @@
Set angle measurement units to radians.
\end{funcdesc}
+\begin{funcdesc}{setup}{**kwargs}
+Sets the size and position of the main window. Keywords are:
+\begin{itemize}
+ \item \code{width}: either a size in pixels or a fraction of the screen.
+ The default is 50\% of the screen.
+ \item \code{height}: either a size in pixels or a fraction of the screen.
+ The default is 50\% of the screen.
+ \item \code{startx}: starting position in pixels from the left edge
+ of the screen. \code{None} is the default value and
+ centers the window horizontally on screen.
+ \item \code{starty}: starting position in pixels from the top edge
+ of the screen. \code{None} is the default value and
+ centers the window vertically on screen.
+\end{itemize}
+
+ Examples:
+
+\begin{verbatim}
+# Uses default geometry: 50% x 50% of screen, centered.
+setup()
+
+# Sets window to 200x200 pixels, in upper left of screen
+setup (width=200, height=200, startx=0, starty=0)
+
+# Sets window to 75% of screen by 50% of screen, and centers it.
+setup(width=.75, height=0.5, startx=None, starty=None)
+\end{verbatim}
+
+\end{funcdesc}
+
+\begin{funcdesc}{title}{title_str}
+Set the window's title to \var{title_str}.
+\end{funcdesc}
+
+\begin{funcdesc}{done}{}
+Enters the Tk main loop. The window will continue to
+be displayed until the user closes it or the process is killed.
+\end{funcdesc}
+
\begin{funcdesc}{reset}{}
Clear the screen, re-center the pen, and set variables to the default
values.
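Putting the newly documented functions together, a minimal sketch (window size, position, and title are arbitrary):

import turtle

turtle.setup(width=300, height=300, startx=0, starty=0)
turtle.title("turtle demo")
turtle.forward(100)
turtle.done()     # enter the Tk main loop until the user closes the window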
From python-checkins at python.org Sat Jul 29 16:43:56 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Sat, 29 Jul 2006 16:43:56 +0200 (CEST)
Subject: [Python-checkins] r50933 - python/trunk/Lib/lib-tk/turtle.py
Message-ID: <20060729144356.0B0B01E4008@bag.python.org>
Author: andrew.kuchling
Date: Sat Jul 29 16:43:55 2006
New Revision: 50933
Modified:
python/trunk/Lib/lib-tk/turtle.py
Log:
Fix docstring punctuation
Modified: python/trunk/Lib/lib-tk/turtle.py
==============================================================================
--- python/trunk/Lib/lib-tk/turtle.py (original)
+++ python/trunk/Lib/lib-tk/turtle.py Sat Jul 29 16:43:55 2006
@@ -713,7 +713,7 @@
def setup(**geometry):
""" Sets the size and position of the main window.
- Keywords are width, height, startx and starty
+ Keywords are width, height, startx and starty:
width: either a size in pixels or a fraction of the screen.
Default is 50% of screen.
@@ -788,7 +788,7 @@
_root.geometry("%dx%d+%d+%d" % (_width, _height, _startx, _starty))
def title(title):
- """ set the window title.
+ """Set the window title.
By default this is set to 'Turtle Graphics'
From python-checkins at python.org Sat Jul 29 17:10:33 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Sat, 29 Jul 2006 17:10:33 +0200 (CEST)
Subject: [Python-checkins] r50934 - python/trunk/Objects/setobject.c
Message-ID: <20060729151033.01BE31E4008@bag.python.org>
Author: andrew.kuchling
Date: Sat Jul 29 17:10:32 2006
New Revision: 50934
Modified:
python/trunk/Objects/setobject.c
Log:
[Bug #1414697] Change docstring of set/frozenset types to specify that the contents are unique. Raymond, please feel free to edit or revert.
Modified: python/trunk/Objects/setobject.c
==============================================================================
--- python/trunk/Objects/setobject.c (original)
+++ python/trunk/Objects/setobject.c Sat Jul 29 17:10:32 2006
@@ -1797,7 +1797,7 @@
PyDoc_STRVAR(set_doc,
"set(iterable) --> set object\n\
\n\
-Build an unordered collection.");
+Build an unordered collection of unique elements.");
PyTypeObject PySet_Type = {
PyObject_HEAD_INIT(&PyType_Type)
@@ -1892,7 +1892,7 @@
PyDoc_STRVAR(frozenset_doc,
"frozenset(iterable) --> frozenset object\n\
\n\
-Build an immutable unordered collection.");
+Build an immutable unordered collection of unique elements.");
PyTypeObject PyFrozenSet_Type = {
PyObject_HEAD_INIT(&PyType_Type)
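The "unique elements" wording is easy to confirm at the interpreter (sorted() keeps the check order-independent):

>>> sorted(set([3, 1, 2, 3, 1]))        # duplicates collapse
[1, 2, 3]
>>> len(frozenset("abracadabra"))       # only the 5 distinct letters remain
5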
From python-checkins at python.org Sat Jul 29 17:35:21 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Sat, 29 Jul 2006 17:35:21 +0200 (CEST)
Subject: [Python-checkins] r50935 - python/trunk/Doc/lib/libsocket.tex
Message-ID: <20060729153521.D93201E4008@bag.python.org>
Author: andrew.kuchling
Date: Sat Jul 29 17:35:21 2006
New Revision: 50935
Modified:
python/trunk/Doc/lib/libsocket.tex
Log:
[Bug #1530382] Document SSL.server(), .issuer() methods
Modified: python/trunk/Doc/lib/libsocket.tex
==============================================================================
--- python/trunk/Doc/lib/libsocket.tex (original)
+++ python/trunk/Doc/lib/libsocket.tex Sat Jul 29 17:35:21 2006
@@ -711,6 +711,17 @@
read until EOF. The return value is a string of the bytes read.
\end{methoddesc}
+\begin{methoddesc}{server}{}
+Returns a string containing the ASN.1 distinguished name identifying the
+server's certificate. (See below for an example
+showing what distinguished names look like.)
+\end{methoddesc}
+
+\begin{methoddesc}{issuer}{}
+Returns a string containing the ASN.1 distinguished name identifying the
+issuer of the server's certificate.
+\end{methoddesc}
+
\subsection{Example \label{socket-example}}
Here are four minimal example programs using the TCP/IP protocol:\ a
@@ -833,3 +844,44 @@
s.close()
print 'Received', repr(data)
\end{verbatim}
+
+This example connects to an SSL server, prints the
+server and issuer's distinguished names, sends some bytes,
+and reads part of the response:
+
+\begin{verbatim}
+import socket
+
+s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
+s.connect(('www.verisign.com', 443))
+
+ssl_sock = socket.ssl(s)
+
+print repr(ssl_sock.server())
+print repr(ssl_sock.issuer())
+
+# Send a simple HTTP request -- use httplib in actual code.
+ssl_sock.write("""GET / HTTP/1.0\r
+Host: www.verisign.com\r\n\r\n""")
+
+# Read a chunk of data. Will not necessarily
+# read all the data returned by the server.
+data = ssl_sock.read()
+
+# Note that you need to close the underlying socket, not the SSL object.
+del ssl_sock
+s.close()
+\end{verbatim}
+
+At this writing, this SSL example prints the following output (line
+breaks inserted for readability):
+
+\begin{verbatim}
+'/C=US/ST=California/L=Mountain View/
+ O=VeriSign, Inc./OU=Production Services/
+ OU=Terms of use at www.verisign.com/rpa (c)00/
+ CN=www.verisign.com'
+'/O=VeriSign Trust Network/OU=VeriSign, Inc./
+ OU=VeriSign International Server CA - Class 3/
+ OU=www.verisign.com/CPS Incorp.by Ref. LIABILITY LTD.(c)97 VeriSign'
+\end{verbatim}
From python-checkins at python.org Sat Jul 29 17:42:46 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Sat, 29 Jul 2006 17:42:46 +0200 (CEST)
Subject: [Python-checkins] r50936 - python/trunk/Doc/whatsnew/whatsnew24.tex
Message-ID: <20060729154246.F14BC1E4009@bag.python.org>
Author: andrew.kuchling
Date: Sat Jul 29 17:42:46 2006
New Revision: 50936
Modified:
python/trunk/Doc/whatsnew/whatsnew24.tex
Log:
Typo fix
Modified: python/trunk/Doc/whatsnew/whatsnew24.tex
==============================================================================
--- python/trunk/Doc/whatsnew/whatsnew24.tex (original)
+++ python/trunk/Doc/whatsnew/whatsnew24.tex Sat Jul 29 17:42:46 2006
@@ -162,7 +162,7 @@
Generator expressions always have to be written inside parentheses, as
in the above example. The parentheses signalling a function call also
-count, so if you want to create a iterator that will be immediately
+count, so if you want to create an iterator that will be immediately
passed to a function you could write:
\begin{verbatim}
From python-checkins at python.org Sat Jul 29 17:43:13 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Sat, 29 Jul 2006 17:43:13 +0200 (CEST)
Subject: [Python-checkins] r50937 - python/trunk/Modules/fcntlmodule.c
Message-ID: <20060729154313.B81A91E4009@bag.python.org>
Author: andrew.kuchling
Date: Sat Jul 29 17:43:13 2006
New Revision: 50937
Modified:
python/trunk/Modules/fcntlmodule.c
Log:
Tweak wording
Modified: python/trunk/Modules/fcntlmodule.c
==============================================================================
--- python/trunk/Modules/fcntlmodule.c (original)
+++ python/trunk/Modules/fcntlmodule.c Sat Jul 29 17:43:13 2006
@@ -290,7 +290,7 @@
"flock(fd, operation)\n\
\n\
Perform the lock operation op on file descriptor fd. See the Unix \n\
-manual flock(3) for details. (On some systems, this function is\n\
+manual page for flock(3) for details. (On some systems, this function is\n\
emulated using fcntl().)");
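A short usage sketch of flock() (the lock file path is arbitrary):

import fcntl

f = open("/tmp/example.lock", "w")
fcntl.flock(f.fileno(), fcntl.LOCK_EX)      # block until the exclusive lock is held
try:
    f.write("doing exclusive work\n")
finally:
    fcntl.flock(f.fileno(), fcntl.LOCK_UN)  # release the lock
    f.close()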
From python-checkins at python.org Sat Jul 29 17:55:30 2006
From: python-checkins at python.org (matt.fleming)
Date: Sat, 29 Jul 2006 17:55:30 +0200 (CEST)
Subject: [Python-checkins] r50938 - python/trunk/Doc/lib/libpkgutil.tex
Message-ID: <20060729155530.EEFB11E4009@bag.python.org>
Author: matt.fleming
Date: Sat Jul 29 17:55:30 2006
New Revision: 50938
Modified:
python/trunk/Doc/lib/libpkgutil.tex
Log:
Fix typo
Modified: python/trunk/Doc/lib/libpkgutil.tex
==============================================================================
--- python/trunk/Doc/lib/libpkgutil.tex (original)
+++ python/trunk/Doc/lib/libpkgutil.tex Sat Jul 29 17:55:30 2006
@@ -30,7 +30,7 @@
with \code{import}. A \file{*.pkg} file is trusted at face value:
apart from checking for duplicates, all entries found in a
\file{*.pkg} file are added to the path, regardless of whether they
- exist the filesystem. (This is a feature.)
+ exist on the filesystem. (This is a feature.)
If the input path is not a list (as is the case for frozen
packages) it is returned unchanged. The input path is not
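This machinery is normally invoked from a package's __init__.py; a minimal sketch (only meaningful inside a package, where __path__ is defined):

# __init__.py of a package that wants *.pkg-style path extension
from pkgutil import extend_path
__path__ = extend_path(__path__, __name__)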
From python-checkins at python.org Sat Jul 29 17:57:08 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Sat, 29 Jul 2006 17:57:08 +0200 (CEST)
Subject: [Python-checkins] r50939 - python/trunk/Doc/lib/liburllib2.tex
Message-ID: <20060729155708.729BD1E4009@bag.python.org>
Author: andrew.kuchling
Date: Sat Jul 29 17:57:08 2006
New Revision: 50939
Modified:
python/trunk/Doc/lib/liburllib2.tex
Log:
[Bug #1528258] Mention that the 'data' argument can be None.
The constructor docs referred the reader to the add_data() method's docs,
but they weren't very helpful. I've simply copied an earlier explanation
of 'data' that's more useful.
Modified: python/trunk/Doc/lib/liburllib2.tex
==============================================================================
--- python/trunk/Doc/lib/liburllib2.tex (original)
+++ python/trunk/Doc/lib/liburllib2.tex Sat Jul 29 17:57:08 2006
@@ -19,7 +19,8 @@
object.
\var{data} may be a string specifying additional data to send to the
-server. Currently HTTP requests are the only ones that use \var{data};
+server, or \code{None} if no such data is needed.
+Currently HTTP requests are the only ones that use \var{data};
the HTTP request will be a POST instead of a GET when the \var{data}
parameter is provided. \var{data} should be a buffer in the standard
\mimetype{application/x-www-form-urlencoded} format. The
@@ -97,8 +98,17 @@
\optional{, origin_req_host}\optional{, unverifiable}}
This class is an abstraction of a URL request.
-\var{url} should be a string which is a valid URL. For a description
-of \var{data} see the \method{add_data()} description.
+\var{url} should be a string containing a valid URL.
+
+\var{data} may be a string specifying additional data to send to the
+server, or \code{None} if no such data is needed.
+Currently HTTP requests are the only ones that use \var{data};
+the HTTP request will be a POST instead of a GET when the \var{data}
+parameter is provided. \var{data} should be a buffer in the standard
+\mimetype{application/x-www-form-urlencoded} format. The
+\function{urllib.urlencode()} function takes a mapping or sequence of
+2-tuples and returns a string in this format.
+
\var{headers} should be a dictionary, and will be treated as if
\method{add_header()} was called with each key and value as arguments.
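For example, a minimal POST sketch using the data argument (the URL and form fields are placeholders):

import urllib
import urllib2

params = urllib.urlencode({'q': 'python', 'num': '10'})    # application/x-www-form-urlencoded
req = urllib2.Request('http://www.example.com/search', data=params)
response = urllib2.urlopen(req)      # data is not None, so the request is a POST
print response.read()[:200]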
From python-checkins at python.org Sat Jul 29 18:08:40 2006
From: python-checkins at python.org (andrew.kuchling)
Date: Sat, 29 Jul 2006 18:08:40 +0200 (CEST)
Subject: [Python-checkins] r50940 - python/trunk/Doc/whatsnew/whatsnew25.tex
Message-ID: <20060729160840.A4CC61E4008@bag.python.org>
Author: andrew.kuchling
Date: Sat Jul 29 18:08:40 2006
New Revision: 50940
Modified:
python/trunk/Doc/whatsnew/whatsnew25.tex
Log:
Set bug/patch count. Take a bow, everyone!
Modified: python/trunk/Doc/whatsnew/whatsnew25.tex
==============================================================================
--- python/trunk/Doc/whatsnew/whatsnew25.tex (original)
+++ python/trunk/Doc/whatsnew/whatsnew25.tex Sat Jul 29 18:08:40 2006
@@ -2329,7 +2329,7 @@
As usual, there were a bunch of other improvements and bugfixes
scattered throughout the source tree. A search through the SVN change
-logs finds there were XXX patches applied and YYY bugs fixed between
+logs finds there were 334 patches applied and 443 bugs fixed between
Python 2.4 and 2.5. Both figures are likely to be underestimates.
Some of the more notable changes are:
From buildbot at python.org Sat Jul 29 18:44:57 2006
From: buildbot at python.org (buildbot at python.org)
Date: Sat, 29 Jul 2006 16:44:57 +0000
Subject: [Python-checkins] buildbot warnings in x86 Ubuntu dapper (icc) trunk
Message-ID: <20060729164457.D80061E4008@bag.python.org>
The Buildbot has detected a new failure of x86 Ubuntu dapper (icc) trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%2520Ubuntu%2520dapper%2520%2528icc%2529%2520trunk/builds/838
Buildbot URL: http://www.python.org/dev/buildbot/all/
Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: andrew.kuchling
Build Had Warnings: warnings test
sincerely,
-The Buildbot
From python-checkins at python.org Sat Jul 29 18:56:23 2006
From: python-checkins at python.org (fred.drake)
Date: Sat, 29 Jul 2006 18:56:23 +0200 (CEST)
Subject: [Python-checkins] r50941 - in python/trunk: Doc/lib/markup.tex
Doc/whatsnew/whatsnew25.tex Lib/test/test_minidom.py
Lib/test/test_sax.py Lib/test/test_xml_etree.py
Lib/test/test_xml_etree_c.py Lib/xml Lib/xml.py Lib/xml/dom
Lib/xml/dom/expatbuilder.py Lib/xml/dom/minicompat.py
Lib/xml/dom/minidom.py Lib/xml/dom/xmlbuilder.py
Lib/xml/etree/ElementInclude.py Lib/xml/etree/ElementPath.py
Lib/xml/etree/ElementTree.py Lib/xml/etree/__init__.py
Lib/xml/etree/cElementTree.py Lib/xml/parsers Lib/xml/sax
Lib/xml/sax/saxutils.py Lib/xmlcore Makefile.pre.in Misc/NEWS
Message-ID: <20060729165623.7176E1E401A@bag.python.org>
Author: fred.drake
Date: Sat Jul 29 18:56:15 2006
New Revision: 50941
Added:
python/trunk/Lib/xml/ (props changed)
- copied from r41666, python/trunk/Lib/xml/
python/trunk/Lib/xml/etree/cElementTree.py (props changed)
- copied unchanged from r41677, python/trunk/Lib/xmlcore/etree/cElementTree.py
Removed:
python/trunk/Lib/xml.py
python/trunk/Lib/xmlcore/
Modified:
python/trunk/Doc/lib/markup.tex
python/trunk/Doc/whatsnew/whatsnew25.tex
python/trunk/Lib/test/test_minidom.py
python/trunk/Lib/test/test_sax.py
python/trunk/Lib/test/test_xml_etree.py
python/trunk/Lib/test/test_xml_etree_c.py
python/trunk/Lib/xml/dom/ (props changed)
python/trunk/Lib/xml/dom/expatbuilder.py
python/trunk/Lib/xml/dom/minicompat.py
python/trunk/Lib/xml/dom/minidom.py
python/trunk/Lib/xml/dom/xmlbuilder.py
python/trunk/Lib/xml/etree/ElementInclude.py (contents, props changed)
python/trunk/Lib/xml/etree/ElementPath.py (contents, props changed)
python/trunk/Lib/xml/etree/ElementTree.py (contents, props changed)
python/trunk/Lib/xml/etree/__init__.py (contents, props changed)
python/trunk/Lib/xml/parsers/ (props changed)
python/trunk/Lib/xml/sax/ (props changed)
python/trunk/Lib/xml/sax/saxutils.py
python/trunk/Makefile.pre.in
python/trunk/Misc/NEWS
Log:
expunge the xmlcore changes:
41667, 41668 - initial switch to xmlcore
47044 - mention of xmlcore in What's New
50687 - mention of xmlcore in the library reference
re-apply xmlcore changes to xml:
41674 - line ending changes (re-applied manually), directory props
41677 - add cElementTree wrapper
41678 - PSF licensing for etree
41812 - whitespace normalization
42724 - fix svn:eol-style settings
43681, 43682 - remove Python version-compatibility cruft from minidom
46773 - fix encoding of \r\n\t in attr values in saxutils
47269 - added XMLParser alias for cElementTree compatibility
additional tests were added in Lib/test/test_sax.py that failed with
the xmlcore changes; these relate to SF bugs #1511497, #1513611
Modified: python/trunk/Doc/lib/markup.tex
==============================================================================
--- python/trunk/Doc/lib/markup.tex (original)
+++ python/trunk/Doc/lib/markup.tex Sat Jul 29 18:56:15 2006
@@ -15,17 +15,6 @@
package}{http://pyxml.sourceforge.net/}; that package provides an
extended set of XML libraries for Python.
-Python 2.5 introduces the \module{xmlcore} package; this package
-provides the implementation of the \module{xml} package as distributed
-with the standard library. The \module{xml} package, as in earlier
-versions, provides an interface that will provide the PyXML
-implementation of the interfaces when available, and the standard
-library implementation if not. Applications that can use either the
-PyXML implementation or the standard library's implementation may
-continue to make imports from the \module{xml} package; applications
-that want to only import the standard library's implementation can now
-use the \module{xmlcore} package.
-
The documentation for the \module{xml.dom} and \module{xml.sax}
packages is the definition of the Python bindings for the DOM and SAX
interfaces.
Modified: python/trunk/Doc/whatsnew/whatsnew25.tex
==============================================================================
--- python/trunk/Doc/whatsnew/whatsnew25.tex (original)
+++ python/trunk/Doc/whatsnew/whatsnew25.tex Sat Jul 29 18:56:15 2006
@@ -1760,13 +1760,6 @@
Brandl.)
% Patch #754022
-\item The standard library's XML-related package
-has been renamed to \module{xmlcore}. The \module{xml} module will
-now import either the \module{xmlcore} or PyXML version of subpackages
-such as \module{xml.dom}. The renaming means it will always be
-possible to import the standard library's XML support whether or not
-the PyXML package is installed.
-
\item The \module{xmlrpclib} module now supports returning
\class{datetime} objects for the XML-RPC date type. Supply
\code{use_datetime=True} to the \function{loads()} function
@@ -2404,10 +2397,6 @@
\member{rpc_paths} to \code{None} or an empty tuple disables
this path checking.
-\item Library: the \module{xml} package has been renamed to \module{xmlcore}.
-The PyXML package will therefore be \module{xml}, and the Python
-distribution's code will always be accessible as \module{xmlcore}.
-
\item C API: Many functions now use \ctype{Py_ssize_t}
instead of \ctype{int} to allow processing more data on 64-bit
machines. Extension code may need to make the same change to avoid
Modified: python/trunk/Lib/test/test_minidom.py
==============================================================================
--- python/trunk/Lib/test/test_minidom.py (original)
+++ python/trunk/Lib/test/test_minidom.py Sat Jul 29 18:56:15 2006
@@ -1,4 +1,4 @@
-# test for xmlcore.dom.minidom
+# test for xml.dom.minidom
import os
import sys
@@ -7,12 +7,12 @@
from StringIO import StringIO
from test.test_support import verbose
-import xmlcore.dom
-import xmlcore.dom.minidom
-import xmlcore.parsers.expat
+import xml.dom
+import xml.dom.minidom
+import xml.parsers.expat
-from xmlcore.dom.minidom import parse, Node, Document, parseString
-from xmlcore.dom.minidom import getDOMImplementation
+from xml.dom.minidom import parse, Node, Document, parseString
+from xml.dom.minidom import getDOMImplementation
if __name__ == "__main__":
@@ -138,29 +138,29 @@
text = dom.createTextNode('text')
try: dom.appendChild(text)
- except xmlcore.dom.HierarchyRequestErr: pass
+ except xml.dom.HierarchyRequestErr: pass
else:
print "dom.appendChild didn't raise HierarchyRequestErr"
dom.appendChild(elem)
try: dom.insertBefore(text, elem)
- except xmlcore.dom.HierarchyRequestErr: pass
+ except xml.dom.HierarchyRequestErr: pass
else:
print "dom.appendChild didn't raise HierarchyRequestErr"
try: dom.replaceChild(text, elem)
- except xmlcore.dom.HierarchyRequestErr: pass
+ except xml.dom.HierarchyRequestErr: pass
else:
print "dom.appendChild didn't raise HierarchyRequestErr"
nodemap = elem.attributes
try: nodemap.setNamedItem(text)
- except xmlcore.dom.HierarchyRequestErr: pass
+ except xml.dom.HierarchyRequestErr: pass
else:
print "NamedNodeMap.setNamedItem didn't raise HierarchyRequestErr"
try: nodemap.setNamedItemNS(text)
- except xmlcore.dom.HierarchyRequestErr: pass
+ except xml.dom.HierarchyRequestErr: pass
else:
print "NamedNodeMap.setNamedItemNS didn't raise HierarchyRequestErr"
@@ -439,7 +439,7 @@
and pi.firstChild is None
and pi.lastChild is None
and pi.localName is None
- and pi.namespaceURI == xmlcore.dom.EMPTY_NAMESPACE)
+ and pi.namespaceURI == xml.dom.EMPTY_NAMESPACE)
def testProcessingInstructionRepr(): pass
@@ -454,7 +454,7 @@
elem = doc.createElement("extra")
try:
doc.appendChild(elem)
- except xmlcore.dom.HierarchyRequestErr:
+ except xml.dom.HierarchyRequestErr:
pass
else:
print "Failed to catch expected exception when" \
@@ -491,7 +491,7 @@
confirm(a1.isSameNode(a2))
try:
attrs.removeNamedItem("a")
- except xmlcore.dom.NotFoundErr:
+ except xml.dom.NotFoundErr:
pass
def testRemoveNamedItemNS():
@@ -503,7 +503,7 @@
confirm(a1.isSameNode(a2))
try:
attrs.removeNamedItemNS("http://xml.python.org/", "b")
- except xmlcore.dom.NotFoundErr:
+ except xml.dom.NotFoundErr:
pass
def testAttrListValues(): pass
@@ -682,7 +682,7 @@
doc2 = parseString("")
try:
doc1.importNode(doc2, deep)
- except xmlcore.dom.NotSupportedErr:
+ except xml.dom.NotSupportedErr:
pass
else:
raise Exception(testName +
@@ -705,14 +705,12 @@
doctype = getDOMImplementation().createDocumentType("doc", None, None)
doctype.entities._seq = []
doctype.notations._seq = []
- notation = xmlcore.dom.minidom.Notation(
- "my-notation", None,
- "http://xml.python.org/notations/my")
+ notation = xml.dom.minidom.Notation("my-notation", None,
+ "http://xml.python.org/notations/my")
doctype.notations._seq.append(notation)
- entity = xmlcore.dom.minidom.Entity(
- "my-entity", None,
- "http://xml.python.org/entities/my",
- "my-notation")
+ entity = xml.dom.minidom.Entity("my-entity", None,
+ "http://xml.python.org/entities/my",
+ "my-notation")
entity.version = "1.0"
entity.encoding = "utf-8"
entity.actualEncoding = "us-ascii"
@@ -731,7 +729,7 @@
target = create_doc_without_doctype()
try:
imported = target.importNode(src.doctype, 0)
- except xmlcore.dom.NotSupportedErr:
+ except xml.dom.NotSupportedErr:
pass
else:
raise Exception(
@@ -742,7 +740,7 @@
target = create_doc_without_doctype()
try:
imported = target.importNode(src.doctype, 1)
- except xmlcore.dom.NotSupportedErr:
+ except xml.dom.NotSupportedErr:
pass
else:
raise Exception(
@@ -850,7 +848,7 @@
doc.unlink()
def testSAX2DOM():
- from xmlcore.dom import pulldom
+ from xml.dom import pulldom
sax2dom = pulldom.SAX2DOM()
sax2dom.startDocument()
@@ -940,11 +938,11 @@
attr = elem.attributes['a']
# Simple renaming
- attr = doc.renameNode(attr, xmlcore.dom.EMPTY_NAMESPACE, "b")
+ attr = doc.renameNode(attr, xml.dom.EMPTY_NAMESPACE, "b")
confirm(attr.name == "b"
and attr.nodeName == "b"
and attr.localName is None
- and attr.namespaceURI == xmlcore.dom.EMPTY_NAMESPACE
+ and attr.namespaceURI == xml.dom.EMPTY_NAMESPACE
and attr.prefix is None
and attr.value == "v"
and elem.getAttributeNode("a") is None
@@ -989,11 +987,11 @@
and attrmap[("http://xml.python.org/ns2", "d")].isSameNode(attr))
# Rename back to a simple non-NS node
- attr = doc.renameNode(attr, xmlcore.dom.EMPTY_NAMESPACE, "e")
+ attr = doc.renameNode(attr, xml.dom.EMPTY_NAMESPACE, "e")
confirm(attr.name == "e"
and attr.nodeName == "e"
and attr.localName is None
- and attr.namespaceURI == xmlcore.dom.EMPTY_NAMESPACE
+ and attr.namespaceURI == xml.dom.EMPTY_NAMESPACE
and attr.prefix is None
and attr.value == "v"
and elem.getAttributeNode("a") is None
@@ -1007,7 +1005,7 @@
try:
doc.renameNode(attr, "http://xml.python.org/ns", "xmlns")
- except xmlcore.dom.NamespaceErr:
+ except xml.dom.NamespaceErr:
pass
else:
print "expected NamespaceErr"
@@ -1020,11 +1018,11 @@
elem = doc.documentElement
# Simple renaming
- elem = doc.renameNode(elem, xmlcore.dom.EMPTY_NAMESPACE, "a")
+ elem = doc.renameNode(elem, xml.dom.EMPTY_NAMESPACE, "a")
confirm(elem.tagName == "a"
and elem.nodeName == "a"
and elem.localName is None
- and elem.namespaceURI == xmlcore.dom.EMPTY_NAMESPACE
+ and elem.namespaceURI == xml.dom.EMPTY_NAMESPACE
and elem.prefix is None
and elem.ownerDocument.isSameNode(doc))
@@ -1047,11 +1045,11 @@
and elem.ownerDocument.isSameNode(doc))
# Rename back to a simple non-NS node
- elem = doc.renameNode(elem, xmlcore.dom.EMPTY_NAMESPACE, "d")
+ elem = doc.renameNode(elem, xml.dom.EMPTY_NAMESPACE, "d")
confirm(elem.tagName == "d"
and elem.nodeName == "d"
and elem.localName is None
- and elem.namespaceURI == xmlcore.dom.EMPTY_NAMESPACE
+ and elem.namespaceURI == xml.dom.EMPTY_NAMESPACE
and elem.prefix is None
and elem.ownerDocument.isSameNode(doc))
@@ -1062,15 +1060,15 @@
# Make sure illegal NS usage is detected:
try:
doc.renameNode(node, "http://xml.python.org/ns", "xmlns:foo")
- except xmlcore.dom.NamespaceErr:
+ except xml.dom.NamespaceErr:
pass
else:
print "expected NamespaceErr"
doc2 = parseString("")
try:
- doc2.renameNode(node, xmlcore.dom.EMPTY_NAMESPACE, "foo")
- except xmlcore.dom.WrongDocumentErr:
+ doc2.renameNode(node, xml.dom.EMPTY_NAMESPACE, "foo")
+ except xml.dom.WrongDocumentErr:
pass
else:
print "expected WrongDocumentErr"
@@ -1078,12 +1076,12 @@
def testRenameOther():
# We have to create a comment node explicitly since not all DOM
# builders used with minidom add comments to the DOM.
- doc = xmlcore.dom.minidom.getDOMImplementation().createDocument(
- xmlcore.dom.EMPTY_NAMESPACE, "e", None)
+ doc = xml.dom.minidom.getDOMImplementation().createDocument(
+ xml.dom.EMPTY_NAMESPACE, "e", None)
node = doc.createComment("comment")
try:
- doc.renameNode(node, xmlcore.dom.EMPTY_NAMESPACE, "foo")
- except xmlcore.dom.NotSupportedErr:
+ doc.renameNode(node, xml.dom.EMPTY_NAMESPACE, "foo")
+ except xml.dom.NotSupportedErr:
pass
else:
print "expected NotSupportedErr when renaming comment node"
@@ -1194,13 +1192,13 @@
# since each supports a different level of DTD information.
t = elem.schemaType
confirm(t.name is None
- and t.namespace == xmlcore.dom.EMPTY_NAMESPACE)
+ and t.namespace == xml.dom.EMPTY_NAMESPACE)
names = "id notid text enum ref refs ent ents nm nms".split()
for name in names:
a = elem.getAttributeNode(name)
t = a.schemaType
confirm(hasattr(t, "name")
- and t.namespace == xmlcore.dom.EMPTY_NAMESPACE)
+ and t.namespace == xml.dom.EMPTY_NAMESPACE)
def testSetIdAttribute():
doc = parseString("")
@@ -1229,7 +1227,7 @@
and a2.isId
and not a3.isId)
# renaming an attribute should not affect its ID-ness:
- doc.renameNode(a2, xmlcore.dom.EMPTY_NAMESPACE, "an")
+ doc.renameNode(a2, xml.dom.EMPTY_NAMESPACE, "an")
confirm(e.isSameNode(doc.getElementById("w"))
and a2.isId)
@@ -1265,7 +1263,7 @@
confirm(not a3.isId)
confirm(doc.getElementById("v") is None)
# renaming an attribute should not affect its ID-ness:
- doc.renameNode(a2, xmlcore.dom.EMPTY_NAMESPACE, "an")
+ doc.renameNode(a2, xml.dom.EMPTY_NAMESPACE, "an")
confirm(e.isSameNode(doc.getElementById("w"))
and a2.isId)
@@ -1301,7 +1299,7 @@
confirm(not a3.isId)
confirm(doc.getElementById("v") is None)
# renaming an attribute should not affect its ID-ness:
- doc.renameNode(a2, xmlcore.dom.EMPTY_NAMESPACE, "an")
+ doc.renameNode(a2, xml.dom.EMPTY_NAMESPACE, "an")
confirm(e.isSameNode(doc.getElementById("w"))
and a2.isId)
Modified: python/trunk/Lib/test/test_sax.py
==============================================================================
--- python/trunk/Lib/test/test_sax.py (original)
+++ python/trunk/Lib/test/test_sax.py Sat Jul 29 18:56:15 2006
@@ -1,17 +1,17 @@
# regression test for SAX 2.0 -*- coding: iso-8859-1 -*-
# $Id$
-from xmlcore.sax import make_parser, ContentHandler, \
- SAXException, SAXReaderNotAvailable, SAXParseException
+from xml.sax import make_parser, ContentHandler, \
+ SAXException, SAXReaderNotAvailable, SAXParseException
try:
make_parser()
except SAXReaderNotAvailable:
# don't try to test this module if we cannot create a parser
raise ImportError("no XML parsers available")
-from xmlcore.sax.saxutils import XMLGenerator, escape, unescape, quoteattr, \
- XMLFilterBase
-from xmlcore.sax.expatreader import create_parser
-from xmlcore.sax.xmlreader import InputSource, AttributesImpl, AttributesNSImpl
+from xml.sax.saxutils import XMLGenerator, escape, unescape, quoteattr, \
+ XMLFilterBase
+from xml.sax.expatreader import create_parser
+from xml.sax.xmlreader import InputSource, AttributesImpl, AttributesNSImpl
from cStringIO import StringIO
from test.test_support import verify, verbose, TestFailed, findfile
import os
@@ -36,17 +36,17 @@
# Creating parsers several times in a row should succeed.
# Testing this because there have been failures of this kind
# before.
- from xmlcore.sax import make_parser
+ from xml.sax import make_parser
p = make_parser()
- from xmlcore.sax import make_parser
+ from xml.sax import make_parser
p = make_parser()
- from xmlcore.sax import make_parser
+ from xml.sax import make_parser
p = make_parser()
- from xmlcore.sax import make_parser
+ from xml.sax import make_parser
p = make_parser()
- from xmlcore.sax import make_parser
+ from xml.sax import make_parser
p = make_parser()
- from xmlcore.sax import make_parser
+ from xml.sax import make_parser
p = make_parser()
except:
return 0
@@ -108,7 +108,7 @@
try:
# Creating a parser should succeed - it should fall back
# to the expatreader
- p = make_parser(['xmlcore.parsers.no_such_parser'])
+ p = make_parser(['xml.parsers.no_such_parser'])
except:
return 0
else:
@@ -671,6 +671,55 @@
attrs.getQNameByName((ns_uri, "attr")) == "ns:attr"
+# During the development of Python 2.5, an attempt to move the "xml"
+# package implementation to a new package ("xmlcore") proved painful.
+# The goal of this change was to allow applications to be able to
+# obtain and rely on behavior in the standard library implementation
+# of the XML support without needing to be concerned about the
+# availability of the PyXML implementation.
+#
+# While the existing import hackery in Lib/xml/__init__.py can cause
+# PyXML's _xmlpus package to supplant the "xml" package, that only
+# works because either implementation uses the "xml" package name for
+# imports.
+#
+# The move resulted in a number of problems related to the fact that
+# the import machinery's "package context" is based on the name that's
+# being imported rather than the __name__ of the actual package
+# containment; it wasn't possible for the "xml" package to be replaced
+# by a simple module that indirected imports to the "xmlcore" package.
+#
+# The following two tests exercised bugs that were introduced in that
+# attempt. Keeping these tests around will help detect problems with
+# other attempts to provide reliable access to the standard library's
+# implementation of the XML support.
+
+def test_sf_1511497():
+ # Bug report: http://www.python.org/sf/1511497
+ import sys
+ old_modules = sys.modules.copy()
+ for modname in sys.modules.keys():
+ if modname.startswith("xml."):
+ del sys.modules[modname]
+ try:
+ import xml.sax.expatreader
+ module = xml.sax.expatreader
+ return module.__name__ == "xml.sax.expatreader"
+ finally:
+ sys.modules.update(old_modules)
+
+def test_sf_1513611():
+ # Bug report: http://www.python.org/sf/1513611
+ sio = StringIO("invalid")
+ parser = make_parser()
+ from xml.sax import SAXParseException
+ try:
+ parser.parse(sio)
+ except SAXParseException:
+ return True
+ else:
+ return False
+
# ===== Main program
def make_test_output():
Modified: python/trunk/Lib/test/test_xml_etree.py
==============================================================================
--- python/trunk/Lib/test/test_xml_etree.py (original)
+++ python/trunk/Lib/test/test_xml_etree.py Sat Jul 29 18:56:15 2006
@@ -1,4 +1,4 @@
-# xmlcore.etree test. This file contains enough tests to make sure that
+# xml.etree test. This file contains enough tests to make sure that
# all included components work as they should. For a more extensive
# test suite, see the selftest script in the ElementTree distribution.
@@ -6,8 +6,6 @@
from test import test_support
-from xmlcore.etree import ElementTree as ET
-
SAMPLE_XML = """
text
@@ -32,9 +30,9 @@
"""
Import sanity.
- >>> from xmlcore.etree import ElementTree
- >>> from xmlcore.etree import ElementInclude
- >>> from xmlcore.etree import ElementPath
+ >>> from xml.etree import ElementTree
+ >>> from xml.etree import ElementInclude
+ >>> from xml.etree import ElementPath
"""
def check_method(method):
@@ -61,6 +59,8 @@
"""
Test element tree interface.
+ >>> from xml.etree import ElementTree as ET
+
>>> element = ET.Element("tag", key="value")
>>> tree = ET.ElementTree(element)
@@ -108,6 +108,8 @@
"""
Test find methods (including xpath syntax).
+ >>> from xml.etree import ElementTree as ET
+
>>> elem = ET.XML(SAMPLE_XML)
>>> elem.find("tag").tag
'tag'
@@ -174,6 +176,8 @@
def parseliteral():
r"""
+ >>> from xml.etree import ElementTree as ET
+
>>> element = ET.XML("text")
>>> ET.ElementTree(element).write(sys.stdout)
text
@@ -195,19 +199,6 @@
'body'
"""
-def check_encoding(encoding):
- """
- >>> check_encoding("ascii")
- >>> check_encoding("us-ascii")
- >>> check_encoding("iso-8859-1")
- >>> check_encoding("iso-8859-15")
- >>> check_encoding("cp437")
- >>> check_encoding("mac-roman")
- """
- ET.XML(
- "" % encoding
- )
-
#
# xinclude tests (samples from appendix C of the xinclude specification)
@@ -282,14 +273,16 @@
except KeyError:
raise IOError("resource not found")
if parse == "xml":
- return ET.XML(data)
+ from xml.etree.ElementTree import XML
+ return XML(data)
return data
def xinclude():
r"""
Basic inclusion example (XInclude C.1)
- >>> from xmlcore.etree import ElementInclude
+ >>> from xml.etree import ElementTree as ET
+ >>> from xml.etree import ElementInclude
>>> document = xinclude_loader("C1.xml")
>>> ElementInclude.include(document, xinclude_loader)
Modified: python/trunk/Lib/test/test_xml_etree_c.py
==============================================================================
--- python/trunk/Lib/test/test_xml_etree_c.py (original)
+++ python/trunk/Lib/test/test_xml_etree_c.py Sat Jul 29 18:56:15 2006
@@ -1,10 +1,10 @@
-# xmlcore.etree test for cElementTree
+# xml.etree test for cElementTree
import doctest, sys
from test import test_support
-from xmlcore.etree import cElementTree as ET
+from xml.etree import cElementTree as ET
SAMPLE_XML = """
@@ -30,7 +30,7 @@
"""
Import sanity.
- >>> from xmlcore.etree import cElementTree
+ >>> from xml.etree import cElementTree
"""
def check_method(method):
Deleted: /python/trunk/Lib/xml.py
==============================================================================
--- /python/trunk/Lib/xml.py Sat Jul 29 18:56:15 2006
+++ (empty file)
@@ -1,47 +0,0 @@
-"""Core XML support for Python.
-
-This package contains four sub-packages:
-
-dom -- The W3C Document Object Model. This supports DOM Level 1 +
- Namespaces.
-
-parsers -- Python wrappers for XML parsers (currently only supports Expat).
-
-sax -- The Simple API for XML, developed by XML-Dev, led by David
- Megginson and ported to Python by Lars Marius Garshol. This
- supports the SAX 2 API.
-
-etree -- The ElementTree XML library. This is a subset of the full
- ElementTree XML release.
-
-"""
-
-import sys
-import xmlcore
-
-__all__ = ["dom", "parsers", "sax", "etree"]
-
-# When being checked-out without options, this has the form
-# "Revision: x.y "
-# When exported using -kv, it is "x.y".
-__version__ = "$Revision$".split()[-2:][0]
-
-
-_MINIMUM_XMLPLUS_VERSION = (0, 8, 4)
-
-try:
- import _xmlplus
-except ImportError:
- sys.modules[__name__] = xmlcore
-else:
- try:
- v = _xmlplus.version_info
- except AttributeError:
- # _xmlplus is too old; ignore it
- pass
- else:
- if v >= _MINIMUM_XMLPLUS_VERSION:
- _xmlplus.__path__.extend(xmlcore.__path__)
- sys.modules[__name__] = _xmlplus
- else:
- del v
Modified: python/trunk/Lib/xml/dom/expatbuilder.py
==============================================================================
--- python/trunk/Lib/xml/dom/expatbuilder.py (original)
+++ python/trunk/Lib/xml/dom/expatbuilder.py Sat Jul 29 18:56:15 2006
@@ -59,7 +59,7 @@
"NMTOKENS": minidom.TypeInfo(None, "nmtokens"),
}
-class ElementInfo(NewStyle):
+class ElementInfo(object):
__slots__ = '_attr_info', '_model', 'tagName'
def __init__(self, tagName, model=None):
@@ -460,7 +460,7 @@
# where allowed.
_ALLOWED_FILTER_RETURNS = (FILTER_ACCEPT, FILTER_REJECT, FILTER_SKIP)
-class FilterVisibilityController(NewStyle):
+class FilterVisibilityController(object):
"""Wrapper around a DOMBuilderFilter which implements the checks
to make the whatToShow filter attribute work."""
@@ -518,7 +518,7 @@
}
-class FilterCrutch(NewStyle):
+class FilterCrutch(object):
__slots__ = '_builder', '_level', '_old_start', '_old_end'
def __init__(self, builder):
@@ -908,7 +908,7 @@
raise ParseEscape()
-def parse(file, namespaces=1):
+def parse(file, namespaces=True):
"""Parse a document, returning the resulting Document node.
'file' may be either a file name or an open file object.
@@ -929,7 +929,7 @@
return result
-def parseString(string, namespaces=1):
+def parseString(string, namespaces=True):
"""Parse a document from a string, returning the resulting
Document node.
"""
@@ -940,7 +940,7 @@
return builder.parseString(string)
-def parseFragment(file, context, namespaces=1):
+def parseFragment(file, context, namespaces=True):
"""Parse a fragment of a document, given the context from which it
was originally extracted. context should be the parent of the
node(s) which are in the fragment.
@@ -963,7 +963,7 @@
return result
-def parseFragmentString(string, context, namespaces=1):
+def parseFragmentString(string, context, namespaces=True):
"""Parse a fragment of a document from a string, given the context
from which it was originally extracted. context should be the
parent of the node(s) which are in the fragment.
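expatbuilder is normally reached through xml.dom.minidom, but it can also be driven directly; a small sketch of parseString() with the now-boolean namespaces default:

from xml.dom import expatbuilder

doc = expatbuilder.parseString("<root><child/></root>")   # namespaces=True by default
print doc.documentElement.tagName           # -> root
print len(doc.documentElement.childNodes)   # -> 1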
Modified: python/trunk/Lib/xml/dom/minicompat.py
==============================================================================
--- python/trunk/Lib/xml/dom/minicompat.py (original)
+++ python/trunk/Lib/xml/dom/minicompat.py Sat Jul 29 18:56:15 2006
@@ -4,10 +4,6 @@
#
# The following names are defined:
#
-# isinstance -- version of the isinstance() function that accepts
-# tuples as the second parameter regardless of the
-# Python version
-#
# NodeList -- lightest possible NodeList implementation
#
# EmptyNodeList -- lightest possible NodeList that is guaranteed to
@@ -15,8 +11,6 @@
#
# StringTypes -- tuple of defined string types
#
-# GetattrMagic -- base class used to make _get_ be magically
-# invoked when available
# defproperty -- function used in conjunction with GetattrMagic;
# using these together is needed to make them work
# as efficiently as possible in both Python 2.2+
@@ -41,14 +35,8 @@
#
# defproperty() should be used for each version of
# the relevant _get_() function.
-#
-# NewStyle -- base class to cause __slots__ to be honored in
-# the new world
-#
-# True, False -- only for Python 2.2 and earlier
-__all__ = ["NodeList", "EmptyNodeList", "NewStyle",
- "StringTypes", "defproperty", "GetattrMagic"]
+__all__ = ["NodeList", "EmptyNodeList", "StringTypes", "defproperty"]
import xml.dom
@@ -60,125 +48,63 @@
StringTypes = type(''), type(unicode(''))
-# define True and False only if not defined as built-ins
-try:
- True
-except NameError:
- True = 1
- False = 0
- __all__.extend(["True", "False"])
+class NodeList(list):
+ __slots__ = ()
+ def item(self, index):
+ if 0 <= index < len(self):
+ return self[index]
-try:
- isinstance('', StringTypes)
-except TypeError:
- #
- # Wrap isinstance() to make it compatible with the version in
- # Python 2.2 and newer.
- #
- _isinstance = isinstance
- def isinstance(obj, type_or_seq):
- try:
- return _isinstance(obj, type_or_seq)
- except TypeError:
- for t in type_or_seq:
- if _isinstance(obj, t):
- return 1
- return 0
- __all__.append("isinstance")
-
-
-if list is type([]):
- class NodeList(list):
- __slots__ = ()
-
- def item(self, index):
- if 0 <= index < len(self):
- return self[index]
-
- def _get_length(self):
- return len(self)
-
- def _set_length(self, value):
- raise xml.dom.NoModificationAllowedErr(
- "attempt to modify read-only attribute 'length'")
-
- length = property(_get_length, _set_length,
- doc="The number of nodes in the NodeList.")
-
- def __getstate__(self):
- return list(self)
-
- def __setstate__(self, state):
- self[:] = state
-
- class EmptyNodeList(tuple):
- __slots__ = ()
-
- def __add__(self, other):
- NL = NodeList()
- NL.extend(other)
- return NL
-
- def __radd__(self, other):
- NL = NodeList()
- NL.extend(other)
- return NL
-
- def item(self, index):
- return None
-
- def _get_length(self):
- return 0
-
- def _set_length(self, value):
- raise xml.dom.NoModificationAllowedErr(
- "attempt to modify read-only attribute 'length'")
+ def _get_length(self):
+ return len(self)
- length = property(_get_length, _set_length,
- doc="The number of nodes in the NodeList.")
+ def _set_length(self, value):
+ raise xml.dom.NoModificationAllowedErr(
+ "attempt to modify read-only attribute 'length'")
-else:
- def NodeList():
- return []
+ length = property(_get_length, _set_length,
+ doc="The number of nodes in the NodeList.")
- def EmptyNodeList():
- return []
+ def __getstate__(self):
+ return list(self)
+ def __setstate__(self, state):
+ self[:] = state
-try:
- property
-except NameError:
- def defproperty(klass, name, doc):
- # taken care of by the base __getattr__()
- pass
-
- class GetattrMagic:
- def __getattr__(self, key):
- if key.startswith("_"):
- raise AttributeError, key
-
- try:
- get = getattr(self, "_get_" + key)
- except AttributeError:
- raise AttributeError, key
- return get()
- class NewStyle:
- pass
+class EmptyNodeList(tuple):
+ __slots__ = ()
-else:
- def defproperty(klass, name, doc):
- get = getattr(klass, ("_get_" + name)).im_func
- def set(self, value, name=name):
- raise xml.dom.NoModificationAllowedErr(
- "attempt to modify read-only attribute " + repr(name))
- assert not hasattr(klass, "_set_" + name), \
- "expected not to find _set_" + name
- prop = property(get, set, doc=doc)
- setattr(klass, name, prop)
+ def __add__(self, other):
+ NL = NodeList()
+ NL.extend(other)
+ return NL
+
+ def __radd__(self, other):
+ NL = NodeList()
+ NL.extend(other)
+ return NL
+
+ def item(self, index):
+ return None
+
+ def _get_length(self):
+ return 0
+
+ def _set_length(self, value):
+ raise xml.dom.NoModificationAllowedErr(
+ "attempt to modify read-only attribute 'length'")
+
+ length = property(_get_length, _set_length,
+ doc="The number of nodes in the NodeList.")
- class GetattrMagic:
- pass
- NewStyle = object
+def defproperty(klass, name, doc):
+ get = getattr(klass, ("_get_" + name)).im_func
+ def set(self, value, name=name):
+ raise xml.dom.NoModificationAllowedErr(
+ "attempt to modify read-only attribute " + repr(name))
+ assert not hasattr(klass, "_set_" + name), \
+ "expected not to find _set_" + name
+ prop = property(get, set, doc=doc)
+ setattr(klass, name, prop)
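
For readers who have not met defproperty() before, a rough sketch of how it is typically used with the 2.x module above (the Token class and its text attribute are invented for this example; the helper relies on im_func, so the sketch is Python 2 specific):

    import xml.dom.minicompat as minicompat

    class Token(object):
        __slots__ = ("_text",)
        def __init__(self, text):
            self._text = text
        def _get_text(self):
            # defproperty() looks up this _get_<name> method ...
            return self._text

    # ... and installs a read-only property that raises
    # xml.dom.NoModificationAllowedErr on assignment.
    minicompat.defproperty(Token, "text", doc="The token's text content.")

    t = Token("hello")
    print(t.text)        # 'hello'
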
Modified: python/trunk/Lib/xml/dom/minidom.py
==============================================================================
--- python/trunk/Lib/xml/dom/minidom.py (original)
+++ python/trunk/Lib/xml/dom/minidom.py Sat Jul 29 18:56:15 2006
@@ -20,8 +20,6 @@
from xml.dom.minicompat import *
from xml.dom.xmlbuilder import DOMImplementationLS, DocumentLS
-_TupleType = type(())
-
# This is used by the ID-cache invalidation checks; the list isn't
# actually complete, since the nodes being checked will never be the
# DOCUMENT_NODE or DOCUMENT_FRAGMENT_NODE. (The node being checked is
@@ -31,7 +29,7 @@
xml.dom.Node.ENTITY_REFERENCE_NODE)
-class Node(xml.dom.Node, GetattrMagic):
+class Node(xml.dom.Node):
namespaceURI = None # this is non-null only for elements and attributes
parentNode = None
ownerDocument = None
@@ -459,7 +457,7 @@
defproperty(Attr, "schemaType", doc="Schema type for this attribute.")
-class NamedNodeMap(NewStyle, GetattrMagic):
+class NamedNodeMap(object):
"""The attribute list is a transient interface to the underlying
dictionaries. Mutations here will change the underlying element's
dictionary.
@@ -523,7 +521,7 @@
return cmp(id(self), id(other))
def __getitem__(self, attname_or_tuple):
- if isinstance(attname_or_tuple, _TupleType):
+ if isinstance(attname_or_tuple, tuple):
return self._attrsNS[attname_or_tuple]
else:
return self._attrs[attname_or_tuple]
@@ -613,7 +611,7 @@
AttributeList = NamedNodeMap
-class TypeInfo(NewStyle):
+class TypeInfo(object):
__slots__ = 'namespace', 'name'
def __init__(self, namespace, name):
@@ -1146,7 +1144,7 @@
writer.write("" % self.data)
-class ReadOnlySequentialNamedNodeMap(NewStyle, GetattrMagic):
+class ReadOnlySequentialNamedNodeMap(object):
__slots__ = '_seq',
def __init__(self, seq=()):
@@ -1170,7 +1168,7 @@
return n
def __getitem__(self, name_or_tuple):
- if isinstance(name_or_tuple, _TupleType):
+ if isinstance(name_or_tuple, tuple):
node = self.getNamedItemNS(*name_or_tuple)
else:
node = self.getNamedItem(name_or_tuple)
@@ -1418,7 +1416,7 @@
def _create_document(self):
return Document()
-class ElementInfo(NewStyle):
+class ElementInfo(object):
"""Object that represents content-model information for an element.
This implementation is not expected to be used in practice; DOM
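
With the pre-2.2 shims gone, NamedNodeMap.__getitem__ simply tests isinstance(key, tuple) instead of comparing against the old _TupleType alias; string keys select by qualified name, tuple keys by (namespaceURI, localName). A small illustration (the sample document is invented for the example):

    from xml.dom import minidom

    doc = minidom.parseString('<root xmlns:p="urn:example" p:a="1" b="2"/>')
    attrs = doc.documentElement.attributes

    print(attrs["b"].value)                    # lookup by qualified name -> '2'
    print(attrs[("urn:example", "a")].value)   # lookup by (namespaceURI, localName) -> '1'
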
Modified: python/trunk/Lib/xml/dom/xmlbuilder.py
==============================================================================
--- python/trunk/Lib/xml/dom/xmlbuilder.py (original)
+++ python/trunk/Lib/xml/dom/xmlbuilder.py Sat Jul 29 18:56:15 2006
@@ -3,8 +3,6 @@
import copy
import xml.dom
-from xml.dom.minicompat import *
-
from xml.dom.NodeFilter import NodeFilter
@@ -211,7 +209,7 @@
return name.lower().replace('-', '_')
-class DOMEntityResolver(NewStyle):
+class DOMEntityResolver(object):
__slots__ = '_opener',
def resolveEntity(self, publicId, systemId):
@@ -255,7 +253,7 @@
return param.split("=", 1)[1].lower()
-class DOMInputSource(NewStyle):
+class DOMInputSource(object):
__slots__ = ('byteStream', 'characterStream', 'stringData',
'encoding', 'publicId', 'systemId', 'baseURI')
Modified: python/trunk/Lib/xml/etree/ElementInclude.py
==============================================================================
--- python/trunk/Lib/xml/etree/ElementInclude.py (original)
+++ python/trunk/Lib/xml/etree/ElementInclude.py Sat Jul 29 18:56:15 2006
@@ -1,141 +1,143 @@
-#
-# ElementTree
-# $Id: ElementInclude.py 1862 2004-06-18 07:31:02Z Fredrik $
-#
-# limited xinclude support for element trees
-#
-# history:
-# 2003-08-15 fl created
-# 2003-11-14 fl fixed default loader
-#
-# Copyright (c) 2003-2004 by Fredrik Lundh. All rights reserved.
-#
-# fredrik at pythonware.com
-# http://www.pythonware.com
-#
-# --------------------------------------------------------------------
-# The ElementTree toolkit is
-#
-# Copyright (c) 1999-2004 by Fredrik Lundh
-#
-# By obtaining, using, and/or copying this software and/or its
-# associated documentation, you agree that you have read, understood,
-# and will comply with the following terms and conditions:
-#
-# Permission to use, copy, modify, and distribute this software and
-# its associated documentation for any purpose and without fee is
-# hereby granted, provided that the above copyright notice appears in
-# all copies, and that both that copyright notice and this permission
-# notice appear in supporting documentation, and that the name of
-# Secret Labs AB or the author not be used in advertising or publicity
-# pertaining to distribution of the software without specific, written
-# prior permission.
-#
-# SECRET LABS AB AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD
-# TO THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANT-
-# ABILITY AND FITNESS. IN NO EVENT SHALL SECRET LABS AB OR THE AUTHOR
-# BE LIABLE FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY
-# DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS,
-# WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS
-# ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE
-# OF THIS SOFTWARE.
-# --------------------------------------------------------------------
-
-##
-# Limited XInclude support for the ElementTree package.
-##
-
-import copy
-import ElementTree
-
-XINCLUDE = "{http://www.w3.org/2001/XInclude}"
-
-XINCLUDE_INCLUDE = XINCLUDE + "include"
-XINCLUDE_FALLBACK = XINCLUDE + "fallback"
-
-##
-# Fatal include error.
-
-class FatalIncludeError(SyntaxError):
- pass
-
-##
-# Default loader. This loader reads an included resource from disk.
-#
-# @param href Resource reference.
-# @param parse Parse mode. Either "xml" or "text".
-# @param encoding Optional text encoding.
-# @return The expanded resource. If the parse mode is "xml", this
-# is an ElementTree instance. If the parse mode is "text", this
-# is a Unicode string. If the loader fails, it can return None
-# or raise an IOError exception.
-# @throws IOError If the loader fails to load the resource.
-
-def default_loader(href, parse, encoding=None):
- file = open(href)
- if parse == "xml":
- data = ElementTree.parse(file).getroot()
- else:
- data = file.read()
- if encoding:
- data = data.decode(encoding)
- file.close()
- return data
-
-##
-# Expand XInclude directives.
-#
-# @param elem Root element.
-# @param loader Optional resource loader. If omitted, it defaults
-# to {@link default_loader}. If given, it should be a callable
-# that implements the same interface as default_loader.
-# @throws FatalIncludeError If the function fails to include a given
-# resource, or if the tree contains malformed XInclude elements.
-# @throws IOError If the function fails to load a given resource.
-
-def include(elem, loader=None):
- if loader is None:
- loader = default_loader
- # look for xinclude elements
- i = 0
- while i < len(elem):
- e = elem[i]
- if e.tag == XINCLUDE_INCLUDE:
- # process xinclude directive
- href = e.get("href")
- parse = e.get("parse", "xml")
- if parse == "xml":
- node = loader(href, parse)
- if node is None:
- raise FatalIncludeError(
- "cannot load %r as %r" % (href, parse)
- )
- node = copy.copy(node)
- if e.tail:
- node.tail = (node.tail or "") + e.tail
- elem[i] = node
- elif parse == "text":
- text = loader(href, parse, e.get("encoding"))
- if text is None:
- raise FatalIncludeError(
- "cannot load %r as %r" % (href, parse)
- )
- if i:
- node = elem[i-1]
- node.tail = (node.tail or "") + text
- else:
- elem.text = (elem.text or "") + text + (e.tail or "")
- del elem[i]
- continue
- else:
- raise FatalIncludeError(
- "unknown parse type in xi:include tag (%r)" % parse
- )
- elif e.tag == XINCLUDE_FALLBACK:
- raise FatalIncludeError(
- "xi:fallback tag must be child of xi:include (%r)" % e.tag
- )
- else:
- include(e, loader)
- i = i + 1
-
+#
+# ElementTree
+# $Id: ElementInclude.py 1862 2004-06-18 07:31:02Z Fredrik $
+#
+# limited xinclude support for element trees
+#
+# history:
+# 2003-08-15 fl created
+# 2003-11-14 fl fixed default loader
+#
+# Copyright (c) 2003-2004 by Fredrik Lundh. All rights reserved.
+#
+# fredrik at pythonware.com
+# http://www.pythonware.com
+#
+# --------------------------------------------------------------------
+# The ElementTree toolkit is
+#
+# Copyright (c) 1999-2004 by Fredrik Lundh
+#
+# By obtaining, using, and/or copying this software and/or its
+# associated documentation, you agree that you have read, understood,
+# and will comply with the following terms and conditions:
+#
+# Permission to use, copy, modify, and distribute this software and
+# its associated documentation for any purpose and without fee is
+# hereby granted, provided that the above copyright notice appears in
+# all copies, and that both that copyright notice and this permission
+# notice appear in supporting documentation, and that the name of
+# Secret Labs AB or the author not be used in advertising or publicity
+# pertaining to distribution of the software without specific, written
+# prior permission.
+#
+# SECRET LABS AB AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD
+# TO THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANT-
+# ABILITY AND FITNESS. IN NO EVENT SHALL SECRET LABS AB OR THE AUTHOR
+# BE LIABLE FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY
+# DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS,
+# WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS
+# ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE
+# OF THIS SOFTWARE.
+# --------------------------------------------------------------------
+
+# Licensed to PSF under a Contributor Agreement.
+# See http://www.python.org/2.4/license for licensing details.
+
+##
+# Limited XInclude support for the ElementTree package.
+##
+
+import copy
+import ElementTree
+
+XINCLUDE = "{http://www.w3.org/2001/XInclude}"
+
+XINCLUDE_INCLUDE = XINCLUDE + "include"
+XINCLUDE_FALLBACK = XINCLUDE + "fallback"
+
+##
+# Fatal include error.
+
+class FatalIncludeError(SyntaxError):
+ pass
+
+##
+# Default loader. This loader reads an included resource from disk.
+#
+# @param href Resource reference.
+# @param parse Parse mode. Either "xml" or "text".
+# @param encoding Optional text encoding.
+# @return The expanded resource. If the parse mode is "xml", this
+# is an ElementTree instance. If the parse mode is "text", this
+# is a Unicode string. If the loader fails, it can return None
+# or raise an IOError exception.
+# @throws IOError If the loader fails to load the resource.
+
+def default_loader(href, parse, encoding=None):
+ file = open(href)
+ if parse == "xml":
+ data = ElementTree.parse(file).getroot()
+ else:
+ data = file.read()
+ if encoding:
+ data = data.decode(encoding)
+ file.close()
+ return data
+
+##
+# Expand XInclude directives.
+#
+# @param elem Root element.
+# @param loader Optional resource loader. If omitted, it defaults
+# to {@link default_loader}. If given, it should be a callable
+# that implements the same interface as default_loader.
+# @throws FatalIncludeError If the function fails to include a given
+# resource, or if the tree contains malformed XInclude elements.
+# @throws IOError If the function fails to load a given resource.
+
+def include(elem, loader=None):
+ if loader is None:
+ loader = default_loader
+ # look for xinclude elements
+ i = 0
+ while i < len(elem):
+ e = elem[i]
+ if e.tag == XINCLUDE_INCLUDE:
+ # process xinclude directive
+ href = e.get("href")
+ parse = e.get("parse", "xml")
+ if parse == "xml":
+ node = loader(href, parse)
+ if node is None:
+ raise FatalIncludeError(
+ "cannot load %r as %r" % (href, parse)
+ )
+ node = copy.copy(node)
+ if e.tail:
+ node.tail = (node.tail or "") + e.tail
+ elem[i] = node
+ elif parse == "text":
+ text = loader(href, parse, e.get("encoding"))
+ if text is None:
+ raise FatalIncludeError(
+ "cannot load %r as %r" % (href, parse)
+ )
+ if i:
+ node = elem[i-1]
+ node.tail = (node.tail or "") + text
+ else:
+ elem.text = (elem.text or "") + text + (e.tail or "")
+ del elem[i]
+ continue
+ else:
+ raise FatalIncludeError(
+ "unknown parse type in xi:include tag (%r)" % parse
+ )
+ elif e.tag == XINCLUDE_FALLBACK:
+ raise FatalIncludeError(
+ "xi:fallback tag must be child of xi:include (%r)" % e.tag
+ )
+ else:
+ include(e, loader)
+ i = i + 1
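
Apart from the new PSF notice, ElementInclude is functionally unchanged: include() walks the tree and hands every xi:include href to a loader callable. A minimal sketch of that hook, serving the included resource from memory instead of the filesystem (memory_loader and RESOURCES are invented for the example):

    from xml.etree import ElementTree, ElementInclude

    RESOURCES = {"notice.txt": "All rights reserved."}

    def memory_loader(href, parse, encoding=None):
        # Same signature as default_loader, but reads from a dict.
        data = RESOURCES[href]
        if parse == "xml":
            return ElementTree.fromstring(data)
        return data

    doc = ElementTree.fromstring(
        '<doc xmlns:xi="http://www.w3.org/2001/XInclude">'
        '<xi:include href="notice.txt" parse="text"/>'
        '</doc>')
    ElementInclude.include(doc, loader=memory_loader)
    print(ElementTree.tostring(doc))   # roughly: <doc>All rights reserved.</doc>
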
Modified: python/trunk/Lib/xml/etree/ElementPath.py
==============================================================================
--- python/trunk/Lib/xml/etree/ElementPath.py (original)
+++ python/trunk/Lib/xml/etree/ElementPath.py Sat Jul 29 18:56:15 2006
@@ -1,196 +1,198 @@
-#
-# ElementTree
-# $Id: ElementPath.py 1858 2004-06-17 21:31:41Z Fredrik $
-#
-# limited xpath support for element trees
-#
-# history:
-# 2003-05-23 fl created
-# 2003-05-28 fl added support for // etc
-# 2003-08-27 fl fixed parsing of periods in element names
-#
-# Copyright (c) 2003-2004 by Fredrik Lundh. All rights reserved.
-#
-# fredrik at pythonware.com
-# http://www.pythonware.com
-#
-# --------------------------------------------------------------------
-# The ElementTree toolkit is
-#
-# Copyright (c) 1999-2004 by Fredrik Lundh
-#
-# By obtaining, using, and/or copying this software and/or its
-# associated documentation, you agree that you have read, understood,
-# and will comply with the following terms and conditions:
-#
-# Permission to use, copy, modify, and distribute this software and
-# its associated documentation for any purpose and without fee is
-# hereby granted, provided that the above copyright notice appears in
-# all copies, and that both that copyright notice and this permission
-# notice appear in supporting documentation, and that the name of
-# Secret Labs AB or the author not be used in advertising or publicity
-# pertaining to distribution of the software without specific, written
-# prior permission.
-#
-# SECRET LABS AB AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD
-# TO THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANT-
-# ABILITY AND FITNESS. IN NO EVENT SHALL SECRET LABS AB OR THE AUTHOR
-# BE LIABLE FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY
-# DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS,
-# WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS
-# ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE
-# OF THIS SOFTWARE.
-# --------------------------------------------------------------------
-
-##
-# Implementation module for XPath support. There's usually no reason
-# to import this module directly; the ElementTree does this for
-# you, if needed.
-##
-
-import re
-
-xpath_tokenizer = re.compile(
- "(::|\.\.|\(\)|[/.*:\[\]\(\)@=])|((?:\{[^}]+\})?[^/:\[\]\(\)@=\s]+)|\s+"
- ).findall
-
-class xpath_descendant_or_self:
- pass
-
-##
-# Wrapper for a compiled XPath.
-
-class Path:
-
- ##
- # Create an Path instance from an XPath expression.
-
- def __init__(self, path):
- tokens = xpath_tokenizer(path)
- # the current version supports 'path/path'-style expressions only
- self.path = []
- self.tag = None
- if tokens and tokens[0][0] == "/":
- raise SyntaxError("cannot use absolute path on element")
- while tokens:
- op, tag = tokens.pop(0)
- if tag or op == "*":
- self.path.append(tag or op)
- elif op == ".":
- pass
- elif op == "/":
- self.path.append(xpath_descendant_or_self())
- continue
- else:
- raise SyntaxError("unsupported path syntax (%s)" % op)
- if tokens:
- op, tag = tokens.pop(0)
- if op != "/":
- raise SyntaxError(
- "expected path separator (%s)" % (op or tag)
- )
- if self.path and isinstance(self.path[-1], xpath_descendant_or_self):
- raise SyntaxError("path cannot end with //")
- if len(self.path) == 1 and isinstance(self.path[0], type("")):
- self.tag = self.path[0]
-
- ##
- # Find first matching object.
-
- def find(self, element):
- tag = self.tag
- if tag is None:
- nodeset = self.findall(element)
- if not nodeset:
- return None
- return nodeset[0]
- for elem in element:
- if elem.tag == tag:
- return elem
- return None
-
- ##
- # Find text for first matching object.
-
- def findtext(self, element, default=None):
- tag = self.tag
- if tag is None:
- nodeset = self.findall(element)
- if not nodeset:
- return default
- return nodeset[0].text or ""
- for elem in element:
- if elem.tag == tag:
- return elem.text or ""
- return default
-
- ##
- # Find all matching objects.
-
- def findall(self, element):
- nodeset = [element]
- index = 0
- while 1:
- try:
- path = self.path[index]
- index = index + 1
- except IndexError:
- return nodeset
- set = []
- if isinstance(path, xpath_descendant_or_self):
- try:
- tag = self.path[index]
- if not isinstance(tag, type("")):
- tag = None
- else:
- index = index + 1
- except IndexError:
- tag = None # invalid path
- for node in nodeset:
- new = list(node.getiterator(tag))
- if new and new[0] is node:
- set.extend(new[1:])
- else:
- set.extend(new)
- else:
- for node in nodeset:
- for node in node:
- if path == "*" or node.tag == path:
- set.append(node)
- if not set:
- return []
- nodeset = set
-
-_cache = {}
-
-##
-# (Internal) Compile path.
-
-def _compile(path):
- p = _cache.get(path)
- if p is not None:
- return p
- p = Path(path)
- if len(_cache) >= 100:
- _cache.clear()
- _cache[path] = p
- return p
-
-##
-# Find first matching object.
-
-def find(element, path):
- return _compile(path).find(element)
-
-##
-# Find text for first matching object.
-
-def findtext(element, path, default=None):
- return _compile(path).findtext(element, default)
-
-##
-# Find all matching objects.
-
-def findall(element, path):
- return _compile(path).findall(element)
-
+#
+# ElementTree
+# $Id: ElementPath.py 1858 2004-06-17 21:31:41Z Fredrik $
+#
+# limited xpath support for element trees
+#
+# history:
+# 2003-05-23 fl created
+# 2003-05-28 fl added support for // etc
+# 2003-08-27 fl fixed parsing of periods in element names
+#
+# Copyright (c) 2003-2004 by Fredrik Lundh. All rights reserved.
+#
+# fredrik at pythonware.com
+# http://www.pythonware.com
+#
+# --------------------------------------------------------------------
+# The ElementTree toolkit is
+#
+# Copyright (c) 1999-2004 by Fredrik Lundh
+#
+# By obtaining, using, and/or copying this software and/or its
+# associated documentation, you agree that you have read, understood,
+# and will comply with the following terms and conditions:
+#
+# Permission to use, copy, modify, and distribute this software and
+# its associated documentation for any purpose and without fee is
+# hereby granted, provided that the above copyright notice appears in
+# all copies, and that both that copyright notice and this permission
+# notice appear in supporting documentation, and that the name of
+# Secret Labs AB or the author not be used in advertising or publicity
+# pertaining to distribution of the software without specific, written
+# prior permission.
+#
+# SECRET LABS AB AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD
+# TO THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANT-
+# ABILITY AND FITNESS. IN NO EVENT SHALL SECRET LABS AB OR THE AUTHOR
+# BE LIABLE FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY
+# DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS,
+# WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS
+# ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE
+# OF THIS SOFTWARE.
+# --------------------------------------------------------------------
+
+# Licensed to PSF under a Contributor Agreement.
+# See http://www.python.org/2.4/license for licensing details.
+
+##
+# Implementation module for XPath support. There's usually no reason
+# to import this module directly; the ElementTree does this for
+# you, if needed.
+##
+
+import re
+
+xpath_tokenizer = re.compile(
+ "(::|\.\.|\(\)|[/.*:\[\]\(\)@=])|((?:\{[^}]+\})?[^/:\[\]\(\)@=\s]+)|\s+"
+ ).findall
+
+class xpath_descendant_or_self:
+ pass
+
+##
+# Wrapper for a compiled XPath.
+
+class Path:
+
+ ##
+ # Create an Path instance from an XPath expression.
+
+ def __init__(self, path):
+ tokens = xpath_tokenizer(path)
+ # the current version supports 'path/path'-style expressions only
+ self.path = []
+ self.tag = None
+ if tokens and tokens[0][0] == "/":
+ raise SyntaxError("cannot use absolute path on element")
+ while tokens:
+ op, tag = tokens.pop(0)
+ if tag or op == "*":
+ self.path.append(tag or op)
+ elif op == ".":
+ pass
+ elif op == "/":
+ self.path.append(xpath_descendant_or_self())
+ continue
+ else:
+ raise SyntaxError("unsupported path syntax (%s)" % op)
+ if tokens:
+ op, tag = tokens.pop(0)
+ if op != "/":
+ raise SyntaxError(
+ "expected path separator (%s)" % (op or tag)
+ )
+ if self.path and isinstance(self.path[-1], xpath_descendant_or_self):
+ raise SyntaxError("path cannot end with //")
+ if len(self.path) == 1 and isinstance(self.path[0], type("")):
+ self.tag = self.path[0]
+
+ ##
+ # Find first matching object.
+
+ def find(self, element):
+ tag = self.tag
+ if tag is None:
+ nodeset = self.findall(element)
+ if not nodeset:
+ return None
+ return nodeset[0]
+ for elem in element:
+ if elem.tag == tag:
+ return elem
+ return None
+
+ ##
+ # Find text for first matching object.
+
+ def findtext(self, element, default=None):
+ tag = self.tag
+ if tag is None:
+ nodeset = self.findall(element)
+ if not nodeset:
+ return default
+ return nodeset[0].text or ""
+ for elem in element:
+ if elem.tag == tag:
+ return elem.text or ""
+ return default
+
+ ##
+ # Find all matching objects.
+
+ def findall(self, element):
+ nodeset = [element]
+ index = 0
+ while 1:
+ try:
+ path = self.path[index]
+ index = index + 1
+ except IndexError:
+ return nodeset
+ set = []
+ if isinstance(path, xpath_descendant_or_self):
+ try:
+ tag = self.path[index]
+ if not isinstance(tag, type("")):
+ tag = None
+ else:
+ index = index + 1
+ except IndexError:
+ tag = None # invalid path
+ for node in nodeset:
+ new = list(node.getiterator(tag))
+ if new and new[0] is node:
+ set.extend(new[1:])
+ else:
+ set.extend(new)
+ else:
+ for node in nodeset:
+ for node in node:
+ if path == "*" or node.tag == path:
+ set.append(node)
+ if not set:
+ return []
+ nodeset = set
+
+_cache = {}
+
+##
+# (Internal) Compile path.
+
+def _compile(path):
+ p = _cache.get(path)
+ if p is not None:
+ return p
+ p = Path(path)
+ if len(_cache) >= 100:
+ _cache.clear()
+ _cache[path] = p
+ return p
+
+##
+# Find first matching object.
+
+def find(element, path):
+ return _compile(path).find(element)
+
+##
+# Find text for first matching object.
+
+def findtext(element, path, default=None):
+ return _compile(path).findtext(element, default)
+
+##
+# Find all matching objects.
+
+def findall(element, path):
+ return _compile(path).findall(element)
Modified: python/trunk/Lib/xml/etree/ElementTree.py
==============================================================================
--- python/trunk/Lib/xml/etree/ElementTree.py (original)
+++ python/trunk/Lib/xml/etree/ElementTree.py Sat Jul 29 18:56:15 2006
@@ -1,1254 +1,1260 @@
-#
-# ElementTree
-# $Id: ElementTree.py 2326 2005-03-17 07:45:21Z fredrik $
-#
-# light-weight XML support for Python 1.5.2 and later.
-#
-# history:
-# 2001-10-20 fl created (from various sources)
-# 2001-11-01 fl return root from parse method
-# 2002-02-16 fl sort attributes in lexical order
-# 2002-04-06 fl TreeBuilder refactoring, added PythonDoc markup
-# 2002-05-01 fl finished TreeBuilder refactoring
-# 2002-07-14 fl added basic namespace support to ElementTree.write
-# 2002-07-25 fl added QName attribute support
-# 2002-10-20 fl fixed encoding in write
-# 2002-11-24 fl changed default encoding to ascii; fixed attribute encoding
-# 2002-11-27 fl accept file objects or file names for parse/write
-# 2002-12-04 fl moved XMLTreeBuilder back to this module
-# 2003-01-11 fl fixed entity encoding glitch for us-ascii
-# 2003-02-13 fl added XML literal factory
-# 2003-02-21 fl added ProcessingInstruction/PI factory
-# 2003-05-11 fl added tostring/fromstring helpers
-# 2003-05-26 fl added ElementPath support
-# 2003-07-05 fl added makeelement factory method
-# 2003-07-28 fl added more well-known namespace prefixes
-# 2003-08-15 fl fixed typo in ElementTree.findtext (Thomas Dartsch)
-# 2003-09-04 fl fall back on emulator if ElementPath is not installed
-# 2003-10-31 fl markup updates
-# 2003-11-15 fl fixed nested namespace bug
-# 2004-03-28 fl added XMLID helper
-# 2004-06-02 fl added default support to findtext
-# 2004-06-08 fl fixed encoding of non-ascii element/attribute names
-# 2004-08-23 fl take advantage of post-2.1 expat features
-# 2005-02-01 fl added iterparse implementation
-# 2005-03-02 fl fixed iterparse support for pre-2.2 versions
-#
-# Copyright (c) 1999-2005 by Fredrik Lundh. All rights reserved.
-#
-# fredrik at pythonware.com
-# http://www.pythonware.com
-#
-# --------------------------------------------------------------------
-# The ElementTree toolkit is
-#
-# Copyright (c) 1999-2005 by Fredrik Lundh
-#
-# By obtaining, using, and/or copying this software and/or its
-# associated documentation, you agree that you have read, understood,
-# and will comply with the following terms and conditions:
-#
-# Permission to use, copy, modify, and distribute this software and
-# its associated documentation for any purpose and without fee is
-# hereby granted, provided that the above copyright notice appears in
-# all copies, and that both that copyright notice and this permission
-# notice appear in supporting documentation, and that the name of
-# Secret Labs AB or the author not be used in advertising or publicity
-# pertaining to distribution of the software without specific, written
-# prior permission.
-#
-# SECRET LABS AB AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD
-# TO THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANT-
-# ABILITY AND FITNESS. IN NO EVENT SHALL SECRET LABS AB OR THE AUTHOR
-# BE LIABLE FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY
-# DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS,
-# WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS
-# ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE
-# OF THIS SOFTWARE.
-# --------------------------------------------------------------------
-
-__all__ = [
- # public symbols
- "Comment",
- "dump",
- "Element", "ElementTree",
- "fromstring",
- "iselement", "iterparse",
- "parse",
- "PI", "ProcessingInstruction",
- "QName",
- "SubElement",
- "tostring",
- "TreeBuilder",
- "VERSION", "XML",
- "XMLTreeBuilder",
- ]
-
-##
-# The Element type is a flexible container object, designed to
-# store hierarchical data structures in memory. The type can be
-# described as a cross between a list and a dictionary.
-#
-# Each element has a number of properties associated with it:
-#
-# - a tag. This is a string identifying what kind of data
-# this element represents (the element type, in other words).
-# - a number of attributes, stored in a Python dictionary.
-# - a text string.
-# - an optional tail string.
-# - a number of child elements, stored in a Python sequence
-#
-# To create an element instance, use the {@link #Element} or {@link
-# #SubElement} factory functions.
-#
-# The {@link #ElementTree} class can be used to wrap an element
-# structure, and convert it from and to XML.
-##
-
-import string, sys, re
-
-class _SimpleElementPath:
- # emulate pre-1.2 find/findtext/findall behaviour
- def find(self, element, tag):
- for elem in element:
- if elem.tag == tag:
- return elem
- return None
- def findtext(self, element, tag, default=None):
- for elem in element:
- if elem.tag == tag:
- return elem.text or ""
- return default
- def findall(self, element, tag):
- if tag[:3] == ".//":
- return element.getiterator(tag[3:])
- result = []
- for elem in element:
- if elem.tag == tag:
- result.append(elem)
- return result
-
-try:
- import ElementPath
-except ImportError:
- # FIXME: issue warning in this case?
- ElementPath = _SimpleElementPath()
-
-# TODO: add support for custom namespace resolvers/default namespaces
-# TODO: add improved support for incremental parsing
-
-VERSION = "1.2.6"
-
-##
-# Internal element class. This class defines the Element interface,
-# and provides a reference implementation of this interface.
-#
-# You should not create instances of this class directly. Use the
-# appropriate factory functions instead, such as {@link #Element}
-# and {@link #SubElement}.
-#
-# @see Element
-# @see SubElement
-# @see Comment
-# @see ProcessingInstruction
-
-class _ElementInterface:
- # text...tail
-
- ##
- # (Attribute) Element tag.
-
- tag = None
-
- ##
- # (Attribute) Element attribute dictionary. Where possible, use
- # {@link #_ElementInterface.get},
- # {@link #_ElementInterface.set},
- # {@link #_ElementInterface.keys}, and
- # {@link #_ElementInterface.items} to access
- # element attributes.
-
- attrib = None
-
- ##
- # (Attribute) Text before first subelement. This is either a
- # string or the value None, if there was no text.
-
- text = None
-
- ##
- # (Attribute) Text after this element's end tag, but before the
- # next sibling element's start tag. This is either a string or
- # the value None, if there was no text.
-
- tail = None # text after end tag, if any
-
- def __init__(self, tag, attrib):
- self.tag = tag
- self.attrib = attrib
- self._children = []
-
- def __repr__(self):
- return "" % (self.tag, id(self))
-
- ##
- # Creates a new element object of the same type as this element.
- #
- # @param tag Element tag.
- # @param attrib Element attributes, given as a dictionary.
- # @return A new element instance.
-
- def makeelement(self, tag, attrib):
- return Element(tag, attrib)
-
- ##
- # Returns the number of subelements.
- #
- # @return The number of subelements.
-
- def __len__(self):
- return len(self._children)
-
- ##
- # Returns the given subelement.
- #
- # @param index What subelement to return.
- # @return The given subelement.
- # @exception IndexError If the given element does not exist.
-
- def __getitem__(self, index):
- return self._children[index]
-
- ##
- # Replaces the given subelement.
- #
- # @param index What subelement to replace.
- # @param element The new element value.
- # @exception IndexError If the given element does not exist.
- # @exception AssertionError If element is not a valid object.
-
- def __setitem__(self, index, element):
- assert iselement(element)
- self._children[index] = element
-
- ##
- # Deletes the given subelement.
- #
- # @param index What subelement to delete.
- # @exception IndexError If the given element does not exist.
-
- def __delitem__(self, index):
- del self._children[index]
-
- ##
- # Returns a list containing subelements in the given range.
- #
- # @param start The first subelement to return.
- # @param stop The first subelement that shouldn't be returned.
- # @return A sequence object containing subelements.
-
- def __getslice__(self, start, stop):
- return self._children[start:stop]
-
- ##
- # Replaces a number of subelements with elements from a sequence.
- #
- # @param start The first subelement to replace.
- # @param stop The first subelement that shouldn't be replaced.
- # @param elements A sequence object with zero or more elements.
- # @exception AssertionError If a sequence member is not a valid object.
-
- def __setslice__(self, start, stop, elements):
- for element in elements:
- assert iselement(element)
- self._children[start:stop] = list(elements)
-
- ##
- # Deletes a number of subelements.
- #
- # @param start The first subelement to delete.
- # @param stop The first subelement to leave in there.
-
- def __delslice__(self, start, stop):
- del self._children[start:stop]
-
- ##
- # Adds a subelement to the end of this element.
- #
- # @param element The element to add.
- # @exception AssertionError If a sequence member is not a valid object.
-
- def append(self, element):
- assert iselement(element)
- self._children.append(element)
-
- ##
- # Inserts a subelement at the given position in this element.
- #
- # @param index Where to insert the new subelement.
- # @exception AssertionError If the element is not a valid object.
-
- def insert(self, index, element):
- assert iselement(element)
- self._children.insert(index, element)
-
- ##
- # Removes a matching subelement. Unlike the find methods,
- # this method compares elements based on identity, not on tag
- # value or contents.
- #
- # @param element What element to remove.
- # @exception ValueError If a matching element could not be found.
- # @exception AssertionError If the element is not a valid object.
-
- def remove(self, element):
- assert iselement(element)
- self._children.remove(element)
-
- ##
- # Returns all subelements. The elements are returned in document
- # order.
- #
- # @return A list of subelements.
- # @defreturn list of Element instances
-
- def getchildren(self):
- return self._children
-
- ##
- # Finds the first matching subelement, by tag name or path.
- #
- # @param path What element to look for.
- # @return The first matching element, or None if no element was found.
- # @defreturn Element or None
-
- def find(self, path):
- return ElementPath.find(self, path)
-
- ##
- # Finds text for the first matching subelement, by tag name or path.
- #
- # @param path What element to look for.
- # @param default What to return if the element was not found.
- # @return The text content of the first matching element, or the
- # default value if no element was found. Note that if the element
- # is found, but has no text content, this method returns an
- # empty string.
- # @defreturn string
-
- def findtext(self, path, default=None):
- return ElementPath.findtext(self, path, default)
-
- ##
- # Finds all matching subelements, by tag name or path.
- #
- # @param path What element to look for.
- # @return A list or iterator containing all matching elements,
- # in document order.
- # @defreturn list of Element instances
-
- def findall(self, path):
- return ElementPath.findall(self, path)
-
- ##
- # Resets an element. This function removes all subelements, clears
- # all attributes, and sets the text and tail attributes to None.
-
- def clear(self):
- self.attrib.clear()
- self._children = []
- self.text = self.tail = None
-
- ##
- # Gets an element attribute.
- #
- # @param key What attribute to look for.
- # @param default What to return if the attribute was not found.
- # @return The attribute value, or the default value, if the
- # attribute was not found.
- # @defreturn string or None
-
- def get(self, key, default=None):
- return self.attrib.get(key, default)
-
- ##
- # Sets an element attribute.
- #
- # @param key What attribute to set.
- # @param value The attribute value.
-
- def set(self, key, value):
- self.attrib[key] = value
-
- ##
- # Gets a list of attribute names. The names are returned in an
- # arbitrary order (just like for an ordinary Python dictionary).
- #
- # @return A list of element attribute names.
- # @defreturn list of strings
-
- def keys(self):
- return self.attrib.keys()
-
- ##
- # Gets element attributes, as a sequence. The attributes are
- # returned in an arbitrary order.
- #
- # @return A list of (name, value) tuples for all attributes.
- # @defreturn list of (string, string) tuples
-
- def items(self):
- return self.attrib.items()
-
- ##
- # Creates a tree iterator. The iterator loops over this element
- # and all subelements, in document order, and returns all elements
- # with a matching tag.
- #
- # If the tree structure is modified during iteration, the result
- # is undefined.
- #
- # @param tag What tags to look for (default is to return all elements).
- # @return A list or iterator containing all the matching elements.
- # @defreturn list or iterator
-
- def getiterator(self, tag=None):
- nodes = []
- if tag == "*":
- tag = None
- if tag is None or self.tag == tag:
- nodes.append(self)
- for node in self._children:
- nodes.extend(node.getiterator(tag))
- return nodes
-
-# compatibility
-_Element = _ElementInterface
-
-##
-# Element factory. This function returns an object implementing the
-# standard Element interface. The exact class or type of that object
-# is implementation dependent, but it will always be compatible with
-# the {@link #_ElementInterface} class in this module.
-#
-# The element name, attribute names, and attribute values can be
-# either 8-bit ASCII strings or Unicode strings.
-#
-# @param tag The element name.
-# @param attrib An optional dictionary, containing element attributes.
-# @param **extra Additional attributes, given as keyword arguments.
-# @return An element instance.
-# @defreturn Element
-
-def Element(tag, attrib={}, **extra):
- attrib = attrib.copy()
- attrib.update(extra)
- return _ElementInterface(tag, attrib)
-
-##
-# Subelement factory. This function creates an element instance, and
-# appends it to an existing element.
-#
-# The element name, attribute names, and attribute values can be
-# either 8-bit ASCII strings or Unicode strings.
-#
-# @param parent The parent element.
-# @param tag The subelement name.
-# @param attrib An optional dictionary, containing element attributes.
-# @param **extra Additional attributes, given as keyword arguments.
-# @return An element instance.
-# @defreturn Element
-
-def SubElement(parent, tag, attrib={}, **extra):
- attrib = attrib.copy()
- attrib.update(extra)
- element = parent.makeelement(tag, attrib)
- parent.append(element)
- return element
-
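
These two factories are the usual way to build a tree by hand; a short sketch (element and attribute names are invented for the example):

    from xml.etree.ElementTree import Element, SubElement, tostring

    root = Element("config", version="1")
    opt = SubElement(root, "option", name="verbose")
    opt.text = "true"
    print(tostring(root))
    # <config version="1"><option name="verbose">true</option></config>
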
-##
-# Comment element factory. This factory function creates a special
-# element that will be serialized as an XML comment.
-#
-# The comment string can be either an 8-bit ASCII string or a Unicode
-# string.
-#
-# @param text A string containing the comment string.
-# @return An element instance, representing a comment.
-# @defreturn Element
-
-def Comment(text=None):
- element = Element(Comment)
- element.text = text
- return element
-
-##
-# PI element factory. This factory function creates a special element
-# that will be serialized as an XML processing instruction.
-#
-# @param target A string containing the PI target.
-# @param text A string containing the PI contents, if any.
-# @return An element instance, representing a PI.
-# @defreturn Element
-
-def ProcessingInstruction(target, text=None):
- element = Element(ProcessingInstruction)
- element.text = target
- if text:
- element.text = element.text + " " + text
- return element
-
-PI = ProcessingInstruction
-
-##
-# QName wrapper. This can be used to wrap a QName attribute value, in
-# order to get proper namespace handling on output.
-#
-# @param text A string containing the QName value, in the form {uri}local,
-# or, if the tag argument is given, the URI part of a QName.
-# @param tag Optional tag. If given, the first argument is interpreted as
-# an URI, and this argument is interpreted as a local name.
-# @return An opaque object, representing the QName.
-
-class QName:
- def __init__(self, text_or_uri, tag=None):
- if tag:
- text_or_uri = "{%s}%s" % (text_or_uri, tag)
- self.text = text_or_uri
- def __str__(self):
- return self.text
- def __hash__(self):
- return hash(self.text)
- def __cmp__(self, other):
- if isinstance(other, QName):
- return cmp(self.text, other.text)
- return cmp(self.text, other)
-
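
A QName can be used wherever a "{uri}local" string is accepted, and the serializer turns it into a prefixed name with a matching xmlns declaration on output. A brief sketch (the namespace URI is invented for the example):

    from xml.etree.ElementTree import Element, QName, tostring

    NS = "http://example.com/ns"
    elem = Element(QName(NS, "item"))    # same as Element("{%s}item" % NS)
    print(tostring(elem))
    # <ns0:item xmlns:ns0="http://example.com/ns" />
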
-##
-# ElementTree wrapper class. This class represents an entire element
-# hierarchy, and adds some extra support for serialization to and from
-# standard XML.
-#
-# @param element Optional root element.
-# @keyparam file Optional file handle or name. If given, the
-# tree is initialized with the contents of this XML file.
-
-class ElementTree:
-
- def __init__(self, element=None, file=None):
- assert element is None or iselement(element)
- self._root = element # first node
- if file:
- self.parse(file)
-
- ##
- # Gets the root element for this tree.
- #
- # @return An element instance.
- # @defreturn Element
-
- def getroot(self):
- return self._root
-
- ##
- # Replaces the root element for this tree. This discards the
- # current contents of the tree, and replaces it with the given
- # element. Use with care.
- #
- # @param element An element instance.
-
- def _setroot(self, element):
- assert iselement(element)
- self._root = element
-
- ##
- # Loads an external XML document into this element tree.
- #
- # @param source A file name or file object.
- # @param parser An optional parser instance. If not given, the
- # standard {@link XMLTreeBuilder} parser is used.
- # @return The document root element.
- # @defreturn Element
-
- def parse(self, source, parser=None):
- if not hasattr(source, "read"):
- source = open(source, "rb")
- if not parser:
- parser = XMLTreeBuilder()
- while 1:
- data = source.read(32768)
- if not data:
- break
- parser.feed(data)
- self._root = parser.close()
- return self._root
-
- ##
- # Creates a tree iterator for the root element. The iterator loops
- # over all elements in this tree, in document order.
- #
- # @param tag What tags to look for (default is to return all elements)
- # @return An iterator.
- # @defreturn iterator
-
- def getiterator(self, tag=None):
- assert self._root is not None
- return self._root.getiterator(tag)
-
- ##
- # Finds the first toplevel element with given tag.
- # Same as getroot().find(path).
- #
- # @param path What element to look for.
- # @return The first matching element, or None if no element was found.
- # @defreturn Element or None
-
- def find(self, path):
- assert self._root is not None
- if path[:1] == "/":
- path = "." + path
- return self._root.find(path)
-
- ##
- # Finds the element text for the first toplevel element with given
- # tag. Same as getroot().findtext(path).
- #
- # @param path What toplevel element to look for.
- # @param default What to return if the element was not found.
- # @return The text content of the first matching element, or the
- # default value if no element was found. Note that if the element
- # is found, but has no text content, this method returns an
- # empty string.
- # @defreturn string
-
- def findtext(self, path, default=None):
- assert self._root is not None
- if path[:1] == "/":
- path = "." + path
- return self._root.findtext(path, default)
-
- ##
- # Finds all toplevel elements with the given tag.
- # Same as getroot().findall(path).
- #
- # @param path What element to look for.
- # @return A list or iterator containing all matching elements,
- # in document order.
- # @defreturn list of Element instances
-
- def findall(self, path):
- assert self._root is not None
- if path[:1] == "/":
- path = "." + path
- return self._root.findall(path)
-
- ##
- # Writes the element tree to a file, as XML.
- #
- # @param file A file name, or a file object opened for writing.
- # @param encoding Optional output encoding (default is US-ASCII).
-
- def write(self, file, encoding="us-ascii"):
- assert self._root is not None
- if not hasattr(file, "write"):
- file = open(file, "wb")
- if not encoding:
- encoding = "us-ascii"
- elif encoding != "utf-8" and encoding != "us-ascii":
- file.write("\n" % encoding)
- self._write(file, self._root, encoding, {})
-
- def _write(self, file, node, encoding, namespaces):
- # write XML to file
- tag = node.tag
- if tag is Comment:
- file.write("" % _escape_cdata(node.text, encoding))
- elif tag is ProcessingInstruction:
- file.write("%s?>" % _escape_cdata(node.text, encoding))
- else:
- items = node.items()
- xmlns_items = [] # new namespaces in this scope
- try:
- if isinstance(tag, QName) or tag[:1] == "{":
- tag, xmlns = fixtag(tag, namespaces)
- if xmlns: xmlns_items.append(xmlns)
- except TypeError:
- _raise_serialization_error(tag)
- file.write("<" + _encode(tag, encoding))
- if items or xmlns_items:
- items.sort() # lexical order
- for k, v in items:
- try:
- if isinstance(k, QName) or k[:1] == "{":
- k, xmlns = fixtag(k, namespaces)
- if xmlns: xmlns_items.append(xmlns)
- except TypeError:
- _raise_serialization_error(k)
- try:
- if isinstance(v, QName):
- v, xmlns = fixtag(v, namespaces)
- if xmlns: xmlns_items.append(xmlns)
- except TypeError:
- _raise_serialization_error(v)
- file.write(" %s=\"%s\"" % (_encode(k, encoding),
- _escape_attrib(v, encoding)))
- for k, v in xmlns_items:
- file.write(" %s=\"%s\"" % (_encode(k, encoding),
- _escape_attrib(v, encoding)))
- if node.text or len(node):
- file.write(">")
- if node.text:
- file.write(_escape_cdata(node.text, encoding))
- for n in node:
- self._write(file, n, encoding, namespaces)
- file.write("" + _encode(tag, encoding) + ">")
- else:
- file.write(" />")
- for k, v in xmlns_items:
- del namespaces[v]
- if node.tail:
- file.write(_escape_cdata(node.tail, encoding))
-
-# --------------------------------------------------------------------
-# helpers
-
-##
-# Checks if an object appears to be a valid element object.
-#
-# @param An element instance.
-# @return A true value if this is an element object.
-# @defreturn flag
-
-def iselement(element):
- # FIXME: not sure about this; might be a better idea to look
- # for tag/attrib/text attributes
- return isinstance(element, _ElementInterface) or hasattr(element, "tag")
-
-##
-# Writes an element tree or element structure to sys.stdout. This
-# function should be used for debugging only.
-#
-# The exact output format is implementation dependent. In this
-# version, it's written as an ordinary XML file.
-#
-# @param elem An element tree or an individual element.
-
-def dump(elem):
- # debugging
- if not isinstance(elem, ElementTree):
- elem = ElementTree(elem)
- elem.write(sys.stdout)
- tail = elem.getroot().tail
- if not tail or tail[-1] != "\n":
- sys.stdout.write("\n")
-
-def _encode(s, encoding):
- try:
- return s.encode(encoding)
- except AttributeError:
- return s # 1.5.2: assume the string uses the right encoding
-
-if sys.version[:3] == "1.5":
- _escape = re.compile(r"[&<>\"\x80-\xff]+") # 1.5.2
-else:
- _escape = re.compile(eval(r'u"[&<>\"\u0080-\uffff]+"'))
-
-_escape_map = {
- "&": "&",
- "<": "<",
- ">": ">",
- '"': """,
-}
-
-_namespace_map = {
- # "well-known" namespace prefixes
- "http://www.w3.org/XML/1998/namespace": "xml",
- "http://www.w3.org/1999/xhtml": "html",
- "http://www.w3.org/1999/02/22-rdf-syntax-ns#": "rdf",
- "http://schemas.xmlsoap.org/wsdl/": "wsdl",
-}
-
-def _raise_serialization_error(text):
- raise TypeError(
- "cannot serialize %r (type %s)" % (text, type(text).__name__)
- )
-
-def _encode_entity(text, pattern=_escape):
- # map reserved and non-ascii characters to numerical entities
- def escape_entities(m, map=_escape_map):
- out = []
- append = out.append
- for char in m.group():
- text = map.get(char)
- if text is None:
- text = "%d;" % ord(char)
- append(text)
- return string.join(out, "")
- try:
- return _encode(pattern.sub(escape_entities, text), "ascii")
- except TypeError:
- _raise_serialization_error(text)
-
-#
-# the following functions assume an ascii-compatible encoding
-# (or "utf-16")
-
-def _escape_cdata(text, encoding=None, replace=string.replace):
- # escape character data
- try:
- if encoding:
- try:
- text = _encode(text, encoding)
- except UnicodeError:
- return _encode_entity(text)
- text = replace(text, "&", "&")
- text = replace(text, "<", "<")
- text = replace(text, ">", ">")
- return text
- except (TypeError, AttributeError):
- _raise_serialization_error(text)
-
-def _escape_attrib(text, encoding=None, replace=string.replace):
- # escape attribute value
- try:
- if encoding:
- try:
- text = _encode(text, encoding)
- except UnicodeError:
- return _encode_entity(text)
- text = replace(text, "&", "&")
- text = replace(text, "'", "'") # FIXME: overkill
- text = replace(text, "\"", """)
- text = replace(text, "<", "<")
- text = replace(text, ">", ">")
- return text
- except (TypeError, AttributeError):
- _raise_serialization_error(text)
-
-def fixtag(tag, namespaces):
- # given a decorated tag (of the form {uri}tag), return prefixed
- # tag and namespace declaration, if any
- if isinstance(tag, QName):
- tag = tag.text
- namespace_uri, tag = string.split(tag[1:], "}", 1)
- prefix = namespaces.get(namespace_uri)
- if prefix is None:
- prefix = _namespace_map.get(namespace_uri)
- if prefix is None:
- prefix = "ns%d" % len(namespaces)
- namespaces[namespace_uri] = prefix
- if prefix == "xml":
- xmlns = None
- else:
- xmlns = ("xmlns:%s" % prefix, namespace_uri)
- else:
- xmlns = None
- return "%s:%s" % (prefix, tag), xmlns
-
-##
-# Parses an XML document into an element tree.
-#
-# @param source A filename or file object containing XML data.
-# @param parser An optional parser instance. If not given, the
-# standard {@link XMLTreeBuilder} parser is used.
-# @return An ElementTree instance
-
-def parse(source, parser=None):
- tree = ElementTree()
- tree.parse(source, parser)
- return tree
-
-##
-# Parses an XML document into an element tree incrementally, and reports
-# what's going on to the user.
-#
-# @param source A filename or file object containing XML data.
-# @param events A list of events to report back. If omitted, only "end"
-# events are reported.
-# @return A (event, elem) iterator.
-
-class iterparse:
-
- def __init__(self, source, events=None):
- if not hasattr(source, "read"):
- source = open(source, "rb")
- self._file = source
- self._events = []
- self._index = 0
- self.root = self._root = None
- self._parser = XMLTreeBuilder()
- # wire up the parser for event reporting
- parser = self._parser._parser
- append = self._events.append
- if events is None:
- events = ["end"]
- for event in events:
- if event == "start":
- try:
- parser.ordered_attributes = 1
- parser.specified_attributes = 1
- def handler(tag, attrib_in, event=event, append=append,
- start=self._parser._start_list):
- append((event, start(tag, attrib_in)))
- parser.StartElementHandler = handler
- except AttributeError:
- def handler(tag, attrib_in, event=event, append=append,
- start=self._parser._start):
- append((event, start(tag, attrib_in)))
- parser.StartElementHandler = handler
- elif event == "end":
- def handler(tag, event=event, append=append,
- end=self._parser._end):
- append((event, end(tag)))
- parser.EndElementHandler = handler
- elif event == "start-ns":
- def handler(prefix, uri, event=event, append=append):
- try:
- uri = _encode(uri, "ascii")
- except UnicodeError:
- pass
- append((event, (prefix or "", uri)))
- parser.StartNamespaceDeclHandler = handler
- elif event == "end-ns":
- def handler(prefix, event=event, append=append):
- append((event, None))
- parser.EndNamespaceDeclHandler = handler
-
- def next(self):
- while 1:
- try:
- item = self._events[self._index]
- except IndexError:
- if self._parser is None:
- self.root = self._root
- try:
- raise StopIteration
- except NameError:
- raise IndexError
- # load event buffer
- del self._events[:]
- self._index = 0
- data = self._file.read(16384)
- if data:
- self._parser.feed(data)
- else:
- self._root = self._parser.close()
- self._parser = None
- else:
- self._index = self._index + 1
- return item
-
- try:
- iter
- def __iter__(self):
- return self
- except NameError:
- def __getitem__(self, index):
- return self.next()
-
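
Since iterparse() drives the expat callbacks lazily, events can be consumed while the input is still being read. A small sketch using an in-memory file (StringIO is the 2.x spelling; Python 3 would use io.BytesIO):

    from xml.etree.ElementTree import iterparse
    from StringIO import StringIO

    source = StringIO("<log><entry>one</entry><entry>two</entry></log>")
    events = [(event, elem.tag)
              for event, elem in iterparse(source, events=("start", "end"))]
    # [('start', 'log'), ('start', 'entry'), ('end', 'entry'),
    #  ('start', 'entry'), ('end', 'entry'), ('end', 'log')]
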
-##
-# Parses an XML document from a string constant. This function can
-# be used to embed "XML literals" in Python code.
-#
-# @param source A string containing XML data.
-# @return An Element instance.
-# @defreturn Element
-
-def XML(text):
- parser = XMLTreeBuilder()
- parser.feed(text)
- return parser.close()
-
-##
-# Parses an XML document from a string constant, and also returns
-# a dictionary which maps from element id:s to elements.
-#
-# @param source A string containing XML data.
-# @return A tuple containing an Element instance and a dictionary.
-# @defreturn (Element, dictionary)
-
-def XMLID(text):
- parser = XMLTreeBuilder()
- parser.feed(text)
- tree = parser.close()
- ids = {}
- for elem in tree.getiterator():
- id = elem.get("id")
- if id:
- ids[id] = elem
- return tree, ids
-
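
XMLID() is convenient when a document uses plain id attributes; a one-line sketch:

    from xml.etree.ElementTree import XMLID

    tree, ids = XMLID('<doc><section id="intro">hello</section></doc>')
    print(ids["intro"].text)   # 'hello'
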
-##
-# Parses an XML document from a string constant. Same as {@link #XML}.
-#
-# @def fromstring(text)
-# @param source A string containing XML data.
-# @return An Element instance.
-# @defreturn Element
-
-fromstring = XML
-
-##
-# Generates a string representation of an XML element, including all
-# subelements.
-#
-# @param element An Element instance.
-# @return An encoded string containing the XML data.
-# @defreturn string
-
-def tostring(element, encoding=None):
- class dummy:
- pass
- data = []
- file = dummy()
- file.write = data.append
- ElementTree(element).write(file, encoding)
- return string.join(data, "")
-
-##
-# Generic element structure builder. This builder converts a sequence
-# of {@link #TreeBuilder.start}, {@link #TreeBuilder.data}, and {@link
-# #TreeBuilder.end} method calls to a well-formed element structure.
-#
-# You can use this class to build an element structure using a custom XML
-# parser, or a parser for some other XML-like format.
-#
-# @param element_factory Optional element factory. This factory
-# is called to create new Element instances, as necessary.
-
-class TreeBuilder:
-
- def __init__(self, element_factory=None):
- self._data = [] # data collector
- self._elem = [] # element stack
- self._last = None # last element
- self._tail = None # true if we're after an end tag
- if element_factory is None:
- element_factory = _ElementInterface
- self._factory = element_factory
-
- ##
- # Flushes the parser buffers, and returns the toplevel document
- # element.
- #
- # @return An Element instance.
- # @defreturn Element
-
- def close(self):
- assert len(self._elem) == 0, "missing end tags"
- assert self._last != None, "missing toplevel element"
- return self._last
-
- def _flush(self):
- if self._data:
- if self._last is not None:
- text = string.join(self._data, "")
- if self._tail:
- assert self._last.tail is None, "internal error (tail)"
- self._last.tail = text
- else:
- assert self._last.text is None, "internal error (text)"
- self._last.text = text
- self._data = []
-
- ##
- # Adds text to the current element.
- #
- # @param data A string. This should be either an 8-bit string
- # containing ASCII text, or a Unicode string.
-
- def data(self, data):
- self._data.append(data)
-
- ##
- # Opens a new element.
- #
- # @param tag The element name.
- # @param attrib A dictionary containing element attributes.
- # @return The opened element.
- # @defreturn Element
-
- def start(self, tag, attrs):
- self._flush()
- self._last = elem = self._factory(tag, attrs)
- if self._elem:
- self._elem[-1].append(elem)
- self._elem.append(elem)
- self._tail = 0
- return elem
-
- ##
- # Closes the current element.
- #
- # @param tag The element name.
- # @return The closed element.
- # @defreturn Element
-
- def end(self, tag):
- self._flush()
- self._last = self._elem.pop()
- assert self._last.tag == tag,\
- "end tag mismatch (expected %s, got %s)" % (
- self._last.tag, tag)
- self._tail = 1
- return self._last
-
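
Because TreeBuilder only needs start()/data()/end() calls, any front end can drive it, not just expat; a minimal sketch:

    from xml.etree.ElementTree import TreeBuilder, tostring

    builder = TreeBuilder()
    builder.start("greeting", {"lang": "en"})
    builder.data("hello")
    builder.end("greeting")
    print(tostring(builder.close()))   # <greeting lang="en">hello</greeting>
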
-##
-# Element structure builder for XML source data, based on the
-# expat parser.
-#
-# @keyparam target Target object. If omitted, the builder uses an
-# instance of the standard {@link #TreeBuilder} class.
-# @keyparam html Predefine HTML entities. This flag is not supported
-# by the current implementation.
-# @see #ElementTree
-# @see #TreeBuilder
-
-class XMLTreeBuilder:
-
- def __init__(self, html=0, target=None):
- try:
- from xml.parsers import expat
- except ImportError:
- raise ImportError(
- "No module named expat; use SimpleXMLTreeBuilder instead"
- )
- self._parser = parser = expat.ParserCreate(None, "}")
- if target is None:
- target = TreeBuilder()
- self._target = target
- self._names = {} # name memo cache
- # callbacks
- parser.DefaultHandlerExpand = self._default
- parser.StartElementHandler = self._start
- parser.EndElementHandler = self._end
- parser.CharacterDataHandler = self._data
- # let expat do the buffering, if supported
- try:
- self._parser.buffer_text = 1
- except AttributeError:
- pass
- # use new-style attribute handling, if supported
- try:
- self._parser.ordered_attributes = 1
- self._parser.specified_attributes = 1
- parser.StartElementHandler = self._start_list
- except AttributeError:
- pass
- encoding = None
- if not parser.returns_unicode:
- encoding = "utf-8"
- # target.xml(encoding, None)
- self._doctype = None
- self.entity = {}
-
- def _fixtext(self, text):
- # convert text string to ascii, if possible
- try:
- return _encode(text, "ascii")
- except UnicodeError:
- return text
-
- def _fixname(self, key):
- # expand qname, and convert name string to ascii, if possible
- try:
- name = self._names[key]
- except KeyError:
- name = key
- if "}" in name:
- name = "{" + name
- self._names[key] = name = self._fixtext(name)
- return name
-
- def _start(self, tag, attrib_in):
- fixname = self._fixname
- tag = fixname(tag)
- attrib = {}
- for key, value in attrib_in.items():
- attrib[fixname(key)] = self._fixtext(value)
- return self._target.start(tag, attrib)
-
- def _start_list(self, tag, attrib_in):
- fixname = self._fixname
- tag = fixname(tag)
- attrib = {}
- if attrib_in:
- for i in range(0, len(attrib_in), 2):
- attrib[fixname(attrib_in[i])] = self._fixtext(attrib_in[i+1])
- return self._target.start(tag, attrib)
-
- def _data(self, text):
- return self._target.data(self._fixtext(text))
-
- def _end(self, tag):
- return self._target.end(self._fixname(tag))
-
- def _default(self, text):
- prefix = text[:1]
- if prefix == "&":
- # deal with undefined entities
- try:
- self._target.data(self.entity[text[1:-1]])
- except KeyError:
- from xml.parsers import expat
- raise expat.error(
- "undefined entity %s: line %d, column %d" %
- (text, self._parser.ErrorLineNumber,
- self._parser.ErrorColumnNumber)
- )
- elif prefix == "<" and text[:9] == "<!DOCTYPE":
- self._doctype = [] # inside a doctype declaration
- elif self._doctype is not None:
- # parse doctype contents
- if prefix == ">":
- self._doctype = None
- return
- text = string.strip(text)
- if not text:
- return
- self._doctype.append(text)
- n = len(self._doctype)
- if n > 2:
- type = self._doctype[1]
- if type == "PUBLIC" and n == 4:
- name, type, pubid, system = self._doctype
- elif type == "SYSTEM" and n == 3:
- name, type, system = self._doctype
- pubid = None
- else:
- return
- if pubid:
- pubid = pubid[1:-1]
- self.doctype(name, pubid, system[1:-1])
- self._doctype = None
-
- ##
- # Handles a doctype declaration.
- #
- # @param name Doctype name.
- # @param pubid Public identifier.
- # @param system System identifier.
-
- def doctype(self, name, pubid, system):
- pass
-
- ##
- # Feeds data to the parser.
- #
- # @param data Encoded data.
-
- def feed(self, data):
- self._parser.Parse(data, 0)
-
- ##
- # Finishes feeding data to the parser.
- #
- # @return An element structure.
- # @defreturn Element
-
- def close(self):
- self._parser.Parse("", 1) # end of data
- tree = self._target.close()
- del self._target, self._parser # get rid of circular references
- return tree
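
A minimal usage sketch of the feed()/close() protocol shown above; the import path xml.etree.ElementTree is an assumption here and in the sketches that follow (adjust it for a standalone elementtree install):

    from xml.etree.ElementTree import XMLTreeBuilder

    parser = XMLTreeBuilder()
    # data may arrive in arbitrary chunks; the builder buffers as needed
    for chunk in ("<root><item>one</item>", "<item>two</item></root>"):
        parser.feed(chunk)
    root = parser.close()            # flush buffers, return the root element
    print(root.findtext("item"))     # -> one
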
+#
+# ElementTree
+# $Id: ElementTree.py 2326 2005-03-17 07:45:21Z fredrik $
+#
+# light-weight XML support for Python 1.5.2 and later.
+#
+# history:
+# 2001-10-20 fl created (from various sources)
+# 2001-11-01 fl return root from parse method
+# 2002-02-16 fl sort attributes in lexical order
+# 2002-04-06 fl TreeBuilder refactoring, added PythonDoc markup
+# 2002-05-01 fl finished TreeBuilder refactoring
+# 2002-07-14 fl added basic namespace support to ElementTree.write
+# 2002-07-25 fl added QName attribute support
+# 2002-10-20 fl fixed encoding in write
+# 2002-11-24 fl changed default encoding to ascii; fixed attribute encoding
+# 2002-11-27 fl accept file objects or file names for parse/write
+# 2002-12-04 fl moved XMLTreeBuilder back to this module
+# 2003-01-11 fl fixed entity encoding glitch for us-ascii
+# 2003-02-13 fl added XML literal factory
+# 2003-02-21 fl added ProcessingInstruction/PI factory
+# 2003-05-11 fl added tostring/fromstring helpers
+# 2003-05-26 fl added ElementPath support
+# 2003-07-05 fl added makeelement factory method
+# 2003-07-28 fl added more well-known namespace prefixes
+# 2003-08-15 fl fixed typo in ElementTree.findtext (Thomas Dartsch)
+# 2003-09-04 fl fall back on emulator if ElementPath is not installed
+# 2003-10-31 fl markup updates
+# 2003-11-15 fl fixed nested namespace bug
+# 2004-03-28 fl added XMLID helper
+# 2004-06-02 fl added default support to findtext
+# 2004-06-08 fl fixed encoding of non-ascii element/attribute names
+# 2004-08-23 fl take advantage of post-2.1 expat features
+# 2005-02-01 fl added iterparse implementation
+# 2005-03-02 fl fixed iterparse support for pre-2.2 versions
+#
+# Copyright (c) 1999-2005 by Fredrik Lundh. All rights reserved.
+#
+# fredrik at pythonware.com
+# http://www.pythonware.com
+#
+# --------------------------------------------------------------------
+# The ElementTree toolkit is
+#
+# Copyright (c) 1999-2005 by Fredrik Lundh
+#
+# By obtaining, using, and/or copying this software and/or its
+# associated documentation, you agree that you have read, understood,
+# and will comply with the following terms and conditions:
+#
+# Permission to use, copy, modify, and distribute this software and
+# its associated documentation for any purpose and without fee is
+# hereby granted, provided that the above copyright notice appears in
+# all copies, and that both that copyright notice and this permission
+# notice appear in supporting documentation, and that the name of
+# Secret Labs AB or the author not be used in advertising or publicity
+# pertaining to distribution of the software without specific, written
+# prior permission.
+#
+# SECRET LABS AB AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD
+# TO THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANT-
+# ABILITY AND FITNESS. IN NO EVENT SHALL SECRET LABS AB OR THE AUTHOR
+# BE LIABLE FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY
+# DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS,
+# WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS
+# ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE
+# OF THIS SOFTWARE.
+# --------------------------------------------------------------------
+
+# Licensed to PSF under a Contributor Agreement.
+# See http://www.python.org/2.4/license for licensing details.
+
+__all__ = [
+ # public symbols
+ "Comment",
+ "dump",
+ "Element", "ElementTree",
+ "fromstring",
+ "iselement", "iterparse",
+ "parse",
+ "PI", "ProcessingInstruction",
+ "QName",
+ "SubElement",
+ "tostring",
+ "TreeBuilder",
+ "VERSION", "XML",
+ "XMLParser", "XMLTreeBuilder",
+ ]
+
+##
+# The Element type is a flexible container object, designed to
+# store hierarchical data structures in memory. The type can be
+# described as a cross between a list and a dictionary.
+#
+# Each element has a number of properties associated with it:
+#
+# - a tag. This is a string identifying what kind of data
+#   this element represents (the element type, in other words).
+# - a number of attributes, stored in a Python dictionary.
+# - a text string.
+# - an optional tail string.
+# - a number of child elements, stored in a Python sequence
+#
+# To create an element instance, use the {@link #Element} or {@link
+# #SubElement} factory functions.
+#
+# The {@link #ElementTree} class can be used to wrap an element
+# structure, and convert it from and to XML.
+##
+
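
A short sketch of the tag/attrib/text/tail model described above (same assumed import path as before):

    from xml.etree.ElementTree import Element, SubElement, tostring

    root = Element("config", {"version": "1"})    # tag plus attribute dictionary
    item = SubElement(root, "item", name="demo")  # child element, keyword attributes
    item.text = "payload"                         # text before any children
    item.tail = "\n"                              # text after the element's end tag
    print(tostring(root))
    # -> <config version="1"><item name="demo">payload</item>\n</config>
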
+import string, sys, re
+
+class _SimpleElementPath:
+ # emulate pre-1.2 find/findtext/findall behaviour
+ def find(self, element, tag):
+ for elem in element:
+ if elem.tag == tag:
+ return elem
+ return None
+ def findtext(self, element, tag, default=None):
+ for elem in element:
+ if elem.tag == tag:
+ return elem.text or ""
+ return default
+ def findall(self, element, tag):
+ if tag[:3] == ".//":
+ return element.getiterator(tag[3:])
+ result = []
+ for elem in element:
+ if elem.tag == tag:
+ result.append(elem)
+ return result
+
+try:
+ import ElementPath
+except ImportError:
+ # FIXME: issue warning in this case?
+ ElementPath = _SimpleElementPath()
+
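
Whichever implementation is picked up, the path forms supported at this level are a plain child tag and the ".//tag" subtree search; for example:

    from xml.etree.ElementTree import XML

    root = XML("<a><b>1</b><c><b>2</b></c></a>")
    print([e.text for e in root.findall("b")])     # direct children only: ['1']
    print([e.text for e in root.findall(".//b")])  # whole subtree: ['1', '2']
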
+# TODO: add support for custom namespace resolvers/default namespaces
+# TODO: add improved support for incremental parsing
+
+VERSION = "1.2.6"
+
+##
+# Internal element class. This class defines the Element interface,
+# and provides a reference implementation of this interface.
+#
+# You should not create instances of this class directly. Use the
+# appropriate factory functions instead, such as {@link #Element}
+# and {@link #SubElement}.
+#
+# @see Element
+# @see SubElement
+# @see Comment
+# @see ProcessingInstruction
+
+class _ElementInterface:
+ # <tag attrib>text<child/>...</tag>tail
+
+ ##
+ # (Attribute) Element tag.
+
+ tag = None
+
+ ##
+ # (Attribute) Element attribute dictionary. Where possible, use
+ # {@link #_ElementInterface.get},
+ # {@link #_ElementInterface.set},
+ # {@link #_ElementInterface.keys}, and
+ # {@link #_ElementInterface.items} to access
+ # element attributes.
+
+ attrib = None
+
+ ##
+ # (Attribute) Text before first subelement. This is either a
+ # string or the value None, if there was no text.
+
+ text = None
+
+ ##
+ # (Attribute) Text after this element's end tag, but before the
+ # next sibling element's start tag. This is either a string or
+ # the value None, if there was no text.
+
+ tail = None # text after end tag, if any
+
+ def __init__(self, tag, attrib):
+ self.tag = tag
+ self.attrib = attrib
+ self._children = []
+
+ def __repr__(self):
+ return "" % (self.tag, id(self))
+
+ ##
+ # Creates a new element object of the same type as this element.
+ #
+ # @param tag Element tag.
+ # @param attrib Element attributes, given as a dictionary.
+ # @return A new element instance.
+
+ def makeelement(self, tag, attrib):
+ return Element(tag, attrib)
+
+ ##
+ # Returns the number of subelements.
+ #
+ # @return The number of subelements.
+
+ def __len__(self):
+ return len(self._children)
+
+ ##
+ # Returns the given subelement.
+ #
+ # @param index What subelement to return.
+ # @return The given subelement.
+ # @exception IndexError If the given element does not exist.
+
+ def __getitem__(self, index):
+ return self._children[index]
+
+ ##
+ # Replaces the given subelement.
+ #
+ # @param index What subelement to replace.
+ # @param element The new element value.
+ # @exception IndexError If the given element does not exist.
+ # @exception AssertionError If element is not a valid object.
+
+ def __setitem__(self, index, element):
+ assert iselement(element)
+ self._children[index] = element
+
+ ##
+ # Deletes the given subelement.
+ #
+ # @param index What subelement to delete.
+ # @exception IndexError If the given element does not exist.
+
+ def __delitem__(self, index):
+ del self._children[index]
+
+ ##
+ # Returns a list containing subelements in the given range.
+ #
+ # @param start The first subelement to return.
+ # @param stop The first subelement that shouldn't be returned.
+ # @return A sequence object containing subelements.
+
+ def __getslice__(self, start, stop):
+ return self._children[start:stop]
+
+ ##
+ # Replaces a number of subelements with elements from a sequence.
+ #
+ # @param start The first subelement to replace.
+ # @param stop The first subelement that shouldn't be replaced.
+ # @param elements A sequence object with zero or more elements.
+ # @exception AssertionError If a sequence member is not a valid object.
+
+ def __setslice__(self, start, stop, elements):
+ for element in elements:
+ assert iselement(element)
+ self._children[start:stop] = list(elements)
+
+ ##
+ # Deletes a number of subelements.
+ #
+ # @param start The first subelement to delete.
+ # @param stop The first subelement to leave in there.
+
+ def __delslice__(self, start, stop):
+ del self._children[start:stop]
+
+ ##
+ # Adds a subelement to the end of this element.
+ #
+ # @param element The element to add.
+ # @exception AssertionError If a sequence member is not a valid object.
+
+ def append(self, element):
+ assert iselement(element)
+ self._children.append(element)
+
+ ##
+ # Inserts a subelement at the given position in this element.
+ #
+ # @param index Where to insert the new subelement.
+ # @exception AssertionError If the element is not a valid object.
+
+ def insert(self, index, element):
+ assert iselement(element)
+ self._children.insert(index, element)
+
+ ##
+ # Removes a matching subelement. Unlike the find methods,
+ # this method compares elements based on identity, not on tag
+ # value or contents.
+ #
+ # @param element What element to remove.
+ # @exception ValueError If a matching element could not be found.
+ # @exception AssertionError If the element is not a valid object.
+
+ def remove(self, element):
+ assert iselement(element)
+ self._children.remove(element)
+
+ ##
+ # Returns all subelements. The elements are returned in document
+ # order.
+ #
+ # @return A list of subelements.
+ # @defreturn list of Element instances
+
+ def getchildren(self):
+ return self._children
+
+ ##
+ # Finds the first matching subelement, by tag name or path.
+ #
+ # @param path What element to look for.
+ # @return The first matching element, or None if no element was found.
+ # @defreturn Element or None
+
+ def find(self, path):
+ return ElementPath.find(self, path)
+
+ ##
+ # Finds text for the first matching subelement, by tag name or path.
+ #
+ # @param path What element to look for.
+ # @param default What to return if the element was not found.
+ # @return The text content of the first matching element, or the
+ # default value if no element was found. Note that if the element
+ # is found, but has no text content, this method returns an
+ # empty string.
+ # @defreturn string
+
+ def findtext(self, path, default=None):
+ return ElementPath.findtext(self, path, default)
+
+ ##
+ # Finds all matching subelements, by tag name or path.
+ #
+ # @param path What element to look for.
+ # @return A list or iterator containing all matching elements,
+ # in document order.
+ # @defreturn list of Element instances
+
+ def findall(self, path):
+ return ElementPath.findall(self, path)
+
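
The difference documented above between "no matching element" and "element found but empty" is easy to miss; a brief sketch:

    from xml.etree.ElementTree import XML

    elem = XML("<root><empty/><full>text</full></root>")
    print(repr(elem.findtext("full")))            # 'text'
    print(repr(elem.findtext("empty")))           # ''    (found, but no text)
    print(repr(elem.findtext("missing", "n/a")))  # 'n/a' (no matching element)
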
+ ##
+ # Resets an element. This function removes all subelements, clears
+ # all attributes, and sets the text and tail attributes to None.
+
+ def clear(self):
+ self.attrib.clear()
+ self._children = []
+ self.text = self.tail = None
+
+ ##
+ # Gets an element attribute.
+ #
+ # @param key What attribute to look for.
+ # @param default What to return if the attribute was not found.
+ # @return The attribute value, or the default value, if the
+ # attribute was not found.
+ # @defreturn string or None
+
+ def get(self, key, default=None):
+ return self.attrib.get(key, default)
+
+ ##
+ # Sets an element attribute.
+ #
+ # @param key What attribute to set.
+ # @param value The attribute value.
+
+ def set(self, key, value):
+ self.attrib[key] = value
+
+ ##
+ # Gets a list of attribute names. The names are returned in an
+ # arbitrary order (just like for an ordinary Python dictionary).
+ #
+ # @return A list of element attribute names.
+ # @defreturn list of strings
+
+ def keys(self):
+ return self.attrib.keys()
+
+ ##
+ # Gets element attributes, as a sequence. The attributes are
+ # returned in an arbitrary order.
+ #
+ # @return A list of (name, value) tuples for all attributes.
+ # @defreturn list of (string, string) tuples
+
+ def items(self):
+ return self.attrib.items()
+
+ ##
+ # Creates a tree iterator. The iterator loops over this element
+ # and all subelements, in document order, and returns all elements
+ # with a matching tag.
+ #
+ # If the tree structure is modified during iteration, the result
+ # is undefined.
+ #
+ # @param tag What tags to look for (default is to return all elements).
+ # @return A list or iterator containing all the matching elements.
+ # @defreturn list or iterator
+
+ def getiterator(self, tag=None):
+ nodes = []
+ if tag == "*":
+ tag = None
+ if tag is None or self.tag == tag:
+ nodes.append(self)
+ for node in self._children:
+ nodes.extend(node.getiterator(tag))
+ return nodes
+
+# compatibility
+_Element = _ElementInterface
+
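
getiterator() visits the element itself and every descendant in document order, optionally filtered by tag; for example:

    from xml.etree.ElementTree import XML

    root = XML("<doc><p>one</p><div><p>two</p></div></doc>")
    for elem in root.getiterator("p"):
        print(elem.text)        # prints "one", then "two"
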
+##
+# Element factory. This function returns an object implementing the
+# standard Element interface. The exact class or type of that object
+# is implementation dependent, but it will always be compatible with
+# the {@link #_ElementInterface} class in this module.
+#
+# The element name, attribute names, and attribute values can be
+# either 8-bit ASCII strings or Unicode strings.
+#
+# @param tag The element name.
+# @param attrib An optional dictionary, containing element attributes.
+# @param **extra Additional attributes, given as keyword arguments.
+# @return An element instance.
+# @defreturn Element
+
+def Element(tag, attrib={}, **extra):
+ attrib = attrib.copy()
+ attrib.update(extra)
+ return _ElementInterface(tag, attrib)
+
+##
+# Subelement factory. This function creates an element instance, and
+# appends it to an existing element.
+#
+# The element name, attribute names, and attribute values can be
+# either 8-bit ASCII strings or Unicode strings.
+#
+# @param parent The parent element.
+# @param tag The subelement name.
+# @param attrib An optional dictionary, containing element attributes.
+# @param **extra Additional attributes, given as keyword arguments.
+# @return An element instance.
+# @defreturn Element
+
+def SubElement(parent, tag, attrib={}, **extra):
+ attrib = attrib.copy()
+ attrib.update(extra)
+ element = parent.makeelement(tag, attrib)
+ parent.append(element)
+ return element
+
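
Both factories copy the attrib dictionary before merging in the keyword arguments, so a shared defaults dictionary is never mutated; a quick sketch:

    from xml.etree.ElementTree import Element, SubElement

    defaults = {"class": "box"}
    parent = Element("div", defaults, id="outer")
    child = SubElement(parent, "span", defaults, id="inner")
    print(defaults)          # still {'class': 'box'}
    print(parent.attrib)     # 'class' entry plus id='outer'
    print(child.attrib)      # 'class' entry plus id='inner'
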
+##
+# Comment element factory. This factory function creates a special
+# element that will be serialized as an XML comment.
+#
+# The comment string can be either an 8-bit ASCII string or a Unicode
+# string.
+#
+# @param text A string containing the comment string.
+# @return An element instance, representing a comment.
+# @defreturn Element
+
+def Comment(text=None):
+ element = Element(Comment)
+ element.text = text
+ return element
+
+##
+# PI element factory. This factory function creates a special element
+# that will be serialized as an XML processing instruction.
+#
+# @param target A string containing the PI target.
+# @param text A string containing the PI contents, if any.
+# @return An element instance, representing a PI.
+# @defreturn Element
+
+def ProcessingInstruction(target, text=None):
+ element = Element(ProcessingInstruction)
+ element.text = target
+ if text:
+ element.text = element.text + " " + text
+ return element
+
+PI = ProcessingInstruction
+
+##
+# QName wrapper. This can be used to wrap a QName attribute value, in
+# order to get proper namespace handling on output.
+#
+# @param text A string containing the QName value, in the form {uri}local,
+# or, if the tag argument is given, the URI part of a QName.
+# @param tag Optional tag. If given, the first argument is interpreted as
+# an URI, and this argument is interpreted as a local name.
+# @return An opaque object, representing the QName.
+
+class QName:
+ def __init__(self, text_or_uri, tag=None):
+ if tag:
+ text_or_uri = "{%s}%s" % (text_or_uri, tag)
+ self.text = text_or_uri
+ def __str__(self):
+ return self.text
+ def __hash__(self):
+ return hash(self.text)
+ def __cmp__(self, other):
+ if isinstance(other, QName):
+ return cmp(self.text, other.text)
+ return cmp(self.text, other)
+
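
A sketch of QName in use; the namespace URI is an arbitrary example:

    from xml.etree.ElementTree import Element, QName, tostring

    elem = Element(QName("http://example.com/ns", "item"))
    elem.set("ref", QName("http://example.com/ns", "other"))
    print(tostring(elem))
    # -> <ns0:item ref="ns0:other" xmlns:ns0="http://example.com/ns" />
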
+##
+# ElementTree wrapper class. This class represents an entire element
+# hierarchy, and adds some extra support for serialization to and from
+# standard XML.
+#
+# @param element Optional root element.
+# @keyparam file Optional file handle or name. If given, the
+# tree is initialized with the contents of this XML file.
+
+class ElementTree:
+
+ def __init__(self, element=None, file=None):
+ assert element is None or iselement(element)
+ self._root = element # first node
+ if file:
+ self.parse(file)
+
+ ##
+ # Gets the root element for this tree.
+ #
+ # @return An element instance.
+ # @defreturn Element
+
+ def getroot(self):
+ return self._root
+
+ ##
+ # Replaces the root element for this tree. This discards the
+ # current contents of the tree, and replaces it with the given
+ # element. Use with care.
+ #
+ # @param element An element instance.
+
+ def _setroot(self, element):
+ assert iselement(element)
+ self._root = element
+
+ ##
+ # Loads an external XML document into this element tree.
+ #
+ # @param source A file name or file object.
+ # @param parser An optional parser instance. If not given, the
+ # standard {@link XMLTreeBuilder} parser is used.
+ # @return The document root element.
+ # @defreturn Element
+
+ def parse(self, source, parser=None):
+ if not hasattr(source, "read"):
+ source = open(source, "rb")
+ if not parser:
+ parser = XMLTreeBuilder()
+ while 1:
+ data = source.read(32768)
+ if not data:
+ break
+ parser.feed(data)
+ self._root = parser.close()
+ return self._root
+
+ ##
+ # Creates a tree iterator for the root element. The iterator loops
+ # over all elements in this tree, in document order.
+ #
+ # @param tag What tags to look for (default is to return all elements)
+ # @return An iterator.
+ # @defreturn iterator
+
+ def getiterator(self, tag=None):
+ assert self._root is not None
+ return self._root.getiterator(tag)
+
+ ##
+ # Finds the first toplevel element with given tag.
+ # Same as getroot().find(path).
+ #
+ # @param path What element to look for.
+ # @return The first matching element, or None if no element was found.
+ # @defreturn Element or None
+
+ def find(self, path):
+ assert self._root is not None
+ if path[:1] == "/":
+ path = "." + path
+ return self._root.find(path)
+
+ ##
+ # Finds the element text for the first toplevel element with given
+ # tag. Same as getroot().findtext(path).
+ #
+ # @param path What toplevel element to look for.
+ # @param default What to return if the element was not found.
+ # @return The text content of the first matching element, or the
+ # default value if no element was found. Note that if the element
+ # is found, but has no text content, this method returns an
+ # empty string.
+ # @defreturn string
+
+ def findtext(self, path, default=None):
+ assert self._root is not None
+ if path[:1] == "/":
+ path = "." + path
+ return self._root.findtext(path, default)
+
+ ##
+ # Finds all toplevel elements with the given tag.
+ # Same as getroot().findall(path).
+ #
+ # @param path What element to look for.
+ # @return A list or iterator containing all matching elements,
+ # in document order.
+ # @defreturn list of Element instances
+
+ def findall(self, path):
+ assert self._root is not None
+ if path[:1] == "/":
+ path = "." + path
+ return self._root.findall(path)
+
+ ##
+ # Writes the element tree to a file, as XML.
+ #
+ # @param file A file name, or a file object opened for writing.
+ # @param encoding Optional output encoding (default is US-ASCII).
+
+ def write(self, file, encoding="us-ascii"):
+ assert self._root is not None
+ if not hasattr(file, "write"):
+ file = open(file, "wb")
+ if not encoding:
+ encoding = "us-ascii"
+ elif encoding != "utf-8" and encoding != "us-ascii":
+ file.write("\n" % encoding)
+ self._write(file, self._root, encoding, {})
+
+ def _write(self, file, node, encoding, namespaces):
+ # write XML to file
+ tag = node.tag
+ if tag is Comment:
+ file.write("" % _escape_cdata(node.text, encoding))
+ elif tag is ProcessingInstruction:
+ file.write("%s?>" % _escape_cdata(node.text, encoding))
+ else:
+ items = node.items()
+ xmlns_items = [] # new namespaces in this scope
+ try:
+ if isinstance(tag, QName) or tag[:1] == "{":
+ tag, xmlns = fixtag(tag, namespaces)
+ if xmlns: xmlns_items.append(xmlns)
+ except TypeError:
+ _raise_serialization_error(tag)
+ file.write("<" + _encode(tag, encoding))
+ if items or xmlns_items:
+ items.sort() # lexical order
+ for k, v in items:
+ try:
+ if isinstance(k, QName) or k[:1] == "{":
+ k, xmlns = fixtag(k, namespaces)
+ if xmlns: xmlns_items.append(xmlns)
+ except TypeError:
+ _raise_serialization_error(k)
+ try:
+ if isinstance(v, QName):
+ v, xmlns = fixtag(v, namespaces)
+ if xmlns: xmlns_items.append(xmlns)
+ except TypeError:
+ _raise_serialization_error(v)
+ file.write(" %s=\"%s\"" % (_encode(k, encoding),
+ _escape_attrib(v, encoding)))
+ for k, v in xmlns_items:
+ file.write(" %s=\"%s\"" % (_encode(k, encoding),
+ _escape_attrib(v, encoding)))
+ if node.text or len(node):
+ file.write(">")
+ if node.text:
+ file.write(_escape_cdata(node.text, encoding))
+ for n in node:
+ self._write(file, n, encoding, namespaces)
+ file.write("" + _encode(tag, encoding) + ">")
+ else:
+ file.write(" />")
+ for k, v in xmlns_items:
+ del namespaces[v]
+ if node.tail:
+ file.write(_escape_cdata(node.tail, encoding))
+
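
Putting the wrapper together: parse from any object with a read() method and write back through any object with a write() method (StringIO here, matching the module's Python 2 idiom):

    from StringIO import StringIO
    from xml.etree.ElementTree import ElementTree

    tree = ElementTree(file=StringIO("<root><item>x</item></root>"))
    print(tree.findtext("item"))        # 'x'
    out = StringIO()
    tree.write(out, encoding="utf-8")   # no declaration for utf-8/us-ascii
    print(out.getvalue())               # <root><item>x</item></root>
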
+# --------------------------------------------------------------------
+# helpers
+
+##
+# Checks if an object appears to be a valid element object.
+#
+ # @param element An element instance.
+# @return A true value if this is an element object.
+# @defreturn flag
+
+def iselement(element):
+ # FIXME: not sure about this; might be a better idea to look
+ # for tag/attrib/text attributes
+ return isinstance(element, _ElementInterface) or hasattr(element, "tag")
+
+##
+# Writes an element tree or element structure to sys.stdout. This
+# function should be used for debugging only.
+#
+# The exact output format is implementation dependent. In this
+# version, it's written as an ordinary XML file.
+#
+# @param elem An element tree or an individual element.
+
+def dump(elem):
+ # debugging
+ if not isinstance(elem, ElementTree):
+ elem = ElementTree(elem)
+ elem.write(sys.stdout)
+ tail = elem.getroot().tail
+ if not tail or tail[-1] != "\n":
+ sys.stdout.write("\n")
+
+def _encode(s, encoding):
+ try:
+ return s.encode(encoding)
+ except AttributeError:
+ return s # 1.5.2: assume the string uses the right encoding
+
+if sys.version[:3] == "1.5":
+ _escape = re.compile(r"[&<>\"\x80-\xff]+") # 1.5.2
+else:
+ _escape = re.compile(eval(r'u"[&<>\"\u0080-\uffff]+"'))
+
+_escape_map = {
+ "&": "&",
+ "<": "<",
+ ">": ">",
+ '"': """,
+}
+
+_namespace_map = {
+ # "well-known" namespace prefixes
+ "http://www.w3.org/XML/1998/namespace": "xml",
+ "http://www.w3.org/1999/xhtml": "html",
+ "http://www.w3.org/1999/02/22-rdf-syntax-ns#": "rdf",
+ "http://schemas.xmlsoap.org/wsdl/": "wsdl",
+}
+
+def _raise_serialization_error(text):
+ raise TypeError(
+ "cannot serialize %r (type %s)" % (text, type(text).__name__)
+ )
+
+def _encode_entity(text, pattern=_escape):
+ # map reserved and non-ascii characters to numerical entities
+ def escape_entities(m, map=_escape_map):
+ out = []
+ append = out.append
+ for char in m.group():
+ text = map.get(char)
+ if text is None:
+ text = "%d;" % ord(char)
+ append(text)
+ return string.join(out, "")
+ try:
+ return _encode(pattern.sub(escape_entities, text), "ascii")
+ except TypeError:
+ _raise_serialization_error(text)
+
+#
+# the following functions assume an ascii-compatible encoding
+# (or "utf-16")
+
+def _escape_cdata(text, encoding=None, replace=string.replace):
+ # escape character data
+ try:
+ if encoding:
+ try:
+ text = _encode(text, encoding)
+ except UnicodeError:
+ return _encode_entity(text)
+ text = replace(text, "&", "&")
+ text = replace(text, "<", "<")
+ text = replace(text, ">", ">")
+ return text
+ except (TypeError, AttributeError):
+ _raise_serialization_error(text)
+
+def _escape_attrib(text, encoding=None, replace=string.replace):
+ # escape attribute value
+ try:
+ if encoding:
+ try:
+ text = _encode(text, encoding)
+ except UnicodeError:
+ return _encode_entity(text)
+ text = replace(text, "&", "&")
+ text = replace(text, "'", "'") # FIXME: overkill
+ text = replace(text, "\"", """)
+ text = replace(text, "<", "<")
+ text = replace(text, ">", ">")
+ return text
+ except (TypeError, AttributeError):
+ _raise_serialization_error(text)
+
+def fixtag(tag, namespaces):
+ # given a decorated tag (of the form {uri}tag), return prefixed
+ # tag and namespace declaration, if any
+ if isinstance(tag, QName):
+ tag = tag.text
+ namespace_uri, tag = string.split(tag[1:], "}", 1)
+ prefix = namespaces.get(namespace_uri)
+ if prefix is None:
+ prefix = _namespace_map.get(namespace_uri)
+ if prefix is None:
+ prefix = "ns%d" % len(namespaces)
+ namespaces[namespace_uri] = prefix
+ if prefix == "xml":
+ xmlns = None
+ else:
+ xmlns = ("xmlns:%s" % prefix, namespace_uri)
+ else:
+ xmlns = None
+ return "%s:%s" % (prefix, tag), xmlns
+
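
fixtag() is the serializer's helper for Clark-notation ({uri}local) tags: the first time a URI is seen it allocates a prefix (ns0, ns1, ..., unless the URI is in _namespace_map) and returns the xmlns declaration to emit; a sketch using an arbitrary URI:

    from xml.etree.ElementTree import fixtag

    namespaces = {}
    print(fixtag("{http://example.com/ns}item", namespaces))
    # ('ns0:item', ('xmlns:ns0', 'http://example.com/ns'))
    print(fixtag("{http://example.com/ns}other", namespaces))
    # ('ns0:other', None) -- prefix already declared in this scope
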
+##
+# Parses an XML document into an element tree.
+#
+# @param source A filename or file object containing XML data.
+# @param parser An optional parser instance. If not given, the
+# standard {@link XMLTreeBuilder} parser is used.
+# @return An ElementTree instance
+
+def parse(source, parser=None):
+ tree = ElementTree()
+ tree.parse(source, parser)
+ return tree
+
+##
+# Parses an XML document into an element tree incrementally, and reports
+# what's going on to the user.
+#
+# @param source A filename or file object containing XML data.
+# @param events A list of events to report back. If omitted, only "end"
+# events are reported.
+# @return A (event, elem) iterator.
+
+class iterparse:
+
+ def __init__(self, source, events=None):
+ if not hasattr(source, "read"):
+ source = open(source, "rb")
+ self._file = source
+ self._events = []
+ self._index = 0
+ self.root = self._root = None
+ self._parser = XMLTreeBuilder()
+ # wire up the parser for event reporting
+ parser = self._parser._parser
+ append = self._events.append
+ if events is None:
+ events = ["end"]
+ for event in events:
+ if event == "start":
+ try:
+ parser.ordered_attributes = 1
+ parser.specified_attributes = 1
+ def handler(tag, attrib_in, event=event, append=append,
+ start=self._parser._start_list):
+ append((event, start(tag, attrib_in)))
+ parser.StartElementHandler = handler
+ except AttributeError:
+ def handler(tag, attrib_in, event=event, append=append,
+ start=self._parser._start):
+ append((event, start(tag, attrib_in)))
+ parser.StartElementHandler = handler
+ elif event == "end":
+ def handler(tag, event=event, append=append,
+ end=self._parser._end):
+ append((event, end(tag)))
+ parser.EndElementHandler = handler
+ elif event == "start-ns":
+ def handler(prefix, uri, event=event, append=append):
+ try:
+ uri = _encode(uri, "ascii")
+ except UnicodeError:
+ pass
+ append((event, (prefix or "", uri)))
+ parser.StartNamespaceDeclHandler = handler
+ elif event == "end-ns":
+ def handler(prefix, event=event, append=append):
+ append((event, None))
+ parser.EndNamespaceDeclHandler = handler
+
+ def next(self):
+ while 1:
+ try:
+ item = self._events[self._index]
+ except IndexError:
+ if self._parser is None:
+ self.root = self._root
+ try:
+ raise StopIteration
+ except NameError:
+ raise IndexError
+ # load event buffer
+ del self._events[:]
+ self._index = 0
+ data = self._file.read(16384)
+ if data:
+ self._parser.feed(data)
+ else:
+ self._root = self._parser.close()
+ self._parser = None
+ else:
+ self._index = self._index + 1
+ return item
+
+ try:
+ iter
+ def __iter__(self):
+ return self
+ except NameError:
+ def __getitem__(self, index):
+ return self.next()
+
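
A sketch of incremental processing with iterparse: an element is complete when its "end" event is reported, and can be cleared immediately to bound memory use:

    from StringIO import StringIO
    from xml.etree.ElementTree import iterparse

    source = StringIO("<log><rec>a</rec><rec>b</rec></log>")
    for event, elem in iterparse(source, events=("end",)):
        if elem.tag == "rec":
            print(elem.text)    # 'a', then 'b'
            elem.clear()        # discard text/children once processed
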
+##
+# Parses an XML document from a string constant. This function can
+# be used to embed "XML literals" in Python code.
+#
+# @param source A string containing XML data.
+# @return An Element instance.
+# @defreturn Element
+
+def XML(text):
+ parser = XMLTreeBuilder()
+ parser.feed(text)
+ return parser.close()
+
+##
+# Parses an XML document from a string constant, and also returns
+# a dictionary which maps from element id:s to elements.
+#
+# @param source A string containing XML data.
+# @return A tuple containing an Element instance and a dictionary.
+# @defreturn (Element, dictionary)
+
+def XMLID(text):
+ parser = XMLTreeBuilder()
+ parser.feed(text)
+ tree = parser.close()
+ ids = {}
+ for elem in tree.getiterator():
+ id = elem.get("id")
+ if id:
+ ids[id] = elem
+ return tree, ids
+
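
XML()/fromstring() and XMLID() side by side; the literals are arbitrary examples:

    from xml.etree.ElementTree import XML, XMLID

    elem = XML("<p>hello</p>")
    print(elem.text)                 # 'hello'

    tree, ids = XMLID('<doc><s id="a">x</s><s id="b">y</s></doc>')
    print(ids["a"].text)             # 'x'
    print(ids["b"].text)             # 'y'
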
+##
+# Parses an XML document from a string constant. Same as {@link #XML}.
+#
+# @def fromstring(text)
+# @param source A string containing XML data.
+# @return An Element instance.
+# @defreturn Element
+
+fromstring = XML
+
+##
+# Generates a string representation of an XML element, including all
+# subelements.
+#
+# @param element An Element instance.
+# @return An encoded string containing the XML data.
+# @defreturn string
+
+def tostring(element, encoding=None):
+ class dummy:
+ pass
+ data = []
+ file = dummy()
+ file.write = data.append
+ ElementTree(element).write(file, encoding)
+ return string.join(data, "")
+
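
tostring() serializes through a small in-memory file object; an encoding other than utf-8 or us-ascii also prepends an XML declaration:

    from xml.etree.ElementTree import Element, tostring

    elem = Element("data")
    elem.text = "value"
    print(tostring(elem))                  # <data>value</data>
    print(tostring(elem, "iso-8859-1"))    # declaration line, then <data>value</data>
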
+##
+# Generic element structure builder. This builder converts a sequence
+# of {@link #TreeBuilder.start}, {@link #TreeBuilder.data}, and {@link
+# #TreeBuilder.end} method calls to a well-formed element structure.
+#
+# You can use this class to build an element structure using a custom XML
+# parser, or a parser for some other XML-like format.
+#
+# @param element_factory Optional element factory. This factory
+# is called to create new Element instances, as necessary.
+
+class TreeBuilder:
+
+ def __init__(self, element_factory=None):
+ self._data = [] # data collector
+ self._elem = [] # element stack
+ self._last = None # last element
+ self._tail = None # true if we're after an end tag
+ if element_factory is None:
+ element_factory = _ElementInterface
+ self._factory = element_factory
+
+ ##
+ # Flushes the parser buffers, and returns the toplevel document
+ # element.
+ #
+ # @return An Element instance.
+ # @defreturn Element
+
+ def close(self):
+ assert len(self._elem) == 0, "missing end tags"
+ assert self._last != None, "missing toplevel element"
+ return self._last
+
+ def _flush(self):
+ if self._data:
+ if self._last is not None:
+ text = string.join(self._data, "")
+ if self._tail:
+ assert self._last.tail is None, "internal error (tail)"
+ self._last.tail = text
+ else:
+ assert self._last.text is None, "internal error (text)"
+ self._last.text = text
+ self._data = []
+
+ ##
+ # Adds text to the current element.
+ #
+ # @param data A string. This should be either an 8-bit string
+ # containing ASCII text, or a Unicode string.
+
+ def data(self, data):
+ self._data.append(data)
+
+ ##
+ # Opens a new element.
+ #
+ # @param tag The element name.
+ # @param attrib A dictionary containing element attributes.
+ # @return The opened element.
+ # @defreturn Element
+
+ def start(self, tag, attrs):
+ self._flush()
+ self._last = elem = self._factory(tag, attrs)
+ if self._elem:
+ self._elem[-1].append(elem)
+ self._elem.append(elem)
+ self._tail = 0
+ return elem
+
+ ##
+ # Closes the current element.
+ #
+ # @param tag The element name.
+ # @return The closed element.
+ # @defreturn Element
+
+ def end(self, tag):
+ self._flush()
+ self._last = self._elem.pop()
+ assert self._last.tag == tag,\
+ "end tag mismatch (expected %s, got %s)" % (
+ self._last.tag, tag)
+ self._tail = 1
+ return self._last
+
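
Any event source can drive a TreeBuilder directly through the start()/data()/end() protocol documented above; a minimal sketch:

    from xml.etree.ElementTree import TreeBuilder, tostring

    builder = TreeBuilder()
    builder.start("root", {})
    builder.start("item", {"n": "1"})
    builder.data("hello")
    builder.end("item")
    builder.end("root")
    root = builder.close()        # every element closed; root comes back
    print(tostring(root))         # <root><item n="1">hello</item></root>
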
+##
+# Element structure builder for XML source data, based on the
+#