[Python-checkins] distutils2: merged with Tarek's code

tarek.ziade python-checkins at python.org
Sun Aug 8 11:50:45 CEST 2010


tarek.ziade pushed e6a99de6bb38 to distutils2:

http://hg.python.org/distutils2/rev/e6a99de6bb38
changeset:   429:e6a99de6bb38
parent:      428:0f61b7399cfa
parent:      380:feceafae5855
user:        Josip Djolonga
date:        Sun Jul 18 03:08:42 2010 +0200
summary:     merged with Tarek's code
files:       src/distutils2/_backport/pkgutil.py

diff --git a/.hgignore b/.hgignore
--- a/.hgignore
+++ b/.hgignore
@@ -1,5 +1,9 @@
-.*\.pyc$
-.*\.pyo$
-^src/build
-^docs/build
-.*\.swp$
+syntax: glob
+*.py[co]
+__pycache__/
+configure.cache
+build/
+MANIFEST
+dist/
+*.swp
+.coverage
\ No newline at end of file
diff --git a/docs/design/pep-0376.txt b/docs/design/pep-0376.txt
--- a/docs/design/pep-0376.txt
+++ b/docs/design/pep-0376.txt
@@ -402,7 +402,7 @@
   ``.egg-info`` directory that contains a PKG-INFO that matches `name` 
   for the `name` metadata.
 
-  Notice that there should be at most one result. The first result founded
+  Notice that there should be at most one result. The first result found
   is returned. If the directory is not found, returns None.
 
 - ``get_file_users(path)`` -> iterator of ``Distribution`` instances.
@@ -633,7 +633,7 @@
 
 Distributions installed using existing, pre-standardization formats do not have
 the necessary metadata available for the new API, and thus will be
-ignored. Third-party tools may of course to continue to support previous
+ignored. Third-party tools may of course continue to support previous
 formats in addition to the new format, in order to ease the transition.
 
 
diff --git a/docs/design/wiki.rst b/docs/design/wiki.rst
--- a/docs/design/wiki.rst
+++ b/docs/design/wiki.rst
@@ -282,7 +282,7 @@
   mailman/etc/*               = {config}                # 8
   mailman/foo/**/bar/*.cfg    = {config}/baz            # 9
   mailman/foo/**/*.cfg        = {config}/hmm            # 9, 10
-  some-new-semantic.txt       = {funky-crazy-category}  # 11
+  some-new-semantic.sns       = {funky-crazy-category}  # 11
 
 The glob definitions are relative paths that match files from the top
 of the source tree (the location of ``setup.cfg``). Forward slashes
@@ -399,7 +399,7 @@
   the folder hierarchy from the module until we find a setup.cfg? A setup.cfg is
   necessary if you use distutils2, is it not?
 
-  -> information founded in setup.cfg will be put in the *FILES* file upon
+  -> information found in setup.cfg will be put in the *FILES* file upon
   installation in the egg-info directory. 
   IOW in the unbuit-egg case, we would need to create that dir, then use 
   pkgutil APIs.
diff --git a/docs/source/index.rst b/docs/source/index.rst
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -14,6 +14,10 @@
    metadata
    pkgutil
    depgraph
+   new_commands
+   test_framework
+   pypi
+   version
 
 Indices and tables
 ==================
diff --git a/docs/source/metadata.rst b/docs/source/metadata.rst
--- a/docs/source/metadata.rst
+++ b/docs/source/metadata.rst
@@ -17,7 +17,7 @@
 Reading metadata
 ================
 
-The :class:`DistributionMetadata` class can be instanciated with the path of
+The :class:`DistributionMetadata` class can be instantiated with the path of
 the metadata file, and provides a dict-like interface to the values::
 
     >>> from distutils2.metadata import DistributionMetadata
@@ -32,14 +32,14 @@
     ["pywin32; sys.platform == 'win32'", "Sphinx"]
 
 The fields that supports environment markers can be automatically ignored if
-the object is instanciated using the ``platform_dependant`` option.
+the object is instantiated using the ``platform_dependent`` option.
 :class:`DistributionMetadata` will interpret in the case the markers and will
 automatically remove the fields that are not compliant with the running
 environment. Here's an example under Mac OS X. The win32 dependency
 we saw earlier is ignored::
 
     >>> from distutils2.metadata import DistributionMetadata
-    >>> metadata = DistributionMetadata('PKG-INFO', platform_dependant=True)
+    >>> metadata = DistributionMetadata('PKG-INFO', platform_dependent=True)
     >>> metadata['Requires-Dist']
     ['bar']
 
@@ -53,7 +53,7 @@
 
     >>> from distutils2.metadata import DistributionMetadata
     >>> context = {'sys.platform': 'win32'}
-    >>> metadata = DistributionMetadata('PKG-INFO', platform_dependant=True,
+    >>> metadata = DistributionMetadata('PKG-INFO', platform_dependent=True,
     ...                                 execution_context=context)
     ...
     >>> metadata['Requires-Dist'] = ["pywin32; sys.platform == 'win32'",
@@ -71,15 +71,15 @@
     >>> metadata.write('/to/my/PKG-INFO')
 
 The class will pick the best version for the metadata, depending on the values
-provided. If all the values provided exists in all versions, teh class will
-used :attr:`metadata.PKG_INFO_PREFERRED_VERSION`. It is set by default to 1.0.
+provided. If all the values provided exist in all versions, the class will
+use :attr:`metadata.PKG_INFO_PREFERRED_VERSION`. It is set by default to 1.0.
 
 
 Conflict checking and best version
 ==================================
 
 Some fields in :pep:`345` have to follow a version scheme in their versions
-predicate. When the scheme is violated, a warning is emited::
+predicate. When the scheme is violated, a warning is emitted::
 
     >>> from distutils2.metadata import DistributionMetadata
     >>> metadata = DistributionMetadata()
@@ -90,6 +90,3 @@
 
 
 .. TODO talk about check()
-
-
-
diff --git a/docs/source/new_commands.rst b/docs/source/new_commands.rst
new file mode 100644
--- /dev/null
+++ b/docs/source/new_commands.rst
@@ -0,0 +1,65 @@
+========
+Commands
+========
+
+Distutils2 provides a set of commands that are not present in distutils itself.
+You might recognize some of them from other projects, like Distribute or
+Setuptools.
+
+``upload_docs`` - Upload package documentation to PyPI
+======================================================
+
+PyPI now supports uploading project documentation to the dedicated URL
+http://packages.python.org/<project>/.
+
+The ``upload_docs`` command will create the necessary zip file out of a
+documentation directory and will post to the repository.
+
+Note that to upload the documentation of a project, the corresponding version
+must already be registered with PyPI, using the distutils ``register``
+command -- just like the ``upload`` command.
+
+Assuming there is an ``Example`` project with documentation in the
+subdirectory ``docs``, e.g.::
+
+  Example/
+  |-- example.py
+  |-- setup.cfg
+  |-- setup.py
+  |-- docs
+  |   |-- build
+  |   |   `-- html
+  |   |   |   |-- index.html
+  |   |   |   `-- tips_tricks.html
+  |   |-- conf.py
+  |   |-- index.txt
+  |   `-- tips_tricks.txt
+
+You can simply pass the documentation directory path to the ``upload_docs``
+command::
+
+    python setup.py upload_docs --upload-dir=docs/build/html
+
+As with any other ``setuptools`` based command, you can define useful
+defaults in the ``setup.cfg`` of your Python project, e.g.:
+
+.. code-block:: ini
+
+    [upload_docs]
+    upload-dir = docs/build/html
+
+The ``upload_docs`` command has the following options:
+
+``--upload-dir``
+    The directory to be uploaded to the repository. The default value is
+    ``docs`` in the project root.
+
+``--show-response``
+    Display the full response text from server; this is useful for debugging
+    PyPI problems.
+
+``--repository=URL, -r URL``
+    The URL of the repository to upload to.  Defaults to
+    http://pypi.python.org/pypi (i.e., the main PyPI installation).
+
+
diff --git a/docs/source/pypi.rst b/docs/source/pypi.rst
new file mode 100644
--- /dev/null
+++ b/docs/source/pypi.rst
@@ -0,0 +1,195 @@
+=========================================
+Tools to query PyPI: the PyPI package
+=========================================
+
+Distutils2 comes with a module (namely `distutils2.pypi`) which contains
+facilities to access the Python Package Index (named "pypi", and available
+at the URL `http://pypi.python.org`).
+
+There are two ways to retrieve data from pypi: using the *simple* API, and
+using *XML-RPC*. The first one is in fact a set of HTML pages available at
+`http://pypi.python.org/simple/`, and the second one is a set of XML-RPC
+methods. In order to reduce the load caused by running remote methods on
+the pypi server (by using the XML-RPC methods), the best way to retrieve
+information is to use the simple API, when it contains the information you
+need.
+
+Distutils2 provides two python modules to ease the work with those two APIs:
+`distutils2.pypi.simple` and `distutils2.pypi.xmlrpc`. Both of them depend on
+another python module: `distutils2.pypi.dist`.
+
+
+Requesting information via the "simple" API `distutils2.pypi.simple`
+====================================================================
+
+`distutils2.pypi.simple` can process the Python Package Index and return
+download URLs of distributions, for specific versions or the latest one, but it
+can also process external HTML pages, with the goal of finding versions of
+python distributions that are not hosted on PyPI.
+
+You should use `distutils2.pypi.simple` to:
+
+    * Search for distributions by name and version.
+    * Process pypi external pages.
+    * Download distributions by name and version.
+
+And should not use it for:
+
+    * Tasks that would require too much index processing (like "finding all
+      distributions with a specific version, no matter the name")
+
+API
+----
+
+Here is a complete overview of the APIs of the SimpleIndex class.
+
+.. autoclass:: distutils2.pypi.simple.SimpleIndex
+    :members:
+
+Usage Examples
+--------------
+
+To help you understand how to use the `SimpleIndex` class, here are some basic
+usage examples.
+
+Request PyPI to get a specific distribution
+++++++++++++++++++++++++++++++++++++++++++++
+
+Suppose you want to scan the PyPI index to get a list of distributions for
+the "foobar" project. You can use the "find" method for that::
+
+    >>> from distutils2.pypi import SimpleIndex
+    >>> client = SimpleIndex()
+    >>> client.find("foobar")
+    [<PyPIDistribution "Foobar 1.1">, <PyPIDistribution "Foobar 1.2">]
+    
+Note that you can also ask the client for specific versions, using version
+specifiers (described in `PEP 345 
+<http://www.python.org/dev/peps/pep-0345/#version-specifiers>`_)::
+
+    >>> client.find("foobar < 1.2")
+    [<PyPIDistribution "foobar 1.1">, ]
+
+`find` returns a list of distributions, but you can also get the latest
+distribution (the most up to date) that fulfills your requirements, like this::
+    
+    >>> client.get("foobar < 1.2")
+    <PyPIDistribution "foobar 1.1">
+
+Download distributions
++++++++++++++++++++++++
+
+As it can get the URLs of distributions provided by PyPI, the `SimpleIndex`
+client can also download the distributions and put them for you in a temporary
+destination::
+
+    >>> client.download("foobar")
+    /tmp/temp_dir/foobar-1.2.tar.gz
+
+You can also specify the directory you want to download to::
+    
+    >>> client.download("foobar", "/path/to/my/dir")
+    /path/to/my/dir/foobar-1.2.tar.gz
+
+While downloading, the md5 of the archive is checked; if it does not match,
+the download is retried once, then `MD5HashDoesNotMatchError` is raised.
+
+Internally, it is not the SimpleIndex that downloads the distributions, but the
+`PyPIDistribution` class. Please refer to its documentation for more details.
+
+Following PyPI external links
+++++++++++++++++++++++++++++++
+
+The default behavior for distutils2 is *not* to follow the links provided
+by HTML pages in the "simple index" when looking for downloads related to
+distributions.
+
+It's possible to tell the PyPIClient to follow external links by setting the
+`follow_externals` attribute, at instantiation time or afterwards::
+
+    >>> client = SimpleIndex(follow_externals=True)
+
+or ::
+
+    >>> client = SimpleIndex()
+    >>> client.follow_externals = True
+
+Working with external indexes, and mirrors
++++++++++++++++++++++++++++++++++++++++++++
+
+The default `SimpleIndex` behavior is to rely on the Python Package Index served
+by PyPI (http://pypi.python.org/simple).
+
+Since you may need to work with a local index, or private indexes, you can
+specify one using the index_url parameter::
+
+    >>> client = SimpleIndex(index_url="file://filesystem/path/")
+
+or ::
+
+    >>> client = SimpleIndex(index_url="http://some.specific.url/")
+
+You can also specify mirrors to fall back on in case the first index_url you
+provided does not respond, or does not respond correctly. The default behavior
+for `SimpleIndex` is to use the list provided by Python.org DNS records, as
+described in :pep:`381` about the mirroring infrastructure.
+
+If you don't want to rely on these, you can provide the list of mirrors you
+want to try via the `mirrors` attribute. It's a simple iterable::
+
+    >>> mirrors = ["http://first.mirror","http://second.mirror"]
+    >>> client = SimpleIndex(mirrors=mirrors)
+
+
+Requesting information via XML-RPC (`distutils2.pypi.XmlRpcIndex`)
+==========================================================================
+
+The other way to query the Python Package Index is to use the XML-RPC
+methods. Distutils2 provides a simple wrapper around `xmlrpclib
+<http://docs.python.org/library/xmlrpclib.html>`_, that can return
+`PyPIDistribution` objects.
+
+::
+    >>> from distutils2.pypi import XmlRpcIndex
+    >>> client = XmlRpcIndex()
+
+
+PyPI Distributions
+==================
+
+Both the `SimpleIndex` and `XmlRpcIndex` classes work with the classes provided
+in the `pypi.dist` package.
+
+`PyPIDistribution`
+------------------
+
+`PyPIDistribution` is a simple class that defines the following attributes:
+
+:name:
+    The name of the package; `foobar` in our examples here
+:version:
+    The version of the package
+:location:
+    If the files from the archive have been downloaded, this is the path
+    where you can find them.
+:url:
+    The URL of the distribution
+
+.. autoclass:: distutils2.pypi.dist.PyPIDistribution
+    :members:
+
+`PyPIDistributions`
+-------------------
+
+The `dist` module also provides another class, to work with lists of
+`PyPIDistribution` instances. It allows filtering results and is used as a
+container of `PyPIDistribution` objects.
+
+.. autoclass:: distutils2.pypi.dist.PyPIDistributions
+    :members:
+
+At a higher level
+=================
+
+XXX: A description of a wrapper around the PyPI simple and XML-RPC indexes
+(PyPIIndex?)
diff --git a/docs/source/test_framework.rst b/docs/source/test_framework.rst
new file mode 100644
--- /dev/null
+++ b/docs/source/test_framework.rst
@@ -0,0 +1,88 @@
+==============
+Test Framework
+==============
+
+When you are testing code that works with distutils, you might find these tools
+useful.
+
+``PyPIServer``
+==============
+
+PyPIServer is a class that implements an HTTP server running in a separate
+thread. All it does is record the requests for further inspection. The recorded
+data is available under the ``requests`` attribute. The default
+HTTP response can be overridden with the ``default_response_status``,
+``default_response_headers`` and ``default_response_data`` attributes.
+
+By default, when accessing the server with URLs beginning with `/simple/`,
+the server also records your requests, but will look for files under
+the `/tests/pypiserver/simple/` path.
+
+You can tell the server to serve static files for other paths. This can be
+accomplished by using the `static_uri_paths` parameter, as below::
+
+    server = PyPIServer(static_uri_paths=["first_path", "second_path"])
+
+You need to create the content that will be served under the
+`/tests/pypiserver/default` path. If you want to serve content from another
+place, you can also specify another filesystem path (which needs to be under
+`tests/pypiserver/`). This will replace the default behavior of the server, and
+it will not serve content from the `default` dir ::
+
+    server = PyPIServer(static_filesystem_paths=["path/to/your/dir"])
+
+If you just need to add some paths to the existing ones, you can do as shown,
+keeping in mind that the server will always try to load paths in reverse order
+(e.g. here, try "another/super/path" then the default one) ::
+
+    server = PyPIServer(test_static_path="another/super/path")
+    server = PyPIServer("another/super/path")
+    # or 
+    server.static_filesystem_paths.append("another/super/path")
+
+As a result, in your tests, when you need to use the PyPIServer, in
+order to isolate the test cases, the best practice is to place the common files
+in the `default` folder, and to create a directory for each specific test case::
+
+    server = PyPIServer(static_filesystem_paths = ["default", "test_pypi_server"],
+        static_uri_paths=["simple", "external"])
+
+``PyPIServerTestCase``
+======================
+
+``PyPIServerTestCase`` is a test case class with setUp and tearDown methods that
+take care of a single PyPIServer instance attached as a ``pypi`` attribute on
+the test class. Use it as one of the base classes in your test case::
+
+  class UploadTestCase(PyPIServerTestCase):
+      def test_something(self):
+          cmd = self.prepare_command()
+          cmd.ensure_finalized()
+          cmd.repository = self.pypi.full_address
+          cmd.run()
+
+          environ, request_data = self.pypi.requests[-1]
+          self.assertEqual(request_data, EXPECTED_REQUEST_DATA)
+
+The ``use_pypi_server`` decorator
+=================================
+
+You can also use a decorator for your tests, if you do not need the same server
+instance across your whole test case. This way, you can specify, for each test
+method, some initialization parameters for the server.
+
+For this, you need to add a `server` parameter to your method, like this::
+
+    class SampleTestCase(TestCase):
+        @use_pypi_server()
+        def test_something(self, server):
+            # your tests go here
+
+The decorator will instantiate the server for you, and start and stop it just
+before and after your method call. You can also pass parameters to the server
+initializer, just like this::
+
+    class SampleTestCase(TestCase):
+        @use_pypi_server("test_case_name")
+        def test_something(self, server):
+            # something
diff --git a/docs/source/version.rst b/docs/source/version.rst
new file mode 100644
--- /dev/null
+++ b/docs/source/version.rst
@@ -0,0 +1,64 @@
+======================
+Working with versions
+======================
+
+Distutils2 ships with a python package capable of working with version numbers.
+It's an implementation of version specifiers `as defined in PEP 345
+<http://www.python.org/dev/peps/pep-0345/#version-specifiers>`_ about
+Metadata.
+
+`distutils2.version.NormalizedVersion`
+======================================
+
+A normalized version corresponds to a specific version of a distribution, as
+described in PEP 345. So, you can work with the `NormalizedVersion` class like
+this::
+
+    >>> NormalizedVersion("1.2b1")
+    NormalizedVersion('1.2b1')
+
+If you try to use irrational version specifiers, an `IrrationalVersionError`
+will be raised::
+
+    >>> NormalizedVersion("irrational_version_number")
+    ...
+    IrrationalVersionError: irrational_version_number
+
+You can compare NormalizedVersion objects, like this::
+
+    >>> NormalizedVersion("1.2b1") < NormalizedVersion("1.2")
+    True
+
+NormalizedVersion is used internally by `VersionPredicate` to do its work.
+
+`distutils2.version.suggest_normalized_version`
+-----------------------------------------------
+
+You can also let a normalized version be suggested to you, using the
+`suggest_normalized_version` function::
+
+    >>> suggest_normalized_version('2.1-rc1')
+    '2.1c1'
+
+If `suggest_normalized_version` can't actually suggest a version, it will
+return `None`::
+
+    >>> print suggest_normalized_version('not a version')
+    None
+
+`distutils2.version.VersionPredicate`
+=====================================
+
+`VersionPredicate` knows how to parse strings like "ProjectName (>=version)";
+the class also provides a `match` method to test if a version number matches
+the version predicate::
+
+    >>> version = VersionPredicate("ProjectName (<1.2,>1.0)")
+    >>> version.match("1.2.1")
+    False
+    >>> version.match("1.1.1")
+    True
+
+`is_valid_predicate`
+--------------------
+
diff --git a/src/CONTRIBUTORS.txt b/src/CONTRIBUTORS.txt
--- a/src/CONTRIBUTORS.txt
+++ b/src/CONTRIBUTORS.txt
@@ -5,7 +5,7 @@
 Distutils2 is a project that was started and that is maintained by
 Tarek Ziadé, and many people are contributing to the project.
 
-If you did, please add your name below in alphabetical order !
+If you did, please add your name below in alphabetical order!
 
 Thanks to:
 
@@ -13,14 +13,18 @@
 - Pior Bastida
 - Titus Brown
 - Nicolas Cadou
+- Konrad Delong
 - Josip Djolonga
 - Yannick Gringas
+- Jeremy Kloth
+- Martin von Löwis
 - Carl Meyer
+- Alexis Métaireau
+- Zubin Mithra
 - Michael Mulich
-- George Peris
+- George Peristerakis
 - Sean Reifschneider
+- Luis Rojas
 - Erik Rose
 - Brian Rosner
 - Alexandre Vassalotti
-- Martin von Löwis
-
diff --git a/src/DEVNOTES.txt b/src/DEVNOTES.txt
--- a/src/DEVNOTES.txt
+++ b/src/DEVNOTES.txt
@@ -3,8 +3,8 @@
 
 - Distutils2 runs from 2.4 to 3.2 (3.x not implemented yet), so
   make sure you don't use a syntax that doesn't work under
-  a specific Python version.
+  one of these Python versions.
 
 - Always run tests.sh before you push a change. This implies
-  that you have all Python versions installed.
+  that you have all Python versions installed from 2.4 to 2.6.
 
diff --git a/src/Modules/_hashopenssl.c b/src/Modules/_hashopenssl.c
new file mode 100644
--- /dev/null
+++ b/src/Modules/_hashopenssl.c
@@ -0,0 +1,524 @@
+/* Module that wraps all OpenSSL hash algorithms */
+
+/*
+ * Copyright (C) 2005   Gregory P. Smith (greg at krypto.org)
+ * Licensed to PSF under a Contributor Agreement.
+ *
+ * Derived from a skeleton of shamodule.c containing work performed by:
+ *
+ * Andrew Kuchling (amk at amk.ca)
+ * Greg Stein (gstein at lyra.org)
+ *
+ */
+
+#define PY_SSIZE_T_CLEAN
+
+#include "Python.h"
+#include "structmember.h"
+
+#if (PY_VERSION_HEX < 0x02050000)
+#define Py_ssize_t      int
+#endif
+
+/* EVP is the preferred interface to hashing in OpenSSL */
+#include <openssl/evp.h>
+
+#define MUNCH_SIZE INT_MAX
+
+
+#ifndef HASH_OBJ_CONSTRUCTOR
+#define HASH_OBJ_CONSTRUCTOR 0
+#endif
+
+typedef struct {
+    PyObject_HEAD
+    PyObject            *name;  /* name of this hash algorithm */
+    EVP_MD_CTX          ctx;    /* OpenSSL message digest context */
+} EVPobject;
+
+
+static PyTypeObject EVPtype;
+
+
+#define DEFINE_CONSTS_FOR_NEW(Name)  \
+    static PyObject *CONST_ ## Name ## _name_obj; \
+    static EVP_MD_CTX CONST_new_ ## Name ## _ctx; \
+    static EVP_MD_CTX *CONST_new_ ## Name ## _ctx_p = NULL;
+
+DEFINE_CONSTS_FOR_NEW(md5)
+DEFINE_CONSTS_FOR_NEW(sha1)
+DEFINE_CONSTS_FOR_NEW(sha224)
+DEFINE_CONSTS_FOR_NEW(sha256)
+DEFINE_CONSTS_FOR_NEW(sha384)
+DEFINE_CONSTS_FOR_NEW(sha512)
+
+
+static EVPobject *
+newEVPobject(PyObject *name)
+{
+    EVPobject *retval = (EVPobject *)PyObject_New(EVPobject, &EVPtype);
+
+    /* save the name for .name to return */
+    if (retval != NULL) {
+        Py_INCREF(name);
+        retval->name = name;
+    }
+
+    return retval;
+}
+
+/* Internal methods for a hash object */
+
+static void
+EVP_dealloc(PyObject *ptr)
+{
+    EVP_MD_CTX_cleanup(&((EVPobject *)ptr)->ctx);
+    Py_XDECREF(((EVPobject *)ptr)->name);
+    PyObject_Del(ptr);
+}
+
+
+/* External methods for a hash object */
+
+PyDoc_STRVAR(EVP_copy__doc__, "Return a copy of the hash object.");
+
+static PyObject *
+EVP_copy(EVPobject *self, PyObject *unused)
+{
+    EVPobject *newobj;
+
+    if ((newobj = newEVPobject(self->name)) == NULL)
+        return NULL;
+
+    EVP_MD_CTX_copy(&newobj->ctx, &self->ctx);
+    return (PyObject *)newobj;
+}
+
+PyDoc_STRVAR(EVP_digest__doc__,
+"Return the digest value as a string of binary data.");
+
+static PyObject *
+EVP_digest(EVPobject *self, PyObject *unused)
+{
+    unsigned char digest[EVP_MAX_MD_SIZE];
+    EVP_MD_CTX temp_ctx;
+    PyObject *retval;
+    unsigned int digest_size;
+
+    EVP_MD_CTX_copy(&temp_ctx, &self->ctx);
+    digest_size = EVP_MD_CTX_size(&temp_ctx);
+    EVP_DigestFinal(&temp_ctx, digest, NULL);
+
+    retval = PyString_FromStringAndSize((const char *)digest, digest_size);
+    EVP_MD_CTX_cleanup(&temp_ctx);
+    return retval;
+}
+
+PyDoc_STRVAR(EVP_hexdigest__doc__,
+"Return the digest value as a string of hexadecimal digits.");
+
+static PyObject *
+EVP_hexdigest(EVPobject *self, PyObject *unused)
+{
+    unsigned char digest[EVP_MAX_MD_SIZE];
+    EVP_MD_CTX temp_ctx;
+    PyObject *retval;
+    char *hex_digest;
+    unsigned int i, j, digest_size;
+
+    /* Get the raw (binary) digest value */
+    EVP_MD_CTX_copy(&temp_ctx, &self->ctx);
+    digest_size = EVP_MD_CTX_size(&temp_ctx);
+    EVP_DigestFinal(&temp_ctx, digest, NULL);
+
+    EVP_MD_CTX_cleanup(&temp_ctx);
+
+    /* Create a new string */
+    /* NOTE: not thread safe! modifying an already created string object */
+    /* (not a problem because we hold the GIL by default) */
+    retval = PyString_FromStringAndSize(NULL, digest_size * 2);
+    if (!retval)
+        return NULL;
+    hex_digest = PyString_AsString(retval);
+    if (!hex_digest) {
+        Py_DECREF(retval);
+        return NULL;
+    }
+
+    /* Make hex version of the digest */
+    for (i = j = 0; i < digest_size; i++) {
+        char c;
+        c = (digest[i] >> 4) & 0xf;
+        c = (c > 9) ? c + 'a' - 10 : c + '0';
+        hex_digest[j++] = c;
+        c = (digest[i] & 0xf);
+        c = (c > 9) ? c + 'a' - 10 : c + '0';
+        hex_digest[j++] = c;
+    }
+    return retval;
+}
+
+PyDoc_STRVAR(EVP_update__doc__,
+"Update this hash object's state with the provided string.");
+
+static PyObject *
+EVP_update(EVPobject *self, PyObject *args)
+{
+    unsigned char *cp;
+    Py_ssize_t len;
+
+    if (!PyArg_ParseTuple(args, "s#:update", &cp, &len))
+        return NULL;
+
+    if (len > 0 && len <= MUNCH_SIZE) {
+        EVP_DigestUpdate(&self->ctx, cp, Py_SAFE_DOWNCAST(len, Py_ssize_t,
+                                                          unsigned int));
+    } else {
+        Py_ssize_t offset = 0;
+        while (len) {
+            unsigned int process = len > MUNCH_SIZE ? MUNCH_SIZE : len;
+            EVP_DigestUpdate(&self->ctx, cp + offset, process);
+            len -= process;
+            offset += process;
+        }
+    }
+    Py_INCREF(Py_None);
+    return Py_None;
+}
+
+static PyMethodDef EVP_methods[] = {
+    {"update",	  (PyCFunction)EVP_update,    METH_VARARGS, EVP_update__doc__},
+    {"digest",	  (PyCFunction)EVP_digest,    METH_NOARGS,  EVP_digest__doc__},
+    {"hexdigest", (PyCFunction)EVP_hexdigest, METH_NOARGS,  EVP_hexdigest__doc__},
+    {"copy",	  (PyCFunction)EVP_copy,      METH_NOARGS,  EVP_copy__doc__},
+    {NULL,	  NULL}		/* sentinel */
+};
+
+static PyObject *
+EVP_get_block_size(EVPobject *self, void *closure)
+{
+    return PyInt_FromLong(EVP_MD_CTX_block_size(&((EVPobject *)self)->ctx));
+}
+
+static PyObject *
+EVP_get_digest_size(EVPobject *self, void *closure)
+{
+    return PyInt_FromLong(EVP_MD_CTX_size(&((EVPobject *)self)->ctx));
+}
+
+static PyMemberDef EVP_members[] = {
+    {"name", T_OBJECT, offsetof(EVPobject, name), READONLY, PyDoc_STR("algorithm name.")},
+    {NULL}  /* Sentinel */
+};
+
+static PyGetSetDef EVP_getseters[] = {
+    {"digest_size",
+     (getter)EVP_get_digest_size, NULL,
+     NULL,
+     NULL},
+    {"block_size",
+     (getter)EVP_get_block_size, NULL,
+     NULL,
+     NULL},
+    /* the old md5 and sha modules support 'digest_size' as in PEP 247.
+     * the old sha module also supported 'digestsize'.  ugh. */
+    {"digestsize",
+     (getter)EVP_get_digest_size, NULL,
+     NULL,
+     NULL},
+    {NULL}  /* Sentinel */
+};
+
+
+static PyObject *
+EVP_repr(PyObject *self)
+{
+    char buf[100];
+    PyOS_snprintf(buf, sizeof(buf), "<%s HASH object @ %p>",
+            PyString_AsString(((EVPobject *)self)->name), self);
+    return PyString_FromString(buf);
+}
+
+#if HASH_OBJ_CONSTRUCTOR
+static int
+EVP_tp_init(EVPobject *self, PyObject *args, PyObject *kwds)
+{
+    static char *kwlist[] = {"name", "string", NULL};
+    PyObject *name_obj = NULL;
+    char *nameStr;
+    unsigned char *cp = NULL;
+    Py_ssize_t len = 0;
+    const EVP_MD *digest;
+
+    if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|s#:HASH", kwlist,
+                                     &name_obj, &cp, &len)) {
+        return -1;
+    }
+
+    if (!PyArg_Parse(name_obj, "s", &nameStr)) {
+        PyErr_SetString(PyExc_TypeError, "name must be a string");
+        return -1;
+    }
+
+    digest = EVP_get_digestbyname(nameStr);
+    if (!digest) {
+        PyErr_SetString(PyExc_ValueError, "unknown hash function");
+        return -1;
+    }
+    EVP_DigestInit(&self->ctx, digest);
+
+    self->name = name_obj;
+    Py_INCREF(self->name);
+
+    if (cp && len) {
+        if (len > 0 && len <= MUNCH_SIZE) {
+            EVP_DigestUpdate(&self->ctx, cp, Py_SAFE_DOWNCAST(len, Py_ssize_t,
+                                                              unsigned int));
+        } else {
+            Py_ssize_t offset = 0;
+            while (len) {
+                unsigned int process = len > MUNCH_SIZE ? MUNCH_SIZE : len;
+                EVP_DigestUpdate(&self->ctx, cp + offset, process);
+                len -= process;
+                offset += process;
+            }
+        }
+    }
+    
+    return 0;
+}
+#endif
+
+
+PyDoc_STRVAR(hashtype_doc,
+"A hash represents the object used to calculate a checksum of a\n\
+string of information.\n\
+\n\
+Methods:\n\
+\n\
+update() -- updates the current digest with an additional string\n\
+digest() -- return the current digest value\n\
+hexdigest() -- return the current digest as a string of hexadecimal digits\n\
+copy() -- return a copy of the current hash object\n\
+\n\
+Attributes:\n\
+\n\
+name -- the hash algorithm being used by this object\n\
+digest_size -- number of bytes in this hash's output\n");
+
+static PyTypeObject EVPtype = {
+    PyObject_HEAD_INIT(NULL)
+    0,			/*ob_size*/
+    "_hashlib.HASH",    /*tp_name*/
+    sizeof(EVPobject),	/*tp_basicsize*/
+    0,			/*tp_itemsize*/
+    /* methods */
+    EVP_dealloc,	/*tp_dealloc*/
+    0,			/*tp_print*/
+    0,                  /*tp_getattr*/
+    0,                  /*tp_setattr*/
+    0,                  /*tp_compare*/
+    EVP_repr,           /*tp_repr*/
+    0,                  /*tp_as_number*/
+    0,                  /*tp_as_sequence*/
+    0,                  /*tp_as_mapping*/
+    0,                  /*tp_hash*/
+    0,                  /*tp_call*/
+    0,                  /*tp_str*/
+    0,                  /*tp_getattro*/
+    0,                  /*tp_setattro*/
+    0,                  /*tp_as_buffer*/
+    Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE, /*tp_flags*/
+    hashtype_doc,       /*tp_doc*/
+    0,                  /*tp_traverse*/
+    0,			/*tp_clear*/
+    0,			/*tp_richcompare*/
+    0,			/*tp_weaklistoffset*/
+    0,			/*tp_iter*/
+    0,			/*tp_iternext*/
+    EVP_methods,	/* tp_methods */
+    EVP_members,	/* tp_members */
+    EVP_getseters,      /* tp_getset */
+#if 1
+    0,                  /* tp_base */
+    0,                  /* tp_dict */
+    0,                  /* tp_descr_get */
+    0,                  /* tp_descr_set */
+    0,                  /* tp_dictoffset */
+#endif
+#if HASH_OBJ_CONSTRUCTOR
+    (initproc)EVP_tp_init, /* tp_init */
+#endif
+};
+
+static PyObject *
+EVPnew(PyObject *name_obj,
+       const EVP_MD *digest, const EVP_MD_CTX *initial_ctx,
+       const unsigned char *cp, Py_ssize_t len)
+{
+    EVPobject *self;
+
+    if (!digest && !initial_ctx) {
+        PyErr_SetString(PyExc_ValueError, "unsupported hash type");
+        return NULL;
+    }
+
+    if ((self = newEVPobject(name_obj)) == NULL)
+        return NULL;
+
+    if (initial_ctx) {
+        EVP_MD_CTX_copy(&self->ctx, initial_ctx);
+    } else {
+        EVP_DigestInit(&self->ctx, digest);
+    }
+
+    if (cp && len) {
+        if (len > 0 && len <= MUNCH_SIZE) {
+            EVP_DigestUpdate(&self->ctx, cp, Py_SAFE_DOWNCAST(len, Py_ssize_t,
+                                                              unsigned int));
+        } else {
+            Py_ssize_t offset = 0;
+            while (len) {
+                unsigned int process = len > MUNCH_SIZE ? MUNCH_SIZE : len;
+                EVP_DigestUpdate(&self->ctx, cp + offset, process);
+                len -= process;
+                offset += process;
+            }
+        }
+    }
+
+    return (PyObject *)self;
+}
+
+
+/* The module-level function: new() */
+
+PyDoc_STRVAR(EVP_new__doc__,
+"Return a new hash object using the named algorithm.\n\
+An optional string argument may be provided and will be\n\
+automatically hashed.\n\
+\n\
+The MD5 and SHA1 algorithms are always supported.\n");
+
+static PyObject *
+EVP_new(PyObject *self, PyObject *args, PyObject *kwdict)
+{
+    static char *kwlist[] = {"name", "string", NULL};
+    PyObject *name_obj = NULL;
+    char *name;
+    const EVP_MD *digest;
+    unsigned char *cp = NULL;
+    Py_ssize_t len = 0;
+
+    if (!PyArg_ParseTupleAndKeywords(args, kwdict, "O|s#:new", kwlist,
+                                     &name_obj, &cp, &len)) {
+        return NULL;
+    }
+
+    if (!PyArg_Parse(name_obj, "s", &name)) {
+        PyErr_SetString(PyExc_TypeError, "name must be a string");
+        return NULL;
+    }
+
+    digest = EVP_get_digestbyname(name);
+
+    return EVPnew(name_obj, digest, NULL, cp, len);
+}
+
+/*
+ *  This macro generates constructor function definitions for specific
+ *  hash algorithms.  These constructors are much faster than calling
+ *  the generic one passing it a Python string and are noticeably
+ *  faster than calling a Python new() wrapper.  That's important for
+ *  code that wants to make hashes of a bunch of small strings.
+ */
+#define GEN_CONSTRUCTOR(NAME)  \
+    static PyObject * \
+    EVP_new_ ## NAME (PyObject *self, PyObject *args) \
+    { \
+        unsigned char *cp = NULL; \
+        Py_ssize_t len = 0; \
+     \
+        if (!PyArg_ParseTuple(args, "|s#:" #NAME , &cp, &len)) { \
+            return NULL; \
+        } \
+     \
+        return EVPnew( \
+                CONST_ ## NAME ## _name_obj, \
+                NULL, \
+                CONST_new_ ## NAME ## _ctx_p, \
+                cp, len); \
+    }
+
+/* a PyMethodDef structure for the constructor */
+#define CONSTRUCTOR_METH_DEF(NAME)  \
+    {"openssl_" #NAME, (PyCFunction)EVP_new_ ## NAME, METH_VARARGS, \
+        PyDoc_STR("Returns a " #NAME \
+                  " hash object; optionally initialized with a string") \
+    }
+
+/* used in the init function to setup a constructor */
+#define INIT_CONSTRUCTOR_CONSTANTS(NAME)  do { \
+    CONST_ ## NAME ## _name_obj = PyString_FromString(#NAME); \
+    if (EVP_get_digestbyname(#NAME)) { \
+        CONST_new_ ## NAME ## _ctx_p = &CONST_new_ ## NAME ## _ctx; \
+        EVP_DigestInit(CONST_new_ ## NAME ## _ctx_p, EVP_get_digestbyname(#NAME)); \
+    } \
+} while (0);
+
+GEN_CONSTRUCTOR(md5)
+GEN_CONSTRUCTOR(sha1)
+GEN_CONSTRUCTOR(sha224)
+GEN_CONSTRUCTOR(sha256)
+GEN_CONSTRUCTOR(sha384)
+GEN_CONSTRUCTOR(sha512)
+
+/* List of functions exported by this module */
+
+static struct PyMethodDef EVP_functions[] = {
+    {"new", (PyCFunction)EVP_new, METH_VARARGS|METH_KEYWORDS, EVP_new__doc__},
+    CONSTRUCTOR_METH_DEF(md5),
+    CONSTRUCTOR_METH_DEF(sha1),
+    CONSTRUCTOR_METH_DEF(sha224),
+    CONSTRUCTOR_METH_DEF(sha256),
+    CONSTRUCTOR_METH_DEF(sha384),
+    CONSTRUCTOR_METH_DEF(sha512),
+    {NULL,	NULL}		 /* Sentinel */
+};
+
+
+/* Initialize this module. */
+
+PyMODINIT_FUNC
+init_hashlib(void)
+{
+    PyObject *m;
+
+    OpenSSL_add_all_digests();
+
+    /* TODO build EVP_functions openssl_* entries dynamically based
+     * on what hashes are supported rather than listing many
+     * but having some be unsupported.  Only init appropriate
+     * constants. */
+
+    EVPtype.ob_type = &PyType_Type;
+    if (PyType_Ready(&EVPtype) < 0)
+        return;
+
+    m = Py_InitModule("_hashlib", EVP_functions);
+    if (m == NULL)
+        return;
+
+#if HASH_OBJ_CONSTRUCTOR
+    Py_INCREF(&EVPtype);
+    PyModule_AddObject(m, "HASH", (PyObject *)&EVPtype);
+#endif
+
+    /* these constants are used by the convenience constructors */
+    INIT_CONSTRUCTOR_CONSTANTS(md5);
+    INIT_CONSTRUCTOR_CONSTANTS(sha1);
+    INIT_CONSTRUCTOR_CONSTANTS(sha224);
+    INIT_CONSTRUCTOR_CONSTANTS(sha256);
+    INIT_CONSTRUCTOR_CONSTANTS(sha384);
+    INIT_CONSTRUCTOR_CONSTANTS(sha512);
+}
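A note on the chunked-update logic in `EVPnew` above: buffers larger than `MUNCH_SIZE` are fed to `EVP_DigestUpdate` in pieces, which must produce the same digest as a single call. That invariant is observable from Python through the `hashlib` module that fronts this C code; the sketch below (an illustration, not part of the patch) checks it with an arbitrary 4 KiB chunk size:

```python
import hashlib

data = b"x" * 1_000_000

# One-shot hash of the whole buffer.
one_shot = hashlib.sha256(data).hexdigest()

# Incremental hashing in 4 KiB chunks -- analogous to the internal
# MUNCH_SIZE splitting the C code performs for large buffers.
h = hashlib.sha256()
for i in range(0, len(data), 4096):
    h.update(data[i:i + 4096])

assert h.hexdigest() == one_shot
```

Either path yields the identical digest, which is exactly what lets the C code pick chunk sizes freely to stay within `unsigned int` range.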
diff --git a/src/Modules/md5.c b/src/Modules/md5.c
new file mode 100644
--- /dev/null
+++ b/src/Modules/md5.c
@@ -0,0 +1,381 @@
+/*
+  Copyright (C) 1999, 2000, 2002 Aladdin Enterprises.  All rights reserved.
+
+  This software is provided 'as-is', without any express or implied
+  warranty.  In no event will the authors be held liable for any damages
+  arising from the use of this software.
+
+  Permission is granted to anyone to use this software for any purpose,
+  including commercial applications, and to alter it and redistribute it
+  freely, subject to the following restrictions:
+
+  1. The origin of this software must not be misrepresented; you must not
+     claim that you wrote the original software. If you use this software
+     in a product, an acknowledgment in the product documentation would be
+     appreciated but is not required.
+  2. Altered source versions must be plainly marked as such, and must not be
+     misrepresented as being the original software.
+  3. This notice may not be removed or altered from any source distribution.
+
+  L. Peter Deutsch
+  ghost at aladdin.com
+
+ */
+/* $Id: md5.c,v 1.6 2002/04/13 19:20:28 lpd Exp $ */
+/*
+  Independent implementation of MD5 (RFC 1321).
+
+  This code implements the MD5 Algorithm defined in RFC 1321, whose
+  text is available at
+	http://www.ietf.org/rfc/rfc1321.txt
+  The code is derived from the text of the RFC, including the test suite
+  (section A.5) but excluding the rest of Appendix A.  It does not include
+  any code or documentation that is identified in the RFC as being
+  copyrighted.
+
+  The original and principal author of md5.c is L. Peter Deutsch
+  <ghost at aladdin.com>.  Other authors are noted in the change history
+  that follows (in reverse chronological order):
+
+  2002-04-13 lpd Clarified derivation from RFC 1321; now handles byte order
+	either statically or dynamically; added missing #include <string.h>
+	in library.
+  2002-03-11 lpd Corrected argument list for main(), and added int return
+	type, in test program and T value program.
+  2002-02-21 lpd Added missing #include <stdio.h> in test program.
+  2000-07-03 lpd Patched to eliminate warnings about "constant is
+	unsigned in ANSI C, signed in traditional"; made test program
+	self-checking.
+  1999-11-04 lpd Edited comments slightly for automatic TOC extraction.
+  1999-10-18 lpd Fixed typo in header comment (ansi2knr rather than md5).
+  1999-05-03 lpd Original version.
+ */
+
+#include "md5.h"
+#include <string.h>
+
+#undef BYTE_ORDER	/* 1 = big-endian, -1 = little-endian, 0 = unknown */
+#ifdef ARCH_IS_BIG_ENDIAN
+#  define BYTE_ORDER (ARCH_IS_BIG_ENDIAN ? 1 : -1)
+#else
+#  define BYTE_ORDER 0
+#endif
+
+#define T_MASK ((md5_word_t)~0)
+#define T1 /* 0xd76aa478 */ (T_MASK ^ 0x28955b87)
+#define T2 /* 0xe8c7b756 */ (T_MASK ^ 0x173848a9)
+#define T3    0x242070db
+#define T4 /* 0xc1bdceee */ (T_MASK ^ 0x3e423111)
+#define T5 /* 0xf57c0faf */ (T_MASK ^ 0x0a83f050)
+#define T6    0x4787c62a
+#define T7 /* 0xa8304613 */ (T_MASK ^ 0x57cfb9ec)
+#define T8 /* 0xfd469501 */ (T_MASK ^ 0x02b96afe)
+#define T9    0x698098d8
+#define T10 /* 0x8b44f7af */ (T_MASK ^ 0x74bb0850)
+#define T11 /* 0xffff5bb1 */ (T_MASK ^ 0x0000a44e)
+#define T12 /* 0x895cd7be */ (T_MASK ^ 0x76a32841)
+#define T13    0x6b901122
+#define T14 /* 0xfd987193 */ (T_MASK ^ 0x02678e6c)
+#define T15 /* 0xa679438e */ (T_MASK ^ 0x5986bc71)
+#define T16    0x49b40821
+#define T17 /* 0xf61e2562 */ (T_MASK ^ 0x09e1da9d)
+#define T18 /* 0xc040b340 */ (T_MASK ^ 0x3fbf4cbf)
+#define T19    0x265e5a51
+#define T20 /* 0xe9b6c7aa */ (T_MASK ^ 0x16493855)
+#define T21 /* 0xd62f105d */ (T_MASK ^ 0x29d0efa2)
+#define T22    0x02441453
+#define T23 /* 0xd8a1e681 */ (T_MASK ^ 0x275e197e)
+#define T24 /* 0xe7d3fbc8 */ (T_MASK ^ 0x182c0437)
+#define T25    0x21e1cde6
+#define T26 /* 0xc33707d6 */ (T_MASK ^ 0x3cc8f829)
+#define T27 /* 0xf4d50d87 */ (T_MASK ^ 0x0b2af278)
+#define T28    0x455a14ed
+#define T29 /* 0xa9e3e905 */ (T_MASK ^ 0x561c16fa)
+#define T30 /* 0xfcefa3f8 */ (T_MASK ^ 0x03105c07)
+#define T31    0x676f02d9
+#define T32 /* 0x8d2a4c8a */ (T_MASK ^ 0x72d5b375)
+#define T33 /* 0xfffa3942 */ (T_MASK ^ 0x0005c6bd)
+#define T34 /* 0x8771f681 */ (T_MASK ^ 0x788e097e)
+#define T35    0x6d9d6122
+#define T36 /* 0xfde5380c */ (T_MASK ^ 0x021ac7f3)
+#define T37 /* 0xa4beea44 */ (T_MASK ^ 0x5b4115bb)
+#define T38    0x4bdecfa9
+#define T39 /* 0xf6bb4b60 */ (T_MASK ^ 0x0944b49f)
+#define T40 /* 0xbebfbc70 */ (T_MASK ^ 0x4140438f)
+#define T41    0x289b7ec6
+#define T42 /* 0xeaa127fa */ (T_MASK ^ 0x155ed805)
+#define T43 /* 0xd4ef3085 */ (T_MASK ^ 0x2b10cf7a)
+#define T44    0x04881d05
+#define T45 /* 0xd9d4d039 */ (T_MASK ^ 0x262b2fc6)
+#define T46 /* 0xe6db99e5 */ (T_MASK ^ 0x1924661a)
+#define T47    0x1fa27cf8
+#define T48 /* 0xc4ac5665 */ (T_MASK ^ 0x3b53a99a)
+#define T49 /* 0xf4292244 */ (T_MASK ^ 0x0bd6ddbb)
+#define T50    0x432aff97
+#define T51 /* 0xab9423a7 */ (T_MASK ^ 0x546bdc58)
+#define T52 /* 0xfc93a039 */ (T_MASK ^ 0x036c5fc6)
+#define T53    0x655b59c3
+#define T54 /* 0x8f0ccc92 */ (T_MASK ^ 0x70f3336d)
+#define T55 /* 0xffeff47d */ (T_MASK ^ 0x00100b82)
+#define T56 /* 0x85845dd1 */ (T_MASK ^ 0x7a7ba22e)
+#define T57    0x6fa87e4f
+#define T58 /* 0xfe2ce6e0 */ (T_MASK ^ 0x01d3191f)
+#define T59 /* 0xa3014314 */ (T_MASK ^ 0x5cfebceb)
+#define T60    0x4e0811a1
+#define T61 /* 0xf7537e82 */ (T_MASK ^ 0x08ac817d)
+#define T62 /* 0xbd3af235 */ (T_MASK ^ 0x42c50dca)
+#define T63    0x2ad7d2bb
+#define T64 /* 0xeb86d391 */ (T_MASK ^ 0x14792c6e)
+
+
+static void
+md5_process(md5_state_t *pms, const md5_byte_t *data /*[64]*/)
+{
+    md5_word_t
+	a = pms->abcd[0], b = pms->abcd[1],
+	c = pms->abcd[2], d = pms->abcd[3];
+    md5_word_t t;
+#if BYTE_ORDER > 0
+    /* Define storage only for big-endian CPUs. */
+    md5_word_t X[16];
+#else
+    /* Define storage for little-endian or both types of CPUs. */
+    md5_word_t xbuf[16];
+    const md5_word_t *X;
+#endif
+
+    {
+#if BYTE_ORDER == 0
+	/*
+	 * Determine dynamically whether this is a big-endian or
+	 * little-endian machine, since we can use a more efficient
+	 * algorithm on the latter.
+	 */
+	static const int w = 1;
+
+	if (*((const md5_byte_t *)&w)) /* dynamic little-endian */
+#endif
+#if BYTE_ORDER <= 0		/* little-endian */
+	{
+	    /*
+	     * On little-endian machines, we can process properly aligned
+	     * data without copying it.
+	     */
+	    if (!((data - (const md5_byte_t *)0) & 3)) {
+		/* data are properly aligned */
+		X = (const md5_word_t *)data;
+	    } else {
+		/* not aligned */
+		memcpy(xbuf, data, 64);
+		X = xbuf;
+	    }
+	}
+#endif
+#if BYTE_ORDER == 0
+	else			/* dynamic big-endian */
+#endif
+#if BYTE_ORDER >= 0		/* big-endian */
+	{
+	    /*
+	     * On big-endian machines, we must arrange the bytes in the
+	     * right order.
+	     */
+	    const md5_byte_t *xp = data;
+	    int i;
+
+#  if BYTE_ORDER == 0
+	    X = xbuf;		/* (dynamic only) */
+#  else
+#    define xbuf X		/* (static only) */
+#  endif
+	    for (i = 0; i < 16; ++i, xp += 4)
+		xbuf[i] = xp[0] + (xp[1] << 8) + (xp[2] << 16) + (xp[3] << 24);
+	}
+#endif
+    }
+
+#define ROTATE_LEFT(x, n) (((x) << (n)) | ((x) >> (32 - (n))))
+
+    /* Round 1. */
+    /* Let [abcd k s i] denote the operation
+       a = b + ((a + F(b,c,d) + X[k] + T[i]) <<< s). */
+#define F(x, y, z) (((x) & (y)) | (~(x) & (z)))
+#define SET(a, b, c, d, k, s, Ti)\
+  t = a + F(b,c,d) + X[k] + Ti;\
+  a = ROTATE_LEFT(t, s) + b
+    /* Do the following 16 operations. */
+    SET(a, b, c, d,  0,  7,  T1);
+    SET(d, a, b, c,  1, 12,  T2);
+    SET(c, d, a, b,  2, 17,  T3);
+    SET(b, c, d, a,  3, 22,  T4);
+    SET(a, b, c, d,  4,  7,  T5);
+    SET(d, a, b, c,  5, 12,  T6);
+    SET(c, d, a, b,  6, 17,  T7);
+    SET(b, c, d, a,  7, 22,  T8);
+    SET(a, b, c, d,  8,  7,  T9);
+    SET(d, a, b, c,  9, 12, T10);
+    SET(c, d, a, b, 10, 17, T11);
+    SET(b, c, d, a, 11, 22, T12);
+    SET(a, b, c, d, 12,  7, T13);
+    SET(d, a, b, c, 13, 12, T14);
+    SET(c, d, a, b, 14, 17, T15);
+    SET(b, c, d, a, 15, 22, T16);
+#undef SET
+
+     /* Round 2. */
+     /* Let [abcd k s i] denote the operation
+          a = b + ((a + G(b,c,d) + X[k] + T[i]) <<< s). */
+#define G(x, y, z) (((x) & (z)) | ((y) & ~(z)))
+#define SET(a, b, c, d, k, s, Ti)\
+  t = a + G(b,c,d) + X[k] + Ti;\
+  a = ROTATE_LEFT(t, s) + b
+     /* Do the following 16 operations. */
+    SET(a, b, c, d,  1,  5, T17);
+    SET(d, a, b, c,  6,  9, T18);
+    SET(c, d, a, b, 11, 14, T19);
+    SET(b, c, d, a,  0, 20, T20);
+    SET(a, b, c, d,  5,  5, T21);
+    SET(d, a, b, c, 10,  9, T22);
+    SET(c, d, a, b, 15, 14, T23);
+    SET(b, c, d, a,  4, 20, T24);
+    SET(a, b, c, d,  9,  5, T25);
+    SET(d, a, b, c, 14,  9, T26);
+    SET(c, d, a, b,  3, 14, T27);
+    SET(b, c, d, a,  8, 20, T28);
+    SET(a, b, c, d, 13,  5, T29);
+    SET(d, a, b, c,  2,  9, T30);
+    SET(c, d, a, b,  7, 14, T31);
+    SET(b, c, d, a, 12, 20, T32);
+#undef SET
+
+     /* Round 3. */
+     /* Let [abcd k s t] denote the operation
+          a = b + ((a + H(b,c,d) + X[k] + T[i]) <<< s). */
+#define H(x, y, z) ((x) ^ (y) ^ (z))
+#define SET(a, b, c, d, k, s, Ti)\
+  t = a + H(b,c,d) + X[k] + Ti;\
+  a = ROTATE_LEFT(t, s) + b
+     /* Do the following 16 operations. */
+    SET(a, b, c, d,  5,  4, T33);
+    SET(d, a, b, c,  8, 11, T34);
+    SET(c, d, a, b, 11, 16, T35);
+    SET(b, c, d, a, 14, 23, T36);
+    SET(a, b, c, d,  1,  4, T37);
+    SET(d, a, b, c,  4, 11, T38);
+    SET(c, d, a, b,  7, 16, T39);
+    SET(b, c, d, a, 10, 23, T40);
+    SET(a, b, c, d, 13,  4, T41);
+    SET(d, a, b, c,  0, 11, T42);
+    SET(c, d, a, b,  3, 16, T43);
+    SET(b, c, d, a,  6, 23, T44);
+    SET(a, b, c, d,  9,  4, T45);
+    SET(d, a, b, c, 12, 11, T46);
+    SET(c, d, a, b, 15, 16, T47);
+    SET(b, c, d, a,  2, 23, T48);
+#undef SET
+
+     /* Round 4. */
+     /* Let [abcd k s t] denote the operation
+          a = b + ((a + I(b,c,d) + X[k] + T[i]) <<< s). */
+#define I(x, y, z) ((y) ^ ((x) | ~(z)))
+#define SET(a, b, c, d, k, s, Ti)\
+  t = a + I(b,c,d) + X[k] + Ti;\
+  a = ROTATE_LEFT(t, s) + b
+     /* Do the following 16 operations. */
+    SET(a, b, c, d,  0,  6, T49);
+    SET(d, a, b, c,  7, 10, T50);
+    SET(c, d, a, b, 14, 15, T51);
+    SET(b, c, d, a,  5, 21, T52);
+    SET(a, b, c, d, 12,  6, T53);
+    SET(d, a, b, c,  3, 10, T54);
+    SET(c, d, a, b, 10, 15, T55);
+    SET(b, c, d, a,  1, 21, T56);
+    SET(a, b, c, d,  8,  6, T57);
+    SET(d, a, b, c, 15, 10, T58);
+    SET(c, d, a, b,  6, 15, T59);
+    SET(b, c, d, a, 13, 21, T60);
+    SET(a, b, c, d,  4,  6, T61);
+    SET(d, a, b, c, 11, 10, T62);
+    SET(c, d, a, b,  2, 15, T63);
+    SET(b, c, d, a,  9, 21, T64);
+#undef SET
+
+     /* Then perform the following additions. (That is increment each
+        of the four registers by the value it had before this block
+        was started.) */
+    pms->abcd[0] += a;
+    pms->abcd[1] += b;
+    pms->abcd[2] += c;
+    pms->abcd[3] += d;
+}
+
+void
+md5_init(md5_state_t *pms)
+{
+    pms->count[0] = pms->count[1] = 0;
+    pms->abcd[0] = 0x67452301;
+    pms->abcd[1] = /*0xefcdab89*/ T_MASK ^ 0x10325476;
+    pms->abcd[2] = /*0x98badcfe*/ T_MASK ^ 0x67452301;
+    pms->abcd[3] = 0x10325476;
+}
+
+void
+md5_append(md5_state_t *pms, const md5_byte_t *data, int nbytes)
+{
+    const md5_byte_t *p = data;
+    int left = nbytes;
+    int offset = (pms->count[0] >> 3) & 63;
+    md5_word_t nbits = (md5_word_t)(nbytes << 3);
+
+    if (nbytes <= 0)
+	return;
+
+    /* Update the message length. */
+    pms->count[1] += nbytes >> 29;
+    pms->count[0] += nbits;
+    if (pms->count[0] < nbits)
+	pms->count[1]++;
+
+    /* Process an initial partial block. */
+    if (offset) {
+	int copy = (offset + nbytes > 64 ? 64 - offset : nbytes);
+
+	memcpy(pms->buf + offset, p, copy);
+	if (offset + copy < 64)
+	    return;
+	p += copy;
+	left -= copy;
+	md5_process(pms, pms->buf);
+    }
+
+    /* Process full blocks. */
+    for (; left >= 64; p += 64, left -= 64)
+	md5_process(pms, p);
+
+    /* Process a final partial block. */
+    if (left)
+	memcpy(pms->buf, p, left);
+}
+
+void
+md5_finish(md5_state_t *pms, md5_byte_t digest[16])
+{
+    static const md5_byte_t pad[64] = {
+	0x80, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
+	0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
+	0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
+	0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
+    };
+    md5_byte_t data[8];
+    int i;
+
+    /* Save the length before padding. */
+    for (i = 0; i < 8; ++i)
+	data[i] = (md5_byte_t)(pms->count[i >> 2] >> ((i & 3) << 3));
+    /* Pad to 56 bytes mod 64. */
+    md5_append(pms, pad, ((55 - (pms->count[0] >> 3)) & 63) + 1);
+    /* Append the length. */
+    md5_append(pms, data, 8);
+    for (i = 0; i < 16; ++i)
+	digest[i] = (md5_byte_t)(pms->abcd[i >> 2] >> ((i & 3) << 3));
+}
diff --git a/src/Modules/md5.h b/src/Modules/md5.h
new file mode 100644
--- /dev/null
+++ b/src/Modules/md5.h
@@ -0,0 +1,91 @@
+/*
+  Copyright (C) 1999, 2002 Aladdin Enterprises.  All rights reserved.
+
+  This software is provided 'as-is', without any express or implied
+  warranty.  In no event will the authors be held liable for any damages
+  arising from the use of this software.
+
+  Permission is granted to anyone to use this software for any purpose,
+  including commercial applications, and to alter it and redistribute it
+  freely, subject to the following restrictions:
+
+  1. The origin of this software must not be misrepresented; you must not
+     claim that you wrote the original software. If you use this software
+     in a product, an acknowledgment in the product documentation would be
+     appreciated but is not required.
+  2. Altered source versions must be plainly marked as such, and must not be
+     misrepresented as being the original software.
+  3. This notice may not be removed or altered from any source distribution.
+
+  L. Peter Deutsch
+  ghost at aladdin.com
+
+ */
+/* $Id: md5.h 43594 2006-04-03 16:27:50Z matthias.klose $ */
+/*
+  Independent implementation of MD5 (RFC 1321).
+
+  This code implements the MD5 Algorithm defined in RFC 1321, whose
+  text is available at
+	http://www.ietf.org/rfc/rfc1321.txt
+  The code is derived from the text of the RFC, including the test suite
+  (section A.5) but excluding the rest of Appendix A.  It does not include
+  any code or documentation that is identified in the RFC as being
+  copyrighted.
+
+  The original and principal author of md5.h is L. Peter Deutsch
+  <ghost at aladdin.com>.  Other authors are noted in the change history
+  that follows (in reverse chronological order):
+
+  2002-04-13 lpd Removed support for non-ANSI compilers; removed
+	references to Ghostscript; clarified derivation from RFC 1321;
+	now handles byte order either statically or dynamically.
+  1999-11-04 lpd Edited comments slightly for automatic TOC extraction.
+  1999-10-18 lpd Fixed typo in header comment (ansi2knr rather than md5);
+	added conditionalization for C++ compilation from Martin
+	Purschke <purschke at bnl.gov>.
+  1999-05-03 lpd Original version.
+ */
+
+#ifndef md5_INCLUDED
+#  define md5_INCLUDED
+
+/*
+ * This package supports both compile-time and run-time determination of CPU
+ * byte order.  If ARCH_IS_BIG_ENDIAN is defined as 0, the code will be
+ * compiled to run only on little-endian CPUs; if ARCH_IS_BIG_ENDIAN is
+ * defined as non-zero, the code will be compiled to run only on big-endian
+ * CPUs; if ARCH_IS_BIG_ENDIAN is not defined, the code will be compiled to
+ * run on either big- or little-endian CPUs, but will run slightly less
+ * efficiently on either one than if ARCH_IS_BIG_ENDIAN is defined.
+ */
+
+typedef unsigned char md5_byte_t; /* 8-bit byte */
+typedef unsigned int md5_word_t; /* 32-bit word */
+
+/* Define the state of the MD5 Algorithm. */
+typedef struct md5_state_s {
+    md5_word_t count[2];	/* message length in bits, lsw first */
+    md5_word_t abcd[4];		/* digest buffer */
+    md5_byte_t buf[64];		/* accumulate block */
+} md5_state_t;
+
+#ifdef __cplusplus
+extern "C" 
+{
+#endif
+
+/* Initialize the algorithm. */
+void md5_init(md5_state_t *pms);
+
+/* Append a string to the message. */
+void md5_append(md5_state_t *pms, const md5_byte_t *data, int nbytes);
+
+/* Finish the message and return the digest. */
+void md5_finish(md5_state_t *pms, md5_byte_t digest[16]);
+
+#ifdef __cplusplus
+}  /* end extern "C" */
+#endif
+
+#endif /* md5_INCLUDED */
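The header's byte-order handling relies on the classic run-time trick of storing the integer 1 and inspecting its first byte (see the `static const int w = 1` test in md5.c). The same check can be expressed in Python (purely as an illustration of the technique):

```python
import struct
import sys

# The C code tests *((const md5_byte_t *)&w) with w == 1: on a
# little-endian CPU the first byte of the word is 1, on a
# big-endian CPU it is 0.
first_byte = struct.pack("=I", 1)[0]  # native byte order
little_endian = (first_byte == 1)

assert little_endian == (sys.byteorder == "little")
```

Defining `ARCH_IS_BIG_ENDIAN` at compile time skips this run-time probe, which is why the header documents both the static and dynamic configurations.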
diff --git a/src/Modules/md5module.c b/src/Modules/md5module.c
new file mode 100644
--- /dev/null
+++ b/src/Modules/md5module.c
@@ -0,0 +1,312 @@
+
+/* MD5 module */
+
+/* This module provides an interface to the RSA Data Security,
+   Inc. MD5 Message-Digest Algorithm, described in RFC 1321.
+   It requires the files md5.c and md5.h (which are slightly changed
+   from the versions in the RFC to avoid the "global.h" file).
+
+
+/* MD5 objects */
+
+#include "Python.h"
+#include "structmember.h"
+#include "md5.h"
+
+typedef struct {
+	PyObject_HEAD
+        md5_state_t	md5;		/* the context holder */
+} md5object;
+
+static PyTypeObject MD5type;
+
+#define is_md5object(v)		((v)->ob_type == &MD5type)
+
+static md5object *
+newmd5object(void)
+{
+	md5object *md5p;
+
+	md5p = PyObject_New(md5object, &MD5type);
+	if (md5p == NULL)
+		return NULL;
+
+	md5_init(&md5p->md5);	/* actual initialisation */
+	return md5p;
+}
+
+
+/* MD5 methods */
+
+static void
+md5_dealloc(md5object *md5p)
+{
+	PyObject_Del(md5p);
+}
+
+
+/* MD5 methods-as-attributes */
+
+static PyObject *
+md5_update(md5object *self, PyObject *args)
+{
+	unsigned char *cp;
+	int len;
+
+	if (!PyArg_ParseTuple(args, "s#:update", &cp, &len))
+		return NULL;
+
+	md5_append(&self->md5, cp, len);
+
+	Py_INCREF(Py_None);
+	return Py_None;
+}
+
+PyDoc_STRVAR(update_doc,
+"update (arg)\n\
+\n\
+Update the md5 object with the string arg. Repeated calls are\n\
+equivalent to a single call with the concatenation of all the\n\
+arguments.");
+
+
+static PyObject *
+md5_digest(md5object *self)
+{
+ 	md5_state_t mdContext;
+	unsigned char aDigest[16];
+
+	/* make a temporary copy, and perform the final */
+	mdContext = self->md5;
+	md5_finish(&mdContext, aDigest);
+
+	return PyString_FromStringAndSize((char *)aDigest, 16);
+}
+
+PyDoc_STRVAR(digest_doc,
+"digest() -> string\n\
+\n\
+Return the digest of the strings passed to the update() method so\n\
+far. This is a 16-byte string which may contain non-ASCII characters,\n\
+including null bytes.");
+
+
+static PyObject *
+md5_hexdigest(md5object *self)
+{
+ 	md5_state_t mdContext;
+	unsigned char digest[16];
+	unsigned char hexdigest[32];
+	int i, j;
+
+	/* make a temporary copy, and perform the final */
+	mdContext = self->md5;
+	md5_finish(&mdContext, digest);
+
+	/* Make hex version of the digest */
+	for(i=j=0; i<16; i++) {
+		char c;
+		c = (digest[i] >> 4) & 0xf;
+		c = (c>9) ? c+'a'-10 : c + '0';
+		hexdigest[j++] = c;
+		c = (digest[i] & 0xf);
+		c = (c>9) ? c+'a'-10 : c + '0';
+		hexdigest[j++] = c;
+	}
+	return PyString_FromStringAndSize((char*)hexdigest, 32);
+}
+
+
+PyDoc_STRVAR(hexdigest_doc,
+"hexdigest() -> string\n\
+\n\
+Like digest(), but returns the digest as a string of hexadecimal digits.");
+
+
+static PyObject *
+md5_copy(md5object *self)
+{
+	md5object *md5p;
+
+	if ((md5p = newmd5object()) == NULL)
+		return NULL;
+
+	md5p->md5 = self->md5;
+
+	return (PyObject *)md5p;
+}
+
+PyDoc_STRVAR(copy_doc,
+"copy() -> md5 object\n\
+\n\
+Return a copy (``clone'') of the md5 object.");
+
+
+static PyMethodDef md5_methods[] = {
+	{"update",    (PyCFunction)md5_update,    METH_VARARGS, update_doc},
+	{"digest",    (PyCFunction)md5_digest,    METH_NOARGS,  digest_doc},
+	{"hexdigest", (PyCFunction)md5_hexdigest, METH_NOARGS,  hexdigest_doc},
+	{"copy",      (PyCFunction)md5_copy,      METH_NOARGS,  copy_doc},
+	{NULL, NULL}			     /* sentinel */
+};
+
+static PyObject *
+md5_get_block_size(PyObject *self, void *closure)
+{
+    return PyInt_FromLong(64);
+}
+
+static PyObject *
+md5_get_digest_size(PyObject *self, void *closure)
+{
+    return PyInt_FromLong(16);
+}
+
+static PyObject *
+md5_get_name(PyObject *self, void *closure)
+{
+    return PyString_FromStringAndSize("MD5", 3);
+}
+
+static PyGetSetDef md5_getseters[] = {
+    {"digest_size",
+     (getter)md5_get_digest_size, NULL,
+     NULL,
+     NULL},
+    {"block_size",
+     (getter)md5_get_block_size, NULL,
+     NULL,
+     NULL},
+    {"name",
+     (getter)md5_get_name, NULL,
+     NULL,
+     NULL},
+    /* the old md5 and sha modules support 'digest_size' as in PEP 247.
+     * the old sha module also supported 'digestsize'.  ugh. */
+    {"digestsize",
+     (getter)md5_get_digest_size, NULL,
+     NULL,
+     NULL},
+    {NULL}  /* Sentinel */
+};
+
+
+PyDoc_STRVAR(module_doc,
+"This module implements the interface to RSA's MD5 message digest\n\
+algorithm (see also Internet RFC 1321). Its use is quite\n\
+straightforward: use new() to create an md5 object. You can now\n\
+feed this object with arbitrary strings using the update() method, and\n\
+at any point you can ask it for the digest (a strong kind of 128-bit\n\
+checksum, a.k.a. ``fingerprint'') of the concatenation of the strings\n\
+fed to it so far using the digest() method.\n\
+\n\
+Functions:\n\
+\n\
+new([arg]) -- return a new md5 object, initialized with arg if provided\n\
+md5([arg]) -- DEPRECATED, same as new, but for compatibility\n\
+\n\
+Special Objects:\n\
+\n\
+MD5Type -- type object for md5 objects");
+
+PyDoc_STRVAR(md5type_doc,
+"An md5 object is used to calculate the MD5 checksum of a\n\
+string of information.\n\
+\n\
+Methods:\n\
+\n\
+update() -- updates the current digest with an additional string\n\
+digest() -- return the current digest value\n\
+hexdigest() -- return the current digest as a string of hexadecimal digits\n\
+copy() -- return a copy of the current md5 object");
+
+static PyTypeObject MD5type = {
+	PyObject_HEAD_INIT(NULL)
+	0,			  /*ob_size*/
+	"_md5.md5",		  /*tp_name*/
+	sizeof(md5object),	  /*tp_basicsize*/
+	0,			  /*tp_itemsize*/
+	/* methods */
+	(destructor)md5_dealloc,  /*tp_dealloc*/
+	0,			  /*tp_print*/
+	0,                        /*tp_getattr*/
+	0,			  /*tp_setattr*/
+	0,			  /*tp_compare*/
+	0,			  /*tp_repr*/
+        0,			  /*tp_as_number*/
+	0,                        /*tp_as_sequence*/
+	0,			  /*tp_as_mapping*/
+	0, 			  /*tp_hash*/
+	0,			  /*tp_call*/
+	0,			  /*tp_str*/
+	0,			  /*tp_getattro*/
+	0,			  /*tp_setattro*/
+	0,	                  /*tp_as_buffer*/
+	Py_TPFLAGS_DEFAULT,	  /*tp_flags*/
+	md5type_doc,		  /*tp_doc*/
+        0,                        /*tp_traverse*/
+        0,			  /*tp_clear*/
+        0,			  /*tp_richcompare*/
+        0,			  /*tp_weaklistoffset*/
+        0,			  /*tp_iter*/
+        0,			  /*tp_iternext*/
+        md5_methods,	          /*tp_methods*/
+        0,      	          /*tp_members*/
+        md5_getseters,            /*tp_getset*/
+};
+
+
+/* MD5 functions */
+
+static PyObject *
+MD5_new(PyObject *self, PyObject *args)
+{
+	md5object *md5p;
+	unsigned char *cp = NULL;
+	int len = 0;
+
+	if (!PyArg_ParseTuple(args, "|s#:new", &cp, &len))
+		return NULL;
+
+	if ((md5p = newmd5object()) == NULL)
+		return NULL;
+
+	if (cp)
+		md5_append(&md5p->md5, cp, len);
+
+	return (PyObject *)md5p;
+}
+
+PyDoc_STRVAR(new_doc,
+"new([arg]) -> md5 object\n\
+\n\
+Return a new md5 object. If arg is present, the method call update(arg)\n\
+is made.");
+
+
+/* List of functions exported by this module */
+
+static PyMethodDef md5_functions[] = {
+	{"new",		(PyCFunction)MD5_new, METH_VARARGS, new_doc},
+	{NULL,		NULL}	/* Sentinel */
+};
+
+
+/* Initialize this module. */
+
+PyMODINIT_FUNC
+init_md5(void)
+{
+	PyObject *m, *d;
+
+        MD5type.ob_type = &PyType_Type;
+        if (PyType_Ready(&MD5type) < 0)
+            return;
+	m = Py_InitModule3("_md5", md5_functions, module_doc);
+	if (m == NULL)
+	    return;
+	d = PyModule_GetDict(m);
+	PyDict_SetItemString(d, "MD5Type", (PyObject *)&MD5type);
+	PyModule_AddIntConstant(m, "digest_size", 16);
+	/* No need to check the error here, the caller will do that */
+}
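Two behaviors of the module above are worth noting: `md5_copy` clones the whole `md5_state_t`, and `md5_digest`/`md5_hexdigest` finalize a *temporary copy* (`mdContext`) rather than the live state, so the object stays usable. Both are visible from Python via `hashlib` (a sketch for illustration, not part of the patch):

```python
import hashlib

h = hashlib.md5(b"hello ")

# copy() clones the internal state, so both objects continue
# independently from the same point.
c = h.copy()
h.update(b"world")
c.update(b"world")
assert h.hexdigest() == c.hexdigest()

# digest() finalizes a temporary copy of the state (mdContext in the
# C code), so repeated calls agree and further updates still work.
assert h.digest() == h.digest()
assert h.hexdigest() == hashlib.md5(b"hello world").hexdigest()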
diff --git a/src/Modules/sha256module.c b/src/Modules/sha256module.c
new file mode 100644
--- /dev/null
+++ b/src/Modules/sha256module.c
@@ -0,0 +1,701 @@
+/* SHA256 module */
+
+/* This module provides an interface to NIST's SHA-256 and SHA-224 Algorithms */
+
+/* See below for information about the original code this module was
+   based upon. Additional work performed by:
+
+   Andrew Kuchling (amk at amk.ca)
+   Greg Stein (gstein at lyra.org)
+   Trevor Perrin (trevp at trevp.net)
+
+   Copyright (C) 2005   Gregory P. Smith (greg at krypto.org)
+   Licensed to PSF under a Contributor Agreement.
+
+*/
+
+/* SHA objects */
+
+#include "Python.h"
+#include "structmember.h"
+
+
+/* Endianness testing and definitions */
+#define TestEndianness(variable) {int i=1; variable=PCT_BIG_ENDIAN;\
+	if (*((char*)&i)==1) variable=PCT_LITTLE_ENDIAN;}
+
+#define PCT_LITTLE_ENDIAN 1
+#define PCT_BIG_ENDIAN 0
+
+/* Some useful types */
+
+typedef unsigned char SHA_BYTE;
+
+#if SIZEOF_INT == 4
+typedef unsigned int SHA_INT32;	/* 32-bit integer */
+#else
+/* not defined. compilation will die. */
+#endif
+
+/* The SHA block size and message digest sizes, in bytes */
+
+#define SHA_BLOCKSIZE    64
+#define SHA_DIGESTSIZE  32
+
+/* The structure for storing SHA info */
+
+typedef struct {
+    PyObject_HEAD
+    SHA_INT32 digest[8];		/* Message digest */
+    SHA_INT32 count_lo, count_hi;	/* 64-bit bit count */
+    SHA_BYTE data[SHA_BLOCKSIZE];	/* SHA data buffer */
+    int Endianness;
+    int local;				/* unprocessed amount in data */
+    int digestsize;
+} SHAobject;
+
+/* When run on a little-endian CPU we need to perform byte reversal on an
+   array of longwords. */
+
+static void longReverse(SHA_INT32 *buffer, int byteCount, int Endianness)
+{
+    SHA_INT32 value;
+
+    if ( Endianness == PCT_BIG_ENDIAN )
+	return;
+
+    byteCount /= sizeof(*buffer);
+    while (byteCount--) {
+        value = *buffer;
+        value = ( ( value & 0xFF00FF00L ) >> 8  ) | \
+                ( ( value & 0x00FF00FFL ) << 8 );
+        *buffer++ = ( value << 16 ) | ( value >> 16 );
+    }
+}
+
+static void SHAcopy(SHAobject *src, SHAobject *dest)
+{
+    dest->Endianness = src->Endianness;
+    dest->local = src->local;
+    dest->digestsize = src->digestsize;
+    dest->count_lo = src->count_lo;
+    dest->count_hi = src->count_hi;
+    memcpy(dest->digest, src->digest, sizeof(src->digest));
+    memcpy(dest->data, src->data, sizeof(src->data));
+}
+
+
+/* ------------------------------------------------------------------------
+ *
+ * This code for the SHA-256 algorithm was noted as public domain. The
+ * original headers are pasted below.
+ *
+ * Several changes have been made to make it more compatible with the
+ * Python environment and desired interface.
+ *
+ */
+
+/* LibTomCrypt, modular cryptographic library -- Tom St Denis
+ *
+ * LibTomCrypt is a library that provides various cryptographic
+ * algorithms in a highly modular and flexible manner.
+ *
+ * The library is free for all purposes without any express
+ * guarantee it works.
+ *
+ * Tom St Denis, tomstdenis at iahu.ca, http://libtomcrypt.org
+ */
+
+
+/* SHA256 by Tom St Denis */
+
+/* Various logical functions */
+#define ROR(x, y)\
+( ((((unsigned long)(x)&0xFFFFFFFFUL)>>(unsigned long)((y)&31)) | \
+((unsigned long)(x)<<(unsigned long)(32-((y)&31)))) & 0xFFFFFFFFUL)
+#define Ch(x,y,z)       (z ^ (x & (y ^ z)))
+#define Maj(x,y,z)      (((x | y) & z) | (x & y)) 
+#define S(x, n)         ROR((x),(n))
+#define R(x, n)         (((x)&0xFFFFFFFFUL)>>(n))
+#define Sigma0(x)       (S(x, 2) ^ S(x, 13) ^ S(x, 22))
+#define Sigma1(x)       (S(x, 6) ^ S(x, 11) ^ S(x, 25))
+#define Gamma0(x)       (S(x, 7) ^ S(x, 18) ^ R(x, 3))
+#define Gamma1(x)       (S(x, 17) ^ S(x, 19) ^ R(x, 10))
+
+
+static void
+sha_transform(SHAobject *sha_info)
+{
+    int i;
+    SHA_INT32 S[8], W[64], t0, t1;
+
+    memcpy(W, sha_info->data, sizeof(sha_info->data));
+    longReverse(W, (int)sizeof(sha_info->data), sha_info->Endianness);
+
+    for (i = 16; i < 64; ++i) {
+        W[i] = Gamma1(W[i - 2]) + W[i - 7] + Gamma0(W[i - 15]) + W[i - 16];
+    }
+    for (i = 0; i < 8; ++i) {
+        S[i] = sha_info->digest[i];
+    }
+
+    /* Compress */
+#define RND(a,b,c,d,e,f,g,h,i,ki)                    \
+     t0 = h + Sigma1(e) + Ch(e, f, g) + ki + W[i];   \
+     t1 = Sigma0(a) + Maj(a, b, c);                  \
+     d += t0;                                        \
+     h  = t0 + t1;
+
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],0,0x428a2f98);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],1,0x71374491);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],2,0xb5c0fbcf);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],3,0xe9b5dba5);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],4,0x3956c25b);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],5,0x59f111f1);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],6,0x923f82a4);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],7,0xab1c5ed5);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],8,0xd807aa98);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],9,0x12835b01);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],10,0x243185be);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],11,0x550c7dc3);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],12,0x72be5d74);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],13,0x80deb1fe);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],14,0x9bdc06a7);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],15,0xc19bf174);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],16,0xe49b69c1);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],17,0xefbe4786);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],18,0x0fc19dc6);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],19,0x240ca1cc);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],20,0x2de92c6f);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],21,0x4a7484aa);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],22,0x5cb0a9dc);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],23,0x76f988da);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],24,0x983e5152);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],25,0xa831c66d);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],26,0xb00327c8);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],27,0xbf597fc7);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],28,0xc6e00bf3);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],29,0xd5a79147);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],30,0x06ca6351);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],31,0x14292967);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],32,0x27b70a85);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],33,0x2e1b2138);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],34,0x4d2c6dfc);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],35,0x53380d13);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],36,0x650a7354);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],37,0x766a0abb);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],38,0x81c2c92e);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],39,0x92722c85);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],40,0xa2bfe8a1);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],41,0xa81a664b);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],42,0xc24b8b70);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],43,0xc76c51a3);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],44,0xd192e819);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],45,0xd6990624);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],46,0xf40e3585);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],47,0x106aa070);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],48,0x19a4c116);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],49,0x1e376c08);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],50,0x2748774c);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],51,0x34b0bcb5);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],52,0x391c0cb3);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],53,0x4ed8aa4a);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],54,0x5b9cca4f);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],55,0x682e6ff3);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],56,0x748f82ee);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],57,0x78a5636f);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],58,0x84c87814);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],59,0x8cc70208);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],60,0x90befffa);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],61,0xa4506ceb);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],62,0xbef9a3f7);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],63,0xc67178f2);
+
+#undef RND
+
+    /* feedback */
+    for (i = 0; i < 8; i++) {
+        sha_info->digest[i] = sha_info->digest[i] + S[i];
+    }
+
+}
+
+
+
+/* initialize the SHA digest */
+
+static void
+sha_init(SHAobject *sha_info)
+{
+    TestEndianness(sha_info->Endianness)
+    sha_info->digest[0] = 0x6A09E667L;
+    sha_info->digest[1] = 0xBB67AE85L;
+    sha_info->digest[2] = 0x3C6EF372L;
+    sha_info->digest[3] = 0xA54FF53AL;
+    sha_info->digest[4] = 0x510E527FL;
+    sha_info->digest[5] = 0x9B05688CL;
+    sha_info->digest[6] = 0x1F83D9ABL;
+    sha_info->digest[7] = 0x5BE0CD19L;
+    sha_info->count_lo = 0L;
+    sha_info->count_hi = 0L;
+    sha_info->local = 0;
+    sha_info->digestsize = 32;
+}
+
+static void
+sha224_init(SHAobject *sha_info)
+{
+    TestEndianness(sha_info->Endianness)
+    sha_info->digest[0] = 0xc1059ed8L;
+    sha_info->digest[1] = 0x367cd507L;
+    sha_info->digest[2] = 0x3070dd17L;
+    sha_info->digest[3] = 0xf70e5939L;
+    sha_info->digest[4] = 0xffc00b31L;
+    sha_info->digest[5] = 0x68581511L;
+    sha_info->digest[6] = 0x64f98fa7L;
+    sha_info->digest[7] = 0xbefa4fa4L;
+    sha_info->count_lo = 0L;
+    sha_info->count_hi = 0L;
+    sha_info->local = 0;
+    sha_info->digestsize = 28;
+}
+
+
+/* update the SHA digest */
+
+static void
+sha_update(SHAobject *sha_info, SHA_BYTE *buffer, int count)
+{
+    int i;
+    SHA_INT32 clo;
+
+    clo = sha_info->count_lo + ((SHA_INT32) count << 3);
+    if (clo < sha_info->count_lo) {
+        ++sha_info->count_hi;
+    }
+    sha_info->count_lo = clo;
+    sha_info->count_hi += (SHA_INT32) count >> 29;
+    if (sha_info->local) {
+        i = SHA_BLOCKSIZE - sha_info->local;
+        if (i > count) {
+            i = count;
+        }
+        memcpy(((SHA_BYTE *) sha_info->data) + sha_info->local, buffer, i);
+        count -= i;
+        buffer += i;
+        sha_info->local += i;
+        if (sha_info->local == SHA_BLOCKSIZE) {
+            sha_transform(sha_info);
+        }
+        else {
+            return;
+        }
+    }
+    while (count >= SHA_BLOCKSIZE) {
+        memcpy(sha_info->data, buffer, SHA_BLOCKSIZE);
+        buffer += SHA_BLOCKSIZE;
+        count -= SHA_BLOCKSIZE;
+        sha_transform(sha_info);
+    }
+    memcpy(sha_info->data, buffer, count);
+    sha_info->local = count;
+}
+
+/* finish computing the SHA digest */
+
+static void
+sha_final(unsigned char digest[SHA_DIGESTSIZE], SHAobject *sha_info)
+{
+    int count;
+    SHA_INT32 lo_bit_count, hi_bit_count;
+
+    lo_bit_count = sha_info->count_lo;
+    hi_bit_count = sha_info->count_hi;
+    count = (int) ((lo_bit_count >> 3) & 0x3f);
+    ((SHA_BYTE *) sha_info->data)[count++] = 0x80;
+    if (count > SHA_BLOCKSIZE - 8) {
+	memset(((SHA_BYTE *) sha_info->data) + count, 0,
+	       SHA_BLOCKSIZE - count);
+	sha_transform(sha_info);
+	memset((SHA_BYTE *) sha_info->data, 0, SHA_BLOCKSIZE - 8);
+    }
+    else {
+	memset(((SHA_BYTE *) sha_info->data) + count, 0,
+	       SHA_BLOCKSIZE - 8 - count);
+    }
+
+    /* GJS: note that we add the hi/lo in big-endian. sha_transform will
+       swap these values into host-order. */
+    sha_info->data[56] = (hi_bit_count >> 24) & 0xff;
+    sha_info->data[57] = (hi_bit_count >> 16) & 0xff;
+    sha_info->data[58] = (hi_bit_count >>  8) & 0xff;
+    sha_info->data[59] = (hi_bit_count >>  0) & 0xff;
+    sha_info->data[60] = (lo_bit_count >> 24) & 0xff;
+    sha_info->data[61] = (lo_bit_count >> 16) & 0xff;
+    sha_info->data[62] = (lo_bit_count >>  8) & 0xff;
+    sha_info->data[63] = (lo_bit_count >>  0) & 0xff;
+    sha_transform(sha_info);
+    digest[ 0] = (unsigned char) ((sha_info->digest[0] >> 24) & 0xff);
+    digest[ 1] = (unsigned char) ((sha_info->digest[0] >> 16) & 0xff);
+    digest[ 2] = (unsigned char) ((sha_info->digest[0] >>  8) & 0xff);
+    digest[ 3] = (unsigned char) ((sha_info->digest[0]      ) & 0xff);
+    digest[ 4] = (unsigned char) ((sha_info->digest[1] >> 24) & 0xff);
+    digest[ 5] = (unsigned char) ((sha_info->digest[1] >> 16) & 0xff);
+    digest[ 6] = (unsigned char) ((sha_info->digest[1] >>  8) & 0xff);
+    digest[ 7] = (unsigned char) ((sha_info->digest[1]      ) & 0xff);
+    digest[ 8] = (unsigned char) ((sha_info->digest[2] >> 24) & 0xff);
+    digest[ 9] = (unsigned char) ((sha_info->digest[2] >> 16) & 0xff);
+    digest[10] = (unsigned char) ((sha_info->digest[2] >>  8) & 0xff);
+    digest[11] = (unsigned char) ((sha_info->digest[2]      ) & 0xff);
+    digest[12] = (unsigned char) ((sha_info->digest[3] >> 24) & 0xff);
+    digest[13] = (unsigned char) ((sha_info->digest[3] >> 16) & 0xff);
+    digest[14] = (unsigned char) ((sha_info->digest[3] >>  8) & 0xff);
+    digest[15] = (unsigned char) ((sha_info->digest[3]      ) & 0xff);
+    digest[16] = (unsigned char) ((sha_info->digest[4] >> 24) & 0xff);
+    digest[17] = (unsigned char) ((sha_info->digest[4] >> 16) & 0xff);
+    digest[18] = (unsigned char) ((sha_info->digest[4] >>  8) & 0xff);
+    digest[19] = (unsigned char) ((sha_info->digest[4]      ) & 0xff);
+    digest[20] = (unsigned char) ((sha_info->digest[5] >> 24) & 0xff);
+    digest[21] = (unsigned char) ((sha_info->digest[5] >> 16) & 0xff);
+    digest[22] = (unsigned char) ((sha_info->digest[5] >>  8) & 0xff);
+    digest[23] = (unsigned char) ((sha_info->digest[5]      ) & 0xff);
+    digest[24] = (unsigned char) ((sha_info->digest[6] >> 24) & 0xff);
+    digest[25] = (unsigned char) ((sha_info->digest[6] >> 16) & 0xff);
+    digest[26] = (unsigned char) ((sha_info->digest[6] >>  8) & 0xff);
+    digest[27] = (unsigned char) ((sha_info->digest[6]      ) & 0xff);
+    digest[28] = (unsigned char) ((sha_info->digest[7] >> 24) & 0xff);
+    digest[29] = (unsigned char) ((sha_info->digest[7] >> 16) & 0xff);
+    digest[30] = (unsigned char) ((sha_info->digest[7] >>  8) & 0xff);
+    digest[31] = (unsigned char) ((sha_info->digest[7]      ) & 0xff);
+}
+
+/*
+ * End of copied SHA code.
+ *
+ * ------------------------------------------------------------------------
+ */
+
+static PyTypeObject SHA224type;
+static PyTypeObject SHA256type;
+
+
+static SHAobject *
+newSHA224object(void)
+{
+    return (SHAobject *)PyObject_New(SHAobject, &SHA224type);
+}
+
+static SHAobject *
+newSHA256object(void)
+{
+    return (SHAobject *)PyObject_New(SHAobject, &SHA256type);
+}
+
+/* Internal methods for a hash object */
+
+static void
+SHA_dealloc(PyObject *ptr)
+{
+    PyObject_Del(ptr);
+}
+
+
+/* External methods for a hash object */
+
+PyDoc_STRVAR(SHA256_copy__doc__, "Return a copy of the hash object.");
+
+static PyObject *
+SHA256_copy(SHAobject *self, PyObject *unused)
+{
+    SHAobject *newobj;
+
+    if (((PyObject*)self)->ob_type == &SHA256type) {
+        if ( (newobj = newSHA256object())==NULL)
+            return NULL;
+    } else {
+        if ( (newobj = newSHA224object())==NULL)
+            return NULL;
+    }
+
+    SHAcopy(self, newobj);
+    return (PyObject *)newobj;
+}
+
+PyDoc_STRVAR(SHA256_digest__doc__,
+"Return the digest value as a string of binary data.");
+
+static PyObject *
+SHA256_digest(SHAobject *self, PyObject *unused)
+{
+    unsigned char digest[SHA_DIGESTSIZE];
+    SHAobject temp;
+
+    SHAcopy(self, &temp);
+    sha_final(digest, &temp);
+    return PyString_FromStringAndSize((const char *)digest, self->digestsize);
+}
+
+PyDoc_STRVAR(SHA256_hexdigest__doc__,
+"Return the digest value as a string of hexadecimal digits.");
+
+static PyObject *
+SHA256_hexdigest(SHAobject *self, PyObject *unused)
+{
+    unsigned char digest[SHA_DIGESTSIZE];
+    SHAobject temp;
+    PyObject *retval;
+    char *hex_digest;
+    int i, j;
+
+    /* Get the raw (binary) digest value */
+    SHAcopy(self, &temp);
+    sha_final(digest, &temp);
+
+    /* Create a new string */
+    retval = PyString_FromStringAndSize(NULL, self->digestsize * 2);
+    if (!retval)
+	    return NULL;
+    hex_digest = PyString_AsString(retval);
+    if (!hex_digest) {
+	    Py_DECREF(retval);
+	    return NULL;
+    }
+
+    /* Make hex version of the digest */
+    for (i = j = 0; i < self->digestsize; i++) {
+        char c;
+        c = (digest[i] >> 4) & 0xf;
+        c = (c > 9) ? c + 'a' - 10 : c + '0';
+        hex_digest[j++] = c;
+        c = (digest[i] & 0xf);
+        c = (c > 9) ? c + 'a' - 10 : c + '0';
+        hex_digest[j++] = c;
+    }
+    return retval;
+}
+
+PyDoc_STRVAR(SHA256_update__doc__,
+"Update this hash object's state with the provided string.");
+
+static PyObject *
+SHA256_update(SHAobject *self, PyObject *args)
+{
+    unsigned char *cp;
+    int len;
+
+    if (!PyArg_ParseTuple(args, "s#:update", &cp, &len))
+        return NULL;
+
+    sha_update(self, cp, len);
+
+    Py_INCREF(Py_None);
+    return Py_None;
+}
+
+static PyMethodDef SHA_methods[] = {
+    {"copy",	  (PyCFunction)SHA256_copy,      METH_NOARGS,  SHA256_copy__doc__},
+    {"digest",	  (PyCFunction)SHA256_digest,    METH_NOARGS,  SHA256_digest__doc__},
+    {"hexdigest", (PyCFunction)SHA256_hexdigest, METH_NOARGS,  SHA256_hexdigest__doc__},
+    {"update",	  (PyCFunction)SHA256_update,    METH_VARARGS, SHA256_update__doc__},
+    {NULL,	  NULL}		/* sentinel */
+};
+
+static PyObject *
+SHA256_get_block_size(PyObject *self, void *closure)
+{
+    return PyInt_FromLong(SHA_BLOCKSIZE);
+}
+
+static PyObject *
+SHA256_get_name(PyObject *self, void *closure)
+{
+    if (((SHAobject *)self)->digestsize == 32)
+        return PyString_FromStringAndSize("SHA256", 6);
+    else
+        return PyString_FromStringAndSize("SHA224", 6);
+}
+
+static PyGetSetDef SHA_getseters[] = {
+    {"block_size",
+     (getter)SHA256_get_block_size, NULL,
+     NULL,
+     NULL},
+    {"name",
+     (getter)SHA256_get_name, NULL,
+     NULL,
+     NULL},
+    {NULL}  /* Sentinel */
+};
+
+static PyMemberDef SHA_members[] = {
+    {"digest_size", T_INT, offsetof(SHAobject, digestsize), READONLY, NULL},
+    /* the old md5 and sha modules support 'digest_size' as in PEP 247.
+     * the old sha module also supported 'digestsize'.  ugh. */
+    {"digestsize", T_INT, offsetof(SHAobject, digestsize), READONLY, NULL},
+    {NULL}  /* Sentinel */
+};
+
+static PyTypeObject SHA224type = {
+    PyObject_HEAD_INIT(NULL)
+    0,			/*ob_size*/
+    "_sha256.sha224",	/*tp_name*/
+    sizeof(SHAobject),	/*tp_size*/
+    0,			/*tp_itemsize*/
+    /* methods */
+    SHA_dealloc,	/*tp_dealloc*/
+    0,			/*tp_print*/
+    0,          	/*tp_getattr*/
+    0,                  /*tp_setattr*/
+    0,                  /*tp_compare*/
+    0,                  /*tp_repr*/
+    0,                  /*tp_as_number*/
+    0,                  /*tp_as_sequence*/
+    0,                  /*tp_as_mapping*/
+    0,                  /*tp_hash*/
+    0,                  /*tp_call*/
+    0,                  /*tp_str*/
+    0,                  /*tp_getattro*/
+    0,                  /*tp_setattro*/
+    0,                  /*tp_as_buffer*/
+    Py_TPFLAGS_DEFAULT, /*tp_flags*/
+    0,                  /*tp_doc*/
+    0,                  /*tp_traverse*/
+    0,			/*tp_clear*/
+    0,			/*tp_richcompare*/
+    0,			/*tp_weaklistoffset*/
+    0,			/*tp_iter*/
+    0,			/*tp_iternext*/
+    SHA_methods,	/* tp_methods */
+    SHA_members,	/* tp_members */
+    SHA_getseters,      /* tp_getset */
+};
+
+static PyTypeObject SHA256type = {
+    PyObject_HEAD_INIT(NULL)
+    0,			/*ob_size*/
+    "_sha256.sha256",	/*tp_name*/
+    sizeof(SHAobject),	/*tp_size*/
+    0,			/*tp_itemsize*/
+    /* methods */
+    SHA_dealloc,	/*tp_dealloc*/
+    0,			/*tp_print*/
+    0,          	/*tp_getattr*/
+    0,                  /*tp_setattr*/
+    0,                  /*tp_compare*/
+    0,                  /*tp_repr*/
+    0,                  /*tp_as_number*/
+    0,                  /*tp_as_sequence*/
+    0,                  /*tp_as_mapping*/
+    0,                  /*tp_hash*/
+    0,                  /*tp_call*/
+    0,                  /*tp_str*/
+    0,                  /*tp_getattro*/
+    0,                  /*tp_setattro*/
+    0,                  /*tp_as_buffer*/
+    Py_TPFLAGS_DEFAULT, /*tp_flags*/
+    0,                  /*tp_doc*/
+    0,                  /*tp_traverse*/
+    0,			/*tp_clear*/
+    0,			/*tp_richcompare*/
+    0,			/*tp_weaklistoffset*/
+    0,			/*tp_iter*/
+    0,			/*tp_iternext*/
+    SHA_methods,	/* tp_methods */
+    SHA_members,	/* tp_members */
+    SHA_getseters,      /* tp_getset */
+};
+
+
+/* The single module-level function: new() */
+
+PyDoc_STRVAR(SHA256_new__doc__,
+"Return a new SHA-256 hash object; optionally initialized with a string.");
+
+static PyObject *
+SHA256_new(PyObject *self, PyObject *args, PyObject *kwdict)
+{
+    static char *kwlist[] = {"string", NULL};
+    SHAobject *new;
+    unsigned char *cp = NULL;
+    int len;
+
+    if (!PyArg_ParseTupleAndKeywords(args, kwdict, "|s#:new", kwlist,
+                                     &cp, &len)) {
+        return NULL;
+    }
+
+    if ((new = newSHA256object()) == NULL)
+        return NULL;
+
+    sha_init(new);
+
+    if (PyErr_Occurred()) {
+        Py_DECREF(new);
+        return NULL;
+    }
+    if (cp)
+        sha_update(new, cp, len);
+
+    return (PyObject *)new;
+}
+
+PyDoc_STRVAR(SHA224_new__doc__,
+"Return a new SHA-224 hash object; optionally initialized with a string.");
+
+static PyObject *
+SHA224_new(PyObject *self, PyObject *args, PyObject *kwdict)
+{
+    static char *kwlist[] = {"string", NULL};
+    SHAobject *new;
+    unsigned char *cp = NULL;
+    int len;
+
+    if (!PyArg_ParseTupleAndKeywords(args, kwdict, "|s#:new", kwlist,
+                                     &cp, &len)) {
+        return NULL;
+    }
+
+    if ((new = newSHA224object()) == NULL)
+        return NULL;
+
+    sha224_init(new);
+
+    if (PyErr_Occurred()) {
+        Py_DECREF(new);
+        return NULL;
+    }
+    if (cp)
+        sha_update(new, cp, len);
+
+    return (PyObject *)new;
+}
+
+
+/* List of functions exported by this module */
+
+static struct PyMethodDef SHA_functions[] = {
+    {"sha256", (PyCFunction)SHA256_new, METH_VARARGS|METH_KEYWORDS, SHA256_new__doc__},
+    {"sha224", (PyCFunction)SHA224_new, METH_VARARGS|METH_KEYWORDS, SHA224_new__doc__},
+    {NULL,	NULL}		 /* Sentinel */
+};
+
+
+/* Initialize this module. */
+
+#define insint(n,v) { PyModule_AddIntConstant(m,n,v); }
+
+PyMODINIT_FUNC
+init_sha256(void)
+{
+    PyObject *m;
+
+    SHA224type.ob_type = &PyType_Type;
+    if (PyType_Ready(&SHA224type) < 0)
+        return;
+    SHA256type.ob_type = &PyType_Type;
+    if (PyType_Ready(&SHA256type) < 0)
+        return;
+    m = Py_InitModule("_sha256", SHA_functions);
+    if (m == NULL)
+	return;
+}
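(Editorial note, not part of the patch: the `_sha256` module above is what `hashlib` falls back on for its `sha256`/`sha224` constructors when OpenSSL does not supply them. A quick sanity check against the FIPS 180-2 "abc" test vectors, going through `hashlib` rather than importing `_sha256` directly:)

```python
# Exercise the sha256/sha224 implementations via hashlib, checking the
# well-known FIPS 180-2 single-block test vectors for the message "abc".
import hashlib

assert hashlib.sha256(b"abc").hexdigest() == \
    "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"
assert hashlib.sha224(b"abc").hexdigest() == \
    "23097d223405d8228642a477bda255b32aadbce4bda0b3f7e36c9da7"
```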
diff --git a/src/Modules/sha512module.c b/src/Modules/sha512module.c
new file mode 100644
--- /dev/null
+++ b/src/Modules/sha512module.c
@@ -0,0 +1,769 @@
+/* SHA512 module */
+
+/* This module provides an interface to NIST's SHA-512 and SHA-384 Algorithms */
+
+/* See below for information about the original code this module was
+   based upon. Additional work performed by:
+
+   Andrew Kuchling (amk at amk.ca)
+   Greg Stein (gstein at lyra.org)
+   Trevor Perrin (trevp at trevp.net)
+
+   Copyright (C) 2005   Gregory P. Smith (greg at krypto.org)
+   Licensed to PSF under a Contributor Agreement.
+
+*/
+
+/* SHA objects */
+
+#include "Python.h"
+#include "structmember.h"
+
+#ifdef PY_LONG_LONG /* If no PY_LONG_LONG, don't compile anything! */
+
+/* Endianness testing and definitions */
+#define TestEndianness(variable) {int i=1; variable=PCT_BIG_ENDIAN;\
+	if (*((char*)&i)==1) variable=PCT_LITTLE_ENDIAN;}
+
+#define PCT_LITTLE_ENDIAN 1
+#define PCT_BIG_ENDIAN 0
+
+/* Some useful types */
+
+typedef unsigned char SHA_BYTE;
+
+#if SIZEOF_INT == 4
+typedef unsigned int SHA_INT32;	/* 32-bit integer */
+typedef unsigned PY_LONG_LONG SHA_INT64;	/* 64-bit integer */
+#else
+/* not defined. compilation will die. */
+#endif
+
+/* The SHA block size and message digest sizes, in bytes */
+
+#define SHA_BLOCKSIZE   128
+#define SHA_DIGESTSIZE  64
+
+/* The structure for storing SHA info */
+
+typedef struct {
+    PyObject_HEAD
+    SHA_INT64 digest[8];		/* Message digest */
+    SHA_INT32 count_lo, count_hi;	/* 64-bit bit count */
+    SHA_BYTE data[SHA_BLOCKSIZE];	/* SHA data buffer */
+    int Endianness;
+    int local;				/* unprocessed amount in data */
+    int digestsize;
+} SHAobject;
+
+/* When run on a little-endian CPU we need to perform byte reversal on an
+   array of longwords. */
+
+static void longReverse(SHA_INT64 *buffer, int byteCount, int Endianness)
+{
+    SHA_INT64 value;
+
+    if ( Endianness == PCT_BIG_ENDIAN )
+	return;
+
+    byteCount /= sizeof(*buffer);
+    while (byteCount--) {
+        value = *buffer;
+
+        ((unsigned char*)buffer)[0] = (unsigned char)(value >> 56) & 0xff;
+        ((unsigned char*)buffer)[1] = (unsigned char)(value >> 48) & 0xff;
+        ((unsigned char*)buffer)[2] = (unsigned char)(value >> 40) & 0xff;
+        ((unsigned char*)buffer)[3] = (unsigned char)(value >> 32) & 0xff;
+        ((unsigned char*)buffer)[4] = (unsigned char)(value >> 24) & 0xff;
+        ((unsigned char*)buffer)[5] = (unsigned char)(value >> 16) & 0xff;
+        ((unsigned char*)buffer)[6] = (unsigned char)(value >>  8) & 0xff;
+        ((unsigned char*)buffer)[7] = (unsigned char)(value      ) & 0xff;
+
+        buffer++;
+    }
+}
+
+static void SHAcopy(SHAobject *src, SHAobject *dest)
+{
+    dest->Endianness = src->Endianness;
+    dest->local = src->local;
+    dest->digestsize = src->digestsize;
+    dest->count_lo = src->count_lo;
+    dest->count_hi = src->count_hi;
+    memcpy(dest->digest, src->digest, sizeof(src->digest));
+    memcpy(dest->data, src->data, sizeof(src->data));
+}
+
+
+/* ------------------------------------------------------------------------
+ *
+ * This code for the SHA-512 algorithm was noted as public domain. The
+ * original headers are pasted below.
+ *
+ * Several changes have been made to make it more compatible with the
+ * Python environment and desired interface.
+ *
+ */
+
+/* LibTomCrypt, modular cryptographic library -- Tom St Denis
+ *
+ * LibTomCrypt is a library that provides various cryptographic
+ * algorithms in a highly modular and flexible manner.
+ *
+ * The library is free for all purposes without any express
+ * guarantee it works.
+ *
+ * Tom St Denis, tomstdenis at iahu.ca, http://libtomcrypt.org
+ */
+
+
+/* SHA512 by Tom St Denis */
+
+/* Various logical functions */
+#define ROR64(x, y) \
+    ( ((((x) & 0xFFFFFFFFFFFFFFFFULL)>>((unsigned PY_LONG_LONG)(y) & 63)) | \
+      ((x)<<((unsigned PY_LONG_LONG)(64-((y) & 63))))) & 0xFFFFFFFFFFFFFFFFULL)
+#define Ch(x,y,z)       (z ^ (x & (y ^ z)))
+#define Maj(x,y,z)      (((x | y) & z) | (x & y)) 
+#define S(x, n)         ROR64((x),(n))
+#define R(x, n)         (((x) & 0xFFFFFFFFFFFFFFFFULL) >> ((unsigned PY_LONG_LONG)n))
+#define Sigma0(x)       (S(x, 28) ^ S(x, 34) ^ S(x, 39))
+#define Sigma1(x)       (S(x, 14) ^ S(x, 18) ^ S(x, 41))
+#define Gamma0(x)       (S(x, 1) ^ S(x, 8) ^ R(x, 7))
+#define Gamma1(x)       (S(x, 19) ^ S(x, 61) ^ R(x, 6))
+
+
+static void
+sha512_transform(SHAobject *sha_info)
+{
+    int i;
+    SHA_INT64 S[8], W[80], t0, t1;
+
+    memcpy(W, sha_info->data, sizeof(sha_info->data));
+    longReverse(W, (int)sizeof(sha_info->data), sha_info->Endianness);
+
+    for (i = 16; i < 80; ++i) {
+        W[i] = Gamma1(W[i - 2]) + W[i - 7] + Gamma0(W[i - 15]) + W[i - 16];
+    }
+    for (i = 0; i < 8; ++i) {
+        S[i] = sha_info->digest[i];
+    }
+
+    /* Compress */
+#define RND(a,b,c,d,e,f,g,h,i,ki)                    \
+     t0 = h + Sigma1(e) + Ch(e, f, g) + ki + W[i];   \
+     t1 = Sigma0(a) + Maj(a, b, c);                  \
+     d += t0;                                        \
+     h  = t0 + t1;
+
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],0,0x428a2f98d728ae22ULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],1,0x7137449123ef65cdULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],2,0xb5c0fbcfec4d3b2fULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],3,0xe9b5dba58189dbbcULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],4,0x3956c25bf348b538ULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],5,0x59f111f1b605d019ULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],6,0x923f82a4af194f9bULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],7,0xab1c5ed5da6d8118ULL);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],8,0xd807aa98a3030242ULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],9,0x12835b0145706fbeULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],10,0x243185be4ee4b28cULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],11,0x550c7dc3d5ffb4e2ULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],12,0x72be5d74f27b896fULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],13,0x80deb1fe3b1696b1ULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],14,0x9bdc06a725c71235ULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],15,0xc19bf174cf692694ULL);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],16,0xe49b69c19ef14ad2ULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],17,0xefbe4786384f25e3ULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],18,0x0fc19dc68b8cd5b5ULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],19,0x240ca1cc77ac9c65ULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],20,0x2de92c6f592b0275ULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],21,0x4a7484aa6ea6e483ULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],22,0x5cb0a9dcbd41fbd4ULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],23,0x76f988da831153b5ULL);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],24,0x983e5152ee66dfabULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],25,0xa831c66d2db43210ULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],26,0xb00327c898fb213fULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],27,0xbf597fc7beef0ee4ULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],28,0xc6e00bf33da88fc2ULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],29,0xd5a79147930aa725ULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],30,0x06ca6351e003826fULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],31,0x142929670a0e6e70ULL);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],32,0x27b70a8546d22ffcULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],33,0x2e1b21385c26c926ULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],34,0x4d2c6dfc5ac42aedULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],35,0x53380d139d95b3dfULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],36,0x650a73548baf63deULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],37,0x766a0abb3c77b2a8ULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],38,0x81c2c92e47edaee6ULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],39,0x92722c851482353bULL);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],40,0xa2bfe8a14cf10364ULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],41,0xa81a664bbc423001ULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],42,0xc24b8b70d0f89791ULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],43,0xc76c51a30654be30ULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],44,0xd192e819d6ef5218ULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],45,0xd69906245565a910ULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],46,0xf40e35855771202aULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],47,0x106aa07032bbd1b8ULL);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],48,0x19a4c116b8d2d0c8ULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],49,0x1e376c085141ab53ULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],50,0x2748774cdf8eeb99ULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],51,0x34b0bcb5e19b48a8ULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],52,0x391c0cb3c5c95a63ULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],53,0x4ed8aa4ae3418acbULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],54,0x5b9cca4f7763e373ULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],55,0x682e6ff3d6b2b8a3ULL);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],56,0x748f82ee5defb2fcULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],57,0x78a5636f43172f60ULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],58,0x84c87814a1f0ab72ULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],59,0x8cc702081a6439ecULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],60,0x90befffa23631e28ULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],61,0xa4506cebde82bde9ULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],62,0xbef9a3f7b2c67915ULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],63,0xc67178f2e372532bULL);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],64,0xca273eceea26619cULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],65,0xd186b8c721c0c207ULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],66,0xeada7dd6cde0eb1eULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],67,0xf57d4f7fee6ed178ULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],68,0x06f067aa72176fbaULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],69,0x0a637dc5a2c898a6ULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],70,0x113f9804bef90daeULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],71,0x1b710b35131c471bULL);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],72,0x28db77f523047d84ULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],73,0x32caab7b40c72493ULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],74,0x3c9ebe0a15c9bebcULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],75,0x431d67c49c100d4cULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],76,0x4cc5d4becb3e42b6ULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],77,0x597f299cfc657e2aULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],78,0x5fcb6fab3ad6faecULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],79,0x6c44198c4a475817ULL);
+
+#undef RND
+
+    /* feedback */
+    for (i = 0; i < 8; i++) {
+        sha_info->digest[i] = sha_info->digest[i] + S[i];
+    }
+
+}
+
+
+
+/* initialize the SHA digest */
+
+static void
+sha512_init(SHAobject *sha_info)
+{
+    TestEndianness(sha_info->Endianness)
+    sha_info->digest[0] = 0x6a09e667f3bcc908ULL;
+    sha_info->digest[1] = 0xbb67ae8584caa73bULL;
+    sha_info->digest[2] = 0x3c6ef372fe94f82bULL;
+    sha_info->digest[3] = 0xa54ff53a5f1d36f1ULL;
+    sha_info->digest[4] = 0x510e527fade682d1ULL;
+    sha_info->digest[5] = 0x9b05688c2b3e6c1fULL;
+    sha_info->digest[6] = 0x1f83d9abfb41bd6bULL;
+    sha_info->digest[7] = 0x5be0cd19137e2179ULL;
+    sha_info->count_lo = 0L;
+    sha_info->count_hi = 0L;
+    sha_info->local = 0;
+    sha_info->digestsize = 64;
+}
+
+static void
+sha384_init(SHAobject *sha_info)
+{
+    TestEndianness(sha_info->Endianness)
+    sha_info->digest[0] = 0xcbbb9d5dc1059ed8ULL;
+    sha_info->digest[1] = 0x629a292a367cd507ULL;
+    sha_info->digest[2] = 0x9159015a3070dd17ULL;
+    sha_info->digest[3] = 0x152fecd8f70e5939ULL;
+    sha_info->digest[4] = 0x67332667ffc00b31ULL;
+    sha_info->digest[5] = 0x8eb44a8768581511ULL;
+    sha_info->digest[6] = 0xdb0c2e0d64f98fa7ULL;
+    sha_info->digest[7] = 0x47b5481dbefa4fa4ULL;
+    sha_info->count_lo = 0L;
+    sha_info->count_hi = 0L;
+    sha_info->local = 0;
+    sha_info->digestsize = 48;
+}
+
+
+/* update the SHA digest */
+
+static void
+sha512_update(SHAobject *sha_info, SHA_BYTE *buffer, int count)
+{
+    int i;
+    SHA_INT32 clo;
+
+    clo = sha_info->count_lo + ((SHA_INT32) count << 3);
+    if (clo < sha_info->count_lo) {
+        ++sha_info->count_hi;
+    }
+    sha_info->count_lo = clo;
+    sha_info->count_hi += (SHA_INT32) count >> 29;
+    if (sha_info->local) {
+        i = SHA_BLOCKSIZE - sha_info->local;
+        if (i > count) {
+            i = count;
+        }
+        memcpy(((SHA_BYTE *) sha_info->data) + sha_info->local, buffer, i);
+        count -= i;
+        buffer += i;
+        sha_info->local += i;
+        if (sha_info->local == SHA_BLOCKSIZE) {
+            sha512_transform(sha_info);
+        }
+        else {
+            return;
+        }
+    }
+    while (count >= SHA_BLOCKSIZE) {
+        memcpy(sha_info->data, buffer, SHA_BLOCKSIZE);
+        buffer += SHA_BLOCKSIZE;
+        count -= SHA_BLOCKSIZE;
+        sha512_transform(sha_info);
+    }
+    memcpy(sha_info->data, buffer, count);
+    sha_info->local = count;
+}
+
+/* finish computing the SHA digest */
+
+static void
+sha512_final(unsigned char digest[SHA_DIGESTSIZE], SHAobject *sha_info)
+{
+    int count;
+    SHA_INT32 lo_bit_count, hi_bit_count;
+
+    lo_bit_count = sha_info->count_lo;
+    hi_bit_count = sha_info->count_hi;
+    count = (int) ((lo_bit_count >> 3) & 0x7f);
+    ((SHA_BYTE *) sha_info->data)[count++] = 0x80;
+    if (count > SHA_BLOCKSIZE - 16) {
+        memset(((SHA_BYTE *) sha_info->data) + count, 0,
+               SHA_BLOCKSIZE - count);
+        sha512_transform(sha_info);
+        memset((SHA_BYTE *) sha_info->data, 0, SHA_BLOCKSIZE - 16);
+    }
+    else {
+        memset(((SHA_BYTE *) sha_info->data) + count, 0,
+               SHA_BLOCKSIZE - 16 - count);
+    }
+
+    /* GJS: note that we add the hi/lo in big-endian. sha512_transform will
+       swap these values into host-order. */
+    sha_info->data[112] = 0;
+    sha_info->data[113] = 0;
+    sha_info->data[114] = 0;
+    sha_info->data[115] = 0;
+    sha_info->data[116] = 0;
+    sha_info->data[117] = 0;
+    sha_info->data[118] = 0;
+    sha_info->data[119] = 0;
+    sha_info->data[120] = (hi_bit_count >> 24) & 0xff;
+    sha_info->data[121] = (hi_bit_count >> 16) & 0xff;
+    sha_info->data[122] = (hi_bit_count >>  8) & 0xff;
+    sha_info->data[123] = (hi_bit_count >>  0) & 0xff;
+    sha_info->data[124] = (lo_bit_count >> 24) & 0xff;
+    sha_info->data[125] = (lo_bit_count >> 16) & 0xff;
+    sha_info->data[126] = (lo_bit_count >>  8) & 0xff;
+    sha_info->data[127] = (lo_bit_count >>  0) & 0xff;
+    sha512_transform(sha_info);
+    digest[ 0] = (unsigned char) ((sha_info->digest[0] >> 56) & 0xff);
+    digest[ 1] = (unsigned char) ((sha_info->digest[0] >> 48) & 0xff);
+    digest[ 2] = (unsigned char) ((sha_info->digest[0] >> 40) & 0xff);
+    digest[ 3] = (unsigned char) ((sha_info->digest[0] >> 32) & 0xff);
+    digest[ 4] = (unsigned char) ((sha_info->digest[0] >> 24) & 0xff);
+    digest[ 5] = (unsigned char) ((sha_info->digest[0] >> 16) & 0xff);
+    digest[ 6] = (unsigned char) ((sha_info->digest[0] >>  8) & 0xff);
+    digest[ 7] = (unsigned char) ((sha_info->digest[0]      ) & 0xff);
+    digest[ 8] = (unsigned char) ((sha_info->digest[1] >> 56) & 0xff);
+    digest[ 9] = (unsigned char) ((sha_info->digest[1] >> 48) & 0xff);
+    digest[10] = (unsigned char) ((sha_info->digest[1] >> 40) & 0xff);
+    digest[11] = (unsigned char) ((sha_info->digest[1] >> 32) & 0xff);
+    digest[12] = (unsigned char) ((sha_info->digest[1] >> 24) & 0xff);
+    digest[13] = (unsigned char) ((sha_info->digest[1] >> 16) & 0xff);
+    digest[14] = (unsigned char) ((sha_info->digest[1] >>  8) & 0xff);
+    digest[15] = (unsigned char) ((sha_info->digest[1]      ) & 0xff);
+    digest[16] = (unsigned char) ((sha_info->digest[2] >> 56) & 0xff);
+    digest[17] = (unsigned char) ((sha_info->digest[2] >> 48) & 0xff);
+    digest[18] = (unsigned char) ((sha_info->digest[2] >> 40) & 0xff);
+    digest[19] = (unsigned char) ((sha_info->digest[2] >> 32) & 0xff);
+    digest[20] = (unsigned char) ((sha_info->digest[2] >> 24) & 0xff);
+    digest[21] = (unsigned char) ((sha_info->digest[2] >> 16) & 0xff);
+    digest[22] = (unsigned char) ((sha_info->digest[2] >>  8) & 0xff);
+    digest[23] = (unsigned char) ((sha_info->digest[2]      ) & 0xff);
+    digest[24] = (unsigned char) ((sha_info->digest[3] >> 56) & 0xff);
+    digest[25] = (unsigned char) ((sha_info->digest[3] >> 48) & 0xff);
+    digest[26] = (unsigned char) ((sha_info->digest[3] >> 40) & 0xff);
+    digest[27] = (unsigned char) ((sha_info->digest[3] >> 32) & 0xff);
+    digest[28] = (unsigned char) ((sha_info->digest[3] >> 24) & 0xff);
+    digest[29] = (unsigned char) ((sha_info->digest[3] >> 16) & 0xff);
+    digest[30] = (unsigned char) ((sha_info->digest[3] >>  8) & 0xff);
+    digest[31] = (unsigned char) ((sha_info->digest[3]      ) & 0xff);
+    digest[32] = (unsigned char) ((sha_info->digest[4] >> 56) & 0xff);
+    digest[33] = (unsigned char) ((sha_info->digest[4] >> 48) & 0xff);
+    digest[34] = (unsigned char) ((sha_info->digest[4] >> 40) & 0xff);
+    digest[35] = (unsigned char) ((sha_info->digest[4] >> 32) & 0xff);
+    digest[36] = (unsigned char) ((sha_info->digest[4] >> 24) & 0xff);
+    digest[37] = (unsigned char) ((sha_info->digest[4] >> 16) & 0xff);
+    digest[38] = (unsigned char) ((sha_info->digest[4] >>  8) & 0xff);
+    digest[39] = (unsigned char) ((sha_info->digest[4]      ) & 0xff);
+    digest[40] = (unsigned char) ((sha_info->digest[5] >> 56) & 0xff);
+    digest[41] = (unsigned char) ((sha_info->digest[5] >> 48) & 0xff);
+    digest[42] = (unsigned char) ((sha_info->digest[5] >> 40) & 0xff);
+    digest[43] = (unsigned char) ((sha_info->digest[5] >> 32) & 0xff);
+    digest[44] = (unsigned char) ((sha_info->digest[5] >> 24) & 0xff);
+    digest[45] = (unsigned char) ((sha_info->digest[5] >> 16) & 0xff);
+    digest[46] = (unsigned char) ((sha_info->digest[5] >>  8) & 0xff);
+    digest[47] = (unsigned char) ((sha_info->digest[5]      ) & 0xff);
+    digest[48] = (unsigned char) ((sha_info->digest[6] >> 56) & 0xff);
+    digest[49] = (unsigned char) ((sha_info->digest[6] >> 48) & 0xff);
+    digest[50] = (unsigned char) ((sha_info->digest[6] >> 40) & 0xff);
+    digest[51] = (unsigned char) ((sha_info->digest[6] >> 32) & 0xff);
+    digest[52] = (unsigned char) ((sha_info->digest[6] >> 24) & 0xff);
+    digest[53] = (unsigned char) ((sha_info->digest[6] >> 16) & 0xff);
+    digest[54] = (unsigned char) ((sha_info->digest[6] >>  8) & 0xff);
+    digest[55] = (unsigned char) ((sha_info->digest[6]      ) & 0xff);
+    digest[56] = (unsigned char) ((sha_info->digest[7] >> 56) & 0xff);
+    digest[57] = (unsigned char) ((sha_info->digest[7] >> 48) & 0xff);
+    digest[58] = (unsigned char) ((sha_info->digest[7] >> 40) & 0xff);
+    digest[59] = (unsigned char) ((sha_info->digest[7] >> 32) & 0xff);
+    digest[60] = (unsigned char) ((sha_info->digest[7] >> 24) & 0xff);
+    digest[61] = (unsigned char) ((sha_info->digest[7] >> 16) & 0xff);
+    digest[62] = (unsigned char) ((sha_info->digest[7] >>  8) & 0xff);
+    digest[63] = (unsigned char) ((sha_info->digest[7]      ) & 0xff);
+}
+
+/*
+ * End of copied SHA code.
+ *
+ * ------------------------------------------------------------------------
+ */
+
+static PyTypeObject SHA384type;
+static PyTypeObject SHA512type;
+
+
+static SHAobject *
+newSHA384object(void)
+{
+    return (SHAobject *)PyObject_New(SHAobject, &SHA384type);
+}
+
+static SHAobject *
+newSHA512object(void)
+{
+    return (SHAobject *)PyObject_New(SHAobject, &SHA512type);
+}
+
+/* Internal methods for a hash object */
+
+static void
+SHA512_dealloc(PyObject *ptr)
+{
+    PyObject_Del(ptr);
+}
+
+
+/* External methods for a hash object */
+
+PyDoc_STRVAR(SHA512_copy__doc__, "Return a copy of the hash object.");
+
+static PyObject *
+SHA512_copy(SHAobject *self, PyObject *unused)
+{
+    SHAobject *newobj;
+
+    if (((PyObject*)self)->ob_type == &SHA512type) {
+        if ( (newobj = newSHA512object())==NULL)
+            return NULL;
+    } else {
+        if ( (newobj = newSHA384object())==NULL)
+            return NULL;
+    }
+
+    SHAcopy(self, newobj);
+    return (PyObject *)newobj;
+}
+
+PyDoc_STRVAR(SHA512_digest__doc__,
+"Return the digest value as a string of binary data.");
+
+static PyObject *
+SHA512_digest(SHAobject *self, PyObject *unused)
+{
+    unsigned char digest[SHA_DIGESTSIZE];
+    SHAobject temp;
+
+    SHAcopy(self, &temp);
+    sha512_final(digest, &temp);
+    return PyString_FromStringAndSize((const char *)digest, self->digestsize);
+}
+
+PyDoc_STRVAR(SHA512_hexdigest__doc__,
+"Return the digest value as a string of hexadecimal digits.");
+
+static PyObject *
+SHA512_hexdigest(SHAobject *self, PyObject *unused)
+{
+    unsigned char digest[SHA_DIGESTSIZE];
+    SHAobject temp;
+    PyObject *retval;
+    char *hex_digest;
+    int i, j;
+
+    /* Get the raw (binary) digest value */
+    SHAcopy(self, &temp);
+    sha512_final(digest, &temp);
+
+    /* Create a new string */
+    retval = PyString_FromStringAndSize(NULL, self->digestsize * 2);
+    if (!retval)
+        return NULL;
+    hex_digest = PyString_AsString(retval);
+    if (!hex_digest) {
+        Py_DECREF(retval);
+        return NULL;
+    }
+
+    /* Make hex version of the digest */
+    for (i = j = 0; i < self->digestsize; i++) {
+        char c;
+        c = (digest[i] >> 4) & 0xf;
+        c = (c > 9) ? c + 'a' - 10 : c + '0';
+        hex_digest[j++] = c;
+        c = (digest[i] & 0xf);
+        c = (c > 9) ? c + 'a' - 10 : c + '0';
+        hex_digest[j++] = c;
+    }
+    return retval;
+}
+
+PyDoc_STRVAR(SHA512_update__doc__,
+"Update this hash object's state with the provided string.");
+
+static PyObject *
+SHA512_update(SHAobject *self, PyObject *args)
+{
+    unsigned char *cp;
+    int len;
+
+    if (!PyArg_ParseTuple(args, "s#:update", &cp, &len))
+        return NULL;
+
+    sha512_update(self, cp, len);
+
+    Py_INCREF(Py_None);
+    return Py_None;
+}
+
+static PyMethodDef SHA_methods[] = {
+    {"copy",	  (PyCFunction)SHA512_copy,      METH_NOARGS, SHA512_copy__doc__},
+    {"digest",	  (PyCFunction)SHA512_digest,    METH_NOARGS, SHA512_digest__doc__},
+    {"hexdigest", (PyCFunction)SHA512_hexdigest, METH_NOARGS, SHA512_hexdigest__doc__},
+    {"update",	  (PyCFunction)SHA512_update,    METH_VARARGS, SHA512_update__doc__},
+    {NULL,	  NULL}		/* sentinel */
+};
+
+static PyObject *
+SHA512_get_block_size(PyObject *self, void *closure)
+{
+    return PyInt_FromLong(SHA_BLOCKSIZE);
+}
+
+static PyObject *
+SHA512_get_name(PyObject *self, void *closure)
+{
+    if (((SHAobject *)self)->digestsize == 64)
+        return PyString_FromStringAndSize("SHA512", 6);
+    else
+        return PyString_FromStringAndSize("SHA384", 6);
+}
+
+static PyGetSetDef SHA_getseters[] = {
+    {"block_size",
+     (getter)SHA512_get_block_size, NULL,
+     NULL,
+     NULL},
+    {"name",
+     (getter)SHA512_get_name, NULL,
+     NULL,
+     NULL},
+    {NULL}  /* Sentinel */
+};
+
+static PyMemberDef SHA_members[] = {
+    {"digest_size", T_INT, offsetof(SHAobject, digestsize), READONLY, NULL},
+    /* the old md5 and sha modules support 'digest_size' as in PEP 247.
+     * the old sha module also supported 'digestsize'.  ugh. */
+    {"digestsize", T_INT, offsetof(SHAobject, digestsize), READONLY, NULL},
+    {NULL}  /* Sentinel */
+};
+
+static PyTypeObject SHA384type = {
+    PyObject_HEAD_INIT(NULL)
+    0,			/*ob_size*/
+    "_sha512.sha384",	/*tp_name*/
+    sizeof(SHAobject),	/*tp_size*/
+    0,			/*tp_itemsize*/
+    /* methods */
+    SHA512_dealloc,	/*tp_dealloc*/
+    0,			/*tp_print*/
+    0,          	/*tp_getattr*/
+    0,                  /*tp_setattr*/
+    0,                  /*tp_compare*/
+    0,                  /*tp_repr*/
+    0,                  /*tp_as_number*/
+    0,                  /*tp_as_sequence*/
+    0,                  /*tp_as_mapping*/
+    0,                  /*tp_hash*/
+    0,                  /*tp_call*/
+    0,                  /*tp_str*/
+    0,                  /*tp_getattro*/
+    0,                  /*tp_setattro*/
+    0,                  /*tp_as_buffer*/
+    Py_TPFLAGS_DEFAULT, /*tp_flags*/
+    0,                  /*tp_doc*/
+    0,                  /*tp_traverse*/
+    0,			/*tp_clear*/
+    0,			/*tp_richcompare*/
+    0,			/*tp_weaklistoffset*/
+    0,			/*tp_iter*/
+    0,			/*tp_iternext*/
+    SHA_methods,	/* tp_methods */
+    SHA_members,	/* tp_members */
+    SHA_getseters,      /* tp_getset */
+};
+
+static PyTypeObject SHA512type = {
+    PyObject_HEAD_INIT(NULL)
+    0,			/*ob_size*/
+    "_sha512.sha512",	/*tp_name*/
+    sizeof(SHAobject),	/*tp_size*/
+    0,			/*tp_itemsize*/
+    /* methods */
+    SHA512_dealloc,	/*tp_dealloc*/
+    0,			/*tp_print*/
+    0,          	/*tp_getattr*/
+    0,                  /*tp_setattr*/
+    0,                  /*tp_compare*/
+    0,                  /*tp_repr*/
+    0,                  /*tp_as_number*/
+    0,                  /*tp_as_sequence*/
+    0,                  /*tp_as_mapping*/
+    0,                  /*tp_hash*/
+    0,                  /*tp_call*/
+    0,                  /*tp_str*/
+    0,                  /*tp_getattro*/
+    0,                  /*tp_setattro*/
+    0,                  /*tp_as_buffer*/
+    Py_TPFLAGS_DEFAULT, /*tp_flags*/
+    0,                  /*tp_doc*/
+    0,                  /*tp_traverse*/
+    0,			/*tp_clear*/
+    0,			/*tp_richcompare*/
+    0,			/*tp_weaklistoffset*/
+    0,			/*tp_iter*/
+    0,			/*tp_iternext*/
+    SHA_methods,	/* tp_methods */
+    SHA_members,	/* tp_members */
+    SHA_getseters,      /* tp_getset */
+};
+
+
+/* The single module-level function: new() */
+
+PyDoc_STRVAR(SHA512_new__doc__,
+"Return a new SHA-512 hash object; optionally initialized with a string.");
+
+static PyObject *
+SHA512_new(PyObject *self, PyObject *args, PyObject *kwdict)
+{
+    static char *kwlist[] = {"string", NULL};
+    SHAobject *new;
+    unsigned char *cp = NULL;
+    int len;
+
+    if (!PyArg_ParseTupleAndKeywords(args, kwdict, "|s#:new", kwlist,
+                                     &cp, &len)) {
+        return NULL;
+    }
+
+    if ((new = newSHA512object()) == NULL)
+        return NULL;
+
+    sha512_init(new);
+
+    if (PyErr_Occurred()) {
+        Py_DECREF(new);
+        return NULL;
+    }
+    if (cp)
+        sha512_update(new, cp, len);
+
+    return (PyObject *)new;
+}
+
+PyDoc_STRVAR(SHA384_new__doc__,
+"Return a new SHA-384 hash object; optionally initialized with a string.");
+
+static PyObject *
+SHA384_new(PyObject *self, PyObject *args, PyObject *kwdict)
+{
+    static char *kwlist[] = {"string", NULL};
+    SHAobject *new;
+    unsigned char *cp = NULL;
+    int len;
+
+    if (!PyArg_ParseTupleAndKeywords(args, kwdict, "|s#:new", kwlist,
+                                     &cp, &len)) {
+        return NULL;
+    }
+
+    if ((new = newSHA384object()) == NULL)
+        return NULL;
+
+    sha384_init(new);
+
+    if (PyErr_Occurred()) {
+        Py_DECREF(new);
+        return NULL;
+    }
+    if (cp)
+        sha512_update(new, cp, len);
+
+    return (PyObject *)new;
+}
+
+
+/* List of functions exported by this module */
+
+static struct PyMethodDef SHA_functions[] = {
+    {"sha512", (PyCFunction)SHA512_new, METH_VARARGS|METH_KEYWORDS, SHA512_new__doc__},
+    {"sha384", (PyCFunction)SHA384_new, METH_VARARGS|METH_KEYWORDS, SHA384_new__doc__},
+    {NULL,	NULL}		 /* Sentinel */
+};
+
+
+/* Initialize this module. */
+
+#define insint(n,v) { PyModule_AddIntConstant(m,n,v); }
+
+PyMODINIT_FUNC
+init_sha512(void)
+{
+    PyObject *m;
+
+    SHA384type.ob_type = &PyType_Type;
+    if (PyType_Ready(&SHA384type) < 0)
+        return;
+    SHA512type.ob_type = &PyType_Type;
+    if (PyType_Ready(&SHA512type) < 0)
+        return;
+    m = Py_InitModule("_sha512", SHA_functions);
+    if (m == NULL)
+        return;
+}
+
+#endif
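
For readers following the backport: once built, these extension modules expose the constructor-style hash API that Python's public `hashlib` module wraps (the `_sha512` module name itself is a private implementation detail). A minimal sketch of the Python-level behaviour implied by the C code above, exercised through `hashlib`:

```python
# Exercise the hash-object API that the C modules above implement.
# hashlib wraps the same constructors, so the observable behaviour
# (digest_size, block_size, update/digest/copy) matches the C code.
import hashlib

h = hashlib.sha512()
h.update(b"abc")             # corresponds to SHA512_update / sha512_update
print(h.digest_size)         # 64 bytes, as set in sha512_init
print(h.block_size)          # 128 bytes for the 512-bit variant
print(len(h.digest()))       # 64
print(hashlib.sha384().digest_size)  # 48, as set in sha384_init

# copy() returns an independent hash object, mirroring SHA512_copy/SHAcopy:
# updating the copy must not affect the original.
h2 = h.copy()
h2.update(b"def")
assert h.hexdigest() != h2.hexdigest()
```

Note that `hexdigest()` here is the Python-side equivalent of the nibble-by-nibble loop in `SHA512_hexdigest` above.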
diff --git a/src/Modules/shamodule.c b/src/Modules/shamodule.c
new file mode 100644
--- /dev/null
+++ b/src/Modules/shamodule.c
@@ -0,0 +1,593 @@
+/* SHA module */
+
+/* This module provides an interface to NIST's Secure Hash Algorithm */
+
+/* See below for information about the original code this module was
+   based upon. Additional work performed by:
+
+   Andrew Kuchling (amk at amk.ca)
+   Greg Stein (gstein at lyra.org)
+
+   Copyright (C) 2005   Gregory P. Smith (greg at krypto.org)
+   Licensed to PSF under a Contributor Agreement.
+
+*/
+
+/* SHA objects */
+
+#include "Python.h"
+#include "structmember.h"
+
+
+/* Endianness testing and definitions */
+#define TestEndianness(variable) {int i=1; variable=PCT_BIG_ENDIAN;\
+	if (*((char*)&i)==1) variable=PCT_LITTLE_ENDIAN;}
+
+#define PCT_LITTLE_ENDIAN 1
+#define PCT_BIG_ENDIAN 0
+
+/* Some useful types */
+
+typedef unsigned char SHA_BYTE;
+
+#if SIZEOF_INT == 4
+typedef unsigned int SHA_INT32;	/* 32-bit integer */
+#else
+#error "This module requires an unsigned 32-bit integer type (sizeof(int) == 4)"
+#endif
+
+/* The SHA block size and message digest sizes, in bytes */
+
+#define SHA_BLOCKSIZE    64
+#define SHA_DIGESTSIZE  20
+
+/* The structure for storing SHS info */
+
+typedef struct {
+    PyObject_HEAD
+    SHA_INT32 digest[5];		/* Message digest */
+    SHA_INT32 count_lo, count_hi;	/* 64-bit bit count */
+    SHA_BYTE data[SHA_BLOCKSIZE];	/* SHA data buffer */
+    int Endianness;
+    int local;				/* unprocessed amount in data */
+} SHAobject;
+
+/* When run on a little-endian CPU we need to perform byte reversal on an
+   array of longwords. */
+
+static void longReverse(SHA_INT32 *buffer, int byteCount, int Endianness)
+{
+    SHA_INT32 value;
+
+    if (Endianness == PCT_BIG_ENDIAN)
+        return;
+
+    byteCount /= sizeof(*buffer);
+    while (byteCount--) {
+        value = *buffer;
+        value = ( ( value & 0xFF00FF00L ) >> 8  ) | \
+                ( ( value & 0x00FF00FFL ) << 8 );
+        *buffer++ = ( value << 16 ) | ( value >> 16 );
+    }
+}
+
+static void SHAcopy(SHAobject *src, SHAobject *dest)
+{
+    dest->Endianness = src->Endianness;
+    dest->local = src->local;
+    dest->count_lo = src->count_lo;
+    dest->count_hi = src->count_hi;
+    memcpy(dest->digest, src->digest, sizeof(src->digest));
+    memcpy(dest->data, src->data, sizeof(src->data));
+}
+
+
+/* ------------------------------------------------------------------------
+ *
+ * This code for the SHA algorithm was noted as public domain. The original
+ * headers are pasted below.
+ *
+ * Several changes have been made to make it more compatible with the
+ * Python environment and desired interface.
+ *
+ */
+
+/* NIST Secure Hash Algorithm */
+/* heavily modified by Uwe Hollerbach <uh at alumni.caltech.edu> */
+/* from Peter C. Gutmann's implementation as found in */
+/* Applied Cryptography by Bruce Schneier */
+/* Further modifications to include the "UNRAVEL" stuff, below */
+
+/* This code is in the public domain */
+
+/* UNRAVEL should be fastest & biggest */
+/* UNROLL_LOOPS should be just as big, but slightly slower */
+/* both undefined should be smallest and slowest */
+
+#define UNRAVEL
+/* #define UNROLL_LOOPS */
+
+/* The SHA f()-functions.  The f1 and f3 functions can be optimized to
+   save one boolean operation each - thanks to Rich Schroeppel,
+   rcs at cs.arizona.edu for discovering this */
+
+/*#define f1(x,y,z)	((x & y) | (~x & z))		// Rounds  0-19 */
+#define f1(x,y,z)	(z ^ (x & (y ^ z)))		/* Rounds  0-19 */
+#define f2(x,y,z)	(x ^ y ^ z)			/* Rounds 20-39 */
+/*#define f3(x,y,z)	((x & y) | (x & z) | (y & z))	// Rounds 40-59 */
+#define f3(x,y,z)	((x & y) | (z & (x | y)))	/* Rounds 40-59 */
+#define f4(x,y,z)	(x ^ y ^ z)			/* Rounds 60-79 */
+
+/* SHA constants */
+
+#define CONST1		0x5a827999L			/* Rounds  0-19 */
+#define CONST2		0x6ed9eba1L			/* Rounds 20-39 */
+#define CONST3		0x8f1bbcdcL			/* Rounds 40-59 */
+#define CONST4		0xca62c1d6L			/* Rounds 60-79 */
+
+/* 32-bit rotate */
+
+#define R32(x,n)	((x << n) | (x >> (32 - n)))
+
+/* the generic case, for when the overall rotation is not unraveled */
+
+#define FG(n)	\
+    T = R32(A,5) + f##n(B,C,D) + E + *WP++ + CONST##n;	\
+    E = D; D = C; C = R32(B,30); B = A; A = T
+
+/* specific cases, for when the overall rotation is unraveled */
+
+#define FA(n)	\
+    T = R32(A,5) + f##n(B,C,D) + E + *WP++ + CONST##n; B = R32(B,30)
+
+#define FB(n)	\
+    E = R32(T,5) + f##n(A,B,C) + D + *WP++ + CONST##n; A = R32(A,30)
+
+#define FC(n)	\
+    D = R32(E,5) + f##n(T,A,B) + C + *WP++ + CONST##n; T = R32(T,30)
+
+#define FD(n)	\
+    C = R32(D,5) + f##n(E,T,A) + B + *WP++ + CONST##n; E = R32(E,30)
+
+#define FE(n)	\
+    B = R32(C,5) + f##n(D,E,T) + A + *WP++ + CONST##n; D = R32(D,30)
+
+#define FT(n)	\
+    A = R32(B,5) + f##n(C,D,E) + T + *WP++ + CONST##n; C = R32(C,30)
+
+/* do SHA transformation */
+
+static void
+sha_transform(SHAobject *sha_info)
+{
+    int i;
+    SHA_INT32 T, A, B, C, D, E, W[80], *WP;
+
+    memcpy(W, sha_info->data, sizeof(sha_info->data));
+    longReverse(W, (int)sizeof(sha_info->data), sha_info->Endianness);
+
+    for (i = 16; i < 80; ++i) {
+        W[i] = W[i-3] ^ W[i-8] ^ W[i-14] ^ W[i-16];
+
+        /* extra rotation fix */
+        W[i] = R32(W[i], 1);
+    }
+    A = sha_info->digest[0];
+    B = sha_info->digest[1];
+    C = sha_info->digest[2];
+    D = sha_info->digest[3];
+    E = sha_info->digest[4];
+    WP = W;
+#ifdef UNRAVEL
+    FA(1); FB(1); FC(1); FD(1); FE(1); FT(1); FA(1); FB(1); FC(1); FD(1);
+    FE(1); FT(1); FA(1); FB(1); FC(1); FD(1); FE(1); FT(1); FA(1); FB(1);
+    FC(2); FD(2); FE(2); FT(2); FA(2); FB(2); FC(2); FD(2); FE(2); FT(2);
+    FA(2); FB(2); FC(2); FD(2); FE(2); FT(2); FA(2); FB(2); FC(2); FD(2);
+    FE(3); FT(3); FA(3); FB(3); FC(3); FD(3); FE(3); FT(3); FA(3); FB(3);
+    FC(3); FD(3); FE(3); FT(3); FA(3); FB(3); FC(3); FD(3); FE(3); FT(3);
+    FA(4); FB(4); FC(4); FD(4); FE(4); FT(4); FA(4); FB(4); FC(4); FD(4);
+    FE(4); FT(4); FA(4); FB(4); FC(4); FD(4); FE(4); FT(4); FA(4); FB(4);
+    sha_info->digest[0] += E;
+    sha_info->digest[1] += T;
+    sha_info->digest[2] += A;
+    sha_info->digest[3] += B;
+    sha_info->digest[4] += C;
+#else /* !UNRAVEL */
+#ifdef UNROLL_LOOPS
+    FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1);
+    FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1);
+    FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2);
+    FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2);
+    FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3);
+    FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3);
+    FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4);
+    FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4);
+#else /* !UNROLL_LOOPS */
+    for (i =  0; i < 20; ++i) { FG(1); }
+    for (i = 20; i < 40; ++i) { FG(2); }
+    for (i = 40; i < 60; ++i) { FG(3); }
+    for (i = 60; i < 80; ++i) { FG(4); }
+#endif /* !UNROLL_LOOPS */
+    sha_info->digest[0] += A;
+    sha_info->digest[1] += B;
+    sha_info->digest[2] += C;
+    sha_info->digest[3] += D;
+    sha_info->digest[4] += E;
+#endif /* !UNRAVEL */
+}
+
+/* initialize the SHA digest */
+
+static void
+sha_init(SHAobject *sha_info)
+{
+    TestEndianness(sha_info->Endianness)
+
+    sha_info->digest[0] = 0x67452301L;
+    sha_info->digest[1] = 0xefcdab89L;
+    sha_info->digest[2] = 0x98badcfeL;
+    sha_info->digest[3] = 0x10325476L;
+    sha_info->digest[4] = 0xc3d2e1f0L;
+    sha_info->count_lo = 0L;
+    sha_info->count_hi = 0L;
+    sha_info->local = 0;
+}
+
+/* update the SHA digest */
+
+static void
+sha_update(SHAobject *sha_info, SHA_BYTE *buffer, int count)
+{
+    int i;
+    SHA_INT32 clo;
+
+    clo = sha_info->count_lo + ((SHA_INT32) count << 3);
+    if (clo < sha_info->count_lo) {
+        ++sha_info->count_hi;
+    }
+    sha_info->count_lo = clo;
+    sha_info->count_hi += (SHA_INT32) count >> 29;
+    if (sha_info->local) {
+        i = SHA_BLOCKSIZE - sha_info->local;
+        if (i > count) {
+            i = count;
+        }
+        memcpy(((SHA_BYTE *) sha_info->data) + sha_info->local, buffer, i);
+        count -= i;
+        buffer += i;
+        sha_info->local += i;
+        if (sha_info->local == SHA_BLOCKSIZE) {
+            sha_transform(sha_info);
+        }
+        else {
+            return;
+        }
+    }
+    while (count >= SHA_BLOCKSIZE) {
+        memcpy(sha_info->data, buffer, SHA_BLOCKSIZE);
+        buffer += SHA_BLOCKSIZE;
+        count -= SHA_BLOCKSIZE;
+        sha_transform(sha_info);
+    }
+    memcpy(sha_info->data, buffer, count);
+    sha_info->local = count;
+}
+
+/* finish computing the SHA digest */
+
+static void
+sha_final(unsigned char digest[20], SHAobject *sha_info)
+{
+    int count;
+    SHA_INT32 lo_bit_count, hi_bit_count;
+
+    lo_bit_count = sha_info->count_lo;
+    hi_bit_count = sha_info->count_hi;
+    count = (int) ((lo_bit_count >> 3) & 0x3f);
+    ((SHA_BYTE *) sha_info->data)[count++] = 0x80;
+    if (count > SHA_BLOCKSIZE - 8) {
+        memset(((SHA_BYTE *) sha_info->data) + count, 0,
+               SHA_BLOCKSIZE - count);
+        sha_transform(sha_info);
+        memset((SHA_BYTE *) sha_info->data, 0, SHA_BLOCKSIZE - 8);
+    }
+    else {
+        memset(((SHA_BYTE *) sha_info->data) + count, 0,
+               SHA_BLOCKSIZE - 8 - count);
+    }
+
+    /* GJS: note that we add the hi/lo in big-endian. sha_transform will
+       swap these values into host-order. */
+    sha_info->data[56] = (hi_bit_count >> 24) & 0xff;
+    sha_info->data[57] = (hi_bit_count >> 16) & 0xff;
+    sha_info->data[58] = (hi_bit_count >>  8) & 0xff;
+    sha_info->data[59] = (hi_bit_count >>  0) & 0xff;
+    sha_info->data[60] = (lo_bit_count >> 24) & 0xff;
+    sha_info->data[61] = (lo_bit_count >> 16) & 0xff;
+    sha_info->data[62] = (lo_bit_count >>  8) & 0xff;
+    sha_info->data[63] = (lo_bit_count >>  0) & 0xff;
+    sha_transform(sha_info);
+    digest[ 0] = (unsigned char) ((sha_info->digest[0] >> 24) & 0xff);
+    digest[ 1] = (unsigned char) ((sha_info->digest[0] >> 16) & 0xff);
+    digest[ 2] = (unsigned char) ((sha_info->digest[0] >>  8) & 0xff);
+    digest[ 3] = (unsigned char) ((sha_info->digest[0]      ) & 0xff);
+    digest[ 4] = (unsigned char) ((sha_info->digest[1] >> 24) & 0xff);
+    digest[ 5] = (unsigned char) ((sha_info->digest[1] >> 16) & 0xff);
+    digest[ 6] = (unsigned char) ((sha_info->digest[1] >>  8) & 0xff);
+    digest[ 7] = (unsigned char) ((sha_info->digest[1]      ) & 0xff);
+    digest[ 8] = (unsigned char) ((sha_info->digest[2] >> 24) & 0xff);
+    digest[ 9] = (unsigned char) ((sha_info->digest[2] >> 16) & 0xff);
+    digest[10] = (unsigned char) ((sha_info->digest[2] >>  8) & 0xff);
+    digest[11] = (unsigned char) ((sha_info->digest[2]      ) & 0xff);
+    digest[12] = (unsigned char) ((sha_info->digest[3] >> 24) & 0xff);
+    digest[13] = (unsigned char) ((sha_info->digest[3] >> 16) & 0xff);
+    digest[14] = (unsigned char) ((sha_info->digest[3] >>  8) & 0xff);
+    digest[15] = (unsigned char) ((sha_info->digest[3]      ) & 0xff);
+    digest[16] = (unsigned char) ((sha_info->digest[4] >> 24) & 0xff);
+    digest[17] = (unsigned char) ((sha_info->digest[4] >> 16) & 0xff);
+    digest[18] = (unsigned char) ((sha_info->digest[4] >>  8) & 0xff);
+    digest[19] = (unsigned char) ((sha_info->digest[4]      ) & 0xff);
+}
+
+/*
+ * End of copied SHA code.
+ *
+ * ------------------------------------------------------------------------
+ */
+
+static PyTypeObject SHAtype;
+
+
+static SHAobject *
+newSHAobject(void)
+{
+    return (SHAobject *)PyObject_New(SHAobject, &SHAtype);
+}
+
+/* Internal methods for a hashing object */
+
+static void
+SHA_dealloc(PyObject *ptr)
+{
+    PyObject_Del(ptr);
+}
+
+
+/* External methods for a hashing object */
+
+PyDoc_STRVAR(SHA_copy__doc__, "Return a copy of the hashing object.");
+
+static PyObject *
+SHA_copy(SHAobject *self, PyObject *unused)
+{
+    SHAobject *newobj;
+
+    if ( (newobj = newSHAobject())==NULL)
+        return NULL;
+
+    SHAcopy(self, newobj);
+    return (PyObject *)newobj;
+}
+
+PyDoc_STRVAR(SHA_digest__doc__,
+"Return the digest value as a string of binary data.");
+
+static PyObject *
+SHA_digest(SHAobject *self, PyObject *unused)
+{
+    unsigned char digest[SHA_DIGESTSIZE];
+    SHAobject temp;
+
+    SHAcopy(self, &temp);
+    sha_final(digest, &temp);
+    return PyString_FromStringAndSize((const char *)digest, sizeof(digest));
+}
+
+PyDoc_STRVAR(SHA_hexdigest__doc__,
+"Return the digest value as a string of hexadecimal digits.");
+
+static PyObject *
+SHA_hexdigest(SHAobject *self, PyObject *unused)
+{
+    unsigned char digest[SHA_DIGESTSIZE];
+    SHAobject temp;
+    PyObject *retval;
+    char *hex_digest;
+    int i, j;
+
+    /* Get the raw (binary) digest value */
+    SHAcopy(self, &temp);
+    sha_final(digest, &temp);
+
+    /* Create a new string */
+    retval = PyString_FromStringAndSize(NULL, sizeof(digest) * 2);
+    if (!retval)
+        return NULL;
+    hex_digest = PyString_AsString(retval);
+    if (!hex_digest) {
+        Py_DECREF(retval);
+        return NULL;
+    }
+
+    /* Make hex version of the digest */
+    for (i = j = 0; i < (int)sizeof(digest); i++) {
+        char c;
+        c = (digest[i] >> 4) & 0xf;
+        c = (c > 9) ? c + 'a' - 10 : c + '0';
+        hex_digest[j++] = c;
+        c = (digest[i] & 0xf);
+        c = (c > 9) ? c + 'a' - 10 : c + '0';
+        hex_digest[j++] = c;
+    }
+    return retval;
+}
+
+PyDoc_STRVAR(SHA_update__doc__,
+"Update this hashing object's state with the provided string.");
+
+static PyObject *
+SHA_update(SHAobject *self, PyObject *args)
+{
+    unsigned char *cp;
+    int len;
+
+    if (!PyArg_ParseTuple(args, "s#:update", &cp, &len))
+        return NULL;
+
+    sha_update(self, cp, len);
+
+    Py_INCREF(Py_None);
+    return Py_None;
+}
+
+static PyMethodDef SHA_methods[] = {
+    {"copy",	  (PyCFunction)SHA_copy,      METH_NOARGS,  SHA_copy__doc__},
+    {"digest",	  (PyCFunction)SHA_digest,    METH_NOARGS,  SHA_digest__doc__},
+    {"hexdigest", (PyCFunction)SHA_hexdigest, METH_NOARGS,  SHA_hexdigest__doc__},
+    {"update",	  (PyCFunction)SHA_update,    METH_VARARGS, SHA_update__doc__},
+    {NULL,	  NULL}		/* sentinel */
+};
+
+static PyObject *
+SHA_get_block_size(PyObject *self, void *closure)
+{
+    return PyInt_FromLong(SHA_BLOCKSIZE);
+}
+
+static PyObject *
+SHA_get_digest_size(PyObject *self, void *closure)
+{
+    return PyInt_FromLong(SHA_DIGESTSIZE);
+}
+
+static PyObject *
+SHA_get_name(PyObject *self, void *closure)
+{
+    return PyString_FromStringAndSize("SHA1", 4);
+}
+
+static PyGetSetDef SHA_getseters[] = {
+    {"digest_size",
+     (getter)SHA_get_digest_size, NULL,
+     NULL,
+     NULL},
+    {"block_size",
+     (getter)SHA_get_block_size, NULL,
+     NULL,
+     NULL},
+    {"name",
+     (getter)SHA_get_name, NULL,
+     NULL,
+     NULL},
+    /* the old md5 and sha modules support 'digest_size' as in PEP 247.
+     * the old sha module also supported 'digestsize'.  ugh. */
+    {"digestsize",
+     (getter)SHA_get_digest_size, NULL,
+     NULL,
+     NULL},
+    {NULL}  /* Sentinel */
+};
+
+static PyTypeObject SHAtype = {
+    PyObject_HEAD_INIT(NULL)
+    0,			/*ob_size*/
+    "_sha.sha",		/*tp_name*/
+    sizeof(SHAobject),	/*tp_size*/
+    0,			/*tp_itemsize*/
+    /* methods */
+    SHA_dealloc,	/*tp_dealloc*/
+    0,			/*tp_print*/
+    0,                  /*tp_getattr*/
+    0,                  /*tp_setattr*/
+    0,                  /*tp_compare*/
+    0,                  /*tp_repr*/
+    0,                  /*tp_as_number*/
+    0,                  /*tp_as_sequence*/
+    0,                  /*tp_as_mapping*/
+    0,                  /*tp_hash*/
+    0,                  /*tp_call*/
+    0,                  /*tp_str*/
+    0,                  /*tp_getattro*/
+    0,                  /*tp_setattro*/
+    0,                  /*tp_as_buffer*/
+    Py_TPFLAGS_DEFAULT, /*tp_flags*/
+    0,                  /*tp_doc*/
+    0,                  /*tp_traverse*/
+    0,			/*tp_clear*/
+    0,			/*tp_richcompare*/
+    0,			/*tp_weaklistoffset*/
+    0,			/*tp_iter*/
+    0,			/*tp_iternext*/
+    SHA_methods,	/* tp_methods */
+    0,                  /* tp_members */
+    SHA_getseters,      /* tp_getset */
+};
+
+
+/* The single module-level function: new() */
+
+PyDoc_STRVAR(SHA_new__doc__,
+"Return a new SHA hashing object.  An optional string argument\n\
+may be provided; if present, this string will be automatically\n\
+hashed.");
+
+static PyObject *
+SHA_new(PyObject *self, PyObject *args, PyObject *kwdict)
+{
+    static char *kwlist[] = {"string", NULL};
+    SHAobject *new;
+    unsigned char *cp = NULL;
+    int len;
+
+    if (!PyArg_ParseTupleAndKeywords(args, kwdict, "|s#:new", kwlist,
+                                     &cp, &len)) {
+        return NULL;
+    }
+
+    if ((new = newSHAobject()) == NULL)
+        return NULL;
+
+    sha_init(new);
+
+    if (PyErr_Occurred()) {
+        Py_DECREF(new);
+        return NULL;
+    }
+    if (cp)
+        sha_update(new, cp, len);
+
+    return (PyObject *)new;
+}
+
+
+/* List of functions exported by this module */
+
+static struct PyMethodDef SHA_functions[] = {
+    {"new", (PyCFunction)SHA_new, METH_VARARGS|METH_KEYWORDS, SHA_new__doc__},
+    {NULL,	NULL}		 /* Sentinel */
+};
+
+
+/* Initialize this module. */
+
+#define insint(n,v) { PyModule_AddIntConstant(m,n,v); }
+
+PyMODINIT_FUNC
+init_sha(void)
+{
+    PyObject *m;
+
+    SHAtype.ob_type = &PyType_Type;
+    if (PyType_Ready(&SHAtype) < 0)
+        return;
+    m = Py_InitModule("_sha", SHA_functions);
+    if (m == NULL)
+	return;
+
+    /* Add some symbolic constants to the module */
+    insint("blocksize", 1);  /* For future use, in case some hash
+                                functions require an integral number of
+                                blocks */ 
+    insint("digestsize", 20);
+    insint("digest_size", 20);
+}
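The nibble-by-nibble conversion in `SHA_hexdigest` above can be sketched in Python. `hex_of_digest` is a hypothetical helper written only to illustrate the C loop, not part of the module:

```python
def hex_of_digest(digest):
    """Mirror the C loop in SHA_hexdigest: for each byte, emit the high
    nibble then the low nibble, mapped to '0'-'9' or 'a'-'f'."""
    out = []
    for byte in digest:
        for nibble in ((byte >> 4) & 0xF, byte & 0xF):
            if nibble > 9:
                out.append(chr(nibble + ord('a') - 10))
            else:
                out.append(chr(nibble + ord('0')))
    return ''.join(out)
```

The result matches what `bytes.hex()` produces in modern Python, which is a convenient way to sanity-check the hand-rolled loop.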
diff --git a/src/README.txt b/src/README.txt
--- a/src/README.txt
+++ b/src/README.txt
@@ -2,7 +2,7 @@
 Distutils2
 ==========
 
-Welcome to Distutils2 !
+Welcome to Distutils2!
 
 Distutils2 is the new version of Distutils. It's not backward compatible with
 Distutils but provides more features, and implement most new packaging
@@ -10,6 +10,6 @@
 
 See the documentation at http://packages.python.org/Distutils2 for more info.
 
-**Beware that Distutils2 is its in early stage and should not be used in
+**Beware that Distutils2 is in its early stage and should not be used in
 production. Its API is subject to changes**
 
diff --git a/src/distutils2/README b/src/distutils2/README
--- a/src/distutils2/README
+++ b/src/distutils2/README
@@ -1,4 +1,4 @@
-This directory contains the Distutils package.
+This directory contains the Distutils2 package.
 
 There's a full documentation available at:
 
@@ -8,6 +8,6 @@
 
     http://www.python.org/sigs/distutils-sig/
 
-WARNING: Distutils2 must remain compatible with 2.4
+WARNING: Distutils2 must remain compatible with Python 2.4
 
 $Id: README 70017 2009-02-27 12:53:34Z tarek.ziade $
diff --git a/src/distutils2/__init__.py b/src/distutils2/__init__.py
--- a/src/distutils2/__init__.py
+++ b/src/distutils2/__init__.py
@@ -1,16 +1,26 @@
 """distutils
 
-The main package for the Python Module Distribution Utilities.  Normally
-used from a setup script as
+The main package for the Python Distribution Utilities 2.  Setup
+scripts should import the setup function from distutils2.core:
 
-   from distutils.core import setup
+    from distutils2.core import setup
 
-   setup (...)
+    setup(name=..., version=..., ...)
+
+Third-party tools can use parts of Distutils2 as building blocks
+without causing the other modules to be imported:
+
+    import distutils2.version
+    import distutils2.pypi.simple
+    import distutils2.tests.pypi_server
 """
-__all__ = ['__version__', 'setup']
+__all__ = ['__version__']
 
 __revision__ = "$Id: __init__.py 78020 2010-02-06 16:37:32Z benjamin.peterson $"
 __version__ = "1.0a2"
 
-from distutils2.core import setup
 
+# When set to True, 2to3 also converts doctests by default
+run_2to3_on_doctests = True
+# Standard package names for fixer packages
+lib2to3_fixer_packages = ['lib2to3.fixes']
diff --git a/src/distutils2/_backport/hashlib.py b/src/distutils2/_backport/hashlib.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/_backport/hashlib.py
@@ -0,0 +1,143 @@
+# $Id$
+#
+#  Copyright (C) 2005   Gregory P. Smith (greg at krypto.org)
+#  Licensed to PSF under a Contributor Agreement.
+#
+
+__doc__ = """hashlib module - A common interface to many hash functions.
+
+new(name, string='') - returns a new hash object implementing the
+                       given hash function; initializing the hash
+                       using the given string data.
+
+Named constructor functions are also available; these are much faster
+than using new():
+
+md5(), sha1(), sha224(), sha256(), sha384(), and sha512()
+
+More algorithms may be available on your platform but the above are
+guaranteed to exist.
+
+NOTE: If you want the adler32 or crc32 hash functions they are available in
+the zlib module.
+
+Choose your hash function wisely.  Some have known collision weaknesses.
+sha384 and sha512 will be slow on 32 bit platforms.
+
+Hash objects have these methods:
+ - update(arg): Update the hash object with the string arg. Repeated calls
+                are equivalent to a single call with the concatenation of all
+                the arguments.
+ - digest():    Return the digest of the strings passed to the update() method
+                so far. This may contain non-ASCII characters, including
+                NUL bytes.
+ - hexdigest(): Like digest() except the digest is returned as a string of
+                double length, containing only hexadecimal digits.
+ - copy():      Return a copy (clone) of the hash object. This can be used to
+                efficiently compute the digests of strings that share a common
+                initial substring.
+
+For example, to obtain the digest of the string 'Nobody inspects the
+spammish repetition':
+
+    >>> import hashlib
+    >>> m = hashlib.md5()
+    >>> m.update("Nobody inspects")
+    >>> m.update(" the spammish repetition")
+    >>> m.digest()
+    '\\xbbd\\x9c\\x83\\xdd\\x1e\\xa5\\xc9\\xd9\\xde\\xc9\\xa1\\x8d\\xf0\\xff\\xe9'
+
+More condensed:
+
+    >>> hashlib.sha224("Nobody inspects the spammish repetition").hexdigest()
+    'a4337bc45a8fc544c03f52dc550cd6e1e87021bc896588bd79e901e2'
+
+"""
+
+# This tuple and __get_builtin_constructor() must be modified if a new
+# always available algorithm is added.
+__always_supported = ('md5', 'sha1', 'sha224', 'sha256', 'sha384', 'sha512')
+
+algorithms = __always_supported
+
+__all__ = __always_supported + ('new', 'algorithms')
+
+
+def __get_builtin_constructor(name):
+    if name in ('SHA1', 'sha1'):
+        import _sha
+        return _sha.new
+    elif name in ('MD5', 'md5'):
+        import _md5
+        return _md5.new
+    elif name in ('SHA256', 'sha256', 'SHA224', 'sha224'):
+        import _sha256
+        bs = name[3:]
+        if bs == '256':
+            return _sha256.sha256
+        elif bs == '224':
+            return _sha256.sha224
+    elif name in ('SHA512', 'sha512', 'SHA384', 'sha384'):
+        import _sha512
+        bs = name[3:]
+        if bs == '512':
+            return _sha512.sha512
+        elif bs == '384':
+            return _sha512.sha384
+
+    raise ValueError('unsupported hash type %s' % name)
+
+
+def __get_openssl_constructor(name):
+    try:
+        f = getattr(_hashlib, 'openssl_' + name)
+        # Allow the C module to raise ValueError.  The function will be
+        # defined but the hash not actually available thanks to OpenSSL.
+        f()
+        # Use the C function directly (very fast)
+        return f
+    except (AttributeError, ValueError):
+        return __get_builtin_constructor(name)
+
+
+def __py_new(name, string=''):
+    """new(name, string='') - Return a new hashing object using the named algorithm;
+    optionally initialized with a string.
+    """
+    return __get_builtin_constructor(name)(string)
+
+
+def __hash_new(name, string=''):
+    """new(name, string='') - Return a new hashing object using the named algorithm;
+    optionally initialized with a string.
+    """
+    try:
+        return _hashlib.new(name, string)
+    except ValueError:
+        # If the _hashlib module (OpenSSL) doesn't support the named
+        # hash, try using our builtin implementations.
+        # This allows for SHA224/256 and SHA384/512 support even though
+        # the OpenSSL library prior to 0.9.8 doesn't provide them.
+        return __get_builtin_constructor(name)(string)
+
+
+try:
+    import _hashlib
+    new = __hash_new
+    __get_hash = __get_openssl_constructor
+except ImportError:
+    new = __py_new
+    __get_hash = __get_builtin_constructor
+
+for __func_name in __always_supported:
+    # try them all, some may not work due to the OpenSSL
+    # version not supporting that algorithm.
+    try:
+        globals()[__func_name] = __get_hash(__func_name)
+    except ValueError:
+        import logging
+        logging.exception('code for hash %s was not found.', __func_name)
+
+# Cleanup locals()
+del __always_supported, __func_name, __get_hash
+del __py_new, __hash_new, __get_openssl_constructor
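The backported module is meant to behave identically to the standard `hashlib`; the docstring's condensed example can be checked directly against the stdlib (used here instead of the backport, since the backport targets Python 2.4):

```python
import hashlib

data = b"Nobody inspects the spammish repetition"

# new() and the named constructor must agree on the same input
assert hashlib.new('sha224', data).hexdigest() == hashlib.sha224(data).hexdigest()

# Matches the value quoted in the docstring above:
# a4337bc45a8fc544c03f52dc550cd6e1e87021bc896588bd79e901e2
print(hashlib.sha224(data).hexdigest())
```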
diff --git a/src/distutils2/_backport/pkgutil.py b/src/distutils2/_backport/pkgutil.py
--- a/src/distutils2/_backport/pkgutil.py
+++ b/src/distutils2/_backport/pkgutil.py
@@ -1,4 +1,4 @@
-""" Utilities to support packages. """
+"""Utilities to support packages."""
 
 # NOTE: This module must remain compatible with Python 2.3, as it is shared
 # by setuptools for distribution with Python 2.3 and up.
@@ -181,7 +181,7 @@
 iter_importer_modules = simplegeneric(iter_importer_modules)
 
 
-class ImpImporter:
+class ImpImporter(object):
     """:pep:`302` Importer that wraps Python's "classic" import algorithm
 
     ``ImpImporter(dirname)`` produces a :pep:`302` importer that searches that
@@ -244,7 +244,7 @@
                 yield prefix + modname, ispkg
 
 
-class ImpLoader:
+class ImpLoader(object):
     """:pep:`302` Loader that wraps Python's "classic" import algorithm """
 
     code = source = None
@@ -827,6 +827,9 @@
     def __eq__(self, other):
         return isinstance(other, Distribution) and self.path == other.path
 
+    # See http://docs.python.org/reference/datamodel#object.__hash__
+    __hash__ = object.__hash__
+
 
 class EggInfoDistribution(object):
     """Created with the *path* of the ``.egg-info`` directory or file provided
@@ -950,6 +953,9 @@
         return isinstance(other, EggInfoDistribution) and \
                self.path == other.path
 
+    # See http://docs.python.org/reference/datamodel#object.__hash__
+    __hash__ = object.__hash__
+
 
 def _normalize_dist_name(name):
     """Returns a normalized name from the given *name*.
@@ -1021,7 +1027,7 @@
     returned if one is found that has metadata that matches *name* for the
     *name* metadata field.
 
-    This function only returns the first result founded, as no more than one
+    This function only returns the first result found, as no more than one
     value is expected. If the directory is not found, ``None`` is returned.
 
     :rtype: :class:`Distribution` or :class:`EggInfoDistribution` or None"""
@@ -1081,7 +1087,7 @@
     then all files and directories ending with ``.egg-info`` are considered
     as well and returns an :class:`EggInfoDistribution` instance.
 
-    This function only returns the first result founded, since no more than
+    This function only returns the first result found, since no more than
     one values are expected. If the directory is not found, returns ``None``.
 
     :parameter version: a version specifier that indicates the version
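The `__hash__ = object.__hash__` assignments added in this patch exist because a class that defines `__eq__` has its `__hash__` set to `None` on Python 3; the explicit assignment keeps instances hashable and documents the choice of identity hashing. A minimal stand-in class (not the real `pkgutil.Distribution`) illustrates the pattern:

```python
class Distribution(object):
    # Simplified stand-in: equality is based on the dist-info path
    def __init__(self, path):
        self.path = path

    def __eq__(self, other):
        return isinstance(other, Distribution) and self.path == other.path

    # Defining __eq__ would otherwise disable hashing on Python 3;
    # restore the inherited identity-based hash explicitly
    __hash__ = object.__hash__
```

Note that with identity hashing, two equal instances may still hash differently, so instances are usable in sets and dicts but equal duplicates are not automatically collapsed.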
diff --git a/src/distutils2/_backport/sysconfig.cfg b/src/distutils2/_backport/sysconfig.cfg
--- a/src/distutils2/_backport/sysconfig.cfg
+++ b/src/distutils2/_backport/sysconfig.cfg
@@ -1,6 +1,6 @@
 [globals]
 # These are the useful categories that are sometimes referenced at runtime,
-# using pkgutils.open():
+# using pkgutil.open():
 config             = {confdir}/{distribution.name}         # Configuration files
 appdata            = {datadir}/{distribution.name}         # Non-writable data that is independent of architecture (images, many xml/text files)
 appdata.arch       = {libdir}/{distribution.name}          # Non-writable data that is architecture-dependent (some binary data formats)
diff --git a/src/distutils2/_backport/sysconfig.py b/src/distutils2/_backport/sysconfig.py
--- a/src/distutils2/_backport/sysconfig.py
+++ b/src/distutils2/_backport/sysconfig.py
@@ -5,7 +5,7 @@
 import os
 import re
 from os.path import pardir, abspath
-from ConfigParser import ConfigParser
+from ConfigParser import RawConfigParser
 
 _PREFIX = os.path.normpath(sys.prefix)
 _EXEC_PREFIX = os.path.normpath(sys.exec_prefix)
@@ -14,7 +14,7 @@
 # XXX _CONFIG_DIR will be set by the Makefile later
 _CONFIG_DIR = os.path.normpath(os.path.dirname(__file__))
 _CONFIG_FILE = os.path.join(_CONFIG_DIR, 'sysconfig.cfg')
-_SCHEMES = ConfigParser()
+_SCHEMES = RawConfigParser()
 _SCHEMES.read(_CONFIG_FILE)
 _VAR_REPL = re.compile(r'\{([^{]*?)\}')
 
@@ -580,10 +580,11 @@
                 # behaviour.
                 pass
             else:
-                m = re.search(
-                        r'<key>ProductUserVisibleVersion</key>\s*' +
-                        r'<string>(.*?)</string>', f.read())
-                f.close()
+                try:
+                    m = re.search(r'<key>ProductUserVisibleVersion</key>\s*'
+                                  r'<string>(.*?)</string>', f.read())
+                finally:
+                    f.close()
                 if m is not None:
                     macrelease = '.'.join(m.group(1).split('.')[:2])
                 # else: fall back to the default behaviour
diff --git a/src/distutils2/_backport/tarfile.py b/src/distutils2/_backport/tarfile.py
--- a/src/distutils2/_backport/tarfile.py
+++ b/src/distutils2/_backport/tarfile.py
@@ -370,7 +370,7 @@
 #---------------------------
 # internal stream interface
 #---------------------------
-class _LowLevelFile:
+class _LowLevelFile(object):
     """Low-level file object. Supports reading and writing.
        It is used instead of a regular file object for streaming
        access.
@@ -394,7 +394,7 @@
     def write(self, s):
         os.write(self.fd, s)
 
-class _Stream:
+class _Stream(object):
     """Class that serves as an adapter between TarFile and
        a stream-like object.  The stream-like object only
        needs to have a read() or write() method and is accessed
@@ -2003,8 +2003,10 @@
         # Append the tar header and data to the archive.
         if tarinfo.isreg():
             f = bltn_open(name, "rb")
-            self.addfile(tarinfo, f)
-            f.close()
+            try:
+                self.addfile(tarinfo, f)
+            finally:
+                f.close()
 
         elif tarinfo.isdir():
             self.addfile(tarinfo)
@@ -2214,9 +2216,11 @@
         """
         source = self.extractfile(tarinfo)
         target = bltn_open(targetpath, "wb")
-        copyfileobj(source, target)
-        source.close()
-        target.close()
+        try:
+            copyfileobj(source, target)
+        finally:
+            source.close()
+            target.close()
 
     def makeunknown(self, tarinfo, targetpath):
         """Make a file from a TarInfo object with an unknown type
@@ -2423,7 +2427,7 @@
             print >> sys.stderr, msg
 # class TarFile
 
-class TarIter:
+class TarIter(object):
     """Iterator Class.
 
        for tarinfo in TarFile(...):
@@ -2460,7 +2464,7 @@
         return tarinfo
 
 # Helper classes for sparse file support
-class _section:
+class _section(object):
     """Base class for _data and _hole.
     """
     def __init__(self, offset, size):
@@ -2507,7 +2511,7 @@
 #---------------------------------------------
 TAR_PLAIN = 0           # zipfile.ZIP_STORED
 TAR_GZIPPED = 8         # zipfile.ZIP_DEFLATED
-class TarFileCompat:
+class TarFileCompat(object):
     """TarFile class compatible with standard module zipfile's
        ZipFile class.
     """
@@ -2564,8 +2568,10 @@
        are able to handle, else return False.
     """
     try:
-        t = open(name)
-        t.close()
+        try:
+            t = open(name)
+        finally:
+            t.close()
         return True
     except TarError:
         return False
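Several hunks above apply the same fix: wrap the work in `try`/`finally` so file objects are closed even when an exception is raised mid-copy. The pattern in isolation, with illustrative names rather than the real `TarFile` methods:

```python
def copy_stream(source, targetpath):
    # As in the makefile() fix: copy, and guarantee both handles
    # are closed whether or not the copy raises
    target = open(targetpath, 'wb')
    try:
        target.write(source.read())
    finally:
        source.close()
        target.close()
```

This is the Python 2.4-compatible spelling; on newer Pythons the same guarantee would normally be written with `with` statements.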
diff --git a/src/distutils2/_backport/tests/test_pkgutil.py b/src/distutils2/_backport/tests/test_pkgutil.py
--- a/src/distutils2/_backport/tests/test_pkgutil.py
+++ b/src/distutils2/_backport/tests/test_pkgutil.py
@@ -1,6 +1,5 @@
 # -*- coding: utf-8 -*-
 """Tests for PEP 376 pkgutil functionality"""
-import unittest2
 import sys
 import os
 import csv
@@ -14,22 +13,33 @@
     from md5 import md5
 
 from test.test_support import run_unittest, TESTFN
+from distutils2.tests.support import unittest
 
 from distutils2._backport import pkgutil
 
+try:
+    from os.path import relpath
+except ImportError:
+    try:
+        from unittest.compatibility import relpath
+    except ImportError:
+        from unittest2.compatibility import relpath
+
 # TODO Add a test for getting a distribution that is provided by another
-#   distribution.
+# distribution.
 
 # TODO Add a test for absolute pathed RECORD items (e.g. /etc/myapp/config.ini)
 
 # Adapted from Python 2.7's trunk
-class TestPkgUtilData(unittest2.TestCase):
+class TestPkgUtilData(unittest.TestCase):
 
     def setUp(self):
+        super(TestPkgUtilData, self).setUp()
         self.dirname = tempfile.mkdtemp()
         sys.path.insert(0, self.dirname)
 
     def tearDown(self):
+        super(TestPkgUtilData, self).tearDown()
         del sys.path[0]
         shutil.rmtree(self.dirname)
 
@@ -44,15 +54,22 @@
         os.mkdir(package_dir)
         # Empty init.py
         f = open(os.path.join(package_dir, '__init__.py'), "wb")
-        f.close()
+        try:
+            pass
+        finally:
+            f.close()
         # Resource files, res.txt, sub/res.txt
         f = open(os.path.join(package_dir, 'res.txt'), "wb")
-        f.write(RESOURCE_DATA)
-        f.close()
+        try:
+            f.write(RESOURCE_DATA)
+        finally:
+            f.close()
         os.mkdir(os.path.join(package_dir, 'sub'))
         f = open(os.path.join(package_dir, 'sub', 'res.txt'), "wb")
-        f.write(RESOURCE_DATA)
-        f.close()
+        try:
+            f.write(RESOURCE_DATA)
+        finally:
+            f.close()
 
         # Check we can read the resources
         res1 = pkgutil.get_data(pkg, 'res.txt')
@@ -72,13 +89,14 @@
         # Make a package with some resources
         zip_file = os.path.join(self.dirname, zip)
         z = zipfile.ZipFile(zip_file, 'w')
-
-        # Empty init.py
-        z.writestr(pkg + '/__init__.py', "")
-        # Resource files, res.txt, sub/res.txt
-        z.writestr(pkg + '/res.txt', RESOURCE_DATA)
-        z.writestr(pkg + '/sub/res.txt', RESOURCE_DATA)
-        z.close()
+        try:
+            # Empty init.py
+            z.writestr(pkg + '/__init__.py', "")
+            # Resource files, res.txt, sub/res.txt
+            z.writestr(pkg + '/res.txt', RESOURCE_DATA)
+            z.writestr(pkg + '/sub/res.txt', RESOURCE_DATA)
+        finally:
+            z.close()
 
         # Check we can read the resources
         sys.path.insert(0, zip_file)
@@ -91,7 +109,7 @@
         del sys.modules[pkg]
 
 # Adapted from Python 2.7's trunk
-class TestPkgUtilPEP302(unittest2.TestCase):
+class TestPkgUtilPEP302(unittest.TestCase):
 
     class MyTestLoader(object):
         def load_module(self, fullname):
@@ -113,10 +131,12 @@
             return TestPkgUtilPEP302.MyTestLoader()
 
     def setUp(self):
+        super(TestPkgUtilPEP302, self).setUp()
         sys.meta_path.insert(0, self.MyTestImporter())
 
     def tearDown(self):
         del sys.meta_path[0]
+        super(TestPkgUtilPEP302, self).tearDown()
 
     def test_getdata_pep302(self):
         # Use a dummy importer/loader
@@ -134,10 +154,11 @@
         del sys.modules['foo']
 
 
-class TestPkgUtilDistribution(unittest2.TestCase):
-    """Tests the pkgutil.Distribution class"""
+class TestPkgUtilDistribution(unittest.TestCase):
+    # Tests the pkgutil.Distribution class
 
     def setUp(self):
+        super(TestPkgUtilDistribution, self).setUp()
         self.fake_dists_path = os.path.abspath(
             os.path.join(os.path.dirname(__file__), 'fake_dists'))
 
@@ -151,7 +172,7 @@
             return md5_hash.hexdigest()
 
         def record_pieces(file):
-            path = os.path.relpath(file, sys.prefix)
+            path = relpath(file, sys.prefix)
             digest = get_hexdigest(file)
             size = os.path.getsize(file)
             return [path, digest, size]
@@ -171,7 +192,7 @@
             for file in ['INSTALLER', 'METADATA', 'REQUESTED']:
                 record_writer.writerow(record_pieces(
                     os.path.join(distinfo_dir, file)))
-            record_writer.writerow([os.path.relpath(record_file, sys.prefix)])
+            record_writer.writerow([relpath(record_file, sys.prefix)])
             del record_writer # causes the RECORD file to close
             record_reader = csv.reader(open(record_file, 'rb'))
             record_data = []
@@ -186,10 +207,11 @@
         for distinfo_dir in self.distinfo_dirs:
             record_file = os.path.join(distinfo_dir, 'RECORD')
             open(record_file, 'w').close()
+        super(TestPkgUtilDistribution, self).tearDown()
 
     def test_instantiation(self):
-        """Test the Distribution class's instantiation provides us with usable
-        attributes."""
+        # Test the Distribution class's instantiation provides us with usable
+        # attributes.
         # Import the Distribution class
         from distutils2._backport.pkgutil import distinfo_dirname, Distribution
 
@@ -207,7 +229,7 @@
         self.assertTrue(isinstance(dist.requested, type(bool())))
 
     def test_installed_files(self):
-        """Test the iteration of installed files."""
+        # Test the iteration of installed files.
         # Test the distribution's installed files
         from distutils2._backport.pkgutil import Distribution
         for distinfo_dir in self.distinfo_dirs:
@@ -219,17 +241,17 @@
                 self.assertEqual(size, record_data[path][1])
 
     def test_uses(self):
-        """Test to determine if a distribution uses a specified file."""
+        # Test to determine if a distribution uses a specified file.
         # Criteria to test against
         distinfo_name = 'grammar-1.0a4'
         distinfo_dir = os.path.join(self.fake_dists_path,
             distinfo_name + '.dist-info')
         true_path = [self.fake_dists_path, distinfo_name, \
                      'grammar', 'utils.py']
-        true_path = os.path.relpath(os.path.join(*true_path), sys.prefix)
+        true_path = relpath(os.path.join(*true_path), sys.prefix)
         false_path = [self.fake_dists_path, 'towel_stuff-0.1', 'towel_stuff',
             '__init__.py']
-        false_path = os.path.relpath(os.path.join(*false_path), sys.prefix)
+        false_path = relpath(os.path.join(*false_path), sys.prefix)
 
         # Test if the distribution uses the file in question
         from distutils2._backport.pkgutil import Distribution
@@ -238,7 +260,7 @@
         self.assertFalse(dist.uses(false_path))
 
     def test_get_distinfo_file(self):
-        """Test the retrieval of dist-info file objects."""
+        # Test the retrieval of dist-info file objects.
         from distutils2._backport.pkgutil import Distribution
         distinfo_name = 'choxie-2.0.0.9'
         other_distinfo_name = 'grammar-1.0a4'
@@ -271,7 +293,7 @@
                           'ENTRYPOINTS')
 
     def test_get_distinfo_files(self):
-        """Test for the iteration of RECORD path entries."""
+        # Test for the iteration of RECORD path entries.
         from distutils2._backport.pkgutil import Distribution
         distinfo_name = 'towel_stuff-0.1'
         distinfo_dir = os.path.join(self.fake_dists_path,
@@ -288,10 +310,11 @@
         self.assertEqual(sorted(found), sorted(distinfo_record_paths))
 
 
-class TestPkgUtilPEP376(unittest2.TestCase):
-    """Tests for the new functionality added in PEP 376."""
+class TestPkgUtilPEP376(unittest.TestCase):
+    # Tests for the new functionality added in PEP 376.
 
     def setUp(self):
+        super(TestPkgUtilPEP376, self).setUp()
         # Setup the path environment with our fake distributions
         current_path = os.path.abspath(os.path.dirname(__file__))
         self.sys_path = sys.path[:]
@@ -300,17 +323,18 @@
 
     def tearDown(self):
         sys.path[:] = self.sys_path
+        super(TestPkgUtilPEP376, self).tearDown()
 
     def test_distinfo_dirname(self):
-        """Given a name and a version, we expect the distinfo_dirname function
-        to return a standard distribution information directory name."""
+        # Given a name and a version, we expect the distinfo_dirname function
+        # to return a standard distribution information directory name.
 
         items = [# (name, version, standard_dirname)
             # Test for a very simple single word name and decimal
             # version number
             ('docutils', '0.5', 'docutils-0.5.dist-info'),
             # Test for another except this time with a '-' in the name, which
-            #   needs to be transformed during the name lookup
+            # needs to be transformed during the name lookup
             ('python-ldap', '2.5', 'python_ldap-2.5.dist-info'),
             # Test for both '-' in the name and a funky version number
             ('python-ldap', '2.5 a---5', 'python_ldap-2.5 a---5.dist-info'),
@@ -325,7 +349,7 @@
             self.assertEqual(dirname, standard_dirname)
 
     def test_get_distributions(self):
-        """Lookup all distributions found in the ``sys.path``."""
+        # Lookup all distributions found in the ``sys.path``.
         # This test could potentially pick up other installed distributions
         fake_dists = [('grammar', '1.0a4'), ('choxie', '2.0.0.9'),
             ('towel-stuff', '0.1')]
@@ -366,7 +390,7 @@
         self.assertListEqual(sorted(fake_dists), sorted(found_dists))
 
     def test_get_distribution(self):
-        """Test for looking up a distribution by name."""
+        # Test for looking up a distribution by name.
         # Test the lookup of the towel-stuff distribution
         name = 'towel-stuff' # Note: This is different from the directory name
 
@@ -412,7 +436,7 @@
         self.assertEqual(dist.name, 'strawberry')
 
     def test_get_file_users(self):
-        """Test the iteration of distributions that use a file."""
+        # Test the iteration of distributions that use a file.
         from distutils2._backport.pkgutil import get_file_users, Distribution
         name = 'towel_stuff-0.1'
         path = os.path.join(self.fake_dists_path, name,
@@ -422,7 +446,7 @@
             self.assertEqual(dist.name, name)
 
     def test_provides(self):
-        """ Test for looking up distributions by what they provide """
+        # Test for looking up distributions by what they provide
         from distutils2._backport.pkgutil import provides_distribution
         from distutils2.errors import DistutilsError
 
@@ -494,7 +518,7 @@
         checkLists(l, [])
 
     def test_obsoletes(self):
-        """ Test looking for distributions based on what they obsolete """
+        # Test looking for distributions based on what they obsolete
         from distutils2._backport.pkgutil import obsoletes_distribution
         from distutils2.errors import DistutilsError
 
@@ -527,12 +551,12 @@
 
 
 def test_suite():
-    suite = unittest2.TestSuite()
-    testcase_loader = unittest2.loader.defaultTestLoader.loadTestsFromTestCase
-    suite.addTest(testcase_loader(TestPkgUtilData))
-    suite.addTest(testcase_loader(TestPkgUtilDistribution))
-    suite.addTest(testcase_loader(TestPkgUtilPEP302))
-    suite.addTest(testcase_loader(TestPkgUtilPEP376))
+    suite = unittest.TestSuite()
+    load = unittest.defaultTestLoader.loadTestsFromTestCase
+    suite.addTest(load(TestPkgUtilData))
+    suite.addTest(load(TestPkgUtilDistribution))
+    suite.addTest(load(TestPkgUtilPEP302))
+    suite.addTest(load(TestPkgUtilPEP376))
     return suite
 
 
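The layered `relpath` import in the test module above exists because `os.path.relpath` only appeared in Python 2.6, while Distutils2 must stay compatible with 2.4. A rough pure-Python equivalent, simplified for illustration (the `unittest2.compatibility` version handles more edge cases):

```python
import os

def relpath(path, start=os.curdir):
    """Compute a relative path from start to path (POSIX-style sketch)."""
    start_parts = os.path.abspath(start).split(os.sep)
    path_parts = os.path.abspath(path).split(os.sep)
    # Length of the common leading component sequence
    i = 0
    while (i < len(start_parts) and i < len(path_parts)
           and start_parts[i] == path_parts[i]):
        i += 1
    # Climb out of the remaining start components, then descend into path
    rel = [os.pardir] * (len(start_parts) - i) + path_parts[i:]
    if not rel:
        return os.curdir
    return os.path.join(*rel)
```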
diff --git a/src/distutils2/_backport/tests/test_sysconfig.py b/src/distutils2/_backport/tests/test_sysconfig.py
--- a/src/distutils2/_backport/tests/test_sysconfig.py
+++ b/src/distutils2/_backport/tests/test_sysconfig.py
@@ -8,7 +8,7 @@
 import os
 import shutil
 from copy import copy, deepcopy
-from ConfigParser import ConfigParser
+from ConfigParser import RawConfigParser
 
 from test.test_support import run_unittest, TESTFN
 
@@ -88,15 +88,13 @@
             shutil.rmtree(path)
 
     def test_nested_var_substitution(self):
-        """Assert that the {curly brace token} expansion pattern will replace
-        only the inner {something} on nested expressions like {py{something}} on
-        the first pass.
+        # Assert that the {curly brace token} expansion pattern will replace
+        # only the inner {something} on nested expressions like {py{something}} on
+        # the first pass.
 
-        We have no plans to make use of this, but it keeps the option open for
-        the future, at the cost only of disallowing { itself as a piece of a
-        substitution key (which would be weird).
-
-        """
+        # We have no plans to make use of this, but it keeps the option open for
+        # the future, at the cost only of disallowing { itself as a piece of a
+        # substitution key (which would be weird).
         self.assertEqual(_subst_vars('{py{version}}', {'version': '31'}), '{py31}')
 
     def test_get_paths(self):
@@ -107,7 +105,7 @@
         wanted.sort()
         scheme = scheme.items()
         scheme.sort()
-        self.assertEquals(scheme, wanted)
+        self.assertEqual(scheme, wanted)
 
     def test_get_config_vars(self):
         cvars = get_config_vars()
@@ -120,21 +118,21 @@
         sys.version = ('2.4.4 (#71, Oct 18 2006, 08:34:43) '
                        '[MSC v.1310 32 bit (Intel)]')
         sys.platform = 'win32'
-        self.assertEquals(get_platform(), 'win32')
+        self.assertEqual(get_platform(), 'win32')
 
         # windows XP, amd64
         os.name = 'nt'
         sys.version = ('2.4.4 (#71, Oct 18 2006, 08:34:43) '
                        '[MSC v.1310 32 bit (Amd64)]')
         sys.platform = 'win32'
-        self.assertEquals(get_platform(), 'win-amd64')
+        self.assertEqual(get_platform(), 'win-amd64')
 
         # windows XP, itanium
         os.name = 'nt'
         sys.version = ('2.4.4 (#71, Oct 18 2006, 08:34:43) '
                        '[MSC v.1310 32 bit (Itanium)]')
         sys.platform = 'win32'
-        self.assertEquals(get_platform(), 'win-ia64')
+        self.assertEqual(get_platform(), 'win-ia64')
 
         # macbook
         os.name = 'posix'
@@ -153,9 +151,9 @@
         maxint = sys.maxint
         try:
             sys.maxint = 2147483647
-            self.assertEquals(get_platform(), 'macosx-10.3-ppc')
+            self.assertEqual(get_platform(), 'macosx-10.3-ppc')
             sys.maxint = 9223372036854775807
-            self.assertEquals(get_platform(), 'macosx-10.3-ppc64')
+            self.assertEqual(get_platform(), 'macosx-10.3-ppc64')
         finally:
             sys.maxint = maxint
 
@@ -173,9 +171,9 @@
         maxint = sys.maxint
         try:
             sys.maxint = 2147483647
-            self.assertEquals(get_platform(), 'macosx-10.3-i386')
+            self.assertEqual(get_platform(), 'macosx-10.3-i386')
             sys.maxint = 9223372036854775807
-            self.assertEquals(get_platform(), 'macosx-10.3-x86_64')
+            self.assertEqual(get_platform(), 'macosx-10.3-x86_64')
         finally:
             sys.maxint = maxint
 
@@ -186,33 +184,33 @@
                                        '-fno-strict-aliasing -fno-common '
                                        '-dynamic -DNDEBUG -g -O3')
 
-        self.assertEquals(get_platform(), 'macosx-10.4-fat')
+        self.assertEqual(get_platform(), 'macosx-10.4-fat')
 
         get_config_vars()['CFLAGS'] = ('-arch x86_64 -arch i386 -isysroot '
                                        '/Developer/SDKs/MacOSX10.4u.sdk  '
                                        '-fno-strict-aliasing -fno-common '
                                        '-dynamic -DNDEBUG -g -O3')
 
-        self.assertEquals(get_platform(), 'macosx-10.4-intel')
+        self.assertEqual(get_platform(), 'macosx-10.4-intel')
 
         get_config_vars()['CFLAGS'] = ('-arch x86_64 -arch ppc -arch i386 -isysroot '
                                        '/Developer/SDKs/MacOSX10.4u.sdk  '
                                        '-fno-strict-aliasing -fno-common '
                                        '-dynamic -DNDEBUG -g -O3')
-        self.assertEquals(get_platform(), 'macosx-10.4-fat3')
+        self.assertEqual(get_platform(), 'macosx-10.4-fat3')
 
         get_config_vars()['CFLAGS'] = ('-arch ppc64 -arch x86_64 -arch ppc -arch i386 -isysroot '
                                        '/Developer/SDKs/MacOSX10.4u.sdk  '
                                        '-fno-strict-aliasing -fno-common '
                                        '-dynamic -DNDEBUG -g -O3')
-        self.assertEquals(get_platform(), 'macosx-10.4-universal')
+        self.assertEqual(get_platform(), 'macosx-10.4-universal')
 
         get_config_vars()['CFLAGS'] = ('-arch x86_64 -arch ppc64 -isysroot '
                                        '/Developer/SDKs/MacOSX10.4u.sdk  '
                                        '-fno-strict-aliasing -fno-common '
                                        '-dynamic -DNDEBUG -g -O3')
 
-        self.assertEquals(get_platform(), 'macosx-10.4-fat64')
+        self.assertEqual(get_platform(), 'macosx-10.4-fat64')
 
         for arch in ('ppc', 'i386', 'x86_64', 'ppc64'):
             get_config_vars()['CFLAGS'] = ('-arch %s -isysroot '
@@ -220,7 +218,7 @@
                                            '-fno-strict-aliasing -fno-common '
                                            '-dynamic -DNDEBUG -g -O3'%(arch,))
 
-            self.assertEquals(get_platform(), 'macosx-10.4-%s'%(arch,))
+            self.assertEqual(get_platform(), 'macosx-10.4-%s'%(arch,))
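The `-arch` flags exercised above map onto the multi-arch suffixes through a fixed table. A minimal Python 3 sketch of that mapping (the suffix names match the test expectations above; the helper itself is hypothetical, not the real `get_platform`):

```python
import re

# Suffix table behind the macosx-<ver>-<suffix> platform strings above.
_MULTIARCH = {
    frozenset(['i386', 'ppc']): 'fat',
    frozenset(['i386', 'x86_64']): 'intel',
    frozenset(['i386', 'ppc', 'x86_64']): 'fat3',
    frozenset(['ppc64', 'x86_64']): 'fat64',
    frozenset(['i386', 'ppc', 'ppc64', 'x86_64']): 'universal',
}

def mac_arch_suffix(cflags):
    """Derive the platform arch suffix from the -arch flags in CFLAGS."""
    archs = frozenset(re.findall(r'-arch\s+(\S+)', cflags))
    if len(archs) == 1:
        return next(iter(archs))
    return _MULTIARCH[archs]

print(mac_arch_suffix('-arch i386 -arch ppc -g -O3'))  # -> fat
```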
 
         # linux debian sarge
         os.name = 'posix'
@@ -230,7 +228,7 @@
         self._set_uname(('Linux', 'aglae', '2.6.21.1dedibox-r7',
                     '#1 Mon Apr 30 17:25:38 CEST 2007', 'i686'))
 
-        self.assertEquals(get_platform(), 'linux-i686')
+        self.assertEqual(get_platform(), 'linux-i686')
 
         # XXX more platforms to test here
 
@@ -241,11 +239,11 @@
     def test_get_scheme_names(self):
         wanted = ('nt', 'nt_user', 'os2', 'os2_home', 'posix_home',
                   'posix_prefix', 'posix_user')
-        self.assertEquals(get_scheme_names(), wanted)
+        self.assertEqual(get_scheme_names(), wanted)
 
     def test_expand_globals(self):
 
-        config = ConfigParser()
+        config = RawConfigParser()
         config.add_section('globals')
         config.set('globals', 'foo', 'ok')
         config.add_section('posix')
@@ -254,8 +252,8 @@
 
         _expand_globals(config)
 
-        self.assertEquals(config.get('posix', 'foo'), 'ok')
-        self.assertEquals(config.get('posix', 'more'), '/etc/ok')
+        self.assertEqual(config.get('posix', 'foo'), 'ok')
+        self.assertEqual(config.get('posix', 'more'), '/etc/ok')
 
         # we might not have globals after all
         # extending again (==no more globals section)
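The behaviour under test — options from a `[globals]` section being copied into every other section, with `{token}` references then resolved — can be sketched standalone. This is a Python 3 sketch using `configparser`; `expand_globals` is a hypothetical stand-in for the private `_expand_globals` helper, not its real implementation:

```python
from configparser import RawConfigParser

def expand_globals(config):
    """Copy [globals] options into every other section, expand {refs},
    then drop the [globals] section."""
    if not config.has_section('globals'):
        return  # nothing to do on a second pass
    defaults = config.items('globals')
    for section in config.sections():
        if section == 'globals':
            continue
        # Inherit any option the section does not define itself.
        for name, value in defaults:
            if not config.has_option(section, name):
                config.set(section, name, value)
        # Resolve {option} references against the section's own options.
        for name, value in config.items(section):
            for key, repl in config.items(section):
                value = value.replace('{%s}' % key, repl)
            config.set(section, name, value)
    config.remove_section('globals')
```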
diff --git a/src/distutils2/command/bdist.py b/src/distutils2/command/bdist.py
--- a/src/distutils2/command/bdist.py
+++ b/src/distutils2/command/bdist.py
@@ -62,7 +62,7 @@
                       'os2': 'zip'}
 
     # Establish the preferred order (for the --help-formats option).
-    format_commands = ['rpm', 'gztar', 'bztar', 'ztar', 'tar',
+    format_commands = ['gztar', 'bztar', 'ztar', 'tar',
                        'wininst', 'zip', 'msi']
 
     # And the real information.
@@ -96,7 +96,7 @@
 
         # 'bdist_base' -- parent of per-built-distribution-format
         # temporary directories (eg. we'll probably have
-        # "build/bdist.<plat>/dumb", "build/bdist.<plat>/rpm", etc.)
+        # "build/bdist.<plat>/dumb", etc.)
         if self.bdist_base is None:
             build_base = self.get_finalized_command('build').build_base
             self.bdist_base = os.path.join(build_base,
@@ -126,7 +126,7 @@
         # Reinitialize and run each command.
         for i in range(len(self.formats)):
             cmd_name = commands[i]
-            sub_cmd = self.reinitialize_command(cmd_name)
+            sub_cmd = self.get_reinitialized_command(cmd_name)
 
             # passing the owner and group names for tar archiving
             if cmd_name == 'bdist_dumb':
diff --git a/src/distutils2/command/bdist_dumb.py b/src/distutils2/command/bdist_dumb.py
--- a/src/distutils2/command/bdist_dumb.py
+++ b/src/distutils2/command/bdist_dumb.py
@@ -85,7 +85,7 @@
         if not self.skip_build:
             self.run_command('build')
 
-        install = self.reinitialize_command('install', reinit_subcommands=1)
+        install = self.get_reinitialized_command('install', reinit_subcommands=1)
         install.root = self.bdist_dir
         install.skip_build = self.skip_build
         install.warn_dir = 0
diff --git a/src/distutils2/command/bdist_msi.py b/src/distutils2/command/bdist_msi.py
--- a/src/distutils2/command/bdist_msi.py
+++ b/src/distutils2/command/bdist_msi.py
@@ -177,12 +177,12 @@
         if not self.skip_build:
             self.run_command('build')
 
-        install = self.reinitialize_command('install', reinit_subcommands=1)
+        install = self.get_reinitialized_command('install', reinit_subcommands=1)
         install.prefix = self.bdist_dir
         install.skip_build = self.skip_build
         install.warn_dir = 0
 
-        install_lib = self.reinitialize_command('install_lib')
+        install_lib = self.get_reinitialized_command('install_lib')
         # we do not want to include pyc or pyo files
         install_lib.compile = 0
         install_lib.optimize = 0
diff --git a/src/distutils2/command/bdist_wininst.py b/src/distutils2/command/bdist_wininst.py
--- a/src/distutils2/command/bdist_wininst.py
+++ b/src/distutils2/command/bdist_wininst.py
@@ -126,13 +126,13 @@
         if not self.skip_build:
             self.run_command('build')
 
-        install = self.reinitialize_command('install', reinit_subcommands=1)
+        install = self.get_reinitialized_command('install', reinit_subcommands=1)
         install.root = self.bdist_dir
         install.skip_build = self.skip_build
         install.warn_dir = 0
         install.plat_name = self.plat_name
 
-        install_lib = self.reinitialize_command('install_lib')
+        install_lib = self.get_reinitialized_command('install_lib')
         # we do not want to include pyc or pyo files
         install_lib.compile = 0
         install_lib.optimize = 0
diff --git a/src/distutils2/command/build_ext.py b/src/distutils2/command/build_ext.py
--- a/src/distutils2/command/build_ext.py
+++ b/src/distutils2/command/build_ext.py
@@ -188,7 +188,24 @@
         if self.package is None:
             self.package = self.distribution.ext_package
 
+        # Ensure that the list of extensions is valid, i.e. it is a list of
+        # Extension objects.
         self.extensions = self.distribution.ext_modules
+        if self.extensions:
+            if not isinstance(self.extensions, (list, tuple)):
+                type_name = (self.extensions is None and 'None'
+                            or type(self.extensions).__name__)
+                raise DistutilsSetupError(
+                    "'ext_modules' must be a sequence of Extension instances,"
+                    " not %s" % (type_name,))
+            for i, ext in enumerate(self.extensions):
+                if isinstance(ext, Extension):
+                    continue                # OK! (assume type-checking done
+                                            # by Extension constructor)
+                type_name = (ext is None and 'None' or type(ext).__name__)
+                raise DistutilsSetupError(
+                    "'ext_modules' item %d must be an Extension instance,"
+                    " not %s" % (i, type_name))
 
         # Make sure Python's include directories (for Python.h, pyconfig.h,
         # etc.) are in the include search path.
@@ -396,86 +413,7 @@
         # Now actually compile and link everything.
         self.build_extensions()
 
-    def check_extensions_list(self, extensions):
-        """Ensure that the list of extensions (presumably provided as a
-        command option 'extensions') is valid, i.e. it is a list of
-        Extension objects.  We also support the old-style list of 2-tuples,
-        where the tuples are (ext_name, build_info), which are converted to
-        Extension instances here.
-
-        Raise DistutilsSetupError if the structure is invalid anywhere;
-        just returns otherwise.
-        """
-        if not isinstance(extensions, list):
-            raise DistutilsSetupError, \
-                  "'ext_modules' option must be a list of Extension instances"
-
-        for i, ext in enumerate(extensions):
-            if isinstance(ext, Extension):
-                continue                # OK! (assume type-checking done
-                                        # by Extension constructor)
-
-            if not isinstance(ext, tuple) or len(ext) != 2:
-                raise DistutilsSetupError, \
-                      ("each element of 'ext_modules' option must be an "
-                       "Extension instance or 2-tuple")
-
-            ext_name, build_info = ext
-
-            log.warn(("old-style (ext_name, build_info) tuple found in "
-                      "ext_modules for extension '%s'"
-                      "-- please convert to Extension instance" % ext_name))
-
-            if not (isinstance(ext_name, str) and
-                    extension_name_re.match(ext_name)):
-                raise DistutilsSetupError, \
-                      ("first element of each tuple in 'ext_modules' "
-                       "must be the extension name (a string)")
-
-            if not isinstance(build_info, dict):
-                raise DistutilsSetupError, \
-                      ("second element of each tuple in 'ext_modules' "
-                       "must be a dictionary (build info)")
-
-            # OK, the (ext_name, build_info) dict is type-safe: convert it
-            # to an Extension instance.
-            ext = Extension(ext_name, build_info['sources'])
-
-            # Easy stuff: one-to-one mapping from dict elements to
-            # instance attributes.
-            for key in ('include_dirs', 'library_dirs', 'libraries',
-                        'extra_objects', 'extra_compile_args',
-                        'extra_link_args'):
-                val = build_info.get(key)
-                if val is not None:
-                    setattr(ext, key, val)
-
-            # Medium-easy stuff: same syntax/semantics, different names.
-            ext.runtime_library_dirs = build_info.get('rpath')
-            if 'def_file' in build_info:
-                log.warn("'def_file' element of build info dict "
-                         "no longer supported")
-
-            # Non-trivial stuff: 'macros' split into 'define_macros'
-            # and 'undef_macros'.
-            macros = build_info.get('macros')
-            if macros:
-                ext.define_macros = []
-                ext.undef_macros = []
-                for macro in macros:
-                    if not (isinstance(macro, tuple) and len(macro) in (1, 2)):
-                        raise DistutilsSetupError, \
-                              ("'macros' element of build info dict "
-                               "must be 1- or 2-tuple")
-                    if len(macro) == 1:
-                        ext.undef_macros.append(macro[0])
-                    elif len(macro) == 2:
-                        ext.define_macros.append(macro)
-
-            extensions[i] = ext
-
     def get_source_files(self):
-        self.check_extensions_list(self.extensions)
         filenames = []
 
         # Wouldn't it be neat if we knew the names of header files too...
@@ -485,11 +423,6 @@
         return filenames
 
     def get_outputs(self):
-        # Sanity check the 'extensions' list -- can't assume this is being
-        # done in the same run as a 'build_extensions()' call (in fact, we
-        # can probably assume that it *isn't*!).
-        self.check_extensions_list(self.extensions)
-
         # And build the list of output (built) filenames.  Note that this
         # ignores the 'inplace' flag, and assumes everything goes in the
         # "build" tree.
@@ -499,9 +432,6 @@
         return outputs
 
     def build_extensions(self):
-        # First, sanity-check the 'extensions' list
-        self.check_extensions_list(self.extensions)
-
         for ext in self.extensions:
             try:
                 self.build_extension(ext)
diff --git a/src/distutils2/command/build_py.py b/src/distutils2/command/build_py.py
--- a/src/distutils2/command/build_py.py
+++ b/src/distutils2/command/build_py.py
@@ -6,14 +6,65 @@
 
 import os
 import sys
+import logging
+import distutils2
 from glob import glob
 
 from distutils2.core import Command
 from distutils2.errors import DistutilsOptionError, DistutilsFileError
 from distutils2.util import convert_path
-from distutils2 import log
+from distutils2.converter.refactor import DistutilsRefactoringTool
 
-class build_py(Command):
+# marking public APIs
+__all__ = ['Mixin2to3', 'build_py']
+
+try:
+    from distutils2.util import Mixin2to3 as _Mixin2to3
+    from lib2to3.refactor import get_fixers_from_package
+    _CONVERT = True
+    _KLASS = _Mixin2to3
+except ImportError:
+    _CONVERT = False
+    _KLASS = object
+
+class Mixin2to3(_KLASS):
+    """ The base class which can be used for refactoring. When run under
+    Python 3.0, the run_2to3 method provided by Mixin2to3 is overridden.
+    When run on Python 2.x, it merely creates a class which overrides run_2to3,
+    yet does nothing in particular with it.
+    """
+    if _CONVERT:
+        def _run_2to3(self, files, doctests=[]):
+            """ Takes a list of files and doctests, and performs conversion
+            on those.
+              - First, the files which contain the code(`files`) are converted.
+              - Second, the doctests in `files` are converted.
+              - Thirdly, the doctests in `doctests` are converted.
+            """
+
+            # Convert the ".py" files.
+            logging.info("Converting Python code")
+            _KLASS.run_2to3(self, files)
+
+            # Convert the doctests in the ".py" files.
+            logging.info("Converting doctests with '.py' files")
+            _KLASS.run_2to3(self, files, doctests_only=True)
+
+            # Convert the doctests in `doctests` only if:
+            # 1. the user has specified the 'convert_2to3_doctests' option,
+            #    in which case the list `doctests` is not empty, and
+            # 2. distutils2.run_2to3_on_doctests is true (the default),
+            #    which allows conversion of text files containing doctests.
+            if doctests != [] and distutils2.run_2to3_on_doctests:
+                logging.info("Converting text files which contain doctests")
+                _KLASS.run_2to3(self, doctests, doctests_only=True)
+    else:
+        # If run on Python 2.x, there is nothing to do.
+        def _run_2to3(self, files, doctests=[]):
+            pass
+
+class build_py(Command, Mixin2to3):
 
     description = "\"build\" pure Python modules (copy to build directory)"
 
@@ -39,6 +90,8 @@
         self.compile = 0
         self.optimize = 0
         self.force = None
+        self._updated_files = []
+        self._doctests_2to3 = []
 
     def finalize_options(self):
         self.set_undefined_options('build',
@@ -93,10 +146,18 @@
             self.build_packages()
             self.build_package_data()
 
+        if self.distribution.use_2to3 and self._updated_files:
+            self.run_2to3(self._updated_files, self._doctests_2to3)
+
         self.byte_compile(self.get_outputs(include_bytecode=0))
 
+    # -- Top-level worker functions ------------------------------------
+
     def get_data_files(self):
-        """Generate list of '(package,src_dir,build_dir,filenames)' tuples"""
+        """Generate list of '(package,src_dir,build_dir,filenames)' tuples.
+
+        Helper function for `finalize_options()`.
+        """
         data = []
         if not self.packages:
             return data
@@ -120,7 +181,10 @@
         return data
 
     def find_data_files(self, package, src_dir):
-        """Return filenames for package's data files in 'src_dir'"""
+        """Return filenames for package's data files in 'src_dir'.
+
+        Helper function for `get_data_files()`.
+        """
         globs = (self.package_data.get('', [])
                  + self.package_data.get(package, []))
         files = []
@@ -132,14 +196,21 @@
         return files
 
     def build_package_data(self):
-        """Copy data files into build directory"""
+        """Copy data files into build directory.
+
+        Helper function for `run()`.
+        """
         for package, src_dir, build_dir, filenames in self.data_files:
             for filename in filenames:
                 target = os.path.join(build_dir, filename)
                 self.mkpath(os.path.dirname(target))
-                self.copy_file(os.path.join(src_dir, filename), target,
-                               preserve_mode=False)
+                srcfile = os.path.join(src_dir, filename)
+                outf, copied = self.copy_file(srcfile, target,
+                                              preserve_mode=False)
+                if copied and srcfile in self.distribution.convert_2to3_doctests:
+                    self._doctests_2to3.append(outf)
 
+    # XXX - this should be moved to the Distribution class, as it is not
+    # needed only by build_py. It also has no dependencies on this class.
     def get_package_dir(self, package):
         """Return the directory, relative to the top of the source
            distribution, where package 'package' should be found
@@ -181,6 +252,8 @@
                     return ''
 
     def check_package(self, package, package_dir):
+        """Helper function for `find_package_modules()` and `find_modules()'.
+        """
         # Empty dir name means current directory, which we can probably
         # assume exists.  Also, os.path.exists and isdir don't know about
         # my "empty string means current dir" convention, so we have to
@@ -200,8 +273,8 @@
             if os.path.isfile(init_py):
                 return init_py
             else:
-                log.warn(("package init file '%s' not found " +
-                          "(or not a regular file)"), init_py)
+                logging.warning(("package init file '%s' not found " +
+                                 "(or not a regular file)"), init_py)
 
         # Either not in a package at all (__init__.py not expected), or
         # __init__.py doesn't exist -- so don't return the filename.
@@ -209,7 +282,8 @@
 
     def check_module(self, module, module_file):
         if not os.path.isfile(module_file):
-            log.warn("file %s (for module %s) not found", module_file, module)
+            logging.warning("file %s (for module %s) not found",
+                            module_file, module)
             return False
         else:
             return True
diff --git a/src/distutils2/command/cmd.py b/src/distutils2/command/cmd.py
--- a/src/distutils2/command/cmd.py
+++ b/src/distutils2/command/cmd.py
@@ -19,7 +19,7 @@
 except ImportError:
     from distutils2._backport.shutil import make_archive
 
-class Command:
+class Command(object):
     """Abstract base class for defining command classes, the "worker bees"
     of the Distutils.  A useful analogy for command classes is to think of
     them as subroutines with local variables called "options".  The options
@@ -57,8 +57,7 @@
     def __init__(self, dist):
         """Create and initialize a new Command object.  Most importantly,
         invokes the 'initialize_options()' method, which is the real
-        initializer and depends on the actual command being
-        instantiated.
+        initializer and depends on the actual command being instantiated.
         """
         # late import because of mutual dependence between these classes
         from distutils2.dist import Distribution
@@ -189,6 +188,31 @@
         """
         log.log(level, msg)
 
+    # -- External interface --------------------------------------------
+    # (called by outsiders)
+
+    def get_source_files(self):
+        """Return the list of files that are used as inputs to this command,
+        i.e. the files used to generate the output files.  The result is used
+        by the `sdist` command in determining the set of default files.
+
+        Command classes should implement this method if they operate on files
+        from the source tree.
+        """
+        return []
+
+    def get_outputs(self):
+        """Return the list of files that would be produced if this command
+        were actually run.  Not affected by the "dry-run" flag or whether
+        any other commands have been run.
+
+        Command classes should implement this method if they produce any
+        output files that get consumed by another command.  e.g., `build_ext`
+        returns the list of built extension modules, but not any temporary
+        files used in the compilation process.
+        """
+        return []
+
     # -- Option validation methods -------------------------------------
     # (these are very handy in writing the 'finalize_options()' method)
     #
@@ -308,10 +332,8 @@
         cmd_obj.ensure_finalized()
         return cmd_obj
 
-    # XXX rename to 'get_reinitialized_command()'? (should do the
-    # same in dist.py, if so)
-    def reinitialize_command(self, command, reinit_subcommands=0):
-        return self.distribution.reinitialize_command(
+    def get_reinitialized_command(self, command, reinit_subcommands=0):
+        return self.distribution.get_reinitialized_command(
             command, reinit_subcommands)
 
     def run_command(self, command):
@@ -351,8 +373,10 @@
         if os.path.isdir(name) or name == '':
             return
         if dry_run:
+            head = ''
             for part in name.split(os.sep):
-                self.log(part)
+                log.info("created directory %s%s", head, part)
+                head += part + os.sep
             return
         os.makedirs(name, mode)
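The dry-run branch above now reports each directory level with its accumulated prefix instead of bare path components. The accumulation can be checked in isolation (a Python 3 sketch; `fake_mkpath_log` is a hypothetical stand-in that collects the messages the dry-run branch would log):

```python
def fake_mkpath_log(name, sep='/'):
    """Return the messages the dry-run branch would log for `name`."""
    messages = []
    head = ''
    for part in name.split(sep):
        messages.append('created directory %s%s' % (head, part))
        head += part + sep  # grow the prefix one level at a time
    return messages

print(fake_mkpath_log('build/bdist.linux/dumb'))
# -> ['created directory build',
#     'created directory build/bdist.linux',
#     'created directory build/bdist.linux/dumb']
```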
 
diff --git a/src/distutils2/command/register.py b/src/distutils2/command/register.py
--- a/src/distutils2/command/register.py
+++ b/src/distutils2/command/register.py
@@ -28,8 +28,6 @@
     boolean_options = PyPIRCCommand.boolean_options + [
         'verify', 'list-classifiers', 'strict']
 
-    sub_commands = [('check', lambda self: True)]
-
     def initialize_options(self):
         PyPIRCCommand.initialize_options(self)
         self.list_classifiers = 0
@@ -46,9 +44,8 @@
         self.finalize_options()
         self._set_config()
 
-        # Run sub commands
-        for cmd_name in self.get_sub_commands():
-            self.run_command(cmd_name)
+        # Check the package metadata
+        self.run_command('check')
 
         if self.dry_run:
             self.verify_metadata()
diff --git a/src/distutils2/command/sdist.py b/src/distutils2/command/sdist.py
--- a/src/distutils2/command/sdist.py
+++ b/src/distutils2/command/sdist.py
@@ -20,7 +20,7 @@
 from distutils2.core import Command
 from distutils2 import util
 from distutils2.errors import (DistutilsPlatformError, DistutilsOptionError,
-                              DistutilsTemplateError)
+                               DistutilsTemplateError)
 from distutils2.manifest import Manifest
 from distutils2 import log
 from distutils2.util import convert_path, newer
@@ -45,12 +45,6 @@
 
     description = "create a source distribution (tarball, zip file, etc.)"
 
-    def checking_metadata(self):
-        """Callable used for the check sub-command.
-
-        Placed here so user_options can view it"""
-        return self.metadata_check
-
     user_options = [
         ('template=', 't',
          "name of manifest template file [default: MANIFEST.in]"),
@@ -100,8 +94,6 @@
     default_format = {'posix': 'gztar',
                       'nt': 'zip' }
 
-    sub_commands = [('check', checking_metadata)]
-
     def initialize_options(self):
         # 'template' and 'manifest' are, respectively, the names of
         # the manifest template and manifest file.
@@ -162,9 +154,9 @@
         # manifest
         self.filelist.clear()
 
-        # Run sub commands
-        for cmd_name in self.get_sub_commands():
-            self.run_command(cmd_name)
+        # Check the package metadata
+        if self.metadata_check:
+            self.run_command('check')
 
         # Do whatever it takes to get the list of files to process
         # (process the manifest template, read an existing manifest,
diff --git a/src/distutils2/command/upload.py b/src/distutils2/command/upload.py
--- a/src/distutils2/command/upload.py
+++ b/src/distutils2/command/upload.py
@@ -160,7 +160,7 @@
         # send the data
         try:
             result = urlopen(request)
-            status = result.getcode()
+            status = result.code
             reason = result.msg
         except socket.error, e:
             self.announce(str(e), log.ERROR)
diff --git a/src/distutils2/command/upload_docs.py b/src/distutils2/command/upload_docs.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/command/upload_docs.py
@@ -0,0 +1,135 @@
+import base64, httplib, os.path, socket, tempfile, urlparse, zipfile
+from cStringIO import StringIO
+from distutils2 import log
+from distutils2.command.upload import upload
+from distutils2.core import PyPIRCCommand
+from distutils2.errors import DistutilsFileError
+
+def zip_dir(directory):
+    """Compresses recursively contents of directory into a StringIO object"""
+    destination = StringIO()
+    zip_file = zipfile.ZipFile(destination, "w")
+    for root, dirs, files in os.walk(directory):
+        for name in files:
+            full = os.path.join(root, name)
+            relative = root[len(directory):].lstrip(os.path.sep)
+            dest = os.path.join(relative, name)
+            zip_file.write(full, dest)
+    zip_file.close()
+    return destination
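A Python 3 equivalent of `zip_dir` would use `io.BytesIO` in place of `cStringIO` (a sketch under that assumption, not the module's own code; `os.path.relpath` gives the same archive-relative layout as the slicing above):

```python
import io
import os
import zipfile

def zip_dir(directory):
    """Recursively compress the contents of `directory` into a BytesIO."""
    destination = io.BytesIO()
    with zipfile.ZipFile(destination, 'w') as zip_file:
        for root, dirs, files in os.walk(directory):
            for name in files:
                full = os.path.join(root, name)
                # Store entries relative to `directory`, as the original does.
                relative = os.path.relpath(root, directory)
                dest = os.path.join(relative, name)
                zip_file.write(full, dest)
    return destination
```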
+
+# grabbed from
+#    http://code.activestate.com/recipes/146306-http-client-to-post-using-multipartform-data/
+def encode_multipart(fields, files, boundary=None):
+    """
+    `fields` is a sequence of (name, value) elements for regular form fields.
+    `files` is a sequence of (name, filename, value) elements for data to be
+    uploaded as files.
+    Return (content_type, body) ready for an httplib.HTTP instance.
+    """
+    if boundary is None:
+        boundary = '--------------GHSKFJDLGDS7543FJKLFHRE75642756743254'
+    lines = []
+    for (key, value) in fields:
+        lines.extend([
+            '--' + boundary,
+            'Content-Disposition: form-data; name="%s"' % key,
+            '',
+            value])
+    for (key, filename, value) in files:
+        lines.extend([
+            '--' + boundary,
+            'Content-Disposition: form-data; name="%s"; filename="%s"' % (key, filename),
+            '',
+            value])
+    lines.append('--' + boundary + '--')
+    lines.append('')
+    body = '\r\n'.join(lines)
+    content_type = 'multipart/form-data; boundary=%s' % boundary
+    return content_type, body
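What `encode_multipart` produces can be illustrated with a minimal Python 3 re-implementation of the same recipe (a sketch operating on `str` for simplicity; the real upload path above works with Python 2 byte strings, and the boundary here is an arbitrary placeholder):

```python
def encode_multipart(fields, files, boundary='--------FORMBOUNDARY'):
    """Build a multipart/form-data body from form fields and file payloads."""
    parts = []
    for key, value in fields:
        parts.extend(['--' + boundary,
                      'Content-Disposition: form-data; name="%s"' % key,
                      '', value])
    for key, filename, value in files:
        parts.extend(['--' + boundary,
                      'Content-Disposition: form-data; name="%s"; '
                      'filename="%s"' % (key, filename),
                      '', value])
    # Closing delimiter, then a trailing CRLF.
    parts.extend(['--' + boundary + '--', ''])
    body = '\r\n'.join(parts)
    content_type = 'multipart/form-data; boundary=%s' % boundary
    return content_type, body

ctype, body = encode_multipart([(':action', 'doc_upload')],
                               [('content', 'docs.zip', 'PK...')])
print(ctype)  # -> multipart/form-data; boundary=--------FORMBOUNDARY
```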
+
+class upload_docs(PyPIRCCommand):
+
+    user_options = [
+        ('repository=', 'r', "url of repository [default: %s]" % upload.DEFAULT_REPOSITORY),
+        ('show-response', None, 'display full response text from server'),
+        ('upload-dir=', None, 'directory to upload'),
+        ]
+
+    def initialize_options(self):
+        PyPIRCCommand.initialize_options(self)
+        self.upload_dir = "build/docs"
+
+    def finalize_options(self):
+        PyPIRCCommand.finalize_options(self)
+        if self.upload_dir is None:
+            build = self.get_finalized_command('build')
+            self.upload_dir = os.path.join(build.build_base, "docs")
+        self.announce('Using upload directory %s' % self.upload_dir)
+        self.verify_upload_dir(self.upload_dir)
+        config = self._read_pypirc()
+        if config != {}:
+            self.username = config['username']
+            self.password = config['password']
+            self.repository = config['repository']
+            self.realm = config['realm']
+
+    def verify_upload_dir(self, upload_dir):
+        self.ensure_dirname('upload_dir')
+        index_location = os.path.join(upload_dir, "index.html")
+        if not os.path.exists(index_location):
+            mesg = "No 'index.html found in docs directory (%s)"
+            raise DistutilsFileError(mesg % upload_dir)
+
+    def run(self):
+        tmp_dir = tempfile.mkdtemp()
+        name = self.distribution.metadata['Name']
+        zip_file = zip_dir(self.upload_dir)
+
+        fields = {':action': 'doc_upload', 'name': name}.items()
+        files = [('content', name, zip_file.getvalue())]
+        content_type, body = encode_multipart(fields, files)
+
+        credentials = self.username + ':' + self.password
+        auth = "Basic " + base64.encodestring(credentials).strip()
+
+        self.announce("Submitting documentation to %s" % (self.repository),
+                      log.INFO)
+
+        schema, netloc, url, params, query, fragments = \
+            urlparse.urlparse(self.repository)
+        if schema == "http":
+            conn = httplib.HTTPConnection(netloc)
+        elif schema == "https":
+            conn = httplib.HTTPSConnection(netloc)
+        else:
+            raise AssertionError("unsupported schema "+schema)
+
+        try:
+            conn.connect()
+            conn.putrequest("POST", url)
+            conn.putheader('Content-type', content_type)
+            conn.putheader('Content-length', str(len(body)))
+            conn.putheader('Authorization', auth)
+            conn.endheaders()
+            conn.send(body)
+        except socket.error, e:
+            self.announce(str(e), log.ERROR)
+            return
+
+        r = conn.getresponse()
+
+        if r.status == 200:
+            self.announce('Server response (%s): %s' % (r.status, r.reason),
+                          log.INFO)
+        elif r.status == 301:
+            location = r.getheader('Location')
+            if location is None:
+                location = 'http://packages.python.org/%s/' % name
+            self.announce('Upload successful. Visit %s' % location,
+                          log.INFO)
+        else:
+            self.announce('Upload failed (%s): %s' % (r.status, r.reason),
+                          log.ERROR)
+
+        if self.show_response:
+            print "\n".join(['-'*75, r.read(), '-'*75])
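The `run` method above builds its Basic auth header by base64-encoding `username:password` (`base64.encodestring` is the Python 2 spelling; later Pythons use `b64encode`). A small sketch of that step, with an illustrative helper name that is not part of distutils2:

```python
import base64

def basic_auth_header(username, password):
    """Build an HTTP Basic 'Authorization' header value the way the
    command above does (illustrative helper, not the distutils2 API)."""
    credentials = "%s:%s" % (username, password)
    # base64-encode and keep only the encoded payload, no trailing newline
    encoded = base64.b64encode(credentials.encode("ascii")).decode("ascii")
    return "Basic " + encoded

header = basic_auth_header("alice", "secret")
```

The resulting string is what the command passes to `conn.putheader('Authorization', ...)`.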
diff --git a/src/distutils2/compiler/bcppcompiler.py b/src/distutils2/compiler/bcppcompiler.py
--- a/src/distutils2/compiler/bcppcompiler.py
+++ b/src/distutils2/compiler/bcppcompiler.py
@@ -16,7 +16,7 @@
 import os
 
 from distutils2.errors import (DistutilsExecError, CompileError, LibError,
-                              LinkError, UnknownFileError)
+                               LinkError, UnknownFileError)
 from distutils2.compiler.ccompiler import CCompiler, gen_preprocess_options
 from distutils2.file_util import write_file
 from distutils2.dep_util import newer
diff --git a/src/distutils2/compiler/ccompiler.py b/src/distutils2/compiler/ccompiler.py
--- a/src/distutils2/compiler/ccompiler.py
+++ b/src/distutils2/compiler/ccompiler.py
@@ -10,7 +10,7 @@
 import re
 
 from distutils2.errors import (CompileError, LinkError, UnknownFileError,
-                              DistutilsPlatformError, DistutilsModuleError)
+                               DistutilsPlatformError, DistutilsModuleError)
 from distutils2.spawn import spawn
 from distutils2.util import split_quoted, execute, newer_group
 from distutils2 import log
@@ -76,7 +76,7 @@
 
         compiler.shared_lib_extension = so_ext
 
-class CCompiler:
+class CCompiler(object):
     """Abstract base class to define the interface that must be implemented
     by real compiler classes.  Also has some utility methods used by
     several compiler classes.
@@ -800,14 +800,16 @@
             library_dirs = []
         fd, fname = tempfile.mkstemp(".c", funcname, text=True)
         f = os.fdopen(fd, "w")
-        for incl in includes:
-            f.write("""#include "%s"\n""" % incl)
-        f.write("""\
+        try:
+            for incl in includes:
+                f.write("""#include "%s"\n""" % incl)
+            f.write("""\
 main (int argc, char **argv) {
     %s();
 }
 """ % funcname)
-        f.close()
+        finally:
+            f.close()
         try:
             objects = self.compile([fname], include_dirs=include_dirs)
         except CompileError:
@@ -938,8 +940,10 @@
         if os.path.isdir(name) or name == '':
             return
         if self.dry_run:
+            head = ''
             for part in name.split(os.sep):
-                self.log(part)
+                log.info("created directory %s%s", head, part)
+                head += part + os.sep
             return
         os.makedirs(name, mode)
 
@@ -1048,7 +1052,7 @@
         module_name = "distutils2.compiler." + module_name
         __import__ (module_name)
         module = sys.modules[module_name]
-        klass = vars(module)[class_name]
+        cls = vars(module)[class_name]
     except ImportError:
         raise DistutilsModuleError, \
               "can't compile C/C++ code: unable to load module '%s'" % \
@@ -1061,7 +1065,7 @@
     # XXX The None is necessary to preserve backwards compatibility
     # with classes that expect verbose to be the first positional
     # argument.
-    return klass(None, dry_run, force)
+    return cls(None, dry_run, force)
 
 
 def gen_preprocess_options(macros, include_dirs):
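The `new_compiler` hunk above resolves its compiler class at runtime with `__import__` plus a `vars(module)` lookup. The pattern in isolation, with a hypothetical helper name:

```python
import sys

def load_class(module_name, class_name):
    """Dynamically import module_name and return its class_name attribute,
    mirroring the lookup used by new_compiler() above."""
    __import__(module_name)
    # __import__ returns the top-level package, so fetch the real module
    module = sys.modules[module_name]
    return vars(module)[class_name]

# e.g. fetch OrderedDict without a static import
cls = load_class("collections", "OrderedDict")
```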
diff --git a/src/distutils2/compiler/emxccompiler.py b/src/distutils2/compiler/emxccompiler.py
--- a/src/distutils2/compiler/emxccompiler.py
+++ b/src/distutils2/compiler/emxccompiler.py
@@ -25,9 +25,8 @@
 from warnings import warn
 
 from distutils2.compiler.unixccompiler import UnixCCompiler
-from distutils2.util import write_file
 from distutils2.errors import DistutilsExecError, CompileError, UnknownFileError
-from distutils2.util import get_compiler_versions
+from distutils2.util import get_compiler_versions, write_file
 
 class EMXCCompiler (UnixCCompiler):
 
@@ -273,8 +272,10 @@
         # It would probably better to read single lines to search.
         # But we do this only once, and it is fast enough
         f = open(fn)
-        s = f.read()
-        f.close()
+        try:
+            s = f.read()
+        finally:
+            f.close()
 
     except IOError, exc:
         # if we can't read this file, we cannot say it is wrong
diff --git a/src/distutils2/compiler/msvc9compiler.py b/src/distutils2/compiler/msvc9compiler.py
--- a/src/distutils2/compiler/msvc9compiler.py
+++ b/src/distutils2/compiler/msvc9compiler.py
@@ -20,7 +20,7 @@
 import re
 
 from distutils2.errors import (DistutilsExecError, DistutilsPlatformError,
-                              CompileError, LibError, LinkError)
+                               CompileError, LibError, LinkError)
 from distutils2.compiler.ccompiler import CCompiler, gen_lib_options
 from distutils2 import log
 from distutils2.util import get_platform
@@ -50,7 +50,7 @@
     'win-ia64' : 'ia64',
 }
 
-class Reg:
+class Reg(object):
     """Helper class to read values from the registry
     """
 
@@ -112,7 +112,7 @@
         return s
     convert_mbcs = staticmethod(convert_mbcs)
 
-class MacroExpander:
+class MacroExpander(object):
 
     def __init__(self, version):
         self.macros = {}
diff --git a/src/distutils2/compiler/msvccompiler.py b/src/distutils2/compiler/msvccompiler.py
--- a/src/distutils2/compiler/msvccompiler.py
+++ b/src/distutils2/compiler/msvccompiler.py
@@ -15,7 +15,7 @@
 import string
 
 from distutils2.errors import (DistutilsExecError, DistutilsPlatformError,
-                              CompileError, LibError, LinkError)
+                               CompileError, LibError, LinkError)
 from distutils2.compiler.ccompiler import CCompiler, gen_lib_options
 from distutils2 import log
 
@@ -104,7 +104,7 @@
             pass
     return s
 
-class MacroExpander:
+class MacroExpander(object):
 
     def __init__(self, version):
         self.macros = {}
diff --git a/src/distutils2/compiler/unixccompiler.py b/src/distutils2/compiler/unixccompiler.py
--- a/src/distutils2/compiler/unixccompiler.py
+++ b/src/distutils2/compiler/unixccompiler.py
@@ -19,10 +19,10 @@
 from types import StringType, NoneType
 
 from distutils2.util import newer
-from distutils2.compiler.ccompiler import \
-     CCompiler, gen_preprocess_options, gen_lib_options
-from distutils2.errors import \
-     DistutilsExecError, CompileError, LibError, LinkError
+from distutils2.compiler.ccompiler import (CCompiler, gen_preprocess_options,
+                                           gen_lib_options)
+from distutils2.errors import (DistutilsExecError, CompileError,
+                               LibError, LinkError)
 from distutils2 import log
 
 try:
diff --git a/src/distutils2/config.py b/src/distutils2/config.py
--- a/src/distutils2/config.py
+++ b/src/distutils2/config.py
@@ -4,7 +4,7 @@
 that uses .pypirc in the distutils.command package.
 """
 import os
-from ConfigParser import ConfigParser
+from ConfigParser import RawConfigParser
 
 from distutils2.command.cmd import Command
 
@@ -59,7 +59,7 @@
         if os.path.exists(rc):
             self.announce('Using PyPI login from %s' % rc)
             repository = self.repository or self.DEFAULT_REPOSITORY
-            config = ConfigParser()
+            config = RawConfigParser()
             config.read(rc)
             sections = config.sections()
             if 'distutils' in sections:
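The `config.py` hunk swaps `ConfigParser` for `RawConfigParser`. The difference is `%`-interpolation: `RawConfigParser` returns values verbatim, so a `.pypirc` password containing `%` no longer trips an interpolation error. A Python 3 sketch (the diff itself targets Python 2, where the module is spelled `ConfigParser`):

```python
from configparser import RawConfigParser

# a .pypirc-style snippet; the bare '%' would confuse interpolating parsers
pypirc = """\
[server-login]
username = alice
password = 100%secret
"""

config = RawConfigParser()
config.read_string(pypirc)
# the password comes back verbatim, with no %-substitution attempted
password = config.get("server-login", "password")
```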
diff --git a/src/distutils2/converter/fixers/fix_imports.py b/src/distutils2/converter/fixers/fix_imports.py
--- a/src/distutils2/converter/fixers/fix_imports.py
+++ b/src/distutils2/converter/fixers/fix_imports.py
@@ -36,11 +36,16 @@
             pattern = []
             next = imp.next_sibling
             while next is not None:
+                # Get the first child if we have a Node
+                if not hasattr(next, "value"):
+                    next = next.children[0]
                 pattern.append(next.value)
                 if not hasattr(next, "next_sibling"):
                     next.next_sibling = next.get_next_sibling()
                 next = next.next_sibling
-            if pattern == ['import', 'setup']:
+            
+            if set(pattern).issubset(set(
+                    ['import', ',', 'setup', 'find_packages'])):
                 imp.value = 'distutils2.core'
                 imp.changed()
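The fixer now rewrites `from distutils.core import ...` lines whose imported names are any mix of `setup`, commas, and `find_packages`, not only the exact `import setup` form. The membership test it applies can be sketched as:

```python
def is_setup_import(pattern):
    """Return True when the collected tokens consist only of 'import',
    commas, 'setup' and 'find_packages', the condition the fixer above
    checks before rewriting the module name to 'distutils2.core'."""
    allowed = {'import', ',', 'setup', 'find_packages'}
    return set(pattern).issubset(allowed)
```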
 
diff --git a/src/distutils2/core.py b/src/distutils2/core.py
--- a/src/distutils2/core.py
+++ b/src/distutils2/core.py
@@ -2,8 +2,9 @@
 
 The only module that needs to be imported to use the Distutils; provides
 the 'setup' function (which is to be called from the setup script).  Also
-indirectly provides the Distribution and Command classes, although they are
-really defined in distutils2.dist and distutils2.cmd.
+exports useful classes so that setup scripts can import them from here
+although they are really defined in other modules: Distribution, Command,
+PyPIRCommand, Extension, find_packages.
 """
 
 __revision__ = "$Id: core.py 77704 2010-01-23 09:23:15Z tarek.ziade $"
@@ -12,7 +13,7 @@
 import os
 
 from distutils2.errors import (DistutilsSetupError, DistutilsArgError,
-                              DistutilsError, CCompilerError)
+                               DistutilsError, CCompilerError)
 from distutils2.util import grok_environment_error
 
 # Mainly import these so setup scripts can "from distutils2.core import" them.
@@ -20,6 +21,7 @@
 from distutils2.command.cmd import Command
 from distutils2.config import PyPIRCCommand
 from distutils2.extension import Extension
+from distutils2.util import find_packages
 
 # This is a barebones help message generated displayed when the user
 # runs the setup script with no arguments at all.  More useful help
@@ -47,7 +49,8 @@
                   'maintainer', 'maintainer_email', 'url', 'license',
                   'description', 'long_description', 'keywords',
                   'platforms', 'classifiers', 'download_url',
-                  'requires', 'provides', 'obsoletes',
+                  'requires', 'provides', 'obsoletes', 'use_2to3',
+                  'convert_2to3_doctests',
                   )
 
 # Legal keyword arguments for the Extension constructor
@@ -94,11 +97,7 @@
 
     # Determine the distribution class -- either caller-supplied or
     # our Distribution (see below).
-    klass = attrs.get('distclass')
-    if klass:
-        del attrs['distclass']
-    else:
-        klass = Distribution
+    distclass = attrs.pop('distclass', Distribution)
 
     if 'script_name' not in attrs:
         attrs['script_name'] = os.path.basename(sys.argv[0])
@@ -108,7 +107,7 @@
     # Create the Distribution instance, using the remaining arguments
     # (ie. everything except distclass) to initialize it
     try:
-        _setup_distribution = dist = klass(attrs)
+        _setup_distribution = dist = distclass(attrs)
     except DistutilsSetupError, msg:
         if 'name' in attrs:
             raise SystemExit, "error in %s setup command: %s" % \
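The five-line `distclass` lookup above collapses into a single `dict.pop` with a default, which both removes the key from `attrs` and supplies the fallback class. A self-contained sketch (the `Distribution` stand-in here is not the real class):

```python
class Distribution(object):
    """Stand-in for distutils2.dist.Distribution (illustrative only)."""

def pick_distclass(attrs):
    # remove 'distclass' from attrs if present, else fall back to Distribution
    return attrs.pop('distclass', Distribution)

attrs = {'name': 'demo', 'version': '1.0'}
cls = pick_distclass(attrs)
# cls is Distribution, and attrs keeps only the remaining setup keywords
```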
diff --git a/src/distutils2/depgraph.py b/src/distutils2/depgraph.py
--- a/src/distutils2/depgraph.py
+++ b/src/distutils2/depgraph.py
@@ -13,7 +13,7 @@
     """
     Represents a dependency graph between distributions.
 
-    The depedency relationships are stored in an ``adjacency_list`` that maps
+    The dependency relationships are stored in an ``adjacency_list`` that maps
     distributions to a list of ``(other, label)`` tuples where  ``other``
     is a distribution and the edge is labelled with ``label`` (i.e. the version
     specifier, if such was provided). Also, for more efficient traversal, for
@@ -31,8 +31,7 @@
         self.missing = {}
 
     def add_distribution(self, distribution):
-        """
-        Add the *distribution* to the graph.
+        """Add the *distribution* to the graph.
 
         :type distribution: :class:`pkgutil.Distribution` or
                             :class:`pkgutil.EggInfoDistribution`
@@ -42,11 +41,9 @@
         self.missing[distribution] = list()
 
     def add_edge(self, x, y, label=None):
-        """
-        Add an edge from distribution *x* to distribution *y* with the given
+        """Add an edge from distribution *x* to distribution *y* with the given
         *label*.
 
-
         :type x: :class:`pkgutil.Distribution` or
                  :class:`pkgutil.EggInfoDistribution`
         :type y: :class:`pkgutil.Distribution` or
@@ -70,8 +67,8 @@
 
 
 def graph_to_dot(graph, f, skip_disconnected=True):
-    """
-    Writes a DOT output for the graph to the provided file *f*.
+    """Writes a DOT output for the graph to the provided file *f*.
+
     If *skip_disconnected* is set to ``True``, then all distributions
     that are not dependent on any other distribution are skipped.
 
@@ -103,8 +100,7 @@
 
 
 def generate_graph(dists):
-    """
-    Generates a dependency graph from the given distributions.
+    """Generates a dependency graph from the given distributions.
 
     :parameter dists: a list of distributions
     :type dists: list of :class:`pkgutil.Distribution` and
@@ -158,8 +154,7 @@
 
 
 def dependent_dists(dists, dist):
-    """
-    Recursively generate a list of distributions from *dists* that are
+    """Recursively generate a list of distributions from *dists* that are
     dependent on *dist*.
 
     :param dists: a list of distributions
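The docstring above describes an adjacency list mapping each distribution to `(other, label)` tuples. Stripped of the pkgutil types, the structure looks like this (a sketch, not the real `DependencyGraph` API):

```python
class DepGraph(object):
    """Minimal adjacency-list dependency graph, as described above."""

    def __init__(self):
        self.adjacency_list = {}

    def add_distribution(self, dist):
        # a node starts with no outgoing edges
        self.adjacency_list[dist] = []

    def add_edge(self, x, y, label=None):
        # x depends on y, optionally labelled with a version specifier
        self.adjacency_list[x].append((y, label))

g = DepGraph()
g.add_distribution('choxie')
g.add_distribution('towel-stuff')
g.add_edge('choxie', 'towel-stuff', 'towel-stuff (0.1)')
```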
diff --git a/src/distutils2/dist.py b/src/distutils2/dist.py
--- a/src/distutils2/dist.py
+++ b/src/distutils2/dist.py
@@ -13,8 +13,10 @@
 except ImportError:
     warnings = None
 
+from ConfigParser import RawConfigParser
+
 from distutils2.errors import (DistutilsOptionError, DistutilsArgError,
-                              DistutilsModuleError, DistutilsClassError)
+                               DistutilsModuleError, DistutilsClassError)
 from distutils2.fancy_getopt import FancyGetopt, translate_longopt
 from distutils2.util import check_environ, strtobool
 from distutils2 import log
@@ -110,7 +112,11 @@
         ('requires', None,
          "print the list of packages/modules required"),
         ('obsoletes', None,
-         "print the list of packages/modules made obsolete")
+         "print the list of packages/modules made obsolete"),
+        ('use-2to3', None,
+         "use 2to3 to make source python 3.x compatible"),
+        ('convert-2to3-doctests', None,
+         "use 2to3 to convert doctests in separate text files"),
         ]
     display_option_names = map(lambda x: translate_longopt(x[0]),
                                display_options)
@@ -144,8 +150,8 @@
         # information here (and enough command-line options) that it's
         # worth it.  Also delegate 'get_XXX()' methods to the 'metadata'
         # object in a sneaky and underhanded (but efficient!) way.
+        self.metadata = DistributionMetadata()
 
-        self.metadata = DistributionMetadata()
         #for basename in self.metadata._METHOD_BASENAMES:
         #    method_name = "get_" + basename
         #    setattr(self, method_name, getattr(self.metadata, method_name))
@@ -204,6 +210,8 @@
         self.scripts = None
         self.data_files = None
         self.password = ''
+        self.use_2to3 = False
+        self.convert_2to3_doctests = []
 
         # And now initialize bookkeeping stuff that can't be supplied by
         # the caller at all.  'command_obj' maps command names to
@@ -248,7 +256,7 @@
                 elif hasattr(self, key):
                     setattr(self, key, val)
                 else:
-                    msg = "Unknown distribution option: %s" % repr(key)
+                    msg = "Unknown distribution option: %r" % key
                     if warnings is not None:
                         warnings.warn(msg)
                     else:
@@ -362,14 +370,12 @@
         return files
 
     def parse_config_files(self, filenames=None):
-        from ConfigParser import ConfigParser
-
         if filenames is None:
             filenames = self.find_config_files()
 
         log.debug("Distribution.parse_config_files():")
 
-        parser = ConfigParser()
+        parser = RawConfigParser()
         for filename in filenames:
             log.debug("  reading %s" % filename)
             parser.read(filename)
@@ -383,7 +389,7 @@
                         opt = opt.replace('-', '_')
                         opt_dict[opt] = (filename, val)
 
-            # Make the ConfigParser forget everything (so we retain
+            # Make the RawConfigParser forget everything (so we retain
             # the original filenames that options come from)
             parser.__init__()
 
@@ -581,15 +587,11 @@
         instance, analogous to the .finalize_options() method of Command
         objects.
         """
-
-        # XXX conversion -- removed
-        #for attr in ('keywords', 'platforms'):
-        #    value = self.metadata.get_field(attr)
-        #    if value is None:
-        #        continue
-        #    if isinstance(value, str):
-        #        value = [elm.strip() for elm in value.split(',')]
-        #        setattr(self.metadata, attr, value)
+        if getattr(self, 'convert_2to3_doctests', None):
+            self.convert_2to3_doctests = [os.path.join(p)
+                                for p in self.convert_2to3_doctests]
+        else:
+            self.convert_2to3_doctests = []
 
     def _show_help(self, parser, global_options=1, display_options=1,
                    commands=[]):
@@ -627,16 +629,16 @@
 
         for command in self.commands:
             if isinstance(command, type) and issubclass(command, Command):
-                klass = command
+                cls = command
             else:
-                klass = self.get_command_class(command)
-            if (hasattr(klass, 'help_options') and
-                isinstance(klass.help_options, list)):
-                parser.set_option_table(klass.user_options +
-                                        fix_help_options(klass.help_options))
+                cls = self.get_command_class(command)
+            if (hasattr(cls, 'help_options') and
+                isinstance(cls.help_options, list)):
+                parser.set_option_table(cls.user_options +
+                                        fix_help_options(cls.help_options))
             else:
-                parser.set_option_table(klass.user_options)
-            parser.print_help("Options for '%s' command:" % klass.__name__)
+                parser.set_option_table(cls.user_options)
+            parser.print_help("Options for '%s' command:" % cls.__name__)
             print('')
 
         print(gen_usage(self.script_name))
@@ -688,11 +690,11 @@
         print(header + ":")
 
         for cmd in commands:
-            klass = self.cmdclass.get(cmd)
-            if not klass:
-                klass = self.get_command_class(cmd)
+            cls = self.cmdclass.get(cmd)
+            if not cls:
+                cls = self.get_command_class(cmd)
             try:
-                description = klass.description
+                description = cls.description
             except AttributeError:
                 description = "(no description available)"
 
@@ -754,11 +756,11 @@
 
         rv = []
         for cmd in (std_commands + extra_commands):
-            klass = self.cmdclass.get(cmd)
-            if not klass:
-                klass = self.get_command_class(cmd)
+            cls = self.cmdclass.get(cmd)
+            if not cls:
+                cls = self.get_command_class(cmd)
             try:
-                description = klass.description
+                description = cls.description
             except AttributeError:
                 description = "(no description available)"
             rv.append((cmd, description))
@@ -790,13 +792,13 @@
         Raises DistutilsModuleError if the expected module could not be
         found, or if that module does not define the expected class.
         """
-        klass = self.cmdclass.get(command)
-        if klass:
-            return klass
+        cls = self.cmdclass.get(command)
+        if cls:
+            return cls
 
         for pkgname in self.get_command_packages():
             module_name = "%s.%s" % (pkgname, command)
-            klass_name = command
+            class_name = command
 
             try:
                 __import__ (module_name)
@@ -805,14 +807,14 @@
                 continue
 
             try:
-                klass = getattr(module, klass_name)
+                cls = getattr(module, class_name)
             except AttributeError:
                 raise DistutilsModuleError, \
                       "invalid command '%s' (no class '%s' in module '%s')" \
-                      % (command, klass_name, module_name)
+                      % (command, class_name, module_name)
 
-            self.cmdclass[command] = klass
-            return klass
+            self.cmdclass[command] = cls
+            return cls
 
         raise DistutilsModuleError("invalid command '%s'" % command)
 
@@ -828,8 +830,8 @@
             log.debug("Distribution.get_command_obj(): " \
                       "creating '%s' command object" % command)
 
-            klass = self.get_command_class(command)
-            cmd_obj = self.command_obj[command] = klass(self)
+            cls = self.get_command_class(command)
+            cmd_obj = self.command_obj[command] = cls(self)
             self.have_run[command] = 0
 
             # Set any options that were supplied in config files
@@ -885,7 +887,7 @@
             except ValueError, msg:
                 raise DistutilsOptionError, msg
 
-    def reinitialize_command(self, command, reinit_subcommands=0):
+    def get_reinitialized_command(self, command, reinit_subcommands=0):
         """Reinitializes a command to the state it was in when first
         returned by 'get_command_obj()': ie., initialized but not yet
         finalized.  This provides the opportunity to sneak option
@@ -920,7 +922,7 @@
 
         if reinit_subcommands:
             for sub in command.get_sub_commands():
-                self.reinitialize_command(sub, reinit_subcommands)
+                self.get_reinitialized_command(sub, reinit_subcommands)
 
         return command
 
diff --git a/src/distutils2/extension.py b/src/distutils2/extension.py
--- a/src/distutils2/extension.py
+++ b/src/distutils2/extension.py
@@ -23,7 +23,7 @@
 # import that large-ish module (indirectly, through distutils.core) in
 # order to do anything.
 
-class Extension:
+class Extension(object):
     """Just a collection of attributes that describes an extension
     module and everything needed to build it (hopefully in a portable
     way, but there are hooks that let you be as unportable as you need).
diff --git a/src/distutils2/fancy_getopt.py b/src/distutils2/fancy_getopt.py
--- a/src/distutils2/fancy_getopt.py
+++ b/src/distutils2/fancy_getopt.py
@@ -30,7 +30,7 @@
 # (for use as attributes of some object).
 longopt_xlate = string.maketrans('-', '_')
 
-class FancyGetopt:
+class FancyGetopt(object):
     """Wrapper around the standard 'getopt()' module that provides some
     handy extra functionality:
       * short and long options are tied together
@@ -473,7 +473,7 @@
     return string.translate(opt, longopt_xlate)
 
 
-class OptionDummy:
+class OptionDummy(object):
     """Dummy class just used as a place to hold command-line option
     values as instance attributes."""
 
diff --git a/src/distutils2/log.py b/src/distutils2/log.py
--- a/src/distutils2/log.py
+++ b/src/distutils2/log.py
@@ -11,14 +11,14 @@
 
 import sys
 
-class Log:
+class Log(object):
 
     def __init__(self, threshold=WARN):
         self.threshold = threshold
 
     def _log(self, level, msg, args):
         if level not in (DEBUG, INFO, WARN, ERROR, FATAL):
-            raise ValueError('%s wrong log level' % str(level))
+            raise ValueError('%s wrong log level' % level)
 
         if level >= self.threshold:
             if args:
diff --git a/src/distutils2/metadata.py b/src/distutils2/metadata.py
--- a/src/distutils2/metadata.py
+++ b/src/distutils2/metadata.py
@@ -105,7 +105,6 @@
     keys = fields.keys()
     possible_versions = ['1.0', '1.1', '1.2']
 
-
     # first let's try to see if a field is not part of one of the version
     for key in keys:
         if key not in _241_FIELDS and '1.0' in possible_versions:
@@ -128,9 +127,9 @@
         raise MetadataConflictError('You used incompatible 1.1 and 1.2 fields')
 
     # we have the choice, either 1.0, or 1.2
-    #   - 1.0 has a broken Summary field but work with all tools
+    #   - 1.0 has a broken Summary field but works with all tools
     #   - 1.1 is to avoid
-    #   - 1.2 fixes Summary but is not spreaded yet
+    #   - 1.2 fixes Summary but is not widespread yet
     if not is_1_1 and not is_1_2:
         # we couldn't find any specific marker
         if PKG_INFO_PREFERRED_VERSION in possible_versions:
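The loop above guesses the metadata version by elimination: any field that does not belong to a version's field set knocks that version out of the candidate list. The narrowing step in isolation, with invented field sets standing in for the module's real `_241_FIELDS`/`_314_FIELDS`/`_345_FIELDS`:

```python
# illustrative field sets; the real module defines much larger ones
_241_FIELDS = {'Name', 'Version', 'Summary', 'Platform'}
_314_FIELDS = _241_FIELDS | {'Classifier', 'Download-URL'}
_345_FIELDS = _241_FIELDS | {'Requires-Dist', 'Project-URL'}

def guess_versions(keys):
    """Drop a candidate metadata version whenever a field is not part
    of it, mirroring the narrowing loop above."""
    possible = ['1.0', '1.1', '1.2']
    for key in keys:
        if key not in _241_FIELDS and '1.0' in possible:
            possible.remove('1.0')
        if key not in _314_FIELDS and '1.1' in possible:
            possible.remove('1.1')
        if key not in _345_FIELDS and '1.2' in possible:
            possible.remove('1.2')
    return possible
```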
@@ -185,12 +184,12 @@
 class DistributionMetadata(object):
     """Distribution meta-data class (1.0 or 1.2).
     """
-    def __init__(self, path=None, platform_dependant=False,
+    def __init__(self, path=None, platform_dependent=False,
                  execution_context=None, fileobj=None):
         self._fields = {}
         self.version = None
         self.docutils_support = _HAS_DOCUTILS
-        self.platform_dependant = platform_dependant
+        self.platform_dependent = platform_dependent
         if path is not None:
             self.read(path)
         elif fileobj is not None:
@@ -263,7 +262,7 @@
         return reporter.messages
 
     def _platform(self, value):
-        if not self.platform_dependant or ';' not in value:
+        if not self.platform_dependent or ';' not in value:
             return True, value
         value, marker = value.split(';')
         return _interpret(marker, self.execution_context), value
@@ -518,7 +517,7 @@
             raise NameError(value)
 
     def _nonsense_op(self):
-        msg = 'This operation is not supported : "%s"' % str(self)
+        msg = 'This operation is not supported : "%s"' % self
         raise SyntaxError(msg)
 
     def __call__(self):
@@ -635,4 +634,3 @@
     operations = _CHAIN(execution_context)
     tokenize(StringIO(marker).readline, operations.eat)
     return operations.result()
-
diff --git a/src/distutils2/mkpkg.py b/src/distutils2/mkpkg.py
--- a/src/distutils2/mkpkg.py
+++ b/src/distutils2/mkpkg.py
@@ -675,7 +675,7 @@
 troveDict = buildTroveDict(troveList)
 
 
-class SetupClass:
+class SetupClass(object):
     def __init__(self):
         self.config = None
         self.classifierDict = {}
@@ -717,14 +717,16 @@
 
     def inspectFile(self, path):
         fp = open(path, 'r')
-        for line in [ fp.readline() for x in range(10) ]:
-            m = re.match(r'^#!.*python((?P<major>\d)(\.\d+)?)?$', line)
-            if m:
-                if m.group('major') == '3':
-                    self.classifierDict['Programming Language :: Python :: 3'] = 1
-                else:
-                    self.classifierDict['Programming Language :: Python :: 2'] = 1
-        fp.close()
+        try:
+            for line in [ fp.readline() for x in range(10) ]:
+                m = re.match(r'^#!.*python((?P<major>\d)(\.\d+)?)?$', line)
+                if m:
+                    if m.group('major') == '3':
+                        self.classifierDict['Programming Language :: Python :: 3'] = 1
+                    else:
+                        self.classifierDict['Programming Language :: Python :: 2'] = 1
+        finally:
+            fp.close()
 
 
     def inspectDirectory(self):
@@ -885,38 +887,33 @@
         if os.path.exists('setup.py'): shutil.move('setup.py', 'setup.py.old')
 
         fp = open('setup.py', 'w')
-        fp.write('#!/usr/bin/env python\n\n')
-        fp.write('from distutils2.core import setup\n\n')
+        try:
+            fp.write('#!/usr/bin/env python\n\n')
+            fp.write('from distutils2.core import setup\n\n')
+            fp.write('setup(name=%s,\n' % repr(self.setupData['name']))
+            fp.write('      version=%s,\n' % repr(self.setupData['version']))
+            fp.write('      description=%s,\n'
+                    % repr(self.setupData['description']))
+            fp.write('      author=%s,\n' % repr(self.setupData['author']))
+            fp.write('      author_email=%s,\n'
+                    % repr(self.setupData['author_email']))
+            if self.setupData['url']:
+                fp.write('      url=%s,\n' % repr(self.setupData['url']))
+            if self.setupData['classifier']:
+                fp.write('      classifier=[\n')
+                for classifier in sorted(self.setupData['classifier'].keys()):
+                    fp.write('            %s,\n' % repr(classifier))
+                fp.write('         ],\n')
+            if self.setupData['packages']:
+                fp.write('      packages=%s,\n'
+                        % repr(self._dotted_packages(self.setupData['packages'])))
+                fp.write('      package_dir=%s,\n'
+                        % repr(self.setupData['packages']))
+            fp.write('      #scripts=[\'path/to/script\']\n')
 
-        fp.write('from sys import version\n')
-        fp.write('if version < \'2.2.3\':\n')
-        fp.write('    from distutils2.dist import DistributionMetadata\n')
-        fp.write('    DistributionMetadata.classifier = None\n')
-        fp.write('    DistributionMetadata.download_url = None\n')
-
-        fp.write('setup(name = %s,\n' % repr(self.setupData['name']))
-        fp.write('        version = %s,\n' % repr(self.setupData['version']))
-        fp.write('        description = %s,\n'
-                % repr(self.setupData['description']))
-        fp.write('        author = %s,\n' % repr(self.setupData['author']))
-        fp.write('        author_email = %s,\n'
-                % repr(self.setupData['author_email']))
-        if self.setupData['url']:
-            fp.write('        url = %s,\n' % repr(self.setupData['url']))
-        if self.setupData['classifier']:
-            fp.write('        classifier = [\n')
-            for classifier in sorted(self.setupData['classifier'].keys()):
-                fp.write('              %s,\n' % repr(classifier))
-            fp.write('           ],\n')
-        if self.setupData['packages']:
-            fp.write('        packages = %s,\n'
-                    % repr(self._dotted_packages(self.setupData['packages'])))
-            fp.write('        package_dir = %s,\n'
-                    % repr(self.setupData['packages']))
-        fp.write('        #scripts = [\'path/to/script\']\n')
-
-        fp.write('        )\n')
-        fp.close()
+            fp.write('      )\n')
+        finally:
+            fp.close()
         os.chmod('setup.py', 0755)
 
         print 'Wrote "setup.py".'
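The rewritten `setup.py` writer above wraps its writes in try/finally so the file is closed even if a write fails. On Python 2.5+ the same guarantee reads more naturally as a `with` statement; a hedged sketch of a simplified writer, not mkpkg's full output:

```python
import os
import tempfile

def write_setup_py(path, name, version):
    """Write a minimal setup.py; the with-statement equivalent of the
    try/finally pattern used above (illustrative, heavily trimmed)."""
    with open(path, 'w') as fp:
        fp.write('#!/usr/bin/env python\n\n')
        fp.write('from distutils2.core import setup\n\n')
        fp.write('setup(name=%r,\n' % name)
        fp.write('      version=%r,\n' % version)
        fp.write('      )\n')
    # 0o755 is the Python 3 spelling of the octal literal 0755 in the diff
    os.chmod(path, 0o755)
```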
diff --git a/src/distutils2/pypi/__init__.py b/src/distutils2/pypi/__init__.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/pypi/__init__.py
@@ -0,0 +1,8 @@
+"""distutils2.pypi
+
+Package containing ways to interact with the PyPI APIs.
+""" 
+
+__all__ = ['simple',
+           'dist',
+]
diff --git a/src/distutils2/pypi/dist.py b/src/distutils2/pypi/dist.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/pypi/dist.py
@@ -0,0 +1,315 @@
+"""distutils2.pypi.dist
+
+Provides the PyPIDistribution class, which represents a distribution
+retrieved from PyPI.
+"""
+import re
+import urlparse
+import urllib
+import tempfile
+from operator import attrgetter
+
+try:
+    import hashlib
+except ImportError:
+    from distutils2._backport import hashlib
+
+from distutils2.version import suggest_normalized_version, NormalizedVersion
+from distutils2.pypi.errors import (HashDoesNotMatch, UnsupportedHashName,
+                                    CantParseArchiveName)
+
+EXTENSIONS = ".tar.gz .tar.bz2 .tar .zip .tgz .egg".split()
+MD5_HASH = re.compile(r'^.*#md5=([a-f0-9]+)$')
+
+
+class PyPIDistribution(object):
+    """Represents a distribution retrieved from PyPI.
+
+    This is a simple container for various attributes such as name, version,
+    downloaded_location, url, etc.
+
+    The PyPIDistribution class is used by the pypi.*Index class to return
+    information about distributions.
+    """
+
+    @classmethod
+    def from_url(cls, url, probable_dist_name=None, is_external=True):
+        """Build a Distribution from a url archive (egg or zip or tgz).
+
+        :param url: complete url of the distribution
+        :param probable_dist_name: A probable name of the distribution.
+        :param is_external: Tell if the url comes from an index or from
+                            an external URL.
+        """
+        # if the url contains a md5 hash, get it.
+        md5_hash = None
+        match = MD5_HASH.match(url)
+        if match is not None:
+            md5_hash = match.group(1)
+            # remove the hash
+            url = url.replace("#md5=%s" % md5_hash, "")
+
+        # parse the archive name to find dist name and version
+        archive_name = urlparse.urlparse(url)[2].split('/')[-1]
+        extension_matched = False
+        # remove the extension from the name
+        for ext in EXTENSIONS:
+            if archive_name.endswith(ext):
+                archive_name = archive_name[:-len(ext)]
+                extension_matched = True
+
+        name, version = split_archive_name(archive_name)
+        if extension_matched:
+            return PyPIDistribution(name, version, url=url, url_hashname="md5",
+                                    url_hashval=md5_hash,
+                                    url_is_external=is_external)
+
+    def __init__(self, name, version, type=None, url=None, url_hashname=None,
+                 url_hashval=None, url_is_external=True):
+        """Create a new instance of PyPIDistribution.
+
+        :param name: the name of the distribution
+        :param version: the version of the distribution
+        :param type: the type of the dist (eg. source, bin-*, etc.)
+        :param url: URL where we found this distribution
+        :param url_hashname: the name of the hash we want to use. Refer to the
+                         hashlib.new documentation for more information.
+        :param url_hashval: the hash value.
+        :param url_is_external: tell if the provided url comes from index
+                            browsing, or from an external resource.
+
+        """
+        self.name = name
+        self.version = NormalizedVersion(version)
+        self.type = type
+        # set the downloaded path to None by default. The goal here
+        # is to not download distributions multiple times
+        self.downloaded_location = None
+        # We store urls in dicts, because we need a bit more information
+        # than the simple URL. It will be used later to find the right url
+        # to use.
+        # We have two _url* attributes: _url and _urls. _urls contains a list
+        # of dicts for the different urls, and _url contains the chosen url,
+        # so the selection process does not run multiple times.
+        self._urls = []
+        self._url = None
+        self.add_url(url, url_hashname, url_hashval, url_is_external)
+
+    def add_url(self, url, hashname=None, hashval=None, is_external=True):
+        """Add a new url to the list of urls"""
+        if hashname is not None:
+            try:
+                hashlib.new(hashname)
+            except ValueError:
+                raise UnsupportedHashName(hashname)
+
+        self._urls.append({
+            'url': url,
+            'hashname': hashname,
+            'hashval': hashval,
+            'is_external': is_external,
+        })
+        # reset the url selection process
+        self._url = None
+
+    @property
+    def url(self):
+        """Pick up the right url for the list of urls in self.urls"""
+        # We return internal urls over externals.
+        # If there is more than one internal or external, return the first
+        # one.
+        if self._url is None:
+            if len(self._urls) > 1:
+                internals_urls = [u for u in self._urls
+                                  if not u['is_external']]
+                if len(internals_urls) >= 1:
+                    self._url = internals_urls[0]
+            if self._url is None:
+                self._url = self._urls[0]
+        return self._url
+
+    @property
+    def is_source(self):
+        """return if the distribution is a source one or not"""
+        return self.type == 'source'
+
+    @property
+    def is_final(self):
+        """proxy to version.is_final"""
+        return self.version.is_final
+
+    def download(self, path=None):
+        """Download the distribution to a path, and return it.
+
+        If a path is given, use it; otherwise, generate a new one.
+        """
+        if path is None:
+            path = tempfile.mkdtemp()
+
+        # if we have not downloaded it yet, do so.
+        if self.downloaded_location is None:
+            url = self.url['url']
+            archive_name = urlparse.urlparse(url)[2].split('/')[-1]
+            filename, headers = urllib.urlretrieve(url,
+                                                   path + "/" + archive_name)
+            self.downloaded_location = filename
+            self._check_md5(filename)
+        return self.downloaded_location
+
+    def _check_md5(self, filename):
+        """Check that the md5 checksum of the given file matches the one in
+        url param"""
+        hashname = self.url['hashname']
+        expected_hashval = self.url['hashval']
+        if None not in (expected_hashval, hashname):
+            f = open(filename, 'rb')
+            hashval = hashlib.new(hashname)
+            hashval.update(f.read())
+            if hashval.hexdigest() != expected_hashval:
+                raise HashDoesNotMatch("got %s instead of %s"
+                    % (hashval.hexdigest(), expected_hashval))
+
+    def __repr__(self):
+        return "%s %s %s %s" \
+            % (self.__class__.__name__, self.name, self.version,
+               self.type or "")
+
+    def _check_is_comparable(self, other):
+        if not isinstance(other, PyPIDistribution):
+            raise TypeError("cannot compare %s and %s"
+                % (type(self).__name__, type(other).__name__))
+        elif self.name != other.name:
+            raise TypeError("cannot compare %s and %s"
+                % (self.name, other.name))
+
+    def __eq__(self, other):
+        self._check_is_comparable(other)
+        return self.version == other.version
+
+    def __lt__(self, other):
+        self._check_is_comparable(other)
+        return self.version < other.version
+
+    def __ne__(self, other):
+        return not self.__eq__(other)
+
+    def __gt__(self, other):
+        return not (self.__lt__(other) or self.__eq__(other))
+
+    def __le__(self, other):
+        return self.__eq__(other) or self.__lt__(other)
+
+    def __ge__(self, other):
+        return self.__eq__(other) or self.__gt__(other)
+
+    # See http://docs.python.org/reference/datamodel#object.__hash__
+    __hash__ = object.__hash__
+
+
+class PyPIDistributions(list):
+    """A container of PyPIDistribution objects.
+
+    Contains methods and facilities to sort and filter distributions.
+    """
+    def __init__(self, dists=()):
+        # Append each item individually instead of passing the sequence to
+        # the parent constructor, so the duplicate merging done in append()
+        # is applied on instantiation too.
+        super(PyPIDistributions, self).__init__()
+        for item in dists:
+            self.append(item)
+
+    def filter(self, predicate):
+        """Filter the distributions and return a subset of distributions that
+        match the given predicate
+        """
+        return PyPIDistributions(
+            [dist for dist in self if dist.name == predicate.name and
+            predicate.match(dist.version)])
+
+    def get_last(self, predicate, prefer_source=None, prefer_final=None):
+        """Return the most up to date version, that satisfy the given
+        predicate
+        """
+        distributions = self.filter(predicate)
+        distributions.sort_distributions(prefer_source, prefer_final, reverse=True)
+        return distributions[0]
+
+    def get_same_name_and_version(self):
+        """Return lists of PyPIDistribution objects that refer to the same
+        name and version number. This do not consider the type (source, binary,
+        etc.)"""
+        processed = []
+        duplicates = []
+        for dist in self:
+            if (dist.name, dist.version) not in processed:
+                processed.append((dist.name, dist.version))
+                found_duplicates = [d for d in self if d.name == dist.name and
+                                    d.version == dist.version]
+                if len(found_duplicates) > 1:
+                    duplicates.append(found_duplicates)
+        return duplicates
+
+    def append(self, o):
+        """Append a new distribution to the list.
+
+        If a distribution with the same name and version exists, just grab the
+        URL information and add a new url to the existing one.
+        """
+        similar_dists = [d for d in self if d.name == o.name and
+                         d.version == o.version and d.type == o.type]
+        if len(similar_dists) > 0:
+            dist = similar_dists[0]
+            dist.add_url(**o.url)
+        else:
+            super(PyPIDistributions, self).append(o)
+
+    def sort_distributions(self, prefer_source=True, prefer_final=False,
+                           reverse=True, *args, **kwargs):
+        """order the results with the given properties"""
+
+        sort_by = []
+        if prefer_final:
+            sort_by.append("is_final")
+        sort_by.append("version")
+
+        if prefer_source:
+            sort_by.append("is_source")
+
+        super(PyPIDistributions, self).sort(
+            key=lambda i: [getattr(i, arg) for arg in sort_by],
+            reverse=reverse, *args, **kwargs)
+
+
+def split_archive_name(archive_name, probable_name=None):
+    """Split an archive name into two parts: name and version.
+
+    Return the tuple (name, version)
+    """
+    # Try to determine which part is the name and which is the version using
+    # the "-" separator. Take the larger part to be the version number, then
+    # reduce it if this does not work.
+    def eager_split(archive, maxsplit=2):
+        # split using the "-" separator
+        splits = archive.rsplit("-", maxsplit)
+        name = splits[0]
+        version = "-".join(splits[1:])
+        if version.startswith("-"):
+            version = version[1:]
+        if suggest_normalized_version(version) is None and maxsplit >= 0:
+            # we did not get a good version number: recurse!
+            return eager_split(archive, maxsplit - 1)
+        else:
+            return (name, version)
+    if probable_name is not None:
+        probable_name = probable_name.lower()
+    name = None
+    if probable_name is not None and probable_name in archive_name:
+        # we get the name from probable_name, if given.
+        name = probable_name
+        # strip the name prefix (lstrip would strip characters, not a prefix)
+        version = archive_name[len(name):].lstrip("-")
+    else:
+        name, version = eager_split(archive_name)
+
+    version = suggest_normalized_version(version)
+    if version != "" and name != "":
+        return (name.lower(), version)
+    else:
+        raise CantParseArchiveName(archive_name)
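The name/version splitting heuristic above can be sketched standalone. Here `looks_like_version` is a crude, made-up stand-in for distutils2's `suggest_normalized_version`, used only so the example runs on its own:

```python
import re

# crude stand-in for suggest_normalized_version: accept dotted numbers with
# optional alpha/beta/dev/post suffixes
_VERSION_RE = re.compile(r'^\d+(\.\d+)*([abc]\d+|\.dev\d+|\.post\d+)*$')

def looks_like_version(s):
    return bool(_VERSION_RE.match(s))

def eager_split(archive, maxsplit=2):
    # rsplit on "-", assuming the rightmost parts form the version
    splits = archive.rsplit("-", maxsplit)
    name = splits[0]
    version = "-".join(splits[1:])
    if version.startswith("-"):
        version = version[1:]
    if not looks_like_version(version) and maxsplit >= 0:
        # not a plausible version number: retry with a smaller split
        return eager_split(archive, maxsplit - 1)
    return (name, version)

# a name containing "-" needs one recursion step before the split is right
print(eager_split("python-ldap-2.3.12"))  # → ('python-ldap', '2.3.12')
```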
diff --git a/src/distutils2/pypi/errors.py b/src/distutils2/pypi/errors.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/pypi/errors.py
@@ -0,0 +1,33 @@
+"""distutils2.pypi.errors
+
+All errors and exceptions raised by the PyPI index classes.
+"""
+from distutils2.errors import DistutilsError
+
+
+class PyPIError(DistutilsError):
+    """The base class for errors of the pypi python package."""
+
+
+class DistributionNotFound(PyPIError):
+    """No distribution match the given requirements."""
+
+
+class CantParseArchiveName(PyPIError):
+    """An archive name can't be parsed to find distribution name and version"""
+
+
+class DownloadError(PyPIError):
+    """An error has occurs while downloading"""
+
+
+class HashDoesNotMatch(DownloadError):
+    """Compared hashes does not match"""
+
+
+class UnsupportedHashName(PyPIError):
+    """A unsupported hashname has been used"""
+
+
+class UnableToDownload(PyPIError):
+    """All mirrors have been tried, without success"""
diff --git a/src/distutils2/pypi/simple.py b/src/distutils2/pypi/simple.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/pypi/simple.py
@@ -0,0 +1,393 @@
+"""pypi.simple
+
+Contains the class "SimpleIndex", a simple spider to find and retrieve
+distributions on the Python Package Index, using its "simple" API,
+available at http://pypi.python.org/simple/
+"""
+from fnmatch import translate
+import httplib
+import re
+import socket
+import sys
+import urllib2
+import urlparse
+
+from distutils2.version import VersionPredicate
+from distutils2.pypi.dist import (PyPIDistribution, PyPIDistributions,
+                                  EXTENSIONS)
+from distutils2.pypi.errors import (PyPIError, DistributionNotFound,
+                                    DownloadError, UnableToDownload)
+from distutils2 import __version__ as __distutils2_version__
+
+# -- Constants -----------------------------------------------
+PYPI_DEFAULT_INDEX_URL = "http://pypi.python.org/simple/"
+PYPI_DEFAULT_MIRROR_URL = "mirrors.pypi.python.org"
+DEFAULT_HOSTS = ("*",)
+SOCKET_TIMEOUT = 15
+USER_AGENT = "Python-urllib/%s distutils2/%s" % (
+    sys.version[:3], __distutils2_version__)
+
+# -- Regexps -------------------------------------------------
+EGG_FRAGMENT = re.compile(r'^egg=([-A-Za-z0-9_.]+)$')
+HREF = re.compile("""href\\s*=\\s*['"]?([^'"> ]+)""", re.I)
+PYPI_MD5 = re.compile(
+    '<a href="([^"#]+)">([^<]+)</a>\n\s+\\(<a (?:title="MD5 hash"\n\s+)'
+    'href="[^?]+\?:action=show_md5&amp;digest=([0-9a-f]{32})">md5</a>\\)')
+URL_SCHEME = re.compile('([-+.a-z0-9]{2,}):', re.I).match
+
+# This pattern matches a character entity reference (a decimal numeric
+# references, a hexadecimal numeric reference, or a named reference).
+ENTITY_SUB = re.compile(r'&(#(\d+|x[\da-fA-F]+)|[\w.:-]+);?').sub
+REL = re.compile("""<([^>]*\srel\s*=\s*['"]?([^'">]+)[^>]*)>""", re.I)
+
+
+def socket_timeout(timeout=SOCKET_TIMEOUT):
+    """Decorator to add a socket timeout when requesting pages on PyPI.
+    """
+    def _socket_timeout(func):
+        def _socket_timeout(self, *args, **kwargs):
+            old_timeout = socket.getdefaulttimeout()
+            if hasattr(self, "_timeout"):
+                timeout = self._timeout
+            socket.setdefaulttimeout(timeout)
+            try:
+                return func(self, *args, **kwargs)
+            finally:
+                socket.setdefaulttimeout(old_timeout)
+        return _socket_timeout
+    return _socket_timeout
+
+
+class SimpleIndex(object):
+    """Provides useful tools to request the Python Package Index simple API
+
+    :param index_url: the url of the simple index to search on.
+    :param follow_externals: tell if following external links is needed or
+                             not. Default is False.
+    :param hosts: a list of hosts allowed to be processed while using
+                  follow_externals=True. Default behavior is to follow all
+                  hosts.
+    :param follow_externals: tell if following external links is needed or
+                             not. Default is False.
+    :param prefer_source: if there is binary and source distributions, the
+                          source prevails.
+    :param prefer_final: if the version is not mentioned, and the last
+                         version is not a "final" one (alpha, beta, etc.),
+                         pick up the last final version.
+    :param mirrors_url: the url to look on for DNS records giving mirror
+                        adresses.
+    :param mirrors: a list of mirrors to check out if problems
+                         occurs while working with the one given in "url"
+    :param timeout: time in seconds to consider a url has timeouted.
+    """
+
+    def __init__(self, index_url=PYPI_DEFAULT_INDEX_URL, hosts=DEFAULT_HOSTS,
+                 follow_externals=False, prefer_source=True,
+                 prefer_final=False, mirrors_url=PYPI_DEFAULT_MIRROR_URL,
+                 mirrors=None, timeout=SOCKET_TIMEOUT):
+        self.follow_externals = follow_externals
+
+        if not index_url.endswith("/"):
+            index_url += "/"
+        self._index_urls = [index_url]
+        # if no mirrors are defined, use the method described in PEP 381.
+        if mirrors is None:
+            try:
+                mirrors = socket.gethostbyname_ex(mirrors_url)[-1]
+            except socket.gaierror:
+                mirrors = []
+        self._index_urls.extend(mirrors)
+        self._current_index_url = 0
+        self._timeout = timeout
+        self._prefer_source = prefer_source
+        self._prefer_final = prefer_final
+
+        # create a regexp to match all given hosts
+        self._allowed_hosts = re.compile('|'.join(map(translate, hosts))).match
+
+        # we keep an index of pages we have processed, in order to avoid
+        # scanning them multiple times (eg. if multiple pages point to the
+        # same one)
+        self._processed_urls = []
+        self._distributions = {}
+
+    def find(self, requirements, prefer_source=None, prefer_final=None):
+        """Browse the PyPI to find distributions that fullfil the given
+        requirements.
+
+        :param requirements: A project name and it's distribution, using
+                             version specifiers, as described in PEP345.
+        :type requirements:  You can pass either a version.VersionPredicate
+                             or a string.
+        :param prefer_source: if there is binary and source distributions, the
+                              source prevails.
+        :param prefer_final: if the version is not mentioned, and the last
+                             version is not a "final" one (alpha, beta, etc.),
+                             pick up the last final version.
+        """
+        requirements = self._get_version_predicate(requirements)
+        if prefer_source is None:
+            prefer_source = self._prefer_source
+        if prefer_final is None:
+            prefer_final = self._prefer_final
+
+        # process the index for this project
+        self._process_pypi_page(requirements.name)
+
+        # filter with requirements and return the results
+        if requirements.name in self._distributions:
+            dists = self._distributions[requirements.name].filter(requirements)
+            dists.sort_distributions(prefer_source=prefer_source,
+                                     prefer_final=prefer_final)
+        else:
+            dists = []
+
+        return dists
+
+    def get(self, requirements, *args, **kwargs):
+        """Browse the PyPI index to find distributions that fullfil the
+        given requirements, and return the most recent one.
+
+        You can specify prefer_final and prefer_source arguments here.
+        If not, the default one will be used.
+        """
+        predicate = self._get_version_predicate(requirements)
+        dists = self.find(predicate, *args, **kwargs)
+
+        if len(dists) == 0:
+            raise DistributionNotFound(requirements)
+
+        return dists.get_last(predicate)
+
+    def download(self, requirements, temp_path=None, *args, **kwargs):
+        """Download the distribution, using the requirements.
+
+        If more than one distribution matches the requirements, use the
+        latest version.
+        Download the distribution, and put it in temp_path. If no temp_path
+        is given, create and return one.
+
+        Returns the complete absolute path to the downloaded archive.
+
+        :param requirements: The same as the `requirements` argument of `find`.
+
+        You can specify prefer_final and prefer_source arguments here.
+        If not, the defaults will be used.
+        """
+        return self.get(requirements, *args, **kwargs)\
+                   .download(path=temp_path)
+
+    def _get_version_predicate(self, requirements):
+        """Return a VersionPredicate object, from a string or an already
+        existing object.
+        """
+        if isinstance(requirements, str):
+            requirements = VersionPredicate(requirements)
+        return requirements
+
+    @property
+    def index_url(self):
+        return self._index_urls[self._current_index_url]
+
+    def _switch_to_next_mirror(self):
+        """Switch to the next mirror (eg. point self.index_url to the next
+        url).
+        """
+        # Internally, iterate over the _index_urls list; if we have tried
+        # all of the available indexes, raise an exception.
+        if self._current_index_url < len(self._index_urls) - 1:
+            self._current_index_url = self._current_index_url + 1
+        else:
+            raise UnableToDownload("All mirrors have failed")
+
+    def _is_browsable(self, url):
+        """Tell if the given URL can be browsed or not.
+
+        It uses the follow_externals and the hosts list to tell if the given
+        url is browsable or not.
+        """
+        # if _index_url is contained in the given URL, we are browsing the
+        # index, and it's always "browsable".
+        # local files are always considered browsable resources
+        if self.index_url in url or urlparse.urlparse(url)[0] == "file":
+            return True
+        elif self.follow_externals:
+            # 1 is the netloc part of the parsed url
+            return self._allowed_hosts(urlparse.urlparse(url)[1]) is not None
+        return False
+
+    def _is_distribution(self, link):
+        """Tell if the given URL matches to a distribution name or not.
+        """
+        #XXX find a better way to check that links are distributions
+        # Using a regexp ?
+        for ext in EXTENSIONS:
+            if ext in link:
+                return True
+        return False
+
+    def _register_dist(self, dist):
+        """Register a distribution as a part of fetched distributions for
+        SimpleIndex.
+
+        Return the PyPIDistributions object for the specified project name
+        """
+        # Internally, check if an entry exists for the project name; if not,
+        # create a new one, and if it exists, add the dist to the pool.
+        if dist.name not in self._distributions:
+            self._distributions[dist.name] = PyPIDistributions()
+        self._distributions[dist.name].append(dist)
+        return self._distributions[dist.name]
+
+    def _process_url(self, url, project_name=None, follow_links=True):
+        """Process an url and search for distributions packages.
+
+        For each URL found, if it's a download, creates a PyPIdistribution
+        object. If it's a homepage and we can follow links, process it too.
+
+        :param url: the url to process
+        :param project_name: the project name we are searching for.
+        :param follow_links: Tell if we want to follow the links we find
+                             (eg. run this method recursively on them).
+                             Links are followed at most one level deep.
+        """
+        f = self._open_url(url)
+        base_url = f.url
+        if url not in self._processed_urls:
+            self._processed_urls.append(url)
+            link_matcher = self._get_link_matcher(url)
+            for link, is_download in link_matcher(f.read(), base_url):
+                if link not in self._processed_urls:
+                    if self._is_distribution(link) or is_download:
+                        self._processed_urls.append(link)
+                        # it's a distribution, so create a dist object
+                        dist = PyPIDistribution.from_url(link, project_name,
+                                    is_external=not self.index_url in url)
+                        self._register_dist(dist)
+                    else:
+                        if self._is_browsable(link) and follow_links:
+                            self._process_url(link, project_name,
+                                follow_links=False)
+
+    def _get_link_matcher(self, url):
+        """Returns the right link matcher function of the given url
+        """
+        if self.index_url in url:
+            return self._simple_link_matcher
+        else:
+            return self._default_link_matcher
+
+    def _simple_link_matcher(self, content, base_url):
+        """Yield all links with a rel="download" or rel="homepage".
+
+        This matches the simple index requirements for matching links.
+        If follow_externals is set to False, don't yield the external
+        urls.
+        """
+        for match in REL.finditer(content):
+            tag, rel = match.groups()
+            rels = map(str.strip, rel.lower().split(','))
+            if 'homepage' in rels or 'download' in rels:
+                for match in HREF.finditer(tag):
+                    url = urlparse.urljoin(base_url,
+                                           self._htmldecode(match.group(1)))
+                    if 'download' in rels or self._is_browsable(url):
+                        # yield a list of (url, is_download)
+                        yield (urlparse.urljoin(base_url, url),
+                               'download' in rels)
+
+    def _default_link_matcher(self, content, base_url):
+        """Yield all links found on the page.
+        """
+        for match in HREF.finditer(content):
+            url = urlparse.urljoin(base_url, self._htmldecode(match.group(1)))
+            if self._is_browsable(url):
+                yield (url, False)
+
+    def _process_pypi_page(self, name):
+        """Find and process a PyPI page for the given project name.
+
+        :param name: the name of the project whose page we want to find
+        """
+        try:
+            # Browse and index the content of the given PyPI page.
+            url = self.index_url + name + "/"
+            self._process_url(url, name)
+        except DownloadError:
+            # if an error occurs, try with the next index_url
+            # (provided by the mirrors)
+            self._switch_to_next_mirror()
+            self._distributions.clear()
+            self._process_pypi_page(name)
+
+    @socket_timeout()
+    def _open_url(self, url):
+        """Open a urllib2 request, handling HTTP authentication, and local
+        files support.
+
+        """
+        try:
+            scheme, netloc, path, params, query, frag = urlparse.urlparse(url)
+
+            if scheme in ('http', 'https'):
+                auth, host = urllib2.splituser(netloc)
+            else:
+                auth = None
+
+            # add index.html automatically for filesystem paths
+            if scheme == 'file':
+                if url.endswith('/'):
+                    url += "index.html"
+
+            if auth:
+                auth = "Basic " + \
+                    urllib2.unquote(auth).encode('base64').strip()
+                new_url = urlparse.urlunparse((
+                    scheme, host, path, params, query, frag))
+                request = urllib2.Request(new_url)
+                request.add_header("Authorization", auth)
+            else:
+                request = urllib2.Request(url)
+            request.add_header('User-Agent', USER_AGENT)
+            fp = urllib2.urlopen(request)
+
+            if auth:
+                # Put authentication info back into request URL if same host,
+                # so that links found on the page will work
+                s2, h2, path2, param2, query2, frag2 = \
+                    urlparse.urlparse(fp.url)
+                if s2 == scheme and h2 == host:
+                    fp.url = urlparse.urlunparse(
+                        (s2, netloc, path2, param2, query2, frag2))
+
+            return fp
+        except (ValueError, httplib.InvalidURL), v:
+            msg = ' '.join([str(arg) for arg in v.args])
+            raise PyPIError('%s %s' % (url, msg))
+        except urllib2.HTTPError, v:
+            return v
+        except urllib2.URLError, v:
+            raise DownloadError("Download error for %s: %s" % (url, v.reason))
+        except httplib.BadStatusLine, v:
+            raise DownloadError('%s returned a bad status line. '
+                'The server might be down, %s' % (url, v.line))
+        except httplib.HTTPException, v:
+            raise DownloadError("Download error for %s: %s" % (url, v))
+
+    def _decode_entity(self, match):
+        what = match.group(1)
+        if what.startswith('#x'):
+            what = int(what[2:], 16)
+        elif what.startswith('#'):
+            what = int(what[1:])
+        else:
+            from htmlentitydefs import name2codepoint
+            what = name2codepoint.get(what, match.group(0))
+        return unichr(what)
+
+    def _htmldecode(self, text):
+        """Decode HTML entities in the given text."""
+        return ENTITY_SUB(self._decode_entity, text)
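The rel="download"/rel="homepage" scraping in `_simple_link_matcher` can be exercised standalone with the same regular expressions. This is a simplified sketch (it skips the browsability check and url-joining), and the sample HTML is made up:

```python
import re

# same patterns as in simple.py
HREF = re.compile(r"""href\s*=\s*['"]?([^'"> ]+)""", re.I)
REL = re.compile(r"""<([^>]*\srel\s*=\s*['"]?([^'">]+)[^>]*)>""", re.I)

content = '''
<a href="Foo-1.0.tar.gz#md5=abc">plain link, no rel: ignored</a>
<a href="http://example.org/" rel="homepage">home</a>
<a href="Foo-1.1.tar.gz" rel="download">1.1</a>
'''

found = []
for match in REL.finditer(content):
    tag, rel = match.groups()
    rels = [r.strip() for r in rel.lower().split(',')]
    if 'homepage' in rels or 'download' in rels:
        for href in HREF.finditer(tag):
            # collect (url, is_download) pairs, as the matcher yields them
            found.append((href.group(1), 'download' in rels))

print(found)
```

Only the two anchors carrying a rel attribute are kept; the plain link is skipped, which is exactly why the simple-index matcher sees fewer links than the default one.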
diff --git a/src/distutils2/spawn.py b/src/distutils2/spawn.py
--- a/src/distutils2/spawn.py
+++ b/src/distutils2/spawn.py
@@ -14,7 +14,7 @@
 from distutils2.errors import DistutilsPlatformError, DistutilsExecError
 from distutils2 import log
 
-def spawn(cmd, search_path=1, verbose=0, dry_run=0):
+def spawn(cmd, search_path=1, verbose=0, dry_run=0, env=None):
     """Run another program, specified as a command list 'cmd', in a new process.
 
     'cmd' is just the argument list for the new process, ie.
@@ -27,15 +27,18 @@
     must be the exact path to the executable.  If 'dry_run' is true,
     the command will not actually be run.
 
+    If 'env' is given, it is an environment dictionary used for the execution
+    environment.
+
     Raise DistutilsExecError if running the program fails in any way; just
     return on success.
     """
     if os.name == 'posix':
-        _spawn_posix(cmd, search_path, dry_run=dry_run)
+        _spawn_posix(cmd, search_path, dry_run=dry_run, env=env)
     elif os.name == 'nt':
-        _spawn_nt(cmd, search_path, dry_run=dry_run)
+        _spawn_nt(cmd, search_path, dry_run=dry_run, env=env)
     elif os.name == 'os2':
-        _spawn_os2(cmd, search_path, dry_run=dry_run)
+        _spawn_os2(cmd, search_path, dry_run=dry_run, env=env)
     else:
         raise DistutilsPlatformError, \
               "don't know how to spawn programs on platform '%s'" % os.name
@@ -56,7 +59,7 @@
             args[i] = '"%s"' % arg
     return args
 
-def _spawn_nt(cmd, search_path=1, verbose=0, dry_run=0):
+def _spawn_nt(cmd, search_path=1, verbose=0, dry_run=0, env=None):
     executable = cmd[0]
     cmd = _nt_quote_args(cmd)
     if search_path:
@@ -66,7 +69,11 @@
     if not dry_run:
         # spawn for NT requires a full path to the .exe
         try:
-            rc = os.spawnv(os.P_WAIT, executable, cmd)
+            if env is None:
+                rc = os.spawnv(os.P_WAIT, executable, cmd)
+            else:
+                rc = os.spawnve(os.P_WAIT, executable, cmd, env)
+
         except OSError, exc:
             # this seems to happen when the command isn't found
             raise DistutilsExecError, \
@@ -76,7 +83,7 @@
             raise DistutilsExecError, \
                   "command '%s' failed with exit status %d" % (cmd[0], rc)
 
-def _spawn_os2(cmd, search_path=1, verbose=0, dry_run=0):
+def _spawn_os2(cmd, search_path=1, verbose=0, dry_run=0, env=None):
     executable = cmd[0]
     if search_path:
         # either we find one or it stays the same
@@ -85,7 +92,11 @@
     if not dry_run:
         # spawnv for OS/2 EMX requires a full path to the .exe
         try:
-            rc = os.spawnv(os.P_WAIT, executable, cmd)
+            if env is None:
+                rc = os.spawnv(os.P_WAIT, executable, cmd)
+            else:
+                rc = os.spawnve(os.P_WAIT, executable, cmd, env)
+
         except OSError, exc:
             # this seems to happen when the command isn't found
             raise DistutilsExecError, \
@@ -97,16 +108,24 @@
                   "command '%s' failed with exit status %d" % (cmd[0], rc)
 
 
-def _spawn_posix(cmd, search_path=1, verbose=0, dry_run=0):
+def _spawn_posix(cmd, search_path=1, verbose=0, dry_run=0, env=None):
     log.info(' '.join(cmd))
     if dry_run:
         return
-    exec_fn = search_path and os.execvp or os.execv
+
+    if env is None:
+        exec_fn = search_path and os.execvp or os.execv
+    else:
+        exec_fn = search_path and os.execvpe or os.execve
+
     pid = os.fork()
 
     if pid == 0:  # in the child
         try:
-            exec_fn(cmd[0], cmd)
+            if env is None:
+                exec_fn(cmd[0], cmd)
+            else:
+                exec_fn(cmd[0], cmd, env)
         except OSError, e:
             sys.stderr.write("unable to execute %s: %s\n" %
                              (cmd[0], e.strerror))
diff --git a/src/distutils2/tests/__init__.py b/src/distutils2/tests/__init__.py
--- a/src/distutils2/tests/__init__.py
+++ b/src/distutils2/tests/__init__.py
@@ -45,7 +45,7 @@
     """Test failed."""
 
 
-class BasicTestRunner:
+class BasicTestRunner(object):
     def run(self, test):
         result = unittest.TestResult()
         test(result)
diff --git a/src/distutils2/tests/conversions/05_after.py b/src/distutils2/tests/conversions/05_after.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/conversions/05_after.py
@@ -0,0 +1,137 @@
+#!/usr/bin/env python
+# -*- coding: utf-8 -*-
+#
+# Copyright (C) 2003-2009 Edgewall Software
+# All rights reserved.
+#
+# This software is licensed as described in the file COPYING, which
+# you should have received as part of this distribution. The terms
+# are also available at http://trac.edgewall.org/wiki/TracLicense.
+#
+# This software consists of voluntary contributions made by many
+# individuals. For the exact contribution history, see the revision
+# history and logs, available at http://trac.edgewall.org/log/.
+
+from distutils2.core import setup, find_packages
+
+extra = {}
+
+try:
+    import babel
+    
+    extractors = [
+        ('**.py',                'python', None),
+        ('**/templates/**.html', 'genshi', None),
+        ('**/templates/**.txt',  'genshi',
+         {'template_class': 'genshi.template:NewTextTemplate'}),
+    ]
+    extra['message_extractors'] = {
+        'trac': extractors,
+        'tracopt': extractors,
+    }
+
+    from trac.util.dist import get_l10n_js_cmdclass
+    extra['cmdclass'] = get_l10n_js_cmdclass()
+
+except ImportError, e:
+    pass
+
+setup(
+    name = 'Trac',
+    version = '0.12.1',
+    summary = 'Integrated SCM, wiki, issue tracker and project environment',
+    description = """
+Trac is a minimalistic web-based software project management and bug/issue
+tracking system. It provides an interface to the Subversion revision control
+systems, an integrated wiki, flexible issue tracking and convenient report
+facilities.
+""",
+    author = 'Edgewall Software',
+    author_email = 'info at edgewall.com',
+    license = 'BSD',
+    home_page = 'http://trac.edgewall.org/',
+    download_url = 'http://trac.edgewall.org/wiki/TracDownload',
+    classifiers = [
+        'Environment :: Web Environment',
+        'Framework :: Trac',
+        'Intended Audience :: Developers',
+        'License :: OSI Approved :: BSD License',
+        'Operating System :: OS Independent',
+        'Programming Language :: Python',
+        'Topic :: Software Development :: Bug Tracking',
+        'Topic :: Software Development :: Version Control',
+    ],
+
+    packages = find_packages(exclude=['*.tests']),
+    package_data = {
+        '': ['templates/*'],
+        'trac': ['htdocs/*.*', 'htdocs/README', 'htdocs/js/*.*',
+                 'htdocs/js/messages/*.*', 'htdocs/css/*.*',
+                 'htdocs/guide/*', 'locale/*/LC_MESSAGES/messages.mo'],
+        'trac.wiki': ['default-pages/*'],
+        'trac.ticket': ['workflows/*.ini'],
+    },
+
+    test_suite = 'trac.test.suite',
+    zip_safe = True,
+
+    requires_dist = [
+        'setuptools>=0.6b1',
+        'Genshi>=0.6',
+    ],
+    extras_require = {
+        'Babel': ['Babel>=0.9.5'],
+        'Pygments': ['Pygments>=0.6'],
+        'reST': ['docutils>=0.3'],
+        'SilverCity': ['SilverCity>=0.9.4'],
+        'Textile': ['textile>=2.0'],
+    },
+
+    entry_points = """
+        [console_scripts]
+        trac-admin = trac.admin.console:run
+        tracd = trac.web.standalone:main
+
+        [trac.plugins]
+        trac.about = trac.about
+        trac.admin.console = trac.admin.console
+        trac.admin.web_ui = trac.admin.web_ui
+        trac.attachment = trac.attachment
+        trac.db.mysql = trac.db.mysql_backend
+        trac.db.postgres = trac.db.postgres_backend
+        trac.db.sqlite = trac.db.sqlite_backend
+        trac.mimeview.patch = trac.mimeview.patch
+        trac.mimeview.pygments = trac.mimeview.pygments[Pygments]
+        trac.mimeview.rst = trac.mimeview.rst[reST]
+        trac.mimeview.silvercity = trac.mimeview.silvercity[SilverCity]
+        trac.mimeview.txtl = trac.mimeview.txtl[Textile]
+        trac.prefs = trac.prefs.web_ui
+        trac.search = trac.search.web_ui
+        trac.ticket.admin = trac.ticket.admin
+        trac.ticket.query = trac.ticket.query
+        trac.ticket.report = trac.ticket.report
+        trac.ticket.roadmap = trac.ticket.roadmap
+        trac.ticket.web_ui = trac.ticket.web_ui
+        trac.timeline = trac.timeline.web_ui
+        trac.versioncontrol.admin = trac.versioncontrol.admin
+        trac.versioncontrol.svn_authz = trac.versioncontrol.svn_authz
+        trac.versioncontrol.svn_fs = trac.versioncontrol.svn_fs
+        trac.versioncontrol.svn_prop = trac.versioncontrol.svn_prop
+        trac.versioncontrol.web_ui = trac.versioncontrol.web_ui
+        trac.web.auth = trac.web.auth
+        trac.web.session = trac.web.session
+        trac.wiki.admin = trac.wiki.admin
+        trac.wiki.interwiki = trac.wiki.interwiki
+        trac.wiki.macros = trac.wiki.macros
+        trac.wiki.web_ui = trac.wiki.web_ui
+        trac.wiki.web_api = trac.wiki.web_api
+        tracopt.mimeview.enscript = tracopt.mimeview.enscript
+        tracopt.mimeview.php = tracopt.mimeview.php
+        tracopt.perm.authz_policy = tracopt.perm.authz_policy
+        tracopt.perm.config_perm_provider = tracopt.perm.config_perm_provider
+        tracopt.ticket.commit_updater = tracopt.ticket.commit_updater
+        tracopt.ticket.deleter = tracopt.ticket.deleter
+    """,
+
+    **extra
+)
diff --git a/src/distutils2/tests/conversions/05_before.py b/src/distutils2/tests/conversions/05_before.py
new file mode 100755
--- /dev/null
+++ b/src/distutils2/tests/conversions/05_before.py
@@ -0,0 +1,137 @@
+#!/usr/bin/env python
+# -*- coding: utf-8 -*-
+#
+# Copyright (C) 2003-2009 Edgewall Software
+# All rights reserved.
+#
+# This software is licensed as described in the file COPYING, which
+# you should have received as part of this distribution. The terms
+# are also available at http://trac.edgewall.org/wiki/TracLicense.
+#
+# This software consists of voluntary contributions made by many
+# individuals. For the exact contribution history, see the revision
+# history and logs, available at http://trac.edgewall.org/log/.
+
+from setuptools import setup, find_packages
+
+extra = {}
+
+try:
+    import babel
+    
+    extractors = [
+        ('**.py',                'python', None),
+        ('**/templates/**.html', 'genshi', None),
+        ('**/templates/**.txt',  'genshi',
+         {'template_class': 'genshi.template:NewTextTemplate'}),
+    ]
+    extra['message_extractors'] = {
+        'trac': extractors,
+        'tracopt': extractors,
+    }
+
+    from trac.util.dist import get_l10n_js_cmdclass
+    extra['cmdclass'] = get_l10n_js_cmdclass()
+
+except ImportError, e:
+    pass
+
+setup(
+    name = 'Trac',
+    version = '0.12.1',
+    description = 'Integrated SCM, wiki, issue tracker and project environment',
+    long_description = """
+Trac is a minimalistic web-based software project management and bug/issue
+tracking system. It provides an interface to the Subversion revision control
+systems, an integrated wiki, flexible issue tracking and convenient report
+facilities.
+""",
+    author = 'Edgewall Software',
+    author_email = 'info at edgewall.com',
+    license = 'BSD',
+    url = 'http://trac.edgewall.org/',
+    download_url = 'http://trac.edgewall.org/wiki/TracDownload',
+    classifiers = [
+        'Environment :: Web Environment',
+        'Framework :: Trac',
+        'Intended Audience :: Developers',
+        'License :: OSI Approved :: BSD License',
+        'Operating System :: OS Independent',
+        'Programming Language :: Python',
+        'Topic :: Software Development :: Bug Tracking',
+        'Topic :: Software Development :: Version Control',
+    ],
+
+    packages = find_packages(exclude=['*.tests']),
+    package_data = {
+        '': ['templates/*'],
+        'trac': ['htdocs/*.*', 'htdocs/README', 'htdocs/js/*.*',
+                 'htdocs/js/messages/*.*', 'htdocs/css/*.*',
+                 'htdocs/guide/*', 'locale/*/LC_MESSAGES/messages.mo'],
+        'trac.wiki': ['default-pages/*'],
+        'trac.ticket': ['workflows/*.ini'],
+    },
+
+    test_suite = 'trac.test.suite',
+    zip_safe = True,
+
+    install_requires = [
+        'setuptools>=0.6b1',
+        'Genshi>=0.6',
+    ],
+    extras_require = {
+        'Babel': ['Babel>=0.9.5'],
+        'Pygments': ['Pygments>=0.6'],
+        'reST': ['docutils>=0.3'],
+        'SilverCity': ['SilverCity>=0.9.4'],
+        'Textile': ['textile>=2.0'],
+    },
+
+    entry_points = """
+        [console_scripts]
+        trac-admin = trac.admin.console:run
+        tracd = trac.web.standalone:main
+
+        [trac.plugins]
+        trac.about = trac.about
+        trac.admin.console = trac.admin.console
+        trac.admin.web_ui = trac.admin.web_ui
+        trac.attachment = trac.attachment
+        trac.db.mysql = trac.db.mysql_backend
+        trac.db.postgres = trac.db.postgres_backend
+        trac.db.sqlite = trac.db.sqlite_backend
+        trac.mimeview.patch = trac.mimeview.patch
+        trac.mimeview.pygments = trac.mimeview.pygments[Pygments]
+        trac.mimeview.rst = trac.mimeview.rst[reST]
+        trac.mimeview.silvercity = trac.mimeview.silvercity[SilverCity]
+        trac.mimeview.txtl = trac.mimeview.txtl[Textile]
+        trac.prefs = trac.prefs.web_ui
+        trac.search = trac.search.web_ui
+        trac.ticket.admin = trac.ticket.admin
+        trac.ticket.query = trac.ticket.query
+        trac.ticket.report = trac.ticket.report
+        trac.ticket.roadmap = trac.ticket.roadmap
+        trac.ticket.web_ui = trac.ticket.web_ui
+        trac.timeline = trac.timeline.web_ui
+        trac.versioncontrol.admin = trac.versioncontrol.admin
+        trac.versioncontrol.svn_authz = trac.versioncontrol.svn_authz
+        trac.versioncontrol.svn_fs = trac.versioncontrol.svn_fs
+        trac.versioncontrol.svn_prop = trac.versioncontrol.svn_prop
+        trac.versioncontrol.web_ui = trac.versioncontrol.web_ui
+        trac.web.auth = trac.web.auth
+        trac.web.session = trac.web.session
+        trac.wiki.admin = trac.wiki.admin
+        trac.wiki.interwiki = trac.wiki.interwiki
+        trac.wiki.macros = trac.wiki.macros
+        trac.wiki.web_ui = trac.wiki.web_ui
+        trac.wiki.web_api = trac.wiki.web_api
+        tracopt.mimeview.enscript = tracopt.mimeview.enscript
+        tracopt.mimeview.php = tracopt.mimeview.php
+        tracopt.perm.authz_policy = tracopt.perm.authz_policy
+        tracopt.perm.config_perm_provider = tracopt.perm.config_perm_provider
+        tracopt.ticket.commit_updater = tracopt.ticket.commit_updater
+        tracopt.ticket.deleter = tracopt.ticket.deleter
+    """,
+
+    **extra
+)
diff --git a/src/distutils2/tests/pypi_server.py b/src/distutils2/tests/pypi_server.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypi_server.py
@@ -0,0 +1,195 @@
+"""Mocked PyPI Server implementation, to use in tests.
+
+This module also provides a simple test case to extend if you need to use
+the PyPIServer throughout your test case. Be sure to read the documentation
+before use.
+"""
+
+import Queue
+import threading
+import time
+import urllib2
+from BaseHTTPServer import HTTPServer
+from SimpleHTTPServer import SimpleHTTPRequestHandler
+import os.path
+import select
+
+from distutils2.tests.support import unittest
+
+PYPI_DEFAULT_STATIC_PATH = os.path.dirname(os.path.abspath(__file__)) + "/pypiserver"
+
+def use_pypi_server(*server_args, **server_kwargs):
+    """Decorator that runs a PyPIServer for a single test method,
+    rather than for the entire duration of the test case.
+    """
+    def wrapper(func):
+        def wrapped(*args, **kwargs):
+            server = PyPIServer(*server_args, **server_kwargs)
+            server.start()
+            try:
+                func(server=server, *args, **kwargs)
+            finally:
+                server.stop()
+        return wrapped
+    return wrapper
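`use_pypi_server` is the usual start/try/finally/stop pattern packaged as a decorator factory. Reduced to a generic standalone sketch (the `with_resource`/`FakeServer` names here are illustrative, not from the patch):

```python
def with_resource(make_resource):
    # Decorator factory: build and start a resource, pass it to the
    # wrapped function, and guarantee stop() runs even if the call raises.
    def wrapper(func):
        def wrapped(*args, **kwargs):
            resource = make_resource()
            resource.start()
            try:
                return func(resource, *args, **kwargs)
            finally:
                resource.stop()
        return wrapped
    return wrapper
```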
+
+class PyPIServerTestCase(unittest.TestCase):
+
+    def setUp(self):
+        super(PyPIServerTestCase, self).setUp()
+        self.pypi = PyPIServer()
+        self.pypi.start()
+
+    def tearDown(self):
+        super(PyPIServerTestCase, self).tearDown()
+        self.pypi.stop()
+
+class PyPIServer(threading.Thread):
+    """Mocked PyPI server.
+
+    Provides a mocked version of the PyPI APIs, to ease tests.
+    Supports serving static content and replaying previously given text.
+    """
+
+    def __init__(self, test_static_path=None,
+                 static_filesystem_paths=["default"], static_uri_paths=["simple"]):
+        """Initialize the server.
+
+        static_uri_paths and static_filesystem_paths give, respectively,
+        the HTTP paths to serve statically and where to find the matching
+        files on the filesystem.
+        """
+        threading.Thread.__init__(self)
+        self._run = True
+        self.httpd = HTTPServer(('', 0), PyPIRequestHandler)
+        self.httpd.RequestHandlerClass.log_request = lambda *_: None
+        self.httpd.RequestHandlerClass.pypi_server = self
+        self.address = (self.httpd.server_name, self.httpd.server_port)
+        self.request_queue = Queue.Queue()
+        self._requests = []
+        self.default_response_status = 200
+        self.default_response_headers = [('Content-type', 'text/plain')]
+        self.default_response_data = "hello"
+        
+        # initialize static paths / filesystems
+        self.static_uri_paths = static_uri_paths
+        if test_static_path is not None:
+            static_filesystem_paths.append(test_static_path)
+        self.static_filesystem_paths = [PYPI_DEFAULT_STATIC_PATH + "/" + path
+            for path in static_filesystem_paths]
+
+    def run(self):
+        # loop because we can't stop it otherwise, for python < 2.6
+        while self._run:
+            r, w, e = select.select([self.httpd], [], [], 0.5)
+            if r:
+                self.httpd.handle_request()
+
+    def stop(self):
+        """Stop the serving loop.
+
+        HTTPServer.shutdown is not available in Python < 2.6, so run()
+        polls self._run instead.
+        """
+        self._run = False
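The `run`/`stop` pair works because `select` is called with a timeout, so the thread re-checks `self._run` at least twice a second instead of blocking forever in `handle_request`. One poll iteration of that loop, sketched against a plain socket (a standalone illustration, not patch code):

```python
import select
import socket

def readable(sock, timeout=0.5):
    # One iteration of the run() loop's poll: wait up to `timeout`
    # seconds for the socket to become readable, then give control back.
    r, _, _ = select.select([sock], [], [], timeout)
    return bool(r)
```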
+
+    def get_next_response(self):
+        return (self.default_response_status,
+                self.default_response_headers,
+                self.default_response_data)
+
+    @property
+    def requests(self):
+        """Use this property to get all requests that have been made
+        to the server.
+        """
+        while True:
+            try:
+                self._requests.append(self.request_queue.get_nowait())
+            except Queue.Empty:
+                break
+        return self._requests
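The `requests` property drains the thread-safe queue without blocking: call `get_nowait` until `Queue.Empty` is raised. The same idiom in a standalone sketch using the Python 3 `queue` module (the patch itself targets Python 2's `Queue`):

```python
import queue

def drain(q):
    # Pull everything currently in the queue without blocking,
    # stopping at the first Empty, as the `requests` property does.
    items = []
    while True:
        try:
            items.append(q.get_nowait())
        except queue.Empty:
            break
    return items
```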
+
+    @property
+    def full_address(self):
+        return "http://%s:%s" % self.address
+
+
+class PyPIRequestHandler(SimpleHTTPRequestHandler):
+    # we need to access the pypi server while serving the content
+    pypi_server = None
+
+    def do_POST(self):
+        return self.serve_request()
+    def do_GET(self):
+        return self.serve_request()
+    def do_DELETE(self):
+        return self.serve_request()
+    def do_PUT(self):
+        return self.serve_request()
+
+    def serve_request(self):
+        """Serve the content.
+
+        Also record the requests so they can be accessed later. If the
+        requested URL matches a static URI, serve static content; otherwise
+        serve whatever the `get_next_response` method provides.
+        """
+        # record the request. Read the input only on PUT or POST requests
+        if self.command in ("PUT", "POST"):
+            if 'content-length' in self.headers.dict:
+                request_data = self.rfile.read(
+                    int(self.headers['content-length']))
+            else:
+                request_data = self.rfile.read()
+        elif self.command in ("GET", "DELETE"):
+            request_data = ''
+
+        self.pypi_server.request_queue.put((self, request_data))
+
+        # serve the content from the local disk if the requested URL
+        # begins with a pattern defined in `static_uri_paths`
+        url_parts = self.path.split("/")
+        if (len(url_parts) > 1 and 
+                url_parts[1] in self.pypi_server.static_uri_paths):
+            data = None
+            # search the most recently added filesystem paths first
+            fs_paths = []
+            fs_paths.extend(self.pypi_server.static_filesystem_paths)
+            fs_paths.reverse()
+            relative_path = self.path
+            for fs_path in fs_paths:
+                try:
+                    if self.path.endswith("/"):
+                        relative_path += "index.html"
+                    file = open(fs_path + relative_path)
+                    data = file.read()
+                    if relative_path.endswith('.tar.gz'):
+                        headers=[('Content-type', 'application/x-gtar')]
+                    else:
+                        headers=[('Content-type', 'text/html')]
+                    self.make_response(data, headers=headers)
+                except IOError:
+                    pass
+
+            if data is None:
+                self.make_response("Not found", 404)
+
+        # otherwise serve the content from get_next_response
+        else:
+            # send back a response
+            status, headers, data = self.pypi_server.get_next_response()
+            self.make_response(data, status, headers)
+
+    def make_response(self, data, status=200,
+                      headers=[('Content-type', 'text/html')]):
+        """Send the response to the HTTP client"""
+        if not isinstance(status, int):
+            try:
+                status = int(status)
+            except ValueError:
+                # we probably got a status line such as "404 Not Found";
+                # just keep the first three digits
+                status = int(status[:3])
+
+        self.send_response(status)
+        for header, value in headers:
+            self.send_header(header, value)
+        self.end_headers()
+        self.wfile.write(data)
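`make_response` accepts the status as an int, a digit string, or a full status line, and coerces it to an int code. That coercion in isolation (a sketch; the function name is illustrative):

```python
def parse_status(status):
    # Coerce 200, "200", or "200 OK" into an int status code,
    # falling back to the first three digits as make_response does.
    if isinstance(status, int):
        return status
    try:
        return int(status)
    except ValueError:
        return int(status[:3])
```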
diff --git a/src/distutils2/tests/pypiserver/downloads_with_md5/simple/badmd5/badmd5-0.1.tar.gz b/src/distutils2/tests/pypiserver/downloads_with_md5/simple/badmd5/badmd5-0.1.tar.gz
new file mode 100644
diff --git a/src/distutils2/tests/pypiserver/downloads_with_md5/simple/badmd5/index.html b/src/distutils2/tests/pypiserver/downloads_with_md5/simple/badmd5/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/downloads_with_md5/simple/badmd5/index.html
@@ -0,0 +1,3 @@
+<html><body>
+<a href="badmd5-0.1.tar.gz#md5=3e3d86693d6564c807272b11b3069dfe" rel="download">badmd5-0.1.tar.gz</a><br/>
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/downloads_with_md5/simple/foobar/foobar-0.1.tar.gz b/src/distutils2/tests/pypiserver/downloads_with_md5/simple/foobar/foobar-0.1.tar.gz
new file mode 100644
diff --git a/src/distutils2/tests/pypiserver/downloads_with_md5/simple/foobar/index.html b/src/distutils2/tests/pypiserver/downloads_with_md5/simple/foobar/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/downloads_with_md5/simple/foobar/index.html
@@ -0,0 +1,3 @@
+<html><body>
+<a href="foobar-0.1.tar.gz#md5=d41d8cd98f00b204e9800998ecf8427e" rel="download">foobar-0.1.tar.gz</a><br/>
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/downloads_with_md5/simple/index.html b/src/distutils2/tests/pypiserver/downloads_with_md5/simple/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/downloads_with_md5/simple/index.html
@@ -0,0 +1,2 @@
+<a href="foobar/">foobar/</a> 
+<a href="badmd5/">badmd5/</a> 
diff --git a/src/distutils2/tests/pypiserver/foo_bar_baz/simple/bar/index.html b/src/distutils2/tests/pypiserver/foo_bar_baz/simple/bar/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/foo_bar_baz/simple/bar/index.html
@@ -0,0 +1,6 @@
+<html><head><title>Links for bar</title></head><body><h1>Links for bar</h1>
+<a rel="download" href="../../packages/source/F/bar/bar-1.0.tar.gz">bar-1.0.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/bar/bar-1.0.1.tar.gz">bar-1.0.1.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/bar/bar-2.0.tar.gz">bar-2.0.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/bar/bar-2.0.1.tar.gz">bar-2.0.1.tar.gz</a><br/> 
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/foo_bar_baz/simple/baz/index.html b/src/distutils2/tests/pypiserver/foo_bar_baz/simple/baz/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/foo_bar_baz/simple/baz/index.html
@@ -0,0 +1,6 @@
+<html><head><title>Links for baz</title></head><body><h1>Links for baz</h1>
+<a rel="download" href="../../packages/source/F/baz/baz-1.0.tar.gz">baz-1.0.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/baz/baz-1.0.1.tar.gz">baz-1.0.1.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/baz/baz-2.0.tar.gz">baz-2.0.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/baz/baz-2.0.1.tar.gz">baz-2.0.1.tar.gz</a><br/> 
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/foo_bar_baz/simple/foo/index.html b/src/distutils2/tests/pypiserver/foo_bar_baz/simple/foo/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/foo_bar_baz/simple/foo/index.html
@@ -0,0 +1,6 @@
+<html><head><title>Links for foo</title></head><body><h1>Links for foo</h1>
+<a rel="download" href="../../packages/source/F/foo/foo-1.0.tar.gz">foo-1.0.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/foo/foo-1.0.1.tar.gz">foo-1.0.1.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/foo/foo-2.0.tar.gz">foo-2.0.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/foo/foo-2.0.1.tar.gz">foo-2.0.1.tar.gz</a><br/> 
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/foo_bar_baz/simple/index.html b/src/distutils2/tests/pypiserver/foo_bar_baz/simple/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/foo_bar_baz/simple/index.html
@@ -0,0 +1,3 @@
+<a href="foo/">foo/</a> 
+<a href="bar/">bar/</a> 
+<a href="baz/">baz/</a> 
diff --git a/src/distutils2/tests/pypiserver/test_found_links/simple/foobar/index.html b/src/distutils2/tests/pypiserver/test_found_links/simple/foobar/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/test_found_links/simple/foobar/index.html
@@ -0,0 +1,6 @@
+<html><head><title>Links for Foobar</title></head><body><h1>Links for Foobar</h1>
+<a rel="download" href="../../packages/source/F/Foobar/Foobar-1.0.tar.gz#md5=98fa833fdabcdd78d00245aead66c174">Foobar-1.0.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/Foobar/Foobar-1.0.1.tar.gz#md5=2351efb20f6b7b5d9ce80fa4cb1bd9ca">Foobar-1.0.1.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/Foobar/Foobar-2.0.tar.gz#md5=98fa833fdabcdd78d00245aead66c274">Foobar-2.0.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/Foobar/Foobar-2.0.1.tar.gz#md5=2352efb20f6b7b5d9ce80fa4cb2bd9ca">Foobar-2.0.1.tar.gz</a><br/> 
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/test_found_links/simple/index.html b/src/distutils2/tests/pypiserver/test_found_links/simple/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/test_found_links/simple/index.html
@@ -0,0 +1,1 @@
+<a href="foobar/">foobar/</a> 
diff --git a/src/distutils2/tests/pypiserver/test_pypi_server/external/index.html b/src/distutils2/tests/pypiserver/test_pypi_server/external/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/test_pypi_server/external/index.html
@@ -0,0 +1,1 @@
+index.html from external server
diff --git a/src/distutils2/tests/pypiserver/test_pypi_server/simple/index.html b/src/distutils2/tests/pypiserver/test_pypi_server/simple/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/test_pypi_server/simple/index.html
@@ -0,0 +1,1 @@
+Yeah
diff --git a/src/distutils2/tests/pypiserver/with_externals/external/external.html b/src/distutils2/tests/pypiserver/with_externals/external/external.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/with_externals/external/external.html
@@ -0,0 +1,3 @@
+<html><body>
+<a href="/foobar-0.1.tar.gz#md5=1__bad_md5___">bad old link</a>
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/with_externals/simple/foobar/index.html b/src/distutils2/tests/pypiserver/with_externals/simple/foobar/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/with_externals/simple/foobar/index.html
@@ -0,0 +1,4 @@
+<html><body>
+<a rel="download" href="/foobar-0.1.tar.gz#md5=12345678901234567">foobar-0.1.tar.gz</a><br/>
+<a href="../../external/external.html" rel="homepage">external homepage</a><br/>
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/with_externals/simple/index.html b/src/distutils2/tests/pypiserver/with_externals/simple/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/with_externals/simple/index.html
@@ -0,0 +1,1 @@
+<a href="foobar/">foobar/</a> 
diff --git a/src/distutils2/tests/pypiserver/with_norel_links/external/homepage.html b/src/distutils2/tests/pypiserver/with_norel_links/external/homepage.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/with_norel_links/external/homepage.html
@@ -0,0 +1,7 @@
+<html>
+<body>
+<p>a rel=homepage HTML page</p>
+<a href="/foobar-2.0.tar.gz">foobar 2.0</a>
+</body>
+</html>
+
diff --git a/src/distutils2/tests/pypiserver/with_norel_links/external/nonrel.html b/src/distutils2/tests/pypiserver/with_norel_links/external/nonrel.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/with_norel_links/external/nonrel.html
@@ -0,0 +1,1 @@
+A page linked without rel="download" or rel="homepage" link.
diff --git a/src/distutils2/tests/pypiserver/with_norel_links/simple/foobar/index.html b/src/distutils2/tests/pypiserver/with_norel_links/simple/foobar/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/with_norel_links/simple/foobar/index.html
@@ -0,0 +1,6 @@
+<html><body>
+<a rel="download" href="/foobar-0.1.tar.gz">foobar-0.1.tar.gz</a><br/>
+<a href="../../external/homepage.html" rel="homepage">external homepage</a><br/>
+<a href="../../external/nonrel.html">unrelated link</a><br/>
+<a href="/unrelated-0.2.tar.gz">unrelated download</a><br/>
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/with_norel_links/simple/index.html b/src/distutils2/tests/pypiserver/with_norel_links/simple/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/with_norel_links/simple/index.html
@@ -0,0 +1,1 @@
+<a href="foobar/">foobar/</a> 
diff --git a/src/distutils2/tests/pypiserver/with_real_externals/simple/foobar/index.html b/src/distutils2/tests/pypiserver/with_real_externals/simple/foobar/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/with_real_externals/simple/foobar/index.html
@@ -0,0 +1,4 @@
+<html><body>
+<a rel="download" href="/foobar-0.1.tar.gz#md5=0_correct_md5">foobar-0.1.tar.gz</a><br/>
+<a href="http://a-really-external-website/external/external.html" rel="homepage">external homepage</a><br/>
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/with_real_externals/simple/index.html b/src/distutils2/tests/pypiserver/with_real_externals/simple/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/with_real_externals/simple/index.html
@@ -0,0 +1,1 @@
+<a href="foobar/">foobar/</a> 
diff --git a/src/distutils2/tests/support.py b/src/distutils2/tests/support.py
--- a/src/distutils2/tests/support.py
+++ b/src/distutils2/tests/support.py
@@ -14,7 +14,6 @@
 
 from distutils2 import log
 from distutils2.log import DEBUG, INFO, WARN, ERROR, FATAL
-from distutils2.core import Distribution
 
 if sys.version_info >= (2, 7):
     # improved unittest package from 2.7's standard library
@@ -42,7 +41,7 @@
 
     def _log(self, level, msg, args):
         if level not in (DEBUG, INFO, WARN, ERROR, FATAL):
-            raise ValueError('%s wrong log level' % str(level))
+            raise ValueError('%s wrong log level' % level)
         self.logs.append((level, msg, args))
 
     def get_logs(self, *levels):
@@ -65,12 +64,22 @@
     def setUp(self):
         super(TempdirManager, self).setUp()
         self.tempdirs = []
+        self.tempfiles = []
 
     def tearDown(self):
         super(TempdirManager, self).tearDown()
         while self.tempdirs:
             d = self.tempdirs.pop()
             shutil.rmtree(d, os.name in ('nt', 'cygwin'))
+        for file_ in self.tempfiles:
+            if os.path.exists(file_):
+                os.remove(file_)
+
+    def mktempfile(self):
+        """Create a temporary file that will be cleaned up."""
+        tempfile_ = tempfile.NamedTemporaryFile()
+        self.tempfiles.append(tempfile_.name)
+        return tempfile_
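The new `mktempfile` helper records each `NamedTemporaryFile`'s name so `tearDown` can remove anything left behind. The same bookkeeping as a standalone sketch (this illustration uses `delete=False` so the file outlives its handle, which differs from the patch's default `NamedTemporaryFile()`):

```python
import os
import tempfile

class TempfileTracker:
    # Track created temp files and remove any that still exist,
    # mirroring the cleanup added to TempdirManager.tearDown.
    def __init__(self):
        self.tempfiles = []

    def mktempfile(self):
        handle = tempfile.NamedTemporaryFile(delete=False)
        self.tempfiles.append(handle.name)
        return handle

    def cleanup(self):
        for name in self.tempfiles:
            if os.path.exists(name):
                os.remove(name)
```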
 
     def mkdtemp(self):
         """Create a temporary directory that will be cleaned up.
@@ -105,6 +114,7 @@
         It returns the package directory and the distribution
         instance.
         """
+        from distutils2.dist import Distribution
         tmp_dir = self.mkdtemp()
         pkg_dir = os.path.join(tmp_dir, pkg_name)
         os.mkdir(pkg_dir)
diff --git a/src/distutils2/tests/test_Mixin2to3.py b/src/distutils2/tests/test_Mixin2to3.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/test_Mixin2to3.py
@@ -0,0 +1,54 @@
+"""Tests for the Mixin2to3 class in distutils2.command.build_py."""
+import sys
+import tempfile
+
+import distutils2
+from distutils2.tests import support
+from distutils2.tests.support import unittest
+from distutils2.command.build_py import Mixin2to3
+
+
+class Mixin2to3TestCase(support.TempdirManager, unittest.TestCase):
+
+    @unittest.skipUnless(sys.version_info >= (2, 6), 'Need >= 2.6')
+    def test_convert_code_only(self):
+        # check that code gets converted properly
+        code_content = "print 'test'\n"
+        code_handle = self.mktempfile()
+        code_name = code_handle.name
+
+        code_handle.write(code_content)
+        code_handle.flush()
+
+        mixin2to3 = Mixin2to3()
+        mixin2to3._run_2to3([code_name])
+        converted_code_content = "print('test')\n"
+        new_code_content = "".join(open(code_name).readlines())
+
+        self.assertEqual(new_code_content, converted_code_content)
+
+    @unittest.skipUnless(sys.version_info >= (2, 6), 'Need >= 2.6')
+    def test_doctests_only(self):
+        # check that doctests get converted properly
+        doctest_content = '"""\n>>> print test\ntest\n"""\nprint test\n\n'
+        doctest_handle = self.mktempfile()
+        doctest_name = doctest_handle.name
+
+        doctest_handle.write(doctest_content)
+        doctest_handle.flush()
+
+        mixin2to3 = Mixin2to3()
+        mixin2to3._run_2to3([doctest_name])
+
+        converted_doctest_content = ['"""', '>>> print(test)', 'test', '"""',
+                                     'print(test)', '', '', '']
+        converted_doctest_content = '\n'.join(converted_doctest_content)
+        new_doctest_content = "".join(open(doctest_name).readlines())
+
+        self.assertEqual(new_doctest_content, converted_doctest_content)
+
+def test_suite():
+    return unittest.makeSuite(Mixin2to3TestCase)
+
+if __name__ == "__main__":
+    unittest.main(defaultTest="test_suite")
diff --git a/src/distutils2/tests/test_bdist.py b/src/distutils2/tests/test_bdist.py
--- a/src/distutils2/tests/test_bdist.py
+++ b/src/distutils2/tests/test_bdist.py
@@ -1,8 +1,6 @@
 """Tests for distutils.command.bdist."""
 import sys
 import os
-import tempfile
-import shutil
 
 from distutils2.tests import run_unittest
 
@@ -25,7 +23,7 @@
         cmd = bdist(dist)
         cmd.formats = ['msi']
         cmd.ensure_finalized()
-        self.assertEquals(cmd.formats, ['msi'])
+        self.assertEqual(cmd.formats, ['msi'])
 
         # what format bdist offers ?
         # XXX an explicit list in bdist is
@@ -33,9 +31,9 @@
         # we should add a registry
         formats = ['zip', 'gztar', 'bztar', 'ztar', 'tar', 'wininst', 'msi']
         formats.sort()
-        founded = cmd.format_command.keys()
-        founded.sort()
-        self.assertEquals(founded, formats)
+        found = cmd.format_command.keys()
+        found.sort()
+        self.assertEqual(found, formats)
 
 def test_suite():
     return unittest.makeSuite(BuildTestCase)
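The `founded`/`found` hunk above sorts `cmd.format_command.keys()` in place, which only works while `.keys()` returns a list; under Python 3, dict views have no `.sort()` method, so `sorted()` is the portable spelling of the same comparison (`format_command` below is a stand-in dict, not the real bdist registry):

```python
# Comparing a dict's keys against an expected list: list.sort() on
# .keys() works in Python 2, but sorted() works on both lists and views.
format_command = {'zip': None, 'gztar': None, 'bztar': None, 'ztar': None,
                  'tar': None, 'wininst': None, 'msi': None}

formats = sorted(['zip', 'gztar', 'bztar', 'ztar', 'tar', 'wininst', 'msi'])
found = sorted(format_command)  # iterating a dict yields its keys
```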
diff --git a/src/distutils2/tests/test_bdist_dumb.py b/src/distutils2/tests/test_bdist_dumb.py
--- a/src/distutils2/tests/test_bdist_dumb.py
+++ b/src/distutils2/tests/test_bdist_dumb.py
@@ -78,7 +78,7 @@
             base = base.replace(':', '-')
 
         wanted = ['%s.zip' % base]
-        self.assertEquals(dist_created, wanted)
+        self.assertEqual(dist_created, wanted)
 
         # now let's check what we have in the zip file
         # XXX to be done
@@ -87,16 +87,16 @@
         pkg_dir, dist = self.create_dist()
         os.chdir(pkg_dir)
         cmd = bdist_dumb(dist)
-        self.assertEquals(cmd.bdist_dir, None)
+        self.assertEqual(cmd.bdist_dir, None)
         cmd.finalize_options()
 
         # bdist_dir is initialized to bdist_base/dumb if not set
         base = cmd.get_finalized_command('bdist').bdist_base
-        self.assertEquals(cmd.bdist_dir, os.path.join(base, 'dumb'))
+        self.assertEqual(cmd.bdist_dir, os.path.join(base, 'dumb'))
 
         # the format is set to a default value depending on the os.name
         default = cmd.default_format[os.name]
-        self.assertEquals(cmd.format, default)
+        self.assertEqual(cmd.format, default)
 
 def test_suite():
     return unittest.makeSuite(BuildDumbTestCase)
diff --git a/src/distutils2/tests/test_build.py b/src/distutils2/tests/test_build.py
--- a/src/distutils2/tests/test_build.py
+++ b/src/distutils2/tests/test_build.py
@@ -20,11 +20,11 @@
         cmd.finalize_options()
 
         # if not specified, plat_name gets the current platform
-        self.assertEquals(cmd.plat_name, get_platform())
+        self.assertEqual(cmd.plat_name, get_platform())
 
         # build_purelib is build + lib
         wanted = os.path.join(cmd.build_base, 'lib')
-        self.assertEquals(cmd.build_purelib, wanted)
+        self.assertEqual(cmd.build_purelib, wanted)
 
         # build_platlib is 'build/lib.platform-x.x[-pydebug]'
         # examples:
@@ -34,21 +34,21 @@
             self.assertTrue(cmd.build_platlib.endswith('-pydebug'))
             plat_spec += '-pydebug'
         wanted = os.path.join(cmd.build_base, 'lib' + plat_spec)
-        self.assertEquals(cmd.build_platlib, wanted)
+        self.assertEqual(cmd.build_platlib, wanted)
 
         # by default, build_lib = build_purelib
-        self.assertEquals(cmd.build_lib, cmd.build_purelib)
+        self.assertEqual(cmd.build_lib, cmd.build_purelib)
 
         # build_temp is build/temp.<plat>
         wanted = os.path.join(cmd.build_base, 'temp' + plat_spec)
-        self.assertEquals(cmd.build_temp, wanted)
+        self.assertEqual(cmd.build_temp, wanted)
 
         # build_scripts is build/scripts-x.x
         wanted = os.path.join(cmd.build_base, 'scripts-' +  sys.version[0:3])
-        self.assertEquals(cmd.build_scripts, wanted)
+        self.assertEqual(cmd.build_scripts, wanted)
 
         # executable is os.path.normpath(sys.executable)
-        self.assertEquals(cmd.executable, os.path.normpath(sys.executable))
+        self.assertEqual(cmd.executable, os.path.normpath(sys.executable))
 
 def test_suite():
     return unittest.makeSuite(BuildTestCase)
diff --git a/src/distutils2/tests/test_build_clib.py b/src/distutils2/tests/test_build_clib.py
--- a/src/distutils2/tests/test_build_clib.py
+++ b/src/distutils2/tests/test_build_clib.py
@@ -55,14 +55,14 @@
         self.assertRaises(DistutilsSetupError, cmd.get_source_files)
 
         cmd.libraries = [('name', {'sources': ['a', 'b']})]
-        self.assertEquals(cmd.get_source_files(), ['a', 'b'])
+        self.assertEqual(cmd.get_source_files(), ['a', 'b'])
 
         cmd.libraries = [('name', {'sources': ('a', 'b')})]
-        self.assertEquals(cmd.get_source_files(), ['a', 'b'])
+        self.assertEqual(cmd.get_source_files(), ['a', 'b'])
 
         cmd.libraries = [('name', {'sources': ('a', 'b')}),
                          ('name2', {'sources': ['c', 'd']})]
-        self.assertEquals(cmd.get_source_files(), ['a', 'b', 'c', 'd'])
+        self.assertEqual(cmd.get_source_files(), ['a', 'b', 'c', 'd'])
 
     def test_build_libraries(self):
 
@@ -91,11 +91,11 @@
 
         cmd.include_dirs = 'one-dir'
         cmd.finalize_options()
-        self.assertEquals(cmd.include_dirs, ['one-dir'])
+        self.assertEqual(cmd.include_dirs, ['one-dir'])
 
         cmd.include_dirs = None
         cmd.finalize_options()
-        self.assertEquals(cmd.include_dirs, [])
+        self.assertEqual(cmd.include_dirs, [])
 
         cmd.distribution.libraries = 'WONTWORK'
         self.assertRaises(DistutilsSetupError, cmd.finalize_options)
diff --git a/src/distutils2/tests/test_build_ext.py b/src/distutils2/tests/test_build_ext.py
--- a/src/distutils2/tests/test_build_ext.py
+++ b/src/distutils2/tests/test_build_ext.py
@@ -1,6 +1,5 @@
 import sys
 import os
-import tempfile
 import shutil
 from StringIO import StringIO
 import warnings
@@ -81,11 +80,11 @@
         for attr in ('error', 'foo', 'new', 'roj'):
             self.assertTrue(hasattr(xx, attr))
 
-        self.assertEquals(xx.foo(2, 5), 7)
-        self.assertEquals(xx.foo(13,15), 28)
-        self.assertEquals(xx.new().demo(), None)
+        self.assertEqual(xx.foo(2, 5), 7)
+        self.assertEqual(xx.foo(13,15), 28)
+        self.assertEqual(xx.new().demo(), None)
         doc = 'This is a template module just for instruction.'
-        self.assertEquals(xx.__doc__, doc)
+        self.assertEqual(xx.__doc__, doc)
         self.assertTrue(isinstance(xx.Null(), xx.Null))
         self.assertTrue(isinstance(xx.Str(), xx.Str))
 
@@ -195,7 +194,7 @@
         cmd = build_ext(dist)
         cmd.libraries = 'my_lib'
         cmd.finalize_options()
-        self.assertEquals(cmd.libraries, ['my_lib'])
+        self.assertEqual(cmd.libraries, ['my_lib'])
 
         # make sure cmd.library_dirs is turned into a list
         # if it's a string
@@ -209,7 +208,7 @@
         cmd = build_ext(dist)
         cmd.rpath = os.pathsep.join(['one', 'two'])
         cmd.finalize_options()
-        self.assertEquals(cmd.rpath, ['one', 'two'])
+        self.assertEqual(cmd.rpath, ['one', 'two'])
 
         # XXX more tests to perform for win32
 
@@ -218,79 +217,32 @@
         cmd = build_ext(dist)
         cmd.define = 'one,two'
         cmd.finalize_options()
-        self.assertEquals(cmd.define, [('one', '1'), ('two', '1')])
+        self.assertEqual(cmd.define, [('one', '1'), ('two', '1')])
 
         # make sure undef is turned into a list of
         # strings if they are ','-separated strings
         cmd = build_ext(dist)
         cmd.undef = 'one,two'
         cmd.finalize_options()
-        self.assertEquals(cmd.undef, ['one', 'two'])
+        self.assertEqual(cmd.undef, ['one', 'two'])
 
         # make sure swig_opts is turned into a list
         cmd = build_ext(dist)
         cmd.swig_opts = None
         cmd.finalize_options()
-        self.assertEquals(cmd.swig_opts, [])
+        self.assertEqual(cmd.swig_opts, [])
 
         cmd = build_ext(dist)
         cmd.swig_opts = '1 2'
         cmd.finalize_options()
-        self.assertEquals(cmd.swig_opts, ['1', '2'])
-
-    def test_check_extensions_list(self):
-        dist = Distribution()
-        cmd = build_ext(dist)
-        cmd.finalize_options()
-
-        #'extensions' option must be a list of Extension instances
-        self.assertRaises(DistutilsSetupError, cmd.check_extensions_list, 'foo')
-
-        # each element of 'ext_modules' option must be an
-        # Extension instance or 2-tuple
-        exts = [('bar', 'foo', 'bar'), 'foo']
-        self.assertRaises(DistutilsSetupError, cmd.check_extensions_list, exts)
-
-        # first element of each tuple in 'ext_modules'
-        # must be the extension name (a string) and match
-        # a python dotted-separated name
-        exts = [('foo-bar', '')]
-        self.assertRaises(DistutilsSetupError, cmd.check_extensions_list, exts)
-
-        # second element of each tuple in 'ext_modules'
-        # must be a ary (build info)
-        exts = [('foo.bar', '')]
-        self.assertRaises(DistutilsSetupError, cmd.check_extensions_list, exts)
-
-        # ok this one should pass
-        exts = [('foo.bar', {'sources': [''], 'libraries': 'foo',
-                             'some': 'bar'})]
-        cmd.check_extensions_list(exts)
-        ext = exts[0]
-        self.assertTrue(isinstance(ext, Extension))
-
-        # check_extensions_list adds in ext the values passed
-        # when they are in ('include_dirs', 'library_dirs', 'libraries'
-        # 'extra_objects', 'extra_compile_args', 'extra_link_args')
-        self.assertEquals(ext.libraries, 'foo')
-        self.assertTrue(not hasattr(ext, 'some'))
-
-        # 'macros' element of build info dict must be 1- or 2-tuple
-        exts = [('foo.bar', {'sources': [''], 'libraries': 'foo',
-                'some': 'bar', 'macros': [('1', '2', '3'), 'foo']})]
-        self.assertRaises(DistutilsSetupError, cmd.check_extensions_list, exts)
-
-        exts[0][1]['macros'] = [('1', '2'), ('3',)]
-        cmd.check_extensions_list(exts)
-        self.assertEquals(exts[0].undef_macros, ['3'])
-        self.assertEquals(exts[0].define_macros, [('1', '2')])
+        self.assertEqual(cmd.swig_opts, ['1', '2'])
 
     def test_get_source_files(self):
         modules = [Extension('foo', ['xxx'], optional=False)]
         dist = Distribution({'name': 'xx', 'ext_modules': modules})
         cmd = build_ext(dist)
         cmd.ensure_finalized()
-        self.assertEquals(cmd.get_source_files(), ['xxx'])
+        self.assertEqual(cmd.get_source_files(), ['xxx'])
 
     def test_compiler_option(self):
         # cmd.compiler is an option and
@@ -301,7 +253,7 @@
         cmd.compiler = 'unix'
         cmd.ensure_finalized()
         cmd.run()
-        self.assertEquals(cmd.compiler, 'unix')
+        self.assertEqual(cmd.compiler, 'unix')
 
     def test_get_outputs(self):
         tmp_dir = self.mkdtemp()
@@ -312,7 +264,7 @@
                              'ext_modules': [ext]})
         cmd = build_ext(dist)
         cmd.ensure_finalized()
-        self.assertEquals(len(cmd.get_outputs()), 1)
+        self.assertEqual(len(cmd.get_outputs()), 1)
 
         if os.name == "nt":
             cmd.debug = sys.executable.endswith("_d.exe")
@@ -332,19 +284,19 @@
         finally:
             os.chdir(old_wd)
         self.assertTrue(os.path.exists(so_file))
-        self.assertEquals(os.path.splitext(so_file)[-1],
+        self.assertEqual(os.path.splitext(so_file)[-1],
                           sysconfig.get_config_var('SO'))
         so_dir = os.path.dirname(so_file)
-        self.assertEquals(so_dir, other_tmp_dir)
+        self.assertEqual(so_dir, other_tmp_dir)
 
         cmd.inplace = 0
         cmd.run()
         so_file = cmd.get_outputs()[0]
         self.assertTrue(os.path.exists(so_file))
-        self.assertEquals(os.path.splitext(so_file)[-1],
+        self.assertEqual(os.path.splitext(so_file)[-1],
                           sysconfig.get_config_var('SO'))
         so_dir = os.path.dirname(so_file)
-        self.assertEquals(so_dir, cmd.build_lib)
+        self.assertEqual(so_dir, cmd.build_lib)
 
         # inplace = 0, cmd.package = 'bar'
         build_py = cmd.get_finalized_command('build_py')
@@ -352,7 +304,7 @@
         path = cmd.get_ext_fullpath('foo')
         # checking that the last directory is the build_dir
         path = os.path.split(path)[0]
-        self.assertEquals(path, cmd.build_lib)
+        self.assertEqual(path, cmd.build_lib)
 
         # inplace = 1, cmd.package = 'bar'
         cmd.inplace = 1
@@ -366,7 +318,7 @@
         # checking that the last directory is bar
         path = os.path.split(path)[0]
         lastdir = os.path.split(path)[-1]
-        self.assertEquals(lastdir, 'bar')
+        self.assertEqual(lastdir, 'bar')
 
     def test_ext_fullpath(self):
         ext = sysconfig.get_config_vars()['SO']
@@ -382,14 +334,14 @@
         curdir = os.getcwd()
         wanted = os.path.join(curdir, 'src', 'lxml', 'etree' + ext)
         path = cmd.get_ext_fullpath('lxml.etree')
-        self.assertEquals(wanted, path)
+        self.assertEqual(wanted, path)
 
         # building lxml.etree not inplace
         cmd.inplace = 0
         cmd.build_lib = os.path.join(curdir, 'tmpdir')
         wanted = os.path.join(curdir, 'tmpdir', 'lxml', 'etree' + ext)
         path = cmd.get_ext_fullpath('lxml.etree')
-        self.assertEquals(wanted, path)
+        self.assertEqual(wanted, path)
 
         # building twisted.runner.portmap not inplace
         build_py = cmd.get_finalized_command('build_py')
@@ -398,13 +350,13 @@
         path = cmd.get_ext_fullpath('twisted.runner.portmap')
         wanted = os.path.join(curdir, 'tmpdir', 'twisted', 'runner',
                               'portmap' + ext)
-        self.assertEquals(wanted, path)
+        self.assertEqual(wanted, path)
 
         # building twisted.runner.portmap inplace
         cmd.inplace = 1
         path = cmd.get_ext_fullpath('twisted.runner.portmap')
         wanted = os.path.join(curdir, 'twisted', 'runner', 'portmap' + ext)
-        self.assertEquals(wanted, path)
+        self.assertEqual(wanted, path)
 
 def test_suite():
     src = _get_source_filename()
diff --git a/src/distutils2/tests/test_build_py.py b/src/distutils2/tests/test_build_py.py
--- a/src/distutils2/tests/test_build_py.py
+++ b/src/distutils2/tests/test_build_py.py
@@ -19,11 +19,15 @@
     def test_package_data(self):
         sources = self.mkdtemp()
         f = open(os.path.join(sources, "__init__.py"), "w")
-        f.write("# Pretend this is a package.")
-        f.close()
+        try:
+            f.write("# Pretend this is a package.")
+        finally:
+            f.close()
         f = open(os.path.join(sources, "README.txt"), "w")
-        f.write("Info about this package")
-        f.close()
+        try:
+            f.write("Info about this package")
+        finally:
+            f.close()
 
         destination = self.mkdtemp()
 
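The try/finally wrappers added above guarantee the file handle is closed even when the write raises. A minimal sketch of both spellings (the `with` form is the equivalent idiom on Python 2.5+, which these tests avoid only to keep older interpreters working):

```python
import os
import tempfile


def write_file(path, text):
    """Write text to path, closing the handle even if the write raises
    (the try/finally pattern used in the patch)."""
    f = open(path, 'w')
    try:
        f.write(text)
    finally:
        f.close()


def write_file_with(path, text):
    """Equivalent using a context manager: __exit__ closes the handle."""
    with open(path, 'w') as f:
        f.write(text)
```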
diff --git a/src/distutils2/tests/test_build_scripts.py b/src/distutils2/tests/test_build_scripts.py
--- a/src/distutils2/tests/test_build_scripts.py
+++ b/src/distutils2/tests/test_build_scripts.py
@@ -74,8 +74,10 @@
 
     def write_script(self, dir, name, text):
         f = open(os.path.join(dir, name), "w")
-        f.write(text)
-        f.close()
+        try:
+            f.write(text)
+        finally:
+            f.close()
 
     def test_version_int(self):
         source = self.mkdtemp()
diff --git a/src/distutils2/tests/test_ccompiler.py b/src/distutils2/tests/test_ccompiler.py
--- a/src/distutils2/tests/test_ccompiler.py
+++ b/src/distutils2/tests/test_ccompiler.py
@@ -31,7 +31,7 @@
         opts = gen_lib_options(compiler, libdirs, runlibdirs, libs)
         wanted = ['-Llib1', '-Llib2', '-cool', '-Rrunlib1', 'found',
                   '-lname2']
-        self.assertEquals(opts, wanted)
+        self.assertEqual(opts, wanted)
 
     def test_customize_compiler(self):
 
@@ -51,7 +51,7 @@
 
         comp = compiler()
         customize_compiler(comp)
-        self.assertEquals(comp.exes['archiver'], 'my_ar -arflags')
+        self.assertEqual(comp.exes['archiver'], 'my_ar -arflags')
 
 def test_suite():
     return unittest.makeSuite(CCompilerTestCase)
diff --git a/src/distutils2/tests/test_check.py b/src/distutils2/tests/test_check.py
--- a/src/distutils2/tests/test_check.py
+++ b/src/distutils2/tests/test_check.py
@@ -37,7 +37,7 @@
                     'name': 'xxx', 'version': 'xxx'
                     }
         cmd = self._run(metadata)
-        self.assertEquals(len(cmd._warnings), 0)
+        self.assertEqual(len(cmd._warnings), 0)
 
         # now with the strict mode, we should
         # get an error if there are missing metadata
@@ -45,7 +45,7 @@
 
         # and of course, no error when all metadata are present
         cmd = self._run(metadata, strict=1)
-        self.assertEquals(len(cmd._warnings), 0)
+        self.assertEqual(len(cmd._warnings), 0)
 
     def test_check_restructuredtext(self):
         if not _HAS_DOCUTILS: # won't test without docutils
@@ -55,7 +55,7 @@
         pkg_info, dist = self.create_dist(description=broken_rest)
         cmd = check(dist)
         cmd.check_restructuredtext()
-        self.assertEquals(len(cmd._warnings), 1)
+        self.assertEqual(len(cmd._warnings), 1)
 
         # let's see if we have an error with strict=1
         metadata = {'home_page': 'xxx', 'author': 'xxx',
@@ -69,7 +69,7 @@
         # and non-broken rest
         metadata['description'] = 'title\n=====\n\ntest'
         cmd = self._run(metadata, strict=1, restructuredtext=1)
-        self.assertEquals(len(cmd._warnings), 0)
+        self.assertEqual(len(cmd._warnings), 0)
 
     def test_check_all(self):
 
diff --git a/src/distutils2/tests/test_cmd.py b/src/distutils2/tests/test_cmd.py
--- a/src/distutils2/tests/test_cmd.py
+++ b/src/distutils2/tests/test_cmd.py
@@ -43,7 +43,7 @@
 
         # making sure execute gets called properly
         def _execute(func, args, exec_msg, level):
-            self.assertEquals(exec_msg, 'generating out from in')
+            self.assertEqual(exec_msg, 'generating out from in')
         cmd.force = True
         cmd.execute = _execute
         cmd.make_file(infiles='in', outfile='out', func='func', args=())
@@ -62,7 +62,7 @@
 
         wanted = ["command options for 'MyCmd':", '  option1 = 1',
                   '  option2 = 1']
-        self.assertEquals(msgs, wanted)
+        self.assertEqual(msgs, wanted)
 
     def test_ensure_string(self):
         cmd = self.cmd
@@ -80,7 +80,7 @@
         cmd = self.cmd
         cmd.option1 = 'ok,dok'
         cmd.ensure_string_list('option1')
-        self.assertEquals(cmd.option1, ['ok', 'dok'])
+        self.assertEqual(cmd.option1, ['ok', 'dok'])
 
         cmd.option2 = ['xxx', 'www']
         cmd.ensure_string_list('option2')
diff --git a/src/distutils2/tests/test_config.py b/src/distutils2/tests/test_config.py
--- a/src/distutils2/tests/test_config.py
+++ b/src/distutils2/tests/test_config.py
@@ -1,7 +1,6 @@
 """Tests for distutils.pypirc.pypirc."""
 import sys
 import os
-import tempfile
 import shutil
 
 from distutils2.core import PyPIRCCommand
@@ -87,20 +86,20 @@
 
         config = config.items()
         config.sort()
-        waited = [('password', 'secret'), ('realm', 'pypi'),
-                  ('repository', 'http://pypi.python.org/pypi'),
-                  ('server', 'server1'), ('username', 'me')]
-        self.assertEquals(config, waited)
+        expected = [('password', 'secret'), ('realm', 'pypi'),
+                    ('repository', 'http://pypi.python.org/pypi'),
+                    ('server', 'server1'), ('username', 'me')]
+        self.assertEqual(config, expected)
 
         # old format
         self.write_file(self.rc, PYPIRC_OLD)
         config = cmd._read_pypirc()
         config = config.items()
         config.sort()
-        waited = [('password', 'secret'), ('realm', 'pypi'),
-                  ('repository', 'http://pypi.python.org/pypi'),
-                  ('server', 'server-login'), ('username', 'tarek')]
-        self.assertEquals(config, waited)
+        expected = [('password', 'secret'), ('realm', 'pypi'),
+                    ('repository', 'http://pypi.python.org/pypi'),
+                    ('server', 'server-login'), ('username', 'tarek')]
+        self.assertEqual(config, expected)
 
     def test_server_empty_registration(self):
         cmd = self._cmd(self.dist)
@@ -109,7 +108,7 @@
         cmd._store_pypirc('tarek', 'xxx')
         self.assertTrue(os.path.exists(rc))
         content = open(rc).read()
-        self.assertEquals(content, WANTED)
+        self.assertEqual(content, WANTED)
 
 def test_suite():
     return unittest.makeSuite(PyPIRCCommandTestCase)
diff --git a/src/distutils2/tests/test_config_cmd.py b/src/distutils2/tests/test_config_cmd.py
--- a/src/distutils2/tests/test_config_cmd.py
+++ b/src/distutils2/tests/test_config_cmd.py
@@ -34,7 +34,7 @@
             f.close()
 
         dump_file(this_file, 'I am the header')
-        self.assertEquals(len(self._logs), numlines+1)
+        self.assertEqual(len(self._logs), numlines+1)
 
     def test_search_cpp(self):
         if sys.platform == 'win32':
@@ -44,10 +44,10 @@
 
         # simple pattern searches
         match = cmd.search_cpp(pattern='xxx', body='// xxx')
-        self.assertEquals(match, 0)
+        self.assertEqual(match, 0)
 
         match = cmd.search_cpp(pattern='_configtest', body='// xxx')
-        self.assertEquals(match, 1)
+        self.assertEqual(match, 1)
 
     def test_finalize_options(self):
         # finalize_options does a bit of transformation
@@ -59,9 +59,9 @@
         cmd.library_dirs = 'three%sfour' % os.pathsep
         cmd.ensure_finalized()
 
-        self.assertEquals(cmd.include_dirs, ['one', 'two'])
-        self.assertEquals(cmd.libraries, ['one'])
-        self.assertEquals(cmd.library_dirs, ['three', 'four'])
+        self.assertEqual(cmd.include_dirs, ['one', 'two'])
+        self.assertEqual(cmd.libraries, ['one'])
+        self.assertEqual(cmd.library_dirs, ['three', 'four'])
 
     def test_clean(self):
         # _clean removes files
diff --git a/src/distutils2/tests/test_converter.py b/src/distutils2/tests/test_converter.py
--- a/src/distutils2/tests/test_converter.py
+++ b/src/distutils2/tests/test_converter.py
@@ -30,7 +30,7 @@
             wanted = file_.replace('before', 'after')
             wanted = _read_file(os.path.join(convdir, wanted))
             res = ref.refactor_string(original, 'setup.py')
-            self.assertEquals(str(res), wanted)
+            self.assertEqual(str(res), wanted)
 
 def test_suite():
     return unittest.makeSuite(ConverterTestCase)
diff --git a/src/distutils2/tests/test_cygwinccompiler.py b/src/distutils2/tests/test_cygwinccompiler.py
--- a/src/distutils2/tests/test_cygwinccompiler.py
+++ b/src/distutils2/tests/test_cygwinccompiler.py
@@ -44,48 +44,48 @@
         sys.version = ('2.6.1 (r261:67515, Dec  6 2008, 16:42:21) \n[GCC '
                        '4.0.1 (Apple Computer, Inc. build 5370)]')
 
-        self.assertEquals(check_config_h()[0], CONFIG_H_OK)
+        self.assertEqual(check_config_h()[0], CONFIG_H_OK)
 
         # then it tries to see if it can find "__GNUC__" in pyconfig.h
         sys.version = 'something without the *CC word'
 
         # if the file doesn't exist it returns  CONFIG_H_UNCERTAIN
-        self.assertEquals(check_config_h()[0], CONFIG_H_UNCERTAIN)
+        self.assertEqual(check_config_h()[0], CONFIG_H_UNCERTAIN)
 
         # if it exists but does not contain __GNUC__, it returns CONFIG_H_NOTOK
         self.write_file(self.python_h, 'xxx')
-        self.assertEquals(check_config_h()[0], CONFIG_H_NOTOK)
+        self.assertEqual(check_config_h()[0], CONFIG_H_NOTOK)
 
         # and CONFIG_H_OK if __GNUC__ is found
         self.write_file(self.python_h, 'xxx __GNUC__ xxx')
-        self.assertEquals(check_config_h()[0], CONFIG_H_OK)
+        self.assertEqual(check_config_h()[0], CONFIG_H_OK)
 
     def test_get_msvcr(self):
 
         # none
         sys.version  = ('2.6.1 (r261:67515, Dec  6 2008, 16:42:21) '
                         '\n[GCC 4.0.1 (Apple Computer, Inc. build 5370)]')
-        self.assertEquals(get_msvcr(), None)
+        self.assertEqual(get_msvcr(), None)
 
         # MSVC 7.0
         sys.version = ('2.5.1 (r251:54863, Apr 18 2007, 08:51:08) '
                        '[MSC v.1300 32 bits (Intel)]')
-        self.assertEquals(get_msvcr(), ['msvcr70'])
+        self.assertEqual(get_msvcr(), ['msvcr70'])
 
         # MSVC 7.1
         sys.version = ('2.5.1 (r251:54863, Apr 18 2007, 08:51:08) '
                        '[MSC v.1310 32 bits (Intel)]')
-        self.assertEquals(get_msvcr(), ['msvcr71'])
+        self.assertEqual(get_msvcr(), ['msvcr71'])
 
         # VS2005 / MSVC 8.0
         sys.version = ('2.5.1 (r251:54863, Apr 18 2007, 08:51:08) '
                        '[MSC v.1400 32 bits (Intel)]')
-        self.assertEquals(get_msvcr(), ['msvcr80'])
+        self.assertEqual(get_msvcr(), ['msvcr80'])
 
         # VS2008 / MSVC 9.0
         sys.version = ('2.5.1 (r251:54863, Apr 18 2007, 08:51:08) '
                        '[MSC v.1500 32 bits (Intel)]')
-        self.assertEquals(get_msvcr(), ['msvcr90'])
+        self.assertEqual(get_msvcr(), ['msvcr90'])
 
         # unknown
         sys.version = ('2.5.1 (r251:54863, Apr 18 2007, 08:51:08) '
diff --git a/src/distutils2/tests/test_depgraph.py b/src/distutils2/tests/test_depgraph.py
--- a/src/distutils2/tests/test_depgraph.py
+++ b/src/distutils2/tests/test_depgraph.py
@@ -27,7 +27,7 @@
         self.assertListEqual(sorted(l1), sorted(l2))
 
     def setUp(self):
-        super(unittest.TestCase, self).setUp()
+        super(DepGraphTestCase, self).setUp()
         path = os.path.join(os.path.dirname(__file__), '..', '_backport',
                             'tests', 'fake_dists')
         path = os.path.abspath(path)
@@ -177,7 +177,7 @@
         self.checkLists(matches, expected)
 
     def tearDown(self):
-        super(unittest.TestCase, self).tearDown()
+        super(DepGraphTestCase, self).tearDown()
         sys.path = self.sys_path
 
 def test_suite():
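The depgraph hunks above fix `super(unittest.TestCase, self)` to `super(DepGraphTestCase, self)`. Naming the wrong class starts the MRO walk in the wrong place and silently skips the cooperative `setUp`/`tearDown` of intermediate mix-ins. A small sketch (with made-up class names) of why the first argument matters:

```python
class Fixture(object):
    def setUp(self):
        self.calls = ['Fixture']


class Manager(Fixture):
    def setUp(self):
        super(Manager, self).setUp()
        self.calls.append('Manager')


class GoodCase(Manager):
    def setUp(self):
        # Correct: name this class, so the walk starts right after it
        # and both Manager.setUp and Fixture.setUp run.
        super(GoodCase, self).setUp()
        self.calls.append('GoodCase')


class BuggyCase(Manager):
    def setUp(self):
        # The bug fixed above: naming a base class starts the walk
        # *after* Manager, silently skipping Manager.setUp.
        super(Manager, self).setUp()
        self.calls.append('BuggyCase')
```

In the tests, skipping the mix-ins' `setUp` meant tempdir bookkeeping never ran, which only shows up later as leaked temp files.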
diff --git a/src/distutils2/tests/test_dist.py b/src/distutils2/tests/test_dist.py
--- a/src/distutils2/tests/test_dist.py
+++ b/src/distutils2/tests/test_dist.py
@@ -71,11 +71,11 @@
         sys.argv.append("build")
 
         __, stdout = captured_stdout(self.create_distribution, files)
-        self.assertEquals(stdout, '')
+        self.assertEqual(stdout, '')
         distutils2.dist.DEBUG = True
         try:
             __, stdout = captured_stdout(self.create_distribution, files)
-            self.assertEquals(stdout, '')
+            self.assertEqual(stdout, '')
         finally:
             distutils2.dist.DEBUG = False
 
@@ -129,13 +129,13 @@
         # Check DistributionMetadata handling of Unicode fields
         tmp_dir = self.mkdtemp()
         my_file = os.path.join(tmp_dir, 'f')
-        klass = Distribution
+        cls = Distribution
 
-        dist = klass(attrs={'author': u'Mister Café',
-                            'name': 'my.package',
-                            'maintainer': u'Café Junior',
-                            'summary': u'Café torréfié',
-                            'description': u'Héhéhé'})
+        dist = cls(attrs={'author': u'Mister Café',
+                          'name': 'my.package',
+                          'maintainer': u'Café Junior',
+                          'summary': u'Café torréfié',
+                          'description': u'Héhéhé'})
 
 
         # let's make sure the file can be written
@@ -144,11 +144,11 @@
         dist.metadata.write_file(open(my_file, 'w'))
 
         # regular ascii is of course always usable
-        dist = klass(attrs={'author': 'Mister Cafe',
-                            'name': 'my.package',
-                            'maintainer': 'Cafe Junior',
-                            'summary': 'Cafe torrefie',
-                            'description': 'Hehehe'})
+        dist = cls(attrs={'author': 'Mister Cafe',
+                          'name': 'my.package',
+                          'maintainer': 'Cafe Junior',
+                          'summary': 'Cafe torrefie',
+                          'description': 'Hehehe'})
 
         my_file2 = os.path.join(tmp_dir, 'f2')
         dist.metadata.write_file(open(my_file, 'w'))
@@ -156,7 +156,7 @@
     def test_empty_options(self):
         # an empty options dictionary should not stay in the
         # list of attributes
-        klass = Distribution
+        cls = Distribution
 
         # catching warnings
         warns = []
@@ -166,15 +166,15 @@
         old_warn = warnings.warn
         warnings.warn = _warn
         try:
-            dist = klass(attrs={'author': 'xxx',
-                                'name': 'xxx',
-                                'version': 'xxx',
-                                'url': 'xxxx',
-                                'options': {}})
+            dist = cls(attrs={'author': 'xxx',
+                              'name': 'xxx',
+                              'version': 'xxx',
+                              'url': 'xxxx',
+                              'options': {}})
         finally:
             warnings.warn = old_warn
 
-        self.assertEquals(len(warns), 0)
+        self.assertEqual(len(warns), 0)
 
     def test_finalize_options(self):
 
@@ -185,20 +185,20 @@
         dist.finalize_options()
 
         # finalize_option splits platforms and keywords
-        self.assertEquals(dist.metadata['platform'], ['one', 'two'])
-        self.assertEquals(dist.metadata['keywords'], ['one', 'two'])
+        self.assertEqual(dist.metadata['platform'], ['one', 'two'])
+        self.assertEqual(dist.metadata['keywords'], ['one', 'two'])
 
     def test_get_command_packages(self):
         dist = Distribution()
-        self.assertEquals(dist.command_packages, None)
+        self.assertEqual(dist.command_packages, None)
         cmds = dist.get_command_packages()
-        self.assertEquals(cmds, ['distutils2.command'])
-        self.assertEquals(dist.command_packages,
+        self.assertEqual(cmds, ['distutils2.command'])
+        self.assertEqual(dist.command_packages,
                           ['distutils2.command'])
 
         dist.command_packages = 'one,two'
         cmds = dist.get_command_packages()
-        self.assertEquals(cmds, ['distutils2.command', 'one', 'two'])
+        self.assertEqual(cmds, ['distutils2.command', 'one', 'two'])
 
 
     def test_announce(self):
@@ -238,7 +238,7 @@
             os.path.expanduser = old_expander
 
         # make sure --no-user-cfg disables the user cfg file
-        self.assertEquals(len(all_files)-1, len(files))
+        self.assertEqual(len(all_files)-1, len(files))
 
 
 class MetadataTestCase(support.TempdirManager, support.EnvironGuard,
@@ -340,8 +340,10 @@
         temp_dir = self.mkdtemp()
         user_filename = os.path.join(temp_dir, user_filename)
         f = open(user_filename, 'w')
-        f.write('.')
-        f.close()
+        try:
+            f.write('.')
+        finally:
+            f.close()
 
         try:
             dist = Distribution()
@@ -365,8 +367,8 @@
     def test_fix_help_options(self):
         help_tuples = [('a', 'b', 'c', 'd'), (1, 2, 3, 4)]
         fancy_options = fix_help_options(help_tuples)
-        self.assertEquals(fancy_options[0], ('a', 'b', 'c'))
-        self.assertEquals(fancy_options[1], (1, 2, 3))
+        self.assertEqual(fancy_options[0], ('a', 'b', 'c'))
+        self.assertEqual(fancy_options[1], (1, 2, 3))
 
     def test_show_help(self):
         # smoke test, just makes sure some help is displayed
@@ -412,14 +414,14 @@
         PKG_INFO.seek(0)
 
         metadata.read_file(PKG_INFO)
-        self.assertEquals(metadata['name'], "package")
-        self.assertEquals(metadata['version'], "1.0")
-        self.assertEquals(metadata['summary'], "xxx")
-        self.assertEquals(metadata['download_url'], 'http://example.com')
-        self.assertEquals(metadata['keywords'], ['one', 'two'])
-        self.assertEquals(metadata['platform'], [])
-        self.assertEquals(metadata['obsoletes'], [])
-        self.assertEquals(metadata['requires-dist'], ['foo'])
+        self.assertEqual(metadata['name'], "package")
+        self.assertEqual(metadata['version'], "1.0")
+        self.assertEqual(metadata['summary'], "xxx")
+        self.assertEqual(metadata['download_url'], 'http://example.com')
+        self.assertEqual(metadata['keywords'], ['one', 'two'])
+        self.assertEqual(metadata['platform'], [])
+        self.assertEqual(metadata['obsoletes'], [])
+        self.assertEqual(metadata['requires-dist'], ['foo'])
 
 def test_suite():
     suite = unittest.TestSuite()
diff --git a/src/distutils2/tests/test_install.py b/src/distutils2/tests/test_install.py
--- a/src/distutils2/tests/test_install.py
+++ b/src/distutils2/tests/test_install.py
@@ -139,23 +139,23 @@
 
         # two elements
         cmd.handle_extra_path()
-        self.assertEquals(cmd.extra_path, ['path', 'dirs'])
-        self.assertEquals(cmd.extra_dirs, 'dirs')
-        self.assertEquals(cmd.path_file, 'path')
+        self.assertEqual(cmd.extra_path, ['path', 'dirs'])
+        self.assertEqual(cmd.extra_dirs, 'dirs')
+        self.assertEqual(cmd.path_file, 'path')
 
         # one element
         cmd.extra_path = ['path']
         cmd.handle_extra_path()
-        self.assertEquals(cmd.extra_path, ['path'])
-        self.assertEquals(cmd.extra_dirs, 'path')
-        self.assertEquals(cmd.path_file, 'path')
+        self.assertEqual(cmd.extra_path, ['path'])
+        self.assertEqual(cmd.extra_dirs, 'path')
+        self.assertEqual(cmd.path_file, 'path')
 
         # none
         dist.extra_path = cmd.extra_path = None
         cmd.handle_extra_path()
-        self.assertEquals(cmd.extra_path, None)
-        self.assertEquals(cmd.extra_dirs, '')
-        self.assertEquals(cmd.path_file, None)
+        self.assertEqual(cmd.extra_path, None)
+        self.assertEqual(cmd.extra_dirs, '')
+        self.assertEqual(cmd.path_file, None)
 
         # three elements (no way !)
         cmd.extra_path = 'path,dirs,again'
@@ -199,7 +199,7 @@
         # line (the egg info file)
         f = open(cmd.record)
         try:
-            self.assertEquals(len(f.readlines()), 1)
+            self.assertEqual(len(f.readlines()), 1)
         finally:
             f.close()
 
diff --git a/src/distutils2/tests/test_install_data.py b/src/distutils2/tests/test_install_data.py
--- a/src/distutils2/tests/test_install_data.py
+++ b/src/distutils2/tests/test_install_data.py
@@ -27,14 +27,14 @@
         self.write_file(two, 'xxx')
 
         cmd.data_files = [one, (inst2, [two])]
-        self.assertEquals(cmd.get_inputs(), [one, (inst2, [two])])
+        self.assertEqual(cmd.get_inputs(), [one, (inst2, [two])])
 
         # let's run the command
         cmd.ensure_finalized()
         cmd.run()
 
         # let's check the result
-        self.assertEquals(len(cmd.get_outputs()), 2)
+        self.assertEqual(len(cmd.get_outputs()), 2)
         rtwo = os.path.split(two)[-1]
         self.assertTrue(os.path.exists(os.path.join(inst2, rtwo)))
         rone = os.path.split(one)[-1]
@@ -47,7 +47,7 @@
         cmd.run()
 
         # let's check the result
-        self.assertEquals(len(cmd.get_outputs()), 2)
+        self.assertEqual(len(cmd.get_outputs()), 2)
         self.assertTrue(os.path.exists(os.path.join(inst2, rtwo)))
         self.assertTrue(os.path.exists(os.path.join(inst, rone)))
         cmd.outfiles = []
@@ -65,7 +65,7 @@
         cmd.run()
 
         # let's check the result
-        self.assertEquals(len(cmd.get_outputs()), 4)
+        self.assertEqual(len(cmd.get_outputs()), 4)
         self.assertTrue(os.path.exists(os.path.join(inst2, rtwo)))
         self.assertTrue(os.path.exists(os.path.join(inst, rone)))
 
diff --git a/src/distutils2/tests/test_install_headers.py b/src/distutils2/tests/test_install_headers.py
--- a/src/distutils2/tests/test_install_headers.py
+++ b/src/distutils2/tests/test_install_headers.py
@@ -23,7 +23,7 @@
 
         pkg_dir, dist = self.create_dist(headers=headers)
         cmd = install_headers(dist)
-        self.assertEquals(cmd.get_inputs(), headers)
+        self.assertEqual(cmd.get_inputs(), headers)
 
         # let's run the command
         cmd.install_dir = os.path.join(pkg_dir, 'inst')
@@ -31,7 +31,7 @@
         cmd.run()
 
         # let's check the results
-        self.assertEquals(len(cmd.get_outputs()), 2)
+        self.assertEqual(len(cmd.get_outputs()), 2)
 
 def test_suite():
     return unittest.makeSuite(InstallHeadersTestCase)
diff --git a/src/distutils2/tests/test_install_lib.py b/src/distutils2/tests/test_install_lib.py
--- a/src/distutils2/tests/test_install_lib.py
+++ b/src/distutils2/tests/test_install_lib.py
@@ -25,8 +25,8 @@
         cmd = install_lib(dist)
 
         cmd.finalize_options()
-        self.assertEquals(cmd.compile, 1)
-        self.assertEquals(cmd.optimize, 0)
+        self.assertEqual(cmd.compile, 1)
+        self.assertEqual(cmd.optimize, 0)
 
         # optimize must be 0, 1, or 2
         cmd.optimize = 'foo'
@@ -36,7 +36,7 @@
 
         cmd.optimize = '2'
         cmd.finalize_options()
-        self.assertEquals(cmd.optimize, 2)
+        self.assertEqual(cmd.optimize, 2)
 
     @unittest.skipIf(no_bytecode, 'byte-compile not supported')
     def test_byte_compile(self):
@@ -82,7 +82,7 @@
         cmd.distribution.script_name = 'setup.py'
 
         # get_input should return 2 elements
-        self.assertEquals(len(cmd.get_inputs()), 2)
+        self.assertEqual(len(cmd.get_inputs()), 2)
 
     @unittest.skipUnless(bytecode_support, 'sys.dont_write_bytecode not supported')
     def test_dont_write_bytecode(self):
diff --git a/src/distutils2/tests/test_install_scripts.py b/src/distutils2/tests/test_install_scripts.py
--- a/src/distutils2/tests/test_install_scripts.py
+++ b/src/distutils2/tests/test_install_scripts.py
@@ -42,8 +42,10 @@
         def write_script(name, text):
             expected.append(name)
             f = open(os.path.join(source, name), "w")
-            f.write(text)
-            f.close()
+            try:
+                f.write(text)
+            finally:
+                f.close()
 
         write_script("script1.py", ("#! /usr/bin/env python2.3\n"
                                     "# bogus script w/ Python sh-bang\n"
diff --git a/src/distutils2/tests/test_manifest.py b/src/distutils2/tests/test_manifest.py
--- a/src/distutils2/tests/test_manifest.py
+++ b/src/distutils2/tests/test_manifest.py
@@ -44,7 +44,7 @@
 
         # the manifest should have been read
         # and 3 warnings issued (we didn't provide the files)
-        self.assertEquals(len(warns), 3)
+        self.assertEqual(len(warns), 3)
         for warn in warns:
             self.assertIn('warning: no files found matching', warn)
 
diff --git a/src/distutils2/tests/test_metadata.py b/src/distutils2/tests/test_metadata.py
--- a/src/distutils2/tests/test_metadata.py
+++ b/src/distutils2/tests/test_metadata.py
@@ -5,9 +5,11 @@
 
 from distutils2.metadata import (DistributionMetadata, _interpret,
                                  PKG_INFO_PREFERRED_VERSION)
-from distutils2.tests.support import unittest
+from distutils2.tests.support import unittest, LoggingSilencer
+from distutils2.errors import (MetadataConflictError,
+                               MetadataUnrecognizedVersionError)
 
-class DistributionMetadataTestCase(unittest.TestCase):
+class DistributionMetadataTestCase(LoggingSilencer, unittest.TestCase):
 
 
     def test_interpret(self):
@@ -15,15 +17,15 @@
         version = sys.version.split()[0]
         os_name = os.name
 
-        assert _interpret("sys.platform == '%s'" % platform)
-        assert _interpret("sys.platform == '%s' or python_version == '2.4'" \
-                % platform)
-        assert _interpret("sys.platform == '%s' and "
-                          "python_full_version == '%s'"\
-                % (platform, version))
-        assert _interpret("'%s' == sys.platform" % platform)
+        self.assertTrue(_interpret("sys.platform == '%s'" % platform))
+        self.assertTrue(_interpret(
+            "sys.platform == '%s' or python_version == '2.4'" % platform))
+        self.assertTrue(_interpret(
+            "sys.platform == '%s' and python_full_version == '%s'" %
+            (platform, version)))
+        self.assertTrue(_interpret("'%s' == sys.platform" % platform))
 
-        assert _interpret('os.name == "%s"' % os_name)
+        self.assertTrue(_interpret('os.name == "%s"' % os_name))
 
         # stuff that need to raise a syntax error
         ops = ('os.name == os.name', 'os.name == 2', "'2' == '2'",
@@ -35,24 +37,25 @@
         OP = 'os.name == "%s"' % os_name
         AND = ' and '
         OR = ' or '
-        assert _interpret(OP+AND+OP)
-        assert _interpret(OP+AND+OP+AND+OP)
-        assert _interpret(OP+OR+OP)
-        assert _interpret(OP+OR+OP+OR+OP)
+        self.assertTrue(_interpret(OP + AND + OP))
+        self.assertTrue(_interpret(OP + AND + OP + AND + OP))
+        self.assertTrue(_interpret(OP + OR + OP))
+        self.assertTrue(_interpret(OP + OR + OP + OR + OP))
 
         # other operators
-        assert _interpret("os.name != 'buuuu'")
-        assert _interpret("python_version > '1.0'")
-        assert _interpret("python_version < '5.0'")
-        assert _interpret("python_version <= '5.0'")
-        assert _interpret("python_version >= '1.0'")
-        assert _interpret("'%s' in os.name" % os_name)
-        assert _interpret("'buuuu' not in os.name")
-        assert _interpret("'buuuu' not in os.name and '%s' in os.name" \
-                            % os_name)
+        self.assertTrue(_interpret("os.name != 'buuuu'"))
+        self.assertTrue(_interpret("python_version > '1.0'"))
+        self.assertTrue(_interpret("python_version < '5.0'"))
+        self.assertTrue(_interpret("python_version <= '5.0'"))
+        self.assertTrue(_interpret("python_version >= '1.0'"))
+        self.assertTrue(_interpret("'%s' in os.name" % os_name))
+        self.assertTrue(_interpret("'buuuu' not in os.name"))
+        self.assertTrue(_interpret(
+            "'buuuu' not in os.name and '%s' in os.name" % os_name))
 
         # execution context
-        assert _interpret('python_version == "0.1"', {'python_version': '0.1'})
+        self.assertTrue(_interpret('python_version == "0.1"',
+                                   {'python_version': '0.1'}))
 
     def test_metadata_read_write(self):
 
@@ -63,25 +66,28 @@
         res.seek(0)
         res = res.read()
         f = open(PKG_INFO)
-        wanted = f.read()
+        try:
+            # XXX this is not used
+            wanted = f.read()
+        finally:
+            f.close()
         self.assertTrue('Keywords: keyring,password,crypt' in res)
-        f.close()
 
     def test_metadata_markers(self):
         # see if we can be platform-aware
         PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO')
         content = open(PKG_INFO).read()
         content = content % sys.platform
-        metadata = DistributionMetadata(platform_dependant=True)
+        metadata = DistributionMetadata(platform_dependent=True)
         metadata.read_file(StringIO(content))
-        self.assertEquals(metadata['Requires-Dist'], ['bar'])
+        self.assertEqual(metadata['Requires-Dist'], ['bar'])
 
         # test with context
         context = {'sys.platform': 'okook'}
-        metadata = DistributionMetadata(platform_dependant=True,
+        metadata = DistributionMetadata(platform_dependent=True,
                                         execution_context=context)
         metadata.read_file(StringIO(content))
-        self.assertEquals(metadata['Requires-Dist'], ['foo'])
+        self.assertEqual(metadata['Requires-Dist'], ['foo'])
 
     def test_description(self):
         PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO')
@@ -93,14 +99,14 @@
         # see if we can read the description now
         DESC = os.path.join(os.path.dirname(__file__), 'LONG_DESC.txt')
         wanted = open(DESC).read()
-        self.assertEquals(wanted, metadata['Description'])
+        self.assertEqual(wanted, metadata['Description'])
 
         # save the file somewhere and make sure we can read it back
         out = StringIO()
         metadata.write_file(out)
         out.seek(0)
         metadata.read_file(out)
-        self.assertEquals(wanted, metadata['Description'])
+        self.assertEqual(wanted, metadata['Description'])
 
     def test_mapper_apis(self):
         PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO')
@@ -115,25 +121,32 @@
     def test_versions(self):
         metadata = DistributionMetadata()
         metadata['Obsoletes'] = 'ok'
-        self.assertEquals(metadata['Metadata-Version'], '1.1')
+        self.assertEqual(metadata['Metadata-Version'], '1.1')
 
         del metadata['Obsoletes']
         metadata['Obsoletes-Dist'] = 'ok'
-        self.assertEquals(metadata['Metadata-Version'], '1.2')
+        self.assertEqual(metadata['Metadata-Version'], '1.2')
 
+        self.assertRaises(MetadataConflictError, metadata.set,
+                          'Obsoletes', 'ok')
+
+        del metadata['Obsoletes']
         del metadata['Obsoletes-Dist']
         metadata['Version'] = '1'
-        self.assertEquals(metadata['Metadata-Version'], '1.0')
+        self.assertEqual(metadata['Metadata-Version'], '1.0')
 
         PKG_INFO = os.path.join(os.path.dirname(__file__),
                                 'SETUPTOOLS-PKG-INFO')
         metadata.read_file(StringIO(open(PKG_INFO).read()))
-        self.assertEquals(metadata['Metadata-Version'], '1.0')
+        self.assertEqual(metadata['Metadata-Version'], '1.0')
 
         PKG_INFO = os.path.join(os.path.dirname(__file__),
                                 'SETUPTOOLS-PKG-INFO2')
         metadata.read_file(StringIO(open(PKG_INFO).read()))
-        self.assertEquals(metadata['Metadata-Version'], '1.1')
+        self.assertEqual(metadata['Metadata-Version'], '1.1')
+
+        metadata.version = '1.618'
+        self.assertRaises(MetadataUnrecognizedVersionError, metadata.keys)
 
     def test_warnings(self):
         metadata = DistributionMetadata()
@@ -161,7 +174,7 @@
 
         # we should have a certain amount of warnings
         num_wanted = len(values)
-        self.assertEquals(num_wanted, res)
+        self.assertEqual(num_wanted, res)
 
     def test_multiple_predicates(self):
         metadata = DistributionMetadata()
@@ -184,29 +197,29 @@
             res = m.warns
             del m.warns
 
-        self.assertEquals(res, 0)
+        self.assertEqual(res, 0)
 
     def test_project_url(self):
         metadata = DistributionMetadata()
         metadata['Project-URL'] = [('one', 'http://ok')]
-        self.assertEquals(metadata['Project-URL'],
+        self.assertEqual(metadata['Project-URL'],
                           [('one', 'http://ok')])
-        self.assertEquals(metadata.version, '1.2')
+        self.assertEqual(metadata.version, '1.2')
 
     def test_check(self):
         metadata = DistributionMetadata()
         metadata['Version'] = 'rr'
         metadata['Requires-dist'] = ['Foo (a)']
         missing, warnings = metadata.check()
-        self.assertEquals(missing, ['Name', 'Home-page'])
-        self.assertEquals(len(warnings), 2)
+        self.assertEqual(missing, ['Name', 'Home-page'])
+        self.assertEqual(len(warnings), 2)
 
     def test_best_choice(self):
         metadata = DistributionMetadata()
         metadata['Version'] = '1.0'
-        self.assertEquals(metadata.version, PKG_INFO_PREFERRED_VERSION)
+        self.assertEqual(metadata.version, PKG_INFO_PREFERRED_VERSION)
         metadata['Classifier'] = ['ok']
-        self.assertEquals(metadata.version, '1.2')
+        self.assertEqual(metadata.version, '1.2')
 
     def test_project_urls(self):
         # project-url is a bit specific, make sure we write it
@@ -214,7 +227,7 @@
         metadata = DistributionMetadata()
         metadata['Version'] = '1.0'
         metadata['Project-Url'] = [('one', 'http://ok')]
-        self.assertEquals(metadata['Project-Url'], [('one', 'http://ok')])
+        self.assertEqual(metadata['Project-Url'], [('one', 'http://ok')])
         file_ = StringIO()
         metadata.write_file(file_)
         file_.seek(0)
@@ -224,7 +237,7 @@
         file_.seek(0)
         metadata = DistributionMetadata()
         metadata.read_file(file_)
-        self.assertEquals(metadata['Project-Url'], [('one', 'http://ok')])
+        self.assertEqual(metadata['Project-Url'], [('one', 'http://ok')])
 
 
 def test_suite():
diff --git a/src/distutils2/tests/test_msvc9compiler.py b/src/distutils2/tests/test_msvc9compiler.py
--- a/src/distutils2/tests/test_msvc9compiler.py
+++ b/src/distutils2/tests/test_msvc9compiler.py
@@ -105,7 +105,7 @@
         import _winreg
         HKCU = _winreg.HKEY_CURRENT_USER
         keys = Reg.read_keys(HKCU, 'xxxx')
-        self.assertEquals(keys, None)
+        self.assertEqual(keys, None)
 
         keys = Reg.read_keys(HKCU, r'Control Panel')
         self.assertTrue('Desktop' in keys)
@@ -116,20 +116,24 @@
         tempdir = self.mkdtemp()
         manifest = os.path.join(tempdir, 'manifest')
         f = open(manifest, 'w')
-        f.write(_MANIFEST)
-        f.close()
+        try:
+            f.write(_MANIFEST)
+        finally:
+            f.close()
 
         compiler = MSVCCompiler()
         compiler._remove_visual_c_ref(manifest)
 
         # see what we got
         f = open(manifest)
-        # removing trailing spaces
-        content = '\n'.join([line.rstrip() for line in f.readlines()])
-        f.close()
+        try:
+            # removing trailing spaces
+            content = '\n'.join([line.rstrip() for line in f.readlines()])
+        finally:
+            f.close()
 
         # makes sure the manifest was properly cleaned
-        self.assertEquals(content, _CLEANED_MANIFEST)
+        self.assertEqual(content, _CLEANED_MANIFEST)
 
 
 def test_suite():
diff --git a/src/distutils2/tests/test_pypi_dist.py b/src/distutils2/tests/test_pypi_dist.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/test_pypi_dist.py
@@ -0,0 +1,247 @@
+"""Tests for the distutils2.pypi.dist module."""
+
+import os
+import shutil
+import tempfile
+
+from distutils2.tests.pypi_server import use_pypi_server
+from distutils2.tests import run_unittest
+from distutils2.tests.support import unittest, TempdirManager
+from distutils2.version import VersionPredicate
+from distutils2.pypi.errors import HashDoesNotMatch, UnsupportedHashName
+from distutils2.pypi.dist import (PyPIDistribution as Dist,
+                                  PyPIDistributions as Dists,
+                                  split_archive_name)
+
+
+class TestPyPIDistribution(TempdirManager,
+                           unittest.TestCase):
+    """Tests the pypi.dist.PyPIDistribution class"""
+
+    def test_instantiation(self):
+        # Test that the Distribution class exposes the expected attributes
+        # when they are given at construction time
+        dist = Dist("FooBar", "1.1")
+        self.assertEqual("FooBar", dist.name)
+        self.assertEqual("1.1", "%s" % dist.version)
+
+    def test_create_from_url(self):
+        # Test that the Distribution object can be built from a single URL
+        url_list = {
+            'FooBar-1.1.0.tar.gz': {
+                'name': 'foobar',  # lowercase the name
+                'version': '1.1',
+            },
+            'Foo-Bar-1.1.0.zip': {
+                'name': 'foo-bar',  # keep the dash
+                'version': '1.1',
+            },
+            'foobar-1.1b2.tar.gz#md5=123123123123123': {
+                'name': 'foobar',
+                'version': '1.1b2',
+                'url': {
+                    'url': 'http://test.tld/foobar-1.1b2.tar.gz',  # no hash
+                    'hashval': '123123123123123',
+                    'hashname': 'md5',
+                }
+            },
+            'foobar-1.1-rc2.tar.gz': {  # use suggested name
+                'name': 'foobar',
+                'version': '1.1c2',
+                'url': {
+                    'url': 'http://test.tld/foobar-1.1-rc2.tar.gz',
+                }
+            }
+        }
+
+        for url, attributes in url_list.items():
+            dist = Dist.from_url("http://test.tld/" + url)
+            for attribute, value in attributes.items():
+                if isinstance(value, dict):
+                    mylist = getattr(dist, attribute)
+                    for val in value.keys():
+                        self.assertEqual(value[val], mylist[val])
+                else:
+                    if attribute == "version":
+                        self.assertEqual("%s" % getattr(dist, "version"), value)
+                    else:
+                        self.assertEqual(getattr(dist, attribute), value)
+
+    def test_get_url(self):
+        # Test that the url property works well
+
+        d = Dist("foobar", "1.1", url="test_url")
+        self.assertDictEqual(d.url, {
+            "url": "test_url",
+            "is_external": True,
+            "hashname": None,
+            "hashval": None,
+        })
+
+        # add a new url
+        d.add_url(url="internal_url", is_external=False)
+        self.assertEqual(d._url, None)
+        self.assertDictEqual(d.url, {
+            "url": "internal_url",
+            "is_external": False,
+            "hashname": None,
+            "hashval": None,
+        })
+        self.assertEqual(2, len(d._urls))
+
+    def test_comparison(self):
+        # Test that we can compare PyPIDistributions
+        foo1 = Dist("foo", "1.0")
+        foo2 = Dist("foo", "2.0")
+        bar = Dist("bar", "2.0")
+        # assert we use the version to compare
+        self.assertTrue(foo1 < foo2)
+        self.assertFalse(foo1 > foo2)
+        self.assertFalse(foo1 == foo2)
+
+        # assert we can't compare dists with different names
+        self.assertRaises(TypeError, foo1.__eq__, bar)
+
+    def test_split_archive_name(self):
+        # Test we can split the archive names
+        names = {
+            'foo-bar-baz-1.0-rc2': ('foo-bar-baz', '1.0c2'),
+            'foo-bar-baz-1.0': ('foo-bar-baz', '1.0'),
+            'foobarbaz-1.0': ('foobarbaz', '1.0'),
+        }
+        for name, results in names.items():
+            self.assertEqual(results, split_archive_name(name))
+
+    @use_pypi_server("downloads_with_md5")
+    def test_download(self, server):
+        # Download is possible, and the md5 is checked if given
+
+        add_to_tmpdirs = lambda x: self.tempdirs.append(os.path.dirname(x))
+
+        url = "%s/simple/foobar/foobar-0.1.tar.gz" % server.full_address
+        # check md5 if given
+        dist = Dist("FooBar", "0.1", url=url,
+            url_hashname="md5", url_hashval="d41d8cd98f00b204e9800998ecf8427e")
+        add_to_tmpdirs(dist.download())
+
+        # a wrong md5 fails
+        dist2 = Dist("FooBar", "0.1", url=url,
+            url_hashname="md5", url_hashval="wrongmd5")
+
+        self.assertRaises(HashDoesNotMatch, dist2.download)
+        add_to_tmpdirs(dist2.downloaded_location)
+
+        # we can omit the md5 hash
+        dist3 = Dist("FooBar", "0.1", url=url)
+        add_to_tmpdirs(dist3.download())
+
+        # and specify a temporary location
+        # for an already downloaded dist
+        path1 = self.mkdtemp()
+        dist3.download(path=path1)
+        # and for a new one
+        path2_base = self.mkdtemp()
+        dist4 = Dist("FooBar", "0.1", url=url)
+        path2 = dist4.download(path=path2_base)
+        self.assertTrue(path2_base in path2)
+
+    def test_hashname(self):
+        # Invalid hash names raise an exception on assignment
+        Dist("FooBar", "0.1", url_hashname="md5", url_hashval="value")
+
+        self.assertRaises(UnsupportedHashName, Dist, "FooBar", "0.1",
+                          url_hashname="invalid_hashname", url_hashval="value")
+
+
+class TestPyPIDistributions(unittest.TestCase):
+
+    def test_filter(self):
+        # Test that we filter the distributions correctly, using the
+        # version predicate match method
+        dists = Dists((
+            Dist("FooBar", "1.1"),
+            Dist("FooBar", "1.1.1"),
+            Dist("FooBar", "1.2"),
+            Dist("FooBar", "1.2.1"),
+        ))
+        filtered = dists.filter(VersionPredicate("FooBar (<1.2)"))
+        self.assertNotIn(dists[2], filtered)
+        self.assertNotIn(dists[3], filtered)
+        self.assertIn(dists[0], filtered)
+        self.assertIn(dists[1], filtered)
+
+    def test_append(self):
+        # When adding a new item to the list, the behavior is to check
+        # whether a distribution with the same name and version already
+        # exists, and if so, to add the URL information to the existing
+        # PyPIDistribution object.
+        # If no object matches, just add the object to the list normally.
+
+        dists = Dists([
+            Dist("FooBar", "1.1", url="external_url", type="source"),
+        ])
+        self.assertEqual(1, len(dists))
+        dists.append(Dist("FooBar", "1.1", url="internal_url",
+                          url_is_external=False, type="source"))
+        self.assertEqual(1, len(dists))
+        self.assertEqual(2, len(dists[0]._urls))
+
+        dists.append(Dist("Foobar", "1.1.1", type="source"))
+        self.assertEqual(2, len(dists))
+
+        # when adding a distribution with a different type, a new distribution
+        # has to be added.
+        dists.append(Dist("Foobar", "1.1.1", type="binary"))
+        self.assertEqual(3, len(dists))
+
+    def test_prefer_final(self):
+        # Can order the distributions using prefer_final
+
+        fb10 = Dist("FooBar", "1.0")  # final distribution
+        fb11a = Dist("FooBar", "1.1a1")  # alpha
+        fb12a = Dist("FooBar", "1.2a1")  # alpha
+        fb12b = Dist("FooBar", "1.2b1")  # beta
+        dists = Dists([fb10, fb11a, fb12a, fb12b])
+
+        dists.sort_distributions(prefer_final=True)
+        self.assertEqual(fb10, dists[0])
+
+        dists.sort_distributions(prefer_final=False)
+        self.assertEqual(fb12b, dists[0])
+
+    def test_prefer_source(self):
+        # Ordering supports prefer_source
+        fb_source = Dist("FooBar", "1.0", type="source")
+        fb_binary = Dist("FooBar", "1.0", type="binary")
+        fb2_binary = Dist("FooBar", "2.0", type="binary")
+        dists = Dists([fb_binary, fb_source])
+
+        dists.sort_distributions(prefer_source=True)
+        self.assertEqual(fb_source, dists[0])
+
+        dists.sort_distributions(prefer_source=False)
+        self.assertEqual(fb_binary, dists[0])
+
+        dists.append(fb2_binary)
+        dists.sort_distributions(prefer_source=True)
+        self.assertEqual(fb2_binary, dists[0])
+
+    def test_get_same_name_and_version(self):
+        # PyPIDistributions can return a list of "duplicates"
+        fb_source = Dist("FooBar", "1.0", type="source")
+        fb_binary = Dist("FooBar", "1.0", type="binary")
+        fb2_binary = Dist("FooBar", "2.0", type="binary")
+        dists = Dists([fb_binary, fb_source, fb2_binary])
+        duplicates = dists.get_same_name_and_version()
+        self.assertEqual(1, len(duplicates))
+        self.assertIn(fb_source, duplicates[0])
+
+
+def test_suite():
+    suite = unittest.TestSuite()
+    suite.addTest(unittest.makeSuite(TestPyPIDistribution))
+    suite.addTest(unittest.makeSuite(TestPyPIDistributions))
+    return suite
+
+if __name__ == '__main__':
+    run_unittest(test_suite())
diff --git a/src/distutils2/tests/test_pypi_server.py b/src/distutils2/tests/test_pypi_server.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/test_pypi_server.py
@@ -0,0 +1,69 @@
+"""Tests for distutils.command.bdist."""
+import urllib
+import urllib2
+import os.path
+
+from distutils2.tests.pypi_server import PyPIServer, PYPI_DEFAULT_STATIC_PATH
+from distutils2.tests.support import unittest
+
+
+class PyPIServerTest(unittest.TestCase):
+
+    def test_records_requests(self):
+        # We expect that PyPIServer can log our requests
+        server = PyPIServer()
+        server.start()
+        self.assertEqual(len(server.requests), 0)
+
+        data = "Rock Around The Bunker"
+        headers = {"X-test-header": "Mister Iceberg"}
+
+        request = urllib2.Request(server.full_address, data, headers)
+        urllib2.urlopen(request)
+        self.assertEqual(len(server.requests), 1)
+        handler, request_data = server.requests[-1]
+        self.assertIn("Rock Around The Bunker", request_data)
+        self.assertIn("x-test-header", handler.headers.dict)
+        self.assertEqual(handler.headers.dict["x-test-header"],
+                         "Mister Iceberg")
+        server.stop()
+
+    def test_serve_static_content(self):
+        # The mocked PyPI server can serve static content from disk.
+
+        def uses_local_files_for(server, url_path):
+            """Test that files are served statically (eg. the output from the
+            server is the same than the one made by a simple file read.
+            """
+            url = server.full_address + url_path
+            request = urllib2.Request(url)
+            response = urllib2.urlopen(request)
+            file = open(PYPI_DEFAULT_STATIC_PATH + "/test_pypi_server" +
+               url_path)
+            return response.read() == file.read()
+
+        server = PyPIServer(static_uri_paths=["simple", "external"],
+            static_filesystem_paths=["test_pypi_server"])
+        server.start()
+
+        # the file does not exist on disk, so it should not be served
+        url = server.full_address + "/simple/unexisting_page"
+        request = urllib2.Request(url)
+        try:
+            urllib2.urlopen(request)
+        except urllib2.HTTPError, e:
+            self.assertEqual(e.code, 404)
+        else:
+            self.fail("404 error expected")
+
+        # now try serving content that does exist
+        self.assertTrue(uses_local_files_for(server, "/simple/index.html"))
+
+        # and another one in another root path
+        self.assertTrue(uses_local_files_for(server, "/external/index.html"))
+        server.stop()
+
+
+def test_suite():
+    return unittest.makeSuite(PyPIServerTest)
+
+if __name__ == '__main__':
+    unittest.main(defaultTest="test_suite")
diff --git a/src/distutils2/tests/test_pypi_simple.py b/src/distutils2/tests/test_pypi_simple.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/test_pypi_simple.py
@@ -0,0 +1,277 @@
+"""Tests for the pypi.simple module.
+
+"""
+import sys
+import os
+import shutil
+import tempfile
+import urllib2
+
+from distutils2.pypi import simple
+from distutils2.tests import support, run_unittest
+from distutils2.tests.support import unittest
+from distutils2.tests.pypi_server import (use_pypi_server, PyPIServer,
+                                          PYPI_DEFAULT_STATIC_PATH)
+
+
+class PyPISimpleTestCase(support.TempdirManager,
+                         unittest.TestCase):
+
+    def _get_simple_index(self, server, base_url="/simple/", hosts=None,
+                          *args, **kwargs):
+        """Build and return a SimpleSimpleIndex instance, with the test server
+        urls
+        """
+        if hosts is None:
+            hosts = (server.full_address.replace("http://", ""),)
+        kwargs['hosts'] = hosts
+        return simple.SimpleIndex(server.full_address + base_url, *args,
+            **kwargs)
+
+    def test_bad_urls(self):
+        index = simple.SimpleIndex()
+        url = 'http://127.0.0.1:0/nonesuch/test_simple'
+        try:
+            v = index._open_url(url)
+        except Exception, v:
+            self.assertTrue(url in str(v))
+        else:
+            self.assertTrue(isinstance(v, urllib2.HTTPError))
+
+        # issue 16
+        # easy_install inquant.contentmirror.plone breaks because of a typo
+        # in its home URL
+        index = simple.SimpleIndex(hosts=('www.example.com',))
+        url = 'url:%20https://svn.plone.org/svn/collective/inquant.contentmirror.plone/trunk'
+        try:
+            v = index._open_url(url)
+        except Exception, v:
+            self.assertTrue(url in str(v))
+        else:
+            self.assertTrue(isinstance(v, urllib2.HTTPError))
+
+        def _urlopen(*args):
+            import httplib
+            raise httplib.BadStatusLine('line')
+
+        old_urlopen = urllib2.urlopen
+        urllib2.urlopen = _urlopen
+        url = 'http://example.com'
+        try:
+            try:
+                v = index._open_url(url)
+            except Exception, v:
+                self.assertTrue('line' in str(v))
+            else:
+                raise AssertionError('Should have raised here!')
+        finally:
+            urllib2.urlopen = old_urlopen
+
+        # issue 20
+        url = 'http://http://svn.pythonpaste.org/Paste/wphp/trunk'
+        try:
+            index._open_url(url)
+        except Exception, v:
+            self.assertTrue('nonnumeric port' in str(v))
+
+        # issue #160
+        if sys.version_info[0] == 2 and sys.version_info[1] == 7:
+            # this should not fail
+            url = 'http://example.com'
+            page = ('<a href="http://www.famfamfam.com]('
+                    'http://www.famfamfam.com/">')
+            index._process_url(url, page)
+
+    @use_pypi_server("test_found_links")
+    def test_found_links(self, server):
+        # Browse the index, asking for a specified distribution version
+        # The PyPI index contains links for version 1.0, 1.1, 2.0 and 2.0.1
+        index = self._get_simple_index(server)
+        last_distribution = index.get("foobar")
+
+        # we have scanned the index page
+        self.assertIn(server.full_address + "/simple/foobar/",
+            index._processed_urls)
+
+        # we have found 4 distributions in this page
+        self.assertEqual(len(index._distributions["foobar"]), 4)
+
+        # and returned the most recent one
+        self.assertEqual("%s" % last_distribution.version, '2.0.1')
+
+    def test_is_browsable(self):
+        index = simple.SimpleIndex(follow_externals=False)
+        self.assertTrue(index._is_browsable(index.index_url + "test"))
+
+        # Now, when following externals, we can have a list of hosts to trust.
+        # and don't follow other external links than the one described here.
+        index = simple.SimpleIndex(hosts=["pypi.python.org", "test.org"],
+                                   follow_externals=True)
+        good_urls = (
+            "http://pypi.python.org/foo/bar",
+            "http://pypi.python.org/simple/foobar",
+            "http://test.org",
+            "http://test.org/",
+            "http://test.org/simple/",
+        )
+        bad_urls = (
+            "http://python.org",
+            "http://test.tld",
+        )
+
+        for url in good_urls:
+            self.assertTrue(index._is_browsable(url))
+
+        for url in bad_urls:
+            self.assertFalse(index._is_browsable(url))
+
+        # allow all hosts
+        index = simple.SimpleIndex(follow_externals=True, hosts=("*",))
+        self.assertTrue(index._is_browsable("http://an-external.link/path"))
+        self.assertTrue(index._is_browsable("pypi.test.tld/a/path"))
+
+        # specify a list of hosts we want to allow
+        index = simple.SimpleIndex(follow_externals=True,
+                                   hosts=("*.test.tld",))
+        self.assertFalse(index._is_browsable("http://an-external.link/path"))
+        self.assertTrue(index._is_browsable("http://pypi.test.tld/a/path"))
+
+    @use_pypi_server("with_externals")
+    def test_follow_externals(self, server):
+        # Include external pages.
+        # Try to request the package index, which contains links to
+        # "external" resources. They have to be scanned too.
+        index = self._get_simple_index(server, follow_externals=True)
+        index.get("foobar")
+        self.assertIn(server.full_address + "/external/external.html",
+            index._processed_urls)
+
+    @use_pypi_server("with_real_externals")
+    def test_restrict_hosts(self, server):
+        # Only using a list of allowed hosts is possible.
+        # Test that telling the simple PyPI client not to retrieve
+        # external resources works.
+        index = self._get_simple_index(server, follow_externals=False)
+        index.get("foobar")
+        self.assertNotIn(server.full_address + "/external/external.html",
+            index._processed_urls)
+
+    @use_pypi_server(static_filesystem_paths=["with_externals"],
+        static_uri_paths=["simple", "external"])
+    def test_links_priority(self, server):
+        # Download links from the pypi simple index should be used before
+        # external download links.
+        # http://bitbucket.org/tarek/distribute/issue/163/md5-validation-error
+        #
+        # Use case:
+        # - someone uploads a package on pypi, an md5 is generated
+        # - someone manually copies this link (with the md5 in the url) onto
+        #   an external page accessible from the package page.
+        # - someone reuploads the package (with a different md5)
+        # - while easy_installing, an MD5 error occurs because the external
+        #   link is used
+        # -> The index should use the link from pypi, not the external one.
+
+        # start an index server
+        index_url = server.full_address + '/simple/'
+
+        # scan a test index
+        index = simple.SimpleIndex(index_url, follow_externals=True)
+        dists = index.find("foobar")
+        server.stop()
+
+        # we have only one link, because links are compared without md5
+        self.assertEqual(len(dists), 1)
+        # the link should be from the index
+        self.assertEqual('12345678901234567', dists[0].url['hashval'])
+        self.assertEqual('md5', dists[0].url['hashname'])
+
+    @use_pypi_server(static_filesystem_paths=["with_norel_links"],
+        static_uri_paths=["simple", "external"])
+    def test_not_scan_all_links(self, server):
+        # Do not follow all index page links.
+        # Links not tagged with rel="download" or rel="homepage" must not
+        # be processed by the package index while processing "pages".
+
+        # process the pages
+        index = self._get_simple_index(server, follow_externals=True)
+        index.find("foobar")
+        # now it should have processed only pages with links rel="download"
+        # and rel="homepage"
+        self.assertIn("%s/simple/foobar/" % server.full_address,
+            index._processed_urls)  # it's the simple index page
+        self.assertIn("%s/external/homepage.html" % server.full_address,
+            index._processed_urls)  # the external homepage is rel="homepage"
+        self.assertNotIn("%s/external/nonrel.html" % server.full_address,
+            index._processed_urls)  # this link contains no rel=*
+        self.assertNotIn("%s/unrelated-0.2.tar.gz" % server.full_address,
+            index._processed_urls)  # linked from simple index (no rel)
+        self.assertIn("%s/foobar-0.1.tar.gz" % server.full_address,
+            index._processed_urls)  # linked from simple index (rel)
+        self.assertIn("%s/foobar-2.0.tar.gz" % server.full_address,
+            index._processed_urls)  # linked from external homepage (rel)
+
+    def test_uses_mirrors(self):
+        # When the main repository seems down, try using the given mirrors
+        server = PyPIServer("foo_bar_baz")
+        mirror = PyPIServer("foo_bar_baz")
+        mirror.start()  # note: the main server is purposely not started
+
+        try:
+            # create the index using both servers
+            index = simple.SimpleIndex(server.full_address + "/simple/",
+                hosts=('*',), timeout=1,  # set the timeout to 1s for the tests
+                mirrors=[mirror.full_address + "/simple/",])
+
+            # this should not raise a timeout
+            self.assertEqual(4, len(index.find("foo")))
+        finally:
+            mirror.stop()
+
+    def test_simple_link_matcher(self):
+        # Test that the simple link matcher yields the right links
+        index = simple.SimpleIndex(follow_externals=False)
+
+        # Here, we define:
+        #   1. one link that must be followed, because it is a download link
+        #   2. one link that must *not* be followed, because _is_browsable
+        #      returns False for it
+        #   3. one link that must be followed, because it is a homepage
+        #      that is browsable
+        self.assertTrue(index._is_browsable("%stest" % index.index_url))
+        self.assertFalse(index._is_browsable("http://dl-link2"))
+        content = """
+        <a href="http://dl-link1" rel="download">download_link1</a>
+        <a href="http://dl-link2" rel="homepage">homepage_link1</a>
+        <a href="%stest" rel="homepage">homepage_link2</a>
+        """ % index.index_url
+
+        # Test that the simple link matcher yields the right links.
+        generator = index._simple_link_matcher(content, index.index_url)
+        self.assertEqual(('http://dl-link1', True), generator.next())
+        self.assertEqual(('%stest' % index.index_url, False),
+                         generator.next())
+        self.assertRaises(StopIteration, generator.next)
+
+        # Following external links is possible
+        index.follow_externals = True
+        generator = index._simple_link_matcher(content, index.index_url)
+        self.assertEqual(('http://dl-link1', True), generator.next())
+        self.assertEqual(('http://dl-link2', False), generator.next())
+        self.assertEqual(('%stest' % index.index_url, False),
+                         generator.next())
+        self.assertRaises(StopIteration, generator.next)
+
+    def test_browse_local_files(self):
+        # Test that we can browse local files
+        index_path = "file://" + "/".join([PYPI_DEFAULT_STATIC_PATH,
+                                           "test_found_links", "simple"])
+        index = simple.SimpleIndex(index_path)
+        dists = index.find("foobar")
+        self.assertEqual(4, len(dists))
+
+def test_suite():
+    return unittest.makeSuite(PyPISimpleTestCase)
+
+if __name__ == '__main__':
+    unittest.main(defaultTest="test_suite")
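The mirror behaviour checked by test_uses_mirrors boils down to "try each index URL in order and return the first that answers". A minimal, framework-free sketch of that idea (function and parameter names are illustrative, not the SimpleIndex API):

```python
def fetch_with_mirrors(url, mirrors, fetch):
    """Try `url`, then each mirror in turn; re-raise if all fail.

    `fetch` is any callable that takes a URL and returns its content,
    raising IOError when the host is unreachable or times out.
    """
    last_error = None
    for candidate in [url] + list(mirrors):
        try:
            return fetch(candidate)
        except IOError as error:
            # remember the failure and fall through to the next mirror
            last_error = error
    raise last_error
```

The real index additionally applies a timeout per attempt (the test sets `timeout=1`), so a dead main server costs at most one timeout before a mirror is consulted.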
diff --git a/src/distutils2/tests/test_register.py b/src/distutils2/tests/test_register.py
--- a/src/distutils2/tests/test_register.py
+++ b/src/distutils2/tests/test_register.py
@@ -124,7 +124,7 @@
 
         # with the content similar to WANTED_PYPIRC
         content = open(self.rc).read()
-        self.assertEquals(content, WANTED_PYPIRC)
+        self.assertEqual(content, WANTED_PYPIRC)
 
         # now let's make sure the .pypirc file generated
         # really works : we shouldn't be asked anything
@@ -141,7 +141,7 @@
         self.assertTrue(self.conn.reqs, 2)
         req1 = dict(self.conn.reqs[0].headers)
         req2 = dict(self.conn.reqs[1].headers)
-        self.assertEquals(req2['Content-length'], req1['Content-length'])
+        self.assertEqual(req2['Content-length'], req1['Content-length'])
         self.assertTrue('xxx' in self.conn.reqs[1].data)
 
     def test_password_not_in_file(self):
@@ -154,7 +154,7 @@
 
         # dist.password should be set
         # therefore used afterwards by other commands
-        self.assertEquals(cmd.distribution.password, 'password')
+        self.assertEqual(cmd.distribution.password, 'password')
 
     def test_registering(self):
         # this test runs choice 2
@@ -171,7 +171,7 @@
         self.assertTrue(self.conn.reqs, 1)
         req = self.conn.reqs[0]
         headers = dict(req.headers)
-        self.assertEquals(headers['Content-length'], '608')
+        self.assertEqual(headers['Content-length'], '608')
         self.assertTrue('tarek' in req.data)
 
     def test_password_reset(self):
@@ -189,7 +189,7 @@
         self.assertTrue(self.conn.reqs, 1)
         req = self.conn.reqs[0]
         headers = dict(req.headers)
-        self.assertEquals(headers['Content-length'], '290')
+        self.assertEqual(headers['Content-length'], '290')
         self.assertTrue('tarek' in req.data)
 
     @unittest.skipUnless(DOCUTILS_SUPPORT, 'needs docutils')
@@ -246,8 +246,8 @@
         cmd.ensure_finalized()
         cmd.distribution.metadata['Requires-Dist'] = ['lxml']
         data = cmd.build_post_data('submit')
-        self.assertEquals(data['metadata_version'], '1.2')
-        self.assertEquals(data['requires_dist'], ['lxml'])
+        self.assertEqual(data['metadata_version'], '1.2')
+        self.assertEqual(data['requires_dist'], ['lxml'])
 
 def test_suite():
     return unittest.makeSuite(RegisterTestCase)
diff --git a/src/distutils2/tests/test_sdist.py b/src/distutils2/tests/test_sdist.py
--- a/src/distutils2/tests/test_sdist.py
+++ b/src/distutils2/tests/test_sdist.py
@@ -20,7 +20,6 @@
 
 from os.path import join
 import sys
-import tempfile
 import warnings
 
 from distutils2.tests import captured_stdout
@@ -128,7 +127,7 @@
         # now let's check what we have
         dist_folder = join(self.tmp_dir, 'dist')
         files = os.listdir(dist_folder)
-        self.assertEquals(files, ['fake-1.0.zip'])
+        self.assertEqual(files, ['fake-1.0.zip'])
 
         zip_file = zipfile.ZipFile(join(dist_folder, 'fake-1.0.zip'))
         try:
@@ -137,7 +136,7 @@
             zip_file.close()
 
         # making sure everything has been pruned correctly
-        self.assertEquals(len(content), 4)
+        self.assertEqual(len(content), 4)
 
     @unittest.skipUnless(zlib, "requires zlib")
     def test_make_distribution(self):
@@ -159,7 +158,7 @@
         dist_folder = join(self.tmp_dir, 'dist')
         result = os.listdir(dist_folder)
         result.sort()
-        self.assertEquals(result,
+        self.assertEqual(result,
                           ['fake-1.0.tar', 'fake-1.0.tar.gz'] )
 
         os.remove(join(dist_folder, 'fake-1.0.tar'))
@@ -173,7 +172,7 @@
 
         result = os.listdir(dist_folder)
         result.sort()
-        self.assertEquals(result,
+        self.assertEqual(result,
                 ['fake-1.0.tar', 'fake-1.0.tar.gz'])
 
     @unittest.skipUnless(zlib, "requires zlib")
@@ -223,7 +222,7 @@
         # now let's check what we have
         dist_folder = join(self.tmp_dir, 'dist')
         files = os.listdir(dist_folder)
-        self.assertEquals(files, ['fake-1.0.zip'])
+        self.assertEqual(files, ['fake-1.0.zip'])
 
         zip_file = zipfile.ZipFile(join(dist_folder, 'fake-1.0.zip'))
         try:
@@ -232,11 +231,11 @@
             zip_file.close()
 
         # making sure everything was added
-        self.assertEquals(len(content), 11)
+        self.assertEqual(len(content), 11)
 
         # checking the MANIFEST
         manifest = open(join(self.tmp_dir, 'MANIFEST')).read()
-        self.assertEquals(manifest, MANIFEST % {'sep': os.sep})
+        self.assertEqual(manifest, MANIFEST % {'sep': os.sep})
 
     @unittest.skipUnless(zlib, "requires zlib")
     def test_metadata_check_option(self):
@@ -248,7 +247,7 @@
         cmd.ensure_finalized()
         cmd.run()
         warnings = self.get_logs(WARN)
-        self.assertEquals(len(warnings), 1)
+        self.assertEqual(len(warnings), 1)
 
         # trying with a complete set of metadata
         self.clear_logs()
@@ -260,7 +259,7 @@
         # removing manifest generated warnings
         warnings = [warn for warn in warnings if
                     not warn.endswith('-- skipping')]
-        self.assertEquals(len(warnings), 0)
+        self.assertEqual(len(warnings), 0)
 
 
     def test_show_formats(self):
@@ -270,7 +269,7 @@
         num_formats = len(get_archive_formats())
         output = [line for line in stdout.split('\n')
                   if line.strip().startswith('--formats=')]
-        self.assertEquals(len(output), num_formats)
+        self.assertEqual(len(output), num_formats)
 
     def test_finalize_options(self):
 
@@ -278,9 +277,9 @@
         cmd.finalize_options()
 
         # default options set by finalize
-        self.assertEquals(cmd.manifest, 'MANIFEST')
-        self.assertEquals(cmd.template, 'MANIFEST.in')
-        self.assertEquals(cmd.dist_dir, 'dist')
+        self.assertEqual(cmd.manifest, 'MANIFEST')
+        self.assertEqual(cmd.template, 'MANIFEST.in')
+        self.assertEqual(cmd.dist_dir, 'dist')
 
         # formats has to be a string splitable on (' ', ',') or
         # a stringlist
@@ -317,8 +316,8 @@
         archive = tarfile.open(archive_name)
         try:
             for member in archive.getmembers():
-                self.assertEquals(member.uid, 0)
-                self.assertEquals(member.gid, 0)
+                self.assertEqual(member.uid, 0)
+                self.assertEqual(member.gid, 0)
         finally:
             archive.close()
 
@@ -339,7 +338,7 @@
         # rights (see #7408)
         try:
             for member in archive.getmembers():
-                self.assertEquals(member.uid, os.getuid())
+                self.assertEqual(member.uid, os.getuid())
         finally:
             archive.close()
 
diff --git a/src/distutils2/tests/test_spawn.py b/src/distutils2/tests/test_spawn.py
--- a/src/distutils2/tests/test_spawn.py
+++ b/src/distutils2/tests/test_spawn.py
@@ -20,7 +20,7 @@
                                (['nochange', 'nospace'],
                                 ['nochange', 'nospace'])):
             res = _nt_quote_args(args)
-            self.assertEquals(res, wanted)
+            self.assertEqual(res, wanted)
 
 
     @unittest.skipUnless(os.name in ('nt', 'posix'),
diff --git a/src/distutils2/tests/test_upload.py b/src/distutils2/tests/test_upload.py
--- a/src/distutils2/tests/test_upload.py
+++ b/src/distutils2/tests/test_upload.py
@@ -1,34 +1,16 @@
 """Tests for distutils.command.upload."""
 # -*- encoding: utf8 -*-
+import os
 import sys
-import os
 
-from distutils2.command import upload as upload_mod
 from distutils2.command.upload import upload
 from distutils2.core import Distribution
 
 from distutils2.tests import support
+from distutils2.tests.pypi_server import PyPIServer, PyPIServerTestCase
 from distutils2.tests.support import unittest
 from distutils2.tests.test_config import PYPIRC, PyPIRCCommandTestCase
 
-PYPIRC_LONG_PASSWORD = """\
-[distutils]
-
-index-servers =
-    server1
-    server2
-
-[server1]
-username:me
-password:aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
-
-[server2]
-username:meagain
-password: secret
-realm:acme
-repository:http://another.pypi/
-"""
-
 
 PYPIRC_NOPASSWORD = """\
 [distutils]
@@ -40,47 +22,18 @@
 username:me
 """
 
-class FakeOpen(object):
-
-    def __init__(self, url):
-        self.url = url
-        if not isinstance(url, str):
-            self.req = url
-        else:
-            self.req = None
-        self.msg = 'OK'
-
-    def getcode(self):
-        return 200
-
-
-class uploadTestCase(PyPIRCCommandTestCase):
-
-    def setUp(self):
-        super(uploadTestCase, self).setUp()
-        self.old_open = upload_mod.urlopen
-        upload_mod.urlopen = self._urlopen
-        self.last_open = None
-
-    def tearDown(self):
-        upload_mod.urlopen = self.old_open
-        super(uploadTestCase, self).tearDown()
-
-    def _urlopen(self, url):
-        self.last_open = FakeOpen(url)
-        return self.last_open
+class UploadTestCase(PyPIServerTestCase, PyPIRCCommandTestCase):
 
     def test_finalize_options(self):
-
         # new format
         self.write_file(self.rc, PYPIRC)
         dist = Distribution()
         cmd = upload(dist)
         cmd.finalize_options()
-        for attr, waited in (('username', 'me'), ('password', 'secret'),
-                             ('realm', 'pypi'),
-                             ('repository', 'http://pypi.python.org/pypi')):
-            self.assertEquals(getattr(cmd, attr), waited)
+        for attr, expected in (('username', 'me'), ('password', 'secret'),
+                               ('realm', 'pypi'),
+                               ('repository', 'http://pypi.python.org/pypi')):
+            self.assertEqual(getattr(cmd, attr), expected)
 
     def test_saved_password(self):
         # file with no password
@@ -89,44 +42,42 @@
         # make sure it passes
         dist = Distribution()
         cmd = upload(dist)
-        cmd.finalize_options()
-        self.assertEquals(cmd.password, None)
+        cmd.ensure_finalized()
+        self.assertEqual(cmd.password, None)
 
         # make sure we get it as well, if another command
         # initialized it at the dist level
         dist.password = 'xxx'
         cmd = upload(dist)
         cmd.finalize_options()
-        self.assertEquals(cmd.password, 'xxx')
+        self.assertEqual(cmd.password, 'xxx')
 
     def test_upload(self):
-        tmp = self.mkdtemp()
-        path = os.path.join(tmp, 'xxx')
+        path = os.path.join(self.tmp_dir, 'xxx')
         self.write_file(path)
         command, pyversion, filename = 'xxx', '2.6', path
         dist_files = [(command, pyversion, filename)]
-        self.write_file(self.rc, PYPIRC_LONG_PASSWORD)
 
         # lets run it
         pkg_dir, dist = self.create_dist(dist_files=dist_files, author=u'dédé')
         cmd = upload(dist)
         cmd.ensure_finalized()
+        cmd.repository = self.pypi.full_address
         cmd.run()
 
         # what did we send ?
-        self.assertIn('dédé', self.last_open.req.data)
-        headers = dict(self.last_open.req.headers)
-        self.assertTrue(int(headers['Content-length']) < 2000)
-        self.assertTrue(headers['Content-type'].startswith('multipart/form-data'))
-        self.assertEquals(self.last_open.req.get_method(), 'POST')
-        self.assertEquals(self.last_open.req.get_full_url(),
-                          'http://pypi.python.org/pypi')
-        self.assertTrue('xxx' in self.last_open.req.data)
-        auth = self.last_open.req.headers['Authorization']
-        self.assertFalse('\n' in auth)
+        handler, request_data = self.pypi.requests[-1]
+        headers = handler.headers.dict
+        self.assertIn('dédé', request_data)
+        self.assertIn('xxx', request_data)
+        self.assertEqual(int(headers['content-length']), len(request_data))
+        self.assertTrue(int(headers['content-length']) < 2000)
+        self.assertTrue(headers['content-type'].startswith('multipart/form-data'))
+        self.assertEqual(handler.command, 'POST')
+        self.assertNotIn('\n', headers['authorization'])
 
 def test_suite():
-    return unittest.makeSuite(uploadTestCase)
+    return unittest.makeSuite(UploadTestCase)
 
 if __name__ == "__main__":
     unittest.main(defaultTest="test_suite")
diff --git a/src/distutils2/tests/test_upload_docs.py b/src/distutils2/tests/test_upload_docs.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/test_upload_docs.py
@@ -0,0 +1,198 @@
+"""Tests for distutils.command.upload_docs."""
+# -*- encoding: utf8 -*-
+import httplib, os, os.path, shutil, sys, tempfile, zipfile
+from cStringIO import StringIO
+
+from distutils2.command import upload_docs as upload_docs_mod
+from distutils2.command.upload_docs import (upload_docs, zip_dir,
+                                    encode_multipart)
+from distutils2.core import Distribution
+
+from distutils2.errors import DistutilsFileError, DistutilsOptionError
+
+from distutils2.tests import support
+from distutils2.tests.pypi_server import PyPIServer, PyPIServerTestCase
+from distutils2.tests.test_config import PyPIRCCommandTestCase
+from distutils2.tests.support import unittest
+
+
+EXPECTED_MULTIPART_OUTPUT = "\r\n".join([
+'---x',
+'Content-Disposition: form-data; name="a"',
+'',
+'b',
+'---x',
+'Content-Disposition: form-data; name="c"',
+'',
+'d',
+'---x',
+'Content-Disposition: form-data; name="e"; filename="f"',
+'',
+'g',
+'---x',
+'Content-Disposition: form-data; name="h"; filename="i"',
+'',
+'j',
+'---x--',
+'',
+])
+
+PYPIRC = """\
+[distutils]
+index-servers = server1
+
+[server1]
+repository = %s
+username = real_slim_shady
+password = long_island
+"""
+
+class UploadDocsTestCase(PyPIServerTestCase, PyPIRCCommandTestCase):
+
+    def setUp(self):
+        super(UploadDocsTestCase, self).setUp()
+        self.dist = Distribution()
+        self.dist.metadata['Name'] = "distr-name"
+        self.cmd = upload_docs(self.dist)
+
+    def test_default_uploaddir(self):
+        sandbox = tempfile.mkdtemp()
+        previous = os.getcwd()
+        os.chdir(sandbox)
+        try:
+            os.mkdir("build")
+            self.prepare_sample_dir("build")
+            self.cmd.ensure_finalized()
+            self.assertEqual(self.cmd.upload_dir, os.path.join("build", "docs"))
+        finally:
+            os.chdir(previous)
+
+    def prepare_sample_dir(self, sample_dir=None):
+        if sample_dir is None:
+            sample_dir = tempfile.mkdtemp()
+        os.mkdir(os.path.join(sample_dir, "docs"))
+        self.write_file(os.path.join(sample_dir, "docs", "index.html"), "Ce mortel ennui")
+        self.write_file(os.path.join(sample_dir, "index.html"), "Oh la la")
+        return sample_dir
+
+    def test_zip_dir(self):
+        source_dir = self.prepare_sample_dir()
+        compressed = zip_dir(source_dir)
+
+        zip_f = zipfile.ZipFile(compressed)
+        self.assertEqual(zip_f.namelist(), ['index.html', 'docs/index.html'])
+
+    def test_encode_multipart(self):
+        fields = [("a", "b"), ("c", "d")]
+        files = [("e", "f", "g"), ("h", "i", "j")]
+        content_type, body = encode_multipart(fields, files, "-x")
+        self.assertEqual(content_type, "multipart/form-data; boundary=-x")
+        self.assertEqual(body, EXPECTED_MULTIPART_OUTPUT)
+
+    def prepare_command(self):
+        self.cmd.upload_dir = self.prepare_sample_dir()
+        self.cmd.ensure_finalized()
+        self.cmd.repository = self.pypi.full_address
+        self.cmd.username = "username"
+        self.cmd.password = "password"
+
+    def test_upload(self):
+        self.prepare_command()
+        self.cmd.run()
+
+        self.assertEqual(len(self.pypi.requests), 1)
+        handler, request_data = self.pypi.requests[-1]
+        self.assertIn("content", request_data)
+        self.assertIn("Basic", handler.headers.dict['authorization'])
+        self.assertTrue(handler.headers.dict['content-type']
+            .startswith('multipart/form-data;'))
+
+        action, name, content =\
+            request_data.split("----------------GHSKFJDLGDS7543FJKLFHRE75642756743254")[1:4]
+
+        # check that we picked the right chunks
+        self.assertIn('name=":action"', action)
+        self.assertIn('name="name"', name)
+        self.assertIn('name="content"', content)
+
+        # check their contents
+        self.assertIn("doc_upload", action)
+        self.assertIn("distr-name", name)
+        self.assertIn("docs/index.html", content)
+        self.assertIn("Ce mortel ennui", content)
+
+    def test_https_connection(self):
+        https_called = []
+        orig_https = upload_docs_mod.httplib.HTTPSConnection
+        def https_conn_wrapper(*args):
+            # use a list so the flag survives the nested scope (a bare
+            # assignment would create a local variable instead)
+            https_called.append(True)
+            return upload_docs_mod.httplib.HTTPConnection(*args) # the testing server is http
+        upload_docs_mod.httplib.HTTPSConnection = https_conn_wrapper
+        try:
+            self.prepare_command()
+            self.cmd.run()
+            self.assertFalse(https_called)
+
+            self.cmd.repository = self.cmd.repository.replace("http", "https")
+            self.cmd.run()
+            self.assertTrue(https_called)
+        finally:
+            upload_docs_mod.httplib.HTTPSConnection = orig_https
+
+    def test_handling_response(self):
+        calls = []
+        def aggr(*args):
+            calls.append(args)
+        self.pypi.default_response_status = '403 Forbidden'
+        self.prepare_command()
+        self.cmd.announce = aggr
+        self.cmd.run()
+        message, _ = calls[-1]
+        self.assertIn('Upload failed (403): Forbidden', message)
+
+        calls = []
+        self.pypi.default_response_status = '301 Moved Permanently'
+        self.pypi.default_response_headers.append(("Location", "brand_new_location"))
+        self.cmd.run()
+        message, _ = calls[-1]
+        self.assertIn('brand_new_location', message)
+
+    def test_reads_pypirc_data(self):
+        self.write_file(self.rc, PYPIRC % self.pypi.full_address)
+        self.cmd.repository = self.pypi.full_address
+        self.cmd.upload_dir = self.prepare_sample_dir()
+        self.cmd.ensure_finalized()
+        self.assertEqual(self.cmd.username, "real_slim_shady")
+        self.assertEqual(self.cmd.password, "long_island")
+
+    def test_checks_index_html_presence(self):
+        self.cmd.upload_dir = self.prepare_sample_dir()
+        os.remove(os.path.join(self.cmd.upload_dir, "index.html"))
+        self.assertRaises(DistutilsFileError, self.cmd.ensure_finalized)
+
+    def test_checks_upload_dir(self):
+        self.cmd.upload_dir = self.prepare_sample_dir()
+        shutil.rmtree(os.path.join(self.cmd.upload_dir))
+        self.assertRaises(DistutilsOptionError, self.cmd.ensure_finalized)
+
+    def test_show_response(self):
+        orig_stdout = sys.stdout
+        write_args = []
+        class MockStdOut(object):
+            def write(self, arg):
+                write_args.append(arg)
+        sys.stdout = MockStdOut()
+        try:
+            self.prepare_command()
+            self.cmd.show_response = True
+            self.cmd.run()
+        finally:
+            sys.stdout = orig_stdout
+        self.assertTrue(write_args[0], "should report the response")
+        self.assertIn(self.pypi.default_response_data + "\n", write_args[0])
+
+def test_suite():
+    return unittest.makeSuite(UploadDocsTestCase)
+
+if __name__ == "__main__":
+    unittest.main(defaultTest="test_suite")
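The test_show_response test above swaps sys.stdout for a recording object and restores it in a finally block. A minimal standalone sketch of that capture pattern (class and variable names here are illustrative, not from distutils2):

```python
import sys

class RecordingStdout(object):
    # illustrative stand-in for the test's mock stdout object
    def __init__(self):
        self.chunks = []

    def write(self, arg):
        self.chunks.append(arg)

recorder = RecordingStdout()
orig_stdout = sys.stdout
sys.stdout = recorder
try:
    print("mocked response body")
finally:
    # always restore the real stdout, as the test does
    sys.stdout = orig_stdout

captured = "".join(recorder.chunks)
assert "mocked response body" in captured
```

The try/finally is the important part: a failing assertion inside the guarded block would otherwise leave stdout permanently replaced for the rest of the test run.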
diff --git a/src/distutils2/tests/test_util.py b/src/distutils2/tests/test_util.py
--- a/src/distutils2/tests/test_util.py
+++ b/src/distutils2/tests/test_util.py
@@ -4,6 +4,8 @@
 from copy import copy
 from StringIO import StringIO
 import subprocess
+import tempfile
+import time
 
 from distutils2.errors import (DistutilsPlatformError,
                                DistutilsByteCompileError,
@@ -100,7 +102,7 @@
             return '/'.join(path)
         os.path.join = _join
 
-        self.assertEquals(convert_path('/home/to/my/stuff'),
+        self.assertEqual(convert_path('/home/to/my/stuff'),
                           '/home/to/my/stuff')
 
         # win
@@ -112,9 +114,9 @@
         self.assertRaises(ValueError, convert_path, '/home/to/my/stuff')
         self.assertRaises(ValueError, convert_path, 'home/to/my/stuff/')
 
-        self.assertEquals(convert_path('home/to/my/stuff'),
+        self.assertEqual(convert_path('home/to/my/stuff'),
                           'home\\to\\my\\stuff')
-        self.assertEquals(convert_path('.'),
+        self.assertEqual(convert_path('.'),
                           os.curdir)
 
     def test_change_root(self):
@@ -127,9 +129,9 @@
             return '/'.join(path)
         os.path.join = _join
 
-        self.assertEquals(change_root('/root', '/old/its/here'),
+        self.assertEqual(change_root('/root', '/old/its/here'),
                           '/root/old/its/here')
-        self.assertEquals(change_root('/root', 'its/here'),
+        self.assertEqual(change_root('/root', 'its/here'),
                           '/root/its/here')
 
         # windows
@@ -146,9 +148,9 @@
             return '\\'.join(path)
         os.path.join = _join
 
-        self.assertEquals(change_root('c:\\root', 'c:\\old\\its\\here'),
+        self.assertEqual(change_root('c:\\root', 'c:\\old\\its\\here'),
                           'c:\\root\\old\\its\\here')
-        self.assertEquals(change_root('c:\\root', 'its\\here'),
+        self.assertEqual(change_root('c:\\root', 'its\\here'),
                           'c:\\root\\its\\here')
 
         # BugsBunny os (it's a great os)
@@ -159,7 +161,7 @@
         # XXX platforms to be covered: os2, mac
 
     def test_split_quoted(self):
-        self.assertEquals(split_quoted('""one"" "two" \'three\' \\four'),
+        self.assertEqual(split_quoted('""one"" "two" \'three\' \\four'),
                           ['one', 'two', 'three', 'four'])
 
     def test_strtobool(self):
@@ -177,7 +179,7 @@
         res = rfc822_escape(header)
         wanted = ('I am a%(8s)spoor%(8s)slonesome%(8s)s'
                   'header%(8s)s') % {'8s': '\n'+8*' '}
-        self.assertEquals(res, wanted)
+        self.assertEqual(res, wanted)
 
     def test_find_exe_version(self):
         # the ld version scheme under MAC OS is:
@@ -195,7 +197,7 @@
                                 ('@(#)PROGRAM:ld  PROJECT:ld64-95.2.12',
                                  '95.2.12')):
             result = _MAC_OS_X_LD_VERSION.search(output)
-            self.assertEquals(result.group(1), version)
+            self.assertEqual(result.group(1), version)
 
     def _find_executable(self, name):
         if name in self._exes:
@@ -205,43 +207,43 @@
     def test_get_compiler_versions(self):
         # get_versions calls distutils.spawn.find_executable on
         # 'gcc', 'ld' and 'dllwrap'
-        self.assertEquals(get_compiler_versions(), (None, None, None))
+        self.assertEqual(get_compiler_versions(), (None, None, None))
 
         # Let's fake we have 'gcc' and it returns '3.4.5'
         self._exes['gcc'] = 'gcc (GCC) 3.4.5 (mingw special)\nFSF'
         res = get_compiler_versions()
-        self.assertEquals(str(res[0]), '3.4.5')
+        self.assertEqual(str(res[0]), '3.4.5')
 
         # and let's see what happens when the version
         # doesn't match the regular expression
         # (\d+\.\d+(\.\d+)*)
         self._exes['gcc'] = 'very strange output'
         res = get_compiler_versions()
-        self.assertEquals(res[0], None)
+        self.assertEqual(res[0], None)
 
         # same thing for ld
         if sys.platform != 'darwin':
             self._exes['ld'] = 'GNU ld version 2.17.50 20060824'
             res = get_compiler_versions()
-            self.assertEquals(str(res[1]), '2.17.50')
+            self.assertEqual(str(res[1]), '2.17.50')
             self._exes['ld'] = '@(#)PROGRAM:ld  PROJECT:ld64-77'
             res = get_compiler_versions()
-            self.assertEquals(res[1], None)
+            self.assertEqual(res[1], None)
         else:
             self._exes['ld'] = 'GNU ld version 2.17.50 20060824'
             res = get_compiler_versions()
-            self.assertEquals(res[1], None)
+            self.assertEqual(res[1], None)
             self._exes['ld'] = '@(#)PROGRAM:ld  PROJECT:ld64-77'
             res = get_compiler_versions()
-            self.assertEquals(str(res[1]), '77')
+            self.assertEqual(str(res[1]), '77')
 
         # and dllwrap
         self._exes['dllwrap'] = 'GNU dllwrap 2.17.50 20060824\nFSF'
         res = get_compiler_versions()
-        self.assertEquals(str(res[2]), '2.17.50')
+        self.assertEqual(str(res[2]), '2.17.50')
         self._exes['dllwrap'] = 'Cheese Wrap'
         res = get_compiler_versions()
-        self.assertEquals(res[2], None)
+        self.assertEqual(res[2], None)
 
     @unittest.skipUnless(hasattr(sys, 'dont_write_bytecode'),
                           'no dont_write_bytecode support')
@@ -257,7 +259,10 @@
 
     def test_newer(self):
         self.assertRaises(DistutilsFileError, util.newer, 'xxx', 'xxx')
-
+        self.newer_f1 = tempfile.NamedTemporaryFile()
+        time.sleep(1)
+        self.newer_f2 = tempfile.NamedTemporaryFile()
+        self.assertTrue(util.newer(self.newer_f2.name, self.newer_f1.name))
 
     def test_find_packages(self):
         # let's create a structure we want to scan:
@@ -294,7 +299,38 @@
         self.write_file(os.path.join(pkg5, '__init__.py'))
 
         res = find_packages([root], ['pkg1.pkg2'])
-        self.assertEquals(set(res), set(['pkg1', 'pkg5', 'pkg1.pkg3', 'pkg1.pkg3.pkg6']))
+        self.assertEqual(set(res), set(['pkg1', 'pkg5', 'pkg1.pkg3', 'pkg1.pkg3.pkg6']))
+
+    @unittest.skipUnless(sys.version_info >= (2, 6), 'Need Python 2.6 or more')
+    def test_run_2to3_on_code(self):
+        content = "print 'test'"
+        converted_content = "print('test')"
+        file_handle = self.mktempfile()
+        file_name = file_handle.name
+        file_handle.write(content)
+        file_handle.flush()
+        file_handle.seek(0)
+        from distutils2.util import run_2to3
+        run_2to3([file_name])
+        new_content = "".join(file_handle.read())
+        file_handle.close()
+        self.assertEqual(new_content, converted_content)
+
+    @unittest.skipUnless(sys.version_info >= (2, 6), 'Need Python 2.6 or more')
+    def test_run_2to3_on_doctests(self):
+        # check that doctests in text files get converted
+        content = ">>> print 'test'\ntest\n"
+        converted_content = ">>> print('test')\ntest\n\n"
+        file_handle = self.mktempfile()
+        file_name = file_handle.name
+        file_handle.write(content)
+        file_handle.flush()
+        file_handle.seek(0)
+        from distutils2.util import run_2to3
+        run_2to3([file_name], doctests_only=True)
+        new_content = "".join(file_handle.readlines())
+        file_handle.close()
+        self.assertEqual(new_content, converted_content)
 
 
 def test_suite():
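The new test_newer case sleeps for a second so the two temporary files get distinct mtimes. A self-contained sketch of the comparison distutils2.util.newer performs, made deterministic with explicit os.utime calls instead of sleeping (the local newer() here mirrors the library function, it is not imported from it):

```python
import os
import tempfile

def newer(source, target):
    # mirrors the mtime comparison done by distutils2.util.newer
    if not os.path.exists(source):
        raise OSError("file '%s' does not exist" % os.path.abspath(source))
    if not os.path.exists(target):
        return True
    return os.stat(source).st_mtime > os.stat(target).st_mtime

tmp = tempfile.mkdtemp()
old_path = os.path.join(tmp, "old.txt")
new_path = os.path.join(tmp, "new.txt")
for path in (old_path, new_path):
    with open(path, "w") as f:
        f.write("stamp")

# set mtimes explicitly instead of sleeping, to keep the check deterministic
os.utime(old_path, (1000, 1000))
os.utime(new_path, (2000, 2000))

assert newer(new_path, old_path)
assert not newer(old_path, new_path)
```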
diff --git a/src/distutils2/tests/test_version.py b/src/distutils2/tests/test_version.py
--- a/src/distutils2/tests/test_version.py
+++ b/src/distutils2/tests/test_version.py
@@ -3,7 +3,7 @@
 import os
 
 from distutils2.version import NormalizedVersion as V
-from distutils2.version import IrrationalVersionError
+from distutils2.version import HugeMajorVersionNumError, IrrationalVersionError
 from distutils2.version import suggest_normalized_version as suggest
 from distutils2.version import VersionPredicate
 from distutils2.tests.support import unittest
@@ -22,18 +22,22 @@
                 (V('1.0.dev345'), '1.0.dev345'),
                 (V('1.0.post456.dev623'), '1.0.post456.dev623'))
 
+    def test_repr(self):
+
+        self.assertEqual(repr(V('1.0')), "NormalizedVersion('1.0')")
+
     def test_basic_versions(self):
 
         for v, s in self.versions:
-            self.assertEquals(str(v), s)
+            self.assertEqual(str(v), s)
 
     def test_from_parts(self):
 
         for v, s in self.versions:
             parts = v.parts
             v2 = V.from_parts(*v.parts)
-            self.assertEquals(v, v2)
-            self.assertEquals(str(v), str(v2))
+            self.assertEqual(v, v2)
+            self.assertEqual(str(v), str(v2))
 
     def test_irrational_versions(self):
 
@@ -44,6 +48,12 @@
         for s in irrational:
             self.assertRaises(IrrationalVersionError, V, s)
 
+    def test_huge_version(self):
+
+        self.assertEqual(str(V('1980.0')), '1980.0')
+        self.assertRaises(HugeMajorVersionNumError, V, '1981.0')
+        self.assertEqual(str(V('1981.0', error_on_huge_major_num=False)), '1981.0')
+
     def test_comparison(self):
         r"""
         >>> V('1.2.0') == '1.2'
@@ -51,12 +61,33 @@
         ...
         TypeError: cannot compare NormalizedVersion and str
 
+        >>> V('1.2') < '1.3'
+        Traceback (most recent call last):
+        ...
+        TypeError: cannot compare NormalizedVersion and str
+
         >>> V('1.2.0') == V('1.2')
         True
         >>> V('1.2.0') == V('1.2.3')
         False
+        >>> V('1.2.0') != V('1.2.3')
+        True
         >>> V('1.2.0') < V('1.2.3')
         True
+        >>> V('1.2.0') < V('1.2.0')
+        False
+        >>> V('1.2.0') <= V('1.2.0')
+        True
+        >>> V('1.2.0') <= V('1.2.3')
+        True
+        >>> V('1.2.3') <= V('1.2.0')
+        False
+        >>> V('1.2.0') >= V('1.2.0')
+        True
+        >>> V('1.2.3') >= V('1.2.0')
+        True
+        >>> V('1.2.0') >= V('1.2.3')
+        False
         >>> (V('1.0') > V('1.0b2'))
         True
         >>> (V('1.0') > V('1.0c2') > V('1.0c1') > V('1.0b2') > V('1.0b1')
@@ -96,34 +127,35 @@
 
     def test_suggest_normalized_version(self):
 
-        self.assertEquals(suggest('1.0'), '1.0')
-        self.assertEquals(suggest('1.0-alpha1'), '1.0a1')
-        self.assertEquals(suggest('1.0c2'), '1.0c2')
-        self.assertEquals(suggest('walla walla washington'), None)
-        self.assertEquals(suggest('2.4c1'), '2.4c1')
+        self.assertEqual(suggest('1.0'), '1.0')
+        self.assertEqual(suggest('1.0-alpha1'), '1.0a1')
+        self.assertEqual(suggest('1.0c2'), '1.0c2')
+        self.assertEqual(suggest('walla walla washington'), None)
+        self.assertEqual(suggest('2.4c1'), '2.4c1')
+        self.assertEqual(suggest('v1.0'), '1.0')
 
         # from setuptools
-        self.assertEquals(suggest('0.4a1.r10'), '0.4a1.post10')
-        self.assertEquals(suggest('0.7a1dev-r66608'), '0.7a1.dev66608')
-        self.assertEquals(suggest('0.6a9.dev-r41475'), '0.6a9.dev41475')
-        self.assertEquals(suggest('2.4preview1'), '2.4c1')
-        self.assertEquals(suggest('2.4pre1') , '2.4c1')
-        self.assertEquals(suggest('2.1-rc2'), '2.1c2')
+        self.assertEqual(suggest('0.4a1.r10'), '0.4a1.post10')
+        self.assertEqual(suggest('0.7a1dev-r66608'), '0.7a1.dev66608')
+        self.assertEqual(suggest('0.6a9.dev-r41475'), '0.6a9.dev41475')
+        self.assertEqual(suggest('2.4preview1'), '2.4c1')
+        self.assertEqual(suggest('2.4pre1') , '2.4c1')
+        self.assertEqual(suggest('2.1-rc2'), '2.1c2')
 
         # from pypi
-        self.assertEquals(suggest('0.1dev'), '0.1.dev0')
-        self.assertEquals(suggest('0.1.dev'), '0.1.dev0')
+        self.assertEqual(suggest('0.1dev'), '0.1.dev0')
+        self.assertEqual(suggest('0.1.dev'), '0.1.dev0')
 
         # we want to be able to parse Twisted
         # development versions are like post releases in Twisted
-        self.assertEquals(suggest('9.0.0+r2363'), '9.0.0.post2363')
+        self.assertEqual(suggest('9.0.0+r2363'), '9.0.0.post2363')
 
         # pre-releases are using markers like "pre1"
-        self.assertEquals(suggest('9.0.0pre1'), '9.0.0c1')
+        self.assertEqual(suggest('9.0.0pre1'), '9.0.0c1')
 
         # we want to be able to parse Tcl-TK
        # they use "p1" "p2" for post releases
-        self.assertEquals(suggest('1.4p1'), '1.4.post1')
+        self.assertEqual(suggest('1.4p1'), '1.4.post1')
 
     def test_predicate(self):
         # VersionPredicate knows how to parse stuff like:
@@ -151,15 +183,37 @@
         self.assertFalse(VersionPredicate('Hey (<=2.5)').match('2.6.0'))
         self.assertTrue(VersionPredicate('Hey (>=2.5)').match('2.5.1'))
 
+        self.assertRaises(ValueError, VersionPredicate, '')
+
         # XXX need to silent the micro version in this case
         #assert not VersionPredicate('Ho (<3.0,!=2.6)').match('2.6.3')
 
+    def test_is_final(self):
+        # NormalizedVersion knows whether a version is a final one or not.
+        final_versions = ('1.0', '1.0.post456')
+        other_versions = ('1.0.dev1', '1.0a2', '1.0c3')
+
+        for version in final_versions:
+            self.assertTrue(V(version).is_final)
+        for version in other_versions:
+            self.assertFalse(V(version).is_final)
+
+class VersionWhiteBoxTestCase(unittest.TestCase):
+
+    def test_parse_numdots(self):
+        # For code coverage completeness, as pad_zeros_length can't be set or
+        # influenced from the public interface
+        self.assertEqual(V('1.0')._parse_numdots('1.0', '1.0',
+                                                 pad_zeros_length=3),
+                         [1, 0, 0])
+
+
 def test_suite():
     #README = os.path.join(os.path.dirname(__file__), 'README.txt')
     #suite = [doctest.DocFileSuite(README), unittest.makeSuite(VersionTestCase)]
-    suite = [unittest.makeSuite(VersionTestCase)]
+    suite = [unittest.makeSuite(VersionTestCase),
+             unittest.makeSuite(VersionWhiteBoxTestCase)]
     return unittest.TestSuite(suite)
 
 if __name__ == "__main__":
     unittest.main(defaultTest="test_suite")
-
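The new is_final flag in the test_version changes above is set by the parser whenever it sees a pre-release block or a .dev marker. A rough regex sketch of that rule, matching the versions exercised by test_is_final (an approximation for illustration, not the parsing distutils2.version actually does):

```python
import re

def looks_final(version):
    # approximation: a version is final unless it carries a pre-release
    # marker (aN/bN/cN) or a .devN suffix; a .postN suffix alone stays final
    return re.search(r"\d(a|b|c)\d+|\.dev\d+", version) is None

assert looks_final("1.0")
assert looks_final("1.0.post456")
assert not looks_final("1.0a2")
assert not looks_final("1.0c3")
assert not looks_final("1.0.dev1")
```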
diff --git a/src/distutils2/util.py b/src/distutils2/util.py
--- a/src/distutils2/util.py
+++ b/src/distutils2/util.py
@@ -1,12 +1,15 @@
 """distutils.util
 
-Miscellaneous utility functions -- anything that doesn't fit into
-one of the other *util.py modules.
+Miscellaneous utility functions.
 """
 
 __revision__ = "$Id: util.py 77761 2010-01-26 22:46:15Z tarek.ziade $"
 
-import sys, os, string, re
+import sys
+import os
+import string
+import re
+from copy import copy
 from fnmatch import fnmatchcase
 
 from distutils2.errors import (DistutilsPlatformError, DistutilsFileError,
@@ -17,6 +20,7 @@
 
 _PLATFORM = None
 
+
 def newer(source, target):
     """Tells if the target is newer than the source.
 
@@ -37,6 +41,7 @@
 
     return os.stat(source).st_mtime > os.stat(target).st_mtime
 
+
 def get_platform():
     """Return a string that identifies the current platform.
 
@@ -48,6 +53,7 @@
         _PLATFORM = _sysconfig.get_platform()
     return _PLATFORM
 
+
 def set_platform(identifier):
     """Sets the platform string identifier returned by get_platform().
 
@@ -57,6 +63,7 @@
     global _PLATFORM
     _PLATFORM = identifier
 
+
 def convert_path(pathname):
     """Return 'pathname' as a name that will work on the native filesystem.
 
@@ -125,6 +132,7 @@
 
 _environ_checked = 0
 
+
 def check_environ():
     """Ensure that 'os.environ' has all the environment variables needed.
 
@@ -147,6 +155,7 @@
 
     _environ_checked = 1
 
+
 def subst_vars(s, local_vars):
     """Perform shell/Perl-style variable substitution on 'string'.
 
@@ -158,7 +167,8 @@
     variables not found in either 'local_vars' or 'os.environ'.
     """
     check_environ()
-    def _subst (match, local_vars=local_vars):
+
+    def _subst(match, local_vars=local_vars):
         var_name = match.group(1)
         if var_name in local_vars:
             return str(local_vars[var_name])
@@ -170,6 +180,7 @@
     except KeyError, var:
         raise ValueError("invalid variable '$%s'" % var)
 
+
 def grok_environment_error(exc, prefix="error: "):
     """Generate a useful error message from an EnvironmentError.
 
@@ -196,12 +207,14 @@
 # Needed by 'split_quoted()'
 _wordchars_re = _squote_re = _dquote_re = None
 
+
 def _init_regex():
     global _wordchars_re, _squote_re, _dquote_re
     _wordchars_re = re.compile(r'[^\\\'\"%s ]*' % string.whitespace)
     _squote_re = re.compile(r"'(?:[^'\\]|\\.)*'")
     _dquote_re = re.compile(r'"(?:[^"\\]|\\.)*"')
 
+
 def split_quoted(s):
     """Split a string up according to Unix shell-like rules for quotes and
     backslashes.
@@ -217,7 +230,8 @@
     # This is a nice algorithm for splitting up a single string, since it
     # doesn't require character-by-character examination.  It was a little
     # bit of a brain-bender to get it working right, though...
-    if _wordchars_re is None: _init_regex()
+    if _wordchars_re is None:
+        _init_regex()
 
     s = s.strip()
     words = []
@@ -237,8 +251,8 @@
 
         elif s[end] == '\\':            # preserve whatever is being escaped;
                                         # will become part of the current word
-            s = s[:end] + s[end+1:]
-            pos = end+1
+            s = s[:end] + s[end + 1:]
+            pos = end + 1
 
         else:
             if s[end] == "'":           # slurp singly-quoted string
@@ -253,7 +267,7 @@
                 raise ValueError("bad string (mismatched %s quotes?)" % s[end])
 
             (beg, end) = m.span()
-            s = s[:beg] + s[beg+1:end-1] + s[end:]
+            s = s[:beg] + s[beg + 1:end - 1] + s[end:]
             pos = m.end() - 2
 
         if pos >= len(s):
@@ -296,7 +310,7 @@
     elif val in ('n', 'no', 'f', 'false', 'off', '0'):
         return 0
     else:
-        raise ValueError, "invalid truth value %r" % (val,)
+        raise ValueError("invalid truth value %r" % (val,))
 
 
 def byte_compile(py_files, optimize=0, force=0, prefix=None, base_dir=None,
@@ -350,12 +364,8 @@
     # "Indirect" byte-compilation: write a temporary script and then
     # run it with the appropriate flags.
     if not direct:
-        try:
-            from tempfile import mkstemp
-            (script_fd, script_name) = mkstemp(".py")
-        except ImportError:
-            from tempfile import mktemp
-            (script_fd, script_name) = None, mktemp(".py")
+        from tempfile import mkstemp
+        script_fd, script_name = mkstemp(".py")
         log.info("writing byte-compilation script '%s'", script_name)
         if not dry_run:
             if script_fd is not None:
@@ -363,43 +373,50 @@
             else:
                 script = open(script_name, "w")
 
-            script.write("""\
-from distutils.util import byte_compile
+            try:
+                script.write("""\
+from distutils2.util import byte_compile
 files = [
 """)
 
-            # XXX would be nice to write absolute filenames, just for
-            # safety's sake (script should be more robust in the face of
-            # chdir'ing before running it).  But this requires abspath'ing
-            # 'prefix' as well, and that breaks the hack in build_lib's
-            # 'byte_compile()' method that carefully tacks on a trailing
-            # slash (os.sep really) to make sure the prefix here is "just
-            # right".  This whole prefix business is rather delicate -- the
-            # problem is that it's really a directory, but I'm treating it
-            # as a dumb string, so trailing slashes and so forth matter.
+                # XXX would be nice to write absolute filenames, just for
+                # safety's sake (script should be more robust in the face of
+                # chdir'ing before running it).  But this requires abspath'ing
+                # 'prefix' as well, and that breaks the hack in build_lib's
+                # 'byte_compile()' method that carefully tacks on a trailing
+                # slash (os.sep really) to make sure the prefix here is "just
+                # right".  This whole prefix business is rather delicate -- the
+                # problem is that it's really a directory, but I'm treating it
+                # as a dumb string, so trailing slashes and so forth matter.
 
-            #py_files = map(os.path.abspath, py_files)
-            #if prefix:
-            #    prefix = os.path.abspath(prefix)
+                #py_files = map(os.path.abspath, py_files)
+                #if prefix:
+                #    prefix = os.path.abspath(prefix)
 
-            script.write(",\n".join(map(repr, py_files)) + "]\n")
-            script.write("""
+                script.write(",\n".join(map(repr, py_files)) + "]\n")
+                script.write("""
 byte_compile(files, optimize=%r, force=%r,
              prefix=%r, base_dir=%r,
              verbose=%r, dry_run=0,
              direct=1)
 """ % (optimize, force, prefix, base_dir, verbose))
 
-            script.close()
+            finally:
+                script.close()
 
         cmd = [sys.executable, script_name]
         if optimize == 1:
             cmd.insert(1, "-O")
         elif optimize == 2:
             cmd.insert(1, "-OO")
-        spawn(cmd, dry_run=dry_run)
-        execute(os.remove, (script_name,), "removing %s" % script_name,
-                dry_run=dry_run)
+
+        env = copy(os.environ)
+        env['PYTHONPATH'] = ':'.join(sys.path)
+        try:
+            spawn(cmd, dry_run=dry_run, env=env)
+        finally:
+            execute(os.remove, (script_name,), "removing %s" % script_name,
+                    dry_run=dry_run)
 
     # "Direct" byte-compilation: use the py_compile module to compile
     # right here, right now.  Note that the script generated in indirect
@@ -447,7 +464,9 @@
     return sep.join(lines)
 
 _RE_VERSION = re.compile('(\d+\.\d+(\.\d+)*)')
-_MAC_OS_X_LD_VERSION = re.compile('^@\(#\)PROGRAM:ld  PROJECT:ld64-((\d+)(\.\d+)*)')
+_MAC_OS_X_LD_VERSION = re.compile('^@\(#\)PROGRAM:ld  '
+                                  'PROJECT:ld64-((\d+)(\.\d+)*)')
+
 
 def _find_ld_version():
     """Finds the ld version. The version scheme differs under Mac OSX."""
@@ -456,6 +475,7 @@
     else:
         return _find_exe_version('ld -v')
 
+
 def _find_exe_version(cmd, pattern=_RE_VERSION):
     """Find the version of an executable by running `cmd` in the shell.
 
@@ -485,6 +505,7 @@
         return None
     return result.group(1)
 
+
 def get_compiler_versions():
     """Returns a tuple providing the versions of gcc, ld and dllwrap
 
@@ -496,6 +517,7 @@
     dllwrap = _find_exe_version('dllwrap --version')
     return gcc, ld, dllwrap
 
+
 def newer_group(sources, target, missing='error'):
     """Return true if 'target' is out-of-date with respect to any file
     listed in 'sources'.
@@ -535,14 +557,18 @@
 
     return False
 
+
 def write_file(filename, contents):
     """Create a file with the specified name and write 'contents' (a
     sequence of strings without line terminators) to it.
     """
-    f = open(filename, "w")
-    for line in contents:
-        f.write(line + "\n")
-    f.close()
+    try:
+        f = open(filename, "w")
+        for line in contents:
+            f.write(line + "\n")
+    finally:
+        f.close()
+
 
 def _is_package(path):
    """Returns True if path is a package (a dir with an __init__ file)."""
@@ -550,6 +576,7 @@
         return False
     return os.path.isfile(os.path.join(path, '__init__.py'))
 
+
 def _under(path, root):
     path = path.split(os.sep)
     root = root.split(os.sep)
@@ -560,12 +587,14 @@
             return False
     return True
 
+
 def _package_name(root_path, path):
     """Returns a dotted package name, given a subpath."""
     if not _under(path, root_path):
         raise ValueError('"%s" is not a subpath of "%s"' % (path, root_path))
     return path[len(root_path) + 1:].replace(os.sep, '.')
 
+
 def find_packages(paths=('.',), exclude=()):
    """Return a list of all Python packages found recursively within
     directories 'paths'
@@ -580,6 +609,7 @@
     """
     packages = []
     discarded = []
+
     def _discarded(path):
         for discard in discarded:
             if _under(path, discard):
@@ -611,3 +641,50 @@
                 packages.append(package_name)
     return packages
 
+
+# utility functions for 2to3 support
+
+def run_2to3(files, doctests_only=False, fixer_names=None, options=None,
+                                                            explicit=None):
+    """Invoke 2to3 on a list of Python files, delegating the actual
+    conversions to DistutilsRefactoringTool.
+    The files should all come from the build area, as the modification
+    is done in-place."""
+
+    if not files:
+        return
+
+    # Keep these imports local, to delay loading 2to3 until it is needed
+    from lib2to3.refactor import get_fixers_from_package
+    from distutils2.converter.refactor import DistutilsRefactoringTool
+
+    if fixer_names is None:
+        fixer_names = get_fixers_from_package('lib2to3.fixes')
+
+    r = DistutilsRefactoringTool(fixer_names, options=options)
+    if doctests_only:
+        r.refactor(files, doctests_only=True, write=True)
+    else:
+        r.refactor(files, write=True)
+
+
+class Mixin2to3:
+    """ Wrapper class for commands that run 2to3.
+    To configure 2to3, setup scripts may either change
+    the class variables, or inherit from this class
+    to override how 2to3 is invoked.
+    """
+    # provide list of fixers to run.
+    # defaults to all fixers from lib2to3.fixes
+    fixer_names = None
+
+    # options dictionary
+    options = None
+
+    # list of fixers to invoke even though they are marked as explicit
+    explicit = None
+
+    def run_2to3(self, files, doctests_only=False):
+        """ Issues a call to util.run_2to3. """
+        return run_2to3(files, doctests_only, self.fixer_names,
+                        self.options, self.explicit)
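One behavioral change in the util.py hunks above is that indirect byte-compilation now spawns the child interpreter with the parent's sys.path exported through PYTHONPATH. A minimal sketch of that environment hand-off (using dict() to copy the environment and os.pathsep instead of the hard-coded ':', which is Unix-only; both substitutions are mine, not from the patch):

```python
import os
import subprocess
import sys

# copy the environment and export the parent's sys.path, as byte_compile does
env = dict(os.environ)
env['PYTHONPATH'] = os.pathsep.join(p for p in sys.path if p)

# the child interpreter should see exactly the value we exported
out = subprocess.check_output(
    [sys.executable, "-c", "import os; print(os.environ['PYTHONPATH'])"],
    env=env)
assert out.decode().strip() == env['PYTHONPATH']
```

Without this hand-off, a child interpreter started from a build directory could fail to import the freshly built distutils2 package that the generated script needs.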
diff --git a/src/distutils2/version.py b/src/distutils2/version.py
--- a/src/distutils2/version.py
+++ b/src/distutils2/version.py
@@ -36,6 +36,7 @@
     (?P<postdev>(\.post(?P<post>\d+))?(\.dev(?P<dev>\d+))?)?
     $''', re.VERBOSE)
 
+
 class NormalizedVersion(object):
     """A rational version.
 
@@ -61,7 +62,8 @@
         @param error_on_huge_major_num {bool} Whether to consider an
             apparent use of a year or full date as the major version number
             an error. Default True. One of the observed patterns on PyPI before
-            the introduction of `NormalizedVersion` was version numbers like this:
+            the introduction of `NormalizedVersion` was version numbers like
+            this:
                 2009.01.03
                 20040603
                 2005.01
@@ -71,6 +73,7 @@
             the possibility of using a version number like "1.0" (i.e.
             where the major number is less than that huge major number).
         """
+        self.is_final = True  # by default, consider a version as final.
         self._parse(s, error_on_huge_major_num)
 
     @classmethod
@@ -101,6 +104,7 @@
             block += self._parse_numdots(groups.get('prerelversion'), s,
                                          pad_zeros_length=1)
             parts.append(tuple(block))
+            self.is_final = False
         else:
             parts.append(_FINAL_MARKER)
 
@@ -115,6 +119,7 @@
                     postdev.append(_FINAL_MARKER[0])
             if dev is not None:
                 postdev.extend(['dev', int(dev)])
+                self.is_final = False
             parts.append(tuple(postdev))
         else:
             parts.append(_FINAL_MARKER)
@@ -204,6 +209,7 @@
     # See http://docs.python.org/reference/datamodel#object.__hash__
     __hash__ = object.__hash__
 
+
 def suggest_normalized_version(s):
     """Suggest a normalized version close to the given version string.
 
@@ -215,7 +221,7 @@
     on observation of versions currently in use on PyPI. Given a dump of
     those version during PyCon 2009, 4287 of them:
     - 2312 (53.93%) match NormalizedVersion without change
-    - with the automatic suggestion
+      with the automatic suggestion
     - 3474 (81.04%) match when using this suggestion method
 
     @param s {str} An irrational version string.
@@ -305,7 +311,6 @@
     # PyPI stats: ~21 (0.62%) better
     rs = re.sub(r"\.?(pre|preview|-c)(\d+)$", r"c\g<2>", rs)
 
-
     # Tcl/Tk uses "px" for their post release markers
     rs = re.sub(r"p(\d+)$", r".post\1", rs)
 
@@ -322,6 +327,7 @@
 _PLAIN_VERSIONS = re.compile(r"^\s*(.*)\s*$")
 _SPLIT_CMP = re.compile(r"^\s*(<=|>=|<|>|!=|==)\s*([^\s,]+)\s*$")
 
+
 def _split_predicate(predicate):
     match = _SPLIT_CMP.match(predicate)
     if match is None:
@@ -368,26 +374,25 @@
                 return False
         return True
 
+
 class _Versions(VersionPredicate):
     def __init__(self, predicate):
         predicate = predicate.strip()
         match = _PLAIN_VERSIONS.match(predicate)
-        if match is None:
-            raise ValueError('Bad predicate "%s"' % predicate)
         self.name = None
         predicates = match.groups()[0]
         self.predicates = [_split_predicate(pred.strip())
                            for pred in predicates.split(',')]
 
+
 class _Version(VersionPredicate):
     def __init__(self, predicate):
         predicate = predicate.strip()
         match = _PLAIN_VERSIONS.match(predicate)
-        if match is None:
-            raise ValueError('Bad predicate "%s"' % predicate)
         self.name = None
         self.predicates = _split_predicate(match.groups()[0])
 
+
 def is_valid_predicate(predicate):
     try:
         VersionPredicate(predicate)
@@ -396,6 +401,7 @@
     else:
         return True
 
+
 def is_valid_versions(predicate):
     try:
         _Versions(predicate)
@@ -404,6 +410,7 @@
     else:
         return True
 
+
 def is_valid_version(predicate):
     try:
         _Version(predicate)
@@ -411,4 +418,3 @@
         return False
     else:
         return True
-
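The predicate classes in the version.py hunks above all lean on _split_predicate and the _SPLIT_CMP regex. A standalone sketch of the comparator splitting, reusing the same pattern (the plain-version fallback to '==' is an illustrative assumption, not necessarily what distutils2's _split_predicate does):

```python
import re

# same comparator-splitting regex as distutils2.version._SPLIT_CMP
_SPLIT_CMP = re.compile(r"^\s*(<=|>=|<|>|!=|==)\s*([^\s,]+)\s*$")

def split_predicate(predicate):
    match = _SPLIT_CMP.match(predicate)
    if match is None:
        # no explicit comparator: treat as equality (illustrative fallback)
        return '==', predicate.strip()
    return match.group(1), match.group(2)

assert split_predicate(" <= 2.5 ") == ('<=', '2.5')
assert split_predicate(">=1.0") == ('>=', '1.0')
assert split_predicate("1.2") == ('==', '1.2')
```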
diff --git a/src/runtests-cov.py b/src/runtests-cov.py
new file mode 100755
--- /dev/null
+++ b/src/runtests-cov.py
@@ -0,0 +1,119 @@
+#!/usr/bin/env python
+"""Tests for distutils2.
+
+The tests for distutils2 are defined in the distutils2.tests package.
+"""
+
+import sys
+from os.path import dirname, islink, realpath
+from optparse import OptionParser
+
+def ignore_prefixes(module):
+    """ Return a list of prefixes to ignore in the coverage report if
+    we want to completely skip `module`.
+    """
+    # A function like that is needed because some GNU/Linux
+    # distributions, such a Ubuntu, really like to build link farm in
+    # /usr/lib in order to save a few bytes on the disk.
+    dirnames = [dirname(module.__file__)]
+    
+    pymod = module.__file__.rstrip("c")
+    if islink(pymod):
+        dirnames.append(dirname(realpath(pymod)))
+    return dirnames
+
+def parse_opts():
+    parser = OptionParser(usage="%prog [OPTIONS]", 
+                          description="run the distutils2 unittests")
+    
+    parser.add_option("-q", "--quiet", help="do not print verbose messages", 
+                      action="store_true", default=False)
+    parser.add_option("-c", "--coverage", action="store_true", default=False,
+                      help="produce a coverage report at the end of the run")
+    parser.add_option("-r", "--report", action="store_true", default=False,
+                      help="produce a coverage report from the last test run")
+    parser.add_option("-m", "--show-missing", action="store_true", 
+                      default=False,
+                      help=("Show line numbers of statements in each module "
+                            "that weren't executed."))
+    
+    opts, args = parser.parse_args()
+    return opts, args
+
+def coverage_report(opts):
+    import coverage
+    import unittest2
+    cov = coverage.coverage()
+    cov.load()
+
+    prefixes = ["runtests", "distutils2/tests", "distutils2/_backport"]
+    prefixes += ignore_prefixes(unittest2)
+
+    try:
+        import docutils
+        prefixes += ignore_prefixes(docutils)
+    except ImportError:
+        # that module is completely optional
+        pass
+
+    try:
+        import roman
+        prefixes += ignore_prefixes(roman)
+    except ImportError:
+        # that module is also completely optional
+        pass
+
+    cov.report(omit_prefixes=prefixes, show_missing=opts.show_missing)
+
+
+def test_main():
+    opts, args = parse_opts()
+    verbose = not opts.quiet
+    ret = 0
+    
+    if opts.coverage or opts.report:
+        import coverage
+        
+    if opts.coverage:
+        cov = coverage.coverage()
+        cov.erase()
+        cov.start()
+    if not opts.report:
+        ret = run_tests(verbose)
+    if opts.coverage:
+        cov.stop()
+        cov.save()
+
+    if opts.report or opts.coverage:
+        coverage_report(opts)
+    
+    return ret
+    
+def run_tests(verbose):
+    import distutils2.tests
+    from distutils2.tests import run_unittest, reap_children, TestFailed
+    from distutils2._backport.tests import test_suite as btest_suite
+    # verbosity is decided by the optparse options handled in test_main(),
+    # so we use the 'verbose' argument as-is instead of re-checking sys.argv
+    try:
+        try:
+            run_unittest([distutils2.tests.test_suite(), btest_suite()],
+                         verbose_=verbose)
+            return 0
+        except TestFailed:
+            return 1
+    finally:
+        reap_children()
+
+if __name__ == "__main__":
+    try:
+        from distutils2.tests.support import unittest
+    except ImportError:
+        sys.stderr.write('Error: You have to install unittest2\n')
+        sys.exit(1)
+
+    sys.exit(test_main())
+
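The link-farm workaround in `ignore_prefixes` above can be exercised directly: for a module reached through a symlink, `dirname(module.__file__)` and `dirname(realpath(...))` differ, and both prefixes must be ignored. A small sketch of the same logic using a throwaway symlink (POSIX only; the file and directory names are made up):

```python
import os
import tempfile
from os.path import dirname, islink, realpath, join

def ignore_prefixes_for(path):
    # Same idea as ignore_prefixes: report both the visible directory
    # and, when the file is a symlink, the directory of its real target
    dirnames = [dirname(path)]
    if islink(path):
        dirnames.append(dirname(realpath(path)))
    return dirnames

# Build a tiny "link farm": the real file in one directory,
# a symlink to it in another
base = tempfile.mkdtemp()
real_dir = join(base, 'real')
link_dir = join(base, 'links')
os.mkdir(real_dir)
os.mkdir(link_dir)
real_file = join(real_dir, 'mod.py')
open(real_file, 'w').close()
link_file = join(link_dir, 'mod.py')
os.symlink(real_file, link_file)

prefixes = ignore_prefixes_for(link_file)
```

Both prefixes come back, so a coverage report omitting them skips the module whichever path the interpreter recorded.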
diff --git a/src/runtests.py b/src/runtests.py
--- a/src/runtests.py
+++ b/src/runtests.py
@@ -1,6 +1,6 @@
 """Tests for distutils2.
 
-The tests for distutils2 are defined in the distutils2.tests package;
+The tests for distutils2 are defined in the distutils2.tests package.
 """
 import sys
 
@@ -8,8 +8,7 @@
     import distutils2.tests
     from distutils2.tests import run_unittest, reap_children, TestFailed
     from distutils2._backport.tests import test_suite as btest_suite
-    # just supporting -q right now
-    # to enable detailed/quiet output
+    # XXX just supporting -q right now to enable detailed/quiet output
     if len(sys.argv) > 1:
         verbose = sys.argv[-1] != '-q'
     else:
@@ -17,7 +16,7 @@
     try:
         try:
             run_unittest([distutils2.tests.test_suite(), btest_suite()],
-                    verbose_=verbose)
+                         verbose_=verbose)
             return 0
         except TestFailed:
             return 1
@@ -28,7 +27,7 @@
     try:
         from distutils2.tests.support import unittest
     except ImportError:
-        print('Error: You have to install unittest2')
+        sys.stderr.write('Error: You have to install unittest2\n')
         sys.exit(1)
 
     sys.exit(test_main())
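The option handling that distinguishes runtests-cov.py from the plain runner above lives in its `parse_opts` helper, which is stock `optparse` usage: each `store_true` option surfaces as a boolean attribute on the returned object. A trimmed-down illustration (`optparse` is what this era of distutils2 targeted; newer code would reach for `argparse`):

```python
from optparse import OptionParser

parser = OptionParser(usage="%prog [OPTIONS]",
                      description="run the distutils2 unittests")
parser.add_option("-q", "--quiet", action="store_true", default=False,
                  help="do not print verbose messages")
parser.add_option("-c", "--coverage", action="store_true", default=False,
                  help="produce a coverage report at the end of the run")

# Parse an explicit argument list instead of sys.argv[1:]
opts, args = parser.parse_args(["-q", "extra-arg"])
```

Unrecognized positional arguments land in `args`, which is why `parse_opts` returns both values.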
diff --git a/src/setup.py b/src/setup.py
--- a/src/setup.py
+++ b/src/setup.py
@@ -3,8 +3,12 @@
 __revision__ = "$Id$"
 import sys
 import os
+import re
 
-from distutils2.core import setup
+from distutils2 import log
+from distutils2.core import setup, Extension
+from distutils2.compiler.ccompiler import new_compiler
 from distutils2.command.sdist import sdist
 from distutils2.command.install import install
 from distutils2 import __version__ as VERSION
@@ -31,6 +35,7 @@
 
 DEV_SUFFIX = '.dev%d' % get_tip_revision('..')
 
+
 class install_hg(install):
 
     user_options = install.user_options + [
@@ -62,10 +67,141 @@
             self.distribution.metadata.version += DEV_SUFFIX
         sdist.run(self)
 
+
+# additional paths to check, set from the command line
+SSL_INCDIR = ''   # --openssl-incdir=
+SSL_LIBDIR = ''   # --openssl-libdir=
+SSL_DIR = ''      # --openssl-prefix=
+
+def add_dir_to_list(dirlist, dir):
+    """Add the directory 'dir' to the list 'dirlist' (at the front) if
+    'dir' actually exists and is a directory.  If 'dir' is already in
+    'dirlist' it is moved to the front.
+    """
+    if dir is not None and os.path.isdir(dir):
+        if dir in dirlist:
+            dirlist.remove(dir)
+        dirlist.insert(0, dir)
+
+
+def prepare_hashlib_extensions():
+    """Decide which C extensions to build and create the appropriate
+    Extension objects to build them.  Return a list of Extensions.
+    """
+    # this CCompiler object is only used to locate include files
+    compiler = new_compiler()
+
+    # Ensure that these paths are always checked
+    if os.name == 'posix':
+        add_dir_to_list(compiler.library_dirs, '/usr/local/lib')
+        add_dir_to_list(compiler.include_dirs, '/usr/local/include')
+
+        add_dir_to_list(compiler.library_dirs, '/usr/local/ssl/lib')
+        add_dir_to_list(compiler.include_dirs, '/usr/local/ssl/include')
+
+        add_dir_to_list(compiler.library_dirs, '/usr/contrib/ssl/lib')
+        add_dir_to_list(compiler.include_dirs, '/usr/contrib/ssl/include')
+
+        add_dir_to_list(compiler.library_dirs, '/usr/lib')
+        add_dir_to_list(compiler.include_dirs, '/usr/include')
+
+    # look in command line supplied paths
+    if SSL_LIBDIR:
+        add_dir_to_list(compiler.library_dirs, SSL_LIBDIR)
+    if SSL_INCDIR:
+        add_dir_to_list(compiler.include_dirs, SSL_INCDIR)
+    if SSL_DIR:
+        if os.name == 'nt':
+            add_dir_to_list(compiler.library_dirs, os.path.join(SSL_DIR, 'out32dll'))
+            # prefer the static library
+            add_dir_to_list(compiler.library_dirs, os.path.join(SSL_DIR, 'out32'))
+        else:
+            add_dir_to_list(compiler.library_dirs, os.path.join(SSL_DIR, 'lib'))
+        add_dir_to_list(compiler.include_dirs, os.path.join(SSL_DIR, 'include'))
+
+    oslibs = {'posix': ['ssl', 'crypto'],
+              'nt': ['libeay32',  'gdi32', 'advapi32', 'user32']}
+
+    if os.name not in oslibs:
+        sys.stderr.write(
+            'unknown operating system, cannot compile _hashlib\n')
+        sys.exit(1)
+
+    exts = []
+
+    ssl_inc_dirs = []
+    ssl_incs = []
+    for inc_dir in compiler.include_dirs:
+        f = os.path.join(inc_dir, 'openssl', 'ssl.h')
+        if os.path.exists(f):
+            ssl_incs.append(f)
+            ssl_inc_dirs.append(inc_dir)
+
+    ssl_lib = compiler.find_library_file(compiler.library_dirs, oslibs[os.name][0])
+
+    # find out which version of OpenSSL we have
+    openssl_ver = 0
+    openssl_ver_re = re.compile(
+        r'^\s*#\s*define\s+OPENSSL_VERSION_NUMBER\s+(0x[0-9a-fA-F]+)')
+    ssl_inc_dir = ''
+    for ssl_inc_dir in ssl_inc_dirs:
+        name = os.path.join(ssl_inc_dir, 'openssl', 'opensslv.h')
+        if os.path.isfile(name):
+            try:
+                incfile = open(name, 'r')
+                try:
+                    for line in incfile:
+                        m = openssl_ver_re.match(line)
+                        if m:
+                            openssl_ver = int(m.group(1), 16)
+                            break
+                finally:
+                    # ensure the header file is closed even on early break
+                    incfile.close()
+            except IOError:
+                pass
+
+        # first version found is what we'll use
+        if openssl_ver:
+            break
+
+    if (ssl_inc_dir and ssl_lib is not None and openssl_ver >= 0x00907000):
+
+        log.info('Using OpenSSL version 0x%08x from', openssl_ver)
+        log.info(' Headers:\t%s', ssl_inc_dir)
+        log.info(' Library:\t%s', ssl_lib)
+
+        # The _hashlib module wraps optimized implementations
+        # of hash functions from the OpenSSL library.
+        exts.append(Extension('_hashlib', ['_hashopenssl.c'],
+                              include_dirs = [ssl_inc_dir],
+                              library_dirs = [os.path.dirname(ssl_lib)],
+                              libraries = oslibs[os.name]))
+    else:
+        exts.append(Extension('_sha', ['shamodule.c']))
+        exts.append(Extension('_md5',
+                              sources=['md5module.c', 'md5.c'],
+                              depends=['md5.h']))
+
+    if (not ssl_lib or openssl_ver < 0x00908000):
+        # OpenSSL doesn't do these until 0.9.8 so we'll bring our own
+        exts.append(Extension('_sha256', ['sha256module.c']))
+        exts.append(Extension('_sha512', ['sha512module.c']))
+
+    def prepend_modules(filename):
+        return os.path.join('Modules', filename)
+
+    # all the C code is in the Modules subdirectory, prepend the path
+    for ext in exts:
+        ext.sources = [prepend_modules(fn) for fn in ext.sources]
+        if hasattr(ext, 'depends') and ext.depends is not None:
+            ext.depends = [prepend_modules(fn) for fn in ext.depends]
+
+    return exts
+
 setup_kwargs = {}
 if sys.version < '2.6':
     setup_kwargs['scripts'] = ['distutils2/mkpkg.py']
 
+if sys.version < '2.5':
+    setup_kwargs['ext_modules'] = prepare_hashlib_extensions()
+
 _CLASSIFIERS = """\
 Development Status :: 3 - Alpha
 Intended Audience :: Developers
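The `add_dir_to_list` helper near the top of this setup.py hunk promises move-to-front semantics in its docstring. A minimal sketch of that intended logic, runnable in isolation:

```python
import os
import tempfile

def add_dir_to_list(dirlist, dir):
    """Prepend 'dir' to 'dirlist' if it is an existing directory;
    if it is already listed, move it to the front rather than duplicate it."""
    if dir is not None and os.path.isdir(dir):
        if dir in dirlist:
            dirlist.remove(dir)
        dirlist.insert(0, dir)

a = tempfile.mkdtemp()
b = tempfile.mkdtemp()

dirs = []
add_dir_to_list(dirs, a)               # [a]
add_dir_to_list(dirs, b)               # [b, a]
add_dir_to_list(dirs, a)               # moved back to front: [a, b]
add_dir_to_list(dirs, '/no/such/dir')  # nonexistent path is ignored
```

The ordering matters because the compiler searches these lists front to back, so the most recently added (most specific) directory wins.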
@@ -77,26 +213,23 @@
 Topic :: System :: Systems Administration
 Topic :: Utilities"""
 
-setup (name="Distutils2",
-       version=VERSION,
-       summary="Python Distribution Utilities",
-       keywords=['packaging', 'distutils'],
-       author="Tarek Ziade",
-       author_email="tarek at ziade.org",
-       home_page="http://bitbucket.org/tarek/distutils2/wiki/Home",
-       license="PSF",
-       description=README,
-       classifier=_CLASSIFIERS.split('\n'),
-       packages=find_packages(),
-       cmdclass={'sdist': sdist_hg, 'install': install_hg},
-       package_data={'distutils2._backport': ['sysconfig.cfg']},
-       project_url=[('Mailing-list',
+setup(name="Distutils2",
+      version=VERSION,
+      summary="Python Distribution Utilities",
+      keywords=['packaging', 'distutils'],
+      author="Tarek Ziade",
+      author_email="tarek at ziade.org",
+      home_page="http://bitbucket.org/tarek/distutils2/wiki/Home",
+      license="PSF",
+      description=README,
+      classifier=_CLASSIFIERS.split('\n'),
+      packages=find_packages(),
+      cmdclass={'sdist': sdist_hg, 'install': install_hg},
+      package_data={'distutils2._backport': ['sysconfig.cfg']},
+      project_url=[('Mailing list',
                     'http://mail.python.org/mailman/listinfo/distutils-sig/'),
-                    ('Documentation',
-                     'http://packages.python.org/Distutils2'),
-                    ('Repository', 'http://hg.python.org/distutils2'),
-                    ('Bug tracker', 'http://bugs.python.org')],
-       **setup_kwargs
-       )
-
-
+                   ('Documentation',
+                    'http://packages.python.org/Distutils2'),
+                   ('Repository', 'http://hg.python.org/distutils2'),
+                   ('Bug tracker', 'http://bugs.python.org')],
+      **setup_kwargs)
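The OpenSSL detection in `prepare_hashlib_extensions` above hinges on matching the `OPENSSL_VERSION_NUMBER` #define and parsing its hex value. A self-contained sketch of just that parse, with the header text inlined rather than read from a real `opensslv.h`:

```python
import re

# Raw string avoids the invalid-escape warnings that python -Wd / -3 flag
openssl_ver_re = re.compile(
    r'^\s*#\s*define\s+OPENSSL_VERSION_NUMBER\s+(0x[0-9a-fA-F]+)')

def find_openssl_version(header_text):
    """Return OPENSSL_VERSION_NUMBER as an int, or 0 if not found."""
    for line in header_text.splitlines():
        m = openssl_ver_re.match(line)
        if m:
            return int(m.group(1), 16)
    return 0

# Inlined excerpt standing in for a real opensslv.h
sample = '''
/* opensslv.h (excerpt) */
# define OPENSSL_VERSION_NUMBER  0x0090802fL
'''
ver = find_openssl_version(sample)
```

The trailing `L` suffix in the header falls outside the captured group, so only the hex digits reach `int(..., 16)`; the result compares directly against thresholds like `0x00907000`.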
diff --git a/src/tests.sh b/src/tests.sh
--- a/src/tests.sh
+++ b/src/tests.sh
@@ -1,28 +1,40 @@
 #!/bin/sh
-echo -n "Running tests for Python 2.4..."
-python2.4 runtests.py -q > /dev/null 2> /dev/null
+echo -n "Running tests for Python 2.4... "
+rm -rf _hashlib.so
+python2.4 setup.py build_ext -i -q 2> /dev/null > /dev/null
+python2.4 -Wd runtests.py -q 2> /dev/null
 if [ $? -ne 0 ];then
     echo "Failed"
-    exit $1
+    exit 1
 else
     echo "Success"
 fi
 
-echo -n "Running tests for Python 2.5..."
-python2.5 runtests.py -q > /dev/null 2> /dev/null
+echo -n "Running tests for Python 2.5... "
+rm -rf _hashlib.so
+python2.5 setup.py build_ext -i -q 2> /dev/null > /dev/null
+python2.5 -Wd runtests.py -q 2> /dev/null
 if [ $? -ne 0 ];then
     echo "Failed"
-    exit $1
+    exit 1
 else
     echo "Success"
 fi
 
-echo -n "Running tests for Python 2.6..."
-python2.6 runtests.py -q > /dev/null 2> /dev/null
+echo -n "Running tests for Python 2.6... "
+python2.6 -Wd -bb -3 runtests.py -q 2> /dev/null
 if [ $? -ne 0 ];then
     echo "Failed"
-    exit $1
+    exit 1
 else
     echo "Success"
 fi
 
+echo -n "Running tests for Python 2.7... "
+python2.7 -Wd -bb -3 runtests.py -q 2> /dev/null
+if [ $? -ne 0 ];then
+    echo "Failed"
+    exit 1
+else
+    echo "Success"
+fi
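One caveat on the `sys.version < '2.6'` and `sys.version < '2.5'` guards in setup.py above: those are plain string comparisons, which happen to work for the 2.x releases this script targets but misorder any component that reaches two digits. Comparing integer tuples via `sys.version_info` is the robust form; a sketch:

```python
import sys

# Lexicographic comparison misorders two-digit components:
# '2.10' sorts before '2.6' even though it is numerically newer
assert '2.10' < '2.6'

def version_at_least(major, minor):
    # Compare integer tuples instead of the version string
    return sys.version_info[:2] >= (major, minor)
```

For the interpreters exercised by tests.sh (2.4 through 2.7) the string form is harmless, but the tuple form costs nothing and never surprises.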
