Hi all --
as promised, I have finally checked in some real Distutils code.
There's just enough functionality for the Distutils to build and install
itself. To try it out, you'll need to download the code from the
anonymous CVS archive at cvs.python.org; see
http://www.python.org/sigs/distutils-sig/cvs.html
for instructions.
If you just want to try it out, then from the top-level Distutils
directory (the one that has setup.py), run:
./setup.py build install
This will copy all the .py files to a mockup installation tree in
'build', compile them (to .pyc -- don't run setup.py with "python -O",
as I don't handle that yet and it will get confused), and then copy the
whole mockup tree to the 'site-packages' directory under your Python
library directory. Figuring out the installation directory relies on Fred
Drake's 'sysconfig' module -- I don't recall what the status of
sysconfig is on non-Unix systems; I hope someone out there can try it and
let us know! Come to think of it, I'd like to hear how it works on
*any* system apart from my Red Hat Linux 5.2, stock Python 1.5.1, home
PC. ;-)
If you want to build in a different directory, use the '--basedir'
option to the 'build' command; to install in a different place, you can
use either the '--prefix' or '--install-lib' options to the 'install'
command. Here are a couple of samples to demonstrate:
./setup.py build --basedir=/tmp/build install --prefix=/tmp/usr/local
or
./setup.py build --basedir=/tmp/build
./setup.py install --build-base=/tmp/build --prefix=/tmp/usr/local
Go crazy. No documentation yet -- please read the code! Start with
distutils/core.py which, as its name implies, is the start of
everything. I went nuts with docstrings in that module last night, so
hopefully there's just enough there that you can figure things out in
the absence of the "Distutils Implementation Notes" document that I'd
like to write.
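To give a feel for what core.py expects, a minimal setup script looks
roughly like the sketch below. Treat it as a hedged sketch only: the exact
keyword arguments accepted by this first code drop may differ, and the
docstrings in distutils/core.py are the authority.

#!/usr/bin/env python
# Hedged sketch of a minimal setup script; the keyword arguments shown are
# illustrative and may not all exist in this first code drop -- check the
# docstrings in distutils/core.py for what it really accepts.
from distutils.core import setup

setup(name="mypackage",            # hypothetical distribution name
      version="0.1",
      description="An example package",
      packages=["mypackage"])      # package directories of .py files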
Greg
--
Greg Ward - software developer gward(a)cnri.reston.va.us
Corporation for National Research Initiatives
1895 Preston White Drive voice: +1-703-620-8990 x287
Reston, Virginia, USA 20191-5434 fax: +1-703-620-0913
Just to chime in about the unmentionable, I have a tool for connecting
Fortran and Python (as do others; mine isn't ready for the light of day yet)
and unfortunately the existing Setup scheme makes me compile my Fortran
separately into a library. I don't have any way to piggyback on what Python
has learned about the platform. This is too bad. I am sure it isn't the
Python community's job to solve the Fortran 90 make problem but there are
days when I dream about it.
The "right" object-oriented approach should allow some way to extend the
system, e.g. adding suffixes or file names together with info about how to
compile them.
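One way such a hook might look -- purely hypothetical, nothing like this
exists in the current code -- is a small registry mapping source suffixes to
compile handlers, so an external tool (a Fortran wrapper, say) could teach
the build system about new file types:

# Purely hypothetical sketch: let third-party tools register new source
# suffixes (Fortran, in this case) with a handler that knows how to turn
# one source file into an object file.
import os

class CompilerRegistry:
    def __init__(self):
        self.handlers = {}                  # maps suffix -> handler function

    def register_suffix(self, suffix, handler):
        self.handlers[suffix] = handler

    def compile(self, source, objdir):
        base, ext = os.path.splitext(os.path.basename(source))
        if not self.handlers.has_key(ext):
            raise ValueError, "don't know how to compile " + source
        return self.handlers[ext](source, os.path.join(objdir, base + ".o"))

# A Fortran 90 handler supplied by some external tool (command is made up).
def compile_f90(source, objfile):
    os.system("f90 -c %s -o %s" % (source, objfile))
    return objfile

registry = CompilerRegistry()
registry.register_suffix(".f90", compile_f90)
registry.register_suffix(".f", compile_f90)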
-----Original Message-----
From: Mark Hammond <MHammond(a)skippinet.com.au>
To: distutils-sig(a)python.org <distutils-sig(a)python.org>
Date: Tuesday, March 30, 1999 3:50 PM
Subject: RE: [Distutils] Compiler abstraction model
>> > Quoth David Ascher, on 29 March 1999:
>> > > On windows, it is sometimes needed to specify other files
>> > > which aren't .c files, but .def files (possibly all can be
>> > > done with command line options, but might as well build this
>> > > in). I don't know how these should fit in...
>> >
>> > Are they just listed in the command line like .c files? Or are
>> > they specified by a command-line option? Would you use these in
>> > code that's meant to be portable to other platforms?
>>
>> Yes, no, and yes. =)
>>
>> E.g. Python extensions need to declare the 'exported' entry
>> point. This can be done either by modifying the source code (bad
>> for portable code, requires #ifdef's etc.), by specifying a
>> command-line option, or by including a .DEF file.
>
>Just to follow up on this, many obscure options may need to be
>passed to the Windows linker, but they do require their own
>option - they are not passed as normal files. Examples are
>/DEF: - the .def file David mentions, /NOD:lib - to prevent a
>default library from being linked, /implib:lib - to name the
>built .lib file, etc.
>
>It wasn't clear from David's post that these need their own
>option, and are not passed as a simple filename param to the
>linker....
>
>Building from what Greg said, I agree that _certain_ command-line
>params can be mandated - they are designed not to be
>configurable, as messing with them will likely break the build.
>But there also needs to be a secondary class that is virtually
>unconstrained.
>
>[Sorry - as usual, speaking having kept only a quick eye on the
>posts, and not having seen the latest code drop...]
>
>Mark.
Hi all --
I've finally done some thinking and scribbling on how to build
extensions -- well, C/C++ extensions for CPython. Java extensions for
JPython will have to wait, but they are definitely looming on the
horizon as something Distutils will have to handle.
Anyways, here are the conclusions I've arrived at.
* Stick with C/C++ for now; don't worry about other languages (yet).
That way we can be smart about C/C++ things like preprocessor
tokens and macros, include directories, shared vs static
libraries, source and object files, etc.
* At the highest level, we should just be able to say "I know nothing,
just give me a compiler object". This implies a factory function
returning instances of concrete classes derived from an abstract
CCompiler class (see the sketch below). These compiler objects must know how to:
- compile .c -> .o (or local equivalent)
- compile multiple .c's to matching .o's
- be able to define/undefine preprocessor macros/tokens
- be able to supply preprocessor search directories
- link multiple .o's to static library (libfoo.a, or local equiv.)
- link multiple .o's to shared library (libfoo.so, or local equiv.)
- link multiple .o's to shared object (foo.so, or local equiv.)
- for all link steps:
+ be able to supply explicit libraries (/foo/bar/libbaz.a)
+ be able to supply implicit libraries (-lbaz)
+ be able to supply search directories for implicit libraries
- do all this with timestamp-based dependency analysis
(non-trivial because it requires analyzing header dependencies!)
Linking to static/shared libraries and dependency analysis are
optional for now; everything else is required to build C/C++
extensions for Python. (At least that's my impression!)
"Local equivalent" is meant to encompass different filenames for C++
(eg. .C -> .o) and different operating systems/compilers (eg. .c ->
.obj, multiple .obj's to foo.dll or foo.lib).
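As a very rough illustration of the "just give me a compiler object"
factory, something along these lines -- all module and class names below are
made up for the sketch:

# Hypothetical sketch of the factory; the module and class names here are
# made up and would be replaced by whatever the real code ends up using.
import sys

def new_compiler(plat=None):
    if plat is None:
        plat = sys.platform
    if plat == "win32":
        from msvccompiler import MSVCCompiler      # hypothetical module
        return MSVCCompiler()
    elif plat == "mac":
        from maccompiler import MacCompiler        # hypothetical module
        return MacCompiler()
    else:
        from unixccompiler import UnixCCompiler    # hypothetical module
        return UnixCCompiler()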
BIG QUESTION: I know this will work on Unix, and from my distant
recollections of past work on other systems, it should work on MS-DOS
and VMS too. I gather that Windows is pretty derivative of MS-DOS, so
will this model work for Windows compilers too? Do we have to worry
about Windows compilers other than VC++? But I have *no clue* about
Macintosh compilers -- presumably somebody "out there" (not necessarily
on this SIG, but I hope so!) knows how to compile Python on the Mac, so
hopefully it's possible to compile Python extensions on the Mac. But
will this compiler abstraction model work there?
Brushing that moment of self-doubt aside, here's a proposed interface
for CCompiler and derived classes.
define_macro (name [, value])
    define a preprocessor macro or token; this will affect all
    invocations of the 'compile()' method

undefine_macro (name)
    undefine a preprocessor macro or token

add_include_dir (dir)
    add 'dir' to the list of directories that will be searched by
    the preprocessor for header files

set_include_dirs ([dirs])
    reset the list of preprocessor search directories; 'dirs' should
    be a list or tuple of directory names; if not supplied, the list
    is cleared

compile (source, define=macro_list, undef=names, include_dirs=dirs)
    compile source file(s). 'source' may be a sequence of source
    filenames, all of which will be compiled, or a single filename to
    compile. The optional 'define', 'undef', and 'include_dirs'
    named parameters all augment the lists set up by the above four
    methods. 'macro_list' is a list of either 2-tuples
    (macro_name, value) or bare macro names. 'names' is a list of
    macro names, and 'dirs' a list of directories.

add_lib (libname)
    add a library name to the list of implicit libraries ("-lfoo")
    to link with

set_libs ([libnames])
    reset the list of implicit libraries (or clear if 'libnames'
    not supplied)

add_lib_dir (dir)
    add a directory to the list of library search directories
    ("-L/foo/bar/baz") used when we link

set_lib_dirs ([dirs])
    reset (or clear) the list of library search directories

link_shared_object (objects, shared_object,
                    libs=libnames, lib_dirs=dirs)
    link a set of object files together to create a shared object file.
    The optional 'libs' and 'lib_dirs' parameters only augment the
    lists set up by the previous four methods.
Things to think about: should there be explicit support for "explicit
libraries" (eg. where you put "/foo/bar/libbaz.a" on the command line
instead of trusting "-lbaz" to figure it out)? I don't think we can
expect the caller to put them in the 'objects' list, because the
filenames are too system-dependent. My inclination, as you could
probably guess, would be to add methods 'add_explicit_lib()' and
'set_explicit_libs()', and a named parameter 'explicit_libs' to
'link_shared_object()'.
Also, there would have to be methods to support creating static and
shared libraries: I would call them 'link_static_lib()' and
'link_shared_lib()'. They would have the same interface as
'link_shared_object()', except the output filename would of course have
to be handled differently. (To illustrate: on Unix-y systems,
passing shared_object='foo' to 'link_shared_object()' would result in an
output file 'foo.so'. But passing output_lib='foo' to
'link_shared_lib()' would result in 'libfoo.so', and passing it to
'link_static_lib()' would result in 'libfoo.a'.)
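To make the shape of this concrete, here is a rough Python sketch of the
abstract class the interface above implies. The bodies are illustrative
only; concrete per-platform subclasses would override 'compile()' and the
'link_*()' methods:

# Rough sketch only: the abstract base class implied by the interface above.
# Concrete classes (one per platform/compiler) would override compile() and
# the link_*() methods; everything here just manages shared state.
class CCompiler:
    def __init__(self):
        self.macros = []            # list of (name, value) pairs
        self.include_dirs = []      # preprocessor search path
        self.libs = []              # implicit libraries ("-lfoo")
        self.lib_dirs = []          # library search path ("-L...")

    def define_macro(self, name, value=None):
        self.macros.append((name, value))

    def undefine_macro(self, name):
        self.macros = filter(lambda m, n=name: m[0] != n, self.macros)

    def add_include_dir(self, dir):
        self.include_dirs.append(dir)

    def set_include_dirs(self, dirs=None):
        self.include_dirs = list(dirs or [])

    def add_lib(self, libname):
        self.libs.append(libname)

    def set_libs(self, libnames=None):
        self.libs = list(libnames or [])

    def add_lib_dir(self, dir):
        self.lib_dirs.append(dir)

    def set_lib_dirs(self, dirs=None):
        self.lib_dirs = list(dirs or [])

    # Platform-specific work; concrete subclasses must override these.
    def compile(self, source, define=None, undef=None, include_dirs=None):
        raise RuntimeError, "concrete compiler class must override compile()"

    def link_shared_object(self, objects, shared_object,
                           libs=None, lib_dirs=None):
        raise RuntimeError, \
              "concrete compiler class must override link_shared_object()"

The point of keeping all the state on the object is that a 'build_ext'
command could configure a compiler once and then fire off many compile and
link steps against it.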
So, to all the Windows and Mac experts out there: will this cover it?
Can the variations in filename conventions and compilation/link schemes
all be shoved under this umbrella? Or is it back to the drawing board?
Thanks for your comments!
Greg
--
Greg Ward - software developer gward(a)cnri.reston.va.us
Corporation for National Research Initiatives
1895 Preston White Drive voice: +1-703-620-8990 x287
Reston, Virginia, USA 20191-5434 fax: +1-703-620-0913
Hi Greg,
sorry, I didn't have time to look at this yet ... just a question/suggestion.
What I really like about Makefiles is the -n mode, where I can see what it
would do before it actually does it. Is such an option available in setup.py,
and if not, could it be added?
Cheers
-Michel
On Mar 30, 2:01pm, Greg Ward wrote:
> > report on what happened.
>
> There's not much to running it; from the top distutils directory, do:
>
> ./setup.py build
>
> which should work pretty much anywhere. A slightly more risky proposition is
>
> ./setup.py install
>
> which relies on distutils.sysconfig to get the installation directories;
> it in turn relies on finding Python's Makefiles in the usual place. I
> have no idea if they're even installed under Win 95 -- please let me
> know!
>
> > Greg, anything you'd like me to examine especially?
>
> The weak spots! Search for "XXX" in the code; I'm liberal with X-rated
> comments. Also check my "Current weaknesses" post from last night; see
> if you can correlate my opinions of current problems with the code.
>
> When it occurs to you that commands are a lot like subroutines, and then
> when you start to wonder why parameter passing is done backwards, then
> you'll be up to speed. (The whole problem of communicating options
> between commands was bigger than I expected. The implementation isn't
> overly complicated, but I think it'll take some bouncing around across
> various classes and thinking about the alternatives before it becomes
> apparent why I did it that way.)
>
> Oh, the big reason I put the code up now is this: I think it's close to
> being at a state where development can be in parallel. The basic
> framework is in place, all that's missing is a lot of commands to do the
> work. The beginnings of building and installation are in place, and
> I've started thinking about the 'build_ext' command -- witness the
> thread on compiler abstraction models. But the "dist" and "bdist"
> commands -- to create source and built distributions -- are important
> and could easily be done by someone else.
>
> Greg
> --
> Greg Ward - software developer gward(a)cnri.reston.va.us
> Corporation for National Research Initiatives
> 1895 Preston White Drive voice: +1-703-620-8990 x287
> Reston, Virginia, USA 20191-5434 fax: +1-703-620-0913
>
>-- End of excerpt from Greg Ward
When should the .pyc and .pyo files be generated during the
install process, in the "build" directory or the "install" one?
I ask because I've been looking into GNU autoconf and automake.
They compile emacs lisp code (.el->.elc) in the build dir before
the actual installation, and it might be nice to follow their
lead on that, since it would ensure that the downloaded Python files
are compilable before they are installed.
OTOH, I know the normal Python install does a compileall after
the .py files have been transferred to the install directory.
Also, looking at compileall, it doesn't give any sort of error
status on exit if a file could not be compiled. I would prefer
the make process stop if that occurs, so compileall needs to exit
with a non-zero value. Attached is a context diff patch to
"compileall" from 1.5.1 which supports this ability. I can send
the full file as well if someone wants it.
Andrew
dalke(a)bioreason.com
*** /usr/local/lib/python1.5/compileall.py Mon Jul 20 12:33:01 1998
--- compileall.py Sun Mar 28 17:50:51 1999
***************
*** 34,39 ****
--- 34,40 ----
          print "Can't list", dir
          names = []
      names.sort()
+     success = 1
      for name in names:
          fullname = os.path.join(dir, name)
          if ddir:
***************
*** 54,64 ****
--- 55,67 ----
                      else: exc_type_name = sys.exc_type.__name__
                      print 'Sorry:', exc_type_name + ':',
                      print sys.exc_value
+                     success = 0
          elif maxlevels > 0 and \
               name != os.curdir and name != os.pardir and \
               os.path.isdir(fullname) and \
               not os.path.islink(fullname):
              compile_dir(fullname, maxlevels - 1, dfile)
+     return success
  
  def compile_path(skip_curdir=1, maxlevels=0):
      """Byte-compile all module on sys.path.
***************
*** 69,79 ****
      maxlevels: max recursion level (default 0)
  
      """
      for dir in sys.path:
          if (not dir or dir == os.curdir) and skip_curdir:
              print 'Skipping current directory'
          else:
!             compile_dir(dir, maxlevels)
  
  def main():
      """Script main program."""
--- 72,84 ----
      maxlevels: max recursion level (default 0)
  
      """
+     success = 1
      for dir in sys.path:
          if (not dir or dir == os.curdir) and skip_curdir:
              print 'Skipping current directory'
          else:
!             success = compile_dir(dir, maxlevels) and success
!     return success
  
  def main():
      """Script main program."""
***************
*** 98,109 ****
          sys.exit(2)
      try:
          if args:
              for dir in args:
!                 compile_dir(dir, maxlevels, ddir)
          else:
!             compile_path()
      except KeyboardInterrupt:
          print "\n[interrupt]"
  
  if __name__ == '__main__':
!     main()
--- 103,118 ----
          sys.exit(2)
      try:
          if args:
+             success = 1
              for dir in args:
!                 success = compile_dir(dir, maxlevels, ddir) and success
          else:
!             success = compile_path()
      except KeyboardInterrupt:
          print "\n[interrupt]"
+     return success
+ 
  
  if __name__ == '__main__':
!     if not main():
!         sys.exit(1)
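To show the intent (a sketch only, assuming the patched compile_dir() above,
which returns a true value only when every file compiled cleanly), a build
step could then refuse to install when byte-compilation fails:

# Sketch: byte-compile the build tree before installing, and stop if
# anything failed.  Assumes the patched compileall above, where
# compile_dir() returns 1 only when every file compiled.
import sys
import compileall

if not compileall.compile_dir("build"):
    sys.stderr.write("byte-compilation failed -- not installing\n")
    sys.exit(1)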
Hi there,
This is NOT major news, just a note of encouragement to Greg Ward, and
making sure he knows that people are indeed looking at things:
I just installed CVS on this win95 system at work, downloaded the
distutils source, and read through the sources some. It looks pretty
neat. I haven't actually tried *running* distutils yet on this windows
box, but I'll try to get to that later this week and give you all a
report on what happened.
Note on installing CVS on a win95 box: be sure to set the environment
variable HOME to some directory or it won't work -- this was not
mentioned in any CVS docs I found; presumably in win NT, HOME is already
set.
Greg, anything you'd like me to examine especially?
Regards,
Martijn
Sorry, I forgot to attach the promised script. (That seems to be
my standard ritual for attachments!)
Anyway, the "checkpycs" script is attached, really.
-Fred
--
Fred L. Drake, Jr. <fdrake(a)acm.org>
Corporation for National Research Initiatives
#! /usr/bin/env python
# -*- Python -*-
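# Report the size of each .py file found under the given directories
# (default: the current directory), together with the sizes of any
# matching .pyc and .pyo files.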
import errno
import os
import stat
import string
import sys

VERBOSE = 0
COLUMN_WIDTHS = (40, 10, 10, 10)

s = ''
for cw in COLUMN_WIDTHS:
    if s:
        s = "%s %%%ds" % (s, cw)
    else:
        s = "%%%ds" % cw
FORMAT = s
del s, cw

def process_dirs(dirlist, verbose=VERBOSE):
    fp = os.popen("find %s -name \*.py -print" % string.join(dirlist))
    writeline("", " .py Size", ".pyc Size", ".pyo Size")
    writeline("", "---------", "---------", "---------")
    while 1:
        line = fp.readline()
        if not line:
            break
        filename = line[:-1]
        pysize = os.stat(filename)[stat.ST_SIZE] or "0"
        pycsize = pyosize = None
        if os.path.isfile(filename + "c"):
            pycsize = os.stat(filename + "c")[stat.ST_SIZE]
        if os.path.isfile(filename + "o"):
            pyosize = os.stat(filename + "o")[stat.ST_SIZE]
        if pycsize or pyosize or verbose:
            writeline(filename, pysize, pycsize, pyosize)

def writeline(c1, c2, c3, c4):
    c1 = c1 or ""
    c2 = c2 or ""
    c3 = c3 or ""
    c4 = c4 or ""
    if len(c1) > COLUMN_WIDTHS[0]:
        c1 = "..." + c1[-(COLUMN_WIDTHS[0] - 3):]
    print FORMAT % (c1, c2, c3, c4)

def main():
    try:
        process_dirs(sys.argv[1:] or ["."])
    except IOError, e:
        if e.errno != errno.EPIPE:
            raise

if __name__ == "__main__":
    main()
Well, it's been about a week since I announced the first bundle of
Distutils code. Haven't heard much back yet, so I assume that it has
worked for those of you who tried it. Has anyone really dived in and
started poking around the code? If so, you must have stumbled across
some of the difficulties I had, including:
* it doesn't seem like there's a way to control whether 'compile'
generates .pyc or .pyo files
* worse, it doesn't look like there's even a way to find out
what 'compile' will generate! (see the sketch after this list)
* the command options describing installation directories are
haphazard at best. The problem is, I know pretty much what to call
platform-specific library directories: "install_platlib",
"install_site_platlib", and so forth. That seems in keeping with
the Python Makefiles. But I don't really know what to call
non-platform-specific library directories. I take solace in knowing
that I am not alone; Perl's MakeMaker just calls them
"INSTALLLIBDIR", "INSTALLSITELIB", and so forth, which is where my
cop-out of "install_lib" and "install_site_lib" came from. But
this isn't really satisfactory... anyone got better ideas?
* how do we handle copying file metadata under Mac OS? do the
copy routines in distutils.util work under Windows as well as they
do under Unix? (the code is mostly stolen from the standard shutil
module, so if it works then my copying stuff should)
* how should we deal with "wildcard" listing of modules in setup.py?
Even for a moderate sized distribution like Distutils, it's
already obvious that explicitly listing every module in the
distribution is a no-go. See the comments in setup.py for
some of my thinking on this matter.
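On the first two points, the only probe I know of is indirect (a sketch,
not a fix): py_compile picks its output extension from __debug__, which the
interpreter sets to 0 when started with -O, so a build step can at least
detect which flavour it is about to produce.

# Sketch: detect which extension byte-compilation will produce under the
# current interpreter.  py_compile chooses 'c' vs 'o' from __debug__, which
# is 0 when the interpreter was started with -O.
if __debug__:
    compiled_ext = ".pyc"
else:
    compiled_ext = ".pyo"
print "this interpreter will byte-compile to", compiled_ext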
More importantly, has anyone gotten deeply confused by trying to
associate the code with my two-month-old design proposal? I still need
to revisit that document to make sure I haven't grossly violated any of
its principles (and change the principles to match the violation if so
;-), so if anybody is getting confused it's probably not just you.
Hope to hear some comments soon!
Greg
--
Greg Ward - software developer gward(a)cnri.reston.va.us
Corporation for National Research Initiatives
1895 Preston White Drive voice: +1-703-620-8990 x287
Reston, Virginia, USA 20191-5434 fax: +1-703-620-0913
This is an easy couple of questions (I hope).
1)
What is the correct location for "pure" python module installations?
$(prefix)/lib/python$VERSION/site-packages
or
$(prefix)/lib/site-python
I believe most packages tend to install in site-packages but
site-python makes more sense since I usually write my modules
to deal with differences in python versions.
2) where should python shared libraries be placed if you want
to have one install of python per site which includes different
architectures? (For example, we may distribute our software on
both SGI and Linux boxes.)
As I understand it now, the python specific .so files need to be
on the PYTHONPATH, which currently contains no information about
the specific platform. The only information available is from
sys.platform (along with os.uname) and that doesn't appear
sufficient to distinguish between different SGI binary interfaces.
Specifically, SGIs have 3 interfaces: "old" 32, "new" 32 and 64.
These are specified during compilation by the environment variable
SGI_ABI or by the command-line options (-32/-o32, -n32, -64). In
theory we could compile libraries for each of the different
interfaces, though we only support o32 at present. To do it fully
we would have to write a wrapper script which runs the specified
ABI version of Python ... oh, but then we could modify the
PYTHONPATH to reflect the differences.
Has there been a proposal for how to distribute/manage/install
distributions with multiple architectures? The best I can think
of for now is to modify site.py to add
    os.path.join(prefix,
                 "lib",
                 "python" + sys.version[:3],
                 "site-packages",
                 sys.platform),

to the sitedirs list. Does this make sense, and should it be
added to 1.5.2 (or discussed more on c.l.py)?
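Spelled out, the idea is just this (a sketch; note that sys.platform alone
cannot distinguish the SGI o32/n32/64 ABIs, which is exactly the unresolved
part):

# Sketch of the site.py idea: add a platform-specific subdirectory of
# site-packages to the path if it exists.
import sys, os

platdir = os.path.join(sys.prefix, "lib",
                       "python" + sys.version[:3],
                       "site-packages",
                       sys.platform)
if os.path.isdir(platdir) and platdir not in sys.path:
    sys.path.append(platdir)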
Andrew
dalke(a)bioreason.com
BTW, here's how SGI's java, which is only for 32 bits, manages
things. "java" is actuall a shell script containing the following
three snippets:
# use -n32 binaries by default
if [[ $SGI_ABI = -32 ]]
then
    export JAVA_N32=0
elif [[ $SGI_ABI = -o32 ]]
then
    export JAVA_N32=0
else
    export JAVA_N32=1
fi

case $a in
    -32)
        JAVA_N32=0
        shift
        ;;
    -o32)
        JAVA_N32=0
        shift
        ;;
    -n32)
        JAVA_N32=1
        shift
        ;;

if [ $JAVA_N32 = 1 ]
then
    check_path $LD_LIBRARYN32_PATH "LD_LIBRARYN32_PATH"
    if [ -z "$LD_LIBRARYN32_PATH" ]
    then
        if [ -z "$LD_LIBRARY_PATH" ]
        then
            LD_LIBRARYN32_PATH=$JAVA_HOME/lib32/sgi/$THREADS_TYPE
        else
            check_path $LD_LIBRARY_PATH "LD_LIBRARY_PATH"
            LD_LIBRARYN32_PATH="$JAVA_HOME/lib32/sgi/$THREADS_TYPE:$LD_LIBRARY_PATH"
        fi
    else
        LD_LIBRARYN32_PATH="$JAVA_HOME/lib32/sgi/$THREADS_TYPE:$LD_LIBRARYN32_PATH"
    fi
    export LD_LIBRARYN32_PATH
    prog=$JAVA_HOME/bin32/sgi/${THREADS_TYPE}/${progname}
else
    check_path $LD_LIBRARY_PATH "LD_LIBRARY_PATH"
    if [ -z "$LD_LIBRARY_PATH" ]
    then
        LD_LIBRARY_PATH=$JAVA_HOME/lib/sgi/$THREADS_TYPE
    else
        LD_LIBRARY_PATH="$JAVA_HOME/lib/sgi/$THREADS_TYPE:$LD_LIBRARY_PATH"
    fi
    export LD_LIBRARY_PATH
    prog=$JAVA_HOME/bin/sgi/${THREADS_TYPE}/${progname}
fi