Conjugate gradients minimizer

Andrew Walkingshaw andrew-usenet at lexical.org.uk
Tue Mar 25 16:49:54 EST 2003


In article <Aw2ga.24322$SX.8662 at nwrdny03.gnilink.net>, Carl Banks wrote:
> 
> Just out of curiosity: how many variables are we talking about?

3N, where N is the number of atoms in the system; a few hundred. The
problem with Numeric is getting everything to build happily on large
supercomputers.

> Numeric is your friend for optimization algorithms.  Numeric does the
> iteration through large arrays for you, which is useful because doing
> it in Python isn't very fast.  Because of this, most optimization
> algorithms would be terrible if written in pure Python.  I highly suggest
> upgrading the "just about" if you're going to do numerical work.

Noted; I'll start on reading the Numeric manual, then.
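
For my own reference as much as anything, the operations I actually
need look, from a first skim of the manual, something like this
(untested, so the details may well be off):

# Sketch of the Numeric operations a CG minimizer leans on: array
# creation, element-wise arithmetic, and inner products, all of which
# happen in C rather than in a Python-level loop.
import Numeric

a = Numeric.array([0.1, -0.2, 0.3], Numeric.Float)
b = Numeric.array([0.0,  0.5, -0.1], Numeric.Float)

total = a + b                             # element-wise addition
overlap = Numeric.dot(a, b)               # inner product
length = Numeric.sqrt(Numeric.dot(a, a))  # Euclidean norm

If that's right, the vector bookkeeping in the minimizer never needs
to touch a Python-level loop.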

The numerical heavy lifting, as far as energy and force (==gradient)
evaluation goes, is going to be done in about 100,000 lines of
F90 (a popular density functional theory code), and we're talking of the
order of minutes per energy evaluation in the optimistic case; my hope
was that Python wouldn't therefore be the rate-limiting step in practice.
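
To make the division of labour concrete, the Python side I have in
mind is only this sort of outer loop (a skeleton: here
evaluate_energy_and_forces() is a stand-in for the call out to the F90
code, and a proper line search is reduced to a fixed trial step):

# positions is a Numeric array of the 3N coordinates;
# evaluate_energy_and_forces() stands in for the external F90 call
# (minutes per evaluation).
import Numeric

def minimise(positions, evaluate_energy_and_forces, iterations=50, step=0.01):
    energy, forces = evaluate_energy_and_forces(positions)
    gradient = -forces                # force is minus the energy gradient
    direction = -gradient             # start along steepest descent
    for i in range(iterations):
        positions = positions + step * direction
        energy, forces = evaluate_energy_and_forces(positions)
        new_gradient = -forces
        # Fletcher-Reeves update of the conjugate search direction
        beta = (Numeric.dot(new_gradient, new_gradient) /
                Numeric.dot(gradient, gradient))
        direction = -new_gradient + beta * direction
        gradient = new_gradient
    return positions, energy

Everything there, apart from the one call per iteration into the F90
code, is a handful of array operations, which is why I'm hoping the
Python layer stays well clear of the critical path.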

> In Python, you could use the map and reduce functions, with some
> functions from the operator module, to get vector-like operations.
> Unfortunately, I can't think of a good way to multiply a vector by a
> constant (maybe there's a fast iterator that always returns the same
> item that can be passed to map)?  Using a lambda is about the best
> you can do there.
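
If I follow you, those spellings come out as something like this
(untested; operator.mul against a repeated constant being the
workaround for the missing constant-returning iterator):

import operator

vec = [1.0, 2.0, 3.0]
n = 0.5

by_lambda = map(lambda x: n * x, vec)                 # lambda version
by_operator = map(operator.mul, vec, [n] * len(vec))  # operator.mul version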

What I'm doing is a list comprehension, which isn't exactly fast:

[n*x for x in self.components], which to my understanding basically
unrolls to:

newarray = []
for inc in range(len(self.components)):
    newarray.append(self.components[inc]*n)
self.components = newarray

- but, as I said, my hope is that this isn't the slow step in any case.
(If it turns out that it *is*, I'll probably give up and recode the
entire thing in F90...)
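
(For completeness, the Numeric spelling of that scaling is, if I'm
reading the manual right, essentially a one-liner:

# components stands in for self.components from my vector class;
# scalar * array does the whole loop above in C.
import Numeric

n = 2.0
components = Numeric.array([1.0, 2.0, 3.0], Numeric.Float)
components = n * components

which also suggests that holding the components as a Numeric array in
the first place, rather than as a list, is the obvious change if this
scaling ever does show up in a profile.)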

>> ... which I could either adapt or even just *read*? I've looked at
>> Numerical Methods in C, and though it makes the algorithms pretty clear,
>> it's so unPythonic it sets my teeth on edge. 
> 
> One thing you should keep in mind: in numerical algorithms, where
> speed is at a premium, you shouldn't worry about things being
> "Pythonic".  In fact, numerical algorithms code is probably not just
> unPythonic, but also un-C-ic.  In such code, it is often worth it to
> make your code ugly just to save a few CPU cycles here and there.

Noted. I'm beginning to suspect I'll have to go down the "build one to 
throw away" route here, but that's probably no bad thing in any case.

> The language of choice for a lot of people doing numerical work is
> still Fortran.  That should tell you something about their priorities.

Indeed; I'm a newbie (first-year PhD) materials physicist, and learning
Fortran is moving extremely rapidly up my list of priorities.

> And having worked with some of them, I can tell you they do know what
> they're doing.  (But God help them if they want to write regular
> code.)

This is my experience too: the other problem seems to be that a lot of
numerical codes are quick-hack piled on quick-hack, and as such they
rapidly get to the point where no mortal mind would be capable of keeping
all the corner cases in memory at once...

Thank you for your time,
Andrew

-- 
Andrew Walkingshaw | andrew-usenet at lexical.org.uk




