[Chicago] MPI in Python?

Massimo Di Pierro mdipierro at cs.depaul.edu
Wed Feb 27 18:27:43 CET 2008

> A lot of the fermiqcd code is to produce a backwards compatible
> interface for the older ACP/MAPS code. I'm not sure it would be the
> right choice, but once again, it is highly dependent upon the  
> workload.

Ah Ah! Not at all. I wrote it so I know.

Fermiqcd defines the following classes:

- A lattice (a set of vertices connected by any topology that is  
embeddable in a 10-dimensional space, usually a mesh; the  
vertices are automatically distributed in a parallel environment.  
It could be just an array or a multi-dimensional array.)

- A field (the data structure that lives on the vertices of the  
lattice, for example a temperature at every point in space).

If you write your algorithm serially using the above data structures  
(with some care), it is automatically parallel. Parallelization is  
implemented by finding optimal communication patterns that minimize  
latency, and these patterns are implemented in MPI. It also includes  
a linear algebra package for complex numbers.
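To make the idea concrete, here is a minimal serial sketch of that lattice/field abstraction in Python. The names (Lattice, Field, relax) are illustrative, not fermiqcd's actual C++ API; in fermiqcd the vertices would be distributed across MPI processes and neighbor lookups would trigger the precomputed communication patterns, whereas here everything runs in one process:

```python
class Lattice:
    """A periodic multi-dimensional mesh of vertices (hypothetical sketch)."""
    def __init__(self, dims):
        self.dims = dims
        self.size = 1
        for d in dims:
            self.size *= d

    def coords(self, index):
        """Linear index -> coordinate tuple."""
        c = []
        for d in reversed(self.dims):
            c.append(index % d)
            index //= d
        return tuple(reversed(c))

    def index(self, coords):
        """Coordinate tuple -> linear index."""
        i = 0
        for d, c in zip(self.dims, coords):
            i = i * d + c
        return i

    def neighbors(self, index):
        """Yield the nearest neighbors of a vertex on the periodic mesh."""
        coords = self.coords(index)
        for axis in range(len(self.dims)):
            for step in (-1, 1):
                c = list(coords)
                c[axis] = (c[axis] + step) % self.dims[axis]
                yield self.index(c)


class Field:
    """One value per lattice vertex, e.g. a temperature at every point."""
    def __init__(self, lattice, initial=0.0):
        self.lattice = lattice
        self.data = [initial] * lattice.size


def relax(field):
    """One relaxation step: replace each vertex by the mean of its neighbors.

    This is the 'serial' algorithm; in a fermiqcd-style library the same
    loop would run unchanged on each process over its local vertices,
    with ghost values fetched via MPI behind the scenes.
    """
    lat = field.lattice
    new = Field(lat)
    for i in range(lat.size):
        nbrs = list(lat.neighbors(i))
        new.data[i] = sum(field.data[j] for j in nbrs) / len(nbrs)
    return new


lat = Lattice((4, 4))
t = Field(lat)
t.data[lat.index((0, 0))] = 16.0   # a single hot vertex
t = relax(t)
# the hot spot has spread evenly to its four neighbors; the total is conserved
print(sum(t.data))  # prints 16.0
```

The point of the abstraction is exactly the one made above: the relaxation loop never mentions processes or messages, so the same user code can run serially or in parallel depending on how the lattice distributes its vertices.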

It runs at Fermilab on the cluster and has been tested in the UK on blue  

Fermiqcd is not backward compatible with the Canopy code (the one for  
ACP/MAPS) and was developed independently (when I was in the UK, in  
order to run on a Cray T3E).

It is true that most of the old Canopy algorithms have been rewritten  
in Fermiqcd (not in a backward compatible way) and that is why it is  
most popular in that area of Physics (Lattice QCD). Some people use  
it for gravity simulations and other people for condensed matter  
simulations. I am looking at using it for financial simulations as  
well. It even has a clone (Latfield), which actually offers a small  
subset of the Fermiqcd functionality.

Just for fun, here are some images produced by Fermiqcd simulations:


