the scipy mission, include finite element solver
Hi,

the mission of scipy from scipy.org:

" SciPy is open-source software for mathematics, science, and engineering. [...] The SciPy library [...] provides many user-friendly and efficient numerical routines such as routines for numerical integration and optimization. "

It has solvers for ODEs using several methods. Are finite element (FEM) solvers for PDEs (and some ODEs too) among scipy's goals? My group[0] would be very interested in that.

Of course, this is not something that can happen overnight. But scipy has interfaces to lots of sparse solvers, so it might have a nice pythonic interface to finite element solvers for PDEs too.

If you look at the topical software, PDE section:

http://scipy.org/Topical_Software#head-3df99e31c89f2e8ff60a2622805f6a304c501...

there are currently 3 solvers listed: FiPy (finite volumes), SfePy (regular finite elements, by Robert Cimrman, with some patches from me and other people), and Hermes (higher order adaptive finite elements, developed at the University of Nevada, Reno[0], where I am now). There are more good open-source FEM libraries out there; a list is here:

http://hpfem.org/femcodes.html

The most widely used is probably libmesh, though it's GPL and as far as I know it doesn't yet have Python wrappers (I wrote some SWIG wrappers a couple of years back, but they were bad --- it's on my todo list to write good Cython wrappers for libmesh).

All FEM codes need common functionality: you need to load a mesh (e.g. readers and writers for all the different mesh formats out there, commercial or not), you need to define your PDE, then you need to assemble the global matrices (this is where all the FEM codes differ, e.g. you can have lots of different bases, etc.
etc.), then depending on your problem, in most cases you end up either with a linear system of equations:

A * x = b

or an eigenproblem:

A * x = lambda * B * x

where A and B are (sparse) matrices, "x" is the solution, "b" is the right hand side (vector), and lambda is the eigenvalue. So the goal of the FE solver is to calculate A, B and "b". It should be general enough to allow you to discretize any integral in the weak formulation of the PDE, but in most cases A, B or "b" is all you want. Then I solve it using the solvers that are in scipy, or pysparse, or petsc4py, or slepc4py.

Next, one has to take "x", feed it back to the finite element solver, and tell it to automatically adapt the mesh (there are several approaches for this: you can use some not-so-good error estimators, like gradients of the solution, or sophisticated error estimators, like a reference solution calculated on a uniformly refined mesh, etc.). So you iteratively adapt until you are satisfied.

Most problems are nonlinear, so one also has to use nonlinear solvers. E.g. I wrote some Jacobian-free nonlinear solvers in scipy.optimize.nonlin, but these have to be polished and improved (and if your problem is simple enough to allow it, you can also construct the Jacobian using FEM directly, which is the best thing).

Once you have the solution, you also need to save it to many different file formats for visualization, and/or provide hooks to mayavi or matplotlib (for 2D stuff). And besides all of this, one also needs common functionality like doing arithmetic with the solution (e.g. summing/subtracting two solutions, integrating it, etc.), projecting it to different meshes, etc.
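To illustrate the "A * x = b, then scipy does the rest" part: a minimal 1D sketch (linear elements for -u'' = 1 on (0, 1) with zero boundary values). The function name here is made up for illustration, and this uses the modern scipy.sparse API, not anything the thread's FEM codes actually expose:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def assemble_poisson_1d(n):
    """Assemble the stiffness matrix A and load vector b for -u'' = 1
    on (0, 1) with u(0) = u(1) = 0, using n linear elements.
    Only interior unknowns are kept (Dirichlet BCs eliminated)."""
    h = 1.0 / n
    main = (2.0 / h) * np.ones(n - 1)
    off = (-1.0 / h) * np.ones(n - 2)
    A = sp.diags([off, main, off], [-1, 0, 1], format="csr")
    b = h * np.ones(n - 1)            # integral of f * phi_i with f = 1
    return A, b

# the FE code's job ends here; scipy does "the rest"
A, b = assemble_poisson_1d(100)
x = spla.spsolve(A, b)

# exact solution is u(x) = x (1 - x) / 2, and linear FEM is nodally
# exact for this problem, so the error is roundoff-level
nodes = np.linspace(0.0, 1.0, 101)[1:-1]
err = np.abs(x - nodes * (1.0 - nodes) / 2.0).max()
```

An eigenproblem A * x = lambda * B * x would go the same way, e.g. through the ARPACK wrappers in scipy.sparse.linalg.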
So do you think scipy could have some default adaptive FEM solver that would do the job for any PDE, plus all the common functionality above, so that one can easily hook in any other solver just to construct the A, B, "b" matrices/vectors and then use scipy for the rest (and optionally the solver again for the adaptivity)?

This is a question of where you want scipy to go in the longer term. But I think a good mission statement is to provide a viable alternative to Matlab, and for that, a good PDE solver and related tools are necessary. If scipy had all the common functionality above, so that one can easily hook up any FEM solver, plus some default solver in scipy itself, imho that would rock.

In my group[0] (we also collaborate with Robert Cimrman) we are doing all of the above and our goal is to have such a system anyway. But it occurred to us that if this could be part of scipy, it could be a win for everybody, since I could spend more time working on scipy and all scipy users would have a good FEM solver. Plus I would then have an imminent interest to release soon, release often, which imho is not bad for scipy either.

As to the technical side, the 2D solver is written in C++ and builds in about 3s on 8 processors, and the Python wrappers use Cython. The 3D solver takes a bit longer to build, I think 17s. So it would make the scipy build a bit longer, but imho scipy should be able to solve PDEs.

Together with a project like SPD:

http://code.google.com/p/spdproject/

which easily builds scipy, numpy, atlas, blas, ..... on all platforms from source, imho we could have a viable alternative to Matlab.

Ondrej

P.S. Currently our FE solver is GPL, but I think that could change to BSD if there is enough interest in that (there is in me).
SPD also has to be GPL, because it's based on Sage, but if there is enough interest, I guess it's not difficult to rewrite all the build scripts from scratch and use BSD (I myself don't mind using GPL for SPD, saving myself lots of time by taking advantage of Sage).

[0] http://hpfem.math.unr.edu/
As a user, I would very much like to see scipy include some FEM base. As you say, a generic framework would allow others to just write a different core, support other mesh formats, etc. --- e.g. they could be done as scikits. The fragmentation now is large.

I have been using freefem++ (http://www.freefem.org/) recently for quick reference solutions in 2D. Would you plan such a high-level parsing of weak forms?

Anyway, the question is always whether to use C/C++ directly or Python to do the job. Personally I am leaning towards using DUNE (http://www.dune-project.org) for my next project, as it is very versatile. I know they were playing with a Python interface too, but I find nothing on their site now about this.

Benny

2009/4/5 Ondrej Certik <ondrej@certik.cz>
On 04/06/09 03:15, Ondrej Certik wrote:
the mission of scipy from scipy.org:
" SciPy is open-source software for mathematics, science, and engineering. [...] The SciPy library [...] provides many user-friendly and efficient numerical routines such as routines for numerical integration and optimization. "
It has solvers for ODEs using several methods. Are finite element (FEM) solvers for PDEs (and some ODEs too) among scipy's goals?
I do not think this would fit in with the purposes of scipy the library, for the next obvious question would be what about grid generators? What about finite difference and finite volume solvers and so on. I think these are a little too specialized to go into scipy proper. There are a plethora of approaches, techniques, algorithms and a host of problems with each. A scikit would be good as would a separate family of tools under the scipy umbrella. It would be great to have a scipy-pde package but what does this have to do with scipy itself? Do we have a larger super-package that hopes to provide different bundles, kind of like sage but more targeted to numerical computing?
Together with a project like SPD:
Perhaps this is really the answer. A specialized set of sub-packages or bundles with SPD for different tasks. cheers, prabhu
On Mon, Apr 6, 2009 at 5:23 AM, Prabhu Ramachandran <prabhu@aero.iitb.ac.in> wrote:
I do not think this would fit in with the purposes of scipy the library, for the next obvious question would be what about grid generators? What about finite difference and finite volume solvers and so on. I think these are a little too specialized to go into scipy proper. There are a plethora of approaches, techniques, algorithms and a host of problems with each.
Totally agree. Scipy should be low-level; FE tools etc. should be using scipy/numpy, not integrated into scipy. One weakness right now is that there doesn't seem to be a good open source Python-based meshing tool. Again, this is not something that should necessarily be in scipy, but something that should probably leverage scipy. We (FiPy) currently use gmsh, but it is not tightly integrated, and we've written our own standard mesh classes but haven't done anything more with this. -- Daniel Wheeler
Sorry to jump in on this (interesting) discussion with a relevant question, but aside from Ondrej's initial question: is there somewhere a scipy mission statement clearly spelled out? best, Johann
On Mon, Apr 6, 2009 at 11:56 AM, Cohen-Tanugi Johann <cohen@lpta.in2p3.fr> wrote:
Sorry to jump in on this (interesting) discussion with a relevant question, but aside from Ondrej's initial question: is there somewhere a scipy mission statement clearly spelled out?
I would like to know this too. Seems like people differ in the interpretation of the mission. Let me reply to the points above: On Mon, Apr 6, 2009 at 2:16 AM, Benny Malengier <benny.malengier@gmail.com> wrote:
As a user, I would very much like to see scipy include some FEM base. As you say, a generic framework would allow others to just write a different core, support other mesh formats, ..., eg they could be done as scikits. The fragmentation now is large.
Exactly.
I have been using freefem++ (http://www.freefem.org/) recently for quick reference solutions in 2D. Would you plan such a high-level parsing of weak forms?
Yes, we do --- we just want to use Python+Cython to enter the forms. On Mon, Apr 6, 2009 at 2:23 AM, Prabhu Ramachandran <prabhu@aero.iitb.ac.in> wrote:
On 04/06/09 03:15, Ondrej Certik wrote:
the mission of scipy from scipy.org:
" SciPy is open-source software for mathematics, science, and engineering. [...] The SciPy library [...] provides many user-friendly and efficient numerical routines such as routines for numerical integration and optimization. "
It has solvers for ODEs using several methods. Are finite element (FEM) solvers for PDEs (and some ODEs too) among scipy's goals?
I do not think this would fit in with the purposes of scipy the library, for the next obvious question would be what about grid generators? What about finite difference and finite volume solvers and so on. I think these are a little too specialized to go into scipy proper. There are a plethora of approaches, techniques, algorithms and a host of problems with each.
There could be one good enough default mesh generator in scipy. As to FDM -- imho FEM is much more versatile and general. As to FVM, imho many aspects of FVM could be handled by such an interface as well. Besides, I propose to add adaptive FEM, so the mesh generator is only really needed to define the geometry, which in lots of cases can be done even by hand, as the solver takes care of the rest.
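A "good enough default mesh generator" for simple geometries needs surprisingly little code. A hedged sketch in plain numpy (the function name and interface are made up for illustration; this is not any actual scipy API): a structured triangulation of the unit square.

```python
import numpy as np

def rect_tri_mesh(nx, ny):
    """Uniform triangulation of the unit square: returns node
    coordinates (shape (N, 2)) and triangles as node-index triples."""
    xs, ys = np.meshgrid(np.linspace(0, 1, nx + 1),
                         np.linspace(0, 1, ny + 1))
    nodes = np.column_stack([xs.ravel(), ys.ravel()])
    tris = []
    for j in range(ny):
        for i in range(nx):
            a = j * (nx + 1) + i          # lower-left node of the cell
            b, c, d = a + 1, a + nx + 1, a + nx + 2
            tris.append((a, b, d))        # two CCW triangles per cell
            tris.append((a, d, c))
    return nodes, np.array(tris)

nodes, tris = rect_tri_mesh(4, 3)

# sanity check: the triangles tile the square, so their areas sum to 1
v = nodes[tris]
d1, d2 = v[:, 1] - v[:, 0], v[:, 2] - v[:, 0]
area = 0.5 * np.abs(d1[:, 0] * d2[:, 1] - d1[:, 1] * d2[:, 0]).sum()
```

For more general geometries one would of course want a real (possibly external) generator, but for "define the geometry and let the adaptive solver do the rest" something of this size may be enough.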
A scikit would be good as would a separate family of tools under the scipy umbrella. It would be great to have a scipy-pde package but what does this have to do with scipy itself? Do we have a larger
So what exactly is "scipy itself"? I thought it could be all the numerical things that are in Matlab. They have PDEs as a toolbox: http://www.mathworks.com/products/pde/ so I guess scipy should be like bare Matlab and all the toolboxes should go to scikits?
super-package that hopes to provide different bundles, kind of like sage but more targeted to numerical computing?
What do you mean by "we have a larger superpackage"? Are you asking if we want to have such a package, or if we want scipy to become such a package?
Together with a project like SPD:
Perhaps this is really the answer. A specialized set of sub-packages or bundles with SPD for different tasks.
Yes, that's what we'll do most probably. On Mon, Apr 6, 2009 at 8:58 AM, Daniel Wheeler <daniel.wheeler2@gmail.com> wrote:
On Mon, Apr 6, 2009 at 5:23 AM, Prabhu Ramachandran <prabhu@aero.iitb.ac.in> wrote:
I do not think this would fit in with the purposes of scipy the library, for the next obvious question would be what about grid generators? What about finite difference and finite volume solvers and so on. I think these are a little too specialized to go into scipy proper. There are a plethora of approaches, techniques, algorithms and a host of problems with each.
Totally agree. Scipy should be low-level; FE tools etc. should be using scipy/numpy, not integrated into scipy.
One weakness right now is that there doesn't seem to be a good open source Python-based meshing tool. Again, this is not something that should necessarily be in scipy, but something that should probably leverage scipy. We (FiPy) currently use gmsh, but it is not tightly integrated, and we've written our own standard mesh classes but haven't done anything more with this.
I like the FiPy project. When I get more time, I'd like to create some unified interface, so that one can take one equation and solve it with both FiPy and our FEM code (or libmesh), compare all those solvers, and see which one performs best.

Can FiPy do adaptive mesh refinement? I was trying to find it in the manual, but it seems you always need to create the mesh beforehand? Then you calculate on it and you get something. How do you know if it's precise enough and whether you should refine the mesh? How do you determine where to refine it and by how much?

If our FEM solver creates a nice hp mesh automatically, imho you cannot beat it with any hand-made mesh, so any comparison with a hand-made mesh would be unfair to FiPy.

Ondrej
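For the record, the refine-until-satisfied loop with a crude gradient-jump indicator (the "not so good" kind of estimator from the original post) can be sketched in a few lines of numpy/scipy. Everything here --- function names, the indicator, the marking fraction --- is an illustrative assumption, not FiPy's or Hermes' actual API:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def solve_poisson(nodes, f):
    """Linear-FEM solve of -u'' = f with u = 0 at both ends, on a
    sorted (possibly nonuniform) 1D mesh; midpoint quadrature."""
    h = np.diff(nodes)
    n = len(nodes)
    A = sp.lil_matrix((n, n))
    b = np.zeros(n)
    for i, hi in enumerate(h):                    # element loop
        A[i, i] += 1 / hi
        A[i + 1, i + 1] += 1 / hi
        A[i, i + 1] -= 1 / hi
        A[i + 1, i] -= 1 / hi
        fm = f(0.5 * (nodes[i] + nodes[i + 1]))
        b[i] += fm * hi / 2
        b[i + 1] += fm * hi / 2
    u = np.zeros(n)                               # Dirichlet BCs: u = 0
    u[1:-1] = spla.spsolve(A.tocsc()[1:-1, 1:-1], b[1:-1])
    return u

def refine(nodes, u, frac=0.3):
    """Bisect the elements with the largest jumps of u' across their
    endpoints -- a crude gradient-jump error indicator."""
    s = np.diff(u) / np.diff(nodes)               # slope per element
    jump = np.abs(np.diff(s))                     # jump at interior nodes
    eta = np.zeros(len(nodes) - 1)                # per-element indicator
    eta[:-1] += jump
    eta[1:] += jump
    marked = eta >= np.sort(eta)[int((1 - frac) * len(eta))]
    mids = (0.5 * (nodes[:-1] + nodes[1:]))[marked]
    return np.sort(np.concatenate([nodes, mids]))

f = lambda x: 1000 * np.exp(-1000 * (x - 0.5) ** 2)   # sharp source term
nodes = np.linspace(0, 1, 11)
for _ in range(5):                                # adapt until "satisfied"
    u = solve_poisson(nodes, f)
    nodes = refine(nodes, u)
```

After a few iterations the mesh is much finer near the sharp feature at x = 0.5 than at the boundaries, which is the point being made about automatic versus hand-made meshes.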
2009/4/9 Ondrej Certik <ondrej@certik.cz>:
I would like to know this too. Seems like people differ in the interpretation of the mission.
I wouldn't presume to know the full answer to this question, but I think the intention is clear: SciPy should be a library that is useful across different scientific disciplines. Unfortunately, drawing that line isn't a highly scientific process. For example, personally I can see how a mesh generator could fit nicely into scipy.spatial, whereas FEM in itself seems too specialised (unless it is implemented in a very generic framework). My opinion differs from others given so far, emphasising the blurriness of the line. Prabhu's suggestion to write a SciKit strikes me as a good one. In any case, it is the best way to get new code into SciPy. Regards Stéfan
On Wed, Apr 8, 2009 at 17:43, Ondrej Certik <ondrej@certik.cz> wrote:
On Mon, Apr 6, 2009 at 2:23 AM, Prabhu Ramachandran <prabhu@aero.iitb.ac.in> wrote:
A scikit would be good as would a separate family of tools under the scipy umbrella. It would be great to have a scipy-pde package but what does this have to do with scipy itself? Do we have a larger
So what exactly is "scipy itself"?
People usually mean the scipy Python package when they use this term.
I thought it could be all numerical things, that are in matlab.
They have it as a toolbox:
http://www.mathworks.com/products/pde/
so I guess scipy should be like a bare matlab and all toolboxes should go to scikits?
Not necessarily *bare* Matlab, but that's roughly my take. Personally, I don't feel a need to put everything numerical possible into scipy itself. That way lies madness, IMO. A single Python package just isn't the right technology to contain that much code.

I see scipy as primarily providing the fundamentals. In addition, it also serves as a home for numerical code that would otherwise have no home. I see no benefit in pulling in existing projects that are thriving on their own.

As for PDEs, specifically, I think the domain is rich enough that it can usefully live in its own package better than being put into scipy. While there may be some fundamentals that you can pull out that might make sense to live in scipy (some simple good-enough mesh generation as you suggest), you will inevitably need more options, more sophisticated mesh generation, more dependencies, etc., that fits the development life cycle of a PDE-dedicated package more than scipy.

In other words, the important question isn't, "Can scipy do this?" but rather, "Can Python do this?".
super-package that hopes to provide different bundles, kind of like sage but more targeted to numerical computing?
What do you mean by "we have a larger superpackage"? Are you asking if we want to have such a package, or if we want scipy to become such a package?
I don't think he's talking about a Python package, but rather a project that bundles Python packages together, like the SAGE distribution (which is much more than the sage Python package), Python(X,Y) and EPD. scipy is a Python package and can't really morph into a superpackage. It simply belongs to a different category of things. In fact, I suggest not using the term "superpackage" but "distribution".

-- Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
On 04/09/09 04:34, Robert Kern wrote:
I see scipy as primarily providing the fundamentals. In addition, it also serves as a home for numerical code that would otherwise have no home. I see no benefit in pulling in existing projects that are thriving on their own. As for PDEs, specifically, I think the domain is rich enough that it can usefully live in its own package better than being put into scipy. While there may be some fundamentals that you can pull out that might make sense to live in scipy (some simple good-enough mesh generation as you suggest), you will inevitably need more options, more sophisticated mesh generation, more dependencies, etc., that fits the development life cycle of a PDE-dedicated package more than scipy.
In other words, the important question isn't, "Can scipy do this?" but rather, "Can Python do this?".
Thanks Robert for expressing this so nicely!
super-package that hopes to provide different bundles, kind of like sage but more targeted to numerical computing? What do you mean by "we have a larger superpackage"? Are you asking if we want to have such a package, or if we want scipy to become such a package?
I don't think he's talking about a Python package, but rather a project that bundles Python packages together, like the SAGE distribution (which is much more than the sage Python package), Python(X,Y) and EPD. scipy is a Python package and can't really morph into a superpackage. It simply belongs to a different category of things. In fact, I suggest not using the term "superpackage" but "distribution".
Indeed, that is precisely what I meant --- a distribution tailored for PDEs would be nice. If we had a foolproof distribution generation system that works on all platforms, this would be trivial, but it is easier said than done. cheers, prabhu
Prabhu Ramachandran wrote:
On 04/09/09 04:34, Robert Kern wrote:
I don't think he's talking about a Python package, but rather a project that bundles Python packages together, like the SAGE distribution (which is much more than the sage Python package), Python(X,Y) and EPD. scipy is a Python package and can't really morph into a superpackage. It simply belongs to a different category of things. In fact, I suggest not using the term "superpackage" but "distribution".
Indeed, that is precisely what I meant --- a distribution tailored for PDEs would be nice. If we had a foolproof distribution generation system that works on all platforms, this would be trivial, but it is easier said than done.
Someone said that Sage has a different focus -- but that is just because most of the currently active userbase are mathematicians. From what I've heard they certainly want to be "a viable open source alternative to ... MATLAB ..." (as well as Maxima and Mathematica) and would welcome work in that area.

I'm just saying that I hope we do NOT get a "tailored distribution for PDEs". A very large part of what Sage or EPD brings to the table (notebook interface, plotting GUIs, build and package system, builds on many platforms, interactive Cython development, etc.) is field-neutral, and I think it would be better if the same system could in time be used by "everybody". At least, this is a role Sage very much wants to take on. (But SciPy is definitely a component of such a system, not the system itself.)

The "workflow" of the average scientific programmer has a big potential for improvement, and one cannot bring that about by further fragmentation.

Dag Sverre
On 04/09/09 18:04, Dag Sverre Seljebotn wrote:
The "workflow" of the average scientific programmer has a big potential for improvement, and one cannot bring that about by further fragmentation.
I was not advocating fragmentation; any viable solution would be fine. IIRC, SPD itself is based on Sage and surely there will be some exchange of code/ideas between Sage and SPD at some time. However, given a choice I would prefer a modular set of packages rather than one monolithic monster. I haven't delved into all the existing solutions enough to know if they are sufficiently modular or not. The very fact that SPD exists suggests that there is scope for improvement. cheers, prabhu
On Thu, Apr 9, 2009 at 5:43 AM, Prabhu Ramachandran <prabhu@aero.iitb.ac.in> wrote:
On 04/09/09 18:04, Dag Sverre Seljebotn wrote:
The "workflow" of the average scientific programmer has a big potential for improvement, and one cannot bring that about by further fragmentation.
I was not advocating fragmentation; any viable solution would be fine. IIRC, SPD itself is based on Sage and surely there will be some exchange of code/ideas between Sage and SPD at some time. However, given a choice I would prefer a modular set of packages rather than one monolithic monster. I haven't delved into all the existing solutions enough to know if they are sufficiently modular or not. The very fact that SPD exists suggests that there is scope for improvement.
The problem with Sage is that it doesn't include any PDE solver, nor mayavi, and not many sparse solvers (I think only SuperLU, in scipy), etc. I want it all in one package, so the only way is to do it yourself; but SPD is compatible with Sage, so any spkg package that works in Sage works in SPD and vice versa. I imagine that after some time, Sage could include the good spkg packages. So I don't see any problem or fragmentation here. Ondrej
Prabhu Ramachandran wrote:
On 04/09/09 18:04, Dag Sverre Seljebotn wrote:
The "workflow" of the average scientific programmer has a big potential for improvement, and one cannot bring that about by further fragmentation.
I was not advocating fragmentation; any viable solution would be fine. IIRC, SPD itself is based on Sage and surely there will be some exchange of code/ideas between Sage and SPD at some time. However, given a choice I would prefer a modular set of packages rather than one monolithic monster. I haven't delved into all the existing solutions
I just wanted to comment on the monolithic monster thing: it does have its advantages too. For instance, R is included and can be depended upon by any Sage script. I'd much rather spend some hundred MBs of disk space to have all the open source science packages (like R) installed, and work on improving e.g. the R-to-Python bridge, than, say, reimplement the vast number of algorithms available for R in Python just because "I like Python better than R". That's the upside to having a monolithic monster: it allows you to "build the car instead of reinventing the wheel", using things regardless of the implementation language. Without distributing all the software together it's incredibly hard to have the different pieces fit together everywhere (R would be at a slightly incompatible version with RPy, etc.). Dag Sverre
On 04/10/09 14:16, Dag Sverre Seljebotn wrote:
Prabhu Ramachandran wrote:
On 04/09/09 18:04, Dag Sverre Seljebotn wrote:
The "workflow" of the average scientific programmer has a big potential for improvement, and one cannot bring that about by further fragmentation. I was not advocating fragmentation; any viable solution would be fine. IIRC, SPD itself is based on Sage and surely there will be some exchange of code/ideas between Sage and SPD at some time. However, given a choice I would prefer a modular set of packages rather than one monolithic monster. I haven't delved into all the existing solutions
I just wanted to comment on the monolithic monster thing: It does have its advantages too. For instance R is included and can be depended upon by any Sage script.
Apologies for sounding like someone who is against Sage. Just for the record, I really like Sage and do promote it quite a bit over here. I think we are digressing. My point was that I never thought scipy is itself a distribution, and Robert's comments seem to justify my stance. Python(x,y), EPD and Sage all are, and Sage also does a whole lot more than just bundle stuff. So bundling PDE tools in scipy doesn't seem like the right solution to me. That's all. cheers, prabhu
On Fri, Apr 10, 2009 at 3:48 AM, Prabhu Ramachandran <prabhu@aero.iitb.ac.in> wrote:
On 04/10/09 14:16, Dag Sverre Seljebotn wrote:
Prabhu Ramachandran wrote:
On 04/09/09 18:04, Dag Sverre Seljebotn wrote:
The "workflow" of the average scientific programmer has a big potential for improvement, and one cannot bring that about by further fragmentation. I was not advocating fragmentation; any viable solution would be fine. IIRC, SPD itself is based on Sage and surely there will be some exchange of code/ideas between Sage and SPD at some time. However, given a choice I would prefer a modular set of packages rather than one monolithic monster. I haven't delved into all the existing solutions
I just wanted to comment on the monolithic monster thing: It does have its advantages too. For instance R is included and can be depended upon by any Sage script.
Apologies for sounding like someone who is against Sage. Just for the record, I really like Sage and do promote it quite a bit over here.
I think we are digressing. My point was that I never thought of scipy itself as a distribution, and Robert's comments seem to justify my stance. Python(x,y), EPD and Sage all are, and Sage also does a whole lot more than just bundle stuff. So bundling PDE tools in scipy doesn't seem like the right solution to me. That's all.
Yes, but you could use the same kind of argument to claim that scipy should not exist at all, and that you should just always create a new project for whatever you do and use Sage to bundle it all together. So it's just a question of where to draw the line, and I think it's clear now from this thread that PDEs don't belong in scipy. Ondrej
Prabhu Ramachandran wrote:
On 04/10/09 14:16, Dag Sverre Seljebotn wrote:
Prabhu Ramachandran wrote:
On 04/09/09 18:04, Dag Sverre Seljebotn wrote:
The "workflow" of the average scientific programmer has a big potential for improvement, and one cannot bring that about by further fragmentation. I was not advocating fragmentation; any viable solution would be fine. IIRC, SPD itself is based on Sage, and surely there will be some exchange of code/ideas between Sage and SPD at some point. However, given a choice I would prefer a modular set of packages rather than one monolithic monster. I haven't delved into all the existing solutions
I just wanted to comment on the monolithic monster thing: It does have its advantages too. For instance R is included and can be depended upon by any Sage script.
Apologies for sounding like someone who is against Sage. Just for the record, I really like Sage and do promote it quite a bit over here.
I didn't take it that way at all! I guess I was just digressing and continuing on what you said; it didn't have all that much to do with the concrete matter at hand. Dag Sverre
On Wed, Apr 8, 2009 at 4:04 PM, Robert Kern <robert.kern@gmail.com> wrote:
On Wed, Apr 8, 2009 at 17:43, Ondrej Certik <ondrej@certik.cz> wrote:
so I guess scipy should be like a bare Matlab and all toolboxes should go to scikits?
Not necessarily *bare* Matlab, but that's roughly my take. Personally, I don't feel a need to put everything numerical possible into scipy itself. That way lies madness, IMO. A single Python package just isn't the right technology to contain that much code.
I see scipy as primarily providing the fundamentals. In addition, it also serves as a home for numerical code that would otherwise have no home. I see no benefit in pulling in existing projects that are thriving on their own. As for PDEs specifically, I think the domain is rich enough that it can live more usefully in its own package than in scipy. While there may be some fundamentals that you could pull out that might make sense to live in scipy (some simple good-enough mesh generation, as you suggest), you will inevitably need more options, more sophisticated mesh generation, more dependencies, etc., which fit the development life cycle of a PDE-dedicated package better than scipy's.
So to conclude this thread, it seems to me that what Robert wrote above is the mission statement for scipy: ------------ SciPy mission statement: SciPy provides the fundamentals for all scientific codes in Python; in addition, it also serves as a home for numerical code that would otherwise have no home. Existing projects that are thriving on their own (e.g. PDE solvers) should stay as separate projects. ---------------------
In other words, the important question isn't, "Can scipy do this?" but rather, "Can Python do this?".
So it's clear that besides scipy itself, there should be an all-in-one solution, like EPD (commercial), Sage (GPL), or SPD (GPL, but in principle, if all the Sage scripts were rewritten, could be BSD) that I am working on. There needs to be something that people can take and customize to create their own all-in-one solutions, e.g. for PDEs, or for biology, or for mathematics (that is Sage), or I don't know what. Too bad I need to sleep some time, otherwise I'd be working on this nonstop to make it happen. Ondrej
On Tue, Apr 14, 2009 at 01:42:31PM -0700, Ondrej Certik wrote:
There needs to be something that people can take and customize to create their own all-in-one solutions, e.g. for PDEs, or for biology, or for mathematics (that is Sage), or I don't know what. Too bad I need to sleep some time, otherwise I'd be working on this nonstop to make it happen.
I am with you here (but not working on it :( ). I didn't add anything to the discussion on PDEs, but all the time during this discussion I had in mind the interface-nightmare problem: I don't care which PDE solver I use, as long as it has an interface I know. So we need a package implementing a base interface that I can depend on. The solver behind this interface might not be performant; it is just there as a fallback. Scipy does seem a natural slot for such code. Gaël
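Gaël's base-interface idea can be sketched roughly like this. This is purely a hypothetical sketch; none of these class or method names exist in scipy, and the backend is a deliberately naive 1D finite-difference Poisson solver, standing in for the "not performant, just there as a fallback" implementation:

```python
# Hypothetical sketch of a solver-agnostic PDE interface (names invented
# for illustration; this is NOT an existing scipy API).
from abc import ABC, abstractmethod
import numpy as np

class PDESolver(ABC):
    """Base interface that concrete backends (SfePy, FiPy, ...) could implement."""
    @abstractmethod
    def solve(self, f, n):
        """Solve -u'' = f on (0, 1) with u(0) = u(1) = 0.

        Returns the interior grid x and the approximate solution u."""

class ReferenceFDSolver(PDESolver):
    """Naive fallback backend: second-order finite differences, dense solve."""
    def solve(self, f, n):
        h = 1.0 / (n + 1)
        x = np.linspace(h, 1.0 - h, n)              # interior grid points
        # Tridiagonal stencil for -u'': (2*u_i - u_{i-1} - u_{i+1}) / h^2
        A = (np.diag(np.full(n, 2.0))
             - np.diag(np.ones(n - 1), 1)
             - np.diag(np.ones(n - 1), -1)) / h**2
        u = np.linalg.solve(A, f(x))
        return x, u

# Usage: -u'' = pi^2 sin(pi x) has the exact solution u = sin(pi x).
x, u = ReferenceFDSolver().solve(lambda t: np.pi**2 * np.sin(np.pi * t), 100)
err = float(np.max(np.abs(u - np.sin(np.pi * x))))
```

Any real backend would of course take a mesh and a weak form rather than a right-hand-side callable; the point is only that user code talks to the `PDESolver` interface, not to a particular implementation.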
On Tue, Apr 14, 2009 at 15:56, Gael Varoquaux <gael.varoquaux@normalesup.org> wrote:
On Tue, Apr 14, 2009 at 01:42:31PM -0700, Ondrej Certik wrote:
There needs to be something that people can take and customize to create their own all-in-one solutions, e.g. for PDEs, or for biology, or for mathematics (that is Sage), or I don't know what. Too bad I need to sleep some time, otherwise I'd be working on this nonstop to make it happen.
I am with you here (but not working on it :( ).
I didn't add anything to the discussion on PDEs, but all the time during this discussion I had in mind the interface-nightmare problem: I don't care which PDE solver I use, as long as it has an interface I know. So we need a package implementing a base interface that I can depend on. The solver behind this interface might not be performant; it is just there as a fallback. Scipy does seem a natural slot for such code.
I conjecture, though, that the interface is the critical differentiator (ahem, no pun intended) between implementations. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
On Tue, Apr 14, 2009 at 04:43:46PM -0500, Robert Kern wrote:
I conjecture, though, that the interface is the critical differentiator (ahem, no pun intended) between implementations.
Then let's have a good fight, and get over it :). Gaël PS: same thing for interpolation, and optimisation :P
On Tue, Apr 14, 2009 at 16:47, Gael Varoquaux <gael.varoquaux@normalesup.org> wrote:
On Tue, Apr 14, 2009 at 04:43:46PM -0500, Robert Kern wrote:
I conjecture, though, that the interface is the critical differentiator (ahem, no pun intended) between implementations.
Then let's have a good fight, and get over it :).
Gaël
PS: same thing for interpolation, and optimisation :P
I suspect the problems are much worse for PDEs than either of those. Just the different mesh representations alone make it difficult to provide a uniform implementation. -- Robert Kern
Robert Kern wrote:
On Tue, Apr 14, 2009 at 16:47, Gael Varoquaux <gael.varoquaux@normalesup.org> wrote:
On Tue, Apr 14, 2009 at 04:43:46PM -0500, Robert Kern wrote:
I conjecture, though, that the interface is the critical differentiator (ahem, no pun intended) between implementations. Then let's have a good fight, and get over it :).
Gaël
PS: same thing for interpolation, and optimisation :P
I suspect the problems are much worse for PDEs than either of those. Just the different mesh representations alone make it difficult to provide a uniform implementation.
True enough. In SfePy, we describe the problems (PDEs) to solve using regular Python files, which is nice and easy, but the interface, i.e. all that can appear in those files and be understood by the solver, is rather tightly coupled with what the solver can do - with its implementation. The topic of PDE solvers is really very broad, even if we restrict ourselves "just" to the finite element method, and there is no silver-bullet interface that would work for everybody. Just my 2 cents, as a guy who has been writing finite element codes for quite some time. r.
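To illustrate what a "regular Python file" problem description can look like, here is a schematic example. This is a made-up illustration of the general idea, not actual SfePy syntax; every key name below is an assumption:

```python
# Schematic PDE problem-description module (made-up key names, shown only
# to illustrate the "problem as a plain Python file" approach).
problem = {
    "mesh": "cylinder.vtk",                       # mesh file to load
    "regions": {                                  # named parts of the domain
        "Omega": "all",
        "Gamma_Left": "vertices at x < 0.001",
    },
    "fields": {                                   # discretization of unknowns
        "temperature": {"space": "H1", "order": 1, "region": "Omega"},
    },
    "ebcs": {                                     # essential boundary conditions
        "fixed_T": {"region": "Gamma_Left", "value": 0.0},
    },
    "equations": {                                # weak form as text the solver parses
        "heat": "laplace(k, s, T) = 0",
    },
}
```

The catch is exactly the coupling described above: the strings under "equations" and "regions" form a small language whose vocabulary is whatever terms the solver implements, so the interface and the implementation inevitably evolve together.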
On Tue, Apr 14, 2009 at 2:43 PM, Robert Kern <robert.kern@gmail.com> wrote:
On Tue, Apr 14, 2009 at 15:56, Gael Varoquaux <gael.varoquaux@normalesup.org> wrote:
On Tue, Apr 14, 2009 at 01:42:31PM -0700, Ondrej Certik wrote:
There needs to be something that people can take and customize to create their own all-in-one solutions, e.g. for PDEs, or for biology, or for mathematics (that is Sage), or I don't know what. Too bad I need to sleep some time, otherwise I'd be working on this nonstop to make it happen.
I am with you here (but not working on it :( ).
I didn't add anything to the discussion on PDEs, but all the time during this discussion I had in mind the interface-nightmare problem: I don't care which PDE solver I use, as long as it has an interface I know. So we need a package implementing a base interface that I can depend on. The solver behind this interface might not be performant; it is just there as a fallback. Scipy does seem a natural slot for such code.
I conjecture, though, that the interface is the critical differentiator (ahem, no pun intended) between implementations.
So if you all agree with that scipy mission, do you think it could be put on the scipy front page? :) Is there anything I can do to make this happen? As to the distribution thing --- as long as it's your own box, or you have root access, Debian/Ubuntu is the best (for me). But if you are on a Mac, or a cluster (often my case), someone else's computer, a lab, or Windows, you are stuck. There were attempts to create a source distribution, and all of them utterly failed; e.g. see this thread I started on sage-devel not so long ago: http://groups.google.com/group/sage-devel/browse_thread/thread/a8d89440bdff8... The only way forward is to use something that has already taken off, that has broad support across all the architectures (*now*, not in a couple of years; that's very important), that has a large user community, etc. E.g. Sage. Then take it and customize it. IMHO that is the only chance, instead of starting something new. Hopefully soon a Debian Developer, Ondrej
On Tue, Apr 14, 2009 at 19:50, Ondrej Certik <ondrej@certik.cz> wrote:
On Tue, Apr 14, 2009 at 2:43 PM, Robert Kern <robert.kern@gmail.com> wrote:
On Tue, Apr 14, 2009 at 15:56, Gael Varoquaux <gael.varoquaux@normalesup.org> wrote:
On Tue, Apr 14, 2009 at 01:42:31PM -0700, Ondrej Certik wrote:
There needs to be something that people can take and customize to create their own all-in-one solutions, e.g. for PDEs, or for biology, or for mathematics (that is Sage), or I don't know what. Too bad I need to sleep some time, otherwise I'd be working on this nonstop to make it happen.
I am with you here (but not working on it :( ).
I didn't add anything to the discussion on PDEs, but all the time during this discussion I had in mind the interface-nightmare problem: I don't care which PDE solver I use, as long as it has an interface I know. So we need a package implementing a base interface that I can depend on. The solver behind this interface might not be performant; it is just there as a fallback. Scipy does seem a natural slot for such code.
I conjecture, though, that the interface is the critical differentiator (ahem, no pun intended) between implementations.
So if you all agree with that scipy mission, do you think it could be put on the scipy front page? :) Is there anything I can do to make this happen?
I don't think it's worded well as a mission statement, although I, naturally, agree with the content. -- Robert Kern
On Tue, Apr 14, 2009 at 08:11:04PM -0500, Robert Kern wrote:
I don't think it's worded well as a mission statement although I, naturally, agree with the content.
It could still go on the scipy.org front page, no? Ondrej, it is a wiki: go ahead and edit it, and if we don't agree with your changes, we'll mention it. Try to make it look nice (very important). The front page tends to get overcluttered; I believe you should try to rearrange the content so as not to make it any longer than it already is. Gaël
On Wed, Apr 15, 2009 at 00:16, Gael Varoquaux <gael.varoquaux@normalesup.org> wrote:
On Tue, Apr 14, 2009 at 08:11:04PM -0500, Robert Kern wrote:
I don't think it's worded well as a mission statement although I, naturally, agree with the content.
It could still go on the scipy.org front page, no? Ondrej, it is a wiki: go ahead and edit it, and if we don't agree with your changes, we'll mention it. Try to make it look nice (very important). The front page tends to get overcluttered; I believe you should try to rearrange the content so as not to make it any longer than it already is.
I believe the front page is restricted. -- Robert Kern
Ondrej Certik wrote:
So it's clear that besides scipy itself, there should be an all-in-one solution, like EPD (commercial), Sage (GPL), or SPD (GPL, but in principle, if all the Sage scripts were rewritten, could be BSD) that I am working on.
There needs to be something that people can take and customize to create their own all-in-one solutions, e.g. for PDEs, or for biology, or for mathematics (that is Sage), or I don't know what.
I call that Debian/Ubuntu. Package management is one thing that Debian got really right. I don't really see the value in re-inventing Debian package management. Why not just use it?
Too bad I need to sleep some time, otherwise I'd be working on this nonstop to make this happen.
I notice you already are working pretty close to nonstop on packaging for Debian, so go a bit easier on yourself. :)
On Tue, Apr 14, 2009 at 17:39, Andrew Straw <strawman@astraw.com> wrote:
Ondrej Certik wrote:
So it's clear that besides scipy itself, there should be an all-in-one solution, like EPD (commercial), Sage (GPL), or SPD (GPL, but in principle, if all the Sage scripts were rewritten, could be BSD) that I am working on.
There needs to be something that people can take and customize to create their own all-in-one solutions, e.g. for PDEs, or for biology, or for mathematics (that is Sage), or I don't know what.
I call that Debian/Ubuntu. Package management is one thing that Debian got really right. I don't really see the value in re-inventing Debian package management. Why not just use it?
If you can really restrict your deployments just to a single distribution of a single OS, great. Many of us are not in that position. -- Robert Kern
Robert Kern wrote:
On Tue, Apr 14, 2009 at 17:39, Andrew Straw <strawman@astraw.com> wrote:
Ondrej Certik wrote:
So it's clear that besides scipy itself, there should be an all-in-one solution, like EPD (commercial), Sage (GPL), or SPD (GPL, but in principle, if all the Sage scripts were rewritten, could be BSD) that I am working on.
There needs to be something that people can take and customize to create their own all-in-one solutions, e.g. for PDEs, or for biology, or for mathematics (that is Sage), or I don't know what. I call that Debian/Ubuntu. Package management is one thing that Debian got really right. I don't really see the value in re-inventing Debian package management. Why not just use it?
If you can really restrict your deployments just to a single distribution of a single OS, great. Many of us are not in that position.
Well, fink ports the Debian package management to Mac OS X. Does something equivalent exist for Windows? The actual Debian file formats are pretty simple, so it seems like it should be do-able. The no-root issue is another setback for my suggestion; I guess porting the Debian package management isn't going to help there... Maybe what is needed is an entirely unprivileged-user, Debian-inspired distribution and package management system that doesn't bother installing the low-level system stuff (e.g. the kernel, X11/Windows GUI/Mac OS X GUI) but keeps a copy of everything from libpng, to cairo, to ATLAS, to numpy/scipy/etc. in the user's area. This sounds like a generally useful thing, and not one that is specific to Python. Perhaps such a project could actually take off. -Andrew
On Tue, Apr 14, 2009 at 18:37, Andrew Straw <strawman@astraw.com> wrote:
Robert Kern wrote:
On Tue, Apr 14, 2009 at 17:39, Andrew Straw <strawman@astraw.com> wrote:
Ondrej Certik wrote:
So it's clear that besides scipy itself, there should be an all-in-one solution, like EPD (commercial), Sage (GPL), or SPD (GPL, but in principle, if all the Sage scripts were rewritten, could be BSD) that I am working on.
There needs to be something that people can take and customize to create their own all-in-one solutions, e.g. for PDEs, or for biology, or for mathematics (that is Sage), or I don't know what. I call that Debian/Ubuntu. Package management is one thing that Debian got really right. I don't really see the value in re-inventing Debian package management. Why not just use it?
If you can really restrict your deployments just to a single distribution of a single OS, great. Many of us are not in that position.
Well, fink ports the Debian package management to Mac OS X.
Not really, no.
Does something equivalent exist for Windows?
No.
The actual Debian file formats are pretty simple, so it seems like it should be do-able.
You would think, wouldn't you?
The no-root issue is another setback for my suggestion. I guess porting the Debian package management isn't going to help there...
Maybe what is needed is an entirely unprivileged-user Debian-inspired distribution and package management system that doesn't bother installing the low-level system stuff (e.g. the kernel, X11/Windows GUI/Mac OS X GUI) but will keep a copy of everything from libpng, to cairo, to ATLAS, to numpy/scipy/etc in the user's area. This sounds like a generally useful thing, and not one that is specific to Python. Perhaps such a project could actually take off.
There have been many abortive attempts. None have actually caught on. -- Robert Kern
On Wed, Apr 15, 2009 at 8:37 AM, Andrew Straw <strawman@astraw.com> wrote:
Well, fink ports the Debian package management to Mac OS X. Does something equivalent exist for Windows? The actual Debian file formats are pretty simple, so it seems like it should be do-able.
What makes Debian such a well-integrated system is not so much the scripts - after all, rpm .spec files, Debian control files, and port (BSD system) files are not that different. What matters is how polished the actual packages are. That's already difficult to do for one platform, and it becomes very, very difficult for many packages. The whole setuptools thing is a fiasco IMHO partly because it ignores this problem and gives the illusion that it is easy (installing is easy because it is just installing files, uninstalling is easy because it is just removing files - that's like saying programming is easy because it just moves bytes in memory). When I worked in a company doing proprietary software for Windows/Mac, there was one guy whose sole job was to make sure everything installed properly. That's very time-consuming, because every little detail matters - and a mistake in the installer is a deal breaker and costs the company a lot.
Maybe what is needed is an entirely unprivileged-user Debian-inspired distribution and package management system that doesn't bother installing the low-level system stuff (e.g. the kernel, X11/Windows GUI/Mac OS X GUI) but will keep a copy of everything from libpng, to cairo, to ATLAS, to numpy/scipy/etc in the user's area. This sounds like a generally useful thing, and not one that is specific to Python. Perhaps such a project could actually take off.
The idea is too ingrained in Unix, I think, if only "philosophically". Technically, having a system with many dependencies would not work very well on Windows, I think. Usually, on Windows, you just ship everything you need in your package (many Windows programs bundle their own C runtime, for example), and a piece of software is a snapshot of a set of components which work and have been tested together. Then, every application that is big enough has its own update mechanism (Acrobat Reader, VMware, etc...). David
On Wed, Apr 15, 2009 at 09:23:44AM +0900, David Cournapeau wrote:
On Wed, Apr 15, 2009 at 8:37 AM, Andrew Straw <strawman@astraw.com> wrote:
Well, fink ports the Debian package management to Mac OS X. Does something equivalent exist for Windows? The actual Debian file formats are pretty simple, so it seems like it should be do-able.
What makes Debian such a well-integrated system is not so much the scripts - after all, rpm .spec files, Debian control files, and port (BSD system) files are not that different. What matters is how polished the actual packages are. That's already difficult to do for one platform, and it becomes very, very difficult for many packages. The whole setuptools thing is a fiasco IMHO partly because it ignores this problem and gives the illusion that it is easy (installing is easy because it is just installing files, uninstalling is easy because it is just removing files - that's like saying programming is easy because it just moves bytes in memory).
Right, a Debian package by itself does not mean anything. What is important is that, at an instant t, the Debian archive strives to be a self-consistent set of packages that all build and run together, with known relationships. I think one thing that contributes a lot to the quality of Debian packages (real ones; I don't know about fink) is that, under Debian and Ubuntu, they are built in isolated environments. That way no build or installation side effects creep in. Making a Debian package for your software is a great way to make sure that it really builds, 100% of the time, because if it doesn't, one of the different build bots will find it for you. That culture is totally absent from Python packaging. Gaël
David Cournapeau wrote:
On Wed, Apr 15, 2009 at 8:37 AM, Andrew Straw <strawman@astraw.com> wrote:
Well, fink ports the Debian package management to Mac OS X. Does something equivalent exist for Windows? The actual Debian file formats are pretty simple, so it seems like it should be do-able.
What makes Debian such a well-integrated system is not so much the scripts - after all, rpm .spec files, Debian control files, and port (BSD system) files are not that different. What matters is how polished the actual packages are. That's already difficult to do for one platform. That
+1 The reason the Sage install works is that Michael Abshoff works as release manager, making sure, e.g., that one software package isn't upgraded until the rest of the packages can handle the upgrade. Ondrej, will SPD be keyed to Sage releases? Otherwise SPD will be something that works at one point in time but then requires full-time supervision to keep it working, and is really not much better than setuptools, for the reasons David mentions. The reason a system as complex as Sage actually works is not the technology (what, untar and run a script...) but the release schedules, comprehensive regression testing, Michael Abshoff, etc. -- Dag Sverre
On Wed, Apr 15, 2009 at 1:58 AM, Dag Sverre Seljebotn <dagss@student.matnat.uio.no> wrote:
David Cournapeau wrote:
On Wed, Apr 15, 2009 at 8:37 AM, Andrew Straw <strawman@astraw.com> wrote:
Well, fink ports the Debian package management to Mac OS X. Does something equivalent exist for Windows? The actual Debian file formats are pretty simple, so it seems like it should be do-able.
What makes Debian such a well-integrated system is not so much the scripts - after all, rpm .spec files, Debian control files, and port (BSD system) files are not that different. What matters is how polished the actual packages are. That's already difficult to do for one platform. That
+1
The reason the Sage install works is that Michael Abshoff works as release manager, making sure, e.g., that one software package isn't upgraded until the rest of the packages can handle the upgrade.
Ondrej, will SPD be keyed to Sage releases? Otherwise SPD will be something that works at one point in time but then requires full-time supervision to keep it working, and is really not much better than setuptools, for the reasons David mentions.
The reason a system as complex as Sage actually works is not the technology (what, untar and run a script...) but the release schedules, comprehensive regression testing, Michael Abshoff etc.
Yes, my plan is to get it tied to Sage releases and work as close as possible with Sage guys. Ondrej
Dag Sverre Seljebotn wrote:
David Cournapeau wrote:
On Wed, Apr 15, 2009 at 8:37 AM, Andrew Straw <strawman@astraw.com> wrote:
Well, fink ports the Debian package management to Mac OS X. Does something equivalent exist for Windows? The actual Debian file formats are pretty simple, so it seems like it should be do-able. What makes Debian such a well-integrated system is not so much the scripts - after all, rpm .spec files, Debian control files, and port (BSD system) files are not that different. What matters is how polished the actual packages are. That's already difficult to do for one platform. That
+1
The reason the Sage install works is that Michael Abshoff works as release manager, making sure, e.g., that one software package isn't upgraded until the rest of the packages can handle the upgrade.
Ondrej, will SPD be keyed to Sage releases? Otherwise SPD will be something that works at one point in time but then requires full-time supervision to keep it working, and is really not much better than setuptools, for the reasons David mentions.
The major point I'm trying to make here is that having Sage version x.y.z automatically implies Python version a.b, NumPy version c.d and SciPy version e.f, all tested by someone else to fit together. Unless the same is the case for SPD, it just won't work as it must. And if it is not keyed 1:1 to Sage releases, there's going to be much duplication of work. The package system is not the problem; distribution maintenance is. -- Dag Sverre
On Wed, Apr 15, 2009 at 2:09 AM, Dag Sverre Seljebotn <dagss@student.matnat.uio.no> wrote:
Dag Sverre Seljebotn wrote:
David Cournapeau wrote:
On Wed, Apr 15, 2009 at 8:37 AM, Andrew Straw <strawman@astraw.com> wrote:
Well, fink ports the Debian package management to Mac OS X. Does something equivalent exist for Windows? The actual Debian file formats are pretty simple, so it seems like it should be do-able. What makes Debian such a well-integrated system is not so much the scripts - after all, rpm .spec files, Debian control files, and port (BSD system) files are not that different. What matters is how polished the actual packages are. That's already difficult to do for one platform. That
+1
The reason the Sage install works is that Michael Abshoff works as release manager, making sure, e.g., that one software package isn't upgraded until the rest of the packages can handle the upgrade.
Ondrej, will SPD be keyed to Sage releases? Otherwise SPD will be something that works at one point in time but then requires full-time supervision to keep it working, and is really not much better than setuptools, for the reasons David mentions.
The major point I'm trying to make here is that having Sage version x.y.z automatically implies Python version a.b, NumPy version c.d and SciPy version e.f, all tested by someone else to fit together.
Unless the same will be the case for SPD, it just won't work as it must. And if it is not keyed 1:1 to Sage releases, there's going to be much work duplication.
The package system is not the problem; distribution maintenance is.
Yep, that's why I chose Sage: in my experience, Sage is by far the most tested and robust open-source source distribution. Besides that, both William and Michael are interested in helping me out; for example, William and Mike Hansen just spent an evening disentangling the Sage notebook, so that we can use it in SPD and they can use it in the Windows port: http://trac.sagemath.org/sage_trac/ticket/5789 while in the scipy community, unfortunately, I can still see many doubts about whether this effort is really needed/worthwhile, etc. So I first need to deliver some results; then hopefully more people will join. I can't do this all by myself, so I chose Sage. Ondrej
Ondrej Certik wrote:
On Wed, Apr 15, 2009 at 2:09 AM, Dag Sverre Seljebotn <dagss@student.matnat.uio.no> wrote:
Dag Sverre Seljebotn wrote:
David Cournapeau wrote:
On Wed, Apr 15, 2009 at 8:37 AM, Andrew Straw <strawman@astraw.com> wrote:
Well, fink ports the Debian package management to Mac OS X. Does something equivalent exist for Windows? The actual Debian file formats are pretty simple, so it seems like it should be do-able. What makes Debian such a well-integrated system is not so much the scripts - after all, rpm .spec files, Debian control files, and port (BSD system) files are not that different. What matters is how polished the actual packages are. That's already difficult to do for one platform. That +1
The reason the Sage install works is that Michael Abshoff works as release manager, making sure, e.g., that one software package isn't upgraded until the rest of the packages can handle the upgrade.
Ondrej, will SPD be keyed to Sage releases? Otherwise SPD will be something that works at one point in time but then requires full-time supervision to keep it working, and is really not much better than setuptools, for the reasons David mentions. The major point I'm trying to make here is that having Sage version x.y.z automatically implies Python version a.b, NumPy version c.d and SciPy version e.f, all tested by someone else to fit together.
Unless the same will be the case for SPD, it just won't work as it must. And if it is not keyed 1:1 to Sage releases, there's going to be much work duplication.
The package system is not the problem; distribution maintenance is.
Yep, that's why I chose Sage: in my experience, Sage is by far the most tested and robust open-source source distribution.
Besides that, both William and Michael are interested in helping me out, for example William and Mike Hansen just spent an evening disentangling the Sage notebook, so that we can use it in SPD and they can use it in the windows port:
http://trac.sagemath.org/sage_trac/ticket/5789
while in the scipy community, unfortunately, I can still see many doubts about whether this effort is really needed/worthwhile, etc. So I first need to deliver some results; then hopefully more people will join. I can't do this all by myself, so I chose Sage.
Well, I'm not really part of the SciPy community, but now that I understand the project better, I'm very positive. Especially since it seems it will bring nice cleanups to the Sage codebase. Good luck with it! (Myself, I will probably just continue to use the gigabyte of disk space for a full Sage (what is 1 GB anyway in this day and age?), but I understand that others have other needs -- especially Windows support.) -- Dag Sverre
On Wed, Apr 15, 2009 at 3:20 AM, Dag Sverre Seljebotn <dagss@student.matnat.uio.no> wrote:
Ondrej Certik wrote:
On Wed, Apr 15, 2009 at 2:09 AM, Dag Sverre Seljebotn <dagss@student.matnat.uio.no> wrote:
Dag Sverre Seljebotn wrote:
David Cournapeau wrote:
On Wed, Apr 15, 2009 at 8:37 AM, Andrew Straw <strawman@astraw.com> wrote:
Well, fink ports the Debian package management to Mac OS X. Does something equivalent exist for Windows? The actual Debian file formats are pretty simple, so it seems like it should be doable. What makes Debian such a well-integrated system is not so much the scripts - after all, rpm .spec files, Debian files, and port (BSD system) files are not that different. What matters is how polished the actual packages are. That's already difficult to do for one platform. +1 to that.
It's not only about 1 gig --- first, it takes a long time to download for me, and it takes about 30 min to unpack on my slow laptop, but that could be overcome --- a big problem is that Sage doesn't build on our cluster (some problems in some math packages that I don't need); that's a showstopper. SPD does build there. Plus, I want to create custom all-in-one solutions for our code, e.g. for PDEs, so the result will be much more than 1 gig after including all the custom code (e.g. all solvers, etc.). Ondrej
On 04/15/09 14:34, Ondrej Certik wrote:
Well, in case you refer to me, I think SPD is a great idea! Thanks for doing it. Cheers, Prabhu
On Tue, Apr 14, 2009 at 03:39:18PM -0700, Andrew Straw wrote:
I call that Debian/Ubuntu. Package management is one thing that Debian got really right. I don't really see the value in re-inventing Debian package management. Why not just use it?
Because you have to be root on the box for that. In most labs, you may be root on a few boxes, but rarely on the cluster. In addition, at many workplaces, you are not root on your own box (and anyhow, they usually rule out Debian). Gaël, a Debian/Ubuntu fan, too.
On Wed, Apr 15, 2009 at 5:42 AM, Ondrej Certik <ondrej@certik.cz> wrote:
So it's clear that besides scipy itself, there should be an all-in-one solution, like EPD (commercial), Sage (GPL), or SPD (GPL, but in principle, if all the Sage scripts are rewritten, it could be BSD), which I am working on.
I am not sure it is that clear. If you take one of the most successful open-source packages for numerical computation, R, they have the totally opposite solution: CRAN. They don't integrate anything themselves, but you have ~2000 packages for extremely specialized tools. It is easier for R than for us, because they can do whatever they want to make it happen, whereas we have to play well with other tools, since we are Python. cheers, David
On Wed, Apr 15, 2009 at 09:00:14AM +0900, David Cournapeau wrote:
I am curious: how reliable is CRAN with regard to complex compiled packages? For instance, do they ever run into the gfortran/g77 problems? Maybe once numpy provides a suitable build environment (by exposing its C API to packages), and with some improvement of distutils/setuptools, we can go part of the way there. Gaël
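For what it's worth, numpy already exposes the location of its C headers to package build scripts via numpy.get_include(); a minimal sketch of how an extension module would use it (the module and source names below are made up for illustration):

```python
import numpy as np

# numpy publishes the directory containing its C headers
# (numpy/arrayobject.h, etc.) so extension modules can compile
# against the numpy C API.
include_dir = np.get_include()

# In a setup.py one would pass it to an Extension, e.g.:
#   from setuptools import Extension
#   ext = Extension("mymod", sources=["mymod.c"],
#                   include_dirs=[np.get_include()])
print(include_dir)
```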
On Wed, Apr 8, 2009 at 6:43 PM, Ondrej Certik <ondrej@certik.cz> wrote:
On Mon, Apr 6, 2009 at 8:58 AM, Daniel Wheeler
One weakness right now is that there doesn't seem to be a good open-source Python-based meshing tool. Again, this is not something that should necessarily be in scipy, but something that should probably leverage scipy. We (FiPy) currently use gmsh, but it is not tightly integrated; we've written our own standard mesh classes, but haven't done anything more with this.
I like the FiPy project. When I get more time, I'd like to create some unified interface, so that one can take one equation and solve it with both FiPy and our FEM code (or libmesh), to compare all those solvers and see which one performs best.
Essentially, this is what we are trying to do with fipy. We would like to access solvers from various packages via a common interface (pysparse, scipy, trilinos, pykrylov, petsc, pyamg) and also call a number of tools that construct matrices with different techniques. The second part is harder to abstract and we only have our simple FV method at the moment.
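A common solver interface of the kind described here can be sketched as a thin adapter layer. The class and method names below are hypothetical (nothing like this is claimed to exist in FiPy), with scipy.sparse.linalg standing in for one of the interchangeable backends:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

class ScipyCGSolver:
    """Hypothetical adapter: wraps one backend (scipy's conjugate
    gradient) behind a common solve(A, b) interface, so that other
    backends (pysparse, petsc4py, pyamg, ...) could be swapped in
    behind the same method signature."""
    def solve(self, A, b):
        x, info = spla.cg(A, b)
        if info != 0:
            raise RuntimeError("CG did not converge (info=%d)" % info)
        return x

# Usage: a small symmetric positive definite system (1D Laplacian),
# the kind of A*x = b system an FE/FV assembly would produce.
n = 50
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)
x = ScipyCGSolver().solve(A, b)
```

The point of the adapter is that the assembly code only ever sees solve(A, b), so benchmarking different backends on the same equation becomes a one-line change.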
Can FiPy do adaptive mesh refinement?
No. We'll wait until there is a meshing tool that can deal with it or create a python interface to an existing tool at a push.
I was trying to find it in the manual, but it seems you always need to create the mesh beforehand?
True.
Then you calculate on it and get something. How do you know whether it's precise enough and whether you should refine the mesh? How do you determine where to refine it, and by how much?
Not something that fipy deals with at the moment. Basically it's a task left to the user.
If our FEM solver creates a nice hp mesh automatically, imho you cannot beat it with any hand-made mesh, so any comparison against a hand-made mesh would be unfair to fipy.
Very true. h-refinement is what we would like, but we require a third-party tool for it. The scope is too wide for us to deal with right now. -- Daniel Wheeler
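As a toy illustration of the error-indicator-driven refinement loop discussed above (using a crude gradient-style indicator, nothing like the reference-solution estimators mentioned earlier), one could flag the 1D cells where the solution jumps most and bisect them; all names here are made up for the sketch:

```python
import numpy as np

def refine_1d(nodes, values, frac=0.3):
    """Toy h-refinement: flag the fraction of cells with the largest
    solution jump (a crude gradient-style error indicator) and bisect
    each flagged cell by inserting its midpoint."""
    jumps = np.abs(np.diff(values))            # per-cell indicator
    n_flag = max(1, int(frac * len(jumps)))
    flagged = np.argsort(jumps)[-n_flag:]      # cells with largest jumps
    midpoints = 0.5 * (nodes[flagged] + nodes[flagged + 1])
    return np.sort(np.concatenate([nodes, midpoints]))

# Usage: one refinement pass concentrates new nodes around the
# sharp transition of tanh(20*x) near x = 0.
nodes = np.linspace(-1.0, 1.0, 11)
new_nodes = refine_1d(nodes, np.tanh(20 * nodes))
```

In a real solver this sits inside the solve/estimate/refine loop: solve on the mesh, evaluate the indicator, refine, and repeat until the estimated error is acceptable.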
participants (12)
- Andrew Straw
- Benny Malengier
- Cohen-Tanugi Johann
- Dag Sverre Seljebotn
- Daniel Wheeler
- David Cournapeau
- Gael Varoquaux
- Ondrej Certik
- Prabhu Ramachandran
- Robert Cimrman
- Robert Kern
- Stéfan van der Walt