Polynomial interpolation
Hi,

It appears that scipy does not have a facility for using the Lagrange polynomial to interpolate data. (Or did I miss it?) I am well aware of the numerical difficulties this poses, but it (and its generalization, the Hermite polynomial) does prove useful on occasion. I have written prototype implementations of two algorithms for evaluating this polynomial, and I'd like comments before submitting them for inclusion in scipy.

One implementation is based on Krogh 1970, "Efficient Algorithms for Polynomial Interpolation and Numerical Differentiation"; it allows the construction of Hermite polynomials (where some derivatives at each point may also be specified) and the evaluation of arbitrary derivatives. It is based on a Neville-like algorithm, so that it does O(n^2) work when constructed but only O(n) per point evaluated, or O(n^2) per point for which all derivatives must be evaluated. (n here is the degree of the polynomial.)

The other implementation is based on Berrut and Trefethen 2004, "Barycentric Lagrange Interpolation". This implementation uses a rewriting of the Lagrange polynomial as a rational function, and is efficient and numerically stable. It also allows efficient updating of the y values at which interpolation occurs, as well as addition of new x values. It does not support evaluation of derivatives or construction of Hermite polynomials.

Finally, I have written a "PiecewisePolynomial" class for constructing splines in which each piece may have arbitrary degree, and for which the function values and some derivatives are specified at each knot. The intent is that this be used to represent solutions obtained from ODE solvers, using one polynomial for each solver step, with the same order as the solver's internal polynomial solution, and with (some) derivatives matching at the ends. Such a Trajectory object would contain additional information about the solution (for example stiffness or error estimates) beyond what is in PiecewisePolynomial. (I tried implementing Trajectory using splines, which are evaluated in compiled code, but their maximum degree is 5 while the solvers will go up to degree 12.)

They all need work, in particular, efficiency would be improved by making the y values vectors, error checking needs to be more robust, and documentation is not in reST form. Ultimately, too, the evaluation functions at least should be written in a compiled language (cython?). But I thought I'd solicit comments on the code first - is the object-oriented design cumbersome? Is including the algorithm in the name confusing to users? Is the calling convention for Hermite polynomials too confusing or error-prone?

Thanks,
Anne
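[For readers following along: the second barycentric form from Berrut and Trefethen can be sketched in a few lines. The function names and the O(n^2) weight computation are illustrative, not Anne's prototype.]

```python
import numpy as np

def barycentric_weights(xi):
    # w_j = 1 / prod_{k != j} (x_j - x_k); O(n^2), adequate for a sketch.
    xi = np.asarray(xi, dtype=float)
    w = np.empty(len(xi))
    for j in range(len(xi)):
        w[j] = 1.0 / np.prod(xi[j] - np.delete(xi, j))
    return w

def barycentric_eval(x, xi, yi, w):
    # Second (true) barycentric form:
    #   p(x) = sum_j(w_j y_j / (x - x_j)) / sum_j(w_j / (x - x_j))
    xi = np.asarray(xi, dtype=float)
    yi = np.asarray(yi, dtype=float)
    d = x - xi
    hit = np.nonzero(d == 0.0)[0]
    if hit.size:              # x coincides with a node: return the data value
        return yi[hit[0]]
    q = w / d
    return np.sum(q * yi) / np.sum(q)
```

Updating the y values then amounts to rebinding yi, since the weights depend only on the nodes; that is the efficient-update property described in the message.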
They all need work, in particular, efficiency would be improved by making the y values vectors, error checking needs to be more robust, and documentation is not in reST form. Ultimately, too, the evaluation functions at least should be written in a compiled language (cython?). But I thought I'd solicit comments on the code first - is the object-oriented design cumbersome? Is including the algorithm in the name confusing to users? Is the calling convention for Hermite polynomials too confusing or error-prone?
I really like the OO design. If you continue down this route, would you make the class new-style (inherit from object)? And why not use properties instead of set_yi? That would make it all the sweeter. Gabriel
On 19/04/2008, Gabriel Gellner <ggellner@uoguelph.ca> wrote:
I really like the OO design. If you continue down this route, would you make the class new-style (inherit from object)? And why not use properties instead of set_yi? That would make it all the sweeter.
Hmm. I'm lukewarm on using properties. It seems to me that if I provide set_foo functions people will assume that it's not safe to modify any attribute directly; if I override __setattr__ so that the right magic happens, that implicitly encourages people to change any attribute that doesn't raise an exception. Which means that I need to evaluate whether each attribute can be safely modified on the fly, and access those that can't through object.__setattr__(self,"name",value). I also need to explicitly document which attributes may be safely written to. Is this about right? Is this the new python standard practice?

More generally, it seems to me that generically, interpolation schemes should produce objects which can be evaluated like functions. They should perhaps have certain other standard features: providing derivatives, if they can, through one of a few standardly-named functions; providing acceptable input ranges (or recommended input ranges) in standardly-named attributes; ... anything else? What about other "function" objects in numpy/scipy that know their derivatives (or integrals) - should they provide extra functions to compute these? numpy.arctan.derivative(x)? numpy.arctan2.partial_derivative(x,y,1,2)?

It seems like "knowing your derivatives" comes in at least three forms: being able to compute the value of a particular derivative on demand (e.g. splev(x,tck,der=2)), being able to evaluate all (or some) derivatives on demand (e.g. KroghInterpolator(xi,yi).derivatives(x)), and being able to construct a full-fledged function object representing the derivative (e.g. poly1d([2,1,0]).deriv()). Certainly one can use any of these to fudge the others, but it involves varying degrees of clumsiness and inefficiency. I'm just thinking about how to make duck typing as useful as possible for interpolator objects.

Anne
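[Gabriel's property suggestion, applied to set_yi, might look like the sketch below on a new-style class. The names are hypothetical; the property(fget, fset) spelling is used because the decorator-setter syntax did not exist on the Python 2.4/2.5 of the day.]

```python
class Interpolator(object):  # new-style class, as Gabriel suggests
    def __init__(self, xi, yi):
        self.xi = list(xi)
        self._yi = list(yi)

    def _get_yi(self):
        return self._yi

    def _set_yi(self, values):
        # Validation runs transparently on "obj.yi = values", so users can
        # safely rebind this attribute without a set_yi() method.
        if len(values) != len(self.xi):
            raise ValueError("yi must have the same length as xi")
        self._yi = list(values)

    yi = property(_get_yi, _set_yi)
```

Attributes without a setter (or with a leading underscore) then signal "not safe to modify", which is one answer to the documentation worry.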
On 21/04/2008, Anne Archibald <peridot.faceted@gmail.com> wrote:
On 19/04/2008, Gabriel Gellner <ggellner@uoguelph.ca> wrote:
I really like the OO design. If you continue down this route, would you make the class new-style (inherit from object)? And why not use properties instead of set_yi? That would make it all the sweeter.
Hmm. I'm lukewarm on using properties. It seems to me that if I provide set_foo functions people will assume that it's not safe to modify any attribute directly;
If you don't want people to use a method, start its name with an underscore. In most cases, set_foo isn't necessary. Either use _foo privately, or use foo as a property which users can manipulate.
More generally, it seems to me that generically, interpolation schemes should produce objects which can be evaluated like functions.
Sounds good. Talking about derivatives, does anyone know whether http://en.wikipedia.org/wiki/Automatic_differentiation is of value? It's been on my TODO list for a while, but I haven't gotten round to studying it in detail. Regards Stéfan
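[The operator-overloading flavour of automatic differentiation that the article describes can be demonstrated with a toy dual-number class; this is purely illustrative, not a proposal for scipy.]

```python
import math

class Dual(object):
    """Carries a value and a derivative; arithmetic propagates both exactly."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def _coerce(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._coerce(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = self._coerce(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def d_sin(x):
    # Elementary functions supply their own derivative rule.
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

# Differentiate f(x) = x*sin(x) + 3 at x = 2: no symbolic algebra and
# no finite-difference truncation error.
x = Dual(2.0, 1.0)   # seed dx/dx = 1
y = x * d_sin(x) + 3
```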
On 21 Apr 2008, at 08:02, Stéfan van der Walt wrote:
Talking about derivatives, does anyone know whether
http://en.wikipedia.org/wiki/Automatic_differentiation
is of value? It's been on my TODO list for a while, but I haven't gotten round to studying it in detail.
I think Konrad Hinsen's Scientific.Functions.Derivatives subpackage has such functionality. Also, to come back to the original thread subject, Scientific.Functions.Interpolation is also able to generate interpolators (I don't know the algorithm). However, last time I checked, Scientific had no stable version for NumPy, only for the old Numeric package (which is the reason I stopped using Scientific). That was quite some time ago, though, so it may have changed by now. Cheers, Joris
On Mon, Apr 21, 2008 at 6:07 AM, Joris De Ridder <Joris.DeRidder@ster.kuleuven.be> wrote:
On 21 Apr 2008, at 08:02, Stéfan van der Walt wrote:
Talking about derivatives, does anyone know whether
http://en.wikipedia.org/wiki/Automatic_differentiation
is of value? It's been on my TODO list for a while, but I haven't gotten round to studying it in detail.
It is of great value for several reasons (e.g. see Eric Phipps' thesis at www.math.cornell.edu/~gucken/PDF/phipps.pdf). Our group at Cornell has been waiting a couple of years to see whether a standard package would emerge for us to interface into Python....
I think the Scientific.Functions.Derivatives subpackage of Konrad Hinsen has such functionality.
For one thing, I strongly dislike the way of interacting with the numeric objects and their derivatives in that package. It's not very Pythonic in its use of classes and duck typing. The other problem is, the situations where it would be of most use need so many computations that it's not an effective approach unless it is done at the level of C code.

A couple of years ago we tried interfacing to a new release of ADOL-C through SWIG, but found some extremely strange memory errors that we couldn't sort out. I think those only showed up when we tried to pass in data that had been prepared by SWIG from Python arrays. Anyway, getting a package like that properly interfaced would be the way forward as far as we are concerned. OpenAD looks like another good bet for an open-source library. Then again, the problem of efficiency becomes getting the user's functions into C code so that "source code transformation" can be performed.

I certainly like the idea of having all in-built functions "knowing" their derivatives, but it's not clear how these python-level representations can best be interfaced to C code, whether the basis for the AD is "source code transformation" or "operator overloading". I think there would need to be a new class that allows "user" functions that know their derivatives but which are defined in a piecewise fashion, e.g. to include solutions to differential equations (for instance) represented as interpolated polynomials.
Also, to come back to the original thread subject, Scientific.Functions.Interpolation is also able to generate Interpolators (don't know the algorithm).
It's only linear interpolation, but, on the plus side, it does support multi-dimensional meshes, which is a generalization I wholeheartedly endorse. Alas, multivariate polynomials or wavelet-type bases would be needed.

<soapbox> If we're going to start thinking "big" for supporting more mathematically natural functionality, I believe we ought to be thinking far enough out to support the basic objects of PDE computations too (or at least a compatible class structure and API), even if it's not fully utilized just yet. Scipy should support scientific computation based around mathematically-oriented fundamental objects and functionality (i.e. to hide the "dumbness" of arrays inside some sugar coating). I think writing and debugging complex mathematical tools will be a lot easier if we raise the bar a little and use somewhat more sophisticated basic objects than arrays (our efforts with Pointsets have helped enormously in writing sub-packages of PyDSTool, in particular for putting PyCont together so quickly). Such an approach will also be a lot easier to introduce to new and less technical scientific users.

The field of scientific computation is still weighed down by keeping old Fortran-style APIs up to date for re-use (of course, I'm guilty of this). Python brings a fresh opportunity to break these shackles, at least in terms of adding an extra level of indirection and duck-typed magic at the heart of scipy. Some imagination is needed here (a la "import antigravity" ;). </soapbox>

-Rob
On Mon, Apr 21, 2008 at 11:20:45PM -0400, Rob Clewley wrote:
<soapbox> If we're going to start thinking "big" for supporting more mathematically natural functionality, I believe we ought to be thinking far enough out to support the basic objects of PDE computations too (or at least a compatible class structure and API), even if it's not fully utilized just yet. Scipy should support scientific computation based around mathematically-oriented fundamental objects and functionality [...] </soapbox>
Sounds like sympy to me. Gaël
On Fri, Apr 25, 2008 at 9:26 PM, Gael Varoquaux <gael.varoquaux@normalesup.org> wrote:
On Mon, Apr 21, 2008 at 11:20:45PM -0400, Rob Clewley wrote:
<soapbox> If we're going to start thinking "big" for supporting more mathematically natural functionality, I believe we ought to be thinking far enough out to support the basic objects of PDE computations too (or at least a compatible class structure and API), even if it's not fully utilized just yet. Scipy should support scientific computation based around mathematically-oriented fundamental objects and functionality [...] </soapbox>
Sounds like sympy to me.
You sound dismissive, but symbolic expressions are just one part of it, and not at all what this thread has been about. There are plenty of other types of mathematical objects important in scientific computation. This thread has been about geometric objects such as curves, and in particular trajectories as numerical solutions to differential equations.
On Fri, Apr 25, 2008 at 10:58:01PM -0400, Rob Clewley wrote:
Sounds like sympy to me.
You sound dismissive, but symbolic expressions are just one part of it, and not at all what this thread has been about. There are plenty of other types of mathematical objects important in scientific computation. This thread has been about geometric objects such as curves, and in particular trajectories as numerical solutions to differential equations.
Yes, you are right. I just have the feeling that a good portion of the work has been done in sympy. Maybe I am wrong. Cheers, Gaël
It appears that scipy does not have a facility for using the Lagrange polynomial to interpolate data. (Or did I miss it?) I am well aware of the numerical difficulties this poses, but it (and its generalization, the Hermite polynomial) does prove useful on occasion. I have written prototype implementations of two algorithms for evaluating this polynomial, and I'd like comments before submitting them for inclusion in scipy.
In fact, I've seen that there was a lagrange function in scipy.interpolate:

In [5]: from scipy import interpolate
In [6]: interpolate.lagrange?
Type:           function
Base Class:     <type 'function'>
String Form:    <function lagrange at 0xb582a1ec>
Namespace:      Interactive
File:           /usr/lib/python2.4/site-packages/scipy/interpolate/interpolate.py
Definition:     interpolate.lagrange(x, w)
Docstring:      Return the Lagrange interpolating polynomial of the data-points (x,w)

In scipy 0.6, this seems to be broken, because numpy's poly1d is not imported at the top of the interpolate module:

In [16]: scipy.__version__
Out[16]: '0.6.0'
In [17]: x = array([1, 4, 5, 7])
In [18]: y = array([0, 1, 2, 1.5])
In [19]: poly = interpolate.lagrange(x,y)
---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
/usr/lib/python2.4/site-packages/scipy/interpolate/interpolate.py in lagrange(x, w)
     28     """
     29     M = len(x)
---> 30     p = poly1d(0.0)
     31     for j in xrange(M):
     32         pt = poly1d(w[j])
NameError: global name 'poly1d' is not defined

This is reported in ticket #626. I don't know why this function has been put aside and does not appear in the scipy.interpolate docstring.

-- LB
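[Until the ticket is fixed, the same polynomial can be built directly with numpy's poly1d, mirroring the scipy source quoted above. lagrange_poly here is a stopgap stand-in, not the repaired scipy function.]

```python
import numpy as np

def lagrange_poly(x, w):
    # Return the Lagrange interpolating polynomial of (x, w) as a poly1d,
    # following the same construction as scipy's lagrange().
    p = np.poly1d(0.0)
    for j in range(len(x)):
        pt = np.poly1d(w[j])
        for k in range(len(x)):
            if k != j:
                # Multiply in the factor (t - x_k) / (x_j - x_k).
                pt *= np.poly1d([1.0, -x[k]]) * (1.0 / (x[j] - x[k]))
        p += pt
    return p

x = np.array([1.0, 4.0, 5.0, 7.0])
y = np.array([0.0, 1.0, 2.0, 1.5])
poly = lagrange_poly(x, y)   # poly(4.0) recovers 1.0, etc.
```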
Hi Anne

I'm surprised there haven't been more responses to this thread. I would certainly like to see this code included. There's been some other work on the fitpack2 module (we're still waiting for an update on that front from John Travers). We should also think of including Stineman interpolation - http://projects.scipy.org/pipermail/scipy-dev/2006-April/005652.html - and Lanczos/sinc interpolation.

Regards
Stéfan

On 19/04/2008, Anne Archibald <peridot.faceted@gmail.com> wrote:
Hi,
It appears that scipy does not have a facility for using the Lagrange polynomial to interpolate data. (Or did I miss it?) I am well aware of the numerical difficulties this poses, but it (and its generalization, the Hermite polynomial) does prove useful on occasion. I have written prototype implementations of two algorithms for evaluating this polynomial, and I'd like comments before submitting them for inclusion in scipy.
One implementation is based on Krogh 1970, "Efficient Algorithms for Polynomial Interpolation and Numerical Differentiation"; it allows the construction of Hermite polynomials (where some derivatives at each point may also be specified) and the evaluation of arbitrary derivatives. It is based on a Neville-like algorithm, so that it does O(n^2) work when constructed but only O(n) per point evaluated, or O(n^2) per point for which all derivatives must be evaluated. (n here is the degree of the polynomial.)
The other implementation is based on Berrut and Trefethen 2004, "Barycentric Lagrange Interpolation". This implementation uses a rewriting of the Lagrange polynomial as a rational function, and is efficient and numerically stable. It also allows efficient updating of the y values at which interpolation occurs, as well as addition of new x values. It does not support evaluation of derivatives or construction of Hermite polynomials.
Finally, I have written a "PiecewisePolynomial" class for constructing splines in which each piece may have arbitrary degree, and for which the function values and some derivatives are specified at each knot. The intent is that this be used to represent solutions obtained from ODE solvers, using one polynomial for each solver step, with the same order as the solver's internal polynomial solution, and with (some) derivatives matching at the ends. Such a Trajectory object would contain additional information about the solution (for example stiffness or error estimates) beyond what is in PiecewisePolynomial. (I tried implementing Trajectory using splines, which are evaluated in compiled code, but their maximum degree is 5 while the solvers will go up to degree 12.)
They all need work, in particular, efficiency would be improved by making the y values vectors, error checking needs to be more robust, and documentation is not in reST form. Ultimately, too, the evaluation functions at least should be written in a compiled language (cython?). But I thought I'd solicit comments on the code first - is the object-oriented design cumbersome? Is including the algorithm in the name confusing to users? Is the calling convention for Hermite polynomials too confusing or error-prone?
Thanks,
Anne
_______________________________________________ SciPy-user mailing list SciPy-user@scipy.org http://projects.scipy.org/mailman/listinfo/scipy-user
On Sat, Apr 19, 2008 at 4:27 PM, Anne Archibald <peridot.faceted@gmail.com> wrote:
They all need work, in particular, efficiency would be improved by making the y values vectors, error checking needs to be more robust, and documentation is not in reST form. Ultimately, too, the evaluation functions at least should be written in a compiled language (cython?). But I thought I'd solicit comments on the code first - is the object-oriented design cumbersome?
It looks fine although Gabriel's suggestions would be good improvements. Some people may want single-function interfaces (e.g. krogh(xi, yi, x)); they can be written on top of the OO implementation rather more easily than otherwise.
Is including the algorithm in the name confusing to users?
I doubt it.
Is the calling convention for Hermite polynomials too confusing or error-prone?
It looks fine to me. For PiecewisePolynomial, I might transpose the yi input such that yi[0] is the function value, yi[1] is the first derivative, etc.

A nitpick: the code

    pos[pos==-1] = 0
    pos[pos==self.n-1] = self.n-2

can be replaced with

    pos.clip(0, self.n-2, out=pos)

But this all looks good, and I want it in scipy.

-- Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
On 20/04/2008, Robert Kern <robert.kern@gmail.com> wrote:
On Sat, Apr 19, 2008 at 4:27 PM, Anne Archibald <peridot.faceted@gmail.com> wrote:
They all need work, in particular, efficiency would be improved by making the y values vectors, error checking needs to be more robust, and documentation is not in reST form. Ultimately, too, the evaluation functions at least should be written in a compiled language (cython?). But I thought I'd solicit comments on the code first - is the object-oriented design cumbersome?
It looks fine although Gabriel's suggestions would be good improvements. Some people may want single-function interfaces (e.g. krogh(xi, yi, x)); they can be written on top of the OO implementation rather more easily than otherwise.
Reasonable, though really, is krogh_interpolate(xi, yi, x) much better than KroghInterpolator(xi, yi)(x)? It's also good to emphasize that the construction of the interpolating polynomial is a relatively slow process compared to its evaluation.
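[The comparison is between two spellings of the same thing; the procedural wrapper is a one-liner over the class. To make the trade-off concrete, here is a stdlib-only stand-in - a plain Newton form with no derivative support, not Anne's actual prototype:]

```python
class KroghInterpolator(object):
    """Minimal stand-in: pay the O(n^2) divided-difference construction
    once, then evaluate in O(n) per point."""
    def __init__(self, xi, yi):
        self.xi = list(xi)
        c = list(yi)
        n = len(c)
        # Newton divided differences, built in place.
        for j in range(1, n):
            for i in range(n - 1, j - 1, -1):
                c[i] = (c[i] - c[i - 1]) / float(self.xi[i] - self.xi[i - j])
        self.c = c

    def __call__(self, x):
        # Horner-like evaluation of the Newton form.
        result = self.c[-1]
        for i in range(len(self.c) - 2, -1, -1):
            result = result * (x - self.xi[i]) + self.c[i]
        return result

def krogh_interpolate(xi, yi, x):
    # The one-shot procedural spelling; rebuilds the interpolant each call.
    return KroghInterpolator(xi, yi)(x)
```

The class spelling amortizes the construction cost across many evaluations; the wrapper pays it again on every call, which is the performance point made here.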
Is the calling convention for Hermite polynomials too confusing or error-prone?
It looks fine to me. For PiecewisePolynomial, I might transpose the yi input such that yi[0] is the function value, yi[1] is the first derivative, etc.
The problem with this is that it forces the user to specify the same number of derivatives at each point. It may be desirable (for example) to specify the derivative at only one point but the function value at many. The way I did it you just pass in a nested list. If you have them in the form you suggest, a quick np.transpose([y,yp,ypp]) should get the form that I want.
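[To illustrate the two conventions - the call shapes here are hypothetical, matching Anne's description rather than the prototype's actual signature:]

```python
import numpy as np

# Nested-list form: each point carries as many derivatives as you like.
xi = [0.0, 1.0, 2.0]
yi_nested = [[1.0, 0.5],   # value and first derivative at x = 0
             [2.0],        # value only at x = 1
             [0.0]]        # value only at x = 2

# When every point has the same number of derivatives, Robert's transposed
# layout converts to the nested form in one call:
y = [1.0, 2.0, 0.0]
yp = [0.5, 0.3, -0.1]
yi_uniform = np.transpose([y, yp])   # row i is [y[i], yp[i]]
```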
A nitpick: the code
pos[pos==-1] = 0
pos[pos==self.n-1] = self.n-2
can be replaced with
pos.clip(0, self.n-2, out=pos)
But this all looks good, and I want it in scipy.
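[For the record, the two spellings agree on the index range that searchsorted-style code produces - a quick check, with a literal n standing in for self.n:]

```python
import numpy as np

n = 5
pos = np.array([-1, 0, 2, n - 1])   # plausible out-of-range endpoints

a = pos.copy()                      # the original two-line fixup
a[a == -1] = 0
a[a == n - 1] = n - 2

b = pos.copy()                      # Robert's clip() one-liner
b.clip(0, n - 2, out=b)
```

Note that clip() also remaps values strictly outside [-1, n-1], which the two-line version would leave alone; for indices coming from searchsorted the two behave identically.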
Ah, good. What's the story on including cython code in scipy? Is it an additional build dependency, and so to be avoided? Can it be used in a SWIG-like role to produce files that can be distributed and compiled with a C compiler? For any interpolation scheme, it's obviously essential that it be able to be evaluated rapidly... Anne
Ah, good.
What's the story on including cython code in scipy? Is it an additional build dependency, and so to be avoided? Can it be used in a SWIG-like role to produce files that can be distributed and compiled with a C compiler? For any interpolation scheme, it's obviously essential that it be able to be evaluated rapidly...
Yes, as long as you provide the generated C wrappers, the user can just compile them without having cython on their computer. Check out the setup.py example included in numpy svn at: http://svn.scipy.org/svn/numpy/trunk/numpy/doc/cython/ Gabriel
On Sun, Apr 20, 2008 at 09:03:05PM -0400, Anne Archibald wrote:
Reasonable, though really, is krogh_interpolate(xi, yi, x) much better than KroghInterpolator(xi, yi)(x)?
Yes. Some people don't understand functional/object code. We need to keep scipy accessible for them.
It's also good to emphasize that the construction of the interpolating polynomial is a relatively slow process compared to its evaluation.
Sure, then also provide KroghInterpolator. My 2 cents, Gaël
Gael Varoquaux wrote:
On Sun, Apr 20, 2008 at 09:03:05PM -0400, Anne Archibald wrote:
Reasonable, though really, is krogh_interpolate(xi, yi, x) much better than KroghInterpolator(xi, yi)(x)?
Yes. Some people don't understand functional/object code. We need to keep scipy accessible for them.
It's silly not to use core features of the target language because some people may not have yet learned them.
It's also good to emphasize that the construction of the interpolating polynomial is a relatively slow process compared to its evaluation.
This is a perfect reason to use an object. - Ed
On Mon, Apr 28, 2008 at 08:49:06AM -0700, Ed Rahn wrote:
Gael Varoquaux wrote:
On Sun, Apr 20, 2008 at 09:03:05PM -0400, Anne Archibald wrote:
Reasonable, though really, is krogh_interpolate(xi, yi, x) much better than KroghInterpolator(xi, yi)(x)?
Yes. Some people don't understand functional/object code. We need to keep scipy accessible for them.
It's silly not to use core features of the target language because some people may not have yet learned them.
No, it is not. I program in an environment, not alone, for myself. I want my colleagues to understand my code. I have found that they don't understand heavily functional programming. I have the choice between avoiding this style, even though I like it, or giving up on having my code reused. Well, I know which choice I make.

Don't blame these people, or if you do, then learn all the optics and electronics they know, and come and run our experiments please (and fix my photodiode, on which I am getting only 300kHz bandwidth while I need at least 1MHz). Computing is not their core job. I want scipy to be open to these users, because they can then plug it into a framework to e.g. control an experiment, interface to a database... This framework can be written in an elaborate way, they will never read it, but I want the option of having simple coding available.

One of the things my colleagues like about Matlab is that it doesn't force them to learn new concepts. What I hate about it is that it forbids me (who is writing the experiment-control framework) to use advanced concepts. We need to find a middle ground between the two. Cheers, Gaël
Gael Varoquaux wrote:
On Mon, Apr 28, 2008 at 08:49:06AM -0700, Ed Rahn wrote:
Gael Varoquaux wrote:
On Sun, Apr 20, 2008 at 09:03:05PM -0400, Anne Archibald wrote:
Reasonable, though really, is krogh_interpolate(xi, yi, x) much better than KroghInterpolator(xi, yi)(x)?
Yes. Some people don't understand functional/object code. We need to keep scipy accessible for them.
It's silly not to use core features of the target language because some people may not have yet learned them.
No, it is not. I program in an environment, not alone, for myself. I want my colleagues to understand my code. I have found that they don't understand heavily functional programming. I have the choice between avoiding this style, even though I like it, or giving up on having my code reused. Well, I know which choice I make.
The group of people who use scipy is much greater than you and your colleagues. The people who use scipy do so because it uses python. In this case a problem can be better solved, and the context better understood using objects.
Don't blame these people, or if you do, then learn all the optics and electronics they know, and come and run our experiments please (and fix my photodiode, on which I am getting only 300kHz bandwidth while I need at least 1MHz). Computing is not their core job. I want scipy to be open to these users, because they can then plug it into a framework to e.g. control an experiment, interface to a database... This framework can be written in an elaborate way, they will never read it, but I want the option of having simple coding available.
I have no problem blaming them. When I use a tool, it's my responsibility to understand it, not others' responsibility to work around me.
One of the things my colleagues like about Matlab is that it doesn't force them to learn new concepts. What I hate about it is that it forbids me (who is writing the experiment-control framework) to use advanced concepts. We need to find a middle ground between the two.
Working with so many matlab people, maybe octave would be a better fit for you? The APIs you write and expose to your colleagues can be independent of the scipy API. - Ed
On Mon, Apr 28, 2008 at 09:20:48AM -0700, Ed Rahn wrote:
The group of people who use scipy is much greater than you and your colleagues. The people who use scipy do so because it uses python. In this case a problem can be better solved, and the context better understood using objects.
Yes, that's all good. Do provide an elaborate interface, but don't kill the simple one.
One of the things my colleagues like about Matlab is that it doesn't force them to learn new concepts. What I hate about it is that it forbids me (who is writing the experiment-control framework) to use advanced concepts. We need to find a middle ground between the two.
Working with so many matlab people, maybe octave would be a better fit for you? The APIs you write and expose to your colleagues can be independent of the scipy API.
I like to use Python to drive the experiments. Why add another language? Scipy already fits the job. Why force the more elaborate way of doing things on people? The great thing about Python is that it allows you to do things simply if you want, while keeping the option of more complex design. Hell, I can write a script with no functions, let alone objects. If I was doing Java, I'd have to write:

"""
public static void main(String [ ] args) {
...
}
"""

In addition, by multiplying the languages you are putting more stress on the support people, on the users (who might have to learn the framework one day), and on the developers, who have to provide the same functionality in different languages. Keep it simple, use one language, and keep it open to non-advanced users. I don't see the cost of keeping a simple API; it doesn't kill the advanced one. Moreover, it provides a light learning curve for people who can later move on to the more advanced one. And it is very little code to support. Gaël
Gael Varoquaux wrote:
On Mon, Apr 28, 2008 at 09:20:48AM -0700, Ed Rahn wrote:
The group of people who use scipy is much greater than you and your colleagues. The people who use scipy do so because it uses python. In this case a problem can be better solved, and the context better understood using objects.
Yes, that's all good. Do provide an elaborate interface, but don't kill the simple one.
The original question was about how to implement a new interface; nothing is being killed. Simple and elaborate are relative to what one understands. Objects are a central part of python. To me it seems elaborate to not use a core feature of the language, and instead create some new type of information-passing architecture between appropriate functions.
One of the things my colleagues like about Matlab is that it doesn't force them to learn new concepts. What I hate about it is that it forbids me (who is writing the experiment-control framework) to use advanced concepts. We need to find a middle ground between the two.
Working with so many matlab people, maybe octave would be a better fit for you? The APIs you write and expose to your colleagues can be independent of the scipy API.
I like to use Python to drive the experiments. Why add another language? Scipy already fits the job. Why force on people the more elaborate way of making things?

I think at this point we are going in circles. I didn't suggest another language, simply a program that your colleagues might find more appropriate to their current knowledge and the problem at hand. It seems your argument is that scipy is better than matlab, but you want to keep the same semantics. New code in scipy should use python idioms, and not matlab or fortran ones simply because people have prior experience and feel comfortable with them. - Ed
On Mon, Apr 28, 2008 at 10:09:31AM -0700, Ed Rahn wrote:
Gael Varoquaux wrote:
On Mon, Apr 28, 2008 at 09:20:48AM -0700, Ed Rahn wrote:
The group of people who use scipy is much greater than you and your colleagues. The people who use scipy do so because it uses python. In this case a problem can be better solved, and the context better understood using objects.
Yes, that's all good. Do provide an elaborate interface, but don't kill the simple one.
The original question was about how to implement a new interface; nothing is being killed. Simple and elaborate are relative to what one understands. Objects are a central part of python. To me it seems elaborate to not use a core feature of the language, and instead create some new type of information-passing architecture between appropriate functions.
OK, maybe I wasn't clear. I would like scipy to expose, amongst other things, a simple procedural interface. It should also expose an object-oriented one. By doing this you are not at all crippling the language.
It seems your argument is that scipy is better than matlab, but you want to keep the same semantics. New code in scipy should use python idioms, and not matlab or fortran ones simple because people have prior experience and fell comfortable with them.
It is not a question of prior experience, it is just that it is conceptually simpler. I want a language that exposes multiple levels of complexity, so that different people can use different levels. That way I can work with people using the advanced features of the language when I want, without forcing them onto my collaborators. The fact that these idioms are better is irrelevant to someone who is trying to do simple things. And I don't buy the argument that simple things should be done in a different language, keeping scipy for complex problems. It doesn't have to be this way. I am not suggesting to make a bad design choice, or to go out of our way, I just want a two liner helper function. Gaël
On Mon, Apr 28, 2008 at 12:17 PM, Gael Varoquaux <gael.varoquaux@normalesup.org> wrote:
I am not suggesting to make a bad design choice, or to go out of our way, I just want a two liner helper function.
I'm afraid that *is* widely recognized as a bad design choice. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
On Mon, Apr 28, 2008 at 01:24:46PM -0500, Robert Kern wrote:
On Mon, Apr 28, 2008 at 12:17 PM, Gael Varoquaux <gael.varoquaux@normalesup.org> wrote:
I am not suggesting to make a bad design choice, or to go out of our way, I just want a two liner helper function.
I'm afraid that *is* widely recognized as a bad design choice.
I don't agree. We are not changing the design of scipy. Users are welcome not to use this procedural interface if they want. The choice is not ours.

Object-oriented code severely confuses beginners. In a French lab, in experimental physics, but also in biology and chemistry, bosses don't code, only PhD students do (I have heard a famous guy working in computational physics call his students "human-computer interfaces"). These PhD students stay in the lab between 3 and 4 years (yes, a PhD is short in France; that's because studies before the PhD are long). Thus they spend a good fraction of their time being beginners. Some pick up quickly, others have more difficulties. Some people don't need much out of a computer. Please, give them interfaces that don't confuse them.

I was once told that Matlab was superior to Python because it did not have confusing objects like lists, tuples, and arrays. Everything was an array. That's not true, by the way, but it reflects that some people are happy when they can do everything in a simple way, and not worry about the right way of doing things. Gaël
2008/4/28 Robert Kern <robert.kern@gmail.com>:
On Mon, Apr 28, 2008 at 12:17 PM, Gael Varoquaux
<gael.varoquaux@normalesup.org> wrote:
I am not suggesting to make a bad design choice, or to go out of our way, I just want a two liner helper function.
I'm afraid that *is* widely recognized as a bad design choice.
Well, I think the way to go is to mimic splrep/splev:

    def krogh_rep(xi, yi):
        return KroghInterpolator(xi, yi)

    def krogh_ev(kr, x, der=0):
        if der == 0:
            return kr(x)
        else:
            return kr.derivative(x, der=der)

The only danger is that if it ever raises an exception and the users see how simple it is they will feel like idiots. Anne
If the functionality of krogh_ev is useful, why not put it in a method? A compatibility module is a very good idea. But a concern with that is one of documentation. If I am a new user to both scipy and the splrep/splev library, I will be confused with which calling convention is best. As a developer more external explanation is needed and more code needs to be commented. More tests also need to be written to ensure full code coverage. - Ed Anne Archibald wrote:
2008/4/28 Robert Kern <robert.kern@gmail.com>:
On Mon, Apr 28, 2008 at 12:17 PM, Gael Varoquaux
<gael.varoquaux@normalesup.org> wrote:
I am not suggesting to make a bad design choice, or to go out of our way, I just want a two liner helper function.
I'm afraid that *is* widely recognized as a bad design choice.
Well, I think the way to go is to mimic splrep/splev:
    def krogh_rep(xi, yi):
        return KroghInterpolator(xi, yi)

    def krogh_ev(kr, x, der=0):
        if der == 0:
            return kr(x)
        else:
            return kr.derivative(x, der=der)
The only danger is that if it ever raises an exception and the users see how simple it is they will feel like idiots.
Anne _______________________________________________ SciPy-user mailing list SciPy-user@scipy.org http://projects.scipy.org/mailman/listinfo/scipy-user
2008/4/28 Ed Rahn <ed@lamedomain.net>:
If the functionality of krogh_ev is useful, why not put it in a method?
Because all it does is expose a method in a non-method-y way.
A compatibility module is a very good idea. But a concern with that is one of documentation. If I am a new user to both scipy and the splrep/splev library, I will be confused with which calling convention is best. As a developer more external explanation is needed and more code needs to be commented.
More tests also need to be written to ensure full code coverage.
What would you suggest? Would you care to write a few? I think KroghInterpolator is fairly thoroughly tested at this point, though I guess a few checks that it raises sensible errors when fed erroneous input wouldn't hurt. More generally, is it worth putting something in the tests for numerical code about stability? An example with degree thirty, say, and a reasonable error estimate, to make sure that modifications don't make the code less stable numerically? Anne
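Anne's proposed stability check could look something like the sketch below: a pure-NumPy barycentric evaluation (the Berrut & Trefethen scheme) standing in for the scipy class, exercised at degree thirty against a generous error bound. The function names, node choice, and tolerance here are illustrative assumptions, not anything from the thread's actual code.

```python
import numpy as np

def barycentric(xi, yi, x):
    """Barycentric Lagrange evaluation (Berrut & Trefethen 2004).
    Illustrative stand-in for the class under discussion."""
    xi = np.asarray(xi, dtype=float)
    yi = np.asarray(yi, dtype=float)
    # O(n^2): barycentric weights w_j = 1 / prod_{k != j} (x_j - x_k)
    w = np.array([1.0 / np.prod(xi[j] - np.delete(xi, j))
                  for j in range(len(xi))])
    out = np.empty(len(x))
    for k, xk in enumerate(x):
        d = xk - xi
        hit = np.isclose(d, 0.0)
        if hit.any():                      # xk coincides with a node
            out[k] = yi[hit][0]
        else:
            t = w / d
            out[k] = np.dot(t, yi) / t.sum()
    return out

def stability_check(interp_at, deg=30, tol=1e-8):
    """Regression test in the spirit of Anne's suggestion: a stable
    method must reproduce cos(x) from degree-`deg` Chebyshev data."""
    nodes = np.cos(np.pi * np.arange(deg + 1) / deg)  # Chebyshev-Lobatto
    x = np.linspace(-1.0, 1.0, 201)
    err = np.max(np.abs(interp_at(nodes, np.cos(nodes), x) - np.cos(x)))
    return err < tol
```

With equally spaced nodes instead of Chebyshev ones, the same check would fail for any polynomial method (the Runge phenomenon), so the node choice is part of what such a test pins down.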
Anne Archibald wrote:
2008/4/28 Ed Rahn <ed@lamedomain.net>:
If the functionality of krogh_ev is useful, why not put it in a method?
Because all it does is expose a method in a non-method-y way.
It also does some logic, which seems useful to some.
A compatibility module is a very good idea. But a concern with that is one of documentation. If I am a new user to both scipy and the splrep/splev library, I will be confused with which calling convention is best. As a developer more external explanation is needed and more code needs to be commented.
More tests also need to be written to ensure full code coverage.
What would you suggest? Would you care to write a few? I think KroghInterpolator is fairly thoroughly tested at this point, though I guess a few checks that it raises sensible errors when fed erroneous input wouldn't hurt.
I was specifically referring to the wrapper code. In the case given you'd just test that the logic is done right when setup and called properly, one with der equal to 0 and one without. Testing with erroneous data is a good idea.
More generally, is it worth putting something in the tests for numerical code about stability? An example with degree thirty, say, and a reasonable error estimate, to make sure that modifications don't make the code less stable numerically?
I haven't looked at the code or tests specifically, nor do I have a good understanding of the problem being solved. However, as a general rule I assume calculations based on external modules and libraries to be correct and only test internal logic. Tests in the external code should catch any changes to output caused by their modification. - Ed
On Mon, Apr 28, 2008 at 3:11 PM, Anne Archibald <peridot.faceted@gmail.com> wrote:
2008/4/28 Robert Kern <robert.kern@gmail.com>:
On Mon, Apr 28, 2008 at 12:17 PM, Gael Varoquaux
<gael.varoquaux@normalesup.org> wrote:
I am not suggesting to make a bad design choice, or to go out of our way, I just want a two liner helper function.
I'm afraid that *is* widely recognized as a bad design choice.
Well, I think the way to go is to mimic splrep/splev:
    def krogh_rep(xi, yi):
        return KroghInterpolator(xi, yi)

    def krogh_ev(kr, x, der=0):
        if der == 0:
            return kr(x)
        else:
            return kr.derivative(x, der=der)
The only danger is that if it ever raises an exception and the users see how simple it is they will feel like idiots.
Well those certainly aren't useful. The only functions I would consider adding would be "one-shot" functions, e.g.:

    def krogh(xi, yi, x):
        return KroghInterpolator(xi, yi)(x)

-- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
2008/4/28 Robert Kern <robert.kern@gmail.com>:
Well those certainly aren't useful. The only functions I would consider adding would be "one-shot" functions, e.g.:
def krogh(xi, yi, x): return KroghInterpolator(xi,yi)(x)
The problem here is that construction of the splines is an order degree**2 process, so I want an interface that encourages users to construct them once and for all. I think such an approach also discourages people from just

    y_interp = krogh(all_my_data_x, all_my_data_y, x_interp)

with hundreds of points, the results of which will be meaningless and horrible. That said, python's regex system does provide such an interface, so I can do it. Perhaps a long and cumbersome name - evaluate_krogh_interpolation, maybe. What would be painful would be to expose the internal workings of the interpolator, as splrep/splev do. Anne
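To make the construct-once point concrete, here is a minimal sketch: a hypothetical NewtonSketch class (not scipy's KroghInterpolator) whose divided-difference table costs O(n^2) once at construction, after which each evaluation is O(n); the one-shot function pays the setup cost on every call.

```python
import numpy as np

class NewtonSketch(object):
    """Toy Newton-form interpolator; illustrative stand-in only."""
    def __init__(self, xi, yi):
        # O(n^2): build the divided-difference table once.
        self.xi = np.asarray(xi, dtype=float)
        c = np.asarray(yi, dtype=float).copy()
        n = len(c)
        for level in range(1, n):
            for i in range(n - 1, level - 1, -1):
                c[i] = (c[i] - c[i - 1]) / (self.xi[i] - self.xi[i - level])
        self.c = c

    def __call__(self, x):
        # O(n) per point: Horner-style evaluation of the Newton form.
        p = self.c[-1]
        for i in range(len(self.c) - 2, -1, -1):
            p = p * (x - self.xi[i]) + self.c[i]
        return p

def krogh(xi, yi, x):
    # One-shot convenience: redoes the O(n^2) setup on every call.
    return NewtonSketch(xi, yi)(x)
```

So `f = NewtonSketch(xi, yi)` followed by many `f(x)` calls is cheap, whereas looping over `krogh(xi, yi, x)` rebuilds the table each time.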
On Mon, Apr 28, 2008 at 3:36 PM, Anne Archibald <peridot.faceted@gmail.com> wrote:
2008/4/28 Robert Kern <robert.kern@gmail.com>:
Well those certainly aren't useful. The only functions I would consider adding would be "one-shot" functions, e.g.:
def krogh(xi, yi, x): return KroghInterpolator(xi,yi)(x)
The problem here is that construction of the splines is an order degree**2 process, so I want an interface that encourages users to construct them once and for all. I think such an approach also discourages people from just
y_interp = krogh(all_my_data_x, all_my_data_y, x_interp)
with hundreds of points, the results of which will be meaningless and horrible.
If they repeat that with many x_interp with constant all_my_data, yes. However, there are a number of cases where they won't have many x_interp. I don't like it, either. I would have designed it precisely like you already have and left it as a clean, orthogonal API. But since people do request the alternative APIs, we need to at least consider them. My objection to krogh_rep() and krogh_ev() is that they don't address these people's concerns, either. Yes, they're functions instead of methods, but all they are doing is methody things on opaque objects with a different syntax. If people don't understand OO (and don't want to understand OO and don't have an actual need for multiple x_interps), these functions don't help them. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
On Mon, Apr 28, 2008 at 03:56:50PM -0500, Robert Kern wrote:
If they repeat that with many x_interp with constant all_my_data, yes. However, there are a number of cases where they won't have many x_interp.
Robert, you have understood my request very well, and I think you are right, only the function you are proposing should be added, and in addition its docstring should document how it is equivalent to the OOP approach, and why the OOP approach is better. Simple solutions are only for simple problems. If you want the solution to a more complex problem, use OOP. My 2 cents, Gaël
On Monday 28 April 2008 22:36, Anne Archibald wrote:
2008/4/28 Robert Kern <robert.kern@gmail.com>:
Well those certainly aren't useful. The only functions I would consider adding would be "one-shot" functions, e.g.:
def krogh(xi, yi, x): return KroghInterpolator(xi,yi)(x)
The problem here is that construction of the splines is an order degree**2 process, so I want an interface that encourages users to construct them once and for all. I think such an approach also discourages people from just
y_interp = krogh(all_my_data_x, all_my_data_y, x_interp)
with hundreds of points, the results of which will be meaningless and horrible.
You could store already constructed interpolation objects in a dictionary. (I didn't test it.):

    krogh_interpolator_cache = {}

    def evaluate_krogh_interpolation(all_my_data_x, all_my_data_y, x_interp):
        global krogh_interpolator_cache
        if (all_my_data_x, all_my_data_y) in krogh_interpolator_cache:
            theInterpolator = krogh_interpolator_cache[(all_my_data_x, all_my_data_y)]
            return theInterpolator(x_interp)
        else:
            newInterpolator = KroghInterpolator(all_my_data_x, all_my_data_y)
            krogh_interpolator_cache[(all_my_data_x, all_my_data_y)] = newInterpolator
            return newInterpolator(x_interp)

Of course you could empty the dictionary when there are more than a certain number of objects in it, to avoid memory leaks. When you have implemented this too, the function doesn't look so empty anymore, and then nobody has to feel like an idiot. :-) Kind regards, Eike.
On Mon, Apr 28, 2008 at 4:21 PM, Eike Welk <eike.welk@gmx.net> wrote:
On Monday 28 April 2008 22:36, Anne Archibald wrote:
2008/4/28 Robert Kern <robert.kern@gmail.com>:
Well those certainly aren't useful. The only functions I would consider adding would be "one-shot" functions, e.g.:
def krogh(xi, yi, x): return KroghInterpolator(xi,yi)(x)
The problem here is that construction of the splines is an order degree**2 process, so I want an interface that encourages users to construct them once and for all. I think such an approach also discourages people from just
y_interp = krogh(all_my_data_x, all_my_data_y, x_interp)
with hundreds of points, the results of which will be meaningless and horrible.
You could store already constructed interpolation objects in a dictionary. (I didn't test it.):
Arrays are unhashable and cannot be used as dictionary keys. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
On Monday 28 April 2008 23:24, Robert Kern wrote:
On Mon, Apr 28, 2008 at 4:21 PM, Eike Welk <eike.welk@gmx.net> wrote:
On Monday 28 April 2008 22:36, Anne Archibald wrote:
2008/4/28 Robert Kern <robert.kern@gmail.com>:
Well those certainly aren't useful. The only functions I would consider adding would be "one-shot" functions, e.g.:
def krogh(xi, yi, x): return KroghInterpolator(xi,yi)(x)
The problem here is that construction of the splines is an order degree**2 process, so I want an interface that encourages users to construct them once and for all. I think such an approach also discourages people from just
y_interp = krogh(all_my_data_x, all_my_data_y, x_interp)
with hundreds of points, the results of which will be meaningless and horrible.
You could store already constructed interpolation objects in a dictionary. (I didn't test it.):
Arrays are unhashable and cannot be used as dictionary keys.

Ah, yes!
One could however store a copy of the last x-data and y-data together with the interpolator. When x-data and y-data are the same at the next call, the interpolator could be reused. Kind regards, Eike.
On Tuesday 29 April 2008 00:04, Eike Welk wrote:
On Monday 28 April 2008 23:24, Robert Kern wrote:
On Mon, Apr 28, 2008 at 4:21 PM, Eike Welk <eike.welk@gmx.net>
wrote:
On Monday 28 April 2008 22:36, Anne Archibald wrote:
2008/4/28 Robert Kern <robert.kern@gmail.com>:
Well those certainly aren't useful. The only functions I would consider adding would be "one-shot" functions, e.g.:
def krogh(xi, yi, x): return KroghInterpolator(xi,yi)(x)
The problem here is that construction of the splines is an order degree**2 process, so I want an interface that encourages users to construct them once and for all. I think such an approach also discourages people from just
y_interp = krogh(all_my_data_x, all_my_data_y, x_interp)
with hundreds of points, the results of which will be meaningless and horrible.
You could store already constructed interpolation objects in a dictionary. (I didn't test it.):
Arrays are unhashable and cannot be used as dictionary keys.
Ah, yes!
One could however store a copy of the last x-data and y-data together with the interpolator. When x-data and y-data are the same at the next call, the interpolator could be reused. The function could look like this:
    krogh_interpolator_cache = []

    def evaluate_krogh_interpolation(all_my_data_x, all_my_data_y, x_interp):
        global krogh_interpolator_cache
        maxLen = 3
        # search for already existing interpolators
        for record in krogh_interpolator_cache:
            x, y, interp = record
            if (x == all_my_data_x).all() and (y == all_my_data_y).all():
                return interp(x_interp)
        # limit size of cache
        if len(krogh_interpolator_cache) >= maxLen:
            krogh_interpolator_cache = []
        # create new interpolator
        newInterpolator = KroghInterpolator(all_my_data_x, all_my_data_y)
        krogh_interpolator_cache.append((all_my_data_x.copy(),
                                         all_my_data_y.copy(),
                                         newInterpolator))
        return newInterpolator(x_interp)

Kind regards, Eike.
On Tue, Apr 29, 2008 at 11:57 AM, Eike Welk <eike.welk@gmx.net> wrote:
The function could look like this:
    krogh_interpolator_cache = []

    def evaluate_krogh_interpolation(all_my_data_x, all_my_data_y, x_interp):
        global krogh_interpolator_cache
        maxLen = 3
        # search for already existing interpolators
        for record in krogh_interpolator_cache:
            x, y, interp = record
            if (x == all_my_data_x).all() and (y == all_my_data_y).all():
                return interp(x_interp)
        # limit size of cache
        if len(krogh_interpolator_cache) >= maxLen:
            krogh_interpolator_cache = []
        # create new interpolator
        newInterpolator = KroghInterpolator(all_my_data_x, all_my_data_y)
        krogh_interpolator_cache.append((all_my_data_x.copy(),
                                         all_my_data_y.copy(),
                                         newInterpolator))
        return newInterpolator(x_interp)
Please, don't spend too much time playing with this. We will not implement caching here. Caches are something better left for applications, not general libraries because it requires too much consideration of details only the application writer knows. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
On Tue, Apr 29, 2008 at 01:41:56PM -0500, Robert Kern wrote:
Please, don't spend too much time playing with this. We will not implement caching here. Caches are something better left for applications, not general libraries because it requires too much consideration of details only the application writer knows.
+1. Gaël
2008/4/28 Robert Kern <robert.kern@gmail.com>:
Well those certainly aren't useful. The only functions I would consider adding would be "one-shot" functions, e.g.:
def krogh(xi, yi, x): return KroghInterpolator(xi,yi)(x)
I agree. But I also think Anne's original interface was simple enough as well as intuitive, and that we should stick to it. A good docstring should comfortably bridge the gap for users not too familiar with objects. Regards Stéfan
One of the things my colleagues like with Matlab is that it doesn't force them to learn new concepts. What I hate with it is that it forbids me (who is writing the experiment-control framework) from using advanced concepts. We need to find a middle ground between the two.
I completely agree with you. We, as developers, must bring the tools so that as many people as possible use Python. And object-oriented programming is not well known in the scientific community (at least the French one). They are used to Matlab and C/Fortran, nothing else. Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher
On Tue, Apr 29, 2008 at 09:12:17PM +0200, Matthieu Brucher wrote:
I completely agree with you. We, as developers, must bring the tools so that as many people as possible use Python. And object-oriented programming is not well known in the scientific community (at least the French one). They are used to Matlab and C/Fortran, nothing else.
But we (you and I) want to be freed of these chains! :) Gaël
I'm always glad to see more interpolation methods in scipy -- nice job Anne. My 2¢ on the discussion so far: I think the best solution so far suggested is no wrapper function in the package, but one described in the docstring (it is only two lines long, after all...). Namespaces get cluttered enough without all that extra stuff, and even if people don't get OO, they can all certainly read a docstring. -Rob ---- Rob Hetland, Associate Professor Dept. of Oceanography, Texas A&M University http://pong.tamu.edu/~rob phone: 979-458-0096, fax: 979-845-6331
On Tue, Apr 29, 2008 at 10:26:49PM +0200, Rob Hetland wrote:
Namespaces get cluttered enough without all that extra stuff, and even if people don't get OO, they can all certainly read a docstring.
These people don't read docs. In addition, the people I am thinking about don't know what a docstring is.

I used to think that it was normal to have to go through a learning process to use a tool, hardware or software. Nowadays I have learned an incredible amount of tools, and I am faced with new tools on a day-to-day basis. Some are forced upon me, some I choose to use, some I try out and drop later on. I am also heavily overworked (it's almost a culture for me). When I want to get some problem solved, I rarely have much time to spend on it. Quite often I need a solution real fast. As a result I try out a tool. If I am not able to produce something fast, I give it up. The tool might be excellent. I might even be convinced the tool is excellent, but I have to move along and use a tool that I believe is less technically good, because it has a smaller learning curve.

I am glad that I learned Unix system administration, C coding, VIM, Make, Blender, and all those powerful tools with steep learning curves when I had time a few years ago. I just love those tools. However, I am so grateful when I can pick up a software like Inkscape that makes it really easy for me to draw something quickly. I thank the developers of this software for making it really easy to start with.

Usability is making the software match the user's expectations, helping him make the first step, and also helping him move forward from his beginner's workflow to a more advanced and efficient one. Python is great in this respect. You can start with scripting, then use functions, and later objects. You don't have to know what objects are to start with, but if you use Python long enough you will absorb these concepts and will have learned useful things. Gaël
On Tue, Apr 29, 2008 at 3:39 PM, Gael Varoquaux <gael.varoquaux@normalesup.org> wrote:
On Tue, Apr 29, 2008 at 10:26:49PM +0200, Rob Hetland wrote:
Namespaces get cluttered enough without all that extra stuff, and even if people don't get OO, they can all certainly read a docstring.
These people don't read docs. In addition the people I am thinking about don't know what a docstring is.
In that case, I see no reason to cater to them. I will bend over backwards to help someone who will meet me even a tenth of the way, but I just cannot care about someone who will not put forth the de minimis effort of using help(). -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
On 29/04/2008, Rob Hetland <hetland@tamu.edu> wrote:
I'm always glad to see more interpolation methods in scipy -- nice job Anne.
My 2¢ on the discussion so far: I think the best solution so far suggested is no wrapper function in the package, but one described in the docstring (it is only two lines long, after all...). Namespaces get cluttered enough without all that extra stuff, and even if people don't get OO, they can all certainly read a docstring.
Well, actually, this brings up a concern I have. Suppose I want to use the class scipy.interpolate.interp1d, but I don't know how.

    In [18]: scipy.interpolate.interp1d?
    Type:           type
    Base Class:     <type 'type'>
    String Form:    <class 'scipy.interpolate.interpolate.interp1d'>
    Namespace:      Interactive
    File:           /usr/lib/python2.4/site-packages/scipy/interpolate/interpolate.py
    Docstring:
        Interpolate a 1D function.

        See Also
        --------
        splrep, splev - spline interpolation based on FITPACK
        UnivariateSpline - a more recent wrapper of the FITPACK routines

Uh, great, but how do I actually *use* it?

    In [19]: scipy.interpolate.interp1d.__init__?
    String Form:    <unbound method interp1d.__init__>
    Namespace:      Interactive
    File:           /usr/lib/python2.4/site-packages/scipy/interpolate/interpolate.py
    Definition:     scipy.interpolate.interp1d.__init__(self, x, y, kind='linear', axis=-1, copy=True, bounds_error=True, fill_value=nan)
    Docstring:
        Initialize a 1D linear interpolation class.

        Description
        -----------
        x and y are arrays of values used to approximate some function f:
        y = f(x)

        This class returns a function whose call method uses linear
        interpolation to find the value of new points.

        Parameters
        ----------
        x : array

How are users supposed to find .__init__? This is what they need to use to actually create an instance of the class, but the information is not presented when they look at the class' docstring. Even a moderately experienced user might have no idea there was a method called __init__ whose docstring would have alleviated their suffering. Should the class' docstring suggest users look at the docstring of .__init__? Should it *include* that docstring? There must be a general python solution to this... Anne
On Tue, Apr 29, 2008 at 3:54 PM, Anne Archibald <peridot.faceted@gmail.com> wrote:
Suppose I want to use the class scipy.interpolate.interp1d, but I don't know how.
In [18]: scipy.interpolate.interp1d? Type: type Base Class: <type 'type'> String Form: <class 'scipy.interpolate.interpolate.interp1d'>
Namespace: Interactive File: /usr/lib/python2.4/site-packages/scipy/interpolate/interpolate.py Docstring: Interpolate a 1D function.
See Also -------- splrep, splev - spline interpolation based on FITPACK UnivariateSpline - a more recent wrapper of the FITPACK routines
Uh, great, but how do I actually *use* it?
Well, the version of IPython I have does include the constructor's information when you do "interp1d?". So does help(). -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
I would have thought the general expectation (independent of ipython specifics) is that users should try help(interp1d) which indeed gives the docstring for __init__ (and more besides). -Rob
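Rob's observation can be verified programmatically: help() (via the pydoc module) merges the class docstring with its methods' docstrings, including __init__. A toy class (hypothetical, not the scipy one) shows the rendered help text containing the constructor documentation:

```python
import pydoc

class Demo(object):
    """Interpolate a 1D function (toy example)."""
    def __init__(self, x, y):
        """Construct from sample arrays x and y."""
        self.x, self.y = x, y

# help(Demo) would print this same text; pydoc.plain strips the
# overstrike "bold" markup so we can inspect it as a plain string.
text = pydoc.plain(pydoc.render_doc(Demo))
```

Both "Interpolate a 1D function" and "Construct from sample arrays x and y" appear in `text`, so a user calling help() on the class does see the constructor's docstring.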
Could someone please make the new interpolation classes into new-style classes? And I don't know if it's considered a big deal, but for future compatibility maybe the exception raising should be done in the functional style: ValueError("message") rather than ValueError, message ? Maybe it's time I asked for SVN access!
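For reference, both requests amount to only a few characters each; a hedged sketch with a hypothetical class name (the Python 2 statement form `raise ValueError, "message"` appears only in a comment, since it is a syntax error under Python 3):

```python
# New-style class: inherits from object, which is what makes
# properties and the rest of the descriptor machinery work on
# Python 2. (KroghSketch is a stand-in, not the scipy class.)
class KroghSketch(object):
    def __init__(self, xi, yi):
        if len(xi) != len(yi):
            # Functional raise style, as requested; the old
            # ``raise ValueError, "..."`` form is going away.
            raise ValueError("xi and yi must have the same length")
        self.xi = list(xi)
        self.yi = list(yi)
```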
2008/4/30 Rob Clewley <rob.clewley@gmail.com>:
Could someone please make the new interpolation classes into new-style classes? And I don't know if it's considered a big deal, but for future compatibility maybe the exception raising should be done in the functional style: ValueError("message") rather than ValueError, message ?
I can clean those up. But I'm not sure how to set things up as "properties" so that the right things happen when users try to manipulate the attributes. I'll also go in and add simple single-function-call interfaces at the same time. It may be a few days though. Could somebody point me at a link on making proper use of properties? Thanks, Anne
On Wed, Apr 30, 2008 at 2:03 AM, Anne Archibald <peridot.faceted@gmail.com> wrote:
2008/4/30 Rob Clewley <rob.clewley@gmail.com>:
Could someone please make the new interpolation classes into new-style classes? And I don't know if it's considered a big deal, but for future compatibility maybe the exception raising should be done in the functional style: ValueError("message") rather than ValueError, message ?
I can clean those up. But I'm not sure how to set things up as "properties" so that the right things happen when users try to manipulate the attributes.
If it doesn't feel right to you, don't do it. Your point about properties giving a false sense of safety in this case is quite valid. If it's not obvious how to apply properties nicely here, that might be a good sign that properties aren't appropriate.
2008/4/30 Robert Kern <robert.kern@gmail.com>:
On Wed, Apr 30, 2008 at 2:03 AM, Anne Archibald
<peridot.faceted@gmail.com> wrote:
2008/4/30 Rob Clewley <rob.clewley@gmail.com>:
Could someone please make the new interpolation classes into new-style classes? And I don't know if it's considered a big deal, but for future compatibility maybe the exception raising should be done in the functional style: ValueError("message") rather than ValueError, message ?
I can clean those up. But I'm not sure how to set things up as "properties" so that the right things happen when users try to manipulate the attributes.
If it doesn't feel right to you, don't do it. Your point about properties giving a false sense of safety in this case is quite valid. If it's not obvious how to apply properties nicely here, that might be a good sign that properties aren't appropriate.
Well, actually what I meant was I've never used properties at all and didn't find any useful documentation. I had in mind a fairly draconian configuration that made nearly everything readonly, though I suppose that would become cumbersome within my own methods. And anyway it doesn't really work for numpy arrays, since someone can always do b = A.unwritable_array; b[i,j]=3. But set_yi is a bit ugly. Anne
On Wed, Apr 30, 2008 at 3:03 AM, Anne Archibald <peridot.faceted@gmail.com> wrote:
2008/4/30 Robert Kern <robert.kern@gmail.com>:
On Wed, Apr 30, 2008 at 2:03 AM, Anne Archibald
<peridot.faceted@gmail.com> wrote:
I can clean those up. But I'm not sure how to set things up as "properties" so that the right things happen when users try to manipulate the attributes.
If it doesn't feel right to you, don't do it. Your point about properties giving a false sense of safety in this case is quite valid. If it's not obvious how to apply properties nicely here, that might be a good sign that properties aren't appropriate.
Well, actually what I meant was I've never used properties at all and didn't find any useful documentation. I had in mind a fairly draconian configuration that made nearly everything readonly, though I suppose that would become cumbersome within my own methods.
Properties Lesson #1: Don't do that. :-) Properties are useful to add functionality, or to expose functionality with attribute syntax, which may be appropriate. *Removing* functionality is not a good use of properties. But anyways, the general solution for the internal cumbersomeness is to not use the properties internally. Instead, the property should map to a _private attribute, and internally, you just manipulate those.
And anyway it doesn't really work for numpy arrays, since someone can always do b = A.unwritable_array; b[i,j]=3. But set_yi is a bit ugly.
All things considered, it's fine. Call it update_yi() if you feel it's more appropriate. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
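The pattern Robert describes, a public attribute backed by a private one so that updates can trigger recomputation while internal code bypasses the property machinery, might look like this sketch (class and attribute names are illustrative, not scipy's):

```python
class InterpSketch(object):
    """Hypothetical interpolator skeleton showing the property idiom."""
    def __init__(self, xi, yi):
        self.xi = list(xi)
        self._yi = None        # private storage behind the property
        self.yi = yi           # runs the setter once at construction

    @property
    def yi(self):
        return self._yi

    @yi.setter
    def yi(self, new_yi):
        if len(new_yi) != len(self.xi):
            raise ValueError("yi must match xi in length")
        self._yi = list(new_yi)
        self._rebuild()        # updating yi triggers recomputation

    def _rebuild(self):
        # Internal code works with self._yi directly, avoiding the
        # property machinery (stand-in for the O(n^2) table update).
        self.table = list(self._yi)
```

This gives set_yi-style behavior with attribute syntax, which is Robert's Lesson #1 applied: the property adds behavior rather than removing it.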
On Wed, Apr 30, 2008 at 03:03:13AM -0400, Anne Archibald wrote:
2008/4/30 Rob Clewley <rob.clewley@gmail.com>:
Could someone please make the new interpolation classes into new-style classes? And I don't know if it's considered a big deal, but for future compatibility maybe the exception raising should be done in the functional style: ValueError("message") rather than ValueError, message ?
Also the 'ValueError, message' format is going away in python 3000: http://www.python.org/dev/peps/pep-3109/#compatibility-issues
I can clean those up. But I'm not sure how to set things up as "properties" so that the right things happen when users try to manipulate the attributes. I'll also go in and add simple single-function-call interfaces at the same time. It may be a few days though.
Could somebody point me at a link on making proper use of properties?
Not to disagree with the argument that if you find properties unnatural don't use them, but to give a link that I like... This link goes over descriptors in general, which I found useful for understanding the magic of properties: http://users.rcn.com/python/download/Descriptor.htm If this doesn't cover what you want to do I can drum up some more links. Gabriel
2008/4/30 Rob Clewley <rob.clewley@gmail.com>:
Could someone please make the new interpolation classes into new-style classes? And I don't know if it's considered a big deal, but for future compatibility maybe the exception raising should be done in the functional style: ValueError("message") rather than ValueError, message ?
Done. Plus the procedural versions are there. (Thirty-eight lines of docstring for two lines of function!) Anne
participants (12)
- Anne Archibald
- Ed Rahn
- Eike Welk
- Gabriel Gellner
- Gael Varoquaux
- Joris De Ridder
- LB
- Matthieu Brucher
- Rob Clewley
- Rob Hetland
- Robert Kern
- Stéfan van der Walt