2) I read the tech report, and the approach looks really good, considering how often the approximation is sought as a linear combination of basis functions (which in turn depend on tunable parameters); a sketch of that kind of model follows below.
I didn't understand the reason for storing the Jacobian in transposed form. What significant difference does it make?
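For reference, a minimal sketch (my own illustration, not taken from the tech report) of the kind of model meant above: Gaussian basis functions combined linearly, where the centers and widths are the tunable parameters the basis functions depend on. All names here are hypothetical.

import numpy as np

def model(t, coeffs, centers, widths):
    """Linear combination of Gaussian basis functions.

    `coeffs` enter linearly; `centers` and `widths` are the tunable
    (nonlinear) parameters that each basis function depends on.
    """
    # phi_j(t) = exp(-((t - centers[j]) / widths[j])**2), one column per j
    basis = np.exp(-((t[:, None] - centers[None, :]) / widths[None, :]) ** 2)
    return basis @ coeffs

# Evaluate a three-term approximation on a small grid.
t = np.linspace(0.0, 1.0, 50)
y = model(t,
          coeffs=np.array([1.0, 0.5, 2.0]),
          centers=np.array([0.2, 0.5, 0.8]),
          widths=np.array([0.10, 0.15, 0.10]))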
Subject: Re: [SciPy-Dev] GSOC Optimization Project
From: gregor.thalhammer@gmail.com
Date: Mon, 9 Mar 2015 19:42:45 +0100
CC: n59_ru@hotmail.com
To: scipy-dev@scipy.org
Hi! I was tricked by StackEdit, so I am repeating my message with the working link. Sorry about that.
I did some research on suitable algorithmic approaches and want to present my pre-proposal to you:
I have to admit that I'm a bit overwhelmed by the number of papers, books, etc. on this subject. I tried my best to come up with some kind of practical approach. Now I need feedback and review. I'm also eager to hear a word from Pauli Virtanen, as we don't know yet whether this project can happen at all.
Dear Nikolay,
I hope you will be successful with your proposal. In the past I have been unhappy with the current MINPACK-based implementation and ended up writing my own Python-based implementation of the Levenberg-Marquardt algorithm, specialized for my needs. Several other translations are floating around, so it seems there is a real need for a more flexible (class-based) implementation that provides easy customization by users. Years ago, my use case was fitting images (2D Gaussians or slightly more complicated models), calculating the Jacobian directly (no numeric differentiation). Speed was important. I just want to share some findings and ideas that I would be happy to see covered by future improvements to the scipy code.
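To make that use case concrete, here is a rough sketch (my own, not Gregor's code) of a 2D Gaussian model together with its analytically computed Jacobian, i.e. the model/Jacobian pair such an LM implementation consumes; the function names and parameterization are assumptions for illustration.

import numpy as np

def gaussian_2d(params, x, y):
    """Isotropic 2D Gaussian: A * exp(-r^2 / (2 s^2)) + offset."""
    A, x0, y0, s, offset = params
    g = np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2.0 * s ** 2))
    return A * g + offset

def gaussian_2d_jacobian(params, x, y):
    """Analytic Jacobian: one row per pixel, one column per parameter."""
    A, x0, y0, s, offset = params
    g = np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2.0 * s ** 2))
    d_A = g
    d_x0 = A * g * (x - x0) / s ** 2
    d_y0 = A * g * (y - y0) / s ** 2
    d_s = A * g * ((x - x0) ** 2 + (y - y0) ** 2) / s ** 3
    d_offset = np.ones_like(g)
    return np.column_stack([d_A, d_x0, d_y0, d_s, d_offset])

Here x and y would be the flattened pixel coordinate grids of the image being fitted, so the Jacobian has one row per pixel and five columns.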
* In my case (many observations, few parameters) directly solving the normal equations was a lot faster; this was the main reason I did not use the MINPACK implementation (see the sketch after this list).
* For the QR decomposition, using the scipy implementation instead of MINPACK's gives better performance, especially when linked against optimized libraries (MKL, ATLAS, …).
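The two bullet points can be illustrated with a small sketch (my own, not Gregor's code) of a single Gauss-Newton step computed both ways with scipy.linalg: via the normal equations with a Cholesky factorization, which is cheap when the Jacobian is tall (many observations, few parameters) but squares the condition number, and via an economy-size QR factorization of the Jacobian, which is more robust and is backed by LAPACK (and hence by MKL/ATLAS when scipy is linked against them). A full LM step would additionally add a damping term to J^T J; that is omitted here.

import numpy as np
from scipy.linalg import cho_factor, cho_solve, qr, solve_triangular

# A tall Jacobian: many observations (rows), few parameters (columns).
rng = np.random.default_rng(0)
J = rng.standard_normal((100_000, 5))
r = rng.standard_normal(100_000)   # current residuals

# Step via the normal equations: (J^T J) p = -J^T r, solved by Cholesky.
JTJ = J.T @ J
c, low = cho_factor(JTJ)
p_normal = cho_solve((c, low), -J.T @ r)

# Same step via an economy-size QR of J: R p = -Q^T r.
Q, R = qr(J, mode='economic')
p_qr = solve_triangular(R, -Q.T @ r)

print(np.allclose(p_normal, p_qr))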