Nonlinear least squares problem
spellucci at fb04373.mathematik.tu-darmstadt.de
Fri Apr 4 16:58:15 CEST 2008
In article <0354420e-819d-4ba3-b61a-faaffd46b65c at r9g2000prd.googlegroups.com>,
Uwe Kotyczka <uwe.kotyczka at web.de> writes:
>Hello, sorry for multiposting, but I am really looking
>for a hint on how to solve my problem. And no, I don't
>use Matlab, but maybe the Matlab people have an idea.
>I have to solve a nonlinear least squares problem.
>Let me give you some background first. Imagine
>you have a tool to process a work piece, say
>polishing a piece of glass. The tool behaves
>differently at different locations on the piece,
>and I can describe that behaviour. Now the tool
>shall smooth the surface of the work piece.
>Next, I have information about the piece before
>handling it. What I have to find is the optimal
>time curve for the tool to obtain a perfectly
>smooth surface.
>How to formulate the problem?
>Given a time vector (t_j), I have a function
>g which computes the remaining error (e_i):
>(e_i) = g(t_j).
>The residual error is given at, say, 100 points;
>(t_j) is sought at 200 points.
>My idea was to make the (t_j) a function of
>a few parameters, (t_j) = h(p_k), say 15
>parameters. So the composed function
>(e_i) = g(t_j) = g(h(p_k)) =: f(p_k) is to be minimized
>in the sense (e_i) - c -> min, where c is a constant,
>the end level of the surface.
>To solve this problem I use a "C" implementation
>of the Levenberg-Marquardt algorithm as you can find
>it in the LevMar Package (www.ics.forth.gr/~lourakis/levmar/).
>The function g contains the information about the
>tool and about the initial surface. For the function
>h I tried several approaches, e.g. making the time a
>cubic spline through a few selected times, or some other parametrization.
>Now, what is my problem? With the above I do find
>solutions; however, many solutions seem to
>give very similar remaining errors. The only problem
>is that the corresponding time vectors,
>(t_j_optimal) = h(p_k_optimal), look very different
>from one optimal solution to the next.
>In particular, the optimization algorithm often prefers
>solutions where the time vector oscillates heavily.
>Now this is something I _must_ suppress, but I have no
>idea how. The oscillation of the (t_j) depends on
>the ansatz for h and on the number of parameters (p_k).
>If f were a linear function, then the matrix
>representing it would be a band matrix with many
>nonzero diagonals. How many depends on the
>ratio of tool diameter to piece diameter.
>Now, here are my questions: Is the problem properly
>formulated? Can I expect to find non-oscillating
>solutions? Is it normal that taking more parameters
>(p_k) makes things worse? What else should I
>consider? Is this verbal description sufficient?
>Thank you very much in advance.
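[A common remedy for the oscillation described above, not mentioned in the thread itself, is Tikhonov-style regularization: append a scaled second-difference penalty on the time vector to the residual vector that the least-squares solver sees. Below is a minimal numpy sketch using a toy *linear* stand-in for g (a band matrix A, as the post describes); all names and values are illustrative.]

```python
import numpy as np

# Toy linear stand-in for the poster's g: residuals are A t - e0, with A
# a band matrix (the tool only affects nearby locations).  With the real
# nonlinear g, the same penalty rows are simply appended to the residual
# vector that Levenberg-Marquardt minimizes.
rng = np.random.default_rng(0)
n_e, n_t = 100, 200                      # 100 error points, 200 time points
A = np.zeros((n_e, n_t))
for i in range(n_e):
    j = 2 * i                            # each error point sees a small band
    A[i, max(0, j - 3):min(n_t, j + 4)] = 1.0
e0 = rng.normal(size=n_e)                # measured initial surface error

# Second-difference operator D: penalizing ||D t||^2 discourages
# oscillation of the time vector without fixing its overall shape.
D = np.diff(np.eye(n_t), n=2, axis=0)    # shape (n_t - 2, n_t)

lam = 1.0                                # regularization weight, tune per problem
# Solve  min_t ||A t - e0||^2 + lam^2 ||D t||^2  as one stacked least squares
A_aug = np.vstack([A, lam * D])
b_aug = np.concatenate([e0, np.zeros(n_t - 2)])
t, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)

# For comparison: the unregularized (minimum-norm) solution
t_unreg, *_ = np.linalg.lstsq(A, e0, rcond=None)

def rough(x):
    """Sum of squared second differences: a simple oscillation measure."""
    return float(np.sum(np.diff(x, n=2) ** 2))

# By optimality of t, rough(t) <= rough(t_unreg): the penalty can only
# trade a little residual for a smoother time vector.
```

[With the nonlinear g from the post, the same idea carries over to the LevMar package: extend the residual function so it returns the n_e error components followed by the lam-scaled second differences of h(p), and extend the Jacobian with the corresponding rows.]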
Wouldn't the normal way to describe this be as an optimal control
problem? And won't there be constraints?
Let us take the example of the polishing problem: your tool must move
over the surface and actually operates over a (small) disc. You must design
a path for the disc center such that the working area of the tool covers the whole
surface. If you assume location-dependent roughness, then it might be impossible
to polish one location perfectly in a single pass, so the path must be traversed several times.
Moreover, due to heating during the polishing process, it may be necessary to introduce gaps
in the path in order to avoid overheating some areas. This may take time
and/or energy, and you might want to solve a minimal-time or minimal-energy
problem subject to your constraints. If you are lucky, you end up
with a convex problem, which will exhibit a unique solution. Otherwise
you might be trapped in a local optimum.
Such a path could be modeled by a spline curve, for example, and as far as I
know there are already industrial solvers for this type of problem.
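[To make the spline-path suggestion concrete, here is a small self-contained sketch, my illustration rather than anything from the thread, of modelling a tool path as a Catmull-Rom spline so the curve passes through every control point; the control points and sampling density are arbitrary toy values.]

```python
import numpy as np

def catmull_rom(points, samples_per_seg=20):
    """Evaluate a Catmull-Rom spline through 2D control points.
    Endpoints are duplicated so the curve interpolates all points."""
    P = np.vstack([points[0], points, points[-1]]).astype(float)
    out = []
    for i in range(len(P) - 3):
        p0, p1, p2, p3 = P[i], P[i + 1], P[i + 2], P[i + 3]
        for s in np.linspace(0.0, 1.0, samples_per_seg, endpoint=False):
            # Standard Catmull-Rom basis in power form
            out.append(0.5 * (2 * p1
                              + (-p0 + p2) * s
                              + (2 * p0 - 5 * p1 + 4 * p2 - p3) * s**2
                              + (-p0 + 3 * p1 - 3 * p2 + p3) * s**3))
    out.append(P[-2])                      # include the final control point
    return np.array(out)

# A zig-zag coverage path over a work piece (toy control points)
ctrl = np.array([[0, 0], [1, 0.5], [2, 0], [3, 0.5], [4, 0]])
path = catmull_rom(ctrl)                   # densely sampled smooth path
```

[In an optimal control formulation, the control points themselves could serve as the decision variables, with coverage and heating constraints evaluated along the sampled path.]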