Minimizing sum of differences

Hello,

I have two datasets, one of measured data and one of modeled data, both corresponding to a time wave. I need to minimize the sum of their absolute differences,
i.e. min Σ_i |λ_i·x(t_i) − y(t_i)|, where x(t) holds the modeled values and y(t) the measured values. I need to find the values of lambda (λ) that minimize this expression. So far all I can do is generate random values for lambda, evaluate the summation, and find which set of lambdas gives the smallest sum, but I know this is not correct.

Function MinimizeSum(modelled, measured)
	Wave modelled, measured
	Variable i, totalSum

	// One random trial value of lambda for each point
	Make/O/N=(numpnts(modelled)) lambda
	lambda = gnoise(1)

	Duplicate/O modelled, diff
	diff = NaN

	// Absolute difference at each point for this trial set of lambdas
	for (i = 0; i < numpnts(modelled); i += 1)
		diff[i] = abs(lambda[i]*modelled[i] - measured[i])
	endfor

	// Total the differences once, after the loop; summing inside the
	// loop hit the NaN points that had not been filled in yet
	totalSum = sum(diff)
	Print totalSum
End


How do I properly calculate the values of lambda that minimize this expression?

Thank you for any help,
Stephanie
You can write your expression as a user function, and then use the Optimize operation to minimize the output of the user function.
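
For example, here is a minimal sketch of such a user function, assuming your data live in waves named root:modelled and root:measured (adjust the paths and names to match your experiment). It uses the multivariate form of an Optimize user function, where xw holds the trial values, here the lambdas, that Optimize adjusts:

Function SumAbsDiff(w, xw)
	Wave/Z w	// constant-parameter wave required by the format; unused here
	Wave xw		// trial lambda values supplied by Optimize

	// Assumed locations of your data waves
	WAVE modelled = root:modelled
	WAVE measured = root:measured

	// Return the quantity to minimize: the sum of absolute differences
	Duplicate/FREE modelled, d
	d = abs(xw[p]*modelled[p] - measured[p])
	return sum(d)
End

You would then launch the search from a starting guess, along these lines:

Make/O/N=(numpnts(modelled)) lambdaStart = 1	// initial guess for the lambdas
Optimize/X=lambdaStart SumAbsDiff
Print V_min		// the minimized sum; W_Extremum holds the corresponding lambdas

One caveat: the absolute value makes the objective non-smooth, which some optimization methods handle poorly; if your application allows it, minimizing the sum of squared differences is better behaved. The help topic below covers the exact function formats and flags.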

To learn more, execute this Igor command:

DisplayHelpTopic "Finding Minima and Maxima of Functions"

John Weeks
WaveMetrics, Inc.
support@wavemetrics.com