Finding a point in the data set after which the slope of the line is approximately zero.

Hi! I have a real-time data set and I need to find the point after which the slope decreases at a constant rate (i.e. the data is at a very small angle to the X axis). Can anyone help me find that point? Please see the attached figure and the wave data set.
Maybe it is a bit late for an answer, but have you looked into taking the derivative (Menu: Analysis => Differentiate)? This should give you a nonzero but rather constant value for your decrease. You could then apply various tools (peak searching, thresholds etc.) to find the desired point in the data. If you post some example data and an explanation of what exactly you want to extract, we may be able to help further.
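On the command line the same idea might look roughly like this; the wave name signal, the smoothing width 25 and the slope threshold 0.01 are only placeholders you would have to adapt to your own data:

Differentiate signal /D=signal_DIF	// numerical derivative of the data
Smooth 25, signal_DIF			// binomial smoothing to suppress the noise
FindLevel/Q signal_DIF, 0.01		// sets V_LevelX where the smoothed slope crosses 0.01
Print V_LevelX				// NaN if the level was never crossed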
Another idea, in addition to http://www.igorexchange.com/node/7532#comment-14312:
You could do a 'sliding' fit to a constant function (f(x) = y0; like boxcar averaging) in a small range of your data ([x0, x0 + window]) and find the minimum of the fit error as a function of x0.
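A minimal sketch of what I mean (the function name and the window length in points are arbitrary; note that the least-squares fit of a constant in a window is just the window mean, so the fit error is simply the standard deviation that WaveStats reports):

Function FindFlatStart(w, winPts)	// returns the point index where the flattest window begins
	Wave w
	Variable winPts

	Variable i, bestPt = 0, bestErr = Inf
	for(i = 0; i + winPts <= numpnts(w); i += 1)
		WaveStats/Q/R=[i, i + winPts - 1] w	// V_sdev = scatter around the window mean
		if(V_sdev < bestErr)
			bestErr = V_sdev
			bestPt = i
		endif
	endfor
	return bestPt
End

Calling e.g. FindFlatStart(yourWave, 50) would then give the start of the flattest 50-point window; whether that window is really "flat enough" is still your call.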

I'd be careful with derivatives in your case since there is significant noise in your data.
Furthermore, you ask for a 'point' where it starts. Since there is noise, how do you define that point? More precisely: how do you distinguish between a data point that lies in that 'flat area' only because of noise and one that really belongs there? Think about your X error bars.

The previously mentioned intersection of functions really is ONE point (barring pathological cases), but you may need to assume a model to justify that definition.
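To make such a model explicit you could, for example, fit one line to a clearly sloped range and another to a clearly flat range and take the crossing of the two fits as that one point. A rough sketch (the point ranges are placeholders you have to choose yourself):

Function LineCrossingX(w, pSlope0, pSlope1, pFlat0, pFlat1)
	Wave w
	Variable pSlope0, pSlope1, pFlat0, pFlat1

	CurveFit/Q line, w[pSlope0, pSlope1]
	Wave W_coef				// {intercept, slope} of the sloped segment
	Variable a1 = W_coef[0], b1 = W_coef[1]
	CurveFit/Q line, w[pFlat0, pFlat1]	// W_coef is overwritten by the second fit
	Variable a2 = W_coef[0], b2 = W_coef[1]
	return (a2 - a1) / (b1 - b2)		// x where a1 + b1*x equals a2 + b2*x
End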

HJ