Ok I know there are some math gurus on ET ....

I have two time series, Series A and Series B. Series B is somewhat correlated to Series A, and I want to find the signal in Series B less any influence from Series A. I have already normalized each series: I detrended (I don't care about the difference in overall trend) and divided by the standard deviation.

(Series B) - (Series A * bMult) = Series C

I figure if A and B are highly correlated, bMult should be nearly 1, and if they are not very correlated, bMult should be nearly 0. Maybe I am looking for the bMult that minimizes the std. dev. of Series C.
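
To make it concrete, here is a rough numpy sketch of what I mean. The synthetic series, the 0.7 coupling, and names like b_mult are just made up for illustration:

```python
import numpy as np

def normalize(x):
    """Detrend (remove a linear fit) and scale to unit standard deviation."""
    t = np.arange(len(x))
    slope, intercept = np.polyfit(t, x, 1)
    detrended = x - (slope * t + intercept)
    return detrended / detrended.std()

rng = np.random.default_rng(0)
n = 1000
a = np.cumsum(rng.normal(size=n))             # Series A: a random walk
b = 0.7 * a + np.cumsum(rng.normal(size=n))   # Series B: partly driven by A

A = normalize(a)
B = normalize(b)

# The bMult that minimizes std(C) for C = B - bMult * A is
# Cov(A, B) / Var(A); since both series are already at unit variance,
# this comes out equal to the correlation coefficient.
b_mult = np.dot(A, B) / np.dot(A, A)

C = B - b_mult * A   # Series C: B with A's influence stripped out

print("bMult     :", b_mult)
print("corr(A,B) :", np.corrcoef(A, B)[0, 1])
print("std(C)    :", C.std())
```

If I have this right, minimizing the std. dev. of C is just a least-squares regression of B on A, and with both series normalized to unit variance the minimizing bMult equals the correlation coefficient, which would line up with my hunch above. Does that sound right, or is there a better way to do this?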