Standard error vs standard deviation for regression bands?

Discussion in 'Technical Analysis' started by Aston01, Feb 27, 2012.

  1. Aston01

    What is the reasoning someone would use one versus the other when it comes to determining the outer bands in linear regression?
     
  2. The standard error of the estimate is interpreted in the same way as the standard deviation. The standard deviation tells us how spread out a distribution of scores is with respect to the distribution mean. The standard error of the estimate tells us how spread out scores are with respect to their predicted values.

    So if you have some sort of predictive model, to be technically correct, you want to use the standard error of the estimate.

    If you are just looking at a series of past prices and want a feel for the spread you can use the standard deviation.

    Unless you are pushing big money or doing far out ex ante forecasts it doesn't really make that much difference. It just sounds cooler to use the standard error of the estimate.

    The skinny - If you are worrying about this you are in trouble.

    The real value of the SEE is when you are developing predictive models. The lower the SEE, usually, the better the model.
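
    The distinction above can be sketched in a few lines of Python. The price series and the band width (2 units) here are hypothetical, just for illustration: the SD measures spread around the series mean, while the SEE measures spread around the fitted regression line (with n - 2 degrees of freedom, since two parameters are estimated). On a trending series the SEE comes out smaller, because the trend itself is no longer counted as "spread."

    ```python
    import math

    # Hypothetical price series for illustration
    prices = [100.0, 101.5, 101.0, 102.5, 103.0, 104.5, 104.0, 105.5, 106.0, 107.5]
    n = len(prices)
    x = list(range(n))

    # Least-squares fit: y = a + b*x
    mean_x = sum(x) / n
    mean_y = sum(prices) / n
    b = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, prices))
         / sum((xi - mean_x) ** 2 for xi in x))
    a = mean_y - b * mean_x
    fitted = [a + b * xi for xi in x]

    # Standard deviation: spread of prices around their own mean (n - 1 df)
    sd = math.sqrt(sum((yi - mean_y) ** 2 for yi in prices) / (n - 1))

    # Standard error of the estimate: spread of prices around the
    # regression line (n - 2 df, since slope and intercept were estimated)
    see = math.sqrt(sum((yi - fi) ** 2 for yi, fi in zip(prices, fitted)) / (n - 2))

    # Outer bands at the last bar, two units wide either way
    upper_sd, lower_sd = fitted[-1] + 2 * sd, fitted[-1] - 2 * sd
    upper_see, lower_see = fitted[-1] + 2 * see, fitted[-1] - 2 * see
    ```

    On a series like this, the SD bands end up much wider than the SEE bands, which is exactly the point: if your channel is drawn around a regression line, the SEE is the spread that actually matches it.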
     
  3. Aston01

    Thanks for the info