Interpreting regression results

Discussion in 'Strategy Development' started by logic_man, Mar 31, 2012.

  1. I ran a regression using the trading outcomes for all of my trades as the dependent variable and the model parameter values as the independent variables because I was curious which parameters have the largest impact on outcomes. The intercept comes out as statistically significant (and positive) and one of the model parameters had statistical significance as well, but the rest of the model comes out as insignificant at the 95% confidence level. Also, the adjusted R-squared was very low, about 3%.

    For anyone else who has done a similar analysis of their models, did you find similar results? Is an analysis of this sort even appropriate, given that it assumes a linear relationship between the outcome and the model variables when the actual relationship is not linear? It seems counterintuitive that the model parameters don't show a strong relationship to the outcomes, yet when I sort the trades by some of the model parameters there is clearly a difference in the average outcome. I realize you do statistical analysis to avoid subjective opinion, but it struck me as odd that the regression came out the way it did.
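A minimal sketch of the kind of regression described above, using plain numpy on made-up data (the trade counts, parameter values, and effect sizes are all illustrative, not from any real system). It fits outcomes against parameters with an intercept and computes R-squared and adjusted R-squared by hand, showing how a weak linear signal buried in noisy P&L naturally produces a tiny R-squared even when a coefficient is "significant":

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 500 trades, 3 model parameters recorded per trade.
n, k = 500, 3
X = rng.normal(size=(n, k))
# Outcome depends weakly on the first parameter, plus lots of noise --
# typical of per-trade P&L data.
y = 0.3 * X[:, 0] + rng.normal(scale=2.0, size=n)

# Add an intercept column and fit ordinary least squares.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

resid = y - A @ beta
ss_res = resid @ resid
ss_tot = ((y - y.mean()) ** 2).sum()
r2 = 1.0 - ss_res / ss_tot
adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)

print("coefficients (intercept first):", beta)
print("R^2:", round(r2, 4), "adjusted R^2:", round(adj_r2, 4))
```

With this setup the true parameter effect explains only a couple of percent of the outcome variance, so a low adjusted R-squared is exactly what the math predicts, not necessarily a sign the analysis is broken.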
  3. ssrrkk


    I don't think it makes sense to do what you did. As you pointed out yourself, a regression assumes a linear relationship like:

    PL = a0 + a1*param1 + a2*param2 + ...

    However, as we all know, changing the parameters of your system can turn a profitable system into a completely unprofitable one. The response tends to be very steep, and there is often only a finite range of parameter values that works, while values outside that range don't work at all. In other words, the outcome isn't even monotonic with respect to the parameters, and the relationship is likely highly nonlinear. It's no wonder your R-squared basically says "no linear relationship". This is just my opinion without thinking too hard about it...
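The non-monotonic case above is easy to demonstrate with a toy example (the parameter range, profit band, and numbers are invented for illustration). Outcomes are positive only inside a narrow parameter band, so a straight-line fit sees essentially nothing, while simply bucketing trades by the parameter value reveals the structure immediately -- which matches the OP's observation that sorting by a parameter shows a clear difference in average outcome:

```python
import numpy as np

rng = np.random.default_rng(1)
param = rng.uniform(0, 10, size=2000)
# Non-monotonic relationship: profitable only when the parameter
# sits in the band (4, 6); losing elsewhere.
in_band = (param > 4) & (param < 6)
pl = np.where(in_band, 1.0, -0.5) + rng.normal(scale=0.1, size=2000)

# Linear fit: slope near zero, tiny R^2, because the band is
# symmetric in the sampled range.
A = np.column_stack([np.ones_like(param), param])
beta, *_ = np.linalg.lstsq(A, pl, rcond=None)
resid = pl - A @ beta
r2 = 1.0 - (resid @ resid) / ((pl - pl.mean()) ** 2).sum()

# Bucketing by parameter value exposes the structure the line misses.
inside = pl[in_band].mean()
outside = pl[~in_band].mean()
print(f"slope={beta[1]:.3f}  R^2={r2:.3f}  "
      f"mean P&L inside band={inside:.2f}  outside={outside:.2f}")
```

The regression isn't "wrong" here; it's answering a different question (is there a straight-line trend?) than the one the trader cares about (where does the system work?).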
  4. Thanks for the response. I am inclined to agree with you that the analysis itself is invalid for this type of situation. Just because you can run a regression model on some data doesn't mean you should. :)
  5. You can do statistics on the statistics. Vary the parameters (or the logic, inclusion/exclusion, etc.) combinatorially: fix a minimum step size and a range for each parameter, which caps the total number of combinations, then output the results for each parameter into 3D graphs. You'll probably get mostly noise, but sometimes the results are interesting. It also depends on the structure: some elements strongly support others only at precise values and in a precise sequence. That way, the computer can probably tell you that you yourself cannot untangle the complex relationships, and that all you've really done is produce some pretty graphs; hopefully you had fun, but that's it.
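The combinatorial sweep described above can be sketched in a few lines. The backtest stub, parameter names (stop, target), ranges, and steps here are all hypothetical placeholders; in real use you would substitute an actual backtest run and your own parameter grid:

```python
import itertools
import numpy as np

# Hypothetical backtest stub: returns total P&L for one parameter
# combination. A made-up response surface that only "works" near
# stop=2, target=6 -- stands in for a real backtest call.
def backtest(stop, target):
    return 100.0 * np.exp(-((stop - 2) ** 2 + (target - 6) ** 2)) - 10.0

# Sweep every combination over each parameter's range with a fixed
# minimum step (delta), which caps the number of combinations.
stops = np.arange(0.5, 4.01, 0.5)     # 8 values
targets = np.arange(2.0, 10.01, 1.0)  # 9 values
results = {(s, t): backtest(s, t)
           for s, t in itertools.product(stops, targets)}

best = max(results, key=results.get)
print("combinations tested:", len(results))
print("best (stop, target):", best, "-> P&L", round(results[best], 2))
```

The `results` dict is exactly the grid you would feed into a 3D surface plot; whether the peak is signal or noise is, as the post says, another question entirely.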
  6. I'm good with regression analysis. PM me and I can help you run some robust tests.