Do Neural Networks overfit less than Decision Trees?

Discussion in 'Strategy Building' started by damien.dualsigma, Oct 28, 2017.

  1. Sorry, I added text inside the quote. Damn iPhones.
     
    #21     Jan 8, 2018
  2. userque

    Ok, yeah...I forgot that happens sometimes on this site. You posted:

    "Thank you.
    One more “silly” question at this point:
    - let’s say you have a model with 10 parameters and 1000 timesteps. Let’s say it gives you an R squared of 0.3 in sample, 0.2 out of sample.
    How would you feel about it?
    Would you run money on it?

    Thanks all!"


    You're welcome.

    "How would you feel about it?"

    OBJECTION! Overly broad. :)

    I would have to know more about the model. But I will also say that out-of-sample results can be overfit as well if you or an algorithm tries long enough to find 'a good result' for both in-sample and out-of-sample. In such a case, the out-of-sample data essentially becomes part of the in-sample data.
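
    Here's a minimal sketch of that effect (made-up data; numpy/scikit-learn and all the numbers are just placeholders): the target is pure noise, but if you keep trying feature sets and keep the one that happens to score best out of sample, the "best" out-of-sample R-squared comes out positive anyway.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    n_train, n_test, n_features = 1000, 200, 50
    X = rng.normal(size=(n_train + n_test, n_features))
    y = rng.normal(size=n_train + n_test)        # target is pure noise

    X_tr, X_te = X[:n_train], X[n_train:]
    y_tr, y_te = y[:n_train], y[n_train:]

    best_oos_r2 = -np.inf
    for _ in range(500):                         # 500 "strategy ideas"
        cols = rng.choice(n_features, size=10, replace=False)
        model = LinearRegression().fit(X_tr[:, cols], y_tr)
        r2 = model.score(X_te[:, cols], y_te)    # peeking at the test set
        best_oos_r2 = max(best_oos_r2, r2)

    print(f"best out-of-sample R^2 on pure noise: {best_oos_r2:.3f}")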

    Accordingly, I would not run money on it.
     
    #22     Jan 8, 2018
  3. Ok, but what kind of thing would you need to know? I agree that the out-of-sample data easily becomes in-sample, but what if I did not optimize again after trying the out-of-sample set? Something like 1000 points in sample, 200 points out of sample?
    Perhaps, to improve on that, I would randomize the selection of points across the whole history, but then you lose some of the ability to run time-series models.
     
    #23     Jan 8, 2018
  4. userque

    If I were the one building the model, maybe nothing. (If this helps: the only algorithm I 'believe in' for forecasting financial markets is a customized kNN.)

    As an outside observer, I'd want to see it perform going forward for some time.

    That doesn't seem like it would be a sufficient number of out-of-sample points...unless you were using a 'sliding window' over additional points.

    I wouldn't randomize. You could try a variation of cross-validation, or a sliding window.
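
    Roughly, a sliding-window (walk-forward) evaluation looks like this. It's a sketch only; the function name, the Ridge model, the window sizes and the data are placeholders, not a recommendation: refit on a rolling in-sample window and score each subsequent out-of-sample chunk, rather than shuffling points at random.

    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.metrics import r2_score

    def walk_forward_r2(X, y, train_size=1000, test_size=200, step=200):
        """Refit on a sliding in-sample window, score each out-of-sample chunk."""
        scores = []
        start = 0
        while start + train_size + test_size <= len(y):
            tr = slice(start, start + train_size)
            te = slice(start + train_size, start + train_size + test_size)
            model = Ridge().fit(X[tr], y[tr])
            scores.append(r2_score(y[te], model.predict(X[te])))
            start += step
        return scores

    # usage with made-up data: 5000 timesteps, 10 features, no real signal
    rng = np.random.default_rng(1)
    X = rng.normal(size=(5000, 10))
    y = rng.normal(size=5000)
    print(walk_forward_r2(X, y))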
     
    #24     Jan 8, 2018
  5. gon

    Why only Knn?
     
    #25     Jan 8, 2018
  6. userque

    kNN can be used to find patterns.
    NNs, etc., seek to generate a 'formula' to model the data.

    I have come to believe that financial data can't consistently be reduced to a formula, but patterns do exist, even without us knowing exactly why or how.

    Also, I've tried both and have conducted experiments etc.
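
    For what it's worth, the bare-bones version of that pattern idea looks something like this (the window length, k, the function name and the data are all made up for illustration; a real setup needs far more care): forecast the next return as the average of whatever followed the k most similar past windows.

    import numpy as np

    def knn_forecast(returns, window=10, k=5):
        """Predict the return after the latest window via nearest-neighbour lookup."""
        returns = np.asarray(returns, dtype=float)
        query = returns[-window:]
        # all historical windows, and the return that followed each of them
        hist = np.array([returns[i:i + window]
                         for i in range(len(returns) - window - 1)])
        nxt = returns[window:len(returns) - 1]
        dists = np.linalg.norm(hist - query, axis=1)
        nearest = np.argsort(dists)[:k]
        return nxt[nearest].mean()

    # made-up return series, just to show the call
    rng = np.random.default_rng(2)
    fake_returns = rng.normal(scale=0.01, size=500)
    print(knn_forecast(fake_returns))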
     
    #26     Jan 8, 2018
  7. Upcoming volatility is always uncertain. Doesn't mean shit whether it's a neural network or a DT, or a cabbage field!
     
    #27     Jan 8, 2018
  8. gon

    And what about this?
    https://arxiv.org/ftp/arxiv/papers/1111/1111.5892.pdf
     
    #28     Jan 8, 2018
  9. userque

    #29     Jan 8, 2018
  10. gon

    Instead of assigning a window of price-related features, the paper shows how to create a set of planes, as if they were a mesh. This way the model is also able to learn the visual side of trading: not only a sequence of numbers, but also their position and distribution across a 2-dimensional grid.

    The idea is to provide the model with the same visual tools that traders use to find patterns and geometric factors when making decisions.
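
    Not the paper's exact construction, just a rough sketch of the general idea (the function name and sizes are arbitrary): rasterize a window of prices onto a 2-D grid, so the model sees positions on a "chart" rather than a flat sequence of numbers.

    import numpy as np

    def price_window_to_grid(prices, height=20):
        """Map each bar's price to a row; result is a height x len(prices) grid."""
        prices = np.asarray(prices, dtype=float)
        lo, hi = prices.min(), prices.max()
        if hi == lo:
            rows = np.zeros(len(prices), dtype=int)
        else:
            rows = np.round((prices - lo) / (hi - lo) * (height - 1)).astype(int)
        grid = np.zeros((height, len(prices)), dtype=np.uint8)
        grid[height - 1 - rows, np.arange(len(prices))] = 1  # flip so up is up
        return grid

    # made-up price window: a short random walk
    rng = np.random.default_rng(3)
    window = 100 + np.cumsum(rng.normal(scale=0.5, size=30))
    print(price_window_to_grid(window))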

    I have not (yet) tried that approach, but I find it interesting.
     
    #30     Jan 8, 2018