Neural Networks don't work for trading

Discussion in 'Automated Trading' started by irniger, Apr 19, 2009.

  1. rdg

    rdg

    One exercise I went through that might help someone else was to create a file that had random binary inputs and an output that was some logical combination of the inputs. I ran that file through an NN program and some other statistical analysis programs to see that they could predict the output. I then introduced varying amounts of noise to the output to see how strong the correlation between the inputs and the output needed to be for the predictions to still have value.

    Not long after doing all that I started looking for better inputs.
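
    rdg's exercise can be reproduced in a few lines. The sketch below is my own minimal version (logistic regression stands in for the NN, and the rule y = (x0 AND x1) OR x2 is an arbitrary illustrative choice): it learns the rule from random binary inputs, then repeats the fit with a fraction of the output labels flipped.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, noise, rng):
    """Random binary inputs; target is the fixed rule (x0 AND x1) OR x2,
    with a fraction `noise` of the labels flipped at random."""
    X = rng.integers(0, 2, size=(n, 3)).astype(float)
    y = ((X[:, 0].astype(bool) & X[:, 1].astype(bool)) | X[:, 2].astype(bool)).astype(float)
    flip = rng.random(n) < noise
    return X, np.where(flip, 1.0 - y, y)

def fit_logistic(X, y, lr=0.5, epochs=500):
    """Plain full-batch gradient-descent logistic regression."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        grad = p - y                      # dLoss/dlogit for log loss
        w -= lr * (X.T @ grad) / len(y)
        b -= lr * grad.mean()
    return w, b

def accuracy(w, b, X, y):
    return float((((X @ w + b) > 0).astype(float) == y).mean())

results = {}
for noise in (0.0, 0.3):
    Xtr, ytr = make_data(2000, noise, rng)
    Xte, yte = make_data(2000, noise, rng)
    w, b = fit_logistic(Xtr, ytr)
    results[noise] = accuracy(w, b, Xte, yte)
```

    As the noise level rises, the achievable out-of-sample accuracy drops toward chance, which is the point of the exercise: it tells you how strong the input/output relationship has to be before the predictions carry any value.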
     
    #41     Apr 29, 2009
  2. Hugin

    Hugin

    We use the GA to tune the weights of the NN (our hybrid models may also include other parameterized stuff). The output from the NN is used in the signal generation. The signals are then used to generate trades.

    Somewhat simplified, the GA goal function looks at these trades, sees how their performance evolves over the time after the signal, and generates a fitness measure, typically based on the Sharpe Ratio or Profit Factor.

    One benefit of using this type of model is that you don't require the NN to create a complete classification for all inputs in the training set, since the GA goal function will not punish the NN for not including all good trades in its "good trade" classification, which we risk doing (maybe unknowingly) when a simple error measure is used. In fact, our classifications include only the two classes "good trade" and "don't care".

    One major drawback with our set-up is that training becomes very slow.
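
    A minimal sketch of this kind of hybrid, on synthetic data. The network size, GA parameters, and planted pattern below are my own illustrative choices, not Hugin's actual system; only the overall shape (GA evolves NN weights, fitness is a Sharpe-style measure of the resulting trades) follows the post.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in data (not real market data): four features and
# next-period returns containing a weak planted linear pattern.
n_obs = 500
X = rng.normal(size=(n_obs, 4))
next_ret = 0.1 * X[:, 0] - 0.1 * X[:, 1] + rng.normal(scale=1.0, size=n_obs)

N_IN, N_HID = 4, 5
N_W = N_IN * N_HID + N_HID   # hidden + output weights (biases omitted for brevity)

def signal(w, X):
    """Tiny feedforward net mapping features to a position in [-1, 1]."""
    Wh = w[:N_IN * N_HID].reshape(N_IN, N_HID)
    wo = w[N_IN * N_HID:]
    return np.tanh(np.tanh(X @ Wh) @ wo)

def fitness(w):
    """GA goal function: Sharpe-style ratio of the signal's per-period P&L."""
    pnl = signal(w, X) * next_ret
    return pnl.mean() / (pnl.std() + 1e-9)

pop = rng.normal(scale=0.5, size=(30, N_W))
init_best = max(fitness(w) for w in pop)

for _ in range(40):
    fit = np.array([fitness(w) for w in pop])
    elite = pop[np.argsort(fit)[::-1][:10]]      # keep the best third (elitism)
    # children: uniform crossover of two random elites plus Gaussian mutation
    pa = elite[rng.integers(0, 10, size=20)]
    pb = elite[rng.integers(0, 10, size=20)]
    mask = rng.random((20, N_W)) < 0.5
    children = np.where(mask, pa, pb) + rng.normal(scale=0.1, size=(20, N_W))
    pop = np.vstack([elite, children])

best_fit = max(fitness(w) for w in pop)
```

    Because elitism carries the best individual forward unchanged, the best fitness can only improve from generation to generation. The slowness Hugin mentions is visible even here: every generation re-evaluates the full trade history for every member of the population.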

    /Hugin
     
    #42     Apr 30, 2009
  3. RedRat

    RedRat


    What is a good "cost function" for a NN? I want to try something different than MSE... Thanks in advance.
     
    #43     Apr 30, 2009
  4. Ok. So, essentially you are optimizing the weights via GA, not so much to minimize RMSE faster, but using fitness functions (Sharpe, etc.; I read your other post somewhere) as a type of criterion.

    Thanks,
    dt

     
    #44     Apr 30, 2009
  5. Hugin

    Hugin

    That's correct; the reason for us to do it this way is to get away from the limitations of the standard textbook NN training algorithms. There are probably other ways to do it, e.g. by using other network topologies (e.g. non-feedforward) or training algorithms other than gradient-based ones.

    We have also looked into other types of classification algorithms, like Support Vector/Kernel regression, but so far this works OK for us. Also, the feedforward network integrates well into our models.

    /Hugin
     
    #45     Apr 30, 2009
  6. Hugin

    Hugin

    What I say here comes from my experience working with NNs some 15 years ago, so it may be outdated. Also, our focus then was on feedforward networks trained by the backpropagation algorithm, so my answer is biased by that.

    The main problem we found with changing the goal function is that it may affect the training algorithm, so training might simply stop working.

    Maybe somebody with experience with other types of networks has some input?

    Anyway, this is why we use a GA to optimize network weights. The goal function for a GA can be more flexible, since the only requirement is that it can tell which of two network configurations is the better one. The GA also makes it possible for us to use the same algorithm for optimizing all parameters in the trading model, including the weights of the NN.
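
    The "only requirement" can be made concrete with the simplest possible evolutionary loop, a (1+1)-style scheme that never looks at gradients or error values, only at which of two candidates scores better. This is a toy sketch with a made-up score function, not Hugin's system:

```python
import numpy as np

rng = np.random.default_rng(2)

def better(w_a, w_b, score):
    """Everything the goal function must provide: a verdict on which of two
    parameter vectors is better. `score` can be Sharpe, Profit Factor, or
    anything else that ranks two configurations -- no gradients needed."""
    return score(w_a) >= score(w_b)

def evolve(score, w, steps=200, sigma=0.2):
    """(1+1) scheme: mutate, keep the candidate only if it compares better."""
    for _ in range(steps):
        cand = w + rng.normal(scale=sigma, size=w.shape)
        if better(cand, w, score):
            w = cand
    return w

# Toy score standing in for a trading fitness: closeness to a target vector.
target = np.array([1.0, -2.0, 0.5])
score = lambda w: -np.sum((w - target) ** 2)

w0 = rng.normal(size=3)
w_final = evolve(score, w0)
```

    Since a candidate is only accepted when the comparison favors it, the score can never get worse, no matter how ill-behaved the goal function is for gradient methods.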

    /Hugin
     
    #46     Apr 30, 2009
  7. Because we're now including GAs in this discussion...

    Want to talk about Markov chain analysis?
     
    #47     May 1, 2009
  8. The H1N1 virus can evolve and accumulate strength as it passes back and forth between different animals and people; its adaptive and dynamic nature seems like a genetically engineered algorithm... a role model for trading systems, i.e. changing grid spacing and position sizing. :)
     
    #48     May 6, 2009
  9. RedRat

    RedRat

    Thank you Hugin,
    I am familiar with GAs. But unfortunately my basic NN has 16 input neurons + a hidden layer of, say, 24 neurons, plus one output, so I suppose it will take an enormous time to train with a GA :(.

    How many neurons or connections do you have?

    I am also trying SVM and RVM but do not have results so far.

    Are you familiar with reinforcement learning? Is it better than GA optimization?
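
    For a sense of the search-space size: a fully connected 16-24-1 feedforward net with bias units has (16+1)*24 + (24+1)*1 = 433 weights. A quick check (the helper below is my own, assuming dense layers with one bias per output neuron):

```python
def param_count(layers):
    """Weights + biases for a fully connected feedforward net,
    e.g. layers = [16, 24, 1] for 16 inputs, 24 hidden, 1 output."""
    return sum((fan_in + 1) * fan_out
               for fan_in, fan_out in zip(layers, layers[1:]))

print(param_count([16, 24, 1]))  # 433
```

    A few hundred parameters is a search space GAs are routinely applied to, though each fitness evaluation still costs a full pass over the trade history, which is where the training time actually goes.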

    RR
     
    #49     May 7, 2009
  10. Have any of you who use GAs in connection with NNs
    looked into NEAT (NeuroEvolution of Augmenting Topologies)?
     
    #50     May 7, 2009