Neural Networks Revisited

Discussion in 'Automated Trading' started by maninmoon, May 24, 2016.

  1. 1 neuron will not be able to generalize on anything complex. You're just creating a non-linear scalar function that will look like W_h_o * sigmoid(W_i_h * x + b_i_h) + b_h_o and takes only one scalar x as input. Let's say the input price is 100. What will the delta be in 5 days? We don't know anything else, so it could be e.g. +5 or -5. There's no need for an NN in this case; you can just calculate the mean and std, or build an ARIMA model - it will be way better anyway ;)
    For your deltas going up and down around 0, it would predict something close to the mean of the input deltas, I suppose.
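    A minimal sketch of that function (Python/numpy assumed; the weight values below are made up purely for illustration, not fitted to anything):

        import numpy as np

        def sigmoid(x):
            return 1.0 / (1.0 + np.exp(-x))

        def one_neuron_net(x, W_i_h, b_i_h, W_h_o, b_h_o):
            # one hidden neuron: the "matrices" collapse to plain scalars
            h = sigmoid(W_i_h * x + b_i_h)
            return W_h_o * h + b_h_o

        # whatever training finds, the output is one shifted/stretched sigmoid of x
        print(one_neuron_net(100.0, W_i_h=0.01, b_i_h=-1.0, W_h_o=10.0, b_h_o=-5.0))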
     
    #11     May 24, 2016
  2. In other words, you are trying to fit one sigmoid (or tanh, or something) to all of your input data. That makes sense only if your target deltas behave like a sigmoid function of your input prices.
     
    #12     May 24, 2016
  3. IAS_LLC

    How about a plot of Input Data vs Desired Output and NN Output? A single picture often tells a thousand words. With such a simple system, we could probably tell all sorts of things.

    And 1 neuron in your hidden layer isn't going to be able to fit anything other than the activation function itself... i.e. y = b*tanh(a*w*x). You need more hidden neurons. The number will depend on the data set. Too many and you risk overtraining... too few and you have the problem you're probably having now.
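    That plot could be sketched like this (Python/matplotlib assumed; the data and tanh parameters below are placeholders, not the OP's actual series):

        import numpy as np
        import matplotlib.pyplot as plt

        x = np.linspace(-3.0, 3.0, 200)     # stand-in for the input data
        desired = x                         # stand-in for the desired output
        a, w, b = 1.0, 1.0, 1.0             # y = b*tanh(a*w*x) as above
        nn_out = b * np.tanh(a * w * x)

        plt.scatter(x, desired, s=5, label="desired output")
        plt.plot(x, nn_out, "r", label="1-neuron NN output")
        plt.xlabel("input")
        plt.ylabel("output")
        plt.legend()
        plt.show()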
     
    #13     May 24, 2016
    userque likes this.
  4. userque

    I got the impression they're testing and experimenting - trying to understand by starting with something simple and basic.
     
    #14     May 24, 2016
  5. Activation is bipolarSigmoid. I am using 3500 days' worth of daily FX data. And I am not sure what you mean by "what type of neuron".
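    For reference, the usual textbook definition of a bipolar sigmoid (a Python sketch; the library's exact form may differ):

        import numpy as np

        def bipolar_sigmoid(x):
            # maps into (-1, 1); algebraically identical to tanh(x / 2)
            return 2.0 / (1.0 + np.exp(-x)) - 1.0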
     
    #15     May 24, 2016
  6. As you can see - the correlation (not surprisingly) is 1...
     
    #16     May 24, 2016
  7. If you have a linear activation on the output layer and a linear dependency of output on input (on your chart it's just y = x), then even a 1-neuron NN should pick it up.
    If the output activation is a sigmoid, there will be errors.
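    A quick numeric check of the linear case (a Python/numpy sketch; the y = x data is made up to match the chart, not the OP's series):

        import numpy as np

        x = np.linspace(-5.0, 5.0, 100)
        y = x                                 # the linear dependency on the chart

        # a single linear neuron y_hat = w*x + b, solved in closed form
        A = np.column_stack([x, np.ones_like(x)])
        (w, b), *_ = np.linalg.lstsq(A, y, rcond=None)
        print(w, b)                           # ~1.0, ~0.0: an exact fit
        print(np.max(np.abs(w * x + b - y)))  # residual ~0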
     
    #17     May 24, 2016
  8. Surgo

    Well there's your problem. You'd need a linear activation function like the identity to learn, well, the identity function, with a single neuron. Matrix multiplication is a linear transformation. What linear transformation do you think you could apply to the output of the sigmoid function to turn it into a linear function (the identity)? There isn't one.

    Try doing the math by hand rather than with your NN library, and you'll pretty quickly prove this to yourself.
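    A numeric stand-in for that hand calculation (a Python/numpy sketch): solve for the best a and c in a*sigmoid(x) + c against y = x, and the error never reaches zero:

        import numpy as np

        x = np.linspace(-5.0, 5.0, 200)
        s = 1.0 / (1.0 + np.exp(-x))         # sigmoid output

        # best affine map a*s + c onto the identity, via least squares
        A = np.column_stack([s, np.ones_like(s)])
        coef, *_ = np.linalg.lstsq(A, x, rcond=None)
        print(coef)                          # best a, c
        print(np.max(np.abs(A @ coef - x)))  # clearly nonzero: the sigmoid is
                                             # not an affine function of x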
     
    #18     May 24, 2016
  9. conduit

    Lots of confusion and misunderstanding in the deep learning space. I highly recommend that all beginners study and fully understand the following before heading to more complex models:

    (Exercise on classifying handwritten digits)
    http://neuralnetworksanddeeplearning.com/chap1.html

    I recommend starting with Chapter 1 and then reading all the other chapters as well; I found this author describes the underlying basics to absolute beginners better than anyone else in this space. A lot of guys wanna sound educated and scholarly but lack many fundamentals.
     
    #19     May 25, 2016
    Dmitry Lukovkin and userque like this.
  10. Humpy

    Never yet known a consistently profitable NN. But given time? Lots of time, imho.
     
    #20     May 25, 2016