Question for Neural Net Users

Discussion in 'Automated Trading' started by clearinghouse, Nov 6, 2011.

  1. I was reading frostengine's thread and I had a few questions for users of neural nets. I currently do not use neural nets, but I find them interesting.

    When you guys feed inputs from a price time series into your neural nets, do you "bin" events or inputs by a given time span? If I observe event X and event Y in continuous time, the probability of X occurring at exactly the same time as Y is effectively zero, since if you cut time into infinitesimally small slices it is basically impossible for two things to happen at the exact same instant. So does the event "X happened AND Y happened" become subject to the constraint "X happened and Y happened within timespan T"?

    And then I guess the second question is: do you feed the time difference between events back in as an input, to capture time dependencies in the series?

    Moreover, does the timespan selection then become another variable subject to optimization, or is it a fixed choice that most traders just go with based on the time frame they are targeting?
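    To make the binning question concrete, here is a minimal sketch (my own illustration, not anyone's production code) of one way to do it with pandas: X and Y are treated as co-occurring if they fall into the same bucket of width T, and the time since the previous event is kept as an extra input. The event names, timestamps, and the choice of T are all made up for the example.

    ```python
    import pandas as pd

    # Hypothetical event stream: timestamps tagged with event type (purely illustrative).
    events = pd.DataFrame({
        "time": pd.to_datetime(["2011-11-06 09:30:01.250",
                                "2011-11-06 09:30:01.900",
                                "2011-11-06 09:30:07.100"]),
        "event": ["X", "Y", "X"],
    })

    T = "5s"  # the binning timespan; could itself be a parameter you optimize

    # Bin events into fixed-width buckets: "X happened AND Y happened" becomes
    # "X and Y happened within the same bucket of width T".
    events["bucket"] = events["time"].dt.floor(T)
    counts = events.pivot_table(index="bucket", columns="event",
                                aggfunc="size", fill_value=0)
    counts["x_and_y"] = (counts.get("X", 0) > 0) & (counts.get("Y", 0) > 0)

    # Optional extra input: time since the previous event, to capture time dependencies.
    events["secs_since_prev"] = events["time"].diff().dt.total_seconds()

    print(counts)
    print(events)
    ```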
     
  2. I apologize that I cannot answer your question, but I have one for you: what kind of output do you get from a neural net? Do you get a trading system, a relationship, a correlation? What exactly is it that you get?
     
  3. It's just another way to do classification, but NNs can be somewhat unwieldy, which is why I've generally avoided them and opted for simpler techniques instead.
     
  4. Like many machine learners, a neural net can do classification and/or regression, depending on the application. Regarding inputs, it is part art and part science. A common approach is to use embedded time-delayed variables as inputs for a univariate prediction. Here is an example...

    http://intelligenttradingtech.blogspot.com/2010/01/systems.html
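    As a rough illustration of the embedded time-delay idea (my own sketch, not the code from that post): the inputs are the last n_lags returns and the target is the next return, with scikit-learn's MLPRegressor standing in for whatever network you actually use. The series, lag count, and network size are all arbitrary.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Toy univariate price series (purely illustrative).
    rng = np.random.default_rng(0)
    prices = 100.0 + np.cumsum(rng.normal(scale=0.5, size=500))
    returns = np.diff(np.log(prices))

    # Embed time-delayed copies of the series: inputs are the last n_lags returns,
    # target is the next return (univariate one-step-ahead prediction).
    n_lags = 5
    X = np.column_stack([returns[i:len(returns) - n_lags + i] for i in range(n_lags)])
    y = returns[n_lags:]

    # Walk-forward split: fit on the first 80% of the history, evaluate on the rest.
    split = int(0.8 * len(y))
    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    model.fit(X[:split], y[:split])
    print("out-of-sample R^2:", model.score(X[split:], y[split:]))
    ```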
     
  5. Can you recommend some good books on NN architecture? I've read the basic books and they all use similar, basic networks and don't seem to give much intuition on model building and architecture.

    The Amazon reviews are hard to extract value from, because one man's "not enough math" is another man's "too much math" and the complaints sort of cloud whether a book is good or not. I have no fear of math, but I'm specifically looking for a book that treats model construction and selection in depth, but doesn't get lost in cutting edge research or really esoteric topics.
     
    There are many broader DSP or ML texts that cover NN architecture. For example, I think this one is actually free online, and its NN chapter is a fantastic introduction to the architecture. You can build a simple network from the ground up using his text, in any basic language, and then add inputs, layers, etc. to suit your needs (a minimal from-scratch sketch is at the end of this post). It is one of the best practical explanations I've seen.

    http://www.amazon.com/Scientist-Eng...1_fkmr1_3?ie=UTF8&qid=1320780736&sr=8-3-fkmr1

    Here is the NN chapter (ch. 26), free from the author:
    http://www.dspguide.com/pdfbook.htm

    ---------------
    Here is a great practical one, geared toward forecasting/econometrics/DSP applications:
    http://www.amazon.com/Neural-Novel-...0419/ref=sr_1_1?ie=UTF8&qid=1320780386&sr=8-1

    If you buy it used, make sure it comes with the CD. It is a very good example-based text (slightly technical, but not overly so), and you can run the examples in the included DOS programs. It does not get into architecture itself much at all, but it goes into detail about applications, what to feed the network, etc.

    Lastly, there are texts that are specific to particular programming languages. PM me if you have a particular one in mind.
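    As promised above, here is a minimal from-scratch sketch in the spirit of that chapter: one hidden layer, plain numpy, trained by backpropagation on a toy XOR problem. Everything here (layer size, learning rate, the toy data) is illustrative and meant only to show the mechanics you would then extend with more inputs and layers.

    ```python
    import numpy as np

    # A minimal one-hidden-layer network trained with plain gradient descent,
    # built "from the ground up" with numpy only, no framework.
    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Toy problem: learn XOR, just to show the forward and backward passes.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    n_hidden = 4
    W1 = rng.normal(scale=0.5, size=(2, n_hidden))   # input -> hidden weights
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1))   # hidden -> output weights
    b2 = np.zeros(1)
    lr = 1.0

    for epoch in range(5000):
        # Forward pass
        h = sigmoid(X @ W1 + b1)          # hidden activations
        out = sigmoid(h @ W2 + b2)        # network output

        # Backward pass (squared-error loss, chain rule)
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)

        # Gradient-descent updates
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0)

    print(np.round(out, 3))   # ideally close to [0, 1, 1, 0]
    ```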
     
  7. ssrrkk

    I worked on a statistical modeling / classification project once in my day job, and I tried out clustering, Bayesian methods, SVMs, logistic regression, neural nets, and SOMs. After two years on the project, giving a few talks at conferences, and writing a patent on the particular application, my conclusion was that these methods all gave very similar performance, and they all depended heavily on the quality of the training data, i.e., whether there really was enough signal within the noise.

    Of course, all the usual caveats about overfitting apply, so you need some kind of mutual-information-based criterion to stop training and a cross-validation scheme to evaluate performance.

    Another very important thing is pre-conditioning or transforming the input data (descriptors), which is the topic you ask about. It is extremely important but really depends on the problem and requires a lot of experimentation; e.g., sometimes you need to log-transform the data or normalize the descriptors. You need to look at the distributions of the descriptors to make that judgment call.
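    A small sketch of the pre-conditioning and cross-validation points (my own illustration, not ssrrkk's code): log-transform skewed descriptors, standardize them, and score the net with cross-validation. scikit-learn's early_stopping flag stands in here for the mutual-information stopping criterion mentioned above; the data and network size are made up.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Hypothetical descriptor matrix X and binary labels y (illustrative only).
    rng = np.random.default_rng(1)
    X = np.exp(rng.normal(size=(300, 4)))   # heavily right-skewed descriptors
    y = (X[:, 0] * X[:, 1] > np.median(X[:, 0] * X[:, 1])).astype(int)

    # Log-transform the skewed descriptors, then standardize, then classify.
    X = np.log(X)
    clf = make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(16,),
                                      early_stopping=True,  # hold out part of the training data to stop early
                                      max_iter=1000, random_state=0))

    # Cross-validation gives an out-of-sample estimate rather than training accuracy.
    scores = cross_val_score(clf, X, y, cv=5)
    print("CV accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))
    ```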

    By the way I found this book to be quite helpful:

    http://www.amazon.com/Elements-Stat...ction-Statistics/dp/0387848576/ref=pd_sim_b_3
     
  8. tim888

    Forget about NNs. Traders need to learn how to read price action or go broke. NNs are the fast way to ruin. GIGO.
     
  9. So this is exactly the issue: reading price action.

    Say you read the price action and find what you think is a pattern, and then you want to determine whether your observations on that price action have merit. At that point you need some sort of classifier if you want to automate things and have some degree of confidence in whether you have an edge at all.

    I mean I guess you could just skip the validation step and pray, but it's not a very polished approach to finding edges.
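    One simple way to make that validation step concrete (my own sketch, with made-up pattern features and labels): train a classifier on your pattern observations and compare its cross-validated accuracy against the same classifier trained on shuffled labels. If the two are indistinguishable, the "pattern" probably carries no edge.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Hypothetical data: each row describes one occurrence of the pattern,
    # and y says whether price moved favorably afterwards.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(400, 6))
    y = (X[:, 0] + 0.5 * rng.normal(size=400) > 0).astype(int)

    clf = LogisticRegression(max_iter=1000)

    # Cross-validated accuracy on the real labels...
    real = cross_val_score(clf, X, y, cv=5).mean()

    # ...versus the same thing on shuffled labels (no information by construction).
    y_shuffled = rng.permutation(y)
    baseline = cross_val_score(clf, X, y_shuffled, cv=5).mean()

    print("real labels: %.3f   shuffled labels: %.3f" % (real, baseline))
    ```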
     
  10. MAESTRO

    Here are my 2 cents. Any kind of machine learning can do taxonomy (well, not so well, poorly, etc.), provided that the classes are well defined in advance. However, the key is not taxonomy but meronomy, where the classes can be generated and named. Hypothesis generation is the hardest part of any decision-making model. I am not saying that NNs are not useful; they are, to a certain extent. However, I believe they are very limited in practical applications. I find that they can be used to verify hypotheses, but not to generate them reliably. Finding an "edge" in trading is all about generating ideas, not verifying them (there are many other, more efficient methods to verify ideas). I have yet to find a reliable machine learning method that compares with human creativity when it comes to generating ideas.
     