Programming Automated Trading

Discussion in 'App Development' started by ZenMusic, Nov 6, 2015.

  1. ZenMusic

    I'd like to share with other programmers the design and results of my algorithms for automated trading programs.

    Currently I have my own program running well. It scans for stocks that are volatile and capable of large intraday moves.

    The stocks are watched for trends and reversals. The program recognizes 4 patterns very well, and I'm adding 5-6 more. I can describe a pattern and then feed historical data into the program, and it tunes itself using that data. It builds multiple tables for each pattern and finds the best match against the stocks it's watching (TrendWatch). It reports a confidence factor of 0 to 200% on the match and momentum.
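
    (A minimal, hypothetical sketch of how a match score like that might be computed; this is illustrative C#, not the actual TrendWatch code. The Pearson-correlation approach and the mapping onto a 0-200 "percent" scale are assumptions.)

```csharp
// Hypothetical sketch only -- not the actual TrendWatch code. One generic way to
// score a price window against a stored pattern table is a Pearson correlation
// (scale- and offset-invariant), mapped from [-1, 1] onto a 0-200 "percent" scale.
using System;
using System.Linq;

static class PatternScore
{
    // pattern and window must have the same number of points (resample first if not).
    public static double ConfidencePercent(double[] pattern, double[] window)
    {
        double mp = pattern.Average(), mw = window.Average();
        double cov = 0, vp = 0, vw = 0;
        for (int i = 0; i < pattern.Length; i++)
        {
            double dp = pattern[i] - mp, dw = window[i] - mw;
            cov += dp * dw; vp += dp * dp; vw += dw * dw;
        }
        double r = cov / Math.Sqrt(vp * vw + 1e-12);   // correlation in [-1, 1]
        return (r + 1.0) * 100.0;                      // map to 0..200 "percent"
    }
}
```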

    I have a window pop up for each of my TrendWatch processes when it is getting close to an action point (such as a buy or an escalation of confidence), and it displays all the parameters and data it has at that moment. The windows stack on top of each other neatly, but I can open and watch each thread graphically. I'll post pictures (multiple screens) if there is interest.

    Thursday was the first day of fully automated trading/testing of my new program. Most orders were stopped out immediately because of very tight stop-loss settings (intentional for now; I'll loosen them as I gain confidence). These were the best trades:
    CSTM BOT 5.67 SOLD 6.90
    ITG BOT 17.78 SOLD 20.26
    RLYP BOT 17.595 ... still holding at price 18.74

    Friday: automated trading update
    RLYP BOT 17.595, held overnight, sold into Friday's opening surge at 20.40;
    it fell and the program rebought it at 18.72.
    Morning:
    - Bought TRIP and sold for a $2 profit per share (74.03 - 76.04)
    [attached chart: upload_2015-11-6_12-20-52.png]


    The uptrend pattern works on finding a >2%-3% trend (it also looks at longer-context high/low prices to decide whether the trend is likely to continue, and considers volume/time); a rough sketch of this rule and the reversal rule follows below.

    Currently reversals are working best (like TRIP). It appears to be the easiest pattern for the program to recognize, and it knows about finding a bottom and waits for a green bar: after it saw a low of 73.57 and then a change from red to green, it triggered a buy.
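
    (A minimal sketch of rules of this shape, just to illustrate the idea: a red-to-green bar right after a fresh session low triggers the reversal buy, and a >2-3% move off the low qualifies an uptrend. The Bar type, thresholds, and confirmation details are my assumptions, not the actual program.)

```csharp
// Illustrative guesses at the two rules described above, not the actual program.
using System;
using System.Collections.Generic;
using System.Linq;

public record Bar(double Open, double High, double Low, double Close, long Volume);

static class SimpleSignals
{
    // True when the latest completed bar turns green right after the session low printed.
    public static bool ReversalBuy(IReadOnlyList<Bar> bars)
    {
        if (bars.Count < 3) return false;
        Bar prev = bars[bars.Count - 2], last = bars[bars.Count - 1];
        double sessionLow = bars.Take(bars.Count - 1).Min(b => b.Low);
        bool madeNewLow = prev.Low <= sessionLow;                          // bottom just printed
        bool redToGreen = prev.Close < prev.Open && last.Close > last.Open;
        return madeNewLow && redToGreen;
    }

    // True when price has moved more than ~2-3% off the session low.
    public static bool TrendQualifies(IReadOnlyList<Bar> bars, double minMovePct = 2.5)
    {
        double low = bars.Min(b => b.Low);
        double lastClose = bars[bars.Count - 1].Close;
        return (lastClose - low) / low * 100.0 >= minMovePct;
    }
}
```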

    The automated programming appears to me to be the simple part, and I overestimated the wins that it would bring simply by having it watch 100 stocks, buy on a trend, and sell on the first downturn after a profit equal to the initial increase.

    The real goal now seems to me to be stock scanning and selection to feed into the evaluation engine: pick the right stocks and the rest is easier. Though that could be due to current market conditions; this may change when I encounter other conditions. The program works best in a dynamic, rapidly changing market, and it can then make better decisions than I can.
     
    Last edited: Nov 6, 2015
    botpro and Baron like this.
  2. Good job. What platform do you use? I know Java but don't know of any Java-based platforms. I use NinjaTrader, so I guess I will start watching videos on C#. I hear it's not that different from Java.
     
  3. ZenMusic

    I started with NinjaTrader but switched to the IB API (C# and C++) several months ago. First I built a TWS assistant (a day-trading assistant) that could buy and sell with single-key entries; it sets stop-losses and profit takers and can scale in/out. I've been using that for about 6 weeks. Then I decided to add the automation, and the first version is finished and running. Now I'm focused on adding market scanning to feed stocks into the engine. It's been great fun. I quit my job (professional programmer, specialized in data analysis and self-tuning database engines) to do day trading. Scary too.
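
    (For anyone unfamiliar with attaching a stop-loss and profit taker to an entry, here is a rough sketch along the lines of IB's published bracket-order sample for the C# TWS API. It is not the assistant described above; the symbol, sizes, and prices are made up, and field/quantity types vary slightly between API versions.)

```csharp
// Rough sketch modeled on IB's public bracket-order sample (IBApi, C#).
// Not the assistant described above; assumes a connected EClientSocket and a valid
// next order id. Field and quantity types differ slightly across TWS API versions.
using IBApi;

static class BracketHelper
{
    public static void PlaceBracket(EClientSocket client, int nextOrderId,
                                    string symbol, int qty,
                                    double entry, double target, double stop)
    {
        var contract = new Contract
        {
            Symbol = symbol, SecType = "STK", Exchange = "SMART", Currency = "USD"
        };

        var parent = new Order
        {
            OrderId = nextOrderId, Action = "BUY", OrderType = "LMT",
            TotalQuantity = qty, LmtPrice = entry, Transmit = false
        };
        var profitTaker = new Order
        {
            OrderId = nextOrderId + 1, Action = "SELL", OrderType = "LMT",
            TotalQuantity = qty, LmtPrice = target,
            ParentId = parent.OrderId, Transmit = false
        };
        var stopLoss = new Order
        {
            OrderId = nextOrderId + 2, Action = "SELL", OrderType = "STP",
            TotalQuantity = qty, AuxPrice = stop,
            ParentId = parent.OrderId, Transmit = true   // last child transmits the group
        };

        client.placeOrder(parent.OrderId, contract, parent);
        client.placeOrder(profitTaker.OrderId, contract, profitTaker);
        client.placeOrder(stopLoss.OrderId, contract, stopLoss);
    }
}
```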

    latest trade
    [attached chart: upload_2015-11-6_13-43-28.png]
     
    Last edited: Nov 6, 2015
    tsgiannis likes this.
  4. vicirek

    How do you select patterns? Is this from trading experience or from published patterns? Is a pattern a set of numbers or a set of conditions? How much memory do you need for your patterns?

    It seems that historical data play a secondary role, used to build a table of possible variations of the pattern. Is this correct? By self-tuning, do you mean something along the lines of neural networks, or something else? How are confidence levels derived? Is that a separate algorithm from the patterns?

    Why are you using both C++ and C#? Are those separate processes that need to communicate?

    From the programming point of view, what is a table? An array, a collection/container? Which data structures are you using, how are you searching through the patterns, and what is the speed of your searches?
     
  5. ZenMusic

    I'm adding patterns that I know and use, the most basic first, such as 1) trend, 2) trend with pullback and continuation or breakout, 3) reversals.
    The patterns are graphs (outlines) stored in tables; each table represents one pattern. The actual values are interpolated so they can be stretched or compressed by time frame (a sketch of that is below). Patterns can be sequenced or inserted into each other. ((Is this clear? The tables store enough data values to redraw the graph; the exact number of points required can be computed by the waveform analysis described below.))
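
    (A minimal sketch of the stretch/compress idea: linear interpolation of a stored value table to a new number of bars. Illustrative only; the real tables presumably hold more than a single value series.)

```csharp
// Hypothetical sketch: resample a pattern stored as a table of values to a different
// number of bars with linear interpolation, so one stored shape can be stretched or
// compressed across time frames.
using System;

static class PatternTable
{
    public static double[] Resample(double[] pattern, int targetLength)
    {
        if (targetLength == 1) return new[] { pattern[0] };
        var output = new double[targetLength];
        for (int i = 0; i < targetLength; i++)
        {
            // Fractional position of the i-th output point inside the original table.
            double pos = (double)i / (targetLength - 1) * (pattern.Length - 1);
            int lo = (int)Math.Floor(pos);
            int hi = Math.Min(lo + 1, pattern.Length - 1);
            double frac = pos - lo;
            output[i] = pattern[lo] * (1 - frac) + pattern[hi] * frac;
        }
        return output;
    }
}
```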

    (I have programmed lots of waveforms previously, for electronic music and sound engineering, so waveforms, graphs, slopes, interpolations, and forward prediction on rate of change are familiar to me.)

    (In a trend) I find that 70% of the time historical data show some larger context for the current trend, and if the price had been much higher in the past, it is likely that the current trend could reach higher levels. Sometimes, though, a stock is breaking into a new range and redefinition, so that history may not apply at all. That's what makes fishing for a new floor dangerous: the stock may be failing and fall to new lows.

    Most of my previous work focused on database engines and data analysis, with a focus on query (DB) tuning. Given market data, the program finds the best pattern match that it knows (currently only 4; soon that will be the 10-11 I've sketched out), then makes a prediction (it usually has several alternative plans). After the execution, the program considers the outcome's success, reviews the alternatives that were rejected, decides whether one of them would have worked out better, and grades itself. In the next iteration it will have higher or lower confidence and will weight its decision accordingly (in the case of a high success, next time confidence will be higher and it may be a little more daring in its buy (maybe earlier) and its sell (maybe it will ride out more bumps rather than selling on the first decline)).
    -- Confidence can also be injected by me (the user) or other processes, based on experience, some knowledge particular to the situation, social networks, a rating upgrade or downgrade, etc.
    There are other rankings in play, such as volatility (these are performance factors, separate from the patterns). ((All of this, generally, I've done many times before in many systems for query optimization and self-tuning.)) A rough sketch of the grading loop is below.
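
    (A hypothetical sketch of that grade-yourself loop. The learning-rate steps, the clamps, and the idea of replaying rejected plans as simulated P&L are my guesses at one way to express it, not the actual implementation.)

```csharp
// Hypothetical sketch of the self-grading idea: after a trade, compare the executed
// plan against the rejected alternatives (simulated over the same window) and nudge
// the pattern's confidence weight up or down. Step sizes and clamps are made up.
using System;
using System.Linq;

class PatternStats
{
    public double Confidence { get; private set; } = 1.0;   // multiplier on the match score

    public void Grade(double executedPnl, double[] alternativePnls)
    {
        double bestAlternative = alternativePnls.Length > 0 ? alternativePnls.Max() : executedPnl;
        double edge = executedPnl - bestAlternative;           // did the chosen plan beat the rejects?
        double step = 0.05 * Math.Sign(edge) + 0.02 * Math.Sign(executedPnl);
        Confidence = Math.Clamp(Confidence + step, 0.2, 2.0);  // more/less daring next time
    }
}
```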

    I have a huge library of C# and C++ code that I have created over the last 10+ years, mostly reusable utility functions (file system, monitor/display management, init-file data, email, messaging, debugging, logging, performance evaluation). Now that I find C# fast enough, I'm moving to all C#, but if some part needs to be in C++ it can be isolated (some math computation, graphing, or pattern matching may fall into that area). Yes, multiple parallel processes are running, with messaging between them and no direct calls; it's something of a producer/consumer architecture.
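
    (A minimal sketch of that kind of producer/consumer messaging in C#. The Quote type, queue size, and names are illustrative; it only shows the "no direct calls" shape, not the actual processes.)

```csharp
// Illustrative producer/consumer plumbing: the scanner publishes quotes onto a bounded
// queue and the evaluation engine consumes them on its own task, with no direct calls
// between the two. Types and capacity are made up for the example.
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

record Quote(string Symbol, double Price, long Volume, DateTime Time);

class QuoteBus
{
    private readonly BlockingCollection<Quote> _queue = new(boundedCapacity: 10_000);

    public void Publish(Quote q) => _queue.Add(q);      // called by the scanner side
    public void Complete() => _queue.CompleteAdding();  // signal shutdown

    // The evaluation engine consumes on its own task and never calls back into the scanner.
    public Task StartConsumer(Action<Quote> handle) => Task.Run(() =>
    {
        foreach (var q in _queue.GetConsumingEnumerable())
            handle(q);
    });
}
```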

    (As an aside: if you know about sound, I extensively used Fourier analysis and it is amazingly similar.
    Fourier analysis is "the process of decomposing a musical instrument sound or any other periodic function into its constituent sine or cosine waves." One of my programs just did pattern matching to determine waveforms using a library of graphs; this was originally on mainframes.)
     
    Last edited: Nov 6, 2015
  6. vicirek

    This is a very interesting approach. At least you have enough experience in applied signal processing.

    I have also noted that even if there is a pattern match, what follows is not exactly what was expected, or the entry/exit points based on it are not always optimal. I see that your answer to this issue is the addition of an adaptive decision process.

    I am not sure how much the Fourier transform would help here. I can imagine that it could simplify/standardize pattern matching based on the frequency spectrum, or, in the case of a windowed Fourier transform, that it would be possible to compare frequency changes and find certain regularities. From what I have read, Fourier analysis did not offer significant advantages in the analysis of time series generated by market action. I would be interested in checking it when I have time, to see if there is anything interesting there. I have some FFT routines handy but never had time to plug them in.

    C# is definitely a very good choice. On the other hand, with the advent of GPU processing I prefer C++ because it offers more flexible access to massive parallelism if required for data-intensive programs.
     
  7. I've done quite a bit of quant experimentation and found the FFT by itself not to be all that useful.
    One thing I recently discovered is that the FFT can be used to *greatly* speed up autocorrelation and cross-correlation algorithms. A standard autocorrelation implementation works but is quite inefficient; I achieved a 7x speed increase using the FFT implementation.
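
    (For anyone who wants to try it, here is a self-contained sketch of the trick: by the Wiener-Khinchin relation, the autocorrelation is the inverse FFT of the power spectrum, with zero-padding to avoid circular wrap-around. Illustrative C# only; a tuned FFT library would be much faster than this plain recursive one, and it is not the poster's code.)

```csharp
// Sketch of FFT-based autocorrelation: autocorr(x) = IFFT( FFT(x_pad) * conj(FFT(x_pad)) ),
// with x zero-padded to at least 2n so the circular result matches the linear one.
using System;
using System.Numerics;

static class FftAutocorr
{
    // Recursive radix-2 Cooley-Tukey FFT; length must be a power of two.
    static Complex[] Fft(Complex[] x)
    {
        int n = x.Length;
        if (n == 1) return new[] { x[0] };
        var even = new Complex[n / 2];
        var odd  = new Complex[n / 2];
        for (int i = 0; i < n / 2; i++) { even[i] = x[2 * i]; odd[i] = x[2 * i + 1]; }
        var e = Fft(even);
        var o = Fft(odd);
        var result = new Complex[n];
        for (int k = 0; k < n / 2; k++)
        {
            var t = Complex.FromPolarCoordinates(1.0, -2.0 * Math.PI * k / n) * o[k];
            result[k]         = e[k] + t;
            result[k + n / 2] = e[k] - t;
        }
        return result;
    }

    // Inverse FFT via the conjugation trick: ifft(x) = conj(fft(conj(x))) / n.
    static Complex[] Ifft(Complex[] x)
    {
        var conj = Array.ConvertAll(x, Complex.Conjugate);
        var y = Fft(conj);
        return Array.ConvertAll(y, c => Complex.Conjugate(c) / x.Length);
    }

    // Normalized autocorrelation of a real series (e.g. returns) for lags 0..n-1.
    public static double[] Autocorrelation(double[] series)
    {
        int n = series.Length;
        double mean = 0; foreach (var v in series) mean += v; mean /= n;

        int size = 1;
        while (size < 2 * n) size <<= 1;                 // pad to >= 2n, power of two
        var buf = new Complex[size];
        for (int i = 0; i < n; i++) buf[i] = series[i] - mean;

        var spectrum = Fft(buf);
        for (int i = 0; i < size; i++)
            spectrum[i] *= Complex.Conjugate(spectrum[i]);   // power spectrum
        var acf = Ifft(spectrum);

        var result = new double[n];
        double lag0 = acf[0].Real;
        for (int k = 0; k < n; k++)
            result[k] = acf[k].Real / lag0;              // normalize so result[0] == 1
        return result;
    }
}
```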

    WRT GPU processing, I agree that the short path to using things like OpenCL is a C++ implementation. Yes, you can make it work in C#, but it involves bridging C# calls to C library functions. It's not hard, but it might be a stretch for the average GUI designer looking to design complex algorithms. And that's just part of using GPUs: you also have to implement algos in OpenCL (a C-like language) that actually run on the GPU hardware.
     
  8. ZenMusic

    FFT would be useful just for the optimization; to be useful for trend analysis, you have to take it to another level. I'm not extracting sine waves and I'm not working with periodic functions: you only have partial waveforms and are extracting non-sine waveform fragments and slopes.
    It may be unnecessary given what I'm finding works best in my system. I entered a large set of patterns, over 20 now, each with strategies defined for it. Now the program is scanning the historical data, and it does very well at matching the patterns. It manages reversals amazingly (the most profitable currently, until I get the others tuned), and it does optimally in quiet markets, skimming profit off gentle trends, and in very volatile movements, runs, and scalping.
     
  9. Pattern matching is a nearly perfect application for GPU analysis.
     
  10. ZenMusic

    The next step for my automated trading was a pleasant surprise. My program recognizes 5 patterns very well. Over the weekend I decided that, instead of adding new patterns myself, I'd have the program search them out.

    I fed in hundreds of days of historical data. The program would look for a pattern, label it, and file it. If it did not recognize the pattern with >85% confidence, it would label it unknown1, etc., and simply add those as possible new types of patterns. (The program allows sequences of patterns and also subtype (component) patterns, so one pattern type (macro level) may consist of micro-level patterns; it expands or contracts a pattern over the data to find subpatterns.)
    The surprise was:
    It found "channels," up and down, and distinguished them from simple up or down trends, so I added that as a type.
    It also found premarket gappers and was already matching against that at nearly 100% success as unknown8; it stored that along with the increased volume it captured.
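
    (A hypothetical sketch of that label-or-file-as-unknown pass. It reuses the correlation-style score from the earlier sketch, assumes windows are already resampled to a common length, and treats 170 on the 0-200 scale as the "85%" cutoff; names and structure are illustrative, not the actual code.)

```csharp
// Hypothetical sketch of the pass described above: score a window against every known
// template; below the confidence cutoff, attach it to an existing "unknown" bucket or
// file a new one. Assumes windows are resampled to a common length beforehand.
using System.Collections.Generic;

class PatternLibrary
{
    private readonly Dictionary<string, double[]> _known = new();
    private readonly List<double[]> _unknown = new();

    public void AddKnown(string name, double[] template) => _known[name] = template;

    // minConfidence = 170 reads as "85%" on the 0-200 confidence scale used earlier.
    public string Classify(double[] window, double minConfidence = 170)
    {
        string bestName = null;
        double bestScore = double.MinValue;
        foreach (var kv in _known)
        {
            double score = PatternScore.ConfidencePercent(kv.Value, window);
            if (score > bestScore) { bestScore = score; bestName = kv.Key; }
        }
        if (bestName != null && bestScore >= minConfidence) return bestName;

        // Not recognized: match against previously filed unknowns, else start a new bucket.
        for (int i = 0; i < _unknown.Count; i++)
            if (PatternScore.ConfidencePercent(_unknown[i], window) >= minConfidence)
                return $"unknown{i + 1}";
        _unknown.Add(window);
        return $"unknown{_unknown.Count}";
    }
}
```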
     
    #10     Nov 11, 2015