most used desktop PC trading software/target for inventory management plugin?

Discussion in 'Trading Software' started by stochastix, Apr 20, 2017.

  1. Hi All, I've developed some software implementing a fairly sophisticated strategy for actively managing limit and market orders. Since the group I was working with went tits up, I decided to find out which active trading platforms have a sufficiently large user base and an API advanced enough to integrate all the features the strategy implementation offers. My first thoughts were Interactive Brokers or Lightspeed.

    Does anyone have any recommendations for which platform I should develop for first? I've been out of the retail space for a while, supporting institutional traders. Also, any general discussion of the framework described at https://arxiv.org/abs/1106.5040 is welcome.

    The software is an implementation of the strategy described in https://arxiv.org/pdf/1106.5040.pdf, "Optimal high frequency trading with limit and market orders". The mathematics of the strategy is fairly sophisticated, and does not rely on any sort of black-box or curve-fitting ideas.

    The amount of historical quote and trade data needed to calibrate the models is quite large, and the user/trader would not be expected to do this themselves, though that would be an option. I would offer a service allowing the computed optimal controls to be downloaded to the workstation before the market opens.

    From the quote/tick data, a Markov chain transition probability matrix is estimated which governs the jumps of the spread level. For instance, if the stock currently has a spread of 2 pennies, there could be a 40% chance of it going back to 1 penny, a 55% chance of it jumping to 3 pennies, and a 5% chance of it jumping to 4 pennies. The spread-transition matrix need not be symmetric, and it usually is not. Also, for each hour of the day, the spread jump rate is calculated. This allows the probability of a particular spread level being reached, given knowledge of the current level, to be calculated by combining the spread transition probability matrix with the spread jump rates. For each trade that occurs, and for each possible spread level, the cumulative volume is tracked and compared against the number of shares at the best bid and offer to determine whether the trade simply matched the prevailing price, or "improved" the price and thus jumped in front of the orders already in the queue.
    The idea is that the current spread level determines, to some extent, the probability that future orders placed by counterparties will be at a matching or improving price. This allows the algorithm to determine optimally when it needs to offer a price improvement for a faster execution, and when it can afford to be patient and match the best bid or offer, given the currently displayed depth of the queue.
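    As an illustration, the counting-and-normalizing scheme described above could be sketched in Java like this (class and method names are hypothetical, not the actual implementation):

```java
import java.util.Arrays;

/** Sketch: estimate a spread-level transition matrix from observed spread jumps. */
public class SpreadTransitionEstimator {
    private final int levels;      // number of tracked spread levels (in pennies)
    private final long[][] counts; // counts[i][j] = observed jumps from level i+1 to level j+1

    public SpreadTransitionEstimator(int levels) {
        this.levels = levels;
        this.counts = new long[levels][levels];
    }

    /** Record one observed jump of the spread, e.g. from 2 pennies to 3 pennies. */
    public void recordJump(int fromPennies, int toPennies) {
        counts[fromPennies - 1][toPennies - 1]++;
    }

    /** Row-normalize the counts into a transition probability matrix. */
    public double[][] transitionMatrix() {
        double[][] p = new double[levels][levels];
        for (int i = 0; i < levels; i++) {
            long rowTotal = Arrays.stream(counts[i]).sum();
            for (int j = 0; j < levels; j++) {
                p[i][j] = rowTotal == 0 ? 0.0 : (double) counts[i][j] / rowTotal;
            }
        }
        return p;
    }
}
```

    With the example counts from the text (out of 20 jumps from a 2-penny spread: 8 to 1 penny, 11 to 3 pennies, 1 to 4 pennies), row 2 of the matrix reproduces the 40%/55%/5% probabilities.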

    The strategy also allows the various fees and rebates to be input as parameters. The optimizer takes all of the calibrated data and the parameters, and calculates the optimal controls to apply at any point during the trading day, for any level of inventory. The goal of the strategy is to be inventory-neutral, ending the day flat and holding no inventory overnight. It is also assumed to start the day at zero inventory, but this is not strictly necessary. The inventory level jumps at random times based on the demands of the customer, trader, whoever. The algorithm uses its knowledge of the spread transition probabilities, the activity levels for the hour of the day, and the relative abundance of improving vs. matching orders to optimally manage its inventory by actively placing market or limit orders as appropriate. The limit orders are moved when the best bid or ask moves, and so might be implemented more efficiently if pegged orders are available.
    The assumptions for this strategy to generate positive returns are that:
    • the activity levels of a stock are roughly constant and do not vary too much, relative to its average, from day to day. Note: if it is known that some stock will have a greater activity level on a particular day, a multiplier can be used to scale the averaged historic activity levels.
    • the spread transition probability matrices, averaged over the calibration period, are roughly equivalent to the spread transition probability matrix on any particular day.
    • the machine is able to send and receive messages from the exchange at a rate fast enough to match the discretization level of the computed optimal controls. For technical reasons, the minimum resolution of the computed solution is a function of the activity level. This is intuitive: for a stock that doesn't trade actively, a resolution of dt = 5 seconds could be sufficient, whereas for a very active stock the resolution might need to be as small as dt = 15 milliseconds.
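    To illustrate the third assumption, the inverse relationship between activity level and grid resolution could be sketched roughly as follows; the rule dt = safetyFactor / maxJumpRate is an illustrative assumption for the sketch, not the formula from the paper:

```java
/** Sketch: choose a control-grid resolution from the maximum observed spread jump rate.
 *  Illustrative rule only; the paper derives the actual resolution requirement. */
public class GridResolution {
    /**
     * @param maxJumpRatePerSecond highest hourly spread jump rate seen in calibration
     * @param safetyFactor fraction of the mean inter-jump time to use as the grid step
     * @return suggested discretization step in milliseconds
     */
    public static double suggestedStepMillis(double maxJumpRatePerSecond, double safetyFactor) {
        // Mean time between jumps is 1/rate; take a fraction of it so the grid resolves
        // individual jumps: an illiquid stock yields seconds, a very active one milliseconds.
        return safetyFactor * 1000.0 / maxJumpRatePerSecond;
    }
}
```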

    The benefit of precomputing the optimal controls before the beginning of the trading day is that the computational resources needed to actually operate the strategy in real time are very low, since the computationally demanding part, the calibration and optimization, is done ahead of time.
     
  2. Zzzz1

    What is your question? Are you looking for a place to employ your strategy, or to market it? Have you considered the Reuters and Bloomberg app stores? You could have users point to their own market data for calibration purposes; the two vendors provide a huge user base to start with, and you can set up a monthly subscription model.

    I am curious, if you are willing to share: what is the latency in microseconds to run through one compute cycle at runtime? I am aware that the calibration and optimization are all done beforehand. Have you benchmarked your approach against other approaches to obtaining queue priority and other ways of modeling spreads?

     
    IAS_LLC likes this.
  3. My main question is where to market it. I don't have the stamina to stay glued to the screen all hours of the day, and would rather provide supporting tools and services for traders who do. The strategy doesn't decide when to enter and exit positions autonomously; it just provides better execution when a trader decides to take on a position, reducing slippage, optimizing commissions, etc. I looked into operating the strategy myself, but the data provider I looked into, Lime Brokerage, was charging about $11,000/month per exchange for data feeds, which I thought was outrageous and should be against the law imho.

    No, I wasn't aware they even had app stores. That sounds promising, I will look into it. I was thinking that I would have a direct data feed and provide the calibrated parameters as a subscription service. In theory there should not be any difference between data providers, and I would use one with the highest-frequency data available, so vendors like IB, who apparently snapshot the data every few hundred milliseconds, would still be able to use calibration data that came from a higher-frequency source. I suppose I could offer the calibration and optimization software so people could calibrate on their own data sources, but that would be quite a bit trickier, since the code I wrote to do it is specialized to do memory-mapped I/O on a compact format I developed. And if they were doing it themselves, how would I justify the monthly fee? I suppose it would be handy for non-standard markets, or for trading things besides US stocks. The plugin would just be a Java class adapting to the data format. Performance is not an issue, as I'm also using memory-mapped "direct byte buffers" with C/JNA. The calibrated data is in MATLAB .mat file format, and the software can read and write this format in a memory-mapped manner as well.
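    A minimal sketch of the memory-mapped reading idea, using Java NIO's MappedByteBuffer over a fixed-width record layout (the actual compact format isn't public, so the field layout here is invented for illustration):

```java
import java.io.IOException;
import java.nio.ByteOrder;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

/** Sketch: memory-mapped access to a compact fixed-width tick file.
 *  Hypothetical record layout: 8-byte timestamp, 4-byte price in ticks, 4-byte size. */
public class TickFileReader {
    private static final int RECORD_BYTES = 16;
    private final MappedByteBuffer buf;

    public TickFileReader(Path file) throws IOException {
        try (FileChannel ch = FileChannel.open(file, StandardOpenOption.READ)) {
            // The mapping stays valid after the channel is closed.
            buf = ch.map(FileChannel.MapMode.READ_ONLY, 0, ch.size());
            buf.order(ByteOrder.LITTLE_ENDIAN);
        }
    }

    public long recordCount()     { return buf.capacity() / RECORD_BYTES; }
    public long timestamp(int i)  { return buf.getLong(i * RECORD_BYTES); }
    public int priceTicks(int i)  { return buf.getInt(i * RECORD_BYTES + 8); }
    public int size(int i)        { return buf.getInt(i * RECORD_BYTES + 12); }
}
```

    Absolute-index gets like this avoid copying: the OS pages the file in on demand, which is what makes the calibration pass over large tick histories cheap on memory.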

    Sure, there is basically no compute cycle whatsoever at runtime: it's literally a giant lookup table, and the latency is determined by the discretization frequency. The math is continuous-time, but the control grid is chopped into "small" time increments; if the increment is not small enough, the optimizer will fail to converge (the math is explained in the paper). Usually it's something like 25 ms to 200 ms, but it can be as small as you want. If it's too big, however, it won't work. The optimal discretization interval is an inverse function of the maximum jump rate of the spread. The lookup table has time on the x-axis and inventory level on the y-axis; the control contains the information to place either a limit or a market order, and, if it's a limit, whether or not to improve the price by one penny.
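    A minimal sketch of what such a table could look like in Java (the Control encoding, field names, and indexing scheme are assumptions for illustration, not the actual implementation):

```java
/** Sketch: a precomputed policy as a lookup table over (time step, inventory). */
public class PolicyTable {
    /** One precomputed control: match the quote, improve by one tick, or cross with a market order. */
    public enum Control { LIMIT_MATCH, LIMIT_IMPROVE, MARKET }

    private final Control[][] table; // table[t][inventory + maxInventory]
    private final long stepMillis;   // discretization interval, e.g. 25-200 ms
    private final int maxInventory;  // inventory bounded to [-maxInventory, +maxInventory]

    public PolicyTable(Control[][] table, long stepMillis, int maxInventory) {
        this.table = table;
        this.stepMillis = stepMillis;
        this.maxInventory = maxInventory;
    }

    /** The whole runtime "compute cycle": two index computations, no optimization. */
    public Control lookup(long millisSinceOpen, int inventory) {
        int t = (int) Math.min(millisSinceOpen / stepMillis, table.length - 1);
        return table[t][inventory + maxInventory];
    }
}
```

    This is why the runtime latency is essentially just the discretization interval: each tick of the control clock is an O(1) array access.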

    As far as benchmarking against other spread-modeling methods, I have not done this yet; the referenced paper on arxiv.org does have quite a few Monte Carlo comparisons, though. It took quite a while to develop this system, and I chose to focus on one thing and get that done first. There are some more sophisticated models out there that I recently reviewed, but I need to get some revenue first. One really cool model I looked at actually used a numerical discretization of a second-order Schrödinger equation: https://arxiv.org/pdf/1607.00454.pdf
     
    Last edited: Apr 21, 2017
  4. [image attachment: b2tf.png]