How much is too much for a C++ program

Discussion in 'Automated Trading' started by BlueOcean, Jul 18, 2008.

  1. I notice that if you have an event handler such as...


    int CMyStock::OnGotLevel2Record(GTLevel2 *pRcd)

    you'll notice that this callback function gets called a TON! I'm guessing that for a stock with 2M shares of daily volume it'll be called maybe 15 times per second on average.


    How big a basket of stocks would be too much for a compiled C++ program? 100 stocks? 1000 stocks? If the average is 15 calls per second, that's 15,000 calls per second with a basket of 1000 stocks. Would that be too much for a C++ program?

    Just wondering how far you can push a C++ program.

    Thanks
     
  2. ssmegner

    ssmegner

    Without a stress test you will never know. There are factors like the number of cores, the amount of RAM, 64-bit vs. 32-bit, what else is happening on the system, etc. .NET code can actually run faster than C++, or slower than C++; it depends on what you are doing. The code in question may execute through different paths depending on the inputs.
     
  3. I was just typing a similar answer :) In addition to all that, it depends on how many processors you have available.
     
  4. bespoke

    bespoke

    I would think it can handle it quite easily. Although it's not C++, with VB6.0 and the Sterling API I can process up to 10,000 quotes/sec before the quotes begin to back up. That's on an E8400 dual-core overclocked to 4.0 GHz. With a Q6600 quad-core at 2.4 GHz I can do about 3,500 q/s, so you can see the importance of CPU speed. AND this is with a slow-ass COM interface instead of DLL callbacks, so I would think you can go much faster.

    So, I guess the only way to really know is to do it.
     
  5. Sorry, your question is malformed.

    I'm recording tick data for the 6000 most liquid instruments on earth with a (very well) self-coded application.

    Bursts of up to 150,000 ticks/sec: no problem. The day's average is ~15,000 ticks/sec, currently during US RTH.

    My software uses 10% CPU on a 2-core/3 GHz Core2Duo with 1 MB cache
    (an E2160; Intel CPUs can clock much higher than they do out of the box).

    Database size is currently about 1 TB.

    Network bandwidth is more of an issue!
     
  6. bespoke

    bespoke

    Who is your datafeed? Since I'm tied to Sterling, I have to use VB6.0 and their poor API. I'd love to be able to get 150K quotes/sec and go back to a proper language, though I don't want to pay several thousand for it. <$1000 would be great.
     
  7. Multiple feeds (not sterling).

    If you are tied to a shitty API, and believe me I know them, I'd "outsource" that part into its own process on localhost or the local network (its own machine, if needed) and relay the data to the real collector process via TCP or IPC.

    The real collector should be an independent, clean, multithreaded (64-bit) application.
     
  8. Depends on the hardware. Your market data receiver should be its own process, rather than one huge application built to do everything.
     
  9. Jachyra

    Jachyra

    I can tell you that the overhead of the function call itself is negligible on almost all modern operating systems... so it's the code you write and execute within the function that will determine whether your handler is lean and mean or slow and bloated.
     


  10. That's badass, local_crusher, congrats.
    What database are you storing this data to?
     
    #10     Aug 22, 2008