ThinkorSwim platform under Java 6

Discussion in 'Trading Software' started by syswizard, Feb 20, 2009.

  1. thstart


    >That attitude would definitely be a reason for them to hide the outsourcing.

    That is the attitude you get when outsourcing becomes a problem. There are too many examples of incompetent outsourcing crushing companies.

    Probably off topic, but this is one of the reasons the shipping company DHL spent $1 billion to build a US operation, only to be crushed by incompetent software outsourcing (Airborne had an excellent web services API; DHL had a horrible one, written in Java by the way - this is from my personal experience).

    >Here's just a good example of what happens when the outsourcing staff lacks insight and foresight into how the application could be improved and how everything is cohesive

    Your example is interesting.

    Some operations can be outsourced; others are simply impossible to outsource.

    The problem I see with financial applications is that the domain is too US-specific, and not everything can be outsourced.

    I cannot imagine developers in Russia knowing the details of US stock market operations. They are probably good developers, but that is not enough to create a good financial application.

    Every feature you implement has a thousand possible permutations with other features, and you have to know what you are doing when you develop financial software.

    Also, it is one thing for, say, MS Office - bugs or no bugs, it is not so important - and quite another for a financial application.
     
    #11     Feb 21, 2009
  2. soks86


    You know this makes me reconsider the way I approached this.

    If an application is to run on an end-user machine (God forbid on Vista), then speed issues loom large, both because of shared resources and because the trader may simply be using hardware that is too slow for what they are trying to accomplish. If you were locked into providing for limited systems (e.g., handhelds, smart phones), then C/C++ would be a requirement. But on a solid PC running appropriate software (if TOS is pure Java and you have no requirement for Windows, you may want to consider running Linux), the performance cost should not be significant compared to the savings from a simpler development and maintenance cycle.

    The fact is that development, maintenance, and scalability are all major factors in software and hardware. Higher-level languages like Java (or C# for you .NET folks) provide frameworks that help you develop thoroughly tested and very robust code at a speed cost that is usually constant, or linear at worst. I simply see additional hardware as a much cheaper solution than additional development effort; competent developers are expensive. I feel confident knowing my broker understands the benefits of a Java-based system that can be run in a proper, stable environment (Linux, UNIX, Solaris).
     
    #12     Feb 21, 2009
  3. thstart


    >I simply see additional hardware as a much cheaper solution

    This is no longer true. The free lunch is over: Intel and the others can no longer just increase the clock speed so that you buy new hardware and the problem is solved. To keep up with the demand for speed they now need better processor architectures, multi-core for example. To exploit that you need appropriate software and cache-level optimizations, which are best done in C++ - BUT only with the Intel compiler. The Microsoft compiler does not recognize the latest fast Intel instructions and is useless for the latest Intel microprocessors.
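    To make the cache argument concrete, here is a minimal Java sketch (class and method names are mine, not from any trading platform) contrasting a contiguous primitive array with boxed objects for the same running sum. The difference is exactly the cache locality being discussed; in C++ you would control the layout even more directly.

    [CODE]
    // Hypothetical demo, not from any broker platform.
    import java.util.ArrayList;
    import java.util.List;

    public class CacheLayoutDemo {

        // Contiguous primitive array: sequential memory access, friendly to
        // the CPU cache and hardware prefetcher.
        static double sumPrimitive(double[] closes) {
            double sum = 0.0;
            for (int i = 0; i < closes.length; i++) {
                sum += closes[i];
            }
            return sum;
        }

        // Boxed objects: every element is a separate heap object, so the loop
        // chases pointers and unboxes on each iteration.
        static double sumBoxed(List<Double> closes) {
            double sum = 0.0;
            for (Double d : closes) {
                sum += d;
            }
            return sum;
        }

        public static void main(String[] args) {
            int n = 2000000;
            double[] primitive = new double[n];
            List<Double> boxed = new ArrayList<Double>(n);
            for (int i = 0; i < n; i++) {
                double price = 100.0 + (i % 100) * 0.01;
                primitive[i] = price;
                boxed.add(price);
            }

            long t0 = System.nanoTime();
            double s1 = sumPrimitive(primitive);
            long t1 = System.nanoTime();
            double s2 = sumBoxed(boxed);
            long t2 = System.nanoTime();

            System.out.println("primitive: " + s1 + " in " + (t1 - t0) / 1000000 + " ms");
            System.out.println("boxed:     " + s2 + " in " + (t2 - t1) / 1000000 + " ms");
        }
    }
    [/CODE]

    On a typical JVM the primitive version will usually run several times faster for large arrays, purely because of memory layout - the same effect, magnified, is what cache-level tuning in C++ is after.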

    But you are right, developers are expensive. On the other hand, financial applications are not a good fit for outsourcing.

    What I am saying probably matters most for server-based solutions, where you need the best performance for long-running calculations, but on workstations better performance does not hurt either.

    I am speaking from experience - every day I process about 500 EOD parameters, 10 years back in history, for 500 stocks. This is only possible the way I explained. For some calculations I got 1,000 times better performance, meaning that with C# I would have to wait 24 hours for the calculations to finish, which is useless. With proper optimizations under C++ I reduced that time to several minutes. I use a very fast Dell server and made the comparisons on the same server, and I tried to avoid C++ until the last moment. Development in C# is much faster, but the end result is unacceptable performance. Having the two versions opened my eyes, because I was able to compare on a real problem, not artificial benchmarks.
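    For what it is worth, that kind of per-stock EOD batch also parallelizes naturally across cores, whatever the language. Below is a minimal Java sketch assuming a hypothetical computeParameters() routine per symbol - it only illustrates the fan-out, it is not anyone's actual code.

    [CODE]
    // Minimal sketch of fanning a per-symbol EOD batch out over all cores.
    // The symbol list and computeParameters() are hypothetical placeholders.
    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class EodBatch {

        // Placeholder for the real per-symbol calculation (e.g. ~500
        // parameters over 10 years of daily bars).
        static double[] computeParameters(String symbol) {
            return new double[500];
        }

        public static void main(String[] args) throws Exception {
            List<String> symbols = new ArrayList<String>();
            for (int i = 0; i < 500; i++) {
                symbols.add("SYM" + i);
            }

            int cores = Runtime.getRuntime().availableProcessors();
            ExecutorService pool = Executors.newFixedThreadPool(cores);

            List<Future<double[]>> results = new ArrayList<Future<double[]>>();
            for (final String symbol : symbols) {
                results.add(pool.submit(new Callable<double[]>() {
                    public double[] call() {
                        return computeParameters(symbol); // independent work per symbol
                    }
                }));
            }

            for (Future<double[]> f : results) {
                f.get(); // wait for (and collect) each symbol's parameters
            }
            pool.shutdown();
        }
    }
    [/CODE]

    This does not close the gap with hand-tuned C++, but it does at least use all the cores the "free lunch is over" argument is about.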

    Decimalization created a really big problem because the amount of data to be processed increased tenfold. The exchanges generate terabytes of data per day, and if you add stock options it is even more. You need the best performance to process this amount of data.

    If your broker is running on Java, you really have a problem. If not now, they will hit the wall very soon and you will see delays in your data.

    I already see this with Schwab - moments of very high-frequency trading freeze their system and it is inaccessible during that time, and this is with their desktop software. They also have a Java client downloadable from the web site, which has horrible performance, is not stable, etc. I know Schwab is not the best example, but they are a billion-dollar company and cannot solve this problem.

    That is one of the reasons I no longer believe in high-frequency trading. You cannot get the speed needed to have an advantage, and the amount of historical data that needs to be processed is far too large. Possibly trading only a few ETFs that way would be feasible.
     
    #13     Feb 21, 2009
  4. Yet more "Java is too slow" silliness.

    Anybody running TWS or another client trading application can easily check for themselves by watching CPU utilization in a fast market. In the case of TWS you will find it is low. If it is low, there is no problem.

    Of course if you are crunching vast amounts of data it may be another matter, but you will get more return from 1. using the best algorithm and 2. good multithreading than from worrying about which compiler to use or whether the latest SSE4 (or whatever) instructions are supported.
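    The easiest way to do that check is simply to watch Task Manager or top during a fast market. For the curious, here is a rough sketch of getting the same number from inside a Java process; getProcessCpuTime() is a com.sun.management extension available on Sun JVMs, so this is JVM-specific and purely illustrative.

    [CODE]
    // JVM-specific sketch: sample this process's own CPU time and turn it
    // into an approximate utilization percentage across all cores.
    import java.lang.management.ManagementFactory;

    public class CpuWatch {
        public static void main(String[] args) throws Exception {
            com.sun.management.OperatingSystemMXBean os =
                    (com.sun.management.OperatingSystemMXBean)
                            ManagementFactory.getOperatingSystemMXBean();
            int cores = os.getAvailableProcessors();

            long lastCpu = os.getProcessCpuTime(); // nanoseconds of CPU used so far
            long lastWall = System.nanoTime();

            for (int i = 0; i < 60; i++) {         // one sample per second for a minute
                Thread.sleep(1000);
                long cpu = os.getProcessCpuTime();
                long wall = System.nanoTime();
                double pct = 100.0 * (cpu - lastCpu) / ((double) (wall - lastWall) * cores);
                System.out.println(String.format("process CPU: %.1f%%", pct));
                lastCpu = cpu;
                lastWall = wall;
            }
        }
    }
    [/CODE]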
     
    #14     Feb 22, 2009
  5. thstart


    >Java silliness...

    I don't know what you are talking about, possibly old forum threads I haven't seen.

    No reason to be angry; I am trying to understand why my broker cannot handle the load. One possibility is that their software is not fast enough.

    >Of course if you are crunching vast amounts of data it may be another matter

    What will you do if your broker can no longer handle the data load? I am referring to the server side, not the desktop side, if you read carefully.

    If you are objective you will understand my point.
     
    #15     Feb 22, 2009
  6. Having used both TWS and TOS, which are Java-based platforms, I have to totally agree with this. The threading architecture used in these realtime applications is critical. I have to admit the TOS platform does some amazing realtime calculations with options while also processing quotes and calculating indicators, so their thread management framework is most likely very well done.
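    None of us knows the TOS internals, but the usual Java pattern for this kind of realtime workload is a bounded queue between the feed thread and the calculation workers, so a burst of ticks queues up instead of stalling the feed. A rough sketch with made-up types:

    [CODE]
    // Rough sketch of the common realtime pattern: a feed thread pushes
    // quotes onto a bounded queue, worker threads drain it and run the
    // calculations. Quote, the feed, and the calculation are placeholders.
    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class QuotePipeline {

        static class Quote {
            final String symbol;
            final double price;
            Quote(String symbol, double price) { this.symbol = symbol; this.price = price; }
        }

        public static void main(String[] args) {
            final BlockingQueue<Quote> queue = new ArrayBlockingQueue<Quote>(100000);

            // Calculation workers: drain the queue and compute indicators/greeks.
            int workers = Runtime.getRuntime().availableProcessors();
            for (int i = 0; i < workers; i++) {
                new Thread(new Runnable() {
                    public void run() {
                        try {
                            while (true) {
                                Quote q = queue.take();
                                // placeholder: update indicators for q.symbol at q.price
                            }
                        } catch (InterruptedException e) {
                            Thread.currentThread().interrupt();
                        }
                    }
                }).start();
            }

            // Feed thread: in reality this would read the broker's data stream.
            new Thread(new Runnable() {
                public void run() {
                    try {
                        while (true) {
                            queue.put(new Quote("SPY", 80.0)); // stand-in for a live tick
                            Thread.sleep(1);
                        }
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            }).start();
        }
    }
    [/CODE]

    How well the queue sizes, thread counts, and work partitioning are chosen is presumably where a platform like TOS earns its smoothness.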

    Although the outsourcing does make for functional and GUI omissions, it does appear to have one advantage: TOS never, ever crashes. Not once in 3 years for me. So outsourcing the testing of an application does make sense. However, architecting and assembling the framework for the app or coordinating enhancements: forget it, you're likely to miss the target in these areas with outsourcing.
     
    #16     Feb 22, 2009
  7. thstart


    >However, architecting and assembling the framework for the app or coordinating enhancements: forget it, you're likely to miss the target in these areas with outsourcing.

    That was my point. It will be interesting to see how the platform is maintained after the acquisition.
     
    #17     Feb 22, 2009
  8. Interactive gets around this problem with a technique called data sampling. The big question is: does your trading methodology need every tick and every print? If your average time-in-trade is 20 seconds, then the answer is "probably". For 5-minute or longer time frames, I think the answer is "no". IMHO, with all of the noise in market data, sampling is a valid technique for keeping data avalanches from locking up the trading platform.
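    I don't know exactly how Interactive implements it, but the simplest form of data sampling is time-based conflation: keep only the latest quote per symbol and publish snapshots on a fixed timer. A toy sketch (all names hypothetical, not any broker's actual mechanism):

    [CODE]
    // Toy sketch of time-based conflation: the feed overwrites the latest
    // price per symbol, and a timer publishes a snapshot every N ms, so the
    // downstream load no longer depends on the raw tick rate.
    import java.util.Map;
    import java.util.Timer;
    import java.util.TimerTask;
    import java.util.concurrent.ConcurrentHashMap;

    public class SnapshotSampler {
        private final Map<String, Double> latest = new ConcurrentHashMap<String, Double>();

        // Called for every raw tick; only the most recent value survives.
        public void onTick(String symbol, double price) {
            latest.put(symbol, price);
        }

        // Publish whatever is current every intervalMillis, regardless of tick rate.
        public void start(long intervalMillis) {
            new Timer(true).scheduleAtFixedRate(new TimerTask() {
                public void run() {
                    for (Map.Entry<String, Double> e : latest.entrySet()) {
                        System.out.println(e.getKey() + " " + e.getValue()); // stand-in for "publish"
                    }
                }
            }, intervalMillis, intervalMillis);
        }
    }
    [/CODE]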
     
    #18     Feb 22, 2009
  9. thstart


    I suppose you are referring to sampling as used, for example, in MP3 - taking every 10th sample?


    >The big question is: Does your trading methodology need every tick and every print?

    Of course not. The moves in such a short time frame are very small, so you need a lot of risk capital to get a gain. That, plus the very large amount of historical data that needs to be analyzed, does not justify high-frequency trades for me.

    Tick-by-tick historical analysis probably makes sense only for very liquid stocks or ETFs.
     
    #19     Feb 22, 2009
  10. Wow, that's a great question! Maybe we should start a thread on data sampling techniques then?
    I don't think every-Nth-tick is what Interactive is using; I am sure it is more sophisticated. Maybe it's a combination of time and Nth. Possibly just time is used, as that would always ensure consistent data delivery. However, if time alone determines when to sample, then when the tick rate soars, important data like the high or low of an interval could be lost.
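    One way around exactly that loss is to publish a small per-interval summary instead of just the last price - effectively a tiny open/high/low/last record - so spikes inside the interval survive the sampling. A purely illustrative sketch of the bookkeeping (again, no claim that this is what Interactive actually does):

    [CODE]
    // Illustrative only: per-interval summary that keeps open/high/low/last,
    // so extremes inside a sampling interval are not thrown away.
    public class IntervalSummary {
        private double open, high, low, last;
        private boolean hasData = false;

        // Called for every raw tick inside the current interval.
        public synchronized void onTick(double price) {
            if (!hasData) {
                open = high = low = price;
                hasData = true;
            } else {
                if (price > high) high = price;
                if (price < low) low = price;
            }
            last = price;
        }

        // Called by the sampling timer at the end of each interval; resets for the next one.
        public synchronized double[] drain() {
            if (!hasData) return null;
            double[] bar = new double[] { open, high, low, last };
            hasData = false;
            return bar;
        }
    }
    [/CODE]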
     
    #20     Feb 22, 2009