Matlab in Finance

Discussion in 'Trading Software' started by nitro, Sep 29, 2004.

  1. Hi Anton,

    I've visited your site a few times and I think QuantStudio is a very interesting product.

    Could you perhaps give us a rundown of what you are planning with regard to the future development and evolution of the product?

    Thanks,

    Runningbear
     
    #41     Nov 3, 2004
  2. Well, we have recently released QuantServer .NET, which is a fast streaming database engine dedicated to manipulating time series of any user-defined objects (trades, quotes, bars, fundamentals, FIX messages, whatever). It shows roughly 1M objects/second performance on an Athlon64 and can store and manipulate any number of named time series. It also provides high compression capabilities. One month of TAQ (the NYSE trade and quote database), which is 22 CDs in binary format, occupies ~2 GB once imported into QuantServer.
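
    For readers curious what a streaming time series store of user-defined objects looks like in practice, here is a minimal Python sketch of the general idea (it is not QuantServer's implementation or API; the record layout, file naming, and gzip compression are assumptions made for illustration): fixed-layout records appended to one file per named series, read back as a stream.

        import gzip
        import struct
        from pathlib import Path

        # Hypothetical record layout: timestamp (float64), price (float64), size (int32)
        TRADE = struct.Struct("<ddi")

        def append_trades(series_name, trades, root="tsdata", compress=True):
            """Append (timestamp, price, size) tuples to a named series file."""
            path = Path(root) / (series_name + (".bin.gz" if compress else ".bin"))
            path.parent.mkdir(parents=True, exist_ok=True)
            opener = gzip.open if compress else open
            with opener(path, "ab") as f:
                for t, px, sz in trades:
                    f.write(TRADE.pack(t, px, sz))

        def read_trades(series_name, root="tsdata", compress=True):
            """Stream records back in insertion (time) order."""
            path = Path(root) / (series_name + (".bin.gz" if compress else ".bin"))
            opener = gzip.open if compress else open
            with opener(path, "rb") as f:
                while (chunk := f.read(TRADE.size)) and len(chunk) == TRADE.size:
                    yield TRADE.unpack(chunk)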

    QuantServer can work together with the QuantStudio time series analysis package, so that you can analyse huge time series (I am talking about tens of gigabytes of tick data) on disk as if they were in memory. A typical task is to select the most correlated / cointegrated pairs from the NYSE stock universe.
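
    To make that "typical task" concrete, here is a hedged numpy sketch of picking the most correlated pairs from a stock universe, assuming the tick data has already been reduced to a matrix of aligned per-bar returns (the cointegration variant would need a proper test such as Engle-Granger and is omitted):

        import numpy as np

        def top_correlated_pairs(returns, symbols, n_pairs=10):
            """returns: (T, N) array of aligned returns; symbols: list of N tickers."""
            corr = np.corrcoef(returns, rowvar=False)      # N x N correlation matrix
            iu = np.triu_indices_from(corr, k=1)           # upper triangle, no self-pairs
            order = np.argsort(corr[iu])[::-1][:n_pairs]   # highest correlations first
            return [(symbols[iu[0][k]], symbols[iu[1][k]], corr[iu][k]) for k in order]

        # Random data stands in for the real return matrix in this example
        rng = np.random.default_rng(0)
        rets = rng.normal(size=(252, 50))
        syms = ["SYM%d" % i for i in range(50)]
        print(top_correlated_pairs(rets, syms, n_pairs=5))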

    In general we are moving to the "hedge fund out of the box" product concept. I can write more about it if someone is interested :)

    Cheers,
    Anton
     
    #42     Nov 3, 2004
  3. Looks like the thread is being resuscitated by a spammer.
     
    #43     Nov 3, 2004
  4. I was asked about future plans and I replied, that's it. You are welcome to remove this post if you like. Anyway, one of the big problems with using Mathlab in finance is historical data access from Mathlab. I think we have a good solution to this problem, and the "spam" above is somewhat relevant. The rest is up to you.

    Cheers,
    Anton
     
    #44     Nov 3, 2004
  5. prophet

    What is "Mathlab"? Do you mean Matlab?

    High performance on an Athlon 64 is 10M or 100M objects/second, not 1M. Everyone knows NYSE TAQ can be compressed. So what? I can squeeze 36 months of NYSE TAQ into 20 GB with ordinary RAR compression. But then it can't be easily queried. Considering the low cost of hard drives, you are nuts to be promoting high compression of NYSE TAQ, sacrificing query time.

    Your "typical task" does not require or benefit from memory mapped "tens of gigs of tick data". There are much better implementations. You should first convert the tick data to fixed or uniform time intervals. Save that to disk for reuse.

    Care to explain this "big problem"?
     
    #45     Nov 3, 2004
  6. Well, I am not going to flame here but

    - objects/second I/O depends on the kind of objects you are doing I/O with. In some cases 10 objects per second on an Athlon64 is very good performance :)
    - try to store TAQ data in an SQL db and have a look at the compression / performance issues, even on an Athlon64
    - you can use different compression levels to trade speed against compression; I believe you will get your 10M ticks/second on an Athlon64 with a fast bus and HDD if you run without compression
    - you can still query compressed data (see the sketch after this list)
    - indeed you can compress trades into bars and have a few hundred bars per day instead of tens of thousands of ticks, but I was talking about ticks
    - perhaps there are better solutions. Could you please send a link, together with pricing info? KX, for example? :)
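
    As a generic illustration of the "query compressed data" point (this is not QuantServer's internals; the file layout and field names are made up for the example): a gzip-compressed tick file can be filtered on the fly while streaming, so a query never needs the uncompressed data on disk.

        import csv
        import gzip

        def query_ticks(path, symbol, start, end):
            """Stream a gzipped CSV of ticks and yield rows for one symbol in a time window."""
            with gzip.open(path, "rt", newline="") as f:
                for row in csv.DictReader(f):
                    if row["symbol"] == symbol and start <= row["timestamp"] <= end:
                        yield row

        # Timestamps are compared as ISO-8601 strings, so lexical order matches time order
        for tick in query_ticks("taq_2004_11.csv.gz", "IBM",
                                "2004-11-03T09:30:00", "2004-11-03T16:00:00"):
            pass  # process the tick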

    Cheers,
    Anton
     
    #46     Nov 3, 2004
  7. prophet

    You appeared to be talking about tick or bar data, not some massive objects.

    How you store the TAQ data depends on how it will be used. Beyond archival purposes, compression is only useful to the point at which it improves throughput from disk without too much CPU overhead to decompress... for the desired usage patterns. NYSE TAQ is a huge amount of data, though I have seen bigger sets. Many applications will process it differently and may require transposes. Of course you can query compressed data. I do it all the time. You can also use a DB, flat files, or hybrids of the two. Everything depends on expected usage patterns.

    You originally boasted about how much you can compress TAQ data. I don't see the relevance unless you're concerned about the price of hard drives. If you were concerned about efficiency and speed you would have mentioned bandwidth instead, maybe even preprocessing too. Are you able to increase the query bandwidth over flat uncompressed, indexed TAQ data? If yes, by how much? For what query patterns or types of usage? Even if you compress the original TAQ data it may make no difference, depending on the application.
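
    Whether compression actually raises query bandwidth is easy to check empirically. A rough Python sketch (the file names are placeholders) is to time a sequential scan of the same data stored both ways and compare the rate of logical, i.e. decompressed, bytes delivered:

        import gzip
        import time

        def scan_throughput(path, opener=open, chunk=1 << 20):
            """Return logical MB/s when scanning a file sequentially."""
            total, start = 0, time.perf_counter()
            with opener(path, "rb") as f:
                while block := f.read(chunk):
                    total += len(block)
            return total / (1 << 20) / (time.perf_counter() - start)

        # Placeholders: the same day of TAQ stored raw and gzip-compressed
        print("raw:        %.1f MB/s" % scan_throughput("taq_day.bin"))
        print("compressed: %.1f MB/s" % scan_throughput("taq_day.bin.gz", opener=gzip.open))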

    I/O bus and HDD speed are not requirements for processing 10M or 100M ticks/second or bars/second. Testing can be made to fit nicely within memory (low page faults), minimizing cache misses. The key is to test multiple models or parameters simultaneously. Otherwise you will have cache misses, will be memory bound and/or disk bound (plus DB overhead), and will thus see 1/10 to 1/1000 of your CPU's potential.
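
    The "multiple parameters simultaneously" point can be illustrated with a small numpy sketch: one pass over an in-memory price array evaluates a whole grid of moving-average lookbacks, instead of re-reading the data once per parameter (the long-when-above-MA rule itself is just a placeholder strategy):

        import numpy as np

        def sweep_ma_lookbacks(prices, lookbacks):
            """Evaluate a toy long-when-above-MA P&L for many lookbacks over one in-memory array."""
            rets = np.diff(prices) / prices[:-1]                      # one-step returns
            results = {}
            for n in lookbacks:                                       # data stays hot in cache
                ma = np.convolve(prices, np.ones(n) / n, mode="valid")
                signal = (prices[n - 1:-1] > ma[:-1]).astype(float)   # long when price is above its MA
                results[n] = float(np.sum(signal * rets[n - 1:]))
            return results

        prices = np.cumsum(np.random.default_rng(1).normal(0.01, 1.0, 100_000)) + 1_000.0
        print(sweep_ma_lookbacks(prices, lookbacks=range(10, 200, 10)))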

    Everything I've said applies equally to ticks and bars.

    My software is proprietary.
     
    #47     Nov 3, 2004
  8. Anton,

    How much is QuantServer .NET?

    Runningbear
     
    #48     Nov 3, 2004
  9. Hello,

    I do agree that performance (and overall design) depends on your specific task. QuantServer was designed to work with financial data ordered in time (price, fundamental, news and other time series).

    It was especially designed for

    1) Historical data warehousing (especially of large amounts of data)
    2) Data capture in real time, when you need to capture several thousand instruments simultaneously
    3) Data analysis and market simulations (strategy backtesting)

    It can map any objects, so you may desire a high level of compression in some cases (think about a news stream).

    There is a benchmark utility included in the (demo) package. You can see some plots and benchmarking reports here:

    http://www.smartquant.com/forums/viewtopic.php?t=756

    We are not saying that QuantServer can compete with an RDBMS if you need to execute complex queries, but we hope it is a simple and cost-cutting solution for financial data analysis (namely strategy backtesting and market simulations).

    PS. We have pricing info on our website; I don't really want to post it here and look like a spammer again :)

    Regards,
    Anton
     
    #49     Nov 4, 2004
  10. linuxtrader (Guest)

    A set of commercial C++/C software that we own and license to a couple of large Wall Street firms. We use Mathematica to prototype a model or investigate a complex calculation, but rather than run it natively in that environment or incur the penalty involved in a Matlab component, we port it into our well-tested libraries - these aren't for sale or available to new licensees, by the way.

    Our last native Matlab and Mathematica applications are scheduled to be phased out by the end of the year.
     
    #50     Nov 4, 2004