Staggering Data Lags with Investor/RT and DTN - Any and All Help Would Be Appreciated

Discussion in 'Data Sets and Feeds' started by cwb1014, Apr 10, 2009.

  1. I know that QT works pretty well with the DTN "IQ-Feed". Good enough to "scalp" with.

    Not sure about their new DTN "NxCore".
    They claim it reduces demands on bandwidth. Not applicable for QT though. Here's some info.

    http://www.dtn.com/files/Trading/SS_NXCORE_1107.pdf
     
    #31     Apr 10, 2009
  2. cwb1014

    cwb1014

    I thought that DTN transmitted their data with exchange timestamps. Is this not correct?
     
    #32     Apr 10, 2009
  3. srv

    srv

    I'm not sure I understand which folders you're referring to. The only "DATA" folder associated with i/rt is "data_f", which seems to be the entire database. I've deleted it before, and if I remember correctly the program does not run with an empty "data_f" folder.

    Neither do I understand which "CACHE" folder you mean. I find none in or under the i/rt program folder.

    I did find the "NEWS" folder. It has two files in it: "install.txt", dated 9/11/1996, and "option.tab", a 24 KB file dated 7/8/2000. Neither of those two files has been updated since the install, so I don't see how deleting them would help.

    If by "trading software platform" you mean the connection to the broker or (?) - I have none on the machine i/rt is running on. I'm using TWS with the ButtonTrader order entry & management front end on another machine.

    Did I misunderstand you completely?



     
    #33     Apr 10, 2009
  4. srv

    srv

    Hi!
    What do you base that on?
    I'm under the impression IQFeed's timestamps are the same as the exchange -

    It seems reasonable to assume that timestamps on datafeeds that do not send tick-by-tick data have some other basis - IB's datafeed, for instance.
    If you look at a time and sales window there, you will see the caution that the T&S entries do not necessarily correspond to a trade, only to a price change.

    But, IQFeed is each and every tick - at least that's what I'm paying for. I expect the timestamps to be as accurate as the price and volume.
    If my assumption is wrong I'd really like to know about it.
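    To illustrate the distinction above, here is a small hypothetical sketch (the tick data is made up) of how a "price change only" feed, like the one IB's time and sales caution describes, can silently drop trades that occur at an unchanged price:

    ```python
    # Raw tick stream as (timestamp, price, size) tuples -- made-up sample data.
    raw_ticks = [
        (1, 100.25, 200),
        (2, 100.25, 500),   # same price: a price-change-only feed may omit this
        (3, 100.50, 100),
        (4, 100.50, 300),   # same price again
        (5, 100.25, 150),
    ]

    def price_changes_only(ticks):
        """Keep only ticks whose price differs from the previous tick's price."""
        out, last_price = [], None
        for ts, price, size in ticks:
            if price != last_price:
                out.append((ts, price, size))
                last_price = price
        return out

    condensed = price_changes_only(raw_ticks)
    print(len(raw_ticks), "raw ticks ->", len(condensed), "price-change entries")
    # Two trades (and their volume) vanish from the condensed view.
    ```

    A true tick-by-tick feed would deliver all five entries; the condensed stream keeps only three.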
     
    #34     Apr 10, 2009
  5. cwb1014

    cwb1014

    IRT will not run with an empty data_f folder. I think what Landis82 means is deleting all your data using the Database Wizard (under the Control menu) and then downloading everything you want fresh from DTN. I'd recommend backing up IRT after the initial data deletion and again after the fresh download. I've tried this before and it has generally improved how the program runs; whether it will help with the particular problem we're all concerned with here, I don't know, but it certainly can't hurt to try.
     
    #35     Apr 10, 2009
  6. cwb1014

    cwb1014

    Hey, srv, if you get a chance to address the issue above, that would be great! Sure would like to have your insights in view of your familiarity with the database! Thanks in advance.

    cwb1014

    :cool:
     
    #36     Apr 10, 2009
  7. Eight

    Eight

    When I used Tradestation, it was my understanding that they put their own timestamp on data before sending it out. That raised the question, still unanswered, of which feeds actually use exchange timestamps. I'm working on some API stuff for IB and running into this same timestamp ambiguity, to the point where I have to stop and sort it out. I'm wondering how much I would have to pay for a feed that never falls behind in a fast market - and then how I would even know whether it was behind or not... I need a feed with exchange timestamps as a metric, so I can compare it to my system clock. IB's feed is said to never fall behind in fast markets, but I have no way to know that with certainty without comparing it to a known standard...
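    A rough sketch of the comparison described above: if a feed carried true exchange timestamps, you could estimate feed delay by differencing them against a local clock synced to a reliable source. Everything here is hypothetical - no real feed API is assumed, and the "exchange timestamp" is simulated:

    ```python
    import time

    def estimate_latency(exchange_ts: float, local_clock=time.time) -> float:
        """Return seconds between an exchange timestamp and local receipt time.

        Only meaningful if the local clock is tightly synced (e.g. via NTP);
        any clock skew shows up directly in the result.
        """
        return local_clock() - exchange_ts

    # Simulated example: a tick stamped 0.150 s before "now".
    now = time.time()
    lag = estimate_latency(now - 0.150)
    print(f"observed lag: {lag:.3f} s")
    ```

    The catch, as noted in the post, is that without a feed known to carry exchange timestamps there is no reference point to difference against in the first place.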
     
    #37     Apr 10, 2009
    This is my frustration with retail feeds. OpenBook Ultra, for example, will give you timestamps to the microsecond. Does the retail feed provider ever pass this along? No, so you never have any means of quantifying how fast your trading strategies can execute, or of constructing any sort of confidence interval over the mean quote dissemination time.
     
    #38     Apr 10, 2009
  9. srv

    srv

    I really couldn't offer anything to try that would be accessible to the end user.
    It would be presumptuous to make suggestions in this area to the developers; I'm unfamiliar with the internals of i/rt.

    I meant to say that the problem is NOT that they are using the wrong tool or an inefficient tool. A database built on a database ENGINE such as dbVista can be implemented efficiently or otherwise, depending on the knowledge, skill, and experience of the programmer.

    I have been out of the game for over 18 years. At that time at least, the thinking was design issues were most often the cause of software failure (problems in the general sense) - and implementation issues more often than not followed from poor design decisions.
    (Continuing to the absurd: poor or incomplete specifications invariably result in poor design; inadequate or incomplete domain expertise/knowledge makes good specifications impossible, at best a matter of luck.)

    By analogy: I doubt that choosing the C programming language (because it is "so old") has anything to do with a performance problem.
    On the other hand, lack of expertise in the C programming language could certainly result in serious performance problems.

    An application developed in Java (or C++/Fortran/Basic) might be superior to that application implemented using C; if the Java programmer is expert using Java and the C programmer has little skill with C.

    IOW, don't blame the tools.

    dbVista supports the "network" model, as opposed to a "hierarchical" or "relational" model. In a sense it is a lower level tool than a relational database engine; just as C is lower level than C++ or Java. My recollection is that Raima was working on an SQL front end as that's what programmers were being taught and were familiar with; the relational model was achieving dominance, making everything else a speciality or niche product/expertise. If i/rt is using an SQL wrapper around the underlying engine, then all bets are off as it seems unlikely a 'network model' database engine made to look 'relational' is as efficient as an engine designed to be relational to begin with.

    AND - it's been 18+ freaking years since I've had my head in this stuff... I don't know what I've forgotten - let alone what progress has been made in the last two DECADES...

    That's all I got!

    Ready for flames and charges of irrelevance...
    :confused:
     
    #39     Apr 10, 2009
  10. Eight

    Eight

    I emailed DTN and CQG regarding exchange timestamps; I'm getting ready to code up a system. I need to work out these little details regarding latency and, worse yet, variations in latency... in a really busy market it seems quite possible that the delay from the exchange to the data retailer could become a problem...

    A lot of trades don't get recorded exactly when they happen, actually. Maybe that is more of a pit-trading thing than an electronic-market thing - I'm not really sure, but I recall reading about it a long time back... Maybe electronic exchanges have evolved beyond the old model of data handling and data retailers need to catch up to the current needs of the trading community a bit... I hope not, but it's a bit of an unknown for me at this point...

    Regarding the local database: a guy developed TickZoom, which handles ticks nearly at the hardware level. He proved there is no need for your PC to be the bottleneck - absolutely no need for that at all...
     
    #40     Apr 10, 2009