I witnessed this recently: a system built at the tick level of detail (yes, one tick at a time) produces one set of performance results in real time and a DIFFERENT set of results from the same data feed when run against stored data as a backtest. Can anyone explain this? Is it due to delayed ticks, i.e. ticks streaming in out of perfect timestamp order? The other possibility is that the bandwidth and CPU of the workstation can't keep up in real time with these fast data feeds. I need an expert on data feeds, and specifically on how the CME's data servers send out tick data. I have to say that in my measurements I've seen tick rates as high as 2000 ticks per second for ES or NQ. Of course, I'm probably missing some, so the true rate is likely higher.
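For what it's worth, both suspicions are easy to check from a recorded stream. Here's a minimal Python sketch (my own helper, not from any particular feed API) that counts out-of-order timestamps and measures the peak tick rate over a sliding one-second window; run it over your captured ticks and compare against what your real-time engine saw:

```python
from collections import deque

def analyze_ticks(ticks, window_s=1.0):
    """Scan a tick stream for out-of-order timestamps and peak windowed rate.

    ticks: iterable of (timestamp_seconds, price) tuples, in arrival order.
    Returns (out_of_order_count, peak_ticks_per_window).
    """
    out_of_order = 0
    last_ts = float("-inf")
    window = deque()              # timestamps currently inside the window
    peak = 0
    for ts, _price in ticks:
        if ts < last_ts:          # tick arrived behind an earlier one
            out_of_order += 1
        last_ts = max(last_ts, ts)
        window.append(ts)
        # evict ticks that have fallen out of the sliding window
        while window and ts - window[0] > window_s:
            window.popleft()
        peak = max(peak, len(window))
    return out_of_order, peak

# Synthetic example: five ticks, one of them delayed (out of order)
stream = [(0.0, 100.00), (0.1, 100.25), (0.3, 100.50),
          (0.2, 100.25),   # delayed tick: timestamp earlier than previous
          (0.4, 100.75)]
print(analyze_ticks(stream))  # → (1, 5)
```

If the out-of-order count is nonzero, any indicator that depends on arrival order will diverge between real time and backtest; if the peak rate approaches what your workstation can process, dropped ticks become the likelier culprit.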