Interesting presentation, and lots to consider if data volumes grow beyond what fits in memory.
Nice presentation - it confirms what I've been doing for a few years now for representing and storing financial data, in particular full order-book depth and trades data: plain old binary files stored in a columnar manner, one file per column, be it Time, Depth Levels, or Trades. Fully portable across all platforms and applications. On top of that I use FastBit plus my own access layer to query the data. It's also very easy and quick to map that data into memory if needed. I've never tried OneTick or KDB, but every other DB I tried was miserably worse.
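For anyone curious what that layout looks like in practice, here is a minimal sketch in Python/numpy: one flat binary file per column, mapped back into memory for querying. The file names, dtypes, and the searchsorted range query are my own illustrative assumptions, not the poster's actual schema or access layer (which uses FastBit).

```python
# One binary file per column (Time, Price, ...), each a flat array of
# fixed-width values. Names and dtypes here are illustrative only.
import numpy as np
import os, tempfile

tmp = tempfile.mkdtemp()

# Write each column as raw fixed-width binary -- no headers, no parsing.
times  = np.array([1620000000000, 1620000000050, 1620000000075], dtype=np.int64)
prices = np.array([101.25, 101.30, 101.20], dtype=np.float64)
times.tofile(os.path.join(tmp, "time.bin"))
prices.tofile(os.path.join(tmp, "price.bin"))

# Map a column straight into memory -- the OS pages it in on demand,
# so files larger than RAM are still usable.
t = np.memmap(os.path.join(tmp, "time.bin"), dtype=np.int64, mode="r")
p = np.memmap(os.path.join(tmp, "price.bin"), dtype=np.float64, mode="r")

# A range query is just a binary search on the (sorted) time column.
lo = np.searchsorted(t, 1620000000050)
print(p[lo:])  # prices from that timestamp onward
```

The appeal is that the on-disk format is trivially portable: any language that can read raw little-endian integers and doubles can consume these files directly.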
I export historical data in CSV format from TradeStation. There are pseudo-indicators and keystroke macros on the TradeStation forums that can be used to accomplish that. I have also written API code to export data from CTS T4.
Would anyone kindly explain why this is the case? What are the issues with using R on hourly or minute data? Thanks very much in advance.
For strategy development and backtesting I'm using Mathematica. It gives you (almost) the power and possibilities of C, but lets you do many things very conveniently (due to its symbolic nature). In particular, setting up parallel computation is as simple as it gets. The main drawback is the price tag, unless you have access to an educational license.
I was stating that the trading-specific tools for R were not written for intraday time-frame analysis.