Have you considered BerkeleyDB? http://www.oracle.com/technology/products/berkeley-db/index.html Save live ticks to binary files (as recommended by previous posters) and, at end of day, merge the collected feed into the DB. Keep all collected symbols in the same table to minimize I/O.
This has recently become true on Windows as well. http://cran.r-project.org/doc/manuals/R-admin.html#Windows
http://www.oracle.com/technology/products/berkeley-db/pdf/berkeley-db-perf.pdf Matlab works well with BerkeleyDB. My trading partner has an associate using BerkeleyDB for roughly 12,000 ticks/sec acquisition, listening to ~8,500 instruments, with over 40 GB collected by the end of each session. It's certainly up to the task if kx is no longer an option (I assume you are getting this through work? kdb+/tick?).
Kind of funny to see that the majority of posters believe a DB is faster than binary files. Well... how can a DB be faster than plain binary? Think about it! In the meantime, I'd like to raise another performance issue: let's say some DBs really are as fast as binary files. How does one then represent the ticks in code? As objects? Structs? Arrays? Lists?
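To make the "objects vs. structs" question concrete, here is a hedged sketch of two of the options in Python: a convenient object-per-tick representation, and a packed struct layout that maps one-to-one onto a flat binary file (the record layout is an assumption for illustration):

```python
import struct
from collections import namedtuple

# Object-per-tick: convenient to work with, but heavier in memory.
Tick = namedtuple("Tick", "ts price size")

# Packed fixed-size records: same bytes in RAM as on disk, so reading
# a binary file back is just a bulk unpack with no parsing step.
TICK_FMT = "<ddI"  # timestamp (double), price (double), size (uint32)

def pack_ticks(ticks):
    """Serialize a sequence of Tick objects into one contiguous blob."""
    return b"".join(struct.pack(TICK_FMT, *t) for t in ticks)

def unpack_ticks(blob):
    """Deserialize -- struct.iter_unpack walks the fixed-size records."""
    return [Tick(*rec) for rec in struct.iter_unpack(TICK_FMT, blob)]
```

The trade-off is the usual one: objects are pleasant to manipulate, while the packed form is what keeps raw binary files fast in the first place.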
I have to disagree with those who suggested CSV. I used CSV for years for OHLCV (bar) data, but for tick data it was horrible. I ended up using SQLite and I'm very happy with it. I have over 2 years of tick data in CSV files for EURUSD and some other forex pairs. They're around 4-5 gigs each. They imported into a SQLite database with no problem and the data is much easier to use now.
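A minimal sketch of that kind of import with Python's stdlib `sqlite3` and `csv` modules. The column layout (timestamp, bid, ask) is an assumption for a forex tick feed; an in-memory database is used here just so the example is self-contained:

```python
import csv
import io
import sqlite3

def import_ticks(conn, csv_text):
    """Load forex tick rows (timestamp, bid, ask) from CSV into SQLite."""
    conn.execute("""CREATE TABLE IF NOT EXISTS ticks
                    (ts TEXT, bid REAL, ask REAL)""")
    rows = csv.reader(io.StringIO(csv_text))
    # executemany streams the rows in; SQLite's REAL affinity converts
    # the numeric-looking strings to floats on insert.
    conn.executemany("INSERT INTO ticks VALUES (?, ?, ?)", rows)
    conn.commit()
```

For multi-gigabyte files you would read the CSV in chunks and wrap the inserts in explicit transactions, but the shape of the code stays the same.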
SQLite works well... for Excel users, check out dhRichClient from thecommon.net. It lets Excel users easily integrate SQLite databases into worksheets without the MS overhead, and even use SQLite as an application file format. In-memory SQLite databases are blazing fast.
Well, I would imagine that if the DB has smart indexing, caching, etc., it will be faster than plain binary files on an average aggregated basis.
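A small illustration of why indexing can beat a flat file for lookups: a plain binary file has to be scanned end to end for a symbol/time-range query, while an indexed table lets the engine seek straight to the range. The schema and symbol names below are assumptions for the example:

```python
import sqlite3

def build_tick_db(n=1000):
    """Populate an in-memory tick table and index it on (symbol, ts)."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE ticks (symbol TEXT, ts REAL, price REAL)")
    conn.executemany("INSERT INTO ticks VALUES (?, ?, ?)",
                     [("EURUSD", float(i), 1.3 + i * 1e-5) for i in range(n)])
    # Without this index, the range query below would scan every row --
    # exactly what reading a flat binary file would cost.
    conn.execute("CREATE INDEX idx_sym_ts ON ticks (symbol, ts)")
    return conn

def range_query_plan(conn):
    """Ask SQLite how it will execute a symbol/time-range query."""
    return conn.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM ticks "
        "WHERE symbol = 'EURUSD' AND ts BETWEEN 10 AND 20").fetchall()
```

The query plan reports a SEARCH using `idx_sym_ts` rather than a full table scan, which is the "smart indexing" advantage in a nutshell. Whether that wins overall still depends on the access pattern: for pure sequential replay, the flat binary file is hard to beat.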