I've used a few platforms for strategy development, and I always seem to hit obstacles where I might as well write my own tools rather than go through the gymnastics of getting what I want out of the platform.

NinjaTrader, for example, felt awkward when accessing more than one or two securities' prices at a time (if I remember correctly, they had to be wrapped into a derived indicator). With tradelink, not everything worked out of the box, so there was tweaking to be done, and data management was clunky.

With R/quantmod and related packages, there's some nice open-source work, mostly approached from an academic angle. That angle favors vector operations, which can make for succinct functional code - see this backtest in just a few lines of R script: http://blog.fosstrading.com/2011/03/how-to-backtest-strategy-in-r.html

But vector operations aren't great when you want to switch to an event-driven model that responds to new bars or new ticks - which is what happens when you want to run on live data and generate signals. And more path-dependent trade logic expands the number of vectors needed until it's messier than it's worth. The nice thing about NT and tradelink is that they're built from the ground up to run the same code whether you're backtesting or running live.

Overarching all of this is my belief that I'm probably better served by first exploring relationships in the data and then seeing what might make a trade. So I think R might be the way - but when I use it and iterate over bar data, it feels like I'm doing it wrong by introducing flow control into a functional language.

Has anyone made R their "home" for everything (mining + testing + live signals)? Or, if you've been down the same road building your own tools for the whole process, what technologies did you settle on?
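To make the tension concrete, here's a minimal sketch (synthetic prices, illustrative names, nothing from any particular platform) of the same SMA-crossover signal written twice: once with whole-series vector operations, and once as a bar-by-bar loop of the kind an event-driven live engine would run.

```r
# Synthetic price series so the example is self-contained.
set.seed(42)
price <- cumsum(rnorm(500, mean = 0.1)) + 100

# Trailing simple moving average via a one-sided convolution filter.
sma <- function(x, n) stats::filter(x, rep(1 / n, n), sides = 1)

# --- Vectorized style: succinct, operates on whole series at once ---
fast <- sma(price, 10)
slow <- sma(price, 30)
sig_vec <- ifelse(fast > slow, 1, -1)   # long above, short below
sig_vec[is.na(sig_vec)] <- 0            # flat during warm-up

# --- Event-driven style: one bar at a time, could consume live bars ---
sig_evt <- numeric(length(price))
for (i in seq_along(price)) {
  if (i >= 30) {                        # wait for the slow window to fill
    f <- mean(price[(i - 9):i])
    s <- mean(price[(i - 29):i])
    sig_evt[i] <- if (f > s) 1 else -1
  }
}

stopifnot(all(sig_vec == sig_evt))      # same signals from both styles
```

The vectorized version wins on brevity, but as soon as the logic becomes path-dependent (trailing stops, position-aware entries), the loop version generalizes naturally while the vector version sprouts extra state vectors.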