I guess you already have someone trying to figure out your problem, but for future reference it would help if you described it in much more detail. It is very hard to recommend anything without knowing the full scope of your issue. Just my 2 cents.
R is perfectly capable. When something is slow, just use Rcpp. Check the https://cran.r-project.org/web/views/HighPerformanceComputing.html list for everything that's available.
Why not use Amibroker for this? https://www.amibroker.com/guide/h_exploration.html https://www.amibroker.com/guide/h_ranking.html
So I got the R script from my programmer. It looks quite good but fell short of my expectations with regard to speed and modifiability. And to me the R language feels weird (like hieroglyphs), so I don't think I can work with it. I've always found it hard to modify or extend code built by somebody else, in any language; it's often faster to start from scratch and build an architecture that makes sense to me. At this point I've decided to shelve R and try something else. I haven't coded in C for more than a decade and am probably very rusty, but I might attempt to write this myself from scratch.
It has to do with calculations that had to be distributed across clusters to make the thing run in a reasonable time frame, and certain objects had to be cleared out of memory before moving on to the next step. This creates a lot of issues if I want to access those objects at a later stage of the backtest.
I would recommend R. I would also recommend getting into the IRC R-finance chat and asking for help. A lot of packages run more powerful C code in the background for speed.
The real computational power these days comes from multithreading: the ability to use multiple cores or servers to process your data. Due to R's single-threaded nature, it's limited to offline data exploration and a "light" strategy-backtesting role. I would be interested to hear if anyone has been able to run a more or less serious/realistic strategy involving analysis of more than a few million records in R without having to wait hours for it to finish. Having said that, I've been spending a lot of time lately learning R and am pretty excited about using it for my own preliminary analysis and backtesting, even though I've spent most of my career coding in Java, C++, and Python.
I tend to agree. Despite the often furious attacks by R package developers, claiming R can handle anything under the sun thanks to Rcpp C++ integration, the fact remains that R is not a suitable environment for testing data sets worth many millions of data points in a computationally efficient way. Such developers often show off benchmarks where they generate a 100-million-element time series or run a single operation over one series. What they conveniently omit is that testing a full-fledged trading strategy often involves 20-100 different method invocations and callbacks per single tick/quote/bar iterated. R is great for investigating limited data series and/or extracting a single metric, but not much more. I have not seen a single R solution that can backtest a full-fledged trading strategy in an efficient manner.