I think the problem is that the parameters that matter change each day. I don't see how you could code for this.
I don't see much difficulty in coding for daily changes (other than increased computational time, maybe). I also think OP (schizo) has some good modern ideas -- there are already a lot of existing approaches designed with dynamic adaptation in mind. It really is a bit archaic to optimize over only one static lookback period (though that can also have side benefits, in that it forces the model(s) to generalize better over a larger set of data). I think one of the major existing difficulties with ML-type approaches is formulating a framework that properly accounts for robustness and future stability. Getting ML (as it exists in the literature and in public tools) to work for you is one big hurdle. If you get beyond that, the next big hurdle is how to evaluate in/out-of-sample stability with a good level of confidence. Nevertheless, to OP: lots of literature and blogs look at things like N-best momentum and volatility tracking, which are along the lines of what you asked. Lots of free open-source software packages exist for that as well. There's also a fertile subset of ML dedicated to online learning that hasn't been harvested much by the general community -- I would guess that technically driven funds like Renaissance use some of the concepts. One example that was popular years ago (though it had some practical warts) is Cover's Universal Portfolios.
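For anyone curious what Cover's Universal Portfolios looks like in practice, here's a minimal sketch of the common discretized approximation: sample constant-rebalanced portfolios (CRPs) from the simplex, track each one's cumulative wealth, and play the wealth-weighted average each period. This is my own illustrative approximation (the function name, sample count, and Dirichlet sampling are choices I made, not anything from the thread), not the exact integral form from Cover's paper:

```python
import numpy as np

def universal_portfolio(price_relatives, n_samples=2000, seed=0):
    """Discretized approximation of Cover's Universal Portfolio.

    price_relatives: (T, n_assets) array, each entry is p_t / p_{t-1}.
    Samples constant-rebalanced portfolios (CRPs) uniformly from the
    simplex and, each period, plays their wealth-weighted average.
    Returns the (T, n_assets) sequence of played portfolio weights.
    """
    rng = np.random.default_rng(seed)
    T, n = price_relatives.shape
    # Dirichlet(1, ..., 1) gives uniform samples on the simplex
    crps = rng.dirichlet(np.ones(n), size=n_samples)   # (S, n)
    wealth = np.ones(n_samples)                        # wealth of each CRP
    played = np.empty((T, n))
    for t in range(T):
        # Today's portfolio: mixture of CRPs, weighted by past wealth
        played[t] = wealth @ crps / wealth.sum()
        # Update each CRP's wealth with this period's price relatives
        wealth *= crps @ price_relatives[t]
    return played
```

The "practical warts" mentioned above show up here too: the simplex sampling scales badly with the number of assets, and the method ignores transaction costs.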
So would your model be training itself on a training data set created in real time, to determine the correlations that are affecting the market in real time? Then using those correlations to make trades on your test set? Sounds like a tall order.
It may be worth considering how this guy does it. I haven't figured out how his indicator works, but it trains on the full data set to calculate the probability of direction. It's also interesting that he uses data from after 2000, while Dr. Ernie Chan thinks only data after 2007 should be used. The data lookback period is a serious issue.
I can't see how historical data would have any value in determining real-time correlations between various parameters intraday. This seems like a problem for deep learning/unsupervised learning: you don't know which correlations are working intraday, and you need the algorithm to discover them. I could see this working for HFT if your data consisted of win/lose trades throughout the day.
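Even before reaching for deep learning, the "which correlations are working right now" question can be monitored cheaply with rolling windows. A minimal sketch, assuming hypothetical minute-bar returns (the symbols, the 30-minute window, and the simulated data are all my assumptions for illustration):

```python
import numpy as np
import pandas as pd

# Hypothetical minute-bar returns for a few instruments; in practice
# these would come from a live intraday feed, not a simulator.
rng = np.random.default_rng(1)
rets = pd.DataFrame(rng.normal(0.0, 1e-3, size=(390, 3)),
                    columns=["ES", "NQ", "ZN"])

# Rolling 30-minute pairwise correlation: a cheap running estimate of
# which relationships are "working" intraday, before any learning step.
window = 30
rolling_corr = rets["ES"].rolling(window).corr(rets["NQ"])
latest = rolling_corr.iloc[-1]   # most recent correlation estimate
```

An unsupervised approach would then look for structure across all such pairs at once (e.g. clustering the rolling correlation matrix) instead of eyeballing one pair.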
I've read that guy's PDF on his p-indicator. His indicator is a joke. A grotesque joke. It depends on the trading history of N unspecified price patterns. WTF?! All that crap just to figure out whether price is headed up or down or neither? LOL!
Thanks for the pointer. As always, so much to learn, so little time. BTW, which open-source programs are you referring to?
Depending on your level of comfort, languages like R and Python are common. You can look around Quantopian and start by cloning others' algorithms. A lot of good online-learning development was shared there.
Isn't the risk with adapting the parameters that you overfit to recent data? I saw volpunter refer in the past to the drawbacks of Walk Forward Testing. The thrust of his posts was that with WFT you fit a great model to the last market environment but are constantly chasing your tail. At the other extreme, suppose you want to build a model to predict movements in bond futures prices but use data from so far back in time that it's useless in a regime of low interest rates. So how do you slice and dice your data into training and test sets? How many years of data do you need for a large enough sample size whilst still keeping your parameters adaptive? The devil is in the details...
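For the "how do you slice and dice" question, the mechanics of walk-forward splitting are at least easy to write down; the hard part is choosing the window lengths. A minimal sketch (function name and defaults are mine, not from the thread):

```python
def walk_forward_splits(n, train_len, test_len, step=None):
    """Yield (train_idx, test_idx) range pairs for walk-forward testing.

    n: total number of observations.
    train_len / test_len: lengths of each in-sample / out-of-sample window.
    step: how far the window advances each iteration (defaults to
    test_len, so the out-of-sample segments tile without overlap).
    """
    step = step or test_len
    start = 0
    while start + train_len + test_len <= n:
        train = range(start, start + train_len)
        test = range(start + train_len, start + train_len + test_len)
        yield train, test
        start += step
```

The overfit-to-recent-data risk in the post maps directly onto `train_len`: shrink it and each fit chases the last environment; grow it and you drag in stale regimes.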
There is always a risk in fitting a model only to recent data, since the market can shift to a different regime without any prior announcement. Plus, some model parameters change fast, whereas others move slowly. So maybe one fits the same model to both data sets -- only recent data, and the longer-term data -- and sees what happens.
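The fit-to-both-windows idea above can be sketched very simply. Here the "model" is deliberately trivial (a constant-drift estimate); the point is only the comparison between the two lookbacks, and the function name and setup are assumptions of mine:

```python
import numpy as np

def fit_two_windows(series, recent_len):
    """Fit the same simple model -- a constant-drift estimate -- on the
    full history and on only the most recent window.

    A large divergence between the two estimates hints that fast-moving
    parameters dominate and the regime may have shifted recently.
    """
    full = np.mean(np.diff(series))                 # long-term drift
    recent = np.mean(np.diff(series[-recent_len:])) # recent drift
    return full, recent
```

In practice you would do this with the actual model's fitted parameters rather than a drift, but the "fit both, compare, see what happens" loop is the same.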