Across the board. Both bonds and equities fell in size, which is unusual. I don't break out the performance by type of forecast, so I'm afraid I can't answer that question. But obviously short vol didn't do great, though then neither did anything else. All other things being equal I'd cut my positions 5% (rounding means that only marginal positions would get cut). But positions would also have been cut by vol rising and trends turning. So, bottom line:

Code:
code   contractid      filled_datetime      filledtrade  filledprice
12889  BTP 201612      2016-09-09 07:51:54  -2           143.3600
12904  GBP 201609      2016-09-09 18:11:57  -1           1.3266
12886  KOSPI 201612    2016-09-09 02:51:43  -1           258.1000
12883  KR3 201609      2016-09-09 03:37:08  -1           110.7600
12901  LIVECOW 201710  2016-09-09 17:49:45  -1           98.4000
12895  NASDAQ 201609   2016-09-09 15:33:01  -1           4762.7500
12892  OAT 201612      2016-09-09 08:00:00  -1           159.3600
12898  SP500 201609    2016-09-09 15:40:44  -1           2154.2500

GAT
Hi GAT, I'm struggling a bit with the instrument weight multipliers, with respect to correlation. Is there a general formula for how to re-weight instruments given a particular correlation matrix? Thanks
I assume you're talking about problems of the kind: you have trading systems for, say, SP500, US10 and BUND, and you want to know what instrument weights to use? [If it's another problem, let me know.] Well, it's a problem of optimisation, as discussed in chapter 4, and there are two broad approaches.

The first is to use what I call handcrafting. In this case you'd put 50% into stocks and 50% into bonds, then split the bonds allocation 50:50. This gives you 50% in SP500, 25% in US10 and 25% in BUND.

The second is to do proper optimisation using correlations (and risk-adjusted returns, though I wouldn't bother; we don't need volatility estimates as these things have the same expected vol). There are closed-form ("general formula") solutions for this given a correlation matrix (slightly complicated by the constraint that all weights are positive and add up to 1), but these would normally come up with weights that are too extreme. I prefer other methods like bootstrapping or shrinkage: http://qoppac.blogspot.co.uk/2015/10/a-little-demonstration-of-portfolio.html ... though these need more work, which is why handcrafting is my "go to" method, though it is harder to backtest. GAT
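The hierarchical 50/50 split described above can be sketched in a few lines of Python. The grouping is taken straight from the post (stocks vs bonds, then within bonds); everything else is just equal division at each level:

```python
# Handcrafting sketch: equal weights at each level of a hand-built
# hierarchy (stocks vs bonds, then within the bonds group).
groups = {
    "stocks": ["SP500"],
    "bonds": ["US10", "BUND"],
}

weights = {}
group_weight = 1.0 / len(groups)  # 50% to each asset class
for members in groups.values():
    for instrument in members:
        # split the group's allocation equally among its members
        weights[instrument] = group_weight / len(members)

print(weights)  # {'SP500': 0.5, 'US10': 0.25, 'BUND': 0.25}
```

Deeper hierarchies (e.g. bonds split by region, then by country) work the same way: divide equally at each node and multiply down the tree.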
It's a little trickier than that. Equal weights, 14 instruments: I'm tackling the large drawdown at the end of the curve, which happened because all the financial futures suddenly became highly correlated and loss-making. Instead of defining fixed weights, I wanted to dynamically adjust the weights based on rolling-window correlations. My concern with either bootstrapping or handcrafting is that, while good, we wouldn't have learnt from this drawdown until we'd suffered a nasty loss.
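For reference, a rolling correlation of the kind described could be computed with pandas like so. This is only a sketch: the data is random noise standing in for real daily returns, and the span is an assumed ~1-year lookback:

```python
# Sketch: exponentially weighted rolling correlation between two
# return series, as one might use to adjust weights dynamically.
# Random data is purely illustrative.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
returns = pd.DataFrame(
    rng.standard_normal((500, 2)), columns=["SP500", "US10"]
)

# span of ~256 observations, roughly a one-year lookback for daily data
ew_corr = returns.ewm(span=256).corr()

# correlation matrix estimate as of the last observation
latest = ew_corr.loc[ew_corr.index.get_level_values(0)[-1]]
print(latest)
```

The `ewm().corr()` call returns a pairwise correlation for every date; the question is then how much the instrument weights should respond to it, which is what the reply below the original post addresses.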
I have to say the curve you show has a great Sharpe ratio overall, so I wonder if you are being unrealistic in expecting to smooth all drawdowns out of the curve entirely [the first steps on the path to overfitting], or maybe the earlier part is already overfitted [only you know for sure]... I also wonder, if you looked at something like the average correlation of the financial instruments, does it really peak in the period shown, or are you just using that as an excuse...

Anyway, to be clear: do you think you should be reducing the weight on financials because they are correlated, or because they are doing badly? The latter I wouldn't recommend. It's very hard to forecast risk-adjusted returns of these trading systems out of sample. As to the former, okay, let's discuss.

The speed of "learning" will depend on the lookback of your correlation estimates, e.g. are you using a ten-year lookback, five years, one year... (or equivalent if using my favourite, exponentially weighted). Perhaps to pick up what is happening here you'd need a one-year lookback. To what degree you learn will depend on how much your model weights vary with correlations (not at all with handcrafting). It's easiest to think about this with something like shrinkage: no shrinkage is just naive Markowitz; full shrinkage would give you equal weights. The less shrinkage you have, the more you will change your model weights as correlations move around.

So for example, if you tried something like Markowitz with shrinkage - full shrinkage on the means, 80% shrinkage on the correlation estimates with something like a five-year lookback - you'd find it did pretty well out of sample (you can't, by the way, run these experiments just on your own data, as that is definitely overfitting; random data is best, although I have also run similar experiments on a wide variety of real data). If you now look at the out-of-sample performance with shorter correlation windows, you find they don't work so well.
In other words, generally speaking, the benefit of shorter windows (adjusting quicker to changes in correlation) is overwhelmed by the fact that it's hard to forecast changes in correlations out of sample. Similarly, using less shrinkage (adjusting more to changes in correlation) produces more extreme weights that move about a lot, whilst making performance worse. GAT
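The shrinkage idea above can be sketched as follows. The numbers are illustrative, not the author's exact method: the sample correlation matrix is shrunk towards a prior in which every pair has the average correlation, and weights are then derived assuming equal volatilities and equal expected returns (i.e. "full shrinkage on the means"):

```python
# Sketch of correlation shrinkage. 80% weight on the prior is an
# assumed, illustrative parameter.
import numpy as np

# a sample estimate where two instruments look highly correlated
sample_corr = np.array([
    [1.0, 0.9, 0.1],
    [0.9, 1.0, 0.1],
    [0.1, 0.1, 1.0],
])

# prior: every off-diagonal entry equals the average pairwise correlation
avg_corr = sample_corr[np.triu_indices(3, k=1)].mean()
prior = np.full((3, 3), avg_corr)
np.fill_diagonal(prior, 1.0)

shrinkage = 0.8  # 80% on the prior, 20% on the sample estimate
shrunk = shrinkage * prior + (1 - shrinkage) * sample_corr

# With equal vols and equal means, minimum-variance weights are
# proportional to (inverse correlation matrix) times a vector of ones.
raw = np.linalg.inv(shrunk) @ np.ones(3)
weights = raw / raw.sum()
print(weights.round(3))
```

With full shrinkage (`shrinkage = 1.0`) the weights collapse to equal; with none (`shrinkage = 0.0`) they swing around with every wiggle in the sample estimate, which is the trade-off the reply describes.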
Actually, your argument makes a lot of sense, thanks. I see the main groups as being: corn & wheat; US2, Eurodollar and Bobl. What I'll do is try bootstrapping first, see how that goes, and go from there.

PS I'm a bit worried about the Sharpe ratio; it's uncomfortably high, so I'll be checking through to make sure I'm not doing a look-ahead somewhere.
GAT, in your system how would you allocate instrument weights if your position in one instrument was dependent on your position in another? For example butterflies, spreads, etc. You could create synthetic instruments, I suppose, but for something like a pairs trading strategy you would have far too many synthetic instruments (permutations) to optimise weights for. Would you trade them in a separate portfolio?
I would indeed create synthetic instruments. I've used this approach for things like relative value fixed income (synthetic instrument: yield curve steepness or curvature). But you're right, this system isn't ideal for something where there is a large number of possible permutations.

Perhaps the best solution is to have a "pre-filtering" stage. Let's assume we are running something like a simple stat-arb system where we only want pairs that are highly cointegrated. We first filter all possible pairs (that would be (N^2 - N) / 2 of them) and return only the K pairs that are highly cointegrated. We then create synthetic instruments with those pairs only. Everything else proceeds as normal. GAT
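The pre-filtering stage could be sketched like this. The scoring function is a stand-in (absolute correlation of random price series); a real system would use a cointegration test such as Engle-Granger, which the post leaves unspecified:

```python
# Sketch of the pre-filtering stage: enumerate all (N^2 - N) / 2
# pairs, score each one, keep only the top K as synthetic instruments.
from itertools import combinations
import numpy as np

rng = np.random.default_rng(1)
# random walks standing in for real price series
prices = {name: rng.standard_normal(250).cumsum()
          for name in ["A", "B", "C", "D", "E"]}

def score(x, y):
    # Stand-in for a cointegration statistic
    return abs(np.corrcoef(x, y)[0, 1])

pairs = list(combinations(prices, 2))
assert len(pairs) == (len(prices) ** 2 - len(prices)) // 2  # 10 pairs

K = 3
ranked = sorted(pairs, key=lambda p: score(prices[p[0]], prices[p[1]]),
                reverse=True)
selected = ranked[:K]  # these become the synthetic instruments
print(selected)
```

From here each selected pair gets its own synthetic price series (e.g. the spread between its two legs), and the rest of the system treats it like any other instrument.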