Perhaps I'm not understanding this correctly. On either approach, there are two values on the Excel spreadsheet that are coming from the future, but in "real time", we wouldn't have this information. And when these values are removed, the average line breaks down. I built some test code of this indicator on the side that "estimates" these values, and the average line works - but because of the estimation, it's not 100% objective.
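To illustrate what I mean with a generic example (this is just a plain centered SMA, not the actual spreadsheet indicator): a zero-lag average built by centering the window needs future bars, so its most recent values simply don't exist in real time.

Code:
# A centered SMA has zero lag but needs (n - 1) / 2 future bars,
# so the last few values are unknowable in real time.
centeredSMA <- function(x, n = 11) {
  as.numeric(stats::filter(x, rep(1 / n, n), sides = 2))  # sides = 2 centers the window
}

x <- cumsum(rnorm(100))
tail(centeredSMA(x), 6)   # trailing NAs: those slots need future data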
Panzerman, did you ever get anywhere further with the zero-lag filters? Of course, I'm still under the impression that true zero lag is impossible in real time, but they look incredible nonetheless...which has led to much research, coding, and testing in the hope of coming up with something as close to it as possible. Anyway, just seeing what all you've come up with, if anything. -Jeff
I do like Ehlers' supersmoother. It's supposed to be used in conjunction with his high-pass filter (he calls the combination the roofing filter), but I've never gotten it to work: no matter what time series I fed it, I only got zeros. (The likely culprit: passing a plain numeric vector instead of an xts object. stats::lag() on a plain vector only shifts the time attribute, so x - lag(x) comes out identically zero.) Cleaned-up R code is below.

Code:
library(xts)

roofingFilter <- function(x, period1 = 10, period2 = 48) {
  # x must be an xts object so lag() dispatches to lag.xts();
  # with a plain numeric vector, stats::lag() only shifts the tsp
  # attribute and x - lag(x) is all zeros
  stopifnot(is.xts(x))

  # high-pass filter
  alpha1 <- (cos(2 * pi / period2) + sin(2 * pi / period2) - 1) /
    cos(2 * pi / period2)
  hp <- (1 - alpha1 / 2) * (x - lag(x))
  hp <- hp[-1]
  # coerce to plain numeric so stats::filter doesn't try to re-index the xts
  hp <- stats::filter(as.numeric(hp), (1 - alpha1), method = "recursive")
  hp <- c(NA, hp)
  hp <- xts(hp, order.by = index(x))

  # super smoother (two-pole IIR smoother)
  a1 <- exp(-sqrt(2) * pi / period1)
  b1 <- 2 * a1 * cos(sqrt(2) * pi / period1)
  c2 <- b1
  c3 <- -a1 * a1
  c1 <- 1 - c2 - c3
  filt <- c1 * (hp + lag(hp)) / 2
  leadNAs <- sum(is.na(filt))
  filt <- filt[-c(1:leadNAs)]
  filt <- stats::filter(as.numeric(filt), c(c2, c3), method = "recursive")
  filt <- c(rep(NA, leadNAs), filt)
  xts(filt, order.by = index(x))
}
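If anyone wants to try it, here's a minimal test call on synthetic data (the dates and series are made up, just to show the required xts input):

Code:
library(xts)
set.seed(42)
prices <- xts(cumsum(rnorm(250)) + 100,
              order.by = seq(as.Date("2020-01-01"), by = "day", length.out = 250))
rf <- roofingFilter(prices)
tail(rf)  # non-zero filtered values once the input is a proper xts object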
How would I try this code within Excel? Another question: can this R code be converted to VBA in Excel? Thanks.
Ehlers' Supersmoother is quite nice, but it has a small amount of overshoot (estimation) to compensate for the lag, which in rare cases can cause false signals in certain systems. Still very useful, though. Even better (in my tests at least) is the Arnaud Legoux Moving Average (ALMA). At this time, it's likely the smoothest and most accurate moving average with the least amount of lag you'll find. Not sure if there is anything better out there.
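For reference, ALMA is just a Gaussian-weighted FIR filter. Here's a minimal R sketch of the standard window/offset/sigma formulation (TTR::ALMA implements the same idea if you'd rather not roll your own):

Code:
# Minimal ALMA sketch: a Gaussian-weighted moving average.
# offset in [0, 1] shifts the weight peak toward recent bars
# (less lag, more overshoot); sigma controls the window width.
alma <- function(x, n = 9, offset = 0.85, sigma = 6) {
  m <- offset * (n - 1)                 # position of the weight peak
  s <- n / sigma                        # Gaussian width
  w <- exp(-((0:(n - 1)) - m)^2 / (2 * s^2))
  w <- w / sum(w)                       # normalize to sum to 1
  # rolling weighted sum: w[n] weights the newest bar, w[1] the oldest
  as.numeric(stats::filter(x, rev(w), method = "convolution", sides = 1))
}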
I have been studying filters (MAs) for a long time, and my current conclusion is that they all end up acting just like the arithmetic S-I-M-P-L-E moving average, only with different coefficients. ALMA, as a finite impulse response (FIR) filter, acts much the same as the simple arithmetic one, with more complexity in its calculations. My rule of thumb with filters is to OVERFIT them to the SMA and then compare their graphical properties as estimators (a rough sketch of this fitting procedure is below). Things I do not like to see:

1) Overshooting: this displaces the squared or absolute deviations (if you write the proper algorithm to verify it) quite far from the impulse response during fast, extended price moves.

2) Statistical usage: the more traders and institutions use an indicator, the more it acts directly on the price move. Here the simple (arithmetic) moving average is KING, PERIOD! The trading consensus uses "standard" parameters (coefficient calculations) that plot exactly the same levels for a lot of traders, like the 200-day-candle SMA or the 52-week-candle SMA.

Here's an image of this proof of concept. You can clearly see that the ALMA (blue) is the most overshooting filter but also the smoothest, given its Gaussian weighting (the Ehlers SuperSmoother achieves a similar effect in the frequency domain, but as an IIR filter rather than a windowed FIR), while it is the arithmetic SMA (red) that responds fastest to price changes. The parameters in the image DO NOT MATTER; the point is the similar behaviour of the filters once fitted to each other with their own parameters.

For me, it is the adaptation of the coefficients that matters in a filter. We could use volume, volatility, sentiment, whatever we like, to dynamically adjust the parameters with a real-world quantitative statistic, and the arithmetic MA is the least tunable one IMO. Cheers!
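To make the overfitting idea concrete, here's a rough sketch of the procedure using TTR (the synthetic series and the candidate grid are arbitrary; the point is minimizing the deviation from the SMA):

Code:
# "Overfit" a filter to the SMA: grid-search ALMA window lengths for the
# one whose output tracks SMA(n) most closely by RMS deviation.
library(TTR)

set.seed(1)
x <- cumsum(rnorm(2000)) + 100     # synthetic price series
n <- 50                            # SMA period to match
target <- SMA(x, n)

rms <- function(a, b) sqrt(mean((a - b)^2, na.rm = TRUE))

candidates <- seq(n, 4 * n, by = 5)
errs <- sapply(candidates, function(k) rms(ALMA(x, n = k), target))
candidates[which.min(errs)]        # best-matching ALMA window for this series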
Nice work xelarjona. Looks like the SMA and EMA are quite similar in this test. The ALMA definitely has a bit of overshoot. I've never been a fan of WMA.
I'm not an expert on moving averages, but this discussion does remind me of some basics I read on the topic: https://alphaarchitect.com/2017/07/...leriy-zakamulin-moving-average-basics-part-1/ https://alphaarchitect.com/2017/07/...leriy-zakamulin-types-moving-averages-part-2/ After these two primers, the blog series continues with how to turn this into trading rules.
Actually, I made a mistake on the screenshot and plotted the SmoothedMA instead of the regular arithmetic one. Anyway, my point doesn't change. You can get pretty much the same plot as MA(n) with ALMA(size = n * 2.5, offset = 0.85, sigma = 6), where n = periods; same behaviour with the LinearWeighted one but using (n * 1.5). The only real difference is that ALMA smooths out drastic moves like gaps nicely, given its Gaussian filtering; I think this somewhat benefits standard-deviation bands (Bollinger's), since the standard deviation assumes a Gaussian distribution.

As a fun fact, you can achieve the same filtering output using an ordinary-least-squares fit as a moving average (LinRegMA) by playing with the offset along the line as the plotted point. If you use LinRegMA(n, offset = n/2) you get exactly the ArithmeticMA, and if you instead plot LinRegMA(n, offset = n/3) you end up with the LinearWeightedMA. So if you write an algorithm that uses the offset of the linreg as a dynamic "index", you can get fun results.

I'm currently experimenting with modified deviation functions, using different filters for the cumulative result of the errors and using them for bands. It gets quite interesting when plotting the linear (time-bar) weighted deviation (yellow) against the normal SD (red). Cheers.
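Here's a quick one-window check of those LinRegMA identities in plain R. (The exact offsets are (n-1)/2 and (n-1)/3, measured back from the newest bar; n/2 and n/3 above are the rounded versions.)

Code:
# The OLS line fitted to the last n bars, evaluated (n - 1) / 2 bars back
# from the newest bar, reproduces the SMA exactly; evaluated (n - 1) / 3
# bars back, it reproduces the linearly weighted MA (weights 1..n).
set.seed(7)
n <- 21
x <- cumsum(rnorm(n)) + 100
t <- seq_len(n)

fit <- lm(x ~ t)                   # OLS line over the window
linreg_at <- function(offset) unname(predict(fit, data.frame(t = n - offset)))

c(linreg_at((n - 1) / 2), mean(x))               # matches SMA(n)
c(linreg_at((n - 1) / 3), sum(t * x) / sum(t))   # matches WMA(n)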