In the exponential moving average, there's this curious factor: 1 - 2/(n+1). Since any positive factor less than 1 would do, why this particular factor?

Thanks kut2k2. I suspected that was the appropriate reason. It's simple, but lacks justification (in my opinion). It'd be nice to provide some "best" alpha-factor ... "best", in some sense. Hopefully, I'll be able to do that (if I live long enuff)

Smoothed versions of price (for example, moving averages) can be used in a bunch of different ways to build rule-based algorithmic trading systems. Some of my favorites include the following.

(Notation: S[Today] == the smoothed version of price, today. S[Yesterday] == yesterday's reading of the smoothed version of price. S[2 days ago] == ... you get the picture ...)

1. Go long when both (S[Today] > S[Yesterday]) and (S[Yesterday] > S[2 days ago]) are true. Reverse the comparisons for the short entry criterion.
2. Go long when (Today's Close > S[Today]) and (Yesterday's Close > S[Yesterday]) and (2 days ago's Close > S[2 days ago]).
3. Have two smoothings, S1 and S2, where S1 is more heavily smoothed than S2. (In moving-average terms, S1 uses a larger number of days in the MA than S2.) Go long when S2 > S1.
4. Have two smoothings, S1 and S2, as above. Go long when (S1[Today] > S1[Yesterday]) and (S2[Today] > S2[Yesterday]).

There are lots more ways to smooth prices besides moving averages. One that very, very few people use, because it is not a built-in primitive in their software testing tools, is the median. A moving average is the "mean" of the last N days; why not look at the median instead of the mean? Hmmm. (By the way, median smoothing is called "median filtering" in engineering and is used quite often; for example, it's in the read electronics of your hard disk drive.)

Another smoothing method is linear regression. This is a primitive in most software testing tools, so you can explore it at your leisure. Chande compared LR to MA and found that he liked LR better. (Your opinion may vary.)

Another smoothing method is polynomial regression (also called polynomial fitting). If a line is good, maybe a quadratic is better? Some people think so.

And finally there are digital filters. John Ehlers's "Rocket Science" books cover these in detail.
Just remember that they all try to accomplish the same basic goal: smooth out the wiggles and jiggles in the price data.
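To make the comparison concrete, here is a minimal sketch of three of the smoothers mentioned above: the simple moving average (the "mean"), the median filter, and the EMA with the usual 1 - 2/(n+1) factor. The window length n and the price lists are arbitrary choices for illustration.

```python
def sma(prices, n):
    """Simple moving average: mean of the last n prices."""
    return [sum(prices[i - n + 1:i + 1]) / n for i in range(n - 1, len(prices))]

def median_smooth(prices, n):
    """Median filter: median of the last n prices."""
    out = []
    for i in range(n - 1, len(prices)):
        window = sorted(prices[i - n + 1:i + 1])
        m = len(window) // 2
        # even-length windows average the two middle values
        out.append(window[m] if len(window) % 2 else (window[m - 1] + window[m]) / 2)
    return out

def ema(prices, n):
    """EMA with the customary factor alpha = 1 - 2/(n+1) on the old value."""
    alpha = 1 - 2 / (n + 1)
    out = [prices[0]]          # seed with the first price
    for p in prices[1:]:
        out.append(alpha * out[-1] + (1 - alpha) * p)
    return out
```

Note how the median filter shrugs off a single outlier bar that would drag a mean-based smoother around, which is exactly why it shows up in hard-disk read electronics.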

horribilicus: Actually, I was particularly interested in the question: in the prescription for the Exponential Moving Average, namely: EMA(N) = α EMA(N-1) + (1 - α) P(N), everybuddy (for reasons unknown to me) seems to use α = 1 - 2/(N+1). My interest was not because it provided smooooothing ... but WHY THIS α? Curious, eh what? P.S. I'm still workin' on it
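One candidate justification (my own illustration, following the "match the Lag of an N day simple MA" idea mentioned below): a filter's lag can be taken as its weighted mean delay, sum over k of k*w[k]. An N-day SMA puts weight 1/N on delays 0..N-1, giving lag (N-1)/2; an EMA with weight a on the newest price puts weight a(1-a)^k on delay k, giving lag (1-a)/a. Setting these equal yields a = 2/(N+1), i.e. α = 1 - 2/(N+1). A quick numerical check:

```python
def sma_lag(n):
    # SMA weights are 1/n on delays 0 .. n-1, so lag = (n - 1) / 2
    return sum(k / n for k in range(n))

def ema_lag(n, terms=10000):
    # EMA weight on delay k is a * (1 - a)**k; truncate the infinite sum
    a = 2 / (n + 1)                 # weight on the newest price, 1 - alpha
    return sum(k * a * (1 - a) ** k for k in range(terms))

for n in (5, 10, 21):
    print(n, sma_lag(n), round(ema_lag(n), 6))
```

For each n the two lags agree, which is at least one sense in which this α is "best": it is the EMA whose average delay matches an N-day MA's.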

If you want more things to be confused about, observe that a low-pass filter (an electronic smoothing circuit) built with a single resistor and a single capacitor has the exact same output response as an EMA. The way engineers describe the low-pass filter is to talk about its "RC time constant," in units of time (days, in this case). The RC time constant of an electronic low-pass filter whose output perfectly matches the output of an EMA with smoothing constant A is given by:

RC time constant = 1/A

Now you have two mutually inconsistent definitions:

(1) EMA "#days" to match the Lag of an N-day simple MA filter: EMA #days = ((2/A) - 1) days

(2) RC time constant of the filter that perfectly matches an EMA with smoothing constant A: time constant = (1/A) days

Sleep tight!
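The RC correspondence can be seen directly in the update equation (a sketch of my own, assuming a one-day Euler discretization of the RC circuit): the discretized low-pass does y += (T/RC)(x - y), which is algebraically the same step as an EMA with weight A = T/RC on the newest sample, hence RC = 1/A days.

```python
def rc_lowpass(xs, rc, t=1.0):
    """RC low-pass discretized with an Euler step of size t (1 day here)."""
    y = xs[0]
    out = [y]
    for x in xs[1:]:
        y += (t / rc) * (x - y)   # the RC update step
        out.append(y)
    return out

def ema_filter(xs, a):
    """EMA with weight a on the newest sample (a = 1 - alpha)."""
    y = xs[0]
    out = [y]
    for x in xs[1:]:
        y = (1 - a) * y + a * x
        out.append(y)
    return out

prices = [100, 102, 101, 105, 104, 103]
a = 0.25
rc_out = rc_lowpass(prices, rc=1 / a)   # RC time constant = 1/A = 4 days
ema_out = ema_filter(prices, a)
```

The two output lists coincide, so definition (2) above is just the engineer's name for the same recursion the trader calls an EMA.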