Smoothing price data prior to use as input?

Discussion in 'Technical Analysis' started by Aston01, Dec 15, 2015.

  1. Aston01

    Aston01

    Most price-based indicators I am aware of use some variation of OHLC bar values as the basis for their calculations. From that point, the effectiveness of the tool can vary notably depending on the smoothness of the price data in the market. With a really clear price trend, things look fantastic, but throw in a few spikes or errant bars and it's inevitably reflected... "trash in, trash out."

    That being said, smoothing the price data prior to input via different methods (i.e. averaging different combinations of High, Low, and Close) can yield subtle improvements in some scenarios and just add lag in others.

    My question for you out-of-the-box thinkers... Is there a better way than just the basics mentioned above?
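    For reference, "averaging different combinations of High, Low, Close" usually means composite bar prices such as the typical price or the weighted close. A minimal sketch (the function names are my own, not from the thread):

```python
def typical_price(high, low, close):
    """Equal-weight average of High, Low, and Close: (H + L + C) / 3."""
    return (high + low + close) / 3.0

def weighted_close(high, low, close):
    """Close-weighted average: (H + L + 2C) / 4, emphasizing the close."""
    return (high + low + 2.0 * close) / 4.0

# Example bar: H=10, L=6, C=8
print(typical_price(10.0, 6.0, 8.0))   # (10 + 6 + 8) / 3  -> 8.0
print(weighted_close(10.0, 6.0, 8.0))  # (10 + 6 + 16) / 4 -> 8.0
```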
     
  2. 2rosy

    2rosy

    If you have a lot of spikes you can't really scale using the mean. You could use the median and percentiles instead.
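    A rolling median is one concrete way to do this: unlike a moving average, a single bad print barely moves it. A minimal sketch; the window width of 5 is an arbitrary assumption:

```python
from statistics import median

def rolling_median(prices, window=5):
    """Spike-resistant smoothing of 'prices' via a trailing median.

    Early outputs use whatever partial window is available so far,
    so the result has the same length as the input.
    """
    out = []
    for i in range(len(prices)):
        lo = max(0, i - window + 1)
        out.append(median(prices[lo:i + 1]))
    return out

# A lone spike at index 3 barely moves the median:
prices = [100.0, 100.5, 101.0, 150.0, 101.5, 102.0, 102.5]
print(rolling_median(prices))
```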
     
  3. dom993

    dom993

    The OHLC values of a bar are four samples of the transaction data stream. By the time you are sampling, the filtering of high frequencies should already have been done. It is a problem of bar creation that cannot be fixed after the fact.
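    dom993's point, sketched: low-pass filter the raw transaction stream first, then sample bars from the filtered stream. Everything below (an EMA pre-filter, the alpha value, fixed tick-count bars) is an illustrative assumption, not his specific method:

```python
def ema_ticks(ticks, alpha=0.1):
    """Exponential smoothing applied to the raw tick stream,
    before any bar sampling. alpha is an assumed smoothing constant."""
    out = [ticks[0]]
    for t in ticks[1:]:
        out.append(out[-1] + alpha * (t - out[-1]))
    return out

def bars_from(stream, n):
    """Sample OHLC tuples of n ticks each from a (filtered) stream."""
    bars = []
    for i in range(0, len(stream) - n + 1, n):
        chunk = stream[i:i + n]
        bars.append((chunk[0], max(chunk), min(chunk), chunk[-1]))
    return bars

# Filter first, then build bars -- the reverse cannot be undone later.
smoothed = ema_ticks([100.0, 100.1, 150.0, 100.2, 100.3, 100.1])
print(bars_from(smoothed, 3))
```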
     
  4. kut2k2

    kut2k2

    Smoothing price data prior to use as input is a very good idea if you're inputting to a trend indicator. The key is to use a fast smoother to minimize lag. None of that 14-bar nonsense. I use a proprietary adaptive moving average which works really well with a lookback period of 5. I apply it only to the closing prices. You might try Kaufman's AMA, which adapts via his efficiency ratio. Another choice might be the DMI. The gold standard of adaptive smoothing is probably the Jurik moving average, but that will cost you a bit of money. Good luck.
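    For readers unfamiliar with it, here is a sketch of Kaufman's adaptive moving average (KAMA) in its standard textbook form. The efficiency ratio (net move over total movement) steers the smoothing constant between a fast and a slow EMA. The lookback of 5 mirrors the post above; this is not kut2k2's proprietary AMA:

```python
def kama(prices, er_period=5, fast=2, slow=30):
    """Kaufman adaptive moving average (standard formulation, sketch).

    er_period=5 mirrors the short lookback mentioned in the thread;
    fast=2 and slow=30 are the conventional EMA period constants.
    """
    fast_sc = 2.0 / (fast + 1)
    slow_sc = 2.0 / (slow + 1)
    out = [prices[0]]
    for i in range(1, len(prices)):
        if i < er_period:
            out.append(prices[i])  # warm-up: not enough history yet
            continue
        # Efficiency ratio: net change divided by the sum of bar-to-bar moves.
        change = abs(prices[i] - prices[i - er_period])
        volatility = sum(abs(prices[j] - prices[j - 1])
                         for j in range(i - er_period + 1, i + 1))
        er = change / volatility if volatility else 0.0
        # Squash ER into a smoothing constant between slow and fast EMAs.
        sc = (er * (fast_sc - slow_sc) + slow_sc) ** 2
        out.append(out[-1] + sc * (prices[i] - out[-1]))
    return out
```

    In a clean trend ER approaches 1 and the smoother speeds up; in chop ER approaches 0 and it nearly stops moving, which is the adaptive behavior the post describes.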
     
  5. Aston01

    Aston01

    I actually have and like the Jurik MA, but even it is affected, for better or worse, by the quality of the input data.
     
  6. kut2k2

    kut2k2

    Not sure what you mean. By quality do you mean the choppiness of the data? And what are you inputting to, if I may ask?
     
  7. IAS_LLC

    IAS_LLC

    Maybe try a Kalman filter?
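    A minimal one-dimensional Kalman filter with a random-walk price model, for concreteness. The two variance parameters are assumed tuning knobs: a small process variance trusts the model (more smoothing, more lag), a small measurement variance trusts the raw prints:

```python
def kalman_smooth(prices, process_var=1e-3, meas_var=1.0):
    """1-D Kalman filter assuming the true price is a random walk.

    process_var: variance of the hidden price's step-to-step drift.
    meas_var: variance of the observation noise (spikes, bad prints).
    """
    x = prices[0]   # state estimate
    p = 1.0         # estimate variance (assumed initial uncertainty)
    out = [x]
    for z in prices[1:]:
        p += process_var           # predict: random walk adds uncertainty
        k = p / (p + meas_var)     # Kalman gain
        x += k * (z - x)           # update toward the new observation
        p *= (1.0 - k)
        out.append(x)
    return out

# A single 150.0 spike is pulled well back toward the surrounding prices:
print(kalman_smooth([100.0, 100.0, 150.0, 100.0, 100.0]))
```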
     
  8. Trader13

    Trader13

    The closing price is commonly used because it is the price at the end of a time period at which a trader can actually execute a trade. You cannot trade some average value of a price bar. So if you want to model what can actually be traded, you may do better sticking with the close.
     
  9. %%%%%%%%%%%%%%%%
    He means [about 21 days ago] that average price data [all data, years of data] is full of errors; that is why you should, for example, study SPY + DIA as much as you can. You may be surprised how often you can catch data errors. Paper charts, hand-drawn charts, or both are best; fewer errors.

    PS Aston: "spikes + errors" are part of trading; a true move [closing price] is validated by volume + time, eventually, LOL. Wisdom is profitable to direct... Good question.