Dear all, I have an idea for a dispersion strategy to trade options. I am thinking of calculating the intraday realized volatility from the last 5 bars, updated every 30 seconds, and comparing it to the implied volatility from either the 3-month 25-delta (call & put average) or the near-month ATM (call & put average), also sampled every 30 seconds. Which do you think is better for gauging the richness or cheapness of the options: the 3-month 25-delta (call & put average) or the near-month ATM (call & put average)?
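For what it's worth, here is a minimal Python sketch of the realized-vol leg I have in mind. The annualization factor (252 trading days of 6.5-hour sessions) and the sample closes are my own assumptions, not market data:

```python
import math

# Assumed annualization: 252 trading days x 6.5-hour sessions.
SECONDS_PER_YEAR = 252 * 6.5 * 3600

def realized_vol(closes, bar_seconds=30):
    """Annualized realized vol from log returns of consecutive bar closes."""
    rets = [math.log(b / a) for a, b in zip(closes, closes[1:])]
    var_per_bar = sum(r * r for r in rets) / len(rets)
    return math.sqrt(var_per_bar * SECONDS_PER_YEAR / bar_seconds)

# Last 5 closes of 30-second index bars (made-up numbers).
rv = realized_vol([4000.0, 4001.2, 3999.8, 4000.5, 4002.1])
print(round(rv, 4))
```

So every 30 seconds I would recompute this from the latest 5 bars and compare it to the option-implied number.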

Sorry, the underlying is an index. Why do you think it won't work? Any reasons, or is it just from your experience?

You won't be able to observe things moving this fast, IMHO... BTW, I also don't quite understand what this has to do with dispersion.

This is NOT dispersion. Dispersion is when you trade the volatility of the index against the volatility of the components. Comparing realized volatility over the past 2.5 minutes to implied doesn't sound like a viable strategy. The slippage would kill ya.

I am with MTE here: dispersion is trading options on the index vs options on the components. The bid-offer spread in all of the calculations you want to do would make your data inaccurate.

Not just inaccurate, but meaningless. The bid/ask spread, if you converted it into IV, would typically be 5-10% of the total IV. What could you possibly see in 5 samples over 2.5 minutes of statistical volatility that would give you statistically significant confidence that the rest of the market's estimate of IV is off by 10% over the life of the option? This strategy is really a non-starter. (I wonder if you could do volatility "mean reversion": if the statistical volatility has been extremely low for the last n bars, you buy the next breakout up or down.)
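A rough sketch of that squeeze idea, purely illustrative. The window length, quantile, and the synthetic price series are all made up; this only flags when current short-term vol sits in the bottom of its own recent range, it doesn't trade the breakout:

```python
import statistics

def rolling_vol(closes, n):
    """Population std dev of the last n simple returns of `closes`."""
    rets = [(b - a) / a for a, b in zip(closes, closes[1:])]
    return statistics.pstdev(rets[-n:])

def in_squeeze(closes, n=5, quantile=0.2):
    """True if the current n-bar vol is in the lowest quantile of its history."""
    vols = [rolling_vol(closes[: i + 1], n) for i in range(n, len(closes))]
    threshold = sorted(vols)[int(len(vols) * quantile)]
    return vols[-1] <= threshold

# Made-up series: 40 choppy bars followed by a quiet, flat tail.
noisy = [100.0 + (1.0 if i % 2 else -1.0) for i in range(40)]
closes = noisy + [100.0] * 10
print(in_squeeze(closes))
```

Whether the breakout after such a squeeze is tradable at all is exactly the kind of thing you'd have to test against data.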

I have found that whenever I have an idea, it is best to test it rather than ask. Get the data, roll up your sleeves, and prove or disprove it to yourself. Just be ready for an untold sacrifice of your life, and I am assuming you can already program.