In a Markov chain, all states must be defined and a steady state must be reached in order to get meaningful results. It would be impossible to define all the states of the market, let alone define the model. One thing for sure is that price changes are normally distributed, and we often hear about reversion to the mean.
Brownian motion is a well-known Markov process. But why, then, is its volatility supposed to be non-Markovian? Hmm. I think volatility must be Markovian as well. https://www.britannica.com/science/probability-theory/Markovian-processes "Markovian processes: A stochastic process is called Markovian (after the Russian mathematician Andrey Andreyevich Markov) if at any time t the conditional probability of an arbitrary future event given the entire past of the process — i.e., given X(s) for all s ≤ t — equals the conditional probability of that future event given only X(t). Thus, in order to make a probabilistic statement about the future behaviour of a Markov process, it is no more helpful to know the entire history of the process than it is to know only its current state. The conditional distribution of X(t+h) given X(t) is called the transition probability of the process. If this conditional distribution does not depend on t, the process is said to have “stationary” transition probabilities. [...]"
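For illustration, here is a minimal sketch in plain Python/NumPy of why Brownian motion is Markovian: each new value is just the current value plus an independent Gaussian increment, so the earlier history never enters the update. The parameter names and values (n_steps, dt, sigma) are assumptions for the example, not anything from this thread.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

n_steps = 1_000
dt = 1.0 / 252          # one trading day, in years (assumed)
sigma = 0.2             # annualized volatility (assumed for illustration)

x = np.zeros(n_steps + 1)
for t in range(n_steps):
    # X(t+dt) = X(t) + N(0, sigma^2 * dt): only X(t) enters the update,
    # never X(s) for s < t — that is the Markov property in action.
    x[t + 1] = x[t] + sigma * np.sqrt(dt) * rng.standard_normal()

print(x[-1])
```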
Bug in the ET software: sometimes you see blank posts from people you have on ignore/block, especially when they are the last poster in a thread.
It's today's close vs yesterday's close for each observation. The red line is the 1-day change, whereas the blue line is a 10-day average.
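A minimal sketch of how those two series could be computed, assuming a pandas DataFrame with a 'close' column and assuming the blue line is a 10-day moving average of the daily change; the column names and dummy prices below are purely illustrative:

```python
import pandas as pd

# Dummy close prices on business days, just to make the example runnable.
df = pd.DataFrame(
    {"close": [100.0, 101.5, 100.8, 102.2, 101.9, 103.0,
               102.4, 104.1, 103.6, 105.0, 104.2, 106.3]},
    index=pd.date_range("2024-01-01", periods=12, freq="B"),
)

# Red line: 1-day change, today's close minus yesterday's close.
df["chg_1d"] = df["close"].diff()

# Blue line: 10-day average of those daily changes (assumed interpretation).
df["chg_10d_avg"] = df["chg_1d"].rolling(window=10).mean()

print(df.tail())
```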
Markovian processes need only the present to forecast the future; non-Markovian processes require the entire history of the process.
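Stated formally, in the notation of the Britannica quote above, the Markov property is:

```latex
% Markov property: conditioning on the whole past is no better than
% conditioning on the present value alone.
P\bigl(X(t+h) \in A \mid X(s),\ s \le t\bigr)
  = P\bigl(X(t+h) \in A \mid X(t)\bigr), \qquad h > 0 .
```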