Let's start with an example using NIFTY. Tomorrow, NIFTY can be in one of three states -
It can end in the green.
It can end in the red.
It can stay in the same place.
Note - On any given day, NIFTY will be in one of these three states. As per the theory of Markov chains, we assume that tomorrow's state of NIFTY depends only on today's state.
Likewise, today's state depends only on yesterday's state, and so on!
In other words - if you know the state of NIFTY today, you can estimate the probability of each possible state of NIFTY tomorrow.
This diagram shows an example of such a Markov chain with assumed probability values.
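The idea above can be sketched in code. The transition probabilities below are purely illustrative placeholders (not estimated from real NIFTY data): each row of the matrix gives the chances of moving from today's state to each possible state tomorrow, and multiplying a one-hot "today" vector by the matrix gives tomorrow's distribution.

```python
import numpy as np

# Three possible daily states for NIFTY
states = ["green", "red", "flat"]

# Hypothetical transition matrix: row i holds the probabilities of
# going from state i today to each state tomorrow (each row sums to 1).
# These numbers are assumptions for illustration only.
P = np.array([
    [0.5, 0.3, 0.2],   # today green -> tomorrow green / red / flat
    [0.3, 0.5, 0.2],   # today red
    [0.4, 0.4, 0.2],   # today flat
])

# If today closed green, tomorrow's distribution is the "green" row of P.
today = np.array([1.0, 0.0, 0.0])      # one-hot vector: green
tomorrow = today @ P
print(dict(zip(states, tomorrow)))     # {'green': 0.5, 'red': 0.3, 'flat': 0.2}

# Two days ahead: multiply by P again. The Markov property means the
# day after tomorrow depends on the past only through tomorrow's state.
day_after = today @ P @ P
print(dict(zip(states, day_after)))    # {'green': 0.42, 'red': 0.38, 'flat': 0.2}
```

Note how the two-day-ahead forecast needs nothing beyond the matrix and today's state - that is the "memoryless" property the text describes.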
Get trades instantly and discuss them with me on my Slack channel at unofficed.com/chat/
Our community is full of traders who actually trade!
Telegram: t.me/unofficed
Disclaimer
This information and these posts are not intended as financial, investment, trading, or any other kind of advice or recommendation, and are not provided or endorsed by TradingView. See the Terms of Use for details.
