How to check whether a Markov chain is irreducible?

Hello,
I want to check in MATLAB whether a Markov chain is irreducible or not. I found some instructions on the MathWorks site saying:
tf1 = isreducible(mc1) % returns true if the discrete-time Markov chain mc1 is reducible and false otherwise
but that alone does not seem to be enough. Then I came across a part saying that the object should first be defined as a Markov chain: dtmc mc1
But it still gives errors. Where can I find simple information/instructions about this topic? I am relatively new to MATLAB, so I do not know the right keywords. Does somebody have some advice for me?
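For reference, a minimal sketch of the two-step workflow being described (build the dtmc object from a transition matrix, then test it). The matrix P below is a made-up example, not from the question, and dtmc/isreducible require the Econometrics Toolbox:

```matlab
% Define a transition matrix (each row must sum to 1).
% This example is deliberately reducible: state 3 is absorbing and
% unreachable from states 1 and 2.
P = [0.5 0.5 0;
     0.2 0.8 0;
     0   0   1];
mc1 = dtmc(P);           % create the Markov chain object first
tf1 = isreducible(mc1)   % true for this P: {1,2} and {3} do not communicate
```

The common error in the question comes from calling `dtmc mc1` as a command; `dtmc` is a constructor that takes a transition matrix and returns an object, as above.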
Thank you in advance
  2 Comments
Clarisha Nijman on 23 Oct 2018
That is the same document on MathWorks I referred to. This document wants me to define mc as a dtmc object, and the link on the page says a transition matrix is needed to do that. But is this the only way? Is it possible to give a series as input, and then check whether the series is irreducible or a proper Markov chain?
Torsten on 23 Oct 2018
Edited: Torsten on 23 Oct 2018
A Markov chain is defined by its transition matrix. If you know how to transform your "series" (I don't know exactly what this series actually is) into a transition matrix, you are done.
Best wishes
Torsten.
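One way Torsten's suggestion could be sketched: estimate the transition matrix from an observed state sequence by counting one-step transitions and normalizing each row. The sequence seq below is hypothetical example data, assuming states are coded as integers 1..nStates and every state appears as a source at least once (otherwise a row of zeros would normalize to NaN):

```matlab
% Hypothetical observed state sequence, coded as integers 1..nStates
seq = [1 2 2 1 3 2 1 1 2 3 3 1];
nStates = max(seq);

% Count one-step transitions seq(k) -> seq(k+1)
C = zeros(nStates);
for k = 1:numel(seq)-1
    C(seq(k), seq(k+1)) = C(seq(k), seq(k+1)) + 1;
end

% Row-normalize the counts to get transition probabilities
P = C ./ sum(C, 2);      % implicit expansion, R2016b or later

mc = dtmc(P);            % Econometrics Toolbox
tf = isreducible(mc)
```

This is only a maximum-likelihood count estimate; with short sequences some transitions may never be observed, so the estimated chain can come out reducible even when the true one is not.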


Accepted Answer

Torsten on 23 Oct 2018
https://de.mathworks.com/help/econ/dtmc.isreducible.html
  1 Comment
Clarisha Nijman on 23 Oct 2018
Thanks a lot, Torsten!
So from the series (the sequence of states the Markov chain visited after n transitions), the transition probability matrix can be composed, and then it can be checked whether the Markov chain is irreducible or not.


More Answers (0)
