The following is the transition probability matrix of a Markov chain with states 1, 2, 3, 4:

$$
P = \begin{pmatrix}
\cdot & \cdot & \cdot & \cdot \\
\cdot & \cdot & \cdot & \cdot \\
.25 & .25 & .5 & 0 \\
\cdot & \cdot & \cdot & \cdot
\end{pmatrix}
$$

If $X_0 = 1$,
(a) find the probability that state 3 is entered before state 4;
(b) find the mean number of transitions until either state 3 or state 4 is entered.
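Since only one row of $P$ is legible above, here is a sketch of the standard first-step analysis, written symbolically in terms of the entries $P_{ij}$ (no specific numerical values are assumed). Treat states 3 and 4 as absorbing, let $h_i$ be the probability that state 3 is entered before state 4 starting from state $i$, and let $m_i$ be the mean number of transitions until either state 3 or state 4 is entered starting from state $i$. Conditioning on the first transition gives

$$
h_i = P_{i3} + P_{i1}\,h_1 + P_{i2}\,h_2, \qquad i = 1, 2,
$$

$$
m_i = 1 + P_{i1}\,m_1 + P_{i2}\,m_2, \qquad i = 1, 2,
$$

with the boundary conditions $h_3 = 1$, $h_4 = 0$ and $m_3 = m_4 = 0$. Each set of equations is a $2 \times 2$ linear system, in $(h_1, h_2)$ and $(m_1, m_2)$ respectively; with $X_0 = 1$, the answer to part (a) is $h_1$ and the answer to part (b) is $m_1$.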