Answer:
The transition matrix is
[tex]\begin{align}P&=\begin{pmatrix}0.2&0.8\\0.6&0.4\end{pmatrix}\end{align}[/tex]
so its entries are
[tex]\begin{align}P_{1,1}&=0.2\\P_{1,2}&=0.8\\P_{2,1}&=0.6\\P_{2,2}&=0.4\end{align}[/tex]
(1)
Since [tex]P_{1,1}=0.2[/tex], the probability that the system is in state 1 on the next observation is 0.2.
(2)
Since [tex]P_{1,2}=0.8>P_{1,1}=0.2[/tex], the system will most likely occupy state 2 on the next observation, because that transition has the higher probability.
(3)
To find the probabilities for the n-th observation, we raise the transition matrix to the power (n-1); for the third observation this means squaring P:
[tex]\begin{align}P^2&=\begin{pmatrix}0.2&0.8\\0.6&0.4\end{pmatrix}^2=\begin{pmatrix}0.52&0.48\\0.36&0.64\end{pmatrix}\end{align}[/tex]
So [tex]P^2_{1,1}=0.52[/tex], and the probability of being in state 1 on the third observation is 0.52.
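For example, the top-left entry of [tex]P^2[/tex] comes from multiplying row 1 of P into column 1 of P:
[tex]P^2_{1,1}=P_{1,1}P_{1,1}+P_{1,2}P_{2,1}=0.2\cdot 0.2+0.8\cdot 0.6=0.04+0.48=0.52[/tex]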
(4)
By the same reasoning as in (2), since [tex]P^2_{1,1}=0.52>P^2_{1,2}=0.48[/tex], the system will most likely occupy state 1 on the third observation.
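As a quick sanity check, here is a minimal Python sketch (assuming NumPy is available; the variable names are illustrative only) that reproduces all four results:

import numpy as np

# Transition matrix from the problem statement
P = np.array([[0.2, 0.8],
              [0.6, 0.4]])

# (1) Starting in state 1, probability of state 1 on the next observation
print(P[0, 0])               # 0.2

# (2) Most likely next state from state 1: argmax of row 1 (states are 1-indexed)
print(np.argmax(P[0]) + 1)   # 2

# (3) Two-step transition matrix P^2
P2 = np.linalg.matrix_power(P, 2)
print(P2[0, 0])              # 0.52 (up to floating-point rounding)

# (4) Most likely state on the third observation, starting from state 1
print(np.argmax(P2[0]) + 1)  # 1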