Example of Markov Model
[Diagram: two-state Markov chain ‘Rain’ ⇄ ‘Dry’, with self-loops 0.3 (Rain) and 0.8 (Dry) and transitions 0.7 (Rain→Dry) and 0.2 (Dry→Rain).]
• Two states : ‘Rain’ and ‘Dry’.
• Transition probabilities: P(‘Rain’|‘Rain’)=0.3 ,
P(‘Dry’|‘Rain’)=0.7 , P(‘Rain’|‘Dry’)=0.2, P(‘Dry’|‘Dry’)=0.8
• Initial probabilities: say P(‘Rain’)=0.4 , P(‘Dry’)=0.6 .
Calculation of sequence probability
• By the Markov chain property, the probability of a state sequence can be
found by the formula:
P(s1, s2, …, sT) = P(sT|sT-1) … P(s2|s1) P(s1)
• Suppose we want to calculate the probability of the state sequence
{‘Dry’,‘Dry’,‘Rain’,‘Rain’} in our example:
P({‘Dry’,‘Dry’,‘Rain’,‘Rain’}) =
P(‘Rain’|‘Rain’) P(‘Rain’|‘Dry’) P(‘Dry’|‘Dry’) P(‘Dry’)
= 0.3*0.2*0.8*0.6 = 0.0288
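The chain-rule computation above can be sketched in Python; the dictionary-based parameterization and function name are illustrative choices, not from the slides:

```python
# Two-state Markov chain from the slides: 'Rain' and 'Dry'.
init = {'Rain': 0.4, 'Dry': 0.6}                       # initial probabilities
trans = {('Rain', 'Rain'): 0.3, ('Rain', 'Dry'): 0.7,  # P(next | current)
         ('Dry', 'Rain'): 0.2, ('Dry', 'Dry'): 0.8}

def sequence_probability(states):
    """P(s1, ..., sT) = P(s1) * product of P(s_i | s_{i-1})."""
    p = init[states[0]]
    for prev, cur in zip(states, states[1:]):
        p *= trans[(prev, cur)]
    return p

print(sequence_probability(['Dry', 'Dry', 'Rain', 'Rain']))  # ≈ 0.0288
```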
Example of Hidden Markov Model
[Diagram: two hidden states ‘Low’ ⇄ ‘High’, with self-loops 0.3 (Low) and 0.8 (High) and transitions 0.7 (Low→High) and 0.2 (High→Low); emission arrows to the observations ‘Rain’ and ‘Dry’ with probabilities 0.6/0.4 from ‘Low’ and 0.4/0.6 from ‘High’.]
Example of Hidden Markov Model
• Two states : ‘Low’ and ‘High’ atmospheric pressure.
• Two observations : ‘Rain’ and ‘Dry’.
• Transition probabilities: P(‘Low’|‘Low’)=0.3 ,
P(‘High’|‘Low’)=0.7 , P(‘Low’|‘High’)=0.2,
P(‘High’|‘High’)=0.8
• Observation probabilities : P(‘Rain’|‘Low’)=0.6 ,
P(‘Dry’|‘Low’)=0.4 , P(‘Rain’|‘High’)=0.4 ,
P(‘Dry’|‘High’)=0.6 .
• Initial probabilities: say P(‘Low’)=0.4 , P(‘High’)=0.6 .
Calculation of observation sequence probability
• Suppose we want to calculate the probability of the observation
sequence {‘Dry’,‘Rain’} in our example.
• Consider all possible hidden state sequences:
P({‘Dry’,’Rain’} ) = P({‘Dry’,’Rain’} , {‘Low’,’Low’}) +
P({‘Dry’,’Rain’} , {‘Low’,’High’}) + P({‘Dry’,’Rain’} ,
{‘High’,’Low’}) + P({‘Dry’,’Rain’} , {‘High’,’High’})
where the first term is:
P({‘Dry’,‘Rain’} , {‘Low’,‘Low’}) =
P({‘Dry’,‘Rain’} | {‘Low’,‘Low’}) P({‘Low’,‘Low’}) =
P(‘Dry’|‘Low’) P(‘Rain’|‘Low’) P(‘Low’) P(‘Low’|‘Low’)
= 0.4*0.6*0.4*0.3 = 0.0288
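The full sum over all four hidden state sequences can be computed by brute-force enumeration. A minimal sketch (the function name and dictionary layout are illustrative):

```python
from itertools import product

# HMM from the slides: hidden states 'Low'/'High', observations 'Rain'/'Dry'.
init = {'Low': 0.4, 'High': 0.6}
trans = {('Low', 'Low'): 0.3, ('Low', 'High'): 0.7,
         ('High', 'Low'): 0.2, ('High', 'High'): 0.8}
emit = {('Low', 'Rain'): 0.6, ('Low', 'Dry'): 0.4,
        ('High', 'Rain'): 0.4, ('High', 'Dry'): 0.6}

def observation_probability(obs):
    """Sum the joint probability P(obs, states) over every hidden sequence."""
    total = 0.0
    for states in product(init, repeat=len(obs)):
        p = init[states[0]] * emit[(states[0], obs[0])]
        for t in range(1, len(obs)):
            p *= trans[(states[t - 1], states[t])] * emit[(states[t], obs[t])]
        total += p
    return total

print(observation_probability(['Dry', 'Rain']))  # ≈ 0.232
```

Each term of the loop matches one term of the expansion above; for example the {‘Low’,‘Low’} sequence contributes 0.4 * 0.4 * 0.3 * 0.6 = 0.0288.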
Exercise: character recognition with HMM(2)
• Suppose that after character image segmentation the following
sequence of island numbers in 4 slices was observed:
{ 1, 3, 2, 1}
• Which HMM is more likely to have generated this observation
sequence: the HMM for ‘A’ or the HMM for ‘B’?
Exercise: character recognition with HMM(3)
Consider the likelihood of generating the given observation sequence under each
possible sequence of hidden states:
• HMM for character ‘A’:
Hidden state sequence   Transition probabilities   Observation probabilities
s1→s1→s2→s3             .8 * .2 * .2               .9 * 0 * .8 * .9    = 0
s1→s2→s2→s3             .2 * .8 * .2               .9 * .1 * .8 * .9   = 0.0020736
s1→s2→s3→s3             .2 * .2 * 1                .9 * .1 * .1 * .9   = 0.000324
Total = 0.0023976
• HMM for character ‘B’:
Hidden state sequence   Transition probabilities   Observation probabilities
s1→s1→s2→s3             .8 * .2 * .2               .9 * 0 * .2 * .6    = 0
s1→s2→s2→s3             .2 * .8 * .2               .9 * .8 * .2 * .6   = 0.0027648
s1→s2→s3→s3             .2 * .2 * 1                .9 * .8 * .4 * .6   = 0.006912
Total = 0.0096768
Since 0.0096768 > 0.0023976, the HMM for ‘B’ is more likely to have generated { 1, 3, 2, 1}.
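Enumerating all hidden state sequences, as in the tables above, grows exponentially with sequence length. The standard forward algorithm computes the same observation likelihood in O(T·N²) time. A minimal sketch, reusing the ‘Low’/‘High’ weather HMM from the earlier slides rather than the exercise's ‘A’/‘B’ models (whose parameters are only partially listed here):

```python
# 'Low'/'High' weather HMM parameters from the earlier slides.
init = {'Low': 0.4, 'High': 0.6}
trans = {('Low', 'Low'): 0.3, ('Low', 'High'): 0.7,
         ('High', 'Low'): 0.2, ('High', 'High'): 0.8}
emit = {('Low', 'Rain'): 0.6, ('Low', 'Dry'): 0.4,
        ('High', 'Rain'): 0.4, ('High', 'Dry'): 0.6}

def forward_probability(obs):
    """Forward algorithm: alpha_t(s) = P(o1..ot, state_t = s)."""
    # Initialization: alpha_1(s) = P(s) * P(o1 | s)
    alpha = {s: init[s] * emit[(s, obs[0])] for s in init}
    # Induction: alpha_t(s) = [sum_r alpha_{t-1}(r) * P(s|r)] * P(ot | s)
    for o in obs[1:]:
        alpha = {s: sum(alpha[r] * trans[(r, s)] for r in alpha) * emit[(s, o)]
                 for s in init}
    # Termination: P(obs) = sum_s alpha_T(s)
    return sum(alpha.values())

print(forward_probability(['Dry', 'Rain']))  # ≈ 0.232, matching the sum over all hidden sequences
```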