how does expectation maximization work?

I’m reading a tutorial on expectation maximization which gives an example of a coin-flipping experiment. Could you please help me understand where the probabilities in step 2 of the process (i.e. in the middle of part b in the illustration below) come from? Thank you.

Expectation maximization

  1. EM starts with an initial guess of the parameters.
  2. In the E-step, a probability distribution over possible completions is computed using the current parameters. The counts shown in the table are the expected numbers of heads and tails according to this distribution.
  3. In the M-step, new parameters are determined using the current completions.
  4. After several repetitions of the E-step and M-step, the algorithm converges.
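The four steps above can be sketched in code. This is a minimal illustration for a two-coin experiment of this kind; the observed heads counts are hypothetical (the tutorial's figure uses its own data), while the starting biases θA=0.6 and θB=0.5 are the ones used later in the answer:

```python
from math import comb

# Hypothetical data: number of heads in each set of 10 tosses
heads = [5, 9, 8, 4, 7]
n = 10

# Step 1: initial guess of the parameters
theta_A, theta_B = 0.6, 0.5

for _ in range(20):  # Step 4: repeat the E- and M-steps until convergence
    # Step 2 (E-step): for each set, the probability it came from each coin,
    # computed with the current parameters; accumulate expected counts.
    exp_A_h = exp_A_t = exp_B_h = exp_B_t = 0.0
    for h in heads:
        t = n - h
        like_A = comb(n, h) * theta_A**h * (1 - theta_A)**t
        like_B = comb(n, h) * theta_B**h * (1 - theta_B)**t
        p_A = like_A / (like_A + like_B)  # normalized to add up to 1
        p_B = 1 - p_A
        # expected numbers of heads and tails credited to each coin
        exp_A_h += p_A * h; exp_A_t += p_A * t
        exp_B_h += p_B * h; exp_B_t += p_B * t
    # Step 3 (M-step): new parameters from the expected counts
    theta_A = exp_A_h / (exp_A_h + exp_A_t)
    theta_B = exp_B_h / (exp_B_h + exp_B_t)

print(theta_A, theta_B)
```

Because coin A starts with the higher bias estimate, the E-step credits the heads-heavy sets mostly to A, and the two estimates separate over the iterations.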


These are the likelihoods of the corresponding set of 10 coin tosses having been produced by the two coins (using the current estimate of their biases), normalized to add up to 1. The estimated probability of $k$ out of $10$ tosses of coin $i$ (with $i\in\{A,B\}$) yielding heads is

$$P(k\mid\theta_i)=\binom{10}{k}\theta_i^k(1-\theta_i)^{10-k}\;.$$
The binomial coefficient is the same for both coins, so it drops out in the normalization, and only the ratio of the remaining factors determines the result.

For instance, in the second row, we have 9 heads and 1 tail. Given the current bias estimates θA=0.6 and θB=0.5, the factors are

$$\theta_A^9(1-\theta_A)^1=0.6^9\cdot0.4\approx0.0040\;,$$
$$\theta_B^9(1-\theta_B)^1=0.5^9\cdot0.5\approx0.0010\;,$$

resulting in the numbers

$$\frac{0.0040}{0.0040+0.0010}\approx0.80\quad\text{and}\quad\frac{0.0010}{0.0040+0.0010}\approx0.20$$

in the second row.

Source: Link, Question Author: Martin08, Answer Author: joriki
