\documentclass[fleqn]{article}
\usepackage{haldefs}
\usepackage{notes}
\usepackage{url}
\begin{document}
\lecture{Machine Learning}{HW11: Expectation Maximization}{CS 726, Fall 2011}
% IF YOU ARE USING THIS .TEX FILE AS A TEMPLATE, PLEASE REPLACE
% "CS 726, Fall 2011" WITH YOUR NAME AND UID.
Hand in at: \url{http://www.cs.utah.edu/~hal/handin.pl?course=cs726}.
Remember that only PDF submissions are accepted. We encourage using
\LaTeX\ to produce your writeups. See \verb+hw00.tex+ for an example
of how to do so. You can make a \verb+.pdf+ out of the \verb+.tex+ by
running ``\verb+pdflatex hw00.tex+''.
\begin{enumerate}
\item Consider the GMM framework, and suppose that each cluster $k$
  had its own cluster-specific variance $\si^2_k$. What would the
  EM updates for these parameters look like?
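  (Hint: recall the shared-variance case. With spherical covariances
  $\si^2 I$ in $D$ dimensions and responsibilities $\gamma_{nk}$
  computed in the E-step, the M-step update for the single shared
  variance is
  \[ \si^2 = \frac{1}{ND} \sum_{n=1}^N \sum_{k=1}^K \gamma_{nk}
     \| x_n - \mu_k \|^2 . \]
  The notation here is one possible choice; your job is to work out
  the analogous update when each cluster has its own $\si^2_k$.)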
%\begin{solution}
%\end{solution}
\item I have two coins, A and B. Your job is to figure out $\pi_A$
and $\pi_B$, the probability of heads of each of these coins.
However, I'm evil and won't let you flip the coins yourself. What I
will do, however, is flip them on my own and tell you the results.
In particular I say something like: I picked one of the coins,
flipped it 10 times, and it came up heads 7 times and tails 3 times.
Then I picked one of the coins (perhaps the same one, perhaps not),
flipped it 10 times, and it came up heads 5 times and tails 5
  times. I tell you this information $N$-many times (so you see the
  results of a total of $10N$ coin flips).
Set this up as an EM problem. What is the data, what are the
parameters and what are the hidden variables? Write down the
\emph{complete data likelihood} for this problem. (You don't need
  to solve the EM problem: we'll do that in class.)
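  (One way to start writing down a generative story, as a sketch: both
  the notation and the uniform coin choice below are assumptions, since
  the problem does not pin down how a coin is picked. Let
  $z_n \in \{A, B\}$ denote the hidden identity of the coin used for
  report $n$, and let $h_n$ be the number of heads observed in that
  report; then
  \[ z_n \sim \textrm{Uniform}\{A, B\}, \qquad
     h_n \mid z_n \sim \textrm{Bin}(10, \pi_{z_n}) . \]
  If you prefer, replace the uniform choice with a mixing parameter to
  be estimated along with $\pi_A$ and $\pi_B$.)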
%\begin{solution}
%\end{solution}
\end{enumerate}
\end{document}