Information Theory "test your knowledge"
All logarithms should be log base 2. Most calculators these days will spit out a log base two, including Google (try typing log2(100) into the Google search bar; you should get 6.64 bits, or so). See you soon! -- Simon
Please enter your e-mail address (enter a fake one if you like!)
Short probability/entropy workout
You have a fair, six-sided die. What is the entropy of the outcome of a single roll? That is, what is the entropy of the probability distribution {1/6, 1/6, 1/6, 1/6, 1/6, 1/6} over the options {1, 2, 3, 4, 5, 6}? Be sure to work in bits (i.e., log base 2, rather than log base e or log base 10).
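If you'd like to check your arithmetic, here is a minimal Python sketch (the helper name entropy_bits is just for illustration):

```python
import math

# Entropy in bits of a discrete distribution: H = -sum_i p_i * log2(p_i).
def entropy_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair_die = [1/6] * 6
print(entropy_bits(fair_die))  # log2(6) ≈ 2.585 bits
```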
You have a pair of (fair) six-sided dice. What is the entropy of the *sum* of their outcomes? To do this, you'll have to figure out the probability distribution over the numbers {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12}; a reminder that this won't be uniform! (e.g., there are more ways to get 7 than there are to get 12.)
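One way to build that distribution is to enumerate all 36 equally likely ordered pairs; a sketch, reusing the entropy helper from above:

```python
import math
from collections import Counter
from itertools import product

def entropy_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Each of the 36 (die1, die2) pairs is equally likely; tally them by their sum.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
probs = [c / 36 for c in counts.values()]
print(entropy_bits(probs))  # ≈ 3.27 bits; note this is less than the 5.17 bits of the joint roll
```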
Mutual Information
Imagine your friend has two coins. One is fair (50-50 Heads vs Tails), and the other is biased (75-25 Heads vs Tails). He pulls one of the coins out of his pocket; you don't know which. That is, you attribute 0.5 probability to the possibility that he is holding the fair coin, and 0.5 probability to the possibility that he is holding the biased one; that corresponds to an uncertainty (as you can verify) of one bit.
Your friend tosses the coin and gets "Heads". What's the probability you attribute to the coin being the biased one? (Hint: Bayes's rule!)
Your friend tosses the coin and gets "Tails". What's the probability you attribute to the coin being the biased one?
What's the average uncertainty you expect to have about which coin your friend tossed, *after* you learn the outcome of the toss? (This is the conditional entropy of the coin's identity given the outcome of the toss.)
What's the mutual information between the outcome of a single toss and the identity of the coin? (Equivalently: how much information does the toss give you about the nature of the coin?)
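Putting the last two questions together in a sketch, with the posteriors taken from the Bayes calculation above:

```python
import math

def entropy_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

p_h = 0.5 * 0.5 + 0.5 * 0.75                  # P(Heads), marginalized over the coins: 0.625
post_h, post_t = 0.6, 1 / 3                   # P(biased | Heads), P(biased | Tails), from Bayes
conditional = (p_h * entropy_bits([post_h, 1 - post_h])
               + (1 - p_h) * entropy_bits([post_t, 1 - post_t]))
mutual_info = 1.0 - conditional               # prior uncertainty (1 bit) minus average posterior
print(conditional, mutual_info)               # ≈ 0.951 bits and ≈ 0.049 bits
```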
BONUS: What is the entropy, in bits, of words in Jane Austen's novel Pride and Prejudice? (Formally: estimate the probability distribution over words using the word-frequency counts in the text, and compute the entropy of that distribution.) The text of the novel is available at http://santafe.edu/~simon/pp.txt
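A sketch of one way to do this in Python. The exact number you get depends on tokenization choices (case-folding, punctuation, apostrophes), so treat the regex here as one reasonable choice rather than the official one:

```python
import math
import re
import urllib.request
from collections import Counter

# Fetch the text (URL from the question) and tokenize it, crudely, into lowercase words.
url = "http://santafe.edu/~simon/pp.txt"
text = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
words = re.findall(r"[a-z']+", text.lower())

counts = Counter(words)
total = sum(counts.values())
entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())
print(f"{entropy:.2f} bits per word over {len(counts)} distinct words")
```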
OPTIONAL: What areas of cultural analytics and data science are you particularly interested in?