Information Theory Basics
Information theory was introduced by Claude Shannon in 1948. It is a mathematical theory dealing with the transmission, processing, utilization, and extraction of information. It has given rise to a wide range of applications, including data compression, cryptography, and error correction, and has fueled fields such as AI and cellular communications.
Using this background, which you should study before answering the questions below, let \((X, Y)\) have the following joint distribution:
\[\begin{array}{|c|c|c|c|c|} \hline Y \backslash X & 1 & 2 & 3 & 4 \\ \hline 1 & \frac{1}{8} & \frac{1}{16} & \frac{1}{32} & \frac{1}{32} \\ \hline 2 & \frac{1}{16} & \frac{1}{8} & \frac{1}{32} & \frac{1}{32} \\ \hline 3 & \frac{1}{16} & \frac{1}{16} & \frac{1}{16} & \frac{1}{16} \\ \hline 4 & \frac{1}{4} & 0 & 0 & 0 \\ \hline \end{array}\]
With \(H\) denoting the entropy functional, answer each question quantitatively, showing your calculations.
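For reference, a minimal sketch of the standard definitions (in bits, with the usual convention \(0 \log 0 = 0\)):
\[
H(X) = -\sum_x p(x)\log_2 p(x), \qquad
H(X \mid Y) = -\sum_{x,y} p(x,y)\log_2 p(x \mid y),
\]
\[
I(X;Y) = H(X) - H(X \mid Y) = \sum_{x,y} p(x,y)\log_2 \frac{p(x,y)}{p(x)\,p(y)}.
\]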
Is \(H(X \mid Y) = H(Y \mid X)\)?
Is \(H(X) - H(X \mid Y) = H(Y) - H(Y \mid X)\)?
Calculate the mutual information \(I(X;Y)\).
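As a sketch for checking your hand calculations (not a substitute for them), the following Python snippet computes the relevant entropies and the mutual information directly from the table above; the variable names and helper function are my own illustration:

```python
import numpy as np

# Joint distribution p(Y=y, X=x) from the table (rows: Y=1..4, columns: X=1..4).
P = np.array([
    [1/8,  1/16, 1/32, 1/32],
    [1/16, 1/8,  1/32, 1/32],
    [1/16, 1/16, 1/16, 1/16],
    [1/4,  0,    0,    0   ],
])

def H(p):
    """Shannon entropy in bits; drops zero entries (convention 0 log 0 = 0)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_y = P.sum(axis=1)            # marginal of Y (row sums)
p_x = P.sum(axis=0)            # marginal of X (column sums)

H_xy = H(P.ravel())            # joint entropy H(X, Y)
H_x, H_y = H(p_x), H(p_y)
H_x_given_y = H_xy - H_y       # chain rule: H(X|Y) = H(X,Y) - H(Y)
H_y_given_x = H_xy - H_x       # chain rule: H(Y|X) = H(X,Y) - H(X)
I_xy = H_x - H_x_given_y       # mutual information I(X;Y)

print(f"H(X)   = {H_x}")           # 1.75  bits
print(f"H(Y)   = {H_y}")           # 2.0   bits
print(f"H(X|Y) = {H_x_given_y}")   # 1.375 bits
print(f"H(Y|X) = {H_y_given_x}")   # 1.625 bits
print(f"I(X;Y) = {I_xy}")          # 0.375 bits
```

The printed values illustrate the expected pattern: \(H(X \mid Y)\) and \(H(Y \mid X)\) need not be equal, while \(H(X) - H(X \mid Y)\) and \(H(Y) - H(Y \mid X)\) coincide, both being \(I(X;Y)\).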