# Insert your answer here and feel free to add markdown cells as needed
Probability & Linear Algebra
For the exercises below you must use PyTorch tensors and operations. You may freely consult the PyTorch documentation and any other resources you want.
Problem 1: Simulation (20 points)
The exercise refers to Example 6.6 of the Math for ML book.
Problem 1A (15 points)
Simulate (sample from) the bivariate normal distribution with the parameters given in Example 6.6, producing a plot similar to Figure 6.8b (which shows the simulation result for a different bivariate Gaussian distribution). Generate \(m = 200\) samples/points and show them in a 2D scatter plot.
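One way to sample in PyTorch is `torch.distributions.MultivariateNormal`. A minimal sketch follows; note that the mean `mu` and covariance `Sigma` below are placeholders, not the actual values from Example 6.6 — substitute the book's parameters.

```python
import torch
import matplotlib
matplotlib.use("Agg")  # headless backend; drop this line inside a notebook
import matplotlib.pyplot as plt

torch.manual_seed(0)

# NOTE: placeholder parameters -- replace mu and Sigma with the values
# from Example 6.6 of the Math for ML book.
mu = torch.tensor([0.0, 2.0])
Sigma = torch.tensor([[0.3, -1.0],
                      [-1.0, 5.0]])

# Build the distribution and draw m = 200 samples at once.
dist = torch.distributions.MultivariateNormal(mu, covariance_matrix=Sigma)
samples = dist.sample((200,))  # shape (200, 2)

# 2D scatter plot of the simulated points.
plt.scatter(samples[:, 0], samples[:, 1], s=10)
plt.xlabel("$x_1$")
plt.ylabel("$x_2$")
plt.title("m = 200 samples from a bivariate Gaussian")
plt.savefig("p1a_scatter.png")
```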
Problem 1B (5 points)
Plot the contours of the bivariate Gaussian density and the simulated points in the same plot.
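Contours can be drawn by evaluating the density on a grid via the distribution's `log_prob`. Again, `mu` and `Sigma` are placeholders to be replaced with the parameters from Example 6.6.

```python
import torch
import matplotlib
matplotlib.use("Agg")  # headless backend; drop this line inside a notebook
import matplotlib.pyplot as plt

torch.manual_seed(0)

# Placeholder parameters -- use the ones from Example 6.6.
mu = torch.tensor([0.0, 2.0])
Sigma = torch.tensor([[0.3, -1.0],
                      [-1.0, 5.0]])
dist = torch.distributions.MultivariateNormal(mu, covariance_matrix=Sigma)
samples = dist.sample((200,))

# Evaluate the density on a grid that covers the samples.
xs = torch.linspace(samples[:, 0].min().item() - 1,
                    samples[:, 0].max().item() + 1, 100)
ys = torch.linspace(samples[:, 1].min().item() - 1,
                    samples[:, 1].max().item() + 1, 100)
X, Y = torch.meshgrid(xs, ys, indexing="ij")
grid = torch.stack([X, Y], dim=-1)   # (100, 100, 2) batch of 2D points
Z = dist.log_prob(grid).exp()        # density at each grid point

# Contours of the density with the simulated points overlaid.
plt.contour(X, Y, Z, levels=10)
plt.scatter(samples[:, 0], samples[:, 1], s=10)
plt.xlabel("$x_1$")
plt.ylabel("$x_2$")
plt.savefig("p1b_contours.png")
```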
Problem 2: Projection (20 points)
You may want to review these linear algebra videos or the other linear algebra links provided in your course site (see Math background sections).
Simulate a 3-dimensional (3D) Gaussian random vector with the following covariance matrix by sampling m = 1000 3D vectors from this distribution.
\[\begin{bmatrix} 4 & 2 & 1 \\ 2 & 3 & 1.5 \\ 1 & 1.5 & 2 \end{bmatrix}\]
Using the Singular Value Decomposition (SVD) of the covariance matrix, compute the projection of the m simulated vectors onto the subspace spanned by the first two principal components (the first two left singular vectors of the covariance matrix).
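A sketch of the sampling and projection steps with `torch.linalg.svd`. The problem does not specify a mean, so a zero mean is assumed here.

```python
import torch

torch.manual_seed(0)

# Covariance matrix from the problem statement.
Sigma = torch.tensor([[4.0, 2.0, 1.0],
                      [2.0, 3.0, 1.5],
                      [1.0, 1.5, 2.0]])
mu = torch.zeros(3)  # assumption: zero mean (not specified in the problem)

# Sample m = 1000 vectors from the 3D Gaussian.
dist = torch.distributions.MultivariateNormal(mu, covariance_matrix=Sigma)
X = dist.sample((1000,))             # shape (1000, 3)

# For a symmetric positive-definite matrix, the left singular vectors of
# the SVD coincide with its eigenvectors, i.e. the principal components.
U, S, Vh = torch.linalg.svd(Sigma)
W = U[:, :2]                         # first two principal components, (3, 2)

# Coordinates of every sample in the 2D principal subspace.
proj = X @ W                         # shape (1000, 2)
print("First two principal components (columns):\n", W)
```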
Problem 2A (5 points)
What determines the principal components? Show the two vectors that denote the first two principal components.
Problem 2B (5 points)
Plot the projected vectors in the subspace of first 2 principal components.
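The 2D coordinates from the projection can be shown directly in a scatter plot. This repeats the sampling/SVD steps so the sketch is self-contained (zero mean assumed, as before).

```python
import torch
import matplotlib
matplotlib.use("Agg")  # headless backend; drop this line inside a notebook
import matplotlib.pyplot as plt

torch.manual_seed(0)

Sigma = torch.tensor([[4.0, 2.0, 1.0],
                      [2.0, 3.0, 1.5],
                      [1.0, 1.5, 2.0]])
X = torch.distributions.MultivariateNormal(
    torch.zeros(3), covariance_matrix=Sigma).sample((1000,))

U, _, _ = torch.linalg.svd(Sigma)
proj = X @ U[:, :2]                  # (1000, 2) coordinates in the subspace

# Scatter plot in the principal-component coordinate system.
plt.scatter(proj[:, 0], proj[:, 1], s=8)
plt.xlabel("PC 1")
plt.ylabel("PC 2")
plt.title("Samples projected onto the first two principal components")
plt.savefig("p2b_projection.png")
```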
Problem 2C (10 points)
Reverse the projection to map the points back to the original 3D space and create a scatter plot of the reconstructed points. Are the correlations between the components of the reconstructed points identical to, similar but not identical to, or different from those implied by the original covariance matrix?
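Mapping back to 3D amounts to multiplying the 2D coordinates by the transpose of the projection matrix. A sketch comparing the correlations of the original and reconstructed samples with `torch.corrcoef` (zero mean assumed again):

```python
import torch

torch.manual_seed(0)

Sigma = torch.tensor([[4.0, 2.0, 1.0],
                      [2.0, 3.0, 1.5],
                      [1.0, 1.5, 2.0]])
X = torch.distributions.MultivariateNormal(
    torch.zeros(3), covariance_matrix=Sigma).sample((1000,))

U, _, _ = torch.linalg.svd(Sigma)
W = U[:, :2]                         # first two principal components, (3, 2)

# Project to the 2D subspace, then map back to the original 3D space.
recon = (X @ W) @ W.T                # shape (1000, 3), rank-2 reconstruction

# torch.corrcoef expects variables in rows, so transpose the (m, 3) matrices.
print("original sample correlations:\n", torch.corrcoef(X.T))
print("reconstructed sample correlations:\n", torch.corrcoef(recon.T))
```

Since the reconstruction discards the variance along the third principal component, the reconstructed correlations should be similar to, but not identical to, the original ones.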
# Insert your answer here and feel free to add markdown cells as needed