Mathematics for ML
Linear Algebra:

* Vectors: Arrays of numbers arranged in a row or column. They represent quantities with both magnitude and direction.
* Matrices: Rectangular arrays of numbers. They are used to represent linear transformations and systems of linear equations.
* Norms: Measures of the size or magnitude of a vector. Common norms include the L1 norm (sum of absolute values), the L2 norm (Euclidean length), and the infinity norm (maximum absolute value).
* Subspaces: Subsets of a vector space that are closed under addition and scalar multiplication. They are fundamental to understanding the structure of vector spaces.
* Projections: Operations that map vectors onto a subspace. They are used in dimensionality-reduction techniques such as Principal Component Analysis (PCA).
* Singular Value Decomposition (SVD) and Eigenvalue Decomposition (EVD): Matrix factorizations that break a matrix into simpler components. EVD applies to square matrices, while SVD generalizes to any rectangular matrix; both underpin PCA and low-rank approximation.
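As a quick illustration of the three norms above, here is a minimal sketch using NumPy (the note names no library, so NumPy is an assumption):

```python
import numpy as np

v = np.array([3.0, -4.0])

l1 = np.linalg.norm(v, 1)         # L1: sum of absolute values -> 7.0
l2 = np.linalg.norm(v)            # L2: Euclidean length -> 5.0
linf = np.linalg.norm(v, np.inf)  # infinity norm: max absolute value -> 4.0

print(l1, l2, linf)  # 7.0 5.0 4.0
```

Note that the same vector can have different "sizes" depending on the norm chosen, which matters for regularization (L1 vs. L2 penalties) in ML.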
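The projection idea can be made concrete with the standard orthogonal-projection formula P = A(AᵀA)⁻¹Aᵀ, which maps any vector onto the column space of A. The example subspace below (the xy-plane in R³) is a hypothetical choice for illustration:

```python
import numpy as np

# Columns of A span the subspace we project onto (here: the xy-plane in R^3).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
b = np.array([2.0, 3.0, 5.0])

# Orthogonal projection matrix onto the column space of A.
P = A @ np.linalg.inv(A.T @ A) @ A.T
proj = P @ b

print(proj)  # [2. 3. 0.] -- the z-component is discarded
```

PCA performs exactly this kind of projection, with A's columns chosen as the top principal directions of the data.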
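To show the SVD/EVD contrast, a small sketch (the matrices are arbitrary examples): EVD reconstructs a symmetric square matrix from its eigenvectors, while SVD factors an arbitrary rectangular matrix.

```python
import numpy as np

# EVD of a symmetric matrix: M = Q diag(w) Q^T
M = np.array([[3.0, 1.0],
              [1.0, 3.0]])
w, Q = np.linalg.eigh(M)
assert np.allclose(Q @ np.diag(w) @ Q.T, M)

# SVD of a rectangular matrix: R = U diag(s) V^T
R = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
U, s, Vt = np.linalg.svd(R, full_matrices=False)
assert np.allclose(U @ np.diag(s) @ Vt, R)

print(w)  # eigenvalues of M
print(s)  # singular values of R
```

Truncating the singular values (keeping only the largest few) gives the best low-rank approximation of R, which is the mechanism behind PCA-style dimensionality reduction.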