# Linear Algebra for Machine Learning Practitioners

1. Vectors and Scalars:
• A scalar is a single number. A vector is an ordered list of numbers (its components), often interpreted as a point or direction in n-dimensional space.
2. Vector Operations:
• Scalar Multiplication: Multiplying a vector by a scalar scales each component, stretching or shrinking the vector without changing its direction (reversing it if the scalar is negative).
• Dot Product (Inner Product): A binary operation that takes two equal-length sequences of numbers (usually vectors) and returns a single number. It’s defined as the sum of the products of their corresponding components.
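These two operations can be sketched with NumPy (assumed here as the numerical library; the vectors are illustrative):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])

# Scalar multiplication: every component is scaled.
print(2 * v)         # [2. 4. 6.]

# Dot product: sum of products of corresponding components.
print(np.dot(v, w))  # 1*4 + 2*5 + 3*6 = 32.0
```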
3. Matrices:
• A matrix is a 2-dimensional array of numbers, symbols, or expressions arranged in rows and columns.
4. Matrix Operations:
• Matrix Addition and Subtraction: Element-wise addition or subtraction of corresponding elements of two matrices of the same size.
• Scalar Multiplication of a Matrix: Multiplying every element of a matrix by a scalar.
• Matrix Multiplication: The product AB is defined when the number of columns of A equals the number of rows of B; its (i, j) entry is the dot product of row i of A with column j of B. Unlike addition, it is not commutative: AB ≠ BA in general.
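A minimal NumPy sketch of these matrix operations (the matrices are illustrative):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Element-wise addition: both matrices must have the same shape.
print(A + B)   # [[ 6  8], [10 12]]

# Matrix product: entry (i, j) is the dot product of
# row i of A with column j of B.
print(A @ B)   # [[19 22], [43 50]]
```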
5. Transpose of a Matrix:
• The transpose of a matrix A (denoted A^T) swaps its rows and columns: the (i, j) entry of A^T is the (j, i) entry of A.
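In NumPy (assumed here as the numerical library) the transpose is the `.T` attribute:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])  # shape (2, 3)

# Rows and columns swap: a (2, 3) matrix becomes (3, 2).
print(A.T)        # [[1 4], [2 5], [3 6]]
print(A.T.shape)  # (3, 2)
```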
6. Matrix Inversion:
• The inverse of a square matrix A (denoted as A^(-1)) is another matrix such that when it’s multiplied by A, the result is the identity matrix. Only square matrices with a nonzero determinant have an inverse; such matrices are called invertible (or non-singular).
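A quick NumPy check of the defining property (the matrix below is an illustrative invertible example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

A_inv = np.linalg.inv(A)

# A^(-1) A should equal the identity matrix (up to rounding).
print(np.allclose(A_inv @ A, np.eye(2)))  # True
```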
7. Eigenvalues and Eigenvectors:
• For a square matrix A, an eigenvector is a non-zero vector v such that Av is a scalar multiple of v. The corresponding scalar is called the eigenvalue.
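The defining equation Av = λv can be verified with NumPy's `eig` routine (the diagonal matrix below is an illustrative choice whose eigenvalues are easy to see):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

vals, vecs = np.linalg.eig(A)  # eigenvalues and eigenvectors (as columns)

# Check A v = lambda * v for the first eigenpair.
v, lam = vecs[:, 0], vals[0]
print(np.allclose(A @ v, lam * v))  # True
```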
8. Determinant:
• The determinant of a square matrix is a scalar value computed from its elements. It measures how the matrix scales volume, and it is nonzero exactly when the matrix is invertible.
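For a 2×2 matrix [[a, b], [c, d]] the determinant is ad − bc, which NumPy confirms:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# ad - bc = 1*4 - 2*3 = -2 (up to floating-point rounding).
print(np.linalg.det(A))
```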
9. Solving Linear Systems:
• Linear algebra provides the tools to solve systems of linear equations Ax = b. This is particularly important in machine learning, for example when fitting linear regression models.
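Both cases can be sketched in NumPy: `solve` for a square system, and `lstsq` for the overdetermined (tall) systems typical of regression. The numbers below are illustrative; the regression data is constructed to lie exactly on the line y = 1 + 2x:

```python
import numpy as np

# Square, invertible system: solve A x = b exactly.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])
x = np.linalg.solve(A, b)
print(x)  # [2. 3.]

# Overdetermined system (the regression case): least-squares fit.
X = np.array([[1.0, 0.0],   # column of ones models the intercept
              [1.0, 1.0],
              [1.0, 2.0]])
y = np.array([1.0, 3.0, 5.0])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # intercept 1, slope 2
```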
10. Matrix Decompositions:
• Techniques like LU decomposition, QR decomposition, and Singular Value Decomposition (SVD) are used to factorize a matrix into simpler, more interpretable components.
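As one example, the SVD writes A as U Σ V^T; a NumPy sketch showing that the factors reconstruct the original matrix (the matrix is illustrative):

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])

# Singular Value Decomposition: A = U @ diag(s) @ Vt.
U, s, Vt = np.linalg.svd(A)

print(np.allclose(U @ np.diag(s) @ Vt, A))  # True
```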
11. Norms:
• A norm is a way of measuring the size of a vector. Common norms include the L1-norm (sum of absolute values), L2-norm (Euclidean norm), and infinity-norm (maximum absolute value).
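All three norms are available through NumPy's `norm` function (the vector is illustrative):

```python
import numpy as np

v = np.array([3.0, -4.0])

print(np.linalg.norm(v, 1))       # 7.0  (sum of absolute values)
print(np.linalg.norm(v))          # 5.0  (Euclidean, the default)
print(np.linalg.norm(v, np.inf))  # 4.0  (maximum absolute value)
```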
12. Orthogonality:
• Vectors are orthogonal if their dot product is zero. A set of vectors is orthonormal if they are orthogonal and all have a unit norm.
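Both conditions are easy to check numerically; the standard basis vectors below are the simplest orthonormal example:

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([0.0, 1.0])

# Orthogonal: dot product is zero.
print(np.dot(u, v))  # 0.0

# Orthonormal: additionally, each vector has unit norm.
print(np.linalg.norm(u), np.linalg.norm(v))  # 1.0 1.0
```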