A vector is an ordered collection of scalars. It has both magnitude and direction.
Vector Operations:
Vector Addition: Adding corresponding elements of two vectors.
Scalar Multiplication: Multiplying a vector by a scalar.
Dot Product (Inner Product): A binary operation that takes two equal-length sequences of numbers (usually vectors) and returns a single number. It’s defined as the sum of the products of their corresponding components.
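The three vector operations above can be sketched with NumPy (used here purely as an illustration; the notes do not prescribe a library):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])

# Vector addition: add corresponding elements
v_plus_w = v + w        # [5.0, 7.0, 9.0]

# Scalar multiplication: scale every component
scaled = 2.0 * v        # [2.0, 4.0, 6.0]

# Dot product: sum of products of corresponding components
dot = np.dot(v, w)      # 1*4 + 2*5 + 3*6 = 32.0
```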
Matrices:
A matrix is a 2-dimensional array of numbers, symbols, or expressions arranged in rows and columns.
Matrix Operations:
Matrix Addition and Subtraction: Element-wise addition or subtraction of corresponding elements of two matrices of the same size.
Scalar Multiplication of a Matrix: Multiplying every element of a matrix by a scalar.
Matrix Multiplication: Each entry of the product AB is the dot product of a row of A with a column of B; it is defined only when the number of columns of A equals the number of rows of B.
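The matrix operations above can be sketched in NumPy (an illustrative choice; the matrices here are arbitrary examples):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

# Element-wise addition and subtraction (matrices must have the same size)
C = A + B
D = A - B

# Scalar multiplication: every element is scaled
E = 3.0 * A

# Matrix multiplication: entry (i, j) is the dot product of
# row i of A with column j of B
P = A @ B   # [[19, 22], [43, 50]]
```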
Transpose of a Matrix:
The transpose of a matrix flips it over its main diagonal, turning rows into columns: the entry at row i, column j of A^T is the entry at row j, column i of A.
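A minimal NumPy sketch of the transpose (the example matrix is arbitrary):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])

# Transpose swaps rows and columns: At[i, j] == A[j, i]
At = A.T    # shape changes from (2, 3) to (3, 2)
```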
Matrix Inversion:
The inverse of a square matrix A (denoted A^(-1)) is the matrix satisfying A·A^(-1) = A^(-1)·A = I, the identity matrix. It exists only when A is nonsingular, i.e., when its determinant is nonzero.
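A short NumPy sketch of inversion (the matrix below is an arbitrary invertible example):

```python
import numpy as np

A = np.array([[4.0, 7.0], [2.0, 6.0]])

# Inversion succeeds here because det(A) = 4*6 - 7*2 = 10 != 0
A_inv = np.linalg.inv(A)

# Multiplying A by its inverse yields the identity, up to rounding
I = A @ A_inv
```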
Eigenvalues and Eigenvectors:
For a square matrix A, an eigenvector is a non-zero vector v such that Av = λv for some scalar λ; that scalar λ is called the corresponding eigenvalue.
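The defining property can be checked numerically with NumPy (a diagonal matrix is chosen here so the eigenvalues are easy to read off):

```python
import numpy as np

A = np.array([[2.0, 0.0], [0.0, 3.0]])

# eig returns the eigenvalues and a matrix whose columns are eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify the defining property A v = lambda v for the first pair
v = eigenvectors[:, 0]
lam = eigenvalues[0]
check = np.allclose(A @ v, lam * v)
```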
Determinant:
The determinant of a square matrix is a scalar computed from its elements; it is zero exactly when the matrix is singular (non-invertible).
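A quick NumPy sketch for the 2×2 case, where the determinant of [[a, b], [c, d]] is a·d − b·c:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])

# det of [[a, b], [c, d]] is a*d - b*c: here 1*4 - 2*3 = -2
det = np.linalg.det(A)
```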
Solving Linear Systems:
Linear algebra provides methods for solving systems of linear equations, written in matrix form as Ax = b. This is particularly important in regression problems in machine learning, where model coefficients are found by solving such systems.
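A small worked example, solving 2x + y = 5 and x + 3y = 10 with NumPy (the system itself is an illustrative assumption):

```python
import numpy as np

# Coefficient matrix and right-hand side for:
#   2x +  y = 5
#    x + 3y = 10
A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([5.0, 10.0])

# np.linalg.solve is preferred over computing the inverse explicitly
x = np.linalg.solve(A, b)   # solution: x = 1, y = 3
```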
Matrix Decompositions:
Techniques like LU decomposition, QR decomposition, and Singular Value Decomposition (SVD) are used to factorize a matrix into simpler, more interpretable components.
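Of the decompositions named above, SVD is available directly in NumPy; a minimal sketch (the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, -2.0],
              [3.0, 0.0]])

# SVD factors A into U (orthonormal columns), singular values s, and V^T
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# The factors reconstruct the original matrix
A_rebuilt = U @ np.diag(s) @ Vt
```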
Norms:
A norm is a way of measuring the size of a vector. Common norms include the L1-norm (sum of absolute values), L2-norm (Euclidean norm), and infinity-norm (maximum absolute value).
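The three norms listed above can be computed with NumPy's `np.linalg.norm` (the vector is an illustrative example):

```python
import numpy as np

v = np.array([3.0, -4.0])

l1   = np.linalg.norm(v, 1)        # |3| + |-4| = 7
l2   = np.linalg.norm(v)           # sqrt(9 + 16) = 5 (Euclidean, the default)
linf = np.linalg.norm(v, np.inf)   # max(|3|, |-4|) = 4
```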
Orthogonality:
Vectors are orthogonal if their dot product is zero. A set of vectors is orthonormal if they are orthogonal and all have a unit norm.
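Both conditions can be checked numerically; a sketch using the standard basis vectors of the plane as an orthonormal pair:

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([0.0, 1.0])

# Orthogonal: the dot product is zero
dot = np.dot(u, v)

# Orthonormal additionally requires each vector to have unit norm
unit_u = np.isclose(np.linalg.norm(u), 1.0)
unit_v = np.isclose(np.linalg.norm(v), 1.0)
```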