### 2.1 Vectors and their role in Machine Learning and Data Science

#### 2.1.1 Geometric View of Vectors and its significance in Machine Learning and Data Science

### 2.2 Python code to create, access, slice, and dice vectors and sub-vectors, via NumPy and PyTorch parallel code

#### 2.2.1 Python NumPy code for an introduction to Vectors

#### 2.2.2 PyTorch code for an introduction to Vectors

### 2.3 Matrices and their role in Machine Learning and Data Science

### 2.4 Python Code: Introduction to Matrices, Tensors, and Images via NumPy and PyTorch parallel code

#### 2.4.1 Python NumPy code for an introduction to Tensors, Matrices, and Images

#### 2.4.2 PyTorch code for an introduction to Tensors and Matrices

### 2.5 Basic Vector and Matrix operations in Machine Learning and Data Science

#### 2.5.1 Matrix and Vector Transpose

#### 2.5.2 Dot Product of two vectors and its role in Machine Learning and Data Science

#### 2.5.3 Matrix Multiplication in Machine Learning and Data Science

#### 2.5.4 Length of a Vector, a.k.a. the L2 Norm, and its role in Machine Learning

#### 2.5.5 Geometric intuitions for Vector Length - Model Error in Machine Learning

#### 2.5.6 Geometric intuitions for the Dot Product - Feature Similarity in Machine Learning and Data Science

### 2.6 Orthogonality of Vectors and its physical significance

### 2.7 Python Code: Basic Vector and Matrix operations via NumPy

#### 2.7.1 Python NumPy code for Matrix Transpose

#### 2.7.2 Python NumPy code for Dot Product

#### 2.7.3 Python NumPy code for Matrix-Vector Multiplication

#### 2.7.4 Python NumPy code for Matrix-Matrix Multiplication

#### 2.7.5 Python NumPy code for the Transpose of a Matrix Product

#### 2.7.6 Python NumPy code for Matrix Inverse

### 2.8 Multidimensional Line and Plane Equations and their role in Machine Learning

#### 2.8.1 Multidimensional Line Equation

#### 2.8.2 Multidimensional Planes and their role in Machine Learning

### 2.9 Linear Combination, Linear Dependence, Vector Span, and Basis Vectors: their Geometric Significance and Collinearity Preservation

#### 2.11.1 Array View: Multidimensional arrays of numbers

### 2.12 Linear Systems and Matrix Inverse

#### 2.12.1 Linear Systems with Zero or Near-Zero Determinants: Ill-Conditioned Systems

#### 2.12.2 Over- and Under-Determined Linear Systems in Machine Learning and Data Science

#### 2.12.3 Moore-Penrose Pseudo-Inverse of a Matrix: Solving Over- or Under-Determined Linear Systems

#### 2.12.4 Pseudo-Inverse of a Matrix: A Beautiful Geometric Intuition

#### 2.12.5 Python NumPy code to solve over-determined systems

### 2.13 Eigenvalues and Eigenvectors - Swiss Army knives in Machine Learning and Data Science

#### 2.13.1 Python NumPy code to compute Eigenvalues and Eigenvectors

### 2.14 Orthogonal (Rotation) Matrices and their Eigenvalues and Eigenvectors

#### 2.14.1 Python NumPy code for the orthogonality of Rotation Matrices

### 2.15 Matrix Diagonalization

#### 2.15.1 Python NumPy code for Matrix Diagonalization

#### 2.15.2 Solving Linear Systems without the Inverse via Diagonalization

#### 2.15.3 Python NumPy code for solving Linear Systems via Diagonalization

#### 2.15.4 Matrix Powers using Diagonalization

### 2.16 Spectral Decomposition of a Symmetric Matrix

#### 2.16.1 Python NumPy code for the Spectral Decomposition of a Matrix

### 2.17 An application relevant to Machine Learning - finding the axes of a Hyper-Ellipse

#### 2.17.1 Python NumPy code for Hyper-Ellipses

### 2.18 Summary